WorldWideScience

Sample records for model parameter uncertainty

  1. Uncertainty Quantification for Optical Model Parameters

    CERN Document Server

    Lovell, A E; Sarich, J; Wild, S M

    2016-01-01

    Although uncertainty quantification has been making its way into nuclear theory, these methods have yet to be explored in the context of reaction theory. For example, it is well known that different parameterizations of the optical potential can result in different cross sections, but these differences have not been systematically studied and quantified. The purpose of this work is to investigate the uncertainties in nuclear reactions that result from fitting a given model to elastic-scattering data, as well as to study how these uncertainties propagate to the inelastic and transfer channels. We use statistical methods to determine a best fit and create corresponding 95% confidence bands. A simple model of the process is fit to elastic-scattering data and used to predict either inelastic or transfer cross sections. In this initial work, we assume that our model is correct, and the only uncertainties come from the variation of the fit parameters. We study a number of reactions involving neutron and deuteron p...

  2. Parameter and Uncertainty Estimation in Groundwater Modelling

    DEFF Research Database (Denmark)

    Jensen, Jacob Birk

    The data basis on which groundwater models are constructed is in general very incomplete, and this leads to uncertainty in model outcome. Groundwater models form the basis for many, often costly decisions and if these are to be made on solid grounds, the uncertainty attached to model results must be quantified. This study was motivated by the need to estimate the uncertainty involved in groundwater models. Chapter 2 presents an integrated surface/subsurface unstructured finite difference model that was developed and applied to a synthetic case study. The following two chapters concern calibration and uncertainty estimation. Essential issues relating to calibration are discussed. The classical regression methods are described; however, the main focus is on the Generalized Likelihood Uncertainty Estimation (GLUE) methodology. The next two chapters describe case studies in which the GLUE methodology...

  3. Model and parameter uncertainty in IDF relationships under climate change

    Science.gov (United States)

    Chandra, Rupa; Saha, Ujjwal; Mujumdar, P. P.

    2015-05-01

    Quantifying the distributional behavior of extreme events is crucial in hydrologic design. Intensity Duration Frequency (IDF) relationships are used extensively in engineering, especially in urban hydrology, to obtain the return level of an extreme rainfall event for a specified return period and duration. Major sources of uncertainty in IDF relationships are insufficient quantity and quality of data, leading to parameter uncertainty in the distribution fitted to the data, and uncertainty as a result of using multiple GCMs. It is important to study these uncertainties and propagate them into the future for accurate assessment of future return levels. The objective of this study is to quantify the uncertainties arising from the parameters of the distribution fitted to data and from the multiple GCM models using a Bayesian approach. The posterior distribution of parameters is obtained from Bayes' rule and the parameters are transformed to obtain return levels for a specified return period. A Markov Chain Monte Carlo (MCMC) method using the Metropolis-Hastings algorithm is used to obtain the posterior distribution of parameters. Twenty-six CMIP5 GCMs along with four RCP scenarios are considered for studying the effects of climate change and to obtain projected IDF relationships for the case study of Bangalore city in India. GCM uncertainty due to the use of multiple GCMs is treated using the Reliability Ensemble Averaging (REA) technique along with the parameter uncertainty. Scale invariance theory is employed for obtaining short-duration return levels from daily data. It is observed that the uncertainty in short-duration rainfall return levels is high when compared to that for longer durations. Further, it is observed that parameter uncertainty is large compared to the model uncertainty.
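
The Bayesian workflow this abstract describes — MCMC over distribution parameters, then transforming posterior draws into return levels — can be illustrated with a toy example. The sketch below is not the study's implementation: it fits a Gumbel rather than a GEV distribution, and the data, flat prior, and proposal scales are all invented.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic annual-maximum rainfall (mm); the "true" parameters are assumptions
true_mu, true_beta = 60.0, 12.0
data = true_mu - true_beta * np.log(-np.log(rng.uniform(size=50)))

def log_post(mu, beta):
    """Gumbel log-likelihood with a flat prior (beta must stay positive)."""
    if beta <= 0:
        return -np.inf
    z = (data - mu) / beta
    return np.sum(-np.log(beta) - z - np.exp(-z))

# Random-walk Metropolis-Hastings over (mu, beta)
theta = np.array([50.0, 10.0])
lp = log_post(*theta)
chain = []
for _ in range(20000):
    prop = theta + rng.normal(scale=[1.5, 0.8])
    lp_prop = log_post(*prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    chain.append(theta)
chain = np.array(chain[5000:])          # discard burn-in

# Transform each posterior draw into a 100-year return level
T = 100
levels = chain[:, 0] - chain[:, 1] * np.log(-np.log(1 - 1 / T))
lo, hi = np.percentile(levels, [2.5, 97.5])
print(f"100-year return level: {np.median(levels):.1f} mm ({lo:.1f}-{hi:.1f})")
```

A 95% credible interval for the return level then falls directly out of the posterior sample, which is the step that carries parameter uncertainty into the IDF estimate.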

  4. Uncertainty of Modal Parameters Estimated by ARMA Models

    DEFF Research Database (Denmark)

    Jensen, Jakob Laigaard; Brincker, Rune; Rytter, Anders

    In this paper the uncertainties of identified modal parameters such as eigenfrequencies and damping ratios are assessed. From the measured response of dynamically excited structures the modal parameters may be identified and provide important structural knowledge. However, the uncertainty of the parameters... by a simulation study of a lightly damped single degree of freedom system. Identification by ARMA models has been chosen as the system identification method. It is concluded that both the sampling interval and the number of sampled points may play a significant role with respect to the statistical errors. Furthermore...

  5. Uncertainty of Modal Parameters Estimated by ARMA Models

    DEFF Research Database (Denmark)

    Jensen, Jacob Laigaard; Brincker, Rune; Rytter, Anders

    1990-01-01

    In this paper the uncertainties of identified modal parameters such as eigenfrequencies and damping ratios are assessed. From the measured response of dynamically excited structures the modal parameters may be identified and provide important structural knowledge. However, the uncertainty of the parameters... by a simulation study of a lightly damped single degree of freedom system. Identification by ARMA models has been chosen as the system identification method. It is concluded that both the sampling interval and the number of sampled points may play a significant role with respect to the statistical errors. Furthermore...

  6. Combined Estimation of Hydrogeologic Conceptual Model and Parameter Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, Philip D.; Ye, Ming; Neuman, Shlomo P.; Cantrell, Kirk J.

    2004-03-01

    The objective of the research described in this report is the development and application of a methodology for comprehensively assessing the hydrogeologic uncertainties involved in dose assessment, including uncertainties associated with conceptual models, parameters, and scenarios. This report describes and applies a statistical method to quantitatively estimate the combined uncertainty in model predictions arising from conceptual model and parameter uncertainties. The method relies on model averaging to combine the predictions of a set of alternative models. Implementation is driven by the available data. When site-specific data are minimal, the method can be carried out with prior parameter estimates based on generic data and subjective prior model probabilities. For sites with observations of system behavior (and optionally data characterizing model parameters), the method uses model calibration to update the prior parameter estimates and model probabilities based on the correspondence between model predictions and site observations. The set of model alternatives can contain both simplified and complex models, with the requirement that all models be based on the same set of data. The method was applied to the geostatistical modeling of air permeability at a fractured rock site. Seven alternative variogram models of log air permeability were considered to represent data from single-hole pneumatic injection tests in six boreholes at the site. Unbiased maximum likelihood estimates of variogram and drift parameters were obtained for each model. Standard information criteria provided an ambiguous ranking of the models, which would not justify selecting one of them and discarding all others as is commonly done in practice. Instead, some of the models were eliminated based on their negligibly small updated probabilities and the rest were used to project the measured log permeabilities by kriging onto a rock volume containing the six boreholes. These four
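
The model-averaging step can be sketched generically. The snippet below uses BIC-based weights as surrogate posterior model probabilities; the BIC values, predictions, and variances are invented, and the report's actual procedure (calibration-updated model probabilities) is more elaborate.

```python
import numpy as np

# Assumed BIC values for four alternative variogram models (illustrative numbers)
bic = np.array([212.4, 214.1, 219.8, 230.5])

# Posterior model probabilities ~ exp(-0.5 * delta BIC), normalised
delta = bic - bic.min()
weights = np.exp(-0.5 * delta)
weights /= weights.sum()

# Each model's prediction and variance at one location (illustrative numbers)
pred = np.array([1.8, 2.1, 1.5, 2.4])
var = np.array([0.20, 0.25, 0.30, 0.40])

# Model-averaged mean; total variance = within-model + between-model spread
mean_avg = np.sum(weights * pred)
var_avg = np.sum(weights * (var + (pred - mean_avg) ** 2))
print(mean_avg, var_avg)
```

The between-model term is what makes the combined variance strictly larger than the weighted within-model variance whenever the models disagree — the quantity a single "best" model would hide.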

  7. Predicting the Term Structure of Interest Rates: Incorporating parameter uncertainty, model uncertainty and macroeconomic information

    NARCIS (Netherlands)

    M.D. de Pooter (Michiel); F. Ravazzolo (Francesco); D.J.C. van Dijk (Dick)

    2007-01-01

    We forecast the term structure of U.S. Treasury zero-coupon bond yields by analyzing a range of models that have been used in the literature. We assess the relevance of parameter uncertainty by examining the added value of using Bayesian inference compared to frequentist estimation

  8. Parameter and uncertainty estimation for mechanistic, spatially explicit epidemiological models

    Science.gov (United States)

    Finger, Flavio; Schaefli, Bettina; Bertuzzo, Enrico; Mari, Lorenzo; Rinaldo, Andrea

    2014-05-01

    Epidemiological models can be a crucially important tool for decision-making during disease outbreaks. The range of possible applications spans from real-time forecasting and allocation of health-care resources to testing alternative intervention mechanisms such as vaccines, antibiotics or the improvement of sanitary conditions. Our spatially explicit, mechanistic models for cholera epidemics have been successfully applied to several epidemics, including the one that struck Haiti in late 2010 and is still ongoing. Calibration and parameter estimation of such models represents a major challenge because of properties unusual in traditional geoscientific domains such as hydrology. Firstly, the epidemiological data available might be subject to high uncertainties due to error-prone diagnosis as well as manual (and possibly incomplete) data collection. Secondly, long-term time-series of epidemiological data are often unavailable. Finally, the spatially explicit character of the models requires the comparison of several time-series of model outputs with their real-world counterparts, which calls for an appropriate weighting scheme. It follows that the usual assumption of a homoscedastic Gaussian error distribution, used in combination with classical calibration techniques based on Markov chain Monte Carlo algorithms, is likely to be violated, whereas the construction of an appropriate formal likelihood function seems close to impossible. Alternative calibration methods, which allow for accurate estimation of total model uncertainty, particularly regarding the envisaged use of the models for decision-making, are thus needed. Here we present the most recent developments regarding methods for parameter and uncertainty estimation to be used with our mechanistic, spatially explicit models for cholera epidemics, based on informal measures of goodness of fit.

  9. Model parameter uncertainty analysis for an annual field-scale P loss model

    Science.gov (United States)

    Bolster, Carl H.; Vadas, Peter A.; Boykin, Debbie

    2016-08-01

    Phosphorus (P) fate and transport models are important tools for developing and evaluating conservation practices aimed at reducing P losses from agricultural fields. Because all models are simplifications of complex systems, there will exist an inherent amount of uncertainty associated with their predictions. It is therefore important that efforts be directed at identifying, quantifying, and communicating the different sources of model uncertainties. In this study, we conducted an uncertainty analysis with the Annual P Loss Estimator (APLE) model. Our analysis included calculating parameter uncertainties and confidence and prediction intervals for five internal regression equations in APLE. We also estimated uncertainties of the model input variables based on values reported in the literature. We then predicted P loss for a suite of fields under different management and climatic conditions while accounting for uncertainties in the model parameters and inputs and compared the relative contributions of these two sources of uncertainty to the overall uncertainty associated with predictions of P loss. Both the overall magnitude of the prediction uncertainties and the relative contributions of the two sources of uncertainty varied depending on management practices and field characteristics. This was due to differences in the number of model input variables and the uncertainties in the regression equations associated with each P loss pathway. Inspection of the uncertainties in the five regression equations brought attention to a previously unrecognized limitation with the equation used to partition surface-applied fertilizer P between leaching and runoff losses. As a result, an alternate equation was identified that provided similar predictions with much less uncertainty. Our results demonstrate how a thorough uncertainty and model residual analysis can be used to identify limitations with a model. Such insight can then be used to guide future data collection and model
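
The regression-equation uncertainty described here amounts to carrying the parameter covariance of a fitted equation through to confidence and prediction intervals. A minimal sketch, with invented data and a z approximation in place of the exact t quantile:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative data: a predictor x vs annual P loss y; the numbers are made up
x = rng.uniform(10, 100, size=30)
y = 0.05 * x + 1.0 + rng.normal(scale=0.5, size=30)

# Ordinary least squares with parameter covariance
X = np.column_stack([np.ones_like(x), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
s2 = resid @ resid / (len(x) - 2)            # residual variance
cov_beta = s2 * np.linalg.inv(X.T @ X)       # parameter covariance matrix

# 95% confidence band (mean response) and prediction band (new observation)
xg = np.linspace(10, 100, 5)
Xg = np.column_stack([np.ones_like(xg), xg])
var_mean = np.einsum("ij,jk,ik->i", Xg, cov_beta, Xg)
ci = 1.96 * np.sqrt(var_mean)                # z approximation to the t quantile
pi = 1.96 * np.sqrt(var_mean + s2)
print(beta, ci, pi)
```

The prediction band is always wider than the confidence band because it adds the residual variance, which is exactly the distinction an uncertainty analysis of regression equations needs to report.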

  10. Parameters-related uncertainty in modeling sugar cane yield with an agro-Land Surface Model

    Science.gov (United States)

    Valade, A.; Ciais, P.; Vuichard, N.; Viovy, N.; Ruget, F.; Gabrielle, B.

    2012-12-01

    Agro-Land Surface Models (agro-LSM) have been developed from the coupling of specific crop models and large-scale generic vegetation models. They aim at accounting for the spatial distribution and variability of energy, water and carbon fluxes within the soil-vegetation-atmosphere continuum, with a particular emphasis on how crop phenology and agricultural management practices influence the turbulent fluxes exchanged with the atmosphere, and the underlying water and carbon pools. A part of the uncertainty in these models is related to the many parameters included in the models' equations. In this study, we quantify the parameter-based uncertainty in the simulation of sugar cane biomass production with the agro-LSM ORCHIDEE-STICS using a multi-regional approach with data from sites in Australia, La Reunion and Brazil. First, the main source of uncertainty for the output variables NPP, GPP, and sensible heat flux (SH) is determined through a screening of the main parameters of the model on a multi-site basis, leading to the selection of a subset of the most sensitive parameters causing most of the uncertainty. In a second step, a sensitivity analysis is carried out on the parameters selected from the screening analysis at a regional scale. For this, a Monte-Carlo sampling method associated with the calculation of Partial Rank Correlation Coefficients is used. We first quantify the sensitivity of the output variables to individual input parameters on a regional scale for two regions of intensive sugar cane cultivation in Australia and Brazil. Then, we quantify the overall uncertainty in the simulation's outputs propagated from the uncertainty in the input parameters. Seven parameters are identified by the screening procedure as driving most of the uncertainty in the agro-LSM ORCHIDEE-STICS model output at all sites. These parameters control photosynthesis (optimal temperature of photosynthesis, optimal carboxylation rate), radiation interception (extinction coefficient), root
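
A Partial Rank Correlation Coefficient computation of the kind used in the second step can be sketched as follows; the toy "crop model" and the roles of the three parameters are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

def prcc(X, y):
    """Partial rank correlation coefficient of each column of X with y."""
    def ranks(a):
        return np.argsort(np.argsort(a, axis=0), axis=0).astype(float)
    Xr, yr = ranks(X), ranks(y[:, None])[:, 0]
    out = []
    for j in range(X.shape[1]):
        # Regress out the (ranked) influence of the other parameters
        others = np.column_stack([np.ones(len(y)), np.delete(Xr, j, axis=1)])
        rx = Xr[:, j] - others @ np.linalg.lstsq(others, Xr[:, j], rcond=None)[0]
        ry = yr - others @ np.linalg.lstsq(others, yr, rcond=None)[0]
        out.append(rx @ ry / np.sqrt((rx @ rx) * (ry @ ry)))
    return np.array(out)

# Toy model: output depends strongly on p0, weakly on p1, not at all on p2
P = rng.uniform(size=(500, 3))
npp = 5.0 * P[:, 0] + 0.5 * P[:, 1] + rng.normal(scale=0.2, size=500)
coeffs = prcc(P, npp)
print(coeffs)
```

Rank-transforming first makes the measure robust to monotone nonlinearity, and partialling out the other parameters isolates each parameter's own contribution — the two properties that make PRCC popular for Monte Carlo sensitivity screening.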

  11. The Effects of Uncertainty in Speed-Flow Curve Parameters on a Large-Scale Model

    DEFF Research Database (Denmark)

    Manzo, Stefano; Nielsen, Otto Anker; Prato, Carlo Giacomo

    2014-01-01

    Uncertainty is inherent in transport models and prevents the use of a deterministic approach when traffic is modeled. Quantifying uncertainty thus becomes an indispensable step to produce a more informative and reliable output of transport models. In traffic assignment models, volume-delay functions... uncertainty. This aspect is evident particularly for stretches of the network with a high number of competing routes. Model sensitivity was also tested for BPR parameter uncertainty combined with link capacity uncertainty. The resultant increase in model sensitivity demonstrates even further the importance...

  12. An information theory approach to minimise correlated systematic uncertainty in modelling resonance parameters

    Energy Technology Data Exchange (ETDEWEB)

    Krishna Kumar, P.T. [Research Laboratory for Nuclear Reactors, Tokyo Institute of Technology, 2-12-1, O-Okayama, Meguro-Ku, Tokyo 152-8550 (Japan)], E-mail: gstptk@yahoo.co.in; Sekimoto, Hiroshi [Research Laboratory for Nuclear Reactors, Tokyo Institute of Technology, 2-12-1, O-Okayama, Meguro-Ku, Tokyo 152-8550 (Japan)], E-mail: hsekimot@nr.titech.ac.jp

    2009-02-15

    Covariance matrix elements depict the statistical and systematic uncertainties in reactor parameter measurements. All the efforts have so far been devoted only to minimising the statistical uncertainty by repeated measurements, but the dominant systematic uncertainty has either been neglected or randomized. In recent years, efforts have been devoted to simulating the resonance parameter uncertainty information through covariance matrices in the code SAMMY. But the code does not have any provision to check the reliability of the simulated covariance data. We propose a new approach, called entropy based information theory, to reduce the systematic uncertainty in the correlation matrix elements so that resonance parameters with minimum systematic uncertainty can be modelled. We apply our information theory approach in generating the resonance parameters of ¹⁵⁶Gd with reduced systematic uncertainty and demonstrate the superiority of our technique over the principal component analysis method.
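
The paper's entropy criterion is specific to its resonance-parameter formalism, but the general idea of scoring a correlation matrix by an eigenvalue entropy can be sketched generically (illustrative only; the matrices below are invented):

```python
import numpy as np

def correlation_entropy(C):
    """Normalised eigenvalue entropy of a correlation matrix C.

    Returns 1.0 for the identity (no correlation) and values near 0 when
    the matrix is dominated by a single mode of systematic variation.
    """
    lam = np.linalg.eigvalsh(C)
    p = lam / lam.sum()
    p = p[p > 1e-12]                      # guard against log(0)
    return float(-(p * np.log(p)).sum() / np.log(C.shape[0]))

# Illustrative 3x3 correlation matrices for three resonance parameters
weakly = np.eye(3)
strongly = np.array([[1.0, 0.9, 0.9],
                     [0.9, 1.0, 0.9],
                     [0.9, 0.9, 1.0]])
print(correlation_entropy(weakly), correlation_entropy(strongly))
```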

  13. Land Building Models: Uncertainty in and Sensitivity to Input Parameters

    Science.gov (United States)

    2013-08-01

    ERDC/CHL CHETN-VI-44, August 2013, by Ty V. Wamsley. PURPOSE: The purpose of this Coastal and Hydraulics Engineering Technical Note (CHETN) is to document a... Related documents: Louisiana Coastal Area Ecosystem Restoration Projects Study, Vol. 3, Final integrated feasibility study; Coastal Louisiana Ecosystem Assessment and Restoration (CLEAR) Model of Louisiana Coastal Area (LCA) Comprehensive..., Nourishment Module, Chapter 8.

  14. Optimal Parameter and Uncertainty Estimation of a Land Surface Model: Sensitivity to Parameter Ranges and Model Complexities

    Institute of Scientific and Technical Information of China (English)

    Youlong XIA; Zong-Liang YANG; Paul L. STOFFA; Mrinal K. SEN

    2005-01-01

    Most previous land-surface model calibration studies have defined global ranges for their parameters to search for optimal parameter sets. Little work has been conducted to study the impacts of realistic versus global ranges as well as model complexities on the calibration and uncertainty estimates. The primary purpose of this paper is to investigate these impacts by applying Bayesian Stochastic Inversion (BSI) to the Chameleon Surface Model (CHASM). The CHASM was designed to explore the general aspects of land-surface energy balance representation within a common modeling framework that can be run from a simple energy balance formulation to a complex mosaic type structure. The BSI is an uncertainty estimation technique based on Bayes' theorem, importance sampling, and very fast simulated annealing. The model forcing data and surface flux data were collected at seven sites representing a wide range of climate and vegetation conditions. For each site, four experiments were performed with simple and complex CHASM formulations as well as realistic and global parameter ranges. Twenty-eight experiments were conducted and 50 000 parameter sets were used for each run. The results show that the use of global and realistic ranges gives similar simulations for both modes for most sites, but the global ranges tend to produce some unreasonable optimal parameter values. Comparison of simple and complex modes shows that the simple mode has more parameters with unreasonable optimal values. Use of parameter ranges and model complexities has significant impacts on the frequency distribution of parameters, marginal posterior probability density functions, and estimates of uncertainty of simulated sensible and latent heat fluxes. Comparison between model complexity and parameter ranges shows that the former has more significant impacts on parameter and uncertainty estimations.

  15. Statistical approach for uncertainty quantification of experimental modal model parameters

    DEFF Research Database (Denmark)

    Luczak, M.; Peeters, B.; Kahsin, M.

    2014-01-01

    ...estimates obtained from vibration experiments. Modal testing results are influenced by numerous factors introducing uncertainty into the measurement results. Different experimental techniques applied to the same test item, or testing numerous nominally identical specimens, yield different test results...

  16. Propagation of Uncertainty in System Parameters of a LWR Model by Sampling MCNPX Calculations - Burnup Analysis

    Science.gov (United States)

    Campolina, Daniel de A. M.; Lima, Claubia P. B.; Veloso, Maria Auxiliadora F.

    2014-06-01

    For all the physical components that comprise a nuclear system there is an uncertainty. Assessing the impact of uncertainties in the simulation of fissionable material systems is essential for best-estimate calculations, which have been replacing conservative model calculations as computational power increases. The propagation of uncertainty in a simulation using a Monte Carlo code by sampling the input parameters is recent because of the huge computational effort required. In this work a sample space of MCNPX calculations was used to propagate the uncertainty. The sample size was optimized using the Wilks formula for a 95th percentile and a two-sided statistical tolerance interval of 95%. Uncertainties in input parameters of the reactor considered included geometry dimensions and densities. The results showed the capacity of the sampling-based method for burnup analysis when the calculation sample size is optimized and many parameter uncertainties are investigated together in the same input.
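
The Wilks sample-size optimization mentioned here has a simple closed form. Assuming the standard first-order, two-sided formula with the sample minimum and maximum as tolerance limits:

```python
def wilks_two_sided(gamma=0.95, beta=0.95, nmax=1000):
    """Smallest sample size n such that the sample min and max bound at
    least a fraction `gamma` of the population with confidence `beta`
    (first-order, two-sided Wilks formula)."""
    for n in range(2, nmax):
        conf = 1.0 - gamma**n - n * (1.0 - gamma) * gamma**(n - 1)
        if conf >= beta:
            return n
    raise ValueError("nmax too small")

print(wilks_two_sided())  # → 93
```

The returned value of 93 is the commonly quoted 95%/95% two-sided first-order sample size; the appeal of the method is that this count is independent of how many uncertain input parameters are varied together.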

  17. Uncertainty in the relationship between flow and parameters in models of pollutant transport

    Science.gov (United States)

    Romanowicz, R.; Osuch, M.; Wallis, S.; Napiórkowski, J. J.

    2009-04-01

    Fluorescent dye-tracer studies are usually performed under steady-state flow conditions. However, the model parameters, estimated using the tracer data, depend on the discharges. This paper investigates uncertainties in the relationship between discharges and the parameters of transient storage (TS) and aggregated dead zone (ADZ) models. We apply a Bayesian statistical approach to derive the cumulative distribution of a range of model parameters conditioned on discharges. The data consist of eighteen tracer concentration profiles taken at different flow values at two cross-sections from the Murray Burn, a stream flowing through the Heriot-Watt University Campus at Riccarton in Edinburgh, Scotland. A number of studies have been reported of the dependence of TS and ADZ model parameters on discharge, but there are very few studies on the uncertainty related to that parameterization, which is the aim of this work. As the TS model is purely deterministic and the ADZ model is stochastic, different approaches are required to estimate the uncertainty in the dependence of their parameters on flow. The Generalised Likelihood Uncertainty Estimation (GLUE) approach is suitable for deterministic models and is therefore applied to the TS model. The method applies Monte Carlo sampling of parameter space used in multiple simulations of a deterministic transient storage model. The relationship between model parameters and flow has the form of a nonlinear regression model based on multiple random realizations of the deterministic transport model. The parameterization of that relationship and its introduction into the TS model allow for the conditioning of parameter estimates and, as a result, also model predictions on the whole set of available observations. In the case of the ADZ model, the approach is based on Monte Carlo sampling of ADZ model parameters, taking into account heteroscedastic variance of the observations and estimates of the covariance of the model parameters

  18. Estimation of Model and Parameter Uncertainty For A Distributed Rainfall-runoff Model

    Science.gov (United States)

    Engeland, K.

    The distributed rainfall-runoff model Ecomag is applied as a regional model for nine catchments in the NOPEX area in Sweden. Ecomag calculates streamflow on a daily time resolution. The posterior distribution of the model parameters is conditioned on the observed streamflow in all nine catchments, and calculated using Bayesian statistics. The distribution is estimated by Markov chain Monte Carlo (MCMC). The Bayesian method requires a definition of the likelihood of the parameters. Two alternative formulations are used. The first formulation is a subjectively chosen objective function describing the goodness of fit between the simulated and observed streamflow as it is used in the GLUE framework. The second formulation is to use a more statistically correct likelihood function that describes the simulation errors. The simulation error is defined as the difference between log-transformed observed and simulated streamflows. A statistical model for the simulation errors is constructed. Some parameters are dependent on the catchment, while others depend on climate. The statistical and the hydrological parameters are estimated simultaneously. Confidence intervals, due to the uncertainty of the Ecomag parameters, for the simulated streamflow are compared for the two likelihood functions. Confidence intervals based on the statistical model for the simulation errors are also calculated. The results indicate that the parameter uncertainty depends on the formulation of the likelihood function. The subjectively chosen likelihood function gives relatively wide confidence intervals whereas the 'statistical' likelihood function gives more narrow confidence intervals. The statistical model for the simulation errors indicates that the structural errors of the model are at least as important as the parameter uncertainty.
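
The GLUE-style formulation discussed in the last two abstracts — informal likelihood, behavioural threshold, likelihood-weighted predictive bounds — can be sketched with a toy rainfall-runoff model. Everything below (the linear-reservoir model, the Nash-Sutcliffe score, the 0.5 threshold) is an illustrative assumption, not the Ecomag or Murray Burn setup:

```python
import numpy as np

rng = np.random.default_rng(7)

def model(k, rain):
    """Toy linear-reservoir runoff model; k is the outflow coefficient."""
    s, q = 0.0, []
    for r in rain:
        s = s + r
        out = k * s
        s = s - out
        q.append(out)
    return np.array(q)

rain = rng.gamma(2.0, 2.0, size=100)
q_obs = model(0.3, rain) + rng.normal(scale=0.3, size=100)

# GLUE: Monte Carlo sampling of the parameter, scored by an informal
# likelihood (here Nash-Sutcliffe efficiency) instead of a formal error model
k_samples = rng.uniform(0.05, 0.9, size=2000)
nse = np.array([1 - np.sum((model(k, rain) - q_obs) ** 2)
                    / np.sum((q_obs - q_obs.mean()) ** 2) for k in k_samples])

behavioural = nse > 0.5                       # subjective threshold
preds = np.array([model(k, rain) for k in k_samples[behavioural]])
w = nse[behavioural] / nse[behavioural].sum()

# Likelihood-weighted 5-95% uncertainty band for the last time step
order = np.argsort(preds[:, -1])
cdf = np.cumsum(w[order])
band = np.interp([0.05, 0.95], cdf, preds[order, -1])
print(band)
```

Because the likelihood measure and the behavioural threshold are subjective choices, the width of the resulting band is itself a modelling decision — the point both abstracts make when contrasting GLUE with a formal statistical error model.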

  19. Stochastic and Perturbed Parameter Representations of Model Uncertainty in Convection Parameterization

    Science.gov (United States)

    Christensen, H. M.; Moroz, I.; Palmer, T.

    2015-12-01

    It is now acknowledged that representing model uncertainty in atmospheric simulators is essential for the production of reliable probabilistic ensemble forecasts, and a number of different techniques have been proposed for this purpose. Stochastic convection parameterization schemes use random numbers to represent the difference between a deterministic parameterization scheme and the true atmosphere, accounting for the unresolved subgrid-scale variability associated with convective clouds. An alternative approach varies the values of poorly constrained physical parameters in the model to represent the uncertainty in these parameters. This study presents new perturbed parameter schemes for use in the European Centre for Medium Range Weather Forecasts (ECMWF) convection scheme. Two types of scheme are developed and implemented. Both schemes represent the joint uncertainty in four of the parameters in the convection parametrisation scheme, which was estimated using the Ensemble Prediction and Parameter Estimation System (EPPES). The first scheme developed is a fixed perturbed parameter scheme, where the values of uncertain parameters are changed between ensemble members, but held constant over the duration of the forecast. The second is a stochastically varying perturbed parameter scheme. The performance of these schemes was compared to the ECMWF operational stochastic scheme, Stochastically Perturbed Parametrisation Tendencies (SPPT), and to a model which does not represent uncertainty in convection. The skill of probabilistic forecasts made using the different models was evaluated. While the perturbed parameter schemes improve on the stochastic parametrisation in some regards, the SPPT scheme outperforms the perturbed parameter approaches when considering forecast variables that are particularly sensitive to convection. Overall, SPPT schemes are the most skilful representations of model uncertainty due to convection parametrisation. Reference: H. M. Christensen, I

  20. The parameters uncertainty inflation fallacy

    CERN Document Server

    Pernot, Pascal

    2016-01-01

    Statistical estimation of the prediction uncertainty of physical models is typically hindered by the inadequacy of these models due to various approximations they are built upon. The prediction errors due to model inadequacy can be handled either by correcting the model's results, or by adapting the model's parameters uncertainty to generate prediction uncertainty representative, in a way to be defined, of model inadequacy errors. The main advantage of the latter approach is its transferability to the prediction of other quantities of interest based on the same parameters. A critical review of state-of-the-art implementations of this approach in computational chemistry shows that it is biased, in the sense that it does not produce prediction uncertainty bands conforming with model inadequacy errors.

  1. Parameter estimation and uncertainty assessment in hydrological modelling

    DEFF Research Database (Denmark)

    Blasone, Roberta-Serena

    A rational and efficient administration of water resources requires insight into and understanding of the hydrological processes, as well as accurate estimates of the water quantities available in both surface water and groundwater reservoirs. For this purpose, hydrological models are an indispensable tool. Over the last 10-20 years, much research has been devoted to hydrological processes and, in particular, to implementing this knowledge in numerical modelling systems. This has led to models of increasing complexity. At the same time, a range of different techniques for estimating model parameters and for assessing the uncertainty of model predictions... have been the long computation times and extensive data requirements that characterise this type of model, and which constitute a major problem for recursive application of the models. In addition, the complex models are usually not freely available in the same way as the simple rainfall...

  3. Incorporating Parameter Uncertainty in Bayesian Segmentation Models: Application to Hippocampal Subfield Volumetry

    OpenAIRE

    Iglesias, Juan Eugenio; Sabuncu, Mert Rory; Van Leemput, Koen

    2012-01-01

    Many successful segmentation algorithms are based on Bayesian models in which prior anatomical knowledge is combined with the available image information. However, these methods typically have many free parameters that are estimated to obtain point estimates only, whereas a faithful Bayesian analysis would also consider all possible alternate values these parameters may take. In this paper, we propose to incorporate the uncertainty of the free parameters in Bayesian segmentation models more a...

  4. Uncertainty from synergistic effects of multiple parameters in the Johnson and Ettinger (1991) vapor intrusion model

    Science.gov (United States)

    Tillman, Fred D.; Weaver, James W.

    Migration of volatile chemicals from the subsurface into overlying buildings is known as vapor intrusion (VI). Under certain circumstances, people living in homes above contaminated soil or ground water may be exposed to harmful levels of these vapors. VI is a particularly difficult pathway to assess, as challenges exist in delineating subsurface contributions to measured indoor-air concentrations as well as in adequate characterization of subsurface parameters necessary to calibrate a predictive flow and transport model. Often, a screening-level model is employed to determine if a potential indoor inhalation exposure pathway exists and, if such a pathway is complete, whether long-term exposure increases the occupants' risk for cancer or other toxic effects to an unacceptable level. A popular screening-level algorithm currently in wide use in the United States, Canada and the UK for making such determinations is the "Johnson and Ettinger" (J&E) model. Concern exists over using the J&E model for deciding whether or not further action is necessary at sites as many parameters are not routinely measured (or are un-measurable). Many screening decisions are then made based on simulations using "best estimate" look-up parameter values. While research exists on the sensitivity of the J&E model to individual parameter uncertainty, little published information is available on the combined effects of multiple uncertain parameters and their effect on screening decisions. This paper presents results of multiple-parameter uncertainty analyses using the J&E model to evaluate risk to humans from VI. Software was developed to produce automated uncertainty analyses of the model. Results indicate an increase in predicted cancer risk from multiple-parameter uncertainty by nearly a factor of 10 compared with single-parameter uncertainty. 
Additionally, a positive skew in model response to the variation of some parameters was noted for both single- and multiple-parameter uncertainty analyses.

  5. Utilizing Soize's Approach to Identify Parameter and Model Uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Bonney, Matthew S. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Univ. of Wisconsin, Madison, WI (United States); Brake, Matthew Robert [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-10-01

    Quantifying uncertainty in model parameters is a challenging task for analysts. Soize has derived a method that is able to characterize both model and parameter uncertainty independently. This method is explained under the assumption that some experimental data are available, and is divided into seven steps. Monte Carlo analyses are performed to select the optimal dispersion variable to match the experimental data. In addition to the nominal approach, an alternative distribution can be used, with corrections that expand the scope of the method. This method is one of very few that can quantify uncertainty in the model form independently of the input parameters. Two examples are provided to illustrate the methodology, and example code is provided in the Appendix.
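The dispersion-selection step can be sketched in a few lines: draw Monte Carlo realizations for candidate dispersion values and keep the one whose simulated spread best matches the experimental scatter. Everything below (the 100 Hz nominal frequency, the synthetic "experimental" sample) is an invented stand-in, not the example from the report:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "experimental" data: observed natural frequencies (Hz).
experimental = rng.normal(100.0, 5.0, size=40)

def simulate(delta, n=2000):
    """Monte Carlo draws of the model output for a dispersion parameter delta;
    the nominal model predicts 100 Hz and delta scales the random spread."""
    return 100.0 * (1.0 + delta * rng.standard_normal(n))

# Sweep candidate dispersion values and keep the one whose simulated spread
# best matches the experimental scatter (crude moment matching).
candidates = np.linspace(0.01, 0.10, 10)
errors = [abs(np.std(simulate(d)) - np.std(experimental)) for d in candidates]
best = float(candidates[int(np.argmin(errors))])
print(f"selected dispersion parameter: {best:.2f}")
```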

  6. A novel approach to parameter uncertainty analysis of hydrological models using neural networks

    Directory of Open Access Journals (Sweden)

    D. P. Solomatine

    2009-07-01

    Full Text Available In this study, a methodology has been developed to emulate a time-consuming Monte Carlo (MC) simulation by using an Artificial Neural Network (ANN) for the assessment of model parametric uncertainty. First, an MC simulation of a given process model is run. Then an ANN is trained to approximate the functional relationships between the input variables of the process model and the synthetic uncertainty descriptors estimated from the MC realizations. The trained ANN model encapsulates the underlying characteristics of the parameter uncertainty and can be used to predict uncertainty descriptors for new data vectors. This approach was validated by comparing the uncertainty descriptors in the verification data set with those obtained by the MC simulation. The method is applied to estimate the parameter uncertainty of a lumped conceptual hydrological model, HBV, for the Brue catchment in the United Kingdom. The results are quite promising, as the prediction intervals estimated by the ANN are reasonably accurate. The proposed techniques could be useful in real-time applications when it is not practicable to run a large number of simulations for complex hydrological models and when the forecast lead time is very short.
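The three-step procedure (MC simulation, emulator training, fast prediction) can be sketched as below. A least-squares line stands in for the paper's ANN, and the linear runoff model and all numbers are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "process model": runoff = a * rainfall, with uncertain parameter a.
a_samples = rng.normal(0.6, 0.1, size=5000)        # MC parameter realizations

def mc_interval_width(rain):
    """Width of the 5-95% prediction interval from the MC ensemble."""
    runoff = a_samples * rain
    return np.percentile(runoff, 95) - np.percentile(runoff, 5)

# Step 1: run the (expensive) MC simulation on a set of training inputs.
train_x = np.linspace(1.0, 20.0, 30)
train_w = np.array([mc_interval_width(x) for x in train_x])

# Step 2: train a cheap emulator of (input -> uncertainty descriptor).
coef = np.polyfit(train_x, train_w, deg=1)

# Step 3: predict the descriptor for a new input without re-running the MC.
predicted = np.polyval(coef, 12.5)
print(f"emulated interval width at rain=12.5: {predicted:.2f}")
```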

  7. Modelling pesticide leaching under climate change: parameter vs. climate input uncertainty

    Directory of Open Access Journals (Sweden)

    K. Steffens

    2013-08-01

    Full Text Available The assessment of climate change impacts on the risk of pesticide leaching needs careful consideration of different sources of uncertainty. We investigated the uncertainty related to climate scenario input and its importance relative to parameter uncertainty of the pesticide leaching model. The pesticide fate model MACRO was calibrated against a comprehensive one-year field data set for a well-structured clay soil in south-west Sweden. We obtained an ensemble of 56 acceptable parameter sets that represented the parameter uncertainty. Nine different climate model projections of the regional climate model RCA3 were available, as driven by different combinations of global climate models (GCMs), greenhouse gas emission scenarios and initial states of the GCM. The future time series of weather data used to drive the MACRO model were generated by scaling a reference climate data set (1970–1999) for an important agricultural production area in south-west Sweden, based on monthly change factors for 2070–2099. Thirty-year simulations were performed for different combinations of pesticide properties and application seasons. Our analysis showed that both the magnitude and the direction of the predicted change in pesticide leaching from present to future depended strongly on the particular climate scenario. The effect of parameter uncertainty was of major importance for simulating absolute pesticide losses, whereas the climate uncertainty was relatively more important for predictions of changes in pesticide losses from present to future. The climate uncertainty should be accounted for by applying an ensemble of different climate scenarios. The aggregated ensemble prediction based on both acceptable parameterizations and different climate scenarios could provide robust probabilistic estimates of future pesticide losses and assessments of changes in pesticide leaching risks.

  8. Modelling pesticide leaching under climate change: parameter vs. climate input uncertainty

    Directory of Open Access Journals (Sweden)

    K. Steffens

    2014-02-01

    Full Text Available Assessing climate change impacts on pesticide leaching requires careful consideration of different sources of uncertainty. We investigated the uncertainty related to climate scenario input and its importance relative to parameter uncertainty of the pesticide leaching model. The pesticide fate model MACRO was calibrated against a comprehensive one-year field data set for a well-structured clay soil in south-western Sweden. We obtained an ensemble of 56 acceptable parameter sets that represented the parameter uncertainty. Nine different climate model projections of the regional climate model RCA3 were available, as driven by different combinations of global climate models (GCMs), greenhouse gas emission scenarios and initial states of the GCM. The future time series of weather data used to drive the MACRO model were generated by scaling a reference climate data set (1970–1999) for an important agricultural production area in south-western Sweden, based on monthly change factors for 2070–2099. Thirty-year simulations were performed for different combinations of pesticide properties and application seasons. Our analysis showed that both the magnitude and the direction of the predicted change in pesticide leaching from present to future depended strongly on the particular climate scenario. The effect of parameter uncertainty was of major importance for simulating absolute pesticide losses, whereas the climate uncertainty was relatively more important for predictions of changes in pesticide losses from present to future. The climate uncertainty should be accounted for by applying an ensemble of different climate scenarios. The aggregated ensemble prediction based on both acceptable parameterizations and different climate scenarios has the potential to provide robust probabilistic estimates of future pesticide losses.
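The ensemble logic can be illustrated with a deliberately simple stand-in model (all values hypothetical): crossing behavioral parameter sets with scenario change factors shows why parameter uncertainty dominates the absolute losses while the predicted change is driven by the climate scenarios — in this toy the parameter cancels exactly in the future/present ratio:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical stand-ins: 56 behavioral parameter sets and 9 climate scenarios.
param_sets = rng.uniform(0.5, 1.5, size=56)       # leaching sensitivity factors
reference_rain = 700.0                            # mm/yr, present climate
change_factors = rng.uniform(0.9, 1.3, size=9)    # scenario rainfall scaling

def annual_loss(param, rain):
    """Toy pesticide-loss model: loss grows with rainfall and the parameter."""
    return 0.01 * param * rain

# Cross every behavioral parameter set with every climate scenario.
present = np.array([annual_loss(p, reference_rain) for p in param_sets])
future = np.array([[annual_loss(p, reference_rain * cf) for cf in change_factors]
                   for p in param_sets])

# In the ratio (future/present) the parameter cancels, so the spread of the
# predicted *change* is set by the climate scenarios, while the spread of the
# absolute losses is set by the parameter uncertainty.
change = future / present[:, None]
print("spread of absolute losses:", round(float(present.std()), 2))
print("spread of relative change:", round(float(change.std()), 3))
```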

  9. Water quality modeling under hydrologic variability and parameter uncertainty using erosion-scaled export coefficients

    Science.gov (United States)

    Khadam, Ibrahim M.; Kaluarachchi, Jagath J.

    2006-10-01

    Water quality modeling is important to assess the health of a watershed and to make necessary management decisions to control existing and future pollution of receiving water bodies. The existing export coefficient approach is attractive due to minimum data requirements; however, this method does not account for hydrologic variability. In this paper, an erosion-scaled export coefficient approach is proposed that can model and explain the hydrologic variability in predicting the annual phosphorus (P) loading to the receiving stream. Here, sediment discharge was introduced into the export coefficient model as a surrogate for hydrologic variability. Application of this approach to model P in the Fishtrap Creek of Washington State showed the superiority of this approach compared to the traditional export coefficient approach, while maintaining its simplicity and low data requirement characteristics. In addition, a Bayesian framework is proposed to assess the parameter uncertainty of the export coefficient method instead of subjective assignment of uncertainty. This work also showed through a joint variability-uncertainty analysis the importance of separate consideration of hydrologic variability and parameter uncertainty, as these represent two independent and important characteristics of the overall model uncertainty. The paper also recommends the use of a longitudinal data collection scheme to reduce the uncertainty in export coefficients.
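The export-coefficient bookkeeping, and the erosion scaling used as a surrogate for hydrologic variability, might look like the sketch below (land uses, coefficients and sediment loads are invented):

```python
import numpy as np

# Hypothetical land uses (areas in ha) and export coefficients (kg P/ha/yr).
areas = np.array([120.0, 300.0, 80.0])        # cropland, pasture, forest
export_coeffs = np.array([1.2, 0.5, 0.1])

# Classic export-coefficient estimate: one static annual load.
static_load = float(areas @ export_coeffs)

# Erosion-scaled variant: scale each year by its sediment discharge relative
# to the long-term mean, as a surrogate for hydrologic variability.
sediment = np.array([0.6, 1.0, 1.8, 0.9])     # t/yr for four years
scaled_loads = static_load * sediment / sediment.mean()

print(f"static annual load: {static_load:.1f} kg P/yr")
print("erosion-scaled annual loads:", np.round(scaled_loads, 1))
```

The scaled loads average back to the static estimate, so the variant redistributes the load across wet and dry years rather than changing the long-term total.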

  10. Assessment of structural model and parameter uncertainty with a multi-model system for soil water balance models

    Science.gov (United States)

    Michalik, Thomas; Multsch, Sebastian; Frede, Hans-Georg; Breuer, Lutz

    2016-04-01

    Water for agriculture is strongly limited in arid and semi-arid regions and is often of low quality in terms of salinity. The application of saline waters for irrigation increases the salt load in the rooting zone and has to be managed by leaching to maintain a healthy soil, i.e. washing out salts by additional irrigation. Dynamic simulation models are helpful tools to calculate the root-zone water fluxes and soil salinity content in order to investigate best management practices. However, there is little information on structural and parameter uncertainty for simulations regarding the water and salt balance of saline irrigation. Hence, we established a multi-model system with four different models (AquaCrop, RZWQM, SWAP, Hydrus1D/UNSATCHEM) to analyze the structural and parameter uncertainty by using the Generalized Likelihood Uncertainty Estimation (GLUE) method. Hydrus1D/UNSATCHEM and SWAP were set up with multiple sets of different implemented functions (e.g. matric and osmotic stress for root water uptake), which results in a broad range of different model structures. The simulations were evaluated against soil water and salinity content observations. The posterior distribution of the GLUE analysis gives behavioral parameter sets and reveals uncertainty intervals for parameter uncertainty. Throughout all of the model sets, most parameters accounting for the soil water balance show a low uncertainty; only one or two of the five to six parameters in each model set display a high uncertainty (e.g. the pore-size distribution index in SWAP and Hydrus1D/UNSATCHEM). The differences between the models and model setups reveal the structural uncertainty. The highest structural uncertainty is observed for deep percolation fluxes between the model sets of Hydrus1D/UNSATCHEM (~200 mm) and RZWQM (~500 mm), which are more than twice as high for the latter. The model sets also show a high variation in uncertainty intervals for deep percolation, with an interquartile range (IQR) of
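A GLUE analysis of the kind used here follows a standard recipe: sample parameter sets from a prior, score each with an informal likelihood, retain the "behavioral" sets above a threshold, and read prediction bounds off the behavioral ensemble. A one-parameter sketch with synthetic data:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic "observations" from a hidden true model y = 2.0 * x.
x = np.linspace(0.0, 1.0, 20)
obs = 2.0 * x + rng.normal(0.0, 0.05, size=x.size)

# GLUE: sample parameter sets from a prior, score each with an informal
# likelihood, and keep only the "behavioral" sets above a threshold.
samples = rng.uniform(0.0, 4.0, size=5000)
sse = np.array([np.sum((obs - a * x) ** 2) for a in samples])
likelihood = np.exp(-sse / sse.min())
behavioral = samples[likelihood > 0.3 * likelihood.max()]

# Prediction bounds at a new input, read off the behavioral ensemble.
preds = behavioral * 0.5
low, high = np.percentile(preds, [5, 95])
print(f"{behavioral.size} behavioral sets; 90% band at x=0.5: [{low:.2f}, {high:.2f}]")
```

Both the likelihood measure and the behavioral threshold are subjective choices in GLUE; the values above are arbitrary for illustration.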

  11. Fault detection and identification in dynamic systems with noisy data and parameter/modeling uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Dinca, Laurian; Aldemir, Tunc; Rizzoni, Giorgio

    1999-06-01

    A probabilistic approach is presented that can be used for the estimation of system parameters and unmonitored state variables towards model-based fault diagnosis in dynamic systems. The method can be used with any type of input-output model and can accommodate noisy data and/or parameter/modeling uncertainties. The methodology is based on a Markovian representation of the system dynamics in discretized state space. The example system used to illustrate the methodology focuses on the intake, fueling, combustion and exhaust components of internal combustion engines. The results show that the methodology is capable of estimating the system parameters and tracking the unmonitored dynamic variables within user-specified magnitude intervals (which may reflect noise in the monitored data, random changes in the parameters, or modeling uncertainties in general) within the data collection time, and hence has potential for on-line implementation.

  12. Robust model-reference control for descriptor linear systems subject to parameter uncertainties

    Institute of Scientific and Technical Information of China (English)

    Guangren DUAN; Biao ZHANG

    2007-01-01

    Robust model-reference control for descriptor linear systems with structural parameter uncertainties is investigated. A sufficient condition for the existence of a model-reference zero-error asymptotic tracking controller is given. It is shown that the robust model-reference control problem can be decomposed into two subproblems: a robust state-feedback stabilization problem for descriptor systems subject to parameter uncertainties, and a robust compensation problem. The latter aims to find three coefficient matrices that satisfy four matrix equations and simultaneously minimize the effect of the uncertainties on the tracking error. Based on a complete parametric solution to a class of generalized Sylvester matrix equations, the robust compensation problem is converted into a minimization problem with quadratic cost and linear constraints. A numerical example shows the effectiveness of the proposed approach.
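The compensation step rests on solving linear matrix equations; a plain Sylvester equation AX + XB = C, solved by Kronecker vectorization, illustrates the mechanics (the matrices are arbitrary examples, not from the paper):

```python
import numpy as np

# Basic Sylvester equation A X + X B = C, solved by vectorization:
# (I kron A + B^T kron I) vec(X) = vec(C), with column-major vec.
A = np.array([[2.0, 0.0], [1.0, 3.0]])
B = np.array([[1.0, 1.0], [0.0, 2.0]])
C = np.array([[1.0, 0.0], [0.0, 1.0]])

n = A.shape[0]
K = np.kron(np.eye(n), A) + np.kron(B.T, np.eye(n))
X = np.linalg.solve(K, C.flatten(order="F")).reshape((n, n), order="F")

# The residual of the matrix equation should vanish.
print(np.abs(A @ X + X @ B - C).max())
```

A unique solution exists whenever A and -B share no eigenvalues, which holds here (eigenvalues {2, 3} versus {-1, -2}).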

  13. Uncertainties of optical-model parameters for the study of the threshold anomaly

    CERN Document Server

    Abriola, Daniel; Testoni, J; Gollan, F; Martí, G V

    2015-01-01

    In the analysis of elastic-scattering experimental data, optical-model parameters (usually the depths of the real and imaginary potentials) are fitted, and conclusions are drawn by analyzing their variation at bombardment energies close to the Coulomb barrier (the threshold anomaly). The judgement about the shape of this variation (related to the physical processes producing the anomaly) depends on these fitted values, but the robustness of the conclusions strongly depends on the uncertainties with which these parameters are derived. We show that previously published studies have not used a common criterion for the evaluation of the parameter uncertainties. In this work, a study of these uncertainties is presented, using conventional statistical tools as well as bootstrapping techniques. As case studies, these procedures are applied to re-analyze detailed elastic-scattering data for the $^{12}$C + $^{208}$Pb and $^6$Li + $^{80}$Se systems.
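The bootstrap idea for fitted-parameter uncertainties can be sketched with a one-parameter fit: resample the fit residuals, refit, and take the spread of refitted values as the uncertainty. The exponential "shape" and all numbers below are invented stand-ins for a real optical-model fit:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic data for a one-parameter model y = V * f(x): the fixed shape f
# stands in for a potential form factor and V for the fitted depth.
x = np.linspace(0.1, 2.0, 25)
f = np.exp(-x)
y = 50.0 * f + rng.normal(0.0, 1.0, size=x.size)

def fit(yy):
    """Linear least-squares estimate of the depth V."""
    return float(np.sum(yy * f) / np.sum(f * f))

V_hat = fit(y)
residuals = y - V_hat * f

# Bootstrap: resample residuals, refit, and report the spread as the
# parameter uncertainty.
boot = np.array([
    fit(V_hat * f + rng.choice(residuals, size=x.size, replace=True))
    for _ in range(2000)
])
print(f"V = {V_hat:.1f} +/- {boot.std():.1f}")
```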

  14. Parameter uncertainty in CGE Modeling of the environmental impacts of economic policies

    Energy Technology Data Exchange (ETDEWEB)

    Abler, D.G.; Shortle, J.S. [Agricultural Economics, Pennsylvania State University, University Park, PA (United States); Rodriguez, A.G. [University of Costa Rica, San Jose (Costa Rica)

    1999-07-01

    This study explores the role of parameter uncertainty in Computable General Equilibrium (CGE) modeling of the environmental impacts of macroeconomic and sectoral policies, using Costa Rica as a case study. A CGE model is constructed which includes eight environmental indicators covering deforestation, pesticides, overfishing, hazardous wastes, inorganic wastes, organic wastes, greenhouse gases, and air pollution. The parameters are treated as random variables drawn from prespecified distributions. Evaluation of each policy option consists of a Monte Carlo experiment. The impacts of the policy options on the environmental indicators are relatively robust to different parameter values, in spite of the wide range of parameter values employed. 33 refs.

  15. Bayesian parameter inference for empirical stochastic models of paleoclimatic records with dating uncertainty

    Science.gov (United States)

    Boers, Niklas; Goswami, Bedartha; Chekroun, Mickael; Svensson, Anders; Rousseau, Denis-Didier; Ghil, Michael

    2016-04-01

    In the recent past, empirical stochastic models have been successfully applied to model a wide range of climatic phenomena [1,2]. In addition to enhancing our understanding of the geophysical systems under consideration, multilayer stochastic models (MSMs) have been shown to be solidly grounded in the Mori-Zwanzig formalism of statistical physics [3]. They are also well-suited for predictive purposes, e.g., for the El Niño Southern Oscillation [4] and the Madden-Julian Oscillation [5]. In general, these models are trained on a given time series under consideration, and then assumed to reproduce certain dynamical properties of the underlying natural system. Most existing approaches are based on least-squares fitting to determine optimal model parameters, which does not allow for an uncertainty estimation of these parameters. This approach significantly limits the degree to which dynamical characteristics of the time series can be safely inferred from the model. Here, we are specifically interested in fitting low-dimensional stochastic models to time series obtained from paleoclimatic proxy records, such as the oxygen isotope ratio and dust concentration of the NGRIP record [6]. The time series derived from these records exhibit substantial dating uncertainties, in addition to the proxy measurement errors. In particular, for time series of this kind, it is crucial to obtain uncertainty estimates for the final model parameters. Following [7], we first propose a statistical procedure to shift dating uncertainties from the time axis to the proxy axis of layer-counted paleoclimatic records. Thereafter, we show how Maximum Likelihood Estimation in combination with Markov Chain Monte Carlo parameter sampling can be employed to translate all uncertainties present in the original proxy time series to uncertainties of the parameter estimates of the stochastic model. We compare time series simulated by the empirical model to the original time series in terms of standard
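The MLE-plus-MCMC step can be sketched in its simplest possible form: a random-walk Metropolis sampler for one location parameter of a Gaussian likelihood (synthetic data; a real application would sample the full stochastic-model parameter vector):

```python
import numpy as np

rng = np.random.default_rng(5)

# Proxy-like series: noisy observations of an unknown level mu (sigma known).
data = rng.normal(3.0, 1.0, size=200)
sigma = 1.0

def log_likelihood(mu):
    return -0.5 * np.sum((data - mu) ** 2) / sigma**2

# Random-walk Metropolis with a flat prior: the retained chain approximates
# the posterior of mu, giving the uncertainty a least-squares fit would omit.
mu, ll = 0.0, log_likelihood(0.0)
chain = []
for _ in range(5000):
    prop = mu + 0.2 * rng.standard_normal()
    ll_prop = log_likelihood(prop)
    if np.log(rng.uniform()) < ll_prop - ll:
        mu, ll = prop, ll_prop
    chain.append(mu)
chain = np.array(chain[1000:])   # drop burn-in
print(f"mu = {chain.mean():.2f} +/- {chain.std():.2f}")
```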

  16. Parameter sensitivity and uncertainty analysis for a storm surge and wave model

    Science.gov (United States)

    Bastidas, Luis A.; Knighton, James; Kline, Shaun W.

    2016-09-01

    Development and simulation of synthetic hurricane tracks is a common methodology used to estimate hurricane hazards in the absence of empirical coastal surge and wave observations. Such methods typically rely on numerical models to translate stochastically generated hurricane wind and pressure forcing into coastal surge and wave estimates. The model output uncertainty associated with selection of appropriate model parameters must therefore be addressed. The computational overburden of probabilistic surge hazard estimates is exacerbated by the high dimensionality of numerical surge and wave models. We present a model parameter sensitivity analysis of the Delft3D model for the simulation of hazards posed by Hurricane Bob (1991) utilizing three theoretical wind distributions (NWS23, modified Rankine, and Holland). The sensitive model parameters (of 11 total considered) include wind drag, the depth-induced breaking γB, and the bottom roughness. Several parameters show no sensitivity (threshold depth, eddy viscosity, wave triad parameters, and depth-induced breaking αB) and can therefore be excluded to reduce the computational overburden of probabilistic surge hazard estimates. The sensitive model parameters also demonstrate a large number of interactions between parameters and a nonlinear model response. While model outputs showed sensitivity to several parameters, the ability of these parameters to act as tuning parameters for calibration is somewhat limited as proper model calibration is strongly reliant on accurate wind and pressure forcing data. A comparison of the model performance with forcings from the different wind models is also presented.

  17. Impact of uncertainties in discharge determination on the parameter estimation and performance of a hydrological model

    NARCIS (Netherlands)

    Tillaart, van den S.P.M.; Booij, M.J.; Krol, M.S.

    2013-01-01

    Uncertainties in discharge determination may have serious consequences for hydrological modelling and resulting discharge predictions used for flood forecasting, climate change impact assessment and reservoir operation. The aim of this study is to quantify the effect of discharge errors on parameter

  18. Are subject-specific musculoskeletal models robust to the uncertainties in parameter identification?

    Directory of Open Access Journals (Sweden)

    Giordano Valente

    Full Text Available Subject-specific musculoskeletal modeling can be applied to study musculoskeletal disorders, allowing the inclusion of personalized anatomy and properties. Independent of the tools used for model creation, there are unavoidable uncertainties associated with parameter identification, whose effect on model predictions is still not fully understood. The aim of the present study was to analyze the sensitivity of subject-specific model predictions (i.e., joint angles, joint moments, muscle and joint contact forces) during walking to the uncertainties in the identification of body landmark positions, maximum muscle tension and musculotendon geometry. To this aim, we created an MRI-based musculoskeletal model of the lower limbs, defined as a 7-segment, 10-degree-of-freedom articulated linkage, actuated by 84 musculotendon units. We then performed a Monte Carlo probabilistic analysis, perturbing model parameters according to their uncertainty and solving a typical inverse dynamics and static optimization problem using 500 models that included the different sets of perturbed variable values. Model creation and gait simulations were performed using freely available software that we developed to standardize the process of model creation, integrate with OpenSim and create probabilistic simulations of movement. The uncertainties in input variables had a moderate effect on model predictions, as muscle and joint contact forces showed a maximum standard deviation of 0.3 times body weight and a maximum range of 2.1 times body weight. In addition, the output variables significantly correlated with few input variables (up to 7 out of 312) across the gait cycle, including the geometry definition of larger muscles and the maximum muscle tension in limited gait portions. Although we found subject-specific models not markedly sensitive to parameter identification, researchers should be aware of the model precision in relation to the intended application. In fact, force

  19. Parameter Uncertainty in CGE Modeling of the Macroeconomic Impact of Carbon Reduction in China

    Institute of Scientific and Technical Information of China (English)

    WANG Can; CHEN Jining

    2006-01-01

    Formal methods are used to characterize the uncertainty in computable general equilibrium (CGE) model outputs to assess the use of the CGE model of China (the integrated energy-economy-environment dynamic CGE model, TEDCGE) for carbon tax policy issues. A Monte Carlo experiment was used for the parameter uncertainty propagation and an unconditional sensitivity analysis, using the variance of the conditional expectation (VCE) as the importance index to identify critical uncertainties. The results illustrate the statistical characteristics of the TEDCGE outputs and the sensitivities of the TEDCGE outputs to 50 uncertain elasticities. The results show that the carbon tax level for a predefined emission reduction goal is quite sensitive to both the capital-energy substitution elasticity and the inter-fuel substitution elasticity in the production function, whereas the inter-fuel substitution elasticity is the only key parameter for the GDP reduction rate. Among the various sectors, heavy industry and electricity are the most strongly affected by a carbon tax.
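The VCE importance index is the first-order Sobol numerator Var(E[Y|Xi]); a binning estimator on a toy two-parameter model shows the computation (model and coefficients are invented, not the TEDCGE elasticities):

```python
import numpy as np

rng = np.random.default_rng(6)

# Toy model output depending on two uncertain elasticities x1, x2.
n = 200_000
x1 = rng.uniform(0.0, 1.0, n)
x2 = rng.uniform(0.0, 1.0, n)
y = 4.0 * x1 + x2 + rng.normal(0.0, 0.1, n)

def vce(x, y, bins=50):
    """Variance of the conditional expectation, Var(E[Y|X]), by binning X."""
    edges = np.quantile(x, np.linspace(0.0, 1.0, bins + 1))
    idx = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, bins - 1)
    means = np.array([y[idx == b].mean() for b in range(bins)])
    counts = np.array([(idx == b).sum() for b in range(bins)])
    overall = np.average(means, weights=counts)
    return np.average((means - overall) ** 2, weights=counts)

total = y.var()
s1 = vce(x1, y) / total    # importance of x1
s2 = vce(x2, y) / total    # importance of x2
print(f"VCE importance: x1 = {s1:.2f}, x2 = {s2:.2f}")
```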

  20. An Integrated Hydrologic Bayesian Multi-Model Combination Framework: Confronting Input, parameter and model structural uncertainty in Hydrologic Prediction

    Energy Technology Data Exchange (ETDEWEB)

    Ajami, N K; Duan, Q; Sorooshian, S

    2006-05-05

    This paper presents a new technique, the Integrated Bayesian Uncertainty Estimator (IBUNE), to account explicitly for the major uncertainties of hydrologic rainfall-runoff predictions. The uncertainties from the input (forcing) data, mainly the precipitation observations, and from the model parameters are reduced through a Markov Chain Monte Carlo (MCMC) scheme named the Shuffled Complex Evolution Metropolis (SCEM) algorithm, which has been extended to include a precipitation error model. Afterwards, the Bayesian Model Averaging (BMA) scheme is employed to further improve the prediction skill and uncertainty estimation using multiple model outputs. A series of case studies using three rainfall-runoff models to predict the streamflow in the Leaf River basin, Mississippi, is used to examine the necessity and usefulness of this technique. The results suggest that ignoring either input forcing errors or model structural uncertainty will lead to unrealistic model simulations and associated uncertainty bounds that do not consistently capture and represent the real-world behavior of the watershed.
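A minimal illustration of likelihood-weighted model combination in the spirit of BMA (in practice BMA weights are usually fitted by expectation-maximization on predictive distributions; here a crude Gaussian likelihood weighting of three synthetic "models" stands in):

```python
import numpy as np

rng = np.random.default_rng(7)

# Streamflow "observations" and three imperfect model predictions with
# known error scales (2, 5 and 9 flow units).
obs = rng.gamma(2.0, 10.0, size=300)
scales = (2.0, 5.0, 9.0)
preds = np.stack([obs + rng.normal(0.0, s, size=obs.size) for s in scales])

# Weight each model by its Gaussian log-likelihood of the observations,
# then combine the predictions as a weighted mean.
log_lik = np.array([
    -0.5 * np.sum(((obs - p) / s) ** 2) - obs.size * np.log(s)
    for p, s in zip(preds, scales)
])
weights = np.exp(log_lik - log_lik.max())
weights /= weights.sum()
combined = weights @ preds
print("model weights:", np.round(weights, 3))
```

With such a clear skill gap the best model dominates the combination; with comparably skilled models the weights spread and the mixture also widens the predictive uncertainty bounds.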

  1. A software tool to assess uncertainty in transient-storage model parameters using Monte Carlo simulations

    Science.gov (United States)

    Ward, Adam S.; Kelleher, Christa A.; Mason, Seth J. K.; Wagener, Thorsten; McIntyre, Neil; McGlynn, Brian L.; Runkel, Robert L.; Payn, Robert A.

    2017-01-01

    Researchers and practitioners alike often need to understand and characterize how water and solutes move through a stream in terms of the relative importance of in-stream and near-stream storage and transport processes. In-channel and subsurface storage processes are highly variable in space and time and difficult to measure. Storage estimates are commonly obtained using transient-storage models (TSMs) of the experimentally obtained solute-tracer test data. The TSM equations represent key transport and storage processes with a suite of numerical parameters. Parameter values are estimated via inverse modeling, in which parameter values are iteratively changed until model simulations closely match the observed solute-tracer data. Several investigators have shown that TSM parameter estimates can be highly uncertain. When this is the case, parameter values cannot be used reliably to interpret stream-reach functioning. However, authors of most TSM studies do not evaluate or report parameter certainty. Here, we present a software tool linked to the One-dimensional Transport with Inflow and Storage (OTIS) model that enables researchers to conduct uncertainty analyses via Monte Carlo parameter sampling and to visualize uncertainty and sensitivity results. We demonstrate application of our tool to two case studies and compare our results to output obtained from more traditional implementation of the OTIS model. We conclude by suggesting best practices for transient-storage modeling and recommend that future applications of TSMs include assessments of parameter certainty to support comparisons and more reliable interpretations of transport processes.

  2. Novel Method for Incorporating Model Uncertainties into Gravitational Wave Parameter Estimates

    CERN Document Server

    Moore, Christopher J

    2014-01-01

    Posterior distributions on parameters computed from experimental data using Bayesian techniques are only as accurate as the models used to construct them. In many applications these models are incomplete, which both reduces the prospects of detection and leads to a systematic error in the parameter estimates. In the analysis of data from gravitational wave detectors, for example, accurate waveform templates can be computed using numerical methods, but the prohibitive cost of these simulations means this can only be done for a small handful of parameters. In this work a novel method to fold model uncertainties into data analysis is proposed; the waveform uncertainty is analytically marginalised over, using a prior distribution constructed by Gaussian process regression to interpolate the waveform difference from a small training set of accurate templates. The method is well motivated, easy to implement, and no more computationally expensive than standard techniques. The new method is shown to perform...
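The Gaussian-process interpolation at the heart of the method can be sketched in one dimension: condition a GP on a few accurate "template difference" values and read off a posterior mean and variance at a new parameter point (a sine function and all settings below are invented stand-ins for the waveform difference):

```python
import numpy as np

def rbf(a, b, length=0.5):
    """Squared-exponential kernel between two 1-D point sets."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length**2)

# Small training set of "accurate template differences" (a sine stands in).
train_x = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
train_y = np.sin(train_x)

K = rbf(train_x, train_x) + 1e-8 * np.eye(train_x.size)   # jitter for stability
alpha = np.linalg.solve(K, train_y)

# GP posterior mean and variance at a new parameter point.
test_x = np.array([0.75])
k_star = rbf(test_x, train_x)
mean = k_star @ alpha
var = rbf(test_x, test_x) - k_star @ np.linalg.solve(K, k_star.T)
print(f"mean {mean[0]:.3f} (true {np.sin(0.75):.3f}), variance {var[0, 0]:.2e}")
```

The posterior variance is what the proposed method marginalises over: it quantifies how uncertain the interpolated waveform difference is away from the training templates.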

  3. Parameter estimation and uncertainty quantification in a biogeochemical model using optimal experimental design methods

    Science.gov (United States)

    Reimer, Joscha; Piwonski, Jaroslaw; Slawig, Thomas

    2016-04-01

    The statistical significance of any model-data comparison strongly depends on the quality of the used data and the criterion used to measure the model-to-data misfit. The statistical properties (such as mean values, variances and covariances) of the data should be taken into account by choosing a criterion as, e.g., ordinary, weighted or generalized least squares. Moreover, the criterion can be restricted onto regions or model quantities which are of special interest. This choice influences the quality of the model output (also for not measured quantities) and the results of a parameter estimation or optimization process. We have estimated the parameters of a three-dimensional and time-dependent marine biogeochemical model describing the phosphorus cycle in the ocean. For this purpose, we have developed a statistical model for measurements of phosphate and dissolved organic phosphorus. This statistical model includes variances and correlations varying with time and location of the measurements. We compared the obtained estimations of model output and parameters for different criteria. Another question is if (and which) further measurements would increase the model's quality at all. Using experimental design criteria, the information content of measurements can be quantified. This may refer to the uncertainty in unknown model parameters as well as the uncertainty regarding which model is closer to reality. By (another) optimization, optimal measurement properties such as locations, time instants and quantities to be measured can be identified. We have optimized such properties for additional measurement for the parameter estimation of the marine biogeochemical model. For this purpose, we have quantified the uncertainty in the optimal model parameters and the model output itself regarding the uncertainty in the measurement data using the (Fisher) information matrix. 
Furthermore, we have calculated the uncertainty reduction by additional measurements depending on time
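The weighted least-squares criterion and the Fisher-information-based uncertainty estimate described above can be sketched in a few lines. The exponential model, data values and noise levels below are illustrative stand-ins, not the biogeochemical model of the record:

```python
import numpy as np

# Hypothetical two-parameter model y = a * exp(-b * t), fit by weighted
# least squares; parameter covariance approximated by the inverse Fisher
# information matrix F = J^T W J (Gauss-Newton form).
rng = np.random.default_rng(0)
t = np.linspace(0.0, 4.0, 40)
a_true, b_true = 2.0, 0.7
sigma = 0.05 * np.ones_like(t)          # known measurement std devs
y = a_true * np.exp(-b_true * t) + rng.normal(0.0, sigma)
W = np.diag(1.0 / sigma**2)             # weights = inverse variances

def residual(p):
    a, b = p
    return y - a * np.exp(-b * t)

def jacobian(p):                        # d(model)/d(parameters)
    a, b = p
    return np.column_stack([np.exp(-b * t), -a * t * np.exp(-b * t)])

# A few Gauss-Newton iterations solve the weighted least-squares problem.
p = np.array([1.0, 1.0])
for _ in range(20):
    J, r = jacobian(p), residual(p)
    p = p + np.linalg.solve(J.T @ W @ J, J.T @ W @ r)

F = jacobian(p).T @ W @ jacobian(p)     # Fisher information matrix
cov = np.linalg.inv(F)                  # approximate parameter covariance
print(p, np.sqrt(np.diag(cov)))
```

For generalized least squares, W would be a full inverse covariance matrix (encoding the correlations the record mentions) rather than a diagonal one.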

  4. Model parameter uncertainty analysis for an annual field-scale phosphorus loss model

    Science.gov (United States)

    Phosphorous (P) loss models are important tools for developing and evaluating conservation practices aimed at reducing P losses from agricultural fields. All P loss models, however, have an inherent amount of uncertainty associated with them. In this study, we conducted an uncertainty analysis with ...

  6. A practical method to assess model sensitivity and parameter uncertainty in C cycle models

    Science.gov (United States)

    Delahaies, Sylvain; Roulstone, Ian; Nichols, Nancy

    2015-04-01

    The carbon cycle combines multiple spatial and temporal scales, from minutes to hours for the chemical processes occurring in plant cells, to several hundred years for the exchange between the atmosphere and the deep ocean, and finally to millennia for the formation of fossil fuels. Together with our knowledge of the transformation processes involved in the carbon cycle, many Earth Observation systems are now available to help improve models and predictions using inverse modelling techniques. A generic inverse problem consists of finding an n-dimensional state vector x such that h(x) = y, for a given N-dimensional observation vector y, including random noise, and a given model h. The problem is well posed if the three following conditions hold: 1) a solution exists, 2) the solution is unique and 3) the solution depends continuously on the input data. If at least one of these conditions is violated, the problem is said to be ill-posed. The inverse problem is often ill-posed; a regularization method is then required to replace the original problem with a well-posed one, and a solution strategy amounts to 1) constructing a solution x, 2) assessing the validity of the solution and 3) characterizing its uncertainty. The data assimilation linked ecosystem carbon (DALEC) model is a simple box model simulating the carbon budget allocation for terrestrial ecosystems. Intercomparison experiments have demonstrated the relative merit of various inverse modelling strategies (MCMC, EnKF) to estimate model parameters and initial carbon stocks for DALEC using eddy covariance measurements of net ecosystem exchange of CO2 and leaf area index observations. Most results agreed on the fact that parameters and initial stocks directly related to fast processes were best estimated, with narrow confidence intervals, whereas those related to slow processes were poorly estimated, with very large uncertainties.
While other studies have tried to overcome this difficulty by adding complementary
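The stabilizing role of regularization in an ill-posed inverse problem h(x) = y can be illustrated on a small linear stand-in (the matrix, noise level and regularization weight below are invented for the sketch, and have nothing to do with DALEC itself):

```python
import numpy as np

# Linear inverse problem H x = y with an ill-conditioned H, stabilized by
# Tikhonov regularization: x_reg = argmin ||H x - y||^2 + lam * ||x||^2.
rng = np.random.default_rng(1)
n = 20
s = np.logspace(0, -8, n)                      # rapidly decaying singular values
U, _ = np.linalg.qr(rng.normal(size=(n, n)))
V, _ = np.linalg.qr(rng.normal(size=(n, n)))
H = U @ np.diag(s) @ V.T
x_true = rng.normal(size=n)
y = H @ x_true + 1e-6 * rng.normal(size=n)     # noisy observations

x_naive = np.linalg.solve(H, y)                # unstable: noise is amplified
lam = 1e-4
x_reg = np.linalg.solve(H.T @ H + lam * np.eye(n), H.T @ y)

print(np.linalg.norm(x_naive - x_true), np.linalg.norm(x_reg - x_true))
```

The naive solution amplifies the noise through the small singular values, while the regularized solution trades a small bias for stability, which is exactly the well-posedness repair the abstract describes.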

  7. The sensitivity of flowline models of tidewater glaciers to parameter uncertainty

    Directory of Open Access Journals (Sweden)

    E. M. Enderlin

    2013-10-01

    Full Text Available Depth-integrated (1-D) flowline models have been widely used to simulate fast-flowing tidewater glaciers and predict change because the continuous grounding line tracking, high horizontal resolution, and physically based calving criterion that are essential to realistic modeling of tidewater glaciers can easily be incorporated into the models while maintaining high computational efficiency. As with all models, the values for parameters describing ice rheology and basal friction must be assumed and/or tuned based on observations. For prognostic studies, these parameters are typically tuned so that the glacier matches observed thickness and speeds at an initial state, to which a perturbation is applied. While it is well known that ice flow models are sensitive to these parameters, the sensitivity of tidewater glacier models has not been systematically investigated. Here we investigate the sensitivity of such flowline models of outlet glacier dynamics to uncertainty in three key parameters that influence a glacier's resistive stress components. We find that, within typical observational uncertainty, similar initial (i.e., steady-state) glacier configurations can be produced with substantially different combinations of parameter values, leading to differing transient responses after a perturbation is applied. In cases where the glacier is initially grounded near flotation across a basal over-deepening, as typically observed for rapidly changing glaciers, these differences can be dramatic owing to the threshold of stability imposed by the flotation criterion. The simulated transient response is particularly sensitive to the parameterization of ice rheology: differences in ice temperature of ~ 2 °C can determine whether the glaciers thin to flotation and retreat unstably or remain grounded on a marine shoal. Due to the highly non-linear dependence of tidewater glaciers on model parameters, we recommend that their predictions are accompanied by

  8. Hydrological model parameter dimensionality is a weak measure of prediction uncertainty

    Directory of Open Access Journals (Sweden)

    S. Pande

    2015-04-01

    Full Text Available This paper shows that instability of hydrological system representation in response to different pieces of information, and the associated prediction uncertainty, is a function of model complexity. After demonstrating the connection between unstable model representation and model complexity, complexity is analyzed in a step-by-step manner. This is done by measuring differences between simulations of a model under different realizations of input forcings. Algorithms are then suggested to estimate model complexity. Model complexities of two model structures, SAC-SMA (Sacramento Soil Moisture Accounting) and its simplified version SIXPAR (Six Parameter Model), are computed on resampled input data sets from basins that span the continental US. The model complexities for SIXPAR are estimated for various parameter ranges. It is shown that the complexity of SIXPAR increases with lower storage capacity and/or higher recession coefficients. Thus it is argued that a conceptually simple model structure, such as SIXPAR, can be more complex than an intuitively more complex model structure, such as SAC-SMA, for certain parameter ranges. We therefore contend that the magnitudes of feasible model parameters influence the complexity of the model selection problem just as parameter dimensionality (number of parameters) does, and that parameter dimensionality is an incomplete indicator of the stability of hydrological model selection and prediction problems.
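The complexity measure described above (differences between simulations under resampled input forcings) can be mimicked with a toy one-store model. The model structure and parameter values here are hypothetical, not SIXPAR, but they reproduce the qualitative finding that a small store with fast recession diverges more across forcing realizations:

```python
import numpy as np

# Score "complexity" as the divergence between simulations driven by two
# resampled realizations of the same forcing record.
rng = np.random.default_rng(2)

def bucket_runoff(rain, capacity, k):
    """Single linear store: storage fills with rain, drains at rate k."""
    s, q = 0.0, []
    for r in rain:
        s = min(s + r, capacity)
        out = k * s
        s -= out
        q.append(out)
    return np.array(q)

rain = rng.exponential(2.0, size=365)
rain_resampled = rng.permutation(rain)          # bootstrap-style realization

def divergence(capacity, k):
    q1 = bucket_runoff(rain, capacity, k)
    q2 = bucket_runoff(rain_resampled, capacity, k)
    return np.mean(np.abs(q1 - q2))

# Lower storage capacity / faster recession -> larger divergence, i.e.
# higher complexity in the sense used by the paper.
print(divergence(5.0, 0.9), divergence(50.0, 0.1))
```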

  9. Parameter uncertainty, sensitivity, and sediment coupling in bioenergetics-based food web models

    Energy Technology Data Exchange (ETDEWEB)

    Barron, M.G.; Cacela, D.; Beltman, D. [Hagler Bailly, Boulder, CO (United States)

    1995-12-31

    A bioenergetics-based food web model was developed and calibrated using measured PCB water and sediment concentrations in two Great Lakes food webs: Green Bay, Michigan and Lake Ontario. The model incorporated functionally based trophic levels and sediment, water, and food chain exposures of PCBs to aquatic biota. Sensitivity analysis indicated that the parameters with the greatest influence on PCBs in top predators were the lipid content of plankton and benthos, planktivore assimilation efficiency, Kow, prey selection, and ambient temperature. Sediment-associated PCBs were estimated to contribute over 90% of PCBs in benthivores and less than 50% in piscivores. Ranges of PCB concentrations in top predators estimated by Monte Carlo simulation incorporating parameter uncertainty were within one order of magnitude of modal values. Model applications include estimation of exceedances of human and ecological thresholds. The results indicate that point estimates from bioenergetics-based food web models have substantial uncertainty that should be considered in regulatory and scientific applications.
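Monte Carlo propagation of parameter uncertainty of the kind used in the study can be sketched as follows. The bioaccumulation relation and every distribution below are simplified placeholders, not the calibrated food web model:

```python
import numpy as np

# Toy predator PCB burden as a product of uncertain factors, each sampled
# from an assumed distribution; the spread of the output quantifies the
# uncertainty of the point estimate.
rng = np.random.default_rng(3)
n = 10_000
lipid = rng.normal(0.05, 0.01, n).clip(0.01)   # lipid fraction of prey
assim = rng.uniform(0.5, 0.9, n)               # assimilation efficiency
log_kow = rng.normal(6.5, 0.3, n)              # octanol-water partitioning
c_water = 1e-6                                 # mg/L, fixed exposure

c_predator = c_water * (10.0 ** log_kow) * lipid * assim

lo, mid, hi = np.percentile(c_predator, [2.5, 50, 97.5])
print(lo, mid, hi, hi / lo)                    # spread of the estimate
```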

  10. MOESHA: A genetic algorithm for automatic calibration and estimation of parameter uncertainty and sensitivity of hydrologic models

    Science.gov (United States)

    Characterization of uncertainty and sensitivity of model parameters is an essential and often overlooked facet of hydrological modeling. This paper introduces an algorithm called MOESHA that combines input parameter sensitivity analyses with a genetic algorithm calibration routin...

  11. S-parameter uncertainty computations

    DEFF Research Database (Denmark)

    Vidkjær, Jens

    1993-01-01

    A method for computing uncertainties of measured s-parameters is presented. Unlike the specification software provided with network analyzers, the new method is capable of calculating the uncertainties of arbitrary s-parameter sets and instrument settings.

  12. The sensitivity of flowline models of tidewater glaciers to parameter uncertainty

    Directory of Open Access Journals (Sweden)

    E. M. Enderlin

    2013-06-01

    Full Text Available Depth-integrated (1-D) flowline models have been widely used to simulate fast-flowing tidewater glaciers and predict future change because their computational efficiency allows for continuous grounding line tracking, high horizontal resolution, and a physically-based calving criterion, which are all essential to realistic modeling of tidewater glaciers. As with all models, the values for parameters describing ice rheology and basal friction must be assumed and/or tuned based on observations. For prognostic studies, these parameters are typically tuned so that the glacier matches observed thickness and speeds at an initial state, to which a perturbation is applied. While it is well known that ice flow models are sensitive to these parameters, the sensitivity of tidewater glacier models has not been systematically investigated. Here we investigate the sensitivity of such flowline models of outlet glacier dynamics to uncertainty in three key parameters that influence a glacier's resistive stress components. We find that, within typical observational uncertainty, similar initial (i.e. steady-state) glacier configurations can be produced with substantially different combinations of parameter values, leading to differing transient responses after a perturbation is applied. In cases where the glacier is initially grounded near flotation across a basal overdeepening, as typically observed for rapidly changing glaciers, these differences can be dramatic owing to the threshold of stability imposed by the flotation criterion. The simulated transient response is particularly sensitive to the parameterization of ice rheology: differences in ice temperature of ∼ 2 °C can determine whether the glaciers thin to flotation and retreat unstably or remain grounded on a marine shoal. Due to the highly non-linear dependence of tidewater glaciers on model parameters, we recommend that their predictions are accompanied by sensitivity tests that take parameter uncertainty

  13. Probabilistic Fatigue Life Prediction of Turbine Disc Considering Model Parameter Uncertainty

    Science.gov (United States)

    He, Liping; Yu, Le; Zhu, Shun-Peng; Ding, Liangliang; Huang, Hong-Zhong

    2016-06-01

    Aiming to improve the predictive ability of the Walker model for fatigue life prediction, and taking the turbine disc alloy GH4133 as the application example, this paper investigates a new approach for probabilistic fatigue life prediction that considers the parameter uncertainty inherent in the life prediction model. Firstly, experimental data are used to update the model parameters using Bayes' theorem, so as to obtain the posterior probability distribution functions of the two parameters of the Walker model, as well as to achieve a probabilistic life prediction model for the turbine disc. During the updating process, the Markov Chain Monte Carlo (MCMC) technique is used to generate samples of the given distribution and to estimate the parameters. After that, the turbine disc life is predicted using the probabilistic Walker model based on the Monte Carlo simulation technique. The experimental results indicate that: (1) after using the small-sample test data obtained from the turbine disc, the parameter uncertainty of the Walker model can be quantified and the corresponding probabilistic model for fatigue life prediction can be established using Bayes' theorem; (2) there is considerable dispersion in the life data for the turbine disc when predicting fatigue life in practical engineering applications.
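A minimal Metropolis sampler illustrates the Bayesian updating step. The log-linear life model, synthetic "test" data and proposal settings below are hypothetical; the paper's actual Walker-model likelihood and MCMC implementation may differ:

```python
import numpy as np

# Metropolis MCMC updating two parameters of a hypothetical log-linear
# life relation log(N) = c0 + c1 * (log S - mean log S) from noisy data.
rng = np.random.default_rng(4)
log_s = np.linspace(2.0, 3.0, 15)
x = log_s - log_s.mean()                    # centered covariate
c0_true, c1_true, noise = 3.75, -2.5, 0.1
log_n = c0_true + c1_true * x + rng.normal(0.0, noise, x.size)

def log_post(theta):                        # flat prior -> log-likelihood
    c0, c1 = theta
    r = log_n - (c0 + c1 * x)
    return -0.5 * np.sum(r**2) / noise**2

theta = np.array([3.0, -2.0])
lp = log_post(theta)
samples = []
for _ in range(20_000):
    prop = theta + rng.normal(0.0, 0.05, 2)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:    # Metropolis accept/reject
        theta, lp = prop, lp_prop
    samples.append(theta)
post = np.array(samples[5_000:])                # drop burn-in
print(post.mean(axis=0), post.std(axis=0))      # posterior mean and spread
```

The retained samples approximate the posterior distribution of the two parameters; predicting life with parameters drawn from them is the "probabilistic model" step of the abstract.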

  14. Using an ensemble smoother to evaluate parameter uncertainty of an integrated hydrological model of Yanqi basin

    Science.gov (United States)

    Li, Ning; McLaughlin, Dennis; Kinzelbach, Wolfgang; Li, WenPeng; Dong, XinGuang

    2015-10-01

    Model uncertainty needs to be quantified to provide objective assessments of the reliability of model predictions and of the risk associated with management decisions that rely on these predictions. This is particularly true in water resource studies that depend on model-based assessments of alternative management strategies. In recent decades, Bayesian data assimilation methods have been widely used in hydrology to assess uncertain model parameters and predictions. In this case study, a particular data assimilation algorithm, the Ensemble Smoother with Multiple Data Assimilation (ESMDA) (Emerick and Reynolds, 2012), is used to derive posterior samples of uncertain model parameters and forecasts for a distributed hydrological model of Yanqi basin, China. This model is constructed using MIKE SHE/MIKE 11 software, which provides for coupling between surface and subsurface processes (DHI, 2011a-d). The random samples in the posterior parameter ensemble are obtained by using measurements to update 50 prior parameter samples generated with a Latin Hypercube Sampling (LHS) procedure. The posterior forecast samples are obtained from model runs that use the corresponding posterior parameter samples. Two iterative sample update methods are considered: one based on a perturbed-observation Kalman filter update and one based on a square root Kalman filter update. These alternatives give nearly the same results and converge in only two iterations. The uncertain parameters considered include hydraulic conductivities, drainage and river leakage factors, van Genuchten soil property parameters, and dispersion coefficients. The results show that the uncertainty in many of the parameters is reduced during the smoother updating process, reflecting information obtained from the observations. Some of the parameters are insensitive and do not benefit from measurement information. The correlation coefficients among certain parameters increase in each iteration, although they generally
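The ESMDA update can be sketched compactly. The three-output forward model below is an invented stand-in for the hydrological model, but the inflation of the observation-error covariance by alpha = Na and the perturbed-observation Kalman update follow the algorithm's standard form:

```python
import numpy as np

# ESMDA: assimilate the same data Na times, each time inflating the
# observation-error covariance by alpha = Na and perturbing the data.
rng = np.random.default_rng(5)

def forward(theta):
    """Hypothetical 2-parameter model returning 3 'measurements'."""
    k, c = theta
    return np.array([k + c, 2.0 * k - c, k * c])

theta_true = np.array([1.5, 0.8])
sigma_d = 0.05
d_obs = forward(theta_true) + rng.normal(0.0, sigma_d, 3)

n_ens, n_a = 50, 4                        # ensemble size, assimilation steps
alpha = float(n_a)
ens = rng.uniform(0.0, 3.0, (n_ens, 2))   # prior samples (LHS in the paper)

for _ in range(n_a):
    preds = np.array([forward(m) for m in ens])
    c_md = np.cov(ens.T, preds.T)[:2, 2:]            # param-output cross-cov
    c_dd = np.cov(preds.T)
    gain = c_md @ np.linalg.inv(c_dd + alpha * sigma_d**2 * np.eye(3))
    d_pert = d_obs + rng.normal(0.0, np.sqrt(alpha) * sigma_d, (n_ens, 3))
    ens = ens + (d_pert - preds) @ gain.T

print(ens.mean(axis=0), ens.std(axis=0))  # posterior ensemble statistics
```

The shrinking ensemble spread mirrors the uncertainty reduction reported in the abstract; insensitive parameters would simply retain their prior spread.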

  15. Model structural uncertainty quantification and hydrologic parameter and prediction error analysis using airborne electromagnetic data

    DEFF Research Database (Denmark)

    Minsley, B. J.; Christensen, Nikolaj Kruse; Christensen, Steen

    Model structure, or the spatial arrangement of subsurface lithological units, is fundamental to the hydrological behavior of Earth systems. Knowledge of geological model structure is critically important in order to make informed hydrological predictions and management decisions. Model structure is never perfectly known, however, and incorrect assumptions can be a significant source of error when making model predictions. We describe a systematic approach for quantifying model structural uncertainty that is based on the integration of sparse borehole observations and large-scale airborne ... indicator simulation, we produce many realizations of model structure that are consistent with observed datasets and prior knowledge. Given estimates of model structural uncertainty, we incorporate hydrologic observations to evaluate the hydrologic parameter or prediction errors that occur when ...

  16. Variations in environmental tritium doses due to meteorological data averaging and uncertainties in pathway model parameters

    Energy Technology Data Exchange (ETDEWEB)

    Kock, A.

    1996-05-01

    The objectives of this research are: (1) to calculate and compare off site doses from atmospheric tritium releases at the Savannah River Site using monthly versus 5 year meteorological data and annual source terms, including additional seasonal and site specific parameters not included in present annual assessments; and (2) to calculate the range of the above dose estimates based on distributions in model parameters given by uncertainty estimates found in the literature. Consideration will be given to the sensitivity of parameters given in former studies.

  17. Robust H∞ control for aseismic structures with uncertainties in model parameters

    Institute of Scientific and Technical Information of China (English)

    Song Gang; Lin Jiahao; Zhao Yan; W.Paul Howson; Fred W Williams

    2007-01-01

    This paper presents a robust H∞ output feedback control approach for structural systems with uncertainties in model parameters by using available acceleration measurements, and proposes conditions for the existence of such a robust output feedback controller. The uncertainties of structural stiffness, damping and mass parameters are assumed to be norm-bounded. The proposed control approach is formulated within the framework of linear matrix inequalities, for which existing convex optimization techniques, such as the LMI toolbox in MATLAB, can be used effectively and conveniently. To illustrate the effectiveness of the proposed robust H∞ strategy, a six-story building was subjected both to the 1940 El Centro earthquake record and to a suddenly applied Kanai-Tajimi filtered white noise random excitation. The results show that the proposed robust H∞ controller provides satisfactory results with or without variation of the structural stiffness, damping and mass parameters.

  18. Quantifying Key Climate Parameter Uncertainties Using an Earth System Model with a Dynamic 3D Ocean

    Science.gov (United States)

    Olson, R.; Sriver, R. L.; Goes, M. P.; Urban, N.; Matthews, D.; Haran, M.; Keller, K.

    2011-12-01

    Climate projections hinge critically on uncertain climate model parameters such as climate sensitivity, vertical ocean diffusivity and anthropogenic sulfate aerosol forcings. Climate sensitivity is defined as the equilibrium global mean temperature response to a doubling of atmospheric CO2 concentrations. Vertical ocean diffusivity parameterizes sub-grid scale ocean vertical mixing processes. These parameters are typically estimated using Intermediate Complexity Earth System Models (EMICs) that lack a full 3D representation of the oceans, thereby neglecting the effects of mixing on ocean dynamics and meridional overturning. We improve on these studies by employing an EMIC with a dynamic 3D ocean model to estimate these parameters. We carry out historical climate simulations with the University of Victoria Earth System Climate Model (UVic ESCM), varying parameters that affect climate sensitivity, vertical ocean mixing, and the effects of anthropogenic sulfate aerosols. We use a Bayesian approach whereby the likelihood of each parameter combination depends on how well the model simulates surface air temperature and upper ocean heat content. We use a Gaussian process emulator to interpolate the model output to an arbitrary parameter setting. We use a Markov Chain Monte Carlo method to estimate the posterior probability distribution function (pdf) of these parameters. We explore the sensitivity of the results to prior assumptions about the parameters. In addition, we estimate the relative skill of different observations to constrain the parameters. We quantify the uncertainty in parameter estimates stemming from climate variability, model and observational errors. We explore the sensitivity of key decision-relevant climate projections to these parameters. We find that climate sensitivity and vertical ocean diffusivity estimates are consistent with previously published results. The climate sensitivity pdf is strongly affected by the prior assumptions, and by the scaling
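The emulation step, interpolating expensive model output to parameter settings that were never simulated, can be sketched with a plain radial-basis-function Gaussian process. The one-dimensional "model" here is a cheap stand-in for the UVic ESCM, and the lengthscale is picked by hand rather than fit:

```python
import numpy as np

# GP posterior mean with an RBF kernel, fit to a handful of "model runs"
# and used to predict output at unrun parameter settings.
def rbf(a, b, ell=0.5):
    return np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ell**2)

x_train = np.linspace(0.0, 3.0, 8)          # parameter settings actually run
y_train = np.sin(2.0 * x_train)             # stand-in for model output
jitter = 1e-8                               # numerical stabilizer

K = rbf(x_train, x_train) + jitter * np.eye(x_train.size)
alpha = np.linalg.solve(K, y_train)

x_new = np.array([0.7, 1.9, 2.6])           # settings never simulated
y_emul = rbf(x_new, x_train) @ alpha        # GP posterior mean
print(y_emul, np.sin(2.0 * x_new))          # emulator vs. true stand-in
```

In the study's setting, the emulator makes the MCMC step affordable: each likelihood evaluation queries the GP instead of running the EMIC.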

  19. Assessing the relative importance of parameter and forcing uncertainty and their interactions in conceptual hydrological model simulations

    Science.gov (United States)

    Mockler, E. M.; Chun, K. P.; Sapriza-Azuri, G.; Bruen, M.; Wheater, H. S.

    2016-11-01

    Predictions of river flow dynamics provide vital information for many aspects of water management including water resource planning, climate adaptation, and flood and drought assessments. Many of the subjective choices that modellers make including model and criteria selection can have a significant impact on the magnitude and distribution of the output uncertainty. Hydrological modellers are tasked with understanding and minimising the uncertainty surrounding streamflow predictions before communicating the overall uncertainty to decision makers. Parameter uncertainty in conceptual rainfall-runoff models has been widely investigated, and model structural uncertainty and forcing data have been receiving increasing attention. This study aimed to assess uncertainties in streamflow predictions due to forcing data and the identification of behavioural parameter sets in 31 Irish catchments. By combining stochastic rainfall ensembles and multiple parameter sets for three conceptual rainfall-runoff models, an analysis of variance model was used to decompose the total uncertainty in streamflow simulations into contributions from (i) forcing data, (ii) identification of model parameters and (iii) interactions between the two. The analysis illustrates that, for our subjective choices, hydrological model selection had a greater contribution to overall uncertainty, while performance criteria selection influenced the relative intra-annual uncertainties in streamflow predictions. Uncertainties in streamflow predictions due to the method of determining parameters were relatively lower for wetter catchments, and more evenly distributed throughout the year when the Nash-Sutcliffe Efficiency of logarithmic values of flow (lnNSE) was the evaluation criterion.
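The variance decomposition described above can be illustrated with a two-factor ANOVA on a toy response surface; the additive-plus-interaction form and the factor spreads are invented for the sketch:

```python
import numpy as np

# Streamflow-like outputs for every (forcing, parameter-set) pair; total
# variance is split into two main effects and an interaction term.
rng = np.random.default_rng(6)
n_f, n_p = 20, 20
forcing = rng.normal(0.0, 1.0, n_f)         # forcing realizations
params = rng.normal(0.0, 0.4, n_p)          # behavioural parameter sets

# Hypothetical response with a mild multiplicative interaction.
out = (forcing[:, None] + params[None, :]
       + 0.3 * forcing[:, None] * params[None, :])

grand = out.mean()
eff_f = out.mean(axis=1) - grand            # forcing main effect
eff_p = out.mean(axis=0) - grand            # parameter main effect
inter = out - grand - eff_f[:, None] - eff_p[None, :]

ss_total = ((out - grand)**2).sum()
ss_f = n_p * (eff_f**2).sum()
ss_p = n_f * (eff_p**2).sum()
ss_i = (inter**2).sum()
print(ss_f / ss_total, ss_p / ss_total, ss_i / ss_total)
```

The three fractions sum to one by construction, which is what lets the study attribute shares of the total streamflow uncertainty to forcing, parameter identification, and their interaction.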

  20. Novel method for incorporating model uncertainties into gravitational wave parameter estimates.

    Science.gov (United States)

    Moore, Christopher J; Gair, Jonathan R

    2014-12-19

    Posterior distributions on parameters computed from experimental data using Bayesian techniques are only as accurate as the models used to construct them. In many applications, these models are incomplete, which both reduces the prospects of detection and leads to a systematic error in the parameter estimates. In the analysis of data from gravitational wave detectors, for example, accurate waveform templates can be computed using numerical methods, but the prohibitive cost of these simulations means this can only be done for a small handful of parameters. In this Letter, a novel method to fold model uncertainties into data analysis is proposed; the waveform uncertainty is analytically marginalized over, using a prior distribution constructed by applying Gaussian process regression to interpolate the waveform difference from a small training set of accurate templates. The method is well motivated, easy to implement, and no more computationally expensive than standard techniques. The new method is shown to perform extremely well when applied to a toy problem. While we use the application to gravitational wave data analysis to motivate and illustrate the technique, it can be applied in any context where model uncertainties exist.

  1. Conditioning rainfall-runoff model parameters to reduce prediction uncertainty in ungauged basins

    Science.gov (United States)

    Visessri, S.; McIntyre, N.; Maksimovic, C.

    2012-12-01

    Conditioning rainfall-runoff model parameters in ungauged catchments in Thailand presents problems common to ungauged basins, involving data availability, data quality, and rainfall-runoff model suitability, all of which contribute to prediction uncertainty. This paper attempts to improve the estimation of streamflow in ungauged basins and to reduce the associated uncertainties by conditioning the prior parameter space. 35 catchments from the upper Ping River basin, Thailand are selected as a case study. The catchments have a range of attributes (e.g., catchment sizes of 20-6350 km2, elevations of 632-1529 m above sea level, and annual rainfall of 846-1447 mm/year). For each catchment, three indices - rainfall-runoff elasticity, base flow index and runoff coefficient - are calculated using the observed rainfall-runoff data, and regression equations relating these indices to the catchment attributes are identified. Uncertainty in the expected indices is defined by the regression error distribution, approximated by a Gaussian model. The IHACRES model is applied for simulating streamflow. The IHACRES parameters are randomly sampled from their presumed prior parameter space. For each sampled parameter set, the streamflow and hence the three indices are modelled. The parameter sets are conditioned on the probability distributions of the regionalised indices, allowing ensemble predictions to be made. The objective function, NSE, calculated for daily and weekly time steps for the water years 1995-2000, is used to assess model performance. The ability to capture observed streamflow and the precision of the estimate are evaluated using reliability and sharpness measures. Similarity between modelled and expected indices contributes to good objective function values. Using only the regionalised runoff coefficient to condition the model yields better NSE values than using either only the rainfall-runoff elasticity or only the base flow index. Conditioning on the runoff coefficient
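Conditioning sampled parameter sets on a regionalised index, and scoring the survivors with NSE, can be sketched as follows. The toy store model stands in for IHACRES, and the "regionalised" runoff coefficient is computed from the synthetic observations rather than from a regression on catchment attributes, as it would be in a genuinely ungauged basin:

```python
import numpy as np

rng = np.random.default_rng(7)

def simulate(rain, loss, k):
    """Toy rainfall-runoff model: effective rain routed through a store."""
    s, q = 0.0, []
    for r in rain:
        s += max(r - loss, 0.0)
        out = k * s
        s -= out
        q.append(out)
    return np.array(q)

rain = rng.exponential(3.0, 365)
q_obs = simulate(rain, 1.0, 0.3) + rng.normal(0.0, 0.05, 365)

# Regionalised runoff coefficient and its (assumed) regression error sd.
rc_regional, rc_sd = q_obs.sum() / rain.sum(), 0.05

def nse(sim, obs):
    return 1.0 - np.sum((sim - obs)**2) / np.sum((obs - obs.mean())**2)

# Sample the prior parameter space, keep sets consistent with the index.
sets = np.column_stack([rng.uniform(0.0, 2.0, 500),    # loss threshold
                        rng.uniform(0.05, 0.9, 500)])  # recession rate
kept = []
for loss, k in sets:
    q_sim = simulate(rain, loss, k)
    rc = q_sim.sum() / rain.sum()
    if abs(rc - rc_regional) < 2.0 * rc_sd:   # condition on the index
        kept.append(nse(q_sim, q_obs))
print(len(kept), max(kept))                   # ensemble size, best NSE
```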

  2. Accounting for environmental variability, modeling errors, and parameter estimation uncertainties in structural identification

    Science.gov (United States)

    Behmanesh, Iman; Moaveni, Babak

    2016-07-01

    This paper presents a Hierarchical Bayesian model updating framework to account for the effects of ambient temperature and excitation amplitude. The proposed approach is applied for model calibration, response prediction and damage identification of a footbridge under changing environmental/ambient conditions. The concrete Young's modulus of the footbridge deck is the updating structural parameter considered, with its mean and variance modeled as functions of temperature and excitation amplitude. The modal parameters identified over 27 months of continuous monitoring of the footbridge are used to calibrate the updating parameters. One of the objectives of this study is to show that by increasing the levels of information in the updating process, the posterior variation of the updating structural parameter (concrete Young's modulus) is reduced. To this end, the calibration is performed at three information levels using (1) the identified modal parameters, (2) modal parameters and ambient temperatures, and (3) modal parameters, ambient temperatures, and excitation amplitudes. The calibrated model is then validated by comparing the model-predicted natural frequencies with those identified from measured data after a deliberate change to the structural mass. It is shown that accounting for modeling error uncertainties is crucial for reliable response prediction, and that accounting for only the estimated variability of the updating structural parameter is not sufficient for accurate response predictions. Finally, the calibrated model is used for damage identification of the footbridge.

  3. Bayesian Assessment of the Uncertainties of Estimates of a Conceptual Rainfall-Runoff Model Parameters

    Science.gov (United States)

    Silva, F. E. O. E.; Naghettini, M. D. C.; Fernandes, W.

    2014-12-01

    This paper evaluated the uncertainties associated with the estimation of the parameters of a conceptual rainfall-runoff model, through the use of Bayesian inference techniques by Monte Carlo simulation. The Pará River sub-basin, located in the upper São Francisco river basin in southeastern Brazil, was selected for the study. We used the Rio Grande conceptual hydrologic model (EHR/UFMG, 2001) and the Markov Chain Monte Carlo simulation method named DREAM (VRUGT, 2008a). Two probabilistic models for the residuals were analyzed: (i) the classic [Normal likelihood: r ~ N(0, σ²)]; and (ii) a generalized likelihood (SCHOUPS & VRUGT, 2010), in which it is assumed that the differences between observed and simulated flows are correlated, non-stationary, and distributed as a Skew Exponential Power density. The assumptions made for both models were checked to ensure that the estimation of uncertainties in the parameters was not biased. The results showed that the Bayesian approach proved adequate for the proposed objectives, demonstrating and reinforcing the importance of assessing the uncertainties associated with hydrological modeling.
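A quick check of the residual assumptions mentioned above (absence of autocorrelation and approximate normality under the classic model) might look like this; the residuals here are synthetic i.i.d. Gaussian draws purely for illustration:

```python
import numpy as np

# Diagnostic statistics for model residuals: lag-1 autocorrelation and
# sample skewness. Near-zero values are consistent with the classic
# i.i.d. Normal error model; strong lag-1 correlation would instead
# support a correlated error model such as the generalized likelihood
# of Schoups & Vrugt.
rng = np.random.default_rng(10)
resid = rng.normal(0.0, 1.0, 2000)          # stand-in model residuals

lag1 = np.corrcoef(resid[:-1], resid[1:])[0, 1]
skew = np.mean((resid - resid.mean())**3) / resid.std()**3
print(lag1, skew)
```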

  4. Approaches to highly parameterized inversion: A guide to using PEST for model-parameter and predictive-uncertainty analysis

    Science.gov (United States)

    Doherty, John E.; Hunt, Randall J.; Tonkin, Matthew J.

    2010-01-01

    Analysis of the uncertainty associated with parameters used by a numerical model, and with predictions that depend on those parameters, is fundamental to the use of modeling in support of decisionmaking. Unfortunately, predictive uncertainty analysis with regard to models can be very computationally demanding, due in part to complex constraints on parameters that arise from expert knowledge of system properties on the one hand (knowledge constraints) and from the necessity for the model parameters to assume values that allow the model to reproduce historical system behavior on the other hand (calibration constraints). Enforcement of knowledge and calibration constraints on parameters used by a model does not eliminate the uncertainty in those parameters. In fact, in many cases, enforcement of calibration constraints simply reduces the uncertainties associated with a number of broad-scale combinations of model parameters that collectively describe spatially averaged system properties. The uncertainties associated with other combinations of parameters, especially those that pertain to small-scale parameter heterogeneity, may not be reduced through the calibration process. To the extent that a prediction depends on system-property detail, its postcalibration variability may be reduced very little, if at all, by applying calibration constraints; knowledge constraints remain the only limits on the variability of predictions that depend on such detail. Regrettably, in many common modeling applications, these constraints are weak. Though the PEST software suite was initially developed as a tool for model calibration, recent developments have focused on the evaluation of model-parameter and predictive uncertainty. As a complement to functionality that it provides for highly parameterized inversion (calibration) by means of formal mathematical regularization techniques, the PEST suite provides utilities for linear and nonlinear error-variance and uncertainty analysis in
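The linear (first-order) error-variance analysis that the PEST utilities implement can be sketched in matrix form; the Jacobians, covariances and prediction sensitivities below are illustrative numbers, not output from a real PEST run:

```python
import numpy as np

# Bayes-linear posterior parameter covariance from a prior covariance
# (knowledge constraints) and an observation Jacobian (calibration
# constraints), then propagation onto a scalar prediction.
rng = np.random.default_rng(8)
n_par = 4
C_prior = np.diag([1.0, 1.0, 0.5, 2.0])       # prior parameter covariance
J = rng.normal(size=(6, n_par))               # observation sensitivities
C_eps = 0.1 * np.eye(6)                       # observation noise covariance

C_post = np.linalg.inv(np.linalg.inv(C_prior)
                       + J.T @ np.linalg.inv(C_eps) @ J)

y = np.array([0.2, -1.0, 0.5, 0.3])           # prediction sensitivity vector
var_prior = y @ C_prior @ y                   # precalibration variance
var_post = y @ C_post @ y                     # postcalibration variance
print(var_prior, var_post)
```

When the prediction depends on parameter combinations the observations do not inform (y nearly orthogonal to the rows of J), var_post stays close to var_prior, which is exactly the situation the abstract describes for small-scale heterogeneity.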

  5. Patient-specific parameter estimation in single-ventricle lumped circulation models under uncertainty.

    Science.gov (United States)

    Schiavazzi, Daniele E; Baretta, Alessia; Pennati, Giancarlo; Hsia, Tain-Yen; Marsden, Alison L

    2017-03-01

    Computational models of cardiovascular physiology can inform clinical decision-making, providing a physically consistent framework to assess vascular pressures and flow distributions, and aiding in treatment planning. In particular, lumped parameter network (LPN) models that make an analogy to electrical circuits offer a fast and surprisingly realistic method to reproduce the circulatory physiology. The complexity of LPN models can vary significantly to account, for example, for cardiac and valve function, respiration, autoregulation, and time-dependent hemodynamics. More complex models provide insight into detailed physiological mechanisms, but their utility is maximized if one can quickly identify patient-specific parameters. The clinical utility of LPN models with many parameters will be greatly enhanced by automated parameter identification, particularly if parameter tuning can match non-invasively obtained clinical data. We present a framework for automated tuning of 0D lumped model parameters to match clinical data. We demonstrate the utility of this framework through application to single ventricle pediatric patients with Norwood physiology. Through a combination of local identifiability, Bayesian estimation and maximum a posteriori simplex optimization, we show the ability to automatically determine physiologically consistent point estimates of the parameters and to quantify uncertainty induced by errors and assumptions in the collected clinical data. We show that multi-level estimation, that is, updating the parameter prior information through sub-model analysis, can lead to a significant reduction in the parameter marginal posterior variance. We first consider virtual patient conditions, with clinical targets generated through model solutions, and then apply the framework to a cohort of four single-ventricle patients with Norwood physiology. Copyright © 2016 John Wiley & Sons, Ltd.
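The maximum a posteriori (MAP) step described above can be illustrated with a minimal sketch: a hypothetical two-parameter, windkessel-style decay model is fit to synthetic "clinical" data by simplex (Nelder-Mead) optimization of the negative log-posterior, with weak Gaussian priors standing in for prior clinical knowledge. All parameter values and data here are illustrative, not the paper's.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical two-parameter decay model p(t) = p0 * exp(-t / (R * C)),
# mimicking an RC (windkessel-style) circuit analogy. Values are illustrative.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 2.0, 50)
R_true, C_true, p0 = 1.2, 1.0, 100.0
data = p0 * np.exp(-t / (R_true * C_true)) + rng.normal(0.0, 1.0, t.size)

def neg_log_posterior(theta):
    R, C = theta
    if R <= 0 or C <= 0:                  # keep parameters physical
        return np.inf
    model = p0 * np.exp(-t / (R * C))
    loglik = -0.5 * np.sum((data - model) ** 2)   # unit noise variance assumed
    # weak Gaussian priors standing in for prior clinical knowledge
    logprior = -0.5 * ((R - 1.0) / 0.5) ** 2 - 0.5 * ((C - 1.0) / 0.5) ** 2
    return -(loglik + logprior)

# Simplex (Nelder-Mead) search for the maximum a posteriori point
fit = minimize(neg_log_posterior, x0=[0.8, 0.8], method="Nelder-Mead")
R_map, C_map = fit.x
print(R_map * C_map)   # only the product R*C is identified by the data
```

Note that the data constrain only the product R*C here, so the priors are what make the individual parameters identifiable, echoing the local-identifiability analysis in the record.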

  6. Evaluation of Parameter Uncertainty Reduction in Groundwater Flow Modeling Using Multiple Environmental Tracers

    Science.gov (United States)

    Arnold, B. W.; Gardner, P.

    2013-12-01

    Calibration of groundwater flow models for the purpose of evaluating flow and aquifer heterogeneity typically uses observations of hydraulic head in wells and appropriate boundary conditions. Environmental tracers have a wide variety of decay rates and input signals in recharge, resulting in a potentially broad source of additional information to constrain flow rates and heterogeneity. A numerical study was conducted to evaluate the reduction in uncertainty during model calibration using observations of various environmental tracers and combinations of tracers. A synthetic data set was constructed by simulating steady groundwater flow and transient tracer transport in a high-resolution, 2-D aquifer with heterogeneous permeability and porosity using the PFLOTRAN software code. Data on pressure and tracer concentration were extracted at well locations and then used as observations for automated calibration of a flow and transport model using the pilot point method and the PEST code. Optimization runs were performed to estimate parameter values of permeability at 30 pilot points in the model domain for cases using 42 observations of: 1) pressure, 2) pressure and CFC11 concentrations, 3) pressure and Ar-39 concentrations, and 4) pressure, CFC11, Ar-39, tritium, and He-3 concentrations. Results show significantly lower uncertainty, as indicated by the 95% linear confidence intervals, in permeability values at the pilot points for cases including observations of environmental tracer concentrations. The average linear uncertainty range for permeability at the pilot points using pressure observations alone is 4.6 orders of magnitude, using pressure and CFC11 concentrations is 1.6 orders of magnitude, using pressure and Ar-39 concentrations is 0.9 order of magnitude, and using pressure, CFC11, Ar-39, tritium, and He-3 concentrations is 1.0 order of magnitude. Data on Ar-39 concentrations result in the greatest parameter uncertainty reduction because its half-life of 269 years
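The 95% linear confidence intervals referenced above come from first-order (linearized) error propagation of the kind PEST reports: the parameter covariance is approximated as s^2 (J^T J)^(-1), where J is the Jacobian of observations with respect to parameters. A minimal sketch with a synthetic Jacobian (all numbers illustrative, except the 42-observation count echoed from the record):

```python
import numpy as np

# Linearized (FOSM-style) confidence intervals for calibrated parameters:
# cov(theta) ~ s^2 * (J^T J)^{-1}, J = sensitivity (Jacobian) matrix.
rng = np.random.default_rng(1)
n_obs, n_par = 42, 3                        # 42 observations, as in the study
J = rng.normal(size=(n_obs, n_par))         # synthetic sensitivities (illustrative)
theta_true = np.array([1.0, -2.0, 0.5])
obs = J @ theta_true + rng.normal(0.0, 0.1, n_obs)

theta_hat, *_ = np.linalg.lstsq(J, obs, rcond=None)
resid = obs - J @ theta_hat
s2 = resid @ resid / (n_obs - n_par)        # residual variance estimate
cov = s2 * np.linalg.inv(J.T @ J)
half_width = 1.96 * np.sqrt(np.diag(cov))   # ~95% linear confidence half-widths

for name, est, hw in zip(["p1", "p2", "p3"], theta_hat, half_width):
    print(f"{name}: {est:+.3f} +/- {hw:.3f}")
```

Adding informative observations (more rows in J, or rows sensitive to poorly constrained directions) shrinks these half-widths, which is exactly the effect the tracer data have in the study.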

  7. Incorporating parameter uncertainty in Bayesian segmentation models: application to hippocampal subfield volumetry.

    Science.gov (United States)

    Iglesias, Juan Eugenio; Sabuncu, Mert Rory; Van Leemput, Koen

    2012-01-01

    Many successful segmentation algorithms are based on Bayesian models in which prior anatomical knowledge is combined with the available image information. However, these methods typically have many free parameters that are estimated to obtain point estimates only, whereas a faithful Bayesian analysis would also consider all possible alternate values these parameters may take. In this paper, we propose to incorporate the uncertainty of the free parameters in Bayesian segmentation models more accurately by using Monte Carlo sampling. We demonstrate our technique by sampling atlas warps in a recent method for hippocampal subfield segmentation, and show a significant improvement in an Alzheimer's disease classification task. As an additional benefit, the method also yields informative "error bars" on the segmentation results for each of the individual sub-structures.
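The Monte Carlo idea in this record, propagating parameter uncertainty through a model instead of using a single point estimate, can be sketched generically. The "model" below is a deliberately trivial stand-in (a product of two parameters), not the segmentation model of the paper:

```python
import numpy as np

# Monte Carlo propagation of parameter uncertainty to a model output,
# yielding "error bars" on the result rather than a single point estimate.
rng = np.random.default_rng(2)
n_samples = 5000
a = rng.normal(2.0, 0.1, n_samples)   # posterior-like samples of parameter a
b = rng.normal(3.0, 0.2, n_samples)   # posterior-like samples of parameter b
volume = a * b                        # push each sample through the (toy) model

mean = volume.mean()
lo, hi = np.percentile(volume, [2.5, 97.5])   # error bars on the output
print(f"volume = {mean:.2f}, 95% interval [{lo:.2f}, {hi:.2f}]")
```

The same pattern, sample the uncertain inputs (here, atlas warps in the paper), evaluate the model per sample, and summarize the output distribution, gives the per-structure error bars the abstract describes.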

  8. Incorporating Parameter Uncertainty in Bayesian Segmentation Models: Application to Hippocampal Subfield Volumetry

    DEFF Research Database (Denmark)

    Iglesias, J. E.; Sabuncu, M. R.; Van Leemput, Koen

    2012-01-01

    Many successful segmentation algorithms are based on Bayesian models in which prior anatomical knowledge is combined with the available image information. However, these methods typically have many free parameters that are estimated to obtain point estimates only, whereas a faithful Bayesian analysis would also consider all possible alternate values these parameters may take. In this paper, we propose to incorporate the uncertainty of the free parameters in Bayesian segmentation models more accurately by using Monte Carlo sampling. We demonstrate our technique by sampling atlas warps in a recent method for hippocampal subfield segmentation, and show a significant improvement in an Alzheimer’s disease classification task. As an additional benefit, the method also yields informative “error bars” on the segmentation results for each of the individual sub-structures.

  9. A Bayesian-based multilevel factorial analysis method for analyzing parameter uncertainty of hydrological model

    Science.gov (United States)

    Liu, Y. R.; Li, Y. P.; Huang, G. H.; Zhang, J. L.; Fan, Y. R.

    2017-10-01

    In this study, a Bayesian-based multilevel factorial analysis (BMFA) method is developed to assess parameter uncertainties and their effects on hydrological model responses. In BMFA, the Differential Evolution Adaptive Metropolis (DREAM) algorithm is employed to approximate the posterior distributions of model parameters with Bayesian inference; the factorial analysis (FA) technique is used for measuring the specific variations of hydrological responses in terms of posterior distributions to investigate the individual and interactive effects of parameters on model outputs. BMFA is then applied to a case study of the Jinghe River watershed in the Loess Plateau of China to demonstrate its validity and applicability. The uncertainties of four sensitive parameters, including soil conservation service runoff curve number for moisture condition II (CN2), soil hydraulic conductivity (SOL_K), plant available water capacity (SOL_AWC), and soil depth (SOL_Z), are investigated. Results reveal that (i) CN2 has a positive effect on peak flow, implying that the concentrated rainfall during the rainy season can cause infiltration-excess surface flow, which is a considerable contributor to peak flow in this watershed; (ii) SOL_K has a positive effect on average flow, implying that the widely distributed cambisols can lead to medium percolation capacity; (iii) the interaction between SOL_AWC and SOL_Z has a noticeable effect on the peak flow and their effects are dependent upon each other, which discloses that soil depth can significantly influence the processes of plant uptake of soil water in this watershed. Based on the above findings, the significant parameters and the relationship among uncertain parameters can be specified, such that the hydrological model's capability for simulating/predicting water resources of the Jinghe River watershed can be improved.
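The factorial-analysis side of such a method separates main effects from interactions. A minimal two-level, two-factor sketch (coded levels -1/+1; the four responses are illustrative, not the watershed study's values):

```python
import numpy as np

# Two-level factorial effects: main effects and a two-way interaction.
# Coded levels: -1 (low) and +1 (high) for two parameters A and B.
A = np.array([-1, +1, -1, +1])
B = np.array([-1, -1, +1, +1])
y = np.array([10.0, 14.0, 11.0, 19.0])   # model responses for the 4 runs

main_A = (y * A).mean() * 2          # average change in y as A goes low -> high
main_B = (y * B).mean() * 2
interaction_AB = (y * A * B).mean() * 2  # nonzero => effect of A depends on B

print(main_A, main_B, interaction_AB)
```

A nonzero interaction term is exactly the situation the abstract reports for SOL_AWC and SOL_Z, whose effects on peak flow depend on each other.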

  10. Cosmological Parameter Uncertainties from SALT-II Type Ia Supernova Light Curve Models

    Energy Technology Data Exchange (ETDEWEB)

    Mosher, J. [Pennsylvania U.; Guy, J. [LBL, Berkeley; Kessler, R. [Chicago U., KICP; Astier, P. [Paris U., VI-VII; Marriner, J. [Fermilab; Betoule, M. [Paris U., VI-VII; Sako, M. [Pennsylvania U.; El-Hage, P. [Paris U., VI-VII; Biswas, R. [Argonne; Pain, R. [Paris U., VI-VII; Kuhlmann, S. [Argonne; Regnault, N. [Paris U., VI-VII; Frieman, J. A. [Fermilab; Schneider, D. P. [Penn State U.

    2014-08-29

    We use simulated type Ia supernova (SN Ia) samples, including both photometry and spectra, to perform the first direct validation of cosmology analysis using the SALT-II light curve model. This validation includes residuals from the light curve training process, systematic biases in SN Ia distance measurements, and a bias on the dark energy equation of state parameter w. Using the SN-analysis package SNANA, we simulate and analyze realistic samples corresponding to the data samples used in the SNLS3 analysis: ~120 low-redshift (z < 0.1) SNe Ia, ~255 Sloan Digital Sky Survey SNe Ia (z < 0.4), and ~290 SNLS SNe Ia (z ≤ 1). To probe systematic uncertainties in detail, we vary the input spectral model, the model of intrinsic scatter, and the smoothing (i.e., regularization) parameters used during the SALT-II model training. Using realistic intrinsic scatter models results in a slight bias in the ultraviolet portion of the trained SALT-II model, and w biases (w (input) – w (recovered)) ranging from –0.005 ± 0.012 to –0.024 ± 0.010. These biases are indistinguishable from each other within the uncertainty; the average bias on w is –0.014 ± 0.007.

  11. Cosmological parameter uncertainties from SALT-II type Ia supernova light curve models

    Energy Technology Data Exchange (ETDEWEB)

    Mosher, J.; Sako, M. [Department of Physics and Astronomy, University of Pennsylvania, 209 South 33rd Street, Philadelphia, PA 19104 (United States); Guy, J.; Astier, P.; Betoule, M.; El-Hage, P.; Pain, R.; Regnault, N. [LPNHE, CNRS/IN2P3, Université Pierre et Marie Curie Paris 6, Universié Denis Diderot Paris 7, 4 place Jussieu, F-75252 Paris Cedex 05 (France); Kessler, R.; Frieman, J. A. [Kavli Institute for Cosmological Physics, University of Chicago, 5640 South Ellis Avenue, Chicago, IL 60637 (United States); Marriner, J. [Center for Particle Astrophysics, Fermi National Accelerator Laboratory, P.O. Box 500, Batavia, IL 60510 (United States); Biswas, R.; Kuhlmann, S. [Argonne National Laboratory, 9700 South Cass Avenue, Lemont, IL 60439 (United States); Schneider, D. P., E-mail: kessler@kicp.chicago.edu [Department of Astronomy and Astrophysics, The Pennsylvania State University, University Park, PA 16802 (United States)

    2014-09-20

    We use simulated type Ia supernova (SN Ia) samples, including both photometry and spectra, to perform the first direct validation of cosmology analysis using the SALT-II light curve model. This validation includes residuals from the light curve training process, systematic biases in SN Ia distance measurements, and a bias on the dark energy equation of state parameter w. Using the SN-analysis package SNANA, we simulate and analyze realistic samples corresponding to the data samples used in the SNLS3 analysis: ∼120 low-redshift (z < 0.1) SNe Ia, ∼255 Sloan Digital Sky Survey SNe Ia (z < 0.4), and ∼290 SNLS SNe Ia (z ≤ 1). To probe systematic uncertainties in detail, we vary the input spectral model, the model of intrinsic scatter, and the smoothing (i.e., regularization) parameters used during the SALT-II model training. Using realistic intrinsic scatter models results in a slight bias in the ultraviolet portion of the trained SALT-II model, and w biases (w {sub input} – w {sub recovered}) ranging from –0.005 ± 0.012 to –0.024 ± 0.010. These biases are indistinguishable from each other within the uncertainty; the average bias on w is –0.014 ± 0.007.

  12. The Effects of Uncertainty in Speed-Flow Curve Parameters on a Large-Scale Model

    DEFF Research Database (Denmark)

    Manzo, Stefano; Nielsen, Otto Anker; Prato, Carlo Giacomo

    2014-01-01

    Uncertainty is inherent in transport models and prevents the use of a deterministic approach when traffic is modeled. Quantifying uncertainty thus becomes an indispensable step to produce a more informative and reliable output of transport models. In traffic assignment models, volume-delay functions...

  13. Prediction uncertainty assessment of a systems biology model requires a sample of the full probability distribution of its parameters

    NARCIS (Netherlands)

    Mourik, van S.; Braak, ter C.J.F.; Stigter, J.D.; Molenaar, J.

    2014-01-01

    Multi-parameter models in systems biology are typically ‘sloppy’: some parameters or combinations of parameters may be hard to estimate from data, whereas others are not. One might expect that parameter uncertainty automatically leads to uncertain predictions, but this is not the case. We illustrate
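The 'sloppiness' point above, that large marginal parameter uncertainty need not imply uncertain predictions, can be shown with a two-parameter toy posterior in which only the sum of the parameters is constrained by data (all numbers illustrative):

```python
import numpy as np

# 'Sloppy' posterior sketch: the data constrain only t1 + t2 (the stiff
# direction), so each parameter alone is very uncertain, yet a prediction
# depending only on the sum is tight.
rng = np.random.default_rng(3)
n = 20000
s = rng.normal(2.0, 0.05, n)   # well-constrained (stiff) combination t1 + t2
d = rng.normal(0.0, 1.0, n)    # poorly constrained (sloppy) direction t1 - t2
t1 = 0.5 * (s + d)
t2 = 0.5 * (s - d)

prediction = t1 + t2           # prediction along the stiff direction only
print(t1.std(), prediction.std())  # wide marginal, narrow prediction
```

This is why the record argues for sampling the full joint posterior: marginal variances alone would grossly overstate the prediction uncertainty here, because they ignore the strong anti-correlation between t1 and t2.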

  14. Coastal aquifer management under parameter uncertainty: Ensemble surrogate modeling based simulation-optimization

    Science.gov (United States)

    Janardhanan, S.; Datta, B.

    2011-12-01

    Surrogate models are widely used to develop computationally efficient simulation-optimization models to solve complex groundwater management problems. Artificial intelligence based models are most often used for this purpose where they are trained using predictor-predictand data obtained from a numerical simulation model. Most often this is implemented with the assumption that the parameters and boundary conditions used in the numerical simulation model are perfectly known. However, in most practical situations these values are uncertain. Under these circumstances the application of such approximation surrogates becomes limited. In our study we develop a surrogate model based coupled simulation optimization methodology for determining optimal pumping strategies for coastal aquifers considering parameter uncertainty. An ensemble surrogate modeling approach is used along with multiple realization optimization. The methodology is used to solve a multi-objective coastal aquifer management problem considering two conflicting objectives. Hydraulic conductivity and the aquifer recharge are considered as uncertain values. The three-dimensional coupled flow and transport simulation model FEMWATER is used to simulate the aquifer responses for a number of scenarios corresponding to Latin hypercube samples of pumping and uncertain parameters to generate input-output patterns for training the surrogate models. Non-parametric bootstrap sampling of this original data set is used to generate multiple data sets which belong to different regions in the multi-dimensional decision and parameter space. These data sets are used to train and test multiple surrogate models based on genetic programming. The ensemble of surrogate models is then linked to a multi-objective genetic algorithm to solve the pumping optimization problem. Two conflicting objectives, viz., maximizing total pumping from beneficial wells and minimizing the total pumping from barrier wells for hydraulic control of
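The Latin hypercube sampling step mentioned above can be sketched directly: each dimension is divided into n equal-probability strata and exactly one sample is drawn per stratum, giving better space coverage than plain random sampling. The parameter ranges below are illustrative stand-ins for pumping rates and uncertain aquifer properties, not the study's values.

```python
import numpy as np

def latin_hypercube(n_samples, bounds, rng):
    """Simple Latin hypercube sample: one point per stratum in each dimension."""
    d = len(bounds)
    # stratified uniforms: sample k falls in stratum [k/n, (k+1)/n)
    u = (rng.random((n_samples, d)) + np.arange(n_samples)[:, None]) / n_samples
    for j in range(d):               # shuffle strata independently per dimension
        rng.shuffle(u[:, j])
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    return lo + u * (hi - lo)

rng = np.random.default_rng(4)
# Illustrative ranges: two pumping rates, hydraulic conductivity, recharge.
bounds = [(0.0, 500.0), (0.0, 500.0), (1e-5, 1e-3), (100.0, 400.0)]
samples = latin_hypercube(64, bounds, rng)
print(samples.shape)   # one row per scenario to run through the simulation model
```

Each row would then be fed to the numerical model to build the predictor-predictand pairs used to train the surrogates.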

  15. Parameter uncertainty analysis for the annual phosphorus loss estimator (APLE) model

    Science.gov (United States)

    Technical abstract: Models are often used to predict phosphorus (P) loss from agricultural fields. While it is commonly recognized that model predictions are inherently uncertain, few studies have addressed prediction uncertainties using P loss models. In this study, we conduct an uncertainty analys...

  16. SWAT hydrologic model parameter uncertainty and its implications for hydroclimatic projections in snowmelt-dependent watersheds

    Science.gov (United States)

    Ficklin, Darren L.; Barnhart, Bradley L.

    2014-11-01

    The effects of climate change on water resources have been studied extensively throughout the world through the use of hydrologic models coupled with General Circulation Model (GCM) output or climate sensitivity scenarios. This paper examines the effects of hydrologic model parameterization uncertainty or equifinality, where multiple unique hydrologic model parameter sets can result in adequate calibration metrics, on hydrologic projections from downscaled GCMs for three snowmelt-dependent watersheds (upper reaches of the Clearwater, Gunnison, and Sacramento River watersheds) in the western United States. The hydrologic model used in this study is the Soil and Water Assessment Tool (SWAT) and is calibrated for discharge at the watershed outlet in each watershed. Despite achieving similar calibration metrics, a majority of hydrologic projections of average annual streamflow during the 2080s were statistically different, with differences in magnitude and direction (increase or decrease) compared to historical annual streamflows. At the average monthly time-scale, a majority of the hydrologic projections varied in peak streamflow timing, peak streamflow magnitude, summer streamflows, as well as overall increases or decreases compared to the historical monthly streamflows. Snowmelt projections from the SWAT model also widely varied, both in depth and snowmelt peak timing, for all watersheds. Since a large portion of the runoff-producing regions in the western United States is snowmelt-dependent, this has large implications for the prediction of the amount and timing of streamflow in the coming century. This paper shows that hydrologic model parameterizations that give similar adequate calibration metrics can lead to statistically significant differences in hydrologic projections under climate change. Therefore, researchers and water resource managers should account for this uncertainty by assembling ensemble projections from both multiple parameter sets and GCMs.

  17. Efficient probabilistic model personalization integrating uncertainty on data and parameters: Application to eikonal-diffusion models in cardiac electrophysiology.

    Science.gov (United States)

    Konukoglu, Ender; Relan, Jatin; Cilingir, Ulas; Menze, Bjoern H; Chinchapatnam, Phani; Jadidi, Amir; Cochet, Hubert; Hocini, Mélèze; Delingette, Hervé; Jaïs, Pierre; Haïssaguerre, Michel; Ayache, Nicholas; Sermesant, Maxime

    2011-10-01

    Biophysical models are increasingly used for medical applications at the organ scale. However, model predictions are rarely associated with a confidence measure, although there are important sources of uncertainty in computational physiology methods. For instance, the clinical data used to adjust the model parameters (personalization) are sparse and noisy, and soft tissue physiology is difficult to model accurately. Recent theoretical progress in stochastic models makes their use computationally tractable, but there is still a challenge in estimating patient-specific parameters with such models. In this work we propose an efficient Bayesian inference method for model personalization using polynomial chaos and compressed sensing. This method makes Bayesian inference feasible in real 3D modeling problems. We demonstrate our method on cardiac electrophysiology. We first present validation results on synthetic data, then we apply the proposed method to clinical data. We demonstrate how this can help in quantifying the impact of the data characteristics on the personalization (and thus prediction) results. The described method can be beneficial for the clinical use of personalized models as it explicitly takes into account the uncertainties on the data and the model parameters while still enabling simulations that can be used to optimize treatment. Such uncertainty handling can be pivotal for the proper use of modeling as a clinical tool, because there is a crucial requirement to know the confidence one can have in personalized models.
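A key ingredient above is polynomial chaos: expanding a model output in orthogonal polynomials of the random inputs, so that statistics fall out of the coefficients. A minimal one-dimensional sketch with a standard-normal input and probabilists' Hermite polynomials (the stand-in model output y = xi^2 is illustrative, not the cardiac model):

```python
import math
import numpy as np
from numpy.polynomial.hermite_e import hermevander

# Minimal 1D polynomial chaos expansion (PCE) by regression: expand a model
# output in probabilists' Hermite polynomials He_k of a standard normal input,
# then read mean and variance directly from the coefficients.
rng = np.random.default_rng(5)
xi = rng.normal(size=400)            # samples of the standard normal input
y = xi ** 2                          # stand-in model output (illustrative)

order = 4
Psi = hermevander(xi, order)         # columns He_0(xi) .. He_4(xi)
coef, *_ = np.linalg.lstsq(Psi, y, rcond=None)

norms = np.array([math.factorial(k) for k in range(order + 1)])  # E[He_k^2] = k!
pce_mean = coef[0]                   # mean is the constant-mode coefficient
pce_var = np.sum(coef[1:] ** 2 * norms[1:])
print(pce_mean, pce_var)             # should approach E[xi^2] = 1, Var[xi^2] = 2
```

In the paper's setting, compressed sensing is used to recover such coefficients from far fewer model evaluations than a dense regression would need; the sketch above uses ordinary least squares for clarity.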

  18. Influences of parameter uncertainties within the ICRP 66 respiratory tract model: particle deposition.

    Science.gov (United States)

Farfán, E B; Huh, C; Huston, T E; Bolch, W E

    2001-10-01

    Risk assessment associated with the inhalation of radioactive aerosols requires as an initial step the determination of particle deposition within the various anatomic regions of the respiratory tract. The model outlined in ICRP Publication 66 represents to date one of the most complete overall descriptions of not only particle deposition, but of particle clearance and local radiation dosimetry of lung tissues. In this study, a systematic review of the deposition component within the ICRP 66 respiratory tract model was conducted in which probability density functions were assigned to all input parameters. These distributions were subsequently incorporated within a computer code LUDUC (LUng Dose Uncertainty Code) in which Latin hypercube sampling techniques are used to generate multiple (e.g., 1,000) sets of input vectors (i.e., trials) for all of the model parameters needed to assess particle deposition within the extrathoracic (anterior and posterior), bronchial, bronchiolar, and alveolar-interstitial regions of the ICRP 66 respiratory tract model. Particle deposition values for the various trial simulations were shown to be well described by lognormal probability distributions. Geometric mean deposition fractions from LUDUC were found to be within approximately +/- 10% of the single-value estimates from the LUDEP computer code for each anatomic region and for particle diameters ranging from 0.001 to 50 microm. In all regions of the respiratory tract, LUDUC simulations for an adult male at light exertion show that uncertainties in particle deposition fractions are distributed over a range of only about a factor of 2-4 for particle sizes between 0.005 and 0.2 microm. Below 0.005 microm, uncertainties increase only for deposition within the alveolar region. At particle sizes exceeding 1 microm, uncertainties in the deposition fraction within the extrathoracic regions are relatively small, but approach a factor of 20 for deposition in the bronchial

  19. Cosmological Parameter Uncertainties from SALT-II Type Ia Supernova Light Curve Models

    CERN Document Server

    Mosher, J; Kessler, R; Astier, P; Marriner, J; Betoule, M; Sako, M; El-Hage, P; Biswas, R; Pain, R; Kuhlmann, S; Regnault, N; Frieman, J A; Schneider, D P

    2014-01-01

    We use simulated SN Ia samples, including both photometry and spectra, to perform the first direct validation of cosmology analysis using the SALT-II light curve model. This validation includes residuals from the light curve training process, systematic biases in SN Ia distance measurements, and the bias on the dark energy equation of state parameter w. Using the SN-analysis package SNANA, we simulate and analyze realistic samples corresponding to the data samples used in the SNLS3 analysis: 120 low-redshift (z < 0.1) SNe Ia, 255 SDSS SNe Ia (z < 0.4), and 290 SNLS SNe Ia (z <= 1). To probe systematic uncertainties in detail, we vary the input spectral model, the model of intrinsic scatter, and the smoothing (i.e., regularization) parameters used during the SALT-II model training. Using realistic intrinsic scatter models results in a slight bias in the ultraviolet portion of the trained SALT-II model, and w biases (winput - wrecovered) ranging from -0.005 +/- 0.012 to -0.024 +/- 0.010. These biases are indistinguishable from each other within the uncertainty; the average bias on w is -0.014 +/- 0.007.

  20. [Using CTS and PK-PD models to predict the effect of uncertainty about population parameters on clinical trial power].

    Science.gov (United States)

    Zhu, Ling; Shi, Xinling; Liu, Yajie

    2009-02-01

    Traditional clinical trial designs often depend on expert opinion and lack statistical evaluation. In this article, we present a method and illustrate how population parameter uncertainty may be incorporated in the overall simulation model. Using the techniques of clinical trial simulation (CTS) and setting up predictions on the basis of pharmacokinetics-pharmacodynamics (PK-PD) models, we advance the modeling methods for simulation, for treatment effects, and for the clinical trial power under the given PK-PD conditions. Then we discuss the model of uncertainty, suggest an ANOVA-based method, add eta2 statistics for sensitivity analysis, and examine the effect of uncertainty about population parameters on clinical trial power. The results from simulations and the indices derived from this type of sensitivity analysis may be used for grading the influence on the prediction quality of uncertainty about different population parameters. The experimental results are satisfactory and the approach presented has practical value in clinical trials.
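The eta2 (eta-squared) statistic mentioned above is the ANOVA variance ratio SS_between / SS_total: the share of output variance explained by varying one population parameter across levels. A minimal sketch with illustrative trial-power values (not the article's data):

```python
import numpy as np

# ANOVA-style eta^2 sensitivity index: fraction of total output variance
# explained by the factor that defines the groups.
groups = [np.array([0.70, 0.72, 0.71]),   # simulated trial power, parameter level 1
          np.array([0.80, 0.83, 0.81]),   # level 2
          np.array([0.90, 0.88, 0.91])]   # level 3

all_y = np.concatenate(groups)
grand = all_y.mean()
ss_between = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)
ss_total = np.sum((all_y - grand) ** 2)
eta2 = ss_between / ss_total      # close to 1 => this parameter dominates power
print(round(eta2, 3))
```

Computing eta2 per population parameter and ranking the values is one way to grade which parameter uncertainties most affect the predicted trial power, as the abstract describes.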

  1. Modelling Framework for the Identification of Critical Variables and Parameters under Uncertainty in the Bioethanol Production from Lignocellulose

    DEFF Research Database (Denmark)

    Morales Rodriguez, Ricardo; Meyer, Anne S.; Gernaey, Krist

    2011-01-01

    This study presents the development of a systematic modelling framework for identification of the most critical variables and parameters under uncertainty, evaluated on a lignocellulosic ethanol production case study. The systematic framework starts with: (1) definition of the objectives; (2) collection of data and the implementation of dynamic models for each unit operation in the process; (3) uncertainty and sensitivity analysis, performed to identify the critical operational variables and parameters in the process. The uncertainty analysis is carried out using the Monte-Carlo technique...

  2. Uncertainties in SDSS galaxy parameter determination: 3D photometrical modelling of test galaxies and restoration of their structural parameters

    CERN Document Server

    Tempel, Elmo; Kipper, Rain; Tenjes, Peeter

    2012-01-01

    Is it realistic to recover the 3D structure of galaxies from their images? To answer this question, we generate a sample of idealised model galaxies consisting of a disc-like component and a spheroidal component (bulge) with varying luminosities, inclination angles and structural parameters, and component density following the Einasto distribution. We simulate these galaxies as if observed in the SDSS project through ugriz filters, thus gaining a set of images of galaxies with known intrinsic properties. We remodel the galaxies with a 3D galaxy modelling procedure and compare the restored parameters to the initial ones in order to determine the uncertainties of the models. Down to the r-band limiting magnitude 18, errors of the restored integral luminosities and colour indices remain within 0.05 mag and errors of the luminosities of individual components within 0.2 mag. Accuracy of the restored bulge-to-disc ratios (B/D) is within 40% in most cases, and becomes even worse for galaxies with low B/D due to diff...

  3. Towards uncertainty quantification and parameter estimation for Earth system models in a component-based modeling framework

    Science.gov (United States)

    Peckham, Scott D.; Kelbert, Anna; Hill, Mary C.; Hutton, Eric W. H.

    2016-05-01

    Component-based modeling frameworks make it easier for users to access, configure, couple, run and test numerical models. However, they do not typically provide tools for uncertainty quantification or data-based model verification and calibration. To better address these important issues, modeling frameworks should be integrated with existing, general-purpose toolkits for optimization, parameter estimation and uncertainty quantification. This paper identifies and then examines the key issues that must be addressed in order to make a component-based modeling framework interoperable with general-purpose packages for model analysis. As a motivating example, one of these packages, DAKOTA, is applied to a representative but nontrivial surface process problem of comparing two models for the longitudinal elevation profile of a river to observational data. Results from a new mathematical analysis of the resulting nonlinear least squares problem are given and then compared to results from several different optimization algorithms in DAKOTA.
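The nonlinear least squares comparison described above can be sketched without DAKOTA: fit two candidate profile models to the same data and compare residual sums of squares. The profile forms and data below are illustrative stand-ins, not the paper's actual models or observations.

```python
import numpy as np
from scipy.optimize import least_squares

# Comparing two candidate models for a longitudinal river elevation profile
# by nonlinear least squares; whichever leaves the smaller residual wins.
x = np.linspace(1.0, 100.0, 30)                 # distance downstream (illustrative)
rng = np.random.default_rng(6)
z_obs = 500.0 * np.exp(-0.02 * x) + rng.normal(0.0, 2.0, x.size)

def resid_exp(p):        # model 1: exponential profile z = a * exp(-b x)
    return z_obs - p[0] * np.exp(-p[1] * x)

def resid_pow(p):        # model 2: power-law profile z = a * x**(-b)
    return z_obs - p[0] * x ** (-p[1])

fit_exp = least_squares(resid_exp, x0=[400.0, 0.01])
fit_pow = least_squares(resid_pow, x0=[400.0, 0.5])
sse_exp = np.sum(fit_exp.fun ** 2)
sse_pow = np.sum(fit_pow.fun ** 2)
print("exponential wins" if sse_exp < sse_pow else "power law wins")
```

Packages such as DAKOTA automate exactly this loop (and add multiple optimizers, confidence intervals, and sampling-based uncertainty quantification), which is why the paper argues for making modeling frameworks interoperable with them.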

  4. Sensitivity of land surface modeling to parameters: An uncertainty quantification method applied to the Community Land Model

    Science.gov (United States)

    Ricciuto, D. M.; Mei, R.; Mao, J.; Hoffman, F. M.; Kumar, J.

    2015-12-01

    Uncertainties in land parameters could have important impacts on simulated water and energy fluxes and land surface states, which will consequently affect atmospheric and biogeochemical processes. Therefore, quantification of such parameter uncertainties using a land surface model is the first step towards better understanding of predictive uncertainty in Earth system models. In this study, we applied a random-sampling, high-dimensional model representation (RS-HDMR) method to analyze the sensitivity of simulated photosynthesis, surface energy fluxes and surface hydrological components to selected land parameters in version 4.5 of the Community Land Model (CLM4.5). Because of the large computational expense of conducting ensembles of global gridded model simulations, we used the results of a previous cluster analysis to select one thousand representative land grid cells for simulation. Plant functional type (PFT)-specific uniform prior ranges for land parameters were determined using expert opinion and literature survey, and samples were generated with a quasi-Monte Carlo approach (Sobol sequence). Preliminary analysis of 1024 simulations suggested that four PFT-dependent parameters (including slope of the conductance-photosynthesis relationship, specific leaf area at canopy top, leaf C:N ratio and fraction of leaf N in RuBisCO) are the dominant sensitive parameters for photosynthesis, surface energy and water fluxes across most PFTs, but with varying importance rankings. On the other hand, for surface and sub-surface runoff, PFT-independent parameters, such as the depth-dependent decay factors for runoff, play more important roles than the previous four PFT-dependent parameters. Further analysis by conditioning the results on different seasons and years is being conducted to provide guidance on how climate variability and change might affect such sensitivity. This is the first step toward coupled simulations including biogeochemical processes, atmospheric processes
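The quasi-Monte Carlo sampling step above, drawing a Sobol sequence and scaling it to uniform prior ranges, can be sketched with SciPy's `scipy.stats.qmc` module (available in SciPy 1.7+). The four parameter ranges below are illustrative stand-ins, not the study's actual PFT-specific bounds; the 1024-member size echoes the record.

```python
import numpy as np
from scipy.stats import qmc

# Quasi-Monte Carlo parameter sampling with a scrambled Sobol sequence,
# scaled to uniform prior ranges, one row per ensemble member.
sampler = qmc.Sobol(d=4, scramble=True, seed=7)
unit = sampler.random_base2(m=10)            # 2**10 = 1024 points in [0, 1)^4
lower = np.array([1.0, 10.0, 20.0, 0.05])    # illustrative lower bounds
upper = np.array([9.0, 60.0, 60.0, 0.30])    # illustrative upper bounds
params = qmc.scale(unit, lower, upper)
print(params.shape)                          # (1024, 4)
```

Low-discrepancy sequences like Sobol fill the parameter hypercube far more evenly than pseudo-random sampling, which improves the convergence of the RS-HDMR sensitivity estimates for a fixed ensemble size.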

  5. Significance of uncertainties derived from settling tank model structure and parameters on predicting WWTP performance - A global sensitivity analysis study

    DEFF Research Database (Denmark)

    Ramin, Elham; Sin, Gürkan; Mikkelsen, Peter Steen

    2011-01-01

    Uncertainty derived from one of the process models – such as one-dimensional secondary settling tank (SST) models – can impact the output of the other process models, e.g., biokinetic (ASM1), as well as the integrated wastewater treatment plant (WWTP) models. The model structure and parameter uncertainty of settler models can therefore propagate, and add to the uncertainties in prediction of any plant performance criteria. Here we present an assessment of the relative significance of secondary settling model performance in WWTP simulations. We perform a global sensitivity analysis (GSA) based... The outcome of this study contributes to a better understanding of uncertainty in WWTPs, and explicitly demonstrates the significance of secondary settling processes that are crucial elements of model prediction under dry and wet-weather loading conditions.

  6. Addressing model uncertainty through stochastic parameter perturbations within the High Resolution Rapid Refresh (HRRR) ensemble

    Science.gov (United States)

    Wolff, J.; Jankov, I.; Beck, J.; Carson, L.; Frimel, J.; Harrold, M.; Jiang, H.

    2016-12-01

    It is well known that global and regional numerical weather prediction ensemble systems are under-dispersive, producing unreliable and overconfident ensemble forecasts. Typical approaches to alleviate this problem include the use of multiple dynamic cores, multiple physics suite configurations, or a combination of the two. While these approaches may produce desirable results, they have practical and theoretical deficiencies and are more difficult and costly to maintain. An active area of research that promotes a more unified and sustainable system for addressing the deficiencies in ensemble modeling is the use of stochastic physics to represent model-related uncertainty. Stochastic approaches include Stochastic Parameter Perturbations (SPP), Stochastic Kinetic Energy Backscatter (SKEB), Stochastic Perturbation of Physics Tendencies (SPPT), or some combination of all three. The focus of this study is to assess the model performance within a convection-permitting ensemble at 3-km grid spacing across the Contiguous United States (CONUS) when using stochastic approaches. For this purpose, the test utilized a single physics suite configuration based on the operational High-Resolution Rapid Refresh (HRRR) model, with ensemble members produced by employing stochastic methods. Parameter perturbations were employed in the Rapid Update Cycle (RUC) land surface model and Mellor-Yamada-Nakanishi-Niino (MYNN) planetary boundary layer scheme. Results will be presented in terms of bias, error, spread, skill, accuracy, reliability, and sharpness using the Model Evaluation Tools (MET) verification package. Due to the high level of complexity of running a frequently updating (hourly), high spatial resolution (3 km), large domain (CONUS) ensemble system, extensive high performance computing (HPC) resources were needed to meet this objective. Supercomputing resources were provided through the National Center for Atmospheric Research (NCAR) Strategic Capability (NSC) project support.

  7. Multistate Statistical Modeling: A Tool to Build a Lung Cancer Microsimulation Model That Includes Parameter Uncertainty and Patient Heterogeneity.

    Science.gov (United States)

    Bongers, Mathilda L; de Ruysscher, Dirk; Oberije, Cary; Lambin, Philippe; Uyl-de Groot, Carin A; Coupé, V M H

    2016-01-01

    With the shift toward individualized treatment, cost-effectiveness models need to incorporate patient and tumor characteristics that may be relevant to treatment planning. In this study, we used multistate statistical modeling to inform a microsimulation model for cost-effectiveness analysis of individualized radiotherapy in lung cancer. The model tracks clinical events over time and takes patient and tumor features into account. Four clinical states were included in the model: alive without progression, local recurrence, metastasis, and death. Individual patients were simulated by repeatedly sampling a patient profile, consisting of patient and tumor characteristics. The transitioning of patients between the health states is governed by personalized time-dependent hazard rates, which were obtained from multistate statistical modeling (MSSM). The model simulations for both the individualized and conventional radiotherapy strategies demonstrated internal and external validity. Therefore, MSSM is a useful technique for obtaining the correlated individualized transition rates that are required for the quantification of a microsimulation model. Moreover, we have used the hazard ratios, their 95% confidence intervals, and their covariance to quantify the parameter uncertainty of the model in a correlated way. The obtained model will be used to evaluate the cost-effectiveness of individualized radiotherapy treatment planning, including the uncertainty of input parameters. We discuss the model-building process and the strengths and weaknesses of using MSSM in a microsimulation model for individualized radiotherapy in lung cancer.

  8. Uncertainty Analysis in the Noise Parameters Estimation

    Directory of Open Access Journals (Sweden)

    Pawlik P.

    2012-07-01

    Full Text Available A new approach to uncertainty estimation in modelling acoustic hazards by means of interval arithmetic is presented in this paper. In noise parameter estimation, selecting the parameters that specify acoustic wave propagation in an open space, as well as parameters that are only available as average values, often constitutes a difficult problem. In such cases it is necessary to determine the variance and, strictly related to it, the uncertainty of the model parameters. The interval arithmetic formalism allows the input data uncertainties to be estimated without determining their probability distributions, which other methods of uncertainty assessment require. A further problem in acoustic hazard estimation is the lack of exact knowledge of the input parameters. In connection with the above, the modelling uncertainty was analysed as a function of the inaccuracy of the model parameters. To achieve this aim, the interval arithmetic formalism – representing a value and its uncertainty as an interval – was applied. The proposed approach is illustrated by an example applying the Dutch RMR SRM method, recommended by European Union Directive 2002/49/EC, to railway noise modelling.
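A minimal sketch of the interval arithmetic formalism the record relies on, propagating a value-plus-uncertainty interval through arithmetic. The toy linear attenuation model and all interval bounds are invented for illustration (the actual RMR SRM method involves many more terms):

```python
class Interval:
    """Closed interval [lo, hi] with the basic arithmetic used in uncertainty propagation."""

    def __init__(self, lo, hi):
        assert lo <= hi
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __sub__(self, other):
        # Subtraction widens: worst cases pair opposite endpoints
        return Interval(self.lo - other.hi, self.hi - other.lo)

    def __mul__(self, other):
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))

    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"


# Toy example: received level = uncertain source level - uncertain propagation loss
source = Interval(95.0, 100.0)       # dB, uncertain source emission level
attenuation = Interval(20.0, 25.0)   # dB, uncertain distance/ground attenuation
received = source - attenuation
print(received)  # [70.0, 80.0]
```

No probability distribution is assumed anywhere: the result is a guaranteed enclosure of the output given the input bounds, which is exactly the property the abstract contrasts with probabilistic uncertainty methods.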

  9. A methodology for estimating the uncertainty in model parameters applying the robust Bayesian inferences

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Joo Yeon; Lee, Seung Hyun; Park, Tai Jin [Korean Association for Radiation Application, Seoul (Korea, Republic of)

    2016-06-15

    Any real application of Bayesian inference must acknowledge that both the prior distribution and the likelihood function are only specified as more or less convenient approximations to whatever the analyzer's true beliefs might be. If the inferences from a Bayesian analysis are to be trusted, it is important to determine that they are robust to such variations of prior and likelihood as might also be consistent with the analyzer's stated beliefs. Robust Bayesian inference was applied to atmospheric dispersion assessment using a Gaussian plume model. The scope of contamination was specified through uncertainty in distribution type and parametric variability. The probability distributions of the model parameters were assumed to be contaminated by symmetric unimodal and unimodal distributions. The distribution of the sector-averaged relative concentrations was then calculated by applying the contaminated priors to the model parameters. The sector-averaged concentrations for each stability class were compared under the symmetric unimodal and unimodal priors, respectively, as the contaminating class based on ε-contamination. Although ε was assumed to be 10%, the medians under the symmetric unimodal priors agreed within 10% of those under the plausible priors. However, the medians under the unimodal priors agreed only within 20% at a few downwind distances. Robustness was assessed by estimating how far the results of the Bayesian inferences remain stable under reasonable variations of the plausible priors. From these robust inferences, it is reasonable to apply symmetric unimodal priors when analyzing the robustness of Bayesian inferences.
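The ε-contamination idea can be sketched by mixing a plausible prior with a broader contaminating distribution and pushing the samples through a ground-level, centreline Gaussian plume formula. All numeric values (priors, emission rate, wind speed, stack height) are illustrative assumptions, not those of the study:

```python
import numpy as np

rng = np.random.default_rng(0)

def plume_clc(Q, u, sigma_y, sigma_z, H):
    """Ground-level centreline concentration of a simple Gaussian plume."""
    return Q / (np.pi * sigma_y * sigma_z * u) * np.exp(-H**2 / (2 * sigma_z**2))

# epsilon-contamination of the prior on the vertical dispersion parameter:
# with prob 1-eps draw from the plausible prior, with prob eps from a
# broader contaminating distribution (both choices are illustrative).
eps = 0.10
n = 10_000
plausible = rng.lognormal(mean=np.log(10.0), sigma=0.2, size=n)
contaminating = rng.lognormal(mean=np.log(10.0), sigma=0.6, size=n)
pick = rng.random(n) < eps
sigma_z = np.where(pick, contaminating, plausible)

conc = plume_clc(Q=1.0, u=5.0, sigma_y=20.0, sigma_z=sigma_z, H=50.0)
print(np.median(conc))
```

Repeating the calculation with the uncontaminated `plausible` samples and comparing medians mimics the robustness comparison described in the abstract.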

  10. Domain Knowledge Uncertainty and Probabilistic Parameter Constraints

    CERN Document Server

    Mao, Yi

    2012-01-01

    Incorporating domain knowledge into the modeling process is an effective way to improve learning accuracy. However, as it is provided by humans, domain knowledge can only be specified with some degree of uncertainty. We propose to explicitly model such uncertainty through probabilistic constraints over the parameter space. In contrast to hard parameter constraints, our approach is effective also when the domain knowledge is inaccurate and generally results in superior modeling accuracy. We focus on generative and conditional modeling where the parameters are assigned a Dirichlet or Gaussian prior and demonstrate the framework with experiments on both synthetic and real-world data.
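A small sketch of a probabilistic (soft) parameter constraint in the generative setting the abstract mentions: uncertain domain knowledge enters as a Dirichlet prior over multinomial parameters and shifts the MAP estimate toward the stated belief, with a strength reflecting our confidence. The counts, belief vector, and prior strength are invented for illustration:

```python
import numpy as np

counts = np.array([5, 3, 2])            # observed state counts (toy data)
mle = counts / counts.sum()             # estimate without domain knowledge

# Soft constraint: "state 0 occurs ~60% of the time", held with moderate confidence
strength = 20.0                         # pseudo-count mass; higher = more confident
belief = np.array([0.6, 0.2, 0.2])
alpha = 1.0 + strength * belief         # Dirichlet hyperparameters

# Posterior is Dirichlet(counts + alpha); its mode gives the MAP estimate
map_est = (counts + alpha - 1.0) / (counts.sum() + alpha.sum() - len(counts))
print(mle, map_est)
```

Unlike a hard constraint, an inaccurate belief here only biases the estimate in proportion to `strength`; as data accumulate, the counts dominate and the estimate recovers.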

  11. Quantifying uncertainties in streamflow predictions through signature based inference of hydrological model parameters

    Science.gov (United States)

    Fenicia, Fabrizio; Reichert, Peter; Kavetski, Dmitri; Albert, Carlo

    2016-04-01

    The calibration of hydrological models based on signatures (e.g. Flow Duration Curves - FDCs) is often advocated as an alternative to model calibration based on the full time series of system responses (e.g. hydrographs). Signature based calibration is motivated by various arguments. From a conceptual perspective, calibration on signatures is a way to filter out errors that are difficult to represent when calibrating on the full time series. Such errors may for example occur when observed and simulated hydrographs are shifted, either on the "time" axis (i.e. left or right), or on the "streamflow" axis (i.e. above or below). These shifts may be due to errors in the precipitation input (time or amount), and if not properly accounted for in the likelihood function, may cause biased parameter estimates (e.g. estimated model parameters that do not reproduce the recession characteristics of a hydrograph). From a practical perspective, signature based calibration is seen as a possible solution for making predictions in ungauged basins. Where streamflow data are not available, it may in fact be possible to reliably estimate streamflow signatures. Previous research has for example shown how FDCs can be reliably estimated at ungauged locations based on climatic and physiographic influence factors. Typically, the goal of signature based calibration is not the prediction of the signatures themselves, but the prediction of the system responses. Ideally, the prediction of system responses should be accompanied by a reliable quantification of the associated uncertainties. Previous approaches for signature based calibration, however, do not allow reliable estimates of streamflow predictive distributions. Here, we illustrate how the Bayesian approach can be employed to obtain reliable streamflow predictive distributions based on signatures. A case study is presented, where a hydrological model is calibrated on FDCs and additional signatures.
We propose an approach where the likelihood
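The signature at the centre of this record, the flow duration curve, can be computed with a short routine. The synthetic flow series and the use of Weibull plotting positions are illustrative assumptions:

```python
import numpy as np

def flow_duration_curve(q):
    """Return (exceedance probability, sorted flows) for a streamflow series."""
    q = np.sort(np.asarray(q))[::-1]              # flows in descending order
    n = len(q)
    exceedance = np.arange(1, n + 1) / (n + 1)    # Weibull plotting positions
    return exceedance, q

# Synthetic daily flows standing in for an observed record (illustrative)
rng = np.random.default_rng(42)
flows = rng.lognormal(mean=1.0, sigma=0.8, size=365)
p, q = flow_duration_curve(flows)

# Example FDC signatures of the kind used in signature-based calibration
q50 = np.interp(0.50, p, q)   # median flow (exceeded 50% of the time)
q95 = np.interp(0.95, p, q)   # low-flow signature (exceeded 95% of the time)
print(q50, q95)
```

In a signature-based calibration, such statistics computed on simulated flows would be compared with their observed (or regionalized) counterparts inside the likelihood function.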

  12. Information on Hydrologic Conceptual Models, Parameters, Uncertainty Analysis, and Data Sources for Dose Assessments at Decommissioning Sites

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, Philip D.; Gee, Glendon W.; Nicholson, Thomas J.

    2000-02-28

    This report addresses issues related to the analysis of uncertainty in dose assessments conducted as part of decommissioning analyses. The analysis is limited to the hydrologic aspects of the exposure pathway involving infiltration of water at the ground surface, leaching of contaminants, and transport of contaminants through the groundwater to a point of exposure. The basic conceptual models and mathematical implementations of three dose assessment codes are outlined along with the site-specific conditions under which the codes may provide inaccurate, potentially nonconservative results. In addition, the hydrologic parameters of the codes are identified and compared. A methodology for parameter uncertainty assessment is outlined that considers the potential data limitations and modeling needs of decommissioning analyses. This methodology uses generic parameter distributions based on national or regional databases, sensitivity analysis, probabilistic modeling, and Bayesian updating to incorporate site-specific information. Data sources for best-estimate parameter values and parameter uncertainty information are also reviewed. A follow-on report will illustrate the uncertainty assessment methodology using decommissioning test cases.

  13. Stability Analysis for Li-Ion Battery Model Parameters and State of Charge Estimation by Measurement Uncertainty Consideration

    Directory of Open Access Journals (Sweden)

    Shifei Yuan

    2015-07-01

    Full Text Available Accurate estimation of model parameters and state of charge (SoC) is crucial for the lithium-ion battery management system (BMS). In this paper, the stability of model parameter and SoC estimation under measurement uncertainty is evaluated with respect to three factors: (i) sampling periods of 1/0.5/0.1 s; (ii) current sensor precisions of ±5/±50/±500 mA; and (iii) voltage sensor precisions of ±1/±2.5/±5 mV. Firstly, a numerical model stability analysis and a parametric sensitivity analysis for the battery model parameters are conducted for sampling frequencies of 1–50 Hz. A perturbation analysis of the effect of current/voltage measurement uncertainty on model parameter variation is performed theoretically. Secondly, the impact of the three factors on model parameter and SoC estimation is evaluated with the federal urban driving sequence (FUDS) profile. The bias-correction recursive least squares (CRLS) and adaptive extended Kalman filter (AEKF) algorithms are adopted to estimate the model parameters and SoC jointly. Finally, the simulation results are compared and some insightful findings are drawn. For the given battery model and parameter estimation algorithm, the sampling period and the current/voltage sampling accuracy have a non-negligible effect on the estimation results of the model parameters. This research reveals the influence of measurement uncertainty on model parameter estimation and provides guidelines for selecting a reasonable sampling period and current/voltage sensor sampling precisions in engineering applications.
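The joint parameter estimation described above rests on recursive least squares. Below is a plain RLS sketch with a forgetting factor (the bias-correction step of CRLS is omitted for brevity), applied to a toy ARX model with invented true parameters rather than an actual battery equivalent-circuit model:

```python
import numpy as np

def rls(phi_seq, y_seq, n_params, lam=0.99):
    """Recursive least squares with forgetting factor lam."""
    theta = np.zeros(n_params)
    P = np.eye(n_params) * 1e3          # large initial covariance
    for phi, y in zip(phi_seq, y_seq):
        Pphi = P @ phi
        k = Pphi / (lam + phi @ Pphi)   # gain vector
        theta = theta + k * (y - phi @ theta)
        P = (P - np.outer(k, phi) @ P) / lam
    return theta

# Toy system y[k] = a*y[k-1] + b*u[k] + noise (a, b invented for illustration)
rng = np.random.default_rng(1)
a_true, b_true = 0.9, 0.5
u = rng.standard_normal(500)
y = np.zeros(500)
for k in range(1, 500):
    y[k] = a_true * y[k - 1] + b_true * u[k] + 0.01 * rng.standard_normal()

phi_seq = [np.array([y[k - 1], u[k]]) for k in range(1, 500)]
theta = rls(phi_seq, y[1:], n_params=2)
print(theta)  # close to [0.9, 0.5]
```

In the battery setting, the regressor would instead contain lagged terminal voltages and currents, and the sensitivity of `theta` to sampling period and sensor precision is exactly what the study quantifies.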

  14. Implicit Treatment of Technical Specification and Thermal Hydraulic Parameter Uncertainties in Gaussian Process Model to Estimate Safety Margin

    Directory of Open Access Journals (Sweden)

    Douglas A. Fynan

    2016-06-01

    Full Text Available The Gaussian process model (GPM) is a flexible surrogate model that can be used for nonparametric regression for multivariate problems. A unique feature of the GPM is that a prediction variance is automatically provided with the regression function. In this paper, we estimate the safety margin of a nuclear power plant by performing regression on the output of best-estimate simulations of a large-break loss-of-coolant accident with sampling of safety system configuration, sequence timing, technical specifications, and thermal hydraulic parameter uncertainties. The key aspect of our approach is that the GPM regression is only performed on the dominant input variables, the safety injection flow rate and the delay time for AC powered pumps to start representing sequence timing uncertainty, providing a predictive model for the peak clad temperature during a reflood phase. Other uncertainties are interpreted as contributors to the measurement noise of the code output and are implicitly treated in the GPM in the noise variance term, providing local uncertainty bounds for the peak clad temperature. We discuss the applicability of the foregoing method to reduce the use of conservative assumptions in best estimate plus uncertainty (BEPU) and Level 1 probabilistic safety assessment (PSA) success criteria definitions while dealing with a large number of uncertainties.
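A minimal sketch of GPM regression with an RBF kernel, where the noise variance term plays the role of the implicitly treated uncertainties and the returned variance gives the local uncertainty bounds. The hyperparameters and the sine test function are illustrative assumptions:

```python
import numpy as np

def gp_predict(X, y, Xs, length=1.0, amp=1.0, noise=0.1):
    """GP regression with an RBF kernel; returns mean and variance at Xs."""
    def k(a, b):
        d2 = (a[:, None] - b[None, :]) ** 2
        return amp**2 * np.exp(-0.5 * d2 / length**2)

    # Noise variance on the diagonal absorbs unmodeled input uncertainties
    K = k(X, X) + noise**2 * np.eye(len(X))
    Ks = k(Xs, X)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mean = Ks @ alpha
    v = np.linalg.solve(L, Ks.T)
    var = np.diag(k(Xs, Xs)) - np.sum(v**2, axis=0) + noise**2
    return mean, var

rng = np.random.default_rng(0)
X = np.linspace(0, 5, 30)                 # stand-in for dominant input variable
y = np.sin(X) + 0.1 * rng.standard_normal(30)  # stand-in for code output
Xs = np.array([1.0, 2.5, 4.0])
mean, var = gp_predict(X, y, Xs)
print(mean, var)
```

In the safety-margin application, `X` would be the dominant inputs (e.g. safety injection flow rate) and `y` the peak clad temperature from the best-estimate code runs.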

  15. Implicit treatment of technical specification and thermal hydraulic parameter uncertainties in Gaussian process model to estimate safety margin

    Energy Technology Data Exchange (ETDEWEB)

    Fynan, Douglas A.; Ahn, Kwang Il [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-06-15

    The Gaussian process model (GPM) is a flexible surrogate model that can be used for nonparametric regression for multivariate problems. A unique feature of the GPM is that a prediction variance is automatically provided with the regression function. In this paper, we estimate the safety margin of a nuclear power plant by performing regression on the output of best-estimate simulations of a large-break loss-of-coolant accident with sampling of safety system configuration, sequence timing, technical specifications, and thermal hydraulic parameter uncertainties. The key aspect of our approach is that the GPM regression is only performed on the dominant input variables, the safety injection flow rate and the delay time for AC powered pumps to start representing sequence timing uncertainty, providing a predictive model for the peak clad temperature during a reflood phase. Other uncertainties are interpreted as contributors to the measurement noise of the code output and are implicitly treated in the GPM in the noise variance term, providing local uncertainty bounds for the peak clad temperature. We discuss the applicability of the foregoing method to reduce the use of conservative assumptions in best estimate plus uncertainty (BEPU) and Level 1 probabilistic safety assessment (PSA) success criteria definitions while dealing with a large number of uncertainties.

  16. An Interval-Parameter Fuzzy Linear Programming with Stochastic Vertices Model for Water Resources Management under Uncertainty

    Directory of Open Access Journals (Sweden)

    Yan Han

    2013-01-01

    Full Text Available An interval-parameter fuzzy linear programming with stochastic vertices (IFLPSV) method is developed for water resources management under uncertainty by coupling interval-parameter fuzzy linear programming (IFLP) with stochastic programming (SP). As an extension of existing interval-parameter fuzzy linear programming, the developed IFLPSV approach has advantages in dealing with dual uncertainty optimization problems, in which uncertainty appears as interval parameters with stochastic vertices in both the objective functions and the constraints. The developed IFLPSV method improves upon the IFLP method by allowing dual uncertainty parameters to be incorporated into the optimization processes. A hybrid intelligent algorithm based on a genetic algorithm and an artificial neural network is used to solve the developed model. The developed method is then applied to water resources allocation in Beijing, China, in 2020, where water resources shortage is a challenging issue. The results indicate that reasonable solutions have been obtained, which are helpful and useful for decision makers. Although the amount of water supplied from the Guanting and Miyun reservoirs is declining with rainfall reduction, water supply from the South-to-North Water Transfer project will have an important impact on the water supply structure of Beijing, particularly in dry and extraordinarily dry years.

  17. More Efficient Bayesian-based Optimization and Uncertainty Assessment of Hydrologic Model Parameters

    Science.gov (United States)

    2012-02-01

    is more objective, repeatable, and better capitalizes on the computational capacity of the modern computer) is an active area of research and...existence of multiple local optima, non-smooth objective function surfaces, and long valleys in parameter space that are a result of excessive parameter...outputs, structural aspects of the model, as well as its input dataset, model parameters that are adjustable through the calibration process, and the

  18. Estimation of the Influence of Power System Mathematical Model Parameter Uncertainty on PSS2A System Stabilizers

    Directory of Open Access Journals (Sweden)

    Adrian Nocoń

    2015-09-01

    Full Text Available This paper presents an analysis of the influence of uncertainty in power system mathematical model parameters on the optimised parameters of PSS2A system stabilizers. Optimisation of power system stabilizer parameters was based on polyoptimisation (multi-criteria optimisation). Optimisation criteria were determined for disturbances occurring in a multi-machine power system, taking into account transient waveforms associated with electromechanical swings (instantaneous power, angular speed and terminal voltage waveforms of generators). A genetic algorithm with floating-point encoding, tournament selection, mean crossover and perturbative mutation, modified for the needs of these investigations, was used for optimisation. The impact of the uncertainties on the quality of operation of power system stabilizers with optimised parameters was evaluated using various deformation factors.

  19. Rate control system algorithm developed in state space for models with parameter uncertainties

    Directory of Open Access Journals (Sweden)

    Adilson Jesus Teixeira

    2011-09-01

    Full Text Available Research in weightlessness above the atmosphere requires a payload to carry the experiments. To achieve weightlessness, the payload uses a rate control system (RCS) to reduce the centripetal acceleration within the payload. The rate control system normally has actuators that supply a constant force when they are turned on. The control algorithm developed for this rate control system is based on the minimum-time problem method in state space, to overcome uncertainties in the parameters of the payload and actuator dynamics. The algorithm uses the initial conditions of optimal trajectories to create intermediate points, or to adjust existing points, of a switching function. Combined with an inequality constraint, this forms a decision function for turning the actuators on or off. For linear time-invariant systems in state space, this decision function only needs to test the payload state variables, instead of spending effort solving differential equations, and it is tuned in real time to the payload dynamics. Simulation results are shown for several cases of parameter uncertainty in which the rate control system algorithm reduced the payload centripetal acceleration below the μg level and kept it there with no limit cycle.

  20. Failure probability under parameter uncertainty.

    Science.gov (United States)

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
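The article's central effect can be reproduced numerically: set the threshold from an n-sample plug-in quantile of a log-normal risk factor (working on the log scale, which is equivalent by monotonicity) and observe that the realized failure frequency exceeds the nominal level and does not depend on the true parameters. The sample size, nominal level, and parameter values below are illustrative:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def realized_failure_prob(mu, sigma, n, nominal=0.01, trials=20_000):
    """Expected failure frequency when the threshold is a plug-in quantile
    estimated from n log-scale observations of a log-normal risk factor."""
    z = stats.norm.ppf(1 - nominal)
    logs = rng.normal(mu, sigma, size=(trials, n))
    thresholds = logs.mean(axis=1) + z * logs.std(axis=1, ddof=1)
    new = rng.normal(mu, sigma, size=trials)   # next period's (log) risk factor
    return np.mean(new > thresholds)

p1 = realized_failure_prob(mu=0.0, sigma=1.0, n=20)
p2 = realized_failure_prob(mu=3.0, sigma=0.5, n=20)
print(p1, p2)  # both exceed the nominal 1% and agree with each other
```

The agreement between `p1` and `p2` reflects the pivotality property for location-scale families that makes the failure probability exactly computable, independent of the unknown parameters.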

  1. Uncertainty Quantification of GEOS-5 L-band Radiative Transfer Model Parameters Using Bayesian Inference and SMOS Observations

    Science.gov (United States)

    DeLannoy, Gabrielle J. M.; Reichle, Rolf H.; Vrugt, Jasper A.

    2013-01-01

    Uncertainties in L-band (1.4 GHz) radiative transfer modeling (RTM) affect the simulation of brightness temperatures (Tb) over land and the inversion of satellite-observed Tb into soil moisture retrievals. In particular, accurate estimates of the microwave soil roughness, vegetation opacity and scattering albedo for large-scale applications are difficult to obtain from field studies and often lack an uncertainty estimate. Here, a Markov Chain Monte Carlo (MCMC) simulation method is used to determine satellite-scale estimates of RTM parameters and their posterior uncertainty by minimizing the misfit between long-term averages and standard deviations of simulated and observed Tb at a range of incidence angles, at horizontal and vertical polarization, and for morning and evening overpasses. Tb simulations are generated with the Goddard Earth Observing System (GEOS-5) and confronted with Tb observations from the Soil Moisture Ocean Salinity (SMOS) mission. The MCMC algorithm suggests that the relative uncertainty of the RTM parameter estimates is typically less than 25% of the maximum a posteriori density (MAP) parameter value. Furthermore, the actual root-mean-square-differences in long-term Tb averages and standard deviations are found consistent with the respective estimated total simulation and observation error standard deviations of 3.1 K and 2.4 K. It is also shown that the MAP parameter values estimated through MCMC simulation are in close agreement with those obtained with Particle Swarm Optimization (PSO).
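A random-walk Metropolis sketch of the MCMC step described above, with a one-parameter stand-in forward model in place of the GEOS-5 radiative transfer simulation. All names and values (the linear `forward` model, the "roughness" parameter, the prior bounds) are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(rough):
    """Stand-in forward model mapping a roughness parameter to Tb (K)."""
    return 250.0 - 30.0 * rough

# Synthetic "observations" generated from a known truth plus noise
obs = forward(0.4) + rng.normal(0, 1.0, size=50)

def log_post(rough, sigma=1.0):
    if not 0.0 <= rough <= 1.0:            # uniform prior on [0, 1]
        return -np.inf
    resid = obs - forward(rough)
    return -0.5 * np.sum(resid**2) / sigma**2

# Random-walk Metropolis sampling of the posterior
chain = np.empty(5000)
x, lp = 0.5, log_post(0.5)
for i in range(len(chain)):
    prop = x + 0.02 * rng.standard_normal()
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:  # Metropolis accept/reject
        x, lp = prop, lp_prop
    chain[i] = x

posterior = chain[1000:]                   # discard burn-in
print(posterior.mean(), posterior.std())   # MAP-like mean near 0.4, small spread
```

The posterior standard deviation is the kind of parameter uncertainty estimate the study reports relative to the MAP value; in the real application the misfit is computed on long-term Tb statistics across angles, polarizations, and overpasses rather than raw residuals.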

  2. How uncertainty in input and parameters influences transport model output: four-stage model case-study

    DEFF Research Database (Denmark)

    Manzo, Stefano; Nielsen, Otto Anker; Prato, Carlo Giacomo

    2015-01-01

    ) different levels of network congestion. The choice of the probability distributions shows a low impact on the model output uncertainty, quantified in terms of coefficient of variation. Instead, with respect to the choice of different assignment algorithms, the link flow uncertainty, expressed in terms...... of coefficient of variation, resulting from stochastic user equilibrium and user equilibrium is 0.425 and 0.468, respectively. Finally, network congestion does not show a high effect on model output uncertainty at the network level. However, the final uncertainty of links with higher volume/capacity ratio...

  3. Parameter and model uncertainty in a life-table model for fine particles (PM2.5): a statistical modeling study

    Directory of Open Access Journals (Sweden)

    Jantunen Matti J

    2007-08-01

    Full Text Available Abstract Background The estimation of health impacts often involves uncertain input variables and assumptions which have to be incorporated into the model structure. These uncertainties may have significant effects on the results obtained with the model and, thus, on decision making. Fine particles (PM2.5) are believed to cause major health impacts, and, consequently, uncertainties in their health impact assessment have clear relevance to policy-making. We studied the effects of various uncertain input variables by building a life-table model for fine particles. Methods Life-expectancy of the Helsinki metropolitan area population and the change in life-expectancy due to fine particle exposures were predicted using a life-table model. A number of parameter and model uncertainties were estimated. Sensitivity analysis for input variables was performed by calculating rank-order correlations between input and output variables. The studied model uncertainties were (i) plausibility of mortality outcomes and (ii) lag; the parameter uncertainties were (iii) exposure-response coefficients for different mortality outcomes and (iv) exposure estimates for different age groups. The monetary value of the years-of-life-lost was also predicted, to compare the relative importance of the uncertainties related to monetary valuation with that of the health effect uncertainties. Results The magnitude of the health effects costs depended mostly on the discount rate, the exposure-response coefficient, and the plausibility of the cardiopulmonary mortality. Other mortality outcomes (lung cancer, other non-accidental and infant mortality) and lag had only minor impact on the output. The results highlight the importance of the uncertainties associated with cardiopulmonary mortality in the fine particle impact assessment when compared with other uncertainties. Conclusion When estimating life-expectancy, the estimates used for cardiopulmonary exposure

  4. Assessment of parameter uncertainty in hydrological model using a Markov-Chain-Monte-Carlo-based multilevel-factorial-analysis method

    Science.gov (United States)

    Zhang, Junlong; Li, Yongping; Huang, Guohe; Chen, Xi; Bao, Anming

    2016-07-01

    Without a realistic assessment of parameter uncertainty, decision makers may encounter difficulties in accurately describing hydrologic processes and assessing relationships between model parameters and watershed characteristics. In this study, a Markov-Chain-Monte-Carlo-based multilevel-factorial-analysis (MCMC-MFA) method is developed, which can not only generate samples of parameters from a well constructed Markov chain and assess parameter uncertainties with straightforward Bayesian inference, but also investigate the individual and interactive effects of multiple parameters on model output through measuring the specific variations of hydrological responses. A case study is conducted for addressing parameter uncertainties in the Kaidu watershed of northwest China. Effects of multiple parameters and their interactions are quantitatively investigated using the MCMC-MFA with a three-level factorial experiment (81 runs in total). A variance-based sensitivity analysis method is used to validate the results of the parameters' effects. Results disclose that (i) the Soil Conservation Service runoff curve number for moisture condition II (CN2) and the fraction of snow volume corresponding to 50% snow cover (SNO50COV) are the most significant factors for hydrological responses, implying that infiltration-excess overland flow and snow water equivalent represent important water input to the hydrological system of the Kaidu watershed; (ii) saturated hydraulic conductivity (SOL_K) and soil evaporation compensation factor (ESCO) have obvious effects on hydrological responses; this implies that the processes of percolation and evaporation would impact the hydrological process in this watershed; (iii) the interactions of ESCO and SNO50COV as well as CN2 and SNO50COV have an obvious effect, implying that snow cover can impact the generation of runoff on the land surface and the extraction of soil evaporative demand in lower soil layers. These findings can help enhance the hydrological model

  5. Analysis of parameter uncertainty in hydrological and sediment modeling using GLUE method: a case study of SWAT model applied to Three Gorges Reservoir Region, China

    Directory of Open Access Journals (Sweden)

    Z. Y. Shen

    2012-01-01

    Full Text Available The calibration of hydrologic models is a worldwide challenge due to the uncertainty involved in the large number of parameters. The difficulty even increases in a region with high seasonal variation of precipitation, where the results exhibit high heteroscedasticity and autocorrelation. In this study, the Generalized Likelihood Uncertainty Estimation (GLUE) method was combined with the Soil and Water Assessment Tool (SWAT) to quantify the parameter uncertainty of the stream flow and sediment simulation in the Daning River Watershed of the Three Gorges Reservoir Region (TGRA), China. Based on this study, only a few parameters affected the final simulation output significantly. The results showed that sediment simulation presented greater uncertainty than stream flow, and uncertainty was even greater in high precipitation conditions (from May to September) than during the dry season. The main uncertainty sources of stream flow came from the catchment process, while a channel process impacts the sediment simulation greatly. It should be noted that identifiable parameters such as CANMX, ALPHA_BNK, SOL_K could be assigned an optimal parameter range using the calibration method. However, equifinality was also observed in hydrologic modeling in TGRA. This study demonstrated that care must be taken when calibrating the SWAT model with non-identifiable parameters because these may lead to equifinality of the parameter values. It was anticipated that this study would provide useful information for hydrology modeling related to policy development in the Three Gorges Reservoir Region (TGRA) and other similar areas.
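The GLUE procedure described above can be sketched in a few lines: sample parameters from the prior, score each run with a likelihood measure (here Nash-Sutcliffe efficiency), keep the behavioural runs above a threshold, and form likelihood-weighted parameter bounds. The one-parameter toy rainfall-runoff model below stands in for SWAT, and all numbers are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def model(k, rain):
    """Toy rainfall-runoff model standing in for SWAT."""
    return k * rain

rain = rng.gamma(2.0, 2.0, size=100)
observed = model(0.35, rain) + rng.normal(0, 0.3, size=100)  # synthetic "obs"

def nse(sim, obs):
    """Nash-Sutcliffe efficiency, used here as the GLUE likelihood measure."""
    return 1 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

# GLUE: Monte Carlo sampling from the prior, then a behavioural threshold
k_samples = rng.uniform(0.0, 1.0, size=2000)
likelihoods = np.array([nse(model(k, rain), observed) for k in k_samples])
behavioural = likelihoods > 0.7
k_beh = k_samples[behavioural]
w = likelihoods[behavioural]
w = w / w.sum()

# Likelihood-weighted 5th-95th percentile parameter bounds
order = np.argsort(k_beh)
cum = np.cumsum(w[order])
lo = k_beh[order][np.searchsorted(cum, 0.05)]
hi = k_beh[order][np.searchsorted(cum, 0.95)]
print(lo, hi)
```

A wide behavioural range around the true value is the equifinality the abstract warns about: many parameter sets reproduce the observations almost equally well.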

  6. Spatial scale effects on model parameter estimation and predictive uncertainty in ungauged basins

    CSIR Research Space (South Africa)

    Hughes, DA

    2013-06-01

    Full Text Available The most appropriate scale to use for hydrological modelling depends on the structure of the chosen model, the purpose of the results and the resolution of the available data used to quantify parameter values and provide the climatic forcing data...

  7. Characterizing parameter sensitivity and uncertainty for a snow model across hydroclimatic regimes

    NARCIS (Netherlands)

    He, M.; Hogue, T.S.; Franz, K.J.; Margulis, S.A.; Vrugt, J.A.

    2011-01-01

    The National Weather Service (NWS) uses the SNOW17 model to forecast snow accumulation and ablation processes in snow-dominated watersheds nationwide. Successful application of the SNOW17 relies heavily on site-specific estimation of model parameters. The current study undertakes a comprehensive

  9. Analysis of parameter uncertainty in hydrological modeling using GLUE method: a case study of SWAT model applied to Three Gorges Reservoir Region, China

    Directory of Open Access Journals (Sweden)

    Z. Y. Shen

    2011-08-01

    Full Text Available The calibration of hydrologic models is a worldwide difficulty due to the uncertainty involved in the large number of parameters. The difficulty increases further in regions with high seasonal variation of precipitation, where the results exhibit high heteroscedasticity and autocorrelation. In this study, the Generalized Likelihood Uncertainty Estimation (GLUE) method was combined with the Soil and Water Assessment Tool (SWAT) to quantify the parameter uncertainty of the stream flow and sediment simulation in the Daning River Watershed of the Three Gorges Reservoir Region (TGRA), China. Based on this study, only a few parameters affected the final simulation output significantly. The results showed that sediment simulation presented greater uncertainty than stream flow, and uncertainty was even greater in high precipitation conditions than in the dry season. The main uncertainty sources of stream flow came from the catchment process, while the channel process impacts the sediment simulation greatly. It should be noted that identifiable parameters such as CANMX, ALPHA_BNK and SOL_K could be assigned an optimal parameter range using the calibration method. However, equifinality was also observed in hydrologic modeling in the TGRA. This paper demonstrated that care must be taken when calibrating the SWAT model with non-identifiable parameters, as these may lead to equifinality of the parameter values. It is anticipated that this study will provide useful information for hydrology modeling related to policy development in the Three Gorges Reservoir Region (TGRA) and other similar areas.

  10. Modeling sugar cane yield with a process-based model from site to continental scale: uncertainties arising from model structure and parameter values

    Directory of Open Access Journals (Sweden)

    A. Valade

    2014-01-01

    Full Text Available Agro-Land Surface Models (agro-LSM) have been developed from the integration of specific crop processes into large-scale generic land surface models that allow calculating the spatial distribution and variability of energy, water and carbon fluxes within the soil-vegetation-atmosphere continuum. When developing agro-LSM models, particular attention must be given to the effects of crop phenology and management on the turbulent fluxes exchanged with the atmosphere, and the underlying water and carbon pools. A part of the uncertainty of agro-LSM models is related to their usually large number of parameters. In this study, we quantify the parameter-values uncertainty in the simulation of sugar cane biomass production with the agro-LSM ORCHIDEE-STICS, using a multi-regional approach with data from sites in Australia, La Réunion and Brazil. In ORCHIDEE-STICS, two models are chained: STICS, an agronomy model that calculates phenology and management, and ORCHIDEE, a land surface model that calculates biomass and other ecosystem variables forced by STICS' phenology. First, the parameters that dominate the uncertainty of simulated biomass at harvest date are determined through a screening of 67 different parameters of both STICS and ORCHIDEE on a multi-site basis. Secondly, the uncertainty of harvested biomass attributable to those most sensitive parameters is quantified and specifically attributed to either STICS (phenology, management) or to ORCHIDEE (other ecosystem variables including biomass) through distinct Monte-Carlo runs. The uncertainty on parameter values is constrained using observations by calibrating the model independently at seven sites. In a third step, a sensitivity analysis is carried out by varying the most sensitive parameters to investigate their effects at continental scale. A Monte-Carlo sampling method associated with the calculation of Partial Ranked Correlation Coefficients is used to quantify the sensitivity of harvested

  11. Modeling sugarcane yield with a process-based model from site to continental scale: uncertainties arising from model structure and parameter values

    Science.gov (United States)

    Valade, A.; Ciais, P.; Vuichard, N.; Viovy, N.; Caubel, A.; Huth, N.; Marin, F.; Martiné, J.-F.

    2014-06-01

    Agro-land surface models (agro-LSM) have been developed from the integration of specific crop processes into large-scale generic land surface models that allow calculating the spatial distribution and variability of energy, water and carbon fluxes within the soil-vegetation-atmosphere continuum. When developing agro-LSM models, particular attention must be given to the effects of crop phenology and management on the turbulent fluxes exchanged with the atmosphere, and the underlying water and carbon pools. A part of the uncertainty of agro-LSM models is related to their usually large number of parameters. In this study, we quantify the parameter-values uncertainty in the simulation of sugarcane biomass production with the agro-LSM ORCHIDEE-STICS, using a multi-regional approach with data from sites in Australia, La Réunion and Brazil. In ORCHIDEE-STICS, two models are chained: STICS, an agronomy model that calculates phenology and management, and ORCHIDEE, a land surface model that calculates biomass and other ecosystem variables forced by STICS phenology. First, the parameters that dominate the uncertainty of simulated biomass at harvest date are determined through a screening of 67 different parameters of both STICS and ORCHIDEE on a multi-site basis. Secondly, the uncertainty of harvested biomass attributable to those most sensitive parameters is quantified and specifically attributed to either STICS (phenology, management) or to ORCHIDEE (other ecosystem variables including biomass) through distinct Monte Carlo runs. The uncertainty on parameter values is constrained using observations by calibrating the model independently at seven sites. In a third step, a sensitivity analysis is carried out by varying the most sensitive parameters to investigate their effects at continental scale. A Monte Carlo sampling method associated with the calculation of partial ranked correlation coefficients is used to quantify the sensitivity of harvested biomass to input
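
    The Monte Carlo sampling with partial ranked correlation coefficients (PRCC) mentioned above can be sketched as follows; the sample size and toy response function are assumptions for illustration:

```python
# PRCC sketch: rank-transform parameters and output, regress out the
# other parameters, then correlate the residuals.
import numpy as np

def rank(v):
    # simple 0..n-1 ranks (ties are unlikely with continuous samples)
    return np.argsort(np.argsort(v)).astype(float)

def prcc(X, y):
    """Partial ranked correlation of each column of X with y."""
    R = np.column_stack([rank(c) for c in X.T])
    ry = rank(y)
    out = []
    for j in range(R.shape[1]):
        others = np.delete(R, j, axis=1)
        A = np.column_stack([np.ones(len(y)), others])
        res_x = R[:, j] - A @ np.linalg.lstsq(A, R[:, j], rcond=None)[0]
        res_y = ry - A @ np.linalg.lstsq(A, ry, rcond=None)[0]
        out.append(np.corrcoef(res_x, res_y)[0, 1])
    return np.array(out)

rng = np.random.default_rng(1)
X = rng.uniform(size=(500, 3))              # Monte Carlo parameter sample
y = 5.0 * X[:, 0] - 2.0 * X[:, 1] + 0.01 * rng.normal(size=500)
print(np.round(prcc(X, y), 2))              # strong +, strong -, near 0
```

The sign and magnitude of each coefficient indicate the direction and strength of the monotone influence of that parameter on harvested biomass, with the others held fixed.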

  12. Modeling sugar cane yield with a process-based model from site to continental scale: uncertainties arising from model structure and parameter values

    Science.gov (United States)

    Valade, A.; Ciais, P.; Vuichard, N.; Viovy, N.; Huth, N.; Marin, F.; Martiné, J.-F.

    2014-01-01

    Agro-Land Surface Models (agro-LSM) have been developed from the integration of specific crop processes into large-scale generic land surface models that allow calculating the spatial distribution and variability of energy, water and carbon fluxes within the soil-vegetation-atmosphere continuum. When developing agro-LSM models, particular attention must be given to the effects of crop phenology and management on the turbulent fluxes exchanged with the atmosphere, and the underlying water and carbon pools. A part of the uncertainty of agro-LSM models is related to their usually large number of parameters. In this study, we quantify the parameter-values uncertainty in the simulation of sugar cane biomass production with the agro-LSM ORCHIDEE-STICS, using a multi-regional approach with data from sites in Australia, La Réunion and Brazil. In ORCHIDEE-STICS, two models are chained: STICS, an agronomy model that calculates phenology and management, and ORCHIDEE, a land surface model that calculates biomass and other ecosystem variables forced by STICS' phenology. First, the parameters that dominate the uncertainty of simulated biomass at harvest date are determined through a screening of 67 different parameters of both STICS and ORCHIDEE on a multi-site basis. Secondly, the uncertainty of harvested biomass attributable to those most sensitive parameters is quantified and specifically attributed to either STICS (phenology, management) or to ORCHIDEE (other ecosystem variables including biomass) through distinct Monte-Carlo runs. The uncertainty on parameter values is constrained using observations by calibrating the model independently at seven sites. In a third step, a sensitivity analysis is carried out by varying the most sensitive parameters to investigate their effects at continental scale. A Monte-Carlo sampling method associated with the calculation of Partial Ranked Correlation Coefficients is used to quantify the sensitivity of harvested biomass to input

  13. Cassini Spacecraft Uncertainty Analysis Data and Methodology Review and Update/Volume 1: Updated Parameter Uncertainty Models for the Consequence Analysis

    Energy Technology Data Exchange (ETDEWEB)

    WHEELER, TIMOTHY A.; WYSS, GREGORY D.; HARPER, FREDERICK T.

    2000-11-01

    Uncertainty distributions for specific parameters of the Cassini General Purpose Heat Source Radioisotope Thermoelectric Generator (GPHS-RTG) Final Safety Analysis Report consequence risk analysis were revised and updated. The revisions and updates were done for all consequence parameters for which relevant information exists from the joint project on Probabilistic Accident Consequence Uncertainty Analysis by the United States Nuclear Regulatory Commission and the Commission of European Communities.

  14. Uncertainty evaluation of nuclear reaction model parameters using integral and microscopic measurements. Covariances evaluation with CONRAD code

    Directory of Open Access Journals (Sweden)

    Tommasi J.

    2010-10-01

    Full Text Available In the [eV; MeV] energy range, modelling of neutron-induced reactions is based on nuclear reaction models with parameters. Estimation of covariances on cross sections or on nuclear reaction model parameters is a recurrent puzzle in nuclear data evaluation. Major breakthroughs were requested by nuclear reactor physicists to assess proper uncertainties to be used in applications. In this paper, mathematical methods developed in the CONRAD code [2] will be presented to explain the treatment of all types of uncertainties, including experimental ones (statistical and systematic), and to propagate them to nuclear reaction model parameters or cross sections. The marginalization procedure will thus be presented using analytical or Monte Carlo solutions. Furthermore, one major drawback found by reactor physicists is the fact that integral or analytical experiments (reactor mock-ups or simple integral experiments, e.g. ICSBEP, …) were not taken into account sufficiently early in the evaluation process to remove discrepancies. In this paper, we will describe a mathematical framework to properly take this kind of information into account.
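
    A standard first-order way to propagate parameter covariances to cross sections (the "sandwich" rule; shown here as a generic technique, not necessarily the exact CONRAD treatment) is cov_sigma = S M S^T, with S the sensitivity matrix. The matrices below are purely illustrative:

```python
# Sandwich-rule propagation of parameter covariances to cross sections.
import numpy as np

M = np.array([[0.04, 0.01],                 # parameter covariance matrix
              [0.01, 0.09]])
S = np.array([[2.0, 0.5],                   # sensitivities d(sigma_i)/d(p_j)
              [1.0, 3.0]])                  # for two cross-section points

cov_sigma = S @ M @ S.T                     # propagated covariance
unc = np.sqrt(np.diag(cov_sigma))           # 1-sigma uncertainties
corr = cov_sigma[0, 1] / (unc[0] * unc[1])  # induced cross-section correlation
print(unc, corr)
```

Note that even uncorrelated cross-section points acquire correlations through shared model parameters, which is exactly why covariance evaluation matters downstream.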

  15. Combined Estimation of Hydrogeologic Conceptual Model, Parameter, and Scenario Uncertainty with Application to Uranium Transport at the Hanford Site 300 Area

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, Philip D.; Ye, Ming; Rockhold, Mark L.; Neuman, Shlomo P.; Cantrell, Kirk J.

    2007-07-30

    This report to the Nuclear Regulatory Commission (NRC) describes the development and application of a methodology to systematically and quantitatively assess predictive uncertainty in groundwater flow and transport modeling that considers the combined impact of hydrogeologic uncertainties associated with the conceptual-mathematical basis of a model, model parameters, and the scenario to which the model is applied. The methodology is based on an extension of a Maximum Likelihood implementation of Bayesian Model Averaging. Model uncertainty is represented by postulating a discrete set of alternative conceptual models for a site with associated prior model probabilities that reflect a belief about the relative plausibility of each model based on its apparent consistency with available knowledge and data. Posterior model probabilities are computed and parameter uncertainty is estimated by calibrating each model to observed system behavior; prior parameter estimates are optionally included. Scenario uncertainty is represented as a discrete set of alternative future conditions affecting boundary conditions, source/sink terms, or other aspects of the models, with associated prior scenario probabilities. A joint assessment of uncertainty results from combining model predictions computed under each scenario using as weights the posterior model and prior scenario probabilities. The uncertainty methodology was applied to modeling of groundwater flow and uranium transport at the Hanford Site 300 Area. Eight alternative models representing uncertainty in the hydrogeologic and geochemical properties as well as the temporal variability were considered. Two scenarios representing alternative future behavior of the Columbia River adjacent to the site were considered. The scenario alternatives were implemented in the models through the boundary conditions. Results demonstrate the feasibility of applying a comprehensive uncertainty assessment to large-scale, detailed groundwater flow
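
    The joint weighting of posterior model probabilities and prior scenario probabilities described above can be sketched as a weighted mixture; all probabilities, predictions and variances below are invented for illustration:

```python
# Bayesian-model-averaging-style combination over models and scenarios.
import numpy as np

p_model = np.array([0.5, 0.3, 0.2])         # posterior model probabilities
p_scen = np.array([0.7, 0.3])               # prior scenario probabilities
mean = np.array([[10., 12.],                # mean prediction [model, scenario]
                 [11., 15.],
                 [ 9., 13.]])
var = np.array([[1.0, 1.5],                 # within-model predictive variance
                [0.8, 2.0],
                [1.2, 1.0]])

w = np.outer(p_model, p_scen)               # joint weights, sum to 1
mu = np.sum(w * mean)                       # averaged prediction
# Law of total variance: within-model spread plus between-model/scenario spread
total_var = np.sum(w * (var + (mean - mu) ** 2))
print(mu, total_var)
```

The between-model term is often the dominant contribution, which is the argument for carrying multiple conceptual models rather than a single calibrated one.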

  16. Uncertainty Quantification and Parameter Tuning: A Case Study of Convective Parameterization Scheme in the WRF Regional Climate Model

    Science.gov (United States)

    Qian, Y.; Yang, B.; Lin, G.; Leung, R.; Zhang, Y.

    2012-04-01

    The current tuning process of parameters in global climate models is often performed subjectively or treated as an optimization procedure to minimize model biases based on observations. While the latter approach may provide more plausible values for a set of tunable parameters to approximate the observed climate, the system could be forced to an unrealistic physical state or improper balance of budgets through compensating errors over different regions of the globe. In this study, the Weather Research and Forecasting (WRF) model was used to provide a more flexible framework to investigate a number of issues related to uncertainty quantification (UQ) and parameter tuning. The WRF model was constrained by reanalysis data over the Southern Great Plains (SGP), where abundant observational data from various sources were available for calibration of the input parameters and validation of the model results. Focusing on five key input parameters in the new Kain-Fritsch (KF) convective parameterization scheme used in WRF as an example, the purpose of this study was to explore the utility of high-resolution observations for improving simulations of regional patterns and to evaluate the transferability of UQ and parameter tuning across physical processes, spatial scales, and climatic regimes, which has important implications for UQ and parameter tuning in global and regional models. A stochastic importance-sampling algorithm, Multiple Very Fast Simulated Annealing (MVFSA), was employed to efficiently sample the input parameters in the KF scheme based on a skill score so that the algorithm progressively moved toward regions of the parameter space that minimize model errors. The results based on the WRF simulations with 25-km grid spacing over the SGP showed that the precipitation bias in the model could be significantly reduced when five optimal parameters identified by the MVFSA algorithm were used. The model performance was found to be sensitive to downdraft- and entrainment
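
    The annealing-based sampling idea can be illustrated with a bare simulated-annealing loop. This is a simplification in the spirit of MVFSA, not the actual algorithm; the toy skill function and cooling schedule are assumptions:

```python
# Annealing-style sampler: accept downhill moves always, uphill moves
# with a probability that shrinks as the temperature cools.
import numpy as np

rng = np.random.default_rng(2)

def skill(p):                               # toy skill score to minimize;
    return (p[0] - 0.4) ** 2 + (p[1] - 1.5) ** 2   # optimum at (0.4, 1.5)

p = np.array([0.0, 0.0])
best, best_s = p.copy(), skill(p)
for step in range(2000):
    T = 1.0 / (1.0 + step)                  # cooling schedule
    cand = p + rng.normal(scale=0.1, size=2)
    ds = skill(cand) - skill(p)
    if ds < 0 or rng.random() < np.exp(-ds / T):
        p = cand
    if skill(p) < best_s:
        best, best_s = p.copy(), skill(p)
print(np.round(best, 2))
```

In the real application the skill score comes from comparing a WRF simulation against observations, so each evaluation is a full model run and sampling efficiency is the binding constraint.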

  17. Understanding mean transit times in Andean tropical montane cloud forest catchments: combining tracer data, lumped parameter models and uncertainty analysis

    Science.gov (United States)

    Timbe, E.; Windhorst, D.; Crespo, P.; Frede, H.-G.; Feyen, J.; Breuer, L.

    2013-12-01

    Weekly samples from surface waters, springs, soil water and rainfall were collected in a 76.9 km2 mountain rain forest catchment and its tributaries in southern Ecuador. Time series of the stable water isotopes δ18O and δ2H were used to calculate mean transit times (MTTs) and the transit time distribution functions (TTDs), solving the convolution method for seven lumped parameter models. For each model setup, the Generalized Likelihood Uncertainty Estimation (GLUE) methodology was applied to find the best predictions, behavioral solutions and parameter identifiability. For the study basin, TTDs based on model types such as the Linear-Piston Flow for soil waters and the Exponential-Piston Flow for surface waters and springs performed better than more versatile equations such as the Gamma and the Two Parallel Linear Reservoirs. Notwithstanding, both approaches yielded a better goodness of fit for most sites, but with considerably larger uncertainty, as shown by GLUE. Among the tested models, corresponding results were obtained for soil waters with short MTTs (ranging from 3 to 12 weeks). For waters with longer MTTs differences were found, suggesting that for those cases the MTT should be based at least on an intercomparison of several models. Under dominant baseflow conditions, long MTTs (≥2 yr) were detected for stream water, a phenomenon also observed for shallow springs. Short MTTs for water in the top soil layer indicate a rapid exchange of surface waters with deeper soil horizons. Differences in travel times between soils suggest a land use effect on flow generation.
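
    The convolution approach can be sketched with an exponential TTD applied to a seasonal isotope input signal; the δ18O signal and MTT values below are synthetic, not the catchment data:

```python
# Lumped-parameter convolution sketch: an exponential transit time
# distribution damps the seasonal isotope amplitude as MTT grows.
import numpy as np

dt = 1.0                                    # weekly time step
t = np.arange(0, 520, dt)                   # 10 years of weekly values
c_in = -10.0 + 3.0 * np.sin(2 * np.pi * t / 52.0)   # input delta-18O signal

def convolve_exponential(signal, mtt):
    g = np.exp(-t / mtt)                    # exponential TTD, discretized
    g /= g.sum()                            # normalize to unit mass
    return np.convolve(signal - signal.mean(), g)[: len(t)] + signal.mean()

short = convolve_exponential(c_in, 4.0)     # MTT = 4 weeks
long_ = convolve_exponential(c_in, 100.0)   # MTT = 100 weeks
# Longer MTT -> stronger damping of the seasonal amplitude in the output
print(np.ptp(short[200:]), np.ptp(long_[200:]))
```

In practice the MTT is estimated by adjusting the TTD parameters until the convolved output matches the observed isotope series, and GLUE quantifies how well those parameters are constrained.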

  18. Uncertainty reduction and parameter estimation of a distributed hydrological model with ground and remote sensing data

    Science.gov (United States)

    Silvestro, F.; Gabellani, S.; Rudari, R.; Delogu, F.; Laiolo, P.; Boni, G.

    2014-06-01

    During the last decade the opportunity and usefulness of using remote sensing data in hydrology, hydrometeorology and geomorphology has become even more evident and clear. Satellite based products often provide the advantage of observing hydrologic variables in a distributed way while offering a different view that can help to understand and model the hydrological cycle. Moreover, remote sensing data are fundamental in scarce data environments. The use of satellite derived DTMs, which are globally available (e.g. from SRTM, as used in this work), has become standard practice in hydrologic model implementation, but other types of satellite derived data are still underutilized. In this work, Meteosat Second Generation Land Surface Temperature (LST) estimates and Surface Soil Moisture (SSM) available from EUMETSAT H-SAF are used to calibrate the Continuum hydrological model, which computes such state variables in a prognostic mode. This work aims at proving that satellite observations dramatically reduce uncertainties in parameter calibration by reducing their equifinality. Two parameter estimation strategies are implemented and tested: a multi-objective approach that includes ground observations and one solely based on remotely sensed data. Two Italian catchments are used as the test bed to verify the model capability in reproducing long-term (multi-year) simulations.

  19. Uncertainty reduction and parameter estimation of a distributed hydrological model with ground and remote-sensing data

    Science.gov (United States)

    Silvestro, F.; Gabellani, S.; Rudari, R.; Delogu, F.; Laiolo, P.; Boni, G.

    2015-04-01

    During the last decade the opportunity and usefulness of using remote-sensing data in hydrology, hydrometeorology and geomorphology has become even more evident and clear. Satellite-based products often allow for the advantage of observing hydrologic variables in a distributed way, offering a different view with respect to traditional observations that can help with understanding and modeling the hydrological cycle. Moreover, remote-sensing data are fundamental in scarce data environments. The use of satellite-derived digital elevation models (DEMs), which are now globally available at 30 m resolution (e.g., from Shuttle Radar Topographic Mission, SRTM), have become standard practice in hydrologic model implementation, but other types of satellite-derived data are still underutilized. As a consequence there is the need for developing and testing techniques that allow the opportunities given by remote-sensing data to be exploited, parameterizing hydrological models and improving their calibration. In this work, Meteosat Second Generation land-surface temperature (LST) estimates and surface soil moisture (SSM), available from European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT) H-SAF, are used together with streamflow observations (S. N.) to calibrate the Continuum hydrological model that computes such state variables in a prognostic mode. The first part of the work aims at proving that satellite observations can be exploited to reduce uncertainties in parameter calibration by reducing the parameter equifinality that can become an issue in forecast mode. In the second part, four parameter estimation strategies are implemented and tested in a comparative mode: (i) a multi-objective approach that includes both satellite and ground observations which is an attempt to use different sources of data to add constraints to the parameters; (ii and iii) two approaches solely based on remotely sensed data that reproduce the case of a scarce data

  20. Evaluation of Uncertainty in Constituent Input Parameters for Modeling the Fate of RDX

    Science.gov (United States)

    2015-07-01

    accurately estimated, such as solubility, while others — such as degradation rates — are often far more uncertain. Prior to using improved methods for ... meet this purpose, a previous application of TREECS™ was used to evaluate parameter sensitivity and the effects of highly uncertain inputs for ... than others. One of the most uncertain inputs in this application is the loading rate (grams/year) of unexploded RDX residue. A value of 1.5 kg/yr was

  1. Modeling a production scale milk drying process: parameter estimation, uncertainty and sensitivity analysis

    DEFF Research Database (Denmark)

    Ferrari, A.; Gutierrez, S.; Sin, Gürkan

    2016-01-01

    A steady state model for a production scale milk drying process was built to help process understanding and optimization studies. It involves a spray chamber and also internal/external fluid beds. The model was subjected to a comprehensive statistical analysis for quality assurance using sensitiv...

  2. Parameter-induced uncertainty quantification of soil N2O, NO and CO2 emission from Höglwald spruce forest (Germany using the LandscapeDNDC model

    Directory of Open Access Journals (Sweden)

    K. Butterbach-Bahl

    2012-10-01

    Full Text Available Assessing the uncertainties of simulation results of ecological models is becoming increasingly important, specifically if these models are used to estimate greenhouse gas emissions at site to regional/national levels. Four general sources of uncertainty affect the outcome of process-based models: (i) uncertainty of information used to initialise and drive the model, (ii) uncertainty of model parameters describing specific ecosystem processes, (iii) uncertainty of the model structure, and (iv) accurateness of measurements (e.g., soil-atmosphere greenhouse gas exchange) which are used for model testing and development. The aim of our study was to assess the simulation uncertainty of the process-based biogeochemical model LandscapeDNDC. For this we set up a Bayesian framework using a Markov Chain Monte Carlo (MCMC) method to estimate the joint model parameter distribution. Data for model testing, parameter estimation and uncertainty assessment were taken from observations of soil fluxes of nitrous oxide (N2O), nitric oxide (NO) and carbon dioxide (CO2) as observed over a 10 yr period at the spruce site of the Höglwald Forest, Germany. By running four independent Markov chains in parallel with identical properties (except for the parameter start values), an objective criterion for chain convergence developed by Gelman et al. (2003) could be used. Our approach shows that by means of the joint parameter distribution, we were able not only to limit the parameter space and specify the probability of parameter values, but also to assess the complex dependencies among model parameters used for simulating soil C and N trace gas emissions. This helped to improve the understanding of the behaviour of the complex LandscapeDNDC model while simulating soil C and N turnover processes and associated C and N soil-atmosphere exchange.
In a final step the parameter distributions of the most sensitive parameters determining soil-atmosphere C and N exchange were used to obtain
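
    The Gelman et al. (2003) convergence criterion for parallel chains is the potential scale reduction factor (R-hat), which compares between-chain and within-chain variance. A minimal sketch with simulated chains:

```python
# Gelman-Rubin R-hat for m parallel MCMC chains of one parameter.
import numpy as np

def gelman_rubin(chains):
    """chains: array (m, n) of m chains, n samples each; returns R-hat."""
    m, n = chains.shape
    chain_means = chains.mean(axis=1)
    W = chains.var(axis=1, ddof=1).mean()   # within-chain variance
    B = n * chain_means.var(ddof=1)         # between-chain variance
    var_hat = (n - 1) / n * W + B / n       # pooled variance estimate
    return np.sqrt(var_hat / W)

rng = np.random.default_rng(3)
mixed = rng.normal(size=(4, 1000))          # four well-mixed chains
stuck = mixed + np.arange(4)[:, None]       # chains stuck at different levels
print(round(float(gelman_rubin(mixed)), 3), round(float(gelman_rubin(stuck)), 3))
```

Values near 1 indicate the four chains are sampling the same distribution; values well above 1 flag non-convergence.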

  3. A Bayesian approach for evaluation of the effect of water quality model parameter uncertainty on TMDLs: A case study of Miyun Reservoir

    Energy Technology Data Exchange (ETDEWEB)

    Liang, Shidong, E-mail: emblembl@sina.com [School of Environment, Tsinghua University, 1 Qinghuayuan, Haidian District, Beijing 100084 (China); Jia, Haifeng, E-mail: jhf@tsinghua.edu.cn [School of Environment, Tsinghua University, 1 Qinghuayuan, Haidian District, Beijing 100084 (China); Xu, Changqing, E-mail: 2008changqing@163.com [School of Environment, Tsinghua University, 1 Qinghuayuan, Haidian District, Beijing 100084 (China); Xu, Te, E-mail: xt_lichking@qq.com [School of Environment, Tsinghua University, 1 Qinghuayuan, Haidian District, Beijing 100084 (China); Melching, Charles, E-mail: steve.melching17@gmail.com [Melching Water Solutions, 4030 W. Edgerton Avenue, Greenfield, WI 53221 (United States)

    2016-08-01

    Facing increasingly serious water pollution, the Chinese government is changing the environmental management strategy from solely pollutant concentration control to a Total Maximum Daily Load (TMDL) program, and water quality models are increasingly being applied to determine the allowable pollutant load in the TMDL. Despite the frequent use of models, few studies have focused on how parameter uncertainty in water quality models affects the allowable pollutant loads in the TMDL program, particularly for complicated and high-dimension water quality models. Uncertainty analysis for such models is limited by time-consuming simulation and high-dimensionality and nonlinearity in parameter spaces. In this study, an allowable pollutant load calculation platform was established using the Environmental Fluid Dynamics Code (EFDC), which is a widely applied hydrodynamic-water quality model. A Bayesian approach, i.e. the DiffeRential Evolution Adaptive Metropolis (DREAM) algorithm, which is a high-efficiency, multi-chain Markov Chain Monte Carlo (MCMC) method, was applied to assess the effects of parameter uncertainty on the water quality model simulations and its influence on the allowable pollutant load calculation in the TMDL program. Miyun Reservoir, which is the most important surface drinking water source for Beijing, suffers from eutrophication and was selected as a case study. The relations between pollutant loads and water quality indicators are obtained through a graphical method in the simulation platform. Ranges of allowable pollutant loads were obtained according to the results of parameter uncertainty analysis, i.e. Total Organic Carbon (TOC): 581.5–1030.6 t·yr⁻¹; Total Phosphorus (TP): 23.3–31.0 t·yr⁻¹; and Total Nitrogen (TN): 480–1918.0 t·yr⁻¹. The wide ranges of allowable pollutant loads reveal the importance of parameter uncertainty analysis in a TMDL program for allowable pollutant load calculation and margin of safety (MOS

  4. A Bayesian approach for evaluation of the effect of water quality model parameter uncertainty on TMDLs: A case study of Miyun Reservoir.

    Science.gov (United States)

    Liang, Shidong; Jia, Haifeng; Xu, Changqing; Xu, Te; Melching, Charles

    2016-08-01

    Facing increasingly serious water pollution, the Chinese government is changing the environmental management strategy from solely pollutant concentration control to a Total Maximum Daily Load (TMDL) program, and water quality models are increasingly being applied to determine the allowable pollutant load in the TMDL. Despite the frequent use of models, few studies have focused on how parameter uncertainty in water quality models affects the allowable pollutant loads in the TMDL program, particularly for complicated and high-dimension water quality models. Uncertainty analysis for such models is limited by time-consuming simulation and high-dimensionality and nonlinearity in parameter spaces. In this study, an allowable pollutant load calculation platform was established using the Environmental Fluid Dynamics Code (EFDC), which is a widely applied hydrodynamic-water quality model. A Bayesian approach, i.e. the DiffeRential Evolution Adaptive Metropolis (DREAM) algorithm, which is a high-efficiency, multi-chain Markov Chain Monte Carlo (MCMC) method, was applied to assess the effects of parameter uncertainty on the water quality model simulations and its influence on the allowable pollutant load calculation in the TMDL program. Miyun Reservoir, which is the most important surface drinking water source for Beijing, suffers from eutrophication and was selected as a case study. The relations between pollutant loads and water quality indicators are obtained through a graphical method in the simulation platform. Ranges of allowable pollutant loads were obtained according to the results of parameter uncertainty analysis, i.e. Total Organic Carbon (TOC): 581.5-1030.6 t·yr(-1); Total Phosphorus (TP): 23.3-31.0 t·yr(-1); and Total Nitrogen (TN): 480-1918.0 t·yr(-1). The wide ranges of allowable pollutant loads reveal the importance of parameter uncertainty analysis in a TMDL program for allowable pollutant load calculation and margin of safety (MOS) determination. The sources
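
    At the core of DREAM-style multi-chain samplers is a differential-evolution proposal: each chain jumps along the difference of two other chains. Sketched below in bare DE-MC form, omitting DREAM's subspace sampling, crossover adaptation and outlier handling:

```python
# Bare differential-evolution proposal step, as used in DE-MC/DREAM-style
# samplers (not the full DREAM algorithm).
import numpy as np

rng = np.random.default_rng(4)
d = 2                                       # parameter dimension
chains = rng.normal(size=(8, d))            # current states of 8 chains
gamma = 2.38 / np.sqrt(2 * d)               # standard DE jump scale

def de_proposal(chains, i):
    # pick two distinct partner chains, neither equal to chain i
    r1, r2 = rng.choice([j for j in range(len(chains)) if j != i],
                        size=2, replace=False)
    eps = rng.normal(scale=1e-6, size=d)    # small noise keeps the chain ergodic
    return chains[i] + gamma * (chains[r1] - chains[r2]) + eps

prop = de_proposal(chains, 0)
print(prop.shape)
```

Because the jump direction and scale come from the current chain population, the proposal automatically adapts to the shape of the posterior, which is what makes the method efficient for high-dimensional, correlated parameter spaces.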

  5. Uncertainty study of nuclear model parameters for the n+ ^{56}Fe reactions in the fast neutron region below 20 MeV

    CERN Document Server

    Duan, Junfeng; Sjöstrand, Henrik; Alhassan, Erwin; Gustavsson, Cecilia; Österlund, Michael; Koning, Arjan; Rochman, Dimitri

    2013-01-01

    In this work, we study the uncertainty of nuclear model parameters for neutron-induced ^{56}Fe reactions in the fast neutron region by using the Total Monte Carlo method. We perform a large number of TALYS runs and compare the calculated results with the experimental data for the cross sections to obtain the uncertainties of the model parameters. Based on the derived uncertainties, another 1000 TALYS runs have been performed to create random cross section files. For comparison with the experimental data we calculate a weighted \chi^2 value for each random file, as well as for the ENDF/B-VII.1, JEFF3.1, JENDL4.0 and CENDL3.1 data libraries. Furthermore, we investigate the correlations of the optical model parameters obtained by way of this procedure.
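
    The weighted chi-square scoring of random files against experimental points can be sketched as follows; the "experimental" cross sections, uncertainties and file spread below are synthetic:

```python
# Total-Monte-Carlo-style scoring: chi-square per degree of freedom for
# each random cross-section file against experimental data.
import numpy as np

rng = np.random.default_rng(5)
sigma_exp = np.array([1.20, 0.95, 0.80])    # "experimental" cross sections
d_exp = np.array([0.05, 0.04, 0.04])        # experimental uncertainties

# 1000 random files: true curve plus random model-parameter variation
files = sigma_exp + rng.normal(scale=0.1, size=(1000, 3))

chi2 = np.sum(((files - sigma_exp) / d_exp) ** 2, axis=1) / 3.0
best = files[np.argmin(chi2)]               # best-scoring random file
print(chi2.min(), chi2.mean())
```

The same score evaluated for the major libraries (ENDF/B, JEFF, JENDL, CENDL) places them on a common footing with the random TALYS files.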

  6. Activated sludge model 2d calibration with full-scale WWTP data: comparing model parameter identifiability with influent and operational uncertainty.

    Science.gov (United States)

    Machado, Vinicius Cunha; Lafuente, Javier; Baeza, Juan Antonio

    2014-07-01

    The present work developed a model of a full-scale wastewater treatment plant (WWTP) (Manresa, Catalonia, Spain) for further plant upgrades, based on systematic calibration of the activated sludge model 2d (ASM2d) parameters using a methodology built on the Fisher information matrix. The influent was characterized for the application of the ASM2d, and the confidence intervals of the calibrated parameters were also assessed. No expert knowledge was necessary for model calibration, and a large plant database was converted into more useful information. The effect of the influent and operating variables on the model fit was also studied by using these variables as calibrating parameters while keeping the ASM2d kinetic and stoichiometric parameters, which traditionally are the calibration parameters, at their default values. Such an "inversion" of the traditional way of model fitting allowed evaluating the sensitivity of the main model outputs to changes in the influent and the operating variables. This new approach is able to evaluate the capacity of the operational variables used by the WWTP feedback control loops to overcome external disturbances in the influent and uncertainties in the kinetic/stoichiometric model parameters. In addition, the study of the influence of operating variables on the model outputs provides useful information to select input and output variables in decentralized control structures.

  7. Parameter-induced uncertainty quantification of a regional N2O and NO3 inventory using the biogeochemical model LandscapeDNDC

    Science.gov (United States)

    Haas, Edwin; Klatt, Steffen; Kraus, David; Werner, Christian; Ruiz, Ignacio Santa Barbara; Kiese, Ralf; Butterbach-Bahl, Klaus

    2014-05-01

    Numerical simulation models are increasingly used to estimate greenhouse gas emissions at site to regional and national scales and are outlined as the most advanced methodology (Tier 3) for national emission inventories in the framework of UNFCCC reporting. Process-based models incorporate the major processes of the carbon and nitrogen cycles of terrestrial ecosystems such as arable land and grasslands and are thus thought to be widely applicable at various spatial and temporal scales. The high complexity of the ecosystem processes mirrored by such models requires a large number of model parameters. Many of these are lumped parameters that simultaneously describe the effect of environmental drivers on, e.g., microbial community activity and individual processes. Thus, the precise quantification of true parameter states is often difficult or even impossible. As a result, model uncertainty originates not solely from input uncertainty but is also subject to parameter-induced uncertainty. In this study we quantify regional parameter-induced model uncertainty in nitrous oxide (N2O) emissions and nitrate (NO3) leaching from arable soils of Saxony (Germany) using the biogeochemical model LandscapeDNDC. For this we calculate a regional inventory using a joint parameter distribution for key parameters describing microbial C and N turnover processes, as obtained from a Bayesian calibration study. We representatively sampled 400 parameter vectors from the discrete joint parameter distribution, which comprises approximately 400,000 parameter combinations, and used these to calculate 400 individual realizations of the regional inventory. The spatial domain (represented by 4042 polygons) is set up with spatially explicit soil and climate information and a region-typical 3-year crop rotation consisting of winter wheat, rapeseed, and winter barley. Average N2O emission from arable soils in the state of Saxony across all 400 realizations was 1.43 ± 1.25 [kg N / ha] with a median

  8. Modeling the uncertainty in responsiveness of climatic, genetic, soil and agronomic parameters in CERES-Sorghum model across locations in Kansas, USA

    Science.gov (United States)

    Lamsal, A.; Anandhi, A.; Welch, S.

    2012-12-01

    Kansas leads grain sorghum production in the USA. Crop models are useful tools that provide insight into the functioning of crops, agricultural systems, and their interactions. There is a temperature and precipitation gradient across Kansas. The CERES-Sorghum model in the DSSAT system (Decision Support System for Agrotechnology Transfer) was applied to many locations within the state. We hypothesize that the degree of responsiveness to CERES-Sorghum parameters varies with these gradients. The objective of this study is to document the uncertainties in the responsiveness of the climatic, genetic, soil and agronomic parameters in CERES-Sorghum across many locations in Kansas using multiple response variables. The input parameter categories evaluated are: climatic (temperature, solar radiation, rainfall, and CO2); genetic (P1, P2O, P5, G2, G5); agronomic (planting date, planting depth, row spacing and plant population); and soil (drained upper limit, drained lower limit, pH, saturated water content, soil organic carbon, bulk density, runoff curve number and drainage rate). Uncertainty analysis was carried out for six output response variables (yield, biomass, anthesis days, maturity days, leaf area index and leaf number). Sensitivity analysis was carried out using the OAT (one at a time) method, perturbing one input parameter at a time while keeping the rest constant. Both relative sensitivity (a mathematical approach) and a graphical method were used, and cumulative distribution functions were used for uncertainty analysis. Preliminary results showed that responsiveness varied with the input parameter, the response variable, and the location.
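
    The OAT relative-sensitivity measure described here is simply the fractional change in output per fractional change in one input, with all other inputs held fixed. A minimal sketch with a toy two-parameter response standing in for CERES-Sorghum (the function and values are invented for illustration):

```python
def relative_sensitivity(model, x0, i, frac=0.1):
    """One-at-a-time relative sensitivity of a scalar model output to
    parameter i: (dY/Y) / (dX/X), all other parameters held constant."""
    x_pert = list(x0)
    x_pert[i] *= (1.0 + frac)        # perturb only parameter i
    y0, y1 = model(x0), model(x_pert)
    return ((y1 - y0) / y0) / frac

# Toy stand-in for a crop-model response: yield ~ radiation^0.8 * water
toy = lambda p: p[0] ** 0.8 * p[1]
s_rad   = relative_sensitivity(toy, [20.0, 5.0], 0)   # ~0.79
s_water = relative_sensitivity(toy, [20.0, 5.0], 1)   # exactly 1 for a linear factor
```

    Repeating this over every input, response variable, and location gives the responsiveness tables the abstract refers to.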

  9. Uncertainty quantification of GEOS-5 L-band radiative transfer model parameters using Bayesian inference and SMOS observations

    NARCIS (Netherlands)

    G.J.M. De Lannoy; R.H. Reichle; J.A. Vrugt

    2014-01-01

    Uncertainties in L-band (1.4 GHz) microwave radiative transfer modeling (RTM) affect the simulation of brightness temperatures (Tb) over land and the inversion of satellite-observed Tb into soil moisture retrievals. In particular, accurate estimates of the microwave soil roughness, vegetation optica

  10. Methods for Quantifying the Uncertainties of LSIT Test Parameters, Test Results, and Full-Scale Mixing Performance Using Models Developed from Scaled Test Data

    Energy Technology Data Exchange (ETDEWEB)

    Piepel, Gregory F. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Cooley, Scott K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kuhn, William L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Rector, David R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Heredia-Langner, Alejandro [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-05-01

    This report discusses the statistical methods for quantifying uncertainties in 1) test responses and other parameters in the Large Scale Integrated Testing (LSIT), and 2) estimates of coefficients and predictions of mixing performance from models that relate test responses to test parameters. Testing at a larger scale has been committed to by Bechtel National, Inc. and the U.S. Department of Energy (DOE) to “address uncertainties and increase confidence in the projected, full-scale mixing performance and operations” in the Waste Treatment and Immobilization Plant (WTP).

  11. A generalized Lyapunov theory for robust root clustering of linear state space models with real parameter uncertainty

    Science.gov (United States)

    Yedavalli, R. K.

    1992-01-01

    The problem of analyzing and designing controllers for linear systems subject to real parameter uncertainty is considered. An elegant, unified theory for robust eigenvalue placement is presented for a class of D-regions defined by algebraic inequalities by extending the nominal matrix root clustering theory of Gutman and Jury (1981) to linear uncertain time systems. The author presents explicit conditions for matrix root clustering for different D-regions and establishes the relationship between the eigenvalue migration range and the parameter range. The bounds are all obtained by one-shot computation in the matrix domain and do not need any frequency sweeping or parameter gridding. The method uses the generalized Lyapunov theory for getting the bounds.
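
    The Lyapunov machinery underlying such bounds starts from the nominal test: A is Hurwitz if and only if AᵀP + PA = -Q has a symmetric positive definite solution P for some positive definite Q. A minimal numpy sketch solving the Lyapunov equation by vectorization (the paper's robust D-region bounds build on this but are not reproduced here):

```python
import numpy as np

def lyapunov_P(A, Q=None):
    """Solve A^T P + P A = -Q for P (Q defaults to I). P is symmetric
    positive definite iff all eigenvalues of A lie in the open left
    half-plane, the nominal D-region of the root-clustering theory."""
    n = A.shape[0]
    Q = np.eye(n) if Q is None else Q
    I = np.eye(n)
    # vec(A^T P + P A) = (kron(I, A^T) + kron(A^T, I)) vec(P)
    M = np.kron(I, A.T) + np.kron(A.T, I)
    P = np.linalg.solve(M, -Q.reshape(-1)).reshape(n, n)
    return 0.5 * (P + P.T)           # symmetrize against round-off

A = np.array([[-1.0, 0.0], [1.0, -2.0]])   # illustrative stable matrix
P = lyapunov_P(A)
stable = bool(np.all(np.linalg.eigvalsh(P) > 0))   # True: A is Hurwitz
```

    The one-shot matrix computations in the abstract replace the eigenvalue check with bounds on admissible parameter perturbations derived from such a P.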

  13. Estimation of Modal Parameters and their Uncertainties

    DEFF Research Database (Denmark)

    Andersen, P.; Brincker, Rune

    1999-01-01

    In this paper it is shown how to estimate the modal parameters as well as their uncertainties using the prediction error method of a dynamic system on the basis of output measurements only. The estimation scheme is assessed by means of a simulation study. As a part of the introduction, an example...

  14. Uncertainty propagation within the UNEDF models

    CERN Document Server

    Haverinen, T

    2016-01-01

    The parameters of nuclear energy density functionals have to be adjusted to experimental data. As a result they carry a certain uncertainty which then propagates to calculated values of observables. In the present work we quantify the statistical uncertainties on binding energies for three UNEDF Skyrme energy density functionals by taking advantage of the knowledge of the model parameter uncertainties. We find that the uncertainty of UNEDF models increases rapidly when going towards proton- or neutron-rich nuclei. We also investigate the impact of each model parameter on the total error budget.

  15. Uncertainty propagation within the UNEDF models

    Science.gov (United States)

    Haverinen, T.; Kortelainen, M.

    2017-04-01

    The parameters of nuclear energy density functionals have to be adjusted to experimental data. As a result they carry a certain uncertainty which then propagates to calculated values of observables. In the present work we quantify the statistical uncertainties of binding energies, proton quadrupole moments and proton matter radius for three UNEDF Skyrme energy density functionals by taking advantage of the knowledge of the model parameter uncertainties. We find that the uncertainty of UNEDF models increases rapidly when going towards proton- or neutron-rich nuclei. We also investigate the impact of each model parameter on the total error budget.

  16. Investigating uncertainty in BPR formula parameters: a case study

    DEFF Research Database (Denmark)

    Manzo, Stefano; Nielsen, Otto Anker; Prato, Carlo Giacomo

    uncertainty within the model. The research described in this paper investigated uncertainty in the BPR formula parameters. Within traffic assignment models, the relationship between travel time and traffic flows is commonly described by the BPR formula. The BPR formula works as a link performance function......; given free flow travel time, observed flow and link capacity, it uses parameters to fit the equation to various types of roadways and circumstances. Usually, the values for the parameters are pre-defined, based on assumptions and practice. The present paper describes a work implemented to define the BPR...... on their output highly unreliable. The main consequence of this inherent uncertainty is that modelled traffic flows cannot be expressed as a point estimate, because this would only represent one of the possible outputs generated by the model. Instead, modelled traffic flows are better expressed as a central...
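
    The BPR link performance function at the center of this study is compact. A sketch with the classic default parameters (α = 0.15, β = 4), which are exactly the pre-defined values whose uncertainty the paper investigates; the travel times and flows are invented for illustration:

```python
def bpr_travel_time(t0, flow, capacity, alpha=0.15, beta=4.0):
    """BPR link performance function: congested travel time given
    free-flow travel time t0. alpha and beta default to the classic
    BPR values; calibration replaces them with fitted estimates."""
    return t0 * (1.0 + alpha * (flow / capacity) ** beta)

# A link at capacity: 15% delay over the 10-minute free-flow time
t = bpr_travel_time(t0=10.0, flow=1800.0, capacity=1800.0)   # ≈ 11.5 min

# Spread induced by uncertainty in alpha alone
spread = [bpr_travel_time(10.0, 1800.0, 1800.0, alpha=a)
          for a in (0.10, 0.15, 0.20)]
```

    Propagating distributions over α and β through an assignment model turns the point estimate of flow into the interval estimates the paper argues for.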

  17. Uncertainties in repository modeling

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, J.R.

    1996-12-31

    The distant future is very difficult to predict. Unfortunately, our regulators are being encouraged to extend their regulatory period from the standard 10,000 years to 1 million years. Such overconfidence is not justified due to uncertainties in dating, calibration, and modeling.

  18. Robustness of dynamic systems with parameter uncertainties

    CERN Document Server

    Balemi, S; Truöl, W

    1992-01-01

    Robust control is one of the fastest growing and most promising areas of research today. In many practical systems there exist uncertainties which have to be considered in the analysis and design of control systems. In the last decade methods were developed for dealing with dynamic systems with unstructured uncertainties, such as H∞- and ℓ1-optimal control. For systems with parameter uncertainties, the seminal paper of V. L. Kharitonov has triggered a large amount of very promising research. An international workshop dealing with all aspects of robust control was successfully organized by S. P. Bhattacharyya and L. H. Keel in San Antonio, Texas, USA in March 1991. We organized the second international workshop in this area in Ascona, Switzerland in April 1992. However, this second workshop was restricted to robust control of dynamic systems with parameter uncertainties, with the objective to concentrate on some aspects of robust control. This book contains a collection of papers presented at the International W...

  19. Predicting streamflow response to fire-induced landcover change: implications of parameter uncertainty in the MIKE SHE model.

    Science.gov (United States)

    McMichael, Christine E; Hope, Allen S

    2007-08-01

    Fire is a primary agent of landcover transformation in California semi-arid shrubland watersheds; however, few studies have examined the impacts of fire and post-fire succession on streamflow dynamics in these basins. While it may seem intuitive that larger fires will have a greater impact on streamflow response than smaller fires in these watersheds, the nature of these relationships has not been determined. The effects of fire size on seasonal and annual streamflow responses were investigated for a medium-sized basin in central California using a modified version of the MIKE SHE model which had been previously calibrated and tested for this watershed using the Generalized Likelihood Uncertainty Estimation (GLUE) methodology. Model simulations were made for two contrasting periods, wet and dry, in order to assess whether fire size effects varied with weather regime. Results indicated that seasonal and annual streamflow response increased nearly linearly with fire size in a given year under both regimes. Annual flow response was generally higher in wetter years for both weather regimes; however, a clear trend was confounded by the effect of stand age. These results expand our understanding of the effects of fire size on hydrologic response in chaparral watersheds, but it is important to note that the majority of model predictions were largely indistinguishable from the predictive uncertainty associated with the calibrated model - a key finding that highlights the importance of analyzing hydrologic predictions for altered landcover conditions in the context of model uncertainty. Future work is needed to examine how alternative decisions (e.g., different likelihood measures) may influence GLUE-based MIKE SHE streamflow predictions following different size fires, and how the effect of fire size on streamflow varies with other factors such as fire location.

  20. Parameter Uncertainty for Repository Thermal Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Hardin, Ernest [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hadgu, Teklu [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Greenberg, Harris [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Dupont, Mark [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-10-01

    This report is one follow-on to a study of reference geologic disposal design concepts (Hardin et al. 2011a). Based on an analysis of maximum temperatures, that study concluded that certain disposal concepts would require extended decay storage prior to emplacement, or the use of small waste packages, or both. The study used nominal values for thermal properties of host geologic media and engineered materials, demonstrating the need for uncertainty analysis to support the conclusions. This report is a first step that identifies the input parameters of the maximum temperature calculation, surveys published data on measured values, uses an analytical approach to determine which parameters are most important, and performs an example sensitivity analysis. Using results from this first step, temperature calculations planned for FY12 can focus on only the important parameters, and can use the uncertainty ranges reported here. The survey of published information on thermal properties of geologic media and engineered materials is intended to be sufficient for use in generic calculations to evaluate the feasibility of reference disposal concepts. A full compendium of literature data is beyond the scope of this report. The term “uncertainty” is used here to represent both measurement uncertainty and spatial variability, or variability across host geologic units. For the most important parameters (e.g., buffer thermal conductivity) the extent of literature data surveyed samples these different forms of uncertainty and variability. Finally, this report is intended to be one chapter or section of a larger FY12 deliverable summarizing all the work on design concepts and thermal load management for geologic disposal (M3FT-12SN0804032, due 15Aug2012).

  1. Incorporation of Model and Parameter Uncertainty in Predicting Radionuclide Fluxes from the Climax Granite Intrusive, Nevada Test Site

    Science.gov (United States)

    Reeves, D. M.; Pohlmann, K. F.; Pohll, G. M.; Chapman, J. B.; Ye, M.

    2006-12-01

    The Yucca Flat-Climax Mine Corrective Action Unit requires the use of numerical models to predict radionuclide flux rates from three subsurface nuclear tests conducted in a fractured rock mass. Modeling flow and transport in the Climax granite intrusive (CGI) is unique; while attributes of rock fractures have been extensively characterized in subsurface tunnel and drift complexes, information on the saturated flow system, including the position of the water table within the CGI, is largely unknown. A modified version of the Death Valley Regional Flow System (DVRFS) model of Belcher et al. (2004) with refined discretization in the area of the CGI is used to provide boundary conditions and a calibration target for a local-scale stochastic continuum fracture flow and transport model. Uncertainty in the Climax DVRFS model is addressed by including five different geologic framework models, each weighted according to expert elicitation. Five ground water recharge models are then applied to each of the five geologic models, resulting in a total of 25 geologic/recharge models. The CGI fracture flow model consists of 3-D discrete fracture networks, randomly distributed according to probability distribution functions for fracture location, orientation, length and permeability. The networks are directly mapped onto a 3-D finite-difference grid and MODFLOW is used to simultaneously solve for fluid flow within the fracture network and rock matrix. Flow model calibration involved matching the geometric mean of total fluid flux through 200 Monte Carlo fracture network realizations to flux computed in the subsection of the Climax DVRFS model representing the area of the local-scale model domain. By maintaining a constant log_10 mean and variance of fracture conductivity, fracture density was altered until the geometric mean of flux from all 200 network realizations is within +/- 5% of the target flux from the regional model. Variability in flux for individual realizations

  2. Uncertainty quantification for environmental models

    Science.gov (United States)

    Hill, Mary C.; Lu, Dan; Kavetski, Dmitri; Clark, Martyn P.; Ye, Ming

    2012-01-01

    Environmental models are used to evaluate the fate of fertilizers in agricultural settings (including soil denitrification), the degradation of hydrocarbons at spill sites, and water supply for people and ecosystems in small to large basins and cities—to mention but a few applications of these models. They also play a role in understanding and diagnosing potential environmental impacts of global climate change. The models are typically mildly to extremely nonlinear. The persistent demand for enhanced dynamics and resolution to improve model realism [17] means that lengthy individual model execution times will remain common, notwithstanding continued enhancements in computer power. In addition, high-dimensional parameter spaces are often defined, which increases the number of model runs required to quantify uncertainty [2]. Some environmental modeling projects have access to extensive funding and computational resources; many do not. The many recent studies of uncertainty quantification in environmental model predictions have focused on uncertainties related to data error and sparsity of data, expert judgment expressed mathematically through prior information, poorly known parameter values, and model structure (see, for example, [1,7,9,10,13,18]). Approaches for quantifying uncertainty include frequentist (potentially with prior information [7,9]), Bayesian [13,18,19], and likelihood-based. 
    A few of the numerous methods, including some sensitivity and inverse methods with consequences for understanding and quantifying uncertainty, are as follows: Bayesian hierarchical modeling and Bayesian model averaging; single-objective optimization with error-based weighting [7] and multi-objective optimization [3]; methods based on local derivatives [2,7,10]; screening methods like OAT (one at a time) and the method of Morris [14]; FAST (Fourier amplitude sensitivity testing) [14]; the Sobol' method [14]; randomized maximum likelihood [10]; Markov chain Monte Carlo (MCMC) [10

  3. Uncertainty in the determination of soil hydraulic parameters and its influence on the performance of two hydrological models of different complexity

    Directory of Open Access Journals (Sweden)

    G. Baroni

    2010-02-01

    Full Text Available Data of soil hydraulic properties often form a limiting factor in unsaturated zone modelling, especially at the larger scales. Investigations for the hydraulic characterization of soils are time-consuming and costly, and the accuracy of the results obtained by the different methodologies is still debated. However, we may wonder how the uncertainty in soil hydraulic parameters relates to the uncertainty of the selected modelling approach. We performed an intensive monitoring study during the cropping season of a 10 ha maize field in Northern Italy. The data were used to: (i) compare different methods for determining soil hydraulic parameters and (ii) evaluate the effect of the uncertainty in these parameters on different variables (i.e. evapotranspiration, average water content in the root zone, flux at the bottom boundary of the root zone) simulated by two hydrological models of different complexity: SWAP, a widely used model of soil moisture dynamics in unsaturated soils based on the Richards equation, and ALHyMUS, a conceptual model of the same dynamics based on a reservoir cascade scheme. We employed five direct and indirect methods to determine soil hydraulic parameters for each horizon of the experimental profile. Two methods were based on a parameter optimization against (a) laboratory-measured retention and hydraulic conductivity data and (b) field-measured retention and hydraulic conductivity data. The remaining three methods were based on the application of widely used Pedo-Transfer Functions: (c) Rawls and Brakensiek, (d) HYPRES, and (e) ROSETTA. Simulations were performed using meteorological, irrigation and crop data measured at the experimental site during the period June – October 2006. Results showed a wide range of soil hydraulic parameter values generated with the different methods, especially for the saturated hydraulic conductivity Ksat and the shape parameter α of the van Genuchten curve. This is reflected in a variability of

  4. Uncertainty in the determination of soil hydraulic parameters and its influence on the performance of two hydrological models of different complexity

    Directory of Open Access Journals (Sweden)

    G. Baroni

    2009-06-01

    Full Text Available Data of soil hydraulic properties often form a limiting factor in unsaturated zone modelling, especially at the larger scales. Investigations for the hydraulic characterization of soils are time-consuming and costly, and the accuracy of the results obtained by the different methodologies is still debated. However, we may wonder how the uncertainty in soil hydraulic parameters relates to the uncertainty of the selected modelling approach.

    We performed an intensive monitoring study during the cropping season of a 10 ha maize field in Northern Italy. These data were used to: (i) compare different methods for determining soil hydraulic parameters and (ii) evaluate the effect of the uncertainty in these parameters on different outputs (i.e. evapotranspiration, water content in the root zone, fluxes through the bottom boundary of the root zone) of two hydrological models with different complexity: SWAP, a widely used model of soil moisture dynamics in unsaturated soils based on the Richards equation, and ALHyMUS, a conceptual model of the same dynamics based on a reservoir cascade scheme. We employed five direct and indirect methods to determine soil hydraulic parameters for each horizon of the experimental field. Two methods were based on a parameter optimization against (a) laboratory-measured retention and hydraulic conductivity data and (b) field-measured retention and hydraulic conductivity data. Three methods were based on the application of widely used Pedo-Transfer Functions: (c) Rawls and Brakensiek; (d) HYPRES; and (e) ROSETTA. Simulations were performed using meteorological, irrigation and crop data measured at the experimental site during the period June–October 2006.

    Results showed a wide range of soil hydraulic parameter values evaluated with the different methods, especially for the saturated hydraulic conductivity Ksat and the shape parameter α of the van Genuchten curve. This is reflected in a variability of the
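
    The van Genuchten retention curve at the center of this comparison can be evaluated under competing parameter sets to make the spread concrete. The two parameter sets below are invented for the sketch and are not the paper's fitted values:

```python
import numpy as np

def van_genuchten_theta(h, theta_r, theta_s, alpha, n):
    """van Genuchten water retention curve theta(h); h is suction head
    (cm, taken positive), with the Mualem condition m = 1 - 1/n."""
    m = 1.0 - 1.0 / n
    return theta_r + (theta_s - theta_r) / (1.0 + (alpha * np.abs(h)) ** n) ** m

# Illustrative parameter sets, as if produced by a lab fit and a PTF
params = {
    "lab-fit": dict(theta_r=0.05, theta_s=0.43, alpha=0.036, n=1.56),
    "ROSETTA": dict(theta_r=0.07, theta_s=0.40, alpha=0.020, n=1.41),
}
h = 100.0  # cm suction
spread = {k: van_genuchten_theta(h, **p) for k, p in params.items()}
```

    Feeding each parameter set through SWAP or ALHyMUS is what converts this retention-curve spread into the output variability the abstract reports.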

  5. Uncertainty Quantification in Climate Modeling

    Science.gov (United States)

    Sargsyan, K.; Safta, C.; Berry, R.; Debusschere, B.; Najm, H.

    2011-12-01

    We address challenges that sensitivity analysis and uncertainty quantification methods face when dealing with complex computational models. In particular, climate models are computationally expensive and typically depend on a large number of input parameters. We consider the Community Land Model (CLM), which consists of a nested computational grid hierarchy designed to represent the spatial heterogeneity of the land surface. Each computational cell can be composed of multiple land types, and each land type can incorporate one or more sub-models describing the spatial and depth variability. Even for simulations at a regional scale, the computational cost of a single run is quite high and the number of parameters that control the model behavior is very large. Therefore, the parameter sensitivity analysis and uncertainty propagation face significant difficulties for climate models. This work employs several algorithmic avenues to address some of the challenges encountered by classical uncertainty quantification methodologies when dealing with expensive computational models, specifically focusing on the CLM as a primary application. First of all, since the available climate model predictions are extremely sparse due to the high computational cost of model runs, we adopt a Bayesian framework that effectively incorporates this lack-of-knowledge as a source of uncertainty, and produces robust predictions with quantified uncertainty even if the model runs are extremely sparse. In particular, we infer Polynomial Chaos spectral expansions that effectively encode the uncertain input-output relationship and allow efficient propagation of all sources of input uncertainties to outputs of interest. Secondly, the predictability analysis of climate models strongly suffers from the curse of dimensionality, i.e. the large number of input parameters. While single-parameter perturbation studies can be efficiently performed in a parallel fashion, the multivariate uncertainty analysis
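
    A one-dimensional version of the non-intrusive Polynomial Chaos projection mentioned above fits in a few lines: for a scalar model f of a standard normal input, the spectral coefficients follow from Gauss-Hermite quadrature. This is a numpy-only toy (nothing here touches CLM); the quadratic test function is chosen so the moments are known exactly:

```python
import numpy as np
from numpy.polynomial import hermite_e as He
from math import factorial, sqrt, pi

def pce_coeffs(f, order, nquad=20):
    """Non-intrusive projection of f(xi), xi ~ N(0,1), onto probabilists'
    Hermite polynomials: c_k = E[f * He_k] / k!  (the He_k satisfy
    E[He_j He_k] = k! delta_jk under the standard normal measure)."""
    x, w = He.hermegauss(nquad)      # nodes/weights for weight exp(-x^2/2)
    w = w / sqrt(2.0 * pi)           # renormalize to the N(0,1) density
    fx = f(x)
    return np.array([np.dot(w, fx * He.hermeval(x, [0.0] * k + [1.0]))
                     / factorial(k) for k in range(order + 1)])

# f(xi) = xi^2 has known moments: mean 1, variance 2
c = pce_coeffs(lambda x: x ** 2, order=4)
mean = c[0]
var = sum(c[k] ** 2 * factorial(k) for k in range(1, c.size))
```

    Once the coefficients are known, moments and sensitivities of the output come for free, which is exactly why the surrogate pays off when each model run is expensive.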

  6. A Comprehensive Methodology for Development, Parameter Estimation, and Uncertainty Analysis of Group Contribution Based Property Models -An Application to the Heat of Combustion

    DEFF Research Database (Denmark)

    Frutiger, Jerome; Marcarie, Camille; Abildskov, Jens

    2016-01-01

    A rigorous methodology is developed that addresses numerical and statistical issues when developing group contribution (GC) based property models such as regression methods, optimization algorithms, performance statistics, outlier treatment, parameter identifiability, and uncertainty...... to calculate parameter estimation errors when the underlying distribution of residuals is unknown. Many parameters (first-, second-, and third-order group contributions) are found to be unidentifiable from the typically available data, with large estimation error bounds and significant correlation. Due to these poor parameter...... identifiability issues, reporting of the 95% confidence intervals of the predicted property values should be mandatory, as opposed to reporting only single-value predictions, currently the norm in the literature. Moreover, inclusion of higher-order groups (additional parameters) does not always lead to improved...

  7. A Comprehensive Methodology for Development, ParameterEstimation, and Uncertainty Analysis of Group Contribution Based Property Models -An Application to the Heat of Combustion

    DEFF Research Database (Denmark)

    Frutiger, Jerome; Marcarie, Camille; Abildskov, Jens;

    2016-01-01

    of the prediction. The methodology is evaluated through development of a GC method for the prediction of the heat of combustion (ΔHco) for pure components. The results showed that robust regression led to the best performance statistics for parameter estimation. The bootstrap method is found to be a valid alternative......A rigorous methodology is developed that addresses numerical and statistical issues when developing group contribution (GC) based property models such as regression methods, optimization algorithms, performance statistics, outlier treatment, parameter identifiability, and uncertainty...... prediction accuracy for the GC models; in some cases, it may even increase the prediction error (hence worse prediction accuracy). However, additional parameters do not affect the calculated 95% confidence interval. Last but not least, the newly developed GC model of the heat of combustion (ΔHco) shows...
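
    The bootstrap idea validated here, resampling the data and refitting to obtain parameter confidence intervals without assuming a residual distribution, can be sketched for a straight-line fit. The data are synthetic, not the heat-of-combustion dataset:

```python
import numpy as np

def bootstrap_ci(x, y, n_boot=2000, level=95, seed=0):
    """Pairs bootstrap of a straight-line fit; percentile confidence
    interval on the slope, valid without a normality assumption."""
    rng = np.random.default_rng(seed)
    slopes = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, x.size, x.size)   # resample (x, y) pairs
        slopes[b] = np.polyfit(x[idx], y[idx], 1)[0]
    half = (100 - level) / 2
    return np.percentile(slopes, [half, 100 - half])

rng = np.random.default_rng(42)
x = np.linspace(0.0, 10.0, 40)
y = 2.0 * x + 1.0 + rng.normal(0.0, 0.5, x.size)   # true slope 2
lo, hi = bootstrap_ci(x, y)
```

    For a GC model the same resample-and-refit loop runs over the component database and the full parameter vector instead of a single slope.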

  8. Uncertainty in tsunami sediment transport modeling

    Science.gov (United States)

    Jaffe, Bruce E.; Goto, Kazuhisa; Sugawara, Daisuke; Gelfenbaum, Guy R.; La Selle, SeanPaul M.

    2016-01-01

    Erosion and deposition from tsunamis record information about tsunami hydrodynamics and size that can be interpreted to improve tsunami hazard assessment. We explore sources and methods for quantifying uncertainty in tsunami sediment transport modeling. Uncertainty varies with tsunami, study site, available input data, sediment grain size, and model. Although uncertainty has the potential to be large, published case studies indicate that both forward and inverse tsunami sediment transport models perform well enough to be useful for deciphering tsunami characteristics, including size, from deposits. New techniques for quantifying uncertainty, such as Ensemble Kalman Filtering inversion, and more rigorous reporting of uncertainties will advance the science of tsunami sediment transport modeling. Uncertainty may be decreased with additional laboratory studies that increase our understanding of the semi-empirical parameters and physics of tsunami sediment transport, standardized benchmark tests to assess model performance, and development of hybrid modeling approaches to exploit the strengths of forward and inverse models.
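The record cites Ensemble Kalman Filtering inversion as an emerging technique for quantifying uncertainty. A minimal one-step analysis sketch for a single scalar parameter follows; the ensemble size, "flow-speed parameter", identity forward model, and observation value are all invented for illustration:

```python
import random
import statistics

def enkf_update(ensemble, obs, obs_err, forward, seed=1):
    # One Ensemble Kalman Filter analysis step for a scalar parameter.
    # The Kalman gain is estimated from ensemble covariances, and each
    # member assimilates a stochastically perturbed observation.
    rng = random.Random(seed)
    preds = [forward(m) for m in ensemble]
    m_mean = statistics.fmean(ensemble)
    p_mean = statistics.fmean(preds)
    cov_mp = statistics.fmean((m - m_mean) * (p - p_mean)
                              for m, p in zip(ensemble, preds))
    var_p = statistics.fmean((p - p_mean) ** 2 for p in preds)
    gain = cov_mp / (var_p + obs_err ** 2)
    return [m + gain * (obs + rng.gauss(0.0, obs_err) - p)
            for m, p in zip(ensemble, preds)]

# Prior ensemble of a hypothetical tsunami flow-speed parameter, an
# identity forward model, and a synthetic observation of 2.0.
rng = random.Random(0)
prior = [rng.gauss(0.0, 1.0) for _ in range(100)]
posterior = enkf_update(prior, obs=2.0, obs_err=0.1, forward=lambda x: x)
```

The posterior ensemble shifts toward the observation and tightens, which is how such an inversion shrinks parameter uncertainty as deposit data are assimilated.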

  9. Uncertainty relation based on unbiased parameter estimations

    Science.gov (United States)

    Sun, Liang-Liang; Song, Yong-Shun; Qiao, Cong-Feng; Yu, Sixia; Chen, Zeng-Bing

    2017-02-01

    Heisenberg's uncertainty relation has been extensively studied in the spirit of its well-known original form, in which the inaccuracy measures used exhibit some controversial properties and do not conform with quantum metrology, where the measurement precision is well defined in terms of estimation theory. In this paper, we treat the joint measurement of incompatible observables as a parameter estimation problem, i.e., estimating the parameters characterizing the statistics of the incompatible observables. Our crucial observation is that, in a sequential measurement scenario, the bias induced by the first unbiased measurement in the subsequent measurement can be eradicated by the information acquired, allowing one to extract unbiased information of the second measurement of an incompatible observable. In terms of Fisher information, we propose a kind of information comparison measure and explore various types of trade-offs between the information gains and measurement precisions, which interpret the uncertainty relation as a surplus-variance trade-off over individual perfect measurements instead of a constraint on extracting complete information of incompatible observables.

  10. Chemical model reduction under uncertainty

    KAUST Repository

    Malpica Galassi, Riccardo

    2017-03-06

    A general strategy for analysis and reduction of uncertain chemical kinetic models is presented, and its utility is illustrated in the context of ignition of hydrocarbon fuel–air mixtures. The strategy is based on a deterministic analysis and reduction method which employs computational singular perturbation analysis to generate simplified kinetic mechanisms, starting from a detailed reference mechanism. We model uncertain quantities in the reference mechanism, namely the Arrhenius rate parameters, as random variables with prescribed uncertainty factors. We propagate this uncertainty to obtain the probability of inclusion of each reaction in the simplified mechanism. We propose probabilistic error measures to compare predictions from the uncertain reference and simplified models, based on the comparison of the uncertain dynamics of the state variables, where the mixture entropy is chosen as progress variable. We employ the construction for the simplification of an uncertain mechanism in an n-butane–air mixture homogeneous ignition case, where a 176-species, 1111-reactions detailed kinetic model for the oxidation of n-butane is used with uncertainty factors assigned to each Arrhenius rate pre-exponential coefficient. This illustration is employed to highlight the utility of the construction, and the performance of a family of simplified models produced depending on chosen thresholds on importance and marginal probabilities of the reactions.
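The propagation of rate-parameter uncertainty to a "probability of inclusion of each reaction" can be illustrated with a deliberately tiny toy, nothing like the 176-species CSP-based construction in the record: each pre-exponential is perturbed by a log-uniform uncertainty factor, and a reaction counts as included when its share of the total rate exceeds an importance threshold.

```python
import math
import random

def inclusion_probability(k_nominal, uf, threshold, n_samples=5000, seed=0):
    # Monte Carlo estimate of the probability that each reaction is
    # retained in a simplified mechanism.  Each rate constant is
    # multiplied by a log-uniform factor in [1/uf, uf]; a reaction is
    # "included" when its share of the total rate exceeds the threshold.
    rng = random.Random(seed)
    counts = [0] * len(k_nominal)
    for _ in range(n_samples):
        ks = [k * math.exp(rng.uniform(-math.log(u), math.log(u)))
              for k, u in zip(k_nominal, uf)]
        total = sum(ks)
        for i, k in enumerate(ks):
            if k / total > threshold:
                counts[i] += 1
    return [c / n_samples for c in counts]

# Three hypothetical reactions: dominant, marginal, and negligible,
# each with an uncertainty factor of 2.
probs = inclusion_probability([10.0, 1.0, 0.05], [2.0, 2.0, 2.0], 0.05)
```

The dominant reaction is always kept, the negligible one never, and the marginal one only in part of the parameter samples, which is exactly the kind of "marginal probability of inclusion" the record thresholds on.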

  11. Uncertainties in modelling CH4 emissions from northern wetlands in glacial climates: the role of vegetation parameters

    Directory of Open Access Journals (Sweden)

    J. van Huissteden

    2011-10-01

    Full Text Available Marine Isotope Stage 3 (MIS 3) interstadials are marked by a sharp increase in the atmospheric methane (CH4) concentration, as recorded in ice cores. Wetlands are assumed to be the major source of this CH4, although several other hypotheses have been advanced. Modelling of CH4 emissions is crucial to quantify CH4 sources for past climates. Vegetation effects are generally highly generalized in modelling past and present-day CH4 fluxes, but should not be neglected. Plants strongly affect the soil-atmosphere exchange of CH4, and the net primary production of the vegetation supplies organic matter as substrate for methanogens. For modelling past CH4 fluxes from northern wetlands, assumptions on vegetation are highly relevant since paleobotanical data indicate large differences in Last Glacial (LG) wetland vegetation composition as compared to modern wetland vegetation. Besides more cold-adapted vegetation, Sphagnum mosses appear to be much less dominant during large parts of the LG than at present, which particularly affects CH4 oxidation and transport. To evaluate the effect of vegetation parameters, we used the PEATLAND-VU wetland CO2/CH4 model to simulate emissions from wetlands in continental Europe during LG and modern climates. We tested the effect of parameters influencing oxidation during plant transport (fox), vegetation net primary production (NPP, parameter symbol Pmax), plant transport rate (Vtransp), maximum rooting depth (Zroot) and root exudation rate (fex). Our model results show that modelled CH4 fluxes are sensitive to fox and Zroot in particular. The effects of Pmax, Vtransp and fex are of lesser relevance. Interactions with water table modelling are significant for Vtransp. We conducted experiments with different wetland vegetation types for Marine Isotope Stage 3 (MIS 3) stadial and interstadial climates and the present-day climate, by coupling PEATLAND-VU to high-resolution climate model simulations for Europe. Experiments assuming

  12. Some Issues in Uncertainty Quantification and Parameter Tuning: A Case Study of Convective Parameterization Scheme in the WRF Regional Climate Model

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Ben; Qian, Yun; Lin, Guang; Leung, Lai-Yung R.; Zhang, Yaocun

    2012-03-05

    The current tuning process of parameters in global climate models is often performed subjectively, or treated as an optimization procedure to minimize the difference between model fields and observations. The latter approach may generate a set of tunable parameters that approximate the observed climate but via an unrealistic balance of physical processes and/or compensating errors over different regions of the globe. In this study, we run the Weather Research and Forecasting (WRF) regional model constrained by the reanalysis data over the Southern Great Plains (SGP), where abundant observational data from various resources are available for calibration of the input parameters and validation of the model results. Our goal is to quantify the uncertainty ranges and identify the optimal values of five key input parameters in a new Kain-Fritsch (KF) convective parameterization scheme incorporated in the WRF model. A stochastic sampling algorithm, Multiple Very Fast Simulated Annealing (MVFSA), is employed to efficiently sample the input parameters in the KF scheme based on the skill score, so that the algorithm progressively moves toward regions of the parameter space that minimize model errors. The results based on the WRF simulations with 25-km grid spacing over the SGP show that the model bias for precipitation can be significantly reduced by using the five optimal parameters identified by the MVFSA algorithm. The model performance is very sensitive to downdraft- and entrainment-related parameters and the consumption time of Convective Available Potential Energy (CAPE). Simulated convective precipitation decreases as the ratio of downdraft to updraft flux increases. A larger CAPE consumption time results in less convective but more stratiform precipitation. The simulation using optimal parameters obtained by constraining only precipitation generates a positive impact on the other output variables, such as temperature and wind. By using the optimal parameters obtained at 25 km
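The annealed-sampling idea behind an algorithm like MVFSA can be conveyed with a minimal sketch. The toy two-parameter "skill score" below is invented, and this is a generic simulated-annealing loop, not the actual MVFSA implementation:

```python
import math
import random

def anneal_minimize(error_fn, bounds, n_iter=3000, seed=0):
    # Simplified annealed importance sampling of a parameter space:
    # propose random moves whose size shrinks with a cooling temperature,
    # always accept downhill moves, accept uphill moves with Boltzmann
    # probability, and track the best parameter set seen.
    rng = random.Random(seed)
    x = [rng.uniform(lo, hi) for lo, hi in bounds]
    e = error_fn(x)
    best_x, best_e = list(x), e
    for it in range(1, n_iter + 1):
        temp = 1.0 / it  # fast cooling schedule
        cand = [min(hi, max(lo, xi + rng.gauss(0.0, temp * (hi - lo))))
                for xi, (lo, hi) in zip(x, bounds)]
        e_cand = error_fn(cand)
        if e_cand < e or rng.random() < math.exp(-(e_cand - e) / temp):
            x, e = cand, e_cand
            if e < best_e:
                best_x, best_e = list(x), e
    return best_x, best_e

# Hypothetical skill-score surrogate: squared mismatch to an assumed
# optimal (normalized) parameter pair.
target = [0.3, 0.7]

def err(p):
    return sum((pi - ti) ** 2 for pi, ti in zip(p, target))

opt, val = anneal_minimize(err, [(0.0, 1.0), (0.0, 1.0)])
```

Early high-temperature iterations explore broadly; later iterations refine locally, which is why the sampler "progressively moves toward regions of the parameter space that minimize model errors".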

  13. Evaluating uncertainty in simulation models

    Energy Technology Data Exchange (ETDEWEB)

    McKay, M.D.; Beckman, R.J.; Morrison, J.D.; Upton, S.C.

    1998-12-01

    The authors discussed some directions for research and development of methods for assessing simulation variability, input uncertainty, and structural model uncertainty. Variance-based measures of importance for input and simulation variables arise naturally when using the quadratic loss function of the difference between the full model prediction y and the restricted prediction {tilde y}. They concluded that generic methods for assessing structural model uncertainty do not now exist. However, methods to analyze structural uncertainty for particular classes of models, like discrete event simulation models, may be attainable.
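A brute-force sketch of the variance-based importance measure described here, for a hypothetical two-input model with independent uniform inputs (the model and sample sizes are invented; a real study would use a proper Sobol estimator):

```python
import random
import statistics

def model(x1, x2):
    # Toy simulation with one influential and one weak input.
    return 4.0 * x1 + x2

def first_order_importance(n_outer=300, n_inner=300, seed=0):
    # For each input x_i, estimate Var( E[y | x_i] ) / Var(y): the share
    # of output variance explained by conditioning the "restricted"
    # prediction on that single input.
    rng = random.Random(seed)
    ys = [model(rng.random(), rng.random()) for _ in range(n_outer * n_inner)]
    var_y = statistics.pvariance(ys)
    indices = []
    for i in (0, 1):
        cond_means = []
        for _ in range(n_outer):
            xi = rng.random()           # fix input i at this value
            inner = []
            for _ in range(n_inner):
                xo = rng.random()       # resample the other input
                args = (xi, xo) if i == 0 else (xo, xi)
                inner.append(model(*args))
            cond_means.append(statistics.fmean(inner))
        indices.append(statistics.pvariance(cond_means) / var_y)
    return indices

s1, s2 = first_order_importance()
```

For this linear toy the analytic shares are 16/17 and 1/17, so the estimate should attribute most of the output variance to the first input.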

  14. Climate model uncertainty vs. conceptual geological uncertainty in hydrological modeling

    Directory of Open Access Journals (Sweden)

    T. O. Sonnenborg

    2015-04-01

    Full Text Available Projections of climate change impact are associated with a cascade of uncertainties including CO2 emission scenario, climate model, downscaling and impact model. The relative importance of the individual uncertainty sources is expected to depend on several factors including the quantity that is projected. In the present study the impacts of climate model uncertainty and geological model uncertainty on hydraulic head, stream flow, travel time and capture zones are evaluated. Six versions of a physically based and distributed hydrological model, each containing a unique interpretation of the geological structure of the model area, are forced by 11 climate model projections. Each projection of future climate is a result of a GCM-RCM model combination (from the ENSEMBLES project) forced by the same CO2 scenario (A1B). The changes from the reference period (1991–2010) to the future period (2081–2100) in projected hydrological variables are evaluated and the effects of geological model and climate model uncertainties are quantified. The results show that uncertainty propagation is context dependent. While the geological conceptualization is the dominating uncertainty source for projection of travel time and capture zones, the uncertainty in the climate models is more important for groundwater hydraulic heads and stream flow.
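The comparison of the two uncertainty sources can be sketched as an ANOVA-style variance partition over a geology-by-climate ensemble matrix. The 3×4 matrix of "projected changes" below is purely hypothetical (the study used 6 geologies × 11 climate projections):

```python
import statistics

def variance_partition(results):
    # results[g][c]: projected change for geological model g under
    # climate model c.  Partition variability into a geology (row-mean)
    # component and a climate (column-mean) component.
    n_g, n_c = len(results), len(results[0])
    grand = statistics.fmean(v for row in results for v in row)
    row_means = [statistics.fmean(row) for row in results]
    col_means = [statistics.fmean(results[g][c] for g in range(n_g))
                 for c in range(n_c)]
    var_geo = statistics.fmean((m - grand) ** 2 for m in row_means)
    var_clim = statistics.fmean((m - grand) ** 2 for m in col_means)
    return var_geo, var_clim

# Hypothetical matrix where the climate models dominate the spread.
res = [[1.0, 3.0, 5.0, 7.0],
       [1.2, 3.2, 5.2, 7.2],
       [0.8, 2.8, 4.8, 6.8]]
vg, vc = variance_partition(res)
```

Comparing the two components for each projected variable is one way to decide, as the record does, which source "dominates" for that quantity.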

  15. Chemical model reduction under uncertainty

    KAUST Repository

    Najm, Habib

    2016-01-05

    We outline a strategy for chemical kinetic model reduction under uncertainty. We present highlights of our existing deterministic model reduction strategy, and describe the extension of the formulation to include parametric uncertainty in the detailed mechanism. We discuss the utility of this construction, as applied to hydrocarbon fuel-air kinetics, and the associated use of uncertainty-aware measures of error between predictions from detailed and simplified models.

  16. Quantifying Uncertainty in the Predictions of the SimSphere Land Biosphere Model in Simulating Key Parameters Characterising Earth's Energy Balance

    Science.gov (United States)

    North, Matthew; Petropoulos, George

    2014-05-01

    Soil Vegetation Atmosphere Transfer (SVAT) models are becoming the preferred scientific tool to assess land surface energy fluxes due to their computational efficiency, accuracy and ability to provide results at fine temporal scales. An all-inclusive validation of those models is a fundamental step before those can be confidently used for any practical application or research purpose alike. SimSphere is an example of a SVAT model, simulating a large array of parameters characterising various land surface interactions over a 24 hour cycle at a 1-D vertical profile. Being able to appreciate the uncertainty of SimSphere predictions, is of vital importance towards increasing confidence in the models' overall use and ability to represent accurate land surface interactions. This is particularly important, given that its use either as a stand-alone tool or synergistically with Earth Observation (EO) data is currently expanding worldwide. In the present study, uncertainty in the SimSphere's predictions is evaluated at seven European sites, representative of a range of ecosystem conditions and biomes types for which in-situ data from the CarboEurope IP operational network acquired during 2011 were available. Selected sites are characterised by varying topographical characteristics, which further allow developing a comprehensive understanding on how topography can affect the models' ability to reproduce the variables which are evaluated. Model simulations are compared to in-situ data collected on cloud free days and on days with high Energy Balance Ratio. We focused here specifically on evaluating SimSphere capability in predicting selected variables of the energy balance, namely the Latent Heat (LE), Sensible heat (H) and Net Radiation (Rn) fluxes. An evaluation of the uncertainty in the model predictions was evaluated on the basis of extensive statistical analysis that was carried out by computing a series of relevant statistical measures. Results obtained confirmed the

  17. The interaction of climate observation, parameter estimation, and mitigation decisions: Modeling climate policy under uncertainty with a partially observable Markov decision process

    Science.gov (United States)

    Fertig, E.; Webster, M.

    2013-12-01

    Though climate sensitivity remains poorly constrained, the trajectory of future greenhouse gas emissions and observable climate data could lead to improved estimates. Updated parameter estimates could alter decisions on greenhouse mitigation policy, which in turn influences future observed climate data and parameter estimation. Previous research on global climate mitigation policy neglects the cyclic nature of climate observation, parameter estimation, and policy action, instead treating uncertainty in climate sensitivity with scenario analysis or assuming that it will be resolved completely at some point in the future. This paper advances quantitative analysis of decision making under uncertainty (DMUU) in climate sensitivity by modeling the observation/parameter estimation/policy action cycle as a partially observable Markov decision process (POMDP). In a POMDP framework, an objective function is maximized while both observable parameters and probability distributions over unobservable parameters are retained as system states. As time progresses and more data are collected, the probability distributions are updated with Bayesian analysis. To model anthropogenic climate change as a POMDP, we maximize social welfare using a modified DICE model. Climate sensitivity is never directly observable; instead it is modeled with a distribution that is subject to Bayesian updating after observation of stochastic changes in global mean temperature. The maximization problem is posed as a stochastic Bellman equation, which expresses total social welfare as the sum of immediate social welfare resulting from a current mitigation decision under current knowledge of climate sensitivity and the expected cost-to-go, which is the discounted future social welfare in the subsequent time interval as a function of both global mean temperature and the consequent probability distribution over climate sensitivity. 
While similar, smaller stochastic dynamic programming problems can be solved
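The Bayesian-updating step at the heart of the POMDP cycle can be sketched on a discretized climate-sensitivity grid. The grid range, flat prior, noise level, and the linear "forward" map from sensitivity to observed warming are all invented simplifications of the DICE-based setup:

```python
import math

def bayes_update(grid, prior, obs, noise_sd, forward):
    # One observation -> estimation step: reweight a discretized
    # distribution over climate sensitivity by the Gaussian likelihood
    # of an observed global mean temperature change, then renormalize.
    likes = [math.exp(-0.5 * ((obs - forward(s)) / noise_sd) ** 2)
             for s in grid]
    post = [p * l for p, l in zip(prior, likes)]
    z = sum(post)
    return [p / z for p in post]

# Sensitivity grid (K per CO2 doubling) with a flat prior; the toy
# forward model maps sensitivity to expected warming at current forcing.
grid = [s / 10.0 for s in range(15, 61)]   # 1.5 ... 6.0 K
prior = [1.0 / len(grid)] * len(grid)
forward = lambda s: 0.5 * s                # hypothetical scaling
post = bayes_update(grid, prior, obs=1.5, noise_sd=0.25, forward=forward)
best = grid[max(range(len(grid)), key=post.__getitem__)]
```

In the POMDP, this updated belief (not a point estimate) becomes part of the state carried into the next mitigation decision.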

  18. Some issues in uncertainty quantification and parameter tuning: a case study of convective parameterization scheme in the WRF regional climate model

    Science.gov (United States)

    Yang, B.; Qian, Y.; Lin, G.; Leung, R.; Zhang, Y.

    2012-03-01

    The current tuning process of parameters in global climate models is often performed subjectively or treated as an optimization procedure to minimize model biases based on observations. While the latter approach may provide more plausible values for a set of tunable parameters to approximate the observed climate, the system could be forced to an unrealistic physical state or improper balance of budgets through compensating errors over different regions of the globe. In this study, the Weather Research and Forecasting (WRF) model was used to provide a more flexible framework to investigate a number of issues related to uncertainty quantification (UQ) and parameter tuning. The WRF model was constrained by reanalysis data over the Southern Great Plains (SGP), where abundant observational data from various sources were available for calibration of the input parameters and validation of the model results. Focusing on five key input parameters in the new Kain-Fritsch (KF) convective parameterization scheme used in WRF as an example, the purpose of this study was to explore the utility of high-resolution observations for improving simulations of regional patterns and evaluate the transferability of UQ and parameter tuning across physical processes, spatial scales, and climatic regimes, which have important implications to UQ and parameter tuning in global and regional models. A stochastic importance sampling algorithm, Multiple Very Fast Simulated Annealing (MVFSA), was employed to efficiently sample the input parameters in the KF scheme based on a skill score so that the algorithm progressively moved toward regions of the parameter space that minimize model errors. The results based on the WRF simulations with 25-km grid spacing over the SGP showed that the precipitation bias in the model could be significantly reduced when five optimal parameters identified by the MVFSA algorithm were used. The model performance was found to be sensitive to downdraft- and entrainment

  19. Some issues in uncertainty quantification and parameter tuning: a case study of convective parameterization scheme in the WRF regional climate model

    Directory of Open Access Journals (Sweden)

    B. Yang

    2012-03-01

    Full Text Available The current tuning process of parameters in global climate models is often performed subjectively or treated as an optimization procedure to minimize model biases based on observations. While the latter approach may provide more plausible values for a set of tunable parameters to approximate the observed climate, the system could be forced to an unrealistic physical state or improper balance of budgets through compensating errors over different regions of the globe. In this study, the Weather Research and Forecasting (WRF) model was used to provide a more flexible framework to investigate a number of issues related to uncertainty quantification (UQ) and parameter tuning. The WRF model was constrained by reanalysis data over the Southern Great Plains (SGP), where abundant observational data from various sources were available for calibration of the input parameters and validation of the model results. Focusing on five key input parameters in the new Kain-Fritsch (KF) convective parameterization scheme used in WRF as an example, the purpose of this study was to explore the utility of high-resolution observations for improving simulations of regional patterns and evaluate the transferability of UQ and parameter tuning across physical processes, spatial scales, and climatic regimes, which have important implications to UQ and parameter tuning in global and regional models. A stochastic importance sampling algorithm, Multiple Very Fast Simulated Annealing (MVFSA), was employed to efficiently sample the input parameters in the KF scheme based on a skill score so that the algorithm progressively moved toward regions of the parameter space that minimize model errors.

    The results based on the WRF simulations with 25-km grid spacing over the SGP showed that the precipitation bias in the model could be significantly reduced when five optimal parameters identified by the MVFSA algorithm were used. The model performance was found to be sensitive to

  20. Some issues in uncertainty quantification and parameter tuning: a case study of convective parameterization scheme in the WRF regional climate model

    Directory of Open Access Journals (Sweden)

    B. Yang

    2011-12-01

    Full Text Available The current tuning process of parameters in global climate models is often performed subjectively or treated as an optimization procedure to minimize model biases based on observations. While the latter approach may provide more plausible values for a set of tunable parameters to approximate the observed climate, the system could be forced to an unrealistic physical state or improper balance of budgets through compensating errors over different regions of the globe. In this study, the Weather Research and Forecasting (WRF) model was used to provide a more flexible framework to investigate a number of issues related to uncertainty quantification (UQ) and parameter tuning. The WRF model was constrained by reanalysis data over the Southern Great Plains (SGP), where abundant observational data from various sources were available for calibration of the input parameters and validation of the model results. Focusing on five key input parameters in the new Kain-Fritsch (KF) convective parameterization scheme used in WRF as an example, the purpose of this study was to explore the utility of high-resolution observations for improving simulations of regional patterns and evaluate the transferability of UQ and parameter tuning across physical processes, spatial scales, and climatic regimes, which have important implications to UQ and parameter tuning in global and regional models. A stochastic importance-sampling algorithm, Multiple Very Fast Simulated Annealing (MVFSA), was employed to efficiently sample the input parameters in the KF scheme based on a skill score so that the algorithm progressively moved toward regions of the parameter space that minimize model errors.

    The results based on the WRF simulations with 25-km grid spacing over the SGP showed that the precipitation bias in the model could be significantly reduced when five optimal parameters identified by the MVFSA algorithm were used. The model performance was found to be sensitive to

  1. Understanding uncertainties when inferring mean transit times of water through tracer-based lumped-parameter models in Andean tropical montane cloud forest catchments

    Science.gov (United States)

    Timbe, E.; Windhorst, D.; Crespo, P.; Frede, H.-G.; Feyen, J.; Breuer, L.

    2014-04-01

    Weekly samples from surface waters, springs, soil water and rainfall were collected in a 76.9 km2 mountain rain forest catchment and its tributaries in southern Ecuador. Time series of the stable water isotopes δ18O and δ2H were used to calculate mean transit times (MTTs) and the transit time distribution functions (TTDs), solving the convolution method for seven lumped-parameter models. For each model setup, the generalized likelihood uncertainty estimation (GLUE) methodology was applied to find the best predictions, behavioral solutions and parameter identifiability. For the study basin, TTDs based on model types such as the linear-piston flow for soil waters and the exponential-piston flow for surface waters and springs performed better than more versatile equations such as the gamma and the two parallel linear reservoirs. Notwithstanding, both of the latter approaches yielded a better goodness of fit for most sites, but with considerably larger uncertainty as shown by GLUE. Among the tested models, corresponding results were obtained for soil waters with short MTTs (ranging from 2 to 9 weeks). For waters with longer MTTs differences were found, suggesting that in those cases the MTT should be based at least on an intercomparison of several models. Under dominant baseflow conditions, long MTTs for stream water (≥ 2 yr) were detected, a phenomenon also observed for shallow springs. Short MTTs for water in the top soil layer indicate a rapid exchange of surface waters with deeper soil horizons. Differences in travel times between soils suggest evidence of a land use effect on flow generation.
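The lumped-parameter convolution method can be sketched with a plain exponential TTD: the stream tracer signal is a TTD-weighted sum of past rainfall tracer values, and a longer mean transit time damps the seasonal isotope cycle more strongly. The sinusoidal input and the two MTT values below are invented for illustration:

```python
import math

def exponential_ttd(tau, n_steps):
    # Discrete exponential transit time distribution with mean tau
    # (in weeks), normalized over the truncated window.
    w = [math.exp(-t / tau) for t in range(n_steps)]
    s = sum(w)
    return [wi / s for wi in w]

def convolve_tracer(rain_signal, ttd):
    # Lumped-parameter convolution: the stream signal at time t is a
    # TTD-weighted sum of past rainfall tracer values.
    out = []
    for t in range(len(rain_signal)):
        acc = 0.0
        for lag, weight in enumerate(ttd):
            if t - lag >= 0:
                acc += weight * rain_signal[t - lag]
        out.append(acc)
    return out

# Sinusoidal delta-18O-like input over two years of weekly samples.
n = 104
rain = [math.sin(2 * math.pi * t / 52.0) for t in range(n)]
short = convolve_tracer(rain, exponential_ttd(4.0, n))    # MTT ~ 4 weeks
long_ = convolve_tracer(rain, exponential_ttd(52.0, n))   # MTT ~ 1 year
amp = lambda sig: max(sig[52:]) - min(sig[52:])
```

Fitting such a model means adjusting tau (and a piston-flow fraction, for the exponential-piston variants) until the simulated damping and phase shift match the observed stream isotope series; GLUE then maps which tau values are behavioral.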

  2. Uncertainty in Air Quality Modeling.

    Science.gov (United States)

    Fox, Douglas G.

    1984-01-01

    Under the direction of the AMS Steering Committee for the EPA Cooperative Agreement on Air Quality Modeling, a small group of scientists convened to consider the question of uncertainty in air quality modeling. Because the group was particularly concerned with the regulatory use of models, its discussion focused on modeling tall-stack, point-source emissions.The group agreed that air quality model results should be viewed as containing both reducible error and inherent uncertainty. Reducible error results from improper or inadequate meteorological and air quality data inputs, and from inadequacies in the models. Inherent uncertainty results from the basic stochastic nature of the turbulent atmospheric motions that are responsible for transport and diffusion of released materials. Modelers should acknowledge that all their predictions to date contain some associated uncertainty, and should also strive to quantify that uncertainty.How can the uncertainty be quantified? There was no consensus from the group as to precisely how uncertainty should be calculated. One subgroup, which addressed statistical procedures, suggested that uncertainty information could be obtained from comparisons of observations and predictions. Following recommendations from a previous AMS workshop on performance evaluation (Fox, 1981), the subgroup suggested construction of probability distribution functions from the differences between observations and predictions. Further, they recommended that relatively new computer-intensive statistical procedures be considered to improve the quality of uncertainty estimates for the extreme value statistics of interest in regulatory applications.A second subgroup, which addressed the basic nature of uncertainty in a stochastic system, also recommended that uncertainty be quantified by consideration of the differences between observations and predictions. 
They suggested that the average of the difference squared was appropriate to isolate the inherent uncertainty that
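The two suggestions, an empirical distribution of observation-minus-prediction differences and the average squared difference, amount to a few lines of code. The observation and prediction values below are invented illustration data:

```python
import statistics

def uncertainty_summary(observed, predicted):
    # Summaries along the lines suggested in the record: the empirical
    # distribution of observation-minus-prediction differences (here via
    # bias and tail quantiles) and the mean squared difference.
    diffs = sorted(o - p for o, p in zip(observed, predicted))
    mse = statistics.fmean(d * d for d in diffs)
    q = lambda frac: diffs[min(len(diffs) - 1, int(frac * len(diffs)))]
    return {"bias": statistics.fmean(diffs), "mse": mse,
            "p05": q(0.05), "p95": q(0.95)}

# Hypothetical concentrations: observations vs. a constant prediction.
obs = [10.2, 9.8, 11.1, 10.5, 9.6, 10.9, 10.0, 10.4]
pred = [10.0] * len(obs)
summ = uncertainty_summary(obs, pred)
```

With enough paired data, the full empirical distribution (not just these summaries) supports the extreme-value statistics that matter in regulatory applications.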

  3. Parametric uncertainty modeling for robust control

    DEFF Research Database (Denmark)

    Rasmussen, K.H.; Jørgensen, Sten Bay

    1999-01-01

    The dynamic behaviour of a non-linear process can often be approximated with a time-varying linear model. In the presented methodology the dynamics is modeled non-conservatively as parametric uncertainty in linear time-invariant models. The obtained uncertainty description makes it possible to perform robustness analysis on a control system using the structured singular value. The idea behind the proposed method is to fit a rational function to the parameter variation. The parameter variation can then be expressed as a linear fractional transformation (LFT). It is discussed how the proposed method can be utilized in identification of a nominal model with uncertainty description. The method is demonstrated on a binary distillation column operating in the LV configuration. The dynamics of the column is approximated by a second-order linear model, wherein the parameters vary as the operating...

  4. Uncertainty Analysis of Thermal Comfort Parameters

    Science.gov (United States)

    Ribeiro, A. Silva; Alves e Sousa, J.; Cox, Maurice G.; Forbes, Alistair B.; Matias, L. Cordeiro; Martins, L. Lages

    2015-08-01

    International Standard ISO 7730:2005 defines thermal comfort as that condition of mind that expresses the degree of satisfaction with the thermal environment. Although this definition is inevitably subjective, the Standard gives formulae for two thermal comfort indices, predicted mean vote (PMV) and predicted percentage dissatisfied (PPD). The PMV formula is based on principles of heat balance and experimental data collected in a controlled climate chamber under steady-state conditions. The PPD formula depends only on PMV. Although these formulae are widely recognized and adopted, little has been done to establish the measurement uncertainties associated with their use, bearing in mind that the formulae depend on measured values and tabulated values given to limited numerical accuracy. Knowledge of these uncertainties is invaluable when values provided by the formulae are used in making decisions in various health and civil engineering situations. This paper examines these formulae, giving a general mechanism for evaluating the uncertainties associated with values of the quantities on which the formulae depend. Further, consideration is given to the propagation of these uncertainties through the formulae to provide uncertainties associated with the values obtained for the indices. Current international guidance on uncertainty evaluation is utilized.
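Since PPD depends only on PMV, via the ISO 7730 relation PPD = 100 - 95*exp(-0.03353*PMV^4 - 0.2179*PMV^2), propagating a PMV uncertainty through this nonlinear formula is easy to sketch with Monte Carlo sampling (in the spirit of GUM Supplement 1). The PMV value 0.5 and standard uncertainty 0.2 below are invented for illustration, not values from the paper:

```python
import math
import random
import statistics

def ppd(pmv):
    # ISO 7730 relation between predicted mean vote (PMV) and
    # predicted percentage dissatisfied (PPD).
    return 100.0 - 95.0 * math.exp(-0.03353 * pmv ** 4 - 0.2179 * pmv ** 2)

def ppd_uncertainty(pmv_mean, pmv_sd, n=20000, seed=0):
    # Monte Carlo propagation of a Gaussian PMV uncertainty through the
    # nonlinear PPD formula: sample PMV, evaluate PPD, summarize.
    rng = random.Random(seed)
    samples = [ppd(rng.gauss(pmv_mean, pmv_sd)) for _ in range(n)]
    return statistics.fmean(samples), statistics.stdev(samples)

mean_ppd, sd_ppd = ppd_uncertainty(0.5, 0.2)
```

Because the formula is convex away from PMV = 0, the propagated mean PPD sits slightly above the PPD of the mean PMV, which is exactly why naive plug-in evaluation understates the index.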

  5. Numerical modeling of economic uncertainty

    DEFF Research Database (Denmark)

    Schjær-Jacobsen, Hans

    2007-01-01

    Representation and modeling of economic uncertainty is addressed by different modeling methods, namely stochastic variables and probabilities, interval analysis, and fuzzy numbers, in particular triple estimates. Focusing on discounted cash flow analysis, numerical results are presented, comparisons...
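Of the methods listed, interval analysis is the simplest to sketch for discounted cash flow: carry a (low, high) interval per cash flow and an interval discount rate through the NPV sum. The project figures below are invented, and the bound pairing assumes non-negative future flows (so NPV is monotone in the rate):

```python
def interval_npv(cashflows, rate_lo, rate_hi):
    # Interval-arithmetic NPV for (low, high) intervals on each cash flow
    # and an interval discount rate.  Assuming the uncertain future flows
    # (t >= 1) are non-negative, the worst case pairs low flows with the
    # high rate, and the best case pairs high flows with the low rate.
    npv_lo = sum(lo / (1.0 + rate_hi) ** t
                 for t, (lo, hi) in enumerate(cashflows))
    npv_hi = sum(hi / (1.0 + rate_lo) ** t
                 for t, (lo, hi) in enumerate(cashflows))
    return npv_lo, npv_hi

# Hypothetical project: certain outlay now, uncertain inflows for three
# years, discount rate known only to lie between 5% and 10%.
flows = [(-100.0, -100.0), (38.0, 42.0), (38.0, 42.0), (38.0, 42.0)]
npv_lo, npv_hi = interval_npv(flows, 0.05, 0.10)
```

An interval straddling zero, as here, signals that the investment decision cannot be settled from the stated uncertainty alone; fuzzy triple estimates refine this by grading membership between the bounds.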

  6. Uncertainties in Nuclear Proliferation Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Chul Min; Yim, Man-Sung; Park, Hyeon Seok [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of)

    2015-05-15

    There have been various efforts in the research community to understand the determinants of nuclear proliferation and develop quantitative tools to predict nuclear proliferation events. Such systematic approaches have shown the possibility to provide warning for the international community to prevent nuclear proliferation activities. However, there is still much debate about the robustness of the estimated effects of the determinants and of the projection results. Some studies have shown that several factors can cause uncertainties in previous quantitative nuclear proliferation modeling works. This paper analyzes the uncertainties in the past approaches and suggests future work from the viewpoint of proliferation history, analysis methods, and variable selection. The research community still lacks knowledge of the sources of uncertainty in current models. Fundamental problems in modeling will remain even if other, more advanced modeling methods are developed. Before starting to develop fancier models based on hypotheses about time-dependent proliferation determinants, graph theory, etc., it is important to analyze the uncertainty of current models to solve the fundamental problems of nuclear proliferation modeling. The uncertainty from different codings of proliferation history is small. More serious problems stem from limited analysis methods and correlation among the variables. Problems in regression analysis and survival analysis cause large uncertainties even when using the same dataset, which decreases the robustness of the results. Inaccurate variables for nuclear proliferation also increase the uncertainty. To overcome these problems, further quantitative research should focus on analyzing the knowledge suggested in qualitative nuclear proliferation studies.

  7. Representing and managing uncertainty in qualitative ecological models

    NARCIS (Netherlands)

    Nuttle, T.; Bredeweg, B.; Salles, P.; Neumann, M.

    2009-01-01

    Ecologists and decision makers need ways to understand systems, test ideas, and make predictions and explanations about systems. However, uncertainty about causes and effects of processes and parameter values is pervasive in models of ecological systems. Uncertainty associated with incomplete

  8. Orbit control of a stratospheric satellite with parameter uncertainties

    Science.gov (United States)

    Xu, Ming; Huo, Wei

    2016-12-01

When a stratospheric satellite travels by prevailing winds in the stratosphere, its cross-track displacement needs to be controlled to maintain a constant-latitude orbital flight. To design the orbit control system, a 6 degree-of-freedom (DOF) model of the satellite is established based on the second Lagrangian formulation. It is proven that input/output feedback linearization theory cannot be directly applied to the orbit control with this model, so three subsystem models are deduced from the 6-DOF model to develop a sequential nonlinear control strategy. The control strategy includes an adaptive controller for the balloon-tether subsystem with uncertain balloon parameters, a PD controller based on feedback linearization for the tether-sail subsystem, and a sliding mode controller for the sail-rudder subsystem with uncertain sail parameters. Simulation studies demonstrate that the proposed control strategy is robust to uncertainties and satisfies the high precision requirements for the orbital flight of the satellite.

  9. Explicit consideration of topological and parameter uncertainty gives new insights into a well-established model of glycolysis

    NARCIS (Netherlands)

    Achcar, Fiona; Barrett, Michael P.; Breitling, Rainer

    2013-01-01

Previous models of glycolysis in the sleeping sickness parasite Trypanosoma brucei assumed that the core part of glycolysis in this unicellular parasite is tightly compartmentalized within an organelle, the glycosome, which had previously been shown to contain most of the glycolytic enzymes. The gly...

  10. Approaches in highly parameterized inversion—PEST++ Version 3, a Parameter ESTimation and uncertainty analysis software suite optimized for large environmental models

    Science.gov (United States)

    Welter, David E.; White, Jeremy T.; Hunt, Randall J.; Doherty, John E.

    2015-09-18

    The PEST++ Version 1 object-oriented parameter estimation code is here extended to Version 3 to incorporate additional algorithms and tools to further improve support for large and complex environmental modeling problems. PEST++ Version 3 includes the Gauss-Marquardt-Levenberg (GML) algorithm for nonlinear parameter estimation, Tikhonov regularization, integrated linear-based uncertainty quantification, options of integrated TCP/IP based parallel run management or external independent run management by use of a Version 2 update of the GENIE Version 1 software code, and utilities for global sensitivity analyses. The Version 3 code design is consistent with PEST++ Version 1 and continues to be designed to lower the barriers of entry for users as well as developers while providing efficient and optimized algorithms capable of accommodating large, highly parameterized inverse problems. As such, this effort continues the original focus of (1) implementing the most popular and powerful features of the PEST software suite in a fashion that is easy for novice or experienced modelers to use and (2) developing a software framework that is easy to extend.
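    As a rough illustration of the regularized inversion that codes like PEST++ perform, the sketch below applies a damped, Tikhonov-regularized Gauss-Newton update to a toy linear problem. All names and values are illustrative and unrelated to the actual PEST++ implementation:

```python
import numpy as np

def gml_step(J, r, params, prior, lam, tik):
    """One damped Gauss-Newton update with Tikhonov regularization,
    in the spirit of the GML algorithm: solve
    (J^T J + lam*diag(J^T J) + tik*I) dp = J^T r + tik*(prior - params)."""
    JtJ = J.T @ J
    A = JtJ + lam * np.diag(np.diag(JtJ)) + tik * np.eye(JtJ.shape[0])
    b = J.T @ r + tik * (prior - params)
    return np.linalg.solve(A, b)

# toy linear inverse problem: recover a and b in y = a*x + b
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 20)
y = 2.0 * x + 1.0 + 0.01 * rng.standard_normal(20)

J = np.column_stack([x, np.ones_like(x)])   # Jacobian (constant for a linear model)
params = np.zeros(2)                        # initial parameter guess
prior = np.zeros(2)                         # preferred-value regularization target
for _ in range(20):
    r = y - J @ params                      # residuals at current parameters
    params = params + gml_step(J, r, params, prior, lam=0.01, tik=1e-6)
print(np.round(params, 2))
```

    The damping term `lam` stabilizes the step when the normal equations are ill-conditioned, while the (here tiny) Tikhonov weight pulls poorly informed parameters toward their prior values.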

  12. Uncertainty Assessment in Urban Storm Water Drainage Modelling

    DEFF Research Database (Denmark)

    Thorndahl, Søren

    The object of this paper is to make an overall description of the author's PhD study, concerning uncertainties in numerical urban storm water drainage models. Initially an uncertainty localization and assessment of model inputs and parameters as well as uncertainties caused by different model...

  13. Analyzing the effects of geological and parameter uncertainty on prediction of groundwater head and travel time

    Directory of Open Access Journals (Sweden)

    X. He

    2013-08-01

Uncertainty of groundwater model predictions has in the past mostly been related to uncertainty in the hydraulic parameters, whereas uncertainty in the geological structure has not been considered to the same extent. Recent developments in theoretical methods for quantifying geological uncertainty have made it possible to consider this factor in groundwater modeling. In this study we have applied the multiple-point geostatistical method (MPS) integrated in the Stanford Geostatistical Modeling Software (SGeMS) for exploring the impact of geological uncertainty on groundwater flow patterns for a site in Denmark. Realizations from the geostatistical model were used as input to a groundwater model developed in MODFLOW, the modular three-dimensional finite-difference groundwater model, within the Groundwater Modeling System (GMS) modeling environment. The uncertainty analysis was carried out in three scenarios involving simulation of groundwater head distribution and travel time. The first scenario implied 100 stochastic geological models all assigning the same hydraulic parameters to the same geological units. In the second scenario the same 100 geological models were subjected to model optimization, where the hydraulic parameters for each of them were estimated by calibration against observations of hydraulic head and stream discharge. In the third scenario each geological model was run with 216 randomized sets of parameters. The analysis documented that the uncertainty on the conceptual geological model was as significant as the uncertainty related to the embedded hydraulic parameters.

  14. Uncertainty calculation in transport models and forecasts

    DEFF Research Database (Denmark)

    Manzo, Stefano; Prato, Carlo Giacomo

    in a four-stage transport model related to different variable distributions (to be used in a Monte Carlo simulation procedure), assignment procedures and levels of congestion, at both the link and the network level. The analysis used as case study the Næstved model, referring to the Danish town of Næstved2...... the uncertainty propagation pattern over time specific for key model outputs becomes strategically important. 1 Manzo, S., Nielsen, O. A. & Prato, C. G. (2014). The Effects of uncertainty in speed-flow curve parameters on a large-scale model. Transportation Research Record, 1, 30-37. 2 Manzo, S., Nielsen, O. A...

  15. Uncertainty in hydrological change modelling

    DEFF Research Database (Denmark)

    Seaby, Lauren Paige

Hydrological change modelling methodologies generally use climate model outputs to force hydrological simulations under changed conditions. There are nested sources of uncertainty throughout this methodology, including choice of climate model and subsequent bias correction methods. This Ph.D. study evaluates the uncertainty of the impact of climate change in hydrological simulations given multiple climate models and bias correction methods of varying complexity. Three distribution based scaling methods (DBS) were developed and benchmarked against a more simplistic and commonly used delta change (DC) approach. These climate model projections were then used to force hydrological simulations under climate change for the island Sjælland in Denmark to analyse the contribution of different climate models and bias correction methods to overall uncertainty in the hydrological change modelling...

  16. Numerical modeling of economic uncertainty

    DEFF Research Database (Denmark)

    Schjær-Jacobsen, Hans

    2007-01-01

    Representation and modeling of economic uncertainty is addressed by different modeling methods, namely stochastic variables and probabilities, interval analysis, and fuzzy numbers, in particular triple estimates. Focusing on discounted cash flow analysis numerical results are presented, comparisons...... are made between alternative modeling methods, and characteristics of the methods are discussed....

  17. Robust integrated navigation for Mars atmospheric entry with parameter uncertainties

    Science.gov (United States)

    Yang, H. F.; Fu, H. M.; Wang, Z. H.; Xiao, Q.; Zhang, Y. B.

    2017-07-01

    Mars atmospheric entry is a key phase to actualize Mars pinpoint landing. In this phase, parameters including atmospheric density, ballistic coefficient, and lift-to-drag ratio are uncertain because of environmental complexity. Ignoring these uncertainties may probably cause negative effects on the navigation accuracy. Based on the desensitized unscented Kalman filter (DUKF), which obtains the state estimation by minimizing a cost function involving the trace of posterior covariance matrix and the weighted norm of the posterior state estimation error sensitivities, this paper further introduces parameter uncertainties into the radio beacons/inertial measurement unit integrated navigation scheme and establishes a robust integrated navigation for Mars atmospheric entry with parameter uncertainties. Numerical simulation results show that the robust navigation algorithm based on the DUKF effectively reduces the influence of parameter uncertainties and illustrates a better performance than traditional methods.

  18. Model Uncertainty for Bilinear Hysteric Systems

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Thoft-Christensen, Palle

    In structural reliability analysis at least three types of uncertainty must be considered, namely physical uncertainty, statistical uncertainty, and model uncertainty (see e.g. Thoft-Christensen & Baker [1]). The physical uncertainty is usually modelled by a number of basic variables by predictive...... density functions, Veneziano [2]. In general, model uncertainty is the uncertainty connected with mathematical modelling of the physical reality. When structural reliability analysis is related to the concept of a failure surface (or limit state surface) in the n-dimension basic variable space then model...... uncertainty is at least due to the neglected variables, the modelling of the failure surface and the computational technique used....

  19. Addressing Replication and Model Uncertainty

    DEFF Research Database (Denmark)

    Ebersberger, Bernd; Galia, Fabrice; Laursen, Keld

    Many fields of strategic management are subject to an important degree of model uncertainty. This is because the true model, and therefore the selection of appropriate explanatory variables, is essentially unknown. Drawing on the literature on the determinants of innovation, and by analyzing inno...

  1. Model uncertainty in growth empirics

    NARCIS (Netherlands)

    Prüfer, P.

    2008-01-01

    This thesis applies so-called Bayesian model averaging (BMA) to three different economic questions substantially exposed to model uncertainty. Chapter 2 addresses a major issue of modern development economics: the analysis of the determinants of pro-poor growth (PPG), which seeks to combine high gro

  2. Parameter uncertainty analysis for simulating streamflow in a river catchment of Vietnam

    Directory of Open Access Journals (Sweden)

    Dao Nguyen Khoi

    2015-07-01

Hydrological models play vital roles in the management of water resources. However, calibration of hydrological models is a large challenge because of the uncertainty involved in their large number of parameters. In this study, four uncertainty analysis methods, Generalized Likelihood Uncertainty Estimation (GLUE), Parameter Solution (ParaSol), Particle Swarm Optimization (PSO), and Sequential Uncertainty Fitting (SUFI-2), were employed to perform parameter uncertainty analysis of streamflow simulation in the Srepok River Catchment using the Soil and Water Assessment Tool (SWAT) model. The four methods were compared in terms of model prediction uncertainty, model performance, and computational efficiency. The results showed that the SUFI-2 method has advantages in model calibration and uncertainty analysis: it could be run with the smallest number of simulation runs while still achieving good prediction uncertainty bands and model performance.
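    The core GLUE idea described above (sample parameter sets, retain the "behavioural" ones, derive prediction bands) can be sketched as follows. The toy model, likelihood measure and threshold are invented for illustration and are not taken from the study:

```python
import numpy as np

rng = np.random.default_rng(42)

# hypothetical one-parameter runoff model Q = k * P (purely illustrative)
P = rng.uniform(1.0, 10.0, size=50)               # rainfall forcing
Q_obs = 0.6 * P + 0.2 * rng.standard_normal(50)   # synthetic "observations"

def nse(sim, obs):
    """Nash-Sutcliffe efficiency, a common GLUE likelihood measure."""
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

# GLUE: sample parameter sets, keep the "behavioural" ones above a threshold
k_samples = rng.uniform(0.0, 2.0, size=5000)
scores = np.array([nse(k * P, Q_obs) for k in k_samples])
behavioural = k_samples[scores > 0.5]

# prediction uncertainty bands from the behavioural ensemble
sims = behavioural[:, None] * P[None, :]
lower, upper = np.percentile(sims, [2.5, 97.5], axis=0)
print(len(behavioural), float(lower.mean()), float(upper.mean()))
```

    Methods such as SUFI-2 differ mainly in how the parameter space is sampled and narrowed between iterations, which is what reduces the number of simulation runs required.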

  3. Uncertainty in biology a computational modeling approach

    CERN Document Server

    Gomez-Cabrero, David

    2016-01-01

Computational modeling of biomedical processes is gaining more and more weight in current research into the etiology of biomedical problems and potential treatment strategies. Computational modeling makes it possible to reduce, refine and replace animal experimentation as well as to translate findings obtained in these experiments to the human background. However, these biomedical problems are inherently complex, with a myriad of influencing factors, which strongly complicates the model building and validation process. This book addresses four main issues related to the building and validation of computational models of biomedical processes: model establishment under uncertainty; model selection and parameter fitting; sensitivity analysis and model adaptation; and model predictions under uncertainty. In each of these areas, the book discusses a number of key techniques by means of a general theoretical description followed by one or more practical examples. This book is intended for graduate stude...

  4. Variability and Uncertainties of Key Hydrochemical Parameters for SKB Sites

    Energy Technology Data Exchange (ETDEWEB)

    Bath, Adrian [Intellisci Ltd, Willoughby on the Wolds, Loughborough (United Kingdom); Hermansson, Hans-Peter [Studsvik Nuclear AB, Nykoeping (Sweden)

    2006-12-15

    The work described in this report is a development of SKI's capability for the review and evaluation of data that will constitute part of SKB's case for selection of a suitable site and application to construct a geological repository for spent nuclear fuel. The aim has been to integrate a number of different approaches to interpreting and evaluating hydrochemical data, especially with respect to the parameters that matter most in assessing the suitability of a site and in understanding the geochemistry and groundwater conditions at a site. It has been focused on taking an independent view of overall uncertainties in reported data, taking account of analytical, sampling and other random and systematic sources of error. This evaluation was carried out initially with a compilation and general inspection of data from the Simpevarp, Forsmark and Laxemar sites plus data from older 'historical' boreholes in the Aespoe area. That was followed by a more specific interpretation by means of geochemical calculations which test the robustness of certain parameters, namely pH and redox/Eh. Geochemical model calculations have been carried out with widely available computer software. Data sources and their handling were also considered, especially access to SKB's SICADA database. In preparation for the use of geochemical modelling programs and to establish comparability of model results with those reported by SKB, the underlying thermodynamic databases were compared with each other and with other generally accepted databases. Comparisons of log K data for selected solid phases and solution complexes from the different thermodynamic databases were made. In general, there is a large degree of comparability between the databases, but there are some significant, and in a few cases large, differences. The present situation is however adequate for present purposes. The interpretation of redox equilibria is dependent on identifying the relevant solid phases and

  5. Uncertainty for calculating transport on Titan: a probabilistic description of bimolecular diffusion parameters

    CERN Document Server

    Plessis, Sylvain; Mandt, Kathy; Greathouse, Thomas; Luspay-Kuti, Adrienn

    2015-01-01

Bimolecular diffusion coefficients are important parameters used by atmospheric models to calculate altitude profiles of minor constituents in an atmosphere. Unfortunately, laboratory measurements of these coefficients were never conducted at temperature conditions relevant to the atmosphere of Titan. Here we conduct a detailed uncertainty analysis of the bimolecular diffusion coefficient parameters as applied to Titan's upper atmosphere to provide a better understanding of the impact of uncertainty in this parameter on models. Because temperature and pressure conditions are much lower than the laboratory conditions in which bimolecular diffusion parameters were measured, we apply a Bayesian framework, a problem-agnostic framework, to determine parameter estimates and associated uncertainties. We solve the Bayesian calibration problem using the open-source QUESO library which also performs a propagation of uncertainties in the calibrated parameters to temperature and pressure conditions observed in Titan's u...

  6. Uncertainty quantification for Markov chain models.

    Science.gov (United States)

    Meidani, Hadi; Ghanem, Roger

    2012-12-01

    Transition probabilities serve to parameterize Markov chains and control their evolution and associated decisions and controls. Uncertainties in these parameters can be associated with inherent fluctuations in the medium through which a chain evolves, or with insufficient data such that the inferential value of the chain is jeopardized. The behavior of Markov chains associated with such uncertainties is described using a probabilistic model for the transition matrices. The principle of maximum entropy is used to characterize the probability measure of the transition rates. The formalism is demonstrated on a Markov chain describing the spread of disease, and a number of quantities of interest, pertaining to different aspects of decision-making, are investigated.
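    A minimal sketch of propagating transition-matrix uncertainty through a disease-spread chain is given below. Note that it uses Dirichlet-distributed rows as a convenient stand-in for the paper's maximum-entropy construction, and the chain and numbers are invented:

```python
import numpy as np

rng = np.random.default_rng(1)

# nominal transition matrix for a toy disease-spread chain (rows sum to 1):
# states are susceptible, infected, recovered (absorbing)
P_nominal = np.array([[0.90, 0.10, 0.00],
                      [0.00, 0.70, 0.30],
                      [0.00, 0.00, 1.00]])

def sample_transition_matrix(P, concentration=50.0):
    """Draw a random row-stochastic matrix scattered around P."""
    return np.array([rng.dirichlet(concentration * row + 1e-6) for row in P])

# propagate parameter uncertainty: distribution of the infected fraction
# after 10 steps, starting from an all-susceptible population
x0 = np.array([1.0, 0.0, 0.0])
samples = np.array([
    (x0 @ np.linalg.matrix_power(sample_transition_matrix(P_nominal), 10))[1]
    for _ in range(2000)
])
print(samples.mean(), np.percentile(samples, [2.5, 97.5]))
```

    Each sampled matrix yields one trajectory of the quantity of interest, so decision-relevant quantities come with an uncertainty band rather than a point value.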

  7. Uncertainty Quantification in Climate Modeling and Projection

    Energy Technology Data Exchange (ETDEWEB)

    Qian, Yun; Jackson, Charles; Giorgi, Filippo; Booth, Ben; Duan, Qingyun; Forest, Chris; Higdon, Dave; Hou, Z. Jason; Huerta, Gabriel

    2016-05-01

assessing reliability and uncertainties of climate change information. An alternative approach is to generate similar ensembles by perturbing parameters within a single-model framework. One of the workshop's objectives was to give participants a deeper understanding of these approaches within a Bayesian statistical framework. However, significant challenges remain to be resolved before UQ can be applied in a convincing way to climate models and their projections.

  8. Uncertainty in spatially explicit animal dispersal models

    Science.gov (United States)

    Mooij, Wolf M.; DeAngelis, Donald L.

    2003-01-01

Uncertainty in estimates of survival of dispersing animals is a vexing difficulty in conservation biology. The current notion is that this uncertainty decreases the usefulness of spatially explicit population models in particular. We examined this problem by comparing dispersal models of three levels of complexity: (1) an event-based binomial model that considers only the occurrence of mortality or arrival, (2) a temporally explicit exponential model that employs mortality and arrival rates, and (3) a spatially explicit grid-walk model that simulates the movement of animals through an artificial landscape. Each model was fitted to the same set of field data. A first objective of the paper is to illustrate how the maximum-likelihood method can be used in all three cases to estimate the means and confidence limits for the relevant model parameters, given a particular set of data on dispersal survival. Using this framework we show that the structure of the uncertainty for all three models is strikingly similar. In fact, the results of our unified approach imply that spatially explicit dispersal models, which take advantage of information on landscape details, suffer less from uncertainty than do simpler models. Moreover, we show that the proposed strategy of model development safeguards one from error propagation in these more complex models. Finally, our approach shows that all models related to animal dispersal, ranging from simple to complex, can be related in a hierarchical fashion, so that the various approaches to modeling such dispersal can be viewed from a unified perspective.
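    The event-based binomial model (level 1 above) admits a closed-form maximum-likelihood treatment, sketched below with invented data; the normal approximation used here for the confidence limits is a common stand-in for a full likelihood-profile analysis:

```python
import math

# hypothetical field data: 120 tracked dispersers, 78 confirmed arrivals
n, k = 120, 78

# event-based binomial model: maximum-likelihood estimate of arrival probability
p_hat = k / n

# approximate 95% confidence limits from the curvature of the log-likelihood
se = math.sqrt(p_hat * (1.0 - p_hat) / n)
lower, upper = p_hat - 1.96 * se, p_hat + 1.96 * se
print(f"p = {p_hat:.3f}, 95% CI = ({lower:.3f}, {upper:.3f})")
```

    The temporally and spatially explicit models replace the single probability with rate or movement parameters, but the same likelihood machinery produces their estimates and confidence limits, which is what makes the three levels directly comparable.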

  9. Hysteresis and uncertainty in soil water-retention curve parameters

    Science.gov (United States)

    Likos, William J.; Lu, Ning; Godt, Jonathan W.

    2014-01-01

    Accurate estimates of soil hydraulic parameters representing wetting and drying paths are required for predicting hydraulic and mechanical responses in a large number of applications. A comprehensive suite of laboratory experiments was conducted to measure hysteretic soil-water characteristic curves (SWCCs) representing a wide range of soil types. Results were used to quantitatively assess differences and uncertainty in three simplifications frequently adopted to estimate wetting-path SWCC parameters from more easily measured drying curves. They are the following: (1) αw=2αd, (2) nw=nd, and (3) θws=θds, where α, n, and θs are fitting parameters entering van Genuchten’s commonly adopted SWCC model, and the superscripts w and d indicate wetting and drying paths, respectively. The average ratio αw/αd for the data set was 2.24±1.25. Nominally cohesive soils had a lower αw/αd ratio (1.73±0.94) than nominally cohesionless soils (3.14±1.27). The average nw/nd ratio was 1.01±0.11 with no significant dependency on soil type, thus confirming the nw=nd simplification for a wider range of soil types than previously available. Water content at zero suction during wetting (θws) was consistently less than during drying (θds) owing to air entrapment. The θws/θds ratio averaged 0.85±0.10 and was comparable for nominally cohesive (0.87±0.11) and cohesionless (0.81±0.08) soils. Regression statistics are provided to quantitatively account for uncertainty in estimating hysteretic retention curves. Practical consequences are demonstrated for two case studies.
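    A sketch of how the three simplifications can be used in practice to construct a wetting curve from fitted drying-curve parameters follows; the parameter values are invented for illustration and are not from the paper's data set:

```python
import numpy as np

def van_genuchten(psi, alpha, n, theta_r, theta_s):
    """van Genuchten retention model: water content as a function of suction."""
    m = 1.0 - 1.0 / n
    return theta_r + (theta_s - theta_r) / (1.0 + (alpha * psi) ** n) ** m

# drying-curve parameters (illustrative values)
alpha_d, n_d, theta_r, theta_s_d = 0.05, 1.8, 0.05, 0.40

# wetting-curve parameters via the simplifications assessed in the study:
# alpha_w = 2*alpha_d, n_w = n_d, and theta_s_w ~ 0.85*theta_s_d (air entrapment)
alpha_w, n_w, theta_s_w = 2.0 * alpha_d, n_d, 0.85 * theta_s_d

psi = np.logspace(-1, 3, 5)   # suctions from 0.1 to 1000 (e.g. kPa)
drying = van_genuchten(psi, alpha_d, n_d, theta_r, theta_s_d)
wetting = van_genuchten(psi, alpha_w, n_w, theta_r, theta_s_w)
print(np.round(drying, 3), np.round(wetting, 3))
```

    With a larger alpha and a lower saturated water content, the estimated wetting curve sits below the drying curve at every suction, reproducing the hysteresis loop qualitatively; the paper's regression statistics quantify the uncertainty around these estimated parameters.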

  10. Multi-model ensemble hydrologic prediction and uncertainties analysis

    Directory of Open Access Journals (Sweden)

    S. Jiang

    2014-09-01

Modelling uncertainties (i.e. input errors, parameter uncertainties and model structural errors) inevitably exist in hydrological prediction. A lot of recent attention has focused on these, of which input error modelling, parameter optimization and multi-model ensemble strategies are the three most popular methods to address the impacts of modelling uncertainties. In this paper the Xinanjiang model, the Hybrid rainfall-runoff model and the HYMOD model were applied to the Mishui Basin, south China, for daily streamflow ensemble simulation and uncertainty analysis. The three models were first calibrated by two parameter optimization algorithms, namely, the Shuffled Complex Evolution method (SCE-UA) and the Shuffled Complex Evolution Metropolis method (SCEM-UA); next, the input uncertainty was accounted for by introducing a normally distributed error multiplier; then, the simulation sets calculated from the three models were combined by Bayesian model averaging (BMA). The results show that both parameter optimization algorithms generate good streamflow simulations; in particular, the SCEM-UA can quantify parameter uncertainty and give the posterior distribution of the parameters. When the precipitation input uncertainty is considered, the streamflow simulation precision does not improve very much, while the BMA combination not only improves the streamflow prediction precision but also gives quantitative uncertainty bounds for the simulation sets. The prediction interval calculated with SCEM-UA is better than that calculated with SCE-UA. These results suggest that considering the model parameters' uncertainties and performing multi-model ensemble simulations are very practical for streamflow prediction and flood forecasting, generating more precise predictions and more reliable uncertainty bounds.
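    The BMA combination step can be sketched with a standard expectation-maximization (EM) estimate of the model weights (a common BMA formulation; the synthetic data and settings below are invented and not taken from the study):

```python
import numpy as np

rng = np.random.default_rng(7)

# synthetic "observed" streamflow and three competing model simulations,
# each with its own bias and error standard deviation
obs = 100.0 + 10.0 * rng.standard_normal(200)
sims = np.array([obs + b + rng.normal(0.0, s, 200)
                 for s, b in [(5.0, 2.0), (8.0, -3.0), (12.0, 0.0)]])

def bma_weights(sims, obs, iters=200):
    """EM estimate of BMA weights and per-model error variances."""
    m = sims.shape[0]
    w = np.full(m, 1.0 / m)
    var = np.full(m, np.var(obs - sims.mean(axis=0)))
    for _ in range(iters):
        # E-step: each model's responsibility for each observation
        dens = np.exp(-0.5 * (obs - sims) ** 2 / var[:, None])
        dens /= np.sqrt(2.0 * np.pi * var[:, None])
        z = w[:, None] * dens
        z /= z.sum(axis=0, keepdims=True)
        # M-step: re-estimate weights and error variances
        w = z.mean(axis=1)
        var = (z * (obs - sims) ** 2).sum(axis=1) / z.sum(axis=1)
    return w, var

w, var = bma_weights(sims, obs)
print(np.round(w, 3))   # weights favour the lower-error models
```

    The fitted weights and variances define a predictive mixture distribution, from which quantitative uncertainty bounds for the combined simulation can be read off directly.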

  11. Applied research in uncertainty modeling and analysis

    CERN Document Server

    Ayyub, Bilal

    2005-01-01

    Uncertainty has been a concern to engineers, managers, and scientists for many years. For a long time uncertainty has been considered synonymous with random, stochastic, statistic, or probabilistic. Since the early sixties views on uncertainty have become more heterogeneous. In the past forty years numerous tools that model uncertainty, above and beyond statistics, have been proposed by several engineers and scientists. The tool/method to model uncertainty in a specific context should really be chosen by considering the features of the phenomenon under consideration, not independent of what is known about the system and what causes uncertainty. In this fascinating overview of the field, the authors provide broad coverage of uncertainty analysis/modeling and its application. Applied Research in Uncertainty Modeling and Analysis presents the perspectives of various researchers and practitioners on uncertainty analysis and modeling outside their own fields and domain expertise. Rather than focusing explicitly on...

  12. Are models, uncertainty, and dispute resolution compatible?

    Science.gov (United States)

    Anderson, J. D.; Wilson, J. L.

    2013-12-01

Models and their uncertainty often move from an objective use in planning and decision making into the regulatory environment, then sometimes on to dispute resolution through litigation or other legal forums. Through this last transition whatever objectivity the models and uncertainty assessment may have once possessed becomes biased (or more biased) as each party chooses to exaggerate either the goodness of a model, or its worthlessness, depending on which view is in its best interest. If worthlessness is desired, then what was uncertain becomes unknown, or even unknowable. If goodness is desired, then precision and accuracy are often exaggerated and uncertainty, if it is explicitly recognized, encompasses only some parameters or conceptual issues, ignores others, and may minimize the uncertainty that it accounts for. In dispute resolution, how well is the adversarial process able to deal with these biases? The challenge is that they are often cloaked in computer graphics and animations that appear to lend realism to what could be mostly fancy, or even a manufactured outcome. While junk science can be challenged through appropriate motions in federal court, and in most state courts, it is not unusual for biased or even incorrect modeling results, or conclusions based on incorrect results, to be permitted to be presented at trial. Courts allow opinions that are based on a "reasonable degree of scientific certainty," but when that 'certainty' is grossly exaggerated by an expert, one way or the other, how well do the courts determine that someone has stepped over the line? Trials are based on the adversary system of justice, so opposing and often irreconcilable views are commonly allowed, leaving it to the judge or jury to sort out the truth. Can advances in scientific theory and engineering practice, related to both modeling and uncertainty, help address this situation and better ensure that juries and judges see more objective modeling results, or at least see

  13. Inspection Uncertainty and Model Uncertainty Updating for Ship Structures Subjected to Corrosion Deterioration

    Institute of Scientific and Technical Information of China (English)

    LI Dian-qing; ZHANG Sheng-kun

    2004-01-01

    The classical probability theory cannot effectively quantify the parameter uncertainty in probability of detection. Furthermore, the conventional data analytic method and expert judgment method fail to handle the problem of model uncertainty updating with the information from nondestructive inspection. To overcome these disadvantages, a Bayesian approach was proposed to quantify the parameter uncertainty in probability of detection. Furthermore, the formulae of the multiplication factors to measure the statistical uncertainties in the probability of detection following the Weibull distribution were derived. A Bayesian updating method was applied to compute the posterior probabilities of model weights and the posterior probability density functions of distribution parameters of probability of detection. A total probability model method was proposed to analyze the problem of multi-layered model uncertainty updating. This method was then applied to the problem of multi-layered corrosion model uncertainty updating for ship structures. The results indicate that the proposed method is very effective in analyzing the problem of multi-layered model uncertainty updating.

  14. Quantifying uncertainty in LCA-modelling of waste management systems.

    Science.gov (United States)

    Clavreul, Julie; Guyonnet, Dominique; Christensen, Thomas H

    2012-12-01

    Uncertainty analysis in LCA studies has been subject to major progress over the last years. In the context of waste management, various methods have been implemented but a systematic method for uncertainty analysis of waste-LCA studies is lacking. The objective of this paper is (1) to present the sources of uncertainty specifically inherent to waste-LCA studies, (2) to select and apply several methods for uncertainty analysis and (3) to develop a general framework for quantitative uncertainty assessment of LCA of waste management systems. The suggested method is a sequence of four steps combining the selected methods: (Step 1) a sensitivity analysis evaluating the sensitivities of the results with respect to the input uncertainties, (Step 2) an uncertainty propagation providing appropriate tools for representing uncertainties and calculating the overall uncertainty of the model results, (Step 3) an uncertainty contribution analysis quantifying the contribution of each parameter uncertainty to the final uncertainty and (Step 4) as a new approach, a combined sensitivity analysis providing a visualisation of the shift in the ranking of different options due to variations of selected key parameters. This tiered approach optimises the resources available to LCA practitioners by only propagating the most influential uncertainties.
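    Steps 2 and 3 of the suggested framework (uncertainty propagation and contribution analysis) can be sketched as follows, using an invented toy waste-LCA model; the parameter names, values and distributions are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)

# hypothetical waste-LCA model (invented for illustration): net global warming
# potential = incineration emissions minus the credit for recovered energy
def net_gwp(ef_incin, energy_rec, ef_grid, waste=1000.0):
    return waste * ef_incin - waste * energy_rec * ef_grid

# Step 2, uncertainty propagation: Monte Carlo sampling of uncertain inputs
n = 10_000
ef_incin = rng.normal(0.40, 0.05, n)    # kg CO2-eq emitted per kg waste
energy_rec = rng.normal(2.00, 0.30, n)  # kWh recovered per kg waste
ef_grid = rng.normal(0.15, 0.02, n)     # kg CO2-eq displaced per kWh
results = net_gwp(ef_incin, energy_rec, ef_grid)

# Step 3, contribution analysis: vary one parameter at a time, others at means
oat = {
    "ef_incin": net_gwp(ef_incin, 2.00, 0.15),
    "energy_rec": net_gwp(0.40, energy_rec, 0.15),
    "ef_grid": net_gwp(0.40, 2.00, ef_grid),
}
contributions = {k: float(np.var(v) / np.var(results)) for k, v in oat.items()}
print(round(float(results.mean()), 1),
      {k: round(v, 2) for k, v in contributions.items()})
```

    The variance shares identify which parameter uncertainties dominate the overall result, which is exactly the information needed to prioritize data collection before the combined sensitivity analysis of Step 4.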

  15. Uncertainty in a monthly water balance model using the generalized likelihood uncertainty estimation methodology

    Indian Academy of Sciences (India)

    Diego Rivera; Yessica Rivas; Alex Godoy

    2015-02-01

    Hydrological models are simplified representations of natural processes and are subject to errors. Uncertainty bounds are a common way to assess the impact of input or model-architecture uncertainty on model outputs. Different parameter sets can have equally robust goodness-of-fit indicators, a phenomenon known as equifinality. We assessed the outputs of a lumped conceptual hydrological model applied to an agricultural watershed in central Chile under strong interannual variability (coefficient of variability of 25%) using the equifinality concept and uncertainty bounds. The simulation period ran from January 1999 to December 2006. Equifinality and uncertainty bounds from the GLUE methodology (Generalized Likelihood Uncertainty Estimation) were used to identify parameter sets as potential representations of the system. The aim of this paper is to exploit uncertainty bounds to differentiate behavioural parameter sets in a simple hydrological model. We then analyze the presence of equifinality in order to improve the identification of relevant hydrological processes. The water balance model for the Chillan River exhibits, at a first stage, equifinality. However, it was possible to narrow the range of the parameters and eventually identify a set of parameters representing the behaviour of the watershed (a behavioural model) in agreement with observational and soft data (calculation of areal precipitation over the watershed using an isohyetal map). The mean width of the uncertainty bound around the predicted runoff for the simulation period decreased from 50 to 20 m³ s⁻¹ after fixing the parameter controlling the areal precipitation over the watershed. This decrease is equivalent to reducing the ratio between simulated and observed discharge from 5.2 to 2.5. Despite criticisms of the GLUE methodology, such as its lack of statistical formality, it is a useful tool for helping the modeller identify critical parameters.
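
The GLUE procedure itself is simple to sketch: sample parameter sets, retain those whose likelihood measure exceeds a behavioural threshold, and form uncertainty bounds from the behavioural simulations. A toy one-parameter example (model, data, and threshold are all illustrative):

```python
import random

# Hedged sketch of GLUE: Monte Carlo sampling, a behavioural threshold on
# a likelihood measure (here Nash-Sutcliffe efficiency), and prediction
# bounds from the retained behavioural parameter sets.
random.seed(42)
obs = [2.0, 4.1, 5.9, 8.2]          # observed runoff (arbitrary units)
inputs = [1, 2, 3, 4]               # forcing

def simulate(k):                     # toy model: runoff = k * forcing
    return [k * x for x in inputs]

def nse(sim):                        # Nash-Sutcliffe efficiency
    mean_obs = sum(obs) / len(obs)
    num = sum((s - o) ** 2 for s, o in zip(sim, obs))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1 - num / den

candidates = [random.uniform(0.5, 4.0) for _ in range(5000)]
behavioural = [k for k in candidates if nse(simulate(k)) > 0.9]

# Uncertainty bounds: range of behavioural predictions at each time step.
bounds = [(min(k * x for k in behavioural), max(k * x for k in behavioural))
          for x in inputs]
```

Equifinality shows up here as the whole band of behavioural k values; narrowing the parameter range with soft data, as in the abstract, directly narrows `bounds`.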

  16. Modelling of Transport Projects Uncertainties

    DEFF Research Database (Denmark)

    Salling, Kim Bang; Leleur, Steen

    2009-01-01

    This paper proposes a new way of handling the uncertainties present in transport decision making based on infrastructure appraisals. The paper suggests to combine the principle of Optimism Bias, which depicts the historical tendency of overestimating transport related benefits and underestimating...... investment costs, with a quantitative risk analysis based on Monte Carlo simulation and to make use of a set of exploratory scenarios. The analysis is carried out by using the CBA-DK model representing the Danish standard approach to socio-economic cost-benefit analysis. Specifically, the paper proposes......-based graphs which function as risk-related decision support for the appraised transport infrastructure project....
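
The combination of an Optimism Bias uplift with Monte Carlo risk analysis can be sketched as follows; the uplift factor, distributions, and monetary figures are hypothetical placeholders, not values from the CBA-DK model:

```python
import random

# Hedged sketch: Monte Carlo risk analysis of a benefit-cost ratio (BCR)
# with an Optimism Bias uplift applied to the construction-cost estimate.
# All figures are illustrative, not from the CBA-DK model.
random.seed(7)
OPTIMISM_BIAS_UPLIFT = 1.45   # hypothetical cost uplift factor

def simulate_bcr():
    benefits = random.gauss(900, 150)             # present-value benefits, MDKK
    base_cost = random.triangular(500, 800, 600)  # low, high, mode
    cost = base_cost * OPTIMISM_BIAS_UPLIFT
    return benefits / cost

bcrs = sorted(simulate_bcr() for _ in range(10000))
p_viable = sum(b > 1.0 for b in bcrs) / len(bcrs)   # P(BCR > 1)
ci90 = (bcrs[len(bcrs) // 20], bcrs[-len(bcrs) // 20])  # ~90% interval
```

The resulting distribution of BCRs, rather than a single deterministic value, is what feeds the risk-related decision-support graphs the paper proposes; exploratory scenarios would rerun this with different assumption sets.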

  17. Parameter estimation and uncertainty for gravitational waves from binary black holes

    Science.gov (United States)

    Berry, Christopher; LIGO Scientific Collaboration; Virgo Collaboration

    2016-03-01

    Binary black holes are among the most promising sources of gravitational waves that could be observed by Advanced LIGO. To accurately infer the parameters of an astrophysical signal, it is necessary to have a reliable model of the gravitational waveform. Uncertainty in the waveform leads to uncertainty in the measured parameters. For loud signals, this theoretical uncertainty could dominate the statistical uncertainty and become the primary source of error in gravitational-wave astronomy. However, we expect the first candidate events to be closer to the detection threshold. We look at how parameter estimation would be influenced by the use of different waveform models for a binary black-hole signal near the detection threshold, and how this can be folded into a Bayesian analysis.

  18. Realising the Uncertainty Enabled Model Web

    Science.gov (United States)

    Cornford, D.; Bastin, L.; Pebesma, E. J.; Williams, M.; Stasch, C.; Jones, R.; Gerharz, L.

    2012-12-01

    conversion between uncertainty types, and between the spatial / temporal support of service inputs / outputs. Finally we describe the tools being generated within the UncertWeb project, considering three main aspects: i) Elicitation of uncertainties on model inputs. We are developing tools to enable domain experts to provide judgements about input uncertainties from UncertWeb model components (e.g. parameters in meteorological models) which allow panels of experts to engage in the process and reach a consensus view on the current knowledge / beliefs about that parameter or variable. We are developing systems for continuous and categorical variables as well as stationary spatial fields. ii) Visualisation of the resulting uncertain outputs from the end of the workflow, but also at intermediate steps. At this point we have prototype implementations driven by the requirements from the use cases that motivate UncertWeb. iii) Sensitivity and uncertainty analysis on model outputs. Here we show the design of the overall system we are developing, including the deployment of an emulator framework to allow computationally efficient approaches. We conclude with a summary of the open issues and remaining challenges we are facing in UncertWeb, and provide a brief overview of how we plan to tackle these.

  19. Uncertainty of Doppler reactivity worth due to uncertainties of JENDL-3.2 resonance parameters

    Energy Technology Data Exchange (ETDEWEB)

    Zukeran, Atsushi [Hitachi Ltd., Hitachi, Ibaraki (Japan). Power and Industrial System R and D Div.; Hanaki, Hiroshi; Nakagawa, Tuneo; Shibata, Keiichi; Ishikawa, Makoto

    1998-03-01

    An analytical formula for the Resonance Self-shielding Factor (f-factor) is derived from the resonance integral (J-function) based on the NR approximation, and an analytical expression for the Doppler reactivity worth (ρ) is also obtained using this result. Uncertainties of the f-factor and Doppler reactivity worth are evaluated on the basis of sensitivity coefficients to the resonance parameters. The uncertainty of the Doppler reactivity worth at 487 K is about 4% for the PNC Large Fast Breeder Reactor. (author)

  20. Modelling of Transport Projects Uncertainties

    DEFF Research Database (Denmark)

    Salling, Kim Bang; Leleur, Steen

    2009-01-01

    This paper proposes a new way of handling the uncertainties present in transport decision making based on infrastructure appraisals. The paper suggests to combine the principle of Optimism Bias, which depicts the historical tendency of overestimating transport related benefits and underestimating...... investment costs, with a quantitative risk analysis based on Monte Carlo simulation and to make use of a set of exploratory scenarios. The analysis is carried out by using the CBA-DK model representing the Danish standard approach to socio-economic cost-benefit analysis. Specifically, the paper proposes...... to supplement Optimism Bias and the associated Reference Class Forecasting (RCF) technique with a new technique that makes use of a scenario-grid. We tentatively introduce and refer to this as Reference Scenario Forecasting (RSF). The final RSF output from the CBA-DK model consists of a set of scenario...

  1. Development of Property Models with Uncertainty Estimate for Process Design under Uncertainty

    DEFF Research Database (Denmark)

    Hukkerikar, Amol; Sarup, Bent; Abildskov, Jens

    the results of uncertainty analysis to predict the uncertainties in process design. For parameter estimation, large data-sets of experimentally measured property values for a wide range of pure compounds are taken from the CAPEC database. Classical frequentist approach i.e., least square method is adopted...... parameter, octanol/water partition coefficient, aqueous solubility, acentric factor, and liquid molar volume at 298 K. The performance of property models for these properties with the revised set of model parameters is highlighted through a set of compounds not considered in the regression step...... sensitive properties for each unit operation are also identified. This analysis can be used to reduce the uncertainties in property estimates for the properties of critical importance (by performing additional experiments to get better experimental data and better model parameter values). Thus...

  2. Solar Neutrino Data, Solar Model Uncertainties and Neutrino Oscillations

    CERN Document Server

    Krauss, L M; White, M; Krauss, Lawrence M.; Gates, Evalyn; White, Martin

    1993-01-01

    We incorporate all existing solar neutrino flux measurements and take solar model flux uncertainties into account in deriving global fits to parameter space for the MSW and vacuum solutions of the solar neutrino problem.

  3. An educational model for ensemble streamflow simulation and uncertainty analysis

    National Research Council Canada - National Science Library

    AghaKouchak, A; Nakhjiri, N; Habib, E

    2013-01-01

    ...) are interconnected. The educational toolbox includes a MATLAB Graphical User Interface (GUI) and an ensemble simulation scheme that can be used for teaching uncertainty analysis, parameter estimation, ensemble simulation and model sensitivity...

  4. Solar Neutrino Data, Solar Model Uncertainties and Neutrino Oscillations

    OpenAIRE

    1992-01-01

    We incorporate all existing solar neutrino flux measurements and take solar model flux uncertainties into account in deriving global fits to parameter space for the MSW and vacuum solutions of the solar neutrino problem.

  5. Uncertainty modelling of atmospheric dispersion by stochastic response surface method under aleatory and epistemic uncertainties

    Indian Academy of Sciences (India)

    Rituparna Chutia; Supahi Mahanta; D Datta

    2014-04-01

    The parameters associated with an environmental dispersion model may include different kinds of variability, imprecision and uncertainty. More often than not, available information is interpreted in a probabilistic sense. Probability theory is a well-established theory for measuring such variability. However, not all available information, data or model parameters affected by variability, imprecision and uncertainty can be handled by traditional probability theory. Uncertainty or imprecision may occur due to incomplete information or data, measurement error, or data obtained from expert judgement or subjective interpretation of available data or information. Model parameters may thus be affected by subjective uncertainty, which traditional probability theory is inappropriate to represent. Possibility theory is used as a tool to describe parameters with insufficient knowledge. Based on polynomial chaos expansion, the stochastic response surface method is utilized in this article for uncertainty propagation of an atmospheric dispersion model under both probabilistic and possibilistic information. The proposed method is demonstrated through a hypothetical case study of atmospheric dispersion.
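
In place of the full polynomial-chaos machinery, the hybrid probabilistic/possibilistic idea can be conveyed with a minimal Monte Carlo sketch: a probabilistic input is sampled, while an imprecisely known input is swept over its interval endpoints, yielding an envelope of output distributions. The plume geometry and all parameter values below are illustrative assumptions:

```python
import math
import random

# Hedged sketch of hybrid (probabilistic + possibilistic) propagation for a
# toy ground-level Gaussian-plume concentration: wind speed is probabilistic
# (lognormal), while the dispersion coefficient is known only as an interval
# and is swept over its endpoints. All values are illustrative.
random.seed(0)

def concentration(Q, u, sigma_y, sigma_z, H=50.0):
    """Ground-level centreline concentration with full ground reflection."""
    return (Q / (2 * math.pi * u * sigma_y * sigma_z)
            * 2 * math.exp(-H**2 / (2 * sigma_z**2)))

Q = 10.0                       # source strength (g/s), hypothetical
sigma_interval = (20.0, 40.0)  # imprecise dispersion coefficient (m)

envelopes = []                 # one (5th, 95th) percentile band per endpoint
for sigma in sigma_interval:
    samples = sorted(concentration(Q, random.lognormvariate(1.5, 0.3),
                                   sigma, sigma)
                     for _ in range(5000))
    envelopes.append((samples[250], samples[-250]))
```

The spread between the two envelopes reflects the possibilistic (interval) uncertainty, while the width of each envelope reflects the probabilistic wind-speed variability.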

  6. Where does the uncertainty come from? Attributing Uncertainty in Conceptual Hydrologic Modelling

    Science.gov (United States)

    Abu Shoaib, S.; Marshall, L. A.; Sharma, A.

    2015-12-01

    Defining an appropriate forecasting model is a key phase in water resources planning and design. Quantification of uncertainty is an important step in the development and application of hydrologic models. In this study, we examine the dependency of hydrologic model uncertainty on the observed model inputs, defined model structure, parameter optimization identifiability and identified likelihood. We present here a new uncertainty metric, the Quantile Flow Deviation or QFD, to evaluate the relative uncertainty due to each of these sources under a range of catchment conditions. Through the metric, we may identify the potential spectrum of uncertainty and variability in model simulations. The QFD assesses uncertainty by estimating the deviation in flows at a given quantile across a range of scenarios. By using a quantile based metric, the change in uncertainty across individual percentiles can be assessed, thereby allowing uncertainty to be expressed as a function of time. The QFD method can be disaggregated to examine any part of the modelling process including the selection of certain model subroutines or forcing data. Case study results (including catchments in Australia and USA) suggest that model structure selection is vital irrespective of the flow percentile of interest or the catchment being studied. Examining the QFD across various quantiles additionally demonstrates that lower yielding catchments may have greater variation due to selected model structures. By incorporating multiple model structures, it is possible to assess (i) the relative importance of various sources of uncertainty, (ii) how these vary with the change in catchment location or hydrologic regime; and (iii) the impact of the length of available observations in uncertainty quantification.

  7. Uncertainty in Regional Air Quality Modeling

    Science.gov (United States)

    Digar, Antara

    Effective pollution mitigation is the key to successful air quality management. Although states invest millions of dollars to predict future air quality, the regulatory modeling and analysis process to inform pollution control strategy remains uncertain. Traditionally, deterministic ‘bright-line’ tests are applied to evaluate the sufficiency of a control strategy to attain an air quality standard. A critical part of regulatory attainment demonstration is the prediction of future pollutant levels using photochemical air quality models. However, because models are uncertain, they yield a false sense of precision, as if the pollutant response to emission controls were perfectly known, and may eventually mislead the selection of control policies. These uncertainties in turn affect the health impact assessment of air pollution control strategies. This thesis explores beyond the conventional practice of deterministic attainment demonstration and presents novel approaches that yield probabilistic representations of pollutant response to emission controls by accounting for uncertainties in regional air quality planning. Computationally efficient methods are developed and validated to characterize uncertainty in the prediction of secondary pollutant (ozone and particulate matter) sensitivities to precursor emissions in the presence of uncertainties in model assumptions and input parameters. We also introduce impact factors that enable identification of model inputs and scenarios that strongly influence pollutant concentrations and sensitivity to precursor emissions. We demonstrate how these probabilistic approaches could be applied to determine the likelihood that any control measure will yield regulatory attainment, or could be extended to evaluate probabilistic health benefits of emission controls, considering uncertainties in both air quality models and epidemiological concentration-response relationships. Finally, ground-level observations for pollutant (ozone) and precursor

  8. Parameter uncertainty analysis of non-point source pollution from different land use types.

    Science.gov (United States)

    Shen, Zhen-yao; Hong, Qian; Yu, Hong; Niu, Jun-feng

    2010-03-15

    Land use type is one of the most important factors that affect the uncertainty in non-point source (NPS) pollution simulation. In this study, seventeen sensitive parameters were screened from the Soil and Water Assessment Tool (SWAT) model for parameter uncertainty analysis for different land use types in the Daning River Watershed of the Three Gorges Reservoir area, China. First-Order Error Analysis (FOEA) method was adopted to analyze the effect of parameter uncertainty on model outputs under three types of land use, namely, plantation, forest and grassland. The model outputs selected in this study consisted of runoff, sediment yield, organic nitrogen (N), and total phosphorus (TP). The results indicated that the uncertainty conferred by the parameters differed among the three land use types. In forest and grassland, the parameter uncertainty in NPS pollution was primarily associated with runoff processes, but in plantation, the main uncertain parameters were related to runoff process and soil properties. Taken together, the study suggested that adjusting the structure of land use and controlling fertilizer use are helpful methods to control the NPS pollution in the Daning River Watershed.
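
First-Order Error Analysis approximates the output variance from parameter variances and local sensitivities, Var(Y) ≈ Σᵢ (∂Y/∂pᵢ)² Var(pᵢ). A minimal sketch with a hypothetical two-parameter runoff model (not the SWAT model or its parameters):

```python
# Hedged sketch of First-Order Error Analysis (FOEA): output variance is
# approximated from parameter variances and finite-difference sensitivities.
# The runoff model and parameter statistics are illustrative placeholders.

def runoff(cn, k):               # hypothetical 2-parameter model
    return 0.5 * cn + 30.0 / k

def foea(params, f, h=1e-5):
    """params: dict name -> (mean, std). Returns variance shares, total var."""
    means = {n: m for n, (m, s) in params.items()}
    y0 = f(**means)
    contribs = {}
    for name, (m, s) in params.items():
        shifted = dict(means, **{name: m + h})
        dydp = (f(**shifted) - y0) / h          # finite-difference sensitivity
        contribs[name] = (dydp * s) ** 2        # first-order variance term
    total_var = sum(contribs.values())
    return {n: v / total_var for n, v in contribs.items()}, total_var

shares, var_y = foea({"cn": (70.0, 5.0), "k": (2.0, 0.4)}, runoff)
```

The variance shares directly support the abstract's kind of conclusion: they show which process (here, the recession parameter k) contributes most uncertainty for a given land use, and hence where calibration effort pays off.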

  9. Uncertainty propagation in up-scaling of subsoil parameters, no fixed distributions allowed

    NARCIS (Netherlands)

    Lourens, Aris; van Geer, Frans C.

    2013-01-01

    When creating numerical groundwater models, the structure and properties of the subsoil are indispensable information. Like all model data, these data are subject to uncertainty. When building a groundwater model, the available geological information, like the geological structure and parameter values, ha

  10. Uncertainty "escalation" and use of machine learning to forecast residual and data model uncertainties

    Science.gov (United States)

    Solomatine, Dimitri

    2016-04-01

    When speaking about model uncertainty, many authors implicitly mean the data uncertainty (mainly in parameters or inputs), which is described probabilistically by distributions. Often, however, it is useful to look at the residual uncertainty as well. It is hence reasonable to classify the main approaches to uncertainty analysis with respect to the two main types of model uncertainty that can be distinguished: A. The residual uncertainty of models. In this case the model parameters and/or model inputs are considered to be fixed (deterministic), i.e. the model is considered to be optimal (calibrated) and deterministic. Model error is considered as the manifestation of uncertainty. If there is enough past data about the model errors (i.e. its uncertainty), it is possible to build a statistical or machine learning model of uncertainty trained on this data. The following methods can be mentioned: (a) the quantile regression (QR) method of Koenker and Basset, in which linear regression is used to build predictive models for distribution quantiles [1]; (b) a more recent approach that takes into account the input variables influencing such uncertainty and uses more advanced (non-linear) machine learning methods (neural networks, model trees, etc.) - the UNEEC method [2,3,7]; (c) the even more recent DUBRAUE method (Dynamic Uncertainty Model By Regression on Absolute Error), an autoregressive model of model residuals (it corrects the model residual first and then carries out the uncertainty prediction by an autoregressive statistical model) [5]. B. The data uncertainty (parametric and/or input) - in this case we study the propagation of uncertainty (typically represented probabilistically) from parameters or inputs to the model outputs. In the case of simple functions representing models, analytical approaches can be used, or approximation methods (e.g., the first-order second-moment method). However, for real complex non-linear models implemented in software there is no other choice except using
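
The building block of approach A(a) is the pinball (tilted absolute) loss: minimising it over a constant recovers the empirical tau-quantile of the model errors, and quantile regression extends this by making the quantile a function of the inputs. A toy verification of the loss-quantile link (the error distribution is a stand-in, not data from any cited study):

```python
import random

# Hedged sketch: the pinball loss underlying quantile regression. Its
# minimiser over a constant is the empirical tau-quantile of the residuals.
random.seed(3)

def pinball(residual, tau):
    return tau * residual if residual >= 0 else (tau - 1) * residual

def fit_constant_quantile(errors, tau, grid_steps=500):
    """Grid-search the constant q minimising the total pinball loss."""
    lo, hi = min(errors), max(errors)
    grid = [lo + (hi - lo) * i / grid_steps for i in range(grid_steps + 1)]
    return min(grid, key=lambda q: sum(pinball(e - q, tau) for e in errors))

# Toy model errors: standard normal, as if from a calibrated model.
errors = [random.gauss(0.0, 1.0) for _ in range(2000)]
q05 = fit_constant_quantile(errors, 0.05)   # lower edge of a 90% band
q95 = fit_constant_quantile(errors, 0.95)   # upper edge of a 90% band
```

Replacing the constant with a regression on input variables gives QR proper; UNEEC replaces the linear form with non-linear learners, and DUBRAUE adds an autoregressive correction of the residual first.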

  11. Determining Best Estimates and Uncertainties in Cloud Microphysical Parameters from ARM Field Data: Implications for Models, Retrieval Schemes and Aerosol-Cloud-Radiation Interactions

    Energy Technology Data Exchange (ETDEWEB)

    McFarquhar, Greg [Univ. of Illinois, Urbana, IL (United States)

    2015-12-28

    We proposed to analyze in-situ cloud data collected during ARM/ASR field campaigns to create databases of cloud microphysical properties and their uncertainties, as needed for the development of improved cloud parameterizations for models and remote sensing retrievals, and for evaluation of model simulations and retrievals. In particular, we proposed to analyze data collected over the Southern Great Plains (SGP) during the Mid-latitude Continental Convective Clouds Experiment (MC3E), the Storm Peak Laboratory Cloud Property Validation Experiment (STORMVEX), the Small Particles in Cirrus (SPARTICUS) Experiment and the Routine AAF Clouds with Low Optical Water Depths (CLOWD) Optical Radiative Observations (RACORO) field campaign; over the North Slope of Alaska during the Indirect and Semi-Direct Aerosol Campaign (ISDAC) and the Mixed-Phase Arctic Cloud Experiment (M-PACE); and over the Tropical Western Pacific (TWP) during the Tropical Warm Pool International Cloud Experiment (TWP-ICE), to meet the following three objectives: derive statistical databases of single ice particle properties (aspect ratio AR, dominant habit, mass, projected area) and distributions of ice crystals (size distributions SDs, mass-dimension m-D and area-dimension A-D relations, mass-weighted fall speeds, single-scattering properties, total concentrations N, ice mass contents IWC), complete with uncertainty estimates; assess the processes by which aerosols modulate cloud properties in arctic stratus and mid-latitude cumuli, and quantify aerosol influence in the context of varying meteorological and surface conditions; and determine how ice cloud microphysical, single-scattering and fall-out properties, and the contributions of small ice crystals to such properties, vary according to location, environment, surface, meteorological and aerosol conditions, and develop parameterizations of such effects. In this report we describe the accomplishments made on all three research objectives.

  12. Lumped-parameter models

    Energy Technology Data Exchange (ETDEWEB)

    Ibsen, Lars Bo; Liingaard, M.

    2006-12-15

    A lumped-parameter model represents the frequency dependent soil-structure interaction of a massless foundation placed on or embedded into an unbounded soil domain. In this technical report the steps of establishing a lumped-parameter model are presented. Following sections are included in this report: Static and dynamic formulation, Simple lumped-parameter models and Advanced lumped-parameter models. (au)

  13. Uncertainties in environmental radiological assessment models and their implications

    Energy Technology Data Exchange (ETDEWEB)

    Hoffman, F.O.; Miller, C.W.

    1983-01-01

    Environmental radiological assessments rely heavily on the use of mathematical models. The predictions of these models are inherently uncertain because these models are inexact representations of real systems. The major sources of this uncertainty are related to biases in model formulation and parameter estimation. The best approach for estimating the actual extent of over- or underprediction is model validation, a procedure that requires testing over the range of the intended realm of model application. Other approaches discussed are the use of screening procedures, sensitivity and stochastic analyses, and model comparison. The magnitude of uncertainty in model predictions is a function of the questions asked of the model and the specific radionuclides and exposure pathways of dominant importance. Estimates are made of the relative magnitude of uncertainty for situations requiring predictions of individual and collective risks for both chronic and acute releases of radionuclides. It is concluded that models developed as research tools should be distinguished from models developed for assessment applications. Furthermore, increased model complexity does not necessarily guarantee increased accuracy. To improve the realism of assessment modeling, stochastic procedures are recommended that translate uncertain parameter estimates into a distribution of predicted values. These procedures also permit the importance of model parameters to be ranked according to their relative contribution to the overall predicted uncertainty. Although confidence in model predictions can be improved through site-specific parameter estimation and increased model validation, risk factors and internal dosimetry models will probably remain important contributors to the amount of uncertainty that is irreducible.

  14. Modelling of Transport Projects Uncertainties

    DEFF Research Database (Denmark)

    Salling, Kim Bang; Leleur, Steen

    2012-01-01

    This paper proposes a new way of handling the uncertainties present in transport decision making based on infrastructure appraisals. The paper suggests to combine the principle of Optimism Bias, which depicts the historical tendency of overestimating transport related benefits and underestimating...... investment costs, with a quantitative risk analysis based on Monte Carlo simulation and to make use of a set of exploratory scenarios. The analysis is carried out by using the CBA-DK model representing the Danish standard approach to socio-economic cost-benefit analysis. Specifically, the paper proposes......-based graphs which functions as risk-related decision support for the appraised transport infrastructure project. The presentation of RSF is demonstrated by using an appraisal case concerning a new airfield in the capital of Greenland, Nuuk....

  15. Impacts of biological parameterization, initial conditions, and environmental forcing on parameter sensitivity and uncertainty in a marine ecosystem model for the Bering Sea

    Science.gov (United States)

    Gibson, G. A.; Spitz, Y. H.

    2011-11-01

    We use a series of Monte Carlo experiments to explore simultaneously the sensitivity of the BEST marine ecosystem model to environmental forcing, initial conditions, and biological parameterizations. Twenty model output variables were examined for sensitivity. The true sensitivity of biological and environmental parameters becomes apparent only when each parameter is allowed to vary within its realistic range. Many biological parameters were important only to their corresponding variable, but several biological parameters, e.g., microzooplankton grazing and small phytoplankton doubling rate, were consistently very important to several output variables. Assuming realistic biological and environmental variability, the standard deviation about simulated mean mesozooplankton biomass ranged from 1 to 14 mg C m⁻³ during the year. Annual primary productivity was not strongly correlated with temperature but was positively correlated with initial nitrate and light. Secondary productivity was positively correlated with primary productivity and negatively correlated with spring bloom timing. Mesozooplankton productivity was not correlated with water temperature, but a shift towards a system in which smaller zooplankton undertake a greater proportion of the secondary production as the water temperature increases appears likely. This approach to incorporating environmental variability within a sensitivity analysis could be extended to any ecosystem model to gain confidence in climate-driven ecosystem predictions.

  16. Inversion analysis of estimating interannual variability and its uncertainties in biotic and abiotic parameters of a parsimonious physiologically based model after wind disturbance

    Science.gov (United States)

    Toda, M.; Yokozawa, M.; Richardson, A. D.; Kohyama, T.

    2011-12-01

    The effects of wind disturbance on interannual variability in ecosystem CO2 exchange have been assessed in two forests in northern Japan, i.e., a young, even-aged, monocultured, deciduous forest and an uneven-aged mixed forest of evergreen and deciduous trees (some over 200 years old), using eddy covariance (EC) measurements during 2004-2008. The EC measurements indicated that the photosynthetic recovery of trees after a huge typhoon in early September 2004 enhanced the annual carbon uptake of both forests, due to changes in the physiological response of tree leaves during their growth stages. However, little has been resolved about which biotic and abiotic factors regulated the interannual variability in heat, water and carbon exchange between the atmosphere and the forests. In recent years, inverse modeling analysis has been utilized as a powerful tool to estimate the biotic and abiotic parameters of a parsimonious physiologically based model that affect heat, water and CO2 exchange between the atmosphere and forests. We conducted a Bayesian inverse analysis of such a model using the EC measurements. Preliminary results showed that the model-derived NEE values were consistent with the observed ones on an hourly basis, using parameters optimized by Bayesian inversion. In the presentation, we examine interannual variability in the biotic and abiotic parameters related to heat, water and carbon exchange between the atmosphere and the forests after the typhoon disturbance.

  17. Modelling theoretical uncertainties in phenomenological analyses for particle physics

    CERN Document Server

    Charles, Jérôme; Niess, Valentin; Silva, Luiz Vale

    2016-01-01

    The determination of the fundamental parameters of the Standard Model (and its extensions) is often limited by the presence of statistical and theoretical uncertainties. We present several models for the latter uncertainties (random, nuisance, external) in the frequentist framework, and we derive the corresponding $p$-values. In the case of the nuisance approach where theoretical uncertainties are modeled as biases, we highlight the important, but arbitrary, issue of the range of variation chosen for the bias parameters. We introduce the concept of adaptive $p$-value, which is obtained by adjusting the range of variation for the bias according to the significance considered, and which allows us to tackle metrology and exclusion tests with a single and well-defined unified tool, which exhibits interesting frequentist properties. We discuss how the determination of fundamental parameters is impacted by the model chosen for theoretical uncertainties, illustrating several issues with examples from quark flavour p...

  18. Modeling theoretical uncertainties in phenomenological analyses for particle physics

    Energy Technology Data Exchange (ETDEWEB)

    Charles, Jerome [CNRS, Aix-Marseille Univ, Universite de Toulon, CPT UMR 7332, Marseille Cedex 9 (France); Descotes-Genon, Sebastien [CNRS, Univ. Paris-Sud, Universite Paris-Saclay, Laboratoire de Physique Theorique (UMR 8627), Orsay Cedex (France); Niess, Valentin [CNRS/IN2P3, UMR 6533, Laboratoire de Physique Corpusculaire, Aubiere Cedex (France); Silva, Luiz Vale [CNRS, Univ. Paris-Sud, Universite Paris-Saclay, Laboratoire de Physique Theorique (UMR 8627), Orsay Cedex (France); Univ. Paris-Sud, CNRS/IN2P3, Universite Paris-Saclay, Groupe de Physique Theorique, Institut de Physique Nucleaire, Orsay Cedex (France); J. Stefan Institute, Jamova 39, P. O. Box 3000, Ljubljana (Slovenia)

    2017-04-15

    The determination of the fundamental parameters of the Standard Model (and its extensions) is often limited by the presence of statistical and theoretical uncertainties. We present several models for the latter uncertainties (random, nuisance, external) in the frequentist framework, and we derive the corresponding p values. In the case of the nuisance approach where theoretical uncertainties are modeled as biases, we highlight the important, but arbitrary, issue of the range of variation chosen for the bias parameters. We introduce the concept of adaptive p value, which is obtained by adjusting the range of variation for the bias according to the significance considered, and which allows us to tackle metrology and exclusion tests with a single and well-defined unified tool, which exhibits interesting frequentist properties. We discuss how the determination of fundamental parameters is impacted by the model chosen for theoretical uncertainties, illustrating several issues with examples from quark flavor physics. (orig.)

  19. The effects of ionic strength and organic matter on virus inactivation at low temperatures: general likelihood uncertainty estimation (GLUE) as an alternative to least-squares parameter optimization for the fitting of virus inactivation models

    Science.gov (United States)

    Mayotte, Jean-Marc; Grabs, Thomas; Sutliff-Johansson, Stacy; Bishop, Kevin

    2017-06-01

    This study examined how the inactivation of bacteriophage MS2 in water was affected by ionic strength (IS) and dissolved organic carbon (DOC) using static batch inactivation experiments at 4 °C conducted over a period of 2 months. Experimental conditions were characteristic of an operational managed aquifer recharge (MAR) scheme in Uppsala, Sweden. Experimental data were fit with constant and time-dependent inactivation models using two methods: (1) traditional linear and nonlinear least-squares techniques; and (2) a Monte-Carlo based parameter estimation technique called generalized likelihood uncertainty estimation (GLUE). The least-squares and GLUE methodologies gave very similar estimates of the model parameters and their uncertainty. This demonstrates that GLUE can be used as a viable alternative to traditional least-squares parameter estimation techniques for fitting of virus inactivation models. Results showed a slight increase in constant inactivation rates following an increase in the DOC concentrations, suggesting that the presence of organic carbon enhanced the inactivation of MS2. The experiment with a high IS and a low DOC was the only experiment which showed that MS2 inactivation may have been time-dependent. However, results from the GLUE methodology indicated that models of constant inactivation were able to describe all of the experiments. This suggested that inactivation time-series longer than 2 months were needed in order to provide concrete conclusions regarding the time-dependency of MS2 inactivation at 4 °C under these experimental conditions.
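
    The GLUE procedure used above can be sketched in a few lines: sample parameters from a prior range, score each simulation with an informal likelihood, retain "behavioural" sets above a threshold, and form weighted prediction bounds. The first-order inactivation model, the synthetic data, and the 0.9 NSE threshold below are illustrative assumptions, not the study's actual setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "observations" from a first-order inactivation model
# C(t) = C0 * exp(-k t); k_true and the noise level are illustrative only.
k_true = 0.08
t = np.arange(0.0, 60.0, 5.0)
obs = np.exp(-k_true * t) * np.exp(rng.normal(0.0, 0.05, t.size))

# 1) Sample candidate rate constants from a broad prior range
k_samples = rng.uniform(0.0, 0.3, 5000)

# 2) Informal likelihood: Nash-Sutcliffe efficiency of each simulation
def nse(sim, obs):
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

scores = np.array([nse(np.exp(-k * t), obs) for k in k_samples])

# 3) Keep "behavioural" parameter sets above a threshold, weight by score
behavioural = scores > 0.9
k_beh = k_samples[behavioural]
weights = scores[behavioural] / scores[behavioural].sum()

# 4) Weighted quantiles of the behavioural simulations give 90% bounds
sims = np.exp(-np.outer(k_beh, t))
lower, upper = [], []
for j in range(t.size):
    idx = np.argsort(sims[:, j])
    cw = np.cumsum(weights[idx])
    lower.append(sims[idx, j][np.searchsorted(cw, 0.05)])
    upper.append(sims[idx, j][np.searchsorted(cw, 0.95)])

print(f"behavioural sets: {behavioural.sum()}, median k: {np.median(k_beh):.3f}")
```

    Because GLUE conditions on an informal likelihood and a subjective threshold, the bounds quantify parameter uncertainty relative to those choices rather than a formal error model.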

  20. Bayesian Uncertainty Analyses Via Deterministic Model

    Science.gov (United States)

    Krzysztofowicz, R.

    2001-05-01

    Rational decision-making requires that the total uncertainty about a variate of interest (a predictand) be quantified in terms of a probability distribution, conditional on all available information and knowledge. Suppose the state-of-knowledge is embodied in a deterministic model, which is imperfect and outputs only an estimate of the predictand. Fundamentals are presented of three Bayesian approaches to producing a probability distribution of the predictand via any deterministic model. The Bayesian Processor of Output (BPO) quantifies the total uncertainty in terms of a posterior distribution, conditional on model output. The Bayesian Processor of Ensemble (BPE) quantifies the total uncertainty in terms of a posterior distribution, conditional on an ensemble of model output. The Bayesian Forecasting System (BFS) decomposes the total uncertainty into input uncertainty and model uncertainty, which are characterized independently and then integrated into a predictive distribution.
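
    The BPO idea can be illustrated with a conjugate Gaussian sketch: learn the relationship between past model outputs and observed predictands, then combine a new output with the prior to obtain a posterior distribution. The data, the linear-Gaussian likelihood, and the `bpo_posterior` helper are hypothetical simplifications, not the operational formulation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Past pairs (deterministic model output x, observed predictand w);
# synthetic data: the model is biased and noisy on purpose.
w = rng.normal(10.0, 2.0, 500)                 # historical predictand
x = 0.8 * w + 1.5 + rng.normal(0.0, 1.0, 500)  # imperfect model output

# Gaussian likelihood x | w ~ N(a*w + b, sigma2), fit by regression
a, b = np.polyfit(w, x, 1)
sigma2 = np.var(x - (a * w + b))

# Prior on the predictand (e.g. from the historical record)
m0, s0_2 = w.mean(), w.var()

def bpo_posterior(x_new):
    """Normal posterior of w given one new model output x_new."""
    prec = 1.0 / s0_2 + a * a / sigma2           # posterior precision
    mean = (m0 / s0_2 + a * (x_new - b) / sigma2) / prec
    return mean, 1.0 / prec                      # posterior mean, variance

mean, var = bpo_posterior(x_new=9.0)
print(f"posterior mean {mean:.2f}, std {var ** 0.5:.2f}")
```

    The posterior variance is always smaller than the prior variance, reflecting the information contributed by the (imperfect) model output.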

  1. Wastewater treatment modelling: dealing with uncertainties

    DEFF Research Database (Denmark)

    Belia, E.; Amerlinck, Y.; Benedetti, L.;

    2009-01-01

    This paper serves as a problem statement of the issues surrounding uncertainty in wastewater treatment modelling. The paper proposes a structure for identifying the sources of uncertainty introduced during each step of an engineering project concerned with model-based design or optimisation...... of a wastewater treatment system. It briefly references the methods currently used to evaluate prediction accuracy and uncertainty and discusses the relevance of uncertainty evaluations in model applications. The paper aims to raise awareness and initiate a comprehensive discussion among professionals on model...

  2. Possibilistic uncertainty analysis of a conceptual model of snowmelt runoff

    Directory of Open Access Journals (Sweden)

    A. P. Jacquin

    2010-08-01

    Full Text Available This study presents the analysis of predictive uncertainty of a conceptual type snowmelt runoff model. The method applied uses possibilistic rather than probabilistic calculus for the evaluation of predictive uncertainty. Possibility theory is an information theory meant to model uncertainties caused by imprecise or incomplete knowledge about a real system rather than by randomness. A snow dominated catchment in the Chilean Andes is used as case study. Predictive uncertainty arising from parameter uncertainties of the watershed model is assessed. Model performance is evaluated according to several criteria, in order to define the possibility distribution of the parameter vector. The plausibility of the simulated glacier mass balance and snow cover are used for further constraining the model representations. Possibility distributions of the discharge estimates and prediction uncertainty bounds are subsequently derived. The results of the study indicate that the use of additional information allows a reduction of predictive uncertainty. In particular, the assessment of the simulated glacier mass balance and snow cover helps to reduce the width of the uncertainty bounds without a significant increment in the number of unbounded observations.
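
    An alpha-cut sketch of possibilistic propagation: build a possibility distribution over a parameter from a performance score, then map each alpha-cut of the parameter through the model to obtain nested prediction bounds. The score function, the toy discharge model, and the grid below are illustrative assumptions, not the study's snowmelt model.

```python
import numpy as np

# Possibility distribution over a single model parameter, obtained by
# rescaling a (hypothetical) performance score to [0, 1]
k_grid = np.linspace(0.1, 1.0, 200)
score = np.exp(-((k_grid - 0.4) ** 2) / 0.02)  # illustrative performance
possibility = score / score.max()

def model(k):
    return 5.0 * k + 1.0  # toy "discharge" prediction

def prediction_bounds(alpha):
    """Image of the parameter alpha-cut under the model."""
    cut = k_grid[possibility >= alpha]
    preds = model(cut)
    return preds.min(), preds.max()

# Higher possibility levels give narrower (nested) prediction intervals
for alpha in (0.2, 0.5, 0.9):
    lo, hi = prediction_bounds(alpha)
    print(f"alpha={alpha:.1f}: [{lo:.2f}, {hi:.2f}]")
```

    Constraining the possibility distribution with extra criteria (as the study does with glacier mass balance and snow cover) lowers the possibility of poor parameter sets and therefore narrows the alpha-cuts.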

  3. A new algorithm for importance analysis of the inputs with distribution parameter uncertainty

    Science.gov (United States)

    Li, Luyi; Lu, Zhenzhou

    2016-10-01

    Importance analysis is aimed at finding the contributions by the inputs to the uncertainty in a model output. For structural systems involving inputs with distribution parameter uncertainty, the contributions by the inputs to the output uncertainty are governed by both the variability and parameter uncertainty in their probability distributions. A natural and consistent way to arrive at importance analysis results in such cases would be a three-loop nested Monte Carlo (MC) sampling strategy, in which the parameters are sampled in the outer loop and the inputs are sampled in the inner nested double-loop. However, the computational effort of this procedure is often prohibitive for engineering problems. This paper therefore proposes an efficient new algorithm for importance analysis of the inputs in the presence of parameter uncertainty. By introducing a 'surrogate sampling probability density function (SS-PDF)' and incorporating single-loop MC theory into the computation, the proposed algorithm can reduce the original three-loop nested MC computation into a single-loop one in terms of model evaluation, which requires substantially less computational effort. Methods for choosing a proper SS-PDF are also discussed in the paper. The efficiency and robustness of the proposed algorithm have been demonstrated by results of several examples.
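
    The single-loop reweighting idea behind a surrogate sampling PDF can be sketched with self-normalized importance sampling: evaluate the model once under a broad surrogate density, then reweight those stored evaluations for any candidate distribution parameters instead of re-running the model inside a nested loop. The normal densities and toy model below are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(8)

def model(x):
    return x ** 2 + 2.0 * x             # stand-in for an expensive model

# Single loop: draw inputs once from a broad surrogate density and store
# the model evaluations
x = rng.normal(0.0, 2.0, 100_000)       # surrogate PDF: N(0, 2)
y = model(x)

def norm_pdf(x, mu, sd):
    return np.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * np.sqrt(2.0 * np.pi))

def mean_under(mu, sd):
    """E[model(X)] for X ~ N(mu, sd), reusing the stored evaluations."""
    w = norm_pdf(x, mu, sd) / norm_pdf(x, 0.0, 2.0)  # importance weights
    return np.sum(w * y) / np.sum(w)

# Analytic check: E[X^2 + 2X] = sd^2 + mu^2 + 2*mu = 2.25 for N(0.5, 1)
print(round(mean_under(0.5, 1.0), 3))
```

    The surrogate must be broad enough to cover every candidate distribution; otherwise the importance weights degenerate, which is why the choice of SS-PDF matters.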

  4. A market model: uncertainty and reachable sets

    Directory of Open Access Journals (Sweden)

    Raczynski Stanislaw

    2015-01-01

    Full Text Available Uncertain parameters are always present in models that include the human factor. In marketing, uncertain consumer behavior makes it difficult to predict future events and elaborate good marketing strategies. Sometimes uncertainty is modeled using stochastic variables. Our approach is quite different: the dynamic market with uncertain parameters is treated using differential inclusions, which permits determination of the corresponding reachable sets. This is not a statistical analysis. We are looking for solutions to the differential inclusions. The purpose of the research is to find a way to obtain and visualise the reachable sets, in order to know the limits for the important marketing variables. The modeling method consists of defining the differential inclusion and finding its solution, using the differential inclusion solver developed by the author. As the result we obtain images of the reachable sets where the main control parameter is the share of investment, being a part of the revenue. As an additional result we can also define the optimal investment strategy. The conclusion is that the differential inclusion solver can be a useful tool in market model analysis.
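
    A crude way to visualize the idea is to integrate the dynamics over a sampled family of admissible parameter values and record the envelope of the final states; a true differential inclusion solver would also vary the uncertain term in time, giving a larger set. The logistic-style market dynamics and the interval for u below are invented for illustration.

```python
import numpy as np

# Toy market dynamics x' = u * x * (1 - x) with an uncertain parameter
# u in [0.2, 1.0]. The reachable set at time T is approximated by
# integrating the ODE for a sampled family of admissible u values
# (a crude stand-in for a differential inclusion solver).
def reachable_interval(x0, T=5.0, dt=0.01, n_controls=50):
    finals = []
    for u in np.linspace(0.2, 1.0, n_controls):
        x = x0
        for _ in range(int(T / dt)):
            x += dt * u * x * (1.0 - x)  # forward Euler step
        finals.append(x)
    return min(finals), max(finals)

lo, hi = reachable_interval(x0=0.1)
print(f"reachable interval at T=5: [{lo:.3f}, {hi:.3f}]")
```

    The width of the interval quantifies how far the market state can drift under the assumed parameter uncertainty, which is exactly the information the reachable-set images convey.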

  5. Fukushima Daiichi unit 1 uncertainty analysis--Preliminary selection of uncertain parameters and analysis methodology

    Energy Technology Data Exchange (ETDEWEB)

    Cardoni, Jeffrey N.; Kalinich, Donald A.

    2014-02-01

    Sandia National Laboratories (SNL) plans to conduct uncertainty analyses (UA) on the Fukushima Daiichi unit (1F1) plant with the MELCOR code. The model to be used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and Nuclear Regulatory Commission (NRC). However, that study only examined a handful of various model inputs and boundary conditions, and the predictions yielded only fair agreement with plant data and current release estimates. The goal of this uncertainty study is to perform a focused evaluation of uncertainty in core melt progression behavior and its effect on key figures-of-merit (e.g., hydrogen production, vessel lower head failure, etc.). In preparation for the SNL Fukushima UA work, a scoping study has been completed to identify important core melt progression parameters for the uncertainty analysis. The study also lays out a preliminary UA methodology.

  6. $\langle X_{max} \rangle$ Uncertainty from Extrapolation of Cosmic Ray Air Shower Parameters

    CERN Document Server

    Abbasi, R U

    2016-01-01

    Recent measurements at the LHC of the p-p total cross section have reduced the uncertainty in simulations of cosmic ray air showers, in particular in the depth of shower maximum, called $X_{max}$. However, uncertainties in other important parameters, in particular the multiplicity and elasticity of high energy interactions, have not improved, and there is a remaining uncertainty due to the total cross section. Uncertainties due to extrapolations from accelerator data, at a maximum energy of $\sim$ one TeV in the p-p center of mass, to 250 TeV ($3\times10^{19}$ eV in a cosmic ray proton's lab frame) introduce significant uncertainties in predictions of $\langle X_{max} \rangle$. In this paper we estimate a lower limit on these uncertainties. The result is that the uncertainty in $\langle X_{max} \rangle$ is larger than the difference among the modern models being used in the field. At the full energy of the LHC, which is equivalent to $\sim 1\times10^{17}$ eV in the cosmic ray lab frame, the extrapolation is not as extreme, and the uncertainty is approxim...

  7. Explicitly integrating parameter, input, and structure uncertainties into Bayesian Neural Networks for probabilistic hydrologic forecasting

    KAUST Repository

    Zhang, Xuesong

    2011-11-01

    Estimating the uncertainty of hydrologic forecasts is valuable to water resources and other relevant decision making processes. Recently, Bayesian Neural Networks (BNNs) have proved to be powerful tools for quantifying uncertainty in streamflow forecasting. In this study, we propose a Markov Chain Monte Carlo (MCMC) framework (BNN-PIS) to incorporate the uncertainties associated with parameters, inputs, and structures into BNNs. This framework allows the structure of the neural networks to change by removing or adding connections between neurons and enables scaling of input data by using rainfall multipliers. The results show that the new BNNs outperform BNNs that only consider uncertainties associated with parameters and model structures. Critical evaluation of the posterior distributions of neural network weights, number of effective connections, rainfall multipliers, and hyper-parameters shows that the assumptions held in our BNNs are not well supported. Further understanding of the characteristics of and interactions among different uncertainty sources is expected to enhance the application of neural networks for uncertainty analysis of hydrologic forecasting. © 2011 Elsevier B.V.

  8. Spatial uncertainty model for visual features using a Kinect™ sensor.

    Science.gov (United States)

    Park, Jae-Han; Shin, Yong-Deuk; Bae, Ji-Hun; Baeg, Moon-Hong

    2012-01-01

    This study proposes a mathematical uncertainty model for the spatial measurement of visual features using Kinect™ sensors. This model can provide qualitative and quantitative analysis for the utilization of Kinect™ sensors as 3D perception sensors. In order to achieve this objective, we derived the propagation relationship of the uncertainties between the disparity image space and the real Cartesian space with the mapping function between the two spaces. Using this propagation relationship, we obtained the mathematical model for the covariance matrix of the measurement error, which represents the uncertainty for spatial position of visual features from Kinect™ sensors. In order to derive the quantitative model of spatial uncertainty for visual features, we estimated the covariance matrix in the disparity image space using collected visual feature data. Further, we computed the spatial uncertainty information by applying the covariance matrix in the disparity image space and the calibrated sensor parameters to the proposed mathematical model. This spatial uncertainty model was verified by comparing the uncertainty ellipsoids for spatial covariance matrices and the distribution of scattered matching visual features. We expect that this spatial uncertainty model and its analyses will be useful in various Kinect™ sensor applications.
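
    The covariance propagation step can be sketched with a first-order (Jacobian) approximation of a stereo-style mapping from disparity-image coordinates to Cartesian space, Sigma_xyz ≈ J Sigma_uvd Jᵀ. The calibration constants and measurement variances below are placeholders, not actual Kinect™ parameters, and the plain stereo disparity model is a simplification of the sensor's mapping.

```python
import numpy as np

# First-order propagation: Sigma_xyz ~= J @ Sigma_uvd @ J.T, using the
# stereo mapping X = (u-cx)Z/f, Y = (v-cy)Z/f, Z = f*b/d.
f, b = 580.0, 0.075      # focal length [px], baseline [m] (illustrative)
cx, cy = 320.0, 240.0    # principal point [px]

def jacobian(u, v, d):
    Z = f * b / d
    dZ_dd = -f * b / d ** 2
    return np.array([
        [Z / f, 0.0,   (u - cx) / f * dZ_dd],
        [0.0,   Z / f, (v - cy) / f * dZ_dd],
        [0.0,   0.0,   dZ_dd],
    ])

# Assumed measurement covariance in disparity-image space (u, v, d)
sigma_uvd = np.diag([0.5 ** 2, 0.5 ** 2, 0.3 ** 2])

J = jacobian(u=400.0, v=300.0, d=50.0)
sigma_xyz = J @ sigma_uvd @ J.T
print(np.round(np.sqrt(np.diag(sigma_xyz)), 4))  # per-axis std dev [m]
```

    Because depth scales as 1/d, the depth uncertainty grows quadratically with distance, which is why the uncertainty ellipsoids elongate along the viewing ray for far features.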

  9. Spatial Uncertainty Model for Visual Features Using a Kinect™ Sensor

    Directory of Open Access Journals (Sweden)

    Jae-Han Park

    2012-06-01

    Full Text Available This study proposes a mathematical uncertainty model for the spatial measurement of visual features using Kinect™ sensors. This model can provide qualitative and quantitative analysis for the utilization of Kinect™ sensors as 3D perception sensors. In order to achieve this objective, we derived the propagation relationship of the uncertainties between the disparity image space and the real Cartesian space with the mapping function between the two spaces. Using this propagation relationship, we obtained the mathematical model for the covariance matrix of the measurement error, which represents the uncertainty for spatial position of visual features from Kinect™ sensors. In order to derive the quantitative model of spatial uncertainty for visual features, we estimated the covariance matrix in the disparity image space using collected visual feature data. Further, we computed the spatial uncertainty information by applying the covariance matrix in the disparity image space and the calibrated sensor parameters to the proposed mathematical model. This spatial uncertainty model was verified by comparing the uncertainty ellipsoids for spatial covariance matrices and the distribution of scattered matching visual features. We expect that this spatial uncertainty model and its analyses will be useful in various Kinect™ sensor applications.

  10. Return Predictability, Model Uncertainty, and Robust Investment

    DEFF Research Database (Denmark)

    Lukas, Manuel

    Stock return predictability is subject to great uncertainty. In this paper we use the model confidence set approach to quantify uncertainty about expected utility from investment, accounting for potential return predictability. For monthly US data and six representative return prediction models, we...

  11. Uncertainty in spatially explicit animal dispersal models

    NARCIS (Netherlands)

    Mooij, W.M.; DeAngelis, D.L.

    2003-01-01

    Uncertainty in estimates of survival of dispersing animals is a vexing difficulty in conservation biology. The current notion is that this uncertainty decreases the usefulness of spatially explicit population models in particular. We examined this problem by comparing dispersal models of three level

  12. Influences of parameter uncertainties within the ICRP-66 respiratory tract model: regional tissue doses for 239PuO2 and 238UO2/238U3O8.

    Science.gov (United States)

    Farfán, Eduardo B; Huston, Thomas E; Bolch, W Emmett; Vernetson, William G; Bolch, Wesley E

    2003-04-01

    This paper extends an examination of the influence of parameter uncertainties on regional doses to respiratory tract tissues for short-ranged alpha particles using the ICRP-66 respiratory tract model. Previous papers examined uncertainties in the deposition and clearance aspects of the model. The critical parameters examined in this study included target tissue depths, thicknesses, and masses, particularly within the thoracic or lung regions of the respiratory tract. Probability density functions were assigned for the parameters based on published data. The probabilistic computer code LUDUC (Lung Dose Uncertainty Code) was used to assess regional and total lung doses from inhaled aerosols of 239PuO2 and 238UO2/238U3O8. Dose uncertainty was noted to depend on the particle aerodynamic diameter. Additionally, dose distributions were found to follow a lognormal distribution pattern. For 239PuO2 and 238UO2/238U3O8, this study showed that the uncertainty in lung dose increases by factors of approximately 50 and approximately 70 for plutonium and uranium oxides, respectively, over the particle size range from 0.1 to 20 microm. For typical exposure scenarios involving both radionuclides, the ratio of the 95% dose fractile to the 5% dose fractile ranged from approximately 8-10 (corresponding to a geometric standard deviation, or GSD, of about 1.7-2) for particle diameters of 0.1 to 1 microm. This ratio increased to about 370 for plutonium oxide (GSD approximately 4.5) and to about 600 for uranium oxide (GSD approximately 5) as the particle diameter approached 20 microm. However, thoracic tissue doses were quite low at larger particle sizes because most of the deposition occurred in the extrathoracic airways. For 239PuO2, median doses from LUDUC were found to be in general agreement with those for Reference Man (via deterministic LUDEP 2.0 calculations) in the particle range of 0.1 to 5 microm. However, median doses to the basal cell nuclei of the bronchial airways (BB
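
    The link between lognormal dose distributions, GSDs, and fractile ratios that the abstract reports can be checked numerically: for a lognormal, q95/q5 = GSD^(2 × 1.645), so a GSD of about 2 implies a ratio near 10. The values below are generic, not LUDUC outputs.

```python
import numpy as np

rng = np.random.default_rng(6)

# For a lognormal distribution the 95%/5% fractile ratio depends only on
# the geometric standard deviation: q95/q5 = GSD ** (2 * 1.645).
gsd = 2.0
doses = rng.lognormal(mean=np.log(1e-6), sigma=np.log(gsd), size=200_000)

q5, q95 = np.quantile(doses, [0.05, 0.95])
ratio_mc = q95 / q5
ratio_theory = gsd ** (2 * 1.6449)
print(f"MC ratio {ratio_mc:.2f} vs theory {ratio_theory:.2f}")
```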

  13. Modeling uncertainty in geographic information and analysis

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Uncertainty modeling and data quality for spatial data and spatial analyses are important topics in geographic information science, together with space and time in geography, as well as spatial analysis. In the past two decades, much effort has been made to research uncertainty modeling for spatial data and analyses. This paper presents our work in the research. In particular, four advances in the research are presented: (a) from determinedness-based to uncertainty-based representation of geographic objects in GIS; (b) from uncertainty modeling for static data to dynamic spatial analyses; (c) from modeling uncertainty for spatial data to models; and (d) from error descriptions to quality control for spatial data.

  14. Committee of machine learning predictors of hydrological models uncertainty

    Science.gov (United States)

    Kayastha, Nagendra; Solomatine, Dimitri

    2014-05-01

    In prediction of uncertainty based on machine learning methods, the results of various sampling schemes, namely Monte Carlo sampling (MCS), generalized likelihood uncertainty estimation (GLUE), Markov chain Monte Carlo (MCMC), the shuffled complex evolution metropolis algorithm (SCEMUA), differential evolution adaptive metropolis (DREAM), particle swarm optimization (PSO) and adaptive cluster covering (ACCO) [1], are used to build predictive models. These models predict the uncertainty (quantiles of the pdf) of a deterministic output from a hydrological model [2]. Inputs to these models are specially identified representative variables (past events precipitation and flows). The trained machine learning models are then employed to predict the model output uncertainty which is specific for the new input data. For each sampling scheme, three machine learning methods, namely artificial neural networks, model trees, and locally weighted regression, are applied to predict output uncertainties. The problem here is that different sampling algorithms result in different data sets used to train different machine learning models, which leads to several models (21 predictive uncertainty models). There is no clear evidence which model is the best since there is no basis for comparison. A solution could be to form a committee of all models and to use a dynamic averaging scheme to generate the final output [3]. This approach is applied to estimate uncertainty of streamflow simulations from the conceptual hydrological model HBV in the Nzoia catchment in Kenya. [1] N. Kayastha, D. L. Shrestha and D. P. Solomatine. Experiments with several methods of parameter uncertainty estimation in hydrological modeling. Proc. 9th Intern. Conf. on Hydroinformatics, Tianjin, China, September 2010. [2] D. L. Shrestha, N. Kayastha, D. P. Solomatine, and R. Price. Encapsulation of parametric uncertainty statistics by various predictive machine learning models: MLUE method, Journal of Hydroinformatics, in press
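
    The committee-with-dynamic-averaging idea can be sketched as follows: each member predicts the quantity of interest, and members are reweighted by their inverse mean squared error over a sliding window of already-observed time steps. The synthetic target series and member noise levels below are illustrative assumptions, not the HBV/Nzoia setup.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic target series and three committee members whose predictions
# are the target plus member-specific noise
n_models, n_steps, window = 3, 200, 20
truth = np.sin(np.linspace(0.0, 10.0, n_steps)) + 2.0
noise_levels = [0.1, 0.3, 0.6]
preds = np.stack([truth + rng.normal(0.0, s, n_steps) for s in noise_levels])

# Dynamic averaging: weights follow each member's inverse MSE over a
# sliding window of past time steps
combined = np.empty(n_steps)
w = np.full(n_models, 1.0 / n_models)
for t in range(n_steps):
    combined[t] = w @ preds[:, t]
    if t >= window:
        mse = np.mean((preds[:, t - window:t] - truth[t - window:t]) ** 2, axis=1)
        w = (1.0 / mse) / np.sum(1.0 / mse)

rmse_members = np.sqrt(np.mean((preds - truth) ** 2, axis=1))
rmse_committee = np.sqrt(np.mean((combined - truth) ** 2))
print(rmse_members.round(3), round(rmse_committee, 3))
```

    The committee needs no a priori ranking of the 21 members: skill is assessed on the fly from recent errors, which is what makes the scheme usable when there is no basis for choosing a single best model.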

  15. Sensitivity and uncertainty analysis of estimated soil hydraulic parameters for simulating soil water content

    Science.gov (United States)

    Gupta, Manika; Garg, Naveen Kumar; Srivastava, Prashant K.

    2014-05-01

    Sensitivity and uncertainty analysis has been carried out for the scalar parameters (soil hydraulic parameters (SHPs)) which govern the simulation of soil water content in the unsaturated soil zone. The study involves field experiments, which were conducted under real field conditions for a wheat crop in Roorkee, India under irrigated conditions. Soil samples were taken from the soil profile of 60 cm depth at an interval of 15 cm in the experimental field to determine soil water retention curves (SWRCs). These experimentally determined SWRCs were used to estimate the SHPs by least squares optimization under constrained conditions. Sensitivity of the SHPs estimated by various pedotransfer functions (PTFs), which relate various easily measurable soil properties like soil texture, bulk density and organic carbon content, is compared with lab-derived parameters to simulate the respective soil water retention curves. Sensitivity analysis was carried out using Monte Carlo simulations and the one-factor-at-a-time approach. The different sets of SHPs, along with the experimentally determined saturated permeability, are then used as input parameters in a physically based root water uptake model to ascertain the uncertainties in simulating soil water content. The generalised likelihood uncertainty estimation (GLUE) procedure was subsequently used to estimate the uncertainty bounds (UB) on the model predictions. It was found that the experimentally obtained SHPs were able to simulate the soil water contents with efficiencies of 70-80% at all depths for the three irrigation treatments. The SHPs obtained from the PTFs performed with varying uncertainties in simulating the soil water contents. Keywords: Sensitivity analysis, Uncertainty estimation, Pedotransfer functions, Soil hydraulic parameters, Hydrological modelling

  16. Possibilistic uncertainty analysis of a conceptual model of snowmelt runoff

    Directory of Open Access Journals (Sweden)

    A. P. Jacquin

    2010-03-01

    Full Text Available This study presents the analysis of predictive uncertainty of a conceptual type snowmelt runoff model. The method applied uses possibilistic rather than probabilistic calculus for the evaluation of predictive uncertainty. Possibility theory is an information theory meant to model uncertainties caused by imprecise or incomplete knowledge about a real system rather than by randomness. A snow dominated catchment in the Chilean Andes is used as case study. Predictive uncertainty arising from parameter uncertainties of the watershed model is assessed. Model performance is evaluated according to several criteria, in order to define the possibility distribution of the model representations. The likelihood of the simulated glacier mass balance and snow cover are used for further assessing model credibility. Possibility distributions of the discharge estimates and prediction uncertainty bounds are subsequently derived. The results of the study indicate that the use of additional information allows a reduction of predictive uncertainty. In particular, the assessment of the simulated glacier mass balance and snow cover helps to reduce the width of the uncertainty bounds without a significant increment in the number of unbounded observations.

  17. Geostatistical simulation of geological architecture and uncertainty propagation in groundwater modeling

    DEFF Research Database (Denmark)

    He, Xiulan

    Groundwater modeling plays an essential role in modern subsurface hydrology research. It’s generally recognized that simulations and predictions by groundwater models are associated with uncertainties that originate from various sources. The two major uncertainty sources are related to model...... parameters and model structures, which are the primary focuses of this PhD research. Parameter uncertainty was analyzed using an optimization tool (PEST: Parameter ESTimation) in combination with a random sampling method (LHS: Latin Hypercube Sampling). Model structure, namely geological architecture...

  18. Aerosol model selection and uncertainty modelling by adaptive MCMC technique

    Directory of Open Access Journals (Sweden)

    M. Laine

    2008-12-01

    Full Text Available We present a new technique for the model selection problem in atmospheric remote sensing. The technique is based on Monte Carlo sampling and it allows model selection, calculation of model posterior probabilities and model averaging in a Bayesian way.

    The algorithm developed here is called the Adaptive Automatic Reversible Jump Markov chain Monte Carlo method (AARJ). It uses the Markov chain Monte Carlo (MCMC) technique and its extension called Reversible Jump MCMC. Both of these techniques have been used extensively in statistical parameter estimation problems in a wide area of applications since the late 1990s. The novel feature in our algorithm is the fact that it is fully automatic and easy to use.

    We show how the AARJ algorithm can be implemented and used for model selection and averaging, and to directly incorporate the model uncertainty. We demonstrate the technique by applying it to the statistical inversion problem of gas profile retrieval of GOMOS instrument on board the ENVISAT satellite. Four simple models are used simultaneously to describe the dependence of the aerosol cross-sections on wavelength. During the AARJ estimation all the models are used and we obtain a probability distribution characterizing how probable each model is. By using model averaging, the uncertainty related to selecting the aerosol model can be taken into account in assessing the uncertainty of the estimates.
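
    Short of full reversible-jump MCMC, the model-probability and model-averaging idea can be approximated with BIC weights over a small set of candidate wavelength dependencies; the polynomial candidates and synthetic data below are stand-ins for the four GOMOS aerosol models, not the AARJ algorithm itself.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic "cross-section vs wavelength" data; the true law is quadratic,
# and the candidate models are polynomials of degree 0..3
wl = np.linspace(0.3, 0.7, 40)
y = 2.0 - 1.5 * wl + 5.0 * wl ** 2 + rng.normal(0.0, 0.02, wl.size)

def bic(y, yhat, k):
    n = y.size
    return n * np.log(np.sum((y - yhat) ** 2) / n) + k * np.log(n)

fits, bics = [], []
for deg in range(4):
    yhat = np.polyval(np.polyfit(wl, y, deg), wl)
    fits.append(yhat)
    bics.append(bic(y, yhat, deg + 1))

# BIC weights approximate posterior model probabilities
bics = np.array(bics)
w = np.exp(-0.5 * (bics - bics.min()))
w /= w.sum()
averaged = sum(wi * f for wi, f in zip(w, fits))  # model-averaged curve
print("approximate model probabilities:", w.round(3))
```

    As in the AARJ application, the averaged curve carries the model-selection uncertainty into the final estimate instead of committing to a single candidate.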

  19. Assessment of parametric uncertainty for groundwater reactive transport modeling,

    Science.gov (United States)

    Shi, Xiaoqing; Ye, Ming; Curtis, Gary P.; Miller, Geoffery L.; Meyer, Philip D.; Kohler, Matthias; Yabusaki, Steve; Wu, Jichun

    2014-01-01

    The validity of using Gaussian assumptions for model residuals in uncertainty quantification of a groundwater reactive transport model was evaluated in this study. Least squares regression methods explicitly assume Gaussian residuals, and the assumption leads to Gaussian likelihood functions, model parameters, and model predictions. While the Bayesian methods do not explicitly require the Gaussian assumption, Gaussian residuals are widely used. This paper shows that the residuals of the reactive transport model are non-Gaussian, heteroscedastic, and correlated in time; characterizing them requires using a generalized likelihood function such as the formal generalized likelihood function developed by Schoups and Vrugt (2010). For the surface complexation model considered in this study for simulating uranium reactive transport in groundwater, parametric uncertainty is quantified using the least squares regression methods and Bayesian methods with both Gaussian and formal generalized likelihood functions. While the least squares methods and Bayesian methods with Gaussian likelihood function produce similar Gaussian parameter distributions, the parameter distributions of Bayesian uncertainty quantification using the formal generalized likelihood function are non-Gaussian. In addition, the predictive performance of the formal generalized likelihood function is superior to that of the least squares regression and Bayesian methods with Gaussian likelihood function. The Bayesian uncertainty quantification is conducted using the differential evolution adaptive metropolis (DREAM(ZS)) algorithm; as a Markov chain Monte Carlo (MCMC) method, it is a robust tool for quantifying uncertainty in groundwater reactive transport models. For the surface complexation model, the regression-based local sensitivity analysis and Morris- and DREAM(ZS)-based global sensitivity analyses yield almost identical ranking of parameter importance. The uncertainty analysis may help select appropriate likelihood
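
    The kind of generalized likelihood the abstract refers to can be sketched by relaxing the i.i.d. Gaussian residual assumption to AR(1)-correlated, heteroscedastic errors. This follows the spirit of Schoups and Vrugt (2010) but keeps Gaussian innovations for brevity (their formulation also allows skewed, heavy-tailed densities); all parameter values are illustrative.

```python
import numpy as np

# Generalized log-likelihood: residuals are AR(1)-correlated and their
# standard deviation grows with the simulated value (heteroscedastic)
def generalized_loglik(obs, sim, phi, s0, s1):
    """phi: AR(1) coefficient; sigma_t = s0 + s1 * sim_t."""
    eps = obs - sim
    sigma = s0 + s1 * sim
    a = eps[1:] - phi * eps[:-1]         # decorrelated innovations
    s = sigma[1:]
    return -0.5 * np.sum(np.log(2.0 * np.pi * s ** 2) + (a / s) ** 2)

rng = np.random.default_rng(4)
sim = np.linspace(1.0, 5.0, 300)
eps = np.zeros(300)
for t in range(1, 300):                  # residuals truly AR(1), heteroscedastic
    eps[t] = 0.7 * eps[t - 1] + rng.normal(0.0, 0.1 + 0.05 * sim[t])
obs = sim + eps

# The correlated-error likelihood prefers phi near the true value 0.7
ll = {phi: generalized_loglik(obs, sim, phi, 0.1, 0.05) for phi in (0.0, 0.7)}
print({k: round(v, 1) for k, v in ll.items()})
```

    Ignoring the correlation (phi = 0) mis-states the information content of the data, which is how the i.i.d. Gaussian assumption distorts parameter distributions.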

  20. Uncertainty quantification in Rothermel's Model using an efficient sampling method

    Science.gov (United States)

    Edwin Jimenez; M. Yousuff Hussaini; Scott L. Goodrick

    2007-01-01

    The purpose of the present work is to quantify parametric uncertainty in Rothermel’s wildland fire spread model (implemented in software such as BehavePlus3 and FARSITE), which is undoubtedly among the most widely used fire spread models in the United States. This model consists of a nonlinear system of equations that relates environmental variables (input parameter...

  1. Comparative Analysis of Uncertainties in Urban Surface Runoff Modelling

    DEFF Research Database (Denmark)

    Thorndahl, Søren; Schaarup-Jensen, Kjeld

    2007-01-01

    In the present paper a comparison between three different surface runoff models, in the numerical urban drainage tool MOUSE, is conducted. Analysing parameter uncertainty, it is shown that the models are very sensitive with regards to the choice of hydrological parameters, when combined overflow...... volumes are compared - especially when the models are uncalibrated. The occurrences of flooding and surcharge are highly dependent on both hydrological and hydrodynamic parameters. Thus, the conclusion of the paper is that if the use of model simulations is to be a reliable tool for drainage system...... analysis, further research in improved parameter assessment for surface runoff models is needed....

  2. Urban drainage models - making uncertainty analysis simple

    DEFF Research Database (Denmark)

    Vezzaro, Luca; Mikkelsen, Peter Steen; Deletic, Ana

    2012-01-01

    There is increasing awareness about uncertainties in modelling of urban drainage systems and, as such, many new methods for uncertainty analyses have been developed. Despite this, all available methods have limitations which restrict their widespread application among practitioners. Here, a modif...

  3. Uncertainty Analysis of Surface Dust Emission Parameters of a Dust Model

    Institute of Scientific and Technical Information of China (English)

    周旭; 吴成来; 林朝晖; 隆宵; 王萍

    2011-01-01

    Dust emission schemes play an important role in dust event forecasting. In this paper, we first analyze the effect of the soil plastic pressure p and the parameter cy on the computed surface horizontal dust flux and vertical dust flux. Then, WRF-Chem3.0 coupled with a dust emission scheme is used to simulate a dust event that occurred in western China on March 27-28, 2007, and to examine the uncertainty in the simulated dust caused by errors in the chosen parameter values. The results indicate that the predicted PM10 concentration is very sensitive to the soil plastic pressure: p affects the concentration values, the location of the concentration center, and the spatial extent. The value of cy affects only the magnitude of the predicted PM10 concentration, not the location of the concentration center or the spatial extent.

  4. Evaluation of parameter uncertainties obtained from in-situ tracer experiments

    Energy Technology Data Exchange (ETDEWEB)

    Sawada, Atsushi; Yoshino, Naoto [Japan Nuclear Cycle Development Inst., Tokai Works, Tokai, Ibaraki (Japan); Ijiri, Yuji; Hata, Akihito [Taisei Corp., Tokyo (Japan); Hosono, Kenichi [Geoscience Research Laboratory, Yamato, Kanagawa (Japan)

    2003-03-01

    Radionuclide transport parameter uncertainty is an important consideration in the safety assessment of high-level radioactive waste disposal. This paper describes the development of a method for the quantitative estimation of transport parameter uncertainties from in-situ tracer experiments. The method utilizes a probabilistic inversion based on the maximum likelihood method. Transport parameters and their uncertainties are derived from a series of conservative and reactive tracer tests conducted in a single fracture at the Aespoe Hard Rock Laboratory in Sweden. These transport parameters and uncertainties are useful for evaluating the influence of parameter uncertainty on safety assessment. (author)
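The maximum-likelihood inversion described above can be illustrated with a toy example: a single decay parameter is fitted to synthetic tracer-recovery data, and its 1-sigma uncertainty is taken from the curvature of the log-likelihood at the optimum. The model, data and numbers are hypothetical stand-ins, not those of the actual in-situ tracer tests.

```python
import math
import random

random.seed(0)

# Hypothetical tracer-recovery model: c(t) = exp(-k*t). We estimate k and
# its uncertainty, as a minimal stand-in for the probabilistic inversion.
k_true, sigma = 0.5, 0.02
times = [0.5 * i for i in range(1, 11)]
obs = [math.exp(-k_true * t) + random.gauss(0.0, sigma) for t in times]

def neg_log_like(k):
    # Gaussian errors with known sigma -> weighted least squares.
    return sum((c - math.exp(-k * t)) ** 2 for t, c in zip(times, obs)) / (2 * sigma**2)

# Grid search for the maximum-likelihood estimate.
ks = [0.3 + 0.001 * i for i in range(400)]
k_hat = min(ks, key=neg_log_like)

# 1-sigma uncertainty from the log-likelihood curvature (finite differences).
h = 1e-3
curv = (neg_log_like(k_hat + h) - 2 * neg_log_like(k_hat) + neg_log_like(k_hat - h)) / h**2
k_std = 1.0 / math.sqrt(curv)

print(f"k = {k_hat:.3f} +/- {k_std:.3f}")
```

The same curvature-based uncertainty generalizes to several parameters via the Hessian of the log-likelihood, which is how parameter covariances are typically extracted from such inversions.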

  5. Uncertainty Quantification in Control Problems for Flocking Models

    Directory of Open Access Journals (Sweden)

    Giacomo Albi

    2015-01-01

    The optimal control of flocking models with random inputs is investigated from a numerical point of view. The effect of uncertainty in the interaction parameters is studied for a Cucker-Smale type model using a generalized polynomial chaos (gPC) approach. Numerical evidence of threshold effects in the alignment dynamic due to the random parameters is given. The use of a selective model predictive control permits steering of the system towards the desired state even in unstable regimes.
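A minimal sketch of the idea behind gPC-type methods, using stochastic collocation on a two-agent alignment model whose velocity gap decays as exp(-2*K*t). The uncertain interaction strength K and all numbers below are illustrative assumptions, not values from the paper.

```python
import math
import random

# Two-agent Cucker-Smale-type alignment: the velocity gap decays as
# exp(-2*K*t). K is uncertain, K ~ N(0.5, 0.1^2) (illustrative numbers).
# We estimate the mean gap at t = 1 with a 3-node Gauss-Hermite
# collocation rule (a minimal stand-in for gPC) and check it against
# plain Monte Carlo.
mu_K, sd_K, t = 0.5, 0.1, 1.0
gap = lambda K: math.exp(-2.0 * K * t)

# 3-point Gauss-Hermite rule for a standard normal: nodes 0 and +/-sqrt(3).
nodes = [(-math.sqrt(3), 1 / 6), (0.0, 2 / 3), (math.sqrt(3), 1 / 6)]
mean_colloc = sum(w * gap(mu_K + sd_K * x) for x, w in nodes)

random.seed(1)
mean_mc = sum(gap(random.gauss(mu_K, sd_K)) for _ in range(20000)) / 20000

print(mean_colloc, mean_mc)  # both close to exp(-2*mu_K*t + 2*(sd_K*t)**2)
```

Three model evaluations suffice here because the output is smooth in K; this sample-efficiency relative to Monte Carlo is the main appeal of gPC-type approaches.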

  6. Assimilating multi-source uncertainties of a parsimonious conceptual hydrological model using hierarchical Bayesian modeling

    Science.gov (United States)

    Wei Wu; James Clark; James Vose

    2010-01-01

    Hierarchical Bayesian (HB) modeling allows for multiple sources of uncertainty by factoring complex relationships into conditional distributions that can be used to draw inference and make predictions. We applied an HB model to estimate the parameters and state variables of a parsimonious hydrological model – GR4J – by coherently assimilating the uncertainties from the...

  7. Parameter uncertainty-based pattern identification and optimization for robust decision making on watershed load reduction

    Science.gov (United States)

    Jiang, Qingsong; Su, Han; Liu, Yong; Zou, Rui; Ye, Rui; Guo, Huaicheng

    2017-04-01

    Reduction of nutrient loading in a watershed is essential for restoring lakes from eutrophication. Efficient and optimal decision-making on load reduction is generally based on water quality modeling and the quantitative identification of nutrient sources at the watershed scale. The modeling process is inevitably affected by inherent uncertainties, especially uncertain parameters due to equifinality. The emerging question is therefore: given parameter uncertainty, how can the robustness of the optimal decisions be ensured? In this study, an integrated approach of pattern identification and robustness analysis, based on simulation-optimization models, was proposed that focuses on the impact of parameter uncertainty in water quality modeling. Here, a pattern represents the discernible regularity of load-reduction solutions under multiple parameter sets. Pattern identification is achieved with a hybrid clustering analysis (i.e., Ward-Hierarchical and K-means), which proved flexible and efficient in a case study of Lake Bali near the Yangtze River in China. The results demonstrated that urban domestic nutrient load is the source with the greatest reduction potential, and that there are two patterns for Total Nitrogen (TN) reduction and three patterns for Total Phosphorus (TP) reduction. The patterns indicate different total reductions of nutrient loads, reflecting diverse decision preferences. The robust solution was identified as the one with the highest achievement of water quality targets, under which water quality at the monitoring stations improved uniformly. A process analysis of robust decision-making based on pattern identification and uncertainty was conducted, which provides effective support for decision-making with preference under uncertainty.
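The pattern-identification stage can be illustrated with a bare-bones K-means in one dimension (the hierarchical Ward step of the hybrid scheme is omitted). The ensemble of load-reduction solutions below is synthetic, chosen only to show two clearly separated "patterns".

```python
import random

random.seed(2)

# Hypothetical ensemble of load-reduction solutions (one value per
# behavioural parameter set): total TN reduction in tonnes. Two decision
# "patterns" are buried in the ensemble; plain K-means (the second stage
# of the paper's Ward + K-means hybrid) is enough to recover them here.
solutions = [random.gauss(120, 5) for _ in range(30)] + \
            [random.gauss(200, 5) for _ in range(30)]

def kmeans_1d(xs, centers, iters=20):
    for _ in range(iters):
        groups = [[] for _ in centers]
        for x in xs:
            i = min(range(len(centers)), key=lambda j: abs(x - centers[j]))
            groups[i].append(x)
        centers = [sum(g) / len(g) if g else c for g, c in zip(groups, centers)]
    return centers, groups

centers, groups = kmeans_1d(solutions, [min(solutions), max(solutions)])
print(sorted(round(c) for c in centers))  # two pattern centres, near 120 and 200
```

Each recovered cluster corresponds to one candidate pattern; robustness is then judged by how uniformly the solutions in a cluster meet the water quality targets.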

  8. Uncertainty analysis of fluvial outcrop data for stochastic reservoir modelling

    Energy Technology Data Exchange (ETDEWEB)

    Martinius, A.W. [Statoil Research Centre, Trondheim (Norway); Naess, A. [Statoil Exploration and Production, Stjoerdal (Norway)

    2005-07-01

    Uncertainty analysis and reduction is a crucial part of stochastic reservoir modelling and fluid flow simulation studies. Outcrop analogue studies are often employed to define reservoir model parameters but the analysis of uncertainties associated with sedimentological information is often neglected. In order to define uncertainty inherent in outcrop data more accurately, this paper presents geometrical and dimensional data from individual point bars and braid bars, from part of the low net:gross outcropping Tortola fluvial system (Spain) that has been subjected to a quantitative and qualitative assessment. Four types of primary outcrop uncertainties are discussed: (1) the definition of the conceptual depositional model; (2) the number of observations on sandstone body dimensions; (3) the accuracy and representativeness of observed three-dimensional (3D) sandstone body size data; and (4) sandstone body orientation. Uncertainties related to the depositional model are the most difficult to quantify but can be appreciated qualitatively if processes of deposition related to scales of time and the general lack of information are considered. Application of the N0 measure is suggested to assess quantitatively whether a statistically sufficient number of dimensional observations is obtained to reduce uncertainty to an acceptable level. The third type of uncertainty is evaluated in a qualitative sense and determined by accurate facies analysis. The orientation of sandstone bodies is shown to influence spatial connectivity. As a result, an insufficient number or quality of observations may have important consequences for estimated connected volumes. This study will give improved estimations for reservoir modelling. (author)

  9. Lumped-parameter models

    DEFF Research Database (Denmark)

    Ibsen, Lars Bo; Liingaard, Morten

    A lumped-parameter model represents the frequency dependent soil-structure interaction of a massless foundation placed on or embedded into an unbounded soil domain. The lumped-parameter model development have been reported by (Wolf 1991b; Wolf 1991a; Wolf and Paronesso 1991; Wolf and Paronesso 19...

  10. Urban drainage models - making uncertainty analysis simple

    DEFF Research Database (Denmark)

    Vezzaro, Luca; Mikkelsen, Peter Steen; Deletic, Ana

    2012-01-01

    There is increasing awareness about uncertainties in modelling of urban drainage systems and, as such, many new methods for uncertainty analyses have been developed. Despite this, all available methods have limitations which restrict their widespread application among practitioners. Here, a modif...... probability distributions (often used for sensitivity analyses) and prediction intervals. To demonstrate the new method, it is applied to a conceptual rainfall-runoff model using a dataset collected from Melbourne, Australia....

  11. Model development and data uncertainty integration

    Energy Technology Data Exchange (ETDEWEB)

    Swinhoe, Martyn Thomas [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-12-02

    The effect of data uncertainties is discussed, with the epithermal neutron multiplicity counter as an illustrative example. Simulation using MCNP6, cross section perturbations and correlations are addressed, along with the effect of the 240Pu spontaneous fission neutron spectrum, the effect of P(ν) for 240Pu spontaneous fission, and the effect of spontaneous fission and (α,n) intensity. The effect of nuclear data is the product of the initial uncertainty and the sensitivity -- both need to be estimated. In conclusion, a multi-parameter variation method has been demonstrated, the most significant parameters are the basic emission rates of spontaneous fission and (α,n) processes, and uncertainties and important data depend on the analysis technique chosen.
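The closing point above, that a nuclear-data input's contribution to output uncertainty is the product of its own uncertainty and the model's sensitivity to it, can be sketched with first-order propagation. The input names, uncertainties and sensitivities below are invented for illustration, not the counter's actual values.

```python
# First-order (linear) uncertainty budget: each input's contribution is
# (relative uncertainty x sensitivity); contributions add in quadrature.
# All numbers below are made up for illustration.
inputs = {
    # name: (relative uncertainty, sensitivity d(ln output)/d(ln input))
    "SF rate":     (0.02, 1.00),
    "(a,n) rate":  (0.05, 0.50),
    "240Pu P(nu)": (0.01, 0.30),
}
contrib = {k: (u * s) ** 2 for k, (u, s) in inputs.items()}
total = sum(contrib.values()) ** 0.5
ranked = sorted(contrib, key=contrib.get, reverse=True)
print(f"total relative uncertainty: {total:.3f}; dominant input: {ranked[0]}")
```

The ranking, not the absolute numbers, is the useful output: it tells which basic emission rates dominate and therefore where better data would pay off.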

  12. Uncertainty Propagation in Predictions of Hydraulic Parameters Based on the Pedotransfer Functions

    Science.gov (United States)

    Faybishenko, B.; Tokunaga, T. K.; Kim, Y.; Agarwal, D.

    2016-12-01

    Although the accuracy of measurements of physical soil characteristics on individual soil samples is usually better than +/-10%, the application of these data to the estimation of unsaturated hydraulic parameters using pedotransfer functions involves propagation of errors through the equations, resulting in increased uncertainties in calculated hydrological parameters. We evaluated the uncertainty of unsaturated hydraulic conductivity and water retention functions calculated using two types of pedotransfer functions (PTFs): (a) those originally developed by Wosten et al. (1999), and (b) those recently developed from the European Hydropedological Data Inventory (EUHYDI) by Tóth et al. (2014). We first applied the theory of error analysis to assess the propagation of errors in a set of input parameters (particle size distribution and bulk density) into the output parameters of the Wosten et al. PTF model, resulting in the evaluation of probability density functions (PDFs) of the saturated hydraulic conductivity, full saturation, irreducible saturation, and the Mualem-van Genuchten parameters n, alpha, and l. Error analysis calculations were performed by means of the Taylor expansion and Monte Carlo simulations. Then, we calculated the unsaturated hydraulic parameters from twenty-two Tóth et al. (2014) PTF models, and compared these parameters with the PDFs of the output parameters from the Wosten model. The comparison showed that the Tóth model-calculated parameters are within the PDFs of parameters calculated from the Wosten model. Calculations were carried out using the soil physical properties of about 50 samples collected at the SFA LBNL's Rifle and East River field sites in Colorado. The results of the calculations were also compared with experimentally determined unsaturated hydraulic parameters. Bayesian network analysis was applied to deduce the multivariate structural interference between input and output variables of PTF models and to infer the intra-model input
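The two propagation routes named above, Taylor expansion and Monte Carlo, can be compared on a toy pedotransfer-style relation. The regression below (ln Ks linear in bulk density) and its coefficients are hypothetical, not Wosten's actual PTF.

```python
import math
import random

# Hypothetical PTF (not the actual Wosten regression): ln Ks = a - b*BD,
# with bulk density BD = 1.40 +/- 0.05 g/cm^3.
a, b = 4.0, 2.0
mu, sd = 1.40, 0.05
Ks = lambda bd: math.exp(a - b * bd)

# Route 1 - first-order Taylor expansion: sd(Ks) ~ |dKs/dBD| * sd(BD).
sd_taylor = abs(-b * Ks(mu)) * sd

# Route 2 - Monte Carlo through the same relation.
random.seed(3)
samples = [Ks(random.gauss(mu, sd)) for _ in range(50000)]
m = sum(samples) / len(samples)
sd_mc = (sum((x - m) ** 2 for x in samples) / (len(samples) - 1)) ** 0.5

print(sd_taylor, sd_mc)  # close here, because the relative input error is small
```

The two routes agree when the relation is nearly linear over the input spread; Monte Carlo remains reliable (at higher cost) when it is not, which is why both were used in the study.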

  13. Photovoltaic System Modeling. Uncertainty and Sensitivity Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, Clifford W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Martin, Curtis E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

    We report an uncertainty and sensitivity analysis for modeling AC energy from photovoltaic systems. Output from a PV system is predicted by a sequence of models. We quantify uncertainty in the output of each model using empirical distributions of each model's residuals. We propagate uncertainty through the sequence of models by sampling these distributions to obtain an empirical distribution of a PV system's output. We consider models that: (1) translate measured global horizontal, direct and global diffuse irradiance to plane-of-array irradiance; (2) estimate effective irradiance; (3) predict cell temperature; (4) estimate DC voltage, current and power; (5) reduce DC power for losses due to inefficient maximum power point tracking or mismatch among modules; and (6) convert DC to AC power. Our analysis considers a notional PV system comprising an array of FirstSolar FS-387 modules and a 250 kW AC inverter; we use measured irradiance and weather at Albuquerque, NM. We found the uncertainty in PV system output to be relatively small, on the order of 1% for daily energy. We found the uncertainty in the models for POA irradiance and effective irradiance to be the dominant contributors to uncertainty in predicted daily energy. Our analysis indicates that efforts to reduce the uncertainty in PV system output predictions may yield the greatest improvements by focusing on the POA and effective irradiance models.
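The residual-sampling scheme described above can be sketched as follows: each model step gets an empirical pool of residuals, and sampling the pools while pushing a value through the chain yields an empirical distribution of system output. The toy three-step chain, its coefficients and the residual spreads are illustrative, not the report's models.

```python
import random

random.seed(4)

# Empirical residual pools for two model steps (relative errors). In the
# report these come from measured data; here they are synthetic.
poa_resid = [random.gauss(0, 0.02) for _ in range(500)]  # irradiance-model residuals
inv_resid = [random.gauss(0, 0.01) for _ in range(500)]  # inverter-model residuals

def simulate_energy(ghi=6.0):  # nominal daily irradiation, kWh/m^2
    poa = ghi * 1.1 * (1 + random.choice(poa_resid))  # step 1: POA irradiance
    dc = poa * 0.15 * 1000                            # step 2: DC energy (no residual here)
    ac = dc * 0.96 * (1 + random.choice(inv_resid))   # step 3: inverter
    return ac

outputs = [simulate_energy() for _ in range(5000)]
mean = sum(outputs) / len(outputs)
sd = (sum((x - mean) ** 2 for x in outputs) / len(outputs)) ** 0.5
print(f"relative uncertainty in daily energy: {sd / mean:.1%}")
```

With independent relative residuals the step uncertainties combine roughly in quadrature, which is why the step with the widest residual pool (here the irradiance model, as in the report's findings for POA irradiance) dominates the output spread.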

  14. On Geometric Parameters in Uncertainty Analysis of Measurement in Ship Model Test

    Institute of Scientific and Technical Information of China (English)

    吴宝山

    2007-01-01

    A series of ITTC recommended procedures for uncertainty analysis (UA) in experimental fluid dynamics measurements has been in effect since 1999, but many details of UA application still need to be discussed or revised. The topic of this paper is one such detail. Since the ITTC-QS working group wrote to the ITTC technical committees in 1992, asking ITTC member organizations to carry out measurement uncertainty assessment in accordance with ANSI/ASME PTC 19.1, uncertainty analysis has been an actively discussed topic; the AIAA published its guide to uncertainty analysis for wind tunnel testing in 1999. Nondimensional formulae are usually used to express the measured hydrodynamic forces and moments in ship model tests, so various geometric parameters are involved in the uncertainty analysis. In the ITTC procedures, the uncertainty components of the hydrodynamic coefficients caused by model manufacturing errors are generally computed from the (nondimensional) expressions of the coefficients. While mathematically reasonable, some of these analyses are quite unreasonable from a physical and engineering standpoint. Several examples are given to illustrate this confusion, and it is recommended that the uncertainty contribution of geometric parameters introduced by nondimensionalization not be assessed simply from the nondimensional expression adopted, but rather be evaluated on the basis of hydrodynamic analysis and engineering judgment.

  15. Network optimization including gas lift and network parameters under subsurface uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Schulze-Riegert, R.; Baffoe, J.; Pajonk, O. [SPT Group GmbH, Hamburg (Germany); Badalov, H.; Huseynov, S. [Technische Univ. Clausthal, Clausthal-Zellerfeld (Germany). ITE; Trick, M. [SPT Group, Calgary, AB (Canada)

    2013-08-01

    Optimization of oil and gas field production systems poses a great challenge to field development due to complex and multiple interactions between various operational design parameters and subsurface uncertainties. Conventional analytical methods are capable of finding local optima based on single deterministic models. They are less applicable for efficiently generating alternative design scenarios in a multi-objective context. Practical implementations of robust optimization workflows integrate the evaluation of alternative design scenarios and multiple realizations of subsurface uncertainty descriptions. Production or economic performance indicators such as NPV (Net Present Value) are linked to a risk-weighted objective function definition to guide the optimization processes. This work focuses on an integrated workflow using a reservoir-network simulator coupled to an optimization framework. The work investigates the impact of design parameters while considering the physics of the reservoir, wells, and surface facilities. Subsurface uncertainties are described by well parameters such as inflow performance. Experimental design methods are used to investigate parameter sensitivities and interactions. Optimization methods are used to find optimal design parameter combinations which improve key performance indicators of the production network system. The proposed workflow is applied to a representative oil reservoir coupled to a network which is modelled by an integrated reservoir-network simulator. Gas lift is included as an explicit measure to improve production. An objective function is formulated for the net present value of the integrated system, including production revenue and facility costs. Facility and gas-lift design parameters are tuned to maximize NPV. Well inflow performance uncertainties are introduced with an impact on gas-lift performance. Resulting variances in NPV are identified as a risk measure for the optimized system design.

  16. Kinematic accuracy and dynamic performance of a simple planar space deployable mechanism with joint clearance considering parameter uncertainty

    Science.gov (United States)

    Li, Junlan; Huang, Hongzhou; Yan, Shaoze; Yang, Yunqiang

    2017-07-01

    Joint clearance and the uncertainty of geometric and physical parameters significantly influence the kinematic accuracy and dynamic response of space deployable mechanisms. Such mechanisms have been widely employed in astronautic missions to improve the capabilities of launchers. This paper proposes a methodology to investigate the kinematic accuracy and dynamic performance of space deployable mechanism with joint clearance while considering parameter uncertainty. The model of space deployable mechanism with a planar revolute joint is provided. With consideration of several uncertain parameters, the solving procedure of the dynamic equations is presented based on the Monte Carlo method. A case study is conducted to reveal the effect of parameter uncertainty on its kinematic accuracy and dynamic performance. The results indicate that parameter uncertainty should be considered to accurately evaluate the performance of long-term operating space deployable mechanisms, especially for such systems with clearance joints. According to the results, brief suggestions for design and evaluation of the mechanisms are provided.

  17. Uncertainty of GIA models across the Greenland

    Science.gov (United States)

    Ruggieri, Gabriella

    2013-04-01

    In recent years, various remote sensing techniques have been employed to estimate the current mass balance of the Greenland ice sheet (GIS). In this context, the GRACE, laser and radar altimetry observations used to constrain the mass balance treat glacial isostatic adjustment (GIA) as a source of noise. Several GIA models have been developed for Greenland, but they differ from each other in mantle viscosity profile and in the time history of ice melting. In this work we use the well-known ICE-5G (VM2) ice model by Peltier (2004) and two alternative scenarios of ice melting, ANU05 by Lambeck et al. (1998) and the new regional ice model HUY2 by Simpson et al. (2009), in order to assess the amplitude of the uncertainty in the GIA predictions. In particular, we focus on rates of the vertical displacement field, sea surface variations and sea-level change at regional scale. The GIA predictions are estimated using an improved version of the SELEN code that solves the sea-level equation for a spherical, self-gravitating, incompressible and viscoelastic Earth structure. GIA uncertainty shows a highly variable geographic distribution across Greenland. Considering the spatial pattern of the GIA predictions for the three ice models, the western sector of the Greenland Ice Sheet (GrIS) between Thule and Upernavik, and the area around Paamiut, show good agreement, while the northeastern portion of Greenland is characterized by a large discrepancy among the GIA predictions inferred from the ice models tested in this work. These differences are ultimately the consequence of the different sets of global relative sea-level data and modern geodetic observations used by the authors to constrain the model parameters. Finally, the GPS network project (GNET), recently installed around the periphery of the GrIS, is used as a tool to discuss the discrepancies among the GIA models. Comparing the recently available geodetic analyses, it appears that among the GPS sites the

  18. Selective Maintenance Model Considering Time Uncertainty

    OpenAIRE

    Le Chen; Zhengping Shu; Yuan Li; Xuezhi Lv

    2012-01-01

    This study proposes a selective maintenance model for a weapon system during the mission interval. First, it gives relevant definitions and the operational process of the material support system. Then, it reviews current research on selective maintenance modeling. Finally, it establishes a numerical model for selecting corrective and preventive maintenance tasks, considering the time uncertainty brought by the unpredictability of maintenance procedures, indeterminate downtime for spares and differences in skil...

  19. Uncertainty Estimation in SiGe HBT Small-Signal Modeling

    DEFF Research Database (Denmark)

    Masood, Syed M.; Johansen, Tom Keinicke; Vidkjær, Jens

    2005-01-01

    An uncertainty estimation and sensitivity analysis is performed on multi-step de-embedding for SiGe HBT small-signal modeling. The uncertainty estimation, in combination with an uncertainty model for deviations in measured S-parameters, quantifies the possible error value in the de-embedded two...

  20. Uncertainty modeling process for semantic technology

    Directory of Open Access Journals (Sweden)

    Rommel N. Carvalho

    2016-08-01

    The ubiquity of uncertainty across application domains generates a need for principled support for uncertainty management in semantically aware systems. A probabilistic ontology provides constructs for representing uncertainty in domain ontologies. While the literature has been growing on formalisms for representing uncertainty in ontologies, there remains little guidance in the knowledge engineering literature for how to design probabilistic ontologies. To address the gap, this paper presents the Uncertainty Modeling Process for Semantic Technology (UMP-ST, a new methodology for modeling probabilistic ontologies. To explain how the methodology works and to verify that it can be applied to different scenarios, this paper describes step-by-step the construction of a proof-of-concept probabilistic ontology. The resulting domain model can be used to support identification of fraud in public procurements in Brazil. While the case study illustrates the development of a probabilistic ontology in the PR-OWL probabilistic ontology language, the methodology is applicable to any ontology formalism that properly integrates uncertainty with domain semantics.

  1. Statistical assessment of predictive modeling uncertainty

    Science.gov (United States)

    Barzaghi, Riccardo; Marotta, Anna Maria

    2017-04-01

    When the results of geophysical models are compared with data, the uncertainties of the model are typically disregarded. We propose a method for defining the uncertainty of a geophysical model based on a numerical procedure that estimates the empirical auto and cross-covariances of model-estimated quantities. These empirical values are then fitted by proper covariance functions and used to compute the covariance matrix associated with the model predictions. The method is tested using a geophysical finite element model in the Mediterranean region. Using a novel χ2 analysis in which both data and model uncertainties are taken into account, the model's estimated tectonic strain pattern due to the Africa-Eurasia convergence in the area that extends from the Calabrian Arc to the Alpine domain is compared with that estimated from GPS velocities while taking into account the model uncertainty through its covariance structure and the covariance of the GPS estimates. The results indicate that including the estimated model covariance in the testing procedure leads to lower observed χ2 values that have better statistical significance and might help a sharper identification of the best-fitting geophysical models.
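The key step above, a χ2 test that includes both the data and the model covariance, can be sketched in a few lines. The residuals and covariance matrices below are made up for illustration; the method's point is visible in how the combined covariance lowers χ2 relative to a data-only test.

```python
# chi^2 with model covariance included: chi2 = r^T (Cd + Cm)^-1 r.
# Two-observation toy case with invented numbers.
r = [0.8, -0.5]                    # data-minus-model residuals
Cd = [[0.25, 0.00], [0.00, 0.25]]  # data (GPS) covariance
Cm = [[0.16, 0.04], [0.04, 0.16]]  # model covariance from fitted covariance functions

C = [[Cd[i][j] + Cm[i][j] for j in range(2)] for i in range(2)]
det = C[0][0] * C[1][1] - C[0][1] * C[1][0]
Cinv = [[C[1][1] / det, -C[0][1] / det], [-C[1][0] / det, C[0][0] / det]]
chi2 = sum(r[i] * Cinv[i][j] * r[j] for i in range(2) for j in range(2))

# Ignoring the model covariance inflates chi^2:
chi2_data_only = r[0] ** 2 / Cd[0][0] + r[1] ** 2 / Cd[1][1]
print(round(chi2, 2), round(chi2_data_only, 2))
```

Adding Cm widens the tolerance of the test, which is exactly the paper's observation that including model covariance yields lower, better-behaved χ2 values.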

  2. Handling Unquantifiable Uncertainties in Landslide Modelling

    Science.gov (United States)

    Almeida, S.; Holcombe, E.; Pianosi, F.; Wagener, T.

    2015-12-01

    Landslides have many negative economic and societal impacts, including the potential for significant loss of life and damage to infrastructure. Slope stability assessment can be used to guide decisions about the management of landslide risk, but its usefulness can be challenged by high levels of uncertainty in predicting landslide occurrence. Prediction uncertainty may be associated with the choice of model that is used to assess slope stability, the quality of the available input data, or a lack of knowledge of how future climatic and socio-economic changes may affect future landslide risk. While some of these uncertainties can be characterised by relatively well-defined probability distributions, for other uncertainties, such as those linked to climate change, there is no agreement on what probability distribution should be used to characterise them. This latter type of uncertainty, often referred to as deep uncertainty, means that robust policies need to be developed that are expected to perform adequately under a wide range of future conditions. In our study the impact of deep uncertainty on slope stability predictions is assessed in a quantitative and structured manner using Global Sensitivity Analysis (GSA) and the Combined Hydrology and Stability Model (CHASM). In particular, we use and combine several GSA methods including the Method of Morris, Regional Sensitivity Analysis and CART, as well as advanced visualization tools. Our example application is a slope in the Caribbean, an area that is naturally susceptible to landslides due to a combination of high rainfall rates, steep slopes, and highly weathered residual soils. Rapid unplanned urbanisation and changing climate may further exacerbate landslide risk in the future. Our example shows how we can gain useful information in the presence of deep uncertainty by combining physically based models with GSA in a scenario discovery framework.
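A minimal sketch of Morris-style elementary effects, one of the GSA methods named above: one-at-a-time steps from random base points, summarized by the mean absolute effect μ* per input. The toy linear "factor of safety" function and its three normalised inputs stand in for CHASM, which is not used here.

```python
import random

random.seed(5)

# Toy stand-in for a slope stability model: factor of safety as a function
# of three normalised inputs (cohesion, slope angle, pore pressure).
# Coefficients are invented for illustration.
def fos(c, a, u):
    return 1.0 + 2.0 * c - 1.5 * a - 0.5 * u

def morris(f, k=3, r=20, delta=0.25):
    effects = [[] for _ in range(k)]
    for _ in range(r):
        x = [random.uniform(0, 1 - delta) for _ in range(k)]
        base = f(*x)
        for i in range(k):  # one-at-a-time step from a random base point
            y = x[:]
            y[i] += delta
            effects[i].append((f(*y) - base) / delta)
    return [sum(map(abs, e)) / r for e in effects]  # mu* per input

mu_star = morris(fos)
print([round(m, 2) for m in mu_star])  # [2.0, 1.5, 0.5]: cohesion dominates
```

Because the toy function is linear, every elementary effect equals the corresponding coefficient; on a real model like CHASM the spread of the effects additionally flags interactions and nonlinearity.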

  3. Structural uncertainty in watershed phosphorus modeling: Toward a stochastic framework

    Science.gov (United States)

    Chen, Lei; Gong, Yongwei; Shen, Zhenyao

    2016-06-01

    Structural uncertainty is an important source of model predictive errors, but few studies have been conducted on the error-transitivity from model structure to nonpoint source (NPS) prediction. In this study, we focused on the structural uncertainty caused by the algorithms and equations that are used to describe the phosphorus (P) cycle at the watershed scale. The sensitivity of simulated P to each algorithm/equation was quantified using the Soil and Water Assessment Tool (SWAT) in the Three Gorges Reservoir Area, China. The results indicated that the ratios of C:N and P:N for humic materials, as well as the algorithm of fertilization and P leaching contributed the largest output uncertainties. In comparison, the initiation of inorganic P in the soil layer and the transformation algorithm between P pools are less sensitive for the NPS-P predictions. In addition, the coefficient of variation values were quantified as 0.028-0.086, indicating that the structure-induced uncertainty is minor compared to NPS-P prediction uncertainty caused by the model input and parameters. Using the stochastic framework, the cumulative probability of simulated NPS-P data provided a trade-off between expenditure burden and desired risk. In this sense, this paper provides valuable information for the control of model structural uncertainty, and can be extrapolated to other model-based studies.

  4. Assessing uncertainties in solute transport models: Upper Narew case study

    Science.gov (United States)

    Osuch, M.; Romanowicz, R.; Napiórkowski, J. J.

    2009-04-01

    This paper evaluates uncertainties in two solute transport models based on tracer experiment data from the Upper River Narew. Data Based Mechanistic and transient storage models were applied to Rhodamine WT tracer observations. We focus on the analysis of uncertainty and the sensitivity of model predictions to varying physical parameters, such as dispersion and channel geometry. An advection-dispersion model with dead zones (Transient Storage model) adequately describes the transport of pollutants in a single channel river with multiple storage. The applied transient storage model is deterministic; it assumes that observations are free of errors and the model structure perfectly describes the process of transport of conservative pollutants. In order to take into account the model and observation errors, an uncertainty analysis is required. In this study we used a combination of the Generalized Likelihood Uncertainty Estimation technique (GLUE) and the variance based Global Sensitivity Analysis (GSA). The combination is straightforward as the same samples (Sobol samples) were generated for GLUE analysis and for sensitivity assessment. Additionally, the results of the sensitivity analysis were used to specify the best parameter ranges and their prior distributions for the evaluation of predictive model uncertainty using the GLUE methodology. Apart from predictions of pollutant transport trajectories, two ecological indicators were also studied (time over the threshold concentration and maximum concentration). In particular, a sensitivity analysis of the length of the "over the threshold" period shows an interesting multi-modal dependence on model parameters. This behavior is a result of the direct influence of parameters on different parts of the dynamic response of the system. As an alternative to the transient storage model, a Data Based Mechanistic approach was tested. Here, the model is identified and the parameters are estimated from available time series data using
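The GLUE step of the methodology can be sketched as follows: sample parameter sets, keep the "behavioural" ones whose informal likelihood exceeds a threshold, and form likelihood-weighted predictions. A toy first-order decay model stands in for the transient storage model, and all numbers are illustrative.

```python
import math
import random

random.seed(6)

# Synthetic "tracer" observations from a toy first-order decay model.
k_true = 0.3
t_obs = [1, 2, 4, 8]
obs = [math.exp(-k_true * t) + random.gauss(0, 0.01) for t in t_obs]

# GLUE: Monte Carlo sampling of the parameter, informal Gaussian likelihood.
samples = []
for _ in range(2000):
    k = random.uniform(0.05, 0.8)
    sse = sum((c - math.exp(-k * t)) ** 2 for t, c in zip(t_obs, obs))
    samples.append((k, math.exp(-sse / (2 * 0.01 ** 2))))

# Keep behavioural sets (likelihood above a threshold) and weight them.
behavioural = [(k, w) for k, w in samples if w > 0.01]
wsum = sum(w for _, w in behavioural)
k_mean = sum(k * w for k, w in behavioural) / wsum
print(f"{len(behavioural)} behavioural sets, weighted k = {k_mean:.2f}")
```

Prediction intervals follow from pushing the behavioural sets through the model and taking weighted quantiles of the outputs, which is how GLUE delivers uncertainty bounds on concentration trajectories.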

  5. Uncertainty analysis in dissolved oxygen modeling in streams.

    Science.gov (United States)

    Hamed, Maged M; El-Beshry, Manar Z

    2004-08-01

    Uncertainty analysis in surface water quality modeling is an important issue. This paper presents a method based on the first-order reliability method (FORM) to assess the exceedance probability of a target dissolved oxygen concentration in a stream, using a Streeter-Phelps prototype model. Basic uncertainty in the input parameters is considered by representing them as random variables with prescribed probability distributions. Results obtained from FORM analysis compared well with those of the Monte Carlo simulation method. The analysis also presents the stochastic sensitivity of the probabilistic outcome in the form of uncertainty importance factors, and shows how they change with changing simulation time. Furthermore, a parametric sensitivity analysis was conducted to show the effect of selection of different probability distribution functions for the three most important parameters on the design point, exceedance probability, and importance factors.
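The exceedance probability that FORM approximates can be cross-checked by plain Monte Carlo on the Streeter-Phelps oxygen deficit, much like the comparison reported above. The parameter distributions, saturation value and target below are illustrative assumptions, not the paper's values.

```python
import math
import random

random.seed(7)

# P(DO at time t falls below a target), Streeter-Phelps deficit with
# uncertain deoxygenation (kd) and reaeration (ka) rates. Distributions
# and constants are illustrative.
DO_sat, L0, D0, t, target = 9.0, 15.0, 1.0, 2.0, 4.0  # mg/L and days

def deficit(kd, ka):
    return kd * L0 / (ka - kd) * (math.exp(-kd * t) - math.exp(-ka * t)) \
           + D0 * math.exp(-ka * t)

n, fails = 50000, 0
for _ in range(n):
    kd = random.lognormvariate(math.log(0.35), 0.2)
    ka = random.lognormvariate(math.log(0.70), 0.2)
    if DO_sat - deficit(kd, ka) < target:
        fails += 1
print(f"P(DO < {target} mg/L) ~ {fails / n:.3f}")
```

FORM replaces this brute-force count with a search for the design point, the most probable parameter combination on the failure boundary, whose distance from the mean in standard-normal space gives the exceedance probability directly.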

  6. Volcano deformation source parameters estimated from InSAR: Sensitivities to uncertainties in seismic tomography

    Science.gov (United States)

    Masterlark, Timothy; Donovan, Theodore; Feigl, Kurt L.; Haney, Matthew; Thurber, Clifford H.; Tung, Sui

    2016-04-01

    The eruption cycle of a volcano is controlled in part by the upward migration of magma. The characteristics of the magma flux produce a deformation signature at the Earth's surface. Inverse analyses use geodetic data to estimate strategic controlling parameters that describe the position and pressurization of a magma chamber at depth. The specific distribution of material properties controls how observed surface deformation translates to source parameter estimates. Seismic tomography models describe the spatial distributions of material properties that are necessary for accurate models of volcano deformation. This study investigates how uncertainties in seismic tomography models propagate into variations in the estimates of volcano deformation source parameters inverted from geodetic data. We conduct finite element model-based nonlinear inverse analyses of interferometric synthetic aperture radar (InSAR) data for Okmok volcano, Alaska, as an example. We then analyze the estimated parameters and their uncertainties to characterize the magma chamber. Analyses are performed separately for models simulating a pressurized chamber embedded in a homogeneous domain as well as for a domain having a heterogeneous distribution of material properties according to seismic tomography. The estimated depth of the source is sensitive to the distribution of material properties. The estimated depths for the homogeneous and heterogeneous domains are 2666 ± 42 and 3527 ± 56 m below mean sea level, respectively (99% confidence). A Monte Carlo analysis indicates that uncertainties of the seismic tomography cannot account for this discrepancy at the 99% confidence level. Accounting for the spatial distribution of elastic properties according to seismic tomography significantly improves the fit of the deformation model predictions and significantly influences estimates for parameters that describe the location of a pressurized magma chamber.

  7. Modelling of data uncertainties on hybrid computers

    Energy Technology Data Exchange (ETDEWEB)

    Schneider, Anke (ed.)

    2016-06-15

    The codes d{sup 3}f and r{sup 3}t are well established for modelling density-driven flow and nuclide transport in the far field of repositories for hazardous material in deep geological formations. They are applicable in porous media as well as in fractured rock or mudstone, for modelling salt and heat transport as well as a free groundwater surface. Development of the basic framework of d{sup 3}f and r{sup 3}t began more than 20 years ago. Since that time, significant advances have taken place both in the requirements for safety assessment and in computer hardware. The period of safety assessment for a repository of high-level radioactive waste was extended to 1 million years, and the complexity of the models is steadily growing. Concurrently, the demands on accuracy increase. Additionally, model and parameter uncertainties become more and more important for an increased understanding of prediction reliability. All this leads to a growing demand for computational power that requires a considerable software speed-up. An effective way to achieve this is the use of modern, hybrid computer architectures, which basically requires the set-up of new data structures and a corresponding code revision but offers a potential speed-up of several orders of magnitude. The original codes d{sup 3}f and r{sup 3}t were applications of the software platform UG /BAS 94/, whose development began in the early nineteen-nineties. However, UG has recently been advanced to the C++ based, substantially revised version UG4 /VOG 13/. To continue to benefit from state-of-the-art numerical algorithms and to use hybrid computer architectures, the codes d{sup 3}f and r{sup 3}t were transferred to this new code platform. Making use of the fact that coupling between different sets of equations is natively supported in UG4, d{sup 3}f and r{sup 3}t were combined into one conjoint code d{sup 3}f++. 
A direct estimation of uncertainties for complex groundwater flow models with the

  8. Uncertainty in the determination of soil hydraulic parameters and its influence on the performance of two hydrological models of different complexity

    NARCIS (Netherlands)

    Baroni, G.; Facchi, A.; Gandolfi, C.; Ortuani, B.; Horeschi, D.; Dam, van J.C.

    2010-01-01

    Data on soil hydraulic properties are often a limiting factor in unsaturated zone modelling, especially at larger scales. Investigations for the hydraulic characterization of soils are time-consuming and costly, and the accuracy of the results obtained by the different methodologies is still d

  10. Model Uncertainty for Bilinear Hysteretic Systems

    DEFF Research Database (Denmark)

    1984-01-01

    is related to the concept of a failure surface (or limit state surface) in the n-dimensional basic variable space then model uncertainty is at least due to the neglected variables, the modelling of the failure surface and the computational technique used. A more precise definition is given in section 2...

  11. Coping with Uncertainty Modeling and Policy Issues

    CERN Document Server

    Marti, Kurt; Makowski, Marek

    2006-01-01

    Ongoing global changes bring fundamentally new scientific problems requiring new concepts and tools. The complexity of these new problems does not allow enough certainty to be achieved merely by increasing the resolution of models or by bringing in more links. This book presents new tools for the modeling and management of uncertainty.

  12. Response model parameter linking

    NARCIS (Netherlands)

    Barrett, Michelle Derbenwick

    2015-01-01

    With a few exceptions, the problem of linking item response model parameters from different item calibrations has been conceptualized as an instance of the problem of equating observed scores on different test forms. This thesis argues, however, that the use of item response models does not require

  13. An educational model for ensemble streamflow simulation and uncertainty analysis

    Directory of Open Access Journals (Sweden)

    A. AghaKouchak

    2013-02-01

    Full Text Available This paper presents the hands-on modeling toolbox HBV-Ensemble, designed as a complement to theoretical hydrology lectures, to teach hydrological processes and their uncertainties. HBV-Ensemble can be used for in-class lab practices and homework assignments, and for assessment of students' understanding of hydrological processes. Using this modeling toolbox, students can gain more insight into how hydrological processes (e.g., precipitation, snowmelt and snow accumulation, soil moisture, evapotranspiration and runoff generation) are interconnected. The educational toolbox includes a MATLAB Graphical User Interface (GUI) and an ensemble simulation scheme that can be used for teaching uncertainty analysis, parameter estimation, ensemble simulation and model sensitivity. HBV-Ensemble was administered in a class for both in-class instruction and a final project, and students submitted their feedback about the toolbox. The results indicate that this educational software had a positive impact on students' understanding and knowledge of uncertainty in hydrological modeling.
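
The ensemble scheme such a toolbox teaches can be illustrated with a toy model: run many members of a simple rainfall-runoff model, each with a different sampled parameter value, and inspect the spread. The single-bucket model and parameter range below are hypothetical and far simpler than HBV:

```python
import random

def linear_reservoir(precip, k):
    """Toy runoff model: one bucket drained at rate k (not the real HBV)."""
    storage, runoff = 0.0, []
    for p in precip:
        storage += p
        q = k * storage  # outflow proportional to current storage
        storage -= q
        runoff.append(q)
    return runoff

random.seed(1)
precip = [5.0, 0.0, 10.0, 2.0, 0.0, 8.0]
# Parameter uncertainty: sample the recession constant k for each member
ensemble = [linear_reservoir(precip, random.uniform(0.2, 0.6))
            for _ in range(100)]
spread = [max(m[t] for m in ensemble) - min(m[t] for m in ensemble)
          for t in range(len(precip))]  # per-time-step ensemble spread
```

Plotting all members plus the spread band is what conveys, visually, how parameter uncertainty propagates into simulated streamflow.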

  14. A Simplified Model of Choice Behavior under Uncertainty

    OpenAIRE

    Lin, Ching-Hung; Lin, Yu-Kai; Song, Tzu-Jiun; Huang, Jong-Tsun; Chiu, Yao-Chu

    2016-01-01

    The Iowa Gambling Task (IGT) has been standardized as a clinical assessment tool (Bechara, 2007). Nonetheless, numerous research groups have attempted to modify IGT models to optimize parameters for predicting the choice behavior of normal controls and patients. A decade ago, most researchers considered the expected utility (EU) model (Busemeyer and Stout, 2002) to be the optimal model for predicting choice behavior under uncertainty. However, in recent years, studies have demonstrated that m...

  16. A simplified model of choice behavior under uncertainty

    Directory of Open Access Journals (Sweden)

    Ching-Hung Lin

    2016-08-01

    Full Text Available The Iowa Gambling Task (IGT) has been standardized as a clinical assessment tool (Bechara, 2007). Nonetheless, numerous research groups have attempted to modify IGT models to optimize parameters for predicting the choice behavior of normal controls and patients. A decade ago, most researchers considered the expected utility (EU) model (Busemeyer and Stout, 2002) to be the optimal model for predicting choice behavior under uncertainty. However, in recent years, studies have demonstrated the prospect utility (PU) models (Ahn et al., 2008) to be more effective than the EU models in the IGT. Nevertheless, after some preliminary tests, we propose that the Ahn et al. (2008) PU model is not optimal due to some incompatible results between our behavioral and modeling data. This study modified the Ahn et al. (2008) PU model into a simplified model and collected 145 subjects' IGT performance as the benchmark data for comparison. In our simplified PU model, the best goodness-of-fit was found mostly as α approached zero. More specifically, we retested the key parameters α, λ, and A in the PU model. Notably, the power of influence of the parameters α, λ, and A has a hierarchical order in terms of manipulating the goodness-of-fit in the PU model. Additionally, we found that the parameters λ and A may be ineffective when the parameter α is close to zero in the PU model. The present simplified model demonstrated that decision makers mostly adopted the strategy of gain-stay-loss-shift rather than foreseeing the long-term outcome. However, there are still other behavioral variables that are not well revealed under these dynamic uncertainty situations. Therefore, the optimal behavioral models may not yet have been found. In short, the best model for predicting choice behavior under dynamic-uncertainty situations should be further evaluated.
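
The role of α can be seen in a minimal prospect-style utility function (an illustrative sketch, not the exact parameterization of Ahn et al., 2008): as α approaches zero, payoff magnitudes are compressed toward 1, so only the sign of the outcome survives, which is consistent with a gain-stay-loss-shift strategy.

```python
def prospect_utility(x, alpha, lam):
    """Prospect-style utility: concave gains, losses scaled by aversion lam."""
    if x >= 0:
        return x ** alpha
    return -lam * (abs(x) ** alpha)

# With alpha near zero, very different payoffs get nearly identical utility:
u_small_gain = prospect_utility(50.0, 0.01, 1.5)
u_large_gain = prospect_utility(250.0, 0.01, 1.5)
# both values are close to 1.0 — payoff magnitude barely matters, only its sign
```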

  17. A New Ensemble-Based Method for Assessing Uncertainties and Parameter Tradeoffs in Complex Models of Postseismic Deformation: Application to the 2010 M=7.2 El Mayor-Cucapah Earthquake

    Science.gov (United States)

    Rollins, C.; Barbot, S.; Avouac, J. P.

    2014-12-01

    The 2010 M=7.2 El Mayor-Cucapah earthquake occurred in the Salton Trough, a region of thinned lithosphere and high heat flow, and the postseismic deformation following this earthquake presents a unique opportunity to study the rheology of extensional environments and the mechanics of ductile flow within and beneath the lithosphere. Previous work [Rollins et al, in prep.] revealed that GPS time series of surface displacement following the earthquake were well fit by a coupled model simulating stress-driven afterslip on the deep extension of the coseismic rupture, Newtonian viscoelastic relaxation in a low-viscosity zone in the lower crust of the Salton Trough aligned with areas of high heat flow, and Newtonian viscoelastic relaxation in a three-dimensional asthenosphere with geometry matching that of the regional lithosphere-asthenosphere boundary inferred from receiver functions. Extending the success of this model to a robust interpretation of the mechanics of deformation at depth requires a better understanding of uncertainties and trade-offs between parameters (depth of the brittle-ductile transition, viscosities of the lower crust and asthenosphere, geometry of viscosity anomalies in the Salton Trough, frictional parameters of the possible downdip extensions of the coseismic rupture, and correlations among these parameters). We will show results from recent work that uses a newly developed method to efficiently explore this model space in a Bayesian sense. The method employs the Neighborhood Algorithm of Sambridge [1999], which makes use of Voronoi cells to optimize the search in the model space, samples regions that contain models with acceptable data fit, and extracts robust information from the ensemble of models obtained. The method is particularly well suited to identifying a class of models that fit geodetic data approximately equally well, allowing us to present and discuss a range of possible deformation mechanisms. 
This method can be applied to any study of

  18. Uncertainty in hydrological change modelling

    DEFF Research Database (Denmark)

    Seaby, Lauren Paige

    methodology for basin discharge and groundwater heads. The ensemble of 11 climate models varied in strength, significance, and sometimes in direction of the climate change signal. The more complex daily DBS correction methods were more accurate at transferring precipitation changes in mean as well...... as the variance, and improving the characterisation of day to day variation as well as heavy events. However, the most highly parameterised of the DBS methods were less robust under climate change conditions. The spatial characteristics of groundwater head and stream discharge were best represented by DBS methods...... applied at the grid scale. Flux and state hydrological outputs which integrate responses over time and space showed more sensitivity to precipitation mean spatial biases and less so on extremes. In the investigated catchments, the projected change of groundwater levels and basin discharge between current...

  19. Spectral optimization and uncertainty quantification in combustion modeling

    Science.gov (United States)

    Sheen, David Allan

    Reliable simulations of reacting flow systems require a well-characterized, detailed chemical model as a foundation. Accuracy of such a model can be assured, in principle, by a multi-parameter optimization against a set of experimental data. However, the inherent uncertainties in the rate evaluations and experimental data leave a model still characterized by some finite kinetic rate parameter space. Without a careful analysis of how this uncertainty space propagates into the model's predictions, those predictions can at best be trusted only qualitatively. In this work, the Method of Uncertainty Minimization using Polynomial Chaos Expansions is proposed to quantify these uncertainties. In this method, the uncertainty in the rate parameters of the as-compiled model is quantified. Then, the model is subjected to a rigorous multi-parameter optimization, as well as a consistency-screening process. Lastly, the uncertainty of the optimized model is calculated using an inverse spectral optimization technique, and then propagated into a range of simulation conditions. An as-compiled, detailed H2/CO/C1-C4 kinetic model is combined with a set of ethylene combustion data to serve as an example. The idea that the hydrocarbon oxidation model should be understood and developed in a hierarchical fashion has been a major driving force in kinetics research for decades. How this hierarchical strategy works at a quantitative level, however, has never been addressed. In this work, we use ethylene and propane combustion as examples and explore the question of hierarchical model development quantitatively. The Method of Uncertainty Minimization using Polynomial Chaos Expansions is utilized to quantify the amount of information that a particular combustion experiment, and thereby each data set, contributes to the model. This knowledge is applied to explore the relationships among the combustion chemistry of hydrogen/carbon monoxide, ethylene, and larger alkanes. 
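
The core of a polynomial chaos expansion can be sketched in one dimension: project an uncertain response onto Hermite polynomials and read the mean and variance directly off the coefficients. The response function below is an arbitrary illustrative example with a log-normally distributed rate parameter, not a combustion model:

```python
import math

# 3-point Gauss-Hermite quadrature in the probabilists' convention
nodes = [-math.sqrt(3.0), 0.0, math.sqrt(3.0)]
weights = [1.0 / 6.0, 2.0 / 3.0, 1.0 / 6.0]

def pce_coefficients(f):
    """Project f(xi), xi ~ N(0, 1), onto Hermite polynomials He0, He1, He2."""
    he = [lambda x: 1.0, lambda x: x, lambda x: x * x - 1.0]
    norms = [1.0, 1.0, 2.0]  # E[He_i(xi)^2]
    return [sum(w * f(x) * he[i](x) for x, w in zip(nodes, weights)) / norms[i]
            for i in range(3)]

# Hypothetical response with an uncertain rate parameter: f(xi) = exp(0.4*xi)
a = pce_coefficients(lambda x: math.exp(0.4 * x))
pce_mean = a[0]                        # approximates E[f] = exp(0.08)
pce_var = a[1] ** 2 + 2.0 * a[2] ** 2  # variance from the PCE coefficients
```

Once the coefficients are known, propagating the uncertainty to any simulation condition is a matter of re-evaluating the cheap polynomial surrogate rather than the full model.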
Frequently, new data will

  20. Impact of uncertainty description on assimilating hydraulic head in the MIKE SHE distributed hydrological model

    DEFF Research Database (Denmark)

    Zhang, Donghua; Madsen, Henrik; Ridler, Marc E.

    2015-01-01

    uncertainty. In most hydrological EnKF applications, an ad hoc model uncertainty is defined with the aim of avoiding a collapse of the filter. The present work provides a systematic assessment of model uncertainty in DA applications based on combinations of forcing, model parameters, and state uncertainties....... This is tested in a case where groundwater hydraulic heads are assimilated into a distributed and integrated catchment-scale model of the Karup catchment in Denmark. A series of synthetic data assimilation experiments are carried out to analyse the impact of different model uncertainty assumptions...

  1. Influence of hydrodynamic parameters on tsunami run-up uncertainty induced by earthquake random slip distributions

    Science.gov (United States)

    Løvholt, Finn; Kim, Jihwan; Pedersen, Geir; Harbitz, Carl

    2016-04-01

    The standard approach in forward modeling of earthquake tsunamis usually assumes a uniform slip pattern. This assumption is used in both deterministic and probabilistic models. However, the slip distribution for an earthquake is subject to (aleatory) uncertainty, and consequently the induced tsunami run-up will have an uncertainty range even given the same moment magnitude and hypocentre location. Here, we present studies of run-up variability due to stochastic earthquake slip variation in both two and three dimensions. The approach taken is fully idealized, although we draw upon the experience from two of the most destructive events of the last hundred years, namely the Mw8 1976 Moro Gulf earthquake and tsunami as well as the Mw9 2011 Tohoku earthquake tsunami. The former event is used to design the two-dimensional stochastic simulations, and the latter event the three-dimensional simulations. Our primary focus is not to reproduce past run-up, but rather to investigate how the hydrodynamics influence uncertainty. These factors include, among others, the non-hydrodynamic response during generation, frequency dispersion, friction from the seabed, and wave breaking. We simulate tsunamis for an ensemble of synthetic random slip realizations over an idealized shelf geometry broken into linear segments. The uncertainty propagation from source to run-up for the two different cases is discussed and compared. As demonstrated, both the dimensionality and the earthquake parameters influence the contributions of the hydrodynamic parameters to the uncertainty. Further work will be needed to explore the transitional behaviour between the two very different cases displayed here. The research leading to these results has received funding from the European Union's Seventh Framework Programme (FP7/2007-2013) under grant agreement 603839 (Project ASTARTE).

  2. The effect of uncertainty and systematic errors in hydrological modelling

    Science.gov (United States)

    Steinsland, I.; Engeland, K.; Johansen, S. S.; Øverleir-Petersen, A.; Kolberg, S. A.

    2014-12-01

    The aims of hydrological model identification and calibration are to find the best possible set of process parametrizations and parameter values that transform inputs (e.g. precipitation and temperature) to outputs (e.g. streamflow). These models enable us to make predictions of streamflow. Several sources of uncertainty have the potential to hamper a robust model calibration and identification. In order to grasp the interaction between model parameters, inputs and streamflow, it is important to account for both systematic and random errors in inputs (e.g. precipitation and temperatures) and streamflows. By random errors we mean errors that are independent from time step to time step, whereas by systematic errors we mean errors that persist for a longer period. Both random and systematic errors are important in the observation and interpolation of precipitation and temperature inputs. Important random errors come from the measurements themselves and from the network of gauges. Important systematic errors originate from the under-catch in precipitation gauges and from unknown spatial trends that are approximated in the interpolation. For streamflow observations, the water level recordings might give random errors, whereas the rating curve contributes mainly a systematic error. In this study we want to answer the question "What is the effect of random and systematic errors in inputs and observed streamflow on estimated model parameters and streamflow predictions?". To answer it, we systematically test the effect of including uncertainties in inputs and streamflow during model calibration and simulation in a distributed HBV model operating on daily time steps for the Osali catchment in Norway. The case study is based on observations whose uncertainty has been carefully quantified; increased uncertainties and systematic errors are introduced realistically, for example by removing a precipitation gauge from the network. We find that the systematic errors in
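
The contrast between the two error types can be demonstrated with a toy calibration: fit a single runoff coefficient to flows generated from "true" precipitation, once with random multiplicative noise and once with a systematic 10% under-catch. All numbers and the model form below are illustrative, not the HBV/Osali setup:

```python
import random

random.seed(2)
true_precip = [random.uniform(0.0, 10.0) for _ in range(365)]
true_flow = [0.5 * p for p in true_precip]  # "truth": runoff coefficient 0.5

def calibrate(precip_obs):
    # Least-squares runoff coefficient c for the model flow = c * precip
    num = sum(p * q for p, q in zip(precip_obs, true_flow))
    den = sum(p * p for p in precip_obs)
    return num / den

random_err = [p * random.gauss(1.0, 0.1) for p in true_precip]  # random noise
systematic = [0.9 * p for p in true_precip]                     # 10% under-catch
c_random, c_systematic = calibrate(random_err), calibrate(systematic)
# Random noise leaves c near 0.5; the systematic under-catch biases it to 0.5/0.9.
```

The systematic error is fully absorbed into the calibrated parameter, which then misrepresents the catchment whenever the model is driven with unbiased inputs.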

  3. Uncertainties in Surface Layer Modeling

    Science.gov (United States)

    Pendergrass, W.

    2015-12-01

    A central problem for micrometeorologists has been the relationship of air-surface exchange rates of momentum and heat to quantities that can be predicted with confidence. The flux-gradient profile developed through Monin-Obukhov Similarity Theory (MOST) provides an integration of the dimensionless wind shear expression, where the stability function ϕ is an empirically derived expression for stable and unstable atmospheric conditions. Empirically derived expressions are far from universally accepted (Garratt, 1992, Table A5). Regardless of what form of these relationships might be used, their significance over any short period of time is questionable, since all of these relationships between fluxes and gradients apply to averages that might rarely occur. It is well accepted that the assumptions of stationarity and homogeneity do not reflect the true chaotic nature of the processes that control the variables considered in these relationships, with the net consequence that the levels of predictability theoretically attainable might never be realized in practice. This matter is of direct relevance to modern prognostic models, which construct forecasts by assuming the universal applicability of relationships among averages for the lower atmosphere, which rarely maintains an average state. Under a Cooperative Research and Development Agreement between NOAA and Duke Energy Generation, NOAA/ATDD conducted atmospheric boundary layer (ABL) research using Duke renewable energy sites as research testbeds. One aspect of this research has been the evaluation of legacy flux-gradient formulations (the ϕ functions; see Monin and Obukhov, 1954) for the exchange of heat and momentum. At the Duke Energy Ocotillo site, NOAA/ATDD installed sonic anemometers reporting wind and temperature fluctuations at 10 Hz at eight elevations. From these observations, ϕM and ϕH were derived from a two-year database of mean and turbulent wind and temperature observations. From this extensive measurement database, using a
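
One widely used pair of empirical forms (the Dyer relations as functions of the stability parameter ζ = z/L; many other choices exist, which is precisely the point about non-universality) can be written as:

```python
def phi_m(zeta):
    """Dyer-type flux-gradient function for momentum (one common choice)."""
    if zeta >= 0.0:                       # stable conditions
        return 1.0 + 5.0 * zeta
    return (1.0 - 16.0 * zeta) ** -0.25   # unstable conditions

def phi_h(zeta):
    """Dyer-type flux-gradient function for heat."""
    if zeta >= 0.0:
        return 1.0 + 5.0 * zeta
    return (1.0 - 16.0 * zeta) ** -0.5
```

Comparing such legacy curves against ϕM and ϕH binned from the sonic-anemometer database is the natural way to quantify how far the observed surface layer departs from the similarity-theory averages.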

  4. Sensitivities and uncertainties of modeled ground temperatures in mountain environments

    Directory of Open Access Journals (Sweden)

    S. Gubler

    2013-08-01

    discretization parameters. We show that the temporal resolution should be at least 1 h to ensure errors less than 0.2 °C in modeled MAGT, and the uppermost ground layer should be at most 20 mm thick. Within the topographic setting, the total parametric output uncertainties, expressed as the length of the 95% uncertainty interval of the Monte Carlo simulations, range from 0.5 to 1.5 °C for clay and silt, and from 0.5 to around 2.4 °C for peat, sand, gravel and rock. These uncertainties are comparable to the variability of ground surface temperatures measured within 10 m × 10 m grids in Switzerland. The increased uncertainties for sand, peat and gravel are largely due to their sensitivity to the hydraulic conductivity.

  5. Review of strategies for handling geological uncertainty in groundwater flow and transport modeling

    DEFF Research Database (Denmark)

    Refsgaard, Jens Christian; Christensen, Steen; Sonnenborg, Torben O.;

    2012-01-01

    The geologically related uncertainty in groundwater modeling originates from two main sources: geological structures and hydraulic parameter values within these structures. Within a geological structural element the parameter values will always exhibit local scale heterogeneity, which can...... be accounted for, but is often neglected, in assessments of prediction uncertainties. Strategies for assessing prediction uncertainty due to geologically related uncertainty may be divided into three main categories, accounting for uncertainty due to: (a) the geological structure; (b) effective model...... parameters; and (c) model parameters including local scale heterogeneity. The most common methodologies for uncertainty assessments within each of these categories, such as multiple modeling, Monte Carlo analysis, regression analysis and moment equation approach, are briefly described with emphasis...

  6. Estimation of a multivariate mean under model selection uncertainty

    Directory of Open Access Journals (Sweden)

    Georges Nguefack-Tsague

    2014-05-01

    Full Text Available Model selection uncertainty would occur if we selected a model based on one data set and subsequently applied it for statistical inferences, because the "correct" model would not be selected with certainty. When the selection and inference are based on the same dataset, some additional problems arise due to the correlation of the two stages (selection and inference). In this paper model selection uncertainty is considered and model averaging is proposed. The proposal is related to the James and Stein theory of estimating three or more parameters from independent normal observations. We suggest that a model averaging scheme taking into account the selection procedure could be more appropriate than model selection alone. Some properties of this model averaging estimator are investigated; in particular we show, using Stein's results, that it is a minimax estimator and can outperform Stein-type estimators.
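
The James-Stein shrinkage that motivates such averaging schemes can be sketched in a few lines (positive-part variant, unit noise variance assumed purely for illustration):

```python
def james_stein(x, sigma2=1.0):
    """Positive-part James-Stein estimate of a p-dimensional normal mean (p >= 3)."""
    p = len(x)
    s = sum(v * v for v in x)
    shrink = max(0.0, 1.0 - (p - 2) * sigma2 / s)  # shrink toward the origin
    return [shrink * v for v in x]

# Shrinkage is strong for observations near the origin, weak far from it:
near = james_stein([0.5, 0.5, 0.5])     # shrunk all the way to zero
far = james_stein([10.0, -10.0, 10.0])  # barely changed
```

The analogy to model averaging is that both replace an all-or-nothing choice (the raw estimate, the single selected model) with a weighted compromise, and both can dominate the naive procedure in risk.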

  7. Systematic Uncertainties in High-Energy Hadronic Interaction Models

    Science.gov (United States)

    Zha, M.; Knapp, J.; Ostapchenko, S.

    2003-07-01

    Hadronic interaction models for cosmic ray energies are uncertain since our knowledge of hadronic interactions is extrapolated from accelerator experiments at much lower energies. At present most high-energy models are based on Gribov-Regge theory of multi-Pomeron exchange, which provides a theoretical framework to evaluate cross-sections and particle production. While experimental data constrain some of the model parameters, others are not well determined and are therefore a source of systematic uncertainties. In this paper we evaluate the variation of results obtained with the QGSJET model, when modifying parameters relating to three major sources of uncertainty: the form of the parton structure function, the role of diffractive interactions, and the string hadronisation. Results on inelastic cross sections, on secondary particle production and on the air shower development are discussed.

  8. Impact of Martian atmosphere parameter uncertainties on entry vehicles aerodynamic for hypersonic rarefied conditions

    Science.gov (United States)

    Fei, Huang; Xu-hong, Jin; Jun-ming, Lv; Xiao-li, Cheng

    2016-11-01

    An attempt has been made to analyze the impact of Martian atmosphere parameter uncertainties on entry vehicle aerodynamics for hypersonic rarefied conditions with a DSMC code. The code has been validated by comparing Viking vehicle flight data with present computational results. Then, by simulating flows around the Mars Science Laboratory, the impact of free stream parameter uncertainties on aerodynamics is investigated. The validation results show that the present numerical approach agrees well with the Viking flight data. The physical and chemical properties of CO2 have a strong impact on the aerodynamics of Mars entry vehicles, so it is necessary to make proper corrections to data obtained with an air model in hypersonic rarefied conditions, which is consistent with the conclusions drawn in the continuum regime. Uncertainties in free stream density and velocity weakly influence aerodynamics and pitching moment. However, aerodynamics appears to be little influenced by free stream temperature, the maximum error being below 0.5%. The center of pressure position is not sensitive to free stream parameters.

  9. Uncertainty quantification and stochastic modeling with Matlab

    CERN Document Server

    Souza de Cursi, Eduardo

    2015-01-01

    Uncertainty Quantification (UQ) is a relatively new research area which describes the methods and approaches used to supply quantitative descriptions of the effects of uncertainty, variability and errors in simulation problems and models. It is rapidly becoming a field of increasing importance, with many real-world applications within statistics, mathematics, probability and engineering, but also within the natural sciences. Literature on the topic has up until now been largely based on polynomial chaos, which raises difficulties when considering different types of approximation and does no

  10. Optical Model and Cross Section Uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Herman,M.W.; Pigni, M.T.; Dietrich, F.S.; Oblozinsky, P.

    2009-10-05

    Distinct minima and maxima in the neutron total cross section uncertainties were observed in model calculations using a spherical optical potential. We found this oscillating structure to be a general feature of quantum mechanical wave scattering. Specifically, we analyzed neutron interaction with 56Fe from 1 keV up to 65 MeV, and investigated the physical origin of the minima. We discuss their potential importance for practical applications as well as the implications for the uncertainties in total and absorption cross sections.

  11. Extended Range Hydrological Predictions: Uncertainty Associated with Model Parametrization

    Science.gov (United States)

    Joseph, J.; Ghosh, S.; Sahai, A. K.

    2016-12-01

    The better understanding of various atmospheric processes has led to improved predictions of meteorological conditions at various temporal scales, ranging from short term (up to 2 days) to long term (more than 10 days). Accurate prediction of hydrological variables can be made using these predicted meteorological conditions, which would be helpful in the proper management of water resources. Extended range hydrological simulation includes the prediction of hydrological variables for a period of more than 10 days. The main sources of uncertainty in hydrological predictions include uncertainty in the initial conditions, meteorological forcing and model parametrization. In the present study, the Extended Range Prediction developed for the Indian monsoon by the Indian Institute of Tropical Meteorology (IITM), Pune is used as meteorological forcing for the Variable Infiltration Capacity (VIC) model. Sensitive hydrological parameters, as derived from the literature, along with a few vegetation parameters are assumed to be uncertain, and 1000 random values are generated within their prescribed ranges. Uncertainty bands are generated by performing Monte Carlo Simulations (MCS) for the generated sets of parameters and observed meteorological forcings. Basins with minimal human intervention within the Indian Peninsular region are identified, and validation of results is carried out using observed gauge discharge. Further, uncertainty bands are generated for the extended range hydrological predictions by performing MCS for the same set of parameters and the extended range meteorological predictions. The results demonstrate the uncertainty associated with model parametrization for the extended range hydrological simulations. Keywords: Extended Range Prediction, Variable Infiltration Capacity model, Monte Carlo Simulation.
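
Monte Carlo uncertainty bands of this kind reduce to sorting ensemble outputs and reading off percentiles. The response curve and parameter range below are hypothetical placeholders for the VIC runs:

```python
import random

def toy_discharge(b, precip=50.0):
    # Hypothetical response of discharge to one uncertain infiltration parameter
    return precip * (1.0 - (1.0 - min(b, 1.0)) ** 2)

random.seed(3)
# 1000 Monte Carlo draws of the uncertain parameter within its prescribed range
runs = sorted(toy_discharge(random.uniform(0.1, 0.5)) for _ in range(1000))
band_95 = (runs[25], runs[975])  # empirical 95% uncertainty band
```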

  12. Representing uncertainty on model analysis plots

    Science.gov (United States)

    Smith, Trevor I.

    2016-12-01

    Model analysis provides a mechanism for representing student learning as measured by standard multiple-choice surveys. The model plot contains information regarding both how likely students in a particular class are to choose the correct answer and how likely they are to choose an answer consistent with a well-documented conceptual model. Unfortunately, Bao's original presentation of the model plot did not include a way to represent uncertainty in these measurements. I present details of a method to add error bars to model plots by expanding the work of Sommer and Lindell. I also provide a template for generating model plots with error bars.

  13. Uncertainty in wind climate parameters and their influence on wind turbine fatigue loads

    DEFF Research Database (Denmark)

    Toft, Henrik Stensgaard; Svenningsen, Lasse; Sørensen, John Dalsgaard;

    2016-01-01

    Highlights • Probabilistic framework for reliability assessment of site specific wind turbines. • Uncertainty in wind climate parameters propagated to structural loads directly. • Sensitivity analysis to estimate wind climate parameters influence on reliability.

  14. Distributed Parameter Modelling Applications

    DEFF Research Database (Denmark)

    2011-01-01

    Here the issue of distributed parameter models is addressed. Spatial variations as well as time are considered important. Several applications, both steady state and dynamic, are given. These relate to the processing of oil shale, the granulation of industrial fertilizers and … sands processing. The fertilizer granulation model considers the dynamics of MAP-DAP (mono and diammonium phosphates) production within an industrial granulator that involves complex crystallisation, chemical reaction and particle growth, captured through population balances. A final example considers…

  15. Quantifying Uncertainties in the 2004 Sumatra-Andaman Earthquake Source Parameters by Stochastic Inversion

    CERN Document Server

    Gopinathan, Devaraj; Roy, Debasish; Rajendran, Kusala; Guillas, Serge; Dias, Frederic

    2016-01-01

    Conventional inversion for earthquake source parameters from tsunami wave data incorporates subjective elements. Noisy and possibly insufficient data also result in instability and non-uniqueness in most deterministic inversions. Here we employ satellite altimetry data for the 2004 Sumatra-Andaman tsunami event to invert the source parameters. Using a finite fault model that represents the extent of rupture and the geometry of the trench, we perform a non-linear joint inversion of the slips, rupture velocities and rise times with minimal a priori constraints. Despite persistently good waveform fits, large variance and skewness in the joint parameter distribution constitute a remarkable feature of the inversion. These uncertainties suggest the need for objective inversion strategies that incorporate more sophisticated physical models in order to significantly improve the performance of early warning systems.

  16. APPLICATION OF UNCERTAINTY ANALYSIS TO MAAP4 ANALYSES FOR LEVEL 2 PRA PARAMETER IMPORTANCE DETERMINATION

    Directory of Open Access Journals (Sweden)

    KEVIN ROBERTS

    2013-11-01

    A key element tied to using a code like MAAP4 is an uncertainty analysis. The purpose of this paper is to present a MAAP4 based analysis to examine the sensitivity of a key parameter, in this case hydrogen production, to a set of model parameters that are related to a Level 2 PRA analysis. The Level 2 analysis examines those sequences that result in core melting and subsequent reactor pressure vessel failure and its impact on the containment. This paper identifies individual contributors and MAAP4 model parameters that statistically influence hydrogen production. Hydrogen generation was chosen because of its direct relationship to oxidation. With greater oxidation, more heat is added to the core region and relocation (core slump) should occur faster. This, in theory, would lead to shorter failure times and a subsequent “hotter” debris pool on the containment floor.

  17. Uncertainty Visualization in Forward and Inverse Cardiac Models.

    Science.gov (United States)

    Burton, Brett M; Erem, Burak; Potter, Kristin; Rosen, Paul; Johnson, Chris R; Brooks, Dana H; Macleod, Rob S

    2013-01-01

    Quantification and visualization of uncertainty in cardiac forward and inverse problems with complex geometries is subject to various challenges. Specific to visualization is the observation that occlusion and clutter obscure important regions of interest, making visual assessment difficult. In order to overcome these limitations in uncertainty visualization, we have developed and implemented a collection of novel approaches. To highlight the utility of these techniques, we evaluated the uncertainty associated with two examples of modeling myocardial activity. In one case we studied cardiac potentials during the repolarization phase as a function of variability in tissue conductivities of the ischemic heart (forward case). In a second case, we evaluated uncertainty in reconstructed activation times on the epicardium resulting from variation in the control parameter of Tikhonov regularization (inverse case). To overcome difficulties associated with uncertainty visualization, we applied linked-view windows and interactive animation to the two respective cases. Through dimensionality reduction and superimposed mean and standard deviation measures over time, we were able to display key features in large ensembles of data and highlight regions of interest where larger uncertainties exist.

  18. Uncertainty Analysis in Population-Based Disease Microsimulation Models

    Directory of Open Access Journals (Sweden)

    Behnam Sharif

    2012-01-01

    Objective. Uncertainty analysis (UA) is an important part of simulation model validation. However, the literature is imprecise as to how UA should be performed in the context of population-based microsimulation (PMS) models. In this expository paper, we discuss a practical approach to UA for such models. Methods. By adapting common concepts from published UA guidelines, we developed a comprehensive, step-by-step approach to UA in PMS models, including sample size calculation to reduce the computational time. As an illustration, we performed UA for POHEM-OA, a microsimulation model of osteoarthritis (OA) in Canada. Results. The resulting sample size of the simulated population was 500,000 and the number of Monte Carlo (MC) runs was 785 for a 12-hour computational time. The estimated 95% uncertainty intervals for the prevalence of OA in Canada in 2021 were 0.09 to 0.18 for men and 0.15 to 0.23 for women. The uncertainty surrounding the sex-specific prevalence of OA increased over time. Conclusion. The proposed approach to UA considers the challenges specific to PMS models, such as selection of parameters and calculation of MC runs and population size to reduce computational burden. Our example of UA shows that the proposed approach is feasible. Estimation of uncertainty intervals should become a standard practice in the reporting of results from PMS models.
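The sample-size step mentioned above, choosing the number of Monte Carlo runs so that the uncertainty interval reaches a target precision, can be sketched as follows; the pilot data and the 5% relative-error target are illustrative assumptions, not values from the POHEM-OA study:

```python
import math
import random
import statistics

def mc_runs_needed(pilot_estimates, rel_error=0.05, z=1.96):
    """Number of Monte Carlo runs needed so the half-width of the 95%
    confidence interval of the mean stays within rel_error of the mean,
    using the standard normal-approximation formula n = (z*s/E)^2."""
    m = statistics.mean(pilot_estimates)
    s = statistics.stdev(pilot_estimates)
    half_width = rel_error * m
    return math.ceil((z * s / half_width) ** 2)

# Hypothetical pilot phase: prevalence estimates from 30 preliminary runs.
rng = random.Random(0)
pilot = [0.16 + rng.gauss(0.0, 0.02) for _ in range(30)]
n_runs = mc_runs_needed(pilot)
```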

  19. Uncertainty and Sensitivity in Surface Dynamics Modeling

    Science.gov (United States)

    Kettner, Albert J.; Syvitski, James P. M.

    2016-05-01

    This special issue on 'Uncertainty and Sensitivity in Surface Dynamics Modeling' grew out of papers submitted after the 2014 annual meeting of the Community Surface Dynamics Modeling System, or CSDMS. CSDMS facilitates a diverse community of experts (now in 68 countries) that collectively investigates the Earth's surface, the dynamic interface between lithosphere, hydrosphere, cryosphere, and atmosphere, by promoting, developing, supporting and disseminating integrated open source software modules. By organizing more than 1500 researchers, CSDMS is well placed to identify community strengths and weaknesses in the practice of software development. We recognize, for example, that progress has been slow on identifying and quantifying uncertainty and sensitivity in numerical modeling of the Earth's surface dynamics. This special issue is meant to raise awareness of these important subjects and highlight state-of-the-art progress.

  20. Parameter uncertainty and sensitivity analysis in sediment flux calculation

    Directory of Open Access Journals (Sweden)

    B. Cheviron

    2011-01-01

    This paper examines uncertainties in the calculation of annual sediment budgets at the outlet of rivers. Emphasis is placed on the sensitivity of power-law rating curves to degradations of the available discharge-concentration data. The main purpose is to determine how predictions arising from usual or modified power laws withstand the infrequency of concentration data and the relative uncertainties affecting the source data. This study identifies cases in which the error in the estimated sediment fluxes remains of the same order of magnitude as, or even smaller than, that in the source data, provided the number of concentration data is high enough. The mathematical framework presented allows all limitations to be considered at once in further detailed investigations. It is applied here to bound the error on sediment budgets for the major French rivers flowing to the sea.
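A power-law rating curve of the kind discussed here is commonly fitted by least squares in log-log space; the sketch below uses synthetic, noise-free data (an assumption made purely for illustration) so the fit recovers the generating parameters exactly:

```python
import math

def fit_rating_curve(Q, C):
    """Least-squares fit of log C = log a + b log Q, i.e. the power-law
    rating curve C = a * Q**b relating concentration to discharge."""
    x = [math.log(q) for q in Q]
    y = [math.log(c) for c in C]
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    b = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
         / sum((xi - xbar) ** 2 for xi in x))
    a = math.exp(ybar - b * xbar)
    return a, b

def annual_flux(a, b, discharges):
    """Sediment flux summed over a discharge series: sum of Q * C(Q)."""
    return sum(q * a * q ** b for q in discharges)

# Synthetic data generated exactly by C = 0.5 * Q**1.3.
Q = [10.0, 20.0, 50.0, 100.0, 200.0]
C = [0.5 * q ** 1.3 for q in Q]
a, b = fit_rating_curve(Q, C)
flux = annual_flux(a, b, Q)
```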

  1. A Generalized Statistical Uncertainty Model for Satellite Precipitation Products

    Science.gov (United States)

    Sarachi, S.

    2013-12-01

    A mixture model of a Generalized Normal Distribution and a Gamma distribution (GND-G) is used to model the joint probability distribution of satellite-based and stage IV radar rainfall at a given spatial and temporal resolution (e.g. 1°×1° and daily rainfall). The distribution parameters of GND-G are estimated across various rainfall rates and spatial and temporal resolutions. In the study, GND-G is used to describe the uncertainty of the estimates from the Precipitation Estimation from Remote Sensing Information using Artificial Neural Network (PERSIANN) algorithm. The stage IV-based multi-sensor precipitation estimates (MPE) are used as reference measurements. The study area for constructing the uncertainty model covers a 15° × 15° box of 0.25° × 0.25° cells over the eastern United States for the summers of 2004 to 2009. Cells are aggregated in space and time to obtain data with different resolutions for the construction of the model's parameter space. Results show that GND-G fits the reference precipitation data better than other statistical uncertainty models, such as the Gaussian and Gamma distributions. The impact of precipitation uncertainty on the stream flow is further demonstrated by Monte Carlo simulation of precipitation forcing in the hydrologic model. The NWS DMIP2 Illinois River basin south of Siloam is selected for this case study, with data covering the period 2006 to 2008. The uncertainty range of stream flow resulting from the GND-G precipitation distributions is calculated and discussed.
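A minimal sketch of fitting the Gamma component of such an error model by the method of moments; the rainfall values below are invented, and the full GND-G mixture involves more parameters than shown here:

```python
import statistics

def gamma_moment_fit(data):
    """Method-of-moments fit of a Gamma distribution with shape k and
    scale theta: k = mean**2 / var, theta = var / mean, so that
    k * theta reproduces the sample mean by construction."""
    m = statistics.mean(data)
    v = statistics.variance(data)
    return m * m / v, v / m

# Invented positive rainfall-error magnitudes (mm/day) for illustration.
data = [2.0, 3.5, 1.2, 4.8, 2.9, 3.1, 0.8, 5.0]
k, theta = gamma_moment_fit(data)
```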

  2. Effect of precipitation spatial distribution uncertainty on the uncertainty bounds of a snowmelt runoff model output

    Science.gov (United States)

    Jacquin, A. P.

    2012-04-01

    This study analyses the effect of precipitation spatial distribution uncertainty on the uncertainty bounds of a snowmelt runoff model's discharge estimates. Prediction uncertainty bounds are derived using the Generalized Likelihood Uncertainty Estimation (GLUE) methodology. The model analysed is a conceptual watershed model operating at a monthly time step. The model divides the catchment into five elevation zones, where the fifth zone corresponds to the catchment glaciers. Precipitation amounts at each elevation zone i are estimated as the product between observed precipitation (at a single station within the catchment) and a precipitation factor FPi. Thus, these factors provide a simplified representation of the spatial variation of precipitation, specifically the shape of the functional relationship between precipitation and height. In the absence of information about appropriate values of the precipitation factors FPi, these are estimated through standard calibration procedures. The catchment case study is Aconcagua River at Chacabuquito, located in the Andean region of Central Chile. Monte Carlo samples of the model output are obtained by randomly varying the model parameters within their feasible ranges. In the first experiment, the precipitation factors FPi are considered unknown and thus included in the sampling process. The total number of unknown parameters in this case is 16. In the second experiment, precipitation factors FPi are estimated a priori, by means of a long term water balance between observed discharge at the catchment outlet, evapotranspiration estimates and observed precipitation. In this case, the number of unknown parameters reduces to 11. The feasible ranges assigned to the precipitation factors in the first experiment are slightly wider than the range of fixed precipitation factors used in the second experiment. The mean squared error of the Box-Cox transformed discharge during the calibration period is used for the evaluation of the
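The GLUE procedure used in this study can be sketched as follows, with a toy one-parameter linear model standing in for the snowmelt runoff model and a Nash-Sutcliffe-style likelihood measure as an illustrative choice:

```python
def glue_bounds(observed, simulate, param_samples, threshold=0.5):
    """GLUE sketch: retain 'behavioral' parameter sets whose likelihood
    measure (here a Nash-Sutcliffe efficiency, an illustrative choice)
    exceeds a threshold, then form likelihood-weighted 5-95% bounds."""
    obs_mean = sum(observed) / len(observed)
    denom = sum((o - obs_mean) ** 2 for o in observed)
    behavioral = []
    for p in param_samples:
        sim = simulate(p)
        nse = 1.0 - sum((s - o) ** 2 for s, o in zip(sim, observed)) / denom
        if nse > threshold:
            behavioral.append((nse, sim))
    wsum = sum(w for w, _ in behavioral)
    bounds = []
    for t in range(len(observed)):
        pairs = sorted((sim[t], w) for w, sim in behavioral)
        lo = hi = pairs[0][0]
        acc = 0.0
        for v, w in pairs:
            acc += w / wsum
            if acc <= 0.05:
                lo = v          # approximate weighted 5th percentile
            if acc < 0.95:
                hi = v          # approximate weighted 95th percentile
        bounds.append((lo, hi))
    return bounds

# Toy linear 'runoff model' with one parameter; truth generated with p = 2.
forcing = [1.0, 2.0, 3.0, 4.0]
observed = [2.0 * x for x in forcing]
simulate = lambda p: [p * x for x in forcing]
param_samples = [1.0 + 0.01 * i for i in range(201)]   # grid on [1, 3]
bounds = glue_bounds(observed, simulate, param_samples)
```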

  3. Systemic change increases model projection uncertainty

    Science.gov (United States)

    Verstegen, Judith; Karssenberg, Derek; van der Hilst, Floor; Faaij, André

    2014-05-01

    Most spatio-temporal models are based on the assumption that the relationship between system state change and its explanatory processes is stationary. This means that model structure and parameterization are usually kept constant over time, ignoring potential systemic changes in this relationship resulting from e.g., climatic or societal changes, thereby overlooking a source of uncertainty. We define systemic change as a change in the system indicated by a system state change that cannot be simulated using a constant model structure. We have developed a method to detect systemic change, using a Bayesian data assimilation technique, the particle filter. The particle filter was used to update the prior knowledge about the model structure. In contrast to the traditional particle filter approach (e.g., Verstegen et al., 2014), we apply the filter separately for each point in time for which observations are available, obtaining the optimal model structure for each of the time periods in between. This allows us to create a time series of the evolution of the model structure. The Runs test (Wald and Wolfowitz, 1940), a stationarity test, is used to check whether variation in this time series can be attributed to randomness or not. If not, this indicates systemic change. The uncertainty that the systemic change adds to the existing model projection uncertainty can be determined by comparing model outcomes of a model with a stationary model structure and a model with a model structure changing according to the variation found in the time series. To test the systemic change detection methodology, we apply it to a land use change cellular automaton (CA) (Verstegen et al., 2012) and use observations of real land use from all years from 2004 to 2012 and associated uncertainty as observational data in the particle filter. A systemic change was detected for the period 2006 to 2008. In this period the influence on the location of sugar cane expansion of the driver sugar cane in
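The Runs test used here to check the model-structure time series for stationarity can be sketched as follows; the z-statistic form assumes the standard normal approximation for the run count:

```python
import math

def runs_test_z(series):
    """Wald-Wolfowitz runs test on the signs of deviations from the median.
    Under randomness the z-statistic is approximately standard normal;
    |z| > 1.96 suggests non-random structure (e.g., systemic change)."""
    med = sorted(series)[len(series) // 2]
    signs = [v > med for v in series if v != med]
    n1 = sum(signs)               # values above the median
    n2 = len(signs) - n1          # values below the median
    runs = 1 + sum(1 for a, b in zip(signs, signs[1:]) if a != b)
    mu = 2.0 * n1 * n2 / (n1 + n2) + 1.0
    var = (2.0 * n1 * n2 * (2.0 * n1 * n2 - n1 - n2)
           / ((n1 + n2) ** 2 * (n1 + n2 - 1.0)))
    return (runs - mu) / math.sqrt(var)

# A strongly trending series produces very few runs and a large negative z.
z = runs_test_z([1, 2, 3, 4, 5, 6, 7, 8, 9, 10])
```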

  4. Physical and Model Uncertainty for Fatigue Design of Composite Material

    DEFF Research Database (Denmark)

    Toft, Henrik Stensgaard; Sørensen, John Dalsgaard

    The main aim of the present report is to establish stochastic models for the uncertainties related to fatigue design of composite materials. The uncertainties considered are the physical uncertainty related to the static and fatigue strength and the model uncertainty related to Miner's rule...

  5. Influence of model reduction on uncertainty of flood inundation predictions

    Science.gov (United States)

    Romanowicz, R. J.; Kiczko, A.; Osuch, M.

    2012-04-01

    Derivation of flood risk maps requires an estimation of the maximum inundation extent for a flood with an assumed probability of exceedance, e.g. a 100 or 500 year flood. The results of numerical simulations of flood wave propagation are used to overcome the lack of relevant observations. In practice, deterministic 1-D models are used for flow routing, giving a simplified image of the flood wave propagation process. The solution of a 1-D model depends on the simplifications to the model structure, the initial and boundary conditions and the estimates of model parameters, which are usually identified by solving the inverse problem based on the available noisy observations. Therefore, there is a large uncertainty involved in the derivation of flood risk maps. In this study we examine the influence of model structure simplifications on estimates of flood extent for an urban river reach. As the study area we chose the Warsaw reach of the River Vistula, where nine bridges and several dikes are located. The aim of the study is to examine the influence of water structures on the derived model roughness parameters, with all the bridges and dikes taken into account, with a reduced number and without any water infrastructure. The results indicate that roughness parameter values of a 1-D HEC-RAS model can be adjusted for the reduction in model structure. However, the price we pay is the model robustness. Apart from a relatively simple question regarding reducing model structure, we also try to answer more fundamental questions regarding the relative importance of input, model structure simplification, parametric and rating curve uncertainty to the uncertainty of flood extent estimates. We apply pseudo-Bayesian methods of uncertainty estimation and Global Sensitivity Analysis as the main methodological tools. The results indicate that the uncertainties have a substantial influence on flood risk assessment. In the paper we present a simplified methodology allowing the influence of

  6. Nonlinear structural finite element model updating and uncertainty quantification

    Science.gov (United States)

    Ebrahimian, Hamed; Astroza, Rodrigo; Conte, Joel P.

    2015-04-01

    This paper presents a framework for nonlinear finite element (FE) model updating, in which state-of-the-art nonlinear structural FE modeling and analysis techniques are combined with the maximum likelihood estimation method (MLE) to estimate time-invariant parameters governing the nonlinear hysteretic material constitutive models used in the FE model of the structure. The estimation uncertainties are evaluated based on the Cramer-Rao lower bound (CRLB) theorem. A proof-of-concept example, consisting of a cantilever steel column representing a bridge pier, is provided to verify the proposed nonlinear FE model updating framework.
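The Cramer-Rao lower bound used above to evaluate estimation uncertainties reduces, in the simplest Gaussian-mean case, to a closed form; the sketch below shows that textbook case, not the nonlinear FE setting of the paper:

```python
def crlb_mean_variance(sigma, n):
    """Cramer-Rao lower bound on the variance of any unbiased estimator
    of the mean of N(theta, sigma**2) from n i.i.d. observations.
    Fisher information is I(theta) = n / sigma**2; the bound is its
    inverse, sigma**2 / n."""
    fisher_info = n / sigma ** 2
    return 1.0 / fisher_info

bound = crlb_mean_variance(sigma=2.0, n=100)   # = 0.04
```

No estimator built from these observations can have a smaller variance, which is why the CRLB serves as a benchmark for the estimation uncertainty.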

  7. Verification and Uncertainty Reduction of Amchitka Underground Nuclear Testing Models

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed Hassan; Jenny Chapman

    2006-02-01

    The 2002 groundwater model of the Amchitka underground nuclear tests is verified, and uncertainty in model input parameters, as well as in predictions, has been reduced using newly collected data obtained by the summer 2004 field expedition of CRESP. Newly collected data that pertain to the groundwater model include magnetotelluric (MT) surveys conducted on the island to determine the subsurface salinity and porosity structure, and bathymetric surveys to map the areas offshore from the Long Shot and Cannikin sites. Analysis and interpretation of the MT data yielded information on the location of the transition zone, and porosity profiles showing porosity values decaying with depth. These new data sets are used to verify the original model in terms of model parameters, model structure, and model output. In addition, by using the new data along with the existing data (chemistry and head data), the uncertainty in model input and output is decreased by conditioning on all the available data. A Markov Chain Monte Carlo (MCMC) approach is adopted for developing new input parameter distributions conditioned on prior knowledge and new data. The MCMC approach is a form of Bayesian conditioning constructed in such a way that it produces samples of the model parameters that eventually converge to a stationary posterior distribution. The Bayesian MCMC approach enhances probabilistic assessment. Instead of simply propagating uncertainty forward from input parameters into model predictions (i.e., the traditional Monte Carlo approach), MCMC propagates uncertainty backward from data onto parameters, and then forward from parameters into predictions.
Comparisons between new data and the original model, and conditioning on all available data using MCMC method, yield the following results and conclusions: (1) Model structure is verified at Long Shot and Cannikin where the high-resolution bathymetric data collected by CRESP
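A minimal random-walk Metropolis sampler of the kind underlying the MCMC conditioning described above; the prior, likelihood and 'porosity-like' parameter below are invented for illustration, not taken from the Amchitka model:

```python
import math
import random

def metropolis(log_post, x0, n_samples=5000, step=0.5, seed=1):
    """Random-walk Metropolis: draws samples from a distribution given
    only its unnormalized log-density, converging to the posterior."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(n_samples):
        cand = x + rng.gauss(0.0, step)
        lp_cand = log_post(cand)
        # Accept with probability min(1, posterior ratio).
        if math.log(rng.random()) < lp_cand - lp:
            x, lp = cand, lp_cand
        samples.append(x)
    return samples

# Conditioning a porosity-like parameter on noisy observations:
# Gaussian prior N(0.3, 0.1**2), Gaussian likelihood with sd 0.05.
obs = [0.25, 0.28, 0.31]

def log_post(phi):
    prior = -0.5 * ((phi - 0.3) / 0.1) ** 2
    like = sum(-0.5 * ((o - phi) / 0.05) ** 2 for o in obs)
    return prior + like

samples = metropolis(log_post, 0.3)
post_mean = sum(samples[1000:]) / len(samples[1000:])   # discard burn-in
```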

  8. Estimating winter wheat phenological parameters: Implications for crop modeling

    Science.gov (United States)

    Crop parameters, such as the timing of developmental events, are critical for accurate simulation results in crop simulation models, yet uncertainty often exists in determining the parameters. Factors contributing to the uncertainty include: a) sources of variation within a plant (i.e., within diffe...

  9. IMPLEMENTATION OF DATA ASSIMILATION METHODOLOGY FOR PHYSICAL MODEL UNCERTAINTY EVALUATION USING POST-CHF EXPERIMENTAL DATA

    Directory of Open Access Journals (Sweden)

    JAESEOK HEO

    2014-10-01

    The Best Estimate Plus Uncertainty (BEPU) method has been widely used to evaluate the uncertainty of a best-estimate thermal hydraulic system code against a figure of merit. This uncertainty is typically evaluated based on the physical models' uncertainties determined by expert judgment. This paper introduces the application of data assimilation methodology to determine the uncertainty bands of the physical models, e.g., the mean value and standard deviation of the parameters, based upon a statistical approach rather than expert judgment. Data assimilation provides a mathematical methodology for the best estimate bias and the uncertainties of the physical models which optimize the system response following the calibration of model parameters and responses. The mathematical approaches include deterministic and probabilistic methods of data assimilation to solve both linear and nonlinear problems, with the a posteriori distribution of parameters derived based on Bayes' theorem. The inverse problem was solved analytically to obtain the mean value and standard deviation of the parameters assuming Gaussian distributions for the parameters and responses, and a sampling method was utilized to illustrate the non-Gaussian a posteriori distributions of parameters. SPACE is used to demonstrate the data assimilation method by determining the bias and the uncertainty bands of the physical models employing Bennett's heated tube test data and Becker's post critical heat flux experimental data. Based on the results of the data assimilation process, the major sources of the modeling uncertainties were identified for further model development.
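For the linear-Gaussian case mentioned above, the a posteriori mean and variance follow analytically from Bayes' theorem; the sketch below shows the scalar conjugate update, with illustrative values not drawn from the SPACE analysis:

```python
def gaussian_update(mu_prior, var_prior, obs, var_obs, h=1.0):
    """Conjugate (linear-Gaussian) Bayesian update for a scalar parameter
    theta with response model y = h * theta + noise, noise ~ N(0, var_obs).
    Returns the a posteriori mean and variance in closed form."""
    prec = 1.0 / var_prior + len(obs) * h * h / var_obs
    mean = (mu_prior / var_prior + h * sum(obs) / var_obs) / prec
    return mean, 1.0 / prec

mu_post, var_post = gaussian_update(mu_prior=1.0, var_prior=4.0,
                                    obs=[2.0, 2.2, 1.8], var_obs=1.0)
```

The posterior variance is always smaller than the prior variance, reflecting the reduction of model uncertainty by conditioning on data.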

  10. Quantifying uncertainty in stable isotope mixing models

    Science.gov (United States)

    Davis, Paul; Syme, James; Heikoop, Jeffrey; Fessenden-Rahn, Julianna; Perkins, George; Newman, Brent; Chrystal, Abbey E.; Hagerty, Shannon B.

    2015-05-01

    Mixing models are powerful tools for identifying biogeochemical sources and determining mixing fractions in a sample. However, identification of actual source contributors is often not simple, and source compositions typically vary or even overlap, significantly increasing model uncertainty in calculated mixing fractions. This study compares three probabilistic methods: Stable Isotope Analysis in R (SIAR), a pure Monte Carlo technique (PMC), and the Stable Isotope Reference Source (SIRS) mixing model, a new technique that estimates mixing in systems with more than three sources and/or uncertain source compositions. In this paper, we use nitrate stable isotope examples (δ15N and δ18O), but all methods tested are applicable to other tracers. In Phase I of a three-phase blind test, we compared methods for a set of six-source nitrate problems. PMC was unable to find solutions for two of the target water samples. The Bayesian method, SIAR, experienced anchoring problems, and SIRS calculated mixing fractions that most closely approximated the known mixing fractions. For that reason, SIRS was the only approach used in the next phase of testing. In Phase II, the problem was broadened so that any subset of the six sources could be a possible solution to the mixing problem. Results showed a high rate of Type I errors, where solutions included sources that were not contributing to the sample. In Phase III, eliminating some sources based on assumed site knowledge and assumed nitrate concentrations substantially reduced mixing fraction uncertainties and lowered the Type I error rate. These results demonstrate that valuable insights into stable isotope mixing problems result from probabilistic mixing model approaches like SIRS. The results also emphasize the importance of identifying a minimal set of potential sources and quantifying uncertainties in source isotopic composition as well as demonstrating the value of additional information in reducing the uncertainty in calculated
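A pure Monte Carlo (PMC-style) treatment of a mixing problem can be sketched for the simplest case of two sources and one tracer; the δ15N values and source standard deviations below are invented for illustration:

```python
import random

def mixing_fraction_mc(sample, src_a, src_b, n=5000, seed=7):
    """Monte Carlo mixing for two sources and one tracer:
    f * A + (1 - f) * B = sample, with source compositions drawn from
    Normal(mean, sd). Returns the physically admissible fractions
    f in [0, 1] across the draws."""
    rng = random.Random(seed)
    fracs = []
    for _ in range(n):
        a = rng.gauss(*src_a)
        b = rng.gauss(*src_b)
        if abs(a - b) < 1e-9:
            continue            # degenerate draw, no unique solution
        f = (sample - b) / (a - b)
        if 0.0 <= f <= 1.0:
            fracs.append(f)
    return fracs

# Hypothetical d15N (permil): sample 6.0, sources (2.0, 0.5) and (10.0, 0.5).
fracs = mixing_fraction_mc(6.0, (2.0, 0.5), (10.0, 0.5))
mean_f = sum(fracs) / len(fracs)
```

The spread of the accepted fractions directly reflects the uncertainty in the source compositions, which is the quantity the probabilistic mixing models above set out to report.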

  11. Three-dimensional lake water quality modeling: sensitivity and uncertainty analyses.

    Science.gov (United States)

    Missaghi, Shahram; Hondzo, Miki; Melching, Charles

    2013-11-01

    Two sensitivity and uncertainty analysis methods are applied to a three-dimensional coupled hydrodynamic-ecological model (ELCOM-CAEDYM) of a morphologically complex lake. The primary goals of the analyses are to increase confidence in the model predictions, identify influential model parameters, quantify the uncertainty of model prediction, and explore the spatial and temporal variabilities of model predictions. The influence of model parameters on four model-predicted variables (model output) and the contribution of each of the model-predicted variables to the total variation in model output are presented. Predicted water temperature, dissolved oxygen, total phosphorus, and algal biomass contributed 3, 13, 26, and 58% of the total model output variance, respectively. The fraction of variance resulting from model parameter uncertainty was calculated by two methods and used for evaluation and ranking of the most influential model parameters. Nine out of the top 10 parameters identified by each method agreed, but their ranks were different. Spatial and temporal changes of model uncertainty were investigated and visualized. Model uncertainty appeared to be concentrated around specific water depths and dates that corresponded to significant storm events. The results suggest that spatial and temporal variations in the predicted water quality variables are sensitive to the hydrodynamics of physical perturbations such as those caused by stream inflows generated by storm events. The sensitivity and uncertainty analyses identified the mineralization of dissolved organic carbon, sediment phosphorus release rate, algal metabolic loss rate, internal phosphorus concentration, and phosphorus uptake rate as the most influential model parameters.

  12. Representing Turbulence Model Uncertainty with Stochastic PDEs

    Science.gov (United States)

    Oliver, Todd; Moser, Robert

    2012-11-01

    Validation of and uncertainty quantification for extrapolative predictions of RANS turbulence models are necessary to ensure that the models are not used outside of their domain of applicability and to properly inform decisions based on such predictions. In previous work, we have developed and calibrated statistical models for these purposes, but it has been found that incorporating all the knowledge of a domain expert (e.g., realizability, spatial smoothness, and known scalings) in such models is difficult. Here, we explore the use of stochastic PDEs for this purpose. The goal of this formulation is to pose the uncertainty model in a setting where it is easier for physical modelers to express what is known. To explore the approach, multiple stochastic models describing the error in the Reynolds stress are coupled with multiple deterministic turbulence models to make uncertain predictions of channel flow. These predictions are compared with DNS data to assess their credibility. This work is supported by the Department of Energy [National Nuclear Security Administration] under Award Number [DE-FC52-08NA28615].

  13. Stochastic reduced order models for inverse problems under uncertainty.

    Science.gov (United States)

    Warner, James E; Aquino, Wilkins; Grigoriu, Mircea D

    2015-03-01

    This work presents a novel methodology for solving inverse problems under uncertainty using stochastic reduced order models (SROMs). Given statistical information about an observed state variable in a system, unknown parameters are estimated probabilistically through the solution of a model-constrained, stochastic optimization problem. The point of departure and crux of the proposed framework is the representation of a random quantity using a SROM, a low dimensional, discrete approximation to a continuous random element that permits efficient and non-intrusive stochastic computations. Characterizing the uncertainties with SROMs transforms the stochastic optimization problem into a deterministic one. The non-intrusive nature of SROMs facilitates efficient gradient computations for random vector unknowns and relies entirely on calls to existing deterministic solvers. Furthermore, the method is naturally extended to handle multiple sources of uncertainty in cases where state variable data, system parameters, and boundary conditions are all considered random. The new and widely-applicable SROM framework is formulated for a general stochastic optimization problem in terms of an abstract objective function and constraining model. For demonstration purposes, however, we study its performance in the specific case of inverse identification of random material parameters in elastodynamics. We demonstrate the ability to efficiently recover random shear moduli given material displacement statistics as input data. We also show that the approach remains effective for the case where the loading in the problem is random as well.
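A crude illustration of the SROM idea, approximating a continuous random element by a small set of support points with probabilities; here the points are simply quantile midpoints with equal weights, a simplification of the optimization-based construction in the paper:

```python
import random

def srom_from_sample(sample, m=10):
    """Crude stochastic reduced order model: represent a large sample by m
    support points placed at evenly spaced quantile midpoints, each
    carrying probability 1/m (the paper instead optimizes both the points
    and the weights against target statistics)."""
    s = sorted(sample)
    n = len(s)
    points = [s[int((i + 0.5) * n / m)] for i in range(m)]
    probs = [1.0 / m] * m
    return points, probs

# 10,000 draws of a continuous random quantity reduced to 10 points.
rng = random.Random(3)
sample = [rng.gauss(5.0, 1.0) for _ in range(10000)]
points, probs = srom_from_sample(sample)
srom_mean = sum(p * w for p, w in zip(points, probs))
```

Any downstream computation then runs the deterministic solver only m times, once per support point, which is what makes the approach non-intrusive and efficient.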

  14. Uncertainty assessment of integrated distributed hydrological models using GLUE with Markov chain Monte Carlo sampling

    DEFF Research Database (Denmark)

    Blasone, Roberta-Serena; Madsen, Henrik; Rosbjerg, Dan

    2008-01-01

    A generalized likelihood uncertainty estimation (GLUE) procedure based on Markov chain Monte Carlo sampling is applied in order to improve the performance of the methodology in estimating parameters and posterior output distributions. The description of the spatial variations of the hydrological processes is accounted for by defining… the identifiability of the parameters and results in satisfactory multi-variable simulations and uncertainty estimates. However, the parameter uncertainty alone cannot explain the total uncertainty at all the sites, due to limitations in the distributed data included in the model calibration. The study also indicates…

  15. An educational model for ensemble streamflow simulation and uncertainty analysis

    Directory of Open Access Journals (Sweden)

    A. AghaKouchak

    2012-06-01

    This paper presents a hands-on modeling toolbox, HBV-Ensemble, designed as a complement to theoretical hydrology lectures, to teach hydrological processes and their uncertainties. HBV-Ensemble can be used for in-class lab practices and homework assignments, and for assessment of students' understanding of hydrological processes. Using this model, students can gain more insight into how hydrological processes (e.g., precipitation, snowmelt and snow accumulation, soil moisture, evapotranspiration and runoff generation) are interconnected. The model includes a MATLAB Graphical User Interface (GUI) and an ensemble simulation scheme that can be used not only for simulating hydrological processes, but also for teaching uncertainty analysis, parameter estimation, ensemble simulation and model sensitivity.

  16. Investigating the robustness of ion beam therapy treatment plans to uncertainties in biological treatment parameters

    CERN Document Server

    Boehlen, T T; Dosanjh, M; Ferrari, A; Fossati, P; Haberer, T; Mairani, A; Patera, V

    2012-01-01

    Uncertainties in determining clinically used relative biological effectiveness (RBE) values for ion beam therapy carry the risk of absolute and relative misestimations of RBE-weighted doses for clinical scenarios. This study assesses the consequences of hypothetical misestimations of input parameters to the RBE modelling for carbon ion treatment plans by a variational approach. The impact of the variations on resulting cell survival and RBE values is evaluated as a function of the remaining ion range. In addition, the sensitivity to misestimations in RBE modelling is compared for single fields and two opposed fields using differing optimization criteria. It is demonstrated for single treatment fields that moderate variations (up to ±50%) of representative nominal input parameters for four tumours result mainly in a misestimation of the RBE-weighted dose in the planning target volume (PTV) by a constant factor and only smaller RBE-weighted dose gradients. Ensuring a more uniform radiation quality in the PTV...

  17. Usage of ensemble geothermal models to consider geological uncertainties

    Science.gov (United States)

    Rühaak, Wolfram; Steiner, Sarah; Welsch, Bastian; Sass, Ingo

    2015-04-01

    The usage of geothermal energy, for instance by borehole heat exchangers (BHE), is a promising concept for a sustainable supply of heat for buildings. BHE are closed pipe systems in which a fluid is circulating. Heat from the surrounding rocks is transferred to the fluid purely by conduction. The fluid carries the heat to the surface, where it can be utilized. Larger arrays of BHE typically require prior numerical modelling, motivated both by the design of the system (number and depth of the required BHE) and by regulatory reasons. Such regulatory operating permissions, in particular, often require maximally realistic models. Although such realistic models are possible in many cases with today's codes and computer resources, they are often expensive in terms of time and effort. A particular problem is the knowledge about the accuracy of the achieved results. An issue that is often neglected while dealing with highly complex models is the quantification of parameter uncertainties as a consequence of the natural heterogeneity of the geological subsurface. Experience has shown that these heterogeneities can lead to wrong forecasts. Variations in the technical realization, and especially in the operational parameters (which are mainly a consequence of the regional climate), can also lead to strong variations in the simulation results. Instead of one very detailed single forecast model, it should be considered to run numerous simpler models. By varying parameters, the presumed subsurface uncertainties, but also the uncertainties in the presumed operational parameters, can be reflected. Finally, not one single result should be reported, but instead the range of possible solutions and their respective probabilities. In meteorology such an approach is well known as ensemble modeling. The concept is demonstrated on a real-world data set and discussed.
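
The ensemble idea in this abstract (many simple models with varied parameters, reported as a range rather than one forecast) can be sketched with the classical infinite-line-source solution for a single BHE. All numbers below (heat rate, radius, conductivity and diffusivity distributions) are assumed for illustration only.

```python
import numpy as np
from scipy.special import exp1

rng = np.random.default_rng(7)
q = 40.0               # heat extraction per metre of BHE, W/m (assumed)
r = 0.06               # borehole radius, m (assumed)
t = 3600.0 * 24 * 365  # one year of operation, s

# Ensemble: subsurface thermal conductivity and diffusivity are uncertain
lam = rng.normal(2.3, 0.3, 1000)           # W/(m K)
alpha = rng.normal(1.0e-6, 1.5e-7, 1000)   # m^2/s

# Infinite-line-source temperature change at the borehole wall for each member
dT = q / (4 * np.pi * lam) * exp1(r ** 2 / (4 * alpha * t))
p5, p50, p95 = np.percentile(dT, [5, 50, 95])
print("Temperature drawdown, K: 5%%=%.1f 50%%=%.1f 95%%=%.1f" % (p5, p50, p95))
```

Reporting the 5-95% range instead of a single value is exactly the shift from one detailed forecast to an ensemble of simple models that the abstract advocates.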

  18. Modeling and inverse problems in the presence of uncertainty

    CERN Document Server

    Banks, H T; Thompson, W Clayton

    2014-01-01

    Modeling and Inverse Problems in the Presence of Uncertainty collects recent research-including the authors' own substantial projects-on uncertainty propagation and quantification. It covers two sources of uncertainty: where uncertainty is present primarily due to measurement errors and where uncertainty is present due to the modeling formulation itself. After a useful review of relevant probability and statistical concepts, the book summarizes mathematical and statistical aspects of inverse problem methodology, including ordinary, weighted, and generalized least-squares formulations. It then

  19. Parameter Estimation, Model Reduction and Quantum Filtering

    CERN Document Server

    Chase, Bradley A

    2009-01-01

    This dissertation explores the topics of parameter estimation and model reduction in the context of quantum filtering. Chapters 2 and 3 provide a review of classical and quantum probability theory, stochastic calculus and filtering. Chapter 4 studies the problem of quantum parameter estimation and introduces the quantum particle filter as a practical computational method for parameter estimation via continuous measurement. Chapter 5 applies these techniques in magnetometry and studies the estimator's uncertainty scalings in a double-pass atomic magnetometer. Chapter 6 presents an efficient feedback controller for continuous-time quantum error correction. Chapter 7 presents an exact model of symmetric processes of collective qubit systems.

  20. Advances in the study of uncertainty quantification of large-scale hydrological modeling system

    Institute of Scientific and Technical Information of China (English)

    SONG Xiaomeng; ZHAN Chesheng; KONG Fanzhe; XIA Jun

    2011-01-01

    The regional hydrological system is extremely complex because it is affected not only by physical factors but also by human dimensions, and hydrological models play a very important role in simulating this complex system. However, there have been no effective methods for analysing model reliability and uncertainty, owing to the system's complexity and difficulty. The uncertainties in hydrological modeling come from four important aspects: uncertainties in input data and parameters, uncertainties in model structure, uncertainties in the analysis method, and the initial and boundary conditions. This paper systematically reviewed the recent advances in uncertainty analysis approaches for large-scale complex hydrological models on the basis of these uncertainty sources. The shortcomings and insufficiencies of uncertainty analysis for complex hydrological models are also pointed out. A new uncertainty quantification platform, PSUADE, and its uncertainty quantification methods are then introduced, which will be a powerful tool and platform for uncertainty analysis of large-scale complex hydrological models. Finally, some future perspectives on uncertainty quantification are put forward.

  1. Fault Detection under Fuzzy Model Uncertainty

    Institute of Scientific and Technical Information of China (English)

    Marek Kowal; Józef Korbicz

    2007-01-01

    The paper tackles the problem of robust fault detection using Takagi-Sugeno fuzzy models. A model-based strategy is employed to generate residuals in order to make a decision about the state of the process. Unfortunately, such a method is corrupted by model uncertainty due to the fact that in real applications there exists a model-reality mismatch. In order to ensure reliable fault detection, the adaptive threshold technique is used to deal with this problem. The paper also focuses on the fuzzy model design procedure. The bounded-error approach is applied to generate the rules for the model using available measurements. The proposed approach is applied to fault detection in a DC laboratory engine.
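
The adaptive-threshold idea mentioned here (a residual band that widens where model uncertainty is larger, instead of a fixed threshold) can be sketched as follows. The plant, the injected fault, and the uncertainty term proportional to |u| are all hypothetical stand-ins, not the paper's Takagi-Sugeno setup.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 500
u = np.sin(np.arange(n) * 0.05)             # process input
y_meas = 0.8 * u + rng.normal(0, 0.02, n)   # measured plant output with noise
y_meas[350:] += 0.3                          # injected additive fault at sample 350

y_model = 0.8 * u                            # nominal model output
residual = y_meas - y_model

# Adaptive threshold: nominal noise band plus a term tracking model
# uncertainty (here assumed proportional to the input magnitude)
threshold = 3 * 0.02 + 0.05 * np.abs(u)
fault = np.abs(residual) > threshold
print("false-alarm rate %.3f, detection rate %.3f"
      % (fault[:350].mean(), fault[350:].mean()))
```

A fixed threshold tight enough to catch this fault would alarm constantly wherever the model-reality mismatch grows; letting the band track the uncertainty avoids that trade-off.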

  2. Facets of Uncertainty in Digital Elevation and Slope Modeling

    Institute of Scientific and Technical Information of China (English)

    ZHANG Jingxiong; LI Deren

    2005-01-01

    This paper investigates the differences that result from applying different approaches to uncertainty modeling and reports an experiment examining error estimation and propagation in elevation and slope, with the latter derived from the former. It is confirmed that significant differences exist between uncertainty descriptors, and that propagation of uncertainty to end products is immensely affected by the specification of source uncertainty.

  3. Assessment of model uncertainty during the river export modelling of pesticides and transformation products

    Science.gov (United States)

    Gassmann, Matthias; Olsson, Oliver; Kümmerer, Klaus

    2013-04-01

    The modelling of organic pollutants in the environment is burdened by a load of uncertainties. Not only parameter values are uncertain but often also the mass and timing of pesticide application. By introducing transformation products (TPs) into modelling, further uncertainty is likely, coming from the dependence of these substances on their parent compounds and from the introduction of new model parameters. The purpose of this study was to investigate the behaviour of a parsimonious catchment-scale model for the assessment of river concentrations of the insecticide Chlorpyrifos (CP) and two of its TPs, Chlorpyrifos Oxon (CPO) and 3,5,6-trichloro-2-pyridinol (TCP), under the influence of uncertain input parameter values. Parameter uncertainty and pesticide application uncertainty in particular were investigated by Global Sensitivity Analysis (GSA) and the Generalized Likelihood Uncertainty Estimation (GLUE) method, based on Monte-Carlo sampling. GSA revealed that half-lives and sorption parameters as well as half-lives and transformation parameters were correlated to each other. This means that the concepts of modelling sorption and degradation/transformation were correlated, and thus it may be difficult in modelling studies to optimize parameter values for these modules. Furthermore, we could show that erroneous pesticide application mass and timing were compensated during Monte-Carlo sampling by changing the half-life of CP. However, the introduction of TCP into the calculation of the objective function was able to enhance identifiability of pesticide application mass. The GLUE analysis showed that CP and TCP were modelled successfully, but CPO modelling failed with high uncertainty and insensitive parameters. We assumed a structural error of the model which was especially important for CPO assessment. This shows that there is the possibility that a chemical and some of its TPs can be modelled successfully by a specific model structure, but for other TPs, the model

  4. Uncertainty modelling of critical column buckling for reinforced concrete buildings

    Indian Academy of Sciences (India)

    Kasim A Korkmaz; Fuat Demir; Hamide Tekeli

    2011-04-01

    Buckling is a critical issue for structural stability in structural design. In most buckling analyses, the applied loads, structural properties and material properties are considered certain. In reality, however, these parameters are uncertain. Therefore, a prognostic solution is necessary and uncertainties have to be considered. Fuzzy logic algorithms can be a solution to generate more dependable results. This study investigates the material uncertainties in column design and proposes an uncertainty model for critical column buckling in reinforced concrete buildings. A fuzzy logic algorithm was employed in the study. Lower and upper bounds of the elastic modulus representing the material properties were defined to take uncertainties into account. The results show that uncertainties play an important role in stability analyses and should be considered in the design. The proposed approach is applicable to both future numerical and experimental research. According to the study results, the calculated buckling load values remain within the lower and upper bounds, while the load values differ for the same concrete strength when different code formulae are used.
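
The bounding idea in this abstract can be shown with the classical Euler buckling formula, P_cr = pi^2 E I / (kL)^2, evaluated at the lower and upper bounds of the elastic modulus. The section properties and the ±20% bounds below are assumed for illustration; the study itself derives its bounds from a fuzzy logic algorithm.

```python
import math

def buckling_load(E, I, L, k=1.0):
    """Euler critical load P_cr = pi^2 * E * I / (k*L)^2."""
    return math.pi ** 2 * E * I / (k * L) ** 2

I = 6.75e-4      # m^4, assumed second moment of area of the column section
L = 3.0          # m, assumed column length (pinned-pinned, k = 1)
E_nom = 30e9     # Pa, nominal concrete elastic modulus (assumed)
E_lo, E_hi = 0.8 * E_nom, 1.2 * E_nom   # assumed uncertainty bounds on E

P_lo = buckling_load(E_lo, I, L)
P_hi = buckling_load(E_hi, I, L)
print("P_cr bounds: %.1f MN .. %.1f MN" % (P_lo / 1e6, P_hi / 1e6))
```

Because P_cr is linear in E, the interval on the modulus maps directly onto an interval on the critical load, which is the kind of bounded result the study reports.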

  5. Propagating Uncertainties from Source Model Estimations to Coulomb Stress Changes

    Science.gov (United States)

    Baumann, C.; Jonsson, S.; Woessner, J.

    2009-12-01

    Multiple studies have shown that static stress changes due to permanent fault displacement trigger earthquakes on the causative and on nearby faults. Calculations of static stress changes in previous studies have been based on fault parameters without considering any source model uncertainties, or with crude assumptions about fault model errors based on the available different source models. In this study, we investigate the influence of fault model parameter uncertainties on Coulomb Failure Stress change (ΔCFS) calculations by propagating the uncertainties from the fault estimation process to the Coulomb Failure Stress changes. We use 2500 sets of correlated model parameters determined for the June 2000 Mw = 5.8 Kleifarvatn earthquake, southwest Iceland, which were estimated by using a repeated optimization procedure and multiple data sets that had been modified by synthetic noise. The model parameters show that the event was predominantly a right-lateral strike-slip earthquake on a north-south striking fault. The variability of the sets of models represents the posterior probability density distribution for the Kleifarvatn source model. First, we investigate the influence of individual source model parameters on the ΔCFS calculations. We show through a correlation analysis that for this event, changes in dip, east location, strike, width and in part north location have a stronger impact on the Coulomb failure stress changes than changes in fault length, depth, dip-slip and strike-slip. Second, we find that the accuracy of Coulomb failure stress changes appears to increase with increasing distance from the fault. The absolute value of the standard deviation decays rapidly with distance within about 5-6 km around the fault from about 3-3.5 MPa down to a few Pa, implying that the influence of parameter changes decreases with increasing distance. This is underlined by the coefficient of variation CV, defined as the ratio of the standard deviation of the Coulomb stress
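
The propagation strategy described here (pushing an ensemble of correlated source-model samples through the stress calculation and examining the spread versus distance) can be sketched with a deliberately simplified stress kernel. The toy parameters and the 1/(d^2+r^2)^{3/2} decay below are illustrative stand-ins, not the elastic dislocation modelling of the study.

```python
import numpy as np

rng = np.random.default_rng(17)
# 2500 correlated source-model samples (toy stand-ins for slip and depth)
slip = rng.normal(1.0, 0.1, 2500)
depth = 4.0 + 0.5 * (slip - 1.0) + rng.normal(0, 0.2, 2500)  # correlated with slip

r = np.linspace(1, 20, 40)                   # distance from the fault, km
# Toy far-field static stress change ~ slip / (depth^2 + r^2)^(3/2)
dcfs = slip[:, None] / (depth[:, None] ** 2 + r[None, :] ** 2) ** 1.5

sd = dcfs.std(0)                             # ensemble spread at each distance
cv = sd / np.abs(dcfs.mean(0))               # coefficient of variation vs. distance
print("spread near fault %.2e, far from fault %.2e" % (sd[0], sd[-1]))
```

Even in this caricature, the absolute spread of the stress change decays rapidly with distance, mirroring the pattern the study reports for the Kleifarvatn ensemble.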

  6. The climate dependence of the terrestrial carbon cycle; including parameter and structural uncertainties

    Directory of Open Access Journals (Sweden)

    M. J. Smith

    2012-10-01

    Full Text Available The feedback between climate and the terrestrial carbon cycle will be a key determinant of the dynamics of the Earth System over the coming decades and centuries. However, Earth System Model projections of the terrestrial carbon balance vary widely over these timescales. This is largely due to differences in their carbon cycle models. A major goal in biogeosciences is therefore to improve understanding of the terrestrial carbon cycle to enable better constrained projections. Essential to achieving this goal will be assessing the empirical support for alternative models of component processes, identifying key uncertainties and inconsistencies, and ultimately identifying the models that are most consistent with empirical evidence. To begin meeting these requirements we data-constrained all parameters of all component processes within a global terrestrial carbon model. Our goals were to assess the climate dependencies obtained for different component processes when all parameters have been inferred from empirical data, assess whether these were consistent with current knowledge and understanding, assess the importance of different data sets and the model structure for inferring those dependencies, assess the predictive accuracy of the model, and to identify a methodology by which alternative component models could be compared within the same framework in future. Although formulated as differential equations describing carbon fluxes through plant and soil pools, the model was fitted assuming the carbon pools were in states of dynamic equilibrium (input rates equal output rates). Thus, the parameterised model is of the equilibrium terrestrial carbon cycle. All but 2 of the 12 component processes of the model were inferred to have strong climate dependencies, although it was not possible to data-constrain all parameters, indicating some potentially redundant details. Similar climate dependencies were obtained for most processes whether inferred

  7. A python framework for environmental model uncertainty analysis

    Science.gov (United States)

    White, Jeremy; Fienen, Michael; Doherty, John E.

    2016-01-01

    We have developed pyEMU, a python framework for Environmental Modeling Uncertainty analyses, an open-source tool that is non-intrusive, easy to use, computationally efficient, and scalable to highly parameterized inverse problems. The framework implements several types of linear (first-order, second-moment (FOSM)) and non-linear uncertainty analyses. The FOSM-based analyses can also be completed prior to parameter estimation to help inform important modeling decisions, such as parameterization and objective function formulation. Complete workflows for several types of FOSM-based and non-linear analyses are documented in example notebooks implemented using Jupyter that are available in the online pyEMU repository. Example workflows include basic parameter and forecast analyses, data worth analyses, and error-variance analyses, as well as usage of parameter ensemble generation and management capabilities. These workflows document the necessary steps and provide insights into the results, with the goal of educating users not only in how to apply pyEMU, but also in the underlying theory of applied uncertainty quantification.
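
The FOSM analysis underlying pyEMU can be illustrated independently of the package itself: a forecast's variance is propagated linearly through a Jacobian, and conditioning the prior parameter covariance on observations (a Schur complement) quantifies data worth. The matrices below are small made-up examples, not a pyEMU workflow.

```python
import numpy as np

# Sensitivity of one forecast to 3 parameters (assumed Jacobian row)
J = np.array([[1.0, 0.5, 0.2]])
C_prior = np.diag([0.4, 0.4, 0.4])       # prior parameter covariance

X = np.array([[2.0, 0.0, 0.1],
              [0.0, 1.5, 0.3]])          # sensitivities of 2 observations
C_obs = np.diag([0.1, 0.1])              # observation noise covariance

# Schur complement: condition the prior parameter covariance on the observations
gain = C_prior @ X.T @ np.linalg.inv(X @ C_prior @ X.T + C_obs)
C_post = C_prior - gain @ X @ C_prior

var_prior = float(J @ C_prior @ J.T)     # forecast variance before calibration
var_post = float(J @ C_post @ J.T)       # forecast variance after calibration
print("forecast variance: prior %.4f -> posterior %.4f" % (var_prior, var_post))
```

The reduction from prior to posterior forecast variance is exactly the "data worth" quantity the abstract mentions, and it requires only Jacobians, so it can be computed before any parameter estimation is run.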

  8. Geostatistical simulation of geological architecture and uncertainty propagation in groundwater modeling

    DEFF Research Database (Denmark)

    He, Xiulan

    parameters and model structures, which are the primary focuses of this PhD research. Parameter uncertainty was analyzed using an optimization tool (PEST: Parameter ESTimation) in combination with a random sampling method (LHS: Latin Hypercube Sampling). Model structure, namely geological architecture...... was analyzed using both a traditional two-point based geostatistical approach and multiple-point geostatistics (MPS). Our results documented that model structure is as important as model parameter regarding groundwater modeling uncertainty. Under certain circumstances the inaccuracy on model structure can...

  9. Intrinsic Uncertainties in Modeling Complex Systems.

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, Curtis S; Bramson, Aaron L.; Ames, Arlo L.

    2014-09-01

    Models are built to understand and predict the behaviors of both natural and artificial systems. Because it is always necessary to abstract away aspects of any non-trivial system being modeled, we know models can potentially leave out important, even critical elements. This reality of the modeling enterprise forces us to consider the prospective impacts of those effects completely left out of a model - either intentionally or unconsidered. Insensitivity to new structure is an indication of diminishing returns. In this work, we represent a hypothetical unknown effect on a validated model as a finite perturbation whose amplitude is constrained within a control region. We find robustly that without further constraints, no meaningful bounds can be placed on the amplitude of a perturbation outside of the control region. Thus, forecasting into unsampled regions is a very risky proposition. We also present inherent difficulties with proper time discretization of models and representing inherently discrete quantities. We point out potentially worrisome uncertainties, arising from mathematical formulation alone, which modelers can inadvertently introduce into models of complex systems. Acknowledgements This work has been funded under early-career LDRD project #170979, entitled "Quantifying Confidence in Complex Systems Models Having Structural Uncertainties", which ran from 04/2013 to 09/2014. We wish to express our gratitude to the many researchers at Sandia who contributed ideas to this work, as well as feedback on the manuscript. In particular, we would like to mention George Barr, Alexander Outkin, Walt Beyeler, Eric Vugrin, and Laura Swiler for providing invaluable advice and guidance through the course of the project. We would also like to thank Steven Kleban, Amanda Gonzales, Trevor Manzanares, and Sarah Burwell for their assistance in managing project tasks and resources.

  10. Using dynamical uncertainty models estimating uncertainty bounds on power plant performance prediction

    DEFF Research Database (Denmark)

    Odgaard, Peter Fogh; Stoustrup, Jakob; Mataji, B.

    2007-01-01

    Predicting the performance of large scale plants can be difficult due to model uncertainties etc, meaning that one can be almost certain that the prediction will diverge from the plant performance with time. In this paper output multiplicative uncertainty models are used as dynamical models...... of the prediction error. These proposed dynamical uncertainty models result in an upper and lower bound on the predicted performance of the plant. The dynamical uncertainty models are used to estimate the uncertainty of the predicted performance of a coal-fired power plant. The proposed scheme, which uses dynamical...... models, is applied to two different sets of measured plant data. The computed uncertainty bounds cover the measured plant output, while the nominal prediction is outside these uncertainty bounds for some samples in these examples.

  11. Model uncertainty and Bayesian model averaging in vector autoregressive processes

    NARCIS (Netherlands)

    R.W. Strachan (Rodney); H.K. van Dijk (Herman)

    2006-01-01

    textabstractEconomic forecasts and policy decisions are often informed by empirical analysis based on econometric models. However, inference based upon a single model, when several viable models exist, limits its usefulness. Taking account of model uncertainty, a Bayesian model averaging procedure i
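
The Bayesian model averaging idea in this record can be illustrated with a standard BIC approximation to posterior model probabilities, here for three competing linear regressions rather than the paper's vector autoregressions. The synthetic data and candidate models are assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 200
x1, x2 = rng.normal(size=(2, n))
y = 1.0 + 2.0 * x1 + rng.normal(0, 1.0, n)   # x2 is actually irrelevant

def fit_bic(cols):
    """OLS fit with intercept; returns the BIC of the candidate model."""
    X = np.column_stack([np.ones(n)] + cols)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = ((y - X @ beta) ** 2).sum()
    return n * np.log(rss / n) + X.shape[1] * np.log(n)

bics = np.array([fit_bic([x1]), fit_bic([x2]), fit_bic([x1, x2])])
w = np.exp(-0.5 * (bics - bics.min()))
w /= w.sum()          # approximate posterior model probabilities
print("model weights:", np.round(w, 3))
```

A BMA forecast is then the weight-averaged forecast across all candidate models, so inference no longer hinges on committing to a single specification.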

  12. Uncertainty and the Conceptual Site Model

    Science.gov (United States)

    Price, V.; Nicholson, T. J.

    2007-12-01

    Our focus is on uncertainties in the underlying conceptual framework upon which all subsequent steps in numerical and/or analytical modeling efforts depend. Experienced environmental modelers recognize the value of selecting an optimal conceptual model from several competing site models, but usually do not formally explore possible alternative models, in part due to incomplete or missing site data, as well as relevant regional data for establishing boundary conditions. The value in and approach for developing alternative conceptual site models (CSM) is demonstrated by analysis of case histories. These studies are based on reported flow or transport modeling in which alternative site models are formulated using data that were not available to, or not used by, the original modelers. An important concept inherent to model abstraction of these alternative conceptual models is that it is "Far better an approximate answer to the right question, which is often vague, than the exact answer to the wrong question, which can always be made precise." (Tukey, 1962) The case histories discussed here illustrate the value of formulating alternative models and evaluating them using site-specific data: (1) Charleston Naval Site where seismic characterization data allowed significant revision of the CSM and subsequent contaminant transport modeling; (2) Hanford 300-Area where surface- and ground-water interactions affecting the unsaturated zone suggested an alternative component to the site model; (3) Savannah River C-Area where a characterization report for a waste site within the modeled area was not available to the modelers, but provided significant new information requiring changes to the underlying geologic and hydrogeologic CSM's used; (4) Amargosa Desert Research Site (ADRS) where re-interpretation of resistivity sounding data and water-level data suggested an alternative geologic model. Simple 2-D spreadsheet modeling of the ADRS with the revised CSM provided an improved

  13. Influence of parameter estimation uncertainty in Kriging: Part 2 - Test and case study applications

    Directory of Open Access Journals (Sweden)

    E. Todini

    2001-01-01

    Full Text Available The theoretical approach introduced in Part 1 is applied to a numerical example and to the case of yearly average precipitation estimation over the Veneto Region in Italy. The proposed methodology was used to assess the effects of parameter estimation uncertainty on Kriging estimates and on their estimated error variance. The Maximum Likelihood (ML) estimator proposed in Part 1 was applied to the zero-mean deviations from yearly average precipitation over the Veneto Region in Italy, obtained after the elimination of a non-linear drift with elevation. Three different semi-variogram models were used, namely the exponential, the Gaussian and the modified spherical, and the relevant biases as well as the increases in variance have been assessed. A numerical example was also conducted to demonstrate how the procedure leads to unbiased estimates of the random functions. One hundred sets of 82 observations were generated by means of the exponential model on the basis of the parameter values identified for the Veneto Region rainfall problem and taken as characterising the true underlying process. The parameter values, and the consequent cross-validation errors, were estimated from each sample. The cross-validation errors were first computed in the classical way and then corrected with the procedure derived in Part 1. Both sets, original and corrected, were then tested, by means of the likelihood ratio test, against the null hypothesis of deriving from a zero-mean process with unknown covariance. The results of the experiment clearly show the effectiveness of the proposed approach. Keywords: yearly rainfall, maximum likelihood, Kriging, parameter estimation uncertainty
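
The source of the parameter estimation uncertainty studied here can be made tangible with a toy example: simulate one realization of a field with a known exponential covariance, fit an exponential semivariogram to it, and inspect the fitted parameters and their covariance. This least-squares fit is a simplified stand-in for the paper's maximum-likelihood estimator, and all settings are assumed.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(9)
# Synthetic 1-D field with exponential covariance (true range 10, sill 1)
x = np.linspace(0, 100, 200)
d = np.abs(x[:, None] - x[None, :])
C = np.exp(-d / 10.0)
z = np.linalg.cholesky(C + 1e-10 * np.eye(200)) @ rng.normal(size=200)

# Empirical semivariogram at integer-lag separations
dx = x[1] - x[0]
lags = np.arange(1, 30)
gamma = np.array([0.5 * np.mean((z[:-l] - z[l:]) ** 2) for l in lags])

def exp_model(h, sill, a):
    return sill * (1.0 - np.exp(-h / a))

popt, pcov = curve_fit(exp_model, lags * dx, gamma, p0=[1.0, 5.0])
perr = np.sqrt(np.diag(pcov))    # estimation uncertainty of sill and range
print("sill %.2f +/- %.2f, range %.2f +/- %.2f" % (popt[0], perr[0], popt[1], perr[1]))
```

Because the semivariogram parameters carry this estimation error, any Kriging variance computed as if they were exact understates the true uncertainty, which is the bias the paper corrects.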

  14. Methodology for characterizing modeling and discretization uncertainties in computational simulation

    Energy Technology Data Exchange (ETDEWEB)

    ALVIN,KENNETH F.; OBERKAMPF,WILLIAM L.; RUTHERFORD,BRIAN M.; DIEGERT,KATHLEEN V.

    2000-03-01

    This research effort focuses on methodology for quantifying the effects of model uncertainty and discretization error on computational modeling and simulation. The work is directed towards developing methodologies which treat model form assumptions within an overall framework for uncertainty quantification, for the purpose of developing estimates of total prediction uncertainty. The present effort consists of work in three areas: framework development for sources of uncertainty and error in the modeling and simulation process which impact model structure; model uncertainty assessment and propagation through Bayesian inference methods; and discretization error estimation within the context of non-deterministic analysis.

  15. Impact of uncertainty in attributing modeled North American terrestrial carbon fluxes to anthropogenic forcings

    Science.gov (United States)

    Ricciuto, D. M.

    2015-12-01

    Although much progress has been made in the past decade in constraining the net North American terrestrial carbon flux, considerable uncertainty remains in the sink magnitude and trend. Terrestrial carbon cycle models are increasing in spatial resolution, complexity and predictive skill, allowing for increased process-level understanding and attribution of net carbon fluxes to specific causes. Here we examine the various sources of uncertainty, including driver uncertainty, model parameter uncertainty, and structural uncertainty; the contribution of each type of uncertainty to the net sink; and the attribution of this sink to anthropogenic causes: increasing CO2 concentrations, nitrogen deposition, land use change, and changing climate. To examine driver and parameter uncertainty, model simulations are performed using the Community Land Model version 4.5 (CLM4.5) with literature-based parameter ranges and three different reanalysis meteorological forcing datasets. We also examine structural uncertainty through analysis of the Multiscale Terrestrial Model Intercomparison (MsTMIP). Identifying major sources of uncertainty can help to guide future observations, experiments, and model development activities.

  16. How well can we forecast future model error and uncertainty by mining past model performance data

    Science.gov (United States)

    Solomatine, Dimitri

    2016-04-01

    Consider a hydrological model Y(t) = M(X(t), P), where X = vector of inputs; P = vector of parameters; Y = model output (typically flow); t = time. In cases when there is enough past data on the performance of model M, it is possible to use this data to build a (data-driven) model EC of the error of model M. This model EC will be able to forecast the error E when a new input X is fed into model M; then, by subtracting E from the model prediction Y, a better estimate of Y can be obtained. Model EC is usually called the error corrector (in meteorology, a bias corrector). However, we may go further in characterizing model deficiencies, and instead of using the error (a real value) we may consider a more sophisticated characterization, namely a probabilistic one. So instead of a model EC of the model M error, it is also possible to build a model U of model M uncertainty; if uncertainty is described as the model error distribution D, this model will calculate its properties - mean, variance, other moments, and quantiles. The general form of this model could be: D = U(RV), where RV = vector of relevant variables having influence on model uncertainty (to be identified e.g. by mutual information analysis); D = vector of variables characterizing the error distribution (typically, two or more quantiles). There is one aspect which is not always explicitly mentioned in uncertainty analysis work. In our view it is important to distinguish the following main types of model uncertainty: 1. The residual uncertainty of models. In this case the model parameters and/or model inputs are considered to be fixed (deterministic), i.e. the model is considered to be optimal (calibrated) and deterministic. Model error is considered as the manifestation of uncertainty. If there is enough past data about the model errors (i.e. its uncertainty), it is possible to build a statistical or machine learning model of uncertainty trained on this data.
Here the following methods can be mentioned: (a) quantile regression (QR
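
The uncertainty model U described above can be sketched with a simple nonparametric stand-in: conditioning the distribution of past model errors on an input variable by binning, and reading off quantiles. The abstract mentions quantile regression for this role; the binned approach, toy model, and heteroscedastic noise below are all assumed simplifications.

```python
import numpy as np

rng = np.random.default_rng(11)
n = 2000
x = rng.uniform(0, 10, n)                          # model input (past record)
y_obs = 2 * x + rng.normal(0, 0.2 * (1 + x), n)    # heteroscedastic "truth"
y_mod = 2 * x                                      # deterministic model M
err = y_obs - y_mod                                # past model errors

# Model U: residual quantiles conditioned on x (one relevant variable RV)
bins = np.linspace(0, 10, 11)
idx = np.digitize(x, bins) - 1
q05 = np.array([np.quantile(err[idx == b], 0.05) for b in range(10)])
q95 = np.array([np.quantile(err[idx == b], 0.95) for b in range(10)])

x_new = 8.7                                        # new input fed into M
b = min(int(np.digitize(x_new, bins)) - 1, 9)
print("90%% predictive interval: [%.2f, %.2f]"
      % (2 * x_new + q05[b], 2 * x_new + q95[b]))
```

Because the error variance grows with x here, the learned interval correctly widens for large inputs, which is what a constant error band would miss.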

  17. Reliable Estimation of Prediction Uncertainty for Physicochemical Property Models.

    Science.gov (United States)

    Proppe, Jonny; Reiher, Markus

    2017-07-11

    One of the major challenges in computational science is to determine the uncertainty of a virtual measurement, that is, the prediction of an observable based on calculations. As highly accurate first-principles calculations are in general infeasible for most physical systems, one usually resorts to parametric property models of observables, which require calibration by incorporating reference data. The resulting predictions and their uncertainties are sensitive to systematic errors such as inconsistent reference data, parametric model assumptions, or inadequate computational methods. Here, we discuss the calibration of property models in the light of bootstrapping, a sampling method that can be employed for identifying systematic errors and for reliable estimation of the prediction uncertainty. We apply bootstrapping to assess a linear property model linking the (57)Fe Mössbauer isomer shift to the contact electron density at the iron nucleus for a diverse set of 44 molecular iron compounds. The contact electron density is calculated with 12 density functionals across Jacob's ladder (PWLDA, BP86, BLYP, PW91, PBE, M06-L, TPSS, B3LYP, B3PW91, PBE0, M06, TPSSh). We provide systematic-error diagnostics and reliable, locally resolved uncertainties for isomer-shift predictions. Pure and hybrid density functionals yield average prediction uncertainties of 0.06-0.08 mm s(-1) and 0.04-0.05 mm s(-1), respectively, the latter being close to the average experimental uncertainty of 0.02 mm s(-1). Furthermore, we show that both model parameters and prediction uncertainty depend significantly on the composition and number of reference data points. Accordingly, we suggest that rankings of density functionals based on performance measures (e.g., the squared coefficient of correlation, r(2), or the root-mean-square error, RMSE) should not be inferred from a single data set. This study presents the first statistically rigorous calibration analysis for theoretical M
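
The bootstrap calibration idea in this record can be shown in miniature: resample the reference set with replacement, refit the linear property model each time, and use the spread of predictions as the prediction uncertainty. The synthetic 44-point data set and the linear relation below are stand-ins, not the paper's Mössbauer calibration.

```python
import numpy as np

rng = np.random.default_rng(13)
n = 44
rho = rng.uniform(0, 1, n)                          # calibration inputs (assumed units)
delta = 1.5 - 0.3 * rho + rng.normal(0, 0.05, n)    # synthetic "reference" observables

rho_new = 0.5                                        # query point for a prediction
preds = []
for _ in range(2000):
    i = rng.integers(0, n, n)                        # resample the reference set
    a, b = np.polyfit(rho[i], delta[i], 1)           # refit the linear property model
    preds.append(a * rho_new + b)
preds = np.array(preds)
print("prediction %.3f +/- %.3f" % (preds.mean(), preds.std()))
```

Because the bootstrap spread reflects how strongly the fit depends on which reference points happen to be included, it also flags the sensitivity to data-set composition that the abstract emphasizes.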

  18. Morphological divergence rate tests for natural selection: uncertainty of parameter estimation and robustness of results

    Directory of Open Access Journals (Sweden)

    Leandro R. Monteiro

    2005-01-01

    Full Text Available In this study, we used a combination of geometric morphometric and evolutionary genetics methods for the inference of possible mechanisms of evolutionary divergence. A sensitivity analysis of the constant-heritability rate test results with respect to variation in genetic and demographic parameters was performed, in order to assess the relative influence of uncertainty in parameter estimation on the robustness of test results. As an application, we present a study on body shape variation among populations of the poeciliine fish Poecilia vivipara inhabiting lagoons of the Quaternary plains in northern Rio de Janeiro State, Brazil. The sensitivity analysis showed that, in general, the most important parameters are heritability, effective population size, and number of generations since divergence. For this specific example, using a conservatively wide range of parameters, the neutral model of genetic drift could not be accepted as the sole cause of the observed magnitude of morphological divergence among populations. A mechanism of directional selection is suggested as the main cause of variation among populations in different habitats and lagoons. The implications of parameter estimation and of the underlying biological assumptions are discussed.

  19. Uncertainty analysis of strain modal parameters by Bayesian method using frequency response function

    Institute of Scientific and Technical Information of China (English)

    Xu Li; Yi Weijian; Zhihua Yi

    2007-01-01

    Structural strain modes are able to detect changes in local structural performance, but errors are inevitably intermixed in the measured data. In this paper, strain modal parameters are considered as random variables, and their uncertainty is analyzed by a Bayesian method based on the structural frequency response function (FRF). The estimates of strain modal parameters with maximal posterior probability are determined. Several independent measurements of the FRF of a four-story reinforced concrete frame structural model were performed in the laboratory, and the ability to identify a stiffness change in a concrete column using the strain mode was verified. It is shown that the uncertainty of the natural frequency is very small. Compared with the displacement mode shape, the variations of the strain mode shape at each point are quite different. The damping ratios are more strongly affected by the type of test system. Except for cases where a high-order strain mode fails to identify local damage, the first-order strain mode provides an exact indication of the damage location.

  20. A SYSTEMS DYNAMICS APPROACH TO COMPETING TECHNOLOGIES: EXPLORING UNCERTAINTY OF INTERACTION AND MARKET PARAMETERS

    Directory of Open Access Journals (Sweden)

    L. Pretorius

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: Technology can be identified as the result of an innovation process that may be time-dependent. Furthermore, technology is both an input to the innovation process and an output of it. When two competing technologies are diffused into the market, they are evaluated as a technology system by means of a systems dynamics approach. It is shown that systems thinking can be used initially to identify and assess the important factors that influence the competitive behaviour of the two technologies. Interesting dynamics of this technology management system are presented and discussed in the context of uncertainty of interaction between the two technologies. It is specifically shown that the life span of the existing technology, which resists competition, may be adversely affected under conditions of uncertainty. The effect of uncertainty in more than one systems dynamics model parameter – specifically, the interaction and market parameter in the competing technology system – is also addressed. The Lotka-Volterra approach of predator-prey interaction is used to model the interaction between and diffusion of the two technologies in the system. A qualitative assessment of the systems dynamics model without uncertainty is attempted in the exploration of a real case study of two competing technologies.

    AFRIKAANSE OPSOMMING (translated): Technology can be described as the result of an innovation process that may vary over time. Technology is both an input to and an output of the innovation process. A case in which two competing technologies diffuse into the market is evaluated as a technology system by means of system dynamics. It is shown that systems thinking can be used as a precursor to identify and assess the important factors that influence the competitive behaviour of the two technologies. Interesting dynamic behaviour of this technology management system is presented and discussed in
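A minimal sketch of the Lotka-Volterra competition approach under parameter uncertainty: the interaction coefficient acting on the incumbent technology is sampled from an assumed interval and the coupled equations are integrated with a simple Euler scheme. All rates, capacities, and initial adoption levels are illustrative, not taken from the case study:

```python
import random

def simulate(alpha12, alpha21, steps=2000, dt=0.01):
    """Euler integration of a Lotka-Volterra competition model for the
    adoption levels x1 (incumbent) and x2 (new technology)."""
    x1, x2 = 0.5, 0.05               # illustrative initial adoption levels
    r1, r2, k1, k2 = 0.8, 1.0, 1.0, 1.0
    for _ in range(steps):
        dx1 = r1 * x1 * (1 - (x1 + alpha12 * x2) / k1)
        dx2 = r2 * x2 * (1 - (x2 + alpha21 * x1) / k2)
        x1 += dt * dx1
        x2 += dt * dx2
    return x1, x2

# Uncertainty in the interaction parameter: sample it from an interval and
# observe the spread in the incumbent technology's final adoption level.
random.seed(1)
finals = [simulate(alpha12=random.uniform(0.8, 1.2), alpha21=0.7)
          for _ in range(200)]
x1_vals = [f[0] for f in finals]
print(min(x1_vals), max(x1_vals))
```

The wide spread in the incumbent's final level mirrors the abstract's finding that the life span of the existing technology is sensitive to uncertainty in the interaction parameter.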

  1. Experimental Active Vibration Control in Truss Structures Considering Uncertainties in System Parameters

    Directory of Open Access Journals (Sweden)

    Douglas Domingues Bueno

    2008-01-01

    Full Text Available This paper deals with the study of algorithms for robust active vibration control in flexible structures considering uncertainties in system parameters. This has become an area of enormous interest, mainly due to the countless demands for optimal performance in mechanical systems such as aircraft, aerospace, and automotive structures. An important and difficult problem in designing active vibration control is to obtain a representative dynamic model. Generally, this model can be obtained using the finite element method (FEM) or an identification method using experimental data. Actuators and sensors may affect the dynamic properties of the structure; for instance, the electromechanical coupling of piezoelectric material must be considered in the FEM formulation for flexible and lightly damped structures. The nonlinearities and uncertainties involved make this a difficult task, mainly for complex structures such as spatial trusses. On the other hand, by using an identification method, it is possible to obtain a dynamic model represented through a state-space realization that accounts for this coupling. This paper proposes an experimental methodology for vibration control in a 3D truss structure using PZT wafer stacks and a robust control algorithm solved by linear matrix inequalities.

  2. Sensitivities and uncertainties of modeled ground temperatures in mountain environments

    Directory of Open Access Journals (Sweden)

    S. Gubler

    2013-02-01

    Full Text Available Before operational use or for decision making, models must be validated, and the degree of trust in model outputs should be quantified. Often, model validation is performed at single locations due to the lack of spatially-distributed data. Since the analysis of parametric model uncertainties can be performed independently of observations, it is a suitable method to test the influence of environmental variability on model evaluation. In this study, the sensitivities and uncertainty of a physically-based mountain permafrost model are quantified within an artificial topography consisting of different elevations and exposures combined with six ground types characterized by their hydraulic properties. The analyses performed for all combinations of topographic factors and ground types allowed us to quantify the variability of model sensitivity and uncertainty within mountain regions. We found that modeled snow duration considerably influences the mean annual ground temperature (MAGT). The melt-out day of snow (MD) is determined by the processes governing snow accumulation and melting. Parameters such as the temperature and precipitation lapse rates and the snow correction factor therefore have a great impact on modeled MAGT. Ground albedo changes MAGT by 0.5 to 4°C depending on the elevation, the aspect, and the ground type. South-exposed inclined locations are more sensitive to changes in ground albedo than north-exposed slopes since they receive more solar radiation. The sensitivity to ground albedo increases with decreasing elevation due to the shorter snow cover. Snow albedo and other parameters determining the amount of reflected solar radiation are important, changing MAGT at different depths by more than 1°C. Parameters influencing the turbulent fluxes, such as the roughness length or the dew temperature, are more sensitive at low-elevation sites due to higher air temperatures and decreased solar radiation. Modeling the individual terms of the energy

  3. Understanding uncertainties in model-based predictions of Aedes aegypti population dynamics.

    Directory of Open Access Journals (Sweden)

    Chonggang Xu

    2010-09-01

    Full Text Available Aedes aegypti is one of the most important mosquito vectors of human disease. The development of spatial models for Ae. aegypti provides a promising start toward model-guided vector control and risk assessment, but this will only be possible if models make reliable predictions. The reliability of model predictions is affected by specific sources of uncertainty in the model. This study quantifies uncertainties in the predicted mosquito population dynamics at the community level (a cluster of 612 houses) and the individual-house level based on Skeeter Buster, a spatial model of Ae. aegypti, for the city of Iquitos, Peru. The study considers two types of uncertainty: (1) uncertainty in the estimates of 67 parameters that describe mosquito biology and life history, and (2) uncertainty due to environmental and demographic stochasticity. Our results show that for pupal density and for female adult density at the community level, respectively, the 95% prediction confidence interval ranges from 1,000 to 3,000 and from 700 to 5,000 individuals. The two parameters contributing most to the uncertainties in predicted population densities at both individual-house and community levels are the female adult survival rate and a coefficient determining weight loss due to energy used in metabolism at the larval stage (i.e., metabolic weight loss). Compared to parametric uncertainty, stochastic uncertainty is relatively low for population density predictions at the community level (less than 5% of the overall uncertainty) but is substantially higher for predictions at the individual-house level (larger than 40% of the overall uncertainty). Uncertainty in mosquito spatial dispersal has little effect on population density predictions at the community level but is important for the prediction of spatial clustering at the individual-house level. This is the first systematic uncertainty analysis of a detailed Ae. aegypti population dynamics model and provides an approach for

  4. Understanding uncertainties in model-based predictions of Aedes aegypti population dynamics.

    Science.gov (United States)

    Xu, Chonggang; Legros, Mathieu; Gould, Fred; Lloyd, Alun L

    2010-09-28

    Aedes aegypti is one of the most important mosquito vectors of human disease. The development of spatial models for Ae. aegypti provides a promising start toward model-guided vector control and risk assessment, but this will only be possible if models make reliable predictions. The reliability of model predictions is affected by specific sources of uncertainty in the model. This study quantifies uncertainties in the predicted mosquito population dynamics at the community level (a cluster of 612 houses) and the individual-house level based on Skeeter Buster, a spatial model of Ae. aegypti, for the city of Iquitos, Peru. The study considers two types of uncertainty: 1) uncertainty in the estimates of 67 parameters that describe mosquito biology and life history, and 2) uncertainty due to environmental and demographic stochasticity. Our results show that for pupal density and for female adult density at the community level, respectively, the 95% prediction confidence interval ranges from 1,000 to 3,000 and from 700 to 5,000 individuals. The two parameters contributing most to the uncertainties in predicted population densities at both individual-house and community levels are the female adult survival rate and a coefficient determining weight loss due to energy used in metabolism at the larval stage (i.e. metabolic weight loss). Compared to parametric uncertainty, stochastic uncertainty is relatively low for population density predictions at the community level (less than 5% of the overall uncertainty) but is substantially higher for predictions at the individual-house level (larger than 40% of the overall uncertainty). Uncertainty in mosquito spatial dispersal has little effect on population density predictions at the community level but is important for the prediction of spatial clustering at the individual-house level. This is the first systematic uncertainty analysis of a detailed Ae. aegypti population dynamics model and provides an approach for identifying those
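The parametric-versus-stochastic decomposition used in such an analysis can be illustrated with a nested Monte Carlo design: draw a parameter set, run several stochastic replicates under it, and split the total variance into a between-parameter-set (parametric) and a within-parameter-set (stochastic) component. The toy model below is a hypothetical stand-in, not Skeeter Buster:

```python
import random, statistics

random.seed(42)

def run_model(survival, seed):
    """Toy stand-in for a stochastic population model: predicted density
    depends on an uncertain parameter plus demographic noise."""
    rng = random.Random(seed)
    return 2000.0 * survival + rng.gauss(0.0, 50.0)

n_param, n_rep = 100, 20
group_means, within_vars, all_runs = [], [], []
for i in range(n_param):
    survival = random.uniform(0.8, 0.95)     # parametric uncertainty
    runs = [run_model(survival, seed=i * n_rep + j) for j in range(n_rep)]
    group_means.append(statistics.mean(runs))
    within_vars.append(statistics.variance(runs))
    all_runs.extend(runs)

total_var = statistics.variance(all_runs)
parametric_var = statistics.variance(group_means)   # between parameter sets
stochastic_var = statistics.mean(within_vars)       # within parameter sets
print(f"parametric share of variance: {parametric_var / total_var:.0%}")
```

In this toy setting the parametric component dominates, analogous to the community-level result reported in the abstract.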

  5. Management of California Oak Woodlands: Uncertainties and Modeling

    Science.gov (United States)

    Jay E. Noel; Richard P. Thompson

    1995-01-01

    A mathematical policy model of oak woodlands is presented. The model illustrates the policy uncertainties that exist in the management of oak woodlands. These uncertainties include: (1) selection of a policy criterion function, (2) woodland dynamics, (3) initial and final state of the woodland stock. The paper provides a review of each of the uncertainty issues. The...

  6. The importance of expression of uncertainty of acoustical parameters of ultrasonic phantoms

    Energy Technology Data Exchange (ETDEWEB)

    Maggi, L E; Souza, A B B; Ichinose, R M; Pereira, W C A; Kruger, M A von [Programa de Engenharia Biomedica/COPPE - UFRJ, Rio de Janeiro (Brazil); Costa-Felix, R P B, E-mail: luis.maggi@gmail.com [Ultrasound Laboratory, Diavi/Dimci/Inmetro, Duque de Caxias, RJ (Brazil)

    2011-02-01

    The measurement of uncertainties in scientific experiments greatly improves the quality and reliability of the results. However, in many cases, experimental results are expressed only by their average value and standard deviation. The longitudinal velocity and attenuation coefficient are acoustic parameters commonly used to characterize biological tissues and materials. In this work we study the uncertainty in experiments designed to evaluate these parameters in two different materials (silicone rubber and PVCP). The uncertainties were evaluated following the Guide to the Expression of Uncertainty in Measurement and calculated by a program in LabVIEW 8.6. A setup was developed to measure the acoustic parameters by a transmission/reception technique. Five signals from each medium (water and the materials) were collected. The attenuation coefficient was calculated using the ratio between the amplitude spectrum peak of the water signal and the corresponding point on the spectrum of the material signal. The longitudinal velocity was calculated using the time delay between signal peaks (from water and from the material). The individual uncertainties of each part of the setup were estimated, and these values made it possible to identify the sources of uncertainty that contributed most to the associated uncertainty. This improved the experiment's quality and reliability.
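The two estimates described above are straightforward to compute. The sketch below assumes the standard substitution technique (the sample replaces a water path of the same length); the thickness, time delay, and spectral amplitudes are illustrative values, not the measured data:

```python
import math

# Illustrative substitution-technique calculation.
c_water = 1482.0   # m/s, speed of sound in water at ~20 °C
d = 0.010          # m, sample thickness (assumed)
dt = 1.2e-6        # s, arrival-time advance with the sample in place (assumed)

# Longitudinal velocity: the sample replaces a water path of length d.
c_sample = d / (d / c_water - dt)

# Attenuation coefficient from the spectral amplitude ratio at one frequency
# (transmission and diffraction losses ignored in this sketch).
A_water, A_sample = 1.00, 0.45   # relative spectrum peak amplitudes (assumed)
alpha_db_per_cm = (20.0 / (d * 100.0)) * math.log10(A_water / A_sample)

print(f"c_sample = {c_sample:.0f} m/s, alpha = {alpha_db_per_cm:.2f} dB/cm")
```

A full uncertainty budget would propagate the uncertainties of d, dt, and the amplitudes through these two expressions, as prescribed by the GUM.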

  7. Power system transient stability simulation under uncertainty based on Taylor model arithmetic

    Institute of Scientific and Technical Information of China (English)

    Shouxiang WANG; Zhijie ZHENG; Chengshan WANG

    2009-01-01

    The Taylor model arithmetic is introduced to deal with uncertainty. The uncertainty of model parameters is described by Taylor models, and each variable in the functions is replaced with its Taylor model (TM). Thus, time domain simulation under uncertainty is transformed into the integration of TM-based differential equations. In this paper, the Taylor series method is employed to compute the differential equations; moreover, power system time domain simulation under uncertainty based on the Taylor model method is presented. This method allows a rigorous estimation of the influence of either form of uncertainty and needs only one simulation. It is computationally fast compared with the Monte Carlo method, which is another technique for uncertainty analysis. The proposed method has been tested on the 39-bus New England system. The test results illustrate the effectiveness and practical value of the approach by comparison with the results of Monte Carlo simulation and traditional time domain simulation.

  8. Gaze categorization under uncertainty: psychophysics and modeling.

    Science.gov (United States)

    Mareschal, Isabelle; Calder, Andrew J; Dadds, Mark R; Clifford, Colin W G

    2013-04-22

    The accurate perception of another person's gaze direction underlies most social interactions and provides important information about his or her future intentions. As a first step to measuring gaze perception, most experiments determine the range of gaze directions that observers judge as being direct: the cone of direct gaze. This measurement has revealed the flexibility of observers' perception of gaze and provides a useful benchmark against which to test clinical populations with abnormal gaze behavior. Here, we manipulated effective signal strength by adding noise to the eyes of synthetic face stimuli or removing face information. We sought to move beyond a descriptive account of gaze categorization by fitting a model to the data that relies on changing the uncertainty associated with an estimate of gaze direction as a function of the signal strength. This model accounts for all the data and provides useful insight into the visual processes underlying normal gaze perception.

  9. Three Dimensional Vapor Intrusion Modeling: Model Validation and Uncertainty Analysis

    Science.gov (United States)

    Akbariyeh, S.; Patterson, B.; Rakoczy, A.; Li, Y.

    2013-12-01

    Volatile organic chemicals (VOCs), such as chlorinated solvents and petroleum hydrocarbons, are prevalent groundwater contaminants due to their improper disposal and accidental spillage. In addition to contaminating groundwater, VOCs may partition into the overlying vadose zone and enter buildings through gaps and cracks in foundation slabs or basement walls, a process termed vapor intrusion. Vapor intrusion of VOCs has been recognized as a detrimental pathway for human exposure to potentially carcinogenic or toxic compounds. The simulation of vapor intrusion from a subsurface source has been the focus of many studies aiming to better understand the process and guide field investigation. While multiple analytical and numerical models have been developed to simulate the vapor intrusion process, detailed validation of these models against well-controlled experiments is still lacking, due to the complexity and uncertainties associated with site characterization and with soil gas flux and indoor air concentration measurements. In this work, we present an effort to validate a three-dimensional vapor intrusion model based on a well-controlled experimental quantification of the vapor intrusion pathways into a slab-on-ground building under varying environmental conditions. Finally, a probabilistic approach based on Monte Carlo simulations is implemented to determine the probability distribution of the indoor air concentration based on the most uncertain input parameters.

  10. Probabilistic uncertainty analysis of epidemiological modeling to guide public health intervention policy

    Directory of Open Access Journals (Sweden)

    Jennifer A. Gilbert

    2014-03-01

    Full Text Available Mathematical modeling of disease transmission has provided quantitative predictions for health policy, facilitating the evaluation of epidemiological outcomes and the cost-effectiveness of interventions. However, typical sensitivity analyses of deterministic dynamic infectious disease models focus on model architecture and the relative importance of parameters but neglect parameter uncertainty when reporting model predictions. Consequently, model results that identify point estimates of intervention levels necessary to terminate transmission yield limited insight into the probability of success. We apply probabilistic uncertainty analysis to a dynamic model of influenza transmission and assess global uncertainty in outcome. We illustrate that when parameter uncertainty is not incorporated into outcome estimates, levels of vaccination and treatment predicted to prevent an influenza epidemic will only have an approximately 50% chance of terminating transmission and that sensitivity analysis alone is not sufficient to obtain this information. We demonstrate that accounting for parameter uncertainty yields probabilities of epidemiological outcomes based on the degree to which data support the range of model predictions. Unlike typical sensitivity analyses of dynamic models that only address variation in parameters, the probabilistic uncertainty analysis described here enables modelers to convey the robustness of their predictions to policy makers, extending the power of epidemiological modeling to improve public health.
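The core point, that an intervention sized to a point estimate has only about a 50% chance of success once parameter uncertainty is included, can be reproduced with a minimal Monte Carlo sketch. The R0 value and its assumed uncertainty below are illustrative, not the article's fitted values:

```python
import random

random.seed(7)

# Point estimate: R0 = 1.8, so the critical vaccination coverage is
# p_c = 1 - 1/R0. Sizing the intervention exactly at p_c ignores
# uncertainty in R0 itself.
coverage = 1.0 - 1.0 / 1.8

n = 10_000
successes = 0
for _ in range(n):
    r0 = random.normalvariate(1.8, 0.25)   # assumed parameter uncertainty
    r_eff = r0 * (1.0 - coverage)          # effective reproduction number
    if r_eff < 1.0:                        # transmission terminated
        successes += 1

print(f"P(epidemic prevented) = {successes / n:.2f}")
```

Because the coverage is tuned to the median R0, roughly half of the sampled parameter sets still produce an epidemic, which is the probabilistic insight a point-estimate sensitivity analysis alone cannot provide.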

  11. Dealing with Uncertainty about Item Parameters: Expected Response Functions

    Science.gov (United States)

    1994-04-01

    Planning Council, Educational Testing Service. We are grateful to Duanli Yan for computing assistance and to Mama Golub-Smith, Charlie Lewis, and Jerry...such an approximation. This expedient makes it possible to use standard off-the-shelf software designed for popular parametric IRT models to estimate...Psychometric Society, Nashville TN, June, 1985. Lindley, D.V. (1980). Approximate Bayesian methods. Trabajos Estadistica, 31,223-237. Lord, F.M. (1980

  12. Effects of input uncertainty on cross-scale crop modeling

    Science.gov (United States)

    Waha, Katharina; Huth, Neil; Carberry, Peter

    2014-05-01

    The quality of data on climate, soils and agricultural management in the tropics is in general low, or data are scarce, leading to uncertainty in process-based modeling of cropping systems. Process-based crop models are common tools for simulating crop yields and crop production in climate change impact studies, studies on mitigation and adaptation options, or food security studies. Crop modelers are concerned about input data accuracy, as this, together with an adequate representation of plant physiological processes and the choice of model parameters, is a key factor for a reliable simulation. For example, assuming an error in measurements of air temperature, radiation and precipitation of ± 0.2°C, ± 2 % and ± 3 % respectively, Fodor & Kovacs (2005) estimate that this translates into an uncertainty of 5-7 % in yield and biomass simulations. In our study we seek to answer the following questions: (1) are there important uncertainties in the spatial variability of simulated crop yields on the grid-cell level displayed on maps, (2) are there important uncertainties in the temporal variability of simulated crop yields on the aggregated, national level displayed in time series, and (3) how does the accuracy of different soil, climate and management information influence the simulated crop yields in two crop models designed for use at different spatial scales? The study will help to determine whether more detailed information improves the simulations and to advise model users on the uncertainty related to input data. We analyse the performance of the point-scale crop model APSIM (Keating et al., 2003) and the global-scale crop model LPJmL (Bondeau et al., 2007) with different climate information (monthly and daily) and soil conditions (global soil map and African soil map) under different agricultural management (uniform and variable sowing dates) for the low-input maize-growing areas in Burkina Faso/West Africa. We test the models' response to different levels of input

  13. Biases and Uncertainties in Physical Parameter Estimates of Lyman Break Galaxies from Broad-band Photometry

    CERN Document Server

    Lee, Seong-Kook; Ferguson, Henry C; Somerville, Rachel S; Wiklind, Tommy; Giavalisco, Mauro

    2008-01-01

    We investigate the biases and uncertainties in estimates of physical parameters of high-redshift Lyman break galaxies (LBGs), such as stellar mass, mean stellar population age, and star formation rate (SFR), obtained from broad-band photometry. By combining LCDM hierarchical structure formation theory, semi-analytic treatments of baryonic physics, and stellar population synthesis models, we construct model galaxy catalogs from which we select LBGs at redshifts z ~ 3.4, 4.0, and 5.0. The broad-band spectral energy distributions (SEDs) of these model LBGs are then analysed by fitting galaxy template SEDs derived from stellar population synthesis models with smoothly declining SFRs. We compare the statistical properties of LBGs' physical parameters -- such as stellar mass, SFR, and stellar population age -- as derived from the best-fit galaxy templates with the intrinsic values from the semi-analytic model. We find some trends in these distributions: first, when the redshift is known, SED-fitting methods reprodu...

  14. Bayesian methods for model choice and propagation of model uncertainty in groundwater transport modeling

    Science.gov (United States)

    Mendes, B. S.; Draper, D.

    2008-12-01

    The issue of model uncertainty and model choice is central in any groundwater modeling effort [Neuman and Wierenga, 2003]; among the several approaches to the problem we favour using Bayesian statistics because it is a method that integrates in a natural way uncertainties (arising from any source) and experimental data. In this work, we experiment with several Bayesian approaches to model choice, focusing primarily on demonstrating the usefulness of the Reversible Jump Markov Chain Monte Carlo (RJMCMC) simulation method [Green, 1995]; this is an extension of the now-common MCMC methods. Standard MCMC techniques approximate posterior distributions for quantities of interest, often by creating a random walk in parameter space; RJMCMC allows the random walk to take place between parameter spaces with different dimensionalities. This fact allows us to explore state spaces that are associated with different deterministic models for experimental data. Our work is exploratory in nature; we restrict our study to comparing two simple transport models applied to a data set gathered to estimate the breakthrough curve for a tracer compound in groundwater. One model has a mean surface based on a simple advection-dispersion differential equation; the second model's mean surface is also governed by a differential equation, but in two dimensions. We focus on artificial data sets (in which truth is known) to see if model identification is done correctly, but we also address the issues of over- and under-parameterization, and we compare RJMCMC's performance with other traditional methods for model selection and propagation of model uncertainty, including Bayesian model averaging, BIC and DIC. References: Neuman and Wierenga (2003). A Comprehensive Strategy of Hydrogeologic Modeling and Uncertainty Analysis for Nuclear Facilities and Sites. NUREG/CR-6805, Division of Systems Analysis and Regulatory Effectiveness, Office of Nuclear Regulatory Research, U. S. Nuclear Regulatory Commission

  15. Constraining Parameter Uncertainty in Simulations of Water and Heat Dynamics in Seasonally Frozen Soil Using Limited Observed Data

    Directory of Open Access Journals (Sweden)

    Mousong Wu

    2016-02-01

    Full Text Available Water and energy processes in frozen soils are important for better understanding hydrologic processes and water resources management in cold regions. To investigate the water and energy balance in seasonally frozen soils, CoupModel combined with the generalized likelihood uncertainty estimation (GLUE method was used. Simulation work on water and heat processes in frozen soil in northern China during the 2012/2013 winter was conducted. Ensemble simulations through the Monte Carlo sampling method were generated for uncertainty analysis. Behavioral simulations were selected based on combinations of multiple model performance index criteria with respect to simulated soil water and temperature at four depths (5 cm, 15 cm, 25 cm, and 35 cm. Posterior distributions for parameters related to soil hydraulic, radiation processes, and heat transport indicated that uncertainties in both input and model structures could influence model performance in modeling water and heat processes in seasonally frozen soils. Seasonal courses in water and energy partitioning were obvious during the winter. Within the day-cycle, soil evaporation/condensation and energy distributions were well captured and clarified as an important phenomenon in the dynamics of the energy balance system. The combination of the CoupModel simulations with the uncertainty-based calibration method provides a way of understanding the seasonal courses of hydrology and energy processes in cold regions with limited data. Additional measurements may be used to further reduce the uncertainty of regulating factors during the different stages of freezing–thawing.
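A minimal sketch of the GLUE workflow used here: sample parameters by Monte Carlo, score each run with a likelihood measure, and keep only the "behavioral" runs above a threshold. The toy model, observations, and Nash-Sutcliffe threshold below are illustrative stand-ins for the CoupModel setup:

```python
import random, statistics

random.seed(3)

# Synthetic "observations" of soil temperature at five times (toy data).
obs = [2.0, 1.5, 0.5, -0.5, -1.0]

def model(k):
    """Toy model: soil temperatures scaled by an uncertain parameter k
    (a hypothetical conductivity-like factor)."""
    return [k * t for t in [2.2, 1.6, 0.6, -0.4, -0.9]]

def nse(sim, obs):
    """Nash-Sutcliffe efficiency, a common GLUE likelihood measure."""
    mean_obs = statistics.mean(obs)
    num = sum((s - o) ** 2 for s, o in zip(sim, obs))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - num / den

# Monte Carlo sampling of the parameter; keep only behavioral runs.
samples = [random.uniform(0.2, 2.0) for _ in range(5000)]
behavioral = [k for k in samples if nse(model(k), obs) > 0.8]

print(len(behavioral), min(behavioral), max(behavioral))
```

The retained behavioral set approximates the posterior parameter distribution; in the study, the same selection is done jointly over soil water and temperature criteria at four depths.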

  16. Uncertainty and sensitivity analysis: Mathematical model of coupled heat and mass transfer for a contact baking process

    DEFF Research Database (Denmark)

    Feyissa, Aberham Hailu; Gernaey, Krist; Adler-Nissen, Jens

    2012-01-01

    Similar to other processes, the modelling of heat and mass transfer during food processing involves uncertainty in the values of input parameters (heat and mass transfer coefficients, evaporation rate parameters, thermo-physical properties, initial and boundary conditions), which leads to uncertainty in the model predictions. The aim of the current paper is to address this uncertainty challenge in the modelling of food production processes using a combination of uncertainty and sensitivity analysis, where the uncertainty analysis and global sensitivity analysis were applied to a heat and mass transfer model of a contact baking process. The Monte Carlo procedure was applied for propagating uncertainty in the input parameters to uncertainty in the model predictions. Monte Carlo simulations and the least squares method were used in the sensitivity analysis: for each model output, a linear

  17. Impact of uncertainty description on assimilating hydraulic head in the MIKE SHE distributed hydrological model

    Science.gov (United States)

    Zhang, Donghua; Madsen, Henrik; Ridler, Marc E.; Refsgaard, Jens C.; Jensen, Karsten H.

    2015-12-01

    The ensemble Kalman filter (EnKF) is a popular data assimilation (DA) technique that has been extensively used in environmental sciences for combining complementary information from model predictions and observations. One of the major challenges in EnKF applications is the description of model uncertainty. In most hydrological EnKF applications, an ad hoc model uncertainty is defined with the aim of avoiding a collapse of the filter. The present work provides a systematic assessment of model uncertainty in DA applications based on combinations of forcing, model parameters, and state uncertainties. This is tested in a case where groundwater hydraulic heads are assimilated into a distributed and integrated catchment-scale model of the Karup catchment in Denmark. A series of synthetic data assimilation experiments are carried out to analyse the impact of different model uncertainty assumptions on the feasibility and efficiency of the assimilation. The synthetic data used in the assimilation study makes it possible to diagnose model uncertainty assumptions statistically. Besides the model uncertainty, other factors such as observation error, observation locations, and ensemble size are also analysed with respect to performance and sensitivity. Results show that inappropriate definition of model uncertainty can greatly degrade the assimilation performance, and an appropriate combination of different model uncertainty sources is advised.
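The EnKF analysis step at the heart of such assimilation experiments can be sketched with a perturbed-observations update. The state dimension, ensemble size, and error levels below are illustrative, not those of the Karup catchment model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Ensemble of model states (hydraulic heads at 3 nodes, 50 members).
n_state, n_ens = 3, 50
truth = np.array([10.0, 12.0, 11.0])
ensemble = truth[:, None] + rng.normal(0.0, 1.0, (n_state, n_ens))

# One observation of the first node, with error std 0.3 (assumed values).
H = np.array([[1.0, 0.0, 0.0]])
obs_std = 0.3
y = truth[0] + rng.normal(0.0, obs_std)

# EnKF analysis step with perturbed observations.
A = ensemble - ensemble.mean(axis=1, keepdims=True)
P = A @ A.T / (n_ens - 1)                        # ensemble covariance
R = np.array([[obs_std ** 2]])
K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)     # Kalman gain
y_pert = y + rng.normal(0.0, obs_std, n_ens)     # perturbed observations
analysis = ensemble + K @ (y_pert[None, :] - H @ ensemble)

spread_before = ensemble[0].std()
spread_after = analysis[0].std()
print(spread_before, spread_after)
```

The model-uncertainty question studied in the paper enters through how the forecast ensemble (and hence P) is generated: forcing, parameter, and state perturbations each inflate the spread differently, which in turn controls the gain K.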

  18. Response analysis based on smallest interval-set of parameters for structures with uncertainty

    Institute of Scientific and Technical Information of China (English)

    Xiao-jun WANG; Lei WANG; Zhi-ping QIU

    2012-01-01

    An integral analytic process from quantification to propagation based on limited uncertain parameters is investigated to deal with practical engineering problems. A new method by use of the smallest interval-set/hyper-rectangle containing all experimental data is proposed to quantify the parameter uncertainties. With the smallest parameter interval-set, the uncertainty propagation evaluation of the most favorable response and the least favorable response of the structures is studied based on the interval analysis. The relationship between the proposed interval analysis method (IAM) and the classical IAM is discussed. Two numerical examples are presented to demonstrate the feasibility and validity of the proposed method.
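The smallest interval-set idea can be illustrated with a toy monotone response. The measurement data and the linear model y = a*x + b below are hypothetical; for a monotone response the most and least favorable values come from the interval endpoints:

```python
# Hypothetical scattered measurements of two uncertain parameters.
a_data = [1.9, 2.1, 2.05, 1.95]
b_data = [0.4, 0.6, 0.55, 0.45]

# Smallest interval-set: the tightest intervals enclosing all data.
a_lo, a_hi = min(a_data), max(a_data)
b_lo, b_hi = min(b_data), max(b_data)

x = 3.0  # fixed, certain input
# y = a*x + b is monotone increasing in both a and b (for x > 0), so the
# least/most favorable responses are attained at the interval endpoints.
y_lo = a_lo * x + b_lo
y_hi = a_hi * x + b_hi
```

For non-monotone responses the endpoint evaluation no longer suffices, which is where the interval analysis methods discussed in the record come in.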

  19. Minimizing uncertainty of daily rainfall interpolation over large catchments through realistic sampling of anisotropic correlogram parameters

    Science.gov (United States)

    Gyasi-Agyei, Yeboah

    2016-04-01

    It has been established that daily rainfall gauge network density is not adequate for the level of hydrological modelling required of large catchments involving pollutant and sediment transport, such as the catchments draining the coastal regions of Queensland, Australia, to the sensitive Great Barrier Reef. This paper seeks to establish a link between the spatial structure of radar and gauge rainfall for improved interpolation of the limited gauged data over a grid or functional units of catchments in regions with or without radar records. The study area is within Mt. Stapylton weather radar station range, a 128 km square region for calibration and validation, and the Brisbane river catchment for validation only. Two time periods (2000-01-01 to 2008-12-31 and 2009-01-01 to 2015-06-30) were considered: the later period for calibration, when radar records were available, and both time periods for validation without regard to radar information. Anisotropic correlograms of both the gauged and radar data were developed and used to establish the linkage required for areas without radar records. The maximum daily temperature significantly influenced the distributional parameters of the linkage. While the gauged, radar and sampled correlogram parameters reproduced the mean estimates similarly using leave-one-out cross-validation of Ordinary Kriging, the gauged parameters overestimated the standard deviation (SD), which reflects uncertainty, in over 91% of cases compared with the radar or the sampled parameter sets. However, the distribution of the SD generated by the radar and the sampled correlogram parameters could not be distinguished, with a Kolmogorov-Smirnov test p-value of 0.52. For the validation case within the catchment, the percentage overestimation of SD by the gauged parameter sets decreased to 81.2% and 87.1% for the earlier and later time periods, respectively. It is observed that the extreme wet days' parameters and statistics were fairly widely distributed

  20. Improving the precision of lake ecosystem metabolism estimates by identifying predictors of model uncertainty

    Science.gov (United States)

    Rose, Kevin C.; Winslow, Luke A.; Read, Jordan S.; Read, Emily K.; Solomon, Christopher T.; Adrian, Rita; Hanson, Paul C.

    2014-01-01

    Diel changes in dissolved oxygen are often used to estimate gross primary production (GPP) and ecosystem respiration (ER) in aquatic ecosystems. Despite the widespread use of this approach to understand ecosystem metabolism, we are only beginning to understand the degree and underlying causes of uncertainty for metabolism model parameter estimates. Here, we present a novel approach to improve the precision and accuracy of ecosystem metabolism estimates by identifying physical metrics that indicate when metabolism estimates are highly uncertain. Using datasets from seventeen instrumented GLEON (Global Lake Ecological Observatory Network) lakes, we discovered that many physical characteristics correlated with uncertainty, including PAR (photosynthetically active radiation, 400-700 nm), daily variance in Schmidt stability, and wind speed. Low PAR was a consistent predictor of high variance in GPP model parameters, but also corresponded with low ER model parameter variance. We identified a threshold (30% of clear sky PAR) below which GPP parameter variance increased rapidly and was significantly greater in nearly all lakes compared with variance on days with PAR levels above this threshold. The relationship between daily variance in Schmidt stability and GPP model parameter variance depended on trophic status, whereas daily variance in Schmidt stability was consistently positively related to ER model parameter variance. Wind speeds in the range of ~0.8-3 m s–1 were consistent predictors of high variance for both GPP and ER model parameters, with greater uncertainty in eutrophic lakes. Our findings can be used to reduce ecosystem metabolism model parameter uncertainty and identify potential sources of that uncertainty.

  1. Quantifying uncertainty, variability and likelihood for ordinary differential equation models

    LENUS (Irish Health Repository)

    Weisse, Andrea Y

    2010-10-28

    Abstract Background In many applications, ordinary differential equation (ODE) models are subject to uncertainty or variability in initial conditions and parameters. Both, uncertainty and variability can be quantified in terms of a probability density function on the state and parameter space. Results The partial differential equation that describes the evolution of this probability density function has a form that is particularly amenable to application of the well-known method of characteristics. The value of the density at some point in time is directly accessible by the solution of the original ODE extended by a single extra dimension (for the value of the density). This leads to simple methods for studying uncertainty, variability and likelihood, with significant advantages over more traditional Monte Carlo and related approaches especially when studying regions with low probability. Conclusions While such approaches based on the method of characteristics are common practice in other disciplines, their advantages for the study of biological systems have so far remained unrecognized. Several examples illustrate performance and accuracy of the approach and its limitations.
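For a linear ODE the extended-system idea can be checked by hand: the density value is carried along each trajectory as one extra state. This sketch assumes the model dx/dt = -lam*x, for which both the characteristic and the density transport equation d(rho)/dt = -rho * df/dx have closed forms:

```python
import math

lam, t = 0.5, 2.0
x0, rho0 = 1.0, 0.3      # initial state and density value at x0

# Characteristic: solution of the original ODE dx/dt = -lam * x.
x_t = x0 * math.exp(-lam * t)
# Extra dimension: d(rho)/dt = -rho * d(-lam*x)/dx = lam * rho along it,
# so the density grows as trajectories contract toward the origin.
rho_t = rho0 * math.exp(lam * t)

# Probability mass near the trajectory is conserved:
# rho_t * dx_t/dx0 == rho0, with dx_t/dx0 = exp(-lam * t).
mass_ratio = rho_t * math.exp(-lam * t) / rho0
```

In general the extended ODE is integrated numerically per characteristic, which is why low-probability regions are resolved far better than by Monte Carlo sampling.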

  2. Mean-value second-order uncertainty analysis method: application to water quality modelling

    Science.gov (United States)

    Mailhot, Alain; Villeneuve, Jean-Pierre

    Uncertainty analysis in hydrology and water quality modelling is an important issue. Various methods have been proposed to estimate uncertainties on model results based on given uncertainties on model parameters. Among these methods, the mean-value first-order second-moment (MFOSM) method and the advanced mean-value first-order second-moment (AFOSM) method are the most common ones. This paper presents a method based on a second-order approximation of a model output function. The application of this method requires the estimation of first- and second-order derivatives at a mean-value point in the parameter space. Application to a Streeter-Phelps prototype model is presented. Uncertainties on two and six parameters are considered. Exceedance probabilities (EP) of dissolved oxygen concentrations are obtained and compared with EP computed using Monte Carlo, AFOSM and MFOSM methods. These results show that the mean-value second-order method leads to better estimates of EP.
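A one-parameter sketch of the mean-value second-order idea: approximate the output mean with a second-order Taylor term and the variance with the first-order term, using derivatives evaluated at the mean-value point. The response g below is a hypothetical stand-in for a Streeter-Phelps output, not the paper's model:

```python
import math

def g(theta):
    # Hypothetical scalar response (stand-in for a water quality output).
    return math.exp(-theta)

mu, sigma = 1.0, 0.1     # mean and standard deviation of the parameter
h = 1e-4                 # finite-difference step

g1 = (g(mu + h) - g(mu - h)) / (2 * h)                # first derivative
g2 = (g(mu + h) - 2 * g(mu) + g(mu - h)) / h ** 2     # second derivative

mean_y = g(mu) + 0.5 * g2 * sigma ** 2   # second-order mean estimate
var_y = (g1 * sigma) ** 2                # first-order variance estimate
```

For this choice of g the exact mean is exp(-mu + sigma**2 / 2), so the second-order correction captures most of the bias that a first-order (MFOSM-style) estimate misses.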

  3. 'spup' - an R package for uncertainty propagation analysis in spatial environmental modelling

    Science.gov (United States)

    Sawicka, Kasia; Heuvelink, Gerard

    2017-04-01

    Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Currently, advances in uncertainty propagation and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition for a universal applicability and being able to deal with case studies with spatial models and spatial model inputs. Due to the growing popularity and applicability of the open source R programming language we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining the uncertainty propagation starting from input data and model parameters, via the environmental model onto model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo (MC) techniques, as well as several uncertainty visualization functions. Uncertain environmental variables are represented in the package as objects whose attribute values may be uncertain and described by probability distributions. Both numerical and categorical data types are handled. Spatial auto-correlation within an attribute and cross-correlation between attributes is also accommodated for. For uncertainty propagation the package has implemented the MC approach with efficient sampling algorithms, i.e. stratified random sampling and Latin hypercube sampling. The design includes facilitation of parallel computing to speed up MC computation. The MC realizations may be used as an input to the environmental models called from R, or externally. Selected visualization methods that are understandable by non-experts with limited background in

  4. 'spup' - an R package for uncertainty propagation in spatial environmental modelling

    Science.gov (United States)

    Sawicka, Kasia; Heuvelink, Gerard

    2016-04-01

    Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Currently, advances in uncertainty propagation and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition for a universal applicability, including case studies with spatial models and spatial model inputs. Due to the growing popularity and applicability of the open source R programming language we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining the uncertainty propagation starting from input data and model parameters, via the environmental model onto model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo (MC) techniques, as well as several uncertainty visualization functions. Uncertain environmental variables are represented in the package as objects whose attribute values may be uncertain and described by probability distributions. Both numerical and categorical data types are handled. Spatial auto-correlation within an attribute and cross-correlation between attributes is also accommodated for. For uncertainty propagation the package has implemented the MC approach with efficient sampling algorithms, i.e. stratified random sampling and Latin hypercube sampling. The design includes facilitation of parallel computing to speed up MC computation. The MC realizations may be used as an input to the environmental models called from R, or externally. Selected static and interactive visualization methods that are understandable by non-experts with limited background in

  5. Uncertainty Reduction Via Parameter Design of A Fast Digital Integrator for Magnetic Field Measurement

    CERN Document Server

    Arpaia, P; Lucariello, G; Spiezia, G

    2007-01-01

    At the European Centre for Nuclear Research (CERN), within the new Large Hadron Collider (LHC) project, measurements of magnetic flux with an uncertainty of 10 ppm at a few decades of Hz for several minutes are required. With this aim, a new Fast Digital Integrator (FDI) has been developed in cooperation with the University of Sannio, Italy [1]. This paper deals with the final design tuning for achieving the target uncertainty by means of experimental statistical parameter design.

  6. Delineating parameter unidentifiabilities in complex models

    Science.gov (United States)

    Raman, Dhruva V.; Anderson, James; Papachristodoulou, Antonis

    2017-03-01

    Scientists use mathematical modeling as a tool for understanding and predicting the properties of complex physical systems. In highly parametrized models there often exist relationships between parameters over which model predictions are identical, or nearly identical. These are known as structural or practical unidentifiabilities, respectively. They are hard to diagnose and make reliable parameter estimation from data impossible. They furthermore imply the existence of an underlying model simplification. We describe a scalable method for detecting unidentifiabilities, as well as the functional relations defining them, for generic models. This allows for model simplification, and appreciation of which parameters (or functions thereof) cannot be estimated from data. Our algorithm can identify features such as redundant mechanisms and fast time-scale subsystems, as well as the regimes in parameter space over which such approximations are valid. We base our algorithm on a quantification of regional parametric sensitivity that we call 'multiscale sloppiness'. Traditionally, the link between parametric sensitivity and the conditioning of the parameter estimation problem is made locally, through the Fisher information matrix. This is valid in the regime of infinitesimal measurement uncertainty. We demonstrate the duality between multiscale sloppiness and the geometry of confidence regions surrounding parameter estimates made where measurement uncertainty is non-negligible. Further theoretical relationships are provided linking multiscale sloppiness to the likelihood-ratio test. From this, we show that a local sensitivity analysis (as typically done) is insufficient for determining the reliability of parameter estimation, even with simple (non)linear systems. Our algorithm can provide a tractable alternative. We finally apply our methods to a large-scale, benchmark systems biology model of nuclear factor (NF)-κB, uncovering unidentifiabilities.

  7. Using CV-GLUE procedure in analysis of wetland model predictive uncertainty.

    Science.gov (United States)

    Huang, Chun-Wei; Lin, Yu-Pin; Chiang, Li-Chi; Wang, Yung-Chieh

    2014-07-01

    This study develops a procedure related to Generalized Likelihood Uncertainty Estimation (GLUE), called the CV-GLUE procedure, for assessing the predictive uncertainty associated with model structures of varying degrees of complexity. The proposed procedure comprises model calibration, validation, and predictive uncertainty estimation in terms of a characteristic coefficient of variation (characteristic CV). The procedure first performed two-stage Monte Carlo simulations to ensure predictive accuracy by obtaining behavioral parameter sets, and then estimated the CV-values of the model outcomes, which represent the predictive uncertainties for a model structure of interest with its associated behavioral parameter sets. Three commonly used wetland models (the first-order K-C model, the plug flow with dispersion model, and the Wetland Water Quality Model; WWQM) were compared based on data collected from a free water surface constructed wetland with paddy cultivation in Taipei, Taiwan. The results show that the first-order K-C model, which is simpler than the other two models, has greater predictive uncertainty. This finding shows that predictive uncertainty does not necessarily increase with the complexity of the model structure, because in this case the more simplistic representation (first-order K-C model) of reality results in a higher uncertainty in the prediction made by the model. The CV-GLUE procedure is suggested to be a useful tool not only for designing constructed wetlands but also for other aspects of environmental management.

  8. Parameter uncertainties in the design and optimization of cantilever piezoelectric energy harvesters

    Science.gov (United States)

    Franco, V. R.; Varoto, P. S.

    2017-09-01

    A crucial issue in piezoelectric energy harvesting is the efficiency of the mechanical to electrical conversion process. Several techniques have been investigated in order to obtain a set of optimum design parameters that will lead to the best performance of the harvester in terms of electrical power generation. Once an optimum design is reached it is also important to consider uncertainties in the selected parameters that in turn can lead to loss of performance in the energy conversion process. The main goal of this paper is to perform a comprehensive discussion of the effects of multi-parameter aleatory uncertainties on the performance and design optimization of a given energy harvesting system. For that, a typical energy harvester consisting of a cantilever beam carrying a tip mass and partially covered by piezoelectric layers on top and bottom surfaces is considered. A distributed parameter electromechanical modal of the harvesting system is formulated and validated through experimental tests. First, the SQP (Sequential Quadratic Planning) optimization is employed to obtain an optimum set of parameters that will lead to best performance of the harvester. Second, once the optimum harvester configuration is found random perturbations are introduced in the key parameters and Monte Carlo simulations are performed to investigate how these uncertainties propagate and affect the performance of the device studied. Numerically simulated results indicate that small variations in some design parameters can cause a significant variation in the output electrical power, what strongly suggests that uncertainties must be accounted for in the design of beam energy harvesting systems.

  9. A robust inverse approach for estimating the magnetic material properties of an electromagnetic device with minimum influence of the uncertainty in the geometrical parameters

    OpenAIRE

    Mohamed Abouelyazied Abdallh, Ahmed; Crevecoeur, Guillaume; Dupré, Luc

    2011-01-01

    The magnetic properties of the magnetic circuit of an electromagnetic device (EMD) can be identified by solving an inverse problem, where sets of measurements are properly interpreted using a forward numerical model of the device. However, the uncertainties of the geometrical parameter values in the forward model result in recovery errors in the reconstructed material parameter values. This paper proposes a novel inverse approach technique, in which the propagations of the uncertainties in th...

  10. Good modeling practice for PAT applications: propagation of input uncertainty and sensitivity analysis.

    Science.gov (United States)

    Sin, Gürkan; Gernaey, Krist V; Lantz, Anna Eliasson

    2009-01-01

    The uncertainty and sensitivity analysis are evaluated for their usefulness as part of the model-building within Process Analytical Technology applications. A mechanistic model describing a batch cultivation of Streptomyces coelicolor for antibiotic production was used as case study. The input uncertainty resulting from assumptions of the model was propagated using the Monte Carlo procedure to estimate the output uncertainty. The results showed that significant uncertainty exists in the model outputs. Moreover the uncertainty in the biomass, glucose, ammonium and base-consumption were found low compared to the large uncertainty observed in the antibiotic and off-gas CO(2) predictions. The output uncertainty was observed to be lower during the exponential growth phase, while higher in the stationary and death phases - meaning the model describes some periods better than others. To understand which input parameters are responsible for the output uncertainty, three sensitivity methods (Standardized Regression Coefficients, Morris and differential analysis) were evaluated and compared. The results from these methods were mostly in agreement with each other and revealed that only few parameters (about 10) out of a total 56 were mainly responsible for the output uncertainty. Among these significant parameters, one finds parameters related to fermentation characteristics such as biomass metabolism, chemical equilibria and mass-transfer. Overall the uncertainty and sensitivity analysis are found promising for helping to build reliable mechanistic models and to interpret the model outputs properly. These tools make part of good modeling practice, which can contribute to successful PAT applications for increased process understanding, operation and control purposes.
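Of the three sensitivity methods compared above, the Standardized Regression Coefficients (SRC) method is the simplest to sketch: regress Monte Carlo model outputs on each input and rescale the slope by the ratio of input to output standard deviations. The toy model y = 5a + b below, with independent standard-normal inputs, is an assumption for illustration only:

```python
import random

random.seed(2)

# Monte Carlo sample of two independent inputs and a toy model y = 5a + b.
n = 500
a = [random.gauss(0.0, 1.0) for _ in range(n)]
b = [random.gauss(0.0, 1.0) for _ in range(n)]
y = [5.0 * ai + bi for ai, bi in zip(a, b)]

def sd(v):
    m = sum(v) / len(v)
    return (sum((x - m) ** 2 for x in v) / (len(v) - 1)) ** 0.5

def slope(x, yv):
    mx, my = sum(x) / len(x), sum(yv) / len(yv)
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, yv))
    return num / sum((xi - mx) ** 2 for xi in x)

# SRC: regression slope rescaled to a dimensionless sensitivity measure.
src_a = slope(a, y) * sd(a) / sd(y)
src_b = slope(b, y) * sd(b) / sd(y)
```

The larger SRC for a correctly flags it as the dominant contributor to output uncertainty, mirroring how the study singled out roughly 10 of 56 parameters.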

  11. Robust stability analysis of singular linear system with delay and parameter uncertainty

    Institute of Scientific and Technical Information of China (English)

    Renxin ZHONG; Zhi YANG

    2005-01-01

    This paper deals with the problem of robust stability for continuous-time singular systems with state delay and parameter uncertainty. The uncertain singular systems with delay considered in this paper are assumed to be regular and impulse free. By decomposing the systems into slow and fast subsystems, a robust delay-dependent asymptotic stability criterion based on a linear matrix inequality is proposed, which is derived by using Lyapunov-Krasovskii functionals; neither model transformation nor bounding for cross terms is required in the derivation of our delay-dependent result. The robust delay-dependent stability criterion proposed in this paper is a sufficient condition. Finally, numerical examples and Matlab simulation are provided to illustrate the effectiveness of the proposed method.

  12. A framework for modeling uncertainty in regional climate change

    Science.gov (United States)

    In this study, we present a new modeling framework and a large ensemble of climate projections to investigate the uncertainty in regional climate change over the United States associated with four dimensions of uncertainty. The sources of uncertainty considered in this framework ...

  13. Uncertainty propagation in urban hydrology water quality modelling

    NARCIS (Netherlands)

    Torres Matallana, Arturo; Leopold, U.; Heuvelink, G.B.M.

    2016-01-01

    Uncertainty is often ignored in urban hydrology modelling. Engineering practice typically ignores uncertainties and uncertainty propagation. This can have large impacts, such as the wrong dimensioning of urban drainage systems and the inaccurate estimation of pollution in the environment caused by c

  14. Hydrological model uncertainty due to spatial evapotranspiration estimation methods

    Science.gov (United States)

    Yu, Xuan; Lamačová, Anna; Duffy, Christopher; Krám, Pavel; Hruška, Jakub

    2016-05-01

    Evapotranspiration (ET) continues to be a difficult process to estimate in seasonal and long-term water balances in catchment models. Approaches to estimate ET typically use vegetation parameters (e.g., leaf area index [LAI], interception capacity) obtained from field observation, remote sensing data, national or global land cover products, and/or simulated by ecosystem models. In this study we attempt to quantify the uncertainty that spatial evapotranspiration estimation introduces into hydrological simulations when the age of the forest is not precisely known. The Penn State Integrated Hydrologic Model (PIHM) was implemented for the Lysina headwater catchment, located at 50°03′N, 12°40′E in the western part of the Czech Republic. The spatial forest patterns were digitized from forest age maps made available by the Czech Forest Administration. Two ET methods were implemented in the catchment model: the Biome-BGC forest growth sub-model (1-way coupled to PIHM) and the fixed-seasonal LAI method. From these two approaches simulation scenarios were developed. We combined the estimated spatial forest age maps and two ET estimation methods to drive PIHM. A set of spatial hydrologic regime and streamflow regime indices were calculated from the modeling results for each method. Intercomparison of the hydrological responses to the spatial vegetation patterns suggested considerable variation in soil moisture and recharge and a small uncertainty in the groundwater table elevation and streamflow. The hydrologic modeling with ET estimated by Biome-BGC generated less uncertainty due to the plant physiology-based method. The implication of this research is that overall hydrologic variability induced by uncertain management practices was reduced by implementing vegetation models in the catchment models.

  15. A Monte Carlo Uncertainty Analysis of Ozone Trend Predictions in a Two Dimensional Model. Revision

    Science.gov (United States)

    Considine, D. B.; Stolarski, R. S.; Hollandsworth, S. M.; Jackman, C. H.; Fleming, E. L.

    1998-01-01

    We use Monte Carlo analysis to estimate the uncertainty in predictions of total O3 trends between 1979 and 1995 made by the Goddard Space Flight Center (GSFC) two-dimensional (2D) model of stratospheric photochemistry and dynamics. The uncertainty is caused by gas-phase chemical reaction rates, photolysis coefficients, and heterogeneous reaction parameters which are model inputs. The uncertainty represents a lower bound to the total model uncertainty assuming the input parameter uncertainties are characterized correctly. Each of the Monte Carlo runs was initialized in 1970 and integrated for 26 model years through the end of 1995. This was repeated 419 times using input parameter sets generated by Latin Hypercube Sampling. The standard deviation (σ) of the Monte Carlo ensemble of total O3 trend predictions is used to quantify the model uncertainty. The 34% difference between the model trend in globally and annually averaged total O3 using nominal inputs and atmospheric trends calculated from Nimbus 7 and Meteor 3 total ozone mapping spectrometer (TOMS) version 7 data is less than the 46% calculated 1σ model uncertainty, so there is no significant difference between the modeled and observed trends. In the northern hemisphere midlatitude spring the modeled and observed total O3 trends differ by more than 1σ but less than 2σ, which we refer to as marginal significance. We perform a multiple linear regression analysis of the runs which suggests that only a few of the model reactions contribute significantly to the variance in the model predictions. The lack of significance in these comparisons suggests that they are of questionable use as guides for continuing model development. Large model/measurement differences which are many multiples of the input parameter uncertainty are seen in the meridional gradients of the trend and the peak-to-peak variations in the trends over an annual cycle. These discrepancies unambiguously indicate model formulation

  16. Effect of Baseflow Separation on Uncertainty of Hydrological Modeling in the Xinanjiang Model

    Directory of Open Access Journals (Sweden)

    Kairong Lin

    2014-01-01

    Full Text Available Based on the idea of inputting more available useful information for evaluation to gain less uncertainty, this study focuses on how well the uncertainty can be reduced by considering the baseflow estimation information obtained from the smoothed minima method (SMM). The Xinanjiang model and the generalized likelihood uncertainty estimation (GLUE) method with the shuffled complex evolution Metropolis (SCEM-UA) sampling algorithm were used for hydrological modeling and uncertainty analysis, respectively. The Jiangkou basin, located in the upper reaches of the Hanjiang River, was selected as the case study. It was found that the number and standard deviation of behavioral parameter sets both decreased when the threshold value for the baseflow efficiency index increased, and the high Nash-Sutcliffe efficiency coefficients corresponded well with the high baseflow efficiency coefficients. The results also showed that the uncertainty interval width decreased significantly, while the containing ratio did not decrease much and the simulated runoff with the behavioral parameter sets fitted the observed runoff better, when the threshold for the baseflow efficiency index was taken into consideration. These results implied that using the baseflow estimation information can reduce the uncertainty in hydrological modeling to some degree and give more reasonable prediction bounds.
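The behavioral-thresholding step of GLUE can be sketched independently of the Xinanjiang model. The one-parameter "runoff model" and the observation series below are invented; only the Nash-Sutcliffe screening logic and the derivation of prediction bounds from behavioral sets are the point:

```python
import random

random.seed(3)

obs = [2.0, 3.0, 5.0, 4.0, 3.0]   # invented "observed runoff" series

def simulate(k):
    # One-parameter toy runoff model: perfect when k == 1.
    return [k * o for o in obs]

def nse(sim, o):
    # Nash-Sutcliffe efficiency of a simulation against observations.
    mo = sum(o) / len(o)
    num = sum((s - oi) ** 2 for s, oi in zip(sim, o))
    den = sum((oi - mo) ** 2 for oi in o)
    return 1.0 - num / den

# Sample the parameter and keep only "behavioral" sets (NSE > 0.8).
params = [random.uniform(0.5, 1.5) for _ in range(1000)]
behavioral = [k for k in params if nse(simulate(k), obs) > 0.8]

# Crude prediction bounds from the behavioral simulations.
lower = min(min(simulate(k)) for k in behavioral)
upper = max(max(simulate(k)) for k in behavioral)
```

Adding a second behavioral criterion (e.g. a baseflow efficiency index, as in the record above) further shrinks the behavioral set and hence the prediction bounds.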

  17. Bayesian methods for model uncertainty analysis with application to future sea level rise

    Energy Technology Data Exchange (ETDEWEB)

    Patwardhan, A.; Small, M.J. (Carnegie Mellon Univ., Pittsburgh, PA (United States))

    1992-12-01

    This paper addresses the use of data for identifying and characterizing uncertainties in model parameters and predictions. The Bayesian Monte Carlo method is formally presented and elaborated, and applied to the analysis of the uncertainty in a predictive model for global mean sea level change. The method uses observations of output variables, made with an assumed error structure, to determine a posterior distribution of model outputs. This is used to derive a posterior distribution for the model parameters. Results demonstrate the resolution of the uncertainty that is obtained as a result of the Bayesian analysis and also indicate the key contributors to the uncertainty in the sea level rise model. While the technique is illustrated with a simple, preliminary model, the analysis provides an iterative framework for model refinement. The methodology developed in this paper provides a mechanism for the incorporation of ongoing data collection and research in decision-making for problems involving uncertain environmental change.
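The Bayesian Monte Carlo mechanics reduce to importance-weighting prior parameter samples by the likelihood of the observations. The sea-level numbers below are illustrative assumptions, not values from the paper:

```python
import math
import random

random.seed(4)

# Prior samples of a sea-level-rise rate [mm/yr] (uniform prior, invented).
prior = [random.uniform(0.0, 6.0) for _ in range(5000)]
obs, obs_sd = 3.0, 0.5            # one observed rate with known error

def likelihood(r):
    # Gaussian observation-error model (normalization constant cancels).
    return math.exp(-0.5 * ((r - obs) / obs_sd) ** 2)

# Weight each prior sample by its likelihood -> posterior summaries.
w = [likelihood(r) for r in prior]
wsum = sum(w)
post_mean = sum(r * wi for r, wi in zip(prior, w)) / wsum
post_var = sum(((r - post_mean) ** 2) * wi for r, wi in zip(prior, w)) / wsum
```

The posterior variance is much smaller than the prior's, which is the "resolution of the uncertainty" the record describes; new observations simply update the weights, giving the iterative refinement framework.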

  18. Forward propagation of parametric uncertainties through models of NDE inspection scenarios

    Science.gov (United States)

    Cherry, Matthew; Sabbagh, Harold; Aldrin, John; Knopp, Jeremy; Pilchak, Adam

    2015-03-01

    Forward uncertainty propagation has been a topic of interest to NDE researchers for several years. To this point, the purpose has been to gain an understanding of the uncertainties that can be seen in signals from NDE sensors given uncertainties in the geometric and material parameters of the problem. However, a complex analysis of an inspection scenario with high variability has not been performed. Furthermore, these methods have not seen direct practical application in the areas of model assisted probability of detection or inverse problems. In this paper, uncertainty due to spatial heterogeneity in material systems that undergo NDE inspection will be discussed. Propagation of this uncertainty through forward models of inspection scenarios will be outlined and the mechanisms for representing the spatial heterogeneity will be explained in detail. Examples will be provided that illustrate the effect of high variability in uncertainty propagation in the context of forward modeling.

  19. Uncertainty in a spatial evacuation model

    Science.gov (United States)

    Mohd Ibrahim, Azhar; Venkat, Ibrahim; Wilde, Philippe De

    2017-08-01

    Pedestrian movements in crowd motion can be perceived in terms of agents who basically exhibit patient or impatient behavior. We model crowd motion subject to exit congestion under uncertainty conditions in a continuous space and compare the proposed model via simulations with the classical social force model. During a typical emergency evacuation scenario, agents might not be able to perceive with certainty the strategies of opponents (other agents) owing to the dynamic changes entailed by the neighborhood of opponents. In such uncertain scenarios, agents will try to update their strategy based on their own rules or their intrinsic behavior. We study risk seeking, risk averse and risk neutral behaviors of such agents via certain game theory notions. We found that risk averse agents tend to achieve faster evacuation time whenever the time delay in conflicts appears to be longer. The results of our simulations also comply with previous work and conform to the fact that evacuation time of agents becomes shorter once mutual cooperation among agents is achieved. Although the impatient strategy appears to be the rational strategy that might lead to faster evacuation times, our study scientifically shows that the more the agents are impatient, the slower is the egress time.

  20. Incorporating Fuzzy Systems Modeling and Possibility Theory in Hydrogeological Uncertainty Analysis

    Science.gov (United States)

    Faybishenko, B.

    2008-12-01

Hydrogeological predictions are subject to numerous uncertainties, including the development of conceptual, mathematical, and numerical models, as well as determination of their parameters. Stochastic simulations of hydrogeological systems and the associated uncertainty analysis are usually based on the assumption that the data characterizing spatial and temporal variations of hydrogeological processes are random, and the output uncertainty is quantified using a probability distribution. However, hydrogeological systems are often characterized by imprecise, vague, inconsistent, incomplete or subjective information. One of the modern approaches to modeling and uncertainty quantification of such systems is based on using a combination of statistical and fuzzy-logic uncertainty analyses. The aims of this presentation are to: (1) present evidence of fuzziness in developing conceptual hydrogeological models, and (2) give examples of the integration of the statistical and fuzzy-logic analyses in modeling and assessing both aleatoric uncertainties (e.g., caused by vagueness in assessing the subsurface system heterogeneities of fractured-porous media) and epistemic uncertainties (e.g., caused by the selection of different simulation models) involved in hydrogeological modeling. The author will discuss several case studies illustrating the application of fuzzy modeling for assessing the water balance and water travel time in unsaturated-saturated media. These examples will include the evaluation of associated uncertainties using the main concepts of possibility theory, a comparison between the uncertainty evaluation using probability and possibility theories, and a transformation of probabilities into possibility distributions (and vice versa) for modeling hydrogeological processes.
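As a concrete illustration of moving between the two formalisms, the optimal probability-to-possibility transformation of Dubois and Prade can be sketched in a few lines (the probability values are invented):

```python
import numpy as np

# discrete probability distribution, sorted in descending order
p = np.array([0.5, 0.3, 0.15, 0.05])

# Dubois-Prade transformation: the possibility of outcome i is the
# total probability of all outcomes no more probable than it
pi = np.array([p[i:].sum() for i in range(len(p))])
```

The resulting possibility distribution dominates the probabilities (pi >= p) and is normalized (pi[0] = 1), which are the defining consistency conditions of the transformation.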

  1. Identification and communication of uncertainties of phenomenological models in PSA

    Energy Technology Data Exchange (ETDEWEB)

    Pulkkinen, U.; Simola, K. [VTT Automation (Finland)

    2001-11-01

This report aims at presenting a view on uncertainty analysis of phenomenological models, with an emphasis on the identification and documentation of the various types of uncertainties and assumptions in the modelling of the phenomena. In an uncertainty analysis, it is essential to include and document all unclear issues in order to obtain maximal coverage of unresolved issues. This holds regardless of the nature or type of the issues. The classification of uncertainties is needed in the decomposition of the problem, and it helps in the identification of means for uncertainty reduction. Further, enhanced documentation serves to evaluate the applicability of the results to various risk-informed applications. (au)

  2. Uncertainty in surface water flood risk modelling

    Science.gov (United States)

    Butler, J. B.; Martin, D. N.; Roberts, E.; Domuah, R.

    2009-04-01

uniform flow formulae (Manning's Equation) to direct flow over the model domain, sourcing water from the channel or sea so as to provide a detailed representation of river and coastal flood risk. The initial development step was to include spatially-distributed rainfall as a new source term within the model domain. This required optimisation to improve computational efficiency, given the ubiquity of 'wet' cells early on in the simulation. Collaboration with UK water companies has provided detailed drainage information, and from this a simplified representation of the drainage system has been included in the model via the inclusion of sinks and sources of water from the drainage network. This approach has clear advantages relative to a fully coupled method both in terms of reduced input data requirements and computational overhead. Further, given the difficulties associated with obtaining drainage information over large areas, tests were conducted to evaluate uncertainties associated with excluding drainage information and the impact that this has upon flood model predictions. This information can be used, for example, to inform insurance underwriting strategies and loss estimation as well as for emergency response and planning purposes. The Flowroute surface-water flood risk platform enables efficient mapping of areas sensitive to flooding from high-intensity rainfall events due to topography and drainage infrastructure. As such, the technology has widespread potential for use as a risk mapping tool by the UK Environment Agency, European Member States, water authorities, local governments and the insurance industry. Keywords: Surface water flooding, Model Uncertainty, Insurance Underwriting, Flood inundation modelling, Risk mapping.
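For reference, the uniform-flow relation mentioned above (Manning's equation) computes velocity from channel roughness, hydraulic radius and slope. The parameter values below are arbitrary illustrations:

```python
# Manning's equation for open-channel flow velocity (SI units):
#   v = (1/n) * R^(2/3) * S^(1/2)
n = 0.03      # Manning roughness coefficient (natural channel, assumed)
R = 2.0       # hydraulic radius in metres (assumed)
S = 0.001     # channel slope (assumed)

v = (1.0 / n) * R ** (2.0 / 3.0) * S ** 0.5
print(f"flow velocity: {v:.2f} m/s")
```

Multiplying by the flow cross-section area gives discharge, which is how such formulae route water across a flood-model domain.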

  3. Numerical Modelling of Structures with Uncertainties

    Directory of Open Access Journals (Sweden)

    Kahsin Maciej

    2017-04-01

Full Text Available The nature of environmental interactions, as well as the large dimensions and complex structure of marine offshore objects, make the design, construction and operation of these objects a great challenge. This is the reason why a vast majority of investment cases of this type include structural analysis, performed using scaled laboratory models and complemented by extended computer simulations. The present paper focuses on FEM modelling of an offshore wind turbine supporting structure. The problem is studied using modal analysis, sensitivity analysis, and the design of experiment (DOE) and response surface model (RSM) methods. The results of the modal-analysis-based simulations were used for assessing the quality of the FEM model against the data measured during the experimental modal analysis of the scaled laboratory model for different support conditions. The sensitivity analysis, in turn, provided opportunities for assessing the effect of individual FEM model parameters on the dynamic response of the examined supporting structure. The DOE and RSM methods made it possible to determine the effect of model parameter changes on the supporting structure response.

  4. Quantification of the impact of precipitation spatial distribution uncertainty on predictive uncertainty of a snowmelt runoff model

    Science.gov (United States)

    Jacquin, A. P.

    2012-04-01

This study is intended to quantify the impact of uncertainty about precipitation spatial distribution on the predictive uncertainty of a snowmelt runoff model. This problem is especially relevant in mountain catchments with a sparse precipitation observation network and relatively short precipitation records. The model analysed is a conceptual watershed model operating at a monthly time step. The model divides the catchment into five elevation zones, where the fifth zone corresponds to the catchment's glaciers. Precipitation amounts at each elevation zone i are estimated as the product between observed precipitation at a station and a precipitation factor FPi. If other precipitation data are not available, these precipitation factors must be adjusted during the calibration process and are thus seen as parameters of the model. In the case of the fifth zone, glaciers are seen as an inexhaustible source of water that melts when the snow cover is depleted. The catchment case study is the Aconcagua River at Chacabuquito, located in the Andean region of Central Chile. The model's predictive uncertainty is measured in terms of the output variance of the mean squared error of the Box-Cox transformed discharge, the relative volumetric error, and the weighted average of snow water equivalent in the elevation zones at the end of the simulation period. Sobol's variance decomposition (SVD) method is used for assessing the impact of precipitation spatial distribution, represented by the precipitation factors FPi, on the model's predictive uncertainty. In the SVD method, the first-order effect of a parameter (or group of parameters) indicates the fraction of predictive uncertainty that could be reduced if the true value of this parameter (or group) was known. Similarly, the total effect of a parameter (or group) measures the fraction of predictive uncertainty that would remain if the true value of this parameter (or group) was unknown, but all the remaining model parameters could be fixed.
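The first-order Sobol' indices described here can be estimated with a plain Monte Carlo pick-freeze scheme. The sketch below uses a toy two-parameter linear model (stand-ins for two precipitation factors), not the actual watershed model:

```python
import numpy as np

rng = np.random.default_rng(1)

def model(x):
    # toy stand-in for the runoff model: output linear in two "FP" factors
    return 3.0 * x[:, 0] + 1.0 * x[:, 1]

n = 100_000
A = rng.uniform(0, 1, size=(n, 2))   # two independent sample matrices
B = rng.uniform(0, 1, size=(n, 2))
fA, fB = model(A), model(B)
var = np.var(np.concatenate([fA, fB]))

S = []
for i in range(2):
    ABi = A.copy()
    ABi[:, i] = B[:, i]              # pick column i from B, freeze the rest
    # first-order index: fraction of output variance removed if x_i were known
    S.append(np.mean(fB * (model(ABi) - fA)) / var)
```

For this linear model the exact indices are 9/10 and 1/10, so the estimates can be checked directly.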

  5. Addressing Conceptual Model Uncertainty in the Evaluation of Model Prediction Errors

    Science.gov (United States)

    Carrera, J.; Pool, M.

    2014-12-01

Model predictions are uncertain because of errors in model parameters, future forcing terms, and model concepts. The latter remain the largest and most difficult to assess source of uncertainty in long-term model predictions. We first review existing methods to evaluate conceptual model uncertainty. We argue that they are highly sensitive to the ingenuity of the modeler, in the sense that they rely on the modeler's ability to propose alternative model concepts. Worse, we find that the standard practice of stochastic methods leads to poor, potentially biased and often too optimistic, estimation of actual model errors. This is bad news because stochastic methods are purported to properly represent uncertainty. We contend that the problem does not lie in the stochastic approach itself, but in the way it is applied. Specifically, stochastic inversion methodologies, which demand quantitative information, tend to ignore geological understanding, which is conceptually rich. We illustrate some of these problems with the application to the Mar del Plata aquifer, where extensive data are available for nearly a century. Geologically based models, where spatial variability is handled through zonation, yield calibration fits similar to geostatistically based models, but much better predictions. In fact, the appearance of the stochastic T fields is similar to the geologically based models only in areas with a high density of data. We take this finding to illustrate the ability of stochastic models to accommodate many data, but also, ironically, their inability to address conceptual model uncertainty. In fact, stochastic model realizations tend to be too close to the "most likely" one (i.e., they do not really realize the full conceptual uncertainty). The second part of the presentation is devoted to arguing that acknowledging model uncertainty may lead to qualitatively different decisions than just working with "most likely" model predictions. Therefore, efforts should concentrate on

  6. Effects of Cracking Test Conditions on Estimation Uncertainty for Weibull Parameters Considering Time-Dependent Censoring Interval

    Directory of Open Access Journals (Sweden)

    Jae Phil Park

    2016-12-01

Full Text Available It is extremely difficult to predict the initiation time of cracking due to the large time spread in most cracking experiments. Thus, probabilistic models, such as the Weibull distribution, are usually employed to model the initiation time of cracking, and the parameters of the Weibull distribution are estimated from data collected in a cracking test. However, although the development of a reliable cracking model under ideal experimental conditions (e.g., a large number of specimens and narrow censoring intervals) could be achieved in principle, it is not straightforward to quantitatively assess the effects of the experimental conditions on model estimation uncertainty. The present study investigated the effects of key experimental conditions, including the time-dependent effect of the censoring interval length, on the estimation uncertainties of the Weibull parameters through Monte Carlo simulations. The simulation results provided quantified estimation uncertainties of the Weibull parameters under various cracking test conditions. Hence, it is expected that the results of this study can offer some insight for experimenters developing a probabilistic crack initiation model.
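The kind of Monte Carlo study described can be sketched as follows: repeatedly draw small samples of Weibull-distributed crack initiation times, refit the distribution, and inspect the spread of the estimates. The sample size and parameter values are arbitrary, and censoring is omitted for brevity:

```python
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(2)
true_shape, true_scale = 2.0, 100.0   # hypothetical Weibull parameters

est_shapes = []
for _ in range(200):
    # simulate one cracking test with 30 specimens
    t = weibull_min.rvs(true_shape, scale=true_scale, size=30, random_state=rng)
    shape, loc, scale = weibull_min.fit(t, floc=0)  # location fixed at zero
    est_shapes.append(shape)

print(f"shape estimate: {np.mean(est_shapes):.2f} +/- {np.std(est_shapes):.2f}")
```

With only 30 specimens per test, the maximum-likelihood shape estimate scatters noticeably and is slightly biased high, which is exactly the kind of estimation uncertainty the study quantifies.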

  7. Uncertainty in population growth rates: determining confidence intervals from point estimates of parameters.

    Directory of Open Access Journals (Sweden)

    Eleanor S Devenish Nelson

    Full Text Available BACKGROUND: Demographic models are widely used in conservation and management, and their parameterisation often relies on data collected for other purposes. When underlying data lack clear indications of associated uncertainty, modellers often fail to account for that uncertainty in model outputs, such as estimates of population growth. METHODOLOGY/PRINCIPAL FINDINGS: We applied a likelihood approach to infer uncertainty retrospectively from point estimates of vital rates. Combining this with resampling techniques and projection modelling, we show that confidence intervals for population growth estimates are easy to derive. We used similar techniques to examine the effects of sample size on uncertainty. Our approach is illustrated using data on the red fox, Vulpes vulpes, a predator of ecological and cultural importance, and the most widespread extant terrestrial mammal. We show that uncertainty surrounding estimated population growth rates can be high, even for relatively well-studied populations. Halving that uncertainty typically requires a quadrupling of sampling effort. CONCLUSIONS/SIGNIFICANCE: Our results compel caution when comparing demographic trends between populations without accounting for uncertainty. Our methods will be widely applicable to demographic studies of many species.
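A resampling sketch of the approach: draw vital rates from distributions reflecting their inferred uncertainty, project a population matrix for each draw, and read confidence limits off the distribution of growth rates. The two-stage matrix and all numbers are hypothetical, not the red fox parameterisation:

```python
import numpy as np

rng = np.random.default_rng(3)

# hypothetical point estimates and standard errors of vital rates
s_hat, s_se = 0.6, 0.05   # annual survival
f_hat, f_se = 1.2, 0.1    # per-capita fecundity

lambdas = []
for _ in range(10_000):
    s = rng.normal(s_hat, s_se)
    f = rng.normal(f_hat, f_se)
    L = np.array([[f, f],
                  [s, 0.0]])                         # minimal two-stage projection matrix
    lambdas.append(np.linalg.eigvals(L).real.max())  # dominant eigenvalue = growth rate

lo, hi = np.percentile(lambdas, [2.5, 97.5])
print(f"95% CI for population growth rate: [{lo:.2f}, {hi:.2f}]")
```

Rerunning with halved standard errors illustrates the paper's point that narrowing the interval demands a large increase in sampling effort.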

  8. On how to avoid input and structural uncertainties corrupt the inference of hydrological parameters using a Bayesian framework

    Science.gov (United States)

    Hernández, Mario R.; Francés, Félix

    2015-04-01

One phase of the hydrological model implementation process that contributes significantly to the uncertainty of hydrological predictions is the calibration phase, in which values of the unknown model parameters are tuned by optimizing an objective function. An unsuitable error model (e.g. Standard Least Squares, or SLS) introduces noise into the estimation of the parameters. The main sources of this noise are input errors and hydrological model structural deficiencies. The biased calibrated parameters then cause the model divergence phenomenon, where the error variance of the (spatially and temporally) forecasted flows far exceeds the error variance in the fitting period, and provoke the loss of part or all of the physical meaning of the modeled processes. In other words, they yield a calibrated hydrological model which works well, but not for the right reasons. Besides, an unsuitable error model yields a non-reliable predictive uncertainty assessment. Hence, with the aim of preventing all these undesirable effects, this research focuses on the Bayesian joint inference (BJI) of both the hydrological and error model parameters, considering a general additive (GA) error model that allows for correlation, non-stationarity (in variance and bias) and non-normality of model residuals. As the hydrological model, a conceptual distributed model called TETIS has been used, with a particular split structure of the effective model parameters. Bayesian inference has been performed with the aid of a Markov chain Monte Carlo (MCMC) algorithm called DREAM-ZS. The MCMC algorithm quantifies the uncertainty of the hydrological and error model parameters by sampling the joint posterior probability distribution, conditioned on the observed flows. The BJI methodology is a very powerful and reliable tool, but it must be used correctly; that is, if non-stationarity in error variance and bias is modeled, the Total Laws must be taken into account. The results of this research show that the
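The MCMC machinery can be illustrated with a minimal Metropolis sampler (a far simpler relative of DREAM-ZS) inferring one parameter of a Gaussian error model from synthetic "flows":

```python
import numpy as np

rng = np.random.default_rng(4)
obs = rng.normal(5.0, 1.0, size=50)   # synthetic "observed flows"

def log_post(theta):
    # flat prior and an i.i.d. Gaussian error model with unit variance
    return -0.5 * np.sum((obs - theta) ** 2)

theta, chain = 0.0, []
for _ in range(5_000):
    prop = theta + rng.normal(0.0, 0.5)          # random-walk proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop                             # accept
    chain.append(theta)

post = np.array(chain[1_000:])                   # discard burn-in
print(f"posterior mean {post.mean():.2f}, sd {post.std():.2f}")
```

A joint inference would add the error-model parameters (variance, bias, correlation) to the sampled vector; the structure of the sampler stays the same.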

  9. The climate dependence of the terrestrial carbon cycle, including parameter and structural uncertainties

    Directory of Open Access Journals (Sweden)

    M. J. Smith

    2013-01-01

Full Text Available The feedback between climate and the terrestrial carbon cycle will be a key determinant of the dynamics of the Earth System (the thin layer that contains and supports life) over the coming decades and centuries. However, Earth System Model projections of the terrestrial carbon balance vary widely over these timescales. This is largely due to differences in their terrestrial carbon cycle models. A major goal in biogeosciences is therefore to improve understanding of the terrestrial carbon cycle to enable better constrained projections. Utilising empirical data to constrain and assess component processes in terrestrial carbon cycle models will be essential to achieving this goal. We used a new model construction method to data-constrain all parameters of all component processes within a global terrestrial carbon model, employing as data constraints a collection of 12 empirical data sets characterising global patterns of carbon stocks and flows. Our goals were to assess the climate dependencies inferred for all component processes, assess whether these were consistent with current knowledge and understanding, assess the importance of different data sets and the model structure for inferring those dependencies, assess the predictive accuracy of the model, and ultimately to identify a methodology by which alternative component models could be compared within the same framework in the future. Although formulated as differential equations describing carbon fluxes through plant and soil pools, the model was fitted assuming the carbon pools were in states of dynamic equilibrium (input rates equal output rates). Thus, the parameterised model is of the equilibrium terrestrial carbon cycle. All but 2 of the 12 component processes of the model were inferred to have strong climate dependencies, although it was not possible to data-constrain all parameters, indicating some potentially redundant details. Similar climate dependencies were obtained for most

  10. COMMUNICATING THE PARAMETER UNCERTAINTY IN THE IQWIG EFFICIENCY FRONTIER TO DECISION-MAKERS

    Science.gov (United States)

    Stollenwerk, Björn; Lhachimi, Stefan K; Briggs, Andrew; Fenwick, Elisabeth; Caro, Jaime J; Siebert, Uwe; Danner, Marion; Gerber-Grote, Andreas

    2015-01-01

    The Institute for Quality and Efficiency in Health Care (IQWiG) developed—in a consultation process with an international expert panel—the efficiency frontier (EF) approach to satisfy a range of legal requirements for economic evaluation in Germany's statutory health insurance system. The EF approach is distinctly different from other health economic approaches. Here, we evaluate established tools for assessing and communicating parameter uncertainty in terms of their applicability to the EF approach. Among these are tools that perform the following: (i) graphically display overall uncertainty within the IQWiG EF (scatter plots, confidence bands, and contour plots) and (ii) communicate the uncertainty around the reimbursable price. We found that, within the EF approach, most established plots were not always easy to interpret. Hence, we propose the use of price reimbursement acceptability curves—a modification of the well-known cost-effectiveness acceptability curves. Furthermore, it emerges that the net monetary benefit allows an intuitive interpretation of parameter uncertainty within the EF approach. This research closes a gap for handling uncertainty in the economic evaluation approach of the IQWiG methods when using the EF. However, the precise consequences of uncertainty when determining prices are yet to be defined. © 2014 The Authors. Health Economics published by John Wiley & Sons Ltd. PMID:24590819
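The logic behind acceptability curves, which the proposed price reimbursement acceptability curves adapt, is easy to sketch: for each willingness-to-pay threshold, count the fraction of probabilistic sensitivity analysis draws with positive net monetary benefit. All numbers below are invented:

```python
import numpy as np

rng = np.random.default_rng(5)

# hypothetical probabilistic sensitivity analysis draws
d_cost = rng.normal(1000.0, 300.0, size=5_000)   # incremental cost
d_eff = rng.normal(0.05, 0.02, size=5_000)       # incremental effect (QALYs)

thresholds = np.linspace(0.0, 60_000.0, 61)      # willingness to pay per QALY
# probability that net monetary benefit lambda*effect - cost is positive
ceac = np.array([(lam * d_eff - d_cost > 0).mean() for lam in thresholds])
```

At a threshold equal to the mean cost per unit of effect (20 000 here) the curve passes near 0.5, and it approaches 1 as the threshold grows; a price-reimbursement variant would sweep price rather than willingness to pay.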

  11. Evaluation of thermal-hydraulic parameter uncertainties in a TRIGA research reactor

    Energy Technology Data Exchange (ETDEWEB)

    Mesquita, Amir Z.; Costa, Antonio C.L.; Ladeira, Luiz C.D.; Rezende, Hugo C., E-mail: amir@cdtn.br, E-mail: aclc@cdtn.br, E-mail: lcdl@cdtn.br, E-mail: hcr@cdtn.br [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil); Palma, Daniel A.P., E-mail: dapalma@cnen.gov.br [Comissao Nacional de Energia Nuclear (CNEN), Rio de Janeiro, RJ (Brazil)

    2015-07-01

Experimental studies have been performed in the TRIGA Research Nuclear Reactor of CDTN/CNEN to determine its thermal-hydraulic parameters. Fuel-to-coolant heat transfer patterns must be evaluated as a function of the reactor power in order to assess the thermal-hydraulic performance of the core. The heat generated by nuclear fission in the reactor core is transferred from the fuel elements to the cooling system through the fuel-cladding (gap) and the cladding-to-coolant interfaces. As the reactor core power increases, the heat transfer regime from the fuel cladding to the coolant changes from single-phase natural convection to subcooled nucleate boiling. This paper presents the uncertainty analysis of the results of the thermal-hydraulic experiments performed. The methodology used to evaluate the propagation of uncertainty in the results was based on the pioneering article of Kline and McClintock, with the propagation of uncertainties based on the specification of uncertainties in the various primary measurements. The uncertainty in the thermal-hydraulic parameters of the CDTN TRIGA fuel element is determined, basically, by the uncertainty of the reactor's thermal power. (author)
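The Kline-McClintock method combines the uncertainties of the primary measurements in quadrature, weighted by sensitivity coefficients. A sketch for a thermal power balance P = m_dot * cp * dT, with invented measurement values:

```python
import math

# primary measurements and their uncertainties (hypothetical values)
m_dot, w_m = 0.50, 0.01    # coolant mass flow rate, kg/s
cp = 4186.0                # specific heat of water, J/(kg K), taken as exact
dT, w_dT = 4.0, 0.1        # coolant temperature rise, K

P = m_dot * cp * dT        # thermal power, W

# Kline-McClintock: root-sum-square of (partial derivative * uncertainty)
w_P = math.sqrt((cp * dT * w_m) ** 2 + (m_dot * cp * w_dT) ** 2)
print(f"P = {P:.0f} W +/- {w_P:.0f} W")
```

The relative uncertainty of P here is about 3%, dominated by whichever primary measurement has the largest weighted term, which mirrors the paper's finding that the thermal power uncertainty dominates the results.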

  12. A Bayesian framework for parameter estimation in dynamical models.

    Directory of Open Access Journals (Sweden)

    Flávio Codeço Coelho

    Full Text Available Mathematical models in biology are powerful tools for the study and exploration of complex dynamics. Nevertheless, bringing theoretical results to an agreement with experimental observations involves acknowledging a great deal of uncertainty intrinsic to our theoretical representation of a real system. Proper handling of such uncertainties is key to the successful usage of models to predict experimental or field observations. This problem has been addressed over the years by many tools for model calibration and parameter estimation. In this article we present a general framework for uncertainty analysis and parameter estimation that is designed to handle uncertainties associated with the modeling of dynamic biological systems while remaining agnostic as to the type of model used. We apply the framework to fit an SIR-like influenza transmission model to 7 years of incidence data in three European countries: Belgium, the Netherlands and Portugal.
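A minimal discrete-time SIR-like model of the sort being fitted, with arbitrary parameter values (the paper's actual model and inference framework are more elaborate):

```python
import numpy as np

def sir(beta, gamma, s0, i0, steps, dt=0.1):
    # forward-Euler integration of the SIR equations (population fractions)
    s, i = s0, i0
    traj = []
    for _ in range(steps):
        new_inf = beta * s * i * dt   # S -> I
        new_rec = gamma * i * dt      # I -> R
        s, i = s - new_inf, i + new_inf - new_rec
        traj.append(i)
    return np.array(traj)

# basic reproduction number beta/gamma = 3: a single epidemic wave
traj = sir(beta=0.3, gamma=0.1, s0=0.99, i0=0.01, steps=1000)
```

In a Bayesian fit, beta and gamma would be sampled from their posterior and trajectories like traj compared against the incidence data through a likelihood.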

  13. Modelling of physical properties - databases, uncertainties and predictive power

    DEFF Research Database (Denmark)

    Gani, Rafiqul

Physical and thermodynamic properties in the form of raw data or estimated values for pure compounds and mixtures are important pre-requisites for performing tasks such as process design, simulation and optimization; computer-aided molecular/mixture (product) design; and product-process analysis. … While use of experimentally measured values of the needed properties is desirable in these tasks, the experimental data of the properties of interest may not be available or may not be measurable in many cases. Therefore, property models that are reliable, predictive and easy to use are necessary. … However, which models should be used to provide the reliable estimates of the required properties? And, how much measured data is necessary to regress the model parameters? How to ensure predictive capabilities in the developed models? Also, as it is necessary to know the associated uncertainties …

  14. INTERSTELLAR NEUTRAL HELIUM IN THE HELIOSPHERE FROM IBEX OBSERVATIONS. I. UNCERTAINTIES AND BACKGROUNDS IN THE DATA AND PARAMETER DETERMINATION METHOD

    Energy Technology Data Exchange (ETDEWEB)

    Swaczyna, P.; Bzowski, M.; Kubiak, M. A.; Sokół, J. M. [Space Research Centre of the Polish Academy of Sciences, Warsaw (Poland); Fuselier, S. A.; McComas, D. J.; Schwadron, N. A. [Southwest Research Institute, San Antonio, TX (United States); Heirtzler, D.; Kucharek, H.; Leonard, T. W.; Möbius, E., E-mail: pswaczyna@cbk.waw.pl [Space Science Center and Department of Physics, University of New Hampshire, Durham, NH (United States)

    2015-10-15

This paper is one of three companion papers presenting the results of our in-depth analysis of the interstellar neutral helium (ISN He) observations carried out using IBEX-Lo during the first six Interstellar Boundary Explorer (IBEX) observation seasons. We derive corrections for losses due to the limited throughput of the interface buffer and determine the IBEX spin-axis pointing. We develop an uncertainty system for the data, taking into account the resulting correlations between the data points. This system includes uncertainties due to Poisson statistics, background, spin-axis determination, systematic deviation of the boresight from the prescribed position, correction for the interface buffer losses, and the expected Warm Breeze (WB) signal. Subsequently, we analyze the data from 2009 to examine the role of various components of the uncertainty system. We show that the ISN He flow parameters are in good agreement with the values obtained by the original analysis. We identify the WB as the principal contributor to the global χ² values in previous analyses. Other uncertainties have a much milder role and their contributions are comparable to each other. The application of this uncertainty system reduced the minimum χ² value 4-fold. The obtained χ² value, still exceeding the expected value, suggests that either the uncertainty system may still be incomplete or the adopted physical model lacks a potentially important element, likely an imperfect determination of the WB parameters. The derived corrections and uncertainty system are used in the accompanying paper by Bzowski et al. in an analysis of the data from six seasons.
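The effect of correlated data-point uncertainties on a χ² statistic, which an uncertainty system of this kind must account for, can be sketched with an invented three-point example:

```python
import numpy as np

# model-minus-data residuals and an assumed covariance matrix that
# correlates neighbouring data points
resid = np.array([0.5, -0.3, 0.2])
cov = np.array([[0.04, 0.01, 0.00],
                [0.01, 0.09, 0.02],
                [0.00, 0.02, 0.16]])

chi2 = resid @ np.linalg.solve(cov, resid)       # full covariance
chi2_diag = np.sum(resid**2 / np.diag(cov))      # correlations ignored
print(f"chi2 = {chi2:.2f} (vs {chi2_diag:.2f} ignoring correlations)")
```

Ignoring the off-diagonal terms changes the statistic, which is why correlated uncertainty systems matter when judging whether a fit's χ² exceeds its expected value.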

  15. Differential uncertainty analysis for evaluating the accuracy of S-parameter retrieval methods for electromagnetic properties of metamaterial slabs.

    Science.gov (United States)

    Hasar, Ugur Cem; Barroso, Joaquim J; Sabah, Cumali; Kaya, Yunus; Ertugrul, Mehmet

    2012-12-17

We apply a complete uncertainty analysis, not previously studied in the literature, to investigate the dependence of the retrieved electromagnetic properties of two MM slabs (the first with only split-ring resonators (SRRs) and the second with SRRs and a continuous wire), with single-band and dual-band resonating properties, on the measured/simulated scattering parameters, the slab length, and the operating frequency. Such an analysis is necessary for the selection of a suitable retrieval method, together with the correct examination of exotic properties of MM slabs, especially in their resonance regions. For this analysis, a differential uncertainty model is developed to monitor minute changes in the dependent variables (electromagnetic properties of MM slabs) as functions of the independent variables (scattering (S-) parameters, the slab length, and the operating frequency). Two complementary approaches (the analytical approach and the dispersion model approach), each with different strengths, are utilized to retrieve the electromagnetic properties of various MM slabs, which are needed for the application of the uncertainty analysis. We note the following important results from our investigation. First, uncertainties in the retrieved electromagnetic properties of the analyzed MM slabs drastically increase when values of the electromagnetic properties shrink to zero or near resonance regions where S-parameters exhibit rapid changes. Second, any low or medium loss inside the MM slabs, due to an imperfect dielectric substrate or a finite conductivity of the metals, can decrease these uncertainties near resonance regions because these losses hinder abrupt changes in S-parameters. Finally, we note that precise knowledge of especially the slab length and the operating frequency is a prerequisite for accurate analysis of exotic electromagnetic properties of MM slabs (especially multiband MM slabs) near resonance regions.
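A generic differential uncertainty model of the kind described evaluates sensitivity coefficients numerically and combines them with the input uncertainties. The retrieval relation below is a toy stand-in, not the paper's equations:

```python
import numpy as np

def diff_uncertainty(f, x0, dx):
    # first-order propagation: u = sqrt(sum_i (df/dx_i * dx_i)^2),
    # with derivatives taken by central finite differences
    u2 = 0.0
    for i in range(len(x0)):
        h = np.zeros_like(x0)
        h[i] = 1e-6
        dfdx = (f(x0 + h) - f(x0 - h)) / 2e-6
        u2 += (dfdx * dx[i]) ** 2
    return np.sqrt(u2)

# toy retrieval: normalized impedance from a reflection coefficient magnitude
# (illustrative relation only, not the full retrieval equations)
z = lambda x: (1 + x[0]) / (1 - x[0])
u = diff_uncertainty(z, np.array([0.5]), [0.01])
print(f"impedance uncertainty: {u:.4f}")
```

Because the derivative of this relation blows up as the reflection coefficient approaches 1, the propagated uncertainty grows sharply there, the same mechanism that inflates uncertainties near resonance.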

16. Simulation of corn yields and parameter uncertainty analysis in Hebei and Sichuan, China

    Science.gov (United States)

    Fu, A.; Xue, Y.; Hartman, M. D.; Chandran, A.; Qiu, B.; Liu, Y.

    2016-12-01

Corn is one of the most important agricultural products in China. Research on the impacts of climate change and human activities on corn yields is important for understanding and mitigating the negative effects of environmental factors on corn yields and maintaining stable corn production. Using climatic data from 1948 to 2010, including daily temperature, precipitation, and solar radiation, together with soil properties, observed corn yields, and farmland management information, corn yields in Sichuan and Hebei Provinces of China over the past 63 years were simulated using the Daycent model, and the results were evaluated using root mean square error, bias, simulation efficiency, and standard deviation. The primary climatic factors influencing corn yields were examined, the uncertainties of the climatic factors were analyzed, and the uncertainties of human activity parameters were also studied by changing fertilization levels and cultivation practices. The results showed that: (1) the Daycent model is capable of simulating corn yields in Sichuan and Hebei Provinces of China; observed and simulated corn yields show similar increasing trends with time. (2) The minimum daily temperature is the primary factor influencing corn yields in Sichuan. In Hebei Province, daily temperature, precipitation and wind speed significantly affect corn yields. (3) When the global warming trend was removed from the original data, simulated corn yields were lower than before, decreasing by about 687 kg/hm2 from 1992 to 2010. When the fertilization levels and cultivation practices were increased and decreased by 50% and 75%, respectively, in the Schedule file of the Daycent model, the simulated corn yields increased by 1206 kg/hm2 and 776 kg/hm2, respectively, with the enhancement of the fertilization level and the improvement of cultivation practice. This study provides a scientific basis for selecting a suitable fertilization level and cultivation practice for corn fields in China.
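The evaluation statistics used (root mean square error, bias, and simulation efficiency in the Nash-Sutcliffe sense) are straightforward to compute; the yield numbers below are made up:

```python
import numpy as np

def evaluate(obs, sim):
    # RMSE, bias, and Nash-Sutcliffe-style simulation efficiency
    err = sim - obs
    rmse = np.sqrt(np.mean(err ** 2))
    bias = np.mean(err)
    nse = 1.0 - np.sum(err ** 2) / np.sum((obs - obs.mean()) ** 2)
    return rmse, bias, nse

obs = np.array([5.1, 6.0, 5.5, 6.3, 7.0])   # "observed" yields, t/hm2
sim = np.array([5.0, 6.2, 5.4, 6.5, 6.8])   # "simulated" yields
rmse, bias, nse = evaluate(obs, sim)
```

A simulation efficiency near 1 means the model explains most of the observed variability; values at or below 0 mean it does no better than the observed mean.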

  17. Aerodynamic Modeling with Heterogeneous Data Assimilation and Uncertainty Quantification Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Clear Science Corp. proposes to develop an aerodynamic modeling tool that assimilates data from different sources and facilitates uncertainty quantification. The...

  18. Use of different sampling schemes in machine learning-based prediction of hydrological models' uncertainty

    Science.gov (United States)

    Kayastha, Nagendra; Solomatine, Dimitri; Lal Shrestha, Durga; van Griensven, Ann

    2013-04-01

    In recent years, a lot of attention in the hydrologic literature is given to model parameter uncertainty analysis. The robustness estimation of uncertainty depends on the efficiency of sampling method used to generate the best fit responses (outputs) and on ease of use. This paper aims to investigate: (1) how sampling strategies effect the uncertainty estimations of hydrological models, (2) how to use this information in machine learning predictors of models uncertainty. Sampling of parameters may employ various algorithms. We compared seven different algorithms namely, Monte Carlo (MC) simulation, generalized likelihood uncertainty estimation (GLUE), Markov chain Monte Carlo (MCMC), shuffled complex evolution metropolis algorithm (SCEMUA), differential evolution adaptive metropolis (DREAM), partical swarm optimization (PSO) and adaptive cluster covering (ACCO) [1]. These methods were applied to estimate uncertainty of streamflow simulation using conceptual model HBV and Semi-distributed hydrological model SWAT. Nzoia catchment in West Kenya is considered as the case study. The results are compared and analysed based on the shape of the posterior distribution of parameters, uncertainty results on model outputs. The MLUE method [2] uses results of Monte Carlo sampling (or any other sampling shceme) to build a machine learning (regression) model U able to predict uncertainty (quantiles of pdf) of a hydrological model H outputs. Inputs to these models are specially identified representative variables (past events precipitation and flows). The trained machine learning models are then employed to predict the model output uncertainty which is specific for the new input data. The problem here is that different sampling algorithms result in different data sets used to train such a model U, which leads to several models (and there is no clear evidence which model is the best since there is no basis for comparison). 
A solution could be to form a committee of all models U and
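The GLUE-style sampling compared in the abstract above can be illustrated with a short sketch: uniform Monte Carlo sampling of parameters, an informal likelihood (here Nash-Sutcliffe efficiency), a behavioural acceptance threshold, and quantile bounds on the simulated flows. The `toy_model` reservoir, parameter ranges, threshold, and noise level below are all illustrative assumptions, not the HBV/SWAT setup of the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

def toy_model(params, rain):
    # Toy linear-reservoir stand-in for a conceptual model (not the real HBV):
    # k is the storage outflow coefficient, b a rainfall bias factor.
    k, b = params
    q, s = [], 0.0
    for p in rain:
        s += b * p
        out = k * s
        s -= out
        q.append(out)
    return np.asarray(q)

# Synthetic "observations" from known parameters plus noise
rain = rng.gamma(2.0, 2.0, size=200)
q_obs = toy_model((0.3, 1.0), rain) + rng.normal(0.0, 0.1, size=200)

# GLUE: uniform Monte Carlo sampling over the prior parameter ranges
n = 2000
samples = np.column_stack([rng.uniform(0.05, 0.95, n),   # k
                           rng.uniform(0.5, 1.5, n)])    # b

sims = np.array([toy_model(p, rain) for p in samples])
sse = np.sum((sims - q_obs) ** 2, axis=1)
nse = 1.0 - sse / np.sum((q_obs - q_obs.mean()) ** 2)    # informal likelihood

behavioural = nse > 0.7            # subjective acceptance threshold
# Uncertainty bounds from the behavioural ensemble (full GLUE would
# weight each member by its normalised likelihood before taking quantiles)
q05 = np.quantile(sims[behavioural], 0.05, axis=0)
q95 = np.quantile(sims[behavioural], 0.95, axis=0)
coverage = float(np.mean((q_obs >= q05) & (q_obs <= q95)))
```

The `coverage` statistic (fraction of observations inside the 5-95% band) is one simple way to compare the uncertainty estimates produced by different samplers.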

  19. Impact on Model Uncertainty of Diabatization in Distillation Columns

    DEFF Research Database (Denmark)

    Bisgaard, Thomas; Huusom, Jakob Kjøbsted; Abildskov, Jens

    2014-01-01

    This work provides uncertainty and sensitivity analysis of the design of conventional and heat-integrated distillation columns using Monte Carlo simulations. Selected uncertain parameters are relative volatility, heat of vaporization, the overall heat transfer coefficient, tray hold-up, and adiabat ...

  20. Imprecision and Uncertainty in the UFO Database Model.

    Science.gov (United States)

    Van Gyseghem, Nancy; De Caluwe, Rita

    1998-01-01

    Discusses how imprecision and uncertainty are dealt with in the UFO (Uncertainty and Fuzziness in an Object-oriented) database model. Such information is expressed by means of possibility distributions, and modeled by means of the proposed concept of "role objects." The role objects model uncertain, tentative information about objects, and thus…

  3. Estimating the magnitude of prediction uncertainties for the APLE model

    Science.gov (United States)

    Models are often used to predict phosphorus (P) loss from agricultural fields. While it is commonly recognized that model predictions are inherently uncertain, few studies have addressed prediction uncertainties using P loss models. In this study, we conduct an uncertainty analysis for the Annual P ...

  4. Influence of parameter estimation uncertainty in Kriging: Part 1 - Theoretical Development

    Directory of Open Access Journals (Sweden)

    E. Todini

    2001-01-01

    This paper deals with a theoretical approach to assessing the effects of parameter estimation uncertainty both on Kriging estimates and on their estimated error variance. Although a comprehensive treatment of parameter estimation uncertainty is covered by full Bayesian Kriging at the cost of extensive numerical integration, the proposed approach has a wide field of application, given its relative simplicity. The approach is based upon a truncated Taylor expansion approximation and, within the limits of the proposed approximation, the conventional Kriging estimates are shown to be biased for all variograms, the bias depending upon the second-order derivatives with respect to the parameters times the variance-covariance matrix of the parameter estimates. A new Maximum Likelihood (ML) estimator for semi-variogram parameters in ordinary Kriging, based upon the assumption of a multi-normal distribution of the Kriging cross-validation errors, is introduced as a means of estimating the parameter variance-covariance matrix. Keywords: Kriging, maximum likelihood, parameter estimation, uncertainty

  5. Operationalising uncertainty in data and models for integrated water resources management.

    Science.gov (United States)

    Blind, M W; Refsgaard, J C

    2007-01-01

    Key sources of uncertainty of importance for water resources management are (1) uncertainty in data; (2) uncertainty related to hydrological models (parameter values, model technique, model structure); and (3) uncertainty related to the context and the framing of the decision-making process. The European-funded project 'Harmonised techniques and representative river basin data for assessment and use of uncertainty information in integrated water management (HarmoniRiB)' has resulted in a range of tools and methods to assess such uncertainties, focusing on items (1) and (2). The project also engaged in a number of discussions surrounding uncertainty and risk assessment in support of decision-making in water management. Based on the project's results and experiences, and on the subsequent discussions, a number of conclusions can be drawn on the future needs for successful adoption of uncertainty analysis in decision support. These conclusions range from additional scientific research on specific uncertainties and dedicated guidelines for operational use to capacity building at all levels. The purpose of this paper is to elaborate on these conclusions and to anchor them in the broad objective of making uncertainty and risk assessment an essential and natural part of future decision-making processes.

  6. Calibration under uncertainty for finite element models of masonry monuments

    Energy Technology Data Exchange (ETDEWEB)

    Atamturktur, Sezer; Hemez, Francois; Unal, Cetin

    2010-02-01

    Historical unreinforced masonry buildings often include features such as load bearing unreinforced masonry vaults and their supporting framework of piers, fill, buttresses, and walls. The masonry vaults of such buildings are among the most vulnerable structural components and certainly among the most challenging to analyze. The versatility of finite element (FE) analyses in incorporating various constitutive laws, as well as practically all geometric configurations, has resulted in the widespread use of the FE method for the analysis of complex unreinforced masonry structures over the last three decades. However, an FE model is only as accurate as its input parameters, and there are two fundamental challenges while defining FE model input parameters: (1) material properties and (2) support conditions. The difficulties in defining these two aspects of the FE model arise from the lack of knowledge in the common engineering understanding of masonry behavior. As a result, engineers are unable to define these FE model input parameters with certainty, and, inevitably, uncertainties are introduced to the FE model.

  7. GRS Method for Uncertainties Evaluation of Parameters in a Prospective Fast Reactor

    Science.gov (United States)

    Peregudov, A.; Andrianova, O.; Raskach, K.; Tsibulya, A.

    2014-04-01

    A number of recent studies have been devoted to the uncertainty estimation of reactor calculation parameters by the GRS (Generation Random Sampled) method. This method is based on direct sampling of input data, resulting in the formation of random sets of input parameters that are used for multiple calculations. Once these calculations are performed, statistical processing of the calculation results is carried out to determine the mean value and the variance of each calculation parameter of interest. In our study this method is used to estimate the uncertainty of calculation parameters (keff, power density, dose rate) of a prospective sodium-cooled fast reactor. Neutron transport calculations were performed by the nodal diffusion code TRIGEX and the Monte Carlo code MMK.
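The GRS procedure described above (draw random sets of input data, repeat the calculation, then take statistics of the outputs) can be sketched in a few lines. The `k_eff_model` surrogate, nominal values, and uncertainty levels below are invented for illustration; the actual study used the TRIGEX and MMK codes.

```python
import numpy as np

rng = np.random.default_rng(7)

def k_eff_model(sigma_f, sigma_c, leakage):
    # Hypothetical stand-in for a neutronics calculation: a simple
    # production/loss ratio, NOT the TRIGEX/MMK codes from the abstract.
    return sigma_f / (sigma_c + leakage)

# Nominal input parameters and their assumed relative 1-sigma uncertainties
nominal = {"sigma_f": 1.85, "sigma_c": 1.20, "leakage": 0.62}
rel_sd  = {"sigma_f": 0.02, "sigma_c": 0.03, "leakage": 0.05}

# GRS approach: draw random sets of input data and repeat the calculation
n_runs = 500
draws = {k: rng.normal(v, rel_sd[k] * v, n_runs) for k, v in nominal.items()}
k_eff = k_eff_model(draws["sigma_f"], draws["sigma_c"], draws["leakage"])

# Statistical processing of the multiple calculations
mean = k_eff.mean()
std = k_eff.std(ddof=1)
rel_uncertainty_pct = 100.0 * std / mean
```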

  8. Parameter Estimation for Groundwater Models under Uncertain Irrigation Data.

    Science.gov (United States)

    Demissie, Yonas; Valocchi, Albert; Cai, Ximing; Brozovic, Nicholas; Senay, Gabriel; Gebremichael, Mekonnen

    2015-01-01

    The success of modeling groundwater is strongly influenced by the accuracy of the model parameters that are used to characterize the subsurface system. However, the presence of uncertainty and possibly bias in groundwater model source/sink terms may lead to biased estimates of model parameters and model predictions when the standard regression-based inverse modeling techniques are used. This study first quantifies the levels of bias in groundwater model parameters and predictions due to the presence of errors in irrigation data. Then, a new inverse modeling technique called input uncertainty weighted least-squares (IUWLS) is presented for unbiased estimation of the parameters when pumping and other source/sink data are uncertain. The approach uses the concept of generalized least-squares method with the weight of the objective function depending on the level of pumping uncertainty and iteratively adjusted during the parameter optimization process. We have conducted both analytical and numerical experiments, using irrigation pumping data from the Republican River Basin in Nebraska, to evaluate the performance of ordinary least-squares (OLS) and IUWLS calibration methods under different levels of uncertainty of irrigation data and calibration conditions. The result from the OLS method shows the presence of statistically significant (p irrigation pumping uncertainties during the calibration procedures, the proposed IUWLS is able to minimize the bias effectively without adding significant computational burden to the calibration processes.
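The weighting idea behind IUWLS (give observations affected by large input uncertainty less influence, with weights that depend on the current parameter estimate and are therefore updated iteratively) can be sketched with a toy linear drawdown model. All names, noise levels, and the simple two-parameter model are assumptions for illustration; this is not the published IUWLS algorithm or the Republican River model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic linear "head response": heads = intercept + theta * pumping,
# where the pumping predictor is observed with error.
n = 200
pumping_true = rng.uniform(50, 150, n)
pumping_sd = 0.15 * pumping_true               # 15% pumping uncertainty
pumping_obs = pumping_true + rng.normal(0, pumping_sd)

theta_true = -0.04                              # drawdown per unit pumping
heads = 100.0 + theta_true * pumping_true + rng.normal(0, 0.2, n)

X = np.column_stack([np.ones(n), pumping_obs])

# Ordinary least squares: ignores the pumping uncertainty entirely
theta_ols = np.linalg.lstsq(X, heads, rcond=None)[0]

# IUWLS-flavoured generalized least squares: weight each observation by
# the inverse of its total error variance, sigma_obs^2 + theta^2 * sigma_p^2,
# and iterate because the weights depend on the current slope estimate.
theta = theta_ols.copy()
for _ in range(20):
    w = 1.0 / (0.2 ** 2 + (theta[1] ** 2) * pumping_sd ** 2)
    W = np.diag(w)
    theta = np.linalg.solve(X.T @ W @ X, X.T @ W @ heads)
```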

  10. Using Predictive Uncertainty Analysis to Assess Hydrologic Model Performance for a Watershed in Oregon

    Science.gov (United States)

    Brannan, K. M.; Somor, A.

    2016-12-01

    A variety of statistics are used to assess watershed model performance, but these statistics do not directly answer the question: what is the uncertainty of my prediction? Understanding predictive uncertainty is important when using a watershed model to develop a Total Maximum Daily Load (TMDL). TMDLs are a key component of the US Clean Water Act and specify the amount of a pollutant that can enter a waterbody when the waterbody meets water quality criteria. TMDL developers use watershed models to estimate pollutant loads from nonpoint sources of pollution. We are developing a TMDL for bacteria impairments in a watershed in the Coastal Range of Oregon. We set up an HSPF model of the watershed and used the calibration software PEST to estimate HSPF hydrologic parameters and then perform predictive uncertainty analysis of stream flow. We used Monte Carlo simulation to run the model with 1,000 different parameter sets and assess predictive uncertainty. In order to reduce the chance of specious parameter sets, we accounted for the relationships among parameter values by using mathematically based regularization techniques and an estimate of the parameter covariance when generating random parameter sets. We used a novel approach to select flow data for predictive uncertainty analysis: we set aside flow data that occurred on days that bacteria samples were collected and did not use these flows in the estimation of the model parameters. We calculated a percent uncertainty for each flow observation based on 1,000 model runs. We also used several methods to visualize results, with an emphasis on making the data accessible to both technical and general audiences. We will use the predictive uncertainty estimates in the next phase of our work, simulating bacteria fate and transport in the watershed.
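The per-observation percent uncertainty described above can be sketched as follows: generate correlated random parameter sets from an estimated mean and covariance (as PEST-style analyses do), run the model for each set, and express the 5-95% band relative to the ensemble median. The two-parameter recession model and covariance values are illustrative assumptions, not the HSPF configuration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative stand-in for "run the watershed model with many parameter
# sets": a two-parameter recession model q(t) = q0 * exp(-k * t).
t = np.arange(30.0)

def flow(params):
    q0, k = params
    return q0 * np.exp(-k * t)

# Correlated parameter samples drawn from an estimated parameter
# covariance, mimicking regularized random parameter set generation.
mean = np.array([10.0, 0.08])
cov = np.array([[1.0, 0.004],
                [0.004, 0.0004]])
params = rng.multivariate_normal(mean, cov, size=1000)
params = np.abs(params)                      # keep physically plausible signs

runs = np.array([flow(p) for p in params])   # (1000, 30) flow ensemble

# Percent uncertainty for each "observation" time: half the 5-95% band
# expressed relative to the ensemble median.
q05, q50, q95 = np.quantile(runs, [0.05, 0.50, 0.95], axis=0)
pct_uncertainty = 100.0 * (q95 - q05) / (2.0 * q50)
```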

  11. Comparing Two Strategies to Model Uncertainties in Structural Dynamics

    Directory of Open Access Journals (Sweden)

    Rubens Sampaio

    2010-01-01

    In the modeling of dynamical systems, uncertainties are present and they must be taken into account to improve the prediction of the models. Some strategies have been used to model uncertainties, and the aim of this work is to discuss two of those strategies and to compare them. This will be done using the simplest model possible: a two-d.o.f. (degrees of freedom) dynamical system. A simple system is used because it is very helpful to assure a better understanding and, consequently, comparison of the strategies. The first strategy (called the parametric strategy) consists in taking each spring stiffness as uncertain, and a random variable is associated to each one of them. The second strategy (called the nonparametric strategy) is more general and considers the whole stiffness matrix as uncertain, associating a random matrix to it. In both cases, the probability density functions, either of the random parameters or of the random matrix, are deduced from the Maximum Entropy Principle using only the available information. With this example, some important results can be discussed which cannot be assessed when complex structures are used, as has been done so far in the literature. One important element in the comparison of the two strategies is the analysis of the sample spaces and how to compare them.
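A minimal sketch of the parametric strategy on a two-d.o.f. system: each spring stiffness is a random variable (here gamma-distributed, one of the distributions Maximum Entropy reasoning yields for a positive quantity with suitable constraints), and the dispersion of the natural frequencies is propagated by sampling. The masses, mean stiffnesses, and dispersion level are assumptions; the nonparametric strategy would instead draw the whole stiffness matrix from a random-matrix ensemble, which is not shown here.

```python
import numpy as np

rng = np.random.default_rng(8)

# Two-d.o.f. mass-spring chain: unit masses; springs k1 (wall-mass 1)
# and k2 (mass 1-mass 2). Values are illustrative, not from the paper.
def stiffness(k1, k2):
    return np.array([[k1 + k2, -k2],
                     [-k2, k2]])

def natural_freqs(K):
    # Eigenproblem K v = w^2 M v with M = identity, so w = sqrt(eig(K))
    return np.sqrt(np.linalg.eigvalsh(K))

# Parametric strategy: each spring stiffness is a random variable
# (gamma with ~10% relative dispersion around the nominal mean).
n = 2000
k1 = rng.gamma(100.0, 1.0 / 100.0, n) * 2.0   # mean 2.0
k2 = rng.gamma(100.0, 1.0 / 100.0, n) * 1.0   # mean 1.0

freqs = np.array([natural_freqs(stiffness(a, b)) for a, b in zip(k1, k2)])
spread = freqs.std(axis=0) / freqs.mean(axis=0)   # relative dispersion
```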

  12. Dynamic modelling under uncertainty: the case of Trypanosoma brucei energy metabolism.

    Directory of Open Access Journals (Sweden)

    Fiona Achcar

    2012-01-01

    Kinetic models of metabolism require detailed knowledge of kinetic parameters. However, due to measurement errors or lack of data, this knowledge is often uncertain. The model of glycolysis in the parasitic protozoan Trypanosoma brucei is a particularly well analysed example of a quantitative metabolic model, but so far it has been studied with a fixed set of parameters only. Here we evaluate the effect of parameter uncertainty. In order to define probability distributions for each parameter, information about the experimental sources and confidence intervals for all parameters was collected. We created a wiki-based website dedicated to the detailed documentation of this information: the SilicoTryp wiki (http://silicotryp.ibls.gla.ac.uk/wiki/Glycolysis). Using information collected in the wiki, we then assigned probability distributions to all parameters of the model. This allowed us to sample sets of alternative models, accurately representing our degree of uncertainty. Some properties of the model, such as the repartition of the glycolytic flux between the glycerol- and pyruvate-producing branches, are robust to these uncertainties. However, our analysis also allowed us to identify fragilities of the model leading to the accumulation of 3-phosphoglycerate and/or pyruvate. The analysis of the control coefficients revealed the importance of taking into account the uncertainties about the parameters, as the ranking of the reactions can be greatly affected. This work will now form the basis for a comprehensive Bayesian analysis and extension of the model considering alternative topologies.
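Sampling alternative parameter sets from per-parameter probability distributions, and then checking which model properties are robust, can be sketched as below. The two-branch Michaelis-Menten flux split and all parameter values are illustrative stand-ins, not the actual T. brucei glycolysis model.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy two-branch flux split with Michaelis-Menten kinetics; the parameter
# names and values are illustrative, not taken from the T. brucei model.
def branch_fluxes(vmax1, km1, vmax2, km2, s=1.0):
    v1 = vmax1 * s / (km1 + s)
    v2 = vmax2 * s / (km2 + s)
    return v1, v2

# Represent parameter uncertainty as lognormal distributions whose spread
# encodes a documented confidence interval for each parameter.
n = 2000
vmax1 = rng.lognormal(np.log(10.0), 0.2, n)
km1   = rng.lognormal(np.log(0.5), 0.3, n)
vmax2 = rng.lognormal(np.log(4.0), 0.2, n)
km2   = rng.lognormal(np.log(1.2), 0.3, n)

v1, v2 = branch_fluxes(vmax1, km1, vmax2, km2)
repartition = v1 / (v1 + v2)       # fraction of flux through branch 1

# Robustness check: how wide is the 90% interval of the flux repartition?
lo, hi = np.quantile(repartition, [0.05, 0.95])
width = hi - lo
```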

  13. Comparing the effects of climate and impact model uncertainty on climate impacts estimates for grain maize

    Science.gov (United States)

    Holzkämper, Annelie; Honti, Mark; Fuhrer, Jürg

    2015-04-01

    Crop models are commonly applied to estimate impacts of projected climate change and to anticipate suitable adaptation measures. Thereby, uncertainties from global climate models, regional climate models, and impacts models cascade down to impact estimates. It is essential to quantify and understand uncertainties in impact assessments in order to provide informed guidance for decision making in adaptation planning. A question that has hardly been investigated in this context is how sensitive climate impact estimates are to the choice of the impact model approach. In a case study for Switzerland we compare results of three different crop modelling approaches to assess the relevance of impact model choice in relation to other uncertainty sources. The three approaches include an expert-based, a statistical and a process-based model. With each approach impact model parameter uncertainty and climate model uncertainty (originating from climate model chain and downscaling approach) are accounted for. ANOVA-based uncertainty partitioning is performed to quantify the relative importance of different uncertainty sources. Results suggest that uncertainty in estimated yield changes originating from the choice of the crop modelling approach can be greater than uncertainty from climate model chains. The uncertainty originating from crop model parameterization is small in comparison. While estimates of yield changes are highly uncertain, the directions of estimated changes in climatic limitations are largely consistent. This leads us to the conclusion that by focusing on estimated changes in climate limitations, more meaningful information can be provided to support decision making in adaptation planning - especially in cases where yield changes are highly uncertain.

  14. Robust sensor fault detection and isolation of gas turbine engines subjected to time-varying parameter uncertainties

    Science.gov (United States)

    Pourbabaee, Bahareh; Meskin, Nader; Khorasani, Khashayar

    2016-08-01

    In this paper, a novel robust sensor fault detection and isolation (FDI) strategy using the multiple model-based (MM) approach is proposed that remains robust with respect to both time-varying parameter uncertainties and process and measurement noise in all the channels. The scheme is composed of robust Kalman filters (RKF) constructed for multiple piecewise linear (PWL) models obtained at various operating points of an uncertain nonlinear system. The parameter uncertainty is modeled by using a time-varying norm-bounded admissible structure that affects all the PWL state space matrices. The robust Kalman filter gain matrices are designed by solving two algebraic Riccati equations (AREs) that are expressed as two linear matrix inequality (LMI) feasibility conditions. The proposed multiple RKF-based FDI scheme is simulated for a single-spool gas turbine engine to diagnose various sensor faults despite the presence of parameter uncertainties and process and measurement noise. Our comparative studies confirm the superiority of our proposed FDI method when compared to the methods available in the literature.

  15. Stochastic uncertainties and sensitivities of a regional-scale transport model of nitrate in groundwater

    NARCIS (Netherlands)

    Brink, C.v.d.; Zaadnoordijk, W.J.; Burgers, S.; Griffioen, J.

    2008-01-01

    In recent years, groundwater quality management has come to rely more and more on models. These models are used to predict the risk of groundwater contamination for various land uses. This paper presents an assessment of uncertainties and sensitivities to input parameters for a regional model. The model had

  16. Stochastic modelling of landfill leachate and biogas production incorporating waste heterogeneity. Model formulation and uncertainty analysis.

    Science.gov (United States)

    Zacharof, A I; Butler, A P

    2004-01-01

    A mathematical model simulating the hydrological and biochemical processes occurring in landfilled waste is presented and demonstrated. The model combines biochemical and hydrological models into an integrated representation of the landfill environment. Waste decomposition is modelled using traditional biochemical waste decomposition pathways combined with a simplified methodology for representing the rate of decomposition. Water flow through the waste is represented using a statistical velocity model capable of representing the effects of waste heterogeneity on leachate flow through the waste. Given the limitations in data capture from landfill sites, significant emphasis is placed on improving parameter identification and reducing parameter requirements. A sensitivity analysis is performed, highlighting the model's response to changes in input variables. A model test run is also presented, demonstrating the model capabilities. A parameter perturbation model sensitivity analysis was also performed. This has been able to show that although the model is sensitive to certain key parameters, its overall intuitive response provides a good basis for making reasonable predictions of the future state of the landfill system. Finally, due to the high uncertainty associated with landfill data, a tool for handling input data uncertainty is incorporated in the model's structure. It is concluded that the model can be used as a reasonable tool for modelling landfill processes and that further work should be undertaken to assess the model's performance.

  17. On the meaning of feedback parameter, transient climate response, and the greenhouse effect: Basic considerations and the discussion of uncertainties

    CERN Document Server

    Kramm, Gerhard

    2010-01-01

    In this paper we discuss the meaning of feedback parameter, greenhouse effect and transient climate response, usually related to the globally averaged energy balance model of Schneider and Mass. After scrutinizing this model and the corresponding planetary radiation balance we state that (a) this globally averaged energy balance model is flawed by unsuitable physical considerations, (b) the planetary radiation balance for an Earth in the absence of an atmosphere is fraught with the inappropriate assumption of a uniform surface temperature, the so-called radiative equilibrium temperature of about 255 K, and (c) the effect of the radiative anthropogenic forcing, considered as a perturbation to the natural system, is much smaller than the uncertainty involved in the solution of the model of Schneider and Mass. This uncertainty is mainly related to the empirical constants suggested by various authors and used for predicting the emission of infrared radiation by the Earth's skin. Furthermore, after inserting the ...

  18. Consistent Stochastic Modelling of Meteocean Design Parameters

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Sterndorff, M. J.

    2000-01-01

    Consistent stochastic models of metocean design parameters and their directional dependencies are essential for reliability assessment of offshore structures. In this paper a stochastic model for the annual maximum values of the significant wave height, and the associated wind velocity, current velocity, and water level is presented. The stochastic model includes statistical uncertainty and dependency between the four stochastic variables. Further, a new stochastic model for annual maximum directional significant wave heights is presented. The model includes dependency between the maximum wave height from neighboring directional sectors. Numerical examples are presented where the models are calibrated using the Maximum Likelihood method to data from the central part of the North Sea. The calibration of the directional distributions is made such that the stochastic model for the omnidirectional...

  19. Uncertainties in modelling the climate impact of irrigation

    Science.gov (United States)

    de Vrese, Philipp; Hagemann, Stefan

    2017-04-01

    Many issues related to the climate impact of irrigation are addressed in studies that apply a wide range of models. These involve uncertainties related to differences in the model's general structure and parametrizations on the one hand and the need for simplifying assumptions with respect to the representation of irrigation on the other hand. To address these uncertainties, we used the Max Planck Institute for Meteorology's Earth System model into which a simple irrigation scheme was implemented. In several simulations, we varied certain irrigation characteristics to estimate the resulting variations in irrigation's climate impact and found a large sensitivity with respect to the irrigation effectiveness. Here, the assumed effectiveness of the scheme is a combination of the target soil moisture and the degree to which water losses are accounted for. In general, the simulated impact of irrigation on the state of the land surface and the atmosphere is more than three times larger when assuming a low irrigation effectiveness compared to a high effectiveness. In an additional set of simulations, we varied certain aspects of the model's general structure, namely the land-surface-atmosphere coupling, to estimate the related uncertainties. Here we compared the impact of irrigation between simulations using a parameter aggregation, a simple flux aggregation scheme and a coupling scheme that also accounts for spatial heterogeneity within the lowest layers of the atmosphere. It was found that changes in the land-surface-atmosphere coupling do not only affect the magnitude of climate impacts but they can even affect the direction of the impacts.

  20. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Dai, Heng [Pacific Northwest National Laboratory, Richland Washington USA; Ye, Ming [Department of Scientific Computing, Florida State University, Tallahassee Florida USA; Walker, Anthony P. [Environmental Sciences Division and Climate Change Science Institute, Oak Ridge National Laboratory, Oak Ridge Tennessee USA; Chen, Xingyuan [Pacific Northwest National Laboratory, Richland Washington USA

    2017-04-01

    Hydrological models are always composed of multiple components that represent processes key to intended model applications. When a process can be simulated by multiple conceptual-mathematical models (process models), model uncertainty in representing the process arises. While global sensitivity analysis methods have been widely used for identifying important processes in hydrologic modeling, the existing methods consider only parametric uncertainty but ignore the model uncertainty for process representation. To address this problem, this study develops a new method to probe multimodel process sensitivity by integrating the model averaging methods into the framework of variance-based global sensitivity analysis, given that the model averaging methods quantify both parametric and model uncertainty. A new process sensitivity index is derived as a metric of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and model parameters. For demonstration, the new index is used to evaluate the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models with different parameterizations of hydraulic conductivity; each process model has its own random parameters. The new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.
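A crude sketch of the idea behind a process sensitivity index: sample which process model is active according to model-averaging weights, sample that model's parameters, and measure the share of output variance attributable to the process choice. The two recharge models, the weights, and the parameter ranges are invented for illustration, and the variance split below is a simplified between-model share, not the paper's exact index.

```python
import numpy as np

rng = np.random.default_rng(5)

# Two alternative "recharge" process models (illustrative, not the paper's):
# each converts precipitation P to recharge with its own random parameter.
def recharge_a(P, c):      # linear fraction model
    return c * P

def recharge_b(P, c):      # threshold-excess model
    return np.maximum(P - c, 0.0)

P = 800.0                            # mm/yr precipitation, held fixed here
n = 5000
weights = np.array([0.6, 0.4])       # assumed model-averaging weights

# Sample the process model, then its parameters, and record the output
choice = rng.choice(2, size=n, p=weights)
out = np.where(choice == 0,
               recharge_a(P, rng.uniform(0.1, 0.3, n)),
               recharge_b(P, rng.uniform(500.0, 700.0, n)))

# Law-of-total-variance split: the between-model part of the output
# variance is the share explained by which process model is active.
total_var = out.var()
between = sum(weights[m] * (out[choice == m].mean() - out.mean()) ** 2
              for m in (0, 1))
process_index = between / total_var   # crude between-model variance share
```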

  1. Monte Carlo analysis of uncertainty propagation in a stratospheric model. 2: Uncertainties due to reaction rates

    Science.gov (United States)

    Stolarski, R. S.; Butler, D. M.; Rundel, R. D.

    1977-01-01

    A concise stratospheric model was used in a Monte-Carlo analysis of the propagation of reaction rate uncertainties through the calculation of an ozone perturbation due to the addition of chlorine. Two thousand Monte-Carlo cases were run with 55 reaction rates being varied. Excellent convergence was obtained in the output distributions because the model is sensitive to the uncertainties in only about 10 reactions. For a 1 ppbv chlorine perturbation added to a 1.5 ppbv chlorine background, the resultant 1-sigma uncertainty on the ozone perturbation is a factor of 1.69 on the high side and 1.80 on the low side. The corresponding 2-sigma factors are 2.86 and 3.23. Results are also given for the uncertainties, due to reaction rates, in the ambient concentrations of stratospheric species.
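Monte Carlo propagation of multiplicative rate uncertainties, reported as "factor of X" bounds as in the abstract, can be sketched with a surrogate response. The power-law surrogate, the exponents, and the uncertainty factors below are assumptions, not the actual stratospheric chemistry model.

```python
import numpy as np

rng = np.random.default_rng(11)

# Surrogate: suppose the ozone perturbation scales as a product of powers
# of a few rate constants. Rate uncertainties are multiplicative, so each
# rate is sampled lognormally with a 1-sigma uncertainty factor f_i.
exponents = np.array([1.0, -0.7, 0.4, -0.2])
factors = np.array([1.3, 1.5, 1.2, 1.4])       # 1-sigma uncertainty factors

n = 2000
log_k = rng.normal(0.0, np.log(factors), size=(n, 4))
response = np.exp(log_k @ exponents)           # relative perturbation

# Report the 1-sigma spread as multiplicative factors on the median,
# matching the "factor of X on the high/low side" convention.
lo, med, hi = np.quantile(response, [0.1587, 0.5, 0.8413])
factor_high = hi / med
factor_low = med / lo
```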

  2. The impact of model and rainfall forcing errors on characterizing soil moisture uncertainty in land surface modeling

    Directory of Open Access Journals (Sweden)

    V. Maggioni

    2012-10-01

    The contribution of rainfall forcing errors relative to model (structural and parameter) uncertainty in the prediction of soil moisture is investigated by integrating the NASA Catchment Land Surface Model (CLSM), forced with hydro-meteorological data, in the Oklahoma region. Rainfall-forcing uncertainty is introduced using a stochastic error model that generates ensemble rainfall fields from satellite rainfall products. The ensemble satellite rain fields are propagated through CLSM to produce soil moisture ensembles. Errors in CLSM are modeled with two different approaches: either by perturbing model parameters (representing model parameter uncertainty) or by adding randomly generated noise (representing model structure and parameter uncertainty) to the model prognostic variables. Our findings highlight that the method currently used in the NASA GEOS-5 Land Data Assimilation System to perturb CLSM variables poorly describes the uncertainty in the predicted soil moisture, even when combined with rainfall model perturbations. On the other hand, by adding model parameter perturbations to rainfall forcing perturbations, a better characterization of uncertainty in soil moisture simulations is observed. Specifically, an analysis of the rank histograms shows that the most consistent ensemble of soil moisture is obtained by combining rainfall and model parameter perturbations. When rainfall forcing and model prognostic perturbations are added, the rank histogram shows a U-shape at the domain average scale, which corresponds to a lack of variability in the forecast ensemble. The more accurate estimation of the soil moisture prediction uncertainty obtained by combining rainfall and parameter perturbations is encouraging for the application of this approach in ensemble data assimilation systems.
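The rank histogram diagnostic mentioned above can be sketched directly: count, at each time, how many ensemble members fall below the observation; a flat histogram of these ranks indicates a statistically consistent ensemble, while a U-shape flags under-dispersion. The Gaussian toy ensembles below are assumptions used only to show the two shapes.

```python
import numpy as np

rng = np.random.default_rng(2)

def rank_histogram(ensemble, obs):
    # ensemble: (n_members, n_times); obs: (n_times,).
    # Rank of each observation among the ensemble members (0..n_members),
    # then counts per rank bin.
    ranks = np.sum(ensemble < obs[None, :], axis=0)
    return np.bincount(ranks, minlength=ensemble.shape[0] + 1)

n_members, n_times = 20, 5000
obs = rng.normal(0.0, 1.0, n_times)

consistent = rng.normal(0.0, 1.0, (n_members, n_times))      # same spread
underdispersed = rng.normal(0.0, 0.5, (n_members, n_times))  # too narrow

h_flat = rank_histogram(consistent, obs)
h_u = rank_histogram(underdispersed, obs)

# The extreme bins (observation outside the ensemble) collect many more
# cases for the too-narrow ensemble, producing the U-shape.
edge_share_flat = (h_flat[0] + h_flat[-1]) / n_times
edge_share_u = (h_u[0] + h_u[-1]) / n_times
```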

  3. Uncertainty models applied to the substation planning

    Energy Technology Data Exchange (ETDEWEB)

    Fontoura Filho, Roberto N. [ELETROBRAS, Rio de Janeiro, RJ (Brazil); Aires, Joao Carlos O.; Tortelly, Debora L.S. [Light Servicos de Eletricidade S.A., Rio de Janeiro, RJ (Brazil)

    1994-12-31

    The selection of reinforcements for a power system expansion becomes a difficult task in an environment of uncertainties. These uncertainties can be classified according to their sources as exogenous and endogenous. The first is associated with the elements of the generation, transmission and distribution systems. The exogenous uncertainty is associated with external aspects, such as the financial resources, the time spent to build the installations, the equipment price and the load level. The load uncertainty is extremely sensitive to the behaviour of the economic conditions. Although the uncertainty cannot be removed completely, the endogenous part can be conveniently treated and the exogenous part can be compensated. This paper describes an uncertainty treatment methodology and a practical application to a group of substations belonging to LIGHT company, the Rio de Janeiro electric utility. The equipment performance uncertainty is treated by adopting a probabilistic approach. The uncertainty associated with the load increase is considered by using technical analysis of scenarios and choice criteria based on Decision Theory. In this paper the Savage Method and the Fuzzy Set Method were used in order to select the best middle-term reinforcements plan. (author) 7 refs., 4 figs., 6 tabs.

  4. Estimated Frequency Domain Model Uncertainties used in Robust Controller Design

    DEFF Research Database (Denmark)

    Tøffner-Clausen, S.; Andersen, Palle; Stoustrup, Jakob;

    1994-01-01

    This paper deals with the combination of system identification and robust controller design. Recent results on estimation of frequency domain model uncertainty are...

  5. Numerical daemons in hydrological modeling: Effects on uncertainty assessment, sensitivity analysis and model predictions

    Science.gov (United States)

    Kavetski, D.; Clark, M. P.; Fenicia, F.

    2011-12-01

    Hydrologists often face sources of uncertainty that dwarf those normally encountered in many engineering and scientific disciplines. Especially when representing large-scale integrated systems, internal heterogeneities such as stream networks, preferential flowpaths, vegetation, etc., are necessarily represented with a considerable degree of lumping. The inputs to these models are themselves often the products of sparse observational networks. Given the simplifications inherent in environmental models, especially lumped conceptual models, does it really matter how they are implemented? At the same time, given the complexities usually found in the response surfaces of hydrological models, increasingly sophisticated analysis methodologies are being proposed for sensitivity analysis, parameter calibration and uncertainty assessment. Quite remarkably, rather than being caused by the model structure/equations themselves, in many cases model analysis complexities are consequences of seemingly trivial aspects of the model implementation - often, literally, whether the start-of-step or end-of-step fluxes are used! The extent of problems can be staggering, including (i) degraded performance of parameter optimization and uncertainty analysis algorithms, (ii) erroneous and/or misleading conclusions of sensitivity analysis, parameter inference and model interpretations and, finally, (iii) poor reliability of a calibrated model in predictive applications. While the often nontrivial behavior of numerical approximations has long been recognized in applied mathematics and in physically-oriented fields of environmental sciences, it remains a problematic issue in many environmental modeling applications. Perhaps detailed attention to numerics is only warranted for complicated engineering models? Would not numerical errors be an insignificant component of total uncertainty when typical data and model approximations are present? Is this really a serious issue beyond some rare isolated
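    The start-of-step versus end-of-step issue can be made concrete with a toy linear reservoir (all constants are invented). With dt*k = 2.5 the start-of-step (explicit Euler) update violates its stability limit dt*k < 2, while the end-of-step (implicit Euler) update converges smoothly to the steady state P/k:

```python
# Toy linear reservoir dS/dt = P - k*S integrated with daily steps in two ways.

def explicit_step(S, P, k, dt):
    # start-of-step flux: outflow computed from storage at the start of the step
    return S + dt * (P - k * S)

def implicit_step(S, P, k, dt):
    # end-of-step flux: outflow computed from storage at the end of the step
    return (S + dt * P) / (1.0 + dt * k)

k, P, dt = 2.5, 1.0, 1.0
S_exp = S_imp = 10.0
for _ in range(20):
    S_exp = explicit_step(S_exp, P, k, dt)
    S_imp = implicit_step(S_imp, P, k, dt)
# S_imp approaches P/k = 0.4; S_exp oscillates with growing amplitude
```

The two schemes discretise the same equation, yet one diverges: a "numerical daemon" of exactly the seemingly trivial kind the abstract describes.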

  6. One-parameter class of uncertainty relations based on entropy power

    Science.gov (United States)

    Jizba, Petr; Ma, Yue; Hayes, Anthony; Dunningham, Jacob A.

    2016-06-01

    We use the concept of entropy power to derive a one-parameter class of information-theoretic uncertainty relations for pairs of conjugate observables in an infinite-dimensional Hilbert space. This class constitutes an infinite tower of higher-order statistics uncertainty relations, which allows one in principle to determine the shape of the underlying information-distribution function by measuring the relevant entropy powers. We illustrate the capability of this class by discussing two examples: superpositions of vacuum and squeezed states and the Cauchy-type heavy-tailed wave function.

  7. Uncertainty and target accuracy studies for the very high temperature reactor (VHTR) physics parameters.

    Energy Technology Data Exchange (ETDEWEB)

    Taiwo, T. A.; Palmiotti, G.; Aliberti, G.; Salvatores, M.; Kim, T.K.

    2005-09-16

    The potential impact of nuclear data uncertainties on a number of performance parameters (core and fuel cycle) of the prismatic block-type Very High Temperature Reactor (VHTR) has been evaluated and results are presented in this report. An uncertainty analysis has been performed, based on sensitivity theory, which underlines which cross sections, which energy ranges and which isotopes are responsible for the most significant uncertainties. In order to give guidelines on priorities for new evaluations or validation experiments, required accuracies on specific nuclear data have been derived, accounting for target accuracies on major design parameters. Results of an extensive analysis indicate that only a limited number of relevant parameters do not meet the target accuracies assumed in this work; this does not imply that the existing nuclear cross-section data cannot be used for the feasibility and pre-conceptual assessments of the VHTR. However, the results obtained depend on the uncertainty data used, and it is suggested to focus some future evaluation work on the production of consistent and, as far as possible, complete and user-oriented covariance data.
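    Sensitivity-based uncertainty propagation of the kind described above is commonly written as the "sandwich rule", var(R) = S C Sᵀ, with S the sensitivity profile and C the covariance matrix of the nuclear data. A minimal sketch with invented relative sensitivities and covariances:

```python
import numpy as np

# Hypothetical relative sensitivities of a response (e.g. k_eff) to three cross sections
s = np.array([0.8, -0.3, 0.15])

# Hypothetical relative covariance matrix of those cross-section data
C = np.array([
    [0.0004, 0.0001, 0.0],
    [0.0001, 0.0009, 0.0],
    [0.0,    0.0,    0.0025],
])

# "Sandwich rule": relative variance and one-sigma uncertainty of the response
var = s @ C @ s
rel_uncertainty = np.sqrt(var)   # about 1.9% here
```

Inverting the same relation against a target accuracy on the response is what yields the required accuracies on individual cross sections.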

  8. Online Prediction Under Model Uncertainty via Dynamic Model Averaging: Application to a Cold Rolling Mill.

    Science.gov (United States)

    Raftery, Adrian E; Kárný, Miroslav; Ettler, Pavel

    2010-02-01

    We consider the problem of online prediction when it is uncertain what the best prediction model to use is. We develop a method called Dynamic Model Averaging (DMA) in which a state space model for the parameters of each model is combined with a Markov chain model for the correct model. This allows the "correct" model to vary over time. The state space and Markov chain models are both specified in terms of forgetting, leading to a highly parsimonious representation. As a special case, when the model and parameters do not change, DMA is a recursive implementation of standard Bayesian model averaging, which we call recursive model averaging. The method is applied to the problem of predicting the output strip thickness for a cold rolling mill, where the output is measured with a time delay. We found that when only a small number of physically motivated models were considered and one was clearly best, the method quickly converged to the best model, and the cost of model uncertainty was small; indeed DMA performed slightly better than the best physical model. When model uncertainty and the number of models considered were large, our method ensured that the penalty for model uncertainty was small. At the beginning of the process, when control is most difficult, we found that DMA over a large model space led to better predictions than the single best performing physically motivated model. We also applied the method to several simulated examples, and found that it recovered both constant and time-varying regression parameters and model specifications quite well.
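    The forgetting-based weight recursion behind DMA can be sketched as follows (the likelihood values and forgetting factor are invented, and this is a simplification of the authors' method, not their implementation):

```python
import numpy as np

def dma_weights(likelihoods, alpha=0.95):
    """Recursively update model probabilities with a forgetting factor.

    likelihoods: (n_time, n_models) predictive likelihoods of each model.
    alpha: forgetting factor; alpha = 1 recovers recursive Bayesian model averaging.
    """
    n_time, n_models = likelihoods.shape
    w = np.full(n_models, 1.0 / n_models)
    history = np.empty((n_time, n_models))
    for t in range(n_time):
        # Forgetting step: flatten the weights slightly (surrogate for the Markov chain)
        w = w ** alpha
        w /= w.sum()
        # Bayes update with the current predictive likelihoods
        w = w * likelihoods[t]
        w /= w.sum()
        history[t] = w
    return history

# Model 0 consistently explains the data better than model 1
lik = np.tile([0.6, 0.4], (50, 1))
hist = dma_weights(lik)   # weights converge towards model 0
```

With alpha < 1 the weights never lock in completely, so the averaging can track a change in which model is "correct".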

  9. Temperature response functions introduce high uncertainty in modelled carbon stocks in cold temperature regimes

    Directory of Open Access Journals (Sweden)

    H. Portner

    2009-08-01

    Full Text Available Models of carbon cycling in terrestrial ecosystems contain formulations for the dependence of respiration on temperature, but the sensitivity of predicted carbon pools and fluxes to these formulations and their parameterization is not well understood. We therefore performed an uncertainty analysis of soil organic matter decomposition with respect to its temperature dependency using the ecosystem model LPJ-GUESS.

    We used five temperature response functions (Exponential, Arrhenius, Lloyd-Taylor, Gaussian, Van't Hoff). We determined the parameter uncertainty ranges of the functions by nonlinear regression analysis based on eight experimental datasets from northern hemisphere ecosystems. We sampled over the uncertainty bounds of the parameters and ran simulations for each pair of temperature response function and calibration site. The uncertainty in both long-term and short-term soil carbon dynamics was analyzed over an elevation gradient in southern Switzerland.

    The Lloyd-Taylor function turned out to be adequate for modelling the temperature dependency of soil organic matter decomposition, whereas the other functions either resulted in poor fits (Exponential, Arrhenius) or were not applicable to all datasets (Gaussian, Van't Hoff). There were two main sources of uncertainty for model simulations: (1) the uncertainty in the parameter estimates of the response functions, which increased with increasing temperature, and (2) the uncertainty in the simulated size of carbon pools, which increased with elevation, as slower turn-over times lead to higher carbon stocks and higher associated uncertainties. The higher uncertainty in carbon pools with slow turn-over rates has important implications for the uncertainty in the projection of the change of soil carbon stocks driven by climate change, which turned out to be more uncertain for higher elevations and hence higher latitudes, which are of key importance for the global terrestrial carbon
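    The nonlinear-regression step for the Lloyd-Taylor function can be sketched as follows; synthetic data with invented parameter values stand in for the eight experimental datasets (T0 = 227.13 K is the fixed constant of the original Lloyd & Taylor 1994 formulation):

```python
import numpy as np
from scipy.optimize import curve_fit

T0 = 227.13  # K, fixed in the Lloyd & Taylor (1994) formulation

def lloyd_taylor(T, R10, E0):
    # Respiration as a function of temperature T (K), referenced to 283.15 K (10 C)
    return R10 * np.exp(E0 * (1.0 / (283.15 - T0) - 1.0 / (T - T0)))

# Synthetic "observations": true parameters R10 = 3.0, E0 = 308.56, 5% noise
rng = np.random.default_rng(1)
T = np.linspace(263.15, 303.15, 40)
R_obs = lloyd_taylor(T, 3.0, 308.56) * (1.0 + 0.05 * rng.normal(size=T.size))

popt, pcov = curve_fit(lloyd_taylor, T, R_obs, p0=(1.0, 200.0))
perr = np.sqrt(np.diag(pcov))   # one-sigma parameter uncertainties
```

Sampling parameter pairs within such uncertainty bounds, rather than using only the best fit, is what propagates the regression uncertainty into the carbon-pool simulations.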

  10. A Bayesian Chance-Constrained Method for Hydraulic Barrier Design Under Model Structure Uncertainty

    Science.gov (United States)

    Chitsazan, N.; Pham, H. V.; Tsai, F. T. C.

    2014-12-01

    The groundwater community has widely recognized model structure uncertainty as the major source of model uncertainty in groundwater modeling. Previous studies of aquifer remediation design, however, rarely discuss the impact of model structure uncertainty. This study combines chance-constrained (CC) programming with Bayesian model averaging (BMA) in a BMA-CC framework to assess the effect of model structure uncertainty on remediation design. To investigate this impact, we compare the BMA-CC method with traditional CC programming, which considers only model parameter uncertainty. The BMA-CC method is employed to design a hydraulic barrier to protect public supply wells of the Government St. pump station from saltwater intrusion in the "1,500-foot" sand and the "1,700-foot" sand of the Baton Rouge area, southeastern Louisiana. To address the model structure uncertainty, we develop three conceptual groundwater models based on three different hydrostratigraphic structures. The results show that using traditional CC programming overestimates design reliability. The results also show that at least five additional connector wells are needed to achieve a design reliability level above 90%. The total amount of injected water from the connector wells is higher than the total pumpage of the protected public supply wells. While the injection rate can be reduced by lowering the reliability level, the study finds that the hydraulic barrier design to protect the Government St. pump station is not economically attractive.
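    The BMA part of a reliability evaluation of this kind can be sketched by weighting per-model Monte Carlo reliabilities with posterior model probabilities (all numbers below are invented; the study itself uses three conceptual groundwater models):

```python
import numpy as np

rng = np.random.default_rng(3)

# Posterior probabilities of three hypothetical conceptual models
p_model = np.array([0.5, 0.3, 0.2])

# Per-model Monte Carlo samples of head at a control point under a candidate design
heads = [rng.normal(mu, 0.4, 2000) for mu in (1.2, 0.9, 0.6)]

threshold = 0.5   # hypothetical chance constraint: head must stay above 0.5 m

# Per-model reliability, then BMA reliability as the probability-weighted average
rel_per_model = np.array([(h > threshold).mean() for h in heads])
bma_reliability = float(p_model @ rel_per_model)
```

Evaluating the constraint under a single model (the traditional CC approach) would report that model's own reliability, which can overestimate the averaged value when less favourable model structures carry non-negligible posterior weight.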

  11. Corruption of parameter behavior and regionalization by model and forcing data errors: A Bayesian example using the SNOW17 model

    Science.gov (United States)

    He, Minxue; Hogue, Terri S.; Franz, Kristie J.; Margulis, Steven A.; Vrugt, Jasper A.

    2011-07-01

    The current study evaluates the impacts of various sources of uncertainty involved in hydrologic modeling on parameter behavior and regionalization utilizing different Bayesian likelihood functions and the Differential Evolution Adaptive Metropolis (DREAM) algorithm. The developed likelihood functions differ in their underlying assumptions and treatment of error sources. We apply the developed method to a snow accumulation and ablation model (National Weather Service SNOW17) and generate parameter ensembles to predict snow water equivalent (SWE). Observational data include precipitation and air temperature forcing along with SWE measurements from 24 sites with diverse hydroclimatic characteristics. A multiple linear regression model is used to construct regionalization relationships between model parameters and site characteristics. Results indicate that model structural uncertainty has the largest influence on SNOW17 parameter behavior. Precipitation uncertainty is the second largest source of uncertainty, showing greater impact at wetter sites. Measurement uncertainty in SWE tends to have little impact on the final model parameters and resulting SWE predictions. Considering all sources of uncertainty, parameters related to air temperature and snowfall fraction exhibit the strongest correlations to site characteristics. Parameters related to the length of the melting period also show high correlation to site characteristics. Finally, model structural uncertainty and precipitation uncertainty dramatically alter parameter regionalization relationships in comparison to cases where only uncertainty in model parameters or output measurements is considered. Our results demonstrate that accurate treatment of forcing, parameter, model structural, and calibration data errors is critical for deriving robust regionalization relationships.

  12. DAKOTA : a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Eldred, Michael Scott; Vigil, Dena M.; Dalbey, Keith R.; Bohnhoff, William J.; Adams, Brian M.; Swiler, Laura Painton; Lefantzi, Sophia (Sandia National Laboratories, Livermore, CA); Hough, Patricia Diane (Sandia National Laboratories, Livermore, CA); Eddy, John P.

    2011-12-01

    The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a theoretical manual for selected algorithms implemented within the DAKOTA software. It is not intended as a comprehensive theoretical treatment, since a number of existing texts cover general optimization theory, statistical analysis, and other introductory topics. Rather, this manual is intended to summarize a set of DAKOTA-related research publications in the areas of surrogate-based optimization, uncertainty quantification, and optimization under uncertainty that provide the foundation for many of DAKOTA's iterative analysis capabilities.

  13. Uncertainty quantification for a hydro-morphodynamic model of river Rhine

    Science.gov (United States)

    Hieu Mai, Trung; Nowak, Wolfgang; Kopmann, Rebekka; Oladyshkin, Sergey

    2016-04-01

    Although numerical modelling is state of the art and has long been very helpful in river engineering, it should not be overlooked that uncertainties are unavoidable in numerical modelling. Uncertainties arise from deficient descriptions of the physical processes and from the imprecision of model parameters such as roughness coefficients or sediment grain sizes. Model input parameters are uncertain due to measurement errors, natural variability or unsatisfactory parametrization. The propagation of uncertainties in the input data through simulations might have a serious influence on the simulation results. Therefore, it is necessary to quantify the contributions of input uncertainties to the model results in order to appraise their reliability. Uncertainty analysis can help find the input parameters that cause the largest output uncertainty, and can identify the most uncertain locations and time periods in hydro-morphodynamic model predictions. The Monte Carlo method (MC), the traditional approach for uncertainty analysis, requires a huge computational effort with thousands of simulations. However, for high-resolution numerical hydro-morphodynamic models, each simulation may take hours or days, which makes the MC method impractical in river engineering. Therefore, other advanced uncertainty quantification methods should be investigated and applied for complex hydro-morphodynamic models. In this study, five methods have been used and compared for uncertainty analysis of numerical simulations: (1) the Monte Carlo method (MC), (2) the First-Order Second Moment method based on numerical differentiation (FOSM/ND), (3) the FOSM method based on algorithmic differentiation (FOSM/AD), and higher-order expansion methods, namely (4) Taylor series expansion and (5) Polynomial Chaos Expansion (PCE). The latter two have been included in order to capture the effects of strong non-linearity in the hydro-morphodynamic process during uncertainty
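    A non-intrusive PCE of the kind listed as method (5) can be sketched for a toy response f(ξ) = exp(ξ) with a standard-normal input, for which the mean and variance are known analytically (exp(1/2) and exp(2) - exp(1)); the real hydro-morphodynamic model would simply replace the toy response evaluated at the quadrature nodes:

```python
import numpy as np
from math import factorial

# Probabilists' Gauss-Hermite quadrature: nodes/weights for the weight exp(-x^2/2)
nodes, weights = np.polynomial.hermite_e.hermegauss(20)
weights = weights / np.sqrt(2.0 * np.pi)   # normalise to the standard-normal measure

f = np.exp(nodes)   # toy model response evaluated at the quadrature nodes
order = 6           # truncation order of the expansion

# PCE coefficients c_k = E[f(xi) * He_k(xi)] / k!  (He_k: probabilists' Hermite)
coeffs = []
for k in range(order + 1):
    He_k = np.polynomial.hermite_e.hermeval(nodes, [0.0] * k + [1.0])
    coeffs.append(float(np.sum(weights * f * He_k)) / factorial(k))

# Output statistics follow directly from the coefficients, with no extra model runs
pce_mean = coeffs[0]
pce_var = sum(c**2 * factorial(k) for k, c in enumerate(coeffs) if k > 0)
```

Here 20 model evaluations replace the thousands of runs an MC analysis would need, which is precisely the attraction for expensive hydro-morphodynamic models.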

  14. Exploring uncertainty in glacier mass balance modelling with Monte Carlo simulation

    Directory of Open Access Journals (Sweden)

    H. Machguth

    2008-12-01

    Full Text Available By means of Monte Carlo simulations we calculated uncertainty in modelled cumulative mass balance over 400 days at one particular point on the tongue of Morteratsch Glacier, Switzerland, using a glacier energy balance model of intermediate complexity. Before uncertainty assessment, the model was tuned to observed mass balance for the investigated time period and its robustness was tested by comparing observed and modelled mass balance over 11 years, yielding very small deviations. Both systematic and random uncertainties are assigned to twelve input parameters and their respective values estimated from the literature or from available meteorological data sets. The calculated overall uncertainty in the model output is dominated by systematic errors and amounts to 0.7 m w.e. or approximately 10% of total melt over the investigated time span. In order to provide a first order estimate on variability in uncertainty depending on the quality of input data, we conducted a further experiment, calculating overall uncertainty for different levels of uncertainty in measured global radiation and air temperature. Our results show that the output of a well calibrated model is subject to considerable uncertainties, in particular when applied for extrapolation in time and space where systematic errors are likely to be an important issue.
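    The Monte Carlo treatment of systematic errors (drawn once per realisation) versus random errors (redrawn every day) can be sketched with an invented toy melt model; only the separation of error types mirrors the study:

```python
import numpy as np

rng = np.random.default_rng(42)
n_runs, n_days = 10_000, 400

def melt(T, G, c1=0.005, c2=5e-5):
    # Toy melt model (m w.e. per day): degree-day term plus a radiation term;
    # the coefficients and forcing values are invented, not taken from the study.
    return np.maximum(0.0, c1 * T + c2 * G)

# Systematic errors: one bias per realisation, held fixed over all 400 days
T_bias = rng.normal(0.0, 0.5, n_runs)      # air temperature bias, K
G_bias = rng.normal(0.0, 15.0, n_runs)     # global radiation bias, W m-2
# Random errors: redrawn every day
T = 2.0 + rng.normal(0.0, 1.0, (n_runs, n_days)) + T_bias[:, None]
G = 120.0 + rng.normal(0.0, 30.0, (n_runs, n_days)) + G_bias[:, None]

cumulative = melt(T, G).sum(axis=1)   # cumulative melt per realisation, m w.e.
spread = cumulative.std()             # dominated by the systematic biases
```

Daily random errors largely cancel over 400 days, while the per-realisation biases do not, which is why the overall uncertainty is dominated by the systematic component.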

  15. Exploring uncertainty in glacier mass balance modelling with Monte Carlo simulation

    Directory of Open Access Journals (Sweden)

    H. Machguth

    2008-06-01

    Full Text Available By means of Monte Carlo simulations we calculated uncertainty in modelled cumulative mass balance over 400 days at one particular point on the tongue of Morteratsch Glacier, Switzerland, using a glacier energy balance model of intermediate complexity. Before uncertainty assessment, the model was tuned to observed mass balance for the investigated time period and its robustness was tested by comparing observed and modelled mass balance over 11 years, yielding very small deviations. Both systematic and random uncertainties are assigned to twelve input parameters and their respective values estimated from the literature or from available meteorological data sets. The calculated overall uncertainty in the model output is dominated by systematic errors and amounts to 0.7 m w.e. or approximately 10% of total melt over the investigated time span. In order to provide a first order estimate on variability in uncertainty depending on the quality of input data, we conducted a further experiment, calculating overall uncertainty for different levels of uncertainty in measured global radiation and air temperature. Our results show that the output of a well calibrated model is subject to considerable uncertainties, in particular when applied for extrapolation in time and space where systematic errors are likely to be an important issue.

  16. Uncertainty Categorization, Modeling, and Management for Regional Water Supply Planning

    Science.gov (United States)

    Fletcher, S.; Strzepek, K. M.; AlSaati, A.; Alhassan, A.

    2016-12-01

    Many water planners face increased pressure on water supply systems from growing demands, variability in supply and a changing climate. Short-term variation in water availability and demand; long-term uncertainty in climate, groundwater storage, and sectoral competition for water; and varying stakeholder perspectives on the impacts of water shortages make it difficult to assess the necessity of expensive infrastructure investments. We categorize these uncertainties on two dimensions: whether they are the result of stochastic variation or epistemic uncertainty, and whether the uncertainties can be described probabilistically or are deep uncertainties whose likelihood is unknown. We develop a decision framework that combines simulation for probabilistic uncertainty, sensitivity analysis for deep uncertainty and Bayesian decision analysis for uncertainties that are reduced over time with additional information. We apply this framework to two contrasting case studies - drought preparedness in Melbourne, Australia and fossil groundwater depletion in Riyadh, Saudi Arabia - to assess the impacts of different types of uncertainty on infrastructure decisions. Melbourne's water supply system relies on surface water, which is impacted by natural variation in rainfall, and a market-based system for managing water rights. Our results show that small, flexible investment increases can mitigate shortage risk considerably at reduced cost. Riyadh, by contrast, relies primarily on desalination for municipal use and fossil groundwater for agriculture, and a centralized planner makes allocation decisions. Poor regional groundwater measurement makes it difficult to know when groundwater pumping will become uneconomical, resulting in epistemic uncertainty. However, collecting more data can reduce the uncertainty, suggesting the need for different uncertainty modeling and management strategies in Riyadh than in Melbourne. We will categorize the two systems and propose appropriate

  17. A coupled stochastic inverse/sharp interface seawater intrusion approach for coastal aquifers under groundwater parameter uncertainty

    Science.gov (United States)

    Llopis-Albert, Carlos; Merigó, José M.; Xu, Yejun

    2016-09-01

    This paper presents an alternative approach to seawater intrusion problems that overcomes some of the limitations of previous works by coupling the well-known SWI2 package for MODFLOW with a stochastic inverse model named the GC method. On the one hand, SWI2 allows vertically integrated variable-density groundwater flow and seawater intrusion in coastal multi-aquifer systems to be simulated, reduces the number of required model cells, and eliminates the need to solve the advective-dispersive transport equation, which leads to substantial savings in model run time. On the other hand, the GC method deals with groundwater parameter uncertainty by constraining stochastic simulations to flow and mass transport data (i.e., hydraulic conductivity, freshwater heads, saltwater concentrations and travel times) and also to secondary information obtained from expert judgment or geophysical surveys, thus reducing uncertainty and increasing reliability in meeting environmental standards. The methodology has been successfully applied to a transient movement of the freshwater-seawater interface in response to changing freshwater inflow in a two-aquifer coastal system, where an uncertainty assessment has been carried out by means of Monte Carlo simulation techniques. The approach also partially compensates for the neglected diffusion and dispersion processes, since conditioning reduces uncertainty and brings results closer to the available data.

  18. Investigating the Propagation of Meteorological Model Uncertainty for Tracer Modeling

    Science.gov (United States)

    Lopez-Coto, I.; Ghosh, S.; Karion, A.; Martin, C.; Mueller, K. L.; Prasad, K.; Whetstone, J. R.

    2016-12-01

    The North-East Corridor project aims to use a top-down inversion method to quantify sources of greenhouse gas (GHG) emissions in the urban areas of Washington DC and Baltimore at approximately 1 km² resolution. The project seeks to help establish reliable measurement methods for quantifying and validating GHG emissions independently of the inventory methods typically used to guide mitigation efforts. Since inversion methods depend strongly on atmospheric transport modeling, analyzing the uncertainties in the meteorological fields and their propagation through the sensitivities of observations to surface fluxes (footprints) is a fundamental step. To this end, six configurations of the Weather Research and Forecasting model (WRF-ARW) version 3.8 were used to generate an ensemble of meteorological simulations. Specifically, we used four planetary boundary layer parameterizations (YSU, MYNN2, BOULAC, QNSE), two sources of initial and boundary conditions (NARR and HRRR) and one configuration including the building energy parameterization (BEP) urban canopy model. The simulations were compared with more than 150 meteorological surface stations, a wind profiler and radiosondes for one month (February 2016) to account for the uncertainties and the ensemble spread in wind speed, wind direction and mixing height. In addition, we used the Stochastic Time-Inverted Lagrangian Transport model (STILT) to derive the sensitivity of 12 hypothetical observations to surface emissions (footprints) with each WRF configuration. The footprints and integrated sensitivities were compared and the resulting uncertainties estimated.

  19. Assessment of uncertainties of the models used in thermal-hydraulic computer codes

    Science.gov (United States)

    Gricay, A. S.; Migrov, Yu. A.

    2015-09-01

    The article deals with the problem of determining the statistical characteristics of variable parameters (their variation range and distribution law) when analyzing the uncertainty and sensitivity of calculation results to uncertainty in input data. A comparative analysis of modern approaches to uncertainty in input data is presented. The need to develop an alternative method for estimating the uncertainty of model parameters used in thermal-hydraulic computer codes, in particular in the closing correlations of the loop thermal-hydraulics block, is shown. Such a method should involve a minimal degree of subjectivity and must be based on objective quantitative assessment criteria. The method includes three sequential stages: selecting experimental data satisfying the specified criteria, identifying the key closing correlation using a sensitivity analysis, and carrying out case calculations followed by statistical processing of the results. By using the method, one can estimate the uncertainty range of a variable parameter and establish its distribution law in that range, provided that the experimental information is sufficiently representative. Practical application of the method is demonstrated by estimating the uncertainty of a parameter in the model describing the transition to post-burnout heat transfer used in the thermal-hydraulic computer code KORSAR. The study revealed the need to narrow the previously established uncertainty range of this parameter and to replace the uniform distribution law in that range with a Gaussian distribution law. The proposed method can be applied to different thermal-hydraulic computer codes. In some cases, its application can make it possible to achieve a smaller degree of conservatism in expert estimates of the uncertainties pertinent to the model parameters used in computer codes.

  20. Study on Uncertainties of Seismicity Parameters b and v4 in Seismic Statistical Zones

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    For several seismic statistical zones in North China, the key factors causing uncertainties in the important seismicity parameters b and v4 and the features of these uncertainties are discussed in this paper. The magnitude of the uncertainty is also analyzed. The key influencing factors are the statistical period, the methods of processing statistical samples, the lower limit magnitude and the annual average occurrence rate of large earthquakes. The variation ranges of b and v4 in the Tancheng-Lujiang zone are as high as 0.2 and 1.4 respectively, which are similar to those in the Fenwei zone. They are much smaller, however, in the Hebei zone because of its sufficient statistical samples.
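    Uncertainty in the b-value can be illustrated with Aki's maximum-likelihood estimator and its standard error on a synthetic Gutenberg-Richter catalogue (the per-zone values quoted in the abstract are not reproduced here):

```python
import numpy as np

def b_value(mags, m_min, dm=0.1):
    """Aki (1965) maximum-likelihood b-value and its standard error.

    dm is the magnitude binning width (use 0.0 for continuous magnitudes).
    """
    mags = np.asarray(mags)
    b = np.log10(np.e) / (mags.mean() - (m_min - dm / 2.0))
    return b, b / np.sqrt(mags.size)

# Synthetic catalogue following the Gutenberg-Richter law with b = 1.0 above Mc = 3.0
# (magnitudes above the completeness threshold are exponential with rate b*ln(10))
rng = np.random.default_rng(7)
m_min = 3.0
mags = m_min + rng.exponential(scale=np.log10(np.e) / 1.0, size=5000)

b_hat, b_err = b_value(mags, m_min, dm=0.0)
```

The 1/sqrt(N) form of the standard error makes explicit why zones with insufficient statistical samples, or a shortened statistical period, show much larger parameter uncertainty.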

  1. RESONANCE SELF-SHIELDING EFFECT IN UNCERTAINTY QUANTIFICATION OF FISSION REACTOR NEUTRONICS PARAMETERS

    Directory of Open Access Journals (Sweden)

    GO CHIBA

    2014-06-01

    Full Text Available In order to properly quantify fission reactor neutronics parameter uncertainties, we have to use covariance data and sensitivity profiles consistently. In the present paper, we establish two consistent methodologies for uncertainty quantification: a self-shielded cross section-based consistent methodology and an infinitely-diluted cross section-based consistent methodology. With these methodologies and the covariance data of uranium-238