WorldWideScience

Sample records for model parameter uncertainty

  1. Parameter and Uncertainty Estimation in Groundwater Modelling

    DEFF Research Database (Denmark)

    Jensen, Jacob Birk

The data basis on which groundwater models are constructed is in general very incomplete, and this leads to uncertainty in model outcome. Groundwater models form the basis for many, often costly decisions and if these are to be made on solid grounds, the uncertainty attached to model results must...... be quantified. This study was motivated by the need to estimate the uncertainty involved in groundwater models. Chapter 2 presents an integrated surface/subsurface unstructured finite difference model that was developed and applied to a synthetic case study. The following two chapters concern calibration...... was applied. Capture zone modelling was conducted on a synthetic stationary 3-dimensional flow problem involving river, surface and groundwater flow. Simulated capture zones were illustrated as likelihood maps and compared with deterministic capture zones derived from a reference model. The results showed...

  2. Climate change decision-making: Model & parameter uncertainties explored

    Energy Technology Data Exchange (ETDEWEB)

    Dowlatabadi, H.; Kandlikar, M.; Linville, C.

    1995-12-31

A critical aspect of climate change decision-making is the uncertainty in current understanding of the socioeconomic, climatic and biogeochemical processes involved. Decision-making processes are much better informed if these uncertainties are characterized and their implications understood. Quantitative analysis of these uncertainties serves to inform decision makers about the likely outcome of policy initiatives and helps set priorities for research so that outcome ambiguities faced by the decision-makers are reduced. A family of integrated assessment models of climate change has been developed at Carnegie Mellon. These models are distinguished from other integrated assessment efforts in that they were designed from the outset to characterize and propagate parameter, model, value, and decision-rule uncertainties. The most recent of these models is ICAM 2.1. This model includes representation of the processes of demographics, economic activity, emissions, atmospheric chemistry, climate and sea level change, impacts from these changes, policies for emissions mitigation, and adaptation to change. The model has over 800 objects, of which about one half are used to represent uncertainty. In this paper we show that, when considering parameter uncertainties, the relative contribution of climatic uncertainties is most important, followed by uncertainties in damage calculations, economic uncertainties and direct aerosol forcing uncertainties. When considering model structure uncertainties we find that the choice of policy is often dominated by the model structure choice, rather than by parameter uncertainties.

  3. Parameter uncertainty analysis of a biokinetic model of caesium

    International Nuclear Information System (INIS)

    Li, W.B.; Oeh, U.; Klein, W.; Blanchardon, E.; Puncher, M.; Leggett, R.W.; Breustedt, B.; Nosske, D.; Lopez, M.A.

    2015-01-01

Parameter uncertainties for the biokinetic model of caesium (Cs) developed by Leggett et al. were inventoried and evaluated. The methods of parameter uncertainty analysis were used to assess the uncertainties of model predictions under assumed model parameter uncertainties and distributions. Furthermore, the importance of individual model parameters was assessed by means of sensitivity analysis. The calculated uncertainties of model predictions were compared with human data of Cs measured in blood and in the whole body. It was found that propagating the derived uncertainties in model parameter values reproduced the range of bioassay data observed in human subjects at different times after intake. The maximum ranges, expressed as uncertainty factors (UFs) (defined as the square root of the ratio between the 97.5th and 2.5th percentiles), of blood clearance, whole-body retention and urinary excretion of Cs predicted at early times after intake were, respectively: 1.5, 1.0 and 2.5 at the first day; 1.8, 1.1 and 2.4 at Day 10; and 1.8, 2.0 and 1.8 at Day 100. For late times (1000 d) after intake, the UFs increased to 43, 24 and 31, respectively. The transfer rates between kidneys and blood and between muscle and blood, and the rate of transfer from kidneys to urinary bladder content, are the parameters most influential for the blood clearance and the whole-body retention of Cs. For the urinary excretion, the transfer rates from urinary bladder content to urine and from kidneys to urinary bladder content have the greatest impact. The implications of the larger uncertainty (a UF of 43) in whole-body retention at later times (after Day 500) for the estimated equivalent and effective doses will be explored in subsequent work within the EURADOS framework. (authors)
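The uncertainty factor defined in this abstract is straightforward to reproduce numerically. The sketch below propagates an assumed lognormal spread of predicted whole-body retention at one time point; the distribution and its width are invented for illustration, not taken from the paper's data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical ensemble of whole-body retention predictions at one time point,
# obtained by propagating sampled parameter values; the lognormal spread is
# illustrative only.
retention = rng.lognormal(mean=0.0, sigma=0.5, size=10_000)

p2_5, p97_5 = np.percentile(retention, [2.5, 97.5])
uf = np.sqrt(p97_5 / p2_5)   # uncertainty factor as defined in the abstract
print(round(uf, 2))
```

For a lognormal spread the UF depends only on the geometric standard deviation, which is why a single number can summarize the prediction range at each time point.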

  4. Incorporating model parameter uncertainty into inverse treatment planning

    International Nuclear Information System (INIS)

    Lian Jun; Xing Lei

    2004-01-01

Radiobiological treatment planning depends not only on the accuracy of the models describing the dose-response relation of different tumors and normal tissues but also on the accuracy of tissue-specific radiobiological parameters in these models. Whereas the general formalism remains the same, different sets of model parameters lead to different solutions and thus critically determine the final plan. Here we describe an inverse planning formalism that includes model parameter uncertainties. This is made possible by a statistical analysis-based framework developed by our group. In this formalism, the uncertainties of model parameters, such as the parameter a that describes the tissue-specific effect in the equivalent uniform dose (EUD) model, are expressed by probability density functions and are included in the dose optimization process. We found that the final solution strongly depends on the distribution functions of the model parameters. Considering that currently available models for computing biological effects of radiation are simplistic, and the clinical data used to derive the models are sparse and of questionable quality, the proposed technique provides an effective tool to minimize, in a statistical sense, the effect caused by these uncertainties. With the incorporation of the uncertainties, the technique makes it possible to exploit the available radiobiological knowledge more fully for better IMRT treatment
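The EUD model with an uncertain parameter a can be illustrated with a short sketch. The generalized EUD formula, gEUD = (Σᵢ vᵢ dᵢᵃ)^(1/a), is standard, but the voxel doses and the assumed normal density for a below are invented for illustration; this is not the paper's optimization formalism, only the parameter-propagation idea.

```python
import numpy as np

def gEUD(dose, a, vol=None):
    """Generalized equivalent uniform dose: (sum_i v_i * d_i**a) ** (1/a)."""
    dose = np.asarray(dose, dtype=float)
    if vol is None:
        vol = np.full(dose.shape, 1.0 / dose.size)  # equal voxel volumes
    return float((vol * dose**a).sum() ** (1.0 / a))

dose = np.array([60.0, 62.0, 58.0, 20.0])   # illustrative voxel doses (Gy)

# Express the tissue parameter a as a probability density (here an assumed
# normal for a tumour-like structure, a < 0) and propagate it through the model.
rng = np.random.default_rng(0)
a_samples = rng.normal(loc=-10.0, scale=1.5, size=5000)
euds = np.array([gEUD(dose, a) for a in a_samples])
print(round(euds.mean(), 1), round(euds.std(), 2))
```

With strongly negative a the gEUD is pulled toward the cold-spot dose, so the spread of the a density translates directly into a spread of the biological objective.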

  5. Combined Estimation of Hydrogeologic Conceptual Model and Parameter Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, Philip D.; Ye, Ming; Neuman, Shlomo P.; Cantrell, Kirk J.

    2004-03-01

The objective of the research described in this report is the development and application of a methodology for comprehensively assessing the hydrogeologic uncertainties involved in dose assessment, including uncertainties associated with conceptual models, parameters, and scenarios. This report describes and applies a statistical method to quantitatively estimate the combined uncertainty in model predictions arising from conceptual model and parameter uncertainties. The method relies on model averaging to combine the predictions of a set of alternative models. Implementation is driven by the available data. When minimal site-specific data are available, the method can be carried out with prior parameter estimates based on generic data and subjective prior model probabilities. For sites with observations of system behavior (and optionally data characterizing model parameters), the method uses model calibration to update the prior parameter estimates and model probabilities based on the correspondence between model predictions and site observations. The set of model alternatives can contain both simplified and complex models, with the requirement that all models be based on the same set of data. The method was applied to the geostatistical modeling of air permeability at a fractured rock site. Seven alternative variogram models of log air permeability were considered to represent data from single-hole pneumatic injection tests in six boreholes at the site. Unbiased maximum likelihood estimates of variogram and drift parameters were obtained for each model. Standard information criteria provided an ambiguous ranking of the models, which would not justify selecting one of them and discarding all others as is commonly done in practice. Instead, some of the models were eliminated based on their negligibly small updated probabilities and the rest were used to project the measured log permeabilities by kriging onto a rock volume containing the six boreholes. These four
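The model-averaging step described here can be sketched numerically. The information-criterion values, equal prior probabilities, and per-model predictions below are all invented, and the exp(-ΔIC/2) weighting is one common choice for converting criterion differences into posterior model probabilities, not necessarily the exact criterion used in the report.

```python
import numpy as np

# Invented information-criterion values (lower is better) for four alternative
# variogram models, plus equal subjective prior model probabilities.
ic = np.array([210.3, 211.1, 214.8, 225.0])
prior = np.full(ic.size, 1.0 / ic.size)

# Posterior model probabilities via the common exp(-delta_IC / 2) weighting.
delta = ic - ic.min()
w = prior * np.exp(-delta / 2.0)
w /= w.sum()

# Discard models with negligible updated probability, then average the
# remaining models' (invented) predictions with renormalized weights.
pred = np.array([4.2, 4.5, 3.9, 6.0])
keep = w > 0.01
avg = np.sum(w[keep] * pred[keep]) / w[keep].sum()
print(round(avg, 2))
```

Note how the worst model contributes essentially nothing: eliminating it changes the average only marginally, which mirrors the report's elimination of models with negligibly small updated probabilities.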

  6. Parameter and uncertainty estimation for mechanistic, spatially explicit epidemiological models

    Science.gov (United States)

    Finger, Flavio; Schaefli, Bettina; Bertuzzo, Enrico; Mari, Lorenzo; Rinaldo, Andrea

    2014-05-01

Epidemiological models can be a crucially important tool for decision-making during disease outbreaks. The range of possible applications spans from real-time forecasting and allocation of health-care resources to testing alternative intervention mechanisms such as vaccines, antibiotics or the improvement of sanitary conditions. Our spatially explicit, mechanistic models for cholera epidemics have been successfully applied to several epidemics, including the one that struck Haiti in late 2010 and is still ongoing. Calibration and parameter estimation of such models represents a major challenge because of properties unusual in traditional geoscientific domains such as hydrology. Firstly, the epidemiological data available might be subject to high uncertainties due to error-prone diagnosis as well as manual (and possibly incomplete) data collection. Secondly, long-term time-series of epidemiological data are often unavailable. Finally, the spatially explicit character of the models requires the comparison of several time-series of model outputs with their real-world counterparts, which calls for an appropriate weighting scheme. It follows that the usual assumption of a homoscedastic Gaussian error distribution, used in combination with classical calibration techniques based on Markov chain Monte Carlo algorithms, is likely to be violated, whereas the construction of an appropriate formal likelihood function seems close to impossible. Alternative calibration methods, which allow for accurate estimation of total model uncertainty, particularly regarding the envisaged use of the models for decision-making, are thus needed. Here we present the most recent developments regarding methods for parameter and uncertainty estimation to be used with our mechanistic, spatially explicit models for cholera epidemics, based on informal measures of goodness of fit.

  7. Parameter Uncertainty for Aircraft Aerodynamic Modeling using Recursive Least Squares

    Science.gov (United States)

    Grauer, Jared A.; Morelli, Eugene A.

    2016-01-01

    A real-time method was demonstrated for determining accurate uncertainty levels of stability and control derivatives estimated using recursive least squares and time-domain data. The method uses a recursive formulation of the residual autocorrelation to account for colored residuals, which are routinely encountered in aircraft parameter estimation and change the predicted uncertainties. Simulation data and flight test data for a subscale jet transport aircraft were used to demonstrate the approach. Results showed that the corrected uncertainties matched the observed scatter in the parameter estimates, and did so more accurately than conventional uncertainty estimates that assume white residuals. Only small differences were observed between batch estimates and recursive estimates at the end of the maneuver. It was also demonstrated that the autocorrelation could be reduced to a small number of lags to minimize computation and memory storage requirements without significantly degrading the accuracy of predicted uncertainty levels.
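The recursive least squares backbone of this method can be sketched as follows. The regression problem and "derivative" values are invented, and the paper's coloured-residual correction is not reproduced here; the covariance tracked by P reflects exactly the white-residual assumption that the correction addresses.

```python
import numpy as np

rng = np.random.default_rng(3)

# Invented linear regression problem standing in for the aerodynamic model:
# theta_true plays the role of two stability and control derivatives.
theta_true = np.array([1.5, -0.8])
n = 500
X = rng.normal(size=(n, 2))                          # regressors (e.g. alpha, elevator)
y = X @ theta_true + rng.normal(scale=0.1, size=n)   # noisy measured output

# Standard recursive least squares; P is proportional to the parameter
# covariance under the white-residual assumption.
theta = np.zeros(2)
P = np.eye(2) * 1e3
for x_k, y_k in zip(X, y):
    gain = P @ x_k / (1.0 + x_k @ P @ x_k)
    theta = theta + gain * (y_k - x_k @ theta)
    P = P - np.outer(gain, x_k @ P)

print(np.round(theta, 2))   # approaches theta_true as data accumulate
```

When residuals are coloured, as the abstract notes, the uncertainty implied by P is too optimistic, which is what the recursive autocorrelation correction in the paper repairs.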

  8. Selected examples of practical approaches for the assessment of model reliability - parameter uncertainty analysis

    International Nuclear Information System (INIS)

    Hofer, E.; Hoffman, F.O.

    1987-02-01

The uncertainty analysis of model predictions has to discriminate between two fundamentally different types of uncertainty. The presence of stochastic variability (Type 1 uncertainty) necessitates the use of a probabilistic model instead of the much simpler deterministic one. Lack of knowledge (Type 2 uncertainty), however, applies to deterministic as well as to probabilistic model predictions and often dominates over uncertainties of Type 1. The term "probability" is interpreted differently in the probabilistic analysis of either type of uncertainty. After these distinctions have been explained, the discussion centers on the propagation of parameter uncertainties through the model, the derivation of quantitative uncertainty statements for model predictions, and the presentation and interpretation of the results of a Type 2 uncertainty analysis. Various alternative approaches are compared for a very simple deterministic model.
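The separation of the two uncertainty types is often handled with a nested (two-loop) Monte Carlo, which the following sketch illustrates; the exponential model and the parameter range are assumptions for illustration, not taken from the report.

```python
import numpy as np

rng = np.random.default_rng(7)

# Outer loop: Type 2 (lack-of-knowledge) uncertainty about a rate parameter.
# Inner loop: Type 1 (stochastic) variability of individual outcomes.
n_outer, n_inner = 2000, 1000
means = []
for _ in range(n_outer):
    lam = rng.uniform(0.5, 2.0)                                # uncertain parameter
    outcomes = rng.exponential(scale=1.0 / lam, size=n_inner)  # stochastic variability
    means.append(outcomes.mean())

# The spread of the inner-loop means expresses Type 2 uncertainty in the
# (Type 1) probabilistic model prediction.
lo, hi = np.percentile(means, [5, 95])
print(round(lo, 2), round(hi, 2))
```

The inner loop averages away Type 1 variability, so the remaining spread across outer-loop replicates isolates the knowledge uncertainty, matching the report's point that the two types must not be mixed in a single distribution.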

  9. Parameters-related uncertainty in modeling sugar cane yield with an agro-Land Surface Model

    Science.gov (United States)

    Valade, A.; Ciais, P.; Vuichard, N.; Viovy, N.; Ruget, F.; Gabrielle, B.

    2012-12-01

Agro-Land Surface Models (agro-LSM) have been developed from the coupling of specific crop models and large-scale generic vegetation models. They aim at accounting for the spatial distribution and variability of energy, water and carbon fluxes within the soil-vegetation-atmosphere continuum, with a particular emphasis on how crop phenology and agricultural management practices influence the turbulent fluxes exchanged with the atmosphere, and the underlying water and carbon pools. A part of the uncertainty in these models is related to the many parameters included in the models' equations. In this study, we quantify the parameter-based uncertainty in the simulation of sugar cane biomass production with the agro-LSM ORCHIDEE-STICS using a multi-regional approach with data from sites in Australia, La Reunion and Brazil. First, the main source of uncertainty for the output variables NPP, GPP, and sensible heat flux (SH) is determined through a screening of the main parameters of the model on a multi-site basis, leading to the selection of a subset of the most sensitive parameters causing most of the uncertainty. In a second step, a sensitivity analysis is carried out on the parameters selected from the screening analysis at a regional scale. For this, a Monte-Carlo sampling method associated with the calculation of Partial Ranked Correlation Coefficients is used. First, we quantify the sensitivity of the output variables to individual input parameters on a regional scale for two regions of intensive sugar cane cultivation in Australia and Brazil. Then, we quantify the overall uncertainty in the simulation's outputs propagated from the uncertainty in the input parameters. Seven parameters are identified by the screening procedure as driving most of the uncertainty in the agro-LSM ORCHIDEE-STICS model output at all sites. These parameters control photosynthesis (optimal temperature of photosynthesis, optimal carboxylation rate), radiation interception (extinction coefficient), root
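A Partial Ranked Correlation Coefficient of the kind used in the second step can be computed as below. The three-parameter toy model is invented; this is a generic PRCC sketch (rank-transform, then partial correlation via regression residuals), not the ORCHIDEE-STICS analysis itself.

```python
import numpy as np

def prcc(X, y, j):
    """Partial rank correlation of parameter column j with output y,
    controlling for the remaining parameter columns."""
    rank = lambda v: np.argsort(np.argsort(v)).astype(float)  # rank transform
    Xr = np.column_stack([rank(c) for c in X.T])
    yr = rank(y)
    others = np.delete(Xr, j, axis=1)
    A = np.column_stack([np.ones(len(yr)), others])
    # Residuals after regressing out the other parameters' ranks
    res_x = Xr[:, j] - A @ np.linalg.lstsq(A, Xr[:, j], rcond=None)[0]
    res_y = yr - A @ np.linalg.lstsq(A, yr, rcond=None)[0]
    return np.corrcoef(res_x, res_y)[0, 1]

# Invented three-parameter Monte Carlo sample and toy model output:
# parameter 0 dominates, parameter 1 matters a little, parameter 2 not at all.
rng = np.random.default_rng(5)
X = rng.uniform(size=(1000, 3))
y = 5.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.1, size=1000)

r = [prcc(X, y, j) for j in range(3)]
print([round(v, 2) for v in r])
```

The rank transform makes the measure robust to monotone nonlinearity, which is why PRCC is a common companion to Monte Carlo screening of crop-model parameters.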

  10. Model uncertainty in financial markets : Long run risk and parameter uncertainty

    NARCIS (Netherlands)

    de Roode, F.A.

    2014-01-01

Uncertainty surrounding key parameters of financial markets, such as the inflation and equity risk premium, constitutes a major risk for institutional investors with long investment horizons. Hedging the investors’ inflation exposure can be challenging due to the lack of domestic inflation-linked

  11. Characterizing parameter sensitivity and uncertainty for a snow model across hydroclimatic regimes

    Science.gov (United States)

    He, Minxue; Hogue, Terri S.; Franz, Kristie J.; Margulis, Steven A.; Vrugt, Jasper A.

    2011-01-01

The National Weather Service (NWS) uses the SNOW17 model to forecast snow accumulation and ablation processes in snow-dominated watersheds nationwide. Successful application of the SNOW17 relies heavily on site-specific estimation of model parameters. The current study undertakes a comprehensive sensitivity and uncertainty analysis of SNOW17 model parameters using forcing and snow water equivalent (SWE) data from 12 sites with differing meteorological and geographic characteristics. The Generalized Sensitivity Analysis and the recently developed Differential Evolution Adaptive Metropolis (DREAM) algorithm are utilized to explore the parameter space and assess model parametric and predictive uncertainty. Results indicate that SNOW17 parameter sensitivity and uncertainty generally vary between sites. Of the six hydroclimatic characteristics studied, only air temperature shows strong correlation with the sensitivity and uncertainty ranges of two parameters, while precipitation is highly correlated with the uncertainty of one parameter. Posterior marginal distributions of two parameters are also shown to be site-dependent in terms of distribution type. The SNOW17 prediction ensembles generated by the DREAM-derived posterior parameter sets contain most of the observed SWE. The proposed uncertainty analysis provides information on posterior parameter uncertainty and distribution types that can serve as a foundation for a data assimilation framework for hydrologic models.

  12. Statistical approach for uncertainty quantification of experimental modal model parameters

    DEFF Research Database (Denmark)

    Luczak, M.; Peeters, B.; Kahsin, M.

    2014-01-01

    estimates obtained from vibration experiments. Modal testing results are influenced by numerous factors introducing uncertainty to the measurement results. Different experimental techniques applied to the same test item or testing numerous nominally identical specimens yields different test results...

  13. Propagation of Uncertainty in System Parameters of a LWR Model by Sampling MCNPX Calculations - Burnup Analysis

    Science.gov (United States)

    Campolina, Daniel de A. M.; Lima, Claubia P. B.; Veloso, Maria Auxiliadora F.

    2014-06-01

For all the physical components that comprise a nuclear system there is an uncertainty. Assessing the impact of uncertainties in the simulation of fissionable material systems is essential for a best-estimate calculation, which has been replacing conservative model calculations as computational power increases. The propagation of uncertainty in a simulation using a Monte Carlo code by sampling the input parameters is recent because of the huge computational effort required. In this work a sample space of MCNPX calculations was used to propagate the uncertainty. The sample size was optimized using the Wilks formula for a 95th percentile and a two-sided statistical tolerance interval of 95%. Uncertainties in the input parameters of the reactor considered included geometry dimensions and densities. The results demonstrated the capability of the sampling-based method for burnup calculations when the sample size is optimized and many parameter uncertainties are investigated together in the same input.

  14. Propagation of uncertainty in system parameters of a LWR model by sampling MCNPX calculations - Burnup analysis

    International Nuclear Information System (INIS)

    Campolina, D. de A. M.; Lima, C.P.B.; Veloso, M.A.F.

    2013-01-01

For all the physical components that comprise a nuclear system there is an uncertainty. Assessing the impact of uncertainties in the simulation of fissionable material systems is essential for a best-estimate calculation, which has been replacing conservative model calculations as computational power increases. The propagation of uncertainty in a simulation using a Monte Carlo code by sampling the input parameters is recent because of the huge computational effort required. In this work a sample space of MCNPX calculations was used to propagate the uncertainty. The sample size was optimized using the Wilks formula for a 95th percentile and a two-sided statistical tolerance interval of 95%. Uncertainties in the input parameters of the reactor considered included geometry dimensions and densities. The results demonstrated the capability of the sampling-based method for burnup calculations when the sample size is optimized and many parameter uncertainties are investigated together in the same input. In particular, it was shown that during burnup the variance obtained when all parameter uncertainties are considered together is equivalent to the sum of the variances obtained when the parameter uncertainties are sampled separately.
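The Wilks sample-size optimization used in both versions of this record can be reproduced directly. The first-order two-sided formula below (confidence that the sample minimum and maximum bracket at least a fraction γ of the output distribution) is the standard one; only the search loop is added here.

```python
def two_sided_confidence(n, gamma=0.95):
    """First-order Wilks: confidence that (min, max) of n runs covers at
    least a fraction gamma of the output distribution (two-sided)."""
    return 1.0 - gamma**n - n * (1.0 - gamma) * gamma**(n - 1)

# Smallest sample size meeting a two-sided 95%/95% tolerance requirement
n = 1
while two_sided_confidence(n) < 0.95:
    n += 1
print(n)  # → 93
```

A key practical point, echoed in the abstract, is that this sample size is independent of how many uncertain input parameters are varied simultaneously.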

  15. Model structural uncertainty quantification and hydrologic parameter and prediction error analysis using airborne electromagnetic data

    DEFF Research Database (Denmark)

    Minsley, B. J.; Christensen, Nikolaj Kruse; Christensen, Steen

    electromagnetic (AEM) data. Our estimates of model structural uncertainty follow a Bayesian framework that accounts for both the uncertainties in geophysical parameter estimates given AEM data, and the uncertainties in the relationship between lithology and geophysical parameters. Using geostatistical sequential......Model structure, or the spatial arrangement of subsurface lithological units, is fundamental to the hydrological behavior of Earth systems. Knowledge of geological model structure is critically important in order to make informed hydrological predictions and management decisions. Model structure...... is never perfectly known, however, and incorrect assumptions can be a significant source of error when making model predictions. We describe a systematic approach for quantifying model structural uncertainty that is based on the integration of sparse borehole observations and large-scale airborne...

  16. Uncertainty of Modal Parameters Estimated by ARMA Models

    DEFF Research Database (Denmark)

    Jensen, Jakob Laigaard; Brincker, Rune; Rytter, Anders

    by a simulation study of a lightly damped single degree of freedom system. Identification by ARMA models has been chosen as system identification method. It is concluded that both the sampling interval and number of sampled points may play a significant role with respect to the statistical errors. Furthermore...

  17. Uncertainty of Modal Parameters Estimated by ARMA Models

    DEFF Research Database (Denmark)

    Jensen, Jacob Laigaard; Brincker, Rune; Rytter, Anders

    1990-01-01

by a simulation study of a lightly damped single degree of freedom system. Identification by ARMA models has been chosen as system identification method. It is concluded that both the sampling interval and number of sampled points may play a significant role with respect to the statistical errors. Furthermore...

  18. Land Building Models: Uncertainty in and Sensitivity to Input Parameters

    Science.gov (United States)

    2013-08-01

Vicksburg, MS: US Army Engineer Research and Development Center. An electronic copy of this CHETN is available from http://chl.erdc.usace.army.mil/chetn...Nourishment Module, Chapter 8. In Coastal Louisiana Ecosystem Assessment and Restoration (CLEAR) Model of Louisiana Coastal Area (LCA) Comprehensive

  19. Parameter estimation and uncertainty assessment in hydrological modelling

    DEFF Research Database (Denmark)

    Blasone, Roberta-Serena

Rational and efficient water resources management requires insight into and an understanding of the hydrological processes, as well as accurate estimates of the available water volumes in both surface water and groundwater reservoirs. For that purpose, hydrological models are an indispensable tool. Over the most recent 1...

  20. Uncertainty from synergistic effects of multiple parameters in the Johnson and Ettinger (1991) vapor intrusion model

    Science.gov (United States)

    Tillman, Fred D.; Weaver, James W.

Migration of volatile chemicals from the subsurface into overlying buildings is known as vapor intrusion (VI). Under certain circumstances, people living in homes above contaminated soil or ground water may be exposed to harmful levels of these vapors. VI is a particularly difficult pathway to assess, as challenges exist in delineating subsurface contributions to measured indoor-air concentrations as well as in adequate characterization of subsurface parameters necessary to calibrate a predictive flow and transport model. Often, a screening-level model is employed to determine if a potential indoor inhalation exposure pathway exists and, if such a pathway is complete, whether long-term exposure increases the occupants' risk for cancer or other toxic effects to an unacceptable level. A popular screening-level algorithm currently in wide use in the United States, Canada and the UK for making such determinations is the "Johnson and Ettinger" (J&E) model. Concern exists over using the J&E model for deciding whether or not further action is necessary at sites, as many parameters are not routinely measured (or are unmeasurable). Many screening decisions are then made based on simulations using "best estimate" look-up parameter values. While research exists on the sensitivity of the J&E model to individual parameter uncertainty, little published information is available on the combined effects of multiple uncertain parameters and their effect on screening decisions. This paper presents results of multiple-parameter uncertainty analyses using the J&E model to evaluate risk to humans from VI. Software was developed to produce automated uncertainty analyses of the model. Results indicate an increase in predicted cancer risk from multiple-parameter uncertainty by nearly a factor of 10 compared with single-parameter uncertainty. Additionally, a positive skew in model response to variation of some parameters was noted for both single and multiple parameter uncertainty analyses.
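The amplification from single- to multiple-parameter uncertainty can be illustrated with a toy multiplicative screening model. The parameter names, base values, lognormal spreads and model form below are invented for illustration and are not the actual J&E formulation.

```python
import numpy as np

rng = np.random.default_rng(11)
n = 20_000

# Toy multiplicative screening model: risk ~ concentration x attenuation x exposure.
def risk(conc, alpha, exposure):
    return conc * alpha * exposure

base_conc, base_alpha, base_expo = 1.0, 1e-4, 1.0

# Single-parameter uncertainty: vary only the attenuation factor alpha
alpha = base_alpha * rng.lognormal(0.0, 0.5, n)
single = risk(base_conc, alpha, base_expo)

# Multiple-parameter uncertainty: vary all three parameters together
conc = base_conc * rng.lognormal(0.0, 0.5, n)
expo = base_expo * rng.lognormal(0.0, 0.5, n)
multi = risk(conc, alpha, expo)

ratio = np.percentile(multi, 95) / np.percentile(single, 95)
print(round(ratio, 2))   # upper-tail risk grows when all parameters vary
```

Because the toy model is multiplicative in lognormal factors, the combined spread grows with the root-sum-square of the individual spreads, so upper-percentile risk estimates widen substantially when all parameters vary together, consistent with the paper's qualitative finding.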

  1. Accounting for methodological, structural, and parameter uncertainty in decision-analytic models: a practical guide.

    Science.gov (United States)

    Bilcke, Joke; Beutels, Philippe; Brisson, Marc; Jit, Mark

    2011-01-01

Accounting for uncertainty is now a standard part of decision-analytic modeling and is recommended by many health technology agencies and published guidelines. However, the scope of such analyses is often limited, even though techniques have been developed for presenting the effects of methodological, structural, and parameter uncertainty on model results. To help bring these techniques into mainstream use, the authors present a step-by-step guide that offers an integrated approach to account for different kinds of uncertainty in the same model, along with a checklist for assessing the way in which uncertainty has been incorporated. The guide also addresses special situations, such as when a source of uncertainty is difficult to parameterize, resources are limited for an ideal exploration of uncertainty, or evidence to inform the model is not available or not reliable. Methods for identifying the sources of uncertainty that most influence results are also described. Besides guiding analysts, the guide and checklist may be useful to decision makers who need to assess how well uncertainty has been accounted for in a decision-analytic model before using the results to make a decision.

  2. Estimating parameter and predictive uncertainty when model residuals are correlated, heteroscedastic, and non-Gaussian

    Science.gov (United States)

    Schoups, Gerrit; Vrugt, Jasper A.

    2010-05-01

Estimation of parameter and predictive uncertainty of hydrologic models usually relies on the assumption of additive residual errors that are independent and identically distributed according to a normal distribution with a mean of zero and a constant variance. Here, we investigate to what extent estimates of parameter and predictive uncertainty are affected when these assumptions are relaxed. Parameter and predictive uncertainty are estimated by Markov chain Monte Carlo sampling from a generalized likelihood function that accounts for correlation, heteroscedasticity, and non-normality of residual errors. Application to rainfall-runoff modeling using daily data from a humid basin reveals that: (i) residual errors are much better described by a heteroscedastic, first-order auto-correlated error model with a Laplacian density characterized by heavier tails than a Gaussian density, and (ii) proper representation of the statistical distribution of residual errors yields tighter predictive uncertainty bands and more physically realistic parameter estimates that are less sensitive to the particular time period used for inference. The latter is especially useful for regionalization and extrapolation of parameter values to ungauged basins. Application to daily rainfall-runoff data from a semi-arid basin shows that allowing skew in the error distribution yields improved estimates of predictive uncertainty when flows are close to zero.
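A generalized likelihood of the kind described can be sketched as follows. The AR(1)-plus-heteroscedastic-Laplace structure follows the abstract, but the specific parameterization (scale linear in the simulated flow) and the synthetic data are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def gen_log_likelihood(obs, sim, phi, sigma0, sigma1):
    """Generalized log-likelihood sketch: residuals are filtered with an AR(1)
    coefficient phi, and the innovations are scored under a Laplace density whose
    scale sigma0 + sigma1*sim grows with the simulated flow (heteroscedastic)."""
    res = obs - sim
    eta = res[1:] - phi * res[:-1]        # AR(1)-filtered innovations
    b = sigma0 + sigma1 * sim[1:]         # heteroscedastic Laplace scale
    return np.sum(-np.log(2.0 * b) - np.abs(eta) / b)

rng = np.random.default_rng(2)
sim = np.abs(rng.normal(5.0, 2.0, 200))           # hypothetical simulated flows
obs = sim + rng.laplace(scale=0.1 + 0.05 * sim)   # synthetic heavy-tailed errors

ll_good = gen_log_likelihood(obs, sim, phi=0.0, sigma0=0.1, sigma1=0.05)
ll_bad = gen_log_likelihood(obs, sim, phi=0.0, sigma0=1.0, sigma1=0.5)
print(ll_good > ll_bad)   # the matching error model scores higher
```

In an MCMC calibration, phi, sigma0 and sigma1 would be sampled jointly with the hydrologic parameters, letting the data decide how correlated and heavy-tailed the residuals really are.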

  3. Assessment of structural model and parameter uncertainty with a multi-model system for soil water balance models

    Science.gov (United States)

    Michalik, Thomas; Multsch, Sebastian; Frede, Hans-Georg; Breuer, Lutz

    2016-04-01

Water for agriculture is strongly limited in arid and semi-arid regions and often of low quality in terms of salinity. The application of saline waters for irrigation increases the salt load in the rooting zone and has to be managed by leaching to maintain a healthy soil, i.e. to wash out salts by additional irrigation. Dynamic simulation models are helpful tools to calculate the root zone water fluxes and soil salinity content in order to investigate best management practices. However, there is little information on structural and parameter uncertainty for simulations regarding the water and salt balance of saline irrigation. Hence, we established a multi-model system with four different models (AquaCrop, RZWQM, SWAP, Hydrus1D/UNSATCHEM) to analyze the structural and parameter uncertainty by using the Generalized Likelihood Uncertainty Estimation (GLUE) method. Hydrus1D/UNSATCHEM and SWAP were set up with multiple sets of different implemented functions (e.g. matric and osmotic stress for root water uptake), which results in a broad range of different model structures. The simulations were evaluated against soil water and salinity content observations. The posterior distribution of the GLUE analysis gives behavioral parameter sets and reveals uncertainty intervals for each parameter. Throughout all of the model sets, most parameters accounting for the soil water balance show low uncertainty; only one or two out of five to six parameters in each model set display high uncertainty (e.g. the pore-size distribution index in SWAP and Hydrus1D/UNSATCHEM). The differences between the models and model setups reveal the structural uncertainty. The highest structural uncertainty is observed for deep percolation fluxes between the model sets of Hydrus1D/UNSATCHEM (~200 mm) and RZWQM (~500 mm), which are more than twice as high for the latter.
The model sets show a high variation in uncertainty intervals for deep percolation as well, with an interquartile range (IQR) of
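The GLUE procedure underlying this analysis can be sketched on a toy model. The exponential "simulator", the informal likelihood measure, and the behavioral threshold below are all illustrative choices standing in for the study's soil water models, not the study itself.

```python
import numpy as np

rng = np.random.default_rng(4)

def model(theta, t):
    # Toy exponential-decay model standing in for the soil water simulators
    return np.exp(-theta * t)

t = np.linspace(0.0, 5.0, 20)
obs = model(0.7, t) + rng.normal(scale=0.02, size=t.size)  # synthetic observations

# GLUE: Monte Carlo sample the parameter, keep "behavioral" sets whose
# informal likelihood measure exceeds a subjective threshold.
thetas = rng.uniform(0.1, 2.0, 5000)
sims = np.array([model(th, t) for th in thetas])
sse = ((sims - obs) ** 2).sum(axis=1)
lik = 1.0 - sse / sse.max()          # informal likelihood in [0, 1]
behavioral = lik > 0.99              # threshold is a subjective GLUE choice

# Posterior-like parameter interval and prediction band from behavioral sets
th_lo, th_hi = np.percentile(thetas[behavioral], [5, 95])
band_lo, band_hi = np.percentile(sims[behavioral], [5, 95], axis=0)
print(int(behavioral.sum()), round(th_lo, 2), round(th_hi, 2))
```

Both the likelihood measure and the threshold are analyst choices in GLUE, which is exactly why the study reports uncertainty intervals per model structure rather than a single formal posterior.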

  4. A novel approach to parameter uncertainty analysis of hydrological models using neural networks

    Directory of Open Access Journals (Sweden)

    D. P. Solomatine

    2009-07-01

In this study, a methodology has been developed to emulate a time-consuming Monte Carlo (MC) simulation by using an Artificial Neural Network (ANN) for the assessment of model parametric uncertainty. First, an MC simulation of a given process model is run. Then an ANN is trained to approximate the functional relationships between the input variables of the process model and the synthetic uncertainty descriptors estimated from the MC realizations. The trained ANN model encapsulates the underlying characteristics of the parameter uncertainty and can be used to predict uncertainty descriptors for new data vectors. This approach was validated by comparing the uncertainty descriptors in the verification data set with those obtained by the MC simulation. The method is applied to estimate the parameter uncertainty of a lumped conceptual hydrological model, HBV, for the Brue catchment in the United Kingdom. The results are quite promising, as the prediction intervals estimated by the ANN are reasonably accurate. The proposed techniques could be useful in real-time applications when it is not practicable to run a large number of simulations for complex hydrological models and when the forecast lead time is very short.

  5. Modelling pesticide leaching under climate change: parameter vs. climate input uncertainty

    Directory of Open Access Journals (Sweden)

    K. Steffens

    2014-02-01

    Full Text Available Assessing climate change impacts on pesticide leaching requires careful consideration of different sources of uncertainty. We investigated the uncertainty related to climate scenario input and its importance relative to parameter uncertainty of the pesticide leaching model. The pesticide fate model MACRO was calibrated against a comprehensive one-year field data set for a well-structured clay soil in south-western Sweden. We obtained an ensemble of 56 acceptable parameter sets that represented the parameter uncertainty. Nine different climate model projections of the regional climate model RCA3 were available, as driven by different combinations of global climate models (GCMs), greenhouse gas emission scenarios and initial states of the GCM. The future time series of weather data used to drive the MACRO model were generated by scaling a reference climate data set (1970–1999) for an important agricultural production area in south-western Sweden, based on monthly change factors for 2070–2099. Thirty-year simulations were performed for different combinations of pesticide properties and application seasons. Our analysis showed that both the magnitude and the direction of predicted change in pesticide leaching from present to future depended strongly on the particular climate scenario. The effect of parameter uncertainty was of major importance for simulating absolute pesticide losses, whereas the climate uncertainty was relatively more important for predictions of changes of pesticide losses from present to future. The climate uncertainty should be accounted for by applying an ensemble of different climate scenarios. The aggregated ensemble prediction based on both acceptable parameterizations and different climate scenarios has the potential to provide robust probabilistic estimates of future pesticide losses.

  6. Identifying the effects of parameter uncertainty on the reliability of riverbank stability modelling

    Science.gov (United States)

    Samadi, A.; Amiri-Tokaldany, E.; Darby, S. E.

    2009-05-01

    Bank retreat is a key process in fluvial dynamics affecting a wide range of physical, ecological and socioeconomic issues in the fluvial environment. To predict the undesirable effects of bank retreat and to inform effective measures to prevent it, a wide range of bank stability models has been presented in the literature. These models typically express bank stability by defining a factor of safety as the ratio of the driving and resisting forces acting on the incipient failure block. These forces are affected by a range of controlling factors that include the bank profile (bank height and angle), the geotechnical properties of the bank materials, and the hydrological status of the riverbanks. In this paper we evaluate the extent to which uncertainties in the parameterization of these controlling factors feed through to influence the reliability of the resulting bank stability estimate. This is achieved by employing a simple model of riverbank stability with respect to planar failure (the most common type of bank stability model) in a series of sensitivity tests and Monte Carlo analyses to identify, for each model parameter, the range of values that induces significant changes in the simulated factor of safety. These identified parameter value ranges are compared to empirically derived parameter uncertainties to determine whether they are likely to confound the reliability of the resulting bank stability calculations. Our results show that parameter uncertainties are typically high enough that the likelihood of generating unreliable predictions is very high (> ~80% for predictions requiring a precision of < ±15%). Because parameter uncertainties derive primarily from the natural variability of the parameters, rather than from measurement errors, much more careful attention should be paid to field sampling strategies, such that the parameter uncertainties and consequent prediction unreliabilities can be quantified more
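The Monte Carlo part of such an analysis amounts to propagating parameter distributions through a factor-of-safety calculation and counting how often the bank is predicted to fail. A toy sketch for planar failure, in which the geometry, the parameter distributions and the simplified force balance are all illustrative assumptions (real formulations also include pore-water and hydrostatic confining pressures):

```python
import math
import random

def factor_of_safety(c, phi_deg, gamma, H, beta_deg):
    """Simplified planar-failure factor of safety: ratio of resisting to
    driving forces for a bank of height H failing along a plane at angle
    beta. Illustrative only."""
    beta = math.radians(beta_deg)
    phi = math.radians(phi_deg)
    W = 0.5 * gamma * H * H / math.tan(beta)   # weight of failure block (per unit width)
    L = H / math.sin(beta)                     # length of the failure plane
    resisting = c * L + W * math.cos(beta) * math.tan(phi)
    driving = W * math.sin(beta)
    return resisting / driving

random.seed(42)
# propagate parameter uncertainty (natural variability) by Monte Carlo
n = 10000
failures = 0
for _ in range(n):
    c = random.gauss(12.0, 2.0)       # cohesion, kPa (hypothetical distribution)
    phi = random.gauss(25.0, 4.0)     # friction angle, degrees
    gamma = random.gauss(18.0, 1.0)   # unit weight, kN/m3
    fs = factor_of_safety(max(c, 0.1), phi, gamma, H=4.0, beta_deg=50.0)
    failures += fs < 1.0
print(f"P(failure) ~ {failures / n:.2f}")
```

Comparing such failure probabilities across plausible parameter ranges is what reveals whether natural variability alone can overwhelm the precision of the stability prediction.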

  7. Parameter uncertainty in CGE Modeling of the environmental impacts of economic policies

    Energy Technology Data Exchange (ETDEWEB)

    Abler, D.G.; Shortle, J.S. [Agricultural Economics, Pennsylvania State University, University Park, PA (United States); Rodriguez, A.G. [University of Costa Rica, San Jose (Costa Rica)

    1999-07-01

    This study explores the role of parameter uncertainty in Computable General Equilibrium (CGE) modeling of the environmental impacts of macroeconomic and sectoral policies, using Costa Rica as a case study. A CGE model is constructed which includes eight environmental indicators covering deforestation, pesticides, overfishing, hazardous wastes, inorganic wastes, organic wastes, greenhouse gases, and air pollution. The parameters are treated as random variables drawn from prespecified distributions. Evaluation of each policy option consists of a Monte Carlo experiment. The impacts of the policy options on the environmental indicators are relatively robust to different parameter values, in spite of the wide range of parameter values employed. 33 refs.

  8. Parameter uncertainty in CGE Modeling of the environmental impacts of economic policies

    International Nuclear Information System (INIS)

    Abler, D.G.; Shortle, J.S.; Rodriguez, A.G.

    1999-01-01

    This study explores the role of parameter uncertainty in Computable General Equilibrium (CGE) modeling of the environmental impacts of macroeconomic and sectoral policies, using Costa Rica as a case study. A CGE model is constructed which includes eight environmental indicators covering deforestation, pesticides, overfishing, hazardous wastes, inorganic wastes, organic wastes, greenhouse gases, and air pollution. The parameters are treated as random variables drawn from prespecified distributions. Evaluation of each policy option consists of a Monte Carlo experiment. The impacts of the policy options on the environmental indicators are relatively robust to different parameter values, in spite of the wide range of parameter values employed. 33 refs

  9. Parameter sensitivity and uncertainty analysis for a storm surge and wave model

    Directory of Open Access Journals (Sweden)

    L. A. Bastidas

    2016-09-01

    Full Text Available Development and simulation of synthetic hurricane tracks is a common methodology used to estimate hurricane hazards in the absence of empirical coastal surge and wave observations. Such methods typically rely on numerical models to translate stochastically generated hurricane wind and pressure forcing into coastal surge and wave estimates. The model output uncertainty associated with selection of appropriate model parameters must therefore be addressed. The computational overburden of probabilistic surge hazard estimates is exacerbated by the high dimensionality of numerical surge and wave models. We present a model parameter sensitivity analysis of the Delft3D model for the simulation of hazards posed by Hurricane Bob (1991), utilizing three theoretical wind distributions (NWS23, modified Rankine, and Holland). The sensitive model parameters (of 11 total considered) include wind drag, the depth-induced breaking γB, and the bottom roughness. Several parameters show no sensitivity (threshold depth, eddy viscosity, wave triad parameters, and depth-induced breaking αB) and can therefore be excluded to reduce the computational overburden of probabilistic surge hazard estimates. The sensitive model parameters also demonstrate a large number of interactions between parameters and a nonlinear model response. While model outputs showed sensitivity to several parameters, the ability of these parameters to act as tuning parameters for calibration is somewhat limited, as proper model calibration is strongly reliant on accurate wind and pressure forcing data. A comparison of the model performance with forcings from the different wind models is also presented.

  10. Global Sensitivity Analysis for Identifying Important Parameters of Nitrogen Nitrification and Denitrification under Model and Scenario Uncertainties

    Science.gov (United States)

    Ye, M.; Chen, Z.; Shi, L.; Zhu, Y.; Yang, J.

    2017-12-01

    Nitrogen reactive transport modeling is subject to uncertainty in model parameters, structures, and scenarios. While global sensitivity analysis is a vital tool for identifying the parameters important to nitrogen reactive transport, conventional global sensitivity analysis considers only parametric uncertainty. This may result in inaccurate selection of important parameters, because parameter importance may vary under different models and modeling scenarios. By using a recently developed variance-based global sensitivity analysis method, this paper identifies important parameters with simultaneous consideration of parametric uncertainty, model uncertainty, and scenario uncertainty. In a numerical example of nitrogen reactive transport modeling, a combination of three scenarios of soil temperature and two scenarios of soil moisture leads to a total of six scenarios. Four alternative models are used to evaluate the reduction functions used for calculating actual rates of nitrification and denitrification. The model uncertainty is entangled with the scenario uncertainty, as the reduction functions depend on soil temperature and moisture content. The results of the sensitivity analysis show that parameter importance varies substantially between different models and modeling scenarios, which may lead to inaccurate selection of important parameters if model and scenario uncertainties are not considered. This problem is avoided by using the new method of sensitivity analysis in the context of model averaging and scenario averaging. The new method of sensitivity analysis can be applied to other problems of contaminant transport modeling when model uncertainty and/or scenario uncertainty are present.
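The core idea of averaging sensitivity indices over models and scenarios can be illustrated with a deliberately simple linear model, for which first-order variance-based indices have a closed form. The model/scenario combinations, their coefficients and their weights below are all hypothetical:

```python
# For a linear toy model Y = a*x1 + b*x2 with independent x1, x2 ~ N(0, 1),
# the first-order index of x_i is coef_i^2 / (a^2 + b^2). The coefficients
# differ between alternative model structures and scenarios, so parameter
# importance does too.

models = {                 # hypothetical (model, scenario) combinations
    ("M1", "warm"): (3.0, 1.0),
    ("M1", "cool"): (1.0, 2.0),
    ("M2", "warm"): (0.5, 2.5),
    ("M2", "cool"): (2.0, 2.0),
}
weights = {k: 0.25 for k in models}   # equal model/scenario probabilities

def first_order(a, b):
    tot = a * a + b * b
    return a * a / tot, b * b / tot

# per-combination indices: the importance ranking flips between combinations
per_combo = {k: first_order(*coefs) for k, coefs in models.items()}

# model- and scenario-averaged indices (weighted by the probabilities)
s1 = sum(weights[k] * per_combo[k][0] for k in models)
s2 = sum(weights[k] * per_combo[k][1] for k in models)
print(f"averaged S1={s1:.2f}, S2={s2:.2f}")
```

Ranking parameters on any single combination here would be misleading (x1 dominates under one combination and is nearly irrelevant under another); the weighted average is what the model/scenario-averaging strategy in the abstract formalizes.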

  11. Are subject-specific musculoskeletal models robust to the uncertainties in parameter identification?

    Science.gov (United States)

    Valente, Giordano; Pitto, Lorenzo; Testi, Debora; Seth, Ajay; Delp, Scott L; Stagni, Rita; Viceconti, Marco; Taddei, Fulvia

    2014-01-01

    Subject-specific musculoskeletal modeling can be applied to study musculoskeletal disorders, allowing inclusion of personalized anatomy and properties. Independent of the tools used for model creation, there are unavoidable uncertainties associated with parameter identification, whose effect on model predictions is still not fully understood. The aim of the present study was to analyze the sensitivity of subject-specific model predictions (i.e., joint angles, joint moments, muscle and joint contact forces) during walking to the uncertainties in the identification of body landmark positions, maximum muscle tension and musculotendon geometry. To this aim, we created an MRI-based musculoskeletal model of the lower limbs, defined as a 7-segment, 10-degree-of-freedom articulated linkage, actuated by 84 musculotendon units. We then performed a Monte Carlo probabilistic analysis, perturbing the model parameters according to their uncertainty and solving a typical inverse dynamics and static optimization problem using 500 models that included the different sets of perturbed variable values. Model creation and gait simulations were performed using freely available software that we developed to standardize the process of model creation, integrate with OpenSim and create probabilistic simulations of movement. The uncertainties in input variables had a moderate effect on model predictions, as muscle and joint contact forces showed a maximum standard deviation of 0.3 times body weight and a maximum range of 2.1 times body weight. In addition, the output variables significantly correlated with only a few input variables (up to 7 out of 312) across the gait cycle, including the geometry definition of the larger muscles and the maximum muscle tension in limited portions of the gait cycle. Although we found that the subject-specific models were not markedly sensitive to parameter identification, researchers should be aware of the model precision in relation to the intended application. In fact, force predictions could be

  12. Are subject-specific musculoskeletal models robust to the uncertainties in parameter identification?

    Directory of Open Access Journals (Sweden)

    Giordano Valente

    Full Text Available Subject-specific musculoskeletal modeling can be applied to study musculoskeletal disorders, allowing inclusion of personalized anatomy and properties. Independent of the tools used for model creation, there are unavoidable uncertainties associated with parameter identification, whose effect on model predictions is still not fully understood. The aim of the present study was to analyze the sensitivity of subject-specific model predictions (i.e., joint angles, joint moments, muscle and joint contact forces) during walking to the uncertainties in the identification of body landmark positions, maximum muscle tension and musculotendon geometry. To this aim, we created an MRI-based musculoskeletal model of the lower limbs, defined as a 7-segment, 10-degree-of-freedom articulated linkage, actuated by 84 musculotendon units. We then performed a Monte Carlo probabilistic analysis, perturbing the model parameters according to their uncertainty and solving a typical inverse dynamics and static optimization problem using 500 models that included the different sets of perturbed variable values. Model creation and gait simulations were performed using freely available software that we developed to standardize the process of model creation, integrate with OpenSim and create probabilistic simulations of movement. The uncertainties in input variables had a moderate effect on model predictions, as muscle and joint contact forces showed a maximum standard deviation of 0.3 times body weight and a maximum range of 2.1 times body weight. In addition, the output variables significantly correlated with only a few input variables (up to 7 out of 312) across the gait cycle, including the geometry definition of the larger muscles and the maximum muscle tension in limited portions of the gait cycle. Although we found that the subject-specific models were not markedly sensitive to parameter identification, researchers should be aware of the model precision in relation to the intended application. In fact, force

  13. Parameter estimation and uncertainty quantification in a biogeochemical model using optimal experimental design methods

    Science.gov (United States)

    Reimer, Joscha; Piwonski, Jaroslaw; Slawig, Thomas

    2016-04-01

    The statistical significance of any model-data comparison strongly depends on the quality of the data used and on the criterion used to measure the model-to-data misfit. The statistical properties of the data (such as mean values, variances and covariances) should be taken into account by choosing a suitable criterion, e.g., ordinary, weighted or generalized least squares. Moreover, the criterion can be restricted to regions or model quantities of special interest. This choice influences the quality of the model output (also for quantities that are not measured) and the results of a parameter estimation or optimization process. We have estimated the parameters of a three-dimensional, time-dependent marine biogeochemical model describing the phosphorus cycle in the ocean. For this purpose, we have developed a statistical model for measurements of phosphate and dissolved organic phosphorus. This statistical model includes variances and correlations varying with the time and location of the measurements. We compared the resulting estimates of model output and parameters for the different criteria. A further question is whether (and which) additional measurements would increase the model's quality at all. Using experimental design criteria, the information content of measurements can be quantified. This may refer to the uncertainty in unknown model parameters as well as the uncertainty regarding which model is closer to reality. By (another) optimization, optimal measurement properties such as locations, time instants and quantities to be measured can be identified. We have optimized such properties for additional measurements for the parameter estimation of the marine biogeochemical model. For this purpose, we have quantified the uncertainty in the optimal model parameters and in the model output itself with respect to the uncertainty in the measurement data, using the (Fisher) information matrix. 
Furthermore, we have calculated the uncertainty reduction by additional measurements depending on time
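The Fisher-information route to quantifying parameter uncertainty can be sketched for a linear two-parameter model y(t) = p1 + p2*t with independent Gaussian noise: the information matrix is F = J^T J / sigma^2 (J the sensitivity matrix), the approximate parameter covariance is F^{-1}, and candidate extra measurements add information and shrink the parameter standard deviations. The model, noise level and measurement times below are illustrative assumptions:

```python
import math

def fisher(times, sigma):
    # F = J^T J / sigma^2 for y = p1 + p2*t, where each row of J is (1, t_i)
    f11 = len(times) / sigma**2
    f12 = sum(times) / sigma**2
    f22 = sum(t * t for t in times) / sigma**2
    return [[f11, f12], [f12, f22]]

def cov(F):
    # inverse of a symmetric 2x2 matrix: the approximate parameter covariance
    det = F[0][0] * F[1][1] - F[0][1] ** 2
    return [[F[1][1] / det, -F[0][1] / det],
            [-F[0][1] / det, F[0][0] / det]]

base = [0.0, 1.0, 2.0, 3.0]
extra = base + [4.0, 5.0]           # candidate additional measurement times
results = []
for times in (base, extra):
    C = cov(fisher(times, sigma=0.5))
    results.append((math.sqrt(C[0][0]), math.sqrt(C[1][1])))
    print(f"n={len(times)}: std(p1)={results[-1][0]:.3f}, "
          f"std(p2)={results[-1][1]:.3f}")
```

Optimal experimental design then amounts to searching over candidate measurement sets for the one that shrinks a chosen scalar summary of F^{-1} (e.g. its determinant or trace) the most.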

  14. Estimating Parameter Uncertainty in Binding-Energy Models by the Frequency-Domain Bootstrap

    Science.gov (United States)

    Bertsch, G. F.; Bingham, Derek

    2017-12-01

    We propose using the frequency-domain bootstrap (FDB) to estimate errors of modeling parameters when the modeling error is itself a major source of uncertainty. Unlike the usual bootstrap or the simple χ2 analysis, the FDB can take into account correlations between errors. It is also very fast compared to the Gaussian process Bayesian estimate often implemented for computer model calibration. The method is illustrated with a simple example, the liquid drop model of nuclear binding energies. We find that the FDB gives a more conservative estimate of the uncertainty in liquid drop parameters than the χ2 method, and is in fair accord with more empirical estimates. For the nuclear physics application, there are no apparent obstacles to applying the method to the more accurate and detailed models based on density-functional theory.
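For readers unfamiliar with bootstrap-based parameter errors, the plain time-domain residual bootstrap of a linear fit is easy to sketch; note that the FDB itself resamples residuals in the frequency domain precisely to preserve their correlations, which this simplified stdlib version (with a hypothetical straight-line model and synthetic data) does not do:

```python
import random
import statistics

def fit(xs, ys):
    # ordinary least squares for y = a + b*x
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

random.seed(3)
xs = [float(i) for i in range(30)]
ys = [2.0 + 0.5 * x + random.gauss(0, 1.0) for x in xs]   # true a=2.0, b=0.5

a0, b0 = fit(xs, ys)
resid = [y - (a0 + b0 * x) for x, y in zip(xs, ys)]

# residual bootstrap: resample residuals, rebuild synthetic data, refit
boots = []
for _ in range(500):
    ys_b = [a0 + b0 * x + random.choice(resid) for x in xs]
    boots.append(fit(xs, ys_b))
sa = statistics.stdev(a for a, _ in boots)
sb = statistics.stdev(b for _, b in boots)
print(f"a={a0:.2f}±{sa:.2f}, b={b0:.3f}±{sb:.3f}")
```

The FDB replaces the `random.choice(resid)` step: residuals are Fourier-transformed, resampled frequency-wise, and transformed back, so that correlated model-error structure survives the resampling.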

  15. Parameter estimation techniques and uncertainty in ground water flow model predictions

    International Nuclear Information System (INIS)

    Zimmerman, D.A.; Davis, P.A.

    1990-01-01

    Quantification of uncertainty in predictions of nuclear waste repository performance is a requirement of Nuclear Regulatory Commission regulations governing the licensing of proposed geologic repositories for high-level radioactive waste disposal. One of the major uncertainties in these predictions is in estimating the ground-water travel time of radionuclides migrating from the repository to the accessible environment. The cause of much of this uncertainty has been attributed to a lack of knowledge about the hydrogeologic properties that control the movement of radionuclides through the aquifers. A major reason for this lack of knowledge is the paucity of data that is typically available for characterizing complex ground-water flow systems. Because of this, considerable effort has been put into developing parameter estimation techniques that infer property values in regions where no measurements exist. Currently, no single technique has been shown to be superior or even consistently conservative with respect to predictions of ground-water travel time. This work was undertaken to compare a number of parameter estimation techniques and to evaluate how differences in the parameter estimates and the estimation errors are reflected in the behavior of the flow model predictions. That is, we wished to determine to what degree uncertainties in flow model predictions may be affected simply by the choice of parameter estimation technique used. 3 refs., 2 figs

  16. The sensitivity of flowline models of tidewater glaciers to parameter uncertainty

    Directory of Open Access Journals (Sweden)

    E. M. Enderlin

    2013-10-01

    Full Text Available Depth-integrated (1-D) flowline models have been widely used to simulate fast-flowing tidewater glaciers and predict change, because the continuous grounding line tracking, high horizontal resolution, and physically based calving criterion that are essential to realistic modeling of tidewater glaciers can easily be incorporated into the models while maintaining high computational efficiency. As with all models, the values of parameters describing ice rheology and basal friction must be assumed and/or tuned based on observations. For prognostic studies, these parameters are typically tuned so that the glacier matches observed thickness and speeds at an initial state, to which a perturbation is applied. While it is well known that ice flow models are sensitive to these parameters, the sensitivity of tidewater glacier models has not been systematically investigated. Here we investigate the sensitivity of such flowline models of outlet glacier dynamics to uncertainty in three key parameters that influence a glacier's resistive stress components. We find that, within typical observational uncertainty, similar initial (i.e., steady-state) glacier configurations can be produced with substantially different combinations of parameter values, leading to differing transient responses after a perturbation is applied. In cases where the glacier is initially grounded near flotation across a basal over-deepening, as typically observed for rapidly changing glaciers, these differences can be dramatic owing to the threshold of stability imposed by the flotation criterion. The simulated transient response is particularly sensitive to the parameterization of ice rheology: differences in ice temperature of ~2 °C can determine whether the glaciers thin to flotation and retreat unstably or remain grounded on a marine shoal. Due to the highly non-linear dependence of tidewater glaciers on model parameters, we recommend that their predictions are accompanied by

  17. Effects of correlated parameters and uncertainty in electronic-structure-based chemical kinetic modelling

    Science.gov (United States)

    Sutton, Jonathan E.; Guo, Wei; Katsoulakis, Markos A.; Vlachos, Dionisios G.

    2016-04-01

    Kinetic models based on first principles are becoming common place in heterogeneous catalysis because of their ability to interpret experimental data, identify the rate-controlling step, guide experiments and predict novel materials. To overcome the tremendous computational cost of estimating parameters of complex networks on metal catalysts, approximate quantum mechanical calculations are employed that render models potentially inaccurate. Here, by introducing correlative global sensitivity analysis and uncertainty quantification, we show that neglecting correlations in the energies of species and reactions can lead to an incorrect identification of influential parameters and key reaction intermediates and reactions. We rationalize why models often underpredict reaction rates and show that, despite the uncertainty being large, the method can, in conjunction with experimental data, identify influential missing reaction pathways and provide insights into the catalyst active site and the kinetic reliability of a model. The method is demonstrated in ethanol steam reforming for hydrogen production for fuel cells.

  18. Improving weather predictability by including land-surface model parameter uncertainty

    Science.gov (United States)

    Orth, Rene; Dutra, Emanuel; Pappenberger, Florian

    2016-04-01

    The land surface forms an important component of Earth system models and interacts nonlinearly with other parts such as the ocean and atmosphere. To capture the complex and heterogeneous hydrology of the land surface, land surface models include a large number of parameters impacting the coupling to other components of the Earth system model. Focusing on ECMWF's land-surface model HTESSEL, we present in this study a comprehensive parameter sensitivity evaluation using multiple observational datasets in Europe. We select six poorly constrained effective parameters (surface runoff effective depth, skin conductivity, minimum stomatal resistance, maximum interception, soil moisture stress function shape, total soil depth) and explore the sensitivity of model outputs such as soil moisture, evapotranspiration and runoff to these parameters, using uncoupled simulations and coupled seasonal forecasts. Additionally, we investigate the possibility of constructing ensembles from the multiple land surface parameters. In the uncoupled runs we find that minimum stomatal resistance and total soil depth have the most influence on model performance. Forecast skill scores are moreover sensitive to the same parameters as HTESSEL performance in the uncoupled analysis. We demonstrate the robustness of our findings by comparing multiple best-performing parameter sets and multiple randomly chosen parameter sets. We find better temperature and precipitation forecast skill with the best-performing parameter perturbations, demonstrating representativeness of model performance across uncoupled (and hence less computationally demanding) and coupled settings. Finally, we construct ensemble forecasts from ensemble members derived with different best-performing parameterizations of HTESSEL. This incorporation of parameter uncertainty in the ensemble generation yields an increase in forecast skill, even beyond the skill of the default system.

  19. MOESHA: A genetic algorithm for automatic calibration and estimation of parameter uncertainty and sensitivity of hydrologic models

    Science.gov (United States)

    Characterization of uncertainty and sensitivity of model parameters is an essential and often overlooked facet of hydrological modeling. This paper introduces an algorithm called MOESHA that combines input parameter sensitivity analyses with a genetic algorithm calibration routin...

  20. Bayesian Analysis Diagnostics: Diagnosing Predictive and Parameter Uncertainty for Hydrological Models

    Science.gov (United States)

    Thyer, Mark; Kavetski, Dmitri; Evin, Guillaume; Kuczera, George; Renard, Ben; McInerney, David

    2015-04-01

    All scientific and statistical analysis, particularly in the natural sciences, is based on approximations and assumptions. For example, the calibration of hydrological models using approaches such as the Nash-Sutcliffe efficiency and/or simple least squares (SLS) objective functions may appear to be 'assumption-free'. However, this is a naïve point of view, as SLS assumes that the model residuals (residuals = observed - predicted) are independent, homoscedastic and Gaussian. If these assumptions are poor, parameter inference and model predictions will be correspondingly poor. An essential step in model development is therefore to verify the assumptions and approximations made in the modeling process. Diagnostics play a key role in verifying modeling assumptions. An important advantage of the formal Bayesian approach is that the modeler is required to make the assumptions explicit. Specialized diagnostics can then be developed and applied to test and verify these assumptions. This paper presents a suite of statistical and modeling diagnostics that can be used by environmental modelers to test their calibration assumptions and diagnose model deficiencies. Three major types of diagnostics are presented. (1) Residual diagnostics: used to test whether the assumptions of the residual error model within the likelihood function are compatible with the data, including testing for statistical independence, homoscedasticity, unbiasedness, Gaussianity and any distributional assumptions. (2) Parameter uncertainty and MCMC diagnostics: an important part of Bayesian analysis is assessing parameter uncertainty, and Markov chain Monte Carlo (MCMC) methods are a powerful numerical tool for estimating these uncertainties. Diagnostics based on posterior parameter distributions can be used to assess parameter identifiability, interactions and correlations, providing a very useful tool for detecting and remedying model deficiencies. 
In addition, numerical diagnostics are

  1. Variations in environmental tritium doses due to meteorological data averaging and uncertainties in pathway model parameters

    International Nuclear Information System (INIS)

    Kock, A.

    1996-05-01

    The objectives of this research are: (1) to calculate and compare off site doses from atmospheric tritium releases at the Savannah River Site using monthly versus 5 year meteorological data and annual source terms, including additional seasonal and site specific parameters not included in present annual assessments; and (2) to calculate the range of the above dose estimates based on distributions in model parameters given by uncertainty estimates found in the literature. Consideration will be given to the sensitivity of parameters given in former studies

  2. Variations in environmental tritium doses due to meteorological data averaging and uncertainties in pathway model parameters

    Energy Technology Data Exchange (ETDEWEB)

    Kock, A.

    1996-05-01

    The objectives of this research are: (1) to calculate and compare off site doses from atmospheric tritium releases at the Savannah River Site using monthly versus 5 year meteorological data and annual source terms, including additional seasonal and site specific parameters not included in present annual assessments; and (2) to calculate the range of the above dose estimates based on distributions in model parameters given by uncertainty estimates found in the literature. Consideration will be given to the sensitivity of parameters given in former studies.

  3. Accounting for environmental variability, modeling errors, and parameter estimation uncertainties in structural identification

    Science.gov (United States)

    Behmanesh, Iman; Moaveni, Babak

    2016-07-01

    This paper presents a Hierarchical Bayesian model updating framework to account for the effects of ambient temperature and excitation amplitude. The proposed approach is applied for model calibration, response prediction and damage identification of a footbridge under changing environmental/ambient conditions. The concrete Young's modulus of the footbridge deck is the updating structural parameter considered, with its mean and variance modeled as functions of temperature and excitation amplitude. The modal parameters identified over 27 months of continuous monitoring of the footbridge are used to calibrate the updating parameters. One of the objectives of this study is to show that by increasing the level of information in the updating process, the posterior variation of the updating structural parameter (concrete Young's modulus) is reduced. To this end, the calibration is performed at three information levels using (1) the identified modal parameters, (2) modal parameters and ambient temperatures, and (3) modal parameters, ambient temperatures, and excitation amplitudes. The calibrated model is then validated by comparing the model-predicted natural frequencies with those identified from measured data after a deliberate change to the structural mass. It is shown that accounting for modeling error uncertainties is crucial for reliable response prediction, and that accounting for only the estimated variability of the updating structural parameter is not sufficient for accurate response predictions. Finally, the calibrated model is used for damage identification of the footbridge.

  4. Model-based verification method for solving the parameter uncertainty in the train control system

    International Nuclear Information System (INIS)

    Cheng, Ruijun; Zhou, Jin; Chen, Dewang; Song, Yongduan

    2016-01-01

    This paper presents a parameter analysis method to solve the parameter uncertainty problem for hybrid systems and to explore the correlation of key parameters in distributed control systems. To improve the reusability of the control model, the proposed approach provides support for obtaining the constraint sets of all uncertain parameters in the abstract linear hybrid automata (LHA) model while satisfying the safety requirements of the train control system. Then, in order to address the state space explosion problem, an online verification method is proposed to monitor the operating status of high-speed trains online, given the real-time nature of the train control system. Furthermore, we construct LHA formal models of the train tracking model and the movement authority (MA) generation process as cases to illustrate the effectiveness and efficiency of the proposed method. In the first case, we obtain the constraint sets of uncertain parameters to avoid collision between trains. In the second case, the correlation of the position report cycle and the MA generation cycle is analyzed under both normal and abnormal conditions influenced by the packet-loss factor. Finally, considering the stochastic characterization of time distributions and the real-time features of the moving block control system, the transient probabilities of the wireless communication process are obtained by stochastic time Petri nets. - Highlights: • We solve the parameter uncertainty problem by using a model-based method. • We acquire the parameter constraint sets by verifying linear hybrid automata models. • Online verification algorithms are designed to monitor high-speed trains. • We analyze the correlation between key parameters and non-critical parameters. • The transient probabilities are obtained by using reliability analysis.

  5. Assessing the relative importance of parameter and forcing uncertainty and their interactions in conceptual hydrological model simulations

    Science.gov (United States)

    Mockler, E. M.; Chun, K. P.; Sapriza-Azuri, G.; Bruen, M.; Wheater, H. S.

    2016-11-01

    Predictions of river flow dynamics provide vital information for many aspects of water management including water resource planning, climate adaptation, and flood and drought assessments. Many of the subjective choices that modellers make, including model and criteria selection, can have a significant impact on the magnitude and distribution of the output uncertainty. Hydrological modellers are tasked with understanding and minimising the uncertainty surrounding streamflow predictions before communicating the overall uncertainty to decision makers. Parameter uncertainty in conceptual rainfall-runoff models has been widely investigated, and model structural uncertainty and forcing data have been receiving increasing attention. This study aimed to assess uncertainties in streamflow predictions due to forcing data and the identification of behavioural parameter sets in 31 Irish catchments. By combining stochastic rainfall ensembles and multiple parameter sets for three conceptual rainfall-runoff models, an analysis of variance model was used to decompose the total uncertainty in streamflow simulations into contributions from (i) forcing data, (ii) identification of model parameters and (iii) interactions between the two. The analysis illustrates that, for our subjective choices, hydrological model selection had a greater contribution to overall uncertainty, while performance criteria selection influenced the relative intra-annual uncertainties in streamflow predictions. Uncertainties in streamflow predictions due to the method of determining parameters were relatively lower for wetter catchments, and more evenly distributed throughout the year when the Nash-Sutcliffe Efficiency of logarithmic values of flow (lnNSE) was the evaluation criterion.
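The variance decomposition described in this record can be illustrated with a minimal two-way ANOVA sketch (a toy ensemble, not the study's data or models): for a full factorial of forcing members and parameter sets with one simulation per cell, the total sum of squares splits exactly into forcing, parameter, and interaction components.

```python
import numpy as np

def anova_decompose(sim):
    """Decompose the variance of a (n_forcing x n_params) matrix of simulated
    flows into forcing, parameter, and interaction sums of squares."""
    grand = sim.mean()
    row_means = sim.mean(axis=1)   # one mean per forcing ensemble member
    col_means = sim.mean(axis=0)   # one mean per behavioural parameter set
    nf, npar = sim.shape
    ss_forcing = npar * np.sum((row_means - grand) ** 2)
    ss_params = nf * np.sum((col_means - grand) ** 2)
    ss_total = np.sum((sim - grand) ** 2)
    ss_inter = ss_total - ss_forcing - ss_params   # remainder = interaction
    return ss_forcing, ss_params, ss_inter, ss_total

rng = np.random.default_rng(0)
sim = rng.gamma(2.0, 5.0, size=(20, 30))  # toy streamflow ensemble
sf, sp, si, st = anova_decompose(sim)
```

The three components sum to the total by construction, which is what lets each source's share of uncertainty be reported as a fraction.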

  6. Uncertainty analysis of Multiple Greek Letter parameters in common cause failure model

    International Nuclear Information System (INIS)

    Jordan Cizelj, R.; Vrbanic, I.

    2003-01-01

    As a rule, common cause failures have a high influence on the results of Probabilistic Safety Assessments. In the paper, an uncertainty analysis for the parameters of the Multiple Greek Letter common-cause model, arising from the stochastic nature of events, is presented. Results of Bayesian analysis and maximum likelihood analysis are compared and interpreted. Special emphasis is given to the assessment of the Bayesian inclusion of generic knowledge, since it may bias the results conservatively. (author)
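For readers unfamiliar with the Multiple Greek Letter parameterization, a small sketch of the standard mapping (following the common NUREG/CR-5485-style convention; the numeric values here are illustrative, not from the paper) from the total failure probability and the MGL parameters (beta, gamma, ...) to the per-event probabilities Q_k of a common-cause event involving exactly k of m components:

```python
from math import comb, prod

def mgl_probs(qt, rho):
    """Q_k = (1/C(m-1, k-1)) * (prod_{i<=k} rho_i) * (1 - rho_{k+1}) * qt,
    with rho_1 = 1 and rho_{m+1} = 0; rho = [beta, gamma, ...], m = len(rho)+1."""
    m = len(rho) + 1
    r = [1.0] + list(rho) + [0.0]
    return [prod(r[:k]) * (1.0 - r[k]) / comb(m - 1, k - 1) * qt
            for k in range(1, m + 1)]

# m = 3 components, total failure probability 1e-3, beta = 0.1, gamma = 0.2
q = mgl_probs(qt=1e-3, rho=[0.1, 0.2])
```

A useful sanity check is that the binomial-weighted sum of the Q_k recovers the total failure probability qt.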

  7. Estimation and impact assessment of input and parameter uncertainty in predicting groundwater flow with a fully distributed model

    Science.gov (United States)

    Touhidul Mustafa, Syed Md.; Nossent, Jiri; Ghysels, Gert; Huysmans, Marijke

    2017-04-01

    Transient numerical groundwater flow models have been used to understand and forecast groundwater flow systems under anthropogenic and climatic effects, but the reliability of the predictions is strongly influenced by different sources of uncertainty. Hence, researchers in hydrological sciences are developing and applying methods for uncertainty quantification. Nevertheless, spatially distributed flow models pose significant challenges for parameter and spatially distributed input estimation and uncertainty quantification. In this study, we present a general and flexible approach for input and parameter estimation and uncertainty analysis of groundwater models. The proposed approach combines a fully distributed groundwater flow model (MODFLOW) with the DiffeRential Evolution Adaptive Metropolis (DREAM) algorithm. To avoid over-parameterization, the uncertainty of the spatially distributed model input has been represented by multipliers. The posterior distributions of these multipliers and the regular model parameters were estimated using DREAM. The proposed methodology has been applied in an overexploited aquifer in Bangladesh where groundwater pumping and recharge data are highly uncertain. The results confirm that input uncertainty does have a considerable effect on the model predictions and parameter distributions. Additionally, our approach also provides a new way to optimize the spatially distributed recharge and pumping data along with the parameter values under uncertain input conditions. It can be concluded from our approach that considering model input uncertainty along with parameter uncertainty is important for obtaining realistic model predictions and a correct estimation of the uncertainty bounds.
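The multiplier idea in this record can be sketched with a toy random-walk Metropolis sampler (a single-chain stand-in for DREAM; the linear "groundwater" model, prior bounds, and noise level are all invented for illustration): a single recharge multiplier scales a vector of base stresses, and its posterior is sampled from noisy synthetic observations.

```python
import numpy as np

rng = np.random.default_rng(42)
base = np.array([1.0, 1.2, 0.8, 1.5])      # base recharge stresses (toy)
true_mult = 1.3
obs = true_mult * base + rng.normal(0, 0.05, base.size)  # synthetic heads

def log_post(m, sigma=0.05):
    """Gaussian likelihood with a uniform prior on the multiplier."""
    if not 0.1 <= m <= 10.0:
        return -np.inf
    resid = obs - m * base
    return -0.5 * np.sum(resid ** 2) / sigma ** 2

chain = np.empty(5000)
m, lp = 1.0, log_post(1.0)
for i in range(chain.size):
    prop = m + rng.normal(0, 0.1)          # random-walk proposal
    lpp = log_post(prop)
    if np.log(rng.uniform()) < lpp - lp:   # Metropolis acceptance
        m, lp = prop, lpp
    chain[i] = m
post = chain[1000:]                        # discard burn-in
```

DREAM itself runs multiple interacting chains with differential-evolution proposals; this sketch only shows the shared core idea of sampling a multiplier's posterior rather than a full spatial field.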

  8. Evaluation of risk impact of changes to Completion Times addressing model and parameter uncertainties

    International Nuclear Information System (INIS)

    Martorell, S.; Martón, I.; Villamizar, M.; Sánchez, A.I.; Carlos, S.

    2014-01-01

    This paper presents an approach and an example of application for the evaluation of risk impact of changes to Completion Times within the License Basis of a Nuclear Power Plant based on the use of the Probabilistic Risk Assessment addressing identification, treatment and analysis of uncertainties in an integrated manner. It allows full development of a three tired approach (Tier 1–3) following the principles of the risk-informed decision-making accounting for uncertainties as proposed by many regulators. Completion Time is the maximum outage time a safety related equipment is allowed to be down, e.g. for corrective maintenance, which is established within the Limiting Conditions for Operation included into Technical Specifications for operation of a Nuclear Power Plant. The case study focuses on a Completion Time change of the Accumulators System of a Nuclear Power Plant using a level 1 PRA. It focuses on several sources of model and parameter uncertainties. The results obtained show the risk impact of the proposed CT change including both types of epistemic uncertainties is small as compared with current safety goals of concern to Tier 1. However, what concerns to Tier 2 and 3, the results obtained show how the use of some traditional and uncertainty importance measures helps in identifying high risky configurations that should be avoided in NPP technical specifications no matter the duration of CT (Tier 2), and other configurations that could take part of a configuration risk management program (Tier 3). - Highlights: • New approach for evaluation of risk impact of changes to Completion Times. • Integrated treatment and analysis of model and parameter uncertainties. • PSA based application to support risk-informed decision-making. • Measures of importance for identification of risky configurations. • Management of important safety issues to accomplish safety goals

  9. Bayesian Assessment of the Uncertainties of Estimates of a Conceptual Rainfall-Runoff Model Parameters

    Science.gov (United States)

    Silva, F. E. O. E.; Naghettini, M. D. C.; Fernandes, W.

    2014-12-01

    This paper evaluated the uncertainties associated with the estimation of the parameters of a conceptual rainfall-runoff model, through the use of Bayesian inference techniques by Monte Carlo simulation. The Pará River sub-basin, located in the upper São Francisco river basin in southeastern Brazil, was selected as the study area. We used the Rio Grande conceptual hydrologic model (EHR/UFMG, 2001) and the Markov Chain Monte Carlo simulation method named DREAM (VRUGT, 2008a). Two probabilistic models for the residuals were analyzed: (i) the classic [Normal likelihood - r ≈ N (0, σ²)]; and (ii) a generalized likelihood (SCHOUPS & VRUGT, 2010), in which it is assumed that the differences between observed and simulated flows are correlated, non-stationary, and distributed as a Skew Exponential Power density. The assumptions made for both models were checked to ensure that the estimation of uncertainties in the parameters was not biased. The results showed that the Bayesian approach proved adequate for the proposed objectives and reinforced the importance of assessing the uncertainties associated with hydrological modeling.

  10. Approaches to highly parameterized inversion: A guide to using PEST for model-parameter and predictive-uncertainty analysis

    Science.gov (United States)

    Doherty, John E.; Hunt, Randall J.; Tonkin, Matthew J.

    2010-01-01

    Analysis of the uncertainty associated with parameters used by a numerical model, and with predictions that depend on those parameters, is fundamental to the use of modeling in support of decisionmaking. Unfortunately, predictive uncertainty analysis with regard to models can be very computationally demanding, due in part to complex constraints on parameters that arise from expert knowledge of system properties on the one hand (knowledge constraints) and from the necessity for the model parameters to assume values that allow the model to reproduce historical system behavior on the other hand (calibration constraints). Enforcement of knowledge and calibration constraints on parameters used by a model does not eliminate the uncertainty in those parameters. In fact, in many cases, enforcement of calibration constraints simply reduces the uncertainties associated with a number of broad-scale combinations of model parameters that collectively describe spatially averaged system properties. The uncertainties associated with other combinations of parameters, especially those that pertain to small-scale parameter heterogeneity, may not be reduced through the calibration process. To the extent that a prediction depends on system-property detail, its postcalibration variability may be reduced very little, if at all, by applying calibration constraints; knowledge constraints remain the only limits on the variability of predictions that depend on such detail. Regrettably, in many common modeling applications, these constraints are weak. Though the PEST software suite was initially developed as a tool for model calibration, recent developments have focused on the evaluation of model-parameter and predictive uncertainty. As a complement to functionality that it provides for highly parameterized inversion (calibration) by means of formal mathematical regularization techniques, the PEST suite provides utilities for linear and nonlinear error-variance and uncertainty analysis in
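The interplay of knowledge and calibration constraints described in this record has a compact linear-Gaussian expression (the basis of PEST's linear error-variance utilities, though the matrices below are toy values, not PEST output): calibration adds information to the prior, so posterior parameter variances can only shrink, and combinations of parameters to which the observations are insensitive shrink very little.

```python
import numpy as np

# Prior parameter covariance (knowledge constraints) and a toy Jacobian
# relating 3 parameters to 2 calibration observations (assumed values).
C_prior = np.diag([1.0, 1.0, 4.0])
J = np.array([[1.0, 0.5, 0.0],
              [0.0, 1.0, 0.1]])
R = 0.1 * np.eye(2)                        # observation noise covariance

# Bayes/Schur update: information matrices add, so the posterior
# covariance is never larger than the prior in any direction.
C_post = np.linalg.inv(np.linalg.inv(C_prior) + J.T @ np.linalg.inv(R) @ J)
```

Here the third parameter, barely sensed by the observations (Jacobian column [0, 0.1]), keeps most of its prior variance: exactly the "weak knowledge constraints" situation the abstract warns about.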

  11. Modeling a production scale milk drying process: parameter estimation, uncertainty and sensitivity analysis

    DEFF Research Database (Denmark)

    Ferrari, A.; Gutierrez, S.; Sin, Gürkan

    2016-01-01

    A steady state model for a production scale milk drying process was built to help process understanding and optimization studies. It involves a spray chamber and also internal/external fluid beds. The model was subjected to a comprehensive statistical analysis for quality assurance using...... sensitivity analysis of inputs/parameters, and uncertainty analysis to estimate confidence intervals on parameters and model predictions (error propagation). Variance based sensitivity analysis (Sobol's method) was used to quantify the influence of inputs on the final powder moisture as the model output...... at chamber inlet air (variation > 100%). The sensitivity analysis results suggest exploring improvements in the current control (Proportional Integral Derivative) for moisture content at concentrate chamber feed in order to reduce the output variance. It is also confirmed that humidity control at chamber...

  12. A Bayesian-based multilevel factorial analysis method for analyzing parameter uncertainty of hydrological model

    Science.gov (United States)

    Liu, Y. R.; Li, Y. P.; Huang, G. H.; Zhang, J. L.; Fan, Y. R.

    2017-10-01

    In this study, a Bayesian-based multilevel factorial analysis (BMFA) method is developed to assess parameter uncertainties and their effects on hydrological model responses. In BMFA, the Differential Evolution Adaptive Metropolis (DREAM) algorithm is employed to approximate the posterior distributions of model parameters with Bayesian inference; the factorial analysis (FA) technique is used for measuring the specific variations of hydrological responses in terms of posterior distributions to investigate the individual and interactive effects of parameters on model outputs. BMFA is then applied to a case study of the Jinghe River watershed in the Loess Plateau of China to display its validity and applicability. The uncertainties of four sensitive parameters, including the soil conservation service runoff curve number for moisture condition II (CN2), soil hydraulic conductivity (SOL_K), plant available water capacity (SOL_AWC), and soil depth (SOL_Z), are investigated. Results reveal that (i) CN2 has a positive effect on peak flow, implying that the concentrated rainfall during the rainy season can cause infiltration-excess surface flow, which is a considerable contributor to peak flow in this watershed; (ii) SOL_K has a positive effect on average flow, implying that the widely distributed cambisols can lead to medium percolation capacity; (iii) the interaction between SOL_AWC and SOL_Z has a noticeable effect on peak flow, and their effects are dependent upon each other, which discloses that soil depth can significantly influence the processes of plant uptake of soil water in this watershed. Based on the above findings, the significant parameters and the relationships among uncertain parameters can be specified, such that the hydrological model's capability for simulating/predicting water resources of the Jinghe River watershed can be improved.

  13. Cosmological Parameter Uncertainties from SALT-II Type Ia Supernova Light Curve Models

    Energy Technology Data Exchange (ETDEWEB)

    Mosher, J. [Pennsylvania U.; Guy, J. [LBL, Berkeley; Kessler, R. [Chicago U., KICP; Astier, P. [Paris U., VI-VII; Marriner, J. [Fermilab; Betoule, M. [Paris U., VI-VII; Sako, M. [Pennsylvania U.; El-Hage, P. [Paris U., VI-VII; Biswas, R. [Argonne; Pain, R. [Paris U., VI-VII; Kuhlmann, S. [Argonne; Regnault, N. [Paris U., VI-VII; Frieman, J. A. [Fermilab; Schneider, D. P. [Penn State U.

    2014-08-29

    We use simulated type Ia supernova (SN Ia) samples, including both photometry and spectra, to perform the first direct validation of cosmology analysis using the SALT-II light curve model. This validation includes residuals from the light curve training process, systematic biases in SN Ia distance measurements, and a bias on the dark energy equation of state parameter w. Using the SN-analysis package SNANA, we simulate and analyze realistic samples corresponding to the data samples used in the SNLS3 analysis: ~120 low-redshift (z < 0.1) SNe Ia, ~255 Sloan Digital Sky Survey SNe Ia (z < 0.4), and ~290 SNLS SNe Ia (z ≤ 1). To probe systematic uncertainties in detail, we vary the input spectral model, the model of intrinsic scatter, and the smoothing (i.e., regularization) parameters used during the SALT-II model training. Using realistic intrinsic scatter models results in a slight bias in the ultraviolet portion of the trained SALT-II model, and w biases (w (input) – w (recovered)) ranging from –0.005 ± 0.012 to –0.024 ± 0.010. These biases are indistinguishable from each other within the uncertainties; the average bias on w is –0.014 ± 0.007.

  14. Accounting for parameter uncertainty in the definition of parametric distributions used to describe individual patient variation in health economic models.

    Science.gov (United States)

    Degeling, Koen; IJzerman, Maarten J; Koopman, Miriam; Koffijberg, Hendrik

    2017-12-15

    Parametric distributions based on individual patient data can be used to represent both stochastic and parameter uncertainty. Although general guidance is available on how parameter uncertainty should be accounted for in probabilistic sensitivity analysis, there is no comprehensive guidance on reflecting parameter uncertainty in the (correlated) parameters of distributions used to represent stochastic uncertainty in patient-level models. This study aims to provide this guidance by proposing appropriate methods and illustrating the impact of this uncertainty on modeling outcomes. Two approaches, 1) using non-parametric bootstrapping and 2) using multivariate Normal distributions, were applied in a simulation and case study. The approaches were compared based on point-estimates and distributions of time-to-event and health economic outcomes. To assess sample size impact on the uncertainty in these outcomes, sample size was varied in the simulation study and subgroup analyses were performed for the case-study. Accounting for parameter uncertainty in distributions that reflect stochastic uncertainty substantially increased the uncertainty surrounding health economic outcomes, illustrated by larger confidence ellipses surrounding the cost-effectiveness point-estimates and different cost-effectiveness acceptability curves. Although both approaches performed similarly for larger sample sizes (i.e. n = 500), the second approach was more sensitive to extreme values for small sample sizes (i.e. n = 25), yielding infeasible modeling outcomes. Modelers should be aware that parameter uncertainty in distributions used to describe stochastic uncertainty needs to be reflected in probabilistic sensitivity analysis, as it could substantially impact the total amount of uncertainty surrounding health economic outcomes. If feasible, the bootstrap approach is recommended to account for this uncertainty.
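The recommended bootstrap approach can be sketched for the simplest possible case (an exponential time-to-event distribution and a small n = 25 sample, both chosen for illustration rather than taken from the study): resampling the patient data and refitting yields a distribution over the fitted rate, i.e. parameter uncertainty around the distribution used for stochastic uncertainty.

```python
import numpy as np

rng = np.random.default_rng(7)
times = rng.exponential(scale=2.0, size=25)   # toy patient-level event times

B = 2000
rates = np.empty(B)
for b in range(B):
    # Non-parametric bootstrap: resample patients with replacement, refit.
    resample = rng.choice(times, size=times.size, replace=True)
    rates[b] = 1.0 / resample.mean()          # MLE of the exponential rate

lo, hi = np.percentile(rates, [2.5, 97.5])    # uncertainty in the rate itself
```

In a probabilistic sensitivity analysis, one would draw a rate from `rates` in each outer iteration and then sample patient times from the corresponding exponential, so both layers of uncertainty propagate to the outcomes.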

  15. Accounting for parameter uncertainty in the definition of parametric distributions used to describe individual patient variation in health economic models

    Directory of Open Access Journals (Sweden)

    Koen Degeling

    2017-12-01

    Abstract Background Parametric distributions based on individual patient data can be used to represent both stochastic and parameter uncertainty. Although general guidance is available on how parameter uncertainty should be accounted for in probabilistic sensitivity analysis, there is no comprehensive guidance on reflecting parameter uncertainty in the (correlated) parameters of distributions used to represent stochastic uncertainty in patient-level models. This study aims to provide this guidance by proposing appropriate methods and illustrating the impact of this uncertainty on modeling outcomes. Methods Two approaches, 1) using non-parametric bootstrapping and 2) using multivariate Normal distributions, were applied in a simulation and case study. The approaches were compared based on point-estimates and distributions of time-to-event and health economic outcomes. To assess sample size impact on the uncertainty in these outcomes, sample size was varied in the simulation study and subgroup analyses were performed for the case-study. Results Accounting for parameter uncertainty in distributions that reflect stochastic uncertainty substantially increased the uncertainty surrounding health economic outcomes, illustrated by larger confidence ellipses surrounding the cost-effectiveness point-estimates and different cost-effectiveness acceptability curves. Although both approaches performed similarly for larger sample sizes (i.e. n = 500), the second approach was more sensitive to extreme values for small sample sizes (i.e. n = 25), yielding infeasible modeling outcomes. Conclusions Modelers should be aware that parameter uncertainty in distributions used to describe stochastic uncertainty needs to be reflected in probabilistic sensitivity analysis, as it could substantially impact the total amount of uncertainty surrounding health economic outcomes. If feasible, the bootstrap approach is recommended to account for this uncertainty.

  16. Prediction uncertainty assessment of a systems biology model requires a sample of the full probability distribution of its parameters

    NARCIS (Netherlands)

    Mourik, van S.; Braak, ter C.J.F.; Stigter, J.D.; Molenaar, J.

    2014-01-01

    Multi-parameter models in systems biology are typically ‘sloppy’: some parameters or combinations of parameters may be hard to estimate from data, whereas others are not. One might expect that parameter uncertainty automatically leads to uncertain predictions, but this is not the case. We illustrate

  17. A comparison of bootstrap approaches for estimating uncertainty of parameters in linear mixed-effects models.

    Science.gov (United States)

    Thai, Hoai-Thu; Mentré, France; Holford, Nicholas H G; Veyrat-Follet, Christine; Comets, Emmanuelle

    2013-01-01

    A version of the nonparametric bootstrap, which resamples the entire subjects from original data, called the case bootstrap, has been increasingly used for estimating uncertainty of parameters in mixed-effects models. It is usually applied to obtain more robust estimates of the parameters and more realistic confidence intervals (CIs). Alternative bootstrap methods, such as residual bootstrap and parametric bootstrap that resample both random effects and residuals, have been proposed to better take into account the hierarchical structure of multi-level and longitudinal data. However, few studies have been performed to compare these different approaches. In this study, we used simulation to evaluate bootstrap methods proposed for linear mixed-effect models. We also compared the results obtained by maximum likelihood (ML) and restricted maximum likelihood (REML). Our simulation studies evidenced the good performance of the case bootstrap as well as the bootstraps of both random effects and residuals. On the other hand, the bootstrap methods that resample only the residuals and the bootstraps combining case and residuals performed poorly. REML and ML provided similar bootstrap estimates of uncertainty, but there was slightly more bias and poorer coverage rate for variance parameters with ML in the sparse design. We applied the proposed methods to a real dataset from a study investigating the natural evolution of Parkinson's disease and were able to confirm that the methods provide plausible estimates of uncertainty. Given that most real-life datasets tend to exhibit heterogeneity in sampling schedules, the residual bootstraps would be expected to perform better than the case bootstrap. Copyright © 2013 John Wiley & Sons, Ltd.
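The case bootstrap discussed in this record resamples whole subjects rather than individual observations, which preserves the hierarchical structure of longitudinal data. A minimal sketch (toy balanced data; real mixed-effects refitting is omitted):

```python
import numpy as np

rng = np.random.default_rng(3)
# Toy longitudinal data: one row per (subject, visit) observation.
subjects = np.repeat(np.arange(10), 4)        # 10 subjects, 4 visits each
y = rng.normal(size=subjects.size)

def case_bootstrap(subjects, y, rng):
    """Resample whole subjects with replacement, keeping each subject's
    observations together (the 'case bootstrap' for hierarchical data)."""
    ids = np.unique(subjects)
    picked = rng.choice(ids, size=ids.size, replace=True)
    rows = np.concatenate([np.where(subjects == s)[0] for s in picked])
    return subjects[rows], y[rows]

bs_subj, bs_y = case_bootstrap(subjects, y, rng)
```

In each bootstrap replicate one would refit the mixed-effects model to `(bs_subj, bs_y)` and collect the parameter estimates; the spread of those estimates is the bootstrap uncertainty. Residual and parametric bootstraps instead resample the estimated random effects and residuals.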

  18. The Effects of Uncertainty in Speed-Flow Curve Parameters on a Large-Scale Model

    DEFF Research Database (Denmark)

    Manzo, Stefano; Nielsen, Otto Anker; Prato, Carlo Giacomo

    2014-01-01

    Uncertainty is inherent in transport models and prevents the use of a deterministic approach when traffic is modeled. Quantifying uncertainty thus becomes an indispensable step to produce a more informative and reliable output of transport models. In traffic assignment models, volume-delay functi...

  19. Modelling Framework for the Identification of Critical Variables and Parameters under Uncertainty in the Bioethanol Production from Lignocellulose

    DEFF Research Database (Denmark)

    Morales Rodriguez, Ricardo; Meyer, Anne S.; Gernaey, Krist

    2011-01-01

    This study presents the development of a systematic modelling framework for identification of the most critical variables and parameters under uncertainty, evaluated on a lignocellulosic ethanol production case study. The systematic framework starts with: (1) definition of the objectives; (2...

  20. Evaluation of risk impact of changes to surveillance requirements addressing model and parameter uncertainties

    International Nuclear Information System (INIS)

    Martorell, S.; Villamizar, M.; Martón, I.; Villanueva, J.F.; Carlos, S.; Sánchez, A.I.

    2014-01-01

    This paper presents a three steps based approach for the evaluation of risk impact of changes to Surveillance Requirements based on the use of the Probabilistic Risk Assessment and addressing identification, treatment and analysis of model and parameter uncertainties in an integrated manner. The paper includes also an example of application that focuses on the evaluation of the risk impact of a Surveillance Frequency change for the Reactor Protection System of a Nuclear Power Plant using a level 1 Probabilistic Risk Assessment. Surveillance Requirements are part of Technical Specifications that are included into the Licensing Basis for operation of Nuclear Power Plants. Surveillance Requirements aim at limiting risk of undetected downtimes of safety related equipment by imposing equipment operability checks, which consist of testing of equipment operational parameters with established Surveillance Frequency and Test Strategy

  1. Rabies epidemic model with uncertainty in parameters: crisp and fuzzy approaches

    Science.gov (United States)

    Ndii, M. Z.; Amarti, Z.; Wiraningsih, E. D.; Supriatna, A. K.

    2018-03-01

    A deterministic mathematical model is formulated to investigate the transmission dynamics of rabies. In particular, we investigate the effects of vaccination, carrying capacity and the transmission rate on rabies epidemics, and allow for uncertainty in the parameters. We apply both crisp and fuzzy approaches. We find that, in the case of crisp parameters, rabies epidemics may be interrupted when the carrying capacity and the transmission rate are not high. Our findings suggest that limiting the growth of the dog population and reducing the potential contact between susceptible and infectious dogs may aid in interrupting rabies epidemics. We extend the work by considering a fuzzy carrying capacity, allowing for low, medium, and high levels of carrying capacity. The result confirms the findings obtained using a crisp carrying capacity: when the carrying capacity is not too high, vaccination can confine the disease effectively.
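The crisp-parameter finding can be reproduced qualitatively with a toy susceptible/infectious dog model (the functional form, rates, and values below are invented for illustration and are not the paper's model): with a low carrying capacity the susceptible pool stays below the epidemic threshold and infection dies out, while a high carrying capacity sustains it.

```python
import numpy as np

def simulate(K, beta, v, r=0.5, days=2000, dt=0.1):
    """Forward-Euler integration of a hypothetical SI dog-rabies model with
    logistic growth to carrying capacity K, transmission rate beta,
    vaccination rate v, and removal rate 0.2 for infectious dogs."""
    S, I = 0.9 * K, 0.01 * K
    for _ in range(int(days / dt)):
        N = S + I
        dS = r * S * (1 - N / K) - beta * S * I - v * S
        dI = beta * S * I - 0.2 * I
        S = max(S + dt * dS, 0.0)
        I = max(I + dt * dI, 0.0)
    return S, I

S_low, I_low = simulate(K=50.0, beta=0.002, v=0.1)     # low carrying capacity
S_high, I_high = simulate(K=500.0, beta=0.002, v=0.1)  # high carrying capacity
```

With these values the epidemic threshold is S > 0.2/beta = 100 dogs, so the K = 50 population can never sustain transmission; a fuzzy treatment would repeat this over membership levels of K.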

  2. Significance of uncertainties derived from settling tank model structure and parameters on predicting WWTP performance - A global sensitivity analysis study

    DEFF Research Database (Denmark)

    Ramin, Elham; Sin, Gürkan; Mikkelsen, Peter Steen

    2011-01-01

    Uncertainty derived from one of the process models – such as one-dimensional secondary settling tank (SST) models – can impact the output of the other process models, e.g., biokinetic (ASM1), as well as the integrated wastewater treatment plant (WWTP) models. The model structure and parameter...... uncertainty of settler models can therefore propagate, and add to the uncertainties in prediction of any plant performance criteria. Here we present an assessment of the relative significance of secondary settling model performance in WWTP simulations. We perform a global sensitivity analysis (GSA) based....... The outcome of this study contributes to a better understanding of uncertainty in WWTPs, and explicitly demonstrates the significance of secondary settling processes that are crucial elements of model prediction under dry and wet-weather loading conditions....

  3. Parameter optimization, sensitivity, and uncertainty analysis of an ecosystem model at a forest flux tower site in the United States

    Science.gov (United States)

    Wu, Yiping; Liu, Shuguang; Huang, Zhihong; Yan, Wende

    2014-01-01

    Ecosystem models are useful tools for understanding ecological processes and for sustainable management of resources. In the biogeochemical field, numerical models have been widely used for investigating carbon dynamics under global changes from site to regional and global scales. However, it is still challenging to optimize parameters and estimate parameterization uncertainty for complex process-based models such as the Erosion Deposition Carbon Model (EDCM), a modified version of CENTURY that considers the carbon, water, and nutrient cycles of ecosystems. This study was designed to conduct the parameter identifiability, optimization, sensitivity, and uncertainty analysis of EDCM using our developed EDCM-Auto, which incorporates the comprehensive R package Flexible Modeling Framework (FME) and the Shuffled Complex Evolution (SCE) algorithm. Using a forest flux tower site as a case study, we implemented a comprehensive modeling analysis involving nine parameters and four target variables (carbon and water fluxes) with their corresponding measurements based on the eddy covariance technique. The local sensitivity analysis shows that the plant production-related parameters (e.g., PPDF1 and PRDX) are most sensitive to the model cost function. Both SCE and FME are comparable and performed well in deriving the optimal parameter set with satisfactory simulations of target variables. Global sensitivity and uncertainty analysis indicate that the parameter uncertainty and the resulting output uncertainty can be quantified, and that the magnitude of parameter-uncertainty effects depends on variables and seasons. This study also demonstrates that using cutting-edge R functions such as FME can be feasible and attractive for conducting comprehensive parameter analysis for ecosystem modeling.
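Local sensitivity analysis of the kind this record reports can be sketched with central finite differences (the response function below is a hypothetical stand-in; PPDF1 and PRDX are parameter names from the abstract, but their roles here are invented):

```python
import numpy as np

def model(params, t):
    """Hypothetical NPP-like output driven by two production parameters."""
    ppdf1, prdx = params
    return prdx * np.exp(-((t - ppdf1) / 10.0) ** 2)

def local_sensitivity(params, t, eps=1e-6):
    """Central-difference derivative of the output with respect to each
    parameter, scaled by the parameter value (relative sensitivity)."""
    sens = []
    for i, p in enumerate(params):
        up, dn = list(params), list(params)
        up[i] = p * (1 + eps)
        dn[i] = p * (1 - eps)
        dy = (model(up, t) - model(dn, t)) / (2 * p * eps)
        sens.append(p * dy)
    return np.array(sens)

t = np.linspace(0, 30, 50)
S = local_sensitivity([15.0, 3.0], t)   # rows: PPDF1, PRDX sensitivities
```

Ranking parameters by the magnitude of such scaled sensitivities over the simulation period is the usual basis for statements like "PPDF1 and PRDX are most sensitive".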

  4. Modelling Framework for the Identification of Critical Variables and Parameters under Uncertainty in the Bioethanol Production from Lignocellulose

    DEFF Research Database (Denmark)

    Morales Rodriguez, Ricardo; Meyer, Anne S.; Gernaey, Krist

    2011-01-01

    This study presents the development of a systematic modelling framework for identification of the most critical variables and parameters under uncertainty, evaluated on a lignocellulosic ethanol production case study. The systematic framework starts with: (1) definition of the objectives; (2....... Sensitivity analysis employs the standardized regression coefficient (SRC) method, which provides a global sensitivity measure, βi, thereby showing how much each parameter contributes to the variance (uncertainty) of the model predictions. Thus, identifying the most critical parameters involved in the process......, suitable for further analysis of the bioprocess. The uncertainty and sensitivity analysis identified the following most critical variables and parameters involved in the lignocellulosic ethanol production case study. For the operating cost, the enzyme loading showed the strongest impact, while reaction...
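The standardized regression coefficient (SRC) measure named in this record can be sketched on a toy Monte Carlo sample (the two inputs are hypothetical stand-ins for process variables such as enzyme loading; this is not the study's model): each input's coefficient from a linear regression of the output on the inputs is scaled by the ratio of input to output standard deviations.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000
x1 = rng.normal(10.0, 2.0, n)        # uncertain input 1 (toy)
x2 = rng.normal(5.0, 0.5, n)         # uncertain input 2 (toy)
y = 3.0 * x1 + 1.0 * x2 + rng.normal(0.0, 0.5, n)   # near-linear response

# Least-squares fit of y on [1, x1, x2], then standardize the slopes.
X = np.column_stack([np.ones(n), x1, x2])
b = np.linalg.lstsq(X, y, rcond=None)[0]
src = b[1:] * np.array([x1.std(), x2.std()]) / y.std()
```

For a near-linear model the squared SRCs sum to roughly the coefficient of determination, so each SRC² reads directly as that input's share of the output variance.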

  5. Quantitative assessments of mantle flow models against seismic observations: Influence of uncertainties in mineralogical parameters

    Science.gov (United States)

    Schuberth, Bernhard S. A.

    2017-04-01

    synthetic traveltime data can then be compared - on statistical grounds - to the traveltime variations observed on Earth. Here, we now investigate the influence of uncertainties in the various input parameters that enter our modelling. This is especially important for the material properties at high pressure and high temperature entering the mineralogical models. In particular, this concerns uncertainties that arise from relating measurements in the laboratory to Earth properties on a global scale. As one example, we will address the question on the influence of anelasticity on the variance of global synthetic traveltime residuals. Owing to the differences in seismic frequency content between laboratory measurements (MHz to GHz) and the Earth (mHz to Hz), the seismic velocities given in the mineralogical models need to be adjusted; that is, corrected for dispersion due to anelastic effects. This correction will increase the sensitivity of the seismic velocities to temperature variations. The magnitude of this increase depends on absolute temperature, frequency, the frequency dependence of attenuation and the activation enthalpy of the dissipative process. Especially the latter two are poorly known for mantle minerals and our results indicate that variations in activation enthalpy potentially produce the largest differences in temperature sensitivity with respect to the purely elastic case. We will present new wave propagation simulations and corresponding statistical analyses of traveltime measurements for different synthetic seismic models spanning the possible range of anelastic velocity conversions (while being based on the same mantle circulation model).

  6. Evaluation of Computational Techniques for Parameter Estimation and Uncertainty Analysis of Comprehensive Watershed Models

    Science.gov (United States)

    Yen, H.; Arabi, M.; Records, R.

    2012-12-01

    The structural complexity of comprehensive watershed models continues to increase in order to incorporate inputs at finer spatial and temporal resolutions and simulate a larger number of hydrologic and water quality responses. Hence, computational methods for parameter estimation and uncertainty analysis of complex models have gained increasing popularity. This study aims to evaluate the performance and applicability of a range of algorithms, from computationally frugal approaches to formal implementations of Bayesian statistics using Markov Chain Monte Carlo (MCMC) techniques. The evaluation procedure hinges on the appraisal of (i) the quality of the final parameter solution in terms of the minimum value of the objective function corresponding to weighted errors; (ii) the algorithmic efficiency in reaching the final solution; (iii) the marginal posterior distributions of model parameters; (iv) the overall identifiability of the model structure; and (v) the effectiveness in drawing samples that can be classified as behavior-giving solutions. The proposed procedure recognizes an important and often neglected issue in watershed modeling: that solutions with minimum objective function values may not necessarily reflect the behavior of the system. The general behavior of a system is often characterized by the analysts according to the goals of the study using various error statistics such as percent bias or the Nash-Sutcliffe efficiency coefficient. Two case studies are carried out to examine the efficiency and effectiveness of four Bayesian approaches including Metropolis-Hastings sampling (MHA), Gibbs sampling (GSA), uniform covering by probabilistic rejection (UCPR), and differential evolution adaptive Metropolis (DREAM); a greedy optimization algorithm dubbed dynamically dimensioned search (DDS); and shuffled complex evolution (SCE-UA), a widely implemented evolutionary heuristic optimization algorithm. 
The Soil and Water Assessment Tool (SWAT) is used to simulate hydrologic and
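
Of the Bayesian samplers compared above, Metropolis-Hastings is the simplest to sketch: propose a random-walk step and accept it with probability given by the posterior ratio. The toy exponential-recession "model" and Gaussian likelihood below are invented stand-ins, not SWAT or the study's watershed models.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "observations" from a toy recession model q(t) = 10 * exp(-k t).
t = np.arange(20.0)
obs = 10.0 * np.exp(-0.3 * t) + 0.1 * rng.standard_normal(t.size)

def log_post(k):
    if not 0.0 < k < 2.0:                    # uniform prior on (0, 2)
        return -np.inf
    resid = obs - 10.0 * np.exp(-k * t)
    return -0.5 * np.sum(resid**2) / 0.1**2  # Gaussian likelihood, sigma = 0.1

# Random-walk Metropolis-Hastings: accept with probability min(1, ratio).
k, lp = 1.0, log_post(1.0)
chain = []
for _ in range(5000):
    k_prop = k + 0.05 * rng.standard_normal()
    lp_prop = log_post(k_prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        k, lp = k_prop, lp_prop
    chain.append(k)

post = np.array(chain[1000:])                # discard burn-in
print(post.mean(), post.std())               # marginal posterior summary
```

The marginal posterior recovered here is the kind of output used in criterion (iii) of the evaluation procedure.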

  7. Uncertainty assessment and sensitivity analysis of soil moisture based on model parameter errors - Results from four regions in China

    Science.gov (United States)

    Sun, Guodong; Peng, Fei; Mu, Mu

    2017-12-01

    Model parameter errors are an important cause of uncertainty in soil moisture simulation. In this study, a conditional nonlinear optimal perturbation related to parameter (CNOP-P) approach and a sophisticated land surface model (the Common Land Model, CoLM) are employed in four regions in China to explore the extent of uncertainty in soil moisture simulations due to model parameter errors. The CNOP-P approach facilitates calculation of the upper bounds of uncertainty due to parameter errors and investigation of the nonlinear effects of parameter combinations on uncertainties in simulation and prediction. The range of uncertainty for simulated soil moisture was found to be from 0.04 to 0.58 m3 m-3. Based on the CNOP-P approach, a new approach is applied to explore a relatively sensitive and important parameter combination for soil moisture simulations and predictions. It is found that the relatively sensitive parameter combination is region- and season-dependent. Furthermore, the results show that simulation of soil moisture could be improved if the errors in these important parameter combinations are reduced. In the four study regions, the average extent of improvement (61.6%) in simulating soil moisture using the new approach based on the CNOP-P is larger than that (53.4%) using the one-at-a-time (OAT) approach. These results indicate that simulation and prediction of soil moisture are improved by considering the nonlinear effects of important physical parameter combinations. In addition, the new approach based on the CNOP-P is found to be an effective method to discern the nonlinear effects of important physical parameter combinations on numerical simulation and prediction.
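
The CNOP-P idea, finding the parameter perturbation within a given constraint that maximizes the departure of the simulation from a reference run, can be sketched with a toy model. The two-parameter response function and the brute-force random search below are illustrative stand-ins for CoLM and a proper optimization algorithm.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy "land-surface" response: seasonal soil moisture as a nonlinear
# function of two parameters (a hypothetical stand-in for CoLM).
def model(p):
    return np.tanh(2.0 * p[0]) * np.exp(-p[1])

p_ref = np.array([0.5, 1.0])
delta = 0.2   # constraint: each parameter error bounded by +/- 0.2

# CNOP-P: the perturbation within the constraint that maximizes the
# departure from the reference simulation. A dense random search stands
# in here for a proper optimizer.
cands = rng.uniform(-delta, delta, size=(20000, 2))
costs = np.abs(np.array([model(p_ref + c) for c in cands]) - model(p_ref))
best = cands[np.argmax(costs)]
print(best, costs.max())   # upper bound on the simulation uncertainty
```

The maximal cost plays the role of the "upper bound of uncertainty due to parameter errors" in the abstract; note the optimum sits on the boundary of the constraint set, as monotone parameter effects would suggest.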

  8. Bayesian uncertainty analysis for complex systems biology models: emulation, global parameter searches and evaluation of gene functions.

    Science.gov (United States)

    Vernon, Ian; Liu, Junli; Goldstein, Michael; Rowe, James; Topping, Jen; Lindsey, Keith

    2018-01-02

    Many mathematical models have now been employed across every area of systems biology. These models increasingly involve large numbers of unknown parameters, have complex structure which can result in substantial evaluation time relative to the needs of the analysis, and need to be compared to observed data of various forms. The correct analysis of such models usually requires a global parameter search, over a high dimensional parameter space, that incorporates and respects the most important sources of uncertainty. This can be an extremely difficult task, but it is essential for any meaningful inference or prediction to be made about any biological system. It hence represents a fundamental challenge for the whole of systems biology. Bayesian statistical methodology for the uncertainty analysis of complex models is introduced, which is designed to address the high dimensional global parameter search problem. Bayesian emulators that mimic the systems biology model but which are extremely fast to evaluate are embedded within an iterative history match: an efficient method to search high dimensional spaces within a more formal statistical setting, while incorporating major sources of uncertainty. The approach is demonstrated via application to a model of hormonal crosstalk in Arabidopsis root development, which has 32 rate parameters, for which we identify the sets of rate parameter values that lead to acceptable matches between model output and observed trend data. The multiple insights into the model's structure that this analysis provides are discussed. The methodology is applied to a second related model, and the biological consequences of the resulting comparison, including the evaluation of gene functions, are described. Bayesian uncertainty analysis for complex models using both emulators and history matching is shown to be a powerful technique that can greatly aid the study of a large class of systems biology models. 
It both provides insight into model behaviour
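
The core of history matching is an implausibility measure that rules out parameter values the emulator deems inconsistent with the data. The sketch below uses a trivial one-parameter "simulator" and a nominal emulator variance in place of a trained Gaussian-process emulator; all numbers are invented.

```python
import numpy as np

# Toy one-parameter "simulator" standing in for the systems-biology model.
def simulator(x):
    return np.sin(3.0 * x) + x

z = simulator(0.6) + 0.05   # the observed datum (true input 0.6, plus error)

# A real application would train a fast Gaussian-process emulator on a few
# simulator runs; here the "emulator" is the simulator plus a fixed variance.
def emulator(x):
    return simulator(x), 0.02**2

obs_var, disc_var = 0.05**2, 0.03**2   # observation error and model discrepancy

# Implausibility: I(x) = |z - E[f(x)]| / sqrt(emulator + obs + discrepancy var).
xs = np.linspace(0.0, 1.0, 201)
mean, em_var = emulator(xs)
I = np.abs(z - mean) / np.sqrt(em_var + obs_var + disc_var)

not_ruled_out = xs[I < 3.0]   # 3-sigma cutoff; the rest of space is discarded
print(not_ruled_out.min(), not_ruled_out.max())
```

Iterating this step, refitting the emulator only on the surviving region, is what makes the 32-dimensional search in the paper tractable.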

  9. Study on uncertainty evaluation methodology related to hydrological parameter of regional groundwater flow analysis model

    International Nuclear Information System (INIS)

    Sakai, Ryutaro; Munakata, Masahiro; Ohoka, Masao; Kameya, Hiroshi

    2009-11-01

    In the safety assessment for geological disposal of radioactive waste, it is important to develop a methodology for long-term estimation of regional groundwater flow, from data acquisition to numerical analysis. The uncertainties associated with estimation of regional groundwater flow include those that concern parameters and those that concern the hydrogeological evolution. Parameter uncertainties include measurement errors and parameter heterogeneity. The authors discussed the uncertainties of hydraulic conductivity as a significant parameter for regional groundwater flow analysis. This study suggests that hydraulic conductivities of rock mass are controlled by rock characteristics, such as fractures and porosity, and by test conditions, such as hydraulic gradient, water quality and water temperature, and that hydraulic conductivity can vary by more than a factor of ten depending on test conditions such as hydraulic gradient, or on rock-type variations such as fractures and porosity. In addition, this study demonstrated that confining pressure changes caused by uplift and subsidence, and changes of hydraulic gradient under the long-term evolution of the hydrogeological environment, could produce variations of more than a factor of ten in hydraulic conductivity. It was also shown that the effect of water quality change on hydraulic conductivity was not negligible, and that the replacement of fresh water and saline water caused by sea-level change could change current hydraulic conductivities by a factor of 0.6 in the case of the Horonobe site. (author)

  10. An Improved Global Optimization Method and Its Application in Complex Hydrological Model Calibration and Parameter Uncertainty Study

    Science.gov (United States)

    Chu, W.; Gao, X.; Sorooshian, S.

    2009-12-01

    With the advancement of modern computer technology, many heuristic global optimization algorithms have been developed and applied to various fields of science and engineering in the last two decades. In surface hydrology, parameter optimization is a bridge connecting model simulation and real observation. Due to the lack of detailed physical understanding or descriptions of the hydrological process, most rainfall-runoff models are built with conceptual components. Therefore, the model parameters mostly include unknown correlations and uncertainties and have to be calibrated against observations to make the model function properly. As a successful attempt to automatically calibrate conceptual rainfall-runoff models, the shuffled complex evolution (SCE-UA) method was developed and has exhibited its efficacy and efficiency. However, our recent study reveals that the SCE-UA method overlooks some fundamental assumptions of direct search theory and hence loses its power when optimizing complex and high-dimensional problems. By integrating some new techniques of heuristic search and overcoming the above-mentioned shortcoming, a new method has been developed. This method is applied to calibrate the Sacramento Soil Moisture Accounting (SAC-SMA) model and study the parameter uncertainties. Results show that the method outperforms SCE-UA in the following aspects: 1) it retrieves better parameter values, which further reduce the model’s root mean square error; 2) the method is more robust; 3) the ensemble of optimized parameters using this method better delineates model parameters’ uncertainty, which is critical to understanding model behaviors.
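
Among the heuristic global optimizers mentioned in this collection, dynamically dimensioned search (DDS) is simple enough to sketch: perturb a randomly chosen, gradually shrinking subset of dimensions of the current best solution, and accept greedily. The sphere objective below is an invented stand-in for the SAC-SMA calibration problem.

```python
import numpy as np

rng = np.random.default_rng(3)

def objective(x):                     # stand-in objective; lower is better
    return float(np.sum((x - 0.3)**2))

lo, hi = np.zeros(10), np.ones(10)    # bounds of a 10-parameter problem
x_best = rng.uniform(lo, hi)
f_best = objective(x_best)
m = 1000                              # budget of function evaluations

for i in range(1, m + 1):
    # The probability of perturbing each dimension shrinks as the search
    # matures, so it starts global and becomes increasingly local.
    p = 1.0 - np.log(i) / np.log(m)
    mask = rng.uniform(size=10) < p
    if not mask.any():
        mask[rng.integers(10)] = True
    x_new = x_best.copy()
    x_new[mask] += 0.2 * (hi - lo)[mask] * rng.standard_normal(mask.sum())
    x_new = np.clip(x_new, lo, hi)    # clip (DDS proper reflects at bounds)
    f_new = objective(x_new)
    if f_new < f_best:                # greedy acceptance of improvements
        x_best, f_best = x_new, f_new

print(f_best)
```

The single tunable (the neighbourhood size 0.2) and the absence of population bookkeeping are what make DDS-style methods attractive for expensive rainfall-runoff models.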

  11. Stability Analysis for Li-Ion Battery Model Parameters and State of Charge Estimation by Measurement Uncertainty Consideration

    Directory of Open Access Journals (Sweden)

    Shifei Yuan

    2015-07-01

    Accurate estimation of model parameters and state of charge (SoC) is crucial for the lithium-ion battery management system (BMS). In this paper, the stability of the model parameters and SoC estimation under measurement uncertainty is evaluated by three different factors: (i) sampling periods of 1/0.5/0.1 s; (ii) current sensor precisions of ±5/±50/±500 mA; and (iii) voltage sensor precisions of ±1/±2.5/±5 mV. Firstly, numerical model stability analysis and parametric sensitivity analysis for the battery model parameters are conducted under sampling frequencies of 1–50 Hz, and a theoretical perturbation analysis of the effect of current/voltage measurement uncertainty on model parameter variation is performed. Secondly, the impact of the three factors on the model parameters and SoC estimation is evaluated with the federal urban driving sequence (FUDS) profile. The bias-correction recursive least squares (CRLS) and adaptive extended Kalman filter (AEKF) algorithms are adopted to estimate the model parameters and SoC jointly. Finally, the simulation results are compared and some insightful findings are drawn. For the given battery model and parameter estimation algorithm, the sampling period and the current/voltage sampling accuracy have a non-negligible effect on the estimation results of the model parameters. This research reveals the influence of measurement uncertainty on model parameter estimation, and provides guidelines for selecting a reasonable sampling period and current/voltage sensor sampling precisions in engineering applications.
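
Recursive least squares of the kind used here for joint parameter estimation can be sketched on a first-order model v[k+1] = a·v[k] + b·i[k]. The true values and noise levels below are invented, and this plain RLS with a forgetting factor omits the bias-correction step that distinguishes CRLS.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic data from a first-order model v[k+1] = a*v[k] + b*i[k] + noise
# (an invented stand-in for a battery equivalent-circuit model).
a_true, b_true = 0.95, 0.02
i_meas = rng.standard_normal(500)
v = np.zeros(501)
for k in range(500):
    v[k + 1] = a_true * v[k] + b_true * i_meas[k] + 1e-3 * rng.standard_normal()

# Plain recursive least squares with a forgetting factor.
theta = np.zeros(2)                    # running estimate of [a, b]
P = 1e3 * np.eye(2)                    # estimate covariance
lam = 0.99                             # forgetting factor
for k in range(500):
    phi = np.array([v[k], i_meas[k]])  # regressor
    K = P @ phi / (lam + phi @ P @ phi)
    theta = theta + K * (v[k + 1] - phi @ theta)
    P = (P - np.outer(K, phi) @ P) / lam

print(theta)   # approaches [a_true, b_true]
```

Rerunning this sketch with coarser sampling or quantized v and i_meas is a quick way to see the sensitivity to sampling period and sensor precision that the paper quantifies.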

  12. Consideration of source parameter uncertainties in the geophysical ground motion model

    International Nuclear Information System (INIS)

    Suzuki, S.; Kiremidjian, A.S.

    1989-01-01

    In recent years considerable research has been performed on seismic risk assessment of nuclear power plants. These studies show that estimation of seismic hazard should be consistent with earthquake recurrence patterns, rupture mechanisms, the wave propagation path, and local soil conditions. Therefore, a stochastic ground motion forecast model is proposed, based on a geophysical ground motion model and a stochastic earthquake occurrence model. For the earthquake occurrence model, the random slip model is employed. For the ground motion model, the normal mode method is used. In particular, uncertainties in the stress accumulation rate due to inhomogeneous fault properties are considered in order to estimate the probabilities of earthquake occurrence in this study

  13. Accounting for parameter uncertainty in the definition of parametric distributions used to describe individual patient variation in health economic models

    NARCIS (Netherlands)

    Degeling, Koen; IJzerman, Maarten J; Koopman, Miriam; Koffijberg, Hendrik

    2017-01-01

    Background Parametric distributions based on individual patient data can be used to represent both stochastic and parameter uncertainty. Although general guidance is available on how parameter uncertainty should be accounted for in probabilistic sensitivity analysis, there is no comprehensive

  15. How uncertainty in input and parameters influences transport model output: four-stage model case-study

    DEFF Research Database (Denmark)

    Manzo, Stefano; Nielsen, Otto Anker; Prato, Carlo Giacomo

    2015-01-01

    If not properly quantified, the uncertainty inherent to transport models makes analyses based on their output highly unreliable. This study investigated uncertainty in four-stage transport models by analysing a Danish case-study: the Næstved model. The model describes the demand of transport...... -year model outputs uncertainty. More precisely, this study contributes to the existing literature on the topic by investigating the effects on model outputs uncertainty deriving from the use of (i) different probability distributions in the sampling process, (ii) different assignment algorithms, and (iii)...... of coefficient of variation, resulting from stochastic user equilibrium and user equilibrium is, respectively, of 0.425 and 0.468. Finally, network congestion does not show a high effect on model output uncertainty at the network level. However, the final uncertainty of links with higher volume/capacity ratio......

  16. Ensemble-based flash-flood modelling: Taking into account hydrodynamic parameters and initial soil moisture uncertainties

    Science.gov (United States)

    Edouard, Simon; Vincendon, Béatrice; Ducrocq, Véronique

    2018-05-01

    Intense precipitation events in the Mediterranean often lead to devastating flash floods (FF). FF modelling is affected by several kinds of uncertainties, and Hydrological Ensemble Prediction Systems (HEPS) are designed to take those uncertainties into account. The major source of uncertainty comes from the rainfall forcing, and convective-scale meteorological ensemble prediction systems can manage it for forecasting purposes. But other sources are related to the hydrological modelling part of the HEPS. This study focuses on the uncertainties arising from the hydrological model parameters and the initial soil moisture, with the aim of designing an ensemble-based version of a hydrological model dedicated to simulating fast-responding Mediterranean rivers, the ISBA-TOP coupled system. The first step consists in identifying the parameters that have the strongest influence on FF simulations by assuming perfect precipitation. A sensitivity study is carried out, first using a synthetic framework and then for several real events and several catchments. Perturbation methods varying the most sensitive parameters as well as the initial soil moisture allow the design of an ensemble-based version of ISBA-TOP. The first results of this system on some real events are presented. The direct perspective of this work is to drive this ensemble-based version with the members of a convective-scale meteorological ensemble prediction system, to design a complete HEPS for FF forecasting.

  17. An Interval-Parameter Fuzzy Linear Programming with Stochastic Vertices Model for Water Resources Management under Uncertainty

    Directory of Open Access Journals (Sweden)

    Yan Han

    2013-01-01

    An interval-parameter fuzzy linear programming with stochastic vertices (IFLPSV) method is developed for water resources management under uncertainty by coupling interval-parameter fuzzy linear programming (IFLP) with stochastic programming (SP). As an extension of existing interval-parameter fuzzy linear programming, the developed IFLPSV approach has advantages in dealing with dual-uncertainty optimization problems, in which uncertainty presents as interval parameters with stochastic vertices in both the objective function and the constraints. The developed IFLPSV method improves upon the IFLP method by allowing dual-uncertainty parameters to be incorporated into the optimization process. A hybrid intelligent algorithm based on a genetic algorithm and an artificial neural network is used to solve the developed model. The developed method is then applied to water resources allocation in Beijing, China, in 2020, where water resources shortage is a challenging issue. The results indicate that reasonable solutions have been obtained, which are helpful and useful for decision makers. Although the amount of water supplied from the Guanting and Miyun reservoirs is declining with rainfall reduction, water supply from the South-to-North Water Transfer project will have an important impact on the water supply structure of Beijing, particularly in dry and extraordinarily dry years.

  18. Implicit Treatment of Technical Specification and Thermal Hydraulic Parameter Uncertainties in Gaussian Process Model to Estimate Safety Margin

    Directory of Open Access Journals (Sweden)

    Douglas A. Fynan

    2016-06-01

    The Gaussian process model (GPM) is a flexible surrogate model that can be used for nonparametric regression on multivariate problems. A unique feature of the GPM is that a prediction variance is automatically provided with the regression function. In this paper, we estimate the safety margin of a nuclear power plant by performing regression on the output of best-estimate simulations of a large-break loss-of-coolant accident, with sampling of safety system configuration, sequence timing, technical specification, and thermal hydraulic parameter uncertainties. The key aspect of our approach is that the GPM regression is performed only on the dominant input variables, the safety injection flow rate and the delay time for AC-powered pumps to start (representing sequence timing uncertainty), providing a predictive model for the peak clad temperature during the reflood phase. Other uncertainties are interpreted as contributors to the measurement noise of the code output and are implicitly treated in the GPM in the noise variance term, providing local uncertainty bounds for the peak clad temperature. We discuss the applicability of the foregoing method to reduce the use of conservative assumptions in best estimate plus uncertainty (BEPU) analysis and Level 1 probabilistic safety assessment (PSA) success criteria definitions while dealing with a large number of uncertainties.
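
The "automatically provided" prediction variance of a GPM can be shown with a plain-NumPy Gaussian process conditioned on a handful of toy "code runs". The RBF kernel and its hyperparameters below are assumptions for illustration, not those of the safety-margin study.

```python
import numpy as np

def rbf(a, b, ls=0.3):
    # Squared-exponential kernel with unit prior variance and length scale ls.
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ls)**2)

# A handful of toy "code runs" to train on (invented, not the BEPU code).
x = np.array([0.0, 0.2, 0.4, 0.6, 0.8, 1.0])
y = np.sin(2 * np.pi * x)

K = rbf(x, x) + 1e-6 * np.eye(x.size)       # small nugget for stability
L = np.linalg.cholesky(K)
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))

# GP posterior mean and variance on a grid; the variance is the
# prediction uncertainty that comes for free with the regression.
xs = np.linspace(0.0, 1.0, 101)
Ks = rbf(xs, x)
mean = Ks @ alpha
V = np.linalg.solve(L, Ks.T)
var = 1.0 - np.sum(V**2, axis=0)            # prior variance 1 minus explained

print(mean[50], var[50])                    # prediction mid-way between runs
```

The variance collapses at the training runs and grows between them, which is exactly what provides the local uncertainty bounds on the regressed peak clad temperature.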

  19. Estimation of the Influence of Power System Mathematical Model Parameter Uncertainty on PSS2A System Stabilizers

    Directory of Open Access Journals (Sweden)

    Adrian Nocoń

    2015-09-01

    This paper presents an analysis of the influence of uncertainty in power system mathematical model parameters on the optimised parameters of PSS2A system stabilizers. Optimisation of the power system stabilizer parameters was based on polyoptimisation (multi-criteria optimisation). Optimisation criteria were determined for disturbances occurring in a multi-machine power system, taking into account transient waveforms associated with electromechanical swings (instantaneous power, angular speed and terminal voltage waveforms of generators). A genetic algorithm with floating-point encoding, tournament selection, mean crossover and perturbative mutation, modified for the needs of the investigation, was used for optimisation. The impact of the uncertainties on the quality of operation of power system stabilizers with optimised parameters has been evaluated using various deformation factors.

  20. Uncertainty in Model parameter Estimates and Impacts on Risk and Decision Making in the Subsurface

    Science.gov (United States)

    Enzenhöfer, R.; Helmig, R.; Nowak, W.; Binning, P. J.

    2010-12-01

    Advection-based well-head protection zones are commonly used to manage the risk of contamination to drinking water wells. Current Water Safety Plans recommend that catchment managers and stakeholders control and monitor all possible hazards within catchments. In order to do this it is important to not only map the protection zones, but also to characterize their uncertainty. Here the four intrinsic well vulnerability criteria of Frind et al. (2006) are cast in a probabilistic framework. The criteria employ advective-dispersive transport models to determine the: (1) Peak arrival time at the well, (2) peak concentration level, (3) arrival time of threshold concentrations and (4) time of exposure. Our probabilistic framework yields catchment-wide maps of the probability of exceeding each of these criteria. We separate the uncertainty of plume location and actual dilution by resolving heterogeneity with high-resolution Monte-Carlo simulations. To keep computational costs low, we adopt a reverse transport formulation, and combine it with the temporal moment approach for model reduction. We recover the time-dependent breakthrough curves and well vulnerability criteria from the temporal moments by Maximum Entropy reconstruction in log-time. Our method is independent of dimensionality, boundary conditions and other sources of uncertainty. It can be coupled with any method for conditioning on available data. For simplicity, we demonstrate the concept on a 2D example that involves synthetic data. Our approach delivers indispensable information on exposure risk and improves the basis for risk-informed well head management.

  1. Uncertainty Quantification of GEOS-5 L-band Radiative Transfer Model Parameters Using Bayesian Inference and SMOS Observations

    Science.gov (United States)

    DeLannoy, Gabrielle J. M.; Reichle, Rolf H.; Vrugt, Jasper A.

    2013-01-01

    Uncertainties in L-band (1.4 GHz) radiative transfer modeling (RTM) affect the simulation of brightness temperatures (Tb) over land and the inversion of satellite-observed Tb into soil moisture retrievals. In particular, accurate estimates of the microwave soil roughness, vegetation opacity and scattering albedo for large-scale applications are difficult to obtain from field studies and often lack an uncertainty estimate. Here, a Markov Chain Monte Carlo (MCMC) simulation method is used to determine satellite-scale estimates of RTM parameters and their posterior uncertainty by minimizing the misfit between long-term averages and standard deviations of simulated and observed Tb at a range of incidence angles, at horizontal and vertical polarization, and for morning and evening overpasses. Tb simulations are generated with the Goddard Earth Observing System (GEOS-5) and confronted with Tb observations from the Soil Moisture Ocean Salinity (SMOS) mission. The MCMC algorithm suggests that the relative uncertainty of the RTM parameter estimates is typically less than 25% of the maximum a posteriori density (MAP) parameter value. Furthermore, the actual root-mean-square-differences in long-term Tb averages and standard deviations are found consistent with the respective estimated total simulation and observation error standard deviations of 3.1 K and 2.4 K. It is also shown that the MAP parameter values estimated through MCMC simulation are in close agreement with those obtained with Particle Swarm Optimization (PSO).

  2. Failure probability under parameter uncertainty.

    Science.gov (United States)

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
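
The effect described, parameter uncertainty inflating the expected failure frequency above its nominal level, is easy to reproduce by Monte Carlo for a log-normal loss: fit location and scale from n observations, set the threshold at the fitted quantile, and count how often a fresh loss exceeds it. All numbers below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(6)

p_nominal = 0.01      # required failure probability
n = 30                # size of the sample used to estimate the parameters
z = 2.3263            # approximate 99% quantile of the standard normal

# Log-normal losses: work on the log scale, where the model is normal
# (a location-scale family, as in the article).
trials = 200_000
logs = rng.normal(0.0, 1.0, size=(trials, n))           # historical log-losses
thr = logs.mean(axis=1) + z * logs.std(axis=1, ddof=1)  # fitted 99% quantile
fresh = rng.normal(0.0, 1.0, size=trials)               # one new loss per trial

freq = np.mean(fresh > thr)   # realised failure frequency
print(freq)                   # noticeably above the nominal 0.01
```

Because the threshold is a location-scale function of the estimates, the realised frequency does not depend on the true (unknown) parameters, which is the article's route to exact calculation.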

  3. It's the parameters, stupid! Moving beyond multi-model and multi-physics approaches to characterize and reduce predictive uncertainty in process-based hydrological models

    Science.gov (United States)

    Clark, Martyn; Samaniego, Luis; Freer, Jim

    2014-05-01

    Multi-model and multi-physics approaches are a popular tool in environmental modelling, with many studies focusing on optimally combining output from multiple model simulations to reduce predictive errors and better characterize predictive uncertainty. However, a careful and systematic analysis of different hydrological models reveals that individual models are simply small permutations of a master modeling template, and inter-model differences are overwhelmed by uncertainty in the choice of the parameter values in the model equations. Furthermore, inter-model differences do not explicitly represent the uncertainty in modeling a given process, leading to many situations where different models provide the wrong results for the same reasons. In other cases, the available morphological data does not support the very fine spatial discretization of the landscape that typifies many modern applications of process-based models. To make the uncertainty characterization problem worse, the uncertain parameter values in process-based models are often fixed (hard-coded), and the models lack the agility necessary to represent the tremendous heterogeneity in natural systems. This presentation summarizes results from a systematic analysis of uncertainty in process-based hydrological models, where we explicitly analyze the myriad of subjective decisions made throughout both the model development and parameter estimation process. Results show that much of the uncertainty is aleatory in nature - given a "complete" representation of dominant hydrologic processes, uncertainty in process parameterizations can be represented using an ensemble of model parameters. Epistemic uncertainty associated with process interactions and scaling behavior is still important, and these uncertainties can be represented using an ensemble of different spatial configurations. Finally, uncertainty in forcing data can be represented using ensemble methods for spatial meteorological analysis. 
Our systematic

  4. Spatial scale effects on model parameter estimation and predictive uncertainty in ungauged basins

    CSIR Research Space (South Africa)

    Hughes, DA

    2013-06-01

    The most appropriate scale to use for hydrological modelling depends on the structure of the chosen model, the purpose of the results and the resolution of the available data used to quantify parameter values and provide the climatic forcing data...

  5. A Bayesian approach to model uncertainty

    International Nuclear Information System (INIS)

    Buslik, A.

    1994-01-01

    A Bayesian approach to model uncertainty is taken. For the case of a finite number of alternative models, the model uncertainty is equivalent to parameter uncertainty. A derivation based on Savage's partition problem is given
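
For a finite set of candidate models, the stated equivalence to parameter uncertainty amounts to treating the model label as a discrete parameter whose posterior is updated by each model's marginal likelihood. A minimal sketch with two fixed Poisson models (all numbers invented):

```python
from math import exp, factorial, prod

# Two fixed candidate models for count data; the "model uncertainty" is a
# discrete parameter with a posterior over {M1, M2}.
data = [5, 4, 6, 5, 7]
models = {"M1": 4.0, "M2": 6.0}          # Poisson rate assumed by each model
prior = {"M1": 0.5, "M2": 0.5}           # prior model probabilities

def marginal_likelihood(lam):
    return prod(exp(-lam) * lam**x / factorial(x) for x in data)

weight = {m: prior[m] * marginal_likelihood(lam) for m, lam in models.items()}
total = sum(weight.values())
posterior = {m: w / total for m, w in weight.items()}
print(posterior)   # the data (mean 5.4) favour M2
```

With composite models the weights become integrals over each model's parameters, but the update of the model-indicator "parameter" is the same.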

  6. Modeling sugarcane yield with a process-based model from site to continental scale: uncertainties arising from model structure and parameter values

    Science.gov (United States)

    Valade, A.; Ciais, P.; Vuichard, N.; Viovy, N.; Caubel, A.; Huth, N.; Marin, F.; Martiné, J.-F.

    2014-06-01

    Agro-land surface models (agro-LSM) have been developed from the integration of specific crop processes into large-scale generic land surface models that allow calculating the spatial distribution and variability of energy, water and carbon fluxes within the soil-vegetation-atmosphere continuum. When developing agro-LSM models, particular attention must be given to the effects of crop phenology and management on the turbulent fluxes exchanged with the atmosphere, and the underlying water and carbon pools. A part of the uncertainty of agro-LSM models is related to their usually large number of parameters. In this study, we quantify the parameter-values uncertainty in the simulation of sugarcane biomass production with the agro-LSM ORCHIDEE-STICS, using a multi-regional approach with data from sites in Australia, La Réunion and Brazil. In ORCHIDEE-STICS, two models are chained: STICS, an agronomy model that calculates phenology and management, and ORCHIDEE, a land surface model that calculates biomass and other ecosystem variables forced by STICS phenology. First, the parameters that dominate the uncertainty of simulated biomass at harvest date are determined through a screening of 67 different parameters of both STICS and ORCHIDEE on a multi-site basis. Secondly, the uncertainty of harvested biomass attributable to those most sensitive parameters is quantified and specifically attributed to either STICS (phenology, management) or to ORCHIDEE (other ecosystem variables including biomass) through distinct Monte Carlo runs. The uncertainty on parameter values is constrained using observations by calibrating the model independently at seven sites. In a third step, a sensitivity analysis is carried out by varying the most sensitive parameters to investigate their effects at continental scale. A Monte Carlo sampling method associated with the calculation of partial ranked correlation coefficients is used to quantify the sensitivity of harvested biomass to input
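
Partial ranked correlation coefficients of the kind used in this sensitivity analysis can be sketched directly: rank-transform inputs and output, residualize each input and the output on the remaining inputs, and correlate the residuals. The three-parameter toy model below is an invented stand-in for ORCHIDEE-STICS.

```python
import numpy as np

rng = np.random.default_rng(7)

# Monte Carlo sample of three hypothetical parameters and a toy output.
n = 1000
X = rng.uniform(size=(n, 3))
y = 5.0 * X[:, 0]**2 - 2.0 * X[:, 1] + 0.1 * rng.standard_normal(n)

def ranks(a):
    return np.argsort(np.argsort(a, axis=0), axis=0).astype(float)

Rx, ry = ranks(X), ranks(y)

def prcc(j):
    # Residualize the ranks of x_j and y on the other parameters' ranks,
    # then correlate the residuals (partial rank correlation).
    others = np.column_stack([np.ones(n), np.delete(Rx, j, axis=1)])
    bx, *_ = np.linalg.lstsq(others, Rx[:, j], rcond=None)
    by, *_ = np.linalg.lstsq(others, ry, rcond=None)
    return np.corrcoef(Rx[:, j] - others @ bx, ry - others @ by)[0, 1]

print([round(prcc(j), 2) for j in range(3)])
```

The rank transform makes the measure robust to monotone nonlinearity (the quadratic term here), which is why PRCC suits screening studies with dozens of parameters.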

  7. Modeling sugar cane yield with a process-based model from site to continental scale: uncertainties arising from model structure and parameter values

    Science.gov (United States)

    Valade, A.; Ciais, P.; Vuichard, N.; Viovy, N.; Huth, N.; Marin, F.; Martiné, J.-F.

    2014-01-01

    Agro-Land Surface Models (agro-LSM) have been developed from the integration of specific crop processes into large-scale generic land surface models that allow calculating the spatial distribution and variability of energy, water and carbon fluxes within the soil-vegetation-atmosphere continuum. When developing agro-LSM models, particular attention must be given to the effects of crop phenology and management on the turbulent fluxes exchanged with the atmosphere, and the underlying water and carbon pools. A part of the uncertainty of agro-LSM models is related to their usually large number of parameters. In this study, we quantify the parameter-value uncertainty in the simulation of sugar cane biomass production with the agro-LSM ORCHIDEE-STICS, using a multi-regional approach with data from sites in Australia, La Réunion and Brazil. In ORCHIDEE-STICS, two models are chained: STICS, an agronomy model that calculates phenology and management, and ORCHIDEE, a land surface model that calculates biomass and other ecosystem variables forced by STICS' phenology. First, the parameters that dominate the uncertainty of simulated biomass at harvest date are determined through a screening of 67 different parameters of both STICS and ORCHIDEE on a multi-site basis. Secondly, the uncertainty of harvested biomass attributable to those most sensitive parameters is quantified and specifically attributed to either STICS (phenology, management) or to ORCHIDEE (other ecosystem variables including biomass) through distinct Monte Carlo runs. The uncertainty on parameter values is constrained using observations by calibrating the model independently at seven sites. In a third step, a sensitivity analysis is carried out by varying the most sensitive parameters to investigate their effects at continental scale. A Monte Carlo sampling method associated with the calculation of Partial Ranked Correlation Coefficients is used to quantify the sensitivity of harvested biomass to input
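    The screening step in the two records above combines Monte Carlo sampling of parameter values with partial ranked correlation coefficients (PRCC). A minimal, illustrative sketch of the standard PRCC computation (not the authors' code; the function name `prcc` and the regression-residual formulation are assumptions based on the usual textbook definition):

```python
import numpy as np
from scipy import stats

def prcc(X, y):
    """Partial ranked correlation coefficient of each column of X with y.

    For parameter j, the linear effect of all other (ranked) parameters is
    removed from both rank(X[:, j]) and rank(y); the PRCC is the Pearson
    correlation of the two residual series.
    """
    R = stats.rankdata(X, axis=0)           # rank-transform each parameter column
    ry = stats.rankdata(y)                  # rank-transform the model output
    n, k = X.shape
    out = np.empty(k)
    for j in range(k):
        # design matrix: the other ranked parameters plus an intercept
        others = np.c_[np.delete(R, j, axis=1), np.ones(n)]
        res_x = R[:, j] - others @ np.linalg.lstsq(others, R[:, j], rcond=None)[0]
        res_y = ry - others @ np.linalg.lstsq(others, ry, rcond=None)[0]
        out[j] = np.corrcoef(res_x, res_y)[0, 1]
    return out
```

    Because PRCC works on ranks, it captures monotone (not only linear) sensitivities, which is why it is a common companion to Monte Carlo screening of biogeochemical model parameters.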

  8. Incorporating Parameter Uncertainty in Bayesian Segmentation Models: Application to Hippocampal Subfield Volumetry

    DEFF Research Database (Denmark)

    Iglesias, J. E.; Sabuncu, M. R.; Van Leemput, Koen

    2012-01-01

    Many successful segmentation algorithms are based on Bayesian models in which prior anatomical knowledge is combined with the available image information. However, these methods typically have many free parameters that are estimated to obtain point estimates only, whereas a faithful Bayesian anal...

  9. An MLE method for finding LKB NTCP model parameters using Monte Carlo uncertainty estimates

    Science.gov (United States)

    Carolan, Martin; Oborn, Brad; Foo, Kerwyn; Haworth, Annette; Gulliford, Sarah; Ebert, Martin

    2014-03-01

    The aims of this work were to establish a program to fit NTCP models to clinical data with multiple toxicity endpoints, to test the method using a realistic test dataset, to compare three methods for estimating confidence intervals for the fitted parameters and to characterise the speed and performance of the program.

  10. Uncertainties of Molecular Structural Parameters

    International Nuclear Information System (INIS)

    Császár, Attila G.

    2014-01-01

    Full text: The most fundamental property of a molecule is its three-dimensional (3D) structure formed by its constituent atoms (see, e.g., the perfectly regular hexagon associated with benzene). It is generally accepted that knowledge of the detailed structure of a molecule is a prerequisite to determine most of its other properties. What nowadays is a seemingly simple concept, namely that molecules have a structure, was introduced into chemistry in the 19th century. Naturally, the word changed its meaning over the years. Elemental analysis, simple structural formulae, two-dimensional and then 3D structures mark the development of the concept to its modern meaning. When quantum physics and quantum chemistry emerged in the 1920s, the simple concept associating structure with a three-dimensional object seemingly gained a firm support. Nevertheless, what seems self-explanatory today is in fact not so straightforward to justify within quantum mechanics. In quantum chemistry the concept of an equilibrium structure of a molecule is tied to the Born-Oppenheimer approximation but beyond the adiabatic separation of the motions of the nuclei and the electrons the meaning of a structure is still slightly obscured. Putting the conceptual difficulties aside, there are several experimental, empirical, and theoretical techniques to determine structures of molecules. One particular problem, strongly related to the question of uncertainties of “measured” or “computed” structural parameters, is that all the different techniques correspond to different structure definitions and thus yield different structural parameters. Experiments probing the structure of molecules rely on a number of structure definitions, to name just a few: r_0, r_g, r_a, r_s, r_m, etc., and one should also consider the temperature dependence of most of these structural parameters which differ from each other in the way the rovibrational motions of the molecules are treated and how the averaging is

  11. A commentary on model uncertainty

    International Nuclear Information System (INIS)

    Apostolakis, G.

    1994-01-01

    A framework is proposed for the identification of model and parameter uncertainties in risk assessment models. Two cases are distinguished: in the first case, a set of mutually exclusive and exhaustive hypotheses (models) can be formulated, while, in the second, only one reference model is available. The relevance of this formulation to decision making and the communication of uncertainties is discussed.

  12. Cassini Spacecraft Uncertainty Analysis Data and Methodology Review and Update/Volume 1: Updated Parameter Uncertainty Models for the Consequence Analysis

    Energy Technology Data Exchange (ETDEWEB)

    WHEELER, TIMOTHY A.; WYSS, GREGORY D.; HARPER, FREDERICK T.

    2000-11-01

    Uncertainty distributions for specific parameters of the Cassini General Purpose Heat Source Radioisotope Thermoelectric Generator (GPHS-RTG) Final Safety Analysis Report consequence risk analysis were revised and updated for all consequence parameters for which relevant information exists from the joint project on Probabilistic Accident Consequence Uncertainty Analysis by the United States Nuclear Regulatory Commission and the Commission of European Communities.

  13. Uncertainty Analysis Using Optimization and Direct Parameter Sampling With Correlation for a Physically Based Contaminant Transport Model

    Science.gov (United States)

    Sykes, J. F.; Yin, Y.

    2008-12-01

    Due to the ill-posed nature of contaminant transport models, inverse modeling and traditional gradient-based optimization approaches often encounter difficulties when applied to real case studies. The correlation of the transport parameters must be included in uncertainty analyses. In this study, a physically based transient groundwater flow model was developed to establish the historical relationship between a contaminant site and the down-gradient municipal well field. The parameters for the three-dimensional transient groundwater flow model were calibrated using both point data collected over a thirty-year period and approximately nine years of head data from continuous well records. Spatially and temporally varying recharge was incorporated in the model to account for water level fluctuations in observation wells. Given the spatially and temporally varying velocities, the six contaminant transport parameters of dispersivities, retardation, initial source concentration and source decay coefficient were estimated using a multi-start PEST algorithm that combined the traditional gradient search approach with a heuristic technique. The multi-start feature partially resolved the problem of convergence to local optima. The study also compared a Dynamically Dimensioned Search (DDS) algorithm to the multi-start PEST algorithm. A modified Latin Hypercube (LHC) sampling approach accounting for correlation between parameters was employed to conduct an uncertainty analysis for contaminant concentration breakthrough at pumping wells. The LHC sampling draws from a multivariate normal distribution in which correlations among parameters, specified through the optimization, form part of the corresponding probability space. Because of the non-uniqueness issue for ill-posed problems, multiple feasible transport parameter sets and covariance matrices were generated using the multi-start PEST algorithm.
The likelihood for each parameter set was estimated
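    The correlated Latin Hypercube step described in the record above can be sketched as follows. This is an illustrative approximation, not the study's code: stratified uniforms are mapped to standard normals and correlated with a Cholesky factor, which slightly perturbs the per-dimension stratification (the Iman-Conover procedure is the usual rigorous alternative), and the name `lhs_mvnormal` is hypothetical.

```python
import numpy as np
from scipy import stats

def lhs_mvnormal(mean, cov, n, seed=None):
    """Latin Hypercube sample approximately following N(mean, cov)."""
    rng = np.random.default_rng(seed)
    d = len(mean)
    # one stratum per sample in each dimension, independently permuted per column
    strata = rng.permuted(np.tile(np.arange(n), (d, 1)), axis=1).T
    u = (strata + rng.random((n, d))) / n          # stratified uniforms in (0, 1)
    z = stats.norm.ppf(u)                          # map to standard normals
    L = np.linalg.cholesky(np.asarray(cov, float)) # impose the target correlation
    return np.asarray(mean, float) + z @ L.T
```

    Each row is one parameter set for a transport-model run; the Cholesky factor carries the parameter correlations estimated during calibration into the sample.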

  14. Parameter Uniqueness And Uncertainty Associated For Multirate Transport Models Applied To Core-Scale Test Data

    Science.gov (United States)

    Kuhlman, K. L.; Malama, B.; James, S. C.

    2011-12-01

    Breakthrough data collected in a set of laboratory tracer experiments are used to constrain the multirate mass transport model STAMMT-L. The uniqueness and variance of the estimated parameters are explored using both Gauss-Marquardt-Levenberg (PEST) and Markov-Chain Monte Carlo (DREAM) algorithms. The efficacy and uniqueness of different multirate distribution types (e.g., lognormal, beta, gamma) for a given dataset are compared. The information content of the different portions of the breakthrough curve (i.e., rising limb, peak, tail) is also explored with this forward model and these inverse modeling tools. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.

  15. Inversion and uncertainty of highly parameterized models in a Bayesian framework by sampling the maximal conditional posterior distribution of parameters

    Science.gov (United States)

    Mara, Thierry A.; Fajraoui, Noura; Younes, Anis; Delay, Frederick

    2015-02-01

    We introduce the concept of maximal conditional posterior distribution (MCPD) to assess the uncertainty of model parameters in a Bayesian framework. Although Markov chain Monte Carlo (MCMC) methods are particularly suited for this task, they become challenging with highly parameterized nonlinear models. The MCPD represents the conditional probability distribution function of a given parameter knowing that the other parameters maximize the conditional posterior density function. Unlike MCMC, which accepts or rejects solutions sampled in the parameter space, the MCPD is calculated through several optimization processes. Model inversion using the MCPD algorithm is particularly useful for highly parameterized problems because the calculations are independent. Consequently, they can be evaluated simultaneously with a multi-core computer. In the present work, the MCPD approach is applied to invert a 2D stochastic groundwater flow problem where the log-transmissivity field of the medium is inferred from scarce and noisy data. For this purpose, the stochastic field is expanded onto a set of orthogonal functions using a Karhunen-Loève (KL) transformation. Though the prior guess on the stochastic structure (covariance) of the transmissivity field is erroneous, the MCPD inference of the KL coefficients is able to extract relevant inverse solutions.
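    The MCPD construction above amounts to a profile of the posterior: for each fixed value of one parameter, the posterior is maximized over the remaining parameters, and each grid point is an independent optimization (hence the easy parallelization). A minimal sketch under assumed names (`mcpd_profile`, `neg_log_post`; not the authors' implementation):

```python
import numpy as np
from scipy.optimize import minimize

def mcpd_profile(neg_log_post, theta0, j, grid):
    """Profile the posterior over parameter j.

    For each value v in grid, parameter j is fixed at v and the negative
    log-posterior is minimized over the remaining parameters; the returned
    array holds the resulting maximal conditional log-posterior values.
    Each grid point is an independent optimization (trivially parallel).
    """
    free0 = np.delete(np.asarray(theta0, float), j)
    prof = []
    for v in grid:
        obj = lambda free: neg_log_post(np.insert(free, j, v))
        res = minimize(obj, free0, method="Nelder-Mead")
        prof.append(-res.fun)
    return np.array(prof)
```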

  16. Combined Estimation of Hydrogeologic Conceptual Model, Parameter, and Scenario Uncertainty with Application to Uranium Transport at the Hanford Site 300 Area

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, Philip D.; Ye, Ming; Rockhold, Mark L.; Neuman, Shlomo P.; Cantrell, Kirk J.

    2007-07-30

    This report to the Nuclear Regulatory Commission (NRC) describes the development and application of a methodology to systematically and quantitatively assess predictive uncertainty in groundwater flow and transport modeling that considers the combined impact of hydrogeologic uncertainties associated with the conceptual-mathematical basis of a model, model parameters, and the scenario to which the model is applied. The methodology is based on an extension of a Maximum Likelihood implementation of Bayesian Model Averaging. Model uncertainty is represented by postulating a discrete set of alternative conceptual models for a site with associated prior model probabilities that reflect a belief about the relative plausibility of each model based on its apparent consistency with available knowledge and data. Posterior model probabilities are computed and parameter uncertainty is estimated by calibrating each model to observed system behavior; prior parameter estimates are optionally included. Scenario uncertainty is represented as a discrete set of alternative future conditions affecting boundary conditions, source/sink terms, or other aspects of the models, with associated prior scenario probabilities. A joint assessment of uncertainty results from combining model predictions computed under each scenario using as weights the posterior model and prior scenario probabilities. The uncertainty methodology was applied to modeling of groundwater flow and uranium transport at the Hanford Site 300 Area. Eight alternative models representing uncertainty in the hydrogeologic and geochemical properties as well as the temporal variability were considered. Two scenarios representing alternative future behavior of the Columbia River adjacent to the site were considered. The scenario alternatives were implemented in the models through the boundary conditions. Results demonstrate the feasibility of applying a comprehensive uncertainty assessment to large-scale, detailed groundwater flow
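    The posterior model probabilities at the heart of the averaging step can be sketched numerically. The Maximum Likelihood implementation of Bayesian Model Averaging used in this line of work ranks models with the KIC information criterion; the sketch below substitutes a BIC-style complexity penalty as a simpler stand-in (this substitution, and the name `posterior_model_probs`, are assumptions for illustration only):

```python
import numpy as np

def posterior_model_probs(log_likelihoods, n_params, n_obs, priors):
    """Posterior probabilities of alternative conceptual models.

    Each model's maximized log-likelihood is penalized for complexity with a
    BIC-style term (-2*ln L + k*ln n); posterior weights are the normalized
    product of prior probability and exp(-0.5 * delta_IC).
    """
    ll = np.asarray(log_likelihoods, float)
    k = np.asarray(n_params, float)
    ic = -2.0 * ll + k * np.log(n_obs)   # information criterion per model
    d = ic - ic.min()                    # differences, for numerical stability
    w = np.asarray(priors, float) * np.exp(-0.5 * d)
    return w / w.sum()
```

    A joint prediction then weights each calibrated model's output by these posteriors (and, for scenario uncertainty, by the prior scenario probabilities).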

  17. Global Nitrous Oxide Emissions from Agricultural Soils: Magnitude and Uncertainties Associated with Input Data and Model Parameters

    Science.gov (United States)

    Xu, R.; Tian, H.; Pan, S.; Yang, J.; Lu, C.; Zhang, B.

    2016-12-01

    Human activities have caused significant perturbations of the nitrogen (N) cycle, resulting in an approximately 21% increase of atmospheric N2O concentration since the pre-industrial era. This large increase is mainly caused by intensive agricultural activities including the application of nitrogen fertilizer and the expansion of leguminous crops. Substantial efforts have been made to quantify global and regional N2O emissions from agricultural soils in the last several decades using a wide variety of approaches, such as ground-based observations, atmospheric inversions, and process-based models. However, large uncertainties exist in those estimates as well as in the methods themselves. In this study, we used a coupled biogeochemical model (DLEM) to estimate the magnitude and the spatial and temporal patterns of N2O emissions from global croplands in the past five decades (1961-2012). To estimate uncertainties associated with input data and model parameters, we have implemented a number of simulation experiments with DLEM, accounting for key parameter values that affect calculation of N2O fluxes (i.e., maximum nitrification and denitrification rates, N fixation rate, and the adsorption coefficient for soil ammonium and nitrate), different sets of input data including climate, land management practices (i.e., nitrogen fertilizer types, application rates and timings, with/without irrigation), N deposition, and land use and land cover change. This work provides a robust estimate of global N2O emissions from agricultural soils as well as identifies key gaps and limitations in the existing model and data that need to be investigated in the future.

  18. Model uncertainty in safety assessment

    International Nuclear Information System (INIS)

    Pulkkinen, U.; Huovinen, T.

    1996-01-01

    Uncertainty analyses are an essential part of any risk assessment. Usually the uncertainties of reliability model parameter values are described by probability distributions and the uncertainty is propagated through the whole risk model. In addition to the parameter uncertainties, the assumptions behind the risk models may be based on insufficient experimental observations and the models themselves may not be exact descriptions of the phenomena under analysis. The description and quantification of this type of uncertainty, model uncertainty, is the topic of this report. The model uncertainty is characterized and some approaches to model and quantify it are discussed. The emphasis is on so-called mixture models, which have been applied in PSAs. Some of the possible disadvantages of the mixture model are addressed. In addition to quantitative analyses, qualitative analysis is also discussed briefly. To illustrate the models, two simple case studies on failure intensity and human error modeling are described. In both examples, the analysis is based on simple mixture models, which are observed to apply in PSA analyses. (orig.) (36 refs., 6 figs., 2 tabs.)

  19. Parameter-induced uncertainty quantification of soil N2O, NO and CO2 emission from Höglwald spruce forest (Germany) using the LandscapeDNDC model

    Directory of Open Access Journals (Sweden)

    K. Butterbach-Bahl

    2012-10-01

    Assessing the uncertainties of simulation results of ecological models is becoming increasingly important, specifically if these models are used to estimate greenhouse gas emissions at site to regional/national levels. Four general sources of uncertainty affect the outcome of process-based models: (i) uncertainty of information used to initialise and drive the model, (ii) uncertainty of model parameters describing specific ecosystem processes, (iii) uncertainty of the model structure, and (iv) accuracy of measurements (e.g., soil-atmosphere greenhouse gas exchange) which are used for model testing and development. The aim of our study was to assess the simulation uncertainty of the process-based biogeochemical model LandscapeDNDC. For this we set up a Bayesian framework using a Markov Chain Monte Carlo (MCMC) method to estimate the joint model parameter distribution. Data for model testing, parameter estimation and uncertainty assessment were taken from observations of soil fluxes of nitrous oxide (N2O), nitric oxide (NO) and carbon dioxide (CO2) as observed over a 10 yr period at the spruce site of the Höglwald Forest, Germany. By running four independent Markov chains in parallel with identical properties (except for the parameter start values), an objective criterion for chain convergence developed by Gelman et al. (2003) could be used. Our approach shows that by means of the joint parameter distribution, we were able not only to limit the parameter space and specify the probability of parameter values, but also to assess the complex dependencies among model parameters used for simulating soil C and N trace gas emissions. This helped to improve the understanding of the behaviour of the complex LandscapeDNDC model while simulating soil C and N turnover processes and associated C and N soil-atmosphere exchange. 
In a final step the parameter distribution of the most sensitive parameters determining soil-atmosphere C and N exchange were used to obtain
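    The convergence criterion of Gelman et al. used for the four parallel chains above is the potential scale reduction factor, commonly called R-hat. A minimal sketch of the standard diagnostic (illustrative; the function name `gelman_rubin` is an assumption):

```python
import numpy as np

def gelman_rubin(chains):
    """Potential scale reduction factor R-hat for parallel MCMC chains.

    chains: array of shape (m_chains, n_samples), one row per chain.
    Values near 1 indicate the chains have mixed; values well above 1
    (e.g. > 1.1) indicate non-convergence.
    """
    chains = np.asarray(chains, float)
    m, n = chains.shape
    B = n * chains.mean(axis=1).var(ddof=1)   # between-chain variance
    W = chains.var(axis=1, ddof=1).mean()     # mean within-chain variance
    var_plus = (n - 1) / n * W + B / n        # pooled posterior-variance estimate
    return np.sqrt(var_plus / W)
```

    In a multi-parameter setting such as LandscapeDNDC, R-hat is computed per parameter and convergence is declared only when all values are close to 1.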

  20. A framework for model-based optimization of bioprocesses under uncertainty: Identifying critical parameters and operating variables

    DEFF Research Database (Denmark)

    Morales Rodriguez, Ricardo; Meyer, Anne S.; Gernaey, Krist

    2011-01-01

    This study presents the development and application of a systematic model-based framework for bioprocess optimization, evaluated on a cellulosic ethanol production case study. The implementation of the framework involves the use of dynamic simulations, sophisticated uncertainty analysis (Monte...

  1. Identifying the effects of parameter uncertainty on the reliability of modeling the stability of overhanging, multi-layered, river banks

    Science.gov (United States)

    Samadi, A.; Amiri-Tokaldany, E.; Davoudi, M. H.; Darby, S. E.

    2011-11-01

    Composite river banks consist of a basal layer of non-cohesive material overlain by a cohesive layer of fine-grained material. In such banks, fluvial erosion of the lower, non-cohesive, layer typically occurs at a much higher rate than erosion of the upper part of the bank. Consequently, such banks normally develop a cantilevered bank profile, with bank retreat of the upper part of the bank taking place predominantly by the failure of these cantilevers. To predict the undesirable impacts of this type of bank retreat, a number of bank stability models have been presented in the literature. These models typically express bank stability by defining a factor of safety as the ratio of resisting and driving forces acting on the incipient failure block. These forces are affected by a range of controlling factors that include such aspects as the overhanging block geometry, and the geotechnical properties of the bank materials. In this paper, we introduce a new bank stability relation (for shear-type cantilever failures) that considers the hydrological status of cantilevered riverbanks, while beam-type failures are analyzed using a previously proposed relation. We employ these stability models to evaluate the effects of parameter uncertainty on the reliability of riverbank stability modeling of overhanging banks. This is achieved by employing a simple model of overhanging failure with respect to shear and beam failure mechanisms in a series of sensitivity tests and Monte Carlo analyses to identify, for each model parameter, the range of values that induce significant changes in the simulated factor of safety. The results show that care is required in parameterising (i) the geometrical shape of the overhanging-block and (ii) the bank material cohesion and unit weight, as predictions of bank stability are sensitive to variations of these factors.
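    The Monte Carlo sensitivity analysis above propagates parameter uncertainty through a factor-of-safety relation. The toy sketch below illustrates the idea only: FoS = c·L / (γ·V) is a deliberately simplified stand-in for the paper's shear-type cantilever relation, and all distributions and geometry values are hypothetical.

```python
import numpy as np

def fos_monte_carlo(n=20000, seed=0):
    """Toy Monte Carlo on a simplified shear-type cantilever factor of safety.

    Resisting force: cohesion c acting over failure-plane length L.
    Driving force: block weight, unit weight gamma times volume V per
    unit channel width. Returns the mean FoS and P(FoS < 1).
    """
    rng = np.random.default_rng(seed)
    c = rng.normal(10.0, 2.0, n)      # cohesion along failure plane, kPa (assumed)
    gamma = rng.normal(18.0, 1.5, n)  # bulk unit weight, kN/m^3 (assumed)
    L, V = 1.2, 0.4                   # failure-plane length (m) and block area (m^2), held fixed
    fos = c * L / (gamma * V)
    return fos.mean(), float((fos < 1.0).mean())
```

    Repeating such runs while perturbing one input at a time is the sensitivity-test pattern the paper uses to identify which parameters (block geometry, cohesion, unit weight) drive the simulated factor of safety.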

  2. Model uncertainty and probability

    International Nuclear Information System (INIS)

    Parry, G.W.

    1994-01-01

    This paper discusses the issue of model uncertainty. The use of probability as a measure of an analyst's uncertainty as well as a means of describing random processes has caused some confusion, even though the two uses represent different types of uncertainty with respect to modeling a system. The importance of maintaining the distinction between the two types is illustrated with a simple example.

  3. A Bayesian approach for evaluation of the effect of water quality model parameter uncertainty on TMDLs: A case study of Miyun Reservoir

    Energy Technology Data Exchange (ETDEWEB)

    Liang, Shidong, E-mail: emblembl@sina.com [School of Environment, Tsinghua University, 1 Qinghuayuan, Haidian District, Beijing 100084 (China); Jia, Haifeng, E-mail: jhf@tsinghua.edu.cn [School of Environment, Tsinghua University, 1 Qinghuayuan, Haidian District, Beijing 100084 (China); Xu, Changqing, E-mail: 2008changqing@163.com [School of Environment, Tsinghua University, 1 Qinghuayuan, Haidian District, Beijing 100084 (China); Xu, Te, E-mail: xt_lichking@qq.com [School of Environment, Tsinghua University, 1 Qinghuayuan, Haidian District, Beijing 100084 (China); Melching, Charles, E-mail: steve.melching17@gmail.com [Melching Water Solutions, 4030 W. Edgerton Avenue, Greenfield, WI 53221 (United States)

    2016-08-01

    Facing increasingly serious water pollution, the Chinese government is changing the environmental management strategy from solely pollutant concentration control to a Total Maximum Daily Load (TMDL) program, and water quality models are increasingly being applied to determine the allowable pollutant load in the TMDL. Despite the frequent use of models, few studies have focused on how parameter uncertainty in water quality models affects the allowable pollutant loads in the TMDL program, particularly for complicated and high-dimension water quality models. Uncertainty analysis for such models is limited by time-consuming simulation and high-dimensionality and nonlinearity in parameter spaces. In this study, an allowable pollutant load calculation platform was established using the Environmental Fluid Dynamics Code (EFDC), which is a widely applied hydrodynamic-water quality model. A Bayesian approach, i.e. the DiffeRential Evolution Adaptive Metropolis (DREAM) algorithm, which is a high-efficiency, multi-chain Markov Chain Monte Carlo (MCMC) method, was applied to assess the effects of parameter uncertainty on the water quality model simulations and its influence on the allowable pollutant load calculation in the TMDL program. Miyun Reservoir, which is the most important surface drinking water source for Beijing, suffers from eutrophication and was selected as a case study. The relations between pollutant loads and water quality indicators are obtained through a graphical method in the simulation platform. Ranges of allowable pollutant loads were obtained according to the results of parameter uncertainty analysis, i.e. Total Organic Carbon (TOC): 581.5–1030.6 t·yr⁻¹; Total Phosphorus (TP): 23.3–31.0 t·yr⁻¹; and Total Nitrogen (TN): 480–1918.0 t·yr⁻¹. The wide ranges of allowable pollutant loads reveal the importance of parameter uncertainty analysis in a TMDL program for allowable pollutant load calculation and margin of safety (MOS

  4. dOFV distributions: a new diagnostic for the adequacy of parameter uncertainty in nonlinear mixed-effects models applied to the bootstrap.

    Science.gov (United States)

    Dosne, Anne-Gaëlle; Niebecker, Ronald; Karlsson, Mats O

    2016-12-01

    Knowledge of the uncertainty in model parameters is essential for decision-making in drug development. Contrary to other aspects of nonlinear mixed-effects models (NLMEM), scrutiny towards assumptions around parameter uncertainty is low, and no diagnostic exists to judge whether the estimated uncertainty is appropriate. This work aims at introducing a diagnostic capable of assessing the appropriateness of a given parameter uncertainty distribution. The new diagnostic was applied to case bootstrap examples in order to investigate for which dataset sizes the case bootstrap is appropriate for NLMEM. The proposed diagnostic is a plot comparing the distribution of differences in objective function values (dOFV) of the proposed uncertainty distribution to a theoretical chi-square distribution with degrees of freedom equal to the number of estimated model parameters. The uncertainty distribution was deemed appropriate if its dOFV distribution was overlaid with or below the theoretical distribution. The diagnostic was applied to the bootstrap of two real data and two simulated data examples, featuring pharmacokinetic and pharmacodynamic models and datasets of 20-200 individuals with between 2 and 5 observations on average per individual. In the real data examples, the diagnostic indicated that the case bootstrap was unsuitable for NLMEM analyses with around 70 individuals. A measure of parameter-specific "effective" sample size was proposed as a potentially better indicator of bootstrap adequacy than overall sample size. In the simulation examples, bootstrap confidence intervals were shown to underestimate inter-individual variability at low sample sizes. The proposed diagnostic proved a relevant tool for assessing the appropriateness of a given parameter uncertainty distribution and as such it should be routinely used.
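    The dOFV diagnostic above compares, quantile by quantile, the objective-function differences of parameter vectors drawn from the proposed uncertainty distribution against a chi-square reference with one degree of freedom per estimated parameter. A minimal sketch of the quantile comparison feeding such a plot (illustrative; `dofv_diagnostic` is a hypothetical name, not part of any NLMEM tool):

```python
import numpy as np
from scipy import stats

def dofv_diagnostic(ofv_samples, ofv_mle, n_params):
    """Empirical vs. theoretical dOFV quantiles for a Q-Q style plot.

    ofv_samples: objective function values of sampled parameter vectors
    re-evaluated on the original data; ofv_mle: OFV at the final estimates.
    Returns sorted empirical dOFV values and matching chi-square quantiles.
    """
    dofv = np.sort(np.asarray(ofv_samples, float) - ofv_mle)
    p = (np.arange(1, len(dofv) + 1) - 0.5) / len(dofv)  # plotting positions
    return dofv, stats.chi2.ppf(p, df=n_params)
```

    Per the criterion in the abstract, the proposed uncertainty distribution is deemed acceptable when the empirical curve lies on or below the theoretical one.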

  5. Activated sludge model 2d calibration with full-scale WWTP data: comparing model parameter identifiability with influent and operational uncertainty.

    Science.gov (United States)

    Machado, Vinicius Cunha; Lafuente, Javier; Baeza, Juan Antonio

    2014-07-01

    The present work developed a model of a full-scale wastewater treatment plant (WWTP) (Manresa, Catalonia, Spain) for further plant upgrades, based on systematic parameter calibration of the activated sludge model 2d (ASM2d) with a methodology built on the Fisher information matrix. The influent was characterized for the application of the ASM2d, and the confidence intervals of the calibrated parameters were also assessed. No expert knowledge was necessary for model calibration, and a large plant database was converted into more useful information. The effect of the influent and operating variables on the model fit was also studied by using these variables as calibrating parameters while keeping the ASM2d kinetic and stoichiometric parameters, which traditionally are the calibration parameters, at their default values. Such an "inversion" of the traditional way of model fitting allowed evaluation of the sensitivity of the main model outputs to changes in the influent and the operating variables. This new approach is able to evaluate the capacity of the operational variables used by the WWTP feedback control loops to overcome external disturbances in the influent and uncertainties in the kinetic/stoichiometric model parameters. In addition, the study of the influence of operating variables on the model outputs provides useful information to select input and output variables in decentralized control structures.
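    Fisher-information-based confidence intervals of the kind used above can be sketched for the least-squares case: the FIM is approximated by J^T J / σ², where J is the output sensitivity (Jacobian) with respect to the parameters, and its inverse bounds the parameter covariance. This is a generic textbook sketch, not the ASM2d calibration code; `fim_confidence` is a hypothetical name.

```python
import numpy as np

def fim_confidence(jacobian, residuals, z=1.96):
    """Approximate parameter confidence half-widths from the FIM.

    For a least-squares fit, FIM ≈ J^T J / sigma^2 with sigma^2 estimated
    from the residuals; half-widths are z * sqrt(diag(FIM^-1)).
    """
    J = np.asarray(jacobian, float)
    r = np.asarray(residuals, float)
    dof = len(r) - J.shape[1]               # residual degrees of freedom
    sigma2 = r @ r / dof                    # noise variance estimate
    cov = sigma2 * np.linalg.inv(J.T @ J)   # Cramér-Rao style covariance bound
    return z * np.sqrt(np.diag(cov))
```

    Wide intervals flag poorly identifiable parameters, which is precisely the identifiability information the FIM methodology extracts from the plant database.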

  6. Estimation of Modal Parameters and their Uncertainties

    DEFF Research Database (Denmark)

    Andersen, P.; Brincker, Rune

    1999-01-01

    In this paper it is shown how to estimate the modal parameters of a dynamic system, as well as their uncertainties, using the prediction error method on the basis of output measurements only. The estimation scheme is assessed by means of a simulation study. As a part of the introduction, an example...... is given showing how the uncertainty estimates can be used in applications such as damage detection....

  7. Parametric uncertainty and global sensitivity analysis in a model of the carotid bifurcation: Identification and ranking of most sensitive model parameters.

    Science.gov (United States)

    Gul, R; Bernhard, S

    2015-11-01

    In computational cardiovascular models, parameters are one of the major sources of uncertainty, which make the models unreliable and less predictive. In order to achieve predictive models that allow the investigation of cardiovascular diseases, sensitivity analysis (SA) can be used to quantify and reduce the uncertainty in outputs (pressure and flow) caused by input (electrical and structural) model parameters. In the current study, three variance-based global sensitivity analysis (GSA) methods (Sobol, FAST and a sparse grid stochastic collocation technique based on the Smolyak algorithm) were applied to a lumped parameter model of the carotid bifurcation. Sensitivity analysis was carried out to identify and rank the most sensitive parameters as well as to fix less sensitive parameters at their nominal values (factor fixing). In this context, network-location and temporal dependent sensitivities were also discussed to identify optimal measurement locations in the carotid bifurcation and optimal temporal regions for each parameter in the pressure and flow waves, respectively. Results show that, for both pressure and flow, flow resistance (R), diameter (d) and length of the vessel (l) are sensitive within the right common carotid (RCC), right internal carotid (RIC) and right external carotid (REC) arteries, while compliance of the vessels (C) and blood inertia (L) are sensitive only at RCC. Moreover, Young's modulus (E) and wall thickness (h) exhibit less sensitivity on pressure and flow at all locations of the carotid bifurcation. Results of network location and temporal variabilities revealed that most of the sensitivity was found in common time regions, i.e. early systole, peak systole and end systole.
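    Of the three GSA methods named above, the Sobol first-order index is the most direct to sketch: it is the fraction of output variance explained by one input alone, estimated here with the standard pick-and-freeze (Saltelli) estimator on a unit hypercube. Illustrative only; the paper's model and samplers are not reproduced, and `sobol_first_order` is an assumed name.

```python
import numpy as np

def sobol_first_order(f, d, n, seed=0):
    """First-order Sobol indices S_i for f on [0, 1]^d.

    Uses two independent sample matrices A and B; for input i, column i of
    A is swapped with that of B (pick-and-freeze) and the Saltelli (2010)
    estimator V_i = E[f(B) * (f(AB_i) - f(A))] is normalized by Var[f].
    """
    rng = np.random.default_rng(seed)
    A, B = rng.random((n, d)), rng.random((n, d))
    fA, fB = f(A), f(B)
    var = np.concatenate([fA, fB]).var()
    S = np.empty(d)
    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]                        # freeze all inputs except the i-th
        S[i] = np.mean(fB * (f(ABi) - fA)) / var   # Saltelli (2010) estimator
    return S
```

    Parameters with S_i near zero are candidates for factor fixing at nominal values, exactly the use made of the GSA results in the study above.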

  8. Methods for Quantifying the Uncertainties of LSIT Test Parameters, Test Results, and Full-Scale Mixing Performance Using Models Developed from Scaled Test Data

    International Nuclear Information System (INIS)

    Piepel, Gregory F.; Cooley, Scott K.; Kuhn, William L.; Rector, David R.; Heredia-Langner, Alejandro

    2015-01-01

    This report discusses the statistical methods for quantifying uncertainties in 1) test responses and other parameters in the Large Scale Integrated Testing (LSIT), and 2) estimates of coefficients and predictions of mixing performance from models that relate test responses to test parameters. Testing at a larger scale has been committed to by Bechtel National, Inc. and the U.S. Department of Energy (DOE) to “address uncertainties and increase confidence in the projected, full-scale mixing performance and operations” in the Waste Treatment and Immobilization Plant (WTP).

  9. Methods for Quantifying the Uncertainties of LSIT Test Parameters, Test Results, and Full-Scale Mixing Performance Using Models Developed from Scaled Test Data

    Energy Technology Data Exchange (ETDEWEB)

    Piepel, Gregory F. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Cooley, Scott K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kuhn, William L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Rector, David R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Heredia-Langner, Alejandro [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-05-01

    This report discusses the statistical methods for quantifying uncertainties in 1) test responses and other parameters in the Large Scale Integrated Testing (LSIT), and 2) estimates of coefficients and predictions of mixing performance from models that relate test responses to test parameters. Testing at a larger scale has been committed to by Bechtel National, Inc. and the U.S. Department of Energy (DOE) to “address uncertainties and increase confidence in the projected, full-scale mixing performance and operations” in the Waste Treatment and Immobilization Plant (WTP).

  10. A generalized Lyapunov theory for robust root clustering of linear state space models with real parameter uncertainty

    Science.gov (United States)

    Yedavalli, R. K.

    1992-01-01

    The problem of analyzing and designing controllers for linear systems subject to real parameter uncertainty is considered. An elegant, unified theory for robust eigenvalue placement is presented for a class of D-regions defined by algebraic inequalities by extending the nominal matrix root clustering theory of Gutman and Jury (1981) to linear uncertain time systems. The author presents explicit conditions for matrix root clustering for different D-regions and establishes the relationship between the eigenvalue migration range and the parameter range. The bounds are all obtained by one-shot computation in the matrix domain and do not need any frequency sweeping or parameter gridding. The method uses the generalized Lyapunov theory for getting the bounds.
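    A minimal numerical sketch of the idea that a root-clustering region bounds an admissible parameter range: for a hypothetical 2x2 state matrix with one real uncertain entry, a Routh-type test (trace < 0, det > 0) stands in for the paper's Lyapunov machinery, and bisection finds the eigenvalue migration limit:

```python
# Hypothetical 2x2 state matrix with one real uncertain parameter delta:
# A(delta) = [[0, 1], [-2 + delta, -3]]  (not from the cited paper).
# For a 2x2 matrix, Hurwitz stability <=> trace < 0 and det > 0.
def is_stable(delta):
    a11, a12 = 0.0, 1.0
    a21, a22 = -2.0 + delta, -3.0
    trace = a11 + a22
    det = a11 * a22 - a12 * a21
    return trace < 0.0 and det > 0.0

# Bisection for the largest delta >= 0 keeping all eigenvalues in the
# open left half plane (the "eigenvalue migration range").
lo, hi = 0.0, 10.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if is_stable(mid):
        lo = mid
    else:
        hi = mid
print(round(lo, 6))  # → 2.0
```

    Here det(A) = 2 - delta, so the analytic stability bound is delta < 2, which the bisection recovers; the cited one-shot matrix-domain bounds avoid such parameter sweeps.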

  11. A Bayesian approach for evaluation of the effect of water quality model parameter uncertainty on TMDLs: A case study of Miyun Reservoir.

    Science.gov (United States)

    Liang, Shidong; Jia, Haifeng; Xu, Changqing; Xu, Te; Melching, Charles

    2016-08-01

    Facing increasingly serious water pollution, the Chinese government is changing its environmental management strategy from sole reliance on pollutant concentration control to a Total Maximum Daily Load (TMDL) program, and water quality models are increasingly being applied to determine the allowable pollutant load in the TMDL. Despite the frequent use of models, few studies have focused on how parameter uncertainty in water quality models affects the allowable pollutant loads in the TMDL program, particularly for complicated, high-dimensional water quality models. Uncertainty analysis for such models is limited by time-consuming simulations and by the high dimensionality and nonlinearity of the parameter space. In this study, an allowable pollutant load calculation platform was established using the Environmental Fluid Dynamics Code (EFDC), a widely applied hydrodynamic-water quality model. A Bayesian approach, the DiffeRential Evolution Adaptive Metropolis (DREAM) algorithm, which is a high-efficiency, multi-chain Markov Chain Monte Carlo (MCMC) method, was applied to assess the effects of parameter uncertainty on the water quality model simulations and its influence on the allowable pollutant load calculation in the TMDL program. Miyun Reservoir, the most important surface drinking water source for Beijing, suffers from eutrophication and was selected as a case study. The relations between pollutant loads and water quality indicators were obtained through a graphical method in the simulation platform. Ranges of allowable pollutant loads were obtained according to the results of the parameter uncertainty analysis, i.e. Total Organic Carbon (TOC): 581.5-1030.6 t·yr⁻¹; Total Phosphorus (TP): 23.3-31.0 t·yr⁻¹; and Total Nitrogen (TN): 480-1918.0 t·yr⁻¹. The wide ranges of allowable pollutant loads reveal the importance of parameter uncertainty analysis in a TMDL program for allowable pollutant load calculation and margin of safety (MOS) determination. The sources
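    A single-chain random-walk Metropolis sampler is sketched below as a drastically simplified stand-in for the multi-chain DREAM algorithm; the data, the known error variance, and the flat prior are all synthetic assumptions, not the EFDC/Miyun setup:

```python
import random, math

random.seed(1)

# Synthetic "observations" of a water-quality indicator (illustrative).
data = [random.gauss(5.0, 1.0) for _ in range(200)]
sigma = 1.0  # assumed known observation-error standard deviation

def log_post(mu):
    # Flat prior, so the log-posterior is the Gaussian log-likelihood
    # up to a constant.
    return -sum((x - mu) ** 2 for x in data) / (2.0 * sigma ** 2)

# Random-walk Metropolis: a single-chain cousin of DREAM's
# multi-chain adaptive sampler.
mu, samples = 0.0, []
lp = log_post(mu)
for i in range(20000):
    prop = mu + random.gauss(0.0, 0.2)       # symmetric proposal
    lp_prop = log_post(prop)
    if math.log(random.random()) < lp_prop - lp:
        mu, lp = prop, lp_prop               # accept
    if i >= 5000:                            # discard burn-in
        samples.append(mu)

post_mean = sum(samples) / len(samples)
print(round(post_mean, 2))
```

    With a flat prior the posterior mean coincides with the sample mean; posterior quantiles of the chain would then bound the allowable-load ranges as in the abstract.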

  12. Uncertainty propagation within the UNEDF models

    Science.gov (United States)

    Haverinen, T.; Kortelainen, M.

    2017-04-01

    The parameters of nuclear energy density functionals have to be adjusted to experimental data. As a result they carry a certain uncertainty, which then propagates to calculated values of observables. In the present work we quantify the statistical uncertainties of binding energies, proton quadrupole moments and proton matter radii for three UNEDF Skyrme energy density functionals by taking advantage of the knowledge of the model parameter uncertainties. We find that the uncertainty of the UNEDF models increases rapidly when going towards proton- or neutron-rich nuclei. We also investigate the impact of each model parameter on the total error budget.
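    In the linear approximation, propagating parameter uncertainty to an observable, as in such statistical error budgets, reduces to the sandwich formula sigma_O^2 = J C J^T; the sensitivity vector and covariance matrix below are invented numbers for illustration, not UNEDF values:

```python
# Linear (sandwich) propagation of a parameter covariance matrix C to
# an observable O: sigma_O^2 = J C J^T, with J_i = dO/dp_i the
# sensitivities.  All numbers are illustrative.
J = [0.5, -1.2, 2.0]                      # sensitivities to 3 parameters
C = [[0.04, 0.01, 0.00],                  # parameter covariance matrix
     [0.01, 0.09, 0.02],
     [0.00, 0.02, 0.25]]

var_O = sum(J[i] * C[i][j] * J[j] for i in range(3) for j in range(3))
sigma_O = var_O ** 0.5
print(round(sigma_O, 4))  # → 1.0157
```

    The diagonal terms J_i^2 C_ii give each parameter's individual share of the error budget; here the third parameter dominates.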

  13. Model uncertainty: Probabilities for models?

    International Nuclear Information System (INIS)

    Winkler, R.L.

    1994-01-01

    Like any other type of uncertainty, model uncertainty should be treated in terms of probabilities. The question is how to do this. The most commonly-used approach has a drawback related to the interpretation of the probabilities assigned to the models. If we step back and look at the big picture, asking what the appropriate focus of the model uncertainty question should be in the context of risk and decision analysis, we see that a different probabilistic approach makes more sense, although it raises some implementation questions. Current work that is underway to address these questions looks very promising.

  14. Flood modelling : Parameterisation and inflow uncertainty

    NARCIS (Netherlands)

    Mukolwe, M.M.; Di Baldassarre, G.; Werner, M.; Solomatine, D.P.

    2014-01-01

    This paper presents an analysis of uncertainty in hydraulic modelling of floods, focusing on the inaccuracy caused by inflow errors and parameter uncertainty. In particular, the study develops a method to propagate the uncertainty induced by, firstly, application of a stage–discharge rating curve

  15. Uncertainties in repository modeling

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, J.R.

    1996-12-31

    The distant future is very difficult to predict. Unfortunately, our regulators are being encouraged to extend the regulatory period from the standard 10,000 years to 1 million years. Such overconfidence is not justified due to uncertainties in dating, calibration, and modeling.

  16. Uncertainties in repository modeling

    International Nuclear Information System (INIS)

    Wilson, J.R.

    1996-01-01

    The distant future is very difficult to predict. Unfortunately, our regulators are being encouraged to extend the regulatory period from the standard 10,000 years to 1 million years. Such overconfidence is not justified due to uncertainties in dating, calibration, and modeling.

  17. Robustness of dynamic systems with parameter uncertainties

    CERN Document Server

    Balemi, S; Truöl, W

    1992-01-01

    Robust Control is one of the fastest growing and promising areas of research today. In many practical systems there exist uncertainties which have to be considered in the analysis and design of control systems. In the last decade methods were developed for dealing with dynamic systems with unstructured uncertainties such as H∞- and l1-optimal control. For systems with parameter uncertainties, the seminal paper of V. L. Kharitonov has triggered a large amount of very promising research. An international workshop dealing with all aspects of robust control was successfully organized by S. P. Bhattacharyya and L. H. Keel in San Antonio, Texas, USA in March 1991. We organized the second international workshop in this area in Ascona, Switzerland in April 1992. However, this second workshop was restricted to robust control of dynamic systems with parameter uncertainties with the objective to concentrate on some aspects of robust control. This book contains a collection of papers presented at the International W...

  18. Evaluation of bootstrap methods for estimating uncertainty of parameters in nonlinear mixed-effects models: a simulation study in population pharmacokinetics.

    Science.gov (United States)

    Thai, Hoai-Thu; Mentré, France; Holford, Nicholas H G; Veyrat-Follet, Christine; Comets, Emmanuelle

    2014-02-01

    Bootstrap methods are used in many disciplines to estimate the uncertainty of parameters, including in multi-level or linear mixed-effects models. Residual-based bootstrap methods, which resample both random effects and residuals, are an alternative to the case bootstrap, which resamples the individuals. Most PKPD applications use the case bootstrap, for which software is available. In this study, we evaluated the performance of three bootstrap methods (case bootstrap, nonparametric residual bootstrap and parametric bootstrap) by a simulation study and compared them to an asymptotic method (Asym) in estimating the uncertainty of parameters in nonlinear mixed-effects models (NLMEM) with heteroscedastic error. The simulation used the PK model for aflibercept, an anti-angiogenic drug, as an example. As expected, we found that the bootstrap methods provided better estimates of uncertainty for parameters in NLMEM with high nonlinearity and balanced designs compared to the Asym, as implemented in MONOLIX. Overall, the parametric bootstrap performed better than the case bootstrap, as the true model and variance distribution were used. However, the case bootstrap is faster and simpler, as it makes no assumptions on the model and preserves both between-subject and residual variability in one resampling step. The performance of the nonparametric residual bootstrap was found to be limited when applied to NLMEM, due to its failure to reflate the variance before resampling in unbalanced designs, where the Asym and the parametric bootstrap performed well and better than the case bootstrap even with stratification.
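    The contrast between case and parametric bootstrap can be sketched on a deliberately trivial estimator (the mean of i.i.d. data) rather than an NLMEM; the data and distributional choices below are illustrative only:

```python
import random, statistics

random.seed(2)

# Toy data standing in for individual parameter estimates.
data = [random.gauss(10.0, 2.0) for _ in range(100)]
n = len(data)
B = 2000  # bootstrap replicates

def mean(xs):
    return sum(xs) / len(xs)

# Case bootstrap: resample the individuals with replacement,
# no assumption about the data-generating model.
case_means = [mean([random.choice(data) for _ in range(n)])
              for _ in range(B)]

# Parametric bootstrap: simulate new datasets from the fitted
# (here: normal) model, so the assumed model is reused.
mu_hat = mean(data)
sd_hat = statistics.stdev(data)
par_means = [mean([random.gauss(mu_hat, sd_hat) for _ in range(n)])
             for _ in range(B)]

se_case = statistics.stdev(case_means)
se_par = statistics.stdev(par_means)
se_asym = sd_hat / n ** 0.5   # asymptotic standard error
print(round(se_case, 3), round(se_par, 3), round(se_asym, 3))
```

    For this well-behaved estimator all three standard errors agree; the abstract's point is that they diverge for nonlinear mixed-effects models with unbalanced designs.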

  19. Uncertainties in the Antarctic Ice Sheet Contribution to Sea Level Rise: Exploration of Model Response to Errors in Climate Forcing, Boundary Conditions, and Internal Parameters

    Science.gov (United States)

    Schlegel, N.; Seroussi, H. L.; Boening, C.; Larour, E. Y.; Limonadi, D.; Schodlok, M.; Watkins, M. M.

    2017-12-01

    The Jet Propulsion Laboratory-University of California at Irvine Ice Sheet System Model (ISSM) is a thermo-mechanical 2D/3D parallelized finite element software package used to physically model the continental-scale flow of ice at high resolutions. Embedded into ISSM are uncertainty quantification (UQ) tools based on the Design Analysis Kit for Optimization and Terascale Applications (DAKOTA) software. ISSM-DAKOTA offers various UQ methods for investigating how errors in model input impact uncertainty in simulation results. We utilize these tools to regionally sample model input and key parameters, based on specified bounds of uncertainty, and run a suite of continental-scale 100-year ISSM forward simulations of the Antarctic Ice Sheet. The resulting diagnostics (e.g., spread in local mass flux and regional mass balance) inform our conclusions about which parameters and/or forcings have the greatest impact on century-scale model simulations of ice sheet evolution. The results allow us to prioritize the key datasets and measurements that are critical for minimizing ice sheet model uncertainty. Overall, we find that Antarctica's total sea level contribution is strongly affected by grounding line retreat, which is driven by the magnitude of ice shelf basal melt rates and by errors in bedrock topography. In addition, results suggest that after 100 years of simulation, Thwaites Glacier is the most significant source of model uncertainty, and its drainage basin has the largest potential for future sea level contribution. This work is performed at and supported by the California Institute of Technology's Jet Propulsion Laboratory. Supercomputing time is also supported through a contract with the National Aeronautics and Space Administration's Cryosphere program.

  20. Parameter Uncertainty for Repository Thermal Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Hardin, Ernest [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hadgu, Teklu [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Greenberg, Harris [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Dupont, Mark [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-10-01

    This report is one follow-on to a study of reference geologic disposal design concepts (Hardin et al. 2011a). Based on an analysis of maximum temperatures, that study concluded that certain disposal concepts would require extended decay storage prior to emplacement, or the use of small waste packages, or both. The study used nominal values for thermal properties of host geologic media and engineered materials, demonstrating the need for uncertainty analysis to support the conclusions. This report is a first step that identifies the input parameters of the maximum temperature calculation, surveys published data on measured values, uses an analytical approach to determine which parameters are most important, and performs an example sensitivity analysis. Using results from this first step, temperature calculations planned for FY12 can focus on only the important parameters, and can use the uncertainty ranges reported here. The survey of published information on thermal properties of geologic media and engineered materials, is intended to be sufficient for use in generic calculations to evaluate the feasibility of reference disposal concepts. A full compendium of literature data is beyond the scope of this report. The term “uncertainty” is used here to represent both measurement uncertainty and spatial variability, or variability across host geologic units. For the most important parameters (e.g., buffer thermal conductivity) the extent of literature data surveyed samples these different forms of uncertainty and variability. Finally, this report is intended to be one chapter or section of a larger FY12 deliverable summarizing all the work on design concepts and thermal load management for geologic disposal (M3FT-12SN0804032, due 15Aug2012).

  1. Uncertainty in the determination of soil hydraulic parameters and its influence on the performance of two hydrological models of different complexity

    Directory of Open Access Journals (Sweden)

    G. Baroni

    2010-02-01

    Full Text Available Data on soil hydraulic properties are often a limiting factor in unsaturated zone modelling, especially at the larger scales. Investigations for the hydraulic characterization of soils are time-consuming and costly, and the accuracy of the results obtained by the different methodologies is still debated. However, we may wonder how the uncertainty in soil hydraulic parameters relates to the uncertainty of the selected modelling approach. We performed an intensive monitoring study during the cropping season of a 10 ha maize field in Northern Italy. The data were used to: (i) compare different methods for determining soil hydraulic parameters and (ii) evaluate the effect of the uncertainty in these parameters on different variables (i.e. evapotranspiration, average water content in the root zone, flux at the bottom boundary of the root zone) simulated by two hydrological models of different complexity: SWAP, a widely used model of soil moisture dynamics in unsaturated soils based on the Richards equation, and ALHyMUS, a conceptual model of the same dynamics based on a reservoir cascade scheme. We employed five direct and indirect methods to determine soil hydraulic parameters for each horizon of the experimental profile. Two methods were based on a parameter optimization of (a) laboratory-measured retention and hydraulic conductivity data and (b) field-measured retention and hydraulic conductivity data. The remaining three methods were based on the application of widely used Pedo-Transfer Functions: (c) Rawls and Brakensiek, (d) HYPRES, and (e) ROSETTA. Simulations were performed using meteorological, irrigation and crop data measured at the experimental site during the period June – October 2006. Results showed a wide range of soil hydraulic parameter values generated with the different methods, especially for the saturated hydraulic conductivity Ksat and the shape parameter α of the van Genuchten curve. This is reflected in a variability of
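    The van Genuchten shape parameters α and n mentioned above enter the water retention curve θ(h); a minimal sketch with illustrative parameter values (not those of the experimental site):

```python
# van Genuchten (1980) retention curve: volumetric water content as a
# function of pressure head h (cm, negative = suction).  The default
# parameter values below are illustrative, not site-specific.
def theta(h, theta_r=0.05, theta_s=0.43, alpha=0.04, n=1.6):
    if h >= 0:
        return theta_s                 # saturated
    m = 1.0 - 1.0 / n
    return theta_r + (theta_s - theta_r) / (1.0 + (alpha * abs(h)) ** n) ** m

# Water content at saturation and at strong suction:
print(round(theta(0), 3), round(theta(-1000), 3))
```

    Uncertainty in α and n shifts this curve, which is how the spread in fitted hydraulic parameters propagates into the simulated water contents and fluxes discussed in the abstract.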

  2. Uncertainty quantification for environmental models

    Science.gov (United States)

    Hill, Mary C.; Lu, Dan; Kavetski, Dmitri; Clark, Martyn P.; Ye, Ming

    2012-01-01

    Environmental models are used to evaluate the fate of fertilizers in agricultural settings (including soil denitrification), the degradation of hydrocarbons at spill sites, and water supply for people and ecosystems in small to large basins and cities—to mention but a few applications of these models. They also play a role in understanding and diagnosing potential environmental impacts of global climate change. The models are typically mildly to extremely nonlinear. The persistent demand for enhanced dynamics and resolution to improve model realism [17] means that lengthy individual model execution times will remain common, notwithstanding continued enhancements in computer power. In addition, high-dimensional parameter spaces are often defined, which increases the number of model runs required to quantify uncertainty [2]. Some environmental modeling projects have access to extensive funding and computational resources; many do not. The many recent studies of uncertainty quantification in environmental model predictions have focused on uncertainties related to data error and sparsity of data, expert judgment expressed mathematically through prior information, poorly known parameter values, and model structure (see, for example, [1,7,9,10,13,18]). Approaches for quantifying uncertainty include frequentist (potentially with prior information [7,9]), Bayesian [13,18,19], and likelihood-based. 
A few of the numerous methods, including some sensitivity and inverse methods with consequences for understanding and quantifying uncertainty, are as follows: Bayesian hierarchical modeling and Bayesian model averaging; single-objective optimization with error-based weighting [7] and multi-objective optimization [3]; methods based on local derivatives [2,7,10]; screening methods like OAT (one at a time) and the method of Morris [14]; FAST (Fourier amplitude sensitivity testing) [14]; the Sobol' method [14]; randomized maximum likelihood [10]; Markov chain Monte Carlo (MCMC) [10
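    One of the screening methods listed above, the elementary-effects method of Morris, can be sketched in a few lines; the three-parameter toy model is hypothetical:

```python
import random

random.seed(3)

# Toy three-parameter model on the unit cube (illustrative): the third
# input is deliberately inert.
def f(x):
    return 2.0 * x[0] + 0.5 * x[1] ** 2 + 0.0 * x[2]

k, r, delta = 3, 50, 0.25        # inputs, trajectories, step size
effects = [[] for _ in range(k)]
for _ in range(r):
    x = [random.random() for _ in range(k)]
    base = f(x)
    for i in range(k):
        xp = list(x)
        # Step up if it stays in [0, 1], otherwise step down.
        xp[i] = x[i] + delta if x[i] + delta <= 1.0 else x[i] - delta
        step = xp[i] - x[i]
        effects[i].append((f(xp) - base) / step)

# mu* (mean absolute elementary effect) ranks input importance.
mu_star = [sum(abs(e) for e in es) / r for es in effects]
print([round(m, 2) for m in mu_star])
```

    mu* flags the linear input as most influential and the inert one as negligible, which is the screening role OAT/Morris plays before more expensive variance-based analyses.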

  3. Impact of Uncertainty Characterization of Satellite Rainfall Inputs and Model Parameters on Hydrological Data Assimilation with the Ensemble Kalman Filter for Flood Prediction

    Science.gov (United States)

    Vergara, H. J.; Kirstetter, P.; Hong, Y.; Gourley, J. J.; Wang, X.

    2013-12-01

    The Ensemble Kalman Filter (EnKF) is arguably the assimilation approach that has found the widest application in hydrologic modeling. Its relatively easy implementation and computational efficiency make it an attractive method for research and operational purposes. However, the scientific literature featuring this approach lacks guidance on how the errors in the forecast need to be characterized so as to get the required corrections from the assimilation process. Moreover, several studies have indicated that the performance of the EnKF is 'sub-optimal' when assimilating certain hydrologic observations. Likewise, some authors have suggested that the underlying assumptions of the Kalman Filter and its dependence on linear dynamics make the EnKF unsuitable for hydrologic modeling. Such assertions are often based on the ineffectiveness and poor robustness of EnKF implementations resulting from restrictive specification of error characteristics and the absence of a priori information about error magnitudes. Therefore, understanding the capabilities and limitations of the EnKF to improve hydrologic forecasts requires studying its sensitivity to the manner in which errors in the hydrologic modeling system are represented through ensembles. This study presents a methodology that explores various uncertainty representation configurations to characterize the errors in the hydrologic forecasts in a data assimilation context. The uncertainty in rainfall inputs is represented through a Generalized Additive Model for Location, Scale, and Shape (GAMLSS), which provides information about second-order statistics of quantitative precipitation estimate (QPE) error. The uncertainty in model parameters is described by adding perturbations based on parameter covariance information. The method allows for the identification of rainfall and parameter perturbation combinations for which the performance of the EnKF is 'optimal' given a set of objective functions.
In this process, information about
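    A scalar EnKF analysis step with perturbed observations, as a minimal stand-in for the flood-forecasting assimilation described above; the ensemble size, error variances, and forecast bias are invented:

```python
import random

random.seed(4)

# Scalar EnKF analysis step with perturbed observations.
N = 500
R = 0.25                                   # observation-error variance
obs = 2.0 + random.gauss(0.0, R ** 0.5)    # noisy observation of truth 2.0

# Forecast ensemble: biased mean and wide spread, mimicking combined
# rainfall-input and parameter perturbations.
forecast = [3.0 + random.gauss(0.0, 1.0) for _ in range(N)]
fmean = sum(forecast) / N
Pf = sum((x - fmean) ** 2 for x in forecast) / (N - 1)

K = Pf / (Pf + R)                          # Kalman gain
# Each member assimilates a freshly perturbed copy of the observation.
analysis = [x + K * (obs + random.gauss(0.0, R ** 0.5) - x)
            for x in forecast]
amean = sum(analysis) / N
Pa = sum((x - amean) ** 2 for x in analysis) / (N - 1)

print(round(fmean, 2), round(amean, 2), round(Pf, 2), round(Pa, 2))
```

    The update pulls the ensemble mean toward the observation and shrinks the spread (Pa < Pf); how Pf is built from rainfall and parameter perturbations is exactly the design question the abstract studies.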

  4. Groundwater travel time uncertainty analysis. Sensitivity of results to model geometry, and correlations and cross correlations among input parameters

    International Nuclear Information System (INIS)

    Clifton, P.M.

    1985-03-01

    This study examines the sensitivity of the travel time distribution predicted by a reference case model to (1) scale of representation of the model parameters, (2) size of the model domain, (3) correlation range of log-transmissivity, and (4) cross correlations between transmissivity and effective thickness. The basis for the reference model is the preliminary stochastic travel time model previously documented by the Basalt Waste Isolation Project. Results of this study show the following. The variability of the predicted travel times can be adequately represented when the ratio between the size of the zones used to represent the model parameters and the log-transmissivity correlation range is less than about one-fifth. The size of the model domain and the types of boundary conditions can have a strong impact on the distribution of travel times. Longer log-transmissivity correlation ranges cause larger variability in the predicted travel times. Positive cross correlation between transmissivity and effective thickness causes a decrease in the travel time variability. These results demonstrate the need for a sound conceptual model prior to conducting a stochastic travel time analysis

  5. Uncertainty analysis of flexible rotors considering fuzzy parameters and fuzzy-random parameters

    Directory of Open Access Journals (Sweden)

    Fabian Andres Lara-Molina

    Full Text Available The components of flexible rotors are subjected to uncertainties. The main sources of uncertainty include the variation of mechanical properties. This contribution aims at analyzing the dynamics of flexible rotors under uncertain parameters modeled as fuzzy and fuzzy random variables. The uncertainty analysis encompasses the modeling of uncertain parameters and the numerical simulation of the corresponding flexible rotor model by using an approach based on fuzzy dynamic analysis. The numerical simulation is accomplished by mapping the fuzzy parameters of the deterministic flexible rotor model. Thereby, the flexible rotor is modeled by using both the Fuzzy Finite Element Method and the Fuzzy Stochastic Finite Element Method. Numerical simulations illustrate the methodology in terms of orbits and frequency response functions subject to uncertain parameters.
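    Fuzzy parameters of the kind described above are commonly propagated through a model via alpha-cuts; the triangular fuzzy stiffness and the single-degree-of-freedom natural-frequency formula below are illustrative, not the flexible-rotor model:

```python
import math

# Interval of a triangular fuzzy number (lo, peak, hi) at membership
# level alpha; alpha = 1 collapses to the crisp peak value.
def tri_alpha_cut(lo, peak, hi, alpha):
    return (lo + alpha * (peak - lo), hi - alpha * (hi - peak))

# Toy model: natural frequency f = sqrt(k/m)/(2*pi) with a fuzzy
# stiffness k (N/m) and a crisp mass m (kg).  Values are invented.
m = 1.0
for alpha in (0.0, 0.5, 1.0):
    k_lo, k_hi = tri_alpha_cut(900.0, 1000.0, 1100.0, alpha)
    # f is monotone increasing in k, so interval endpoints map to
    # endpoints; non-monotone models would need an interval search.
    f_lo = math.sqrt(k_lo / m) / (2.0 * math.pi)
    f_hi = math.sqrt(k_hi / m) / (2.0 * math.pi)
    print(alpha, round(f_lo, 3), round(f_hi, 3))
```

    Stacking the output intervals over alpha reconstructs the fuzzy membership function of the response, the fuzzy analogue of the frequency-response envelopes in the abstract.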

  6. Uncertainties in modelling CH4 emissions from northern wetlands in glacial climates: the role of vegetation parameters

    Directory of Open Access Journals (Sweden)

    J. van Huissteden

    2011-10-01

    Full Text Available Marine Isotope Stage 3 (MIS 3) interstadials are marked by a sharp increase in the atmospheric methane (CH4) concentration, as recorded in ice cores. Wetlands are assumed to be the major source of this CH4, although several other hypotheses have been advanced. Modelling of CH4 emissions is crucial to quantify CH4 sources for past climates. Vegetation effects are generally highly generalized in modelling past and present-day CH4 fluxes, but should not be neglected. Plants strongly affect the soil-atmosphere exchange of CH4 and the net primary production of the vegetation supplies organic matter as substrate for methanogens. For modelling past CH4 fluxes from northern wetlands, assumptions on vegetation are highly relevant since paleobotanical data indicate large differences in Last Glacial (LG) wetland vegetation composition as compared to modern wetland vegetation. Besides more cold-adapted vegetation, Sphagnum mosses appear to be much less dominant during large parts of the LG than at present, which particularly affects CH4 oxidation and transport. To evaluate the effect of vegetation parameters, we used the PEATLAND-VU wetland CO2/CH4 model to simulate emissions from wetlands in continental Europe during LG and modern climates. We tested the effect of parameters influencing oxidation during plant transport (fox), vegetation net primary production (NPP, parameter symbol Pmax), plant transport rate (Vtransp), maximum rooting depth (Zroot) and root exudation rate (fex). Our model results show that modelled CH4 fluxes are sensitive to fox and Zroot in particular. The effects of Pmax, Vtransp and fex are of lesser relevance. Interactions with water table modelling are significant for Vtransp. We conducted experiments with different wetland vegetation types for Marine Isotope Stage 3 (MIS 3) stadial and interstadial climates and the present-day climate, by coupling PEATLAND-VU to high resolution climate model simulations for Europe. Experiments assuming

  7. Bioprocess optimization under uncertainty using ensemble modeling

    OpenAIRE

    Liu, Yang; Gunawan, Rudiyanto

    2017-01-01

    The performance of model-based bioprocess optimizations depends on the accuracy of the mathematical model. However, models of bioprocesses often have large uncertainty due to the lack of model identifiability. In the presence of such uncertainty, process optimizations that rely on the predictions of a single “best fit” model, e.g. the model resulting from a maximum likelihood parameter estimation using the available process data, may perform poorly in real life. In this study, we employed ens...

  8. Chemical model reduction under uncertainty

    KAUST Repository

    Malpica Galassi, Riccardo

    2017-03-06

    A general strategy for analysis and reduction of uncertain chemical kinetic models is presented, and its utility is illustrated in the context of ignition of hydrocarbon fuel–air mixtures. The strategy is based on a deterministic analysis and reduction method which employs computational singular perturbation analysis to generate simplified kinetic mechanisms, starting from a detailed reference mechanism. We model uncertain quantities in the reference mechanism, namely the Arrhenius rate parameters, as random variables with prescribed uncertainty factors. We propagate this uncertainty to obtain the probability of inclusion of each reaction in the simplified mechanism. We propose probabilistic error measures to compare predictions from the uncertain reference and simplified models, based on the comparison of the uncertain dynamics of the state variables, where the mixture entropy is chosen as progress variable. We employ the construction for the simplification of an uncertain mechanism in an n-butane–air mixture homogeneous ignition case, where a 176-species, 1111-reactions detailed kinetic model for the oxidation of n-butane is used with uncertainty factors assigned to each Arrhenius rate pre-exponential coefficient. This illustration is employed to highlight the utility of the construction, and the performance of a family of simplified models produced depending on chosen thresholds on importance and marginal probabilities of the reactions.
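    The uncertainty-factor treatment of Arrhenius rate parameters can be sketched by log-uniform Monte Carlo sampling of the pre-exponential; all numerical values below are invented, not taken from the n-butane mechanism:

```python
import random, math

random.seed(5)

# Propagate a factor-of-3 uncertainty in an Arrhenius pre-exponential A
# through k(T) = A * exp(-Ea / (R * T)).  Numbers are illustrative.
R_gas = 8.314          # J/(mol K)
Ea = 1.2e5             # activation energy, J/mol
A_nom = 1.0e10         # nominal pre-exponential, 1/s
UF = 3.0               # uncertainty factor: A in [A_nom/UF, A_nom*UF]
T = 1000.0             # K

samples = []
for _ in range(10000):
    # Log-uniform sampling between A_nom/UF and A_nom*UF.
    A = A_nom * math.exp(random.uniform(-math.log(UF), math.log(UF)))
    samples.append(A * math.exp(-Ea / (R_gas * T)))

k_nom = A_nom * math.exp(-Ea / (R_gas * T))
k_lo, k_hi = min(samples), max(samples)
print(k_lo <= k_nom <= k_hi)
```

    Since k is linear in A, the sampled rate spans nearly a factor UF² around the nominal value; in the cited work such samples feed importance probabilities that decide which reactions survive the reduction.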

  9. Groundwater travel time uncertainty analysis: Sensitivity of results to model geometry, and correlations and cross correlations among input parameters

    International Nuclear Information System (INIS)

    Clifton, P.M.

    1984-12-01

    The deep basalt formations beneath the Hanford Site are being investigated for the Department of Energy (DOE) to assess their suitability as a host medium for a high level nuclear waste repository. Predicted performance of the proposed repository is an important part of the investigation. One of the performance measures being used to gauge the suitability of the host medium is pre-waste-emplacement groundwater travel times to the accessible environment. Many deterministic analyses of groundwater travel times have been completed by Rockwell and other independent organizations. Recently, Rockwell has completed a preliminary stochastic analysis of groundwater travel times. This document presents analyses that show the sensitivity of the results from the previous stochastic travel time study to: (1) scale of representation of model parameters, (2) size of the model domain, (3) correlation range of log-transmissivity, and (4) cross-correlation between transmissivity and effective thickness. 40 refs., 29 figs., 6 tabs

  10. Radiotherapy Dose Fractionation under Parameter Uncertainty

    International Nuclear Information System (INIS)

    Davison, Matt; Kim, Daero; Keller, Harald

    2011-01-01

    In radiotherapy, radiation is directed to damage a tumor while avoiding surrounding healthy tissue. Tradeoffs ensue because the dose cannot be exactly shaped to the tumor. It is particularly important to ensure that sensitive biological structures near the tumor are not damaged more than a certain amount. Biological tissue is known to have a nonlinear response to incident radiation. The linear quadratic dose response model, which requires the specification of two clinically and experimentally observed response coefficients, is commonly used to model this effect. This model yields an optimization problem giving two different types of optimal dose sequences (fractionation schedules). Which fractionation schedule is preferred depends on the response coefficients. These coefficients are uncertainly known and may differ from patient to patient. Because of this, not only the expected outcomes but also the uncertainty around these outcomes are important, and it might not be prudent to select the strategy with the best expected outcome.
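    The linear quadratic model referred to above, E = n(αd + βd²), is commonly compared across fractionation schedules via the biologically effective dose BED = nd(1 + d/(α/β)); the α/β values below are common textbook figures, not patient-specific coefficients:

```python
# Linear-quadratic (LQ) response: the effect of n fractions of dose d
# is E = n * (alpha * d + beta * d^2).  Dividing by alpha gives the
# biologically effective dose BED = n*d*(1 + d/(alpha/beta)), which
# depends only on the ratio alpha/beta.
def bed(n, d, alpha_beta):
    return n * d * (1.0 + d / alpha_beta)

ab_tumor = 10.0   # Gy: typical of many tumors / early-responding tissue
ab_late = 3.0     # Gy: typical of late-responding healthy tissue

schedules = [(30, 2.0),   # conventional: 30 x 2 Gy
             (5, 6.0)]    # hypofractionated: 5 x 6 Gy
for n, d in schedules:
    print(n, d,
          round(bed(n, d, ab_tumor), 1),   # tumor BED
          round(bed(n, d, ab_late), 1))    # late-tissue BED
```

    The two schedules trade tumor effect against late-tissue effect differently, so uncertainty in the alpha/beta coefficients can flip which schedule is preferred, which is the paper's point.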

  11. Uncertainty modelling of atmospheric dispersion by stochastic ...

    Indian Academy of Sciences (India)

    2016-08-26

    Aug 26, 2016 ... Uncertainty; polynomial chaos expansion; fuzzy set theory; cumulative distribution function; uniform distribution; membership function. Abstract. The parameters associated with an environmental dispersion model may include different kinds of variability, imprecision and uncertainty. More often, it is seen that ...

  12. Uncertainty from Extrapolation of Cosmic Ray Air Shower Parameters

    Science.gov (United States)

    Abbasi, Rasha; Thomson, Gordon

    In this work we investigate the uncertainties in the prediction of the average shower maximum, ⟨Xmax⟩, by the currently used high energy cosmic ray shower simulation models. Recent measurements at the LHC have provided constraints on some of the parameters in these models. However, uncertainties in the prediction of ⟨Xmax⟩ remain due to extrapolation from accelerator data up to a center-of-mass energy of 250 TeV. The extrapolation in the elasticity, multiplicity, and p-p cross section from the LHC energy range to 3 × 10¹⁹ eV in a cosmic ray's lab frame is investigated in this proceeding. Our calculated uncertainty in ⟨Xmax⟩ is approximately equal to the difference among the modern models being used in the field.

  13. Declarative representation of uncertainty in mathematical models.

    Science.gov (United States)

    Miller, Andrew K; Britten, Randall D; Nielsen, Poul M F

    2012-01-01

    An important aspect of multi-scale modelling is the ability to represent mathematical models in forms that can be exchanged between modellers and tools. While the development of languages like CellML and SBML have provided standardised declarative exchange formats for mathematical models, independent of the algorithm to be applied to the model, to date these standards have not provided a clear mechanism for describing parameter uncertainty. Parameter uncertainty is an inherent feature of many real systems. This uncertainty can result from a number of situations, such as: when measurements include inherent error; when parameters have unknown values and so are replaced by a probability distribution by the modeller; when a model is of an individual from a population, and parameters have unknown values for the individual, but the distribution for the population is known. We present and demonstrate an approach by which uncertainty can be described declaratively in CellML models, by utilising the extension mechanisms provided in CellML. Parameter uncertainty can be described declaratively in terms of either a univariate continuous probability density function or multiple realisations of one variable or several (typically non-independent) variables. We additionally present an extension to SED-ML (the Simulation Experiment Description Markup Language) to describe sampling sensitivity analysis simulation experiments. We demonstrate the usability of the approach by encoding a sample model in the uncertainty markup language, and by developing a software implementation of the uncertainty specification (including the SED-ML extension for sampling sensitivity analyses) in an existing CellML software library, the CellML API implementation. We used the software implementation to run sampling sensitivity analyses over the model to demonstrate that it is possible to run useful simulations on models with uncertainty encoded in this form.

  14. Declarative representation of uncertainty in mathematical models.

    Directory of Open Access Journals (Sweden)

    Andrew K Miller

    An important aspect of multi-scale modelling is the ability to represent mathematical models in forms that can be exchanged between modellers and tools. While the development of languages like CellML and SBML have provided standardised declarative exchange formats for mathematical models, independent of the algorithm to be applied to the model, to date these standards have not provided a clear mechanism for describing parameter uncertainty. Parameter uncertainty is an inherent feature of many real systems. This uncertainty can result from a number of situations, such as: when measurements include inherent error; when parameters have unknown values and so are replaced by a probability distribution by the modeller; when a model is of an individual from a population, and parameters have unknown values for the individual, but the distribution for the population is known. We present and demonstrate an approach by which uncertainty can be described declaratively in CellML models, by utilising the extension mechanisms provided in CellML. Parameter uncertainty can be described declaratively in terms of either a univariate continuous probability density function or multiple realisations of one variable or several (typically non-independent) variables. We additionally present an extension to SED-ML (the Simulation Experiment Description Markup Language) to describe sampling sensitivity analysis simulation experiments. We demonstrate the usability of the approach by encoding a sample model in the uncertainty markup language, and by developing a software implementation of the uncertainty specification (including the SED-ML extension for sampling sensitivity analyses) in an existing CellML software library, the CellML API implementation. We used the software implementation to run sampling sensitivity analyses over the model to demonstrate that it is possible to run useful simulations on models with uncertainty encoded in this form.

  15. PROCEEDINGS OF THE INTERNATIONAL WORKSHOP ON UNCERTAINTY, SENSITIVITY, AND PARAMETER ESTIMATION FOR MULTIMEDIA ENVIRONMENTAL MODELING. EPA/600/R-04/117, NUREG/CP-0187, ERDC SR-04-2.

    Science.gov (United States)

    An International Workshop on Uncertainty, Sensitivity, and Parameter Estimation for Multimedia Environmental Modeling was held August 19-21, 2003, at the U.S. Nuclear Regulatory Commission Headquarters in Rockville, Maryland, USA. The workshop was organized and convened by the Fe...

  16. Chemical model reduction under uncertainty

    KAUST Repository

    Najm, Habib

    2016-01-05

    We outline a strategy for chemical kinetic model reduction under uncertainty. We present highlights of our existing deterministic model reduction strategy, and describe the extension of the formulation to include parametric uncertainty in the detailed mechanism. We discuss the utility of this construction, as applied to hydrocarbon fuel-air kinetics, and the associated use of uncertainty-aware measures of error between predictions from detailed and simplified models.

  17. Separating the contributions of variability and parameter uncertainty in probability distributions

    International Nuclear Information System (INIS)

    Sankararaman, S.; Mahadevan, S.

    2013-01-01

    This paper proposes a computational methodology to quantify the individual contributions of variability and distribution parameter uncertainty to the overall uncertainty in a random variable. Even if the distribution type is assumed to be known, sparse or imprecise data leads to uncertainty about the distribution parameters. If uncertain distribution parameters are represented using probability distributions, then the random variable can be represented using a family of probability distributions. The family of distributions concept has been used to obtain qualitative, graphical inference of the contributions of natural variability and distribution parameter uncertainty. The proposed methodology provides quantitative estimates of the contributions of the two types of uncertainty. Using variance-based global sensitivity analysis, the contributions of variability and distribution parameter uncertainty to the overall uncertainty are computed. The proposed method is developed at two different levels; first, at the level of a variable whose distribution parameters are uncertain, and second, at the level of a model output whose inputs have uncertain distribution parameters.
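
    The separation the abstract describes rests on the law of total variance, Var(X) = E[Var(X|θ)] + Var(E[X|θ]): the first term is natural variability, the second is distribution parameter uncertainty. A minimal double-loop Monte Carlo sketch, with an assumed normal variable whose mean is itself uncertain (the numbers are illustrative, not from the paper):

```python
import random
import statistics

random.seed(1)

def sample_parameters():
    # Hypothetical: the mean is uncertain, Normal(10, 1); the sd is known = 2
    return random.gauss(10.0, 1.0), 2.0

outer, inner = 2000, 200
cond_means, cond_vars = [], []
for _ in range(outer):                     # outer loop: distribution parameters
    mu, sigma = sample_parameters()
    xs = [random.gauss(mu, sigma) for _ in range(inner)]  # inner loop: X | theta
    cond_means.append(statistics.fmean(xs))
    cond_vars.append(statistics.pvariance(xs))

variability = statistics.fmean(cond_vars)       # E[Var(X | theta)]
param_unc   = statistics.pvariance(cond_means)  # Var(E[X | theta])
total = variability + param_unc
print(f"variability share: {variability / total:.2f}")
print(f"parameter uncertainty share: {param_unc / total:.2f}")
```

    With these assumed values the true decomposition is 4 (variability) versus 1 (parameter uncertainty), so the shares should come out near 0.8 and 0.2.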

  18. Monte Carlo parameter studies and uncertainty analyses with MCNP5

    International Nuclear Information System (INIS)

    Brown, F. B.; Sweezy, J. E.; Hayes, R.

    2004-01-01

    A software tool called mcnp_pstudy has been developed to automate the setup, execution, and collection of results from a series of MCNP5 Monte Carlo calculations. This tool provides a convenient means of performing parameter studies, total uncertainty analyses, parallel job execution on clusters, stochastic geometry modeling, and other types of calculations where a series of MCNP5 jobs must be performed with varying problem input specifications. (authors)

  19. Uncertainty analysis of environmental models

    International Nuclear Information System (INIS)

    Monte, L.

    1990-01-01

    In the present paper an evaluation of the output uncertainty of an environmental model for assessing the transfer of ¹³⁷Cs and ¹³¹I in the human food chain is carried out on the basis of a statistical analysis of data reported in the literature. The uncertainty analysis offers the opportunity of obtaining some remarkable information about the uncertainty of models predicting the migration of non-radioactive substances in the environment, mainly in relation to dry and wet deposition.

  20. Bayesian conjugate analysis using a generalized inverted Wishart distribution accounts for differential uncertainty among the genetic parameters--an application to the maternal animal model.

    Science.gov (United States)

    Munilla, S; Cantet, R J C

    2012-06-01

    Consider the estimation of genetic (co)variance components from a maternal animal model (MAM) using a conjugated Bayesian approach. Usually, more uncertainty is expected a priori on the value of the maternal additive variance than on the value of the direct additive variance. However, it is not possible to model such differential uncertainty when assuming an inverted Wishart (IW) distribution for the genetic covariance matrix. Instead, consider the use of a generalized inverted Wishart (GIW) distribution. The GIW is essentially an extension of the IW distribution with a larger set of distinct parameters. In this study, the GIW distribution in its full generality is introduced and theoretical results regarding its use as the prior distribution for the genetic covariance matrix of the MAM are derived. In particular, we prove that the conditional conjugacy property holds so that parameter estimation can be accomplished via the Gibbs sampler. A sampling algorithm is also sketched. Furthermore, we describe how to specify the hyperparameters to account for differential prior opinion on the (co)variance components. A recursive strategy to elicit these parameters is then presented and tested using field records and simulated data. The procedure returned accurate estimates and reduced standard errors when compared with non-informative prior settings while improving the convergence rates. In general, faster convergence was always observed when a stronger weight was placed on the prior distributions. However, analyses based on the IW distribution have also produced biased estimates when the prior means were set to over-dispersed values. © 2011 Blackwell Verlag GmbH.

  1. Parametric uncertainty in optical image modeling

    Science.gov (United States)

    Potzick, James; Marx, Egon; Davidson, Mark

    2006-10-01

    Optical photomask feature metrology and wafer exposure process simulation both rely on optical image modeling for accurate results. While it is fair to question the accuracies of the available models, model results also depend on several input parameters describing the object and imaging system. Errors in these parameter values can lead to significant errors in the modeled image. These parameters include wavelength, illumination and objective NA's, magnification, focus, etc. for the optical system, and topography, complex index of refraction n and k, etc. for the object. In this paper each input parameter is varied over a range about its nominal value and the corresponding images simulated. Second order parameter interactions are not explored. Using the scenario of the optical measurement of photomask features, these parametric sensitivities are quantified by calculating the apparent change of the measured linewidth for a small change in the relevant parameter. Then, using reasonable values for the estimated uncertainties of these parameters, the parametric linewidth uncertainties can be calculated and combined to give a lower limit to the linewidth measurement uncertainty for those parameter uncertainties.
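
    The one-parameter-at-a-time procedure described above can be sketched as central finite differences combined in quadrature (first order, ignoring the second order interactions the paper also excludes). The toy linewidth model and the parameter uncertainties below are purely illustrative assumptions:

```python
import math

def linewidth(params):
    """Stand-in for an optical image-model linewidth calculation (hypothetical)."""
    wavelength, na, focus = params["wavelength"], params["NA"], params["focus"]
    # Toy behaviour: width scales with wavelength/NA and grows with defocus.
    return 100.0 + 50.0 * (wavelength / 193.0) / na + 0.5 * focus ** 2

nominal = {"wavelength": 193.0, "NA": 0.9, "focus": 0.0}   # nm, -, um
uncert  = {"wavelength": 0.1, "NA": 0.005, "focus": 0.05}  # assumed 1-sigma

# Central-difference sensitivity of linewidth to each parameter,
# then combine the per-parameter contributions in quadrature.
contribs = {}
for name, u in uncert.items():
    p = dict(nominal); p[name] += u
    m = dict(nominal); m[name] -= u
    sens = (linewidth(p) - linewidth(m)) / (2 * u)
    contribs[name] = sens * u

combined = math.sqrt(sum(c * c for c in contribs.values()))
print({k: round(v, 4) for k, v in contribs.items()}, round(combined, 4))
```

    The combined value is a lower limit on the measurement uncertainty in the same sense as in the abstract: it reflects only the modeled parameter uncertainties, not model error itself.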

  2. Empirical Bayesian inference and model uncertainty

    International Nuclear Information System (INIS)

    Poern, K.

    1994-01-01

    This paper presents a hierarchical or multistage empirical Bayesian approach for the estimation of uncertainty concerning the intensity of a homogeneous Poisson process. A class of contaminated gamma distributions is considered to describe the uncertainty concerning the intensity. These distributions in turn are defined through a set of secondary parameters, the knowledge of which is also described and updated via Bayes formula. This two-stage Bayesian approach is an example where the modeling uncertainty is treated in a comprehensive way. Each contaminated gamma distribution, represented by a point in the 3D space of secondary parameters, can be considered as a specific model of the uncertainty about the Poisson intensity. Then, by the empirical Bayesian method, each individual model is assigned a posterior probability.

  3. Methodology for the treatment of model uncertainty

    Science.gov (United States)

    Droguett, Enrique Lopez

    The development of a conceptual, unified framework and methodology for treating model and parameter uncertainties is the subject of this work. Firstly, a discussion on the philosophical grounds of notions such as reality, modeling, models, and their relation is presented. On this basis, a characterization of the modeling process is presented. The concept of uncertainty, addressing controversial topics such as the type and sources of uncertainty, is investigated, arguing that uncertainty is fundamentally a characterization of lack of knowledge and as such all uncertainties are of the same type. A discussion about the roles of model structure and model parameters is presented, in which it is argued that the distinction is one of convenience and a function of the stage in the modeling process. From the foregoing discussion, a Bayesian framework for an integrated assessment of model and parameter uncertainties is developed. The methodology has as its central point the treatment of the model as a source of information regarding the unknown of interest. It allows for the assessment of the model characteristics affecting its performance, such as bias and precision. It also permits the assessment of possible dependencies among multiple models. Furthermore, the proposed framework makes possible the use of not only information from models (e.g., point estimates, qualitative assessments), but also evidence about the models themselves (performance data, confidence in the model, applicability of the model). The methodology is then applied in the context of fire risk models where several examples with real data are studied. These examples demonstrate how the framework and specific techniques developed in this study can address cases involving multiple models, use of performance data to update the predictive capabilities of a model, and the case where a model is applied in a context other than the one for which it was designed.

  4. Parameter uncertainty, refreezing and surface energy balance modelling at Austfonna ice cap, Svalbard, over 2004–2008

    NARCIS (Netherlands)

    Oestby, T.I.; Schuler, T.V.; Hagen, J.O.; Hock, Regine; Reijmer, C.H.

    2013-01-01

    We apply a physically based coupled surface energy balance and snowpack model to a site close to the equilibrium line on Austfonna ice cap, Svalbard, over the 2004–08 melt seasons, to explain contributions to the energy available for melting and to quantify the significance of refreezing. The model

  5. Numerical modeling of economic uncertainty

    DEFF Research Database (Denmark)

    Schjær-Jacobsen, Hans

    2007-01-01

    Representation and modeling of economic uncertainty is addressed by different modeling methods, namely stochastic variables and probabilities, interval analysis, and fuzzy numbers, in particular triple estimates. Focusing on discounted cash flow analysis numerical results are presented, comparisons...

  6. Approaches in highly parameterized inversion—PEST++ Version 3, a Parameter ESTimation and uncertainty analysis software suite optimized for large environmental models

    Science.gov (United States)

    Welter, David E.; White, Jeremy T.; Hunt, Randall J.; Doherty, John E.

    2015-09-18

    The PEST++ Version 1 object-oriented parameter estimation code is here extended to Version 3 to incorporate additional algorithms and tools to further improve support for large and complex environmental modeling problems. PEST++ Version 3 includes the Gauss-Marquardt-Levenberg (GML) algorithm for nonlinear parameter estimation, Tikhonov regularization, integrated linear-based uncertainty quantification, options of integrated TCP/IP based parallel run management or external independent run management by use of a Version 2 update of the GENIE Version 1 software code, and utilities for global sensitivity analyses. The Version 3 code design is consistent with PEST++ Version 1 and continues to be designed to lower the barriers of entry for users as well as developers while providing efficient and optimized algorithms capable of accommodating large, highly parameterized inverse problems. As such, this effort continues the original focus of (1) implementing the most popular and powerful features of the PEST software suite in a fashion that is easy for novice or experienced modelers to use and (2) developing a software framework that is easy to extend.

  7. Representing and managing uncertainty in qualitative ecological models

    NARCIS (Netherlands)

    Nuttle, T.; Bredeweg, B.; Salles, P.; Neumann, M.

    2009-01-01

    Ecologists and decision makers need ways to understand systems, test ideas, and make predictions and explanations about systems. However, uncertainty about causes and effects of processes and parameter values is pervasive in models of ecological systems. Uncertainty associated with incomplete

  8. An evaluation of uncertainties in radioecological models

    International Nuclear Information System (INIS)

    Hoffmann, F.O.; Little, C.A.; Miller, C.W.; Dunning, D.E. Jr.; Rupp, E.M.; Shor, R.W.; Schaeffer, D.L.; Baes, C.F. III

    1978-01-01

    The paper presents results of analyses for seven selected parameters commonly used in environmental radiological assessment models, assuming that the available data are representative of the true distribution of parameter values and that their respective distributions are lognormal. Estimates of the most probable, median, mean, and 99th percentile for each parameter are given and compared to U.S. NRC default values. The regulatory default values are generally greater than the median values for the selected parameters, but some are associated with percentiles significantly less than the 50th. The largest uncertainties appear to be associated with aquatic bioaccumulation factors for fresh water fish. Approximately one order of magnitude separates median values and values of the 99th percentile. The uncertainty is also estimated for the annual dose rate predicted by a multiplicative chain model for the transport of molecular iodine-131 via the air-pasture-cow-milk-child's thyroid pathway. The value for the 99th percentile is ten times larger than the median value of the predicted dose normalized for a given air concentration of ¹³¹I₂. About 72% of the uncertainty in this model is contributed by the dose conversion factor and the milk transfer coefficient. Considering the difficulties in obtaining a reliable quantification of the true uncertainties in model predictions, methods for taking these uncertainties into account when determining compliance with regulatory statutes are discussed. (orig./HP) [de]
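
    For a multiplicative chain of independent lognormal factors, the log-variances add, so the 99th-to-50th percentile ratio is exp(2.326 σ_total). A sketch with hypothetical factor values (the geometric means and geometric standard deviations below are made up for illustration, not the paper's data):

```python
import math
import random

random.seed(3)

# Hypothetical multiplicative pathway: dose per unit air concentration is a
# product of transfer factors, each lognormal with (geometric mean, geometric sd).
factors = {"interception": (1.0, 1.5), "milk_transfer": (0.01, 2.0),
           "intake": (0.7, 1.3), "dose_factor": (1.6e-6, 2.2)}

n = 100_000
doses = []
for _ in range(n):
    x = 1.0
    for gm, gsd in factors.values():
        x *= random.lognormvariate(math.log(gm), math.log(gsd))
    doses.append(x)
doses.sort()

median, p99 = doses[n // 2], doses[int(0.99 * n)]
# Analytic check: log-variances of independent lognormal factors add.
s = math.sqrt(sum(math.log(gsd) ** 2 for _, gsd in factors.values()))
print(f"p99/median ratio: MC {p99 / median:.1f}, analytic {math.exp(2.326 * s):.1f}")
```

    The Monte Carlo ratio and the analytic exp(2.326 σ_total) should agree closely, which is the mechanism behind the order-of-magnitude spread reported in the abstract.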

  9. Uncertainty Assessment in Urban Storm Water Drainage Modelling

    DEFF Research Database (Denmark)

    Thorndahl, Søren

    The object of this paper is to make an overall description of the author's PhD study, concerning uncertainties in numerical urban storm water drainage models. Initially an uncertainty localization and assessment of model inputs and parameters as well as uncertainties caused by different model...

  11. Uncertainty in hydrological change modelling

    DEFF Research Database (Denmark)

    Seaby, Lauren Paige

    Hydrological change modelling methodologies generally use climate models outputs to force hydrological simulations under changed conditions. There are nested sources of uncertainty throughout this methodology, including choice of climate model and subsequent bias correction methods. This Ph.......D. study evaluates the uncertainty of the impact of climate change in hydrological simulations given multiple climate models and bias correction methods of varying complexity. Three distribution based scaling methods (DBS) were developed and benchmarked against a more simplistic and commonly used delta...... change (DC) approach. These climate model projections were then used to force hydrological simulations under climate change for the island Sjælland in Denmark to analyse the contribution of different climate models and bias correction methods to overall uncertainty in the hydrological change modelling...

  12. Bioprocess optimization under uncertainty using ensemble modeling.

    Science.gov (United States)

    Liu, Yang; Gunawan, Rudiyanto

    2017-02-20

    The performance of model-based bioprocess optimizations depends on the accuracy of the mathematical model. However, models of bioprocesses often have large uncertainty due to the lack of model identifiability. In the presence of such uncertainty, process optimizations that rely on the predictions of a single "best fit" model, e.g. the model resulting from a maximum likelihood parameter estimation using the available process data, may perform poorly in real life. In this study, we employed ensemble modeling to account for model uncertainty in bioprocess optimization. More specifically, we adopted a Bayesian approach to define the posterior distribution of the model parameters, based on which we generated an ensemble of model parameters using a uniformly distributed sampling of the parameter confidence region. The ensemble-based process optimization involved maximizing the lower confidence bound of the desired bioprocess objective (e.g. yield or product titer), using a mean-standard deviation utility function. We demonstrated the performance and robustness of the proposed strategy in an application to a monoclonal antibody batch production by mammalian hybridoma cell culture. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.
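
    The mean-standard deviation utility described above can be sketched as follows. The toy titer model, parameter distributions, and candidate grid are assumptions for illustration, not the paper's hybridoma model; the ensemble stands in for samples from the posterior parameter confidence region:

```python
import random
import statistics

random.seed(2)

def titer(feed_rate, mu_max, k):
    """Hypothetical process model: product titer as a function of feed rate."""
    return mu_max * feed_rate - k * feed_rate ** 2

# Ensemble of parameter sets, standing in for a uniform sampling of the
# Bayesian parameter confidence region.
ensemble = [(random.gauss(2.0, 0.3), random.gauss(0.5, 0.1)) for _ in range(500)]

def lower_bound(feed_rate, kappa=1.0):
    """Mean minus kappa standard deviations of the ensemble predictions."""
    preds = [titer(feed_rate, mu, k) for mu, k in ensemble]
    return statistics.fmean(preds) - kappa * statistics.pstdev(preds)

candidates = [0.5 + 0.1 * i for i in range(31)]   # feed rates 0.5 .. 3.5
best = max(candidates, key=lower_bound)
print(f"robust feed rate: {best:.1f}, lower-bound titer: {lower_bound(best):.2f}")
```

    Maximizing the lower confidence bound rather than the ensemble mean trades a little expected titer for robustness against parameter uncertainty, which is the strategy's point.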

  13. Uncertainty in reactive transport geochemical modelling

    International Nuclear Information System (INIS)

    Oedegaard-Jensen, A.; Ekberg, C.

    2005-01-01

    Full text of publication follows: Geochemical modelling is one way of predicting the transport of, e.g., radionuclides in a rock formation. In a rock formation there will be fractures in which water and dissolved species can be transported. The composition of the water and the rock can either increase or decrease the mobility of the transported entities. When doing simulations of the mobility or transport of different species, one has to know the exact water composition, the exact flow rates in the fracture and in the surrounding rock, the porosity, and which minerals the rock is composed of. The problem with simulations of rocks is that the rock itself is not uniform, i.e. there are larger fractures in some areas and smaller ones in others, which can give different water flows. The rock composition can also differ from one area to another. In addition to this variance in the rock, there are problems with measuring the physical parameters used in a simulation. All measurements will perturb the rock, and this perturbation will result in more or less correct values of the parameters of interest. The analytical methods used are also encumbered with uncertainties, which in this case are added to the uncertainty from the perturbation of the analysed parameters. When doing simulations the effect of these uncertainties must be taken into account. As computers get faster, the complexity of simulated systems increases, which also increases the uncertainty in the results from the simulations. In this paper we show how uncertainty in the different parameters affects the solubility and mobility of different species. Small uncertainties in the input parameters can result in large uncertainties in the end. (authors)

  14. Model uncertainty in growth empirics

    NARCIS (Netherlands)

    Prüfer, P.

    2008-01-01

    This thesis applies so-called Bayesian model averaging (BMA) to three different economic questions substantially exposed to model uncertainty. Chapter 2 addresses a major issue of modern development economics: the analysis of the determinants of pro-poor growth (PPG), which seeks to combine high

  15. Analyzing the effects of geological and parameter uncertainty on prediction of groundwater head and travel time

    DEFF Research Database (Denmark)

    He, Xin; Sonnenborg, Torben Obel; Jørgensen, F.

    2013-01-01

    Uncertainty of groundwater model predictions has in the past mostly been related to uncertainty in the hydraulic parameters, whereas uncertainty in the geological structure has not been considered to the same extent. Recent developments in theoretical methods for quantifying geological uncertainty have made it possible to consider this factor in groundwater modeling. In this study we have applied the multiple-point geostatistical method (MPS), integrated in the Stanford Geostatistical Modeling Software (SGeMS), for exploring the impact of geological uncertainty on groundwater flow patterns for a site in Denmark. Realizations from the geostatistical model were used as input to a groundwater model developed from the Modular three-dimensional finite-difference ground-water model (MODFLOW) within the Groundwater Modeling System (GMS) modeling environment. The uncertainty analysis was carried out...

  16. Parameter uncertainty analysis for simulating streamflow in a river catchment of Vietnam

    Directory of Open Access Journals (Sweden)

    Dao Nguyen Khoi

    2015-07-01

    Hydrological models play vital roles in the management of water resources. However, calibration of hydrological models is a large challenge because of the uncertainty involved in their large number of parameters. In this study, four uncertainty analysis methods, Generalized Likelihood Uncertainty Estimation (GLUE), Parameter Solution (ParaSol), Particle Swarm Optimization (PSO), and Sequential Uncertainty Fitting (SUFI-2), were employed to perform parameter uncertainty analysis of streamflow simulation in the Srepok River Catchment using the Soil and Water Assessment Tool (SWAT) model. The four methods were compared in terms of model prediction uncertainty, model performance, and computational efficiency. The results showed that the SUFI-2 method has advantages in model calibration and uncertainty analysis: it could be run with the smallest number of simulation runs and still achieve good prediction uncertainty bands and model performance.
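
    Of the four methods compared, GLUE is the simplest to sketch: sample parameters from their priors, retain the "behavioural" sets whose likelihood measure exceeds a threshold, and form prediction bands from the retained simulations. The toy rainfall-runoff model, synthetic observations, and threshold below are illustrative assumptions, not the paper's SWAT setup:

```python
import random
import statistics

random.seed(4)

def model(rain, a, b):
    """Toy rainfall-runoff model (hypothetical stand-in for SWAT)."""
    return [a * r ** b for r in rain]

rain = [1.0, 3.0, 5.0, 2.0, 4.0]
observed = model(rain, 0.8, 1.2)   # synthetic "truth"

def nse(sim, obs):
    """Nash-Sutcliffe efficiency, a common GLUE likelihood measure."""
    mean_obs = statistics.fmean(obs)
    num = sum((s - o) ** 2 for s, o in zip(sim, obs))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - num / den

behavioural = []
for _ in range(5000):
    a, b = random.uniform(0.2, 1.5), random.uniform(0.8, 1.6)
    sim = model(rain, a, b)
    if nse(sim, observed) > 0.7:        # behavioural threshold
        behavioural.append(sim)

# 5-95% prediction band at each time step
for t in range(len(rain)):
    vals = sorted(s[t] for s in behavioural)
    lo, hi = vals[int(0.05 * len(vals))], vals[int(0.95 * len(vals))]
    print(f"t={t}: obs={observed[t]:.2f}, band=({lo:.2f}, {hi:.2f})")
```

    The width of the bands relative to the number of retained runs is exactly the trade-off the abstract evaluates when comparing GLUE against ParaSol, PSO, and SUFI-2.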

  17. Uncertainty in biology a computational modeling approach

    CERN Document Server

    Gomez-Cabrero, David

    2016-01-01

    Computational modeling of biomedical processes is gaining more and more weight in current research into the etiology of biomedical problems and potential treatment strategies. Computational modeling makes it possible to reduce, refine and replace animal experimentation as well as to translate findings obtained in these experiments to the human background. However, these biomedical problems are inherently complex, with a myriad of influencing factors, which strongly complicates the model building and validation process. This book addresses four main issues related to the building and validation of computational models of biomedical processes: modeling establishment under uncertainty; model selection and parameter fitting; sensitivity analysis and model adaptation; and model predictions under uncertainty. In each of the abovementioned areas, the book discusses a number of key techniques by means of a general theoretical description followed by one or more practical examples. This book is intended for graduate stude...

  18. Variability and Uncertainties of Key Hydrochemical Parameters for SKB Sites

    Energy Technology Data Exchange (ETDEWEB)

    Bath, Adrian [Intellisci Ltd, Willoughby on the Wolds, Loughborough (United Kingdom); Hermansson, Hans-Peter [Studsvik Nuclear AB, Nykoeping (Sweden)

    2006-12-15

    The work described in this report is a development of SKI's capability for the review and evaluation of data that will constitute part of SKB's case for selection of a suitable site and application to construct a geological repository for spent nuclear fuel. The aim has been to integrate a number of different approaches to interpreting and evaluating hydrochemical data, especially with respect to the parameters that matter most in assessing the suitability of a site and in understanding the geochemistry and groundwater conditions at a site. It has been focused on taking an independent view of overall uncertainties in reported data, taking account of analytical, sampling and other random and systematic sources of error. This evaluation was carried out initially with a compilation and general inspection of data from the Simpevarp, Forsmark and Laxemar sites plus data from older 'historical' boreholes in the Aespoe area. That was followed by a more specific interpretation by means of geochemical calculations which test the robustness of certain parameters, namely pH and redox/Eh. Geochemical model calculations have been carried out with widely available computer software. Data sources and their handling were also considered, especially access to SKB's SICADA database. In preparation for the use of geochemical modelling programs and to establish comparability of model results with those reported by SKB, the underlying thermodynamic databases were compared with each other and with other generally accepted databases. Comparisons of log K data for selected solid phases and solution complexes from the different thermodynamic databases were made. In general, there is a large degree of comparability between the databases, but there are some significant, and in a few cases large, differences. The present situation is however adequate for present purposes. The interpretation of redox equilibria is dependent on identifying the relevant solid phases and

  19. Uncertainty modeling and decision support

    International Nuclear Information System (INIS)

    Yager, Ronald R.

    2004-01-01

    We first formulate the problem of decision making under uncertainty. The importance of the representation of our knowledge about the uncertainty in formulating a decision process is pointed out. We begin with a brief discussion of the case of probabilistic uncertainty. Next, in considerable detail, we discuss the case of decision making under ignorance. For this case the fundamental role of the attitude of the decision maker is noted and its subjective nature is emphasized. Next the case in which a Dempster-Shafer belief structure is used to model our knowledge of the uncertainty is considered. Here we also emphasize the subjective choices the decision maker must make in formulating a decision function. The case in which the uncertainty is represented by a fuzzy measure (monotonic set function) is then investigated. We then return to the Dempster-Shafer belief structure and show its relationship to the fuzzy measure. This relationship allows us to get a deeper understanding of the formulation of the decision function used in the Dempster-Shafer framework. We discuss how this deeper understanding allows a decision analyst to better make the subjective choices needed in the formulation of the decision function.
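    The role of the decision maker's attitude under ignorance can be illustrated with the classical Hurwicz criterion, which blends the optimistic and pessimistic rules through a subjective index alpha. The payoff values below are invented for illustration.

```python
import numpy as np

# Payoff matrix: rows are alternatives, columns are states of nature;
# under ignorance no probabilities are attached to the columns.
payoffs = np.array([
    [10.0, 2.0],   # alternative A: high best case, poor worst case
    [6.0,  5.0],   # alternative B: moderate either way
])

def hurwicz(payoffs, alpha):
    """Hurwicz criterion: alpha weights the best outcome of each
    alternative, (1 - alpha) the worst. alpha = 1 gives the optimistic
    (maximax) rule, alpha = 0 the pessimistic (maximin) rule."""
    scores = alpha * payoffs.max(axis=1) + (1 - alpha) * payoffs.min(axis=1)
    return scores, int(np.argmax(scores))

# A cautious decision maker (alpha = 0.3) prefers the safer alternative B
scores, best = hurwicz(payoffs, alpha=0.3)
```

    An optimistic decision maker (alpha close to 1) would instead pick alternative A, which makes the subjective nature of the choice explicit.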

  20. Uncertainty Quantification in Climate Modeling and Projection

    Energy Technology Data Exchange (ETDEWEB)

    Qian, Yun; Jackson, Charles; Giorgi, Filippo; Booth, Ben; Duan, Qingyun; Forest, Chris; Higdon, Dave; Hou, Z. Jason; Huerta, Gabriel

    2016-05-01

    assessing reliability and uncertainties of climate change information. An alternative approach is to generate similar ensembles by perturbing parameters within a single-model framework. One of the workshop’s objectives was to give participants a deeper understanding of these approaches within a Bayesian statistical framework. However, there remain significant challenges to be resolved before UQ can be applied in a convincing way to climate models and their projections.

  1. Model Uncertainty for Bilinear Hysteretic Systems

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Thoft-Christensen, Palle

    In structural reliability analysis at least three types of uncertainty must be considered, namely physical uncertainty, statistical uncertainty, and model uncertainty (see e.g. Thoft-Christensen & Baker [1]). The physical uncertainty is usually modelled by a number of basic variables by predictive density functions, Veneziano [2]. In general, model uncertainty is the uncertainty connected with mathematical modelling of the physical reality. When structural reliability analysis is related to the concept of a failure surface (or limit state surface) in the n-dimension basic variable space then model...

  2. Assessing Groundwater Model Uncertainty for the Central Nevada Test Area

    International Nuclear Information System (INIS)

    Pohll, Greg; Pohlmann, Karl; Hassan, Ahmed; Chapman, Jenny; Mihevc, Todd

    2002-01-01

    The purpose of this study is to quantify the flow and transport model uncertainty for the Central Nevada Test Area (CNTA). Six parameters were identified as uncertain, including the specified head boundary conditions used in the flow model, the spatial distribution of the underlying welded tuff unit, effective porosity, sorption coefficients, matrix diffusion coefficient, and the geochemical release function which describes nuclear glass dissolution. The parameter uncertainty was described by assigning prior statistical distributions for each of these parameters. Standard Monte Carlo techniques were used to sample from the parameter distributions to determine the full prediction uncertainty. Additional analysis was performed to determine the most cost-beneficial characterization activities. The maximum radius of the tritium and strontium-90 contaminant boundary was used as the output metric for evaluation of prediction uncertainty. The results indicate that combining all of the uncertainty in the parameters listed above propagates to a prediction uncertainty in the maximum radius of the contaminant boundary of 234 to 308 m and 234 to 302 m, for tritium and strontium-90, respectively. Although the uncertainty in the input parameters is large, the prediction uncertainty in the contaminant boundary is relatively small. The relatively small prediction uncertainty is primarily due to the small transport velocities, such that large changes in the uncertain input parameters cause small changes in the contaminant boundary. This suggests that the model is suitable in terms of predictive capability for the contaminant boundary delineation.
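    The Monte Carlo propagation described above can be sketched as follows; the priors, parameter values, and toy transport model are illustrative stand-ins, not the actual CNTA data or code.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# Illustrative priors (not the CNTA values): effective porosity and a
# sorption coefficient are treated as uncertain inputs.
porosity = rng.uniform(0.05, 0.30, n)        # effective porosity [-]
kd = rng.lognormal(-2.0, 0.5, n)             # sorption coefficient [mL/g]

rho_b, q, t = 1.6, 0.01, 1000.0   # bulk density [g/mL], Darcy flux [m/yr], time [yr]

# Sorption retards transport; the contaminant boundary radius is taken
# here as the advective travel distance divided by the retardation factor.
retardation = 1.0 + rho_b * kd / porosity
radius = (q / porosity) * t / retardation

# Prediction uncertainty: spread of the output distribution
lo, hi = np.percentile(radius, [5, 95])
```

    Each realization is one possible combination of input values; the percentile interval of the output is the kind of prediction-uncertainty range the abstract reports.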

  3. Quantifying Registration Uncertainty With Sparse Bayesian Modelling.

    Science.gov (United States)

    Le Folgoc, Loic; Delingette, Herve; Criminisi, Antonio; Ayache, Nicholas

    2017-02-01

    We investigate uncertainty quantification under a sparse Bayesian model of medical image registration. Bayesian modelling has proven powerful to automate the tuning of registration hyperparameters, such as the trade-off between the data and regularization functionals. Sparsity-inducing priors have recently been used to render the parametrization itself adaptive and data-driven. The sparse prior on transformation parameters effectively favors the use of coarse basis functions to capture the global trends in the visible motion while finer, highly localized bases are introduced only in the presence of coherent image information and motion. In earlier work, approximate inference under the sparse Bayesian model was tackled in an efficient Variational Bayes (VB) framework. In this paper we are interested in the theoretical and empirical quality of uncertainty estimates derived under this approximate scheme vs. under the exact model. We implement an (asymptotically) exact inference scheme based on reversible jump Markov Chain Monte Carlo (MCMC) sampling to characterize the posterior distribution of the transformation and compare the predictions of the VB and MCMC based methods. The true posterior distribution under the sparse Bayesian model is found to be meaningful: orders of magnitude for the estimated uncertainty are quantitatively reasonable, the uncertainty is higher in textureless regions and lower in the direction of strong intensity gradients.

  4. The role of parameter uncertainty in seismic risk assessment

    International Nuclear Information System (INIS)

    Ellingwood, B.

    1989-01-01

    Research is underway to examine the validity and limitations of seismic PRA methods through an investigation of how various uncertainties affect risk estimates, inferences and regulatory decisions. Indications are that the uncertainty in the basic seismic hazard at the plant site appears to be the single most important source of uncertainty in core damage probability. However, when the fragility modeling and plant logic are uncoupled from the seismic hazard analysis in a margin study, fragility modeling assumptions may become important. 12 refs., 3 figs., 5 tabs

  5. Measurement uncertainties physical parameters and calibration of instruments

    CERN Document Server

    Gupta, S V

    2012-01-01

    This book fulfills the global need to evaluate measurement results along with the associated uncertainty. In the book, together with the details of uncertainty calculations for many physical parameters, probability distributions and their properties are discussed. Definitions of various terms are given and will help practicing metrologists to grasp the subject. The book helps to establish international standards for the evaluation of the quality of raw data obtained from various laboratories and for interpreting the results of various national metrology institutes in international inter-comparisons. For the routine calibration of instruments, a new idea for the use of pooled variance is introduced. The uncertainty calculations are explained for (i) independent linear inputs, (ii) non-linear inputs and (iii) correlated inputs. The merits and limitations of the Guide to the Expression of Uncertainty in Measurement (GUM) are discussed. Monte Carlo methods for the derivation of the output distribution from the...

  6. Quantifying Parameter and Structural Uncertainty of Dynamic Disease Transmission Models Using MCMC: An Application to Rotavirus Vaccination in England and Wales.

    Science.gov (United States)

    Bilcke, Joke; Chapman, Ruth; Atchison, Christina; Cromer, Deborah; Johnson, Helen; Willem, Lander; Cox, Martin; Edmunds, William John; Jit, Mark

    2015-07-01

    Two vaccines (Rotarix and RotaTeq) are highly effective at preventing severe rotavirus disease. Rotavirus vaccination has been introduced in the United Kingdom and other countries partly based on modeling and cost-effectiveness results. However, most of these models fail to account for the uncertainty about several vaccine characteristics and the mechanism of vaccine action. A deterministic dynamic transmission model of rotavirus vaccination in the United Kingdom was developed. This improves on previous models by 1) allowing for 2 different mechanisms of action for Rotarix and RotaTeq, 2) using clinical trial data to understand these mechanisms, and 3) accounting for uncertainty by using Markov Chain Monte Carlo. In the long run, Rotarix and RotaTeq are predicted to reduce the overall rotavirus incidence by 50% (39%-63%) and 44% (30%-62%), respectively, but with an increase in incidence in primary school children and adults up to 25 y of age. The vaccines are estimated to give more protection than 1 or 2 natural infections. The duration of protection is highly uncertain but affects the predicted reduction in rotavirus burden only for values lower than 10 y. The 2 vaccine mechanism structures fit the clinical trial data equally well. Long-term postvaccination dynamics cannot be predicted reliably with the data available. Accounting for the joint uncertainty of several vaccine characteristics resulted in more insight into which of these are crucial for determining the impact of rotavirus vaccination. Data for up to at least 10 y postvaccination and covering older children and adults are crucial to address remaining questions on the impact of widespread rotavirus vaccination. © The Author(s) 2015.

  7. Evidential Model Validation under Epistemic Uncertainty

    Directory of Open Access Journals (Sweden)

    Wei Deng

    2018-01-01

    This paper proposes evidence theory based methods to both quantify epistemic uncertainty and validate a computational model. Three types of epistemic uncertainty concerning input model data, that is, sparse points, intervals, and probability distributions with uncertain parameters, are considered. Through the proposed methods, the given data are described as corresponding probability distributions for uncertainty propagation in the computational model, and thus for model validation. The proposed evidential model validation method is inspired by the idea of Bayesian hypothesis testing and the Bayes factor, which compares the model predictions with the observed experimental data so as to assess the predictive capability of the model and support the decision making of model acceptance. Following the idea of the Bayes factor, the frame of discernment of Dempster-Shafer evidence theory is constituted and the basic probability assignment (BPA) is determined. Because the proposed validation method is evidence based, the robustness of the result can be guaranteed, and the most evidence-supported hypothesis about the model testing will be favored by the BPA. The validity of the proposed methods is illustrated through a numerical example.

  8. Aspects of uncertainty analysis in accident consequence modeling

    International Nuclear Information System (INIS)

    Travis, C.C.; Hoffman, F.O.

    1981-01-01

    Mathematical models are frequently used to determine probable dose to man from an accidental release of radionuclides by a nuclear facility. With increased emphasis on the accuracy of these models, the incorporation of uncertainty analysis has become one of the most crucial and sensitive components in evaluating the significance of model predictions. In the present paper, we address three aspects of uncertainty in models used to assess the radiological impact to humans: uncertainties resulting from the natural variability in human biological parameters; the propagation of parameter variability by mathematical models; and comparison of model predictions to observational data

  9. Lumped-parameter models

    Energy Technology Data Exchange (ETDEWEB)

    Ibsen, Lars Bo; Liingaard, M.

    2006-12-15

    A lumped-parameter model represents the frequency-dependent soil-structure interaction of a massless foundation placed on or embedded in an unbounded soil domain. In this technical report the steps of establishing a lumped-parameter model are presented. The following sections are included in this report: Static and dynamic formulation, Simple lumped-parameter models and Advanced lumped-parameter models. (au)
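    The frequency dependence such models capture can be illustrated with the complex dynamic stiffness of a single spring-dashpot-mass unit; actual foundation models chain several such units, but the principle is the same, and the coefficient values below are arbitrary.

```python
import numpy as np

def dynamic_stiffness(omega, k, c, m):
    """Complex dynamic stiffness of a single spring-dashpot-mass
    lumped-parameter unit: S(w) = k - m*w**2 + i*c*w.

    The real part is the frequency-dependent stiffness; the imaginary
    part represents damping (energy radiated into the soil)."""
    return k - m * omega**2 + 1j * c * omega

# Evaluate over a range of circular frequencies (illustrative values)
omega = np.linspace(0.0, 50.0, 6)
S = dynamic_stiffness(omega, k=1.0e6, c=2.0e3, m=500.0)
```

    At zero frequency the unit reduces to the static spring stiffness k; at high frequency the inertia term dominates and the real part turns negative.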

  10. Applied research in uncertainty modeling and analysis

    CERN Document Server

    Ayyub, Bilal

    2005-01-01

    Uncertainty has been a concern to engineers, managers, and scientists for many years. For a long time uncertainty has been considered synonymous with random, stochastic, statistic, or probabilistic. Since the early sixties views on uncertainty have become more heterogeneous. In the past forty years numerous tools that model uncertainty, above and beyond statistics, have been proposed by several engineers and scientists. The tool/method to model uncertainty in a specific context should really be chosen by considering the features of the phenomenon under consideration, not independent of what is known about the system and what causes uncertainty. In this fascinating overview of the field, the authors provide broad coverage of uncertainty analysis/modeling and its application. Applied Research in Uncertainty Modeling and Analysis presents the perspectives of various researchers and practitioners on uncertainty analysis and modeling outside their own fields and domain expertise. Rather than focusing explicitly on...

  11. Appropriate spatial scales to achieve model output uncertainty goals

    NARCIS (Netherlands)

    Booij, Martijn J.; Melching, Charles S.; Chen, Xiaohong; Chen, Yongqin; Xia, Jun; Zhang, Hailun

    2008-01-01

    Appropriate spatial scales of hydrological variables were determined using an existing methodology based on a balance in uncertainties from model inputs and parameters extended with a criterion based on a maximum model output uncertainty. The original methodology uses different relationships between

  12. Some illustrative examples of model uncertainty

    International Nuclear Information System (INIS)

    Bier, V.M.

    1994-01-01

    In this paper, we first discuss the view of model uncertainty proposed by Apostolakis. We then present several illustrative examples related to model uncertainty, some of which are not well handled by this formalism. Thus, Apostolakis' approach seems to be well suited to describing some types of model uncertainty, but not all. Since a comprehensive approach for characterizing and quantifying model uncertainty is not yet available, it is hoped that the examples presented here will serve as a springboard for further discussion

  13. Impulsive control of permanent magnet synchronous motors with parameter uncertainties

    International Nuclear Information System (INIS)

    Li Dong; Zhang Xiaohong; Wang Shilong; Yan Dan; Wang Hui

    2008-01-01

    The permanent magnet synchronous motors (PMSMs) may exhibit chaotic behaviours for uncertain values of parameters or under certain working conditions, which threatens the secure and stable operation of motor-driven systems. It is important to study methods of controlling or suppressing chaos in PMSMs. In this paper, robust stabilities of PMSM with parameter uncertainties are investigated. After the uncertain matrices which represent the variable system parameters are formulated through matrix analysis, a novel asymptotical stability criterion is established. Some illustrative examples are also given to show the effectiveness of the obtained results

  14. Bayesian analysis for uncertainty estimation of a canopy transpiration model

    Science.gov (United States)

    Samanta, S.; Mackay, D. S.; Clayton, M. K.; Kruger, E. L.; Ewers, B. E.

    2007-04-01

    A Bayesian approach was used to fit a conceptual transpiration model to half-hourly transpiration rates for a sugar maple (Acer saccharum) stand collected over a 5-month period and probabilistically estimate its parameter and prediction uncertainties. The model used the Penman-Monteith equation with the Jarvis model for canopy conductance. This deterministic model was extended by adding a normally distributed error term. This extension enabled using Markov chain Monte Carlo simulations to sample the posterior parameter distributions. The residuals revealed approximate conformance to the assumption of normally distributed errors. However, minor systematic structures in the residuals at fine timescales suggested model changes that would potentially improve the modeling of transpiration. Results also indicated considerable uncertainties in the parameter and transpiration estimates. This simple methodology of uncertainty analysis would facilitate the deductive step during the development cycle of deterministic conceptual models by accounting for these uncertainties while drawing inferences from data.
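    The approach of extending a deterministic model with a normally distributed error term and sampling the posterior by Markov chain Monte Carlo can be sketched with a toy model; the saturating function below merely stands in for the Penman-Monteith/Jarvis model, and all data and tuning values are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "transpiration" data from a toy saturating model
# y = a * x / (b + x) plus normally distributed error.
a_true, b_true, sigma = 2.0, 1.5, 0.1
x = np.linspace(0.1, 5.0, 50)
y = a_true * x / (b_true + x) + rng.normal(0.0, sigma, x.size)

def log_post(theta):
    """Log-posterior: flat priors on positive parameters and a
    Gaussian likelihood for the additive error term."""
    a, b = theta
    if a <= 0 or b <= 0:
        return -np.inf
    resid = y - a * x / (b + x)
    return -0.5 * np.sum(resid**2) / sigma**2

# Random-walk Metropolis sampling of the posterior
theta = np.array([1.0, 1.0])
lp = log_post(theta)
samples = []
for _ in range(6000):
    prop = theta + rng.normal(0.0, 0.05, 2)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject step
        theta, lp = prop, lp_prop
    samples.append(theta.copy())
samples = np.array(samples)
a_est = samples[3000:, 0].mean()   # posterior mean after burn-in
```

    The spread of the retained samples quantifies parameter uncertainty, and pushing them back through the model gives prediction uncertainty, mirroring the procedure described in the abstract.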

  15. Modeling of uncertainty in atmospheric transport system using hybrid method

    International Nuclear Information System (INIS)

    Pandey, M.; Ranade, Ashok; Brij Kumar; Datta, D.

    2012-01-01

    Atmospheric dispersion models are routinely used at nuclear and chemical plants to estimate exposure to the members of the public and occupational workers due to release of hazardous contaminants into the atmosphere. Atmospheric dispersion is a stochastic phenomenon and in general, the concentration of the contaminant estimated at a given time and at a predetermined location downwind of a source cannot be predicted precisely. Uncertainty in atmospheric dispersion model predictions is associated with: 'data' or 'parameter' uncertainty resulting from errors in the data used to execute and evaluate the model, uncertainties in empirical model parameters, and initial and boundary conditions; 'model' or 'structural' uncertainty arising from inaccurate treatment of dynamical and chemical processes, approximate numerical solutions, and internal model errors; and 'stochastic' uncertainty, which results from the turbulent nature of the atmosphere as well as from unpredictability of human activities related to emissions. The possibility theory based on fuzzy measure has been proposed in recent years as an alternative approach to address knowledge uncertainty of a model in situations where available information is too vague to represent the parameters statistically. The paper presents a novel approach (called the Hybrid Method) to model knowledge uncertainty in a physical system by a combination of probabilistic and possibilistic representation of parametric uncertainties. As a case study, the proposed approach is applied for estimating the ground level concentration of hazardous contaminant in air due to atmospheric releases through the stack (chimney) of a nuclear plant. The application illustrates the potential of the proposed approach. (author)
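    A minimal sketch of such a hybrid propagation, under a monotone toy dilution model: wind speed is sampled probabilistically, while a vaguely known dispersion factor is treated as a triangular fuzzy number and propagated by alpha-cut interval analysis. All names and numbers are invented for illustration and do not reflect the paper's actual case study.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000

Q = 1.0                                # release rate (illustrative units)
u = rng.lognormal(0.5, 0.3, n)         # wind speed: probabilistic (aleatory)

# Dispersion factor D is too vaguely known for a distribution, so it is
# modelled as a triangular fuzzy number: support [50, 200], core 100.
def alpha_cut(alpha, lo=50.0, core=100.0, hi=200.0):
    """Interval of the triangular fuzzy number at membership level alpha."""
    return lo + alpha * (core - lo), hi - alpha * (hi - core)

# Hybrid propagation: Monte Carlo over u, interval analysis over D.
# C = Q / (u * D) is monotone in D, so the endpoints of each alpha-cut
# bound the output's 95th percentile.
bounds = {}
for alpha in (0.0, 0.5, 1.0):
    d_lo, d_hi = alpha_cut(alpha)
    bounds[alpha] = (np.percentile(Q / (u * d_hi), 95),
                     np.percentile(Q / (u * d_lo), 95))
```

    The result is an interval-valued concentration percentile at each membership level, rather than a single probability distribution: the probabilistic spread reflects aleatory variability and the interval width reflects the fuzzy (knowledge) uncertainty.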

  16. Impact of geological model uncertainty on integrated catchment hydrological modeling

    Science.gov (United States)

    He, Xin; Jørgensen, Flemming; Refsgaard, Jens Christian

    2014-05-01

    Various types of uncertainty can influence hydrological model performance. Among them, uncertainty originating from the geological model may play an important role in process-based integrated hydrological modeling, if the model is used outside the calibration base. In the present study, we assess the hydrological model predictive uncertainty caused by uncertainty of the geology, using an ensemble of geological models with equal plausibility. The study is carried out in the 101 km2 Norsminde catchment in western Denmark. The geostatistical software TProGS is used to generate 20 stochastic geological realizations for the west side of the study area. This process is done while incorporating the borehole log data from 108 wells and high resolution airborne transient electromagnetic (AEM) data for conditioning. As a result, 10 geological models are generated based solely on borehole data, and another 10 geological models are based on both borehole and AEM data. Distributed surface water - groundwater models are developed using the MIKE SHE code for each of the 20 geological models. The models are then calibrated using field data collected from stream discharge and groundwater head observations. The model simulation results are evaluated based on the same two types of field data. The results show that the differences between simulated discharge flows caused by using different geological models are relatively small. The model calibration is shown to be able to account for the systematic bias in different geological realizations and hence varies the calibrated model parameters. This results in an increase in the variance between the hydrological realizations compared to the uncalibrated models that use the same parameter values in all 20 models. Furthermore, borehole based hydrological models in general show more variance between simulations than the AEM based models; however, the combined total uncertainty, bias plus variance, is not necessarily higher.

  17. Probabilistic Radiological Performance Assessment Modeling and Uncertainty

    Science.gov (United States)

    Tauxe, J.

    2004-12-01

    A generic probabilistic radiological Performance Assessment (PA) model is presented. The model, built using the GoldSim systems simulation software platform, concerns contaminant transport and dose estimation in support of decision making with uncertainty. Both the U.S. Nuclear Regulatory Commission (NRC) and the U.S. Department of Energy (DOE) require assessments of potential future risk to human receptors of disposal of LLW. Commercially operated LLW disposal facilities are licensed by the NRC (or agreement states), and the DOE operates such facilities for disposal of DOE-generated LLW. The type of PA model presented is probabilistic in nature, and hence reflects the current state of knowledge about the site by using probability distributions to capture what is expected (central tendency or average) and the uncertainty (e.g., standard deviation) associated with input parameters, and propagating through the model to arrive at output distributions that reflect expected performance and the overall uncertainty in the system. Estimates of contaminant release rates, concentrations in environmental media, and resulting doses to human receptors well into the future are made by running the model in Monte Carlo fashion, with each realization representing a possible combination of input parameter values. Statistical summaries of the results can be compared to regulatory performance objectives, and decision makers are better informed of the inherently uncertain aspects of the model which supports their decision-making. While this information may make some regulators uncomfortable, they must realize that uncertainties which were hidden in a deterministic analysis are revealed in a probabilistic analysis, and the chance of making a correct decision is now known rather than hoped for. The model includes many typical features and processes that would be part of a PA, but is entirely fictitious. This does not represent any particular site and is meant to be a generic example. 

  18. Addressing Replication and Model Uncertainty

    DEFF Research Database (Denmark)

    Ebersberger, Bernd; Galia, Fabrice; Laursen, Keld

    Many fields of strategic management are subject to an important degree of model uncertainty. This is because the true model, and therefore the selection of appropriate explanatory variables, is essentially unknown. Drawing on the literature on the determinants of innovation, and by analyzing innovation survey data for France, Germany and the UK, we conduct a ‘large-scale’ replication using the Bayesian averaging approach of classical estimators. Our method tests a wide range of determinants of innovation suggested in the prior literature, and establishes a robust set of findings on the variables which shape the introduction of new to the firm and new to the world innovations. We provide some implications for innovation research, and explore the potential application of our approach to other domains of research in strategic management.

  19. Uncertainty of Doppler reactivity worth due to uncertainties of JENDL-3.2 resonance parameters

    Energy Technology Data Exchange (ETDEWEB)

    Zukeran, Atsushi [Hitachi Ltd., Hitachi, Ibaraki (Japan). Power and Industrial System R and D Div.; Hanaki, Hiroshi; Nakagawa, Tuneo; Shibata, Keiichi; Ishikawa, Makoto

    1998-03-01

    An analytical formula for the Resonance Self-shielding Factor (f-factor) is derived from the resonance integral (J-function) based on the NR approximation, and an analytical expression for the Doppler reactivity worth (ρ) is also obtained by using the result. Uncertainties of the f-factor and Doppler reactivity worth are evaluated on the basis of sensitivity coefficients to the resonance parameters. The uncertainty of the Doppler reactivity worth at 487 K is about 4% for the PNC Large Fast Breeder Reactor. (author)

  20. Determining Best Estimates and Uncertainties in Cloud Microphysical Parameters from ARM Field Data: Implications for Models, Retrieval Schemes and Aerosol-Cloud-Radiation Interactions

    Energy Technology Data Exchange (ETDEWEB)

    McFarquhar, Greg [Univ. of Illinois, Urbana, IL (United States)

    2015-12-28

    We proposed to analyze in-situ cloud data collected during ARM/ASR field campaigns to create databases of cloud microphysical properties and their uncertainties as needed for the development of improved cloud parameterizations for models and remote sensing retrievals, and for evaluation of model simulations and retrievals. In particular, we proposed to analyze data collected over the Southern Great Plains (SGP) during the Mid-latitude Continental Convective Clouds Experiment (MC3E), the Storm Peak Laboratory Cloud Property Validation Experiment (STORMVEX), the Small Particles in Cirrus (SPARTICUS) Experiment and the Routine AAF Clouds with Low Optical Water Depths (CLOWD) Optical Radiative Observations (RACORO) field campaign, over the North Slope of Alaska during the Indirect and Semi-Direct Aerosol Campaign (ISDAC) and the Mixed-Phase Arctic Cloud Experiment (M-PACE), and over the Tropical Western Pacific (TWP) during the Tropical Warm Pool International Cloud Experiment (TWP-ICE), to meet the following 3 objectives: (1) derive statistical databases of single ice particle properties (aspect ratio AR, dominant habit, mass, projected area) and distributions of ice crystals (size distributions SDs, mass-dimension m-D, area-dimension A-D relations, mass-weighted fall speeds, single-scattering properties, total concentrations N, ice mass contents IWC), complete with uncertainty estimates; (2) assess processes by which aerosols modulate cloud properties in arctic stratus and mid-latitude cumuli, and quantify aerosol's influence in the context of varying meteorological and surface conditions; and (3) determine how ice cloud microphysical, single-scattering and fall-out properties and contributions of small ice crystals to such properties vary according to location, environment, surface, meteorological and aerosol conditions, and develop parameterizations of such effects. In this report we describe the accomplishments that we made on all 3 research objectives.

  1. Uncertainty "escalation" and use of machine learning to forecast residual and data model uncertainties

    Science.gov (United States)

    Solomatine, Dimitri

    2016-04-01

    When speaking about model uncertainty, many authors implicitly assume data uncertainty (mainly in parameters or inputs), which is probabilistically described by distributions. Often, however, it is necessary to look into the residual uncertainty as well. It is hence reasonable to classify the main approaches to uncertainty analysis with respect to the two main types of model uncertainty that can be distinguished: A. The residual uncertainty of models. In this case the model parameters and/or model inputs are considered to be fixed (deterministic), i.e. the model is considered to be optimal (calibrated) and deterministic. Model error is considered as the manifestation of uncertainty. If there is enough past data about the model errors (i.e. its uncertainty), it is possible to build a statistical or machine learning model of uncertainty trained on this data. The following methods can be mentioned: (a) the quantile regression (QR) method by Koenker and Bassett, in which linear regression is used to build predictive models for distribution quantiles [1]; (b) a more recent approach that takes into account the input variables influencing such uncertainty and uses more advanced machine learning (non-linear) methods (neural networks, model trees etc.), the UNEEC method [2,3,7]; (c) the even more recent DUBRAUE method (Dynamic Uncertainty Model By Regression on Absolute Error), an autoregressive model of model residuals (it corrects the model residual first and then carries out the uncertainty prediction by an autoregressive statistical model) [5]. B. The data uncertainty (parametric and/or input). In this case we study the propagation of uncertainty (presented typically probabilistically) from parameters or inputs to the model outputs. In the case of simple functions representing models, analytical approaches can be used, or approximation methods (e.g., the first-order second-moment method). However, for real complex non-linear models implemented in software there is no other choice except using
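    Method (a), quantile regression on past model errors, can be sketched with the pinball (quantile) loss; the subgradient-descent fit below is a simple stand-in for the linear-programming formulation of Koenker and Bassett, and the heteroscedastic toy data are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy residual data: model-error spread grows with an input variable,
# mimicking heteroscedastic model uncertainty.
x = rng.uniform(0.0, 10.0, 2000)
err = rng.normal(0.0, 0.1 + 0.05 * x)

def fit_quantile(x, y, tau, lr=0.002, steps=5000):
    """Linear quantile regression y ~ a + b*x, fitted by subgradient
    descent on the pinball loss (a stand-in for the LP formulation)."""
    a, b = 0.0, 0.0
    for _ in range(steps):
        pred = a + b * x
        grad = np.where(y > pred, -tau, 1.0 - tau)   # d(pinball)/d(pred)
        a -= lr * grad.mean()
        b -= lr * (grad * x).mean()
    return a, b

a90, b90 = fit_quantile(x, err, tau=0.9)
frac_below = np.mean(err <= a90 + b90 * x)   # should be close to tau
```

    The fitted line is an input-dependent 90% error bound, which is the essence of forecasting residual uncertainty from past model errors.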

  2. Parameter uncertainty analysis of non-point source pollution from different land use types.

    Science.gov (United States)

    Shen, Zhen-yao; Hong, Qian; Yu, Hong; Niu, Jun-feng

    2010-03-15

    Land use type is one of the most important factors affecting the uncertainty in non-point source (NPS) pollution simulation. In this study, seventeen sensitive parameters were screened from the Soil and Water Assessment Tool (SWAT) model for parameter uncertainty analysis for different land use types in the Daning River Watershed of the Three Gorges Reservoir area, China. The First-Order Error Analysis (FOEA) method was adopted to analyze the effect of parameter uncertainty on model outputs under three types of land use, namely, plantation, forest and grassland. The model outputs selected in this study consisted of runoff, sediment yield, organic nitrogen (N), and total phosphorus (TP). The results indicated that the uncertainty conferred by the parameters differed among the three land use types. In forest and grassland, the parameter uncertainty in NPS pollution was primarily associated with runoff processes, but in plantation, the main uncertain parameters were related to runoff processes and soil properties. Taken together, the study suggested that adjusting the structure of land use and controlling fertilizer use are helpful methods to control NPS pollution in the Daning River Watershed.
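    The FOEA idea, propagating parameter variances through first-order sensitivities, can be sketched as follows; the toy runoff function and the variance values are illustrative assumptions, not SWAT's parameters.

```python
import numpy as np

def runoff_model(params):
    """Toy curve-number-style runoff response standing in for a SWAT
    output; the functional form is illustrative only."""
    cn, k, p = params            # curve number, conductivity, rainfall
    s = 25400.0 / cn - 254.0
    q = max(p - 0.2 * s, 0.0)**2 / (p + 0.8 * s)
    return q * np.exp(-0.1 * k)

theta = np.array([75.0, 2.0, 60.0])         # nominal parameter values
var_theta = np.array([25.0, 0.25, 0.0])     # parameter variances (p fixed)

# FOEA: Var(Y) ~= sum_i (dY/dtheta_i)^2 * Var(theta_i), with
# sensitivities estimated by central finite differences.
eps = 1e-4 * theta
sens = np.empty(3)
for i in range(3):
    up, dn = theta.copy(), theta.copy()
    up[i] += eps[i]
    dn[i] -= eps[i]
    sens[i] = (runoff_model(up) - runoff_model(dn)) / (2 * eps[i])

var_y = np.sum(sens**2 * var_theta)
contrib = sens**2 * var_theta / var_y    # fractional contribution per parameter
```

    The fractional contributions identify which parameters dominate the output uncertainty, which is how FOEA supports ranking parameters across land use types.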

  3. Uncertainty and its propagation in dynamics models

    International Nuclear Information System (INIS)

    Devooght, J.

    1994-01-01

    The purpose of this paper is to bring together some characteristics due to uncertainty when we deal with dynamic models, and therefore with propagation of uncertainty. The respective roles of uncertainty and inaccuracy are examined. A mathematical formalism based on the Chapman-Kolmogorov equation allows one to define a "subdynamics" where the evolution equation takes the uncertainty into account. The problem of choosing or combining models is examined through a loss function associated to a decision

  4. Where does the uncertainty come from? Attributing Uncertainty in Conceptual Hydrologic Modelling

    Science.gov (United States)

    Abu Shoaib, S.; Marshall, L. A.; Sharma, A.

    2015-12-01

    Defining an appropriate forecasting model is a key phase in water resources planning and design. Quantification of uncertainty is an important step in the development and application of hydrologic models. In this study, we examine the dependency of hydrologic model uncertainty on the observed model inputs, defined model structure, parameter optimization identifiability and identified likelihood. We present here a new uncertainty metric, the Quantile Flow Deviation or QFD, to evaluate the relative uncertainty due to each of these sources under a range of catchment conditions. Through the metric, we may identify the potential spectrum of uncertainty and variability in model simulations. The QFD assesses uncertainty by estimating the deviation in flows at a given quantile across a range of scenarios. By using a quantile based metric, the change in uncertainty across individual percentiles can be assessed, thereby allowing uncertainty to be expressed as a function of time. The QFD method can be disaggregated to examine any part of the modelling process including the selection of certain model subroutines or forcing data. Case study results (including catchments in Australia and USA) suggest that model structure selection is vital irrespective of the flow percentile of interest or the catchment being studied. Examining the QFD across various quantiles additionally demonstrates that lower yielding catchments may have greater variation due to selected model structures. By incorporating multiple model structures, it is possible to assess (i) the relative importance of various sources of uncertainty, (ii) how these vary with the change in catchment location or hydrologic regime; and (iii) the impact of the length of available observations in uncertainty quantification.
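
    A minimal sketch of a quantile-based deviation metric in the spirit of the QFD: for each quantile, measure the spread of simulated flows across competing scenarios (here model structures). The exact QFD formula is not given in the abstract, so the spread measure and the gamma-distributed flow series below are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical simulated flow series from three competing model
# structures for the same catchment (illustrative data).
flows = {
    "model_A": rng.gamma(2.0, 5.0, size=1000),
    "model_B": rng.gamma(2.2, 4.5, size=1000),
    "model_C": rng.gamma(1.8, 6.0, size=1000),
}

quantiles = [0.1, 0.5, 0.9]
sims = np.array([np.quantile(series, quantiles) for series in flows.values()])

# Deviation at each flow quantile across the model structures:
# spread between the largest and smallest simulated quantile.
qfd = sims.max(axis=0) - sims.min(axis=0)
for q, d in zip(quantiles, qfd):
    print(f"quantile {q:.1f}: deviation {d:.2f}")
```

    Evaluating the deviation quantile by quantile, rather than as a single score, is what allows uncertainty to be attributed separately to low flows and high flows.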

  5. Uncertainties in environmental radiological assessment models and their implications

    International Nuclear Information System (INIS)

    Hoffman, F.O.; Miller, C.W.

    1983-01-01

    Environmental radiological assessments rely heavily on the use of mathematical models. The predictions of these models are inherently uncertain because these models are inexact representations of real systems. The major sources of this uncertainty are related to biases in model formulation and parameter estimation. The best approach for estimating the actual extent of over- or underprediction is model validation, a procedure that requires testing over the range of the intended realm of model application. Other approaches discussed are the use of screening procedures, sensitivity and stochastic analyses, and model comparison. The magnitude of uncertainty in model predictions is a function of the questions asked of the model and the specific radionuclides and exposure pathways of dominant importance. Estimates are made of the relative magnitude of uncertainty for situations requiring predictions of individual and collective risks for both chronic and acute releases of radionuclides. It is concluded that models developed as research tools should be distinguished from models developed for assessment applications. Furthermore, increased model complexity does not necessarily guarantee increased accuracy. To improve the realism of assessment modeling, stochastic procedures are recommended that translate uncertain parameter estimates into a distribution of predicted values. These procedures also permit the importance of model parameters to be ranked according to their relative contribution to the overall predicted uncertainty. Although confidence in model predictions can be improved through site-specific parameter estimation and increased model validation, risk factors and internal dosimetry models will probably remain important contributors to the amount of uncertainty that is irreducible

  6. Sensitivity and uncertainty analysis of the PATHWAY radionuclide transport model

    International Nuclear Information System (INIS)

    Otis, M.D.

    1983-01-01

    Procedures were developed for the uncertainty and sensitivity analysis of a dynamic model of radionuclide transport through human food chains. Uncertainty in model predictions was estimated by propagation of parameter uncertainties using a Monte Carlo simulation technique. Sensitivity of model predictions to individual parameters was investigated using the partial correlation coefficient of each parameter with model output. Random values produced for the uncertainty analysis were used in the correlation analysis for sensitivity. These procedures were applied to the PATHWAY model which predicts concentrations of radionuclides in foods grown in Nevada and Utah and exposed to fallout during the period of atmospheric nuclear weapons testing in Nevada. Concentrations and time-integrated concentrations of iodine-131, cesium-136, and cesium-137 in milk and other foods were investigated. 9 figs., 13 tabs
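
    The pairing of Monte Carlo propagation with partial-correlation sensitivity can be sketched as follows; the three-parameter model is illustrative, not the PATHWAY food-chain equations:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000

# Hypothetical transfer model: concentration as a function of three
# uncertain parameters, sampled for the uncertainty analysis.
params = rng.lognormal(mean=0.0, sigma=0.5, size=(n, 3))
output = 2.0 * params[:, 0] + 0.5 * params[:, 1] ** 2 + 0.1 * params[:, 2]

def partial_corr(x, y, controls):
    """Correlation of x and y after regressing out the control variables."""
    A = np.column_stack([np.ones(len(x)), controls])
    rx = x - A @ np.linalg.lstsq(A, x, rcond=None)[0]
    ry = y - A @ np.linalg.lstsq(A, y, rcond=None)[0]
    return np.corrcoef(rx, ry)[0, 1]

# Reuse the same Monte Carlo samples for the sensitivity ranking.
pccs = []
for i in range(3):
    others = np.delete(params, i, axis=1)
    pccs.append(partial_corr(params[:, i], output, others))
    print(f"parameter {i}: partial correlation {pccs[-1]:+.3f}")
```

    As in the record, the same random samples serve double duty: their spread gives the output uncertainty, and their correlation with the output ranks parameter importance.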

  7. Review of strategies for handling geological uncertainty in groundwater flow and transport modeling

    DEFF Research Database (Denmark)

    Refsgaard, Jens Christian; Christensen, Steen; Sonnenborg, Torben O.

    2012-01-01

    be accounted for, but is often neglected, in assessments of prediction uncertainties. Strategies for assessing prediction uncertainty due to geologically related uncertainty may be divided into three main categories, accounting for uncertainty due to: (a) the geological structure; (b) effective model...... parameters; and (c) model parameters including local scale heterogeneity. The most common methodologies for uncertainty assessments within each of these categories, such as multiple modeling, Monte Carlo analysis, regression analysis and moment equation approach, are briefly described with emphasis...

  8. Analytic uncertainty and sensitivity analysis of models with input correlations

    Science.gov (United States)

    Zhu, Yueying; Wang, Qiuping A.; Li, Wei; Cai, Xu

    2018-03-01

    Probabilistic uncertainty analysis is a common means of evaluating mathematical models. In mathematical modeling, the uncertainty in input variables is specified through distribution laws. Its contribution to the uncertainty in model response is usually analyzed by assuming that input variables are independent of each other. However, correlated parameters often occur in practical applications. In the present paper, an analytic method is built for the uncertainty and sensitivity analysis of models in the presence of input correlations. With the method, it is straightforward to identify the importance of the independence and correlations of input variables in determining the model response. This allows one to decide whether or not the input correlations should be considered in practice. Numerical examples demonstrate the effectiveness and validity of our analytic method in the analysis of general models. A practical application of the method is also proposed for the uncertainty and sensitivity analysis of a deterministic HIV model.
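
    For a linear model the effect of input correlations on output uncertainty is fully analytic: with y = a·x, Var(y) = aᵀΣa, so the covariance matrix enters directly. A minimal numeric check (coefficients and correlation chosen arbitrarily):

```python
import numpy as np

# Linear model y = a . x with uncertain inputs x.
a = np.array([1.0, 2.0])
sigma = np.array([0.5, 0.3])   # input standard deviations
rho = 0.8                      # correlation between the two inputs

cov_indep = np.diag(sigma**2)
cov_corr = np.array([
    [sigma[0]**2,               rho * sigma[0] * sigma[1]],
    [rho * sigma[0] * sigma[1], sigma[1]**2],
])

# Output variance: a^T Sigma a, with and without the correlation term.
var_indep = a @ cov_indep @ a
var_corr = a @ cov_corr @ a
print(f"independent: {var_indep:.3f}, correlated: {var_corr:.3f}")
```

    Comparing the two variances shows exactly how much the independence assumption would understate (or overstate, for negative ρ) the output uncertainty.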

  9. Application of a virtual coordinate measuring machine for measurement uncertainty estimation of aspherical lens parameters

    International Nuclear Information System (INIS)

    Küng, Alain; Meli, Felix; Nicolet, Anaïs; Thalmann, Rudolf

    2014-01-01

    Tactile ultra-precise coordinate measuring machines (CMMs) are very attractive for accurately measuring optical components with high slopes, such as aspheres. The METAS µ-CMM, which exhibits a single point measurement repeatability of a few nanometres, is routinely used for measurement services of microparts, including optical lenses. However, estimating the measurement uncertainty is very demanding. Because of the many combined influencing factors, an analytic determination of the uncertainty of parameters that are obtained by numerical fitting of the measured surface points is almost impossible. The application of numerical simulation (Monte Carlo methods) using a parametric fitting algorithm coupled with a virtual CMM based on a realistic model of the machine errors offers an ideal solution to this complex problem: to each measurement data point, a simulated measurement variation calculated from the numerical model of the METAS µ-CMM is added. Repeated several hundred times, these virtual measurements deliver the statistical data for calculating the probability density function, and thus the measurement uncertainty for each parameter. Additionally, the eventual cross-correlation between parameters can be analyzed. This method can be applied for the calibration and uncertainty estimation of any parameter of the equation representing a geometric element. In this article, we present the numerical simulation model of the METAS µ-CMM and the application of a Monte Carlo method for the uncertainty estimation of measured asphere parameters. (paper)
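
    The virtual-CMM Monte Carlo idea, i.e. perturbing each measured point with simulated machine error and refitting, can be sketched on a simple parabola fit standing in for the asphere equation (the noise level, geometry and parameter names below are invented):

```python
import numpy as np

rng = np.random.default_rng(7)

# "True" surface: z = c2*x^2 + c0, a stand-in for an asphere profile.
x = np.linspace(-1.0, 1.0, 50)
true_params = (0.25, 0.05)   # (curvature term c2, offset c0)
z_meas = true_params[0] * x**2 + true_params[1]

# Virtual-instrument Monte Carlo: add simulated machine error to each
# measured point, refit, and collect the fitted parameters.
fits = []
for _ in range(500):
    z_virtual = z_meas + rng.normal(0.0, 0.002, size=x.size)
    A = np.column_stack([x**2, np.ones_like(x)])
    fits.append(np.linalg.lstsq(A, z_virtual, rcond=None)[0])
fits = np.array(fits)

mean, std = fits.mean(axis=0), fits.std(axis=0)
print("fitted c2, c0:", mean, "1-sigma uncertainties:", std)
# Cross-correlation between the fitted parameters:
print("corr(c2, c0):", np.corrcoef(fits[:, 0], fits[:, 1])[0, 1])
```

    The repeated virtual measurements yield the distribution, and hence the uncertainty, of each fitted parameter, and the off-diagonal statistics expose the cross-correlations the abstract mentions.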

  10. Modeling theoretical uncertainties in phenomenological analyses for particle physics

    Energy Technology Data Exchange (ETDEWEB)

    Charles, Jerome [CNRS, Aix-Marseille Univ, Universite de Toulon, CPT UMR 7332, Marseille Cedex 9 (France); Descotes-Genon, Sebastien [CNRS, Univ. Paris-Sud, Universite Paris-Saclay, Laboratoire de Physique Theorique (UMR 8627), Orsay Cedex (France); Niess, Valentin [CNRS/IN2P3, UMR 6533, Laboratoire de Physique Corpusculaire, Aubiere Cedex (France); Silva, Luiz Vale [CNRS, Univ. Paris-Sud, Universite Paris-Saclay, Laboratoire de Physique Theorique (UMR 8627), Orsay Cedex (France); Univ. Paris-Sud, CNRS/IN2P3, Universite Paris-Saclay, Groupe de Physique Theorique, Institut de Physique Nucleaire, Orsay Cedex (France); J. Stefan Institute, Jamova 39, P. O. Box 3000, Ljubljana (Slovenia)

    2017-04-15

    The determination of the fundamental parameters of the Standard Model (and its extensions) is often limited by the presence of statistical and theoretical uncertainties. We present several models for the latter uncertainties (random, nuisance, external) in the frequentist framework, and we derive the corresponding p values. In the case of the nuisance approach where theoretical uncertainties are modeled as biases, we highlight the important, but arbitrary, issue of the range of variation chosen for the bias parameters. We introduce the concept of adaptive p value, which is obtained by adjusting the range of variation for the bias according to the significance considered, and which allows us to tackle metrology and exclusion tests with a single and well-defined unified tool, which exhibits interesting frequentist properties. We discuss how the determination of fundamental parameters is impacted by the model chosen for theoretical uncertainties, illustrating several issues with examples from quark flavor physics. (orig.)

  11. Reusable launch vehicle model uncertainties impact analysis

    Science.gov (United States)

    Chen, Jiaye; Mu, Rongjun; Zhang, Xin; Deng, Yanpeng

    2018-03-01

    A reusable launch vehicle (RLV) typically combines a complex aerodynamic shape with a coupled propulsion system, and its flight environment is highly complicated and intensely changeable. Its model therefore carries large uncertainty, which makes the nominal system quite different from the real system. Studying the influence of these uncertainties on the stability of the control system is thus of great significance for controller design. In order to improve the performance of an RLV, this paper proposes an approach for analyzing the influence of the model uncertainties. Coupled dynamic and kinematic models are built for a typical RLV. The different factors that introduce uncertainty during modelling are then analyzed and summarized, and the model uncertainties are expressed according to an additive uncertainty model. The maximum singular value of the uncertainty matrix is chosen as the boundary model, and the norm of the uncertainty matrix is used to quantify how strongly the uncertainty factors influence the stability of the control system. The simulation results illustrate that the inertial factors have the largest influence on the stability of the system, and that it is necessary and important to take the model uncertainties into consideration before designing the controller for this kind of aircraft (such as an RLV).

  12. Multi-scenario modelling of uncertainty in stochastic chemical systems

    International Nuclear Information System (INIS)

    Evans, R. David; Ricardez-Sandoval, Luis A.

    2014-01-01

    Uncertainty analysis has not been well studied at the molecular scale, despite extensive knowledge of uncertainty in macroscale systems. The ability to predict the effect of uncertainty allows for robust control of small scale systems such as nanoreactors, surface reactions, and gene toggle switches. However, it is difficult to model uncertainty in such chemical systems as they are stochastic in nature, and require a large computational cost. To address this issue, a new model of uncertainty propagation in stochastic chemical systems, based on the Chemical Master Equation, is proposed in the present study. The uncertain solution is approximated by a composite state comprised of the averaged effect of samples from the uncertain parameter distributions. This model is then used to study the effect of uncertainty on an isomerization system and a two gene regulation network called a repressilator. The results of this model show that uncertainty in stochastic systems is dependent on both the uncertain distribution, and the system under investigation. -- Highlights: •A method to model uncertainty on stochastic systems was developed. •The method is based on the Chemical Master Equation. •Uncertainty in an isomerization reaction and a gene regulation network was modelled. •Effects were significant and dependent on the uncertain input and reaction system. •The model was computationally more efficient than Kinetic Monte Carlo
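
    The composite-state idea of averaging over samples of the uncertain parameter can be sketched for the isomerization example with a Gillespie (SSA) simulation; the rate values and parameter distribution are invented, and the sampling approach here is a simple stand-in for the paper's Chemical-Master-Equation formulation:

```python
import numpy as np

rng = np.random.default_rng(9)

# Stochastic isomerization A <-> B with an uncertain forward rate kf.
def gillespie_final_A(kf, kr=1.0, n_total=100, t_end=5.0):
    A, t = n_total, 0.0
    while t < t_end:
        rates = np.array([kf * A, kr * (n_total - A)])
        total = rates.sum()
        if total == 0:
            break
        t += rng.exponential(1.0 / total)   # time to next reaction
        if t >= t_end:
            break
        A += -1 if rng.random() < rates[0] / total else 1
    return A

# Sample the uncertain parameter, run one SSA trajectory per sample,
# and average the results into a composite state.
kf_samples = rng.lognormal(mean=np.log(2.0), sigma=0.25, size=200)
finals = np.array([gillespie_final_A(kf) for kf in kf_samples])
print(f"composite mean A: {finals.mean():.1f} +/- {finals.std():.1f}")
```

    The spread of `finals` mixes intrinsic stochasticity with parameter uncertainty, which is exactly the coupling that makes molecular-scale uncertainty analysis expensive.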

  13. Wastewater treatment modelling: dealing with uncertainties

    DEFF Research Database (Denmark)

    Belia, E.; Amerlinck, Y.; Benedetti, L.

    2009-01-01

    This paper serves as a problem statement of the issues surrounding uncertainty in wastewater treatment modelling. The paper proposes a structure for identifying the sources of uncertainty introduced during each step of an engineering project concerned with model-based design or optimisation...... of a wastewater treatment system. It briefly references the methods currently used to evaluate prediction accuracy and uncertainty and discusses the relevance of uncertainty evaluations in model applications. The paper aims to raise awareness and initiate a comprehensive discussion among professionals on model...

  14. Uncertainty associated with selected environmental transport models

    International Nuclear Information System (INIS)

    Little, C.A.; Miller, C.W.

    1979-11-01

    A description is given of the capabilities of several models to predict accurately either pollutant concentrations in environmental media or radiological dose to human organs. The models are discussed in three sections: aquatic or surface water transport models, atmospheric transport models, and terrestrial and aquatic food chain models. Using data published primarily by model users, model predictions are compared to observations. This procedure is infeasible for food chain models and, therefore, the uncertainty embodied in the model input parameters, rather than the model output, is estimated. Aquatic transport models are divided into one-dimensional, longitudinal-vertical, and longitudinal-horizontal models. Several conclusions were made about the ability of the Gaussian plume atmospheric dispersion model to predict accurately downwind air concentrations from releases under several sets of conditions. It is concluded that no validation study has been conducted to test the predictions of either aquatic or terrestrial food chain models. Using the aquatic pathway from water to fish to an adult for cesium-137 as an example, a 95% one-tailed confidence limit for the predicted exposure is calculated by examining the distributions of the input parameters. Such a limit is found to be 16 times the value of the median exposure. A similar one-tailed limit for the air-grass-cow-milk-thyroid pathway for iodine-131 and infants was 5.6 times the median dose. Of the three model types discussed in this report, the aquatic transport models appear to do the best job of predicting observed concentrations. However, this conclusion is based on many fewer aquatic validation data than were available for atmospheric model validation
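
    Ratios of this kind (one-tailed limit over median) can be reproduced in form, not in value, by treating the pathway as a product of lognormal transfer factors: the product is again lognormal, so the limit follows analytically. The geometric standard deviations below are illustrative assumptions, not the report's values:

```python
import math

# Multiplicative pathway: exposure = factor1 * factor2 * factor3, each
# lognormal with the geometric standard deviations (GSDs) given below.
gsd = [2.0, 1.5, 2.5]                 # illustrative GSDs of the factors
sigmas = [math.log(g) for g in gsd]
sigma_total = math.sqrt(sum(s**2 for s in sigmas))  # log-space std of product

z95 = 1.645                            # one-tailed 95% normal quantile
ratio = math.exp(z95 * sigma_total)    # 95th percentile / median exposure
print(f"95% one-tailed limit is {ratio:.1f} x the median exposure")
```

    Because the log-space variances add, a pathway with a few wide factors quickly produces limits many times the median, which is the behaviour the report's 16x and 5.6x figures reflect.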

  15. Toward Improved Reliability of Seasonal Hydrologic Forecast: Accounting for Initial Condition and State-Parameter Uncertainties

    Science.gov (United States)

    DeChant, C. M.; Moradkhani, H.

    2012-12-01

    Providing reliable estimates of seasonal water supply is a primary goal in operational hydro-meteorological prediction. In order to achieve this goal, it is accepted that hydrologists must accurately estimate forecast initial conditions (land surface states prior to forecast) and the future climate conditions, and quantify the uncertainty in these two forecast stages to provide a full estimation of the uncertainty in a given forecast. Recent work has highlighted the benefits of such a framework through advancing both land surface state estimation techniques and future climate estimation/modeling, within the operational Ensemble Streamflow Prediction (ESP) methodology. Often overlooked in this framework, the uncertainty in land surface state estimates play a key role in providing reliable seasonal forecasts. In order to quantify and reduce this uncertainty, land surface state-parameter estimation, through ensemble data assimilation, is performed with observations of snow and streamflow in a mountainous basin. Through incorporation of both snow and streamflow data for estimation of land surface states and parameters, the quantity of water stored at the land surface can be estimated, and parameter uncertainty can be estimated for seasonal simulations. With the inclusion of parameter uncertainty in the hydrologic forecasting framework, more robust quantification of hydrologic uncertainty is possible, leading to more useful forecasts for end users. This study seeks to examine the role of combined state-parameter estimation for characterization of initial conditions with the potential to be formally adopted in operational ESP framework, and validates results with probabilistic verification of both ESP and ESP with state-parameter estimation.
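
    Joint state-parameter estimation by ensemble data assimilation can be sketched with an ensemble Kalman update on a toy linear reservoir; the model, noise levels and ensemble sizes below are illustrative assumptions, not those of the study:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy reservoir: storage S evolves as S += rain - k*S; streamflow
# q = k*S is observed. The augmented vector [S, k] is updated jointly.
true_k, n_ens, n_steps, obs_std = 0.3, 100, 30, 0.5

S_true = 10.0
ens = np.column_stack([
    rng.normal(10.0, 2.0, n_ens),    # storage (state) ensemble
    rng.uniform(0.1, 0.9, n_ens),    # recession parameter ensemble (prior)
])
prior_k_spread = ens[:, 1].std()

for t in range(n_steps):
    rain = 2.0 + np.sin(0.3 * t)
    S_true = S_true + rain - true_k * S_true
    obs = true_k * S_true + rng.normal(0.0, obs_std)

    # Forecast step: each member propagates with its own parameter.
    ens[:, 0] = ens[:, 0] + rain - ens[:, 1] * ens[:, 0]

    # Analysis step: stochastic ensemble Kalman update of [S, k].
    sim_q = ens[:, 1] * ens[:, 0]
    perturbed_obs = obs + rng.normal(0.0, obs_std, n_ens)
    denom = sim_q.var(ddof=1) + obs_std**2
    for i in range(2):
        gain = np.cov(ens[:, i], sim_q)[0, 1] / denom
        ens[:, i] += gain * (perturbed_obs - sim_q)

print(f"estimated k = {ens[:, 1].mean():.3f} +/- {ens[:, 1].std():.3f} "
      f"(truth {true_k}, prior spread {prior_k_spread:.3f})")
```

    The surviving ensemble spread in k is the parameter uncertainty that would then be carried forward into a seasonal forecast, rather than being discarded after calibration.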

  16. Quantifying uncertainty in Bayesian calibrated animal-to-human PBPK models with informative prior distributions

    Science.gov (United States)

    Understanding and quantifying the uncertainty of model parameters and predictions has gained more interest in recent years with the increased use of computational models in chemical risk assessment. Fully characterizing the uncertainty in risk metrics derived from linked quantita...

  17. Uncertainties in Parameters Estimated with Neural Networks: Application to Strong Gravitational Lensing

    Science.gov (United States)

    Perreault Levasseur, Laurence; Hezaveh, Yashar D.; Wechsler, Risa H.

    2017-11-01

    In Hezaveh et al. we showed that deep learning can be used for model parameter estimation and trained convolutional neural networks to determine the parameters of strong gravitational-lensing systems. Here we demonstrate a method for obtaining the uncertainties of these parameters. We review the framework of variational inference to obtain approximate posteriors of Bayesian neural networks and apply it to a network trained to estimate the parameters of the Singular Isothermal Ellipsoid plus external shear and total flux magnification. We show that the method can capture the uncertainties due to different levels of noise in the input data, as well as training and architecture-related errors made by the network. To evaluate the accuracy of the resulting uncertainties, we calculate the coverage probabilities of marginalized distributions for each lensing parameter. By tuning a single variational parameter, the dropout rate, we obtain coverage probabilities approximately equal to the confidence levels for which they were calculated, resulting in accurate and precise uncertainty estimates. Our results suggest that the application of approximate Bayesian neural networks to astrophysical modeling problems can be a fast alternative to Monte Carlo Markov Chains, allowing orders of magnitude improvement in speed.
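
    The dropout-based uncertainty estimation described here can be sketched with Monte Carlo dropout on a tiny untrained network: dropout stays active at prediction time, and many stochastic forward passes approximate a posterior over outputs. The random weights below are stand-ins for the trained lensing CNN:

```python
import numpy as np

rng = np.random.default_rng(5)

# Tiny fixed two-layer network (random stand-in for a trained model).
W1 = rng.normal(0, 1, (16, 2))
W2 = rng.normal(0, 1, (1, 16))

def mc_dropout_predict(x, n_samples=2000, p_drop=0.2):
    outs = []
    for _ in range(n_samples):
        h = np.maximum(W1 @ x, 0.0)              # ReLU hidden layer
        mask = rng.random(h.shape) > p_drop      # dropout kept ON at test time
        h = h * mask / (1.0 - p_drop)            # inverted dropout scaling
        outs.append((W2 @ h)[0])
    outs = np.array(outs)
    return outs.mean(), outs.std()               # predictive mean and spread

mean, std = mc_dropout_predict(np.array([0.5, -1.0]))
print(f"prediction {mean:.3f} +/- {std:.3f}")
```

    The dropout rate plays the role of the single variational parameter mentioned in the abstract: tuning it widens or narrows the sampled predictive distribution.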

  18. Fuzzy techniques for subjective workload-score modeling under uncertainties.

    Science.gov (United States)

    Kumar, Mohit; Arndt, Dagmar; Kreuzfeld, Steffi; Thurow, Kerstin; Stoll, Norbert; Stoll, Regina

    2008-12-01

    This paper deals with the development of a computer model to estimate the subjective workload score of individuals by evaluating their heart-rate (HR) signals. The identification of a model to estimate the subjective workload score of individuals under different workload situations is too ambitious a task because different individuals (due to different body conditions, emotional states, age, gender, etc.) show different physiological responses (assessed by evaluating the HR signal) under different workload situations. This is equivalent to saying that the mathematical mappings between physiological parameters and the workload score are uncertain. Our approach to dealing with the uncertainties in a workload-modeling problem consists of the following steps: 1) the uncertainties arising due to individual variations in identifying a common model valid for all individuals are filtered out using a fuzzy filter; 2) the uncertainties (provided by the fuzzy filter) are modeled stochastically using finite-mixture models, and this information regarding the uncertainties is used to identify the structure and initial parameters of a workload model; and 3) finally, the workload model parameters for an individual are identified in an online scenario using machine learning algorithms. The contribution of this paper is to propose, with a mathematical analysis, a fuzzy-based modeling technique that first filters out the uncertainties from the modeling problem, analyzes the uncertainties statistically using finite-mixture modeling, and, finally, utilizes the information about uncertainties for adapting the workload model to an individual's physiological conditions. The approach of this paper, demonstrated with the real-world medical data of 11 subjects, provides a fuzzy-based tool useful for modeling in the presence of uncertainties.

  19. A market model: uncertainty and reachable sets

    Directory of Open Access Journals (Sweden)

    Raczynski Stanislaw

    2015-01-01

    Full Text Available Uncertain parameters are always present in models that include the human factor. In marketing, uncertain consumer behavior makes it difficult to predict future events and elaborate good marketing strategies. Sometimes uncertainty is modeled using stochastic variables. Our approach is quite different: the dynamic market with uncertain parameters is treated using differential inclusions, which permits determination of the corresponding reachable sets. This is not a statistical analysis; we are looking for solutions to the differential inclusions. The purpose of the research is to find a way to obtain and visualise the reachable sets, in order to know the limits for the important marketing variables. The modeling method consists of defining the differential inclusion and finding its solution, using the differential inclusion solver developed by the author. As the result we obtain images of the reachable sets where the main control parameter is the share of investment, being a part of the revenue. As an additional result we can also define the optimal investment strategy. The conclusion is that the differential inclusion solver can be a useful tool in market model analysis.

  20. Fukushima Daiichi unit 1 uncertainty analysis--Preliminary selection of uncertain parameters and analysis methodology

    Energy Technology Data Exchange (ETDEWEB)

    Cardoni, Jeffrey N.; Kalinich, Donald A.

    2014-02-01

    Sandia National Laboratories (SNL) plans to conduct uncertainty analyses (UA) on the Fukushima Daiichi unit (1F1) plant with the MELCOR code. The model to be used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and Nuclear Regulatory Commission (NRC). However, that study only examined a handful of various model inputs and boundary conditions, and the predictions yielded only fair agreement with plant data and current release estimates. The goal of this uncertainty study is to perform a focused evaluation of uncertainty in core melt progression behavior and its effect on key figures-of-merit (e.g., hydrogen production, vessel lower head failure, etc.). In preparation for the SNL Fukushima UA work, a scoping study has been completed to identify important core melt progression parameters for the uncertainty analysis. The study also lays out a preliminary UA methodology.

  1. Explicitly integrating parameter, input, and structure uncertainties into Bayesian Neural Networks for probabilistic hydrologic forecasting

    KAUST Repository

    Zhang, Xuesong

    2011-11-01

    Estimating the uncertainty of hydrologic forecasting is valuable to water resources management and other relevant decision-making processes. Recently, Bayesian Neural Networks (BNNs) have proved to be powerful tools for quantifying the uncertainty of streamflow forecasting. In this study, we propose a Markov Chain Monte Carlo (MCMC) framework (BNN-PIS) to incorporate the uncertainties associated with parameters, inputs, and structures into BNNs. This framework allows the structure of the neural networks to change by removing or adding connections between neurons and enables scaling of input data by using rainfall multipliers. The results show that the new BNNs outperform BNNs that only consider uncertainties associated with parameters and model structures. Critical evaluation of the posterior distributions of neural network weights, number of effective connections, rainfall multipliers, and hyper-parameters shows that the assumptions held in our BNNs are not well supported. Further understanding of the characteristics of and interactions among different uncertainty sources is expected to enhance the application of neural networks for uncertainty analysis of hydrologic forecasting. © 2011 Elsevier B.V.

  2. Spatial Uncertainty Model for Visual Features Using a Kinect™ Sensor

    Directory of Open Access Journals (Sweden)

    Jae-Han Park

    2012-06-01

    Full Text Available This study proposes a mathematical uncertainty model for the spatial measurement of visual features using Kinect™ sensors. This model can provide qualitative and quantitative analysis for the utilization of Kinect™ sensors as 3D perception sensors. In order to achieve this objective, we derived the propagation relationship of the uncertainties between the disparity image space and the real Cartesian space with the mapping function between the two spaces. Using this propagation relationship, we obtained the mathematical model for the covariance matrix of the measurement error, which represents the uncertainty for spatial position of visual features from Kinect™ sensors. In order to derive the quantitative model of spatial uncertainty for visual features, we estimated the covariance matrix in the disparity image space using collected visual feature data. Further, we computed the spatial uncertainty information by applying the covariance matrix in the disparity image space and the calibrated sensor parameters to the proposed mathematical model. This spatial uncertainty model was verified by comparing the uncertainty ellipsoids for spatial covariance matrices and the distribution of scattered matching visual features. We expect that this spatial uncertainty model and its analyses will be useful in various Kinect™ sensor applications.

  3. Spatial uncertainty model for visual features using a Kinect™ sensor.

    Science.gov (United States)

    Park, Jae-Han; Shin, Yong-Deuk; Bae, Ji-Hun; Baeg, Moon-Hong

    2012-01-01

    This study proposes a mathematical uncertainty model for the spatial measurement of visual features using Kinect™ sensors. This model can provide qualitative and quantitative analysis for the utilization of Kinect™ sensors as 3D perception sensors. In order to achieve this objective, we derived the propagation relationship of the uncertainties between the disparity image space and the real Cartesian space with the mapping function between the two spaces. Using this propagation relationship, we obtained the mathematical model for the covariance matrix of the measurement error, which represents the uncertainty for spatial position of visual features from Kinect™ sensors. In order to derive the quantitative model of spatial uncertainty for visual features, we estimated the covariance matrix in the disparity image space using collected visual feature data. Further, we computed the spatial uncertainty information by applying the covariance matrix in the disparity image space and the calibrated sensor parameters to the proposed mathematical model. This spatial uncertainty model was verified by comparing the uncertainty ellipsoids for spatial covariance matrices and the distribution of scattered matching visual features. We expect that this spatial uncertainty model and its analyses will be useful in various Kinect™ sensor applications.
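
    The covariance propagation from disparity image space to Cartesian space can be sketched with a first-order (Jacobian) transform, Σ_xyz = J Σ_uvd Jᵀ. The pinhole/disparity mapping and the Kinect-like intrinsics below are assumptions for illustration, not the paper's calibrated values:

```python
import numpy as np

# Illustrative Kinect-like intrinsics.
f, b = 580.0, 0.075    # focal length [px], baseline [m]
cx, cy = 320.0, 240.0  # principal point [px]

# Mapping from disparity image space (u, v, d) to Cartesian (x, y, z).
def to_cartesian(u, v, d):
    z = f * b / d
    return np.array([(u - cx) * z / f, (v - cy) * z / f, z])

# Jacobian of the mapping, used for first-order covariance propagation.
def jacobian(u, v, d):
    z = f * b / d
    return np.array([
        [z / f, 0.0,   -(u - cx) * z / (f * d)],
        [0.0,   z / f, -(v - cy) * z / (f * d)],
        [0.0,   0.0,   -z / d],
    ])

u, v, d = 400.0, 300.0, 20.0
cov_uvd = np.diag([0.5**2, 0.5**2, 0.3**2])  # disparity-space covariance
J = jacobian(u, v, d)
cov_xyz = J @ cov_uvd @ J.T                  # propagated covariance
print("depth z:", to_cartesian(u, v, d)[2])
print("Cartesian covariance:\n", cov_xyz)
```

    Visualizing `cov_xyz` as an ellipsoid at each feature position reproduces the kind of uncertainty ellipsoids the paper uses for verification; note how the z-variance grows with depth because of the -z/d term.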

  4. Synchronization of chaotic systems with parameter uncertainties via variable structure control

    International Nuclear Information System (INIS)

    Etemadi, Shahram; Alasty, Aria; Salarieh, Hassan

    2006-01-01

    The Letter introduces a robust control design method to synchronize a pair of different uncertain chaotic systems. The technique is based on sliding-mode and variable structure control theories. The proposed method is compared with previous works in simulations. It is shown that the proposed controller, while providing a faster response, is able to overcome random uncertainties in all model parameters

  5. Resonance parameter data uncertainty effects on integral characteristic of fast reactors

    International Nuclear Information System (INIS)

    Salvatores, M.; Palmiotti, G.; Derrien, H.; Fort, E.; Oliva, G.

    1981-10-01

    Sensitivity studies are presented of integral parameters of interest for fast reactors with respect to uncertainties in the resonance parameters of U-238, Pu-239, Pu-240 and Pu-241. Consequences of some uncertainty correlation hypotheses are also considered

  6. Geostatistical simulation of geological architecture and uncertainty propagation in groundwater modeling

    DEFF Research Database (Denmark)

    He, Xiulan

    Groundwater modeling plays an essential role in modern subsurface hydrology research. It’s generally recognized that simulations and predictions by groundwater models are associated with uncertainties that originate from various sources. The two major uncertainty sources are related to model...... parameters and model structures, which are the primary focuses of this PhD research. Parameter uncertainty was analyzed using an optimization tool (PEST: Parameter ESTimation) in combination with a random sampling method (LHS: Latin Hypercube Sampling). Model structure, namely geological architecture...

  7. Aerosol model selection and uncertainty modelling by adaptive MCMC technique

    Directory of Open Access Journals (Sweden)

    M. Laine

    2008-12-01

    Full Text Available We present a new technique for the model selection problem in atmospheric remote sensing. The technique is based on Monte Carlo sampling and allows model selection, calculation of model posterior probabilities and model averaging in a Bayesian way.

    The algorithm developed here is called the Adaptive Automatic Reversible Jump Markov chain Monte Carlo method (AARJ). It uses the Markov chain Monte Carlo (MCMC) technique and its extension, Reversible Jump MCMC. Both techniques have been used extensively in statistical parameter estimation problems across a wide range of applications since the late 1990s. The novel feature of our algorithm is that it is fully automatic and easy to use.

    We show how the AARJ algorithm can be implemented and used for model selection and averaging, and to directly incorporate the model uncertainty. We demonstrate the technique by applying it to the statistical inversion problem of gas profile retrieval of GOMOS instrument on board the ENVISAT satellite. Four simple models are used simultaneously to describe the dependence of the aerosol cross-sections on wavelength. During the AARJ estimation all the models are used and we obtain a probability distribution characterizing how probable each model is. By using model averaging, the uncertainty related to selecting the aerosol model can be taken into account in assessing the uncertainty of the estimates.
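    The model-averaging step described above can be sketched numerically. A minimal sketch, assuming made-up visit counts, per-model estimates, and within-model spreads (not GOMOS results); the between/within variance split is the standard Bayesian model averaging decomposition, not a detail stated in the abstract:

```python
import numpy as np

# Hypothetical fractions of RJMCMC iterations spent in each of four candidate models
visits = np.array([5200, 3100, 1200, 500])
p_model = visits / visits.sum()   # model posterior probabilities

# Each model's estimate of a retrieved quantity and its within-model std (made up)
est = np.array([0.82, 0.85, 0.78, 0.90])
sd = np.array([0.010, 0.020, 0.015, 0.030])

# Model-averaged estimate; total variance = within-model + between-model parts
avg = np.sum(p_model * est)
total_var = np.sum(p_model * (sd**2 + (est - avg)**2))

print(f"model-averaged estimate: {avg:.3f} +/- {np.sqrt(total_var):.3f}")
```

The between-model term `(est - avg)**2` is what lets the averaged uncertainty reflect disagreement among the candidate models, which a single selected model would hide.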

  8. Modelling of Transport Projects Uncertainties

    DEFF Research Database (Denmark)

    Salling, Kim Bang; Leleur, Steen

    2009-01-01

    This paper proposes a new way of handling the uncertainties present in transport decision making based on infrastructure appraisals. The paper suggests to combine the principle of Optimism Bias, which depicts the historical tendency of overestimating transport related benefits and underestimating......-based graphs which function as risk-related decision support for the appraised transport infrastructure project....

  9. Assessment of parametric uncertainty for groundwater reactive transport modeling,

    Science.gov (United States)

    Shi, Xiaoqing; Ye, Ming; Curtis, Gary P.; Miller, Geoffery L.; Meyer, Philip D.; Kohler, Matthias; Yabusaki, Steve; Wu, Jichun

    2014-01-01

    The validity of using Gaussian assumptions for model residuals in uncertainty quantification of a groundwater reactive transport model was evaluated in this study. Least squares regression methods explicitly assume Gaussian residuals, and the assumption leads to Gaussian likelihood functions, model parameters, and model predictions. While the Bayesian methods do not explicitly require the Gaussian assumption, Gaussian residuals are widely used. This paper shows that the residuals of the reactive transport model are non-Gaussian, heteroscedastic, and correlated in time; characterizing them requires using a generalized likelihood function such as the formal generalized likelihood function developed by Schoups and Vrugt (2010). For the surface complexation model considered in this study for simulating uranium reactive transport in groundwater, parametric uncertainty is quantified using the least squares regression methods and Bayesian methods with both Gaussian and formal generalized likelihood functions. While the least squares methods and Bayesian methods with Gaussian likelihood function produce similar Gaussian parameter distributions, the parameter distributions of Bayesian uncertainty quantification using the formal generalized likelihood function are non-Gaussian. In addition, predictive performance of formal generalized likelihood function is superior to that of least squares regression and Bayesian methods with Gaussian likelihood function. The Bayesian uncertainty quantification is conducted using the differential evolution adaptive metropolis (DREAM(zs)) algorithm; as a Markov chain Monte Carlo (MCMC) method, it is a robust tool for quantifying uncertainty in groundwater reactive transport models. For the surface complexation model, the regression-based local sensitivity analysis and Morris- and DREAM(ZS)-based global sensitivity analysis yield almost identical ranking of parameter importance. 
The uncertainty analysis may help select appropriate likelihood

  10. Study on Uncertainty and Contextual Modelling

    Czech Academy of Sciences Publication Activity Database

    Klimešová, Dana; Ocelíková, E.

    2007-01-01

    Roč. 1, č. 1 (2007), s. 12-15 ISSN 1998-0140 Institutional research plan: CEZ:AV0Z10750506 Keywords : Knowledge * contextual modelling * temporal modelling * uncertainty * knowledge management Subject RIV: BD - Theory of Information

  11. Using the sampling method to propagate uncertainties of physical parameters in systems with fissile material

    International Nuclear Information System (INIS)

    Campolina, Daniel de Almeida Magalhães

    2015-01-01

    There is an uncertainty for all the components that comprise the model of a nuclear system. Assessing the impact of uncertainties in the simulation of fissionable material systems is essential for realistic calculations, which have been replacing conservative calculations as computational power increases. Propagating uncertainty in a Monte Carlo code simulation by sampling the input parameters is a recent practice because of the huge computational effort required. By analyzing the uncertainty propagated to the effective neutron multiplication factor (k_eff), the effects of sample size, computational uncertainty, and the efficiency of the random number generator in representing the distributions that characterize physical uncertainty in a light water reactor were investigated. A program entitled GBsample was implemented to automate the random sampling method, use multiprocessing, and provide the necessary robust statistical tools. The program was based on the black box model, and the MCNPX code was used with parallel processing for the particle transport calculations. The uncertainties considered were taken from a benchmark experiment in which the effect on k_eff due to physical uncertainties is assessed through a conservative method. It was found that the efficiency of the random sampling method can be improved by selecting distributions obtained from a random number generator so as to better represent the uncertainty figures. After convergence of the method is achieved, the best number of components to be sampled was determined in order to reduce the variance of the propagated uncertainty without increasing computational time. It was also observed that if the sampling method is used to calculate the effect on k_eff due to physical uncertainties reported by
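    A minimal sketch of this input-sampling scheme, assuming a toy linearized stand-in for the transport code and made-up parameter distributions (not the GBsample/MCNPX setup or any benchmark values):

```python
import numpy as np

rng = np.random.default_rng(42)
n_samples = 1000

# Hypothetical input uncertainties (mean, std) for two physical parameters;
# placeholder values, not taken from any benchmark.
enrichment = rng.normal(4.00, 0.05, n_samples)   # fuel enrichment, wt%
density = rng.normal(0.74, 0.01, n_samples)      # moderator density, g/cm^3

def keff_model(e, d):
    """Toy linearized stand-in for a Monte Carlo transport run (e.g. MCNPX)."""
    return 1.0 + 0.02 * (e - 4.00) + 0.5 * (d - 0.74)

keff = keff_model(enrichment, density)

# Uncertainty propagated to k_eff from the sampled inputs
print(f"mean k_eff = {keff.mean():.5f}, std = {keff.std(ddof=1):.5f}")
```

In the actual method each sample would be a full transport calculation, which is why sample size and random-number-generator quality matter so much for cost and accuracy.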

  12. Partitioning uncertainty in streamflow projections under nonstationary model conditions

    Science.gov (United States)

    Chawla, Ila; Mujumdar, P. P.

    2018-02-01

    Assessing the impacts of Land Use (LU) and climate change on future streamflow projections is necessary for efficient management of water resources. However, model projections are burdened with significant uncertainty arising from various sources. Most previous studies have considered climate models and scenarios as the major sources of uncertainty, but the uncertainties introduced by land use change and hydrologic model assumptions are rarely investigated. In this paper an attempt is made to segregate the contributions from (i) general circulation models (GCMs), (ii) emission scenarios, (iii) land use scenarios, (iv) the stationarity assumption of the hydrologic model, and (v) internal variability of the processes, to the overall uncertainty in streamflow projections using an analysis of variance (ANOVA) approach. Generally, most impact assessment studies are carried out with hydrologic model parameters held unchanged in the future. It is, however, necessary to address the nonstationarity in model parameters under changing land use and climate. In this paper, a regression based methodology is presented to obtain hydrologic model parameters under changing land use and climate scenarios in the future. The Upper Ganga Basin (UGB) in India is used as a case study to demonstrate the methodology. The semi-distributed Variable Infiltration Capacity (VIC) model is set up over the basin under nonstationary conditions. Results indicate that model parameters vary with time, thereby invalidating the often-used assumption of model stationarity. The streamflow in the UGB under the nonstationary model condition is found to reduce in the future. The flows are also found to be sensitive to changes in land use. Segregation results suggest that the model stationarity assumption and the GCMs, along with their interactions with emission scenarios, act as the dominant sources of uncertainty. 
This paper provides a generalized framework for hydrologists to examine stationarity assumption of models before considering them
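    The ANOVA-based segregation of variance can be illustrated with a toy two-factor table. A minimal sketch, assuming illustrative projection numbers (not UGB results) and only two of the five factors discussed above:

```python
import numpy as np

# Hypothetical streamflow changes (%) indexed by [GCM, emission scenario]
proj = np.array([[-5.0,  -8.0, -12.0],
                 [ 2.0,  -1.0,  -6.0],
                 [-9.0, -11.0, -15.0]])

grand = proj.mean()
gcm_eff = proj.mean(axis=1) - grand    # main effect of each GCM
scen_eff = proj.mean(axis=0) - grand   # main effect of each scenario

n_gcm, n_scen = proj.shape
ss_gcm = n_scen * np.sum(gcm_eff**2)
ss_scen = n_gcm * np.sum(scen_eff**2)
ss_total = np.sum((proj - grand)**2)
ss_inter = ss_total - ss_gcm - ss_scen  # interaction / residual term

for name, ss in [("GCM", ss_gcm), ("scenario", ss_scen), ("interaction", ss_inter)]:
    print(f"{name}: {100.0 * ss / ss_total:.1f}% of total variance")
```

The full study partitions variance over more factors (land use, model stationarity, internal variability), but the sum-of-squares decomposition works the same way.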

  13. Incorporating parametric uncertainty into population viability analysis models

    Science.gov (United States)

    McGowan, Conor P.; Runge, Michael C.; Larson, Michael A.

    2011-01-01

    Uncertainty in parameter estimates from sampling variation or expert judgment can introduce substantial uncertainty into ecological predictions based on those estimates. However, in standard population viability analyses, one of the most widely used tools for managing plant, fish and wildlife populations, parametric uncertainty is often ignored in or discarded from model projections. We present a method for explicitly incorporating this source of uncertainty into population models to fully account for risk in management and decision contexts. Our method involves a two-step simulation process where parametric uncertainty is incorporated into the replication loop of the model and temporal variance is incorporated into the loop for time steps in the model. Using the piping plover, a federally threatened shorebird in the USA and Canada, as an example, we compare abundance projections and extinction probabilities from simulations that exclude and include parametric uncertainty. Although final abundance was very low for all sets of simulations, estimated extinction risk was much greater for the simulation that incorporated parametric uncertainty in the replication loop. Decisions about species conservation (e.g., listing, delisting, and jeopardy) might differ greatly depending on the treatment of parametric uncertainty in population models.
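    The two-step simulation structure (parametric uncertainty drawn once per replicate, temporal variance drawn once per time step) can be sketched as follows; all demographic numbers here are hypothetical placeholders, not piping plover estimates:

```python
import numpy as np

rng = np.random.default_rng(1)
n_reps, n_years = 500, 20

# Hypothetical growth-rate estimate with parametric uncertainty (made-up values)
lam_hat, lam_se = 0.90, 0.03    # point estimate and its standard error
temporal_sd = 0.10              # year-to-year environmental variation
n0, quasi_ext = 100.0, 10.0     # initial abundance and quasi-extinction threshold

extinct = 0
for _ in range(n_reps):
    # Replication loop: one growth-rate draw per replicate (parametric uncertainty)
    lam_i = rng.normal(lam_hat, lam_se)
    n = n0
    for _ in range(n_years):
        # Time-step loop: temporal variance around this replicate's growth rate
        n *= rng.normal(lam_i, temporal_sd)
        if n < quasi_ext:
            extinct += 1
            break

p_ext = extinct / n_reps
print(f"quasi-extinction probability: {p_ext:.2f}")
```

Moving the `lam_i` draw inside the time loop instead would treat parameter uncertainty as if it averaged out over time, which is exactly the underestimation of risk the paper warns about.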

  14. Uncertainties

    Indian Academy of Sciences (India)

    The imperfect understanding of some of the processes and physics in the carbon cycle and chemistry models generate uncertainties in the conversion of emissions to concentration. To reflect this uncertainty in the climate scenarios, the use of AOGCMs that explicitly simulate the carbon cycle and chemistry of all the ...

  15. Modelling of Transport Projects Uncertainties

    DEFF Research Database (Denmark)

    Salling, Kim Bang; Leleur, Steen

    2012-01-01

    This paper proposes a new way of handling the uncertainties present in transport decision making based on infrastructure appraisals. The paper suggests to combine the principle of Optimism Bias, which depicts the historical tendency of overestimating transport related benefits and underestimating......-based graphs which functions as risk-related decision support for the appraised transport infrastructure project. The presentation of RSF is demonstrated by using an appraisal case concerning a new airfield in the capital of Greenland, Nuuk....

  16. Model uncertainties in top-quark physics

    CERN Document Server

    Seidel, Markus

    2014-01-01

    The ATLAS and CMS collaborations at the Large Hadron Collider (LHC) are studying the top quark in pp collisions at 7 and 8 TeV. Due to the large integrated luminosity, precision measurements of production cross-sections and properties are often limited by systematic uncertainties. An overview of the modeling uncertainties for simulated events is given in this report.

  17. Urban drainage models - making uncertainty analysis simple

    DEFF Research Database (Denmark)

    Vezzaro, Luca; Mikkelsen, Peter Steen; Deletic, Ana

    2012-01-01

    There is increasing awareness about uncertainties in modelling of urban drainage systems and, as such, many new methods for uncertainty analyses have been developed. Despite this, all available methods have limitations which restrict their widespread application among practitioners. Here, a modif...

  18. Development of Property Models with Uncertainty Estimate for Process Design under Uncertainty

    DEFF Research Database (Denmark)

    Hukkerikar, Amol; Sarup, Bent; Abildskov, Jens

    that can also provide estimates of uncertainties in predictions of properties and their effects on process design becomes necessary. For instance, the accuracy of design of distillation column to achieve a given product purity is dependent on many pure compound properties such as critical pressure...... of formation, standard enthalpy of fusion, standard enthalpy of vaporization at 298 K and at the normal boiling point, entropy of vaporization at the normal boiling point, surface tension at 298 K, viscosity at 300 K, flash point, auto ignition temperature, Hansen solubility parameters, Hildebrand solubility....... The comparison of model prediction uncertainties with reported range of measurement uncertainties is presented for the properties with related available data. The application of the developed methodology to quantify the effect of these uncertainties on the design of different unit operations (distillation column...

  19. Uncertainty Quantification in Control Problems for Flocking Models

    Directory of Open Access Journals (Sweden)

    Giacomo Albi

    2015-01-01

    Full Text Available The optimal control of flocking models with random inputs is investigated from a numerical point of view. The effect of uncertainty in the interaction parameters is studied for a Cucker-Smale type model using a generalized polynomial chaos (gPC) approach. Numerical evidence of threshold effects in the alignment dynamics due to the random parameters is given. The use of a selective model predictive control permits steering of the system towards the desired state even in unstable regimes.

  20. Assimilating multi-source uncertainties of a parsimonious conceptual hydrological model using hierarchical Bayesian modeling

    Science.gov (United States)

    Wei Wu; James Clark; James Vose

    2010-01-01

    Hierarchical Bayesian (HB) modeling allows for multiple sources of uncertainty by factoring complex relationships into conditional distributions that can be used to draw inference and make predictions. We applied an HB model to estimate the parameters and state variables of a parsimonious hydrological model – GR4J – by coherently assimilating the uncertainties from the...

  1. Uncertainty calculation for modal parameters used with stochastic subspace identification: an application to a bridge structure

    Science.gov (United States)

    Hsu, Wei-Ting; Loh, Chin-Hsiung; Chao, Shu-Hsien

    2015-03-01

    The stochastic subspace identification (SSI) method has been proven to be an efficient algorithm for the identification of linear time-invariant systems using multivariate measurements. Generally, the modal parameters estimated through SSI may be afflicted with statistical uncertainty, e.g. from undefined measurement noise, non-stationary excitation, or a finite number of data samples; the identified results are therefore subject to variance errors. Accordingly, the concept of the stabilization diagram can help users identify the correct model, i.e. by removing the spurious modes. Modal parameters are estimated at successive model orders, where the physical modes of the system are extracted and separated from the spurious modes. In addition, an uncertainty computation scheme was derived for the calculation of uncertainty bounds for modal parameters at a given model order. The uncertainty bounds of damping ratios are particularly interesting, as the estimation of damping ratios is difficult. In this paper, an automated stochastic subspace identification algorithm is addressed. First, the identification of modal parameters through covariance-driven SSI from output-only measurements is discussed, and a systematic investigation of the criteria for the stabilization diagram is presented. Secondly, an automated algorithm for post-processing the stabilization diagram is demonstrated. Finally, the computation of uncertainty bounds for each mode at all model orders in the stabilization diagram is used to determine the system's natural frequencies and damping ratios. This study is demonstrated on the system identification of a three-span steel bridge under operational conditions. It is shown that the proposed operation procedure for automated covariance-driven stochastic subspace identification can enhance robustness and reliability in structural health monitoring.

  2. Bayesian models for comparative analysis integrating phylogenetic uncertainty

    Science.gov (United States)

    2012-01-01

    Background Uncertainty in comparative analyses can come from at least two sources: a) phylogenetic uncertainty in the tree topology or branch lengths, and b) uncertainty due to intraspecific variation in trait values, either due to measurement error or natural individual variation. Most phylogenetic comparative methods do not account for such uncertainties. Not accounting for these sources of uncertainty leads to false perceptions of precision (confidence intervals will be too narrow) and inflated significance in hypothesis testing (e.g. p-values will be too small). Although there is some application-specific software for fitting Bayesian models accounting for phylogenetic error, more general and flexible software is desirable. Methods We developed models to directly incorporate phylogenetic uncertainty into a range of analyses that biologists commonly perform, using a Bayesian framework and Markov Chain Monte Carlo analyses. Results We demonstrate applications in linear regression, quantification of phylogenetic signal, and measurement error models. Phylogenetic uncertainty was incorporated by applying a prior distribution for the phylogeny, where this distribution consisted of the posterior tree sets from Bayesian phylogenetic tree estimation programs. The models were analysed using simulated data sets, and applied to a real data set on plant traits, from rainforest plant species in Northern Australia. Analyses were performed using the free and open source software OpenBUGS and JAGS. Conclusions Incorporating phylogenetic uncertainty through an empirical prior distribution of trees leads to more precise estimation of regression model parameters than using a single consensus tree and enables a more realistic estimation of confidence intervals. In addition, models incorporating measurement errors and/or individual variation, in one or both variables, are easily formulated in the Bayesian framework. We show that BUGS is a useful, flexible general purpose tool for

  3. Uncertainty Assessment in Long Term Urban Drainage Modelling

    DEFF Research Database (Denmark)

    Thorndahl, Søren

    on the rainfall inputs. In order to handle the uncertainties three different stochastic approaches are investigated applying a case catchment in the town Frejlev: (1) a reliability approach in which a parameterization of the rainfall input is conducted in order to generate synthetic rainfall events and find...... return periods, and even within the return periods specified in the design criteria. If urban drainage models are based on standard parameters and hence not calibrated, the uncertainties are even larger. The greatest uncertainties are shown to be the rainfall input and the assessment of the contributing...

  4. A Procedure for Characterizing the Range of Input Uncertainty Parameters by the Use of FFTBM

    International Nuclear Information System (INIS)

    Petruzzi, A.; Kovtonyuk, A.; Raucci, M.; De Luca, D.; Veronese, F.; D'Auria, F.

    2013-01-01

    In the last years various methodologies were proposed to evaluate the uncertainty of Best Estimate (BE) code predictions. The most used method at the industrial level is based upon the selection of input uncertain parameters, on assigning related ranges of variation and Probability Distribution Functions (PDFs), and on performing a suitable number of code runs to get the combined effect of the variations on the results. A procedure to characterize the variation ranges of the input uncertain parameters is proposed in the paper in place of the usual approach based (mostly) on engineering judgment. The procedure is based on the use of the Fast Fourier Transform Based Method (FFTBM), already part of the Uncertainty Method based on the Accuracy Extrapolation (UMAE) method and extensively used in several international frameworks. The FFTBM was originally developed to answer questions like 'For how long should improvements be added to the system thermal-hydraulic code model? How many simplifications can be introduced, and how can an objective comparison be conducted?'. The method, easy to understand, convenient to use and user independent, clearly indicates when simulation needs to be improved. The procedure developed for characterizing the range of input uncertainty parameters involves the following main aspects: a) One single input parameter shall not be 'responsible' for the entire error |exp-calc|, except in exceptional situations to be evaluated case by case; b) Initial guesses for the Max and Min of variation ranges are based on the usual (adopted) expertise; c) More than one experiment can be used for each NPP and each scenario. Highly influential parameters are expected to be the same. The bounding ranges should be considered for the NPP uncertainty analysis; d) A database of suitable uncertainty input parameters can be created for each NPP and each transient scenario. (authors)

  5. Response model parameter linking

    NARCIS (Netherlands)

    Barrett, M.L.D.

    2015-01-01

    With a few exceptions, the problem of linking item response model parameters from different item calibrations has been conceptualized as an instance of the problem of equating observed scores on different test forms. This thesis argues, however, that the use of item response models does not require

  6. Network optimization including gas lift and network parameters under subsurface uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Schulze-Riegert, R.; Baffoe, J.; Pajonk, O. [SPT Group GmbH, Hamburg (Germany); Badalov, H.; Huseynov, S. [Technische Univ. Clausthal, Clausthal-Zellerfeld (Germany). ITE; Trick, M. [SPT Group, Calgary, AB (Canada)

    2013-08-01

    Optimization of oil and gas field production systems poses a great challenge to field development due to complex and multiple interactions between various operational design parameters and subsurface uncertainties. Conventional analytical methods are capable of finding local optima based on single deterministic models. They are less applicable for efficiently generating alternative design scenarios in a multi-objective context. Practical implementations of robust optimization workflows integrate the evaluation of alternative design scenarios and multiple realizations of subsurface uncertainty descriptions. Production or economic performance indicators such as NPV (Net Present Value) are linked to a risk-weighted objective function definition to guide the optimization processes. This work focuses on an integrated workflow using a reservoir-network simulator coupled to an optimization framework. The work will investigate the impact of design parameters while considering the physics of the reservoir, wells, and surface facilities. Subsurface uncertainties are described by well parameters such as inflow performance. Experimental design methods are used to investigate parameter sensitivities and interactions. Optimization methods are used to find optimal design parameter combinations which improve key performance indicators of the production network system. The proposed workflow will be applied to a representative oil reservoir coupled to a network which is modelled by an integrated reservoir-network simulator. Gas-lift will be included as an explicit measure to improve production. An objective function will be formulated for the net present value of the integrated system including production revenue and facility costs. Facility and gas lift design parameters are tuned to maximize NPV. Well inflow performance uncertainties are introduced with an impact on gas lift performance. 
Resulting variances on NPV are identified as a risk measure for the optimized system design. A

  7. Photovoltaic System Modeling. Uncertainty and Sensitivity Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, Clifford W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Martin, Curtis E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

    We report an uncertainty and sensitivity analysis for modeling AC energy from photovoltaic systems. Output from a PV system is predicted by a sequence of models. We quantify uncertainty in the output of each model using empirical distributions of each model's residuals. We propagate uncertainty through the sequence of models by sampling these distributions to obtain an empirical distribution of a PV system's output. We consider models that: (1) translate measured global horizontal, direct and global diffuse irradiance to plane-of-array irradiance; (2) estimate effective irradiance; (3) predict cell temperature; (4) estimate DC voltage, current and power; (5) reduce DC power for losses due to inefficient maximum power point tracking or mismatch among modules; and (6) convert DC to AC power. Our analysis considers a notional PV system comprising an array of FirstSolar FS-387 modules and a 250 kW AC inverter; we use measured irradiance and weather at Albuquerque, NM. We found the uncertainty in PV system output to be relatively small, on the order of 1% for daily energy. We found that uncertainty in the models for POA irradiance and effective irradiance are the dominant contributors to uncertainty in predicted daily energy. Our analysis indicates that efforts to reduce the uncertainty in PV system output predictions may yield the greatest improvements by focusing on the POA and effective irradiance models.
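    The residual-resampling propagation can be sketched with a toy two-stage model chain; the residual spreads, model coefficients, and input value below are made-up placeholders, not the report's models or data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Empirical residuals of each model stage (made-up spreads, e.g. from validation data)
poa_residuals = rng.normal(0.0, 10.0, 365)    # W/m^2: plane-of-array model errors
ac_residuals = rng.normal(0.0, 2.0, 365)      # kW: power-model errors

def poa_model(ghi):    # stand-in translation of measured irradiance to POA
    return 1.1 * ghi

def power_model(poa):  # stand-in irradiance-to-AC-power model
    return 0.2 * poa

n = 10_000
ghi = 800.0            # measured global horizontal irradiance, W/m^2

# Propagate uncertainty by resampling each stage's residual distribution
poa = poa_model(ghi) + rng.choice(poa_residuals, n)
ac = power_model(poa) + rng.choice(ac_residuals, n)

print(f"AC power: {ac.mean():.1f} +/- {ac.std(ddof=1):.1f} kW")
```

Because each stage's residuals are resampled empirically rather than assumed Gaussian, the propagated output distribution inherits whatever skew or heavy tails the validation residuals actually show.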

  8. Model Uncertainty Quantification Methods In Data Assimilation

    Science.gov (United States)

    Pathiraja, S. D.; Marshall, L. A.; Sharma, A.; Moradkhani, H.

    2017-12-01

    Data Assimilation involves utilising observations to improve model predictions in a seamless and statistically optimal fashion. Its applications are wide-ranging, from improving weather forecasts to tracking targets such as in the Apollo 11 mission. The use of Data Assimilation methods in high-dimensional complex geophysical systems is an active area of research, where there exist many opportunities to enhance existing methodologies. One of the central challenges is model uncertainty quantification; the outcome of any Data Assimilation study is strongly dependent on the uncertainties assigned to both observations and models. I focus on developing improved model uncertainty quantification methods that are applicable to challenging real-world scenarios. These include developing methods for cases where the system states are only partially observed, where there is little prior knowledge of the model errors, and where the model error statistics are likely to be highly non-Gaussian.

  9. Sensitivity of Asteroid Impact Risk to Uncertainty in Asteroid Properties and Entry Parameters

    Science.gov (United States)

    Wheeler, Lorien; Mathias, Donovan; Dotson, Jessie L.; NASA Asteroid Threat Assessment Project

    2017-10-01

    A central challenge in assessing the threat posed by asteroids striking Earth is the large amount of uncertainty inherent throughout all aspects of the problem. Many asteroid properties are not well characterized and can range widely from strong, dense, monolithic irons to loosely bound, highly porous rubble piles. Even for an object of known properties, the specific entry velocity, angle, and impact location can swing the potential consequence from no damage to causing millions of casualties. Due to the extreme rarity of large asteroid strikes, there are also large uncertainties in how different types of asteroids will interact with the atmosphere during entry, how readily they may break up or ablate, and how much surface damage will be caused by the resulting airbursts or impacts. In this work, we use our Probabilistic Asteroid Impact Risk (PAIR) model to investigate the sensitivity of asteroid impact damage to uncertainties in key asteroid properties, entry parameters, or modeling assumptions. The PAIR model combines physics-based analytic models of asteroid entry and damage in a probabilistic Monte Carlo framework to assess the risk posed by a wide range of potential impacts. The model samples from uncertainty distributions of asteroid properties and entry parameters to generate millions of specific impact cases, and models the atmospheric entry and damage for each case, including blast overpressure, thermal radiation, tsunami inundation, and global effects. To assess the risk sensitivity, we alternately fix and vary the different input parameters and compare the effect on the resulting range of damage produced. The goal of these studies is to help guide future efforts in asteroid characterization and model refinement by determining which properties most significantly affect the potential risk.
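    The fix-and-vary sensitivity comparison can be sketched with a toy damage function; the function, distributions, and nominal values below are illustrative assumptions, not the PAIR model:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 20_000

def damage(diameter, velocity, angle):
    """Toy stand-in for an entry/damage model (illustrative scaling only)."""
    return diameter**3 * velocity**2 * np.sin(np.radians(angle))

# Full uncertainty: sample every input from its (assumed) distribution
d = rng.uniform(50.0, 150.0, n)    # diameter, m
v = rng.uniform(12.0, 25.0, n)     # entry velocity, km/s
a = rng.uniform(15.0, 90.0, n)     # entry angle, deg
full = damage(d, v, a)

# Sensitivity check: fix one input at its nominal value, vary the rest
fixed_d = damage(100.0, v, a)

print(f"output spread, all inputs varied: {full.std(ddof=1):.3g}")
print(f"output spread, diameter fixed:    {fixed_d.std(ddof=1):.3g}")
```

Comparing the output spread with each input alternately fixed shows which property's uncertainty dominates the damage range, which is the logic behind prioritizing characterization efforts.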

  10. An Iterative Uncertainty Assessment Technique for Environmental Modeling

    International Nuclear Information System (INIS)

    Engel, David W.; Liebetrau, Albert M.; Jarman, Kenneth D.; Ferryman, Thomas A.; Scheibe, Timothy D.; Didier, Brett T.

    2004-01-01

    The reliability of and confidence in predictions from model simulations are crucial: these predictions can significantly affect risk assessment decisions. For example, the fate of contaminants at the U.S. Department of Energy's Hanford Site has critical impacts on long-term waste management strategies. In the uncertainty estimation efforts for the Hanford Site-Wide Groundwater Modeling program, computational issues severely constrain both the number of uncertain parameters that can be considered and the degree of realism that can be included in the models. Substantial improvements in the overall efficiency of uncertainty analysis are needed to fully explore and quantify significant sources of uncertainty. We have combined state-of-the-art statistical and mathematical techniques in a unique iterative, limited sampling approach to efficiently quantify both local and global prediction uncertainties resulting from model input uncertainties. The approach is designed for application to widely diverse problems across multiple scientific domains. Results are presented for both an analytical model where the response surface is 'known' and a simplified contaminant fate transport and groundwater flow model. The results show that our iterative method for approximating a response surface (for subsequent calculation of uncertainty estimates) of specified precision requires less computing time than traditional approaches based upon noniterative sampling methods.
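    A minimal version of the limited-sampling response-surface idea, assuming a toy one-dimensional model and a simple quadratic fit; the iterative refinement and statistical machinery of the actual approach are omitted:

```python
import numpy as np

rng = np.random.default_rng(3)

def expensive_model(x):
    """Stand-in for a costly simulation with one uncertain input."""
    return np.exp(-0.5 * x) + 0.1 * x**2

# Limited sampling: only a handful of true model runs
x_train = np.linspace(-2.0, 2.0, 7)
y_train = expensive_model(x_train)

# Fit a cheap quadratic response surface to those runs
surrogate = np.poly1d(np.polyfit(x_train, y_train, 2))

# Use the surrogate for a large Monte Carlo uncertainty estimate
x_mc = rng.normal(0.0, 0.5, 100_000)
y_mc = surrogate(x_mc)
print(f"predicted output: {y_mc.mean():.3f} +/- {y_mc.std(ddof=1):.3f}")
```

The computational saving comes from running the expensive model only at the training points; the iterative element of the actual method would add runs where the surrogate's precision is insufficient.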

  11. Uncertainty Estimation in SiGe HBT Small-Signal Modeling

    DEFF Research Database (Denmark)

    Masood, Syed M.; Johansen, Tom Keinicke; Vidkjær, Jens

    2005-01-01

    An uncertainty estimation and sensitivity analysis is performed on multi-step de-embedding for SiGe HBT small-signal modeling. The uncertainty estimation, in combination with an uncertainty model for deviations in measured S-parameters, quantifies the possible error value in de-embedded two-port param...

  12. Uncertainty modeling process for semantic technology

    Directory of Open Access Journals (Sweden)

    Rommel N. Carvalho

    2016-08-01

    The ubiquity of uncertainty across application domains generates a need for principled support for uncertainty management in semantically aware systems. A probabilistic ontology provides constructs for representing uncertainty in domain ontologies. While the literature has been growing on formalisms for representing uncertainty in ontologies, there remains little guidance in the knowledge engineering literature for how to design probabilistic ontologies. To address this gap, this paper presents the Uncertainty Modeling Process for Semantic Technology (UMP-ST), a new methodology for modeling probabilistic ontologies. To explain how the methodology works and to verify that it can be applied to different scenarios, this paper describes step-by-step the construction of a proof-of-concept probabilistic ontology. The resulting domain model can be used to support identification of fraud in public procurements in Brazil. While the case study illustrates the development of a probabilistic ontology in the PR-OWL probabilistic ontology language, the methodology is applicable to any ontology formalism that properly integrates uncertainty with domain semantics.

  13. Uncertainty in the determination of soil hydraulic parameters and its influence on the performance of two hydrological models of different complexity

    NARCIS (Netherlands)

    Baroni, G.; Facchi, A.; Gandolfi, C.; Ortuani, B.; Horeschi, D.; Dam, van J.C.

    2010-01-01

    Data on soil hydraulic properties often form a limiting factor in unsaturated zone modelling, especially at the larger scales. Investigations for the hydraulic characterization of soils are time-consuming and costly, and the accuracy of the results obtained by the different methodologies is still

  14. Volcano deformation source parameters estimated from InSAR: Sensitivities to uncertainties in seismic tomography

    Science.gov (United States)

    Masterlark, Timothy; Donovan, Theodore; Feigl, Kurt L.; Haney, Matt; Thurber, Clifford H.; Tung, Sui

    2016-01-01

    The eruption cycle of a volcano is controlled in part by the upward migration of magma. The characteristics of the magma flux produce a deformation signature at the Earth's surface. Inverse analyses use geodetic data to estimate strategic controlling parameters that describe the position and pressurization of a magma chamber at depth. The specific distribution of material properties controls how observed surface deformation translates to source parameter estimates. Seismic tomography models describe the spatial distributions of material properties that are necessary for accurate models of volcano deformation. This study investigates how uncertainties in seismic tomography models propagate into variations in the estimates of volcano deformation source parameters inverted from geodetic data. We conduct finite element model-based nonlinear inverse analyses of interferometric synthetic aperture radar (InSAR) data for Okmok volcano, Alaska, as an example. We then analyze the estimated parameters and their uncertainties to characterize the magma chamber. Analyses are performed separately for models simulating a pressurized chamber embedded in a homogeneous domain as well as for a domain having a heterogeneous distribution of material properties according to seismic tomography. The estimated depth of the source is sensitive to the distribution of material properties. The estimated depths for the homogeneous and heterogeneous domains are 2666 ± 42 and 3527 ± 56 m below mean sea level, respectively (99% confidence). A Monte Carlo analysis indicates that uncertainties of the seismic tomography cannot account for this discrepancy at the 99% confidence level. Accounting for the spatial distribution of elastic properties according to seismic tomography significantly improves the fit of the deformation model predictions and significantly influences estimates for parameters that describe the location of a pressurized magma chamber.

  15. Reducing uncertainty based on model fitness: Application to a ...

    African Journals Online (AJOL)

    A weakness of global sensitivity and uncertainty analysis methodologies is the often subjective definition of prior parameter probability distributions, especially ... The reservoir representing the central part of the wetland, where flood waters separate into several independent distributaries, is a keystone area within the model.

  16. Meteorological uncertainty of atmospheric dispersion model results (MUD)

    International Nuclear Information System (INIS)

    Havskov Soerensen, J.; Amstrup, B.; Feddersen, H.

    2013-08-01

    The MUD project addresses assessment of uncertainties of atmospheric dispersion model predictions, as well as possibilities for optimum presentation to decision makers. Previously, it has not been possible to estimate such uncertainties quantitatively, but merely to calculate the 'most likely' dispersion scenario. However, recent developments in numerical weather prediction (NWP) include probabilistic forecasting techniques, which can be utilised also for long-range atmospheric dispersion models. The ensemble statistical methods developed and applied to NWP models aim at describing the inherent uncertainties of the meteorological model results. These uncertainties stem from e.g. limits in meteorological observations used to initialise meteorological forecast series. By perturbing e.g. the initial state of an NWP model run in agreement with the available observational data, an ensemble of meteorological forecasts is produced from which uncertainties in the various meteorological parameters are estimated, e.g. probabilities for rain. Corresponding ensembles of atmospheric dispersion can now be computed from which uncertainties of predicted radionuclide concentration and deposition patterns can be derived. (Author)

  17. Modelling of data uncertainties on hybrid computers

    International Nuclear Information System (INIS)

    Schneider, Anke

    2016-06-01

    The codes d³f and r³t are well established for modelling density-driven flow and nuclide transport in the far field of repositories for hazardous material in deep geological formations. They are applicable in porous media as well as in fractured rock or mudstone, for modelling salt- and heat transport as well as a free groundwater surface. Development of the basic framework of d³f and r³t had begun more than 20 years ago. Since that time significant advancements took place in the requirements for safety assessment as well as in computer hardware development. The period of safety assessment for a repository of high-level radioactive waste was extended to 1 million years, and the complexity of the models is steadily growing. Concurrently, the demands on accuracy increase. Additionally, model and parameter uncertainties become more and more important for an increased understanding of prediction reliability. All this leads to a growing demand for computational power that requires a considerable software speed-up. An effective way to achieve this is the use of modern, hybrid computer architectures, which requires basically the set-up of new data structures and a corresponding code revision but offers a potential speed-up by several orders of magnitude. The original codes d³f and r³t were applications of the software platform UG /BAS 94/, whose development had begun in the early nineteen-nineties. However, UG had recently been advanced to the C++ based, substantially revised version UG4 /VOG 13/. To benefit also in the future from state-of-the-art numerical algorithms and to use hybrid computer architectures, the codes d³f and r³t were transferred to this new code platform. Making use of the fact that coupling between different sets of equations is natively supported in UG4, d³f and r³t were combined into one conjoint code d³f++.
A direct estimation of uncertainties for complex groundwater flow models with the help of Monte Carlo simulations will not be

  18. Modelling of data uncertainties on hybrid computers

    Energy Technology Data Exchange (ETDEWEB)

    Schneider, Anke (ed.)

    2016-06-15

    The codes d³f and r³t are well established for modelling density-driven flow and nuclide transport in the far field of repositories for hazardous material in deep geological formations. They are applicable in porous media as well as in fractured rock or mudstone, for modelling salt- and heat transport as well as a free groundwater surface. Development of the basic framework of d³f and r³t had begun more than 20 years ago. Since that time significant advancements took place in the requirements for safety assessment as well as in computer hardware development. The period of safety assessment for a repository of high-level radioactive waste was extended to 1 million years, and the complexity of the models is steadily growing. Concurrently, the demands on accuracy increase. Additionally, model and parameter uncertainties become more and more important for an increased understanding of prediction reliability. All this leads to a growing demand for computational power that requires a considerable software speed-up. An effective way to achieve this is the use of modern, hybrid computer architectures, which requires basically the set-up of new data structures and a corresponding code revision but offers a potential speed-up by several orders of magnitude. The original codes d³f and r³t were applications of the software platform UG /BAS 94/, whose development had begun in the early nineteen-nineties. However, UG had recently been advanced to the C++ based, substantially revised version UG4 /VOG 13/. To benefit also in the future from state-of-the-art numerical algorithms and to use hybrid computer architectures, the codes d³f and r³t were transferred to this new code platform. Making use of the fact that coupling between different sets of equations is natively supported in UG4, d³f and r³t were combined into one conjoint code d³f++.
A direct estimation of uncertainties for complex groundwater flow models with the

  19. Evaluation of uncertainties in selected environmental dispersion models

    International Nuclear Information System (INIS)

    Little, C.A.; Miller, C.W.

    1979-01-01

    Compliance with standards of radiation dose to the general public has necessitated the use of dispersion models to predict radionuclide concentrations in the environment due to releases from nuclear facilities. Because these models are only approximations of reality and because of inherent variations in the input parameters used in these models, their predictions are subject to uncertainty. Quantification of this uncertainty is necessary to assess the adequacy of these models for use in determining compliance with protection standards. This paper characterizes the capabilities of several dispersion models to predict accurately pollutant concentrations in environmental media. Three types of models are discussed: aquatic or surface water transport models, atmospheric transport models, and terrestrial and aquatic food chain models. Using data published primarily by model users, model predictions are compared to observations

  20. Modeling Uncertainty in Climate Change: A Multi-Model Comparison

    Energy Technology Data Exchange (ETDEWEB)

    Gillingham, Kenneth; Nordhaus, William; Anthoff, David; Blanford, Geoffrey J.; Bosetti, Valentina; Christensen, Peter; McJeon, Haewon C.; Reilly, J. M.; Sztorc, Paul

    2015-10-01

    The economics of climate change involves a vast array of uncertainties, complicating both the analysis and development of climate policy. This study presents the results of the first comprehensive study of uncertainty in climate change using multiple integrated assessment models. The study looks at model and parametric uncertainties for population, total factor productivity, and climate sensitivity and estimates the pdfs of key output variables, including CO2 concentrations, temperature, damages, and the social cost of carbon (SCC). One key finding is that parametric uncertainty is more important than uncertainty in model structure. Our resulting pdfs also provide insight on tail events.

  1. Determination of a PWR key neutron parameters uncertainties and conformity studies applications

    International Nuclear Information System (INIS)

    Bernard, D.

    2002-01-01

    The aim of this thesis was to evaluate the uncertainties of key neutron parameters of slab reactors. These uncertainties have many origins: technological, for fabrication parameters, and physical, for nuclear data. First, each contribution to the uncertainty is calculated, and finally an uncertainty factor is associated with each key slab parameter, such as reactivity, isothermal reactivity coefficient, control rod efficiency, power form factor before irradiation, and lifetime. These uncertainty factors were computed by Generalized Perturbation Theory at step 0 and by direct calculations for irradiation problems. One application of neutronics conformity concerned the adjustment of precision targets for fabrication and nuclear data. Both statistical (uncertainties) and deterministic (deviations) approaches were studied. The uncertainties of the key slab neutronics parameters were thereby reduced, and nuclear performance was optimised. (author)

  2. Quantification of parameter uncertainty for robust control of shape memory alloy bending actuators

    International Nuclear Information System (INIS)

    Crews, John H; McMahan, Jerry A; Smith, Ralph C; Hannen, Jennifer C

    2013-01-01

    In this paper, we employ Bayesian parameter estimation techniques to derive gains for robust control of smart materials. Specifically, we demonstrate the feasibility of utilizing parameter uncertainty estimation provided by Markov chain Monte Carlo (MCMC) methods to determine controller gains for a shape memory alloy bending actuator. We treat the parameters in the equations governing the actuator’s temperature dynamics as uncertain and use the MCMC method to construct the probability densities for these parameters. The densities are then used to derive parameter bounds for robust control algorithms. For illustrative purposes, we construct a sliding mode controller based on the homogenized energy model and experimentally compare its performance to a proportional-integral controller. While sliding mode control is used here, the techniques described in this paper provide a useful starting point for many robust control algorithms. (paper)
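The pipeline of turning MCMC posterior samples into bounds for a robust controller can be sketched with a toy one-parameter stand-in for the temperature dynamics. The model, prior, noise level, and step size below are all illustrative assumptions, not the homogenized energy model of the paper:

```python
import math
import random

random.seed(1)

# Toy stand-in: a single uncertain cooling coefficient h in an exponential
# decay, with synthetic noisy "measurements" of the cooling curve.
h_true, sigma = 0.8, 0.05
data = [math.exp(-h_true * t) + random.gauss(0, sigma) for t in range(10)]

def log_post(h):
    """Unnormalized log-posterior: flat positive prior + Gaussian likelihood."""
    if h <= 0:
        return float("-inf")
    return -sum((y - math.exp(-h * t)) ** 2 for t, y in enumerate(data)) / (2 * sigma**2)

# Random-walk Metropolis sampler
samples, h = [], 1.0
lp = log_post(h)
for _ in range(20_000):
    cand = h + random.gauss(0, 0.05)
    lp_cand = log_post(cand)
    if lp_cand >= lp or random.random() < math.exp(lp_cand - lp):
        h, lp = cand, lp_cand
    samples.append(h)

post = sorted(samples[5_000:])            # discard burn-in
lo = post[int(0.005 * len(post))]         # 99% credible bounds: in the paper's
hi = post[int(0.995 * len(post))]         # scheme, such bounds parameterize the
print(f"99% bounds for h: [{lo:.3f}, {hi:.3f}]")  # robust (sliding mode) controller
```

The credible interval [lo, hi] plays the role of the parameter bounds that the robust control law must tolerate.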

  3. Uncertainty propagation through dynamic models of assemblies of mechanical structures

    International Nuclear Information System (INIS)

    Daouk, Sami

    2016-01-01

    When studying the behaviour of mechanical systems, mathematical models and structural parameters are usually considered deterministic. Experience shows, however, that these elements are uncertain in most cases, due to natural variability or lack of knowledge. Therefore, quantifying the quality and reliability of the numerical model of an industrial assembly remains a major question in low-frequency dynamics. The purpose of this thesis is to improve the vibratory design of bolted assemblies by setting up a dynamic connector model that accounts for different types and sources of uncertainty in stiffness parameters, in a way that is simple, efficient and exploitable in an industrial context. This work has been carried out in the framework of the SICODYN project, led by EDF R and D, which aims to characterise and quantify, numerically and experimentally, the uncertainties in the dynamic behaviour of bolted industrial assemblies. Comparative studies of several numerical methods of uncertainty propagation demonstrate the advantage of using the Lack-Of-Knowledge theory. An experimental characterisation of uncertainties in bolted structures is performed on a dynamic test rig and on an industrial assembly. The propagation of many small and large uncertainties through different dynamic models of mechanical assemblies leads to an assessment of the efficiency of the Lack-Of-Knowledge theory and its applicability in an industrial environment. (author)

  4. Spatial Uncertainty Analysis of Ecological Models

    Energy Technology Data Exchange (ETDEWEB)

    Jager, H.I.; Ashwood, T.L.; Jackson, B.L.; King, A.W.

    2000-09-02

    The authors evaluated the sensitivity of a habitat model and a source-sink population model to spatial uncertainty in landscapes with different statistical properties and for hypothetical species with different habitat requirements. Sequential indicator simulation generated alternative landscapes from a source map. Their results showed that spatial uncertainty was highest for landscapes in which suitable habitat was rare and spatially uncorrelated. Although they were able to exert some control over the degree of spatial uncertainty by varying the sampling density drawn from the source map, intrinsic spatial properties (i.e., average frequency and degree of spatial autocorrelation) played a dominant role in determining variation among realized maps. To evaluate the ecological significance of landscape variation, they compared the variation in predictions from a simple habitat model to variation among landscapes for three species types. Spatial uncertainty in predictions of the amount of source habitat depended on both the spatial life history characteristics of the species and the statistical attributes of the synthetic landscapes. Species differences were greatest when the landscape contained a high proportion of suitable habitat. The predicted amount of source habitat was greater for edge-dependent (interior) species in landscapes with spatially uncorrelated (correlated) suitable habitat. A source-sink model demonstrated that, although variation among landscapes resulted in relatively little variation in overall population growth rate, this spatial uncertainty was sufficient, in some situations, to produce qualitatively different predictions about population viability (i.e., population decline vs. increase).

  5. Estimating Coastal Digital Elevation Model (DEM) Uncertainty

    Science.gov (United States)

    Amante, C.; Mesick, S.

    2017-12-01

    Integrated bathymetric-topographic digital elevation models (DEMs) are representations of the Earth's solid surface and are fundamental to the modeling of coastal processes, including tsunami, storm surge, and sea-level rise inundation. Deviations in elevation values from the actual seabed or land surface constitute errors in DEMs, which originate from numerous sources, including: (i) the source elevation measurements (e.g., multibeam sonar, lidar), (ii) the interpolative gridding technique (e.g., spline, kriging) used to estimate elevations in areas unconstrained by source measurements, and (iii) the datum transformation used to convert bathymetric and topographic data to common vertical reference systems. The magnitude and spatial distribution of the errors from these sources are typically unknown, and the lack of knowledge regarding these errors represents the vertical uncertainty in the DEM. The National Oceanic and Atmospheric Administration (NOAA) National Centers for Environmental Information (NCEI) has developed DEMs for more than 200 coastal communities. This study presents a methodology developed at NOAA NCEI to derive accompanying uncertainty surfaces that estimate DEM errors at the individual cell-level. The development of high-resolution (1/9th arc-second), integrated bathymetric-topographic DEMs along the southwest coast of Florida serves as the case study for deriving uncertainty surfaces. The estimated uncertainty can then be propagated into the modeling of coastal processes that utilize DEMs. Incorporating the uncertainty produces more reliable modeling results, and in turn, better-informed coastal management decisions.

  6. Model Uncertainty for Bilinear Hysteretic Systems

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Thoft-Christensen, Palle

    1984-01-01

    is related to the concept of a failure surface (or limit state surface) in the n-dimensional basic variable space then model uncertainty is at least due to the neglected variables, the modelling of the failure surface and the computational technique used. A more precise definition is given in section 2...

  7. Uncertainty quantification in wind farm flow models

    DEFF Research Database (Denmark)

    Murcia Leon, Juan Pablo

    uncertainties through a model chain are presented and applied to several wind energy related problems such as: annual energy production estimation, wind turbine power curve estimation, wake model calibration and validation, and estimation of lifetime equivalent fatigue loads on a wind turbine. Statistical...

  8. A simplified model of choice behavior under uncertainty

    Directory of Open Access Journals (Sweden)

    Ching-Hung Lin

    2016-08-01

    The Iowa Gambling Task (IGT) has been standardized as a clinical assessment tool (Bechara, 2007). Nonetheless, numerous research groups have attempted to modify IGT models to optimize parameters for predicting the choice behavior of normal controls and patients. A decade ago, most researchers considered the expected utility (EU) model (Busemeyer and Stout, 2002) to be the optimal model for predicting choice behavior under uncertainty. However, in recent years, studies have demonstrated the prospect utility (PU) models (Ahn et al., 2008) to be more effective than the EU models in the IGT. Nevertheless, after some preliminary tests, we propose that the Ahn et al. (2008) PU model is not optimal due to some incompatible results between our behavioral and modeling data. This study aims to modify the Ahn et al. (2008) PU model into a simplified model; we collected 145 subjects' IGT performance as the benchmark data for comparison. In our simplified PU model, the best goodness-of-fit was found mostly as α approached zero. More specifically, we retested the key parameters α, λ, and A in the PU model. Notably, the power of influence of the parameters α, λ, and A has a hierarchical order in terms of manipulating the goodness-of-fit in the PU model. Additionally, we found that the parameters λ and A may be ineffective when the parameter α is close to zero in the PU model. The present simplified model demonstrated that decision makers mostly adopted a gain-stay-loss-shift strategy rather than foreseeing the long-term outcome. However, there are still other behavioral variables that are not well revealed under these dynamic uncertainty situations. Therefore, the optimal behavioral models may not have been found. In short, the best model for predicting choice behavior under dynamic-uncertainty situations should be further evaluated.

  9. An educational model for ensemble streamflow simulation and uncertainty analysis

    Directory of Open Access Journals (Sweden)

    A. AghaKouchak

    2013-02-01

    This paper presents the hands-on modeling toolbox HBV-Ensemble, designed as a complement to theoretical hydrology lectures, to teach hydrological processes and their uncertainties. HBV-Ensemble can be used for in-class lab practices and homework assignments, and for assessment of students' understanding of hydrological processes. Using this modeling toolbox, students can gain more insight into how hydrological processes (e.g., precipitation, snowmelt and snow accumulation, soil moisture, evapotranspiration and runoff generation) are interconnected. The educational toolbox includes a MATLAB Graphical User Interface (GUI) and an ensemble simulation scheme that can be used for teaching uncertainty analysis, parameter estimation, ensemble simulation and model sensitivity. HBV-Ensemble was administered in a class for both in-class instruction and a final project, and students submitted their feedback about the toolbox. The results indicate that this educational software had a positive impact on students' understanding and knowledge of uncertainty in hydrological modeling.
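The ensemble-simulation idea behind such a toolbox can be illustrated with a one-bucket linear-reservoir stand-in (HBV itself has several coupled stores, and the toolbox is MATLAB-based; this is a Python sketch with made-up forcing data):

```python
import random

random.seed(4)

# Made-up daily precipitation forcing, mm/day
precip = [5, 0, 12, 3, 0, 0, 8, 1, 0, 0]

def simulate(k, s0=20.0):
    """Linear reservoir: daily runoff Q = k * S from storage S fed by precipitation."""
    s, flows = s0, []
    for p in precip:
        s += p
        q = k * s
        s -= q
        flows.append(q)
    return flows

# Ensemble over the uncertain recession coefficient k: each member is one
# plausible parameterization, and the spread quantifies predictive uncertainty
ensemble = [simulate(random.uniform(0.1, 0.5)) for _ in range(500)]
flows = sorted(run[6] for run in ensemble)   # spread on the day of the 8 mm input
print(f"median {flows[250]:.1f} mm, 90% range [{flows[25]:.1f}, {flows[475]:.1f}] mm")
```

Plotting all 500 hydrographs together, as the toolbox does, makes the link between parameter uncertainty and streamflow uncertainty visually immediate.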

  10. Uncertainty analysis in WWTP model applications: a critical discussion using an example from design

    DEFF Research Database (Denmark)

    Sin, Gürkan; Gernaey, Krist; Neumann, Marc B.

    2009-01-01

    This study focuses on uncertainty analysis of WWTP models and analyzes the issue of framing and how it affects the interpretation of uncertainty analysis results. As a case study, the prediction of uncertainty involved in model-based design of a wastewater treatment plant is studied. The Monte...... to stoichiometric, biokinetic and influent parameters; (2) uncertainty due to hydraulic behaviour of the plant and mass transfer parameters; (3) uncertainty due to the combination of (1) and (2). The results demonstrate that depending on the way the uncertainty analysis is framed, the estimated uncertainty...... of design performance criteria differs significantly. The implication for the practical applications of uncertainty analysis in the wastewater industry is profound: (i) as the uncertainty analysis results are specific to the framing used, the results must be interpreted within the context of that framing......

  11. Uncertainties in model-based outcome predictions for treatment planning

    International Nuclear Information System (INIS)

    Deasy, Joseph O.; Chao, K.S. Clifford; Markman, Jerry

    2001-01-01

    Purpose: Model-based treatment-plan-specific outcome predictions (such as normal tissue complication probability [NTCP] or the relative reduction in salivary function) are typically presented without reference to underlying uncertainties. We provide a method to assess the reliability of treatment-plan-specific dose-volume outcome model predictions. Methods and Materials: A practical method is proposed for evaluating model prediction based on the original input data together with bootstrap-based estimates of parameter uncertainties. The general framework is applicable to continuous variable predictions (e.g., prediction of long-term salivary function) and dichotomous variable predictions (e.g., tumor control probability [TCP] or NTCP). Using bootstrap resampling, a histogram of the likelihood of alternative parameter values is generated. For a given patient and treatment plan we generate a histogram of alternative model results by computing the model predicted outcome for each parameter set in the bootstrap list. Residual uncertainty ('noise') is accounted for by adding a random component to the computed outcome values. The residual noise distribution is estimated from the original fit between model predictions and patient data. Results: The method is demonstrated using a continuous-endpoint model to predict long-term salivary function for head-and-neck cancer patients. Histograms represent the probabilities for the level of posttreatment salivary function based on the input clinical data, the salivary function model, and the three-dimensional dose distribution. For some patients there is significant uncertainty in the prediction of xerostomia, whereas for other patients the predictions are expected to be more reliable. In contrast, TCP and NTCP endpoints are dichotomous, and parameter uncertainties should be folded directly into the estimated probabilities, thereby improving the accuracy of the estimates. Using bootstrap parameter estimates, competing treatment
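The bootstrap step described above can be sketched for a hypothetical continuous endpoint: refit a simple dose-response model on resampled patient data, then collect the spread of plan-specific predictions. All dose and salivary-function numbers below are invented for illustration, and the linear fit stands in for the paper's outcome model:

```python
import random

random.seed(2)

# Hypothetical data: mean parotid dose (Gy) vs relative salivary function
# at one year (fraction of baseline). Purely illustrative numbers.
dose = [5, 10, 15, 20, 25, 30, 35, 40]
func = [0.95, 0.88, 0.80, 0.70, 0.62, 0.50, 0.45, 0.35]

def fit(xs, ys):
    """Ordinary least-squares intercept and slope."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return my - slope * mx, slope

# Bootstrap: refit on resampled pairs, collect the prediction for a plan
# delivering 28 Gy; the histogram of preds is the prediction uncertainty
preds = []
for _ in range(2000):
    idx = [random.randrange(len(dose)) for _ in dose]
    xs = [dose[i] for i in idx]
    if len(set(xs)) < 2:                  # degenerate resample, cannot fit
        continue
    a, b = fit(xs, [func[i] for i in idx])
    preds.append(a + b * 28)

preds.sort()
n = len(preds)
median = preds[n // 2]
band_lo, band_hi = preds[int(0.05 * n)], preds[int(0.95 * n)]
print(f"prediction at 28 Gy: median {median:.2f}, 90% band [{band_lo:.2f}, {band_hi:.2f}]")
```

The paper additionally adds residual "noise" sampled from the fit residuals to each bootstrap prediction; that step is omitted here for brevity.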

  12. Return Predictability, Model Uncertainty, and Robust Investment

    DEFF Research Database (Denmark)

    Lukas, Manuel

    Stock return predictability is subject to great uncertainty. In this paper we use the model confidence set approach to quantify uncertainty about expected utility from investment, accounting for potential return predictability. For monthly US data and six representative return prediction models, we...... find that confidence sets are very wide, change significantly with the predictor variables, and frequently include expected utilities for which the investor prefers not to invest. The latter motivates a robust investment strategy maximizing the minimal element of the confidence set. The robust investor...... allocates a much lower share of wealth to stocks compared to a standard investor....

  13. Uncertainty calculation in transport models and forecasts

    DEFF Research Database (Denmark)

    Manzo, Stefano; Prato, Carlo Giacomo

    Forthcoming: European Journal of Transport and Infrastructure Research, 15-3, 64-72. The last paper examined uncertainty in the spatial composition of residence and workplace locations in the Danish National Transport Model. Despite the evidence that spatial structure influences travel behaviour...... to increase the quality of the decision process and to develop robust or adaptive plans. In fact, project evaluation processes that do not take into account model uncertainty produce not fully informative and potentially misleading results, so increasing the risk inherent to the decision to be taken...

  14. Modeling transport phenomena and uncertainty quantification in solidification processes

    Science.gov (United States)

    Fezi, Kyle S.

    Direct chill (DC) casting is the primary processing route for wrought aluminum alloys. This semicontinuous process consists of primary cooling as the metal is pulled through a water cooled mold followed by secondary cooling with a water jet spray and free falling water. To gain insight into this complex solidification process, a fully transient model of DC casting was developed to predict the transport phenomena of aluminum alloys for various conditions. This model is capable of solving mixture mass, momentum, energy, and species conservation equations during multicomponent solidification. Various DC casting process parameters were examined for their effect on transport phenomena predictions in an alloy of commercial interest (aluminum alloy 7050). The practice of placing a wiper to divert cooling water from the ingot surface was studied and the results showed that placement closer to the mold causes remelting at the surface and increases susceptibility to bleed outs. Numerical models of metal alloy solidification, like the one previously mentioned, are used to gain insight into physical phenomena that cannot be observed experimentally. However, uncertainty in model inputs cause uncertainty in results and those insights. The analysis of model assumptions and probable input variability on the level of uncertainty in model predictions has not been calculated in solidification modeling as yet. As a step towards understanding the effect of uncertain inputs on solidification modeling, uncertainty quantification (UQ) and sensitivity analysis were first performed on a transient solidification model of a simple binary alloy (Al-4.5wt.%Cu) in a rectangular cavity with both columnar and equiaxed solid growth models. This analysis was followed by quantifying the uncertainty in predictions from the recently developed transient DC casting model. The PRISM Uncertainty Quantification (PUQ) framework quantified the uncertainty and sensitivity in macrosegregation, solidification

  15. Uncertainties

    Indian Academy of Sciences (India)

    To reflect this uncertainty in the climate scenarios, the use of AOGCMs that explicitly simulate the carbon cycle and chemistry of all the substances are needed. The Hadley Centre has developed a version of the climate model that allows the effect of climate change on the carbon cycle and its feedback into climate, to be ...

  16. Uncertainty and sensitivity analysis of environmental transport models

    International Nuclear Information System (INIS)

    Margulies, T.S.; Lancaster, L.E.

    1985-01-01

    An uncertainty and sensitivity analysis has been made of the CRAC-2 (Calculations of Reactor Accident Consequences) atmospheric transport and deposition models. Robustness and uncertainty aspects of airborne and ground-deposited material and the relative contributions of input and model parameters were systematically studied. The underlying data structures were investigated using a multiway layout of factors over specified ranges generated via a Latin hypercube sampling scheme. The variables selected in our analysis include: weather bin, dry deposition velocity, rain washout coefficient/rain intensity, duration of release, heat content, sigma-z (vertical) plume dispersion parameter, sigma-y (crosswind) plume dispersion parameter, and mixing height. To determine the contributors to the output variability (versus distance from the site), step-wise regression analyses were performed on transformations of the spatial concentration patterns simulated. 27 references, 2 figures, 3 tables
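The Latin hypercube scheme used above to build the multiway factor layout can be sketched in a few lines. The variable names and ranges below are illustrative stand-ins, not CRAC-2's actual inputs:

```python
import numpy as np

def latin_hypercube(n, d, rng):
    """Draw n points in [0, 1)^d with exactly one point per 1/n stratum
    in every dimension (standard Latin hypercube construction)."""
    u = (rng.random((n, d)) + np.arange(n)[:, None]) / n  # stratified columns
    for j in range(d):
        u[:, j] = rng.permutation(u[:, j])                # decouple dimensions
    return u

rng = np.random.default_rng(0)
unit = latin_hypercube(100, 3, rng)

# Rescale to hypothetical physical ranges (deposition velocity [m/s],
# washout coefficient [1/s], mixing height [m]) -- illustrative only.
lower = np.array([1e-4, 1e-6, 100.0])
upper = np.array([1e-2, 1e-4, 2000.0])
sample = lower + unit * (upper - lower)
```

Each input range is covered evenly even with a modest sample size, which is why stratified schemes like this are preferred over plain random sampling for expensive consequence codes.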

  17. Effect of uncertainty parameters on graphene sheets Young's modulus prediction

    Energy Technology Data Exchange (ETDEWEB)

    Sahlaoui, Habib; Sidhom Habib [University of Tunis, Taha Hussein (Turkey); Guedri, Mohamed [Carthage University, Nabeul (Turkey)

    2013-09-15

    Software based on the molecular structural mechanics approach (MSMA) and using the finite element method (FEM) has been developed to predict the Young's modulus of graphene sheets. The results obtained have been compared to results available in the literature, and good agreement is found when the same values of the uncertainty parameters are used. The sensitivity of the models to their uncertainty parameters has been investigated using a stochastic finite element method (SFEM). The different values of the uncertainty parameters used, such as the molecular mechanics force field constants k{sub r} and k{sub θ}, the thickness (t) of a graphene sheet and the length (L{sub B}) of a carbon-carbon bond, have been collected from the literature. Strong sensitivities of 91% to the thickness and of 21% to the stretching force constant (k{sub r}) are shown. These results explain the large spread in predicted Young's modulus values for graphene sheets and their disagreement with experimental results.

  18. Sensitivities and uncertainties of modeled ground temperatures in mountain environments

    Directory of Open Access Journals (Sweden)

    S. Gubler

    2013-08-01

    discretization parameters. We show that the temporal resolution should be at least 1 h to ensure errors of less than 0.2 °C in modeled MAGT, and that the uppermost ground layer should be at most 20 mm thick. Within the topographic setting, the total parametric output uncertainties, expressed as the length of the 95% uncertainty interval of the Monte Carlo simulations, range from 0.5 to 1.5 °C for clay and silt, and from 0.5 to around 2.4 °C for peat, sand, gravel and rock. These uncertainties are comparable to the variability of ground surface temperatures measured within 10 m × 10 m grids in Switzerland. The increased uncertainties for sand, peat and gravel are largely due to their sensitivity to the hydraulic conductivity.
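The "length of the 95% uncertainty interval" reported above is straightforward to compute from a Monte Carlo ensemble. A minimal sketch, with an invented stand-in for the ground-temperature model and illustrative parameter ranges:

```python
import numpy as np

rng = np.random.default_rng(42)

def toy_magt(conductivity, porosity):
    # Hypothetical stand-in for a ground-thermal model, not the model in the study.
    return -1.0 + 2.0 * conductivity - 3.0 * porosity

# Sample uncertain soil parameters over illustrative ranges.
k = rng.uniform(0.5, 2.0, size=10_000)    # thermal conductivity [W m-1 K-1]
phi = rng.uniform(0.1, 0.5, size=10_000)  # porosity [-]
magt = toy_magt(k, phi)

# Parametric output uncertainty expressed as the length of the 95%
# interval of the Monte Carlo ensemble.
lo, hi = np.percentile(magt, [2.5, 97.5])
interval_length = hi - lo
```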

  19. Accept & Reject Statement-Based Uncertainty Models

    NARCIS (Netherlands)

    E. Quaeghebeur (Erik); G. de Cooman; F. Hermans (Felienne)

    2015-01-01

    We develop a framework for modelling and reasoning with uncertainty based on accept and reject statements about gambles. It generalises the frameworks found in the literature based on statements of acceptability, desirability, or favourability and clarifies their relative position. Next

  20. Optical Model and Cross Section Uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Herman,M.W.; Pigni, M.T.; Dietrich, F.S.; Oblozinsky, P.

    2009-10-05

    Distinct minima and maxima in the neutron total cross section uncertainties were observed in model calculations using a spherical optical potential. We found this oscillating structure to be a general feature of quantum mechanical wave scattering. Specifically, we analyzed neutron interaction with 56Fe from 1 keV up to 65 MeV and investigated the physical origin of the minima. We discuss their potential importance for practical applications as well as the implications for the uncertainties in total and absorption cross sections.

  1. Uncertainty quantification and stochastic modeling with Matlab

    CERN Document Server

    Souza de Cursi, Eduardo

    2015-01-01

    Uncertainty Quantification (UQ) is a relatively new research area which describes the methods and approaches used to supply quantitative descriptions of the effects of uncertainty, variability and errors in simulation problems and models. It is rapidly becoming a field of increasing importance, with many real-world applications within statistics, mathematics, probability and engineering, but also within the natural sciences. Literature on the topic has up until now been largely based on polynomial chaos, which raises difficulties when considering different types of approximation and does no

  2. APPLICATION OF UNCERTAINTY ANALYSIS TO MAAP4 ANALYSES FOR LEVEL 2 PRA PARAMETER IMPORTANCE DETERMINATION

    Directory of Open Access Journals (Sweden)

    KEVIN ROBERTS

    2013-11-01

    A key element tied to using a code like MAAP4 is an uncertainty analysis. The purpose of this paper is to present a MAAP4-based analysis examining the sensitivity of a key parameter, in this case hydrogen production, to a set of model parameters that are related to a Level 2 PRA analysis. The Level 2 analysis examines those sequences that result in core melting and subsequent reactor pressure vessel failure and its impact on the containment. This paper identifies individual contributors and MAAP4 model parameters that statistically influence hydrogen production. Hydrogen generation was chosen because of its direct relationship to oxidation. With greater oxidation, more heat is added to the core region and relocation (core slump) should occur faster. This, in theory, would lead to shorter failure times and a subsequently “hotter” debris pool on the containment floor.

  3. Representing uncertainty on model analysis plots

    Directory of Open Access Journals (Sweden)

    Trevor I. Smith

    2016-09-01

    Model analysis provides a mechanism for representing student learning as measured by standard multiple-choice surveys. The model plot contains information regarding both how likely students in a particular class are to choose the correct answer and how likely they are to choose an answer consistent with a well-documented conceptual model. Unfortunately, Bao’s original presentation of the model plot did not include a way to represent uncertainty in these measurements. I present details of a method to add error bars to model plots by expanding the work of Sommer and Lindell. I also provide a template for generating model plots with error bars.

  4. PAPIRUS, a parallel computing framework for sensitivity analysis, uncertainty propagation, and estimation of parameter distribution

    International Nuclear Information System (INIS)

    Heo, Jaeseok; Kim, Kyung Doo

    2015-01-01

    Highlights: • We developed an interface between an engineering simulation code and statistical analysis software. • Multiple packages of sensitivity analysis, uncertainty quantification, and parameter estimation algorithms are implemented in the framework. • Parallel computing algorithms are also implemented in the framework to solve multiple computational problems simultaneously. - Abstract: This paper introduces a statistical data analysis toolkit, PAPIRUS, designed to perform model calibration, uncertainty propagation, Chi-square linearity testing, and sensitivity analysis for both linear and nonlinear problems. PAPIRUS was developed by implementing multiple packages of methodologies and building an interface between an engineering simulation code and the statistical analysis algorithms. A parallel computing framework is implemented in PAPIRUS with multiple computing resources and proper communication between the server and the clients of each processor. It was shown that even though a large amount of data is considered for the engineering calculation, the distributions of the model parameters and the calculation results can be quantified accurately with significant reductions in computational effort. A general description of PAPIRUS with a graphical user interface is presented in Section 2. Sections 2.1–2.5 present the methodologies of data assimilation, uncertainty propagation, Chi-square linearity testing, and sensitivity analysis implemented in the toolkit, with some results obtained by each module of the software. Parallel computing algorithms adopted in the framework to solve multiple computational problems simultaneously are also summarized in the paper
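The parallel evaluation of many code runs described in this record can be imitated with the standard library; `model_run` below is a hypothetical stand-in for one simulation-code call, not PAPIRUS's actual interface:

```python
import math
from concurrent.futures import ThreadPoolExecutor

def model_run(params):
    """Hypothetical stand-in for one engineering-code evaluation."""
    k, q = params
    return q * math.exp(-k)

# Evaluate multiple computational cases concurrently, analogous to a
# framework distributing sampled parameter sets across worker clients.
cases = [(0.1 * i, 2.0) for i in range(8)]
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(model_run, cases))
```

`Executor.map` preserves input order, so each result can be matched back to the parameter sample that produced it, which is essential when the outputs feed a sensitivity or calibration step.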

  5. Uncertainty analysis of hydrological modeling in a tropical area using different algorithms

    Science.gov (United States)

    Rafiei Emam, Ammar; Kappas, Martin; Fassnacht, Steven; Linh, Nguyen Hoang Khanh

    2018-01-01

    Hydrological modeling outputs are subject to uncertainty resulting from different sources of error (e.g., errors in input data, model structure, and model parameters), making quantification of uncertainty in hydrological modeling imperative if the reliability of modeling results is to be improved. Uncertainty analysis must also overcome difficulties in the calibration of hydrological models, which increase further in areas with data scarcity. The purpose of this study is to apply four uncertainty analysis algorithms to a semi-distributed hydrological model, quantifying different sources of uncertainty (especially parameter uncertainty) and evaluating their performance. In this study, the Soil and Water Assessment Tool (SWAT) eco-hydrological model was implemented for a watershed in central Vietnam. The sensitivity of the parameters was analyzed, and the model was calibrated. The uncertainty analysis for the hydrological model was conducted with four algorithms: Generalized Likelihood Uncertainty Estimation (GLUE), Sequential Uncertainty Fitting (SUFI), the Parameter Solution method (ParaSol) and Particle Swarm Optimization (PSO). The performance of the algorithms was compared using the P-factor and R-factor, the coefficient of determination (R²), the Nash-Sutcliffe coefficient of efficiency (NSE) and Percent Bias (PBIAS). The results showed the high performance of SUFI and PSO, with P-factor > 0.83, R² > 0.91, NSE > 0.89, and PBIAS as low as 0.18, supporting model use for policy or management decisions.
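The simpler performance measures named in this record have short standard definitions. A sketch with the sign conventions commonly used in the SWAT literature (the band-based R-factor additionally needs the simulated uncertainty envelope, so only the P-factor side is shown):

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is perfect, 0 matches the obs mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def pbias(obs, sim):
    """Percent bias: 0 is unbiased; positive indicates underestimation."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 100.0 * np.sum(obs - sim) / np.sum(obs)

def p_factor(obs, lower, upper):
    """Fraction of observations bracketed by the uncertainty band."""
    obs = np.asarray(obs, float)
    return float(np.mean((obs >= lower) & (obs <= upper)))
```

A P-factor near 1 achieved with a narrow band (small R-factor) is the desired combination: the band brackets the data without being uselessly wide.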

  6. Uncertainty Quantification in Geomagnetic Field Modeling

    Science.gov (United States)

    Chulliat, A.; Nair, M. C.; Alken, P.; Meyer, B.; Saltus, R.; Woods, A.

    2017-12-01

    Geomagnetic field models are mathematical descriptions of the various sources of the Earth's magnetic field, and are generally obtained by solving an inverse problem. They are widely used in research to separate and characterize field sources, but also in many practical applications such as aircraft and ship navigation, smartphone orientation, satellite attitude control, and directional drilling. In recent years, more sophisticated models have been developed, thanks to the continuous availability of high-quality satellite data and to progress in modeling techniques. Uncertainty quantification has become an integral part of model development, both to assess the progress made and to address specific users' needs. Here we report on recent advances made by our group in quantifying the uncertainty of geomagnetic field models. We first focus on NOAA's World Magnetic Model (WMM) and the International Geomagnetic Reference Field (IGRF), two reference models of the main (core) magnetic field produced every five years. We describe the methods used in quantifying the model commission error as well as the omission error attributed to various un-modeled sources such as magnetized rocks in the crust and electric current systems in the atmosphere and near-Earth environment. A simple error model was derived from this analysis to facilitate usage in practical applications. We next report on improvements brought by combining a main field model with a high-resolution crustal field model and a time-varying, real-time external field model, as in NOAA's High Definition Geomagnetic Model (HDGM). The resulting uncertainties are used by the directional drilling industry to mitigate health, safety and environmental risks.

  7. Nuclear data adjustment methodology utilizing resonance parameter sensitivities and uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Broadhead, B.L.

    1984-01-01

    This work presents the development and demonstration of a Nuclear Data Adjustment Method that allows inclusion of both energy and spatial self-shielding into the adjustment procedure. The resulting adjustments are for the basic parameters (i.e., resonance parameters) in the resonance regions and for the group cross sections elsewhere. The majority of this development effort concerns the production of resonance parameter sensitivity information which allows the linkage between the responses of interest and the basic parameters. The resonance parameter sensitivity methodology developed herein usually provides accurate results when compared to direct recalculations using existing and well-known cross section processing codes. However, it has been shown in several cases that self-shielded cross sections can be very non-linear functions of the basic parameters. For this reason caution must be used in any study which assumes that a linear relationship exists between a given self-shielded group cross section and its corresponding basic data parameters.

  8. Quantum-memory-assisted entropic uncertainty in spin models with Dzyaloshinskii-Moriya interaction

    Science.gov (United States)

    Huang, Zhiming

    2018-02-01

    In this article, we investigate the dynamics and correlations of quantum-memory-assisted entropic uncertainty, the tightness of the uncertainty, entanglement, quantum correlation and mixedness for various spin chain models with Dzyaloshinskii-Moriya (DM) interaction, including the XXZ, XY and Ising models with DM interaction. We find that the uncertainty grows to a stable value with increasing temperature but decreases as the coupling coefficient, anisotropy parameter and DM interaction strength increase. The entropic uncertainty is found to be closely correlated with the mixedness of the system. Increasing quantum correlation can reduce the uncertainty, and the robustness of quantum correlation is better than that of entanglement, since entanglement undergoes sudden death and birth. The tightness of the uncertainty drops to zero, apart from slight volatility, as the various parameters increase. Furthermore, we propose an effective approach to steering the uncertainty by weak measurement reversal.

  9. Using dynamical uncertainty models estimating uncertainty bounds on power plant performance prediction

    DEFF Research Database (Denmark)

    Odgaard, Peter Fogh; Stoustrup, Jakob; Mataji, B.

    2007-01-01

    of the prediction error. These proposed dynamical uncertainty models result in an upper and lower bound on the predicted performance of the plant. The dynamical uncertainty models are used to estimate the uncertainty of the predicted performance of a coal-fired power plant. The proposed scheme, which uses dynamical...

  10. Matching experimental and three dimensional numerical models for structural vibration problems with uncertainties

    Science.gov (United States)

    Langer, P.; Sepahvand, K.; Guist, C.; Bär, J.; Peplow, A.; Marburg, S.

    2018-03-01

    A simulation model that examines the dynamic behavior of real structures needs to address the impact of uncertainty in both geometry and material parameters. This article investigates three-dimensional finite element models for structural dynamics problems with respect to both model and parameter uncertainties. The parameter uncertainties are determined via laboratory measurements on several beam-like samples. The parameters are then treated as random inputs to the finite element model in order to explore the effect of uncertainty on the quality of the model outputs, i.e. the natural frequencies. The accuracy of the model's output predictions is compared with experimental results. To this end, non-contact experimental modal analysis is conducted to identify the natural frequencies of the samples. The results show good agreement with the experimental data. Furthermore, it is demonstrated that the geometrical uncertainties have more influence on the natural frequencies than the material parameters, even though the material uncertainties are about two times larger than the geometrical ones. This gives valuable insight for improving finite element models over the various parameter ranges required in a modeling process involving uncertainty.

  11. Verification and Uncertainty Reduction of Amchitka Underground Nuclear Testing Models

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed Hassan; Jenny Chapman

    2006-02-01

    The models of the Amchitka underground nuclear tests developed in 2002 are verified, and the uncertainty in model input parameters, as well as in predictions, has been reduced using newly collected data obtained by the summer 2004 field expedition of CRESP. The newly collected data that pertain to the groundwater model include magnetotelluric (MT) surveys conducted on the island to determine the subsurface salinity and porosity structure, and bathymetric surveys to produce bathymetric maps of the areas offshore from the Long Shot and Cannikin sites. Analysis and interpretation of the MT data yielded information on the location of the transition zone, and porosity profiles showing porosity values decaying with depth. These new data sets are used to verify the original model in terms of model parameters, model structure, and model output. In addition, by using the new data along with the existing data (chemistry and head data), the uncertainty in model input and output is decreased by conditioning on all the available data. A Markov Chain Monte Carlo (MCMC) approach is adopted for developing new input parameter distributions conditioned on prior knowledge and new data. The MCMC approach is a form of Bayesian conditioning constructed in such a way that it produces samples of the model parameters that eventually converge to a stationary posterior distribution. The Bayesian MCMC approach enhances probabilistic assessment: instead of simply propagating uncertainty forward from input parameters into model predictions (i.e., the traditional Monte Carlo approach), MCMC propagates uncertainty backward from data onto parameters, and then forward from parameters into predictions.
Comparisons between new data and the original model, and conditioning on all available data using MCMC method, yield the following results and conclusions: (1) Model structure is verified at Long Shot and Cannikin where the high-resolution bathymetric data collected by CRESP
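The backward-then-forward propagation described in this record rests on sampling a posterior with MCMC. A minimal random-walk Metropolis sketch, with a one-parameter toy problem standing in for the actual groundwater model (all numbers below are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical inverse problem: infer a scalar parameter theta from three
# noisy observations, with model(theta) = theta (illustrative only).
data = np.array([1.2, 0.9, 1.1])

def log_post(theta):
    log_prior = -0.5 * theta ** 2                             # N(0, 1) prior
    log_like = -0.5 * np.sum((data - theta) ** 2) / 0.1 ** 2  # noise sigma = 0.1
    return log_prior + log_like

theta, lp, chain = 0.0, log_post(0.0), []
for _ in range(20_000):
    prop = theta + 0.2 * rng.standard_normal()  # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:    # Metropolis accept/reject
        theta, lp = prop, lp_prop
    chain.append(theta)

posterior = np.array(chain[5_000:])             # discard burn-in
```

Running the forward model once per retained sample then yields predictions conditioned on the data, which is the backward-then-forward propagation the abstract contrasts with plain Monte Carlo.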

  12. Impact of model defect and experimental uncertainties on evaluated output

    International Nuclear Information System (INIS)

    Neudecker, D.; Capote, R.; Leeb, H.

    2013-01-01

    One of the current major problems in nuclear data evaluation is the unreasonably small evaluated uncertainties often obtained. These small uncertainties are partly attributed to missing correlations of experimental uncertainties as well as to deficiencies of the model employed for the prior information. In this article, both uncertainty sources are included in an evaluation of 55 Mn cross-sections for incident neutrons. Their impact on the evaluated output is studied using a prior obtained by the Full Bayesian Evaluation Technique and a prior obtained by the nuclear model program EMPIRE. It is shown analytically and by means of an evaluation that unreasonably small evaluated uncertainties can be obtained not only if correlated systematic uncertainties of the experiment are neglected but also if prior uncertainties are smaller or about the same magnitude as the experimental ones. Furthermore, it is shown that including model defect uncertainties in the evaluation of 55 Mn leads to larger evaluated uncertainties for channels where the model is deficient. It is concluded that including correlated experimental uncertainties is equally important as model defect uncertainties, if the model calculations deviate significantly from the measurements. -- Highlights: • We study possible causes of unreasonably small evaluated nuclear data uncertainties. • Two different formulations of model defect uncertainties are presented and compared. • Smaller prior than experimental uncertainties cause too small evaluated ones. • Neglected correlations of experimental uncertainties cause too small evaluated ones. • Including model defect uncertainties in the prior improves the evaluated output

  13. Uncertainty Analysis of Geochemical Parameters Related To CO2 Leakage into an Unconfined Limestone Aquifer

    Science.gov (United States)

    Bacon, D. H.; Keating, E. H.; Viswanathan, H. S.; Dai, Z.

    2011-12-01

    Accurate prediction of the impact of leaking CO2 on groundwater quality is limited by the complexity of subsurface aquifers and the geochemical reactions that control drinking water compositions. As a result, there is high uncertainty associated with any predictions, hampering monitoring plans, interpretation of monitoring results, and mitigation plans for a given site. Many physical and geochemical characteristics will dictate a drinking water aquifer's response to a CO2 leak. As part of the National Risk Assessment Program (NRAP), funded by the U.S. Department of Energy, scientists at Pacific Northwest National Laboratory (PNNL) and Los Alamos National Laboratory (LANL) have collaborated on the development of a 3D heterogeneous model of the Edwards Aquifer in Texas to examine the impacts of CO2 leakage into an unconfined, carbonate aquifer. Using the same base case model, LANL has focused on uncertainty quantification (UQ) of the aquifer's hydraulic properties, whereas PNNL has examined the impact of uncertainty related to geochemical parameters. This abstract describes PNNL's work on geochemical UQ. The uncertainty analysis looks at the impact on several model outputs, including the CO2 leakage rate from the water table, the mean pH value, and disordered dolomite. We also examine the impact on drinking water quality, specifically TDS, from leakage of brine from the underlying formation, forced upwards by increased pressure due to CO2 injection. To conduct these simulations we use STOMP-CO2-R, a multiphase flow simulator coupled with the reactive transport module ECKEChem, developed at PNNL to simulate CO2 sequestration in deep saline formations and the associated reactions with formation minerals. For uncertainty quantification, we use the PSUADE code. The user defines a range and/or distribution of key input parameters. PSUADE then generates a number of parameter files containing sample points from the prescribed sampling method (i.e., Latin hypercube

  14. Investigating the robustness of ion beam therapy treatment plans to uncertainties in biological treatment parameters

    CERN Document Server

    Boehlen, T T; Dosanjh, M; Ferrari, A; Fossati, P; Haberer, T; Mairani, A; Patera, V

    2012-01-01

    Uncertainties in determining clinically used relative biological effectiveness (RBE) values for ion beam therapy carry the risk of absolute and relative misestimations of RBE-weighted doses for clinical scenarios. This study assesses the consequences of hypothetical misestimations of input parameters to the RBE modelling for carbon ion treatment plans by a variational approach. The impact of the variations on the resulting cell survival and RBE values is evaluated as a function of the remaining ion range. In addition, the sensitivity to misestimations in RBE modelling is compared for single fields and two opposed fields using differing optimization criteria. It is demonstrated for single treatment fields that moderate variations (up to ±50%) of representative nominal input parameters for four tumours result mainly in a misestimation of the RBE-weighted dose in the planning target volume (PTV) by a constant factor, with only smaller RBE-weighted dose gradients. Ensuring a more uniform radiation quality in the PTV...

  15. Stochastic reduced order models for inverse problems under uncertainty.

    Science.gov (United States)

    Warner, James E; Aquino, Wilkins; Grigoriu, Mircea D

    2015-03-01

    This work presents a novel methodology for solving inverse problems under uncertainty using stochastic reduced order models (SROMs). Given statistical information about an observed state variable in a system, unknown parameters are estimated probabilistically through the solution of a model-constrained, stochastic optimization problem. The point of departure and crux of the proposed framework is the representation of a random quantity using a SROM, a low-dimensional, discrete approximation to a continuous random element that permits efficient and non-intrusive stochastic computations. Characterizing the uncertainties with SROMs transforms the stochastic optimization problem into a deterministic one. The non-intrusive nature of SROMs facilitates efficient gradient computations for random vector unknowns and relies entirely on calls to existing deterministic solvers. Furthermore, the method is naturally extended to handle multiple sources of uncertainty in cases where state variable data, system parameters, and boundary conditions are all considered random. The new and widely applicable SROM framework is formulated for a general stochastic optimization problem in terms of an abstract objective function and constraining model. For demonstration purposes, however, we study its performance in the specific case of inverse identification of random material parameters in elastodynamics. We demonstrate the ability to efficiently recover random shear moduli given material displacement statistics as input data. We also show that the approach remains effective for the case where the loading in the problem is random as well.
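The core idea of an SROM, replacing a continuous random element by a small weighted set of points that preserves its statistics, can be illustrated with a crude quantile rule. The actual method optimizes points and weights jointly; everything below, including the lognormal "shear modulus", is an illustrative stand-in:

```python
import numpy as np

rng = np.random.default_rng(1)
# Target random variable, e.g. an uncertain shear modulus (illustrative).
samples = rng.lognormal(mean=0.0, sigma=0.5, size=100_000)

def srom_quantile(samples, m):
    """Toy discrete m-point approximation: equally weighted support
    points placed at the midpoint quantiles of the target distribution."""
    probs = (np.arange(m) + 0.5) / m
    points = np.quantile(samples, probs)
    weights = np.full(m, 1.0 / m)
    return points, weights

points, weights = srom_quantile(samples, m=20)
srom_mean = np.sum(weights * points)
srom_var = np.sum(weights * points ** 2) - srom_mean ** 2
```

With the random input reduced to 20 deterministic points, a stochastic computation becomes 20 calls to an existing deterministic solver, which is the non-intrusive property the abstract emphasizes.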

  16. Estimation of uncertainties in predictions of environmental transfer models: evaluation of methods and application to CHERPAC

    International Nuclear Information System (INIS)

    Koch, J.; Peterson, S-R.

    1995-10-01

    Models used to simulate environmental transfer of radionuclides typically include many parameters, the values of which are uncertain. An estimate of the uncertainty associated with the predictions is therefore essential. Different methods to quantify the uncertainty in the predictions arising from parameter uncertainties are reviewed. A statistical approach using random sampling techniques is recommended for complex models with many uncertain parameters. In this approach, the probability density function of the model output is obtained from multiple realizations of the model according to a multivariate random sample of the different input parameters. Sampling efficiency can be improved by using a stratified scheme (Latin Hypercube Sampling). Sample size can also be restricted when statistical tolerance limits need to be estimated. Methods to rank parameters according to their contribution to uncertainty in the model prediction are also reviewed. Recommended are measures of sensitivity, correlation and regression coefficients that can be calculated on values of input and output variables generated during the propagation of uncertainties through the model. A parameter uncertainty analysis is performed for the CHERPAC food chain model, which estimates subjective confidence limits and intervals on the predictions at a 95% confidence level. A sensitivity analysis is also carried out using partial rank correlation coefficients. This identifies and ranks the parameters which are the main contributors to uncertainty in the predictions, thereby guiding further research efforts. (author). 44 refs., 2 tabs., 4 figs
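Ranking parameters by rank correlation with the output, as recommended in this record, takes only a few lines. The transfer-model inputs below are invented for illustration (the transfer factor is constructed to dominate), and plain Spearman correlations stand in for the partial rank correlation coefficients used for CHERPAC:

```python
import numpy as np

def spearman(x, y):
    """Spearman rank correlation (continuous inputs, no ties assumed)."""
    rx = np.argsort(np.argsort(x))
    ry = np.argsort(np.argsort(y))
    return np.corrcoef(rx, ry)[0, 1]

rng = np.random.default_rng(2)
n = 2_000
inputs = {
    "transfer_factor": rng.lognormal(0.0, 1.0, n),  # wide range, dominates
    "intake_rate": rng.uniform(0.5, 1.5, n),
    "decay_const": rng.uniform(0.01, 0.02, n),
}
# Toy dose model combining the sampled inputs.
dose = (inputs["transfer_factor"] * inputs["intake_rate"]
        * np.exp(-inputs["decay_const"] * 30.0))

sens = {name: spearman(vals, dose) for name, vals in inputs.items()}
ranking = sorted(sens, key=lambda k: abs(sens[k]), reverse=True)
```

The absolute correlation magnitudes give the parameter ranking, identifying where further research effort would most reduce prediction uncertainty.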

  17. Procedure to approximately estimate the uncertainty of material ratio parameters due to inhomogeneity of surface roughness

    International Nuclear Information System (INIS)

    Hüser, Dorothee; Thomsen-Schmidt, Peter; Hüser, Jonathan; Rief, Sebastian; Seewig, Jörg

    2016-01-01

    Roughness parameters that characterize contacting surfaces with regard to friction and wear are commonly stated without uncertainties, or with an uncertainty taking into account only a very limited number of aspects, such as repeatability or reproducibility (homogeneity) of the specimen. This makes it difficult to discriminate between different values of single roughness parameters. Therefore, uncertainty assessment methods are required that take all relevant aspects into account. In the literature this is rarely performed, and examples specific to parameters used in friction and wear are not yet given. We propose a procedure to derive the uncertainty from a single profile, employing a statistical method that is based on the statistical moments of the amplitude distribution and the autocorrelation length of the profile. To show the possibilities and the limitations of this method we compare the uncertainty derived from a single profile with that derived from a high-statistics experiment. (paper)

  18. Handling Uncertainty in Palaeo-Climate Models and Data

    Science.gov (United States)

    Voss, J.; Haywood, A. M.; Dolan, A. M.; Domingo, D.

    2017-12-01

    The study of palaeoclimates can provide data on the behaviour of the Earth system under boundary conditions different from the ones we observe in the present. One of the main challenges in this approach is that data on past climates come with large uncertainties, since the quantities of interest cannot be observed directly but must instead be derived from proxies. We consider proxy-derived data from the Pliocene (around 3 million years ago, the last interval in Earth history when CO2 was at modern or near-future levels) and contrast these data with the output of complex climate models. In order to perform a meaningful data-model comparison, uncertainties must be taken into account. In this context, we discuss two examples of complex data-model comparison problems. Both examples have in common that they involve fitting a statistical model to describe how the output of the climate simulations depends on various model parameters, including atmospheric CO2 concentration and orbital parameters (obliquity, eccentricity, and precession). This introduces additional uncertainties, but allows a much larger range of model parameters to be explored than would be feasible by relying on simulation runs alone. The first example shows how Gaussian process emulators can be used to perform data-model comparison when simulation runs differ only in the choice of orbital parameters, but temperature data are given in the (somewhat inconvenient) form of "warm peak averages". The second example shows how a simpler approach, based on linear regression, can be used to analyse a more complex problem where we use a larger and more varied ensemble of climate simulations with the aim of estimating Earth System Sensitivity.
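A Gaussian process emulator of the kind described reduces, in its simplest zero-mean form, to a kernel solve over a handful of simulator runs. The training points below are a toy CO2-temperature response invented for illustration, not Pliocene ensemble output:

```python
import numpy as np

def rbf(a, b, length=1.0):
    """Squared-exponential kernel between two 1-D input sets."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

# A handful of "simulator runs": CO2 concentration -> toy temperature response.
x_train = np.array([200.0, 300.0, 400.0, 500.0, 600.0])  # CO2 [ppm]
y_train = np.log(x_train / 280.0)                        # illustrative response

xs = x_train / 100.0                      # rescale inputs to O(1)
K = rbf(xs, xs) + 1e-8 * np.eye(len(xs))  # jitter for numerical stability
alpha = np.linalg.solve(K, y_train)

def emulate(x_new):
    """GP posterior mean at new CO2 values (zero-mean prior, RBF kernel)."""
    return rbf(np.atleast_1d(x_new) / 100.0, xs) @ alpha
```

Once fitted, the emulator is cheap to evaluate across the whole parameter range, which is what makes exploring orbital-parameter and CO2 combinations feasible without a simulation run at every point.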

  19. A python framework for environmental model uncertainty analysis

    Science.gov (United States)

    White, Jeremy; Fienen, Michael N.; Doherty, John E.

    2016-01-01

    We have developed pyEMU, a python framework for Environmental Modeling Uncertainty analyses: an open-source tool that is non-intrusive, easy to use, computationally efficient, and scalable to highly parameterized inverse problems. The framework implements several types of linear (first-order, second-moment (FOSM)) and non-linear uncertainty analyses. The FOSM-based analyses can also be completed prior to parameter estimation to help inform important modeling decisions, such as parameterization and objective function formulation. Complete workflows for several types of FOSM-based and non-linear analyses are documented in example notebooks implemented using Jupyter that are available in the online pyEMU repository. Example workflows include basic parameter and forecast analyses, data worth analyses, and error-variance analyses, as well as usage of parameter ensemble generation and management capabilities. These workflows document the necessary steps and provide insights into the results, with the goal of educating users not only in how to apply pyEMU, but also in the underlying theory of applied uncertainty quantification.
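The linear (FOSM) conditioning that underlies analyses of this kind can be sketched with the Schur complement of a prior parameter covariance. This is a generic numpy illustration, not pyEMU's API; the Jacobian, covariances, and forecast sensitivity vector are invented.

```python
import numpy as np

# Hypothetical sensitivities of 3 observations to 2 model parameters
J = np.array([[1.0, 0.5],
              [0.2, 1.5],
              [0.8, 0.1]])
Sigma_p = np.diag([1.0, 2.0])      # prior parameter covariance (invented)
Sigma_e = 0.1 * np.eye(3)          # observation noise covariance (invented)

# FOSM (Schur complement): posterior parameter covariance after assimilating data
S = J @ Sigma_p @ J.T + Sigma_e
Sigma_post = Sigma_p - Sigma_p @ J.T @ np.linalg.solve(S, J @ Sigma_p)

# Forecast uncertainty: variance of a prediction with sensitivity vector y,
# before and after conditioning on the observations
y = np.array([0.3, 0.7])
prior_var = y @ Sigma_p @ y
post_var = y @ Sigma_post @ y
```

Because this requires only a Jacobian and covariance matrices, the same calculation can be run before any parameter estimation, which is why FOSM is useful for data worth and parameterization decisions.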

  20. Current status of uncertainty analysis methods for computer models

    International Nuclear Information System (INIS)

    Ishigami, Tsutomu

    1989-11-01

    This report surveys several existing uncertainty analysis methods for estimating the uncertainty in computer output caused by input uncertainties, illustrating application examples of those methods with three computer models: MARCH/CORRAL II, TERFOC and SPARC. The merits and limitations of the methods are assessed in these applications, and recommendations for selecting uncertainty analysis methods are provided. (author)

  1. Intrinsic Uncertainties in Modeling Complex Systems.

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, Curtis S; Bramson, Aaron L.; Ames, Arlo L.

    2014-09-01

    Models are built to understand and predict the behaviors of both natural and artificial systems. Because it is always necessary to abstract away aspects of any non-trivial system being modeled, we know models can potentially leave out important, even critical elements. This reality of the modeling enterprise forces us to consider the prospective impacts of those effects completely left out of a model - either intentionally or unconsidered. Insensitivity to new structure is an indication of diminishing returns. In this work, we represent a hypothetical unknown effect on a validated model as a finite perturbation whose amplitude is constrained within a control region. We find robustly that without further constraints, no meaningful bounds can be placed on the amplitude of a perturbation outside of the control region. Thus, forecasting into unsampled regions is a very risky proposition. We also present inherent difficulties with proper time discretization of models and representing inherently discrete quantities. We point out potentially worrisome uncertainties, arising from mathematical formulation alone, which modelers can inadvertently introduce into models of complex systems. Acknowledgements This work has been funded under early-career LDRD project #170979, entitled "Quantifying Confidence in Complex Systems Models Having Structural Uncertainties", which ran from 04/2013 to 09/2014. We wish to express our gratitude to the many researchers at Sandia who contributed ideas to this work, as well as feedback on the manuscript. In particular, we would like to mention George Barr, Alexander Outkin, Walt Beyeler, Eric Vugrin, and Laura Swiler for providing invaluable advice and guidance through the course of the project. We would also like to thank Steven Kleban, Amanda Gonzales, Trevor Manzanares, and Sarah Burwell for their assistance in managing project tasks and resources.

  2. Model parameter updating using Bayesian networks

    Energy Technology Data Exchange (ETDEWEB)

    Treml, C. A. (Christine A.); Ross, Timothy J.

    2004-01-01

    This paper outlines a model parameter updating technique for a new method of model validation using a modified model reference adaptive control (MRAC) framework with Bayesian Networks (BNs). The model parameter updating within this method is generic in the sense that the model/simulation to be validated is treated as a black box. It must have updateable parameters to which its outputs are sensitive, and those outputs must have metrics that can be compared to those of the model reference, i.e., experimental data. Furthermore, no assumptions are made about the statistics of the model parameter uncertainty; only upper and lower bounds need to be specified. This method is designed for situations where a model is not intended to predict a complete point-by-point time domain description of the item/system behavior; rather, there are specific points, features, or events of interest that need to be predicted. These specific points are compared to the model reference derived from actual experimental data. The logic for updating the model parameters to match the model reference is formed via a BN. The nodes of this BN consist of updateable model input parameters and the specific output values or features of interest. Each time the model is executed, the input/output pairs are used to adapt the conditional probabilities of the BN. Each iteration further refines the inferred model parameters to produce the desired model output. After parameter updating is complete and model inputs are inferred, reliabilities for the model output are supplied. Finally, this method is applied to a simulation of a resonance control cooling system for a prototype coupled cavity linac. The results are compared to experimental data.
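The adaptation step described here, using input/output pairs to update the conditional probabilities of a BN node, can be sketched with a simple counting-based conditional probability table. The bin labels, the simulation pairs, and the `CPTUpdater` class are all hypothetical illustrations, not the authors' implementation.

```python
from collections import defaultdict

class CPTUpdater:
    """Adapt P(output_bin | param_bin) from simulation input/output pairs."""
    def __init__(self):
        self.counts = defaultdict(lambda: defaultdict(int))

    def observe(self, param_bin, output_bin):
        """One model execution contributes one input/output pair."""
        self.counts[param_bin][output_bin] += 1

    def conditional(self, param_bin):
        """Current conditional distribution over output bins."""
        row = self.counts[param_bin]
        total = sum(row.values())
        return {k: v / total for k, v in row.items()}

    def infer_param(self, target_output):
        """Pick the parameter bin most likely to produce the target output."""
        best, best_p = None, -1.0
        for p in self.counts:
            prob = self.conditional(p).get(target_output, 0.0)
            if prob > best_p:
                best, best_p = p, prob
        return best

bn = CPTUpdater()
for p, o in [("low", "miss"), ("low", "miss"), ("mid", "hit"),
             ("mid", "hit"), ("mid", "miss"), ("high", "hit")]:
    bn.observe(p, o)
```

Each additional run sharpens the table, which is the sense in which the iteration "further refines the inferred model parameters"; a real BN would handle multiple parent nodes and continuous bounds, which this toy omits.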

  3. Uncertainty analysis in rainfall-runoff modelling : Application of machine learning techniques

    NARCIS (Netherlands)

    Shrestha, D.l.

    2009-01-01

    This thesis presents powerful machine learning (ML) techniques to build predictive models of uncertainty with application to hydrological models. Two different methods are developed and tested. First one focuses on parameter uncertainty analysis by emulating the results of Monte Carlo simulations of


  5. Uncertainty in eddy covariance measurements and its application to physiological models

    Science.gov (United States)

    D.Y. Hollinger; A.D. Richardson; A.D. Richardson

    2005-01-01

    Flux data are noisy, and this uncertainty is largely due to random measurement error. Knowledge of uncertainty is essential for the statistical evaluation of modeled and measured fluxes, for comparison of parameters derived by fitting models to measured fluxes, and in formal data-assimilation efforts. We used the difference between simultaneous measurements from two...

  6. Uncertainties of the Yn Parameters of the Hage-Cifarelli Formalism

    Energy Technology Data Exchange (ETDEWEB)

    Smith-Nelson, Mark A. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Burr, Thomas Lee [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Hutchinson, Jesson D. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Cutler, Theresa Elizabeth [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-08-14

    One method for determining the physical parameters of a multiplying system is summarized by Cifarelli [1]. In this methodology the single, double and triple rates are determined from what is commonly referred to as Feynman histograms. This paper will examine two methods for estimating the uncertainty in the parameters used in inferring these rates. These methods will be compared with simulated data in order to determine which one best approximates the sample uncertainty.

  7. Robust nonlinear control of nuclear reactors under model uncertainty

    International Nuclear Information System (INIS)

    Park, Moon Ghu

    1993-02-01

    A nonlinear model-based control method is developed for the robust control of a nuclear reactor. The nonlinear plant model is used to design a unique control law which covers a wide operating range. Robustness is a crucial factor for the fully automatic control of reactor power due to time-varying, uncertain parameters, state estimation error, and unmodeled dynamics. A variable structure control (VSC) method is introduced which consists of an adaptive performance specification (fine control) after the tracking error reaches the narrow boundary layer by a time-optimal control (coarse control). Variable structure control is a powerful method for nonlinear system controller design which has inherent robustness to parameter variations and external disturbances using the known uncertainty bounds, and it requires very low computational effort. In spite of its desirable properties, conventional VSC presents several important drawbacks that limit its practical applicability. One of the most undesirable phenomena is chattering, which implies extremely high control activity and may excite high-frequency unmodeled dynamics. This problem is due to neglected actuator time delays or sampling effects. The problem was partially remedied by replacing the chattering control with a smooth control interpolation in a boundary layer neighboring a time-varying sliding surface. But for nuclear reactor systems, which have a very fast dynamic response, the sampling effect may destroy the narrow boundary layer when a large uncertainty bound is used. Due to the very short neutron lifetime, a large uncertainty bound leads to high gain in feedback control. To resolve this problem, a derivative feedback is introduced that gives excellent performance by reducing the uncertainty bound. The stability of the tracking error dynamics is guaranteed by the second method of Lyapunov, using two-level uncertainty bounds that are obtained from the knowledge of the uncertainty bound and the estimated

  8. Dealing with uncertainty in landscape genetic resistance models: a case of three co-occurring marsupials.

    Science.gov (United States)

    Dudaniec, Rachael Y; Worthington Wilmer, Jessica; Hanson, Jeffrey O; Warren, Matthew; Bell, Sarah; Rhodes, Jonathan R

    2016-01-01

    Landscape genetics lacks explicit methods for dealing with the uncertainty in landscape resistance estimation, which is particularly problematic when sample sizes of individuals are small. Unless uncertainty can be quantified, valuable but small data sets may be rendered unusable for conservation purposes. We offer a method to quantify uncertainty in landscape resistance estimates using multimodel inference as an improvement over single model-based inference. We illustrate the approach empirically using co-occurring, woodland-preferring Australian marsupials within a common study area: two arboreal gliders (Petaurus breviceps, and Petaurus norfolcensis) and one ground-dwelling antechinus (Antechinus flavipes). First, we use maximum-likelihood and a bootstrap procedure to identify the best-supported isolation-by-resistance model out of 56 models defined by linear and non-linear resistance functions. We then quantify uncertainty in resistance estimates by examining parameter selection probabilities from the bootstrapped data. The selection probabilities provide estimates of uncertainty in the parameters that drive the relationships between landscape features and resistance. We then validate our method for quantifying uncertainty using simulated genetic and landscape data showing that for most parameter combinations it provides sensible estimates of uncertainty. We conclude that small data sets can be informative in landscape genetic analyses provided uncertainty can be explicitly quantified. Being explicit about uncertainty in landscape genetic models will make results more interpretable and useful for conservation decision-making, where dealing with uncertainty is critical. © 2015 John Wiley & Sons Ltd.
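The bootstrap-derived selection probabilities described in this record can be sketched in miniature: resample the data, pick the best-supported model on each replicate, and report selection frequencies. The linear-versus-quadratic comparison by AIC below is an invented stand-in for the authors' 56-model maximum-likelihood procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "landscape feature vs resistance" data with a genuine non-linear trend
x = np.linspace(0, 1, 40)
y = 1.0 + 2.0 * x ** 2 + rng.normal(0, 0.05, x.size)

def aic(y_obs, y_fit, k):
    """Akaike information criterion for a least-squares fit with k parameters."""
    n = y_obs.size
    rss = np.sum((y_obs - y_fit) ** 2)
    return n * np.log(rss / n) + 2 * k

def best_model(xb, yb):
    """0 = linear, 1 = quadratic; selected by AIC on one bootstrap sample."""
    lin = np.polyval(np.polyfit(xb, yb, 1), xb)
    quad = np.polyval(np.polyfit(xb, yb, 2), xb)
    return int(aic(yb, quad, 3) < aic(yb, lin, 2))

sel = np.zeros(2)
for _ in range(200):
    idx = rng.integers(0, x.size, x.size)   # resample individuals with replacement
    sel[best_model(x[idx], y[idx])] += 1
selection_prob = sel / sel.sum()            # bootstrap model-selection probabilities
```

A selection probability near 1 for one functional form signals low structural uncertainty; probabilities spread across forms would flag exactly the kind of uncertainty the paper argues should be reported rather than hidden by single-model inference.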

  9. Uncertainty modelling of critical column buckling for reinforced ...

    Indian Academy of Sciences (India)

    gates the material uncertainties on column design and proposes an uncertainty model for critical ... ances the accuracy of the structural models by using experimental results and design codes. (Baalbaki et al ..... Elishakoff I 1999 Whys and hows in uncertainty modeling, probability, fuzziness and anti-optimization. New York: ...

  10. Model uncertainty from a regulatory point of view

    International Nuclear Information System (INIS)

    Abramson, L.R.

    1994-01-01

    This paper discusses model uncertainty in the larger context of knowledge and random uncertainty. It explores some regulatory implications of model uncertainty and argues that, from a regulator's perspective, a conservative approach must be taken. As a consequence of this perspective, averaging over model results is ruled out

  11. How well can we forecast future model error and uncertainty by mining past model performance data

    Science.gov (United States)

    Solomatine, Dimitri

    2016-04-01

    Consider a hydrological model Y(t) = M(X(t), P), where X = vector of inputs; P = vector of parameters; Y = model output (typically flow); t = time. In cases where there is enough past data on the performance of model M, it is possible to use these data to build a (data-driven) model EC of the error of M. This model EC will be able to forecast the error E when a new input X is fed into model M; subtracting E from the model prediction Y then gives a better estimate of Y. Model EC is usually called an error corrector (in meteorology, a bias corrector). However, we may go further in characterizing model deficiencies: instead of using the error (a real value) we may consider a more sophisticated, probabilistic characterization. So instead of a model EC of the error of M, it is also possible to build a model U of the uncertainty of M; if uncertainty is described as the model error distribution D, this model will calculate its properties - mean, variance, other moments, and quantiles. The general form of this model could be: D = U(RV), where RV = vector of relevant variables having influence on model uncertainty (to be identified, e.g., by mutual information analysis); D = vector of variables characterizing the error distribution (typically, two or more quantiles). There is one aspect which is not always explicitly mentioned in uncertainty analysis work. In our view it is important to distinguish the following main types of model uncertainty: 1. The residual uncertainty of models. In this case the model parameters and/or model inputs are considered to be fixed (deterministic), i.e. the model is considered to be optimal (calibrated) and deterministic. Model error is considered as the manifestation of uncertainty. If there is enough past data about the model errors (i.e. its uncertainty), it is possible to build a statistical or machine learning model of uncertainty trained on this data. Here the following methods can be mentioned: (a) quantile regression (QR
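A minimal sketch of the D = U(RV) idea: condition empirical error quantiles on a relevant variable by binning past model-performance data. The "flow" variable, the error structure, and the bin-table form of U are synthetic illustrations, not the abstract's machine-learning model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Past model performance data: a relevant variable RV (e.g. simulated flow level)
# and the corresponding model errors, with error spread growing with flow
flow = rng.uniform(0, 10, 2000)
error = rng.normal(0, 0.1 + 0.05 * flow)

def uncertainty_model(rv, errors, n_bins=5, quantiles=(0.1, 0.5, 0.9)):
    """U: map the relevant variable to quantiles of the error distribution D."""
    edges = np.quantile(rv, np.linspace(0, 1, n_bins + 1))
    table = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (rv >= lo) & (rv <= hi)
        table.append((lo, hi, np.quantile(errors[mask], quantiles)))
    return table

table = uncertainty_model(flow, error)   # per-bin (lower, upper, error quantiles)
```

Looking up a new flow value in the table yields predictive error quantiles, i.e. a residual-uncertainty band around the deterministic forecast; quantile regression would replace the binning with a fitted conditional-quantile function.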

  12. Estimating the Uncertainty In Diameter Growth Model Predictions and Its Effects On The Uncertainty of Annual Inventory Estimates

    Science.gov (United States)

    Ronald E. McRoberts; Veronica C. Lessard

    2001-01-01

    Uncertainty in diameter growth predictions is attributed to three general sources: measurement error or sampling variability in predictor variables, parameter covariances, and residual or unexplained variation around model expectations. Using measurement error and sampling variability distributions obtained from the literature and Monte Carlo simulation methods, the...
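The Monte Carlo propagation described here can be sketched by sampling each uncertainty source and pushing the samples through a growth equation. The growth function, nominal diameter, and error magnitudes below are all invented for illustration, not the authors' fitted model.

```python
import numpy as np

rng = np.random.default_rng(2)

def growth(dbh, a=0.5, b=0.02):
    """Hypothetical annual diameter increment (cm) as a function of DBH (cm)."""
    return a * np.exp(-b * dbh) * dbh ** 0.3   # placeholder functional form

# Propagate measurement error in the predictor and parameter uncertainty
n = 10_000
dbh_obs = 30.0
dbh = rng.normal(dbh_obs, 0.5, n)    # measurement error / sampling variability
a = rng.normal(0.5, 0.05, n)         # parameter uncertainty (covariance simplified
b = rng.normal(0.02, 0.002, n)       # to independent marginals here)

pred = growth(dbh, a, b)
mean_pred = pred.mean()
sd_pred = pred.std()                 # prediction uncertainty from all sources
```

Adding residual variation as a third sampled term, and repeating the exercise plot-by-plot, would mirror the three uncertainty sources the abstract lists and their aggregation up to inventory estimates.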

  13. Methodology for characterizing modeling and discretization uncertainties in computational simulation

    Energy Technology Data Exchange (ETDEWEB)

    ALVIN,KENNETH F.; OBERKAMPF,WILLIAM L.; RUTHERFORD,BRIAN M.; DIEGERT,KATHLEEN V.

    2000-03-01

    This research effort focuses on methodology for quantifying the effects of model uncertainty and discretization error on computational modeling and simulation. The work is directed towards developing methodologies which treat model form assumptions within an overall framework for uncertainty quantification, for the purpose of developing estimates of total prediction uncertainty. The present effort consists of work in three areas: framework development for sources of uncertainty and error in the modeling and simulation process which impact model structure; model uncertainty assessment and propagation through Bayesian inference methods; and discretization error estimation within the context of non-deterministic analysis.

  14. Experimental Active Vibration Control in Truss Structures Considering Uncertainties in System Parameters

    Directory of Open Access Journals (Sweden)

    Douglas Domingues Bueno

    2008-01-01

    This paper deals with the study of algorithms for robust active vibration control in flexible structures considering uncertainties in system parameters. This has become an area of enormous interest, mainly due to the countless demands for optimal performance in mechanical systems such as aircraft, aerospace, and automotive structures. An important and difficult problem in designing active vibration control is obtaining a representative dynamic model. Generally, this model can be obtained using the finite element method (FEM) or an identification method using experimental data. Actuators and sensors may affect the dynamic properties of the structure; for instance, the electromechanical coupling of piezoelectric material must be considered in the FEM formulation for flexible and lightly damped structures. The nonlinearities and uncertainties involved make this a difficult task, especially for complex structures such as spatial trusses. On the other hand, by using an identification method, it is possible to obtain a dynamic model represented through a state space realization that accounts for this coupling. This paper proposes an experimental methodology for vibration control in a 3D truss structure using PZT wafer stacks and a robust control algorithm solved by linear matrix inequalities.

  15. Implications of model uncertainty for the practice of risk assessment

    International Nuclear Information System (INIS)

    Laskey, K.B.

    1994-01-01

    A model is a representation of a system that can be used to answer questions about the system's behavior. The term model uncertainty refers to problems in which there is no generally agreed upon, validated model that can be used as a surrogate for the system itself. Model uncertainty affects both the methodology appropriate for building models and how models should be used. This paper discusses representations of model uncertainty, methodologies for exercising and interpreting models in the presence of model uncertainty, and the appropriate use of fallible models for policy making

  16. Sensitivity and uncertainty analysis of a polyurethane foam decomposition model

    Energy Technology Data Exchange (ETDEWEB)

    HOBBS,MICHAEL L.; ROBINSON,DAVID G.

    2000-03-14

    Sensitivity/uncertainty analyses are not commonly performed on complex, finite-element engineering models because the analyses are time consuming, CPU intensive, nontrivial exercises that can lead to deceptive results. To illustrate these ideas, an analytical sensitivity/uncertainty analysis is used to determine the standard deviation and the primary factors affecting the burn velocity of polyurethane foam exposed to fire-like radiative boundary conditions. The complex finite element model has 25 input parameters that include chemistry, polymer structure, and thermophysical properties. The response variable was the steady-state burn velocity, calculated as the derivative of the burn front location with respect to time. The standard deviation of the burn velocity was determined by taking numerical derivatives of the response variable with respect to each of the 25 input parameters. Since the response variable is itself a derivative, the standard deviation is essentially determined from a second derivative that is extremely sensitive to numerical noise. To minimize the numerical noise, 50-micron elements and approximately 1-msec time steps were required to obtain stable uncertainty results. The primary effect variable was shown to be the emissivity of the foam.
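The procedure of this record, numerical derivatives of the response with respect to each input, combined into a standard deviation and a ranking of primary effects, can be sketched with central differences on a toy response function. The function, nominal values, and parameter standard deviations are invented; only the recipe matches the abstract.

```python
import numpy as np

def burn_velocity(p):
    """Placeholder response function; stands in for the finite-element model."""
    emissivity, conductivity, density = p
    return 2.0 * emissivity + 0.3 * conductivity - 0.1 * density

p0 = np.array([0.9, 0.2, 1.1])       # nominal parameter values (invented)
sd = np.array([0.05, 0.02, 0.05])    # parameter standard deviations (invented)

# Central-difference sensitivities of the response to each parameter
h = 1e-4
grad = np.zeros_like(p0)
for i in range(p0.size):
    dp = np.zeros_like(p0)
    dp[i] = h
    grad[i] = (burn_velocity(p0 + dp) - burn_velocity(p0 - dp)) / (2 * h)

# First-order variance combination and ranking of primary effects
contrib = (grad * sd) ** 2
sd_response = np.sqrt(contrib.sum())
primary = int(np.argmax(contrib))    # index of the dominant parameter
```

On a real model each `burn_velocity` evaluation is a full simulation, and, as the abstract warns, the finite-difference step must be balanced against solver noise, which is why the authors needed fine meshes and time steps to stabilize the result.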

  17. Investigation of machining stability in micro milling considering the parameter uncertainty

    Directory of Open Access Journals (Sweden)

    Ziyang Cao

    2015-03-01

    Micro milling can fabricate miniaturized components using a micro end mill at high rotational speeds. A major obstacle that limits productivity in machining operations is the presence of machine tool chatter. The analysis of machining stability in micro milling plays an important role in characterizing the cutting process, estimating tool life, and optimizing the process. However, the majority of traditional models used to predict chatter stability assume that parameters remain unchanged. In reality, the parameters affecting machining stability vary with the high spindle speed and the dynamic characteristics of the milling system. A numerical analysis and an experimental method are presented to investigate the machining stability in the micro end milling process considering parameter uncertainty. A robust chatter stability model based on the analytical chatter stability milling model is developed, using the edge theorem and the zero exclusion condition. The method is verified experimentally for micro milling operations while considering a changing cutting coefficient and natural frequency.

  18. How to: understanding SWAT model uncertainty relative to measured results

    Science.gov (United States)

    Watershed models are being relied upon to contribute to most policy-making decisions of watershed management, and the demand for an accurate accounting of complete model uncertainty is rising. Generalized likelihood uncertainty estimation (GLUE) is a widely used method for quantifying uncertainty i...

  19. Model structural uncertainty quantification and hydrogeophysical data integration using airborne electromagnetic data (Invited)

    DEFF Research Database (Denmark)

    Minsley, Burke; Christensen, Nikolaj Kruse; Christensen, Steen

    ... of airborne electromagnetic (AEM) data to estimate large-scale model structural geometry, i.e. the spatial distribution of different lithological units based on assumed or estimated resistivity-lithology relationships, and the uncertainty in those structures given imperfect measurements. Geophysically derived estimates of model structural uncertainty are then combined with hydrologic observations to assess the impact of model structural error on hydrologic calibration and prediction errors. Using a synthetic numerical model, we describe a sequential hydrogeophysical approach that: (1) uses Bayesian Markov chain Monte Carlo (McMC) methods to produce a robust estimate of uncertainty in electrical resistivity parameter values, (2) combines geophysical parameter uncertainty estimates with borehole observations of lithology to produce probabilistic estimates of model structural uncertainty over the entire AEM ...

  20. DRAINMOD-GIS: a lumped parameter watershed scale drainage and water quality model

    Science.gov (United States)

    G.P. Fernandez; G.M. Chescheir; R.W. Skaggs; D.M. Amatya

    2006-01-01

    A watershed scale lumped parameter hydrology and water quality model that includes an uncertainty analysis component was developed and tested on a lower coastal plain watershed in North Carolina. Uncertainty analysis was used to determine the impacts of uncertainty in field and network parameters of the model on the predicted outflows and nitrate-nitrogen loads at the...

  1. Minimizing uncertainty of daily rainfall interpolation over large catchments through realistic sampling of anisotropic correlogram parameters

    Science.gov (United States)

    Gyasi-Agyei, Yeboah

    2016-04-01

    It has been established that daily rainfall gauge network density is not adequate for the level of hydrological modelling required of large catchments involving pollutant and sediment transport, such as the catchments draining the coastal regions of Queensland, Australia, into the sensitive Great Barrier Reef. This paper seeks to establish a link between the spatial structure of radar and gauge rainfall for improved interpolation of the limited gauged data over a grid or functional units of catchments in regions with or without radar records. The study area is within the range of the Mt. Stapylton weather radar station, a 128 km square region used for calibration and validation, and the Brisbane river catchment, used for validation only. Two time periods (2000-01-01 to 2008-12-31 and 2009-01-01 to 2015-06-30) were considered: the latter period for calibration, when radar records were available, and both time periods for validation without regard to radar information. Anisotropic correlograms of both the gauged and radar data were developed and used to establish the linkage required for areas without radar records. The maximum daily temperature significantly influenced the distributional parameters of the linkage. While the gauged, radar, and sampled correlogram parameters reproduced the mean estimates similarly using leave-one-out cross-validation of Ordinary Kriging, the gauged parameters overestimated the standard deviation (SD), which reflects uncertainty, in over 91% of cases compared with the radar or sampled parameter sets. However, the distributions of the SD generated by the radar and the sampled correlogram parameters could not be distinguished, with a Kolmogorov-Smirnov test p-value of 0.52. For the validation case within the catchment, the percentage overestimation of SD by the gauged parameter sets decreased to 81.2% and 87.1% for the earlier and later time periods, respectively. It is observed that the extreme wet days' parameters and statistics were fairly widely distributed

  2. Effects of input uncertainty on cross-scale crop modeling

    Science.gov (United States)

    Waha, Katharina; Huth, Neil; Carberry, Peter

    2014-05-01

    The quality of data on climate, soils, and agricultural management in the tropics is in general low, or data are scarce, leading to uncertainty in process-based modeling of cropping systems. Process-based crop models are common tools for simulating crop yields and crop production in climate change impact studies, studies on mitigation and adaptation options, or food security studies. Crop modelers are concerned about input data accuracy, as this, together with an adequate representation of plant physiological processes and the choice of model parameters, is a key factor for a reliable simulation. For example, assuming errors in measurements of air temperature, radiation, and precipitation of ± 0.2°C, ± 2 % and ± 3 % respectively, Fodor & Kovacs (2005) estimate that this translates into an uncertainty of 5-7 % in yield and biomass simulations. In our study we seek to answer the following questions: (1) are there important uncertainties in the spatial variability of simulated crop yields on the grid-cell level displayed on maps, (2) are there important uncertainties in the temporal variability of simulated crop yields on the aggregated, national level displayed in time series, and (3) how does the accuracy of different soil, climate, and management information influence the simulated crop yields in two crop models designed for use at different spatial scales? The study will help to determine whether more detailed information improves the simulations and to advise model users on the uncertainty related to input data. We analyse the performance of the point-scale crop model APSIM (Keating et al., 2003) and the global-scale crop model LPJmL (Bondeau et al., 2007) with different climate information (monthly and daily) and soil conditions (global soil map and African soil map) under different agricultural management (uniform and variable sowing dates) for the low-input maize-growing areas in Burkina Faso, West Africa. We test the models' response to different levels of input

  3. Constraining Parameter Uncertainty in Simulations of Water and Heat Dynamics in Seasonally Frozen Soil Using Limited Observed Data

    Directory of Open Access Journals (Sweden)

    Mousong Wu

    2016-02-01

    Water and energy processes in frozen soils are important for better understanding hydrologic processes and water resources management in cold regions. To investigate the water and energy balance in seasonally frozen soils, CoupModel combined with the generalized likelihood uncertainty estimation (GLUE) method was used. Simulation work on water and heat processes in frozen soil in northern China during the 2012/2013 winter was conducted. Ensemble simulations generated through Monte Carlo sampling were used for uncertainty analysis. Behavioral simulations were selected based on combinations of multiple model performance criteria with respect to simulated soil water content and temperature at four depths (5 cm, 15 cm, 25 cm, and 35 cm). Posterior distributions for parameters related to soil hydraulics, radiation processes, and heat transport indicated that uncertainties in both the input and the model structure could influence model performance in simulating water and heat processes in seasonally frozen soils. Seasonal courses of water and energy partitioning were obvious during the winter. Within the daily cycle, soil evaporation/condensation and energy distributions were well captured and identified as an important phenomenon in the dynamics of the energy balance system. The combination of CoupModel simulations with the uncertainty-based calibration method provides a way of understanding the seasonal courses of hydrology and energy processes in cold regions with limited data. Additional measurements may be used to further reduce the uncertainty of regulating factors during the different stages of freezing-thawing.
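The GLUE workflow of this record, Monte Carlo sampling from broad priors followed by selection of behavioral simulations against a performance criterion, can be sketched as follows. The exponential-decay "model", the priors, and the RMSE threshold are invented stand-ins for CoupModel and the paper's multi-criteria selection.

```python
import numpy as np

rng = np.random.default_rng(3)

def model(theta, t):
    """Toy surrogate simulator: exponential decay with amplitude and rate."""
    return theta[0] * np.exp(-theta[1] * t)

t = np.linspace(0, 5, 20)
obs = model((2.0, 0.7), t) + rng.normal(0, 0.05, t.size)   # synthetic observations

# GLUE step 1: Monte Carlo sampling of parameters from broad uniform priors
n = 5000
samples = np.column_stack([rng.uniform(0.5, 4.0, n),   # amplitude prior
                           rng.uniform(0.1, 2.0, n)])  # rate prior
rmse = np.array([np.sqrt(np.mean((model(s, t) - obs) ** 2)) for s in samples])

# GLUE step 2: keep only behavioral simulations meeting the criterion
behavioral = samples[rmse < 0.1]
theta_med = np.median(behavioral, axis=0)   # summary of the behavioral ensemble
```

The spread of the behavioral ensemble, rather than a single best fit, is what GLUE reports as parameter uncertainty; tightening or combining criteria (as done here across four observation depths) shrinks that ensemble.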

  4. Improving the precision of lake ecosystem metabolism estimates by identifying predictors of model uncertainty

    Science.gov (United States)

    Rose, Kevin C.; Winslow, Luke A.; Read, Jordan S.; Read, Emily K.; Solomon, Christopher T.; Adrian, Rita; Hanson, Paul C.

    2014-01-01

    Diel changes in dissolved oxygen are often used to estimate gross primary production (GPP) and ecosystem respiration (ER) in aquatic ecosystems. Despite the widespread use of this approach to understand ecosystem metabolism, we are only beginning to understand the degree and underlying causes of uncertainty for metabolism model parameter estimates. Here, we present a novel approach to improve the precision and accuracy of ecosystem metabolism estimates by identifying physical metrics that indicate when metabolism estimates are highly uncertain. Using datasets from seventeen instrumented GLEON (Global Lake Ecological Observatory Network) lakes, we discovered that many physical characteristics correlated with uncertainty, including PAR (photosynthetically active radiation, 400-700 nm), daily variance in Schmidt stability, and wind speed. Low PAR was a consistent predictor of high variance in GPP model parameters, but also corresponded with low ER model parameter variance. We identified a threshold (30% of clear sky PAR) below which GPP parameter variance increased rapidly and was significantly greater in nearly all lakes compared with variance on days with PAR levels above this threshold. The relationship between daily variance in Schmidt stability and GPP model parameter variance depended on trophic status, whereas daily variance in Schmidt stability was consistently positively related to ER model parameter variance. Wind speeds in the range of ~0.8-3 m s⁻¹ were consistent predictors of high variance for both GPP and ER model parameters, with greater uncertainty in eutrophic lakes. Our findings can be used to reduce ecosystem metabolism model parameter uncertainty and identify potential sources of that uncertainty.

  5. Physical and Model Uncertainty for Fatigue Design of Composite Material

    DEFF Research Database (Denmark)

    Toft, Henrik Stensgaard; Sørensen, John Dalsgaard

The main aim of the present report is to establish stochastic models for the uncertainties related to fatigue design of composite materials. The uncertainties considered are the physical uncertainty related to the static and fatigue strength and the model uncertainty related to Miner's rule...... for linear damage accumulation. Test data analyzed are taken from the Optimat database [1], which is publicly available. The composite material tested within the Optimat project is normally used for wind turbine blades....

  6. Quantification of Model Uncertainty in Modeling Mechanisms of Soil Microbial Respiration Pulses to Simulate Birch Effect

    Science.gov (United States)

    Elshall, A. S.; Ye, M.; Niu, G. Y.; Barron-Gafford, G.

    2014-12-01

    A Bayesian framework is developed to quantify predictive uncertainty in environmental modeling caused by uncertainty in modeling scenarios, model structures, model parameters, and data. An example of using the framework to quantify model uncertainty is presented to simulate soil microbial respiration pulses in response to episodic rainfall pulses (the "Birch effect"). A total of five models are developed; they evolve from an existing four-carbon (C) pool model to models with additional C pools and recently developed models with explicit representations of soil moisture controls on C degradation and microbial uptake rates. Markov chain Monte Carlo (MCMC) methods with generalized likelihood function (not Gaussian) are used to estimate posterior parameter distributions of the models, and the posterior parameter samples are used to evaluate probabilities of the models. The models with explicit representations of soil moisture controls outperform the other models. The models with additional C pools for accumulation of degraded C in the dry zone of the soil pore space result in a higher probability of reproducing the observed Birch pulses. A cross-validation is conducted to explore predictive performance of model averaging and of individual models. The Bayesian framework is mathematically general and can be applied to a wide range of environmental problems.
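The model-comparison step of such a framework can be sketched numerically: posterior model probabilities follow from each model's marginal likelihood (evidence) and prior via Bayes' rule. The function name and log-evidence values below are hypothetical; in practice the evidences would be estimated from the MCMC output.

```python
import numpy as np

def model_probabilities(log_evidence, prior=None):
    """Posterior model probabilities from log marginal likelihoods.

    p(M_k | D) is proportional to p(D | M_k) p(M_k). Uses the
    log-sum-exp trick for numerical stability.
    """
    log_evidence = np.asarray(log_evidence, dtype=float)
    if prior is None:
        prior = np.full(log_evidence.size, 1.0 / log_evidence.size)
    log_post = log_evidence + np.log(prior)
    log_post -= log_post.max()          # stabilise before exponentiating
    w = np.exp(log_post)
    return w / w.sum()

# Hypothetical log evidences for five competing respiration models
probs = model_probabilities([-120.3, -118.9, -119.5, -125.0, -118.2])
print(probs.round(3))
```

The last model dominates because its log evidence is largest; small differences in log evidence translate into large probability ratios.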

  7. Identifying and assessing critical uncertainty thresholds in a forest pest risk model

    Science.gov (United States)

    Frank H. Koch; Denys Yemshanov

    2015-01-01

Pest risk maps can provide helpful decision support for invasive alien species management, but often fail to address adequately the uncertainty associated with their predicted risk values. This chapter explores how increased uncertainty in a risk model’s numeric assumptions (i.e. its principal parameters) might affect the resulting risk map. We used a spatial...

  8. Uncertainty Quantification for Combined Polynomial Chaos Kriging Surrogate Models

    Science.gov (United States)

    Weinmeister, Justin; Gao, Xinfeng; Krishna Prasad, Aditi; Roy, Sourajeet

    2017-11-01

Surrogate modeling techniques are currently used to perform uncertainty quantification on computational fluid dynamics (CFD) models for their ability to identify the most impactful parameters in CFD simulations and to help reduce computational cost in the engineering design process. The accuracy of these surrogate models depends on a number of factors, such as the training data created from the CFD simulations, the target functions, the surrogate model framework, and so on. Recently, we have combined polynomial chaos expansions (PCE) and Kriging to produce a more accurate surrogate model, polynomial chaos Kriging (PCK). In this talk, we analyze the error convergence rates of the Kriging, PCE, and PCK models on a convection-diffusion-reaction problem, and validate the statistical measures and performance of the PCK method for its application to practical CFD simulations.
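A minimal PCE-only surrogate (one of the two ingredients of PCK) can be sketched with NumPy's probabilists' Hermite basis, which is orthogonal under a standard-normal input. The response function here is a made-up stand-in for an expensive CFD output, not anything from the talk.

```python
import numpy as np
from numpy.polynomial import hermite_e as He

rng = np.random.default_rng(0)

def model(xi):
    """Hypothetical smooth response of a standard-normal input."""
    return np.exp(0.3 * xi) + 0.1 * xi**2

# Training data from (expensive) model runs
xi_train = rng.standard_normal(200)
y_train = model(xi_train)

# Least-squares fit of a degree-4 expansion in probabilists' Hermite
# polynomials He_k, which are orthogonal under the Gaussian measure;
# the 0th coefficient is then the PCE estimate of the output mean.
coef = He.hermefit(xi_train, y_train, deg=4)

# Surrogate evaluation at new points is cheap
xi_test = rng.standard_normal(5)
print(np.abs(He.hermeval(xi_test, coef) - model(xi_test)).max())
```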

  9. Quantifying uncertainty, variability and likelihood for ordinary differential equation models

    LENUS (Irish Health Repository)

    Weisse, Andrea Y

    2010-10-28

Background: In many applications, ordinary differential equation (ODE) models are subject to uncertainty or variability in initial conditions and parameters. Both uncertainty and variability can be quantified in terms of a probability density function on the state and parameter space. Results: The partial differential equation that describes the evolution of this probability density function has a form that is particularly amenable to application of the well-known method of characteristics. The value of the density at some point in time is directly accessible by the solution of the original ODE extended by a single extra dimension (for the value of the density). This leads to simple methods for studying uncertainty, variability and likelihood, with significant advantages over more traditional Monte Carlo and related approaches, especially when studying regions with low probability. Conclusions: While such approaches based on the method of characteristics are common practice in other disciplines, their advantages for the study of biological systems have so far remained unrecognized. Several examples illustrate the performance and accuracy of the approach and its limitations.
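The extended-ODE idea can be sketched for a one-dimensional example: for dx/dt = f(x), the density transported along a characteristic obeys d(log rho)/dt = -f'(x), so a single extra state variable suffices. The linear ODE and Gaussian initial density below are an invented illustration chosen because an analytic check exists.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative 1-D ODE dx/dt = f(x) with a standard-normal initial density.
a = 0.8
def f(x):      return -a * x
def fprime(x): return -a

def augmented(t, z):
    """Original ODE plus one extra state for the log-density.

    Along a characteristic, d(log rho)/dt = -f'(x), so the density at
    time t is obtained by integrating one additional equation.
    """
    x, logrho = z
    return [f(x), -fprime(x)]

x0 = 1.5
logrho0 = -0.5 * x0**2 - 0.5 * np.log(2 * np.pi)   # log N(0,1) density at x0
sol = solve_ivp(augmented, (0.0, 2.0), [x0, logrho0], rtol=1e-10, atol=1e-12)

x_T, logrho_T = sol.y[0, -1], sol.y[1, -1]
# Analytic check: x(T) = x0 exp(-aT) and log rho(T) = log rho0 + aT
print(x_T, np.exp(logrho_T))
```

The density grows along this contracting flow because probability mass concentrates toward the origin, exactly as the continuity equation dictates.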

  10. Model Uncertainties for Valencia RPA Effect for MINERvA

    Energy Technology Data Exchange (ETDEWEB)

    Gran, Richard [Univ. of Minnesota, Duluth, MN (United States)

    2017-05-08

This technical note describes the application of the Valencia RPA multi-nucleon effect and its uncertainty to QE reactions from the GENIE neutrino event generator. The analysis of MINERvA neutrino data in the Rodrigues et al., PRL 116, 071802 (2016) paper makes clear the need for an RPA suppression, especially at very low momentum and energy transfer. That published analysis does not constrain the magnitude of the effect; it only tests models with and without the effect against the data. Other MINERvA analyses need an expression of the model uncertainty in the RPA effect. A well-described uncertainty can be used for systematics for unfolding, for model errors in the analysis of non-QE samples, and as input for fitting exercises for model testing or constraining backgrounds. This prescription takes uncertainties on the parameters in the Valencia RPA model and adds a (not-as-tight) constraint from muon capture data. For MINERvA we apply it as a 2D ($q_0$,$q_3$) weight to GENIE events, in lieu of generating full beyond-Fermi-gas quasielastic events. Because it is a weight, it can be applied to the generated and fully Geant4-simulated events used in analysis without a special GENIE sample. For some limited uses, it could be cast as a 1D $Q^2$ weight without much trouble. This procedure is a suitable starting point for NOvA and DUNE, where the energy dependence is modest, but probably not adequate for T2K or MicroBooNE.
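Applying a precomputed 2D ($q_0$,$q_3$) weight table to simulated events can be sketched as below. The grid, the weight values and the suppression region are placeholders, not the actual Valencia RPA weights; the point is only the mechanics of per-event reweighting without regenerating a sample.

```python
import numpy as np

# Hypothetical (q0, q3) weight table on a regular grid (GeV). In the
# real procedure the weights encode the RPA suppression/enhancement and
# its uncertainty band; here they are placeholder numbers.
q0_edges = np.linspace(0.0, 1.0, 11)
q3_edges = np.linspace(0.0, 1.2, 13)
weights = np.ones((10, 12))
weights[:3, :4] = 0.6        # suppress very low (q0, q3), RPA-like

def event_weight(q0, q3):
    """Look up a per-event weight from the 2-D (q0, q3) table."""
    i = np.clip(np.digitize(q0, q0_edges) - 1, 0, weights.shape[0] - 1)
    j = np.clip(np.digitize(q3, q3_edges) - 1, 0, weights.shape[1] - 1)
    return weights[i, j]

# Apply to a batch of simulated QE events
q0_evt = np.array([0.05, 0.45, 0.85])
q3_evt = np.array([0.15, 0.65, 1.15])
print(event_weight(q0_evt, q3_evt))
```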

  11. Uncertainty assessment of integrated distributed hydrological models using GLUE with Markov chain Monte Carlo sampling

    DEFF Research Database (Denmark)

    Blasone, Roberta-Serena; Madsen, Henrik; Rosbjerg, Dan

    2008-01-01

    uncertainty estimation (GLUE) procedure based on Markov chain Monte Carlo sampling is applied in order to improve the performance of the methodology in estimating parameters and posterior output distributions. The description of the spatial variations of the hydrological processes is accounted for by defining......-distributed responses are, however, still quite unexplored. Especially for complex models, rigorous parameterization, reduction of the parameter space and use of efficient and effective algorithms are essential to facilitate the calibration process and make it more robust. Moreover, for these models multi...... the identifiability of the parameters and results in satisfactory multi-variable simulations and uncertainty estimates. However, the parameter uncertainty alone cannot explain the total uncertainty at all the sites, due to limitations in the distributed data included in the model calibration. The study also indicates...

  12. Parameter estimation for groundwater models under uncertain irrigation data

    Science.gov (United States)

    Demissie, Yonas; Valocchi, Albert J.; Cai, Ximing; Brozovic, Nicholas; Senay, Gabriel; Gebremichael, Mekonnen

    2015-01-01

The success of groundwater modeling is strongly influenced by the accuracy of the model parameters that are used to characterize the subsurface system. However, the presence of uncertainty, and possibly bias, in groundwater model source/sink terms may lead to biased estimates of model parameters and model predictions when standard regression-based inverse modeling techniques are used. This study first quantifies the levels of bias in groundwater model parameters and predictions due to the presence of errors in irrigation data. Then, a new inverse modeling technique called input uncertainty weighted least-squares (IUWLS) is presented for unbiased estimation of the parameters when pumping and other source/sink data are uncertain. The approach uses the concept of the generalized least-squares method, with the weight of the objective function depending on the level of pumping uncertainty and iteratively adjusted during the parameter optimization process. We have conducted both analytical and numerical experiments, using irrigation pumping data from the Republican River Basin in Nebraska, to evaluate the performance of the ordinary least-squares (OLS) and IUWLS calibration methods under different levels of uncertainty in the irrigation data and calibration conditions. The result from the OLS method shows the presence of statistically significant bias in model predictions that persists despite calibrating the models to different calibration data and sample sizes. However, by directly accounting for the irrigation pumping uncertainties during the calibration procedures, the proposed IUWLS is able to minimize the bias effectively without adding significant computational burden to the calibration processes.
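The weighting idea behind IUWLS can be illustrated with a one-shot weighted least-squares fit in which each observation is down-weighted according to its input-error variance. The synthetic linear problem, the fixed (non-iterated) weights and the independent errors below are all simplifications of the published method, which iterates the weights during calibration and addresses bias from correlated input errors.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic linear inverse problem y = X b + noise, where the noise
# standard deviation differs per observation (a stand-in for each
# observation's pumping-input uncertainty).
n = 200
X = np.column_stack([np.ones(n), rng.uniform(0, 10, n)])
b_true = np.array([2.0, 0.7])
pump_uncert = rng.uniform(0.1, 2.0, n)          # per-observation input std
y = X @ b_true + rng.normal(0.0, pump_uncert)

# OLS treats all observations alike, ignoring the heteroscedastic error
b_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

# Weighted LS: weight each residual by 1 / (input-error variance),
# i.e. solve (X^T W X) b = X^T W y
W = 1.0 / pump_uncert**2
b_wls = np.linalg.solve((X * W[:, None]).T @ X, (X * W[:, None]).T @ y)

print(b_ols, b_wls)
```

The weighted estimate has a markedly smaller variance than OLS because observations with noisy pumping inputs contribute less to the fit.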

  13. Parameter uncertainties in the design and optimization of cantilever piezoelectric energy harvesters

    Science.gov (United States)

    Franco, V. R.; Varoto, P. S.

    2017-09-01

A crucial issue in piezoelectric energy harvesting is the efficiency of the mechanical-to-electrical conversion process. Several techniques have been investigated in order to obtain a set of optimum design parameters that will lead to the best performance of the harvester in terms of electrical power generation. Once an optimum design is reached, it is also important to consider uncertainties in the selected parameters that in turn can lead to loss of performance in the energy conversion process. The main goal of this paper is to perform a comprehensive discussion of the effects of multi-parameter aleatory uncertainties on the performance and design optimization of a given energy harvesting system. For that, a typical energy harvester consisting of a cantilever beam carrying a tip mass and partially covered by piezoelectric layers on the top and bottom surfaces is considered. A distributed parameter electromechanical model of the harvesting system is formulated and validated through experimental tests. First, SQP (Sequential Quadratic Programming) optimization is employed to obtain an optimum set of parameters that will lead to the best performance of the harvester. Second, once the optimum harvester configuration is found, random perturbations are introduced in the key parameters and Monte Carlo simulations are performed to investigate how these uncertainties propagate and affect the performance of the device studied. Numerically simulated results indicate that small variations in some design parameters can cause a significant variation in the output electrical power, which strongly suggests that uncertainties must be accounted for in the design of beam energy harvesting systems.
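The perturbation study can be sketched generically: perturb an optimized design by a few percent and examine the spread of the output power. The response function below is a hypothetical stand-in for the paper's distributed-parameter beam model; it merely illustrates how small input variations can be amplified by a nonlinear response.

```python
import numpy as np

rng = np.random.default_rng(42)

def output_power(L, t, R):
    """Hypothetical harvester power response near resonance.

    Stand-in expression: power is a smooth nonlinear function of beam
    length L, thickness t and electrical load R (all SI-like units).
    """
    return (L**3 / t**2) / (1.0 + (R - 1e5)**2 / 1e10)

# Nominal (optimised) design and 2% aleatory perturbations
L0, t0, R0 = 0.06, 5e-4, 1e5
N = 20000
L = L0 * (1 + 0.02 * rng.standard_normal(N))
t = t0 * (1 + 0.02 * rng.standard_normal(N))
R = R0 * (1 + 0.02 * rng.standard_normal(N))

P = output_power(L, t, R)
print(P.std() / P.mean())   # coefficient of variation of output power
```

With 2% parameter scatter, the power's coefficient of variation lands around 7%, since P scales as L cubed over t squared, so the sensitivities stack.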

  14. Uncertainty Reduction Via Parameter Design of A Fast Digital Integrator for Magnetic Field Measurement

    CERN Document Server

    Arpaia, P; Lucariello, G; Spiezia, G

    2007-01-01

At the European Organization for Nuclear Research (CERN), within the new Large Hadron Collider (LHC) project, measurements of magnetic flux with an uncertainty of 10 ppm at a few decades of Hz, sustained for several minutes, are required. With this aim, a new Fast Digital Integrator (FDI) has been developed in cooperation with the University of Sannio, Italy [1]. This paper deals with the final design tuning for achieving the target uncertainty by means of experimental statistical parameter design.

  15. Coupled Monte Carlo simulation and Copula theory for uncertainty analysis of multiphase flow simulation models.

    Science.gov (United States)

    Jiang, Xue; Na, Jin; Lu, Wenxi; Zhang, Yu

    2017-11-01

Simulation-optimization techniques are effective in identifying an optimal remediation strategy. Simulation models with uncertainty, primarily in the form of parameter uncertainty with different degrees of correlation, influence the reliability of the optimal remediation strategy. In this study, an approach coupling Monte Carlo simulation and Copula theory is proposed for uncertainty analysis of a simulation model when parameters are correlated. Using the self-adaptive weight particle swarm optimization Kriging method, a surrogate model was constructed to replace the simulation model and reduce the computational burden and time consumption resulting from repeated and multiple Monte Carlo simulations. The Akaike information criterion (AIC) and the Bayesian information criterion (BIC) were employed to identify whether the t Copula function or the Gaussian Copula function is the optimal Copula function to match the dependence structure of the parameters. The results show that both the AIC and BIC values of the t Copula function are less than those of the Gaussian Copula function. This indicates that the t Copula function is the optimal function for matching the dependence structure of the parameters. The outputs of the simulation model when parameter correlation was considered and when it was ignored were compared. The results show that the amplitude of the fluctuation interval when parameter correlation was considered is less than the corresponding amplitude when correlation was ignored. Moreover, it was demonstrated that considering the correlation among parameters is essential for uncertainty analysis of a simulation model, and the results of uncertainty analysis should be incorporated into the remediation strategy optimization process.
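The copula sampling step can be sketched with a Gaussian copula; the paper's AIC/BIC comparison actually selects a t copula, which would additionally require a chi-square mixing variable. The marginal distributions and correlation value below are illustrative, not from the study.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Target correlation between two model parameters, e.g. hydraulic
# conductivity (lognormal marginal) and porosity (beta marginal).
corr = np.array([[1.0, 0.8],
                 [0.8, 1.0]])

# 1. Sample correlated standard normals (the Gaussian copula step)
z = rng.multivariate_normal(np.zeros(2), corr, size=5000)

# 2. Map to uniforms, then through each marginal's inverse CDF
u = stats.norm.cdf(z)
K = stats.lognorm(s=0.5, scale=1e-4).ppf(u[:, 0])
phi = stats.beta(a=4, b=12).ppf(u[:, 1])

# Rank correlation is (approximately) preserved by the monotone maps
rho_s, _ = stats.spearmanr(K, phi)
print(rho_s)
```

Feeding such correlated parameter draws into Monte Carlo runs, instead of independent draws, is what narrows the output fluctuation interval reported in the abstract.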

  16. A Novel SHLNN Based Robust Control and Tracking Method for Hypersonic Vehicle under Parameter Uncertainty

    Directory of Open Access Journals (Sweden)

    Chuanfeng Li

    2017-01-01

Hypersonic vehicle is a typical parameter-uncertain system with significant characteristics of strong coupling, nonlinearity, and external disturbance. In this paper, a combined system modeling approach is proposed to approximate the actual vehicle system. The state feedback control strategy is adopted based on robust guaranteed cost control (RGCC) theory, where the Lyapunov function is applied to derive the control law for the nonlinear system and the problem is transformed into a feasible solution by the linear matrix inequality (LMI) method. In addition, a nonfragile guaranteed cost controller solved by the LMI optimization approach is employed for the linear error system, where a single hidden layer neural network (SHLNN) is employed as an additive gain compensator to reduce excessive performance degradation caused by perturbations and uncertainties. Simulation results show the stability and good tracking performance of the proposed strategy in controlling the vehicle system.

  17. 'spup' - an R package for uncertainty propagation analysis in spatial environmental modelling

    Science.gov (United States)

    Sawicka, Kasia; Heuvelink, Gerard

    2017-04-01

Computer models have become a crucial tool in engineering and the environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Advances in uncertainty propagation and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition as universally applicable or able to handle case studies with spatial models and spatial model inputs. Given the growing popularity and applicability of the open-source R programming language, we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining uncertainty propagation from input data and model parameters, via the environmental model, onto model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo (MC) techniques, as well as several uncertainty visualization functions. Uncertain environmental variables are represented in the package as objects whose attribute values may be uncertain and described by probability distributions. Both numerical and categorical data types are handled. Spatial auto-correlation within an attribute and cross-correlation between attributes are also accommodated. For uncertainty propagation the package implements the MC approach with efficient sampling algorithms, i.e. stratified random sampling and Latin hypercube sampling. The design includes facilitation of parallel computing to speed up MC computation. The MC realizations may be used as input to environmental models called from R, or externally. Selected visualization methods that are understandable by non-experts with limited background in
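Since 'spup' is an R package, the following is only a language-agnostic sketch, in Python, of the Monte Carlo propagation workflow it automates: sample uncertain inputs (here via Latin hypercube sampling), run the model on each realization, and summarize the output distribution. The toy model and input distributions are invented.

```python
import numpy as np
from scipy.stats import norm, qmc

# Latin hypercube sample of two independent uncertain inputs
sampler = qmc.LatinHypercube(d=2, seed=3)
u = sampler.random(n=2000)                 # stratified uniforms in [0, 1)^2

# Uncertainty models for the inputs: Gaussian here, but any marginal
# (or a spatially correlated field) could be substituted.
rain = norm(loc=50.0, scale=5.0).ppf(u[:, 0])        # rainfall depth, mm
runoff_coef = norm(loc=0.3, scale=0.03).ppf(u[:, 1]) # runoff coefficient

def model(rain_mm, coef):
    """Toy environmental model: runoff depth as coefficient * rainfall."""
    return coef * rain_mm

# Propagate: one model run per realization, then summarize
runoff = model(rain, runoff_coef)
print(runoff.mean(), np.percentile(runoff, [2.5, 97.5]))
```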

  18. Laboratory transport experiments with antibiotic sulfadiazine: Experimental results and parameter uncertainty analysis

    Science.gov (United States)

    Sittig, S.; Vrugt, J. A.; Kasteel, R.; Groeneweg, J.; Vereecken, H.

    2011-12-01

    Persistent antibiotics in the soil potentially contaminate the groundwater and affect the quality of drinking water. To improve our understanding of antibiotic transport in soils, we performed laboratory transport experiments in soil columns under constant irrigation conditions with repeated applications of chloride and radio-labeled SDZ. The tracers were incorporated in the first centimeter, either with pig manure or with solution. Breakthrough curves and concentration profiles of the parent compound and the main transformation products were measured. The goal is to describe the observed nonlinear and kinetic transport behavior of SDZ. Our analysis starts with synthetic transport data for the given laboratory flow conditions for tracers which exhibit increasingly complex interactions with the solid phase. This first step is necessary to benchmark our inverse modeling approach for ideal situations. Then we analyze the transport behavior using the column experiments in the laboratory. Our analysis uses a Markov chain Monte Carlo sampler (Differential Evolution Adaptive Metropolis algorithm, DREAM) to efficiently search the parameter space of an advective-dispersion model. Sorption of the antibiotics to the soil was described using a model regarding reversible as well as irreversible sorption. This presentation will discuss our initial findings. We will present the data of our laboratory experiments along with an analysis of parameter uncertainty.
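A stripped-down version of such an MCMC parameter analysis, using a plain Metropolis sampler rather than DREAM (which adds multiple chains with differential-evolution proposals), is sketched below on an invented first-order degradation problem; it is not the advection-dispersion sorption model of the study.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy inverse problem: estimate a first-order transformation rate k and
# initial mass m0 of a compound from noisy observations.
t = np.linspace(0, 10, 25)
k_true, m0_true, sigma = 0.35, 10.0, 0.3
obs = m0_true * np.exp(-k_true * t) + rng.normal(0, sigma, t.size)

def log_post(theta):
    """Gaussian log-likelihood with flat priors on the positive axis."""
    k, m0 = theta
    if k <= 0 or m0 <= 0:
        return -np.inf
    resid = obs - m0 * np.exp(-k * t)
    return -0.5 * np.sum(resid**2) / sigma**2

theta = np.array([0.1, 5.0])      # deliberately poor starting point
lp = log_post(theta)
chain = []
for _ in range(20000):
    prop = theta + rng.normal(0, [0.02, 0.2])
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:   # Metropolis accept rule
        theta, lp = prop, lp_prop
    chain.append(theta)
chain = np.array(chain[5000:])                # discard burn-in
print(chain.mean(axis=0))                     # posterior means for (k, m0)
```

The spread of the retained chain quantifies parameter uncertainty; DREAM improves on this by adapting the proposal from the chain population.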

  19. Discriminative Random Field Models for Subsurface Contamination Uncertainty Quantification

    Science.gov (United States)

    Arshadi, M.; Abriola, L. M.; Miller, E. L.; De Paolis Kaluza, C.

    2017-12-01

    Application of flow and transport simulators for prediction of the release, entrapment, and persistence of dense non-aqueous phase liquids (DNAPLs) and associated contaminant plumes is a computationally intensive process that requires specification of a large number of material properties and hydrologic/chemical parameters. Given its computational burden, this direct simulation approach is particularly ill-suited for quantifying both the expected performance and uncertainty associated with candidate remediation strategies under real field conditions. Prediction uncertainties primarily arise from limited information about contaminant mass distributions, as well as the spatial distribution of subsurface hydrologic properties. Application of direct simulation to quantify uncertainty would, thus, typically require simulating multiphase flow and transport for a large number of permeability and release scenarios to collect statistics associated with remedial effectiveness, a computationally prohibitive process. The primary objective of this work is to develop and demonstrate a methodology that employs measured field data to produce equi-probable stochastic representations of a subsurface source zone that capture the spatial distribution and uncertainty associated with key features that control remediation performance (i.e., permeability and contamination mass). Here we employ probabilistic models known as discriminative random fields (DRFs) to synthesize stochastic realizations of initial mass distributions consistent with known, and typically limited, site characterization data. Using a limited number of full scale simulations as training data, a statistical model is developed for predicting the distribution of contaminant mass (e.g., DNAPL saturation and aqueous concentration) across a heterogeneous domain. Monte-Carlo sampling methods are then employed, in conjunction with the trained statistical model, to generate realizations conditioned on measured borehole data

  20. Effect of Baseflow Separation on Uncertainty of Hydrological Modeling in the Xinanjiang Model

    Directory of Open Access Journals (Sweden)

    Kairong Lin

    2014-01-01

Based on the idea that inputting more available useful information for evaluation yields less uncertainty, this study focuses on how well uncertainty can be reduced by considering the baseflow estimation information obtained from the smoothed minima method (SMM). The Xinanjiang model and the generalized likelihood uncertainty estimation (GLUE) method with the shuffled complex evolution Metropolis (SCEM-UA) sampling algorithm were used for hydrological modeling and uncertainty analysis, respectively. The Jiangkou basin, located in the upper reaches of the Hanjiang River, was selected as the case study. It was found that the number and standard deviation of behavioral parameter sets both decreased when the threshold value for the baseflow efficiency index increased, and high Nash-Sutcliffe efficiency coefficients corresponded well with high baseflow efficiency coefficients. The results also showed that the uncertainty interval width decreased significantly, while the containing ratio did not decrease by much and the simulated runoff with the behavioral parameter sets fit the observed runoff better, when the threshold for the baseflow efficiency index was taken into consideration. These results imply that using baseflow estimation information can reduce the uncertainty in hydrological modeling to some degree and yield more reasonable prediction bounds.
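The GLUE selection step with a dual (total runoff plus baseflow) criterion can be sketched as follows. The toy exponential-recession "model", the thresholds, and the parameter priors are invented, standing in for the Xinanjiang model and SCEM-UA sampling; the point is that adding a baseflow criterion shrinks the behavioral parameter set.

```python
import numpy as np

rng = np.random.default_rng(11)

def nse(sim, obs):
    """Nash-Sutcliffe efficiency."""
    return 1.0 - np.sum((sim - obs)**2) / np.sum((obs - obs.mean())**2)

# Toy runoff series: quickflow recession (rate k) plus slow baseflow
# of magnitude b, observed with noise.
t = np.arange(100.0)
k_true, b_true = 0.08, 2.0
obs_base = b_true * np.exp(-0.01 * t)                  # "SMM" baseflow estimate
obs = 10 * np.exp(-k_true * t) + obs_base + rng.normal(0, 0.15, t.size)

# Uniform priors on (k, b); keep only behavioural parameter sets that
# pass BOTH the total-runoff and the baseflow efficiency thresholds.
samples = rng.uniform([0.02, 0.5], [0.2, 4.0], size=(5000, 2))
behavioural = []
for k, b in samples:
    base = b * np.exp(-0.01 * t)
    sim = 10 * np.exp(-k * t) + base
    if nse(sim, obs) > 0.9 and nse(base, obs_base) > 0.9:
        behavioural.append((k, b))

behavioural = np.array(behavioural)
print(len(behavioural), behavioural.std(axis=0))
```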

  1. Spatial variability and parametric uncertainty in performance assessment models

    International Nuclear Information System (INIS)

    Pensado, Osvaldo; Mancillas, James; Painter, Scott; Tomishima, Yasuo

    2011-01-01

The problem of defining an appropriate treatment of distribution functions (which could represent spatial variability or parametric uncertainty) is examined based on a generic performance assessment model for a high-level waste repository. The generic model incorporated source term models available in GoldSim®, the TDRW code for contaminant transport in sparse fracture networks with a complex fracture-matrix interaction process, and a biosphere dose model known as BDOSE™. Using the GoldSim framework, several Monte Carlo sampling approaches and transport conceptualizations were evaluated to explore the effect of various treatments of spatial variability and parametric uncertainty on dose estimates. Results from a model employing a representative source and ensemble-averaged pathway properties were compared to results from a model allowing for stochastic variation of transport properties along streamline segments (i.e., explicit representation of spatial variability within a Monte Carlo realization). We concluded that the sampling approach and the definition of an ensemble representative do influence consequence estimates. In the examples analyzed in this paper, approaches considering limited variability of a transport resistance parameter along a streamline increased the frequency of fast pathways, resulting in relatively high dose estimates, while those allowing for broad variability along streamlines increased the frequency of 'bottlenecks', reducing dose estimates. On this basis, simplified approaches with limited consideration of variability may suffice for intended uses of the performance assessment model, such as evaluation of site safety. (author)

  2. Characterization uncertainty and its effects on models and performance

    Energy Technology Data Exchange (ETDEWEB)

    Rautman, C.A.; Treadway, A.H.

    1991-01-01

    Geostatistical simulation is being used to develop multiple geologic models of rock properties at the proposed Yucca Mountain repository site. Because each replicate model contains the same known information, and is thus essentially indistinguishable statistically from others, the differences between models may be thought of as representing the uncertainty in the site description. The variability among performance measures, such as ground water travel time, calculated using these replicate models therefore quantifies the uncertainty in performance that arises from uncertainty in site characterization.

  3. Uncertainty in a spatial evacuation model

    Science.gov (United States)

    Mohd Ibrahim, Azhar; Venkat, Ibrahim; Wilde, Philippe De

    2017-08-01

    Pedestrian movements in crowd motion can be perceived in terms of agents who basically exhibit patient or impatient behavior. We model crowd motion subject to exit congestion under uncertainty conditions in a continuous space and compare the proposed model via simulations with the classical social force model. During a typical emergency evacuation scenario, agents might not be able to perceive with certainty the strategies of opponents (other agents) owing to the dynamic changes entailed by the neighborhood of opponents. In such uncertain scenarios, agents will try to update their strategy based on their own rules or their intrinsic behavior. We study risk seeking, risk averse and risk neutral behaviors of such agents via certain game theory notions. We found that risk averse agents tend to achieve faster evacuation time whenever the time delay in conflicts appears to be longer. The results of our simulations also comply with previous work and conform to the fact that evacuation time of agents becomes shorter once mutual cooperation among agents is achieved. Although the impatient strategy appears to be the rational strategy that might lead to faster evacuation times, our study scientifically shows that the more the agents are impatient, the slower is the egress time.

  4. Addressing Conceptual Model Uncertainty in the Evaluation of Model Prediction Errors

    Science.gov (United States)

    Carrera, J.; Pool, M.

    2014-12-01

Model predictions are uncertain because of errors in model parameters, future forcing terms, and model concepts. The latter remain the largest and most difficult to assess source of uncertainty in long term model predictions. We first review existing methods to evaluate conceptual model uncertainty. We argue that they are highly sensitive to the ingenuity of the modeler, in the sense that they rely on the modeler's ability to propose alternative model concepts. Worse, we find that the standard practice of stochastic methods leads to poor, potentially biased and often too optimistic, estimation of actual model errors. This is bad news because stochastic methods are purported to properly represent uncertainty. We contend that the problem does not lie in the stochastic approach itself, but in the way it is applied. Specifically, stochastic inversion methodologies, which demand quantitative information, tend to ignore geological understanding, which is conceptually rich. We illustrate some of these problems with the application to the Mar del Plata aquifer, where extensive data are available for nearly a century. Geologically based models, where spatial variability is handled through zonation, yield calibration fits similar to geostatistically based models, but much better predictions. In fact, the appearance of the stochastic T fields is similar to the geologically based models only in areas with a high density of data. We take this finding to illustrate the ability of stochastic models to accommodate many data, but also, ironically, their inability to address conceptual model uncertainty. In fact, stochastic model realizations tend to be too close to the "most likely" one (i.e., they do not really realize the full conceptual uncertainty). The second part of the presentation is devoted to arguing that acknowledging model uncertainty may lead to qualitatively different decisions than just working with "most likely" model predictions. Therefore, efforts should concentrate on

  5. A GLUE uncertainty analysis of a drying model of pharmaceutical granules

    DEFF Research Database (Denmark)

    Mortier, Séverine Thérèse F.C.; Van Hoey, Stijn; Cierkens, Katrijn

    2013-01-01

    uncertainty) originating from uncertainty in input data, model parameters, model structure, boundary conditions and software. In this paper, the model prediction uncertainty is evaluated for a model describing the continuous drying of single pharmaceutical wet granules in a six-segmented fluidized bed drying...... unit, which is part of the full continuous from-powder-to-tablet manufacturing line (Consigma™, GEA Pharma Systems). A validated model describing the drying behaviour of a single pharmaceutical granule in two consecutive phases is used. First of all, the effect of the assumptions at the particle level...

  6. On how to avoid input and structural uncertainties corrupt the inference of hydrological parameters using a Bayesian framework

    Science.gov (United States)

    Hernández, Mario R.; Francés, Félix

    2015-04-01

One phase of the hydrological model implementation process that contributes significantly to the uncertainty of hydrological predictions is the calibration phase, in which values of the unknown model parameters are tuned by optimizing an objective function. An unsuitable error model (e.g. Standard Least Squares, or SLS) introduces noise into the estimation of the parameters. The main sources of this noise are input errors and hydrological model structural deficiencies. The biased calibrated parameters thus cause the model divergence phenomenon, where the error variance of the (spatially and temporally) forecasted flows far exceeds the error variance in the fitting period, and provoke the loss of part or all of the physical meaning of the modeled processes. In other words, they yield a calibrated hydrological model which works well, but not for the right reasons. Besides, an unsuitable error model yields a non-reliable predictive uncertainty assessment. Hence, with the aim of preventing all these undesirable effects, this research focuses on the Bayesian joint inference (BJI) of both the hydrological and error model parameters, considering a general additive (GA) error model that allows for correlation, non-stationarity (in variance and bias) and non-normality of model residuals. As the hydrological model, a conceptual distributed model called TETIS, with a particular split structure of the effective model parameters, has been used. Bayesian inference has been performed with the aid of a Markov chain Monte Carlo (MCMC) algorithm called DREAM-ZS. The MCMC algorithm quantifies the uncertainty of the hydrological and error model parameters by obtaining the joint posterior probability distribution, conditioned on the observed flows. The BJI methodology is a very powerful and reliable tool, but it must be used correctly: that is, if non-stationarity in error variance and bias is modeled, the Total Laws must be taken into account. The results of this research show that the

  7. A review of different perspectives on uncertainty and risk and an alternative modeling paradigm

    International Nuclear Information System (INIS)

    Samson, Sundeep; Reneke, James A.; Wiecek, Margaret M.

    2009-01-01

    The literature in economics, finance, operations research, engineering and in general mathematics is first reviewed on the subject of defining uncertainty and risk. The review goes back to 1901. Different perspectives on uncertainty and risk are examined and a new paradigm to model uncertainty and risk is proposed using relevant ideas from this study. This new paradigm is used to represent, aggregate and propagate uncertainty and interpret the resulting variability in a challenge problem developed by Oberkampf et al. [2004, Challenge problems: uncertainty in system response given uncertain parameters. Reliab Eng Syst Safety 2004; 85(1): 11-9]. The challenge problem is further extended into a decision problem that is treated within a multicriteria decision making framework to illustrate how the new paradigm yields optimal decisions under uncertainty. The accompanying risk is defined as the probability of an unsatisfactory system response quantified by a random function of the uncertainty

  8. Holistic uncertainty analysis in river basin modeling for climate vulnerability assessment

    Science.gov (United States)

    Taner, M. U.; Wi, S.; Brown, C.

    2017-12-01

    The challenges posed by an uncertain future climate are a prominent concern for water resources managers. A number of frameworks exist for assessing the impacts of climate-related uncertainty, including internal climate variability and anthropogenic climate change, such as scenario-based and vulnerability-based approaches. While in many cases climate uncertainty may be dominant, other factors such as the future evolution of the river basin, the hydrologic response and reservoir operations are potentially significant sources of uncertainty. While uncertainty associated with modeling the hydrologic response has received attention, very little attention has focused on the range of uncertainty and possible effects of the water resources infrastructure and its management. This work presents a holistic framework that allows analysis of climate, hydrologic and water management uncertainty in water resources systems analysis, with the aid of a water system model designed to integrate component models for hydrologic processes and water management activities. The uncertainties explored include those associated with climate variability and change, hydrologic model parameters, and water system operation rules. A Bayesian framework is used to quantify and model the uncertainties at each modeling step in an integrated fashion, including prior and likelihood information about the model parameters. The framework is demonstrated in a case study for the St. Croix Basin, located on the border between the United States and Canada.

  9. Linking Item Response Model Parameters.

    Science.gov (United States)

    van der Linden, Wim J; Barrett, Michelle D

    2016-09-01

    With a few exceptions, the problem of linking item response model parameters from different item calibrations has been conceptualized as an instance of the problem of test equating scores on different test forms. This paper argues, however, that the use of item response models does not require any test score equating. Instead, it involves the necessity of parameter linking due to a fundamental problem inherent in the formal nature of these models: their general lack of identifiability. More specifically, item response model parameters need to be linked to adjust for the different effects of the identifiability restrictions used in separate item calibrations. Our main theorems characterize the formal nature of these linking functions for monotone, continuous response models, derive their specific shapes for different parameterizations of the 3PL model, and show how to identify them from the parameter values of the common items or persons in different linking designs.
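As a minimal illustration of such a linking function: under the common linear transformation of the latent scale, theta2 = A*theta1 + B, 3PL difficulties transform as b2 = A*b1 + B and discriminations as a2 = a1/A. A mean/sigma estimate of A and B from common items might look like the following sketch (synthetic, noise-free numbers, not the paper's derivations):

```python
import numpy as np

# Hypothetical 3PL item parameters (a, b) for common items from two calibrations.
# Scale 2 relates to scale 1 by theta2 = A*theta1 + B, so b2 = A*b1 + B, a2 = a1/A.
b1 = np.array([-1.2, -0.4, 0.3, 1.1, 1.8])
a1 = np.array([0.9, 1.1, 1.4, 1.0, 1.2])
A_true, B_true = 1.5, -0.5
b2 = A_true * b1 + B_true
a2 = a1 / A_true

# Mean/sigma linking: estimate A and B from the common-item difficulties.
A_hat = b2.std() / b1.std()
B_hat = b2.mean() - A_hat * b1.mean()
print(A_hat, B_hat)   # recovers (A_true, B_true) exactly in this noise-free sketch
```

With estimation noise the recovered constants would only be approximate, which is why alternative linking designs and estimators are studied.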

  10. Identification and communication of uncertainties of phenomenological models in PSA

    International Nuclear Information System (INIS)

    Pulkkinen, U.; Simola, K.

    2001-11-01

    This report aims at presenting a view on the uncertainty analysis of phenomenological models, with an emphasis on the identification and documentation of the various types of uncertainties and assumptions in the modelling of the phenomena. In an uncertainty analysis, it is essential to include and document all unclear issues in order to obtain maximal coverage of unresolved issues; this holds regardless of the nature or type of the issues. The classification of uncertainties is needed in the decomposition of the problem, and it helps in the identification of means for uncertainty reduction. Further, enhanced documentation serves to evaluate the applicability of the results to various risk-informed applications. (au)

  11. Uncertainty quantification of voice signal production mechanical model and experimental updating

    OpenAIRE

    Cataldo, Edson; Soize, Christian; Sampaio, Rubens

    2013-01-01

    The aim of this paper is to analyze the uncertainty quantification in a voice production mechanical model and update the probability density function corresponding to the tension parameter using the Bayes method and experimental data. Three parameters are considered uncertain in the voice production mechanical model used: the tension parameter, the neutral glottal area and the subglottal pressure. The tension parameter of the vocal folds is mainly responsible for the c...

  12. Considering the Epistemic Uncertainties of the Variogram Model in Locating Additional Exploratory Drillholes

    Directory of Open Access Journals (Sweden)

    Saeed Soltani

    2015-06-01

    Full Text Available To enhance the certainty of the grade block model, it is necessary to increase the number of exploratory drillholes and collect more data from the deposit. The inputs of the process of locating these additional drillholes include the variogram model parameters, the locations of the samples taken from the initial drillholes, and the geological block model. The uncertainties of these inputs will lead to uncertainties in the optimal locations of the additional drillholes. The locations of the initial data are crisp, but the variogram model parameters and the geological model carry uncertainties due to the limited number of initial data. In this paper, an effort has been made to consider the effects of the variogram uncertainties on the optimal locations of additional drillholes using fuzzy kriging, and to solve the locating problem with the genetic algorithm (GA) optimization method. A bauxite deposit case study has shown the efficiency of the proposed model.

  13. Uncertainty in population growth rates: determining confidence intervals from point estimates of parameters.

    Directory of Open Access Journals (Sweden)

    Eleanor S Devenish Nelson

    Full Text Available BACKGROUND: Demographic models are widely used in conservation and management, and their parameterisation often relies on data collected for other purposes. When underlying data lack clear indications of associated uncertainty, modellers often fail to account for that uncertainty in model outputs, such as estimates of population growth. METHODOLOGY/PRINCIPAL FINDINGS: We applied a likelihood approach to infer uncertainty retrospectively from point estimates of vital rates. Combining this with resampling techniques and projection modelling, we show that confidence intervals for population growth estimates are easy to derive. We used similar techniques to examine the effects of sample size on uncertainty. Our approach is illustrated using data on the red fox, Vulpes vulpes, a predator of ecological and cultural importance, and the most widespread extant terrestrial mammal. We show that uncertainty surrounding estimated population growth rates can be high, even for relatively well-studied populations. Halving that uncertainty typically requires a quadrupling of sampling effort. CONCLUSIONS/SIGNIFICANCE: Our results compel caution when comparing demographic trends between populations without accounting for uncertainty. Our methods will be widely applicable to demographic studies of many species.
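A minimal version of this resampling recipe, using a hypothetical two-stage Leslie matrix with made-up vital-rate point estimates and standard errors (not the paper's fox data), might look like:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical vital-rate point estimates with assumed standard errors.
surv_est, surv_se = np.array([0.55, 0.70]), np.array([0.05, 0.04])  # juvenile, adult survival
fec_est, fec_se = np.array([1.8, 2.4]), np.array([0.2, 0.2])        # per-capita fecundity

def growth_rate(surv, fec):
    """Dominant eigenvalue (lambda) of a two-stage Leslie matrix."""
    L = np.array([[fec[0] * surv[0], fec[1] * surv[1]],
                  [surv[0],          surv[1]]])
    return np.max(np.real(np.linalg.eigvals(L)))

# Parametric resampling: draw vital rates, project, collect growth rates.
lams = np.array([growth_rate(rng.normal(surv_est, surv_se).clip(0, 1),
                             rng.normal(fec_est, fec_se).clip(0, None))
                 for _ in range(5000)])
lo, hi = np.percentile(lams, [2.5, 97.5])
print(round(lo, 2), round(hi, 2))   # 95% confidence interval for population growth
```

The width of this interval shrinks roughly with the square root of the sampling effort behind the vital-rate estimates, which is the "quadrupling" behaviour the abstract describes.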

  14. Modeling Input Errors to Improve Uncertainty Estimates for Sediment Transport Model Predictions

    Science.gov (United States)

    Jung, J. Y.; Niemann, J. D.; Greimann, B. P.

    2016-12-01

    Bayesian methods using Markov chain Monte Carlo algorithms have recently been applied to sediment transport models to assess the uncertainty in the model predictions due to the parameter values. Unfortunately, the existing approaches can only attribute overall uncertainty to the parameters. This limitation is critical because no model can produce accurate forecasts if forced with inaccurate input data, even if the model is well founded in physical theory. In this research, an existing Bayesian method is modified to consider the potential errors in input data during the uncertainty evaluation process. The input error is modeled using Gaussian distributions, and the means and standard deviations are treated as uncertain parameters. The proposed approach is tested by coupling it to the Sedimentation and River Hydraulics - One Dimension (SRH-1D) model and simulating a 23-km reach of the Tachia River in Taiwan. The Wu equation in SRH-1D is used for computing the transport capacity for a bed material load of non-cohesive material. Three types of input data are considered uncertain: (1) the input flowrate at the upstream boundary, (2) the water surface elevation at the downstream boundary, and (3) the water surface elevation at a hydraulic structure in the middle of the reach. The benefits of modeling the input errors in the uncertainty analysis are evaluated by comparing the accuracy of the most likely forecast and the coverage of the observed data by the credible intervals to those of the existing method. The results indicate that the internal boundary condition has the largest uncertainty among those considered. Overall, the uncertainty estimates from the new method are notably different from those of the existing method for both the calibration and forecast periods.

  15. Simulation of corn yields and parameter uncertainty analysis in Hebei and Sichuan, China

    Science.gov (United States)

    Fu, A.; Xue, Y.; Hartman, M. D.; Chandran, A.; Qiu, B.; Liu, Y.

    2016-12-01

    Corn is one of the most important agricultural products in China. Research on the impacts of climate change and human activities on corn yields is important for understanding and mitigating the negative effects of environmental factors on corn yields and maintaining stable corn production. Using climatic data, including daily temperature, precipitation, and solar radiation from 1948 to 2010, soil properties, observed corn yields, and farmland management information, corn yields in Sichuan and Hebei Provinces of China over the past 63 years were simulated using the Daycent model, and the results were evaluated using root mean square error, bias, simulation efficiency, and standard deviation. The primary climatic factors influencing corn yields were examined, the uncertainties of the climatic factors were analyzed, and the uncertainties of human activity parameters were also studied by changing fertilization levels and cultivation practices. The results showed that: (1) the Daycent model is capable of simulating corn yields in Sichuan and Hebei Provinces of China, with observed and simulated corn yields showing a similar increasing trend over time; (2) the minimum daily temperature is the primary factor influencing corn yields in Sichuan, while in Hebei Province daily temperature, precipitation and wind speed significantly affect corn yields; (3) when the global warming trend was removed from the original data, simulated corn yields were lower than before, decreasing by about 687 kg/hm2 from 1992 to 2010; when the fertilization levels and cultivation practices were increased and decreased by 50% and 75%, respectively, in the schedule file of the Daycent model, the simulated corn yields increased by 1206 kg/hm2 and 776 kg/hm2, respectively, with the enhancement of the fertilization level and the improvement of cultivation practice. This study provides a scientific basis for selecting a suitable fertilization level and cultivation practice for corn fields in China.
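The evaluation statistics named above (root mean square error, bias, Nash-Sutcliffe simulation efficiency, standard deviation) can be computed in a few lines; the yield numbers below are invented for illustration:

```python
import numpy as np

def evaluate(obs, sim):
    """Generic goodness-of-fit metrics for simulated vs. observed yields."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    rmse = np.sqrt(np.mean((sim - obs) ** 2))
    bias = np.mean(sim - obs)
    # Nash-Sutcliffe simulation efficiency: 1 is perfect, < 0 is worse than the mean.
    nse = 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)
    return rmse, bias, nse, np.std(sim)

obs = [5200, 5600, 6100, 5900, 6400]   # hypothetical corn yields, kg/hm2
sim = [5000, 5700, 6000, 6100, 6300]
print([round(v, 2) for v in evaluate(obs, sim)])
```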

  16. Differential uncertainty analysis for evaluating the accuracy of S-parameter retrieval methods for electromagnetic properties of metamaterial slabs.

    Science.gov (United States)

    Hasar, Ugur Cem; Barroso, Joaquim J; Sabah, Cumali; Kaya, Yunus; Ertugrul, Mehmet

    2012-12-17

    We apply a complete uncertainty analysis, not previously studied in the literature, to investigate how the retrieved electromagnetic properties of two MM slabs (the first with only split-ring resonators (SRRs) and the second with SRRs and a continuous wire), with single-band and dual-band resonating properties, depend on the measured/simulated scattering parameters, the slab length, and the operating frequency. Such an analysis is necessary for the selection of a suitable retrieval method, together with the correct examination of exotic properties of MM slabs, especially in their resonance regions. For this analysis, a differential uncertainty model is developed to monitor minute changes in the dependent variables (the electromagnetic properties of the MM slabs) as functions of the independent variables (the scattering (S-) parameters, the slab length, and the operating frequency). Two complementary approaches (the analytical approach and the dispersion model approach), each with different strengths, are utilized to retrieve the electromagnetic properties of various MM slabs, which are needed for the application of the uncertainty analysis. We note the following important results from our investigation. First, uncertainties in the retrieved electromagnetic properties of the analyzed MM slabs drastically increase when the values of the electromagnetic properties shrink to zero or near resonance regions where the S-parameters exhibit rapid changes. Second, any low or medium loss inside the MM slabs, due to an imperfect dielectric substrate or the finite conductivity of the metals, can decrease these uncertainties near resonance regions because such losses hinder abrupt changes in the S-parameters. Finally, we note that precise knowledge of the slab length and the operating frequency in particular is a prerequisite for accurate analysis of the exotic electromagnetic properties of MM slabs (especially multiband MM slabs) near resonance regions.
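A generic numerical version of such a differential uncertainty model, the worst-case sum of |∂f/∂x_i|·|Δx_i| with central-difference gradients, can be sketched as follows. The retrieval function and all numbers are hypothetical, chosen only to show how the uncertainty grows as an S-parameter magnitude shrinks:

```python
import numpy as np

def propagate(f, x, dx):
    """Worst-case first-order uncertainty: sum_i |df/dx_i| * |dx_i| (central differences)."""
    x = np.asarray(x, float)
    total = 0.0
    for i in range(len(x)):
        h = 1e-6 * max(1.0, abs(x[i]))   # relative step keeps the gradients well scaled
        xp, xm = x.copy(), x.copy()
        xp[i] += h
        xm[i] -= h
        total += abs((f(xp) - f(xm)) / (2.0 * h)) * abs(dx[i])
    return total

# Toy retrieval: a property depending on a measured S-parameter magnitude s,
# the slab length L, and the operating frequency f0 (all names hypothetical).
def retrieved(p):
    s, L, f0 = p
    return np.log(1.0 / s) / (L * f0)    # sensitivity sharpens as s -> 0 (near resonance)

for s in (0.5, 0.05):
    u = propagate(retrieved, [s, 0.01, 10e9], [0.01, 1e-4, 1e6])
    print(s, u)   # the propagated uncertainty grows sharply as |S| shrinks
```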

  17. Impact on Model Uncertainty of Diabatization in Distillation Columns

    DEFF Research Database (Denmark)

    Bisgaard, Thomas; Huusom, Jakob Kjøbsted; Abildskov, Jens

    2014-01-01

    This work provides uncertainty and sensitivity analysis of the design of conventional and heat-integrated distillation columns using Monte Carlo simulations. Selected uncertain parameters are the relative volatility, the heat of vaporization, the overall heat transfer coefficient, tray hold-up, and adiabat ...
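A Monte Carlo treatment of a single uncertain design parameter can be sketched with the Fenske minimum-stage equation; the specification and the relative-volatility distribution below are illustrative, not the paper's column:

```python
import numpy as np

rng = np.random.default_rng(2)

# Distillate / bottoms light-key mole fractions (illustrative specification).
xD, xB = 0.95, 0.05

# Uncertain relative volatility, sampled for the Monte Carlo loop.
alpha = rng.normal(2.5, 0.2, 10000)
alpha = alpha[alpha > 1.0]          # the Fenske equation requires alpha > 1

# Fenske equation for the minimum number of equilibrium stages.
n_min = np.log((xD / (1 - xD)) * ((1 - xB) / xB)) / np.log(alpha)

print(round(float(n_min.mean()), 2), round(float(np.percentile(n_min, 97.5)), 2))
```

The spread of `n_min` across the samples is the design uncertainty attributable to the volatility alone; the study applies the same idea jointly over several thermodynamic and hydraulic parameters.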

  18. Calibration under uncertainty for finite element models of masonry monuments

    Energy Technology Data Exchange (ETDEWEB)

    Atamturktur, Sezer,; Hemez, Francois,; Unal, Cetin

    2010-02-01

    Historical unreinforced masonry buildings often include features such as load-bearing unreinforced masonry vaults and their supporting framework of piers, fill, buttresses, and walls. The masonry vaults of such buildings are among the most vulnerable structural components and certainly among the most challenging to analyze. The versatility of finite element (FE) analyses in incorporating various constitutive laws, as well as practically all geometric configurations, has resulted in the widespread use of the FE method for the analysis of complex unreinforced masonry structures over the last three decades. However, an FE model is only as accurate as its input parameters, and there are two fundamental challenges when defining FE model input parameters: (1) material properties and (2) support conditions. The difficulty in defining these two aspects of the FE model arises from gaps in the common engineering understanding of masonry behavior. As a result, engineers are unable to define these FE model input parameters with certainty, and, inevitably, uncertainties are introduced into the FE model.

  19. Effect of information, uncertainty and parameter variability on profits in a queue with various pricing strategies

    Science.gov (United States)

    Sun, Wei; Li, Shiyong

    2014-08-01

    This paper presents an unobservable single-server queueing system with three types of uncertainty, where the service rate, waiting cost or service quality is a random variable that may take n (n > 2) values. The information about the realised values of the parameters is known only to the server. We are concerned with the server's behaviour: revealing or concealing this information to customers. The n-value assumption and the server's behaviour enable us to consider various pricing strategies. In this paper, we analyse the effect of information and uncertainty on profits and compare the profits under different pricing strategies. Moreover, regarding parameter variability, reflected by the number n of each parameter's possible values, we observe the effect of varying n on all types of profits and find that revealing the parameter information benefits the server increasingly as n grows.

  20. Simultaneous inference for model averaging of derived parameters

    DEFF Research Database (Denmark)

    Jensen, Signe Marie; Ritz, Christian

    2015-01-01

    Model averaging is a useful approach for capturing uncertainty due to model selection. Currently, this uncertainty is often quantified by means of approximations that do not easily extend to simultaneous inference. Moreover, in practice there is a need for both model averaging and simultaneous inference for derived parameters calculated in an after-fitting step. We propose a method for obtaining asymptotically correct standard errors for one or several model-averaged estimates of derived parameters and for obtaining simultaneous confidence intervals that asymptotically control the family...
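A common concrete instance of model averaging for a derived parameter uses Akaike weights and a Buckland-type model-averaged standard error (this is a standard textbook recipe, not necessarily the paper's asymptotic method); the estimates, standard errors and AIC values below are invented for illustration:

```python
import numpy as np

# Hypothetical estimates of one derived parameter from three candidate models,
# with per-model standard errors and AIC values.
est = np.array([1.10, 1.25, 0.95])
se = np.array([0.10, 0.15, 0.12])
aic = np.array([100.2, 101.5, 104.0])

# Akaike weights.
d = aic - aic.min()
w = np.exp(-0.5 * d) / np.exp(-0.5 * d).sum()

avg = np.sum(w * est)
# Model-averaged standard error, capturing both within-model variance
# and the between-model spread of the estimates.
se_avg = np.sum(w * np.sqrt(se ** 2 + (est - avg) ** 2))
print(round(avg, 3), round(se_avg, 3))
```

Note that `se_avg` exceeds a naive weighted average of the per-model standard errors whenever the models disagree, which is exactly the selection uncertainty being captured.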

  1. Imprecision and Uncertainty in the UFO Database Model.

    Science.gov (United States)

    Van Gyseghem, Nancy; De Caluwe, Rita

    1998-01-01

    Discusses how imprecision and uncertainty are dealt with in the UFO (Uncertainty and Fuzziness in an Object-oriented) database model. Such information is expressed by means of possibility distributions, and modeled by means of the proposed concept of "role objects." The role objects model uncertain, tentative information about objects,…

  2. Reducing uncertainty in within-host parameter estimates of influenza infection by measuring both infectious and total viral load.

    Directory of Open Access Journals (Sweden)

    Stephen M Petrie

    Full Text Available For in vivo studies of influenza dynamics where within-host measurements are fit with a mathematical model, infectivity assays (e.g. 50% tissue culture infectious dose; TCID50 are often used to estimate the infectious virion concentration over time. Less frequently, measurements of the total (infectious and non-infectious viral particle concentration (obtained using real-time reverse transcription-polymerase chain reaction; rRT-PCR have been used as an alternative to infectivity assays. We investigated the degree to which measuring both infectious (via TCID50 and total (via rRT-PCR viral load allows within-host model parameters to be estimated with greater consistency and reduced uncertainty, compared with fitting to TCID50 data alone. We applied our models to viral load data from an experimental ferret infection study. Best-fit parameter estimates for the "dual-measurement" model are similar to those from the TCID50-only model, with greater consistency in best-fit estimates across different experiments, as well as reduced uncertainty in some parameter estimates. Our results also highlight how variation in TCID50 assay sensitivity and calibration may hinder model interpretation, as some parameter estimates systematically vary with known uncontrolled variations in the assay. Our techniques may aid in drawing stronger quantitative inferences from in vivo studies of influenza virus dynamics.

  3. Plasticity models of material variability based on uncertainty quantification techniques

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Reese E. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Rizzi, Francesco [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Boyce, Brad [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Templeton, Jeremy Alan [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Ostien, Jakob [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    2017-11-01

    The advent of fabrication techniques like additive manufacturing has focused attention on the considerable variability of material response due to defects and other micro-structural aspects. This variability motivates the development of an enhanced design methodology that incorporates inherent material variability to provide robust predictions of performance. In this work, we develop plasticity models capable of representing the distribution of mechanical responses observed in experiments using traditional plasticity models of the mean response and recently developed uncertainty quantification (UQ) techniques. Lastly, we demonstrate that the new method provides predictive realizations that are superior to more traditional ones, and how these UQ techniques can be used in model selection and assessing the quality of calibrated physical parameters.

  4. Bayesian and Frequentist Methods for Estimating Joint Uncertainty of Freundlich Adsorption Isotherm Fitting Parameters

    Science.gov (United States)

    In this paper, we present methods for estimating Freundlich isotherm fitting parameters (K and N) and their joint uncertainty, which have been implemented into the freeware software platforms R and WinBUGS. These estimates were determined by both Frequentist and Bayesian analyse...
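A simple Frequentist version of this fit, linearizing q = K·C^N and reading the joint (K, N) uncertainty off the least-squares covariance matrix, might look like the following sketch with synthetic data (numpy instead of the paper's R/WinBUGS implementations):

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical isotherm data generated from q = K * C**N with K = 3.0, N = 0.6.
C = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0])
q = 3.0 * C ** 0.6 * np.exp(rng.normal(0.0, 0.05, C.size))   # multiplicative noise

# Linearize: log q = log K + N log C, then ordinary least squares.
X = np.column_stack([np.ones_like(C), np.log(C)])
y = np.log(q)
beta = np.linalg.lstsq(X, y, rcond=None)[0]
logK, N = beta

# Joint uncertainty of (log K, N): covariance from the residual variance;
# the off-diagonal entry quantifies how strongly the two estimates co-vary.
dof = len(y) - 2
s2 = np.sum((y - X @ beta) ** 2) / dof
cov = s2 * np.linalg.inv(X.T @ X)

print(round(float(np.exp(logK)), 2), round(float(N), 2))
```

A Bayesian analysis would replace the covariance matrix with a joint posterior over (K, N), but the correlation structure it reveals is the same.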

  5. The Generic Containment SB-LOCA accident simulation: Comparison of the parameter uncertainties and user-effect

    International Nuclear Information System (INIS)

    Povilaitis, Mantas; Kelm, Stephan; Urbonavičius, Egidijus

    2017-01-01

    Highlights: • Uncertainty and sensitivity analysis for the Generic Containment severe accident. • Comparison of the analysis results with the uncertainties caused by the user effect. • Demonstration of the similar importance of reducing both the user effect and input uncertainties. - Abstract: Uncertainties in the safety assessment of nuclear power plants using computer codes come from several sources: the choice of computer code, the user effect (a strong impact of user choices on the simulation’s outcome) and the uncertainty of various physical parameters. The “Generic Containment” activity was performed in the frame of the EU-FP7 project SARNET2 to investigate the influence of the user effect and the choice of computer code on the results at nuclear power plant scale. During this activity, a Generic Containment nodalisation was developed and used by the participants applying various computer codes. Even though the model of the Generic Containment and the transient scenario were precisely and uniquely defined, considerably different results were obtained not only among different codes but also among participants using the same code, showing a significant user effect. This paper presents an analysis that extends the “Generic Containment” benchmark and investigates the effect of input parameter uncertainties in comparison to the user effect. Calculations were performed using the computer code ASTEC, and the uncertainty and sensitivity of the results were estimated using the GRS method and the tool SUSA. The results of the present analysis show that, while there are differences between the uncertainty bands of the parameters, in general the deviation bands caused by parameter uncertainties and by the user effect are comparable and of the same order. The properties of concrete and the surface areas may have more influence on containment pressure than the user effect and choice of computer code as identified in the SARNET2 Generic

  6. Corruption of parameter behavior and regionalization by model and forcing data errors: A Bayesian example using the SNOW17 model

    Science.gov (United States)

    He, Minxue; Hogue, Terri S.; Franz, Kristie J.; Margulis, Steven A.; Vrugt, Jasper A.

    2011-07-01

    The current study evaluates the impacts of various sources of uncertainty involved in hydrologic modeling on parameter behavior and regionalization utilizing different Bayesian likelihood functions and the Differential Evolution Adaptive Metropolis (DREAM) algorithm. The developed likelihood functions differ in their underlying assumptions and treatment of error sources. We apply the developed method to a snow accumulation and ablation model (National Weather Service SNOW17) and generate parameter ensembles to predict snow water equivalent (SWE). Observational data include precipitation and air temperature forcing along with SWE measurements from 24 sites with diverse hydroclimatic characteristics. A multiple linear regression model is used to construct regionalization relationships between model parameters and site characteristics. Results indicate that model structural uncertainty has the largest influence on SNOW17 parameter behavior. Precipitation uncertainty is the second largest source of uncertainty, showing greater impact at wetter sites. Measurement uncertainty in SWE tends to have little impact on the final model parameters and resulting SWE predictions. Considering all sources of uncertainty, parameters related to air temperature and snowfall fraction exhibit the strongest correlations to site characteristics. Parameters related to the length of the melting period also show high correlation to site characteristics. Finally, model structural uncertainty and precipitation uncertainty dramatically alter parameter regionalization relationships in comparison to cases where only uncertainty in model parameters or output measurements is considered. Our results demonstrate that accurate treatment of forcing, parameter, model structural, and calibration data errors is critical for deriving robust regionalization relationships.

  7. Incorporating model uncertainty into optimal insurance contract design

    OpenAIRE

    Pflug, G.; Timonina-Farkas, A.; Hochrainer-Stigler, S.

    2017-01-01

    In stochastic optimization models, the optimal solution heavily depends on the selected probability model for the scenarios. However, the scenario models are typically chosen on the basis of statistical estimates and are therefore subject to model error. We demonstrate here how the model uncertainty can be incorporated into the decision making process. We use a nonparametric approach for quantifying the model uncertainty and a minimax setup to find model-robust solutions. The method is illust...

  8. Assessment and Reduction of Model Parametric Uncertainties: A Case Study with A Distributed Hydrological Model

    Science.gov (United States)

    Gan, Y.; Liang, X. Z.; Duan, Q.; Xu, J.; Zhao, P.; Hong, Y.

    2017-12-01

    The uncertainties associated with the parameters of a hydrological model need to be quantified and reduced for it to be useful for operational hydrological forecasting and decision support. An uncertainty quantification framework is presented to facilitate practical assessment and reduction of model parametric uncertainties. A case study, using the distributed hydrological model CREST for daily streamflow simulation during the period 2008-2010 over ten watersheds, was used to demonstrate the performance of this new framework. Model behaviors across watersheds were analyzed by a two-stage stepwise sensitivity analysis procedure, using the LH-OAT method to screen out insensitive parameters, followed by MARS-based Sobol' sensitivity indices to quantify each parameter's contribution to the response variance through its first-order and higher-order effects. Pareto optimal sets of the influential parameters were then found by an adaptive surrogate-based multi-objective optimization procedure, using a MARS model to approximate the parameter-response relationship and the SCE-UA algorithm to search for the optimal parameter sets of the adaptively updated surrogate model. The final optimal parameter sets were validated against the daily streamflow simulation of the same watersheds during the period 2011-2012. The stepwise sensitivity analysis procedure efficiently reduced the number of parameters that need to be calibrated from twelve to seven, which helps to limit the dimensionality of the calibration problem and serves to enhance the efficiency of parameter calibration. The adaptive MARS-based multi-objective calibration exercise provided satisfactory solutions to the reproduction of the observed streamflow for all watersheds. The final optimal solutions showed significant improvement when compared to the default solutions, with about a 65-90% reduction in 1-NSE and a 60-95% reduction in |RB|. The validation exercise indicated a large improvement in model performance with about 40
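The screening stage of such a procedure can be illustrated with a bare-bones one-at-a-time (OAT) sensitivity sweep around random baseline points, in the spirit of LH-OAT; the toy response function is hypothetical, not CREST:

```python
import numpy as np

rng = np.random.default_rng(4)

def model(p):
    """Toy response standing in for a streamflow objective function (not CREST)."""
    return 3.0 * p[0] + p[1] ** 2 + 0.01 * p[2] + 0.0 * p[3]

n_par, n_base = 4, 50
base = rng.uniform(0.0, 1.0, (n_base, n_par))   # random baseline points in [0, 1]^4

# One-at-a-time perturbations around each baseline (the idea behind LH-OAT screening).
delta = 0.05
effects = np.zeros(n_par)
for b in base:
    y0 = model(b)
    for i in range(n_par):
        bp = b.copy()
        bp[i] += delta
        effects[i] += abs(model(bp) - y0) / delta   # local elementary effect
effects /= n_base

print(np.argsort(effects)[::-1])   # parameters ranked from most to least influential
```

Parameters landing at the bottom of this ranking (here the third and fourth) are the ones a stepwise procedure would freeze at default values before the surrogate-based multi-objective calibration.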

  9. Impact of nuclear data uncertainties on neutronics parameters of MYRRHA/XT-ADS

    International Nuclear Information System (INIS)

    Sugawara, T.; Stankovskiy, A.; Van den Eynde, G.; Sarotto, M.

    2011-01-01

    A flexible fast-spectrum research reactor, MYRRHA, able to operate in subcritical (driven by a proton accelerator) and critical modes, is being developed at SCK-CEN. In the framework of the IP EUROTRANS programme, the XT-ADS model has been investigated for MYRRHA. This paper reports the comparison of sensitivity coefficients calculated for different calculation models and of the uncertainties deduced from various covariance data, for a discussion of the reliability of the XT-ADS neutronics design. The sensitivity analysis is based on the comparison of three-dimensional heterogeneous and two-dimensional RZ calculation models. Three covariance data sets were employed to perform the uncertainty analysis. The obtained sensitivity coefficients differ substantially between the 3D heterogeneous and RZ homogenized calculation models, and the uncertainties deduced from the covariance data depend strongly on the covariance data used. The covariance data of the nuclear data libraries thus remain an open issue in assessing the reliability of the neutronics design. The uncertainties deduced from the covariance data for XT-ADS are 0.94% and 1.9% with the SCALE-6 44-group and TENDL-2009 covariance data, respectively. These uncertainties exceed the 0.3% Δk (confidence level 1σ) target accuracy. To achieve this target accuracy, the uncertainties should be reduced through experiments under adequate conditions, such as an LBE- or Pb-moderated environment with MOX or uranium fuel
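The propagation step behind such numbers is the standard "sandwich rule", uncertainty² = SᵀCS, combining sensitivity coefficients with a covariance matrix; the values below are illustrative, not the MYRRHA/XT-ADS data:

```python
import numpy as np

# Sandwich-rule propagation of nuclear-data covariance to a k-eff uncertainty:
# (delta k / k)^2 = S^T C S, with made-up numbers for three reaction channels.
S = np.array([0.30, -0.15, 0.05])           # relative sensitivity coefficients
C = np.array([[4.0e-4, 1.0e-4, 0.0],         # relative covariance matrix of the
              [1.0e-4, 9.0e-4, 0.0],         # underlying cross-section data
              [0.0,    0.0,    1.0e-4]])

rel_unc = np.sqrt(S @ C @ S)
print(round(100 * rel_unc, 2), "%")          # percent uncertainty on k-eff
```

Swapping in a different covariance library changes only `C`, which is exactly why the deduced uncertainty varies so strongly between SCALE-6 and TENDL-2009 data.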

  10. SR-Can. Data and uncertainty assessment. Migration parameters for the bentonite buffer in the KBS-3 concept

    International Nuclear Information System (INIS)

    Ochs, Michael; Talerico, Caterina

    2004-08-01

    SKB is currently preparing license applications related to the deep repository for spent nuclear fuel and an encapsulation plant. The present report is one of several specific data reports feeding into the interim reporting for the latter application; it is concerned with the derivation and recommendation of radionuclide migration input parameters for an MX-80 bentonite buffer to PA models. Recommended values for the following parameters, as well as the associated uncertainties, are derived and documented for a total of 38 elements and oxidation states: diffusion-available porosity (ε), effective diffusivity (De), and distribution coefficient (Kd). Because of the conditional nature of these parameters, particularly of Kd, they were derived specifically for the conditions expected to be relevant for PA consequence calculations. Kd values were generally evaluated for the specific porewater composition and solid/water ratio representative of MX-80 compacted to 1,590 kg/m3. Because of the highly conditional nature of Kd, this was done for several porewater compositions that reflect possible variations in geochemical boundary conditions. De and ε were derived as a function of density. Parameter derivation was based on systematic datasets available in the literature and/or on thermodynamic models. Associated uncertainties were assessed for a given set of PA conditions and as a function of variability in these conditions. In a final step, apparent diffusivity (Da) values were calculated from the recommended parameters and compared with independent experimental measurements to arrive at self-consistent sets of migration parameters
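The apparent diffusivity computed in that final step follows the standard porous-media retardation relation Da = De / (ε + ρd·Kd); the parameter values below are placeholders, not the report's recommended values:

```python
# Apparent diffusivity from migration parameters: Da = De / (eps + rho_d * Kd).
# Illustrative numbers only; the report tabulates element-specific values.
De = 1.0e-10      # effective diffusivity, m2/s
eps = 0.43        # diffusion-available porosity, dimensionless
rho_d = 1590.0    # dry density of compacted MX-80, kg/m3
Kd = 0.05         # distribution coefficient, m3/kg

Da = De / (eps + rho_d * Kd)
print(f"{Da:.2e} m2/s")   # strong sorption (large Kd) sharply lowers Da
```

Comparing such computed `Da` values against directly measured apparent diffusivities is the self-consistency check the report describes.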

  11. The Impact of Model and Rainfall Forcing Errors on Characterizing Soil Moisture Uncertainty in Land Surface Modeling

    Science.gov (United States)

    Maggioni, V.; Anagnostou, E. N.; Reichle, R. H.

    2013-01-01

    The contribution of rainfall forcing errors relative to model (structural and parameter) uncertainty in the prediction of soil moisture is investigated by integrating the NASA Catchment Land Surface Model (CLSM), forced with hydro-meteorological data, in the Oklahoma region. Rainfall-forcing uncertainty is introduced using a stochastic error model that generates ensemble rainfall fields from satellite rainfall products. The ensemble satellite rain fields are propagated through CLSM to produce soil moisture ensembles. Errors in CLSM are modeled with two different approaches: either by perturbing model parameters (representing model parameter uncertainty) or by adding randomly generated noise (representing model structure and parameter uncertainty) to the model prognostic variables. Our findings highlight that the method currently used in the NASA GEOS-5 Land Data Assimilation System to perturb CLSM variables poorly describes the uncertainty in the predicted soil moisture, even when combined with rainfall model perturbations. On the other hand, by adding model parameter perturbations to rainfall forcing perturbations, a better characterization of uncertainty in soil moisture simulations is observed. Specifically, an analysis of the rank histograms shows that the most consistent ensemble of soil moisture is obtained by combining rainfall and model parameter perturbations. When rainfall forcing and model prognostic perturbations are added, the rank histogram shows a U-shape at the domain average scale, which corresponds to a lack of variability in the forecast ensemble. The more accurate estimation of the soil moisture prediction uncertainty obtained by combining rainfall and parameter perturbations is encouraging for the application of this approach in ensemble data assimilation systems.
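    The rank-histogram diagnostic used above to judge ensemble consistency can be sketched on synthetic data; an under-dispersive ensemble produces the U-shape described in the abstract.

    ```python
    import numpy as np

    def rank_histogram(observations, ensemble):
        """observations: (T,), ensemble: (T, N). Returns counts of the
        observation's rank among the N ensemble members (N+1 bins)."""
        T, N = ensemble.shape
        ranks = (ensemble < observations[:, None]).sum(axis=1)
        return np.bincount(ranks, minlength=N + 1)

    rng = np.random.default_rng(0)
    obs = rng.normal(size=2000)
    # Under-dispersive ensemble (spread too small relative to the truth):
    # observations frequently fall outside the ensemble envelope, so the
    # extreme rank bins dominate and the histogram is U-shaped.
    ens = rng.normal(scale=0.3, size=(2000, 10))
    counts = rank_histogram(obs, ens)
    ```

    A flat histogram would instead indicate a statistically consistent ensemble, which is the target the combined rainfall-plus-parameter perturbations approach.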

  12. Large-Scale Transport Model Uncertainty and Sensitivity Analysis: Distributed Sources in Complex Hydrogeologic Systems

    International Nuclear Information System (INIS)

    Sig Drellack, Lance Prothro

    2007-01-01

    The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial setting of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently-derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. Advancing multiple alternative flow models, sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations. The
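    The Monte Carlo treatment of parameter uncertainty can be sketched generically with a one-dimensional travel-time surrogate and hypothetical parameter distributions; the UGTA transport models are of course far more elaborate.

    ```python
    import numpy as np

    # Generic Monte Carlo sketch: propagate parameter uncertainty
    # (hydraulic conductivity K, effective porosity n) through a simple
    # 1-D advective travel-time surrogate t = n * L / (K * i).
    # All values are hypothetical.
    rng = np.random.default_rng(42)
    L, i = 5000.0, 1e-3             # path length [m], hydraulic gradient [-]
    K = rng.lognormal(mean=np.log(1e-5), sigma=1.0, size=10_000)  # [m/s]
    n = rng.uniform(0.05, 0.25, size=10_000)                      # [-]
    travel_time_yr = n * L / (K * i) / 3.15e7

    # Report percentiles rather than a single deterministic estimate.
    p5, p50, p95 = np.percentile(travel_time_yr, [5, 50, 95])
    ```

    The spread between the percentiles is the quantity of regulatory interest: it shows how parameter uncertainty alone translates into a wide range of predicted transport outcomes.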

  13. Evaluation of uncertainties of key neutron parameters of PWR-type reactors with slab fuel, application to neutronic conformity

    International Nuclear Information System (INIS)

    Bernard, D.

    2001-12-01

    The aim of this thesis was to evaluate uncertainties in key neutron parameters of slab reactors. These uncertainties have several origins: technological, for fabrication parameters, and physical, for nuclear data. First, each contribution to the uncertainty is calculated; then an uncertainty factor is associated with each key slab parameter, such as reactivity, isothermal reactivity coefficient, control rod efficiency, power form factor before irradiation, and lifetime. These uncertainty factors were computed by Generalized Perturbation Theory at step 0 and by direct calculations for irradiation problems. One application to neutronic conformity concerned the adjustment of precision targets for fabrication and nuclear data. Statistical (uncertainties) and deterministic (deviations) approaches were studied. As a result, the uncertainties in the key slab neutronics parameters were reduced and nuclear performances were optimized. (author)

  14. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Dai, Heng [Pacific Northwest National Laboratory, Richland Washington USA; Ye, Ming [Department of Scientific Computing, Florida State University, Tallahassee Florida USA; Walker, Anthony P. [Environmental Sciences Division and Climate Change Science Institute, Oak Ridge National Laboratory, Oak Ridge Tennessee USA; Chen, Xingyuan [Pacific Northwest National Laboratory, Richland Washington USA

    2017-04-01

    Hydrological models are always composed of multiple components that represent processes key to intended model applications. When a process can be simulated by multiple conceptual-mathematical models (process models), model uncertainty in representing the process arises. While global sensitivity analysis methods have been widely used for identifying important processes in hydrologic modeling, the existing methods consider only parametric uncertainty but ignore the model uncertainty in process representation. To address this problem, this study develops a new method to probe multimodel process sensitivity by integrating model averaging methods into the framework of variance-based global sensitivity analysis, given that model averaging methods quantify both parametric and model uncertainty. A new process sensitivity index is derived as a metric of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and model parameters. For demonstration, the new index is used to evaluate the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is likewise simulated by two models with different parameterizations of hydraulic conductivity; each process model has its own random parameters. The new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.
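    The idea of a variance-based index that folds process-model choice into the usual parametric sensitivity can be sketched with a toy two-process system; the models, weights, and parameters below are hypothetical, not those of the synthetic study.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Toy system with two processes, each representable by two alternative
    # process models with their own random parameter (all hypothetical).
    def recharge(model, p):
        return 0.1 * p if model == 0 else 0.05 * p + 0.2

    def geology(model, q):
        return 1.0 + q if model == 0 else 2.0 * q

    # Process sensitivity of "recharge": variance of the conditional mean
    # of the output over recharge settings (model choice AND parameter),
    # normalized by total variance -- a variance-based index that includes
    # process-model uncertainty alongside parametric uncertainty.
    N, M = 200, 200
    cond_means, all_y = [], []
    for _ in range(N):
        r_model, p = rng.integers(2), rng.normal(1.0, 0.2)
        ys = [recharge(r_model, p) * geology(rng.integers(2), rng.normal(1.0, 0.2))
              for _ in range(M)]
        cond_means.append(np.mean(ys))
        all_y.extend(ys)

    ps_recharge = np.var(cond_means) / np.var(all_y)   # in [0, 1]
    ```

    An index near 1 would mean that fixing the recharge process (its model and parameter) removes most of the output variance, marking recharge as the dominant process.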

  15. Addressing imperfect maintenance modelling uncertainty in unavailability and cost based optimization

    International Nuclear Information System (INIS)

    Sanchez, Ana; Carlos, Sofia; Martorell, Sebastian; Villanueva, Jose F.

    2009-01-01

    Optimization of testing and maintenance activities performed in the different systems of a complex industrial plant is of great interest, as plant availability and economy strongly depend on the maintenance activities planned. Traditionally, two types of models, i.e. deterministic and probabilistic, have been considered to simulate the impact of testing and maintenance activities on equipment unavailability and the cost involved. Both models present uncertainties that are often categorized as either aleatory or epistemic. The second group applies when there is limited knowledge of the proper model to represent a problem and/or of the values associated with the model parameters, so the results of calculations performed with them incorporate uncertainty. This paper addresses the problem of testing and maintenance optimization based on unavailability and cost criteria, considering epistemic uncertainty in the imperfect maintenance modelling. It is framed as a multiple-criteria decision-making problem where unavailability and cost act as uncertain and conflicting decision criteria. A tolerance-interval-based approach is used to address uncertainty with regard to the effectiveness parameter and the imperfect maintenance model, embedded within a multiple-objective genetic algorithm. A case of application for a stand-by safety-related system of a nuclear power plant is presented. The results obtained in this application show the importance of considering uncertainties in the modelling of imperfect maintenance, as the optimal solutions found are associated with a large uncertainty that influences the final decision making depending on, for example, whether the decision maker is risk averse or risk neutral.
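    The role of the maintenance effectiveness parameter can be illustrated with a toy proportional-age-setback model, a common imperfect-maintenance formulation; the Weibull-type hazard, interval, and cost figures below are hypothetical, not the paper's model.

    ```python
    import math

    def average_unavailability(T, eps, beta=2.0, lam=1e-4, horizon=1e5):
        """Toy imperfect-maintenance sketch: maintenance every T hours
        reduces the effective age by factor (1 - eps), where eps is the
        maintenance effectiveness. Returns the average per-interval
        unreliability from a Weibull-type cumulative hazard."""
        age, down = 0.0, 0.0
        n = int(horizon // T)
        for _ in range(n):
            # cumulative hazard accumulated over one interval from 'age'
            h = (lam * (age + T)) ** beta - (lam * age) ** beta
            down += 1.0 - math.exp(-h)
            age = (1.0 - eps) * (age + T)   # imperfect restoration
        return down / n

    def maintenance_cost(T, c_m=1.0, horizon=1e5):
        return c_m * (horizon / T)          # maintenance cost only (toy)

    # More effective maintenance (larger eps) lowers unavailability
    # at equal maintenance cost.
    u_good = average_unavailability(5000, 0.9)
    u_poor = average_unavailability(5000, 0.2)
    ```

    Treating eps as uncertain (e.g. via a tolerance interval) would spread each candidate schedule over a band of (unavailability, cost) outcomes, which is exactly why the optimal solutions carry the large uncertainty the authors emphasize.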

  16. Addressing imperfect maintenance modelling uncertainty in unavailability and cost based optimization

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez, Ana [Department of Statistics and Operational Research, Polytechnic University of Valencia, Camino de Vera, s/n, 46071 Valencia (Spain); Carlos, Sofia [Department of Chemical and Nuclear Engineering, Polytechnic University of Valencia, Camino de Vera, s/n, 46071 Valencia (Spain); Martorell, Sebastian [Department of Chemical and Nuclear Engineering, Polytechnic University of Valencia, Camino de Vera, s/n, 46071 Valencia (Spain)], E-mail: smartore@iqn.upv.es; Villanueva, Jose F. [Department of Chemical and Nuclear Engineering, Polytechnic University of Valencia, Camino de Vera, s/n, 46071 Valencia (Spain)

    2009-01-15

    Optimization of testing and maintenance activities performed in the different systems of a complex industrial plant is of great interest, as plant availability and economy strongly depend on the maintenance activities planned. Traditionally, two types of models, i.e. deterministic and probabilistic, have been considered to simulate the impact of testing and maintenance activities on equipment unavailability and the cost involved. Both models present uncertainties that are often categorized as either aleatory or epistemic. The second group applies when there is limited knowledge of the proper model to represent a problem and/or of the values associated with the model parameters, so the results of calculations performed with them incorporate uncertainty. This paper addresses the problem of testing and maintenance optimization based on unavailability and cost criteria, considering epistemic uncertainty in the imperfect maintenance modelling. It is framed as a multiple-criteria decision-making problem where unavailability and cost act as uncertain and conflicting decision criteria. A tolerance-interval-based approach is used to address uncertainty with regard to the effectiveness parameter and the imperfect maintenance model, embedded within a multiple-objective genetic algorithm. A case of application for a stand-by safety-related system of a nuclear power plant is presented. The results obtained in this application show the importance of considering uncertainties in the modelling of imperfect maintenance, as the optimal solutions found are associated with a large uncertainty that influences the final decision making depending on, for example, whether the decision maker is risk averse or risk neutral.

  17. Theoretical uncertainties of the Duflo–Zuker shell-model mass formulae

    International Nuclear Information System (INIS)

    Qi, Chong

    2015-01-01

    It is becoming increasingly important to understand the uncertainties of nuclear mass model calculations and their limitations when extrapolating to the driplines. In this paper we evaluate the parameter uncertainties of the Duflo–Zuker (DZ) shell-model mass formulae by fitting to the latest experimental mass compilation AME2012, using least-squares and minimax fitting procedures. We also analyze the propagation of these uncertainties in binding energy calculations when extrapolating to the driplines. The parameter uncertainties and their propagation are evaluated with the help of the covariance matrix thus derived. Large deviations from the extrapolations of AME2012 are seen in superheavy nuclei. A simplified version of the DZ model (DZ19), with much smaller uncertainties than those of DZ33, is proposed. Calculations are compared with results from other mass formulae. Systematics of the uncertainty propagation as well as the positions of the driplines are also presented. The DZ mass formulae are shown to be well defined, with good extrapolation properties and rather small uncertainties, even though some of the parameters of the full DZ33 model cannot be fully determined by fitting to the available experimental data. (paper)
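    The covariance-matrix machinery behind such an analysis can be sketched for a generic linear least-squares fit: the parameter covariance is C = σ²(XᵀX)⁻¹ and the propagated 1σ uncertainty of a prediction gᵀθ is √(gᵀCg). This is a toy two-parameter fit, not the DZ formulae themselves.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy "mass formula": y = a*x1 + b*x2, fitted to noisy data.
    X = rng.normal(size=(50, 2))            # design matrix (sensitivities)
    true = np.array([15.0, -3.0])
    y = X @ true + rng.normal(scale=0.5, size=50)

    theta, res, *_ = np.linalg.lstsq(X, y, rcond=None)
    sigma2 = res[0] / (50 - 2)              # residual variance estimate
    C = sigma2 * np.linalg.inv(X.T @ X)     # parameter covariance matrix

    # Propagate to an "extrapolated" point g (e.g., toward the dripline):
    g = np.array([2.0, 5.0])
    pred = g @ theta
    pred_sigma = float(np.sqrt(g @ C @ g))  # 1-sigma prediction uncertainty
    ```

    The growth of pred_sigma as g moves away from the fitted data is the mechanism behind the large extrapolation uncertainties seen for superheavy nuclei.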

  18. Uncertainty models applied to the substation planning

    Energy Technology Data Exchange (ETDEWEB)

    Fontoura Filho, Roberto N. [ELETROBRAS, Rio de Janeiro, RJ (Brazil); Aires, Joao Carlos O.; Tortelly, Debora L.S. [Light Servicos de Eletricidade S.A., Rio de Janeiro, RJ (Brazil)

    1994-12-31

    The selection of reinforcements for a power system expansion becomes a difficult task in an environment of uncertainties. These uncertainties can be classified according to their sources as endogenous and exogenous. The former are associated with the elements of the generation, transmission and distribution systems. The exogenous uncertainties are associated with external aspects, such as financial resources, the time needed to build the installations, equipment prices and the load level. The load uncertainty is extremely sensitive to the behaviour of economic conditions. Although uncertainty cannot be removed completely, endogenous uncertainty can be conveniently treated and exogenous uncertainty can be compensated for. This paper describes an uncertainty-treatment methodology and a practical application to a group of substations belonging to LIGHT, the Rio de Janeiro electric utility. The equipment-performance uncertainty is treated by adopting a probabilistic approach. The uncertainty associated with load growth is considered through technical analysis of scenarios and choice criteria based on Decision Theory. In this paper the Savage Method and the Fuzzy Set Method were used to select the best medium-term reinforcement plan. (author) 7 refs., 4 figs., 6 tabs.
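    The Savage (minimax-regret) criterion used in the scenario analysis can be sketched on a hypothetical cost matrix of reinforcement plans versus load-growth scenarios; the numbers are illustrative only.

    ```python
    import numpy as np

    # Rows: candidate reinforcement plans; columns: load-growth scenarios.
    # Entries: hypothetical total costs (lower is better).
    costs = np.array([
        [100.0, 180.0, 260.0],   # minimal reinforcement
        [130.0, 150.0, 210.0],   # moderate reinforcement
        [170.0, 175.0, 180.0],   # heavy reinforcement
    ])

    # Savage criterion: regret = cost minus the best achievable cost in
    # each scenario; choose the plan whose worst-case regret is smallest.
    regret = costs - costs.min(axis=0)
    worst_regret = regret.max(axis=1)
    best_plan = int(np.argmin(worst_regret))
    ```

    Here the moderate plan wins: it is never the cheapest, but its maximum regret across scenarios is the smallest, which is exactly the hedge against load uncertainty the Savage Method provides.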

  19. Uncertainty and target accuracy studies for the very high temperature reactor(VHTR) physics parameters.

    Energy Technology Data Exchange (ETDEWEB)

    Taiwo, T. A.; Palmiotti, G.; Aliberti, G.; Salvatores, M.; Kim, T.K.

    2005-09-16

    The potential impact of nuclear data uncertainties on a number of performance parameters (core and fuel cycle) of the prismatic block-type Very High Temperature Reactor (VHTR) has been evaluated and results are presented in this report. An uncertainty analysis has been performed, based on sensitivity theory, which underlines which cross sections, in what energy range, and for which isotopes are responsible for the most significant uncertainties. In order to give guidelines on priorities for new evaluations or validation experiments, required accuracies on specific nuclear data have been derived, accounting for target accuracies on major design parameters. Results of an extensive analysis indicate that only a limited number of relevant parameters do not meet the target accuracies assumed in this work; this does not imply that the existing nuclear cross-section data cannot be used for the feasibility and pre-conceptual assessments of the VHTR. However, the results obtained depend on the uncertainty data used, and it is suggested to focus some future evaluation work on the production of consistent, as far as possible complete, and user-oriented covariance data.

  20. Estimated Frequency Domain Model Uncertainties used in Robust Controller Design

    DEFF Research Database (Denmark)

    Tøffner-Clausen, S.; Andersen, Palle; Stoustrup, Jakob

    1994-01-01

    This paper deals with the combination of system identification and robust controller design. Recent results on estimation of frequency domain model uncertainty are…

  1. Some concepts of model uncertainty for performance assessments of nuclear waste repositories

    International Nuclear Information System (INIS)

    Eisenberg, N.A.; Sagar, B.; Wittmeyer, G.W.

    1994-01-01

    Models of the performance of nuclear waste repositories will be central to making regulatory decisions regarding the safety of such facilities. The conceptual model of repository performance is represented by mathematical relationships, which are usually implemented as one or more computer codes. A geologic system may allow many conceptual models which are consistent with the observations. These conceptual models may or may not have the same mathematical representation. Experience in modeling the performance of a waste repository (which is, in part, a geologic system) shows that this non-uniqueness of conceptual models is a significant source of model uncertainty. At the same time, each conceptual model has its own set of parameters and, usually, it is not possible to completely separate model uncertainty from parameter uncertainty for the repository system. Issues related to the origin of model uncertainty, its relation to parameter uncertainty, and its incorporation in safety assessments are discussed from a broad regulatory perspective. An extended example in which these issues are explored numerically is also provided.

  2. DAKOTA : a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Eldred, Michael Scott; Vigil, Dena M.; Dalbey, Keith R.; Bohnhoff, William J.; Adams, Brian M.; Swiler, Laura Painton; Lefantzi, Sophia (Sandia National Laboratories, Livermore, CA); Hough, Patricia Diane (Sandia National Laboratories, Livermore, CA); Eddy, John P.

    2011-12-01

    The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a theoretical manual for selected algorithms implemented within the DAKOTA software. It is not intended as a comprehensive theoretical treatment, since a number of existing texts cover general optimization theory, statistical analysis, and other introductory topics. Rather, this manual is intended to summarize a set of DAKOTA-related research publications in the areas of surrogate-based optimization, uncertainty quantification, and optimization under uncertainty that provide the foundation for many of DAKOTA's iterative analysis capabilities.
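    The simulator/iterator separation that such a toolkit automates can be illustrated with a plain-Python sampling study; this is a generic sketch of the idea, not DAKOTA's actual input syntax or API, and the model and distributions are hypothetical.

    ```python
    import numpy as np

    # Generic sketch of a sampling-based UQ study around a black-box model,
    # in the spirit of separating the simulation code from the iterator.
    def simulation(x):
        """Stand-in for an expensive simulation code; x = (x1, x2)."""
        return x[0] ** 2 + np.sin(x[1])

    rng = np.random.default_rng(7)
    n = 1000
    samples = np.column_stack([
        rng.normal(1.0, 0.1, n),        # uncertain input 1 (normal)
        rng.uniform(0.0, np.pi, n),     # uncertain input 2 (uniform)
    ])
    outputs = np.array([simulation(x) for x in samples])
    mean, std = outputs.mean(), outputs.std(ddof=1)
    ```

    In a real DAKOTA study the sampling loop, statistics, and restart handling are supplied by the framework, and the simulation is an external code invoked through an interface definition rather than a Python function.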

  3. Investigating the Propagation of Meteorological Model Uncertainty for Tracer Modeling

    Science.gov (United States)

    Lopez-Coto, I.; Ghosh, S.; Karion, A.; Martin, C.; Mueller, K. L.; Prasad, K.; Whetstone, J. R.

    2016-12-01

    The North-East Corridor project aims to use a top-down inversion method to quantify sources of Greenhouse Gas (GHG) emissions in the urban areas of Washington DC and Baltimore at approximately 1-km2 resolution. The aim of this project is to help establish reliable measurement methods for quantifying and validating GHG emissions independently of the inventory methods typically used to guide mitigation efforts. Since inversion methods depend strongly on atmospheric transport modeling, analyzing the uncertainties in the meteorological fields and their propagation through the sensitivities of observations to surface fluxes (footprints) is a fundamental step. To this end, six configurations of the Weather Research and Forecasting Model (WRF-ARW) version 3.8 were used to generate an ensemble of meteorological simulations. Specifically, we used 4 planetary boundary layer parameterizations (YSU, MYNN2, BOULAC, QNSE), 2 sources of initial and boundary conditions (NARR and HRRR) and 1 configuration including the building energy parameterization (BEP) urban canopy model. The simulations were compared with more than 150 meteorological surface stations, a wind profiler and radiosondes for one month (February 2016) to account for the uncertainties and the ensemble spread for wind speed, direction and mixing height. In addition, we used the Stochastic Time-Inverted Lagrangian Transport model (STILT) to derive the sensitivity of 12 hypothetical observations to surface emissions (footprints) with each WRF configuration. The footprints and integrated sensitivities were compared and the resulting uncertainties estimated.

  4. Impact of dose-distribution uncertainties on rectal ntcp modeling I: Uncertainty estimates

    International Nuclear Information System (INIS)

    Fenwick, John D.; Nahum, Alan E.

    2001-01-01

    A trial of nonescalated conformal versus conventional radiotherapy treatment of prostate cancer has been carried out at the Royal Marsden NHS Trust (RMH) and Institute of Cancer Research (ICR), demonstrating a significant reduction in the rate of rectal bleeding reported for patients treated using the conformal technique. The relationship between planned rectal dose-distributions and incidences of bleeding has been analyzed, showing that the rate of bleeding falls significantly as the extent of the rectal wall receiving a planned dose-level of more than 57 Gy is reduced. Dose-distributions delivered to the rectal wall over the course of radiotherapy treatment inevitably differ from planned distributions, due to sources of uncertainty such as patient setup error, rectal wall movement and variation in the absolute rectal wall surface area. In this paper estimates of the differences between planned and treated rectal dose-distribution parameters are obtained for the RMH/ICR nonescalated conformal technique, working from a distribution of setup errors observed during the RMH/ICR trial, movement data supplied by Lebesque and colleagues derived from repeat CT scans, and estimates of rectal circumference variations extracted from the literature. Setup errors and wall movement are found to cause only limited systematic differences between mean treated and planned rectal dose-distribution parameter values, but introduce considerable uncertainties into the treated values of some dose-distribution parameters: setup errors lead to 22% and 9% relative uncertainties in the highly dosed fraction of the rectal wall and the wall average dose, respectively, with wall movement leading to 21% and 9% relative uncertainties. Estimates obtained from the literature of the uncertainty in the absolute surface area of the distensible rectal wall are of the order of 13%-18%. In a subsequent paper the impact of these uncertainties on analyses of the relationship between incidences of bleeding

  5. Dimensionality reduction for uncertainty quantification of nuclear engineering models.

    Energy Technology Data Exchange (ETDEWEB)

    Roderick, O.; Wang, Z.; Anitescu, M. (Mathematics and Computer Science)

    2011-01-01

    The task of uncertainty quantification consists of relating the available information on uncertainties in the model setup to the resulting variation in the outputs of the model. Uncertainty quantification plays an important role in complex simulation models of nuclear engineering, where better understanding of uncertainty results in greater confidence in the model and in the improved safety and efficiency of engineering projects. In our previous work, we have shown that the effect of uncertainty can be approximated by polynomial regression with derivatives (PRD): a hybrid regression method that uses first-order derivatives of the model output as additional fitting conditions for a polynomial expansion. Numerical experiments have demonstrated the advantage of this approach over classical methods of uncertainty analysis: in precision, computational efficiency, or both. To obtain derivatives, we used automatic differentiation (AD) on the simulation code; hand-coded derivatives are acceptable for simpler models. We now present improvements on the method. We use a tuned version of the method of snapshots, a technique based on proper orthogonal decomposition (POD), to set up the reduced order representation of essential information on uncertainty in the model inputs. The automatically obtained sensitivity information is required to set up the method. Dimensionality reduction in combination with PRD allows analysis on a larger dimension of the uncertainty space (>100), at modest computational cost.
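    The PRD idea of using first-order derivatives as additional fitting conditions can be sketched as an augmented least-squares system: value rows and derivative rows share the same polynomial coefficients. This is a toy one-dimensional quadratic in which hand-coded derivatives stand in for AD output.

    ```python
    import numpy as np

    # Fit y ~ c0 + c1*x + c2*x^2 using both function values and derivatives
    # (the PRD idea): derivative rows y' ~ c1 + 2*c2*x share the coefficients.
    def model(x):
        return 1.0 + 2.0 * x + 3.0 * x ** 2

    def dmodel(x):                     # "AD-supplied" first derivative
        return 2.0 + 6.0 * x

    xs = np.array([0.0, 0.5, 1.0])
    A_val = np.column_stack([np.ones_like(xs), xs, xs ** 2])            # value rows
    A_der = np.column_stack([np.zeros_like(xs), np.ones_like(xs), 2 * xs])  # d/dx rows
    A = np.vstack([A_val, A_der])
    b = np.concatenate([model(xs), dmodel(xs)])

    coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
    ```

    Each model evaluation with derivatives contributes d+1 fitting conditions instead of 1, which is what lets PRD build accurate response surfaces from far fewer expensive simulation runs.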

  6. Impact of AMS-02 Measurements on Reducing GCR Model Uncertainties

    Science.gov (United States)

    Slaba, T. C.; O'Neill, P. M.; Golge, S.; Norbury, J. W.

    2015-01-01

    For vehicle design, shield optimization, mission planning, and astronaut risk assessment, the exposure from galactic cosmic rays (GCR) poses a significant and complex problem both in low Earth orbit and in deep space. To address this problem, various computational tools have been developed to quantify the exposure and risk in a wide range of scenarios. Generally, the tool used to describe the ambient GCR environment provides the input into subsequent computational tools and is therefore a critical component of end-to-end procedures. Over the past few years, several researchers have independently and very carefully compared some of the widely used GCR models to more rigorously characterize model differences and quantify uncertainties. All of the GCR models studied rely heavily on calibrating to available near-Earth measurements of GCR particle energy spectra, typically over restricted energy regions and short time periods. In this work, we first review recent sensitivity studies quantifying the ions and energies in the ambient GCR environment of greatest importance to exposure quantities behind shielding. Currently available measurements used to calibrate and validate GCR models are also summarized within this context. It is shown that the AMS-II measurements will fill a critically important gap in the measurement database. The emergence of AMS-II measurements also provides a unique opportunity to validate existing models against measurements that were not used to calibrate free parameters in the empirical descriptions. Discussion is given regarding rigorous approaches to implement the independent validation efforts, followed by recalibration of empirical parameters.

  7. Comparison of evidence theory and Bayesian theory for uncertainty modeling

    International Nuclear Information System (INIS)

    Soundappan, Prabhu; Nikolaidis, Efstratios; Haftka, Raphael T.; Grandhi, Ramana; Canfield, Robert

    2004-01-01

    This paper compares Evidence Theory (ET) and Bayesian Theory (BT) for uncertainty modeling and decision under uncertainty, when the evidence about uncertainty is imprecise. The basic concepts of ET and BT are introduced and the ways these theories model uncertainties, propagate them through systems and assess the safety of these systems are presented. ET and BT approaches are demonstrated and compared on challenge problems involving an algebraic function whose input variables are uncertain. The evidence about the input variables consists of intervals provided by experts. It is recommended that a decision-maker compute both the Bayesian probabilities of the outcomes of alternative actions and their plausibility and belief measures when evidence about uncertainty is imprecise, because this helps assess the importance of imprecision and the value of additional information. Finally, the paper presents and demonstrates a method for testing approaches for decision under uncertainty in terms of their effectiveness in making decisions.
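    Interval evidence of the kind described above can be turned into belief and plausibility numbers with a minimal Dempster-Shafer computation: Bel(A) sums the masses of intervals contained in A, Pl(A) sums the masses of intervals intersecting A. The intervals and masses here are hypothetical expert inputs.

    ```python
    # Dempster-Shafer sketch: expert-supplied intervals with basic mass
    # assignments; belief/plausibility that the quantity lies in target A.
    evidence = [((0.0, 2.0), 0.5),   # (interval, mass) -- hypothetical
                ((1.0, 3.0), 0.3),
                ((2.5, 4.0), 0.2)]

    def belief(A, evidence):
        lo, hi = A
        return sum(m for (a, b), m in evidence if lo <= a and b <= hi)

    def plausibility(A, evidence):
        lo, hi = A
        return sum(m for (a, b), m in evidence if a <= hi and b >= lo)

    A = (0.0, 2.5)
    bel, pl = belief(A, evidence), plausibility(A, evidence)
    ```

    The gap between Bel and Pl quantifies the imprecision in the evidence; a Bayesian analysis would instead commit to a single probability somewhere inside that interval.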

  8. RESONANCE SELF-SHIELDING EFFECT IN UNCERTAINTY QUANTIFICATION OF FISSION REACTOR NEUTRONICS PARAMETERS

    Directory of Open Access Journals (Sweden)

    GO CHIBA

    2014-06-01

    In order to properly quantify fission reactor neutronics parameter uncertainties, we have to use covariance data and sensitivity profiles consistently. In the present paper, we establish two consistent methodologies for uncertainty quantification: a self-shielded cross section-based consistent methodology and an infinitely-diluted cross section-based consistent methodology. With these methodologies and the covariance data of uranium-238 nuclear data given in JENDL-3.3, we quantify uncertainties of infinite neutron multiplication factors of light water reactor and fast reactor fuel cells. While an inconsistent methodology gives results which depend on the energy group structure of neutron flux and neutron-nuclide reaction cross section representation, both the consistent methodologies give fair results with no such dependences.
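    Whatever methodology is chosen, the underlying uncertainty quantification combines sensitivities and covariances through the standard "sandwich" rule, (Δk/k)² = SᵀMS, with S the sensitivity profile and M the relative covariance matrix. The three-group numbers below are illustrative, not the JENDL-3.3 uranium-238 data.

    ```python
    import numpy as np

    # Sandwich rule for nuclear-data uncertainty propagation:
    # relative variance of k = S^T M S, with S the sensitivity profile
    # (dk/k per dsigma/sigma, by energy group) and M the relative
    # covariance matrix of the cross section (hypothetical values).
    S = np.array([0.05, 0.20, 0.10])            # 3-group sensitivities
    M = np.array([[4.0e-4, 1.0e-4, 0.0],
                  [1.0e-4, 9.0e-4, 2.0e-4],
                  [0.0,    2.0e-4, 1.6e-3]])    # relative covariance

    rel_var = float(S @ M @ S)
    rel_unc_pct = 100.0 * np.sqrt(rel_var)      # 1-sigma dk/k in percent
    ```

    The consistency issue the paper addresses is precisely whether S and M refer to the same cross-section representation (self-shielded versus infinitely diluted); mixing the two in this product is what makes the result depend on the group structure.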

  9. Uncertainty shocks in a model of effective demand

    OpenAIRE

    Bundick, Brent; Basu, Susanto

    2014-01-01

    Can increased uncertainty about the future cause a contraction in output and its components? An identified uncertainty shock in the data causes significant declines in output, consumption, investment, and hours worked. Standard general-equilibrium models with flexible prices cannot reproduce this comovement. However, uncertainty shocks can easily generate comovement with countercyclical markups through sticky prices. Monetary policy plays a key role in offsetting the negative impact of uncert...

  10. Uncertainty modelling of atmospheric dispersion by stochastic ...

    Indian Academy of Sciences (India)

    Sensitivity and uncertainty of atmospheric dispersion using fuzzy set theory can be found in Chutia et al (2013). … uncertainties have been presented, will facilitate the decision makers in the said field to take a decision on the quality of the air if … Annals of Fuzzy Mathematics and Informatics 5(1): 213–22. Chutia R, Mahanta S …

  11. Modeling uncertainty in requirements engineering decision support

    Science.gov (United States)

    Feather, Martin S.; Maynard-Zhang, Pedrito; Kiper, James D.

    2005-01-01

    One inherent characteristic of requirements engineering is a lack of certainty during this early phase of a project. Nevertheless, decisions about requirements must be made in spite of this uncertainty. Here we describe the context in which we are exploring this, and some initial work to support the elicitation of uncertain requirements and to deal with the combination of such information from multiple stakeholders.

  12. [Application of an uncertainty model for fibromyalgia].

    Science.gov (United States)

    Triviño Martínez, Ángeles; Solano Ruiz, M Carmen; Siles González, José

    2016-04-01

    To explore the experiences of women diagnosed with fibromyalgia by applying the Theory of Uncertainty proposed by M. Mishel. A qualitative study was conducted using a phenomenological approach, at a patients' association in the province of Alicante between June 2012 and November 2013. A total of 14 women diagnosed with fibromyalgia, aged between 45 and 65 years, participated in the study as volunteers. Information was generated through structured interviews that were recorded and transcribed, after a confidentiality pledge and informed consent. Content analysis was performed by extracting categories according to the proposed theory. The patients studied perceive a high level of uncertainty related to the difficulty of dealing with symptoms, uncertainty about the diagnosis, and the complexity of treatment. Moreover, their ability to cope with the disease is influenced by social support, relationships with health professionals, and the help and information provided by patient associations. Health professionals must provide clear information on the pathology to fibromyalgia sufferers: the greater the patients' knowledge of their disease and the better the quality of the information provided, the less anxiety and uncertainty they experience. Likewise, patient associations should involve health professionals in order to avoid bias in the information and to offer advice backed by scientific evidence. Copyright © 2015 Elsevier España, S.L.U. All rights reserved.

  13. Uncertainty modelling of atmospheric dispersion by stochastic ...

    Indian Academy of Sciences (India)

    discharges and related regulated pollution criteria for the marine environment. An Integrated. Simulation-Assessment Approach (ISAA) (Yang et al 2010) is developed to systematically tackle multiple uncertainties associated with hydrocarbon contaminant transport in subsurface and assessment of carcinogenic health risk ...

  14. Uncertainties in modeling hazardous gas releases for emergency response

    Directory of Open Access Journals (Sweden)

    Kathrin Baumann-Stanzer

    2011-02-01

    Full Text Available In case of an accidental release of toxic gases, the emergency responders need fast information about the affected area and the maximum impact. Hazard distances calculated with the models MET, ALOHA, BREEZE, TRACE and SAMS for scenarios with chlorine, ammonia and butane releases are compared in this study. The variations of the model results are measures of the uncertainties in source estimation and dispersion calculation. Model runs for different wind speeds, atmospheric stabilities and roughness lengths indicate the model sensitivity to these input parameters. In-situ measurements at two urban near-traffic sites are compared to results of the Integrated Nowcasting through Comprehensive Analysis (INCA) in order to quantify uncertainties in the meteorological input. The hazard zone estimates from the models vary by up to a factor of 4 due to different input requirements as well as different internal model assumptions. None of the models is found to be 'more conservative' than the others in all scenarios. INCA wind speeds are correlated with in-situ observations at two urban sites in Vienna with a correlation coefficient of 0.89. The standard deviations of the normal error distribution are 0.8 m s⁻¹ in wind speed, on the order of 50 degrees in wind direction, up to 4 °C in air temperature and up to 10% in relative humidity. The observed air temperature and humidity are well reproduced by INCA, with correlation coefficients of 0.96 to 0.99. INCA is therefore found to give a good representation of the local meteorological conditions. In addition to real-time data, the INCA short-range forecasts for the following hours may support the action planning of the first responders.

  15. Uncertainties in modeling hazardous gas releases for emergency response

    Energy Technology Data Exchange (ETDEWEB)

    Baumann-Stanzer, Kathrin; Stenzel, Sirma [Zentralanstalt fuer Meteorologie und Geodynamik, Vienna (Austria)

    2011-02-15

    In case of an accidental release of toxic gases, the emergency responders need fast information about the affected area and the maximum impact. Hazard distances calculated with the models MET, ALOHA, BREEZE, TRACE and SAMS for scenarios with chlorine, ammonia and butane releases are compared in this study. The variations of the model results are measures of the uncertainties in source estimation and dispersion calculation. Model runs for different wind speeds, atmospheric stabilities and roughness lengths indicate the model sensitivity to these input parameters. In-situ measurements at two urban near-traffic sites are compared to results of the Integrated Nowcasting through Comprehensive Analysis (INCA) in order to quantify uncertainties in the meteorological input. The hazard zone estimates from the models vary by up to a factor of 4 due to different input requirements as well as different internal model assumptions. None of the models is found to be 'more conservative' than the others in all scenarios. INCA wind speeds are correlated with in-situ observations at two urban sites in Vienna with a correlation coefficient of 0.89. The standard deviations of the normal error distribution are 0.8 m s⁻¹ in wind speed, on the order of 50 degrees in wind direction, up to 4 °C in air temperature and up to 10% in relative humidity. The observed air temperature and humidity are well reproduced by INCA, with correlation coefficients of 0.96 to 0.99. INCA is therefore found to give a good representation of the local meteorological conditions. In addition to real-time data, the INCA short-range forecasts for the following hours may support the action planning of the first responders. (orig.)

  16. Uncertainty and sensitivity analysis: Mathematical model of coupled heat and mass transfer for a contact baking process

    DEFF Research Database (Denmark)

    Feyissa, Aberham Hailu; Gernaey, Krist; Adler-Nissen, Jens

    2012-01-01

    transfer model of a contact baking process. The Monte Carlo procedure was applied for propagating uncertainty in the input parameters to uncertainty in the model predictions. Monte Carlo simulations and the least squares method were used in the sensitivity analysis: for each model output, a linear...... be used to prioritize future experimental efforts, as discussed for the contact baking process....
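
    The Monte Carlo propagation step described above can be sketched as follows. The model function and the input distributions here are illustrative stand-ins, not the paper's heat and mass transfer model:

```python
import random
import statistics

# Hypothetical stand-in for the process model: any deterministic function
# mapping uncertain input parameters to a scalar prediction.
def model(k_thermal, h_transfer):
    return 100.0 * k_thermal / (k_thermal + 0.1 * h_transfer)

random.seed(1)
N = 10_000
# Sample each uncertain input from an assumed normal distribution
# (means and standard deviations are illustrative, not the paper's values).
samples = [(random.gauss(0.5, 0.05), random.gauss(25.0, 2.5)) for _ in range(N)]
outputs = [model(k, h) for k, h in samples]

# The spread of the outputs is the propagated input uncertainty.
mean = statistics.fmean(outputs)
std = statistics.stdev(outputs)
print(f"prediction: {mean:.2f} +/- {std:.2f}")
```

    The same output sample can then feed a regression-based sensitivity analysis, as the abstract describes.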

  17. Reservoir management under geological uncertainty using fast model update

    NARCIS (Netherlands)

    Hanea, R.; Evensen, G.; Hustoft, L.; Ek, T.; Chitu, A.; Wilschut, F.

    2015-01-01

    Statoil is implementing "Fast Model Update (FMU)," an integrated and automated workflow for reservoir modeling and characterization. FMU connects all steps and disciplines from seismic depth conversion to prediction and reservoir management taking into account relevant reservoir uncertainty. FMU

  18. Dynamic modeling of predictive uncertainty by regression on absolute errors

    NARCIS (Netherlands)

    Pianosi, F.; Raso, L.

    2012-01-01

    Uncertainty of hydrological forecasts represents valuable information for water managers and hydrologists. This explains the popularity of probabilistic models, which provide the entire distribution of the hydrological forecast. Nevertheless, many existing hydrological models are deterministic and

  19. Modelling uncertainty due to imperfect forward model and aerosol microphysical model selection in the satellite aerosol retrieval

    Science.gov (United States)

    Määttä, Anu; Laine, Marko; Tamminen, Johanna

    2015-04-01

    This study aims to characterize the uncertainty related to the aerosol microphysical model selection and the modelling error due to approximations in the forward modelling. Many satellite aerosol retrieval algorithms rely on pre-calculated look-up tables of model parameters representing various atmospheric conditions. In the retrieval, we need to choose the most appropriate aerosol microphysical models from the pre-defined set by fitting them to the observations. The aerosol properties, e.g. AOD, are then determined from the best models. This choice of an appropriate aerosol model constitutes a notable part of the AOD retrieval uncertainty. The motivation of our study was to account for these two sources in the total uncertainty budget: the uncertainty in selecting the most appropriate model, and the uncertainty resulting from the approximations in the pre-calculated aerosol microphysical model. The systematic model error was analysed by studying the behaviour of the model residuals, i.e. the differences between modelled and observed reflectances, by statistical methods. We utilised Gaussian processes to characterize the uncertainty related to approximations in aerosol microphysics modelling due to the use of look-up tables and other non-modelled systematic features in the Level 1 data. The modelling error is described by a non-diagonal covariance matrix parameterised by a correlation length, which is estimated from the residuals using computational tools from spatial statistics. In addition, we utilised Bayesian model selection and model averaging methods to account for the uncertainty due to aerosol model selection. By acknowledging the modelling error as a source of uncertainty in the retrieval of AOD from observed spectral reflectance, we allow the observed values to deviate from the modelled values within limits determined by both the measurement and modelling errors. This results in a more realistic uncertainty level of the retrieved AOD. The method is illustrated by both
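
    A minimal sketch of a non-diagonal modelling-error covariance parameterised by a correlation length, as described above. The exponential form, the wavelength grid and the parameter values are assumptions for illustration, not the study's actual choices:

```python
import math

def modelling_error_cov(grid, sigma, corr_length):
    """Exponential covariance: C_ij = sigma^2 * exp(-|x_i - x_j| / corr_length)."""
    return [[sigma ** 2 * math.exp(-abs(xi - xj) / corr_length)
             for xj in grid] for xi in grid]

# Hypothetical wavelength grid (nm) and error parameters, for illustration only.
wavelengths = [354.0, 388.0, 442.0, 500.0]
C = modelling_error_cov(wavelengths, sigma=0.01, corr_length=100.0)
# The diagonal holds the modelling-error variance; off-diagonal terms decay
# with wavelength separation, encoding correlated residual structure.
print(C[0])
```

    Using such a matrix in place of a diagonal one lets the fit tolerate spectrally correlated residuals instead of treating them as independent noise.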

  20. Incorporating rainfall uncertainty in a SWAT model: the river Zenne basin (Belgium) case study

    Science.gov (United States)

    Tolessa Leta, Olkeba; Nossent, Jiri; van Griensven, Ann; Bauwens, Willy

    2013-04-01

    The European Union Water Framework Directive (EU-WFD) called on its member countries to achieve a good ecological status for all inland and coastal water bodies by 2015. According to recent studies, the river Zenne (Belgium) is far from this objective. Therefore, an interuniversity and multidisciplinary project "Towards a Good Ecological Status in the river Zenne (GESZ)" was launched to evaluate the effects of wastewater management plans on the river. In this project, different models have been developed and integrated using the Open Modelling Interface (OpenMI). The hydrologic, semi-distributed Soil and Water Assessment Tool (SWAT) is hereby used as one of the model components in the integrated modelling chain in order to model the upland catchment processes. The assessment of the uncertainty of SWAT is an essential aspect of the decision-making process, in order to design robust management strategies that take the predicted uncertainties into account. Model uncertainty stems from the uncertainties on the model parameters, the input data (e.g., rainfall), the calibration data (e.g., stream flows) and on the model structure itself. The objective of this paper is to assess the first three sources of uncertainty in a SWAT model of the river Zenne basin. For the assessment of rainfall measurement uncertainty, first, we identified independent rainfall periods, based on the daily precipitation and stream flow observations and using the Water Engineering Time Series PROcessing tool (WETSPRO). Secondly, we assigned a rainfall multiplier parameter to each of the independent rainfall periods, which serves as a multiplicative input error corruption. Finally, we treated these multipliers as latent parameters in the model optimization and uncertainty analysis (UA). For parameter uncertainty assessment, due to the high number of parameters of the SWAT model, first, we screened out its most sensitive parameters using the Latin Hypercube One-factor-At-a-Time (LH-OAT) technique
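
    The LH-OAT screening idea mentioned above can be sketched as follows. The model and parameter bounds are hypothetical, and this is a simplified version of the method of van Griensven et al.:

```python
import random

def lh_oat(model, bounds, n_strata, delta=0.05, seed=0):
    """Latin Hypercube One-factor-At-a-Time screening (simplified sketch).

    Returns the mean absolute relative effect of a +delta perturbation of
    each parameter, averaged over the Latin Hypercube points."""
    rng = random.Random(seed)
    p = len(bounds)
    # One random stratum permutation per parameter defines the Latin Hypercube.
    perms = [rng.sample(range(n_strata), n_strata) for _ in range(p)]
    effects = [0.0] * p
    for s in range(n_strata):
        point = [lo + (perms[j][s] + rng.random()) / n_strata * (hi - lo)
                 for j, (lo, hi) in enumerate(bounds)]
        base = model(point)
        for j in range(p):
            perturbed = point.copy()
            perturbed[j] *= 1.0 + delta   # one-factor-at-a-time perturbation
            effects[j] += abs((model(perturbed) - base) / base)
    return [e / n_strata for e in effects]

# Hypothetical 3-parameter model: the first parameter dominates the response.
effects = lh_oat(lambda x: 10.0 * x[0] + x[1] + 0.01 * x[2],
                 bounds=[(1.0, 2.0)] * 3, n_strata=8)
print(effects)
```

    Parameters with negligible mean effects are screened out before the full uncertainty analysis, which keeps the Monte Carlo dimension manageable.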

  1. The magnitude and causes of uncertainty in global model simulations of cloud condensation nuclei

    Directory of Open Access Journals (Sweden)

    L. A. Lee

    2013-09-01

    Full Text Available Aerosol–cloud interaction effects are a major source of uncertainty in climate models so it is important to quantify the sources of uncertainty and thereby direct research efforts. However, the computational expense of global aerosol models has prevented a full statistical analysis of their outputs. Here we perform a variance-based analysis of a global 3-D aerosol microphysics model to quantify the magnitude and leading causes of parametric uncertainty in model-estimated present-day concentrations of cloud condensation nuclei (CCN). Twenty-eight model parameters covering essentially all important aerosol processes, emissions and representation of aerosol size distributions were defined based on expert elicitation. An uncertainty analysis was then performed based on a Monte Carlo-type sampling of an emulator built for each model grid cell. The standard deviation around the mean CCN varies globally between about ±30% over some marine regions to ±40–100% over most land areas and high latitudes, implying that aerosol processes and emissions are likely to be a significant source of uncertainty in model simulations of aerosol–cloud effects on climate. Among the most important contributors to CCN uncertainty are the sizes of emitted primary particles, including carbonaceous combustion particles from wildfires, biomass burning and fossil fuel use, as well as sulfate particles formed on sub-grid scales. Emissions of carbonaceous combustion particles affect CCN uncertainty more than sulfur emissions. Aerosol emission-related parameters dominate the uncertainty close to sources, while uncertainty in aerosol microphysical processes becomes increasingly important in remote regions, being dominated by deposition and aerosol sulfate formation during cloud-processing. The results lead to several recommendations for research that would result in improved modelling of cloud-active aerosol on a global scale.
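
    A variance-based sensitivity analysis of the kind described above can be illustrated on a toy model. The binning estimator and the 3-parameter linear model below are deliberate simplifications; the study itself used per-grid-cell emulators over 28 elicited parameters:

```python
import random
import statistics

def first_order_indices(model, n=20000, n_bins=20, seed=42):
    """Crude variance-based first-order sensitivity indices for a model with
    three U(0,1) inputs: S_i = Var(E[Y | X_i]) / Var(Y), with the conditional
    expectation estimated by binning one Monte Carlo sample on X_i."""
    rng = random.Random(seed)
    xs = [[rng.random() for _ in range(3)] for _ in range(n)]
    ys = [model(x) for x in xs]
    var_y = statistics.pvariance(ys)
    indices = []
    for i in range(3):
        bins = [[] for _ in range(n_bins)]
        for x, y in zip(xs, ys):
            bins[min(int(x[i] * n_bins), n_bins - 1)].append(y)
        cond_means = [statistics.fmean(b) for b in bins if b]
        indices.append(statistics.pvariance(cond_means) / var_y)
    return indices

# Toy stand-in: an "emission" parameter dominates, a "process" parameter is weak.
S = first_order_indices(lambda x: 4.0 * x[0] + 1.0 * x[1] + 0.2 * x[2])
print([round(s, 3) for s in S])
```

    Each index is the fraction of output variance attributable to one parameter alone, which is exactly the kind of apportionment the abstract reports for emission versus microphysics parameters.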

  2. Uncertainty analysis on simple mass balance model to calculate critical loads for soil acidity

    International Nuclear Information System (INIS)

    Li Harbin; McNulty, Steven G.

    2007-01-01

    Simple mass balance equations (SMBE) of critical acid loads (CAL) in forest soil were developed to assess potential risks of air pollutants to ecosystems. However, to apply SMBE reliably at large scales, SMBE must be tested for adequacy and uncertainty. Our goal was to provide a detailed analysis of uncertainty in SMBE so that sound strategies for scaling up CAL estimates to the national scale could be developed. Specifically, we wanted to quantify CAL uncertainty under natural variability in 17 model parameters, and determine their relative contributions in predicting CAL. Results indicated that uncertainty in CAL came primarily from components of base cation weathering (BC_w; 49%) and acid neutralizing capacity (46%), whereas the most critical parameters were the BC_w base rate (62%), soil depth (20%), and soil temperature (11%). Thus, improvements in estimates of these factors are crucial to reducing uncertainty and successfully scaling up SMBE for national assessments of CAL. - A comprehensive uncertainty analysis, with advanced techniques and the full list and full value ranges of all individual parameters, was used to examine a simple mass balance model and address questions of error partitioning and uncertainty reduction in critical acid load estimates that were not fully answered by previous studies

  3. Uncertainty and sensitivity analysis of control strategies using the benchmark simulation model No1 (BSM1).

    Science.gov (United States)

    Flores-Alsina, Xavier; Rodriguez-Roda, Ignasi; Sin, Gürkan; Gernaey, Krist V

    2009-01-01

    The objective of this paper is to perform an uncertainty and sensitivity analysis of the predictions of the Benchmark Simulation Model (BSM) No. 1, when comparing four activated sludge control strategies. The Monte Carlo simulation technique is used to evaluate the uncertainty in the BSM1 predictions, considering the ASM1 bio-kinetic parameters and influent fractions as input uncertainties, while the Effluent Quality Index (EQI) and the Operating Cost Index (OCI) are focused on as model outputs. The resulting Monte Carlo simulations are presented using descriptive statistics indicating the degree of uncertainty in the predicted EQI and OCI. Next, the Standardized Regression Coefficients (SRC) method is used for sensitivity analysis to identify which input parameters influence the uncertainty in the EQI predictions the most. The results show that control strategies including an ammonium (S_NH) controller reduce uncertainty in both overall pollution removal and effluent total Kjeldahl nitrogen. Also, control strategies with an external carbon source reduce the effluent nitrate (S_NO) uncertainty while increasing both their economic cost and variability as a trade-off. Finally, the maximum specific autotrophic growth rate (μ_A) causes most of the variance in the effluent for all the evaluated control strategies. The influence of denitrification-related parameters, e.g. η_g (anoxic growth rate correction factor) and η_h (anoxic hydrolysis rate correction factor), becomes less important when an S_NO controller manipulating an external carbon source addition is implemented.
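
    The Standardized Regression Coefficients method can be sketched as follows. The two-parameter toy model is a hypothetical stand-in for the BSM1 Monte Carlo sample; with independently sampled inputs, per-parameter slopes approximate the multivariate regression coefficients:

```python
import random
import statistics

def src(xs, ys):
    """Standardized Regression Coefficients, SRC_i = b_i * sd(x_i) / sd(y).
    With independently sampled inputs, each per-parameter slope b_i from a
    simple regression approximates the multivariate coefficient, and the
    SRC reduces to the input-output correlation."""
    sd_y = statistics.stdev(ys)
    my = statistics.fmean(ys)
    coeffs = []
    for i in range(len(xs[0])):
        xi = [x[i] for x in xs]
        mx = statistics.fmean(xi)
        cov = sum((a - mx) * (b - my) for a, b in zip(xi, ys)) / (len(ys) - 1)
        b_i = cov / statistics.variance(xi)
        coeffs.append(b_i * statistics.stdev(xi) / sd_y)
    return coeffs

rng = random.Random(7)
# Hypothetical Monte Carlo sample: two uncertain bio-kinetic parameters,
# the first driving most of the output variance.
xs = [[rng.gauss(1.0, 0.1), rng.gauss(2.0, 0.2)] for _ in range(5000)]
ys = [3.0 * a + 0.5 * b + rng.gauss(0.0, 0.05) for a, b in xs]
coeffs = src(xs, ys)
print([round(c, 2) for c in coeffs])
```

    The squared SRCs sum to roughly the regression R², so they apportion the output variance among the inputs when the model response is close to linear.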

  4. Robust stability analysis of quaternion-valued neural networks with time delays and parameter uncertainties.

    Science.gov (United States)

    Chen, Xiaofeng; Li, Zhongshan; Song, Qiankun; Hu, Jin; Tan, Yuanshun

    2017-07-01

    This paper addresses the problem of robust stability for quaternion-valued neural networks (QVNNs) with leakage delay, discrete delay and parameter uncertainties. Based on the homeomorphic mapping theorem and the Lyapunov theorem, via a modulus inequality technique for quaternions, some sufficient conditions on the existence, uniqueness, and global robust stability of the equilibrium point are derived for the delayed QVNNs with parameter uncertainties. Furthermore, as direct applications of these results, several sufficient conditions are obtained for checking the global robust stability of QVNNs without leakage delay as well as of complex-valued neural networks (CVNNs) with both leakage and discrete delays. Finally, two numerical examples are provided to substantiate the effectiveness of the proposed results. Published by Elsevier Ltd.

  5. Uncertainties in model predictions of nitrogen fluxes from agro-ecosystems in Europe

    Directory of Open Access Journals (Sweden)

    J. Kros

    2012-11-01

    Full Text Available To assess the responses of nitrogen and greenhouse gas emissions to pan-European changes in land cover, land management and climate, an integrated dynamic model, INTEGRATOR, has been developed. This model includes both simple process-based descriptions and empirical relationships and uses detailed GIS-based environmental and farming data in combination with various downscaling methods. This paper analyses the propagation of uncertainties in model inputs and parameters to outputs of INTEGRATOR, using a Monte Carlo analysis. Uncertain model inputs and parameters were represented by probability distributions, while spatial correlation in these uncertainties was taken into account by assigning correlation coefficients at various spatial scales. The uncertainty propagation was analysed for the emissions of NH3, N2O and NOx, N leaching to groundwater and N runoff to surface water for the entire EU27 and for individual countries. Results show large uncertainties for N leaching and runoff (relative errors of ∼19%) for Europe as a whole, and smaller uncertainties for the emissions of N2O, NH3 and NOx (relative errors of ∼12%). Uncertainties for Europe as a whole were much smaller than uncertainties at country level, because errors partly cancelled out due to spatial aggregation.
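
    Spatially correlated Monte Carlo inputs of the kind described above are commonly generated with a Cholesky factor of the correlation matrix; this is one standard technique, not necessarily the one used by INTEGRATOR, and the 3-region correlation values are assumptions:

```python
import math
import random

def cholesky(A):
    """Lower-triangular Cholesky factor L of a symmetric positive-definite A (A = L Lᵀ)."""
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            L[i][j] = math.sqrt(A[i][i] - s) if i == j else (A[i][j] - s) / L[j][j]
    return L

# Hypothetical 3-region correlation matrix: neighbouring regions correlated
# at 0.8, distant regions at 0.4 (illustrative values).
R = [[1.0, 0.8, 0.4],
     [0.8, 1.0, 0.8],
     [0.4, 0.8, 1.0]]
L = cholesky(R)

rng = random.Random(0)
def correlated_sample():
    """One standard-normal input vector with correlation structure R."""
    z = [rng.gauss(0.0, 1.0) for _ in R]
    return [sum(L[i][k] * z[k] for k in range(i + 1)) for i in range(len(R))]

print(correlated_sample())
```

    Correlated sampling matters for aggregation: strongly correlated regional errors do not cancel when summed to the European total, whereas independent errors largely do.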

  6. Development of a Prototype Model-Form Uncertainty Knowledge Base

    Science.gov (United States)

    Green, Lawrence L.

    2016-01-01

    Uncertainties are generally classified as either aleatory or epistemic. Aleatory uncertainties are those attributed to random variation, either naturally or through manufacturing processes. Epistemic uncertainties are generally attributed to a lack of knowledge. One type of epistemic uncertainty is called model-form uncertainty. The term model-form refers to the fact that, among the choices to be made during a design process within an analysis, there are different forms of the analysis process, each of which gives different results for the same configuration at the same flight conditions. Examples of model-form uncertainties include the grid density, grid type, and solver type used within a computational fluid dynamics code, or the choice of the number and type of model elements within a structural analysis. The objectives of this work are to identify and quantify a representative set of model-form uncertainties and to make this information available to designers through an interactive knowledge base (KB). The KB can then be used during probabilistic design sessions, enabling the possible reduction of uncertainties in the design process through resource investment. An extensive literature search has been conducted to identify and quantify typical model-form uncertainties present within aerospace design. An initial attempt has been made to assemble the results of this literature search into a searchable KB, usable in real time during probabilistic design sessions. A concept of operations and the basic structure of a model-form uncertainty KB are described. Key operations within the KB are illustrated. Current limitations in the KB, and possible workarounds, are explained.

  7. IAEA CRP on HTGR Uncertainties in Modeling: Assessment of Phase I Lattice to Core Model Uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Rouxelin, Pascal Nicolas [Idaho National Lab. (INL), Idaho Falls, ID (United States); Strydom, Gerhard [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-09-01

    Best-estimate plus uncertainty analysis of reactors is replacing the traditional conservative (stacked uncertainty) method for safety and licensing analysis. To facilitate uncertainty analysis applications, a comprehensive approach and methodology must be developed and applied. High temperature gas-cooled reactors (HTGRs) have several features that require techniques not used in light-water reactor analysis (e.g., coated-particle design and large graphite quantities at high temperatures). The International Atomic Energy Agency has therefore launched the Coordinated Research Project on HTGR Uncertainty Analysis in Modeling to study uncertainty propagation in the HTGR analysis chain. The benchmark problem defined for the prismatic design is represented by the General Atomics Modular HTGR 350. The main focus of this report is the compilation and discussion of the results obtained for various permutations of Exercise I-2c and the use of the cross section data in Exercise II-1a of the prismatic benchmark, which are defined as the last and first steps of the lattice and core simulation phases, respectively. The report summarizes the Idaho National Laboratory (INL) best-estimate results obtained for Exercise I-2a (fresh single-fuel block), Exercise I-2b (depleted single-fuel block), and Exercise I-2c (super cell), in addition to the first results of an investigation into the cross section generation effects for the super-cell problem. The two-dimensional deterministic code known as the New ESC-based Weighting Transport (NEWT), included in the Standardized Computer Analyses for Licensing Evaluation (SCALE) 6.1.2 package, was used for the cross section evaluation, and the results obtained were compared to those of the three-dimensional stochastic SCALE module KENO VI.
The NEWT cross section libraries were generated for several permutations of the current benchmark super-cell geometry and were then provided as input to the Phase II core calculation of the stand-alone neutronics Exercise

  8. Status of standard model predictions and uncertainties for electroweak observables

    International Nuclear Information System (INIS)

    Kniehl, B.A.

    1993-11-01

    Recent progress in theoretical predictions of electroweak parameters beyond one loop in the standard model is reviewed. The topics include universal corrections of O(G_F² M_H² M_W²), O(G_F² m_t⁴), O(α_s G_F M_W²), and those due to virtual t t̄ threshold effects, as well as specific corrections to Γ(Z → b b̄) of O(G_F² m_t⁴), O(α_s G_F m_t²), and O(α_s² m_b²/M_Z²). An update of the hadronic contributions to Δα is presented. Theoretical uncertainties, other than those due to the lack of knowledge of M_H and m_t, are estimated. (orig.)

  9. Improved Wave-vessel Transfer Functions by Uncertainty Modelling

    DEFF Research Database (Denmark)

    Nielsen, Ulrik Dam; Fønss Bach, Kasper; Iseki, Toshio

    2016-01-01

    This paper deals with uncertainty modelling of wave-vessel transfer functions used to calculate or predict wave-induced responses of a ship in a seaway. Although transfer functions, in theory, can be calculated to exactly reflect the behaviour of the ship when exposed to waves, uncertainty in input...

  10. Urban drainage models simplifying uncertainty analysis for practitioners

    DEFF Research Database (Denmark)

    Vezzaro, Luca; Mikkelsen, Peter Steen; Deletic, Ana

    2013-01-01

    There is increasing awareness about uncertainties in the modelling of urban drainage systems and, as such, many new methods for uncertainty analyses have been developed. Despite this, all available methods have limitations which restrict their widespread application among practitioners. Here, a m...

  11. A Model-Free Definition of Increasing Uncertainty

    NARCIS (Netherlands)

    Grant, S.; Quiggin, J.

    2001-01-01

    We present a definition of increasing uncertainty, independent of any notion of subjective probabilities, or of any particular model of preferences. Our notion of an elementary increase in the uncertainty of any act corresponds to the addition of an 'elementary bet' which increases consumption by a

  12. Uncertainty modelling of critical column buckling for reinforced ...

    Indian Academy of Sciences (India)

    Buckling is a critical issue for structural stability in structural design. ... This study investigates the effect of material uncertainties on column design and proposes an uncertainty model for critical column buckling in reinforced concrete buildings. ... Civil Engineering Department, Suleyman Demirel University, Isparta 32260, Turkey ...

  13. Uncertainty in a monthly water balance model using the generalized ...

    Indian Academy of Sciences (India)

    Uncertainty in a monthly water balance model using the generalized likelihood uncertainty estimation methodology. Diego Rivera¹﹐*, Yessica Rivas² and Alex Godoy³. 1. Laboratory of Comparative Policy in Water Resources Management, University of Concepcion, CONICYT/FONDAP 15130015, Concepcion, Chile. 2.

  14. Uncertainty in Discount Models and Environmental Accounting

    Directory of Open Access Journals (Sweden)

    Donald Ludwig

    2005-12-01

    Full Text Available Cost-benefit analysis (CBA) is controversial for environmental issues, but is nevertheless employed by many governments and private organizations for making environmental decisions. Controversy centers on the practice of economic discounting in CBA for decisions that have substantial long-term consequences, as do most environmental decisions. Customarily, economic discounting has been calculated at a constant exponential rate, a practice that weights the present heavily in comparison with the future. Recent analyses of economic data show that the assumption of constant exponential discounting should be modified to take into account large uncertainties in long-term discount rates. A proper treatment of this uncertainty requires that we consider returns over a plausible range of assumptions about future discounting rates. When returns are averaged in this way, the schemes with the most severe discounting have a negligible effect on the average after a long period of time has elapsed. This re-examination of economic uncertainty provides support for policies that prevent or mitigate environmental damage. We examine these effects for three examples: a stylized renewable resource, management of a long-lived species (Atlantic Right Whales), and lake eutrophication.
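
    The key point, that averaging discount factors over uncertain rates yields an effective rate that declines toward the lowest scenario, can be shown in a few lines (equal-weight rate scenarios and the rate values are assumptions for illustration):

```python
import math

def certainty_equivalent_rate(rates, t):
    """Average the discount *factors* exp(-r t) over uncertain rates
    (equal-weight scenarios assumed), then convert back to an effective rate."""
    factor = sum(math.exp(-r * t) for r in rates) / len(rates)
    return -math.log(factor) / t

# Hypothetical long-run discount-rate scenarios.
rates = [0.01, 0.04, 0.07]
for t in (1, 50, 200, 500):
    print(t, round(certainty_equivalent_rate(rates, t), 4))
```

    At short horizons the effective rate is near the mean of the scenarios; at long horizons the high-rate scenarios contribute negligibly to the averaged factor, so the effective rate approaches the lowest scenario. This is why uncertainty in discounting favours long-term environmental protection.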

  15. Uncertainty and error in complex plasma chemistry models

    Science.gov (United States)

    Turner, Miles M.

    2015-06-01

    Chemistry models that include dozens of species and hundreds to thousands of reactions are common in low-temperature plasma physics. The rate constants used in such models are uncertain, because they are obtained from some combination of experiments and approximate theories. Since the predictions of these models are a function of the rate constants, these predictions must also be uncertain. However, systematic investigations of the influence of uncertain rate constants on model predictions are rare to non-existent. In this work we examine a particular chemistry model, for helium-oxygen plasmas. This chemistry is of topical interest because of its relevance to biomedical applications of atmospheric pressure plasmas. We trace the primary sources for every rate constant in the model, and hence associate an error bar (or equivalently, an uncertainty) with each. We then use a Monte Carlo procedure to quantify the uncertainty in predicted plasma species densities caused by the uncertainty in the rate constants. Under the conditions investigated, the range of uncertainty in most species densities is a factor of two to five. However, the uncertainty can vary strongly for different species, over time, and with other plasma conditions. There are extreme (pathological) cases where the uncertainty is more than a factor of ten. One should therefore be cautious in drawing any conclusion from plasma chemistry modelling, without first ensuring that the conclusion in question survives an examination of the related uncertainty.
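
    A sketch of the Monte Carlo procedure on rate constants described above. The two-reaction steady-state balance and all numerical values are illustrative, not the helium-oxygen chemistry of the paper:

```python
import math
import random
import statistics

rng = random.Random(3)

# Toy steady-state balance: production k1*n_e*n_O2 = loss k2*n,
# so n = k1*n_e*n_O2/k2. All values below are arbitrary-unit illustrations.
K1, K2 = 1.0e-16, 2.0e-15
NE, NO2 = 1.0e17, 1.0e21

def sample_density(uncertainty_factor=2.0):
    # Each rate constant is assumed known only "to within a factor of 2":
    # a lognormal perturbation with sigma = ln(uncertainty_factor).
    sigma = math.log(uncertainty_factor)
    k1 = K1 * math.exp(rng.gauss(0.0, sigma))
    k2 = K2 * math.exp(rng.gauss(0.0, sigma))
    return k1 * NE * NO2 / k2

densities = [sample_density() for _ in range(20000)]
q = statistics.quantiles(densities, n=100)
lo, hi = q[4], q[94]  # 5th and 95th percentiles
print(f"5th-95th percentile spread: factor {hi / lo:.1f}")
```

    Even in this two-reaction toy, two factor-of-two rate uncertainties compound into a far wider spread in the predicted density, which illustrates how pathological cases exceeding a factor of ten can arise in chemistries with hundreds of uncertain rates.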

  16. Mass discharge estimation from contaminated sites: Multi-model solutions for assessment of conceptual uncertainty

    Science.gov (United States)

    Thomsen, N. I.; Troldborg, M.; McKnight, U. S.; Binning, P. J.; Bjerg, P. L.

    2012-04-01

    Mass discharge estimates are increasingly being used in the management of contaminated sites. Such estimates have proven useful for supporting decisions related to the prioritization of contaminated sites in a groundwater catchment. Potential management options can be categorised as follows: (1) leave as is, (2) clean up, or (3) further investigation needed. However, mass discharge estimates are often very uncertain, which may hamper the management decisions. If option 1 is incorrectly chosen, soil and water quality will decrease, threatening or destroying drinking water resources. The risk of choosing option 2 is to spend money on remediating a site that does not pose a problem. Choosing option 3 will often be safest, but may not be the optimal economic solution. Quantification of the uncertainty in mass discharge estimates can therefore greatly improve the foundation for selecting the appropriate management option. The uncertainty of mass discharge estimates depends greatly on the extent of the site characterization. A good approach for uncertainty estimation will be flexible with respect to the investigation level, and account for both parameter and conceptual model uncertainty. We propose a method for quantifying the uncertainty of dynamic mass discharge estimates from contaminant point sources on the local scale. The method considers both parameter and conceptual uncertainty through a multi-model approach. The multi-model approach evaluates multiple conceptual models for the same site. The different conceptual models consider different source characterizations and hydrogeological descriptions. The idea is to include a set of essentially different conceptual models, where each model is believed to be a realistic representation of the given site, based on the current level of information. Parameter uncertainty is quantified using Monte Carlo simulations.
For each conceptual model we calculate a transient mass discharge estimate with uncertainty bounds resulting from
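The multi-model Monte Carlo idea in this record can be sketched in a few lines. Everything below is a hypothetical stand-in: the two conceptual models, the parameter distributions, and their ranges are illustrative, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(42)

# Two hypothetical conceptual models of the same site: each maps sampled
# parameters to a mass discharge [g/s]; names and formulas are illustrative.
def model_homogeneous(k, i, c, area):
    return k * i * c * area          # Darcy flux times concentration

def model_fractured(k, i, c, area):
    return 5.0 * k * i * c * area    # fractures raise the effective flux

models = {"homogeneous": model_homogeneous, "fractured": model_fractured}

n = 10_000
estimates = {}
for name, model in models.items():
    # Parameter uncertainty via Monte Carlo: lognormal conductivity,
    # uniform gradient, concentration and source area (assumed ranges).
    k = rng.lognormal(mean=np.log(1e-4), sigma=0.5, size=n)   # m/s
    i = rng.uniform(0.001, 0.01, size=n)                      # gradient [-]
    c = rng.uniform(10.0, 100.0, size=n)                      # g/m^3
    area = rng.uniform(50.0, 150.0, size=n)                   # m^2
    md = model(k, i, c, area) * 86_400                        # g/day
    estimates[name] = np.percentile(md, [5, 50, 95])          # uncertainty bounds
```

Each conceptual model yields its own mass discharge distribution; comparing the percentile bounds across models separates conceptual from parameter uncertainty.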

  17. Development of mechanistic sorption model and treatment of uncertainties for Ni sorption on montmorillonite/bentonite

    International Nuclear Information System (INIS)

    Ochs, Michael; Ganter, Charlotte; Tachi, Yukio; Suyama, Tadahiro; Yui, Mikazu

    2011-02-01

    Sorption and diffusion of radionuclides in buffer materials (bentonite) are the key processes in the safe geological disposal of radioactive waste, because migration of radionuclides in this barrier is expected to be diffusion-controlled and retarded by sorption processes. It is therefore necessary to understand the detailed/coupled processes of sorption and diffusion in compacted bentonite and develop mechanistic/predictive models, so that reliable parameters can be set under a variety of geochemical conditions relevant to performance assessment (PA). For this purpose, JAEA has developed the integrated sorption and diffusion (ISD) model/database in montmorillonite/bentonite systems. The main goal of the mechanistic model/database development is to provide a tool for a consistent explanation, prediction, and uncertainty assessment of Kd as well as diffusion parameters needed for the quantification of radionuclide transport. The present report focuses on developing the thermodynamic sorption model (TSM) and on the quantification and handling of model uncertainties in applications, illustrated by the example of Ni sorption on montmorillonite/bentonite. This includes 1) a summary of the present state of the art of thermodynamic sorption modeling, 2) a discussion of the selection of surface species and model design appropriate for the present purpose, 3) possible sources and representations of TSM uncertainties, and 4) details of modeling, testing and uncertainty evaluation for Ni sorption. Two fundamentally different approaches are presented and compared for representing TSM uncertainties: 1) TSM parameter uncertainties calculated by FITEQL optimization routines and a statistical procedure, 2) overall error estimated by direct comparison of modeled and experimental Kd values. The overall error in Kd is viewed as the best representation of model uncertainty in ISD model/database development. (author)

  18. Propagation of uncertainty and sensitivity analysis in an integral oil-gas plume model

    KAUST Repository

    Wang, Shitao

    2016-05-27

    Polynomial Chaos expansions are used to analyze uncertainties in an integral oil-gas plume model simulating the Deepwater Horizon oil spill. The study focuses on six uncertain input parameters—two entrainment parameters, the gas to oil ratio, two parameters associated with the droplet-size distribution, and the flow rate—that impact the model's estimates of the plume's trap and peel heights, and of its various gas fluxes. The ranges of the uncertain inputs were determined by experimental data. Ensemble calculations were performed to construct polynomial chaos-based surrogates that describe the variations in the outputs due to variations in the uncertain inputs. The surrogates were then used to estimate reliably the statistics of the model outputs, and to perform an analysis of variance. Two experiments were performed to study the impacts of high and low flow rate uncertainties. The analysis shows that in the former case the flow rate is the largest contributor to output uncertainties, whereas in the latter case, with the uncertainty range constrained by a posteriori analyses, the flow rate's contribution becomes negligible. The trap and peel height uncertainties are then mainly due to uncertainties in the 95th percentile of the droplet size and in the entrainment parameters.
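The surrogate construction described here can be sketched for a single uniform uncertain input, where the polynomial chaos basis is the Legendre family and the expansion coefficients give the output statistics directly. The "plume model" below is a toy quadratic stand-in, not the integral model of the record.

```python
import numpy as np
from numpy.polynomial import legendre

rng = np.random.default_rng(0)

# Toy stand-in for the plume model: trap height as a nonlinear function of a
# flow-rate parameter q scaled to [-1, 1] (values are illustrative only).
def trap_height(q):
    return 300.0 + 40.0 * q + 15.0 * q**2

# Sample the uniform input and evaluate the model (the "ensemble").
q = rng.uniform(-1.0, 1.0, size=2000)
y = trap_height(q)

# Fit a degree-4 Legendre expansion: the PC basis for uniform inputs.
coeffs = legendre.legfit(q, y, deg=4)

# PC statistics read off the coefficients: mean is c0; variance is
# sum over k >= 1 of c_k^2 * E[P_k^2], with E[P_k^2] = 1/(2k+1) for a
# uniform measure on [-1, 1].
mean_pce = coeffs[0]
var_pce = sum(c**2 / (2 * k + 1) for k, c in enumerate(coeffs) if k > 0)
```

For this toy model the exact mean and variance are 305 and 1600/3 + 20 ≈ 553.3, which the expansion recovers; with several inputs the same coefficient bookkeeping yields the Sobol-type analysis of variance mentioned in the abstract.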

  19. Evaluating uncertainty estimates in hydrologic models: borrowing measures from the forecast verification community

    Science.gov (United States)

    Franz, K. J.; Hogue, T. S.

    2011-11-01

    The hydrologic community is generally moving towards the use of probabilistic estimates of streamflow, primarily through the implementation of Ensemble Streamflow Prediction (ESP) systems, ensemble data assimilation methods, or multi-modeling platforms. However, evaluation of probabilistic outputs has not necessarily kept pace with ensemble generation. Much of the modeling community is still performing model evaluation using standard deterministic measures, such as error, correlation, or bias, typically applied to the ensemble mean or median. Probabilistic forecast verification methods have been well developed, particularly in the atmospheric sciences, yet few have been adopted for evaluating uncertainty estimates in hydrologic model simulations. In the current paper, we overview existing probabilistic forecast verification methods and apply the methods to evaluate and compare model ensembles produced from two different parameter uncertainty estimation methods: the Generalized Uncertainty Likelihood Estimator (GLUE), and the Shuffle Complex Evolution Metropolis (SCEM). Model ensembles are generated for the National Weather Service SACramento Soil Moisture Accounting (SAC-SMA) model for 12 forecast basins located in the Southeastern United States. We evaluate the model ensembles using relevant metrics in the following categories: distribution, correlation, accuracy, conditional statistics, and categorical statistics. We show that the presented probabilistic metrics are easily adapted to model simulation ensembles and provide a robust analysis of model performance associated with parameter uncertainty. Application of these methods requires no information in addition to what is already available as part of traditional model validation methodology and considers the entire ensemble or uncertainty range in the approach.
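One widely used probabilistic verification measure of the kind this record borrows from the atmospheric sciences is the continuous ranked probability score (CRPS). A minimal ensemble implementation, using the standard expectation identity (the data below are synthetic):

```python
import numpy as np

def crps_ensemble(members, obs):
    """Continuous Ranked Probability Score for one ensemble forecast.

    Uses the identity CRPS = E|X - y| - 0.5 * E|X - X'|, where X, X' are
    independent draws from the ensemble and y is the observation.
    Lower is better; for a single member it reduces to absolute error.
    """
    members = np.asarray(members, dtype=float)
    term1 = np.mean(np.abs(members - obs))
    term2 = 0.5 * np.mean(np.abs(members[:, None] - members[None, :]))
    return term1 - term2

# A sharp, well-centred ensemble scores better than a biased one.
rng = np.random.default_rng(1)
obs = 10.0
good = rng.normal(10.0, 1.0, size=50)
biased = rng.normal(14.0, 1.0, size=50)
```

Scores like this evaluate the whole ensemble rather than just its mean or median, which is exactly the shift the record argues for.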

  20. Evaluating uncertainty estimates in hydrologic models: borrowing measures from the forecast verification community

    Directory of Open Access Journals (Sweden)

    K. J. Franz

    2011-11-01

    Full Text Available The hydrologic community is generally moving towards the use of probabilistic estimates of streamflow, primarily through the implementation of Ensemble Streamflow Prediction (ESP) systems, ensemble data assimilation methods, or multi-modeling platforms. However, evaluation of probabilistic outputs has not necessarily kept pace with ensemble generation. Much of the modeling community is still performing model evaluation using standard deterministic measures, such as error, correlation, or bias, typically applied to the ensemble mean or median. Probabilistic forecast verification methods have been well developed, particularly in the atmospheric sciences, yet few have been adopted for evaluating uncertainty estimates in hydrologic model simulations. In the current paper, we overview existing probabilistic forecast verification methods and apply the methods to evaluate and compare model ensembles produced from two different parameter uncertainty estimation methods: the Generalized Uncertainty Likelihood Estimator (GLUE), and the Shuffle Complex Evolution Metropolis (SCEM). Model ensembles are generated for the National Weather Service SACramento Soil Moisture Accounting (SAC-SMA) model for 12 forecast basins located in the Southeastern United States. We evaluate the model ensembles using relevant metrics in the following categories: distribution, correlation, accuracy, conditional statistics, and categorical statistics. We show that the presented probabilistic metrics are easily adapted to model simulation ensembles and provide a robust analysis of model performance associated with parameter uncertainty. Application of these methods requires no information in addition to what is already available as part of traditional model validation methodology and considers the entire ensemble or uncertainty range in the approach.

  1. Parameter optimization for surface flux transport models

    Science.gov (United States)

    Whitbread, T.; Yeates, A. R.; Muñoz-Jaramillo, A.; Petrie, G. J. D.

    2017-11-01

    Accurate prediction of solar activity calls for precise calibration of solar cycle models. Consequently we aim to find optimal parameters for models which describe the physical processes on the solar surface, which in turn act as proxies for what occurs in the interior and provide source terms for coronal models. We use a genetic algorithm to optimize surface flux transport models using National Solar Observatory (NSO) magnetogram data for Solar Cycle 23. This is applied to both a 1D model that inserts new magnetic flux in the form of idealized bipolar magnetic regions, and also to a 2D model that assimilates specific shapes of real active regions. The genetic algorithm searches for parameter sets (meridional flow speed and profile, supergranular diffusivity, initial magnetic field, and radial decay time) that produce the best fit between observed and simulated butterfly diagrams, weighted by a latitude-dependent error structure which reflects uncertainty in observations. Due to the easily adaptable nature of the 2D model, the optimization process is repeated for Cycles 21, 22, and 24 in order to analyse cycle-to-cycle variation of the optimal solution. We find that the ranges and optimal solutions for the various regimes are in reasonable agreement with results from the literature, both theoretical and observational. The optimal meridional flow profiles for each regime are almost entirely within observational bounds determined by magnetic feature tracking, with the 2D model being able to accommodate the mean observed profile more successfully. Differences between models appear to be important in deciding values for the diffusive and decay terms. In like fashion, differences in the behaviours of different solar cycles lead to contrasts in parameters defining the meridional flow and initial field strength.
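The optimization loop in this record can be sketched with a minimal genetic algorithm. The "flux transport model" here is a toy latitudinal profile with two free parameters (a flow speed and a decay parameter), and the weighting, population sizes, and mutation scales are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy stand-in for the butterfly-diagram fit: match a latitudinal profile
# controlled by a flow speed v and a decay parameter d (both hypothetical).
lat = np.linspace(-np.pi / 2, np.pi / 2, 61)

def simulate(v, d):
    return v * np.sin(lat) * np.exp(-d * lat**2)

observed = simulate(11.0, 0.45)                    # synthetic "observations"
weights = 1.0 / (1.0 + np.abs(lat))                # latitude-dependent weighting

def fitness(pop):
    # Negative weighted squared misfit (higher is better).
    sims = np.array([simulate(v, d) for v, d in pop])
    return -np.sum(weights * (sims - observed) ** 2, axis=1)

# Genetic algorithm: truncation selection with elitism + Gaussian mutation.
pop = np.column_stack([rng.uniform(5, 20, 40), rng.uniform(0.1, 1.0, 40)])
start_best = fitness(pop).max()
for _ in range(60):
    f = fitness(pop)
    parents = pop[np.argsort(f)[-10:]]             # keep the 10 fittest
    children = parents[rng.integers(0, 10, 30)]
    children = children + rng.normal(0.0, [0.3, 0.02], children.shape)
    pop = np.vstack([parents, children])

best_v, best_d = pop[np.argmax(fitness(pop))]
```

Because the fittest parents are carried over unchanged, the best misfit can only improve from generation to generation, mirroring the calibration behaviour described in the abstract.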

  2. Balancing the stochastic description of uncertainties as a function of hydrologic model complexity

    Science.gov (United States)

    Del Giudice, D.; Reichert, P.; Albert, C.; Kalcic, M.; Logsdon Muenich, R.; Scavia, D.; Bosch, N. S.; Michalak, A. M.

    2016-12-01

    Uncertainty analysis is becoming an important component of forecasting water and pollutant fluxes in urban and rural environments. Properly accounting for errors in the modeling process can help to robustly assess the uncertainties associated with the inputs (e.g. precipitation) and outputs (e.g. runoff) of hydrological models. In recent years we have investigated several Bayesian methods to infer the parameters of a mechanistic hydrological model along with those of the stochastic error component. The latter describes the uncertainties of model outputs and possibly inputs. We have adapted our framework to a variety of applications, ranging from predicting floods in small stormwater systems to nutrient loads in large agricultural watersheds. Given practical constraints, we discuss how in general the number of quantities to infer probabilistically varies inversely with the complexity of the mechanistic model. Most often, when evaluating a hydrological model of intermediate complexity, we can infer the parameters of the model as well as of the output error model. Describing the output errors as a first order autoregressive process can realistically capture the "downstream" effect of inaccurate inputs and structure. With simpler runoff models we can additionally quantify input uncertainty by using a stochastic rainfall process. For complex hydrologic transport models, instead, we show that keeping model parameters fixed and just estimating time-dependent output uncertainties could be a viable option. The common goal across all these applications is to create time-dependent prediction intervals which are both reliable (cover the nominal amount of validation data) and precise (are as narrow as possible). In conclusion, we recommend focusing both on the choice of the hydrological model and of the probabilistic error description. The latter can include output uncertainty only, if the model is computationally-expensive, or, with simpler models, it can separately account
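The first-order autoregressive output-error description mentioned above can be sketched directly: simulate AR(1) error paths around a deterministic model output and read prediction intervals from the ensemble. The model output, AR coefficient, and innovation standard deviation below are assumed values, not from any of the applications discussed.

```python
import numpy as np

rng = np.random.default_rng(3)

# Output errors E_t = phi * E_{t-1} + eta_t added to a deterministic output.
phi, sigma_eta = 0.8, 0.5            # AR coefficient, innovation std (assumed)
n_steps, n_paths = 100, 2000
model_output = 5.0 + np.sin(np.linspace(0, 4 * np.pi, n_steps))

errors = np.zeros((n_paths, n_steps))
for t in range(1, n_steps):
    errors[:, t] = phi * errors[:, t - 1] + rng.normal(0, sigma_eta, n_paths)

predictions = model_output + errors
lower = np.percentile(predictions, 5, axis=0)      # 90% prediction interval
upper = np.percentile(predictions, 95, axis=0)

# Stationary spread of an AR(1) process: sigma_eta / sqrt(1 - phi^2).
stationary_std = sigma_eta / np.sqrt(1 - phi**2)
```

The autocorrelation (phi close to 1) is what lets the error model capture the persistent "downstream" effect of inaccurate inputs and model structure, rather than treating residuals as independent noise.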

  3. Generalized martingale model of the uncertainty evolution of streamflow forecasts

    Science.gov (United States)

    Zhao, Tongtiegang; Zhao, Jianshi; Yang, Dawen; Wang, Hao

    2013-07-01

    Streamflow forecasts are dynamically updated in real-time, thus facilitating a process of forecast uncertainty evolution. Forecast uncertainty generally decreases over time and as more hydrologic information becomes available. The process of forecasting and uncertainty updating can be described by the martingale model of forecast evolution (MMFE), which formulates the total forecast uncertainty of a streamflow in one future period as the sum of forecast improvements in the intermediate periods. This study tests the assumptions, i.e., unbiasedness, Gaussianity, temporal independence, and stationarity, of MMFE using real-world streamflow forecast data. The results show that (1) real-world forecasts can be biased and tend to underestimate the actual streamflow, and (2) real-world forecast uncertainty is non-Gaussian and heavy-tailed. Based on these statistical tests, this study proposes a generalized martingale model GMMFE for the simulation of biased and non-Gaussian forecast uncertainties. The new model combines the normal quantile transform (NQT) with MMFE to formulate the uncertainty evolution of real-world streamflow forecasts. Reservoir operations based on a synthetic forecast by GMMFE illustrates that applications of streamflow forecasting facilitate utility improvements and that special attention should be focused on the statistical distribution of forecast uncertainty.
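The normal quantile transform (NQT) that GMMFE combines with MMFE maps an arbitrary (biased, heavy-tailed) sample to standard-normal scores via its empirical CDF. A minimal sketch, using a synthetic lognormal error sample as the heavy-tailed input:

```python
import numpy as np
from scipy.stats import norm, rankdata

def nqt(x):
    """Normal quantile transform: map a sample to standard-normal scores
    through its empirical CDF (Weibull plotting positions)."""
    x = np.asarray(x, dtype=float)
    p = rankdata(x) / (len(x) + 1)     # empirical non-exceedance probabilities
    return norm.ppf(p)

# Heavy-tailed, positively skewed forecast errors (synthetic example).
rng = np.random.default_rng(5)
errors = rng.lognormal(mean=0.0, sigma=1.0, size=5000)
z = nqt(errors)
```

In the Gaussian domain the MMFE machinery (unbiased, Gaussian increments) applies; transforming back through the inverse empirical CDF restores the non-Gaussian forecast uncertainty.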

  4. A method for analyzing geothermal gradient histories using the statistical assessment of uncertainties in maturity models

    Energy Technology Data Exchange (ETDEWEB)

    Huvaz, O. [Turkish Petroleum Corp., Ankara (Turkey). Exploration Group; Thomsen, R.O. [Maersk Oil and Gas AS, Copenhagen (Denmark); Noeth, S. [Schlumberger Data and Consulting Services, Houston, TX (United States)

    2005-04-01

    A major factor contributing to uncertainty in basin modelling is the determination of the parameters necessary to reconstruct the basin's thermal history. Thermal maturity modelling is widely used in basin modelling for assessing the exploration risk. Of the available models, the chemical kinetic model Easy%Ro has gained wide acceptance. In this study, the thermal gradient at five wells in the Danish North Sea is calibrated against vitrinite reflectance using the Easy%Ro model coupled with an inverse scheme in order to perform sensitivity analysis and to assess the uncertainty. The mean squared residual (MSR) is used as a quantitative measure of mismatch between the modelled and measured reflectance values. A 90% confidence interval is constructed for the determined mean of the squared residuals to assess the uncertainty for the given level of confidence. The sensitivity of the Easy%Ro model to variations in the thermal gradient is investigated using the uncertainty associated with scatter in the calibration data. The best thermal gradient (minimum MSR) is obtained from the MSR curve for each well. The aim is to show how the reconstruction of the thermal gradient is related to the control data and the applied model. The applied method helps not only to determine the average thermal gradient history of a basin, but also helps to investigate the quality of the calibration data and provides a quick assessment of the uncertainty and sensitivity of any parameter in a forward deterministic model. (author)
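The calibration scheme in this record (scan thermal gradients, pick the one minimising the mean squared residual against vitrinite reflectance) can be sketched as follows. The forward model below is a toy exponential stand-in for Easy%Ro, and the depths, gradients, and noise level are assumed.

```python
import numpy as np

# Toy forward model: vitrinite reflectance increasing with temperature,
# itself linear in depth and thermal gradient (NOT the real Easy%Ro kinetics).
def modelled_ro(depth_km, grad_c_per_km):
    temp = 15.0 + grad_c_per_km * depth_km
    return 0.2 * np.exp(0.018 * temp)

depths = np.array([1.0, 1.5, 2.0, 2.5, 3.0])          # sample depths [km]
rng = np.random.default_rng(9)
# Synthetic "measured" reflectance with 5% scatter around a 32 C/km truth.
measured = modelled_ro(depths, 32.0) * (1 + rng.normal(0, 0.05, 5))

# Scan candidate thermal gradients; the best gradient minimises the mean
# squared residual (MSR) between modelled and measured reflectance.
grads = np.linspace(20.0, 45.0, 251)
msr = np.array([np.mean((modelled_ro(depths, g) - measured) ** 2)
                for g in grads])
best_grad = grads[np.argmin(msr)]
```

The shape of the MSR curve around its minimum is what carries the uncertainty information: a flat curve means the calibration data constrain the gradient poorly, which is the sensitivity assessment the abstract describes.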

  5. Meteorological Uncertainty of atmospheric Dispersion model results (MUD)

    DEFF Research Database (Denmark)

    Havskov Sørensen, Jens; Amstrup, Bjarne; Feddersen, Henrik

    The MUD project addresses assessment of uncertainties of atmospheric dispersion model predictions, as well as optimum presentation to decision makers. Previously, it has not been possible to estimate such uncertainties quantitatively, but merely to calculate the 'most likely' dispersion scenario. However, recent developments in numerical weather prediction (NWP) include probabilistic forecasting techniques, which can be utilised also for atmospheric dispersion models. The ensemble statistical methods developed and applied to NWP models aim at describing the inherent uncertainties of the meteorological model results. These uncertainties stem from e.g. limits in meteorological observations used to initialise meteorological forecast series. By perturbing the initial state of an NWP model run in agreement with the available observational data, an ensemble of meteorological forecasts is produced.

  6. Meteorological Uncertainty of atmospheric Dispersion model results (MUD)

    DEFF Research Database (Denmark)

    Havskov Sørensen, Jens; Amstrup, Bjarne; Feddersen, Henrik

    The MUD project addresses assessment of uncertainties of atmospheric dispersion model predictions, as well as possibilities for optimum presentation to decision makers. Previously, it has not been possible to estimate such uncertainties quantitatively, but merely to calculate the ‘most likely’ dispersion scenario. However, recent developments in numerical weather prediction (NWP) include probabilistic forecasting techniques, which can be utilised also for long-range atmospheric dispersion models. The ensemble statistical methods developed and applied to NWP models aim at describing the inherent uncertainties of the meteorological model results. These uncertainties stem from e.g. limits in meteorological observations used to initialise meteorological forecast series. By perturbing e.g. the initial state of an NWP model run in agreement with the available observational data, an ensemble of meteorological forecasts is produced.

  7. Uncertainty and Sensitivity Analysis of Filtration Models for Non-Fickian transport and Hyperexponential deposition

    DEFF Research Database (Denmark)

    Yuan, Hao; Sin, Gürkan

    2011-01-01

    Uncertainty and sensitivity analyses are carried out to investigate the predictive accuracy of the filtration models for describing non-Fickian transport and hyperexponential deposition. Five different modeling approaches, involving the elliptic equation with different types of distributed filtration coefficients and the CTRW equation expressed in Laplace space, are selected to simulate eight experiments. These experiments involve both porous media and colloid-medium interactions of different heterogeneity degrees. The uncertainty of elliptic equation predictions with distributed filtration coefficients is larger than that with a single filtration coefficient. The uncertainties of model predictions from the elliptic equation and CTRW equation in Laplace space are minimal for solute transport. Higher uncertainties of parameter estimation and model outputs are observed in the cases with the porous...

  8. Parametric uncertainty modeling for robust control

    DEFF Research Database (Denmark)

    Rasmussen, K.H.; Jørgensen, Sten Bay

    1999-01-01

    The idea behind the proposed method is to fit a rational function to the parameter variation. The parameter variation can then be expressed as a linear fractional transformation (LFT). It is discussed how the proposed method can be used to perform robustness analysis on a control system using the structured singular value. (C) 1999 Elsevier Science Ltd. All rights reserved.

  9. Scenario and modelling uncertainty in global mean temperature change derived from emission driven Global Climate Models

    OpenAIRE

    B. B. B. Booth; D. Bernie; D. McNeall; E. Hawkins; J. Caesar; C. Boulton; P. Friedlingstein; D. Sexton

    2012-01-01

    We compare future changes in global mean temperature in response to different future scenarios which, for the first time, arise from an emission driven rather than concentration driven perturbed parameter ensemble of a Global Climate Model (GCM). These new GCM simulations sample uncertainties in atmospheric feedbacks, land carbon cycle, ocean physics and aerosol sulphur cycle processes. We find broader ranges of projected temperature responses arising when considering emission rather than concentration driven...

  10. Analysis of parameter uncertainties in the assessment of seismic risk for nuclear power plants

    International Nuclear Information System (INIS)

    Yucemen, S.M.

    1981-04-01

    Probabilistic and statistical methods are used to develop a procedure by which the seismic risk at a specific site can be systematically analyzed. The proposed probabilistic procedure provides a consistent method for the modelling, analysis and updating of uncertainties that are involved in the seismic risk analysis for nuclear power plants. Methods are proposed for including these uncertainties in the final value of calculated risks. Two specific case studies are presented in detail to illustrate the application of the probabilistic method of seismic risk evaluation and to investigate the sensitivity of results to different assumptions

  11. Finding the effective parameter perturbations in atmospheric models: the LORENZ63 model as case study

    NARCIS (Netherlands)

    Moolenaar, H.E.; Selten, F.M.

    2004-01-01

    Climate models contain numerous parameters for which the numeric values are uncertain. In the context of climate simulation and prediction, a relevant question is what range of climate outcomes is possible given the range of parameter uncertainties. Which parameter perturbation changes the climate

  12. Mathematics of uncertainty modeling in the analysis of engineering and science problems

    CERN Document Server

    Chakraverty, S

    2014-01-01

    For various scientific and engineering problems, how to deal with variables and parameters of uncertain value is an important issue. Full analysis of the specific errors in measurement, observations, experiments, and applications are vital in dealing with the parameters taken to simplify the problem. Mathematics of Uncertainty Modeling in the Analysis of Engineering and Science Problems aims to provide the reader with basic concepts for soft computing and other methods for various means of uncertainty in handling solutions, analysis, and applications. This book is an essential reference work for students, scholars, practitioners and researchers in the assorted fields of engineering and applied mathematics interested in a model for uncertain physical problems.

  13. Parameter estimation of component reliability models in PSA model of Krsko NPP

    International Nuclear Information System (INIS)

    Jordan Cizelj, R.; Vrbanic, I.

    2001-01-01

    In the paper, the uncertainty analysis of component reliability models for independent failures is shown. The present approach for parameter estimation of component reliability models in NPP Krsko is presented. Mathematical approaches for different types of uncertainty analyses are introduced and used in accordance with some predisposed requirements. Results of the uncertainty analyses are shown in an example for time-related components. Bayesian estimation with numerical estimation of the posterior proved to be the most appropriate uncertainty analysis; the posterior can be approximated with a suitable probability distribution, in this paper a lognormal distribution. (author)
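For a time-related component, the Bayesian update of a failure rate has a simple closed form when a Gamma prior is combined with Poisson failure data; the posterior can then be approximated by a moment-matched lognormal, as the record suggests. The prior and the failure data below are illustrative numbers, not taken from the Krsko PSA.

```python
import numpy as np

# Conjugate Gamma-Poisson update for a failure rate lambda [1/h]:
# prior Gamma(a, b), data = n failures observed over T operating hours.
a_prior, b_prior = 0.5, 1.0e5       # shape, rate (assumed prior)
n_fail, hours = 3, 2.0e6            # assumed operating experience

a_post = a_prior + n_fail
b_post = b_prior + hours

mean_post = a_post / b_post          # posterior mean failure rate
var_post = a_post / b_post**2        # posterior variance

# Approximate the posterior with a lognormal by matching the first two
# moments, mirroring the distributional approximation in the record.
sigma2 = np.log(1.0 + var_post / mean_post**2)
mu = np.log(mean_post) - 0.5 * sigma2
median_ln = np.exp(mu)               # lognormal median < mean (right skew)
```

Moment matching guarantees the lognormal reproduces the posterior mean and variance exactly, while keeping the right-skewed shape typical of rate posteriors.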

  14. Energy planning of a hospital using Mathematical Programming and Monte Carlo simulation for dealing with uncertainty in the economic parameters

    Energy Technology Data Exchange (ETDEWEB)

    Mavrotas, George; Florios, Kostas; Vlachou, Dimitra [Laboratory of Industrial and Energy Economics, School of Chemical Engineering, National Technical University of Athens, Zographou Campus, 15780 Athens (Greece)

    2010-04-15

    For more than 40 years, Mathematical Programming has been the traditional tool for energy planning at the national or regional level aiming at cost minimization subject to specific technological, political and demand satisfaction constraints. The liberalization of the energy market along with the ongoing technical progress increased the level of competition and forced energy consumers, even at the unit level, to make their choices among a large number of alternative or complementary energy technologies, fuels and/or suppliers. In the present work we develop a modelling framework for energy planning in units of the tertiary sector giving special emphasis to model reduction and to the uncertainty of the economic parameters. In the given case study, the energy rehabilitation of a hospital in Athens is examined and the installation of a cogeneration, absorption and compression unit is examined for the supply of the electricity, heating and cooling load. The basic innovation of the given energy model lies in the uncertainty modelling through the combined use of Mathematical Programming (namely, Mixed Integer Linear Programming, MILP) and Monte Carlo simulation that permits the risk management for the most volatile parameters of the objective function such as the fuel costs and the interest rate. The results come in the form of probability distributions that provide fruitful information to the decision maker. The effect of model reduction through appropriate data compression of the load data is also addressed. (author)
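The Monte Carlo layer of this approach, sampling the volatile economic parameters and turning the objective into a probability distribution, can be sketched without the MILP itself. All cash flows, prices, and distributions below are hypothetical placeholders for the hospital case study.

```python
import numpy as np

rng = np.random.default_rng(11)

# Monte Carlo over the volatile economic parameters: fuel price and
# interest (discount) rate. Distributions and ranges are assumed.
n = 20_000
fuel_price = rng.normal(60.0, 10.0, n)        # EUR/MWh of fuel
rate = rng.uniform(0.03, 0.08, n)             # annual discount rate

capex = 1.2e6                                  # EUR, cogeneration unit (assumed)
annual_mwh = 8_000.0                           # energy supplied per year
grid_price = 85.0                              # EUR/MWh avoided (assumed)

years = np.arange(1, 16)                       # 15-year horizon
annual = annual_mwh * (grid_price - fuel_price)            # yearly saving
discount = (1.0 + rate[:, None]) ** -years[None, :]        # per-sample factors
npv = (annual[:, None] * discount).sum(axis=1) - capex     # NPV per sample

p_loss = np.mean(npv < 0.0)                    # probability of a negative NPV
```

Instead of a single deterministic cost figure, the decision maker sees the full NPV distribution and the risk of loss, which is the "fruitful information" the abstract refers to.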

  15. Validation and uncertainty analysis of a pre-treatment 2D dose prediction model

    Science.gov (United States)

    Baeza, Jose A.; Wolfs, Cecile J. A.; Nijsten, Sebastiaan M. J. J. G.; Verhaegen, Frank

    2018-02-01

    Independent verification of complex treatment delivery with megavolt photon beam radiotherapy (RT) has been effectively used to detect and prevent errors. This work presents the validation and uncertainty analysis of a model that predicts 2D portal dose images (PDIs) without a patient or phantom in the beam. The prediction model is based on an exponential point dose model with separable primary and secondary photon fluence components. The model includes a scatter kernel, off-axis ratio map, transmission values and penumbra kernels for beam-delimiting components. These parameters were derived through a model fitting procedure supplied with point dose and dose profile measurements of radiation fields. The model was validated against a treatment planning system (TPS; Eclipse) and radiochromic film measurements for complex clinical scenarios, including volumetric modulated arc therapy (VMAT). Confidence limits on fitted model parameters were calculated based on simulated measurements. A sensitivity analysis was performed to evaluate the effect of the parameter uncertainties on the model output. For the maximum uncertainty, the maximum deviating measurement sets were propagated through the fitting procedure and the model. The overall uncertainty was assessed using all simulated measurements. The validation of the prediction model against the TPS and the film showed a good agreement, with on average 90.8% and 90.5% of pixels passing a (2%,2 mm) global gamma analysis respectively, with a low dose threshold of 10%. The maximum and overall uncertainty of the model is dependent on the type of clinical plan used as input. The results can be used to study the robustness of the model. A model for predicting accurate 2D pre-treatment PDIs in complex RT scenarios can be used clinically and its uncertainties can be taken into account.
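The (2%, 2 mm) global gamma criterion used to validate the prediction model can be illustrated in 1D. This is a simplified sketch of the standard gamma metric, not the clinical implementation, and the profiles are synthetic.

```python
import numpy as np

def gamma_pass_rate(ref, ev, spacing_mm, dose_tol=0.02, dist_mm=2.0,
                    low_dose_cut=0.10):
    """Global 1D gamma analysis (defaults: 2%, 2 mm, 10% low-dose threshold).

    For each reference point above the low-dose threshold, find the minimum
    combined dose-difference / distance-to-agreement metric over all
    evaluated points; the point passes if that minimum is <= 1.
    """
    ref, ev = np.asarray(ref, float), np.asarray(ev, float)
    norm_dose = dose_tol * ref.max()              # global dose criterion
    x = np.arange(len(ref)) * spacing_mm
    passing = []
    for i in np.where(ref >= low_dose_cut * ref.max())[0]:
        dd = (ev - ref[i]) / norm_dose            # dose-difference term
        dx = (x - x[i]) / dist_mm                 # distance term
        passing.append(np.sqrt(dd**2 + dx**2).min() <= 1.0)
    return np.mean(passing)

# Example: a 1 mm spatial shift passes 2%/2 mm; a 10% dose error does not.
profile = np.exp(-((np.arange(121) - 60) / 15.0) ** 2)
rate_shift = gamma_pass_rate(profile, np.roll(profile, 1), spacing_mm=1.0)
rate_scaled = gamma_pass_rate(profile, 1.10 * profile, spacing_mm=1.0)
```

The two example cases show why gamma is used for portal dose comparison: it tolerates small spatial misalignments while still flagging genuine dose errors.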

  16. Error and Uncertainty Analysis for Ecological Modeling and Simulation

    National Research Council Canada - National Science Library

    Gertner, George

    1998-01-01

    The main objectives of this project are a) to develop a general methodology for conducting sensitivity and uncertainty analysis and building error budgets in simulation modeling over space and time; and b...

  17. Uncertainty and sensitivity analysis for photovoltaic system modeling.

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, Clifford W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Pohl, Andrew Phillip [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jordan, Dirk [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2013-12-01

    We report an uncertainty and sensitivity analysis for modeling DC energy from photovoltaic systems. We consider two systems, each comprised of a single module using either crystalline silicon or CdTe cells, and located either at Albuquerque, NM, or Golden, CO. Output from a PV system is predicted by a sequence of models. Uncertainty in the output of each model is quantified by empirical distributions of each model's residuals. We sample these distributions to propagate uncertainty through the sequence of models to obtain an empirical distribution for each PV system's output. We considered models that: (1) translate measured global horizontal, direct and global diffuse irradiance to plane-of-array irradiance; (2) estimate effective irradiance from plane-of-array irradiance; (3) predict cell temperature; and (4) estimate DC voltage, current and power. We found the uncertainty in PV system output to be relatively small, on the order of 1% for daily energy. Four alternative models were considered for the POA irradiance modeling step; we did not find the choice of one of these models to be of great significance. However, we observed that the POA irradiance model introduced a bias of upwards of 5% of daily energy which translates directly to a systematic difference in predicted energy. Sensitivity analyses relate uncertainty in the PV system output to uncertainty arising from each model. We found the residuals arising from the POA irradiance and the effective irradiance models to be the dominant contributors to residuals for daily energy, for either technology or location considered. This analysis indicates that efforts to reduce the uncertainty in PV system output should focus on improvements to the POA and effective irradiance models.
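The propagation step described here, resampling each model's empirical residual distribution through the model chain, can be sketched as a bootstrap. The three toy sub-models and the synthetic residual pools below are illustrative stand-ins for the measured residuals of the study.

```python
import numpy as np

rng = np.random.default_rng(13)

# Synthetic stand-ins for the empirical residual pools of each model step
# (relative residuals; in the study these come from measurements).
resid_poa = rng.normal(0.0, 0.03, 500)       # POA irradiance model residuals
resid_eff = rng.normal(0.0, 0.02, 500)       # effective-irradiance residuals
resid_dc = rng.normal(0.0, 0.01, 500)        # DC power model residuals

def pv_chain(ghi, poa_err, eff_err, dc_err):
    """Toy model chain: GHI -> POA -> effective irradiance -> DC power."""
    poa = ghi * 1.1 * (1.0 + poa_err)         # toy transposition model
    eff = poa * 0.97 * (1.0 + eff_err)        # toy soiling/reflection step
    return eff * 0.18 * (1.0 + dc_err)        # toy DC conversion model

# Propagate by resampling each residual pool independently.
n = 10_000
samples = pv_chain(800.0,                     # W/m^2 GHI (assumed)
                   rng.choice(resid_poa, n),
                   rng.choice(resid_eff, n),
                   rng.choice(resid_dc, n))
rel_spread = samples.std() / samples.mean()   # combined relative uncertainty
```

Because the per-step residuals combine roughly in quadrature, the step with the widest residual pool (here POA) dominates the output spread, which is the sensitivity result the abstract reports.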

  18. Bayesian estimation of parameters in a regional hydrological model

    Directory of Open Access Journals (Sweden)

    K. Engeland

    2002-01-01

    Full Text Available This study evaluates the applicability of the distributed, process-oriented Ecomag model for prediction of daily streamflow in ungauged basins. The Ecomag model is applied as a regional model to nine catchments in the NOPEX area, using Bayesian statistics to estimate the posterior distribution of the model parameters conditioned on the observed streamflow. The distribution is calculated by Markov Chain Monte Carlo (MCMC) analysis. The Bayesian method requires formulation of a likelihood function for the parameters and three alternative formulations are used. The first is a subjectively chosen objective function that describes the goodness of fit between the simulated and observed streamflow, as defined in the GLUE framework. The second and third formulations are more statistically correct likelihood models that describe the simulation errors. The full statistical likelihood model describes the simulation errors as an AR(1) process, whereas the simple model excludes the auto-regressive part. The statistical parameters depend on the catchments and the hydrological processes and the statistical and the hydrological parameters are estimated simultaneously. The results show that the simple likelihood model gives the most robust parameter estimates. The simulation error may be explained to a large extent by the catchment characteristics and climatic conditions, so it is possible to transfer knowledge about them to ungauged catchments. The statistical models for the simulation errors indicate that structural errors in the model are more important than parameter uncertainties. Keywords: regional hydrological model, model uncertainty, Bayesian analysis, Markov Chain Monte Carlo analysis
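The MCMC machinery behind posterior estimates like these can be sketched with a minimal Metropolis sampler and the simple (independent-error) Gaussian likelihood the record favours. The "hydrological model" below is a toy exponential recession with one parameter; all data and settings are synthetic.

```python
import numpy as np

rng = np.random.default_rng(17)

# Toy model: recession flow q(t) = exp(-k t); synthetic observations with
# independent Gaussian errors (the "simple" likelihood of the record).
obs_t = np.arange(50)
true_k = 0.15
obs_q = np.exp(-true_k * obs_t) + rng.normal(0.0, 0.02, 50)

def log_likelihood(k, sigma=0.02):
    sim = np.exp(-k * obs_t)
    return -0.5 * np.sum(((obs_q - sim) / sigma) ** 2)

# Random-walk Metropolis with a flat positivity prior on k.
chain = [0.05]                                  # crude starting value
for _ in range(5000):
    prop = chain[-1] + rng.normal(0.0, 0.01)
    if prop > 0 and np.log(rng.uniform()) < log_likelihood(prop) - log_likelihood(chain[-1]):
        chain.append(prop)                      # accept the proposal
    else:
        chain.append(chain[-1])                 # reject: repeat current state

posterior = np.array(chain[1000:])              # discard burn-in
```

The retained samples approximate the posterior of k conditioned on the streamflow; in the regional setting of the study, statistical and hydrological parameters are sampled jointly in the same way.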

  19. The cascade of uncertainty in modeling the impacts of climate change on Europe's forests

    Science.gov (United States)

    Reyer, Christopher; Lasch-Born, Petra; Suckow, Felicitas; Gutsch, Martin

    2015-04-01

    Projecting the impacts of global change on forest ecosystems is a cornerstone for designing sustainable forest management strategies and paramount for assessing the potential of Europe's forests to contribute to the EU bioeconomy. Research on climate change impacts on forests relies to a large extent on model applications along a model chain from Integrated Assessment Models to General and Regional Circulation Models that provide important driving variables for forest models, or to decision support systems that synthesize the findings of more detailed forest models to inform forest managers. At each step in the model chain, model-specific uncertainties about, amongst others, parameter values, input data or model structure accumulate, leading to a cascade of uncertainty. For example, climate change impacts on forests strongly depend on the in- or exclusion of CO2-effects or on the use of an ensemble of climate models rather than relying on one particular climate model. In the past, these uncertainties have not, or only partly, been considered in studies of climate change impacts on forests. This has left managers and decision-makers in doubt about how robust the projected impacts on forest ecosystems are. We deal with this cascade of uncertainty in a structured way, and the objective of this presentation is to assess how different types of uncertainties affect projections of the effects of climate change on forest ecosystems. To address this objective we synthesized a large body of scientific literature on modeled productivity changes and the effects of extreme events on plant processes. Furthermore, we apply the process-based forest growth model 4C to forest stands all over Europe and assess how different climate models, emission scenarios and assumptions about the parameters and structure of 4C affect the uncertainty of the model projections. We show that there are consistent regional changes in forest productivity such as an increase in NPP in cold and wet regions while

  20. Data-driven Modelling for decision making under uncertainty

    Science.gov (United States)

    Angria S, Layla; Dwi Sari, Yunita; Zarlis, Muhammad; Tulus

    2018-01-01

    Uncertainty in decision making has become a much-discussed issue in operations research. Many models have been presented, one of which is data-driven modelling (DDM). The purpose of this paper is to extract and recognize patterns in data, and to find the best model for decision-making problems under uncertainty, using a data-driven modelling approach with linear programming, linear and nonlinear differential equations, and a Bayesian approach. Candidate models are tested against error criteria, and the model with the smallest error is selected as the best model to use.

  1. Sliding mode fault tolerant control dealing with modeling uncertainties and actuator faults.

    Science.gov (United States)

    Wang, Tao; Xie, Wenfang; Zhang, Youmin

    2012-05-01

    In this paper, two sliding mode control algorithms are developed for nonlinear systems with both modeling uncertainties and actuator faults. The first algorithm is developed under the assumption that the uncertainty bounds are known. Different design parameters are utilized to deal with modeling uncertainties and actuator faults, respectively. The second algorithm is an adaptive version of the first one, developed to accommodate uncertainties and faults without exact bounds information. The stability of the overall control systems is proved by using a Lyapunov function. The effectiveness of the developed algorithms has been verified on a nonlinear longitudinal model of the Boeing 747-100/200. Copyright © 2012 ISA. Published by Elsevier Ltd. All rights reserved.
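The core idea of the first algorithm, a switching control whose gain dominates a known disturbance bound, can be illustrated on a scalar system. The dynamics, gain, and disturbance below are illustrative, not the paper's Boeing 747 model.

```python
import math

# Scalar sliding-mode sketch: drive x to zero despite an unknown but
# bounded disturbance d(t), |d| <= 0.8, using u = -k*sign(x) with the
# design gain k chosen above the disturbance bound (values illustrative).
dt, k = 0.001, 2.0
x = 1.0
for step in range(5000):  # simulate 5 s
    t = step * dt
    d = 0.8 * math.sin(5.0 * t)                    # disturbance (unknown to u)
    u = -k * (1.0 if x > 0 else -1.0 if x < 0 else 0.0)
    x += (u + d) * dt                              # forward-Euler integration
print(f"|x| after 5 s: {abs(x):.4f}")
```

Because k exceeds the disturbance bound, x decreases at a rate of at least k - 0.8 until it reaches the sliding surface x = 0, after which it chatters in a band whose width shrinks with the time step; the adaptive variant in the paper estimates the required gain online instead of assuming the bound.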

  2. Parametric uncertainty analysis of pulse wave propagation in a model of a human arterial network

    Science.gov (United States)

    Xiu, Dongbin; Sherwin, Spencer J.

    2007-10-01

    Reduced models of human arterial networks are an efficient approach to analyze quantitative macroscopic features of human arterial flows. The justification for such models typically arises from the significantly long wavelength associated with the system in comparison to the lengths of arteries in the networks. Although these types of models have been employed extensively and many issues associated with their implementations have been widely researched, the issue of data uncertainty has received comparatively little attention. Similar to many biological systems, a large amount of uncertainty exists in the value of the parameters associated with the models. Clearly, reliable assessment of the system behaviour cannot be made unless the effect of such data uncertainty is quantified. In this paper we present a study of parametric data uncertainty in reduced modelling of human arterial networks, which is governed by a hyperbolic system. The uncertain parameters are modelled as random variables and the governing equations for the arterial network therefore become stochastic. Stochastic hyperbolic systems of this type have not previously been systematically studied, owing to the difficulties introduced by the uncertainty, such as a potential change in the mathematical character of the system and in the imposition of boundary conditions. We demonstrate how the application of a high-order stochastic collocation method based on the generalized polynomial chaos expansion, combined with a discontinuous Galerkin spectral/hp element discretization in physical space, can successfully simulate this type of hyperbolic system subject to uncertain inputs with bounds. Building upon a numerical study of propagation of uncertainty and sensitivity in a simplified model with a single bifurcation, a systematic parameter sensitivity analysis is conducted on the wave dynamics in a multiple bifurcating human arterial network. Using the physical understanding of the dynamics of pulse waves in these types of
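The collocation idea is easy to demonstrate on a scalar analogue: run the deterministic model at quadrature nodes of the random parameter and combine the outputs with quadrature weights to recover moments. The exponential "model" below is a toy stand-in for the arterial-network solver, with one uniformly distributed uncertain parameter.

```python
import math

# 5-point Gauss-Legendre nodes and weights on [-1, 1] (standard values).
nodes = [-0.9061798459386640, -0.5384693101056831, 0.0,
          0.5384693101056831,  0.9061798459386640]
weights = [0.2369268850561891, 0.4786286704993665, 0.5688888888888889,
           0.4786286704993665, 0.2369268850561891]

def model(z):
    """Toy stand-in for the deterministic solver: output depends
    smoothly on one uncertain parameter z."""
    return math.exp(z)

# Stochastic collocation: evaluate the deterministic model at each node,
# then combine with quadrature weights to get moments of the output for
# z ~ Uniform(-1, 1) (density 1/2).
mean = sum(w * model(x) for w, x in zip(weights, nodes)) / 2.0
second = sum(w * model(x) ** 2 for w, x in zip(weights, nodes)) / 2.0
var = second - mean ** 2
print(f"mean = {mean:.6f} (exact: {math.sinh(1):.6f}), variance = {var:.6f}")
```

Five model runs reproduce the exact mean, sinh(1), to roughly machine precision; the smoothness of the output in the random parameter is what makes collocation so much cheaper than Monte Carlo here.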

  3. SR-Can. Data and uncertainty assessment. Migration parameters for the bentonite buffer in the KBS-3 concept

    Energy Technology Data Exchange (ETDEWEB)

    Ochs, Michael; Talerico, Caterina [BMG Engineering Ltd, Zuerich (Switzerland)

    2004-08-01

    SKB is currently preparing license applications related to the deep repository for spent nuclear fuel and an encapsulation plant. The present report is one of several specific data reports feeding into the interim reporting for the latter application; it is concerned with the derivation and recommendation of radionuclide migration input parameters for an MX-80 bentonite buffer for PA models. Recommended values and associated uncertainties are derived and documented for a total of 38 elements and oxidation states for the following parameters: diffusion-available porosity (ε), effective diffusivity (D_e) and distribution coefficient (K_d). Because of the conditional nature of these parameters, particularly of K_d, they were derived specifically for the conditions expected to be relevant for PA consequence calculations. K_d values were generally evaluated for the specific porewater composition and solid/water ratio representative of MX-80 compacted to 1,590 kg/m³. Because of the highly conditional nature of K_d, this was done for several porewater compositions which reflect possible variations in geochemical boundary conditions. D_e and ε were derived as a function of density. Parameter derivation was based on systematic datasets available in the literature and/or on thermodynamic models. Associated uncertainties were assessed for a given set of PA conditions and as a function of variability in these conditions. In a final step, apparent diffusivity (D_a) values were calculated from the recommended parameters and compared with independent experimental measurements to arrive at self-consistent sets of migration parameters.
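The final step, computing apparent diffusivity from the three recommended parameters, follows the standard porous-media relation D_a = D_e / (ε + ρ_d·K_d). The parameter values below are illustrative, not the report's recommended values.

```python
# Apparent diffusivity from the migration parameters, via the standard
# relation D_a = D_e / (eps + rho_d * K_d). Values are illustrative only.
rho_d = 1590.0   # dry density of compacted MX-80 [kg/m^3]
eps = 0.41       # diffusion-available porosity [-]
D_e = 1.0e-10    # effective diffusivity [m^2/s]
K_d = 0.01       # distribution coefficient [m^3/kg]

D_a = D_e / (eps + rho_d * K_d)
print(f"D_a = {D_a:.3e} m^2/s")
```

Note how strongly sorption controls the result: even this modest K_d reduces the apparent diffusivity by more than an order of magnitude relative to D_e/ε, which is why uncertainty in K_d dominates the derived D_a.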

  4. Model-specification uncertainty in future forest pest outbreak.

    Science.gov (United States)

    Boulanger, Yan; Gray, David R; Cooke, Barry J; De Grandpré, Louis

    2016-04-01

    Climate change will modify forest pest outbreak characteristics, although there are disagreements regarding the specifics of these changes. A large part of this variability may be attributed to model specifications. As a case study, we developed a consensus model predicting spruce budworm (SBW, Choristoneura fumiferana [Clem.]) outbreak duration using two different predictor data sets and six different correlative methods. The model was used to project outbreak duration and the uncertainty associated with using different data sets and correlative methods (= model-specification uncertainty) for 2011-2040, 2041-2070 and 2071-2100, according to three forcing scenarios (RCP 2.6, RCP 4.5 and RCP 8.5). The consensus model showed very high explanatory power and low bias. The model projected a more important northward shift and decrease in outbreak duration under the RCP 8.5 scenario. However, variation in single-model projections increases with time, making future projections highly uncertain. Notably, the magnitude of the shifts in northward expansion, overall outbreak duration and the patterns of outbreak duration at the southern edge were highly variable according to the predictor data set and correlative method used. We also demonstrated that variation in forcing scenarios contributed only slightly to the uncertainty of model projections compared with the two sources of model-specification uncertainty. Our approach helped to quantify model-specification uncertainty in future forest pest outbreak characteristics. It may contribute to sounder decision-making by acknowledging the limits of the projections and help to identify areas where model-specification uncertainty is high. As such, we further stress that this uncertainty should be strongly considered when making forest management plans, notably by adopting adaptive management strategies so as to reduce future risks. © 2015 Her Majesty the Queen in Right of Canada. Global Change Biology © 2015 Published by John Wiley & Sons Ltd.

  5. Metrics for evaluating performance and uncertainty of Bayesian network models

    Science.gov (United States)

    Bruce G. Marcot

    2012-01-01

    This paper presents a selected set of existing and new metrics for gauging Bayesian network model performance and uncertainty. Selected existing and new metrics are discussed for conducting model sensitivity analysis (variance reduction, entropy reduction, case file simulation); evaluating scenarios (influence analysis); depicting model complexity (numbers of model...

  6. Comparative uncertainty analysis of copper loads in stormwater systems using GLUE and grey-box modeling

    DEFF Research Database (Denmark)

    Lindblom, Erik Ulfson; Madsen, Henrik; Mikkelsen, Peter Steen

    2007-01-01

    of the measurements. In the second attempt the conceptual model is reformulated to a grey-box model followed by parameter estimation. Given data from an extensive measurement campaign, the two methods suggest that the output of the stormwater pollution model is associated with significant uncertainty....... With the proposed model and input data, the GLUE analysis show that the total sampled copper mass can be predicted within a range of +/- 50% of the median value ( 385 g), whereas the grey-box analysis showed a prediction uncertainty of less than +/- 30%. Future work will clarify the pros and cons of the two methods...

  7. Sensitivity and uncertainty analysis of a sediment transport model: a global approach

    Science.gov (United States)

    Chang, C.; Yang, J.; Tung, Y.

    1993-12-01

    Computerized sediment transport models are frequently employed to quantitatively simulate the movement of sediment materials in rivers. In spite of the deterministic nature of the models, the outputs are subject to uncertainty due to the inherent variability of many input parameters in time and in space, along with the lack of complete understanding of the processes involved. The commonly used first-order method for sensitivity and uncertainty analysis approximates a model by linear expansion at a selected point. Conclusions from the first-order method can be of limited use if the model responses vary drastically at different points in parameter space. To obtain the global sensitivity and uncertainty features of a sediment transport model over a larger input parameter space, the Latin hypercube sampling technique along with regression procedures was employed. For the purpose of illustrating the methodologies, the computer model HEC2-SR was selected in this study. Through an example application, the sensitivity and uncertainty results for water surface elevation, bed elevation and sediment discharge are discussed.
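The combination of Latin hypercube sampling and regression-based sensitivity measures can be sketched as follows; the three-parameter linear "model" below is a hypothetical stand-in for HEC2-SR.

```python
import random

random.seed(7)

def latin_hypercube(n, dims):
    """n stratified samples in [0, 1]^dims: one point per stratum per dim."""
    cols = []
    for _ in range(dims):
        strata = [(i + random.random()) / n for i in range(n)]
        random.shuffle(strata)
        cols.append(strata)
    return list(zip(*cols))

def model(x):
    # Hypothetical response: output dominated by x[0], then x[1], then x[2].
    return 5.0 * x[0] + 2.0 * x[1] + 0.2 * x[2]

samples = latin_hypercube(200, 3)
y = [model(x) for x in samples]

def corr(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((u - ma) * (v - mb) for u, v in zip(a, b))
    va = sum((u - ma) ** 2 for u in a)
    vb = sum((v - mb) ** 2 for v in b)
    return cov / (va * vb) ** 0.5

# With independent inputs, each input-output correlation acts as a
# standardized regression coefficient and ranks parameter importance.
src = [corr([s[i] for s in samples], y) for i in range(3)]
print("input-output correlations:", [round(c, 3) for c in src])
```

Unlike a first-order expansion at one point, the ranking here reflects behaviour across the whole sampled parameter space, which is the "global" feature the abstract emphasizes.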

  8. Evaluation of Uncertainties in hydrogeological modeling and groundwater flow analyses. Model calibration

    International Nuclear Information System (INIS)

    Ijiri, Yuji; Ono, Makoto; Sugihara, Yutaka; Shimo, Michito; Yamamoto, Hajime; Fumimura, Kenichi

    2003-03-01

    This study evaluates uncertainty in hydrogeological modeling and groundwater flow analysis. Three-dimensional groundwater flow at the Shobasama site in Tono was analyzed using two continuum models and one discontinuous model. The study domain covered an area of four kilometers in the east-west direction and six kilometers in the north-south direction. Moreover, to evaluate how uncertainties in the hydrogeological structure model and the groundwater flow simulation results decrease as the investigation progresses, the models were updated and calibrated for several hydrogeological modeling and groundwater flow analysis techniques, based on newly acquired information and knowledge. The findings are as follows. When parameters and structures were reset in updating the models from the previous year's conditions, there was no major difference in handling between the modeling methods. Model calibration was performed by matching numerical simulations to observations of the pressure response caused by opening and closing a packer in the MIU-2 borehole. Each analysis technique reduced the residual sum of squares between observations and simulation results by adjusting hydrogeological parameters. However, each model adjusted different parameters, such as hydraulic conductivity, effective porosity, specific storage, and anisotropy. When calibrating models, the phenomena sometimes cannot be explained by adjusting parameters alone; in such cases, further investigation may be required to clarify the details of the hydrogeological structure. Comparing the research from its beginning to this year leads to the following conclusions about the investigation: (1) transient hydraulic data are an effective means of reducing the uncertainty of the hydrogeological structure; (2) effective porosity for calculating pore water velocity of

  9. Review of uncertainty estimates associated with models for assessing the impact of breeder reactor radioactivity releases

    International Nuclear Information System (INIS)

    Miller, C.; Little, C.A.

    1982-08-01

    The purpose is to summarize estimates, based on currently available data, of the uncertainty associated with radiological assessment models. The models examined herein are those recommended previously for use in breeder reactor assessments. Uncertainty estimates are presented for models of atmospheric and hydrologic transport, terrestrial and aquatic food-chain bioaccumulation, and internal and external dosimetry. Both long-term and short-term release conditions are discussed. The uncertainty estimates presented in this report indicate that, for many sites, generic models and representative parameter values may be used to calculate doses from annual average radionuclide releases when these calculated doses are on the order of one-tenth or less of a relevant dose limit. For short-term, accidental releases, especially those from breeder reactors located at sites dominated by complex terrain and/or coastal meteorology, the uncertainty in the dose calculations may be much larger than an order of magnitude. As a result, it may be necessary to incorporate site-specific information into the dose calculation under these circumstances to reduce this uncertainty. However, even using site-specific information, natural variability and the uncertainties in the dose conversion factor will likely result in an overall uncertainty of greater than an order of magnitude for predictions of dose or concentration in environmental media following short-term releases.

  10. Impact of rainfall temporal resolution on urban water quality modelling performance and uncertainties.

    Science.gov (United States)

    Manz, Bastian Johann; Rodríguez, Juan Pablo; Maksimović, Cedo; McIntyre, Neil

    2013-01-01

    A key control on the response of an urban drainage model is how well the observed rainfall records represent the real rainfall variability. Particularly in urban catchments with fast response flow regimes, the selection of temporal resolution in rainfall data collection is critical. Furthermore, the impact of the rainfall variability on the model response is amplified for water quality estimates, as uncertainty in rainfall intensity affects both the rainfall-runoff and pollutant wash-off sub-models, thus compounding uncertainties. A modelling study was designed to investigate the impact of altering rainfall temporal resolution on the magnitude and behaviour of uncertainties associated with the hydrological modelling compared with water quality modelling. The case study was an 85-ha combined sewer sub-catchment in Bogotá (Colombia). Water quality estimates showed greater sensitivity to the inter-event variability in rainfall hyetograph characteristics than to changes in the rainfall input temporal resolution. Overall, uncertainties from the water quality model were two- to five-fold those of the hydrological model. However, owing to the intrinsic scarcity of observations in urban water quality modelling, total model output uncertainties, especially from the water quality model, were too large to make recommendations for particular model structures or parameter values with respect to rainfall temporal resolution.
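The core effect behind this study, coarsening rainfall temporal resolution damps the peak intensities that drive both runoff and pollutant wash-off, can be shown with a synthetic hyetograph.

```python
# A 1-minute synthetic hyetograph (mm/min) with a short intense burst,
# aggregated to 15-minute resolution.
rain_1min = [0.0] * 10 + [0.2, 1.5, 4.0, 2.0, 0.5] + [0.1] * 15

def aggregate(series, window):
    """Mean intensity over non-overlapping windows of `window` steps."""
    return [sum(series[i:i + window]) / window
            for i in range(0, len(series), window)]

rain_15min = aggregate(rain_1min, 15)
print("peak intensity at 1-min resolution :", max(rain_1min), "mm/min")
print("peak intensity at 15-min resolution:", round(max(rain_15min), 3), "mm/min")
```

The total rainfall depth is preserved by the averaging, but the peak intensity collapses; a wash-off sub-model that responds nonlinearly to intensity therefore compounds this loss of information, consistent with the larger uncertainties the abstract reports for water quality estimates.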

  11. Graphical models and their (un)certainties

    NARCIS (Netherlands)

    Leisink, M.A.R.

    2004-01-01

    A graphical model is a powerful tool for dealing with complex probability models. Although in principle any set of probabilistic relationships can be modelled, calculating the actual numbers can be very hard. Every graphical model suffers from a phenomenon known as exponential scaling. To

  12. Evaluation of soil, unsaturated, and saturated zone parameter uncertainty using GSFlow and PEST in an agricultural watershed

    Science.gov (United States)

    Zuidema, S.; Davis, J. M.

    2011-12-01

    A coupled surface-ground water hydrological model of the Burley-DeMerritt Organic Dairy Research Farm in southeastern New Hampshire is under continued development in support of a long-term mission to understand nutrient dynamics and water use in sustainable New England dairy operations. To build on previous simulations of ground water recharge and nitrogen transport, an estimate of net recharge under an array of climate scenarios is required to facilitate modeling of nutrient dynamics for the projected life span of the dairy farm. The model must therefore incorporate spatially distributed surface and soil zone processes that influence the shallow ground water system. GSFlow couples the USGS Precipitation Runoff Modeling System and MODFLOW codes and is used to simulate surface, soil, and subsurface hydrological processes using a suite of empirical and process-based algorithms and parameters. Topography of the 83 hectare model domain was derived from a 1-meter horizontal resolution LiDAR DEM with centimeter-scale accuracy. Zonation at the soil surface was derived from detailed soils mapping, aerial land cover assessment, and drainage boundaries derived from the LiDAR DEM. Meteorological forcing data are taken from nearby (5 km) NCDC and AIRMAP meteorological towers. The farm's catchment consists of regionally common land covers including pasture, forest, and forested wetland. An array of surface model structures and spatial discretizations is evaluated, ranging from fine scale, incorporating areas of consistent land cover or soil types, to the catchment scale. Parameter identification and uncertainty analysis for both PRMS and MODFLOW components are conducted using the PEST software, where distributed measurements of hydraulic head, soil moisture, and streamflow are weighted by measurement uncertainty and the relative measurement abundance or redundancy to define a model-to-measurement misfit objective function. Model validation will be conducted against data collected in the

  13. Uncertainty modelling of critical column buckling for reinforced ...

    Indian Academy of Sciences (India)

    Buckling is a critical issue for structural stability in structural design. In most of the buckling analyses, applied loads, structural and material properties are considered certain. However, in reality, these parameters are uncertain. Therefore, a prognostic solution is necessary and uncertainties have to be considered. Fuzzy logic ...

  14. Sensitiveness Analysis of Neutronic Parameters Due to Uncertainty in Thermo-hydraulic parameters on CAREM-25 Reactor

    International Nuclear Information System (INIS)

    Serra, Oscar

    2000-01-01

    Studies were performed on the effect of uncertainty in the values of several thermo-hydraulic parameters on the core behaviour of the CAREM-25 reactor. Using the chained codes CITVAP-THERMIT and perturbing the reference states, it was found that the effects on total power were not very important, but were much larger for the pressure. Furthermore, the effects were hardly significant for perturbations of the void fraction calculation and the fuel temperature. The reactivity and the power peaking factor changed markedly in the case of the coolant flow. We conclude that this procedure is adequate and useful for our purpose.

  15. The Impact of Economic Parameter Uncertainty Growth on Regional Energy Demand Assessment

    Directory of Open Access Journals (Sweden)

    Olga Vasilyevna Mazurova

    2017-06-01

    The article deals with forecasting studies of energy demand and prices in the region, in terms of the complex interconnections between the economy and energy and the growing uncertainty of the future development of the country and its territories. The authors propose a methodological approach which combines the assessment of the price elasticity of energy demand with the optimization of regional energy and fuel supply. In this case, the price elasticity of demand is determined by comparing the cost-effectiveness of using different types of fuel and energy by different consumers. The originality of the proposed approach consists in simulating the behaviour of suppliers (energy companies) and large customers (power plants, boiler rooms, industry, transport, the population) depending on energy price changes, existing and new technologies, energy-saving activities and restrictions on fuel supplies. To take into account the uncertainty of future economic and energy conditions, some parameters, such as prospective technical and economic characteristics, prices and technological parameters, are set as intervals of possible values with different probability levels. This approach allows multivariate studies with different combinations of the expected conditions, yielding as a result the range of projected values of the studied indicators. The multivariate calculations show that fuel demand has a nonlinear dependence on consumer characteristics, pricing, the projection horizon, and the nature of the uncertainty about future conditions. The authors show that this effect can be significant and should be considered in forecasts of the development of the fuel and energy sector. The methodological approach and quantitative evaluations can be used to improve the economic and energy development strategies of the country and its regions.

  16. Geostatistical simulation of geological architecture and uncertainty propagation in groundwater modeling

    DEFF Research Database (Denmark)

    He, Xiulan

    be compensated by model parameters, e.g. when hydraulic heads are considered. However, geological structure is the primary source of uncertainty with respect to simulations of groundwater age and capture zone. Operational MPS based software has been on stage for just around ten years; yet, issues regarding...... parameters and model structures, which are the primary focuses of this PhD research. Parameter uncertainty was analyzed using an optimization tool (PEST: Parameter ESTimation) in combination with a random sampling method (LHS: Latin Hypercube Sampling). Model structure, namely geological architecture...... geological structures of these three sites provided appropriate conditions for testing the methods. Our study documented that MPS is an efficient approach for simulating geological heterogeneity, especially for non-stationary system. The high resolution of geophysical data such as SkyTEM is valuable both...

  17. Exploring uncertainty of Amazon dieback in a perturbed parameter Earth system ensemble.

    Science.gov (United States)

    Boulton, Chris A; Booth, Ben B B; Good, Peter

    2017-12-01

    The future of the Amazon rainforest is unknown due to uncertainties in projected climate change and the response of the forest to this change (forest resiliency). Here, we explore the effect of some uncertainties in climate and land surface processes on the future of the forest, using a perturbed physics ensemble of HadCM3C. This is the first time Amazon forest changes are presented using an ensemble exploring both land vegetation processes and physical climate feedbacks in a fully coupled modelling framework. Under three different emissions scenarios, we measure the change in the forest coverage by the end of the 21st century (the transient response) and make a novel adaptation to a previously used method known as "dry-season resilience" to predict the long-term committed response of the forest, should the state of the climate remain constant past 2100. Our analysis of this ensemble suggests that there will be a high chance of greater forest loss on longer timescales than is realized by 2100, especially for mid-range and low emissions scenarios. In both the transient and predicted committed responses, there is an increasing uncertainty in the outcome of the forest as the strength of the emissions scenarios increases. It is important to note, however, that very few of the simulations produce future forest loss of the magnitude previously shown under the standard model configuration. We find that low optimum temperatures for photosynthesis and a high minimum leaf area index needed for the forest to compete for space appear to be precursors for dieback. We then decompose the uncertainty into that associated with future climate change and that associated with forest resiliency, finding that it is important to reduce the uncertainty in both of these if we are to better determine the Amazon's outcome. © 2017 John Wiley & Sons Ltd.

  18. Non-parametric correlative uncertainty quantification and sensitivity analysis: Application to a Langmuir bimolecular adsorption model

    Directory of Open Access Journals (Sweden)

    Jinchao Feng

    2018-03-01

    We propose non-parametric methods for both local and global sensitivity analysis of chemical reaction models with correlated parameter dependencies. The developed mathematical and statistical tools are applied to a benchmark Langmuir competitive adsorption model on a close-packed platinum surface, whose parameters, estimated from quantum-scale computations, are correlated and are limited in size (small data). The proposed mathematical methodology employs gradient-based methods to compute sensitivity indices. We observe that ranking influential parameters depends critically on whether or not correlations between parameters are taken into account. The impact of uncertainty in the correlation and the necessity of the proposed non-parametric perspective are demonstrated.
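The central observation here, that parameter rankings change when correlations are accounted for, can be reproduced with a toy model: an output that depends only weakly on a parameter p2 appears strongly influential once p2 is correlated with the dominant parameter. This sketch uses a sampling-based association measure in place of the paper's gradient-based indices.

```python
import random

random.seed(3)

def corr(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((u - ma) * (v - mb) for u, v in zip(a, b))
    va = sum((u - ma) ** 2 for u in a)
    vb = sum((v - mb) ** 2 for v in b)
    return cov / (va * vb) ** 0.5

n = 20000
results = {}
for rho in (0.0, 0.9):
    # Correlated Gaussian inputs: p2 = rho*p1 + sqrt(1-rho^2)*noise.
    p1 = [random.gauss(0, 1) for _ in range(n)]
    p2 = [rho * a + (1 - rho ** 2) ** 0.5 * random.gauss(0, 1) for a in p1]
    y = [a + 0.1 * b for a, b in zip(p1, p2)]  # output depends weakly on p2
    results[rho] = corr(p2, y)
    print(f"rho = {rho}: corr(p2, y) = {results[rho]:.2f}")
```

With independent inputs, p2's association with the output is near 0.1, matching its small coefficient; at rho = 0.9 the same parameter appears dominant, which is exactly why a sensitivity method must state whether it conditions on, or ignores, the correlation structure.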

  19. Non-parametric correlative uncertainty quantification and sensitivity analysis: Application to a Langmuir bimolecular adsorption model

    Science.gov (United States)

    Feng, Jinchao; Lansford, Joshua; Mironenko, Alexander; Pourkargar, Davood Babaei; Vlachos, Dionisios G.; Katsoulakis, Markos A.

    2018-03-01

    We propose non-parametric methods for both local and global sensitivity analysis of chemical reaction models with correlated parameter dependencies. The developed mathematical and statistical tools are applied to a benchmark Langmuir competitive adsorption model on a close packed platinum surface, whose parameters, estimated from quantum-scale computations, are correlated and are limited in size (small data). The proposed mathematical methodology employs gradient-based methods to compute sensitivity indices. We observe that ranking influential parameters depends critically on whether or not correlations between parameters are taken into account. The impact of uncertainty in the correlation and the necessity of the proposed non-parametric perspective are demonstrated.

  20. Stochastic modelling of landfill processes incorporating waste heterogeneity and data uncertainty

    International Nuclear Information System (INIS)

    Zacharof, A.I.; Butler, A.P.

    2004-01-01

    A landfill is a very complex heterogeneous environment and as such it presents many modelling challenges. Attempts to develop models that reproduce these complexities generally involve the use of large numbers of spatially dependent parameters that cannot be properly characterised in the face of data uncertainty. An alternative method is presented, which couples a simplified microbial degradation model with a stochastic hydrological and contaminant transport model. This provides a framework for incorporating the complex effects of spatial heterogeneity within the landfill in a simplified manner, along with other key variables. A methodology for handling data uncertainty is also integrated into the model structure. Illustrative examples of the model's output are presented to demonstrate the effects of data uncertainty on leachate composition and gas volume prediction.

  1. Representing Uncertainty on Model Analysis Plots

    Science.gov (United States)

    Smith, Trevor I.

    2016-01-01

    Model analysis provides a mechanism for representing student learning as measured by standard multiple-choice surveys. The model plot contains information regarding both how likely students in a particular class are to choose the correct answer and how likely they are to choose an answer consistent with a well-documented conceptual model.…

  2. Accounting for uncertainty in ecological analysis: the strengths and limitations of hierarchical statistical modeling.

    Science.gov (United States)

    Cressie, Noel; Calder, Catherine A; Clark, James S; Ver Hoef, Jay M; Wikle, Christopher K

    2009-04-01

    Analyses of ecological data should account for the uncertainty in the process(es) that generated the data. However, accounting for these uncertainties is a difficult task, since ecology is known for its complexity. Measurement and/or process errors are often the only sources of uncertainty modeled when addressing complex ecological problems, yet analyses should also account for uncertainty in sampling design, in model specification, in parameters governing the specified model, and in initial and boundary conditions. Only then can we be confident in the scientific inferences and forecasts made from an analysis. Probability and statistics provide a framework that accounts for multiple sources of uncertainty. Given the complexities of ecological studies, the hierarchical statistical model is an invaluable tool. This approach is not new in ecology, and there are many examples (both Bayesian and non-Bayesian) in the literature illustrating the benefits of this approach. In this article, we provide a baseline for concepts, notation, and methods, from which discussion on hierarchical statistical modeling in ecology can proceed. We have also planted some seeds for discussion and tried to show where the practical difficulties lie. Our thesis is that hierarchical statistical modeling is a powerful way of approaching ecological analysis in the presence of inevitable but quantifiable uncertainties, even if practical issues sometimes require pragmatic compromises.
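A minimal hierarchical example in the spirit of this article: site-level means drawn from a population distribution, with partial pooling of the site estimates toward the grand mean. All numerical values are illustrative.

```python
import random
import statistics

random.seed(11)

# Two-level hierarchy: site means theta_j ~ N(mu, tau^2); observations
# y_ij ~ N(theta_j, sigma^2). All values are illustrative.
mu, tau, sigma, n_obs = 10.0, 2.0, 3.0, 5
thetas = [random.gauss(mu, tau) for _ in range(8)]
data = [[random.gauss(t, sigma) for _ in range(n_obs)] for t in thetas]

# Partial pooling: shrink each site mean toward the grand mean by a factor
# set by the ratio of within-site to between-site variance components.
grand = statistics.mean(y for site in data for y in site)
shrink = (sigma ** 2 / n_obs) / (sigma ** 2 / n_obs + tau ** 2)
pooled = [shrink * grand + (1 - shrink) * statistics.mean(site)
          for site in data]
print("grand mean:", round(grand, 2))
print("shrunken site estimates:", [round(p, 2) for p in pooled])
```

Each site estimate lands between its own sample mean and the grand mean, with noisier sites (larger within-site variance, fewer observations) pulled harder toward the population; this borrowing of strength across levels is the practical payoff of the hierarchical structure the authors advocate.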

  3. UNCERTAINTY SUPPLY CHAIN MODEL AND TRANSPORT IN ITS DEPLOYMENTS

    Directory of Open Access Journals (Sweden)

    Fabiana Lucena Oliveira

    2014-05-01

    Full Text Available This article discusses the Uncertainty Supply Chain Model and proposes a matrix matching supply chains with the transportation modes best suited to them. From a detailed analysis of the uncertainty matrix, transportation modes are suggested for the management of each chain, so that transport best supports the gains proposed by the original model, particularly when supply chains are distant from their suppliers of raw materials and/or supplies. Agile Supply Chains, an outcome of the Uncertainty Supply Chain Model, are analyzed in detail, with special attention to the Manaus Industrial Center. The research was conducted at the Manaus Industrial Pole, a model of industrial agglomeration based in Manaus, State of Amazonas (Brazil), which comprises different supply chains and strategies sharing the same transport, handling, storage and clearance infrastructure, and which uses inbound logistics for suppliers of raw material. The state of the art covers supply chain management, the uncertainty supply chain model, agile supply chains, the Manaus Industrial Center (MIC) and Brazilian legislation as a business case, presenting the concepts and features of each. The main goal is to present and discuss how transport can support the Uncertainty Supply Chain Model and thereby complete the management model. The results confirm the hypothesis that integrated logistics processes can keep industrial agglomerations attractive, and they open a discussion of logistics management when suppliers are far from the manufacturing center.

  4. Uncertainty and the Social Cost of Methane Using Bayesian Constrained Climate Models

    Science.gov (United States)

    Errickson, F. C.; Anthoff, D.; Keller, K.

    2016-12-01

    Social cost estimates of greenhouse gases are important for the design of sound climate policies and are also plagued by uncertainty. One major source of uncertainty stems from the simplified representation of the climate system used in the integrated assessment models that provide these social cost estimates. We explore how uncertainty over the social cost of methane varies with the way physical processes and feedbacks in the methane cycle are modeled by (i) coupling three different methane models to a simple climate model, (ii) using MCMC to perform a Bayesian calibration of the three coupled climate models that simulates direct sampling from the joint posterior probability density function (pdf) of model parameters, and (iii) producing probabilistic climate projections that are then used to calculate the Social Cost of Methane (SCM) with the DICE and FUND integrated assessment models. We find that including a temperature feedback in the methane cycle acts as an additional constraint during the calibration process and results in a correlation between the tropospheric lifetime of methane and several climate model parameters. This correlation is not seen in the models lacking this feedback. Several of the estimated marginal pdfs of the model parameters also exhibit different distributional shapes and expected values depending on the methane model used. As a result, probabilistic projections of the climate system out to the year 2300 exhibit different levels of uncertainty and magnitudes of warming for each of the three models under an RCP8.5 scenario. We find these differences in climate projections result in differences in the distributions and expected values for our estimates of the SCM. We also examine uncertainty about the SCM by performing a Monte Carlo analysis using a distribution for the climate sensitivity while holding all other climate model parameters constant. Our SCM estimates using the Bayesian calibration are lower and exhibit less uncertainty
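
    The contrast the authors draw, between a Monte Carlo over the climate sensitivity alone and projections from a jointly calibrated posterior, can be illustrated with a toy energy-balance calculation. The ABC-style rejection below is only a stand-in for their MCMC calibration, and the distributions, feedback term, and pseudo-observation are illustrative assumptions, not the paper's calibrated values.

```python
import numpy as np

rng = np.random.default_rng(7)

forcing = 3.7  # W/m^2, roughly a CO2 doubling (illustrative)

def warming(sens, feedback):
    return forcing * sens / feedback

# (a) Monte Carlo over climate sensitivity alone, other parameters fixed
sens_broad = rng.lognormal(np.log(1.0), 0.4, 50_000)
t_mc = warming(sens_broad, 3.7)

# (b) Crude stand-in for Bayesian calibration: draw (sensitivity,
# feedback) jointly and keep only pairs consistent with a pseudo-
# observed warming (ABC-style rejection). The constraint induces the
# parameter correlations the paper describes and narrows the spread.
sens = rng.lognormal(np.log(1.0), 0.4, 200_000)
feedback = rng.lognormal(np.log(3.7), 0.2, 200_000)
t_all = warming(sens, feedback)
keep = np.abs(t_all - 1.0) < 0.1  # pseudo-observation: 1.0 +/- 0.1
t_cal = t_all[keep]
```

    The calibrated projections `t_cal` are tighter than the one-parameter Monte Carlo `t_mc`, mirroring the paper's finding that jointly constrained parameters yield lower-uncertainty social cost estimates.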

  5. Bayesian tsunami fragility modeling considering input data uncertainty

    OpenAIRE

    De Risi, Raffaele; Goda, Katsu; Mori, Nobuhito; Yasuda, Tomohiro

    2017-01-01

    Empirical tsunami fragility curves are developed based on a Bayesian framework by accounting for uncertainty of input tsunami hazard data in a systematic and comprehensive manner. Three fragility modeling approaches, i.e. lognormal method, binomial logistic method, and multinomial logistic method, are considered, and are applied to extensive tsunami damage data for the 2011 Tohoku earthquake. A unique aspect of this study is that uncertainty of tsunami inundation data (i.e. input hazard data ...

  6. Uncertainty quantification of squeal instability via surrogate modelling

    Science.gov (United States)

    Nobari, Amir; Ouyang, Huajiang; Bannister, Paul

    2015-08-01

    One of the major issues that car manufacturers face is the noise and vibration of brake systems. Of the different sorts of noise and vibration that a brake system may generate, squeal, an irritating high-frequency noise, costs the manufacturers significantly. Despite considerable research on brake squeal, its root cause is still not fully understood; the most common assumption, however, is mode-coupling. Complex eigenvalue analysis is the most widely used approach to the analysis of brake squeal problems. One of its major drawbacks, nevertheless, is that the effects of variability and uncertainty are not included in the results, although uncertainty and variability are two inseparable parts of any brake system. Uncertainty is mainly caused by friction, contact, wear and thermal effects, while variability mostly stems from the manufacturing process, material properties and component geometries. Evaluating the effects of uncertainty and variability in the complex eigenvalue analysis improves the predictability of noise propensity and helps produce a more robust design. The biggest hurdle in the uncertainty analysis of brake systems is the computational cost and time: most uncertainty analysis techniques rely on the results of many deterministic analyses, and a full finite element model of a brake system typically consists of millions of degrees of freedom and many load cases, with running times so long that the automotive industry is reluctant to perform many deterministic analyses. This paper instead proposes an efficient method of uncertainty propagation via surrogate modelling. A surrogate model of a brake system is constructed in order to reproduce the outputs of the large-scale finite element model and overcome the issue of computational workloads. The probability distribution of the real part of an unstable mode can then be obtained by using the surrogate model with a massive saving of
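
    The surrogate idea can be sketched with a stand-in for the expensive finite element run: fit a cheap response surface to a handful of full-model evaluations, then do Monte Carlo through the surrogate. The quadratic "expensive model", the friction-coefficient input, and all numbers are illustrative assumptions, not the paper's brake model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for the expensive finite element run: the real part of an
# unstable mode as a smooth function of a friction coefficient mu
# (hypothetical response, chosen only to keep the sketch self-contained)
def expensive_model(mu):
    return 0.5 * mu**2 - 0.1 * mu + 0.02

# Build the surrogate from a handful of full-model evaluations
train_mu = np.linspace(0.2, 0.8, 7)
train_out = expensive_model(train_mu)
coef = np.polyfit(train_mu, train_out, 2)  # cheap response surface

# Monte Carlo through the surrogate: an evaluation count that would be
# infeasible with a millions-of-DOF finite element model
mu_samples = rng.normal(0.5, 0.05, 100_000)
real_parts = np.polyval(coef, mu_samples)

# real_parts now approximates the probability distribution of the
# unstable mode's real part under uncertain mu
```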

  7. Toward improving the reliability of hydrologic prediction: Model structure uncertainty and its quantification using ensemble-based genetic programming framework

    Science.gov (United States)

    Parasuraman, Kamban; Elshorbagy, Amin

    2008-12-01

    Uncertainty analysis is increasingly acknowledged as an integral part of hydrological modeling. The conventional treatment of uncertainty in hydrologic modeling is to assume a deterministic model structure and treat its associated parameters as imperfectly known, thereby neglecting the uncertainty associated with the model structure itself. In this paper, a modeling framework that can explicitly account for the effect of model structure uncertainty is proposed. The framework first generates different realizations of the original data set using a non-parametric bootstrap method, and then exploits the ability of self-organizing algorithms, namely genetic programming, to evolve their own model structure for each of the resampled data sets. The resulting ensemble of models is then used to quantify the uncertainty associated with the model structure. The performance of the proposed framework is analyzed with regard to its ability to characterize the evapotranspiration process at the Southwest Sand Storage facility, located near Ft. McMurray, Alberta. Eddy-covariance-measured actual evapotranspiration is modeled as a function of net radiation, air temperature, ground temperature, relative humidity, and wind speed. To investigate the relation between model complexity, prediction accuracy, and uncertainty, two sets of experiments were carried out by varying the set of mathematical operators that can be used to define the predictand-predictor relationship: the first set uses only additive operators, while the second uses both additive and multiplicative operators. The results suggest that increasing model complexity may improve prediction accuracy but at the expense of increased uncertainty. Compared to the model parameter uncertainty, the relative contribution of model structure uncertainty to the predictive uncertainty of a model is
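
    The framework's two ingredients, bootstrap resampling and a per-resample structure search, can be sketched as follows. Genetic programming is too long to inline, so a polynomial-degree search stands in for the structure search; the one-predictor synthetic data and all settings are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data standing in for the evapotranspiration records
# (one predictor kept for brevity)
x = rng.uniform(0, 1, 200)
y = 2.0 * x + 0.5 * np.sin(6 * x) + rng.normal(0, 0.1, 200)

# Non-parametric bootstrap: resample (x, y) pairs with replacement,
# then let each resample choose its own model structure
models = []
for _ in range(200):
    idx = rng.integers(0, len(x), len(x))
    xb, yb = x[idx], y[idx]
    best_coef, best_err = None, np.inf
    for deg in range(1, 6):  # structure search over polynomial degree
        coef = np.polyfit(xb, yb, deg)
        err = np.mean((np.polyval(coef, x) - y) ** 2)
        if err < best_err:
            best_coef, best_err = coef, err
    models.append(best_coef)

# The ensemble spread at any input quantifies combined structure and
# parameter uncertainty
grid = np.linspace(0, 1, 50)
preds = np.array([np.polyval(c, grid) for c in models])
lower, upper = np.percentile(preds, [5, 95], axis=0)
```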

  8. Multi-Site Validation of the SWAT Model on the Bani Catchment: Model Performance and Predictive Uncertainty

    Directory of Open Access Journals (Sweden)

    Jamilatou Chaibou Begou

    2016-04-01

    Full Text Available The objective of this study was to assess the performance and predictive uncertainty of the Soil and Water Assessment Tool (SWAT) model on the Bani River Basin, at catchment and subcatchment levels. The SWAT model was calibrated using the Generalized Likelihood Uncertainty Estimation (GLUE) approach. Potential Evapotranspiration (PET) and biomass were considered in verifying the accuracy of model outputs, and Global Sensitivity Analysis (GSA) was used to identify important model parameters. Results indicated a good performance of the global model at daily as well as monthly time steps, with adequate predictive uncertainty. PET was found to be overestimated, but biomass was better predicted in agricultural land and forest. Surface runoff is the dominant process in streamflow generation in that region. Individual calibration at the subcatchment scale yielded better performance than applying the global parameter sets. These results are very useful and provide support to further studies on regionalization for prediction in ungauged basins.
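
    The GLUE procedure the study applies can be sketched in a few lines: sample parameter sets from a prior range, score each against observations with an informal likelihood, and form prediction bounds from the "behavioural" sets. The toy one-parameter reservoir model, the NSE likelihood choice, and the 0.7 threshold are illustrative assumptions, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy one-parameter "hydrological model": a linear reservoir drained
# at rate k (illustrative stand-in for SWAT)
def simulate(k, rainfall):
    storage, flows = 0.0, []
    for r in rainfall:
        storage += r
        q = k * storage
        storage -= q
        flows.append(q)
    return np.array(flows)

rainfall = rng.uniform(0, 10, 100)
observed = simulate(0.3, rainfall) + rng.normal(0, 0.2, 100)

# Informal likelihood: Nash-Sutcliffe efficiency
def nse(sim, obs):
    return 1 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

# GLUE: sample parameter sets from a prior range and keep the
# "behavioural" sets above a likelihood threshold
samples = rng.uniform(0.05, 0.95, 2000)
scores = np.array([nse(simulate(k, rainfall), observed) for k in samples])
behavioural = samples[scores > 0.7]

# Prediction bounds from the behavioural ensemble
ensemble = np.array([simulate(k, rainfall) for k in behavioural])
lower, upper = np.percentile(ensemble, [5, 95], axis=0)
```

    A full GLUE analysis would weight the ensemble by likelihood; equal weights are used here for brevity.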

  9. Uncertainty Analysis of Resistance Tests in Ata Nutku Ship Model Testing Laboratory of Istanbul Technical University

    Directory of Open Access Journals (Sweden)

    Cihad DELEN

    2015-12-01

    Full Text Available In this study, systematic resistance tests performed in the Ata Nutku Ship Model Testing Laboratory of Istanbul Technical University (ITU) were analyzed in order to determine their uncertainties. Experiments conducted within the framework of mathematical and physical rules for the solution of engineering problems involve uncertainty in their measurements and calculations. To question the reliability of the obtained values, the existing uncertainties should be expressed as quantities; a result whose uncertainty is unknown carries no universal value. Resistance, on the other hand, is one of the most important parameters to be considered in ship design. Ship resistance cannot be determined precisely and reliably during the design phase because of the uncertainty sources involved in determining the resistance value, and this may hinder meeting the required specifications in later design steps. The uncertainty arising from the resistance test has been estimated and compared for a displacement-type ship and for high-speed marine vehicles according to the ITTC 2002 and ITTC 2014 procedures for uncertainty analysis. The advantages and disadvantages of both ITTC uncertainty analysis methods are also discussed.

  10. Global Sensitivity Analysis and Estimation of Model Error, Toward Uncertainty Quantification in Scramjet Computations

    Science.gov (United States)

    Huan, Xun; Safta, Cosmin; Sargsyan, Khachik; Geraci, Gianluca; Eldred, Michael S.; Vane, Zachary P.; Lacaze, Guilhem; Oefelein, Joseph C.; Najm, Habib N.

    2018-03-01

    The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system's stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. Finally, these methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.

  11. Global Sensitivity Analysis and Estimation of Model Error, Toward Uncertainty Quantification in Scramjet Computations

    Energy Technology Data Exchange (ETDEWEB)

    Huan, Xun [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Safta, Cosmin [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Sargsyan, Khachik [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Geraci, Gianluca [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Eldred, Michael S. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vane, Zachary P. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Lacaze, Guilhem [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Oefelein, Joseph C. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Najm, Habib N. [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    2018-02-09

    The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system’s stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. Finally, these methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.

  12. Risk assessment of the Groningen geothermal potential : From seismic to reservoir uncertainty using a discrete parameter analysis

    NARCIS (Netherlands)

    Daniilidis, Alexandros; Doddema, Leon; Herber, Rien

    2016-01-01

    Geothermal exploitation is subject to several uncertainties, even in settings with high data availability, adding to project risk. Uncertainty can stem from the reservoir's initial state, as well as from the geological and operational parameters. The interplay between these aspects entails

  13. The calibration and uncertainty analysis of a hydrological model based on cuckoo search and the M-GLUE method

    Science.gov (United States)

    Zhang, H.; Chang, J.; Wang, Y.

    2017-12-01

    The watershed hydrological model is regarded as a powerful tool for simulating streamflow, but it is subject to many uncertainties. This paper uses TOPMODEL for the hydrological modeling, uses the GLUE and M-GLUE methods to investigate the effect of model parameter uncertainty on streamflow simulation and uses three CMIP5 climate models to investigate the uncertainty induced by meteorological input data. A new parameter calibration method (cuckoo search algorithm) is proposed. The Beiluo River basin is selected as the study area for this paper. Analysis of the simulation results reveals that the cuckoo search algorithm is applicable and can quickly and effectively optimize the model parameters. The Morris method and the GLUE method are applied to analyze the sensitivity of the parameters; their results are consistent, and there are three sensitive parameters, denoted SRmax, Rv and CHv. The results of the M-GLUE method are better than those of the GLUE method, and both methods can effectively analyze the uncertainty of hydrological model parameters. The precipitation and potential evaporation predicted by the three climate models exhibit an increasing trend, and the simulated average annual streamflow of the BCC-CSM1.1 model is the greatest, followed by that of the CNRM-CM5 model and, finally, that of the CanESM2 model, but all three are greater than the baseline period value, which indicates that the diverse input data of the hydrological model lead to uncertainty in the streamflow simulation.
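
    A minimal cuckoo search, the calibration method the abstract proposes, can be sketched as follows. The Lévy-flight step size, population settings, and shifted-sphere objective are illustrative assumptions, not the paper's actual TOPMODEL calibration.

```python
import numpy as np
from math import gamma, sin, pi

rng = np.random.default_rng(3)

# Objective standing in for a calibration error surface (e.g. negative
# NSE over model parameters); a shifted sphere keeps the sketch short
def objective(x):
    return float(np.sum((x - 0.3) ** 2))

def levy_step(dim, beta=1.5):
    # Mantegna's algorithm for Levy-distributed step lengths
    sigma = (gamma(1 + beta) * sin(pi * beta / 2) /
             (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0, sigma, dim)
    v = rng.normal(0, 1, dim)
    return u / np.abs(v) ** (1 / beta)

n_nests, dim, pa = 15, 3, 0.25  # pa: fraction of nests abandoned
nests = rng.uniform(0, 1, (n_nests, dim))
fitness = np.array([objective(n) for n in nests])

for _ in range(500):
    best = nests[np.argmin(fitness)]
    # New solutions by Levy flight, kept only if they improve the nest
    for i in range(n_nests):
        trial = np.clip(nests[i] + 0.01 * levy_step(dim) * (nests[i] - best),
                        0, 1)
        f = objective(trial)
        if f < fitness[i]:
            nests[i], fitness[i] = trial, f
    # A host bird discovers alien eggs: replace the worst nests randomly
    n_abandon = int(pa * n_nests)
    worst = np.argsort(fitness)[-n_abandon:]
    nests[worst] = rng.uniform(0, 1, (n_abandon, dim))
    fitness[worst] = [objective(n) for n in nests[worst]]

best = nests[np.argmin(fitness)]  # converges toward the optimum at 0.3
```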

  14. Alchemy and uncertainty: What good are models?

    Science.gov (United States)

    F.L. Bunnell

    1989-01-01

    Wildlife-habitat models are increasing in abundance, diversity, and use, but symptoms of failure are evident in their application, including misuse, disuse, failure to test, and litigation. Reasons for failure often relate to the different purposes managers and researchers have for using the models to predict and to aid understanding. This paper examines these two...

  15. Uncertainty and Complexity in Mathematical Modeling

    Science.gov (United States)

    Cannon, Susan O.; Sanders, Mark

    2017-01-01

    Modeling is an effective tool to help students access mathematical concepts. Finding a math teacher who has not drawn a fraction bar or pie chart on the board would be difficult, as would finding students who have not been asked to draw models and represent numbers in different ways. In this article, the authors will discuss: (1) the properties of…

  16. Model Uncertainty and Exchange Rate Forecasting

    NARCIS (Netherlands)

    Kouwenberg, R.; Markiewicz, A.; Verhoeks, R.; Zwinkels, R.C.J.

    2017-01-01

    Exchange rate models with uncertain and incomplete information predict that investors focus on a small set of fundamentals that changes frequently over time. We design a model selection rule that captures the current set of fundamentals that best predicts the exchange rate. Out-of-sample tests show

  17. On the Uncertainty of Identification of Civil Engineering Structures Using ARMA Models

    DEFF Research Database (Denmark)

    Andersen, Palle; Brincker, Rune; Kirkegaard, Poul Henning

    1995-01-01

    In this paper the uncertainties of modal parameters estimated using ARMA models for identification of civil engineering structures are investigated. How to initialize the predictor part of a Gauss-Newton optimization algorithm is put in focus. A backward-forecasting procedure for initialization...

  18. On the Uncertainty of Identification of Civil Engineering Structures using ARMA Models

    DEFF Research Database (Denmark)

    Andersen, P.; Brincker, Rune; Kirkegaard, Poul Henning

    In this paper the uncertainties of modal parameters estimated using ARMA models for identification of civil engineering structures are investigated. How to initialize the predictor part of a Gauss-Newton optimization algorithm is put in focus. A backward-forecasting procedure for initialization...

  19. Bayesian uncertainty assessment of a semi-distributed integrated catchment model of phosphorus transport.

    Science.gov (United States)

    Starrfelt, Jostein; Kaste, Øyvind

    2014-07-01

    Process-based models of nutrient transport are often used as tools for management of eutrophic waters, as decision makers need to judge the potential effects of alternative remediation measures, under current conditions and with future land use and climate change. All modelling exercises entail uncertainty arising from various sources, such as the input data, selection of parameter values and the choice of model itself. Here we perform Bayesian uncertainty assessment of an integrated catchment model of phosphorus (INCA-P). We use an auto-calibration procedure and an algorithm for including parametric uncertainty to simulate phosphorus transport in a Norwegian lowland river basin. Two future scenarios were defined to exemplify the importance of parametric uncertainty in generating predictions. While a worst case scenario yielded a robust prediction of increased loading of phosphorus, a best case scenario only gave rise to a reduction in load with probability 0.78, highlighting the importance of taking parametric uncertainty into account in process-based catchment scale modelling of possible remediation scenarios. Estimates of uncertainty can be included in information provided to decision makers, thus making a stronger scientific basis for sound decisions to manage water resources.

  20. "Wrong, but useful": negotiating uncertainty in infectious disease modelling.

    Directory of Open Access Journals (Sweden)

    Robert M Christley

    Full Text Available For infectious disease dynamical models to inform policy for containment of infectious diseases the models must be able to predict; however, it is well recognised that such prediction will never be perfect. Nevertheless, the consensus is that although models are uncertain, some may yet inform effective action. This assumes that the quality of a model can be ascertained in order to evaluate sufficiently model uncertainties, and to decide whether or not, or in what ways or under what conditions, the model should be 'used'. We examined uncertainty in modelling, utilising a range of data: interviews with scientists, policy-makers and advisors, and analysis of policy documents, scientific publications and reports of major inquiries into key livestock epidemics. We show that the discourse of uncertainty in infectious disease models is multi-layered, flexible, contingent, embedded in context and plays a critical role in nego