WorldWideScience

Sample records for analysis uncertainty

  1. Uncertainty analysis

    Thomas, R.E.

    1982-03-01

    An evaluation is made of the suitability of analytical and statistical sampling methods for making uncertainty analyses. The adjoint method is found to be well-suited for obtaining sensitivity coefficients for computer programs involving large numbers of equations and input parameters. For this purpose the Latin hypercube sampling method is found to be inferior to conventional experimental designs. The Latin hypercube method can be used to estimate output probability density functions, but requires supplementary rank transformations followed by stepwise regression to obtain uncertainty information on individual input parameters. A simple Cork and Bottle problem is used to illustrate the efficiency of the adjoint method relative to certain statistical sampling methods. For linear models of the form Ax=b it is shown that a complete adjoint sensitivity analysis can be made without formulating and solving the adjoint problem. This can be done either by using a special type of statistical sampling or by reformulating the primal problem and using suitable linear programming software.
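    The linear-model claim above is easy to check numerically. Below is a minimal sketch (hypothetical random matrices, not the paper's Cork and Bottle problem): for A x = b with a scalar response R = c^T x, a single adjoint solve A^T lam = c yields the sensitivities dR/db_i for every input at once, verified here against finite differences.

```python
import numpy as np

# One adjoint solve gives all sensitivities dR/db_i of R = c^T x for A x = b.
rng = np.random.default_rng(0)
n = 5
A = np.eye(n) + 0.1 * rng.standard_normal((n, n))  # hypothetical linear model
b = rng.standard_normal(n)
c = rng.standard_normal(n)

lam = np.linalg.solve(A.T, c)   # adjoint solve: A^T lam = c
adjoint_sens = lam              # dR/db_i = lam_i, since R = c^T A^{-1} b

# Brute-force check: n separate primal solves with perturbed inputs.
R0 = c @ np.linalg.solve(A, b)
h = 1e-6
fd_sens = np.empty(n)
for i in range(n):
    bp = b.copy()
    bp[i] += h
    fd_sens[i] = (c @ np.linalg.solve(A, bp) - R0) / h

print(np.allclose(adjoint_sens, fd_sens, atol=1e-4))  # True
```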

  2. Uncertainty analysis guide

    Andres, T.H.

    2002-05-01

    This guide applies to the estimation of uncertainty in quantities calculated by scientific, analysis, and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner by which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, for example through simplifications and conservatisms. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)
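    As a rough illustration of two of the propagation routes named above, the sketch below compares a first-order series approximation with Monte Carlo propagation on a toy model (the function and the input uncertainties are assumptions for illustration, not material from the AECL guide).

```python
import numpy as np

def f(x1, x2):
    # toy model output (illustrative only)
    return x1 * np.exp(0.1 * x2)

mu = np.array([2.0, 1.0])   # input means
sd = np.array([0.1, 0.2])   # input standard deviations, independent inputs

# (1) First-order series approximation: var(y) ~ sum_i (df/dx_i)^2 var(x_i)
h = 1e-6
g1 = (f(mu[0] + h, mu[1]) - f(*mu)) / h
g2 = (f(mu[0], mu[1] + h) - f(*mu)) / h
sd_series = np.sqrt((g1 * sd[0]) ** 2 + (g2 * sd[1]) ** 2)

# (2) Monte Carlo propagation of the same input distributions
rng = np.random.default_rng(1)
y = f(rng.normal(mu[0], sd[0], 100_000), rng.normal(mu[1], sd[1], 100_000))
print(sd_series, y.std())   # the two estimates should agree closely here
```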

  3. Deterministic uncertainty analysis

    Worley, B.A.

    1987-01-01

    Uncertainties of computer results are of primary interest in applications such as high-level waste (HLW) repository performance assessment in which experimental validation is not possible or practical. This work presents an alternate deterministic approach for calculating uncertainties that has the potential to significantly reduce the number of computer runs required for conventional statistical analysis. 7 refs., 1 fig

  4. Sensitivity and uncertainty analysis

    Cacuci, Dan G; Navon, Ionel Michael

    2005-01-01

    As computer-assisted modeling and analysis of physical processes have continued to grow and diversify, sensitivity and uncertainty analyses have become indispensable scientific tools. Sensitivity and Uncertainty Analysis, Volume I: Theory focused on the mathematical underpinnings of two important methods for such analyses: the Adjoint Sensitivity Analysis Procedure and the Global Adjoint Sensitivity Analysis Procedure. This volume concentrates on the practical aspects of performing these analyses for large-scale systems. The applications addressed include two-phase flow problems, a radiative c...

  5. Deterministic uncertainty analysis

    Worley, B.A.

    1987-12-01

    This paper presents a deterministic uncertainty analysis (DUA) method for calculating uncertainties that has the potential to significantly reduce the number of computer runs compared to conventional statistical analysis. The method is based upon the availability of derivative and sensitivity data such as that calculated using the well-known direct or adjoint sensitivity analysis techniques. Formation of response surfaces using derivative data and the propagation of input probability distributions are discussed relative to their role in the DUA method. A sample problem that models the flow of water through a borehole is used as a basis to compare the cumulative distribution function of the flow rate as calculated by the standard statistical methods and the DUA method. Propagation of uncertainties by the DUA method is compared for ten cases in which the number of reference model runs was varied from one to ten. The DUA method gives a more accurate representation of the true cumulative distribution of the flow rate based upon as few as two model executions, compared to fifty model executions using a statistical approach. 16 refs., 4 figs., 5 tabs.
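    A minimal sketch of the DUA idea, under stated assumptions: a simplified borehole-type flow expression stands in for the report's model, and the derivatives from a single reference run define a first-order response surface whose output CDF is then compared with a full Monte Carlo propagation.

```python
import numpy as np

def flow(Tu, Hu, Hl, r):
    # simplified borehole-type flow model (an assumption for illustration)
    return 2.0 * np.pi * Tu * (Hu - Hl) / np.log(r / 0.1)

x0 = np.array([80.0, 1000.0, 750.0, 500.0])  # nominal (reference) inputs
sd = np.array([8.0, 30.0, 30.0, 50.0])       # assumed input standard deviations

f0 = flow(*x0)
grad = np.empty(4)
for i in range(4):  # stand-in for direct/adjoint sensitivity data
    xp = x0.copy()
    xp[i] *= 1.0 + 1e-6
    grad[i] = (flow(*xp) - f0) / (x0[i] * 1e-6)

rng = np.random.default_rng(2)
X = rng.normal(x0, sd, size=(50_000, 4))

y_surface = f0 + (X - x0) @ grad  # cheap response-surface evaluations
y_full = flow(X[:, 0], X[:, 1], X[:, 2], X[:, 3])

for q in (0.05, 0.5, 0.95):  # compare quantiles of the two CDFs
    print(q, np.quantile(y_surface, q), np.quantile(y_full, q))
```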

  6. Uncertainty analysis techniques

    Marivoet, J.; Saltelli, A.; Cadelli, N.

    1987-01-01

    The origin of the uncertainties affecting performance assessments, as well as their propagation to dose and risk results, is discussed. The analysis is focused essentially on the uncertainties introduced by the input parameters, the values of which may range over some orders of magnitude and may be given as probability distribution functions. The paper briefly reviews the existing sampling techniques used for Monte Carlo simulations and the methods for characterizing the output curves, determining their convergence and confidence limits. Annual doses, expectation values of the doses and risks are computed for a particular case of a possible repository in clay, in order to illustrate the significance of such output characteristics as the mean, the logarithmic mean and the median, as well as their ratios. The report concludes that, provisionally, owing to its better robustness, an estimator such as the 90th percentile may be substituted for the arithmetic mean when comparing the estimated doses with acceptance criteria. In any case, the results obtained through uncertainty analyses must be interpreted with caution as long as input data distribution functions are not derived from experiments reasonably reproducing the situation in a well characterized repository and site.
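    The output characteristics discussed above are easy to make concrete. The sketch below computes the mean, median, logarithmic (geometric) mean, and 90th percentile, and their ratio, for an illustrative lognormal dose sample (the distribution and its parameters are assumptions, not data from the report).

```python
import numpy as np

rng = np.random.default_rng(3)
dose = rng.lognormal(mean=-2.0, sigma=1.5, size=20_000)  # illustrative doses

mean = dose.mean()
median = np.median(dose)
log_mean = np.exp(np.log(dose).mean())  # geometric ("logarithmic") mean
p90 = np.quantile(dose, 0.90)

# For heavy-tailed outputs the arithmetic mean sits far above the median,
# which is why a robust estimator such as the 90th percentile is attractive.
print(f"mean={mean:.3g} median={median:.3g} log-mean={log_mean:.3g} p90={p90:.3g}")
print("mean/median =", mean / median)
```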

  7. DS02 uncertainty analysis

    Kaul, Dean C.; Egbert, Stephen D.; Woolson, William A.

    2005-01-01

    In order to avoid the pitfalls that so discredited DS86 and its uncertainty estimates, and to provide DS02 uncertainties that are both defensible and credible, this report not only presents the ensemble uncertainties assembled from uncertainties in individual computational elements and radiation dose components but also describes how these relate to comparisons between observed and computed quantities at critical intervals in the computational process. These comparisons include those between observed and calculated radiation free-field components, where observations include thermal- and fast-neutron activation and gamma-ray thermoluminescence, which are relevant to the estimated systematic uncertainty for DS02. The comparisons also include those between calculated and observed survivor shielding, where the observations consist of biodosimetric measurements for individual survivors, which are relevant to the estimated random uncertainty for DS02. (J.P.N.)

  8. Uncertainty analysis of environmental models

    Monte, L.

    1990-01-01

    In the present paper an evaluation of the output uncertainty of an environmental model for assessing the transfer of 137Cs and 131I in the human food chain is carried out on the basis of a statistical analysis of data reported in the literature. The uncertainty analysis offers the opportunity of obtaining some remarkable information about the uncertainty of models predicting the migration of non-radioactive substances in the environment, mainly in relation to dry and wet deposition.

  9. Reliability analysis under epistemic uncertainty

    Nannapaneni, Saideep; Mahadevan, Sankaran

    2016-01-01

    This paper proposes a probabilistic framework to include both aleatory and epistemic uncertainty within model-based reliability estimation of engineering systems for individual limit states. Epistemic uncertainty is considered due to both data and model sources. Sparse point and/or interval data regarding the input random variables leads to uncertainty regarding their distribution types, distribution parameters, and correlations; this statistical uncertainty is included in the reliability analysis through a combination of likelihood-based representation, Bayesian hypothesis testing, and Bayesian model averaging techniques. Model errors, which include numerical solution errors and model form errors, are quantified through Gaussian process models and included in the reliability analysis. The probability integral transform is used to develop an auxiliary variable approach that facilitates a single-level representation of both aleatory and epistemic uncertainty. This strategy results in an efficient single-loop implementation of Monte Carlo simulation (MCS) and FORM/SORM techniques for reliability estimation under both aleatory and epistemic uncertainty. Two engineering examples are used to demonstrate the proposed methodology. - Highlights: • Epistemic uncertainty due to data and model included in reliability analysis. • A novel FORM-based approach proposed to include aleatory and epistemic uncertainty. • A single-loop Monte Carlo approach proposed to include both types of uncertainties. • Two engineering examples used for illustration.
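    A minimal sketch of the auxiliary-variable idea with assumed toy numbers (a limit state and distributions invented for illustration, not the paper's examples): the epistemic uncertainty, here an uncertain distribution parameter, and the aleatory variability are sampled together in a single loop, with the probability integral transform mapping a uniform auxiliary variable through the conditional distribution.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
n = 100_000

# Epistemic: the mean of the load distribution is itself uncertain.
mu_load = rng.normal(10.0, 0.5, n)

# Aleatory: a uniform auxiliary variable, mapped through the inverse of the
# conditional CDF (probability integral transform) for each epistemic sample.
u = rng.uniform(size=n)
load = stats.norm.ppf(u, loc=mu_load, scale=2.0)

capacity = 16.0                      # assumed deterministic capacity
print(np.mean(load > capacity))      # single-loop failure probability estimate
```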

  10. Mathematical Analysis of Uncertainty

    Angel GARRIDO

    2016-01-01

    Classical Logic showed early its insufficiencies for solving AI problems. The introduction of Fuzzy Logic aims at this problem. There has been research in the conventional Rough direction alone or in the Fuzzy direction alone, and more recently, attempts to combine both into Fuzzy Rough Sets or Rough Fuzzy Sets. We analyse some new and powerful tools in the study of uncertainty, such as Probabilistic Graphical Models, Chain Graphs, Bayesian Networks, and Markov Networks, integrating our knowledge of graphs and probability.

  11. Uncertainty analysis for hot channel

    Panka, I.; Kereszturi, A.

    2006-01-01

    The fulfillment of the safety analysis acceptance criteria is usually evaluated by separate hot channel calculations using the results of neutronic and/or thermal hydraulic system calculations. In case of an ATWS event (inadvertent withdrawal of a control assembly), according to the analysis, a number of fuel rods experience DNB for a longer time and must be regarded as failed. Their number must be determined for a further evaluation of the radiological consequences. In the deterministic approach, the global power history must be multiplied by different hot channel factors (kx) taking into account the radial power peaking factors for each fuel pin. If DNB occurs, it is necessary to perform a small number of hot channel calculations to determine the limiting kx leading just to DNB and fuel failure (the conservative DNBR limit is 1.33). Knowing the pin power distribution from the core design calculation, the number of failed fuel pins can be calculated. The above procedure can be performed with conservative assumptions (e.g. conservative input parameters in the hot channel calculations) as well. In case of hot channel uncertainty analysis, the relevant input parameters of the hot channel calculations (kx, mass flow, inlet temperature of the coolant, pin average burnup, initial gap size, selection of the power history influencing the gap conductance value) and the DNBR limit are varied considering the respective uncertainties. An uncertainty analysis methodology was elaborated combining the response surface method with the one-sided tolerance limit method of Wilks. The results of deterministic and uncertainty hot channel calculations are compared with regard to the number of failed fuel rods, the maximum temperature of the clad surface and the maximum temperature of the fuel. (Authors)
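    The Wilks one-sided tolerance limit mentioned above has a compact first-order form: with N code runs, the largest observed output bounds the beta-quantile of the true output distribution with confidence 1 - beta^N. A small sketch:

```python
# First-order, one-sided Wilks formula: confidence = 1 - beta**n.
def wilks_runs(beta=0.95, gamma=0.95):
    """Smallest n such that the sample maximum bounds the beta-quantile
    with confidence at least gamma."""
    n = 1
    while 1.0 - beta ** n < gamma:
        n += 1
    return n

print(wilks_runs())  # 59, the familiar 95%/95% first-order answer
```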

  12. LOFT differential pressure uncertainty analysis

    Evans, R.P.; Biladeau, G.L.; Quinn, P.A.

    1977-03-01

    A performance analysis of the LOFT differential pressure (ΔP) measurement is presented. Along with complete descriptions of the test programs and theoretical studies that have been conducted on the ΔP measurement, specific sources of measurement uncertainty are identified, quantified, and combined to provide an assessment of the ability of this measurement to satisfy the SDD 1.4.1C (June 1975) requirement for measurement of differential pressure.

  13. Uncertainty analysis of the FRAP code

    Peck, S.O.

    1978-01-01

    A user-oriented, automated uncertainty analysis capability has been built into the FRAP code (Fuel Rod Analysis Program) and applied to a PWR fuel rod undergoing a LOCA. The method of uncertainty analysis is the Response Surface Method (RSM). (author)

  14. Uncertainty analysis in seismic tomography

    Owoc, Bartosz; Majdański, Mariusz

    2017-04-01

    The velocity field obtained from seismic travel-time tomography depends on several factors, such as regularization, inversion path, and model parameterization. The result also depends strongly on the initial velocity model and on the precision of travel-time picking. In this research we test the dependence on the starting model in layered tomography and compare it with the effect of picking precision. Moreover, our analysis shows that for manual travel-time picking the uncertainty distribution is asymmetric; this effect shifts the results toward faster velocities. For the calculations we use the JIVE3D travel-time tomography code. We used data from geo-engineering and industrial-scale investigations, which were collected by our team from IG PAS.

  15. Applied research in uncertainty modeling and analysis

    Ayyub, Bilal

    2005-01-01

    Uncertainty has been a concern to engineers, managers, and scientists for many years. For a long time uncertainty has been considered synonymous with random, stochastic, statistic, or probabilistic. Since the early sixties views on uncertainty have become more heterogeneous. In the past forty years numerous tools that model uncertainty, above and beyond statistics, have been proposed by several engineers and scientists. The tool/method to model uncertainty in a specific context should really be chosen by considering the features of the phenomenon under consideration, not independent of what is known about the system and what causes uncertainty. In this fascinating overview of the field, the authors provide broad coverage of uncertainty analysis/modeling and its application. Applied Research in Uncertainty Modeling and Analysis presents the perspectives of various researchers and practitioners on uncertainty analysis and modeling outside their own fields and domain expertise. Rather than focusing explicitly on...

  16. LOFT uncertainty-analysis methodology

    Lassahn, G.D.

    1983-01-01

    The methodology used for uncertainty analyses of measurements in the Loss-of-Fluid Test (LOFT) nuclear-reactor-safety research program is described and compared with other methodologies established for performing uncertainty analyses

  17. Uncertainty Management and Sensitivity Analysis

    Rosenbaum, Ralph K.; Georgiadis, Stylianos; Fantke, Peter

    2018-01-01

    Uncertainty is always there and LCA is no exception to that. The presence of uncertainties of different types and from numerous sources in LCA results is a fact, but managing them allows one to quantify and improve the precision of a study and the robustness of its conclusions. LCA practice sometimes suffers from an imbalanced perception of uncertainties, justifying modelling choices and omissions. Identifying prevalent misconceptions around uncertainties in LCA is a central goal of this chapter, aiming to establish a positive approach focusing on the advantages of uncertainty management. The main objectives of this chapter are to learn how to deal with uncertainty in the context of LCA, how to quantify it, interpret and use it, and how to communicate it. The subject is approached more holistically than just focusing on relevant statistical methods or purely mathematical aspects. This chapter...

  1. Uncertainty analysis in safety assessment

    Lemos, Francisco Luiz de [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN), Belo Horizonte, MG (Brazil)]; Sullivan, Terry [Brookhaven National Lab., Upton, NY (United States)]

    1997-12-31

    Nuclear waste disposal is a very complex subject which requires the study of many different fields of science, such as hydrogeology, meteorology, and geochemistry. In addition, the waste disposal facilities are designed to last for a very long period of time. Both of these conditions leave safety assessment projections filled with uncertainty. This paper addresses approaches for the treatment of uncertainties in safety assessment modeling due to the variability of data, and some current approaches used to deal with this problem. (author) 13 refs.; e-mail: lemos at bnl.gov; sulliva1 at bnl.gov

  2. Uncertainty analysis of the FRAP code

    Peck, S.O.

    1978-01-01

    A user-oriented, automated uncertainty analysis capability has been built into the Fuel Rod Analysis Program (FRAP) code and has been applied to a pressurized water reactor (PWR) fuel rod undergoing a loss-of-coolant accident (LOCA). The method of uncertainty analysis is the response surface method. The automated version significantly reduced the time required to complete the analysis and, at the same time, greatly increased the problem scope. Results of the analysis showed a significant difference in the total and relative contributions to the uncertainty of the response parameters between steady state and transient conditions.

  3. Uncertainty quantification and error analysis

    Higdon, Dave M [Los Alamos National Laboratory]; Anderson, Mark C [Los Alamos National Laboratory]; Habib, Salman [Los Alamos National Laboratory]; Klein, Richard [Los Alamos National Laboratory]; Berliner, Mark [Ohio State Univ.]; Covey, Curt [LLNL]; Ghattas, Omar [Univ. of Texas]; Graziani, Carlo [Univ. of Chicago]; Seager, Mark [LLNL]; Sefcik, Joseph [LLNL]; Stark, Philip [UC Berkeley]; Stewart, James [SNL]

    2010-01-01

    UQ studies all sources of error and uncertainty, including: systematic and stochastic measurement error; ignorance; limitations of theoretical models; limitations of numerical representations of those models; limitations on the accuracy and reliability of computations, approximations, and algorithms; and human error. A more precise definition for UQ is suggested below.

  4. Risk Characterization uncertainties associated description, sensitivity analysis

    Carrillo, M.; Tovar, M.; Alvarez, J.; Arraez, M.; Hordziejewicz, I.; Loreto, I.

    2013-01-01

    This PowerPoint presentation addresses risks at the estimated levels of exposure, uncertainty and variability in the analysis, sensitivity analysis, risks from exposure to multiple substances, the formulation of guidelines for carcinogenic and genotoxic compounds, and risks to subpopulations.

  5. Urban drainage models - making uncertainty analysis simple

    Vezzaro, Luca; Mikkelsen, Peter Steen; Deletic, Ana

    2012-01-01

    There is increasing awareness about uncertainties in modelling of urban drainage systems and, as such, many new methods for uncertainty analyses have been developed. Despite this, all available methods have limitations which restrict their widespread application among practitioners. Here ... in each measured/observed datapoint; an issue which is commonly overlooked in the uncertainty analysis of urban drainage models. This comparison allows the user to intuitively estimate the optimum number of simulations required to conduct uncertainty analyses. The output of the method includes parameter...

  6. Uncertainty analysis in Monte Carlo criticality computations

    Qi Ao

    2011-01-01

    Highlights: ► Two types of uncertainty methods for keff Monte Carlo computations are examined. ► The sampling method has the fewest restrictions on perturbations but demands computing resources. ► The analytical method is limited to small perturbations of material properties. ► Practicality relies on efficiency, multiparameter applicability and data availability. - Abstract: Uncertainty analysis is imperative for nuclear criticality risk assessments when using Monte Carlo neutron transport methods to predict the effective neutron multiplication factor (keff) for fissionable material systems. For the validation of Monte Carlo codes for criticality computations against benchmark experiments, code accuracy and precision are measured by both the computational bias and the uncertainty in the bias. The uncertainty in the bias accounts for known or quantified experimental, computational and model uncertainties. For the application of Monte Carlo codes for criticality analysis of fissionable material systems, an administrative margin of subcriticality must be imposed to provide additional assurance of subcriticality for any unknown or unquantified uncertainties. Because of the substantial impact of the administrative margin of subcriticality on the economics and safety of nuclear fuel cycle operations, recently increasing interest in reducing the administrative margin of subcriticality makes uncertainty analysis in criticality safety computations more risk-significant. This paper provides an overview of the two most popular keff uncertainty analysis methods for Monte Carlo criticality computations: (1) sampling-based methods, and (2) analytical methods. Examples are given to demonstrate their usage in keff uncertainty analysis due to uncertainties in both neutronic and non-neutronic parameters of fissionable material systems.
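    The sampling-based route can be sketched in a few lines, with a toy stand-in for the transport code (the keff function, means, and covariance below are invented for illustration): perturb the uncertain inputs according to their covariance, re-evaluate keff, and read the uncertainty off the spread of the results.

```python
import numpy as np

def keff(sigma_f, sigma_c):
    # toy stand-in for a Monte Carlo transport calculation
    return 2.4 * sigma_f / (sigma_f + sigma_c)

mean = np.array([0.05, 0.06])        # illustrative cross-section means
cov = np.array([[1.0e-6, 2.0e-7],
                [2.0e-7, 2.0e-6]])   # illustrative input covariance

rng = np.random.default_rng(7)
s = rng.multivariate_normal(mean, cov, size=10_000)
k = keff(s[:, 0], s[:, 1])
print(k.mean(), k.std())             # keff estimate and its 1-sigma uncertainty
```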

  7. CEC/USDOE workshop on uncertainty analysis

    Elderkin, C.E.; Kelly, G.N.

    1990-07-01

    Any measured or assessed quantity contains uncertainty. The quantitative estimation of such uncertainty is becoming increasingly important, especially in assuring that safety requirements are met in the design, regulation, and operation of nuclear installations. The CEC/USDOE Workshop on Uncertainty Analysis, held in Santa Fe, New Mexico, on November 13 through 16, 1989, was organized jointly by the Commission of the European Communities' (CEC) Radiation Protection Research program, dealing with uncertainties throughout the field of consequence assessment, and DOE's Atmospheric Studies in Complex Terrain (ASCOT) program, concerned with the particular uncertainties in time- and space-variant transport and dispersion. The workshop brought together US and European scientists who have been developing or applying uncertainty analysis methodologies, conducted in a variety of contexts, often with incomplete knowledge of the work of others in this area. Thus, it was timely to exchange views and experience, identify limitations of approaches to uncertainty and possible improvements, and enhance the interface between developers and users of uncertainty analysis methods. Furthermore, the workshop considered the extent to which consistent, rigorous methods could be used in various applications within consequence assessment. 3 refs

  8. Fuzzy Uncertainty Evaluation for Fault Tree Analysis

    Kim, Ki Beom; Shim, Hyung Jin [Seoul National University, Seoul (Korea, Republic of); Jae, Moo Sung [Hanyang University, Seoul (Korea, Republic of)

    2015-05-15

    The traditional probabilistic approach can calculate relatively accurate results. However, it requires a long time because of the repetitive computation of the MC method. In addition, when informative data for statistical analysis are not sufficient or some events are mainly caused by human error, the probabilistic approach may not be possible because the uncertainties of these events are difficult to express by probability distributions. In order to reduce the computation time and to quantify the uncertainties of top events when there are basic events whose uncertainties are difficult to express by probability distributions, fuzzy uncertainty propagation based on fuzzy set theory can be applied. In this paper, we develop a fuzzy uncertainty propagation code and apply it to the fault tree of the core damage accident after the large loss of coolant accident (LLOCA). The fuzzy uncertainty propagation code is implemented and tested for the fault tree of the radiation release accident. We apply this code to the fault tree of the core damage accident after the LLOCA in three cases and compare the results with those computed by probabilistic uncertainty propagation using the MC method. The results obtained by the fuzzy uncertainty propagation can be calculated in a relatively short time and cover the results obtained by the probabilistic uncertainty propagation.
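    A sketch of alpha-cut fuzzy propagation on a small illustrative fault tree (three basic events with triangular fuzzy probabilities; the tree and numbers are assumptions, not the paper's LLOCA model). Because the top-event expression is monotone increasing in every basic-event probability, each alpha-cut interval propagates by evaluating the expression at the interval endpoints.

```python
# Triangular fuzzy probabilities (low, mode, high) for three basic events.
events = {
    "e1": (1e-3, 2e-3, 5e-3),
    "e2": (1e-2, 3e-2, 6e-2),
    "e3": (1e-4, 5e-4, 1e-3),
}

def cut(tri, alpha):
    # interval (alpha-cut) of a triangular fuzzy number at membership alpha
    lo, mode, hi = tri
    return lo + alpha * (mode - lo), hi - alpha * (hi - mode)

def top(p1, p2, p3):
    # top event = (e1 AND e2) OR e3, exact union form
    return 1.0 - (1.0 - p1 * p2) * (1.0 - p3)

for alpha in (0.0, 0.5, 1.0):
    los, his = zip(*(cut(events[e], alpha) for e in ("e1", "e2", "e3")))
    print(alpha, top(*los), top(*his))  # fuzzy top-event interval at this alpha
```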

  9. Some reflections on uncertainty analysis and management

    Aven, Terje

    2010-01-01

    A guide to quantitative uncertainty analysis and management in industry has recently been issued. The guide provides an overall framework for uncertainty modelling and characterisations, using probabilities but also other uncertainty representations (including the Dempster-Shafer theory). A number of practical applications showing how to use the framework are presented. The guide is considered as an important contribution to the field, but there is a potential for improvements. These relate mainly to the scientific basis and clarification of critical issues, for example, concerning the meaning of a probability and the concept of model uncertainty. A reformulation of the framework is suggested using probabilities as the only representation of uncertainty. Several simple examples are included to motivate and explain the basic ideas of the modified framework.

  10. Uncertainty analysis for secondary energy distributions

    Gerstl, S.A.W.

    1978-01-01

    In many transport calculations the integral design parameter of interest (response) is determined mainly by secondary particles such as gamma rays from (n,γ) reactions or secondary neutrons from inelastic scattering events or (n,2n) reactions. Standard sensitivity analysis usually allows one to calculate the sensitivities to the production cross sections of such secondaries, but an extended formalism is needed to also obtain the sensitivities to the energy distribution of the generated secondary particles. For a 30-group standard cross-section set, 84% of all non-zero table positions pertain to the description of secondary energy distributions (SEDs) and only 16% to the actual reaction cross sections. Therefore, any sensitivity/uncertainty analysis which does not consider the effects of SEDs is incomplete and neglects most of the input data. This paper describes how sensitivity profiles for SEDs are obtained and used to estimate the uncertainty of an integral response due to uncertainties in these SEDs. The detailed theory is documented elsewhere and implemented in the LASL sensitivity code SENSIT. SED sensitivity profiles have proven particularly valuable in cross-section uncertainty analyses for fusion reactors. Even when the production cross sections for secondary neutrons were assumed to be without error, the uncertainties in the energy distribution of these secondaries produced appreciable uncertainties in the calculated tritium breeding rate. However, complete error files for SEDs are presently nonexistent. Therefore, methods are described that allow rough error estimates from estimated SED uncertainties based on integral SED sensitivities.

  11. Approach to uncertainty in risk analysis

    Rish, W.R.

    1988-08-01

    In the Fall of 1985 EPA's Office of Radiation Programs (ORP) initiated a project to develop a formal approach to dealing with uncertainties encountered when estimating and evaluating risks to human health and the environment. Based on a literature review of modeling uncertainty, interviews with ORP technical and management staff, and input from experts on uncertainty analysis, a comprehensive approach was developed. This approach recognizes by design the constraints on budget, time, manpower, expertise, and availability of information often encountered in "real world" modeling. It is based on the observation that in practice risk modeling is usually done to support a decision process. As such, the approach focuses on how to frame a given risk modeling problem, how to use that framing to select an appropriate mixture of uncertainty analysis techniques, and how to integrate the techniques into an uncertainty assessment that effectively communicates important information and insight to decision-makers. The approach is presented in this report. Practical guidance on characterizing and analyzing uncertainties about model form and quantities and on effectively communicating uncertainty analysis results is included. Examples from actual applications are presented.

  12. Analysis of uncertainties of thermal hydraulic calculations

    Macek, J.; Vavrin, J.

    2002-12-01

    In 1993-1997 it was proposed, within OECD projects, that a common program be set up for uncertainty analysis by a probabilistic method based on a non-parametric statistical approach for system computer codes such as RELAP, ATHLET and CATHARE, and that a method be developed for the statistical analysis of experimental databases for the preparation of the input deck and for the statistical analysis of the output calculation results. Software for such statistical analyses would then have to be developed as individual tools, independent of the computer codes used for the thermal hydraulic analysis and of the programs for uncertainty analysis. In this context, a method for estimation of a thermal hydraulic calculation is outlined and selected methods of statistical analysis of uncertainties are described, including methods for prediction accuracy assessment based on the discrete Fourier transform principle. (author)

  13. Uncertainties in thick-target PIXE analysis

    Campbell, J.L.; Cookson, J.A.; Paul, H.

    1983-01-01

    Thick-target PIXE analysis involves uncertainties arising from the calculation of thick-target X-ray production in addition to the usual PIXE uncertainties. The calculation demands knowledge of ionization cross-sections, stopping powers and photon attenuation coefficients. Information on these is reviewed critically, and a computational method is used to estimate the uncertainties transmitted from this data base into the results of thick-target PIXE analyses, with reference to particular specimen types using beams of 2-3 MeV protons. A detailed assessment of the accuracy of thick-target PIXE is presented. (orig.)

  14. Uncertainty Propagation in Monte Carlo Depletion Analysis

    Shim, Hyung Jin; Kim, Yeong-il; Park, Ho Jin; Joo, Han Gyu; Kim, Chang Hyo

    2008-01-01

    A new formulation aimed at quantifying the uncertainties of Monte Carlo (MC) tallies such as keff, the microscopic reaction rates of nuclides, and nuclide number densities in MC depletion analysis, and at examining their propagation behaviour as a function of depletion time step (DTS), is presented. It is shown that the variance of a given MC tally, used as a measure of its uncertainty in this formulation, arises from four sources: the statistical uncertainty of the MC tally, uncertainties of microscopic cross sections, uncertainties of nuclide number densities, and the cross correlations between them; the contributions of the latter three sources can be determined by computing the correlation coefficients between the uncertain variables. It is also shown that the variance of any given nuclide number density at the end of each DTS stems from the uncertainties of the nuclide number densities (NND) and microscopic reaction rates (MRR) of nuclides at the beginning of the DTS, and these are determined by computing correlation coefficients between the two uncertain variables. To test the viability of the formulation, we conducted MC depletion analyses for two sample depletion problems involving a simplified 7×7 fuel assembly (FA) and a 17×17 PWR FA, determined the number densities of uranium and plutonium isotopes and their variances as well as k∞ and its variance as a function of DTS, and demonstrated the applicability of the new formulation for the uncertainty propagation analysis that needs to be performed in MC depletion computations. (authors)

  15. Uncertainty Principles and Fourier Analysis

    ...analysis on the part of the reader. Those who are not familiar with Fourier analysis are encouraged to look up Box 1 along with [3]. (A) Heisenberg's inequality: Let us measure concentration in terms of standard deviation, i.e., for a square-integrable function f defined on ℝ and normalized so that ∫_{−∞}^{∞} |f(x)|² dx = 1, ...
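    For reference, the inequality the abstract introduces can be written out; the constant below assumes the Fourier convention f̂(ξ) = ∫ f(x) e^{−2πixξ} dx, so read this as a sketch rather than the article's exact statement.

```latex
% Heisenberg's inequality for a square-integrable f with \|f\|_2 = 1,
% under the convention \hat{f}(\xi) = \int f(x)\, e^{-2\pi i x \xi}\, dx.
\[
  \left( \int_{-\infty}^{\infty} x^{2}\,\lvert f(x)\rvert^{2}\,dx \right)
  \left( \int_{-\infty}^{\infty} \xi^{2}\,\lvert \hat{f}(\xi)\rvert^{2}\,d\xi \right)
  \;\geq\; \frac{1}{16\pi^{2}},
\]
% with equality exactly for suitably normalized Gaussians.
```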

  16. Uncertainty analysis for Ulysses safety evaluation report

    Frank, M.V.

    1991-01-01

    As part of the effort to review the Ulysses Final Safety Analysis Report and to understand the risk of plutonium release from the Ulysses spacecraft General Purpose Heat Source-Radioisotope Thermoelectric Generator (GPHS-RTG), the Interagency Nuclear Safety Review Panel (INSRP) and the author performed an integrated, quantitative analysis of the uncertainties of the calculated risk of plutonium release from Ulysses. Using state-of-the-art probabilistic risk assessment technology, the uncertainty analysis accounted for both variability and uncertainty in the key parameters of the risk analysis. The results show that INSRP had high confidence that the risk of fatal cancers from potential plutonium release associated with the calculated launch and deployment accident scenarios is low.

  17. Uncertainty analysis of neutron transport calculation

    Oka, Y.; Furuta, K.; Kondo, S.

    1987-01-01

    A cross-section sensitivity-uncertainty analysis code, SUSD, was developed. The code calculates sensitivity coefficients for one- and two-dimensional transport problems based on first-order perturbation theory. The variance and standard deviation of detector responses or design parameters can be obtained using the cross-section covariance matrix. The code is able to perform sensitivity-uncertainty analysis for the secondary neutron angular distribution (SAD) and the secondary neutron energy distribution (SED). Covariances of 6Li and 7Li neutron cross sections in JENDL-3PR1 were evaluated, including SAD and SED. Covariances of Fe and Be were also evaluated. The uncertainty of the tritium breeding ratio, fast neutron leakage flux and neutron heating was analysed for four types of blanket concepts for a commercial tokamak fusion reactor. The uncertainty of the tritium breeding ratio was less than 6 percent. Contributions from SAD/SED uncertainties are significant for some parameters. Formulas to estimate the errors of the numerical solution of the transport equation were derived based on perturbation theory. This method enables us to deterministically estimate the numerical errors due to the iterative solution, spatial discretization and Legendre polynomial expansion of the transfer cross-sections. The calculational errors of the tritium breeding ratio and the fast neutron leakage flux of the fusion blankets were analysed. (author)

  18. Uncertainty analysis for geologic disposal of radioactive waste

    Cranwell, R.M.; Helton, J.C.

    1981-01-01

    The incorporation and representation of uncertainty in the analysis of the consequences and risks associated with the geologic disposal of high-level radioactive waste are discussed. Such uncertainty has three primary components: process modeling uncertainty, model input data uncertainty, and scenario uncertainty. The following topics are considered in connection with the preceding components: propagation of uncertainty in the modeling of a disposal site, sampling of input data for models, and uncertainty associated with model output

  19. Uncertainty Propagation in Hypersonic Vehicle Aerothermoelastic Analysis

    Lamorte, Nicolas Etienne

    Hypersonic vehicles face a challenging flight environment. The aerothermoelastic analysis of their components requires numerous simplifying approximations. Identifying and quantifying the effect of uncertainties pushes the limits of the existing deterministic models, and is pursued in this work. An uncertainty quantification framework is used to propagate the effects of identified uncertainties on the stability margins and performance of the different systems considered. First, the aeroelastic stability of a typical section representative of a control surface on a hypersonic vehicle is examined. Variability in the uncoupled natural frequencies of the system is modeled to mimic the effect of aerodynamic heating. Next, the stability of an aerodynamically heated panel representing a component of the skin of a generic hypersonic vehicle is considered. Uncertainty in the location of transition from laminar to turbulent flow and in the heat flux prediction is quantified using CFD. In both cases significant reductions of the stability margins are observed. A loosely coupled airframe-integrated scramjet engine is considered next. The elongated body and cowl of the engine flow path are subject to harsh aerothermodynamic loading, which causes them to deform. Uncertainty associated with the deformation prediction is propagated to the engine performance analysis. The cowl deformation is the main contributor to the sensitivity of the propulsion system performance. Finally, a framework for aerothermoelastic stability boundary calculation for hypersonic vehicles using CFD is developed. The use of CFD enables one to consider different turbulence conditions, laminar or turbulent, and different models of the air mixture, in particular a real gas model which accounts for dissociation of molecules at high temperature. The system is found to be sensitive to turbulence modeling as well as to the location of the transition from laminar to turbulent flow. Real gas effects play a minor role in the...

  1. Optimization of FRAP uncertainty analysis option

    Peck, S.O.

    1979-10-01

    The automated uncertainty analysis option that has been incorporated in the FRAP codes (FRAP-T5 and FRAPCON-2) provides the user with a means of obtaining uncertainty bands on code-predicted variables at user-selected times during a fuel pin analysis. These uncertainty bands are obtained by multiple single fuel pin analyses to generate data which can then be analyzed by second-order statistical error propagation techniques. In this process, a considerable amount of data is generated and stored on tape. The user has certain choices to make regarding which independent variables are to be used in the analysis and what order of error propagation equation should be used in modeling the output response. To aid the user in these decisions, a computer program, ANALYZ, has been written and added to the uncertainty analysis option package. A variety of considerations involved in fitting response surface equations, and certain pitfalls of which the user should be aware, are discussed. An equation is derived expressing a residual as a function of a fitted model and an assumed true model. A variety of experimental design choices are discussed, including the advantages and disadvantages of each approach. Finally, a description of the subcodes which constitute program ANALYZ is provided.
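    The response-surface step that ANALYZ supports can be sketched briefly (a hypothetical two-variable toy model and design, not FRAP output): run the "code" at design points, fit a full quadratic surface by least squares, and inspect the residuals, which vanish here because the toy model is itself quadratic.

```python
import numpy as np

def code(x1, x2):
    # stand-in for a code run (deliberately an exact quadratic)
    return 1.0 + 2.0 * x1 - x2 + 0.5 * x1 * x2 + 0.3 * x1 ** 2

# 3x3 factorial design on [-1, 1]^2
pts = np.array([(a, b) for a in (-1.0, 0.0, 1.0) for b in (-1.0, 0.0, 1.0)])
y = code(pts[:, 0], pts[:, 1])

x1, x2 = pts[:, 0], pts[:, 1]
X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

resid = y - X @ coef        # residual = data minus fitted response surface
print(coef.round(3))
print(np.abs(resid).max())  # ~0 for this toy model
```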

  2. Uncertainties in elemental quantitative analysis by PIXE

    Montenegro, E.C.; Baptista, G.B.; Paschoa, A.S.; Barros Leite, C.V.

    1979-01-01

    The effects of the degree of non-uniformity of the particle beam, matrix composition and matrix thickness in a quantitative elemental analysis by particle-induced X-ray emission (PIXE) are discussed, and a criterion to evaluate the resulting degree of uncertainty in the mass determination by this method is established. (Auth.)

  3. Uncertainty Assessments in Fast Neutron Activation Analysis

    W. D. James; R. Zeisler

    2000-01-01

    Fast neutron activation analysis (FNAA), carried out with small accelerator-based neutron generators, is routinely used for major/minor element determinations in industry, mineral and petroleum exploration, and to some extent in research. While the method shares many of the operational procedures, and therefore errors, inherent in conventional thermal neutron activation analysis, its unique implementation gives rise to additional specific concerns that can result in errors or increased uncertainties of measured quantities. The authors were involved in a recent effort to evaluate irreversible incorporation of oxygen into a standard reference material (SRM) by direct measurement of oxygen by FNAA. That project required determination of oxygen in bottles of the SRM stored in varying environmental conditions and a comparison of the results. We recognized the need to accurately describe the total uncertainty of the measurements in order to characterize any differences in the resulting average concentrations. It is our intent here to discuss the breadth of parameters that can contribute to the random and nonrandom errors of the method and to provide estimates of the magnitude of uncertainty introduced. In addition, we discuss the steps taken in this recent FNAA project to control quality, assess the uncertainty of the measurements, and evaluate results based on statistical reproducibility.

  4. Systematic Analysis Of Ocean Colour Uncertainties

    Lavender, Samantha

    2013-12-01

    This paper reviews current research into the estimation of uncertainties as a pixel-based measure to aid non-specialist users of remote sensing products. An example MERIS image, captured on 28 March 2012, was processed with above-water atmospheric correction code. This was initially based on both the Antoine & Morel Standard Atmospheric Correction, with its Bright Pixel correction component, and the Doerffer Neural Network coastal waters approach. It was shown that analysis of the atmospheric by-products yields important information about the separation of the atmospheric and in-water signals, helping to signpost possible uncertainties in the atmospheric correction results. Further analysis has concentrated on implementing a 'simplistic' atmospheric correction so that the impact of changing the input auxiliary data can be analysed; the influence of changing surface pressure is demonstrated. Future work will focus on automating the analysis, so that the methodology can be implemented within an operational system.

  5. Approach to uncertainty evaluation for safety analysis

    Ogura, Katsunori

    2005-01-01

    Nuclear power plant safety has generally been verified and confirmed through accident simulations using computer codes, because it is very difficult to perform integrated experiments or tests for the verification and validation of plant safety owing to radioactive consequences, cost, and scaling to the actual plant. Traditionally, plant safety was secured by the sufficient safety margin provided by the conservative assumptions and models applied in those simulations. More recently, best-estimate analysis based on realistic assumptions and models, supported by accumulated insights, has become possible, reducing the safety margin in the analysis results and increasing the need to evaluate the reliability or uncertainty of the analysis results. This paper introduces an approach to evaluate the uncertainty of an accident simulation and its results. (Note: This research was done not in the Japan Nuclear Energy Safety Organization but in the Tokyo Institute of Technology.) (author)

  6. Erha Uncertainty Analysis: Planning for the future

    Brami, T.R.; Hopkins, D.F.; Loguer, W.L.; Cornagia, D.M.; Braisted, A.W.C.

    2002-01-01

    The Erha field (OPL 209) was discovered in 1999 approximately 100 km off the coast of Nigeria in 1,100 m of water. The discovery well (Erha-1) encountered oil and gas in deep-water clastic reservoirs. The first appraisal well (Erha-2), drilled 1.6 km downdip to the northwest, penetrated an oil-water contact and confirmed a potentially commercial discovery. However, the Erha-3 and Erha-3 ST-1 boreholes, drilled on the faulted east side of the field in 2001, encountered shallower fluid contacts. As a result of these findings, a comprehensive field-wide uncertainty analysis was performed to better understand what we know versus what we think regarding resource size and economic viability. The uncertainty analysis process applied at Erha is an integrated scenario-based probabilistic approach to modeling resources and reserves. Its goal is to provide quantitative results for a variety of scenarios, thus allowing identification of and focus on critical controls (the variables that are likely to impose the greatest influence). The initial focus at Erha was to incorporate the observed fluid contacts and to develop potential scenarios that included the range of possibilities in unpenetrated portions of the field. Four potential compartmentalization scenarios were hypothesized. The uncertainty model combines these scenarios with reservoir parameters and their plausible ranges. Input data come from multiple sources including: wells, 3D seismic, reservoir flow simulation, geochemistry, fault-seal analysis, sequence stratigraphic analysis, and analogs. Once created, the model is sampled using Monte Carlo techniques to create probability density functions for a variety of variables including oil in place and recoverable reserves. Results of the uncertainty analysis support that, despite a thinner oil column on the faulted east side of the field, Erha is an economically attractive opportunity. Further, the results have been used to develop data acquisition plans and mitigation strategies that...

  7. Uncertainty Prediction in Passive Target Motion Analysis

    2016-05-12

    Patent application 15/152,696, filed 12 May 2016; inventor John G. Baylog et al. The invention described herein ... at an unknown location and following an unknown course relative to an observer 12. Observer 12 has a sensor array such as a passive sonar or radar.

  8. Parameter Uncertainty for Repository Thermal Analysis

    Hardin, Ernest [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hadgu, Teklu [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Greenberg, Harris [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Dupont, Mark [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-10-01

    This report is one follow-on to a study of reference geologic disposal design concepts (Hardin et al. 2011a). Based on an analysis of maximum temperatures, that study concluded that certain disposal concepts would require extended decay storage prior to emplacement, or the use of small waste packages, or both. The study used nominal values for thermal properties of host geologic media and engineered materials, demonstrating the need for uncertainty analysis to support the conclusions. This report is a first step that identifies the input parameters of the maximum temperature calculation, surveys published data on measured values, uses an analytical approach to determine which parameters are most important, and performs an example sensitivity analysis. Using results from this first step, temperature calculations planned for FY12 can focus on only the important parameters, and can use the uncertainty ranges reported here. The survey of published information on thermal properties of geologic media and engineered materials, is intended to be sufficient for use in generic calculations to evaluate the feasibility of reference disposal concepts. A full compendium of literature data is beyond the scope of this report. The term “uncertainty” is used here to represent both measurement uncertainty and spatial variability, or variability across host geologic units. For the most important parameters (e.g., buffer thermal conductivity) the extent of literature data surveyed samples these different forms of uncertainty and variability. Finally, this report is intended to be one chapter or section of a larger FY12 deliverable summarizing all the work on design concepts and thermal load management for geologic disposal (M3FT-12SN0804032, due 15Aug2012).

  9. Uncertainty analysis of nuclear waste package corrosion

    Kurth, R.E.; Nicolosi, S.L.

    1986-01-01

    This paper describes the results of an evaluation of three uncertainty analysis methods for assessing the possible variability in calculating the corrosion process in a nuclear waste package. The purpose of the study is the determination of how each of three uncertainty analysis methods, Monte Carlo, Latin hypercube sampling (LHS) and a modified discrete probability distribution method, perform in such calculations. The purpose is not to examine the absolute magnitude of the numbers but rather to rank the performance of each of the uncertainty methods in assessing the model variability. In this context it was found that the Monte Carlo method provided the most accurate assessment but at a prohibitively high cost. The modified discrete probability method provided accuracy close to that of the Monte Carlo for a fraction of the cost. The LHS method was found to be too inaccurate for this calculation although it would be appropriate for use in a model which requires substantially more computer time than the one studied in this paper

  10. Representing uncertainty on model analysis plots

    Trevor I. Smith

    2016-09-01

    Model analysis provides a mechanism for representing student learning as measured by standard multiple-choice surveys. The model plot contains information regarding both how likely students in a particular class are to choose the correct answer and how likely they are to choose an answer consistent with a well-documented conceptual model. Unfortunately, Bao’s original presentation of the model plot did not include a way to represent uncertainty in these measurements. I present details of a method to add error bars to model plots by expanding the work of Sommer and Lindell. I also provide a template for generating model plots with error bars.

  11. Uncertainty Analysis and Expert Judgment in Seismic Hazard Analysis

    Klügel, Jens-Uwe

    2011-01-01

    The large uncertainty associated with the prediction of future earthquakes is usually regarded as the main reason for the increased hazard estimates which have resulted from some recent large-scale probabilistic seismic hazard analysis studies (e.g. the PEGASOS study in Switzerland and the Yucca Mountain study in the USA). It is frequently overlooked that such increased hazard estimates are characteristic of a single specific method of probabilistic seismic hazard analysis (PSHA): the traditional (Cornell-McGuire) PSHA method, which has found its highest level of sophistication in the SSHAC probability method. Based on a review of the SSHAC probability model and its application in the PEGASOS project, it is shown that the surprising results of recent PSHA studies can be explained to a large extent by the uncertainty model used in traditional PSHA, which deviates from the state of the art in mathematics and risk analysis. This uncertainty model, the Ang-Tang uncertainty model, mixes concepts of decision theory with probabilistic hazard assessment methods, leading to an overestimation of uncertainty in comparison to empirical evidence. Although expert knowledge can be a valuable source of scientific information, its incorporation into the SSHAC probability method does not resolve the issue of inflated uncertainties in PSHA results. Other, more data-driven, PSHA approaches in use in some European countries are less vulnerable to this effect. The most valuable alternative to traditional PSHA is the direct probabilistic scenario-based approach, which is closely linked with emerging neo-deterministic methods based on waveform modelling.

  12. Risk uncertainty analysis methods for NUREG-1150

    Benjamin, A.S.; Boyd, G.J.

    1987-01-01

    Evaluation and display of risk uncertainties for NUREG-1150 constitute a principal focus of the Severe Accident Risk Rebaselining/Risk Reduction Program (SARRP). Some of the principal objectives of the uncertainty evaluation are: (1) to provide a quantitative estimate that reflects, for those areas considered, a credible and realistic range of uncertainty in risk; (2) to rank the various sources of uncertainty with respect to their importance for various measures of risk; and (3) to characterize the state of understanding of each aspect of the risk assessment for which major uncertainties exist. This paper describes the methods developed to fulfill these objectives

  13. Representation of analysis results involving aleatory and epistemic uncertainty.

    Johnson, Jay Dean (ProStat, Mesa, AZ); Helton, Jon Craig (Arizona State University, Tempe, AZ); Oberkampf, William Louis; Sallaberry, Cedric J.

    2008-08-01

    Procedures are described for the representation of results in analyses that involve both aleatory uncertainty and epistemic uncertainty, with aleatory uncertainty deriving from an inherent randomness in the behavior of the system under study and epistemic uncertainty deriving from a lack of knowledge about the appropriate values to use for quantities that are assumed to have fixed but poorly known values in the context of a specific study. Aleatory uncertainty is usually represented with probability and leads to cumulative distribution functions (CDFs) or complementary cumulative distribution functions (CCDFs) for analysis results of interest. Several mathematical structures are available for the representation of epistemic uncertainty, including interval analysis, possibility theory, evidence theory and probability theory. In the presence of epistemic uncertainty, there is not a single CDF or CCDF for a given analysis result. Rather, there is a family of CDFs and a corresponding family of CCDFs that derive from epistemic uncertainty and have an uncertainty structure that derives from the particular uncertainty structure (i.e., interval analysis, possibility theory, evidence theory, probability theory) used to represent epistemic uncertainty. Graphical formats for the representation of epistemic uncertainty in families of CDFs and CCDFs are investigated and presented for the indicated characterizations of epistemic uncertainty.
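
    A minimal sketch of how such a family of distributions can be produced, assuming the common double-loop arrangement in which an outer loop samples the epistemic quantities and an inner loop samples the aleatory variability; the distributions, sample sizes and variable names are hypothetical.

    ```python
    # Double-loop sampling: each outer (epistemic) draw fixes the poorly
    # known quantities and yields one aleatory CDF; the collection of
    # curves is the family described above. All distributions hypothetical.
    import numpy as np

    rng = np.random.default_rng(1)
    n_epistemic = 50      # outer loop: fixed but poorly known values
    n_aleatory = 2000     # inner loop: inherent randomness

    family = []
    for _ in range(n_epistemic):
        mu = rng.uniform(8.0, 12.0)       # epistemic: mean known only to lie in [8, 12]
        sample = rng.normal(mu, 1.5, n_aleatory)  # aleatory: variability given mu
        family.append(np.sort(sample))    # sorted sample = empirical CDF support

    family = np.array(family)
    idx = int(0.95 * n_aleatory) - 1      # 95th percentile of each aleatory CDF
    print("95th percentile across the epistemic family:",
          f"min={family[:, idx].min():.2f}, max={family[:, idx].max():.2f}")
    ```

    Plotting every row of `family` against the empirical probabilities gives the familiar "horsetail" of CDFs; analogous loops over interval-valued or evidence-theory structures yield the other representations the abstract lists.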

  14. Assessing scenario and parametric uncertainties in risk analysis: a model uncertainty audit

    Tarantola, S.; Saltelli, A.; Draper, D.

    1999-01-01

    In the present study a process of model audit is addressed on a computational model used for predicting maximum radiological doses to humans in the field of nuclear waste disposal. Global uncertainty and sensitivity analyses are employed to assess output uncertainty and to quantify the contribution of parametric and scenario uncertainties to the model output. These tools are of fundamental importance for risk analysis and decision making purposes

  15. Reusable launch vehicle model uncertainties impact analysis

    Chen, Jiaye; Mu, Rongjun; Zhang, Xin; Deng, Yanpeng

    2018-03-01

    The reusable launch vehicle (RLV) typically combines a complex aerodynamic shape with a coupled propulsion system, and its flight environment is highly complicated and intensely changeable. Its model therefore carries large uncertainty, which makes the nominal system quite different from the real system. Studying the influence of these uncertainties on the stability of the control system is consequently of great significance for controller design. In order to improve the performance of the RLV, this paper proposes an approach for analyzing the influence of the model uncertainties. For a typical RLV, the coupled dynamic and kinematic models are built. The factors that introduce uncertainty during modelling are then analyzed and summarized, and the model uncertainties are expressed with an additive uncertainty model. The maximum singular value of the uncertainty matrix is chosen as the boundary model, and the norm of the uncertainty matrix is used to indicate how strongly each uncertainty factor influences the stability of the control system. The simulation results illustrate that the inertial factors have the largest influence on the stability of the system, and that it is necessary and important to take the model uncertainties into consideration before designing the controller of this kind of aircraft (like the RLV).
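
    A small numerical sketch of the boundary model described above, assuming the additive form Delta = A_perturbed - A_nominal: the maximum singular value (spectral norm) of Delta bounds the perturbation. The matrices are hypothetical, not the paper's RLV model.

    ```python
    # Sizing an additive uncertainty: the spectral norm of the difference
    # between perturbed and nominal system matrices bounds the perturbation.
    import numpy as np

    A_nominal = np.array([[0.0, 1.0],
                          [-2.0, -0.5]])    # hypothetical nominal dynamics
    A_perturbed = np.array([[0.0, 1.0],
                            [-2.3, -0.4]])  # same model, perturbed inertial terms

    Delta = A_perturbed - A_nominal         # additive uncertainty model
    sigma_max = np.linalg.norm(Delta, 2)    # spectral norm = max singular value
    print(f"max singular value of Delta: {sigma_max:.4f}")
    ```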

  16. The uncertainty analysis of model results a practical guide

    Hofer, Eduard

    2018-01-01

    This book is a practical guide to the uncertainty analysis of computer model applications. Used in many areas, such as engineering, ecology and economics, computer models are subject to various uncertainties at the level of model formulations, parameter values and input data. Naturally, it would be advantageous to know the combined effect of these uncertainties on the model results as well as whether the state of knowledge should be improved in order to reduce the uncertainty of the results most effectively. The book supports decision-makers, model developers and users in their argumentation for an uncertainty analysis and assists them in the interpretation of the analysis results.

  17. Validation of Fuel Performance Uncertainty for RIA Safety Analysis

    Park, Nam-Gyu; Yoo, Jong-Sung; Jung, Yil-Sup [KEPCO Nuclear Fuel Co., Daejeon (Korea, Republic of)]

    2016-10-15

    To achieve this, the computer code performance has to be validated against experimental results, and for the uncertainty quantification the important uncertainty parameters need to be selected and the combined uncertainty evaluated with an acceptable statistical treatment. The uncertainty parameters important to rod performance, such as fuel enthalpy, fission gas release and cladding hoop strain, were chosen through rigorous sensitivity studies, and their validity was assessed using the results of experiments performed in CABRI and NSRR. The analysis revealed that several tested rods were not bounded within the combined fuel performance uncertainty. An assessment of fuel performance with an extended fuel power uncertainty was also carried out for the rods tested in NSRR and CABRI, and again several rods were not bounded within the calculated fuel performance uncertainty. This implies that the currently considered uncertainty range of the parameters is not large enough to cover the fuel performance sufficiently.

  18. A methodology for uncertainty analysis of reference equations of state

    Cheung, Howard; Frutiger, Jerome; Bell, Ian H.

    We present a detailed methodology for the uncertainty analysis of reference equations of state (EOS) based on Helmholtz energy. In recent years there has been an increased interest in uncertainties of property data and process models of thermal systems. In the literature there are various...... for uncertainty analysis is suggested as a tool for EOS. The uncertainties of the EOS properties are calculated from the experimental values and the EOS model structure through the parameter covariance matrix and subsequent linear error propagation. This allows reporting the uncertainty range (95% confidence...

  19. Measurement uncertainty analysis techniques applied to PV performance measurements

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; demonstrates that quality assurance and quality control measures have been accomplished; and defines valid data as data having known and documented paths of origin (including theory), measurements, traceability to measurement standards, computations, and uncertainty analysis of results

  20. Uncertainty as Knowledge: Constraints on Policy Choices Provided by Analysis of Uncertainty

    Lewandowsky, S.; Risbey, J.; Smithson, M.; Newell, B. R.

    2012-12-01

    Uncertainty forms an integral part of climate science, and it is often cited in connection with arguments against mitigative action. We argue that an analysis of uncertainty must consider existing knowledge as well as uncertainty, and the two must be evaluated with respect to the outcomes and risks associated with possible policy options. Although risk judgments are inherently subjective, an analysis of the role of uncertainty within the climate system yields two constraints that are robust to a broad range of assumptions. Those constraints are that (a) greater uncertainty about the climate system is necessarily associated with greater expected damages from warming, and (b) greater uncertainty translates into a greater risk of the failure of mitigation efforts. These ordinal constraints are unaffected by subjective or cultural risk-perception factors, they are independent of the discount rate, and they are independent of the magnitude of the estimate for climate sensitivity. The constraints mean that any appeal to uncertainty must imply a stronger, rather than weaker, need to cut greenhouse gas emissions than in the absence of uncertainty.

  1. Uncertainties in Safety Analysis. A literature review

    Ekberg, C.

    1995-05-01

    The purpose of the presented work has been to give a short summary of the origins of many uncertainties arising in the design and performance assessment of a repository for spent nuclear fuel. Different methods to treat these uncertainties are also included. The methods and conclusions are in many cases general, in the sense that they are applicable to many other disciplines where simulations are used. As a conclusion it may be noted that uncertainties of different origin have been discussed and debated, but one large group, e.g. computer simulations, for which methods for a more explicit investigation exist, has not been investigated in a satisfying way. 50 refs

  2. Uncertainties in Safety Analysis. A literature review

    Ekberg, C [Chalmers Univ. of Technology, Goeteborg (Sweden). Dept. of Nuclear Chemistry]

    1995-05-01

    The purpose of the presented work has been to give a short summary of the origins of many uncertainties arising in the design and performance assessment of a repository for spent nuclear fuel. Different methods to treat these uncertainties are also included. The methods and conclusions are in many cases general, in the sense that they are applicable to many other disciplines where simulations are used. As a conclusion it may be noted that uncertainties of different origin have been discussed and debated, but one large group, e.g. computer simulations, for which methods for a more explicit investigation exist, has not been investigated in a satisfying way. 50 refs.

  3. Sensitivity functions for uncertainty analysis: Sensitivity and uncertainty analysis of reactor performance parameters

    Greenspan, E.

    1982-01-01

    This chapter presents the mathematical basis for sensitivity functions, discusses their physical meaning and the information they contain, and clarifies a number of issues concerning their application, including the definition of group sensitivities, the selection of sensitivity functions to be included in the analysis, and limitations of sensitivity theory. Examines the theoretical foundation; criticality reset sensitivities; group sensitivities and uncertainties; selection of sensitivities included in the analysis; and other uses and limitations of sensitivity functions. Gives the theoretical formulation of sensitivity functions pertaining to "as-built" designs for performance parameters of the form of ratios of linear flux functionals (such as reaction-rate ratios), linear adjoint functionals, bilinear functions (such as reactivity worth ratios), and for reactor reactivity. Offers a consistent procedure for reducing energy-dependent or fine-group sensitivities and uncertainties to broad group sensitivities and uncertainties. Provides illustrations of sensitivity functions as well as references to available compilations of such functions and of total sensitivities. Indicates limitations of sensitivity theory originating from the fact that this theory is based on a first-order perturbation theory

  4. Uncertainty analysis and validation of environmental models. The empirically based uncertainty analysis

    Monte, Luigi; Hakanson, Lars; Bergstroem, Ulla; Brittain, John; Heling, Rudie

    1996-01-01

    The principles of Empirically Based Uncertainty Analysis (EBUA) are described. EBUA is based on the evaluation of 'performance indices' that express the level of agreement between the model and sets of empirical independent data collected in different experimental circumstances. Some of these indices may be used to evaluate the confidence limits of the model output. The method is based on the statistical analysis of the distribution of the index values and on the quantitative relationship of these values with the ratio 'experimental data/model output'. Some performance indices are described in the present paper. Among these, the so-called 'functional distance' (d) between the logarithm of model output and the logarithm of the experimental data, defined as $d^2 = \frac{1}{n}\sum_{i=1}^{n} (\ln M_i - \ln O_i)^2$, where $M_i$ is the i-th experimental value, $O_i$ the corresponding model evaluation and $n$ the number of couplets 'experimental value, predicted value', is an important tool for the EBUA method. From the statistical distribution of this performance index, it is possible to infer the characteristics of the distribution of the ratio 'experimental data/model output' and, consequently, to evaluate the confidence limits for the model predictions. This method was applied to calculate the uncertainty level of a model developed to predict the migration of radiocaesium in lacustrine systems. Unfortunately, performance indices are affected by the uncertainty of the experimental data used in validation. Indeed, measurement results of environmental levels of contamination are generally associated with large uncertainty due to the measurement and sampling techniques and to the large variability in space and time of the measured quantities. It is demonstrated that this non-desired effect, in some circumstances, may be corrected by means of simple formulae
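
    For concreteness, a minimal sketch of the functional distance defined above; the paired experimental/model values are invented.

    ```python
    # Functional distance d^2 = (1/n) * sum_i (ln M_i - ln O_i)^2 between
    # paired experimental values M_i and model evaluations O_i.
    import numpy as np

    def functional_distance(M, O):
        """Mean squared log-difference between data and model values."""
        M, O = np.asarray(M, float), np.asarray(O, float)
        return np.mean((np.log(M) - np.log(O)) ** 2)

    M = [1.2, 0.8, 3.5, 2.0]   # hypothetical measured contamination levels
    O = [1.0, 1.0, 3.0, 2.5]   # corresponding model predictions
    d2 = functional_distance(M, O)
    print(f"d^2 = {d2:.4f}, d = {np.sqrt(d2):.4f}")
    ```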

  5. The characterisation and evaluation of uncertainty in probabilistic risk analysis

    Parry, G.W.; Winter, P.W.

    1980-10-01

    The sources of uncertainty in probabilistic risk analysis are discussed using the event/fault tree methodology as an example. The role of statistics in quantifying these uncertainties is investigated. A class of uncertainties is identified which is, at present, unquantifiable, using either classical or Bayesian statistics. It is argued that Bayesian statistics is the more appropriate vehicle for the probabilistic analysis of rare events and a short review is given with some discussion on the representation of ignorance. (author)

  6. Uncertainty analysis of energy consumption in dwellings

    Pettersen, Trine Dyrstad

    1997-12-31

    This thesis presents a comprehensive study of an energy estimation model that can be used to examine the uncertainty of predicted energy consumption in a dwelling. The variation and uncertainty of input parameters due to the outdoor climate, the building construction and the inhabitants are studied as a basis for further energy evaluations. The variations in energy consumption occurring in nominally similar dwellings are also investigated in order to verify the simulated energy consumption. The main topics are (1) a study of expected variations and uncertainties in both the input parameters used in energy consumption calculations and the energy consumption of the dwelling, (2) the development and evaluation of a simplified energy calculation model that considers uncertainties due to the input parameters, (3) an evaluation of the influence of the uncertain parameters on the total variation so that the most important parameters can be identified, and (4) the recommendation of a simplified procedure for treating uncertainties or possible deviations from average conditions. 90 refs., 182 figs., 73 tabs.

  7. Analysis of uncertainty in modeling perceived risks

    Melnyk, R.; Sandquist, G.M.

    2005-01-01

    Expanding on a mathematical model developed for quantifying and assessing perceived risks, the distribution functions, variances, and uncertainties associated with estimating the model parameters are quantified. The analytical model permits the identification and assignment of any number of quantifiable risk perception factors that can be incorporated within standard risk methodology. Those risk perception factors associated with major technical issues are modeled using lognormal probability density functions to span the potentially large uncertainty variations associated with these risk perceptions. The model quantifies the logic of public risk perception and provides an effective means for measuring and responding to perceived risks. (authors)

  8. Qualitative uncertainty analysis in probabilistic safety assessment context

    Apostol, M.; Constantin, M; Turcu, I.

    2007-01-01

    In the Probabilistic Safety Assessment (PSA) context, an uncertainty analysis is performed either to estimate the uncertainty in the final results (the risk to public health and safety) or to estimate the uncertainty in some intermediate quantities (the core damage frequency, the radionuclide release frequency or the fatality frequency). The identification and evaluation of uncertainty are important tasks because they lend credibility to the results and help in the decision-making process. Uncertainty analysis can be performed qualitatively or quantitatively. This paper performs a preliminary qualitative uncertainty analysis by identifying the major uncertainties at the PSA level 1 / level 2 interface and in the other two major procedural steps of a level 2 PSA, i.e. the analysis of accident progression and of the containment, and the analysis of the source term for severe accidents. One should mention that a level 2 PSA for a Nuclear Power Plant (NPP) involves the evaluation and quantification of the mechanisms, amounts and probabilities of subsequent radioactive material releases from the containment. According to NUREG-1150, an important task in source term analysis is the fission product transport analysis. The uncertainties related to the isotope distribution in the CANDU NPP primary circuit and to the isotope masses transferred into the containment, computed with the SOPHAEROS module of the ASTEC computer code, will also be presented. (authors)

  9. Uncertainties

    To reflect this uncertainty in the climate scenarios, the use of AOGCMs that explicitly simulate the carbon cycle and chemistry of all the substances are needed. The Hadley Centre has developed a version of the climate model that allows the effect of climate change on the carbon cycle and its feedback into climate, to be ...

  10. Uncertainty

    Silva, T.A. da

    1988-01-01

    The comparison between the uncertainty method recommended by the International Atomic Energy Agency (IAEA) and that of the International Committee for Weights and Measures (CIPM) is shown for the calibration of clinical dosimeters in the Secondary Standard Dosimetry Laboratory (SSDL). (C.G.C.) [pt

  11. Durability reliability analysis for corroding concrete structures under uncertainty

    Zhang, Hao

    2018-02-01

    This paper presents a durability reliability analysis of reinforced concrete structures subject to the action of marine chloride. The focus is to provide insight into the role of epistemic uncertainties on durability reliability. The corrosion model involves a number of variables whose probabilistic characteristics cannot be fully determined due to the limited availability of supporting data. All sources of uncertainty, both aleatory and epistemic, should be included in the reliability analysis. Two methods are available to formulate the epistemic uncertainty: the imprecise probability-based method and the purely probabilistic method in which the epistemic uncertainties are modeled as random variables. The paper illustrates how the epistemic uncertainties are modeled and propagated in the two methods, and shows how epistemic uncertainties govern the durability reliability.

  12. Systematic Evaluation of Uncertainty in Material Flow Analysis

    Laner, David; Rechberger, Helmut; Astrup, Thomas Fruergaard

    2014-01-01

    Material flow analysis (MFA) is a tool to investigate material flows and stocks in defined systems as a basis for resource management or environmental pollution control. Because of the diverse nature of sources and the varying quality and availability of data, MFA results are inherently uncertain....... Uncertainty analyses have received increasing attention in recent MFA studies, but systematic approaches for selection of appropriate uncertainty tools are missing. This article reviews existing literature related to handling of uncertainty in MFA studies and evaluates current practice of uncertainty analysis......) and exploratory MFA (identification of critical parameters and system behavior). Whereas mathematically simpler concepts focusing on data uncertainty characterization are appropriate for descriptive MFAs, statistical approaches enabling more-rigorous evaluation of uncertainty and model sensitivity are needed...

  13. Uncertainty analysis of dosimetry spectrum unfolding

    Perey, F.G.

    1977-01-01

    The propagation of uncertainties in the input data is analyzed for the usual dosimetry unfolding solution. A new formulation of the dosimetry unfolding problem is proposed in which the most likely value of the spectrum is obtained. The relationship of this solution to the usual one is discussed

  14. Analytic uncertainty and sensitivity analysis of models with input correlations

    Zhu, Yueying; Wang, Qiuping A.; Li, Wei; Cai, Xu

    2018-03-01

    Probabilistic uncertainty analysis is a common means of evaluating mathematical models. In mathematical modeling, the uncertainty in input variables is specified through distribution laws. Its contribution to the uncertainty in the model response is usually analyzed by assuming that input variables are independent of each other. However, correlated parameters often occur in practical applications. In the present paper, an analytic method is developed for the uncertainty and sensitivity analysis of models in the presence of input correlations. With the method, it is straightforward to identify the importance of the independence and correlations of input variables in determining the model response. This allows one to decide whether or not the input correlations should be considered in practice. Numerical examples demonstrate the effectiveness and validity of our analytic method in the analysis of general models. A practical application of the method to the uncertainty and sensitivity analysis of a deterministic HIV model is also proposed.
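
    A minimal sketch of one standard analytic route, first-order (delta-method) propagation, in which Var(y) ≈ g^T Σ g with g the gradient of the model at the input means and Σ the input covariance matrix; the model and the numbers are hypothetical, and the paper's own formulation may differ in detail.

    ```python
    # First-order propagation with correlated inputs: compare the output
    # variance with and without the off-diagonal covariance terms.
    import numpy as np

    mu = np.array([1.0, 2.0])                    # input means (hypothetical)
    Sigma = np.array([[0.04, 0.02],              # input covariance; off-diagonal
                      [0.02, 0.09]])             # entries encode the correlation

    # Model y = x1^2 + 3*x1*x2; gradient at the mean, computed analytically.
    g = np.array([2.0 * mu[0] + 3.0 * mu[1], 3.0 * mu[0]])

    var_corr = g @ Sigma @ g                     # with correlations
    var_indep = g @ np.diag(np.diag(Sigma)) @ g  # ignoring correlations
    print(f"std(y) with correlations:    {np.sqrt(var_corr):.4f}")
    print(f"std(y) assuming independence: {np.sqrt(var_indep):.4f}")
    ```

    The gap between the two standard deviations is exactly the kind of diagnostic the abstract describes: it tells the modeler whether the input correlations can safely be neglected.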

  15. Discussion of OECD LWR Uncertainty Analysis in Modelling Benchmark

    Ivanov, K.; Avramova, M.; Royer, E.; Gillford, J.

    2013-01-01

    The demand for best estimate calculations in nuclear reactor design and safety evaluations has increased in recent years. Uncertainty quantification has been highlighted as part of the best estimate calculations. The modelling aspects of uncertainty and sensitivity analysis are to be further developed and validated on scientific grounds in support of their performance and application to multi-physics reactor simulations. The Organization for Economic Co-operation and Development (OECD) / Nuclear Energy Agency (NEA) Nuclear Science Committee (NSC) has endorsed the creation of an Expert Group on Uncertainty Analysis in Modelling (EGUAM). Within the framework of activities of EGUAM/NSC, the OECD/NEA initiated the Benchmark for Uncertainty Analysis in Modelling for Design, Operation, and Safety Analysis of Light Water Reactors (OECD LWR UAM benchmark). The general objective of the benchmark is to propagate the predictive uncertainties of code results through complex coupled multi-physics and multi-scale simulations. The benchmark is divided into three phases, with Phase I highlighting the uncertainty propagation in stand-alone neutronics calculations, while Phases II and III are focused on the uncertainty analysis of the reactor core and of the system, respectively. This paper discusses the progress made in the Phase I calculations, the specifications for Phase II and the coming challenges in defining the Phase III exercises. The challenges of applying uncertainty quantification to complex code systems, in particular to time-dependent coupled-physics models, are the large computational burden and the use of non-linear models (expected due to the physics coupling). (authors)

  16. Measurement uncertainty analysis techniques applied to PV performance measurements

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; demonstrates that quality assurance and quality control measures have been accomplished; and defines valid data as data having known and documented paths of origin (including theory), measurements, traceability to measurement standards, computations, and uncertainty analysis of results.

  17. Measurement uncertainty analysis techniques applied to PV performance measurements

    Wells, C

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; demonstrates that quality assurance and quality control measures have been accomplished; and defines valid data as data having known and documented paths of origin (including theory), measurements, traceability to measurement standards, computations, and uncertainty analysis of results.

  18. Sensitivity and uncertainty analysis for fission product decay heat calculations

    Rebah, J.; Lee, Y.K.; Nimal, J.C.; Nimal, B.; Luneville, L.; Duchemin, B.

    1994-01-01

    The calculated uncertainty in decay heat due to the uncertainty in the basic nuclear data given in the CEA86 Library is presented. Uncertainties in the summation calculation arise from several sources: fission product yields, half-lives and average decay energies. The correlation between basic data is taken into account. The uncertainty analyses were performed for the thermal-neutron-induced fission of U235 and Pu239, for both burst fission and finite irradiation times. The decay heat calculated in this study is compared with experimental results and with a new calculation using the JEF2 Library. (from authors) 6 figs., 19 refs

  19. Urban drainage models simplifying uncertainty analysis for practitioners

    Vezzaro, Luca; Mikkelsen, Peter Steen; Deletic, Ana

    2013-01-01

    There is increasing awareness about uncertainties in the modelling of urban drainage systems and, as such, many new methods for uncertainty analyses have been developed. Despite this, all available methods have limitations which restrict their widespread application among practitioners. Here … in each measured/observed datapoint; an issue that is commonly overlooked in the uncertainty analysis of urban drainage models. This comparison allows the user to intuitively estimate the optimum number of simulations required to conduct uncertainty analyses. The output of the method includes parameter …

  20. Uncertainty Analysis of Consequence Management (CM) Data Products.

    Hunt, Brian D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Eckert-Gallup, Aubrey Celia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Cochran, Lainy Dromgoole [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Kraus, Terrence D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Fournier, Sean Donovan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Allen, Mark B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Schetnan, Richard Reed [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Simpson, Matthew D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)]; Okada, Colin E. [Remote Sensing Lab. (RSL), Nellis AFB, Las Vegas, NV (United States)]; Bingham, Avery A. [Remote Sensing Lab. (RSL), Nellis AFB, Las Vegas, NV (United States)]

    2018-01-01

    The goal of this project is to develop and execute methods for characterizing uncertainty in data products that are developed and distributed by the DOE Consequence Management (CM) Program. A global approach to this problem is necessary because multiple sources of error and uncertainty from across the CM skill sets contribute to the ultimate production of CM data products. This report presents the methods used to develop a probabilistic framework to characterize this uncertainty and provides results for an uncertainty analysis for a study scenario analyzed using this framework.

  1. Decisions under uncertainty using Bayesian analysis

    Stelian STANCU

    2006-01-01

    The present paper gives a short presentation of the Bayesian decision method, in which extra information brings great support to the decision-making process but also attracts new costs. In this situation, obtaining new information, generally experimentally based, contributes to diminishing the degree of uncertainty that influences the decision-making process. As a conclusion, in a large number of decision problems there is the possibility that decision makers will revise some decisions already taken because of the facilities offered by obtaining extra information.

  2. Uncertainty Analysis of Light Water Reactor Fuel Lattices

    C. Arenas

    2013-01-01

    The study explored the calculation of uncertainty based on available cross-section covariance data and computational tools at the fuel lattice level, which included pin cell and fuel assembly models. Uncertainty variations due to temperature changes and different fuel compositions are the main focus of this analysis. Selected assemblies and unit pin cells were analyzed according to the OECD LWR UAM benchmark specifications. Criticality and uncertainty analyses were performed using the TSUNAMI-2D sequence in SCALE 6.1. It was found that uncertainties increase with increasing temperature, while kinf decreases. This increase in the uncertainty is due to the increase in sensitivity of the largest contributor to the uncertainty, namely, the neutron capture reaction 238U(n, γ), because of the Doppler broadening. In addition, three types of fuel material composition (UOX, MOX, and UOX-Gd2O3) were analyzed. A remarkable increase in uncertainty in kinf was observed for the MOX fuel; the increase was nearly twice the corresponding value in UOX fuel. The neutron-nuclide reaction of 238U, mainly inelastic scattering (n, n′), contributed the most to the uncertainties in the MOX fuel, shifting the neutron spectrum to higher energy compared to the UOX fuel.

  3. Automated uncertainty analysis methods in the FRAP computer codes

    Peck, S.O.

    1980-01-01

    A user-oriented, automated uncertainty analysis capability has been incorporated in the Fuel Rod Analysis Program (FRAP) computer codes. The FRAP codes have been developed for the analysis of Light Water Reactor fuel rod behavior during steady state (FRAPCON) and transient (FRAP-T) conditions as part of the United States Nuclear Regulatory Commission's Water Reactor Safety Research Program. The objective of uncertainty analysis of these codes is to obtain estimates of the uncertainty in the computed outputs of the codes as a function of known uncertainties in input variables. This paper presents the methods used to generate an uncertainty analysis of a large computer code, discusses the assumptions that are made, and shows techniques for testing them. An uncertainty analysis of FRAP-T calculated fuel rod behavior during a hypothetical loss-of-coolant transient is presented as an example and carried through the discussion to illustrate the various concepts

  4. Uncertainty Analysis with Considering Resonance Self-shielding Effect

    Han, Tae Young

    2016-01-01

    If infinitely diluted multi-group cross sections are used for the sensitivity, the covariance data from the evaluated nuclear data library (ENDL) can be applied directly. However, when a self-shielded multi-group cross section is used, the covariance data should be corrected for the self-shielding effect. The implicit uncertainty can be defined as the uncertainty change caused by the resonance self-shielding effect described above. MUSAD (Modules of Uncertainty and Sensitivity Analysis for DeCART) has been developed for multiplication factor and cross-section uncertainty analysis based on generalized perturbation theory; however, it can only quantify the explicit uncertainty from the self-shielded multi-group cross sections without considering the implicit effect. Thus, this paper addresses the implementation of an implicit uncertainty analysis module into the code, and the numerical results for its verification are provided. The implicit uncertainty analysis module has been implemented into MUSAD based on the infinitely-diluted cross-section-based consistent method. The verification calculation was performed on MHTGR 350 Ex. I-1a, and the differences with the McCARD results decrease from 40% to 1% in the CZP case and 3% in the HFP case. From this study, it is expected that MUSAD can reasonably produce the complete uncertainty for a VHTR or LWR, where the resonance self-shielding effect should be significantly considered

  5. Uncertainty Analysis with Considering Resonance Self-shielding Effect

    Han, Tae Young [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)]

    2016-10-15

    If infinitely diluted multi-group cross sections are used for the sensitivity, the covariance data from the evaluated nuclear data library (ENDL) can be applied directly. However, when a self-shielded multi-group cross section is used, the covariance data should be corrected for the self-shielding effect. The implicit uncertainty can be defined as the uncertainty change caused by the resonance self-shielding effect described above. MUSAD (Modules of Uncertainty and Sensitivity Analysis for DeCART) has been developed for multiplication factor and cross-section uncertainty analysis based on generalized perturbation theory; however, it can only quantify the explicit uncertainty from the self-shielded multi-group cross sections without considering the implicit effect. Thus, this paper addresses the implementation of an implicit uncertainty analysis module into the code, and the numerical results for its verification are provided. The implicit uncertainty analysis module has been implemented into MUSAD based on the infinitely-diluted cross-section-based consistent method. The verification calculation was performed on MHTGR 350 Ex. I-1a, and the differences with the McCARD results decrease from 40% to 1% in the CZP case and 3% in the HFP case. From this study, it is expected that MUSAD can reasonably produce the complete uncertainty for a VHTR or LWR, where the resonance self-shielding effect should be significantly considered.

  6. One Approach to the Fire PSA Uncertainty Analysis

    Simic, Z.; Mikulicic, V.; Vukovic, I.

    2002-01-01

    Practical experience and findings from a number of fire probabilistic safety assessment (PSA) studies show that fire has a high relative importance for nuclear power plant safety. Fire PSA is very challenging, and a number of issues are still in the area of research and development. This has a major impact on the conservatism of fire PSA findings. One way to reduce the level of conservatism is to conduct an uncertainty analysis. At the top level, the uncertainty of a fire PSA can be separated into three segments. The first segment is related to the fire initiating event frequencies. The second uncertainty segment is connected to the uncertainty of the fire damage. Finally, there is uncertainty related to the PSA model, which propagates this fire-initiated damage to the core damage or other analyzed risk. This paper discusses all three segments of uncertainty. Some recent experience with fire PSA study uncertainty analysis, the use of the fire analysis code COMPBRN IIIe, and the importance of the uncertainty evaluation to the final result is presented. (author)

  7. Estimating the measurement uncertainty in forensic blood alcohol analysis.

    Gullberg, Rod G

    2012-04-01

    For many reasons, forensic toxicologists are being asked to determine and report their measurement uncertainty in blood alcohol analysis. While understood conceptually, the elements and computations involved in determining measurement uncertainty are generally foreign to most forensic toxicologists. Several established and well-documented methods are available to determine and report the uncertainty in blood alcohol measurement. A straightforward bottom-up approach is presented that includes: (1) specifying the measurand, (2) identifying the major components of uncertainty, (3) quantifying the components, (4) statistically combining the components and (5) reporting the results. A hypothetical example is presented that employs reasonable estimates for forensic blood alcohol analysis assuming headspace gas chromatography. These computations are easily employed in spreadsheet programs as well. Determining and reporting measurement uncertainty is an important element in establishing fitness-for-purpose. Indeed, the demand for such computations and information from the forensic toxicologist will continue to increase.
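
    A minimal sketch of steps (2)-(5) of the bottom-up approach outlined above, with hypothetical standard uncertainties for a headspace gas chromatography analysis; the root-sum-square combination and the coverage factor k = 2 follow common metrological practice and are not necessarily the paper's exact figures.

    ```python
    # Bottom-up uncertainty budget: quantify each component as a standard
    # uncertainty, combine in quadrature, report an expanded uncertainty.
    import math

    mean_bac = 0.0820                  # g/100 mL, mean of replicates (hypothetical)

    components = {                     # standard uncertainties, same units as result
        "calibrator value": 0.0006,
        "method reproducibility": 0.0011,
        "sampling/dilution": 0.0005,
    }

    u_combined = math.sqrt(sum(u ** 2 for u in components.values()))
    k = 2.0                            # coverage factor for ~95% confidence
    U = k * u_combined
    print(f"result: {mean_bac:.4f} +/- {U:.4f} g/100 mL (k = {k:.0f})")
    ```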

  8. Uncertainty about probability: a decision analysis perspective

    Howard, R.A.

    1988-01-01

    The issue of how to think about uncertainty about probability is framed and analyzed from the viewpoint of a decision analyst. The failure of nuclear power plants is used as an example. The key idea is to think of probability as describing a state of information on an uncertain event, and to pose the issue of uncertainty in this quantity as uncertainty about a number that would be definitive: it has the property that you would assign it as the probability if you knew it. Logical consistency requires that the probability to assign to a single occurrence in the absence of further information be the mean of the distribution of this definitive number, not the median as is sometimes suggested. Any decision that must be made without the benefit of further information must also be made using the mean of the definitive number's distribution. With this formulation, it is further found that the probability of r occurrences in n exchangeable trials will depend on the first n moments of the definitive number's distribution. In making decisions, the expected value of clairvoyance on the occurrence of the event must be at least as great as that on the definitive number. If one of the events in question occurs, then the increase in probability of another such event is readily computed. This means, in terms of coin tossing, that unless one is absolutely sure of the fairness of a coin, seeing a head must increase the probability of heads, in distinction to usual thought. A numerical example for nuclear power shows that the failure of one plant of a group with a low probability of failure can significantly increase the probability that must be assigned to failure of a second plant in the group
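
    The coin-tossing point can be made concrete with a small sketch: give the definitive number (the long-run heads fraction) a Beta distribution, assign its mean as the probability of the next toss, and update on an observed head. The prior parameters are hypothetical.

    ```python
    # Probability of heads = mean of the distribution over the definitive
    # number; a Beta(a, b) prior makes the update after one head trivial.
    a, b = 5.0, 5.0                        # prior centered on 0.5, but not certain
    p_before = a / (a + b)

    a_post, b_post = a + 1.0, b            # Bayesian update after observing a head
    p_after = a_post / (a_post + b_post)

    print(f"P(head) before: {p_before:.3f}")          # 0.500
    print(f"P(head) after one head: {p_after:.3f}")   # 0.545 > 0.5, as argued above
    ```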

  9. Uncertainty analysis in the applications of nuclear probabilistic risk assessment

    Le Duy, T.D.

    2011-01-01

    The aim of this thesis is to propose an approach for modeling the parameter and model uncertainties affecting the results of risk indicators used in the applications of nuclear Probabilistic Risk Assessment (PRA). After studying the limitations of the traditional probabilistic approach to representing uncertainty in PRA models, a new approach based on the Dempster-Shafer theory has been proposed. The uncertainty analysis process of the proposed approach consists of five main steps. The first step aims to model input parameter uncertainties by belief and plausibility functions according to the data of the PRA model. The second step involves the propagation of parameter uncertainties through the risk model to lay out the uncertainties associated with the output risk indicators. The model uncertainty is then taken into account in the third step by considering possible alternative risk models. The fourth step is intended, firstly, to provide decision makers with the information needed for decision making under uncertainty (parametric and model) and, secondly, to identify the input parameters that contribute significant uncertainty to the result. The final step allows the process to be continued in a loop by studying the updating of belief functions given new data. The proposed methodology was implemented on a real but simplified application of a PRA model. (author)

  10. Including uncertainty in hazard analysis through fuzzy measures

    Bott, T.F.; Eisenhawer, S.W.

    1997-12-01

    This paper presents a method for capturing the uncertainty expressed by a Hazard Analysis (HA) expert team when estimating the frequencies and consequences of accident sequences, and provides a sound mathematical framework for propagating this uncertainty to the risk estimates for these accident sequences. The uncertainty is readily expressed as distributions that can visually aid the analyst in determining the extent and source of risk uncertainty in HA accident sequences. The results can also be expressed as single statistics of the distribution in a manner analogous to expressing a probabilistic distribution as a point-value statistic such as a mean or median. The study discussed here used data collected during the elicitation portion of an HA on a high-level waste transfer process to demonstrate the techniques for capturing uncertainty. These data came from observations of the uncertainty that HA team members expressed in assigning frequencies and consequences to accident sequences during an actual HA. This uncertainty was captured and manipulated using ideas from possibility theory. The result of this study is a practical method for displaying and assessing the uncertainty in the HA team estimates of the frequency and consequences of accident sequences. This uncertainty provides potentially valuable information about accident sequences that typically is lost in the HA process

  11. Uncertainty analysis technique for OMEGA Dante measurements

    May, M. J.; Widmann, K.; Sorce, C.; Park, H.-S.; Schneider, M.

    2010-10-01

    The Dante is an 18 channel x-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g., hohlraums, etc.) at x-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the x-ray diodes, filters and mirrors, and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.

  12. Uncertainty analysis technique for OMEGA Dante measurements

    May, M. J.; Widmann, K.; Sorce, C.; Park, H.-S.; Schneider, M.

    2010-01-01

    The Dante is an 18 channel x-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g., hohlraums, etc.) at x-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the x-ray diodes, filters and mirrors, and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.

  13. Uncertainty Analysis Technique for OMEGA Dante Measurements

    May, M.J.; Widmann, K.; Sorce, C.; Park, H.; Schneider, M.

    2010-01-01

    The Dante is an 18 channel X-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g. hohlraums, etc.) at X-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the X-ray diodes, filters and mirrors and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte-Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.

  14. Assessment and uncertainty analysis of groundwater risk.

    Li, Fawen; Zhu, Jingzhao; Deng, Xiyuan; Zhao, Yong; Li, Shaofei

    2018-01-01

    Groundwater with relatively stable quantity and quality is commonly used by human beings. However, with the over-mining of groundwater, problems such as groundwater funnels, land subsidence and salt-water intrusion have emerged. In order to avoid further deterioration of hydrogeological problems in over-mining regions, it is necessary to conduct an assessment of groundwater risk. In this paper, the risks of shallow and deep groundwater in the water intake area of the South-to-North Water Transfer Project in Tianjin, China, were evaluated. Firstly, two sets of four-level evaluation index systems were constructed based on the different characteristics of shallow and deep groundwater. Secondly, based on the normalized factor values and the synthetic weights, the risk values of shallow and deep groundwater were calculated. Lastly, the uncertainty of the groundwater risk assessment was analyzed by the indicator kriging method. The results meet the decision maker's demand for risk information and overcome the limitation of previous risk assessments being expressed as deterministic point estimates, which ignore the uncertainty of the assessment. Copyright © 2017 Elsevier Inc. All rights reserved.

  15. Uncertainty and sensitivity analysis in nuclear accident consequence assessment

    Karlberg, Olof.

    1989-01-01

    This report contains the results of a four-year project carried out under research contracts with the Nordic Cooperation in Nuclear Safety and the National Institute for Radiation Protection. An uncertainty/sensitivity analysis methodology consisting of Latin hypercube sampling and regression analysis was applied to an accident consequence model. A number of input parameters were selected and the uncertainties related to these parameters were estimated by a Nordic group of experts. Individual doses, collective dose, health effects and their related uncertainties were then calculated for three release scenarios and for a representative sample of meteorological situations. Two of the scenarios simulated the acute phase after an accident, and one the long-term consequences. The most significant parameters were identified. The outer limits of the calculated uncertainty distributions are large and grow to several orders of magnitude for the low-probability consequences. The uncertainty in the expectation values is typically a factor of 2-5 (1 sigma). The variation in the model responses due to the variation of the weather parameters is roughly equal to the variation induced by the parameter uncertainties. The most important parameters turned out to be different for each pathway of exposure, as could be expected. However, the overall most important parameters are the wet deposition coefficient and the shielding factors. A general discussion of the usefulness of uncertainty analysis in consequence analysis is also given. (au)

  16. Uncertainty analysis of geothermal energy economics

    Sener, Adil Caner

    This dissertation research endeavors to explore geothermal energy economics by assessing and quantifying the uncertainties associated with the nature of geothermal energy and energy investments overall. The study introduces a stochastic geothermal cost model and a valuation approach for different geothermal power plant development scenarios. The Monte Carlo simulation technique is employed to obtain probability distributions of geothermal energy development costs and project net present values. In the study a stochastic cost model with an incorporated dependence structure is defined and compared with a model in which the random variables are treated as independent inputs. One goal of the study is to shed light on the long-standing problem of modeling dependence between random input variables, which is addressed here by employing the method of copulas. The study focuses on four main types of geothermal power generation technology and introduces a stochastic levelized cost model for each. Moreover, the levelized costs of natural gas combined cycle and coal-fired power plants are also compared with those of geothermal power plants. The input data used in the model rely on cost data recently reported by government agencies and non-profit organizations, such as the Department of Energy, the National Laboratories, the California Energy Commission and the Geothermal Energy Association. The second part of the study introduces the stochastic discounted cash flow valuation model for the geothermal technologies analyzed in the first phase. In this phase of the study, the Integrated Planning Model (IPM) software was used to forecast the revenue streams of geothermal assets under different price and regulation scenarios. These results are then combined to create a stochastic revenue forecast for the power plants. The uncertainties in gas prices and environmental regulations will be modeled and their potential impacts will be
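
    A minimal sketch of one copula-based dependence model, a Gaussian copula: correlated standard normals are mapped to uniforms and then to the desired marginal distributions. The marginals, the correlation and the cost structure are hypothetical, not the dissertation's data.

    ```python
    # Gaussian copula for dependent cost inputs in a Monte Carlo cost model.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)
    n = 10_000
    rho = 0.6                                        # assumed dependence strength
    cov = np.array([[1.0, rho], [rho, 1.0]])

    z = rng.multivariate_normal([0.0, 0.0], cov, n)  # correlated standard normals
    u = stats.norm.cdf(z)                            # correlated uniforms in (0, 1)

    # Map uniforms through the inverse CDFs of hypothetical marginals.
    drill_cost = stats.lognorm(s=0.4, scale=5.0).ppf(u[:, 0])        # M$/well
    plant_cost = stats.triang(c=0.5, loc=60, scale=40).ppf(u[:, 1])  # M$

    total = 10 * drill_cost + plant_cost             # hypothetical 10-well project
    print(f"total cost: mean={total.mean():.1f} M$, "
          f"p5={np.percentile(total, 5):.1f}, p95={np.percentile(total, 95):.1f}")
    ```

    Setting `rho = 0.0` reproduces the independent-input model the abstract compares against; the spread of the total cost distribution widens as the positive dependence grows.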

  17. The EURACOS activation experiments: preliminary uncertainty analysis

    Yeivin, Y.

    1982-01-01

    A sequence of counting rates of an irradiated sulphur pellet, $r(t_i)$, measured at different times after the end of the irradiation, is fitted to $r(t) = A e^{-\lambda t} + B$. A standard adjustment procedure is applied to determine the parameters A and B, their standard deviations and correlation, and chi-square. It is demonstrated that if the counting-rate uncertainties are entirely due to the counting statistics, the experimental data are totally inconsistent with the "theoretical" model. However, assuming an additional systematic error of approximately 1%, and eliminating a few "bad" data points, produces a data set quite consistent with the model. The dependence of chi-square on the assumed systematic error and on the data elimination procedure is discussed in great detail. A review of the adjustment procedure is appended to the report
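
    A minimal sketch of the adjustment described above, assuming the decay constant is known so that $r(t) = A e^{-\lambda t} + B$ is linear in A and B: a weighted least-squares fit yields the parameters, their standard deviations and correlation, and the chi-square. The data, decay constant and noise model are synthetic, not the EURACOS measurements.

    ```python
    # Weighted linear least squares for r(t) = A*exp(-lam*t) + B with lam
    # known: estimates, covariance of (A, B), and the chi-square statistic.
    import numpy as np

    rng = np.random.default_rng(3)

    lam = 0.5                                  # known decay constant (1/h), hypothetical
    t = np.linspace(0.0, 10.0, 25)             # hours after end of irradiation
    true = 100.0 * np.exp(-lam * t) + 5.0
    sigma = np.sqrt(true)                      # Poisson-like counting uncertainty
    r = true + rng.normal(0.0, sigma)

    # Design matrix for the linear model r = A*exp(-lam*t) + B*1.
    X = np.column_stack([np.exp(-lam * t), np.ones_like(t)])
    W = np.diag(1.0 / sigma**2)                # weights from counting statistics

    cov = np.linalg.inv(X.T @ W @ X)           # covariance matrix of (A, B)
    A, B = cov @ X.T @ W @ r                   # weighted least-squares estimates
    sA, sB = np.sqrt(np.diag(cov))
    corr_AB = cov[0, 1] / (sA * sB)

    resid = (r - X @ np.array([A, B])) / sigma
    chi2 = float(resid @ resid)
    print(f"A = {A:.1f} +/- {sA:.1f}, B = {B:.2f} +/- {sB:.2f}, corr = {corr_AB:+.2f}")
    print(f"chi-square / dof = {chi2:.1f} / {len(t) - 2}")
    ```

    Inflating `sigma` by an extra systematic term (e.g. 1% of the rate, added in quadrature) and refitting reproduces the kind of chi-square study the abstract describes.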

  18. Uncertainty Analysis via Failure Domain Characterization: Unrestricted Requirement Functions

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2011-01-01

    This paper proposes an uncertainty analysis framework based on the characterization of the uncertain parameter space. This characterization enables the identification of worst-case uncertainty combinations and the approximation of the failure and safe domains with a high level of accuracy. Because these approximations are comprised of subsets of readily computable probability, they enable the calculation of arbitrarily tight upper and lower bounds to the failure probability. The methods developed herein, which are based on nonlinear constrained optimization, are applicable to requirement functions whose functional dependency on the uncertainty is arbitrary and whose explicit form may even be unknown. Some of the most prominent features of the methodology are the substantial desensitization of the calculations from the assumed uncertainty model (i.e., the probability distribution describing the uncertainty) as well as the accommodation for changes in such a model with a practically insignificant amount of computational effort.

  19. Uncertainty Analysis via Failure Domain Characterization: Polynomial Requirement Functions

    Crespo, Luis G.; Munoz, Cesar A.; Narkawicz, Anthony J.; Kenny, Sean P.; Giesy, Daniel P.

    2011-01-01

    This paper proposes an uncertainty analysis framework based on the characterization of the uncertain parameter space. This characterization enables the identification of worst-case uncertainty combinations and the approximation of the failure and safe domains with a high level of accuracy. Because these approximations are composed of subsets of readily computable probability, they enable the calculation of arbitrarily tight upper and lower bounds on the failure probability. A Bernstein expansion approach is used to size hyper-rectangular subsets, while a sum-of-squares programming approach is used to size quasi-ellipsoidal subsets. These methods are applicable to requirement functions whose functional dependency on the uncertainty is a known polynomial. Some of the most prominent features of the methodology are the substantial desensitization of the calculations to the assumed uncertainty model (i.e., the probability distribution describing the uncertainty) as well as the accommodation of changes in such a model with a practically insignificant amount of computational effort.
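
    The hyper-rectangular sizing rests on the range-enclosure property of the Bernstein form: on a box, a polynomial is bounded by the minimum and maximum of its Bernstein coefficients. A univariate sketch with an illustrative polynomial:

```python
from math import comb

def bernstein_bounds(a):
    """Range enclosure of p(x) = sum a[j] x**j on [0, 1] via Bernstein coefficients."""
    n = len(a) - 1
    b = [sum(comb(i, j) / comb(n, j) * a[j] for j in range(i + 1))
         for i in range(n + 1)]
    return min(b), max(b)   # p([0, 1]) is contained in [min b, max b]

# Illustrative polynomial requirement g(x) = 2x^3 - 3x^2 + 0.4
lo, hi = bernstein_bounds([0.4, 0.0, -3.0, 2.0])
print(lo, hi)   # lo > 0: safe on the whole box; hi < 0: failure on the whole box
```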

  20. Sensitivity and uncertainty analysis of the PATHWAY radionuclide transport model

    Otis, M.D.

    1983-01-01

    Procedures were developed for the uncertainty and sensitivity analysis of a dynamic model of radionuclide transport through human food chains. Uncertainty in model predictions was estimated by propagation of parameter uncertainties using a Monte Carlo simulation technique. Sensitivity of model predictions to individual parameters was investigated using the partial correlation coefficient of each parameter with model output. Random values produced for the uncertainty analysis were used in the correlation analysis for sensitivity. These procedures were applied to the PATHWAY model which predicts concentrations of radionuclides in foods grown in Nevada and Utah and exposed to fallout during the period of atmospheric nuclear weapons testing in Nevada. Concentrations and time-integrated concentrations of iodine-131, cesium-136, and cesium-137 in milk and other foods were investigated. 9 figs., 13 tabs
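
    The two ingredients named here, Monte Carlo propagation and partial correlation coefficients, can be sketched in a few lines; the toy model and lognormal parameter distributions below are illustrative, not PATHWAY itself:

```python
import numpy as np

def partial_corr(X, y, i):
    """Partial correlation of X[:, i] with y, controlling for the other columns."""
    others = np.delete(X, i, axis=1)
    A = np.column_stack([others, np.ones(len(y))])
    rx = X[:, i] - A @ np.linalg.lstsq(A, X[:, i], rcond=None)[0]
    ry = y - A @ np.linalg.lstsq(A, y, rcond=None)[0]
    return np.corrcoef(rx, ry)[0, 1]

rng = np.random.default_rng(1)
n = 5000
X = rng.lognormal(mean=0.0, sigma=[0.5, 0.3, 0.1], size=(n, 3))  # parameter samples
y = 2.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.1, n)        # toy model output

# Same random values serve both the uncertainty and the sensitivity analysis.
print("output 95% interval:", np.percentile(y, [2.5, 97.5]))
print("partial correlations:", [round(partial_corr(X, y, i), 3) for i in range(3)])
```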

  1. Nuclear data sensitivity/uncertainty analysis for XT-ADS

    Sugawara, Takanori; Sarotto, Massimo; Stankovskiy, Alexey; Van den Eynde, Gert

    2011-01-01

    Highlights: → Sensitivity and uncertainty analyses were performed to assess the reliability of the XT-ADS neutronic design. → The uncertainties deduced from the covariance data for the XT-ADS criticality were 0.94%, 1.9% and 1.1% with the SCALE 44-group, TENDL-2009 and JENDL-3.3 data, respectively. → Against a target accuracy of 0.3% Δk for the criticality, these uncertainties fall short. → To achieve this accuracy, the uncertainties should be reduced through experiments under adequate conditions. - Abstract: The XT-ADS, an accelerator-driven system for an experimental demonstration, has been investigated in the framework of the IP EUROTRANS FP6 project. In this study, sensitivity and uncertainty analyses were performed to assess the reliability of the XT-ADS neutronic design. The sensitivity analysis found that the sensitivity coefficients differed significantly between geometry models and calculation codes. The uncertainty analysis confirmed that the uncertainties deduced from the covariance data varied significantly with the choice of covariance library: for the XT-ADS criticality they were 0.94%, 1.9% and 1.1% with the SCALE 44-group, TENDL-2009 and JENDL-3.3 data, respectively. These uncertainties do not meet the target accuracy of 0.3% Δk for the criticality; achieving it will require uncertainties improved by experiments under adequate conditions.
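
    Uncertainties of this kind are obtained by folding a sensitivity profile with cross-section covariance data (the "sandwich rule"). A minimal sketch with invented three-group numbers; the actual analyses use full multigroup sensitivity profiles:

```python
import numpy as np

# Relative variance of k_eff is S^T C S, with S the sensitivity profile
# (dk/k per dsigma/sigma) and C the relative covariance of the cross sections.
S = np.array([0.12, -0.35, 0.08])          # illustrative sensitivities
C = np.array([[4.0, 1.0, 0.0],
              [1.0, 9.0, 0.5],
              [0.0, 0.5, 1.0]]) * 1e-4     # illustrative relative covariances

rel_var = S @ C @ S
print(f"uncertainty on k_eff: {np.sqrt(rel_var) * 100:.2f} %")
```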

  2. Complex Visual Data Analysis, Uncertainty, and Representation

    Schunn, Christian D; Saner, Lelyn D; Kirschenbaum, Susan K; Trafton, J. G; Littleton, Eliza B

    2007-01-01

    ... (weather forecasting, submarine target motion analysis, and fMRI data analysis). Internal spatial representations are coded from spontaneous gestures made during cued-recall summaries of problem solving activities...

  3. Uncertainty analysis with statistically correlated failure data

    Modarres, M.; Dezfuli, H.; Roush, M.L.

    1987-01-01

    Likelihood of occurrence of the top event of a fault tree or of sequences of an event tree is estimated from the failure probability of the components that constitute the events of the fault/event tree. Component failure probabilities are subject to statistical uncertainties. In addition, there are cases where the failure data are statistically correlated. At present most fault tree calculations are based on uncorrelated component failure data. This chapter describes a methodology for assessing the probability intervals for the top event failure probability of fault trees, or for the frequency of occurrence of event tree sequences, when event failure data are statistically correlated. To estimate the mean and variance of the top event, a second-order system moment method is presented through Taylor series expansion, which provides an alternative to the normally used Monte Carlo method. For cases where component failure probabilities are statistically correlated, the Taylor expansion terms are treated properly. A moment-matching technique is used to obtain the probability distribution function of the top event by fitting the Johnson S_B distribution. The computer program CORRELATE was developed to perform the calculations necessary for the implementation of the method developed. (author)
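
    A minimal sketch of the moment method for a two-event OR gate with correlated failure probabilities, checked against Monte Carlo; all numbers are illustrative, and the chapter's CORRELATE program of course handles general fault trees:

```python
import numpy as np

# Two-event OR gate: T(p1, p2) = p1 + p2 - p1*p2.
def top(p1, p2):
    return p1 + p2 - p1 * p2

mu = np.array([1e-3, 2e-3])         # mean failure probabilities (illustrative)
sd = np.array([5e-4, 8e-4])
rho = 0.7                           # statistical correlation between the events
cov = np.outer(sd, sd) * np.array([[1.0, rho], [rho, 1.0]])

# Taylor-series system moments, keeping the correlation term.
grad = np.array([1.0 - mu[1], 1.0 - mu[0]])    # dT/dp at the mean
mean_T = top(*mu) - cov[0, 1]                  # 2nd-order mean (d2T/dp1dp2 = -1)
var_T = grad @ cov @ grad                      # 1st-order variance

# Monte Carlo check (the alternative the chapter contrasts against).
rng = np.random.default_rng(2)
p = rng.multivariate_normal(mu, cov, size=200_000).clip(0.0, 1.0)
mc = top(p[:, 0], p[:, 1])
print(mean_T, var_T, mc.mean(), mc.var())
```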

  4. Aspects of uncertainty analysis in accident consequence modeling

    Travis, C.C.; Hoffman, F.O.

    1981-01-01

    Mathematical models are frequently used to determine probable dose to man from an accidental release of radionuclides by a nuclear facility. With increased emphasis on the accuracy of these models, the incorporation of uncertainty analysis has become one of the most crucial and sensitive components in evaluating the significance of model predictions. In the present paper, we address three aspects of uncertainty in models used to assess the radiological impact to humans: uncertainties resulting from the natural variability in human biological parameters; the propagation of parameter variability by mathematical models; and comparison of model predictions to observational data

  5. New challenges on uncertainty propagation assessment of flood risk analysis

    Martins, Luciano; Aroca-Jiménez, Estefanía; Bodoque, José M.; Díez-Herrero, Andrés

    2016-04-01

    Natural hazards such as floods cause considerable damage to human life and to material and functional assets every year, around the world. Risk assessment procedures carry a set of uncertainties of two main types: natural, derived from the stochastic character inherent in flood process dynamics; and epistemic, associated with a lack of knowledge or with inadequate procedures employed in the study of these processes. There is an abundant scientific and technical literature on uncertainty estimation in each step of flood risk analysis (e.g. rainfall estimates, hydraulic modelling variables), but very little experience on the propagation of uncertainties along the full flood risk assessment. Epistemic uncertainties are therefore the main goal of this work: in particular, understanding the extent to which uncertainties propagate throughout the process, from inundation studies to risk analysis, and how much a proper flood risk analysis varies as a result. Methodologies such as Polynomial Chaos Theory (PCT), the Method of Moments or Monte Carlo are used to evaluate different sources of error, such as data records (precipitation gauges, flow gauges, ...), hydrologic and hydraulic modelling (inundation estimation) and socio-demographic data (damage estimation), in order to evaluate the uncertainty propagation (UP) in design flood risk estimation, in both numerical and cartographic expression. In order to account for the total uncertainty and to understand which factors contribute most to it, we used Polynomial Chaos Theory (PCT). It represents an interesting way to handle the inclusion of uncertainty in the modelling and simulation process: PCT allows the development of a probabilistic model of the system in a deterministic setting, using random variables and polynomials to handle the effects of uncertainty. Results of the method's application show better robustness than traditional analysis
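
    The PCT idea can be sketched non-intrusively in one dimension: expand the model output in Hermite polynomials of a standard-normal input and read the moments off the coefficients. The "model" below is a stand-in function, not the authors' hydrologic-hydraulic chain:

```python
import numpy as np
from numpy.polynomial.hermite_e import hermevander
from math import factorial

# Toy "hydraulic model": peak flood depth as a nonlinear function of a
# standard-normal rainfall forcing xi (illustrative stand-in for a simulator).
def depth(xi):
    return 2.0 + 0.8 * xi + 0.3 * np.sin(xi)

rng = np.random.default_rng(3)
xi = rng.standard_normal(500)
y = depth(xi)

# Non-intrusive PCE: regress the output on probabilists' Hermite polynomials.
deg = 5
c, *_ = np.linalg.lstsq(hermevander(xi, deg), y, rcond=None)

# For He_k and standard-normal xi: E[y] = c_0, Var[y] = sum_{k>=1} c_k^2 * k!
mean = c[0]
var = sum(c[k]**2 * factorial(k) for k in range(1, deg + 1))
print(mean, var, y.mean(), y.var())
```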

  6. Application of uncertainty analysis in conceptual fusion reactor design

    Wu, T.; Maynard, C.W.

    1979-01-01

    The theories of sensitivity and uncertainty analysis are described and applied to a new conceptual tokamak fusion reactor design--NUWMAK. The responses investigated in this study include the tritium breeding ratio, first wall Ti dpa and gas production, nuclear heating in the blanket, energy leakage to the magnet, and the dpa rate in the superconducting magnet aluminum stabilizer. The sensitivities and uncertainties of these responses are calculated. The cost/benefit feature of proposed integral measurements is also studied through the uncertainty reductions of these responses

  7. Sensitivity and uncertainty analysis of NET/ITER shielding blankets

    Hogenbirk, A.; Gruppelaar, H.; Verschuur, K.A.

    1990-09-01

    Results are presented of sensitivity and uncertainty calculations based upon the European fusion file (EFF-1). The effect of uncertainties in Fe, Cr and Ni cross sections on the nuclear heating in the coils of a NET/ITER shielding blanket has been studied. The analysis has been performed for the total cross section as well as partial cross sections. The correct expression for the sensitivity profile was used, including the gain term. The resulting uncertainty in the nuclear heating lies between 10 and 20 per cent. (author). 18 refs.; 2 figs.; 2 tabs

  8. Improved Monte Carlo Method for PSA Uncertainty Analysis

    Choi, Jongsoo

    2016-01-01

    The treatment of uncertainty is an important issue for regulatory decisions. Uncertainties exist from knowledge limitations. A probabilistic approach has exposed some of these limitations and provided a framework to assess their significance and assist in developing a strategy to accommodate them in the regulatory process. The uncertainty analysis (UA) is usually based on the Monte Carlo method. This paper proposes a Monte Carlo UA approach to calculate mean risk metrics accounting for the state-of-knowledge correlation (SOKC) between basic events (including CCFs) using efficient random number generators, and to meet Capability Category III of the ASME/ANS PRA standard. Audit calculations are needed in PSA regulatory reviews of uncertainty analysis results submitted for licensing. The proposed Monte Carlo UA approach provides a high degree of confidence in PSA reviews. All PSAs need to account for the SOKC between event probabilities to meet the ASME/ANS PRA standard.
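
    A sketch of why the SOKC matters for mean risk metrics: identical components whose epistemically uncertain failure probability must be sampled with the same draw rather than independently, since the mean of a product picks up E[p^2] > E[p]^2. Numbers are illustrative:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 200_000

# Epistemic distribution for the failure probability of one valve type
# (illustrative lognormal). Two identical valves in parallel (AND gate).
p_shared = rng.lognormal(mean=np.log(1e-3), sigma=0.8, size=n).clip(max=1.0)

# With the SOKC: both valves use the SAME sampled probability.
mean_sokc = np.mean(p_shared * p_shared)

# Ignoring the SOKC: independent draws underestimate the mean system metric.
p_indep = rng.lognormal(mean=np.log(1e-3), sigma=0.8, size=n).clip(max=1.0)
mean_indep = np.mean(p_shared * p_indep)

print(mean_sokc, mean_indep)   # mean_sokc is noticeably larger
```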

  9. Uncertainty Analysis of Seebeck Coefficient and Electrical Resistivity Characterization

    Mackey, Jon; Sehirlioglu, Alp; Dynys, Fred

    2014-01-01

    In order to provide a complete description of a material's thermoelectric power factor, an uncertainty interval is required in addition to the measured nominal value. The uncertainty may contain sources of measurement error including systematic bias error and precision error of a statistical nature. The work focuses specifically on the popular ZEM-3 (Ulvac Technologies) measurement system, but the methods apply to any measurement system. The analysis accounts for sources of systematic error including sample preparation tolerance, measurement probe placement, thermocouple cold-finger effect, and measurement parameters, in addition to uncertainty of a statistical nature. Complete uncertainty analysis of a measurement system allows for more reliable comparison of measurement data between laboratories.
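
    A sketch of how such an interval might be assembled, combining a t-based precision term with assumed systematic components in quadrature; all magnitudes are invented for illustration:

```python
import numpy as np
from scipy import stats

# Repeated Seebeck-coefficient readings (illustrative values, uV/K).
readings = np.array([152.1, 151.4, 153.0, 152.6, 151.9])

# Precision (random) uncertainty: t-based 95% interval on the mean.
n = len(readings)
u_rand = stats.t.ppf(0.975, n - 1) * readings.std(ddof=1) / np.sqrt(n)

# Systematic (bias) components combined in quadrature, e.g. probe placement,
# cold-finger effect, geometry tolerance (assumed magnitudes, uV/K).
bias = np.array([1.2, 0.8, 0.5])
u_sys = np.sqrt(np.sum(bias**2))

u_total = np.sqrt(u_rand**2 + u_sys**2)
print(f"{readings.mean():.1f} +/- {u_total:.1f} uV/K (95%)")
```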

  10. Uncertainty Instability Risk Analysis of High Concrete Arch Dam Abutments

    Xin Cao

    2017-01-01

    The uncertainties associated with concrete arch dams rise with increasing dam height. Given the uncertainties in the influencing factors, the stability of high arch dam abutments was studied as a fuzzy random event. Given the randomness and fuzziness of the calculation parameters as well as of the failure criterion, hazard point and hazard surface uncertainty instability risk ratio models were proposed for high arch dam abutments on the basis of credibility theory. The uncertainty instability failure criterion was derived through analysis of the progressive instability failure process on the basis of Shannon's entropy theory. The uncertainties associated with influencing factors were quantized by probability or possibility distribution assignments. Gaussian random theory was used to generate random realizations for influence factors with spatial variability. The uncertainty stability analysis method was proposed by combining finite element analysis and the limit equilibrium method. The instability risk ratio was calculated using the Monte Carlo simulation method and fuzzy random postprocessing. Results corroborate that the modeling approach is sound and that the calculation method is feasible.

  11. The explicit treatment of model uncertainties in the presence of aleatory and epistemic parameter uncertainties in risk and reliability analysis

    Ahn, Kwang Il; Yang, Joon Eon

    2003-01-01

    In the risk and reliability analysis of complex technological systems, the primary concern of formal uncertainty analysis is to understand why uncertainties arise and to evaluate how they impact the results of the analysis. In recent times, many uncertainty analyses have focused on the parameters of the risk and reliability models, whose values are uncertain in an aleatory or an epistemic way. As the field of parametric uncertainty analysis matures, however, more attention is being paid to the explicit treatment of uncertainties that reside in the predictive model itself, as well as in the accuracy of the predictive model. The essential steps for evaluating the impacts of these model uncertainties in the presence of parameter uncertainties are to determine rigorously the various sources of uncertainty to be addressed in the underlying model itself and, in turn, in the model parameters, based on our state of knowledge and relevant evidence. Answering clearly the question of how to characterize and explicitly treat the foregoing sources of uncertainty is particularly important for practical aspects such as the risk and reliability optimization of systems, as well as for more transparent risk information and decision-making under various uncertainties. The main purpose of this paper is to provide practical guidance for quantitatively treating the various model uncertainties that are often encountered in the risk and reliability modeling of complex technological systems

  12. Uncertainty analysis of LBLOCA for Advanced Heavy Water Reactor

    Srivastava, A.; Lele, H.G.; Ghosh, A.K.; Kushwaha, H.S.

    2008-01-01

    The main objective of safety analysis is to demonstrate in a robust way that all safety requirements are met, i.e. that sufficient margins exist between the real values of important parameters and the threshold values at which damage to the barriers against release of radioactivity would occur. As stated in the IAEA Safety Requirements for Design of NPPs, 'a safety analysis of the plant design shall be conducted in which methods of both deterministic and probabilistic analysis shall be applied'. It is required that 'the computer programs, analytical methods and plant models used in the safety analysis shall be verified and validated, and adequate consideration shall be given to uncertainties'. Uncertainties are present in calculations due to the computer codes, initial and boundary conditions, plant state, fuel parameters, scaling and the numerical solution algorithm. Conservative approaches, still widely used, were introduced to cover uncertainties due to the limited capability for modelling and understanding of physical phenomena at the early stages of safety analysis. The results obtained by this approach are quite unrealistic and the level of conservatism is not fully known. Another approach is the use of Best Estimate (BE) codes with realistic initial and boundary conditions. If this approach is selected, it should be based on statistically combined uncertainties for plant initial and boundary conditions, assumptions and code models. The current trend is toward best-estimate codes with some conservative assumptions, realistic input data and uncertainty analysis. BE analysis with evaluation of uncertainties offers, in addition, a way to quantify the existing plant safety margins. Its broader use in the future is therefore envisaged, even though it is not always feasible because of the difficulty of quantifying code uncertainties with a sufficiently narrow range for every phenomenon and for each accident sequence. In this paper
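
    One widely used statistical device in this best-estimate-plus-uncertainty setting (not necessarily the one used here) is Wilks' formula, which fixes the number of code runs needed for a non-parametric tolerance-limit statement:

```python
from math import ceil, log

def wilks_n(coverage=0.95, confidence=0.95):
    """Smallest number of code runs so that the largest output bounds the
    `coverage` quantile with the given one-sided confidence (first-order
    Wilks formula): smallest n with 1 - coverage**n >= confidence."""
    return ceil(log(1.0 - confidence) / log(coverage))

print(wilks_n())              # 59 runs for the classic 95%/95% statement
print(wilks_n(0.95, 0.99))    # 90 runs for 95%/99%
```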

  13. Dealing with phenomenological uncertainty in risk analysis

    Theofanous, T.G.

    1994-01-01

    The Risk-Oriented Accident Analysis Methodology (ROAAM) is summarized and developed further towards a formal definition. The key ideas behind the methodology and these more formal aspects are also presented and discussed

  14. The role of sensitivity analysis in assessing uncertainty

    Crick, M.J.; Hill, M.D.

    1987-01-01

    Outside the specialist world of those carrying out performance assessments, considerable confusion has arisen about the meanings of sensitivity analysis and uncertainty analysis. In this paper we attempt to reduce this confusion. We then go on to review approaches to sensitivity analysis within the context of assessing uncertainty, and to outline the types of tests available to identify sensitive parameters, together with their advantages and disadvantages. The views expressed in this paper are those of the authors; they have not been formally endorsed by the National Radiological Protection Board and should not be interpreted as Board advice

  15. Uncertainty analysis in the task of individual monitoring data

    Molokanov, A.; Badjin, V.; Gasteva, G.; Antipin, E.

    2003-01-01

    Assessment of internal doses is an essential component of individual monitoring programmes for workers and consists of two stages: individual monitoring measurements and interpretation of the monitoring data in terms of annual intake and/or annual internal dose. The overall uncertainty in the assessed dose is a combination of the uncertainties in these stages. An algorithm and a computer code were developed for estimating the uncertainty in the assessment of internal dose in the task of individual monitoring data interpretation. Two main influencing factors are analysed in this paper: the unknown time of the exposure and the variability of bioassay measurements. The aim of this analysis is to show that the algorithm is applicable in designing an individual monitoring programme for workers, so as to guarantee that the individual dose calculated from individual monitoring measurements does not exceed a required limit with a certain confidence probability. (author)

  16. Analysis and Reduction of Complex Networks Under Uncertainty

    Knio, Omar M

    2014-04-09

    This is a collaborative proposal that aims at developing new methods for the analysis and reduction of complex multiscale networks under uncertainty. The approach is based on combining methods of computational singular perturbation (CSP) and probabilistic uncertainty quantification. In deterministic settings, CSP yields asymptotic approximations of reduced-dimensionality “slow manifolds” on which a multiscale dynamical system evolves. Introducing uncertainty raises fundamentally new issues, particularly concerning its impact on the topology of slow manifolds, and means to represent and quantify associated variability. To address these challenges, this project uses polynomial chaos (PC) methods to reformulate uncertain network models, and to analyze them using CSP in probabilistic terms. Specific objectives include (1) developing effective algorithms that can be used to illuminate fundamental and unexplored connections among model reduction, multiscale behavior, and uncertainty, and (2) demonstrating the performance of these algorithms through applications to model problems.

  17. Uncertainty analysis of power monitoring transit time ultrasonic flow meters

    Orosz, A.; Miller, D. W.; Christensen, R. N.; Arndt, S.

    2006-01-01

    A general uncertainty analysis is applied to chordal, transit time ultrasonic flow meters that are used in nuclear power plant feedwater loops. This investigation focuses on relationships between the major parameters of the flow measurement. For this study, mass flow rate is divided into three components, profile factor, density, and a form of volumetric flow rate. All system parameters are used to calculate values for these three components. Uncertainty is analyzed using a perturbation method. Sensitivity coefficients for major system parameters are shown, and these coefficients are applicable to a range of ultrasonic flow meters used in similar applications. Also shown is the uncertainty to be expected for density along with its relationship to other system uncertainties. One other conclusion is that pipe diameter sensitivity coefficients may be a function of the calibration technique used. (authors)

  18. Bayesian analysis for uncertainty estimation of a canopy transpiration model

    Samanta, S.; Mackay, D. S.; Clayton, M. K.; Kruger, E. L.; Ewers, B. E.

    2007-04-01

    A Bayesian approach was used to fit a conceptual transpiration model to half-hourly transpiration rates for a sugar maple (Acer saccharum) stand collected over a 5-month period and probabilistically estimate its parameter and prediction uncertainties. The model used the Penman-Monteith equation with the Jarvis model for canopy conductance. This deterministic model was extended by adding a normally distributed error term. This extension enabled using Markov chain Monte Carlo simulations to sample the posterior parameter distributions. The residuals revealed approximate conformance to the assumption of normally distributed errors. However, minor systematic structures in the residuals at fine timescales suggested model changes that would potentially improve the modeling of transpiration. Results also indicated considerable uncertainties in the parameter and transpiration estimates. This simple methodology of uncertainty analysis would facilitate the deductive step during the development cycle of deterministic conceptual models by accounting for these uncertainties while drawing inferences from data.
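
    A minimal random-walk Metropolis sketch of the approach: a deterministic model extended with a normally distributed error term, whose posterior is sampled by MCMC. The model below is a toy stand-in for the Penman-Monteith/Jarvis structure:

```python
import numpy as np

# Toy conductance model y = g_max * exp(-k * D) with Gaussian residuals.
rng = np.random.default_rng(5)
D = rng.uniform(0.5, 3.0, 200)                     # vapour pressure deficit
y = 0.8 * np.exp(-0.6 * D) + rng.normal(0, 0.02, 200)

def log_post(theta):
    g_max, k, sigma = theta
    if g_max <= 0 or k <= 0 or sigma <= 0:
        return -np.inf                             # flat priors on (0, inf)
    r = y - g_max * np.exp(-k * D)
    return -0.5 * np.sum((r / sigma) ** 2) - len(y) * np.log(sigma)

# Random-walk Metropolis sampling of the posterior.
theta = np.array([1.0, 1.0, 0.1])
lp = log_post(theta)
chain = []
for _ in range(20_000):
    prop = theta + rng.normal(0, [0.02, 0.03, 0.002])
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:       # accept/reject step
        theta, lp = prop, lp_prop
    chain.append(theta)

print(np.mean(chain[5000:], axis=0))               # posterior means after burn-in
```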

  19. Nordic reference study on uncertainty and sensitivity analysis

    Hirschberg, S.; Jacobsson, P.; Pulkkinen, U.; Porn, K.

    1989-01-01

    This paper provides a review of the first phase of the Nordic reference study on uncertainty and sensitivity analysis. The main objective of this study is to use experiences from previous Nordic Benchmark Exercises and reference studies concerning critical modeling issues, such as common cause failures and human interactions, and to demonstrate the impact of the associated uncertainties on the uncertainty of the investigated accident sequence. This has been done independently by three working groups which used different approaches to modeling and to uncertainty analysis. The estimated uncertainty interval for the analyzed accident sequence is large. The discrepancies between the groups are also substantial but can be explained. Sensitivity analyses have been carried out concerning, e.g., the use of different CCF-quantification models, alternative handling of CCF data, time windows for operator actions and time dependences in phased-mission operation, the impact of state-of-knowledge dependences, and the ranking of dominating uncertainty contributors. Specific findings with respect to these issues are summarized in the paper

  1. Representing Uncertainty on Model Analysis Plots

    Smith, Trevor I.

    2016-01-01

    Model analysis provides a mechanism for representing student learning as measured by standard multiple-choice surveys. The model plot contains information regarding both how likely students in a particular class are to choose the correct answer and how likely they are to choose an answer consistent with a well-documented conceptual model.…

  2. Design optimization and uncertainty analysis of SMA morphing structures

    Oehler, S D; Hartl, D J; Lopez, R; Malak, R J; Lagoudas, D C

    2012-01-01

    The continuing implementation of shape memory alloys (SMAs) as lightweight solid-state actuators in morphing structures has now motivated research into finding optimized designs for use in aerospace control systems. This work proposes methods that use iterative analysis techniques to determine optimized designs for morphing aerostructures and consider the impact of uncertainty in model variables on the solution. A combination of commercially available and custom coded tools is utilized. ModelCenter, a suite of optimization algorithms and simulation process management tools, is coupled with the Abaqus finite element analysis suite and a custom SMA constitutive model to assess morphing structure designs in an automated fashion. The chosen case study involves determining the optimized configuration of a morphing aerostructure assembly that includes SMA flexures. This is accomplished by altering design inputs representing the placement of active components to minimize a specified cost function. An uncertainty analysis is also conducted using design of experiment methods to determine the sensitivity of the solution to a set of uncertainty variables. This second study demonstrates the effective use of Monte Carlo techniques to simulate the variance of model variables representing the inherent uncertainty in component fabrication processes. This paper outlines the modeling tools used to execute each case study, details the procedures for constructing the optimization problem and uncertainty analysis, and highlights the results from both studies. (paper)

  3. Uncertainty analysis of light water reactor unit fuel pin cells

    Kamerow, S.; Ivanov, K., E-mail: sln107@PSU.EDU, E-mail: kni1@PSU.EDU [Department of Mechanical and Nuclear Engineering, The Pennsylvania State University, PA (United States); Moreno, C. Arenas, E-mail: cristina.arenas@UPC.EDU [Department of Physics and Nuclear Engineering, Technical University of Catalonia, Barcelona (Spain)

    2011-07-01

    The study explored the calculation of uncertainty based on available covariance data and computational tools. Uncertainty due to temperature changes and different fuel compositions is the main focus of this analysis. Selected unit fuel pin cells were analyzed according to the OECD LWR UAM benchmark specifications. Criticality and uncertainty analyses were performed using the TSUNAMI-1D sequence in SCALE 6.0. It was found that uncertainties increase with increasing temperature while k_eff decreases. This increase in the uncertainty is due to the increase in sensitivity of the largest contributor of uncertainty, namely the ²³⁸U (n, gamma) reaction. The sensitivity grew larger as the capture cross-section of ²³⁸U expanded due to Doppler broadening. In addition, three different fuel compositions (UOx, MOx, and UOx-Gd₂O₃) were analyzed, showing a remarkable increase in the uncertainty in k_eff for the MOx and UOx-Gd₂O₃ fuel cells. The increase in the uncertainty of k_eff in UOx-Gd₂O₃ fuel was nearly twice that in MOx fuel and almost four times that in UOx fuel. The components of the uncertainty in k_eff in each case were examined, and it was found that the ²³⁸U neutron-nuclide reaction, mainly (n,n'), contributed the most to the uncertainties in the MOx and UOx-Gd₂O₃ cases. At higher energy, the covariance matrix of the ²³⁸U (n,n') to ²³⁸U (n,n') cross-section showed very large values. Further examination of the UOx-Gd₂O₃ case found that ²³⁸U (n,n') became the dominant contributor to the uncertainty because most of the thermal neutrons in the cell were absorbed by gadolinium, shifting the neutron spectrum to higher energy. For the MOx case, on the other hand, ²³⁹Pu has a very strong absorption cross-section at low energy

  4. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. Analysis of the most important sources of variability in quantitative microbiological methods demonstrated no effect of culture media and plate-count techniques on the estimation of microbial count, while a highly significant effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpreting errors) was established. The most appropriate method of statistical analysis for such data was ANOVA, which enabled not only the effect of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, appropriate for traditional plate-count methods.

  5. Uncertainty on faecal analysis on dose assessment

    Juliao, Ligia M.Q.C.; Melo, Dunstana R.; Sousa, Wanderson de O.; Santos, Maristela S.; Fernandes, Paulo Cesar P. [Instituto de Radioprotecao e Dosimetria, Comissao Nacional de Energia Nuclear, Av. Salvador Allende s/n. Via 9, Recreio, CEP 22780-160, Rio de Janeiro, RJ (Brazil)

    2007-07-01

    Monitoring programmes for internal dose assessment may need a combination of bioassay techniques, e.g. urine and faecal analysis, especially in workplaces where compounds of different solubilities are handled and in cases of accidental intake. Faecal analysis may provide important data for assessing committed effective dose due to exposure to insoluble compounds, since the activity excreted in urine may not be detectable unless a very sensitive measurement system is available. This paper discusses the variability of daily faecal excretion based on data from a single daily collection and from collection over three consecutive days, with samples analysed individually or as a pool. The results suggest that just one day of collection is not appropriate for dose assessment, since the 24 h uranium excretion may vary by a factor of 40. On the basis of this analysis, the recommendation is faecal collection over three consecutive days with the samples analysed as a pool, which is more economical and faster. (authors)

  6. A Peep into the Uncertainty-Complexity-Relevance Modeling Trilemma through Global Sensitivity and Uncertainty Analysis

    Munoz-Carpena, R.; Muller, S. J.; Chu, M.; Kiker, G. A.; Perz, S. G.

    2014-12-01

    Model complexity resulting from the need to integrate environmental system components cannot be overstated. In particular, additional emphasis is urgently needed on rational approaches to guide decision making through the uncertainties surrounding the integrated system across decision-relevant scales. However, in spite of the difficulties that the consideration of modeling uncertainty represents for the decision process, it should not be avoided, or the value of and science behind the models will be undermined. These two issues, i.e. the need for coupled models that can answer the pertinent questions and the need for models that do so with sufficient certainty, are the key indicators of a model's relevance. Model relevance is inextricably linked with model complexity. Although model complexity has advanced greatly in recent years, there has been little work to rigorously characterize the threshold of relevance in integrated and complex models. Formally assessing the relevance of a model in the face of increasing complexity would be valuable, because there is growing unease among developers and users of complex models about the cumulative effects of various sources of uncertainty on model outputs. In particular, this issue has prompted doubt over whether the considerable effort going into further elaborating complex models will in fact yield the expected payback. New approaches have been proposed recently to evaluate the uncertainty-complexity-relevance modeling trilemma (Muller, Muñoz-Carpena and Kiker, 2011) by incorporating state-of-the-art global sensitivity and uncertainty analysis (GSA/UA) in every step of model development, so as to quantify not only the uncertainty introduced by the addition of new environmental components, but also the effect that these new components have on existing components (interactions, non-linear responses). Outputs from the analysis can also be used to quantify system resilience (stability, alternative states, thresholds or tipping points
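
    Variance-based GSA of the kind invoked here can be sketched with pick-freeze estimators of the first-order and total-effect Sobol indices; the test function is an Ishigami-like example, not the authors' models:

```python
import numpy as np

# Illustrative model with interacting inputs (Ishigami-like behaviour).
def model(x):
    return (np.sin(x[:, 0]) + 7.0 * np.sin(x[:, 1])**2
            + 0.1 * x[:, 2]**4 * np.sin(x[:, 0]))

rng = np.random.default_rng(6)
n, d = 100_000, 3
A = rng.uniform(-np.pi, np.pi, (n, d))
B = rng.uniform(-np.pi, np.pi, (n, d))
yA, yB = model(A), model(B)
var_y = np.var(np.concatenate([yA, yB]))

for i in range(d):
    AB = B.copy()
    AB[:, i] = A[:, i]                              # B with column i from A
    yAB = model(AB)
    S_i = np.mean(yA * (yAB - yB)) / var_y          # first-order index
    T_i = np.mean((yB - yAB)**2) / (2.0 * var_y)    # total-effect index (Jansen)
    print(f"input {i + 1}: S = {S_i:.2f}, T = {T_i:.2f}")
```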

  7. Planning for robust reserve networks using uncertainty analysis

    Moilanen, A.; Runge, M.C.; Elith, Jane; Tyre, A.; Carmel, Y.; Fegraus, E.; Wintle, B.A.; Burgman, M.; Ben-Haim, Y.

    2006-01-01

    Planning land-use for biodiversity conservation frequently involves computer-assisted reserve selection algorithms. Typically such algorithms operate on matrices of species presence/absence in sites, or on species-specific distributions of model-predicted probabilities of occurrence in grid cells. There are practically always errors in input data: erroneous species presence/absence data, structural and parametric uncertainty in predictive habitat models, and lack of correspondence between temporal presence and long-run persistence. Despite these uncertainties, typical reserve selection methods proceed as if there were no uncertainty in the data or models. Having two conservation options of apparently equal biological value, one would prefer the option whose value is relatively insensitive to errors in planning inputs. In this work we show how uncertainty analysis for reserve planning can be implemented within a framework of information-gap decision theory, generating reserve designs that are robust to uncertainty. Consideration of uncertainty involves modifications to the typical objective functions used in reserve selection. Search for robust-optimal reserve structures can still be implemented via typical reserve selection optimization techniques, including stepwise heuristics, integer programming and stochastic global search.

  8. Statistically based uncertainty assessments in nuclear risk analysis

    Spencer, F.W.; Diegert, K.V.; Easterling, R.G.

    1987-01-01

    Over the last decade, the problems of estimation and uncertainty assessment in probabilistic risk assessments (PRAs) have been addressed in a variety of NRC and industry-sponsored projects. These problems have received attention because of a recognition that major uncertainties in risk estimation exist, which can be reduced by collecting more and better data and other information, and because of a recognition that better methods for assessing these uncertainties are needed. In particular, a clear understanding of the nature and magnitude of the various sources of uncertainty is needed to facilitate decision-making on possible plant changes and research options. Recent PRAs have employed methods of probability propagation, sometimes involving the use of Bayes' Theorem, intended to formalize the use of ''engineering judgment'' or ''expert opinion.'' All sources of uncertainty are expressed probabilistically, so that uncertainty analysis becomes simply a matter of probability propagation. Alternatives to forcing a probabilistic framework at all stages of a PRA are a major concern in this paper, however

  9. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, appendices A and B

    Harper, F.T.; Young, M.L.; Miller, L.A.; Hora, S.C.; Lui, C.H.; Goossens, L.H.J.; Cooke, R.M.; Paesler-Sauer, J.; Helton, J.C.

    1995-01-01

    Two new probabilistic accident consequence codes, MACCS and COSYMA, completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters; the experts developed their distributions independently. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the second of a three-volume document describing the project and contains two appendices describing the rationales for the dispersion and deposition data along with short biographies of the 16 experts who participated in the project

  10. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, main report

    Harper, F.T.; Young, M.L.; Miller, L.A.; Hora, S.C.; Lui, C.H.; Goossens, L.H.J.; Cooke, R.M.; Paesler-Sauer, J.; Helton, J.C.

    1995-01-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The ultimate objective of the joint effort was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. Experts developed their distributions independently. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. To validate the distributions generated for the dispersion code input variables, samples were taken from the distributions and propagated through the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the first of a three-volume document describing the project

  11. Geological-structural models used in SR 97. Uncertainty analysis

    Saksa, P.; Nummela, J. [FINTACT Oy (Finland)

    1998-10-01

    The uncertainty of geological-structural models was studied for the three sites in SR 97, called Aberg, Beberg and Ceberg. The evaluation covered both regional and site scale models, the emphasis being placed on fracture zones at the site scale. Uncertainty is a natural feature of all geoscientific investigations. It originates from measurements (errors in data, sampling limitations, scale variation) and from conceptualisation (structural geometries and properties, ambiguous geometric or parametric solutions), to name the major sources. The structures of A-, B- and Ceberg are fracture zones of varying types. No major differences in the conceptualisation between the sites were noted. One source of uncertainty in the site models is the non-existence of fracture and zone information at scales from 10 to 300 - 1000 m. At Aberg the development of the regional model has been performed very thoroughly. At the site scale, one major source of uncertainty is that a clear definition of the target area is missing. Structures encountered in the boreholes are well explained, and an interdisciplinary approach to interpretation has been taken. The Beberg and Ceberg regional models contain relatively large uncertainties due to the investigation methodology and the experience available at that time. At the site scale, six additional structures were proposed for both Beberg and Ceberg in variant analyses of these sites. Both sites include uncertainty in the form of many non-interpreted fractured sections along the boreholes. Statistical analysis gives high occurrences of structures for all three sites: typically 20 - 30 structures/km³. Aberg has the highest structural frequency, Beberg comes next and Ceberg has the lowest. The borehole configuration, orientations and surveying goals were inspected to find whether preferences or factors causing bias were present. Data from Aberg support the conclusion that the Aespoe sub-volume would be an anomalously fractured, tectonised unit of its own. This means that

  12. Error Analysis of CM Data Products: Sources of Uncertainty

    Hunt, Brian D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Eckert-Gallup, Aubrey Celia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Cochran, Lainy Dromgoole [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kraus, Terrence D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Allen, Mark B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Beal, Bill [National Security Technologies, Joint Base Andrews, MD (United States); Okada, Colin [National Security Technologies, LLC. (NSTec), Las Vegas, NV (United States); Simpson, Mathew [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-02-01

    The goal of this project is to address the current inability to assess the overall error and uncertainty of data products developed and distributed by DOE’s Consequence Management (CM) Program. This is a widely recognized shortfall, the resolution of which would provide a great deal of value and defensibility to the analysis results, data products, and the decision-making process that follows this work. A global approach to this problem is necessary because multiple sources of error and uncertainty contribute to the ultimate production of CM data products. Therefore, this project will require collaboration with subject matter experts across a wide range of FRMAC skill sets in order to quantify the types of uncertainty that each area of the CM process might contain and to understand how variations in these uncertainty sources contribute to the aggregated uncertainty present in CM data products. The ultimate goal of this project is to quantify the confidence level of CM products to ensure that appropriate public and worker protection decisions are supported by defensible analysis.

  13. Probabilistic accident consequence uncertainty analysis -- Early health effects uncertainty assessment. Volume 2: Appendices

    Haskin, F.E. [Univ. of New Mexico, Albuquerque, NM (United States); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Goossens, L.H.J.; Kraan, B.C.P. [Delft Univ. of Technology (Netherlands)

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA early health effects models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on early health effects, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  14. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for deposited material and external doses. Volume 2: Appendices

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Boardman, J. [AEA Technology (United Kingdom); Jones, J.A. [National Radiological Protection Board (United Kingdom); Harper, F.T.; Young, M.L. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA deposited material and external dose models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on deposited material and external doses, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  15. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for internal dosimetry. Volume 2: Appendices

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Harrison, J.D. [National Radiological Protection Board (United Kingdom); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

    1998-04-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA internal dosimetry models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on internal dosimetry, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  16. Estimating annual bole biomass production using uncertainty analysis

    Travis J. Woolley; Mark E. Harmon; Kari B. O'Connell

    2007-01-01

    Two common sampling methodologies coupled with a simple statistical model were evaluated to determine the accuracy and precision of annual bole biomass production (BBP) and inter-annual variability estimates using this type of approach. We performed an uncertainty analysis using Monte Carlo methods in conjunction with radial growth core data from trees in three Douglas...

  17. Model Uncertainty and Robustness: A Computational Framework for Multimodel Analysis

    Young, Cristobal; Holsteen, Katherine

    2017-01-01

    Model uncertainty is pervasive in social science. A key question is how robust empirical results are to sensible changes in model specification. We present a new approach and applied statistical software for computational multimodel analysis. Our approach proceeds in two steps: First, we estimate the modeling distribution of estimates across all…
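
    The modeling-distribution step can be sketched directly: re-estimate the coefficient of interest under every combination of control variables and summarize the spread. Data and model below are synthetic:

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(7)
n = 1000
Z = rng.normal(size=(n, 3))                      # candidate control variables
x = Z[:, 0] + rng.normal(size=n)                 # variable of interest
y = 0.5 * x + Z[:, 0] - 0.3 * Z[:, 1] + rng.normal(size=n)

# Modeling distribution: the coefficient on x across every subset of controls.
betas = []
for k in range(Z.shape[1] + 1):
    for subset in combinations(range(Z.shape[1]), k):
        X = np.column_stack([x, Z[:, list(subset)], np.ones(n)])
        betas.append(np.linalg.lstsq(X, y, rcond=None)[0][0])

print(f"{len(betas)} models: beta on x ranges "
      f"{min(betas):.2f} to {max(betas):.2f}")
```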

  20. An uncertainty analysis using the NRPB accident consequence code Marc

    Jones, J.A.; Crick, M.J.; Simmonds, J.R.

    1991-01-01

    This paper describes an uncertainty analysis of MARC calculations of the consequences of accidental releases of radioactive materials to the atmosphere. A total of 98 parameters describing the transfer of material through the environment to man, the doses received, and the health effects resulting from these doses were considered. The uncertainties in the numbers of early and late health effects, the numbers of people affected by countermeasures, the amounts of food restricted and the economic costs of the accident were estimated. This paper concentrates on the results for early death and fatal cancer for a large hypothetical release from a PWR.

  1. An educational model for ensemble streamflow simulation and uncertainty analysis

    A. AghaKouchak

    2013-02-01

    This paper presents the hands-on modeling toolbox, HBV-Ensemble, designed as a complement to theoretical hydrology lectures, to teach hydrological processes and their uncertainties. HBV-Ensemble can be used for in-class lab practices, homework assignments, and assessment of students' understanding of hydrological processes. Using this modeling toolbox, students can gain more insight into how hydrological processes (e.g., precipitation, snowmelt and snow accumulation, soil moisture, evapotranspiration and runoff generation) are interconnected. The educational toolbox includes a MATLAB Graphical User Interface (GUI) and an ensemble simulation scheme that can be used for teaching uncertainty analysis, parameter estimation, ensemble simulation and model sensitivity. HBV-Ensemble was administered in a class for both in-class instruction and a final project, and students submitted their feedback about the toolbox. The results indicate that this educational software had a positive impact on students' understanding and knowledge of uncertainty in hydrological modeling.
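
    The ensemble idea behind the toolbox can be illustrated outside MATLAB with a toy single-bucket model whose two parameters are sampled to produce a predictive band. The model and parameter ranges below are illustrative stand-ins, not HBV-Ensemble itself.

        import numpy as np

        def bucket_model(precip, pet, capacity, k):
            """Toy conceptual rainfall-runoff model; returns a runoff series."""
            storage, runoff = 0.0, []
            for p, e in zip(precip, pet):
                storage = max(storage + p - e, 0.0)
                overflow = max(storage - capacity, 0.0)   # saturation excess
                storage -= overflow
                runoff.append(overflow + k * storage)     # quick + slow flow
                storage *= 1.0 - k
            return np.array(runoff)

        rng = np.random.default_rng(3)
        precip = rng.gamma(0.8, 5.0, size=365)            # synthetic daily forcing (mm)
        pet = np.full(365, 2.0)
        # Ensemble: sample the two parameters to expose predictive uncertainty.
        ens = np.array([bucket_model(precip, pet,
                                     capacity=rng.uniform(50.0, 200.0),
                                     k=rng.uniform(0.01, 0.2))
                        for _ in range(200)])
        lo, hi = np.percentile(ens, [5, 95], axis=0)
        print("mean width of the 90% runoff band:", (hi - lo).mean())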

  2. Statistical uncertainty analysis of radon transport in nonisothermal, unsaturated soils

    Holford, D.J.; Owczarski, P.C.; Gee, G.W.; Freeman, H.D.

    1990-10-01

    To accurately predict radon fluxes from soils to the atmosphere, we must know more than the radium content of the soil. Radon flux from soil is affected not only by soil properties, but also by meteorological factors such as air pressure and temperature changes at the soil surface, as well as the infiltration of rainwater. Natural variations in meteorological factors and soil properties contribute to uncertainty in subsurface model predictions of radon flux, which, when coupled with a building transport model, will also add uncertainty to predictions of radon concentrations in homes. A statistical uncertainty analysis using our Rn3D finite-element numerical model was conducted to assess the relative importance of these meteorological factors and the soil properties affecting radon transport. 10 refs., 10 figs., 3 tabs

  3. Decision analysis of shoreline protection under climate change uncertainty

    Chao, Philip T.; Hobbs, Benjamin F.

    1997-04-01

    If global warming occurs, it could significantly affect water resource distribution and availability. Yet it is unclear whether the prospect of such change is relevant to water resources management decisions being made today. We model a shoreline protection decision problem with a stochastic dynamic program (SDP) to determine whether consideration of the possibility of climate change would alter the decision. Three questions are addressed with the SDP: (1) How important is climate change compared to other uncertainties?, (2) What is the economic loss if climate change uncertainty is ignored?, and (3) How does belief in climate change affect the timing of the decision? In the case study, sensitivity analysis shows that uncertainty in real discount rates has a stronger effect upon the decision than belief in climate change. Nevertheless, a strong belief in climate change makes the shoreline protection project less attractive and often alters the decision to build it.

  4. Uncertainty and sensitivity analysis of the nuclear fuel thermal behavior

    Boulore, A., E-mail: antoine.boulore@cea.fr [Commissariat a l' Energie Atomique (CEA), DEN, Fuel Research Department, 13108 Saint-Paul-lez-Durance (France); Struzik, C. [Commissariat a l' Energie Atomique (CEA), DEN, Fuel Research Department, 13108 Saint-Paul-lez-Durance (France); Gaudier, F. [Commissariat a l' Energie Atomique (CEA), DEN, Systems and Structure Modeling Department, 91191 Gif-sur-Yvette (France)

    2012-12-15

    Highlights: ► A complete quantitative method for uncertainty propagation and sensitivity analysis is applied. ► The thermal conductivity of UO{sub 2} is modeled as a random variable. ► The first source of uncertainty is the linear heat rate. ► The second source of uncertainty is the thermal conductivity of the fuel. - Abstract: In the global framework of nuclear fuel behavior simulation, the response of the models describing the physical phenomena occurring during the irradiation in reactor is mainly conditioned by the confidence in the calculated temperature of the fuel. Amongst all the parameters influencing the temperature calculation in our fuel rod simulation code (METEOR V2), several sources of uncertainty have been identified as being the most sensitive: thermal conductivity of UO{sub 2}, radial distribution of power in the fuel pellet, local linear heat rate in the fuel rod, geometry of the pellet and thermal transfer in the gap. Expert judgment and inverse methods have been used to model the uncertainty of these parameters using theoretical distributions and correlation matrices. These uncertainties have been propagated through the METEOR V2 code using the URANIE framework and a Monte-Carlo technique for different experimental irradiations of UO{sub 2} fuel. At every time step of the simulated experiments, we obtain a statistical distribution of the temperature which results from the initial distributions of the uncertain parameters. We can then estimate confidence intervals for the calculated temperature. In order to quantify the sensitivity of the calculated temperature to each of the uncertain input parameters and data, we have also performed a sensitivity analysis using first-order Sobol' indices.
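
    A minimal sketch of first-order Sobol' index estimation with a pick-freeze (Saltelli-type) Monte Carlo estimator; the toy temperature formula and uniform input ranges below merely stand in for the METEOR V2/URANIE setup and are not taken from the study.

        import numpy as np

        def sobol_first_order(model, n=2**14, dim=3, seed=None):
            """First-order Sobol' indices via the pick-freeze estimator."""
            rng = np.random.default_rng(seed)
            A = rng.uniform(0.8, 1.2, (n, dim))
            B = rng.uniform(0.8, 1.2, (n, dim))
            yA, yB = model(A), model(B)
            var = np.var(np.concatenate([yA, yB]))
            indices = []
            for i in range(dim):
                ABi = A.copy()
                ABi[:, i] = B[:, i]              # freeze every input except x_i
                indices.append(np.mean(yB * (model(ABi) - yA)) / var)
            return np.array(indices)

        def toy_temperature(x):
            # Stand-in for a fuel temperature calculation: conductivity,
            # linear heat rate, gap conductance (all dimensionless here).
            lam, q, h = x[:, 0], x[:, 1], x[:, 2]
            return q / (4.0 * np.pi * lam) + q / h

        print(sobol_first_order(toy_temperature, seed=0))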

  5. Error and Uncertainty Analysis for Ecological Modeling and Simulation

    2001-12-01

    [Only citation fragments of this record's abstract remain:] "…nitrate flux to the Gulf of Mexico." Nature (Brief Communication) 414: 166-167. (Uncertainty analysis done with SERDP software.) Gertner, G., …; D. Goolsby, 2001. "Relating N inputs to the Mississippi River Basin and nitrate flux in the Lower Mississippi River: A comparison of approaches…" Journal of Remote Sensing, 25(4): 367-380. Wu, J., D.E. Jelinski, M. Luck, and P.T. Tueller, 2000. "Multiscale analysis of landscape heterogeneity: scale…"

  6. A Preliminary Study on Sensitivity and Uncertainty Analysis with Statistic Method: Uncertainty Analysis with Cross Section Sampling from Lognormal Distribution

    Song, Myung Sub; Kim, Song Hyun; Kim, Jong Kyung [Hanyang Univ., Seoul (Korea, Republic of); Noh, Jae Man [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2013-10-15

    Uncertainty evaluation with the statistical method is performed by repeating the transport calculation with directly perturbed, sampled nuclear data; a reliable uncertainty result is then obtained by analyzing the results of the numerous transport calculations. One known problem in uncertainty analysis with the statistical approach is that sampling cross sections from a normal (Gaussian) distribution with a relatively large standard deviation leads to sampling errors, such as the sampling of negative cross sections. Some correction methods have been noted; however, these methods can distort the distribution of the sampled cross sections. In this study, a method of sampling the nuclear data from a lognormal distribution is proposed to increase the sampling accuracy without the negative sampling error, and a stochastic cross-section sampling and writing program was developed. Criticality calculations with the sampled nuclear data were performed, and the results were compared with those from the normal distribution conventionally used in previous studies. For the sensitivity and uncertainty analysis, cross-section sampling was performed with both the normal and the lognormal distribution, and the uncertainties caused by the covariance of (n,.) cross sections were evaluated by solving the GODIVA problem. The results show that the sampling method with the lognormal distribution efficiently solves the negative sampling problem reported in previous studies. It is expected that this study will contribute to increasing the accuracy of sampling-based uncertainty analysis.
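
    A minimal sketch of the moment-matched lognormal sampling idea (Python/numpy): the lognormal parameters are chosen so that the arithmetic mean and standard deviation reproduce the nuclear data, which guarantees strictly positive samples. The cross-section values are illustrative.

        import numpy as np

        def sample_lognormal(mean, std, size, seed=None):
            """Lognormal samples with the given arithmetic mean and std."""
            rng = np.random.default_rng(seed)
            sigma2 = np.log(1.0 + (std / mean) ** 2)
            mu = np.log(mean) - 0.5 * sigma2
            return rng.lognormal(mu, np.sqrt(sigma2), size)

        # Illustrative cross section: mean 2.0 b, 50% relative standard deviation.
        normal = np.random.default_rng(7).normal(2.0, 1.0, 100_000)
        lognorm = sample_lognormal(2.0, 1.0, 100_000, seed=7)
        print("negative fraction, normal   :", (normal < 0).mean())
        print("negative fraction, lognormal:", (lognorm < 0).mean())
        print("lognormal mean/std:", lognorm.mean(), lognorm.std())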

  7. A Preliminary Study on Sensitivity and Uncertainty Analysis with Statistic Method: Uncertainty Analysis with Cross Section Sampling from Lognormal Distribution

    Song, Myung Sub; Kim, Song Hyun; Kim, Jong Kyung; Noh, Jae Man

    2013-01-01

    Uncertainty evaluation with the statistical method is performed by repeating the transport calculation with directly perturbed, sampled nuclear data; a reliable uncertainty result is then obtained by analyzing the results of the numerous transport calculations. One known problem in uncertainty analysis with the statistical approach is that sampling cross sections from a normal (Gaussian) distribution with a relatively large standard deviation leads to sampling errors, such as the sampling of negative cross sections. Some correction methods have been noted; however, these methods can distort the distribution of the sampled cross sections. In this study, a method of sampling the nuclear data from a lognormal distribution is proposed to increase the sampling accuracy without the negative sampling error, and a stochastic cross-section sampling and writing program was developed. Criticality calculations with the sampled nuclear data were performed, and the results were compared with those from the normal distribution conventionally used in previous studies. For the sensitivity and uncertainty analysis, cross-section sampling was performed with both the normal and the lognormal distribution, and the uncertainties caused by the covariance of (n,.) cross sections were evaluated by solving the GODIVA problem. The results show that the sampling method with the lognormal distribution efficiently solves the negative sampling problem reported in previous studies. It is expected that this study will contribute to increasing the accuracy of sampling-based uncertainty analysis.

  8. Hydrocoin level 3 - Testing methods for sensitivity/uncertainty analysis

    Grundfelt, B.; Lindbom, B.; Larsson, A.; Andersson, K.

    1991-01-01

    The HYDROCOIN study is an international cooperative project for testing groundwater hydrology modelling strategies for performance assessment of nuclear waste disposal. The study was initiated in 1984 by the Swedish Nuclear Power Inspectorate and the technical work was finalised in 1987. The participating organisations are regulatory authorities as well as implementing organisations in 10 countries. The study has been performed at three levels aimed at studying computer code verification, model validation and sensitivity/uncertainty analysis respectively. The results from the first two levels, code verification and model validation, have been published in reports in 1988 and 1990 respectively. This paper focuses on some aspects of the results from Level 3, sensitivity/uncertainty analysis, for which a final report is planned to be published during 1990. For Level 3, seven test cases were defined. Some of these aimed at exploring the uncertainty associated with the modelling results by simply varying parameter values and conceptual assumptions. In other test cases statistical sampling methods were applied. One of the test cases dealt with particle tracking and the uncertainty introduced by this type of post processing. The amount of results available is substantial although unevenly spread over the test cases. It has not been possible to cover all aspects of the results in this paper. Instead, the different methods applied will be illustrated by some typical analyses. 4 figs., 9 refs

  9. Treatment of uncertainties in the IPCC: a philosophical analysis

    Jebeile, J.; Drouet, I.

    2014-12-01

    The IPCC produces scientific reports from findings on climate and climate change. Because the findings are uncertain in many respects, producing the reports requires aggregating assessments of uncertainties of different kinds. This difficult task is currently regulated by the Guidance note for lead authors of the IPCC fifth assessment report on consistent treatment of uncertainties. The note recommends that two metrics, confidence and likelihood, be used for communicating the degree of certainty in findings. Confidence is expressed qualitatively "based on the type, amount, quality, and consistency of evidence […] and the degree of agreement", while likelihood is expressed probabilistically "based on statistical analysis of observations or model results, or expert judgment". Therefore, depending on the evidence evaluated, authors may present either an assigned level of confidence or a quantified measure of likelihood. But these two kinds of uncertainty assessment express distinct and conflicting methodologies, so the question arises whether the treatment of uncertainties in the IPCC is rationally justified. In order to answer this question, it is worth comparing the IPCC procedures with the formal normative theories of epistemic rationality developed by philosophers. These theories, which include contributions to the philosophy of probability and to Bayesian probabilistic confirmation theory, are relevant for our purpose because they are commonly used to assess the rationality of collective judgement formation based on uncertain knowledge. In this paper we make the comparison and pursue the following objectives: (i) we determine whether the IPCC confidence and likelihood can be compared with the notions of uncertainty targeted by or underlying the formal normative theories of epistemic rationality; (ii) we investigate whether the formal normative theories of epistemic rationality justify

  10. Uncertainty Assessment of Hydrological Frequency Analysis Using Bootstrap Method

    Yi-Ming Hu

    2013-01-01

    The hydrological frequency analysis (HFA) is the foundation for hydraulic engineering design and water resources management. Hydrological extreme observations or samples are the basis for HFA; the representativeness of a sample series with respect to the population distribution is extremely important for the reliability of the estimated hydrological design value or quantile. However, for most hydrological extreme data obtained in practical applications, the sample size is small, for example, about 40~50 years in China. Generally, samples of small size cannot completely display the statistical properties of the population distribution, thus leading to uncertainties in the estimation of hydrological design values. In this paper, a new method based on the bootstrap is put forward to analyze the impact of sampling uncertainty on the design value. Using the bootstrap resampling technique, a large number of bootstrap samples are constructed from the original flood extreme observations; the corresponding design value or quantile is estimated for each bootstrap sample, so that the sampling distribution of the design value is constructed; based on this sampling distribution, the uncertainty of the quantile estimation can be quantified. Compared with the conventional approach, this method provides not only the point estimate of a design value but also a quantitative evaluation of the estimation uncertainty.
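
    A minimal sketch of the procedure (Python/scipy): resample the annual-maximum series with replacement, refit a distribution (a Gumbel is assumed here purely for illustration), and collect the design quantile from each bootstrap sample. The flood series is synthetic.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(11)
        # Stand-in for ~45 years of observed annual maximum floods (m3/s).
        floods = stats.gumbel_r.rvs(loc=1000, scale=300, size=45, random_state=rng)

        p = 1.0 - 1.0 / 100.0                    # 100-year design quantile

        def design_value(sample):
            loc, scale = stats.gumbel_r.fit(sample)
            return stats.gumbel_r.ppf(p, loc, scale)

        boot = np.array([design_value(rng.choice(floods, size=floods.size, replace=True))
                         for _ in range(2000)])
        lo, hi = np.percentile(boot, [2.5, 97.5])
        print(f"100-yr flood: {design_value(floods):.0f}; 95% CI [{lo:.0f}, {hi:.0f}]")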

  11. Uncertainty analysis of a nondestructive radioassay system for transuranic waste

    Harker, Y.D.; Blackwood, L.G.; Meachum, T.R.; Yoon, W.Y.

    1996-01-01

    Radioassay of transuranic waste in 207 liter drums currently stored at the Idaho National Engineering Laboratory is achieved using a Passive Active Neutron (PAN) nondestructive assay system. In order to meet data quality assurance requirements for shipping and eventual permanent storage of these drums at the Waste Isolation Pilot Plant in Carlsbad, New Mexico, the total uncertainty of the PAN system measurements must be assessed. In particular, the uncertainty calculations are required to include the effects of variations in waste matrix parameters and related variables on the final measurement results. Because of the complexities involved in introducing waste matrix parameter effects into the uncertainty calculations, standard methods of analysis (e.g., experimentation followed by propagation of errors) could not be implemented. Instead, a modified statistical sampling and verification approach was developed. In this modified approach the total performance of the PAN system is simulated using computer models of the assay system and the resultant output is compared with the known input to assess the total uncertainty. This paper describes the simulation process and illustrates its application to waste comprised of weapons grade plutonium-contaminated graphite molds

  12. Quantifying and managing uncertainty in operational modal analysis

    Au, Siu-Kui; Brownjohn, James M. W.; Mottershead, John E.

    2018-03-01

    Operational modal analysis aims at identifying the modal properties (natural frequency, damping, etc.) of a structure using only the (output) vibration response measured under ambient conditions. Highly economical and feasible, it is becoming common practice in full-scale vibration testing. In the absence of (input) loading information, however, the modal properties have significantly higher uncertainty than their counterparts identified from free or forced vibration (known input) tests. Mastering the relationship between identification uncertainty and test configuration is of great interest to both scientists and engineers, e.g., for achievable precision limits and test planning/budgeting. Addressing this challenge beyond the current state of the art, which is mostly concerned with identification algorithms, this work obtains closed-form analytical expressions for the identification uncertainty (variance) of modal parameters that fundamentally explain the effect of test configuration. Collectively referred to as 'uncertainty laws', these expressions are asymptotically correct for well-separated modes, small damping and long data, and are applicable under non-asymptotic situations. They provide a scientific basis for the planning and standardization of ambient vibration tests, where factors such as channel noise, sensor number and location can be quantitatively accounted for. The work is reported comprehensively with verification through synthetic and experimental data (laboratory and field), scientific implications and practical guidelines for planning ambient vibration tests.

  13. Two-dimensional cross-section sensitivity and uncertainty analysis for fusion reactor blankets

    Embrechts, M.J.

    1982-02-01

    A two-dimensional sensitivity and uncertainty analysis for the heating of the TF coil for the FED (fusion engineering device) blanket was performed. The uncertainties calculated are of the same order of magnitude as those resulting from a one-dimensional analysis. The largest uncertainties were caused by the cross section uncertainties for chromium

  14. Uncertainty and sensitivity analysis of environmental transport models

    Margulies, T.S.; Lancaster, L.E.

    1985-01-01

    An uncertainty and sensitivity analysis has been made of the CRAC-2 (Calculations of Reactor Accident Consequences) atmospheric transport and deposition models. Robustness and uncertainty aspects of air and ground deposited material and the relative contribution of input and model parameters were systematically studied. The underlying data structures were investigated using a multiway layout of factors over specified ranges generated via a Latin hypercube sampling scheme. The variables selected in our analysis include: weather bin, dry deposition velocity, rain washout coefficient/rain intensity, duration of release, heat content, sigma-z (vertical) plume dispersion parameter, sigma-y (crosswind) plume dispersion parameter, and mixing height. To determine the contributors to the output variability (versus distance from the site) step-wise regression analyses were performed on transformations of the spatial concentration patterns simulated. 27 references, 2 figures, 3 tables
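
    A minimal Latin hypercube sampler, mapped to two CRAC-2-style inputs with illustrative ranges of our choosing (Python/numpy):

        import numpy as np

        def latin_hypercube(n, dim, seed=None):
            """n stratified samples in [0, 1]^dim: one point per equal-probability
            stratum per dimension, strata randomly paired across dimensions."""
            rng = np.random.default_rng(seed)
            perms = np.argsort(rng.uniform(size=(n, dim)), axis=0)  # random permutations
            return (perms + rng.uniform(size=(n, dim))) / n

        u = latin_hypercube(100, 2, seed=5)
        v_dep = 10.0 ** (-3.0 + 2.0 * u[:, 0])   # deposition velocity, log-uniform 1e-3..1e-1 m/s
        h_mix = 200.0 + 1800.0 * u[:, 1]         # mixing height, uniform 200..2000 m
        print(v_dep.min(), v_dep.max(), h_mix.mean())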

  15. Stochastic analysis in production process and ecology under uncertainty

    Bieda, Bogusław

    2014-01-01

    The monograph addresses stochastic analysis based on uncertainty assessment by simulation, and the application of this method in ecology and the steel industry under uncertainty. The first chapter defines the Monte Carlo (MC) method and random variables in stochastic models. Chapter two deals with contaminant transport in porous media; a stochastic approach to modeling the transit time of Municipal Solid Waste contaminants using MC simulation has been worked out. The third chapter describes the risk analysis of the waste-to-energy facility proposal for the city of Konin, including the financial aspects. An environmental impact assessment of the ArcelorMittal Steel Power Plant in Kraków is given in chapter four; four scenarios of the energy mix production processes were studied. Chapter five contains examples of using ecological Life Cycle Assessment (LCA), a relatively new method of environmental impact assessment, which helps in preparing pro-ecological strategy and can lead to reducing t...

  16. Applications of the TSUNAMI sensitivity and uncertainty analysis methodology

    Rearden, Bradley T.; Hopper, Calvin M.; Elam, Karla R.; Goluoglu, Sedat; Parks, Cecil V.

    2003-01-01

    The TSUNAMI sensitivity and uncertainty analysis tools under development for the SCALE code system have recently been applied in four criticality safety studies. TSUNAMI is used to identify applicable benchmark experiments for criticality code validation, assist in the design of new critical experiments for a particular need, reevaluate previously computed computational biases, and assess the validation coverage and propose a penalty for noncoverage for a specific application. (author)

  17. INTEGRATION OF SYSTEM COMPONENTS AND UNCERTAINTY ANALYSIS - HANFORD EXAMPLES

    Wood, M.I.

    2009-01-01

    • Deterministic 'one-off' analyses as the basis for evaluating sensitivity and uncertainty relative to the reference case
    • Spatial coverage identical to the reference case
    • Two types of analysis assumptions: minimax parameter values around reference case conditions, and 'what if' cases that change the reference case condition and associated parameter values
    • No conclusions about the likelihood of the estimated result other than a qualitative expectation that the actual outcome should tend toward the reference case estimate

  18. Summary of the CEC/USDOE workshop on uncertainty analysis

    Elderkin, C.E.; Kelly, G.N.

    1990-06-01

    There is uncertainty in all aspects of assessing the consequences of accidental releases of radioactive material, from understanding and describing the environmental and biological transfer processes to modeling emergency response. The need for an exchange of views and a comparison of approaches between the diverse disciplines led to the organization of a CEC/USDOE Workshop on Uncertainty Analysis held in Santa Fe, New Mexico, in November 1989. The workshop brought together specialists in a number of disciplines, including those expert in the mathematics and statistics of uncertainty analysis, in expert judgment elicitation and evaluation, and in all aspects of assessing the radiological and environmental consequences of accidental releases of radioactive material. In addition, there was participation from users of the output of accident consequence assessment in decision making and/or regulatory frameworks. The main conclusions that emerged from the workshop are summarized in this paper. These are discussed in the context of three different types of accident consequence assessment: probabilistic assessments of accident consequences undertaken as inputs to risk analyses of nuclear installations, assessments of accident consequences in real time to provide inputs to decisions on the introduction of countermeasures, and the reconstruction of doses and risks resulting from past releases of radioactive material

  19. Similarity and uncertainty analysis of the ALLEGRO MOX core

    Vrban, B.; Hascik, J.; Necas, V.; Slugen, V.

    2015-01-01

    The similarity and uncertainty analysis of the ESNII+ ALLEGRO MOX core has identified specific problems and challenges in the field of neutronic calculations. The similarity assessment identified 9 partly comparable experiments, of which only one reached c_k and E values over 0.9. However, the global integral index G remains low (0.75) and cannot be judged as sufficient. The total uncertainty of the calculated k_eff induced by XS data is, according to our calculation, 1.04%. The main contributors to this uncertainty are 239Pu nubar and 238U inelastic scattering. The additional margin from uncovered sensitivities was determined to be 0.28%. The identified low number of similar experiments prevents the use of advanced XS adjustment and bias estimation methods. More experimental data are needed, and the presented results may serve as a basic step in the development of the necessary critical assemblies. Although exact data are not presented in the paper, the faster 44-energy-group calculation gives almost the same results in the similarity analysis as the more complex 238-group calculation. Finally, it was demonstrated that the TSUNAMI-IP utility can play a significant role in future fast reactor development in Slovakia and in the Visegrad region. Clearly, further research and development and a strong effort should be carried out in order to arrive at a more complete methodology consisting of more plausible covariance data and related quantities. (authors)

  20. Uncertainty analysis on probabilistic fracture mechanics assessment methodology

    Rastogi, Rohit; Vinod, Gopika; Chandra, Vikas; Bhasin, Vivek; Babar, A.K.; Rao, V.V.S.S.; Vaze, K.K.; Kushwaha, H.S.; Venkat-Raj, V.

    1999-01-01

    Fracture mechanics has found profound usage in the design of components and in assessing the fitness for purpose and residual life of operating components. Since defect size and material properties are statistically distributed, various probabilistic approaches have been employed for the computation of fracture probability. Monte Carlo simulation is one such procedure for analysing fracture probability. This paper deals with uncertainty analysis using Monte Carlo simulation methods. These methods were developed based on the R6 failure assessment procedure, which has been widely used in analysing the integrity of structures. The application of the method is illustrated with a case study. (author)
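
    A minimal Monte Carlo sketch of fracture probability estimation; it uses a simplified brittle-fracture criterion as a stand-in for the full R6 failure assessment diagram, and every distribution below is an illustrative assumption.

        import numpy as np

        rng = np.random.default_rng(2)
        n = 1_000_000

        a = rng.lognormal(np.log(2e-3), 0.4, n)   # crack depth (m)
        K_mat = rng.normal(120.0, 15.0, n)        # fracture toughness (MPa*sqrt(m))
        stress = rng.normal(250.0, 25.0, n)       # applied stress (MPa)

        K_I = 1.12 * stress * np.sqrt(np.pi * a)  # edge-crack stress intensity factor
        print(f"estimated fracture probability: {np.mean(K_I > K_mat):.2e}")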

  1. Probabilistic Accident Consequence Uncertainty Analysis of the Food Chain Module in the COSYMA Package (invited paper)

    Brown, J.; Jones, J.A.

    2000-01-01

    This paper describes the uncertainty analysis of the food chain module of COSYMA and the uncertainty distributions on the input parameter values for the food chain model provided by the expert panels that were used for the analysis. Two expert panels were convened, covering the areas of soil and plant transfer processes and transfer to and through animals. The aggregated uncertainty distributions from the experts for the elicited variables were used in an uncertainty analysis of the food chain module of COSYMA. The main aim of the module analysis was to identify those parameters whose uncertainty makes large contributions to the overall uncertainty and so should be included in the overall analysis. (author)

  2. Overview of methods for uncertainty analysis and sensitivity analysis in probabilistic risk assessment

    Iman, R.L.; Helton, J.C.

    1985-01-01

    Probabilistic Risk Assessment (PRA) is playing an increasingly important role in the nuclear reactor regulatory process. The assessment of uncertainties associated with PRA results is widely recognized as an important part of the analysis process. One of the major criticisms of the Reactor Safety Study was that its representation of uncertainty was inadequate. The desire for the capability to treat uncertainties with the MELCOR risk code being developed at Sandia National Laboratories is indicative of the current interest in this topic. However, as yet, uncertainty analysis and sensitivity analysis in the context of PRA is a relatively immature field. In this paper, available methods for uncertainty analysis and sensitivity analysis in a PRA are reviewed. This review first treats methods for use with individual components of a PRA and then considers how these methods could be combined in the performance of a complete PRA. In the context of this paper, the goal of uncertainty analysis is to measure the imprecision in PRA outcomes of interest, and the goal of sensitivity analysis is to identify the major contributors to this imprecision. There are a number of areas that must be considered in uncertainty analysis and sensitivity analysis for a PRA: (1) information, (2) systems analysis, (3) thermal-hydraulic phenomena/fission product behavior, (4) health and economic consequences, and (5) display of results. Each of these areas and the synthesis of them into a complete PRA are discussed

  3. The Impact of Uncertainty on Investment. A Meta-Analysis

    Koetse, M.J. [Department of Spatial Economics, Vrije Universiteit Amsterdam (Netherlands); De Groot, Henri L.F. [Tinbergen Institute, Amsterdam (Netherlands); Florax, R.J.G.M. [Department of Agricultural Economics, Purdue University, West Lafayette (United States)

    2006-07-01

    In this paper we perform a meta-analysis on empirical estimates of the relationship between investment and uncertainty. Since the outcomes of primary studies are largely incomparable with respect to the magnitude of the effect, our analysis focuses on the direction and statistical significance of the relationship. The standard approach in this situation is to estimate an ordered probit model on a categorical outcome defined in terms of the direction and significance of the effect. The estimates are transformed into marginal effects, in order to represent the changes in the probability of finding a negative significant, insignificant, or positive significant estimate. Although a meta-analysis generally does not allow for inferences on the correctness of model specifications in primary studies, our results give clear directions for model building in empirical investment research. For example, not including factor prices in investment models may seriously affect the model outcomes. Furthermore, we find that Q models produce more negative significant estimates than other models do, ceteris paribus. The outcome of a study is also affected by the type of data used in the primary study. Although it is clear that meta-analysis cannot always give decisive insights into the explanations for the variation in empirical outcomes, our meta-analysis shows that we can explain to a large extent why empirical estimates of the investment-uncertainty relationship differ.
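
    A sketch of the ordered probit step using statsmodels' OrderedModel on synthetic study-level data (the moderators, coefficients and sample are invented, not the paper's dataset); the "marginal effect" is approximated here as a discrete change in predicted category probabilities.

        import numpy as np
        import pandas as pd
        from statsmodels.miscmodels.ordinal_model import OrderedModel

        rng = np.random.default_rng(8)
        n = 300
        q_model = rng.integers(0, 2, n)           # 1 if the primary study used a Q model
        factor_prices = rng.integers(0, 2, n)     # 1 if factor prices were included
        latent = -0.8 * q_model + 0.5 * factor_prices + rng.normal(size=n)
        outcome = pd.Categorical.from_codes(np.digitize(latent, [-0.5, 0.5]),
                                            ['neg', 'insig', 'pos'], ordered=True)

        X = pd.DataFrame({'q_model': q_model, 'factor_prices': factor_prices})
        res = OrderedModel(outcome, X, distr='probit').fit(method='bfgs', disp=False)

        # Discrete change in predicted probabilities when switching to a Q model,
        # holding factor_prices at its sample mean.
        x0 = pd.DataFrame({'q_model': [0.0], 'factor_prices': [X.factor_prices.mean()]})
        x1 = x0.assign(q_model=1.0)
        effect = np.asarray(res.predict(x1))[0] - np.asarray(res.predict(x0))[0]
        print(dict(zip(['neg', 'insig', 'pos'], effect.round(3))))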

  4. Uncertainty analysis in WWTP model applications: a critical discussion using an example from design

    Sin, Gürkan; Gernaey, Krist; Neumann, Marc B.

    2009-01-01

    This study focuses on uncertainty analysis of WWTP models and analyzes the issue of framing and how it affects the interpretation of uncertainty analysis results. As a case study, the prediction of uncertainty involved in model-based design of a wastewater treatment plant is studied. The Monte Carlo procedure was applied under three framings: (1) uncertainty due to stoichiometric, biokinetic and influent parameters; (2) uncertainty due to hydraulic behaviour of the plant and mass transfer parameters; (3) uncertainty due to the combination of (1) and (2). The results demonstrate that, depending on the way the uncertainty analysis is framed, the estimated uncertainty of design performance criteria differs significantly. The implication for the practical applications of uncertainty analysis in the wastewater industry is profound: (i) as the uncertainty analysis results are specific to the framing used, the results must be interpreted within the context of that framing…

  5. Application of intelligence based uncertainty analysis for HLW disposal

    Kato, Kazuyuki

    2003-01-01

    Safety assessment for geological disposal of high-level radioactive waste inevitably involves factors that cannot be specified in a deterministic manner. These are namely: (1) 'variability' that arises from the stochastic nature of the processes and features considered, e.g., the distribution of canister corrosion times and the spatial heterogeneity of a host geological formation; (2) 'ignorance' due to incomplete or imprecise knowledge of the processes and conditions expected in the future, e.g., uncertainty in the estimation of solubilities and sorption coefficients for important nuclides. In many cases, a decision in the assessment, e.g., selection among model options or determination of a parameter value, is subject to both variability and ignorance in a combined form. It is clearly important to evaluate the influences of both variability and ignorance on the result of a safety assessment in a consistent manner. We developed a unified methodology to handle variability and ignorance by using probabilistic and possibilistic techniques, respectively. The methodology has been applied to the safety assessment of geological disposal of high-level radioactive waste. Uncertainties associated with scenarios, models and parameters were defined in terms of fuzzy membership functions derived through a series of interviews with the experts, while variability was formulated by means of probability density functions (pdfs) based on the available data set. The exercise demonstrated the applicability of the new methodology and, in particular, its advantage in quantifying uncertainties based on experts' opinions and in providing information on the dependence of the assessment result on the level of conservatism. In addition, it was also shown that sensitivity analysis could identify key parameters for reducing the uncertainties associated with the overall assessment. The above information can be used to support the judgment process and guide the process of disposal system development in optimization of protection against

  6. Biomass Thermogravimetric Analysis: Uncertainty Determination Methodology and Sampling Maps Generation

    Pazó, Jose A.; Granada, Enrique; Saavedra, Ángeles; Eguía, Pablo; Collazo, Joaquín

    2010-01-01

    The objective of this study was to develop a methodology for the determination of the maximum sampling error and confidence intervals of thermal properties obtained from thermogravimetric analysis (TG), including moisture, volatile matter, fixed carbon and ash content. The sampling procedure of the TG analysis was of particular interest and was conducted with care. The results of the present study were compared to those of a prompt analysis, and a correlation between the mean values and maximum sampling errors of the methods was not observed. In general, low and acceptable levels of uncertainty and error were obtained, demonstrating that the properties evaluated by TG analysis were representative of the overall fuel composition. The accurate determination of the thermal properties of biomass with precise confidence intervals is of particular interest in energetic biomass applications. PMID:20717532
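
    The maximum sampling error and confidence interval for a TG-derived property follow from the Student t distribution; a minimal sketch with hypothetical replicate moisture determinations:

        import numpy as np
        from scipy import stats

        moisture = np.array([7.9, 8.3, 8.1, 8.6, 7.8, 8.2])   # replicate TG results (wt%)

        n = moisture.size
        mean, s = moisture.mean(), moisture.std(ddof=1)
        t = stats.t.ppf(0.975, df=n - 1)
        max_error = t * s / np.sqrt(n)            # maximum sampling error at 95%
        print(f"moisture = {mean:.2f} +/- {max_error:.2f} wt% (95% CI)")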

  7. Use of error files in uncertainty analysis and data adjustment

    Chestnutt, M.M.; McCracken, A.K.

    1979-01-01

    Some results are given from uncertainty analyses on Pressurized Water Reactor (PWR) and Fast Reactor Theoretical Benchmarks. Upper limit estimates of calculated quantities are shown to be significantly reduced by the use of ENDF/B data covariance files and recently published few-group covariance matrices. Some problems in the analysis of single-material benchmark experiments are discussed with reference to the Winfrith iron benchmark experiment. Particular attention is given to the difficulty of making use of very extensive measurements which are likely to be a feature of this type of experiment. Preliminary results of an adjustment in iron are shown

  8. Uncertainty analysis in calculations of a road accident consequences

    Bonnefous, S.; Brenot, J.; Hubert, P.

    1995-01-01

    This paper develops a concrete situation: the search for an evacuation distance in the case of a road accident involving a chlorine tank. The methodological question is how to implement uncertainty analysis in deterministic models with random parameters. The study demonstrates a great dispersion in the results. It allows satisfactory decision rules to be established, together with a hierarchy of parameters that is useful for defining priorities in the search for information and for improving the treatment of these parameters. (authors). 8 refs., 1 fig., 2 tabs

  9. Additional challenges for uncertainty analysis in river engineering

    Berends, Koen; Warmink, Jord; Hulscher, Suzanne

    2016-04-01

    …the proposed intervention. The implicit assumption underlying such analysis is that both models are commensurable. We hypothesize that they are commensurable only to a certain extent. In an idealised study we have demonstrated that prediction performance loss should be expected with increasingly large engineering works. When accounting for parametric uncertainty of floodplain roughness in model identification, we see uncertainty bounds for predicted effects of interventions increase with increasing intervention scale. Calibration of these types of models therefore seems to have a shelf-life, beyond which calibration no longer improves prediction. Therefore a qualification scheme for model use is required that can be linked to model validity. In this study, we characterize model use along three dimensions: extrapolation (using the model with different external drivers), extension (using the model for different output or indicators) and modification (using modified models). Such use of models is expected to have implications for the applicability of surrogate modelling for efficient uncertainty analysis as well, which is recommended for future research. Warmink, J. J.; Straatsma, M. W.; Huthoff, F.; Booij, M. J. & Hulscher, S. J. M. H. 2013. Uncertainty of design water levels due to combined bed form and vegetation roughness in the Dutch river Waal. Journal of Flood Risk Management 6, 302-318. DOI: 10.1111/jfr3.12014

  10. A Proposal on the Advanced Sampling Based Sensitivity and Uncertainty Analysis Method for the Eigenvalue Uncertainty Analysis

    Kim, Song Hyun; Song, Myung Sub; Shin, Chang Ho; Noh, Jae Man

    2014-01-01

    Using perturbation theory, the uncertainty of the response can be estimated from a single transport simulation, and therefore it requires little computation. However, it has the disadvantage that the computational methodology must be modified whenever a different response type, such as the multiplication factor, flux, or power distribution, is to be estimated. Hence, it is suitable for analyzing a few responses with many perturbed parameters. The statistical approach is a sampling-based method which uses cross sections randomly sampled from covariance data to analyze the uncertainty of the response. XSUSA is a code based on the statistical approach. Only the cross sections are modified by the sampling-based method; thus, general transport codes can be directly utilized for the S/U analysis without any code modifications. However, to calculate the uncertainty distribution from the results, the simulation must be repeated many times with randomly sampled cross sections; this inefficiency is a known disadvantage of the stochastic method. In this study, to increase the estimation efficiency of the sampling-based S/U method, an advanced sampling and estimation method is proposed and verified. The main feature of the proposed method is that the cross section averaged over the individual sampled cross sections is used. The proposed method was validated against perturbation theory.

  11. Statistical analysis of the uncertainty related to flood hazard appraisal

    Notaro, Vincenza; Freni, Gabriele

    2015-12-01

    The estimation of flood hazard frequency statistics for an urban catchment is of great interest in practice: it provides an evaluation of potential flood risk and related damage and supports decision making for flood risk management. Flood risk is usually defined as a function of the probability that a system deficiency causes flooding (hazard) and the expected damage due to the flooding magnitude (damage), taking into account both the exposure and the vulnerability of the goods at risk. The expected flood damage can be evaluated by an a priori estimation of the potential damage caused by flooding or by interpolating real damage data. With regard to flood hazard appraisal, several procedures propose to identify a hazard indicator (HI), such as flood depth or the combination of flood depth and velocity, and to assess the flood hazard of the analyzed area by comparing the HI variables with user-defined threshold values or curves (penalty curves or matrixes). However, flooding data are usually unavailable or too piecemeal to allow a reliable flood hazard analysis; therefore, hazard analysis is often performed by means of mathematical simulations aimed at evaluating water levels and flow velocities over the catchment surface. As a result, a great part of the uncertainty intrinsic to flood risk appraisal can be related to the hazard evaluation, owing to the uncertainty inherent in modeling results and to the subjectivity of the user-defined hazard thresholds applied to link flood depth to a hazard level. In the present work, a statistical methodology is proposed for evaluating and reducing the uncertainties connected with hazard level estimation. The methodology has been applied to a real urban watershed as a case study.

  12. Selected examples of practical approaches for the assessment of model reliability - parameter uncertainty analysis

    Hofer, E.; Hoffman, F.O.

    1987-02-01

    The uncertainty analysis of model predictions has to discriminate between two fundamentally different types of uncertainty. The presence of stochastic variability (Type 1 uncertainty) necessitates the use of a probabilistic model instead of the much simpler deterministic one. Lack of knowledge (Type 2 uncertainty), however, applies to deterministic as well as to probabilistic model predictions and often dominates over uncertainties of Type 1. The term "probability" is interpreted differently in the probabilistic analysis of either type of uncertainty. After these discriminations have been explained, the discussion centers on the propagation of parameter uncertainties through the model, the derivation of quantitative uncertainty statements for model predictions and the presentation and interpretation of the results of a Type 2 uncertainty analysis. Various alternative approaches are compared for a very simple deterministic model

  13. Sensitivity/uncertainty analysis for the Hiroshima dosimetry reevaluation effort

    Broadhead, B.L.; Lillie, R.A.; Pace, J.V. III; Cacuci, D.G.

    1987-01-01

    Uncertainty estimates and cross correlations by range/survivor location have been obtained for the free-in-air (FIA) tissue kerma for the Hiroshima atomic event. These uncertainties in the FIA kerma include contributions due to various modeling parameters and the basic cross section data and are given at three ground ranges, 700, 1000 and 1500 m. The estimated uncertainties are nearly constant over the given ground ranges and are approximately 27% for the prompt neutron kerma and secondary gamma kerma and 35% for the prompt gamma kerma. The total kerma uncertainty is dominated by the secondary gamma kerma uncertainties which are in turn largely due to the modeling parameter uncertainties

  14. Joint analysis of epistemic and aleatory uncertainty in stability analysis for geo-hazard assessments

    Rohmer, Jeremy; Verdel, Thierry

    2017-04-01

    Uncertainty analysis is an unavoidable task in the stability analysis of any geotechnical system. Such analysis usually relies on the safety factor SF: if SF is below some specified threshold, failure is possible. The objective of the stability analysis is then to estimate the failure probability P that SF falls below the specified threshold. When dealing with uncertainties, two facets should be considered, as outlined by several authors in the domain of geotechnics, namely "aleatoric uncertainty" (also named "randomness" or "intrinsic variability") and "epistemic uncertainty" (i.e. when facing "vague, incomplete or imprecise information" such as limited databases and observations or "imperfect" modelling). The benefits of separating both facets of uncertainty can be seen from a risk management perspective because: - Aleatoric uncertainty, being a property of the system under study, cannot be reduced. However, practical actions can be taken to circumvent the potentially dangerous effects of such variability; - Epistemic uncertainty, being due to the incomplete/imprecise nature of available information, can be reduced by, e.g., increasing the number of tests (lab or in situ surveys), improving the measurement methods or evaluating the calculation procedure with model tests, and confronting more information sources (expert opinions, data from literature, etc.). Uncertainty treatment in stability analysis is usually restricted to the probabilistic framework to represent both facets of uncertainty. Yet, in the domain of geo-hazard assessments (like landslides, mine pillar collapse, rockfalls, etc.), the validity of this approach can be debatable. In the present communication, we propose to review the major criticisms available in the literature against the systematic use of probability in situations of a high degree of uncertainty. On this basis, the feasibility of using a more flexible uncertainty representation tool is then investigated, namely possibility distributions (e

  15. Stochastic dynamic analysis of marine risers considering Gaussian system uncertainties

    Ni, Pinghe; Li, Jun; Hao, Hong; Xia, Yong

    2018-03-01

    This paper performs stochastic dynamic response analysis of marine risers with material uncertainties, i.e. in the mass density and elastic modulus, by using the Stochastic Finite Element Method (SFEM) and a model reduction technique. These uncertainties are assumed to have Gaussian distributions. The random mass density and elastic modulus are represented using the Karhunen-Loève (KL) expansion. The Polynomial Chaos (PC) expansion is adopted to represent the vibration response because the covariance of the output is unknown. Model reduction based on the Iterated Improved Reduced System (IIRS) technique is applied to eliminate the PC coefficients of the slave degrees of freedom and thus reduce the dimension of the stochastic system. Monte Carlo Simulation (MCS) is conducted to obtain the reference response statistics. Two numerical examples are studied in this paper. The response statistics from the proposed approach are compared with those from MCS. It is noted that the computational time is significantly reduced while the accuracy is maintained. The results demonstrate the efficiency of the proposed approach for stochastic dynamic response analysis of marine risers.
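
    A minimal sketch of representing a Gaussian material property by a truncated Karhunen-Loève expansion, using the eigenpairs of a discretized exponential covariance; the grid, correlation length and modulus statistics below are illustrative assumptions.

        import numpy as np

        def kl_realizations(x, mean, std, corr_len, n_modes, n_samples, seed=None):
            """Gaussian random field samples from a truncated KL expansion."""
            rng = np.random.default_rng(seed)
            C = std**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)
            lam, phi = np.linalg.eigh(C)                    # ascending eigenvalues
            lam, phi = lam[::-1][:n_modes], phi[:, ::-1][:, :n_modes]
            xi = rng.standard_normal((n_samples, n_modes))  # independent N(0,1) weights
            return mean + (xi * np.sqrt(lam)) @ phi.T

        # Elastic modulus along a 100 m riser, discretized at 200 points.
        x = np.linspace(0.0, 100.0, 200)
        E = kl_realizations(x, mean=210e9, std=10e9, corr_len=20.0,
                            n_modes=10, n_samples=500, seed=4)
        print(E.shape, E.std())   # pointwise std approaches 10 GPa as modes increase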

  16. Overview of hybrid subspace methods for uncertainty quantification, sensitivity analysis

    Abdel-Khalik, Hany S.; Bang, Youngsuk; Wang, Congjian

    2013-01-01

    Highlights: ► We overview the state-of-the-art in uncertainty quantification and sensitivity analysis. ► We overview new developments in above areas using hybrid methods. ► We give a tutorial introduction to above areas and the new developments. ► Hybrid methods address the explosion in dimensionality in nonlinear models. ► Representative numerical experiments are given. -- Abstract: The role of modeling and simulation has been heavily promoted in recent years to improve understanding of complex engineering systems. To realize the benefits of modeling and simulation, concerted efforts in the areas of uncertainty quantification and sensitivity analysis are required. The manuscript intends to serve as a pedagogical presentation of the material to young researchers and practitioners with little background on the subjects. We believe this is important as the role of these subjects is expected to be integral to the design, safety, and operation of existing as well as next generation reactors. In addition to covering the basics, an overview of the current state-of-the-art will be given with particular emphasis on the challenges pertaining to nuclear reactor modeling. The second objective will focus on presenting our own development of hybrid subspace methods intended to address the explosion in the computational overhead required when handling real-world complex engineering systems.

  17. Procedures for uncertainty and sensitivity analysis in repository performance assessment

    Poern, K.; Aakerlund, O.

    1985-10-01

    The objective of the project was mainly a literature study of available methods for the treatment of parameter uncertainty propagation and sensitivity aspects in complete models, such as those concerning geologic disposal of radioactive waste. The study, which has run parallel with the development of a code package (PROPER) for computer-assisted analysis, also aims at the choice of accurate, cost-effective methods for uncertainty and sensitivity analysis. Such a choice depends on several factors, like the number of input parameters, the capacity of the model and the computer resources required to use the model. Two basic approaches are addressed in the report. In one of these the model of interest is directly simulated by an efficient sampling technique to generate an output distribution. Applying the other basic method, the model is replaced by an approximating analytical response surface, which is then used in the sampling phase or in moment matching to generate the output distribution. Both approaches are illustrated by simple examples in the report. (author)
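
    The second approach, replacing the model by a response surface that is then sampled, can be sketched as follows (Python/numpy); the "expensive model" and the quadratic basis are illustrative assumptions.

        import numpy as np

        def expensive_model(x1, x2):
            # Stand-in for a costly performance assessment code.
            return np.exp(-x1) * (1.0 + 0.5 * x2**2)

        rng = np.random.default_rng(6)
        X = rng.uniform(0.0, 1.0, size=(30, 2))       # small design of model runs
        y = expensive_model(X[:, 0], X[:, 1])

        def basis(X):
            """Quadratic polynomial basis in two inputs."""
            x1, x2 = X[:, 0], X[:, 1]
            return np.column_stack([np.ones(len(X)), x1, x2, x1 * x2, x1**2, x2**2])

        coef, *_ = np.linalg.lstsq(basis(X), y, rcond=None)   # fit the surface

        Xmc = rng.uniform(0.0, 1.0, size=(100_000, 2))        # sample the cheap surrogate
        y_surr = basis(Xmc) @ coef
        print("surrogate output mean/std:", y_surr.mean(), y_surr.std())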

  18. A retrospective dosimetry method and its uncertainty analysis

    Zhang, L.; Jia, D.; Dai, G.

    2000-01-01

    The main aim of a radiation epidemiological study is to assess the risk to a population exposed to ionizing radiation. The actual work of the assessment may be very difficult because dose information about the population is often indirect and incomplete. It is very important, therefore, to find a way of estimating reasonable and reliable doses for the population by a retrospective method from limited information. In order to provide reasonable dose information for the cohort study of Chinese medical diagnostic X-ray workers, a retrospective dosimetry method was established. In China, a cohort study of more than 27,000 medical diagnostic X-ray workers, with 25,000 controls, has been carried out for about fifteen years in order to assess the risk to an occupationally exposed population. Obviously, a key to the success of the study is to obtain reliable and reasonable dose estimates by the dose reconstruction method. Before 1985, there was a lack of directly measured personal dose information; however, other indirect information could be obtained: information about workloads from hospital documents, information about the operational conditions of workers of different statuses from a survey of occupational history, and the exposure levels under various working conditions from simulation methods. The information needed to estimate organ doses can also be obtained from simulation experiments with a phantom. Based on the information mentioned above, a mathematical model and computing system for dose reconstruction for this occupational population was designed and developed. Uncertainty analysis is very important for dose reconstruction. The sources of uncertainty in our study come from two areas: one is the dose reconstruction model, the other is the survey of occupational history. The main results of the uncertainty analysis are presented. In order to control the uncertainty of the

  19. Uncertainty propagation in probabilistic safety analysis of nuclear power plants

    Fleming, P.V.

    1981-09-01

    Uncertainty propagation in the probabilistic safety analysis of nuclear power plants is performed. The minimal cut set methodology is implemented in the computer code SVALON, and the results for several cases are compared with corresponding results obtained with the SAMPLE code, which employs the Monte Carlo method to propagate the uncertainties. The results show that, for a relatively small number of dominant minimal cut sets (n approximately 25) and error factors (r approximately 5), the SVALON code yields results which are comparable to those obtained with SAMPLE. An analysis of the unavailability of the low pressure recirculation system of Angra 1, for both the short and long term recirculation phases, is presented. The results for the short term phase are in good agreement with the corresponding ones given in WASH-1400. (E.G.)
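
    A minimal sketch of this kind of uncertainty propagation: lognormal basic-event probabilities, specified by median and error factor, are sampled and combined through the minimal cut sets with the rare-event approximation. The cut sets and numbers are illustrative, not SVALON's.

        import numpy as np

        rng = np.random.default_rng(9)

        # Minimal cut sets as index tuples into the basic-event vector (illustrative).
        cut_sets = [(0, 1), (0, 2), (3,), (1, 4)]
        medians = np.array([1e-3, 5e-3, 2e-3, 1e-4, 8e-3])   # basic-event medians
        error_factor = 5.0                                   # 95th/50th percentile ratio
        sigma = np.log(error_factor) / 1.645                 # lognormal shape parameter

        n = 100_000
        q = medians * rng.lognormal(0.0, sigma, size=(n, medians.size))
        # Rare-event approximation: unavailability ~ sum of cut-set products.
        Q = sum(np.prod(q[:, list(cs)], axis=1) for cs in cut_sets)
        med, lo, hi = np.percentile(Q, [50, 5, 95])
        print(f"median {med:.2e}, 90% interval [{lo:.2e}, {hi:.2e}]")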

  20. Probabilistic accident consequence uncertainty analysis: Food chain uncertainty assessment. Volume 1: Main report

    Brown, J. [National Radiological Protection Board (United Kingdom); Goossens, L.H.J.; Kraan, B.C.P. [Delft Univ. of Technology (Netherlands)] [and others

    1997-06-01

    This volume is the first of a two-volume document that summarizes a joint project conducted by the US Nuclear Regulatory Commission and the European Commission to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This document reports on an ongoing project to assess uncertainty in the MACCS and COSYMA calculations for the offsite consequences of radionuclide releases by hypothetical nuclear power plant accidents. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain variables that affect calculations of offsite consequences. The expert judgment elicitation procedure and its outcomes are described in these volumes. Other panels were formed to consider uncertainty in other aspects of the codes. Their results are described in companion reports. Volume 1 contains background information and a complete description of the joint consequence uncertainty study. Volume 2 contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures for both panels, (3) the rationales and results for the panels on soil and plant transfer and animal transfer, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  2. Geoengineering to Avoid Overshoot: An Analysis of Uncertainty

    Tanaka, Katsumasa; Cho, Cheolhung; Krey, Volker; Patt, Anthony; Rafaj, Peter; Rao-Skirbekk, Shilpa; Wagner, Fabian

    2010-05-01

    ., 2009) is employed to calculate climate responses including associated uncertainty and to estimate geoengineering profiles to cap the warming at 2°C since preindustrial times. The inversion setup for the model ACC2 is used to estimate the uncertain parameters (e.g. climate sensitivity) against associated historical observations (e.g. global-mean surface air temperature). Our preliminary results show that under climate and scenario uncertainties, a geoengineering intervention to avoid an overshoot would be of medium intensity in the latter half of this century (≈ 1 Mt. Pinatubo eruption every 4 years in terms of stratospheric sulfur injections). The start year of geoengineering intervention does not significantly influence the long-term geoengineering profile. However, a geoengineering intervention of medium intensity could bring about substantial environmental side effects such as the destruction of stratospheric ozone. Our results point to the necessity of persistently pursuing mainstream mitigation efforts. 2) Pollution Abatement and Geoengineering The second study examines the potential of geoengineering combined with clean air policy. A drastic air pollution abatement might result in an abrupt warming because it would suddenly remove the tropospheric aerosols which partly offset the background global warming (e.g. Andreae et al., 2005; Raddatz and Tanaka, 2010). This study investigates the magnitude of unrealized warming under a range of policy assumptions and associated uncertainties. Then the profile of geoengineering is estimated to suppress the warming that would accompany clean air policy. This study is the first attempt to explore uncertainty in the warming caused by clean air policy - Kloster et al. (2009), which assesses regional changes in climate and the hydrological cycle, did not include associated uncertainties in the analysis. A variety of policy assumptions will be devised to represent various degrees of air pollution abatement. These

  3. Uncertainty analysis comes to integrated assessment models for climate change…and conversely

    Cooke, R.M.

    2012-01-01

    This article traces the development of uncertainty analysis through three generations punctuated by large methodology investments in the nuclear sector. Driven by a very high perceived legitimation burden, these investments aimed at strengthening the scientific basis of uncertainty quantification.

  4. Analysis and evaluation of regulatory uncertainties in 10 CFR 60 subparts B and E

    Weiner, R.F.; Patrick, W.C.

    1990-01-01

    This paper presents an attribute analysis scheme for prioritizing the resolution of regulatory uncertainties. Attributes are presented which assist in identifying the need for timeliness and durability of the resolution of an uncertainty

  5. Dynamic Simulation, Sensitivity and Uncertainty Analysis of a Demonstration Scale Lignocellulosic Enzymatic Hydrolysis Process

    Prunescu, Remus Mihail; Sin, Gürkan

    2014-01-01

    This study presents the uncertainty and sensitivity analysis of a lignocellulosic enzymatic hydrolysis model considering both model and feed parameters as sources of uncertainty. The dynamic model is parametrized for accommodating various types of biomass, and different enzymatic complexes...

  6. Uncertainty analysis of NDA waste measurements using computer simulations

    Blackwood, L.G.; Harker, Y.D.; Yoon, W.Y.; Meachum, T.R.

    2000-01-01

    Uncertainty assessments for nondestructive radioassay (NDA) systems for nuclear waste are complicated by factors extraneous to the measurement systems themselves. Most notably, characteristics of the waste matrix (e.g., homogeneity) and radioactive source material (e.g., particle size distribution) can have great effects on measured mass values. Under these circumstances, characterizing the waste population is as important as understanding the measurement system in obtaining realistic uncertainty values. When extraneous waste characteristics affect measurement results, the uncertainty results are waste-type specific. The goal becomes to assess the expected bias and precision for the measurement of a randomly selected item from the waste population of interest. Standard propagation-of-errors methods for uncertainty analysis can be very difficult to implement in the presence of significant extraneous effects on the measurement system. An alternative approach that naturally includes the extraneous effects is as follows: (1) Draw a random sample of items from the population of interest; (2) Measure the items using the NDA system of interest; (3) Establish the true quantity being measured using a gold standard technique; and (4) Estimate bias by deriving a statistical regression model comparing the measurements on the system of interest to the gold standard values; similar regression techniques for modeling the standard deviation of the difference values gives the estimated precision. Actual implementation of this method is often impractical. For example, a true gold standard confirmation measurement may not exist. A more tractable implementation is obtained by developing numerical models for both the waste material and the measurement system. A random sample of simulated waste containers generated by the waste population model serves as input to the measurement system model. This approach has been developed and successfully applied to assessing the quantity of
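
    The regression step in the approach above lends itself to a compact illustration. The sketch below is a hypothetical stand-in, not the authors' implementation: synthetic "gold standard" masses and simulated NDA readings are regressed to estimate bias, and the scatter of the residuals is modeled to estimate precision.

```python
import numpy as np

rng = np.random.default_rng(42)
true_mass = rng.uniform(1.0, 100.0, size=200)   # simulated "gold standard" contents (g)
# Simulated NDA readings: multiplicative bias, offset, heteroscedastic noise
measured = 0.9 * true_mass + 2.0 + rng.normal(0.0, 0.05 * true_mass)

# Bias model: measured = a + b * true (b != 1 or a != 0 indicates bias)
b, a = np.polyfit(true_mass, measured, 1)
residuals = measured - (a + b * true_mass)

# Precision model: spread of the residuals as a rough linear function of true mass
sd_slope, sd_intercept = np.polyfit(true_mass, np.abs(residuals), 1)

print(f"bias model:      measured ~ {a:.2f} + {b:.3f} * true")
print(f"precision proxy: sigma(true) ~ {sd_intercept:.2f} + {sd_slope:.3f} * true")
```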

  7. Bayesian uncertainty analysis with applications to turbulence modeling

    Cheung, Sai Hung; Oliver, Todd A.; Prudencio, Ernesto E.; Prudhomme, Serge; Moser, Robert D.

    2011-01-01

    In this paper, we apply Bayesian uncertainty quantification techniques to the processes of calibrating complex mathematical models and predicting quantities of interest (QoI's) with such models. These techniques also enable the systematic comparison of competing model classes. The processes of calibration and comparison constitute the building blocks of a larger validation process, the goal of which is to accept or reject a given mathematical model for the prediction of a particular QoI for a particular scenario. In this work, we take the first step in this process by applying the methodology to the analysis of the Spalart-Allmaras turbulence model in the context of incompressible, boundary layer flows. Three competing model classes based on the Spalart-Allmaras model are formulated, calibrated against experimental data, and used to issue a prediction with quantified uncertainty. The model classes are compared in terms of their posterior probabilities and their prediction of QoI's. The model posterior probability represents the relative plausibility of a model class given the data. Thus, it incorporates the model's ability to fit experimental observations. Alternatively, comparing models using the predicted QoI connects the process to the needs of decision makers that use the results of the model. We show that by using both the model plausibility and predicted QoI, one has the opportunity to reject some model classes after calibration, before subjecting the remaining classes to additional validation challenges.
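
    The model-comparison step can be illustrated with a toy calculation: given the (log) evidence of each model class, the posterior model probabilities follow from Bayes' theorem. The log-evidence values below are made up; computing them for a real turbulence model requires the full calibration machinery described in the paper.

```python
import numpy as np

log_evidence = np.array([-152.4, -149.8, -158.1])   # hypothetical log p(data | model class)
log_prior = np.log(np.full(3, 1.0 / 3.0))           # equal prior plausibility

log_post = log_evidence + log_prior
log_post -= log_post.max()                          # stabilize before exponentiating
posterior = np.exp(log_post) / np.exp(log_post).sum()
print(posterior)                                    # relative plausibility given the data
```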

  8. Methods and computer codes for probabilistic sensitivity and uncertainty analysis

    Vaurio, J.K.

    1985-01-01

    This paper describes the methods and applications experience with two computer codes that are now available from the National Energy Software Center at Argonne National Laboratory. The purpose of the SCREEN code is to identify a group of the most important input variables of a code that has many (tens, hundreds) of input variables with uncertainties, and to do this without relying on judgment or exhaustive sensitivity studies. The purpose of the PROSA-2 code is to propagate uncertainties and calculate the distributions of interesting output variable(s) of a safety analysis code using response surface techniques, based on the same runs used for screening. Several applications are discussed, but the codes are generic, not tailored to any specific safety application code. They are compatible in terms of input/output requirements but also independent of each other; e.g., PROSA-2 can be used without first using SCREEN if a set of important input variables has first been selected by other methods. Also, although SCREEN can select cases to be run (by random sampling), a user can select cases by other methods if he so prefers, and still use the rest of SCREEN for identifying important input variables.
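
    Not the SCREEN or PROSA-2 algorithms themselves, but a minimal illustration of the shared idea: one set of code runs serves both to rank inputs and, through a fitted response surface, to propagate uncertainty cheaply. The five-input model is a hypothetical stand-in for an expensive safety analysis code.

```python
import numpy as np

rng = np.random.default_rng(0)
n_runs, n_inputs = 100, 5
X = rng.normal(size=(n_runs, n_inputs))          # sampled (standardized) inputs

def expensive_code(x):                           # stand-in for the safety analysis code
    return 3.0 * x[0] + 0.5 * x[2] + 0.1 * x[1] * x[3]

y = np.array([expensive_code(x) for x in X])

# Fit a linear response surface y ~ c0 + c @ x by least squares
A = np.column_stack([np.ones(n_runs), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print("importance ranking:", np.argsort(-np.abs(coef[1:])))   # screen by |coefficient|

# Propagate: evaluate the cheap surrogate on a large fresh sample instead of the code
X_big = rng.normal(size=(100_000, n_inputs))
y_big = coef[0] + X_big @ coef[1:]
print("output mean/std:", y_big.mean(), y_big.std())
```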

  9. Quantification of Uncertainty in the Flood Frequency Analysis

    Kasiapillai Sudalaimuthu, K.; He, J.; Swami, D.

    2017-12-01

    Flood frequency analysis (FFA) is usually carried out for the planning and design of water resources and hydraulic structures. Owing to the existence of variability in sample representation, selection of distribution and estimation of distribution parameters, the estimation of flood quantiles has always been uncertain. Hence, suitable approaches must be developed to quantify the uncertainty in the form of a prediction interval as an alternative to the deterministic approach. The framework developed in the present study to include uncertainty in FFA employs a multi-objective optimization approach to construct the prediction interval using an ensemble of flood quantiles. Through this approach, an optimal variability of distribution parameters is identified to carry out FFA. To demonstrate the proposed approach, annual maximum flow data from two gauge stations (Bow River at Calgary and Banff, Canada) are used. The major focus of the present study was to evaluate the changes in magnitude of flood quantiles due to the recent extreme flood event that occurred during the year 2013. In addition, the efficacy of the proposed method was further verified using standard bootstrap-based sampling approaches; the proposed method was found to be more reliable in modeling extreme floods than the bootstrap methods.
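
    A plain bootstrap (standing in for the balanced bootstrap used in the study) applied to a synthetic annual-maximum series sketches how such an interval for a flood quantile can be constructed:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Synthetic annual maxima (m3/s); real analyses would use gauged data
amax = stats.genextreme.rvs(c=-0.1, loc=300.0, scale=80.0, size=50, random_state=rng)

q100 = []
for _ in range(500):
    resample = rng.choice(amax, size=amax.size, replace=True)
    c, loc, scale = stats.genextreme.fit(resample)          # refit GEV to each resample
    q100.append(stats.genextreme.ppf(0.99, c, loc=loc, scale=scale))   # 100-year quantile

lo, hi = np.percentile(q100, [2.5, 97.5])
print(f"100-year flood: {np.median(q100):.0f}, 95% interval [{lo:.0f}, {hi:.0f}]")
```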

  10. Selection of Representative Models for Decision Analysis Under Uncertainty

    Meira, Luis A. A.; Coelho, Guilherme P.; Santos, Antonio Alberto S.; Schiozer, Denis J.

    2016-03-01

    The decision-making process in oil fields includes a step of risk analysis associated with the uncertainties present in the variables of the problem. Such uncertainties lead to hundreds, even thousands, of possible scenarios that must be analyzed so that an effective production strategy can be selected. Given this high number of scenarios, a technique to reduce this set to a smaller, feasible subset of representative scenarios is imperative. The selected scenarios must be representative of the original set and also free of optimistic and pessimistic bias. This paper proposes an assisted methodology to identify representative models in oil fields. To do so, first a mathematical function was developed to model the representativeness of a subset of models with respect to the full set that characterizes the problem. Then, an optimization tool was implemented to identify the representative models of any problem, considering not only the cross-plots of the main output variables, but also the risk curves and the probability distribution of the attribute-levels of the problem. The proposed technique was applied to two benchmark cases and the results, evaluated by experts in the field, indicate that the obtained solutions are richer than those identified by previously adopted manual approaches. The program bytecode is available upon request.

  11. Reduction of uncertainties in probabilistic seismic hazard analysis

    Seo, Jeong Moon; Choun, Young Sun; Choi, In Kil [Korea Atomic Energy Research Institute, Taejon (Korea)]

    1999-02-01

    An integrated research effort for the reduction of conservatism and uncertainties in PSHA in Korea was performed. The research consisted of five technical task areas, as follows. Task 1: Earthquake Catalog Development for PSHA. Task 2: Evaluation of Seismicity and Tectonics of the Korea Region. Task 3: Development of Ground Motion Relationships. Task 4: Improvement of PSHA Modelling Methodology. Task 5: Development of Seismic Source Interpretations for the Region of Korea for Inputs to PSHA. A series of tests on an ancient wooden house and an analysis of medium-size earthquakes in Korea were performed intensively. Significant improvements, especially in the estimation of historical earthquakes, ground motion attenuation, and seismic source interpretations, were made through this study. 314 refs., 180 figs., 54 tabs. (Author)

  12. Computational Methods for Sensitivity and Uncertainty Analysis in Criticality Safety

    Broadhead, B.L.; Childs, R.L.; Rearden, B.T.

    1999-01-01

    Interest in the sensitivity methods that were developed and widely used in the 1970s (the FORSS methodology at ORNL among others) has increased recently as a result of potential use in the area of criticality safety data validation procedures to define computational bias, uncertainties and area(s) of applicability. Functional forms of the resulting sensitivity coefficients can be used as formal parameters in the determination of applicability of benchmark experiments to their corresponding industrial application areas. In order for these techniques to be generally useful to the criticality safety practitioner, the procedures governing their use had to be updated and simplified. This paper will describe the resulting sensitivity analysis tools that have been generated for potential use by the criticality safety community

  13. Reducing the uncertainty in robotic machining by modal analysis

    Alberdi, Iñigo; Pelegay, Jose Angel; Arrazola, Pedro Jose; Ørskov, Klaus Bonde

    2017-10-01

    The use of industrial robots for machining could lead to high cost and energy savings for the manufacturing industry. Machining robots offer several advantages with respect to CNC machines, such as flexibility, a wide working space, adaptability and relatively low cost. However, there are some drawbacks that are preventing a widespread adoption of robotic solutions, namely lower stiffness, vibration/chatter problems and lower accuracy and repeatability. Normally, due to these issues, conservative cutting parameters are chosen, resulting in a low material removal rate (MRR). In this article, an example of a modal analysis of a robot is presented. For that purpose the tap-testing technology is introduced, which aims at maximizing productivity, reducing the uncertainty in the selection of cutting parameters and offering a stable process free from chatter vibrations.

  14. Parameter uncertainty effects on variance-based sensitivity analysis

    Yu, W.; Harris, T.J.

    2009-01-01

    In the past several years there has been considerable commercial and academic interest in methods for variance-based sensitivity analysis. The industrial focus is motivated by the importance of attributing variance contributions to input factors. A more complete understanding of these relationships enables companies to achieve goals related to quality, safety and asset utilization. In a number of applications, it is possible to distinguish between two types of input variables: regressive variables and model parameters. Regressive variables are those that can be influenced by process design or by a control strategy. With model parameters, there are typically no opportunities to directly influence their variability. In this paper, we propose a new method to perform sensitivity analysis through a partitioning of the input variables into these two groupings: regressive variables and model parameters. A sequential analysis is proposed, where first a sensitivity analysis is performed with respect to the regressive variables. In the second step, the uncertainty effects arising from the model parameters are included. This strategy can be quite useful in understanding process variability and in developing strategies to reduce overall variability. When this method is used for nonlinear models which are linear in the parameters, analytical solutions can be utilized. In the more general case of models that are nonlinear in both the regressive variables and the parameters, either first-order approximations can be used, or numerically intensive methods must be used.
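
    The grouped, variance-based index at the heart of this partitioning can be estimated with a standard pick-freeze Monte Carlo scheme. The sketch below uses an illustrative model with two regressive variables and two uncertain parameters, not one of the paper's examples:

```python
import numpy as np

rng = np.random.default_rng(7)
N = 200_000

def model(xr, theta):
    # xr: regressive variables (2 columns), theta: uncertain model parameters (2 columns)
    return theta[:, 0] * xr[:, 0] + np.sin(xr[:, 1]) + 0.3 * theta[:, 1] ** 2

def sample(n):
    return rng.normal(size=(n, 2)), rng.normal(1.0, 0.1, size=(n, 2))

xr_a, th_a = sample(N)
xr_b, th_b = sample(N)

y_a = model(xr_a, th_a)
y_c = model(xr_a, th_b)     # "freeze" the regressive variables, redraw the parameters

# First-order (closed) index of the regressive variables as a group
S_regressive = (np.mean(y_a * y_c) - np.mean(y_a) * np.mean(y_c)) / np.var(y_a)
print(f"first-order index of regressive variables: {S_regressive:.2f}")
```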

  15. Uncertainty importance analysis using parametric moment ratio functions.

    Wei, Pengfei; Lu, Zhenzhou; Song, Jingwen

    2014-02-01

    This article presents a new importance analysis framework, called the parametric moment ratio function, for measuring the reduction of model output uncertainty when the distribution parameters of inputs are changed; the emphasis is put on the mean and variance ratio functions with respect to the variances of model inputs. The proposed concepts efficiently guide the analyst to achieve a targeted reduction in the model output mean and variance by operating on the variances of model inputs. Unbiased and progressive unbiased Monte Carlo estimators are also derived for the parametric mean and variance ratio functions, respectively. Only a single set of samples is needed to implement the proposed importance analysis with the proposed estimators, and thus the computational cost is independent of the input dimensionality. An analytical test example with highly nonlinear behavior is introduced to illustrate the engineering significance of the proposed importance analysis technique and to verify the efficiency and convergence of the derived Monte Carlo estimators. Finally, the moment ratio function is applied to a planar 10-bar structure for achieving a targeted 50% reduction of the model output variance. © 2013 Society for Risk Analysis.

  16. Uncertainty Analysis of the NASA Glenn 8x6 Supersonic Wind Tunnel

    Stephens, Julia; Hubbard, Erin; Walter, Joel; McElroy, Tyler

    2016-01-01

    This paper presents methods and results of a detailed measurement uncertainty analysis that was performed for the 8- by 6-foot Supersonic Wind Tunnel located at the NASA Glenn Research Center. The statistical methods and engineering judgments used to estimate elemental uncertainties are described. The Monte Carlo method of propagating uncertainty was selected to determine the uncertainty of calculated variables of interest. A detailed description of the Monte Carlo method as applied for this analysis is provided. Detailed uncertainty results for the uncertainty in average free stream Mach number as well as other variables of interest are provided. All results are presented as random (variation in observed values about a true value), systematic (potential offset between observed and true value), and total (random and systematic combined) uncertainty. The largest sources contributing to uncertainty are determined and potential improvement opportunities for the facility are investigated.
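
    The Monte Carlo propagation step can be illustrated in a few lines. The sketch below is not the facility's data reduction: it propagates made-up random and systematic pressure uncertainties through the standard isentropic Mach-from-pressure-ratio relation for gamma = 1.4.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1_000_000

# Elemental uncertainties: independent random and systematic terms (made-up magnitudes)
p0 = 101_325.0 + rng.normal(0.0, 150.0, n) + rng.normal(0.0, 100.0, n)  # total pressure (Pa)
ps = 19_400.0 + rng.normal(0.0, 80.0, n) + rng.normal(0.0, 60.0, n)     # static pressure (Pa)

mach = np.sqrt(5.0 * ((p0 / ps) ** (2.0 / 7.0) - 1.0))   # isentropic relation, gamma = 1.4
print(f"Mach = {mach.mean():.4f} +/- {mach.std():.4f} (combined, 1-sigma)")
```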

  17. Uncertainty analysis of a low flow model for the Rhine River

    Demirel, M.C.; Booij, Martijn J.

    2011-01-01

    It is widely recognized that hydrological models are subject to parameter uncertainty. However, little attention has been paid so far to the uncertainty in parameters of the data-driven models like weights in neural networks. This study aims at applying a structured uncertainty analysis to a

  18. Quality in environmental science for policy: assessing uncertainty as a component of policy analysis

    Maxim, L.; van der Sluijs, J.P.

    2011-01-01

    The sheer number of attempts to define and classify uncertainty reveals an awareness of its importance in environmental science for policy, though the nature of uncertainty is often misunderstood. The interdisciplinary field of uncertainty analysis is unstable; there are currently several incomplete

  19. Uncertainty Analysis of In-leakage Test for Pressurized Control Room Envelope

    Lee, J. B. [KHNP Central Research Institute, Daejeon (Korea, Republic of)]

    2013-10-15

    In-leakage tests for control room envelopes (CREs) of newly constructed nuclear power plants are required to prove control room habitability. Results of the in-leakage tests should be analyzed using an uncertainty analysis. Test uncertainty can be an issue if the test results for pressurized CREs show low in-leakage. To provide a better knowledge of the test uncertainty, a statistical model for the uncertainty analysis is described here and a representative uncertainty analysis of a sample in-leakage test is presented. By using the statistical method we can evaluate the test result at a certain level of significance. This method can be especially helpful when the difference between the two mean values of the test result is small.

  1. Regional Frequency and Uncertainty Analysis of Extreme Precipitation in Bangladesh

    Mortuza, M. R.; Demissie, Y.; Li, H. Y.

    2014-12-01

    The increased frequency of extreme precipitation events, especially those of multiday duration, is responsible for recent urban floods and associated significant losses of lives and infrastructure in Bangladesh. Reliable and routinely updated estimation of the frequency of occurrence of such extreme precipitation events is thus important for developing up-to-date hydraulic structures and stormwater drainage systems that can effectively minimize future risk from similar events. In this study, we have updated the intensity-duration-frequency (IDF) curves for Bangladesh using daily precipitation data from 1961 to 2010 and quantified the associated uncertainties. Regional frequency analysis based on L-moments is applied to 1-day, 2-day and 5-day annual maximum precipitation series due to its advantages over at-site estimation. The regional frequency approach pools the information from climatologically similar sites to make reliable estimates of quantiles, given that the pooling group is homogeneous and of reasonable size. We have used the region of influence (ROI) approach along with a homogeneity measure based on L-moments to identify the homogeneous pooling groups for each site. Five 3-parameter distributions (i.e., Generalized Logistic, Generalized Extreme Value, Generalized Normal, Pearson Type Three, and Generalized Pareto) are used for a thorough selection of appropriate models that fit the sample data. Uncertainties related to the selection of the distributions and historical data are quantified using the Bayesian Model Averaging and balanced bootstrap approaches, respectively. The results from this study can be used to update the current design and management of hydraulic structures as well as in exploring spatio-temporal variations of extreme precipitation and associated risk.
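
    The building blocks of such an analysis are the sample L-moments, computed from probability-weighted moments of the ordered sample. A sketch with synthetic annual maxima (the estimator formulas are standard; the data are made up):

```python
import numpy as np

def sample_lmoments(x):
    """First three sample L-moments via probability-weighted moments."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    i = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((i - 1) / (n - 1) * x) / n
    b2 = np.sum((i - 1) * (i - 2) / ((n - 1) * (n - 2)) * x) / n
    l1, l2, l3 = b0, 2 * b1 - b0, 6 * b2 - 6 * b1 + b0
    return l1, l2, l2 / l1, l3 / l2     # mean, L-scale, L-CV, L-skewness

rng = np.random.default_rng(5)
annual_max_precip = rng.gumbel(100.0, 25.0, size=50)   # synthetic 1-day annual maxima (mm)
print(sample_lmoments(annual_max_precip))
```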

  2. Best-estimate analysis and decision making under uncertainty

    Orechwa, Y.

    2004-01-01

    In many engineering analyses of system safety, the traditional reliance on conservative evaluation model calculations is being replaced with so-called best-estimate analysis. These best-estimate analyses differentiate themselves from the traditional conservative analyses through two ingredients, namely realistic models and an account of the residual uncertainty associated with the model calculations. Best-estimate analysis, in the context of this paper, refers to the numerical evaluation of system properties of interest in situations where direct confirmatory measurements are not feasible. A decision with regard to the safety of the system is then made based on the computed numerical values of the system properties of interest. These situations generally arise in the design of systems that require computed and generally nontrivial extrapolations from the available data. In the case of nuclear reactors, examples are the criticality of spent fuel pools, neutronic parameters of new advanced designs where insufficient material is available for mockup critical experiments, and the large break loss-of-coolant accident (LOCA). In this paper the case of the LOCA is taken to discuss best-estimate analysis and decision making. Central to decision making is information. Thus, of interest are the source, quantity and quality of the information obtained in a best-estimate analysis, and used to define the acceptance criteria and to formulate a decision rule. This in effect expands the problem from the calculation of a conservative margin to a predefined acceptance criterion, to the formulation of a consistent decision rule and the computation of a test statistic for application of the decision rule. The latter view is a necessary condition for developing risk-informed decision rules, and, thus, the relation between design basis analysis criteria and probabilistic risk assessment criteria is key. The discussion is in the context of making a decision under uncertainty for a reactor

  3. Cassini Spacecraft Uncertainty Analysis Data and Methodology Review and Update/Volume 1: Updated Parameter Uncertainty Models for the Consequence Analysis

    WHEELER, TIMOTHY A.; WYSS, GREGORY D.; HARPER, FREDERICK T.

    2000-11-01

    Uncertainty distributions for specific parameters of the Cassini General Purpose Heat Source Radioisotope Thermoelectric Generator (GPHS-RTG) Final Safety Analysis Report consequence risk analysis were revised and updated. The revisions and updates were done for all consequence parameters for which relevant information exists from the joint project on Probabilistic Accident Consequence Uncertainty Analysis by the United States Nuclear Regulatory Commission and the Commission of European Communities.

  4. Incorporating parametric uncertainty into population viability analysis models

    McGowan, Conor P.; Runge, Michael C.; Larson, Michael A.

    2011-01-01

    Uncertainty in parameter estimates from sampling variation or expert judgment can introduce substantial uncertainty into ecological predictions based on those estimates. However, in standard population viability analyses, one of the most widely used tools for managing plant, fish and wildlife populations, parametric uncertainty is often ignored in or discarded from model projections. We present a method for explicitly incorporating this source of uncertainty into population models to fully account for risk in management and decision contexts. Our method involves a two-step simulation process where parametric uncertainty is incorporated into the replication loop of the model and temporal variance is incorporated into the loop for time steps in the model. Using the piping plover, a federally threatened shorebird in the USA and Canada, as an example, we compare abundance projections and extinction probabilities from simulations that exclude and include parametric uncertainty. Although final abundance was very low for all sets of simulations, estimated extinction risk was much greater for the simulation that incorporated parametric uncertainty in the replication loop. Decisions about species conservation (e.g., listing, delisting, and jeopardy) might differ greatly depending on the treatment of parametric uncertainty in population models.
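
    A minimal sketch of the two-loop structure described above, with illustrative vital rates rather than the piping plover parameter estimates: parametric uncertainty is drawn once per replicate in the outer loop, and temporal (environmental) variance is drawn at each time step in the inner loop.

```python
import numpy as np

rng = np.random.default_rng(11)
n_reps, n_years, n0 = 5000, 50, 200

extinct = 0
for _ in range(n_reps):                  # replication loop: draw uncertain parameters once
    mean_growth = rng.normal(0.98, 0.03) # parametric uncertainty in mean growth rate
    n = float(n0)
    for _ in range(n_years):             # time loop: temporal (environmental) variance
        n *= rng.normal(mean_growth, 0.10)
        if n < 2:                        # quasi-extinction threshold
            extinct += 1
            break

print(f"extinction probability: {extinct / n_reps:.3f}")
```

    Dropping the outer draw (fixing mean_growth at 0.98) reproduces the common practice the authors criticize and yields a visibly lower extinction risk.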

  5. Investments in technology subject to uncertainty. Analysis and policy

    Pedersen, Jørgen Lindgaard

    1997-01-01

    Investments in technology are today of such a magnitude that they matter. The paper raises three important questions: first, in which sense technological uncertainty can be said to be a problem; second, strategies for diminishing technological uncertainties; and third, policy...

  6. On the principled assignment of probabilities for uncertainty analysis

    Unwin, S.D.; Cook, I.

    1986-01-01

    The authors sympathize with those who raise the questions of inscrutability and over-precision in connection with probabilistic techniques as currently implemented in nuclear PRA. This inscrutability also renders the probabilistic approach, as practiced, open to abuse. They believe that the appropriate remedy is not the discarding of the probabilistic representation of uncertainty in favour of a more simply structured, but logically inconsistent approach such as that of bounding analysis. This would be like forbidding the use of arithmetic in order to prevent the issuing of fraudulent company prospectuses. The remedy, in this analogy, is the enforcement of accounting standards for the valuation of inventory, rates of depreciation etc. They require an analogue of such standards in the PRA domain. What is needed is not the interdiction of probabilistic judgment, but the interdiction of private, inscrutable judgment. Some principles may be conventional in character, as are certain accounting principles. They expound a set of controlling principles which they suggest should govern the formulation of probabilities in nuclear risk analysis. A fuller derivation and consideration of these principles can be found

  7. Uncertainty Analysis of RBMK-Related Experimental Data

    Urbonas, Rolandas; Kaliatka, Algirdas; Liaukonis, Mindaugas

    2002-01-01

    An attempt to validate the state-of-the-art thermal hydraulic code ATHLET (GRS, Germany) on the basis of the E-108 test facility was made. Originally this code was developed and validated for reactor types other than RBMK. Since state-of-the-art thermal hydraulic codes are widely used for simulation of RBMK reactors, further code implementation and validation is required. The phenomena associated with channel-type flow instabilities and CHF were found to be an important step in the frame of the overall effort of state-of-the-art code validation and application for RBMK reactors. In the paper a one-channel approach analysis is presented; thus, the oscillatory behaviour of the system was not detected. The results show dependence on the nodalization used in the heated channels, the initial and boundary conditions, and the code's selected models. It is shown that the code is able to predict a sudden heat structure temperature excursion when the critical heat flux is approached. The GRS-developed uncertainty and sensitivity methodology was employed in the analysis. (authors)

  8. The uncertainty in physical measurements an introduction to data analysis in the physics laboratory

    Fornasini, Paolo

    2008-01-01

    All measurements of physical quantities are affected by uncertainty. Understanding the origin of uncertainty, evaluating its extent and suitably taking it into account in data analysis is essential for assessing the degree of accuracy of phenomenological relationships and physical laws in both scientific research and technological applications. The Uncertainty in Physical Measurements: An Introduction to Data Analysis in the Physics Laboratory presents an introduction to uncertainty and to some of the most common procedures of data analysis. This book will serve the reader well by filling the gap between tutorial textbooks and highly specialized monographs. The book is divided into three parts. The first part is a phenomenological introduction to measurement and uncertainty: properties of instruments, different causes and corresponding expressions of uncertainty, histograms and distributions, and unified expression of uncertainty. The second part contains an introduction to probability theory, random variable...

  9. Probabilistic accident consequence uncertainty analysis: Food chain uncertainty assessment. Volume 2: Appendices

    Brown, J. [National Radiological Protection Board (United Kingdom); Goossens, L.H.J.; Kraan, B.C.P. [Delft Univ. of Technology (Netherlands)] [and others]

    1997-06-01

    This volume is the second of a two-volume document that summarizes a joint project by the US Nuclear Regulatory Commission and the Commission of European Communities to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This two-volume report, which examines mechanisms and uncertainties of transfer through the food chain, is the first in a series of five such reports. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain transfer that affect calculations of offsite radiological consequences. Seven of the experts reported on transfer into the food chain through soil and plants, nine reported on transfer via food products from animals, and two reported on both. The expert judgment elicitation procedure and its outcomes are described in these volumes. This volume contains seven appendices. Appendix A presents a brief discussion of the MACCS and COSYMA model codes. Appendix B is the structure document and elicitation questionnaire for the expert panel on soils and plants. Appendix C presents the rationales and responses of each of the members of the soils and plants expert panel. Appendix D is the structure document and elicitation questionnaire for the expert panel on animal transfer. The rationales and responses of each of the experts on animal transfer are given in Appendix E. Brief biographies of the food chain expert panel members are provided in Appendix F. Aggregated results of expert responses are presented in graph format in Appendix G.

  11. Use of probabilistic methods for analysis of cost and duration uncertainties in a decision analysis framework

    Boak, D.M.; Painton, L.

    1995-01-01

    Probabilistic forecasting techniques have been used in many risk assessment and performance assessment applications on radioactive waste disposal projects such as Yucca Mountain and the Waste Isolation Pilot Plant (WIPP). Probabilistic techniques such as Monte Carlo and Latin Hypercube sampling methods are routinely used to treat uncertainties in physical parameters important in simulating radionuclide transport in a coupled geohydrologic system and assessing the ability of that system to comply with regulatory release limits. However, the use of probabilistic techniques in the treatment of uncertainties in the cost and duration of programmatic alternatives on risk and performance assessment projects is less common. Where significant uncertainties exist and where programmatic decisions must be made despite existing uncertainties, probabilistic techniques may yield important insights into decision options, especially when used in a decision analysis framework and when properly balanced with deterministic analyses. For relatively simple evaluations, these types of probabilistic evaluations can be made using personal computer-based software
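
    As an illustration of the sampling techniques mentioned above, the sketch below draws a Latin Hypercube design for two programmatic variables and propagates it through a toy cost model; the triangular distributions, their bounds and the escalation model are all hypothetical.

```python
import numpy as np
from scipy import stats
from scipy.stats import qmc

sampler = qmc.LatinHypercube(d=2, seed=13)
u = sampler.random(n=10_000)                                 # stratified uniform [0,1) design

# Map uniforms to cost (M$) and duration (years) via inverse CDFs
cost = stats.triang.ppf(u[:, 0], c=0.3, loc=50, scale=100)   # min 50, mode 80, max 150
years = stats.triang.ppf(u[:, 1], c=0.5, loc=3, scale=4)     # min 3, mode 5, max 7

total = cost * 1.05 ** years                                 # toy escalation model
print(f"P50 = {np.percentile(total, 50):.0f} M$, P90 = {np.percentile(total, 90):.0f} M$")
```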

  12. Fukushima Daiichi unit 1 uncertainty analysis--Preliminary selection of uncertain parameters and analysis methodology

    Cardoni, Jeffrey N.; Kalinich, Donald A.

    2014-02-01

    Sandia National Laboratories (SNL) plans to conduct uncertainty analyses (UA) on the Fukushima Daiichi unit 1 (1F1) plant with the MELCOR code. The model to be used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and the Nuclear Regulatory Commission (NRC). However, that study only examined a handful of model inputs and boundary conditions, and the predictions yielded only fair agreement with plant data and current release estimates. The goal of this uncertainty study is to perform a focused evaluation of uncertainty in core melt progression behavior and its effect on key figures-of-merit (e.g., hydrogen production, vessel lower head failure, etc.). In preparation for the SNL Fukushima UA work, a scoping study has been completed to identify important core melt progression parameters for the uncertainty analysis. The study also lays out a preliminary UA methodology.

  13. Report of a CSNI workshop on uncertainty analysis methods. Volume 1 + 2

    Wickett, A.J.; Yadigaroglu, G.

    1994-08-01

    The OECD NEA CSNI Principal Working Group 2 (PWG2) Task Group on Thermal Hydraulic System Behaviour (TGTHSB) has, in recent years, received presentations of a variety of different methods to analyze the uncertainty in the calculations of advanced unbiased (best estimate) codes. Proposals were also made for an International Standard Problem (ISP) to compare the uncertainty analysis methods. The objectives of the Workshop were to discuss and fully understand the principles of uncertainty analysis relevant to LOCA modelling and like problems, to examine the underlying issues from first principles in preference to comparing and contrasting the currently proposed methods, to reach consensus on the issues identified as far as possible while not avoiding the controversial aspects, to identify as clearly as possible unreconciled differences, and to issue a Status Report. Eight uncertainty analysis methods were presented. A structured discussion of various aspects of uncertainty analysis followed: the need for uncertainty analysis, identification and ranking of uncertainties, characterisation, quantification and combination of uncertainties, and applications, resources and future developments. As a result, the objectives set out above were, to a very large extent, achieved. Plans for the ISP were also discussed. Volume 1 contains a record of the discussions on uncertainty methods. Volume 2 is a compilation of descriptions of the eight uncertainty analysis methods presented at the workshop.

  14. Code development for eigenvalue total sensitivity analysis and total uncertainty analysis

    Wan, Chenghui; Cao, Liangzhi; Wu, Hongchun; Zu, Tiejun; Shen, Wei

    2015-01-01

    Highlights: • We develop a new code for total sensitivity and uncertainty analysis. • The implicit effects of cross sections can be considered. • The results of our code agree well with TSUNAMI-1D. • Detailed analysis of the origins of implicit effects is performed. - Abstract: The uncertainties of multigroup cross sections notably impact the eigenvalue of the neutron-transport equation. We report on a total sensitivity analysis and total uncertainty analysis code named UNICORN that has been developed by applying the direct numerical perturbation method and the statistical sampling method. In order to consider the contributions of various basic cross sections and the implicit effects, which are indirect results of multigroup cross sections through the resonance self-shielding calculation, an improved multigroup cross-section perturbation model is developed. The DRAGON 4.0 code, with the WIMSD-4 format library, is used by UNICORN to carry out the resonance self-shielding and neutron-transport calculations. In addition, the bootstrap technique has been applied to the statistical sampling method in UNICORN to obtain steadier and more reliable uncertainty results. The UNICORN code has been verified against TSUNAMI-1D by analyzing the case of a TMI-1 pin-cell. The numerical results show that the total uncertainty of the eigenvalue caused by cross sections can reach about 0.72%. Therefore the contributions of the basic cross sections and their implicit effects are not negligible

  15. Bayesian models for comparative analysis integrating phylogenetic uncertainty

    Villemereuil Pierre de

    2012-06-01

    Background: Uncertainty in comparative analyses can come from at least two sources: (a) phylogenetic uncertainty in the tree topology or branch lengths, and (b) uncertainty due to intraspecific variation in trait values, either due to measurement error or natural individual variation. Most phylogenetic comparative methods do not account for such uncertainties. Not accounting for these sources of uncertainty leads to false perceptions of precision (confidence intervals will be too narrow) and inflated significance in hypothesis testing (e.g. p-values will be too small). Although there is some application-specific software for fitting Bayesian models accounting for phylogenetic error, more general and flexible software is desirable. Methods: We developed models to directly incorporate phylogenetic uncertainty into a range of analyses that biologists commonly perform, using a Bayesian framework and Markov Chain Monte Carlo analyses. Results: We demonstrate applications in linear regression, quantification of phylogenetic signal, and measurement error models. Phylogenetic uncertainty was incorporated by applying a prior distribution for the phylogeny, where this distribution consisted of the posterior tree sets from Bayesian phylogenetic tree estimation programs. The models were analysed using simulated data sets, and applied to a real data set on plant traits, from rainforest plant species in Northern Australia. Analyses were performed using the free and open source software OpenBUGS and JAGS. Conclusions: Incorporating phylogenetic uncertainty through an empirical prior distribution of trees leads to more precise estimation of regression model parameters than using a single consensus tree and enables a more realistic estimation of confidence intervals. In addition, models incorporating measurement errors and/or individual variation, in one or both variables, are easily formulated in the Bayesian framework. We show that BUGS is a useful, flexible

  17. Uncertainty Analysis of Multi-Model Flood Forecasts

    Erich J. Plate

    2015-12-01

    This paper demonstrates, by means of a systematic uncertainty analysis, that the use of outputs from more than one model can significantly improve conditional forecasts of discharges or water stages, provided the models are structurally different. Discharge forecasts from two models and the actual forecasted discharge are assumed to form a three-dimensional joint probability density distribution (jpdf), calibrated on long time series of data. The jpdf is decomposed into conditional probability density distributions (cpdfs) by means of Bayes' formula, as suggested and explored by Krzysztofowicz in a series of papers. In this paper his approach is simplified to optimize conditional forecasts for any set of two forecast models. Its application is demonstrated by means of models developed in a study of flood forecasting for station Stung Treng on the middle reach of the Mekong River in South-East Asia. Four different forecast models were used and pairwise combined: forecast with no model, with a persistence model, with a regression model, and with a rainfall-runoff model. Working with cpdfs requires determining the dependency among variables, for which linear regressions are used, as was done by Krzysztofowicz. His Bayesian approach, based on transforming observed probability distributions of discharges and forecasts into normal distributions, is also explored. Results obtained with his method for normal prior and likelihood distributions are identical to results from direct multiple regressions. Furthermore, it is shown that in the present case forecast accuracy is only marginally improved if Weibull-distributed basic data are converted into normally distributed variables.
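
    Under joint normality, the conditional forecast given two structurally different models reduces to a multiple linear regression, which is the simplification exploited above. A synthetic sketch (the Mekong models are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(17)
n = 1000
truth = rng.normal(10_000, 2_000, n)     # observed discharge (m3/s), synthetic
f1 = truth + rng.normal(0, 800, n)       # e.g. a regression-model forecast
f2 = truth + rng.normal(0, 1200, n)      # e.g. a rainfall-runoff forecast

# Conditional mean E[truth | f1, f2] estimated by multiple linear regression
A = np.column_stack([np.ones(n), f1, f2])
coef, *_ = np.linalg.lstsq(A, truth, rcond=None)
combined = A @ coef

for name, f in [("model 1", f1), ("model 2", f2), ("combined", combined)]:
    print(name, "RMSE:", round(float(np.sqrt(np.mean((f - truth) ** 2))), 1))
```

    Because the two error structures differ, the combined forecast has a lower RMSE than either model alone, which is the paper's central point.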

  18. Sequencing Infrastructure Investments under Deep Uncertainty Using Real Options Analysis

    Nishtha Manocha

    2018-02-01

    The adaptation tipping point and adaptation pathway approaches developed to make decisions under deep uncertainty do not shed light on which among the multiple available pathways should be chosen as the preferred pathway. This creates the need to extend these approaches by means of suitable tools that can help sequence actions and subsequently enable the outlining of relevant policies. This paper presents two sequencing approaches, namely the "Build to Target" and "Build Up" approaches, to aid in sub-selecting a set of preferred pathways. The approaches differ in the levels of flexibility they offer. They are exemplified by means of two case studies wherein Net Present Valuation and Real Options Analysis are employed as selection criteria. The results demonstrate the benefit of these two approaches when used in conjunction with the adaptation pathways and show how the pathways selected by means of the Build to Target approach generally have a value greater than, or at least the same as, the pathways selected by the Build Up approach. Further, this paper also demonstrates the capacity of Real Options to quantify and capture the economic value of flexibility, which cannot be done by traditional valuation approaches such as Net Present Valuation.

  19. Application of status uncertainty analysis methods for AP1000 LBLOCA calculation

    Zhang Shunxiang; Liang Guoxing

    2012-01-01

    Parameter uncertainty analysis is developed by using a reasonable method to establish the response relations between input parameter uncertainties and output uncertainties. The application of parameter uncertainty analysis makes the simulation of the plant state more accurate and improves the plant economy with reasonable safety assurance. The AP1000 LBLOCA was analyzed in this paper, and the results indicate that the random sampling statistical analysis method, the sensitivity analysis numerical method and the traditional error propagation analysis method can provide quite large peak cladding temperature (PCT) safety margins, which is very helpful for choosing a suitable uncertainty analysis method to improve the plant economy. Additionally, the random sampling statistical analysis method, applying mathematical statistics theory, yields the largest safety margin due to the reduction of conservatism. Compared with the traditional conservative bounding parameter analysis method, the random sampling method can provide a PCT margin of 100 K, while the other two methods can only provide 50-60 K. (authors)
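
    The sample-size arithmetic that underlies this family of random sampling statistical methods is Wilks' formula for a one-sided nonparametric tolerance limit; a minimal sketch:

```python
import math

def wilks_n(coverage=0.95, confidence=0.95):
    """Smallest n with 1 - coverage**n >= confidence (first-order, one-sided)."""
    return math.ceil(math.log(1.0 - confidence) / math.log(coverage))

print(wilks_n())   # 59 code runs for the classic 95/95 statement
```

    With 59 random code runs, the highest computed PCT bounds the 95th percentile of the PCT population at 95% confidence, which is the statistical basis on which such margins are compared to the acceptance criterion.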

  20. Uncertainty analysis of time-dependent nonlinear systems: theory and application to transient thermal hydraulics

    Barhen, J.; Bjerke, M.A.; Cacuci, D.G.; Mullins, C.B.; Wagschal, G.G.

    1982-01-01

    An advanced methodology for performing systematic uncertainty analysis of time-dependent nonlinear systems is presented. This methodology includes a capability for reducing uncertainties in system parameters and responses by using Bayesian inference techniques to consistently combine prior knowledge with additional experimental information. The determination of best estimates for the system parameters, for the responses, and for their respective covariances is treated as a time-dependent constrained minimization problem. Three alternative formalisms for solving this problem are developed. The two "off-line" formalisms, with and without "foresight" characteristics, require the generation of a complete sensitivity data base prior to performing the uncertainty analysis. The "online" formalism, in which uncertainty analysis is performed interactively with the system analysis code, is best suited for treatment of large-scale highly nonlinear time-dependent problems. This methodology is applied to the uncertainty analysis of a transient upflow of a high pressure water heat transfer experiment. For comparison, an uncertainty analysis using sensitivities computed by standard response surface techniques is also performed. The results of the analysis indicate the following. Major reduction of the discrepancies in the calculation/experiment ratios is achieved by using the new methodology. Incorporation of in-bundle measurements in the uncertainty analysis significantly reduces system uncertainties. Accuracy of sensitivities generated by response-surface techniques should be carefully assessed prior to using them as a basis for uncertainty analyses of transient reactor safety problems

  1. A structured analysis of uncertainty surrounding modeled impacts of groundwater-extraction rules

    Guillaume, Joseph H. A.; Qureshi, M. Ejaz; Jakeman, Anthony J.

    2012-08-01

    Integrating economic and groundwater models for groundwater-management can help improve understanding of trade-offs involved between conflicting socioeconomic and biophysical objectives. However, there is significant uncertainty in most strategic decision-making situations, including in the models constructed to represent them. If not addressed, this uncertainty may be used to challenge the legitimacy of the models and decisions made using them. In this context, a preliminary uncertainty analysis was conducted of a dynamic coupled economic-groundwater model aimed at assessing groundwater extraction rules. The analysis demonstrates how a variety of uncertainties in such a model can be addressed. A number of methods are used including propagation of scenarios and bounds on parameters, multiple models, block bootstrap time-series sampling and robust linear regression for model calibration. These methods are described within the context of a theoretical uncertainty management framework, using a set of fundamental uncertainty management tasks and an uncertainty typology.
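
    Of the methods listed, the block bootstrap is the most compact to illustrate. The sketch below resamples contiguous blocks of a synthetic autocorrelated series so that serial dependence is approximately preserved; the series and block length are arbitrary choices, not those of the study.

```python
import numpy as np

def block_bootstrap(series, block_len, rng):
    """Moving-block bootstrap: rebuild a series from randomly placed blocks."""
    series = np.asarray(series)
    n = series.size
    n_blocks = int(np.ceil(n / block_len))
    starts = rng.integers(0, n - block_len + 1, size=n_blocks)
    blocks = [series[s:s + block_len] for s in starts]
    return np.concatenate(blocks)[:n]

rng = np.random.default_rng(23)
recharge = 50.0 + 0.1 * np.cumsum(rng.normal(0, 1, 300))   # synthetic autocorrelated series
replicates = [block_bootstrap(recharge, block_len=24, rng=rng).mean() for _ in range(1000)]
print("mean recharge 90% interval:", np.percentile(replicates, [5, 95]))
```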

  2. Uncertainty analysis of thermal quantities measurement in a centrifugal compressor

    Hurda, Lukáš; Matas, Richard

    2017-09-01

    The compressor performance characteristics evaluation process, based on the measurement of pressure, temperature and other quantities, is examined to find uncertainties for directly measured and derived quantities. CFD is used as a tool to quantify the influences of different sources of measurement uncertainty for single- and multi-thermocouple total temperature probes. The heat conduction through the body of the thermocouple probe and the heat-up of the air in the intake piping are the main phenomena of interest.

  3. Analysis of Uncertainty in Dynamic Processes Development of Banks Functioning

    Aleksei V. Korovyakovskii

    2013-01-01

    The paper offers an approach to estimating the measure of uncertainty in the dynamic processes of bank functioning, using statistical data on indicators of different banking operations. To calculate the measure of uncertainty in the dynamic processes of bank functioning, the phase images of the relevant sets of statistical data are considered. It is shown that the form of the phase image of the studied sets of statistical data can serve as a basis for estimating the measure of uncertainty in the dynamic processes of bank functioning. A set of analytical characteristics is offered to formalize the definition of the form of the phase image of the studied sets of statistical data. It is shown that the offered analytical characteristics capture the inequality of changes in the values of the studied sets of statistical data, which is one of the ways uncertainty manifests itself in the development of dynamic processes. Invariant estimates of the measure of uncertainty in the dynamic processes of bank functioning, accounting for significant differences in the absolute values of the same indicators across different banks, were obtained. Examples of the calculation of the measure of uncertainty in the dynamic processes of the functioning of specific banks are cited.

  4. Responses to clinical uncertainty in Australian general practice trainees: a cross-sectional analysis.

    Cooke, Georga; Tapley, Amanda; Holliday, Elizabeth; Morgan, Simon; Henderson, Kim; Ball, Jean; van Driel, Mieke; Spike, Neil; Kerr, Rohan; Magin, Parker

    2017-12-01

    Tolerance for ambiguity is essential for optimal learning and professional competence. General practice trainees must be, or must learn to be, adept at managing clinical uncertainty. However, few studies have examined associations of intolerance of uncertainty in this group. The aim of this study was to establish levels of tolerance of uncertainty in Australian general practice trainees and associations of uncertainty with demographic, educational and training practice factors. A cross-sectional analysis was performed on the Registrar Clinical Encounters in Training (ReCEnT) project, an ongoing multi-site cohort study. Scores on three of the four independent subscales of the Physicians' Reaction to Uncertainty (PRU) instrument were analysed as outcome variables in linear regression models with trainee and practice factors as independent variables. A total of 594 trainees contributed data on a total of 1209 occasions. Trainees in earlier training terms had higher scores for 'Anxiety due to uncertainty', 'Concern about bad outcomes' and 'Reluctance to disclose diagnosis/treatment uncertainty to patients'. Beyond this, findings suggest two distinct sets of associations regarding reaction to uncertainty. Firstly, affective aspects of uncertainty (the 'Anxiety' and 'Concern' subscales) were associated with female gender, less experience in hospital prior to commencing general practice training, and graduation overseas. Secondly, a maladaptive response to uncertainty (the 'Reluctance to disclose' subscale) was associated with urban practice, health qualifications prior to studying medicine, practice in an area of higher socio-economic status, and being Australian-trained. This study has established levels of three measures of trainees' responses to uncertainty and associations with these responses. The current findings suggest differing 'phenotypes' of trainees with high 'affective' responses to uncertainty and those reluctant to disclose uncertainty to patients. More

  5. Joint analysis of input and parametric uncertainties in watershed water quality modeling: A formal Bayesian approach

    Han, Feng; Zheng, Yi

    2018-06-01

    Significant input uncertainty is a major source of error in watershed water quality (WWQ) modeling. It remains challenging to address the input uncertainty in a rigorous Bayesian framework. This study develops the Bayesian Analysis of Input and Parametric Uncertainties (BAIPU), an approach for the joint analysis of input and parametric uncertainties through a tight coupling of Markov Chain Monte Carlo (MCMC) analysis and Bayesian Model Averaging (BMA). The formal likelihood function for this approach is derived considering a lag-1 autocorrelated, heteroscedastic, and Skew Exponential Power (SEP) distributed error model. A series of numerical experiments were performed based on a synthetic nitrate pollution case and on a real study case in the Newport Bay Watershed, California. The Soil and Water Assessment Tool (SWAT) and Differential Evolution Adaptive Metropolis (DREAM(ZS)) were used as the representative WWQ model and MCMC algorithm, respectively. The major findings include the following: (1) the BAIPU can be implemented and used to appropriately identify the uncertain parameters and characterize the predictive uncertainty; (2) the compensation effect between the input and parametric uncertainties can seriously mislead modeling-based management decisions if the input uncertainty is not explicitly accounted for; (3) the BAIPU accounts for the interaction between the input and parametric uncertainties and therefore provides more accurate calibration and uncertainty results than a sequential analysis of the uncertainties; and (4) the BAIPU quantifies the credibility of different input assumptions on a statistical basis and can be implemented as an effective inverse modeling approach to the joint inference of parameters and inputs.
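
    To make the error model concrete, the sketch below evaluates a log-likelihood with lag-1 autocorrelated, heteroscedastic residuals. It substitutes a Gaussian density for the paper's Skew Exponential Power distribution, and all error-model parameters (sigma0, sigma1, phi) are illustrative assumptions, not values from the study.

    ```python
    import numpy as np

    def log_likelihood(res, sim, sigma0, sigma1, phi):
        """Gaussian stand-in for the SEP error model: heteroscedastic,
        lag-1 autocorrelated residuals (all parameters illustrative)."""
        sigma = sigma0 + sigma1 * np.abs(sim)        # heteroscedastic std. dev.
        eta = res / sigma                            # standardised residuals
        # whiten the lag-1 autocorrelation: innovations with unit variance
        inn = np.concatenate(([eta[0] * np.sqrt(1.0 - phi**2)],
                              eta[1:] - phi * eta[:-1]))
        return (0.5 * np.log(1.0 - phi**2)
                - 0.5 * np.sum(inn**2)
                - 0.5 * inn.size * np.log(2.0 * np.pi)
                - np.sum(np.log(sigma)))             # Jacobian of the scaling

    rng = np.random.default_rng(0)
    obs = rng.normal(5.0, 1.0, 200)   # stand-in observations
    sim = np.full(200, 5.0)           # stand-in model output
    print(log_likelihood(obs - sim, sim, sigma0=0.5, sigma1=0.1, phi=0.3))
    ```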

  6. Uncertainty analysis of nonlinear systems employing the first-order reliability method

    Choi, Chan Kyu; Yoo, Hong Hee

    2012-01-01

    In most mechanical systems, the properties of the system elements have uncertainties for several reasons. For example, mass, the stiffness coefficient of a spring, the damping coefficient of a damper, and friction coefficients have uncertain characteristics. The uncertain characteristics of the elements have a direct effect on the uncertainty of the system performance. It is very important to estimate the performance uncertainty, since it is directly related to manufacturing yield and consumer satisfaction. For this reason, the performance uncertainty should be estimated accurately and considered in the system design. In this paper, performance measures are defined for nonlinear vibration systems and the performance measure uncertainties are estimated employing the first-order reliability method (FORM). It was found that the FORM could provide good results in spite of the system's nonlinear characteristics. Compared to the results obtained by Monte Carlo simulation (MCS), the accuracy of the uncertainty analysis results obtained by the FORM was validated.
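
    As a concrete illustration of the FORM procedure, the sketch below finds the design point for a hypothetical limit-state function in standard normal space and checks the resulting failure probability against crude Monte Carlo, in the spirit of the paper's MCS comparison; the limit state g is an assumption for illustration only.

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import norm

    def g(u):
        # Illustrative limit state in standard normal space: failure when g <= 0
        return 3.0 - u[0] - 0.5 * u[1]**2

    # FORM: design point = the point on g(u) = 0 closest to the origin
    res = minimize(lambda u: np.dot(u, u), x0=[1.0, 1.0],
                   constraints={'type': 'eq', 'fun': g})
    beta = np.sqrt(res.fun)            # Hasofer-Lind reliability index
    pf_form = norm.cdf(-beta)          # FORM estimate of failure probability

    # Crude Monte Carlo check, as the paper does with MCS
    u = np.random.default_rng(0).standard_normal((200_000, 2))
    pf_mc = np.mean(g(u.T) <= 0)
    print(beta, pf_form, pf_mc)
    ```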

  7. SENSIT: a cross-section and design sensitivity and uncertainty analysis code

    Gerstl, S.A.W.

    1980-01-01

    SENSIT computes the sensitivity and uncertainty of a calculated integral response (such as a dose rate) due to input cross sections and their uncertainties. Sensitivity profiles are computed for neutron and gamma-ray reaction cross sections of standard multigroup cross section sets and for secondary energy distributions (SEDs) of multigroup scattering matrices. In the design sensitivity mode, SENSIT computes changes in an integral response due to design changes and gives the appropriate sensitivity coefficients. Cross section uncertainty analyses are performed for three types of input data uncertainties: cross-section covariance matrices for pairs of multigroup reaction cross sections, spectral shape uncertainty parameters for secondary energy distributions (integral SED uncertainties), and covariance matrices for energy-dependent response functions. For all three types of data uncertainties, SENSIT computes the resulting variance and estimated standard deviation in an integral response of interest, on the basis of generalized perturbation theory. SENSIT attempts to be more comprehensive than earlier sensitivity analysis codes, such as SWANLAKE.
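
    The variance propagation such a code performs can be summarized by the first-order "sandwich rule", variance = s^T C s, where s is the sensitivity profile and C the covariance matrix. A minimal sketch with made-up four-group numbers, not real nuclear data:

    ```python
    import numpy as np

    # Illustrative 4-group relative sensitivity profile, (dR/R)/(dsigma/sigma)
    s = np.array([0.12, -0.30, 0.05, 0.21])
    # Illustrative relative covariance matrix of the cross sections
    cov = np.diag([0.02, 0.03, 0.01, 0.04])**2

    var_rel = s @ cov @ s                       # relative variance of the response
    print("relative std. dev. of response:", np.sqrt(var_rel))
    ```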

  8. Uncertainty Analysis of RELAP5-3D

    Alexandra E Gertman; Dr. George L Mesina

    2012-07-01

    As world-wide energy consumption continues to increase, so does the demand for alternative energy sources, such as nuclear energy. Nuclear power plants currently supply over 370 gigawatts of electricity, and more than 60 new nuclear reactors have been commissioned by 15 different countries. The primary concern for nuclear power plant operation and licensing has been safety. The safe operation of nuclear power plants is no simple matter; it involves the training of operators, the design of the reactor, and equipment and design upgrades throughout the lifetime of the reactor. To safely design, operate, and understand nuclear power plants, industry and government alike have relied upon the use of best-estimate simulation codes, which allow an accurate model of any given plant to be created with well-defined margins of safety. The most widely used of these best-estimate simulation codes in the nuclear power industry is RELAP5-3D. Our project focused on improving the modeling capabilities of RELAP5-3D by developing uncertainty estimates for its calculations. This work involved analyzing high-, medium-, and low-ranked phenomena from an INL PIRT on a small-break Loss-Of-Coolant Accident as well as an analysis of a large-break Loss-Of-Coolant Accident. Statistical analyses were performed using correlation coefficients. To perform the studies, computer programs were written that modify a template RELAP5 input deck to produce one deck for each combination of key input parameters. Python scripting enabled the running of the generated input files with RELAP5-3D on INL's massively parallel cluster system. Data from the studies were collected and analyzed with SAS. A summary of the results of our studies is presented.
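
    The deck-generation step described above can be sketched as template substitution over all combinations of key input parameters; the placeholder fields and values below are hypothetical, not actual RELAP5 cards.

    ```python
    import itertools
    from string import Template

    # Hypothetical template fields standing in for RELAP5 input cards
    template = Template("* break area multiplier = $area\n* discharge coeff = $cd\n")
    areas = [0.9, 1.0, 1.1]
    cds = [0.8, 1.0, 1.2]

    # One input deck per combination of the key parameters
    for i, (a, c) in enumerate(itertools.product(areas, cds)):
        with open(f"deck_{i:03d}.i", "w") as f:
            f.write(template.substitute(area=a, cd=c))
    ```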

  9. Comprehensive neutron cross-section and secondary energy distribution uncertainty analysis for a fusion reactor

    Gerstl, S.A.W.; LaBauve, R.J.; Young, P.G.

    1980-05-01

    Using General Atomic's well-documented Power Generating Fusion Reactor (PGFR) design as an example, this report carries out a comprehensive neutron cross-section and secondary energy distribution (SED) uncertainty analysis. The LASL sensitivity and uncertainty analysis code SENSIT is used to calculate reaction cross-section sensitivity profiles and integral SED sensitivity coefficients. These are then folded with covariance matrices and integral SED uncertainties to obtain the resulting uncertainties of three calculated neutronics design parameters: two critical radiation damage rates and a nuclear heating rate. The report documents the first sensitivity-based data uncertainty analysis that incorporates a quantitative treatment of the effects of SED uncertainties. The results demonstrate quantitatively that the ENDF/B-V cross-section data files for C, H, and O, including their SED data, are fully adequate for this design application, while the data for Fe and Ni are at best marginally adequate because they give rise to response uncertainties of up to 25%. Much higher response uncertainties are caused by cross-section and SED data uncertainties in Cu (26 to 45%), tungsten (24 to 54%), and Cr (up to 98%). Specific recommendations are given for re-evaluations of certain reaction cross-sections, secondary energy distributions, and uncertainty estimates.

  10. Methodologies for uncertainty analysis in the level 2 PSA and their implementation procedures

    Ahn, Kwang Il; Yang, Joon Eun; Kim, Dong Ha

    2002-04-01

    The main purpose of this report is to present standardized methodologies for uncertainty analysis in the Level 2 Probabilistic Safety Assessment (PSA) and their implementation procedures, based on results obtained through a critical review of the existing methodologies employed in the Level 2 PSA, especially for the Accident Progression Event Tree (APET). The uncertainties employed in the Level 2 PSA are quantitative expressions of the overall knowledge of the analysts and experts participating in the probabilistic quantification of phenomenological accident progressions ranging from core melt to containment failure; their numerical values directly reflect the analyst's degree of confidence, or subjective probability, that a given phenomenological event or accident process will or will not occur. The results obtained from the Level 2 PSA uncertainty analysis become an essential contributor to the plant risk, in addition to the Level 1 and Level 3 PSA uncertainties. The uncertainty analysis methodologies and their implementation procedures presented in this report were prepared based on the following criterion: the uncertainty quantification process must be logical, scrutable, complete, consistent, and at an appropriate level of detail, as mandated by the Level 2 PSA objectives. For this purpose, the report deals mainly with (1) a summary of general and Level 2 PSA-specific uncertainty analysis methodologies, (2) the selection of phenomenological branch events for uncertainty analysis in the APET, and a methodology for quantification of APET uncertainty inputs and its implementation procedure, (3) the statistical propagation of uncertainty inputs through the APET and its implementation procedure, and (4) a formal procedure for quantification of APET uncertainties and source term categories (STCs) through the Level 2 PSA quantification codes.
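
    Step (3), the statistical propagation of uncertainty inputs through the APET, can be sketched as Monte Carlo sampling of elicited branch probabilities through a miniature event tree; the beta distributions and the core-melt frequency below are illustrative assumptions, not values from the report.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 10_000

    # Elicited (illustrative) branch-probability distributions
    p_vessel_fail = rng.beta(2, 8, n)        # lower-head failure given core melt
    p_containment_fail = rng.beta(1, 9, n)   # containment failure given vessel failure
    core_melt_freq = 1e-5                    # per year, from the Level 1 PSA (illustrative)

    # Propagate through the two-branch tree to an end-state frequency
    release_freq = core_melt_freq * p_vessel_fail * p_containment_fail
    print(np.percentile(release_freq, [5, 50, 95]))
    ```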

  11. Application of Uncertainty and Sensitivity Analysis to a Kinetic Model for Enzymatic Biodiesel Production

    Price, Jason Anthony; Nordblad, Mathias; Woodley, John

    2014-01-01

    This paper demonstrates the added benefits of using uncertainty and sensitivity analysis in the kinetics of enzymatic biodiesel production. For this study, a kinetic model by Fedosov and co-workers is used. For the uncertainty analysis the Monte Carlo procedure was used to statistically quantify...
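
    A minimal sketch of such a Monte Carlo procedure, using a Michaelis-Menten rate law as a stand-in for the Fedosov et al. kinetics (all parameter distributions are illustrative assumptions):

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    rng = np.random.default_rng(0)

    def batch(vmax, km, s0=1.0, t_end=10.0):
        """Residual substrate after a batch run of a simple rate model."""
        sol = solve_ivp(lambda t, s: -vmax * s / (km + s), (0.0, t_end), [s0])
        return sol.y[0, -1]

    # Sample kinetic parameters and propagate them through the model
    samples = [batch(vmax, km)
               for vmax, km in zip(rng.normal(0.5, 0.05, 500),
                                   rng.normal(0.2, 0.02, 500))]
    print(np.mean(samples), np.std(samples))
    ```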

  12. Modified Phenomena Identification and Ranking Table (PIRT) for Uncertainty Analysis

    Gol-Mohamad, Mohammad P.; Modarres, Mohammad; Mosleh, Ali

    2006-01-01

    This paper describes a methodology for characterizing important phenomena, which is part of a broader research effort by the authors called 'Modified PIRT'. The methodology provides a robust process of phenomena identification and ranking for more precise quantification of uncertainty. It is a two-step identification and ranking methodology based on thermal-hydraulic (TH) importance as well as uncertainty importance. The Analytic Hierarchy Process (AHP) has been used as a formal approach for TH identification and ranking. A formal uncertainty importance technique is used to estimate the degree of credibility of the TH model(s) used to represent the important phenomena. This part uses subjective justification, evaluating available information and data from experiments and code predictions. The proposed methodology was demonstrated by developing a PIRT for a large-break loss-of-coolant accident (LBLOCA) for the LOFT integral facility with highest core power (test LB-1). (authors)
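
    The AHP ranking step can be sketched as extracting the principal eigenvector of a reciprocal pairwise-comparison matrix; the 3x3 judgements below are illustrative, not taken from the paper.

    ```python
    import numpy as np

    # Reciprocal pairwise comparisons of three phenomena (illustrative)
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])

    w, v = np.linalg.eig(A)
    k = np.argmax(w.real)                      # principal eigenvalue
    priorities = np.abs(v[:, k].real)
    priorities /= priorities.sum()             # normalised importance ranking
    ci = (w[k].real - len(A)) / (len(A) - 1)   # consistency index of the judgements
    print(priorities, ci)
    ```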

  13. Alignment measurements uncertainties for large assemblies using probabilistic analysis techniques

    Almond, Heather

    Big science and ambitious industrial projects continually push forward with technical requirements beyond the grasp of conventional engineering techniques. Examples of these are ultra-high precision requirements in the fields of celestial telescopes, particle accelerators and the aerospace industry. Such extreme requirements are limited largely by the capability of the metrology used, namely, its uncertainty in relation to the alignment tolerance required. The current work was initiated as part of a Marie Curie European research project held at CERN, Geneva, aiming to answer those challenges as related to future accelerators requiring alignment of 2 m large assemblies to tolerances in the 10 µm range. The thesis has found several gaps in current knowledge limiting such capability. Among those was the lack of application of state-of-the-art uncertainty propagation methods in alignment measurements metrology. Another major limiting factor found was the lack of uncertainty statements in the thermal errors compensatio...

  14. Uncertainties in criticality analysis which affect the storage and transportation of LWR fuel

    Napolitani, D.G.

    1989-01-01

    Satisfying the design criteria for subcriticality with uncertainties affects: the capacity of LWR storage arrays, maximum allowable enrichment, minimum allowable burnup and economics of various storage options. There are uncertainties due to: calculational method, data libraries, geometric limitations, modelling bias, the number and quality of benchmarks performed and mechanical uncertainties in the array. Yankee Atomic Electric Co. (YAEC) has developed and benchmarked methods to handle: high density storage rack designs, pin consolidation, low density moderation and burnup credit. The uncertainties associated with such criticality analysis are quantified on the basis of clean criticals, power reactor criticals and intercomparison of independent analysis methods

  15. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment. Volume 3, Appendices C, D, E, F, and G

    Harper, F.T.; Young, M.L.; Miller, L.A. [Sandia National Labs., Albuquerque, NM (United States)] [and others]

    1995-01-01

    Two new probabilistic accident consequence codes, MACCS and COSYMA, whose development was completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, developed independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. The resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the third of a three-volume document describing the project and contains descriptions of the probability assessment principles; the expert identification and selection process; the weighting methods used; the inverse modeling methods; case structures; and summaries of the consequence codes.
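
    The propagation exercise described above can be sketched as sampling uncertain dispersion parameters through a ground-level, centreline Gaussian plume model; the distributions and release parameters below are illustrative assumptions, not the elicited values.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    def gpm(Q, u, sigma_y, sigma_z, H):
        """Ground-level centreline concentration with ground reflection."""
        return Q / (np.pi * sigma_y * sigma_z * u) * np.exp(-H**2 / (2 * sigma_z**2))

    n = 5_000
    sigma_y = rng.lognormal(np.log(200.0), 0.3, n)   # crosswind spread [m], illustrative
    sigma_z = rng.lognormal(np.log(50.0), 0.4, n)    # vertical spread [m], illustrative
    conc = gpm(Q=1.0, u=5.0, sigma_y=sigma_y, sigma_z=sigma_z, H=50.0)
    print(np.percentile(conc, [5, 50, 95]))
    ```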

  17. Uncertainty analysis for the assembly and core simulation of BEAVRS at the HZP conditions

    Wan, Chenghui; Cao, Liangzhi; Wu, Hongchun; Shen, Wei

    2017-01-01

    Highlights: • Uncertainty analysis has been completed based on the “two-step” scheme. • Uncertainty analysis has been performed for BEAVRS at HZP. • For lattice calculations, the few-group constants’ uncertainties were quantified. • For core simulation, uncertainties of keff and power distributions were quantified. - Abstract: Based on the “two-step” scheme for the reactor-physics calculations, the capability of uncertainty analysis for the core simulations has been implemented in the UNICORN code, an in-house code for the sensitivity and uncertainty analysis of the reactor-physics calculations. Applying the statistical sampling method, the nuclear-data uncertainties can be propagated to the important predictions of the core simulations. The uncertainties of the few-group constants introduced by the uncertainties of the multigroup microscopic cross sections are quantified first for the lattice calculations; the uncertainties of the few-group constants are then propagated to the core multiplication factor and core power distributions for the core simulations. Up to now, our in-house lattice code NECP-CACTI and the neutron-diffusion solver NECP-VIOLET have been implemented in UNICORN for the steady-state core simulations based on the “two-step” scheme. With NECP-CACTI and NECP-VIOLET, the modeling and simulation of the steady-state BEAVRS benchmark problem at the HZP conditions was performed, and the results were compared with those obtained by CASMO-4E. Based on the modeling and simulation, the UNICORN code has been applied to perform the uncertainty analysis for BEAVRS at HZP. The uncertainty results of the eigenvalues and two-group constants for the lattice calculations, and of the multiplication factor and power distributions for the steady-state core simulations, are obtained and analyzed in detail.

  18. Principles and applications of measurement and uncertainty analysis in research and calibration

    Wells, C.V.

    1992-11-01

    Interest in Measurement Uncertainty Analysis has grown in the past several years as it has spread to new fields of application, and research and development of uncertainty methodologies have continued. This paper discusses the subject from the perspectives of both research and calibration environments. It presents a history of the development and an overview of the principles of uncertainty analysis embodied in the United States National Standard, ANSI/ASME PTC 19.1-1985, Measurement Uncertainty. Examples are presented in which uncertainty analysis was utilized or is needed to gain further knowledge of a particular measurement process and to characterize final results. Measurement uncertainty analysis provides a quantitative estimate of the interval about a measured value or an experiment result within which the true value of that quantity is expected to lie. Years ago, Harry Ku of the United States National Bureau of Standards stated that "The informational content of the statement of uncertainty determines, to a large extent, the worth of the calibrated value." Today, that statement is just as true about calibration or research results as it was in 1968. Why is that true? What kind of information should we include in a statement of uncertainty accompanying a calibrated value? How and where do we get the information to include in an uncertainty statement? How should we interpret and use measurement uncertainty information? This discussion will provide answers to these and other questions about uncertainty in research and in calibration. The methodology to be described has been developed by national and international groups over the past nearly thirty years, and individuals were publishing information even earlier. Yet the work is largely unknown in many science and engineering arenas. I will illustrate various aspects of uncertainty analysis with some examples drawn from the radiometry measurement and calibration discipline from research activities.
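
    In the spirit of ANSI/ASME PTC 19.1-1985, an uncertainty statement combines a bias (systematic) limit B with a precision (random) index S. A minimal sketch with illustrative readings, showing the standard's two common combination models:

    ```python
    import numpy as np
    from scipy.stats import t

    readings = np.array([10.02, 10.05, 9.98, 10.01, 10.04, 9.99])
    S = readings.std(ddof=1) / np.sqrt(readings.size)  # precision index of the mean
    B = 0.03                                           # estimated bias limit (illustrative)
    t95 = t.ppf(0.975, df=readings.size - 1)

    U_rss = np.sqrt(B**2 + (t95 * S)**2)   # root-sum-square combination
    U_add = B + t95 * S                    # additive combination
    print(readings.mean(), U_rss, U_add)
    ```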

  1. PCT Uncertainty Analysis Using Unscented Transform with Random Orthogonal Matrix

    Fynana, Douglas A.; Ahn, Kwang-Il [KAERI, Daejeon (Korea, Republic of)]; Lee, John C. [Univ. of Michigan, Michigan (United States)]

    2015-05-15

    Most Best Estimate Plus Uncertainty (BEPU) methods employ nonparametric order statistics through Wilks' formula to quantify uncertainties of best estimate simulations of nuclear power plant (NPP) transients. 95%/95% limits, the 95th percentile at a 95% confidence level, are obtained by randomly sampling all uncertainty contributors through conventional Monte Carlo (MC). Advantages are the simple implementation of MC sampling of input probability density functions (pdfs) and the limited computational expense of 1st-, 2nd-, and 3rd-order Wilks' formula, requiring only 59, 93, or 124 simulations, respectively. A disadvantage of the small sample size is large sample-to-sample variation of the statistical estimators. This paper presents a new efficient sampling-based algorithm for accurate estimation of the mean and variance of the output parameter pdf. The algorithm combines a deterministic sampling method, the unscented transform (UT), with random sampling through the generation of a random orthogonal matrix (ROM). The UT guarantees that the mean, covariance, and 3rd-order moments of the multivariate input parameter distributions are exactly preserved by the sampled input points, and the orthogonal transformation of the points by a ROM guarantees that the sample errors of all 4th-order and higher moments are unbiased. The UT with ROM algorithm is applied to the uncertainty quantification of the peak clad temperature (PCT) during a large break loss-of-coolant accident (LBLOCA) in an OPR1000 NPP to demonstrate the applicability of the new algorithm to BEPU. This paper presented a new algorithm combining the UT with ROM for efficient multivariate parameter sampling that ensures sample input covariance and 3rd-order moments are exactly preserved and 4th-moment errors are small and unbiased. The advantageous sample properties guarantee higher order accuracy and
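
    The deterministic half of the algorithm, the unscented transform, can be sketched with 2n symmetric sigma points that reproduce the input mean and covariance exactly (odd moments vanish by symmetry); the subsequent ROM rotation of the points is omitted here for brevity.

    ```python
    import numpy as np

    def ut_sigma_points(mean, cov):
        """2n symmetric sigma points with equal weights 1/(2n)."""
        n = mean.size
        L = np.linalg.cholesky(cov)        # matrix square root of the covariance
        scaled = np.sqrt(n) * L
        return np.vstack([mean + scaled.T, mean - scaled.T])

    mean = np.array([0.0, 1.0])
    cov = np.array([[1.0, 0.3], [0.3, 2.0]])
    pts = ut_sigma_points(mean, cov)
    print(pts.mean(axis=0))                # recovers the input mean exactly
    print(np.cov(pts.T, bias=True))        # recovers the input covariance exactly
    ```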

  2. Probabilistic structural analysis to quantify uncertainties associated with turbopump blades

    Nagpal, Vinod K.; Rubinstein, Robert; Chamis, Christos C.

    1987-01-01

    A probabilistic study of turbopump blades has been in progress at NASA Lewis Research Center for over two years. The objectives of this study are to evaluate the effects of uncertainties in geometry and material properties on the structural response of the turbopump blades, and to evaluate the tolerance limits on the design. A methodology based on a probabilistic approach has been developed to quantify the effects of the random uncertainties. The results of this study indicate that only the variations in geometry have significant effects.

  3. Development of a System Analysis Toolkit for Sensitivity Analysis, Uncertainty Propagation, and Estimation of Parameter Distribution

    Heo, Jaeseok; Kim, Kyung Doo

    2015-01-01

    Statistical approaches to uncertainty quantification and sensitivity analysis are very important in estimating the safety margins for an engineering design application. This paper presents a system analysis and optimization toolkit developed by Korea Atomic Energy Research Institute (KAERI), which includes multiple packages of the sensitivity analysis and uncertainty quantification algorithms. In order to reduce the computing demand, multiple compute resources including multiprocessor computers and a network of workstations are simultaneously used. A Graphical User Interface (GUI) was also developed within the parallel computing framework for users to readily employ the toolkit for an engineering design and optimization problem. The goal of this work is to develop a GUI framework for engineering design and scientific analysis problems by implementing multiple packages of system analysis methods in the parallel computing toolkit. This was done by building an interface between an engineering simulation code and the system analysis software packages. The methods and strategies in the framework were designed to exploit parallel computing resources such as those found in a desktop multiprocessor workstation or a network of workstations. Available approaches in the framework include statistical and mathematical algorithms for use in science and engineering design problems. Currently the toolkit has 6 modules of the system analysis methodologies: deterministic and probabilistic approaches of data assimilation, uncertainty propagation, Chi-square linearity test, sensitivity analysis, and FFTBM

  5. Information Theory for Correlation Analysis and Estimation of Uncertainty Reduction in Maps and Models

    J. Florian Wellmann

    2013-04-01

    Full Text Available The quantification and analysis of uncertainties is important in all cases where maps and models of uncertain properties are the basis for further decisions. Once these uncertainties are identified, the logical next step is to determine how they can be reduced. Information theory provides a framework for the analysis of spatial uncertainties when different subregions are considered as random variables. In the work presented here, joint entropy, conditional entropy, and mutual information are applied for a detailed analysis of spatial uncertainty correlations. The aim is to determine (i) which areas in a spatial analysis share information, and (ii) where, and by how much, additional information would reduce uncertainties. As an illustration, a typical geological example is evaluated: the case of a subsurface layer with uncertain depth, shape and thickness. Mutual information and multivariate conditional entropies are determined based on multiple simulated model realisations. Even for this simple case, the measures not only provide a clear picture of uncertainties and their correlations but also give detailed insights into the potential reduction of uncertainties at each position, given additional information at a different location. The methods are directly applicable to other types of spatial uncertainty evaluations, especially where multiple realisations of a model simulation are analysed. In summary, the application of information theoretic measures opens up the path to a better understanding of spatial uncertainties, and their relationship to information and prior knowledge, for cases where uncertain property distributions are spatially analysed and visualised in maps and models.
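
    A minimal sketch of these measures, estimated from an ensemble of realisations for two correlated subregions; the synthetic layer depths below are illustrative stand-ins for simulated model realisations.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Synthetic ensemble: layer depth at two neighbouring locations
    depth_a = rng.normal(100.0, 5.0, 2_000)
    depth_b = depth_a + rng.normal(0.0, 3.0, 2_000)   # correlated neighbour

    def entropy(counts):
        p = counts[counts > 0] / counts.sum()
        return -np.sum(p * np.log2(p))

    ha, _ = np.histogram(depth_a, bins=20)
    hb, _ = np.histogram(depth_b, bins=20)
    hab, _, _ = np.histogram2d(depth_a, depth_b, bins=20)

    H_a, H_b, H_ab = entropy(ha), entropy(hb), entropy(hab.ravel())
    mutual_info = H_a + H_b - H_ab     # information shared between the locations (bits)
    cond_H_a_given_b = H_ab - H_b      # uncertainty remaining in A once B is known
    print(mutual_info, cond_H_a_given_b)
    ```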

  6. Quality in environmental science for policy: Assessing uncertainty as a component of policy analysis

    Maxim, Laura; Sluijs, Jeroen P. van der

    2011-01-01

    The sheer number of attempts to define and classify uncertainty reveals an awareness of its importance in environmental science for policy, though the nature of uncertainty is often misunderstood. The interdisciplinary field of uncertainty analysis is unstable; there are currently several incomplete notions of uncertainty leading to different and incompatible uncertainty classifications. One of the most salient shortcomings of present-day practice is that most of these classifications focus on quantifying uncertainty while ignoring the qualitative aspects that tend to be decisive in the interface between science and policy. Consequently, the current practices of uncertainty analysis contribute to increasing the perceived precision of scientific knowledge, but do not adequately address its lack of socio-political relevance. The 'positivistic' uncertainty analysis models (like those that dominate the fields of climate change modelling and nuclear or chemical risk assessment) have little social relevance, as they do not influence negotiations between stakeholders. From the perspective of the science-policy interface, the current practices of uncertainty analysis are incomplete and incorrectly focused. We argue that although scientific knowledge produced and used in a context of political decision-making embodies traditional scientific characteristics, it also holds additional properties linked to its influence on social, political, and economic relations. Therefore, the significance of uncertainty cannot be assessed based on quality criteria that refer to the scientific content only; uncertainty must also include quality criteria specific to the properties and roles of this scientific knowledge within political, social, and economic contexts and processes. We propose a conceptual framework designed to account for such substantive, contextual, and procedural criteria of knowledge quality. At the same time, the proposed framework includes and synthesizes the various

  7. An information-theoretic basis for uncertainty analysis: application to the QUASAR severe accident study

    Unwin, S.D.; Cazzoli, E.G.; Davis, R.E.; Khatib-Rahbar, M.; Lee, M.; Nourbakhsh, H.; Park, C.K.; Schmidt, E.

    1989-01-01

    The probabilistic characterization of uncertainty can be problematic in circumstances where there is a paucity of supporting data and limited experience on which to base engineering judgement. Information theory provides a framework in which to address this issue through reliance upon entropy-related principles of uncertainty maximization. We describe an application of such principles in the United States Nuclear Regulatory Commission-sponsored program QUASAR (Quantification and Uncertainty Analysis of Source Terms for Severe Accidents in Light Water Reactors). (author)

  8. Uncertainty budget in internal monostandard NAA for small and large size samples analysis

    Dasari, K.B.; Acharya, R.

    2014-01-01

    Evaluation of the total uncertainty budget on a determined concentration value is important under a quality assurance programme. Concentration calculation in NAA is carried out by relative NAA or by the k0-based internal monostandard NAA (IM-NAA) method. The IM-NAA method has been used for small and large sample analysis of clay potteries. An attempt was made to identify the uncertainty components in IM-NAA, and the uncertainty budget for La in both small and large size samples was evaluated and compared. (author)

  9. Uncertainty analysis for a field-scale P loss model

    Models are often used to predict phosphorus (P) loss from agricultural fields. While it is commonly recognized that model predictions are inherently uncertain, few studies have addressed prediction uncertainties using P loss models. In this study we assessed the effect of model input error on predic...

  10. Uncertainty, financial development and economic growth : an empirical analysis

    Lensink, Robert

    1999-01-01

    This paper examines whether financial sector development may partly undo growth-reducing effects of policy uncertainty. By performing a cross-country growth regression for the 1970-1995 period I find evidence that countries with a more developed financial sector are better able to nullify the

  11. Uncertainty in river discharge observations: a quantitative analysis

    G. Di Baldassarre

    2009-06-01

    Full Text Available This study proposes a framework for analysing and quantifying the uncertainty of river flow data. Such uncertainty is often considered to be negligible with respect to other approximations affecting hydrological studies. Actually, given that river discharge data are usually obtained by means of the so-called rating curve method, a number of different sources of error affect the derived observations. These include: errors in the measurements of river stage and discharge utilised to parameterise the rating curve, interpolation and extrapolation errors of the rating curve, the presence of unsteady flow conditions, and seasonal variations of the state of the vegetation (i.e., roughness). This study aims at analysing these sources of uncertainty using an original methodology. The novelty of the proposed framework lies in the estimation of rating curve uncertainty, which is based on hydraulic simulations. The latter are carried out on a reach of the Po River (Italy) by means of a one-dimensional (1-D) hydraulic model code (HEC-RAS). The results of the study show that errors in river flow data are indeed far from negligible.
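
    The rating-curve component of this uncertainty can be sketched by fitting Q = a*(h - h0)^b to a few gaugings and propagating the parameter covariance to an extrapolated discharge via the delta method; the stage-discharge data below are synthetic, not Po River gaugings.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def rating(h, a, h0, b):
        return a * (h - h0)**b

    h_obs = np.array([1.2, 1.8, 2.5, 3.1, 4.0, 5.2])      # stage [m], synthetic
    q_obs = np.array([12.0, 35.0, 80.0, 130.0, 230.0, 400.0])  # discharge [m3/s]

    popt, pcov = curve_fit(rating, h_obs, q_obs, p0=[10.0, 0.5, 2.0],
                           bounds=([0.1, 0.0, 0.5], [1e3, 1.0, 4.0]))

    # Delta method: first-order std. error of an extrapolated discharge
    h_new, eps = 7.0, 1e-6
    grad = np.array([(rating(h_new, *(popt + dp)) - rating(h_new, *popt)) / eps
                     for dp in np.eye(3) * eps])
    q_new = rating(h_new, *popt)
    q_se = np.sqrt(grad @ pcov @ grad)
    print(q_new, q_se)     # extrapolation error grows beyond the gauged range
    ```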

  12. Latent class analysis of indicators of intolerance of uncertainty

    Boelen, P.A.; Lenferink, L.I.M.

    Intolerance of Uncertainty (IU) is a transdiagnostic vulnerability factor involved in depression and anxiety symptoms and disorders. IU encompasses Prospective IU (“Unforeseen events upset me greatly”) and Inhibitory IU (“The smallest doubt can stop me from acting”). Research has yet to explore

  13. 3.8 Proposed approach to uncertainty quantification and sensitivity analysis in the next PA

    Flach, Greg [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)]; Wohlwend, Jen [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)]

    2017-10-02

    This memorandum builds upon Section 3.8 of SRNL (2016) and Flach (2017) by defining key error analysis, uncertainty quantification, and sensitivity analysis concepts and terms, in preparation for the next E-Area Performance Assessment (WSRC 2008) revision.

  14. Determination of uncertainties in energy and exergy analysis of a power plant

    Ege, Ahmet; Şahin, Hacı Mehmet

    2014-01-01

    Highlights: • Energy and exergy efficiency uncertainties in a large thermal power plant examined. • Sensitivity analysis shows importance of basic measurements on efficiency analysis. • A quick and practical approach is provided for determining efficiency uncertainties. • Extreme case analysis characterizes maximum possible boundaries of uncertainties. • Uncertainty determination in a plant is a dynamic process with real data. - Abstract: In this study, the energy and exergy efficiency uncertainties of a large-scale lignite-fired power plant cycle and the sensitivities of various measurement parameters were investigated for five different design power outputs (100%, 85%, 80%, 60% and 40%) using real data from the plant. For that purpose, a black-box method was employed, considering coal flow with its Lower Heating Value (LHV) as a single input and electricity produced as a single output of the plant. The uncertainty of the energy and exergy efficiency of the plant was evaluated with this method by applying a sensitivity analysis of the effect of measurement parameters such as the LHV, coal mass flow rate, and cell generator output voltage/current. In addition, an extreme case analysis was carried out to determine the maximum range of the uncertainties. Results of the black-box method showed that the uncertainties varied between 1.82–1.98% for energy efficiency and 1.32–1.43% for exergy efficiency at operating power levels of 40–100% of full power. It was concluded that LHV determination was the most important uncertainty source for the energy and exergy efficiency of the plant. The uncertainties of the extreme case analysis were between 2.30% and 2.36% for energy efficiency and between 1.66% and 1.70% for exergy efficiency for 40–100% power output. The proposed method was shown to be a quick and practical approach for understanding the major uncertainties, as well as the effects of some measurement parameters, in a large-scale thermal power plant.
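
    The black-box propagation can be sketched directly: with eta = P / (m_dot * LHV), the relative efficiency uncertainty follows from first-order combination of the relative measurement uncertainties. All numbers below are illustrative, not plant data.

    ```python
    import numpy as np

    P, u_P = 210e6, 0.005          # electrical output [W], 0.5% relative uncertainty
    m_dot, u_m = 95.0, 0.010       # coal mass flow [kg/s], 1.0%
    LHV, u_LHV = 9.5e6, 0.015      # lower heating value [J/kg], 1.5%

    eta = P / (m_dot * LHV)
    # First-order propagation for a pure product/quotient model
    u_eta = np.sqrt(u_P**2 + u_m**2 + u_LHV**2)
    print(f"energy efficiency = {eta:.3f} +/- {u_eta * eta:.4f} ({100 * u_eta:.2f}%)")
    ```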

  15. Uncertainty and sensitivity analysis in a Probabilistic Safety Analysis level-1

    Nunez Mc Leod, Jorge E.; Rivera, Selva S.

    1996-01-01

    A methodology for sensitivity and uncertainty analysis, applicable to a Level 1 Probabilistic Safety Assessment, is presented. The work covers: the correct association of distributions to parameters, the importance and qualification of expert opinions, the generation of samples according to sample sizes, and the study of the relationships among system variables and system response. A series of statistical-mathematical techniques is recommended for the development of the analysis methodology, as well as different graphical visualizations for the control of the study. (author)

  16. Sampling based uncertainty analysis of 10% hot leg break LOCA in large scale test facility

    Sengupta, Samiran; Kraina, V.; Dubey, S. K.; Rao, R. S.; Gupta, S. K.

    2010-01-01

    A sampling-based uncertainty analysis was carried out to quantify the uncertainty in the predictions of the best-estimate code RELAP5/MOD3.2 for a thermal-hydraulic test (10% hot leg break LOCA) performed in the Large Scale Test Facility (LSTF) as part of an IAEA coordinated research project. The nodalisation of the test facility was qualified at both the steady-state and transient levels by systematically applying the procedures of the uncertainty methodology based on accuracy extrapolation (UMAE); the uncertainty analysis was carried out using the Latin hypercube sampling (LHS) method to evaluate the uncertainty for ten input parameters. Sixteen output parameters were selected for uncertainty evaluation, and the uncertainty band between the 5th and 95th percentiles of the output parameters was evaluated. It was observed that the uncertainty band for the primary pressure during two-phase blowdown is larger than that of the remaining period. Similarly, a larger uncertainty band is observed for the accumulator injection flow during the reflood phase. An importance analysis was also carried out, and standardized rank regression coefficients were computed to quantify the effect of each individual input parameter on the output parameters. It was observed that the break discharge coefficient is the most important uncertain parameter for the prediction of all the primary side parameters and that the steam generator (SG) relief pressure setting is the most important parameter in predicting the SG secondary pressure.
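
    A minimal sketch of the LHS sampling and the standardized rank regression coefficients (SRRC) used for the importance analysis; the three inputs and the toy response below stand in for the actual RELAP5 calculation.

    ```python
    import numpy as np
    from scipy.stats import qmc, rankdata

    rng = np.random.default_rng(4)

    # Latin hypercube sample of three uncertain inputs (illustrative ranges)
    sampler = qmc.LatinHypercube(d=3, seed=4)
    x = qmc.scale(sampler.random(100), [0.6, 0.9, 0.8], [1.0, 1.1, 1.2])
    # columns: break discharge coeff., SG relief setting, power (all scaled)

    y = 2.0 * x[:, 0] - 0.5 * x[:, 1] + rng.normal(0, 0.05, 100)  # toy response

    # SRRC: linear regression on standardised ranks
    xr = rankdata(x, axis=0).astype(float)
    yr = rankdata(y).astype(float)
    xr = (xr - xr.mean(0)) / xr.std(0)
    yr = (yr - yr.mean()) / yr.std()
    srrc, *_ = np.linalg.lstsq(xr, yr, rcond=None)
    print(srrc)    # |SRRC| ranks each input's influence on the output
    ```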

  17. Uncertainty Flow Facilitates Zero-Shot Multi-Label Learning in Affective Facial Analysis

    Wenjun Bai

    2018-02-01

    Full Text Available Featured Application: The proposed Uncertainty Flow framework may benefit facial analysis with its promised gain in discriminability in multi-label affective classification tasks. The framework also allows efficient model training and between-task knowledge transfer. Applications that rely heavily on continuous prediction of emotional valence, e.g., monitoring prisoners' emotional stability in jail, can benefit directly from our framework. Abstract: To lower the single-label dependency of affective facial analysis, multi-label affective learning is needed. The impediment to practical implementation of existing multi-label algorithms is the scarcity of scalable multi-label training datasets. To resolve this, an inductive transfer learning based framework, i.e., Uncertainty Flow, is put forward in this research to allow knowledge transfer from a single-labelled emotion recognition task to a multi-label affective recognition task. That is, the model uncertainty, which can be quantified in Uncertainty Flow, is distilled from a single-label learning task. The distilled model uncertainty then enables efficient zero-shot multi-label affective learning. On the theoretical side, the feasibility of applying weakly informative priors, e.g., uniform and Cauchy priors, is fully explored within the proposed Uncertainty Flow framework. More importantly, based on the derived weight uncertainty, three sets of prediction-related uncertainty indexes, i.e., soft-max uncertainty, pure uncertainty and uncertainty plus, are proposed to produce reliable and accurate multi-label predictions. Validated on our manually annotated evaluation dataset, i.e., the multi-label annotated FER2013, our proposed Uncertainty Flow in multi-label facial expression analysis exhibited superiority over conventional multi-label learning algorithms and multi-label compatible neural networks. The success of our

  18. Uncertainty analysis for results of thermal hydraulic codes of best-estimate-type

    Alva N, J.

    2010-01-01

    In this thesis, fundamental concepts of uncertainty analysis are presented, together with the diverse methodologies applied in the analysis of nuclear power plant transient events, particularly those related to thermal-hydraulic phenomena. These concepts and methodologies come from a wide bibliographical survey of the nuclear power field. Methodologies for uncertainty analysis have been developed by quite diverse institutions and have been widely used worldwide for application to results from best-estimate-type computer codes in nuclear reactor thermal hydraulics and safety analysis. The main uncertainty sources, types of uncertainties, and aspects related to best-estimate modeling and methods are also introduced. Once the main bases of uncertainty analysis have been set and some of the known methodologies introduced, the CSAU methodology, which is applied in the analyses, is presented in detail. The main objective of this thesis is to compare the results of an uncertainty and sensitivity analysis using the response surface technique with those from the application of Wilks' formula, applied to a loss-of-coolant experiment and a power-rise event in a BWR. Both techniques are options in the uncertainty and sensitivity analysis part of the CSAU methodology, which was developed for the analysis of transients and accidents at nuclear power plants and is the basis of most of the methodologies used in the licensing of nuclear power plants practically everywhere. Finally, the results of applying both techniques are compared and discussed. (Author)
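
    The Wilks'-formula option can be sketched by computing the smallest number of code runs N for which the k-th largest output bounds the 95th percentile with 95% confidence; the standard first-, second- and third-order results are 59, 93 and 124 runs.

    ```python
    import math

    def wilks_n(order=1, beta=0.95, gamma=0.95):
        """Smallest N such that the `order`-th largest of N runs is a
        one-sided upper bound of the beta-quantile with confidence gamma."""
        n = order
        while True:
            # P(fewer than `order` runs exceed the beta-quantile)
            miss = sum(math.comb(n, j) * (1 - beta)**j * beta**(n - j)
                       for j in range(order))
            if 1 - miss >= gamma:
                return n
            n += 1

    print([wilks_n(k) for k in (1, 2, 3)])   # -> [59, 93, 124]
    ```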

  19. Interpretations of alternative uncertainty representations in a reliability and risk analysis context

    Aven, T.

    2011-01-01

    Probability is the predominant tool used to measure uncertainties in reliability and risk analyses. However, other representations also exist, including imprecise (interval) probability, fuzzy probability and representations based on the theories of evidence (belief functions) and possibility. Many researchers in the field are strong proponents of these alternative methods, but some are also sceptical. In this paper, we address one basic requirement set for quantitative measures of uncertainty: the interpretation needed to explain what an uncertainty number expresses. We question to what extent the various measures meet this requirement. Comparisons are made with probabilistic analysis, where uncertainty is represented by subjective probabilities, using either a betting interpretation or a reference to an uncertainty standard interpretation. By distinguishing between chances (expressing variation) and subjective probabilities, new insights are gained into the link between the alternative uncertainty representations and probability.

  20. Uncertainty and sensitivity analysis applied to coupled code calculations for a VVER plant transient

    Langenbuch, S.; Krzykacz-Hausmann, B.; Schmidt, K. D.

    2004-01-01

    The development of coupled codes, combining thermal-hydraulic system codes and 3D neutron kinetics, is an important step toward performing best-estimate plant transient calculations. It is generally agreed that the application of best-estimate methods should be supplemented by an uncertainty and sensitivity analysis to quantify the uncertainty of the results. The paper presents results from the application of the GRS uncertainty and sensitivity method to a VVER-440 plant transient, which was already studied earlier for the validation of coupled codes. For this application, the main steps of the uncertainty method are described. Typical results of the method applied to the analysis of the plant transient by several working groups using different coupled codes are presented and discussed. The results demonstrate the capability of an uncertainty and sensitivity analysis. (authors)

  1. Quantitative Analysis of Uncertainty in Medical Reporting: Creating a Standardized and Objective Methodology.

    Reiner, Bruce I

    2018-04-01

    Uncertainty in text-based medical reports has long been recognized as problematic, frequently resulting in misunderstanding and miscommunication. One strategy for addressing the negative clinical ramifications of report uncertainty would be the creation of a standardized methodology for characterizing and quantifying uncertainty language, which could provide both the report author and reader with context related to the perceived level of diagnostic confidence and accuracy. A number of computerized strategies could be employed in the creation of this analysis including string search, natural language processing and understanding, histogram analysis, topic modeling, and machine learning. The derived uncertainty data offers the potential to objectively analyze report uncertainty in real time and correlate with outcomes analysis for the purpose of context and user-specific decision support at the point of care, where intervention would have the greatest clinical impact.
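
    The simplest of the computerized strategies mentioned, string search, can be sketched as counting hedging terms in report text; the lexicon and scoring below are illustrative assumptions, not a validated standard.

    ```python
    import re
    from collections import Counter

    # Illustrative hedging lexicon (not a validated standard)
    HEDGES = ["possible", "probable", "cannot exclude", "suggestive of",
              "may represent", "likely", "equivocal"]

    def uncertainty_profile(report_text):
        text = report_text.lower()
        counts = Counter({h: len(re.findall(re.escape(h), text)) for h in HEDGES})
        total = sum(counts.values())
        # crude score: hedging terms per word of report text
        return counts, total / max(len(text.split()), 1)

    report = "Findings are suggestive of pneumonia; cannot exclude early abscess."
    print(uncertainty_profile(report))
    ```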

  2. Uncertainty analysis of atmospheric friction torque on the solid Earth

    Haoming Yan

    2016-05-01

    Full Text Available The wind stress data acquired from the European Centre for Medium-Range Weather Forecasts (ECMWF) and National Centers for Environmental Prediction (NCEP) climate models and from QSCAT satellite observations are analyzed using the frequency-wavenumber spectrum method. The spectra of the two climate models, ECMWF and NCEP, are similar for both the 10 m wind data and the model output wind stress data, which indicates that both climate models capture the key features of the wind stress. The QSCAT wind stress data show similar characteristics to the two climate models in both the spectral domain and the spatial distribution, but with an energy approximately 1.25 times larger than that of the climate models. These differences reveal the uncertainty in the different wind stress products, which inevitably causes uncertainties in the atmospheric friction torque on the solid Earth, with a 60% departure in annual amplitude, and further affects the precise estimation of the Earth's rotation.

  3. Measurement uncertainties in regression analysis with scarcity of data

    Sousa, J A; Ribeiro, A S; Cox, M G; Harris, P M; Sousa, J F V

    2010-01-01

    The evaluation of measurement uncertainty, in certain fields of science, faces the problem of scarcity of data. This is certainly the case in the testing of geological soils in civil engineering, where tests can take several days or weeks and where the same sample is not available for further testing, being destroyed during the experiment. In this particular study, attention is paid to triaxial compression tests used to typify particular soils. The purpose of the testing is to determine two parameters that characterize the soil, namely, cohesion and friction angle. These parameters are defined in terms of the intercept and slope of a straight line fitted to a small number of points (usually three) derived from experimental data. The use of ordinary least squares to obtain uncertainties associated with estimates of the two parameters would be unreliable if there were only three points (and no replicates) and hence only one degree of freedom.
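
    The situation can be sketched with an ordinary least squares fit through three points, where the single remaining degree of freedom inflates the parameter uncertainties; the stress data below are illustrative, not from the study.

    ```python
    import numpy as np

    x = np.array([100.0, 200.0, 300.0])   # normal stress [kPa], illustrative
    y = np.array([78.0, 135.0, 195.0])    # shear strength [kPa], illustrative

    X = np.column_stack([np.ones_like(x), x])
    beta, res, *_ = np.linalg.lstsq(X, y, rcond=None)
    dof = len(x) - 2                      # only one degree of freedom remains
    s2 = res[0] / dof                     # residual variance estimate
    cov = s2 * np.linalg.inv(X.T @ X)     # covariance of (intercept, slope)
    print(beta, np.sqrt(np.diag(cov)))    # standard errors blow up with dof = 1
    ```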

  4. Sensitivity, uncertainty, and importance analysis of a risk assessment

    Andsten, R.S.; Vaurio, J.K.

    1992-01-01

    In this paper a number of supplementary studies and applications associated with probabilistic safety assessment (PSA) are described, including sensitivity and importance evaluations of failures, errors, systems, and groups of components. The main purpose is to illustrate the usefulness of a PSA for making decisions about safety improvements, training, allowed outage times, and test intervals. A useful measure of uncertainty importance is presented, and it points out areas needing development, such as reactor vessel aging phenomena, for reducing overall uncertainty. A time-dependent core damage frequency is also presented, illustrating the impact of testing scenarios and intervals. The methods and applications presented are based on the Level 1 PSA carried out for the internal initiating events of the Loviisa 1 nuclear power station. Steam generator leakages and associated operator actions are major contributors to the current core-damage frequency estimate of 2 × 10⁻⁴/yr. The results are used to improve the plant and procedures and to guide future improvements.

  5. Use of quantitative uncertainty analysis for human health risk assessment

    Duncan, F.L.W.; Gordon, J.W.; Kelly, M.

    1994-01-01

    Current human health risk assessment methods for environmental risks typically use point estimates of risk accompanied by qualitative discussions of uncertainty. Alternatively, Monte Carlo simulations may be used with distributions for input parameters to estimate the resulting risk distribution and descriptive risk percentiles. The two techniques are applied here to the ingestion of 1,1-dichloroethene in ground water. The results indicate that Monte Carlo simulations provide significantly more information for risk assessment and risk management than do point estimates.
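
    As an illustration of the contrast, the sketch below compares a single conservative point estimate with a Monte Carlo risk distribution for a generic ingestion pathway. All distributions and the slope factor are invented placeholders, not toxicity values for 1,1-dichloroethene.

      import numpy as np

      rng = np.random.default_rng(42)
      n = 100_000

      # Hypothetical distributions for a groundwater-ingestion exposure.
      conc = rng.lognormal(np.log(0.05), 0.6, n)       # mg/L, median 0.05
      intake = rng.normal(2.0, 0.4, n).clip(0.5)       # L/day
      weight = rng.normal(70.0, 12.0, n).clip(30)      # kg
      slope_factor = 0.6                               # (mg/kg-day)^-1, fixed

      dose = conc * intake / weight                    # mg/kg-day
      risk = slope_factor * dose

      # The point estimate uses one conservative concentration (0.1 mg/L);
      # Monte Carlo yields the whole risk distribution and its percentiles.
      print("point estimate (conservative):", 0.6 * 0.1 * 2.0 / 70.0)
      print("MC 50th / 95th percentile risk:",
            np.percentile(risk, 50), np.percentile(risk, 95))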

  6. Uncertainty analysis in estimating Japanese ingestion of global fallout Cs-137 using health risk evaluation model

    Shimada, Yoko; Morisawa, Shinsuke

    1998-01-01

    Most model estimates of environmental contamination include some uncertainty associated with the parameter uncertainty in the model. In this study, the uncertainty in a model for evaluating the ingestion of radionuclides caused by long-term global low-level radioactive contamination was analyzed using various uncertainty analysis methods: the percentile estimate, robustness analysis and the fuzzy estimate. The model is composed mainly of five sub-models, each of which includes its own uncertainty, which was also analyzed. The major findings obtained in this study include: the possibility of discrepancy between the value predicted by the model simulation and the observed data is less than 10%; the uncertainty of the predicted value is higher before 1950 and after 1980; the uncertainty of the predicted value can be reduced by decreasing the uncertainty of some environmental parameters in the model; and the reliability of the model depends decisively on the following environmental factors: the direct foliar absorption coefficient, the transfer factor of radionuclides from the stratosphere down to the troposphere, the residual rate after food processing and cooking, and the transfer and sedimentation of radionuclides in the ocean. (author)

  7. Uncertainty modelling and analysis of environmental systems: a river sediment yield example

    Keesman, K.J.; Koskela, J.; Guillaume, J.H.; Norton, J.P.; Croke, B.; Jakeman, A.

    2011-01-01

    Throughout the last decades uncertainty analysis has become an essential part of environmental model building (e.g. Beck 1987; Refsgaard et al., 2007). The objective of the paper is to introduce stochastic and set-membership uncertainty modelling concepts, which basically differ in the…

  8. Model parameter uncertainty analysis for annual field-scale P loss model

    Phosphorus (P) loss models are important tools for developing and evaluating conservation practices aimed at reducing P losses from agricultural fields. All P loss models, however, have an inherent amount of uncertainty associated with them. In this study, we conducted an uncertainty analysis with ...

  9. Model parameter uncertainty analysis for an annual field-scale phosphorus loss model

    Phosphorus (P) loss models are important tools for developing and evaluating conservation practices aimed at reducing P losses from agricultural fields. All P loss models, however, have an inherent amount of uncertainty associated with them. In this study, we conducted an uncertainty analysis with ...

  10. Combined uncertainties in the chemical analysis of algae and water in determinations with ICP-OES

    Souza, Poliana Santos de

    2014-01-01

    One way to qualify the determination of trace elements in algae and water is through uncertainty calculations. Inductively coupled plasma optical emission spectrometry (ICP-OES) is widely used in this procedure, because it allows the analysis of waters and of solid samples. Thus, some elements (Fe, Ca and Mg) were used to calculate the uncertainty. (author)

  11. Uncertainty and sensitivity analysis of control strategies using the benchmark simulation model No1 (BSM1)

    Flores-Alsina, Xavier; Rodriguez-Roda, Ignasi; Sin, Gürkan

    2009-01-01

    The objective of this paper is to perform an uncertainty and sensitivity analysis of the predictions of the Benchmark Simulation Model (BSM) No. 1, when comparing four activated sludge control strategies. The Monte Carlo simulation technique is used to evaluate the uncertainty in the BSM1 predict...

  12. An introductory guide to uncertainty analysis in environmental and health risk assessment

    Hoffman, F.O.; Hammonds, J.S.

    1992-10-01

    To compensate for the potential for overly conservative estimates of risk using standard US Environmental Protection Agency methods, an uncertainty analysis should be performed as an integral part of each risk assessment. Uncertainty analyses allow one to obtain quantitative results in the form of confidence intervals that will aid in decision making and will provide guidance for the acquisition of additional data. To perform an uncertainty analysis, one must frequently rely on subjective judgment in the absence of data to estimate the range and a probability distribution describing the extent of uncertainty about a true but unknown value for each parameter of interest. This information is formulated from professional judgment based on an extensive review of literature, analysis of the data, and interviews with experts. Various analytical and numerical techniques are available to allow statistical propagation of the uncertainty in the model parameters to a statement of uncertainty in the risk to a potentially exposed individual. Although analytical methods may be straightforward for relatively simple models, they rapidly become complicated for more involved risk assessments. Because of the tedious efforts required to mathematically derive analytical approaches to propagate uncertainty in complicated risk assessments, numerical methods such as Monte Carlo simulation should be employed. The primary objective of this report is to provide an introductory guide for performing uncertainty analysis in risk assessments conducted for Superfund sites.
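
    The report's central contrast, analytical propagation where a closed form exists versus Monte Carlo simulation elsewhere, can be seen on a toy risk model with a known solution. The lognormal parameters below are arbitrary assumptions.

      import numpy as np

      rng = np.random.default_rng(0)
      n = 200_000

      # Risk model R = A * B with independent lognormal inputs (illustrative).
      mu_a, s_a = np.log(1.0), 0.5
      mu_b, s_b = np.log(2.0), 0.3

      # Analytical propagation: a product of lognormals is lognormal with
      # log-mean mu_a + mu_b and log-sd sqrt(s_a^2 + s_b^2).
      s_r = np.hypot(s_a, s_b)
      analytic_p95 = np.exp(mu_a + mu_b + 1.645 * s_r)

      # Numerical propagation by Monte Carlo gives the same answer here but
      # generalizes to involved assessments with no closed form.
      r = rng.lognormal(mu_a, s_a, n) * rng.lognormal(mu_b, s_b, n)
      print("95th percentile, analytic:", analytic_p95)
      print("95th percentile, MC      :", np.percentile(r, 95))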

  13. Spatial GHG Inventory: Analysis of Uncertainty Sources. A Case Study for Ukraine

    Bun, R.; Gusti, M.; Kujii, L.; Tokar, O.; Tsybrivskyy, Y.; Bun, A.

    2007-01-01

    A geoinformation technology for creating spatially distributed greenhouse gas inventories based on a methodology provided by the Intergovernmental Panel on Climate Change and special software linking input data, inventory models, and a means for visualization are proposed. This technology opens up new possibilities for qualitative and quantitative spatially distributed presentations of inventory uncertainty at the regional level. Problems concerning uncertainty and verification of the distributed inventory are discussed. A Monte Carlo analysis of uncertainties in the energy sector at the regional level is performed, and a number of simulations concerning the effectiveness of uncertainty reduction in some regions are carried out. Uncertainties in activity data have a considerable influence on overall inventory uncertainty, for example, the inventory uncertainty in the energy sector declines from 3.2 to 2.0% when the uncertainty of energy-related statistical data on fuels combusted in the energy industries declines from 10 to 5%. Within the energy sector, the 'energy industries' subsector has the greatest impact on inventory uncertainty. The relative uncertainty in the energy sector inventory can be reduced from 2.19 to 1.47% if the uncertainty of specific statistical data on fuel consumption decreases from 10 to 5%. The 'energy industries' subsector has the greatest influence in the Donetsk oblast. Reducing the uncertainty of statistical data on electricity generation in just three regions - the Donetsk, Dnipropetrovsk, and Luhansk oblasts - from 7.5 to 4.0% results in a decline from 2.6 to 1.6% in the uncertainty in the national energy sector inventory
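
    The scaling behaviour described here follows from the root-sum-square combination of independent relative uncertainties weighted by emission shares; a minimal sketch is below. The shares and percentages are invented, not the Ukrainian inventory figures.

      import numpy as np

      # Relative uncertainties (%) of activity data by subsector.
      shares = np.array([0.55, 0.25, 0.20])   # emission shares of subsectors
      u_rel = np.array([10.0, 7.5, 5.0])      # relative uncertainty of each (%)

      def combined_uncertainty(shares, u_rel):
          """Root-sum-square combination of independent relative uncertainties."""
          return np.sqrt(np.sum((shares * u_rel) ** 2))

      print("before:", round(combined_uncertainty(shares, u_rel), 2), "%")

      # Halving the uncertainty of the dominant subsector shrinks the total most,
      # mirroring the 10% -> 5% improvements discussed in the abstract.
      u_improved = u_rel.copy()
      u_improved[0] = 5.0
      print("after :", round(combined_uncertainty(shares, u_improved), 2), "%")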

  14. Analysis of uncertainty propagation in nuclear fuel cycle scenarios

    Krivtchik, Guillaume

    2014-01-01

    Nuclear scenario studies model a nuclear fleet over a given period. They enable the comparison of different options for the evolution of the reactor fleet and for the management of future fuel cycle materials, from mining to disposal, based on criteria such as installed capacity per reactor technology and mass inventories and flows in the fuel cycle and in the waste. Uncertainties associated with nuclear data and scenario parameters (fuel, reactor and facility characteristics) propagate along the isotopic chains in depletion calculations and throughout the scenario history, which reduces the precision of the results. The aim of this work is to develop, implement and use a stochastic uncertainty propagation methodology adapted to scenario studies. The chosen method is based on the development of surrogate models of the depletion computation, which reduce the computation time of scenario studies and whose parameters include perturbations of the depletion model, and on the construction of an equivalence model that takes cross-section perturbations into account when computing the fresh fuel enrichment. The uncertainty propagation methodology is then applied to different scenarios of interest, considering different evolution options for the French PWR fleet with SFR deployment. (author) [fr]

  15. A python framework for environmental model uncertainty analysis

    White, Jeremy; Fienen, Michael N.; Doherty, John E.

    2016-01-01

    We have developed pyEMU, a python framework for Environmental Modeling Uncertainty analyses: an open-source tool that is non-intrusive, easy to use, computationally efficient, and scalable to highly parameterized inverse problems. The framework implements several types of linear (first-order, second-moment (FOSM)) and non-linear uncertainty analyses. The FOSM-based analyses can also be completed prior to parameter estimation to help inform important modeling decisions, such as parameterization and objective function formulation. Complete workflows for several types of FOSM-based and non-linear analyses are documented in example notebooks implemented using Jupyter that are available in the online pyEMU repository. Example workflows include basic parameter and forecast analyses, data worth analyses, and error-variance analyses, as well as usage of parameter ensemble generation and management capabilities. These workflows document the necessary steps and provide insight into the results, with the goal of educating users not only in how to apply pyEMU, but also in the underlying theory of applied uncertainty quantification.
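
    The FOSM machinery at the heart of such analyses propagates a prior parameter covariance through a local linearization of the model. The sketch below is generic numpy rather than pyEMU's API, and the Jacobian and covariance entries are invented for illustration.

      import numpy as np

      # FOSM propagation: with Jacobian J = d(forecasts)/d(parameters) and
      # prior parameter covariance C, the forecast covariance is J C J^T.
      # Illustrative 2 forecasts x 3 parameters; pyEMU assembles such
      # matrices from model interface files, here we just supply numbers.
      J = np.array([[0.8, -0.2, 0.1],
                    [0.3,  0.5, -0.4]])
      C = np.diag([0.5**2, 0.2**2, 1.0**2])   # prior parameter variances

      forecast_cov = J @ C @ J.T
      print("forecast standard deviations:", np.sqrt(np.diag(forecast_cov)))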

  16. Survey of sampling-based methods for uncertainty and sensitivity analysis

    Helton, J.C.; Johnson, J.D.; Sallaberry, C.J.; Storlie, C.B.

    2006-01-01

    Sampling-based methods for uncertainty and sensitivity analysis are reviewed. The following topics are considered: (i) definition of probability distributions to characterize epistemic uncertainty in analysis inputs; (ii) generation of samples from uncertain analysis inputs; (iii) propagation of sampled inputs through an analysis; (iv) presentation of uncertainty analysis results; and (v) determination of sensitivity analysis results. Special attention is given to the determination of sensitivity analysis results, with brief descriptions and illustrations given for the following procedures/techniques: examination of scatterplots, correlation analysis, regression analysis, partial correlation analysis, rank transformations, statistical tests for patterns based on gridding, entropy tests for patterns based on gridding, nonparametric regression analysis, squared rank differences/rank correlation coefficient test, two-dimensional Kolmogorov-Smirnov test, tests for patterns based on distance measures, top down coefficient of concordance, and variance decomposition.

  17. Survey of sampling-based methods for uncertainty and sensitivity analysis.

    Johnson, Jay Dean; Helton, Jon Craig; Sallaberry, Cedric J.; Storlie, Curt B. (Colorado State University, Fort Collins, CO)

    2006-06-01

    Sampling-based methods for uncertainty and sensitivity analysis are reviewed. The following topics are considered: (1) Definition of probability distributions to characterize epistemic uncertainty in analysis inputs, (2) Generation of samples from uncertain analysis inputs, (3) Propagation of sampled inputs through an analysis, (4) Presentation of uncertainty analysis results, and (5) Determination of sensitivity analysis results. Special attention is given to the determination of sensitivity analysis results, with brief descriptions and illustrations given for the following procedures/techniques: examination of scatterplots, correlation analysis, regression analysis, partial correlation analysis, rank transformations, statistical tests for patterns based on gridding, entropy tests for patterns based on gridding, nonparametric regression analysis, squared rank differences/rank correlation coefficient test, two dimensional Kolmogorov-Smirnov test, tests for patterns based on distance measures, top down coefficient of concordance, and variance decomposition.
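
    Several of the listed techniques are one-liners once a sample is in hand. The toy response below is invented to show why rank transformations are on the list: Pearson correlation understates a strong but nonlinear monotone effect that Spearman's rank correlation recovers.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      n = 500

      # Sampled uncertain inputs and a monotone but nonlinear response.
      x1 = rng.uniform(0, 1, n)
      x2 = rng.uniform(0, 1, n)
      y = np.exp(4 * x1) + 0.5 * x2

      # Pearson (raw) vs. Spearman (rank-transformed) correlation with y.
      for name, x in (("x1", x1), ("x2", x2)):
          print(name,
                "pearson", round(stats.pearsonr(x, y)[0], 2),
                "spearman", round(stats.spearmanr(x, y)[0], 2))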

  18. Uncertainty Estimation of Neutron Activation Analysis in Zinc Elemental Determination in Food Samples

    Endah Damastuti; Muhayatun; Diah Dwiana L

    2009-01-01

    Besides fulfilling the requirements of the international standard ISO/IEC 17025:2005, uncertainty estimation should be carried out to increase the quality and confidence of analysis results and to establish the traceability of those results to SI units. Neutron activation analysis is a major technique used by the radiometry analysis laboratory and is included in its scope of accreditation under ISO/IEC 17025:2005; therefore, uncertainty estimation for neutron activation analysis needs to be carried out. Sample and standard preparation, as well as irradiation and measurement using gamma spectrometry, were the main activities contributing to the uncertainty. The components of the uncertainty sources are specifically explained. The expanded uncertainty was 4.0 mg/kg at a 95% level of confidence (coverage factor = 2), for a Zn concentration of 25.1 mg/kg. The counting statistics of the sample and the standard were the major contributors to the combined uncertainty. The uncertainty estimation is expected to increase the quality of the analysis results and can be applied further to other kinds of samples. (author)
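
    The arithmetic behind the reported figures is the standard combination of standard-uncertainty components in quadrature, followed by multiplication with a coverage factor. The budget entries below are invented placeholders, not the paper's actual data.

      import numpy as np

      # Illustrative standard-uncertainty budget for Zn by NAA (mg/kg).
      components = {
          "counting statistics, sample":   1.5,
          "counting statistics, standard": 1.0,
          "sample mass":                   0.3,
          "standard preparation":          0.5,
          "gamma-peak efficiency":         0.4,
      }

      u_c = np.sqrt(sum(u**2 for u in components.values()))  # combined (RSS)
      U = 2.0 * u_c   # expanded uncertainty, coverage factor k=2 (~95%)
      print(f"u_c = {u_c:.2f} mg/kg, U = {U:.2f} mg/kg")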

  19. Uncertainty analysis of hydrological modeling in a tropical area using different algorithms

    Rafiei Emam, Ammar; Kappas, Martin; Fassnacht, Steven; Linh, Nguyen Hoang Khanh

    2018-01-01

    Hydrological modeling outputs are subject to uncertainty resulting from different sources of error (e.g., errors in input data, model structure and model parameters), making quantification of uncertainty in hydrological modeling imperative if the reliability of modeling results is to be improved. Uncertainty analysis must also address the difficulties of calibrating hydrological models, which increase further in areas with data scarcity. The purpose of this study is to apply four uncertainty analysis algorithms to a semi-distributed hydrological model, quantifying different sources of uncertainty (especially parameter uncertainty) and evaluating their performance. The Soil and Water Assessment Tool (SWAT) eco-hydrological model was implemented for a watershed in central Vietnam. The sensitivity of the parameters was analyzed, and the model was calibrated. The uncertainty analysis for the hydrological model was conducted with four algorithms: Generalized Likelihood Uncertainty Estimation (GLUE), Sequential Uncertainty Fitting (SUFI), the Parameter Solution method (ParaSol) and Particle Swarm Optimization (PSO). The performance of the algorithms was compared using the P-factor and R-factor, the coefficient of determination (R²), the Nash-Sutcliffe efficiency (NSE) and the percent bias (PBIAS). The results showed the high performance of SUFI and PSO, with P-factor > 0.83, R-factor around 0.91, NSE > 0.89 and PBIAS as low as 0.18 in the uncertainty analysis. Indeed, uncertainty analysis must be accounted for when the outcomes of the model are used for policy or management decisions.
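
    For reference, two of the reported statistics are easily computed from paired series; the sketch below implements NSE and PBIAS on invented discharge data (the P-factor and R-factor additionally require the simulated uncertainty band).

      import numpy as np

      def nse(obs, sim):
          """Nash-Sutcliffe efficiency: 1 is perfect, <0 is worse than the mean."""
          obs, sim = np.asarray(obs), np.asarray(sim)
          return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

      def pbias(obs, sim):
          """Percent bias: positive means underestimation of the observations."""
          obs, sim = np.asarray(obs), np.asarray(sim)
          return 100.0 * np.sum(obs - sim) / np.sum(obs)

      obs = [12.0, 18.0, 30.0, 22.0, 15.0]   # observed discharge (m3/s)
      sim = [11.0, 19.5, 28.0, 23.5, 14.0]   # simulated discharge
      print("NSE   =", round(nse(obs, sim), 3))
      print("PBIAS =", round(pbias(obs, sim), 2), "%")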

  20. Photovoltaic Calibrations at the National Renewable Energy Laboratory and Uncertainty Analysis Following the ISO 17025 Guidelines

    Emery, Keith [National Renewable Energy Lab. (NREL), Golden, CO (United States)]

    2016-09-01

    The measurement of photovoltaic (PV) performance with respect to reference conditions requires measuring current versus voltage for a given tabular reference spectrum, junction temperature, and total irradiance. This report presents the procedures implemented by the PV Cell and Module Performance Characterization Group at the National Renewable Energy Laboratory (NREL) to achieve the lowest practical uncertainty. A rigorous uncertainty analysis of these procedures is presented, which follows the International Organization for Standardization (ISO) Guide to the Expression of Uncertainty in Measurement. This uncertainty analysis is required for the team’s laboratory accreditation under ISO standard 17025, “General Requirements for the Competence of Testing and Calibration Laboratories.” The report also discusses additional areas where the uncertainty can be reduced.

  1. Analysis of the influence of input data uncertainties on determining the reliability of reservoir storage capacity

    Marton Daniel

    2015-12-01

    The paper presents a sensitivity analysis of the influence of uncertainties in the input hydrological, morphological and operating data required to propose the active conservation storage capacity of a reservoir and the values achieved. By introducing uncertainties into the considered inputs of the water management analysis of a reservoir, the analysed reservoir storage capacity is also affected by uncertainties, as are the values of water outflows from the reservoir and the hydrological reliabilities. A simulation model of reservoir behaviour incorporating this kind of calculation has been compiled, as described below. The model allows evaluation of the solution results, taking uncertainties into consideration, contributing to a reduction in the occurrence of failure or water shortage during reservoir operation in low-water and dry periods.

  2. Accuracy and Uncertainty Analysis of Intelligent Techniques for Predicting the Longitudinal Dispersion Coefficient in Rivers

    Abbas Akbarzadeh

    2010-09-01

    Accurate prediction of the longitudinal dispersion coefficient (LDC) is useful for determining the concentration distribution of pollutants in natural rivers. However, the uncertainty associated with the results of forecasting models has a negative effect on pollutant management in water resources. In this research, models are first developed using ANN and ANFIS techniques to predict the LDC in natural streams. Then, an uncertainty analysis is performed for the ANN and ANFIS models based on Monte Carlo simulation. The input parameters of the models are related to hydraulic variables and stream geometry. The results indicate that ANN is a suitable model for predicting the LDC, but it is also associated with a high level of uncertainty. The uncertainty analysis shows that the ANFIS model has less uncertainty; i.e., it is the better model for satisfactorily forecasting the LDC in natural streams.

  3. Application of a Novel Dose-Uncertainty Model for Dose-Uncertainty Analysis in Prostate Intensity-Modulated Radiotherapy

    Jin Hosang; Palta, Jatinder R.; Kim, You-Hyun; Kim, Siyong

    2010-01-01

    Purpose: To analyze dose uncertainty using a previously published dose-uncertainty model, and to assess potential dosimetric risks existing in prostate intensity-modulated radiotherapy (IMRT). Methods and Materials: The dose-uncertainty model provides a three-dimensional (3D) dose-uncertainty distribution at a given confidence level (CL). For 8 retrospectively selected patients, dose-uncertainty maps were constructed using the dose-uncertainty model at the 95% CL. In addition to uncertainties inherent to the radiation treatment planning system, four scenarios of spatial errors were considered: machine only (S1), S1 + intrafraction, S1 + interfraction, and S1 + both intrafraction and interfraction errors. To evaluate the potential risks of the IMRT plans, three dose-uncertainty-based plan evaluation tools were introduced: the confidence-weighted dose-volume histogram, the confidence-weighted dose distribution, and the dose-uncertainty-volume histogram. Results: Dose uncertainty caused by interfraction setup error was more significant than that of intrafraction motion error. The maximum dose uncertainty (95% confidence) of the clinical target volume (CTV) was smaller than 5% of the prescribed dose in all but two cases (13.9% and 10.2%). The dose uncertainty for 95% of the CTV volume ranged from 1.3% to 2.9% of the prescribed dose. Conclusions: The dose uncertainty in prostate IMRT could be evaluated using the dose-uncertainty model. Prostate IMRT plans satisfying the same plan objectives could generate significantly different dose uncertainty because of a complex interplay of many uncertainty sources. The uncertainty-based plan evaluation contributes to generating reliable and error-resistant treatment plans.

  4. Uncertainty Analysis of Few Group Cross Sections Based on Generalized Perturbation Theory

    Han, Tae Young; Lee, Hyun Chul; Noh, Jae Man

    2014-01-01

    In this paper, the methodology of the sensitivity and uncertainty analysis code based on GPT is described, and preliminary verification calculations on the PMR200 pin cell problem were carried out. The results are in good agreement with those obtained by TSUNAMI. From this study, it is expected that the MUSAD code based on GPT can produce the uncertainty of the homogenized few-group microscopic cross sections for a core simulator. For sensitivity and uncertainty analyses of general core responses, a two-step method is available: it utilizes the generalized perturbation theory (GPT) for homogenized few-group cross sections in the first step and a stochastic sampling method for general core responses in the second step. The uncertainty analysis procedure based on GPT in the first step needs the generalized adjoint solution from a cell or lattice code. For this, the generalized adjoint solver was integrated into DeCART in our previous work. In this paper, the MUSAD (Modules of Uncertainty and Sensitivity Analysis for DeCART) code, based on classical perturbation theory, was expanded with the function of sensitivity and uncertainty analysis for few-group cross sections based on GPT. First, the uncertainty analysis method based on GPT is described and, in the next section, the preliminary results of the verification calculation on a VHTR pin cell problem are compared with the results by TSUNAMI of SCALE 6.1.
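
    The propagation step that such codes automate is the "sandwich rule": the relative variance of a response is S C S^T, with S the sensitivity profile from GPT and C the cross-section covariance matrix. The numbers below are invented for a three-entry illustration.

      import numpy as np

      # Three illustrative nuclide-reaction pairs; in practice S comes from a
      # GPT adjoint calculation and C from a covariance library.
      S = np.array([0.9, -0.3, 0.15])           # (dR/R) per (dsigma/sigma)
      C = np.array([[0.0025, 0.0005, 0.0   ],
                    [0.0005, 0.0016, 0.0   ],
                    [0.0,    0.0,    0.0036]])  # relative covariances

      rel_var = S @ C @ S                       # sandwich rule: S C S^T
      print(f"relative uncertainty of response: {100 * np.sqrt(rel_var):.2f} %")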

  5. The role of uncertainty analysis in dose reconstruction and risk assessment

    Hoffman, F.O.; Simon, S.L.; Thiessen, K.M.

    1996-01-01

    Dose reconstruction and risk assessment rely heavily on the use of mathematical models to extrapolate information beyond the realm of direct observation. Because models are merely approximations of real systems, their predictions are inherently uncertain. As a result, full disclosure of uncertainty in dose and risk estimates is essential to achieve scientific credibility and to build public trust. The need for formal analysis of uncertainty in model predictions was presented during the nineteenth annual meeting of the NCRP. At that time, quantitative uncertainty analysis was considered a relatively new and difficult subject practiced by only a few investigators. Today, uncertainty analysis has become synonymous with the assessment process itself. When an uncertainty analysis is used iteratively within the assessment process, it can guide experimental research to refine dose and risk estimates, deferring potentially high cost or high consequence decisions until uncertainty is either acceptable or irreducible. Uncertainty analysis is now mandated for all ongoing dose reconstruction projects within the United States, a fact that distinguishes dose reconstruction from other types of exposure and risk assessments. 64 refs., 6 figs., 1 tab

  6. PROBABILISTIC SENSITIVITY AND UNCERTAINTY ANALYSIS WORKSHOP SUMMARY REPORT

    Seitz, R

    2008-06-25

    Stochastic or probabilistic modeling approaches are being applied more frequently in the United States and globally to quantify uncertainty and enhance understanding of model response in performance assessments for disposal of radioactive waste. This increased use has resulted in global interest in sharing results of research and applied studies that have been completed to date. This technical report reflects the results of a workshop that was held to share results of research and applied work related to performance assessments conducted at United States Department of Energy sites. Key findings of this research and applied work are discussed and recommendations for future activities are provided.

  7. Flutter analysis of an airfoil with multiple nonlinearities and uncertainties

    Haitao Liao

    2013-09-01

    An original method for calculating the limit cycle oscillations of a nonlinear aeroelastic system is presented. The problem of determining the maximum vibration amplitude of the limit cycle is transformed into a nonlinear optimization problem. The harmonic balance method and Floquet theory are selected to construct the general nonlinear equality and inequality constraints. The resulting constrained maximization problem is then solved using the MultiStart algorithm. Finally, the proposed approach is validated and used to analyse the limit cycle oscillations of an airfoil with multiple nonlinearities and uncertainties. Numerical examples show that the coexistence of multiple nonlinearities may lead to low-amplitude limit cycle oscillations.

  8. Best estimate analysis of LOFT L2-5 with CATHARE: uncertainty and sensitivity analysis

    JOUCLA, Jerome; PROBST, Pierre [Institute for Radiological Protection and Nuclear Safety, Fontenay-aux-Roses (France)]; FOUET, Fabrice [APTUS, Versailles (France)]

    2008-07-01

    The revision of 10 CFR 50.46 in 1988 made possible the use of best-estimate codes. They may be used in safety demonstration and licensing, provided that uncertainties are added to the relevant output parameters before comparing them with the acceptance criteria. In the safety analysis of the large-break loss-of-coolant accident, it was agreed that the 95th percentile, estimated with a high degree of confidence, should be lower than the acceptance criteria. It appeared necessary to IRSN, technical support of the French Safety Authority, to get more insight into these strategies, which are being developed not only in thermal-hydraulics but also in other fields such as neutronics. To estimate the 95th percentile with a high confidence level, we propose to use rank statistics or bootstrap. Toward the objective of assessing uncertainty, it is useful to determine and rank the main input parameters. We suggest approximating the code by a surrogate model, the Kriging model, which is used to perform a sensitivity analysis with the Sobol methodology. This paper presents the application of these two new methodologies to the uncertainty and sensitivity analysis of the maximum peak cladding temperature of the LOFT L2-5 test with the CATHARE code. (authors)

  9. Global sensitivity analysis for identifying important parameters of nitrogen nitrification and denitrification under model uncertainty and scenario uncertainty

    Chen, Zhuowei; Shi, Liangsheng; Ye, Ming; Zhu, Yan; Yang, Jinzhong

    2018-06-01

    Nitrogen reactive transport modeling is subject to uncertainty in model parameters, structures, and scenarios. By using a new variance-based global sensitivity analysis method, this paper identifies important parameters for nitrogen reactive transport with simultaneous consideration of these three uncertainties. A combination of three scenarios of soil temperature and two scenarios of soil moisture creates a total of six scenarios. Four alternative models describing the effect of soil temperature and moisture content are used to evaluate the reduction functions used for calculating actual reaction rates. The results show that for the nitrogen reactive transport problem, parameter importance varies substantially among different models and scenarios. The denitrification and nitrification processes are sensitive to the soil moisture content status rather than to the moisture function parameter. The nitrification process becomes more important at low moisture content and low temperature. However, the changing importance of nitrification activity with respect to temperature change relies heavily on the selected model. Model averaging is suggested to assess the nitrification (or denitrification) contribution by reducing the possible model error. Whether or not biochemical heterogeneity is introduced, a fairly consistent parameter importance ranking is obtained in this study: the optimal denitrification rate (Kden) is the most important parameter; the reference temperature (Tr) is more important than the temperature coefficient (Q10); and the empirical constant in the moisture response function (m) is the least important. The vertical distribution of soil moisture, but not temperature, plays the predominant role in controlling nitrogen reactions. This study provides insight into nitrogen reactive transport modeling and demonstrates an effective strategy for selecting the important parameters when future temperature and soil moisture carry uncertainties or when modelers are faced with multiple ways of establishing nitrogen…
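
    A variance-based first-order index can be estimated with a pick-freeze (Saltelli-type) scheme, sketched below on an invented stand-in for the reaction-rate model; the estimator, not the test function, is the point.

      import numpy as np

      rng = np.random.default_rng(7)
      n, d = 100_000, 3

      def model(x):
          # Invented stand-in: nonlinear in x0, additive in x1 and x2.
          return np.sin(np.pi * x[:, 0]) ** 2 + 0.5 * x[:, 1] + 0.1 * x[:, 2]

      # Saltelli's pick-freeze scheme for first-order Sobol indices S_i.
      A = rng.uniform(0, 1, (n, d))
      B = rng.uniform(0, 1, (n, d))
      fA, fB = model(A), model(B)
      var = np.var(np.concatenate([fA, fB]))

      for i in range(d):
          ABi = A.copy()
          ABi[:, i] = B[:, i]     # replace only column i with the B sample
          S_i = np.mean(fB * (model(ABi) - fA)) / var
          print(f"S_{i} ~ {S_i:.3f}")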

  10. Uncertainty and sensitivity analysis in the scenario simulation with RELAP/SCDAP and MELCOR codes

    Garcia J, T.; Cardenas V, J.

    2015-09-01

    A methodology was implemented for uncertainty analysis in simulations of scenarios with the RELAP/SCDAP V-3.4bi-7 and MELCOR V-2.1 codes, which are used to perform safety analyses at the Comision Nacional de Seguridad Nuclear y Salvaguardias (CNSNS). The uncertainty analysis methodology chosen is a probabilistic method of the propagation type, from the input parameters to the output parameters. The analysis therefore began with the selection of the input parameters considered uncertain and regarded as highly important in the scenario because of their direct effect on the output variable of interest. These parameters were randomly sampled according to variation intervals or probability distribution functions assigned by expert judgment, to generate a set of input files that were run through the simulation code to propagate the uncertainty to the output parameters. Then, through the use of order statistics and the Wilks formula, it was determined that the minimum number of runs required to obtain uncertainty bands covering 95% of the population at a 95% confidence level is 93; it is important to mention that with this method the number of runs does not depend on the number of selected input parameters. Routines were implemented in Fortran 90 to automate the uncertainty analysis of transients with the RELAP/SCDAP code. For the MELCOR severe accident code, automation was carried out through the Dakota Uncertainty plug-in incorporated into the SNAP platform. To test the practical application of this methodology, two analyses were performed: the first simulated the closure transient of the main steam isolation valves with the RELAP/SCDAP code, obtaining the uncertainty band of the vessel dome pressure; in the second analysis, the simulation of the station blackout (SBO) accident was carried out with the MELCOR code, obtaining the uncertainty band for the…
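
    The quoted figure of 93 runs is the second-order one-sided Wilks result for 95%/95% (the first-order result is 59 runs, where the bound is the sample maximum). A sketch of the run-count calculation:

      import math

      def wilks_runs(beta=0.95, gamma=0.95, order=1):
          """Smallest N such that the order-th largest of N runs bounds the
          gamma-percentile with confidence beta (one-sided Wilks formula)."""
          n = order
          while True:
              n += 1
              # P(at least `order` of the N runs exceed the gamma-percentile)
              conf = 1.0 - sum(
                  math.comb(n, k) * (1 - gamma) ** k * gamma ** (n - k)
                  for k in range(order)
              )
              if conf >= beta:
                  return n

      print(wilks_runs(order=1))  # 59 runs: bound is the sample maximum
      print(wilks_runs(order=2))  # 93 runs: bound is the second-largest value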

  11. Analysis and Reduction of Complex Networks Under Uncertainty.

    Ghanem, Roger G [University of Southern California]

    2014-07-31

    This effort was a collaboration with Youssef Marzouk of MIT, Omar Knio of Duke University (at the time at Johns Hopkins University) and Habib Najm of Sandia National Laboratories. The objective of this effort was to develop the mathematical and algorithmic capacity to analyze complex networks under uncertainty. Of interest were chemical reaction networks and smart grid networks. The statements of work for USC focused on the development of stochastic reduced models for uncertain networks. The USC team was led by Professor Roger Ghanem and consisted of one graduate student and a postdoc. The contributions completed by the USC team consisted of 1) methodology and algorithms to address the eigenvalue problem, a problem of significance in the stability of networks under stochastic perturbations, 2) methodology and algorithms to characterize probability measures on graph structures with random flows. This is an important problem in characterizing random demand (encountered in smart grid) and random degradation (encountered in infrastructure systems), as well as modeling errors in Markov Chains (with ubiquitous relevance!). 3) methodology and algorithms for treating inequalities in uncertain systems. This is an important problem in the context of models for material failure and network flows under uncertainty where conditions of failure or flow are described in the form of inequalities between the state variables.

  12. Uncertainty analysis of the 35% reactor inlet header break in a CANDU 6 reactor using RELAP/SCDAPSIM/MOD4.0 with integrated uncertainty analysis option

    Dupleac, D., E-mail: danieldu@cne.pub.ro [Politehnica Univ. of Bucharest (Romania)]; Perez, M.; Reventos, F., E-mail: marina.perez@upc.edu, E-mail: francesc.reventos@upc.edu [Technical Univ. of Catalonia (Spain)]; Allison, C., E-mail: iss@cableone.net [Innovative Systems Software (United States)]

    2011-07-01

    The RELAP/SCDAPSIM/MOD4.0 code, designed to predict the behavior of reactor systems during normal and accident conditions, is being developed as part of an international nuclear technology Software Development and Training Program (SDTP). RELAP/SCDAPSIM/MOD4.0, which is the first version of RELAP5 completely rewritten to FORTRAN 90/95/2000 standards, uses the publicly available RELAP5 and SCDAP models in combination with (a) advanced programming and numerical techniques, (b) advanced SDTP-member-developed models for LWR, HWR, and research reactor analysis, and (c) a variety of other member-developed computational packages. One such computational package is an integrated uncertainty analysis (IUA) package being developed jointly by the Technical University of Catalonia (UPC) and Innovative Systems Software (ISS). RELAP/SCDAPSIM/MOD4.0(IUA) follows the input-propagation approach, using probability distribution functions to define the uncertainty of the input parameters. The main steps of this type of methodology, often referred to as statistical approaches or Wilks’ methods, are the following: 1. Selection of the plant; 2. Selection of the scenario; 3. Selection of the safety criteria; 4. Identification and ranking of the relevant phenomena based on the safety criteria; 5. Selection of the appropriate code parameters to represent those phenomena; 6. Association of uncertainty by means of Probability Distribution Functions (PDFs) for each selected parameter; 7. Random sampling of the selected parameters according to their PDFs and performing multiple computer runs to obtain uncertainty bands with a certain percentile and confidence level; 8. Processing the results of the multiple computer runs to estimate the uncertainty bands for the computed quantities associated with the selected safety criteria. RELAP/SCDAPSIM/MOD4.0(IUA) calculates the number of required code runs given the desired percentile and confidence level, and performs the sampling process for the…

  13. Uncertainty analysis of the 35% reactor inlet header break in a CANDU 6 reactor using RELAP/SCDAPSIM/MOD4.0 with integrated uncertainty analysis option

    Dupleac, D.; Perez, M.; Reventos, F.; Allison, C.

    2011-01-01

    The RELAP/SCDAPSIM/MOD4.0 code, designed to predict the behavior of reactor systems during normal and accident conditions, is being developed as part of an international nuclear technology Software Development and Training Program (SDTP). RELAP/SCDAPSIM/MOD4.0, which is the first version of RELAP5 completely rewritten to FORTRAN 90/95/2000 standards, uses the publicly available RELAP5 and SCDAP models in combination with (a) advanced programming and numerical techniques, (b) advanced SDTP-member-developed models for LWR, HWR, and research reactor analysis, and (c) a variety of other member-developed computational packages. One such computational package is an integrated uncertainty analysis (IUA) package being developed jointly by the Technical University of Catalonia (UPC) and Innovative Systems Software (ISS). RELAP/SCDAPSIM/MOD4.0(IUA) follows the input-propagation approach, using probability distribution functions to define the uncertainty of the input parameters. The main steps of this type of methodology, often referred to as statistical approaches or Wilks’ methods, are the following: 1. Selection of the plant; 2. Selection of the scenario; 3. Selection of the safety criteria; 4. Identification and ranking of the relevant phenomena based on the safety criteria; 5. Selection of the appropriate code parameters to represent those phenomena; 6. Association of uncertainty by means of Probability Distribution Functions (PDFs) for each selected parameter; 7. Random sampling of the selected parameters according to their PDFs and performing multiple computer runs to obtain uncertainty bands with a certain percentile and confidence level; 8. Processing the results of the multiple computer runs to estimate the uncertainty bands for the computed quantities associated with the selected safety criteria. RELAP/SCDAPSIM/MOD4.0(IUA) calculates the number of required code runs given the desired percentile and confidence level, and performs the sampling process for the…

  14. Performance Assessment Uncertainty Analysis for Japan's HLW Program Feasibility Study (H12)

    BABA, T.; ISHIGURO, K.; ISHIHARA, Y.; SAWADA, A.; UMEKI, H.; WAKASUGI, K.; WEBB, ERIK K.

    1999-01-01

    Most HLW programs in the world recognize that any estimate of long-term radiological performance must be couched in terms of the uncertainties derived from natural variation, changes through time and lack of knowledge about the essential processes. The Japan Nuclear Cycle Development Institute followed a relatively standard procedure to address two major categories of uncertainty. First, a FEatures, Events and Processes (FEPs) listing, screening and grouping activity was pursued in order to define the range of uncertainty in system processes as well as possible variations in engineering design. A reference and many alternative cases representing various groups of FEPs were defined and individual numerical simulations performed for each to quantify the range of conceptual uncertainty. Second, parameter distributions were developed for the reference case to represent the uncertainty in the strength of these processes, the sequencing of activities and geometric variations. Both point estimates using high and low values for individual parameters as well as a probabilistic analysis were performed to estimate parameter uncertainty. A brief description of the conceptual model uncertainty analysis is presented. This paper focuses on presenting the details of the probabilistic parameter uncertainty assessment

  15. Uncertainty analysis of infinite homogeneous lead and sodium cooled fast reactors at beginning of life

    Vanhanen, R.

    2015-01-01

    The objective of the present work is to estimate breeding ratio, radiation damage rate and minor actinide transmutation rate of infinite homogeneous lead and sodium cooled fast reactors. Uncertainty analysis is performed taking into account uncertainty in nuclear data and composition of the reactors. We use the recently released ENDF/B-VII.1 nuclear data library and restrict the work to the beginning of reactor life. We work under multigroup approximation. The Bondarenko method is used to acquire effective cross sections for the homogeneous reactor. Modeling error and numerical error are estimated. The adjoint sensitivity analysis is performed to calculate generalized adjoint fluxes for the responses. The generalized adjoint fluxes are used to calculate first order sensitivities of the responses to model parameters. The acquired sensitivities are used to propagate uncertainties in the input data to find out uncertainties in the responses. We show that the uncertainty in model parameters is the dominant source of uncertainty, followed by modeling error, input data precision and numerical error. The uncertainty due to composition of the reactor is low. We identify main sources of uncertainty and note that the low-fidelity evaluation of ¹⁶O is problematic due to lack of correlation between total and elastic reactions.

  16. Uncertainty Analysis of Resistance Tests in Ata Nutku Ship Model Testing Laboratory of Istanbul Technical University

    Cihad DELEN

    2015-12-01

    In this study, systematic resistance tests performed in the Ata Nutku Ship Model Testing Laboratory of Istanbul Technical University (ITU) are examined in order to determine their uncertainties. Experiments conducted within the framework of mathematical and physical rules for the solution of engineering problems, together with the associated measurements and calculations, include uncertainty. To question the reliability of the obtained values, the existing uncertainties should be expressed as quantities. If the uncertainty of a measurement system is not known, the results do not carry a universal value. On the other hand, resistance is one of the most important parameters that should be considered in the process of ship design. Ship resistance cannot be determined precisely and reliably during the design phase because of the uncertainty sources involved in determining the resistance value. This may make it harder to meet the required specifications in later design steps. The uncertainty arising from the resistance test has been estimated and compared for a displacement-type ship and for high-speed marine vehicles according to the ITTC 2002 and ITTC 2014 regulations related to uncertainty analysis methods. The advantages and disadvantages of both ITTC uncertainty analysis methods are also discussed.

  17. Uncertainty analysis of infinite homogeneous lead and sodium cooled fast reactors at beginning of life

    Vanhanen, R., E-mail: risto.vanhanen@aalto.fi

    2015-03-15

    The objective of the present work is to estimate breeding ratio, radiation damage rate and minor actinide transmutation rate of infinite homogeneous lead and sodium cooled fast reactors. Uncertainty analysis is performed taking into account uncertainty in nuclear data and composition of the reactors. We use the recently released ENDF/B-VII.1 nuclear data library and restrict the work to the beginning of reactor life. We work under multigroup approximation. The Bondarenko method is used to acquire effective cross sections for the homogeneous reactor. Modeling error and numerical error are estimated. The adjoint sensitivity analysis is performed to calculate generalized adjoint fluxes for the responses. The generalized adjoint fluxes are used to calculate first order sensitivities of the responses to model parameters. The acquired sensitivities are used to propagate uncertainties in the input data to find out uncertainties in the responses. We show that the uncertainty in model parameters is the dominant source of uncertainty, followed by modeling error, input data precision and numerical error. The uncertainty due to composition of the reactor is low. We identify main sources of uncertainty and note that the low-fidelity evaluation of ¹⁶O is problematic due to lack of correlation between total and elastic reactions.

  18. Holistic uncertainty analysis in river basin modeling for climate vulnerability assessment

    Taner, M. U.; Wi, S.; Brown, C.

    2017-12-01

    The challenges posed by an uncertain future climate are a prominent concern for water resources managers. A number of frameworks exist for assessing the impacts of climate-related uncertainty, including internal climate variability and anthropogenic climate change, such as scenario-based and vulnerability-based approaches. While in many cases climate uncertainty may be dominant, other factors such as the future evolution of the river basin, the hydrologic response and reservoir operations are potentially significant sources of uncertainty. While uncertainty associated with modeling the hydrologic response has received attention, very little attention has focused on the range of uncertainty and possible effects of the water resources infrastructure and management. This work presents a holistic framework that allows analysis of climate, hydrologic and water management uncertainty in water resources systems analysis with the aid of a water system model designed to integrate component models for hydrologic processes and water management activities. The uncertainties explored include those associated with climate variability and change, hydrologic model parameters, and water system operation rules. A Bayesian framework is used to quantify and model the uncertainties at each modeling step in an integrated fashion, including prior and likelihood information about model parameters. The framework is demonstrated in a case study for the St. Croix Basin located at the border of the United States and Canada.

  19. Fuzzy probability based fault tree analysis to propagate and quantify epistemic uncertainty

    Purba, Julwan Hendry; Sony Tjahyani, D.T.; Ekariansyah, Andi Sofrany; Tjahjono, Hendro

    2015-01-01

    Highlights: • Fuzzy probability based fault tree analysis evaluates epistemic uncertainty in fuzzy fault tree analysis. • Fuzzy probabilities represent the likelihood of occurrence of all events in a fault tree. • A fuzzy multiplication rule quantifies the epistemic uncertainty of minimal cut sets. • A fuzzy complement rule estimates the epistemic uncertainty of the top event. • The proposed FPFTA successfully evaluated the U.S. Combustion Engineering RPS. - Abstract: A number of fuzzy fault tree analysis approaches, which integrate fuzzy concepts into the quantitative phase of conventional fault tree analysis, have been proposed to study the reliability of engineering systems. These new approaches apply expert judgment to overcome the limitation of conventional fault tree analysis when basic events do not have probability distributions. Since expert judgments may carry epistemic uncertainty, it is important to quantify the overall uncertainties of the fuzzy fault tree analysis. Monte Carlo simulation is commonly used to quantify the overall uncertainties of conventional fault tree analysis. However, since Monte Carlo simulation is based on probability distributions, this technique is not appropriate for fuzzy fault tree analysis, which is based on fuzzy probabilities. The objective of this study is to develop a fuzzy probability based fault tree analysis that overcomes this limitation. To demonstrate the applicability of the proposed approach, a case study is performed and its results are compared to those of a conventional fault tree analysis. The results confirm that the proposed fuzzy probability based fault tree analysis is feasible for propagating and quantifying epistemic uncertainties in fault tree analysis.
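
    A minimal sketch of the two rules named in the highlights, using triangular fuzzy probabilities written as (low, mode, high). The basic-event values are invented, and products of triangular numbers are kept only approximately, via their three defining points.

      def f_and(*events):
          """AND gate: fuzzy multiplication rule over a minimal cut set."""
          low = mode = high = 1.0
          for (l, m, h) in events:
              low, mode, high = low * l, mode * m, high * h
          return (low, mode, high)

      def f_or(*events):
          """OR gate via the fuzzy complement rule: 1 - prod(1 - p_i)."""
          pl = pm = ph = 1.0
          for (l, m, h) in events:
              pl, pm, ph = pl * (1 - l), pm * (1 - m), ph * (1 - h)
          # The result stays ordered as (low, mode, high).
          return (1 - pl, 1 - pm, 1 - ph)

      # Invented basic-event fuzzy probabilities.
      b1 = (1e-4, 2e-4, 4e-4)
      b2 = (5e-4, 1e-3, 2e-3)
      b3 = (1e-3, 2e-3, 5e-3)

      cut1 = f_and(b1, b2)      # minimal cut sets
      cut2 = f_and(b1, b3)
      top = f_or(cut1, cut2)    # top event with propagated epistemic spread
      print("top event (low, mode, high):", top)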

  20. Uncertainty analysis and design optimization of hybrid rocket motor powered vehicle for suborbital flight

    Zhu Hao

    2015-06-01

    In this paper, we propose an uncertainty analysis and design optimization method and its application to a hybrid rocket motor (HRM) powered vehicle. The multidisciplinary design model of the rocket system is established and the design uncertainties are quantified. The sensitivity analysis of the uncertainties shows that the uncertainty generated by the error of the fuel regression rate model has the most significant effect on system performance. The differences between deterministic design optimization (DDO) and uncertainty-based design optimization (UDO) are then discussed. Two newly developed uncertainty analysis methods, Kriging-based Monte Carlo simulation (KMCS) and Kriging-based Taylor series approximation (KTSA), are carried out using a global approximation Kriging modeling method. Based on the system design model and the results of the design uncertainty analysis, the design optimization of an HRM powered vehicle for suborbital flight is implemented using three design optimization methods: DDO, KMCS and KTSA. The comparisons indicate that the two UDO methods can enhance design reliability and robustness. The research and methods proposed in this paper can provide a better way for the general design of HRM powered vehicles.
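
    The KMCS idea, train a Kriging (Gaussian process) surrogate on a few expensive model runs and then run Monte Carlo on the cheap surrogate, can be sketched with scikit-learn; the stand-in model, training budget and input distribution below are all invented.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, ConstantKernel

      rng = np.random.default_rng(3)

      def expensive_model(x):
          # Invented stand-in for the HRM system model (e.g., apogee versus
          # regression-rate error and two design variables).
          return 100 + 30 * x[:, 0] - 12 * x[:, 1] ** 2 + 5 * np.sin(3 * x[:, 2])

      # A handful of "expensive" runs to train the Kriging surrogate.
      X_train = rng.uniform(-1, 1, (40, 3))
      y_train = expensive_model(X_train)
      gp = GaussianProcessRegressor(ConstantKernel() * RBF([1.0] * 3),
                                    normalize_y=True).fit(X_train, y_train)

      # Kriging-based Monte Carlo (KMCS): cheap MC on the surrogate.
      X_mc = rng.normal(0, 0.3, (50_000, 3)).clip(-1, 1)
      y_mc = gp.predict(X_mc)
      print("mean, std, 5th percentile:",
            y_mc.mean(), y_mc.std(), np.percentile(y_mc, 5))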

  1. Sensitivity and uncertainty analysis applied to a repository in rock salt

    Polle, A.N.

    1996-12-01

    This document describes the sensitivity and uncertainty analysis with UNCSAM, as applied to a repository in rock salt for the EVEREST project. UNCSAM is a dedicated software package for sensitivity and uncertainty analysis, which was already used within the preceding PROSA project. UNCSAM provides a flexible interface to EMOS ECN by substituting the sampled values in the various input files used by EMOS ECN; the model calculations for this repository were performed with the EMOS ECN code. Preceding the sensitivity and uncertainty analysis, a number of preparations were carried out to supply EMOS ECN with the probabilistic input data. For post-processing the EMOS ECN results, the characteristic output signals were processed. For the sensitivity and uncertainty analysis with UNCSAM, the stochastic input, i.e. the sampled values, and the output of the various EMOS ECN runs were analyzed. (orig.)

  2. May Day: A computer code to perform uncertainty and sensitivity analysis. Manuals

    Bolado, R.; Alonso, A.; Moya, J.M.

    1996-07-01

    The computer program May Day was developed to carry out uncertainty and sensitivity analysis in the evaluation of radioactive waste storage. May Day was developed by the Polytechnical University of Madrid. (Author)

  3. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis.

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-03-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, and hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable, and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely the Analytical Hierarchy Process (AHP) and Ordered Weighted Averaging (OWA), implemented in GIS. The methodology is composed of three phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility are analyzed as a function of weights using Monte Carlo simulation and global sensitivity analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparison of the obtained landslide susceptibility maps of both MCDA techniques with known landslides shows that AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criterion weights.
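
    The first phase, deriving criterion weights, is the classical AHP eigenvector computation, sketched below with an invented pairwise comparison matrix for three hypothetical landslide factors.

      import numpy as np

      # Pairwise comparison matrix for three factors (e.g., slope, lithology,
      # land cover); illustrative judgments on Saaty's 1-9 scale.
      A = np.array([[1.0, 3.0, 5.0],
                    [1/3, 1.0, 2.0],
                    [1/5, 1/2, 1.0]])

      # AHP weights = principal right eigenvector, normalized to sum to 1.
      vals, vecs = np.linalg.eig(A)
      k = np.argmax(vals.real)
      w = np.abs(vecs[:, k].real)
      w /= w.sum()

      # Consistency ratio (random index RI = 0.58 for n=3); CR < 0.1 is
      # conventionally acceptable.
      ci = (vals.real[k] - len(A)) / (len(A) - 1)
      print("weights:", w.round(3), "CR:", round(ci / 0.58, 3))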

  4. Global sensitivity analysis in wastewater treatment plant model applications: Prioritizing sources of uncertainty

    Sin, Gürkan; Gernaey, Krist; Neumann, Marc B.

    2011-01-01

    This study demonstrates the usefulness of global sensitivity analysis in wastewater treatment plant (WWTP) design to prioritize sources of uncertainty and quantify their impact on performance criteria. The study, which is performed with the Benchmark Simulation Model no. 1 plant design, complements a previous paper on input uncertainty characterisation and propagation (Sin et al., 2009). A sampling-based sensitivity analysis is conducted to compute standardized regression coefficients. It was found that this method is able to decompose satisfactorily the variance of the plant performance criteria (with R²…), providing insight into devising useful ways for reducing uncertainties in the plant performance. This information can help engineers design robust WWTP plants.
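
    Standardized regression coefficients are obtained by regressing the z-scored output on the z-scored inputs; for near-linear models with independent inputs, the squared SRCs sum to roughly R² and thus decompose the output variance. The sample below is an invented stand-in for BSM1.

      import numpy as np

      rng = np.random.default_rng(5)
      n = 1000

      # Sampled uncertain inputs (e.g., kinetic parameters) and a near-linear
      # plant performance criterion; all values are illustrative.
      X = rng.normal(size=(n, 3)) * [0.2, 1.0, 0.5] + [1.0, 10.0, 4.0]
      y = 2.0 * X[:, 0] + 0.3 * X[:, 1] - 0.8 * X[:, 2] + rng.normal(0, 0.2, n)

      # SRCs: ordinary least squares on z-scored data (no intercept needed).
      Xz = (X - X.mean(0)) / X.std(0)
      yz = (y - y.mean()) / y.std()
      src, *_ = np.linalg.lstsq(Xz, yz, rcond=None)
      print("SRCs:", src.round(3),
            " sum of SRC^2 (~R²):", round((src ** 2).sum(), 3))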

  5. Uncertainty analysis for probabilistic pipe fracture evaluations in LBB applications

    Rahman, S.; Ghadiali, N.; Wilkowski, G.

    1997-01-01

    During the NRC's Short Cracks in Piping and Piping Welds Program at Battelle, a probabilistic methodology was developed to conduct fracture evaluations of circumferentially cracked pipes for application to leak-rate detection. Later, in the IPIRG-2 program, several parameters that may affect leak-before-break and other pipe flaw evaluations were identified. This paper presents new results from several uncertainty analyses evaluating the effects of normal operating stresses, normal plus safe-shutdown earthquake stresses, off-centered cracks, restraint of pressure-induced bending, and dynamic and cyclic loading rates on the conditional failure probability of piping systems in BWRs and PWRs. For each parameter, the sensitivity of the conditional probability of failure, and hence its importance to probabilistic leak-before-break evaluations, was determined.

  6. Uncertainty and sensitivity analysis using probabilistic system assessment code. 1

    Honma, Toshimitsu; Sasahara, Takashi.

    1993-10-01

    This report presents the results obtained when applying the probabilistic system assessment code under development to the PSACOIN Level 0 intercomparison exercise organized by the Probabilistic System Assessment Code User Group in the Nuclear Energy Agency (NEA) of the OECD. This exercise is one of a series designed to compare and verify probabilistic codes in the performance assessment of geological radioactive waste disposal facilities. The computations were performed using the Monte Carlo sampling code PREP and the post-processor code USAMO. The submodels of the waste disposal system were described and coded according to the specification of the exercise. Besides the results required for the exercise, additional uncertainty and sensitivity analyses were performed, and their details are also included. (author)

  7. Uncertainty analysis for probabilistic pipe fracture evaluations in LBB applications

    Rahman, S.; Ghadiali, N.; Wilkowski, G.

    1997-04-01

    During the NRC's Short Cracks in Piping and Piping Welds Program at Battelle, a probabilistic methodology was developed to conduct fracture evaluations of circumferentially cracked pipes for application to leak-rate detection. Later, in the IPIRG-2 program, several parameters that may affect leak-before-break and other pipe flaw evaluations were identified. This paper presents new results from several uncertainty analyses evaluating the effects of normal operating stresses, normal plus safe-shutdown earthquake stresses, off-centered cracks, restraint of pressure-induced bending, and dynamic and cyclic loading rates on the conditional failure probability of piping systems in BWRs and PWRs. For each parameter, the sensitivity of the conditional probability of failure, and hence its importance to probabilistic leak-before-break evaluations, was determined.

  8. Application of quantile functions for the analysis and comparison of gas pressure balance uncertainties

    Ramnath Vishal

    2017-01-01

    Traditionally in the field of pressure metrology, uncertainty quantification was performed with the use of the Guide to the Expression of Uncertainty in Measurement (GUM); however, with the introduction of GUM Supplement 1 (GS1), the use of Monte Carlo simulations has become an accepted practice for uncertainty analysis in metrology for mathematical models in which the underlying assumptions of the GUM are not valid. Consequently, the use of quantile functions was developed as a means to easily summarize and report on uncertainty numerical results that were based on Monte Carlo simulations. In this paper, we considered the case of a piston–cylinder operated pressure balance, where the effective area is modelled in terms of a combination of explicit/implicit and linear/non-linear models, and how quantile functions may be applied to analyse results and compare uncertainties from a mixture of GUM and GS1 methodologies.
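
    As a minimal sketch of the quantile-function reporting described in this record (the piston–cylinder model, input distributions, and numbers below are illustrative assumptions, not values from the paper), a Monte Carlo uncertainty evaluation can be summarized by evaluating the empirical quantile function of the output sample:

      import numpy as np

      rng = np.random.default_rng(42)
      N = 200_000

      # Hypothetical pressure-balance model A_e = A0 * (1 + b * p); the input
      # PDFs below are illustrative, not the paper's characterisation.
      A0 = rng.normal(1.96e-5, 4.0e-10, N)   # zero-pressure effective area [m^2]
      b  = rng.normal(4.5e-12, 5.0e-13, N)   # pressure distortion coefficient [1/Pa]
      p  = 50e6                              # applied pressure [Pa]
      A_e = A0 * (1.0 + b * p)

      # The empirical quantile function Q(u) summarizes the GS1-style output;
      # coverage intervals are read off directly from it.
      for u in (0.025, 0.5, 0.975):
          print(f"Q({u:.3f}) = {np.quantile(A_e, u):.6e} m^2")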

  9. Effect of Uncertainties in Physical Property Estimates on Process Design - Sensitivity Analysis

    Hukkerikar, Amol; Jones, Mark Nicholas; Sin, Gürkan

    Chemical process design calculations require accurate and reliable physical and thermodynamic property data and property models of pure components and their mixtures in order to obtain reliable design parameters which help to achieve desired specifications. The uncertainties in the property values can arise from the experiments themselves or from the property models employed. It is important to consider the effect of these uncertainties on the process design in order to assess the quality and reliability of the final design. The main objective of this work is to develop a systematic methodology for performing sensitivity analysis of process design subject to uncertainties in the property estimates. To this end, first an uncertainty analysis of the property models of pure components and their mixtures was performed in order to obtain the uncertainties in the estimated property values. As a next step, sensitivity...

  10. Comprehensive Approach to Verification and Validation of CFD Simulations Applied to Backward Facing Step-Application of CFD Uncertainty Analysis

    Groves, Curtis E.; Ilie, Marcel; Shallhorn, Paul A.

    2012-01-01

    There are inherent uncertainties and errors associated with using Computational Fluid Dynamics (CFD) to predict the flow field, and there is no standard method for evaluating uncertainty in the CFD community. This paper describes an approach to validate the uncertainty in using CFD. The method will use state-of-the-art uncertainty analysis applying different turbulence models and draw conclusions on which models provide the least uncertainty and which models most accurately predict the flow of a backward facing step.

  11. Uncertainty and sensitivity analysis on probabilistic safety assessment of an experimental facility

    Burgazzi, L.

    2000-01-01

    The aim of this work is to perform an uncertainty and sensitivity analysis on the probabilistic safety assessment of the International Fusion Materials Irradiation Facility (IFMIF), in order to assess the effect on the final risk values of the uncertainties associated with the generic data used for the initiating events and component reliability, and to identify the key quantities contributing to this uncertainty. The analysis is conducted on the expected frequency calculated for the accident sequences, defined through event tree (ET) modeling. This is done in order to lend further credit to the ET model quantification, to calculate frequency distributions for the occurrence of events and, consequently, to assess whether sequences have been correctly selected from the probability standpoint, and finally to verify the fulfillment of the safety conditions. Uncertainty and sensitivity analyses are performed using Monte Carlo sampling and an importance parameter technique, respectively. (author)

  12. Analysis of uncertainties in the IAEA/WHO TLD postal dose audit system

    Izewska, J. [Department of Nuclear Sciences and Applications, International Atomic Energy Agency, Wagramer Strasse 5, Vienna (Austria)], E-mail: j.izewska@iaea.org; Hultqvist, M. [Department of Medical Radiation Physics, Karolinska Institute, Stockholm University, Stockholm (Sweden); Bera, P. [Department of Nuclear Sciences and Applications, International Atomic Energy Agency, Wagramer Strasse 5, Vienna (Austria)

    2008-02-15

    The International Atomic Energy Agency (IAEA) and the World Health Organisation (WHO) operate the IAEA/WHO TLD postal dose audit programme. Thermoluminescence dosimeters (TLDs) are used as transfer devices in this programme. In the present work the uncertainties in the dose determination from TLD measurements have been evaluated. The analysis of uncertainties comprises uncertainties in the calibration coefficient of the TLD system and uncertainties in factors correcting for dose response non-linearity, fading of TL signal, energy response and influence of TLD holder. The individual uncertainties have been combined to estimate the total uncertainty in the dose evaluated from TLD measurements. The combined relative standard uncertainty in the dose determined from TLD measurements has been estimated to be 1.2% for irradiations with Co-60 γ-rays and 1.6% for irradiations with high-energy X-rays. Results from irradiations by the Bureau international des poids et mesures (BIPM), Primary Standard Dosimetry Laboratories (PSDLs) and Secondary Standards Dosimetry Laboratories (SSDLs) compare favourably with the estimated uncertainties, whereas TLD results of radiotherapy centres show higher standard deviations than those derived theoretically.
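
    The combination step reported above follows the usual quadrature rule for uncorrelated components. A minimal sketch (the individual budget entries below are illustrative assumptions chosen to combine to roughly the quoted 1.2%; the abstract gives only the combined values):

      import math

      # Hypothetical relative standard uncertainties (%) of the TLD dose
      # determination; illustrative entries, not the IAEA/WHO budget.
      components = {
          "calibration coefficient": 0.9,
          "non-linearity correction": 0.4,
          "fading correction": 0.5,
          "energy response": 0.3,
          "holder correction": 0.2,
      }

      # Uncorrelated components combine in quadrature (root sum of squares).
      combined = math.sqrt(sum(u**2 for u in components.values()))
      print(f"combined relative standard uncertainty = {combined:.2f} %")  # ~1.2 %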

  13. Uncertainty Evaluation of the SFR Subchannel Thermal-Hydraulic Modeling Using a Hot Channel Factors Analysis

    Choi, Sun Rock; Cho, Chung Ho; Kim, Sang Ji

    2011-01-01

    In an SFR core analysis, a hot channel factors (HCF) method is most commonly used to evaluate uncertainty. It was employed in early designs such as the CRBRP and IFR. Alternatively, the improved thermal design procedure (ITDP) calculates the overall uncertainty based on the root-sum-square technique and sensitivity analyses of each design parameter. The Monte Carlo method (MCM) is also employed to estimate the uncertainties. In this method, all the input uncertainties are randomly sampled according to their probability density functions and the resulting distribution for the output quantity is analyzed. Since an uncertainty analysis is basically calculated from the temperature distribution in a subassembly, the core thermal-hydraulic modeling greatly affects the resulting uncertainty. At KAERI, the SLTHEN and MATRA-LMR codes have been utilized to analyze the SFR core thermal-hydraulics. The SLTHEN (steady-state LMR core thermal hydraulics analysis code based on the ENERGY model) code is a modified version of the SUPERENERGY2 code, which conducts a multi-assembly, steady state calculation based on a simplified ENERGY model. The detailed subchannel analysis code MATRA-LMR (Multichannel Analyzer for Steady-State and Transients in Rod Arrays for Liquid Metal Reactors), an LMR version of MATRA, was also developed specifically for the SFR core thermal-hydraulic analysis. This paper describes comparative studies for core thermal-hydraulic models. A subchannel analysis and hot-channel-factor-based uncertainty evaluation system is established to estimate the core thermofluidic uncertainties using the MATRA-LMR code, and the results are compared to those of the SLTHEN code.
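
    The three approaches named in this record can be contrasted on a toy problem. In the sketch below (the multiplicative hot-channel model, factor list, and numbers are illustrative assumptions, not the KAERI models), the deterministic HCF stacking is visibly more conservative than the RSS (ITDP-style) and Monte Carlo estimates:

      import numpy as np

      rng = np.random.default_rng(0)

      # Toy model: peak temperature rise as a product of independent
      # multiplicative engineering factors (illustrative only).
      dT_nom = 150.0                                             # nominal rise [K]
      factors = {"power": 0.03, "flow": 0.04, "geometry": 0.02}  # 1-sigma rel. unc.

      # Deterministic HCF: stack every factor at its 2-sigma bound.
      hcf = np.prod([1.0 + 2.0 * s for s in factors.values()])
      print(f"HCF-style bound:  {dT_nom * hcf:.1f} K")

      # ITDP-style root sum square of the individual 2-sigma effects.
      rss = np.sqrt(sum((2.0 * s) ** 2 for s in factors.values()))
      print(f"RSS (ITDP-style): {dT_nom * (1.0 + rss):.1f} K")

      # Monte Carlo: sample the factors, then take the ~2-sigma percentile.
      samples = dT_nom * np.prod(
          [rng.normal(1.0, s, 100_000) for s in factors.values()], axis=0)
      print(f"MCM percentile:   {np.percentile(samples, 97.7):.1f} K")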

  14. A practical sensitivity analysis method for ranking sources of uncertainty in thermal–hydraulics applications

    Pourgol-Mohammad, Mohammad, E-mail: pourgolmohammad@sut.ac.ir [Department of Mechanical Engineering, Sahand University of Technology, Tabriz (Iran, Islamic Republic of); Hoseyni, Seyed Mohsen [Department of Basic Sciences, East Tehran Branch, Islamic Azad University, Tehran (Iran, Islamic Republic of); Hoseyni, Seyed Mojtaba [Building & Housing Research Center, Tehran (Iran, Islamic Republic of); Sepanloo, Kamran [Nuclear Science and Technology Research Institute, Tehran (Iran, Islamic Republic of)

    2016-08-15

    Highlights: • Existing uncertainty ranking methods prove inconsistent for TH applications. • Introduction of a new method for ranking sources of uncertainty in TH codes. • Modified PIRT qualitatively identifies and ranks uncertainty sources more precisely. • The importance of parameters is calculated by a limited number of TH code executions. • Methodology is applied successfully on the LOFT-LB1 test facility. - Abstract: In application to thermal–hydraulic calculations by system codes, sensitivity analysis plays an important role in managing the uncertainties of code output and risk analysis. Sensitivity analysis is also used to confirm the results of the qualitative Phenomena Identification and Ranking Table (PIRT). Several methodologies have been developed to address uncertainty importance assessment. Generally, uncertainty importance measures, mainly devised for Probabilistic Risk Assessment (PRA) applications, are not affordable for the computationally demanding calculations of complex thermal–hydraulics (TH) system codes. In other words, for effective quantification of the degree of the contribution of each phenomenon to the total uncertainty of the output, a practical approach is needed considering the high computational burden of TH calculations. This study aims primarily to show the inefficiency of the existing approaches and then introduces a solution to cope with the challenges in this area by modification of the variance-based uncertainty importance method. Important parameters are identified qualitatively by the modified PIRT approach, then their uncertainty importance is quantified by a local derivative index. The proposed index is attractive from a practicality point of view for TH applications. It is capable of calculating the importance of parameters with a limited number of TH code executions. Application of the proposed methodology is demonstrated on the LOFT-LB1 test facility.
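
    A minimal sketch of a local derivative index of the kind described here (the stand-in function, parameter names, and uncertainties are assumptions for illustration; the paper's exact index definition may differ). The point is the run budget: n+1 code executions for n parameters:

      import numpy as np

      def th_code(x):
          # Stand-in for an expensive thermal-hydraulic system-code run;
          # toy algebraic response in place of the real code output [K].
          gap, cladk, power = x
          return 600.0 + 80.0 * power / gap + 30.0 / cladk

      x0 = np.array([1.0, 1.0, 1.0])        # nominal (normalized) inputs
      sigma = np.array([0.10, 0.05, 0.02])  # input standard uncertainties

      f0 = th_code(x0)
      index = np.empty_like(x0)
      for i in range(x0.size):
          dx = 0.01 * x0[i]
          x = x0.copy()
          x[i] += dx
          # Local derivative scaled by the input uncertainty gives an
          # importance measure in output units.
          index[i] = abs((th_code(x) - f0) / dx) * sigma[i]

      for name, s in zip(["gap conductance", "clad conductivity", "power"], index):
          print(f"{name:18s} importance ~ {s:.2f} K")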

  15. A practical sensitivity analysis method for ranking sources of uncertainty in thermal–hydraulics applications

    Pourgol-Mohammad, Mohammad; Hoseyni, Seyed Mohsen; Hoseyni, Seyed Mojtaba; Sepanloo, Kamran

    2016-01-01

    Highlights: • Existing uncertainty ranking methods prove inconsistent for TH applications. • Introduction of a new method for ranking sources of uncertainty in TH codes. • Modified PIRT qualitatively identifies and ranks uncertainty sources more precisely. • The importance of parameters is calculated by a limited number of TH code executions. • Methodology is applied successfully on the LOFT-LB1 test facility. - Abstract: In application to thermal–hydraulic calculations by system codes, sensitivity analysis plays an important role in managing the uncertainties of code output and risk analysis. Sensitivity analysis is also used to confirm the results of the qualitative Phenomena Identification and Ranking Table (PIRT). Several methodologies have been developed to address uncertainty importance assessment. Generally, uncertainty importance measures, mainly devised for Probabilistic Risk Assessment (PRA) applications, are not affordable for the computationally demanding calculations of complex thermal–hydraulics (TH) system codes. In other words, for effective quantification of the degree of the contribution of each phenomenon to the total uncertainty of the output, a practical approach is needed considering the high computational burden of TH calculations. This study aims primarily to show the inefficiency of the existing approaches and then introduces a solution to cope with the challenges in this area by modification of the variance-based uncertainty importance method. Important parameters are identified qualitatively by the modified PIRT approach, then their uncertainty importance is quantified by a local derivative index. The proposed index is attractive from a practicality point of view for TH applications. It is capable of calculating the importance of parameters with a limited number of TH code executions. Application of the proposed methodology is demonstrated on the LOFT-LB1 test facility.

  16. Uncertainties propagation and global sensitivity analysis of the frequency response function of piezoelectric energy harvesters

    Ruiz, Rafael O.; Meruane, Viviana

    2017-06-01

    The goal of this work is to describe a framework to propagate uncertainties in piezoelectric energy harvesters (PEHs). These uncertainties are related to the incomplete knowledge of the model parameters. The framework presented could be employed to conduct prior robust stochastic predictions. The prior analysis assumes a known probability density function for the uncertain variables and propagates the uncertainties to the output voltage. The framework is particularized to evaluate the behavior of the frequency response functions (FRFs) in PEHs, while its implementation is illustrated by the use of different unimorph and bimorph PEHs subjected to different scenarios: free of uncertainties, common uncertainties, and uncertainties as a product of imperfect clamping. The common variability associated with the PEH parameters is tabulated and reported. A global sensitivity analysis is conducted to identify the Sobol indices. Results indicate that the elastic modulus, density, and thickness of the piezoelectric layer are the most relevant parameters for the output variability. The importance of including the model parameter uncertainties in the estimation of the FRFs is revealed. In this sense, the present framework constitutes a powerful tool in the robust design and prediction of PEH performance.
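
    First-order Sobol indices of the kind reported here can be estimated with a pick-and-freeze scheme. A minimal sketch (the output function below is a nonlinear toy standing in for the FRF metric, and the input PDFs are assumptions, not the tabulated PEH variabilities):

      import numpy as np

      rng = np.random.default_rng(1)
      N = 200_000

      def model(E, rho, h):
          # Toy response standing in for a PEH FRF metric (illustrative).
          return np.sqrt(E / rho) * h

      def sample(n):
          # Illustrative PDFs: elastic modulus, density, thickness.
          return (rng.normal(66e9, 3e9, n),
                  rng.normal(7800.0, 200.0, n),
                  rng.normal(2e-4, 1e-5, n))

      A, B = sample(N), sample(N)
      yA, yB = model(*A), model(*B)
      varY = np.concatenate([yA, yB]).var()

      # Saltelli-style estimator: replace one column of A by that of B.
      for i, name in enumerate(["E", "rho", "h"]):
          ABi = list(A)
          ABi[i] = B[i]
          S_i = np.mean(yB * (model(*ABi) - yA)) / varY
          print(f"first-order Sobol index S_{name} ~ {S_i:.2f}")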

  17. How does uncertainty shape patient experience in advanced illness? A secondary analysis of qualitative data.

    Etkind, Simon Noah; Bristowe, Katherine; Bailey, Katharine; Selman, Lucy Ellen; Murtagh, Fliss Em

    2017-02-01

    Uncertainty is common in advanced illness but is infrequently studied in this context. If poorly addressed, uncertainty can lead to adverse patient outcomes. We aimed to understand patient experiences of uncertainty in advanced illness and develop a typology of patients' responses and preferences to inform practice. Secondary analysis of qualitative interview transcripts. Studies were assessed for inclusion and interviews were sampled using maximum-variation sampling. Analysis used a thematic approach with 10% of coding cross-checked to enhance reliability. Qualitative interviews from six studies including patients with heart failure, chronic obstructive pulmonary disease, renal disease, cancer and liver failure. A total of 30 transcripts were analysed. Median age was 75 (range 43-95); 12 patients were women. The impact of uncertainty was frequently discussed: the main related themes were engagement with illness, information needs, patient priorities and the period of time that patients mainly focused their attention on (temporal focus). A typology of patient responses to uncertainty was developed from these themes. Uncertainty influences patient experience in advanced illness through affecting patients' information needs, preferences and future priorities for care. Our typology aids understanding of how patients with advanced illness respond to uncertainty. Assessment of these three factors may be a useful starting point to guide clinical assessment and shared decision making.

  18. Status of XSUSA for sampling based nuclear data uncertainty and sensitivity analysis

    Zwermann, W.; Gallner, L.; Klein, M.; Krzykacz-Hausmann; Pasichnyk, I.; Pautz, A.; Velkov, K.

    2013-01-01

    In the present contribution, an overview of the sampling based XSUSA method for sensitivity and uncertainty analysis with respect to nuclear data is given. The focus is on recent developments and applications of XSUSA. These applications include calculations for critical assemblies, fuel assembly depletion calculations, and steady state as well as transient reactor core calculations. The analyses are partially performed in the framework of international benchmark working groups (UACSA - Uncertainty Analyses for Criticality Safety Assessment, UAM - Uncertainty Analysis in Modelling). It is demonstrated that particularly for full-scale reactor calculations the influence of the nuclear data uncertainties on the results can be substantial. For instance, for the radial fission rate distributions of mixed UO2/MOX light water reactor cores, the 2σ uncertainties in the core centre and periphery can reach values exceeding 10%. For a fast transient, the resulting time behaviour of the reactor power was covered by a wide uncertainty band. Overall, the results confirm the necessity of adding systematic uncertainty analyses to best-estimate reactor calculations. (authors)

  19. Uncertainty and sensitivity analysis of control strategies using the benchmark simulation model No1 (BSM1).

    Flores-Alsina, Xavier; Rodriguez-Roda, Ignasi; Sin, Gürkan; Gernaey, Krist V

    2009-01-01

    The objective of this paper is to perform an uncertainty and sensitivity analysis of the predictions of the Benchmark Simulation Model (BSM) No. 1, when comparing four activated sludge control strategies. The Monte Carlo simulation technique is used to evaluate the uncertainty in the BSM1 predictions, considering the ASM1 bio-kinetic parameters and influent fractions as input uncertainties while the Effluent Quality Index (EQI) and the Operating Cost Index (OCI) are focused on as model outputs. The resulting Monte Carlo simulations are presented using descriptive statistics indicating the degree of uncertainty in the predicted EQI and OCI. Next, the Standard Regression Coefficients (SRC) method is used for sensitivity analysis to identify which input parameters influence the uncertainty in the EQI predictions the most. The results show that control strategies including an ammonium (S(NH)) controller reduce uncertainty in both overall pollution removal and effluent total Kjeldahl nitrogen. Also, control strategies with an external carbon source reduce the effluent nitrate (S(NO)) uncertainty, increasing both their economic cost and variability as a trade-off. Finally, the maximum specific autotrophic growth rate (μA) causes most of the variance in the effluent for all the evaluated control strategies. The influence of denitrification-related parameters, e.g. ηg (anoxic growth rate correction factor) and ηh (anoxic hydrolysis rate correction factor), becomes less important when a S(NO) controller manipulating an external carbon source addition is implemented.
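
    The SRC step used in this study amounts to an ordinary least-squares fit on standardized Monte Carlo inputs and outputs. A minimal sketch (the toy effluent response and the input distributions are assumptions for illustration, not the BSM1/ASM1 values):

      import numpy as np

      rng = np.random.default_rng(7)
      n = 5000

      # Illustrative ASM1-like inputs (not the BSM1 characterisation).
      mu_A  = rng.normal(0.80, 0.08, n)   # autotrophic max growth rate [1/d]
      eta_g = rng.normal(0.80, 0.05, n)   # anoxic growth correction [-]
      f_SS  = rng.normal(0.20, 0.02, n)   # readily biodegradable fraction [-]

      # Toy effluent-quality response standing in for the BSM1 output.
      EQI = 6000 - 2500*mu_A - 800*eta_g - 1500*f_SS + rng.normal(0, 50, n)

      # Standardize, then least squares: the coefficients are the SRCs.
      X = np.column_stack([mu_A, eta_g, f_SS])
      Xs = (X - X.mean(0)) / X.std(0)
      ys = (EQI - EQI.mean()) / EQI.std()
      src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)

      r2 = 1.0 - np.sum((ys - Xs @ src) ** 2) / np.sum(ys ** 2)
      print(dict(zip(["mu_A", "eta_g", "f_SS"], src.round(2))), "R2 =", round(r2, 2))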

  20. A global water supply reservoir yield model with uncertainty analysis

    Kuria, Faith W; Vogel, Richard M

    2014-01-01

    Understanding the reliability and uncertainty associated with water supply yields derived from surface water reservoirs is central for planning purposes. Using a global dataset of monthly river discharge, we introduce a generalized model for estimating the mean and variance of the water supply yield, Y, expected from a reservoir for a prespecified reliability, R, and storage capacity, S, assuming a flow record of length n. The generalized storage–reliability–yield (SRY) relationships reported here have numerous water resource applications ranging from preliminary water supply investigations to economic and climate change impact assessments. An example indicates how our generalized SRY relationship can be combined with a hydroclimatic model to determine the impact of climate change on surface reservoir water supply yields. We also document that the variability of estimates of water supply yield is invariant to characteristics of the reservoir system, including its storage capacity and reliability. Standardized metrics of the variability of water supply yields are shown to depend only on the sample size of the inflows and the statistical characteristics of the inflow series. (paper)
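
    The storage–reliability–yield relationship in this record can be emulated on any inflow record with a behaviour simulation plus a search over yield. A minimal sketch (the gamma-distributed monthly inflows, capacity, and target reliability are assumptions; the paper fits generalized relationships to a global discharge dataset):

      import numpy as np

      rng = np.random.default_rng(9)

      def reliability(inflow, S, Y):
          # Fraction of months in which the full draft Y is supplied by a
          # reservoir of capacity S (simple behaviour simulation).
          storage, met = S, 0
          for q in inflow:
              storage = min(storage + q - Y, S)
              if storage >= 0.0:
                  met += 1
              else:
                  storage = 0.0   # failure month: reservoir empties
          return met / inflow.size

      inflow = rng.gamma(shape=2.0, scale=50.0, size=600)  # volume/month

      # Bisection for the yield Y achieving reliability R at capacity S.
      S, R_target = 500.0, 0.95
      lo, hi = 0.0, inflow.mean()
      for _ in range(40):
          mid = 0.5 * (lo + hi)
          lo, hi = (mid, hi) if reliability(inflow, S, mid) >= R_target else (lo, mid)
      print(f"yield ~ {lo:.1f} volume units/month at R = {R_target}, S = {S}")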

  1. A GLUE uncertainty analysis of a drying model of pharmaceutical granules

    Mortier, Séverine Thérèse F.C.; Van Hoey, Stijn; Cierkens, Katrijn

    2013-01-01

    ... unit, which is part of the full continuous from-powder-to-tablet manufacturing line (Consigma™, GEA Pharma Systems). A validated model describing the drying behaviour of a single pharmaceutical granule in two consecutive phases is used. First of all, the effect of the assumptions at the particle level on the prediction uncertainty is assessed. Secondly, the paper focuses on the influence of the most sensitive parameters in the model. Finally, a combined analysis (particle level plus most sensitive parameters) is performed and discussed. To propagate the uncertainty originating from the parameter uncertainty...

  2. Uncertainty Analysis Framework - Hanford Site-Wide Groundwater Flow and Transport Model

    Cole, Charles R.; Bergeron, Marcel P.; Murray, Christopher J.; Thorne, Paul D.; Wurstner, Signe K.; Rogers, Phillip M.

    2001-11-09

    Pacific Northwest National Laboratory (PNNL) embarked on a new initiative to strengthen the technical defensibility of the predictions being made with a site-wide groundwater flow and transport model at the U.S. Department of Energy Hanford Site in southeastern Washington State. In FY 2000, the focus of the initiative was on the characterization of major uncertainties in the current conceptual model that would affect model predictions. The long-term goals of the initiative are the development and implementation of an uncertainty estimation methodology in future assessments and analyses using the site-wide model. This report focuses on the development and implementation of an uncertainty analysis framework.

  3. Simulating fuel behavior under transient conditions using FRAPTRAN and uncertainty analysis using Dakota

    Gomes, Daniel S.; Teixeira, Antonio S.

    2017-01-01

    Although regulatory agencies have shown a special interest in incorporating best estimate approaches in the fuel licensing process, fuel codes are currently licensed based only on deterministic limits such as those in 10 CFR 50, and may therefore yield unrealistic safety margins. The concept of uncertainty analysis is employed to more realistically manage this risk. In this study, uncertainties were classified into two categories: probabilistic and epistemic (the latter owing to a lack of pre-existing knowledge). Fuel rods have three sources of uncertainty: manufacturing tolerances, boundary conditions, and physical models. The first step in successfully analyzing the uncertainties involves performing a statistical analysis on the input parameters used throughout the fuel code. The response obtained from this analysis must show proportional index correlations because the uncertainties are globally propagated. The Dakota toolkit was used to analyze the FRAPTRAN transient fuel code. The subsequent sensitivity analyses helped in identifying the key parameters with the highest correlation indices, including the peak cladding temperature and the time required for cladding failures. The uncertainty analysis was performed using an IFA-650-5 fuel rod, in line with the tests performed in the Halden Project in Norway. The main objectives of the Halden project included studying the ballooning and rupture processes. The results of this experiment demonstrate the accuracy and applicability of the physical models in evaluating the thermal conductivity, mechanical model, and fuel swelling formulations. (author)
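
    A minimal sketch of correlation-based ranking of sampled inputs against a code output (the input list, distributions, and the algebraic stand-in for the fuel-code response are assumptions; a real study would run the code via Dakota rather than a toy function):

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(3)
      n = 1000

      # Illustrative sampled inputs (manufacturing tolerance, boundary
      # condition, property) - assumed names and PDFs.
      gap   = rng.normal(8.0e-5, 8.0e-6, n)   # fuel-clad gap [m]
      power = rng.normal(40.0, 2.0, n)        # rod power [kW/m]
      k_cl  = rng.normal(16.0, 1.0, n)        # clad conductivity [W/m/K]

      # Toy peak cladding temperature standing in for the code output.
      pct = 800 + 3.0e6*gap + 9.0*power - 5.0*k_cl + rng.normal(0, 5, n)

      # Rank (Spearman) correlation is a common sampling-based sensitivity
      # summary for this kind of study.
      for name, x in [("gap", gap), ("power", power), ("clad k", k_cl)]:
          rho, _ = stats.spearmanr(x, pct)
          print(f"{name:7s} rank correlation with PCT: {rho:+.2f}")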

  4. Simulating fuel behavior under transient conditions using FRAPTRAN and uncertainty analysis using Dakota

    Gomes, Daniel S.; Teixeira, Antonio S., E-mail: dsgomes@ipen.br, E-mail: teixeira@ipen [Instituto de Pesquisas Energéticas e Nucleares (IPEN/CNEN-SP), São Paulo, SP (Brazil)

    2017-07-01

    Although regulatory agencies have shown a special interest in incorporating best estimate approaches in the fuel licensing process, fuel codes are currently licensed based only on deterministic limits such as those in 10 CFR 50, and may therefore yield unrealistic safety margins. The concept of uncertainty analysis is employed to more realistically manage this risk. In this study, uncertainties were classified into two categories: probabilistic and epistemic (the latter owing to a lack of pre-existing knowledge). Fuel rods have three sources of uncertainty: manufacturing tolerances, boundary conditions, and physical models. The first step in successfully analyzing the uncertainties involves performing a statistical analysis on the input parameters used throughout the fuel code. The response obtained from this analysis must show proportional index correlations because the uncertainties are globally propagated. The Dakota toolkit was used to analyze the FRAPTRAN transient fuel code. The subsequent sensitivity analyses helped in identifying the key parameters with the highest correlation indices, including the peak cladding temperature and the time required for cladding failures. The uncertainty analysis was performed using an IFA-650-5 fuel rod, in line with the tests performed in the Halden Project in Norway. The main objectives of the Halden project included studying the ballooning and rupture processes. The results of this experiment demonstrate the accuracy and applicability of the physical models in evaluating the thermal conductivity, mechanical model, and fuel swelling formulations. (author)

  5. HTGR reactor physics, thermal-hydraulics and depletion uncertainty analysis: a proposed IAEA coordinated research project

    Tyobeka, Bismark; Reitsma, Frederik; Ivanov, Kostadin

    2011-01-01

    The continued development of High Temperature Gas Cooled Reactors (HTGRs) requires verification of HTGR design and safety features with reliable high fidelity physics models and robust, efficient, and accurate codes. The predictive capability of coupled neutronics/thermal hydraulics and depletion simulations for reactor design and safety analysis can be assessed with sensitivity analysis and uncertainty analysis methods. In order to benefit from recent advances in modeling and simulation and the availability of new covariance data (nuclear data uncertainties), extensive sensitivity and uncertainty studies are needed for quantification of the impact of different sources of uncertainties on the design and safety parameters of HTGRs. Uncertainty and sensitivity studies are an essential component of any significant effort in data and simulation improvement. In February 2009, the Technical Working Group on Gas-Cooled Reactors recommended that the proposed IAEA Coordinated Research Project (CRP) on the HTGR Uncertainty Analysis in Modeling be implemented. In this paper, the current status and plan are presented. The CRP will also benefit from interactions with the currently ongoing OECD/NEA Light Water Reactor (LWR) UAM benchmark activity by taking into consideration the peculiarities of HTGR designs and simulation requirements. (author)

  6. Practical Policy Applications of Uncertainty Analysis for National Greenhouse Gas Inventories

    Gillenwater, M. [Environmental Resources Trust (United States)], E-mail: mgillenwater@ert.net; Sussman, F.; Cohen, J. [ICF International (United States)

    2007-09-15

    ... goals precisely in terms of relationships among important variables (such as emissions estimate, commitment level, or statistical confidence); and (3) develop a quantifiable adjustment mechanism that reflects these environmental goals. We recommend that countries implement an investigation-focused (i.e., qualitative) uncertainty analysis that will (1) provide the type of information necessary to develop more substantive, and potentially useful, quantitative uncertainty estimates, regardless of whether those quantitative estimates are used for policy purposes; and (2) provide the information needed to understand the likely causes of uncertainty in inventory data and thereby point to ways to improve inventory quality (i.e., accuracy, transparency, completeness, and consistency).

  7. Practical Policy Applications of Uncertainty Analysis for National Greenhouse Gas Inventories

    Gillenwater, M.; Sussman, F.; Cohen, J.

    2007-01-01

    ... goals precisely in terms of relationships among important variables (such as emissions estimate, commitment level, or statistical confidence); and (3) develop a quantifiable adjustment mechanism that reflects these environmental goals. We recommend that countries implement an investigation-focused (i.e., qualitative) uncertainty analysis that will (1) provide the type of information necessary to develop more substantive, and potentially useful, quantitative uncertainty estimates, regardless of whether those quantitative estimates are used for policy purposes; and (2) provide the information needed to understand the likely causes of uncertainty in inventory data and thereby point to ways to improve inventory quality (i.e., accuracy, transparency, completeness, and consistency).

  8. Uncertainty Analysis of the Temperature–Resistance Relationship of Temperature Sensing Fabric

    Muhammad Dawood Husain

    2016-11-01

    This paper reports the uncertainty analysis of the temperature–resistance (TR) data of the newly developed temperature sensing fabric (TSF), which is a double-layer knitted structure fabricated on an electronic flat-bed knitting machine, made of polyester as a basal yarn and embedded with fine metallic wire as the sensing element. The measurement principle of the TSF is identical to that of a resistance temperature detector (RTD); that is, change in resistance due to change in temperature. The regression uncertainty (uncertainty within repeats) and repeatability uncertainty (uncertainty among repeats) were estimated by analysing more than 300 TR experimental repeats of 50 TSF samples. The experiments were performed under dynamic heating and cooling environments on a purpose-built test rig within the temperature range of 20–50 °C. The continuous experimental data were recorded through a LabVIEW-based graphical user interface. The results showed that temperature and resistance values were not only repeatable but also reproducible, with only minor variations. The regression uncertainty was found to be less than ±0.3 °C; the TSF samples made of Ni and W wires showed regression uncertainties of <±0.13 °C in comparison to Cu-based TSF samples (>±0.18 °C). The cooling TR data showed considerably reduced uncertainty (±0.07 °C) in comparison with the heating TR data (±0.24 °C). The repeatability uncertainty was found to be less than ±0.5 °C. By increasing the number of samples and repeats, the uncertainties may be reduced further. The TSF could be used for continuous measurement of the temperature profile on the surface of the human body.
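
    The "regression uncertainty" quoted above can be illustrated by fitting the linear RTD-type T–R relationship and taking the residual scatter in temperature units. A minimal sketch with synthetic calibration data (the R0, alpha, and noise values are assumptions, not the TSF measurements):

      import numpy as np

      rng = np.random.default_rng(5)

      # Synthetic heating run: R = R0*(1 + alpha*T) plus measurement noise.
      T = np.linspace(20.0, 50.0, 61)                               # [deg C]
      R = 100.0 * (1.0 + 0.0039 * T) + rng.normal(0, 0.02, T.size)  # [ohm]

      # Fit T as a linear function of R (the direction used in service),
      # then take the residual standard deviation in degrees C.
      slope, intercept = np.polyfit(R, T, 1)
      resid = T - (slope * R + intercept)
      u_reg = resid.std(ddof=2)
      print(f"regression uncertainty ~ +/-{2.0 * u_reg:.2f} deg C (2 sigma)")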

  9. Introducing uncertainty analysis of nucleation and crystal growth models in Process Analytical Technology (PAT) system design of crystallization processes

    Abdul Samad, Noor Asma Fazli Bin; Sin, Gürkan; Gernaey, Krist

    2013-01-01

    This paper presents the application of uncertainty and sensitivity analysis as part of a systematic model-based process monitoring and control (PAT) system design framework for crystallization processes. For the uncertainty analysis, the Monte Carlo procedure is used to propagate input uncertainty...

  10. Preliminary Uncertainty Analysis for SMART Digital Core Protection and Monitoring System

    Koo, Bon Seung; In, Wang Kee; Hwang, Dae Hyun

    2012-01-01

    The Korea Atomic Energy Research Institute (KAERI) developed on-line digital core protection and monitoring systems, called SCOPS and SCOMS, as part of the SMART plant protection and monitoring system. SCOPS simplified the protection system by directly connecting the four RSPT signals to each core protection channel and eliminated the control element assembly calculator (CEAC) hardware. SCOMS adopted the DPCM3D method for synthesizing the core power distribution instead of the Fourier expansion method used in conventional PWRs. The DPCM3D method produces a synthetic 3-D power distribution by coupling a neutronics code and measured in-core detector signals. An overall uncertainty analysis methodology, used to statistically combine the uncertainty components of the SMART core protection and monitoring system, was developed. In this paper, preliminary overall uncertainty factors for SCOPS/SCOMS of the SMART initial core were evaluated by applying the newly developed uncertainty analysis method.

  11. Development of Uncertainty Analysis Method for SMART Digital Core Protection and Monitoring System

    Koo, Bon Seung; In, Wang Kee; Hwang, Dae Hyun

    2012-01-01

    The Korea Atomic Energy Research Institute has developed a system-integrated modular advanced reactor (SMART) for seawater desalination and electricity generation. On-line digital core protection and monitoring systems, called SCOPS and SCOMS respectively, were developed. SCOPS calculates the minimum DNBR and maximum LPD based on several on-line measured system parameters. SCOMS calculates the variables of limiting conditions for operation. KAERI developed an overall uncertainty analysis methodology that statistically combines the uncertainty components of the SMART core protection and monitoring system. By applying overall uncertainty factors in the on-line SCOPS/SCOMS calculation, the calculated LPD and DNBR are conservative with a 95/95 probability/confidence level. In this paper, the uncertainty analysis method for the SMART core protection and monitoring system is described.

  12. The uncertainty analysis of a liquid metal reactor for burning minor actinides from light water reactors

    Choi, Hang Bok [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1998-12-31

    The neutronics analysis of a liquid metal reactor for burning minor actinides has shown that uncertainties in the nuclear data of several key minor actinide isotopes can introduce large uncertainties in the predicted performance of the core. A comprehensive sensitivity and uncertainty analysis was performed on a 1200 MWth actinide burner designed for a low burnup reactivity swing, negative Doppler coefficient, and low sodium void worth. Sensitivities were generated using depletion perturbation methods for the equilibrium cycle of the reactor, and covariance data were taken from ENDF/B-V and other published sources. The relative uncertainties in the burnup swing, Doppler coefficient, and void worth were conservatively estimated to be 180%, 97%, and 46%, respectively. 5 refs., 1 fig., 3 tabs. (Author)

  13. A new measure of uncertainty importance based on distributional sensitivity analysis for PSA

    Han, Seok Jung; Tak, Nam Il; Chun, Moon Hyun

    1996-01-01

    The main objective of the present study is to propose a new measure of uncertainty importance based on distributional sensitivity analysis. The new measure is developed to utilize a metric distance obtained from cumulative distribution functions (cdfs). The measure is evaluated for two cases: one is a cdf given by a known analytical distribution and the other is given by an empirical distribution generated by a crude Monte Carlo simulation. To study its applicability, the present measure has been applied to two different cases. The results are compared with those of three existing methods. The present approach is a useful measure of uncertainty importance based on cdfs. The method is simple, and uncertainty importance can be calculated without any complex process. On the basis of the results obtained in the present work, the present method is recommended as a tool for the analysis of uncertainty importance.
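
    A minimal sketch of a cdf-distance importance measure in the spirit of this record (the two-input toy model is an assumption, and the Kolmogorov distance is used as the metric; the paper's metric may differ). An input is important if fixing it moves the output cdf far from the base case:

      import numpy as np

      rng = np.random.default_rng(11)
      n = 50_000

      def output(x1, x2):
          # Toy response standing in for a PSA model output.
          return x1 * np.exp(0.5 * x2)

      x1 = rng.lognormal(0.0, 0.5, n)
      x2 = rng.normal(0.0, 1.0, n)
      y_base = np.sort(output(x1, x2))

      def cdf_distance(a, b):
          # Kolmogorov distance: max gap between two empirical cdfs
          # (a and b must be sorted).
          grid = np.union1d(a, b)
          Fa = np.searchsorted(a, grid, side="right") / a.size
          Fb = np.searchsorted(b, grid, side="right") / b.size
          return np.abs(Fa - Fb).max()

      # Importance of x_i ~ shift of the output cdf when x_i is fixed.
      for name, y_fixed in [("x1", output(np.full(n, x1.mean()), x2)),
                            ("x2", output(x1, np.full(n, x2.mean())))]:
          d = cdf_distance(y_base, np.sort(y_fixed))
          print(f"fixing {name}: cdf distance = {d:.3f}")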

  14. The uncertainty analysis of a liquid metal reactor for burning minor actinides from light water reactors

    Choi, Hang Bok [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1999-12-31

    The neutronics analysis of a liquid metal reactor for burning minor actinides has shown that uncertainties in the nuclear data of several key minor actinide isotopes can introduce large uncertainties in the predicted performance of the core. A comprehensive sensitivity and uncertainty analysis was performed on a 1200 MWth actinide burner designed for a low burnup reactivity swing, negative Doppler coefficient, and low sodium void worth. Sensitivities were generated using depletion perturbation methods for the equilibrium cycle of the reactor, and covariance data were taken from ENDF/B-V and other published sources. The relative uncertainties in the burnup swing, Doppler coefficient, and void worth were conservatively estimated to be 180%, 97%, and 46%, respectively. 5 refs., 1 fig., 3 tabs. (Author)

  15. Code development of total sensitivity and uncertainty analysis for reactor physics calculations

    Wan, C.; Cao, L.; Wu, H.; Zu, T.; Shen, W.

    2015-01-01

    Sensitivity and uncertainty analyses are essential parts of risk and policy analysis for reactor systems. In this study, total sensitivity and corresponding uncertainty analyses for the responses of neutronics calculations have been accomplished, and an S&U analysis code named UNICORN has been developed. The UNICORN code can consider the implicit effects of multigroup cross sections on the responses. The UNICORN code has been applied to a typical pin-cell case in this paper, and is shown to be correct by comparing its results with those of the TSUNAMI-1D code. (author)

  16. Code development of total sensitivity and uncertainty analysis for reactor physics calculations

    Wan, C.; Cao, L.; Wu, H.; Zu, T., E-mail: chenghuiwan@stu.xjtu.edu.cn, E-mail: caolz@mail.xjtu.edu.cn, E-mail: hongchun@mail.xjtu.edu.cn, E-mail: tiejun@mail.xjtu.edu.cn [Xi'an Jiaotong Univ., School of Nuclear Science and Technology, Xi'an (China); Shen, W., E-mail: Wei.Shen@cnsc-ccsn.gc.ca [Xi'an Jiaotong Univ., School of Nuclear Science and Technology, Xi'an (China); Canadian Nuclear Safety Commission, Ottawa, ON (Canada)

    2015-07-01

    Sensitivity and uncertainty analyses are essential parts of risk and policy analysis for reactor systems. In this study, total sensitivity and corresponding uncertainty analyses for the responses of neutronics calculations have been accomplished, and an S&U analysis code named UNICORN has been developed. The UNICORN code can consider the implicit effects of multigroup cross sections on the responses. The UNICORN code has been applied to a typical pin-cell case in this paper, and is shown to be correct by comparing its results with those of the TSUNAMI-1D code. (author)

  17. Phenomenological uncertainty analysis of early containment failure at severe accident of nuclear power plant

    Lee, Su Won

    2011-02-15

    Severe accidents inherently involve significant uncertainty due to the wide range of conditions, and performing experiments, validation and practical application is extremely difficult because of the high temperatures and pressures involved. Although domestic and foreign research has been put into practice, the references used in Korean nuclear plants were foreign data from the 1980s, and safety analyses such as the probabilistic safety assessment have not applied the newest methodology. Also, the containment pressure resulting from thermal-hydraulic analysis is applied as a point value to identify the probability of containment failure in Level 2 PSA. In this paper, uncertainty analysis methods for severe accident phenomena influencing early containment failure were developed, an uncertainty analysis applicable to Korean nuclear plants was performed using the MELCOR code, and the distribution of containment pressure is presented as a result of the uncertainty analysis. Because early containment failure is an important factor in the Large Early Release Frequency (LERF), which is used as a representative criterion for decision-making in nuclear power plants, it was selected in this paper among the various modes of containment failure. Important phenomena of early containment failure in severe accidents were identified based on previous research, and a seven-step methodology to evaluate uncertainty was developed. A MELCOR input deck for severe accident analysis reflecting natural circulation flow was developed, and the accident scenario for station blackout, a representative initiating event for early containment failure, was determined. By reviewing the internal models and correlations of MELCOR relevant to the important phenomena of early containment failure, the factors which could affect the uncertainty were found, and the major factors were finally identified through sensitivity analysis. In order to determine the total number of MELCOR calculations which can

  18. Parameter sensitivity and uncertainty of the forest carbon flux model FORUG : a Monte Carlo analysis

    Verbeeck, H.; Samson, R.; Lemeur, R. [Ghent Univ., Ghent (Belgium). Laboratory of Plant Ecology; Verdonck, F. [Ghent Univ., Ghent (Belgium). Dept. of Applied Mathematics, Biometrics and Process Control

    2006-06-15

    The FORUG model is a multi-layer process-based model that simulates carbon dioxide (CO2) and water exchange between forest stands and the atmosphere. The main model outputs are net ecosystem exchange (NEE), total ecosystem respiration (TER), gross primary production (GPP) and evapotranspiration. This study used a sensitivity analysis to identify the parameters contributing to NEE uncertainty in the FORUG model. The aim was to determine if it is necessary to estimate the uncertainty of all parameters of a model to determine overall output uncertainty. Data used in the study were the meteorological and flux data of beech trees in Hesse. The Monte Carlo method was used to rank sensitivity and uncertainty parameters in combination with a multiple linear regression. Simulations were run in which parameters were assigned probability distributions and the effect of variance in the parameters on the output distribution was assessed. The uncertainty of the output for NEE was estimated. Based on the arbitrary uncertainty of 10 key parameters, a standard deviation of 0.88 Mg C per year for NEE was found, equal to 24 per cent of the mean value of NEE. The sensitivity analysis showed that the overall output uncertainty of the FORUG model could be determined by accounting for only a few key parameters, which were identified as corresponding to critical parameters in the literature. It was concluded that the 10 most important parameters determined more than 90 per cent of the output uncertainty. High-ranking parameters included soil respiration, photosynthesis, and crown architecture. It was concluded that the Monte Carlo technique is a useful tool for ranking the uncertainty of parameters of process-based forest flux models. 48 refs., 2 tabs., 2 figs.

  19. Status report on activities of ASTM E10.05.01 Task Group on Uncertainty Analysis

    Kam, F.B.K.; Stallman, F.W.

    1979-01-01

    Uncertainties in the field of reactor dosimetry are analyzed. A survey of uncertainty analysis as it is practiced at leading laboratories involved in reactor dosimetry is described. A questionnaire was prepared and mailed to about 45 installations and researchers. Nine replies were received; several of them were prepared by more than one author. Three of the nine came from installations outside the US. Results and the questionnaire are presented.

  20. Crashworthiness uncertainty analysis of typical civil aircraft based on Box–Behnken method

    Ren Yiru; Xiang Jinwu

    2014-01-01

    Crashworthiness is an important design factor of civil aircraft, related to the safety of occupants during impact accidents. It is a highly nonlinear transient dynamic problem and may be greatly influenced by uncertainty factors. A crashworthiness uncertainty analysis is conducted to investigate the effects of initial conditions, structural dimensions and material properties. A simplified finite element model is built based on the geometrical model and basic physical phenomena. Box–Behnken...
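
    A Box–Behnken design for k factors runs a two-level factorial on each pair of factors while holding the rest at the center, plus center points. A minimal sketch generating the coded design matrix that would drive the crash simulations (the three-factor choice and center-point count are assumptions):

      import itertools
      import numpy as np

      def box_behnken(k, n_center=3):
          """Coded-level (-1, 0, +1) Box-Behnken design for k factors."""
          runs = []
          # 2^2 factorial on each pair of factors, the others held at 0.
          for i, j in itertools.combinations(range(k), 2):
              for a, b in itertools.product((-1.0, 1.0), repeat=2):
                  row = np.zeros(k)
                  row[i], row[j] = a, b
                  runs.append(row)
          runs.extend(np.zeros(k) for _ in range(n_center))  # center points
          return np.array(runs)

      design = box_behnken(3)
      print(design.shape)   # (15, 3): 12 edge runs + 3 center points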

  1. Analysis of uncertainties in CRAC2 calculations: the inhalation pathway

    Killough, G.G.; Dunning, D.E. Jr.

    1984-01-01

    CRAC2 is a computer code for estimating the health effects and economic costs that might result from a release of radioactivity from a nuclear reactor to the environment. This paper describes tests of sensitivity of the predicted health effects to uncertainties in parameters associated with inhalation of the released radionuclides. These parameters are the particle size of the carrier aerosol and, for each element in the release, the clearance parameters for the lung model on which the code's dose conversion factors for inhalation are based. CRAC2 uses hourly meteorological data and a straight-line Gaussian plume model to predict the transport of airborne radioactivity; it includes models for plume depletion and population evacuation, and data for the distributions of population and land use. The code can compute results for single weather sequences, or it can perform random sampling of weather sequences from the meteorological data file and compute results for each weather sequence in the sample. For the work described in this paper, we concentrated on three fixed weather sequences that represent a range of conditions. For each fixed weather sequence, we applied random sampling to joint distributions of the inhalation parameters in order to estimate the sensitivity of the predicted health effects. All sampling runs produced coefficients of variation that were less than 50%, but some differences of means between weather sequences were substantial, as were some differences between means and the corresponding CRAC2 results without random sampling. Early injuries showed differences of as much as 1 to 2 orders of magnitude, while the differences in early fatalities were less than a factor of 2. Latent cancer fatalities varied by less than 10%. 19 references, 6 figures, 3 tables

  2. Large-Scale Transport Model Uncertainty and Sensitivity Analysis: Distributed Sources in Complex Hydrogeologic Systems

    Sig Drellack, Lance Prothro

    2007-01-01

    The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial setting of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently-derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. Advancing multiple alternative flow models, sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations. The

  3. Application of uncertainty and sensitivity analysis to the air quality SHERPA modelling tool

    Pisoni, E.; Albrecht, D.; Mara, T. A.; Rosati, R.; Tarantola, S.; Thunis, P.

    2018-06-01

    Air quality has significantly improved in Europe over the past few decades. Nonetheless we still find high concentrations in measurements mainly in specific regions or cities. This dimensional shift, from EU-wide to hot-spot exceedances, calls for a novel approach to regional air quality management (to complement EU-wide existing policies). The SHERPA (Screening for High Emission Reduction Potentials on Air quality) modelling tool was developed in this context. It provides an additional tool to be used in support to regional/local decision makers responsible for the design of air quality plans. It is therefore important to evaluate the quality of the SHERPA model, and its behavior in the face of various kinds of uncertainty. Uncertainty and sensitivity analysis techniques can be used for this purpose. They both reveal the links between assumptions and forecasts, help in model simplification and may highlight unexpected relationships between inputs and outputs. Thus, a policy steered SHERPA module - predicting air quality improvement linked to emission reduction scenarios - was evaluated by means of (1) uncertainty analysis (UA) to quantify uncertainty in the model output, and (2) by sensitivity analysis (SA) to identify the most influential input sources of this uncertainty. The results of this study provide relevant information about the key variables driving the SHERPA output uncertainty, and advise policy-makers and modellers where to place their efforts for an improved decision-making process.

  4. Uncertainty analysis technique of dynamic response and cumulative damage properties of piping system

    Suzuki, Kohei; Aoki, Shigeru; Hara, Fumio; Hanaoka, Masaaki; Yamashita, Tadashi.

    1982-01-01

    It is a technologically important subject to establish a method of uncertainty analysis that statistically examines the variation of the earthquake response and damage properties of equipment and piping systems due to changes in the input load and the parameters of the structural system, for evaluating the aseismic capability and dynamic structural reliability of these systems. The uncertainty in the response and damage properties when equipment and piping systems are subjected to excessive vibration load is mainly dependent on the irregularity of the acting input load, such as the unsteady vibration of earthquakes, and on structural uncertainty in forms and dimensions. This study is a basic step toward establishing a method for evaluating the uncertainty in the cumulative damage property at resonant vibration of a piping system due to the dispersion of structural parameters, using a simple model. First, piping models of simple form were broken by resonant vibration, and the uncertainty in the cumulative damage property was evaluated. Next, a response analysis using an elasto-plastic mechanics model was performed by numerical simulation. Finally, a method of uncertainty analysis for response and damage properties by the perturbation method utilizing equivalent linearization was proposed, and its validity was demonstrated. (Kako, I.)

  5. Integrated uncertainty analysis using RELAP/SCDAPSIM/MOD4.0

    Perez, M.; Reventos, F.; Wagner, R.; Allison, C.

    2009-01-01

    The RELAP/SCDAPSIM/MOD4.0 code, designed to predict the behavior of reactor systems during normal and accident conditions, is being developed as part of an international nuclear technology Software Development and Training Program (SDTP). RELAP/SCDAPSIM/MOD4.0, which is the first version of RELAP5 completely rewritten to FORTRAN 90/95/2000 standards, uses the publicly available RELAP5 and SCDAP models in combination with (a) advanced programming and numerical techniques, (b) advanced SDTP-member-developed models for LWR, HWR, and research reactor analysis, and (c) a variety of other member-developed computational packages. One such computational package is an integrated uncertainty analysis package being developed jointly by the Technical University of Catalunya (UPC) and Innovative Systems Software (ISS). The integrated uncertainty analysis approach used in the package involves the following steps: 1. Selection of the plant; 2. Selection of the scenario; 3. Selection of the safety criteria; 4. Identification and ranking of the relevant phenomena based on the safety criteria; 5. Selection of the appropriate code parameters to represent those phenomena; 6. Association of uncertainty by means of Probability Distribution Functions (PDFs) for each selected parameter; 7. Random sampling of the selected parameters according to their PDFs and performing multiple computer runs to obtain uncertainty bands with a certain percentile and confidence level; 8. Processing the results of the multiple computer runs to estimate the uncertainty bands for the computed quantities associated with the selected safety criteria. The first four steps are performed by the user prior to the RELAP/SCDAPSIM/MOD4.0 analysis. The remaining steps are included with the MOD4.0 integrated uncertainty analysis (IUA) package. This paper briefly describes the integrated uncertainty analysis package including (a) the features of the package, (b) the implementation of the package into RELAP/SCDAPSIM/MOD4.0, and
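
    Step 7 above (random sampling with a prescribed percentile and confidence level) is commonly sized with Wilks' formula; a minimal sketch computing the minimum number of code runs for a one-sided tolerance limit (59 runs for 95%/95% at first order, a standard result):

      import math

      def wilks_runs(gamma=0.95, beta=0.95, order=1):
          """Minimum n such that the order-th largest of n outputs bounds
          the gamma quantile with confidence beta (one-sided Wilks)."""
          n = order
          while True:
              # Probability that fewer than `order` samples exceed the
              # gamma quantile; require it to be at most 1 - beta.
              tail = sum(math.comb(n, k) * (1 - gamma)**k * gamma**(n - k)
                         for k in range(order))
              if tail <= 1.0 - beta:
                  return n
              n += 1

      print(wilks_runs())         # 59 runs for 95/95, first order
      print(wilks_runs(order=2))  # 93 runs, second order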

  6. Incorporating uncertainty analysis into life cycle estimates of greenhouse gas emissions from biomass production

    Johnson, David R.; Willis, Henry H.; Curtright, Aimee E.; Samaras, Constantine; Skone, Timothy

    2011-01-01

    Before further investments are made in utilizing biomass as a source of renewable energy, both policy makers and the energy industry need estimates of the net greenhouse gas (GHG) reductions expected from substituting biobased fuels for fossil fuels. Such GHG reductions depend greatly on how the biomass is cultivated, transported, processed, and converted into fuel or electricity. Any policy aiming to reduce GHGs with biomass-based energy must account for uncertainties in emissions at each stage of production, or else it risks yielding marginal reductions, if any, while potentially imposing great costs. This paper provides a framework for incorporating uncertainty analysis specifically into estimates of the life cycle GHG emissions from the production of biomass. We outline the sources of uncertainty, discuss the implications of uncertainty and variability on the limits of life cycle assessment (LCA) models, and provide a guide for practitioners to best practices in modeling these uncertainties. The suite of techniques described herein can be used to improve the understanding and the representation of the uncertainties associated with emissions estimates, thus enabling improved decision making with respect to the use of biomass for energy and fuel production. -- Highlights: → We describe key model, scenario and data uncertainties in LCAs of biobased fuels. → System boundaries and allocation choices should be consistent with study goals. → Scenarios should be designed around policy levers that can be controlled. → We describe a new way to analyze the importance of covariance between inputs.

  7. Uncertainty analysis of suppression pool heating during an ATWS in a BWR-5 plant

    Wulff, W.; Cheng, H.S.; Mallen, A.N.; Johnsen, G.W.; Lellouche, G.S.

    1994-03-01

    The uncertainty of predicting the peak temperature in the suppression pool of a BWR power plant undergoing an NRC-postulated Anticipated Transient Without Scram (ATWS) has been estimated. The ATWS is initiated by recirculation-pump trips, and then leads to power and flow oscillations as they had occurred at the LaSalle-2 Power Station in March of 1988. After limit-cycle oscillations have been established, the turbines are tripped, but without MSIV closure, allowing steam discharge through the turbine bypass into the condenser. Postulated operator actions, namely to lower the reactor vessel pressure and the level elevation in the downcomer, are simulated by a robot model which accounts for operator uncertainty. All balance-of-plant and control systems modeling uncertainties were part of the statistical uncertainty analysis that was patterned after the Code Scaling, Applicability and Uncertainty (CSAU) evaluation methodology. The analysis showed that the predicted suppression-pool peak temperature of 329.3 K (133 degrees F) has a 95-percentile uncertainty of 14.4 K (26 degrees F), and that the size of this uncertainty bracket is dominated by the experimental uncertainty of measuring Safety and Relief Valve mass flow rates under critical-flow conditions. The analysis showed also that the probability of exceeding the suppression-pool temperature limit of 352.6 K (175 degrees F) is most likely zero (it is estimated as < 5×10⁻⁴). The square root of the sum of the squares of all the computed peak pool temperatures is 350.7 K (171.6 degrees F).

  8. Effect of activation cross section uncertainties in transmutation analysis of realistic low-activation steels for IFMIF

    Cabellos, O.; García-Herranz, N.; Sanz, J. [Institute of Nuclear Fusion, UPM, Madrid (Spain); Cabellos, O.; García-Herranz, N.; Fernandez, P.; Fernandez, B. [Dept. of Nuclear Engineering, UPM, Madrid (Spain); Sanz, J. [Dept. of Power Engineering, UNED, Madrid (Spain); Reyes, S. [Safety, Environment and Health Group, ITER Joint Work Site, Cadarache Center (France)

    2008-07-01

    We address uncertainty analysis to draw conclusions on the reliability of the activation calculation in the International Fusion Materials Irradiation Facility (IFMIF) under the potential impact of activation cross section uncertainties. The Monte Carlo methodology implemented in the ACAB code gives the uncertainty estimates due to the synergetic/global effect of the complete set of cross section uncertainties. An element-by-element analysis has been demonstrated to be a helpful tool to easily analyse the transmutation performance of irradiated materials. The uncertainty analysis results showed that for times over about 24 h the relative error in the contact dose rate can be as large as 23%. We have calculated the effect of cross section uncertainties on the IFMIF activation of all the different elements. For EUROFER, the uncertainties for H and He are 7.3% and 5.6%, respectively. We have found significant uncertainties in the transmutation response for C, P and Nb.
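
    A minimal sketch of the Monte Carlo idea behind such global analyses: all cross sections are perturbed simultaneously from assumed lognormal distributions and the spread of the computed response is examined. The `contact_dose_rate` function and all numbers below are hypothetical stand-ins for a real activation calculation, not the ACAB model.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical one-group activation cross sections (barns) and their
    # relative standard deviations -- stand-ins for a real uncertainty file.
    sigma0 = np.array([0.8, 1.9, 0.05])        # reactions A, B, C
    rel_sd = np.array([0.10, 0.25, 0.40])

    def contact_dose_rate(sigma):
        """Toy response: dose rate proportional to a weighted sum of
        activation rates. A real code would solve the full activation
        and decay chains here."""
        weights = np.array([1.0, 0.3, 5.0])    # hypothetical dose conversion
        return float(weights @ sigma)

    # Sample the complete cross-section set simultaneously (global effect),
    # assuming lognormal distributions so sampled values stay positive.
    n = 50_000
    draws = sigma0 * rng.lognormal(mean=0.0, sigma=rel_sd, size=(n, 3))
    doses = np.array([contact_dose_rate(d) for d in draws])

    rel_err = doses.std() / doses.mean()
    print(f"relative error in contact dose rate: {rel_err:.1%}")
    ```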

  9. Thermal-Hydraulic Analysis for SBLOCA in OPR1000 and Evaluation of Uncertainty for PSA

    Kim, Tae Jin; Park, Goon Cherl

    2012-01-01

    Probabilistic Safety Assessment (PSA) is a mathematical tool used to evaluate numerical estimates of risk for nuclear power plants (NPPs). However, PSA has quality and reliability problems because the quantification of uncertainties from thermal-hydraulic (TH) analysis has not been included in the quantification of the overall uncertainties in PSA. Earlier research showed that quantifying the uncertainties from best-estimate LBLOCA analysis can improve PSA quality by modifying the core damage frequency (CDF) from the existing PSA report. Based on the same concept, this study considers the quantification of SBLOCA analysis results. In this study, however, operator error parameters are also included in addition to the phenomenological parameters considered in the LBLOCA analysis.

  10. Effect of Uncertainty Parameters in Blowdown and Reflood Models for OPR1000 LBLOCA Analysis

    Huh, Byung Gil; Jin, Chang Yong; Seul, Kwangwon; Hwang, Taesuk [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2014-05-15

    KINS (Korea Institute of Nuclear Safety) has also performed the audit calculation with the KINS Realistic Evaluation Methodology (KINS-REM) to confirm the validity of the licensee's calculation. In the BEPU method, it is very important to quantify the code and model uncertainty, as stated in the requirement on BE calculations in Regulatory Guide 1.157: 'the code and models used are acceptable and applicable to the specific facility over the intended operating range and must quantify the uncertainty in the specific application'. In general, the uncertainty of a model/code should be obtained through comparison with data from relevant integral- and separate-effect tests at different scales. However, it is not easy to determine these kinds of uncertainty because of the difficulty of evaluating various experiments accurately. Therefore, expert judgment has been used in many cases, even with the limitation that the uncertainty range of important parameters can be wide and inaccurate. In the KINS-REM, six heat transfer parameters in the blowdown phase have been used to consider the uncertainty of models. Recently, the MARS-KS code was modified to consider the uncertainty of five heat transfer parameters in the reflood phase. Accordingly, the uncertainty ranges for the parameters of the reflood models must be determined and the effect of these ranges evaluated. In this study, a large break LOCA (LBLOCA) analysis for OPR1000 was performed to identify the effect of uncertainty parameters in the blowdown and reflood models.

  11. Uncertainty analysis of a one-dimensional constitutive model for shape memory alloy thermomechanical description

    Oliveira, Sergio A.; Savi, Marcelo A.; Santos, Ilmar F.

    2014-01-01

    The use of shape memory alloys (SMAs) in engineering applications has increased the interest in the accuracy analysis of their thermomechanical description. This work presents an uncertainty analysis related to experimental tensile tests conducted with shape memory alloy wires. Experimental data are compared with numerical simulations obtained from a constitutive model with internal constraints employed to describe the thermomechanical behavior of SMAs. The idea is to evaluate whether the numerical simulations are within the uncertainty range of the experimental data. Parametric analysis is also developed...

  12. Uncertainties in Cancer Risk Coefficients for Environmental Exposure to Radionuclides. An Uncertainty Analysis for Risk Coefficients Reported in Federal Guidance Report No. 13

    Pawel, David [U.S. Environmental Protection Agency; Leggett, Richard Wayne [ORNL; Eckerman, Keith F [ORNL; Nelson, Christopher [U.S. Environmental Protection Agency

    2007-01-01

    Federal Guidance Report No. 13 (FGR 13) provides risk coefficients for estimation of the risk of cancer due to low-level exposure to each of more than 800 radionuclides. Uncertainties in risk coefficients were quantified in FGR 13 for 33 cases (exposure to each of 11 radionuclides by each of three exposure pathways) on the basis of sensitivity analyses in which various combinations of plausible biokinetic, dosimetric, and radiation risk models were used to generate alternative risk coefficients. The present report updates the uncertainty analysis in FGR 13 for the cases of inhalation and ingestion of radionuclides and expands the analysis to all radionuclides addressed in that report. The analysis indicates that most risk coefficients for inhalation or ingestion of radionuclides are determined within a factor of 5 or less by current information. That is, application of alternate plausible biokinetic and dosimetric models and radiation risk models (based on the linear, no-threshold hypothesis with an adjustment for the dose and dose rate effectiveness factor) is unlikely to change these coefficients by more than a factor of 5. In this analysis the assessed uncertainty in the radiation risk model was found to be the main determinant of the uncertainty category for most risk coefficients, but conclusions concerning the relative contributions of risk and dose models to the total uncertainty in a risk coefficient may depend strongly on the method of assessing uncertainties in the risk model.

  13. Uncertainty analysis for the BEACON-COLSS core monitoring system application

    Morita, T.; Boyd, W.A.; Seong, K.B.

    2005-01-01

    This paper covers the measurement uncertainty analysis of the BEACON-COLSS core monitoring system. The uncertainty evaluation is made by using a BEACON-COLSS simulation program. By simulating BEACON on-line operation for analytically generated reactor conditions, the accuracy of the 'measured' results can be evaluated by comparison to the analytically generated 'truth'. The DNB power margin is evaluated based on Combustion Engineering's Modified Statistical Combination of Uncertainties (MSCU), using the CETOPD code for the DNBR calculation. A BEACON-COLSS simulation program for the uncertainty evaluation function has been established for plant applications. Qualification work has been completed for two Combustion Engineering plants. Results for the BEACON-COLSS measured peaking factors and DNBR power margin are plant-type dependent and are applicable to reload cores as long as the core geometry and detector layout are unchanged. (authors)

  14. Coupled code analysis of uncertainty and sensitivity of Kalinin-3 benchmark

    Pasichnyk, Ihor; Zwermann, Winfried; Velkov, Kiril [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) gGmbH, Garching (Germany); Nikonov, Sergey [VNIIAES, Moscow (Russian Federation)

    2016-09-15

    An uncertainty and sensitivity analysis is performed for the OECD/NEA coolant transient benchmark (K-3) on measured data at the Kalinin-3 Nuclear Power Plant (NPP). A switch-off of one main coolant pump (MCP) at nominal reactor power is calculated using the coupled thermal-hydraulic and neutron-kinetics code ATHLET-PARCS. The objectives are to study the uncertainty of the total reactor power and to identify the main sources of reactor power uncertainty. The GRS uncertainty and sensitivity software package XSUSA is applied to propagate uncertainties in nuclear data libraries to the full-core coupled transient calculations. A set of the most important thermal-hydraulic parameters of the primary circuit is identified, and a total of 23 thermal-hydraulic parameters are statistically varied using the GRS code SUSA. The ATHLET model also contains a balance-of-plant (BOP) model which is simulated using the ATHLET GCSM module; in particular, the operation of the main steam generator regulators is modelled in detail. A set of 200 varied coupled ATHLET-PARCS calculations is analyzed. The results obtained show a clustering effect in the behavior of global reactor parameters. It is found that the GCSM system together with the varied input parameters strongly influences the overall nuclear power plant behavior and can even lead to a new scenario. Possible reasons for the clustering effect are discussed in the paper. This work is a step forward in establishing a "best-estimate calculations in combination with performing uncertainty analysis" methodology for coupled full-core calculations.

  15. Propagation of uncertainty and sensitivity analysis in an integral oil-gas plume model

    Wang, Shitao

    2016-05-27

    Polynomial Chaos expansions are used to analyze uncertainties in an integral oil-gas plume model simulating the Deepwater Horizon oil spill. The study focuses on six uncertain input parameters—two entrainment parameters, the gas to oil ratio, two parameters associated with the droplet-size distribution, and the flow rate—that impact the model's estimates of the plume's trap and peel heights, and of its various gas fluxes. The ranges of the uncertain inputs were determined by experimental data. Ensemble calculations were performed to construct polynomial chaos-based surrogates that describe the variations in the outputs due to variations in the uncertain inputs. The surrogates were then used to estimate reliably the statistics of the model outputs, and to perform an analysis of variance. Two experiments were performed to study the impacts of high and low flow rate uncertainties. The analysis shows that in the former case the flow rate is the largest contributor to output uncertainties, whereas in the latter case, with the uncertainty range constrained by a posteriori analyses, the flow rate's contribution becomes negligible. The trap and peel heights uncertainties are then mainly due to uncertainties in the 95th percentile of the droplet size and in the entrainment parameters.
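
    The non-intrusive polynomial chaos workflow described above can be sketched in a few lines for a single standard-normal input: run an ensemble, fit a Hermite expansion by least squares, and read the output statistics off the coefficients. The `model` function below is a hypothetical stand-in for the plume code, not the actual model.

    ```python
    import math
    import numpy as np
    from numpy.polynomial import hermite_e as He

    rng = np.random.default_rng(1)

    def model(q):
        # Hypothetical stand-in for the integral plume model: maps a
        # standard-normal germ (say, a normalized flow rate) to a trap height.
        return 300.0 + 40.0 * q + 8.0 * (q**2 - 1.0)

    # Ensemble calculation: evaluate the "code" on sampled inputs.
    q = rng.standard_normal(200)
    y = model(q)

    # Fit a degree-3 expansion y ~ sum_k c_k He_k(q) by least squares.
    deg = 3
    A = np.stack([He.hermeval(q, np.eye(deg + 1)[k]) for k in range(deg + 1)], axis=1)
    c, *_ = np.linalg.lstsq(A, y, rcond=None)

    # For probabilists' Hermite polynomials E[He_k(Z)^2] = k!, so the mean
    # and variance follow analytically from the fitted coefficients.
    mean = c[0]
    var = sum(c[k] ** 2 * math.factorial(k) for k in range(1, deg + 1))
    print(f"surrogate mean = {mean:.1f}, sd = {var**0.5:.1f}")  # ~300 and ~41.6
    ```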

  16. Two-dimensional cross-section and SED uncertainty analysis for the Fusion Engineering Device (FED)

    Embrechts, M.J.; Urban, W.T.; Dudziak, D.J.

    1982-01-01

    The theory of two-dimensional cross-section and secondary-energy-distribution (SED) sensitivity was implemented by developing a two-dimensional sensitivity and uncertainty analysis code, SENSIT-2D. Analyses of the Fusion Engineering Device (FED) conceptual inboard shield indicate that, although the calculated uncertainties in the 2-D model are of the same order of magnitude as those resulting from the 1-D model, there might be severe differences. The more complex the geometry, the more compulsory a 2-D analysis becomes. Specific results show that the uncertainty for the integral heating of the toroidal field (TF) coil for the FED is 114.6%. The main contributors to the cross-section uncertainty are chromium and iron. Contributions to the total uncertainty were smaller for nickel, copper, hydrogen and carbon. All analyses were performed with the Los Alamos 42-group cross-section library generated from ENDF/B-V data, and the COVFILS covariance matrix library. The large uncertainties due to chromium result mainly from large covariances for the chromium total and elastic scattering cross sections.

  17. Accounting for uncertainty in ecological analysis: the strengths and limitations of hierarchical statistical modeling.

    Cressie, Noel; Calder, Catherine A; Clark, James S; Ver Hoef, Jay M; Wikle, Christopher K

    2009-04-01

    Analyses of ecological data should account for the uncertainty in the process(es) that generated the data. However, accounting for these uncertainties is a difficult task, since ecology is known for its complexity. Measurement and/or process errors are often the only sources of uncertainty modeled when addressing complex ecological problems, yet analyses should also account for uncertainty in sampling design, in model specification, in parameters governing the specified model, and in initial and boundary conditions. Only then can we be confident in the scientific inferences and forecasts made from an analysis. Probability and statistics provide a framework that accounts for multiple sources of uncertainty. Given the complexities of ecological studies, the hierarchical statistical model is an invaluable tool. This approach is not new in ecology, and there are many examples (both Bayesian and non-Bayesian) in the literature illustrating the benefits of this approach. In this article, we provide a baseline for concepts, notation, and methods, from which discussion on hierarchical statistical modeling in ecology can proceed. We have also planted some seeds for discussion and tried to show where the practical difficulties lie. Our thesis is that hierarchical statistical modeling is a powerful way of approaching ecological analysis in the presence of inevitable but quantifiable uncertainties, even if practical issues sometimes require pragmatic compromises.

  18. Validation of methodology and uncertainty assessment of antimony determination in environmental materials using Neutron Activation Analysis

    Matsubara, Tassiane C.M.; Saiki, Mitiko; Zahn, Guilherme S.; Moreira, Edson G.

    2013-01-01

    Antimony is an element found in low concentrations in the environment. However, its determination has attracted great interest because of the knowledge of its toxicity and its increasing application. Neutron activation analysis (NAA) is a suitable method for the determination of several elements in different types of matrices, but in the case of Sb the analysis presents some difficulties due to spectral interferences. The objective of this research was to validate the NAA method and assess the uncertainty for Sb determination in environmental samples. The experimental procedure consisted of irradiating twelve certified reference samples of different kinds of matrices. The samples were irradiated in the nuclear research reactor IEA-R1 of IPEN/CNEN-SP, followed by measurement of the induced radioactivity using a hyperpure germanium detector coupled to a gamma-ray spectrometer. The radioisotopes 122Sb and 124Sb were measured and the Sb concentrations with their respective uncertainties were obtained by the comparative method. Relative errors and Z-score values were calculated to evaluate the accuracy of the results for Sb determination in certified reference materials. The evaluation of the components that contribute to the measurement uncertainty of the Sb concentration showed that the major uncertainty contribution is due to counting statistics. The results also indicated that the value of the combined standard uncertainty depends on the radioisotope measured and the decay time used for counting. (author)

  19. Propagation of uncertainty and sensitivity analysis in an integral oil-gas plume model

    Wang, Shitao; Iskandarani, Mohamed; Srinivasan, Ashwanth; Thacker, W. Carlisle; Winokur, Justin; Knio, Omar

    2016-01-01

    Polynomial Chaos expansions are used to analyze uncertainties in an integral oil-gas plume model simulating the Deepwater Horizon oil spill. The study focuses on six uncertain input parameters—two entrainment parameters, the gas to oil ratio, two parameters associated with the droplet-size distribution, and the flow rate—that impact the model's estimates of the plume's trap and peel heights, and of its various gas fluxes. The ranges of the uncertain inputs were determined by experimental data. Ensemble calculations were performed to construct polynomial chaos-based surrogates that describe the variations in the outputs due to variations in the uncertain inputs. The surrogates were then used to estimate reliably the statistics of the model outputs, and to perform an analysis of variance. Two experiments were performed to study the impacts of high and low flow rate uncertainties. The analysis shows that in the former case the flow rate is the largest contributor to output uncertainties, whereas in the latter case, with the uncertainty range constrained by a posteriori analyses, the flow rate's contribution becomes negligible. The trap and peel heights uncertainties are then mainly due to uncertainties in the 95th percentile of the droplet size and in the entrainment parameters.

  20. Uncertainty analysis of constant amplitude fatigue test data employing the six-parameter random fatigue limit model

    Leonetti Davide

    2018-01-01

    Estimating and reducing uncertainty in fatigue test data analysis is a relevant task in order to assess the reliability of a structural connection with respect to fatigue. Several statistical models have been proposed in the literature with the aim of representing the stress range vs. endurance trend of fatigue test data under constant amplitude loading and the scatter in the finite and infinite life regions. In order to estimate the safety level of the connection, the uncertainty related to the amount of information available also needs to be estimated using the methods provided by statistical theory. Bayesian analysis is employed to reduce the uncertainty due to the often small amount of test data by introducing prior information related to the parameters of the statistical model. In this work, the inference of fatigue test data belonging to cover-plated steel beams is presented. The uncertainty is estimated by making use of Bayesian and frequentist methods. The 5% quantile of the fatigue life is estimated by taking into account the uncertainty related to the sample size, both for a dataset containing few samples and for one containing more data. The S-N curves resulting from the application of the employed methods are compared and the effect of the reduction of uncertainty in the infinite life region is quantified.
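
    A minimal sketch of how a Bayesian treatment captures the sample-size uncertainty in the 5% fatigue-life quantile, assuming log-normal lives at one stress range and a standard noninformative prior; the data values are invented for illustration and are not the cover-plated beam dataset.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)

    # Hypothetical log10 fatigue lives at one stress range (small dataset).
    logN = np.array([5.61, 5.72, 5.55, 5.80, 5.67, 5.49])
    n, xbar, s2 = len(logN), logN.mean(), logN.var(ddof=1)

    # Noninformative prior p(mu, sigma^2) ~ 1/sigma^2 gives the conjugate
    # posterior: sigma^2 | data ~ (n-1) s^2 / chi2(n-1),
    #            mu | sigma^2 ~ N(xbar, sigma^2 / n).
    m = 20_000
    sigma2 = (n - 1) * s2 / rng.chisquare(df=n - 1, size=m)
    mu = rng.normal(xbar, np.sqrt(sigma2 / n))

    # Posterior of the 5% life quantile (log10 cycles) vs. the frequentist
    # point estimate, which ignores the parameter (sample-size) uncertainty.
    z05 = stats.norm.ppf(0.05)
    q05 = mu + z05 * np.sqrt(sigma2)
    q05_point = xbar + z05 * np.sqrt(s2)

    print(f"point estimate of q05: {q05_point:.3f}")
    print(f"posterior q05: median {np.median(q05):.3f}, "
          f"95% CI [{np.quantile(q05, 0.025):.3f}, {np.quantile(q05, 0.975):.3f}]")
    ```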

  1. Operational modal analysis modeling, Bayesian inference, uncertainty laws

    Au, Siu-Kui

    2017-01-01

    This book presents operational modal analysis (OMA), employing a coherent and comprehensive Bayesian framework for modal identification and covering stochastic modeling, theoretical formulations, computational algorithms, and practical applications. Mathematical similarities and philosophical differences between Bayesian and classical statistical approaches to system identification are discussed, allowing their mathematical tools to be shared and their results correctly interpreted. Many chapters can be used as lecture notes for the general topic they cover beyond the OMA context. After an introductory chapter (1), Chapters 2–7 present the general theory of stochastic modeling and analysis of ambient vibrations. Readers are first introduced to the spectral analysis of deterministic time series (2) and structural dynamics (3), which do not require the use of probability concepts. The concepts and techniques in these chapters are subsequently extended to a probabilistic context in Chapter 4 (on stochastic pro...

  2. Uncertainty and sensitivity analysis: Mathematical model of coupled heat and mass transfer for a contact baking process

    Feyissa, Aberham Hailu; Gernaey, Krist; Adler-Nissen, Jens

    2012-01-01

    Similar to other processes, the modelling of heat and mass transfer during food processing involves uncertainty in the values of input parameters (heat and mass transfer coefficients, evaporation rate parameters, thermo-physical properties, initial and boundary conditions), which leads to uncertainty in the model predictions. The aim of the current paper is to address this uncertainty challenge in the modelling of food production processes using a combination of uncertainty and sensitivity analysis, where the uncertainty analysis and global sensitivity analysis were applied to a heat and mass...

  3. Ignoring correlation in uncertainty and sensitivity analysis in life cycle assessment: what is the risk?

    Groen, E.A., E-mail: Evelyne.Groen@gmail.com [Wageningen University, P.O. Box 338, Wageningen 6700 AH (Netherlands); Heijungs, R. [Vrije Universiteit Amsterdam, De Boelelaan 1105, Amsterdam 1081 HV (Netherlands); Leiden University, Einsteinweg 2, Leiden 2333 CC (Netherlands)

    2017-01-15

    Life cycle assessment (LCA) is an established tool to quantify the environmental impact of a product. A good assessment of uncertainty is important for making well-informed decisions in comparative LCA, as well as for correctly prioritising data collection efforts. Under- or overestimation of output uncertainty (e.g. output variance) will lead to incorrect decisions in such matters. The presence of correlations between input parameters during uncertainty propagation can increase or decrease the output variance. However, most LCA studies that include uncertainty analysis ignore correlations between input parameters during uncertainty propagation, which may lead to incorrect conclusions. Two approaches to include correlations between input parameters during uncertainty propagation and global sensitivity analysis were studied: an analytical approach and a sampling approach. The use of both approaches is illustrated for an artificial case study of electricity production. Results demonstrate that both approaches yield approximately the same output variance and sensitivity indices for this specific case study. Furthermore, we demonstrate that the analytical approach can be used to quantify the risk of ignoring correlations between input parameters during uncertainty propagation in LCA. We demonstrate that: (1) we can predict whether including correlations among input parameters in uncertainty propagation will increase or decrease the output variance; and (2) we can quantify the risk of ignoring correlations on the output variance and the global sensitivity indices. Moreover, this procedure requires only little data. - Highlights: • Ignoring correlation leads to under- or overestimation of the output variance. • We demonstrated that the risk of ignoring correlation can be quantified. • The procedure proposed is generally applicable in life cycle assessment. • In some cases, ignoring correlation has a minimal effect on decision-making tools.
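
    For a linearized model the analytical approach reduces to a quadratic form: the output variance is g^T Σ g, so comparing the full covariance Σ with its diagonal directly quantifies the risk of ignoring correlations. The sensitivities and covariance values below are hypothetical.

    ```python
    import numpy as np

    # Linearized LCA model y = g @ x, with g the vector of partial
    # derivatives (sensitivities) at the mean -- hypothetical values.
    g = np.array([0.8, -0.5, 1.2])

    # Hypothetical input covariance with one correlated pair (x0, x2).
    sd = np.array([0.2, 0.1, 0.3])
    corr = np.eye(3)
    corr[0, 2] = corr[2, 0] = 0.6
    cov = np.outer(sd, sd) * corr

    var_full = g @ cov @ g                     # correlations included
    var_diag = g @ np.diag(sd**2) @ g          # correlations ignored

    print(f"output variance with correlations: {var_full:.4f}")
    print(f"output variance ignoring them:     {var_diag:.4f}")
    # Here the positive correlation and same-signed g[0], g[2] mean that
    # ignoring it *underestimates* the variance; flipping the sign of the
    # correlation (or of one sensitivity) would make it an overestimate.
    ```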

  4. Comparative uncertainty analysis of copper loads in stormwater systems using GLUE and grey-box modeling

    Lindblom, Erik Ulfson; Madsen, Henrik; Mikkelsen, Peter Steen

    2007-01-01

    With the proposed model and input data, the GLUE analysis shows that the total sampled copper mass can be predicted within a range of ±50% of the median value (385 g), whereas the grey-box analysis showed a prediction uncertainty of less than ±30%. Future work will clarify the pros and cons of the two methods...

  5. Integrated Risk-Capability Analysis under Deep Uncertainty : An ESDMA Approach

    Pruyt, E.; Kwakkel, J.H.

    2012-01-01

    Integrated risk-capability analysis methodologies for dealing with increasing degrees of complexity and deep uncertainty are urgently needed in an ever more complex and uncertain world. Although scenario approaches, risk assessment methods, and capability analysis methods are used, few organizations

  6. Applying Fuzzy and Probabilistic Uncertainty Concepts to the Material Flow Analysis of Palladium in Austria

    Laner, David; Rechberger, Helmut; Astrup, Thomas Fruergaard

    2015-01-01

    Material flow analysis (MFA) is a widely applied tool to investigate resource and recycling systems of metals and minerals. Owing to data limitations and restricted system understanding, MFA results are inherently uncertain. To demonstrate the systematic implementation of uncertainty analysis in ...

  7. Uncertainty and Sensitivity Analysis of Afterbody Radiative Heating Predictions for Earth Entry

    West, Thomas K., IV; Johnston, Christopher O.; Hosder, Serhat

    2016-01-01

    The objective of this work was to perform sensitivity analysis and uncertainty quantification for afterbody radiative heating predictions of the Stardust capsule during Earth entry at peak afterbody radiation conditions. The radiation environment in the afterbody region poses significant challenges for accurate uncertainty quantification and sensitivity analysis due to the complexity of the flow physics, the computational cost, and the large number of uncertain variables. In this study, first a sparse-collocation non-intrusive polynomial chaos approach along with global non-linear sensitivity analysis was used to identify the most significant uncertain variables and reduce the dimensions of the stochastic problem. Then, a total-order stochastic expansion was constructed over only the important parameters for an efficient and accurate estimate of the uncertainty in radiation. Based on previous work, 388 uncertain parameters were considered in the radiation model, which came from the thermodynamics, flow field chemistry, and radiation modeling. The sensitivity analysis showed that only four of these variables contributed significantly to the afterbody radiation uncertainty, accounting for almost 95% of the uncertainty. These included the electronic-impact excitation rate for N between level 2 and level 5 and the rates of three chemical reactions influencing the N, N(+), O, and O(+) number densities in the flow field.

  8. An Empirical Analysis of Stakeholders' Influence on Policy Development: the Role of Uncertainty Handling

    Rianne M. Bijlsma

    2011-03-01

    Stakeholder participation is advocated widely, but there is little structured, empirical research into its influence on policy development. We aim to further the insight into the characteristics of participatory policy development by comparing it to expert-based policy development for the same case. We describe the process of problem framing and analysis, as well as the knowledge base used. We apply an uncertainty perspective to reveal differences between the approaches and speculate about possible explanations. We view policy development as a continuous handling of substantive uncertainty and process uncertainty, and investigate how the actors' methods of handling uncertainty influence the policy development. Our findings suggest that the wider frame that was adopted in the participatory approach was the result of a more active handling of process uncertainty. The stakeholders handled institutional uncertainty by broadening the problem frame, and they handled strategic uncertainty by negotiating commitment and by including all important stakeholder criteria in the frame. In the expert-based approach, we observed a more passive handling of uncertainty, apparently to avoid complexity. The experts handled institutional uncertainty by reducing the scope and by anticipating windows of opportunity in other policy arenas. Strategic uncertainty was handled by assuming stakeholders' acceptance of noncontroversial measures that balanced benefits and sacrifices. Three other observations are of interest to the scientific debate on participatory policy processes. Firstly, the participatory policy was less adaptive than the expert-based policy. The observed low tolerance for process uncertainty of participants made them opt for a rigorous "once and for all" settling of the conflict. Secondly, in the participatory approach, actors preferred procedures of traceable knowledge acquisition over controversial topics to handle substantive uncertainty. This...

  9. AN OVERVIEW OF THE UNCERTAINTY ANALYSIS, SENSITIVITY ANALYSIS, AND PARAMETER ESTIMATION (UA/SA/PE) API AND HOW TO IMPLEMENT IT

    The Application Programming Interface (API) for Uncertainty Analysis, Sensitivity Analysis, and Parameter Estimation (UA/SA/PE API) (also known as Calibration, Optimization and Sensitivity and Uncertainty (CUSO)) was developed in a joint effort between several members of both ...

  10. Uncertainty in soil-structure interaction analysis arising from differences in analytical techniques

    Maslenikov, O.R.; Chen, J.C.; Johnson, J.J.

    1982-07-01

    This study addresses uncertainties arising from variations in different modeling approaches to the soil-structure interaction of massive structures at a nuclear power plant. To perform a comprehensive systems analysis, it is necessary to quantify, for each phase of the traditional analysis procedure, both the realistic seismic responses and the uncertainties associated with them. In this study, two linear soil-structure interaction techniques were used to analyze the Zion, Illinois nuclear power plant: a direct method using the FLUSH computer program and a substructure approach using the CLASSI family of computer programs. In-structure responses from two earthquakes, one real and one synthetic, were compared. Structure configurations from relatively simple to complicated multi-structure cases were analyzed. The resulting variations help quantify the uncertainty in structure response due to analysis procedures.

  11. Design, Analysis and Test of Logic Circuits Under Uncertainty

    Krishnaswamy, Smita; Hayes, John P

    2013-01-01

    Integrated circuits (ICs) increasingly exhibit uncertain characteristics due to soft errors, inherently probabilistic devices, and manufacturing variability. As device technologies scale, these effects can be detrimental to the reliability of logic circuits.  To improve future semiconductor designs, this book describes methods for analyzing, designing, and testing circuits subject to probabilistic effects. The authors first develop techniques to model inherently probabilistic methods in logic circuits and to test circuits for determining their reliability after they are manufactured. Then, they study error-masking mechanisms intrinsic to digital circuits and show how to leverage them to design more reliable circuits.  The book describes techniques for:   • Modeling and reasoning about probabilistic behavior in logic circuits, including a matrix-based reliability-analysis framework;   • Accurate analysis of soft-error rate (SER) based on functional-simulation, sufficiently scalable for use in gate-l...

  12. Advanced uncertainty modelling for container port risk analysis.

    Alyami, Hani; Yang, Zaili; Riahi, Ramin; Bonsall, Stephen; Wang, Jin

    2016-08-13

    Globalization has led to a rapid increase of container movements in seaports. Risks in seaports need to be appropriately addressed to ensure economic wealth, operational efficiency, and personnel safety. As a result, the safety performance of a Container Terminal Operational System (CTOS) plays a growing role in improving the efficiency of international trade. This paper proposes a novel method to facilitate the application of Failure Mode and Effects Analysis (FMEA) in assessing the safety performance of CTOS. The new approach is developed through incorporating a Fuzzy Rule-Based Bayesian Network (FRBN) with Evidential Reasoning (ER) in a complementary manner. The former provides a realistic and flexible method to describe input failure information for risk estimates of individual hazardous events (HEs) at the bottom level of a risk analysis hierarchy. The latter is used to aggregate HEs safety estimates collectively, allowing dynamic risk-based decision support in CTOS from a systematic perspective. The novel feature of the proposed method, compared to traditional port risk analysis methods, lies in a dynamic model capable of dealing with continually changing operational conditions in ports. More importantly, a new sensitivity analysis method is developed and carried out to rank the HEs by taking into account their specific risk estimations (locally) and their Risk Influence (RI) on a port's safety system (globally). Due to its generality, the new approach can be tailored for a wide range of applications in different safety and reliability engineering and management systems, particularly when real-time risk ranking is required to measure, predict, and improve the associated system safety performance.

  13. Life cycle cost analysis of wind power considering stochastic uncertainties

    Li, Chiao-Ting; Peng, Huei; Sun, Jing

    2014-01-01

    This paper presents a long-term cost analysis of wind power and compares its competitiveness to non-renewable generating technologies. The analysis considers several important attributes related to wind intermittency that are sometimes ignored in traditional generation planning or LCOE (levelized cost of energy) studies, including the need for more nameplate capacity due to intermittency, hourly fluctuations in wind outputs and cost for reserves. The competitiveness of wind power is assessed by evaluating four scenarios: 1) adding natural gas generating capacity to the power grid; 2) adding coal generating capacity to the power grid; 3) adding wind capacity to the power grid; and, 4) adding wind capacity and energy storage to the power grid where an energy storage device is used to cover wind intermittency. A case study in the state of Michigan is presented to demonstrate the use of the proposed methodology, in which a time horizon from 2010 to 2040 is considered. The results show that wind energy will still be more expensive than natural gas power plants in the next three decades, but will be cheaper than coal capacities if wind intermittency is mitigated. Furthermore, if the costs of carbon emissions and environmental externalities are considered, wind generation will be a competitive option for grid capacity expansion. - Highlights: • The competitiveness of wind power is analyzed via life cycle cost analysis. • Wind intermittency and reserve costs are explicitly considered in the analysis. • Results show that wind is still more expensive than natural gas power plants. • Wind can be cheaper than coal capacities if wind intermittency is mitigated. • Wind will be competitive if costs of carbon emissions are considered

  14. Implementation of a Bayesian Engine for Uncertainty Analysis

    Leng Vang; Curtis Smith; Steven Prescott

    2014-08-01

    In probabilistic risk assessment, it is important to have an environment where analysts have access to shared and secured high-performance computing and a statistical analysis tool package. As part of the advanced small modular reactor probabilistic risk analysis framework implementation, we have identified the need for advanced Bayesian computations. However, in order to make this technology available to non-specialists, there is also a need for a simplified tool that allows users to author models and evaluate them within this framework. As a proof of concept, we have implemented an advanced open-source Bayesian inference tool, OpenBUGS, within the browser-based cloud risk analysis framework that is under development at the Idaho National Laboratory. This development, the "OpenBUGS Scripter", has been implemented as a client-side, visual, web-based integrated development environment for creating OpenBUGS language scripts. It depends on the shared server environment to execute the generated scripts and to transmit results back to the user. The visual models are in the form of linked diagrams, from which we automatically create the applicable OpenBUGS script that matches the diagram. These diagrams can be saved locally or stored in the server environment to be shared with other users.
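
    The scripter concept can be illustrated with a few lines of Python that turn a node list (a stand-in for the linked diagram) into an OpenBUGS model script and data block; this is a hypothetical sketch, not the actual INL implementation.

    ```python
    # Hypothetical sketch of turning a linked-diagram description into an
    # OpenBUGS script (this is not the actual INL "OpenBUGS Scripter" code).
    nodes = [
        ("lambda", "~",  "dgamma(0.5, 100)"),   # failure-rate prior
        ("mu",     "<-", "lambda * t"),          # expected number of failures
        ("x",      "~",  "dpois(mu)"),           # observed failure count
    ]
    data = {"x": 3, "t": 5000}

    def to_openbugs(nodes, data):
        """Emit the model script and data list to be run server-side."""
        body = "\n".join(f"   {name} {op} {expr}" for name, op, expr in nodes)
        model = "model {\n" + body + "\n}"
        datalist = "list(" + ", ".join(f"{k} = {v}" for k, v in data.items()) + ")"
        return model, datalist

    model, datalist = to_openbugs(nodes, data)
    print(model)     # OpenBUGS model script
    print(datalist)  # accompanying data block
    ```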

  15. Improving uncertainty evaluation of process models by using pedigree analysis. A case study on CO2 capture with monoethanolamine

    van der Spek, Mijndert; Ramirez, Andrea; Faaij, André

    2016-01-01

    This article aims to improve uncertainty evaluation of process models by combining a quantitative uncertainty evaluation method (data validation) with a qualitative uncertainty evaluation method (pedigree analysis). The approach is tested on a case study of monoethanolamine based postcombustion CO2

  16. Total sensitivity and uncertainty analysis for LWR pin-cells with improved UNICORN code

    Wan, Chenghui; Cao, Liangzhi; Wu, Hongchun; Shen, Wei

    2017-01-01

    Highlights: • A new model is established for the total sensitivity and uncertainty analysis. • The NR approximation applied in S&U analysis can be avoided by the new model. • Sensitivity and uncertainty analysis is performed for PWR pin-cells by the new model. • The effects of the NR approximation for the PWR pin-cells are quantified. - Abstract: In this paper, improvements to the multigroup cross-section perturbation model have been proposed and applied in the self-developed UNICORN code, which is capable of performing the total sensitivity and total uncertainty analysis for neutron-physics calculations by applying the direct numerical perturbation method and the statistical sampling method, respectively. The narrow resonance (NR) approximation was applied in the multigroup cross-section perturbation model implemented in UNICORN. As an improvement to the NR approximation to refine the multigroup cross-section perturbation model, an ultrafine-group cross-section perturbation model has been established, in which the actual perturbations are applied to the ultrafine-group cross-section library and the reconstructions of the resonance cross sections are performed by solving the neutron slowing-down equation. The total sensitivity and total uncertainty analysis were then applied to the LWR pin-cells, using both the multigroup and the ultrafine-group cross-section perturbation models. The numerical results show that the NR approximation overestimates the relative sensitivity coefficients and the corresponding uncertainty results for the LWR pin-cells, and that the effects of the NR approximation are significant for σ(n,γ) and σ(n,elas) of 238U. Therefore, the effects of the NR approximation applied in the total sensitivity and total uncertainty analysis for the neutron-physics calculations of LWRs should be taken into account.
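
    The direct numerical perturbation method named above amounts to perturbing one cross section at a time and forming a relative sensitivity coefficient; a sketch follows, with a toy response standing in for the actual transport/slowing-down solve.

    ```python
    import numpy as np

    def k_eff(sigma):
        """Toy stand-in for a neutron-physics calculation that returns a
        response (e.g. k-eff) as a function of a cross-section vector."""
        nu_f, gamma, elas = sigma
        return nu_f / (nu_f + gamma + 0.1 * elas)

    sigma0 = np.array([1.2, 0.4, 6.0])   # hypothetical group constants
    k0 = k_eff(sigma0)

    # Direct numerical perturbation: relative sensitivity coefficient
    #   S_i = (dk/k) / (dsigma_i/sigma_i)
    delta = 0.01                          # 1% perturbation
    for i, name in enumerate(["nu_fission", "capture", "elastic"]):
        sig = sigma0.copy()
        sig[i] *= 1.0 + delta
        S = (k_eff(sig) - k0) / k0 / delta
        print(f"S({name}) = {S:+.3f}")
    ```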

  17. Using finite mixture models in thermal-hydraulics system code uncertainty analysis

    Carlos, S., E-mail: scarlos@iqn.upv.es [Department d’Enginyeria Química i Nuclear, Universitat Politècnica de València, Camí de Vera s.n, 46022 València (Spain); Sánchez, A. [Department d’Estadística Aplicada i Qualitat, Universitat Politècnica de València, Camí de Vera s.n, 46022 València (Spain); Ginestar, D. [Department de Matemàtica Aplicada, Universitat Politècnica de València, Camí de Vera s.n, 46022 València (Spain); Martorell, S. [Department d’Enginyeria Química i Nuclear, Universitat Politècnica de València, Camí de Vera s.n, 46022 València (Spain)

    2013-09-15

    Highlights: • Best-estimate code simulations need uncertainty quantification. • The output variables can present multimodal probability distributions. • The analysis of multimodal distributions is performed using finite mixture models. • Two methods to reconstruct the output variable probability distribution are used. -- Abstract: Nuclear power plant safety analysis is mainly based on the use of best estimate (BE) codes that predict the plant behavior under normal or accidental conditions. As the BE codes introduce uncertainties due to uncertainty in input parameters and modeling, it is necessary to perform uncertainty assessment (UA), and eventually sensitivity analysis (SA), of the results obtained. These analyses are part of the appropriate treatment of uncertainties imposed by current regulation based on the adoption of the best estimate plus uncertainty (BEPU) approach. The most popular approach for uncertainty assessment, based on Wilks' method, obtains a tolerance/confidence interval, but it does not completely characterize the output variable behavior, which is required for an extended UA and SA. However, the development of standard UA and SA imposes a high computational cost due to the large number of simulations needed. In order to obtain more information about the output variable and, at the same time, to keep the computational cost as low as possible, there has been a recent shift toward developing metamodels (models of models), or surrogate models, that approximate or emulate complex computer codes. In this way, there exist different techniques to reconstruct the probability distribution using the information provided by a sample of values, such as finite mixture models. In this paper, the Expectation-Maximization and the k-means algorithms are used to obtain a finite mixture model that reconstructs the output variable probability distribution from data obtained with RELAP-5 simulations. Both methodologies have been applied to a separated...
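
    A minimal sketch of the mixture-model step, assuming scikit-learn is available: a synthetic bimodal sample stands in for the RELAP-5 output, and Expectation-Maximization fits a two-component Gaussian mixture whose density can then be evaluated anywhere.

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(3)

    # Stand-in for a bimodal code output (e.g. PCT over N best-estimate
    # runs): two regimes, such as quench vs. no early quench -- synthetic.
    pct = np.concatenate([rng.normal(900.0, 25.0, 60),
                          rng.normal(1050.0, 40.0, 33)])

    # Fit a two-component Gaussian mixture via Expectation-Maximization.
    gm = GaussianMixture(n_components=2, random_state=0).fit(pct.reshape(-1, 1))

    for w, m, v in zip(gm.weights_, gm.means_.ravel(), gm.covariances_.ravel()):
        print(f"weight={w:.2f}, mean={m:.0f} K, sd={np.sqrt(v):.0f} K")

    # The fitted mixture density can be evaluated anywhere, e.g. to
    # estimate the probability of exceeding a (hypothetical) limit:
    grid = np.linspace(800.0, 1250.0, 451).reshape(-1, 1)
    pdf = np.exp(gm.score_samples(grid))
    p_exceed = pdf[grid.ravel() > 1200.0].sum() * (grid[1] - grid[0]).item()
    print(f"P(PCT > 1200 K) ~= {p_exceed:.4f}")
    ```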

  18. Reliability of a new biokinetic model of zirconium in internal dosimetry: part I, parameter uncertainty analysis.

    Li, Wei Bo; Greiter, Matthias; Oeh, Uwe; Hoeschen, Christoph

    2011-12-01

    The reliability of biokinetic models is essential in internal dose assessments and radiation risk analysis for the public, occupational workers, and patients exposed to radionuclides. In this paper, a method for assessing the reliability of biokinetic models by means of uncertainty and sensitivity analysis was developed. The paper is divided into two parts. In the first part of the study published here, the uncertainty sources of the model parameters for zirconium (Zr), developed by the International Commission on Radiological Protection (ICRP), were identified and analyzed. Furthermore, the uncertainty of the biokinetic experimental measurement performed at the Helmholtz Zentrum München-German Research Center for Environmental Health (HMGU) for developing a new biokinetic model of Zr was analyzed according to the Guide to the Expression of Uncertainty in Measurement, published by the International Organization for Standardization. The confidence interval and distribution of the model parameters of the ICRP and HMGU Zr biokinetic models were evaluated. As a result of computational biokinetic modeling, the mean, standard uncertainty, and confidence interval of the model predictions calculated based on the model parameter uncertainty were presented and compared to the plasma clearance and urinary excretion measured after intravenous administration. It was shown that for the most important compartment, the plasma, the uncertainty evaluated for the HMGU model was much smaller than that for the ICRP model; that phenomenon was observed for other organs and tissues as well. The uncertainty of the integral of the radioactivity of Zr up to 50 y calculated by the HMGU model after ingestion by adult members of the public was shown to be smaller by a factor of two than that of the ICRP model. It was also shown that the distribution type of the model parameter strongly influences the model prediction, and the correlation of the model input parameters affects the model prediction to a...

  19. Development Optimization and Uncertainty Analysis Methods for Oil and Gas Reservoirs

    Ettehadtavakkol, Amin, E-mail: amin.ettehadtavakkol@ttu.edu [Texas Tech University (United States); Jablonowski, Christopher [Shell Exploration and Production Company (United States); Lake, Larry [University of Texas at Austin (United States)

    2017-04-15

    Uncertainty complicates the development optimization of oil and gas exploration and production projects, but methods have been devised to analyze uncertainty and its impact on optimal decision-making. This paper compares two methods for development optimization and uncertainty analysis: Monte Carlo (MC) simulation and stochastic programming. Two example problems for a gas field development and an oilfield development are solved and discussed to elaborate the advantages and disadvantages of each method. Development optimization involves decisions regarding the configuration of initial capital investment and subsequent operational decisions. Uncertainty analysis involves the quantification of the impact of uncertain parameters on the optimum design concept. The gas field development problem is designed to highlight the differences in the implementation of the two methods and to show that both methods yield the exact same optimum design. The results show that both MC optimization and stochastic programming provide unique benefits, and that the choice of method depends on the goal of the analysis. While the MC method generates more useful information, along with the optimum design configuration, the stochastic programming method is more computationally efficient in determining the optimal solution. Reservoirs comprise multiple compartments and layers with multiphase flow of oil, water, and gas. We present a workflow for development optimization under uncertainty for these reservoirs, and solve an example on the design optimization of a multicompartment, multilayer oilfield development.
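
    The contrast between the two methods can be miniaturized to a one-decision capacity problem: a stochastic program picks the single design maximizing expected NPV across scenarios, while naively averaging scenario-wise optima (a common misreading of MC results) gives a different answer. All scenario numbers below are hypothetical.

    ```python
    import numpy as np

    # Hypothetical gas-field capacity choice under uncertain deliverability:
    # pick design capacity x before the uncertainty resolves.
    scenarios = np.array([40.0, 60.0, 100.0])   # deliverability scenarios
    probs = np.array([0.3, 0.5, 0.2])
    capex_per_unit, revenue_per_unit = 2.0, 3.0

    def npv(x, s):
        # Revenue limited by both installed capacity and the scenario.
        return revenue_per_unit * min(x, s) - capex_per_unit * x

    grid = np.linspace(0.0, 120.0, 1201)

    # Stochastic-programming view: one x maximizing the expected NPV.
    exp_npv = [sum(p * npv(x, s) for p, s in zip(probs, scenarios)) for x in grid]
    x_sp = grid[int(np.argmax(exp_npv))]

    # Naive scenario-wise view: optimize each scenario separately, then
    # average the optima (a real MC workflow is richer; this just shows
    # the pitfall of averaging optima instead of optimizing the expectation).
    x_per_scenario = [grid[int(np.argmax([npv(x, s) for x in grid]))]
                      for s in scenarios]
    x_avg = float(np.dot(probs, x_per_scenario))

    print(f"stochastic-programming capacity: {x_sp:.0f}")   # 60
    print(f"average of scenario optima:      {x_avg:.0f}")  # 62
    ```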

  20. Development Optimization and Uncertainty Analysis Methods for Oil and Gas Reservoirs

    Ettehadtavakkol, Amin; Jablonowski, Christopher; Lake, Larry

    2017-01-01

    Uncertainty complicates the development optimization of oil and gas exploration and production projects, but methods have been devised to analyze uncertainty and its impact on optimal decision-making. This paper compares two methods for development optimization and uncertainty analysis: Monte Carlo (MC) simulation and stochastic programming. Two example problems for a gas field development and an oilfield development are solved and discussed to elaborate the advantages and disadvantages of each method. Development optimization involves decisions regarding the configuration of initial capital investment and subsequent operational decisions. Uncertainty analysis involves the quantification of the impact of uncertain parameters on the optimum design concept. The gas field development problem is designed to highlight the differences in the implementation of the two methods and to show that both methods yield the exact same optimum design. The results show that both MC optimization and stochastic programming provide unique benefits, and that the choice of method depends on the goal of the analysis. While the MC method generates more useful information, along with the optimum design configuration, the stochastic programming method is more computationally efficient in determining the optimal solution. Reservoirs comprise multiple compartments and layers with multiphase flow of oil, water, and gas. We present a workflow for development optimization under uncertainty for these reservoirs, and solve an example on the design optimization of a multicompartment, multilayer oilfield development.

  1. Uncertainty modelling and analysis of volume calculations based on a regular grid digital elevation model (DEM)

    Li, Chang; Wang, Qing; Shi, Wenzhong; Zhao, Sisi

    2018-05-01

    The accuracy of earthwork calculations that compute terrain volume is critical to digital terrain analysis (DTA). The uncertainties in volume calculations (VCs) based on a DEM are primarily related to three factors: 1) model error (ME), which is caused by the algorithm adopted for the VC model; 2) discrete error (DE), which is usually caused by DEM resolution and terrain complexity; and 3) propagation error (PE), which is caused by errors in the input variables. Based on these factors, the uncertainty modelling and analysis of VCs based on a regular grid DEM are investigated in this paper. In particular, a way to quantify the uncertainty of VCs with a confidence interval based on truncation error (TE) is proposed. In the experiments, the trapezoidal double rule (TDR) and Simpson's double rule (SDR) were used to calculate volume, where the TE is the major ME, and six simulated regular grid DEMs with different terrain complexity and resolution (i.e. DE) were generated by a Gauss synthetic surface so as to easily obtain the theoretical true value and eliminate the interference of data errors. For PE, Monte Carlo simulation techniques and spatial autocorrelation were used to represent DEM uncertainty. This study can enrich the uncertainty modelling and analysis-related theories of geographic information science.
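
    A minimal sketch of the TDR volume calculation on a Gauss synthetic surface, in the spirit of the experiments described; the grid spacing and surface coefficients are illustrative choices, not the paper's settings.

    ```python
    import numpy as np

    # Gauss synthetic surface on a regular grid, so the true volume is
    # known analytically and data errors are excluded.
    nx, ny, dx, dy = 201, 201, 5.0, 5.0
    x = np.arange(nx) * dx
    y = np.arange(ny) * dy
    X, Y = np.meshgrid(x, y, indexing="ij")
    Z = 100.0 * np.exp(-(((X - 500.0) / 250.0) ** 2 + ((Y - 500.0) / 250.0) ** 2))

    def volume_tdr(Z, dx, dy):
        """Trapezoidal double rule (TDR): composite trapezoid weights
        (1/2 on edges, 1/4 on corners, 1 inside) times the cell area."""
        wx = np.ones(Z.shape[0]); wx[[0, -1]] = 0.5
        wy = np.ones(Z.shape[1]); wy[[0, -1]] = 0.5
        return dx * dy * float((wx[:, None] * wy[None, :] * Z).sum())

    v = volume_tdr(Z, dx, dy)
    # Untruncated analytic volume of the Gaussian is 100*pi*250^2 ~ 1.963e7;
    # the small difference is the tail cut at the DEM boundary plus the TE.
    print(f"TDR volume: {v:.4e}  (analytic, infinite plane: {100*np.pi*250**2:.4e})")
    ```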

  2. Applications of uncertainty analysis to visual evaluation of density in radiographs

    Uchida, Suguru; Ohtsuka, Akiyoshi; Fujita, Hiroshi.

    1981-01-01

    Uncertainty analysis, developed as a method of absolute judgment in psychology, is applied to a method of radiographic image evaluation with perceptual fluctuations and to an examination of the visual evaluation of density in radiographs. The subjects comprise three groups: four neurosurgeons, four radiologic technologists and four nonprofessionals. Using a five-category rating scale, each observer is directed to classify 255 radiographs randomly presented without feedback. The characteristics of each observer and each group can be shown quantitatively by the calculated information values. It is also described how bivariate uncertainty analysis, or the entropy method, can be used to calculate the degree of agreement of the evaluations. (author)
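
    The "information value" of an observer can be computed from a stimulus-response confusion matrix as the transmitted information T(x;y) = H(x) + H(y) − H(x,y); the sketch below uses a made-up five-category matrix, not the study's data.

    ```python
    import numpy as np

    def entropy(p):
        p = p[p > 0]
        return -(p * np.log2(p)).sum()

    # Hypothetical confusion matrix: rows = true density category of the
    # radiograph, columns = observer's rating on the five-category scale.
    counts = np.array([
        [40,  9,  2,  0,  0],
        [ 8, 35, 10,  2,  0],
        [ 2,  9, 30,  9,  2],
        [ 0,  2, 11, 33,  8],
        [ 0,  0,  3,  9, 31],
    ], dtype=float)

    p_xy = counts / counts.sum()
    p_x, p_y = p_xy.sum(axis=1), p_xy.sum(axis=0)

    # Transmitted information T(x;y) = H(x) + H(y) - H(x,y), in bits:
    # a quantitative characterization of an observer or observer group.
    T = entropy(p_x) + entropy(p_y) - entropy(p_xy.ravel())
    print(f"transmitted information: {T:.2f} bits (max {np.log2(5):.2f})")
    ```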

  3. Uncertainty modeling in vibration, control and fuzzy analysis of structural systems

    Halder, Achintya; Ayyub, Bilal M

    1997-01-01

    This book gives an overview of the current state of uncertainty modeling in vibration, control, and fuzzy analysis of structural and mechanical systems. It is a coherent compendium written by leading experts and offers the reader a sampling of exciting research areas in several fast-growing branches in this field. Uncertainty modeling and analysis are becoming an integral part of system definition and modeling in many fields. The book consists of ten chapters that report the work of researchers, scientists and engineers on theoretical developments and diversified applications in engineering sy

  4. Applications of uncertainty analysis to visual evaluation of density in radiographs

    Uchida, S [Gifu Univ. (Japan); Ohtsuka, A; Fujita, H

    1981-03-01

    Uncertainty analysis, developed as a method of absolute judgment in psychology, is applied to a method of radiographic image evaluation with perceptual fluctuations and to an examination of the visual evaluation of density in radiographs. The subjects comprise three groups: four neurosurgeons, four radiologic technologists and four nonprofessionals. Using a five-category rating scale, each observer is directed to classify 255 radiographs randomly presented without feedback. The characteristics of each observer and each group can be shown quantitatively by the calculated information values. It is also described how bivariate uncertainty analysis, or the entropy method, can be used to calculate the degree of agreement of the evaluations.

  5. Analysis of parameter uncertainties in the assessment of seismic risk for nuclear power plants

    Yucemen, S.M.

    1981-04-01

    Probabilistic and statistical methods are used to develop a procedure by which the seismic risk at a specific site can be systematically analyzed. The proposed probabilistic procedure provides a consistent method for the modelling, analysis and updating of uncertainties that are involved in the seismic risk analysis for nuclear power plants. Methods are proposed for including these uncertainties in the final value of the calculated risks. Two specific case studies are presented in detail to illustrate the application of the probabilistic method of seismic risk evaluation and to investigate the sensitivity of the results to different assumptions.

  6. IAEA Coordinated Research Project on HTGR Reactor Physics, Thermal-hydraulics and Depletion Uncertainty Analysis

    Strydom, Gerhard [Idaho National Lab. (INL), Idaho Falls, ID (United States); Bostelmann, F. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-09-01

    The continued development of High Temperature Gas Cooled Reactors (HTGRs) requires verification of HTGR design and safety features with reliable high fidelity physics models and robust, efficient, and accurate codes. The predictive capability of coupled neutronics/thermal-hydraulics and depletion simulations for reactor design and safety analysis can be assessed with sensitivity analysis (SA) and uncertainty analysis (UA) methods. Uncertainty originates from errors in physical data, manufacturing uncertainties, modelling and computational algorithms. (The interested reader is referred to the large body of published SA and UA literature for a more complete overview of the various types of uncertainties, methodologies and results obtained). SA is helpful for ranking the various sources of uncertainty and error in the results of core analyses. SA and UA are required to address cost, safety, and licensing needs and should be applied to all aspects of reactor multi-physics simulation. SA and UA can guide experimental, modelling, and algorithm research and development. Current SA and UA rely either on derivative-based methods such as stochastic sampling methods or on generalized perturbation theory to obtain sensitivity coefficients. Neither approach addresses all needs. In order to benefit from recent advances in modelling and simulation and the availability of new covariance data (nuclear data uncertainties) extensive sensitivity and uncertainty studies are needed for quantification of the impact of different sources of uncertainties on the design and safety parameters of HTGRs. Only a parallel effort in advanced simulation and in nuclear data improvement will be able to provide designers with more robust and well validated calculation tools to meet design target accuracies. In February 2009, the Technical Working Group on Gas-Cooled Reactors (TWG-GCR) of the International Atomic Energy Agency (IAEA) recommended that the proposed Coordinated Research Program (CRP) on

  7. An analysis of uncertainties in the reference resonance absorption calculations

    Milosevic, M.; Pesic, M.

    1997-05-01

    A recently appeared generation of design-oriented methods, which make it possible to compute the space and energy dependence of the resonant absorption inside the fuel rod, raises a new problem of validating the results obtained with improved resonance treatments. Because no experimental results are available on the spatial and energy distribution of resonance absorption, detailed reference calculations were generated with continuous-energy Monte Carlo and energy-pointwise slowing-down codes. The accuracy of these calculations depends on various influences. In this paper an analysis of some of these influences, such as differences in nuclear data libraries and in the philosophy of reproducing the cross section data, is presented. An example application is given for a calculation benchmark that consists of determining the resonance absorption by 238U in a typical PWR pin-cell geometry. (author)

  8. BEMUSE Phase III Report - Uncertainty and Sensitivity Analysis of the LOFT L2-5 Test

    Bazin, P.; Crecy, A. de; Glaeser, H.; Skorek, T.; Joucla, J.; Probst, P.; Chung, B.; Oh, D.Y.; Kyncl, M.; Pernica, R.; Macek, J.; Meca, R.; Macian, R.; D'Auria, F.; Petruzzi, A.; Perez, M.; Reventos, F.; Fujioka, K.

    2007-02-01

    This report summarises the various contributions (ten participants) to phase 3 of BEMUSE: Uncertainty and Sensitivity Analyses of the LOFT L2-5 experiment, a Large-Break Loss-of-Coolant Accident (LB-LOCA). For this phase, precise step-by-step requirements were provided to the participants. Four main parts are defined, which are: 1. List and uncertainties of the input uncertain parameters. 2. Uncertainty analysis results. 3. Sensitivity analysis results. 4. Improved methods, assessment of the methods (optional). The 5% and 95% percentiles have to be estimated for 6 output parameters, which are of two kinds: 1. scalar output parameters (first Peak Cladding Temperature (PCT), second Peak Cladding Temperature, time of accumulator injection, time of complete quenching); 2. time-trend output parameters (maximum cladding temperature, upper plenum pressure). The main lessons learnt from phase 3 of the BEMUSE programme are the following: - For uncertainty analysis, all the participants use a probabilistic method associated with the use of Wilks' formula, except for UNIPI with its CIAU method (Code with the Capability of Internal Assessment of Uncertainty). Use of both methods has been successfully mastered. - Compared with the experiment, the results of the uncertainty analysis are good on the whole. For example, for the cladding temperature-type output parameters (first PCT, second PCT, time of complete quenching, maximum cladding temperature), 8 participants out of 10 find upper and lower bounds which envelop the experimental data. - Sensitivity analysis has been successfully performed by all the participants using the probabilistic method. All of the influence measures used take into account the range of variation of the input parameters. Synthesis tables of the most influential phenomena and parameters have been compiled, and participants will be able to use them for the continuation of the BEMUSE programme.
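
    The Wilks-formula step used by most participants fixes the number of code runs: for a one-sided 95%/95% statement, the familiar first-, second- and third-order sample sizes (59, 93, 124) follow from the binomial distribution, as the short calculation below shows.

    ```python
    from scipy import stats

    def wilks_n(gamma=0.95, beta=0.95, order=1):
        """Smallest n such that the order-th largest of n runs bounds the
        gamma-quantile of the output with one-sided confidence beta."""
        n = order
        while True:
            # Confidence = P(at least `order` of the n outputs exceed the
            # gamma-quantile), with exceedances ~ Binomial(n, 1 - gamma).
            conf = 1.0 - stats.binom.cdf(order - 1, n, 1.0 - gamma)
            if conf >= beta:
                return n
            n += 1

    for r in (1, 2, 3):
        print(f"order {r}: n = {wilks_n(order=r)} runs")  # 59, 93, 124
    ```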

  9. Uncertainties in Earthquake Loss Analysis: A Case Study From Southern California

    Mahdyiar, M.; Guin, J.

    2005-12-01

    Probabilistic earthquake hazard and loss analyses play important roles in many areas of risk management, including earthquake related public policy and insurance ratemaking. Rigorous loss estimation for portfolios of properties is difficult since there are various types of uncertainties in all aspects of modeling and analysis. It is the objective of this study to investigate the sensitivity of earthquake loss estimation to uncertainties in regional seismicity, earthquake source parameters, ground motions, and sites' spatial correlation on typical property portfolios in Southern California. Southern California is an attractive region for such a study because it has a large population concentration exposed to significant levels of seismic hazard. During the last decade, there have been several comprehensive studies of most regional faults and seismogenic sources. There have also been detailed studies on regional ground motion attenuations and regional and local site responses to ground motions. This information has been used by engineering seismologists to conduct regional seismic hazard and risk analysis on a routine basis. However, one of the more difficult tasks in such studies is the proper incorporation of uncertainties in the analysis. From the hazard side, there are uncertainties in the magnitudes, rates and mechanisms of the seismic sources and local site conditions and ground motion site amplifications. From the vulnerability side, there are considerable uncertainties in estimating the state of damage of buildings under different earthquake ground motions. From an analytical side, there are challenges in capturing the spatial correlation of ground motions and building damage, and integrating thousands of loss distribution curves with different degrees of correlation. In this paper we propose to address some of these issues by conducting loss analyses of a typical small portfolio in southern California, taking into consideration various source and ground

  10. Demonstration uncertainty/sensitivity analysis using the health and economic consequence model CRAC2

    Alpert, D.J.; Iman, R.L.; Johnson, J.D.; Helton, J.C.

    1985-01-01

    This paper summarizes a demonstration uncertainty/sensitivity analysis performed on the reactor accident consequence model CRAC2. The study was performed with uncertainty/sensitivity analysis techniques compiled as part of the MELCOR program. The principal objectives of the study were: 1) to demonstrate the use of the uncertainty/sensitivity analysis techniques on a health and economic consequence model, 2) to test the computer models which implement the techniques, 3) to identify possible difficulties in performing such an analysis, and 4) to explore alternative means of analyzing, displaying, and describing the results. Demonstration of the applicability of the techniques was the motivation for performing this study; thus, the results should not be taken as a definitive uncertainty analysis of health and economic consequences. Nevertheless, significant insights on health and economic consequence analysis can be drawn from the results of this type of study. Latin hypercube sampling (LHS), a modified Monte Carlo technique, was used in this study. LHS generates a multivariate input structure in which all the variables of interest are varied simultaneously and desired correlations between variables are preserved. LHS has been shown to produce estimates of output distribution functions that are comparable with results of larger random samples
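
    The stratified sampling idea behind LHS can be sketched in a few lines; the two input distributions below are hypothetical stand-ins, not actual CRAC2 inputs:

```python
import numpy as np
from scipy import stats

def latin_hypercube(n_samples, n_vars, seed=None):
    """LHS design on [0, 1): one point per equal-probability stratum per variable."""
    rng = np.random.default_rng(seed)
    strata = np.tile(np.arange(n_samples), (n_vars, 1))
    u = rng.permuted(strata, axis=1).T + rng.random((n_samples, n_vars))
    return u / n_samples

# Map the unit-cube design onto the input distributions via inverse CDFs.
design = latin_hypercube(50, 2, seed=42)
source_term = stats.lognorm(s=0.5, scale=1.0).ppf(design[:, 0])
wind_speed = stats.norm(loc=5.0, scale=1.5).ppf(design[:, 1])
```

    Preserving desired rank correlations between variables, as the MELCOR-derived tooling does, would additionally require an Iman-Conover style reordering of the design columns.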

  11. Prototype application of best estimate and uncertainty safety analysis methodology to large LOCA analysis

    Luxat, J.C.; Huget, R.G.

    2001-01-01

    Development of a methodology to perform best estimate and uncertainty nuclear safety analysis has been underway at Ontario Power Generation for the past two and one half years. A key driver for the methodology development, and one of the major challenges faced, is the need to re-establish demonstrated safety margins that have progressively been undermined through excessive and compounding conservatism in deterministic analyses. The major focus of the prototyping applications was to quantify the safety margins that exist at the probable range of high power operating conditions, rather than the highly improbable operating states associated with Limit of the Envelope (LOE) assumptions. In LOE, all parameters of significance to the consequences of a postulated accident are assumed to simultaneously deviate to their limiting values. Another equally important objective of the prototyping was to demonstrate the feasibility of conducting safety analysis as an incremental analysis activity, as opposed to a major re-analysis activity. The prototype analysis solely employed prior analyses of Bruce B large break LOCA events - no new computer simulations were undertaken. This is a significant and novel feature of the prototyping work. This methodology framework has been applied to a postulated large break LOCA in a Bruce generating unit on a prototype basis. This paper presents results of the application. (author)

  12. Coherent Uncertainty Analysis of Aerosol Measurements from Multiple Satellite Sensors

    Petrenko, M.; Ichoku, C.

    2013-01-01

    Aerosol retrievals from multiple spaceborne sensors, including MODIS (on Terra and Aqua), MISR, OMI, POLDER, CALIOP, and SeaWiFS (altogether, a total of 11 different aerosol products), were comparatively analyzed using data collocated with ground-based aerosol observations from the Aerosol Robotic Network (AERONET) stations within the Multi-sensor Aerosol Products Sampling System (MAPSS, http://giovanni.gsfc.nasa.gov/mapss/ and http://giovanni.gsfc.nasa.gov/aerostat/). The analysis was performed by comparing quality-screened satellite aerosol optical depth or thickness (AOD or AOT) retrievals during 2006-2010 to available collocated AERONET measurements globally, regionally, and seasonally, and deriving a number of statistical measures of accuracy. We used a robust statistical approach to detect and remove possible outliers in the collocated data that can bias the results of the analysis. Overall, the proportion of outliers in each of the quality-screened AOD products was within 12%. Squared correlation coefficient (R2) values of the satellite AOD retrievals relative to AERONET exceeded 0.6, with R2 for most of the products exceeding 0.7 over land and 0.8 over ocean. Root mean square error (RMSE) values for most of the AOD products were within 0.15 over land and 0.09 over ocean. We have been able to generate global maps showing regions where the different products present advantages over the others, as well as the relative performance of each product over different landcover types. It was observed that while MODIS, MISR, and SeaWiFS provide accurate retrievals over most of the landcover types, multi-angle capabilities make MISR the only sensor able to retrieve reliable AOD over barren and snow/ice surfaces. Likewise, active sensing enables CALIOP to retrieve aerosol properties over bright-surface shrublands more accurately than the other sensors, while POLDER, the only one of the sensors capable of measuring polarized aerosols, outperforms other sensors in

  13. The importance of input interactions in the uncertainty and sensitivity analysis of nuclear fuel behavior

    Ikonen, T., E-mail: timo.ikonen@vtt.fi; Tulkki, V.

    2014-08-15

    Highlights: • Uncertainty and sensitivity analysis of modeled nuclear fuel behavior is performed. • Burnup dependency of the uncertainties and sensitivities is characterized. • Input interactions significantly increase output uncertainties for irradiated fuel. • Identification of uncertainty sources is greatly improved with higher order methods. • Results stress the importance of using methods that take interactions into account. - Abstract: The propagation of uncertainties in a PWR fuel rod under steady-state irradiation is analyzed by computational means. A hypothetical steady-state scenario of the Three Mile Island 1 reactor fuel rod is modeled with the fuel performance code FRAPCON, using realistic input uncertainties for the fabrication and model parameters, boundary conditions and material properties. The uncertainty and sensitivity analysis is performed by extensive Monte Carlo sampling of the inputs’ probability distributions and by applying correlation coefficient and Sobol’ variance decomposition analyses. The latter includes evaluation of the second order and total effect sensitivity indices, allowing the study of interactions between input variables. The results show that the interactions play a large role in the propagation of uncertainties, and first order methods such as the correlation coefficient analyses are in general insufficient for sensitivity analysis of the fuel rod. Significant improvement over the first order methods can be achieved by using higher order methods. The results also show that both the magnitude of the uncertainties and their propagation depend not only on the output in question, but also on burnup. The latter is due to the onset of new phenomena (such as fission gas release) and the gradual closure of the pellet-cladding gap with increasing burnup. Increasing burnup also affects the importance of input interactions. Interaction effects are typically highest in the moderate burnup range (of the order of 10–40 MWd
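
    A minimal sketch of the variance-decomposition workflow, assuming the open-source SALib package is available; the three inputs and the toy response stand in for the FRAPCON parameters and are purely illustrative:

```python
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {
    "num_vars": 3,
    "names": ["fuel_conductivity", "gap_size", "linear_power"],  # hypothetical
    "bounds": [[0.9, 1.1], [0.8, 1.2], [0.95, 1.05]],            # relative multipliers
}

def model(x):
    # Toy response with an interaction term, so second-order indices are non-zero.
    return x[:, 0] + 2.0 * x[:, 1] + 5.0 * x[:, 0] * x[:, 2]

X = saltelli.sample(problem, 1024, calc_second_order=True)
Si = sobol.analyze(problem, model(X), calc_second_order=True)
print(Si["S1"])  # first-order indices
print(Si["S2"])  # second-order (pairwise interaction) indices
print(Si["ST"])  # total-effect indices; ST - S1 flags interaction effects
```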

  14. Reliability analysis of water distribution systems under uncertainty

    Kansal, M.L.; Kumar, Arun; Sharma, P.B.

    1995-01-01

    In most of the developing countries, Water Distribution Networks (WDN) are of the intermittent type because of the shortage of safe drinking water. Failure of a pipeline(s) in such cases will cause not only a fall in one or more nodal heads but also poor connectivity of the source with the various demand nodes of the system. Most of the previous works have used a two-step algorithm based on the pathset or cutset approach for connectivity analysis. The computations become more cumbersome when the connectivity of all demand nodes, taken together with that of the supply, is analysed. In the present paper, network connectivity based on the concept of the Appended Spanning Tree (AST) is suggested to compute global network connectivity, which is defined as the probability of the source node being connected with all the demand nodes simultaneously. The concept of the AST has distinct advantages, as it attacks the problem directly rather than in an indirect way as most of the studies so far have done. Since the water distribution system is a repairable one, a general expression for pipeline availability using the failure/repair rate is considered. Furthermore, the sensitivity of the global reliability estimates to likely errors in the estimation of the failure/repair rates of the various pipelines is also studied.
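
    For a repairable pipe with constant failure rate λ and repair rate μ, the steady-state availability is A = μ/(λ + μ). The global connectivity defined above can then be estimated, for instance, by Monte Carlo over pipe states; the four-node network below is hypothetical, a sketch rather than the authors' AST algorithm:

```python
import numpy as np

def availability(lam, mu):
    """Steady-state availability of a repairable pipe: A = mu / (lam + mu)."""
    return mu / (lam + mu)

def global_connectivity(n_nodes, pipes, p_up, n_trials=20_000, source=0, seed=0):
    """Monte Carlo estimate of the probability that the source reaches all
    demand nodes simultaneously, given per-pipe availabilities p_up."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n_trials):
        up = rng.random(len(pipes)) < p_up
        parent = list(range(n_nodes))            # union-find over surviving pipes
        def find(a):
            while parent[a] != a:
                parent[a] = parent[parent[a]]
                a = parent[a]
            return a
        for k, (i, j) in enumerate(pipes):
            if up[k]:
                parent[find(i)] = find(j)
        hits += all(find(v) == find(source) for v in range(n_nodes))
    return hits / n_trials

pipes = [(0, 1), (1, 2), (0, 2), (2, 3)]         # hypothetical 4-node network
p = availability(lam=0.01, mu=0.5)               # per-pipe availability ~ 0.98
print(global_connectivity(4, pipes, p))
```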

  15. An estimation of uncertainties in containment P/T analysis using CONTEMPT/LT code

    Kang, Y.M.; Park, G.C.; Lee, U.C.; Kang, C.S.

    1991-01-01

    In a nuclear power plant, the containment design pressure and temperature (P/T) have traditionally been established on the basis of unrealistic conservatism, at a cost to economics. Thus, the uncertainties of the design P/T values need to be well defined through an extensive uncertainty analysis with plant-specific input data and/or the models used in the computer code. This study estimates plant-specific uncertainties of the containment design P/T using the Monte Carlo method for the Kori-3 reactor. Kori-3 plant parameters and the Uchida heat transfer coefficient were selected for statistical treatment after a sensitivity study. The Monte Carlo analysis was performed with the response surface method, using the CONTEMPT/LT code and the Latin hypercube sampling technique. Finally, the design values based on 95%/95% probability are compared with the worst-case estimates to assess the design margin. (author)

  16. Deterministic sensitivity and uncertainty analysis for large-scale computer models

    Worley, B.A.; Pin, F.G.; Oblow, E.M.; Maerker, R.E.; Horwedel, J.E.; Wright, R.Q.

    1988-01-01

    This paper presents a comprehensive approach to sensitivity and uncertainty analysis of large-scale computer models that is analytic (deterministic) in principle and that is firmly based on the model equations. The theory and application of two systems based upon computer calculus, GRESS and ADGEN, are discussed relative to their role in calculating model derivatives and sensitivities without a prohibitive initial manpower investment. Storage and computational requirements for these two systems are compared for a gradient-enhanced version of the PRESTO-II computer model. A Deterministic Uncertainty Analysis (DUA) method that retains the characteristics of analytically computing result uncertainties based upon parameter probability distributions is then introduced and results from recent studies are shown. 29 refs., 4 figs., 1 tab
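
    The flavour of the DUA approach can be conveyed with the first-order propagation formula σ_y² = Σᵢ (∂y/∂xᵢ)² σᵢ². In the sketch below (an illustration only), central differences stand in for the derivatives that GRESS and ADGEN obtain by computer calculus:

```python
import numpy as np

def propagate(f, x0, sigmas, rel_step=1e-6):
    """First-order deterministic uncertainty propagation:
    sigma_y = sqrt(sum_i (df/dx_i)^2 * sigma_i^2)."""
    x0 = np.asarray(x0, dtype=float)
    grad = np.empty_like(x0)
    for i in range(x0.size):
        h = rel_step * max(1.0, abs(x0[i]))
        xp, xm = x0.copy(), x0.copy()
        xp[i] += h
        xm[i] -= h
        grad[i] = (f(xp) - f(xm)) / (2.0 * h)   # central-difference sensitivity
    sigma_y = float(np.sqrt(np.sum((grad * np.asarray(sigmas)) ** 2)))
    return grad, sigma_y

# Toy response standing in for a large code output.
f = lambda x: x[0] ** 2 * np.exp(-x[1]) + x[2]
grad, sigma_y = propagate(f, x0=[1.0, 0.5, 2.0], sigmas=[0.1, 0.05, 0.2])
print(grad, sigma_y)
```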

  17. Analysis of Uncertainty and Variability in Finite Element Computational Models for Biomedical Engineering: Characterization and Propagation.

    Mangado, Nerea; Piella, Gemma; Noailly, Jérôme; Pons-Prats, Jordi; Ballester, Miguel Ángel González

    2016-01-01

    Computational modeling has become a powerful tool in biomedical engineering thanks to its potential to simulate coupled systems. However, real parameters are usually not accurately known, and variability is inherent in living organisms. To cope with this, probabilistic tools, statistical analysis and stochastic approaches have been used. This article aims to review the analysis of uncertainty and variability in the context of finite element modeling in biomedical engineering. Characterization techniques and propagation methods are presented, as well as examples of their applications in biomedical finite element simulations. Uncertainty propagation methods, both non-intrusive and intrusive, are described. Finally, pros and cons of the different approaches and their use in the scientific community are presented. This leads us to identify future directions for research and methodological development of uncertainty modeling in biomedical engineering.

  18. Application of RELAP/SCDAPSIM with integrated uncertainty options to research reactor systems thermal hydraulic analysis

    Allison, C.M.; Hohorst, J.K.; Perez, M.; Reventos, F.

    2010-01-01

    The RELAP/SCDAPSIM/MOD4.0 code, designed to predict the behavior of reactor systems during normal and accident conditions, is being developed as part of the international SCDAP Development and Training Program (SDTP). RELAP/SCDAPSIM/MOD4.0, which is the first version of RELAP5 completely rewritten to FORTRAN 90/95/2000 standards, uses publicly available RELAP5 and SCDAP models in combination with advanced programming and numerical techniques and other SDTP-member modeling/user options. One such member-developed option is an integrated uncertainty analysis package being developed jointly by the Technical University of Catalonia (UPC) and Innovative Systems Software (ISS). This paper briefly summarizes the features of RELAP/SCDAPSIM/MOD4.0 and the integrated uncertainty analysis package, and then presents an example of how the integrated uncertainty package can be set up and used for a simple pipe flow problem. (author)

  19. Uncertainty analysis of flexible rotors considering fuzzy parameters and fuzzy-random parameters

    Fabian Andres Lara-Molina

    The components of flexible rotors are subjected to uncertainties. The main sources of uncertainty include the variation of mechanical properties. This contribution aims at analyzing the dynamics of flexible rotors under uncertain parameters modeled as fuzzy and fuzzy-random variables. The uncertainty analysis encompasses the modeling of the uncertain parameters and the numerical simulation of the corresponding flexible rotor model using an approach based on fuzzy dynamic analysis. The numerical simulation is accomplished by mapping the fuzzy parameters of the deterministic flexible rotor model. Thereby, the flexible rotor is modeled using both the Fuzzy Finite Element Method and the Fuzzy Stochastic Finite Element Method. Numerical simulations illustrate the methodology in terms of orbits and frequency response functions subject to uncertain parameters.

  20. Demonstration uncertainty/sensitivity analysis using the health and economic consequence model CRAC2

    Alpert, D.J.; Iman, R.L.; Johnson, J.D.; Helton, J.C.

    1984-12-01

    The techniques for performing uncertainty/sensitivity analyses compiled as part of the MELCOR program appear to be well suited for use with a health and economic consequence model. Two replicate samples of size 50 gave essentially identical results, indicating that for this case, a Latin hypercube sample of size 50 seems adequate to represent the distribution of results. Though the intent of this study was a demonstration of uncertainty/sensitivity analysis techniques, a number of insights relevant to health and economic consequence modeling can be gleaned: uncertainties in early deaths are significantly greater than uncertainties in latent cancer deaths; though the magnitude of the source term is the largest source of variation in estimated distributions of early deaths, a number of additional parameters are also important; even with the release fractions for a full SST1, one quarter of the CRAC2 runs gave no early deaths; and comparison of the estimates of mean early deaths for a full SST1 release in this study with those of recent point estimates for similar conditions indicates that the recent estimates may be significant overestimations of early deaths. Estimates of latent cancer deaths, however, are roughly comparable. An analysis of the type described here can provide insights in a number of areas. First, the variability in the results gives an indication of the potential uncertainty associated with the calculations. Second, the sensitivity of the results to assumptions about the input variables can be determined. Research efforts can then be concentrated on reducing the uncertainty in the variables which are the largest contributors to uncertainty in results

  1. Reliability and Robustness Analysis of the Masinga Dam under Uncertainty

    Hayden Postle-Floyd

    2017-02-01

    Kenya’s water abstraction must meet the projected growth in municipal and irrigation demand by the end of 2030 in order to achieve the country’s industrial and economic development plan. The Masinga dam, on the Tana River, is the key to meeting this goal, satisfying the growing demands whilst also continuing to provide hydroelectric power generation. This study quantitatively assesses the reliability and robustness of the Masinga dam system under uncertain future supply and demand using probabilistic climate and population projections, and examines how long-term planning may improve the longevity of the dam. River flow and demand projections are used alongside each other as inputs to the dam system simulation model, linked to an optimisation engine to maximise water availability. Water availability after demand satisfaction is assessed for future years, and the projected reliability of the system is calculated for selected years. The analysis shows that maximising power generation on a short-term year-by-year basis achieves 80%, 50% and 1% reliability by 2020, 2025 and 2030 onwards, respectively. Longer term optimal planning, however, increases system reliability to up to 95% in 2020, 80% in 2025, and more than 40% in 2030 onwards. In addition, increasing the capacity of the reservoir by around 25% can significantly improve the robustness of the system for all future time periods. This study provides a platform for analysing the implications of different planning and management options for the Masinga dam, and suggests that careful consideration should be given to the growing municipal needs and irrigation schemes both in the immediate area and in the associated Tana River basin.

  2. Computational Fluid Dynamics Uncertainty Analysis Applied to Heat Transfer over a Flat Plate

    Groves, Curtis Edward; Ilie, Marcel; Schallhorn, Paul A.

    2013-01-01

    There have been few discussions on using Computational Fluid Dynamics (CFD) without experimental validation. Pairing experimental data, uncertainty analysis, and analytical predictions provides a comprehensive approach to verification and is the current state of the art. With pressed budgets, collecting experimental data is rare or non-existent. This paper investigates and proposes a method to perform CFD uncertainty analysis from computational data alone. The method uses current CFD uncertainty techniques coupled with the Student-t distribution to predict the heat transfer coefficient over a flat plate. The inputs to the CFD model are varied within a specified tolerance or bias error, and the differences in the results are used to estimate the uncertainty. The variation in each input is ranked from least to greatest to determine the order of importance. The results are compared to heat transfer correlations, and conclusions are drawn about the feasibility of using CFD without experimental data. The results provide a tactic to analytically estimate the uncertainty in a CFD model when experimental data are unavailable.
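
    A minimal sketch of the Student-t step, with hypothetical stand-ins for CFD results obtained by perturbing the inputs within their tolerances:

```python
import numpy as np
from scipy import stats

def t_interval(samples, confidence=0.95):
    """Two-sided Student-t confidence interval on the mean of a small sample."""
    x = np.asarray(samples, dtype=float)
    n = x.size
    half = stats.t.ppf(0.5 + confidence / 2.0, df=n - 1) * x.std(ddof=1) / np.sqrt(n)
    return x.mean(), half

# Hypothetical heat transfer coefficients from CFD runs with perturbed inputs.
h = [48.2, 51.0, 49.7, 50.4, 47.9]            # W/(m^2 K)
mean, half = t_interval(h)
print(f"h = {mean:.1f} +/- {half:.1f} W/(m^2 K) at 95% confidence")
```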

  3. Estimation of a quantity of interest in uncertainty analysis: Some help from Bayesian decision theory

    Pasanisi, Alberto; Keller, Merlin; Parent, Eric

    2012-01-01

    In the context of risk analysis under uncertainty, we focus here on the problem of estimating a so-called quantity of interest of an uncertainty analysis problem, i.e. a given feature of the probability distribution function (pdf) of the output of a deterministic model with uncertain inputs. We stay here in a fully probabilistic setting. A common problem is how to account for epistemic uncertainty tainting the parameters of the probability distribution of the inputs. In standard practice, this uncertainty is often neglected (the plug-in approach). When a specific uncertainty assessment is made on the basis of the available information (expertise and/or data), a common solution consists in marginalizing the joint distribution of both the observable inputs and the parameters of the probabilistic model (i.e. computing the predictive pdf of the inputs), then propagating it through the deterministic model. We reinterpret this approach in the light of Bayesian decision theory, and show that this practice implicitly leads the analyst to adopt a specific loss function which may be inappropriate for the problem under investigation, and suboptimal from a decisional perspective. These concepts are illustrated on a simple numerical example concerning a case of flood risk assessment.
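
    The plug-in versus predictive distinction is easy to demonstrate numerically. In the sketch below, the model g and the posterior on the input's mean are invented for illustration; marginalizing over the parameter (the predictive approach) visibly widens the output quantiles:

```python
import numpy as np

rng = np.random.default_rng(0)
g = lambda x: np.exp(0.5 * x)            # invented deterministic model
mu_hat, mu_sd = 1.0, 0.3                 # assumed posterior mean/sd of the input's mean

n = 100_000
# Plug-in: freeze the distribution parameter at its point estimate.
y_plugin = g(rng.normal(mu_hat, 1.0, n))
# Predictive: marginalize, i.e. draw the parameter first, then the input given it.
y_predictive = g(rng.normal(rng.normal(mu_hat, mu_sd, n), 1.0))

for name, y in [("plug-in", y_plugin), ("predictive", y_predictive)]:
    print(f"{name:>10}: 95th percentile = {np.quantile(y, 0.95):.3f}")
```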

  4. Uncertainty and sensitivity analysis of fission gas behavior in engineering-scale fuel modeling

    Pastore, Giovanni, E-mail: Giovanni.Pastore@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Swiler, L.P., E-mail: LPSwile@sandia.gov [Optimization and Uncertainty Quantification, Sandia National Laboratories, P.O. Box 5800, Albuquerque, NM 87185-1318 (United States); Hales, J.D., E-mail: Jason.Hales@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Novascone, S.R., E-mail: Stephen.Novascone@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Perez, D.M., E-mail: Danielle.Perez@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Spencer, B.W., E-mail: Benjamin.Spencer@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Luzzi, L., E-mail: Lelio.Luzzi@polimi.it [Politecnico di Milano, Department of Energy, Nuclear Engineering Division, via La Masa 34, I-20156 Milano (Italy); Van Uffelen, P., E-mail: Paul.Van-Uffelen@ec.europa.eu [European Commission, Joint Research Centre, Institute for Transuranium Elements, Hermann-von-Helmholtz-Platz 1, D-76344 Karlsruhe (Germany); Williamson, R.L., E-mail: Richard.Williamson@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States)

    2015-01-15

    The role of uncertainties in fission gas behavior calculations as part of engineering-scale nuclear fuel modeling is investigated using the BISON fuel performance code with a recently implemented physics-based model for fission gas release and swelling. Through the integration of BISON with the DAKOTA software, a sensitivity analysis of the results to selected model parameters is carried out based on UO₂ single-pellet simulations covering different power regimes. The parameters are varied within ranges representative of the relative uncertainties and consistent with the information in the open literature. The study leads to an initial quantitative assessment of the uncertainty in fission gas behavior predictions with the parameter characterization presently available. Also, the relative importance of the single parameters is evaluated. Moreover, a sensitivity analysis is carried out based on simulations of a fuel rod irradiation experiment, pointing out a significant impact of the considered uncertainties on the calculated fission gas release and cladding diametral strain. The results of the study indicate that the commonly accepted deviation between calculated and measured fission gas release by a factor of 2 approximately corresponds to the inherent modeling uncertainty at high fission gas release. Nevertheless, significantly higher deviations may be expected for values around 10% and lower. Implications are discussed in terms of directions of research for the improved modeling of fission gas behavior for engineering purposes.

  5. Risk Analysis of Reservoir Flood Routing Calculation Based on Inflow Forecast Uncertainty

    Binquan Li

    2016-10-01

    Possible risks in reservoir flood control and regulation cannot be objectively assessed by deterministic flood forecasts, resulting in a probability of reservoir failure. We demonstrate a risk analysis of the reservoir flood routing calculation accounting for inflow forecast uncertainty in a sub-basin of the Huaihe River, China. The Xinanjiang model was used to provide deterministic flood forecasts, and was combined with the Hydrologic Uncertainty Processor (HUP) to quantify reservoir inflow uncertainty in probability density function (PDF) form. Furthermore, the PDFs of the reservoir water level (RWL) and the risk rate of the RWL exceeding a defined safety control level could be obtained. Results suggested that the median forecast (50th percentile) of the HUP showed better agreement with observed inflows than the Xinanjiang model did in terms of the performance measures of flood process, peak, and volume. In addition, most observations (77.2%) were bracketed by the uncertainty band of the 90% confidence interval, with some small exceptions at high flows. The results prove that this framework of risk analysis can provide not only deterministic forecasts of inflow and RWL, but also the fundamental uncertainty information (e.g., the 90% confidence band) for the reservoir flood routing calculation.

  6. Analysis of convergence of uncertainty and important factors affecting uncertainty in level 1 PSA for pressurized water reactors

    Shimada, Yoshio

    2002-01-01

    We analyzed how the convergence of the mean core damage frequency (CDF) depends on the number of minimal cut sets, the sampling method and the random seed, using level 1 PSA models for Surry 1 and a Japanese 4-loop PWR plant. As a result, the following were clarified: the good convergence efficiency of Latin hypercube sampling (LHS); the relationship between the number of minimal cut sets and the mean CDF, as well as the standard deviation; and an easy method of judging mean CDF convergence. In addition, the relationship between the number of probability variables (i.e. the number of basic events) and the number of samples needed for the mean CDF to converge was examined. An analysis of important factors affecting uncertainty was also performed. As a result, it was found that the initiating events (especially loss of coolant accidents) were the dominant important factors. Finally, comparisons were made between the 95% confidence interval calculated from the operating experience of the worldwide nuclear power plants and (1) the mean core damage frequency by PSA for 108 US plants and 51 Japanese plants and (2) the 95% confidence interval of the US and Japanese plant PSA models used in this research. As a result, it was clarified that the mean core damage frequency of almost all pressurized and boiling light water reactors in the US was within the 90% confidence interval calculated from the operating experience of the nuclear power plants (PWRs and BWRs) in the world, but that of the reactors in Japan was smaller than that level. (author)
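
    The kind of convergence judgment described can be illustrated with a running-mean check on sampled core damage frequencies; the lognormal stand-in below is purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
# Lognormal stand-in for sampled core damage frequencies (per reactor-year).
cdf_samples = rng.lognormal(mean=np.log(1e-5), sigma=1.2, size=5000)

running_mean = np.cumsum(cdf_samples) / np.arange(1, cdf_samples.size + 1)
# Easy convergence judgment: the running mean stays within +/-2% of its final
# value over the second half of the samples.
tail = running_mean[cdf_samples.size // 2:]
converged = bool(np.all(np.abs(tail / running_mean[-1] - 1.0) < 0.02))
print(f"mean CDF = {running_mean[-1]:.2e}, converged: {converged}")
```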

  8. Uncertainty in projected point precipitation extremes for hydrological impact analysis of climate change

    Van Uytven, Els; Willems, Patrick

    2017-04-01

    Current trends in hydro-meteorological variables indicate the potential impact of climate change on hydrological extremes. They therefore trigger an increased importance of climate adaptation strategies in water management. The impact of climate change on hydro-meteorological and hydrological extremes is, however, highly uncertain. This is due to uncertainties introduced by the climate models, the internal variability inherent to the climate system, the greenhouse gas scenarios and the statistical downscaling methods. In view of the need to define sustainable climate adaptation strategies, these uncertainties must be assessed. This is commonly done by means of ensemble approaches. Because more and more climate models and statistical downscaling methods become available, there is a need to facilitate the climate impact and uncertainty analysis. A Climate Perturbation Tool has been developed for that purpose, which combines a set of statistical downscaling methods including weather typing, weather generator, transfer function and advanced perturbation-based approaches. By use of an interactive interface, climate impact modelers can apply these statistical downscaling methods in a semi-automatic way to an ensemble of climate model runs. The tool is applicable to any region, but has so far been demonstrated on cases in Belgium, Suriname, Vietnam and Bangladesh. Time series representing future local-scale precipitation, temperature and potential evapotranspiration (PET) conditions were obtained, starting from time series of historical observations. Uncertainties in the future meteorological conditions are represented in two different ways: through an ensemble of time series, and through a reduced set of synthetic scenarios. Both aim to span the full uncertainty range as assessed from the ensemble of climate model runs and downscaling methods. For Belgium, for instance, use was made of 100-year time series of 10-minute precipitation observations and daily

  9. Validation and uncertainty analysis of a pre-treatment 2D dose prediction model

    Baeza, Jose A.; Wolfs, Cecile J. A.; Nijsten, Sebastiaan M. J. J. G.; Verhaegen, Frank

    2018-02-01

    Independent verification of complex treatment delivery with megavolt photon beam radiotherapy (RT) has been effectively used to detect and prevent errors. This work presents the validation and uncertainty analysis of a model that predicts 2D portal dose images (PDIs) without a patient or phantom in the beam. The prediction model is based on an exponential point dose model with separable primary and secondary photon fluence components. The model includes a scatter kernel, off-axis ratio map, transmission values and penumbra kernels for beam-delimiting components. These parameters were derived through a model fitting procedure supplied with point dose and dose profile measurements of radiation fields. The model was validated against a treatment planning system (TPS; Eclipse) and radiochromic film measurements for complex clinical scenarios, including volumetric modulated arc therapy (VMAT). Confidence limits on the fitted model parameters were calculated based on simulated measurements. A sensitivity analysis was performed to evaluate the effect of the parameter uncertainties on the model output. For the maximum uncertainty, the maximum deviating measurement sets were propagated through the fitting procedure and the model. The overall uncertainty was assessed using all simulated measurements. The validation of the prediction model against the TPS and the film showed good agreement, with on average 90.8% and 90.5% of pixels passing a (2%, 2 mm) global gamma analysis respectively, with a low-dose threshold of 10%. The maximum and overall uncertainties of the model depend on the type of clinical plan used as input. The results can be used to study the robustness of the model. A model for predicting accurate 2D pre-treatment PDIs in complex RT scenarios can be used clinically and its uncertainties can be taken into account.
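
    For readers unfamiliar with the (2%, 2 mm) criterion: a global gamma analysis accepts a reference pixel if some nearby evaluated dose point is close in both dose and distance. A brute-force sketch (no sub-pixel interpolation, suitable for small grids only; thresholds as in the paper):

```python
import numpy as np

def gamma_pass_rate(ref, ev, spacing_mm, dose_tol=0.02, dta_mm=2.0, cutoff=0.10):
    """Brute-force global (2%, 2 mm) gamma analysis for 2D dose grids.
    dose_tol is a fraction of the global maximum reference dose; points below
    cutoff * max(ref) are excluded, matching the 10% low-dose threshold."""
    ref, ev = np.asarray(ref, float), np.asarray(ev, float)
    yy, xx = np.meshgrid(*(np.arange(n) * spacing_mm for n in ref.shape),
                         indexing="ij")
    dd = dose_tol * ref.max()
    passed = []
    for i, j in zip(*np.nonzero(ref >= cutoff * ref.max())):
        dist2 = (yy - yy[i, j]) ** 2 + (xx - xx[i, j]) ** 2
        gamma2 = dist2 / dta_mm**2 + (ev - ref[i, j]) ** 2 / dd**2
        passed.append(gamma2.min() <= 1.0)   # gamma <= 1 means the pixel passes
    return float(np.mean(passed))
```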

  10. Scenario analysis to account for photovoltaic generation uncertainty in distribution grid reconfiguration

    Chittur Ramaswamy, Parvathy; Deconinck, Geert; Pillai, Jayakrishnan Radhakrishna

    2013-01-01

    This paper considers hourly reconfiguration of a low-voltage distribution network with the objectives of minimizing power loss and voltage deviation. The uncertainty in photovoltaic (PV) generation, which in turn affects the optimum configuration, is tackled with the help of scenario analysis. … non-dominated solutions, demonstrating their trade-offs. Finally, the best compromise solution can be selected depending on the decision maker's requirement.

  11. Phenomenological uncertainty analysis of containment building pressure load caused by severe accident sequences

    Park, S.Y.; Ahn, K.I.

    2014-01-01

    Highlights: • Phenomenological uncertainty analysis has been applied to level 2 PSA. • The methodology provides an alternative to simple deterministic analyses and sensitivity studies. • A realistic evaluation provides a more complete characterization of risks. • Uncertain parameters of MAAP code for the early containment failure were identified. - Abstract: This paper illustrates an application of a severe accident analysis code, MAAP, to the uncertainty evaluation of early containment failure scenarios employed in the containment event tree (CET) model of a reference plant. An uncertainty analysis of containment pressure behavior during severe accidents has been performed for an optimum assessment of an early containment failure model. The present application is mainly focused on determining an estimate of the containment building pressure load caused by severe accident sequences of a nuclear power plant. Key modeling parameters and phenomenological models employed for the present uncertainty analysis are closely related to the in-vessel hydrogen generation, direct containment heating, and gas combustion. The basic approach of this methodology is to (1) develop severe accident scenarios for which containment pressure loads should be evaluated based on a level 2 PSA, (2) identify severe accident phenomena relevant to an early containment failure, (3) identify the MAAP input parameters, sensitivity coefficients, and modeling options that describe or influence the early containment failure phenomena, (4) prescribe the likelihood descriptions of the potential range of these parameters, and (5) evaluate the code predictions using a number of random combinations of parameter inputs sampled from the likelihood distributions

  12. Communicating uncertainty in cost-benefit analysis: A cognitive psychological perspective

    Mouter, N.; Holleman, M.; Calvert, S.C.; Annema, J.A.

    2013-01-01

    Based on a cognitive psychological theory, this paper aims to improve the communication of uncertainty in Cost-Benefit Analysis. The theory is based on different cognitive-personality and cognitive-social psychological constructs that may help explain individual differences in the processing of

  13. Overview and application of the Model Optimization, Uncertainty, and SEnsitivity Analysis (MOUSE) toolbox

    For several decades, optimization and sensitivity/uncertainty analysis of environmental models has been the subject of extensive research. Although much progress has been made and sophisticated methods developed, the growing complexity of environmental models to represent real-world systems makes it...

  14. Estimates of uncertainties in analysis of positron lifetime spectra for metals

    Eldrup, M.; Huang, Y.M.; McKee, B.T.A.

    1978-01-01

    The effects of uncertainties and errors in various constraints used in the analysis of multi-component lifetime spectra of positrons annihilating in metals containing defects have been investigated in detail using computer-simulated decay spectra and subsequent analysis. It is found that the errors in the fitted values of the main component lifetimes and intensities introduced by incorrect values of the instrumental resolution function and of the source-surface components can easily exceed the statistical uncertainties. The effect of an incorrect resolution function may be reduced by excluding the peak regions of the spectra from the analysis. The influence of using incorrect source-surface components in the analysis may, on the other hand, be reduced by including the peak regions of the spectra. A main conclusion of the work is that extreme caution should be exercised to avoid introducing large errors through the constraints used in the analysis of experimental lifetime data. (orig.)

  15. Uncertainty Determination Methodology, Sampling Maps Generation and Trend Studies with Biomass Thermogravimetric Analysis

    Pazó, Jose A.; Granada, Enrique; Saavedra, Ángeles; Eguía, Pablo; Collazo, Joaquín

    2010-01-01

    This paper investigates a method for the determination of the maximum sampling error and confidence intervals of thermal properties obtained from thermogravimetric analysis (TG analysis) for several lignocellulosic materials (ground olive stone, almond shell, pine pellets and oak pellets), completing previous work of the same authors. A comparison has been made between results of TG analysis and prompt analysis. Levels of uncertainty and errors were obtained, demonstrating that properties evaluated by TG analysis were representative of the overall fuel composition, and no correlation between prompt and TG analysis exists. Additionally, a study of trends and time correlations is indicated. These results are particularly interesting for biomass energy applications. PMID:21152292

  16. Dissertation Defense: Computational Fluid Dynamics Uncertainty Analysis for Payload Fairing Spacecraft Environmental Control Systems

    Groves, Curtis Edward

    2014-01-01

    Spacecraft thermal protection systems are at risk of being damaged due to airflow produced from Environmental Control Systems. There are inherent uncertainties and errors associated with using Computational Fluid Dynamics to predict the airflow field around a spacecraft from the Environmental Control System. This paper describes an approach to quantify the uncertainty in using Computational Fluid Dynamics to predict airflow speeds around an encapsulated spacecraft without the use of test data. Quantifying the uncertainty in analytical predictions is imperative to the success of any simulation-based product. The method could provide an alternative to traditional "validation by test only" mentality. This method could be extended to other disciplines and has potential to provide uncertainty for any numerical simulation, thus lowering the cost of performing these verifications while increasing the confidence in those predictions. Spacecraft requirements can include a maximum airflow speed to protect delicate instruments during ground processing. Computational Fluid Dynamics can be used to verify these requirements; however, the model must be validated by test data. This research includes the following three objectives and methods. Objective one is develop, model, and perform a Computational Fluid Dynamics analysis of three (3) generic, non-proprietary, environmental control systems and spacecraft configurations. Several commercially available and open source solvers have the capability to model the turbulent, highly three-dimensional, incompressible flow regime. The proposed method uses FLUENT, STARCCM+, and OPENFOAM. Objective two is to perform an uncertainty analysis of the Computational Fluid Dynamics model using the methodology found in "Comprehensive Approach to Verification and Validation of Computational Fluid Dynamics Simulations". This method requires three separate grids and solutions, which quantify the error bars around Computational Fluid Dynamics

  18. Verification of the thermal module in the ELESIM code and the associated uncertainty analysis

    Arimescu, V.I.; Williams, A.F.; Klein, M.E.; Richmond, W.R.; Couture, M.

    1997-09-01

    Temperature is a critical parameter in fuel modelling because most of the physical processes that occur in fuel elements during irradiation are thermally activated. The focus of this paper is the temperature distribution calculation used in the computer code ELESIM, developed at AECL to model the steady-state behaviour of CANDU fuel. A validation procedure for fuel codes is described and applied to ELESIM's thermal calculation. The effects of uncertainties in model parameters, such as the uranium dioxide thermal conductivity, and input variables, such as the fuel element linear power, are accounted for through an uncertainty analysis using Response Surface and Monte Carlo techniques.
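
    The Response Surface plus Monte Carlo combination typically fits a cheap surrogate to a small design of code runs and then samples the surrogate instead of the code. A sketch with an invented two-parameter response standing in for ELESIM's temperature calculation:

```python
import numpy as np

rng = np.random.default_rng(3)

# Invented stand-in for the code: fuel temperature vs. conductivity and linear power.
code = lambda k, q: 600.0 + 80.0 * q / k + 5.0 * q**2

# 1) Fit a quadratic response surface to a small design of code runs.
k_runs = rng.uniform(0.9, 1.1, 30)            # thermal conductivity multiplier
q_runs = rng.uniform(0.8, 1.2, 30)            # linear power multiplier
A = np.column_stack([np.ones_like(k_runs), k_runs, q_runs,
                     k_runs**2, q_runs**2, k_runs * q_runs])
coef, *_ = np.linalg.lstsq(A, code(k_runs, q_runs), rcond=None)

# 2) Monte Carlo on the cheap surrogate instead of the code.
k = rng.normal(1.0, 0.03, 100_000)
q = rng.normal(1.0, 0.05, 100_000)
T = np.column_stack([np.ones_like(k), k, q, k**2, q**2, k * q]) @ coef
print(f"95th-percentile temperature: {np.quantile(T, 0.95):.1f}")
```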

  19. Sensitivity Analysis and Uncertainty Quantification for the LAMMPS Molecular Dynamics Simulation Code

    Picard, Richard Roy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Bhat, Kabekode Ghanasham [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-07-18

    We examine sensitivity analysis and uncertainty quantification for molecular dynamics simulation. Extreme (large or small) output values for the LAMMPS code often occur at the boundaries of input regions, and uncertainties in those boundary values are overlooked by common SA methods. Similarly, input values for which code outputs are consistent with calibration data can also occur near boundaries. Upon applying approaches in the literature for imprecise probabilities (IPs), much more realistic results are obtained than for the complacent application of standard SA and code calibration.

  20. Guidelines for uncertainty analysis developed for the participants in the BIOMOVS II study

    Baeverstam, U.; Davis, P.; Garcia-Olivares, A.; Henrich, E.; Koch, J.

    1993-07-01

    This report has been produced to provide guidelines for uncertainty analysis for use by participants in the BIOMOVS II study. It is hoped that others with an interest in modelling contamination in the biosphere will also find it useful. The report has been prepared by members of the Uncertainty and Validation Working Group and has been reviewed by other BIOMOVS II participants. The opinions expressed are those of the authors and should not be taken to represent the views of the BIOMOVS II sponsors or other BIOMOVS II participating organisations.

  2. Uncertainty evaluation in correlated quantities: application to elemental analysis of atmospheric aerosols

    Espinosa, A.; Miranda, J.; Pineda, J. C.

    2010-01-01

    One of the aspects that are frequently overlooked in the evaluation of uncertainty in experimental data is the possibility that the involved quantities are correlated among themselves, due to different causes. An example is the elemental analysis of atmospheric aerosols using techniques like X-ray Fluorescence (XRF) or Particle Induced X-ray Emission (PIXE). In these cases, the measured elemental concentrations are highly correlated, and are then used to obtain information about other variables, such as the contributions from emitting sources related to soil, sulfate, non-soil potassium or organic matter. This work describes, as an example, the method required to evaluate the uncertainty in variables determined from correlated quantities, for a set of atmospheric aerosol samples collected in the Metropolitan Area of the Mexico Valley and analyzed with PIXE. The work is based on the recommendations of the Guide for the Evaluation of Uncertainty published by the International Organization for Standardization. (Author)

  3. Review of best estimate plus uncertainty methods of thermal-hydraulic safety analysis

    Prosek, A.; Mavko, B.

    2003-01-01

    In 1988 the United States Nuclear Regulatory Commission approved the revised rule on the acceptance of emergency core cooling system (ECCS) performance. Since then there has been significant interest in the development of codes and methodologies for best-estimate loss-of-coolant accident (LOCA) analyses. Several new best estimate plus uncertainty (BEPU) methods were developed around the world. The purpose of the paper is to review the developments in the direction of best estimate approaches with uncertainty quantification and to discuss the problems in practical applications of BEPU methods. In general, the licensee methods follow the original methods. The study indicated that uncertainty analysis with random sampling of input parameters and the use of order statistics for desired tolerance limits of output parameters is today a commonly accepted and mature approach. (author)

  4. A polynomial chaos approach to the analysis of vehicle dynamics under uncertainty

    Kewlani, Gaurav; Crawford, Justin; Iagnemma, Karl

    2012-05-01

    The ability of ground vehicles to quickly and accurately analyse their dynamic response to a given input is critical to their safety and efficient autonomous operation. In field conditions, significant uncertainty is associated with terrain and/or vehicle parameter estimates, and this uncertainty must be considered in the analysis of vehicle motion dynamics. Here, polynomial chaos approaches that explicitly consider parametric uncertainty during modelling of vehicle dynamics are presented. They are shown to be computationally more efficient than the standard Monte Carlo scheme, and experimental results compared with the simulation results performed on ANVEL (a vehicle simulator) indicate that the method can be utilised for efficient and accurate prediction of vehicle motion in realistic scenarios.
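
    A one-dimensional sketch of the non-intrusive polynomial chaos idea (the response function is invented; real applications expand over several uncertain terrain and vehicle parameters):

```python
import math
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval

# Expand y = g(x), x ~ N(0, 1), in probabilists' Hermite polynomials He_k,
# which satisfy E[He_j He_k] = k! * delta_jk under the standard normal measure.
g = lambda x: np.sin(x) + 0.5 * x**2           # invented response

order = 6
nodes, weights = hermegauss(order + 1)
weights = weights / weights.sum()              # normalize to a probability measure

# Spectral projection: c_k = E[g(X) He_k(X)] / k!
coeffs = np.array([
    np.sum(weights * g(nodes) * hermeval(nodes, np.eye(order + 1)[k]))
    / math.factorial(k)
    for k in range(order + 1)
])

mean = coeffs[0]                               # E[y]
var = sum(math.factorial(k) * coeffs[k] ** 2 for k in range(1, order + 1))
print(f"PCE mean = {mean:.4f}, variance = {var:.4f}")
```

    Estimating the same moments by plain Monte Carlo to comparable accuracy would require orders of magnitude more model evaluations, which is the kind of efficiency gain reported here.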

  5. The deuteron-radius puzzle is alive: A new analysis of nuclear structure uncertainties

    Hernandez, O. J.; Ekström, A.; Nevo Dinur, N.; Ji, C.; Bacca, S.; Barnea, N.

    2018-03-01

    To shed light on the deuteron radius puzzle we analyze the theoretical uncertainties of the nuclear structure corrections to the Lamb shift in muonic deuterium. We find that the discrepancy between the calculated two-photon exchange correction and the corresponding experimentally inferred value of Pohl et al. [1] remains. The present result is consistent with our previous estimate, although the discrepancy is reduced from 2.6 σ to about 2 σ. The error analysis includes statistical as well as systematic uncertainties stemming from the use of nucleon-nucleon interactions derived from chiral effective field theory at various orders. We therefore conclude that nuclear theory uncertainty is unlikely to be the source of the discrepancy.

  6. Good Modeling Practice for PAT Applications: Propagation of Input Uncertainty and Sensitivity Analysis

    Sin, Gürkan; Gernaey, Krist; Eliasson Lantz, Anna

    2009-01-01

    The uncertainty and sensitivity analysis are evaluated for their usefulness as part of the model-building within Process Analytical Technology applications. A mechanistic model describing a batch cultivation of Streptomyces coelicolor for antibiotic production was used as a case study. The input … compared to the large uncertainty observed in the antibiotic and off-gas CO2 predictions. The output uncertainty was observed to be lower during the exponential growth phase, while higher in the stationary and death phases - meaning the model describes some periods better than others. To understand which … promising for helping to build reliable mechanistic models and to interpret the model outputs properly. These tools form part of good modeling practice, which can contribute to successful PAT applications for increased process understanding, operation and control purposes. © 2009 American Institute…

  7. Accounting for predictive uncertainty in a risk analysis focusing on radiological contamination of groundwater

    Andricevic, R.; Jacobson, R.L.; Daniels, J.I.

    1994-12-01

    This study focuses on the probabilistic travel time approach for predicting the transport of radionuclides by groundwater, considering parameter uncertainty. The principal entity in the presented model is a travel time probability density function (pdf) conditioned on the set of parameters used to describe different transport processes such as advection, dispersion, sorption, and decay. The model is applied to predict the arrival time of radionuclides in groundwater from the Nevada Test Site (NTS) at possible nearby locations of potential human receptors. Because it undergoes no sorption, tritium is found to pose the largest risk. Inclusion of sorption processes indicates that the parameter uncertainty, and especially the negative correlation between the mean velocity and the sorption strength, is instrumental in evaluating the radionuclides' arrival time at the prespecified accessible environment. Our analysis of potential health risks takes into consideration uncertainties in physiological attributes, as well as in committed effective dose and the estimate of physical detriment per unit Committed Effective Dose.

  8. Global Sensitivity Analysis and Estimation of Model Error, Toward Uncertainty Quantification in Scramjet Computations

    Huan, Xun; Safta, Cosmin; Sargsyan, Khachik; Geraci, Gianluca; Eldred, Michael S.; Vane, Zachary P.; Lacaze, Guilhem; Oefelein, Joseph C.; Najm, Habib N.

    2018-03-01

    The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system's stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. These methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.

  9. Analysis of actuator delay and its effect on uncertainty quantification for real-time hybrid simulation

    Chen, Cheng; Xu, Weijie; Guo, Tong; Chen, Kai

    2017-10-01

    Uncertainties in structural properties can result in different responses in hybrid simulations. Quantification of the effect of these uncertainties would enable researchers to estimate the variances of structural responses observed in experiments. This poses challenges for real-time hybrid simulation (RTHS) due to the existence of actuator delay. Polynomial chaos expansion (PCE) projects the model outputs on a basis of orthogonal stochastic polynomials to account for the influence of model uncertainties. In this paper, PCE is utilized to evaluate the effect of actuator delay on the maximum displacement from real-time hybrid simulation of a single-degree-of-freedom (SDOF) structure when accounting for uncertainties in structural properties. The PCE is first applied to RTHS without delay to determine the order of the PCE, the number of sample points, and the method for calculating the coefficients. The PCE is then applied to RTHS with actuator delay. The mean, variance and Sobol indices are compared and discussed to evaluate the effects of actuator delay on uncertainty quantification for RTHS. Results show that the mean and the variance of the maximum displacement increase linearly and exponentially, respectively, with respect to actuator delay. Sensitivity analysis through Sobol indices also indicates that the influence of each single random variable decreases while the coupling effect increases as the actuator delay increases.
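
    A minimal regression-based PCE sketch, under assumed conditions (two standard-normal inputs, probabilists' Hermite basis up to total order 2, and a toy response standing in for the RTHS maximum displacement), shows how the mean, variance and a first-order Sobol index are read directly off the expansion coefficients.

        import numpy as np
        from math import factorial
        from itertools import product
        from numpy.polynomial.hermite_e import hermeval

        def response(xi):                        # toy stand-in for the RTHS output
            k = 1.0 + 0.1 * xi[:, 0]             # stiffness-like random variable
            c = 0.05 + 0.01 * xi[:, 1]           # damping-like random variable
            return 1.0 / (k * (1.0 + 10.0 * c))  # "maximum displacement"

        order = 2
        multi = [m for m in product(range(order + 1), repeat=2) if sum(m) <= order]
        rng = np.random.default_rng(0)
        xi = rng.standard_normal((500, 2))       # sample points for the regression
        y = response(xi)

        def he(n, x):                            # probabilists' Hermite He_n(x)
            return hermeval(x, [0.0] * n + [1.0])

        Psi = np.column_stack([he(i, xi[:, 0]) * he(j, xi[:, 1]) for i, j in multi])
        coef, *_ = np.linalg.lstsq(Psi, y, rcond=None)

        norms = np.array([factorial(i) * factorial(j) for i, j in multi])  # E[Psi_k^2]
        mean = coef[0]
        var = np.sum(coef[1:] ** 2 * norms[1:])          # variance from coefficients
        S1_x1 = sum(c * c * w for c, w, (i, j) in zip(coef, norms, multi)
                    if i > 0 and j == 0) / var           # first-order Sobol index of xi_1
        print(f"mean {mean:.4f}, variance {var:.2e}, S1(xi_1) {S1_x1:.2f}")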

  10. Global Sensitivity Analysis and Estimation of Model Error, Toward Uncertainty Quantification in Scramjet Computations

    Huan, Xun [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Safta, Cosmin [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Sargsyan, Khachik [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Geraci, Gianluca [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Eldred, Michael S. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vane, Zachary P. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Lacaze, Guilhem [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Oefelein, Joseph C. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Najm, Habib N. [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    2018-02-09

    The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system’s stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. Finally, these methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.

  11. Development and application of objective uncertainty measures for nuclear power plant transient analysis [Dissertation 3897]

    Vinai, P

    2007-10-15

    For the development, design and licensing of a nuclear power plant (NPP), a sound safety analysis is necessary to study the diverse physical phenomena involved in the system behaviour under operational and transient conditions. Such studies are based on detailed computer simulations. With the progress achieved in computer technology and the greater availability of experimental and plant data, the use of best-estimate codes for safety evaluations has gained increasing acceptance. The application of best-estimate safety analysis has raised new problems that need to be addressed: it has become more crucial to assess how reliable code predictions are, especially when they need to be compared against safety limits that must not be crossed. It becomes necessary to identify and quantify the various possible sources of uncertainty that affect the reliability of the results. Currently, such uncertainty evaluations are generally based on experts' opinion. In the present research, a novel methodology based on a non-parametric statistical approach has been developed for objective quantification of best-estimate code uncertainties related to the physical models used in the code. The basis is an evaluation of the accuracy of a given physical model, achieved by comparing its predictions with experimental data from an appropriate set of separate-effect tests. The differences between measurements and predictions can be considered stochastically distributed, and thus a statistical approach can be employed. The first step was the development of a procedure for investigating the dependence of a given physical model's accuracy on the experimental conditions. Each separate-effect test effectively provides a random sample of discrepancies between measurements and predictions, corresponding to a location in the state space defined by a certain number of independent system variables. As a consequence, the samples of 'errors', achieved from analysis of the entire

  12. Development and application of objective uncertainty measures for nuclear power plant transient analysis

    Vinai, P.

    2007-10-01

    For the development, design and licensing of a nuclear power plant (NPP), a sound safety analysis is necessary to study the diverse physical phenomena involved in the system behaviour under operational and transient conditions. Such studies are based on detailed computer simulations. With the progress achieved in computer technology and the greater availability of experimental and plant data, the use of best-estimate codes for safety evaluations has gained increasing acceptance. The application of best-estimate safety analysis has raised new problems that need to be addressed: it has become more crucial to assess how reliable code predictions are, especially when they need to be compared against safety limits that must not be crossed. It becomes necessary to identify and quantify the various possible sources of uncertainty that affect the reliability of the results. Currently, such uncertainty evaluations are generally based on experts' opinion. In the present research, a novel methodology based on a non-parametric statistical approach has been developed for objective quantification of best-estimate code uncertainties related to the physical models used in the code. The basis is an evaluation of the accuracy of a given physical model, achieved by comparing its predictions with experimental data from an appropriate set of separate-effect tests. The differences between measurements and predictions can be considered stochastically distributed, and thus a statistical approach can be employed. The first step was the development of a procedure for investigating the dependence of a given physical model's accuracy on the experimental conditions. Each separate-effect test effectively provides a random sample of discrepancies between measurements and predictions, corresponding to a location in the state space defined by a certain number of independent system variables. As a consequence, the samples of 'errors', achieved from analysis of the entire database, are

  13. Uncertainty analysis of one Main Circulation Pump trip event at the Ignalina NPP

    Vileiniskis, V.; Kaliatka, A.; Uspuras, E.

    2004-01-01

    One Main Circulation Pump (MCP) trip event is an anticipated transient with an expected frequency of approximately one event per year. There have been a few events in which one MCP was inadvertently tripped. The throughput of the remaining running pumps in the affected Main Circulation Circuit loop increased; however, the total coolant flow through the affected loop decreased. The main question is whether this coolant flow rate is sufficient for adequate core cooling. This paper presents an investigation of a one-MCP trip event at the Ignalina NPP. According to international practice, the transient analysis should consist of deterministic analysis employing best-estimate codes and uncertainty analysis. For that purpose, the plant's RELAP5 model and the GRS (Germany) System for Uncertainty and Sensitivity Analysis package (SUSA) were employed. Uncertainty analysis of the flow energy loss in different parts of the Main Circulation Circuit, the initial conditions and code-selected models was performed. Such analysis allows estimating the influence of separate parameters on the calculation results and finding the modelling parameters that have the largest impact on the event studied. On the basis of this analysis, recommendations for further improvement of the model have been developed. (author)

  14. Uncertainty analysis for dynamic properties of MEMS resonator supported by fuzzy arithmetics

    A. Martowicz

    2016-04-01

    In the paper, the application of uncertainty analysis performed for a microelectromechanical resonator is presented. The main objective of the undertaken analysis is to assess the propagation of the considered uncertainties into the variation of chosen dynamic characteristics of the Finite Element model of the microresonator. Many different model parameters have been assumed to be uncertain: geometry and material properties. Apart from total uncertainty propagation, sensitivity analysis has been carried out to study the separate influences of all uncertain input characteristics. The uncertainty analysis has been performed by means of fuzzy arithmetic, in which the alpha-cut strategy has been applied to assemble the output fuzzy number. Monte Carlo simulation and genetic algorithms have been employed to calculate the intervals connected with each alpha-cut of the searched fuzzy number. The elaborated model of the microresonator has taken into account, in a simplified way, the presence of the surrounding air and a constant electrostatic field.
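
    The alpha-cut strategy is easy to sketch: at each membership level the fuzzy inputs reduce to intervals, and the output interval is found by searching over the input box (the paper uses Monte Carlo and genetic algorithms for that inner search; plain random search is used below). The fuzzy numbers and the response function are assumed for illustration.

        import numpy as np

        rng = np.random.default_rng(2)

        def eigenfreq(E, rho):                 # toy stand-in for the FE model output
            return np.sqrt(E / rho) / (2 * np.pi)

        # Triangular fuzzy parameters: (left, peak, right), values assumed
        E_tri = (150e9, 160e9, 170e9)          # Young's modulus [Pa]
        rho_tri = (2200.0, 2330.0, 2450.0)     # density [kg/m3]

        def cut(tri, alpha):                   # input interval at membership alpha
            l, m, r = tri
            return l + alpha * (m - l), r - alpha * (r - m)

        for alpha in np.linspace(0.0, 1.0, 5):
            (El, Er), (rl, rr) = cut(E_tri, alpha), cut(rho_tri, alpha)
            E = rng.uniform(El, Er, 5000)      # random search over the alpha-box
            rho = rng.uniform(rl, rr, 5000)
            f = eigenfreq(E, rho)
            print(f"alpha={alpha:.2f}: output interval [{f.min():.1f}, {f.max():.1f}] (a.u.)")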

  15. Tolerance analysis in manufacturing using process capability ratio with measurement uncertainty

    Mahshid, Rasoul; Mansourvar, Zahra; Hansen, Hans Nørgaard

    2017-01-01

    Tolerance analysis provides valuable information regarding the performance of a manufacturing process. It allows determining the maximum possible variation of a quality feature in production. Previous research has focused on the application of tolerance analysis to the design of mechanical assemblies. In this paper, a new statistical analysis was applied to manufactured products to assess achieved tolerances when the process is known, using the capability ratio and expanded uncertainty. The analysis has benefits for process planning, determining actual precision limits, process optimization, and troubleshooting a malfunctioning existing part. The capability measure is based on a number of measurements performed on the part's quality variable. Since the ratio relies on measurements, any measurement error that is not eliminated has a notable negative impact on the results. Therefore, measurement uncertainty was used in combination with process...
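
    One common, conservative way of folding an expanded measurement uncertainty U into a capability ratio is to shrink the specification band by U on each side before computing the index; the sketch below assumes that convention together with invented measurements and limits.

        import numpy as np

        x = np.array([10.02, 9.98, 10.01, 10.03, 9.97, 10.00, 10.02, 9.99])  # mm
        LSL, USL = 9.90, 10.10        # specification limits (assumed)
        U = 0.02                      # expanded uncertainty, k = 2 (assumed)

        mu, s = x.mean(), x.std(ddof=1)
        Cpk = min(USL - mu, mu - LSL) / (3 * s)
        Cpk_u = min((USL - U) - mu, mu - (LSL + U)) / (3 * s)  # uncertainty-adjusted
        print(f"Cpk = {Cpk:.2f}, uncertainty-adjusted Cpk = {Cpk_u:.2f}")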

  16. Estimates of Uncertainties in Analysis of Positron Lifetime Spectra for Metals

    Eldrup, Morten Mostgaard; Huang, Y. M.; McKee, B. T. A.

    1978-01-01

    The effects of uncertainties and errors in various constraints used in the analysis of multi-component lifetime spectra of positrons annihilating in metals containing defects have been investigated in detail using computer-simulated decay spectra and subsequent analysis. It is found that the errors in the fitted values of the main components' lifetimes and intensities introduced from incorrect values of the instrumental resolution function and of the source-surface components can easily exceed the statistical uncertainties. The effect of an incorrect resolution function may be reduced by excluding the peak regions of the spectra from the analysis. The influence of using incorrect source-surface components in the analysis may, on the other hand, be reduced by including the peak regions of the spectra. A main conclusion of the work is that extreme caution should be exercised to avoid...

  17. Uncertainty and sensitivity analysis methodology in a level-I PSA (Probabilistic Safety Assessment)

    Nunez McLeod, J.E.; Rivera, S.S.

    1997-01-01

    This work presents a methodology for sensitivity and uncertainty analysis applicable to a level-I probabilistic safety assessment. The work covers: correct association of distributions to parameters, importance and qualification of expert opinions, generation of samples according to sample sizes, and study of the relationships among system variables and system response. A series of statistical-mathematical techniques are recommended along the development of the analysis methodology, as well as different graphical visualizations for the control of the study. (author)

  18. Petroleum taxation under uncertainty: contingent claims analysis with an application to Norway

    Lund, D.

    1992-01-01

    Contingent claims analysis provides a useful tool for analysing the value of tax claims, and of companies' after-tax values, under uncertainty. The method is presented and applied to the analysis of how Norwegian petroleum taxes affect company behaviour. Few results can be derived analytically. A numerical approach is suggested, with a stylized description of production possibilities. The Norwegian taxes are found to have strong distortionary effects. The relation to other methods and problems connected with the application are discussed. (Author)

  19. Uncertainty Analysis of Power Grid Investment Capacity Based on Monte Carlo

    Qin, Junsong; Liu, Bingyi; Niu, Dongxiao

    By analyzing the factors that influence the investment capacity of the power grid, an investment capacity analysis model is built with depreciation cost, sales price and sales quantity, net profit, financing, and the GDP of the secondary industry as variables. After carrying out Kolmogorov-Smirnov tests, the probability distribution of each influence factor is obtained. Finally, the uncertainty analysis results for grid investment capacity are obtained by Monte Carlo simulation.
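
    The workflow reads directly as a short script: fit a distribution to the history of each influence factor, check the fit with a Kolmogorov-Smirnov test, then run the capacity model by Monte Carlo. All data and the capacity model below are invented for illustration.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)
        hist = rng.normal(120.0, 15.0, 30)        # stand-in history of net profit

        mu, sigma = stats.norm.fit(hist)          # fit a candidate distribution
        ks = stats.kstest(hist, "norm", args=(mu, sigma))
        print(f"K-S p-value: {ks.pvalue:.2f}")    # large p: do not reject the fit
        # (strictly, fitting and testing on the same data calls for a
        # Lilliefors-type correction; ignored here for brevity)

        n = 100_000
        net_profit = rng.normal(mu, sigma, n)
        financing = rng.triangular(40.0, 60.0, 90.0, n)   # assumed distribution
        depreciation = rng.normal(30.0, 4.0, n)           # assumed distribution
        capacity = net_profit + financing + depreciation  # toy capacity model
        print("5/50/95 percentiles:", np.percentile(capacity, [5, 50, 95]))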

  20. Cross-section sensitivity and uncertainty analysis of the FNG copper benchmark experiment

    Kodeli, I., E-mail: ivan.kodeli@ijs.si [Jožef Stefan Institute, Jamova 39, SI-1000 Ljubljana (Slovenia); Kondo, K. [Karlsruhe Institute of Technology, Postfach 3640, D-76021 Karlsruhe (Germany); Japan Atomic Energy Agency, Rokkasho-mura (Japan); Perel, R.L. [Racah Institute of Physics, Hebrew University of Jerusalem, IL-91904 Jerusalem (Israel); Fischer, U. [Karlsruhe Institute of Technology, Postfach 3640, D-76021 Karlsruhe (Germany)

    2016-11-01

    A neutronics benchmark experiment on a copper assembly was performed from the end of 2014 to the beginning of 2015 at the 14-MeV Frascati neutron generator (FNG) of ENEA Frascati, with the objective of providing the experimental database required for the validation of the copper nuclear data relevant for ITER design calculations, including the related uncertainties. The paper presents the pre- and post-analysis of the experiment performed using cross-section sensitivity and uncertainty codes, both deterministic (SUSD3D) and Monte Carlo (MCSEN5). Cumulative reaction rates and neutron flux spectra, their sensitivity to the cross sections, as well as the corresponding uncertainties were estimated for different selected detector positions up to ∼58 cm in the copper assembly. In the pre-analysis phase this permitted optimizing the geometry, the detector positions and the choice of activation reactions, and in the post-analysis phase interpreting the results of the measurements and the calculations, drawing conclusions on the quality of the relevant nuclear cross-section data, and estimating the uncertainties in the calculated nuclear responses and fluxes. Large uncertainties in the calculated reaction rates and neutron spectra of up to 50%, rarely observed at this level in benchmark analyses using today's nuclear data, were predicted, particularly for fast reactions. Observed C/E (dis)agreements with values as low as 0.5 partly confirm these predictions. Benchmark results are therefore expected to contribute to the improvement of both cross-section and covariance data evaluations.

  1. Analysis of uncertainties in the estimates of nitrous oxide and methane emissions in the UK's greenhouse gas inventory for agriculture

    Milne, Alice E.; Glendining, Margaret J.; Bellamy, Pat; Misselbrook, Tom; Gilhespy, Sarah; Rivas Casado, Monica; Hulin, Adele; van Oijen, Marcel; Whitmore, Andrew P.

    2014-01-01

    The UK's greenhouse gas inventory for agriculture uses a model based on the IPCC Tier 1 and Tier 2 methods to estimate the emissions of methane and nitrous oxide from agriculture. The inventory calculations are disaggregated at country level (England, Wales, Scotland and Northern Ireland). Until now, no detailed assessment of the uncertainties in the estimates of emissions had been done. We used Monte Carlo simulation to carry out such an analysis. We collated information on the uncertainties of each of the model inputs. The uncertainties propagate through the model and result in uncertainties in the estimated emissions. Using a sensitivity analysis, we found that in England and Scotland the uncertainty in the emission factor for emissions from N inputs (EF1) affected uncertainty the most, but that in Wales and Northern Ireland, the emission factor for N leaching and runoff (EF5) had greater influence. We showed that if the uncertainty in any one of these emission factors is reduced by 50%, the uncertainty in emissions of nitrous oxide reduces by 10%. The uncertainties in the methane emission factors for enteric fermentation in cows and sheep most affected the uncertainty in methane emissions. When inventories are disaggregated (as that for the UK is), correlation between separate instances of each emission factor will affect the uncertainty in emissions. As more countries move towards inventory models with disaggregation, it is important that the IPCC give firm guidance on this topic.
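
    The kind of experiment behind the "reduce the uncertainty in an emission factor by 50%" finding can be sketched as follows. The activity datum, the log-normal form and its spread are assumed here (only the EF1 default of 0.01 kg N2O-N per kg N applied is an IPCC Tier 1 value), so the printed numbers illustrate the workflow rather than reproduce the inventory's results.

        import numpy as np

        rng = np.random.default_rng(4)
        n = 200_000
        N_applied = 1.0e6                  # kg N applied per year (assumed)

        def sample_EF1(scale):             # log-normal around the default of 0.01
            sd = 0.5 * scale               # log-scale spread; scale=0.5 halves it
            return 0.01 * np.exp(rng.normal(0.0, sd, n) - sd**2 / 2)

        for scale, label in [(1.0, "baseline"), (0.5, "EF1 uncertainty halved")]:
            N2O = N_applied * sample_EF1(scale) * 44.0 / 28.0  # kg N2O-N -> kg N2O
            print(f"{label}: CV of N2O emissions = {N2O.std() / N2O.mean():.0%}")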

  2. PAPIRUS, a parallel computing framework for sensitivity analysis, uncertainty propagation, and estimation of parameter distribution

    Heo, Jaeseok; Kim, Kyung Doo

    2015-01-01

    Highlights: • We developed an interface between an engineering simulation code and statistical analysis software. • Multiple packages of the sensitivity analysis, uncertainty quantification, and parameter estimation algorithms are implemented in the framework. • Parallel computing algorithms are also implemented in the framework to solve multiple computational problems simultaneously. - Abstract: This paper introduces a statistical data analysis toolkit, PAPIRUS, designed to perform model calibration, uncertainty propagation, Chi-square linearity tests, and sensitivity analysis for both linear and nonlinear problems. PAPIRUS was developed by implementing multiple packages of methodologies and building an interface between an engineering simulation code and the statistical analysis algorithms. A parallel computing framework is implemented in PAPIRUS with multiple computing resources and proper communication between the server and the clients of each processor. It was shown that even though a large amount of data is considered for the engineering calculation, the distributions of the model parameters and the calculation results can be quantified accurately with significant reductions in computational effort. A general description of PAPIRUS with a graphical user interface is presented in Section 2. Sections 2.1–2.5 present the methodologies of data assimilation, uncertainty propagation, Chi-square linearity testing, and sensitivity analysis implemented in the toolkit, with some results obtained by each module of the software. Parallel computing algorithms adopted in the framework to solve multiple computational problems simultaneously are also summarized in the paper.

  3. Data uncertainties in material flow analysis: Municipal solid waste management system in Maputo City, Mozambique.

    Dos Muchangos, Leticia Sarmento; Tokai, Akihiro; Hanashima, Atsuko

    2017-01-01

    Material flow analysis can effectively trace and quantify the flows and stocks of materials such as solid wastes in urban environments. However, the integrity of material flow analysis results is compromised by data uncertainties, an occurrence that is particularly acute in low- and middle-income study contexts. This article investigates the uncertainties in the input data and their effects in a material flow analysis study of municipal solid waste management in Maputo City, the capital of Mozambique. The analysis is based on data collected in 2007 and 2014. Initially, the uncertainties and their ranges were identified by the data classification model of Hedbrant and Sörme, followed by the application of sensitivity analysis. The average lower and upper bounds were 29% and 71%, respectively, in 2007, increasing to 41% and 96%, respectively, in 2014. This indicates higher data quality in 2007 than in 2014. Results also show not only that data are partially missing from the established flows, such as waste generation to final disposal, but also that they are limited and inconsistent for emerging flows and processes such as waste generation to material recovery (hence the wider variation in the 2014 parameters). The sensitivity analysis further clarified the most influential parameter, the degree of influence of each parameter on the waste flows, and the interrelations among the parameters. The findings highlight the need for an integrated municipal solid waste management approach to avoid transferring or worsening the negative impacts among the parameters and flows.

  4. PAPIRUS, a parallel computing framework for sensitivity analysis, uncertainty propagation, and estimation of parameter distribution

    Heo, Jaeseok, E-mail: jheo@kaeri.re.kr; Kim, Kyung Doo, E-mail: kdkim@kaeri.re.kr

    2015-10-15

    Highlights: • We developed an interface between an engineering simulation code and statistical analysis software. • Multiple packages of the sensitivity analysis, uncertainty quantification, and parameter estimation algorithms are implemented in the framework. • Parallel computing algorithms are also implemented in the framework to solve multiple computational problems simultaneously. - Abstract: This paper introduces a statistical data analysis toolkit, PAPIRUS, designed to perform model calibration, uncertainty propagation, Chi-square linearity tests, and sensitivity analysis for both linear and nonlinear problems. PAPIRUS was developed by implementing multiple packages of methodologies and building an interface between an engineering simulation code and the statistical analysis algorithms. A parallel computing framework is implemented in PAPIRUS with multiple computing resources and proper communication between the server and the clients of each processor. It was shown that even though a large amount of data is considered for the engineering calculation, the distributions of the model parameters and the calculation results can be quantified accurately with significant reductions in computational effort. A general description of PAPIRUS with a graphical user interface is presented in Section 2. Sections 2.1–2.5 present the methodologies of data assimilation, uncertainty propagation, Chi-square linearity testing, and sensitivity analysis implemented in the toolkit, with some results obtained by each module of the software. Parallel computing algorithms adopted in the framework to solve multiple computational problems simultaneously are also summarized in the paper.

  5. Uncertainty analysis methods for quantification of source terms using a large computer code

    Han, Seok Jung

    1997-02-01

    Quantification of uncertainties in source term estimations by a large computer code, such as MELCOR or MAAP, is an essential part of current probabilistic safety assessments (PSAs). The main objectives of the present study are (1) to investigate the applicability of a combined procedure of the response surface method (RSM), based on input determined from a statistical design, and the Latin hypercube sampling (LHS) technique for the uncertainty analysis of CsI release fractions under a hypothetical severe accident sequence of a station blackout at the Young-Gwang nuclear power plant, using the MAAP3.0B code as a benchmark problem; and (2) to propose a new measure of uncertainty importance based on distributional sensitivity analysis. On the basis of the results obtained in the present work, the RSM is recommended as the principal tool for an overall uncertainty analysis in source term quantification, while the LHS is used in the calculation of standardized regression coefficients (SRCs) and standardized rank regression coefficients (SRRCs) to determine the subset of the most important input parameters in the final screening step and to check the cumulative distribution functions (cdfs) obtained by the RSM. Verification of the response surface model for sufficient accuracy is a prerequisite for the reliability of the final results obtained by the combined procedure proposed in the present work. In the present study a new measure has been developed that utilizes the metric distance obtained from the cdfs. The measure has been evaluated for three different cases of distributions in order to assess its characteristics: in the first two cases the distributions are known analytical distributions, while in the third the distribution is unknown. The first case is given by symmetric analytical distributions; the second consists of two asymmetric distributions whose skewness is non-zero.
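
    The screening step described above (LHS followed by rank regression) can be sketched in a few lines; the response function stands in for the source term code, and the standardized rank regression coefficients fall out of a least-squares fit on rank-transformed data.

        import numpy as np
        from scipy.stats import qmc, rankdata

        def release_fraction(x):              # toy stand-in for the code response
            return 0.2 * x[:, 0] + 0.05 * x[:, 1] ** 3 + 0.01 * x[:, 2]

        sampler = qmc.LatinHypercube(d=3, seed=5)
        X = qmc.scale(sampler.random(200), [0, 0, 0], [1, 1, 1])
        y = release_fraction(X)

        # Rank-transform, standardize, regress: the coefficients are the SRRCs.
        Xr = np.column_stack([rankdata(X[:, j]) for j in range(X.shape[1])])
        yr = rankdata(y)
        Xs = (Xr - Xr.mean(0)) / Xr.std(0)
        ys = (yr - yr.mean()) / yr.std()
        srrc, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
        print("SRRCs:", np.round(srrc, 2))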

  6. An analysis of combined standard uncertainty for radiochemical measurements of environmental samples

    Berne, A.

    1996-01-01

    It is anticipated that future data acquisitions intended for use in radiological risk assessments will require the incorporation of uncertainty analysis. Often, only one aliquot of the sample is taken and a single determination is made. Under these circumstances, the total uncertainty is calculated using the 'propagation of errors' approach. However, there is no agreement in the radioanalytical community as to the exact equations to use. The Quality Assurance/Metrology Division of the Environmental Measurements Laboratory has developed a systematic process to compute uncertainties in constituent components of the analytical procedure, as well as the combined standard uncertainty (CSU). The equations for computation are presented here, with examples of their use. They have also been incorporated into a code for use in the spreadsheet application QuattroPro™. Using the spreadsheet with appropriate inputs permits an analysis of the variations in the CSU as a function of several different variables. The relative importance of the 'counting uncertainty' can also be ascertained.
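
    A minimal CSU budget in that spirit combines independent relative uncertainty components in quadrature; the component values below are assumed for illustration, not EML's.

        import numpy as np

        activity = 12.4                    # Bq per aliquot (assumed measurement)
        components = {                     # relative standard uncertainties (assumed)
            "counting statistics": 0.040,
            "chemical yield":      0.025,
            "detector efficiency": 0.020,
            "aliquot mass":        0.005,
        }
        csu_rel = np.sqrt(sum(u ** 2 for u in components.values()))
        print(f"CSU = {csu_rel:.1%}, i.e. {activity * csu_rel:.2f} Bq on {activity} Bq")
        for name, u in components.items():  # share of each component in the variance
            print(f"  {name}: {u ** 2 / csu_rel ** 2:.0%} of the combined variance")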

  7. SWEPP PAN assay system uncertainty analysis: Active mode measurements of solidified aqueous sludge waste

    Blackwood, L.G.; Harker, Y.D.; Meachum, T.R.

    1997-12-01

    The Idaho National Engineering and Environmental Laboratory is being used as a temporary storage facility for transuranic waste generated by the US Nuclear Weapons program at the Rocky Flats Plant (RFP) in Golden, Colorado. Currently, there is a large effort in progress to prepare to ship this waste to the Waste Isolation Pilot Plant (WIPP) in Carlsbad, New Mexico. In order to meet the TRU Waste Characterization Quality Assurance Program Plan nondestructive assay compliance requirements and quality assurance objectives, it is necessary to determine the total uncertainty of the radioassay results produced by the Stored Waste Examination Pilot Plant (SWEPP) Passive Active Neutron (PAN) radioassay system. This paper is one of a series of reports quantifying the results of the uncertainty analysis of the PAN system measurements for specific waste types and measurement modes. In particular this report covers active mode measurements of weapons grade plutonium-contaminated aqueous sludge waste contained in 208 liter drums (item description codes 1, 2, 7, 800, 803, and 807). Results of the uncertainty analysis for PAN active mode measurements of aqueous sludge indicate that a bias correction multiplier of 1.55 should be applied to the PAN aqueous sludge measurements. With the bias correction, the uncertainty bounds on the expected bias are 0 ± 27%. These bounds meet the Quality Assurance Program Plan requirements for radioassay systems

  8. Operational Implementation of a Pc Uncertainty Construct for Conjunction Assessment Risk Analysis

    Newman, Lauri K.; Hejduk, Matthew D.; Johnson, Lauren C.

    2016-01-01

    Earlier this year the NASA Conjunction Assessment and Risk Analysis (CARA) project presented the theoretical and algorithmic aspects of a method to include the uncertainties in the calculation inputs when computing the probability of collision (Pc) between two space objects, principally the uncertainties in the covariances and the hard-body radius. Rather than a single Pc value, this calculation approach produces an entire probability density function representing the range of possible Pc values given the uncertainties in the inputs, bringing CA risk analysis methodologies more in line with modern risk management theory. The present study provides results from exercising this method against an extended dataset of satellite conjunctions in order to determine the effect of its use on the evaluation of conjunction assessment (CA) event risk posture. The effects are found to be considerable: a good number of events are downgraded from, or upgraded to, a serious risk designation on the basis of the Pc uncertainty. The findings counsel the integration of the developed methods into NASA CA operations.

  9. Artificial intelligence metamodel comparison and application to wind turbine airfoil uncertainty analysis

    Yaping Ju

    2016-05-01

    The Monte Carlo simulation method for turbomachinery uncertainty analysis often requires performing a huge number of simulations, the computational cost of which can be greatly alleviated with the help of metamodeling techniques. An intensive comparative study was performed on the approximation performance of three prospective artificial intelligence metamodels: artificial neural network, radial basis function, and support vector regression. The genetic algorithm was used to optimize the predetermined parameters of each metamodel for the sake of a fair comparison. Through testing on 10 nonlinear functions with different problem scales and sample sizes, the genetic algorithm–support vector regression metamodel was found to be more accurate and robust than the other two counterparts. Accordingly, the genetic algorithm–support vector regression metamodel was selected and combined with the Monte Carlo simulation method for the uncertainty analysis of a wind turbine airfoil under two types of surface roughness uncertainties. The results show that the genetic algorithm–support vector regression metamodel can capture well the uncertainty propagation from the surface roughness to the airfoil aerodynamic performance. This work is useful to the application of metamodeling techniques in the robust design optimization of turbomachinery.
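
    The surrogate-plus-Monte-Carlo idea can be sketched with scikit-learn's SVR; for brevity the hyperparameters are tuned by grid search rather than a genetic algorithm, and the "solver" is a toy lift-coefficient function, so everything here is an assumed stand-in for the paper's setup.

        import numpy as np
        from sklearn.svm import SVR
        from sklearn.model_selection import GridSearchCV

        def cfd(roughness):                  # toy stand-in for the airfoil solver
            return 1.2 - 3.0 * roughness + 8.0 * roughness ** 2

        rng = np.random.default_rng(6)
        X_train = rng.uniform(0.0, 0.2, (40, 1))               # a few expensive runs
        y_train = cfd(X_train[:, 0]) + rng.normal(0, 0.005, 40)

        grid = {"C": [1, 10, 100], "epsilon": [0.001, 0.01], "gamma": ["scale", 1.0]}
        svr = GridSearchCV(SVR(kernel="rbf"), grid, cv=5).fit(X_train, y_train)

        roughness = rng.lognormal(np.log(0.05), 0.4, 100_000).reshape(-1, 1)
        cl = svr.predict(roughness)          # cheap Monte Carlo on the surrogate
        print(f"lift coefficient: mean {cl.mean():.3f}, std {cl.std():.3f}")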

  10. Analysis of uncertainties caused by the atmospheric dispersion model in accident consequence assessments with UFOMOD

    Fischer, F.; Ehrhardt, J.

    1988-06-01

    Various techniques available for uncertainty analysis of large computer models are applied, described and selected as most appropriate for analyzing the uncertainty in the predictions of accident consequence assessments. The investigation refers to the atmospheric dispersion and deposition submodel (straight-line Gaussian plume model) of UFOMOD, whose most important input variables and parameters are linked with probability distributions derived from expert judgement. Uncertainty bands show how much variability exists, sensitivity measures determine what causes this variability in consequences. Results are presented as confidence bounds of complementary cumulative frequency distributions (CCFDs) of activity concentrations, organ doses and health effects, partially as a function of distance from the site. In addition the ranked influence of the uncertain parameters on the different consequence types is shown. For the estimation of confidence bounds it was sufficient to choose a model parameter sample size of n (n=59) equal to 1.5 times the number of uncertain model parameters. Different samples or an increase of sample size did not change the 5%-95% confidence bands. To get statistically stable results of the sensitivity analysis, larger sample sizes are needed (n=100, 200). Random or Latin-hypercube sampling schemes as tools for uncertainty and sensitivity analyses led to comparable results. (orig.)

  11. Effect of the sample matrix on measurement uncertainty in X-ray fluorescence analysis

    Morgenstern, P.; Brueggemann, L.; Wennrich, R.

    2005-01-01

    The estimation of measurement uncertainty, with reference to univariate calibration functions, is discussed in detail in the Eurachem Guide 'Quantifying Uncertainty in Analytical Measurement'. The adoption of these recommendations in quantitative X-ray fluorescence analysis (XRF) involves basic problems which are above all due to the strong influence of the sample matrix on the analytical response. In XRF analysis, the proposed recommendations are consequently applicable only to the matrix-corrected response. The application is also restricted with regard to both the matrices and the analyte concentrations. In this context, the present studies are aimed at the problem of predicting measurement uncertainty for more variable sample compositions. The corresponding investigations are focused on the use of the intensity of the Compton-scattered tube line as an internal standard to assess the effect of the individual sample matrix on the analytical response relative to a reference matrix. Based on this concept, the measurement uncertainty of an analyte present in an unknown specimen can be predicted in consideration of the data obtained under defined matrix conditions.

  12. Uncertainty theory

    Liu, Baoding

    2015-01-01

    When no samples are available to estimate a probability distribution, we have to invite some domain experts to evaluate the belief degree that each event will happen. Perhaps some people think that the belief degree should be modeled by subjective probability or fuzzy set theory. However, it is usually inappropriate because both of them may lead to counterintuitive results in this case. In order to rationally deal with belief degrees, uncertainty theory was founded in 2007 and subsequently studied by many researchers. Nowadays, uncertainty theory has become a branch of axiomatic mathematics for modeling belief degrees. This is an introductory textbook on uncertainty theory, uncertain programming, uncertain statistics, uncertain risk analysis, uncertain reliability analysis, uncertain set, uncertain logic, uncertain inference, uncertain process, uncertain calculus, and uncertain differential equation. This textbook also shows applications of uncertainty theory to scheduling, logistics, networks, data mining, c...

  13. SENSIT: a cross-section and design sensitivity and uncertainty analysis code. [In FORTRAN for CDC-7600, IBM 360]

    Gerstl, S.A.W.

    1980-01-01

    SENSIT computes the sensitivity and uncertainty of a calculated integral response (such as a dose rate) due to input cross sections and their uncertainties. Sensitivity profiles are computed for neutron and gamma-ray reaction cross sections of standard multigroup cross section sets and for secondary energy distributions (SEDs) of multigroup scattering matrices. In the design sensitivity mode, SENSIT computes changes in an integral response due to design changes and gives the appropriate sensitivity coefficients. Cross section uncertainty analyses are performed for three types of input data uncertainties: cross-section covariance matrices for pairs of multigroup reaction cross sections, spectral shape uncertainty parameters for secondary energy distributions (integral SED uncertainties), and covariance matrices for energy-dependent response functions. For all three types of data uncertainties SENSIT computes the resulting variance and estimated standard deviation in an integral response of interest, on the basis of generalized perturbation theory. SENSIT attempts to be more comprehensive than earlier sensitivity analysis codes, such as SWANLAKE.
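
    At the heart of such codes is the "sandwich rule": for a sensitivity vector s and a cross-section covariance matrix C, the response variance is var(R) = s^T C s. The three-group numbers below are assumed purely for illustration.

        import numpy as np

        s = np.array([0.8, -0.3, 0.1])        # relative sensitivities (assumed)
        C = np.array([[0.04, 0.01, 0.00],     # relative covariance matrix (assumed)
                      [0.01, 0.09, 0.02],
                      [0.00, 0.02, 0.16]])
        var = s @ C @ s                       # sandwich rule
        print(f"relative standard deviation of the response: {np.sqrt(var):.1%}")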

  14. Sensitivity and Uncertainty Analysis of IAEA CRP HTGR Benchmark Using McCARD

    Jang, Sang Hoon; Shim, Hyung Jin

    2016-01-01

    The benchmark consists of 4 phases, starting from local standalone modeling (Phase I) and ending with the safety calculation of the coupled system under transient conditions (Phase IV). As a preliminary study of UAM on HTGRs, this paper covers exercises 1 and 2 of Phase I, which define the unit cell and lattice geometry of the MHTGR-350 (General Atomics). The objective of these exercises is to quantify the uncertainty of the multiplication factor induced by perturbing nuclear data, as well as to analyze specific features of HTGRs such as double heterogeneity and the self-shielding treatment. The uncertainty quantification of the IAEA CRP HTGR UAM benchmarks was conducted using the first-order AWP method in McCARD. Uncertainty of the multiplication factor was estimated only for microscopic cross-section perturbations. To reduce the computation time and avoid memory shortage, the recently implemented uncertainty analysis module in the MC Wielandt calculation was used. The cross-section covariance data were generated by the NJOY/ERRORR module with ENDF/B-VII.1. The IAEA CRP HTGR UAM benchmark problems were analyzed using McCARD, and the numerical results were compared with Serpent for the eigenvalue calculation and with the DeCART/MUSAD code system developed by KAERI for the S/U analysis. In the eigenvalue calculation, inconsistencies were found in the results with the ENDF/B-VII.1 cross-section library; these were traced to the thermal scattering data of graphite. As to the S/U analysis, McCARD results matched well with DeCART/MUSAD, but showed some discrepancy in 238U capture regarding the implicit uncertainty.

  15. Uncertainty analysis in comparative NAA applied to geological and biological matrices

    Zahn, Guilherme S.; Ticianelli, Regina B.; Lange, Camila N.; Favaro, Deborah I.T.; Figueiredo, Ana M.G., E-mail: gzahn@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2015-07-01

    Comparative nuclear activation analysis is a multielemental primary analytical technique that may be used in a rather broad spectrum of matrices with minimal-to-no sample preprocessing. Although the total activation of a chemical element in a sample depends on a rather large set of parameters, when the sample is irradiated together with a well-known comparator, most of these parameters cancel out and the concentration of that element can be determined simply by using the activities and masses of the comparator and the sample, the concentration of the chemical element in the comparator, the half-life of the formed radionuclide, and the time between counting the sample and the comparator. This simplification greatly reduces not only the calculations required, but also the uncertainty associated with the measurement; nevertheless, a cautious analysis must be carried out in order to make sure all relevant uncertainties are properly treated, so that the final result can be as representative of the measurement as possible. In this work, this analysis was performed for geological matrices, where the concentrations of the nuclides of interest are rather high, but so are the density and average atomic number of the sample, as well as for a biological matrix, in order to allow for a comparison. The results show that the largest part of the uncertainty comes from the activity measurements and from the concentration of the comparator, and that while the influence of time-related terms in the final uncertainty can be safely neglected, the uncertainty in the masses may be relevant under specific circumstances. (author)

  16. Uncertainty analysis in comparative NAA applied to geological and biological matrices

    Zahn, Guilherme S.; Ticianelli, Regina B.; Lange, Camila N.; Favaro, Deborah I.T.; Figueiredo, Ana M.G.

    2015-01-01

    Comparative nuclear activation analysis is a multielemental primary analytical technique that may be used in a rather broad spectrum of matrices with minimal-to-no sample preprocessing. Although the total activation of a chemical element in a sample depends on a rather large set of parameters, when the sample is irradiated together with a well-known comparator, most of these parameters cancel out and the concentration of that element can be determined simply by using the activities and masses of the comparator and the sample, the concentration of the chemical element in the comparator, the half-life of the formed radionuclide, and the time between counting the sample and the comparator. This simplification greatly reduces not only the calculations required, but also the uncertainty associated with the measurement; nevertheless, a cautious analysis must be carried out in order to make sure all relevant uncertainties are properly treated, so that the final result can be as representative of the measurement as possible. In this work, this analysis was performed for geological matrices, where the concentrations of the nuclides of interest are rather high, but so are the density and average atomic number of the sample, as well as for a biological matrix, in order to allow for a comparison. The results show that the largest part of the uncertainty comes from the activity measurements and from the concentration of the comparator, and that while the influence of time-related terms in the final uncertainty can be safely neglected, the uncertainty in the masses may be relevant under specific circumstances. (author)
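
    The comparator relation described above, including a decay correction for the delay between the two countings, can be written out directly; every number below is assumed for illustration.

        import numpy as np

        half_life_h = 15.0                   # half-life of the formed radionuclide [h]
        lam = np.log(2) / half_life_h

        A_sample, m_sample = 5200.0, 0.150   # net count rate [1/s] and mass [g], sample
        A_comp, m_comp = 4800.0, 0.100       # the same for the comparator
        c_comp = 25.0                        # comparator concentration [mg/kg]
        dt_h = 2.0                           # comparator counted 2 h after the sample

        # Decay-correct the comparator back to the sample counting time, then
        # take the ratio of specific activities.
        c_sample = c_comp * (A_sample / m_sample) / (A_comp / m_comp) * np.exp(-lam * dt_h)
        print(f"sample concentration: {c_sample:.1f} mg/kg")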

  17. Uncertainty and sensitivity analysis of chronic exposure results with the MACCS reactor accident consequence model

    Helton, J.C; Johnson, J.D; Rollstin, J.A; Shiver, A.W; Sprung, J.L

    1995-01-01

    Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis and stepwise regression analysis are used in an investigation with the MACCS model of the chronic exposure pathways associated with a severe accident at a nuclear power station. The primary purpose of this study is to provide guidance on the variables to be considered in future review work to reduce the uncertainty in the important variables used in the calculation of reactor accident consequences. The effects of 75 imprecisely known input variables on the following reactor accident consequences are studied: crop growing-season dose, crop long-term dose, water ingestion dose, milk growing-season dose, long-term groundshine dose, long-term inhalation dose, total food pathways dose, total ingestion pathways dose, total long-term pathways dose, total latent cancer fatalities, area-dependent cost, crop disposal cost, milk disposal cost, population-dependent cost, total economic cost, condemnation area, condemnation population, crop disposal area and milk disposal area. When the predicted variables are considered collectively, the following input variables were found to be the dominant contributors to uncertainty: dry deposition velocity, transfer of cesium from animal feed to milk, transfer of cesium from animal feed to meat, ground concentration of Cs-134 at which the disposal of milk products will be initiated, transfer of Sr-90 from soil to legumes, maximum allowable ground concentration of Sr-90 for production of crops, fraction of cesium entering surface water that is consumed in drinking water, groundshine shielding factor, scale factor defining resuspension, dose reduction associated with decontamination, and ground concentration of I-131 at which disposal of crops will be initiated due to accidents that occur during the growing season. Reducing the uncertainty in the preceding variables was found to substantially reduce the uncertainty in the

  18. Stakeholder-driven multi-attribute analysis for energy project selection under uncertainty

    Read, Laura; Madani, Kaveh; Mokhtari, Soroush; Hanks, Catherine

    2017-01-01

    In practice, selecting an energy project for development requires balancing criteria and competing stakeholder priorities to identify the best alternative. Energy source selection can be modeled as a multi-criteria decision-making problem to provide quantitative support for reconciling technical, economic, environmental, social, and political factors with respect to the stakeholders' interests. Decision making among these complex interactions should also account for the uncertainty present in the input data. In response, this work develops a stochastic decision analysis framework to evaluate alternatives by involving stakeholders to identify both quantitative and qualitative selection criteria and performance metrics which carry uncertainties. The developed framework is illustrated using a case study from Fairbanks, Alaska, where decision makers and residents must decide on a new source of energy for heating and electricity. We approach this problem with a five-step methodology: (1) engaging experts (role players) to develop criteria of project performance; (2) collecting a range of quantitative and qualitative input information to determine the performance of each proposed solution according to the selected criteria; (3) performing a Monte Carlo analysis to capture the uncertainties in the inputs; (4) applying multi-criteria decision-making, social choice (voting), and fallback bargaining methods to account for three different levels of cooperation among the stakeholders; and (5) computing an aggregate performance index (API) score for each alternative based on its performance across criteria and cooperation levels. API scores communicate relative performance between alternatives. In this way, our methodology maps uncertainty from the input data to reflect risk in the decision and incorporates varying degrees of cooperation into the analysis to identify an optimal and practical alternative. - Highlights: • We develop an applicable stakeholder-driven framework for

  19. Demonstration of uncertainty quantification and sensitivity analysis for PWR fuel performance with BISON

    Zhang, Hongbin; Zhao, Haihua; Zou, Ling; Burns, Douglas; Ladd, Jacob

    2017-01-01

    BISON is an advanced fuels performance code being developed at Idaho National Laboratory and is the code of choice for fuels performance by the U.S. Department of Energy (DOE)’s Consortium for Advanced Simulation of Light Water Reactors (CASL) Program. An approach to uncertainty quantification and sensitivity analysis with BISON was developed and a new toolkit was created. A PWR fuel rod model was developed and simulated by BISON, and uncertainty quantification and sensitivity analysis were performed with eighteen uncertain input parameters. The maximum fuel temperature and gap conductance were selected as the figures of merit (FOM). Pearson, Spearman, and partial correlation coefficients were considered for all of the figures of merit in sensitivity analysis. (author)

  20. Demonstration of Uncertainty Quantification and Sensitivity Analysis for PWR Fuel Performance with BISON

    Zhang, Hongbin; Ladd, Jacob; Zhao, Haihua; Zou, Ling; Burns, Douglas

    2015-11-01

    BISON is an advanced fuels performance code being developed at Idaho National Laboratory and is the code of choice for fuels performance by the U.S. Department of Energy (DOE)’s Consortium for Advanced Simulation of Light Water Reactors (CASL) Program. An approach to uncertainty quantification and sensitivity analysis with BISON was developed and a new toolkit was created. A PWR fuel rod model was developed and simulated by BISON, and uncertainty quantification and sensitivity analysis were performed with eighteen uncertain input parameters. The maximum fuel temperature and gap conductance were selected as the figures of merit (FOM). Pearson, Spearman, and partial correlation coefficients were considered for all of the figures of merit in sensitivity analysis.

  1. Technology relevance of the 'uncertainty analysis in modelling' project for nuclear reactor safety

    D'Auria, F.; Langenbuch, S.; Royer, E.; Del Nevo, A.; Parisi, C.; Petruzzi, A.

    2007-01-01

    The OECD/NEA Nuclear Science Committee (NSC) endorsed the setting up of an Expert Group on Uncertainty Analysis in Modelling (UAM) in June 2006. This Expert Group reports to the Working Party on Scientific issues in Reactor Systems (WPRS) and because it addresses multi-scale / multi-physics aspects of uncertainty analysis, it will work in close co-ordination with the benchmark groups on coupled neutronics-thermal-hydraulics and on coupled core-plant problems, and the CSNI Group on Analysis and Management of Accidents (GAMA). The NEA/NSC has endorsed that this activity be undertaken with Prof. K. Ivanov from the Pennsylvania State University (PSU) as the main coordinator and host with the assistance of the Scientific Board. The objective of the proposed work is to define, coordinate, conduct, and report an international benchmark for uncertainty analysis in best-estimate coupled code calculations for design, operation, and safety analysis of LWRs entitled 'OECD UAM LWR Benchmark'. At the First Benchmark Workshop (UAM-1) held from 10 to 11 May 2007 at the OECD/NEA, one action concerned the forming of a sub-group, led by F. D'Auria, member of CSNI, responsible for defining the objectives, the impact and benefit of the UAM for safety and licensing. This report is the result of this action by the subgroup. (authors)

  2. Sensitivity and uncertainty analysis for Ignalina NPP confinement in case of loss of coolant accident

    Urbonavicius, E.; Babilas, E.; Rimkevicius, S.

    2003-01-01

    At present, the best-estimate approach is widely used in the safety analysis of nuclear power plants around the world. The application of such an approach requires estimating the uncertainty of the calculated results. Various methodologies are applied in order to determine the uncertainty with the required accuracy. One of them is the statistical methodology developed at GRS mbH in Germany and integrated into the SUSA tool, which was applied for the sensitivity and uncertainty analysis of the thermal-hydraulic parameters inside the confinement (Accident Localisation System) of the Ignalina NPP with the RBMK-1500 reactor in case of the Maximum Design Basis Accident (break of a 900 mm diameter pipe). Several parameters that could potentially influence the calculated results were selected for the analysis. A set of input data with different initial values of the selected parameters was generated. In order to obtain results with 95% probability and 95% confidence, 100 runs were performed with the COCOSYS code developed at GRS mbH. The calculated results were processed with the SUSA tool. The performed analysis showed a rather low dispersion of the results, and only in the initial period of the accident. Besides, the analysis showed that there is no threat to the building structures of the Ignalina NPP confinement in case of the considered accident scenario. (author)
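
    The "95% probability / 95% confidence with 100 runs" statement rests on Wilks' order-statistics formula: the smallest n with 1 - 0.95^n >= 0.95 is 59 for a one-sided, first-order tolerance limit, so 100 runs provide comfortable margin.

        # Minimal Wilks sample-size computation (first-order, one-sided).
        def wilks_n(gamma=0.95, beta=0.95):
            n = 1
            while 1.0 - gamma ** n < beta:
                n += 1
            return n

        print(wilks_n())   # -> 59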

  3. The effect of uncertainty of reactor parameters obtained using k0-NAA on result of analysis

    Sasajima, Fumio

    2006-01-01

    Neutron activation analysis using the k0 method is a useful technique allowing convenient and accurate simultaneous analysis of multiple elements, eliminating the need for comparative reference samples. As is well known, it is essential for a correct analysis result to accurately determine the α-factor and f-factor for the neutron spectrum in the irradiation field when the k0 method is used. For this reason, based on data obtained from an experiment conducted in the JRR-3 PN-3 system, how the uncertainty of the measured values of the α-factor and f-factor affects the result of an analysis was evaluated. The evaluation involved intentionally varying the values of the reactor parameters, followed by an analysis of environmental reference samples (NIST SRM-1632c) using the k0 method to examine the effect of these factors on the concentrations of 19 elements. The result of the evaluation revealed that the degree of the effect of the uncertainty on the concentrations of the 19 elements was at most approximately 1% under the conditions of this experiment, assuming that the factor α, a reactor parameter, had an uncertainty of approximately 200%. (author)

  4. Uncertainty analysis and allocation of joint tolerances in robot manipulators based on interval analysis

    Wu Weidong; Rao, S.S.

    2007-01-01

    Many uncertain factors influence the accuracy and repeatability of robots. These factors include manufacturing and assembly tolerances and deviations in actuators and controllers. The effects of these uncertain factors must be carefully analyzed to obtain a clear insight into manipulator performance. In order to ensure the position and orientation accuracy of a robot end effector as well as to reduce the manufacturing cost of the robot, it is necessary to quantify the influence of the uncertain factors and optimally allocate the tolerances. This involves a study of the direct and inverse kinematics of robot end effectors in the presence of uncertain factors. This paper focuses on the optimal allocation of joint tolerances with consideration of the positional and directional errors of the robot end effector and the manufacturing cost. Interval analysis is used for predicting errors in the performance of robot manipulators. The Stanford manipulator is considered for illustration. The unknown joint variables are modeled as interval parameters due to their inherent uncertainty. The cost-tolerance model is assumed to be of an exponential form during optimization. The effects of the upper bounds on the minimum cost and relative deviations of the directional and positional errors of the end effector are also studied.

  5. Stand-alone core sensitivity and uncertainty analysis of ALFRED from Monte Carlo simulations

    Pérez-Valseca, A.-D.; Espinosa-Paredes, G.; François, J.L.; Vázquez Rodríguez, A.; Martín-del-Campo, C.

    2017-01-01

    Highlights: • Methodology based on Monte Carlo simulation. • Sensitivity analysis of a Lead Fast Reactor (LFR). • Uncertainty and regression analysis of the LFR. • For a 10% change in the core inlet flow, the response in thermal power is a 0.58% change. • For a 2.5% change in the inlet lead temperature, the response is 1.87% in power. - Abstract: The aim of this paper is the sensitivity and uncertainty analysis of a Lead-Cooled Fast Reactor (LFR) based on Monte Carlo simulations with sample sizes up to 2000. The methodology developed in this work considers the uncertainty of sensitivities and the uncertainty of output variables due to single-input-variable variations. The Advanced Lead Fast Reactor European Demonstrator (ALFRED) is analyzed to determine the behavior of the essential parameters due to the effects of the mass flow and temperature of the liquid lead. The ALFRED core mathematical model developed in this work is fully transient, taking into account the heat transfer in an annular fuel pellet design, the thermo-fluid dynamics in the core, and the neutronic processes, which are modeled with point kinetics including fuel temperature feedback and expansion effects. The sensitivity evaluated in terms of the relative standard deviation (RSD) showed that for a 10% change in the core inlet flow, the response in thermal power is a 0.58% change, and for a 2.5% change in the inlet lead temperature it is 1.87%. The regression analysis with the mass flow rate as the predictor variable showed statistically valid cubic correlations for the neutron flux, and a linear relationship between the neutron flux and the lead temperature. No statistically valid correlation was observed for the reactivity as a function of either the mass flow rate or the lead temperature. These correlations are useful for the study, analysis, and design of any LFR.

  6. Sensitivity analysis of respiratory parameter uncertainties: impact of criterion function form and constraints.

    Lutchen, K R

    1990-08-01

    A sensitivity analysis based on weighted least-squares regression is presented to evaluate alternative methods for fitting lumped-parameter models to respiratory impedance data. The goal is to maintain parameter accuracy simultaneously with practical experiment design. The analysis focuses on predicting parameter uncertainties using a linearized approximation for joint confidence regions. Applications are with four-element parallel and viscoelastic models for 0.125- to 4-Hz data and a six-element model with separate tissue and airway properties for input and transfer impedance data from 2 to 64 Hz. The criterion function form was evaluated by comparing parameter uncertainties when data are fit as magnitude and phase, dynamic resistance and compliance, or real and imaginary parts of input impedance. The proper choice of weighting can make all three criterion variables comparable. For the six-element model, parameter uncertainties were predicted when both input impedance and transfer impedance are acquired and fit simultaneously. A fit to both data sets from 4 to 64 Hz could reduce parameter estimate uncertainties considerably from those achievable by fitting either alone. For the four-element models, use of an independent, but noisy, measure of static compliance was assessed as a constraint on model parameters. This may allow acceptable parameter uncertainties for a minimum frequency of 0.275-0.375 Hz rather than 0.125 Hz, reducing the data acquisition requirement from a 16-s to a 5.33- to 8-s breath-holding period. These results are approximations, and the impact of using the linearized approximation for the confidence regions is discussed.
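
    The linearized confidence-region machinery is compact enough to show directly. The sketch below predicts parameter uncertainties from the weighted least-squares covariance (JᵀWJ)⁻¹, evaluated at assumed parameter values (as one does for experiment design, without an actual fit). A single-compartment resistance-inertance-compliance (R-I-C) impedance model is used as a simplified stand-in for the paper's four-element models; all numbers are illustrative.

```python
import numpy as np

def jacobian(f, R, I, C):
    """Analytic Jacobian of [Re(Z); Im(Z)] w.r.t. (R, I, C) for the
    single-compartment model Z = R + j*(2*pi*f*I - 1/(2*pi*f*C))."""
    w = 2 * np.pi * f
    n = f.size
    J = np.zeros((2 * n, 3))
    J[:n, 0] = 1.0                 # dRe/dR
    J[n:, 1] = w                   # dIm/dI
    J[n:, 2] = 1.0 / (w * C**2)    # dIm/dC
    return J

def param_sd(f, R, I, C, sigma):
    """Linearized parameter SDs from cov = (J^T W J)^-1, W = I/sigma^2."""
    J = jacobian(f, R, I, C)
    cov = np.linalg.inv(J.T @ J / sigma**2)
    return np.sqrt(np.diag(cov))

R, I, C = 2.0, 0.01, 0.05         # assumed parameters (illustrative)
sigma = 0.05                      # assumed measurement SD on Re and Im

f_full = np.linspace(0.125, 4.0, 32)    # full low-frequency grid [Hz]
f_cut = np.linspace(0.375, 4.0, 30)     # grid with low frequencies removed
for label, f in [("fmin = 0.125 Hz", f_full), ("fmin = 0.375 Hz", f_cut)]:
    sds = param_sd(f, R, I, C, sigma)
    rel = ", ".join(f"{n}: {s/p:.1%}" for n, p, s in zip("RIC", (R, I, C), sds))
    print(f"{label}  relative parameter SDs -> {rel}")
# compliance information concentrates at low frequency, so raising the
# minimum frequency inflates the uncertainty of C -- the trade-off the
# paper quantifies against breath-holding time
```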

  7. Uncertainty quantification and sensitivity analysis with CASL Core Simulator VERA-CS

    Brown, C.S.; Zhang, Hongbin

    2016-01-01

    VERA-CS (Virtual Environment for Reactor Applications, Core Simulator) is a coupled neutron transport and thermal-hydraulics code under development by the Consortium for Advanced Simulation of Light Water Reactors (CASL). An approach to uncertainty quantification and sensitivity analysis with VERA-CS was developed and a new toolkit was created to perform uncertainty quantification and sensitivity analysis. A 2 × 2 fuel assembly model was developed and simulated by VERA-CS, and uncertainty quantification and sensitivity analysis were performed with fourteen uncertain input parameters. The minimum departure from nucleate boiling ratio (MDNBR), maximum fuel center-line temperature, and maximum outer clad surface temperature were chosen as the selected figures of merit. Pearson, Spearman, and partial correlation coefficients were considered for all of the figures of merit in sensitivity analysis and coolant inlet temperature was consistently the most influential parameter. Parameters used as inputs to the critical heat flux calculation with the W-3 correlation were shown to be the most influential on the MDNBR, maximum fuel center-line temperature, and maximum outer clad surface temperature.
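
    The three correlation measures named above are easy to reproduce on sampled input/output data. In this sketch the inputs and the MDNBR-like response are invented (not the VERA-CS deck); partial correlation is computed by correlating the residuals after regressing out the remaining inputs, which is one standard definition.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n = 500

# hypothetical sampled inputs (names illustrative only)
t_inlet = rng.normal(565.0, 2.0, n)     # coolant inlet temperature [K]
power = rng.normal(1.0, 0.02, n)        # relative assembly power
flow = rng.normal(1.0, 0.03, n)         # relative coolant flow

# toy figure of merit standing in for MDNBR
mdnbr = 2.0 - 0.01 * (t_inlet - 565.0) - 0.8 * (power - 1.0) \
        + 0.5 * (flow - 1.0) + rng.normal(0, 0.01, n)

X = np.column_stack([t_inlet, power, flow])
names = ["t_inlet", "power", "flow"]
for j, name in enumerate(names):
    pearson = stats.pearsonr(X[:, j], mdnbr)[0]
    spearman = stats.spearmanr(X[:, j], mdnbr)[0]
    # partial correlation: correlate residuals after regressing out
    # the other inputs (plus an intercept column)
    others = np.column_stack([X[:, k] for k in range(3) if k != j]
                             + [np.ones(n)])
    rx = X[:, j] - others @ np.linalg.lstsq(others, X[:, j], rcond=None)[0]
    ry = mdnbr - others @ np.linalg.lstsq(others, mdnbr, rcond=None)[0]
    partial = stats.pearsonr(rx, ry)[0]
    print(f"{name:8s} Pearson {pearson:+.2f}  Spearman {spearman:+.2f}  "
          f"Partial {partial:+.2f}")
```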

  8. Deterministic methods for sensitivity and uncertainty analysis in large-scale computer models

    Worley, B.A.; Oblow, E.M.; Pin, F.G.; Maerker, R.E.; Horwedel, J.E.; Wright, R.Q.; Lucius, J.L.

    1987-01-01

    The fields of sensitivity and uncertainty analysis are dominated by statistical techniques when large-scale modeling codes are being analyzed. This paper reports on the development and availability of two systems, GRESS and ADGEN, that make use of computer calculus compilers to automate the implementation of deterministic sensitivity analysis capability into existing computer models. This automation removes the traditional limitation of deterministic sensitivity methods. The paper describes a deterministic uncertainty analysis method (DUA) that uses derivative information as a basis to propagate parameter probability distributions to obtain result probability distributions. The paper demonstrates the deterministic approach to sensitivity and uncertainty analysis as applied to a sample problem that models the flow of water through a borehole. The sample problem is used as a basis to compare the cumulative distribution function of the flow rate as calculated by the standard statistical methods and the DUA method. The DUA method gives a more accurate result based upon only two model executions, compared to fifty executions in the statistical case.
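
    The contrast between derivative-based (DUA-style) and statistical propagation can be reproduced on the same example. The sketch below uses the standard eight-parameter borehole test function common in this literature, with parameter means at the midpoints of its usual ranges and illustrative 5% standard deviations; derivatives here come from central differences, whereas GRESS/ADGEN would generate them automatically from the model source.

```python
import numpy as np

def borehole(x):
    """Standard borehole flow test function [m^3/yr]."""
    rw, r, Tu, Hu, Tl, Hl, L, Kw = x
    lnr = np.log(r / rw)
    return (2 * np.pi * Tu * (Hu - Hl)
            / (lnr * (1 + 2 * L * Tu / (lnr * rw**2 * Kw) + Tu / Tl)))

mean = np.array([0.10, 25050.0, 89335.0, 1050.0, 89.55, 760.0, 1400.0, 10950.0])
sd = 0.05 * mean                      # illustrative 5% standard deviations

# first-order (derivative-based) propagation, DUA-style:
# var(f) ~ sum_i (df/dx_i)^2 * var(x_i)
grad = np.empty(mean.size)
for i in range(mean.size):
    h = 1e-6 * mean[i]
    xp, xm = mean.copy(), mean.copy()
    xp[i] += h
    xm[i] -= h
    grad[i] = (borehole(xp) - borehole(xm)) / (2 * h)
var_fo = np.sum((grad * sd) ** 2)
print(f"first-order: mean {borehole(mean):.1f}, sd {np.sqrt(var_fo):.1f}")

# reference: plain Monte Carlo with many runs
rng = np.random.default_rng(0)
samples = rng.normal(mean, sd, size=(50_000, mean.size))
y = np.array([borehole(s) for s in samples])
print(f"Monte Carlo: mean {y.mean():.1f}, sd {y.std(ddof=1):.1f}")
```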

  9. A comparison of uncertainty analysis methods using a groundwater flow model

    Doctor, P.G.; Jacobson, E.A.; Buchanan, J.A.

    1988-06-01

    This report evaluates three uncertainty analysis methods that are proposed for use in performance assessment activities within the OCRWM and Nuclear Regulatory Commission (NRC) communities. The three methods are Monte Carlo simulation with unconstrained sampling, Monte Carlo simulation with Latin Hypercube sampling, and first-order analysis. Monte Carlo simulation with unconstrained sampling is a generally accepted uncertainty analysis method, but it has the disadvantage of being costly and time consuming. Latin Hypercube sampling was proposed to make Monte Carlo simulation more efficient; although it was originally formulated for independent variables, which would be a major drawback in performance assessment modeling, Latin Hypercube sampling can be extended to generate correlated samples. The first-order method is efficient to implement because it is based on the first-order Taylor series expansion; however, there is concern that it does not adequately describe the variability for complex models. These three uncertainty analysis methods were evaluated using a calibrated groundwater flow model of an unconfined aquifer in southern Arizona. The two simulation methods produced similar results, although the Latin Hypercube method tends to produce samples whose estimates of statistical parameters are closer to the desired parameters. The mean travel times for the first-order method do not agree with those of the simulations. In addition, the first-order method produces estimates of variance in travel times that are more variable than those produced by the simulation methods, resulting in nonconservative tolerance intervals.
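
    The efficiency argument for Latin Hypercube sampling is easy to demonstrate: over repeated designs of the same size, LHS estimates of a mean scatter less than unconstrained Monte Carlo estimates. The travel-time model below is a toy with invented log-conductivity and gradient distributions, not the Arizona flow model.

```python
import numpy as np
from scipy.stats import norm, qmc

rng = np.random.default_rng(3)
n, reps = 100, 200

def spread_of_mean(use_lhs):
    """Scatter of the sample-mean travel time over repeated designs."""
    means = []
    for k in range(reps):
        if use_lhs:
            u = qmc.LatinHypercube(d=2, seed=k).random(n)  # stratified
        else:
            u = rng.random((n, 2))                         # unconstrained
        logk = norm.ppf(u[:, 0], loc=-5.0, scale=0.5)   # log-conductivity
        grad = norm.ppf(u[:, 1], loc=0.01, scale=0.002) # hydraulic gradient
        travel = 1.0 / (np.exp(logk) * grad)            # toy travel time
        means.append(travel.mean())
    return np.std(means, ddof=1)

print("spread of estimated mean, unconstrained MC:", spread_of_mean(False))
print("spread of estimated mean, Latin Hypercube :", spread_of_mean(True))
```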

  10. Free Vibration Analysis of Composite Plates via Refined Theories Accounting for Uncertainties

    G. Giunta

    2011-01-01

    The free vibration analysis of composite thin and relatively thick plates accounting for uncertainty is addressed in this work. Classical and refined two-dimensional models derived via Carrera's Unified Formulation (CUF) are considered. Material properties and geometrical parameters are supposed to be random. The fundamental frequency related to the first bending eigenmode is stochastically described in terms of the mean value, the standard deviation, the related confidence intervals and the cumulative distribution function. The Monte Carlo method is employed to account for uncertainty. Cross-ply, simply supported, orthotropic plates are considered, and symmetric and anti-symmetric lay-ups are investigated. Displacement-based and mixed two-dimensional theories are adopted. Equivalent single layer and layer-wise approaches are considered. A Navier-type solution is assumed. The conducted analyses have shown that, for the considered cases, the fundamental natural frequency is not very sensitive to the uncertainty in the material parameters, while uncertainty in the geometrical parameters should be accounted for. In the case of thin plates, all the considered models yield statistically matching results. For relatively thick plates, the difference in the mean value of the natural frequency is due to the different number of degrees of freedom in the model.
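
    A minimal version of such a stochastic frequency study can be written around any deterministic frequency solver. Below, an isotropic simply supported Kirchhoff plate replaces the CUF models, so the plate formula, dimensions, and 3% coefficients of variation are all illustrative; Monte Carlo sampling then yields the mean, standard deviation, confidence interval, and empirical CDF of the fundamental frequency.

```python
import numpy as np

def fundamental_frequency(E, nu, rho, h, a, b):
    """First bending frequency [Hz] of a simply supported isotropic
    Kirchhoff plate -- a simplified stand-in for the CUF models."""
    D = E * h**3 / (12 * (1 - nu**2))        # bending stiffness
    return (np.pi / 2) * (1 / a**2 + 1 / b**2) * np.sqrt(D / (rho * h))

rng = np.random.default_rng(11)
n = 20_000
# illustrative means with 3% coefficients of variation
E = rng.normal(70e9, 0.03 * 70e9, n)      # Young's modulus [Pa]
rho = rng.normal(2700, 0.03 * 2700, n)    # density [kg/m^3]
h = rng.normal(0.005, 0.03 * 0.005, n)    # thickness [m]
f = fundamental_frequency(E, 0.3, rho, h, a=0.4, b=0.3)

lo, hi = np.percentile(f, [2.5, 97.5])
print(f"mean {f.mean():.1f} Hz, sd {f.std(ddof=1):.1f} Hz, "
      f"95% CI [{lo:.1f}, {hi:.1f}] Hz")
print("empirical CDF at the mean:", np.mean(f <= f.mean()))
```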

  11. Verification and uncertainty evaluation of CASMO-3/MASTER nuclear analysis system

    Song, Jae Seung; Cho, Byung Oh; Joo, Han Kyu; Zee, Sung Quun; Lee, Chung Chan; Park, Sang Yoon

    2000-06-01

    MASTER is a nuclear design code developed by KAERI. It uses group constants generated by CASMO-3, developed by Studsvik. In this report the verification and evaluation of uncertainty were performed for the application of the code system to nuclear reactor core analysis and design. The verification is performed via various benchmark comparisons for static and transient core conditions, and via core follow calculations with startup physics test predictions for a total of 14 cycles of pressurized water reactors. Benchmark calculations include comparisons with reference solutions of IAEA and OECD/NEA problems and with critical experiment measurements. The uncertainty evaluation is focused on safety-related parameters such as power distribution, reactivity coefficients, control rod worth and core reactivity. It is concluded that CASMO-3/MASTER can be applied for PWR core nuclear analysis and design without any bias factors. Also, it is verified that the system can be applied for the SMART core, via supplemental comparisons with reference calculations by MCNP, a probabilistic nuclear calculation code.

  13. Bayesian Statistics and Uncertainty Quantification for Safety Boundary Analysis in Complex Systems

    He, Yuning; Davies, Misty Dawn

    2014-01-01

    The analysis of a safety-critical system often requires detailed knowledge of safe regions and their high-dimensional non-linear boundaries. We present a statistical approach to iteratively detect and characterize the boundaries, which are provided as parameterized shape candidates. Using methods from uncertainty quantification and active learning, we incrementally construct a statistical model from only a few simulation runs and obtain statistically sound estimates of the shape parameters for safety boundaries.

  14. Micropollutants throughout an integrated urban drainage model: Sensitivity and uncertainty analysis

    Mannina, Giorgio; Cosenza, Alida; Viviani, Gaspare

    2017-11-01

    The paper presents the sensitivity and uncertainty analysis of an integrated urban drainage model which includes micropollutants. Specifically, a bespoke integrated model developed in previous studies has been modified in order to include micropollutant assessment (namely, sulfamethoxazole - SMX). The model also takes into account the interactions between the three components of the system: sewer system (SS), wastewater treatment plant (WWTP) and receiving water body (RWB). The analysis has been applied to an experimental catchment near Palermo (Italy): the Nocella catchment. Overall, five scenarios, each characterized by different uncertainty combinations of the sub-systems (i.e., SS, WWTP and RWB), have been considered, applying the Extended-FAST method for the sensitivity analysis in order to select the key factors affecting the RWB quality and to design a reliable/useful experimental campaign. Results have demonstrated that sensitivity analysis is a powerful tool for increasing operator confidence in the modelling results. The approach adopted here can be used for blocking some non-identifiable factors, thus wisely modifying the structure of the model and reducing the related uncertainty. The model factors related to the SS have been found to be the most relevant factors affecting SMX modeling in the RWB when all model factors (scenario 1) or the model factors of the SS (scenarios 2 and 3) are varied. If only the factors related to the WWTP are changed (scenarios 4 and 5), the SMX concentration in the RWB is mainly influenced (up to 95% of the total variance for S_SMX,max) by the aerobic sorption coefficient. A progressive uncertainty reduction from upstream to downstream was found for the soluble fraction of SMX in the RWB.
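
    Extended-FAST, like other variance-based methods, attributes shares of the output variance to input factors. The sketch below illustrates the variance-based idea using the Saltelli Monte Carlo estimator of first-order Sobol indices rather than the FAST frequency approach, and an invented toy response in place of the integrated SS/WWTP/RWB model.

```python
import numpy as np

rng = np.random.default_rng(5)

def model(x):
    """Toy river-quality response standing in for the integrated
    SS/WWTP/RWB model (far too large to reproduce here)."""
    sorption, decay, runoff = x.T
    return np.exp(-2.0 * sorption) * (1.0 + 0.3 * runoff) - 0.1 * decay

n, d = 4096, 3
A = rng.random((n, d))        # factors scaled to [0, 1]
B = rng.random((n, d))
yA, yB = model(A), model(B)
var = np.var(np.concatenate([yA, yB]), ddof=1)

for i, name in enumerate(["sorption", "decay", "runoff"]):
    ABi = A.copy()
    ABi[:, i] = B[:, i]       # replace column i with the B sample
    yABi = model(ABi)
    Si = np.mean(yB * (yABi - yA)) / var    # Saltelli (2010) estimator
    print(f"first-order index of {name}: {Si:.2f}")
```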

  15. Information Synthesis in Uncertainty Studies: Application to the Analysis of the BEMUSE Results

    Baccou, J.; Chojnacki, E.; Destercke, S.

    2013-01-01

    To demonstrate that nuclear power plants are designed to respond safely to numerous postulated accidents, computer codes are used. The models in these computer codes are an approximation of the real physical behaviour occurring during an accident. Moreover, the data used to run these codes are known with only limited accuracy. Therefore the code predictions are not exact but uncertain. To deal with these uncertainties, 'best estimate' codes with 'best estimate' input data are used to obtain a best estimate calculation, and it is necessary to derive the uncertainty associated with their estimates. For this reason, regulatory authorities require technical safety organizations, such as the French Institut de Radioprotection et de Sûreté Nucléaire (IRSN), to provide results that take into account all uncertainty sources, in order to assess whether safety quantities remain below critical values. Uncertainty analysis can be seen as a problem of information treatment, and special effort is needed on four key methodological issues. The first is related to information modelling. In safety studies, one can distinguish two kinds of uncertainty. The first type, called aleatory uncertainty, is due to the natural variability of an observed phenomenon and cannot be reduced by the arrival of new information. The second type, called epistemic uncertainty, can arise from imprecision. Contrary to the previous one, this uncertainty can be reduced by increasing the state of knowledge. Performing relevant information modelling therefore requires a mathematical formalism flexible enough to treat both types of uncertainty faithfully. The second issue deals with information propagation through a computer code. It requires running the code several times and is usually achieved by coupling to statistical software. The complexity of the propagation is strongly connected to the mathematical framework used for the information modelling. The more general the
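
    The aleatory/epistemic distinction described above is often operationalized with a two-level ("double-loop") Monte Carlo scheme: an outer sweep over the imprecisely known (epistemic) quantity and an inner random sample over the naturally variable (aleatory) one. The toy code output below is invented; only the loop structure is the point.

```python
import numpy as np

rng = np.random.default_rng(9)

def peak_temperature(gap_conductance, power):
    """Toy safety quantity standing in for a best-estimate code output."""
    return 600.0 + 250.0 * power / gap_conductance

# epistemic outer loop: imprecisely known gap conductance, represented
# here only by an interval sweep (no distribution is claimed for it)
q95 = []
for g in np.linspace(0.8, 1.2, 21):
    # aleatory inner loop: naturally variable power
    power = rng.normal(1.0, 0.05, 2000)
    t = peak_temperature(g, power)
    q95.append(np.percentile(t, 95))

# the envelope over the epistemic sweep separates lack of knowledge
# from variability (a "p-box"-style summary)
print(f"95th percentile of peak T spans [{min(q95):.0f}, {max(q95):.0f}] K "
      "over the epistemic range")
```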

  16. Uncertainty and sensitivity analysis of early exposure results with the MACCS Reactor Accident Consequence Model

    Helton, J.C.; Johnson, J.D.; McKay, M.D.; Shiver, A.W.; Sprung, J.L.

    1995-01-01

    Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis and stepwise regression analysis are used in an investigation with the MACCS model of the early health effects associated with a severe accident at a nuclear power station. The primary purpose of this study is to provide guidance on the variables to be considered in future review work to reduce the uncertainty in the important variables used in the calculation of reactor accident consequences. The effects of 34 imprecisely known input variables on the following reactor accident consequences are studied: number of early fatalities, number of cases of prodromal vomiting, population dose within 10 mi of the reactor, population dose within 1000 mi of the reactor, individual early fatality probability within 1 mi of the reactor, and maximum early fatality distance. When the predicted variables are considered collectively, the following input variables were found to be the dominant contributors to uncertainty: scaling factor for horizontal dispersion, dry deposition velocity, inhalation protection factor for nonevacuees, groundshine shielding factor for nonevacuees, early fatality hazard function alpha value for bone marrow exposure, and scaling factor for vertical dispersion.

  17. Sensitivity analysis on uncertainty variables affecting the NPP's LUEC with probabilistic approach

    Nuryanti; Akhmad Hidayatno; Erlinda Muslim

    2013-01-01

    One crucial element to review prior to any investment decision on a nuclear power plant (NPP) project is the project economics, including the calculation of the Levelized Unit Electricity Cost (LUEC). Infrastructure projects such as NPPs are vulnerable to a number of uncertainty variables. Information on the uncertainty variables to which the LUEC is most sensitive is necessary so that cost overruns can be avoided. Therefore, this study performed a sensitivity analysis on the variables that affect the LUEC using a probabilistic approach. The analysis used a Monte Carlo technique to simulate the relationship between the uncertainty variables and their visible impact on the LUEC. The results show significant changes in the LUEC of the AP1000 and OPR due to the sensitivity of the investment cost and the capacity factor, while the LUEC change due to the sensitivity of the U₃O₈ price is not significant. (author)
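
    A probabilistic LUEC sensitivity study of this kind reduces to a few lines of Monte Carlo. All distributions and cost figures below are illustrative placeholders, not AP1000 or OPR data; Spearman rank correlations between each sampled input and the resulting LUEC serve as the sensitivity measure.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2024)
n = 100_000

# illustrative triangular distributions -- placeholders, not plant data
capex = rng.triangular(4000, 5000, 7000, n)   # overnight cost [$/kW]
cf = rng.triangular(0.80, 0.90, 0.95, n)      # capacity factor
fuel = rng.triangular(5.0, 7.0, 10.0, n)      # fuel cost [$/MWh]
om = 12.0                                     # O&M [$/MWh], held fixed
r, years = 0.07, 60
crf = r * (1 + r)**years / ((1 + r)**years - 1)   # capital recovery factor

luec = capex * crf / (8760 * cf) * 1000 + om + fuel   # [$/MWh]

print(f"LUEC mean {luec.mean():.1f} $/MWh, 5-95% range "
      f"[{np.percentile(luec, 5):.1f}, {np.percentile(luec, 95):.1f}]")
for name, x in [("investment cost", capex), ("capacity factor", cf),
                ("fuel price", fuel)]:
    print(f"{name:16s} Spearman rho vs LUEC: {stats.spearmanr(x, luec)[0]:+.2f}")
```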

  18. Uncertainty and sensitivity analysis for the simulation of a station blackout scenario in the Jules Horowitz Reactor

    Ghione, Alberto; Noel, Brigitte; Vinai, Paolo; Demazière, Christophe

    2017-01-01

    Highlights: • A station blackout scenario in the Jules Horowitz Reactor is analyzed using CATHARE. • Input and model uncertainties relevant to the transient are considered. • A statistical methodology for the propagation of the uncertainties is applied. • No safety criteria are exceeded and sufficiently large safety margins are estimated. • The most influential uncertainties are determined with a sensitivity analysis. - Abstract: An uncertainty and sensitivity analysis for the simulation of a station blackout scenario in the Jules Horowitz Reactor (JHR) is presented. The JHR is a new material testing reactor under construction at CEA on the Cadarache site, France. The thermal-hydraulic system code CATHARE is applied to investigate the response of the reactor system to the scenario. The uncertainty and sensitivity study was based on a statistical methodology for code uncertainty propagation, and the 'Uncertainty and Sensitivity' platform URANIE was used. Accordingly, the input uncertainties relevant to the transient were identified, quantified, and propagated to the code output. The results show that the safety criteria are not exceeded and that sufficiently large safety margins exist. In addition, the most influential input uncertainties on the safety parameters were found by making use of a sensitivity analysis.

  19. Uncertainty and Sensitivity Analysis Applied to the Validation of BWR Bundle Thermal-Hydraulic Calculations

    Hernandez-Solis, Augusto

    2010-04-01

    This work has two main objectives. The first is to enhance the validation process of the thermal-hydraulic features of the Westinghouse code POLCA-T. This is achieved by computing a quantitative validation limit based on statistical uncertainty analysis. This validation theory is applied to some of the benchmark cases of the following macroscopic BFBT exercises: 1) single- and two-phase bundle pressure drops, 2) steady-state cross-sectional averaged void fraction, 3) transient cross-sectional averaged void fraction, and 4) steady-state critical power tests. Sensitivity analysis is also performed to identify the most important uncertain parameters for each exercise. The second objective consists of showing the clear advantages of using the quasi-random Latin Hypercube Sampling (LHS) strategy over simple random sampling (SRS). LHS allows a much better coverage of the input uncertainties than SRS because it densely stratifies across the range of each input probability distribution. The aim here is to compare both uncertainty analyses on the BWR assembly void axial profile prediction in steady-state, and on the transient void fraction prediction at a certain axial level coming from a simulated re-circulation pump trip scenario. It is shown that the replicated void fraction mean (either in steady-state or transient conditions) has less variability when using LHS than SRS for the same number of calculations (i.e. the same input space sample size), even if the resulting void fraction axial profiles are non-monotonic. It is also shown that the void fraction uncertainty limits achieved with SRS by running 458 calculations (the sample size required to cover 95% of 8 uncertain input parameters with a 95% confidence) are reproduced by LHS with only 100 calculations. These are thus clear indications of the advantages of using LHS. Finally, the present study contributes to a realistic analysis of nuclear reactors, in the sense that the uncertainties of

  20. Uncertainty analysis of the Operational Simplified Surface Energy Balance (SSEBop) model at multiple flux tower sites

    Chen, Mingshi; Senay, Gabriel B.; Singh, Ramesh K.; Verdin, James P.

    2016-01-01

    Evapotranspiration (ET) is an important component of the water cycle – ET from the land surface returns approximately 60% of the global precipitation back to the atmosphere. ET also plays an important role in energy transport among the biosphere, atmosphere, and hydrosphere. Current regional to global and daily to annual ET estimation relies mainly on surface energy balance (SEB) ET models or statistical and empirical methods driven by remote sensing data and various climatological databases. These models have uncertainties due to inevitable input errors, poorly defined parameters, and inadequate model structures. The eddy covariance measurements on water, energy, and carbon fluxes at the AmeriFlux tower sites provide an opportunity to assess the ET modeling uncertainties. In this study, we focused on uncertainty analysis of the Operational Simplified Surface Energy Balance (SSEBop) model for ET estimation at multiple AmeriFlux tower sites with diverse land cover characteristics and climatic conditions. The 8-day composite 1-km MODerate resolution Imaging Spectroradiometer (MODIS) land surface temperature (LST) was used as input land surface temperature for the SSEBop algorithms. The other input data were taken from the AmeriFlux database. Results of statistical analysis indicated that the SSEBop model performed well in estimating ET with an R2 of 0.86 between estimated ET and eddy covariance measurements at 42 AmeriFlux tower sites during 2001–2007. It was encouraging to see that the best performance was observed for croplands, where R2 was 0.92 with a root mean square error of 13 mm/month. The uncertainties or random errors from input variables and parameters of the SSEBop model led to monthly ET estimates with relative errors less than 20% across multiple flux tower sites distributed across different biomes. This uncertainty of the SSEBop model lies within the error range of other SEB models, suggesting systematic error or bias of the SSEBop model is within

  1. Nuclear data uncertainty analysis for the generation IV gas-cooled fast reactor

    Pelloni, S.; Mikityuk, K.

    2012-01-01

    For the European 2400 MW Gas-cooled Fast Reactor (GoFastR), this paper summarizes a priori uncertainties, i.e. without any integral experiment assessment, of the main neutronic parameters, obtained on the basis of the deterministic code system ERANOS (Edition 2.2-N). JEFF-3.1 cross-sections were used in conjunction with the newest ENDF/B-VII.0 based covariance library (COMMARA-2.0), resulting from a recent cooperation of the Brookhaven and Los Alamos National Laboratories within the Advanced Fuel Cycle Initiative. The basis for the analysis is the original GoFastR concept with carbide fuel pins and silicon-carbide ceramic cladding, which was developed and proposed in the first quarter of 2009 by the French Alternative Energies and Atomic Energy Commission, CEA. The main conclusions from the current study are that nuclear data uncertainties of neutronic parameters may still be too large for this Generation IV reactor, especially concerning the multiplication factor, despite the fact that the new covariance library is quite complete. These uncertainties, in relative terms, do not show the a priori expected increase with burn-up as a result of the minor actinide and fission product build-up. Indeed, they are found to be almost independent of fuel depletion, since the uncertainty associated with ²³⁸U inelastic scattering is largely dominant. This finding clearly supports the activities of Subgroup 33 of the Working Party on International Nuclear Data Evaluation Cooperation (WPEC), i.e. methods and issues for the combined use of integral experiments and covariance data, attempting to reduce the present unbiased uncertainties on nuclear data through adjustments based on available experimental data. (authors)
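
    A priori uncertainties of this kind are typically computed with the "sandwich rule", var(R) = SᵀCS, combining a sensitivity vector S with a covariance matrix C of the nuclear data. The three-reaction example below is entirely invented and only illustrates the mechanics, including how per-reaction contributions reveal a dominant datum.

```python
import numpy as np

# minimal sketch of the sandwich rule: var(k) = S^T C S, where S holds
# relative sensitivities of k-eff to nuclear data and C is their relative
# covariance matrix; all numbers below are invented for illustration
S = np.array([0.30, -0.45, 0.12])      # d(ln k)/d(ln sigma) per reaction
rel_sd = np.array([0.05, 0.15, 0.08])  # relative data uncertainties
corr = np.array([[1.0, 0.2, 0.0],
                 [0.2, 1.0, 0.1],
                 [0.0, 0.1, 1.0]])     # correlations between the data
C = corr * np.outer(rel_sd, rel_sd)    # relative covariance matrix

var_k = S @ C @ S                       # sandwich rule
print(f"relative uncertainty on k-eff: {np.sqrt(var_k):.3%}")

# per-reaction contributions show which datum dominates, mirroring the
# dominance of 238U inelastic scattering reported in the paper
for i, name in enumerate(["fission", "inelastic", "capture"]):
    print(f"{name:10s} contribution: {abs(S[i]) * np.sqrt(C[i, i]):.3%}")
```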

  2. Tolerance of uncertainty: Conceptual analysis, integrative model, and implications for healthcare.

    Hillen, Marij A; Gutheil, Caitlin M; Strout, Tania D; Smets, Ellen M A; Han, Paul K J

    2017-05-01

    Uncertainty tolerance (UT) is an important, well-studied phenomenon in health care and many other important domains of life, yet its conceptualization and measurement by researchers in various disciplines have varied substantially and its essential nature remains unclear. The objectives of this study were to: 1) analyze the meaning and logical coherence of UT as conceptualized by developers of UT measures, and 2) develop an integrative conceptual model to guide future empirical research regarding the nature, causes, and effects of UT. A narrative review and conceptual analysis of 18 existing measures of uncertainty and ambiguity tolerance was conducted, focusing on how measure developers in various fields have defined both the "uncertainty" and "tolerance" components of UT, both explicitly through their writings and implicitly through the items constituting their measures. Both explicit and implicit conceptual definitions of uncertainty and tolerance vary substantially and are often poorly and inconsistently specified. A logically coherent, unified understanding or theoretical model of UT is lacking. To address these gaps, we propose a new integrative definition and multidimensional conceptual model that construes UT as the set of negative and positive psychological responses (cognitive, emotional, and behavioral) provoked by the conscious awareness of ignorance about particular aspects of the world. This model synthesizes insights from various disciplines and provides an organizing framework for future research. We discuss how this model can facilitate further empirical and theoretical research to better measure and understand the nature, determinants, and outcomes of UT in health care and other domains of life. Uncertainty tolerance is an important and complex phenomenon requiring more precise and consistent definition. An integrative definition and conceptual model, intended as a tentative and flexible point of departure for future research, adds needed breadth

  3. Bayesian analysis of stage-fall-discharge rating curves and their uncertainties

    Mansanarez, Valentin; Le Coz, Jérôme; Renard, Benjamin; Lang, Michel; Pierrefeu, Gilles; Le Boursicaud, Raphaël; Pobanz, Karine

    2016-04-01

    Stage-fall-discharge (SFD) rating curves are traditionally used to compute streamflow records at sites where the energy slope of the flow is variable due to variable backwater effects. Building on existing Bayesian approaches, we introduce an original hydraulics-based method for developing SFD rating curves used at twin gauge stations and estimating their uncertainties. Conventional power functions for channel and section controls are used, and the transition to a backwater-affected channel control is computed based on a continuity condition, solved either analytically or numerically. The difference between the reference levels at the two stations is estimated as another uncertain parameter of the SFD model. The method proposed here incorporates information from both hydraulic knowledge (equations of channel or section controls) and the information available in the stage-fall-discharge observations (gauging data). The obtained total uncertainty combines the parametric uncertainty and the remnant uncertainty related to the rating curve model. This method provides a direct estimation of the physical inputs of the rating curve (roughness, width, bed slope, distance between twin gauges, etc.). The performance of the new method is tested using an application case affected by the variable backwater of a run-of-the-river dam: the Rhône river at Valence, France. In particular, a sensitivity analysis to the prior information and to the gauging dataset is performed. At that site, the stage-fall-discharge domain is well documented with gaugings conducted over a range of backwater-affected and unaffected conditions. The performance of the new model was deemed satisfactory. Notably, the transition to uniform flow when the overall range of the auxiliary stage is gauged is correctly simulated. The resulting curves are in good agreement with the observations (gaugings) and their uncertainty envelopes are acceptable for computing streamflow records. Similar
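
    The Bayesian estimation itself can be sketched with a random-walk Metropolis sampler. The model below is a plain single-control power-law rating curve Q = a(h - b)^c with multiplicative gauging errors, which omits the fall term of a full SFD model; all synthetic gauging values are invented.

```python
import numpy as np

rng = np.random.default_rng(8)

# synthetic gaugings for a single-control power-law rating curve
a_t, b_t, c_t = 30.0, 0.2, 1.7
h = rng.uniform(0.5, 3.0, 40)
q_obs = a_t * (h - b_t)**c_t * np.exp(rng.normal(0, 0.05, h.size))

def log_post(theta):
    """Log-posterior with flat priors restricted to the valid support."""
    a, b, c, s = theta
    if a <= 0 or c <= 0 or s <= 0 or b >= h.min():
        return -np.inf
    resid = np.log(q_obs) - np.log(a * (h - b)**c)
    return -h.size * np.log(s) - 0.5 * np.sum(resid**2) / s**2

theta = np.array([20.0, 0.0, 1.5, 0.1])     # starting point
chain, logp = [], log_post(theta)
for _ in range(20_000):                     # random-walk Metropolis
    prop = theta + rng.normal(0, [0.5, 0.02, 0.02, 0.005])
    lp = log_post(prop)
    if np.log(rng.random()) < lp - logp:
        theta, logp = prop, lp
    chain.append(theta)
chain = np.array(chain[5000:])              # discard burn-in
for name, col in zip("abcs", chain.T):
    print(f"{name}: {col.mean():.2f} +/- {col.std(ddof=1):.2f}")
```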

  4. Use of Quantitative Uncertainty Analysis to Support M&V Decisions in ESPCs

    Mathew, Paul A.; Koehling, Erick; Kumar, Satish

    2005-05-11

    Measurement and Verification (M&V) is a critical element of an Energy Savings Performance Contract (ESPC): without M&V, there is no way to confirm that the projected savings in an ESPC are in fact being realized. For any given energy conservation measure in an ESPC, there are usually several M&V choices, which will vary in terms of measurement uncertainty, cost, and technical feasibility. Typically, M&V decisions are made almost solely based on engineering judgment and experience, with little, if any, quantitative uncertainty analysis (QUA). This paper describes the results of a pilot project initiated by the Department of Energy's Federal Energy Management Program to explore the use of Monte-Carlo simulation to assess savings uncertainty and thereby augment the M&V decision-making process in ESPCs. The intent was to use QUA selectively in combination with heuristic knowledge, in order to obtain quantitative estimates of the savings uncertainty without the burden of a comprehensive "bottom-up" QUA. This approach was used to analyze the savings uncertainty in an ESPC for a large federal agency. The QUA was seamlessly integrated into the ESPC development process and the incremental effort was relatively small, with user-friendly tools that are commercially available. As the case study illustrates, in some cases the QUA simply confirms intuitive or qualitative information, while in other cases it provides insight that suggests revisiting the M&V plan. The case study also showed that M&V decisions should be informed by portfolio risk diversification. By providing quantitative uncertainty information, QUA can effectively augment the M&V decision-making process as well as the overall ESPC financial analysis.
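
    In code, a Monte-Carlo savings QUA amounts to sampling each measured quantity from its uncertainty distribution and propagating it through the savings equation. The retrofit figures below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 50_000

# hypothetical lighting retrofit: savings = hours * (kW_before - kW_after)
# each term carries measurement uncertainty from the chosen M&V option
hours = rng.normal(4000, 200, n)        # annual operating hours
kw_before = rng.normal(120.0, 3.0, n)   # metered baseline demand [kW]
kw_after = rng.normal(80.0, 2.0, n)     # metered post-retrofit demand [kW]
price = 0.11                            # $/kWh, assumed firm

savings = hours * (kw_before - kw_after) * price
lo, hi = np.percentile(savings, [5, 95])
print(f"expected savings ${savings.mean():,.0f}/yr, "
      f"90% interval [${lo:,.0f}, ${hi:,.0f}]")
# if the guaranteed savings level falls inside this interval, the M&V
# plan may warrant a lower-uncertainty (usually costlier) option
```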

  5. Sensitivity/uncertainty analysis for free-in-air tissue kerma due to initial radiation at Hiroshima and Nagasaki

    Lillie, R.A.; Broadhead, B.L.; Pace, J.V. III

    1988-01-01

    Uncertainty estimates and cross correlations by range/survivor have been calculated for the Hiroshima and Nagasaki free-in-air (FIA) tissue kerma obtained from two-dimensional air/ground transport calculations. The uncertainties due to modeling parameter and basic nuclear transport data uncertainties were calculated for 700-, 1000-, and 1500-m ground ranges. Only the FIA tissue kerma due to initial radiation was treated in the analysis; the uncertainties associated with terrain and building shielding and phantom attenuation were not considered in this study. Uncertainties of ~20% were obtained for the prompt neutron and secondary gamma kerma and 30% for the prompt gamma kerma at both cities. The uncertainties on the total prompt kerma at Hiroshima and Nagasaki are ~18% and 15%, respectively. The estimated uncertainties vary only slightly by ground range and are fairly highly correlated. The total prompt kerma uncertainties are dominated by the secondary gamma uncertainties, which in turn are dominated by the modeling parameter uncertainties, particularly those associated with the weapon yield and radiation sources.

  6. A cross-section sensitivity and uncertainty analysis on fusion reactor blankets with SAD/SED effect

    Furuta, Kazuo; Oka, Yoshiaki; Kondo, Shunsuke

    1986-01-01

    A cross-section sensitivity and uncertainty analysis on four types of fusion reactor blankets has been performed, based on cross-section covariance matrices. The design parameters investigated in the analysis include the tritium breeding ratio, the neutron heating and the fast neutron leakage flux from the inboard shield. Uncertainties in the Secondary Angular Distribution (SAD) and Secondary Energy Distribution (SED) of scattered neutrons have been considered for lithium. The collective standard deviation, due to uncertainties in the evaluated cross-section data presently available, is 2-4% in the tritium breeding ratio, 2-3% in the neutron heating, and 10-20% in the fast neutron leakage flux. Contributions from SAD/SED uncertainties are significant for some parameters, such as those investigated in the present study. SAD/SED uncertainties should be considered in the sensitivity and uncertainty analysis of the nuclear design of fusion reactors. (orig.)

  7. Measuring and explaining eco-efficiencies of wastewater treatment plants in China: An uncertainty analysis perspective.

    Dong, Xin; Zhang, Xinyi; Zeng, Siyu

    2017-04-01

    In the context of sustainable development, there has been an increasing requirement for eco-efficiency assessment of wastewater treatment plants (WWTPs). Data envelopment analysis (DEA), a technique that is widely applied for relative efficiency assessment, is used in combination with the tolerances approach to handle WWTPs' multiple inputs and outputs as well as their uncertainty. The economic cost, energy consumption, contaminant removal, and global warming effect during the treatment processes are integrated to interpret the eco-efficiency of WWTPs. A total of 736 sample plants from across China are assessed, and large sensitivities to variations in inputs and outputs are observed for most samples, with only three WWTPs identified as being stably efficient. Size of plant, overcapacity, climate type, and influent characteristics are proven to have a significant influence on both the mean efficiency and the performance sensitivity of WWTPs, while no clear relationship was found between eco-efficiency and technology under the framework of uncertainty analysis. The incorporation of uncertainty quantification and environmental impact considerations has improved the reliability and applicability of the assessment.
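
    The DEA efficiency scores underlying such a study can be obtained from a small linear program per plant. The sketch below solves the input-oriented CCR model in multiplier form with scipy; the four plants and their input/output data are invented, and the paper's tolerances-based uncertainty treatment would re-solve these LPs under perturbed data.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of DMU o (multiplier form).
    X: (n, m) inputs, Y: (n, s) outputs. Variables are [u, v] >= 0."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.concatenate([-Y[o], np.zeros(m)])           # maximize u . y_o
    A_ub = np.hstack([Y, -X])                          # u.y_j - v.x_j <= 0
    A_eq = np.concatenate([np.zeros(s), X[o]])[None]   # v . x_o = 1
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n),
                  A_eq=A_eq, b_eq=[1.0], bounds=(0, None), method="highs")
    return -res.fun

# hypothetical WWTPs: inputs = [cost, energy], outputs = [COD removed]
X = np.array([[100.0, 50.0], [120.0, 40.0], [80.0, 60.0], [90.0, 45.0]])
Y = np.array([[300.0], [280.0], [310.0], [250.0]])
for o in range(X.shape[0]):
    print(f"WWTP {o}: CCR efficiency {ccr_efficiency(X, Y, o):.3f}")
```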

  8. On uncertainty and local sensitivity analysis for transient conjugate heat transfer problems

    Rauch, Christian

    2012-01-01

    The need for simulating the real-world behavior of automobiles has led to more and more sophisticated models of various physical phenomena being coupled together. This increases the number of parameters to be set and, consequently, the required knowledge of their relative importance for the solution and of the theory behind them. Sensitivity and uncertainty analysis provides that knowledge of parameter importance. In this paper a thermal radiation solver is considered that performs conduction calculations and receives the heat transfer coefficient and fluid temperature at each thermal node. The equations of local, discrete, and transient sensitivities for the conjugate heat transfer model solved by the finite difference method are derived for some parameters. In the past, formulations for the finite element method have been published. This paper builds on the steady-state formulation published previously by the author. A numerical analysis of the stability of the solution matrix is conducted. From the normalized sensitivity coefficients, dimensionless uncertainty factors are calculated. On a simplified example, the relative importance of the heat transfer modes at various locations is then investigated through those uncertainty factors and their changes over time.

  9. Parameterization and Uncertainty Analysis of SWAT model in Hydrological Simulation of Chaohe River Basin

    Jie, M.; Zhang, J.; Guo, B. B.

    2017-12-01

    As a typical distributed hydrological model, SWAT poses challenges in parameter calibration and uncertainty analysis. This paper takes the Chaohe River Basin, China, as the study area. After the SWAT model is established and the DEM data of the basin are loaded, the watershed is automatically divided into several sub-basins. The land use, soil, and slope are analyzed on the basis of the sub-basins to delineate the hydrological response units (HRUs) of the study area; after running the SWAT model, simulated runoff values for the watershed are obtained. On this basis, weather data and the known daily runoff of three hydrological stations are used, together with the SWAT-CUP automatic calibration program and manual adjustment, for multi-site calibration of the model parameters. Furthermore, the GLUE algorithm is used to analyze the parameter uncertainty of the SWAT model. The sensitivity analysis, calibration, and uncertainty study indicate that the parameterization of the hydrological characteristics of the Chaohe river is successful and feasible, and the model can be used to simulate the Chaohe river basin.
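
    The GLUE step can be demonstrated end-to-end on a toy model. Below, a single linear reservoir stands in for SWAT, Nash-Sutcliffe efficiency is the likelihood measure, and parameter sets scoring above a behavioral threshold define the parameter and prediction uncertainty; all numbers are synthetic.

```python
import numpy as np

rng = np.random.default_rng(4)

def toy_runoff(rain, k):
    """Single linear reservoir, a toy stand-in for the SWAT model."""
    q, store = np.zeros_like(rain), 0.0
    for t, r in enumerate(rain):
        store += r
        q[t] = k * store
        store -= q[t]
    return q

rain = rng.gamma(2.0, 2.0, 200)
q_obs = toy_runoff(rain, 0.35) + rng.normal(0, 0.3, rain.size)

# GLUE: sample parameters, score each set with a likelihood measure
# (Nash-Sutcliffe efficiency here), keep "behavioral" sets above a
# threshold, and summarize predictions over the retained ensemble
ks = rng.uniform(0.05, 0.95, 5000)
nse = np.array([1 - np.sum((q_obs - toy_runoff(rain, k))**2)
                / np.sum((q_obs - q_obs.mean())**2) for k in ks])
behavioral = nse > 0.6
preds = np.array([toy_runoff(rain, k) for k in ks[behavioral]])
band = np.percentile(preds, [5, 95], axis=0)
print(f"{behavioral.sum()} behavioral sets, "
      f"k in [{ks[behavioral].min():.2f}, {ks[behavioral].max():.2f}], "
      f"mean 5-95% band width {np.mean(band[1] - band[0]):.2f}")
```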

  10. Sensitivity and Uncertainty Analysis for coolant void reactivity in a CANDU Fuel Lattice Cell Model

    Yoo, Seung Yeol; Shim, Hyung Jin [Seoul National University, Seoul (Korea, Republic of)]

    2016-10-15

    In this study, the EPBM is implemented in the Seoul National University Monte Carlo (MC) code McCARD, which has a k-eigenvalue uncertainty evaluation capability based on the adjoint-weighted perturbation (AWP) method. The implementation is verified by comparing the sensitivities of the k-eigenvalue difference to the microscopic cross sections computed by the EPBM with those obtained by direct subtraction for the TMI-1 pin-cell problem. The uncertainty of the coolant void reactivity (CVR) in a CANDU fuel lattice model due to the ENDF/B-VII.1 covariance data is calculated from its sensitivities estimated by the EPBM. The method, based on eigenvalue perturbation theory (EPBM), utilizes the first-order adjoint-weighted perturbation (AWP) technique to estimate the sensitivity of the eigenvalue difference. Furthermore, this method can be easily applied in an S/U analysis code system equipped with an eigenvalue sensitivity calculation capability. The EPBM is implemented in the McCARD code and verified by showing good agreement with the reference solution. The McCARD S/U analysis has then been performed with the EPBM module for the CVR in the CANDU fuel lattice problem. It shows that the uncertainty contributions of ν of ²³⁵U and the γ reaction of ²³⁸U are dominant.

  11. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-01-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchy Process (AHP) and Ordered Weighted Averaging (OWA), implemented in GIS. The methodology is composed of three phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility are analyzed as a function of weights using Monte Carlo simulation and global sensitivity analysis. Finally, the results are validated using a landslide inventory database and by applying DST. Comparison of the obtained landslide susceptibility maps of both MCDA techniques with known landslides shows that AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights.
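
    The AHP weighting step in such a workflow reduces to an eigenvector computation. The pairwise comparison matrix below is invented (imagine three landslide factors such as slope, lithology, and land cover); the principal eigenvector gives the criteria weights, and Saaty's consistency ratio checks the judgments. A Monte Carlo uncertainty analysis, as in the paper, would perturb these weights and rerun the susceptibility model.

```python
import numpy as np

# invented pairwise comparison matrix for three criteria (Saaty scale)
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
i = np.argmax(eigvals.real)            # Perron (principal) eigenvalue
w = np.abs(eigvecs[:, i].real)
w /= w.sum()                           # normalized criteria weights
print("criteria weights:", np.round(w, 3))

# consistency check: CI = (lambda_max - n)/(n - 1), CR = CI / RI
n = A.shape[0]
lam = eigvals.real[i]
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]    # Saaty's random index
cr = (lam - n) / (n - 1) / ri
print(f"lambda_max {lam:.3f}, consistency ratio {cr:.3f} (ok if < 0.1)")
```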

  12. How uncertainty analysis of streamflow data can reduce costs and promote robust decisions in water management applications

    McMillan, Hilary; Seibert, Jan; Petersen-Overleir, Asgeir; Lang, Michel; White, Paul; Snelder, Ton; Rutherford, Kit; Krueger, Tobias; Mason, Robert; Kiang, Julie

    2017-07-01

    Streamflow data are used for important environmental and economic decisions, such as specifying and regulating minimum flows, managing water supplies, and planning for flood hazards. Despite significant uncertainty in most flow data, the flow series for these applications are often communicated and used without uncertainty information. In this commentary, we argue that proper analysis of uncertainty in river flow data can reduce costs and promote robust conclusions in water management applications. We substantiate our argument by providing case studies from Norway and New Zealand where streamflow uncertainty analysis has uncovered economic costs in the hydropower industry, improved public acceptance of a controversial water management policy, and tested the accuracy of water quality trends. We discuss the need for practical uncertainty assessment tools that generate multiple flow series realizations rather than simple error bounds. Although examples of such tools are in development, considerable barriers for uncertainty analysis and communication still exist for practitioners, and future research must aim to provide easier access and usability of uncertainty estimates. We conclude that flow uncertainty analysis is critical for good water management decisions.

  13. An approach to multi-attribute utility analysis under parametric uncertainty

    Kelly, M.; Thorne, M.C.

    2001-01-01

    The techniques of cost-benefit analysis and multi-attribute analysis provide a useful basis for informing decisions in situations where a number of potentially conflicting opinions or interests need to be considered, and where there are a number of possible decisions that could be adopted. When the input data to such decision-making processes are uniquely specified, cost-benefit analysis and multi-attribute utility analysis provide unambiguous guidance on the preferred decision option. However, when the data are not uniquely specified, application and interpretation of these techniques is more complex. Herein, an approach to multi-attribute utility analysis (and hence, as a special case, cost-benefit analysis) when input data are subject to parametric uncertainty is presented. The approach is based on the use of a Monte Carlo technique, and has recently been applied to options for the remediation of former uranium mining liabilities in a number of Central and Eastern European states.
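
    A minimal Monte Carlo multi-attribute utility analysis in this spirit follows. The remediation options, attribute distributions, weights, and linear single-attribute utilities are all assumptions for illustration; the output is the expected utility of each option and the probability that it is preferred across the parametric uncertainty.

```python
import numpy as np

rng = np.random.default_rng(12)
n = 20_000

# hypothetical remediation options scored on three attributes:
# cost [M$] (a dis-benefit), dose averted [man.Sv], land released [ha];
# triangular distributions express parametric uncertainty in each
options = {
    "cap in place":    [(8, 10, 14), (40, 60, 75), (5, 8, 10)],
    "partial removal": [(12, 16, 24), (55, 75, 90), (10, 14, 18)],
    "full removal":    [(20, 28, 40), (70, 90, 100), (18, 22, 25)],
}
w = np.array([0.4, 0.4, 0.2])          # attribute weights (assumed)

def utility(cost, dose, land):
    """Additive multi-attribute utility with linear single-attribute
    utilities scaled to rough attribute ranges (an assumption)."""
    return w @ np.array([1 - cost / 40.0, dose / 100.0, land / 25.0])

us = {}
for name, attrs in options.items():
    samples = [rng.triangular(*t, n) for t in attrs]
    us[name] = utility(*samples)

U = np.column_stack(list(us.values()))
best = np.argmax(U, axis=1)            # preferred option per realization
for j, name in enumerate(us):
    print(f"{name:16s} E[U] {U[:, j].mean():.3f}  "
          f"P(preferred) {np.mean(best == j):.2f}")
```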