WorldWideScience

Sample records for analysis uncertainty

  1. Uncertainty analysis guide

    International Nuclear Information System (INIS)

    Andres, T.H.

    2002-05-01

    This guide applies to the estimation of uncertainty in quantities calculated by scientific, analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner by which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)
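
    As a concrete illustration of the propagation options the guide names (exact calculation, series approximation, Monte Carlo), here is a minimal Monte Carlo sketch; the model function, the input distributions, and the systematic-uncertainty estimate are invented placeholders, not AECL's.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def model(x1, x2):
        # Hypothetical calculated quantity standing in for a code output.
        return x1**2 + np.exp(0.1 * x2)

    # Input uncertainties: assumed normal distributions (illustrative values).
    n = 100_000
    x1 = rng.normal(2.0, 0.1, n)
    x2 = rng.normal(5.0, 0.5, n)

    y = model(x1, x2)

    # Random (input-driven) uncertainty of the model output.
    u_random = y.std(ddof=1)
    print(f"mean = {y.mean():.4f}, random uncertainty = {u_random:.4f}")

    # A systematic (model-form) uncertainty, estimated separately, is then
    # combined with the random part, e.g. in quadrature.
    u_systematic = 0.05  # assumed estimate from simplifications/conservatisms
    print(f"combined uncertainty = {np.hypot(u_random, u_systematic):.4f}")
    ```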

  2. Uncertainty analysis guide

    Energy Technology Data Exchange (ETDEWEB)

    Andres, T.H.

    2002-05-01

    This guide applies to the estimation of uncertainty in quantities calculated by scientific, analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner by which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)

  3. Reliability analysis under epistemic uncertainty

    International Nuclear Information System (INIS)

    Nannapaneni, Saideep; Mahadevan, Sankaran

    2016-01-01

    This paper proposes a probabilistic framework to include both aleatory and epistemic uncertainty within model-based reliability estimation of engineering systems for individual limit states. Epistemic uncertainty is considered due to both data and model sources. Sparse point and/or interval data regarding the input random variables leads to uncertainty regarding their distribution types, distribution parameters, and correlations; this statistical uncertainty is included in the reliability analysis through a combination of likelihood-based representation, Bayesian hypothesis testing, and Bayesian model averaging techniques. Model errors, which include numerical solution errors and model form errors, are quantified through Gaussian process models and included in the reliability analysis. The probability integral transform is used to develop an auxiliary variable approach that facilitates a single-level representation of both aleatory and epistemic uncertainty. This strategy results in an efficient single-loop implementation of Monte Carlo simulation (MCS) and FORM/SORM techniques for reliability estimation under both aleatory and epistemic uncertainty. Two engineering examples are used to demonstrate the proposed methodology. - Highlights: • Epistemic uncertainty due to data and model included in reliability analysis. • A novel FORM-based approach proposed to include aleatory and epistemic uncertainty. • A single-loop Monte Carlo approach proposed to include both types of uncertainties. • Two engineering examples used for illustration.

  4. Deterministic uncertainty analysis

    International Nuclear Information System (INIS)

    Worley, B.A.

    1987-01-01

    Uncertainties of computer results are of primary interest in applications such as high-level waste (HLW) repository performance assessment in which experimental validation is not possible or practical. This work presents an alternate deterministic approach for calculating uncertainties that has the potential to significantly reduce the number of computer runs required for conventional statistical analysis. 7 refs., 1 fig.

  5. Sensitivity and uncertainty analysis

    CERN Document Server

    Cacuci, Dan G; Navon, Ionel Michael

    2005-01-01

    As computer-assisted modeling and analysis of physical processes have continued to grow and diversify, sensitivity and uncertainty analyses have become indispensable scientific tools. Sensitivity and Uncertainty Analysis. Volume I: Theory focused on the mathematical underpinnings of two important methods for such analyses: the Adjoint Sensitivity Analysis Procedure and the Global Adjoint Sensitivity Analysis Procedure. This volume concentrates on the practical aspects of performing these analyses for large-scale systems. The applications addressed include two-phase flow problems, a radiative c

  6. Deterministic uncertainty analysis

    International Nuclear Information System (INIS)

    Worley, B.A.

    1987-12-01

    This paper presents a deterministic uncertainty analysis (DUA) method for calculating uncertainties that has the potential to significantly reduce the number of computer runs compared to conventional statistical analysis. The method is based upon the availability of derivative and sensitivity data such as that calculated using the well known direct or adjoint sensitivity analysis techniques. Formation of response surfaces using derivative data and the propagation of input probability distributions are discussed relative to their role in the DUA method. A sample problem that models the flow of water through a borehole is used as a basis to compare the cumulative distribution function of the flow rate as calculated by the standard statistical methods and the DUA method. Propagation of uncertainties by the DUA method is compared for ten cases in which the number of reference model runs was varied from one to ten. The DUA method gives a more accurate representation of the true cumulative distribution of the flow rate based upon as few as two model executions compared to fifty model executions using a statistical approach. 16 refs., 4 figs., 5 tabs
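
    In the spirit of the DUA idea (reuse derivative information instead of many code runs), a first-order propagation sketch is shown below on the borehole flow function that is standard in the sensitivity-analysis literature. The nominal values, the assumed 5% input uncertainties, and the finite-difference derivatives (standing in for direct/adjoint sensitivities) are illustrative assumptions, not the report's data.

    ```python
    import numpy as np

    def borehole(rw, r, Tu, Hu, Tl, Hl, L, Kw):
        """Water flow rate through a borehole (common test function)."""
        lnr = np.log(r / rw)
        return (2 * np.pi * Tu * (Hu - Hl)) / (
            lnr * (1 + 2 * L * Tu / (lnr * rw**2 * Kw) + Tu / Tl)
        )

    # Nominal inputs and assumed standard deviations (illustrative only).
    x0 = np.array([0.10, 25050.0, 89335.0, 1050.0, 89.55, 760.0, 1400.0, 10950.0])
    sx = 0.05 * x0  # assume 5% relative uncertainty on each input

    # Central finite-difference gradient at the nominal point.
    grad = np.empty_like(x0)
    for i in range(len(x0)):
        h = 1e-5 * x0[i]
        xp, xm = x0.copy(), x0.copy()
        xp[i] += h
        xm[i] -= h
        grad[i] = (borehole(*xp) - borehole(*xm)) / (2 * h)

    # First-order propagated output standard deviation (independent inputs).
    sigma_y = np.sqrt(np.sum((grad * sx) ** 2))
    print(f"flow = {borehole(*x0):.2f}, first-order sigma = {sigma_y:.2f}")
    ```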

  7. Uncertainty analysis of environmental models

    International Nuclear Information System (INIS)

    Monte, L.

    1990-01-01

    In the present paper an evaluation of the output uncertainty of an environmental model for assessing the transfer of ¹³⁷Cs and ¹³¹I in the human food chain is carried out on the basis of a statistical analysis of data reported in the literature. The uncertainty analysis offers the opportunity of obtaining some remarkable information about the uncertainty of models predicting the migration of non-radioactive substances in the environment, mainly in relation to dry and wet deposition.

  8. CEC/USDOE workshop on uncertainty analysis

    International Nuclear Information System (INIS)

    Elderkin, C.E.; Kelly, G.N.

    1990-07-01

    Any measured or assessed quantity contains uncertainty. The quantitative estimation of such uncertainty is becoming increasingly important, especially in assuring that safety requirements are met in design, regulation, and operation of nuclear installations. The CEC/USDOE Workshop on Uncertainty Analysis, held in Santa Fe, New Mexico, on November 13 through 16, 1989, was organized jointly by the Commission of the European Communities' (CEC) Radiation Protection Research program, dealing with uncertainties throughout the field of consequence assessment, and DOE's Atmospheric Studies in Complex Terrain (ASCOT) program, concerned with the particular uncertainties in time- and space-variant transport and dispersion. The workshop brought together US and European scientists who have been developing or applying uncertainty analysis methodologies, conducted in a variety of contexts, often with incomplete knowledge of the work of others in this area. Thus, it was timely to exchange views and experience, identify limitations of approaches to uncertainty and possible improvements, and enhance the interface between developers and users of uncertainty analysis methods. Furthermore, the workshop considered the extent to which consistent, rigorous methods could be used in various applications within consequence assessment. 3 refs

  9. Applied research in uncertainty modeling and analysis

    CERN Document Server

    Ayyub, Bilal

    2005-01-01

    Uncertainty has been a concern to engineers, managers, and scientists for many years. For a long time uncertainty has been considered synonymous with random, stochastic, statistical, or probabilistic. Since the early sixties views on uncertainty have become more heterogeneous. In the past forty years numerous tools that model uncertainty, above and beyond statistics, have been proposed by several engineers and scientists. The tool/method to model uncertainty in a specific context should really be chosen by considering the features of the phenomenon under consideration, not independently of what is known about the system and what causes uncertainty. In this fascinating overview of the field, the authors provide broad coverage of uncertainty analysis/modeling and its application. Applied Research in Uncertainty Modeling and Analysis presents the perspectives of various researchers and practitioners on uncertainty analysis and modeling outside their own fields and domain expertise. Rather than focusing explicitly on...

  10. Uncertainty analysis of the FRAP code

    International Nuclear Information System (INIS)

    Peck, S.O.

    1978-01-01

    A user-oriented, automated uncertainty analysis capability has been built into the FRAP code (Fuel Rod Analysis Program) and applied to a PWR fuel rod undergoing a LOCA. The method of uncertainty analysis is the Response Surface Method (RSM). (author)

  11. Uncertainty analysis of the FRAP code

    International Nuclear Information System (INIS)

    Peck, S.O.

    1978-01-01

    A user-oriented, automated uncertainty analysis capability has been built into the Fuel Rod Analysis Program (FRAP) code and has been applied to a pressurized water reactor (PWR) fuel rod undergoing a loss-of-coolant accident (LOCA). The method of uncertainty analysis is the response surface method. The automated version significantly reduced the time required to complete the analysis and, at the same time, greatly increased the problem scope. Results of the analysis showed a significant difference in the total and relative contributions to the uncertainty of the response parameters between steady state and transient conditions
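
    A minimal sketch of the response surface idea used here: a handful of deliberate code runs at design points, a fitted second-order polynomial, and cheap Monte Carlo sampling of the surrogate. The stand-in "code" and its coefficients are invented for illustration; FRAP itself is far more complex.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def expensive_code(t):
        # Stand-in for a FRAP-like run; one normalized input for brevity.
        return 1200.0 + 80.0 * t + 15.0 * t**2  # e.g. a peak temperature

    # 1. A few deliberate code runs at design points.
    design = np.linspace(-2.0, 2.0, 5)
    responses = np.array([expensive_code(t) for t in design])

    # 2. Fit a second-order response surface to the runs.
    surface = np.poly1d(np.polyfit(design, responses, deg=2))

    # 3. Propagate the input distribution through the cheap surrogate.
    t_samples = rng.normal(0.0, 1.0, 200_000)
    y = surface(t_samples)
    print(f"mean = {y.mean():.1f}, std = {y.std(ddof=1):.1f}")
    print(f"95% band = ({np.percentile(y, 2.5):.1f}, {np.percentile(y, 97.5):.1f})")
    ```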

  12. Uncertainty as Knowledge: Constraints on Policy Choices Provided by Analysis of Uncertainty

    Science.gov (United States)

    Lewandowsky, S.; Risbey, J.; Smithson, M.; Newell, B. R.

    2012-12-01

    Uncertainty forms an integral part of climate science, and it is often cited in connection with arguments against mitigative action. We argue that an analysis of uncertainty must consider existing knowledge as well as uncertainty, and the two must be evaluated with respect to the outcomes and risks associated with possible policy options. Although risk judgments are inherently subjective, an analysis of the role of uncertainty within the climate system yields two constraints that are robust to a broad range of assumptions. Those constraints are that (a) greater uncertainty about the climate system is necessarily associated with greater expected damages from warming, and (b) greater uncertainty translates into a greater risk of the failure of mitigation efforts. These ordinal constraints are unaffected by subjective or cultural risk-perception factors, they are independent of the discount rate, and they are independent of the magnitude of the estimate for climate sensitivity. The constraints mean that any appeal to uncertainty must imply a stronger, rather than weaker, need to cut greenhouse gas emissions than in the absence of uncertainty.

  13. Uncertainty analysis in Monte Carlo criticality computations

    International Nuclear Information System (INIS)

    Qi Ao

    2011-01-01

    Highlights: ► Two types of uncertainty methods for k_eff Monte Carlo computations are examined. ► The sampling method has the fewest restrictions on perturbations but demands computing resources. ► The analytical method is limited to small perturbations of material properties. ► Practicality relies on efficiency, multiparameter applicability and data availability. - Abstract: Uncertainty analysis is imperative for nuclear criticality risk assessments when using Monte Carlo neutron transport methods to predict the effective neutron multiplication factor (k_eff) for fissionable material systems. For the validation of Monte Carlo codes for criticality computations against benchmark experiments, code accuracy and precision are measured by both the computational bias and the uncertainty in the bias. The uncertainty in the bias accounts for known or quantified experimental, computational and model uncertainties. For the application of Monte Carlo codes to criticality analysis of fissionable material systems, an administrative margin of subcriticality must be imposed to provide additional assurance of subcriticality for any unknown or unquantified uncertainties. Because of the substantial impact of the administrative margin of subcriticality on the economics and safety of nuclear fuel cycle operations, recently increasing interest in reducing the administrative margin of subcriticality makes uncertainty analysis in criticality safety computations more risk-significant. This paper provides an overview of the two most popular k_eff uncertainty analysis methods for Monte Carlo criticality computations: (1) sampling-based methods, and (2) analytical methods. Examples are given to demonstrate their usage in k_eff uncertainty analysis due to uncertainties in both neutronic and non-neutronic parameters of fissionable material systems.
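
    A toy sketch of the sampling-based route surveyed in the paper: the uncertain nuclear data are sampled, a (mocked) Monte Carlo transport solve is repeated per sample, and the data-induced variance is separated from the per-run statistical variance. Every number below is invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def mc_transport_solve(sigma_f):
        """Mock Monte Carlo k_eff solve: data-driven mean + statistical noise."""
        k_true = 0.95 + 0.5 * (sigma_f - 1.0)  # invented response to the data
        sigma_stat = 0.0005                     # per-run statistical uncertainty
        return rng.normal(k_true, sigma_stat), sigma_stat

    # Sample the uncertain data (e.g. a normalized cross section, 1% rel. sd).
    sigma_f_samples = rng.normal(1.0, 0.01, 200)

    k_vals, stat_sds = zip(*(mc_transport_solve(s) for s in sigma_f_samples))
    k_vals, stat_sds = np.array(k_vals), np.array(stat_sds)

    total_var = k_vals.var(ddof=1)
    stat_var = np.mean(stat_sds**2)
    data_var = max(total_var - stat_var, 0.0)  # nuclear-data-induced component

    print(f"k_eff mean = {k_vals.mean():.5f}")
    print(f"total sd = {np.sqrt(total_var):.5f}, "
          f"data-induced sd = {np.sqrt(data_var):.5f}")
    ```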

  14. Qualitative uncertainty analysis in probabilistic safety assessment context

    International Nuclear Information System (INIS)

    Apostol, M.; Constantin, M.; Turcu, I.

    2007-01-01

    In the Probabilistic Safety Assessment (PSA) context, an uncertainty analysis is performed either to estimate the uncertainty in the final results (the risk to public health and safety) or to estimate the uncertainty in some intermediate quantities (the core damage frequency, the radionuclide release frequency or the fatality frequency). The identification and evaluation of uncertainty are important tasks because they lend credibility to the results and help in the decision-making process. Uncertainty analysis can be performed qualitatively or quantitatively. This paper performs a preliminary qualitative uncertainty analysis by identifying the major uncertainties at the PSA Level 1 - Level 2 interface and in the other two major procedural steps of a Level 2 PSA, i.e., the analysis of accident progression and of the containment, and the analysis of the source term for severe accidents. One should mention that a Level 2 PSA for a Nuclear Power Plant (NPP) involves the evaluation and quantification of the mechanisms, amounts and probabilities of subsequent radioactive material releases from the containment. According to NUREG-1150, an important task in source term analysis is fission product transport analysis. The uncertainties related to the isotope distribution in the CANDU NPP primary circuit and the isotope masses transferred to the containment, computed using the SOPHAEROS module of the ASTEC computer code, will also be presented. (authors)

  15. Approach to uncertainty in risk analysis

    Energy Technology Data Exchange (ETDEWEB)

    Rish, W.R.

    1988-08-01

    In the Fall of 1985 EPA's Office of Radiation Programs (ORP) initiated a project to develop a formal approach to dealing with uncertainties encountered when estimating and evaluating risks to human health and the environment. Based on a literature review of modeling uncertainty, interviews with ORP technical and management staff, and input from experts on uncertainty analysis, a comprehensive approach was developed. This approach recognizes by design the constraints on budget, time, manpower, expertise, and availability of information often encountered in "real world" modeling. It is based on the observation that in practice risk modeling is usually done to support a decision process. As such, the approach focuses on how to frame a given risk modeling problem, how to use that framing to select an appropriate mixture of uncertainty analyses techniques, and how to integrate the techniques into an uncertainty assessment that effectively communicates important information and insight to decision-makers. The approach is presented in this report. Practical guidance on characterizing and analyzing uncertainties about model form and quantities and on effectively communicating uncertainty analysis results is included. Examples from actual applications are presented.

  16. Approach to uncertainty in risk analysis

    International Nuclear Information System (INIS)

    Rish, W.R.

    1988-08-01

    In the Fall of 1985 EPA's Office of Radiation Programs (ORP) initiated a project to develop a formal approach to dealing with uncertainties encountered when estimating and evaluating risks to human health and the environment. Based on a literature review of modeling uncertainty, interviews with ORP technical and management staff, and input from experts on uncertainty analysis, a comprehensive approach was developed. This approach recognizes by design the constraints on budget, time, manpower, expertise, and availability of information often encountered in "real world" modeling. It is based on the observation that in practice risk modeling is usually done to support a decision process. As such, the approach focuses on how to frame a given risk modeling problem, how to use that framing to select an appropriate mixture of uncertainty analyses techniques, and how to integrate the techniques into an uncertainty assessment that effectively communicates important information and insight to decision-makers. The approach is presented in this report. Practical guidance on characterizing and analyzing uncertainties about model form and quantities and on effectively communicating uncertainty analysis results is included. Examples from actual applications are presented.

  17. Automated uncertainty analysis methods in the FRAP computer codes

    International Nuclear Information System (INIS)

    Peck, S.O.

    1980-01-01

    A user-oriented, automated uncertainty analysis capability has been incorporated in the Fuel Rod Analysis Program (FRAP) computer codes. The FRAP codes have been developed for the analysis of Light Water Reactor fuel rod behavior during steady state (FRAPCON) and transient (FRAP-T) conditions as part of the United States Nuclear Regulatory Commission's Water Reactor Safety Research Program. The objective of uncertainty analysis of these codes is to obtain estimates of the uncertainty in computed outputs of the codes as a function of known uncertainties in input variables. This paper presents the methods used to generate an uncertainty analysis of a large computer code, discusses the assumptions that are made, and shows techniques for testing them. An uncertainty analysis of FRAP-T calculated fuel rod behavior during a hypothetical loss-of-coolant transient is presented as an example and carried through the discussion to illustrate the various concepts

  18. Representation of analysis results involving aleatory and epistemic uncertainty.

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Jay Dean (ProStat, Mesa, AZ); Helton, Jon Craig (Arizona State University, Tempe, AZ); Oberkampf, William Louis; Sallaberry, Cedric J.

    2008-08-01

    Procedures are described for the representation of results in analyses that involve both aleatory uncertainty and epistemic uncertainty, with aleatory uncertainty deriving from an inherent randomness in the behavior of the system under study and epistemic uncertainty deriving from a lack of knowledge about the appropriate values to use for quantities that are assumed to have fixed but poorly known values in the context of a specific study. Aleatory uncertainty is usually represented with probability and leads to cumulative distribution functions (CDFs) or complementary cumulative distribution functions (CCDFs) for analysis results of interest. Several mathematical structures are available for the representation of epistemic uncertainty, including interval analysis, possibility theory, evidence theory and probability theory. In the presence of epistemic uncertainty, there is not a single CDF or CCDF for a given analysis result. Rather, there is a family of CDFs and a corresponding family of CCDFs that derive from epistemic uncertainty and have an uncertainty structure that derives from the particular uncertainty structure (i.e., interval analysis, possibility theory, evidence theory, probability theory) used to represent epistemic uncertainty. Graphical formats for the representation of epistemic uncertainty in families of CDFs and CCDFs are investigated and presented for the indicated characterizations of epistemic uncertainty.
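
    When epistemic uncertainty is itself characterized probabilistically (one of the structures listed above), the family of CCDFs can be generated with a simple double loop: an outer loop samples the fixed-but-poorly-known parameter, and an inner aleatory loop produces one CCDF per epistemic realization. The distributions and sample sizes below are arbitrary illustrations.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Outer loop: epistemic samples of a poorly known mean on [1, 2].
    n_epistemic, n_aleatory = 50, 10_000
    mus = rng.uniform(1.0, 2.0, n_epistemic)

    grid = np.linspace(0.0, 10.0, 200)
    ccdfs = np.empty((n_epistemic, grid.size))

    for j, mu in enumerate(mus):
        # Inner loop: aleatory variability for this epistemic realization.
        results = rng.lognormal(mean=np.log(mu), sigma=0.5, size=n_aleatory)
        # CCDF: P(result > x) evaluated on a common grid.
        ccdfs[j] = (results[:, None] > grid[None, :]).mean(axis=0)

    # The family of CCDFs: pointwise epistemic envelope and median.
    lo, hi = ccdfs.min(axis=0), ccdfs.max(axis=0)
    med = np.median(ccdfs, axis=0)
    idx = grid.searchsorted(3.0)
    print(f"at x=3: median CCDF = {med[idx]:.3f}, "
          f"envelope = [{lo[idx]:.3f}, {hi[idx]:.3f}]")
    ```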

  19. Measurement uncertainty analysis techniques applied to PV performance measurements

    International Nuclear Information System (INIS)

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis toward its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; demonstrates that quality assurance and quality control measures have been accomplished; and defines valid data as data having known and documented paths of origin (including theory), measurements, traceability to measurement standards, computations, and uncertainty analysis of results

  20. Measurement uncertainty analysis techniques applied to PV performance measurements

    Energy Technology Data Exchange (ETDEWEB)

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis toward its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; demonstrates that quality assurance and quality control measures have been accomplished; and defines valid data as data having known and documented paths of origin (including theory), measurements, traceability to measurement standards, computations, and uncertainty analysis of results.

  2. Sensitivity functions for uncertainty analysis: Sensitivity and uncertainty analysis of reactor performance parameters

    International Nuclear Information System (INIS)

    Greenspan, E.

    1982-01-01

    This chapter presents the mathematical basis for sensitivity functions, discusses their physical meaning and the information they contain, and clarifies a number of issues concerning their application, including the definition of group sensitivities, the selection of sensitivity functions to be included in the analysis, and limitations of sensitivity theory. It examines the theoretical foundation; criticality reset sensitivities; group sensitivities and uncertainties; the selection of sensitivities included in the analysis; and other uses and limitations of sensitivity functions. It gives the theoretical formulation of sensitivity functions pertaining to "as-built" designs for performance parameters in the form of ratios of linear flux functionals (such as reaction-rate ratios), linear adjoint functionals, bilinear functions (such as reactivity worth ratios), and for reactor reactivity. It offers a consistent procedure for reducing energy-dependent or fine-group sensitivities and uncertainties to broad-group sensitivities and uncertainties. It provides illustrations of sensitivity functions as well as references to available compilations of such functions and of total sensitivities. It indicates limitations of sensitivity theory originating from the fact that this theory is based on first-order perturbation theory
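
    For orientation, the textbook definitions underlying this kind of analysis can be written out; these are the standard forms, not a reconstruction of the chapter's own equations.

    ```latex
    % Relative sensitivity of a response R to a parameter p:
    S_p^R = \frac{p}{R}\,\frac{\partial R}{\partial p}
    % Fine-group (g) sensitivities collapse additively into a broad group G:
    S_G^R = \sum_{g \in G} S_g^R
    ```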

  3. One Approach to the Fire PSA Uncertainty Analysis

    International Nuclear Information System (INIS)

    Simic, Z.; Mikulicic, V.; Vukovic, I.

    2002-01-01

    Experience from practical events and findings from a number of fire probabilistic safety assessment (PSA) studies show that fire has high relative importance for nuclear power plant safety. Fire PSA is a very challenging phenomenon and a number of issues are still in the area of research and development. This has a major impact on the conservatism of fire PSA findings. One way to reduce the level of conservatism is to conduct uncertainty analysis. At the top level, uncertainty of the fire PSA can be separated into three segments. The first segment is related to fire initiating event frequencies. The second uncertainty segment is connected to the uncertainty of fire damage. Finally, there is uncertainty related to the PSA model, which propagates this fire-initiated damage to the core damage or other analyzed risk. This paper discusses all three segments of uncertainty. Some recent experience with fire PSA study uncertainty analysis, usage of the fire analysis code COMPBRN IIIe, and the importance of uncertainty evaluation to the final result is presented. (author)

  4. Uncertainty analysis

    International Nuclear Information System (INIS)

    Thomas, R.E.

    1982-03-01

    An evaluation is made of the suitability of analytical and statistical sampling methods for making uncertainty analyses. The adjoint method is found to be well-suited for obtaining sensitivity coefficients for computer programs involving large numbers of equations and input parameters. For this purpose the Latin Hypercube Sampling method is found to be inferior to conventional experimental designs. The Latin hypercube method can be used to estimate output probability density functions, but requires supplementary rank transformations followed by stepwise regression to obtain uncertainty information on individual input parameters. A simple Cork and Bottle problem is used to illustrate the efficiency of the adjoint method relative to certain statistical sampling methods. For linear models of the form Ax=b it is shown that a complete adjoint sensitivity analysis can be made without formulating and solving the adjoint problem. This can be done either by using a special type of statistical sampling or by reformulating the primal problem and using suitable linear programming software
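
    The closing claim, that for linear models Ax = b a complete adjoint sensitivity analysis is cheap, can be made concrete: for a response R = cᵀx, a single extra solve with Aᵀ yields every sensitivity at once. A minimal sketch (the matrices are arbitrary examples, not from the report):

    ```python
    import numpy as np

    # Linear model A x = b with scalar response R = c @ x.
    A = np.array([[4.0, 1.0],
                  [1.0, 3.0]])
    b = np.array([1.0, 2.0])
    c = np.array([1.0, 1.0])

    x = np.linalg.solve(A, b)

    # One adjoint solve with A^T gives every sensitivity of R at once.
    lam = np.linalg.solve(A.T, c)

    dR_db = lam                  # dR/db_j  = lambda_j
    dR_dA = -np.outer(lam, x)    # dR/dA_ij = -lambda_i * x_j

    # Spot-check dR/db[0] against a finite difference.
    h = 1e-7
    b_pert = b.copy()
    b_pert[0] += h
    R0 = c @ x
    R1 = c @ np.linalg.solve(A, b_pert)
    print(f"adjoint: {dR_db[0]:.8f}, finite difference: {(R1 - R0) / h:.8f}")
    print("dR/dA =\n", dR_dA)
    ```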

  5. Assessing scenario and parametric uncertainties in risk analysis: a model uncertainty audit

    International Nuclear Information System (INIS)

    Tarantola, S.; Saltelli, A.; Draper, D.

    1999-01-01

    In the present study a model audit is performed on a computational model used for predicting maximum radiological doses to humans in the field of nuclear waste disposal. Global uncertainty and sensitivity analyses are employed to assess output uncertainty and to quantify the contribution of parametric and scenario uncertainties to the model output. These tools are of fundamental importance for risk analysis and decision-making purposes

  6. Uncertainty analysis for Ulysses safety evaluation report

    International Nuclear Information System (INIS)

    Frank, M.V.

    1991-01-01

    As part of the effort to review the Ulysses Final Safety Analysis Report and to understand the risk of plutonium release from the Ulysses spacecraft General Purpose Heat Source - Radioisotope Thermoelectric Generator (GPHS-RTG), the Interagency Nuclear Safety Review Panel (INSRP) and the author performed an integrated, quantitative analysis of the uncertainties of the calculated risk of plutonium release from Ulysses. Using state-of-the-art probabilistic risk assessment technology, the uncertainty analysis accounted for both variability and uncertainty in the key parameters of the risk analysis. The results show that INSRP had high confidence that the risk of fatal cancers from potential plutonium release associated with calculated launch and deployment accident scenarios is low

  7. Uncertainty analysis for geologic disposal of radioactive waste

    International Nuclear Information System (INIS)

    Cranwell, R.M.; Helton, J.C.

    1981-01-01

    The incorporation and representation of uncertainty in the analysis of the consequences and risks associated with the geologic disposal of high-level radioactive waste are discussed. Such uncertainty has three primary components: process modeling uncertainty, model input data uncertainty, and scenario uncertainty. The following topics are considered in connection with the preceding components: propagation of uncertainty in the modeling of a disposal site, sampling of input data for models, and uncertainty associated with model output

  8. Optimization of FRAP uncertainty analysis option

    International Nuclear Information System (INIS)

    Peck, S.O.

    1979-10-01

    The automated uncertainty analysis option that has been incorporated in the FRAP codes (FRAP-T5 and FRAPCON-2) provides the user with a means of obtaining uncertainty bands on code-predicted variables at user-selected times during a fuel pin analysis. These uncertainty bands are obtained by multiple single fuel pin analyses to generate data which can then be analyzed by second-order statistical error propagation techniques. In this process, a considerable amount of data is generated and stored on tape. The user has certain choices to make regarding which independent variables are to be used in the analysis and what order of error propagation equation should be used in modeling the output response. To aid the user in these decisions, a computer program, ANALYZ, has been written and added to the uncertainty analysis option package. A variety of considerations involved in fitting response surface equations and certain pitfalls of which the user should be aware are discussed. An equation is derived expressing a residual as a function of a fitted model and an assumed true model. A variety of experimental design choices are discussed, including the advantages and disadvantages of each approach. Finally, a description of the subcodes which constitute program ANALYZ is provided

  9. Analysis of uncertainties of thermal hydraulic calculations

    International Nuclear Information System (INIS)

    Macek, J.; Vavrin, J.

    2002-12-01

    In 1993-1997 it was proposed, within OECD projects, that a common program should be set up for uncertainty analysis by a probabilistic method based on a non-parametric statistical approach for system computer codes such as RELAP, ATHLET and CATHARE, and that a method should be developed for statistical analysis of experimental databases for the preparation of the input deck and statistical analysis of the output calculation results. Software for such statistical analyses would then have to be developed as individual tools, independent of the computer codes used for the thermal hydraulic analysis and of the programs for uncertainty analysis. In this context, a method for estimating the uncertainty of a thermal hydraulic calculation is outlined and selected methods of statistical analysis of uncertainties are described, including methods for prediction accuracy assessment based on the discrete Fourier transform principle. (author)

  10. Systematic Evaluation of Uncertainty in Material Flow Analysis

    DEFF Research Database (Denmark)

    Laner, David; Rechberger, Helmut; Astrup, Thomas Fruergaard

    2014-01-01

    Material flow analysis (MFA) is a tool to investigate material flows and stocks in defined systems as a basis for resource management or environmental pollution control. Because of the diverse nature of sources and the varying quality and availability of data, MFA results are inherently uncertain. … Uncertainty analyses have received increasing attention in recent MFA studies, but systematic approaches for the selection of appropriate uncertainty tools are missing. This article reviews existing literature related to the handling of uncertainty in MFA studies and evaluates current practice of uncertainty analysis … and exploratory MFA (identification of critical parameters and system behavior). Whereas mathematically simpler concepts focusing on data uncertainty characterization are appropriate for descriptive MFAs, statistical approaches enabling more rigorous evaluation of uncertainty and model sensitivity are needed …

  11. Uncertainty analysis techniques

    International Nuclear Information System (INIS)

    Marivoet, J.; Saltelli, A.; Cadelli, N.

    1987-01-01

    The origin of the uncertainty affecting Performance Assessments, as well as its propagation to dose and risk results, is discussed. The analysis focuses essentially on the uncertainties introduced by the input parameters, the values of which may range over some orders of magnitude and may be given as probability distribution functions. The paper briefly reviews the existing sampling techniques used for Monte Carlo simulations and the methods for characterizing the output curves, determining their convergence and confidence limits. Annual doses, expectation values of the doses and risks are computed for a particular case of a possible repository in clay, in order to illustrate the significance of such output characteristics as the mean, the logarithmic mean and the median, as well as their ratios. The report concludes that, provisionally and owing to its better robustness, an estimator such as the 90th percentile may be substituted for the arithmetic mean in comparisons of the estimated doses with acceptance criteria. In any case, the results obtained through uncertainty analyses must be interpreted with caution as long as input data distribution functions are not derived from experiments reasonably reproducing the situation in a well characterized repository and site
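
    The robustness argument for the 90th percentile over the arithmetic mean is easy to demonstrate on a skewed output distribution; a toy example with lognormal "dose" samples (all numbers invented):

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Toy annual-dose samples from an uncertainty analysis (heavy right tail).
    doses = rng.lognormal(mean=-2.0, sigma=1.5, size=100_000)

    print(f"arithmetic mean : {doses.mean():.4f}")
    print(f"geometric mean  : {np.exp(np.log(doses).mean()):.4f}")
    print(f"median          : {np.median(doses):.4f}")
    print(f"90th percentile : {np.percentile(doses, 90):.4f}")

    # The mean is dominated by the tail; the 90th percentile moves far less
    # if a few extreme samples are perturbed, which is the robustness point
    # argued in the abstract above.
    ```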

  12. Uncertainty analysis for hot channel

    International Nuclear Information System (INIS)

    Panka, I.; Kereszturi, A.

    2006-01-01

    The fulfillment of the safety analysis acceptance criteria is usually evaluated by separate hot channel calculations using the results of neutronic and/or thermo-hydraulic system calculations. In case of an ATWS event (inadvertent withdrawal of a control assembly), according to the analysis, a number of fuel rods experience DNB for a longer time and must be regarded as failed. Their number must be determined for a further evaluation of the radiological consequences. In the deterministic approach, the global power history must be multiplied by different hot channel factors (k_x) taking into account the radial power peaking factors for each fuel pin. If DNB occurs, it is necessary to perform a small number of hot channel calculations to determine the limiting k_x leading just to DNB and fuel failure (the conservative DNBR limit is 1.33). Knowing the pin power distribution from the core design calculation, the number of failed fuel pins can be calculated. The above procedure can be performed with conservative assumptions (e.g. conservative input parameters in the hot channel calculations) as well. In case of hot channel uncertainty analysis, the relevant input parameters of the hot channel calculations (k_x, mass flow, inlet temperature of the coolant, pin average burnup, initial gap size, selection of the power history influencing the gap conductance value) and the DNBR limit are varied considering their respective uncertainties. An uncertainty analysis methodology was elaborated combining the response surface method with the one-sided tolerance limit method of Wilks. The results of deterministic and uncertainty hot channel calculations are compared with regard to the number of failed fuel rods, the maximum temperature of the clad surface and the maximum temperature of the fuel (Authors)
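
    The one-sided tolerance limit method of Wilks mentioned above fixes how many code runs support a given coverage/confidence claim; a sketch of the standard first-order formula (e.g. 59 runs for a 95%/95% statement):

    ```python
    import math

    def wilks_runs(coverage=0.95, confidence=0.95):
        """Smallest N with 1 - coverage**N >= confidence (first-order, one-sided)."""
        return math.ceil(math.log(1.0 - confidence) / math.log(coverage))

    print(wilks_runs())            # 59 runs for a 95%/95% statement
    print(wilks_runs(0.95, 0.99))  # 90 runs for 95%/99%

    # With N runs, the largest observed output (e.g. max clad temperature)
    # is then taken as the one-sided upper tolerance limit.
    ```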

  13. Uncertainty Analysis and Expert Judgment in Seismic Hazard Analysis

    Science.gov (United States)

    Klügel, Jens-Uwe

    2011-01-01

    The large uncertainty associated with the prediction of future earthquakes is usually regarded as the main reason for increased hazard estimates which have resulted from some recent large scale probabilistic seismic hazard analysis studies (e.g. the PEGASOS study in Switzerland and the Yucca Mountain study in the USA). It is frequently overlooked that such increased hazard estimates are characteristic for a single specific method of probabilistic seismic hazard analysis (PSHA): the traditional (Cornell-McGuire) PSHA method which has found its highest level of sophistication in the SSHAC probability method. Based on a review of the SSHAC probability model and its application in the PEGASOS project, it is shown that the surprising results of recent PSHA studies can be explained to a large extent by the uncertainty model used in traditional PSHA, which deviates from the state of the art in mathematics and risk analysis. This uncertainty model, the Ang-Tang uncertainty model, mixes concepts of decision theory with probabilistic hazard assessment methods leading to an overestimation of uncertainty in comparison to empirical evidence. Although expert knowledge can be a valuable source of scientific information, its incorporation into the SSHAC probability method does not resolve the issue of inflating uncertainties in PSHA results. Other, more data driven, PSHA approaches in use in some European countries are less vulnerable to this effect. The most valuable alternative to traditional PSHA is the direct probabilistic scenario-based approach, which is closely linked with emerging neo-deterministic methods based on waveform modelling.

  14. The uncertainty analysis of model results a practical guide

    CERN Document Server

    Hofer, Eduard

    2018-01-01

    This book is a practical guide to the uncertainty analysis of computer model applications. Used in many areas, such as engineering, ecology and economics, computer models are subject to various uncertainties at the level of model formulations, parameter values and input data. Naturally, it would be advantageous to know the combined effect of these uncertainties on the model results as well as whether the state of knowledge should be improved in order to reduce the uncertainty of the results most effectively. The book supports decision-makers, model developers and users in their argumentation for an uncertainty analysis and assists them in the interpretation of the analysis results.

  15. Uncertainty analysis for secondary energy distributions

    International Nuclear Information System (INIS)

    Gerstl, S.A.W.

    1978-01-01

    In many transport calculations the integral design parameter of interest (response) is determined mainly by secondary particles such as gamma rays from (n,γ) reactions or secondary neutrons from inelastic scattering events or (n,2n) reactions. Standard sensitivity analysis usually allows one to calculate the sensitivities to the production cross sections of such secondaries, but an extended formalism is needed to also obtain the sensitivities to the energy distribution of the generated secondary particles. For a 30-group standard cross-section set, 84% of all non-zero table positions pertain to the description of secondary energy distributions (SEDs) and only 16% to the actual reaction cross sections. Therefore, any sensitivity/uncertainty analysis which does not consider the effects of SEDs is incomplete and neglects most of the input data. This paper describes the methods by which sensitivity profiles for SEDs are obtained and used to estimate the uncertainty of an integral response due to uncertainties in these SEDs. The detailed theory is documented elsewhere and implemented in the LASL sensitivity code SENSIT. SED sensitivity profiles have proven particularly valuable in cross-section uncertainty analyses for fusion reactors. Even when the production cross sections for secondary neutrons were assumed to be without error, the uncertainties in the energy distribution of these secondaries produced appreciable uncertainties in the calculated tritium breeding rate. However, complete error files for SEDs are presently nonexistent. Therefore, methods will be described that allow rough error estimates due to estimated SED uncertainties, based on integral SED sensitivities

  16. Uncertainty analysis and validation of environmental models. The empirically based uncertainty analysis

    International Nuclear Information System (INIS)

    Monte, Luigi; Hakanson, Lars; Bergstroem, Ulla; Brittain, John; Heling, Rudie

    1996-01-01

    The principles of Empirically Based Uncertainty Analysis (EBUA) are described. EBUA is based on the evaluation of 'performance indices' that express the level of agreement between the model and sets of empirical independent data collected in different experimental circumstances. Some of these indices may be used to evaluate the confidence limits of the model output. The method is based on the statistical analysis of the distribution of the index values and on the quantitative relationship of these values with the ratio 'experimental data/model output'. Some performance indices are described in the present paper. Among these, the so-called 'functional distance' d between the logarithm of the model output and the logarithm of the experimental data, defined by d^2 = (1/n) Σ_{i=1..n} (ln M_i − ln O_i)^2, where M_i is the i-th experimental value, O_i the corresponding model evaluation and n the number of (experimental value, predicted value) pairs, is an important tool for the EBUA method. From the statistical distribution of this performance index it is possible to infer the characteristics of the distribution of the ratio 'experimental data/model output' and, consequently, to evaluate the confidence limits for the model predictions. This method was applied to calculate the uncertainty level of a model developed to predict the migration of radiocaesium in lacustrine systems. Unfortunately, performance indices are affected by the uncertainty of the experimental data used in validation. Indeed, measurement results of environmental levels of contamination are generally associated with large uncertainty due to the measurement and sampling techniques and to the large variability in space and time of the measured quantities. It is demonstrated that this undesired effect, in some circumstances, may be corrected by means of simple formulae
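
    A minimal computation of the functional distance defined above, with invented (experimental value, model prediction) pairs; the multiplicative band at the end is only a crude reading of how the index is used to set confidence limits, not the paper's full statistical procedure.

    ```python
    import numpy as np

    # Paired empirical values M_i and model evaluations O_i (invented numbers).
    M = np.array([1.2, 3.4, 0.8, 2.1, 5.0])
    O = np.array([1.0, 4.0, 1.0, 1.8, 4.2])

    n = len(M)
    d2 = np.sum((np.log(M) - np.log(O)) ** 2) / n
    d = np.sqrt(d2)
    print(f"functional distance d = {d:.3f}")

    # exp(d) characterizes the typical multiplicative ratio experiment/model,
    # so a crude band for a prediction O is [O / exp(d), O * exp(d)].
    print(f"multiplicative band factor exp(d) = {np.exp(d):.3f}")
    ```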

  17. Urban drainage models - making uncertainty analysis simple

    DEFF Research Database (Denmark)

    Vezzaro, Luca; Mikkelsen, Peter Steen; Deletic, Ana

    2012-01-01

    There is increasing awareness about uncertainties in modelling of urban drainage systems and, as such, many new methods for uncertainty analyses have been developed. Despite this, all available methods have limitations which restrict their widespread application among practitioners. Here … in each measured/observed datapoint; an issue which is commonly overlooked in the uncertainty analysis of urban drainage models. This comparison allows the user to intuitively estimate the optimum number of simulations required to conduct uncertainty analyses. The output of the method includes parameter …

  18. Overview of methods for uncertainty analysis and sensitivity analysis in probabilistic risk assessment

    International Nuclear Information System (INIS)

    Iman, R.L.; Helton, J.C.

    1985-01-01

    Probabilistic Risk Assessment (PRA) is playing an increasingly important role in the nuclear reactor regulatory process. The assessment of uncertainties associated with PRA results is widely recognized as an important part of the analysis process. One of the major criticisms of the Reactor Safety Study was that its representation of uncertainty was inadequate. The desire for the capability to treat uncertainties with the MELCOR risk code being developed at Sandia National Laboratories is indicative of the current interest in this topic. However, as yet, uncertainty analysis and sensitivity analysis in the context of PRA are relatively immature fields. In this paper, available methods for uncertainty analysis and sensitivity analysis in a PRA are reviewed. This review first treats methods for use with individual components of a PRA and then considers how these methods could be combined in the performance of a complete PRA. In the context of this paper, the goal of uncertainty analysis is to measure the imprecision in PRA outcomes of interest, and the goal of sensitivity analysis is to identify the major contributors to this imprecision. There are a number of areas that must be considered in uncertainty analysis and sensitivity analysis for a PRA: (1) information, (2) systems analysis, (3) thermal-hydraulic phenomena/fission product behavior, (4) health and economic consequences, and (5) display of results. Each of these areas, and the synthesis of them into a complete PRA, are discussed

  19. Uncertainty Analysis with Considering Resonance Self-shielding Effect

    Energy Technology Data Exchange (ETDEWEB)

    Han, Tae Young [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)]

    2016-10-15

    If infinitely diluted multi-group cross sections are used for the sensitivities, the covariance data from the evaluated nuclear data library (ENDL) can be applied directly. However, in case of using a self-shielded multi-group cross section, the covariance data should be corrected considering the self-shielding effect. Usually, implicit uncertainty can be defined as the uncertainty change caused by the resonance self-shielding effect, as described above. MUSAD (Modules of Uncertainty and Sensitivity Analysis for DeCART) has been developed for multiplication factor and cross section uncertainty analysis based on the generalized perturbation theory; however, it can only quantify the explicit uncertainty from the self-shielded multi-group cross sections, without considering the implicit effect. Thus, this paper addresses the implementation of an implicit uncertainty analysis module into the code, and numerical results for the verification are provided. The implicit uncertainty analysis module has been implemented into MUSAD based on the infinitely-diluted-cross-section-based consistent method. The verification calculation was performed on MHTGR 350 Ex.I-1a, and the differences from the McCARD results decrease from 40% to 1% in the CZP case and to 3% in the HFP case. From this study, it is expected that the MUSAD code can reasonably produce the complete uncertainty for VHTRs or LWRs, where the resonance self-shielding effect must be significantly considered.

  20. Uncertainty Analysis with Considering Resonance Self-shielding Effect

    International Nuclear Information System (INIS)

    Han, Tae Young

    2016-01-01

    If infinitely diluted multi-group cross sections are used for the sensitivities, the covariance data from the evaluated nuclear data library (ENDL) can be applied directly. However, in case of using a self-shielded multi-group cross section, the covariance data should be corrected considering the self-shielding effect. Usually, implicit uncertainty can be defined as the uncertainty change caused by the resonance self-shielding effect, as described above. MUSAD (Modules of Uncertainty and Sensitivity Analysis for DeCART) has been developed for multiplication factor and cross section uncertainty analysis based on the generalized perturbation theory; however, it can only quantify the explicit uncertainty from the self-shielded multi-group cross sections, without considering the implicit effect. Thus, this paper addresses the implementation of an implicit uncertainty analysis module into the code, and numerical results for the verification are provided. The implicit uncertainty analysis module has been implemented into MUSAD based on the infinitely-diluted-cross-section-based consistent method. The verification calculation was performed on MHTGR 350 Ex.I-1a, and the differences from the McCARD results decrease from 40% to 1% in the CZP case and to 3% in the HFP case. From this study, it is expected that the MUSAD code can reasonably produce the complete uncertainty for VHTRs or LWRs, where the resonance self-shielding effect must be significantly considered.
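
    The explicit part of such a generalized-perturbation-theory estimate combines sensitivities with a covariance matrix through the standard "sandwich rule"; a sketch with invented numbers (correcting the covariance data for self-shielding before this product is the implicit part the paper adds):

    ```python
    import numpy as np

    # Relative sensitivities of k_eff to two multigroup cross sections (invented).
    S = np.array([-0.15, 0.30])

    # Relative covariance matrix of those cross sections (invented values,
    # standing in for data from an evaluated nuclear data library).
    C = np.array([[0.0004, 0.0001],
                  [0.0001, 0.0009]])

    # Sandwich rule: relative variance of k_eff due to nuclear-data uncertainty.
    rel_var = S @ C @ S
    print(f"relative k_eff uncertainty = {np.sqrt(rel_var) * 100:.3f} %")
    ```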

  1. Uncertainty analysis of neutron transport calculation

    International Nuclear Information System (INIS)

    Oka, Y.; Furuta, K.; Kondo, S.

    1987-01-01

    A cross-section sensitivity-uncertainty analysis code, SUSD, was developed. The code calculates sensitivity coefficients for one- and two-dimensional transport problems based on first-order perturbation theory. The variance and standard deviation of detector responses or design parameters can be obtained using the cross-section covariance matrix. The code is able to perform sensitivity-uncertainty analysis for the secondary neutron angular distribution (SAD) and the secondary neutron energy distribution (SED). Covariances of ⁶Li and ⁷Li neutron cross sections in JENDL-3PR1 were evaluated, including SAD and SED. Covariances of Fe and Be were also evaluated. The uncertainty of the tritium breeding ratio, fast neutron leakage flux and neutron heating was analysed for four types of blanket concepts for a commercial tokamak fusion reactor. The uncertainty of the tritium breeding ratio was less than 6 percent. Contributions from SAD/SED uncertainties are significant for some parameters. Formulas to estimate the errors of the numerical solution of the transport equation were derived based on perturbation theory. This method enables us to deterministically estimate the numerical errors due to the iterative solution, spatial discretization and Legendre polynomial expansion of the transfer cross sections. The calculational errors of the tritium breeding ratio and the fast neutron leakage flux of the fusion blankets were analysed. (author)

  2. Uncertainties in thick-target PIXE analysis

    International Nuclear Information System (INIS)

    Campbell, J.L.; Cookson, J.A.; Paul, H.

    1983-01-01

    Thick-target PIXE analysis involves uncertainties arising from the calculation of thick-target X-ray production in addition to the usual PIXE uncertainties. The calculation demands knowledge of ionization cross-sections, stopping powers and photon attenuation coefficients. Information on these is reviewed critically, and a computational method is used to estimate the uncertainties transmitted from this data base into the results of thick-target PIXE analyses, with reference to particular specimen types, using beams of 2-3 MeV protons. A detailed assessment of the accuracy of thick-target PIXE is presented. (orig.)
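
    The thick-target X-ray yield whose data base is assessed here has the generic form of an ionization cross section integrated along the proton's slowing-down path, attenuated on the way out. A numerical sketch follows, with made-up smooth stand-ins for the cross section σ(E), the stopping power S(E), and the attenuation coefficient (not real data):

    ```python
    import numpy as np

    # Made-up smooth stand-ins for the physical data (NOT real values):
    def sigma(E):        # ionization cross section, arbitrary units
        return E**4 / (1.0 + E**4)

    def stopping(E):     # stopping power dE/dx, arbitrary units, > 0
        return 0.5 + 1.0 / np.sqrt(E + 0.1)

    mu = 0.8             # X-ray attenuation coefficient, arbitrary units
    E0 = 3.0             # incident proton energy, MeV

    # March the proton from E0 down toward 0, accumulating depth and yield:
    # Y ~ integral of sigma(E) * exp(-mu * x(E)) / S(E) dE.
    E_grid = np.linspace(E0, 0.01, 2000)
    dE = E_grid[0] - E_grid[1]
    depth, Y = 0.0, 0.0
    for E in E_grid:
        S = stopping(E)
        depth += dE / S                               # path length so far
        Y += sigma(E) * np.exp(-mu * depth) * dE / S  # attenuated production

    print(f"relative thick-target yield = {Y:.4f}")
    ```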

  3. Uncertainty Propagation in Monte Carlo Depletion Analysis

    International Nuclear Information System (INIS)

    Shim, Hyung Jin; Kim, Yeong-il; Park, Ho Jin; Joo, Han Gyu; Kim, Chang Hyo

    2008-01-01

    A new formulation is presented aimed at quantifying the uncertainties of Monte Carlo (MC) tallies such as k_eff, the microscopic reaction rates of nuclides and nuclide number densities in MC depletion analysis, and at examining their propagation behaviour as a function of the depletion time step (DTS). It is shown that the variance of a given MC tally, used as a measure of its uncertainty in this formulation, arises from four sources: the statistical uncertainty of the MC tally, uncertainties of microscopic cross sections and nuclide number densities, and the cross correlations between them; the contribution of the latter three sources can be determined by computing the correlation coefficients between the uncertain variables. It is also shown that the variance of any given nuclide number density at the end of each DTS stems from uncertainties of the nuclide number densities (NND) and microscopic reaction rates (MRR) of nuclides at the beginning of the DTS, and these are determined by computing correlation coefficients between the two uncertain variables. To test the viability of the formulation, we conducted MC depletion analyses for two sample depletion problems involving a simplified 7x7 fuel assembly (FA) and a 17x17 PWR FA, determined the number densities of uranium and plutonium isotopes and their variances as well as k_inf and its variance as a function of DTS, and demonstrated the applicability of the new formulation for the uncertainty propagation analysis that needs to be followed in MC depletion computations. (authors)

  4. Discussion of OECD LWR Uncertainty Analysis in Modelling Benchmark

    International Nuclear Information System (INIS)

    Ivanov, K.; Avramova, M.; Royer, E.; Gillford, J.

    2013-01-01

    The demand for best estimate calculations in nuclear reactor design and safety evaluations has increased in recent years. Uncertainty quantification has been highlighted as part of best estimate calculations. The modelling aspects of uncertainty and sensitivity analysis are to be further developed and validated on scientific grounds in support of their performance and application to multi-physics reactor simulations. The Organization for Economic Co-operation and Development (OECD) / Nuclear Energy Agency (NEA) Nuclear Science Committee (NSC) has endorsed the creation of an Expert Group on Uncertainty Analysis in Modelling (EGUAM). Within the framework of activities of EGUAM/NSC, the OECD/NEA initiated the Benchmark for Uncertainty Analysis in Modelling for Design, Operation, and Safety Analysis of Light Water Reactors (OECD LWR UAM benchmark). The general objective of the benchmark is to propagate the predictive uncertainties of code results through complex coupled multi-physics and multi-scale simulations. The benchmark is divided into three phases, with Phase I highlighting the uncertainty propagation in stand-alone neutronics calculations, while Phases II and III focus on uncertainty analysis of the reactor core and system, respectively. This paper discusses the progress made in the Phase I calculations, the specifications for Phase II and the upcoming challenges in defining the Phase III exercises. The challenges of applying uncertainty quantification to complex code systems, in particular to time-dependent coupled-physics models, are the large computational burden and the use of non-linear models (expected due to the physics coupling). (authors)

  5. Cassini Spacecraft Uncertainty Analysis Data and Methodology Review and Update/Volume 1: Updated Parameter Uncertainty Models for the Consequence Analysis

    Energy Technology Data Exchange (ETDEWEB)

    WHEELER, TIMOTHY A.; WYSS, GREGORY D.; HARPER, FREDERICK T.

    2000-11-01

    Uncertainty distributions for specific parameters of the Cassini General Purpose Heat Source Radioisotope Thermoelectric Generator (GPHS-RTG) Final Safety Analysis Report consequence risk analysis were revised and updated. The revisions and updates were done for all consequence parameters for which relevant information exists from the joint project on Probabilistic Accident Consequence Uncertainty Analysis by the United States Nuclear Regulatory Commission and the Commission of European Communities.

  6. Uncertainty Analysis of Light Water Reactor Fuel Lattices

    Directory of Open Access Journals (Sweden)

    C. Arenas

    2013-01-01

    The study explored the calculation of uncertainty based on available cross-section covariance data and computational tools at the fuel lattice level, including pin-cell and fuel-assembly models. Uncertainty variations due to temperature changes and different fuel compositions are the main focus of this analysis. Selected assemblies and unit pin cells were analyzed according to the OECD LWR UAM benchmark specifications. Criticality and uncertainty analyses were performed using the TSUNAMI-2D sequence in SCALE 6.1. It was found that uncertainties increase with increasing temperature, while k-inf decreases. This increase in the uncertainty is due to the increased sensitivity of the largest contributing reaction, namely the neutron capture reaction 238U(n,γ), caused by Doppler broadening. In addition, three types of fuel material composition (UOX, MOX, and UOX-Gd2O3) were analyzed. A remarkable increase in the uncertainty in k-inf was observed for the MOX fuel case; it was nearly twice the corresponding value in UOX fuel. The neutron-nuclide reactions of 238U, mainly inelastic scattering (n,n'), contributed the most to the uncertainties in the MOX fuel, which shifts the neutron spectrum to higher energies compared to the UOX fuel.

  7. Nordic reference study on uncertainty and sensitivity analysis

    International Nuclear Information System (INIS)

    Hirschberg, S.; Jacobsson, P.; Pulkkinen, U.; Porn, K.

    1989-01-01

    This paper provides a review of the first phase of the Nordic reference study on uncertainty and sensitivity analysis. The main objective of this study is to use experience from previous Nordic Benchmark Exercises and reference studies concerning critical modelling issues, such as common cause failures and human interactions, and to demonstrate the impact of the associated uncertainties on the uncertainty of the investigated accident sequence. This has been done independently by three working groups, which used different approaches to modelling and to uncertainty analysis. The estimated uncertainty interval for the analyzed accident sequence is large. The discrepancies between the groups are also substantial, but can be explained. The sensitivity analyses that have been carried out concern, for example, the use of different CCF quantification models, alternative handling of CCF data, time windows for operator actions, time dependences in phased mission operation, the impact of state-of-knowledge dependences, and the ranking of dominating uncertainty contributors. Specific findings with respect to these issues are summarized in the paper.

  8. Uncertainty analysis of LBLOCA for Advanced Heavy Water Reactor

    International Nuclear Information System (INIS)

    Srivastava, A.; Lele, H.G.; Ghosh, A.K.; Kushwaha, H.S.

    2008-01-01

    The main objective of safety analysis is to demonstrate in a robust way that all safety requirements are met, i.e. that sufficient margins exist between the real values of important parameters and the threshold values at which damage to the barriers against release of radioactivity would occur. As stated in the IAEA Safety Requirements for the Design of NPPs, 'a safety analysis of the plant design shall be conducted in which methods of both deterministic and probabilistic analysis shall be applied'. It is required that 'the computer programs, analytical methods and plant models used in the safety analysis shall be verified and validated, and adequate consideration shall be given to uncertainties'. Uncertainties are present in calculations due to the computer codes, initial and boundary conditions, plant state, fuel parameters, scaling and the numerical solution algorithm. Conservative approaches, still widely used, were introduced to cover uncertainties due to the limited capability for modelling and understanding of physical phenomena at the early stages of safety analysis. The results obtained by this approach are quite unrealistic, and the level of conservatism is not fully known. Another approach is the use of best estimate (BE) codes with realistic initial and boundary conditions. If this approach is selected, it should be based on statistically combined uncertainties for plant initial and boundary conditions, assumptions and code models. The current trend is towards best estimate codes with some conservative assumptions, realistic input data and an uncertainty analysis. BE analysis with an evaluation of uncertainties offers, in addition, a way to quantify the existing plant safety margins. Its broader use in the future is therefore envisaged, even though it is not always feasible because of the difficulty of quantifying code uncertainties with a sufficiently narrow range for every phenomenon and for each accident sequence.
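
    In BE-plus-uncertainty analyses of this kind, the number of code runs needed for a given tolerance statement is commonly fixed with Wilks' order-statistic formula. The paper does not spell out its sampling scheme, so the following is a generic sketch rather than the authors' method:

```python
import math

def wilks_runs(coverage=0.95, confidence=0.95, order=1):
    """Smallest number of code runs n such that the `order`-th largest
    output bounds the `coverage` quantile with one-sided `confidence`
    (Wilks' order-statistic formula)."""
    n = order
    while True:
        # P(at least `order` of n samples fall above the coverage quantile)
        conf = 1.0 - sum(math.comb(n, k)
                         * (1 - coverage)**k * coverage**(n - k)
                         for k in range(order))
        if conf >= confidence:
            return n
        n += 1

print(wilks_runs())          # 59 -> the classic 95/95 first-order result
print(wilks_runs(order=2))   # 93 -> second-order, a tighter bound
```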

  9. Durability reliability analysis for corroding concrete structures under uncertainty

    Science.gov (United States)

    Zhang, Hao

    2018-02-01

    This paper presents a durability reliability analysis of reinforced concrete structures subject to the action of marine chloride. The focus is to provide insight into the role of epistemic uncertainties on durability reliability. The corrosion model involves a number of variables whose probabilistic characteristics cannot be fully determined due to the limited availability of supporting data. All sources of uncertainty, both aleatory and epistemic, should be included in the reliability analysis. Two methods are available to formulate the epistemic uncertainty: the imprecise probability-based method and the purely probabilistic method in which the epistemic uncertainties are modeled as random variables. The paper illustrates how the epistemic uncertainties are modeled and propagated in the two methods, and shows how epistemic uncertainties govern the durability reliability.
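
    The purely probabilistic treatment mentioned above is often implemented as a nested (double-loop) Monte Carlo: epistemic parameters are sampled in an outer loop and aleatory variability in an inner loop, so the failure probability itself acquires an epistemic distribution. A minimal sketch with an invented chloride-ingress limit state (all distributions are illustrative, not the paper's model):

```python
import numpy as np

rng = np.random.default_rng(0)
n_epistemic, n_aleatory = 200, 2000

failure_probs = []
for _ in range(n_epistemic):
    # Outer loop: epistemic draw of poorly known model parameters, e.g.
    # mean surface chloride content and its scatter (illustrative).
    mu_cl = rng.uniform(2.5, 4.0)      # kg/m^3
    sd_cl = rng.uniform(0.3, 0.8)
    # Inner loop: aleatory variability of load and resistance.
    chloride = rng.normal(mu_cl, sd_cl, n_aleatory)
    threshold = rng.lognormal(np.log(3.5), 0.15, n_aleatory)
    failure_probs.append(np.mean(chloride > threshold))

p = np.array(failure_probs)
print(f"failure probability: median {np.median(p):.3f}, "
      f"90% epistemic band [{np.quantile(p, 0.05):.3f}, "
      f"{np.quantile(p, 0.95):.3f}]")
```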

  10. Validation of Fuel Performance Uncertainty for RIA Safety Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Park, Nam-Gyu; Yoo, Jong-Sung; Jung, Yil-Sup [KEPCO Nuclear Fuel Co., Daejeon (Korea, Republic of)

    2016-10-15

    To achieve this, the computer code performance has to be validated against experimental results, and for the uncertainty quantification, the important uncertainty parameters need to be selected and the combined uncertainty evaluated with an acceptable statistical treatment. Important uncertainty parameters for the rod performance, such as fuel enthalpy, fission gas release and cladding hoop strain, were chosen through rigorous sensitivity studies, and their validity was assessed using the experimental results of rods tested in CABRI and NSRR. The analysis revealed that several tested rods were not bounded within the combined fuel performance uncertainty. An assessment of fuel performance with an extended fuel power uncertainty on the rods tested in NSRR and CABRI was also performed; again, several tested rods were not bounded within the calculated fuel performance uncertainty. This implies that the currently considered uncertainty ranges of the parameters are not sufficient to cover the fuel performance.

  11. Nuclear data sensitivity/uncertainty analysis for XT-ADS

    International Nuclear Information System (INIS)

    Sugawara, Takanori; Sarotto, Massimo; Stankovskiy, Alexey; Van den Eynde, Gert

    2011-01-01

    Highlights: → Sensitivity and uncertainty analyses were performed to assess the reliability of the XT-ADS neutronic design. → The uncertainties deduced from the covariance data for the XT-ADS criticality were 0.94%, 1.9% and 1.1% with the SCALE 44-group, TENDL-2009 and JENDL-3.3 data, respectively. → When the target accuracy of 0.3% Δk for the criticality is considered, these uncertainties do not satisfy it. → To achieve this accuracy, the uncertainties should be reduced through experiments under adequate conditions. - Abstract: The XT-ADS, an accelerator-driven system for an experimental demonstration, has been investigated in the framework of the IP EUROTRANS FP6 project. In this study, sensitivity and uncertainty analyses were performed to assess the reliability of the XT-ADS neutronic design. The sensitivity analysis showed that the sensitivity coefficients differed significantly depending on the geometry models and calculation codes. The uncertainty analysis confirmed that the uncertainties deduced from the covariance data varied significantly depending on the data used: the uncertainties for the XT-ADS criticality were 0.94%, 1.9% and 1.1% with the SCALE 44-group, TENDL-2009 and JENDL-3.3 data, respectively. When the target accuracy of 0.3% Δk for the criticality is considered, these uncertainties do not satisfy it. To achieve this accuracy, the uncertainties should be reduced through experiments under adequate conditions.

  12. Some reflections on uncertainty analysis and management

    International Nuclear Information System (INIS)

    Aven, Terje

    2010-01-01

    A guide to quantitative uncertainty analysis and management in industry has recently been issued. The guide provides an overall framework for uncertainty modelling and characterisations, using probabilities but also other uncertainty representations (including the Dempster-Shafer theory). A number of practical applications showing how to use the framework are presented. The guide is considered as an important contribution to the field, but there is a potential for improvements. These relate mainly to the scientific basis and clarification of critical issues, for example, concerning the meaning of a probability and the concept of model uncertainty. A reformulation of the framework is suggested using probabilities as the only representation of uncertainty. Several simple examples are included to motivate and explain the basic ideas of the modified framework.

  13. The explicit treatment of model uncertainties in the presence of aleatory and epistemic parameter uncertainties in risk and reliability analysis

    International Nuclear Information System (INIS)

    Ahn, Kwang Il; Yang, Joon Eon

    2003-01-01

    In the risk and reliability analysis of complex technological systems, the primary concern of formal uncertainty analysis is to understand why uncertainties arise and to evaluate how they impact the results of the analysis. In recent times, many uncertainty analyses have focused on the parameters of the risk and reliability analysis models, whose values are uncertain in an aleatory or an epistemic way. As the field of parametric uncertainty analysis matures, however, more attention is being paid to the explicit treatment of uncertainties in the predictive model itself, as well as in the accuracy of the predictive model. The essential steps for evaluating the impacts of these model uncertainties in the presence of parameter uncertainties are to determine rigorously the various sources of uncertainty to be addressed in the underlying model itself and, in turn, in the model parameters, based on our state of knowledge and relevant evidence. Answering clearly the question of how to characterize and treat explicitly the foregoing different sources of uncertainty is particularly important for practical aspects such as the risk and reliability optimization of systems, as well as for more transparent risk information and decision-making under various uncertainties. The main purpose of this paper is to provide practical guidance for quantitatively treating the various model uncertainties that are often encountered in the risk and reliability modelling process of complex technological systems.

  14. Uncertainty and sensitivity analysis in nuclear accident consequence assessment

    International Nuclear Information System (INIS)

    Karlberg, Olof.

    1989-01-01

    This report contains the results of a four-year project carried out under research contracts with the Nordic Cooperation in Nuclear Safety and the National Institute for Radiation Protection. An uncertainty/sensitivity analysis methodology consisting of Latin hypercube sampling and regression analysis was applied to an accident consequence model. A number of input parameters were selected, and the uncertainties related to these parameters were estimated by a Nordic group of experts. Individual doses, collective dose, health effects and their related uncertainties were then calculated for three release scenarios and for a representative sample of meteorological situations. For two of the scenarios the acute phase after an accident was simulated, and for one the long-term consequences. The most significant parameters were identified. The outer limits of the calculated uncertainty distributions are large and grow to several orders of magnitude for the low-probability consequences. The uncertainty in the expectation values is typically a factor of 2-5 (1 sigma). The variation in the model responses due to the variation of the weather parameters is roughly equal to the variation induced by the parameter uncertainties. The most important parameters turned out to be different for each pathway of exposure, as could be expected. However, the overall most important parameters are the wet deposition coefficient and the shielding factors. A general discussion of the usefulness of uncertainty analysis in consequence analysis is also given. (au)
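
    The Latin-hypercube-plus-regression workflow used in this study can be sketched generically: sample the inputs with an LHS design, run the consequence model, and rank the inputs by standardized regression coefficients. The toy dose model and distributions below are invented for illustration and are not the study's model:

```python
import numpy as np
from scipy.stats import qmc

n = 500
# Latin hypercube sample of three uncertain inputs (illustrative):
# wet deposition coefficient, shielding factor, dose conversion factor.
u = qmc.LatinHypercube(d=3, seed=42).random(n)
wet_dep = 10.0 ** (-5.0 + 2.0 * u[:, 0])   # log-uniform 1e-5 .. 1e-3
shield = 0.1 + 0.8 * u[:, 1]               # uniform 0.1 .. 0.9
dcf = 0.5 + 1.5 * u[:, 2]                  # uniform 0.5 .. 2.0

dose = 1.0e5 * wet_dep * shield * dcf      # toy consequence model

# Standardized regression coefficients (SRC) as importance measures.
X = np.column_stack([wet_dep, shield, dcf])
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
ys = (dose - dose.mean()) / dose.std()
src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
for name, b in zip(["wet deposition", "shielding factor", "DCF"], src):
    print(f"{name:16s} SRC = {b:+.2f}")
```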

  15. A methodology for uncertainty analysis of reference equations of state

    DEFF Research Database (Denmark)

    Cheung, Howard; Frutiger, Jerome; Bell, Ian H.

    We present a detailed methodology for the uncertainty analysis of reference equations of state (EOS) based on Helmholtz energy. In recent years there has been an increased interest in uncertainties of property data and process models of thermal systems. In the literature there are various...... for uncertainty analysis is suggested as a tool for EOS. The uncertainties of the EOS properties are calculated from the experimental values and the EOS model structure through the parameter covariance matrix and subsequent linear error propagation. This allows reporting the uncertainty range (95% confidence...

  16. Fuzzy Uncertainty Evaluation for Fault Tree Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ki Beom; Shim, Hyung Jin [Seoul National University, Seoul (Korea, Republic of); Jae, Moo Sung [Hanyang University, Seoul (Korea, Republic of)

    2015-05-15

    The traditional probabilistic approach can calculate relatively accurate results. However, it requires a long time because of the repetitive computation of the MC method. In addition, when informative data for statistical analysis are not sufficient, or some events are mainly caused by human error, the probabilistic approach may not be possible, because the uncertainties of these events are difficult to express with probability distributions. In order to reduce the computation time and to quantify the uncertainties of top events when there are basic events whose uncertainties are difficult to express with probability distributions, fuzzy uncertainty propagation based on fuzzy set theory can be applied. In this paper, we develop a fuzzy uncertainty propagation code and apply it to the fault tree of the core damage accident following a large loss of coolant accident (LLOCA). The code is implemented and tested on the fault tree of a radiation release accident. We apply it to the fault tree of the core damage accident after the LLOCA in three cases and compare the results with those computed by probabilistic uncertainty propagation using the MC method. The results of the fuzzy uncertainty propagation can be obtained in a relatively short time and cover the results obtained by the probabilistic uncertainty propagation.
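
    Fuzzy uncertainty propagation of this kind is typically done with alpha-cut interval arithmetic: at each membership level, the basic-event intervals are pushed through the AND/OR gates. A minimal sketch (the fault tree TOP = A AND (B OR C) and the triangular membership numbers are invented, not the paper's LLOCA tree):

```python
import numpy as np

def tri_alpha_cut(low, mode, high, alpha):
    """Interval of a triangular fuzzy number at membership level alpha."""
    return (low + alpha * (mode - low), high - alpha * (high - mode))

def fuzzy_top_event(alphas):
    """Propagate fuzzy basic-event probabilities through a small tree:
    TOP = A AND (B OR C), assuming independence at each gate."""
    cuts = []
    for a in alphas:
        A = tri_alpha_cut(1e-4, 5e-4, 1e-3, a)
        B = tri_alpha_cut(1e-3, 2e-3, 5e-3, a)
        C = tri_alpha_cut(5e-4, 1e-3, 2e-3, a)
        # OR gate: p = 1-(1-pB)(1-pC); AND gate: multiply.  The top event
        # is monotone in each input here, so endpoint arithmetic suffices.
        lo = A[0] * (1 - (1 - B[0]) * (1 - C[0]))
        hi = A[1] * (1 - (1 - B[1]) * (1 - C[1]))
        cuts.append((a, lo, hi))
    return cuts

for a, lo, hi in fuzzy_top_event(np.linspace(0.0, 1.0, 5)):
    print(f"alpha={a:.2f}: top-event interval [{lo:.2e}, {hi:.2e}]")
```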

  17. Uncertainty Analysis of In-leakage Test for Pressurized Control Room Envelope

    Energy Technology Data Exchange (ETDEWEB)

    Lee, J. B. [KHNP Central Research Institute, Daejeon (Korea, Republic of)

    2013-10-15

    In-leakage tests for the control room envelopes (CRE) of newly constructed nuclear power plants are required to prove control room habitability. The results of the in-leakage tests should be analyzed using an uncertainty analysis. Test uncertainty can be an issue if the test results for pressurized CREs show low in-leakage. To gain a better understanding of the test uncertainty, a statistical model for the uncertainty analysis is described here, and a representative uncertainty analysis of a sample in-leakage test is presented. By using the statistical method we can evaluate the test result at a certain level of significance. This method is particularly helpful when the difference between the two mean values of the test results is small.

  18. Uncertainty Analysis of In-leakage Test for Pressurized Control Room Envelope

    International Nuclear Information System (INIS)

    Lee, J. B.

    2013-01-01

    In-leakage tests for the control room envelopes (CRE) of newly constructed nuclear power plants are required to prove control room habitability. The results of the in-leakage tests should be analyzed using an uncertainty analysis. Test uncertainty can be an issue if the test results for pressurized CREs show low in-leakage. To gain a better understanding of the test uncertainty, a statistical model for the uncertainty analysis is described here, and a representative uncertainty analysis of a sample in-leakage test is presented. By using the statistical method we can evaluate the test result at a certain level of significance. This method is particularly helpful when the difference between the two mean values of the test results is small.

  19. Urban drainage models - simplifying uncertainty analysis for practitioners

    DEFF Research Database (Denmark)

    Vezzaro, Luca; Mikkelsen, Peter Steen; Deletic, Ana

    2013-01-01

    There is increasing awareness about uncertainties in the modelling of urban drainage systems and, as such, many new methods for uncertainty analyses have been developed. Despite this, all available methods have limitations which restrict their widespread application among practitioners. Here... in each measured/observed datapoint; an issue that is commonly overlooked in the uncertainty analysis of urban drainage models. This comparison allows the user to intuitively estimate the optimum number of simulations required to conduct uncertainty analyses. The output of the method includes parameter...

  20. Code development for eigenvalue total sensitivity analysis and total uncertainty analysis

    International Nuclear Information System (INIS)

    Wan, Chenghui; Cao, Liangzhi; Wu, Hongchun; Zu, Tiejun; Shen, Wei

    2015-01-01

    Highlights: • We develop a new code for total sensitivity and total uncertainty analysis. • The implicit effects of cross sections can be considered. • The results of our code agree well with TSUNAMI-1D. • A detailed analysis of the origins of the implicit effects is performed. - Abstract: The uncertainties of multigroup cross sections notably impact the eigenvalue of the neutron-transport equation. We report on a total sensitivity analysis and total uncertainty analysis code named UNICORN that has been developed by applying the direct numerical perturbation method and the statistical sampling method. In order to consider the contributions of the various basic cross sections and the implicit effects, which are indirect results of the multigroup cross sections through the resonance self-shielding calculation, an improved multigroup cross-section perturbation model is developed. The DRAGON 4.0 code, with a WIMSD-4 format library, is used by UNICORN to carry out the resonance self-shielding and neutron-transport calculations. In addition, the bootstrap technique has been applied to the statistical sampling method in UNICORN to obtain much steadier and more reliable uncertainty results. The UNICORN code has been verified against TSUNAMI-1D by analyzing a TMI-1 pin-cell case. The numerical results show that the total uncertainty of the eigenvalue caused by the cross sections can reach up to about 0.72%; the contributions of the basic cross sections and their implicit effects are therefore not negligible.
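
    The bootstrap step mentioned in the highlights can be sketched in isolation: resampling the N sampled-cross-section eigenvalues with replacement yields a confidence interval on the uncertainty estimate itself. The eigenvalue sample below is synthetic, not UNICORN output:

```python
import numpy as np

rng = np.random.default_rng(7)
# Synthetic eigenvalues from N transport runs with sampled cross sections.
k_samples = rng.normal(1.0000, 0.0072, size=300)

# Bootstrap the standard deviation of the eigenvalue sample.
boot_std = np.array([
    rng.choice(k_samples, size=k_samples.size, replace=True).std(ddof=1)
    for _ in range(5000)
])
print(f"k-eff uncertainty estimate: {k_samples.std(ddof=1):.2%}")
print(f"95% bootstrap CI on that estimate: "
      f"[{np.quantile(boot_std, 0.025):.2%}, "
      f"{np.quantile(boot_std, 0.975):.2%}]")
```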

  1. A Preliminary Study on Sensitivity and Uncertainty Analysis with Statistic Method: Uncertainty Analysis with Cross Section Sampling from Lognormal Distribution

    Energy Technology Data Exchange (ETDEWEB)

    Song, Myung Sub; Kim, Song Hyun; Kim, Jong Kyung [Hanyang Univ., Seoul (Korea, Republic of); Noh, Jae Man [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2013-10-15

    Uncertainty evaluation with the statistical method is performed by repeating the transport calculation with directly perturbed, sampled nuclear data; a reliable uncertainty result can then be obtained by analyzing the results of the numerous transport calculations. One known problem in uncertainty analysis with the statistical approach is that sampling cross sections from a normal (Gaussian) distribution with a relatively large standard deviation leads to sampling errors, such as the sampling of negative cross sections. Some correction methods have been noted; however, these methods can distort the distribution of the sampled cross sections. In this study, a method of sampling the nuclear data from a lognormal distribution is proposed. Criticality calculations with the sampled nuclear data are then performed, and the results are compared with those from the normal distribution conventionally used in previous studies. The statistical sampling method with the lognormal distribution was proposed to increase the sampling accuracy without negative sampling errors, and a stochastic cross-section sampling and writing program was developed. For the sensitivity and uncertainty analysis, the cross-section sampling was pursued with both the normal and the lognormal distribution. The uncertainties caused by the covariance of the (n,γ) cross sections were evaluated by solving the GODIVA problem. The results show that the sampling method with the lognormal distribution can efficiently solve the negative sampling problem referred to in previous studies. It is expected that this study will contribute to increasing the accuracy of sampling-based uncertainty analysis.
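
    The proposed lognormal sampling amounts to moment matching: choose the lognormal parameters so the sampled cross sections reproduce the evaluated mean and standard deviation while remaining strictly positive. A hedged sketch of that idea (the mean and the 40% relative standard deviation below are invented; the paper's sampling program is not public):

```python
import numpy as np

def sample_lognormal(mean, rel_std, size, rng):
    """Sample positive cross sections with the given mean and relative
    standard deviation using a moment-matched lognormal distribution."""
    var = (rel_std * mean) ** 2
    sigma2 = np.log(1.0 + var / mean**2)
    mu = np.log(mean) - 0.5 * sigma2
    return rng.lognormal(mu, np.sqrt(sigma2), size)

rng = np.random.default_rng(11)
xs_normal = rng.normal(2.0, 0.8, 100_000)             # 40% rel. std
xs_lognorm = sample_lognormal(2.0, 0.4, 100_000, rng)

print(f"negative samples, normal   : {(xs_normal < 0).mean():.2%}")
print(f"negative samples, lognormal: {(xs_lognorm <= 0).mean():.2%}")
print(f"lognormal mean/std: {xs_lognorm.mean():.3f} / {xs_lognorm.std():.3f}")
```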

  2. A Preliminary Study on Sensitivity and Uncertainty Analysis with Statistic Method: Uncertainty Analysis with Cross Section Sampling from Lognormal Distribution

    International Nuclear Information System (INIS)

    Song, Myung Sub; Kim, Song Hyun; Kim, Jong Kyung; Noh, Jae Man

    2013-01-01

    Uncertainty evaluation with the statistical method is performed by repeating the transport calculation with directly perturbed, sampled nuclear data; a reliable uncertainty result can then be obtained by analyzing the results of the numerous transport calculations. One known problem in uncertainty analysis with the statistical approach is that sampling cross sections from a normal (Gaussian) distribution with a relatively large standard deviation leads to sampling errors, such as the sampling of negative cross sections. Some correction methods have been noted; however, these methods can distort the distribution of the sampled cross sections. In this study, a method of sampling the nuclear data from a lognormal distribution is proposed. Criticality calculations with the sampled nuclear data are then performed, and the results are compared with those from the normal distribution conventionally used in previous studies. The statistical sampling method with the lognormal distribution was proposed to increase the sampling accuracy without negative sampling errors, and a stochastic cross-section sampling and writing program was developed. For the sensitivity and uncertainty analysis, the cross-section sampling was pursued with both the normal and the lognormal distribution. The uncertainties caused by the covariance of the (n,γ) cross sections were evaluated by solving the GODIVA problem. The results show that the sampling method with the lognormal distribution can efficiently solve the negative sampling problem referred to in previous studies. It is expected that this study will contribute to increasing the accuracy of sampling-based uncertainty analysis.

  3. The role of uncertainty analysis in dose reconstruction and risk assessment

    International Nuclear Information System (INIS)

    Hoffman, F.O.; Simon, S.L.; Thiessen, K.M.

    1996-01-01

    Dose reconstruction and risk assessment rely heavily on the use of mathematical models to extrapolate information beyond the realm of direct observation. Because models are merely approximations of real systems, their predictions are inherently uncertain. As a result, full disclosure of uncertainty in dose and risk estimates is essential to achieve scientific credibility and to build public trust. The need for formal analysis of uncertainty in model predictions was presented during the nineteenth annual meeting of the NCRP. At that time, quantitative uncertainty analysis was considered a relatively new and difficult subject practiced by only a few investigators. Today, uncertainty analysis has become synonymous with the assessment process itself. When an uncertainty analysis is used iteratively within the assessment process, it can guide experimental research to refine dose and risk estimates, deferring potentially high cost or high consequence decisions until uncertainty is either acceptable or irreducible. Uncertainty analysis is now mandated for all ongoing dose reconstruction projects within the United States, a fact that distinguishes dose reconstruction from other types of exposure and risk assessments. 64 refs., 6 figs., 1 tab

  4. Uncertainty Analysis of Consequence Management (CM) Data Products.

    Energy Technology Data Exchange (ETDEWEB)

    Hunt, Brian D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Eckert-Gallup, Aubrey Celia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Cochran, Lainy Dromgoole [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kraus, Terrence D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Fournier, Sean Donovan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Allen, Mark B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Schetnan, Richard Reed [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Simpson, Matthew D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Okada, Colin E. [Remote Sensing Lab. (RSL), Nellis AFB, Las Vegas, NV (United States); Bingham, Avery A. [Remote Sensing Lab. (RSL), Nellis AFB, Las Vegas, NV (United States)

    2018-01-01

    The goal of this project is to develop and execute methods for characterizing uncertainty in data products that are developed and distributed by the DOE Consequence Management (CM) Program. A global approach to this problem is necessary because multiple sources of error and uncertainty from across the CM skill sets contribute to the ultimate production of CM data products. This report presents the methods used to develop a probabilistic framework to characterize this uncertainty and provides results for an uncertainty analysis for a study scenario analyzed using this framework.

  5. Application of status uncertainty analysis methods for AP1000 LBLOCA calculation

    International Nuclear Information System (INIS)

    Zhang Shunxiang; Liang Guoxing

    2012-01-01

    Parameter uncertainty analysis establishes, by appropriate methods, the response relations between input parameter uncertainties and output uncertainties. The application of parameter uncertainty analysis makes the simulation of the plant state more accurate and improves plant economy while providing reasonable assurance of safety. The AP1000 LBLOCA was analyzed in this paper, and the results indicate that the random sampling statistical analysis method, the sensitivity analysis numerical method and the traditional error propagation analysis method can all provide a quite large peak cladding temperature (PCT) safety margin, which is helpful for choosing a suitable uncertainty analysis method to improve plant economy. Additionally, the random sampling statistical analysis method, which applies mathematical statistics theory, yields the largest safety margin owing to its reduced conservatism. Compared with the traditional conservative bounding parameter analysis method, the random sampling method can provide a PCT margin of 100 K, while the other two methods provide only 50-60 K. (authors)

  6. Analytic uncertainty and sensitivity analysis of models with input correlations

    Science.gov (United States)

    Zhu, Yueying; Wang, Qiuping A.; Li, Wei; Cai, Xu

    2018-03-01

    Probabilistic uncertainty analysis is a common means of evaluating mathematical models. In mathematical modelling, the uncertainty in input variables is specified through distribution laws. Its contribution to the uncertainty in the model response is usually analyzed by assuming that the input variables are independent of each other. However, correlated parameters often occur in practical applications. In the present paper, an analytic method is built for the uncertainty and sensitivity analysis of models in the presence of input correlations. With the method, it is straightforward to identify the importance of the independence and correlations of input variables in determining the model response. This allows one to decide whether or not the input correlations should be considered in practice. Numerical examples demonstrate the effectiveness and validity of our analytic method in the analysis of general models. A practical application of the method to the uncertainty and sensitivity analysis of a deterministic HIV model is also proposed.
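
    For a linear model the analytic result is exact and easy to verify: the response variance is the quadratic form of the coefficient vector with the input covariance matrix, and dropping the off-diagonal terms shows what the independence assumption would give. A small numerical check (the coefficients and covariances are invented for illustration):

```python
import numpy as np

# Linear model y = a . x with correlated inputs (illustrative numbers).
a = np.array([1.0, -2.0, 0.5])
cov = np.array([[1.00, 0.60, 0.00],
                [0.60, 2.00, 0.30],
                [0.00, 0.30, 0.50]])

var_analytic = a @ cov @ a              # exact for a linear model
var_independent = a**2 @ np.diag(cov)   # ignoring correlations

rng = np.random.default_rng(5)
x = rng.multivariate_normal(np.zeros(3), cov, size=200_000)
var_mc = (x @ a).var(ddof=1)            # Monte Carlo cross-check

print(f"analytic: {var_analytic:.3f}, MC: {var_mc:.3f}, "
      f"independence assumption: {var_independent:.3f}")
```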

  7. Uncertainty Analysis of Seebeck Coefficient and Electrical Resistivity Characterization

    Science.gov (United States)

    Mackey, Jon; Sehirlioglu, Alp; Dynys, Fred

    2014-01-01

    In order to provide a complete description of a material's thermoelectric power factor, an uncertainty interval is required in addition to the measured nominal value. The uncertainty may contain sources of measurement error, including systematic bias error and precision error of a statistical nature. The work focuses specifically on the popular ZEM-3 (Ulvac Technologies) measurement system, but the methods apply to any measurement system. The analysis accounts for sources of systematic error, including sample preparation tolerance, measurement probe placement, the thermocouple cold-finger effect and measurement parameters, in addition to uncertainty of a statistical nature. Complete uncertainty analysis of a measurement system allows for more reliable comparison of measurement data between laboratories.

  8. Parameter Uncertainty for Repository Thermal Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Hardin, Ernest [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hadgu, Teklu [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Greenberg, Harris [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Dupont, Mark [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-10-01

    This report is one follow-on to a study of reference geologic disposal design concepts (Hardin et al. 2011a). Based on an analysis of maximum temperatures, that study concluded that certain disposal concepts would require extended decay storage prior to emplacement, or the use of small waste packages, or both. The study used nominal values for thermal properties of host geologic media and engineered materials, demonstrating the need for uncertainty analysis to support the conclusions. This report is a first step that identifies the input parameters of the maximum temperature calculation, surveys published data on measured values, uses an analytical approach to determine which parameters are most important, and performs an example sensitivity analysis. Using results from this first step, temperature calculations planned for FY12 can focus on only the important parameters, and can use the uncertainty ranges reported here. The survey of published information on thermal properties of geologic media and engineered materials, is intended to be sufficient for use in generic calculations to evaluate the feasibility of reference disposal concepts. A full compendium of literature data is beyond the scope of this report. The term “uncertainty” is used here to represent both measurement uncertainty and spatial variability, or variability across host geologic units. For the most important parameters (e.g., buffer thermal conductivity) the extent of literature data surveyed samples these different forms of uncertainty and variability. Finally, this report is intended to be one chapter or section of a larger FY12 deliverable summarizing all the work on design concepts and thermal load management for geologic disposal (M3FT-12SN0804032, due 15Aug2012).

  9. Improved Monte Carlo Method for PSA Uncertainty Analysis

    International Nuclear Information System (INIS)

    Choi, Jongsoo

    2016-01-01

    The treatment of uncertainty is an important issue for regulatory decisions. Uncertainties exist from knowledge limitations. A probabilistic approach has exposed some of these limitations and provided a framework to assess their significance and to assist in developing a strategy to accommodate them in the regulatory process. The uncertainty analysis (UA) is usually based on the Monte Carlo method. This paper proposes a Monte Carlo UA approach to calculate mean risk metrics accounting for the state-of-knowledge correlation (SOKC) between basic events (including CCFs), using efficient random number generators, and to meet Capability Category III of the ASME/ANS PRA standard. Audit calculations are needed in PSA regulatory reviews of uncertainty analysis results submitted for licensing. The proposed Monte Carlo UA approach provides a high degree of confidence in PSA reviews. All PSAs need to account for the SOKC between event probabilities to meet the ASME/ANS PRA standard.
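
    The state-of-knowledge correlation (SOKC) arises when several basic events share one uncertain parameter estimate; in a Monte Carlo UA it is honoured by drawing a single epistemic sample per trial for all such events. A minimal sketch for two identical components (the failure-probability distribution is invented), showing that SOKC raises the mean of the product:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000
# Epistemic uncertainty on one failure probability shared by two
# identical components (lognormal, illustrative parameters).
mu, sigma = np.log(1e-3), 0.8

p_shared = rng.lognormal(mu, sigma, n)
top_sokc = (p_shared * p_shared).mean()   # one draw per trial (SOKC)

p1 = rng.lognormal(mu, sigma, n)          # independent draws (no SOKC)
p2 = rng.lognormal(mu, sigma, n)
top_indep = (p1 * p2).mean()

print(f"mean top-event probability with SOKC   : {top_sokc:.2e}")
print(f"mean top-event probability, independent: {top_indep:.2e}")
```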

  10. Improved Monte Carlo Method for PSA Uncertainty Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Jongsoo [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2016-10-15

    The treatment of uncertainty is an important issue for regulatory decisions. Uncertainties exist from knowledge limitations. A probabilistic approach has exposed some of these limitations and provided a framework to assess their significance and to assist in developing a strategy to accommodate them in the regulatory process. The uncertainty analysis (UA) is usually based on the Monte Carlo method. This paper proposes a Monte Carlo UA approach to calculate mean risk metrics accounting for the state-of-knowledge correlation (SOKC) between basic events (including CCFs), using efficient random number generators, and to meet Capability Category III of the ASME/ANS PRA standard. Audit calculations are needed in PSA regulatory reviews of uncertainty analysis results submitted for licensing. The proposed Monte Carlo UA approach provides a high degree of confidence in PSA reviews. All PSAs need to account for the SOKC between event probabilities to meet the ASME/ANS PRA standard.

  11. Joint analysis of epistemic and aleatory uncertainty in stability analysis for geo-hazard assessments

    Science.gov (United States)

    Rohmer, Jeremy; Verdel, Thierry

    2017-04-01

    Uncertainty analysis is an unavoidable task in the stability analysis of any geotechnical system. Such analysis usually relies on the safety factor SF: if SF is below some specified threshold, failure is possible. The objective of the stability analysis is then to estimate the failure probability P that SF is below the specified threshold. When dealing with uncertainties, two facets should be considered, as outlined by several authors in the domain of geotechnics, namely "aleatory uncertainty" (also named "randomness" or "intrinsic variability") and "epistemic uncertainty" (i.e. when facing "vague, incomplete or imprecise information" such as limited databases and observations or "imperfect" modelling). The benefits of separating both facets of uncertainty can be seen from a risk management perspective because: - aleatory uncertainty, being a property of the system under study, cannot be reduced; however, practical actions can be taken to circumvent the potentially dangerous effects of such variability; - epistemic uncertainty, being due to the incomplete/imprecise nature of available information, can be reduced by, e.g., increasing the number of tests (lab or in situ surveys), improving the measurement methods or evaluating the calculation procedure with model tests, and confronting more information sources (expert opinions, data from literature, etc.). Uncertainty treatment in stability analysis is usually restricted to the probabilistic framework to represent both facets of uncertainty. Yet, in the domain of geo-hazard assessments (like landslides, mine pillar collapse, rockfalls, etc.), the validity of this approach can be debatable. In the present communication, we propose to review the major criticisms available in the literature against the systematic use of probability in situations of a high degree of uncertainty. On this basis, the feasibility of using a more flexible uncertainty representation tool is then investigated, namely possibility distributions...

  12. Report of a CSNI workshop on uncertainty analysis methods. Volume 1 + 2

    International Nuclear Information System (INIS)

    Wickett, A.J.; Yadigaroglu, G.

    1994-08-01

    The OECD NEA CSNI Principal Working Group 2 (PWG2) Task Group on Thermal Hydraulic System Behaviour (TGTHSB) has, in recent years, received presentations of a variety of different methods to analyze the uncertainty in the calculations of advanced unbiased (best estimate) codes. Proposals were also made for an International Standard Problem (ISP) to compare the uncertainty analysis methods. The objectives of the Workshop were to discuss and fully understand the principles of uncertainty analysis relevant to LOCA modelling and similar problems; to examine the underlying issues from first principles, in preference to comparing and contrasting the currently proposed methods; to reach consensus on the issues identified as far as possible, while not avoiding the controversial aspects; to identify as clearly as possible unreconciled differences; and to issue a Status Report. Eight uncertainty analysis methods were presented. A structured discussion of various aspects of uncertainty analysis followed - the need for uncertainty analysis, identification and ranking of uncertainties, characterisation, quantification and combination of uncertainties, and applications, resources and future developments. As a result, the objectives set out above were, to a very large extent, achieved. Plans for the ISP were also discussed. Volume 1 contains a record of the discussions on uncertainty methods. Volume 2 is a compilation of descriptions of the eight uncertainty analysis methods presented at the workshop.

  13. Survey of sampling-based methods for uncertainty and sensitivity analysis

    International Nuclear Information System (INIS)

    Helton, J.C.; Johnson, J.D.; Sallaberry, C.J.; Storlie, C.B.

    2006-01-01

    Sampling-based methods for uncertainty and sensitivity analysis are reviewed. The following topics are considered: (i) definition of probability distributions to characterize epistemic uncertainty in analysis inputs; (ii) generation of samples from uncertain analysis inputs; (iii) propagation of sampled inputs through an analysis; (iv) presentation of uncertainty analysis results; and (v) determination of sensitivity analysis results. Special attention is given to the determination of sensitivity analysis results, with brief descriptions and illustrations given for the following procedures/techniques: examination of scatterplots, correlation analysis, regression analysis, partial correlation analysis, rank transformations, statistical tests for patterns based on gridding, entropy tests for patterns based on gridding, nonparametric regression analysis, squared rank differences/rank correlation coefficient test, two-dimensional Kolmogorov-Smirnov test, tests for patterns based on distance measures, top-down coefficient of concordance, and variance decomposition.
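
    Of the sensitivity procedures listed, rank (Spearman) correlation is among the simplest to reproduce: after sampling the inputs and running the model, each input's rank correlation with the output serves as an importance measure. A small sketch with an invented monotone test model:

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(4)
n = 1000
# Sampled uncertain inputs (illustrative distributions).
x1 = rng.uniform(0.5, 1.5, n)
x2 = rng.lognormal(0.0, 0.5, n)
x3 = rng.normal(10.0, 1.0, n)

y = x1**2 * x2 + 0.01 * x3        # monotone toy response

# Rank (Spearman) correlations as sampling-based sensitivity measures.
for name, x in [("x1", x1), ("x2", x2), ("x3", x3)]:
    rho, _ = spearmanr(x, y)
    print(f"{name}: rank correlation with y = {rho:+.2f}")
```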

  14. Survey of sampling-based methods for uncertainty and sensitivity analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Jay Dean; Helton, Jon Craig; Sallaberry, Cedric J.; Storlie, Curt B. (Colorado State University, Fort Collins, CO)

    2006-06-01

    Sampling-based methods for uncertainty and sensitivity analysis are reviewed. The following topics are considered: (1) definition of probability distributions to characterize epistemic uncertainty in analysis inputs, (2) generation of samples from uncertain analysis inputs, (3) propagation of sampled inputs through an analysis, (4) presentation of uncertainty analysis results, and (5) determination of sensitivity analysis results. Special attention is given to the determination of sensitivity analysis results, with brief descriptions and illustrations given for the following procedures/techniques: examination of scatterplots, correlation analysis, regression analysis, partial correlation analysis, rank transformations, statistical tests for patterns based on gridding, entropy tests for patterns based on gridding, nonparametric regression analysis, squared rank differences/rank correlation coefficient test, two-dimensional Kolmogorov-Smirnov test, tests for patterns based on distance measures, top-down coefficient of concordance, and variance decomposition.

  15. Integrated uncertainty analysis using RELAP/SCDAPSIM/MOD4.0

    International Nuclear Information System (INIS)

    Perez, M.; Reventos, F.; Wagner, R.; Allison, C.

    2009-01-01

    The RELAP/SCDAPSIM/MOD4.0 code, designed to predict the behavior of reactor systems during normal and accident conditions, is being developed as part of an international nuclear technology Software Development and Training Program (SDTP). RELAP/SCDAPSIM/MOD4.0, which is the first version of RELAP5 completely rewritten to FORTRAN 90/95/2000 standards, uses the publicly available RELAP5 and SCDAP models in combination with (a) advanced programming and numerical techniques, (b) advanced SDTP-member-developed models for LWR, HWR, and research reactor analysis, and (c) a variety of other member-developed computational packages. One such computational package is an integrated uncertainty analysis package being developed jointly by the Technical University of Catalunya (UPC) and Innovative Systems Software (ISS). The integrated uncertainty analysis approach used in the package uses the following steps: 1. Selection of the plant; 2. Selection of the scenario; 3. Selection of the safety criteria; 4. Identification and ranking of the relevant phenomena based on the safety criteria; 5. Selection of the appropriate code parameters to represent those phenomena; 6. Association of uncertainty by means of Probability Distribution Functions (PDFs) for each selected parameter; 7. Random sampling of the selected parameters according to its PDF and performing multiple computer runs to obtain uncertainty bands with a certain percentile and confidence level; 8. Processing the results of the multiple computer runs to estimate the uncertainty bands for the computed quantities associated with the selected safety criteria. The first four steps are performed by the user prior to the RELAP/SCDAPSIM/MOD4.0 analysis. The remaining steps are included with the MOD4.0 integrated uncertainty analysis (IUA) package. This paper briefly describes the integrated uncertainty analysis package including (a) the features of the package, (b) the implementation of the package into RELAP/SCDAPSIM/MOD4.0, and

  16. Systematic Analysis Of Ocean Colour Uncertainties

    Science.gov (United States)

    Lavender, Samantha

    2013-12-01

    This paper reviews current research into the estimation of uncertainties as a pixel-based measure to aid non-specialist users of remote sensing products. An example MERIS image, captured on 28 March 2012, was processed with above-water atmospheric correction code, initially based on both the Antoine & Morel standard atmospheric correction, with a bright pixel correction component, and the Doerffer neural network coastal waters approach. It showed that analysis of the atmospheric by-products yields important information about the separation of the atmospheric and in-water signals, helping to signpost possible uncertainties in the atmospheric correction results. Further analysis has concentrated on implementing a 'simplistic' atmospheric correction so that the impact of changing the input auxiliary data can be analysed; the influence of changing surface pressure is demonstrated. Future work will focus on automating the analysis, so that the methodology can be implemented within an operational system.

  17. Estimating the measurement uncertainty in forensic blood alcohol analysis.

    Science.gov (United States)

    Gullberg, Rod G

    2012-04-01

    For many reasons, forensic toxicologists are being asked to determine and report their measurement uncertainty in blood alcohol analysis. While understood conceptually, the elements and computations involved in determining measurement uncertainty are generally foreign to most forensic toxicologists. Several established and well-documented methods are available to determine and report the uncertainty in blood alcohol measurement. A straightforward bottom-up approach is presented that includes: (1) specifying the measurand, (2) identifying the major components of uncertainty, (3) quantifying the components, (4) statistically combining the components and (5) reporting the results. A hypothetical example is presented that employs reasonable estimates for forensic blood alcohol analysis assuming headspace gas chromatography. These computations are easily employed in spreadsheet programs as well. Determining and reporting measurement uncertainty is an important element in establishing fitness-for-purpose. Indeed, the demand for such computations and information from the forensic toxicologist will continue to increase.
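
    The bottom-up combination in steps (3)-(5) reduces to a root-sum-of-squares of the component standard uncertainties followed by an expanded uncertainty with a coverage factor. A sketch with purely illustrative component values (not the paper's worked example):

```python
import math

# Bottom-up combination for a blood alcohol result of 0.152 g/100 mL.
# Component relative standard uncertainties are illustrative only.
components = {
    "calibrator preparation": 0.008,
    "method reproducibility": 0.015,
    "analytical bias":        0.010,
    "sampling/dilution":      0.006,
}

# Combine independent components in quadrature (root-sum-of-squares).
u_rel = math.sqrt(sum(u**2 for u in components.values()))
result = 0.152
U = 2 * u_rel * result    # expanded uncertainty, coverage factor k=2
print(f"combined relative standard uncertainty: {u_rel:.2%}")
print(f"reported result: {result:.3f} +/- {U:.3f} g/100 mL (k=2, ~95%)")
```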

  18. Uncertainty Propagation in Hypersonic Vehicle Aerothermoelastic Analysis

    Science.gov (United States)

    Lamorte, Nicolas Etienne

    Hypersonic vehicles face a challenging flight environment. The aerothermoelastic analysis of their components requires numerous simplifying approximations. Identifying and quantifying the effect of uncertainties pushes the limits of the existing deterministic models, and is pursued in this work. An uncertainty quantification framework is used to propagate the effects of identified uncertainties on the stability margins and performance of the different systems considered. First, the aeroelastic stability of a typical section representative of a control surface on a hypersonic vehicle is examined. Variability in the uncoupled natural frequencies of the system is modeled to mimic the effect of aerodynamic heating. Next, the stability of an aerodynamically heated panel representing a component of the skin of a generic hypersonic vehicle is considered. Uncertainty in the location of the transition from laminar to turbulent flow and in the heat flux prediction is quantified using CFD. In both cases significant reductions of the stability margins are observed. A loosely coupled airframe-integrated scramjet engine is considered next. The elongated body and cowl of the engine flow path are subject to harsh aerothermodynamic loading, which causes them to deform. Uncertainty associated with the deformation prediction is propagated to the engine performance analysis. The cowl deformation is the main contributor to the sensitivity of the propulsion system performance. Finally, a framework for aerothermoelastic stability boundary calculation for hypersonic vehicles using CFD is developed. The use of CFD enables one to consider different turbulence conditions, laminar or turbulent, and different models of the air mixture, in particular a real gas model which accounts for the dissociation of molecules at high temperature. The system is found to be sensitive to turbulence modeling as well as to the location of the transition from laminar to turbulent flow. Real gas effects play a minor role in the

  19. Uncertainty Instability Risk Analysis of High Concrete Arch Dam Abutments

    Directory of Open Access Journals (Sweden)

    Xin Cao

    2017-01-01

    The uncertainties associated with concrete arch dams rise with increasing dam height. Given the uncertainties associated with the influencing factors, the stability of high arch dam abutments was studied as a fuzzy random event. In addition, given the randomness and fuzziness of the calculation parameters as well as of the failure criterion, hazard-point and hazard-surface uncertainty instability risk ratio models were proposed for high arch dam abutments on the basis of credibility theory. The uncertainty instability failure criterion was derived through an analysis of the progressive instability failure process on the basis of Shannon's entropy theory. The uncertainties associated with the influencing factors were quantified by probability or possibility distribution assignments. Gaussian random theory was used to generate random realizations for the influencing factors with spatial variability. An uncertainty stability analysis method was proposed by combining finite element analysis and the limit equilibrium method. The instability risk ratio was calculated using the Monte Carlo simulation method and fuzzy random postprocessing. The results corroborate that the modeling approach is sound and that the calculation method is feasible.

  20. Two-dimensional cross-section sensitivity and uncertainty analysis for fusion reactor blankets

    International Nuclear Information System (INIS)

    Embrechts, M.J.

    1982-02-01

    A two-dimensional sensitivity and uncertainty analysis for the heating of the TF coil of the FED (fusion engineering device) blanket was performed. The uncertainties calculated are of the same order of magnitude as those resulting from a one-dimensional analysis. The largest uncertainties were caused by the cross-section uncertainties for chromium.

  1. Uncertainty analysis of nuclear waste package corrosion

    International Nuclear Information System (INIS)

    Kurth, R.E.; Nicolosi, S.L.

    1986-01-01

    This paper describes the results of an evaluation of three uncertainty analysis methods for assessing the possible variability in calculating the corrosion process in a nuclear waste package. The purpose of the study is to determine how each of three uncertainty analysis methods - Monte Carlo, Latin hypercube sampling (LHS) and a modified discrete probability distribution method - performs in such calculations. The purpose is not to examine the absolute magnitude of the numbers, but rather to rank the performance of each of the uncertainty methods in assessing the model variability. In this context it was found that the Monte Carlo method provided the most accurate assessment, but at a prohibitively high cost. The modified discrete probability method provided accuracy close to that of the Monte Carlo method for a fraction of the cost. The LHS method was found to be too inaccurate for this calculation, although it would be appropriate for use in a model which requires substantially more computer time than the one studied in this paper.
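
    The comparison performed in this study can be reproduced in miniature: estimate the same output statistic repeatedly with simple random sampling and with LHS, and compare the scatter of the estimates. The toy corrosion model below is invented; note that for smooth responses LHS often reduces the variance of mean estimates, whereas the paper found it inadequate for its particular model:

```python
import numpy as np
from scipy.stats import qmc

def corrosion_depth(u):
    """Toy corrosion model: depth = k * t^n, with uniform inputs mapped
    from the unit hypercube (purely illustrative)."""
    k = 0.5 + 1.5 * u[:, 0]
    n_exp = 0.3 + 0.4 * u[:, 1]
    return k * 1000.0 ** n_exp

n, trials = 64, 200
means_mc, means_lhs = [], []
rng = np.random.default_rng(9)
for i in range(trials):
    means_mc.append(corrosion_depth(rng.random((n, 2))).mean())
    lhs = qmc.LatinHypercube(d=2, seed=i).random(n)
    means_lhs.append(corrosion_depth(lhs).mean())

print(f"std of mean estimate, MC : {np.std(means_mc):.3f}")
print(f"std of mean estimate, LHS: {np.std(means_lhs):.3f}")
```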

  2. Risk characterization: description of associated uncertainties, sensitivity analysis

    International Nuclear Information System (INIS)

    Carrillo, M.; Tovar, M.; Alvarez, J.; Arraez, M.; Hordziejewicz, I.; Loreto, I.

    2013-01-01

    The PowerPoint presentation covers the risks associated with estimated levels of exposure, the description of uncertainty and variability in the analysis, sensitivity analysis, risks from exposure to multiple substances, the formulation of guidelines for carcinogenic and genotoxic compounds, and risks for subpopulations.

  3. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis.

    Science.gov (United States)

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-03-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning and hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable, and the model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis: we assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and the Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely the Analytical Hierarchy Process (AHP) and Ordered Weighted Averaging (OWA), implemented in GIS. The methodology is composed of three phases. First, weights are computed to express the relative importance of the factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of the landslide susceptibility are analyzed as a function of the weights, using Monte Carlo simulation and global sensitivity analysis. Finally, the results are validated using a landslide inventory database and by applying DST. Comparison of the landslide susceptibility maps obtained with both MCDA techniques against known landslides shows that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights.
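
    The Monte Carlo weight-sensitivity phase described above can be sketched by perturbing the criteria weights, for example with a Dirichlet distribution centred on the baseline weights, and recording how stable the susceptibility ranking is. The weights and scores below are invented, and the Dirichlet perturbation is one possible choice, not necessarily the authors':

```python
import numpy as np

rng = np.random.default_rng(6)
# Baseline criteria weights (illustrative) and scores of 4 map cells on
# 3 landslide-susceptibility criteria (slope, lithology, land use).
w0 = np.array([0.5, 0.3, 0.2])
scores = np.array([[0.9, 0.4, 0.7],
                   [0.6, 0.8, 0.5],
                   [0.3, 0.9, 0.8],
                   [0.7, 0.2, 0.4]])

base_rank = np.argsort(-(scores @ w0))     # cells ordered by suitability

# Monte Carlo sensitivity: perturb weights with a Dirichlet centred on
# w0 and record how often each rank position keeps its baseline cell.
samples = rng.dirichlet(w0 * 100, size=5000)        # concentration ~ w0
ranks = np.argsort(-(scores @ samples.T), axis=0)   # per-sample ordering
stability = (ranks == base_rank[:, None]).mean(axis=1)
for pos, cell in enumerate(base_rank):
    print(f"rank {pos + 1}: cell {cell}, stability {stability[pos]:.1%}")
```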

  4. Design optimization and uncertainty analysis of SMA morphing structures

    International Nuclear Information System (INIS)

    Oehler, S D; Hartl, D J; Lopez, R; Malak, R J; Lagoudas, D C

    2012-01-01

    The continuing implementation of shape memory alloys (SMAs) as lightweight solid-state actuators in morphing structures has now motivated research into finding optimized designs for use in aerospace control systems. This work proposes methods that use iterative analysis techniques to determine optimized designs for morphing aerostructures and consider the impact of uncertainty in model variables on the solution. A combination of commercially available and custom coded tools is utilized. ModelCenter, a suite of optimization algorithms and simulation process management tools, is coupled with the Abaqus finite element analysis suite and a custom SMA constitutive model to assess morphing structure designs in an automated fashion. The chosen case study involves determining the optimized configuration of a morphing aerostructure assembly that includes SMA flexures. This is accomplished by altering design inputs representing the placement of active components to minimize a specified cost function. An uncertainty analysis is also conducted using design of experiment methods to determine the sensitivity of the solution to a set of uncertainty variables. This second study demonstrates the effective use of Monte Carlo techniques to simulate the variance of model variables representing the inherent uncertainty in component fabrication processes. This paper outlines the modeling tools used to execute each case study, details the procedures for constructing the optimization problem and uncertainty analysis, and highlights the results from both studies. (paper)

  5. The role of sensitivity analysis in assessing uncertainty

    International Nuclear Information System (INIS)

    Crick, M.J.; Hill, M.D.

    1987-01-01

    Outside the specialist world of those carrying out performance assessments, considerable confusion has arisen about the meanings of sensitivity analysis and uncertainty analysis. In this paper we attempt to reduce this confusion. We then go on to review approaches to sensitivity analysis within the context of assessing uncertainty, and to outline the types of test available to identify sensitive parameters, together with their advantages and disadvantages. The views expressed in this paper are those of the authors; they have not been formally endorsed by the National Radiological Protection Board and should not be interpreted as Board advice

  6. Including uncertainty in hazard analysis through fuzzy measures

    International Nuclear Information System (INIS)

    Bott, T.F.; Eisenhawer, S.W.

    1997-12-01

    This paper presents a method for capturing the uncertainty expressed by a Hazard Analysis (HA) expert team when estimating the frequencies and consequences of accident sequences, and provides a sound mathematical framework for propagating this uncertainty to the risk estimates for these accident sequences. The uncertainty is readily expressed as distributions that can visually aid the analyst in determining the extent and source of risk uncertainty in HA accident sequences. The results can also be expressed as single statistics of the distribution in a manner analogous to expressing a probabilistic distribution as a point-value statistic such as a mean or median. The study discussed here used data collected during the elicitation portion of an HA on a high-level waste transfer process to demonstrate the techniques for capturing uncertainty. These data came from observations of the uncertainty that HA team members expressed in assigning frequencies and consequences to accident sequences during an actual HA. This uncertainty was captured and manipulated using ideas from possibility theory. The result of this study is a practical method for displaying and assessing the uncertainty in the HA team estimates of the frequency and consequences of accident sequences. This uncertainty provides potentially valuable information about accident sequences that typically is lost in the HA process
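
    For orientation, the following minimal Python sketch shows one standard possibility-theoretic device consistent with the approach named in this record: triangular possibility distributions for an elicited frequency and consequence, combined by alpha-cut interval arithmetic into a risk measure. The triangular shapes, the numbers and the frequency-times-consequence combination are illustrative assumptions, not details taken from the paper.

        def tri_cut(low, mode, high, alpha):
            # Interval of values whose possibility is at least alpha for a
            # triangular possibility distribution (low, mode, high).
            return (low + alpha * (mode - low), high - alpha * (high - mode))

        freq = (1e-4, 1e-3, 1e-2)   # events/yr, as elicited (illustrative)
        cons = (0.1, 1.0, 5.0)      # consequence units (illustrative)

        for alpha in (0.0, 0.5, 1.0):
            f_lo, f_hi = tri_cut(*freq, alpha)
            c_lo, c_hi = tri_cut(*cons, alpha)
            # For positive intervals, the product's bounds are the products
            # of the corresponding bounds.
            print(f"alpha={alpha}: risk in [{f_lo * c_lo:.2e}, {f_hi * c_hi:.2e}]")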

  7. Uncertainty Assessments in Fast Neutron Activation Analysis

    International Nuclear Information System (INIS)

    W. D. James; R. Zeisler

    2000-01-01

    Fast neutron activation analysis (FNAA) carried out with the use of small accelerator-based neutron generators is routinely used for major/minor element determinations in industry, mineral and petroleum exploration, and to some extent in research. While the method shares many of the operational procedures, and therefore errors, inherent to conventional thermal neutron activation analysis, its unique implementation gives rise to additional specific concerns that can result in errors or increased uncertainties of measured quantities. The authors were involved in a recent effort to evaluate irreversible incorporation of oxygen into a standard reference material (SRM) by direct measurement of oxygen by FNAA. That project required determination of oxygen in bottles of the SRM stored in varying environmental conditions and a comparison of the results. We recognized the need to describe the total uncertainty of the measurements accurately in order to characterize any differences in the resulting average concentrations. It is our intent here to discuss the breadth of parameters that can potentially contribute to the random and nonrandom errors of the method and to provide estimates of the magnitude of uncertainty introduced. In addition, we discuss the steps taken in this recent FNAA project to control quality, assess the uncertainty of the measurements, and evaluate results based on statistical reproducibility

  8. Statistically based uncertainty assessments in nuclear risk analysis

    International Nuclear Information System (INIS)

    Spencer, F.W.; Diegert, K.V.; Easterling, R.G.

    1987-01-01

    Over the last decade, the problems of estimation and uncertainty assessment in probabilistic risk assessments (PRAs) have been addressed in a variety of NRC and industry-sponsored projects. These problems have received attention because of a recognition that major uncertainties in risk estimation exist, which can be reduced by collecting more and better data and other information, and because of a recognition that better methods for assessing these uncertainties are needed. In particular, a clear understanding of the nature and magnitude of various sources of uncertainty is needed to facilitate decision-making on possible plant changes and research options. Recent PRAs have employed methods of probability propagation, sometimes involving the use of Bayes' Theorem, intended to formalize the use of "engineering judgment" or "expert opinion." All sources, or feelings, of uncertainty are expressed probabilistically, so that uncertainty analysis becomes simply a matter of probability propagation. Alternatives to forcing a probabilistic framework at all stages of a PRA are, however, a major concern in this paper

  9. DS02 uncertainty analysis

    International Nuclear Information System (INIS)

    Kaul, Dean C.; Egbert, Stephen D.; Woolson, William A.

    2005-01-01

    In order to avoid the pitfalls that so discredited DS86 and its uncertainty estimates, and to provide DS02 uncertainties that are both defensible and credible, this report not only presents the ensemble uncertainties assembled from uncertainties in individual computational elements and radiation dose components but also describes how these relate to comparisons between observed and computed quantities at critical intervals in the computational process. These comparisons include those between observed and calculated radiation free-field components, where observations include thermal- and fast-neutron activation and gamma-ray thermoluminescence, which are relevant to the estimated systematic uncertainty for DS02. The comparisons also include those between calculated and observed survivor shielding, where the observations consist of biodosimetric measurements for individual survivors, which are relevant to the estimated random uncertainty for DS02. (J.P.N.)

  10. Sensitivity and uncertainty analysis of the PATHWAY radionuclide transport model

    International Nuclear Information System (INIS)

    Otis, M.D.

    1983-01-01

    Procedures were developed for the uncertainty and sensitivity analysis of a dynamic model of radionuclide transport through human food chains. Uncertainty in model predictions was estimated by propagation of parameter uncertainties using a Monte Carlo simulation technique. Sensitivity of model predictions to individual parameters was investigated using the partial correlation coefficient of each parameter with model output. Random values produced for the uncertainty analysis were used in the correlation analysis for sensitivity. These procedures were applied to the PATHWAY model which predicts concentrations of radionuclides in foods grown in Nevada and Utah and exposed to fallout during the period of atmospheric nuclear weapons testing in Nevada. Concentrations and time-integrated concentrations of iodine-131, cesium-136, and cesium-137 in milk and other foods were investigated. 9 figs., 13 tabs
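
    Because the sensitivity measure here is the partial correlation coefficient computed from the same Monte Carlo sample, a minimal Python sketch may help; the three-parameter toy response and the lognormal inputs below are illustrative stand-ins, not the PATHWAY model.

        import numpy as np

        rng = np.random.default_rng(1)

        # Toy stand-in for a food-chain model output driven by three
        # uncertain parameters; the functional form is invented.
        X = rng.lognormal(mean=0.0, sigma=0.5, size=(5000, 3))
        y = 2.0 * X[:, 0] * X[:, 1] + 0.1 * X[:, 2]

        def partial_corr(X, y, j):
            # Correlate what is left of X[:, j] and y after regressing out
            # the other parameters: the partial correlation coefficient.
            others = np.delete(X, j, axis=1)
            A = np.column_stack([others, np.ones(len(y))])
            rx = X[:, j] - A @ np.linalg.lstsq(A, X[:, j], rcond=None)[0]
            ry = y - A @ np.linalg.lstsq(A, y, rcond=None)[0]
            return np.corrcoef(rx, ry)[0, 1]

        print([round(partial_corr(X, y, j), 2) for j in range(3)])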

  11. The characterisation and evaluation of uncertainty in probabilistic risk analysis

    International Nuclear Information System (INIS)

    Parry, G.W.; Winter, P.W.

    1980-10-01

    The sources of uncertainty in probabilistic risk analysis are discussed using the event/fault tree methodology as an example. The role of statistics in quantifying these uncertainties is investigated. A class of uncertainties is identified which is, at present, unquantifiable, using either classical or Bayesian statistics. It is argued that Bayesian statistics is the more appropriate vehicle for the probabilistic analysis of rare events and a short review is given with some discussion on the representation of ignorance. (author)

  12. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    Science.gov (United States)

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. Analysis of the most important sources of variability of quantitative microbiological methods demonstrated no effect of culture media and plate-count techniques on the estimation of microbial count, while a highly significant effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpreting errors) was established. The most appropriate method of statistical analysis of such data was ANOVA, which enabled not only the effects of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, which is appropriate for traditional plate count methods. Copyright © 2015 Elsevier B.V. All rights reserved.

  13. Uncertainty analysis in WWTP model applications: a critical discussion using an example from design

    DEFF Research Database (Denmark)

    Sin, Gürkan; Gernaey, Krist; Neumann, Marc B.

    2009-01-01

    This study focuses on uncertainty analysis of WWTP models and analyzes the issue of framing and how it affects the interpretation of uncertainty analysis results. As a case study, the prediction of uncertainty involved in model-based design of a wastewater treatment plant is studied. The Monte Carlo analysis is framed in three ways: (1) uncertainty due to stoichiometric, biokinetic and influent parameters; (2) uncertainty due to hydraulic behaviour of the plant and mass transfer parameters; (3) uncertainty due to the combination of (1) and (2). The results demonstrate that, depending on the way the uncertainty analysis is framed, the estimated uncertainty of design performance criteria differs significantly. The implication for the practical applications of uncertainty analysis in the wastewater industry is profound: (i) as the uncertainty analysis results are specific to the framing used, the results must be interpreted within the context of that framing...

  14. Principles and applications of measurement and uncertainty analysis in research and calibration

    Energy Technology Data Exchange (ETDEWEB)

    Wells, C.V.

    1992-11-01

    Interest in Measurement Uncertainty Analysis has grown in the past several years as it has spread to new fields of application, and research and development of uncertainty methodologies have continued. This paper discusses the subject from the perspectives of both research and calibration environments. It presents a history of the development and an overview of the principles of uncertainty analysis embodied in the United States National Standard, ANSI/ASME PTC 19.1-1985, Measurement Uncertainty. Examples are presented in which uncertainty analysis was utilized or is needed to gain further knowledge of a particular measurement process and to characterize final results. Measurement uncertainty analysis provides a quantitative estimate of the interval about a measured value or an experiment result within which the true value of that quantity is expected to lie. Years ago, Harry Ku of the United States National Bureau of Standards stated that "The informational content of the statement of uncertainty determines, to a large extent, the worth of the calibrated value." Today, that statement is just as true about calibration or research results as it was in 1968. Why is that true? What kind of information should we include in a statement of uncertainty accompanying a calibrated value? How and where do we get the information to include in an uncertainty statement? How should we interpret and use measurement uncertainty information? This discussion will provide answers to these and other questions about uncertainty in research and in calibration. The methodology to be described has been developed by national and international groups over the past nearly thirty years, and individuals were publishing information even earlier. Yet the work is largely unknown in many science and engineering arenas. I will illustrate various aspects of uncertainty analysis with some examples drawn from the radiometry measurement and calibration discipline, from research activities.
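
    As a pointer to what this methodology computes, here is a minimal Python sketch in the spirit of the additive and root-sum-square combinations associated with ANSI/ASME PTC 19.1; the readings, the bias limit B and the tabulated t value are made up for the example, and the standard itself remains the authoritative reference.

        import math, statistics

        readings = [10.02, 10.05, 9.98, 10.03, 10.01, 9.99, 10.04, 10.00]
        B = 0.04      # systematic (bias) limit, same units as the readings
        t95 = 2.365   # Student t for n - 1 = 7 dof at 95% (table value)

        # Random component: standard deviation of the mean.
        S = statistics.stdev(readings) / math.sqrt(len(readings))

        U_add = B + t95 * S                        # additive combination
        U_rss = math.sqrt(B**2 + (t95 * S)**2)     # root-sum-square combination
        print(f"mean={statistics.mean(readings):.3f}  "
              f"U_ADD={U_add:.3f}  U_RSS={U_rss:.3f}")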

  15. Error Analysis of CM Data Products: Sources of Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Hunt, Brian D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Eckert-Gallup, Aubrey Celia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Cochran, Lainy Dromgoole [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kraus, Terrence D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Allen, Mark B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Beal, Bill [National Security Technologies, Joint Base Andrews, MD (United States); Okada, Colin [National Security Technologies, LLC. (NSTec), Las Vegas, NV (United States); Simpson, Mathew [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-02-01

    The goal of this project is to address the current inability to assess the overall error and uncertainty of data products developed and distributed by DOE's Consequence Management (CM) Program. This is a widely recognized shortfall, the resolution of which would provide a great deal of value and defensibility to the analysis results, data products, and the decision-making process that follows this work. A global approach to this problem is necessary because multiple sources of error and uncertainty contribute to the ultimate production of CM data products. Therefore, this project will require collaboration with subject matter experts across a wide range of FRMAC skill sets in order to quantify the types of uncertainty that each area of the CM process might contain and to understand how variations in these uncertainty sources contribute to the aggregated uncertainty present in CM data products. The ultimate goal of this project is to quantify the confidence level of CM products to ensure that appropriate public and worker protection decisions are supported by defensible analysis.

  16. Uncertainty analysis of hydrological modeling in a tropical area using different algorithms

    Science.gov (United States)

    Rafiei Emam, Ammar; Kappas, Martin; Fassnacht, Steven; Linh, Nguyen Hoang Khanh

    2018-01-01

    Hydrological modeling outputs are subject to uncertainty resulting from different sources of error (e.g., errors in input data, model structure, and model parameters), making quantification of uncertainty in hydrological modeling imperative and meant to improve the reliability of modeling results. Uncertainty analysis must overcome difficulties in the calibration of hydrological models, which further increase in areas with data scarcity. The purpose of this study is to apply four uncertainty analysis algorithms to a semi-distributed hydrological model, quantifying different sources of uncertainty (especially parameter uncertainty) and evaluating their performance. In this study, the Soil and Water Assessment Tool (SWAT) eco-hydrological model was implemented for a watershed in the center of Vietnam. The sensitivity of parameters was analyzed, and the model was calibrated. The uncertainty analysis for the hydrological model was conducted based on four algorithms: Generalized Likelihood Uncertainty Estimation (GLUE), Sequential Uncertainty Fitting (SUFI), the Parameter Solution method (ParaSol) and Particle Swarm Optimization (PSO). The performance of the algorithms was compared using the P-factor and R-factor, the coefficient of determination (R²), the Nash-Sutcliffe coefficient of efficiency (NSE) and Percent Bias (PBIAS). The results showed the high performance of SUFI and PSO, with P-factor > 0.83, R-factor [...] 0.91, NSE > 0.89, and 0.18 [...] analysis. Indeed, uncertainty analysis must be accounted for when the outcomes of the model are used for policy or management decisions.
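
    Since the band statistics used for the comparison are compact to compute, a minimal Python sketch follows, with synthetic observations and an illustrative ensemble standing in for SWAT output: the P-factor is the fraction of observations inside the 95% prediction band, and the R-factor is the band's mean width relative to the standard deviation of the observations.

        import numpy as np

        rng = np.random.default_rng(2)

        # Illustrative stand-ins for observed flows and an ensemble of
        # behavioural simulations (as GLUE or SUFI-2 would produce).
        obs = rng.gamma(2.0, 10.0, size=200)
        sims = obs * rng.normal(1.0, 0.2, size=(500, 200))

        lower, upper = np.percentile(sims, [2.5, 97.5], axis=0)  # 95PPU band
        p_factor = np.mean((obs >= lower) & (obs <= upper))
        r_factor = np.mean(upper - lower) / np.std(obs)
        print(f"P-factor={p_factor:.2f}  R-factor={r_factor:.2f}")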

  17. Fuzzy probability based fault tree analysis to propagate and quantify epistemic uncertainty

    International Nuclear Information System (INIS)

    Purba, Julwan Hendry; Sony Tjahyani, D.T.; Ekariansyah, Andi Sofrany; Tjahjono, Hendro

    2015-01-01

    Highlights: • Fuzzy probability based fault tree analysis is proposed to evaluate epistemic uncertainty in fuzzy fault tree analysis. • Fuzzy probabilities represent likelihood occurrences of all events in a fault tree. • A fuzzy multiplication rule quantifies the epistemic uncertainty of minimal cut sets. • A fuzzy complement rule estimates the epistemic uncertainty of the top event. • The proposed FPFTA has successfully evaluated the U.S. Combustion Engineering RPS. - Abstract: A number of fuzzy fault tree analysis approaches, which integrate fuzzy concepts into the quantitative phase of conventional fault tree analysis, have been proposed to study the reliability of engineering systems. Those new approaches apply expert judgments to overcome the limitation of conventional fault tree analysis when basic events do not have probability distributions. Since expert judgments might come with epistemic uncertainty, it is important to quantify the overall uncertainties of the fuzzy fault tree analysis. Monte Carlo simulation is commonly used to quantify the overall uncertainties of conventional fault tree analysis. However, since Monte Carlo simulation is based on probability distributions, this technique is not appropriate for fuzzy fault tree analysis, which is based on fuzzy probabilities. The objective of this study is to develop a fuzzy probability based fault tree analysis to overcome this limitation of fuzzy fault tree analysis. To demonstrate the applicability of the proposed approach, a case study is performed and its results are then compared to the results analyzed by a conventional fault tree analysis. The results confirm that the proposed fuzzy probability based fault tree analysis is feasible to propagate and quantify epistemic uncertainties in fault tree analysis
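
    The two gate rules named in the highlights reduce, at each alpha-cut, to simple interval arithmetic. The following minimal Python sketch illustrates that reduction for a toy two-cut-set tree; the event probability intervals are invented, not taken from the RPS case study.

        def and_gate(intervals):
            # Minimal cut set: multiply the basic-event intervals
            # (the fuzzy multiplication rule at one alpha-cut).
            lo = hi = 1.0
            for a, b in intervals:
                lo, hi = lo * a, hi * b
            return lo, hi

        def or_gate(intervals):
            # Top event via the complement rule 1 - prod(1 - p), which is
            # monotonically increasing in each p.
            lo = hi = 1.0
            for a, b in intervals:
                lo, hi = lo * (1.0 - a), hi * (1.0 - b)
            return 1.0 - lo, 1.0 - hi

        cut1 = and_gate([(1e-3, 2e-3), (5e-2, 8e-2)])
        cut2 = and_gate([(2e-4, 6e-4)])
        print("top event interval:", or_gate([cut1, cut2]))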

  18. New challenges on uncertainty propagation assessment of flood risk analysis

    Science.gov (United States)

    Martins, Luciano; Aroca-Jiménez, Estefanía; Bodoque, José M.; Díez-Herrero, Andrés

    2016-04-01

    Natural hazards, such as floods, cause considerable damage to human life and to material and functional assets every year around the world. Risk assessment procedures have an associated set of uncertainties, mainly of two types: natural, derived from the stochastic character inherent in the flood process dynamics; and epistemic, associated with lack of knowledge or with deficiencies in the procedures employed in the study of these processes. There is abundant scientific and technical literature on the estimation of uncertainties in each step of flood risk analysis (e.g. rainfall estimates, hydraulic modelling variables), but very little experience with the propagation of uncertainties along the flood risk assessment. Epistemic uncertainties are therefore the main goal of this work: in particular, understanding the extent of the propagation of uncertainties throughout the process, starting with inundation studies and continuing through risk analysis, and how much a proper analysis of flood risk can vary as a result. Methodologies such as Polynomial Chaos Theory (PCT), the Method of Moments or Monte Carlo are used to evaluate different sources of error, such as data records (precipitation gauges, flow gauges...), hydrologic and hydraulic modelling (inundation estimation) and socio-demographic data (damage estimation), in order to evaluate the uncertainty propagation (UP) considered in design flood risk estimation, both in numerical and in cartographic expression. In order to consider the total uncertainty and to understand which factors contribute most to the final uncertainty, we used the method of Polynomial Chaos Theory (PCT). It represents an interesting way to handle the inclusion of uncertainty in the modelling and simulation process. PCT allows for the development of a probabilistic model of the system in a deterministic setting. This is done by using random variables and polynomials to handle the effects of uncertainty. Results of applying the method are more robust than those of traditional analysis
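
    To give a flavour of what a polynomial chaos representation does, the following minimal Python sketch expands a toy response y = exp(X) of a standard normal input in probabilists' Hermite polynomials and reads the mean and variance off the coefficients; the response, truncation order and quadrature size are illustrative choices, not the flood-risk models of this record.

        import math
        import numpy as np

        f = np.exp
        order = 6
        nodes, wts = np.polynomial.hermite_e.hermegauss(40)
        wts = wts / wts.sum()   # normalise to a probability measure

        # Projection: c_k = E[f(X) He_k(X)] / k!
        coef = []
        for k in range(order + 1):
            He_k = np.polynomial.hermite_e.HermiteE([0] * k + [1])
            coef.append(np.sum(wts * f(nodes) * He_k(nodes)) / math.factorial(k))

        mean = coef[0]
        var = sum(c**2 * math.factorial(k) for k, c in enumerate(coef) if k > 0)
        print(f"PC mean={mean:.4f} (exact {math.e**0.5:.4f})  PC var={var:.4f}")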

  19. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, main report

    International Nuclear Information System (INIS)

    Harper, F.T.; Young, M.L.; Miller, L.A.; Hora, S.C.; Lui, C.H.; Goossens, L.H.J.; Cooke, R.M.; Paesler-Sauer, J.; Helton, J.C.

    1995-01-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The ultimate objective of the joint effort was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. Experts developed their distributions independently. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model. The resulting distributions closely replicated the aggregated elicited wet deposition distributions. To validate the distributions generated for the dispersion code input variables, samples were taken from the distributions and propagated through the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the first of a three-volume document describing the project

  1. Uncertainty analysis for results of thermal hydraulic codes of best-estimate-type

    International Nuclear Information System (INIS)

    Alva N, J.

    2010-01-01

    In this thesis, some fundamental knowledge is presented about uncertainty analysis and about the diverse methodologies applied in the analysis of nuclear power plant transient events, particularly those related to thermal hydraulics phenomena. These concepts and methodologies come from a wide bibliographical survey of the nuclear power literature. Methodologies for uncertainty analysis have been developed by quite diverse institutions, and they have been widely used worldwide for application to results from best-estimate-type computer codes in nuclear reactor thermal hydraulics and safety analysis. Also, the main uncertainty sources, types of uncertainties, and aspects related to best-estimate modeling and methods are introduced. Once the main bases of uncertainty analysis have been set, and some of the known methodologies have been introduced, the CSAU methodology, which is applied in the analyses, is presented in detail. The main objective of this thesis is to compare the results of an uncertainty and sensitivity analysis using the Response Surface Technique with those obtained by applying Wilks' formula, through a loss-of-coolant experiment and a rise event in a BWR. Both techniques are options in the uncertainty and sensitivity analysis part of the CSAU methodology, which was developed for the analysis of transients and accidents at nuclear power plants, and which is the basis of most of the methodologies used in the licensing of nuclear power plants practically everywhere. Finally, the results of applying both techniques are compared and discussed. (Author)
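
    For readers unfamiliar with the second of the two techniques compared in this thesis, a minimal Python sketch of Wilks' first-order, one-sided formula follows; the 95%/95% coverage/confidence pair is the conventional licensing choice and is used here purely as an illustration.

        import math

        def wilks_n(coverage=0.95, confidence=0.95):
            # Smallest n such that the largest of n code runs bounds the
            # 'coverage' quantile with the given one-sided confidence:
            # 1 - coverage**n >= confidence.
            return math.ceil(math.log(1.0 - confidence) / math.log(coverage))

        print(wilks_n())             # 59: the classic 95%/95% result
        print(wilks_n(0.95, 0.99))   # higher confidence needs more runs (90)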

  2. Planning for robust reserve networks using uncertainty analysis

    Science.gov (United States)

    Moilanen, A.; Runge, M.C.; Elith, Jane; Tyre, A.; Carmel, Y.; Fegraus, E.; Wintle, B.A.; Burgman, M.; Ben-Haim, Y.

    2006-01-01

    Planning land-use for biodiversity conservation frequently involves computer-assisted reserve selection algorithms. Typically such algorithms operate on matrices of species presence–absence in sites, or on species-specific distributions of model-predicted probabilities of occurrence in grid cells. There are practically always errors in input data: erroneous species presence–absence data, structural and parametric uncertainty in predictive habitat models, and lack of correspondence between temporal presence and long-run persistence. Despite these uncertainties, typical reserve selection methods proceed as if there is no uncertainty in the data or models. Having two conservation options of apparently equal biological value, one would prefer the option whose value is relatively insensitive to errors in planning inputs. In this work we show how uncertainty analysis for reserve planning can be implemented within a framework of information-gap decision theory, generating reserve designs that are robust to uncertainty. Consideration of uncertainty involves modifications to the typical objective functions used in reserve selection. Search for robust-optimal reserve structures can still be implemented via typical reserve selection optimization techniques, including stepwise heuristics, integer-programming and stochastic global search.

  3. Uncertainty Estimation of Neutron Activation Analysis in Zinc Elemental Determination in Food Samples

    International Nuclear Information System (INIS)

    Endah Damastuti; Muhayatun; Diah Dwiana L

    2009-01-01

    Besides fulfilling the requirements of the international standard ISO/IEC 17025:2005, uncertainty estimation should be done to increase the quality and confidence of analysis results and also to establish traceability of the analysis results to SI units. Neutron activation analysis is a major technique used by the radiometry analysis laboratory and is included in the scope of accreditation under ISO/IEC 17025:2005; therefore, uncertainty estimation of neutron activation analysis needs to be carried out. Sample and standard preparation, as well as irradiation and measurement using gamma spectrometry, were the main activities which could contribute to uncertainty. The components of the uncertainty sources are specifically explained. The expanded uncertainty was 4.0 mg/kg with a level of confidence of 95% (coverage factor = 2), and the Zn concentration was 25.1 mg/kg. The counting statistics of the sample and standard were the major contributors to the combined uncertainty. The uncertainty estimation is expected to increase the quality of the analysis results and could be applied further to other kinds of samples. (author)

  4. Erha Uncertainty Analysis: Planning for the future

    International Nuclear Information System (INIS)

    Brami, T.R.; Hopkins, D.F.; Loguer, W.L.; Cornagia, D.M.; Braisted, A.W.C.

    2002-01-01

    The Erha field (OPL 209) was discovered in 1999 approximately 100 km off the coast of Nigeria in 1,100 m of water. The discovery well (Erha-1) encountered oil and gas in deep-water clastic reservoirs. The first appraisal well (Erha-2), drilled 1.6 km downdip to the northwest, penetrated an oil-water contact and confirmed a potentially commercial discovery. However, the Erha-3 and Erha-3 ST-1 boreholes, drilled on the faulted east side of the field in 2001, encountered shallower fluid contacts. As a result of these findings, a comprehensive field-wide uncertainty analysis was performed to better understand what we know versus what we think regarding resource size and economic viability. The uncertainty analysis process applied at Erha is an integrated, scenario-based probabilistic approach to modeling resources and reserves. Its goal is to provide quantitative results for a variety of scenarios, thus allowing identification of and focus on critical controls (the variables that are likely to impose the greatest influence). The initial focus at Erha was to incorporate the observed fluid contacts and to develop potential scenarios that included the range of possibilities in unpenetrated portions of the field. Four potential compartmentalization scenarios were hypothesized. The uncertainty model combines these scenarios with reservoir parameters and their plausible ranges. Input data come from multiple sources including: wells, 3D seismic, reservoir flow simulation, geochemistry, fault-seal analysis, sequence stratigraphic analysis, and analogs. Once created, the model is sampled using Monte Carlo techniques to create probability density functions for a variety of variables, including oil in place and recoverable reserves. Results of the uncertainty analysis support that, despite a thinner oil column on the faulted east side of the field, Erha is an economically attractive opportunity. Further, the results have been used to develop data acquisition plans and mitigation strategies.

  5. Methodologies for uncertainty analysis in the level 2 PSA and their implementation procedures

    International Nuclear Information System (INIS)

    Ahn, Kwang Il; Yang, Joon Eun; Kim, Dong Ha

    2002-04-01

    The main purpose of this report is to present standardized methodologies for uncertainty analysis in the Level 2 Probabilistic Safety Assessment (PSA) and their implementation procedures, based on results obtained through a critical review of the existing methodologies for the analysis of uncertainties employed in the Level 2 PSA, especially the Accident Progression Event Tree (APET). The uncertainties employed in the Level 2 PSA are quantitative expressions of the overall knowledge of the analysts and experts participating in the probabilistic quantification process of phenomenological accident progressions ranging from core melt to containment failure; their numerical values are directly related to the degree of confidence that the analyst has that a given phenomenological event or accident process will or will not occur, i.e., the analyst's subjective probabilities of occurrence. The results obtained from the Level 2 PSA uncertainty analysis become an essential contributor to the plant risk, in addition to the Level 1 PSA and Level 3 PSA uncertainties. The uncertainty analysis methodologies and their implementation procedures presented in this report were prepared based on the following criterion: 'the uncertainty quantification process must be logical, scrutable, complete, consistent and in an appropriate level of detail, as mandated by the Level 2 PSA objectives'. For the aforementioned purpose, this report deals mainly with (1) a summary of general and Level 2 PSA specific uncertainty analysis methodologies, (2) the selection of phenomenological branch events for uncertainty analysis in the APET, the methodology for quantification of APET uncertainty inputs and its implementation procedure, (3) the statistical propagation of uncertainty inputs through the APET and its implementation procedure, and (4) a formal procedure for the quantification of APET uncertainties and source term categories (STCs) through the Level 2 PSA quantification codes

  6. Aspects of uncertainty analysis in accident consequence modeling

    International Nuclear Information System (INIS)

    Travis, C.C.; Hoffman, F.O.

    1981-01-01

    Mathematical models are frequently used to determine probable dose to man from an accidental release of radionuclides by a nuclear facility. With increased emphasis on the accuracy of these models, the incorporation of uncertainty analysis has become one of the most crucial and sensitive components in evaluating the significance of model predictions. In the present paper, we address three aspects of uncertainty in models used to assess the radiological impact to humans: uncertainties resulting from the natural variability in human biological parameters; the propagation of parameter variability by mathematical models; and comparison of model predictions to observational data

  7. Determination of uncertainties in energy and exergy analysis of a power plant

    International Nuclear Information System (INIS)

    Ege, Ahmet; Şahin, Hacı Mehmet

    2014-01-01

    Highlights: • Energy and exergy efficiency uncertainties in a large thermal power plant are examined. • Sensitivity analysis shows the importance of basic measurements for efficiency analysis. • A quick and practical approach is provided for determining efficiency uncertainties. • Extreme case analysis characterizes the maximum possible boundaries of uncertainties. • Uncertainty determination in a plant is a dynamic process with real data. - Abstract: In this study, the energy and exergy efficiency uncertainties of a large-scale lignite-fired power plant cycle and the sensitivities of various measurement parameters were investigated for five different design power outputs (100%, 85%, 80%, 60% and 40%) with real data of the plant. For that purpose a black box method was employed, considering coal flow with Lower Heating Value (LHV) as a single input and electricity produced as a single output of the plant. The uncertainty of the energy and exergy efficiency of the plant was evaluated with this method by applying sensitivity analysis depending on the effect of measurement parameters such as LHV, coal mass flow rate, and cell generator output voltage/current. In addition, an extreme case analysis was investigated to determine the maximum range of the uncertainties. Results of the black box method showed that uncertainties varied between 1.82% and 1.98% for the energy efficiency and between 1.32% and 1.43% for the exergy efficiency of the plant at operating power levels of 40-100% of full power. It was concluded that LHV determination was the most important uncertainty source for the energy and exergy efficiency of the plant. The uncertainties of the extreme case analysis were determined to be between 2.30% and 2.36% for energy efficiency and between 1.66% and 1.70% for exergy efficiency at 40-100% power output, respectively. The proposed method was shown to be a practical approach for understanding the major uncertainties as well as the effects of some measurement parameters in a large-scale thermal power plant
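
    The black-box idea reduces to propagating a few relative uncertainties through eta = P / (m_dot * LHV). A minimal Python sketch with invented numbers (not the plant's actual data) shows why the LHV term tends to dominate when its relative uncertainty is the largest.

        import math

        # First-order propagation for eta = P / (m_dot * LHV), assuming
        # independent inputs; all values are illustrative.
        P, u_P = 330.0e6, 0.005       # net output [W], 0.5% relative
        m_dot, u_m = 95.0, 0.010      # coal mass flow [kg/s], 1.0% relative
        LHV, u_LHV = 9.0e6, 0.015     # lower heating value [J/kg], 1.5% relative

        eta = P / (m_dot * LHV)
        u_eta = math.sqrt(u_P**2 + u_m**2 + u_LHV**2)   # relative, quadrature
        print(f"eta = {eta:.3f} +/- {u_eta * eta:.4f} "
              f"({100 * u_eta:.2f}% relative)")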

  8. LOFT differential pressure uncertainty analysis

    International Nuclear Information System (INIS)

    Evans, R.P.; Biladeau, G.L.; Quinn, P.A.

    1977-03-01

    A performance analysis of the LOFT differential pressure (ΔP) measurement is presented. Along with completed descriptions of test programs and theoretical studies that have been conducted on the ΔP, specific sources of measurement uncertainty are identified, quantified, and combined to provide an assessment of the ability of this measurement to satisfy the SDD 1.4.1C (June 1975) requirement of measurement of differential pressure

  9. Sensitivity and uncertainty analysis applied to a repository in rock salt

    International Nuclear Information System (INIS)

    Polle, A.N.

    1996-12-01

    This document describes the sensitivity and uncertainty analysis with UNCSAM, as applied to a repository in rock salt for the EVEREST project. UNCSAM is a dedicated software package for sensitivity and uncertainty analysis, which was already used within the preceding PROSA project. The use of UNCSAM provides a flexible interface to EMOS ECN by substituting the sampled values in the various input files to be used by EMOS ECN; the model calculations for this repository were performed with the EMOS ECN code. Preceding the sensitivity and uncertainty analysis, a number of preparations have been carried out to facilitate EMOS ECN with the probabilistic input data. For post-processing the EMOS ECN results, the characteristic output signals were processed. For the sensitivity and uncertainty analysis with UNCSAM, the stochastic input, i.e. sampled values, and the output for the various EMOS ECN runs have been analyzed. (orig.)

  10. An introductory guide to uncertainty analysis in environmental and health risk assessment

    International Nuclear Information System (INIS)

    Hoffman, F.O.; Hammonds, J.S.

    1992-10-01

    To compensate for the potential for overly conservative estimates of risk using standard US Environmental Protection Agency methods, an uncertainty analysis should be performed as an integral part of each risk assessment. Uncertainty analyses allow one to obtain quantitative results in the form of confidence intervals that will aid in decision making and will provide guidance for the acquisition of additional data. To perform an uncertainty analysis, one must frequently rely on subjective judgment in the absence of data to estimate the range and a probability distribution describing the extent of uncertainty about a true but unknown value for each parameter of interest. This information is formulated from professional judgment based on an extensive review of literature, analysis of the data, and interviews with experts. Various analytical and numerical techniques are available to allow statistical propagation of the uncertainty in the model parameters to a statement of uncertainty in the risk to a potentially exposed individual. Although analytical methods may be straightforward for relatively simple models, they rapidly become complicated for more involved risk assessments. Because of the tedious efforts required to mathematically derive analytical approaches to propagate uncertainty in complicated risk assessments, numerical methods such as Monte Carlo simulation should be employed. The primary objective of this report is to provide an introductory guide for performing uncertainty analysis in risk assessments being performed for Superfund sites

  11. Uncertainty analysis of time-dependent nonlinear systems: theory and application to transient thermal hydraulics

    International Nuclear Information System (INIS)

    Barhen, J.; Bjerke, M.A.; Cacuci, D.G.; Mullins, C.B.; Wagschal, G.G.

    1982-01-01

    An advanced methodology for performing systematic uncertainty analysis of time-dependent nonlinear systems is presented. This methodology includes a capability for reducing uncertainties in system parameters and responses by using Bayesian inference techniques to consistently combine prior knowledge with additional experimental information. The determination of best estimates for the system parameters, for the responses, and for their respective covariances is treated as a time-dependent constrained minimization problem. Three alternative formalisms for solving this problem are developed. The two "off-line" formalisms, with and without "foresight" characteristics, require the generation of a complete sensitivity data base prior to performing the uncertainty analysis. The "online" formalism, in which uncertainty analysis is performed interactively with the system analysis code, is best suited for treatment of large-scale highly nonlinear time-dependent problems. This methodology is applied to the uncertainty analysis of a transient upflow of a high pressure water heat transfer experiment. For comparison, an uncertainty analysis using sensitivities computed by standard response surface techniques is also performed. The results of the analysis indicate the following. Major reduction of the discrepancies in the calculation/experiment ratios is achieved by using the new methodology. Incorporation of in-bundle measurements in the uncertainty analysis significantly reduces system uncertainties. Accuracy of sensitivities generated by response-surface techniques should be carefully assessed prior to using them as a basis for uncertainty analyses of transient reactor safety problems

  12. Uncertainty Analysis of the NASA Glenn 8x6 Supersonic Wind Tunnel

    Science.gov (United States)

    Stephens, Julia; Hubbard, Erin; Walter, Joel; McElroy, Tyler

    2016-01-01

    This paper presents methods and results of a detailed measurement uncertainty analysis that was performed for the 8- by 6-foot Supersonic Wind Tunnel located at the NASA Glenn Research Center. The statistical methods and engineering judgments used to estimate elemental uncertainties are described. The Monte Carlo method of propagating uncertainty was selected to determine the uncertainty of calculated variables of interest. A detailed description of the Monte Carlo method as applied for this analysis is provided. Detailed uncertainty results for the uncertainty in average free stream Mach number as well as other variables of interest are provided. All results are presented as random (variation in observed values about a true value), systematic (potential offset between observed and true value), and total (random and systematic combined) uncertainty. The largest sources contributing to uncertainty are determined and potential improvement opportunities for the facility are investigated.
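
    As an illustration of the propagation step described here (not the facility's actual instrumentation model), a minimal Python sketch pushes assumed systematic and random pressure errors through the isentropic Mach relation by Monte Carlo; the pressures and sigmas are invented for the example.

        import math, random

        GAMMA = 1.4
        p0, p = 101325.0, 19400.0      # total / static pressure [Pa]
        sys_sd, rand_sd = 150.0, 80.0  # systematic / random std devs [Pa]

        def mach(p0, p):
            # Isentropic relation between pressure ratio and Mach number.
            return math.sqrt(2.0 / (GAMMA - 1.0)
                             * ((p0 / p) ** ((GAMMA - 1.0) / GAMMA) - 1.0))

        samples = []
        for _ in range(20000):
            bias = random.gauss(0.0, sys_sd)   # one shared offset per realisation
            samples.append(mach(p0 + bias + random.gauss(0.0, rand_sd),
                                p + bias + random.gauss(0.0, rand_sd)))
        m = sum(samples) / len(samples)
        sd = math.sqrt(sum((x - m) ** 2 for x in samples) / (len(samples) - 1))
        print(f"Mach = {m:.3f} +/- {sd:.3f} (total, k=1)")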

  13. Selected examples of practical approaches for the assessment of model reliability - parameter uncertainty analysis

    International Nuclear Information System (INIS)

    Hofer, E.; Hoffman, F.O.

    1987-02-01

    The uncertainty analysis of model predictions has to discriminate between two fundamentally different types of uncertainty. The presence of stochastic variability (Type 1 uncertainty) necessitates the use of a probabilistic model instead of the much simpler deterministic one. Lack of knowledge (Type 2 uncertainty), however, applies to deterministic as well as to probabilistic model predictions and often dominates over uncertainties of Type 1. The term "probability" is interpreted differently in the probabilistic analysis of either type of uncertainty. After these distinctions have been explained, the discussion centers on the propagation of parameter uncertainties through the model, the derivation of quantitative uncertainty statements for model predictions, and the presentation and interpretation of the results of a Type 2 uncertainty analysis. Various alternative approaches are compared for a very simple deterministic model

  14. Approach to uncertainty evaluation for safety analysis

    International Nuclear Information System (INIS)

    Ogura, Katsunori

    2005-01-01

    Nuclear power plant safety used to be verified and confirmed through accident simulations using computer codes, generally because it is very difficult to perform integrated experiments or tests for the verification and validation of plant safety, due to radioactive consequences, cost, and scaling to the actual plant. Traditionally, plant safety had been secured owing to the sufficient safety margin provided by the conservative assumptions and models applied to those simulations. Meanwhile, best-estimate analysis based on realistic assumptions and models, supported by accumulated insights, has recently become possible, reducing the safety margin in the analysis results and increasing the need to evaluate the reliability or uncertainty of the analysis results. This paper introduces an approach to evaluate the uncertainty of accident simulations and their results. (Note: This research was done not at the Japan Nuclear Energy Safety Organization but at the Tokyo Institute of Technology.) (author)

  15. Uncertainty analysis of a low flow model for the Rhine River

    NARCIS (Netherlands)

    Demirel, M.C.; Booij, Martijn J.

    2011-01-01

    It is widely recognized that hydrological models are subject to parameter uncertainty. However, little attention has been paid so far to the uncertainty in parameters of the data-driven models like weights in neural networks. This study aims at applying a structured uncertainty analysis to a

  16. Application of uncertainty analysis in conceptual fusion reactor design

    International Nuclear Information System (INIS)

    Wu, T.; Maynard, C.W.

    1979-01-01

    The theories of sensitivity and uncertainty analysis are described and applied to a new conceptual tokamak fusion reactor design--NUWMAK. The responses investigated in this study include the tritium breeding ratio, first wall Ti dpa and gas productions, nuclear heating in the blanket, energy leakage to the magnet, and the dpa rate in the superconducting magnet aluminum stabilizer. The sensitivities and uncertainties of these responses are calculated. The cost/benefit feature of proposed integral measurements is also studied through the uncertainty reductions of these responses

  17. LOFT uncertainty-analysis methodology

    International Nuclear Information System (INIS)

    Lassahn, G.D.

    1983-01-01

    The methodology used for uncertainty analyses of measurements in the Loss-of-Fluid Test (LOFT) nuclear-reactor-safety research program is described and compared with other methodologies established for performing uncertainty analyses

  18. Uncertainty Analysis of Few Group Cross Sections Based on Generalized Perturbation Theory

    International Nuclear Information System (INIS)

    Han, Tae Young; Lee, Hyun Chul; Noh, Jae Man

    2014-01-01

    In this paper, the methodology of the sensitivity and uncertainty analysis code based on GPT is described and preliminary verification calculations on the PMR200 pin cell problem are carried out. The results are in good agreement with those by TSUNAMI. From this study, it is expected that the MUSAD code based on GPT can produce the uncertainty of the homogenized few-group microscopic cross sections for a core simulator. For sensitivity and uncertainty analyses of general core responses, a two-step method is available: it utilizes the generalized perturbation theory (GPT) for homogenized few-group cross sections in the first step and a stochastic sampling method for general core responses in the second step. The uncertainty analysis procedure based on GPT in the first step needs the generalized adjoint solution from a cell or lattice code. For this, the generalized adjoint solver was integrated into DeCART in our previous work. In this paper, the MUSAD (Modules of Uncertainty and Sensitivity Analysis for DeCART) code based on the classical perturbation theory is expanded with the function of sensitivity and uncertainty analysis for few-group cross sections based on GPT. First, the uncertainty analysis method based on GPT is described and, in the next section, the preliminary results of the verification calculation on a VHTR pin cell problem are compared with the results by TSUNAMI of SCALE 6.1
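
    At the core of any GPT-based uncertainty analysis of this kind sits the "sandwich rule", var(R) = S C S^T, with S the relative sensitivities of a response R to the cross sections and C their relative covariance matrix. A minimal Python sketch with invented numbers (not MUSAD or TSUNAMI data) follows.

        import numpy as np

        S = np.array([0.8, -0.3, 0.1])      # relative sensitivities (dR/R)/(dx/x)
        C = np.array([[4.0, 1.0, 0.0],      # relative covariance matrix, %^2
                      [1.0, 9.0, -0.5],
                      [0.0, -0.5, 1.0]])

        var_R = S @ C @ S                   # response variance, %^2
        print(f"relative uncertainty of R: {np.sqrt(var_R):.2f} %")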

  19. Analysis and Reduction of Complex Networks Under Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Knio, Omar M

    2014-04-09

    This is a collaborative proposal that aims at developing new methods for the analysis and reduction of complex multiscale networks under uncertainty. The approach is based on combining methods of computational singular perturbation (CSP) and probabilistic uncertainty quantification. In deterministic settings, CSP yields asymptotic approximations of reduced-dimensionality “slow manifolds” on which a multiscale dynamical system evolves. Introducing uncertainty raises fundamentally new issues, particularly concerning its impact on the topology of slow manifolds, and means to represent and quantify associated variability. To address these challenges, this project uses polynomial chaos (PC) methods to reformulate uncertain network models, and to analyze them using CSP in probabilistic terms. Specific objectives include (1) developing effective algorithms that can be used to illuminate fundamental and unexplored connections among model reduction, multiscale behavior, and uncertainty, and (2) demonstrating the performance of these algorithms through applications to model problems.

  1. Sensitivity and uncertainty analysis of NET/ITER shielding blankets

    International Nuclear Information System (INIS)

    Hogenbirk, A.; Gruppelaar, H.; Verschuur, K.A.

    1990-09-01

    Results are presented of sensitivity and uncertainty calculations based upon the European fusion file (EFF-1). The effect of uncertainties in Fe, Cr and Ni cross sections on the nuclear heating in the coils of a NET/ITER shielding blanket has been studied. The analysis has been performed for the total cross section as well as partial cross sections. The correct expression for the sensitivity profile was used, including the gain term. The resulting uncertainty in the nuclear heating lies between 10 and 20 per cent. (author). 18 refs.; 2 figs.; 2 tabs

  2. A structured analysis of uncertainty surrounding modeled impacts of groundwater-extraction rules

    Science.gov (United States)

    Guillaume, Joseph H. A.; Qureshi, M. Ejaz; Jakeman, Anthony J.

    2012-08-01

    Integrating economic and groundwater models for groundwater management can help improve understanding of the trade-offs involved between conflicting socioeconomic and biophysical objectives. However, there is significant uncertainty in most strategic decision-making situations, including in the models constructed to represent them. If not addressed, this uncertainty may be used to challenge the legitimacy of the models and of decisions made using them. In this context, a preliminary uncertainty analysis was conducted of a dynamic coupled economic-groundwater model aimed at assessing groundwater extraction rules. The analysis demonstrates how a variety of uncertainties in such a model can be addressed. A number of methods are used, including propagation of scenarios and bounds on parameters, multiple models, block bootstrap time-series sampling and robust linear regression for model calibration. These methods are described within the context of a theoretical uncertainty management framework, using a set of fundamental uncertainty management tasks and an uncertainty typology.
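
    Of the methods listed, block bootstrap time-series sampling is the most compact to show. The following minimal Python sketch resamples overlapping blocks of a short, made-up series so that short-range autocorrelation survives in each bootstrap replicate; the series values and block length are illustrative.

        import random

        series = [3.1, 2.9, 3.4, 3.8, 4.0, 3.6, 3.2, 2.8, 2.5, 2.9, 3.3, 3.7]
        block_len = 3

        def block_bootstrap(series, block_len):
            # Moving-block bootstrap: draw whole blocks with replacement and
            # concatenate until the replicate matches the original length.
            blocks = [series[i:i + block_len]
                      for i in range(len(series) - block_len + 1)]
            out = []
            while len(out) < len(series):
                out.extend(random.choice(blocks))
            return out[:len(series)]

        print(block_bootstrap(series, block_len))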

  3. Holistic uncertainty analysis in river basin modeling for climate vulnerability assessment

    Science.gov (United States)

    Taner, M. U.; Wi, S.; Brown, C.

    2017-12-01

    The challenges posed by an uncertain future climate are a prominent concern for water resources managers. A number of frameworks exist for assessing the impacts of climate-related uncertainty, including internal climate variability and anthropogenic climate change, such as scenario-based approaches and vulnerability-based approaches. While in many cases climate uncertainty may be dominant, other factors such as the future evolution of the river basin, the hydrologic response and reservoir operations are potentially significant sources of uncertainty. While uncertainty associated with modeling the hydrologic response has received attention, very little attention has focused on the range of uncertainty and possible effects of the water resources infrastructure and its management. This work presents a holistic framework that allows analysis of climate, hydrologic and water management uncertainty in water resources systems analysis, with the aid of a water system model designed to integrate component models for hydrologic processes and water management activities. The uncertainties explored include those associated with climate variability and change, hydrologic model parameters, and water system operation rules. A Bayesian framework is used to quantify and model the uncertainties at each modeling step in an integrated fashion, including prior and likelihood information about model parameters. The framework is demonstrated in a case study for the St. Croix Basin located at the border of the United States and Canada.

  4. Geological-structural models used in SR 97. Uncertainty analysis

    Energy Technology Data Exchange (ETDEWEB)

    Saksa, P.; Nummela, J. [FINTACT Oy (Finland)

    1998-10-01

    The uncertainty of geological-structural models was studied for the three sites in SR 97, called Aberg, Beberg and Ceberg. The evaluation covered both regional and site scale models, the emphasis being placed on fracture zones at the site scale. Uncertainty is a natural feature of all geoscientific investigations. It originates from measurements (errors in data, sampling limitations, scale variation) and conceptualisation (structural geometries and properties, ambiguous geometric or parametric solutions), to name the major sources. The structures of A-, B- and Ceberg are fracture zones of varying types. No major differences in the conceptualisation between the sites were noted. One source of uncertainty in the site models is the absence of fracture and zone information at scales from 10 to 300 - 1000 m. At Aberg the development of the regional model has been performed very thoroughly. At the site scale, one major source of uncertainty is that a clear definition of the target area is missing. Structures encountered in the boreholes are well explained, and an interdisciplinary approach to interpretation has been taken. The Beberg and Ceberg regional models contain relatively large uncertainties due to the investigation methodology and experience available at that time. At the site scale, six additional structures were proposed for both Beberg and Ceberg for variant analyses of these sites. Both sites include uncertainty in the form of many non-interpreted fractured sections along the boreholes. Statistical analysis gives high occurrences of structures for all three sites: typically 20 - 30 structures/km³. Aberg has the highest structural frequency, Beberg comes next and Ceberg has the lowest. The borehole configuration, orientations and surveying goals were inspected to find whether preferences or factors causing bias were present. Data from Aberg supports the conclusion that the Aespoe sub-volume would be an anomalously fractured, tectonised unit of its own. This means that

  5. Geological-structural models used in SR 97. Uncertainty analysis

    International Nuclear Information System (INIS)

    Saksa, P.; Nummela, J.

    1998-10-01

    The uncertainty of geological-structural models was studied for the three sites in SR 97, called Aberg, Beberg and Ceberg. The evaluation covered both regional and site scale models, the emphasis being placed on fracture zones at the site scale. Uncertainty is a natural feature of all geoscientific investigations. It originates from measurements (errors in data, sampling limitations, scale variation) and conceptualisation (structural geometries and properties, ambiguous geometric or parametric solutions), to name the major sources. The structures of A-, B- and Ceberg are fracture zones of varying types. No major differences in the conceptualisation between the sites were noted. One source of uncertainty in the site models is the absence of fracture and zone information at scales from 10 to 300 - 1000 m. At Aberg the development of the regional model has been performed very thoroughly. At the site scale, one major source of uncertainty is that a clear definition of the target area is missing. Structures encountered in the boreholes are well explained, and an interdisciplinary approach to interpretation has been taken. The Beberg and Ceberg regional models contain relatively large uncertainties due to the investigation methodology and experience available at that time. At the site scale, six additional structures were proposed for both Beberg and Ceberg for variant analyses of these sites. Both sites include uncertainty in the form of many non-interpreted fractured sections along the boreholes. Statistical analysis gives high occurrences of structures for all three sites: typically 20 - 30 structures/km³. Aberg has the highest structural frequency, Beberg comes next and Ceberg has the lowest. The borehole configuration, orientations and surveying goals were inspected to find whether preferences or factors causing bias were present. Data from Aberg supports the conclusion that the Aespoe sub-volume would be an anomalously fractured, tectonised unit of its own. This means that the

  6. Uncertainty Management and Sensitivity Analysis

    DEFF Research Database (Denmark)

    Rosenbaum, Ralph K.; Georgiadis, Stylianos; Fantke, Peter

    2018-01-01

    Uncertainty is always there and LCA is no exception to that. The presence of uncertainties of different types and from numerous sources in LCA results is a fact, but managing them makes it possible to quantify and improve the precision of a study and the robustness of its conclusions. LCA practice sometimes... suffers from an imbalanced perception of uncertainties, justifying modelling choices and omissions. Identifying prevalent misconceptions around uncertainties in LCA is a central goal of this chapter, aiming to establish a positive approach focusing on the advantages of uncertainty management. The main... objectives of this chapter are to learn how to deal with uncertainty in the context of LCA, how to quantify it, interpret and use it, and how to communicate it. The subject is approached more holistically than just focusing on relevant statistical methods or purely mathematical aspects. This chapter...

  7. Uncertainty Analysis via Failure Domain Characterization: Unrestricted Requirement Functions

    Science.gov (United States)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2011-01-01

    This paper proposes an uncertainty analysis framework based on the characterization of the uncertain parameter space. This characterization enables the identification of worst-case uncertainty combinations and the approximation of the failure and safe domains with a high level of accuracy. Because these approximations are composed of subsets of readily computable probability, they enable the calculation of arbitrarily tight upper and lower bounds on the failure probability. The methods developed herein, which are based on nonlinear constrained optimization, are applicable to requirement functions whose functional dependency on the uncertainty is arbitrary and whose explicit form may even be unknown. Some of the most prominent features of the methodology are the substantial desensitization of the calculations from the assumed uncertainty model (i.e., the probability distribution describing the uncertainty) as well as the accommodation of changes in such a model with a practically insignificant amount of computational effort.
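
    A minimal sketch of the bounding idea: if disjoint regions of readily computable probability can be certified (e.g., by the constrained optimization mentioned above) to lie entirely inside the failure domain or entirely inside the safe domain, their probabilities bound the failure probability from below and above. The boxes below are illustrative assumptions, stated for independent standard normal parameters:

```python
import numpy as np
from scipy.stats import norm

def box_probability(lo, hi):
    """Probability of an axis-aligned box under independent standard normals."""
    return np.prod(norm.cdf(hi) - norm.cdf(lo))

# Illustrative disjoint boxes certified to lie entirely inside the
# failure domain and the safe domain, respectively.
failure_boxes = [([2.0, -1.0], [4.0, 1.0])]
safe_boxes    = [([-2.0, -2.0], [1.5, 2.0])]

lower = sum(box_probability(np.array(l), np.array(h)) for l, h in failure_boxes)
upper = 1.0 - sum(box_probability(np.array(l), np.array(h)) for l, h in safe_boxes)
print(f"{lower:.4f} <= P(failure) <= {upper:.4f}")
```

    Refining the partition with more certified boxes tightens both bounds, which is what makes the bounds arbitrarily tight in principle.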

  8. Uncertainty Flow Facilitates Zero-Shot Multi-Label Learning in Affective Facial Analysis

    Directory of Open Access Journals (Sweden)

    Wenjun Bai

    2018-02-01

    Full Text Available Featured Application: The proposed Uncertainty Flow framework may benefit facial analysis with its promised elevation in discriminability in multi-label affective classification tasks. Moreover, this framework also allows efficient model training and between-task knowledge transfer. Applications that rely heavily on continuous prediction of emotional valence, e.g., monitoring prisoners' emotional stability in jail, can benefit directly from our framework. Abstract: To lower the single-label dependency of affective facial analysis, the fruition of multi-label affective learning is urgently needed. The impediment to practical implementation of existing multi-label algorithms is the scarcity of scalable multi-label training datasets. To resolve this, an inductive transfer learning based framework, i.e., Uncertainty Flow, is put forward in this research to allow knowledge transfer from a single-labelled emotion recognition task to a multi-label affective recognition task. That is, the model uncertainty—which can be quantified in Uncertainty Flow—is distilled from a single-label learning task. The distilled model uncertainty then ensures efficient zero-shot multi-label affective learning. From a theoretical perspective, within our proposed Uncertainty Flow framework, the feasibility of applying weakly informative priors, e.g., uniform and Cauchy priors, is fully explored in this research. More importantly, based on the derived weight uncertainty, three sets of prediction-related uncertainty indexes, i.e., soft-max uncertainty, pure uncertainty and uncertainty plus, are proposed to produce reliable and accurate multi-label predictions. Validated on our manually annotated evaluation dataset, i.e., the multi-label annotated FER2013, our proposed Uncertainty Flow in multi-label facial expression analysis exhibited superiority to conventional multi-label learning algorithms and multi-label compatible neural networks. The success of our

  9. Sensitivity and uncertainty analysis for fission product decay heat calculations

    International Nuclear Information System (INIS)

    Rebah, J.; Lee, Y.K.; Nimal, J.C.; Nimal, B.; Luneville, L.; Duchemin, B.

    1994-01-01

    The calculated uncertainty in decay heat due to uncertainties in the basic nuclear data given in the CEA86 Library is presented. Uncertainties in summation calculations arise from several sources: fission product yields, half-lives and average decay energies. Correlations between basic data are taken into account. Uncertainty analyses were performed for thermal-neutron-induced fission of U235 and Pu239, for both burst fission and finite irradiation times. The decay heat calculated in this study is compared with experimental results and with new calculations using the JEF2 Library. (from authors) 6 figs., 19 refs

  10. Uncertainty Analysis via Failure Domain Characterization: Polynomial Requirement Functions

    Science.gov (United States)

    Crespo, Luis G.; Munoz, Cesar A.; Narkawicz, Anthony J.; Kenny, Sean P.; Giesy, Daniel P.

    2011-01-01

    This paper proposes an uncertainty analysis framework based on the characterization of the uncertain parameter space. This characterization enables the identification of worst-case uncertainty combinations and the approximation of the failure and safe domains with a high level of accuracy. Because these approximations are composed of subsets of readily computable probability, they enable the calculation of arbitrarily tight upper and lower bounds on the failure probability. A Bernstein expansion approach is used to size hyper-rectangular subsets while a sum-of-squares programming approach is used to size quasi-ellipsoidal subsets. These methods are applicable to requirement functions whose functional dependency on the uncertainty is a known polynomial. Some of the most prominent features of the methodology are the substantial desensitization of the calculations from the uncertainty model assumed (i.e., the probability distribution describing the uncertainty) as well as the accommodation of changes in such a model with a practically insignificant amount of computational effort.
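
    The Bernstein expansion approach rests on the fact that, on a box, a polynomial is bounded below and above by the minimum and maximum of its Bernstein coefficients. A minimal univariate sketch on [0, 1] with an illustrative polynomial (the paper's method handles multivariate polynomials over hyper-rectangles):

```python
from math import comb

def bernstein_coeffs(a):
    """Bernstein coefficients of p(x) = sum a[i] * x**i on [0, 1].
    b_k = sum_{i<=k} C(k, i) / C(n, i) * a[i]."""
    n = len(a) - 1
    return [sum(comb(k, i) / comb(n, i) * a[i] for i in range(k + 1))
            for k in range(n + 1)]

a = [1.0, -3.0, 2.0]          # p(x) = 1 - 3x + 2x^2, illustrative
b = bernstein_coeffs(a)
# The enclosure is conservative: here [-0.5, 1.0] versus the true
# range [-0.125, 1.0]; subdividing the box tightens it.
print(f"{min(b):.3f} <= p(x) <= {max(b):.3f} on [0, 1]")
```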

  11. Probabilistic accident consequence uncertainty analysis -- Early health effects uncertainty assessment. Volume 2: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Haskin, F.E. [Univ. of New Mexico, Albuquerque, NM (United States); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Goossens, L.H.J.; Kraan, B.C.P. [Delft Univ. of Technology (Netherlands)

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA early health effects models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on early health effects, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  12. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for internal dosimetry. Volume 2: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Harrison, J.D. [National Radiological Protection Board (United Kingdom); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

    1998-04-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA internal dosimetry models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on internal dosimetry, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  13. The uncertainty in physical measurements an introduction to data analysis in the physics laboratory

    CERN Document Server

    Fornasini, Paolo

    2008-01-01

    All measurements of physical quantities are affected by uncertainty. Understanding the origin of uncertainty, evaluating its extent and suitably taking it into account in data analysis is essential for assessing the degree of accuracy of phenomenological relationships and physical laws in both scientific research and technological applications. The Uncertainty in Physical Measurements: An Introduction to Data Analysis in the Physics Laboratory presents an introduction to uncertainty and to some of the most common procedures of data analysis. This book will serve the reader well by filling the gap between tutorial textbooks and highly specialized monographs. The book is divided into three parts. The first part is a phenomenological introduction to measurement and uncertainty: properties of instruments, different causes and corresponding expressions of uncertainty, histograms and distributions, and unified expression of uncertainty. The second part contains an introduction to probability theory, random variable...

  14. Deterministic sensitivity and uncertainty analysis for large-scale computer models

    International Nuclear Information System (INIS)

    Worley, B.A.; Pin, F.G.; Oblow, E.M.; Maerker, R.E.; Horwedel, J.E.; Wright, R.Q.

    1988-01-01

    This paper presents a comprehensive approach to sensitivity and uncertainty analysis of large-scale computer models that is analytic (deterministic) in principle and that is firmly based on the model equations. The theory and application of two systems based upon computer calculus, GRESS and ADGEN, are discussed relative to their role in calculating model derivatives and sensitivities without a prohibitive initial manpower investment. Storage and computational requirements for these two systems are compared for a gradient-enhanced version of the PRESTO-II computer model. A Deterministic Uncertainty Analysis (DUA) method that retains the characteristics of analytically computing result uncertainties based upon parameter probability distributions is then introduced and results from recent studies are shown. 29 refs., 4 figs., 1 tab
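
    GRESS and ADGEN obtain derivatives via computer calculus, i.e., automatic differentiation of the model source. A minimal illustration of the underlying idea using forward-mode dual numbers and a toy response function (not the PRESTO-II model; everything here is an assumption for illustration):

```python
class Dual:
    """Minimal forward-mode AD value: carries f and df/dx together."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        # Product rule: (fg)' = f'g + fg'
        return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
    __rmul__ = __mul__

def model(k):
    # Toy response: R(k) = 3k^2 + 2k + 1
    return 3 * k * k + 2 * k + 1

x = Dual(1.5, 1.0)                       # seed derivative dx/dx = 1
r = model(x)
print(f"R = {r.val}, dR/dk = {r.der}")   # analytic dR/dk = 6k + 2 = 11
```

    The sensitivity falls out of a single model evaluation, which is the appeal over finite-difference reruns for large models.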

  15. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, appendices A and B

    International Nuclear Information System (INIS)

    Harper, F.T.; Young, M.L.; Miller, L.A.; Hora, S.C.; Lui, C.H.; Goossens, L.H.J.; Cooke, R.M.; Paesler-Sauer, J.; Helton, J.C.

    1995-01-01

    MACCS and COSYMA, two new probabilistic accident consequence codes whose development was completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, developed independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. The resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the second of a three-volume document describing the project and contains two appendices describing the rationales for the dispersion and deposition data along with short biographies of the 16 experts who participated in the project

  16. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, appendices A and B

    Energy Technology Data Exchange (ETDEWEB)

    Harper, F.T.; Young, M.L.; Miller, L.A. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States); Lui, C.H. [Nuclear Regulatory Commission, Washington, DC (United States); Goossens, L.H.J.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Paesler-Sauer, J. [Research Center, Karlsruhe (Germany); Helton, J.C. [and others

    1995-01-01

    MACCS and COSYMA, two new probabilistic accident consequence codes whose development was completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, developed independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. The resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the second of a three-volume document describing the project and contains two appendices describing the rationales for the dispersion and deposition data along with short biographies of the 16 experts who participated in the project.

  17. Uncertainty analysis for the assembly and core simulation of BEAVRS at the HZP conditions

    Energy Technology Data Exchange (ETDEWEB)

    Wan, Chenghui [School of Nuclear Science and Technology, Xi’an Jiaotong University, Xi’an 710049 (China); Cao, Liangzhi, E-mail: caolz@mail.xjtu.edu.cn [School of Nuclear Science and Technology, Xi’an Jiaotong University, Xi’an 710049 (China); Wu, Hongchun [School of Nuclear Science and Technology, Xi’an Jiaotong University, Xi’an 710049 (China); Shen, Wei [School of Nuclear Science and Technology, Xi’an Jiaotong University, Xi’an 710049 (China); Canadian Nuclear Safety Commission, Ottawa, Ontario (Canada)

    2017-04-15

    Highlights: • Uncertainty analysis has been completed based on the “two-step” scheme. • Uncertainty analysis has been performed for BEAVRS at HZP. • For lattice calculations, the uncertainties of the few-group constants were quantified. • For core simulation, uncertainties of k_eff and power distributions were quantified. - Abstract: Based on the “two-step” scheme for reactor-physics calculations, the capability of uncertainty analysis for core simulations has been implemented in the UNICORN code, an in-house code for the sensitivity and uncertainty analysis of reactor-physics calculations. Applying the statistical sampling method, the nuclear-data uncertainties can be propagated to the important predictions of the core simulations. The uncertainties of the few-group constants introduced by the uncertainties of the multigroup microscopic cross sections are quantified first for the lattice calculations; the uncertainties of the few-group constants are then propagated to the core multiplication factor and core power distributions for the core simulations. Up to now, our in-house lattice code NECP-CACTI and the neutron-diffusion solver NECP-VIOLET have been implemented in UNICORN for steady-state core simulations based on the “two-step” scheme. With NECP-CACTI and NECP-VIOLET, the modeling and simulation of the steady-state BEAVRS benchmark problem at the HZP conditions was performed, and the results were compared with those obtained by CASMO-4E. Based on this modeling and simulation, the UNICORN code has been applied to perform the uncertainty analysis for BEAVRS at HZP. The uncertainty results of the eigenvalues and two-group constants for the lattice calculations, and of the multiplication factor and power distributions for the steady-state core simulations, are obtained and analyzed in detail.
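
    A minimal sketch of the statistical-sampling propagation through a "two-step" scheme. The real chain uses NECP-CACTI and NECP-VIOLET; the two functions below are toy stand-ins, and all numbers are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)

def lattice_calc(xs):
    """Toy stand-in for a lattice code: collapses sampled multigroup
    data to a few-group constant (illustrative only)."""
    return 1.02 * xs[0] / (xs[1] + 0.4)

def core_calc(fg):
    """Toy stand-in for a core solver mapping few-group data to k_eff."""
    return 0.98 + 0.05 * fg

# Step 1: sample the "nuclear data" about nominal values (2% rel. std. dev.).
samples = rng.normal([1.0, 0.6], [0.02, 0.012], size=(500, 2))
# Step 2: propagate each sample through lattice and core calculations.
keff = np.array([core_calc(lattice_calc(s)) for s in samples])
print(f"k_eff = {keff.mean():.5f} +/- {keff.std(ddof=1):.5f} (1 sigma)")
```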

  18. Uncertainty analysis for the assembly and core simulation of BEAVRS at the HZP conditions

    International Nuclear Information System (INIS)

    Wan, Chenghui; Cao, Liangzhi; Wu, Hongchun; Shen, Wei

    2017-01-01

    Highlights: • Uncertainty analysis has been completed based on the “two-step” scheme. • Uncertainty analysis has been performed for BEAVRS at HZP. • For lattice calculations, the uncertainties of the few-group constants were quantified. • For core simulation, uncertainties of k_eff and power distributions were quantified. - Abstract: Based on the “two-step” scheme for reactor-physics calculations, the capability of uncertainty analysis for core simulations has been implemented in the UNICORN code, an in-house code for the sensitivity and uncertainty analysis of reactor-physics calculations. Applying the statistical sampling method, the nuclear-data uncertainties can be propagated to the important predictions of the core simulations. The uncertainties of the few-group constants introduced by the uncertainties of the multigroup microscopic cross sections are quantified first for the lattice calculations; the uncertainties of the few-group constants are then propagated to the core multiplication factor and core power distributions for the core simulations. Up to now, our in-house lattice code NECP-CACTI and the neutron-diffusion solver NECP-VIOLET have been implemented in UNICORN for steady-state core simulations based on the “two-step” scheme. With NECP-CACTI and NECP-VIOLET, the modeling and simulation of the steady-state BEAVRS benchmark problem at the HZP conditions was performed, and the results were compared with those obtained by CASMO-4E. Based on this modeling and simulation, the UNICORN code has been applied to perform the uncertainty analysis for BEAVRS at HZP. The uncertainty results of the eigenvalues and two-group constants for the lattice calculations, and of the multiplication factor and power distributions for the steady-state core simulations, are obtained and analyzed in detail.

  19. Performance Assessment Uncertainty Analysis for Japan's HLW Program Feasibility Study (H12)

    International Nuclear Information System (INIS)

    BABA, T.; ISHIGURO, K.; ISHIHARA, Y.; SAWADA, A.; UMEKI, H.; WAKASUGI, K.; WEBB, ERIK K.

    1999-01-01

    Most HLW programs in the world recognize that any estimate of long-term radiological performance must be couched in terms of the uncertainties derived from natural variation, changes through time and lack of knowledge about the essential processes. The Japan Nuclear Cycle Development Institute followed a relatively standard procedure to address two major categories of uncertainty. First, a FEatures, Events and Processes (FEPs) listing, screening and grouping activity was pursued in order to define the range of uncertainty in system processes as well as possible variations in engineering design. A reference and many alternative cases representing various groups of FEPs were defined and individual numerical simulations performed for each to quantify the range of conceptual uncertainty. Second, parameter distributions were developed for the reference case to represent the uncertainty in the strength of these processes, the sequencing of activities and geometric variations. Both point estimates using high and low values for individual parameters as well as a probabilistic analysis were performed to estimate parameter uncertainty. A brief description of the conceptual model uncertainty analysis is presented. This paper focuses on presenting the details of the probabilistic parameter uncertainty assessment

  20. Status of XSUSA for sampling based nuclear data uncertainty and sensitivity analysis

    International Nuclear Information System (INIS)

    Zwermann, W.; Gallner, L.; Klein, M.; Krzydacz-Hausmann; Pasichnyk, I.; Pautz, A.; Velkov, K.

    2013-01-01

    In the present contribution, an overview of the sampling-based XSUSA method for sensitivity and uncertainty analysis with respect to nuclear data is given. The focus is on recent developments and applications of XSUSA. These applications include calculations for critical assemblies, fuel assembly depletion calculations, and steady-state as well as transient reactor core calculations. The analyses are partially performed in the framework of international benchmark working groups (UACSA - Uncertainty Analyses for Criticality Safety Assessment, UAM - Uncertainty Analysis in Modelling). It is demonstrated that, particularly for full-scale reactor calculations, the influence of the nuclear data uncertainties on the results can be substantial. For instance, for the radial fission rate distributions of mixed UO2/MOX light water reactor cores, the 2σ uncertainties in the core centre and periphery can reach values exceeding 10%. For a fast transient, the resulting time behaviour of the reactor power was covered by a wide uncertainty band. Overall, the results confirm the necessity of adding systematic uncertainty analyses to best-estimate reactor calculations. (authors)

  1. Uncertainty analysis in the applications of nuclear probabilistic risk assessment

    International Nuclear Information System (INIS)

    Le Duy, T.D.

    2011-01-01

    The aim of this thesis is to propose an approach to model the parameter and model uncertainties affecting the results of risk indicators used in applications of nuclear Probabilistic Risk Assessment (PRA). After studying the limitations of the traditional probabilistic approach for representing uncertainty in PRA models, a new approach based on the Dempster-Shafer theory has been proposed. The uncertainty analysis process of the proposed approach consists of five main steps. The first step models input parameter uncertainties by belief and plausibility functions according to the data available in the PRA model. The second step involves the propagation of parameter uncertainties through the risk model to obtain the uncertainties associated with the output risk indicators. Model uncertainty is then taken into account in the third step by considering possible alternative risk models. The fourth step is intended firstly to provide decision makers with the information needed for decision making under uncertainty (parametric and model) and secondly to identify the input parameters that have significant uncertainty contributions to the result. The final step allows the process to be continued in a loop by studying the updating of belief functions given new data. The proposed methodology was implemented on a real but simplified application of a PRA model. (author)
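
    In the Dempster-Shafer setting of the first step, uncertainty about an input is carried by a pair of measures, belief and plausibility, rather than a single probability. A minimal sketch with an illustrative basic mass assignment (the bin names and masses are assumptions, not values from the thesis):

```python
def belief(masses, event):
    """Bel(A): total mass of focal sets entirely contained in A."""
    return sum(m for s, m in masses.items() if s <= event)

def plausibility(masses, event):
    """Pl(A): total mass of focal sets that intersect A."""
    return sum(m for s, m in masses.items() if s & event)

# Illustrative basic mass assignment over failure-rate bins.
masses = {
    frozenset({"low"}): 0.5,
    frozenset({"low", "medium"}): 0.3,          # imprecise evidence
    frozenset({"low", "medium", "high"}): 0.2,  # total ignorance
}
A = frozenset({"low", "medium"})
print(f"Bel = {belief(masses, A):.2f}, Pl = {plausibility(masses, A):.2f}")
```

    The gap between Bel(A) = 0.80 and Pl(A) = 1.00 is exactly the epistemic imprecision that a single probability value would hide.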

  2. Quantitative Analysis of Uncertainty in Medical Reporting: Creating a Standardized and Objective Methodology.

    Science.gov (United States)

    Reiner, Bruce I

    2018-04-01

    Uncertainty in text-based medical reports has long been recognized as problematic, frequently resulting in misunderstanding and miscommunication. One strategy for addressing the negative clinical ramifications of report uncertainty would be the creation of a standardized methodology for characterizing and quantifying uncertainty language, which could provide both the report author and reader with context related to the perceived level of diagnostic confidence and accuracy. A number of computerized strategies could be employed in this analysis, including string search, natural language processing and understanding, histogram analysis, topic modeling, and machine learning. The derived uncertainty data offer the potential to objectively analyze report uncertainty in real time and to correlate it with outcomes analysis for the purpose of context- and user-specific decision support at the point of care, where intervention would have the greatest clinical impact.
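
    Of the computerized strategies mentioned, string search is the simplest to sketch. The following scores a report against a small hedging lexicon; the phrases and weights are illustrative assumptions, not a validated standard:

```python
import re

# Illustrative lexicon of hedging phrases with crude uncertainty weights.
HEDGES = {"cannot exclude": 0.9, "possibly": 0.7, "likely": 0.4,
          "consistent with": 0.5, "no evidence of": 0.1}

def uncertainty_score(report):
    """Score a report by its strongest matched hedging phrase."""
    text = report.lower()
    hits = [w for phrase, w in HEDGES.items()
            if re.search(re.escape(phrase), text)]
    return max(hits, default=0.0)

report = "Findings possibly consistent with early pneumonia; follow-up advised."
print(f"report uncertainty score: {uncertainty_score(report):.1f}")
```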

  3. Probabilistic Accident Consequence Uncertainty Analysis of the Food Chain Module in the COSYMA Package (invited paper)

    International Nuclear Information System (INIS)

    Brown, J.; Jones, J.A.

    2000-01-01

    This paper describes the uncertainty analysis of the food chain module of COSYMA and the uncertainty distributions on the input parameter values for the food chain model provided by the expert panels that were used for the analysis. Two expert panels were convened, covering the areas of soil and plant transfer processes and transfer to and through animals. The aggregated uncertainty distributions from the experts for the elicited variables were used in an uncertainty analysis of the food chain module of COSYMA. The main aim of the module analysis was to identify those parameters whose uncertainty makes large contributions to the overall uncertainty and so should be included in the overall analysis. (author)

  4. Comprehensive neutron cross-section and secondary energy distribution uncertainty analysis for a fusion reactor

    International Nuclear Information System (INIS)

    Gerstl, S.A.W.; LaBauve, R.J.; Young, P.G.

    1980-05-01

    Using General Atomic's well-documented Power Generating Fusion Reactor (PGFR) design as an example, this report carries out a comprehensive neutron cross-section and secondary energy distribution (SED) uncertainty analysis. The LASL sensitivity and uncertainty analysis code SENSIT is used to calculate reaction cross-section sensitivity profiles and integral SED sensitivity coefficients. These are then folded with covariance matrices and integral SED uncertainties to obtain the resulting uncertainties in three calculated neutronics design parameters: two critical radiation damage rates and a nuclear heating rate. The report documents the first sensitivity-based data uncertainty analysis which incorporates a quantitative treatment of the effects of SED uncertainties. The results demonstrate quantitatively that the ENDF/B-V cross-section data files for C, H, and O, including their SED data, are fully adequate for this design application, while the data for Fe and Ni are at best marginally adequate because they give rise to response uncertainties of up to 25%. Much higher response uncertainties are caused by cross-section and SED data uncertainties in Cu (26 to 45%), tungsten (24 to 54%), and Cr (up to 98%). Specific recommendations are given for re-evaluations of certain reaction cross-sections, secondary energy distributions, and uncertainty estimates

  5. A Peep into the Uncertainty-Complexity-Relevance Modeling Trilemma through Global Sensitivity and Uncertainty Analysis

    Science.gov (United States)

    Munoz-Carpena, R.; Muller, S. J.; Chu, M.; Kiker, G. A.; Perz, S. G.

    2014-12-01

    Model complexity resulting from the need to integrate environmental system components cannot be overstated. In particular, additional emphasis is urgently needed on rational approaches to guide decision making through the uncertainties surrounding the integrated system across decision-relevant scales. However, in spite of the difficulties that the consideration of modeling uncertainty presents for the decision process, it should not be avoided, or the value and science behind the models will be undermined. These two issues, i.e., the need for coupled models that can answer the pertinent questions and the need for models that do so with sufficient certainty, are the key indicators of a model's relevance. Model relevance is inextricably linked with model complexity. Although model complexity has advanced greatly in recent years, there has been little work to rigorously characterize the threshold of relevance in integrated and complex models. Formally assessing the relevance of the model in the face of increasing complexity would be valuable because there is growing unease among developers and users of complex models about the cumulative effects of various sources of uncertainty on model outputs. In particular, this issue has prompted doubt over whether the considerable effort going into further elaborating complex models will in fact yield the expected payback. New approaches have been proposed recently to evaluate the uncertainty-complexity-relevance modeling trilemma (Muller, Muñoz-Carpena and Kiker, 2011) by incorporating state-of-the-art global sensitivity and uncertainty analysis (GSA/UA) in every step of model development so as to quantify not only the uncertainty introduced by the addition of new environmental components, but also the effect that these new components have on existing components (interactions, non-linear responses). Outputs from the analysis can also be used to quantify system resilience (stability, alternative states, thresholds or tipping
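
    A minimal sketch of the kind of variance-based GSA referred to here: first-order Sobol indices estimated with a pick-freeze (Saltelli-type) scheme on a standard test function. All details are illustrative; production studies would typically use a dedicated GSA library:

```python
import numpy as np

def model(x):
    # Ishigami-like test function, a common GSA benchmark.
    return (np.sin(x[:, 0]) + 7.0 * np.sin(x[:, 1])**2
            + 0.1 * x[:, 2]**4 * np.sin(x[:, 0]))

rng = np.random.default_rng(1)
n, d = 20000, 3
A = rng.uniform(-np.pi, np.pi, (n, d))
B = rng.uniform(-np.pi, np.pi, (n, d))
yA, yB = model(A), model(B)
var_y = np.var(np.concatenate([yA, yB]))

for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]   # "pick-freeze": replace column i only
    S1 = np.mean(yB * (model(ABi) - yA)) / var_y   # Saltelli-type estimator
    print(f"first-order Sobol index S{i+1} ~ {S1:.2f}")
```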

  6. Uncertainty analysis and design optimization of hybrid rocket motor powered vehicle for suborbital flight

    Directory of Open Access Journals (Sweden)

    Zhu Hao

    2015-06-01

    Full Text Available In this paper, we propose an uncertainty analysis and design optimization method and its application to a hybrid rocket motor (HRM) powered vehicle. The multidisciplinary design model of the rocket system is established and the design uncertainties are quantified. The sensitivity analysis of the uncertainties shows that the uncertainty generated from the error of the fuel regression rate model has the most significant effect on the system performance. Then the differences between deterministic design optimization (DDO) and uncertainty-based design optimization (UDO) are discussed. Two newly developed uncertainty analysis methods, the Kriging-based Monte Carlo simulation (KMCS) and the Kriging-based Taylor series approximation (KTSA), are carried out using a global approximation Kriging modeling method. Based on the system design model and the results of the design uncertainty analysis, the design optimization of an HRM powered vehicle for suborbital flight is implemented using three design optimization methods: DDO, KMCS and KTSA. The comparisons indicate that the two UDO methods can enhance the design reliability and robustness. The analyses and methods proposed in this paper can provide a better way for the general design of HRM powered vehicles.

  7. Uncertainty analysis of the 35% reactor inlet header break in a CANDU 6 reactor using RELAP/SCDAPSIM/MOD4.0 with integrated uncertainty analysis option

    International Nuclear Information System (INIS)

    Dupleac, D.; Perez, M.; Reventos, F.; Allison, C.

    2011-01-01

    The RELAP/SCDAPSIM/MOD4.0 code, designed to predict the behavior of reactor systems during normal and accident conditions, is being developed as part of an international nuclear technology Software Development and Training Program (SDTP). RELAP/SCDAPSIM/MOD4.0, which is the first version of RELAP5 completely rewritten to FORTRAN 90/95/2000 standards, uses the publicly available RELAP5 and SCDAP models in combination with (a) advanced programming and numerical techniques, (b) advanced SDTP-member-developed models for LWR, HWR, and research reactor analysis, and (c) a variety of other member-developed computational packages. One such computational package is an integrated uncertainty analysis (IUA) package being developed jointly by the Technical University of Catalonia (UPC) and Innovative Systems Software (ISS). RELAP/SCDAPSIM/MOD4.0(IUA) follows the input-propagation approach, using probability distribution functions to define the uncertainty of the input parameters. The main steps for this type of methodology, often referred to as statistical approaches or Wilks’ methods, are the following: 1. Selection of the plant; 2. Selection of the scenario; 3. Selection of the safety criteria; 4. Identification and ranking of the relevant phenomena based on the safety criteria; 5. Selection of the appropriate code parameters to represent those phenomena; 6. Association of uncertainty by means of Probability Distribution Functions (PDFs) for each selected parameter; 7. Random sampling of the selected parameters according to their PDFs and performing multiple computer runs to obtain uncertainty bands with a certain percentile and confidence level; 8. Processing the results of the multiple computer runs to estimate the uncertainty bands for the computed quantities associated with the selected safety criteria. RELAP/SCDAPSIM/MOD4.0(IUA) calculates the number of required code runs given the desired percentile and confidence level, performs the sampling process for the
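
    The run count in step 7 comes from Wilks' formula: for a first-order one-sided tolerance limit, the smallest N satisfying 1 - gamma^N >= beta, where gamma is the coverage and beta the confidence. A minimal sketch (the 95%/95% values of 59 and 93 runs are the standard first- and second-order results):

```python
from math import ceil, log

def wilks_runs(coverage=0.95, confidence=0.95, order=1):
    """Smallest N giving a one-sided Wilks tolerance limit.
    First order: 1 - coverage**N >= confidence."""
    if order == 1:
        return ceil(log(1.0 - confidence) / log(coverage))
    # Second order (largest value discarded):
    # 1 - g**N - N*(1-g)*g**(N-1) >= confidence
    n = 2
    while (1 - coverage**n
           - n * (1 - coverage) * coverage**(n - 1)) < confidence:
        n += 1
    return n

print(wilks_runs())            # 59 runs for a 95%/95% first-order statement
print(wilks_runs(order=2))     # 93 runs at second order
```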

  8. Uncertainty analysis of the 35% reactor inlet header break in a CANDU 6 reactor using RELAP/SCDAPSIM/MOD4.0 with integrated uncertainty analysis option

    Energy Technology Data Exchange (ETDEWEB)

    Dupleac, D., E-mail: danieldu@cne.pub.ro [Politehnica Univ. of Bucharest (Romania); Perez, M.; Reventos, F., E-mail: marina.perez@upc.edu, E-mail: francesc.reventos@upc.edu [Technical Univ. of Catalonia (Spain); Allison, C., E-mail: iss@cableone.net [Innovative Systems Software (United States)

    2011-07-01

    The RELAP/SCDAPSIM/MOD4.0 code, designed to predict the behavior of reactor systems during normal and accident conditions, is being developed as part of an international nuclear technology Software Development and Training Program (SDTP). RELAP/SCDAPSIM/MOD4.0, which is the first version of RELAP5 completely rewritten to FORTRAN 90/95/2000 standards, uses the publicly available RELAP5 and SCDAP models in combination with (a) advanced programming and numerical techniques, (b) advanced SDTP-member-developed models for LWR, HWR, and research reactor analysis, and (c) a variety of other member-developed computational packages. One such computational package is an integrated uncertainty analysis (IUA) package being developed jointly by the Technical University of Catalonia (UPC) and Innovative Systems Software (ISS). RELAP/SCDAPSIM/MOD4.0(IUA) follows the input-propagation approach, using probability distribution functions to define the uncertainty of the input parameters. The main steps for this type of methodology, often referred to as statistical approaches or Wilks’ methods, are the following: 1. Selection of the plant; 2. Selection of the scenario; 3. Selection of the safety criteria; 4. Identification and ranking of the relevant phenomena based on the safety criteria; 5. Selection of the appropriate code parameters to represent those phenomena; 6. Association of uncertainty by means of Probability Distribution Functions (PDFs) for each selected parameter; 7. Random sampling of the selected parameters according to their PDFs and performing multiple computer runs to obtain uncertainty bands with a certain percentile and confidence level; 8. Processing the results of the multiple computer runs to estimate the uncertainty bands for the computed quantities associated with the selected safety criteria. RELAP/SCDAPSIM/MOD4.0(IUA) calculates the number of required code runs given the desired percentile and confidence level, performs the sampling process for the

  9. Uncertainty and Sensitivity Analysis of Afterbody Radiative Heating Predictions for Earth Entry

    Science.gov (United States)

    West, Thomas K., IV; Johnston, Christopher O.; Hosder, Serhat

    2016-01-01

    The objective of this work was to perform sensitivity analysis and uncertainty quantification for afterbody radiative heating predictions of the Stardust capsule during Earth entry at peak afterbody radiation conditions. The radiation environment in the afterbody region poses significant challenges for accurate uncertainty quantification and sensitivity analysis due to the complexity of the flow physics, the computational cost, and the large number of uncertain variables. In this study, first a sparse collocation non-intrusive polynomial chaos approach along with global non-linear sensitivity analysis was used to identify the most significant uncertain variables and reduce the dimensions of the stochastic problem. Then, a total order stochastic expansion was constructed over only the important parameters for an efficient and accurate estimate of the uncertainty in radiation. Based on previous work, 388 uncertain parameters were considered in the radiation model, which came from the thermodynamics, flow field chemistry, and radiation modeling. The sensitivity analysis showed that only four of these variables contributed significantly to afterbody radiation uncertainty, accounting for almost 95% of the uncertainty. These included the electronic-impact excitation rate for N between levels 2 and 5 and the rates of three chemical reactions influencing the N, N(+), O, and O(+) number densities in the flow field.

  10. Uncertainty and sensitivity analysis on probabilistic safety assessment of an experimental facility

    International Nuclear Information System (INIS)

    Burgazzi, L.

    2000-01-01

    The aim of this work is to perform an uncertainty and sensitivity analysis on the probabilistic safety assessment of the International Fusion Materials Irradiation Facility (IFMIF), in order to assess the effect on the final risk values of the uncertainties associated with the generic data used for the initiating events and component reliability, and to identify the key quantities contributing to this uncertainty. The analysis is conducted on the expected frequency calculated for the accident sequences, defined through event tree (ET) modeling. This is done to lend further credibility to the ET model quantification, to calculate frequency distributions for the occurrence of events and, consequently, to assess whether sequences have been correctly selected from a probability standpoint, and finally to verify the fulfillment of the safety conditions. The uncertainty and sensitivity analyses are performed using Monte Carlo sampling and an importance parameter technique, respectively. (author)

  11. Uncertainties in elemental quantitative analysis by PIXE

    International Nuclear Information System (INIS)

    Montenegro, E.C.; Baptista, G.B.; Paschoa, A.S.; Barros Leite, C.V.

    1979-01-01

    The effects of the degree of non-uniformity of the particle beam, matrix composition and matrix thickness in a quantitative elemental analysis by particle induced X-ray emission (PIXE) are discussed and a criterion to evaluate the resulting degree of uncertainty in the mass determination by this method is established. (Auth.)

  12. Quality in environmental science for policy: Assessing uncertainty as a component of policy analysis

    International Nuclear Information System (INIS)

    Maxim, Laura; Sluijs, Jeroen P. van der

    2011-01-01

    The sheer number of attempts to define and classify uncertainty reveals an awareness of its importance in environmental science for policy, though the nature of uncertainty is often misunderstood. The interdisciplinary field of uncertainty analysis is unstable; there are currently several incomplete notions of uncertainty leading to different and incompatible uncertainty classifications. One of the most salient shortcomings of present-day practice is that most of these classifications focus on quantifying uncertainty while ignoring the qualitative aspects that tend to be decisive in the interface between science and policy. Consequently, the current practices of uncertainty analysis contribute to increasing the perceived precision of scientific knowledge, but do not adequately address its lack of socio-political relevance. The 'positivistic' uncertainty analysis models (like those that dominate the fields of climate change modelling and nuclear or chemical risk assessment) have little social relevance, as they do not influence negotiations between stakeholders. From the perspective of the science-policy interface, the current practices of uncertainty analysis are incomplete and incorrectly focused. We argue that although scientific knowledge produced and used in a context of political decision-making embodies traditional scientific characteristics, it also holds additional properties linked to its influence on social, political, and economic relations. Therefore, the significance of uncertainty cannot be assessed based on quality criteria that refer to the scientific content only; uncertainty must also include quality criteria specific to the properties and roles of this scientific knowledge within political, social, and economic contexts and processes. We propose a conceptual framework designed to account for such substantive, contextual, and procedural criteria of knowledge quality. At the same time, the proposed framework includes and synthesizes the various

  13. Preliminary Uncertainty Analysis for SMART Digital Core Protection and Monitoring System

    International Nuclear Information System (INIS)

    Koo, Bon Seung; In, Wang Kee; Hwang, Dae Hyun

    2012-01-01

    The Korea Atomic Energy Research Institute (KAERI) developed on-line digital core protection and monitoring systems, called SCOPS and SCOMS, as part of the SMART plant protection and monitoring system. SCOPS simplified the protection system by directly connecting the four RSPT signals to each core protection channel and eliminated the control element assembly calculator (CEAC) hardware. SCOMS adopted the DPCM3D method for synthesizing the core power distribution instead of the Fourier expansion method used in conventional PWRs. The DPCM3D method produces a synthetic 3-D power distribution by coupling a neutronics code and measured in-core detector signals. An overall uncertainty analysis methodology for statistically combining the uncertainty components of the SMART core protection and monitoring system was developed. In this paper, preliminary overall uncertainty factors for SCOPS/SCOMS of the SMART initial core were evaluated by applying the newly developed uncertainty analysis method.
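
    The abstract does not detail how the components are statistically combined; the following is a minimal sketch under the common assumption of independent components combined by root-sum-of-squares. The component names and values are hypothetical, not SMART design data:

```python
from math import sqrt

# Hypothetical component uncertainties (% of reading, 1 sigma).
components = {"detector signal": 1.2, "synthesis method": 2.5,
              "calibration": 0.8, "signal processing": 0.5}

# Root-sum-of-squares combination, valid for independent components.
rss = sqrt(sum(u**2 for u in components.values()))
print(f"combined uncertainty: {rss:.2f}% (1 sigma), "
      f"{1.645 * rss:.2f}% at the 95% one-sided level")
```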

  14. Uncertainties in criticality analysis which affect the storage and transportation of LWR fuel

    International Nuclear Information System (INIS)

    Napolitani, D.G.

    1989-01-01

    Satisfying the design criteria for subcriticality with uncertainties affects: the capacity of LWR storage arrays, maximum allowable enrichment, minimum allowable burnup and economics of various storage options. There are uncertainties due to: calculational method, data libraries, geometric limitations, modelling bias, the number and quality of benchmarks performed and mechanical uncertainties in the array. Yankee Atomic Electric Co. (YAEC) has developed and benchmarked methods to handle: high density storage rack designs, pin consolidation, low density moderation and burnup credit. The uncertainties associated with such criticality analysis are quantified on the basis of clean criticals, power reactor criticals and intercomparison of independent analysis methods

  15. Using finite mixture models in thermal-hydraulics system code uncertainty analysis

    Energy Technology Data Exchange (ETDEWEB)

    Carlos, S., E-mail: scarlos@iqn.upv.es [Department d’Enginyeria Química i Nuclear, Universitat Politècnica de València, Camí de Vera s.n, 46022 València (Spain); Sánchez, A. [Department d’Estadística Aplicada i Qualitat, Universitat Politècnica de València, Camí de Vera s.n, 46022 València (Spain); Ginestar, D. [Department de Matemàtica Aplicada, Universitat Politècnica de València, Camí de Vera s.n, 46022 València (Spain); Martorell, S. [Department d’Enginyeria Química i Nuclear, Universitat Politècnica de València, Camí de Vera s.n, 46022 València (Spain)

    2013-09-15

    Highlights: • Best estimate code simulation needs uncertainty quantification. • The output variables can present multimodal probability distributions. • The analysis of multimodal distributions is performed using finite mixture models. • Two methods to reconstruct the output variable probability distribution are used. -- Abstract: Nuclear power plant safety analysis is mainly based on the use of best estimate (BE) codes that predict the plant behavior under normal or accidental conditions. As BE codes introduce uncertainties due to uncertainty in input parameters and modeling, it is necessary to perform uncertainty assessment (UA), and eventually sensitivity analysis (SA), of the results obtained. These analyses are part of the appropriate treatment of uncertainties imposed by current regulation based on the adoption of the best estimate plus uncertainty (BEPU) approach. The most popular approach for uncertainty assessment, based on Wilks’ method, obtains a tolerance/confidence interval, but it does not completely characterize the output variable behavior, which is required for an extended UA and SA. However, standard UA and SA impose a high computational cost due to the large number of simulations needed. In order to obtain more information about the output variable and, at the same time, keep the computational cost as low as possible, there has been a recent shift toward developing metamodels (models of models), or surrogate models, that approximate or emulate complex computer codes. Different techniques exist to reconstruct the probability distribution using the information provided by a sample of values, for example, finite mixture models. In this paper, the Expectation Maximization and the k-means algorithms are used to obtain a finite mixture model that reconstructs the output variable probability distribution from data obtained with RELAP-5 simulations. Both methodologies have been applied to a separated
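
    A minimal sketch of fitting a finite mixture to a bimodal output sample, in the spirit of the abstract, using scikit-learn's EM-based GaussianMixture (which is initialized by k-means by default). The data are synthetic stand-ins, not RELAP-5 results:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(3)
# Synthetic bimodal output, e.g., a peak temperature whose mode depends
# on whether a flow-regime change occurs in the transient.
pct = np.concatenate([rng.normal(900, 25, 300), rng.normal(1050, 40, 120)])

# EM fit of a two-component Gaussian mixture to the sample.
gm = GaussianMixture(n_components=2, random_state=0).fit(pct.reshape(-1, 1))
for w, m, c in zip(gm.weights_, gm.means_.ravel(), gm.covariances_.ravel()):
    print(f"weight {w:.2f}: mean {m:.0f}, std {np.sqrt(c):.0f}")
```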

  16. Development Optimization and Uncertainty Analysis Methods for Oil and Gas Reservoirs

    Energy Technology Data Exchange (ETDEWEB)

    Ettehadtavakkol, Amin, E-mail: amin.ettehadtavakkol@ttu.edu [Texas Tech University (United States); Jablonowski, Christopher [Shell Exploration and Production Company (United States); Lake, Larry [University of Texas at Austin (United States)

    2017-04-15

    Uncertainty complicates the development optimization of oil and gas exploration and production projects, but methods have been devised to analyze uncertainty and its impact on optimal decision-making. This paper compares two methods for development optimization and uncertainty analysis: Monte Carlo (MC) simulation and stochastic programming. Two example problems for a gas field development and an oilfield development are solved and discussed to elaborate the advantages and disadvantages of each method. Development optimization involves decisions regarding the configuration of initial capital investment and subsequent operational decisions. Uncertainty analysis involves the quantification of the impact of uncertain parameters on the optimum design concept. The gas field development problem is designed to highlight the differences in the implementation of the two methods and to show that both methods yield the exact same optimum design. The results show that both MC optimization and stochastic programming provide unique benefits, and that the choice of method depends on the goal of the analysis. While the MC method generates more useful information, along with the optimum design configuration, the stochastic programming method is more computationally efficient in determining the optimal solution. Reservoirs comprise multiple compartments and layers with multiphase flow of oil, water, and gas. We present a workflow for development optimization under uncertainty for these reservoirs, and solve an example on the design optimization of a multicompartment, multilayer oilfield development.

  17. Development Optimization and Uncertainty Analysis Methods for Oil and Gas Reservoirs

    International Nuclear Information System (INIS)

    Ettehadtavakkol, Amin; Jablonowski, Christopher; Lake, Larry

    2017-01-01

    Uncertainty complicates the development optimization of oil and gas exploration and production projects, but methods have been devised to analyze uncertainty and its impact on optimal decision-making. This paper compares two methods for development optimization and uncertainty analysis: Monte Carlo (MC) simulation and stochastic programming. Two example problems for a gas field development and an oilfield development are solved and discussed to elaborate the advantages and disadvantages of each method. Development optimization involves decisions regarding the configuration of initial capital investment and subsequent operational decisions. Uncertainty analysis involves the quantification of the impact of uncertain parameters on the optimum design concept. The gas field development problem is designed to highlight the differences in the implementation of the two methods and to show that both methods yield the exact same optimum design. The results show that both MC optimization and stochastic programming provide unique benefits, and that the choice of method depends on the goal of the analysis. While the MC method generates more useful information, along with the optimum design configuration, the stochastic programming method is more computationally efficient in determining the optimal solution. Reservoirs comprise multiple compartments and layers with multiphase flow of oil, water, and gas. We present a workflow for development optimization under uncertainty for these reservoirs, and solve an example on the design optimization of a multicompartment, multilayer oilfield development.

  18. Model parameter uncertainty analysis for annual field-scale P loss model

    Science.gov (United States)

    Phosphorous (P) loss models are important tools for developing and evaluating conservation practices aimed at reducing P losses from agricultural fields. All P loss models, however, have an inherent amount of uncertainty associated with them. In this study, we conducted an uncertainty analysis with ...

  19. Uncertainty analysis of nonlinear systems employing the first-order reliability method

    International Nuclear Information System (INIS)

    Choi, Chan Kyu; Yoo, Hong Hee

    2012-01-01

    In most mechanical systems, the properties of the system elements have uncertainties for several reasons. For example, the mass, the stiffness coefficient of a spring, the damping coefficient of a damper, or friction coefficients have uncertain characteristics. The uncertain characteristics of the elements have a direct effect on the system performance uncertainty. Estimating the performance uncertainty is very important since it is directly related to manufacturing yield and consumer satisfaction; for this reason, the performance uncertainty should be estimated accurately and considered in the system design. In this paper, performance measures are defined for nonlinear vibration systems and the performance measure uncertainties are estimated employing the first-order reliability method (FORM). The FORM was found to provide good results in spite of the nonlinear characteristics of the systems. The accuracy of the uncertainty analysis results obtained by the FORM is validated by comparison with the results obtained by Monte Carlo simulation (MCS).
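
    As an illustration of the technique named above, the sketch below applies the standard HLRF iteration of the FORM to a toy nonlinear limit state with two independent normal variables; the limit state and all numbers are assumptions, not the systems studied in the paper.

```python
import numpy as np
from scipy.stats import norm

# Minimal FORM sketch with the HLRF iteration (assumed toy limit state:
# g = X1*X2 - 1500, X1 ~ N(40, 5), X2 ~ N(50, 2.5), independent).
mu = np.array([40.0, 50.0])
sd = np.array([5.0, 2.5])

def g(x):
    return x[0] * x[1] - 1500.0

def grad_g(x):
    return np.array([x[1], x[0]])

u = np.zeros(2)                       # start at the mean (standard space)
for _ in range(50):
    x = mu + sd * u                   # map back to physical space
    gx = g(x)
    grad_u = grad_g(x) * sd           # chain rule: dg/du = dg/dx * sd
    u_new = (grad_u @ u - gx) / (grad_u @ grad_u) * grad_u
    if np.linalg.norm(u_new - u) < 1e-8:
        u = u_new
        break
    u = u_new

beta = np.linalg.norm(u)              # distance to the most probable point
print(f"reliability index beta = {beta:.3f}, Pf ~ {norm.cdf(-beta):.2e}")
```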

  20. SENSIT: a cross-section and design sensitivity and uncertainty analysis code

    International Nuclear Information System (INIS)

    Gerstl, S.A.W.

    1980-01-01

    SENSIT computes the sensitivity and uncertainty of a calculated integral response (such as a dose rate) due to input cross sections and their uncertainties. Sensitivity profiles are computed for neutron and gamma-ray reaction cross sections of standard multigroup cross section sets and for secondary energy distributions (SEDs) of multigroup scattering matrices. In the design sensitivity mode, SENSIT computes changes in an integral response due to design changes and gives the appropriate sensitivity coefficients. Cross section uncertainty analyses are performed for three types of input data uncertainties: cross-section covariance matrices for pairs of multigroup reaction cross sections, spectral shape uncertainty parameters for secondary energy distributions (integral SED uncertainties), and covariance matrices for energy-dependent response functions. For all three types of data uncertainties SENSIT computes the resulting variance and estimated standard deviation in an integral response of interest, on the basis of generalized perturbation theory. SENSIT attempts to be more comprehensive than earlier sensitivity analysis codes, such as SWANLAKE.
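
    The "sandwich rule" underlying this type of cross-section uncertainty propagation, var(R) = SᵀCS, can be sketched in a few lines; the three-group sensitivities and covariances below are invented for illustration.

```python
import numpy as np

# Sandwich-rule sketch: variance of an integral response R from a
# sensitivity profile S and a cross-section covariance matrix C,
# var(R) = S^T C S. A 3-group toy example with invented numbers.
S = np.array([0.8, -0.3, 0.1])           # relative sensitivities (dR/R)/(dσ/σ)
rel_std = np.array([0.05, 0.10, 0.02])   # relative 1-sigma uncertainties
corr = np.array([[1.0, 0.5, 0.0],
                 [0.5, 1.0, 0.2],
                 [0.0, 0.2, 1.0]])       # assumed correlation between groups
C = np.outer(rel_std, rel_std) * corr    # relative covariance matrix

var_R = S @ C @ S
print(f"relative std. dev. of response: {np.sqrt(var_R):.4f}")
```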

  1. Uncertainties in Earthquake Loss Analysis: A Case Study From Southern California

    Science.gov (United States)

    Mahdyiar, M.; Guin, J.

    2005-12-01

    Probabilistic earthquake hazard and loss analyses play important roles in many areas of risk management, including earthquake related public policy and insurance ratemaking. Rigorous loss estimation for portfolios of properties is difficult since there are various types of uncertainties in all aspects of modeling and analysis. It is the objective of this study to investigate the sensitivity of earthquake loss estimation to uncertainties in regional seismicity, earthquake source parameters, ground motions, and sites' spatial correlation on typical property portfolios in Southern California. Southern California is an attractive region for such a study because it has a large population concentration exposed to significant levels of seismic hazard. During the last decade, there have been several comprehensive studies of most regional faults and seismogenic sources. There have also been detailed studies on regional ground motion attenuations and regional and local site responses to ground motions. This information has been used by engineering seismologists to conduct regional seismic hazard and risk analysis on a routine basis. However, one of the more difficult tasks in such studies is the proper incorporation of uncertainties in the analysis. From the hazard side, there are uncertainties in the magnitudes, rates and mechanisms of the seismic sources and local site conditions and ground motion site amplifications. From the vulnerability side, there are considerable uncertainties in estimating the state of damage of buildings under different earthquake ground motions. From an analytical side, there are challenges in capturing the spatial correlation of ground motions and building damage, and integrating thousands of loss distribution curves with different degrees of correlation. In this paper we propose to address some of these issues by conducting loss analyses of a typical small portfolio in southern California, taking into consideration various source and ground

  2. Uncertainty and sensitivity analysis of environmental transport models

    International Nuclear Information System (INIS)

    Margulies, T.S.; Lancaster, L.E.

    1985-01-01

    An uncertainty and sensitivity analysis has been made of the CRAC-2 (Calculations of Reactor Accident Consequences) atmospheric transport and deposition models. Robustness and uncertainty aspects of air- and ground-deposited material and the relative contributions of input and model parameters were systematically studied. The underlying data structures were investigated using a multiway layout of factors over specified ranges generated via a Latin hypercube sampling scheme. The variables selected in our analysis include: weather bin, dry deposition velocity, rain washout coefficient/rain intensity, duration of release, heat content, sigma-z (vertical) plume dispersion parameter, sigma-y (crosswind) plume dispersion parameter, and mixing height. To determine the contributors to the output variability (versus distance from the site), step-wise regression analyses were performed on transformations of the simulated spatial concentration patterns. 27 references, 2 figures, 3 tables

  3. Uncertainty and sensitivity analysis applied to coupled code calculations for a VVER plant transient

    International Nuclear Information System (INIS)

    Langenbuch, S.; Krzykacz-Hausmann, B.; Schmidt, K. D.

    2004-01-01

    The development of coupled codes, combining thermal-hydraulic system codes and 3D neutron kinetics, is an important step towards performing best-estimate plant transient calculations. It is generally agreed that the application of best-estimate methods should be supplemented by an uncertainty and sensitivity analysis to quantify the uncertainty of the results. The paper presents results from the application of the GRS uncertainty and sensitivity method to a VVER-440 plant transient, which was already studied earlier for the validation of coupled codes. For this application, the main steps of the uncertainty method are described. Typical results of the method applied to the analysis of the plant transient by several working groups using different coupled codes are presented and discussed. The results demonstrate the capability of an uncertainty and sensitivity analysis. (authors)

  4. Uncertainty analysis in the task of individual monitoring data

    International Nuclear Information System (INIS)

    Molokanov, A.; Badjin, V.; Gasteva, G.; Antipin, E.

    2003-01-01

    Assessment of internal doses is an essential component of individual monitoring programmes for workers and consists of two stages: individual monitoring measurements and interpretation of the monitoring data in terms of annual intake and/or annual internal dose. The overall uncertainty in the assessed dose is a combination of the uncertainties in these stages. An algorithm and a computer code were developed for estimating the uncertainty in the assessment of internal dose in the task of individual monitoring data interpretation. Two main influencing factors are analysed in this paper: the unknown time of the exposure and the variability of bioassay measurements. The aim of this analysis is to show that the algorithm is applicable in designing an individual monitoring programme for workers so as to guarantee that the individual dose calculated from individual monitoring measurements does not exceed a required limit with a certain confidence probability. (author)

  5. Hydrocoin level 3 - Testing methods for sensitivity/uncertainty analysis

    International Nuclear Information System (INIS)

    Grundfelt, B.; Lindbom, B.; Larsson, A.; Andersson, K.

    1991-01-01

    The HYDROCOIN study is an international cooperative project for testing groundwater hydrology modelling strategies for performance assessment of nuclear waste disposal. The study was initiated in 1984 by the Swedish Nuclear Power Inspectorate and the technical work was finalised in 1987. The participating organisations are regulatory authorities as well as implementing organisations in 10 countries. The study has been performed at three levels aimed at studying computer code verification, model validation and sensitivity/uncertainty analysis respectively. The results from the first two levels, code verification and model validation, have been published in reports in 1988 and 1990 respectively. This paper focuses on some aspects of the results from Level 3, sensitivity/uncertainty analysis, for which a final report is planned to be published during 1990. For Level 3, seven test cases were defined. Some of these aimed at exploring the uncertainty associated with the modelling results by simply varying parameter values and conceptual assumptions. In other test cases statistical sampling methods were applied. One of the test cases dealt with particle tracking and the uncertainty introduced by this type of post processing. The amount of results available is substantial although unevenly spread over the test cases. It has not been possible to cover all aspects of the results in this paper. Instead, the different methods applied will be illustrated by some typical analyses. 4 figs., 9 refs

  6. Information Theory for Correlation Analysis and Estimation of Uncertainty Reduction in Maps and Models

    Directory of Open Access Journals (Sweden)

    J. Florian Wellmann

    2013-04-01

    The quantification and analysis of uncertainties is important in all cases where maps and models of uncertain properties are the basis for further decisions. Once these uncertainties are identified, the logical next step is to determine how they can be reduced. Information theory provides a framework for the analysis of spatial uncertainties when different subregions are considered as random variables. In the work presented here, joint entropy, conditional entropy, and mutual information are applied for a detailed analysis of spatial uncertainty correlations. The aim is to determine (i) which areas in a spatial analysis share information, and (ii) where, and by how much, additional information would reduce uncertainties. As an illustration, a typical geological example is evaluated: the case of a subsurface layer with uncertain depth, shape and thickness. Mutual information and multivariate conditional entropies are determined based on multiple simulated model realisations. Even for this simple case, the measures not only provide a clear picture of uncertainties and their correlations but also give detailed insights into the potential reduction of uncertainties at each position, given additional information at a different location. The methods are directly applicable to other types of spatial uncertainty evaluations, especially where multiple realisations of a model simulation are analysed. In summary, the application of information theoretic measures opens up the path to a better understanding of spatial uncertainties, and their relationship to information and prior knowledge, for cases where uncertain property distributions are spatially analysed and visualised in maps and models.
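
    The entropy-based measures named above can be estimated from model realisations with a simple histogram approach. The sketch below computes joint entropy, conditional entropy, and mutual information for uncertain depths at two hypothetical locations; the synthetic data and bin count are assumptions.

```python
import numpy as np

# Sketch of the information measures used above, estimated from multiple
# model realisations at two locations A and B (synthetic data).
rng = np.random.default_rng(0)
depth_A = rng.normal(100.0, 5.0, 10000)
depth_B = depth_A + rng.normal(0.0, 3.0, 10000)   # correlated with A

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Discretise into bins and form the joint probability table.
joint, _, _ = np.histogram2d(depth_A, depth_B, bins=20)
p_ab = joint / joint.sum()
p_a, p_b = p_ab.sum(axis=1), p_ab.sum(axis=0)

H_A, H_B, H_AB = entropy(p_a), entropy(p_b), entropy(p_ab.ravel())
mutual_info = H_A + H_B - H_AB        # I(A;B) = H(A) + H(B) - H(A,B)
cond_H = H_AB - H_A                   # H(B|A): uncertainty left in B given A
print(f"H(B) = {H_B:.2f} bits, H(B|A) = {cond_H:.2f} bits, I = {mutual_info:.2f} bits")
```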

  7. Phenomenological uncertainty analysis of early containment failure at severe accident of nuclear power plant

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Su Won

    2011-02-15

    Severe accidents involve inherently large uncertainty because of the wide range of accident conditions, and experiments, validation and practical application are extremely difficult because of the high temperatures and pressures involved. Although domestic and international research has been carried out, the references used in Korean nuclear plants were foreign data from the 1980s, and safety analysis such as the probabilistic safety assessment has not applied the newest methodology. In addition, the containment pressure used to identify the probability of containment failure in Level 2 PSA is a point value taken from the results of thermal-hydraulic analysis. In this paper, an uncertainty analysis method for the phenomena of severe accidents influencing early containment failure was developed, an uncertainty analysis for Korean nuclear plants was performed using the MELCOR code, and the distribution of containment pressure is presented as a result of the uncertainty analysis. Early containment failure was selected among the various modes of containment failure because it is an important factor in the Large Early Release Frequency (LERF), which is used as a representative criterion for decision-making in nuclear power plants. Important phenomena of early containment failure in severe accidents were identified based on previous research, and a seven-step methodology to evaluate the uncertainty was developed. A MELCOR input for severe accident analysis reflecting natural circulation flow was developed, and station blackout, a representative initiating event for early containment failure, was chosen as the accident scenario. By reviewing the internal models and correlations of the MELCOR models relevant to the important phenomena of early containment failure, the factors which could affect the uncertainty were found, and the major factors were finally identified through a sensitivity analysis. In order to determine the total number of MELCOR calculations which can

  8. Uncertainties in Cancer Risk Coefficients for Environmental Exposure to Radionuclides. An Uncertainty Analysis for Risk Coefficients Reported in Federal Guidance Report No. 13

    Energy Technology Data Exchange (ETDEWEB)

    Pawel, David [U.S. Environmental Protection Agency; Leggett, Richard Wayne [ORNL; Eckerman, Keith F [ORNL; Nelson, Christopher [U.S. Environmental Protection Agency

    2007-01-01

    Federal Guidance Report No. 13 (FGR 13) provides risk coefficients for estimation of the risk of cancer due to low-level exposure to each of more than 800 radionuclides. Uncertainties in risk coefficients were quantified in FGR 13 for 33 cases (exposure to each of 11 radionuclides by each of three exposure pathways) on the basis of sensitivity analyses in which various combinations of plausible biokinetic, dosimetric, and radiation risk models were used to generate alternative risk coefficients. The present report updates the uncertainty analysis in FGR 13 for the cases of inhalation and ingestion of radionuclides and expands the analysis to all radionuclides addressed in that report. The analysis indicates that most risk coefficients for inhalation or ingestion of radionuclides are determined within a factor of 5 or less by current information. That is, application of alternate plausible biokinetic and dosimetric models and radiation risk models (based on the linear, no-threshold hypothesis with an adjustment for the dose and dose rate effectiveness factor) is unlikely to change these coefficients by more than a factor of 5. In this analysis the assessed uncertainty in the radiation risk model was found to be the main determinant of the uncertainty category for most risk coefficients, but conclusions concerning the relative contributions of risk and dose models to the total uncertainty in a risk coefficient may depend strongly on the method of assessing uncertainties in the risk model.

  9. Uncertainty and sensitivity analysis in the scenario simulation with RELAP/SCDAP and MELCOR codes

    International Nuclear Information System (INIS)

    Garcia J, T.; Cardenas V, J.

    2015-09-01

    A methodology was implemented for the uncertainty analysis of scenario simulations with the RELAP/SCDAP V-3.4 bi-7 and MELCOR V-2.1 codes, which are used to perform safety analyses at the Comision Nacional de Seguridad Nuclear y Salvaguardias (CNSNS). The uncertainty analysis methodology chosen is a probabilistic method of the type that propagates the uncertainty of the input parameters to the output parameters. Therefore, it began with the selection of the input parameters considered uncertain and regarded as highly important in the scenario because of their direct effect on the output variable of interest. These parameters were randomly sampled according to intervals of variation or probability distribution functions assigned by expert judgment to generate a set of input files that were run through the simulation code to propagate the uncertainty to the output parameters. Then, through the use of order statistics and the Wilks formula, it was determined that the minimum number of executions required to obtain uncertainty bands that include 95% of the population at a confidence level of 95% in the results is 93; it is important to mention that in this method the number of executions does not depend on the number of selected input parameters. Routines in Fortran 90 that allowed automating the uncertainty analysis process for transients with the RELAP/SCDAP code were implemented. In the case of the MELCOR code for severe accident analysis, automation was carried out through the Dakota Uncertainty plug-in incorporated into the SNAP platform. To test the practical application of this methodology, two analyses were performed: the first with the simulation of a closure transient of the main steam isolation valves using the RELAP/SCDAP code, obtaining the uncertainty band of the vessel dome pressure; while in the second analysis, the simulation of a station blackout (SBO) accident was carried out with the MELCOR code, obtaining the uncertainty band for the
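
    The Wilks-formula step quoted above (93 runs for two-sided 95%/95% bands) can be reproduced directly; the sketch below searches for the smallest sample size satisfying the first-order two-sided tolerance condition.

```python
# Wilks' formula sketch: smallest number of code runs n such that the
# (min, max) of the outputs bounds 95% of the population with 95%
# confidence (two-sided, first-order tolerance interval).
def wilks_two_sided(n, gamma=0.95):
    # P(sample min and max enclose at least a fraction gamma)
    return 1.0 - gamma**n - n * (1.0 - gamma) * gamma**(n - 1)

n = 1
while wilks_two_sided(n) < 0.95:
    n += 1
print(n)   # -> 93, matching the number of runs quoted above
```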

  10. Code development of total sensitivity and uncertainty analysis for reactor physics calculations

    International Nuclear Information System (INIS)

    Wan, C.; Cao, L.; Wu, H.; Zu, T.; Shen, W.

    2015-01-01

    Sensitivity and uncertainty analysis are essential parts for reactor system to perform risk and policy analysis. In this study, total sensitivity and corresponding uncertainty analysis for responses of neutronics calculations have been accomplished and developed the S&U analysis code named UNICORN. The UNICORN code can consider the implicit effects of multigroup cross sections on the responses. The UNICORN code has been applied to typical pin-cell case in this paper, and can be proved correct by comparison the results with those of the TSUNAMI-1D code. (author)

  11. Code development of total sensitivity and uncertainty analysis for reactor physics calculations

    Energy Technology Data Exchange (ETDEWEB)

    Wan, C.; Cao, L.; Wu, H.; Zu, T., E-mail: chenghuiwan@stu.xjtu.edu.cn, E-mail: caolz@mail.xjtu.edu.cn, E-mail: hongchun@mail.xjtu.edu.cn, E-mail: tiejun@mail.xjtu.edu.cn [Xi' an Jiaotong Univ., School of Nuclear Science and Technology, Xi' an (China); Shen, W., E-mail: Wei.Shen@cnsc-ccsn.gc.ca [Xi' an Jiaotong Univ., School of Nuclear Science and Technology, Xi' an (China); Canadian Nuclear Safety Commission, Ottawa, ON (Canada)

    2015-07-01

    Sensitivity and uncertainty analysis are essential parts for reactor system to perform risk and policy analysis. In this study, total sensitivity and corresponding uncertainty analysis for responses of neutronics calculations have been accomplished and developed the S&U analysis code named UNICORN. The UNICORN code can consider the implicit effects of multigroup cross sections on the responses. The UNICORN code has been applied to typical pin-cell case in this paper, and can be proved correct by comparison the results with those of the TSUNAMI-1D code. (author)

  12. Uncertainty quantification and sensitivity analysis with CASL Core Simulator VERA-CS

    International Nuclear Information System (INIS)

    Brown, C.S.; Zhang, Hongbin

    2016-01-01

    VERA-CS (Virtual Environment for Reactor Applications, Core Simulator) is a coupled neutron transport and thermal-hydraulics code under development by the Consortium for Advanced Simulation of Light Water Reactors (CASL). An approach to uncertainty quantification and sensitivity analysis with VERA-CS was developed and a new toolkit was created to perform uncertainty quantification and sensitivity analysis. A 2 × 2 fuel assembly model was developed and simulated by VERA-CS, and uncertainty quantification and sensitivity analysis were performed with fourteen uncertain input parameters. The minimum departure from nucleate boiling ratio (MDNBR), maximum fuel center-line temperature, and maximum outer clad surface temperature were chosen as the selected figures of merit. Pearson, Spearman, and partial correlation coefficients were considered for all of the figures of merit in sensitivity analysis and coolant inlet temperature was consistently the most influential parameter. Parameters used as inputs to the critical heat flux calculation with the W-3 correlation were shown to be the most influential on the MDNBR, maximum fuel center-line temperature, and maximum outer clad surface temperature.
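
    A correlation-based sensitivity ranking of this kind can be sketched as follows; the input parameters, the toy MDNBR response, and all numbers are invented stand-ins, not the VERA-CS model.

```python
import numpy as np
from scipy import stats

# Sketch of a correlation-based sensitivity ranking on synthetic samples
# (the parameter names and the toy response below are assumptions).
rng = np.random.default_rng(1)
n = 500
inlet_T = rng.normal(565.0, 2.0, n)      # coolant inlet temperature, K
power = rng.normal(1.0, 0.02, n)         # relative rod power
flow = rng.normal(1.0, 0.03, n)          # relative coolant flow

# Toy figure of merit: MDNBR falls with inlet T and power, rises with flow.
mdnbr = 2.0 - 0.02 * (inlet_T - 565.0) - 0.8 * (power - 1.0) \
        + 0.5 * (flow - 1.0) + rng.normal(0.0, 0.01, n)

for name, x in [("inlet_T", inlet_T), ("power", power), ("flow", flow)]:
    pearson = stats.pearsonr(x, mdnbr)[0]
    spearman = stats.spearmanr(x, mdnbr)[0]
    print(f"{name:8s} Pearson {pearson:+.2f}  Spearman {spearman:+.2f}")
```

    With these invented coefficients the inlet temperature dominates the ranking, which mirrors the qualitative finding reported in the abstract.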

  13. Demonstration uncertainty/sensitivity analysis using the health and economic consequence model CRAC2

    International Nuclear Information System (INIS)

    Alpert, D.J.; Iman, R.L.; Johnson, J.D.; Helton, J.C.

    1985-01-01

    This paper summarizes a demonstration uncertainty/sensitivity analysis performed on the reactor accident consequence model CRAC2. The study was performed with uncertainty/sensitivity analysis techniques compiled as part of the MELCOR program. The principal objectives of the study were: 1) to demonstrate the use of the uncertainty/sensitivity analysis techniques on a health and economic consequence model, 2) to test the computer models which implement the techniques, 3) to identify possible difficulties in performing such an analysis, and 4) to explore alternative means of analyzing, displaying, and describing the results. Demonstration of the applicability of the techniques was the motivation for performing this study; thus, the results should not be taken as a definitive uncertainty analysis of health and economic consequences. Nevertheless, significant insights on health and economic consequence analysis can be drawn from the results of this type of study. Latin hypercube sampling (LHS), a modified Monte Carlo technique, was used in this study. LHS generates a multivariate input structure in which all the variables of interest are varied simultaneously and desired correlations between variables are preserved. LHS has been shown to produce estimates of output distribution functions that are comparable with results of larger random samples
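
    A minimal implementation of the LHS idea described above, one stratified sample per equal-probability bin with random pairing across variables, might look like this; the variable ranges are invented for illustration.

```python
import numpy as np

# Minimal LHS sketch: one stratified sample per equal-probability bin
# for each variable, with independent random pairing across variables.
def latin_hypercube(n_samples, n_vars, rng):
    u = (rng.random((n_samples, n_vars))
         + np.arange(n_samples)[:, None]) / n_samples
    for j in range(n_vars):
        u[:, j] = rng.permutation(u[:, j])  # random pairing between variables
    return u                                # stratified uniform [0,1) marginals

rng = np.random.default_rng(7)
u = latin_hypercube(10, 2, rng)
# Map to physical ranges, e.g. a deposition velocity and a mixing height
dep_velocity = 0.001 + u[:, 0] * (0.03 - 0.001)   # m/s
mixing_height = 10 ** (2.0 + u[:, 1] * 1.5)       # ~100 m to ~3000 m, log scale
print(np.sort(dep_velocity))  # exactly one sample falls in each of 10 strata
```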

  14. Development of Uncertainty Analysis Method for SMART Digital Core Protection and Monitoring System

    International Nuclear Information System (INIS)

    Koo, Bon Seung; In, Wang Kee; Hwang, Dae Hyun

    2012-01-01

    The Korea Atomic Energy Research Institute has developed a system-integrated modular advanced reactor (SMART) for seawater desalination and electricity generation. Online digital core protection and monitoring systems, called SCOPS and SCOMS respectively, were developed. SCOPS calculates the minimum DNBR and maximum LPD based on several online-measured system parameters. SCOMS calculates the variables of the limiting conditions for operation. KAERI developed an overall uncertainty analysis methodology that statistically combines the uncertainty components of the SMART core protection and monitoring systems. By applying the overall uncertainty factors in the on-line SCOPS/SCOMS calculation, the calculated LPD and DNBR are conservative with a 95/95 probability/confidence level. In this paper, the uncertainty analysis method is described for the SMART core protection and monitoring systems.

  15. Uncertainty Evaluation of the SFR Subchannel Thermal-Hydraulic Modeling Using a Hot Channel Factors Analysis

    International Nuclear Information System (INIS)

    Choi, Sun Rock; Cho, Chung Ho; Kim, Sang Ji

    2011-01-01

    In SFR core analysis, a hot channel factors (HCF) method is most commonly used to evaluate uncertainty. It was employed in early designs such as the CRBRP and IFR. Alternatively, the improved thermal design procedure (ITDP) calculates the overall uncertainty based on the root-sum-square technique and sensitivity analyses of each design parameter. The Monte Carlo method (MCM) is also employed to estimate uncertainties. In this method, all the input uncertainties are randomly sampled according to their probability density functions and the resulting distribution for the output quantity is analyzed. Since an uncertainty analysis is basically calculated from the temperature distribution in a subassembly, the core thermal-hydraulic modeling greatly affects the resulting uncertainty. At KAERI, the SLTHEN and MATRA-LMR codes have been utilized to analyze SFR core thermal-hydraulics. The SLTHEN (steady-state LMR core thermal hydraulics analysis code based on the ENERGY model) code is a modified version of the SUPERENERGY2 code, which conducts a multi-assembly, steady-state calculation based on a simplified ENERGY model. The detailed subchannel analysis code MATRA-LMR (Multichannel Analyzer for Steady-State and Transients in Rod Arrays for Liquid Metal Reactors), an LMR version of MATRA, was also developed specifically for SFR core thermal-hydraulic analysis. This paper describes comparative studies of the core thermal-hydraulic models. A subchannel analysis and a hot-channel-factor-based uncertainty evaluation system are established to estimate the core thermofluidic uncertainties using the MATRA-LMR code, and the results are compared to those of the SLTHEN code.
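
    The contrast between cumulative (multiplicative) and statistical (root-sum-square) combination of hot channel factors can be sketched as below; the subfactor values and the nominal temperature rise are assumptions, not data from the paper.

```python
import numpy as np

# Hot-channel-factor sketch: combining uncertainty subfactors either
# multiplicatively (cumulative, conservative) or by root-sum-square
# (statistical), in the spirit of the ITDP-style treatment above.
direct = [1.02]                        # e.g. a systematic measurement bias
statistical = [1.05, 1.03, 1.04]       # e.g. fabrication, flow split, properties

f_cumulative = np.prod(direct) * np.prod(statistical)
f_rss = np.prod(direct) * (1.0 + np.sqrt(sum((f - 1.0) ** 2 for f in statistical)))

dT_nominal = 150.0                     # nominal coolant temperature rise, K
print(f"cumulative: {f_cumulative:.4f} -> dT = {dT_nominal * f_cumulative:.1f} K")
print(f"RSS:        {f_rss:.4f} -> dT = {dT_nominal * f_rss:.1f} K")
```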

  16. Quantifying and managing uncertainty in operational modal analysis

    Science.gov (United States)

    Au, Siu-Kui; Brownjohn, James M. W.; Mottershead, John E.

    2018-03-01

    Operational modal analysis aims at identifying the modal properties (natural frequency, damping, etc.) of a structure using only the (output) vibration response measured under ambient conditions. Highly economical and feasible, it is becoming a common practice in full-scale vibration testing. In the absence of (input) loading information, however, the modal properties have significantly higher uncertainty than their counterparts identified from free or forced vibration (known input) tests. Mastering the relationship between identification uncertainty and test configuration is of great interest to both scientists and engineers, e.g., for achievable precision limits and test planning/budgeting. Addressing this challenge beyond the current state of the art, which is mostly concerned with identification algorithms, this work obtains closed-form analytical expressions for the identification uncertainty (variance) of modal parameters that fundamentally explain the effect of test configuration. Collectively referred to as 'uncertainty laws', these expressions are asymptotically correct for well-separated modes, small damping and long data; and are applicable under non-asymptotic situations. They provide a scientific basis for planning and standardization of ambient vibration tests, where factors such as channel noise, sensor number and location can be quantitatively accounted for. The work is reported comprehensively with verification through synthetic and experimental data (laboratory and field), scientific implications and practical guidelines for planning ambient vibration tests.

  17. Uncertainty analysis for the BEACON-COLSS core monitoring system application

    International Nuclear Information System (INIS)

    Morita, T.; Boyd, W.A.; Seong, K.B.

    2005-01-01

    This paper covers the measurement uncertainty analysis of the BEACON-COLSS core monitoring system. The uncertainty evaluation is made by using a BEACON-COLSS simulation program. By simulating BEACON on-line operation for analytically generated reactor conditions, the accuracy of the 'measured' results can be evaluated by comparison to the analytically generated 'truth'. The DNB power margin is evaluated based on Combustion Engineering's Modified Statistical Combination of Uncertainties (MSCU) using the CETOPD code for the DNBR calculation. A BEACON-COLSS simulation program for the uncertainty evaluation function has been established for plant applications. Qualification work has been completed for two Combustion Engineering plants. Results of the BEACON-COLSS measured peaking factors and DNBR power margin are plant-type dependent and are applicable to reload cores as long as the core geometry and detector layout are unchanged. (authors)

  18. Uncertainty analysis of power monitoring transit time ultrasonic flow meters

    International Nuclear Information System (INIS)

    Orosz, A.; Miller, D. W.; Christensen, R. N.; Arndt, S.

    2006-01-01

    A general uncertainty analysis is applied to chordal, transit time ultrasonic flow meters that are used in nuclear power plant feedwater loops. This investigation focuses on relationships between the major parameters of the flow measurement. For this study, mass flow rate is divided into three components, profile factor, density, and a form of volumetric flow rate. All system parameters are used to calculate values for these three components. Uncertainty is analyzed using a perturbation method. Sensitivity coefficients for major system parameters are shown, and these coefficients are applicable to a range of ultrasonic flow meters used in similar applications. Also shown is the uncertainty to be expected for density along with its relationship to other system uncertainties. One other conclusion is that pipe diameter sensitivity coefficients may be a function of the calibration technique used. (authors)
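
    For a purely multiplicative decomposition such as mass flow = profile factor × density × volumetric flow rate, each relative sensitivity coefficient is close to unity, which the perturbation sketch below demonstrates; the numerical values are invented, not taken from the paper.

```python
# Perturbation sketch for a multiplicative decomposition: mass flow as
# the product of profile factor, density, and volumetric flow rate.
def mass_flow(pf, rho, q):
    return pf * rho * q

base = dict(pf=0.95, rho=740.0, q=1.2)   # invented feedwater-like values
m0 = mass_flow(**base)

# Relative (logarithmic) sensitivity (dm/m)/(dx/x) via 1% perturbations.
for name in base:
    bumped = dict(base)
    bumped[name] *= 1.01
    s = (mass_flow(**bumped) - m0) / m0 / 0.01
    print(f"sensitivity to {name}: {s:.3f}")   # 1.0 for each multiplicative term

# With independent inputs, relative variances then add in quadrature:
# (u_m/m)^2 = s_pf^2 (u_pf/pf)^2 + s_rho^2 (u_rho/rho)^2 + s_q^2 (u_q/q)^2
```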

  19. Uncertainty analysis technique of dynamic response and cumulative damage properties of piping system

    International Nuclear Information System (INIS)

    Suzuki, Kohei; Aoki, Shigeru; Hara, Fumio; Hanaoka, Masaaki; Yamashita, Tadashi.

    1982-01-01

    It is a technologically important subject to establish methods of uncertainty analysis that statistically examine the variation of the earthquake response and damage properties of equipment and piping systems due to changes in the input load and in the parameters of the structural system, for evaluating the aseismic capability and dynamic structural reliability of these systems. The uncertainty in the response and damage properties when equipment and piping systems are subjected to excessive vibration loads depends mainly on the irregularity of the acting input load, such as the unsteady vibration of earthquakes, and on structural uncertainty in forms and dimensions. This study is a basic one to establish a method for evaluating the uncertainty in the cumulative damage property at resonant vibration of a piping system due to the dispersion of structural parameters, using a simple model. First, piping models with simple forms were broken by resonant vibration, and the uncertainty in the cumulative damage property was evaluated. Next, a response analysis using an elasto-plastic mechanics model was performed by numerical simulation. Finally, a method of uncertainty analysis for response and damage properties based on the perturbation method utilizing equivalent linearization was proposed, and its propriety was proved. (Kako, I.)

  20. Uncertainty in soil-structure interaction analysis arising from differences in analytical techniques

    International Nuclear Information System (INIS)

    Maslenikov, O.R.; Chen, J.C.; Johnson, J.J.

    1982-07-01

    This study addresses uncertainties arising from variations in different modeling approaches to soil-structure interaction of massive structures at a nuclear power plant. To perform a comprehensive systems analysis, it is necessary to quantify, for each phase of the traditional analysis procedure, both the realistic seismic response and the uncertainties associated with them. In this study two linear soil-structure interaction techniques were used to analyze the Zion, Illinois nuclear power plant: a direct method using the FLUSH computer program and a substructure approach using the CLASSI family of computer programs. In-structure response from two earthquakes, one real and one synthetic, was compared. Structure configurations from relatively simple to complicated multi-structure cases were analyzed. The resulting variations help quantify uncertainty in structure response due to analysis procedures

  1. Simulating fuel behavior under transient conditions using FRAPTRAN and uncertainty analysis using Dakota

    International Nuclear Information System (INIS)

    Gomes, Daniel S.; Teixeira, Antonio S.

    2017-01-01

    Although regulatory agencies have shown special interest in incorporating best-estimate approaches into the fuel licensing process, fuel codes are currently licensed based only on deterministic limits such as those in 10 CFR 50, and therefore may yield unrealistic safety margins. The concept of uncertainty analysis is employed to manage this risk more realistically. In this study, uncertainties were classified into two categories: probabilistic and epistemic (owing to a lack of pre-existing knowledge in this area). Fuel rods have three sources of uncertainty: manufacturing tolerances, boundary conditions, and physical models. The first step in successfully analyzing the uncertainties involves performing a statistical analysis on the input parameters used throughout the fuel code. The response obtained from this analysis must show proportional index correlations because the uncertainties are globally propagated. The Dakota toolkit was used to analyze the FRAPTRAN transient fuel code. The subsequent sensitivity analyses helped in identifying the key parameters with the highest correlation indices, including the peak cladding temperature and the time required for cladding failure. The uncertainty analysis was performed using an IFA-650-5 fuel rod, in line with the tests performed in the Halden Project in Norway. The main objectives of the Halden project included studying the ballooning and rupture processes. The results of this experiment demonstrate the accuracy and applicability of the physical models in evaluating the thermal conductivity, mechanical model, and fuel swelling formulations. (author)

  2. Simulating fuel behavior under transient conditions using FRAPTRAN and uncertainty analysis using Dakota

    Energy Technology Data Exchange (ETDEWEB)

    Gomes, Daniel S.; Teixeira, Antonio S., E-mail: dsgomes@ipen.br, E-mail: teixeira@ipen [Instituto de Pesquisas Energéticas e Nucleares (IPEN/CNEN-SP), São Paulo, SP (Brazil)

    2017-07-01

    Although regulatory agencies have shown special interest in incorporating best-estimate approaches into the fuel licensing process, fuel codes are currently licensed based only on deterministic limits such as those in 10 CFR 50, and therefore may yield unrealistic safety margins. The concept of uncertainty analysis is employed to manage this risk more realistically. In this study, uncertainties were classified into two categories: probabilistic and epistemic (owing to a lack of pre-existing knowledge in this area). Fuel rods have three sources of uncertainty: manufacturing tolerances, boundary conditions, and physical models. The first step in successfully analyzing the uncertainties involves performing a statistical analysis on the input parameters used throughout the fuel code. The response obtained from this analysis must show proportional index correlations because the uncertainties are globally propagated. The Dakota toolkit was used to analyze the FRAPTRAN transient fuel code. The subsequent sensitivity analyses helped in identifying the key parameters with the highest correlation indices, including the peak cladding temperature and the time required for cladding failure. The uncertainty analysis was performed using an IFA-650-5 fuel rod, in line with the tests performed in the Halden Project in Norway. The main objectives of the Halden project included studying the ballooning and rupture processes. The results of this experiment demonstrate the accuracy and applicability of the physical models in evaluating the thermal conductivity, mechanical model, and fuel swelling formulations. (author)

  3. Coupled code analysis of uncertainty and sensitivity of Kalinin-3 benchmark

    Energy Technology Data Exchange (ETDEWEB)

    Pasichnyk, Ihor; Zwermann, Winfried; Velkov, Kiril [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) gGmbH, Garching (Germany); Nikonov, Sergey [VNIIAES, Moscow (Russian Federation)

    2016-09-15

    An uncertainty and sensitivity analysis is performed for the OECD/NEA coolant transient benchmark (K-3) on measured data at the Kalinin-3 Nuclear Power Plant (NPP). A switch-off of one main coolant pump (MCP) at nominal reactor power is calculated using the coupled thermohydraulic and neutron-kinetic ATHLET-PARCS code. The objectives are to study the uncertainty of the total reactor power and to identify the main sources of reactor power uncertainty. The GRS uncertainty and sensitivity software package XSUSA is applied to propagate uncertainties in nuclear data libraries to the full-core coupled transient calculations. A set of the most important thermal-hydraulic parameters of the primary circuit is identified, and a total of 23 thermohydraulic parameters are statistically varied using the GRS code SUSA. The ATHLET model also contains a balance-of-plant (BOP) model which is simulated using the ATHLET GCSM module. In particular, the operation of the main steam generator regulators is modelled in detail. A set of 200 varied coupled ATHLET-PARCS calculations is analyzed. The results obtained show a clustering effect in the behavior of global reactor parameters. It is found that the GCSM system together with the varied input parameters strongly influences the overall nuclear power plant behavior and can even lead to a new scenario. Possible reasons for the clustering effect are discussed in the paper. This work is a step forward in establishing a 'best-estimate calculations in combination with performing uncertainty analysis' methodology for coupled full-core calculations.

  4. Effect of activation cross section uncertainties in transmutation analysis of realistic low-activation steels for IFMIF

    Energy Technology Data Exchange (ETDEWEB)

    Cabellos, O.; Garcya-Herranz, N.; Sanz, J. [Institute of Nuclear Fusion, UPM, Madrid (Spain); Cabellos, O.; Garcya-Herranz, N.; Fernandez, P.; Fernandez, B. [Dept. of Nuclear Engineering, UPM, Madrid (Spain); Sanz, J. [Dept. of Power Engineering, UNED, Madrid (Spain); Reyes, S. [Safety, Environment and Health Group, ITER Joint Work Site, Cadarache Center (France)

    2008-07-01

    We address an uncertainty analysis to draw conclusions on the reliability of the activation calculation for the International Fusion Materials Irradiation Facility (IFMIF) under the potential impact of activation cross section uncertainties. The Monte Carlo methodology implemented in the ACAB code gives uncertainty estimates due to the synergetic/global effect of the complete set of cross section uncertainties. An element-by-element analysis has been demonstrated to be a helpful tool to easily analyse the transmutation performance of irradiated materials. The uncertainty analysis results showed that for times over about 24 h the relative error in the contact dose rate can be as large as 23 per cent. We have calculated the effect of cross section uncertainties on the IFMIF activation of all the different elements. For EUROFER, the uncertainties for the H and He elements are 7.3% and 5.6%, respectively. We have found significant uncertainties in the transmutation response for C, P and Nb.

  5. Benchmarking and application of the state-of-the-art uncertainty analysis methods XSUSA and SHARK-X

    International Nuclear Information System (INIS)

    Aures, A.; Bostelmann, F.; Hursin, M.; Leray, O.

    2017-01-01

    Highlights: • Application of the uncertainty analysis methods XSUSA and SHARK-X. • Propagation of nuclear data uncertainty through PWR pin cell depletion calculation. • Uncertainty quantification of eigenvalue, nuclide densities and Doppler coefficient. • Top contributors to overall output uncertainty identified by sensitivity analysis. • Comparison with SAMPLER and TSUNAMI of the SCALE code package. - Abstract: This study presents collaborative work performed between GRS and PSI on benchmarking and application of the state-of-the-art uncertainty analysis methods XSUSA and SHARK-X. Applied to a PWR pin cell depletion calculation, both methods propagate input uncertainty from nuclear data to output uncertainty. The uncertainties of the multiplication factor, nuclide densities, and fuel temperature coefficients derived by both methods are compared at various burnup steps. Comparisons of these quantities are furthermore performed with the SAMPLER module of SCALE 6.2. The perturbation theory based TSUNAMI module of both SCALE 6.1 and SCALE 6.2 is additionally applied for comparisons of the reactivity coefficient.

  6. A GLUE uncertainty analysis of a drying model of pharmaceutical granules

    DEFF Research Database (Denmark)

    Mortier, Séverine Thérèse F.C.; Van Hoey, Stijn; Cierkens, Katrijn

    2013-01-01

    unit, which is part of the full continuous from-powder-to-tablet manufacturing line (Consigma™, GEA Pharma Systems). A validated model describing the drying behaviour of a single pharmaceutical granule in two consecutive phases is used. First of all, the effect of the assumptions at the particle level...... on the prediction uncertainty is assessed. Secondly, the paper focuses on the influence of the most sensitive parameters in the model. Finally, a combined analysis (particle level plus most sensitive parameters) is performed and discussed. To propagate the uncertainty originating from the parameter uncertainty...

  7. Uncertainty modelling and analysis of environmental systems: a river sediment yield example

    NARCIS (Netherlands)

    Keesman, K.J.; Koskela, J.; Guillaume, J.H.; Norton, J.P.; Croke, B.; Jakeman, A.

    2011-01-01

    Throughout the last decades uncertainty analysis has become an essential part of environmental model building (e.g. Beck 1987; Refsgaard et al., 2007). The objective of the paper is to introduce stochastic and set-membership uncertainty modelling concepts, which basically differ in the

  8. Integrated Risk-Capability Analysis under Deep Uncertainty : An ESDMA Approach

    NARCIS (Netherlands)

    Pruyt, E.; Kwakkel, J.H.

    2012-01-01

    Integrated risk-capability analysis methodologies for dealing with increasing degrees of complexity and deep uncertainty are urgently needed in an ever more complex and uncertain world. Although scenario approaches, risk assessment methods, and capability analysis methods are used, few organizations

  9. Total sensitivity and uncertainty analysis for LWR pin-cells with improved UNICORN code

    International Nuclear Information System (INIS)

    Wan, Chenghui; Cao, Liangzhi; Wu, Hongchun; Shen, Wei

    2017-01-01

    Highlights: • A new model is established for the total sensitivity and uncertainty analysis. • The NR approximation applied in S&U analysis can be avoided by the new model. • Sensitivity and uncertainty analysis is performed for PWR pin-cells by the new model. • The effects of the NR approximation for the PWR pin-cells are quantified. - Abstract: In this paper, improvements to the multigroup cross-section perturbation model have been proposed and applied in the self-developed UNICORN code, which is capable of performing the total sensitivity and total uncertainty analysis for neutron-physics calculations by applying the direct numerical perturbation method and the statistical sampling method respectively. The narrow resonance (NR) approximation was applied in the multigroup cross-section perturbation model implemented in UNICORN. As an improvement to the NR approximation to refine the multigroup cross-section perturbation model, an ultrafine-group cross-section perturbation model has been established, in which the actual perturbations are applied to the ultrafine-group cross-section library and the reconstructions of the resonance cross sections are performed by solving the neutron slowing-down equation. The total sensitivity and total uncertainty analyses were then applied to the LWR pin-cells, using both the multigroup and the ultrafine-group cross-section perturbation models. The numerical results show that the NR approximation overestimates the relative sensitivity coefficients and the corresponding uncertainty results for the LWR pin-cells, and that the effects of the NR approximation are significant for σ(n,γ) and σ(n,elas) of 238U. Therefore, the effects of the NR approximation applied in the total sensitivity and total uncertainty analysis for the neutron-physics calculations of LWRs should be taken into account.

  10. Uncertainty analysis in comparative NAA applied to geological and biological matrices

    International Nuclear Information System (INIS)

    Zahn, Guilherme S.; Ticianelli, Regina B.; Lange, Camila N.; Favaro, Deborah I.T.; Figueiredo, Ana M.G.

    2015-01-01

    Comparative nuclear activation analysis is a multielemental primary analytical technique that may be used on a rather broad spectrum of matrices with minimal or no sample preprocessing. Although the total activation of a chemical element in a sample depends on a rather large set of parameters, when the sample is irradiated together with a well-known comparator, most of these parameters cancel out, and the concentration of that element can be determined simply by using the activities and masses of the comparator and the sample, the concentration of this chemical element in the comparator, the half-life of the formed radionuclide, and the time between counting the sample and the comparator. This simplification greatly reduces not only the calculations required, but also the uncertainty associated with the measurement; nevertheless, a cautious analysis must be carried out in order to make sure all relevant uncertainties are properly treated, so that the final result can be as representative of the measurement as possible. In this work, this analysis was performed for geological matrices, where the concentrations of the nuclides of interest are rather high, but so are the density and average atomic number of the sample, as well as for a biological matrix, in order to allow for a comparison. The results show that the largest part of the uncertainty comes from the activity measurements and from the concentration of the comparator, and that while the influence of time-related terms on the final uncertainty can be safely neglected, the uncertainty in the masses may be relevant under specific circumstances. (author)

  11. Uncertainty analysis in comparative NAA applied to geological and biological matrices

    Energy Technology Data Exchange (ETDEWEB)

    Zahn, Guilherme S.; Ticianelli, Regina B.; Lange, Camila N.; Favaro, Deborah I.T.; Figueiredo, Ana M.G., E-mail: gzahn@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2015-07-01

    Comparative nuclear activation analysis is a multielemental primary analytical technique that may be used on a rather broad spectrum of matrices with minimal or no sample preprocessing. Although the total activation of a chemical element in a sample depends on a rather large set of parameters, when the sample is irradiated together with a well-known comparator, most of these parameters cancel out, and the concentration of that element can be determined simply by using the activities and masses of the comparator and the sample, the concentration of this chemical element in the comparator, the half-life of the formed radionuclide, and the time between counting the sample and the comparator. This simplification greatly reduces not only the calculations required, but also the uncertainty associated with the measurement; nevertheless, a cautious analysis must be carried out in order to make sure all relevant uncertainties are properly treated, so that the final result can be as representative of the measurement as possible. In this work, this analysis was performed for geological matrices, where the concentrations of the nuclides of interest are rather high, but so are the density and average atomic number of the sample, as well as for a biological matrix, in order to allow for a comparison. The results show that the largest part of the uncertainty comes from the activity measurements and from the concentration of the comparator, and that while the influence of time-related terms on the final uncertainty can be safely neglected, the uncertainty in the masses may be relevant under specific circumstances. (author)
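
    A minimal sketch of the comparator equation and the quadrature uncertainty budget described above follows; the nuclide, count rates, masses, and uncertainties are all hypothetical.

```python
import numpy as np

# Comparative NAA sketch (hypothetical numbers): concentration from the
# comparator equation with a decay correction, plus first-order relative
# uncertainty propagation over the dominant terms.
half_life = 15.0 * 3600                  # e.g. 24Na half-life, s (assumption)
lam = np.log(2.0) / half_life

A_s, u_As = 5.0e3, 50.0                  # sample count rate and 1-sigma
A_c, u_Ac = 8.0e3, 60.0                  # comparator count rate and 1-sigma
m_s, u_ms = 0.1500, 0.0002               # masses, g
m_c, u_mc = 0.1200, 0.0002
C_c, u_Cc = 1000.0, 15.0                 # comparator concentration, mg/kg
dt = 1800.0                              # time between the two countings, s

# Sample counted after the comparator, so its decayed activity is corrected up.
C_s = C_c * (A_s / m_s) / (A_c / m_c) * np.exp(lam * dt)

# Relative uncertainties added in quadrature (independent inputs assumed);
# the decay-time term is neglected, as the abstract suggests it safely can be.
rel = np.sqrt((u_As / A_s) ** 2 + (u_Ac / A_c) ** 2 +
              (u_ms / m_s) ** 2 + (u_mc / m_c) ** 2 + (u_Cc / C_c) ** 2)
print(f"C_sample = {C_s:.1f} +/- {C_s * rel:.1f} mg/kg")
```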

  12. A Proposal on the Advanced Sampling Based Sensitivity and Uncertainty Analysis Method for the Eigenvalue Uncertainty Analysis

    International Nuclear Information System (INIS)

    Kim, Song Hyun; Song, Myung Sub; Shin, Chang Ho; Noh, Jae Man

    2014-01-01

    When the perturbation theory is used, the uncertainty of the response can be estimated by a single transport simulation, and therefore it requires a small computational load. However, it has the disadvantage that the computational methodology must be modified whenever a different response type, such as the multiplication factor, flux, or power distribution, is estimated. Hence, it is suitable for analyzing few responses with many perturbed parameters. The statistical approach is a sampling-based method which uses cross sections randomly sampled from covariance data for analyzing the uncertainty of the response. XSUSA is a code based on the statistical approach. The cross sections are simply modified with the sampling-based method; thus, general transport codes can be directly utilized for the S/U analysis without any code modifications. However, to calculate the uncertainty distribution from the result, the code simulation must be repeated enough times with randomly sampled cross sections. This inefficiency is known as a disadvantage of the stochastic method. In this study, to increase the estimation efficiency of the sampling-based S/U method, an advanced sampling and estimation method for the cross sections was proposed and verified. The main feature of the proposed method is that the cross section averaged from each single sampled cross section is used. The validation of the proposed method was performed using the perturbation theory.

  13. Uncertainty and sensitivity analysis of the nuclear fuel thermal behavior

    Energy Technology Data Exchange (ETDEWEB)

    Boulore, A., E-mail: antoine.boulore@cea.fr [Commissariat a l' Energie Atomique (CEA), DEN, Fuel Research Department, 13108 Saint-Paul-lez-Durance (France); Struzik, C. [Commissariat a l' Energie Atomique (CEA), DEN, Fuel Research Department, 13108 Saint-Paul-lez-Durance (France); Gaudier, F. [Commissariat a l' Energie Atomique (CEA), DEN, Systems and Structure Modeling Department, 91191 Gif-sur-Yvette (France)

    2012-12-15

    Highlights: ► A complete quantitative method for uncertainty propagation and sensitivity analysis is applied. ► The thermal conductivity of UO2 is modeled as a random variable. ► The first source of uncertainty is the linear heat rate. ► The second source of uncertainty is the thermal conductivity of the fuel. - Abstract: In the global framework of nuclear fuel behavior simulation, the response of the models describing the physical phenomena occurring during the irradiation in reactor is mainly conditioned by the confidence in the calculated temperature of the fuel. Amongst all the parameters influencing the temperature calculation in our fuel rod simulation code (METEOR V2), several sources of uncertainty have been identified as being the most sensitive: the thermal conductivity of UO2, the radial distribution of power in the fuel pellet, the local linear heat rate in the fuel rod, the geometry of the pellet, and the thermal transfer in the gap. Expert judgment and inverse methods have been used to model the uncertainty of these parameters using theoretical distributions and correlation matrices. Propagation of these uncertainties in the METEOR V2 code using the URANIE framework and a Monte-Carlo technique has been performed for different experimental irradiations of UO2 fuel. At every time step of the simulated experiments, we get a temperature statistical distribution which results from the initial distributions of the uncertain parameters. We can then estimate confidence intervals of the calculated temperature. In order to quantify the sensitivity of the calculated temperature to each of the uncertain input parameters and data, we have also performed a sensitivity analysis using first-order Sobol' indices.

  14. Uncertainty and sensitivity analysis: Mathematical model of coupled heat and mass transfer for a contact baking process

    DEFF Research Database (Denmark)

    Feyissa, Aberham Hailu; Gernaey, Krist; Adler-Nissen, Jens

    2012-01-01

    Similar to other processes, the modelling of heat and mass transfer during food processing involves uncertainty in the values of input parameters (heat and mass transfer coefficients, evaporation rate parameters, thermo-physical properties, initial and boundary conditions) which leads to uncertainty in the model predictions. The aim of the current paper is to address this uncertainty challenge in the modelling of food production processes using a combination of uncertainty and sensitivity analysis, where the uncertainty analysis and global sensitivity analysis were applied to a heat and mass...

  15. An enhanced unified uncertainty analysis approach based on first order reliability method with single-level optimization

    International Nuclear Information System (INIS)

    Yao, Wen; Chen, Xiaoqian; Huang, Yiyong; Tooren, Michel van

    2013-01-01

    In engineering, there exist both aleatory uncertainties due to the inherent variation of the physical system and its operational environment, and epistemic uncertainties due to lack of knowledge, which can be reduced with the collection of more data. To analyze the uncertain distribution of the system performance under both aleatory and epistemic uncertainties, combined probability and evidence theory can be employed to quantify the compound effects of the mixed uncertainties. The existing First Order Reliability Method (FORM) based Unified Uncertainty Analysis (UUA) approach nests the optimization-based interval analysis in the improved Hasofer–Lind–Rackwitz–Fiessler (iHLRF) algorithm based Most Probable Point (MPP) searching procedure, which is computationally prohibitive for complex systems and may encounter convergence problems as well. Therefore, in this paper it is proposed to use general optimization solvers to search for the MPP in the outer loop and then reformulate the double-loop optimization problem into an equivalent single-level optimization (SLO) problem, so as to simplify the uncertainty analysis process, improve the robustness of the algorithm, and alleviate the computational complexity. The effectiveness and efficiency of the proposed method are demonstrated with two numerical examples and one practical satellite conceptual design problem. -- Highlights: ► Uncertainty analysis under mixed aleatory and epistemic uncertainties is studied. ► A unified uncertainty analysis method is proposed with combined probability and evidence theory. ► The traditional nested analysis method is converted to single-level optimization for efficiency. ► The effectiveness and efficiency of the proposed method are demonstrated with three examples

  16. Deterministic methods for sensitivity and uncertainty analysis in large-scale computer models

    International Nuclear Information System (INIS)

    Worley, B.A.; Oblow, E.M.; Pin, F.G.; Maerker, R.E.; Horwedel, J.E.; Wright, R.Q.; Lucius, J.L.

    1987-01-01

    The fields of sensitivity and uncertainty analysis are dominated by statistical techniques when large-scale modeling codes are being analyzed. This paper reports on the development and availability of two systems, GRESS and ADGEN, that make use of computer calculus compilers to automate the implementation of deterministic sensitivity analysis capability into existing computer models. This automation removes the traditional limitation of deterministic sensitivity methods. The paper describes a deterministic uncertainty analysis method (DUA) that uses derivative information as a basis to propagate parameter probability distributions to obtain result probability distributions. The paper demonstrates the deterministic approach to sensitivity and uncertainty analysis as applied to a sample problem that models the flow of water through a borehole. The sample problem is used as a basis to compare the cumulative distribution function of the flow rate as calculated by standard statistical methods and by the DUA method. The DUA method gives a more accurate result based upon only two model executions, compared to fifty executions in the statistical case.
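
    A sketch of the DUA idea - propagate parameter standard deviations through first-order derivative information - on a simplified stand-in for the borehole flow problem. GRESS/ADGEN obtain the derivatives by automated calculus on the model source code; central differences are used here for brevity, and all nominal values and standard deviations are assumptions:

      import numpy as np

      def flow(x):
          # Simplified borehole-like flow model (illustrative only)
          Tu, Hu, Hl, r = x
          return 2 * np.pi * Tu * (Hu - Hl) / np.log(r / 0.1)

      x0    = np.array([8.0e-2, 1000.0, 760.0, 250.0])  # nominal inputs
      sigma = np.array([5.0e-3,   20.0,  15.0,  25.0])  # input std devs

      # Central-difference derivatives (two model runs per parameter)
      dfdx = np.empty(4)
      for i in range(4):
          h = 1e-6 * x0[i]
          e = np.eye(4)[i]
          dfdx[i] = (flow(x0 + h * e) - flow(x0 - h * e)) / (2 * h)

      # First-order propagation: var(y) ~= sum_i (df/dx_i)^2 var(x_i)
      sigma_y = np.sqrt(np.sum((dfdx * sigma) ** 2))
      print(f"flow = {flow(x0):.3f} +/- {sigma_y:.3f}")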

  17. Bayesian analysis for uncertainty estimation of a canopy transpiration model

    Science.gov (United States)

    Samanta, S.; Mackay, D. S.; Clayton, M. K.; Kruger, E. L.; Ewers, B. E.

    2007-04-01

    A Bayesian approach was used to fit a conceptual transpiration model to half-hourly transpiration rates for a sugar maple (Acer saccharum) stand collected over a 5-month period and probabilistically estimate its parameter and prediction uncertainties. The model used the Penman-Monteith equation with the Jarvis model for canopy conductance. This deterministic model was extended by adding a normally distributed error term. This extension enabled using Markov chain Monte Carlo simulations to sample the posterior parameter distributions. The residuals revealed approximate conformance to the assumption of normally distributed errors. However, minor systematic structures in the residuals at fine timescales suggested model changes that would potentially improve the modeling of transpiration. Results also indicated considerable uncertainties in the parameter and transpiration estimates. This simple methodology of uncertainty analysis would facilitate the deductive step during the development cycle of deterministic conceptual models by accounting for these uncertainties while drawing inferences from data.
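
    The essence of the approach - a deterministic model extended with a normally distributed error term and sampled with Markov chain Monte Carlo - fits in a short random-walk Metropolis sketch; the saturating-response model below is an invented stand-in for the Penman-Monteith/Jarvis formulation:

      import numpy as np

      rng = np.random.default_rng(0)

      # Synthetic data from "deterministic model + normally distributed error";
      # the saturating response is an invented stand-in for the real model.
      x = np.linspace(0.1, 10.0, 50)
      def model(theta):
          gmax, k = theta
          return gmax * x / (k + x)
      y_obs = model((5.0, 2.0)) + rng.normal(0.0, 0.3, x.size)

      def log_post(theta, sigma=0.3):      # flat prior on theta > 0;
          if min(theta) <= 0:              # error std fixed for brevity
              return -np.inf
          r = y_obs - model(theta)
          return -0.5 * np.sum((r / sigma) ** 2)

      theta, lp = np.array([1.0, 1.0]), -np.inf
      chain = []
      for _ in range(20000):               # random-walk Metropolis
          prop = theta + rng.normal(0.0, 0.1, 2)
          lp_prop = log_post(prop)
          if np.log(rng.uniform()) < lp_prop - lp:
              theta, lp = prop, lp_prop
          chain.append(theta)
      chain = np.array(chain[5000:])       # discard burn-in
      print("posterior means:", chain.mean(axis=0))
      print("95% intervals:\n", np.percentile(chain, [2.5, 97.5], axis=0))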

  18. Uncertainty analysis in seismic tomography

    Science.gov (United States)

    Owoc, Bartosz; Majdański, Mariusz

    2017-04-01

    Velocity fields from seismic travel-time tomography depend on several factors, such as regularization, inversion path, and model parameterization. The result also depends strongly on the initial velocity model and on the precision of travel-time picking. In this research we test the dependence on the starting model in layered tomography and compare it with the effect of picking precision. Moreover, in our analysis the uncertainty distribution for manually picked travel times is asymmetric, which shifts the results toward faster velocities. For the calculations we use the JIVE3D travel-time tomographic code, with data from geo-engineering and industrial-scale investigations collected by our team from IG PAS.

  19. Reusable launch vehicle model uncertainties impact analysis

    Science.gov (United States)

    Chen, Jiaye; Mu, Rongjun; Zhang, Xin; Deng, Yanpeng

    2018-03-01

    Reusable launch vehicles (RLVs) have the typical characteristics of a complex aerodynamic shape coupled with the propulsion system, and the flight environment is highly complicated and intensely changeable. The model therefore has large uncertainty, which makes the nominal system quite different from the real system, so studying the influence of these uncertainties on the stability of the control system is of great significance for controller design. In order to improve the performance of an RLV, this paper proposes an approach for analyzing the influence of the model uncertainties. For a typical RLV, the coupled dynamic and kinematic models are built. The factors that introduce uncertainty during model building are then analyzed and summarized, and the model uncertainties are expressed according to the additive uncertainty model. The maximum singular value of the uncertainty matrix is chosen as the boundary model, and the norm of the uncertainty matrix is used to show how much influence the uncertainty factors have on the stability of the control system. The simulation results illustrate that the inertial factors have the largest influence on the stability of the system, and that it is necessary to take the model uncertainties into consideration before designing the controller for this kind of aircraft (like an RLV).
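
    The boundary computation described above reduces to a singular value decomposition of the uncertainty matrix; a sketch with an invented matrix:

      import numpy as np

      # Illustrative additive-uncertainty setup: P_true = P_nom + Delta, with
      # the boundary taken as the largest singular value of Delta.
      Delta = np.array([[0.02, -0.01, 0.00],
                        [0.01,  0.03, -0.02],
                        [0.00, -0.01,  0.04]])

      sigma_max = np.linalg.svd(Delta, compute_uv=False)[0]  # spectral norm
      print("max singular value (uncertainty boundary):", sigma_max)
      print("matches 2-norm:", np.isclose(sigma_max, np.linalg.norm(Delta, 2)))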

  20. Quality in environmental science for policy: assessing uncertainty as a component of policy analysis

    NARCIS (Netherlands)

    Maxim, L.; van der Sluijs, J.P.

    2011-01-01

    The sheer number of attempts to define and classify uncertainty reveals an awareness of its importance in environmental science for policy, though the nature of uncertainty is often misunderstood. The interdisciplinary field of uncertainty analysis is unstable; there are currently several incomplete

  1. Uncertainty modeling in vibration, control and fuzzy analysis of structural systems

    CERN Document Server

    Halder, Achintya; Ayyub, Bilal M

    1997-01-01

    This book gives an overview of the current state of uncertainty modeling in vibration, control, and fuzzy analysis of structural and mechanical systems. It is a coherent compendium written by leading experts and offers the reader a sampling of exciting research areas in several fast-growing branches in this field. Uncertainty modeling and analysis are becoming an integral part of system definition and modeling in many fields. The book consists of ten chapters that report the work of researchers, scientists and engineers on theoretical developments and diversified applications in engineering sy

  2. Model parameter uncertainty analysis for an annual field-scale phosphorus loss model

    Science.gov (United States)

    Phosphorus (P) loss models are important tools for developing and evaluating conservation practices aimed at reducing P losses from agricultural fields. All P loss models, however, have an inherent amount of uncertainty associated with them. In this study, we conducted an uncertainty analysis with ...

  3. Interpretations of alternative uncertainty representations in a reliability and risk analysis context

    International Nuclear Information System (INIS)

    Aven, T.

    2011-01-01

    Probability is the predominant tool used to measure uncertainties in reliability and risk analyses. However, other representations also exist, including imprecise (interval) probability, fuzzy probability and representations based on the theories of evidence (belief functions) and possibility. Many researchers in the field are strong proponents of these alternative methods, but some are also sceptical. In this paper, we address one basic requirement set for quantitative measures of uncertainty: the interpretation needed to explain what an uncertainty number expresses. We question to what extent the various measures meet this requirement. Comparisons are made with probabilistic analysis, where uncertainty is represented by subjective probabilities, using either a betting interpretation or a reference to an uncertainty standard interpretation. By distinguishing between chances (expressing variation) and subjective probabilities, new insights are gained into the link between the alternative uncertainty representations and probability.

  4. Analysis of uncertainties in the IAEA/WHO TLD postal dose audit system

    Energy Technology Data Exchange (ETDEWEB)

    Izewska, J. [Department of Nuclear Sciences and Applications, International Atomic Energy Agency, Wagramer Strasse 5, Vienna (Austria)], E-mail: j.izewska@iaea.org; Hultqvist, M. [Department of Medical Radiation Physics, Karolinska Institute, Stockholm University, Stockholm (Sweden); Bera, P. [Department of Nuclear Sciences and Applications, International Atomic Energy Agency, Wagramer Strasse 5, Vienna (Austria)

    2008-02-15

    The International Atomic Energy Agency (IAEA) and the World Health Organisation (WHO) operate the IAEA/WHO TLD postal dose audit programme. Thermoluminescence dosimeters (TLDs) are used as transfer devices in this programme. In the present work the uncertainties in the dose determination from TLD measurements have been evaluated. The analysis of uncertainties comprises uncertainties in the calibration coefficient of the TLD system and uncertainties in factors correcting for dose response non-linearity, fading of TL signal, energy response and influence of TLD holder. The individual uncertainties have been combined to estimate the total uncertainty in the dose evaluated from TLD measurements. The combined relative standard uncertainty in the dose determined from TLD measurements has been estimated to be 1.2% for irradiations with Co-60 γ-rays and 1.6% for irradiations with high-energy X-rays. Results from irradiations by the Bureau international des poids et mesures (BIPM), Primary Standard Dosimetry Laboratories (PSDLs) and Secondary Standards Dosimetry Laboratories (SSDLs) compare favourably with the estimated uncertainties, whereas TLD results of radiotherapy centres show higher standard deviations than those derived theoretically.
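
    The combination step is a root-sum-of-squares of the relative standard uncertainties; a sketch with assumed component values (these are not the actual entries of the IAEA/WHO budget):

      import numpy as np

      # Illustrative relative standard uncertainties (%) of the components
      # named in the abstract; every value below is an assumption.
      components = {
          "calibration coefficient":  0.9,
          "non-linearity correction": 0.3,
          "fading correction":        0.5,
          "energy response":          0.4,
          "holder influence":         0.2,
      }
      u_c = np.sqrt(sum(u ** 2 for u in components.values()))
      print(f"combined relative standard uncertainty: {u_c:.2f}%")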

  5. Application of uncertainty and sensitivity analysis to the air quality SHERPA modelling tool

    Science.gov (United States)

    Pisoni, E.; Albrecht, D.; Mara, T. A.; Rosati, R.; Tarantola, S.; Thunis, P.

    2018-06-01

    Air quality has significantly improved in Europe over the past few decades. Nonetheless, we still find high measured concentrations, mainly in specific regions or cities. This dimensional shift, from EU-wide to hot-spot exceedances, calls for a novel approach to regional air quality management (to complement existing EU-wide policies). The SHERPA (Screening for High Emission Reduction Potentials on Air quality) modelling tool was developed in this context. It provides an additional tool to be used in support of regional/local decision makers responsible for the design of air quality plans. It is therefore important to evaluate the quality of the SHERPA model and its behavior in the face of various kinds of uncertainty. Uncertainty and sensitivity analysis techniques can be used for this purpose. They both reveal the links between assumptions and forecasts, help in model simplification, and may highlight unexpected relationships between inputs and outputs. Thus, a policy-steered SHERPA module - predicting air quality improvement linked to emission reduction scenarios - was evaluated by means of (1) uncertainty analysis (UA) to quantify uncertainty in the model output, and (2) sensitivity analysis (SA) to identify the most influential input sources of this uncertainty. The results of this study provide relevant information about the key variables driving the SHERPA output uncertainty, and advise policy-makers and modellers where to place their efforts for an improved decision-making process.

  6. Tolerance analysis in manufacturing using process capability ratio with measurement uncertainty

    DEFF Research Database (Denmark)

    Mahshid, Rasoul; Mansourvar, Zahra; Hansen, Hans Nørgaard

    2017-01-01

    Tolerance analysis provides valuable information regarding the performance of a manufacturing process: it allows determining the maximum possible variation of a quality feature in production. Previous research has focused on the application of tolerance analysis to the design of mechanical assemblies. In this paper, a new statistical analysis was applied to manufactured products to assess achieved tolerances when the process is known, using the capability ratio and expanded uncertainty. The analysis has benefits for process planning, determining actual precision limits, process optimization, and troubleshooting a malfunctioning existing part. The capability measure is based on a number of measurements performed on the part's quality variable. Since the ratio relies on measurements, any error that is not eliminated has a notable negative impact on the results. Therefore, measurement uncertainty was used in combination with the process capability ratio.
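
    One common way to fold expanded measurement uncertainty into a capability ratio is to guard-band the specification limits by the expanded uncertainty U = k·u; a sketch under that assumption (the paper's exact formulation may differ, and all numbers are invented):

      import numpy as np

      rng = np.random.default_rng(1)
      measurements = rng.normal(10.00, 0.02, 50)  # measured feature [mm]

      LSL, USL = 9.90, 10.10   # specification limits [mm]
      u_meas, k = 0.005, 2.0   # standard measurement uncertainty, coverage factor
      U = k * u_meas           # expanded uncertainty

      s = measurements.std(ddof=1)
      Cp_raw = (USL - LSL) / (6 * s)
      # Guard-banded variant: shrink the conformance zone by U on each side
      Cp_guarded = (USL - LSL - 2 * U) / (6 * s)
      print(f"Cp = {Cp_raw:.2f}, Cp with measurement uncertainty = {Cp_guarded:.2f}")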

  7. Cross-section sensitivity and uncertainty analysis of the FNG copper benchmark experiment

    Energy Technology Data Exchange (ETDEWEB)

    Kodeli, I., E-mail: ivan.kodeli@ijs.si [Jožef Stefan Institute, Jamova 39, SI-1000 Ljubljana (Slovenia); Kondo, K. [Karlsruhe Institute of Technology, Postfach 3640, D-76021 Karlsruhe (Germany); Japan Atomic Energy Agency, Rokkasho-mura (Japan); Perel, R.L. [Racah Institute of Physics, Hebrew University of Jerusalem, IL-91904 Jerusalem (Israel); Fischer, U. [Karlsruhe Institute of Technology, Postfach 3640, D-76021 Karlsruhe (Germany)

    2016-11-01

    A neutronics benchmark experiment on a copper assembly was performed from late 2014 to early 2015 at the 14-MeV Frascati neutron generator (FNG) of ENEA Frascati, with the objective to provide the experimental database required for the validation of the copper nuclear data relevant for ITER design calculations, including the related uncertainties. The paper presents the pre- and post-analysis of the experiment performed using cross-section sensitivity and uncertainty codes, both deterministic (SUSD3D) and Monte Carlo (MCSEN5). Cumulative reaction rates and neutron flux spectra, their sensitivity to the cross sections, as well as the corresponding uncertainties were estimated for different selected detector positions up to ∼58 cm in the copper assembly. This made it possible, in the pre-analysis phase, to optimize the geometry, the detector positions and the choice of activation reactions, and, in the post-analysis phase, to interpret the results of the measurements and the calculations, to draw conclusions on the quality of the relevant nuclear cross-section data, and to estimate the uncertainties in the calculated nuclear responses and fluxes. Large uncertainties in the calculated reaction rates and neutron spectra of up to 50%, rarely observed at this level in benchmark analyses using today's nuclear data, were predicted, particularly for fast reactions. Observed C/E (dis)agreements with values as low as 0.5 partly confirm these predictions. Benchmark results are therefore expected to contribute to the improvement of both cross-section and covariance data evaluations.
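
    Uncertainty estimates of this kind come from the first-order "sandwich" rule, var(R) = Sᵀ C S, combining the sensitivity profile S of a response with the cross-section covariance matrix C; a toy three-group illustration with invented data:

      import numpy as np

      # Sandwich rule with made-up relative sensitivities and covariances
      S = np.array([0.8, -0.3, 0.5])     # relative sensitivities per group
      C = np.array([[0.04, 0.01, 0.00],  # relative covariance matrix
                    [0.01, 0.09, 0.02],
                    [0.00, 0.02, 0.16]])

      rel_var = S @ C @ S
      print(f"relative std. dev. of response: {np.sqrt(rel_var):.1%}")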

  8. Application of Uncertainty and Sensitivity Analysis to a Kinetic Model for Enzymatic Biodiesel Production

    DEFF Research Database (Denmark)

    Price, Jason Anthony; Nordblad, Mathias; Woodley, John

    2014-01-01

    This paper demonstrates the added benefits of using uncertainty and sensitivity analysis in the kinetics of enzymatic biodiesel production. For this study, a kinetic model by Fedosov and co-workers is used. For the uncertainty analysis the Monte Carlo procedure was used to statistically quantify...

  9. Fukushima Daiichi unit 1 uncertainty analysis--Preliminary selection of uncertain parameters and analysis methodology

    Energy Technology Data Exchange (ETDEWEB)

    Cardoni, Jeffrey N.; Kalinich, Donald A.

    2014-02-01

    Sandia National Laboratories (SNL) plans to conduct uncertainty analyses (UA) on the Fukushima Daiichi unit (1F1) plant with the MELCOR code. The model to be used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and Nuclear Regulatory Commission (NRC). However, that study only examined a handful of various model inputs and boundary conditions, and the predictions yielded only fair agreement with plant data and current release estimates. The goal of this uncertainty study is to perform a focused evaluation of uncertainty in core melt progression behavior and its effect on key figures-of-merit (e.g., hydrogen production, vessel lower head failure, etc.). In preparation for the SNL Fukushima UA work, a scoping study has been completed to identify important core melt progression parameters for the uncertainty analysis. The study also lays out a preliminary UA methodology.

  10. Large-Scale Transport Model Uncertainty and Sensitivity Analysis: Distributed Sources in Complex Hydrogeologic Systems

    International Nuclear Information System (INIS)

    Sig Drellack, Lance Prothro

    2007-01-01

    The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial setting of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently-derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. Advancing multiple alternative flow models, sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations. The

  11. An analysis of combined standard uncertainty for radiochemical measurements of environmental samples

    International Nuclear Information System (INIS)

    Berne, A.

    1996-01-01

    It is anticipated that future data acquisitions intended for use in radiological risk assessments will require the incorporation of uncertainty analysis. Often, only one aliquot of the sample is taken and a single determination is made. Under these circumstances, the total uncertainty is calculated using the 'propagation of errors' approach. However, there is no agreement in the radioanalytical community as to the exact equations to use. The Quality Assurance/Metrology Division of the Environmental Measurements Laboratory has developed a systematic process to compute uncertainties in constituent components of the analytical procedure, as well as the combined standard uncertainty (CSU). The equations for computation are presented here, with examples of their use. They have also been incorporated into a code for use in the spreadsheet application QuattroPro™. Using the spreadsheet with appropriate inputs permits an analysis of the variations in the CSU as a function of several different variables. The relative importance of the 'counting uncertainty' can also be ascertained.
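
    A sketch of such a combined standard uncertainty computation for a single-aliquot counting measurement, in the spirit of the spreadsheet described above (the EML equations themselves are not reproduced; the choice of components and every value are illustrative):

      import numpy as np

      # Hypothetical single-aliquot measurement; all values are made up.
      gross, t_g = 5200, 3600.0      # gross counts, count time [s]
      bkg,   t_b = 1200, 3600.0      # background counts, count time [s]
      eff, u_eff_rel = 0.25, 0.03    # detector efficiency, rel. std. unc.
      yld, u_yld_rel = 0.85, 0.02    # chemical yield, rel. std. unc.

      net_rate = gross / t_g - bkg / t_b
      u_count = np.sqrt(gross / t_g**2 + bkg / t_b**2)  # Poisson statistics
      u_count_rel = u_count / net_rate

      # Combined standard uncertainty by propagation of errors (quadrature)
      csu_rel = np.sqrt(u_count_rel**2 + u_eff_rel**2 + u_yld_rel**2)
      activity = net_rate / (eff * yld)
      print(f"activity: {activity:.2f} Bq, CSU: {100 * csu_rel:.1f}%")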

  12. Sampling based uncertainty analysis of 10% hot leg break LOCA in large scale test facility

    International Nuclear Information System (INIS)

    Sengupta, Samiran; Kraina, V.; Dubey, S. K.; Rao, R. S.; Gupta, S. K.

    2010-01-01

    Sampling-based uncertainty analysis was carried out to quantify uncertainty in the predictions of the best-estimate code RELAP5/MOD3.2 for a thermal-hydraulic test (10% hot leg break LOCA) performed in the Large Scale Test Facility (LSTF) as part of an IAEA coordinated research project. The nodalisation of the test facility was qualified at both the steady-state and transient levels by systematically applying the procedures of the uncertainty methodology based on accuracy extrapolation (UMAE); uncertainty analysis was carried out using the Latin hypercube sampling (LHS) method to evaluate the uncertainty of ten input parameters. Sixteen output parameters were selected for uncertainty evaluation, and the uncertainty band between the 5th and 95th percentiles of the output parameters was evaluated. It was observed that the uncertainty band for the primary pressure during two-phase blowdown is larger than that of the remaining period. Similarly, a larger uncertainty band is observed for the accumulator injection flow during the reflood phase. Importance analysis was also carried out, and standardized rank regression coefficients were computed to quantify the effect of each individual input parameter on the output parameters. It was observed that the break discharge coefficient is the most important uncertain parameter for the prediction of all the primary side parameters, and that the steam generator (SG) relief pressure setting is the most important parameter in predicting the SG secondary pressure.
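
    A minimal Latin hypercube sampling sketch using SciPy's qmc module; the three parameters, their distributions and ranges are placeholders, not the ten RELAP5/MOD3.2 inputs of the study:

      import numpy as np
      from scipy.stats import qmc, norm, uniform

      sampler = qmc.LatinHypercube(d=3, seed=7)
      u = sampler.random(n=100)                    # 100 points in [0, 1)^3

      # Map the unit hypercube through assumed marginal distributions
      break_cd  = uniform(0.8, 0.4).ppf(u[:, 0])   # discharge coeff, [0.8, 1.2]
      sg_relief = norm(7.2, 0.1).ppf(u[:, 1])      # relief setpoint [MPa]
      gap_htc   = uniform(0.5, 1.0).ppf(u[:, 2])   # gap conductance multiplier

      # Each row defines one code run; 5th/95th percentiles of an output
      # over the 100 runs then give the uncertainty band.
      print(np.column_stack([break_cd, sg_relief, gap_htc])[:3])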

  13. Risk Analysis of Reservoir Flood Routing Calculation Based on Inflow Forecast Uncertainty

    Directory of Open Access Journals (Sweden)

    Binquan Li

    2016-10-01

    Possible risks in reservoir flood control and regulation cannot be objectively assessed by deterministic flood forecasts, resulting in a probability of reservoir failure. We demonstrate a risk analysis of reservoir flood routing calculation accounting for inflow forecast uncertainty in a sub-basin of the Huaihe River, China. The Xinanjiang model was used to provide deterministic flood forecasts and was combined with the Hydrologic Uncertainty Processor (HUP) to quantify reservoir inflow uncertainty in probability density function (PDF) form. Furthermore, the PDFs of the reservoir water level (RWL) and the risk rate of the RWL exceeding a defined safety control level could be obtained. Results suggested that the median forecast (50th percentile) of HUP showed better agreement with observed inflows than the Xinanjiang model did in terms of the performance measures of flood process, peak, and volume. In addition, most observations (77.2%) were bracketed by the uncertainty band of the 90% confidence interval, with some small exceptions at high flows. The results proved that this framework of risk analysis can provide not only deterministic forecasts of inflow and RWL, but also the fundamental uncertainty information (e.g., the 90% confidence band) for the reservoir flood routing calculation.

  14. HTGR reactor physics, thermal-hydraulics and depletion uncertainty analysis: a proposed IAEA coordinated research project

    International Nuclear Information System (INIS)

    Tyobeka, Bismark; Reitsma, Frederik; Ivanov, Kostadin

    2011-01-01

    The continued development of High Temperature Gas Cooled Reactors (HTGRs) requires verification of HTGR design and safety features with reliable high fidelity physics models and robust, efficient, and accurate codes. The predictive capability of coupled neutronics/thermal hydraulics and depletion simulations for reactor design and safety analysis can be assessed with sensitivity analysis and uncertainty analysis methods. In order to benefit from recent advances in modeling and simulation and the availability of new covariance data (nuclear data uncertainties) extensive sensitivity and uncertainty studies are needed for quantification of the impact of different sources of uncertainties on the design and safety parameters of HTGRs. Uncertainty and sensitivity studies are an essential component of any significant effort in data and simulation improvement. In February 2009, the Technical Working Group on Gas-Cooled Reactors recommended that the proposed IAEA Coordinated Research Project (CRP) on the HTGR Uncertainty Analysis in Modeling be implemented. In the paper the current status and plan are presented. The CRP will also benefit from interactions with the currently ongoing OECD/NEA Light Water Reactor (LWR) UAM benchmark activity by taking into consideration the peculiarities of HTGR designs and simulation requirements. (author)

  15. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis

    Science.gov (United States)

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-01-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster–Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty–sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights. PMID:25843987
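
    The AHP step that the weight-uncertainty analysis perturbs derives criteria weights as the principal eigenvector of a pairwise comparison matrix; a sketch with invented judgments for three criteria:

      import numpy as np

      # Invented pairwise comparison matrix for three landslide criteria
      A = np.array([[1.0, 3.0, 5.0],
                    [1/3, 1.0, 2.0],
                    [1/5, 1/2, 1.0]])

      eigvals, eigvecs = np.linalg.eig(A)
      i = np.argmax(eigvals.real)
      w = np.abs(eigvecs[:, i].real)
      w /= w.sum()                   # AHP priority weights

      lam_max = eigvals.real[i]
      CI = (lam_max - 3) / (3 - 1)   # consistency index
      CR = CI / 0.58                 # random index RI = 0.58 for n = 3
      print("weights:", w.round(3), " consistency ratio:", CR.round(3))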

  16. A cross-section sensitivity and uncertainty analysis on fusion reactor blankets with SAD/SED effect

    International Nuclear Information System (INIS)

    Furuta, Kazuo; Oka, Yoshiaki; Kondo, Shunsuke

    1986-01-01

    A cross-section sensitivity and uncertainty analysis on four types of fusion reactor blankets has been performed, based on cross-section covariance matrices. The design parameters investigated in the analysis include the tritium breeding ratio, the neutron heating and the fast neutron leakage flux from the inboard shield. Uncertainties in the Secondary Angular Distribution (SAD) and Secondary Energy Distribution (SED) of scattered neutrons have been considered for lithium. The collective standard deviation, due to uncertainties in the evaluated cross-section data presently available, is 2-4% in the tritium breeding ratio, 2-3% in the neutron heating, and 10-20% in the fast neutron leakage flux. Contributions from SAD/SED uncertainties are significant for some parameters, such as those investigated in the present study. SAD/SED uncertainties should be considered in the sensitivity and uncertainty analysis on nuclear design of fusion reactors. (orig.)

  17. How uncertainty analysis of streamflow data can reduce costs and promote robust decisions in water management applications

    Science.gov (United States)

    McMillan, Hilary; Seibert, Jan; Petersen-Overleir, Asgeir; Lang, Michel; White, Paul; Snelder, Ton; Rutherford, Kit; Krueger, Tobias; Mason, Robert; Kiang, Julie

    2017-07-01

    Streamflow data are used for important environmental and economic decisions, such as specifying and regulating minimum flows, managing water supplies, and planning for flood hazards. Despite significant uncertainty in most flow data, the flow series for these applications are often communicated and used without uncertainty information. In this commentary, we argue that proper analysis of uncertainty in river flow data can reduce costs and promote robust conclusions in water management applications. We substantiate our argument by providing case studies from Norway and New Zealand where streamflow uncertainty analysis has uncovered economic costs in the hydropower industry, improved public acceptance of a controversial water management policy, and tested the accuracy of water quality trends. We discuss the need for practical uncertainty assessment tools that generate multiple flow series realizations rather than simple error bounds. Although examples of such tools are in development, considerable barriers for uncertainty analysis and communication still exist for practitioners, and future research must aim to provide easier access and usability of uncertainty estimates. We conclude that flow uncertainty analysis is critical for good water management decisions.

  18. Decision analysis of shoreline protection under climate change uncertainty

    Science.gov (United States)

    Chao, Philip T.; Hobbs, Benjamin F.

    1997-04-01

    If global warming occurs, it could significantly affect water resource distribution and availability. Yet it is unclear whether the prospect of such change is relevant to water resources management decisions being made today. We model a shoreline protection decision problem with a stochastic dynamic program (SDP) to determine whether consideration of the possibility of climate change would alter the decision. Three questions are addressed with the SDP: (1) How important is climate change compared to other uncertainties?, (2) What is the economic loss if climate change uncertainty is ignored?, and (3) How does belief in climate change affect the timing of the decision? In the case study, sensitivity analysis shows that uncertainty in real discount rates has a stronger effect upon the decision than belief in climate change. Nevertheless, a strong belief in climate change makes the shoreline protection project less attractive and often alters the decision to build it.
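
    A two-period toy version of the build-versus-wait trade-off that the SDP formalizes, showing how the belief in climate change can flip the timing of the decision; every cost and probability below is invented:

      # Two-period sketch: build shoreline protection now, or wait one period
      # and build only if climate change turns out to be underway.
      def expected_costs(p_cc, p_storm=0.1, discount=0.05,
                         build_cost=10.0, damage_cc=30.0, damage_base=5.0):
          beta = 1.0 / (1.0 + discount)
          build_now = build_cost
          # Waiting risks one period of storm damage, then builds only if
          # climate change is confirmed.
          exp_damage = p_cc * damage_cc + (1.0 - p_cc) * damage_base
          wait = p_storm * exp_damage + beta * p_cc * build_cost
          return build_now, wait

      for p in (0.1, 0.5, 0.9):
          b, w = expected_costs(p)
          print(f"belief in climate change {p:.0%}: build={b:.1f} wait={w:.1f}"
                f" -> {'build now' if b < w else 'wait'}")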

  19. Summary of the CEC/USDOE workshop on uncertainty analysis

    International Nuclear Information System (INIS)

    Elderkin, C.E.; Kelly, G.N.

    1990-06-01

    There is uncertainty in all aspects of assessing the consequences of accidental releases of radioactive material, from understanding and describing the environmental and biological transfer processes to modeling emergency response. The need for an exchange of views and a comparison of approaches between the diverse disciplines led to the organization of a CEC/USDOE Workshop on Uncertainty Analysis held in Santa Fe, New Mexico, in November 1989. The workshop brought together specialists in a number of disciplines, including those expert in the mathematics and statistics of uncertainty analysis, in expert judgment elicitation and evaluation, and in all aspects of assessing the radiological and environmental consequences of accidental releases of radioactive material. In addition, there was participation from users of the output of accident consequence assessment in decision making and/or regulatory frameworks. The main conclusions that emerged from the workshop are summarized in this paper. These are discussed in the context of three different types of accident consequence assessment: probabilistic assessments of accident consequences undertaken as inputs to risk analyses of nuclear installations, assessments of accident consequences in real time to provide inputs to decisions on the introduction of countermeasures, and the reconstruction of doses and risks resulting from past releases of radioactive material.

  20. AN OVERVIEW OF THE UNCERTAINTY ANALYSIS, SENSITIVITY ANALYSIS, AND PARAMETER ESTIMATION (UA/SA/PE) API AND HOW TO IMPLEMENT IT

    Science.gov (United States)

    The Application Programming Interface (API) for Uncertainty Analysis, Sensitivity Analysis, and Parameter Estimation (UA/SA/PE API) (also known as Calibration, Optimization and Sensitivity and Uncertainty (CUSO)) was developed in a joint effort between several members of both ...

  1. Uncertainty analysis of flexible rotors considering fuzzy parameters and fuzzy-random parameters

    Directory of Open Access Journals (Sweden)

    Fabian Andres Lara-Molina

    The components of flexible rotors are subjected to uncertainties, the main sources of which include the variation of mechanical properties. This contribution aims at analyzing the dynamics of flexible rotors under uncertain parameters modeled as fuzzy and fuzzy random variables. The uncertainty analysis encompasses the modeling of uncertain parameters and the numerical simulation of the corresponding flexible rotor model using an approach based on fuzzy dynamic analysis. The numerical simulation is accomplished by mapping the fuzzy parameters of the deterministic flexible rotor model. Thereby, the flexible rotor is modeled using both the Fuzzy Finite Element Method and the Fuzzy Stochastic Finite Element Method. Numerical simulations illustrate the methodology in terms of orbits and frequency response functions subject to uncertain parameters.

  2. Uncertainty and sensitivity analysis in a Probabilistic Safety Analysis level-1

    International Nuclear Information System (INIS)

    Nunez Mc Leod, Jorge E.; Rivera, Selva S.

    1996-01-01

    A methodology for sensitivity and uncertainty analysis, applicable to a Level-1 Probabilistic Safety Assessment, is presented. The work covers: the correct association of distributions to parameters, the importance and qualification of expert opinions, the generation of samples according to sample size, and the study of the relationships among system variables and system response. A series of statistical-mathematical techniques is recommended throughout the development of the analysis methodology, as well as different graphical visualizations for controlling the study. (author)

  3. A practical sensitivity analysis method for ranking sources of uncertainty in thermal–hydraulics applications

    Energy Technology Data Exchange (ETDEWEB)

    Pourgol-Mohammad, Mohammad, E-mail: pourgolmohammad@sut.ac.ir [Department of Mechanical Engineering, Sahand University of Technology, Tabriz (Iran, Islamic Republic of); Hoseyni, Seyed Mohsen [Department of Basic Sciences, East Tehran Branch, Islamic Azad University, Tehran (Iran, Islamic Republic of); Hoseyni, Seyed Mojtaba [Building & Housing Research Center, Tehran (Iran, Islamic Republic of); Sepanloo, Kamran [Nuclear Science and Technology Research Institute, Tehran (Iran, Islamic Republic of)

    2016-08-15

    Highlights: • Existing uncertainty ranking methods prove inconsistent for TH applications. • Introduction of a new method for ranking sources of uncertainty in TH codes. • Modified PIRT qualitatively identifies and ranks uncertainty sources more precisely. • The importance of parameters is calculated by a limited number of TH code executions. • Methodology is applied successfully on the LOFT-LB1 test facility. - Abstract: In application to thermal–hydraulic calculations by system codes, sensitivity analysis plays an important role in managing the uncertainties of code output and in risk analysis. Sensitivity analysis is also used to confirm the results of the qualitative Phenomena Identification and Ranking Table (PIRT). Several methodologies have been developed to address uncertainty importance assessment. Generally, uncertainty importance measures, mainly devised for Probabilistic Risk Assessment (PRA) applications, are not affordable for the computationally demanding calculations of complex thermal–hydraulic (TH) system codes. In other words, for effective quantification of the degree of contribution of each phenomenon to the total uncertainty of the output, a practical approach is needed that accounts for the high computational burden of TH calculations. This study aims primarily to show the inefficiency of the existing approaches and then introduces a solution to cope with the challenges in this area by modifying a variance-based uncertainty importance method. Important parameters are identified qualitatively by the modified PIRT approach, and their uncertainty importance is then quantified by a local derivative index. The proposed index is attractive from a practicality point of view for TH applications: it is capable of calculating the importance of parameters with a limited number of TH code executions. Application of the proposed methodology is demonstrated on the LOFT-LB1 test facility.

  4. A practical sensitivity analysis method for ranking sources of uncertainty in thermal–hydraulics applications

    International Nuclear Information System (INIS)

    Pourgol-Mohammad, Mohammad; Hoseyni, Seyed Mohsen; Hoseyni, Seyed Mojtaba; Sepanloo, Kamran

    2016-01-01

    Highlights: • Existing uncertainty ranking methods prove inconsistent for TH applications. • Introduction of a new method for ranking sources of uncertainty in TH codes. • Modified PIRT qualitatively identifies and ranks uncertainty sources more precisely. • The importance of parameters is calculated by a limited number of TH code executions. • Methodology is applied successfully on the LOFT-LB1 test facility. - Abstract: In application to thermal–hydraulic calculations by system codes, sensitivity analysis plays an important role in managing the uncertainties of code output and in risk analysis. Sensitivity analysis is also used to confirm the results of the qualitative Phenomena Identification and Ranking Table (PIRT). Several methodologies have been developed to address uncertainty importance assessment. Generally, uncertainty importance measures, mainly devised for Probabilistic Risk Assessment (PRA) applications, are not affordable for the computationally demanding calculations of complex thermal–hydraulic (TH) system codes. In other words, for effective quantification of the degree of contribution of each phenomenon to the total uncertainty of the output, a practical approach is needed that accounts for the high computational burden of TH calculations. This study aims primarily to show the inefficiency of the existing approaches and then introduces a solution to cope with the challenges in this area by modifying a variance-based uncertainty importance method. Important parameters are identified qualitatively by the modified PIRT approach, and their uncertainty importance is then quantified by a local derivative index. The proposed index is attractive from a practicality point of view for TH applications: it is capable of calculating the importance of parameters with a limited number of TH code executions. Application of the proposed methodology is demonstrated on the LOFT-LB1 test facility.
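
    A sketch of a local derivative importance index of the kind proposed: the output derivative with respect to each input, scaled by the input's standard deviation, obtained at the cost of only 2d extra code runs (the stand-in function and all numbers are invented):

      import numpy as np

      def th_code(x):
          # Invented stand-in for a TH system code: "clad temperature" [K]
          power, flow, gap_k = x
          return 560.0 + 2.0 * power / flow + 30.0 / gap_k

      x0    = np.array([100.0, 4.0, 1.0])  # nominal inputs
      sigma = np.array([5.0, 0.2, 0.15])   # assumed input std devs

      I = np.empty(3)
      for i in range(3):                   # one-at-a-time central differences
          h = 1e-5 * x0[i]
          e = np.eye(3)[i]
          dydx = (th_code(x0 + h * e) - th_code(x0 - h * e)) / (2 * h)
          I[i] = dydx * sigma[i]           # K per one-sigma input change

      rank = np.argsort(-np.abs(I))
      print("importance indices:", I.round(2), " ranking:", rank)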

  5. Spatial GHG Inventory: Analysis of Uncertainty Sources. A Case Study for Ukraine

    International Nuclear Information System (INIS)

    Bun, R.; Gusti, M.; Kujii, L.; Tokar, O.; Tsybrivskyy, Y.; Bun, A.

    2007-01-01

    A geoinformation technology for creating spatially distributed greenhouse gas inventories based on a methodology provided by the Intergovernmental Panel on Climate Change and special software linking input data, inventory models, and a means for visualization are proposed. This technology opens up new possibilities for qualitative and quantitative spatially distributed presentations of inventory uncertainty at the regional level. Problems concerning uncertainty and verification of the distributed inventory are discussed. A Monte Carlo analysis of uncertainties in the energy sector at the regional level is performed, and a number of simulations concerning the effectiveness of uncertainty reduction in some regions are carried out. Uncertainties in activity data have a considerable influence on overall inventory uncertainty, for example, the inventory uncertainty in the energy sector declines from 3.2 to 2.0% when the uncertainty of energy-related statistical data on fuels combusted in the energy industries declines from 10 to 5%. Within the energy sector, the 'energy industries' subsector has the greatest impact on inventory uncertainty. The relative uncertainty in the energy sector inventory can be reduced from 2.19 to 1.47% if the uncertainty of specific statistical data on fuel consumption decreases from 10 to 5%. The 'energy industries' subsector has the greatest influence in the Donetsk oblast. Reducing the uncertainty of statistical data on electricity generation in just three regions - the Donetsk, Dnipropetrovsk, and Luhansk oblasts - from 7.5 to 4.0% results in a decline from 2.6 to 1.6% in the uncertainty in the national energy sector inventory

  6. Development of a System Analysis Toolkit for Sensitivity Analysis, Uncertainty Propagation, and Estimation of Parameter Distribution

    International Nuclear Information System (INIS)

    Heo, Jaeseok; Kim, Kyung Doo

    2015-01-01

    Statistical approaches to uncertainty quantification and sensitivity analysis are very important in estimating the safety margins for an engineering design application. This paper presents a system analysis and optimization toolkit developed by Korea Atomic Energy Research Institute (KAERI), which includes multiple packages of the sensitivity analysis and uncertainty quantification algorithms. In order to reduce the computing demand, multiple compute resources including multiprocessor computers and a network of workstations are simultaneously used. A Graphical User Interface (GUI) was also developed within the parallel computing framework for users to readily employ the toolkit for an engineering design and optimization problem. The goal of this work is to develop a GUI framework for engineering design and scientific analysis problems by implementing multiple packages of system analysis methods in the parallel computing toolkit. This was done by building an interface between an engineering simulation code and the system analysis software packages. The methods and strategies in the framework were designed to exploit parallel computing resources such as those found in a desktop multiprocessor workstation or a network of workstations. Available approaches in the framework include statistical and mathematical algorithms for use in science and engineering design problems. Currently the toolkit has 6 modules of the system analysis methodologies: deterministic and probabilistic approaches of data assimilation, uncertainty propagation, Chi-square linearity test, sensitivity analysis, and FFTBM

  7. Development of a System Analysis Toolkit for Sensitivity Analysis, Uncertainty Propagation, and Estimation of Parameter Distribution

    Energy Technology Data Exchange (ETDEWEB)

    Heo, Jaeseok; Kim, Kyung Doo [KAERI, Daejeon (Korea, Republic of)

    2015-05-15

    Statistical approaches to uncertainty quantification and sensitivity analysis are very important in estimating the safety margins for an engineering design application. This paper presents a system analysis and optimization toolkit developed by Korea Atomic Energy Research Institute (KAERI), which includes multiple packages of the sensitivity analysis and uncertainty quantification algorithms. In order to reduce the computing demand, multiple compute resources including multiprocessor computers and a network of workstations are simultaneously used. A Graphical User Interface (GUI) was also developed within the parallel computing framework for users to readily employ the toolkit for an engineering design and optimization problem. The goal of this work is to develop a GUI framework for engineering design and scientific analysis problems by implementing multiple packages of system analysis methods in the parallel computing toolkit. This was done by building an interface between an engineering simulation code and the system analysis software packages. The methods and strategies in the framework were designed to exploit parallel computing resources such as those found in a desktop multiprocessor workstation or a network of workstations. Available approaches in the framework include statistical and mathematical algorithms for use in science and engineering design problems. Currently the toolkit has 6 modules of the system analysis methodologies: deterministic and probabilistic approaches of data assimilation, uncertainty propagation, Chi-square linearity test, sensitivity analysis, and FFTBM.

  8. Best estimate analysis of LOFT L2-5 with CATHARE: uncertainty and sensitivity analysis

    Energy Technology Data Exchange (ETDEWEB)

    JOUCLA, Jerome; PROBST, Pierre [Institute for Radiological Protection and Nuclear Safety, Fontenay-aux-Roses (France); FOUET, Fabrice [APTUS, Versailles (France)

    2008-07-01

    The revision of 10 CFR 50.46 in 1988 made possible the use of best-estimate codes. They may be used in safety demonstration and licensing, provided that uncertainties are added to the relevant output parameters before comparing them with the acceptance criteria. In the safety analysis of the large-break loss-of-coolant accident, it was agreed that the 95th percentile, estimated with a high degree of confidence, should be lower than the acceptance criteria. It appeared necessary to IRSN, the technical support organization of the French Safety Authority, to get more insight into these strategies, which are being developed not only in thermal-hydraulics but in other fields such as neutronics. To estimate the 95th percentile with a high confidence level, we propose to use rank statistics or the bootstrap. Toward the objective of assessing uncertainty, it is useful to determine and classify the main input parameters. We suggest approximating the code by a surrogate model, the Kriging model, which is used to perform a sensitivity analysis with the Sobol' methodology. This paper presents the application of these two new methodologies to the uncertainty and sensitivity analysis of the maximum peak cladding temperature of the LOFT L2-5 test with the CATHARE code. (authors)
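
    The rank-statistics route is usually implemented through Wilks' formula, and the bootstrap route through resampled percentiles; a sketch of both on synthetic peak-cladding-temperature outputs (the Gaussian sample merely stands in for CATHARE runs):

      import numpy as np

      rng = np.random.default_rng(3)

      # Wilks' first-order formula: smallest n with 1 - gamma^n >= beta for a
      # one-sided (gamma = 0.95, beta = 0.95) tolerance limit
      n = int(np.ceil(np.log(1 - 0.95) / np.log(0.95)))
      print("Wilks 95/95 sample size:", n)   # 59; the maximum of the 59 runs
                                             # then bounds the 95th percentile
      # Bootstrap alternative on a set of code outputs (synthetic PCTs [K])
      pct = rng.normal(1100.0, 40.0, n)
      boot = [np.percentile(rng.choice(pct, pct.size), 95) for _ in range(5000)]
      print("95% upper bootstrap bound on the 95th percentile:",
            round(float(np.percentile(boot, 95)), 1))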

  9. Estimating annual bole biomass production using uncertainty analysis

    Science.gov (United States)

    Travis J. Woolley; Mark E. Harmon; Kari B. O' Connell

    2007-01-01

    Two common sampling methodologies coupled with a simple statistical model were evaluated to determine the accuracy and precision of annual bole biomass production (BBP) and inter-annual variability estimates using this type of approach. We performed an uncertainty analysis using Monte Carlo methods in conjunction with radial growth core data from trees in three Douglas...

  10. IAEA Coordinated Research Project on HTGR Reactor Physics, Thermal-hydraulics and Depletion Uncertainty Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Strydom, Gerhard [Idaho National Lab. (INL), Idaho Falls, ID (United States); Bostelmann, F. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-09-01

    The continued development of High Temperature Gas Cooled Reactors (HTGRs) requires verification of HTGR design and safety features with reliable high fidelity physics models and robust, efficient, and accurate codes. The predictive capability of coupled neutronics/thermal-hydraulics and depletion simulations for reactor design and safety analysis can be assessed with sensitivity analysis (SA) and uncertainty analysis (UA) methods. Uncertainty originates from errors in physical data, manufacturing uncertainties, modelling and computational algorithms. (The interested reader is referred to the large body of published SA and UA literature for a more complete overview of the various types of uncertainties, methodologies and results obtained). SA is helpful for ranking the various sources of uncertainty and error in the results of core analyses. SA and UA are required to address cost, safety, and licensing needs and should be applied to all aspects of reactor multi-physics simulation. SA and UA can guide experimental, modelling, and algorithm research and development. Current SA and UA rely either on derivative-based methods such as stochastic sampling methods or on generalized perturbation theory to obtain sensitivity coefficients. Neither approach addresses all needs. In order to benefit from recent advances in modelling and simulation and the availability of new covariance data (nuclear data uncertainties) extensive sensitivity and uncertainty studies are needed for quantification of the impact of different sources of uncertainties on the design and safety parameters of HTGRs. Only a parallel effort in advanced simulation and in nuclear data improvement will be able to provide designers with more robust and well validated calculation tools to meet design target accuracies. In February 2009, the Technical Working Group on Gas-Cooled Reactors (TWG-GCR) of the International Atomic Energy Agency (IAEA) recommended that the proposed Coordinated Research Program (CRP) on

  11. Uncertainty analysis of suppression pool heating during an ATWS in a BWR-5 plant

    International Nuclear Information System (INIS)

    Wulff, W.; Cheng, H.S.; Mallen, A.N.; Johnsen, G.W.; Lellouche, G.S.

    1994-03-01

    The uncertainty in predicting the peak temperature in the suppression pool of a BWR power plant undergoing an NRC-postulated Anticipated Transient Without Scram (ATWS) has been estimated. The ATWS is initiated by recirculation-pump trips and then leads to power and flow oscillations as occurred at the LaSalle-2 Power Station in March of 1988. After limit-cycle oscillations have been established, the turbines are tripped, but without MSIV closure, allowing steam discharge through the turbine bypass into the condenser. Postulated operator actions, namely to lower the reactor vessel pressure and the level elevation in the downcomer, are simulated by a robot model which accounts for operator uncertainty. All balance-of-plant and control-system modeling uncertainties were part of the statistical uncertainty analysis, which was patterned after the Code Scaling, Applicability and Uncertainty (CSAU) evaluation methodology. The analysis showed that the predicted suppression-pool peak temperature of 329.3 K (133 degrees F) has a 95-percentile uncertainty of 14.4 K (26 degrees F), and that the size of this uncertainty bracket is dominated by the experimental uncertainty of measuring Safety and Relief Valve mass flow rates under critical-flow conditions. The analysis also showed that the probability of exceeding the suppression-pool temperature limit of 352.6 K (175 degrees F) is most likely zero (it is estimated as < 5×10⁻⁴). The peak pool temperature, increased by the square root of the sum of the squares of all the computed uncertainties, is 350.7 K (171.6 degrees F).

  12. Joint analysis of input and parametric uncertainties in watershed water quality modeling: A formal Bayesian approach

    Science.gov (United States)

    Han, Feng; Zheng, Yi

    2018-06-01

    Significant input uncertainty is a major source of error in watershed water quality (WWQ) modeling. It remains challenging to address the input uncertainty in a rigorous Bayesian framework. This study develops the Bayesian Analysis of Input and Parametric Uncertainties (BAIPU), an approach for the joint analysis of input and parametric uncertainties through a tight coupling of Markov Chain Monte Carlo (MCMC) analysis and Bayesian Model Averaging (BMA). The formal likelihood function for this approach is derived considering a lag-1 autocorrelated, heteroscedastic, and Skew Exponential Power (SEP) distributed error model. A series of numerical experiments were performed based on a synthetic nitrate pollution case and on a real study case in the Newport Bay Watershed, California. The Soil and Water Assessment Tool (SWAT) and Differential Evolution Adaptive Metropolis (DREAM(ZS)) were used as the representative WWQ model and MCMC algorithm, respectively. The major findings include the following: (1) the BAIPU can be implemented and used to appropriately identify the uncertain parameters and characterize the predictive uncertainty; (2) the compensation effect between the input and parametric uncertainties can seriously mislead modeling-based management decisions if the input uncertainty is not explicitly accounted for; (3) the BAIPU accounts for the interaction between the input and parametric uncertainties and therefore provides more accurate calibration and uncertainty results than a sequential analysis of the uncertainties; and (4) the BAIPU quantifies the credibility of different input assumptions on a statistical basis and can be implemented as an effective inverse modeling approach to the joint inference of parameters and inputs.
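
    A simplified sketch of such an error-model log-likelihood, with a Gaussian density substituted for the Skew Exponential Power to keep it short (the first-observation term is omitted, and all parameter names and values are illustrative):

      import numpy as np

      def log_likelihood(obs, sim, phi=0.6, s0=0.1, s1=0.05):
          # Heteroscedastic std dev grows with the simulated value
          sigma = s0 + s1 * np.abs(sim)
          eta = (obs - sim) / sigma             # standardized residuals
          innov = eta[1:] - phi * eta[:-1]      # strip lag-1 autocorrelation
          var_i = 1.0 - phi ** 2                # innovation variance, eta ~ N(0,1)
          return (-0.5 * np.sum(innov ** 2) / var_i
                  - 0.5 * innov.size * np.log(2 * np.pi * var_i)
                  - np.sum(np.log(sigma[1:])))  # Jacobian of standardization

      obs = np.array([2.1, 2.5, 2.2, 3.0, 2.8])
      sim = np.array([2.0, 2.4, 2.4, 2.9, 2.6])
      print(log_likelihood(obs, sim))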

  13. Thermal-Hydraulic Analysis for SBLOCA in OPR1000 and Evaluation of Uncertainty for PSA

    International Nuclear Information System (INIS)

    Kim, Tae Jin; Park, Goon Cherl

    2012-01-01

    Probabilistic Safety Assessment (PSA) is a mathematical tool for evaluating numerical estimates of risk for nuclear power plants (NPPs). But PSA has problems with quality and reliability, since the quantification of uncertainties from thermal-hydraulic (TH) analysis has not been included in the quantification of the overall uncertainties in PSA. Previous research proved that quantifying the uncertainties from best-estimate LBLOCA analysis can improve PSA quality by modifying the core damage frequency (CDF) from the existing PSA report. Based on a similar concept, this study considers the quantification of SBLOCA analysis results. In this study, however, operator error parameters are also included in addition to the phenomenological parameters considered in the LBLOCA analysis.

  14. Quantification of margins and uncertainties: Alternative representations of epistemic uncertainty

    International Nuclear Information System (INIS)

    Helton, Jon C.; Johnson, Jay D.

    2011-01-01

    In 2001, the National Nuclear Security Administration of the U.S. Department of Energy in conjunction with the national security laboratories (i.e., Los Alamos National Laboratory, Lawrence Livermore National Laboratory and Sandia National Laboratories) initiated development of a process designated Quantification of Margins and Uncertainties (QMU) for the use of risk assessment methodologies in the certification of the reliability and safety of the nation's nuclear weapons stockpile. A previous presentation, 'Quantification of Margins and Uncertainties: Conceptual and Computational Basis,' describes the basic ideas that underlie QMU and illustrates these ideas with two notional examples that employ probability for the representation of aleatory and epistemic uncertainty. The current presentation introduces and illustrates the use of interval analysis, possibility theory and evidence theory as alternatives to the use of probability theory for the representation of epistemic uncertainty in QMU-type analyses. The following topics are considered: the mathematical structure of alternative representations of uncertainty, alternative representations of epistemic uncertainty in QMU analyses involving only epistemic uncertainty, and alternative representations of epistemic uncertainty in QMU analyses involving a separation of aleatory and epistemic uncertainty. Analyses involving interval analysis, possibility theory and evidence theory are illustrated with the same two notional examples used in the presentation indicated above to illustrate the use of probability to represent aleatory and epistemic uncertainty in QMU analyses.

  15. Risk uncertainty analysis methods for NUREG-1150

    International Nuclear Information System (INIS)

    Benjamin, A.S.; Boyd, G.J.

    1987-01-01

    Evaluation and display of risk uncertainties for NUREG-1150 constitute a principal focus of the Severe Accident Risk Rebaselining/Risk Reduction Program (SARRP). The principal objectives of the uncertainty evaluation are: (1) to provide a quantitative estimate that reflects, for those areas considered, a credible and realistic range of uncertainty in risk; (2) to rank the various sources of uncertainty with respect to their importance for various measures of risk; and (3) to characterize the state of understanding of each aspect of the risk assessment for which major uncertainties exist. This paper describes the methods developed to fulfill these objectives.

  16. Global sensitivity analysis in wastewater treatment plant model applications: Prioritizing sources of uncertainty

    DEFF Research Database (Denmark)

    Sin, Gürkan; Gernaey, Krist; Neumann, Marc B.

    2011-01-01

    This study demonstrates the usefulness of global sensitivity analysis in wastewater treatment plant (WWTP) design to prioritize sources of uncertainty and quantify their impact on performance criteria. The study, which is performed with the Benchmark Simulation Model no. 1 plant design, complements a previous paper on input uncertainty characterisation and propagation (Sin et al., 2009). A sampling-based sensitivity analysis is conducted to compute standardized regression coefficients. It was found that this method is able to decompose satisfactorily the variance of plant performance criteria. The analysis provides insight into devising useful ways for reducing uncertainties in the plant performance. This information can help engineers design robust WWTP plants.
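
    A sampling-based sensitivity analysis with standardized regression coefficients (SRCs) can be sketched in a few lines of Python. The three inputs and the linear stand-in for the plant model below are purely illustrative; a real study would run the Benchmark Simulation Model itself:

        import numpy as np

        rng = np.random.default_rng(1)

        # Monte Carlo sample of three illustrative model inputs
        X = rng.uniform([0.5, 10.0, 0.01], [1.5, 30.0, 0.05], size=(500, 3))
        # Stand-in "plant performance" output (a real study runs the WWTP model)
        y = 2.0 * X[:, 0] - 0.1 * X[:, 1] + 40.0 * X[:, 2] + rng.normal(0, 0.1, 500)

        # Standardize inputs and output, then fit a linear regression;
        # the coefficients of the standardized fit are the SRCs.
        Xs = (X - X.mean(0)) / X.std(0)
        ys = (y - y.mean()) / y.std()
        src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)

        print("standardized regression coefficients:", src)
        print("R^2:", 1 - ((ys - Xs @ src)**2).sum() / (ys**2).sum())

    The squared SRCs approximately decompose the output variance when the input-output relationship is close to linear, which is what the R2 diagnostic checks.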

  17. Uncertainty analysis comes to integrated assessment models for climate change…and conversely

    NARCIS (Netherlands)

    Cooke, R.M.

    2012-01-01

    This article traces the development of uncertainty analysis through three generations punctuated by large methodology investments in the nuclear sector. Driven by a very high perceived legitimation burden, these investments aimed at strengthening the scientific basis of uncertainty quantification.

  18. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for deposited material and external doses. Volume 2: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Boardman, J. [AEA Technology (United Kingdom); Jones, J.A. [National Radiological Protection Board (United Kingdom); Harper, F.T.; Young, M.L. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA deposited material and external dose models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on deposited material and external doses, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  19. Uncertainty Analysis of the Temperature–Resistance Relationship of Temperature Sensing Fabric

    Directory of Open Access Journals (Sweden)

    Muhammad Dawood Husain

    2016-11-01

    Full Text Available This paper reports the uncertainty analysis of the temperature–resistance (TR) data of the newly developed temperature sensing fabric (TSF), a double-layer knitted structure fabricated on an electronic flat-bed knitting machine, made of polyester as a basal yarn and embedded with fine metallic wire as the sensing element. The measurement principle of the TSF is identical to that of a resistance temperature detector (RTD); that is, change in resistance due to change in temperature. The regression uncertainty (uncertainty within repeats) and repeatability uncertainty (uncertainty among repeats) were estimated by analysing more than 300 TR experimental repeats of 50 TSF samples. The experiments were performed under dynamic heating and cooling environments on a purpose-built test rig within the temperature range of 20–50 °C. The continuous experimental data were recorded through a LabVIEW-based graphical user interface. The results showed that temperature and resistance values were not only repeatable but also reproducible, with only minor variations. The regression uncertainty was found to be less than ±0.3 °C; the TSF samples made of Ni and W wires showed regression uncertainty of <±0.13 °C in comparison to Cu-based TSF samples (>±0.18 °C). The cooling TR data showed considerably reduced uncertainty values (±0.07 °C) in comparison with the heating TR data (±0.24 °C). The repeatability uncertainty was found to be less than ±0.5 °C. By increasing the number of samples and repeats, the uncertainties may be reduced further. The TSF could be used for continuous measurement of the temperature profile on the surface of the human body.
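
    To make the idea of regression uncertainty concrete, here is a minimal Python sketch, using synthetic data and illustrative RTD-like constants rather than the paper's measurements, that fits the linear TR relationship and expresses the residual scatter in temperature units:

        import numpy as np

        rng = np.random.default_rng(0)

        # Synthetic temperature-resistance repeat (RTD-like linear response)
        temp = np.linspace(20.0, 50.0, 61)        # deg C
        r0, alpha = 100.0, 0.004                  # illustrative RTD constants
        res = r0 * (1.0 + alpha * temp) + rng.normal(0, 0.01, temp.size)

        # Least-squares fit of R = a + b*T
        A = np.vstack([np.ones_like(temp), temp]).T
        (a, b), sse = np.linalg.lstsq(A, res, rcond=None)[:2]

        # Within-repeat (regression) uncertainty in temperature units:
        # residual std dev of R divided by the sensitivity dR/dT.
        s_res = np.sqrt(sse[0] / (temp.size - 2))
        print(f"regression uncertainty ~ +/-{1.96 * s_res / b:.3f} deg C")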

  20. Uncertainty Prediction in Passive Target Motion Analysis

    Science.gov (United States)

    2016-05-12

    Excerpt from U.S. patent application 15/152,696, "Uncertainty Prediction in Passive Target Motion Analysis," filed 12 May 2016; inventor John G. Baylog et al. The invention concerns a target at an unknown location following an unknown course relative to an observer, where the observer has a sensor array such as a passive sonar or radar.

  1. Computational Fluid Dynamics Uncertainty Analysis Applied to Heat Transfer over a Flat Plate

    Science.gov (United States)

    Groves, Curtis Edward; Ilie, Marcel; Schallhorn, Paul A.

    2013-01-01

    There have been few discussions on using Computational Fluid Dynamics (CFD) without experimental validation. Pairing experimental data, uncertainty analysis, and analytical predictions provides a comprehensive approach to verification and is the current state of the art. With pressed budgets, collecting experimental data is rare or non-existent. This paper investigates and proposes a method to perform CFD uncertainty analysis from computational data alone. The method uses current CFD uncertainty techniques coupled with the Student-t distribution to predict the heat transfer coefficient over a flat plate. The inputs to the CFD model are varied by a specified tolerance or bias error, and the differences in the results are used to estimate the uncertainty. The variation in each input is ranked from least to greatest to determine the order of importance. The results are compared to heat transfer correlations and conclusions drawn about the feasibility of using CFD without experimental data. The results provide a tactic to analytically estimate the uncertainty in a CFD model when experimental data are unavailable.
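
    The core statistical step can be sketched as follows in Python; the heat-transfer-coefficient values are illustrative stand-ins for outputs of CFD runs with inputs perturbed within their tolerance or bias bounds:

        import numpy as np
        from scipy import stats

        # Heat transfer coefficients from N CFD runs with perturbed inputs
        # (illustrative values, W/m^2-K)
        h = np.array([52.1, 50.8, 53.4, 51.6, 52.9, 51.2, 52.4])

        n = h.size
        mean, s = h.mean(), h.std(ddof=1)
        # Two-sided 95% interval using the Student-t distribution (n-1 dof),
        # appropriate because the sample of runs is small.
        half_width = stats.t.ppf(0.975, n - 1) * s / np.sqrt(n)
        print(f"h = {mean:.1f} +/- {half_width:.1f} W/m^2-K (95%)")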

  2. Uncertainty Analysis Framework - Hanford Site-Wide Groundwater Flow and Transport Model

    Energy Technology Data Exchange (ETDEWEB)

    Cole, Charles R.; Bergeron, Marcel P.; Murray, Christopher J.; Thorne, Paul D.; Wurstner, Signe K.; Rogers, Phillip M.

    2001-11-09

    Pacific Northwest National Laboratory (PNNL) embarked on a new initiative to strengthen the technical defensibility of the predictions being made with a site-wide groundwater flow and transport model at the U.S. Department of Energy Hanford Site in southeastern Washington State. In FY 2000, the focus of the initiative was on the characterization of major uncertainties in the current conceptual model that would affect model predictions. The long-term goals of the initiative are the development and implementation of an uncertainty estimation methodology in future assessments and analyses using the site-wide model. This report focuses on the development and implementation of an uncertainty analysis framework.

  3. Stakeholder-driven multi-attribute analysis for energy project selection under uncertainty

    International Nuclear Information System (INIS)

    Read, Laura; Madani, Kaveh; Mokhtari, Soroush; Hanks, Catherine

    2017-01-01

    In practice, selecting an energy project for development requires balancing criteria and competing stakeholder priorities to identify the best alternative. Energy source selection can be modeled as a multi-criteria decision-making problem to provide quantitative support for reconciling technical, economic, environmental, social, and political factors with respect to the stakeholders' interests. Decision making among these complex interactions should also account for the uncertainty present in the input data. In response, this work develops a stochastic decision analysis framework to evaluate alternatives by involving stakeholders to identify both quantitative and qualitative selection criteria and performance metrics which carry uncertainties. The developed framework is illustrated using a case study from Fairbanks, Alaska, where decision makers and residents must decide on a new source of energy for heating and electricity. We approach this problem in a five-step methodology: (1) engaging experts (role players) to develop criteria of project performance; (2) collecting a range of quantitative and qualitative input information to determine the performance of each proposed solution according to the selected criteria; (3) performing a Monte-Carlo analysis to capture uncertainties given in the inputs; (4) applying multi-criteria decision-making, social choice (voting), and fallback bargaining methods to account for three different levels of cooperation among the stakeholders; and (5) computing an aggregate performance index (API) score for each alternative based on its performance across criteria and cooperation levels. API scores communicate relative performance between alternatives. In this way, our methodology maps uncertainty from the input data to reflect risk in the decision and incorporates varying degrees of cooperation into the analysis to identify an optimal and practical alternative. - Highlights: • We develop an applicable stakeholder-driven framework for energy project selection under uncertainty.
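
    Steps (3) and (5) of such a methodology can be sketched in a few lines of Python; the alternatives, criterion weights, and score distributions below are hypothetical stand-ins for the elicited data:

        import numpy as np

        rng = np.random.default_rng(7)

        alternatives = ["natural gas", "coal", "biomass"]
        # Uncertain performance scores (mean), rows = alternatives,
        # columns = [cost, emissions, social acceptance] -- all hypothetical
        mean = np.array([[0.8, 0.6, 0.7],
                         [0.9, 0.2, 0.4],
                         [0.5, 0.8, 0.6]])
        std = 0.1 * np.ones_like(mean)         # score uncertainty
        weights = np.array([0.4, 0.4, 0.2])    # stakeholder criterion weights

        # Monte Carlo over score uncertainty, then aggregate into an API
        samples = rng.normal(mean, std, size=(5000, *mean.shape))
        api = (samples * weights).sum(axis=2).mean(axis=0)

        for name, score in zip(alternatives, api):
            print(f"{name}: API = {score:.2f}")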

  4. Effect of Uncertainties in Physical Property Estimates on Process Design - Sensitivity Analysis

    DEFF Research Database (Denmark)

    Hukkerikar, Amol; Jones, Mark Nicholas; Sin, Gürkan

    Chemical process design calculations require accurate and reliable physical and thermodynamic property data and property models of pure components and their mixtures in order to obtain reliable design parameters which help to achieve desired specifications. The uncertainties in the property values can arise from the experiments themselves or from the property models employed. It is important to consider the effect of these uncertainties on the process design in order to assess the quality and reliability of the final design. The main objective of this work is to develop a systematic methodology for performing sensitivity analysis of process design subject to uncertainties in the property estimates. To this end, first an uncertainty analysis of the property models of pure components and their mixtures was performed in order to obtain the uncertainties in the estimated property values. As a next step, a sensitivity analysis of the process design with respect to these property uncertainties was performed.

  5. Statistical uncertainty analysis of radon transport in nonisothermal, unsaturated soils

    International Nuclear Information System (INIS)

    Holford, D.J.; Owczarski, P.C.; Gee, G.W.; Freeman, H.D.

    1990-10-01

    To accurately predict radon fluxes from soils to the atmosphere, we must know more than the radium content of the soil. Radon flux from soil is affected not only by soil properties, but also by meteorological factors such as air pressure and temperature changes at the soil surface, as well as the infiltration of rainwater. Natural variations in meteorological factors and soil properties contribute to uncertainty in subsurface model predictions of radon flux, which, when coupled with a building transport model, will also add uncertainty to predictions of radon concentrations in homes. A statistical uncertainty analysis using our Rn3D finite-element numerical model was conducted to assess the relative importance of the meteorological factors and soil properties affecting radon transport. 10 refs., 10 figs., 3 tabs.

  6. Two-dimensional cross-section and SED uncertainty analysis for the Fusion Engineering Device (FED)

    International Nuclear Information System (INIS)

    Embrechts, M.J.; Urban, W.T.; Dudziak, D.J.

    1982-01-01

    The theory of two-dimensional cross-section and secondary-energy-distribution (SED) sensitivity was implemented by developing a two-dimensional sensitivity and uncertainty analysis code, SENSIT-2D. Analyses of the Fusion Engineering Device (FED) conceptual inboard shield indicate that, although the calculated uncertainties in the 2-D model are of the same order of magnitude as those resulting from the 1-D model, there might be severe differences. The more complex the geometry, the more compulsory a 2-D analysis becomes. Specific results show that the uncertainty for the integral heating of the toroidal field (TF) coil for the FED is 114.6%. The main contributors to the cross-section uncertainty are chromium and iron. Contributions to the total uncertainty were smaller for nickel, copper, hydrogen, and carbon. All analyses were performed with the Los Alamos 42-group cross-section library generated from ENDF/B-V data and the COVFILS covariance matrix library. The large uncertainties due to chromium result mainly from large covariances for the chromium total and elastic scattering cross sections.

  7. Operational Implementation of a Pc Uncertainty Construct for Conjunction Assessment Risk Analysis

    Science.gov (United States)

    Newman, Lauri K.; Hejduk, Matthew D.; Johnson, Lauren C.

    2016-01-01

    Earlier this year the NASA Conjunction Assessment and Risk Analysis (CARA) project presented the theoretical and algorithmic aspects of a method to include the uncertainties in the calculation inputs when computing the probability of collision (Pc) between two space objects, principally uncertainties in the covariances and the hard-body radius. The output of this approach is not a single Pc value but an entire probability density function representing the range of possible Pc values given the uncertainties in the inputs, bringing CA risk analysis methodologies more in line with modern risk management theory. The present study provides results from exercising this method against an extended dataset of satellite conjunctions in order to determine the effect of its use on the evaluation of conjunction assessment (CA) event risk posture. The effects are found to be considerable: a good number of events are downgraded from or upgraded to a serious risk designation on the basis of the Pc uncertainty. The findings counsel the integration of the developed methods into NASA CA operations.

  8. Micropollutants throughout an integrated urban drainage model: Sensitivity and uncertainty analysis

    Science.gov (United States)

    Mannina, Giorgio; Cosenza, Alida; Viviani, Gaspare

    2017-11-01

    The paper presents the sensitivity and uncertainty analysis of an integrated urban drainage model which includes micropollutants. Specifically, a bespoke integrated model developed in previous studies has been modified in order to include the assessment of a micropollutant (namely, sulfamethoxazole - SMX). The model also takes into account the interactions between the three components of the system: sewer system (SS), wastewater treatment plant (WWTP) and receiving water body (RWB). The analysis has been applied to an experimental catchment near Palermo (Italy): the Nocella catchment. Overall, five scenarios, each characterized by a different uncertainty combination of the sub-systems (i.e., SS, WWTP and RWB), have been considered, applying the Extended-FAST method for the sensitivity analysis in order to select the key factors affecting the RWB quality and to design a reliable/useful experimental campaign. Results have demonstrated that sensitivity analysis is a powerful tool for increasing operator confidence in the modelling results. The approach adopted here can be used for blocking some non-identifiable factors, thus wisely modifying the structure of the model and reducing the related uncertainty. The model factors related to the SS have been found to be the most relevant factors affecting SMX modeling in the RWB when all model factors (scenario 1) or the model factors of the SS (scenarios 2 and 3) are varied. If only the factors related to the WWTP are changed (scenarios 4 and 5), the SMX concentration in the RWB is mainly influenced (up to 95% of the total variance for S_SMX,max) by the aerobic sorption coefficient. A progressive uncertainty reduction from upstream to downstream was found for the soluble fraction of SMX in the RWB.
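
    An Extended-FAST analysis of this general shape can be sketched with the open-source SALib package (an assumption; the record does not name the software used). The three factors and the algebraic stand-in for the integrated model are purely illustrative:

        import numpy as np
        from SALib.sample import fast_sampler
        from SALib.analyze import fast

        # Three illustrative model factors (stand-ins for SS/WWTP/RWB parameters)
        problem = {
            "num_vars": 3,
            "names": ["sorption_coeff", "decay_rate", "runoff_coeff"],
            "bounds": [[0.1, 1.0], [0.01, 0.2], [0.2, 0.9]],
        }

        X = fast_sampler.sample(problem, 1000)   # Extended-FAST design

        # Toy stand-in for the integrated model's peak SMX concentration
        Y = 0.7 * X[:, 0] + 0.1 * X[:, 1] + 0.3 * X[:, 0] * X[:, 2]

        Si = fast.analyze(problem, Y)            # first-order and total indices
        print(dict(zip(problem["names"], np.round(Si["S1"], 3))))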

  9. Estimates of Uncertainties in Analysis of Positron Lifetime Spectra for Metals

    DEFF Research Database (Denmark)

    Eldrup, Morten Mostgaard; Huang, Y. M.; McKee, B. T. A.

    1978-01-01

    The effects of uncertainties and errors in various constraints used in the analysis of multi-component lifetime spectra of positrons annihilating in metals containing defects have been investigated in detail using computer-simulated decay spectra and subsequent analysis. It is found that the errors in the fitted values of the main components' lifetimes and intensities introduced from incorrect values of the instrumental resolution function and of the source-surface components can easily exceed the statistical uncertainties. The effect of an incorrect resolution function may be reduced by excluding the peak regions of the spectra from the analysis. The influence of using incorrect source-surface components in the analysis may on the other hand be reduced by including the peak regions of the spectra. A main conclusion of the work is that extreme caution should be exercised to avoid introducing large errors through the constraints used in the analysis of experimental lifetime data.
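
    For orientation, a minimal Python sketch of a multi-component lifetime fit is shown below. It uses synthetic data, a two-component model, and no resolution-function convolution, so it recovers only the statistical uncertainties, which the study above shows can be exceeded by errors in the analysis constraints:

        import numpy as np
        from scipy.optimize import curve_fit

        rng = np.random.default_rng(2)

        def two_component(t, i1, tau1, tau2):
            # Two-lifetime decay density; intensities sum to 1.
            return (i1 / tau1 * np.exp(-t / tau1)
                    + (1 - i1) / tau2 * np.exp(-t / tau2))

        t = np.linspace(0.0, 2.0, 200)                 # ns
        true = two_component(t, 0.7, 0.16, 0.35)       # bulk + defect lifetimes
        counts = true + rng.normal(0, 0.02, t.size)    # noisy "spectrum"

        popt, pcov = curve_fit(two_component, t, counts, p0=[0.5, 0.1, 0.4])
        perr = np.sqrt(np.diag(pcov))                  # statistical uncertainties
        print("I1, tau1, tau2 =", np.round(popt, 3), "+/-", np.round(perr, 3))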

  10. Uncertainty analysis of light water reactor unit fuel pin cells

    Energy Technology Data Exchange (ETDEWEB)

    Kamerow, S.; Ivanov, K., E-mail: sln107@PSU.EDU, E-mail: kni1@PSU.EDU [Department of Mechanical and Nuclear Engineering, The Pennsylvania State University, PA (United States); Moreno, C. Arenas, E-mail: cristina.arenas@UPC.EDU [Department of Physics and Nuclear Engineering, Technical University of Catalonia, Barcelona (Spain)

    2011-07-01

    The study explored the calculation of uncertainty based on available covariance data and computational tools. Uncertainties due to temperature changes and different fuel compositions are the main focus of this analysis. Selected unit fuel pin cells were analyzed according to the OECD LWR UAM benchmark specifications. Criticality and uncertainty analyses were performed using the TSUNAMI-1D sequence in SCALE 6.0. It was found that uncertainties increase with increasing temperature while k{sub eff} decreases. This increase in the uncertainty is due to the increase in sensitivity of the largest contributor to uncertainty, namely the nuclide reaction {sup 238}U (n, gamma). The sensitivity grew larger as the capture cross-section of {sup 238}U expanded due to Doppler broadening. In addition, three different fuel cell compositions (UOx, MOx, and UOxGd{sub 2}O{sub 3}) were analyzed, showing a remarkable increase in the uncertainty in k{sub eff} for the MOx and UOxGd{sub 2}O{sub 3} fuel cells. The increase in the uncertainty of k{sub eff} in UOxGd{sub 2}O{sub 3} fuel was nearly twice that in MOx fuel and almost four times that in UOx fuel. The components of the uncertainties in k{sub eff} in each case were examined, and it was found that the neutron-nuclide reaction of {sup 238}U, mainly (n,n'), contributed the most to the uncertainties in the MOx and UOxGd{sub 2}O{sub 3} cases. At higher energy, the covariance matrix of the {sup 238}U (n,n') to {sup 238}U (n,n') cross-sections showed very large values. Further examination of the UOxGd{sub 2}O{sub 3} case found that {sup 238}U (n,n') became the dominant contributor to the uncertainty because most of the thermal neutrons in the cell were absorbed by gadolinium, thus shifting the neutron spectrum to higher energy. For the MOx case, on the other hand, {sup 239}Pu has a very strong absorption cross-section at low energy.

  11. Propagation of uncertainty and sensitivity analysis in an integral oil-gas plume model

    KAUST Repository

    Wang, Shitao

    2016-05-27

    Polynomial Chaos expansions are used to analyze uncertainties in an integral oil-gas plume model simulating the Deepwater Horizon oil spill. The study focuses on six uncertain input parameters—two entrainment parameters, the gas to oil ratio, two parameters associated with the droplet-size distribution, and the flow rate—that impact the model's estimates of the plume's trap and peel heights, and of its various gas fluxes. The ranges of the uncertain inputs were determined by experimental data. Ensemble calculations were performed to construct polynomial chaos-based surrogates that describe the variations in the outputs due to variations in the uncertain inputs. The surrogates were then used to estimate reliably the statistics of the model outputs, and to perform an analysis of variance. Two experiments were performed to study the impacts of high and low flow rate uncertainties. The analysis shows that in the former case the flow rate is the largest contributor to output uncertainties, whereas in the latter case, with the uncertainty range constrained by a posteriori analyses, the flow rate's contribution becomes negligible. The trap and peel heights uncertainties are then mainly due to uncertainties in the 95% percentile of the droplet size and in the entrainment parameters.
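
    A minimal non-intrusive polynomial chaos surrogate for a single uncertain input can be sketched as follows in Python; the quadratic stand-in for the plume model and all numbers are illustrative, not the study's:

        import numpy as np
        from numpy.polynomial import legendre

        rng = np.random.default_rng(3)

        # One uncertain input (e.g., flow rate) scaled to [-1, 1]
        xi = rng.uniform(-1.0, 1.0, 200)
        # Toy stand-in for a plume-model output (e.g., trap height, m)
        y = 300.0 + 80.0 * xi + 25.0 * xi**2 + rng.normal(0, 2.0, xi.size)

        # Non-intrusive PCE: least-squares fit of Legendre coefficients
        deg = 4
        V = legendre.legvander(xi, deg)          # design matrix of Legendre polys
        coef, *_ = np.linalg.lstsq(V, y, rcond=None)

        # Output statistics follow from the coefficients and polynomial norms:
        # for a uniform input on [-1, 1], E[P_k^2] = 1 / (2k + 1).
        norms = 1.0 / (2 * np.arange(deg + 1) + 1)
        print("surrogate mean    :", coef[0])
        print("surrogate variance:", (coef[1:]**2 * norms[1:]).sum())

    The analysis of variance mentioned above generalizes this idea: with several inputs, the variance splits into contributions from the polynomial terms associated with each input.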

  12. Propagation of uncertainty and sensitivity analysis in an integral oil-gas plume model

    KAUST Repository

    Wang, Shitao; Iskandarani, Mohamed; Srinivasan, Ashwanth; Thacker, W. Carlisle; Winokur, Justin; Knio, Omar

    2016-01-01

    Polynomial Chaos expansions are used to analyze uncertainties in an integral oil-gas plume model simulating the Deepwater Horizon oil spill. The study focuses on six uncertain input parameters—two entrainment parameters, the gas to oil ratio, two parameters associated with the droplet-size distribution, and the flow rate—that impact the model's estimates of the plume's trap and peel heights, and of its various gas fluxes. The ranges of the uncertain inputs were determined by experimental data. Ensemble calculations were performed to construct polynomial chaos-based surrogates that describe the variations in the outputs due to variations in the uncertain inputs. The surrogates were then used to estimate reliably the statistics of the model outputs, and to perform an analysis of variance. Two experiments were performed to study the impacts of high and low flow rate uncertainties. The analysis shows that in the former case the flow rate is the largest contributor to output uncertainties, whereas in the latter case, with the uncertainty range constrained by a posteriori analyses, the flow rate's contribution becomes negligible. The trap and peel heights uncertainties are then mainly due to uncertainties in the 95% percentile of the droplet size and in the entrainment parameters.

  13. A new measure of uncertainty importance based on distributional sensitivity analysis for PSA

    International Nuclear Information System (INIS)

    Han, Seok Jung; Tak, Nam Il; Chun, Moon Hyun

    1996-01-01

    The main objective of the present study is to propose a new measure of uncertainty importance based on distributional sensitivity analysis. The new measure utilizes a metric distance obtained from cumulative distribution functions (cdfs). The measure is evaluated for two cases: one in which the cdf is given by a known analytical distribution, and one in which it is given by an empirical distribution generated by a crude Monte Carlo simulation. To study its applicability, the present measure has been applied to two different cases, and the results are compared with those of three existing methods. The present approach is a useful measure of uncertainty importance based on cdfs; it is simple, and uncertainty importance can be calculated without any complex process. On the basis of the results obtained in the present work, the method is recommended as a tool for the analysis of uncertainty importance.
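
    One way to realize a metric distance between cdfs (a sketch of the general idea, not necessarily the paper's exact measure) is to compare the output cdf with the cdf obtained when one input is fixed, so that an input whose removal shifts the distribution strongly is ranked as important:

        import numpy as np

        def cdf_distance(a, b, grid):
            # Area between two empirical CDFs, evaluated on a common grid.
            Fa = np.searchsorted(np.sort(a), grid, side="right") / a.size
            Fb = np.searchsorted(np.sort(b), grid, side="right") / b.size
            return np.mean(np.abs(Fa - Fb)) * (grid[-1] - grid[0])

        rng = np.random.default_rng(5)
        x1 = rng.lognormal(0.0, 0.5, 5000)      # uncertain input 1
        x2 = rng.lognormal(0.0, 0.2, 5000)      # uncertain input 2
        base = x1 * x2                          # baseline output sample

        grid = np.linspace(0.0, base.max(), 400)
        # Importance of each input: how far the output CDF moves when that
        # input is fixed at its median, removing its uncertainty contribution.
        print("importance of x1:", cdf_distance(base, np.median(x1) * x2, grid))
        print("importance of x2:", cdf_distance(base, x1 * np.median(x2), grid))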

  14. Application of a Novel Dose-Uncertainty Model for Dose-Uncertainty Analysis in Prostate Intensity-Modulated Radiotherapy

    International Nuclear Information System (INIS)

    Jin Hosang; Palta, Jatinder R.; Kim, You-Hyun; Kim, Siyong

    2010-01-01

    Purpose: To analyze dose uncertainty using a previously published dose-uncertainty model, and to assess potential dosimetric risks existing in prostate intensity-modulated radiotherapy (IMRT). Methods and Materials: The dose-uncertainty model provides a three-dimensional (3D) dose-uncertainty distribution at a given confidence level (CL). For 8 retrospectively selected patients, dose-uncertainty maps were constructed using the dose-uncertainty model at the 95% CL. In addition to uncertainties inherent to the radiation treatment planning system, four scenarios of spatial errors were considered: machine only (S1), S1 + intrafraction, S1 + interfraction, and S1 + both intrafraction and interfraction errors. To evaluate the potential risks of the IMRT plans, three dose-uncertainty-based plan evaluation tools were introduced: the confidence-weighted dose-volume histogram, the confidence-weighted dose distribution, and the dose-uncertainty-volume histogram. Results: Dose uncertainty caused by interfraction setup error was more significant than that caused by intrafraction motion error. The maximum dose uncertainty (95% confidence) of the clinical target volume (CTV) was smaller than 5% of the prescribed dose in all but two cases (13.9% and 10.2%). The dose uncertainty for 95% of the CTV volume ranged from 1.3% to 2.9% of the prescribed dose. Conclusions: The dose uncertainty in prostate IMRT could be evaluated using the dose-uncertainty model. Prostate IMRT plans satisfying the same plan objectives can generate significantly different dose uncertainties because of a complex interplay of many uncertainty sources. The uncertainty-based plan evaluation contributes to generating reliable and error-resistant treatment plans.

  15. Uncertainty Assessment of Hydrological Frequency Analysis Using Bootstrap Method

    Directory of Open Access Journals (Sweden)

    Yi-Ming Hu

    2013-01-01

    Full Text Available The hydrological frequency analysis (HFA) is the foundation for hydraulic engineering design and water resources management. Hydrological extreme observations or samples are the basis for HFA; the representativeness of a sample series of the population distribution is extremely important for the estimation reliability of the hydrological design value or quantile. However, for most hydrological extreme data obtained in practical applications, the sample size is usually small, for example, about 40-50 years in China. Generally, samples of small size cannot completely reflect the statistical properties of the population distribution, thus leading to uncertainties in the estimation of hydrological design values. In this paper, a new method based on the bootstrap is put forward to analyze the impact of sampling uncertainty on the design value. Using the bootstrap resampling technique, a large number of bootstrap samples are constructed from the original flood extreme observations; the corresponding design value or quantile is estimated for each bootstrap sample, so that the sampling distribution of the design value is constructed; based on this sampling distribution, the uncertainty of the quantile estimation can be quantified. Compared with the conventional approach, this method provides not only the point estimate of a design value but also a quantitative evaluation of the uncertainties of that estimate.
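
    The bootstrap procedure can be sketched in a few lines of Python. The flood series is synthetic, and for simplicity the design value is taken as an empirical quantile, whereas a real HFA would fit a distribution to each bootstrap sample:

        import numpy as np

        rng = np.random.default_rng(11)

        # Illustrative annual-maximum flood series, ~45 years (m^3/s)
        flows = rng.gumbel(loc=1000.0, scale=300.0, size=45)

        # Bootstrap the "100-year" design value (the 0.99 quantile here)
        B = 5000
        boot = rng.choice(flows, size=(B, flows.size), replace=True)
        q100 = np.quantile(boot, 0.99, axis=1)

        lo, hi = np.quantile(q100, [0.025, 0.975])
        print(f"100-yr design flow: {np.quantile(flows, 0.99):.0f} m^3/s "
              f"(95% sampling interval {lo:.0f}-{hi:.0f})")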

  16. Applications of uncertainty analysis to visual evaluation of density in radiographs

    International Nuclear Information System (INIS)

    Uchida, Suguru; Ohtsuka, Akiyoshi; Fujita, Hiroshi.

    1981-01-01

    Uncertainty analysis, developed as a method of absolute judgment in psychology, is applied to a method of radiographic image evaluation with perceptual fluctuations and to an examination of the visual evaluation of density in radiographs. The subjects comprise three groups: four neurosurgeons, four radiologic technologists, and four nonprofessionals. Using a five-category rating scale, each observer is directed to classify 255 radiographs randomly presented without feedback. The characteristics of each observer and each group can be shown quantitatively by the calculated information values. It is also shown that bivariate uncertainty analysis (the entropy method) can be used to calculate the degree of agreement of the evaluations. (author)

  17. Applications of uncertainty analysis to visual evaluation of density in radiographs

    Energy Technology Data Exchange (ETDEWEB)

    Uchida, S [Gifu Univ. (Japan); Ohtsuka, A; Fujita, H

    1981-03-01

    Uncertainty analysis, developed as a method of absolute judgment in psychology, is applied to a method of radiographic image evaluation with perceptual fluctuations and to an examination of the visual evaluation of density in radiographs. The subjects comprise three groups: four neurosurgeons, four radiologic technologists, and four nonprofessionals. Using a five-category rating scale, each observer is directed to classify 255 radiographs randomly presented without feedback. The characteristics of each observer and each group can be shown quantitatively by the calculated information values. It is also shown that bivariate uncertainty analysis (the entropy method) can be used to calculate the degree of agreement of the evaluations.

  18. Similarity and uncertainty analysis of the ALLEGRO MOX core

    International Nuclear Information System (INIS)

    Vrban, B.; Hascik, J.; Necas, V.; Slugen, V.

    2015-01-01

    The similarity and uncertainty analysis of the ESNII+ ALLEGRO MOX core has identified specific problems and challenges in the field of neutronic calculations. The similarity assessment identified 9 partly comparable experiments, of which only one reached ck and E values over 0.9. The global integral index G remains low (0.75) and cannot be judged as sufficient. The total uncertainty of the calculated k eff induced by XS data is, according to our calculation, 1.04%. The main contributors to this uncertainty are 239 Pu nubar and 238 U inelastic scattering. The additional margin from uncovered sensitivities was determined to be 0.28%. The identified low number of similar experiments prevents the use of advanced XS adjustment and bias estimation methods. More experimental data are needed, and the presented results may serve as a basic step in the development of the necessary critical assemblies. Although exact data are not presented in the paper, the faster 44-energy-group calculation gives almost the same results in the similarity analysis as the more complex 238-group calculation. Finally, it was demonstrated that the TSUNAMI-IP utility can play a significant role in future fast reactor development in Slovakia and in the Visegrad region. Clearly, further research and development and a strong effort will be needed in order to arrive at a more complete methodology consisting of more plausible covariance data and related quantities. (authors)

  19. Treatment of uncertainties in the IPCC: a philosophical analysis

    Science.gov (United States)

    Jebeile, J.; Drouet, I.

    2014-12-01

    The IPCC produces scientific reports out of findings on climate and climate change. Because the findings are uncertain in many respects, the production of reports requires aggregating assessments of uncertainties of different kinds. This difficult task is currently regulated by the Guidance note for lead authors of the IPCC fifth assessment report on consistent treatment of uncertainties. The note recommends that two metrics—i.e. confidence and likelihood—be used for communicating the degree of certainty in findings. Confidence is expressed qualitatively "based on the type, amount, quality, and consistency of evidence [...] and the degree of agreement", while likelihood is expressed probabilistically "based on statistical analysis of observations or model results, or expert judgment". Therefore, depending on the evidence evaluated, authors can present either an assigned level of confidence or a quantified measure of likelihood. But these two kinds of uncertainty assessment reflect distinct and potentially conflicting methodologies, so the question arises whether the treatment of uncertainties in the IPCC is rationally justified. In order to answer the question, it is worth comparing the IPCC procedures with the formal normative theories of epistemic rationality which have been developed by philosophers. These theories—which include contributions to the philosophy of probability and to Bayesian probabilistic confirmation theory—are relevant for our purpose because they are commonly used to assess the rationality of collective judgement formation based on uncertain knowledge. In this paper we make the comparison and pursue the following objectives: (i) we determine whether the IPCC confidence and likelihood can be compared with the notions of uncertainty targeted by or underlying the formal normative theories of epistemic rationality; (ii) we investigate whether the formal normative theories of epistemic rationality justify the IPCC treatment of uncertainties.

  20. Uncertainty theory

    CERN Document Server

    Liu, Baoding

    2015-01-01

    When no samples are available to estimate a probability distribution, we have to invite some domain experts to evaluate the belief degree that each event will happen. Perhaps some people think that the belief degree should be modeled by subjective probability or fuzzy set theory. However, it is usually inappropriate because both of them may lead to counterintuitive results in this case. In order to rationally deal with belief degrees, uncertainty theory was founded in 2007 and subsequently studied by many researchers. Nowadays, uncertainty theory has become a branch of axiomatic mathematics for modeling belief degrees. This is an introductory textbook on uncertainty theory, uncertain programming, uncertain statistics, uncertain risk analysis, uncertain reliability analysis, uncertain set, uncertain logic, uncertain inference, uncertain process, uncertain calculus, and uncertain differential equation. This textbook also shows applications of uncertainty theory to scheduling, logistics, networks, data mining, c...

  1. Uncertainty analysis of one Main Circulation Pump trip event at the Ignalina NPP

    International Nuclear Information System (INIS)

    Vileiniskis, V.; Kaliatka, A.; Uspuras, E.

    2004-01-01

    A trip of one Main Circulation Pump (MCP) is an anticipated transient with an expected frequency of approximately one event per year, and there have been a few events in which one MCP was inadvertently tripped. The throughput of the remaining running pumps in the affected Main Circulation Circuit loop increased; however, the total coolant flow through the affected loop decreased. The main question is whether this coolant flow rate is sufficient for adequate core cooling. This paper presents an investigation of a one-MCP trip event at the Ignalina NPP. According to international practice, the transient analysis should consist of a deterministic analysis employing best-estimate codes and an uncertainty analysis. For that purpose, the plant's RELAP5 model and the GRS (Germany) System for Uncertainty and Sensitivity Analysis package (SUSA) were employed. An uncertainty analysis of the flow energy losses in different parts of the Main Circulation Circuit, the initial conditions, and code-selected models was performed. Such an analysis makes it possible to estimate the influence of separate parameters on the calculation results and to find the modelling parameters that have the largest impact on the event studied. On the basis of this analysis, recommendations for further improvement of the model have been developed. (author)
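
    Statistical uncertainty analyses of this GRS/SUSA type are commonly sized with Wilks' distribution-free tolerance limits; this is background knowledge rather than a detail stated in the record. The sketch below computes the minimum number of code runs needed for a one-sided tolerance limit:

        import math

        def wilks_runs(gamma=0.95, beta=0.95, order=1):
            # Smallest N such that the `order`-th largest of N code runs
            # bounds the gamma quantile of the output with confidence beta
            # (one-sided, distribution-free Wilks tolerance limit).
            n = order
            while True:
                miss = sum(math.comb(n, k) * (1 - gamma)**k * gamma**(n - k)
                           for k in range(order))
                if 1.0 - miss >= beta:
                    return n
                n += 1

        print(wilks_runs())            # -> 59 (classic 95/95, first order)
        print(wilks_runs(order=2))     # -> 93

    This is why such analyses typically report results from on the order of 59 or 93 best-estimate code runs.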

  2. Error and Uncertainty Analysis for Ecological Modeling and Simulation

    Science.gov (United States)

    2001-12-01

    Abstract not included in the record; what remains are citation fragments from the report's publication list, including: "... nitrate flux to the Gulf of Mexico," Nature (Brief Communication) 414: 166-167 (uncertainty analysis done with SERDP software); Gertner, G., ... D. Goolsby, 2001, "Relating N inputs to the Mississippi River Basin and nitrate flux in the Lower Mississippi River: A comparison of approaches"; "... Journal of Remote Sensing, 25(4): 367-380"; and Wu, J., D.E. Jelinski, M. Luck, and P.T. Tueller, 2000, "Multiscale analysis of landscape heterogeneity: scale ..."

  3. Analysis and evaluation of regulatory uncertainties in 10 CFR 60 subparts B and E

    International Nuclear Information System (INIS)

    Weiner, R.F.; Patrick, W.C.

    1990-01-01

    This paper presents an attribute analysis scheme for prioritizing the resolution of regulatory uncertainties. Attributes are presented which assist in identifying the need for timeliness and durability of the resolution of an uncertainty.

  4. Uncertainty and sensitivity analysis for the simulation of a station blackout scenario in the Jules Horowitz Reactor

    International Nuclear Information System (INIS)

    Ghione, Alberto; Noel, Brigitte; Vinai, Paolo; Demazière, Christophe

    2017-01-01

    Highlights: • A station blackout scenario in the Jules Horowitz Reactor is analyzed using CATHARE. • Input and model uncertainties relevant to the transient are considered. • A statistical methodology for the propagation of the uncertainties is applied. • No safety criteria are exceeded and sufficiently large safety margins are estimated. • The most influential uncertainties are determined with a sensitivity analysis. - Abstract: An uncertainty and sensitivity analysis for the simulation of a station blackout scenario in the Jules Horowitz Reactor (JHR) is presented. The JHR is a new material testing reactor under construction at CEA on the Cadarache site, France. The thermal-hydraulic system code CATHARE is applied to investigate the response of the reactor system to the scenario. The uncertainty and sensitivity study was based on a statistical methodology for code uncertainty propagation, and the 'Uncertainty and Sensitivity' platform URANIE was used. Accordingly, the input uncertainties relevant to the transient were identified, quantified, and propagated to the code output. The results show that the safety criteria are not exceeded and that sufficiently large safety margins exist. In addition, the most influential input uncertainties on the safety parameters were found by making use of a sensitivity analysis.

  5. Stochastic analysis in production process and ecology under uncertainty

    CERN Document Server

    Bieda, Bogusław

    2014-01-01

    The monograph addresses the problem of stochastic analysis based on uncertainty assessment by simulation, and the application of this method in ecology and the steel industry under uncertainty. The first chapter defines the Monte Carlo (MC) method and random variables in stochastic models. Chapter two deals with contaminant transport in porous media; a stochastic approach to modeling the transit time of municipal solid waste contaminants using MC simulation is worked out. The third chapter describes the risk analysis of the waste-to-energy facility proposal for the city of Konin, including the financial aspects. An environmental impact assessment of the ArcelorMittal steel power plant in Kraków is given in chapter four, in which four scenarios of the energy mix production processes were studied. Chapter five contains examples of using ecological Life Cycle Assessment (LCA) - a relatively new method of environmental impact assessment - which help in preparing pro-ecological strategy, and which can lead to reducing t...

  6. Estimates of uncertainties in analysis of positron lifetime spectra for metals

    International Nuclear Information System (INIS)

    Eldrup, M.; Huang, Y.M.; McKee, B.T.A.

    1978-01-01

    The effects of uncertainties and errors in various constraints used in the analysis of multi-component lifetime spectra of positrons annihilating in metals containing defects have been investigated in detail using computer-simulated decay spectra and subsequent analysis. It is found that the errors in the fitted values of the main component lifetimes and intensities introduced from incorrect values of the instrumental resolution function and of the source-surface components can easily exceed the statistical uncertainties. The effect of an incorrect resolution function may be reduced by excluding the peak regions of the spectra from the analysis. The influence of using incorrect source-surface components in the analysis may on the other hand be reduced by including the peak regions of the spectra. A main conclusion of the work is that extreme caution should be exercised to avoid introducing large errors through the constraints used in the analysis of experimental lifetime data. (orig.)

  7. Additional challenges for uncertainty analysis in river engineering

    Science.gov (United States)

    Berends, Koen; Warmink, Jord; Hulscher, Suzanne

    2016-04-01

    the proposed intervention. The implicit assumption underlying such analysis is that both models are commensurable. We hypothesize that they are commensurable only to a certain extent. In an idealised study we have demonstrated that prediction performance loss should be expected with increasingly large engineering works. When accounting for parametric uncertainty of floodplain roughness in model identification, we see uncertainty bounds for the predicted effects of interventions increase with increasing intervention scale. Calibration of these types of models therefore seems to have a shelf-life, beyond which calibration no longer improves prediction. Therefore a qualification scheme for model use is required that can be linked to model validity. In this study, we characterize model use along three dimensions: extrapolation (using the model with different external drivers), extension (using the model for different outputs or indicators), and modification (using modified models). Such use of models is expected to have implications for the applicability of surrogate modelling for efficient uncertainty analysis as well, which is recommended for future research. Warmink, J.J., Straatsma, M.W., Huthoff, F., Booij, M.J. and Hulscher, S.J.M.H., 2013. Uncertainty of design water levels due to combined bed form and vegetation roughness in the Dutch river Waal. Journal of Flood Risk Management 6, 302-318. DOI: 10.1111/jfr3.12014.

  8. Uncertainty budget in internal monostandard NAA for small and large size samples analysis

    International Nuclear Information System (INIS)

    Dasari, K.B.; Acharya, R.

    2014-01-01

    Evaluation of the total uncertainty budget for a determined concentration value is important under a quality assurance programme. Concentration calculation in NAA is carried out by relative NAA or by the k0-based internal monostandard NAA (IM-NAA) method. The IM-NAA method has been used for the analysis of small and large samples of clay potteries. An attempt was made to identify the uncertainty components in IM-NAA, and the uncertainty budget for La in both small and large size samples has been evaluated and compared. (author)

  9. The Impact of Uncertainty on Investment. A Meta-Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Koetse, M.J. [Department of Spatial Economics, Vrije Universiteit Amsterdam (Netherlands); De Groot, Henri L.F. [Tinbergen Institute, Amsterdam (Netherlands); Florax, R.J.G.M. [Department of Agricultural Economics, Purdue University, West Lafayette (United States)

    2006-07-01

    In this paper we perform a meta-analysis of empirical estimates of the relationship between investment and uncertainty. Since the outcomes of primary studies are largely incomparable with respect to the magnitude of the effect, our analysis focuses on the direction and statistical significance of the relationship. The standard approach in this situation is to estimate an ordered probit model on a categorical outcome defined in terms of the direction of the effect. The estimates are transformed into marginal effects, in order to represent the changes in the probability of finding a negative significant, insignificant, or positive significant estimate. Although a meta-analysis generally does not allow for inferences on the correctness of model specifications in primary studies, our results give clear directions for model building in empirical investment research. For example, not including factor prices in investment models may seriously affect the model outcomes. Furthermore, we find that Q models produce more negative significant estimates than other models do, ceteris paribus. The outcome of a study is also affected by the type of data used in the primary study. Although it is clear that meta-analysis cannot always give decisive insights into the explanations for the variation in empirical outcomes, our meta-analysis shows that we can explain to a large extent why empirical estimates of the investment-uncertainty relationship differ.
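
    The step from ordered-probit estimates to marginal effects can be sketched as follows in Python; the coefficients and cutpoints are hypothetical illustrations, not the paper's estimates (the sign of the Q-model coefficient is chosen to mirror the reported direction):

        import numpy as np
        from scipy.stats import norm

        # Hypothetical fitted ordered probit: latent index y* = x @ beta + e;
        # categories: 0 = negative significant, 1 = insignificant,
        #             2 = positive significant.
        beta = np.array([-0.6, 0.4])   # [uses_Q_model, includes_factor_prices]
        cuts = np.array([-0.8, 0.9])   # estimated category cutpoints

        def category_probs(x):
            z = x @ beta
            cdf = norm.cdf(cuts - z)               # P(y* below each cutpoint)
            return np.diff(np.concatenate(([0.0], cdf, [1.0])))

        x0 = np.array([0.0, 1.0])
        # Marginal effect of the "Q model" dummy on each outcome probability
        me = category_probs(x0 + np.array([1.0, 0.0])) - category_probs(x0)
        print("change in P(neg. sig., insig., pos. sig.):", np.round(me, 3))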

  10. Photovoltaic Calibrations at the National Renewable Energy Laboratory and Uncertainty Analysis Following the ISO 17025 Guidelines

    Energy Technology Data Exchange (ETDEWEB)

    Emery, Keith [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2016-09-01

    The measurement of photovoltaic (PV) performance with respect to reference conditions requires measuring current versus voltage for a given tabular reference spectrum, junction temperature, and total irradiance. This report presents the procedures implemented by the PV Cell and Module Performance Characterization Group at the National Renewable Energy Laboratory (NREL) to achieve the lowest practical uncertainty. A rigorous uncertainty analysis of these procedures is presented, which follows the International Organization for Standardization (ISO) Guide to the Expression of Uncertainty in Measurement. This uncertainty analysis is required for the team’s laboratory accreditation under ISO standard 17025, “General Requirements for the Competence of Testing and Calibration Laboratories.” The report also discusses additional areas where the uncertainty can be reduced.
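
    As a sketch of the GUM-style combination underlying such an analysis, the following Python snippet combines uncorrelated relative standard uncertainties by root-sum-of-squares and applies a coverage factor; the component names and magnitudes are illustrative, not NREL's published budget:

        import numpy as np

        # Illustrative relative standard uncertainties (%) for a PV calibration
        components = {
            "reference cell calibration": 0.5,
            "spectral mismatch":          0.3,
            "total irradiance / spatial": 0.2,
            "temperature control":        0.1,
            "data acquisition":           0.05,
        }

        # ISO GUM: combined standard uncertainty for uncorrelated inputs
        u_c = np.sqrt(sum(u**2 for u in components.values()))
        k = 2.0                      # coverage factor for ~95% confidence
        print(f"combined standard uncertainty: {u_c:.2f}%")
        print(f"expanded uncertainty (k=2):    {k * u_c:.2f}%")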

  11. A unified approach for squeal instability analysis of disc brakes with two types of random-fuzzy uncertainties

    Science.gov (United States)

    Lü, Hui; Shangguan, Wen-Bin; Yu, Dejie

    2017-09-01

    Automotive brake systems are always subjected to various types of uncertainties, and two types of random-fuzzy uncertainties may exist in the brakes. In this paper, a unified approach is proposed for the squeal instability analysis of disc brakes with two types of random-fuzzy uncertainties. In the proposed approach, two uncertainty analysis models with mixed variables are introduced to model the random-fuzzy uncertainties. The first one is the random and fuzzy model, in which random variables and fuzzy variables exist simultaneously and independently. The second one is the fuzzy random model, in which uncertain parameters are all treated as random variables while their distribution parameters are expressed as fuzzy numbers. Firstly, the fuzziness is discretized by using the α-cut technique and the two uncertainty analysis models are simplified into random-interval models. Afterwards, by temporarily neglecting the interval uncertainties, the random-interval models are degraded into random models, in which the expectations, variances, reliability indexes and reliability probabilities of the system stability functions are calculated. Then, by reconsidering the interval uncertainties, the bounds of the expectations, variances, reliability indexes and reliability probabilities are computed based on Taylor series expansion. Finally, by recomposing the analysis results at each α-cut level, the fuzzy reliability indexes and probabilities can be obtained, by which the brake squeal instability can be evaluated. The proposed approach gives a general framework for dealing with both types of random-fuzzy uncertainties that may exist in brakes, and its effectiveness is demonstrated by numerical examples. It will be a valuable supplement to the systematic study of brake squeal considering uncertainty.

  12. Sensitivity and Uncertainty Analysis of IAEA CRP HTGR Benchmark Using McCARD

    International Nuclear Information System (INIS)

    Jang, Sang Hoon; Shim, Hyung Jin

    2016-01-01

    The benchmark consists of 4 phases, starting from local standalone modeling (Phase I) to the safety calculation of the coupled system in a transient situation (Phase IV). As a preliminary study of UAM on HTGR, this paper covers Exercises 1 and 2 of Phase I, which define the unit cell and lattice geometry of the MHTGR-350 (General Atomics). The objective of these exercises is to quantify the uncertainty of the multiplication factor induced by perturbing nuclear data, as well as to analyze specific features of the HTGR such as double heterogeneity and self-shielding treatment. The uncertainty quantification of the IAEA CRP HTGR UAM benchmarks was conducted using the first-order AWP method in McCARD. The uncertainty of the multiplication factor was estimated only for the microscopic cross-section perturbation. To reduce the computation time and memory shortage, the recently implemented uncertainty analysis module in the MC Wielandt calculation was adopted. The covariance data of the cross sections were generated by the NJOY/ERRORR module with ENDF/B-VII.1. The numerical results were compared with the evaluation results of the DeCART/MUSAD code system developed by KAERI. IAEA CRP HTGR UAM benchmark problems were analyzed using McCARD, and the numerical results were compared with Serpent for the eigenvalue calculation and with DeCART/MUSAD for the S/U analysis. In the eigenvalue calculation, inconsistencies were found in the results with the ENDF/B-VII.1 cross-section library, and these were found to be the effect of the thermal scattering data of graphite. As to the S/U analysis, McCARD results matched well with DeCART/MUSAD, but showed some discrepancy in 238U capture regarding implicit uncertainty.

  13. Uncertainty analysis in safety assessment

    International Nuclear Information System (INIS)

    Lemos, Francisco Luiz de; Sullivan, Terry

    1997-01-01

    Nuclear waste disposal is a very complex subject which requires the study of many different fields of science, such as hydrogeology, meteorology, and geochemistry. In addition, waste disposal facilities are designed to last for a very long period of time. Both of these conditions fill safety assessment projections with uncertainty. This paper addresses approaches for the treatment of uncertainties in safety assessment modeling due to the variability of data, and some current approaches used to deal with this problem. (author)

  14. Practical Policy Applications of Uncertainty Analysis for National Greenhouse Gas Inventories

    International Nuclear Information System (INIS)

    Gillenwater, M.; Sussman, F.; Cohen, J.

    2007-01-01

    goals precisely in terms of relationships among important variables (such as emissions estimate, commitment level, or statistical confidence); and (3) develop a quantifiable adjustment mechanism that reflects these environmental goals. We recommend that countries implement an investigation-focused (i.e., qualitative) uncertainty analysis that will (1) provide the type of information necessary to develop more substantive, and potentially useful, quantitative uncertainty estimates, regardless of whether those quantitative estimates are used for policy purposes; and (2) provide the information needed to understand the likely causes of uncertainty in inventory data and thereby point to ways to improve inventory quality (i.e., accuracy, transparency, completeness, and consistency).

  15. Practical Policy Applications of Uncertainty Analysis for National Greenhouse Gas Inventories

    Energy Technology Data Exchange (ETDEWEB)

    Gillenwater, M. [Environmental Resources Trust (United States)], E-mail: mgillenwater@ert.net; Sussman, F.; Cohen, J. [ICF International (United States)

    2007-09-15

    goals precisely in terms of relationships among important variables (such as emissions estimate, commitment level, or statistical confidence); and (3) develop a quantifiable adjustment mechanism that reflects these environmental goals. We recommend that countries implement an investigation-focused (i.e., qualitative) uncertainty analysis that will (1) provide the type of information necessary to develop more substantive, and potentially useful, quantitative uncertainty estimates, regardless of whether those quantitative estimates are used for policy purposes; and (2) provide the information needed to understand the likely causes of uncertainty in inventory data and thereby point to ways to improve inventory quality (i.e., accuracy, transparency, completeness, and consistency).

  16. Artificial intelligence metamodel comparison and application to wind turbine airfoil uncertainty analysis

    Directory of Open Access Journals (Sweden)

    Yaping Ju

    2016-05-01

    Full Text Available The Monte Carlo simulation method for turbomachinery uncertainty analysis often requires performing a huge number of simulations, the computational cost of which can be greatly alleviated with the help of metamodeling techniques. An intensive comparative study was performed on the approximation performance of three prospective artificial intelligence metamodels, that is, artificial neural network, radial basis function, and support vector regression. The genetic algorithm was used to optimize the predetermined parameters of each metamodel for the sake of a fair comparison. Through testing on 10 nonlinear functions with different problem scales and sample sizes, the genetic algorithm–support vector regression metamodel was found more accurate and robust than the other two counterparts. Accordingly, the genetic algorithm–support vector regression metamodel was selected and combined with the Monte Carlo simulation method for the uncertainty analysis of a wind turbine airfoil under two types of surface roughness uncertainties. The results show that the genetic algorithm–support vector regression metamodel can capture well the uncertainty propagation from the surface roughness to the airfoil aerodynamic performance. This work is useful to the application of metamodeling techniques in the robust design optimization of turbomachinery.

  17. A task specific uncertainty analysis method for least-squares-based form characterization of ultra-precision freeform surfaces

    International Nuclear Information System (INIS)

    Ren, M J; Cheung, C F; Kong, L B

    2012-01-01

    In the measurement of ultra-precision freeform surfaces, least-squares-based form characterization methods are widely used to evaluate the form error of the measured surfaces. Although many methodologies have been proposed in recent years to improve the efficiency of the characterization process, relatively little research has been conducted on the analysis of the associated uncertainty in the characterization results that may arise from the characterization methods being used. As a result, this paper presents a task-specific uncertainty analysis method with application to the least-squares-based form characterization of ultra-precision freeform surfaces. That is, the associated uncertainty in the form characterization results is estimated when the measured data are extracted from a specific surface with a specific sampling strategy. Three factors are considered in this study: measurement error, surface form error and sample size. The task-specific uncertainty analysis method has been evaluated through a series of experiments. The results show that the method can effectively estimate the uncertainty of the form characterization results for a specific freeform surface measurement.

  18. Uncertainty and Sensitivity Analysis Results Obtained in the 1996 Performance Assessment for the Waste Isolation Pilot Plant

    International Nuclear Information System (INIS)

    Bean, J.E.; Berglund, J.W.; Davis, F.J.; Economy, K.; Garner, J.W.; Helton, J.C.; Johnson, J.D.; MacKinnon, R.J.; Miller, J.; O'Brien, D.G.; Ramsey, J.L.; Schreiber, J.D.; Shinta, A.; Smith, L.N.; Stockman, C.; Stoelzel, D.M.; Vaughn, P.

    1998-01-01

    The Waste Isolation Pilot Plant (WIPP) is located in southeastern New Mexico and is being developed by the U.S. Department of Energy (DOE) for the geologic (deep underground) disposal of transuranic (TRU) waste. A detailed performance assessment (PA) for the WIPP was carried out in 1996 and supports an application by the DOE to the U.S. Environmental Protection Agency (EPA) for the certification of the WIPP for the disposal of TRU waste. The 1996 WIPP PA uses a computational structure that maintains a separation between stochastic (i.e., aleatory) and subjective (i.e., epistemic) uncertainty, with stochastic uncertainty arising from the many possible disruptions that could occur over the 10,000 yr regulatory period that applies to the WIPP and subjective uncertainty arising from the imprecision with which many of the quantities required in the PA are known. Important parts of this structure are (1) the use of Latin hypercube sampling to incorporate the effects of subjective uncertainty, (2) the use of Monte Carlo (i.e., random) sampling to incorporate the effects of stochastic uncertainty, and (3) the efficient use of the necessarily limited number of mechanistic calculations that can be performed to support the analysis. The use of Latin hypercube sampling generates a mapping from imprecisely known analysis inputs to analysis outcomes of interest that provides both a display of the uncertainty in analysis outcomes (i.e., uncertainty analysis) and a basis for investigating the effects of individual inputs on these outcomes (i.e., sensitivity analysis). The sensitivity analysis procedures used in the PA include examination of scatterplots, stepwise regression analysis, and partial correlation analysis. Uncertainty and sensitivity analysis results obtained as part of the 1996 WIPP PA are presented and discussed. Specific topics considered include two-phase flow in the vicinity of the repository, radionuclide release from the repository, fluid flow and radionuclide
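
    A minimal illustration of the Latin hypercube idea used in the PA (not the WIPP code itself): each of n strata per variable is sampled exactly once, which stabilizes output statistics at small sample sizes. The mapping to epistemic input distributions below is an invented example.

```python
import numpy as np
from scipy.stats import norm, qmc

sampler = qmc.LatinHypercube(d=3, seed=0)   # three imprecisely known inputs
u = sampler.random(n=100)                   # 100 stratified points in the unit cube

# map the stratified uniforms to assumed epistemic distributions
log_perm = qmc.scale(u[:, [0]], -21.0, -17.0).ravel()   # log10 permeability
porosity = qmc.scale(u[:, [1]], 0.05, 0.25).ravel()     # uniform porosity
model_bias = norm.ppf(u[:, 2], loc=0.0, scale=1.0)      # normal bias term

print(log_perm[:3], porosity[:3], model_bias[:3])
```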

  19. Uncertainty and Sensitivity Analysis Results Obtained in the 1996 Performance Assessment for the Waste Isolation Pilot Plant

    Energy Technology Data Exchange (ETDEWEB)

    Bean, J.E.; Berglund, J.W.; Davis, F.J.; Economy, K.; Garner, J.W.; Helton, J.C.; Johnson, J.D.; MacKinnon, R.J.; Miller, J.; O'Brien, D.G.; Ramsey, J.L.; Schreiber, J.D.; Shinta, A.; Smith, L.N.; Stockman, C.; Stoelzel, D.M.; Vaughn, P.

    1998-09-01

    The Waste Isolation Pilot Plant (WIPP) is located in southeastern New Mexico and is being developed by the U.S. Department of Energy (DOE) for the geologic (deep underground) disposal of transuranic (TRU) waste. A detailed performance assessment (PA) for the WIPP was carried out in 1996 and supports an application by the DOE to the U.S. Environmental Protection Agency (EPA) for the certification of the WIPP for the disposal of TRU waste. The 1996 WIPP PA uses a computational structure that maintains a separation between stochastic (i.e., aleatory) and subjective (i.e., epistemic) uncertainty, with stochastic uncertainty arising from the many possible disruptions that could occur over the 10,000 yr regulatory period that applies to the WIPP and subjective uncertainty arising from the imprecision with which many of the quantities required in the PA are known. Important parts of this structure are (1) the use of Latin hypercube sampling to incorporate the effects of subjective uncertainty, (2) the use of Monte Carlo (i.e., random) sampling to incorporate the effects of stochastic uncertainty, and (3) the efficient use of the necessarily limited number of mechanistic calculations that can be performed to support the analysis. The use of Latin hypercube sampling generates a mapping from imprecisely known analysis inputs to analysis outcomes of interest that provides both a display of the uncertainty in analysis outcomes (i.e., uncertainty analysis) and a basis for investigating the effects of individual inputs on these outcomes (i.e., sensitivity analysis). The sensitivity analysis procedures used in the PA include examination of scatterplots, stepwise regression analysis, and partial correlation analysis. Uncertainty and sensitivity analysis results obtained as part of the 1996 WIPP PA are presented and discussed. Specific topics considered include two-phase flow in the vicinity of the repository, radionuclide release from the repository, fluid flow and radionuclide

  20. The importance of input interactions in the uncertainty and sensitivity analysis of nuclear fuel behavior

    Energy Technology Data Exchange (ETDEWEB)

    Ikonen, T., E-mail: timo.ikonen@vtt.fi; Tulkki, V.

    2014-08-15

    Highlights: • Uncertainty and sensitivity analysis of modeled nuclear fuel behavior is performed. • Burnup dependency of the uncertainties and sensitivities is characterized. • Input interactions significantly increase output uncertainties for irradiated fuel. • Identification of uncertainty sources is greatly improved with higher order methods. • Results stress the importance of using methods that take interactions into account. - Abstract: The propagation of uncertainties in a PWR fuel rod under steady-state irradiation is analyzed by computational means. A hypothetical steady-state scenario of the Three Mile Island 1 reactor fuel rod is modeled with the fuel performance code FRAPCON, using realistic input uncertainties for the fabrication and model parameters, boundary conditions and material properties. The uncertainty and sensitivity analysis is performed by extensive Monte Carlo sampling of the inputs’ probability distributions and by applying correlation coefficient and Sobol’ variance decomposition analyses. The latter includes evaluation of the second order and total effect sensitivity indices, allowing the study of interactions between input variables. The results show that the interactions play a large role in the propagation of uncertainties, and first order methods such as the correlation coefficient analyses are in general insufficient for sensitivity analysis of the fuel rod. Significant improvement over the first order methods can be achieved by using higher order methods. The results also show that both the magnitude of the uncertainties and their propagation depend not only on the output in question, but also on burnup. The latter is due to the onset of new phenomena (such as the fission gas release) and the gradual closure of the pellet-cladding gap with increasing burnup. Increasing burnup also affects the importance of input interactions. Interaction effects are typically highest in the moderate burnup (of the order of 10–40 MWd
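
    A compact sketch of a Sobol' variance decomposition with first-order, second-order and total-effect indices, in the spirit of the analysis above; the SALib package and the Ishigami test function stand in here for the FRAPCON model.

```python
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {
    "num_vars": 3,
    "names": ["x1", "x2", "x3"],
    "bounds": [[-np.pi, np.pi]] * 3,
}

# Saltelli sampling, then the Ishigami function as a cheap test model
X = saltelli.sample(problem, 1024, calc_second_order=True)
Y = np.sin(X[:, 0]) + 7 * np.sin(X[:, 1]) ** 2 \
    + 0.1 * X[:, 2] ** 4 * np.sin(X[:, 0])

Si = sobol.analyze(problem, Y, calc_second_order=True)
print(Si["S1"])        # first-order indices
print(Si["ST"])        # total-effect indices; ST - S1 exposes interactions
print(Si["S2"][0, 2])  # second-order interaction between x1 and x3
```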

  1. An uncertainty analysis using the NRPB accident consequence code Marc

    International Nuclear Information System (INIS)

    Jones, J.A.; Crick, M.J.; Simmonds, J.R.

    1991-01-01

    This paper describes an uncertainty analysis of MARC calculations of the consequences of accidental releases of radioactive materials to atmosphere. A total of 98 parameters describing the transfer of material through the environment to man, the doses received, and the health effects resulting from these doses were considered. The uncertainties in the numbers of early and late health effects, numbers of people affected by countermeasures, the amounts of food restricted and the economic costs of the accident were estimated. This paper concentrates on the results for early death and fatal cancer for a large hypothetical release from a PWR.

  2. Uncertainty analysis in estimating Japanese ingestion of global fallout Cs-137 using health risk evaluation model

    International Nuclear Information System (INIS)

    Shimada, Yoko; Morisawa, Shinsuke

    1998-01-01

    Most model estimates of environmental contamination include some uncertainty associated with the parameter uncertainty in the model. In this study, the uncertainty in a model for evaluating the ingestion of radionuclides caused by long-term global low-level radioactive contamination was analyzed using various uncertainty analysis methods: the percentile estimate, the robustness analysis and the fuzzy estimate. The model is mainly composed of five sub-models, each of which carries its own uncertainty, which was also analyzed. The major findings obtained in this study include that the possibility of discrepancy between values predicted by the model simulation and the observed data is less than 10%; the uncertainty of the predicted value is higher before 1950 and after 1980; the uncertainty of the predicted value can be reduced by decreasing the uncertainty of some environmental parameters in the model; and the reliability of the model depends decisively on the following environmental factors: direct foliar absorption coefficient, transfer factor of radionuclide from stratosphere down to troposphere, residual rate by food processing and cooking, transfer factor of radionuclide in the ocean and sedimentation in the ocean. (author)

  3. Uncertainty analysis of infinite homogeneous lead and sodium cooled fast reactors at beginning of life

    Energy Technology Data Exchange (ETDEWEB)

    Vanhanen, R., E-mail: risto.vanhanen@aalto.fi

    2015-03-15

    The objective of the present work is to estimate breeding ratio, radiation damage rate and minor actinide transmutation rate of infinite homogeneous lead and sodium cooled fast reactors. Uncertainty analysis is performed taking into account uncertainty in nuclear data and composition of the reactors. We use the recently released ENDF/B-VII.1 nuclear data library and restrict the work to the beginning of reactor life. We work under the multigroup approximation. The Bondarenko method is used to acquire effective cross sections for the homogeneous reactor. Modeling error and numerical error are estimated. The adjoint sensitivity analysis is performed to calculate generalized adjoint fluxes for the responses. The generalized adjoint fluxes are used to calculate first order sensitivities of the responses to model parameters. The acquired sensitivities are used to propagate uncertainties in the input data to find out uncertainties in the responses. We show that the uncertainty in model parameters is the dominant source of uncertainty, followed by modeling error, input data precision and numerical error. The uncertainty due to composition of the reactor is low. We identify main sources of uncertainty and note that the low-fidelity evaluation of ¹⁶O is problematic due to lack of correlation between total and elastic reactions.
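
    The final propagation step of such an adjoint-based analysis, pushing an input covariance through first-order sensitivities (the "sandwich rule" var(R) = sᵀCs), can be sketched in a few lines; the numbers below are illustrative, not from the paper.

```python
import numpy as np

# first-order sensitivities dR/dp of a response (e.g. breeding ratio) to three
# model parameters, as would be produced by a generalized adjoint solution
s = np.array([0.8, -0.3, 0.1])

# covariance matrix of the input parameters (illustrative values only)
cov = np.array([[0.04, 0.01, 0.00],
                [0.01, 0.09, 0.00],
                [0.00, 0.00, 0.01]])

var_R = s @ cov @ s   # sandwich rule: var(R) = s^T C s
print(f"standard deviation of the response: {np.sqrt(var_R):.3f}")
```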

  4. Uncertainty analysis of infinite homogeneous lead and sodium cooled fast reactors at beginning of life

    International Nuclear Information System (INIS)

    Vanhanen, R.

    2015-01-01

    The objective of the present work is to estimate breeding ratio, radiation damage rate and minor actinide transmutation rate of infinite homogeneous lead and sodium cooled fast reactors. Uncertainty analysis is performed taking into account uncertainty in nuclear data and composition of the reactors. We use the recently released ENDF/B-VII.1 nuclear data library and restrict the work to the beginning of reactor life. We work under the multigroup approximation. The Bondarenko method is used to acquire effective cross sections for the homogeneous reactor. Modeling error and numerical error are estimated. The adjoint sensitivity analysis is performed to calculate generalized adjoint fluxes for the responses. The generalized adjoint fluxes are used to calculate first order sensitivities of the responses to model parameters. The acquired sensitivities are used to propagate uncertainties in the input data to find out uncertainties in the responses. We show that the uncertainty in model parameters is the dominant source of uncertainty, followed by modeling error, input data precision and numerical error. The uncertainty due to composition of the reactor is low. We identify main sources of uncertainty and note that the low-fidelity evaluation of ¹⁶O is problematic due to lack of correlation between total and elastic reactions.

  5. Uncertainty analysis of a nondestructive radioassay system for transuranic waste

    International Nuclear Information System (INIS)

    Harker, Y.D.; Blackwood, L.G.; Meachum, T.R.; Yoon, W.Y.

    1996-01-01

    Radioassay of transuranic waste in 207 liter drums currently stored at the Idaho National Engineering Laboratory is achieved using a Passive Active Neutron (PAN) nondestructive assay system. In order to meet data quality assurance requirements for shipping and eventual permanent storage of these drums at the Waste Isolation Pilot Plant in Carlsbad, New Mexico, the total uncertainty of the PAN system measurements must be assessed. In particular, the uncertainty calculations are required to include the effects of variations in waste matrix parameters and related variables on the final measurement results. Because of the complexities involved in introducing waste matrix parameter effects into the uncertainty calculations, standard methods of analysis (e.g., experimentation followed by propagation of errors) could not be implemented. Instead, a modified statistical sampling and verification approach was developed. In this modified approach the total performance of the PAN system is simulated using computer models of the assay system and the resultant output is compared with the known input to assess the total uncertainty. This paper describes the simulation process and illustrates its application to waste comprised of weapons-grade plutonium-contaminated graphite molds.

  6. An uncertainty inventory demonstration - a primary step in uncertainty quantification

    Energy Technology Data Exchange (ETDEWEB)

    Langenbrunner, James R. [Los Alamos National Laboratory; Booker, Jane M [Los Alamos National Laboratory; Hemez, Francois M [Los Alamos National Laboratory; Salazar, Issac F [Los Alamos National Laboratory; Ross, Timothy J [UNM

    2009-01-01

    Tools, methods, and theories for assessing and quantifying uncertainties vary by application. Uncertainty quantification tasks have unique desiderata and circumstances. To realistically assess uncertainty requires the engineer/scientist to specify mathematical models, the physical phenomena of interest, and the theory or framework for assessments. For example, Probabilistic Risk Assessment (PRA) specifically identifies uncertainties using probability theory, and therefore, PRAs lack formal procedures for quantifying uncertainties that are not probabilistic. The Phenomena Identification and Ranking Technique (PIRT) proceeds by ranking phenomena using scoring criteria that result in linguistic descriptors, such as importance ranked with words, 'High/Medium/Low.' The use of words allows PIRT to be flexible, but the analysis may then be difficult to combine with other uncertainty theories. We propose that a necessary step for the development of a procedure or protocol for uncertainty quantification (UQ) is the application of an Uncertainty Inventory. An Uncertainty Inventory should be considered and performed in the earliest stages of UQ.

  7. Uncertainty analysis for dynamic properties of MEMS resonator supported by fuzzy arithmetics

    Directory of Open Access Journals (Sweden)

    A Martowicz

    2016-04-01

    In the paper the application of uncertainty analysis performed for a microelectromechanical resonator is presented. The main objective of the undertaken analysis is to assess the propagation of the considered uncertainties into the variation of chosen dynamic characteristics of the Finite Element model of the microresonator. Many different model parameters have been assumed to be uncertain: geometry and material properties. Apart from total uncertainty propagation, sensitivity analysis has been carried out to study the separate influences of all uncertain input characteristics. The uncertainty analysis has been performed by means of fuzzy arithmetics, in which the alpha-cut strategy has been applied to assemble the output fuzzy number. Monte Carlo Simulation and Genetic Algorithms have been employed to calculate the intervals connected with each alpha-cut of the searched fuzzy number. The elaborated model of the microresonator has taken into account, in a simplified way, the presence of surrounding air and a constant electrostatic field.
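
    A minimal sketch of the alpha-cut strategy, with dense random search standing in for the Monte Carlo/genetic-algorithm interval search and a toy f ∝ sqrt(E/ρ) resonance model standing in for the finite element model; all values are assumed.

```python
import numpy as np

rng = np.random.default_rng(0)

def freq(E, rho):
    # toy resonance model: natural frequency scales as sqrt(E / rho)
    return 1.0e3 * np.sqrt(E / rho)

# triangular fuzzy numbers (low, mode, high) for Young's modulus and density
E_fz = (160e9, 169e9, 178e9)
rho_fz = (2200.0, 2330.0, 2460.0)

def alpha_cut(tri, a):
    lo, mode, hi = tri
    return lo + a * (mode - lo), hi - a * (hi - mode)

for a in np.linspace(0.0, 1.0, 5):
    E_lo, E_hi = alpha_cut(E_fz, a)
    r_lo, r_hi = alpha_cut(rho_fz, a)
    # random search over the interval box for the output interval bounds
    E = rng.uniform(E_lo, E_hi, 20_000)
    rho = rng.uniform(r_lo, r_hi, 20_000)
    f = freq(E, rho)
    print(f"alpha = {a:.2f}: f in [{f.min():.3e}, {f.max():.3e}] Hz")
```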

  8. Dynamic Simulation, Sensitivity and Uncertainty Analysis of a Demonstration Scale Lignocellulosic Enzymatic Hydrolysis Process

    DEFF Research Database (Denmark)

    Prunescu, Remus Mihail; Sin, Gürkan

    2014-01-01

    This study presents the uncertainty and sensitivity analysis of a lignocellulosic enzymatic hydrolysis model considering both model and feed parameters as sources of uncertainty. The dynamic model is parametrized for accommodating various types of biomass, and different enzymatic complexes...

  9. Uncertainties propagation and global sensitivity analysis of the frequency response function of piezoelectric energy harvesters

    Science.gov (United States)

    Ruiz, Rafael O.; Meruane, Viviana

    2017-06-01

    The goal of this work is to describe a framework to propagate uncertainties in piezoelectric energy harvesters (PEHs). These uncertainties are related to the incomplete knowledge of the model parameters. The framework presented could be employed to conduct prior robust stochastic predictions. The prior analysis assumes a known probability density function for the uncertain variables and propagates the uncertainties to the output voltage. The framework is particularized to evaluate the behavior of the frequency response functions (FRFs) in PEHs, while its implementation is illustrated by the use of different unimorph and bimorph PEHs subjected to different scenarios: free of uncertainties, common uncertainties, and uncertainties as a product of imperfect clamping. The common variability associated with the PEH parameters is tabulated and reported. A global sensitivity analysis is conducted to identify the Sobol indices. Results indicate that the elastic modulus, density, and thickness of the piezoelectric layer are the parameters most relevant to the output variability. The importance of including the model parameter uncertainties in the estimation of the FRFs is revealed. In this sense, the present framework constitutes a powerful tool in the robust design and prediction of PEH performance.

  10. Representing uncertainty on model analysis plots

    Directory of Open Access Journals (Sweden)

    Trevor I. Smith

    2016-09-01

    Model analysis provides a mechanism for representing student learning as measured by standard multiple-choice surveys. The model plot contains information regarding both how likely students in a particular class are to choose the correct answer and how likely they are to choose an answer consistent with a well-documented conceptual model. Unfortunately, Bao’s original presentation of the model plot did not include a way to represent uncertainty in these measurements. I present details of a method to add error bars to model plots by expanding the work of Sommer and Lindell. I also provide a template for generating model plots with error bars.

  11. Accuracy and Uncertainty Analysis of Intelligent Techniques for Predicting the Longitudinal Dispersion Coefficient in Rivers

    Directory of Open Access Journals (Sweden)

    Abbas Akbarzadeh

    2010-09-01

    Accurate prediction of the longitudinal dispersion coefficient (LDC) can be useful for determining the pollutant concentration distribution in natural rivers. However, the uncertainty associated with the results obtained from forecasting models has a negative effect on pollutant management in water resources. In this research, appropriate models are first developed using ANN and ANFIS techniques to predict the LDC in natural streams. Then, an uncertainty analysis is performed for the ANN and ANFIS models based on Monte Carlo simulation. The input parameters of the models are related to hydraulic variables and stream geometry. Results indicate that ANN is a suitable model for predicting the LDC, but it is also associated with a high level of uncertainty. The uncertainty analysis shows that the ANFIS model has less uncertainty, i.e. it is the better model for satisfactorily forecasting the LDC in natural streams.

  12. Development and application of objective uncertainty measures for nuclear power plant transient analysis

    International Nuclear Information System (INIS)

    Vinai, P.

    2007-10-01

    For the development, design and licensing of a nuclear power plant (NPP), a sound safety analysis is necessary to study the diverse physical phenomena involved in the system behaviour under operational and transient conditions. Such studies are based on detailed computer simulations. With the progress achieved in computer technology and the greater availability of experimental and plant data, the use of best estimate codes for safety evaluations has gained increasing acceptance. The application of best estimate safety analysis has raised new problems that need to be addressed: it has become more crucial to assess how reliable code predictions are, especially when they need to be compared against safety limits that must not be crossed. It becomes necessary to identify and quantify the various possible sources of uncertainty that affect the reliability of the results. Currently, such uncertainty evaluations are generally based on expert opinion. In the present research, a novel methodology based on a non-parametric statistical approach has been developed for objective quantification of best-estimate code uncertainties related to the physical models used in the code. The basis is an evaluation of the accuracy of a given physical model achieved by comparing its predictions with experimental data from an appropriate set of separate-effect tests. The differences between measurements and predictions can be considered stochastically distributed, and thus a statistical approach can be employed. The first step was the development of a procedure for investigating the dependence of a given physical model's accuracy on the experimental conditions. Each separate-effect test effectively provides a random sample of discrepancies between measurements and predictions, corresponding to a location in the state space defined by a certain number of independent system variables. As a consequence, the samples of 'errors', achieved from analysis of the entire database, are

  13. Uncertainty analysis in safety assessment

    Energy Technology Data Exchange (ETDEWEB)

    Lemos, Francisco Luiz de [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN), Belo Horizonte, MG (Brazil); Sullivan, Terry [Brookhaven National Lab., Upton, NY (United States)

    1997-12-31

    Nuclear waste disposal is a very complex subject which requires the study of many different fields of science, like hydrogeology, meteorology, geochemistry, etc. In addition, waste disposal facilities are designed to last for a very long period of time. Both of these conditions leave safety assessment projections filled with uncertainty. This paper addresses approaches for the treatment of uncertainties in safety assessment modeling due to the variability of data, and describes some current approaches used to deal with this problem. (author) 13 refs.; e-mail: lemos at bnl.gov; sulliva1 at bnl.gov

  14. A comparison of uncertainty analysis methods using a groundwater flow model

    International Nuclear Information System (INIS)

    Doctor, P.G.; Jacobson, E.A.; Buchanan, J.A.

    1988-06-01

    This report evaluates three uncertainty analysis methods that are proposed for use in performance assessment activities within the OCRWM and Nuclear Regulatory Commission (NRC) communities. The three methods are Monte Carlo simulation with unconstrained sampling, Monte Carlo simulation with Latin Hypercube sampling, and first-order analysis. Monte Carlo simulation with unconstrained sampling is a generally accepted uncertainty analysis method, but it has the disadvantage of being costly and time consuming. Latin Hypercube sampling was proposed to make Monte Carlo simulation more efficient. Although it was originally formulated for independent variables, which is a major drawback in performance assessment modeling, Latin Hypercube sampling can be extended to generate correlated samples. The first-order method is efficient to implement because it is based on the first-order Taylor series expansion; however, there is concern that it does not adequately describe the variability for complex models. These three uncertainty analysis methods were evaluated using a calibrated groundwater flow model of an unconfined aquifer in southern Arizona. The two simulation methods produced similar results, although the Latin Hypercube method tends to produce samples whose estimates of statistical parameters are closer to the desired parameters. The mean travel times for the first-order method do not agree with those of the simulations. In addition, the first-order method produces estimates of variance in travel times that are more variable than those produced by the simulation methods, resulting in nonconservative tolerance intervals. 13 refs., 33 figs
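
    The three methods compared in the report can be contrasted on a toy travel-time function (the function, distributions and sample size below are invented for illustration; this is not the Arizona flow model).

```python
import numpy as np
from scipy.stats import norm, qmc

rng = np.random.default_rng(0)
L = 5000.0   # assumed flow path length

def travel_time(K, n):   # toy surrogate, not the report's flow model
    return n * L / K

mu = np.array([50.0, 0.30])    # means: conductivity-like term, porosity
sig = np.array([10.0, 0.05])   # standard deviations

# (1) Monte Carlo with unconstrained sampling
mc = travel_time(*rng.normal(mu, sig, size=(200, 2)).T)

# (2) Monte Carlo with Latin hypercube sampling
u = qmc.LatinHypercube(d=2, seed=1).random(200)
lhs = travel_time(*norm.ppf(u, loc=mu, scale=sig).T)

# (3) first-order Taylor expansion about the mean
grad = np.array([-mu[1] * L / mu[0] ** 2, L / mu[0]])
fo_mean, fo_var = travel_time(*mu), float((grad ** 2 * sig ** 2).sum())

print(mc.mean(), mc.var(), lhs.mean(), lhs.var(), fo_mean, fo_var)
```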

  15. Validation and uncertainty analysis of a pre-treatment 2D dose prediction model

    Science.gov (United States)

    Baeza, Jose A.; Wolfs, Cecile J. A.; Nijsten, Sebastiaan M. J. J. G.; Verhaegen, Frank

    2018-02-01

    Independent verification of complex treatment delivery with megavolt photon beam radiotherapy (RT) has been effectively used to detect and prevent errors. This work presents the validation and uncertainty analysis of a model that predicts 2D portal dose images (PDIs) without a patient or phantom in the beam. The prediction model is based on an exponential point dose model with separable primary and secondary photon fluence components. The model includes a scatter kernel, off-axis ratio map, transmission values and penumbra kernels for beam-delimiting components. These parameters were derived through a model fitting procedure supplied with point dose and dose profile measurements of radiation fields. The model was validated against a treatment planning system (TPS; Eclipse) and radiochromic film measurements for complex clinical scenarios, including volumetric modulated arc therapy (VMAT). Confidence limits on fitted model parameters were calculated based on simulated measurements. A sensitivity analysis was performed to evaluate the effect of the parameter uncertainties on the model output. For the maximum uncertainty, the maximum deviating measurement sets were propagated through the fitting procedure and the model. The overall uncertainty was assessed using all simulated measurements. The validation of the prediction model against the TPS and the film showed a good agreement, with on average 90.8% and 90.5% of pixels passing a (2%, 2 mm) global gamma analysis, respectively, with a low dose threshold of 10%. The maximum and overall uncertainty of the model are dependent on the type of clinical plan used as input. The results can be used to study the robustness of the model. A model for predicting accurate 2D pre-treatment PDIs in complex RT scenarios can be used clinically and its uncertainties can be taken into account.

  16. Uncertainty analysis of a one-dimensional constitutive model for shape memory alloy thermomechanical description

    DEFF Research Database (Denmark)

    Oliveira, Sergio A.; Savi, Marcelo A.; Santos, Ilmar F.

    2014-01-01

    The use of shape memory alloys (SMAs) in engineering applications has increased interest in the accuracy of their thermomechanical description. This work presents an uncertainty analysis related to experimental tensile tests conducted with shape memory alloy wires. Experimental data are compared with numerical simulations obtained from a constitutive model with internal constraints employed to describe the thermomechanical behavior of SMAs. The idea is to evaluate whether the numerical simulations are within the uncertainty range of the experimental data. Parametric analysis is also developed...

  17. Crashworthiness uncertainty analysis of typical civil aircraft based on Box–Behnken method

    OpenAIRE

    Ren Yiru; Xiang Jinwu

    2014-01-01

    Crashworthiness is an important design factor of civil aircraft, related to the safety of occupants during impact accidents. It is a highly nonlinear transient dynamic problem and may be greatly influenced by uncertainty factors. A crashworthiness uncertainty analysis is conducted to investigate the effects of initial conditions, structural dimensions and material properties. A simplified finite element model is built based on the geometrical model and basic physics phenomena. Box–Behnken...

  18. Effect of the sample matrix on measurement uncertainty in X-ray fluorescence analysis

    International Nuclear Information System (INIS)

    Morgenstern, P.; Brueggemann, L.; Wennrich, R.

    2005-01-01

    The estimation of measurement uncertainty, with reference to univariate calibration functions, is discussed in detail in the Eurachem Guide 'Quantifying Uncertainty in Analytical Measurement'. Adopting these recommendations in quantitative X-ray fluorescence analysis (XRF) involves basic problems which are above all due to the strong influence of the sample matrix on the analytical response. In XRF analysis, the proposed recommendations are consequently applicable only to the matrix-corrected response. The application is also restricted with regard to both the matrices and analyte concentrations. In this context the present studies are aimed at the problem of predicting measurement uncertainty also with reference to more variable sample compositions. The corresponding investigations are focused on the use of the intensity of the Compton scattered tube line as an internal standard to assess the effect of the individual sample matrix on the analytical response relative to a reference matrix. Based on this concept, the measurement uncertainty of an analyte present in an unknown specimen can be predicted taking into account the data obtained under defined matrix conditions.

  19. Deterministic sensitivity and uncertainty analysis for large-scale computer models

    International Nuclear Information System (INIS)

    Worley, B.A.; Pin, F.G.; Oblow, E.M.; Maerker, R.E.; Horwedel, J.E.; Wright, R.Q.

    1988-01-01

    The fields of sensitivity and uncertainty analysis have traditionally been dominated by statistical techniques when large-scale modeling codes are being analyzed. These methods are able to estimate sensitivities, generate response surfaces, and estimate response probability distributions given the input parameter probability distributions. Because the statistical methods are computationally costly, they are usually applied only to problems with relatively small parameter sets. Deterministic methods, on the other hand, are very efficient and can handle large data sets, but generally require simpler models because of the considerable programming effort required for their implementation. The first part of this paper reports on the development and availability of two systems, GRESS and ADGEN, that make use of computer calculus compilers to automate the implementation of deterministic sensitivity analysis capability into existing computer models. This automation removes the traditional limitation of deterministic sensitivity methods. The second part of the paper describes a deterministic uncertainty analysis method (DUA) that uses derivative information as a basis to propagate parameter probability distributions to obtain result probability distributions. The methods are applicable to low-level radioactive waste disposal system performance assessment.

  20. Uncertainty analysis of energy consumption in dwellings

    Energy Technology Data Exchange (ETDEWEB)

    Pettersen, Trine Dyrstad

    1997-12-31

    This thesis presents a comprehensive study of an energy estimation model that can be used to examine the uncertainty of predicted energy consumption in a dwelling. The variation and uncertainty of input parameters due to the outdoor climate, the building construction and the inhabitants are studied as a basis for further energy evaluations. The variations in energy consumption occurring in nominally similar dwellings are also investigated in order to verify the simulated energy consumption. The main topics are (1) a study of expected variations and uncertainties in both the input parameters used in energy consumption calculations and the energy consumption of the dwelling, (2) the development and evaluation of a simplified energy calculation model that considers uncertainties due to the input parameters, (3) an evaluation of the influence of the uncertain parameters on the total variation so that the most important parameters can be identified, and (4) the recommendation of a simplified procedure for treating uncertainties or possible deviations from average conditions. 90 refs., 182 figs., 73 tabs.

  1. Communicating uncertainty in cost-benefit analysis : A cognitive psychological perspective

    NARCIS (Netherlands)

    Mouter, N.; Holleman, M.; Calvert, S.C.; Annema, J.A.

    2013-01-01

    Based on a cognitive psychological theory, this paper aims to improve the communication of uncertainty in Cost-Benefit Analysis. The theory is based on different cognitive-personality and cognitive-social psychological constructs that may help explain individual differences in the processing of

  2. Analysis of the influence of input data uncertainties on determining the reliability of reservoir storage capacity

    Directory of Open Access Journals (Sweden)

    Marton Daniel

    2015-12-01

    The paper contains a sensitivity analysis of the influence of uncertainties in the input hydrological, morphological and operating data required for the design of active reservoir conservation storage capacity and its achieved values. By introducing uncertainties into the considered inputs of the water management analysis of a reservoir, the subsequently analysed reservoir storage capacity is also subject to uncertainty, as are the values of water outflows from the reservoir and the hydrological reliabilities. A simulation model of reservoir behaviour has been compiled to carry out this kind of calculation, as stated below. The model allows evaluation of the solution results, taking uncertainties into consideration, contributing to a reduction in the occurrence of failure or lack of water during reservoir operation in low-water and dry periods.

  3. An educational model for ensemble streamflow simulation and uncertainty analysis

    Directory of Open Access Journals (Sweden)

    A. AghaKouchak

    2013-02-01

    This paper presents the hands-on modeling toolbox HBV-Ensemble, designed as a complement to theoretical hydrology lectures, to teach hydrological processes and their uncertainties. HBV-Ensemble can be used for in-class lab practices, homework assignments, and assessment of students' understanding of hydrological processes. Using this modeling toolbox, students can gain more insight into how hydrological processes (e.g., precipitation, snowmelt and snow accumulation, soil moisture, evapotranspiration and runoff generation) are interconnected. The educational toolbox includes a MATLAB Graphical User Interface (GUI) and an ensemble simulation scheme that can be used for teaching uncertainty analysis, parameter estimation, ensemble simulation and model sensitivity. HBV-Ensemble was administered in a class for both in-class instruction and a final project, and students submitted their feedback about the toolbox. The results indicate that this educational software had a positive impact on students' understanding and knowledge of uncertainty in hydrological modeling.

  4. Incorporating parametric uncertainty into population viability analysis models

    Science.gov (United States)

    McGowan, Conor P.; Runge, Michael C.; Larson, Michael A.

    2011-01-01

    Uncertainty in parameter estimates from sampling variation or expert judgment can introduce substantial uncertainty into ecological predictions based on those estimates. However, in standard population viability analyses, one of the most widely used tools for managing plant, fish and wildlife populations, parametric uncertainty is often ignored in or discarded from model projections. We present a method for explicitly incorporating this source of uncertainty into population models to fully account for risk in management and decision contexts. Our method involves a two-step simulation process where parametric uncertainty is incorporated into the replication loop of the model and temporal variance is incorporated into the loop for time steps in the model. Using the piping plover, a federally threatened shorebird in the USA and Canada, as an example, we compare abundance projections and extinction probabilities from simulations that exclude and include parametric uncertainty. Although final abundance was very low for all sets of simulations, estimated extinction risk was much greater for the simulation that incorporated parametric uncertainty in the replication loop. Decisions about species conservation (e.g., listing, delisting, and jeopardy) might differ greatly depending on the treatment of parametric uncertainty in population models.
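
    A bare-bones sketch of the two-loop structure described above (all demographic numbers are invented, not the piping plover estimates): the growth rate is drawn once per replicate from its sampling distribution (parametric uncertainty), and environmental noise is drawn at every time step (temporal variance).

```python
import numpy as np

rng = np.random.default_rng(0)
n_reps, n_years = 1000, 50
lam_hat, lam_se = 0.98, 0.03   # estimated mean growth rate and its standard error
sigma_t = 0.10                 # temporal (environmental) standard deviation

extinctions = 0
for _ in range(n_reps):
    lam_i = rng.normal(lam_hat, lam_se)   # outer loop: parametric uncertainty
    N = 100.0
    for _ in range(n_years):
        N *= max(rng.normal(lam_i, sigma_t), 0.0)   # inner loop: temporal variance
        if N < 1.0:                                 # quasi-extinction threshold
            extinctions += 1
            break
print("extinction probability:", extinctions / n_reps)
```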

  5. Demonstration uncertainty/sensitivity analysis using the health and economic consequence model CRAC2

    International Nuclear Information System (INIS)

    Alpert, D.J.; Iman, R.L.; Johnson, J.D.; Helton, J.C.

    1984-12-01

    The techniques for performing uncertainty/sensitivity analyses compiled as part of the MELCOR program appear to be well suited for use with a health and economic consequence model. Two replicate samples of size 50 gave essentially identical results, indicating that for this case, a Latin hypercube sample of size 50 seems adequate to represent the distribution of results. Though the intent of this study was a demonstration of uncertainty/sensitivity analysis techniques, a number of insights relevant to health and economic consequence modeling can be gleaned: uncertainties in early deaths are significantly greater than uncertainties in latent cancer deaths; though the magnitude of the source term is the largest source of variation in estimated distributions of early deaths, a number of additional parameters are also important; even with the release fractions for a full SST1, one quarter of the CRAC2 runs gave no early deaths; and comparison of the estimates of mean early deaths for a full SST1 release in this study with those of recent point estimates for similar conditions indicates that the recent estimates may be significant overestimations of early deaths. Estimates of latent cancer deaths, however, are roughly comparable. An analysis of the type described here can provide insights in a number of areas. First, the variability in the results gives an indication of the potential uncertainty associated with the calculations. Second, the sensitivity of the results to assumptions about the input variables can be determined. Research efforts can then be concentrated on reducing the uncertainty in the variables which are the largest contributors to uncertainty in results

  6. Responses to clinical uncertainty in Australian general practice trainees: a cross-sectional analysis.

    Science.gov (United States)

    Cooke, Georga; Tapley, Amanda; Holliday, Elizabeth; Morgan, Simon; Henderson, Kim; Ball, Jean; van Driel, Mieke; Spike, Neil; Kerr, Rohan; Magin, Parker

    2017-12-01

    Tolerance for ambiguity is essential for optimal learning and professional competence. General practice trainees must be, or must learn to be, adept at managing clinical uncertainty. However, few studies have examined associations of intolerance of uncertainty in this group. The aim of this study was to establish levels of tolerance of uncertainty in Australian general practice trainees and associations of uncertainty with demographic, educational and training practice factors. A cross-sectional analysis was performed on the Registrar Clinical Encounters in Training (ReCEnT) project, an ongoing multi-site cohort study. Scores on three of the four independent subscales of the Physicians' Reaction to Uncertainty (PRU) instrument were analysed as outcome variables in linear regression models with trainee and practice factors as independent variables. A total of 594 trainees contributed data on 1209 occasions. Trainees in earlier training terms had higher scores for 'Anxiety due to uncertainty', 'Concern about bad outcomes' and 'Reluctance to disclose diagnosis/treatment uncertainty to patients'. Beyond this, findings suggest two distinct sets of associations regarding reaction to uncertainty. Firstly, affective aspects of uncertainty (the 'Anxiety' and 'Concern' subscales) were associated with female gender, less experience in hospital prior to commencing general practice training, and graduation overseas. Secondly, a maladaptive response to uncertainty (the 'Reluctance to disclose' subscale) was associated with urban practice, health qualifications prior to studying medicine, practice in an area of higher socio-economic status, and being Australian-trained. This study has established levels of three measures of trainees' responses to uncertainty and associations with these responses. The current findings suggest differing 'phenotypes' of trainees with high 'affective' responses to uncertainty and those reluctant to disclose uncertainty to patients. More

  7. How does uncertainty shape patient experience in advanced illness? A secondary analysis of qualitative data.

    Science.gov (United States)

    Etkind, Simon Noah; Bristowe, Katherine; Bailey, Katharine; Selman, Lucy Ellen; Murtagh, Fliss Em

    2017-02-01

    Uncertainty is common in advanced illness but is infrequently studied in this context. If poorly addressed, uncertainty can lead to adverse patient outcomes. We aimed to understand patient experiences of uncertainty in advanced illness and develop a typology of patients' responses and preferences to inform practice. Secondary analysis of qualitative interview transcripts. Studies were assessed for inclusion and interviews were sampled using maximum-variation sampling. Analysis used a thematic approach with 10% of coding cross-checked to enhance reliability. Qualitative interviews from six studies including patients with heart failure, chronic obstructive pulmonary disease, renal disease, cancer and liver failure. A total of 30 transcripts were analysed. Median age was 75 (range 43-95); 12 patients were women. The impact of uncertainty was frequently discussed: the main related themes were engagement with illness, information needs, patient priorities and the period of time that patients mainly focused their attention on (temporal focus). A typology of patient responses to uncertainty was developed from these themes. Uncertainty influences patient experience in advanced illness through affecting patients' information needs, preferences and future priorities for care. Our typology aids understanding of how patients with advanced illness respond to uncertainty. Assessment of these three factors may be a useful starting point to guide clinical assessment and shared decision making.

  8. Introducing uncertainty analysis of nucleation and crystal growth models in Process Analytical Technology (PAT) system design of crystallization processes

    DEFF Research Database (Denmark)

    Abdul Samad, Noor Asma Fazli Bin; Sin, Gürkan; Gernaey, Krist

    2013-01-01

    This paper presents the application of uncertainty and sensitivity analysis as part of a systematic modelbased process monitoring and control (PAT) system design framework for crystallization processes. For the uncertainty analysis, the Monte Carlo procedure is used to propagate input uncertainty...

  9. BEMUSE Phase III Report - Uncertainty and Sensitivity Analysis of the LOFT L2-5 Test

    International Nuclear Information System (INIS)

    Bazin, P.; Crecy, A. de; Glaeser, H.; Skorek, T.; Joucla, J.; Probst, P.; Chung, B.; Oh, D.Y.; Kyncl, M.; Pernica, R.; Macek, J.; Meca, R.; Macian, R.; D'Auria, F.; Petruzzi, A.; Perez, M.; Reventos, F.; Fujioka, K.

    2007-02-01

    This report summarises the contributions of the ten participants to phase 3 of BEMUSE: Uncertainty and Sensitivity Analyses of the LOFT L2-5 experiment, a Large-Break Loss-of-Coolant-Accident (LB-LOCA). For this phase, precise step-by-step requirements were provided to the participants. Four main parts are defined: 1. List and uncertainties of the input uncertain parameters. 2. Uncertainty analysis results. 3. Sensitivity analysis results. 4. Improved methods and assessment of the methods (optional). The 5% and 95% percentiles have to be estimated for 6 output parameters, which are of two kinds: 1. Scalar output parameters (first Peak Cladding Temperature (PCT), second Peak Cladding Temperature, time of accumulator injection, time of complete quenching); 2. Time trend output parameters (maximum cladding temperature, upper plenum pressure). The main lessons learnt from phase 3 of the BEMUSE programme are the following: - For uncertainty analysis, all the participants use a probabilistic method associated with the use of Wilks' formula, except for UNIPI with its CIAU method (Code with the capability of Internal Assessment of Uncertainty). Use of both methods has been successfully mastered. - Compared with the experiment, the results of uncertainty analysis are good on the whole. For example, for the cladding temperature-type output parameters (first PCT, second PCT, time of complete quenching, maximum cladding temperature), 8 participants out of 10 find upper and lower bounds which envelop the experimental data. - Sensitivity analysis has been successfully performed by all the participants using the probabilistic method. All the influence measures used take into account the range of variation of the input parameters. Synthesis tables of the most influential phenomena and parameters have been drawn up, and participants will be able to use them for the continuation of the BEMUSE programme.
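
    The Wilks' formula underlying most participants' methods fixes the number of code runs needed for a given coverage/confidence statement; a small helper (a sketch, not any participant's tool) reproduces the familiar 59 runs (one-sided) and 93 runs (two-sided) for 95%/95% first-order statements.

```python
def wilks_one_sided(gamma=0.95, beta=0.95):
    # smallest n such that P(coverage >= gamma) = 1 - gamma**n >= beta
    n = 1
    while 1 - gamma ** n < beta:
        n += 1
    return n

def wilks_two_sided(gamma=0.95, beta=0.95):
    # smallest n with 1 - gamma**n - n*(1 - gamma)*gamma**(n - 1) >= beta
    n = 2
    while 1 - gamma ** n - n * (1 - gamma) * gamma ** (n - 1) < beta:
        n += 1
    return n

print(wilks_one_sided(), wilks_two_sided())   # -> 59 93
```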

  10. An information-theoretic basis for uncertainty analysis: application to the QUASAR severe accident study

    International Nuclear Information System (INIS)

    Unwin, S.D.; Cazzoli, E.G.; Davis, R.E.; Khatib-Rahbar, M.; Lee, M.; Nourbakhsh, H.; Park, C.K.; Schmidt, E.

    1989-01-01

    The probabilistic characterization of uncertainty can be problematic in circumstances where there is a paucity of supporting data and limited experience on which to base engineering judgement. Information theory provides a framework in which to address this issue through reliance upon entropy-related principles of uncertainty maximization. We describe an application of such principles in the United States Nuclear Regulatory Commission-sponsored program QUASAR (Quantification and Uncertainty Analysis of Source Terms for Severe Accidents in Light Water Reactors). (author)

  11. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment. Volume 3, Appendices C, D, E, F, and G

    Energy Technology Data Exchange (ETDEWEB)

    Harper, F.T.; Young, M.L.; Miller, L.A. [Sandia National Labs., Albuquerque, NM (United States)] [and others]

    1995-01-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, developed independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the third of a three-volume document describing the project and contains descriptions of the probability assessment principles; the expert identification and selection process; the weighting methods used; the inverse modeling methods; case structures; and summaries of the consequence codes.

  12. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment. Volume 3, Appendices C, D, E, F, and G

    International Nuclear Information System (INIS)

    Harper, F.T.; Young, M.L.; Miller, L.A.

    1995-01-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, developed independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the third of a three-volume document describing the project and contains descriptions of the probability assessment principles; the expert identification and selection process; the weighting methods used; the inverse modeling methods; case structures; and summaries of the consequence codes.

  13. Use of probabilistic methods for analysis of cost and duration uncertainties in a decision analysis framework

    International Nuclear Information System (INIS)

    Boak, D.M.; Painton, L.

    1995-01-01

    Probabilistic forecasting techniques have been used in many risk assessment and performance assessment applications on radioactive waste disposal projects such as Yucca Mountain and the Waste Isolation Pilot Plant (WIPP). Probabilistic techniques such as Monte Carlo and Latin Hypercube sampling methods are routinely used to treat uncertainties in physical parameters important in simulating radionuclide transport in a coupled geohydrologic system and assessing the ability of that system to comply with regulatory release limits. However, the use of probabilistic techniques in the treatment of uncertainties in the cost and duration of programmatic alternatives on risk and performance assessment projects is less common. Where significant uncertainties exist and where programmatic decisions must be made despite existing uncertainties, probabilistic techniques may yield important insights into decision options, especially when used in a decision analysis framework and when properly balanced with deterministic analyses. For relatively simple problems, these types of probabilistic evaluations can be made using personal computer-based software.
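
    A small sketch of the kind of probabilistic cost and duration forecast described above, under assumed triangular elicitations for three activities of a hypothetical programmatic alternative.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# (low, most likely, high) triangular elicitations for three sequential activities
cost_tri = [(1.0, 1.4, 2.5), (0.6, 0.8, 1.6), (2.0, 2.9, 5.0)]     # M$
dur_tri = [(6.0, 8.0, 14.0), (3.0, 4.0, 9.0), (10.0, 14.0, 24.0)]  # months

cost = sum(rng.triangular(lo, mode, hi, n) for lo, mode, hi in cost_tri)
dur = sum(rng.triangular(lo, mode, hi, n) for lo, mode, hi in dur_tri)

print("cost 5/50/95 percentiles:", np.percentile(cost, [5, 50, 95]))
print("duration 5/50/95 percentiles:", np.percentile(dur, [5, 50, 95]))
```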

  14. Needs of the CSAU uncertainty method

    International Nuclear Information System (INIS)

    Prosek, A.; Mavko, B.

    2000-01-01

    The use of best estimate codes for safety analysis requires quantification of the uncertainties. These uncertainties are inherently linked to the chosen safety analysis methodology. Worldwide, various methods have been proposed for this quantification. The purpose of this paper was to identify the needs of the Code Scaling, Applicability, and Uncertainty (CSAU) methodology and then to address them. The specific procedural steps were combined from other methods for uncertainty evaluation, and new tools and procedures were proposed. The uncertainty analysis approach and tools were then utilized for a confirmatory study. The uncertainty was quantified for the RELAP5/MOD3.2 thermal-hydraulic computer code. The results of the adapted CSAU approach applied to a small-break loss-of-coolant accident (SB LOCA) show that it can be used for any thermal-hydraulic safety analysis with uncertainty evaluation. However, it was indicated that there are still some limitations in the CSAU approach that need to be resolved. (author)

  15. A polynomial chaos approach to the analysis of vehicle dynamics under uncertainty

    Science.gov (United States)

    Kewlani, Gaurav; Crawford, Justin; Iagnemma, Karl

    2012-05-01

    The ability of ground vehicles to quickly and accurately analyse their dynamic response to a given input is critical to their safety and efficient autonomous operation. In field conditions, significant uncertainty is associated with terrain and/or vehicle parameter estimates, and this uncertainty must be considered in the analysis of vehicle motion dynamics. Here, polynomial chaos approaches that explicitly consider parametric uncertainty during modelling of vehicle dynamics are presented. They are shown to be computationally more efficient than the standard Monte Carlo scheme, and experimental results compared with the simulation results performed on ANVEL (a vehicle simulator) indicate that the method can be utilised for efficient and accurate prediction of vehicle motion in realistic scenarios.
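
    A minimal one-dimensional polynomial chaos sketch (regression onto probabilists' Hermite polynomials; the scalar response function is an assumed stand-in for a vehicle dynamics model): mean and variance follow directly from the expansion coefficients, avoiding a large Monte Carlo run.

```python
import math
import numpy as np
from numpy.polynomial.hermite_e import hermevander

rng = np.random.default_rng(0)

def response(xi):   # assumed stand-in for an expensive vehicle dynamics model
    return np.exp(0.3 * xi) + 0.1 * xi ** 2

deg = 4
xi = rng.standard_normal(200)   # standardized uncertain parameter samples
V = hermevander(xi, deg)        # He_0..He_4 evaluated at the samples
coef, *_ = np.linalg.lstsq(V, response(xi), rcond=None)

# orthogonality of He_k under the standard normal gives moments directly:
# mean = c_0, variance = sum_{k>=1} c_k^2 * k!
fact = np.array([math.factorial(k) for k in range(deg + 1)])
mean_pce = coef[0]
var_pce = float((coef[1:] ** 2 * fact[1:]).sum())

xi_mc = rng.standard_normal(1_000_000)   # brute-force Monte Carlo check
print(mean_pce, response(xi_mc).mean(), var_pce, response(xi_mc).var())
```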

  16. The uncertainty of reference standards – a guide to understanding factors impacting uncertainty, uncertainty calculations, and vendor certifications.

    Science.gov (United States)

    Gates, Kevin; Chang, Ning; Dilek, Isil; Jian, Huahua; Pogue, Sherri; Sreenivasan, Uma

    2009-10-01

    Certified solution standards are widely used in forensic toxicological, clinical/diagnostic, and environmental testing. Typically, these standards are purchased as ampouled solutions with a certified concentration. Vendors present concentration and uncertainty differently on their Certificates of Analysis. Understanding the factors that impact uncertainty, and which factors have been considered in the vendor's assignment of uncertainty, is critical to understanding the accuracy of the standard and the impact on testing results. Understanding these variables is also important for laboratories seeking to comply with ISO/IEC 17025 requirements and for those preparing reference solutions from neat materials at the bench. The impact of uncertainty associated with the neat material purity (including residual water, residual solvent, and inorganic content), mass measurement (weighing techniques), and solvent addition (solution density) on the overall uncertainty of the certified concentration is described along with uncertainty calculations.
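
    The root-sum-square combination of uncertainty components that the article describes can be illustrated for a solution standard prepared from a neat material (all numbers hypothetical): relative standard uncertainties from purity, weighing and volume combine into an expanded uncertainty at k = 2.

```python
import math

# hypothetical inputs for a solution standard prepared from a neat material
purity, u_purity = 0.998, 0.0015      # mass fraction and its standard uncertainty
mass, u_mass = 10.00e-3, 0.02e-3      # weighed mass of neat material, g
volume, u_volume = 10.00e-3, 0.01e-3  # solution volume, L

conc = purity * mass / volume         # certified concentration, g/L
u_rel = math.sqrt((u_purity / purity) ** 2
                  + (u_mass / mass) ** 2
                  + (u_volume / volume) ** 2)
U = 2.0 * conc * u_rel                # expanded uncertainty, coverage factor k = 2
print(f"concentration = {conc:.4f} g/L, U = {U:.4f} g/L (k = 2)")
```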

  17. Uncertainty and sensitivity analysis of fission gas behavior in engineering-scale fuel modeling

    Energy Technology Data Exchange (ETDEWEB)

    Pastore, Giovanni, E-mail: Giovanni.Pastore@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Swiler, L.P., E-mail: LPSwile@sandia.gov [Optimization and Uncertainty Quantification, Sandia National Laboratories, P.O. Box 5800, Albuquerque, NM 87185-1318 (United States); Hales, J.D., E-mail: Jason.Hales@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Novascone, S.R., E-mail: Stephen.Novascone@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Perez, D.M., E-mail: Danielle.Perez@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Spencer, B.W., E-mail: Benjamin.Spencer@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Luzzi, L., E-mail: Lelio.Luzzi@polimi.it [Politecnico di Milano, Department of Energy, Nuclear Engineering Division, via La Masa 34, I-20156 Milano (Italy); Van Uffelen, P., E-mail: Paul.Van-Uffelen@ec.europa.eu [European Commission, Joint Research Centre, Institute for Transuranium Elements, Hermann-von-Helmholtz-Platz 1, D-76344 Karlsruhe (Germany); Williamson, R.L., E-mail: Richard.Williamson@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States)

    2015-01-15

    The role of uncertainties in fission gas behavior calculations as part of engineering-scale nuclear fuel modeling is investigated using the BISON fuel performance code with a recently implemented physics-based model for fission gas release and swelling. Through the integration of BISON with the DAKOTA software, a sensitivity analysis of the results to selected model parameters is carried out based on UO2 single-pellet simulations covering different power regimes. The parameters are varied within ranges representative of the relative uncertainties and consistent with the information in the open literature. The study leads to an initial quantitative assessment of the uncertainty in fission gas behavior predictions with the parameter characterization presently available. Also, the relative importance of the single parameters is evaluated. Moreover, a sensitivity analysis is carried out based on simulations of a fuel rod irradiation experiment, pointing out a significant impact of the considered uncertainties on the calculated fission gas release and cladding diametral strain. The results of the study indicate that the commonly accepted deviation between calculated and measured fission gas release by a factor of 2 approximately corresponds to the inherent modeling uncertainty at high fission gas release. Nevertheless, significantly higher deviations may be expected for fission gas release values of around 10% and lower. Implications are discussed in terms of directions of research for the improved modeling of fission gas behavior for engineering purposes.

  18. Model uncertainty in safety assessment

    International Nuclear Information System (INIS)

    Pulkkinen, U.; Huovinen, T.

    1996-01-01

    Uncertainty analyses are an essential part of any risk assessment. Usually, the uncertainties of reliability model parameter values are described by probability distributions and the uncertainty is propagated through the whole risk model. In addition to the parameter uncertainties, the assumptions behind the risk models may be based on insufficient experimental observations, and the models themselves may not be exact descriptions of the phenomena under analysis. The description and quantification of this type of uncertainty, model uncertainty, is the topic of this report. The model uncertainty is characterized and some approaches to model and quantify it are discussed. The emphasis is on so-called mixture models, which have been applied in PSAs. Some of the possible disadvantages of the mixture model are addressed. In addition to quantitative analyses, qualitative analysis is also discussed briefly. To illustrate the models, two simple case studies on failure intensity and human error modeling are described. In both examples, the analysis is based on simple mixture models, which are suitable for PSA analyses. (orig.) (36 refs., 6 figs., 2 tabs.)

  20. Uncertainty analysis on probabilistic fracture mechanics assessment methodology

    International Nuclear Information System (INIS)

    Rastogi, Rohit; Vinod, Gopika; Chandra, Vikas; Bhasin, Vivek; Babar, A.K.; Rao, V.V.S.S.; Vaze, K.K.; Kushwaha, H.S.; Venkat-Raj, V.

    1999-01-01

    Fracture mechanics has found profound usage in the design of components and in assessing the fitness for purpose and residual life of operating components. Since defect size and material properties are statistically distributed, various probabilistic approaches have been employed for the computation of fracture probability. Monte Carlo simulation is one such procedure for analysing fracture probability. This paper deals with uncertainty analysis using Monte Carlo simulation methods. These methods were developed based on the R6 failure assessment procedure, which has been widely used in analysing the integrity of structures. The application of this method is illustrated with a case study. (author)
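
    A minimal sketch of Monte Carlo estimation of fracture probability, using a simplified infinite-plate stress-intensity model in place of the full R6 assessment procedure; all distributions and the applied stress are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 1_000_000

        # Illustrative distributions (not from the paper): crack depth a and
        # fracture toughness K_Ic for a plate under a fixed membrane stress.
        a     = rng.lognormal(mean=np.log(0.02), sigma=0.4, size=n)   # m
        K_Ic  = rng.normal(loc=80.0, scale=8.0, size=n)               # MPa*sqrt(m)
        sigma = 200.0                                                 # MPa

        K_I = sigma * np.sqrt(np.pi * a)      # infinite-plate stress intensity factor
        p_f = np.mean(K_I >= K_Ic)            # Monte Carlo failure probability
        se  = np.sqrt(p_f * (1.0 - p_f) / n)  # sampling standard error

        print(f"P_f = {p_f:.2e} +/- {se:.1e}")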

  1. Accounting for uncertainty in ecological analysis: the strengths and limitations of hierarchical statistical modeling.

    Science.gov (United States)

    Cressie, Noel; Calder, Catherine A; Clark, James S; Ver Hoef, Jay M; Wikle, Christopher K

    2009-04-01

    Analyses of ecological data should account for the uncertainty in the process(es) that generated the data. However, accounting for these uncertainties is a difficult task, since ecology is known for its complexity. Measurement and/or process errors are often the only sources of uncertainty modeled when addressing complex ecological problems, yet analyses should also account for uncertainty in sampling design, in model specification, in parameters governing the specified model, and in initial and boundary conditions. Only then can we be confident in the scientific inferences and forecasts made from an analysis. Probability and statistics provide a framework that accounts for multiple sources of uncertainty. Given the complexities of ecological studies, the hierarchical statistical model is an invaluable tool. This approach is not new in ecology, and there are many examples (both Bayesian and non-Bayesian) in the literature illustrating the benefits of this approach. In this article, we provide a baseline for concepts, notation, and methods, from which discussion on hierarchical statistical modeling in ecology can proceed. We have also planted some seeds for discussion and tried to show where the practical difficulties lie. Our thesis is that hierarchical statistical modeling is a powerful way of approaching ecological analysis in the presence of inevitable but quantifiable uncertainties, even if practical issues sometimes require pragmatic compromises.

  2. Uncertainty Analysis of Resistance Tests in Ata Nutku Ship Model Testing Laboratory of Istanbul Technical University

    Directory of Open Access Journals (Sweden)

    Cihad DELEN

    2015-12-01

    In this study, some systematic resistance tests, which were performed in the Ata Nutku Ship Model Testing Laboratory of Istanbul Technical University (ITU), are examined in order to determine the uncertainties. Experiments conducted within the framework of mathematical and physical rules for the solution of engineering problems, along with the associated measurements and calculations, include uncertainty. To question the reliability of the obtained values, the existing uncertainties should be expressed as quantities. If the uncertainty of a measurement system is not known, the results do not carry a universal value. On the other hand, resistance is one of the most important parameters that should be considered in the process of ship design. Ship resistance cannot be determined precisely and reliably during the design phase because of the uncertainty sources involved in determining the resistance value. This may make it difficult to meet the required specifications in later design steps. The uncertainty arising from the resistance test has been estimated and compared for a displacement-type ship and for high speed marine vehicles according to the ITTC 2002 and ITTC 2014 regulations, which are related to uncertainty analysis methods. The advantages and disadvantages of both ITTC uncertainty analysis methods are also discussed.

  3. Demonstration of uncertainty quantification and sensitivity analysis for PWR fuel performance with BISON

    International Nuclear Information System (INIS)

    Zhang, Hongbin; Zhao, Haihua; Zou, Ling; Burns, Douglas; Ladd, Jacob

    2017-01-01

    BISON is an advanced fuels performance code being developed at Idaho National Laboratory and is the code of choice for fuels performance by the U.S. Department of Energy (DOE)’s Consortium for Advanced Simulation of Light Water Reactors (CASL) Program. An approach to uncertainty quantification and sensitivity analysis with BISON was developed and a new toolkit was created. A PWR fuel rod model was developed and simulated by BISON, and uncertainty quantification and sensitivity analysis were performed with eighteen uncertain input parameters. The maximum fuel temperature and gap conductance were selected as the figures of merit (FOM). Pearson, Spearman, and partial correlation coefficients were considered for all of the figures of merit in sensitivity analysis. (author)
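
    A small sketch of the correlation-based sensitivity measures mentioned above, computed on stand-in data (the input names and response model are assumptions, not BISON results). Partial correlation coefficients could be obtained analogously by correlating the residuals left after regressing out the remaining inputs.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)
        n = 500

        # Stand-in for sampled fuel-rod runs: three uncertain inputs and one
        # figure of merit (names and model are illustrative assumptions only).
        X = rng.normal(size=(n, 3))
        y = 1200 + 80*X[:, 0] - 45*X[:, 1] + 5*X[:, 2]**3 + rng.normal(0, 10, size=n)

        for j, name in enumerate(["fuel_conductivity", "gap_width", "rod_power"]):
            r_p, _ = stats.pearsonr(X[:, j], y)       # linear association
            r_s, _ = stats.spearmanr(X[:, j], y)      # monotone (rank) association
            print(f"{name:18s} Pearson={r_p:+.2f}  Spearman={r_s:+.2f}")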

  5. Phenomenological uncertainty analysis of containment building pressure load caused by severe accident sequences

    International Nuclear Information System (INIS)

    Park, S.Y.; Ahn, K.I.

    2014-01-01

    Highlights: • Phenomenological uncertainty analysis has been applied to level 2 PSA. • The methodology provides an alternative to simple deterministic analyses and sensitivity studies. • A realistic evaluation provides a more complete characterization of risks. • Uncertain parameters of the MAAP code for early containment failure were identified. - Abstract: This paper illustrates an application of a severe accident analysis code, MAAP, to the uncertainty evaluation of early containment failure scenarios employed in the containment event tree (CET) model of a reference plant. An uncertainty analysis of containment pressure behavior during severe accidents has been performed for an optimum assessment of an early containment failure model. The present application is mainly focused on determining an estimate of the containment building pressure load caused by severe accident sequences of a nuclear power plant. Key modeling parameters and phenomenological models employed for the present uncertainty analysis are closely related to in-vessel hydrogen generation, direct containment heating, and gas combustion. The basic approach of this methodology is to (1) develop the severe accident scenarios for which containment pressure load evaluations should be performed based on a level 2 PSA, (2) identify the severe accident phenomena relevant to an early containment failure, (3) identify the MAAP input parameters, sensitivity coefficients, and modeling options that describe or influence the early containment failure phenomena, (4) prescribe the likelihood descriptions of the potential range of these parameters, and (5) evaluate the code predictions using a number of random combinations of parameter inputs sampled from the likelihood distributions.

  6. The use of kragten spreadsheets for uncertainty evaluation of uranium potentiometric analysis by the Brazilian Safeguards Laboratory

    International Nuclear Information System (INIS)

    Silva, Jose Wanderley S. da; Barros, Pedro Dionisio de; Araujo, Radier Mario S. de

    2009-01-01

    In safeguards, independent analysis of the uranium content and enrichment of nuclear materials to verify operators' declarations is an important tool to evaluate the accountability system applied by nuclear installations. This determination may be performed by nondestructive (NDA) methods, generally done in the field using portable radiation detection systems, or by destructive (DA) methods of chemical analysis when more accurate and precise results are necessary. Samples for DA analysis are collected by inspectors during safeguards inspections and sent to the Safeguards Laboratory (LASAL) of the Brazilian Nuclear Energy Commission (CNEN), where the analysis takes place. The method used by LASAL for the determination of uranium in different physical and chemical forms is the Davies and Gray/NBL method using an automatic potentiometric titrator, which performs the titration of uranium(IV) with a standard solution of K2Cr2O7. Uncertainty budgets have been determined based on the concepts of the ISO 'Guide to the Expression of Uncertainty in Measurement' (GUM). In order to simplify the calculation of the uncertainty, a computational tool named the Kragten spreadsheet was used. Such a spreadsheet uses the concepts established by the GUM and provides results that numerically approximate those obtained by propagation of uncertainty with analytically determined sensitivity coefficients. The main parameters (input quantities) interfering with the uncertainty were studied. In order to evaluate their contribution to the final uncertainty, the uncertainties of all steps of the analytical method were estimated and compiled. (author)
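
    A minimal sketch of the Kragten approach as described: each input is perturbed by its standard uncertainty, the response changes approximate the sensitivity-weighted contributions, and these are combined in quadrature. The titration-style model and numbers are placeholders, not LASAL data.

        import math

        def kragten(f, x, u):
            # Perturb each input by its standard uncertainty; the response changes
            # approximate (df/dx_i) * u_i and are combined in quadrature.
            y0 = f(*x)
            contrib = []
            for i in range(len(x)):
                xp = list(x)
                xp[i] += u[i]
                contrib.append(f(*xp) - y0)
            return y0, math.sqrt(sum(c * c for c in contrib))

        # Illustrative titration-style model (placeholder values): analyte content
        # from titrant concentration c_t, titrant volume V_t, and sample mass m.
        model = lambda c_t, V_t, m: c_t * V_t / m
        y, u_y = kragten(model, x=[0.1000, 9.87, 1.0023], u=[0.0002, 0.02, 0.0005])
        print(f"result = {y:.5f}, combined standard uncertainty = {u_y:.5f}")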

  7. Uncertainty and sensitivity analysis in performance assessment for the proposed high-level radioactive waste repository at Yucca Mountain, Nevada

    International Nuclear Information System (INIS)

    Helton, Jon C.; Hansen, Clifford W.; Sallaberry, Cédric J.

    2012-01-01

    Extensive work has been carried out by the U.S. Department of Energy (DOE) in the development of a proposed geologic repository at Yucca Mountain (YM), Nevada, for the disposal of high-level radioactive waste. As part of this development, a detailed performance assessment (PA) for the YM repository was completed in 2008 and supported a license application by the DOE to the U.S. Nuclear Regulatory Commission (NRC) for the construction of the YM repository. The following aspects of the 2008 YM PA are described in this presentation: (i) conceptual structure and computational organization, (ii) uncertainty and sensitivity analysis techniques in use, (iii) uncertainty and sensitivity analysis for physical processes, and (iv) uncertainty and sensitivity analysis for expected dose to the reasonably maximally exposed individual (RMEI) specified in the NRC’s regulations for the YM repository. - Highlights: ► An overview of performance assessment for the proposed Yucca Mountain radioactive waste repository is presented. ► Conceptual structure and computational organization are described. ► Uncertainty and sensitivity analysis techniques are described. ► Uncertainty and sensitivity analysis results for physical processes are presented. ► Uncertainty and sensitivity analysis results for expected dose are presented.

  8. Uncertainty, joint uncertainty, and the quantum uncertainty principle

    International Nuclear Information System (INIS)

    Narasimhachar, Varun; Poostindouz, Alireza; Gour, Gilad

    2016-01-01

    Historically, the element of uncertainty in quantum mechanics has been expressed through mathematical identities called uncertainty relations, a great many of which continue to be discovered. These relations use diverse measures to quantify uncertainty (and joint uncertainty). In this paper we use operational information-theoretic principles to identify the common essence of all such measures, thereby defining measure-independent notions of uncertainty and joint uncertainty. We find that most existing entropic uncertainty relations use measures of joint uncertainty that yield themselves to a small class of operational interpretations. Our notion relaxes this restriction, revealing previously unexplored joint uncertainty measures. To illustrate the utility of our formalism, we derive an uncertainty relation based on one such new measure. We also use our formalism to gain insight into the conditions under which measure-independent uncertainty relations can be found. (paper)

  9. An estimation of uncertainties in containment P/T analysis using CONTEMPT/LT code

    International Nuclear Information System (INIS)

    Kang, Y.M.; Park, G.C.; Lee, U.C.; Kang, C.S.

    1991-01-01

    In a nuclear power plant, the containment design pressure and temperature (P/T) have traditionally been established on the basis of unrealistic conservatism, at the expense of economics. Thus, the uncertainties of the design P/T values have to be well defined through an extensive uncertainty analysis with plant-specific input data and/or the models used in the computer code. This study estimates the plant-specific uncertainties of the containment design P/T for the Kori-3 reactor using the Monte Carlo method. The Kori-3 plant parameters and the Uchida heat transfer coefficient were selected for statistical treatment after a sensitivity study. The Monte Carlo analysis was performed with the response surface method, using the CONTEMPT/LT code and the Latin hypercube sampling technique. Finally, the design values based on 95%/95% probability are compared with the worst-estimate values to assess the design margin. (author)

  10. Robustness of an uncertainty and sensitivity analysis of early exposure results with the MACCS reactor accident consequence model

    International Nuclear Information System (INIS)

    Helton, J.C.; Johnson, J.D.; McKay, M.D.; Shiver, A.W.; Sprung, J.L.

    1995-01-01

    Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis and stepwise regression analysis were used in an investigation with the MACCS model of the early health effects associated with a severe accident at a nuclear power station. The following results were obtained in tests to check the robustness of the analysis techniques: two independent Latin hypercube samples produced similar uncertainty and sensitivity analysis results; setting important variables to best-estimate values produced substantial reductions in uncertainty, while setting the less important variables to best-estimate values had little effect on uncertainty; similar sensitivity analysis results were obtained when the original uniform and loguniform distributions assigned to the 34 imprecisely known input variables were changed to left-triangular distributions and then to right-triangular distributions; and analyses with rank-transformed and logarithmically-transformed data produced similar results and substantially outperformed analyses with raw (i.e., untransformed) data.

  11. Probabilistic accident consequence uncertainty analysis: Food chain uncertainty assessment. Volume 1: Main report

    Energy Technology Data Exchange (ETDEWEB)

    Brown, J. [National Radiological Protection Board (United Kingdom)]; Goossens, L.H.J.; Kraan, B.C.P. [Delft Univ. of Technology (Netherlands)] [and others]

    1997-06-01

    This volume is the first of a two-volume document that summarizes a joint project conducted by the US Nuclear Regulatory Commission and the European Commission to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This document reports on an ongoing project to assess uncertainty in the MACCS and COSYMA calculations for the offsite consequences of radionuclide releases by hypothetical nuclear power plant accidents. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain variables that affect calculations of offsite consequences. The expert judgment elicitation procedure and its outcomes are described in these volumes. Other panels were formed to consider uncertainty in other aspects of the codes. Their results are described in companion reports. Volume 1 contains background information and a complete description of the joint consequence uncertainty study. Volume 2 contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures for both panels, (3) the rationales and results for the panels on soil and plant transfer and animal transfer, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  13. The uncertainty analysis of a liquid metal reactor for burning minor actinides from light water reactors

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Hang Bok [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)]

    1999-12-31

    The neutronics analysis of a liquid metal reactor for burning minor actinides has shown that uncertainties in the nuclear data of several key minor actinide isotopes can introduce large uncertainties in the predicted performance of the core. A comprehensive sensitivity and uncertainty analysis was performed on a 1200 MWth actinide burner designed for a low burnup reactivity swing, negative Doppler coefficient, and low sodium void worth. Sensitivities were generated using depletion perturbation methods for the equilibrium cycle of the reactor, and covariance data was taken from ENDF/B-V and other published sources. The relative uncertainties in the burnup swing, Doppler coefficient, and void worth were conservatively estimated to be 180%, 97%, and 46%, respectively. 5 refs., 1 fig., 3 tabs. (Author)
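
    Combining sensitivities with covariance data as described corresponds to the first-order "sandwich" rule, u_rel^2 = s^T C s. A minimal sketch with invented numbers (the study's actual sensitivities and covariances are not reproduced here):

        import numpy as np

        # Illustrative numbers only: relative sensitivities of a core response to
        # three minor-actinide cross sections, with an assumed covariance matrix.
        s = np.array([1.8, -0.9, 0.4])          # relative sensitivity coefficients
        C = np.array([[0.30, 0.05, 0.00],       # relative covariance of the data
                      [0.05, 0.40, 0.02],
                      [0.00, 0.02, 0.25]])

        rel_var = s @ C @ s                     # first-order "sandwich" rule
        print(f"relative uncertainty = {np.sqrt(rel_var):.2f}")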

  15. Sensitivity analysis on uncertainty variables affecting the NPP's LUEC with probabilistic approach

    International Nuclear Information System (INIS)

    Nuryanti; Akhmad Hidayatno; Erlinda Muslim

    2013-01-01

    One thing that is quite crucial to review prior to any investment decision on a nuclear power plant (NPP) project is the calculation of project economics, including the calculation of the Levelized Unit Electricity Cost (LUEC). Infrastructure projects such as NPPs are vulnerable to a number of uncertainty variables. Information on the uncertainty variables to which the LUEC value is most sensitive is necessary so that cost overruns can be avoided. Therefore, this study aimed to perform a sensitivity analysis on the variables that affect the LUEC with a probabilistic approach. This analysis was done using the Monte Carlo technique, which simulates the relationship between the uncertainty variables and their visible impact on the LUEC. The sensitivity analysis results show significant changes in the LUEC value of the AP1000 and OPR due to the sensitivity to investment cost and capacity factors, while the LUEC changes due to the sensitivity to the U3O8 price do not look significant. (author)

  16. Model Uncertainty and Robustness: A Computational Framework for Multimodel Analysis

    Science.gov (United States)

    Young, Cristobal; Holsteen, Katherine

    2017-01-01

    Model uncertainty is pervasive in social science. A key question is how robust empirical results are to sensible changes in model specification. We present a new approach and applied statistical software for computational multimodel analysis. Our approach proceeds in two steps: First, we estimate the modeling distribution of estimates across all…

  17. Uncertainty analysis in vulnerability estimations for elements at risk- a review of concepts and some examples on landslides

    Science.gov (United States)

    Ciurean, R. L.; Glade, T.

    2012-04-01

    Decision under uncertainty is a constant of everyday life and an important component of risk management and governance. Recently, experts have emphasized the importance of quantifying uncertainty in all phases of landslide risk analysis. Due to its multi-dimensional and dynamic nature, (physical) vulnerability is inherently complex, and the "degree of loss" estimates are imprecise and to some extent even subjective. Uncertainty analysis introduces quantitative modeling approaches that allow for a more explicitly objective output, improving the risk management process as well as enhancing communication between various stakeholders for better risk governance. This study presents a review of concepts for uncertainty analysis in the vulnerability of elements at risk to landslides. Different semi-quantitative and quantitative methods are compared based on their feasibility in real-world situations, hazard dependency, process stage in vulnerability assessment (i.e. input data, model, output), and applicability within an integrated landslide hazard and risk framework. The resulting observations will help to identify current gaps and future needs in vulnerability assessment, including estimation of uncertainty propagation, transferability of the methods, and development of visualization tools, but they also address basic questions such as what uncertainty is and how it can be quantified or treated in a reliable and reproducible way.

  18. Stand-alone core sensitivity and uncertainty analysis of ALFRED from Monte Carlo simulations

    International Nuclear Information System (INIS)

    Pérez-Valseca, A.-D.; Espinosa-Paredes, G.; François, J.L.; Vázquez Rodríguez, A.; Martín-del-Campo, C.

    2017-01-01

    Highlights: • Methodology based on Monte Carlo simulation. • Sensitivity analysis of a Lead Fast Reactor (LFR). • Uncertainty and regression analysis of an LFR. • For a 10% change in the core inlet flow, the response in thermal power is a 0.58% change. • For a 2.5% change in the inlet lead temperature, the response is 1.87% in power. - Abstract: The aim of this paper is the sensitivity and uncertainty analysis of a Lead-Cooled Fast Reactor (LFR) based on Monte Carlo simulations with sample sizes up to 2000. The methodology developed in this work considers the uncertainty of sensitivities and the uncertainty of output variables due to single-input-variable variations. The Advanced Lead fast Reactor European Demonstrator (ALFRED) is analyzed to determine the behavior of the essential parameters due to the effects of the mass flow and temperature of the liquid lead. The ALFRED core mathematical model developed in this work is fully transient and takes into account the heat transfer in an annular fuel pellet design, the thermo-fluids in the core, and the neutronic processes, which are modeled with point kinetics including fuel temperature feedback and expansion effects. The sensitivity, evaluated in terms of the relative standard deviation (RSD), showed that for a 10% change in the core inlet flow, the response in thermal power is a 0.58% change, and for a 2.5% change in the inlet lead temperature it is 1.87%. The regression analysis with the mass flow rate as the predictor variable showed a statistically valid cubic correlation for the neutron flux, and a linear relationship was found for the neutron flux as a function of the lead temperature. No statistically valid correlation was observed for the reactivity as a function of the mass flow rate or of the lead temperature. These correlations are useful for the study, analysis, and design of any LFR.

  19. Technology relevance of the 'uncertainty analysis in modelling' project for nuclear reactor safety

    International Nuclear Information System (INIS)

    D'Auria, F.; Langenbuch, S.; Royer, E.; Del Nevo, A.; Parisi, C.; Petruzzi, A.

    2007-01-01

    The OECD/NEA Nuclear Science Committee (NSC) endorsed the setting up of an Expert Group on Uncertainty Analysis in Modelling (UAM) in June 2006. This Expert Group reports to the Working Party on Scientific issues in Reactor Systems (WPRS) and because it addresses multi-scale / multi-physics aspects of uncertainty analysis, it will work in close co-ordination with the benchmark groups on coupled neutronics-thermal-hydraulics and on coupled core-plant problems, and the CSNI Group on Analysis and Management of Accidents (GAMA). The NEA/NSC has endorsed that this activity be undertaken with Prof. K. Ivanov from the Pennsylvania State University (PSU) as the main coordinator and host with the assistance of the Scientific Board. The objective of the proposed work is to define, coordinate, conduct, and report an international benchmark for uncertainty analysis in best-estimate coupled code calculations for design, operation, and safety analysis of LWRs entitled 'OECD UAM LWR Benchmark'. At the First Benchmark Workshop (UAM-1) held from 10 to 11 May 2007 at the OECD/NEA, one action concerned the forming of a sub-group, led by F. D'Auria, member of CSNI, responsible for defining the objectives, the impact and benefit of the UAM for safety and licensing. This report is the result of this action by the subgroup. (authors)

  20. Sensitivity/uncertainty analysis for the Hiroshima dosimetry reevaluation effort

    International Nuclear Information System (INIS)

    Broadhead, B.L.; Lillie, R.A.; Pace, J.V. III; Cacuci, D.G.

    1987-01-01

    Uncertainty estimates and cross correlations by range/survivor location have been obtained for the free-in-air (FIA) tissue kerma for the Hiroshima atomic event. These uncertainties in the FIA kerma include contributions due to various modeling parameters and the basic cross section data and are given at three ground ranges, 700, 1000 and 1500 m. The estimated uncertainties are nearly constant over the given ground ranges and are approximately 27% for the prompt neutron kerma and secondary gamma kerma and 35% for the prompt gamma kerma. The total kerma uncertainty is dominated by the secondary gamma kerma uncertainties, which are in turn largely due to the modeling parameter uncertainties.

  1. Uncertainties combined in algae and water in chemical analysis in determinations with ICP-OES

    International Nuclear Information System (INIS)

    Souza, Poliana Santos de

    2014-01-01

    One way to qualify the determination of trace elements in algae and water is through uncertainty calculations. Inductively coupled plasma optical emission spectrometry (ICP-OES) is widely used in this procedure because it allows the analysis of waters and of solid samples. Here, some elements (Fe, Ca and Mg) were used to calculate the uncertainty. (author)

  2. Sensitivity Analysis and Uncertainty Quantification for the LAMMPS Molecular Dynamics Simulation Code

    Energy Technology Data Exchange (ETDEWEB)

    Picard, Richard Roy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Bhat, Kabekode Ghanasham [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-07-18

    We examine sensitivity analysis and uncertainty quantification for molecular dynamics simulation. Extreme (large or small) output values for the LAMMPS code often occur at the boundaries of input regions, and uncertainties in those boundary values are overlooked by common SA methods. Similarly, input values for which code outputs are consistent with calibration data can also occur near boundaries. Upon applying approaches in the literature for imprecise probabilities (IPs), much more realistic results are obtained than for the complacent application of standard SA and code calibration.

  3. Methodology to carry out a sensitivity and uncertainty analysis for cross sections using a coupled model Trace-Parcs

    International Nuclear Information System (INIS)

    Reyes F, M. C.; Del Valle G, E.; Gomez T, A. M.; Sanchez E, V.

    2015-09-01

    A methodology was implemented to carry out a sensitivity and uncertainty analysis for the cross sections used in a coupled Trace/Parcs model of a control rod drop transient in a BWR-5. A model of the reactor core for the neutronic code Parcs was used, in which the assemblies located in the core are described. The thermal-hydraulic model in Trace was simple: a single component of type CHAN represented all the core assemblies inside a single vessel, for which boundary conditions were established. The thermal-hydraulic part was coupled with the neutronic part, first for the steady state, and then the control rod drop transient was run for the sensitivity and uncertainty analysis. To analyze the cross sections used in the coupled Trace/Parcs model during the transient, probability density functions were generated for 22 parameters selected from the neutronic parameters used by Parcs, yielding 100 different cases for the coupled model, each with a different cross-section database. All these cases were executed with the coupled model, producing 100 different output files for the control rod drop transient, with emphasis on the nominal power, for which an uncertainty analysis was performed and the uncertainty band generated. With this analysis it is possible to observe the ranges of results of the selected responses as the chosen uncertainty parameters vary. The sensitivity analysis complements the uncertainty analysis by identifying the parameter or parameters with the most influence on the results, so that attention can focus on these parameters in order to better understand their effects. Beyond the results obtained, since this is not a model with real operation data, the importance of this work lies in demonstrating the application of the methodology for carrying out sensitivity and uncertainty analyses. (Author)

  4. Review of best estimate plus uncertainty methods of thermal-hydraulic safety analysis

    International Nuclear Information System (INIS)

    Prosek, A.; Mavko, B.

    2003-01-01

    In 1988, the United States Nuclear Regulatory Commission approved the revised rule on the acceptance of emergency core cooling system (ECCS) performance. Since then, there has been significant interest in the development of codes and methodologies for best-estimate loss-of-coolant accident (LOCA) analyses. Several new best estimate plus uncertainty (BEPU) methods were developed worldwide. The purpose of this paper is to review the developments in the direction of best estimate approaches with uncertainty quantification and to discuss the problems in practical applications of BEPU methods. In general, the licensee methods follow the original methods. The study indicated that uncertainty analysis with random sampling of input parameters and the use of order statistics for desired tolerance limits of output parameters is today a commonly accepted and mature approach. (author)
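
    The order-statistics approach mentioned above is commonly traced to Wilks' formula: for a one-sided, first-order tolerance limit, the smallest number of code runs n satisfies 1 - gamma^n >= beta. A minimal sketch (the classic 95%/95% case gives n = 59):

        import math

        def wilks_n(gamma=0.95, beta=0.95):
            """Smallest sample size n such that the largest of n runs bounds the
            gamma-quantile of the output with confidence beta (one-sided, 1st order)."""
            return math.ceil(math.log(1.0 - beta) / math.log(gamma))

        print(wilks_n())             # 59: the classic 95%/95% first-order result
        print(wilks_n(0.95, 0.99))   # 90 runs for 99% confidence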

  5. Parameter sensitivity and uncertainty of the forest carbon flux model FORUG : a Monte Carlo analysis

    Energy Technology Data Exchange (ETDEWEB)

    Verbeeck, H.; Samson, R.; Lemeur, R. [Ghent Univ., Ghent (Belgium). Laboratory of Plant Ecology; Verdonck, F. [Ghent Univ., Ghent (Belgium). Dept. of Applied Mathematics, Biometrics and Process Control

    2006-06-15

    The FORUG model is a multi-layer process-based model that simulates carbon dioxide (CO2) and water exchange between forest stands and the atmosphere. The main model outputs are net ecosystem exchange (NEE), total ecosystem respiration (TER), gross primary production (GPP) and evapotranspiration. This study used a sensitivity analysis to identify the parameters contributing to NEE uncertainty in the FORUG model. The aim was to determine whether it is necessary to estimate the uncertainty of all parameters of a model to determine overall output uncertainty. Data used in the study were the meteorological and flux data of beech trees in Hesse. The Monte Carlo method was used in combination with multiple linear regression to rank parameters by sensitivity and uncertainty. Simulations were run in which parameters were assigned probability distributions and the effect of variance in the parameters on the output distribution was assessed. The uncertainty of the output for NEE was estimated. Based on an arbitrary uncertainty for 10 key parameters, a standard deviation of 0.88 Mg C per year was found for NEE, equal to 24 per cent of the mean value of NEE. The sensitivity analysis showed that the overall output uncertainty of the FORUG model could be determined by accounting for only a few key parameters, which were identified as corresponding to critical parameters in the literature. It was concluded that the 10 most important parameters determined more than 90 per cent of the output uncertainty. High-ranking parameters included soil respiration, photosynthesis, and crown architecture. It was concluded that the Monte Carlo technique is a useful tool for ranking the uncertainty of parameters of process-based forest flux models. 48 refs., 2 tabs., 2 figs.
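
    A small sketch of the Monte Carlo plus multiple linear regression ranking described above, using standardized regression coefficients (SRCs) on stand-in data; the parameter names and response model are assumptions, not FORUG output.

        import numpy as np

        rng = np.random.default_rng(3)
        n, names = 400, ["g_s_max", "V_cmax", "LAI"]

        # Stand-in Monte Carlo sample: three varied parameters and an NEE-like output.
        X = rng.normal(size=(n, 3))
        y = 2.0*X[:, 0] - 0.8*X[:, 1] + 0.3*X[:, 2] + rng.normal(0, 0.5, size=n)

        # Standardized regression coefficients: regress z-scored y on z-scored X;
        # SRC_i^2 approximates the share of output variance explained by parameter i.
        Z = (X - X.mean(0)) / X.std(0)
        zy = (y - y.mean()) / y.std()
        src, *_ = np.linalg.lstsq(Z, zy, rcond=None)

        for name, b in sorted(zip(names, src), key=lambda t: -abs(t[1])):
            print(f"{name:8s} SRC = {b:+.2f}  (variance share ~ {b*b:.0%})")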

  6. Analysis of parameter uncertainties in the assessment of seismic risk for nuclear power plants

    International Nuclear Information System (INIS)

    Yucemen, S.M.

    1981-04-01

    Probabilistic and statistical methods are used to develop a procedure by which the seismic risk at a specific site can be systematically analyzed. The proposed probabilistic procedure provides a consistent method for the modelling, analysis and updating of the uncertainties that are involved in the seismic risk analysis for nuclear power plants. Methods are proposed for including these uncertainties in the final value of the calculated risks. Two specific case studies are presented in detail to illustrate the application of the probabilistic method of seismic risk evaluation and to investigate the sensitivity of the results to different assumptions.

  7. The NASA Langley Multidisciplinary Uncertainty Quantification Challenge

    Science.gov (United States)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2014-01-01

    This paper presents the formulation of an uncertainty quantification challenge problem consisting of five subproblems. These problems focus on key aspects of uncertainty characterization, sensitivity analysis, uncertainty propagation, extreme-case analysis, and robust design.

  8. Applications of the TSUNAMI sensitivity and uncertainty analysis methodology

    International Nuclear Information System (INIS)

    Rearden, Bradley T.; Hopper, Calvin M.; Elam, Karla R.; Goluoglu, Sedat; Parks, Cecil V.

    2003-01-01

    The TSUNAMI sensitivity and uncertainty analysis tools under development for the SCALE code system have recently been applied in four criticality safety studies. TSUNAMI is used to identify applicable benchmark experiments for criticality code validation, assist in the design of new critical experiments for a particular need, reevaluate previously computed computational biases, and assess the validation coverage and propose a penalty for noncoverage for a specific application. (author)

  9. Application of quantile functions for the analysis and comparison of gas pressure balance uncertainties

    Directory of Open Access Journals (Sweden)

    Ramnath Vishal

    2017-01-01

    Traditionally, in the field of pressure metrology, uncertainty quantification was performed with the use of the Guide to the Expression of Uncertainty in Measurement (GUM); however, with the introduction of the GUM Supplement 1 (GS1), the use of Monte Carlo simulations has become an accepted practice for uncertainty analysis in metrology for mathematical models in which the underlying assumptions of the GUM are not valid. Consequently, the use of quantile functions was developed as a means to easily summarize and report on uncertainty numerical results that are based on Monte Carlo simulations. In this paper, we consider the case of a piston-cylinder operated pressure balance, where the effective area is modelled in terms of a combination of explicit/implicit and linear/non-linear models, and show how quantile functions may be applied to analyse results and compare uncertainties from a mixture of GUM and GS1 methodologies.
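
    A minimal sketch of summarizing a Monte Carlo output sample by its empirical quantile function, in the spirit described above; the sample is synthetic, not a GS1 pressure-balance result.

        import numpy as np

        rng = np.random.default_rng(11)

        # Stand-in Monte Carlo output: effective area of a pressure balance, mm^2
        # (a skewed sample; the numbers are placeholders, not metrology results).
        A_eff = 980.0 + rng.gamma(shape=4.0, scale=0.005, size=100_000)

        # The empirical quantile function summarises the whole distribution; a
        # 95% coverage interval is read off directly from two quantiles.
        p = np.array([0.025, 0.25, 0.50, 0.75, 0.975])
        for pi, q in zip(p, np.quantile(A_eff, p)):
            print(f"Q({pi:.3f}) = {q:.6f} mm^2")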

  10. Stochastic dynamic analysis of marine risers considering Gaussian system uncertainties

    Science.gov (United States)

    Ni, Pinghe; Li, Jun; Hao, Hong; Xia, Yong

    2018-03-01

    This paper performs the stochastic dynamic response analysis of marine risers with material uncertainties, i.e. in the mass density and elastic modulus, by using the Stochastic Finite Element Method (SFEM) and a model reduction technique. These uncertainties are assumed to have Gaussian distributions. The random mass density and elastic modulus are represented by using the Karhunen-Loève (KL) expansion. The Polynomial Chaos (PC) expansion is adopted to represent the vibration response because the covariance of the output is unknown. Model reduction based on the Iterated Improved Reduced System (IIRS) technique is applied to eliminate the PC coefficients of the slave degrees of freedom and thus reduce the dimension of the stochastic system. Monte Carlo Simulation (MCS) is conducted to obtain the reference response statistics. Two numerical examples are studied in this paper. The response statistics from the proposed approach are compared with those from MCS. It is noted that the computational time is significantly reduced while the accuracy is maintained. The results demonstrate the efficiency of the proposed approach for stochastic dynamic response analysis of marine risers.
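
    A minimal sketch of the Karhunen-Loève step described above: an assumed exponential covariance is discretized, eigendecomposed, and truncated to its dominant modes to generate field realisations. The correlation length, variability, and nominal modulus are illustrative assumptions.

        import numpy as np

        # Discretised exponential covariance for, e.g., elastic modulus along a riser.
        x = np.linspace(0.0, 100.0, 200)                 # m, riser coordinate
        std, L = 0.1, 20.0                               # 10% variability, correlation length
        C = std**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / L)

        # Karhunen-Loeve: eigendecomposition of C, truncated to M dominant modes.
        lam, phi = np.linalg.eigh(C)
        idx = np.argsort(lam)[::-1][:10]                 # keep M = 10 modes
        lam, phi = lam[idx], phi[:, idx]

        # One realisation: E(x) = E0 * (1 + sum_k sqrt(lam_k) * xi_k * phi_k(x))
        rng = np.random.default_rng(5)
        xi = rng.standard_normal(10)
        field = 210e9 * (1.0 + phi @ (np.sqrt(lam) * xi))   # Pa, around E0 = 210 GPa
        print(field.min(), field.max())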

  11. Procedures for uncertainty and sensitivity analysis in repository performance assessment

    International Nuclear Information System (INIS)

    Poern, K.; Aakerlund, O.

    1985-10-01

    The objective of the project was mainly a literature study of available methods for the treatment of parameter uncertainty propagation and sensitivity aspects in complete models such as those concerning geologic disposal of radioactive waste. The study, which has run parallel with the development of a code package (PROPER) for computer-assisted analysis, also aims at the choice of accurate, cost-effective methods for uncertainty and sensitivity analysis. Such a choice depends on several factors, like the number of input parameters, the capacity of the model, and the computer resources required to use the model. Two basic approaches are addressed in the report. In the first, the model of interest is directly simulated by an efficient sampling technique to generate an output distribution. In the other basic method, the model is replaced by an approximating analytical response surface, which is then used in the sampling phase or in moment matching to generate the output distribution. Both approaches are illustrated by simple examples in the report. (author)

  12. Treatment of measurement uncertainties at the power burst facility

    International Nuclear Information System (INIS)

    Meyer, L.C.

    1980-01-01

    The treatment of measurement uncertainty at the Power Burst Facility provides a means of improving data integrity as well as meeting standard practice reporting requirements. This is accomplished by performing the uncertainty analysis in two parts: a test independent uncertainty analysis and a test dependent uncertainty analysis. The test independent uncertainty analysis is performed on instrumentation used repeatedly from one test to the next, and does not have to be repeated for each test except for improved or new types of instruments. A test dependent uncertainty analysis is performed on each test based on the test independent uncertainties, modified as required by test specifications, experiment fixture design, and the historical performance of instruments on similar tests. The methodology for performing uncertainty analysis based on the National Bureau of Standards method is reviewed, with examples applied to nuclear instrumentation.

  14. Bayesian models for comparative analysis integrating phylogenetic uncertainty

    Science.gov (United States)

    2012-01-01

    Background Uncertainty in comparative analyses can come from at least two sources: a) phylogenetic uncertainty in the tree topology or branch lengths, and b) uncertainty due to intraspecific variation in trait values, either due to measurement error or natural individual variation. Most phylogenetic comparative methods do not account for such uncertainties. Not accounting for these sources of uncertainty leads to false perceptions of precision (confidence intervals will be too narrow) and inflated significance in hypothesis testing (e.g. p-values will be too small). Although there is some application-specific software for fitting Bayesian models accounting for phylogenetic error, more general and flexible software is desirable. Methods We developed models to directly incorporate phylogenetic uncertainty into a range of analyses that biologists commonly perform, using a Bayesian framework and Markov Chain Monte Carlo analyses. Results We demonstrate applications in linear regression, quantification of phylogenetic signal, and measurement error models. Phylogenetic uncertainty was incorporated by applying a prior distribution for the phylogeny, where this distribution consisted of the posterior tree sets from Bayesian phylogenetic tree estimation programs. The models were analysed using simulated data sets, and applied to a real data set on plant traits, from rainforest plant species in Northern Australia. Analyses were performed using the free and open source software OpenBUGS and JAGS. Conclusions Incorporating phylogenetic uncertainty through an empirical prior distribution of trees leads to more precise estimation of regression model parameters than using a single consensus tree and enables a more realistic estimation of confidence intervals. In addition, models incorporating measurement errors and/or individual variation, in one or both variables, are easily formulated in the Bayesian framework. We show that BUGS is a useful, flexible general purpose tool for

  15. On uncertainty and local sensitivity analysis for transient conjugate heat transfer problems

    International Nuclear Information System (INIS)

    Rauch, Christian

    2012-01-01

    The need for simulating the real-world behavior of automobiles has led to more and more sophisticated models of various physical phenomena being coupled together. This increases the number of parameters to be set and, consequently, the required knowledge of their relative importance for the solution and of the theory behind them. Sensitivity and uncertainty analysis provides the knowledge of parameter importance. In this paper, a thermal radiation solver is considered that performs conduction calculations and receives the heat transfer coefficient and fluid temperature at a thermal node. The equations of local, discrete, and transient sensitivities for the conjugate heat transfer model solved by the finite difference method are derived for some parameters. In the past, formulations for the finite element method have been published. This paper builds on the steady-state formulation published previously by the author. A numerical analysis of the stability of the solution matrix is conducted. From the normalized sensitivity coefficients, dimensionless uncertainty factors are calculated. On a simplified example, the relative importance of the heat transfer modes at various locations is then investigated through those uncertainty factors and their changes over time.
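
    A small sketch of a normalized (logarithmic) sensitivity coefficient, S = (p/y)·(∂y/∂p), evaluated by central finite differences; the thermal node model below is a stand-in, not the solver described in the paper.

        # Normalized sensitivity by central differences around the nominal point.
        def normalized_sensitivity(model, params, key, rel_step=1e-3):
            p0 = params[key]
            hi = dict(params, **{key: p0 * (1 + rel_step)})
            lo = dict(params, **{key: p0 * (1 - rel_step)})
            dy_dp = (model(hi) - model(lo)) / (2 * p0 * rel_step)
            return p0 * dy_dp / model(params)

        # Toy node temperature vs. heat transfer coefficient h, fluid temperature
        # T_f, and a conduction coupling k to a 300 K wall (illustrative only).
        node_T = lambda p: (p["h"] * p["T_f"] + p["k"] * 300.0) / (p["h"] + p["k"])
        params = {"h": 25.0, "T_f": 400.0, "k": 5.0}
        for key in params:
            print(key, round(normalized_sensitivity(node_T, params, key), 3))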

  16. Information Synthesis in Uncertainty Studies: Application to the Analysis of the BEMUSE Results

    International Nuclear Information System (INIS)

    Baccou, J.; Chojnacki, E.; Destercke, S.

    2013-01-01

    Computer codes are used to demonstrate that nuclear power plants are designed to respond safely to numerous postulated accidents. The models in these computer codes are an approximation of the real physical behaviour occurring during an accident. Moreover, the data used to run these codes are also known with limited accuracy. Therefore, the code predictions are not exact but uncertain. To deal with these uncertainties, 'best estimate' codes with 'best estimate' input data are used to obtain a best estimate calculation, and it is necessary to derive the uncertainty associated with their estimations. For this reason, regulatory authorities demand, in particular from technical safety organizations such as the French Institut de Radioprotection et de Surete Nucleaire (IRSN), results taking into account all the uncertainty sources when assessing whether safety quantities are below critical values. Uncertainty analysis can be seen as a problem of information treatment, and a special effort on four methodological key issues has to be made. The first one is related to information modelling. In safety studies, one can distinguish two kinds of uncertainty. The first type, called aleatory uncertainty, is due to the natural variability of an observed phenomenon and cannot be reduced by the arrival of new information. The second type, called epistemic uncertainty, can arise from imprecision. Contrary to the previous one, this uncertainty can be reduced by increasing the state of knowledge. Performing a relevant information modelling therefore requires working with a mathematical formalism flexible enough to faithfully treat both types of uncertainties. The second one deals with information propagation through a computer code. It requires running the codes several times and is usually achieved thanks to a coupling to a statistical software. The complexity of the propagation is strongly connected to the mathematical framework used for the information modelling. The more general the

  17. Uncertainty modelling and analysis of volume calculations based on a regular grid digital elevation model (DEM)

    Science.gov (United States)

    Li, Chang; Wang, Qing; Shi, Wenzhong; Zhao, Sisi

    2018-05-01

    The accuracy of earthwork calculations that compute terrain volume is critical to digital terrain analysis (DTA). The uncertainties in volume calculations (VCs) based on a DEM are primarily related to three factors: 1) model error (ME), which is caused by the algorithm adopted for the VC model; 2) discrete error (DE), which is usually caused by DEM resolution and terrain complexity; and 3) propagation error (PE), which is caused by errors in the variables. Based on these factors, the uncertainty modelling and analysis of VCs based on a regular grid DEM are investigated in this paper. In particular, an approach is proposed to quantify the uncertainty of VCs through a confidence interval based on truncation error (TE). In the experiments, the trapezoidal double rule (TDR) and Simpson's double rule (SDR) were used to calculate volume, where the TE is the major ME, and six simulated regular grid DEMs with different terrain complexity and resolution (i.e. DE) were generated from a Gauss synthetic surface to easily obtain the theoretical true value and eliminate the interference of data errors. For PE, Monte-Carlo simulation techniques and spatial autocorrelation were used to represent DEM uncertainty. This study can enrich the uncertainty modelling and analysis-related theories of geographic information science.
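
    The two quadrature rules are easy to compare on a Gauss synthetic surface, where the exact volume is known in closed form; a brief sketch follows (the grid sizes and domain are arbitrary choices, not the paper's test DEMs):

        import numpy as np
        from scipy.integrate import trapezoid, simpson
        from scipy.special import erf

        # Gauss synthetic surface on a regular grid; the true volume is known:
        # integral of exp(-x^2 - y^2) over [-2, 2]^2 = (sqrt(pi) * erf(2))^2.
        exact = (np.sqrt(np.pi) * erf(2.0))**2

        for n in (9, 17, 33, 65):                 # grid resolutions (the DE factor)
            x = np.linspace(-2.0, 2.0, n)
            X, Y = np.meshgrid(x, x)
            Z = np.exp(-X**2 - Y**2)
            tdr = trapezoid(trapezoid(Z, x, axis=1), x)       # trapezoidal double rule
            sdr = simpson(simpson(Z, x=x, axis=1), x=x)       # Simpson's double rule
            print(f"n={n:3d}  TDR err={tdr - exact:+.2e}  SDR err={sdr - exact:+.2e}")

    The truncation error shrinks much faster under SDR, which is the behaviour the TE-based confidence interval exploits.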

  18. The deuteron-radius puzzle is alive: A new analysis of nuclear structure uncertainties

    Science.gov (United States)

    Hernandez, O. J.; Ekström, A.; Nevo Dinur, N.; Ji, C.; Bacca, S.; Barnea, N.

    2018-03-01

    To shed light on the deuteron radius puzzle we analyze the theoretical uncertainties of the nuclear structure corrections to the Lamb shift in muonic deuterium. We find that the discrepancy between the calculated two-photon exchange correction and the corresponding experimentally inferred value by Pohl et al. [1] remains. The present result is consistent with our previous estimate, although the discrepancy is reduced from 2.6 σ to about 2 σ. The error analysis includes statistical as well as systematic uncertainties stemming from the use of nucleon-nucleon interactions derived from chiral effective field theory at various orders. We therefore conclude that nuclear theory uncertainty is most likely not the source of the discrepancy.

  19. Uncertainties in Nuclear Proliferation Modeling

    International Nuclear Information System (INIS)

    Kim, Chul Min; Yim, Man-Sung; Park, Hyeon Seok

    2015-01-01

    There have been various efforts in the research community to understand the determinants of nuclear proliferation and to develop quantitative tools to predict nuclear proliferation events. Such systematic approaches have shown the potential to provide warning for the international community to prevent nuclear proliferation activities. However, there is still considerable debate over the robustness of the estimated effects of the determinants and of the projection results. Some studies have shown that several factors can cause uncertainties in previous quantitative nuclear proliferation modeling works. This paper analyzes the uncertainties in past approaches and suggests future work with respect to proliferation history, analysis methods, and variable selection. The research community still lacks knowledge of the sources of uncertainty in current models. Fundamental problems in modeling will remain even if more advanced modeling methods are developed. Before developing sophisticated models based on hypotheses about time-dependent proliferation determinants, graph theory, etc., it is important to analyze the uncertainty of current models in order to solve the fundamental problems of nuclear proliferation modeling. The uncertainty arising from different codings of proliferation history is small. The serious problems stem from limited analysis methods and correlation among the variables. Problems in regression analysis and survival analysis cause large uncertainties when using the same dataset, which decreases the robustness of the results. Inaccurate variables for nuclear proliferation also increase the uncertainty. To overcome these problems, further quantitative research should focus on analyzing the knowledge suggested in qualitative nuclear proliferation studies.

  20. Sensitivity analysis of respiratory parameter uncertainties: impact of criterion function form and constraints.

    Science.gov (United States)

    Lutchen, K R

    1990-08-01

    A sensitivity analysis based on weighted least-squares regression is presented to evaluate alternative methods for fitting lumped-parameter models to respiratory impedance data. The goal is to maintain parameter accuracy simultaneously with practical experiment design. The analysis focuses on predicting parameter uncertainties using a linearized approximation for joint confidence regions. Applications are with four-element parallel and viscoelastic models for 0.125- to 4-Hz data and a six-element model with separate tissue and airway properties for input and transfer impedance data from 2-64 Hz. The criterion function form was evaluated by comparing parameter uncertainties when data are fit as magnitude and phase, dynamic resistance and compliance, or real and imaginary parts of input impedance. The proper choice of weighting can make all three criterion variables comparable. For the six-element model, parameter uncertainties were predicted when both input impedance and transfer impedance are acquired and fit simultaneously. A fit to both data sets from 4 to 64 Hz could reduce parameter estimate uncertainties considerably from those achievable by fitting either alone. For the four-element models, use of an independent, but noisy, measure of static compliance was assessed as a constraint on model parameters. This may allow acceptable parameter uncertainties for a minimum frequency of 0.275-0.375 Hz rather than 0.125 Hz. This reduces data acquisition requirements from a 16- to a 5.33- to 8-s breath holding period. These results are approximations, and the impact of using the linearized approximation for the confidence regions is discussed.
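
    The linearized confidence-region idea can be sketched in a few lines: approximate the parameter covariance at the fitted estimate by (J^T W J)^-1 with W = diag(1/sigma_i^2). The two-parameter 'impedance' model, noise level and frequency grid below are invented stand-ins for the paper's four- and six-element models.

        import numpy as np

        # Toy model standing in for a lumped-parameter impedance fit:
        # y(f; R, C) = R + 1/(2*pi*f*C)   (resistance plus a compliance term)
        def model(f, R, C):
            return R + 1.0 / (2 * np.pi * f * C)

        def jacobian(f, R, C):
            dR = np.ones_like(f)
            dC = -1.0 / (2 * np.pi * f * C**2)
            return np.column_stack([dR, dC])

        f = np.linspace(0.125, 4.0, 30)                  # frequency range of the data (Hz)
        sigma = 0.05 * np.abs(model(f, 2.0, 0.05))       # assumed 5% measurement noise

        # Linearized approximation of the parameter covariance at the estimate:
        # Cov(theta) ~= (J^T W J)^-1 with W = diag(1/sigma_i^2).
        J = jacobian(f, 2.0, 0.05)
        W = np.diag(1.0 / sigma**2)
        cov = np.linalg.inv(J.T @ W @ J)
        for name, var in zip(("R", "C"), np.diag(cov)):
            print(f"{name}: 1-sigma uncertainty ~ {np.sqrt(var):.4g}")

    Re-running with the minimum frequency raised from 0.125 Hz shows the compliance uncertainty growing as low-frequency information is removed, which is the trade-off against shorter breath-holding periods discussed above.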

  1. May Day: A computer code to perform uncertainty and sensitivity analysis. Manuals

    International Nuclear Information System (INIS)

    Bolado, R.; Alonso, A.; Moya, J.M.

    1996-07-01

    The computer program May Day was developed at the Polytechnical University of Madrid to carry out uncertainty and sensitivity analysis in the evaluation of radioactive waste storage. (Author)

  2. Effect of Uncertainty Parameters in Blowdown and Reflood Models for OPR1000 LBLOCA Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Huh, Byung Gil; Jin, Chang Yong; Seul, Kwangwon; Hwang, Taesuk [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2014-05-15

    KINS (Korea Institute of Nuclear Safety) has also performed the audit calculation with the KINS Realistic Evaluation Methodology (KINS-REM) to confirm the validity of the licensee's calculation. In the BEPU method, it is very important to quantify the code and model uncertainty. This is reflected in the following requirement for BE calculations in Regulatory Guide 1.157: 'the code and models used are acceptable and applicable to the specific facility over the intended operating range and must quantify the uncertainty in the specific application'. In general, the uncertainty of a model/code should be obtained through data comparison with relevant integral- and separate-effect tests at different scales. However, it is not easy to determine these kinds of uncertainty because of the difficulty of accurately evaluating the various experiments. Therefore, expert judgment has been used in many cases, even with the limitation that the uncertainty ranges of important parameters can be wide and inaccurate. In the KINS-REM, six heat transfer parameters in the blowdown phase have been used to consider the uncertainty of the models. Recently, the MARS-KS code was modified to consider the uncertainty of five heat transfer parameters in the reflood phase. Accordingly, the uncertainty ranges for the parameters of the reflood models must be determined and the effect of these ranges evaluated. In this study, a large break LOCA (LBLOCA) analysis for OPR1000 was performed to identify the effect of uncertainty parameters in the blowdown and reflood models.

  3. Principal results of uncertainty and sensitivity analysis for the generic Spanish AGP-granite

    International Nuclear Information System (INIS)

    Bolado, R.; Moya, J.A.

    1998-01-01

    Recently, ENRESA published its Performance Assessment of a deep geologic repository in granite. This paper summarises the main results of the uncertainty and sensitivity analysis performed on the data generated for the main scenario in the ENRESA Performance Assessment. The uncertainty analysis allowed us to determine the most important radionuclides, which were 129I, 36Cl, 79Se and 126Sn, and to estimate upper bounds for the risk due to each of them and for the global risk. Since 129I was the most important radionuclide, the main efforts in the sensitivity study were devoted to identifying the parameters most influential on the maximum dose due to that radionuclide. The analysis shows that the order of magnitude of the maximum dose is essentially related to geosphere transport parameters. Nevertheless, the most influential parameters, when considering only the highest values of the maximum doses, are those that control the total amount of contaminant that can be driven into the main path to the biosphere. (Author) 3 refs

  4. Introducing uncertainty analysis of nucleation and crystal growth models in Process Analytical Technology (PAT) system design of crystallization processes.

    Science.gov (United States)

    Samad, Noor Asma Fazli Abdul; Sin, Gürkan; Gernaey, Krist V; Gani, Rafiqul

    2013-11-01

    This paper presents the application of uncertainty and sensitivity analysis as part of a systematic model-based process monitoring and control (PAT) system design framework for crystallization processes. For the uncertainty analysis, the Monte Carlo procedure is used to propagate input uncertainty, while for sensitivity analysis, global methods including the standardized regression coefficients (SRC) and Morris screening are used to identify the most significant parameters. The potassium dihydrogen phosphate (KDP) crystallization process is used as a case study, in both open-loop and closed-loop operation. In the uncertainty analysis, the impact on the predicted output of uncertain parameters related to the nucleation and crystal growth models has been investigated for both a one- and two-dimensional crystal size distribution (CSD). The open-loop results show that the input uncertainties lead to significant uncertainties in the CSD, with the appearance of a secondary peak due to secondary nucleation in both cases. The sensitivity analysis indicated that the most important parameters affecting the CSDs are the nucleation order and growth order constants. In the proposed PAT system design (closed-loop), the target CSD variability was successfully reduced compared to the open-loop case, also when considering uncertainty in nucleation and crystal growth model parameters. The latter is a strong indication of the robustness of the proposed PAT system design in achieving the target CSD and encourages its transfer to full-scale implementation. Copyright © 2013 Elsevier B.V. All rights reserved.
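
    As a minimal illustration of the Morris screening mentioned above, the sketch below computes elementary effects for a toy four-parameter response standing in for the crystallization model; the parameter names, ranges and response function are illustrative assumptions, not the paper's model.

        import numpy as np

        rng = np.random.default_rng(0)

        def model(x):
            # toy stand-in for a crystallization response (e.g. mean crystal size)
            b, g, kb, kg = x
            return kb * b**2 + kg * g + 0.5 * b * g

        names = ["nucleation order", "growth order", "k_b", "k_g"]
        lo = np.array([0.5, 0.5, 0.1, 0.1])      # assumed parameter ranges
        hi = np.array([3.0, 2.0, 1.0, 1.0])
        k, r, delta = 4, 50, 0.25                # factors, trajectories, grid step

        ee = np.zeros((r, k))
        for t in range(r):
            x = rng.integers(0, 4, size=k) / 4.0        # base point on a 4-level grid
            y0 = model(lo + (hi - lo) * x)
            for i in rng.permutation(k):                # move one factor at a time
                step = delta if x[i] + delta <= 1.0 else -delta
                x[i] += step
                y1 = model(lo + (hi - lo) * x)
                ee[t, i] = (y1 - y0) / step             # elementary effect of factor i
                y0 = y1

        mu_star = np.abs(ee).mean(axis=0)    # overall importance (mean |EE|)
        sigma = ee.std(axis=0)               # nonlinearity / interaction indicator
        for n, m, s in zip(names, mu_star, sigma):
            print(f"{n:17s} mu* = {m:6.3f}  sigma = {s:6.3f}")

    Factors with large mu* dominate the output; a sigma that is large relative to mu* flags nonlinearity or interactions.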

  5. Uncertainties in Safety Analysis. A literature review

    International Nuclear Information System (INIS)

    Ekberg, C.

    1995-05-01

    The purpose of the presented work has been to give a short summary of the origins of the many uncertainties arising in the design and performance assessment of a repository for spent nuclear fuel. Some different methods to treat these uncertainties are also included. The methods and conclusions are in many cases general, in the sense that they are applicable to many other disciplines where simulations are used. As a conclusion it may be noted that uncertainties of different origin have been discussed and debated, but one large group, e.g. computer simulations, for which methods for a more explicit investigation exist, has not been investigated in a satisfying way. 50 refs

  6. Uncertainties in Safety Analysis. A literature review

    Energy Technology Data Exchange (ETDEWEB)

    Ekberg, C [Chalmers Univ. of Technology, Goeteborg (Sweden). Dept. of Nuclear Chemistry

    1995-05-01

    The purpose of the presented work has been to give a short summary of the origins of the many uncertainties arising in the design and performance assessment of a repository for spent nuclear fuel. Some different methods to treat these uncertainties are also included. The methods and conclusions are in many cases general, in the sense that they are applicable to many other disciplines where simulations are used. As a conclusion it may be noted that uncertainties of different origin have been discussed and debated, but one large group, e.g. computer simulations, for which methods for a more explicit investigation exist, has not been investigated in a satisfying way. 50 refs.

  7. Uncertainty Analysis of Power Grid Investment Capacity Based on Monte Carlo

    Science.gov (United States)

    Qin, Junsong; Liu, Bingyi; Niu, Dongxiao

    By analyzing the factors that influence the investment capacity of a power grid, an investment capacity analysis model is built, taking depreciation cost, sales price and sales quantity, net profit, financing, and GDP of the secondary industry as model variables. After carrying out Kolmogorov-Smirnov tests, the probability distribution of each influencing factor is obtained. Finally, the uncertainty analysis results for grid investment capacity are obtained by Monte Carlo simulation.
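
    A hedged sketch of the workflow just described: fit a candidate distribution to each influencing factor, check it with a Kolmogorov-Smirnov test, then propagate by Monte Carlo. The data, distribution choices and the linear capacity model below are invented placeholders, not the paper's fitted model.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(42)

        # Hypothetical historical samples of two influencing factors
        # (e.g. sales quantity and net profit); real data would replace these.
        sales = rng.normal(100.0, 12.0, 30)
        profit = rng.lognormal(2.0, 0.4, 30)

        # Kolmogorov-Smirnov test of a candidate distribution for each factor
        mu, sd = stats.norm.fit(sales)
        print("sales ~ Normal:", stats.kstest(sales, "norm", args=(mu, sd)))
        shape, loc, scale = stats.lognorm.fit(profit, floc=0)
        print("profit ~ Lognormal:", stats.kstest(profit, "lognorm", args=(shape, loc, scale)))

        # Monte Carlo propagation through a (placeholder) investment-capacity model
        n = 100_000
        s = stats.norm.rvs(mu, sd, size=n, random_state=rng)
        p = stats.lognorm.rvs(shape, loc, scale, size=n, random_state=rng)
        capacity = 0.6 * s + 2.0 * p              # stand-in for the regression model
        print(f"capacity: mean={capacity.mean():.1f}, "
              f"5-95% = [{np.quantile(capacity, 0.05):.1f}, {np.quantile(capacity, 0.95):.1f}]")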

  8. Uncertainty analysis methods for quantification of source terms using a large computer code

    International Nuclear Information System (INIS)

    Han, Seok Jung

    1997-02-01

    Quantification of uncertainties in the source term estimations by a large computer code, such as MELCOR and MAAP, is an essential process of current probabilistic safety assessments (PSAs). The main objectives of the present study are (1) to investigate the applicability of a combined procedure of the response surface method (RSM), based on input determined from a statistical design, and the Latin hypercube sampling (LHS) technique for the uncertainty analysis of CsI release fractions under a hypothetical severe accident sequence of a station blackout at the Young-Gwang nuclear power plant, using the MAAP3.0B code as a benchmark problem; and (2) to propose a new measure of uncertainty importance based on distributional sensitivity analysis. On the basis of the results obtained in the present work, the RSM is recommended as a principal tool for the overall uncertainty analysis in source term quantification, while the LHS is used in the calculations of standardized regression coefficients (SRC) and standardized rank regression coefficients (SRRC) to determine the subset of the most important input parameters in the final screening step and to check the cumulative distribution functions (cdfs) obtained by the RSM. Verification of the response surface model for sufficient accuracy is a prerequisite for the reliability of the final results obtained by the combined procedure proposed in the present work. In the present study a new measure has been developed that utilizes the metric distance obtained from cumulative distribution functions (cdfs). The measure has been evaluated for three different cases of distributions in order to assess its characteristics: in the first two cases the distributions are known analytical distributions, while in the third the distribution is unknown. The first case is given by symmetric analytical distributions; the second consists of two asymmetric distributions whose skewness is non-zero
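
    Since the study leans on Latin hypercube sampling, here is a minimal self-contained LHS sketch: one stratified draw per equal-probability bin in each dimension, with independent random permutations across dimensions. The three mapped input distributions are arbitrary examples, not the MAAP inputs.

        import numpy as np
        from scipy import stats

        def latin_hypercube(n, d, rng):
            """One stratified sample per equal-probability bin in each dimension,
            with independent random permutations across dimensions."""
            u = (rng.random((n, d)) + np.arange(n)[:, None]) / n   # jitter within strata
            for j in range(d):
                u[:, j] = u[rng.permutation(n), j]
            return u

        rng = np.random.default_rng(7)
        u = latin_hypercube(100, 3, rng)

        # Map the uniform [0,1) strata to input distributions via inverse CDFs.
        x1 = stats.norm.ppf(u[:, 0], loc=1.0, scale=0.1)      # example parameter 1
        x2 = stats.uniform.ppf(u[:, 1], loc=0.5, scale=1.0)   # example parameter 2
        x3 = stats.triang.ppf(u[:, 2], c=0.5, loc=0.0, scale=2.0)

        # Each marginal is evenly stratified: every 1/100-probability bin holds
        # exactly one sample, unlike simple random sampling.
        print(np.histogram(u[:, 0], bins=10, range=(0, 1))[0])   # -> all tens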

  9. Verification and uncertainty evaluation of CASMO-3/MASTER nuclear analysis system

    Energy Technology Data Exchange (ETDEWEB)

    Song, Jae Seung; Cho, Byung Oh; Joo, Han Kyu; Zee, Sung Quun; Lee, Chung Chan; Park, Sang Yoon

    2000-06-01

    MASTER is a nuclear design code developed by KAERI. It uses group constants generated by CASMO-3, developed by Studsvik. In this report the verification and evaluation of uncertainty were performed for the code system's application to nuclear reactor core analysis and design. The verification is performed via various benchmark comparisons for static and transient core conditions, and via core follow calculations, with startup physics test predictions, for a total of 14 cycles of pressurized water reactors. The benchmark calculations include comparisons with reference solutions of IAEA and OECD/NEA problems and with critical experiment measurements. The uncertainty evaluation is focused on safety-related parameters such as power distribution, reactivity coefficients, control rod worth and core reactivity. It is concluded that CASMO-3/MASTER can be applied to PWR core nuclear analysis and design without any bias factors. Also, it is verified that the system can be applied to the SMART core, via supplemental comparisons with reference calculations by MCNP, which is a probabilistic nuclear calculation code.

  10. Characterization of sealed radioactive sources. Uncertainty analysis to improve detection methods

    International Nuclear Information System (INIS)

    Cummings, D.G.; Sommers, J.D.; Adamic, M.L.; Jimenez, M.; Giglio, J.J.; Carney, K.P.

    2009-01-01

    A radioactive 137Cs source has been analyzed for the radioactive parent 137Cs and its stable decay daughter 137Ba. The ratio of daughter to parent atoms is used to estimate the date when the Cs was purified prior to source encapsulation (an 'age' since purification). The isotopes were analyzed by inductively coupled plasma mass spectrometry (ICP-MS) after chemical separation. In addition, Ba was analyzed by isotope dilution ICP-MS (ID-ICP-MS). A detailed error analysis of the mass spectrometric work has been undertaken to identify areas of improvement, as well as to quantify the effect the errors have on the 'age' determined. This paper reports an uncertainty analysis aimed at identifying areas of improvement and alternative techniques that may reduce the uncertainties. In particular, work on isotope dilution using ICP-MS for the 'age' determination of sealed sources is presented. The results are compared to the original work done using external standards to calibrate the ICP-MS instrument. (author)
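
    The 'age' equation underlying this approach follows from the decay law: if the source is Ba-free at purification, the atom ratio obeys N_d/N_p = exp(lambda*t) - 1, so t = ln(1 + R)/lambda. A short sketch with first-order error propagation; the ratio, its uncertainty and the half-life value are illustrative numbers, not the paper's measurements.

        import numpy as np

        T_HALF = 30.08                      # 137Cs half-life in years (nominal value)
        LAMBDA = np.log(2.0) / T_HALF

        def age_years(ratio_ba_cs):
            """Age since Cs purification from the measured atom ratio
            N(137Ba)/N(137Cs), assuming the source was Ba-free at t=0:
            N_d/N_p = exp(lambda*t) - 1  =>  t = ln(1 + R)/lambda."""
            return np.log1p(ratio_ba_cs) / LAMBDA

        # First-order uncertainty propagation: u_t = u_R / (lambda * (1 + R))
        R, u_R = 0.60, 0.01                 # hypothetical ICP-MS ratio and 1-sigma
        t = age_years(R)
        u_t = u_R / (LAMBDA * (1.0 + R))
        print(f"age = {t:.1f} +/- {u_t:.1f} y")

    The propagation term shows directly why reducing the ratio uncertainty (e.g. via isotope dilution) tightens the age estimate.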

  11. Uncertainty Analysis on the Health Risk Caused by a Radiological Terror

    International Nuclear Information System (INIS)

    Jeong, Hyojoon; Hwang, Wontae; Kim, Eunhan; Han, Moonhee

    2010-01-01

    Atmospheric dispersion modeling and uncertainty analysis were carried out to support decision making in case of radiological emergency events. The risk caused by inhalation is highly dependent on the air concentrations of radionuclides; therefore, air monitoring and dispersion modeling have to be performed carefully to reduce the uncertainty of the health risk assessment for the public. When an intentional release of radioactive materials occurs in an urban area, the air quality with respect to radioactive materials in the environment is of great importance for taking countermeasures and performing environmental risk assessments. Atmospheric modeling is part of the decision-making tasks, and it is particularly important for emergency managers as they often need to act quickly on very inadequate information. In this study, we assume a 137Cs explosion of 50 TBq using an RDD in the metropolitan area of Seoul, South Korea. The California Puff Model is used to calculate the atmospheric dispersion and transport of 137Cs, and the environmental risk analysis is performed using the Monte Carlo method

  12. Verification and uncertainty evaluation of CASMO-3/MASTER nuclear analysis system

    International Nuclear Information System (INIS)

    Song, Jae Seung; Cho, Byung Oh; Joo, Han Kyu; Zee, Sung Quun; Lee, Chung Chan; Park, Sang Yoon

    2000-06-01

    MASTER is a nuclear design code developed by KAERI. It uses group constants generated by CASMO-3, developed by Studsvik. In this report the verification and evaluation of uncertainty were performed for the code system's application to nuclear reactor core analysis and design. The verification is performed via various benchmark comparisons for static and transient core conditions, and via core follow calculations, with startup physics test predictions, for a total of 14 cycles of pressurized water reactors. The benchmark calculations include comparisons with reference solutions of IAEA and OECD/NEA problems and with critical experiment measurements. The uncertainty evaluation is focused on safety-related parameters such as power distribution, reactivity coefficients, control rod worth and core reactivity. It is concluded that CASMO-3/MASTER can be applied to PWR core nuclear analysis and design without any bias factors. Also, it is verified that the system can be applied to the SMART core, via supplemental comparisons with reference calculations by MCNP, which is a probabilistic nuclear calculation code

  13. Application of intelligence based uncertainty analysis for HLW disposal

    International Nuclear Information System (INIS)

    Kato, Kazuyuki

    2003-01-01

    Safety assessment for geological disposal of high level radioactive waste inevitably involves factors that cannot be specified in a deterministic manner. These are, namely: (1) 'variability' that arises from the stochastic nature of the processes and features considered, e.g., the distribution of canister corrosion times and the spatial heterogeneity of a host geological formation; (2) 'ignorance' due to incomplete or imprecise knowledge of the processes and conditions expected in the future, e.g., uncertainty in the estimation of solubilities and sorption coefficients for important nuclides. In many cases, a decision in assessment, e.g., selection among model options or determination of a parameter value, is subject to both variability and ignorance in a combined form. It is clearly important to evaluate the influences of both variability and ignorance on the result of a safety assessment in a consistent manner. We developed a unified methodology to handle variability and ignorance by using probabilistic and possibilistic techniques, respectively. The methodology has been applied to safety assessment of geological disposal of high level radioactive waste. Uncertainties associated with scenarios, models and parameters were defined in terms of fuzzy membership functions derived through a series of interviews with the experts, while variability was formulated by means of probability density functions (pdfs) based on the available data set. The exercise demonstrated the applicability of the new methodology and, in particular, its advantage in quantifying uncertainties based on experts' opinions and in providing information on the dependence of the assessment result on the level of conservatism. In addition, it was also shown that sensitivity analysis could identify key parameters in reducing uncertainties associated with the overall assessment. The above information can be used to support the judgment process and guide the process of disposal system development in optimization of protection against

  14. Monte Carlo uncertainty analysis of dose estimates in radiochromic film dosimetry with single-channel and multichannel algorithms.

    Science.gov (United States)

    Vera-Sánchez, Juan Antonio; Ruiz-Morales, Carmen; González-López, Antonio

    2018-03-01

    To provide a multi-stage model to calculate uncertainty in radiochromic film dosimetry with Monte-Carlo techniques. This new approach is applied to single-channel and multichannel algorithms. Two lots of Gafchromic EBT3 are exposed in two different Varian linacs. They are read with an EPSON V800 flatbed scanner. The Monte-Carlo techniques in uncertainty analysis provide a numerical representation of the probability density functions of the output magnitudes. From this numerical representation, traditional parameters of uncertainty analysis such as the standard deviations and bias are calculated. Moreover, these numerical representations are used to investigate the shape of the probability density functions of the output magnitudes. Also, another calibration film is read in four EPSON scanners (two V800 and two 10000XL) and the uncertainty analysis is carried out with the four images. The dose estimates of single-channel and multichannel algorithms show a Gaussian behavior and low bias. The multichannel algorithms lead to less uncertainty in the final dose estimates when the EPSON V800 is employed as the reading device. In the case of the EPSON 10000XL, the single-channel algorithms provide less uncertainty in the dose estimates for doses higher than four Gy. A multi-stage model has been presented. With the aid of this model and the use of Monte-Carlo techniques, the uncertainty of dose estimates for single-channel and multichannel algorithms is estimated. The application of the model together with Monte-Carlo techniques leads to a complete characterization of the uncertainties in radiochromic film dosimetry. Copyright © 2018 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  15. Use of Quantitative Uncertainty Analysis to Support M&V Decisions in ESPCs

    Energy Technology Data Exchange (ETDEWEB)

    Mathew, Paul A.; Koehling, Erick; Kumar, Satish

    2005-05-11

    Measurement and Verification (M&V) is a critical element of an Energy Savings Performance Contract (ESPC) - without M&V, there is no way to confirm that the projected savings in an ESPC are in fact being realized. For any given energy conservation measure in an ESPC, there are usually several M&V choices, which will vary in terms of measurement uncertainty, cost, and technical feasibility. Typically, M&V decisions are made almost solely based on engineering judgment and experience, with little, if any, quantitative uncertainty analysis (QUA). This paper describes the results of a pilot project initiated by the Department of Energy's Federal Energy Management Program to explore the use of Monte-Carlo simulation to assess savings uncertainty and thereby augment the M&V decision-making process in ESPCs. The intent was to use QUA selectively in combination with heuristic knowledge, in order to obtain quantitative estimates of the savings uncertainty without the burden of a comprehensive "bottom-up" QUA. This approach was used to analyze the savings uncertainty in an ESPC for a large federal agency. The QUA was seamlessly integrated into the ESPC development process and the incremental effort was relatively small with user-friendly tools that are commercially available. As the case study illustrates, in some cases the QUA simply confirms intuitive or qualitative information, while in other cases it provides insight that suggests revisiting the M&V plan. The case study also showed that M&V decisions should be informed by portfolio risk diversification. By providing quantitative uncertainty information, QUA can effectively augment the M&V decision-making process as well as the overall ESPC financial analysis.

  16. Uncertainty and Sensitivity Analysis Applied to the Validation of BWR Bundle Thermal-Hydraulic Calculations

    International Nuclear Information System (INIS)

    Hernandez-Solis, Augusto

    2010-04-01

    This work has two main objectives. The first one is to enhance the validation process of the thermal-hydraulic features of the Westinghouse code POLCA-T. This is achieved by computing a quantitative validation limit based on statistical uncertainty analysis. This validation theory is applied to some of the benchmark cases of the following macroscopic BFBT exercises: 1) single and two phase bundle pressure drops, 2) steady-state cross-sectional averaged void fraction, 3) transient cross-sectional averaged void fraction and 4) steady-state critical power tests. Sensitivity analysis is also performed to identify the most important uncertain parameters for each exercise. The second objective consists of showing the clear advantages of using the quasi-random Latin Hypercube Sampling (LHS) strategy over simple random sampling (SRS). LHS allows a much better coverage of the input uncertainties than SRS because it densely stratifies across the range of each input probability distribution. The aim here is to compare both uncertainty analyses on the BWR assembly void axial profile prediction in steady-state, and on the transient void fraction prediction at a certain axial level coming from a simulated re-circulation pump trip scenario. It is shown that the replicated void fraction mean (either in steady-state or transient conditions) has less variability when using LHS than SRS for the same number of calculations (i.e. same input space sample size) even if the resulting void fraction axial profiles are non-monotonic. It is also shown that the void fraction uncertainty limits achieved with SRS by running 458 calculations (sample size required to cover 95% of 8 uncertain input parameters with a 95% confidence), result in the same uncertainty limits achieved by LHS with only 100 calculations. These are thus clear indications of the advantages of using LHS. Finally, the present study contributes to a realistic analysis of nuclear reactors, in the sense that the uncertainties of
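
    A small numerical check of the SRS-versus-LHS argument above, assuming a smooth toy response over eight standardized inputs; the response function, dimensions and sample sizes are illustrative, and scipy's qmc module supplies the LHS design.

        import numpy as np
        from scipy.stats import qmc, norm

        def model(x):
            # smooth toy response standing in for a void-fraction prediction
            return np.sum(np.sin(x) + 0.3 * x**2, axis=1)

        rng = np.random.default_rng(3)
        d, n, reps = 8, 100, 200
        means_srs, means_lhs = [], []
        for _ in range(reps):
            u_srs = rng.random((n, d))                           # simple random sampling
            u_lhs = qmc.LatinHypercube(d=d, seed=rng).random(n)  # stratified LHS design
            for u, out in ((u_srs, means_srs), (u_lhs, means_lhs)):
                x = norm.ppf(u, loc=0.0, scale=1.0)              # map to input distributions
                out.append(model(x).mean())

        # The replicated output mean varies less under LHS than under SRS
        # for the same sample size, as the thesis reports.
        print(f"std of replicated mean, SRS: {np.std(means_srs):.4f}")
        print(f"std of replicated mean, LHS: {np.std(means_lhs):.4f}")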

  17. Ignoring correlation in uncertainty and sensitivity analysis in life cycle assessment: what is the risk?

    Energy Technology Data Exchange (ETDEWEB)

    Groen, E.A., E-mail: Evelyne.Groen@gmail.com [Wageningen University, P.O. Box 338, Wageningen 6700 AH (Netherlands); Heijungs, R. [Vrije Universiteit Amsterdam, De Boelelaan 1105, Amsterdam 1081 HV (Netherlands); Leiden University, Einsteinweg 2, Leiden 2333 CC (Netherlands)

    2017-01-15

    Life cycle assessment (LCA) is an established tool to quantify the environmental impact of a product. A good assessment of uncertainty is important for making well-informed decisions in comparative LCA, as well as for correctly prioritising data collection efforts. Under- or overestimation of output uncertainty (e.g. output variance) will lead to incorrect decisions in such matters. The presence of correlations between input parameters during uncertainty propagation can increase or decrease the output variance. However, most LCA studies that include uncertainty analysis ignore correlations between input parameters during uncertainty propagation, which may lead to incorrect conclusions. Two approaches to include correlations between input parameters during uncertainty propagation and global sensitivity analysis were studied: an analytical approach and a sampling approach. The use of both approaches is illustrated for an artificial case study of electricity production. Results demonstrate that both approaches yield approximately the same output variance and sensitivity indices for this specific case study. Furthermore, we demonstrate that the analytical approach can be used to quantify the risk of ignoring correlations between input parameters during uncertainty propagation in LCA. We demonstrate that: (1) we can predict whether including correlations among input parameters in uncertainty propagation will increase or decrease the output variance; (2) we can quantify the risk of ignoring correlations on the output variance and the global sensitivity indices. Moreover, this procedure requires only little data. - Highlights: • Ignoring correlation leads to under- or overestimation of the output variance. • We demonstrated that the risk of ignoring correlation can be quantified. • The procedure proposed is generally applicable in life cycle assessment. • In some cases, ignoring correlation has a minimal effect on decision-making tools.
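
    The two approaches can be mimicked in a few lines: an analytical first-order output variance g^T Cov g, and a sampling run that draws correlated inputs via a Cholesky factor. The linear two-input 'impact model' and all numbers below are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(11)

        # Toy LCA-style model: the impact is a linear combination of two inputs
        # (e.g. fuel use and an emission factor); coefficients are the gradient g.
        g = np.array([2.0, 3.0])
        sigma = np.array([0.10, 0.20])
        for rho in (0.0, 0.8, -0.8):
            cov = np.array([[sigma[0]**2, rho * sigma[0] * sigma[1]],
                            [rho * sigma[0] * sigma[1], sigma[1]**2]])

            # Analytical (first-order) output variance: Var(Y) = g^T Cov g
            var_analytic = g @ cov @ g

            # Sampling approach: draw correlated inputs via the Cholesky factor
            L = np.linalg.cholesky(cov)
            x = (L @ rng.standard_normal((2, 100_000))).T
            var_mc = (x @ g).var()
            print(f"rho={rho:+.1f}  analytic={var_analytic:.4f}  sampled={var_mc:.4f}")

    With both coefficients positive, a positive correlation inflates the output variance and a negative one deflates it, which is exactly the under-/overestimation risk the study quantifies.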

  18. DAKOTA: a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Eldred, Michael Scott; Vigil, Dena M.; Dalbey, Keith R.; Bohnhoff, William J.; Adams, Brian M.; Swiler, Laura Painton; Lefantzi, Sophia (Sandia National Laboratories, Livermore, CA); Hough, Patricia Diane (Sandia National Laboratories, Livermore, CA); Eddy, John P.

    2011-12-01

    The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a theoretical manual for selected algorithms implemented within the DAKOTA software. It is not intended as a comprehensive theoretical treatment, since a number of existing texts cover general optimization theory, statistical analysis, and other introductory topics. Rather, this manual is intended to summarize a set of DAKOTA-related research publications in the areas of surrogate-based optimization, uncertainty quantification, and optimization under uncertainty that provide the foundation for many of DAKOTA's iterative analysis capabilities.

  19. Development and application of objective uncertainty measures for nuclear power plant transient analysis [Dissertation 3897]

    Energy Technology Data Exchange (ETDEWEB)

    Vinai, P

    2007-10-15

    For the development, design and licensing of a nuclear power plant (NPP), a sound safety analysis is necessary to study the diverse physical phenomena involved in the system behaviour under operational and transient conditions. Such studies are based on detailed computer simulations. With the progress achieved in computer technology and the greater availability of experimental and plant data, the use of best estimate codes for safety evaluations has gained increasing acceptance. The application of best estimate safety analysis has raised new problems that need to be addressed: it has become more crucial to assess how reliable code predictions are, especially when they need to be compared against safety limits that must not be crossed. It becomes necessary to identify and quantify the various possible sources of uncertainty that affect the reliability of the results. Currently, such uncertainty evaluations are generally based on expert opinion. In the present research, a novel methodology based on a non-parametric statistical approach has been developed for the objective quantification of best-estimate code uncertainties related to the physical models used in the code. The basis is an evaluation of the accuracy of a given physical model, achieved by comparing its predictions with experimental data from an appropriate set of separate-effect tests. The differences between measurements and predictions can be considered stochastically distributed, and thus a statistical approach can be employed. The first step was the development of a procedure for investigating the dependence of a given physical model's accuracy on the experimental conditions. Each separate-effect test effectively provides a random sample of discrepancies between measurements and predictions, corresponding to a location in the state space defined by a certain number of independent system variables. As a consequence, the samples of 'errors', achieved from analysis of the entire

  20. [Influence of Uncertainty and Uncertainty Appraisal on Self-management in Hemodialysis Patients].

    Science.gov (United States)

    Jang, Hyung Suk; Lee, Chang Suk; Yang, Young Hee

    2015-04-01

    This study was done to examine the relations among uncertainty, uncertainty appraisal, and self-management in patients undergoing hemodialysis, and to identify factors influencing self-management. A convenience sample of 92 patients receiving hemodialysis was selected. Data were collected using a structured questionnaire and medical records. The collected data were analyzed using descriptive statistics, t-test, ANOVA, Pearson correlations and multiple regression analysis with the SPSS/WIN 20.0 program. The participants showed a moderate level of uncertainty, with the highest score being for ambiguity among the four uncertainty subdomains. Scores for uncertainty danger or opportunity appraisals were below the midpoint. The participants were found to perform a high level of self-management, such as diet control, management of the arteriovenous fistula, exercise, medication, physical management, measurements of body weight and blood pressure, and social activity. The self-management of participants undergoing hemodialysis showed a significant relationship with uncertainty and uncertainty appraisal. The significant factors influencing self-management were uncertainty, uncertainty opportunity appraisal, hemodialysis duration, and having a spouse. These variables explained 32.8% of the variance in self-management. The results suggest that intervention programs to reduce the level of uncertainty and to increase the level of uncertainty opportunity appraisal among patients would improve the self-management of hemodialysis patients.

  1. Quantification of Uncertainty in the Flood Frequency Analysis

    Science.gov (United States)

    Kasiapillai Sudalaimuthu, K.; He, J.; Swami, D.

    2017-12-01

    Flood frequency analysis (FFA) is usually carried out for the planning and design of water resources and hydraulic structures. Owing to the variability in sample representation, the selection of the distribution and the estimation of distribution parameters, the estimation of flood quantiles has always been uncertain. Hence, suitable approaches must be developed to quantify the uncertainty in the form of a prediction interval as an alternative to the deterministic approach. The framework developed in the present study to include uncertainty in FFA uses a multi-objective optimization approach to construct the prediction interval from an ensemble of flood quantiles. Through this approach, an optimal variability of the distribution parameters is identified to carry out FFA. To demonstrate the proposed approach, annual maximum flow data from two gauge stations (Bow River at Calgary and Banff, Canada) are used. A major focus of the present study was to evaluate the changes in the magnitude of flood quantiles due to the extreme flood event that occurred in 2013. In addition, the efficacy of the proposed method was further verified using standard bootstrap-based sampling approaches, and it was found that the proposed method is reliable in modeling extreme floods as compared to the bootstrap methods.
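
    To make the resampling idea concrete, the hedged sketch below fits a Gumbel distribution to a synthetic annual-maximum series and builds a bootstrap confidence interval for the 100-year quantile; the data are simulated stand-ins, not the Bow River records, and the Gumbel choice is only one common FFA option.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2013)

        # Hypothetical annual-maximum flow series (m^3/s); the study itself used
        # Bow River records at Calgary and Banff.
        ams = stats.gumbel_r.rvs(loc=200.0, scale=60.0, size=50, random_state=rng)

        def q100(sample):
            loc, scale = stats.gumbel_r.fit(sample)
            return stats.gumbel_r.ppf(1.0 - 1.0 / 100.0, loc, scale)  # 100-yr quantile

        # Nonparametric bootstrap: resample the series, refit, collect quantiles.
        boot = np.array([q100(rng.choice(ams, size=ams.size, replace=True))
                         for _ in range(2000)])
        lo_ci, hi_ci = np.percentile(boot, [2.5, 97.5])
        print(f"Q100 = {q100(ams):.0f} m3/s, 95% bootstrap CI [{lo_ci:.0f}, {hi_ci:.0f}]")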

  2. Data Quality Assessment of the Uncertainty Analysis Applied to the Greenhouse Gas Emissions of a Dairy Cow System

    Directory of Open Access Journals (Sweden)

    Chun-Youl Baek

    2017-09-01

    The results of an uncertainty analysis depend on the statistical information (standard error, type of probability distribution, and minimum-maximum range) of the selected input parameters. However, there are limitations in identifying sufficient data samples for the selected input parameters to derive such statistical information in the field of life cycle assessment (LCA). Therefore, there is a strong need for a consistent screening procedure to identify the input parameters for use in uncertainty analysis in the area of LCA. The conventional procedure for identifying input parameters for the uncertainty analysis includes assessing the data quality using the pedigree method and the contribution analysis of the LCA results. This paper proposes a simplified procedure for ameliorating the existing data quality assessment method, which can lead to an efficient uncertainty analysis of LCA results. The proposed method has two salient features: (i) a simplified procedure based on contribution analysis followed by a data quality assessment for selecting the input parameters for the uncertainty analysis; and (ii) a quantitative data quality assessment method, based on the pedigree method, that adopts the analytic hierarchy process (AHP) method and quality function deployment (QFD). The effects of the uncertainty of the selected input parameters on the LCA results were assessed using the Monte Carlo simulation method. A case study of greenhouse gas (GHG) emissions from a dairy cow system was used to demonstrate the applicability of the proposed procedure.

  3. Estimation of a quantity of interest in uncertainty analysis: Some help from Bayesian decision theory

    International Nuclear Information System (INIS)

    Pasanisi, Alberto; Keller, Merlin; Parent, Eric

    2012-01-01

    In the context of risk analysis under uncertainty, we focus here on the problem of estimating a so-called quantity of interest of an uncertainty analysis problem, i.e. a given feature of the probability distribution function (pdf) of the output of a deterministic model with uncertain inputs. We will stay here in a fully probabilistic setting. A common problem is how to account for epistemic uncertainty tainting the parameters of the probability distribution of the inputs. In standard practice, this uncertainty is often neglected (plug-in approach). When a specific uncertainty assessment is made, on the basis of the available information (expertise and/or data), a common solution consists in marginalizing the joint distribution of both observable inputs and parameters of the probabilistic model (i.e. computing the predictive pdf of the inputs), then propagating it through the deterministic model. We will reinterpret this approach in the light of Bayesian decision theory, and will show that this practice leads the analyst to implicitly adopt a specific loss function which may be inappropriate for the problem under investigation, and suboptimal from a decisional perspective. These concepts are illustrated on a simple numerical example, concerning a case of flood risk assessment.
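
    The plug-in versus predictive contrast can be demonstrated in a few lines; the Gaussian input, the crude posterior approximation for the mean (normal with standard error s/sqrt(n), sigma held fixed) and the exponential toy model are all simplifying assumptions for illustration, not the paper's setup.

        import numpy as np

        rng = np.random.default_rng(5)

        # Small data set on an uncertain input (e.g. a flood-relevant variable)
        data = rng.normal(10.0, 2.0, 15)
        xbar, s, n = data.mean(), data.std(ddof=1), data.size

        def model(x):
            return np.exp(0.2 * x)          # deterministic model with uncertain input

        # Plug-in approach: ignore the epistemic uncertainty on the parameters.
        x_plug = rng.normal(xbar, s, 50_000)

        # Predictive approach: marginalize over parameter uncertainty by first
        # drawing the mean from an approximate posterior mu ~ N(xbar, s/sqrt(n)),
        # with sigma held fixed for simplicity.
        mu_post = rng.normal(xbar, s / np.sqrt(n), 50_000)
        x_pred = rng.normal(mu_post, s)

        for name, x in (("plug-in", x_plug), ("predictive", x_pred)):
            y = model(x)
            print(f"{name:10s} 99% quantile of output: {np.quantile(y, 0.99):.2f}")

    The predictive quantile comes out wider, illustrating how neglecting parameter uncertainty can understate the tail of the quantity of interest.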

  4. Uncertainty Analysis of RBMK-Related Experimental Data

    International Nuclear Information System (INIS)

    Urbonas, Rolandas; Kaliatka, Algirdas; Liaukonis, Mindaugas

    2002-01-01

    An attempt to validate the state-of-the-art thermal hydraulic code ATHLET (GRS, Germany) on the basis of the E-108 test facility was made. Originally this code was developed and validated for reactor types other than the RBMK. Since state-of-the-art thermal hydraulic codes are widely used for the simulation of RBMK reactors, further code implementation and validation are required. The phenomena associated with channel-type flow instabilities and CHF were found to be an important step in the frame of the overall effort of code validation and application to RBMK reactors. In the paper a one-channel analysis is presented; thus, the oscillatory behaviour of the system was not detected. The results show a dependence on the nodalization used in the heated channels, on the initial and boundary conditions, and on the selected code models. It is shown that the code is able to predict a sudden heat structure temperature excursion when the critical heat flux is approached. The uncertainty and sensitivity methodology developed by GRS was employed in the analysis. (authors)

  5. An analysis of sensitivity and uncertainty associated with the use of the HSPF model for EIA applications

    Energy Technology Data Exchange (ETDEWEB)

    Biftu, G.F.; Beersing, A.; Wu, S.; Ade, F. [Golder Associates, Calgary, AB (Canada)

    2005-07-01

    An outline of a new approach to assessing the sensitivity and uncertainty associated with surface water modelling results using the Hydrological Simulation Program-Fortran (HSPF) was presented, as well as the results of a sensitivity and uncertainty analysis. The HSPF model is often used to characterize the hydrological processes in watersheds within the oil sands region. Typical applications of HSPF include calibration of the model parameters using data from gauged watersheds, as well as validation of calibrated models with data sets. Additionally, simulations are often conducted to make flow predictions to support the environmental impact assessment (EIA) process. However, a key aspect of the modelling components of the EIA process is the sensitivity and uncertainty of the modelling results with respect to the model parameters. Many of the variations in the HSPF model's outputs are caused by a small number of model parameters. A sensitivity analysis was performed to identify and focus on the key parameters and assumptions that have the most influence on the model's outputs. The analysis entailed varying each parameter in turn, within a range, and examining the resulting relative changes in the model outputs. The uncertainty analysis consisted of the selection of probability distributions to characterize the uncertainty in the model's key sensitive parameters, as well as the use of Monte Carlo and HSPF simulation to determine the uncertainty in model outputs. tabs, figs.

  6. INTEGRATION OF SYSTEM COMPONENTS AND UNCERTAINTY ANALYSIS - HANFORD EXAMPLES

    International Nuclear Information System (INIS)

    Wood, M.I.

    2009-01-01

    • Deterministic 'One Off' analyses as basis for evaluating sensitivity and uncertainty relative to reference case
    • Spatial coverage identical to reference case
    • Two types of analysis assumptions - Minimax parameter values around reference case conditions - 'What If' cases that change reference case condition and associated parameter values
    • No conclusions about likelihood of estimated result other than a qualitative expectation that actual outcome should tend toward reference case estimate

  7. Uncertainty and simulation

    International Nuclear Information System (INIS)

    Depres, B.; Dossantos-Uzarralde, P.

    2009-01-01

    More than 150 researchers and engineers from universities and the industrial world met to discuss the new methodologies developed around assessing uncertainty. About 20 papers were presented, and the main topics were: methods to study the propagation of uncertainties, sensitivity analysis, nuclear data covariances and multi-parameter optimisation. This report gathers the contributions of CEA researchers and engineers

  8. Uncertainty and sensitivity analysis of control strategies using the benchmark simulation model No1 (BSM1).

    Science.gov (United States)

    Flores-Alsina, Xavier; Rodriguez-Roda, Ignasi; Sin, Gürkan; Gernaey, Krist V

    2009-01-01

    The objective of this paper is to perform an uncertainty and sensitivity analysis of the predictions of the Benchmark Simulation Model (BSM) No. 1, when comparing four activated sludge control strategies. The Monte Carlo simulation technique is used to evaluate the uncertainty in the BSM1 predictions, considering the ASM1 bio-kinetic parameters and influent fractions as input uncertainties, while the Effluent Quality Index (EQI) and the Operating Cost Index (OCI) are focused on as model outputs. The resulting Monte Carlo simulations are presented using descriptive statistics indicating the degree of uncertainty in the predicted EQI and OCI. Next, the Standardized Regression Coefficients (SRC) method is used for sensitivity analysis to identify which input parameters influence the uncertainty in the EQI predictions the most. The results show that control strategies including an ammonium (S_NH) controller reduce the uncertainty in both overall pollution removal and effluent total Kjeldahl nitrogen. Also, control strategies with an external carbon source reduce the effluent nitrate (S_NO) uncertainty, increasing both their economic cost and its variability as a trade-off. Finally, the maximum specific autotrophic growth rate (μ_A) causes most of the variance in the effluent for all the evaluated control strategies. The influence of denitrification-related parameters, e.g. η_g (anoxic growth rate correction factor) and η_h (anoxic hydrolysis rate correction factor), becomes less important when a S_NO controller manipulating an external carbon source addition is implemented.
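
    A compact illustration of the Monte Carlo + SRC workflow described here; the three 'bio-kinetic' inputs and the linear effluent response are synthetic stand-ins for the ASM1 quantities, chosen so the expected SRC ranking is easy to verify by eye.

        import numpy as np

        rng = np.random.default_rng(8)

        # Monte Carlo sample of uncertain inputs (stand-ins for ASM1 bio-kinetic
        # parameters, e.g. mu_A, eta_g, eta_h) and a toy effluent-quality response.
        n = 2000
        X = np.column_stack([rng.normal(0.8, 0.08, n),    # mu_A
                             rng.normal(0.8, 0.10, n),    # eta_g
                             rng.normal(0.4, 0.05, n)])   # eta_h
        Y = 50.0 - 30.0 * X[:, 0] + 5.0 * X[:, 1] + rng.normal(0.0, 0.5, n)

        # Standardized Regression Coefficients: regress standardized Y on
        # standardized X; each coefficient measures that input's share of the
        # output standard deviation (valid for near-linear models).
        Xs = (X - X.mean(axis=0)) / X.std(axis=0)
        Ys = (Y - Y.mean()) / Y.std()
        src, *_ = np.linalg.lstsq(Xs, Ys, rcond=None)
        for name, b in zip(("mu_A", "eta_g", "eta_h"), src):
            print(f"SRC({name}) = {b:+.3f}")
        print("R^2 =", 1 - ((Ys - Xs @ src)**2).sum() / (Ys**2).sum())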

  9. Reduction and Uncertainty Analysis of Chemical Mechanisms Based on Local and Global Sensitivities

    Science.gov (United States)

    Esposito, Gaetano

    Numerical simulations of critical reacting flow phenomena in hypersonic propulsion devices require accurate representation of finite-rate chemical kinetics. The chemical kinetic models available for hydrocarbon fuel combustion are rather large, involving hundreds of species and thousands of reactions. As a consequence, they cannot be used in multi-dimensional computational fluid dynamic calculations in the foreseeable future due to the prohibitive computational cost. In addition to the computational difficulties, it is also known that some fundamental chemical kinetic parameters of detailed models have a significant level of uncertainty, due to the limited experimental data available and to poor understanding of interactions among kinetic parameters. In the present investigation, local and global sensitivity analysis techniques are employed to develop a systematic approach to reducing and analyzing detailed chemical kinetic models. Unlike previous studies in which skeletal model reduction was based on the separate analysis of simple cases, in this work a novel strategy based on Principal Component Analysis of local sensitivity values is presented. This new approach is capable of simultaneously taking into account all the relevant canonical combustion configurations over different composition, temperature and pressure conditions. Moreover, the procedure developed in this work represents the first documented inclusion of non-premixed extinction phenomena, which is of great relevance in hypersonic combustors, in an automated reduction algorithm. The application of the skeletal reduction to a detailed kinetic model consisting of 111 species in 784 reactions is demonstrated. The resulting reduced skeletal model of 37-38 species showed that the global ignition/propagation/extinction phenomena of ethylene-air mixtures can be predicted within an accuracy of 2% of the full detailed model. The problems of both understanding non-linear interactions between kinetic parameters and
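
    A schematic of the PCA-based reduction idea: assemble a matrix of local sensitivities across many combustion cases, take its principal components, and rank reactions by their loadings. The random sensitivity matrix below is a placeholder for values a kinetics solver would compute; the sizes and threshold are illustrative.

        import numpy as np

        rng = np.random.default_rng(4)

        # Rows: combustion cases (mixtures, T, p, configurations);
        # columns: normalized local sensitivities of a target quantity
        # (e.g. ignition delay) to each reaction rate.
        S = rng.standard_normal((40, 12))
        S[:, [2, 7]] *= 8.0                 # make two reactions dominant

        # Principal Component Analysis via SVD of the centered sensitivity matrix.
        U, sing, Vt = np.linalg.svd(S - S.mean(axis=0), full_matrices=False)
        var_explained = sing**2 / np.sum(sing**2)

        # Reactions with large loadings on the leading components are kept in the
        # skeletal mechanism; the rest are candidates for removal.
        n_pc = np.searchsorted(np.cumsum(var_explained), 0.95) + 1
        importance = np.sqrt(((Vt[:n_pc].T * sing[:n_pc])**2).sum(axis=1))
        keep = np.argsort(importance)[::-1]
        print(f"{n_pc} PCs explain 95% of variance; top reactions: {keep[:4]}")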

  10. 3.8 Proposed approach to uncertainty quantification and sensitivity analysis in the next PA

    Energy Technology Data Exchange (ETDEWEB)

    Flach, Greg [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Wohlwend, Jen [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-10-02

    This memorandum builds upon Section 3.8 of SRNL (2016) and Flach (2017) by defining key error analysis, uncertainty quantification, and sensitivity analysis concepts and terms, in preparation for the next E-Area Performance Assessment (WSRC 2008) revision.

  11. Statistical analysis of the uncertainty related to flood hazard appraisal

    Science.gov (United States)

    Notaro, Vincenza; Freni, Gabriele

    2015-12-01

    The estimation of flood hazard frequency statistics for an urban catchment is of great interest in practice. It provides the evaluation of potential flood risk and related damage and supports decision making for flood risk management. Flood risk is usually defined as a function of the probability that a system deficiency causes flooding (hazard) and of the expected damage due to the flooding magnitude (damage), taking into account both the exposure and the vulnerability of the goods at risk. The expected flood damage can be evaluated by an a priori estimation of the potential damage caused by flooding or by interpolating real damage data. With regard to flood hazard appraisal, several procedures propose to identify a hazard indicator (HI), such as flood depth or the combination of flood depth and velocity, and to assess the flood hazard of the analyzed area by comparing the HI variables with user-defined threshold values or curves (penalty curves or matrixes). However, flooding data are usually unavailable or too piecemeal to allow a reliable flood hazard analysis; therefore, hazard analysis is often performed by means of mathematical simulations aimed at evaluating water levels and flow velocities over the catchment surface. As a result, a great part of the uncertainties intrinsic to flood risk appraisal can be related to the hazard evaluation, owing to the uncertainty inherent in modelling results and to the subjectivity of the user-defined hazard thresholds applied to link flood depth to a hazard level. In the present work, a statistical methodology is proposed for evaluating and reducing the uncertainties connected with hazard level estimation. The methodology has been applied to a real urban watershed as a case study.

  12. Sensitivity/uncertainty analysis for free-in-air tissue kerma due to initial radiation at Hiroshima and Nagasaki

    International Nuclear Information System (INIS)

    Lillie, R.A.; Broadhead, B.L.; Pace, J.V. III

    1988-01-01

    Uncertainty estimates and cross correlations by range/survivor have been calculated for the Hiroshima and Nagasaki free-in-air (FIA) tissue kerma obtained from two-dimensional air/ground transport calculations. The uncertainties due to modeling parameter and basic nuclear transport data uncertainties were calculated for 700-, 1000-, and 1500-m ground ranges. Only the FIA tissue kerma due to initial radiation was treated in the analysis; the uncertainties associated with terrain and building shielding and phantom attenuation were not considered in this study. Uncertainties of ~20% were obtained for the prompt neutron and secondary gamma kerma and ~30% for the prompt gamma kerma at both cities. The uncertainties on the total prompt kerma at Hiroshima and Nagasaki are ~18% and ~15%, respectively. The estimated uncertainties vary only slightly by ground range and are fairly highly correlated. The total prompt kerma uncertainties are dominated by the secondary gamma uncertainties, which in turn are dominated by the modeling parameter uncertainties, particularly those associated with the weapon yield and radiation sources

  13. Improving uncertainty evaluation of process models by using pedigree analysis. A case study on CO2 capture with monoethanolamine

    NARCIS (Netherlands)

    van der Spek, Mijndert; Ramirez, Andrea; Faaij, André

    2016-01-01

    This article aims to improve uncertainty evaluation of process models by combining a quantitative uncertainty evaluation method (data validation) with a qualitative uncertainty evaluation method (pedigree analysis). The approach is tested on a case study of monoethanolamine based postcombustion CO2

  14. Uncertainty and sensitivity analysis of control strategies using the benchmark simulation model No1 (BSM1)

    DEFF Research Database (Denmark)

    Flores-Alsina, Xavier; Rodriguez-Roda, Ignasi; Sin, Gürkan

    2009-01-01

    The objective of this paper is to perform an uncertainty and sensitivity analysis of the predictions of the Benchmark Simulation Model (BSM) No. 1, when comparing four activated sludge control strategies. The Monte Carlo simulation technique is used to evaluate the uncertainty in the BSM1 predict...

  15. A retrospective dosimetry method and its uncertainty analysis

    International Nuclear Information System (INIS)

    Zhang, L.; Jia, D.; Dai, G.

    2000-01-01

    The main aim of a radiation epidemiological study is to assess the risk to a population exposed to ionizing radiation. The actual work of the assessment may be very difficult because dose information about the population is often indirect and incomplete. It is very important, therefore, to find a way of estimating reasonable and reliable doses for the population by a retrospective method from limited information. In order to provide reasonable dose information for the cohort study of Chinese medical diagnostic X-ray workers, a retrospective dosimetry method was established. In China, a cohort study of more than 27,000 medical diagnostic X-ray workers, with 25,000 controls, has been carried out for about fifteen years in order to assess the risk to an occupationally exposed population. Obviously, a key to the success of the study is to obtain reliable and reasonable dose estimates by the dose reconstruction method. Before 1985, there was a lack of directly measured personal dose information; however, other, indirect information can be obtained. Examples are information about workloads from hospital documents, information about the operational conditions of workers of different statuses from a survey of occupational history, and the exposure levels under various working conditions from simulation methods. The information for estimating organ dose can also be obtained by simulation experiments with a phantom. Based on the information mentioned above, a mathematical model and a computerized system for dose reconstruction for this occupational population were designed and developed. Uncertainty analysis is very important for dose reconstruction. The uncertainty in our study comes from two sources: one is the dose reconstruction model, the other the survey of occupational history. The main results of the uncertainty analysis are presented. In order to control the uncertainty of the

  16. Uncertainty quantification and sensitivity analysis of an arterial wall mechanics model for evaluation of vascular drug therapies.

    Science.gov (United States)

    Heusinkveld, Maarten H G; Quicken, Sjeng; Holtackers, Robert J; Huberts, Wouter; Reesink, Koen D; Delhaas, Tammo; Spronck, Bart

    2018-02-01

    Quantification of the uncertainty in constitutive model predictions describing arterial wall mechanics is vital towards non-invasive assessment of vascular drug therapies. Therefore, we perform uncertainty quantification to determine the uncertainty in mechanical characteristics describing the vessel wall response upon loading. Furthermore, a global variance-based sensitivity analysis is performed to pinpoint the measurements that are most rewarding to measure more precisely. We used previously published carotid diameter-pressure and intima-media thickness (IMT) data (measured in triplicate), and Holzapfel-Gasser-Ogden models. A virtual data set containing 5000 diastolic and systolic diameter-pressure points, and IMT values, was generated by adding measurement error to the average of the measured data. The model was fitted to single-exponential curves calculated from the data, obtaining distributions of constitutive parameters and constituent load-bearing parameters. Additionally, we (1) simulated vascular drug treatment to assess the relevance of model uncertainty and (2) evaluated how increasing the number of measurement repetitions influences model uncertainty. We found substantial uncertainty in the constitutive parameters. Simulating vascular drug treatment predicted a 6 percentage point reduction in collagen load bearing ([Formula: see text]), approximately 50% of its uncertainty. Sensitivity analysis indicated that the uncertainty in [Formula: see text] was primarily caused by noise in the distension and IMT measurements. The spread in [Formula: see text] could be decreased by 50% when increasing the number of measurement repetitions from 3 to 10. Model uncertainty, notably that in [Formula: see text], could conceal effects of vascular drug therapy. However, this uncertainty could be reduced by increasing the number of measurement repetitions of the distension and wall thickness measurements used for model parameterisation.
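
    The variance-based global sensitivity analysis mentioned here is typically computed as Sobol indices. A minimal numpy sketch of the Saltelli-style first-order estimator, using a made-up three-parameter model rather than the authors' arterial wall model:

    import numpy as np

    rng = np.random.default_rng(0)

    def model(x):
        # Hypothetical scalar response of three uncertain inputs.
        return x[:, 0] + 2.0 * x[:, 1] ** 2 + 0.5 * x[:, 0] * x[:, 2]

    n, k = 10000, 3
    A = rng.uniform(0.0, 1.0, (n, k))   # two independent sample matrices
    B = rng.uniform(0.0, 1.0, (n, k))

    yA, yB = model(A), model(B)
    var_y = np.var(np.concatenate([yA, yB]))

    for i in range(k):
        AB = A.copy()
        AB[:, i] = B[:, i]              # replace column i of A with B's
        yAB = model(AB)
        # Saltelli (2010) estimator of the first-order index S_i.
        S_i = np.mean(yB * (yAB - yA)) / var_y
        print(f"S_{i+1} ~ {S_i:.3f}")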

  17. Implementation of a methodology to perform the uncertainty and sensitivity analysis of the control rod drop in a BWR

    Energy Technology Data Exchange (ETDEWEB)

    Reyes F, M. del C.

    2015-07-01

    A methodology to perform uncertainty and sensitivity analysis for the cross sections used in a Trace/PARCS coupled model for a control rod drop transient of a BWR-5 reactor was implemented with the neutronics code PARCS. A model of the nuclear reactor detailing all assemblies located in the core was developed. The thermohydraulic model designed in Trace, however, was a simple one, in which a single channel representing all assembly types in the core was placed inside a simple vessel model, and boundary conditions were established. The thermohydraulic model was coupled with the neutronics model, first for the steady state, and then a Control Rod Drop (CRD) transient was performed in order to carry out the uncertainty and sensitivity analysis. To analyze the cross sections used in the Trace/PARCS coupled model during the transient, Probability Density Functions (PDFs) were generated for the 22 cross-section parameters selected from the neutronics parameters that PARCS requires, thus obtaining 100 different cases for the Trace/PARCS coupled model, each with a database of different cross sections. All these cases were executed with the coupled model, yielding 100 different outputs for the CRD transient, with special emphasis on 4 responses per output: 1) the reactivity, 2) the percentage of rated power, 3) the average fuel temperature, and 4) the average coolant density. For each response during the transient, an uncertainty analysis was performed in which the corresponding uncertainty bands were generated. With this analysis it is possible to observe the result ranges of the chosen responses when varying the selected uncertain parameters. This is very useful and important for maintaining safety in nuclear power plants, and for verifying whether the uncertainty bands lie within safety margins. The sensitivity analysis complements the uncertainty analysis by identifying the parameter or parameters with the most influence on the
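
    Generating a set of perturbed parameter cases from PDFs, as described above, is commonly done with simple random or Latin hypercube sampling. A sketch using scipy's quasi-Monte Carlo module, with made-up uniform bands standing in for the record's actual cross-section PDFs:

    import numpy as np
    from scipy.stats import qmc

    d, n = 22, 100                       # 22 uncertain parameters, 100 cases
    sampler = qmc.LatinHypercube(d=d, seed=1)
    unit_sample = sampler.random(n=n)    # n x d points in [0, 1)^d

    # Hypothetical +/-5% uniform bands around nominal cross sections.
    nominal = np.ones(d)
    lower, upper = 0.95 * nominal, 1.05 * nominal
    cases = qmc.scale(unit_sample, lower, upper)

    # Each row is one perturbed parameter set, i.e. one coupled-code run.
    for i, case in enumerate(cases[:3]):
        print(f"case {i}: {case[:4].round(4)} ...")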

  18. Implementation of a methodology to perform the uncertainty and sensitivity analysis of the control rod drop in a BWR

    International Nuclear Information System (INIS)

    Reyes F, M. del C.

    2015-01-01

    A methodology to perform uncertainty and sensitivity analysis for the cross sections used in a Trace/PARCS coupled model for a control rod drop transient of a BWR-5 reactor was implemented with the neutronics code PARCS. A model of the nuclear reactor detailing all assemblies located in the core was developed. The thermohydraulic model designed in Trace, however, was a simple one, in which a single channel representing all assembly types in the core was placed inside a simple vessel model, and boundary conditions were established. The thermohydraulic model was coupled with the neutronics model, first for the steady state, and then a Control Rod Drop (CRD) transient was performed in order to carry out the uncertainty and sensitivity analysis. To analyze the cross sections used in the Trace/PARCS coupled model during the transient, Probability Density Functions (PDFs) were generated for the 22 cross-section parameters selected from the neutronics parameters that PARCS requires, thus obtaining 100 different cases for the Trace/PARCS coupled model, each with a database of different cross sections. All these cases were executed with the coupled model, yielding 100 different outputs for the CRD transient, with special emphasis on 4 responses per output: 1) the reactivity, 2) the percentage of rated power, 3) the average fuel temperature, and 4) the average coolant density. For each response during the transient, an uncertainty analysis was performed in which the corresponding uncertainty bands were generated. With this analysis it is possible to observe the result ranges of the chosen responses when varying the selected uncertain parameters. This is very useful and important for maintaining safety in nuclear power plants, and for verifying whether the uncertainty bands lie within safety margins. The sensitivity analysis complements the uncertainty analysis by identifying the parameter or parameters with the most influence on the

  19. Reliability of a new biokinetic model of zirconium in internal dosimetry: part I, parameter uncertainty analysis.

    Science.gov (United States)

    Li, Wei Bo; Greiter, Matthias; Oeh, Uwe; Hoeschen, Christoph

    2011-12-01

    The reliability of biokinetic models is essential in internal dose assessments and radiation risk analysis for the public, occupational workers, and patients exposed to radionuclides. In this paper, a method for assessing the reliability of biokinetic models by means of uncertainty and sensitivity analysis was developed. The paper is divided into two parts. In the first part of the study, published here, the uncertainty sources of the model parameters for zirconium (Zr), developed by the International Commission on Radiological Protection (ICRP), were identified and analyzed. Furthermore, the uncertainty of the biokinetic experimental measurements performed at the Helmholtz Zentrum München-German Research Center for Environmental Health (HMGU) for developing a new biokinetic model of Zr was analyzed according to the Guide to the Expression of Uncertainty in Measurement, published by the International Organization for Standardization. The confidence intervals and distributions of the model parameters of the ICRP and HMGU Zr biokinetic models were evaluated. As a result of the computer biokinetic modeling, the mean, standard uncertainty, and confidence interval of the model predictions calculated based on the model parameter uncertainty were presented and compared to the plasma clearance and urinary excretion measured after intravenous administration. It was shown that for the most important compartment, the plasma, the uncertainty evaluated for the HMGU model was much smaller than that for the ICRP model; that phenomenon was observed for other organs and tissues as well. The uncertainty of the integral of the radioactivity of Zr up to 50 y calculated by the HMGU model after ingestion by adult members of the public was shown to be smaller by a factor of two than that of the ICRP model. It was also shown that the distribution type of the model parameter strongly influences the model prediction, and the correlation of the model input parameters affects the model prediction to a

  20. Good Modeling Practice for PAT Applications: Propagation of Input Uncertainty and Sensitivity Analysis

    DEFF Research Database (Denmark)

    Sin, Gürkan; Gernaey, Krist; Eliasson Lantz, Anna

    2009-01-01

    The uncertainty and sensitivity analyses are evaluated for their usefulness as part of the model-building within Process Analytical Technology applications. A mechanistic model describing a batch cultivation of Streptomyces coelicolor for antibiotic production was used as case study. The input...... compared to the large uncertainty observed in the antibiotic and off-gas CO2 predictions. The output uncertainty was observed to be lower during the exponential growth phase, while higher in the stationary and death phases - meaning the model describes some periods better than others. To understand which...... promising for helping to build reliable mechanistic models and to interpret the model outputs properly. These tools form part of good modeling practice, which can contribute to successful PAT applications for increased process understanding, operation and control purposes. © 2009 American Institute...

  1. Extensive neutronic sensitivity-uncertainty analysis of a fusion reactor shielding blanket

    International Nuclear Information System (INIS)

    Hogenbirk, A.

    1994-01-01

    In this paper the results are presented of an extensive neutronic sensitivity-uncertainty study performed for the design of a shielding blanket for a next-step fusion reactor, such as ITER. A code system developed at ECN Petten was used. The uncertainty in an important response parameter, the neutron heating in the inboard superconducting coils, was evaluated. Neutron transport calculations in the 100-neutron-group GAM-II structure were performed using the code ANISN. For the sensitivity and uncertainty calculations the code SUSD was used. Uncertainties due to cross-section uncertainties were taken into account, as well as uncertainties due to uncertainties in the energy and angular distributions of scattered neutrons (SED and SAD uncertainties, respectively). The subject of direct-term uncertainties (i.e. uncertainties due to uncertainties in the kerma factors of the superconducting coils) is briefly touched upon. It is shown that SAD uncertainties, which have been largely neglected until now, contribute significantly to the total uncertainty. Moreover, the contribution of direct-term uncertainties may be large. The total uncertainty in the neutron heating, due only to Fe cross-sections, amounts to approximately 25%, which is rather large. However, uncertainty data are scarce and the data may very well be conservative. It is shown in this paper that with the code system used, sensitivity and uncertainty calculations can be performed in a straightforward way. It is therefore suggested that emphasis now be put on the generation of realistic, reliable covariance data for cross-sections as well as for angular and energy distributions. (orig.)

  2. Validation of methodology and uncertainty assessment of antimony determination in environmental materials using Neutron Activation Analysis

    International Nuclear Information System (INIS)

    Matsubara, Tassiane C.M.; Saiki, Mitiko; Zahn, Guilherme S.; Moreira, Edson G.

    2013-01-01

    Antimony is an element found in low concentrations in the environment. However, its determination has attracted great interest because of the knowledge of its toxicity and increasing application. Neutron activation analysis (NAA) is a suitable method for the determination of several elements in different types of matrices, but in the case of Sb, the analysis presents some difficulties due to spectral interferences. The objective of this research was to validate the NAA method and assess the uncertainty for Sb determination in environmental samples. The experimental procedure consisted of irradiating twelve certified reference samples of different kinds of matrices. The samples were irradiated in the nuclear research reactor IEA-R1 of IPEN/CNEN/SP, followed by measurement of the induced radioactivity using a hyperpure germanium detector coupled to a gamma-ray spectrometer. The radioisotopes 122Sb and 124Sb were measured, and the Sb concentrations with their respective uncertainties were obtained by the comparative method. Relative errors and z-score values were calculated to evaluate the accuracy of the results for Sb determination in certified reference materials. The evaluation of the components that contribute to the measurement uncertainty of the Sb concentration showed that the major uncertainty contribution is due to counting statistics. The results also indicated that the value of the combined standard uncertainty depends on the radioisotope measured and the decay time used for counting. (author)
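
    The accuracy check described here, relative error and z-score against a certified value, is a simple computation; a sketch with illustrative numbers in place of the actual CRM results:

    import math

    # Hypothetical values for one CRM: measured and certified Sb content.
    measured, u_measured = 0.82, 0.05      # mg/kg, combined standard uncertainty
    certified, u_certified = 0.80, 0.03    # mg/kg, certificate value

    relative_error = 100.0 * (measured - certified) / certified
    # z-score combining both uncertainties in quadrature.
    z = (measured - certified) / math.hypot(u_measured, u_certified)

    print(f"relative error = {relative_error:.1f}%")
    print(f"z-score = {z:.2f}  (|z| <= 2 is usually taken as satisfactory)")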

  3. Uncertainty analysis of constant amplitude fatigue test data employing the six parameters random fatigue limit model

    Directory of Open Access Journals (Sweden)

    Leonetti Davide

    2018-01-01

    Estimating and reducing uncertainty in fatigue test data analysis is a relevant task in order to assess the reliability of a structural connection with respect to fatigue. Several statistical models have been proposed in the literature with the aim of representing the stress range vs. endurance trend of fatigue test data under constant amplitude loading and the scatter in the finite and infinite life regions. In order to estimate the safety level of the connection, the uncertainty related to the amount of available information also needs to be estimated using the methods provided by statistical theory. Bayesian analysis is employed to reduce the uncertainty due to the often small amount of test data by introducing prior information related to the parameters of the statistical model. In this work, the inference of fatigue test data belonging to cover-plated steel beams is presented. The uncertainty is estimated by making use of Bayesian and frequentist methods. The 5% quantile of the fatigue life is estimated by taking into account the uncertainty related to the sample size, both for a dataset containing few samples and for one containing more data. The S-N curves resulting from the application of the employed methods are compared, and the effect of the reduction of uncertainty in the infinite life region is quantified.

  4. Sensitivity and uncertainty analysis for Ignalina NPP confinement in case of loss of coolant accident

    International Nuclear Information System (INIS)

    Urbonavicius, E.; Babilas, E.; Rimkevicius, S.

    2003-01-01

    At present the best-estimate approach is widely used in the safety analysis of nuclear power plants around the world. The application of such an approach requires estimating the uncertainty of the calculated results. Various methodologies are applied in order to determine the uncertainty with the required accuracy. One of them is the statistical methodology developed at GRS mbH in Germany and integrated into the SUSA tool, which was applied for the sensitivity and uncertainty analysis of the thermal-hydraulic parameters inside the confinement (Accident Localisation System) of Ignalina NPP with the RBMK-1500 reactor in case of the Maximum Design Basis Accident (break of a 900 mm diameter pipe). Several parameters that could potentially influence the calculated results were selected for the analysis. A set of input data with different initial values of the selected parameters was generated. In order to obtain results with 95% probability and 95% confidence, 100 runs were performed with the COCOSYS code developed at GRS mbH. The calculated results were processed with the SUSA tool. The performed analysis showed a rather low dispersion of the results, and only in the initial period of the accident. Besides, the analysis showed that there is no threat to the building structures of the Ignalina NPP confinement in case of the considered accident scenario. (author)
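
    The 95%/95% criterion invoked above is commonly tied to Wilks' formula, which gives the minimum number of code runs needed so that the extreme observed outputs bound the 95th percentile with 95% confidence. The record does not state which order of the formula SUSA applied, so the sketch below shows the standard one-sided first- and second-order sample sizes under that assumption:

    import math

    def wilks_first_order(gamma=0.95, beta=0.95):
        # Smallest N with 1 - gamma**N >= beta (one-sided, first order).
        return math.ceil(math.log(1.0 - beta) / math.log(gamma))

    def wilks_second_order(gamma=0.95, beta=0.95):
        # Smallest N with 1 - gamma**N - N*(1-gamma)*gamma**(N-1) >= beta.
        n = 2
        while gamma**n + n * (1.0 - gamma) * gamma**(n - 1) > 1.0 - beta:
            n += 1
        return n

    print(wilks_first_order())   # 59
    print(wilks_second_order())  # 93, so 100 runs leaves some margin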

  5. Quantifying reactor safety margins: Application of CSAU [Code Scalability, Applicability and Uncertainty] methodology to LBLOCA: Part 3, Assessment and ranging of parameters for the uncertainty analysis of LBLOCA codes

    International Nuclear Information System (INIS)

    Wulff, W.; Boyack, B.E.; Duffey, R.B.

    1988-01-01

    Comparisons of results from TRAC-PF1/MOD1 code calculations with measurements from Separate Effects Tests, and published experimental data for modeling parameters have been used to determine the uncertainty ranges of code input and modeling parameters which dominate the uncertainty in predicting the Peak Clad Temperature for a postulated Large Break Loss of Coolant Accident (LBLOCA) in a four-loop Westinghouse Pressurized Water Reactor. The uncertainty ranges are used for a detailed statistical analysis to calculate the probability distribution function for the TRAC code-predicted Peak Clad Temperature, as is described in an attendant paper. Measurements from Separate Effects Tests and Integral Effects Tests have been compared with results from corresponding TRAC-PF1/MOD1 code calculations to determine globally the total uncertainty in predicting the Peak Clad Temperature for LBLOCAs. This determination is in support of the detailed statistical analysis mentioned above. The analyses presented here account for uncertainties in input parameters, in modeling and scaling, in computing and in measurements. The analyses are an important part of the work needed to implement the Code Scalability, Applicability and Uncertainty (CSAU) methodology. CSAU is needed to determine the suitability of a computer code for reactor safety analyses and the uncertainty in computer predictions. The results presented here are used to estimate the safety margin of a particular nuclear reactor power plant for a postulated accident. 25 refs., 10 figs., 11 tabs

  6. Uncertainty Analysis of A Flood Risk Mapping Procedure Applied In Urban Areas

    Science.gov (United States)

    Krause, J.; Uhrich, S.; Bormann, H.; Diekkrüger, B.

    In the framework of the IRMA-Sponge program, the presented study was part of the joint research project FRHYMAP (flood risk and hydrological mapping). A simple conceptual flooding model (FLOODMAP) has been developed to simulate flooded areas besides rivers within cities. FLOODMAP requires a minimum of input data (digital elevation model (DEM), river line, water level plain) and parameters, and calculates the flood extent as well as the spatial distribution of flood depths. Of course, the simulated model results are affected by errors and uncertainties. Possible sources of uncertainties are the model structure, model parameters and input data. Thus, after the model validation (comparison of the simulated to the observed flood extent, taken from airborne pictures), the uncertainty of the essential input data set (digital elevation model) was analysed. Monte Carlo simulations were performed to assess the effect of uncertainties concerning the statistics of DEM quality and to derive flooding probabilities from the set of simulations. The questions concerning the minimum resolution of a DEM required for flood simulation and the best aggregation procedure for a given DEM were answered by comparing the results obtained using all available standard GIS aggregation procedures. Seven different aggregation procedures were applied to high-resolution DEMs (1-2 m) in three cities (Bonn, Cologne, Luxembourg). Based on this analysis, the effect of 'uncertain' DEM data was estimated and compared with other sources of uncertainties. Especially socio-economic information and monetary transfer functions required for a damage risk analysis show a high uncertainty. Therefore this study helps to analyse the weak points of the flood risk and damage risk assessment procedure.

  7. Dissertation Defense Computational Fluid Dynamics Uncertainty Analysis for Payload Fairing Spacecraft Environmental Control Systems

    Science.gov (United States)

    Groves, Curtis Edward

    2014-01-01

    Spacecraft thermal protection systems are at risk of being damaged due to airflow produced by Environmental Control Systems. There are inherent uncertainties and errors associated with using Computational Fluid Dynamics to predict the airflow field around a spacecraft from the Environmental Control System. This paper describes an approach to quantify the uncertainty in using Computational Fluid Dynamics to predict airflow speeds around an encapsulated spacecraft without the use of test data. Quantifying the uncertainty in analytical predictions is imperative to the success of any simulation-based product. The method could provide an alternative to the traditional "validation by test only" mentality. This method could be extended to other disciplines and has the potential to provide uncertainty for any numerical simulation, thus lowering the cost of performing these verifications while increasing the confidence in those predictions. Spacecraft requirements can include a maximum airflow speed to protect delicate instruments during ground processing. Computational Fluid Dynamics can be used to verify these requirements; however, the model must be validated by test data. This research includes the following three objectives and methods. Objective one is to develop, model, and perform a Computational Fluid Dynamics analysis of three (3) generic, non-proprietary, environmental control systems and spacecraft configurations. Several commercially available and open source solvers have the capability to model the turbulent, highly three-dimensional, incompressible flow regime. The proposed method uses FLUENT, STARCCM+, and OPENFOAM. Objective two is to perform an uncertainty analysis of the Computational Fluid Dynamics model using the methodology found in "Comprehensive Approach to Verification and Validation of Computational Fluid Dynamics Simulations". This method requires three separate grids and solutions, which quantify the error bars around Computational Fluid Dynamics
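
    The three-grid procedure referenced above belongs to the Richardson-extrapolation family of methods, often reported as a Grid Convergence Index (GCI). A generic sketch with invented airflow speeds and a refinement ratio of 2, not the dissertation's actual grids or values:

    import math

    # Hypothetical airflow speeds from three systematically refined grids.
    f_fine, f_medium, f_coarse = 10.12, 10.35, 10.95   # m/s (illustrative)
    r = 2.0                                            # grid refinement ratio

    # Observed order of accuracy from the three solutions.
    p = math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)

    # Grid Convergence Index on the fine grid (safety factor 1.25).
    e21 = abs((f_medium - f_fine) / f_fine)            # relative change
    gci_fine = 1.25 * e21 / (r**p - 1.0)

    print(f"observed order p = {p:.2f}")
    print(f"GCI_fine = {100 * gci_fine:.2f}%  (numerical uncertainty band)")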

  8. Dissertation Defense: Computational Fluid Dynamics Uncertainty Analysis for Payload Fairing Spacecraft Environmental Control Systems

    Science.gov (United States)

    Groves, Curtis Edward

    2014-01-01

    Spacecraft thermal protection systems are at risk of being damaged due to airflow produced by Environmental Control Systems. There are inherent uncertainties and errors associated with using Computational Fluid Dynamics to predict the airflow field around a spacecraft from the Environmental Control System. This paper describes an approach to quantify the uncertainty in using Computational Fluid Dynamics to predict airflow speeds around an encapsulated spacecraft without the use of test data. Quantifying the uncertainty in analytical predictions is imperative to the success of any simulation-based product. The method could provide an alternative to the traditional "validation by test only" mentality. This method could be extended to other disciplines and has the potential to provide uncertainty for any numerical simulation, thus lowering the cost of performing these verifications while increasing the confidence in those predictions. Spacecraft requirements can include a maximum airflow speed to protect delicate instruments during ground processing. Computational Fluid Dynamics can be used to verify these requirements; however, the model must be validated by test data. This research includes the following three objectives and methods. Objective one is to develop, model, and perform a Computational Fluid Dynamics analysis of three (3) generic, non-proprietary, environmental control systems and spacecraft configurations. Several commercially available and open source solvers have the capability to model the turbulent, highly three-dimensional, incompressible flow regime. The proposed method uses FLUENT, STARCCM+, and OPENFOAM. Objective two is to perform an uncertainty analysis of the Computational Fluid Dynamics model using the methodology found in "Comprehensive Approach to Verification and Validation of Computational Fluid Dynamics Simulations". This method requires three separate grids and solutions, which quantify the error bars around Computational Fluid Dynamics predictions.

  9. Information on Hydrologic Conceptual Models, Parameters, Uncertainty Analysis, and Data Sources for Dose Assessments at Decommissioning Sites

    International Nuclear Information System (INIS)

    Meyer, Philip D.; Gee, Glendon W.; Nicholson, Thomas J.

    1999-01-01

    This report addresses issues related to the analysis of uncertainty in dose assessments conducted as part of decommissioning analyses. The analysis is limited to the hydrologic aspects of the exposure pathway involving infiltration of water at the ground surface, leaching of contaminants, and transport of contaminants through the groundwater to a point of exposure. The basic conceptual models and mathematical implementations of three dose assessment codes are outlined along with the site-specific conditions under which the codes may provide inaccurate, potentially nonconservative results. In addition, the hydrologic parameters of the codes are identified and compared. A methodology for parameter uncertainty assessment is outlined that considers the potential data limitations and modeling needs of decommissioning analyses. This methodology uses generic parameter distributions based on national or regional databases, sensitivity analysis, probabilistic modeling, and Bayesian updating to incorporate site-specific information. Data sources for best-estimate parameter values and parameter uncertainty information are also reviewed. A follow-on report will illustrate the uncertainty assessment methodology using decommissioning test cases
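
    The Bayesian updating step outlined above, refining a generic parameter distribution with site-specific data, can be illustrated with a conjugate normal-normal update. All numbers below are invented; log-conductivity and its spread merely stand in for whatever hydrologic parameter is being updated:

    import math

    # Generic (prior) distribution for log10 hydraulic conductivity,
    # e.g. from a national database (values are illustrative).
    prior_mean, prior_sd = -5.0, 1.0

    # Hypothetical site-specific measurements of log10 K.
    site_data = [-4.2, -4.6, -4.4]
    meas_sd = 0.5                      # assumed measurement standard deviation

    n = len(site_data)
    data_mean = sum(site_data) / n

    # Conjugate normal-normal posterior (known-variance case).
    prec_prior = 1.0 / prior_sd**2
    prec_data = n / meas_sd**2
    post_var = 1.0 / (prec_prior + prec_data)
    post_mean = post_var * (prec_prior * prior_mean + prec_data * data_mean)

    print(f"posterior: mean = {post_mean:.2f}, sd = {math.sqrt(post_var):.2f}")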

  10. Information on Hydrologic Conceptual Models, Parameters, Uncertainty Analysis, and Data Sources for Dose Assessments at Decommissioning Sites

    International Nuclear Information System (INIS)

    Meyer, Philip D.; Gee, Glendon W.

    2000-01-01

    This report addresses issues related to the analysis of uncertainty in dose assessments conducted as part of decommissioning analyses. The analysis is limited to the hydrologic aspects of the exposure pathway involving infiltration of water at the ground surface, leaching of contaminants, and transport of contaminants through the groundwater to a point of exposure. The basic conceptual models and mathematical implementations of three dose assessment codes are outlined along with the site-specific conditions under which the codes may provide inaccurate, potentially nonconservative results. In addition, the hydrologic parameters of the codes are identified and compared. A methodology for parameter uncertainty assessment is outlined that considers the potential data limitations and modeling needs of decommissioning analyses. This methodology uses generic parameter distributions based on national or regional databases, sensitivity analysis, probabilistic modeling, and Bayesian updating to incorporate site-specific information. Data sources for best-estimate parameter values and parameter uncertainty information are also reviewed. A follow-on report will illustrate the uncertainty assessment methodology using decommissioning test cases

  11. Uncertainty evaluation in correlated quantities: application to elemental analysis of atmospheric aerosols

    International Nuclear Information System (INIS)

    Espinosa, A.; Miranda, J.; Pineda, J. C.

    2010-01-01

    One of the aspects that are frequently overlooked in the evaluation of uncertainty in experimental data is the possibility that the involved quantities are correlated among themselves, due to different causes. An example is the elemental analysis of atmospheric aerosols using techniques like X-ray Fluorescence (XRF) or Particle Induced X-ray Emission (PIXE). In these cases, the measured elemental concentrations are highly correlated, and are then used to obtain information about other variables, such as the contribution from emitting sources related to soil, sulfate, non-soil potassium or organic matter. This work describes, as an example, the method required to evaluate the uncertainty in variables determined from correlated quantities, for a set of atmospheric aerosol samples collected in the Metropolitan Area of the Mexico Valley and analyzed with PIXE. The work is based on the recommendations of the Guide for the Evaluation of Uncertainty published by the International Organization for Standardization. (Author)
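
    For correlated inputs, the GUM propagation rule replaces the simple quadrature sum with a covariance matrix: for y = f(x) with Jacobian J, u_y^2 = J Sigma J^T. A numpy sketch for a soil-contribution estimate formed as a weighted sum of correlated crustal-element concentrations; the concentrations, covariances, and weights are all illustrative:

    import numpy as np

    # Hypothetical concentrations (ug/m3) of Al, Si, Ca and their covariance;
    # elements from a common source are typically positively correlated.
    x = np.array([0.80, 2.10, 0.60])
    cov = np.array([
        [0.010, 0.006, 0.003],
        [0.006, 0.025, 0.005],
        [0.003, 0.005, 0.008],
    ])

    # Linear model y = c @ x, so the Jacobian is just the coefficient vector.
    c = np.array([12.0, 4.0, 8.0])   # illustrative crustal weighting factors

    y = c @ x
    u_y = np.sqrt(c @ cov @ c)       # u_y^2 = J Sigma J^T

    # Ignoring the off-diagonal terms understates the uncertainty here.
    u_naive = np.sqrt(c @ np.diag(np.diag(cov)) @ c)
    print(f"y = {y:.2f}, u(correlated) = {u_y:.2f}, u(naive) = {u_naive:.2f}")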

  12. Uncertainty, probability and information-gaps

    International Nuclear Information System (INIS)

    Ben-Haim, Yakov

    2004-01-01

    This paper discusses two main ideas. First, we focus on info-gap uncertainty, as distinct from probability. Info-gap theory is especially suited for modelling and managing uncertainty in system models: we invest all our knowledge in formulating the best possible model; this leaves the modeller with very faulty and fragmentary information about the variation of reality around that optimal model. Second, we examine the interdependence between uncertainty modelling and decision-making. Good uncertainty modelling requires contact with the end-use, namely, with the decision-making application of the uncertainty model. The most important avenue of uncertainty-propagation is from initial data- and model-uncertainties into uncertainty in the decision-domain. Two questions arise. Is the decision robust to the initial uncertainties? Is the decision prone to opportune windfall success? We apply info-gap robustness and opportunity functions to the analysis of representation and propagation of uncertainty in several of the Sandia Challenge Problems

  13. Guidelines for uncertainty analysis developed for the participants in the BIOMOVS II study

    International Nuclear Information System (INIS)

    Baeverstam, U.; Davis, P.; Garcia-Olivares, A.; Henrich, E.; Koch, J.

    1993-07-01

    This report has been produced to provide guidelines for uncertainty analysis for use by participants in the BIOMOVS II study. It is hoped that others with an interest in modelling contamination in the biosphere will also find it useful. The report has been prepared by members of the Uncertainty and Validation Working Group and has been reviewed by other BIOMOVS II participants. The opinions expressed are those of the authors and should not be taken to represent the views of the BIOMOVS II sponsors or other BIOMOVS II participating organisations

  14. Guidelines for uncertainty analysis developed for the participants in the BIOMOVS II study

    Energy Technology Data Exchange (ETDEWEB)

    Baeverstam, U; Davis, P; Garcia-Olivares, A; Henrich, E; Koch, J

    1993-07-01

    This report has been produced to provide guidelines for uncertainty analysis for use by participants in the BIOMOVS II study. It is hoped that others with an interest in modelling contamination in the biosphere will also find it useful. The report has been prepared by members of the Uncertainty and Validation Working Group and has been reviewed by other BIOMOVS II participants. The opinions expressed are those of the authors and should not be taken to represent the views of the BIOMOVS II sponsors or other BIOMOVS II participating organisations.

  15. Uncertainty Visualization Using Copula-Based Analysis in Mixed Distribution Models.

    Science.gov (United States)

    Hazarika, Subhashis; Biswas, Ayan; Shen, Han-Wei

    2018-01-01

    Distributions are often used to model uncertainty in many scientific datasets. To preserve the correlation among the spatially sampled grid locations in the dataset, various standard multivariate distribution models have been proposed in the visualization literature. These models treat each grid location as a univariate random variable which models the uncertainty at that location. Standard multivariate distributions (both parametric and nonparametric) assume that all the univariate marginals are of the same type/family of distribution. In reality, however, different grid locations show different statistical behavior, which may not be best modeled by the same type of distribution. In this paper, we propose a new multivariate uncertainty modeling strategy to address the needs of uncertainty modeling in scientific datasets. Our proposed method is based on a statistically sound multivariate technique called Copula, which makes it possible to separate the process of estimating the univariate marginals and the process of modeling dependency, unlike the standard multivariate distributions. The modeling flexibility offered by our proposed method makes it possible to design distribution fields which can have different types of distribution (Gaussian, histogram, KDE, etc.) at the grid locations, while maintaining the correlation structure at the same time. Depending on the results of various standard statistical tests, we can choose an optimal distribution representation at each location, resulting in more cost-efficient modeling without significantly sacrificing analysis quality. To demonstrate the efficacy of our proposed modeling strategy, we extract and visualize uncertain features like isocontours and vortices in various real-world datasets. We also study various modeling criteria to help users in the task of univariate model selection.
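
    The core copula idea described above, separating the dependence structure from the marginals, can be sketched with a Gaussian copula: draw correlated normals, map them to uniforms through the normal CDF, then push each uniform through a different inverse marginal CDF. The marginals and correlation below are illustrative, not those of the paper's datasets:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)

    # Desired dependence between two grid locations (Gaussian copula).
    corr = np.array([[1.0, 0.8],
                     [0.8, 1.0]])
    z = rng.multivariate_normal(mean=[0.0, 0.0], cov=corr, size=5000)

    # Map to uniforms via the standard normal CDF (the copula step).
    u = stats.norm.cdf(z)

    # Heterogeneous marginals: Gaussian at one location, gamma at the other.
    x0 = stats.norm.ppf(u[:, 0], loc=10.0, scale=2.0)
    x1 = stats.gamma.ppf(u[:, 1], a=3.0, scale=1.5)

    # Rank correlation survives the marginal transforms.
    rho, _ = stats.spearmanr(x0, x1)
    print(f"Spearman rho = {rho:.2f}")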

  16. Uncertainty quantification and error analysis

    Energy Technology Data Exchange (ETDEWEB)

    Higdon, Dave M [Los Alamos National Laboratory; Anderson, Mark C [Los Alamos National Laboratory; Habib, Salman [Los Alamos National Laboratory; Klein, Richard [Los Alamos National Laboratory; Berliner, Mark [OHIO STATE UNIV.; Covey, Curt [LLNL; Ghattas, Omar [UNIV OF TEXAS; Graziani, Carlo [UNIV OF CHICAGO; Seager, Mark [LLNL; Sefcik, Joseph [LLNL; Stark, Philip [UC/BERKELEY; Stewart, James [SNL

    2010-01-01

    UQ studies all sources of error and uncertainty, including: systematic and stochastic measurement error; ignorance; limitations of theoretical models; limitations of numerical representations of those models; limitations on the accuracy and reliability of computations, approximations, and algorithms; and human error. A more precise definition for UQ is suggested below.

  17. An inquiry into the potential of scenario analysis for dealing with uncertainty in strategic environmental assessment in China

    International Nuclear Information System (INIS)

    Zhu Zhixi; Bai, Hongtao; Xu He; Zhu Tan

    2011-01-01

    Strategic environmental assessment (SEA) inherently needs to address greater levels of uncertainty in the formulation and implementation processes of strategic decisions, compared with project environmental impact assessment. The range of uncertainties includes internal and external factors of the complex system that is concerned in the strategy. Scenario analysis is increasingly being used to cope with uncertainty in SEA. Following a brief introduction of scenarios and scenario analysis, this paper examines the rationale for scenario analysis in SEA in the context of China. The state of the art associated with scenario analysis applied to SEA in China was reviewed through four SEA case analyses. Lessons learned from these cases indicated the word 'scenario' appears to be abused and the scenario-based methods appear to be misused due to the lack of understanding of an uncertain future and scenario analysis. However, good experiences were also drawn on, regarding how to integrate scenario analysis into the SEA process in China, how to cope with driving forces including uncertainties, how to combine qualitative scenario storylines with quantitative impact predictions, and how to conduct assessments and propose recommendations based on scenarios. Additionally, the ways to improve the application of this tool in SEA were suggested. We concluded by calling for further methodological research on this issue and more practices.

  18. Comprehensive Approach to Verification and Validation of CFD Simulations Applied to Backward Facing Step-Application of CFD Uncertainty Analysis

    Science.gov (United States)

    Groves, Curtis E.; LLie, Marcel; Shallhorn, Paul A.

    2012-01-01

    There are inherent uncertainties and errors associated with using Computational Fluid Dynamics (CFD) to predict the flow field, and there is no standard method for evaluating uncertainty in the CFD community. This paper describes an approach to validate the uncertainty in using CFD. The method will use state-of-the-art uncertainty analysis, applying different turbulence models, and draw conclusions on which models provide the least uncertainty and which models most accurately predict the flow over a backward-facing step.

  19. Free Vibration Analysis of Composite Plates via Refined Theories Accounting for Uncertainties

    Directory of Open Access Journals (Sweden)

    G. Giunta

    2011-01-01

    The free vibration analysis of composite thin and relatively thick plates accounting for uncertainty is addressed in this work. Classical and refined two-dimensional models derived via Carrera's Unified Formulation (CUF) are considered. Material properties and geometrical parameters are supposed to be random. The fundamental frequency related to the first bending eigenmode is stochastically described in terms of the mean value, the standard deviation, the related confidence intervals and the cumulative distribution function. The Monte Carlo Method is employed to account for uncertainty. Cross-ply, simply supported, orthotropic plates are considered. Symmetric and anti-symmetric lay-ups are investigated. Displacement-based and mixed two-dimensional theories are adopted. Equivalent single-layer and layer-wise approaches are considered. A Navier-type solution is assumed. The conducted analyses have shown that, for the considered cases, the fundamental natural frequency is not very sensitive to the uncertainty in the material parameters, while uncertainty in the geometrical parameters should be accounted for. In the case of thin plates, all the considered models yield statistically matching results. For relatively thick plates, the difference in the mean value of the natural frequency is due to the different number of degrees of freedom in the model.
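
    The Monte Carlo workflow used in such studies, sampling random material and geometric properties and collecting statistics of the first eigenfrequency, can be sketched with the closed-form formula for a simply supported isotropic thin plate; the isotropic plate and the input spreads below are stand-ins for the paper's cross-ply laminates and CUF models:

    import numpy as np

    rng = np.random.default_rng(3)
    n = 20000

    # Random inputs (illustrative spreads): Young's modulus and thickness.
    E = rng.normal(70e9, 3.5e9, n)       # Pa
    h = rng.normal(2e-3, 0.1e-3, n)      # m
    a, b = 0.4, 0.3                      # plate sides, m (deterministic)
    nu, rho = 0.3, 2700.0                # Poisson ratio, density kg/m3

    D = E * h**3 / (12.0 * (1.0 - nu**2))          # bending stiffness
    omega = np.pi**2 * (1/a**2 + 1/b**2) * np.sqrt(D / (rho * h))
    f1 = omega / (2.0 * np.pi)                     # first natural frequency, Hz

    print(f"mean = {f1.mean():.1f} Hz, sd = {f1.std():.1f} Hz")
    print("95% CI =", np.percentile(f1, [2.5, 97.5]).round(1))
    # The empirical CDF is just the sorted sample plotted against (i+1)/n.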

  20. Application of RELAP/SCDAPSIM with integrated uncertainty options to research reactor systems thermal hydraulic analysis

    International Nuclear Information System (INIS)

    Allison, C.M.; Hohorst, J.K.; Perez, M.; Reventos, F.

    2010-01-01

    The RELAP/SCDAPSIM/MOD4.0 code, designed to predict the behavior of reactor systems during normal and accident conditions, is being developed as part of the international SCDAP Development and Training Program (SDTP). RELAP/SCDAPSIM/MOD4.0, which is the first version of RELAP5 completely rewritten to FORTRAN 90/95/2000 standards, uses publicly available RELAP5 and SCDAP models in combination with advanced programming and numerical techniques and other SDTP-member modeling/user options. One such member-developed option is an integrated uncertainty analysis package being developed jointly by the Technical University of Catalonia (UPC) and Innovative Systems Software (ISS). This paper briefly summarizes the features of RELAP/SCDAPSIM/MOD4.0 and the integrated uncertainty analysis package, and then presents an example of how the integrated uncertainty package can be set up and used for a simple pipe flow problem. (author)

  1. Uncertainty analysis in calculations of a road accident consequences

    International Nuclear Information System (INIS)

    Bonnefous, S.; Brenot, J.; Hubert, P.

    1995-01-01

    This paper develops a concrete situation, namely the search for an evacuation distance in the case of a road accident involving a chlorine tank. The methodological aspect is how to implement uncertainty analysis in deterministic models with random parameters. The study demonstrates a great dispersion in the results. It allows satisfactory decision rules to be established, as well as a hierarchy of parameters that is useful for defining priorities in the search for information and for improving the treatment of these parameters. (authors). 8 refs., 1 fig., 2 tabs

  2. The effect of uncertainty of reactor parameters obtained using k0-NAA on result of analysis

    International Nuclear Information System (INIS)

    Sasajima, Fumio

    2006-01-01

    Neutron Activation Analysis using the k0 method is a useful technique allowing convenient and accurate simultaneous analysis of multiple elements, eliminating the need for comparative reference samples. As is well known, when the k0 method is used, it is essential for a correct analysis result to accurately determine the α-factor and f-factor for the neutron spectrum in the irradiation field. For this reason, based on data obtained from an experiment conducted in the JRR-3 PN-3 system, how uncertainty in the measured values of the α-factor and f-factor affects the result of an analysis was evaluated. The evaluation involved intentionally varying the values of the reactor parameters and then analyzing environmental reference samples (NIST SRM-1632c) using the k0 method to examine the effect of these factors on the concentrations of 19 elements. The evaluation revealed that the effect of the uncertainty on the concentrations of the 19 elements was at most approximately 1% under the conditions of this experiment, assuming that the factor α, a reactor parameter, had an uncertainty of approximately 200%. (author)

  3. Validation and assessment of uncertainty of chemical tests as a tool for the reliability analysis of wastewater IPEN

    International Nuclear Information System (INIS)

    Silva, Renan A.; Martins, Elaine A.J.; Furusawa, Helio A.

    2011-01-01

    The validation of analytical methods has become an indispensable tool for analysis in chemical laboratories, and is even required for accreditation. However, even if a laboratory uses validated methods of analysis, there is the possibility that these methods generate results discrepant with reality, making it necessary to add a quantitative attribute (a value) indicating the degree of certainty of the analytical method used. This measure, assigned to the result of a measurement, is called measurement uncertainty. We estimate this uncertainty with a stated level of confidence; an analytical result has limited significance if a proper assessment of its uncertainty is not carried out. One of the activities of this work was to elaborate a program to help the validation and evaluation of uncertainty in chemical analysis. The program was developed with the Visual Basic programming language, and the method of uncertainty evaluation introduced concepts based on the GUM (Guide to the Expression of Uncertainty in Measurement). This uncertainty evaluation program will be applied to chemical analysis in support of the characterization of the Nuclear Fuel Cycle developed by IPEN and the study of organic substances in wastewater associated with professional activities of the Institute: in the first case, primarily for the determination of total uranium, and in the second case, for substances that were generated by human activities and that are listed in resolution 357/2005. As a strategy for the development of this work, the PDCA cycle was adopted to improve the efficiency of each step and minimize errors while performing the experimental part. The program should be validated to meet requirements of standards such as, for example, ISO/IEC 17025. The application is projected for use in other analytical procedures, both of the Nuclear Fuel Cycle and in the control program and chemical waste management of IPEN.

  4. Validation and assessment of uncertainty of chemical tests as a tool for the reliability analysis of wastewater IPEN

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Renan A.; Martins, Elaine A.J.; Furusawa, Helio A., E-mail: elaine@ipen.br, E-mail: helioaf@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2011-07-01

    The validation of analytical methods has become an indispensable tool for analysis in chemical laboratories, and is even required for accreditation. However, even if a laboratory uses validated methods of analysis, there is the possibility that these methods generate results discrepant with reality, making it necessary to add a quantitative attribute (a value) indicating the degree of certainty of the analytical method used. This measure, assigned to the result of a measurement, is called measurement uncertainty. We estimate this uncertainty with a stated level of confidence; an analytical result has limited significance if a proper assessment of its uncertainty is not carried out. One of the activities of this work was to elaborate a program to help the validation and evaluation of uncertainty in chemical analysis. The program was developed with the Visual Basic programming language, and the method of uncertainty evaluation introduced concepts based on the GUM (Guide to the Expression of Uncertainty in Measurement). This uncertainty evaluation program will be applied to chemical analysis in support of the characterization of the Nuclear Fuel Cycle developed by IPEN and the study of organic substances in wastewater associated with professional activities of the Institute: in the first case, primarily for the determination of total uranium, and in the second case, for substances that were generated by human activities and that are listed in resolution 357/2005. As a strategy for the development of this work, the PDCA cycle was adopted to improve the efficiency of each step and minimize errors while performing the experimental part. The program should be validated to meet requirements of standards such as, for example, ISO/IEC 17025. The application is projected for use in other analytical procedures, both of the Nuclear Fuel Cycle and in the control program and chemical waste management of IPEN.

  5. Incorporating uncertainty analysis into life cycle estimates of greenhouse gas emissions from biomass production

    International Nuclear Information System (INIS)

    Johnson, David R.; Willis, Henry H.; Curtright, Aimee E.; Samaras, Constantine; Skone, Timothy

    2011-01-01

    Before further investments are made in utilizing biomass as a source of renewable energy, both policy makers and the energy industry need estimates of the net greenhouse gas (GHG) reductions expected from substituting biobased fuels for fossil fuels. Such GHG reductions depend greatly on how the biomass is cultivated, transported, processed, and converted into fuel or electricity. Any policy aiming to reduce GHGs with biomass-based energy must account for uncertainties in emissions at each stage of production, or else it risks yielding marginal reductions, if any, while potentially imposing great costs. This paper provides a framework for incorporating uncertainty analysis specifically into estimates of the life cycle GHG emissions from the production of biomass. We outline the sources of uncertainty, discuss the implications of uncertainty and variability on the limits of life cycle assessment (LCA) models, and provide a guide for practitioners to best practices in modeling these uncertainties. The suite of techniques described herein can be used to improve the understanding and the representation of the uncertainties associated with emissions estimates, thus enabling improved decision making with respect to the use of biomass for energy and fuel production. -- Highlights: → We describe key model, scenario and data uncertainties in LCAs of biobased fuels. → System boundaries and allocation choices should be consistent with study goals. → Scenarios should be designed around policy levers that can be controlled. → We describe a new way to analyze the importance of covariance between inputs.

  6. Tolerance of uncertainty: Conceptual analysis, integrative model, and implications for healthcare.

    Science.gov (United States)

    Hillen, Marij A; Gutheil, Caitlin M; Strout, Tania D; Smets, Ellen M A; Han, Paul K J

    2017-05-01

    Uncertainty tolerance (UT) is an important, well-studied phenomenon in health care and many other important domains of life, yet its conceptualization and measurement by researchers in various disciplines have varied substantially and its essential nature remains unclear. The objectives of this study were to: 1) analyze the meaning and logical coherence of UT as conceptualized by developers of UT measures, and 2) develop an integrative conceptual model to guide future empirical research regarding the nature, causes, and effects of UT. A narrative review and conceptual analysis of 18 existing measures of Uncertainty and Ambiguity Tolerance was conducted, focusing on how measure developers in various fields have defined both the "uncertainty" and "tolerance" components of UT-both explicitly through their writings and implicitly through the items constituting their measures. Both explicit and implicit conceptual definitions of uncertainty and tolerance vary substantially and are often poorly and inconsistently specified. A logically coherent, unified understanding or theoretical model of UT is lacking. To address these gaps, we propose a new integrative definition and multidimensional conceptual model that construes UT as the set of negative and positive psychological responses-cognitive, emotional, and behavioral-provoked by the conscious awareness of ignorance about particular aspects of the world. This model synthesizes insights from various disciplines and provides an organizing framework for future research. We discuss how this model can facilitate further empirical and theoretical research to better measure and understand the nature, determinants, and outcomes of UT in health care and other domains of life. Uncertainty tolerance is an important and complex phenomenon requiring more precise and consistent definition. An integrative definition and conceptual model, intended as a tentative and flexible point of departure for future research, adds needed breadth

  7. Bayesian analysis of stage-fall-discharge rating curves and their uncertainties

    Science.gov (United States)

    Mansanarez, Valentin; Le Coz, Jérôme; Renard, Benjamin; Lang, Michel; Pierrefeu, Gilles; Le Boursicaud, Raphaël; Pobanz, Karine

    2016-04-01

    Stage-fall-discharge (SFD) rating curves are traditionally used to compute streamflow records at sites where the energy slope of the flow is variable due to variable backwater effects. Building on existing Bayesian approaches, we introduce an original hydraulics-based method for developing SFD rating curves used at twin gauge stations and estimating their uncertainties. Conventional power functions for channel and section controls are used, and the transition to a backwater-affected channel control is computed based on a continuity condition, solved either analytically or numerically. The difference between the reference levels at the two stations is estimated as another uncertain parameter of the SFD model. The method proposed in this presentation incorporates information from both hydraulic knowledge (equations of channel or section controls) and the information available in the stage-fall-discharge observations (gauging data). The obtained total uncertainty combines the parametric uncertainty and the remnant uncertainty related to the rating curve model. This method provides a direct estimation of the physical inputs of the rating curve (roughness, width, bed slope, distance between twin gauges, etc.). The performance of the new method is tested using an application case affected by the variable backwater of a run-of-the-river dam: the Rhône river at Valence, France. In particular, a sensitivity analysis to the prior information and to the gauging dataset is performed. At that site, the stage-fall-discharge domain is well documented with gaugings conducted over a range of backwater-affected and unaffected conditions. The performance of the new model was deemed satisfactory. Notably, the transition to uniform flow when the overall range of the auxiliary stage is gauged is correctly simulated. The resulting curves are in good agreement with the observations (gaugings) and their uncertainty envelopes are acceptable for computing streamflow records. Similar
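
    The basic building block referenced above, a power-function control Q = a(h - b)^c with uncertain parameters, can be illustrated with a simple least-squares fit to synthetic gaugings; a full Bayesian treatment like the authors' would replace this with posterior sampling, and all numbers here are invented:

    import numpy as np
    from scipy.optimize import curve_fit

    rng = np.random.default_rng(11)

    def rating(h, a, b, c):
        # Power-function control: discharge as a function of stage.
        return a * np.clip(h - b, 1e-9, None) ** c

    # Synthetic gaugings around a "true" control Q = 30*(h - 0.5)^1.6,
    # with multiplicative lognormal measurement noise.
    h = np.linspace(0.8, 3.0, 25)
    q = 30.0 * (h - 0.5) ** 1.6 * rng.lognormal(0.0, 0.07, h.size)

    popt, pcov = curve_fit(rating, h, q, p0=[20.0, 0.3, 1.5])
    perr = np.sqrt(np.diag(pcov))    # 1-sigma parametric uncertainties

    for name, val, err in zip("abc", popt, perr):
        print(f"{name} = {val:.2f} +/- {err:.2f}")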

  8. Structural uncertainty in seismic risk analysis. Seismic safety margins research program

    Energy Technology Data Exchange (ETDEWEB)

    Hasselman, T K; Simonian, S S [J.H. Wiggins Company (United States)

    1980-03-01

    This report documents the formulation of a methodology for modeling and evaluating the effects of structural uncertainty on predicted modal characteristics of the major structures and substructures of commercial nuclear power plants. The uncertainties are cast in the form of normalized random variables which represent the demonstrated ability to predict modal frequencies, damping and modal response amplitudes for broad generic types of structures (steel frame, reinforced concrete and prestressed concrete). Data based on observed differences between predicted and measured structural performance at the member, substructure, and/or major structural system levels are used to quantify uncertainties and thus form the data base for statistical analysis. Proper normalization enables data from non-nuclear structures, e.g., office buildings, to be included in the data base. Numerous alternative methods are defined within the general framework of this methodology. The report also documents the results of a data survey to identify, classify and evaluate available data for the required data base. A bibliography of 95 references is included. Deficiencies in the currently identified data base are exposed, and remedial measures suggested. Recommendations are made for implementation of the methodology. (author)

  9. Numerical Uncertainty Analysis for Computational Fluid Dynamics using Student T Distribution -- Application of CFD Uncertainty Analysis Compared to Exact Analytical Solution

    Science.gov (United States)

    Groves, Curtis E.; Ilie, Marcel; Shallhorn, Paul A.

    2014-01-01

    Computational Fluid Dynamics (CFD) is the standard numerical tool used by fluid dynamicists to estimate solutions to many problems in academia, government, and industry. CFD is known to have errors and uncertainties, and there is no universally adopted method to estimate such quantities. This paper describes an approach to estimate CFD uncertainties strictly numerically using inputs and the Student-T distribution. The approach is compared to an exact analytical solution of fully developed, laminar flow between infinite, stationary plates. It is shown that treating all CFD input parameters as oscillatory uncertainty terms coupled with the Student-T distribution can encompass the exact solution.
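
    In its simplest form, the Student-T treatment referenced above reduces to a small-sample confidence interval on a CFD output computed from n runs; the velocity samples below are invented for illustration:

    import numpy as np
    from scipy import stats

    # Hypothetical centerline velocities from n CFD runs with perturbed inputs.
    u = np.array([1.02, 0.98, 1.05, 1.01, 0.97, 1.03])   # m/s (illustrative)
    n = u.size

    mean, s = u.mean(), u.std(ddof=1)
    # Two-sided 95% coverage factor from the Student-t distribution.
    k = stats.t.ppf(0.975, df=n - 1)
    half_width = k * s / np.sqrt(n)

    print(f"u = {mean:.3f} +/- {half_width:.3f} m/s (95%, t-based)")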

  10. Aboveground Forest Biomass Estimation with Landsat and LiDAR Data and Uncertainty Analysis of the Estimates

    Directory of Open Access Journals (Sweden)

    Dengsheng Lu

    2012-01-01

    The Landsat Thematic Mapper (TM) image has long been the dominant data source, and recently LiDAR has offered an important new structural data stream, for forest biomass estimation. On the other hand, forest biomass uncertainty analysis research has only recently obtained sufficient attention due to the difficulty in collecting reference data. This paper provides a brief overview of current forest biomass estimation methods using both TM and LiDAR data. A case study is then presented that demonstrates the forest biomass estimation methods and uncertainty analysis. Results indicate that Landsat TM data can provide adequate biomass estimates for secondary succession but are not suitable for mature forest biomass estimates due to data saturation problems. LiDAR can overcome TM's shortcoming, providing better biomass estimation performance, but has not been extensively applied in practice due to data availability constraints. The uncertainty analysis indicates that various sources affect the performance of forest biomass/carbon estimation. That said, the clearly dominant sources of uncertainty are the variation of input sample plot data and the data saturation problem related to optical sensors. A possible solution to increasing the confidence in forest biomass estimates is to integrate the strengths of multisensor data.

  11. Uncertainty and sensitivity analysis of the LOFT L2-5 test: Results of the BEMUSE programme

    International Nuclear Information System (INIS)

    Crecy, A. de; Bazin, P.; Glaeser, H.; Skorek, T.; Joucla, J.; Probst, P.; Fujioka, K.; Chung, B.D.; Oh, D.Y.; Kyncl, M.; Pernica, R.; Macek, J.; Meca, R.; Macian, R.; D'Auria, F.; Petruzzi, A.; Batet, L.; Perez, M.; Reventos, F.

    2008-01-01

    This paper presents the results and the main lessons learnt from phase 3 of BEMUSE, an international benchmark activity sponsored by the Committee on the Safety of Nuclear Installations [CSNI: Committee on the Safety of Nuclear Installations (NEA, OECD), 2007. BEMUSE Phase III Report. NEA/CSNI R(2007) 4, October 2007] of the OECD/NEA. Phase 3 of BEMUSE aimed at performing uncertainty and sensitivity analyses of thermal-hydraulic codes used for the calculation of the LOFT L2-5 experiment, which simulated a Large-Break Loss-of-Coolant-Accident (LB-LOCA). Eleven participants from ten organisations in eight countries took part in this benchmark. In the first section of this paper, the context of BEMUSE is described, as well as the methods used by the participants. In the second section, the results of the benchmark are presented. The majority of the participants find uncertainty bands which envelop the experimental data fairly well; however, the width of these bands varies considerably among participants. A synthesis of the sensitivity analysis results has been made and is expected to provide a useful basis for further uncertainty analysis dealing with LB-LOCA. Finally, recommendations are given both for uncertainty and sensitivity analysis

  12. Analysis of uncertainty and variability in finite element computational models for biomedical engineering: characterization and propagation

    Directory of Open Access Journals (Sweden)

    Nerea Mangado

    2016-11-01

    Full Text Available Computational modeling has become a powerful tool in biomedical engineering thanks to its potential to simulate coupled systems. However, real parameters are usually not accurately known and variability is inherent in living organisms. To cope with this, probabilistic tools, statistical analysis and stochastic approaches have been used. This article aims to review the analysis of uncertainty and variability in the context of finite element modeling in biomedical engineering. Characterization techniques and propagation methods are presented, as well as examples of their applications in biomedical finite element simulations. Uncertainty propagation methods, both non-intrusive and intrusive, are described. Finally, pros and cons of the different approaches and their use in the scientific community are presented. This leads us to identify future directions for research and methodological development of uncertainty modeling in biomedical engineering.

  13. Analysis of Uncertainty and Variability in Finite Element Computational Models for Biomedical Engineering: Characterization and Propagation.

    Science.gov (United States)

    Mangado, Nerea; Piella, Gemma; Noailly, Jérôme; Pons-Prats, Jordi; Ballester, Miguel Ángel González

    2016-01-01

    Computational modeling has become a powerful tool in biomedical engineering thanks to its potential to simulate coupled systems. However, real parameters are usually not accurately known, and variability is inherent in living organisms. To cope with this, probabilistic tools, statistical analysis and stochastic approaches have been used. This article aims to review the analysis of uncertainty and variability in the context of finite element modeling in biomedical engineering. Characterization techniques and propagation methods are presented, as well as examples of their applications in biomedical finite element simulations. Uncertainty propagation methods, both non-intrusive and intrusive, are described. Finally, pros and cons of the different approaches and their use in the scientific community are presented. This leads us to identify future directions for research and methodological development of uncertainty modeling in biomedical engineering.

  14. Flood modelling: Parameterisation and inflow uncertainty

    NARCIS (Netherlands)

    Mukolwe, M.M.; Di Baldassarre, G.; Werner, M.; Solomatine, D.P.

    2014-01-01

    This paper presents an analysis of uncertainty in hydraulic modelling of floods, focusing on the inaccuracy caused by inflow errors and parameter uncertainty. In particular, the study develops a method to propagate the uncertainty induced by, firstly, application of a stage–discharge rating curve

  15. SWEPP PAN assay system uncertainty analysis: Active mode measurements of solidified aqueous sludge waste

    International Nuclear Information System (INIS)

    Blackwood, L.G.; Harker, Y.D.; Meachum, T.R.

    1997-12-01

    The Idaho National Engineering and Environmental Laboratory is being used as a temporary storage facility for transuranic waste generated by the US Nuclear Weapons program at the Rocky Flats Plant (RFP) in Golden, Colorado. Currently, there is a large effort in progress to prepare to ship this waste to the Waste Isolation Pilot Plant (WIPP) in Carlsbad, New Mexico. In order to meet the TRU Waste Characterization Quality Assurance Program Plan nondestructive assay compliance requirements and quality assurance objectives, it is necessary to determine the total uncertainty of the radioassay results produced by the Stored Waste Examination Pilot Plant (SWEPP) Passive Active Neutron (PAN) radioassay system. This paper is one of a series of reports quantifying the results of the uncertainty analysis of the PAN system measurements for specific waste types and measurement modes. In particular this report covers active mode measurements of weapons grade plutonium-contaminated aqueous sludge waste contained in 208 liter drums (item description codes 1, 2, 7, 800, 803, and 807). Results of the uncertainty analysis for PAN active mode measurements of aqueous sludge indicate that a bias correction multiplier of 1.55 should be applied to the PAN aqueous sludge measurements. With the bias correction, the uncertainty bounds on the expected bias are 0 ± 27%. These bounds meet the Quality Assurance Program Plan requirements for radioassay systems
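
    The reported numbers lend themselves to a one-line check. The sketch below applies the bias correction multiplier (1.55) and the 0 ± 27% bounds quoted in this record to a raw assay value; the raw value itself is hypothetical.

    ```python
    # Sketch: bias correction and uncertainty bounds from the record above,
    # applied to a hypothetical raw PAN assay result.
    raw_pu_mass_g = 12.0          # hypothetical raw assay result (g Pu)
    bias_correction = 1.55        # multiplier reported in the record
    rel_uncertainty = 0.27        # reported bounds on the expected bias

    corrected = raw_pu_mass_g * bias_correction
    lower, upper = corrected * (1 - rel_uncertainty), corrected * (1 + rel_uncertainty)
    print(f"corrected mass = {corrected:.2f} g, bounds = [{lower:.2f}, {upper:.2f}] g")
    ```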

  16. An Empirical Analysis of Stakeholders' Influence on Policy Development: the Role of Uncertainty Handling

    Directory of Open Access Journals (Sweden)

    Rianne M. Bijlsma

    2011-03-01

    Full Text Available Stakeholder participation is advocated widely, but there is little structured, empirical research into its influence on policy development. We aim to further the insight into the characteristics of participatory policy development by comparing it to expert-based policy development for the same case. We describe the process of problem framing and analysis, as well as the knowledge base used. We apply an uncertainty perspective to reveal differences between the approaches and speculate about possible explanations. We view policy development as a continuous handling of substantive uncertainty and process uncertainty, and investigate how the methods of handling uncertainty of actors influence the policy development. Our findings suggest that the wider frame that was adopted in the participatory approach was the result of a more active handling of process uncertainty. The stakeholders handled institutional uncertainty by broadening the problem frame, and they handled strategic uncertainty by negotiating commitment and by including all important stakeholder criteria in the frame. In the expert-based approach, we observed a more passive handling of uncertainty, apparently to avoid complexity. The experts handled institutional uncertainty by reducing the scope and by anticipating windows of opportunity in other policy arenas. Strategic uncertainty was handled by assuming stakeholders' acceptance of noncontroversial measures that balanced benefits and sacrifices. Three other observations are of interest to the scientific debate on participatory policy processes. Firstly, the participatory policy was less adaptive than the expert-based policy. The observed low tolerance for process uncertainty of participants made them opt for a rigorous "once and for all" settling of the conflict. Secondly, in the participatory approach, actors preferred procedures of traceable knowledge acquisition over controversial topics to handle substantive uncertainty. This

  17. Global Sensitivity Analysis and Estimation of Model Error, Toward Uncertainty Quantification in Scramjet Computations

    Science.gov (United States)

    Huan, Xun; Safta, Cosmin; Sargsyan, Khachik; Geraci, Gianluca; Eldred, Michael S.; Vane, Zachary P.; Lacaze, Guilhem; Oefelein, Joseph C.; Najm, Habib N.

    2018-03-01

    The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system's stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. These methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.

  18. Global Sensitivity Analysis and Estimation of Model Error, Toward Uncertainty Quantification in Scramjet Computations

    Energy Technology Data Exchange (ETDEWEB)

    Huan, Xun [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Safta, Cosmin [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Sargsyan, Khachik [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Geraci, Gianluca [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Eldred, Michael S. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vane, Zachary P. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Lacaze, Guilhem [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Oefelein, Joseph C. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Najm, Habib N. [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    2018-02-09

    The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system’s stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. Finally, these methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.
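
    The screening step described in both versions of this record rests on variance-based global sensitivity indices. A minimal sketch of a first-order Sobol estimator (the pick-and-freeze Monte Carlo form) on a toy three-parameter function, standing in for the expensive flow simulation:

    ```python
    # Sketch: first-order Sobol indices via the pick-and-freeze Monte Carlo
    # estimator, on a toy function (not the scramjet model).
    import numpy as np

    rng = np.random.default_rng(0)
    d, n = 3, 100_000

    def model(x):
        # Toy response: parameter 0 dominates, parameter 2 is nearly inert.
        return 4.0 * x[:, 0] + x[:, 1] ** 2 + 0.1 * np.sin(x[:, 2])

    a = rng.uniform(size=(n, d))
    b = rng.uniform(size=(n, d))
    ya = model(a)
    var_y = ya.var()

    for i in range(d):
        ab_i = b.copy()
        ab_i[:, i] = a[:, i]   # column i frozen at the A-sample values
        s_i = np.mean(ya * (model(ab_i) - model(b))) / var_y
        print(f"S_{i} ~ {s_i:.3f}")
    ```

    Parameters with small first-order indices are candidates for removal from the stochastic dimension, which is the screening role sensitivity analysis plays in the study above.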

  19. Biomass Thermogravimetric Analysis: Uncertainty Determination Methodology and Sampling Maps Generation

    Science.gov (United States)

    Pazó, Jose A.; Granada, Enrique; Saavedra, Ángeles; Eguía, Pablo; Collazo, Joaquín

    2010-01-01

    The objective of this study was to develop a methodology for the determination of the maximum sampling error and confidence intervals of thermal properties obtained from thermogravimetric analysis (TG), including moisture, volatile matter, fixed carbon and ash content. The sampling procedure of the TG analysis was of particular interest and was conducted with care. The results of the present study were compared to those of a prompt analysis, and a correlation between the mean values and maximum sampling errors of the methods was not observed. In general, low and acceptable levels of uncertainty and error were obtained, demonstrating that the properties evaluated by TG analysis were representative of the overall fuel composition. The accurate determination of the thermal properties of biomass with precise confidence intervals is of particular interest in energetic biomass applications. PMID:20717532

  20. OECD/NEA BENCHMARK FOR UNCERTAINTY ANALYSIS IN MODELING (UAM) FOR LWRS – SUMMARY AND DISCUSSION OF NEUTRONICS CASES (PHASE I)

    Directory of Open Access Journals (Sweden)

    RYAN N. BRATTON

    2014-06-01

    Full Text Available A Nuclear Energy Agency (NEA), Organization for Economic Co-operation and Development (OECD) benchmark for Uncertainty Analysis in Modeling (UAM) is defined in order to facilitate the development and validation of available uncertainty analysis and sensitivity analysis methods for best-estimate Light Water Reactor (LWR) design and safety calculations. The benchmark has been named the OECD/NEA UAM-LWR benchmark, and has been divided into three phases, each of which focuses on a different portion of the uncertainty propagation in LWR multi-physics and multi-scale analysis. Several different reactor cases are modeled at various phases of a reactor calculation. This paper discusses Phase I, known as the “Neutronics Phase”, which is devoted mostly to the propagation of nuclear data (cross-section) uncertainty throughout steady-state stand-alone neutronics core calculations. Three reactor systems (for which design, operation and measured data are available) are rigorously studied in this benchmark: Peach Bottom Unit 2 BWR, Three Mile Island Unit 1 PWR, and VVER-1000 Kozloduy-6/Kalinin-3. Additional measured data are analyzed, such as the KRITZ LEU criticality experiments and the SNEAK-7A and 7B experiments of the Karlsruhe Fast Critical Facility. Analyzed results include the top five neutron-nuclide reactions which contribute the most to the prediction uncertainty in keff, as well as the uncertainty in key parameters of neutronics analysis such as microscopic and macroscopic cross-sections, six-group decay constants, assembly discontinuity factors, and axial and radial core power distributions. Conclusions are drawn regarding where further studies should be done to reduce uncertainties in key nuclide reaction uncertainties (i.e., 238U radiative capture and inelastic scattering (n, n’)), as well as in the average number of neutrons released per fission event of 239Pu.

  1. Statistically based uncertainty analysis for ranking of component importance in the thermal-hydraulic safety analysis of the Advanced Neutron Source Reactor

    International Nuclear Information System (INIS)

    Wilson, G.E.

    1992-01-01

    The Analytic Hierarchy Process (AHP) has been used to help determine the importance of components and phenomena in thermal-hydraulic safety analyses of nuclear reactors. The AHP results are based, in part, on expert opinion. Therefore, it is prudent to evaluate the uncertainty of the AHP ranks of importance. Prior applications have addressed uncertainty with experimental data comparisons and bounding sensitivity calculations. These methods work well when a sufficient experimental data base exists to justify the comparisons. However, in the case of limited or no experimental data, the size of the uncertainty is normally made conservatively large. Accordingly, the author has taken another approach, that of performing a statistically based uncertainty analysis. The new work is based on prior evaluations of the importance of components and phenomena in the thermal-hydraulic safety analysis of the Advanced Neutron Source Reactor (ANSR), a new facility now in the design phase. The uncertainty during large break loss of coolant, and decay heat removal scenarios is estimated by assigning a probability distribution function (pdf) to the potential error in the initial expert estimates of pair-wise importance between the components. Using a Monte Carlo sampling technique, the error pdfs are propagated through the AHP software solutions to determine a pdf of uncertainty in the system wide importance of each component. To enhance the generality of the results, a study of one other problem having a different number of elements is reported, as are the effects of a larger assumed pdf error in the expert ranks. Validation of the Monte Carlo sample size and repeatability are also documented
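
    A minimal sketch of the propagation idea in this record: perturb the entries of an AHP pairwise-comparison matrix with an assumed error distribution, recompute the priority weights from the principal eigenvector each time, and read off uncertainty bands. The 3x3 matrix and the lognormal error width are illustrative assumptions, not values from the report.

    ```python
    # Sketch: Monte Carlo propagation of expert-judgement error through an
    # AHP priority calculation (matrix and error width are hypothetical).
    import numpy as np

    rng = np.random.default_rng(1)
    base = np.array([[1.0, 3.0, 5.0],
                     [1/3, 1.0, 2.0],
                     [1/5, 1/2, 1.0]])   # hypothetical pairwise comparisons

    def priorities(m):
        # Principal right eigenvector, normalised to sum to one.
        vals, vecs = np.linalg.eig(m)
        v = np.real(vecs[:, np.argmax(np.real(vals))])
        return v / v.sum()

    weights = []
    for _ in range(5000):
        m = np.ones_like(base)
        for i in range(3):
            for j in range(i + 1, 3):
                m[i, j] = base[i, j] * rng.lognormal(0.0, 0.3)
                m[j, i] = 1.0 / m[i, j]          # keep reciprocal consistency
        weights.append(priorities(m))

    lo, hi = np.percentile(np.array(weights), [5, 95], axis=0)
    for i in range(3):
        print(f"component {i}: 90% band [{lo[i]:.3f}, {hi[i]:.3f}]")
    ```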

  2. Uncertainty importance analysis using parametric moment ratio functions.

    Science.gov (United States)

    Wei, Pengfei; Lu, Zhenzhou; Song, Jingwen

    2014-02-01

    This article presents a new importance analysis framework, called the parametric moment ratio function, for measuring the reduction of model output uncertainty when the distribution parameters of inputs are changed; the emphasis is put on the mean and variance ratio functions with respect to the variances of model inputs. The proposed concepts efficiently guide the analyst to achieve a targeted reduction in the model output mean and variance by operating on the variances of the model inputs. Unbiased and progressive unbiased Monte Carlo estimators are also derived for the parametric mean and variance ratio functions, respectively. Only a single set of samples is needed for implementing the proposed importance analysis with the derived estimators, thus the computational cost is independent of the input dimensionality. An analytical test example with highly nonlinear behavior is introduced to illustrate the engineering significance of the proposed importance analysis technique and to verify the efficiency and convergence of the derived Monte Carlo estimators. Finally, the moment ratio function is applied to a planar 10-bar structure to achieve a targeted 50% reduction of the model output variance. © 2013 Society for Risk Analysis.

  3. Uncertainty and sensitivity analysis of parameters affecting water hammer pressure wave behaviour

    International Nuclear Information System (INIS)

    Kaliatka, A.; Uspuras, E.; Vaisnoras, M.

    2006-01-01

    Pressure surges occurring in pipeline systems may be caused by fast control interference, start-up and shut-down processes, and operation failures. They lead to water hammer upstream of the closing valve and cavitational hammer downstream of the valve, which may cause considerable damage to the pipeline and the support structures. The appearance of water hammer in thermal-hydraulic systems has been widely studied in many organizations, employing different state-of-the-art thermal-hydraulic codes. For the analysis, the water hammer test performed at the Fraunhofer Institute for Environmental, Safety and Energy Technology (UMSICHT) at Oberhausen was considered. This paper presents the comparison of UMSICHT test facility experiment calculations, employing the best estimate system code RELAP5/Mod3.3, to measured water hammer values after fast closure of a valve. The analysis revealed that the calculated first pressure peak, which has the highest value, matches the measured value very well. The performed analysis (as with any other analysis) always contains uncertainty in the results of each individual calculation, owing to the initial conditions of the installation, errors of the measuring systems, errors caused by the nodalization of objects in modelling, code correlations, etc. In this connection, results of the uncertainty and sensitivity analysis of the initial conditions and code-selected models are shown in the paper. (orig.)

  4. Report on the uncertainty methods study

    International Nuclear Information System (INIS)

    1998-06-01

    The Uncertainty Methods Study (UMS) Group, following a mandate from CSNI, has compared five methods for calculating the uncertainty in the predictions of advanced 'best estimate' thermal-hydraulic codes: the Pisa method (based on extrapolation from integral experiments) and four methods identifying and combining input uncertainties. Three of these, the GRS, IPSN and ENUSA methods, use subjective probability distributions, and one, the AEAT method, performs a bounding analysis. Each method has been used to calculate the uncertainty in specified parameters for the LSTF SB-CL-18 5% cold leg small break LOCA experiment in the ROSA-IV Large Scale Test Facility (LSTF). The uncertainty analysis was conducted essentially blind and the participants did not use experimental measurements from the test as input apart from initial and boundary conditions. Participants calculated uncertainty ranges for experimental parameters including pressurizer pressure, primary circuit inventory and clad temperature (at a specified position) as functions of time

  5. Mathematical Analysis of Uncertainty

    Directory of Open Access Journals (Sweden)

    Angel GARRIDO

    2016-01-01

    Full Text Available Classical Logic showed its insufficiencies for solving AI problems early on. The introduction of Fuzzy Logic aims at this problem. There has been research in the conventional Rough direction alone or in the Fuzzy direction alone, and, more recently, attempts to combine both into Fuzzy Rough Sets or Rough Fuzzy Sets. We analyse some new and powerful tools in the study of uncertainty, such as Probabilistic Graphical Models, Chain Graphs, Bayesian Networks, and Markov Networks, integrating our knowledge of graphs and probability.

  6. Status report on activities of ASTM E10.05.01 Task Group on Uncertainty Analysis

    International Nuclear Information System (INIS)

    Kam, F.B.K.; Stallman, F.W.

    1979-01-01

    Uncertainties in the field of reactor dosimetry are analyzed. A survey on uncertainty analysis as it is practiced at leading laboratories involved in reactor dosimetry is described. A questionnaire was prepared and mailed to about 45 installations and researchers. Nine replies were received, several of which were prepared by more than one author. Three of the nine came from installations outside the US. Results and the questionnaire are presented

  7. Approaches to highly parameterized inversion: A guide to using PEST for model-parameter and predictive-uncertainty analysis

    Science.gov (United States)

    Doherty, John E.; Hunt, Randall J.; Tonkin, Matthew J.

    2010-01-01

    Analysis of the uncertainty associated with parameters used by a numerical model, and with predictions that depend on those parameters, is fundamental to the use of modeling in support of decisionmaking. Unfortunately, predictive uncertainty analysis with regard to models can be very computationally demanding, due in part to complex constraints on parameters that arise from expert knowledge of system properties on the one hand (knowledge constraints) and from the necessity for the model parameters to assume values that allow the model to reproduce historical system behavior on the other hand (calibration constraints). Enforcement of knowledge and calibration constraints on parameters used by a model does not eliminate the uncertainty in those parameters. In fact, in many cases, enforcement of calibration constraints simply reduces the uncertainties associated with a number of broad-scale combinations of model parameters that collectively describe spatially averaged system properties. The uncertainties associated with other combinations of parameters, especially those that pertain to small-scale parameter heterogeneity, may not be reduced through the calibration process. To the extent that a prediction depends on system-property detail, its postcalibration variability may be reduced very little, if at all, by applying calibration constraints; knowledge constraints remain the only limits on the variability of predictions that depend on such detail. Regrettably, in many common modeling applications, these constraints are weak. Though the PEST software suite was initially developed as a tool for model calibration, recent developments have focused on the evaluation of model-parameter and predictive uncertainty. As a complement to functionality that it provides for highly parameterized inversion (calibration) by means of formal mathematical regularization techniques, the PEST suite provides utilities for linear and nonlinear error-variance and uncertainty analysis in

  8. Two-dimensional cross-section sensitivity and uncertainty analysis of the LBM [Lithium Blanket Module] experiments at LOTUS

    International Nuclear Information System (INIS)

    Davidson, J.W.; Dudziak, D.J.; Pelloni, S.; Stepanek, J.

    1988-01-01

    In a recent common Los Alamos/PSI effort, a sensitivity and nuclear data uncertainty path for the modular code system AARE (Advanced Analysis for Reactor Engineering) was developed. This path includes the cross-section code TRAMIX, the one-dimensional finite difference S_N-transport code ONEDANT, the two-dimensional finite element S_N-transport code TRISM, and the one- and two-dimensional sensitivity and nuclear data uncertainty code SENSIBL. Within the framework of the present work, a complete set of forward and adjoint two-dimensional TRISM calculations were performed, both for the bare, as well as for the Pb- and Be-preceded, LBM using MATXS8 libraries. Then a two-dimensional sensitivity and uncertainty analysis for all cases was performed. The goal of this analysis was the determination of the uncertainties of the calculated tritium production per source neutron from lithium along the central Li2O rod in the LBM. Considered were the contributions from 1H, 6Li, 7Li, 9Be, natC, 14N, 16O, 23Na, 27Al, natSi, natCr, natFe, natNi, and natPb. 22 refs., 1 fig., 3 tabs

  9. PAPIRUS, a parallel computing framework for sensitivity analysis, uncertainty propagation, and estimation of parameter distribution

    International Nuclear Information System (INIS)

    Heo, Jaeseok; Kim, Kyung Doo

    2015-01-01

    Highlights: • We developed an interface between an engineering simulation code and statistical analysis software. • Multiple packages of the sensitivity analysis, uncertainty quantification, and parameter estimation algorithms are implemented in the framework. • Parallel computing algorithms are also implemented in the framework to solve multiple computational problems simultaneously. - Abstract: This paper introduces a statistical data analysis toolkit, PAPIRUS, designed to perform the model calibration, uncertainty propagation, Chi-square linearity test, and sensitivity analysis for both linear and nonlinear problems. The PAPIRUS was developed by implementing multiple packages of methodologies, and building an interface between an engineering simulation code and the statistical analysis algorithms. A parallel computing framework is implemented in the PAPIRUS with multiple computing resources and proper communications between the server and the clients of each processor. It was shown that even though a large amount of data is considered for the engineering calculation, the distributions of the model parameters and the calculation results can be quantified accurately with significant reductions in computational effort. A general description about the PAPIRUS with a graphical user interface is presented in Section 2. Sections 2.1–2.5 present the methodologies of data assimilation, uncertainty propagation, Chi-square linearity test, and sensitivity analysis implemented in the toolkit with some results obtained by each module of the software. Parallel computing algorithms adopted in the framework to solve multiple computational problems simultaneously are also summarized in the paper

  10. Data uncertainties in material flow analysis: Municipal solid waste management system in Maputo City, Mozambique.

    Science.gov (United States)

    Dos Muchangos, Leticia Sarmento; Tokai, Akihiro; Hanashima, Atsuko

    2017-01-01

    Material flow analysis can effectively trace and quantify the flows and stocks of materials such as solid wastes in urban environments. However, the integrity of material flow analysis results is compromised by data uncertainties, an occurrence that is particularly acute in low- and middle-income study contexts. This article investigates the uncertainties in the input data and their effects in a material flow analysis study of municipal solid waste management in Maputo City, the capital of Mozambique. The analysis is based on data collected in 2007 and 2014. Initially, the uncertainties and their ranges were identified by the data classification model of Hedbrant and Sörme, followed by the application of sensitivity analysis. The average lower and upper bounds were 29% and 71%, respectively, in 2007, increasing to 41% and 96%, respectively, in 2014. This indicates higher data quality in 2007 than in 2014. Results also show that not only are data partially missing from established flows such as waste generation to final disposal, but they are also limited and inconsistent in emerging flows and processes such as waste generation to material recovery (hence the wider variation in the 2014 parameters). The sensitivity analysis further clarified the most influential parameter, the degree of influence of each parameter on the waste flows, and the interrelations among the parameters. The findings highlight the need for an integrated municipal solid waste management approach to avoid transferring or worsening the negative impacts among the parameters and flows.

  11. PAPIRUS, a parallel computing framework for sensitivity analysis, uncertainty propagation, and estimation of parameter distribution

    Energy Technology Data Exchange (ETDEWEB)

    Heo, Jaeseok, E-mail: jheo@kaeri.re.kr; Kim, Kyung Doo, E-mail: kdkim@kaeri.re.kr

    2015-10-15

    Highlights: • We developed an interface between an engineering simulation code and statistical analysis software. • Multiple packages of the sensitivity analysis, uncertainty quantification, and parameter estimation algorithms are implemented in the framework. • Parallel computing algorithms are also implemented in the framework to solve multiple computational problems simultaneously. - Abstract: This paper introduces a statistical data analysis toolkit, PAPIRUS, designed to perform the model calibration, uncertainty propagation, Chi-square linearity test, and sensitivity analysis for both linear and nonlinear problems. The PAPIRUS was developed by implementing multiple packages of methodologies, and building an interface between an engineering simulation code and the statistical analysis algorithms. A parallel computing framework is implemented in the PAPIRUS with multiple computing resources and proper communications between the server and the clients of each processor. It was shown that even though a large amount of data is considered for the engineering calculation, the distributions of the model parameters and the calculation results can be quantified accurately with significant reductions in computational effort. A general description about the PAPIRUS with a graphical user interface is presented in Section 2. Sections 2.1–2.5 present the methodologies of data assimilation, uncertainty propagation, Chi-square linearity test, and sensitivity analysis implemented in the toolkit with some results obtained by each module of the software. Parallel computing algorithms adopted in the framework to solve multiple computational problems simultaneously are also summarized in the paper.

  12. Quantifying uncertainty in LCA-modelling of waste management systems

    DEFF Research Database (Denmark)

    Clavreul, Julie; Guyonnet, D.; Christensen, Thomas Højlund

    2012-01-01

    Uncertainty analysis in LCA studies has seen major progress over the last years. In the context of waste management, various methods have been implemented, but a systematic method for uncertainty analysis of waste-LCA studies is lacking. The objective of this paper is (1) to present...... the sources of uncertainty specifically inherent to waste-LCA studies, (2) to select and apply several methods for uncertainty analysis and (3) to develop a general framework for quantitative uncertainty assessment of LCA of waste management systems. The suggested method is a sequence of four steps combining...

  13. Results from the Application of Uncertainty Methods in the CSNI Uncertainty Methods Study (UMS)

    International Nuclear Information System (INIS)

    Glaeser, H.

    2008-01-01

    Within licensing procedures there is the incentive to replace the conservative requirements for code application by a 'best estimate' concept supplemented by an uncertainty analysis, to account for predictive uncertainties of code results. Methods have been developed to quantify these uncertainties. The Uncertainty Methods Study (UMS) Group, following a mandate from CSNI, has compared five methods for calculating the uncertainty in the predictions of advanced 'best estimate' thermal-hydraulic codes. Most of the methods identify and combine input uncertainties. The major differences between the predictions of the methods came from the choice of uncertain parameters and the quantification of the input uncertainties, i.e. the wideness of the uncertainty ranges. Therefore, suitable experimental and analytical information has to be selected to specify these uncertainty ranges or distributions. After the closure of the Uncertainty Methods Study (UMS) and after the report was issued, comparison calculations of experiment LSTF-SB-CL-18 were performed by the University of Pisa using different versions of the RELAP5 code. It turned out that the version used by two of the participants calculated a 170 K higher peak clad temperature compared with other versions using the same input deck. This may contribute to the differences in the upper limits of the uncertainty ranges.

  14. Uncertainty in river discharge observations: a quantitative analysis

    Directory of Open Access Journals (Sweden)

    G. Di Baldassarre

    2009-06-01

    Full Text Available This study proposes a framework for analysing and quantifying the uncertainty of river flow data. Such uncertainty is often considered to be negligible with respect to other approximations affecting hydrological studies. Actually, given that river discharge data are usually obtained by means of the so-called rating curve method, a number of different sources of error affect the derived observations. These include: errors in the measurements of river stage and discharge utilised to parameterise the rating curve, interpolation and extrapolation error of the rating curve, presence of unsteady flow conditions, and seasonal variations of the state of the vegetation (i.e. roughness). This study aims at analysing these sources of uncertainty using an original methodology. The novelty of the proposed framework lies in the estimation of rating curve uncertainty, which is based on hydraulic simulations. The latter are carried out on a reach of the Po River (Italy) by means of a one-dimensional (1-D) hydraulic model code (HEC-RAS). The results of the study show that errors in river flow data are indeed far from negligible.
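
    One of the error sources listed in this record, rating-curve parameterisation, can be illustrated with a short fit. The sketch below fits a power-law rating curve Q = a(h − h0)^b to synthetic stage–discharge gaugings and reports the residual scatter; all data and parameter values are invented, and this is not the paper's hydraulic-simulation method.

    ```python
    # Sketch: fitting a power-law rating curve to synthetic gaugings and
    # quantifying the residual scatter (illustrative only).
    import numpy as np
    from scipy.optimize import curve_fit

    rng = np.random.default_rng(2)
    h = np.linspace(0.5, 4.0, 25)                        # stage (m)
    q_true = 12.0 * (h - 0.2) ** 1.6                     # "true" discharge (m3/s)
    q_obs = q_true * (1 + rng.normal(0, 0.05, h.size))   # 5% measurement noise

    def rating(h, a, h0, b):
        return a * (h - h0) ** b

    popt, _ = curve_fit(rating, h, q_obs, p0=[10.0, 0.1, 1.5],
                        bounds=([0.1, 0.0, 1.0], [100.0, 0.45, 3.0]))
    rel_resid = (q_obs - rating(h, *popt)) / q_obs
    print("fitted a, h0, b:", np.round(popt, 3))
    print(f"residual scatter: {rel_resid.std():.1%} of discharge")
    ```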

  15. Comparison of the effect of hazard and response/fragility uncertainties on core melt probability uncertainty

    International Nuclear Information System (INIS)

    Mensing, R.W.

    1985-01-01

    This report proposes a method for comparing the effects of the uncertainty in probabilistic risk analysis (PRA) input parameters on the uncertainty in the predicted risks. The proposed method is applied to compare the effect of uncertainties in the descriptions of (1) the seismic hazard at a nuclear power plant site and (2) random variations in plant subsystem responses and component fragility on the uncertainty in the predicted probability of core melt. The PRA used is that developed by the Seismic Safety Margins Research Program

  16. Transfer of Nuclear Data Uncertainties to the Uncertainties of Fuel Characteristic by Interval Calculation

    International Nuclear Information System (INIS)

    Ukraintsev, V.F.; Kolesov, V.V.

    2006-01-01

    Usually, perturbation theory and sensitivity analysis techniques are used for the evaluation of uncertainties in reactor functionals. Of course, the linearized approach of perturbation theory is used. This approach has several disadvantages, and that is why a new method, based on the application of a special interval calculation technique, has been created. Basically, the problem of the dependence of fuel cycle characteristic uncertainties on uncertainties in the source group neutron cross-sections and decay parameters can be solved (to some extent) by sensitivity analysis as well. However, such a procedure is rather labour-consuming and does not give guaranteed estimates for the derived parameters since, strictly speaking, it works only for small deviations, being initially based on linearization of the mathematical problem. The technique of fuel cycle characteristic uncertainty estimation is based on so-called interval analysis (or interval calculations). The basic advantage of this technique is the opportunity to derive correct estimates. This technique consists of introducing a new special data type, interval data, into codes and defining all arithmetic operations for it. A technique for solving the system of linear equations (isotope kinetics) with interval arithmetic has been realized for the fuel burn-up problem. Thus there is an opportunity to compute the impact of neutron flux and of fission and capture cross-section uncertainties on nuclide concentration uncertainties and on fuel cycle characteristics (such as K_eff, breeding ratio, decay heat power, etc.). By this time, the code for interval burn-up calculations has been developed and verified
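
    The essence of the interval technique described here is that every arithmetic operation returns guaranteed enclosures rather than linearised error estimates. A minimal sketch, with a toy one-isotope decay step in place of the full isotope-kinetics system and invented parameter ranges:

    ```python
    # Sketch: minimal interval arithmetic; a single decay step
    # N(t) = N0 * exp(-lambda * t) with uncertain lambda and N0.
    import math

    class Interval:
        def __init__(self, lo, hi):
            self.lo, self.hi = lo, hi
        def __mul__(self, other):
            p = [self.lo * other.lo, self.lo * other.hi,
                 self.hi * other.lo, self.hi * other.hi]
            return Interval(min(p), max(p))
        def exp_neg(self, t):
            # exp(-x*t) is monotone decreasing in x, so the bounds just swap.
            return Interval(math.exp(-self.hi * t), math.exp(-self.lo * t))
        def __repr__(self):
            return f"[{self.lo:.4g}, {self.hi:.4g}]"

    lam = Interval(0.9e-4, 1.1e-4)    # uncertain decay constant (1/s)
    n0 = Interval(0.99e20, 1.01e20)   # uncertain initial concentration
    print("N(t) in", n0 * lam.exp_neg(3600.0))   # guaranteed enclosure after 1 h
    ```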

  17. Uncertainty and sensitivity analysis of early exposure results with the MACCS Reactor Accident Consequence Model

    International Nuclear Information System (INIS)

    Helton, J.C.; Johnson, J.D.; McKay, M.D.; Shiver, A.W.; Sprung, J.L.

    1995-01-01

    Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis and stepwise regression analysis are used in an investigation with the MACCS model of the early health effects associated with a severe accident at a nuclear power station. The primary purpose of this study is to provide guidance on the variables to be considered in future review work to reduce the uncertainty in the important variables used in the calculation of reactor accident consequences. The effects of 34 imprecisely known input variables on the following reactor accident consequences are studied: number of early fatalities, number of cases of prodromal vomiting, population dose within 10 mi of the reactor, population dose within 1000 mi of the reactor, individual early fatality probability within 1 mi of the reactor, and maximum early fatality distance. When the predicted variables are considered collectively, the following input variables were found to be the dominant contributors to uncertainty: scaling factor for horizontal dispersion, dry deposition velocity, inhalation protection factor for nonevacuees, groundshine shielding factor for nonevacuees, early fatality hazard function alpha value for bone marrow exposure, and scaling factor for vertical dispersion
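
    The sampling-plus-regression machinery named in this record can be sketched in a few lines: draw a Latin hypercube design, run a (here, toy) consequence model, and rank inputs by standardised regression coefficients. The three generic inputs below merely stand in for the 34 MACCS variables.

    ```python
    # Sketch: Latin hypercube sampling and standardised regression
    # coefficients on a toy consequence model (not MACCS).
    import numpy as np
    from scipy.stats import qmc

    rng = np.random.default_rng(3)
    sampler = qmc.LatinHypercube(d=3, seed=3)
    x = qmc.scale(sampler.random(n=500), [0.5, 0.1, 1.0], [1.5, 0.9, 3.0])

    # Toy model: input 0 matters most, input 1 barely at all.
    y = 5.0 * x[:, 0] + 0.2 * x[:, 1] + 2.0 * x[:, 2] + rng.normal(0, 0.1, 500)

    xs = (x - x.mean(axis=0)) / x.std(axis=0)       # standardise inputs
    ys = (y - y.mean()) / y.std()
    src, *_ = np.linalg.lstsq(xs, ys, rcond=None)   # standardised regression coeffs
    print("SRCs:", np.round(src, 3))                # larger |SRC| = more influential
    ```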

  18. The IAEA Coordinated Research Program on HTGR Reactor Physics, Thermal-hydraulics and Depletion Uncertainty Analysis: Description of the Benchmark Test Cases and Phases

    Energy Technology Data Exchange (ETDEWEB)

    Frederik Reitsma; Gerhard Strydom; Bismark Tyobeka; Kostadin Ivanov

    2012-10-01

    The continued development of High Temperature Gas Cooled Reactors (HTGRs) requires verification of design and safety features with reliable high-fidelity physics models and robust, efficient, and accurate codes. The uncertainties in the HTR analysis tools are today typically assessed with sensitivity analysis, and then a few important input uncertainties (typically based on a PIRT process) are varied in the analysis to find a spread in the parameter of importance. However, one wishes to apply a more fundamental approach to determine the predictive capability and accuracies of coupled neutronics/thermal-hydraulics and depletion simulations used for reactor design and safety assessment. Today there is a broader acceptance of the use of uncertainty analysis even in safety studies, and it has been accepted by regulators in some cases to replace the traditional conservative analysis. Finally, there is also a renewed focus on supplying reliable covariance data (nuclear data uncertainties) that can then be used in uncertainty methods. Uncertainty and sensitivity studies are therefore becoming an essential component of any significant effort in data and simulation improvement. In order to address uncertainty in analysis and methods in the HTGR community, the IAEA launched a Coordinated Research Project (CRP) on HTGR Uncertainty Analysis in Modelling early in 2012. The project is built on the experience of the OECD/NEA Light Water Reactor (LWR) Uncertainty Analysis in Best-Estimate Modelling (UAM) benchmark activity, but focuses specifically on the peculiarities of HTGR designs and their simulation requirements. Two benchmark problems were defined, with the prismatic type design represented by the MHTGR-350 design from General Atomics (GA), while a 250 MW modular pebble bed design, similar to the INET (China) and indirect-cycle PBMR (South Africa) designs, is also included. In the paper more detail on the benchmark cases, the different specific phases and tasks and the latest

  19. Analysis of uncertainties caused by the atmospheric dispersion model in accident consequence assessments with UFOMOD

    International Nuclear Information System (INIS)

    Fischer, F.; Ehrhardt, J.

    1988-06-01

    Various techniques available for uncertainty analysis of large computer models are applied, described and selected as most appropriate for analyzing the uncertainty in the predictions of accident consequence assessments. The investigation refers to the atmospheric dispersion and deposition submodel (straight-line Gaussian plume model) of UFOMOD, whose most important input variables and parameters are linked with probability distributions derived from expert judgement. Uncertainty bands show how much variability exists; sensitivity measures determine what causes this variability in consequences. Results are presented as confidence bounds of complementary cumulative frequency distributions (CCFDs) of activity concentrations, organ doses and health effects, partially as a function of distance from the site. In addition, the ranked influence of the uncertain parameters on the different consequence types is shown. For the estimation of confidence bounds it was sufficient to choose a model parameter sample size of n (n=59) equal to 1.5 times the number of uncertain model parameters. Different samples or an increase of sample size did not change the 5%-95% confidence bands. To get statistically stable results of the sensitivity analysis, larger sample sizes are needed (n=100, 200). Random or Latin-hypercube sampling schemes as tools for uncertainty and sensitivity analyses led to comparable results. (orig.)

  20. Concepts involved in a proposed application of uncertainty analysis to the performance assessment of high-level nuclear waste isolation systems

    International Nuclear Information System (INIS)

    Maerker, R.E.

    1986-03-01

    This report introduces the concepts of a previously developed methodology which could readily be extended to the field of performance assessment for high-level nuclear waste isolation systems. The methodology incorporates sensitivities previously obtained with the GRESS code into an uncertainty analysis, from which propagated uncertainties in calculated responses may be derived from basic data uncertainties. Following a definition of terms, examples are provided illustrating commonly used conventions for describing the concepts of covariance and sensitivity. Examples of solutions to problems previously encountered in related fields involving uncertainty analysis and use of a generalized linear least-squares adjustment procedure are also presented. 5 refs., 14 tabs

  1. Sparse grid-based polynomial chaos expansion for aerodynamics of an airfoil with uncertainties

    Directory of Open Access Journals (Sweden)

    Xiaojing WU

    2018-05-01

    Full Text Available Uncertainties can generate fluctuations in aerodynamic characteristics. Uncertainty Quantification (UQ) is applied to compute their impact on the aerodynamic characteristics. In addition, the contribution of each uncertainty to the aerodynamic characteristics should be computed by uncertainty sensitivity analysis. Non-Intrusive Polynomial Chaos (NIPC) has been successfully applied to uncertainty quantification and uncertainty sensitivity analysis. However, the non-intrusive polynomial chaos method becomes inefficient as the number of random variables adopted to describe uncertainties increases. This deficiency becomes significant in stochastic aerodynamic analysis considering geometric uncertainty, because the description of geometric uncertainty generally needs many parameters. To remedy this deficiency, a Sparse Grid-based Polynomial Chaos (SGPC) expansion is used to perform uncertainty quantification and sensitivity analysis for stochastic aerodynamic analysis considering geometric and operational uncertainties. It is shown that the method is more efficient than non-intrusive polynomial chaos and the Monte Carlo Simulation (MCS) method for stochastic aerodynamic analysis. By uncertainty quantification, it can be learnt that the flow characteristics of shock wave and boundary layer separation are sensitive to the geometric uncertainty in the transonic region. The uncertainty sensitivity analysis reveals the individual and coupled effects among the uncertainty parameters. Keywords: Non-intrusive polynomial chaos, Sparse grid, Stochastic aerodynamic analysis, Uncertainty sensitivity analysis, Uncertainty quantification
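
    A one-dimensional non-intrusive polynomial chaos expansion shows the mechanics that the sparse-grid variant accelerates in many dimensions. The sketch fits probabilists' Hermite polynomials to a stand-in "solver" by least squares and recovers the output mean and variance from the coefficients; the model function is invented.

    ```python
    # Sketch: 1-D non-intrusive polynomial chaos with a Hermite basis for a
    # Gaussian input (toy model, not an aerodynamic solver).
    import math
    import numpy as np
    from numpy.polynomial.hermite_e import hermevander

    rng = np.random.default_rng(4)

    def model(xi):
        # Hypothetical response to a standard-normal uncertain parameter.
        return 1.0 + 0.3 * xi + 0.05 * xi ** 2

    xi = rng.standard_normal(200)            # sample the germ
    order = 3
    psi = hermevander(xi, order)             # He_0 .. He_3 design matrix
    coef, *_ = np.linalg.lstsq(psi, model(xi), rcond=None)

    # Orthogonality of He_k gives: mean = c_0, variance = sum_k c_k^2 * k!.
    var = sum(coef[k] ** 2 * math.factorial(k) for k in range(1, order + 1))
    print(f"PCE mean = {coef[0]:.4f}, variance = {var:.5f}")
    ```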

  2. Uncertainty Estimation Cheat Sheet for Probabilistic Risk Assessment

    Science.gov (United States)

    Britton, Paul T.; Al Hassan, Mohammad; Ring, Robert W.

    2017-01-01

    "Uncertainty analysis itself is uncertain, therefore, you cannot evaluate it exactly," Source Uncertain Quantitative results for aerospace engineering problems are influenced by many sources of uncertainty. Uncertainty analysis aims to make a technical contribution to decision-making through the quantification of uncertainties in the relevant variables as well as through the propagation of these uncertainties up to the result. Uncertainty can be thought of as a measure of the 'goodness' of a result and is typically represented as statistical dispersion. This paper will explain common measures of centrality and dispersion; and-with examples-will provide guidelines for how they may be estimated to ensure effective technical contributions to decision-making.

  3. Towards an Industrial Application of Statistical Uncertainty Analysis Methods to Multi-physical Modelling and Safety Analyses

    International Nuclear Information System (INIS)

    Zhang, Jinzhao; Segurado, Jacobo; Schneidesch, Christophe

    2013-01-01

    Since the 1980s, Tractebel Engineering (TE) has been developing and applying a multi-physics modelling and safety analysis capability, based on a code package consisting of the best-estimate 3D neutronics (PANTHER), system thermal-hydraulics (RELAP5), core sub-channel thermal-hydraulics (COBRA-3C), and fuel thermal-mechanics (FRAPCON/FRAPTRAN) codes. A series of methodologies have been developed to perform and to license reactor safety analysis and core reload design, based on the deterministic bounding approach. Following recent trends in research and development as well as in industrial applications, TE has been working since 2010 towards the application of statistical sensitivity and uncertainty analysis methods to multi-physics modelling and licensing safety analyses. In this paper, the TE multi-physics modelling and safety analysis capability is first described, followed by the proposed TE best estimate plus statistical uncertainty analysis method (BESUAM). The chosen statistical sensitivity and uncertainty analysis methods (non-parametric order statistic method or bootstrap) and tool (DAKOTA) are then presented, followed by some preliminary results of their application to FRAPCON/FRAPTRAN simulation of the OECD RIA fuel rod codes benchmark and RELAP5/MOD3.3 simulation of THTF tests. (authors)

  4. Uncertainty in projected point precipitation extremes for hydrological impact analysis of climate change

    Science.gov (United States)

    Van Uytven, Els; Willems, Patrick

    2017-04-01

    Current trends in hydro-meteorological variables indicate the potential impact of climate change on hydrological extremes. They therefore give increased importance to climate adaptation strategies in water management. The impact of climate change on hydro-meteorological and hydrological extremes is, however, highly uncertain. This is due to uncertainties introduced by the climate models, the internal variability inherent to the climate system, the greenhouse gas scenarios and the statistical downscaling methods. In view of the need to define sustainable climate adaptation strategies, there is a need to assess these uncertainties. This is commonly done by means of ensemble approaches. Because more and more climate models and statistical downscaling methods become available, there is a need to facilitate the climate impact and uncertainty analysis. A Climate Perturbation Tool has been developed for that purpose, which combines a set of statistical downscaling methods including weather typing, weather generator, transfer function and advanced perturbation-based approaches. By use of an interactive interface, climate impact modellers can apply these statistical downscaling methods in a semi-automatic way to an ensemble of climate model runs. The tool is applicable to any region, but has so far been demonstrated on cases in Belgium, Suriname, Vietnam and Bangladesh. Time series representing future local-scale precipitation, temperature and potential evapotranspiration (PET) conditions were obtained, starting from time series of historical observations. Uncertainties on the future meteorological conditions are represented in two different ways: through an ensemble of time series, and a reduced set of synthetic scenarios. Both aim to span the full uncertainty range as assessed from the ensemble of climate model runs and downscaling methods. For Belgium, for instance, use was made of 100-year time series of 10-minute precipitation observations and daily

  5. Comparing the treatment of uncertainty in Bayesian networks and fuzzy expert systems used for a human reliability analysis application

    International Nuclear Information System (INIS)

    Baraldi, Piero; Podofillini, Luca; Mkrtchyan, Lusine; Zio, Enrico; Dang, Vinh N.

    2015-01-01

    The use of expert systems can be helpful to improve the transparency and repeatability of assessments in areas of risk analysis with limited data available. In this field, human reliability analysis (HRA) is no exception, and, in particular, dependence analysis is an HRA task strongly based on analyst judgement. The analysis of dependence among Human Failure Events refers to the assessment of the effect of an earlier human failure on the probability of the subsequent ones. This paper analyses and compares two expert systems, based on Bayesian Belief Networks and Fuzzy Logic (a Fuzzy Expert System, FES), respectively. The comparison shows that a BBN approach should be preferred in all the cases characterized by quantifiable uncertainty in the input (i.e. when probability distributions can be assigned to describe the input parameters uncertainty), since it provides a satisfactory representation of the uncertainty and its output is directly interpretable for use within PSA. On the other hand, in cases characterized by very limited knowledge, an analyst may feel constrained by the probabilistic framework, which requires assigning probability distributions for describing uncertainty. In these cases, the FES seems to lead to a more transparent representation of the input and output uncertainty. - Highlights: • We analyse treatment of uncertainty in two expert systems. • We compare a Bayesian Belief Network (BBN) and a Fuzzy Expert System (FES). • We focus on the input assessment, inference engines and output assessment. • We focus on an application problem of interest for human reliability analysis. • We emphasize the application rather than math to reach non-BBN or FES specialists

  6. Overview and application of the Model Optimization, Uncertainty, and SEnsitivity Analysis (MOUSE) toolbox

    Science.gov (United States)

    For several decades, optimization and sensitivity/uncertainty analysis of environmental models has been the subject of extensive research. Although much progress has been made and sophisticated methods developed, the growing complexity of environmental models to represent real-world systems makes it...

  7. An approach to multi-attribute utility analysis under parametric uncertainty

    International Nuclear Information System (INIS)

    Kelly, M.; Thorne, M.C.

    2001-01-01

    The techniques of cost-benefit analysis and multi-attribute analysis provide a useful basis for informing decisions in situations where a number of potentially conflicting opinions or interests need to be considered, and where there are a number of possible decisions that could be adopted. When the input data to such decision-making processes are uniquely specified, cost-benefit analysis and multi-attribute utility analysis provide unambiguous guidance on the preferred decision option. However, when the data are not uniquely specified, application and interpretation of these techniques is more complex. Herein, an approach to multi-attribute utility analysis (and hence, as a special case, cost-benefit analysis) when input data are subject to parametric uncertainty is presented. The approach is based on the use of a Monte Carlo technique, and has recently been applied to options for the remediation of former uranium mining liabilities in a number of Central and Eastern European States
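
    The Monte Carlo treatment of parametric uncertainty described here can be sketched with an additive utility model: sample the uncertain attribute scores, score each option, and report the probability that one option is preferred. Both options, their ranges and the weights below are hypothetical.

    ```python
    # Sketch: additive multi-attribute utility under parametric uncertainty
    # (two hypothetical options, two attributes; uniform score uncertainty).
    import numpy as np

    rng = np.random.default_rng(6)
    n = 20_000
    weights = np.array([0.6, 0.4])   # assumed attribute weights

    # Uniform uncertainty ranges for the normalised attribute utilities.
    opt_a = rng.uniform([0.5, 0.6], [0.9, 0.8], size=(n, 2))
    opt_b = rng.uniform([0.6, 0.4], [0.8, 0.9], size=(n, 2))

    u_a, u_b = opt_a @ weights, opt_b @ weights
    print(f"P(option A preferred) = {np.mean(u_a > u_b):.2f}")
    ```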

  8. Applying Fuzzy and Probabilistic Uncertainty Concepts to the Material Flow Analysis of Palladium in Austria

    DEFF Research Database (Denmark)

    Laner, David; Rechberger, Helmut; Astrup, Thomas Fruergaard

    2015-01-01

    Material flow analysis (MFA) is a widely applied tool to investigate resource and recycling systems of metals and minerals. Owing to data limitations and restricted system understanding, MFA results are inherently uncertain. To demonstrate the systematic implementation of uncertainty analysis in ...

  9. Uncertainty analysis for results of thermal hydraulic codes of best-estimate-type; Analisis de incertidumbre para resultados de codigos termohidraulicos de mejor estimacion

    Energy Technology Data Exchange (ETDEWEB)

    Alva N, J.

    2010-07-01

    In this thesis, some fundamental knowledge is presented about uncertainty analysis and about the diverse methodologies applied in the study of nuclear power plant transient events, particularly those related to thermal-hydraulic phenomena. These concepts and methodologies come from a wide bibliographical survey of the nuclear power literature. Methodologies for uncertainty analysis have been developed by quite diverse institutions, and they have been widely used worldwide for application to results from best-estimate-type computer codes in nuclear reactor thermal-hydraulics and safety analysis. The main uncertainty sources, types of uncertainties, and aspects related to best-estimate modeling and methods are also introduced. Once the main bases of uncertainty analysis have been set and some of the known methodologies introduced, the CSAU methodology, which will be applied in the analyses, is presented in detail. The main objective of this thesis is to compare the results of an uncertainty and sensitivity analysis using the Response Surface Technique with those obtained by applying Wilks' formula, for a loss-of-coolant experiment and a power rise event in a BWR. Both techniques are options in the uncertainty and sensitivity analysis part of the CSAU methodology, which was developed for the analysis of transients and accidents at nuclear power plants and is the basis of most of the methodologies used in licensing of nuclear power plants practically everywhere. Finally, the results of applying both techniques are compared and discussed. (Author)
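
    The Wilks formula mentioned in this record determines how many code runs are needed for a distribution-free tolerance limit. For a first-order one-sided limit, the requirement 1 − γ^n ≥ β (coverage γ at confidence β) gives the classic 59 runs at 95%/95%:

    ```python
    # Sketch: minimum number of runs from Wilks' first-order one-sided formula.
    import math

    def wilks_n(gamma=0.95, beta=0.95):
        # Smallest n with 1 - gamma**n >= beta.
        return math.ceil(math.log(1.0 - beta) / math.log(gamma))

    print(wilks_n())            # -> 59, the classic 95%/95% result
    print(wilks_n(0.99, 0.95))  # higher coverage requires many more runs
    ```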

  10. Uncertainty in techno-economic estimates of cellulosic ethanol production due to experimental measurement uncertainty

    Directory of Open Access Journals (Sweden)

    Vicari Kristin J

    2012-04-01

    Full Text Available Abstract Background Cost-effective production of lignocellulosic biofuels remains a major financial and technical challenge at the industrial scale. A critical tool in biofuels process development is the techno-economic (TE) model, which calculates biofuel production costs using a process model and an economic model. The process model solves mass and energy balances for each unit, and the economic model estimates capital and operating costs from the process model based on economic assumptions. The process model inputs include experimental data on the feedstock composition and intermediate product yields for each unit. These experimental yield data are calculated from primary measurements. Uncertainty in these primary measurements is propagated to the calculated yields, to the process model, and ultimately to the economic model. Thus, outputs of the TE model have a minimum uncertainty associated with the uncertainty in the primary measurements. Results We calculate the uncertainty in the Minimum Ethanol Selling Price (MESP) estimate for lignocellulosic ethanol production via a biochemical conversion process: dilute sulfuric acid pretreatment of corn stover followed by enzymatic hydrolysis and co-fermentation of the resulting sugars to ethanol. We perform a sensitivity analysis on the TE model and identify the feedstock composition and conversion yields from three unit operations (xylose from pretreatment, glucose from enzymatic hydrolysis, and ethanol from fermentation) as the most important variables. The uncertainty in the pretreatment xylose yield arises from multiple measurements, whereas the glucose and ethanol yields from enzymatic hydrolysis and fermentation, respectively, are dominated by a single measurement: the fraction of insoluble solids (fIS) in the biomass slurries. Conclusions We calculate a $0.15/gal uncertainty in MESP from the TE model due to uncertainties in primary measurements. This result sets a lower bound on the error bars of
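
    The propagation chain described above can be mimicked in a few lines: sample the primary measurements, push them through the model, and read off the spread of the output. The surrogate cost model and all distributions below are hypothetical stand-ins for the actual process and economic models.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 50_000

        # Hypothetical primary measurements (mean, standard deviation).
        xylose_yield  = rng.normal(0.75, 0.020, n)   # pretreatment
        glucose_yield = rng.normal(0.85, 0.015, n)   # enzymatic hydrolysis
        ethanol_yield = rng.normal(0.90, 0.010, n)   # fermentation

        # Hypothetical surrogate: selling price falls as overall yield rises.
        overall = 0.4 * xylose_yield + 0.6 * glucose_yield * ethanol_yield
        mesp = 2.15 / overall                         # $/gal, illustrative only

        print(f"MESP = {mesp.mean():.2f} +/- {mesp.std():.2f} $/gal")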

  11. Verification of uncertainty budgets

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Madsen, B.S.

    2005-01-01

    , and therefore it is essential that the applicability of the overall uncertainty budget to actual measurement results be verified on the basis of current experimental data. This should be carried out by replicate analysis of samples taken in accordance with the definition of the measurand, but representing...... the full range of matrices and concentrations for which the budget is assumed to be valid. In this way the assumptions made in the uncertainty budget can be experimentally verified, both as regards sources of variability that are assumed negligible, and dominant uncertainty components. Agreement between...

  12. Analysis of uncertainties in the estimates of nitrous oxide and methane emissions in the UK's greenhouse gas inventory for agriculture

    Science.gov (United States)

    Milne, Alice E.; Glendining, Margaret J.; Bellamy, Pat; Misselbrook, Tom; Gilhespy, Sarah; Rivas Casado, Monica; Hulin, Adele; van Oijen, Marcel; Whitmore, Andrew P.

    2014-01-01

    The UK's greenhouse gas inventory for agriculture uses a model based on the IPCC Tier 1 and Tier 2 methods to estimate the emissions of methane and nitrous oxide from agriculture. The inventory calculations are disaggregated at country level (England, Wales, Scotland and Northern Ireland). Before now, no detailed assessment of the uncertainties in the estimates of emissions had been done. We used Monte Carlo simulation to do such an analysis. We collated information on the uncertainties of each of the model inputs. The uncertainties propagate through the model and result in uncertainties in the estimated emissions. Using a sensitivity analysis, we found that in England and Scotland the uncertainty in the emission factor for emissions from N inputs (EF1) affected uncertainty the most, but that in Wales and Northern Ireland, the emission factor for N leaching and runoff (EF5) had greater influence. We showed that if the uncertainty in any one of these emission factors is reduced by 50%, the uncertainty in emissions of nitrous oxide reduces by 10%. The uncertainty in methane emissions was most affected by the uncertainties in the emission factors for enteric fermentation in cows and sheep. When inventories are disaggregated (as that for the UK is) correlation between separate instances of each emission factor will affect the uncertainty in emissions. As more countries move towards inventory models with disaggregation, it is important that the IPCC give firm guidance on this topic.
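
    The "halve one emission factor's uncertainty" experiment reported above can be reproduced schematically: draw the factors, form the emission total, and compare output spreads. The Tier-1-style model, factor values, and nitrogen totals below are hypothetical, not the UK inventory's.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 100_000

        def n2o_emissions(ef1_sd):
            ef1 = rng.normal(0.0100, ef1_sd, n)   # kg N2O-N per kg N applied
            ef5 = rng.normal(0.0075, 0.004, n)    # leaching/runoff factor
            n_applied, n_leached = 1.0e6, 0.3e6   # kg N, hypothetical totals
            return ef1 * n_applied + ef5 * n_leached

        base    = n2o_emissions(ef1_sd=0.0030)
        reduced = n2o_emissions(ef1_sd=0.0015)    # EF1 uncertainty halved
        print(f"output sd reduced by {100 * (1 - reduced.std() / base.std()):.0f}%")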

  13. A new uncertainty importance measure

    International Nuclear Information System (INIS)

    Borgonovo, E.

    2007-01-01

    Uncertainty in parameters is present in many risk assessment problems and leads to uncertainty in model predictions. In this work, we introduce a global sensitivity indicator which looks at the influence of input uncertainty on the entire output distribution without reference to a specific moment of the output (moment independence) and which can be defined also in the presence of correlations among the parameters. We discuss its mathematical properties and highlight the differences between the present indicator, variance-based uncertainty importance measures and a moment independent sensitivity indicator previously introduced in the literature. Numerical results are discussed with application to the probabilistic risk assessment model on which Iman [A matrix-based approach to uncertainty and sensitivity analysis for fault trees. Risk Anal 1987;7(1):22-33] first introduced uncertainty importance measures.
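
    A moment-independent indicator of this kind can be estimated by comparing the unconditional output density with densities conditional on bins of one input. The binned estimator and the toy model below are illustrative choices, not the paper's formulation.

        import numpy as np

        rng = np.random.default_rng(2)
        n, y_bins = 200_000, 50

        x1 = rng.normal(size=n)
        x2 = rng.normal(size=n)
        y = x1 + 0.3 * x2**2                      # hypothetical model

        def delta(x, y, n_cond=20):
            # delta = 0.5 * E_x[ integral |f_Y - f_{Y|X}| dy ], by histograms.
            x_edges = np.quantile(x, np.linspace(0, 1, n_cond + 1))
            y_edges = np.histogram_bin_edges(y, bins=y_bins)
            width = np.diff(y_edges)
            f_y, _ = np.histogram(y, bins=y_edges, density=True)
            d = 0.0
            for lo, hi in zip(x_edges[:-1], x_edges[1:]):
                sel = (x >= lo) & (x <= hi)
                f_c, _ = np.histogram(y[sel], bins=y_edges, density=True)
                d += sel.mean() * 0.5 * np.sum(np.abs(f_y - f_c) * width)
            return d

        print("delta(x1) =", round(delta(x1, y), 3))   # dominant input
        print("delta(x2) =", round(delta(x2, y), 3))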

  14. Evaluating prediction uncertainty

    International Nuclear Information System (INIS)

    McKay, M.D.

    1995-03-01

    The probability distribution of a model prediction is presented as a proper basis for evaluating the uncertainty in a model prediction that arises from uncertainty in input values. Determination of important model inputs and subsets of inputs is made through comparison of the prediction distribution with conditional prediction probability distributions. Replicated Latin hypercube sampling and variance ratios are used in estimation of the distributions and in construction of importance indicators. The assumption of a linear relation between model output and inputs is not necessary for the indicators to be effective. A sequential methodology which includes an independent validation step is applied in two analysis applications to select subsets of input variables which are the dominant causes of uncertainty in the model predictions. Comparison with results from methods which assume linearity shows how those methods may fail. Finally, suggestions for treating structural uncertainty for submodels are presented.
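
    The variance ratio Var(E[Y|Xi])/Var(Y) at the core of this approach can be estimated with replicated Latin hypercube samples: hold one input's stratified values fixed and re-pair the others across replicates. The toy model below is hypothetical, and the estimator is left uncorrected for replicate noise.

        import numpy as np

        rng = np.random.default_rng(3)
        n, r = 500, 50                    # strata, replicates

        # One stratified (Latin hypercube) value per stratum for each input.
        u = (np.arange(n) + 0.5) / n
        x1_vals, x2_vals = u.copy(), u.copy()

        y = np.empty((r, n))
        for k in range(r):
            # Replicate k: x1 stays in stratum order, x2 is randomly re-paired.
            y[k] = np.sin(2 * np.pi * x1_vals) + 0.5 * rng.permutation(x2_vals)

        # Mean over replicates estimates E[Y | x1 stratum]; its variance
        # across strata estimates Var(E[Y|X1]).
        eta_sq = np.var(y.mean(axis=0)) / np.var(y)
        print(f"variance ratio for x1: {eta_sq:.2f}")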

  15. Oil price uncertainty in Canada

    Energy Technology Data Exchange (ETDEWEB)

    Elder, John [Department of Finance and Real Estate, 1272 Campus Delivery, Colorado State University, Fort Collins, CO 80523 (United States); Serletis, Apostolos [Department of Economics, University of Calgary, Calgary, Alberta (Canada)

    2009-11-15

    Bernanke [Bernanke, Ben S. Irreversibility, uncertainty, and cyclical investment. Quarterly Journal of Economics 98 (1983), 85-106.] shows how uncertainty about energy prices may induce optimizing firms to postpone investment decisions, thereby leading to a decline in aggregate output. Elder and Serletis [Elder, John and Serletis, Apostolos. Oil price uncertainty.] find empirical evidence that uncertainty about oil prices has tended to depress investment in the United States. In this paper we assess the robustness of these results by investigating the effects of oil price uncertainty in Canada. Our results are remarkably similar to existing results for the United States, providing additional evidence that uncertainty about oil prices may provide another explanation for why the sharp oil price declines of 1985 failed to produce rapid output growth. Impulse-response analysis suggests that uncertainty about oil prices may tend to reinforce the negative response of output to positive oil shocks. (author)

  16. Nuclear data uncertainty analysis for the generation IV gas-cooled fast reactor

    International Nuclear Information System (INIS)

    Pelloni, S.; Mikityuk, K.

    2012-01-01

    For the European 2400 MW Gas-cooled Fast Reactor (GoFastR), this paper summarizes a priori uncertainties, i.e. without any integral experiment assessment, of the main neutronic parameters which were obtained on the basis of the deterministic code system ERANOS (Edition 2.2-N). JEFF-3.1 cross-sections were used in conjunction with the newest ENDF/B-VII.0 based covariance library (COMMARA-2.0) resulting from a recent cooperation of the Brookhaven and Los Alamos National Laboratories within the Advanced Fuel Cycle Initiative. The basis for the analysis is the original GoFastR concept with carbide fuel pins and silicon-carbide ceramic cladding, which was developed and proposed in the first quarter of 2009 by the 'French alternative energies and Atomic Energy Commission', CEA. The main conclusions from the current study are that nuclear data uncertainties of neutronic parameters may still be too large for this Generation IV reactor, especially concerning the multiplication factor, despite the fact that the new covariance library is quite complete; these uncertainties, in relative terms, do not show the a priori expected increase with burn-up as a result of the minor actinide and fission product build-up. Indeed, they are found almost independent of the fuel depletion, since the uncertainty associated with 238U inelastic scattering largely dominates. This finding clearly supports the activities of Subgroup 33 of the Working Party on International Nuclear Data Evaluation Cooperation (WPEC), i.e. 'Methods and issues for the combined use of integral experiments and covariance data', attempting to reduce the present unbiased uncertainties on nuclear data through adjustments based on available experimental data. (authors)

  17. A Proposal of Estimation Methodology to Improve Calculation Efficiency of Sampling-based Method in Nuclear Data Sensitivity and Uncertainty Analysis

    International Nuclear Information System (INIS)

    Song, Myung Sub; Kim, Song Hyun; Kim, Jong Kyung; Noh, Jae Man

    2014-01-01

    The uncertainty with the sampling-based method is evaluated by repeating transport calculations with a number of cross section data sets sampled from the covariance uncertainty data. In the transport calculation with the sampling-based method, the transport equation is not modified; therefore, all uncertainties of the responses such as k-eff, reaction rates, flux, and power distribution can be obtained directly, all at one time, without code modification. However, a major drawback of the sampling-based method is that it requires an expensive computational load to obtain statistically reliable results (within a 0.95 confidence level) in the uncertainty analysis. The purpose of this study is to develop a method for improving the computational efficiency and obtaining highly reliable uncertainty results when using the sampling-based method with Monte Carlo simulation. The proposed method reduces the convergence time of the response uncertainty by using multiple sets of sampled group cross sections in a single Monte Carlo simulation. The proposed method was verified by evaluating the GODIVA benchmark problem, and the results were compared with those of the conventional sampling-based method. In this study, a sampling-based method based on the central limit theorem is proposed to improve calculation efficiency by reducing the number of repetitive Monte Carlo transport calculations required to obtain reliable uncertainty analysis results. Each set of sampled group cross sections is assigned to an active cycle group in a single Monte Carlo simulation. The criticality uncertainty for the GODIVA problem is evaluated by the proposed and previous methods. The results show that the proposed sampling-based method can efficiently decrease the number of Monte Carlo simulations required to evaluate the uncertainty of k-eff. It is expected that the proposed method will improve the computational efficiency of uncertainty analysis with sampling-based methods.

  18. Qualification and application of nuclear reactor accident analysis code with the capability of internal assessment of uncertainty

    International Nuclear Information System (INIS)

    Borges, Ronaldo Celem

    2001-10-01

    This thesis presents an independent qualification of the CIAU code ('Code with the capability of Internal Assessment of Uncertainty'), which is part of the process of internal uncertainty evaluation with a thermal hydraulic system code on a realistic basis. This is done by combining the uncertainty methodology UMAE ('Uncertainty Methodology based on Accuracy Extrapolation') with the RELAP5/Mod3.2 code. This allows associating uncertainty band estimates with the results obtained by the realistic calculation of the code, meeting licensing requirements of safety analysis. The independent qualification is supported by simulations with RELAP5/Mod3.2 related to accident condition tests of the LOBI experimental facility and to an event which occurred in the Angra 1 nuclear power plant, by comparison with measured results and by establishing uncertainty bands on the calculated time trends of safety parameters. These bands have indeed enveloped the measured trends. The results of this independent qualification of CIAU have made it possible to ascertain the adequate application of a systematic realistic code procedure to analyse accidents with uncertainties incorporated in the results, although there is an evident need to extend the uncertainty database. It has been verified that use of the code with this internal assessment of uncertainty is feasible in the design and licensing stages of an NPP. (author)

  19. Strategy under uncertainty.

    Science.gov (United States)

    Courtney, H; Kirkland, J; Viguerie, P

    1997-01-01

    At the heart of the traditional approach to strategy lies the assumption that by applying a set of powerful analytic tools, executives can predict the future of any business accurately enough to allow them to choose a clear strategic direction. But what happens when the environment is so uncertain that no amount of analysis will allow us to predict the future? What makes for a good strategy in highly uncertain business environments? The authors, consultants at McKinsey & Company, argue that uncertainty requires a new way of thinking about strategy. All too often, they say, executives take a binary view: either they underestimate uncertainty to come up with the forecasts required by their companies' planning or capital-budgeting processes, or they overestimate it, abandon all analysis, and go with their gut instinct. The authors outline a new approach that begins by making a crucial distinction among four discrete levels of uncertainty that any company might face. They then explain how a set of generic strategies--shaping the market, adapting to it, or reserving the right to play at a later time--can be used in each of the four levels. And they illustrate how these strategies can be implemented through a combination of three basic types of actions: big bets, options, and no-regrets moves. The framework can help managers determine which analytic tools can inform decision making under uncertainty--and which cannot. At a broader level, it offers executives a discipline for thinking rigorously and systematically about uncertainty and its implications for strategy.

  20. Uncertainty and global climate change research

    Energy Technology Data Exchange (ETDEWEB)

    Tonn, B.E. [Oak Ridge National Lab., TN (United States); Weiher, R. [National Oceanic and Atmospheric Administration, Boulder, CO (United States)

    1994-06-01

    The Workshop on Uncertainty and Global Climate Change Research was held March 22-23, 1994, in Knoxville, Tennessee. This report summarizes the results and recommendations of the workshop. The purpose of the workshop was to examine in depth the concept of uncertainty. From an analytical point of view, uncertainty is a central feature of global climate science, economics and decision making. The magnitude and complexity of uncertainty surrounding global climate change has made it quite difficult to answer even the most simple and important of questions: whether potentially costly action is required now to ameliorate adverse consequences of global climate change or whether delay is warranted to gain better information to reduce uncertainties. A major conclusion of the workshop is that multidisciplinary integrated assessment using decision analytic techniques as a foundation is key to addressing global change policy concerns. First, uncertainty must be dealt with explicitly and rigorously since it is and will continue to be a key feature of analysis and recommendations on policy questions for years to come. Second, key policy questions and variables need to be explicitly identified, prioritized, and their uncertainty characterized to guide the entire scientific, modeling, and policy analysis process. Multidisciplinary integrated assessment techniques and value of information methodologies are best suited for this task. In terms of timeliness and relevance of developing and applying decision analytic techniques, the global change research and policy communities are moving rapidly toward integrated approaches to research design and policy analysis.

  1. Overview of hybrid subspace methods for uncertainty quantification, sensitivity analysis

    International Nuclear Information System (INIS)

    Abdel-Khalik, Hany S.; Bang, Youngsuk; Wang, Congjian

    2013-01-01

    Highlights: ► We overview the state-of-the-art in uncertainty quantification and sensitivity analysis. ► We overview new developments in above areas using hybrid methods. ► We give a tutorial introduction to above areas and the new developments. ► Hybrid methods address the explosion in dimensionality in nonlinear models. ► Representative numerical experiments are given. -- Abstract: The role of modeling and simulation has been heavily promoted in recent years to improve understanding of complex engineering systems. To realize the benefits of modeling and simulation, concerted efforts in the areas of uncertainty quantification and sensitivity analysis are required. The manuscript intends to serve as a pedagogical presentation of the material to young researchers and practitioners with little background on the subjects. We believe this is important as the role of these subjects is expected to be integral to the design, safety, and operation of existing as well as next generation reactors. In addition to covering the basics, an overview of the current state-of-the-art will be given with particular emphasis on the challenges pertaining to nuclear reactor modeling. The second objective will focus on presenting our own development of hybrid subspace methods intended to address the explosion in the computational overhead required when handling real-world complex engineering systems.
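
    To give a flavor of the subspace idea, the sketch below probes a high-dimensional model with a few random input directions and recovers the low-dimensional subspace that dominates its response. The rank-3 linear model is a hypothetical stand-in for a reactor simulation, and this randomized range finder is only one of the building blocks such hybrid methods combine.

        import numpy as np

        rng = np.random.default_rng(7)
        d, n_probe = 200, 20

        # Hypothetical model whose response lives in a rank-3 subspace.
        U = np.linalg.qr(rng.normal(size=(d, 3)))[0]
        def model(x):
            return U @ (np.array([10.0, 5.0, 1.0]) * (U.T @ x))

        # Randomized probing: run the model on n_probe random directions.
        Omega = rng.normal(size=(d, n_probe))
        Y = np.column_stack([model(Omega[:, i]) for i in range(n_probe)])
        Q = np.linalg.qr(Y)[0]            # orthonormal basis of the range

        s = np.linalg.svd(Q.T @ Y, compute_uv=False)
        print("effective rank:", int((s > 1e-8 * s[0]).sum()))   # -> 3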

  2. Parameterization and Uncertainty Analysis of SWAT model in Hydrological Simulation of Chaohe River Basin

    Science.gov (United States)

    Jie, M.; Zhang, J.; Guo, B. B.

    2017-12-01

    As a typical distributed hydrological model, the SWAT model also poses a challenge in calibrating its parameters and analyzing their uncertainty. This paper takes the Chaohe River Basin, China, as the study area. After the SWAT model is established and the DEM data of the Chaohe river basin are loaded, the watershed is automatically divided into several sub-basins. Land use, soil, and slope are analyzed on the basis of the sub-basins to delineate the hydrological response units (HRUs) of the study area; after the SWAT model is run, simulated runoff values for the watershed are obtained. On this basis, weather data and the known daily runoff at three hydrological stations are used, in combination with the SWAT-CUP automatic calibration program and manual adjustment, for multi-site calibration of the model parameters. Furthermore, the GLUE algorithm is used to analyze the parameter uncertainty of the SWAT model. Through the sensitivity analysis, calibration, and uncertainty study of SWAT, the results indicate that the parameterization of the hydrological characteristics of the Chaohe river is successful and feasible, and the model can be used to simulate the Chaohe river basin.
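
    A minimal GLUE-style sketch of the analysis described: sample parameter sets, score each against observations with a likelihood measure, keep the behavioural ones, and form prediction bounds. The recession model, synthetic data, and the 0.7 Nash-Sutcliffe threshold are all hypothetical, not the Chaohe setup.

        import numpy as np

        rng = np.random.default_rng(4)
        t = np.linspace(0, 10, 50)
        obs = 3.0 * np.exp(-0.4 * t) + rng.normal(0, 0.1, t.size)  # synthetic "runoff"

        def model(a, k):
            return a * np.exp(-k * t)             # hypothetical recession model

        n = 20_000
        a = rng.uniform(1.0, 5.0, n)
        k = rng.uniform(0.1, 1.0, n)
        sims = model(a[:, None], k[:, None])      # (n, 50) simulated series

        # Nash-Sutcliffe efficiency as the informal GLUE likelihood measure.
        nse = 1 - ((sims - obs) ** 2).sum(axis=1) / ((obs - obs.mean()) ** 2).sum()
        behavioural = nse > 0.7                   # hypothetical acceptance threshold

        # Unweighted 5-95% bounds for brevity; GLUE proper weights by likelihood.
        lo, hi = np.percentile(sims[behavioural], [5, 95], axis=0)
        print(f"{behavioural.sum()} behavioural sets; bounds at t=0: "
              f"[{lo[0]:.2f}, {hi[0]:.2f}]")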

  3. Uncertainty relation and probability. Numerical illustration

    International Nuclear Information System (INIS)

    Fujikawa, Kazuo; Umetsu, Koichiro

    2011-01-01

    The uncertainty relation and the probability interpretation of quantum mechanics are intrinsically connected, as is evidenced by the evaluation of standard deviations. It is thus natural to ask if one can associate a very small uncertainty product of suitably sampled events with a very small probability. We have shown elsewhere that some examples of the evasion of the uncertainty relation noted in the past are in fact understood in this way. We here numerically illustrate that a very small uncertainty product is realized if one performs a suitable sampling of measured data that occur with a very small probability. We introduce a notion of cyclic measurements. It is also shown that our analysis is consistent with the Landau-Pollak-type uncertainty relation. It is suggested that the present analysis may help reconcile the contradicting views about the 'standard quantum limit' in the detection of gravitational waves. (author)

  4. A new approach and computational algorithm for sensitivity/uncertainty analysis for SED and SAD with applications to beryllium integral experiments

    International Nuclear Information System (INIS)

    Song, P.M.; Youssef, M.Z.; Abdou, M.A.

    1993-01-01

    A new approach for treating the sensitivity and uncertainty in the secondary energy distribution (SED) and the secondary angular distribution (SAD) has been developed, and the existing two-dimensional sensitivity/uncertainty analysis code, FORSS, was expanded to incorporate the new approach. The calculational algorithm was applied to the 9Be(n,2n) cross section to study the effect of the current uncertainties in the SED and SAD of neutrons emitted from this reaction on the prediction accuracy of the tritium production rate from 6Li (T6) and 7Li (T7) in an engineering-oriented fusion integral experiment of the US Department of Energy/Japan Atomic Energy Research Institute Collaborative Program on Fusion Neutronics, in which beryllium was used as a neutron multiplier. In addition, the analysis was extended to include the uncertainties in the integrated smooth cross sections of beryllium and other materials that constituted the test assembly used in the experiment. This comprehensive two-dimensional cross-section sensitivity/uncertainty analysis aimed at identifying the sources of discrepancies between calculated and measured values for T6 and T7.

  5. Parameter optimization, sensitivity, and uncertainty analysis of an ecosystem model at a forest flux tower site in the United States

    Science.gov (United States)

    Wu, Yiping; Liu, Shuguang; Huang, Zhihong; Yan, Wende

    2014-01-01

    Ecosystem models are useful tools for understanding ecological processes and for sustainable management of resources. In the biogeochemical field, numerical models have been widely used for investigating carbon dynamics under global changes from site to regional and global scales. However, it is still challenging to optimize parameters and estimate parameterization uncertainty for complex process-based models such as the Erosion Deposition Carbon Model (EDCM), a modified version of CENTURY, that consider carbon, water, and nutrient cycles of ecosystems. This study was designed to conduct the parameter identifiability, optimization, sensitivity, and uncertainty analysis of EDCM using our developed EDCM-Auto, which incorporates a comprehensive R package, the Flexible Modeling Framework (FME), and the Shuffled Complex Evolution (SCE) algorithm. Using a forest flux tower site as a case study, we implemented a comprehensive modeling analysis involving nine parameters and four target variables (carbon and water fluxes) with their corresponding measurements based on the eddy covariance technique. The local sensitivity analysis shows that the model cost function is most sensitive to the plant production-related parameters (e.g., PPDF1 and PRDX). Both SCE and FME are comparable and performed well in deriving the optimal parameter set with satisfactory simulations of target variables. Global sensitivity and uncertainty analysis indicate that the parameter uncertainty and the resulting output uncertainty can be quantified, and that the magnitude of parameter-uncertainty effects depends on variables and seasons. This study also demonstrates that using cutting-edge R functions such as FME can be feasible and attractive for conducting comprehensive parameter analysis for ecosystem modeling.
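
    As an illustration of such an automated calibration loop, the sketch below fits two parameters of a toy flux model to synthetic observations, with SciPy's differential evolution standing in for the SCE algorithm. The model, the reuse of the names PPDF1 and PRDX as labels, and the data are hypothetical, not EDCM.

        import numpy as np
        from scipy.optimize import differential_evolution

        rng = np.random.default_rng(8)
        t = np.arange(365.0)

        def nee(params):
            # Hypothetical flux surrogate: seasonal cycle with amplitude
            # "PRDX", damped over the year at rate "PPDF1".
            ppdf1, prdx = params
            return prdx * np.sin(2 * np.pi * t / 365.0) * np.exp(-ppdf1 * t / 365.0)

        obs = nee([0.8, 5.0]) + rng.normal(0, 0.3, t.size)  # synthetic tower data

        def cost(params):
            return float(np.sum((nee(params) - obs) ** 2))  # model cost function

        res = differential_evolution(cost, bounds=[(0.1, 2.0), (1.0, 10.0)], seed=1)
        print("calibrated (PPDF1, PRDX):", np.round(res.x, 2))  # near (0.8, 5.0)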

  6. Analysis of uncertainty in modeling perceived risks

    International Nuclear Information System (INIS)

    Melnyk, R.; Sandquist, G.M.

    2005-01-01

    Expanding on a mathematical model developed for quantifying and assessing perceived risks, the distribution functions, variances, and uncertainties associated with estimating the model parameters are quantified. The analytical model permits the identification and assignment of any number of quantifiable risk perception factors that can be incorporated within standard risk methodology. Those risk perception factors associated with major technical issues are modeled using lognormal probability density functions to span the potentially large uncertainty variations associated with these risk perceptions. The model quantifies the logic of public risk perception and provides an effective means for measuring and responding to perceived risks. (authors)
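
    The lognormal treatment can be pictured with a toy calculation: multiply a base technical risk by lognormally distributed perception factors and observe how the perceived risk spreads over orders of magnitude. The factor names and parameters below are hypothetical, not the paper's.

        import numpy as np

        rng = np.random.default_rng(5)
        n = 100_000

        base_risk = 1e-6                                   # annual frequency
        dread = rng.lognormal(mean=1.0, sigma=0.8, size=n)
        unfamiliarity = rng.lognormal(mean=0.5, sigma=0.6, size=n)

        perceived = base_risk * dread * unfamiliarity
        print(f"median {np.median(perceived):.2e}, "
              f"95th percentile {np.percentile(perceived, 95):.2e}")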

  7. Sample application of sensitivity/uncertainty analysis techniques to a groundwater transport problem. National Low-Level Waste Management Program

    International Nuclear Information System (INIS)

    Seitz, R.R.; Rood, A.S.; Harris, G.A.; Maheras, S.J.; Kotecki, M.

    1991-06-01

    The primary objective of this document is to provide sample applications of selected sensitivity and uncertainty analysis techniques within the context of the radiological performance assessment process. These applications were drawn from the companion document Guidelines for Sensitivity and Uncertainty Analyses of Low-Level Radioactive Waste Performance Assessment Computer Codes (S. Maheras and M. Kotecki, DOE/LLW-100, 1990). Three techniques are illustrated in this document: one-factor-at-a-time (OFAT) analysis, fractional factorial design, and Latin hypercube sampling. The report also illustrates the differences in sensitivity and uncertainty analysis at the early and later stages of the performance assessment process, and potential pitfalls that can be encountered when applying the techniques. The emphasis is on application of the techniques as opposed to the actual results, since the results are hypothetical and are not based on site-specific conditions.
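
    Of the three techniques, OFAT is the simplest to sketch: vary each input over its range while the others stay at nominal values, and record the output swing. The groundwater-dose surrogate and parameter ranges below are hypothetical, not the report's sample problem.

        def dose(kd, infiltration, inventory):
            # Hypothetical groundwater-pathway surrogate, not a real PA code.
            return inventory * infiltration / (1.0 + 50.0 * kd)

        nominal = {"kd": 1.0, "infiltration": 0.2, "inventory": 1e3}
        ranges = {"kd": (0.1, 10.0), "infiltration": (0.05, 0.5),
                  "inventory": (5e2, 2e3)}

        base = dose(**nominal)
        for name, (lo, hi) in ranges.items():
            swing = [dose(**dict(nominal, **{name: v})) for v in (lo, hi)]
            print(f"{name:13s}: output swing {min(swing):.3g} .. {max(swing):.3g}"
                  f" (nominal {base:.3g})")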

  8. ThermoData Engine: Extension to Solvent Design and Multi-component Process Stream Property Calculations with Uncertainty Analysis

    DEFF Research Database (Denmark)

    Diky, Vladimir; Chirico, Robert D.; Muzny, Chris

    ThermoData Engine (TDE, NIST Standard Reference Databases 103a and 103b) is the first product that implements the concept of Dynamic Data Evaluation in the fields of thermophysics and thermochemistry, which includes maintaining the comprehensive and up-to-date database of experimentally measured...... property values and expert system for data analysis and generation of recommended property values at the specified conditions along with uncertainties on demand. The most recent extension of TDE covers solvent design and multi-component process stream property calculations with uncertainty analysis...... variations). Predictions can be compared to the available experimental data, and uncertainties are estimated for all efficiency criteria. Calculations of the properties of multi-component streams including composition at phase equilibria (flash calculations) are at the heart of process simulation engines...

  9. Sensitivity Analysis of Uncertainty Parameter based on MARS-LMR Code on SHRT-45R of EBR II

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Seok-Ju; Kang, Doo-Hyuk; Seo, Jae-Seung [System Engineering and Technology Co., Daejeon (Korea, Republic of); Bae, Sung-Won [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Jeong, Hae-Yong [Sejong University, Seoul (Korea, Republic of)

    2016-10-15

    In order to assess the uncertainty quantification capability of the MARS-LMR code, the code has been improved by modifying the source code to accommodate the calculation process required for uncertainty quantification. In the present study, an Unprotected Loss of Flow (ULOF) transient is selected as a typical case of an Anticipated Transient without Scram (ATWS), which belongs to the DEC category. The MARS-LMR input generation for EBR-II SHRT-45R and the execution work are performed using the PAPIRUS program. The sensitivity analysis is carried out with the uncertainty parameters of the MARS-LMR code for EBR-II SHRT-45R. Based on the results of the sensitivity analysis, dominant parameters with large sensitivity to the figure of merit (FoM) are picked out. The dominant parameters selected are closely related to the development process of the ULOF event.

  10. Uncertainty analysis of 137Cs and 90Sr activity in borehole water from a waste disposal site

    International Nuclear Information System (INIS)

    Dafauti, Sunita; Pulhani, Vandana; Datta, D.; Hegde, A.G.

    2005-01-01

    Uncertainty quantification (UQ) is the quantitative characterization and use of uncertainty in experimental applications. There are two distinct types of uncertainty: variability, which can be quantified in principle using classical probability theory, and lack of knowledge, which requires more than classical probability theory for its quantification. Fuzzy set theory was applied to quantify the second type of uncertainty associated with the measurement of activity due to 137Cs and 90Sr present in bore-well water samples from a waste disposal site. The upper and lower limits of concentration were computed, and it may be concluded from the analysis that the alpha-cut technique of fuzzy set theory is a good nonprecise estimator of these types of bounds. (author)
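
    An alpha-cut propagation of the kind used here can be sketched as follows: represent each measured quantity as a triangular fuzzy number, take its interval at each alpha level, and combine the intervals with interval arithmetic. The activity equation and all numbers are hypothetical, not the paper's data.

        def alpha_cut(tri, alpha):
            """Interval of a triangular fuzzy number (low, mode, high) at level alpha."""
            lo, m, hi = tri
            return lo + alpha * (m - lo), hi - alpha * (hi - m)

        count_rate = (90.0, 100.0, 108.0)    # net counts per second
        efficiency = (0.20, 0.25, 0.28)      # detection efficiency
        volume     = (0.95, 1.00, 1.05)      # sample volume, litres

        for alpha in (0.0, 0.5, 1.0):
            (c_lo, c_hi), (e_lo, e_hi), (v_lo, v_hi) = (
                alpha_cut(t, alpha) for t in (count_rate, efficiency, volume))
            # Activity increases with counts, decreases with efficiency and volume.
            act_lo = c_lo / (e_hi * v_hi)
            act_hi = c_hi / (e_lo * v_lo)
            print(f"alpha={alpha:.1f}: activity in [{act_lo:.0f}, {act_hi:.0f}] Bq/L")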

  11. Uncertainties in Forecasting Streamflow using Entropy Theory

    Science.gov (United States)

    Cui, H.; Singh, V. P.

    2017-12-01

    Streamflow forecasting is essential in river restoration, reservoir operation, power generation, irrigation, navigation, and water management. However, forecasts are always accompanied by uncertainties, which may affect the forecasting results and lead to large variations. Therefore, uncertainties must be considered and assessed properly when forecasting streamflow for water management. The aim of our work is to quantify the uncertainties involved in forecasting streamflow and to provide reliable streamflow forecasts. Although streamflow time series are stochastic, they exhibit seasonal and periodic patterns. Therefore, streamflow forecasting entails modeling seasonality, periodicity, and the correlation structure of the series, and assessing uncertainties. This study applies entropy theory to forecast streamflow and to measure uncertainties during the forecasting process. To apply entropy theory for streamflow forecasting, spectral analysis is combined with time series analysis, as spectral analysis can be employed to characterize patterns of streamflow variation and identify the periodicity of streamflow. That is, it permits the extraction of significant information for understanding the streamflow process and predicting it. Application of entropy theory for streamflow forecasting involves determination of the spectral density, determination of parameters, and extension of the autocorrelation function. The uncertainties contributed by the precipitation input, the forecasting model, and the forecasted results are measured separately using entropy, and information theory is used to describe how these uncertainties are transported and aggregated through these processes.
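
    One concrete way entropy quantifies forecast uncertainty is through the error distribution: a wider distribution of forecast errors carries higher differential entropy. The histogram estimator and the two synthetic error samples below are illustrative only.

        import numpy as np

        def diff_entropy(sample, bins=30):
            # Histogram estimate of differential entropy: -sum p * ln(p) * width.
            p, edges = np.histogram(sample, bins=bins, density=True)
            w = np.diff(edges)
            nz = p > 0
            return -np.sum(p[nz] * np.log(p[nz]) * w[nz])

        rng = np.random.default_rng(6)
        narrow = rng.normal(0.0, 0.5, 10_000)   # errors of a sharper forecast
        wide   = rng.normal(0.0, 2.0, 10_000)   # errors of a poorer forecast
        print(f"entropy (narrow): {diff_entropy(narrow):.2f} nats")
        print(f"entropy (wide):   {diff_entropy(wide):.2f} nats")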

  12. Geoengineering to Avoid Overshoot: An Analysis of Uncertainty

    Science.gov (United States)

    Tanaka, Katsumasa; Cho, Cheolhung; Krey, Volker; Patt, Anthony; Rafaj, Peter; Rao-Skirbekk, Shilpa; Wagner, Fabian

    2010-05-01

    ., 2009) is employed to calculate climate responses including associated uncertainty and to estimate geoengineering profiles to cap the warming at 2°C since preindustrial times. The inversion setup for the model ACC2 is used to estimate the uncertain parameters (e.g. climate sensitivity) against associated historical observations (e.g. global-mean surface air temperature). Our preliminary results show that under climate and scenario uncertainties, a geoengineering intervention to avoid an overshoot would be of medium intensity in the latter half of this century (≈ 1 Mt. Pinatubo eruption every 4 years in terms of stratospheric sulfur injections). The start year of the geoengineering intervention does not significantly influence the long-term geoengineering profile. However, a geoengineering intervention of medium intensity could bring about substantial environmental side effects such as the destruction of stratospheric ozone. Our results point to the necessity of persistently pursuing mainstream mitigation efforts. 2) Pollution Abatement and Geoengineering The second study examines the potential of geoengineering combined with clean air policy. A drastic air pollution abatement might result in an abrupt warming because it would suddenly remove the tropospheric aerosols which partly offset the background global warming (e.g. Andreae et al., 2005; Raddatz and Tanaka, 2010). This study investigates the magnitude of unrealized warming under a range of policy assumptions and associated uncertainties. Then the profile of geoengineering is estimated to suppress the warming that would accompany clean air policy. This study is the first attempt to explore uncertainty in the warming caused by clean air policy; Kloster et al. (2009), which assesses regional changes in climate and the hydrological cycle, did not include associated uncertainties in the analysis. A variety of policy assumptions will be devised to represent various degrees of air pollution abatement. These

  13. SENSIT: a cross-section and design sensitivity and uncertainty analysis code. [In FORTRAN for CDC-7600, IBM 360]

    Energy Technology Data Exchange (ETDEWEB)

    Gerstl, S.A.W.

    1980-01-01

    SENSIT computes the sensitivity and uncertainty of a calculated integral response (such as a dose rate) due to input cross sections and their uncertainties. Sensitivity profiles are computed for neutron and gamma-ray reaction cross sections of standard multigroup cross section sets and for secondary energy distributions (SEDs) of multigroup scattering matrices. In the design sensitivity mode, SENSIT computes changes in an integral response due to design changes and gives the appropriate sensitivity coefficients. Cross section uncertainty analyses are performed for three types of input data uncertainties: cross-section covariance matrices for pairs of multigroup reaction cross sections, spectral shape uncertainty parameters for secondary energy distributions (integral SED uncertainties), and covariance matrices for energy-dependent response functions. For all three types of data uncertainties SENSIT computes the resulting variance and estimated standard deviation in an integral response of interest, on the basis of generalized perturbation theory. SENSIT attempts to be more comprehensive than earlier sensitivity analysis codes, such as SWANLAKE.

  14. Analysis of Uncertainty in Dynamic Processes Development of Banks Functioning

    Directory of Open Access Journals (Sweden)

    Aleksei V. Korovyakovskii

    2013-01-01

    Full Text Available The paper offers an approach to estimating the measure of uncertainty in the dynamic processes of bank functioning, using statistical data on indicators of different banking operations. To calculate the measure of uncertainty in the dynamic processes of bank functioning, the phase images of the relevant sets of statistical data are considered. It is shown that the form of the phase image of the studied data sets can serve as a basis for estimating the measure of uncertainty in the dynamic processes of bank functioning. A set of analytical characteristics is offered to formalize the definition of the phase image form. It is shown that the offered analytical characteristics account for unevenness of changes in the values of the studied data sets, which is one of the ways uncertainty manifests itself in the development of dynamic processes. Invariant estimates of the measure of uncertainty in the dynamic processes of bank functioning, which allow for significant differences in the absolute values of the same indicators across banks, were obtained. Examples of calculating the measure of uncertainty for the dynamic processes of specific banks are given.

  15. Measuring and explaining eco-efficiencies of wastewater treatment plants in China: An uncertainty analysis perspective.

    Science.gov (United States)

    Dong, Xin; Zhang, Xinyi; Zeng, Siyu

    2017-04-01

    In the context of sustainable development, there has been an increasing requirement for an eco-efficiency assessment of wastewater treatment plants (WWTPs). Data envelopment analysis (DEA), a technique that is widely applied for relative efficiency assessment, is used in combination with the tolerances approach to handle WWTPs' multiple inputs and outputs as well as their uncertainty. The economic cost, energy consumption, contaminant removal, and global warming effect during the treatment processes are integrated to interpret the eco-efficiency of WWTPs. A total of 736 sample plants from across China are assessed, and large sensitivities to variations in inputs and outputs are observed for most samples, with only three WWTPs identified as being stably efficient. Size of plant, overcapacity, climate type, and influent characteristics are proven to have a significant influence on both the mean efficiency and performance sensitivity of WWTPs, while no clear relationships were found between eco-efficiency and technology under the framework of uncertainty analysis. The incorporation of uncertainty quantification and environmental impact consideration has improved the reliability and applicability of the assessment.
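
    The DEA building block is one small linear program per plant. The sketch below solves the input-oriented CCR model for four hypothetical plants with two inputs (cost, energy) and one output (pollutant load removed); it omits the tolerances approach the paper adds for uncertainty.

        import numpy as np
        from scipy.optimize import linprog

        X = np.array([[100., 80., 120., 90.],   # inputs: rows = inputs, cols = plants
                      [ 50., 40.,  70., 45.]])
        Y = np.array([[ 90., 85., 100., 70.]])  # outputs: rows = outputs

        n = X.shape[1]
        for j0 in range(n):
            c = np.zeros(1 + n)
            c[0] = 1.0                          # minimize theta
            A1 = np.hstack([-X[:, [j0]], X])    # X @ lam <= theta * x0
            A2 = np.hstack([np.zeros((Y.shape[0], 1)), -Y])  # Y @ lam >= y0
            res = linprog(c, A_ub=np.vstack([A1, A2]),
                          b_ub=np.concatenate([np.zeros(X.shape[0]), -Y[:, j0]]),
                          bounds=[(None, None)] + [(0, None)] * n)
            print(f"plant {j0}: efficiency = {res.x[0]:.3f}")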

  16. An uncertainty and sensitivity analysis approach for GIS-based multicriteria landslide susceptibility mapping

    Science.gov (United States)

    Feizizadeh, Bakhtiar; Blaschke, Thomas

    2014-01-01

    GIS-based multicriteria decision analysis (MCDA) methods are increasingly being used in landslide susceptibility mapping. However, the uncertainties that are associated with MCDA techniques may significantly impact the results. This may sometimes lead to inaccurate outcomes and undesirable consequences. This article introduces a new GIS-based MCDA approach. We illustrate the consequences of applying different MCDA methods within a decision-making process through uncertainty analysis. Three GIS-MCDA methods in conjunction with Monte Carlo simulation (MCS) and Dempster–Shafer theory are analyzed for landslide susceptibility mapping (LSM) in the Urmia lake basin in Iran, which is highly susceptible to landslide hazards. The methodology comprises three stages. First, the LSM criteria are ranked and a sensitivity analysis is implemented to simulate error propagation based on the MCS. The resulting weights are expressed through probability density functions. Accordingly, within the second stage, three MCDA methods, namely analytical hierarchy process (AHP), weighted linear combination (WLC) and ordered weighted average (OWA), are used to produce the landslide susceptibility maps. In the third stage, accuracy assessments are carried out and the uncertainties of the different results are measured. We compare the accuracies of the three MCDA methods based on (1) the Dempster–Shafer theory and (2) a validation of the results using an inventory of known landslides and their respective coverage based on object-based image analysis of IRS-ID satellite images. The results of this study reveal that through the integration of GIS and MCDA models, it is possible to identify strategies for choosing an appropriate method for LSM. Furthermore, our findings indicate that the integration of MCDA and MCS can significantly improve the accuracy of the results. In LSM, the AHP method performed best, while the OWA reveals better performance in the reliability assessment. The WLC
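
    For the AHP step, criterion weights come from the principal eigenvector of a pairwise comparison matrix, with Saaty's consistency ratio as a sanity check; in the Monte Carlo variant described above, the comparisons themselves would be sampled. The 3x3 matrix below is hypothetical.

        import numpy as np

        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 3.0],
                      [1/5, 1/3, 1.0]])   # e.g. slope vs. lithology vs. land cover

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()                      # normalized criterion weights

        n = A.shape[0]
        ci = (eigvals.real[k] - n) / (n - 1)     # consistency index
        ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]      # Saaty's random index
        print("weights:", np.round(w, 3), " CR =", round(ci / ri, 3))  # CR < 0.1 ok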

  17. Petroleum taxation under uncertainty: contingent claims analysis with an application to Norway

    International Nuclear Information System (INIS)

    Lund, D.

    1992-01-01

    Contingent claims analysis provides a useful tool for analysing the value of tax claims, and of companies' after-tax values, under uncertainty. The method is presented and applied to the analysis of how Norwegian petroleum taxes affect company behaviour. Few results can be derived analytically. A numerical approach is suggested, with a stylized description of production possibilities. The Norwegian taxes are found to have strong distortionary effects. The relation to other methods and problems connected with the application are discussed. (Author)

  18. Achieving 95% probability level using best estimate codes and the code scaling, applicability and uncertainty (CSAU) [Code Scaling, Applicability and Uncertainty] methodology

    International Nuclear Information System (INIS)

    Wilson, G.E.; Boyack, B.E.; Duffey, R.B.; Griffith, P.; Katsma, K.R.; Lellouche, G.S.; Rohatgi, U.S.; Wulff, W.; Zuber, N.

    1988-01-01

    Issuance of a revised rule for loss of coolant accident/emergency core cooling system (LOCA/ECCS) analysis of light water reactors will allow the use of best estimate (BE) computer codes in safety analysis, with uncertainty analysis. This paper describes a systematic methodology, CSAU (Code Scaling, Applicability and Uncertainty), which will provide uncertainty bounds in a cost effective, auditable, rational and practical manner. 8 figs., 2 tabs

  19. Estimation of environment-related properties of chemicals for design of sustainable processes: Development of group-contribution+ (GC+) models and uncertainty analysis

    DEFF Research Database (Denmark)

    Hukkerikar, Amol; Kalakul, Sawitree; Sarup, Bent

    2012-01-01

    The aim of this work is to develop group-contribution+ (GC+) method (combined group-contribution (GC) method and atom connectivity index (CI)) based property models to provide reliable estimations of environment-related properties of organic chemicals together with uncertainties of estimated...... property values. For this purpose, a systematic methodology for property modeling and uncertainty analysis is used. The methodology includes a parameter estimation step to determine parameters of property models and an uncertainty analysis step to establish statistical information about the quality......, poly functional chemicals, etc.) taken from the database of the US Environmental Protection Agency (EPA) and from the database of USEtox is used. For property modeling and uncertainty analysis, the Marrero and Gani GC method and atom connectivity index method have been considered. In total, 22

  20. Uncertainty Quantification with Applications to Engineering Problems

    DEFF Research Database (Denmark)

    Bigoni, Daniele

    in measurements, predictions and manufacturing, and we can say that any dynamical system used in engineering is subject to some of these uncertainties. The first part of this work presents an overview of the mathematical framework used in Uncertainty Quantification (UQ) analysis and introduces the spectral tensor...... and thus the UQ analysis of the associated systems will benefit greatly from the application of methods which require few function evaluations. We first consider the propagation of the uncertainty and the sensitivity analysis of the non-linear dynamics of railway vehicles with suspension components whose......-scale problems, where efficient methods are necessary with today’s computational resources. The outcome of this work was also the creation of several freely available Python modules for Uncertainty Quantification, which are listed and described in the appendix....