WorldWideScience

Sample records for uncertainty analysis code

  1. Automated uncertainty analysis methods in the FRAP computer codes

    International Nuclear Information System (INIS)

    Peck, S.O.

    1980-01-01

    A user oriented, automated uncertainty analysis capability has been incorporated in the Fuel Rod Analysis Program (FRAP) computer codes. The FRAP codes have been developed for the analysis of Light Water Reactor fuel rod behavior during steady state (FRAPCON) and transient (FRAP-T) conditions as part of the United States Nuclear Regulatory Commission's Water Reactor Safety Research Program. The objective of uncertainty analysis of these codes is to obtain estimates of the uncertainty in computed outputs of the codes as a function of known uncertainties in input variables. This paper presents the methods used to generate an uncertainty analysis of a large computer code, discusses the assumptions that are made, and shows techniques for testing them. An uncertainty analysis of FRAP-T calculated fuel rod behavior during a hypothetical loss-of-coolant transient is presented as an example and carried through the discussion to illustrate the various concepts.

  2. Uncertainty analysis of the FRAP code

    International Nuclear Information System (INIS)

    Peck, S.O.

    1978-01-01

    A user oriented, automated uncertainty analysis capability has been built into the FRAP code (Fuel Rod Analysis Program) and applied to a PWR fuel rod undergoing a LOCA. The method of uncertainty analysis is the Response Surface Method (RSM). (author)

  3. Uncertainty analysis of the FRAP code

    International Nuclear Information System (INIS)

    Peck, S.O.

    1978-01-01

    A user oriented, automated uncertainty analysis capability has been built into the Fuel Rod Analysis Program (FRAP) code and has been applied to a pressurized water reactor (PWR) fuel rod undergoing a loss-of-coolant accident (LOCA). The method of uncertainty analysis is the response surface method. The automated version significantly reduced the time required to complete the analysis and, at the same time, greatly increased the problem scope. Results of the analysis showed a significant difference in the total and relative contributions to the uncertainty of the response parameters between steady state and transient conditions.
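
    The response surface method referred to in these two records replaces the expensive code with a cheap fitted surrogate before propagating input uncertainties. Below is a minimal Python sketch of that workflow under invented assumptions: code_run stands in for a single FRAP-like calculation, and the design, coefficients and input distributions are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)

# Invented stand-in for one expensive code run (e.g. FRAP-T): two normalized
# uncertain inputs in, one scalar output (say, a peak clad temperature) out.
def code_run(x1, x2):
    return 1200.0 + 80.0 * x1 + 30.0 * x2 + 15.0 * x1 * x2

# 1. Run the code over a small factorial design spanning the input ranges.
design = [(x1, x2) for x1 in (-1.0, 0.0, 1.0) for x2 in (-1.0, 0.0, 1.0)]
X = np.array([[1.0, x1, x2, x1 * x2] for x1, x2 in design])
y = np.array([code_run(x1, x2) for x1, x2 in design])

# 2. Fit the response surface (here a bilinear polynomial) by least squares.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# 3. Monte Carlo on the cheap surface instead of the expensive code.
x1s, x2s = rng.normal(0.0, 0.3, (2, 100_000))
surface = beta[0] + beta[1] * x1s + beta[2] * x2s + beta[3] * x1s * x2s
print(f"output mean = {surface.mean():.1f}, std = {surface.std():.1f}")
```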

  4. Code development for eigenvalue total sensitivity analysis and total uncertainty analysis

    International Nuclear Information System (INIS)

    Wan, Chenghui; Cao, Liangzhi; Wu, Hongchun; Zu, Tiejun; Shen, Wei

    2015-01-01

    Highlights: • We develop a new code for total sensitivity and uncertainty analysis. • The implicit effects of cross sections can be considered. • The results of our code agree well with TSUNAMI-1D. • Detailed analysis of the origins of implicit effects is performed. - Abstract: The uncertainties of multigroup cross sections notably impact the eigenvalue of the neutron-transport equation. We report on a total sensitivity analysis and total uncertainty analysis code named UNICORN that has been developed by applying the direct numerical perturbation method and the statistical sampling method. In order to consider the contributions of various basic cross sections and the implicit effects, which are indirect results of multigroup cross sections through the resonance self-shielding calculation, an improved multigroup cross-section perturbation model is developed. The DRAGON 4.0 code, with application of the WIMSD-4 format library, is used by UNICORN to carry out the resonance self-shielding and neutron-transport calculations. In addition, the bootstrap technique has been applied to the statistical sampling method in UNICORN to obtain steadier and more reliable uncertainty results. The UNICORN code has been verified against TSUNAMI-1D by analyzing a TMI-1 pin-cell case. The numerical results show that the total uncertainty of the eigenvalue caused by cross sections can reach about 0.72%. Therefore, the contributions of the basic cross sections and their implicit effects are not negligible.
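
    A rough illustration of the bootstrap step mentioned above: given a set of sampled k_eff results, resampling with replacement yields a steadier estimate of the relative uncertainty together with an error bar on that estimate. The sample values below are invented; this is a sketch of the technique, not UNICORN code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented k_eff results from 200 statistically sampled cross-section sets;
# the spread mimics the ~0.72% figure quoted in the abstract.
keff = rng.normal(1.000, 0.0072, 200)

# Bootstrap: resample with replacement and recompute the relative
# uncertainty each time, giving both a stabilized estimate and its error bar.
boot = np.array([
    sample.std(ddof=1) / sample.mean()
    for sample in (keff[rng.integers(0, keff.size, keff.size)]
                   for _ in range(2000))
])
print(f"relative uncertainty = {boot.mean():.2%} +/- {boot.std(ddof=1):.2%}")
```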

  5. Code development of total sensitivity and uncertainty analysis for reactor physics calculations

    International Nuclear Information System (INIS)

    Wan, C.; Cao, L.; Wu, H.; Zu, T.; Shen, W.

    2015-01-01

    Sensitivity and uncertainty analysis are essential parts of risk and policy analysis for reactor systems. In this study, a total sensitivity and corresponding uncertainty analysis capability for the responses of neutronics calculations has been developed in the S&U analysis code named UNICORN. The UNICORN code can consider the implicit effects of multigroup cross sections on the responses. The UNICORN code has been applied to a typical pin-cell case in this paper, and its correctness is demonstrated by comparing the results with those of the TSUNAMI-1D code. (author)

  6. Code development of total sensitivity and uncertainty analysis for reactor physics calculations

    Energy Technology Data Exchange (ETDEWEB)

    Wan, C.; Cao, L.; Wu, H.; Zu, T., E-mail: chenghuiwan@stu.xjtu.edu.cn, E-mail: caolz@mail.xjtu.edu.cn, E-mail: hongchun@mail.xjtu.edu.cn, E-mail: tiejun@mail.xjtu.edu.cn [Xi' an Jiaotong Univ., School of Nuclear Science and Technology, Xi' an (China); Shen, W., E-mail: Wei.Shen@cnsc-ccsn.gc.ca [Xi' an Jiaotong Univ., School of Nuclear Science and Technology, Xi' an (China); Canadian Nuclear Safety Commission, Ottawa, ON (Canada)

    2015-07-01

    Sensitivity and uncertainty analysis are essential parts of risk and policy analysis for reactor systems. In this study, a total sensitivity and corresponding uncertainty analysis capability for the responses of neutronics calculations has been developed in the S&U analysis code named UNICORN. The UNICORN code can consider the implicit effects of multigroup cross sections on the responses. The UNICORN code has been applied to a typical pin-cell case in this paper, and its correctness is demonstrated by comparing the results with those of the TSUNAMI-1D code. (author)

  7. Achieving 95% probability level using best estimate codes and the Code Scaling, Applicability and Uncertainty (CSAU) methodology

    International Nuclear Information System (INIS)

    Wilson, G.E.; Boyack, B.E.; Duffey, R.B.; Griffith, P.; Katsma, K.R.; Lellouche, G.S.; Rohatgi, U.S.; Wulff, W.; Zuber, N.

    1988-01-01

    Issuance of a revised rule for loss-of-coolant accident/emergency core cooling system (LOCA/ECCS) analysis of light water reactors will allow the use of best estimate (BE) computer codes in safety analysis, supplemented by uncertainty analysis. This paper describes a systematic methodology, CSAU (Code Scaling, Applicability and Uncertainty), which provides uncertainty bounds in a cost-effective, auditable, rational and practical manner. 8 figs., 2 tabs.

  8. Uncertainty and sensitivity analysis applied to coupled code calculations for a VVER plant transient

    International Nuclear Information System (INIS)

    Langenbuch, S.; Krzykacz-Hausmann, B.; Schmidt, K. D.

    2004-01-01

    The development of coupled codes, combining thermal-hydraulic system codes and 3D neutron kinetics, is an important step towards performing best-estimate plant transient calculations. It is generally agreed that the application of best-estimate methods should be supplemented by an uncertainty and sensitivity analysis to quantify the uncertainty of the results. The paper presents results from the application of the GRS uncertainty and sensitivity method for a VVER-440 plant transient, which had already been studied earlier for the validation of coupled codes. For this application, the main steps of the uncertainty method are described. Typical results of the method applied to the analysis of the plant transient by several working groups using different coupled codes are presented and discussed. The results demonstrate the capability of an uncertainty and sensitivity analysis. (authors)

  9. Using finite mixture models in thermal-hydraulics system code uncertainty analysis

    Energy Technology Data Exchange (ETDEWEB)

    Carlos, S., E-mail: scarlos@iqn.upv.es [Department d’Enginyeria Química i Nuclear, Universitat Politècnica de València, Camí de Vera s.n, 46022 València (Spain); Sánchez, A. [Department d’Estadística Aplicada i Qualitat, Universitat Politècnica de València, Camí de Vera s.n, 46022 València (Spain); Ginestar, D. [Department de Matemàtica Aplicada, Universitat Politècnica de València, Camí de Vera s.n, 46022 València (Spain); Martorell, S. [Department d’Enginyeria Química i Nuclear, Universitat Politècnica de València, Camí de Vera s.n, 46022 València (Spain)

    2013-09-15

    Highlights: • Best estimate code simulation needs uncertainty quantification. • The output variables can present multimodal probability distributions. • The analysis of multimodal distributions is performed using finite mixture models. • Two methods to reconstruct the output variable probability distribution are used. -- Abstract: Nuclear power plant safety analysis is mainly based on the use of best estimate (BE) codes that predict the plant behavior under normal or accidental conditions. As the BE codes introduce uncertainties, due to uncertainty in input parameters and modeling, it is necessary to perform uncertainty assessment (UA), and eventually sensitivity analysis (SA), of the results obtained. These analyses are part of the appropriate treatment of uncertainties imposed by current regulation based on the adoption of the best estimate plus uncertainty (BEPU) approach. The most popular approach for uncertainty assessment, based on Wilks’ method, obtains a tolerance/confidence interval, but it does not completely characterize the output variable behavior, which is required for an extended UA and SA. However, the development of standard UA and SA imposes a high computational cost due to the large number of simulations needed. In order to obtain more information about the output variable and, at the same time, to keep the computational cost as low as possible, there has been a recent shift toward developing metamodels (models of models), or surrogate models, that approximate or emulate complex computer codes. Different techniques exist to reconstruct the probability distribution using the information provided by a sample of values, for example, finite mixture models. In this paper, the Expectation Maximization and the k-means algorithms are used to obtain a finite mixture model that reconstructs the output variable probability distribution from data obtained with RELAP-5 simulations. Both methodologies have been applied to a separated
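
    The following sketch shows the two reconstruction routes named in the abstract, using scikit-learn's KMeans and GaussianMixture (EM) on an invented bimodal sample standing in for a code output; it illustrates the technique, not the authors' implementation.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)

# Invented bimodal sample standing in for a code output variable (e.g. a
# peak clad temperature, in K) collected over a batch of RELAP5-like runs.
sample = np.concatenate([rng.normal(900.0, 25.0, 70),
                         rng.normal(1050.0, 40.0, 30)]).reshape(-1, 1)

# k-means: hard assignment of runs to modes.
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(sample)

# Expectation-Maximization: a two-component Gaussian mixture, i.e. a full
# reconstruction of the multimodal probability density of the output.
gm = GaussianMixture(n_components=2, random_state=0).fit(sample)

print("k-means centers:", np.sort(km.cluster_centers_.ravel()))
print("EM means:", np.sort(gm.means_.ravel()), "weights:", gm.weights_)
```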

  10. Uncertainty and sensitivity analysis in the scenario simulation with RELAP/SCDAP and MELCOR codes

    International Nuclear Information System (INIS)

    Garcia J, T.; Cardenas V, J.

    2015-09-01

    A methodology was implemented for uncertainty analysis in simulations of scenarios with the RELAP/SCDAP V-3.4 bi-7 and MELCOR V-2.1 codes, the same codes used to perform safety analysis in the Comision Nacional de Seguridad Nuclear y Salvaguardias (CNSNS). The uncertainty analysis methodology chosen is a probabilistic method of the propagation-of-uncertainty type, from the input parameters to the output parameters. Therefore, it began with the selection of the input parameters considered uncertain and of high importance in the scenario for their direct effect on the output variable of interest. These parameters were randomly sampled according to intervals of variation or probability distribution functions assigned by expert judgment, to generate a set of input files that were run through the simulation code to propagate the uncertainty to the output parameters. Then, through the use of ordered statistics and the Wilks formula, it was determined that the minimum number of executions required to obtain uncertainty bands that include 95% of the population at a confidence level of 95% is 93; it is important to mention that in this method the number of executions does not depend on the number of selected input parameters. Routines implemented in Fortran 90 were generated to automate the uncertainty analysis process in transients for the RELAP/SCDAP code. In the case of the MELCOR code for severe accident analysis, automation was carried out through the Dakota Uncertainty plug-in incorporated into the SNAP platform. To test the practical application of this methodology, two analyses were performed: the first with the simulation of a closure transient of the main steam isolation valves using the RELAP/SCDAP code, obtaining the uncertainty band of the vessel dome pressure; while in the second analysis, the simulation of a station blackout (SBO) accident was carried out with the MELCOR code, obtaining the uncertainty band for the
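
    The figure of 93 runs quoted above follows from the standard first-order Wilks formulas for one- and two-sided tolerance limits, which the short sketch below reproduces; the function name is ours.

```python
# First-order Wilks bounds: smallest N such that the sample extremes cover a
# fraction beta of the output population with confidence gamma.
def wilks_runs(beta=0.95, gamma=0.95, two_sided=True):
    n = 1
    while True:
        if two_sided:
            # Confidence that more than beta of the population lies between
            # the smallest and largest of n code results.
            conf = 1.0 - beta**n - n * (1.0 - beta) * beta**(n - 1)
        else:
            conf = 1.0 - beta**n
        if conf >= gamma:
            return n
        n += 1

print(wilks_runs(two_sided=False))  # 59 runs for a one-sided 95%/95% limit
print(wilks_runs(two_sided=True))   # 93 runs, the number cited in the abstract
```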

  11. Sensitivity Analysis and Uncertainty Quantification for the LAMMPS Molecular Dynamics Simulation Code

    Energy Technology Data Exchange (ETDEWEB)

    Picard, Richard Roy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Bhat, Kabekode Ghanasham [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-07-18

    We examine sensitivity analysis and uncertainty quantification for molecular dynamics simulation. Extreme (large or small) output values for the LAMMPS code often occur at the boundaries of input regions, and uncertainties in those boundary values are overlooked by common SA methods. Similarly, input values for which code outputs are consistent with calibration data can also occur near boundaries. Upon applying approaches in the literature for imprecise probabilities (IPs), much more realistic results are obtained than for the complacent application of standard SA and code calibration.

  12. Methods and computer codes for probabilistic sensitivity and uncertainty analysis

    International Nuclear Information System (INIS)

    Vaurio, J.K.

    1985-01-01

    This paper describes the methods and applications experience with two computer codes that are now available from the National Energy Software Center at Argonne National Laboratory. The purpose of the SCREEN code is to identify a group of the most important input variables of a code that has many (tens, hundreds) input variables with uncertainties, and to do this without relying on judgment or exhaustive sensitivity studies. The purpose of the PROSA-2 code is to propagate uncertainties and calculate the distributions of interesting output variable(s) of a safety analysis code using response surface techniques, based on the same runs used for screening. Several applications are discussed, but the codes are generic, not tailored to any specific safety application code. They are compatible in terms of input/output requirements but also independent of each other; e.g., PROSA-2 can be used without first using SCREEN if a set of important input variables has been selected by other methods. Also, although SCREEN can select cases to be run (by random sampling), a user can select cases by other methods if he so prefers, and still use the rest of SCREEN for identifying important input variables.
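
    As a generic illustration of the screening step (the actual SCREEN algorithm is not described in this abstract), the sketch below ranks the inputs of a hypothetical many-input code by the magnitude of their Spearman rank correlation with the output, computed from one batch of randomly sampled runs that could later feed a response-surface propagation.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(2)

# Hypothetical code with ten inputs, only a few of which drive the output.
def code_output(x):
    return 5.0 * x[:, 0] - 2.0 * x[:, 3] + 0.2 * x[:, 7]

n_runs, n_inputs = 100, 10
x = rng.uniform(-1.0, 1.0, (n_runs, n_inputs))
y = code_output(x) + rng.normal(0.0, 0.1, n_runs)

# Rank inputs by |Spearman rank correlation| with the output.
rho = np.array([spearmanr(x[:, j], y)[0] for j in range(n_inputs)])
for j in np.argsort(-np.abs(rho)):
    print(f"input {j}: |rho| = {abs(rho[j]):.2f}")
```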

  13. Quantifying reactor safety margins: Application of CSAU [Code Scaling, Applicability and Uncertainty] methodology to LBLOCA: Part 3, Assessment and ranging of parameters for the uncertainty analysis of LBLOCA codes

    International Nuclear Information System (INIS)

    Wulff, W.; Boyack, B.E.; Duffey, R.B.

    1988-01-01

    Comparisons of results from TRAC-PF1/MOD1 code calculations with measurements from Separate Effects Tests, and published experimental data for modeling parameters, have been used to determine the uncertainty ranges of the code input and modeling parameters which dominate the uncertainty in predicting the Peak Clad Temperature for a postulated Large Break Loss of Coolant Accident (LBLOCA) in a four-loop Westinghouse Pressurized Water Reactor. The uncertainty ranges are used for a detailed statistical analysis to calculate the probability distribution function for the TRAC code-predicted Peak Clad Temperature, as described in an attendant paper. Measurements from Separate Effects Tests and Integral Effects Tests have been compared with results from corresponding TRAC-PF1/MOD1 code calculations to determine globally the total uncertainty in predicting the Peak Clad Temperature for LBLOCAs. This determination is in support of the detailed statistical analysis mentioned above. The analyses presented here account for uncertainties in input parameters, in modeling and scaling, in computing and in measurements. The analyses are an important part of the work needed to implement the Code Scaling, Applicability and Uncertainty (CSAU) methodology. CSAU is needed to determine the suitability of a computer code for reactor safety analyses and the uncertainty in computer predictions. The results presented here are used to estimate the safety margin of a particular nuclear reactor power plant for a postulated accident. 25 refs., 10 figs., 11 tabs.

  14. Coupled code analysis of uncertainty and sensitivity of Kalinin-3 benchmark

    Energy Technology Data Exchange (ETDEWEB)

    Pasichnyk, Ihor; Zwermann, Winfried; Velkov, Kiril [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) gGmbH, Garching (Germany); Nikonov, Sergey [VNIIAES, Moscow (Russian Federation)

    2016-09-15

    An uncertainty and sensitivity analysis is performed for the OECD/NEA coolant transient benchmark (K-3) on measured data at the Kalinin-3 Nuclear Power Plant (NPP). A switch-off of one main coolant pump (MCP) at nominal reactor power is calculated using the coupled thermal-hydraulic and neutron-kinetic code ATHLET-PARCS. The objectives are to study the uncertainty of the total reactor power and to identify the main sources of reactor power uncertainty. The GRS uncertainty and sensitivity software package XSUSA is applied to propagate uncertainties in nuclear data libraries to the full core coupled transient calculations. A set of the most important thermal-hydraulic parameters of the primary circuit is identified, and a total of 23 thermal-hydraulic parameters are statistically varied using the GRS code SUSA. The ATHLET model also contains a balance-of-plant (BOP) model which is simulated using the ATHLET GCSM module. In particular, the operation of the main steam generator regulators is modelled in detail. A set of 200 varied coupled ATHLET-PARCS calculations is analyzed. The results obtained show a clustering effect in the behavior of global reactor parameters. It is found that the GCSM system together with the varied input parameters strongly influences the overall nuclear power plant behavior and can even lead to a new scenario. Possible reasons for the clustering effect are discussed in the paper. This work is a step forward in establishing a "best-estimate calculations in combination with performing uncertainty analysis" methodology for coupled full core calculations.

  15. Qualification and application of nuclear reactor accident analysis code with the capability of internal assessment of uncertainty

    International Nuclear Information System (INIS)

    Borges, Ronaldo Celem

    2001-10-01

    This thesis presents an independent qualification of the CIAU code ('Code with the capability of Internal Assessment of Uncertainty'), which is part of the internal uncertainty evaluation process with a thermal hydraulic system code on a realistic basis. This is done by combining the uncertainty methodology UMAE ('Uncertainty Methodology based on Accuracy Extrapolation') with the RELAP5/Mod3.2 code. This allows associating uncertainty band estimates with the results obtained by the realistic calculation of the code, meeting licensing requirements of safety analysis. The independent qualification is supported by simulations with RELAP5/Mod3.2 related to accident condition tests of the LOBI experimental facility and to an event which occurred in the Angra 1 nuclear power plant, by comparison with measured results and by establishing uncertainty bands on calculated time trends of safety parameters. These bands have indeed enveloped the measured trends. Results from this independent qualification of CIAU have made it possible to ascertain the adequate application of a systematic realistic code procedure to analyse accidents with uncertainties incorporated in the results, although there is an evident need to extend the uncertainty data base. It has been verified that use of the code with this internal assessment of uncertainty is feasible in the design and licensing stages of an NPP. (author)

  16. Uncertainty analysis guide

    International Nuclear Information System (INIS)

    Andres, T.H.

    2002-05-01

    This guide applies to the estimation of uncertainty in quantities calculated by scientific analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner by which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)
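
    For the combination step described above, a minimal sketch in the spirit of the ISO Guide: random (input-propagated) and systematic (model) standard uncertainties are combined in quadrature. The component values are invented for illustration.

```python
import math

# Combined standard uncertainty of a model output from its random and
# systematic components (illustrative values, e.g. in kelvin).
u_random = 12.0      # K, e.g. from Monte Carlo propagation of input pdfs
u_systematic = 8.0   # K, e.g. estimated from model simplifications
u_combined = math.hypot(u_random, u_systematic)  # sqrt(u_r**2 + u_s**2)
print(f"combined standard uncertainty = {u_combined:.1f} K")  # ~14.4 K
```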

  17. Uncertainty analysis guide

    Energy Technology Data Exchange (ETDEWEB)

    Andres, T.H.

    2002-05-01

    This guide applies to the estimation of uncertainty in quantities calculated by scientific analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner by which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)

  18. An estimation of uncertainties in containment P/T analysis using CONTEMPT/LT code

    International Nuclear Information System (INIS)

    Kang, Y.M.; Park, G.C.; Lee, U.C.; Kang, C.S.

    1991-01-01

    In a nuclear power plant, the containment design pressure and temperature (P/T) have traditionally been established on the basis of unrealistic conservatism, at a cost in economics. Thus, the uncertainties of the design P/T values have to be well defined through an extensive uncertainty analysis with plant-specific input data and models used in the computer code. This study estimates plant-specific uncertainties of the containment design P/T for the Kori-3 reactor using the Monte Carlo method. The Kori-3 plant parameters and the Uchida heat transfer coefficient were selected for statistical treatment after a sensitivity study. The Monte Carlo analysis was performed with the response surface method, using the CONTEMPT/LT code and the Latin hypercube sampling technique. Finally, the design values based on 95%/95% probability are compared with worst estimated values to assess the design margin. (author)
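
    A small sketch of the Latin hypercube sampling step used in such an analysis: each input range is split into equal-probability strata with exactly one sample per stratum, then mapped through inverse CDFs. The two inputs below are invented examples, not the actual Kori-3 parameter set.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)

def latin_hypercube(n_samples, n_dims, rng):
    """One stratified uniform sample per equal-probability bin per dimension."""
    perms = rng.permuted(np.tile(np.arange(n_samples), (n_dims, 1)), axis=1).T
    return (perms + rng.uniform(size=(n_samples, n_dims))) / n_samples

u = latin_hypercube(59, 2, rng)
# Map the uniform strata onto the physical input distributions via inverse
# CDFs; names, means and spreads here are purely illustrative.
htc_multiplier = norm.ppf(u[:, 0], loc=1.0, scale=0.2)    # heat-transfer mult.
initial_pressure = norm.ppf(u[:, 1], loc=15.5, scale=0.3)  # MPa
print(htc_multiplier[:3], initial_pressure[:3])
```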

  19. SENSIT: a cross-section and design sensitivity and uncertainty analysis code

    International Nuclear Information System (INIS)

    Gerstl, S.A.W.

    1980-01-01

    SENSIT computes the sensitivity and uncertainty of a calculated integral response (such as a dose rate) due to input cross sections and their uncertainties. Sensitivity profiles are computed for neutron and gamma-ray reaction cross sections of standard multigroup cross section sets and for secondary energy distributions (SEDs) of multigroup scattering matrices. In the design sensitivity mode, SENSIT computes changes in an integral response due to design changes and gives the appropriate sensitivity coefficients. Cross section uncertainty analyses are performed for three types of input data uncertainties: cross-section covariance matrices for pairs of multigroup reaction cross sections, spectral shape uncertainty parameters for secondary energy distributions (integral SED uncertainties), and covariance matrices for energy-dependent response functions. For all three types of data uncertainties SENSIT computes the resulting variance and estimated standard deviation in an integral response of interest, on the basis of generalized perturbation theory. SENSIT attempts to be more comprehensive than earlier sensitivity analysis codes, such as SWANLAKE.
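
    Propagating a covariance matrix through a sensitivity profile, as described above, rests on the first-order "sandwich" rule, variance = S^T C S. The toy numbers below only illustrate the arithmetic; they are not SENSIT data.

```python
import numpy as np

# S holds relative sensitivities of the integral response to three reaction
# cross sections; C is their relative covariance matrix (invented values).
S = np.array([0.8, -0.3, 0.1])
C = 1e-4 * np.array([[4.0, 1.0, 0.0],
                     [1.0, 9.0, 2.0],
                     [0.0, 2.0, 1.0]])

variance = S @ C @ S  # first-order propagation ("sandwich" rule)
print(f"relative standard deviation of the response: {np.sqrt(variance):.3%}")
```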

  20. Uncertainty analysis for results of thermal hydraulic codes of best-estimate-type

    International Nuclear Information System (INIS)

    Alva N, J.

    2010-01-01

    In this thesis, some fundamental knowledge is presented about uncertainty analysis and about the diverse methodologies applied in the study of nuclear power plant transient events, particularly those related to thermal hydraulics phenomena. These concepts and methodologies come from a wide bibliographical research in the nuclear power subject. Methodologies for uncertainty analysis have been developed by quite diverse institutions, and they have been widely used worldwide for application to results from best-estimate-type computer codes in nuclear reactor thermal hydraulics and safety analysis. Also, the main uncertainty sources, types of uncertainties, and aspects related to best estimate modeling and methods are introduced. Once the main bases of uncertainty analysis have been set, and some of the known methodologies have been introduced, the CSAU methodology, which is applied in the analyses, is presented in detail. The main objective of this thesis is to compare the results of an uncertainty and sensitivity analysis using the Response Surface Technique with those obtained by applying the Wilks formula, through a loss-of-coolant experiment and a power-rise event in a BWR. Both techniques are options in the uncertainty and sensitivity analysis part of the CSAU methodology, which was developed for the analysis of transients and accidents at nuclear power plants, and is the basis of most of the methodologies used in the licensing of nuclear power plants practically everywhere. Finally, the results of applying both techniques are compared and discussed. (Author)

  1. BWR transient analysis using neutronic / thermal hydraulic coupled codes including uncertainty quantification

    International Nuclear Information System (INIS)

    Hartmann, C.; Sanchez, V.; Tietsch, W.; Stieglitz, R.

    2012-01-01

    The KIT is involved in the development and qualification of best estimate methodologies for BWR transient analysis in cooperation with industrial partners. The goal is to establish the most advanced thermal hydraulic system codes coupled with 3D reactor dynamic codes to be able to perform a more realistic evaluation of the BWR behavior under accidental conditions. For this purpose a computational chain based on the lattice code (SCALE6/GenPMAXS), the coupled neutronic/thermal hydraulic code (TRACE/PARCS), as well as a Monte Carlo based uncertainty and sensitivity package (SUSA), has been established and applied to different kinds of transients of a Boiling Water Reactor (BWR). This paper describes the multidimensional models of the plant elaborated for TRACE and PARCS to perform the investigations mentioned before. For the uncertainty quantification of the coupled code TRACE/PARCS, and specifically to take into account the influence of the kinetics parameters in such studies, the PARCS code has been extended to facilitate the change of model parameters in such a way that the SUSA package can be used in connection with TRACE/PARCS for the uncertainty and sensitivity studies. This approach is presented in detail. The results obtained for a rod drop transient with TRACE/PARCS using the SUSA methodology clearly showed the importance of some kinetic parameters on the transient progression, demonstrating that the coupling of best-estimate coupled codes with uncertainty and sensitivity tools is very promising and of great importance for the safety assessment of nuclear reactors. (authors)

  2. Sensitivity Analysis of Uncertainty Parameter based on MARS-LMR Code on SHRT-45R of EBR II

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Seok-Ju; Kang, Doo-Hyuk; Seo, Jae-Seung [System Engineering and Technology Co., Daejeon (Korea, Republic of); Bae, Sung-Won [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Jeong, Hae-Yong [Sejong University, Seoul (Korea, Republic of)

    2016-10-15

    In order to assess the uncertainty quantification of the MARS-LMR code, the code has been improved by modifying the source code to accommodate the calculation process required for uncertainty quantification. In the present study, a transient of Unprotected Loss of Flow (ULOF) is selected as a typical case of an Anticipated Transient without Scram (ATWS), which belongs to the DEC category. The MARS-LMR input generation for EBR-II SHRT-45R and the execution work are performed by using the PAPIRUS program. The sensitivity analysis is carried out with the uncertainty parameters of the MARS-LMR code for EBR-II SHRT-45R. Based on the results of the sensitivity analysis, dominant parameters with large sensitivity to the FoM are picked out. The dominant parameters selected are closely related to the development process of the ULOF event.

  3. Validation uncertainty of MATRA code for subchannel void distributions

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Dae-Hyun; Kim, S. J.; Kwon, H.; Seo, K. W. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-10-15

    To extend the code capability to whole-core subchannel analysis, pre-conditioned Krylov matrix solvers such as BiCGSTAB and GMRES are implemented in the MATRA code, as well as parallel computing algorithms using MPI and OpenMP. It is coded in Fortran 90 and has some user-friendly features such as a graphical user interface. The MATRA code was approved by the Korean regulatory body for design calculations of the integral-type PWR named SMART. The major role of the subchannel code is to evaluate the core thermal margin through hot channel analysis and uncertainty evaluation for CHF predictions. In addition, it is potentially used for best estimation of the core thermal-hydraulic field by incorporation into multi-physics and/or multi-scale code systems. In this study we examined a validation process for the subchannel code MATRA, specifically in the prediction of subchannel void distributions. The primary objective of validation is to estimate a range within which the simulation modeling error lies. The experimental data for subchannel void distributions at steady-state and transient conditions were provided in the framework of the OECD/NEA UAM benchmark program. The validation uncertainty of the MATRA code was evaluated for a specific experimental condition by comparing the simulation result and experimental data. A validation process should be preceded by code and solution verification; however, quantification of verification uncertainty was not addressed in this study. The validation uncertainty of the MATRA code for predicting subchannel void distribution was evaluated for a single data point of void fraction measurement in a 5x5 PWR test bundle in the framework of the OECD UAM benchmark program. The validation standard uncertainties were evaluated as 4.2%, 3.9%, and 2.8% with the Monte-Carlo approach at the axial levels of 2216 mm, 2669 mm, and 3177 mm, respectively. The sensitivity coefficient approach revealed similar results of uncertainties but did not account for the nonlinear effects on the

  4. Development of Evaluation Code for MUF Uncertainty

    International Nuclear Information System (INIS)

    Won, Byung Hee; Han, Bo Young; Shin, Hee Sung; Ahn, Seong-Kyu; Park, Geun-Il; Park, Se Hwan

    2015-01-01

    Material Unaccounted For (MUF) is the material balance evaluated from measured nuclear material in a Material Balance Area (MBA). Assuming perfect measurements and no diversion from a facility, one can expect a zero MUF. However, a non-zero MUF always occurs because of measurement uncertainty, even when the facility is under normal operating conditions. Furthermore, there are many measurements using different equipment at various Key Measurement Points (KMPs), and the MUF uncertainty is affected by the errors of those measurements. Evaluating MUF uncertainty is essential for developing a safeguards system, including the nuclear measurement system, for pyroprocessing, which is being developed for reducing radioactive waste from spent fuel at the Korea Atomic Energy Research Institute (KAERI). An evaluation code for analyzing MUF uncertainty has been developed, and it was verified using a sample problem from an IAEA reference. MUF uncertainty can be calculated simply and quickly by using this evaluation code, which is built on a graphical user interface for user friendliness. It is also expected that the code will make sensitivity analysis of the MUF uncertainty for various safeguards systems easier and more systematic. It is suitable for users who want to evaluate a conventional safeguards system as well as to develop a new system for facilities under development.
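
    A minimal sketch of the balance underlying MUF and its uncertainty, assuming independent measurement errors at each KMP; all masses and sigmas below are invented, and the actual KAERI code certainly treats correlated and systematic error terms in more detail.

```python
import numpy as np

# MUF = (beginning inventory + inputs) - (outputs + ending inventory).
beginning, inputs, outputs, ending = 100.0, 50.0, 48.0, 101.5  # kg
sigmas = np.array([0.4, 0.3, 0.3, 0.4])  # 1-sigma measurement errors (kg)

muf = (beginning + inputs) - (outputs + ending)
sigma_muf = np.sqrt(np.sum(sigmas**2))   # propagation for a linear balance
print(f"MUF = {muf:.2f} kg, sigma_MUF = {sigma_muf:.2f} kg")
print("significant" if abs(muf) > 3 * sigma_muf else "consistent with zero")
```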

  5. Development of Evaluation Code for MUF Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Won, Byung Hee; Han, Bo Young; Shin, Hee Sung; Ahn, Seong-Kyu; Park, Geun-Il; Park, Se Hwan [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-05-15

    Material Unaccounted For (MUF) is the material balance evaluated from measured nuclear material in a Material Balance Area (MBA). Assuming perfect measurements and no diversion from a facility, one can expect a zero MUF. However, a non-zero MUF always occurs because of measurement uncertainty, even when the facility is under normal operating conditions. Furthermore, there are many measurements using different equipment at various Key Measurement Points (KMPs), and the MUF uncertainty is affected by the errors of those measurements. Evaluating MUF uncertainty is essential for developing a safeguards system, including the nuclear measurement system, for pyroprocessing, which is being developed for reducing radioactive waste from spent fuel at the Korea Atomic Energy Research Institute (KAERI). An evaluation code for analyzing MUF uncertainty has been developed, and it was verified using a sample problem from an IAEA reference. MUF uncertainty can be calculated simply and quickly by using this evaluation code, which is built on a graphical user interface for user friendliness. It is also expected that the code will make sensitivity analysis of the MUF uncertainty for various safeguards systems easier and more systematic. It is suitable for users who want to evaluate a conventional safeguards system as well as to develop a new system for facilities under development.

  6. Verification of the thermal module in the ELESIM code and the associated uncertainty analysis

    International Nuclear Information System (INIS)

    Arimescu, V.I.; Williams, A.F.; Klein, M.E.; Richmond, W.R.; Couture, M.

    1997-09-01

    Temperature is a critical parameter in fuel modelling because most of the physical processes that occur in fuel elements during irradiation are thermally activated. The focus of this paper is the temperature distribution calculation used in the computer code ELESIM, developed at AECL to model the steady-state behaviour of CANDU fuel. A validation procedure for fuel codes is described and applied to ELESIM's thermal calculation. The effects of uncertainties in model parameters, such as uranium dioxide thermal conductivity, and input variables, such as fuel element linear power, are accounted for through an uncertainty analysis using Response Surface and Monte Carlo techniques.

  7. Users manual for the FORSS sensitivity and uncertainty analysis code system

    International Nuclear Information System (INIS)

    Lucius, J.L.; Weisbin, C.R.; Marable, J.H.; Drischler, J.D.; Wright, R.Q.; White, J.E.

    1981-01-01

    FORSS is a code system used to study relationships between nuclear reaction cross sections, integral experiments, reactor performance parameter predictions and associated uncertainties. This report describes the computing environment and the modules currently used to implement FORSS Sensitivity and Uncertainty Methodology.

  8. Users manual for the FORSS sensitivity and uncertainty analysis code system

    Energy Technology Data Exchange (ETDEWEB)

    Lucius, J.L.; Weisbin, C.R.; Marable, J.H.; Drischler, J.D.; Wright, R.Q.; White, J.E.

    1981-01-01

    FORSS is a code system used to study relationships between nuclear reaction cross sections, integral experiments, reactor performance parameter predictions and associated uncertainties. This report describes the computing environment and the modules currently used to implement FORSS Sensitivity and Uncertainty Methodology.

  9. Validation and uncertainty analysis of the Athlet thermal-hydraulic computer code

    International Nuclear Information System (INIS)

    Glaeser, H.

    1995-01-01

    The computer code ATHLET is being developed by GRS as an advanced best-estimate code for the simulation of breaks and transients in Pressurized Water Reactors (PWRs) and Boiling Water Reactors (BWRs), including beyond design basis accidents. A systematic validation of ATHLET is based on a well balanced set of integral and separate effects tests emphasizing the German combined Emergency Core Cooling (ECC) injection system. When using best estimate codes for predictions of reactor plant states during assumed accidents, quantification of the uncertainty in these calculations is highly desirable. A method for uncertainty and sensitivity evaluation has been developed by GRS where the computational effort is independent of the number of uncertain parameters. (author)

  10. GRS Method for Uncertainty and Sensitivity Evaluation of Code Results and Applications

    International Nuclear Information System (INIS)

    Glaeser, H.

    2008-01-01

    In recent years, an increasing interest in computational reactor safety analysis has been to replace the conservative evaluation model calculations by best estimate calculations supplemented by uncertainty analysis of the code results. The evaluation of the margin to acceptance criteria, for example, the maximum fuel rod clad temperature, should be based on the upper limit of the calculated uncertainty range. Uncertainty analysis is needed if useful conclusions are to be obtained from best estimate thermal-hydraulic code calculations; otherwise single values of unknown accuracy would be presented for comparison with regulatory acceptance limits. Methods have been developed and presented to quantify the uncertainty of computer code results. The basic techniques proposed by GRS are presented together with applications to a large break loss of coolant accident on a reference reactor as well as on an experiment simulating containment behaviour.

  11. Post-test calculation and uncertainty analysis of the experiment QUENCH-07 with the system code ATHLET-CD

    International Nuclear Information System (INIS)

    Austregesilo, Henrique; Bals, Christine; Trambauer, Klaus

    2007-01-01

    In the frame of developmental assessment and code validation, a post-test calculation of the test QUENCH-07 was performed with ATHLET-CD. The system code ATHLET-CD is being developed for best-estimate simulation of accidents with core degradation and for evaluation of accident management procedures. It applies the detailed models of the thermal-hydraulic code ATHLET in an efficient coupling with dedicated models for core degradation and fission product behaviour. The first step of the work was the simulation of the test QUENCH-07 applying the modelling options recommended in the code User's Manual (reference calculation). The global results of this calculation showed a good agreement with the measured data. This calculation was complemented by a sensitivity analysis in order to investigate the influence of a combined variation of code input parameters on the simulation of the main phenomena observed experimentally. Results of this sensitivity analysis indicate that the main experimental measurements lie within the uncertainty range of the corresponding calculated values. Among the main contributors to the uncertainty of the code results are the heat transfer coefficient due to forced convection to the superheated steam-argon mixture, the thermal conductivity of the shroud insulation and the external heater rod resistance. Uncertainties in the modelling of B4C oxidation do not significantly affect the total calculated hydrogen release rates.

  12. Uncertainty and sensitivity analysis using probabilistic system assessment code. 1

    International Nuclear Information System (INIS)

    Honma, Toshimitsu; Sasahara, Takashi.

    1993-10-01

    This report presents the results obtained when applying the probabilistic system assessment code under development to the PSACOIN Level 0 intercomparison exercise organized by the Probabilistic System Assessment Code User Group in the Nuclear Energy Agency (NEA) of OECD. This exercise is one of a series designed to compare and verify probabilistic codes in the performance assessment of geological radioactive waste disposal facilities. The computations were performed using the Monte Carlo sampling code PREP and the post-processor code USAMO. The submodels in the waste disposal system were described and coded according to the specification of the exercise. Besides the results required for the exercise, additional uncertainty and sensitivity analyses were performed, and their details are also included. (author)

  13. A statistical methodology for quantification of uncertainty in best estimate code physical models

    International Nuclear Information System (INIS)

    Vinai, Paolo; Macian-Juan, Rafael; Chawla, Rakesh

    2007-01-01

    A novel uncertainty assessment methodology, based on a statistical non-parametric approach, is presented in this paper. It achieves quantification of code physical model uncertainty by making use of model performance information obtained from studies of appropriate separate-effect tests. Uncertainties are quantified in the form of estimated probability density functions (pdf's), calculated with a newly developed non-parametric estimator. The new estimator objectively predicts the probability distribution of the model's 'error' (its uncertainty) from databases reflecting the model's accuracy on the basis of available experiments. The methodology is completed by applying a novel multi-dimensional clustering technique based on the comparison of model error samples with the Kruskal-Wallis test. This takes into account the fact that a model's uncertainty depends on system conditions, since a best estimate code can give predictions for which the accuracy is affected by the regions of the physical space in which the experiments occur. The final result is an objective, rigorous and accurate manner of assigning uncertainty to code models, i.e. the input information needed by code uncertainty propagation methodologies used for assessing the accuracy of best estimate codes in nuclear systems analysis. The new methodology has been applied to the quantification of the uncertainty in the RETRAN-3D void model and then used in the analysis of an independent separate-effect experiment. This has clearly demonstrated the basic feasibility of the approach, as well as its advantages in yielding narrower uncertainty bands in quantifying the code's accuracy for void fraction predictions.
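
    A schematic of the clustering idea described above: compare model-error samples from two experimental regions with the Kruskal-Wallis test, and keep separate error pdfs only when the test rejects a common distribution. The data are invented, and scipy's Gaussian KDE stands in for the paper's own non-parametric estimator.

```python
import numpy as np
from scipy.stats import kruskal, gaussian_kde

rng = np.random.default_rng(4)

# Hypothetical model-error samples (measured minus predicted void fraction)
# from separate-effect tests in two regions of the physical space.
err_low_p = rng.normal(0.01, 0.03, 40)   # low-pressure experiments
err_high_p = rng.normal(0.04, 0.05, 35)  # high-pressure experiments

# Kruskal-Wallis: do the two error samples share one distribution?
stat, p = kruskal(err_low_p, err_high_p)
if p < 0.05:
    # Conditions differ: keep region-dependent error pdfs.
    pdfs = [gaussian_kde(err_low_p), gaussian_kde(err_high_p)]
else:
    pdfs = [gaussian_kde(np.concatenate([err_low_p, err_high_p]))]
print(f"KW p-value = {p:.3f}; using {len(pdfs)} error pdf(s)")
```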

  14. Total sensitivity and uncertainty analysis for LWR pin-cells with improved UNICORN code

    International Nuclear Information System (INIS)

    Wan, Chenghui; Cao, Liangzhi; Wu, Hongchun; Shen, Wei

    2017-01-01

    Highlights: • A new model is established for the total sensitivity and uncertainty analysis. • The NR approximation applied in S&U analysis can be avoided by the new model. • Sensitivity and uncertainty analysis is performed for PWR pin-cells by the new model. • The effects of the NR approximation for the PWR pin-cells are quantified. - Abstract: In this paper, improvements to the multigroup cross-section perturbation model have been proposed and applied in the self-developed UNICORN code, which is capable of performing the total sensitivity and total uncertainty analysis for the neutron-physics calculations by applying the direct numerical perturbation method and the statistical sampling method respectively. The narrow resonance (NR) approximation was applied in the multigroup cross-section perturbation model, implemented in UNICORN. As an improvement on the NR approximation to refine the multigroup cross-section perturbation model, an ultrafine-group cross-section perturbation model has been established, in which the actual perturbations are applied to the ultrafine-group cross-section library and the reconstructions of the resonance cross sections are performed by solving the neutron slowing-down equation. The total sensitivity and total uncertainty analysis were then applied to the LWR pin-cells, using both the multigroup and the ultrafine-group cross-section perturbation models. The numerical results show that the NR approximation overestimates the relative sensitivity coefficients and the corresponding uncertainty results for the LWR pin-cells, and the effects of the NR approximation are significant for σ(n,γ) and σ(n,elas) of 238U. Therefore, the effects of the NR approximation applied in the total sensitivity and total uncertainty analysis for the neutron-physics calculations of LWR should be taken into account.

  15. Development and application of methods to characterize code uncertainty

    International Nuclear Information System (INIS)

    Wilson, G.E.; Burtt, J.D.; Case, G.S.; Einerson, J.J.; Hanson, R.G.

    1985-01-01

    The United States Nuclear Regulatory Commission sponsors both international and domestic studies to assess its safety analysis codes. The Commission staff intends to use the results of these studies to quantify the uncertainty of the codes with a statistically based analysis method. Development of the methodology is underway. The Idaho National Engineering Laboratory contributions to the early development effort and the testing of two candidate methods are the subjects of this paper.

  16. FERRET data analysis code

    International Nuclear Information System (INIS)

    Schmittroth, F.

    1979-09-01

    A documentation of the FERRET data analysis code is given. The code provides a way to combine related measurements and calculations in a consistent evaluation. Basically a very general least-squares code, it is oriented towards problems frequently encountered in nuclear data and reactor physics. A strong emphasis is on the proper treatment of uncertainties and correlations and in providing quantitative uncertainty estimates. Documentation includes a review of the method, structure of the code, input formats, and examples
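
    At its core, a combined least-squares evaluation of this kind is a generalized least-squares adjustment of prior parameters against measurements with full covariances. The sketch below shows the standard GLS update on invented numbers; it is an illustration of the general technique, not FERRET's actual formulation.

```python
import numpy as np

# Adjust prior parameters x0 (covariance P) to match data y = A x (cov. V).
x0 = np.array([1.00, 0.50])          # prior parameter values
P = np.diag([0.05**2, 0.10**2])      # prior covariance
A = np.array([[1.0, 2.0]])           # linear measurement model
y = np.array([2.10])                 # measured value
V = np.array([[0.04**2]])            # measurement covariance

# Standard generalized least-squares update with full covariances.
K = P @ A.T @ np.linalg.inv(A @ P @ A.T + V)
x = x0 + K @ (y - A @ x0)
P_post = P - K @ A @ P
print("adjusted parameters:", x)
print("reduced uncertainties:", np.sqrt(np.diag(P_post)))
```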

  17. SURE: a system of computer codes for performing sensitivity/uncertainty analyses with the RELAP code

    International Nuclear Information System (INIS)

    Bjerke, M.A.

    1983-02-01

    A package of computer codes has been developed to perform a nonlinear uncertainty analysis on transient thermal-hydraulic systems which are modeled with the RELAP computer code. The package has been used in uncertainty analyses of experiments in the PWR-BDHT Separate Effects Program at Oak Ridge National Laboratory. The use of FORTRAN programs running interactively on the PDP-10 computer has made the system very easy to use and has provided great flexibility in the choice of processing paths. Several experiments simulating a loss-of-coolant accident in a nuclear reactor have been successfully analyzed. It has been shown that the system can be automated easily to further simplify its use, and that the conversion of the entire system to a base code other than RELAP is possible.

  18. Uncertainty analysis methods for quantification of source terms using a large computer code

    International Nuclear Information System (INIS)

    Han, Seok Jung

    1997-02-01

    Quantification of uncertainties in the source term estimations by a large computer code, such as MELCOR and MAAP, is an essential process of current probabilistic safety assessments (PSAs). The main objectives of the present study are (1) to investigate the applicability of a combined procedure of the response surface method (RSM), based on input determined from a statistical design, and the Latin hypercube sampling (LHS) technique to the uncertainty analysis of CsI release fractions under a hypothetical severe accident sequence of a station blackout at the Young-Gwang nuclear power plant, using the MAAP3.0B code as a benchmark problem; and (2) to propose a new measure of uncertainty importance based on distributional sensitivity analysis. On the basis of the results obtained in the present work, the RSM is recommended as a principal tool for an overall uncertainty analysis in source term quantifications, while the LHS is used in the calculations of standardized regression coefficients (SRC) and standardized rank regression coefficients (SRRC) to determine the subset of the most important input parameters in the final screening step and to check the cumulative distribution functions (cdfs) obtained by the RSM. Verification of the response surface model for sufficient accuracy is a prerequisite for the reliability of the final results obtained by the combined procedure proposed in the present work. In the present study a new measure has been developed that utilizes the metric distance between cumulative distribution functions (cdfs). The measure has been evaluated for three different cases of distributions in order to assess its characteristics: in the first two cases the distributions are known analytical distributions, while in the third the distribution is unknown. The first case uses symmetric analytical distributions; the second consists of two asymmetric distributions whose skewness is non-zero.
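
    The proposed importance measure is built on a metric distance between cdfs. One simple realization (our choice of metric, not necessarily the author's) is the area between the empirical cdf of the output with all inputs varied and the cdf with one input held fixed; all sample values below are invented.

```python
import numpy as np

def cdf_distance(a, b, grid_size=400):
    """Area between two empirical cdfs, used as an importance proxy."""
    grid = np.linspace(min(a.min(), b.min()), max(a.max(), b.max()), grid_size)
    cdf_a = np.searchsorted(np.sort(a), grid, side="right") / a.size
    cdf_b = np.searchsorted(np.sort(b), grid, side="right") / b.size
    return (grid[-1] - grid[0]) * np.mean(np.abs(cdf_a - cdf_b))

rng = np.random.default_rng(5)
full = rng.normal(0.30, 0.05, 1000)    # release fraction, all inputs varied
fixed = rng.normal(0.30, 0.02, 1000)   # one input held at its nominal value
print(f"importance (cdf distance) = {cdf_distance(full, fixed):.4f}")
```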

  19. Striatal dopamine release codes uncertainty in pathological gambling

    DEFF Research Database (Denmark)

    Linnet, Jakob; Mouridsen, Kim; Peterson, Ericka

    2012-01-01

    Two mechanisms of midbrain and striatal dopaminergic projections may be involved in pathological gambling: hypersensitivity to reward and sustained activation toward uncertainty. The midbrain-striatal dopamine system distinctly codes reward and uncertainty, where dopaminergic activation is a linear function of expected reward and an inverse U-shaped function of uncertainty. In this study, we investigated the dopaminergic coding of reward and uncertainty in 18 pathological gambling sufferers and 16 healthy controls. We used positron emission tomography (PET) with the tracer [11C]raclopride to measure dopamine release, and we used performance on the Iowa Gambling Task (IGT) to determine overall reward and uncertainty. We hypothesized that we would find a linear function between dopamine release and IGT performance, if dopamine release coded reward in pathological gambling. If, on the other hand...

  20. Striatal dopamine release codes uncertainty in pathological gambling

    DEFF Research Database (Denmark)

    Linnet, Jakob; Mouridsen, Kim; Peterson, Ericka

    2012-01-01

    Two mechanisms of midbrain and striatal dopaminergic projections may be involved in pathological gambling: hypersensitivity to reward and sustained activation toward uncertainty. The midbrain-striatal dopamine system distinctly codes reward and uncertainty, where dopaminergic activation is a linear function of expected reward and an inverse U-shaped function of uncertainty. In this study, we investigated the dopaminergic coding of reward and uncertainty in 18 pathological gambling sufferers and 16 healthy controls. We used positron emission tomography (PET) with the tracer [11C]raclopride to measure dopamine release, and we used performance on the Iowa Gambling Task (IGT) to determine overall reward and uncertainty. We hypothesized that we would find a linear function between dopamine release and IGT performance, if dopamine release coded reward in pathological gambling. If, on the other hand...

  1. Uncertainty analysis in Monte Carlo criticality computations

    International Nuclear Information System (INIS)

    Qi Ao

    2011-01-01

    Highlights: • Two types of uncertainty methods for k_eff Monte Carlo computations are examined. • The sampling method has the fewest restrictions on perturbations but demands computing resources. • The analytical method is limited to small perturbations of material properties. • Practicality relies on efficiency, multiparameter applicability and data availability. - Abstract: Uncertainty analysis is imperative for nuclear criticality risk assessments when using Monte Carlo neutron transport methods to predict the effective neutron multiplication factor (k_eff) for fissionable material systems. For the validation of Monte Carlo codes for criticality computations against benchmark experiments, code accuracy and precision are measured by both the computational bias and the uncertainty in the bias. The uncertainty in the bias accounts for known or quantified experimental, computational and model uncertainties. For the application of Monte Carlo codes for criticality analysis of fissionable material systems, an administrative margin of subcriticality must be imposed to provide additional assurance of subcriticality for any unknown or unquantified uncertainties. Because of the substantial impact of the administrative margin of subcriticality on the economics and safety of nuclear fuel cycle operations, recently increasing interest in reducing the administrative margin of subcriticality makes the uncertainty analysis in criticality safety computations more risk-significant. This paper provides an overview of the two most popular k_eff uncertainty analysis methods for Monte Carlo criticality computations: (1) sampling-based methods, and (2) analytical methods. Examples are given to demonstrate their usage in the k_eff uncertainty analysis due to uncertainties in both neutronic and non-neutronic parameters of fissionable material systems.

  2. CSAU (Code Scaling, Applicability and Uncertainty)

    International Nuclear Information System (INIS)

    Wilson, G.E.; Boyack, B.E.

    1989-01-01

    Best Estimate computer codes have been accepted by the U.S. Nuclear Regulatory Commission as an optional tool for performing safety analysis related to the licensing and regulation of current nuclear reactors producing commercial electrical power, provided their uncertainty is quantified. In support of this policy change, the NRC and its contractors and consultants have developed and demonstrated an uncertainty quantification methodology called CSAU. The primary use of the CSAU methodology is to quantify safety margins for existing designs; however, the methodology can also serve an equally important role in advanced reactor research for plants not yet built. This paper describes the CSAU methodology, at the generic process level, and provides the general principles whereby it may be applied to evaluations of advanced reactor designs.

  3. Simulation codes and the impact of validation/uncertainty requirements

    International Nuclear Information System (INIS)

    Sills, H.E.

    1995-01-01

    Several of the OECD/CSNI members have adopted a proposed methodology for code validation and uncertainty assessment. Although the validation process adopted by members has a high degree of commonality, the uncertainty assessment processes selected are more variable, ranging from subjective to formal. This paper describes the validation and uncertainty assessment process, the sources of uncertainty, methods of reducing uncertainty, and methods of assessing uncertainty. Examples are presented from the Ontario Hydro application of the validation methodology and uncertainty assessment to the system thermal hydraulics discipline and the TUF (1) system thermal hydraulics code. (author)

  4. A review on the CIRCE methodology to quantify the uncertainty of the physical models of a code

    International Nuclear Information System (INIS)

    Jeon, Seong Su; Hong, Soon Joon; Bang, Young Seok

    2012-01-01

    In the field of nuclear engineering, recent regulatory audit calculations of large break loss of coolant accidents (LBLOCA) have been performed with best estimate codes such as MARS, RELAP5 and CATHARE. Since credible regulatory audit calculations are very important in evaluating the safety of a nuclear power plant (NPP), much research has gone into developing rules and methodologies for the use of best estimate codes. One of the major points is to develop the best estimate plus uncertainty (BEPU) method for uncertainty analysis. As a representative BEPU method, the NRC proposed the CSAU (Code Scaling, Applicability and Uncertainty) methodology, which clearly identifies the different steps necessary for an uncertainty analysis. The general idea is 1) to determine all the sources of uncertainty in the code, also called basic uncertainties, 2) to quantify them and 3) to combine them in order to obtain the final uncertainty for the studied application. Using an uncertainty analysis such as the CSAU methodology, an uncertainty band for the code response (calculation result) that is important from the safety point of view is calculated, and the safety margin of the NPP is quantified. An example of such a response is the peak cladding temperature (PCT) for a LBLOCA. However, there is a problem in the uncertainty analysis with best estimate codes. Generally, it is very difficult to determine the uncertainties due to the empiricism of closure laws (also called correlations or constitutive relationships). So far the only proposed approach has been based on expert judgment. In this case, the uncertainty ranges of important parameters can be wide and inaccurate, so that the confidence level of the BEPU calculation results is decreased. In order to solve this problem, the CEA (France) has recently proposed a statistical method of data analysis, called CIRCE. The CIRCE method is intended to quantify the uncertainties of the correlations of a code. It may replace the expert judgment
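
    CIRCE itself estimates the bias and variance of model coefficients by maximum likelihood (via an expectation-maximization algorithm); the toy sketch below conveys the underlying idea under simplifying assumptions: a single multiplicative coefficient, synthetic experiment-minus-code residuals r_i with known sensitivities h_i and measurement uncertainties u_i, and a direct numerical likelihood maximization instead of E-M. All numbers are hypothetical.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)

    # Hypothetical dataset: sensitivities h_i of the code response to one
    # model coefficient, and experiment-minus-code residuals r_i with
    # known measurement uncertainties u_i.
    h = rng.uniform(0.5, 2.0, 30)
    u = np.full(30, 0.05)
    true_m, true_s = 0.10, 0.08
    r = h * rng.normal(true_m, true_s, 30) + rng.normal(0.0, u)

    def neg_log_like(params):
        # Model: r_i = h_i * b + eps_i with b ~ N(m, s^2), eps_i ~ N(0, u_i^2),
        # so each residual is N(h_i * m, h_i^2 s^2 + u_i^2).
        m, log_s = params
        var = h**2 * np.exp(2 * log_s) + u**2
        return 0.5 * np.sum(np.log(2 * np.pi * var) + (r - h * m) ** 2 / var)

    fit = minimize(neg_log_like, x0=[0.0, np.log(0.1)])
    m_hat, s_hat = fit.x[0], np.exp(fit.x[1])
    print(f"estimated bias m = {m_hat:.3f}, std dev s = {s_hat:.3f}")
    ```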

  5. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, main report

    International Nuclear Information System (INIS)

    Harper, F.T.; Young, M.L.; Miller, L.A.; Hora, S.C.; Lui, C.H.; Goossens, L.H.J.; Cooke, R.M.; Paesler-Sauer, J.; Helton, J.C.

    1995-01-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The ultimate objective of the joint effort was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. Experts developed their distributions independently. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model. The resulting distributions closely replicated the aggregated elicited wet deposition distributions. To validate the distributions generated for the dispersion code input variables, samples were taken from the distributions and propagated through the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the first of a three-volume document describing the project
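
    The validation step described above, sampling the elicited distributions and pushing them through the Gaussian plume model, can be sketched in a few lines. The distributions and parameter values below are hypothetical placeholders; the formula is the standard ground-level centerline GPM concentration with ground reflection.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def gpm_centerline(Q, u, sigma_y, sigma_z, H):
        """Ground-level centerline concentration of a Gaussian plume (g/m^3)."""
        return (Q / (np.pi * u * sigma_y * sigma_z)) * np.exp(-H**2 / (2 * sigma_z**2))

    # Hypothetical elicited uncertainty distributions (lognormal) for the
    # dispersion parameters at a fixed downwind distance.
    n = 5000
    Q = 1.0                                        # source term (g/s), fixed
    u = rng.lognormal(np.log(4.0), 0.2, n)         # wind speed (m/s)
    sigma_y = rng.lognormal(np.log(80.0), 0.3, n)  # lateral spread (m)
    sigma_z = rng.lognormal(np.log(40.0), 0.4, n)  # vertical spread (m)
    H = 50.0                                       # effective release height (m)

    conc = gpm_centerline(Q, u, sigma_y, sigma_z, H)
    print("5th/50th/95th percentiles:", np.percentile(conc, [5, 50, 95]))
    ```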

  6. Uncertainties associated with the use of the KENO Monte Carlo criticality codes

    International Nuclear Information System (INIS)

    Landers, N.F.; Petrie, L.M.

    1989-01-01

    The KENO multi-group Monte Carlo criticality codes have earned the reputation of being efficient, user-friendly tools especially suited for the analysis of situations commonly encountered in the storage and transportation of fissile materials. Throughout their twenty years of service, a continuing effort has been made to maintain and improve these codes to meet the needs of the nuclear criticality safety community. Foremost among these needs is the knowledge of how to utilize the results safely and effectively. Therefore it is important that code users be aware of uncertainties that may affect their results. These uncertainties originate from approximations in the problem data, methods used to process cross sections, and assumptions, limitations and approximations within the criticality computer code itself. 6 refs., 8 figs., 1 tab

  7. Uncertainty analysis of neutron transport calculation

    International Nuclear Information System (INIS)

    Oka, Y.; Furuta, K.; Kondo, S.

    1987-01-01

    A cross-section sensitivity-uncertainty analysis code, SUSD, was developed. The code calculates sensitivity coefficients for one- and two-dimensional transport problems based on first order perturbation theory. Variances and standard deviations of detector responses or design parameters can be obtained using a cross-section covariance matrix. The code is able to perform sensitivity-uncertainty analysis for the secondary neutron angular distribution (SAD) and the secondary neutron energy distribution (SED). Covariances of 6Li and 7Li neutron cross sections in JENDL-3PR1 were evaluated, including SAD and SED. Covariances of Fe and Be were also evaluated. The uncertainty of the tritium breeding ratio, fast neutron leakage flux and neutron heating was analysed for four types of blanket concepts for a commercial tokamak fusion reactor. The uncertainty of the tritium breeding ratio was less than 6 percent. Contributions from SAD/SED uncertainties are significant for some parameters. Formulas to estimate the errors of the numerical solution of the transport equation were derived based on perturbation theory. This method enables deterministic estimation of the numerical errors due to the iterative solution, spatial discretization and Legendre polynomial expansion of transfer cross-sections. The calculational errors of the tritium breeding ratio and the fast neutron leakage flux of the fusion blankets were analysed. (author)
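
    The variance propagation SUSD performs, combining sensitivity coefficients with a cross-section covariance matrix, is the first-order "sandwich rule": var(R) = S^T C S. A minimal sketch with hypothetical numbers:

    ```python
    import numpy as np

    # Relative sensitivity coefficients of a response (e.g. tritium breeding
    # ratio) to three multigroup cross sections (hypothetical values).
    S = np.array([0.8, -0.3, 0.15])

    # Relative covariance matrix of the three cross sections (hypothetical).
    C = np.array([
        [0.0016, 0.0004, 0.0],
        [0.0004, 0.0025, 0.0002],
        [0.0,    0.0002, 0.0009],
    ])

    variance = S @ C @ S  # first-order sandwich rule: var = S^T C S
    print(f"relative std dev of the response = {np.sqrt(variance):.4f}")
    ```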

  8. Analysis of uncertainties of thermal hydraulic calculations

    International Nuclear Information System (INIS)

    Macek, J.; Vavrin, J.

    2002-12-01

    In 1993-1997 it was proposed, within OECD projects, that a common program should be set up for uncertainty analysis by a probabilistic method based on a non-parametric statistical approach for system computer codes such as RELAP, ATHLET and CATHARE, and that a method should be developed for statistical analysis of experimental databases for the preparation of the input deck and statistical analysis of the output calculation results. Software for such statistical analyses would then have to be implemented as individual tools, independent of the computer codes used for the thermal hydraulic analysis and of the programs for uncertainty analysis. In this context, a method for estimating the uncertainty of a thermal hydraulic calculation is outlined and selected methods of statistical analysis of uncertainties are described, including methods for prediction accuracy assessment based on the discrete Fourier transformation principle. (author)

  9. The Uncertainty Test for the MAAP Computer Code

    International Nuclear Information System (INIS)

    Park, S. H.; Song, Y. M.; Park, S. Y.; Ahn, K. I.; Kim, K. R.; Lee, Y. J.

    2008-01-01

    After the Three Mile Island Unit 2 (TMI-2) and Chernobyl accidents, safety issues for severe accidents have been treated in various respects. A major topic in our part of the research is the Level 2 PSA. The main difficulty in expanding the Level 2 PSA as a risk-informed activity is uncertainty. Past efforts gave weight to improving the quality of the internal-event PSA, but the effort made to reduce the phenomenological uncertainty in the Level 2 PSA has been insufficient. In our country, the degree of uncertainty in Level 2 PSA models is high, and a model to reduce that uncertainty needs to be secured. We do not yet have experience with uncertainty assessment technology; the assessment systems themselves depend on those of advanced nations. In advanced nations, severe accident simulators are implemented at the hardware level, whereas in our case only basic functions at the software level can be implemented. Under these circumstances, similar systems at home and abroad, such as UQM and MELCOR, were surveyed. Drawing on these instances, the SAUNA (Severe Accident UNcertainty Analysis) system is being developed in our project to assess and reduce the uncertainty in a Level 2 PSA. The MAAP code was selected to analyze the uncertainty in a severe accident

  10. Uncertainty analysis of LBLOCA for Advanced Heavy Water Reactor

    International Nuclear Information System (INIS)

    Srivastava, A.; Lele, H.G.; Ghosh, A.K.; Kushwaha, H.S.

    2008-01-01

    The main objective of safety analysis is to demonstrate in a robust way that all safety requirements are met, i.e. that sufficient margins exist between the real values of important parameters and the threshold values at which damage of the barriers against release of radioactivity would occur. As stated in the IAEA Safety Requirements for Design of NPPs, 'a safety analysis of the plant design shall be conducted in which methods of both deterministic and probabilistic analysis shall be applied'. It is required that 'the computer programs, analytical methods and plant models used in the safety analysis shall be verified and validated, and adequate consideration shall be given to uncertainties'. Uncertainties are present in calculations due to the computer codes, initial and boundary conditions, plant state, fuel parameters, scaling and numerical solution algorithms. The conservative approaches, still widely used, were introduced to cover uncertainties due to the limited capability for modelling and understanding physical phenomena at the early stages of safety analysis. The results obtained by this approach are quite unrealistic and the level of conservatism is not fully known. Another approach is the use of Best Estimate (BE) codes with realistic initial and boundary conditions. If this approach is selected, it should be based on statistically combined uncertainties for plant initial and boundary conditions, assumptions and code models. The current trend is toward best estimate codes with some conservative assumptions of the system, realistic input data and uncertainty analysis. The BE analysis with evaluation of uncertainties offers, in addition, a way to quantify the existing plant safety margins. Its broader use in the future is therefore envisaged, even though it is not always feasible because of the difficulty of quantifying code uncertainties within a sufficiently narrow range for every phenomenon and for each accident sequence. In this paper

  11. MUP, CEC-DES, STRADE. Codes for uncertainty propagation, experimental design and stratified random sampling techniques

    International Nuclear Information System (INIS)

    Amendola, A.; Astolfi, M.; Lisanti, B.

    1983-01-01

    The report describes how to use the codes: MUP (Monte Carlo Uncertainty Propagation) for uncertainty analysis by Monte Carlo simulation, including correlation analysis, extreme value identification and study of selected ranges of the variable space; CEC-DES (Central Composite Design) for building experimental matrices according to the requirements of Central Composite and Factorial Experimental Designs; and STRADE (Stratified Random Design) for experimental designs based on Latin Hypercube Sampling techniques. Application fields of the codes are probabilistic risk assessment, experimental design, sensitivity analysis and system identification problems
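
    Latin Hypercube Sampling, the technique underlying STRADE, divides each input's range into N equiprobable strata and draws exactly one value per stratum, shuffling the strata independently per variable. A minimal sketch of the technique (not the STRADE implementation):

    ```python
    import numpy as np

    def latin_hypercube(n_samples, n_vars, rng):
        """Return an (n_samples, n_vars) LHS design on the unit hypercube."""
        u = rng.uniform(size=(n_samples, n_vars))
        # One point per stratum: (permutation + uniform jitter) / n_samples
        strata = np.array([rng.permutation(n_samples) for _ in range(n_vars)]).T
        return (strata + u) / n_samples

    rng = np.random.default_rng(7)
    design = latin_hypercube(10, 3, rng)
    print(design)  # each column has exactly one point in each tenth of [0, 1)
    ```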

  12. SENSIT: a cross-section and design sensitivity and uncertainty analysis code. [In FORTRAN for CDC-7600, IBM 360]

    Energy Technology Data Exchange (ETDEWEB)

    Gerstl, S.A.W.

    1980-01-01

    SENSIT computes the sensitivity and uncertainty of a calculated integral response (such as a dose rate) due to input cross sections and their uncertainties. Sensitivity profiles are computed for neutron and gamma-ray reaction cross sections of standard multigroup cross section sets and for secondary energy distributions (SEDs) of multigroup scattering matrices. In the design sensitivity mode, SENSIT computes changes in an integral response due to design changes and gives the appropriate sensitivity coefficients. Cross section uncertainty analyses are performed for three types of input data uncertainties: cross-section covariance matrices for pairs of multigroup reaction cross sections, spectral shape uncertainty parameters for secondary energy distributions (integral SED uncertainties), and covariance matrices for energy-dependent response functions. For all three types of data uncertainties SENSIT computes the resulting variance and estimated standard deviation in an integral response of interest, on the basis of generalized perturbation theory. SENSIT attempts to be more comprehensive than earlier sensitivity analysis codes, such as SWANLAKE.

  13. Fuzzy Uncertainty Evaluation for Fault Tree Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ki Beom; Shim, Hyung Jin [Seoul National University, Seoul (Korea, Republic of); Jae, Moo Sung [Hanyang University, Seoul (Korea, Republic of)

    2015-05-15

    The traditional probabilistic approach can produce relatively accurate results; however, it requires a long computation time because of the repetitive computation of the MC method. In addition, when informative data for statistical analysis are not sufficient or some events are mainly caused by human error, the probabilistic approach may not be possible, because the uncertainties of these events are difficult to express with probability distributions. In order to reduce the computation time and quantify the uncertainties of top events when there are basic events whose uncertainties are difficult to express with probability distributions, fuzzy uncertainty propagation based on fuzzy set theory can be applied. In this paper, we develop a fuzzy uncertainty propagation code and apply it to the fault tree of the core damage accident following a large loss of coolant accident (LLOCA). The fuzzy uncertainty propagation code is implemented and tested on the fault tree of a radiation release accident. We apply this code to the fault tree of the core damage accident after the LLOCA in three cases and compare the results with those computed by probabilistic uncertainty propagation using the MC method. The fuzzy uncertainty propagation results can be obtained in a relatively short time and cover the results obtained by the probabilistic uncertainty propagation.
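
    Fuzzy propagation through a fault tree is commonly done with alpha-cuts: at each membership level alpha, the basic-event probabilities become intervals, and interval arithmetic is pushed through the monotone AND/OR gates. The sketch below uses triangular fuzzy numbers and a hypothetical three-event tree, not the paper's LLOCA model:

    ```python
    def alpha_cut(tri, alpha):
        """Interval of a triangular fuzzy number (low, mode, high) at level alpha."""
        low, mode, high = tri
        return (low + alpha * (mode - low), high - alpha * (high - mode))

    def and_gate(a, b):
        # Intervals multiply (independent events); product is monotone here.
        return (a[0] * b[0], a[1] * b[1])

    def or_gate(a, b):
        # 1 - (1-p1)(1-p2), monotone in both arguments.
        return (1 - (1 - a[0]) * (1 - b[0]), 1 - (1 - a[1]) * (1 - b[1]))

    # Hypothetical triangular fuzzy probabilities of three basic events
    e1, e2, e3 = (1e-3, 2e-3, 4e-3), (5e-4, 1e-3, 3e-3), (1e-2, 2e-2, 5e-2)

    for alpha in (0.0, 0.5, 1.0):
        # Top event: (e1 AND e2) OR e3
        top = or_gate(and_gate(alpha_cut(e1, alpha), alpha_cut(e2, alpha)),
                      alpha_cut(e3, alpha))
        print(f"alpha={alpha:.1f}: top event in [{top[0]:.3e}, {top[1]:.3e}]")
    ```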

  14. Processing of the GALILEO fuel rod code model uncertainties within the AREVA LWR realistic thermal-mechanical analysis methodology

    International Nuclear Information System (INIS)

    Mailhe, P.; Barbier, B.; Garnier, C.; Landskron, H.; Sedlacek, R.; Arimescu, I.; Smith, M.; Bellanger, P.

    2013-01-01

    The availability of reliable tools and an associated methodology able to accurately predict LWR fuel behavior in all conditions is of great importance for safe and economic fuel usage. For that purpose, AREVA has developed its new global fuel rod performance code GALILEO along with its associated realistic thermal-mechanical analysis methodology. This realistic methodology is based on a Monte Carlo type random sampling of all relevant input variables. After having outlined the AREVA realistic methodology, this paper focuses on the GALILEO code benchmarking process, on its extended experimental database and on the assessment of the GALILEO model uncertainties. The propagation of these model uncertainties through the AREVA realistic methodology is also presented. This processing of the GALILEO model uncertainties is of the utmost importance for accurate fuel design margin evaluation, as illustrated by some application examples. With the submittal of the GALILEO Topical Report to the U.S. NRC in 2013, GALILEO and its methodology are on the way to being used industrially in a wide range of irradiation conditions. (authors)

  15. Probabilistic accident consequence uncertainty analysis -- Early health effects uncertainty assessment. Volume 2: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Haskin, F.E. [Univ. of New Mexico, Albuquerque, NM (United States); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Goossens, L.H.J.; Kraan, B.C.P. [Delft Univ. of Technology (Netherlands)

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA early health effects models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on early health effects, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  16. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for internal dosimetry. Volume 2: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Harrison, J.D. [National Radiological Protection Board (United Kingdom); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

    1998-04-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA internal dosimetry models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on internal dosimetry, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  17. Optimization of FRAP uncertainty analysis option

    International Nuclear Information System (INIS)

    Peck, S.O.

    1979-10-01

    The automated uncertainty analysis option that has been incorporated in the FRAP codes (FRAP-T5 and FRAPCON-2) provides the user with a means of obtaining uncertainty bands on code predicted variables at user-selected times during a fuel pin analysis. These uncertainty bands are obtained by multiple single fuel pin analyses to generate data which can then be analyzed by second order statistical error propagation techniques. In this process, a considerable amount of data is generated and stored on tape. The user has certain choices to make regarding which independent variables are to be used in the analysis and what order of error propagation equation should be used in modeling the output response. To aid the user in these decisions, a computer program, ANALYZ, has been written and added to the uncertainty analysis option package. A variety of considerations involved in fitting response surface equations and certain pitfalls of which the user should be aware are discussed. An equation is derived expressing a residual as a function of a fitted model and an assumed true model. A variety of experimental design choices are discussed, including the advantages and disadvantages of each approach. Finally, a description of the subcodes which constitute program ANALYZ is provided
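
    The response-surface idea behind the FRAP uncertainty option, fitting a low-order polynomial to a modest number of code runs and then propagating input uncertainties through the cheap surrogate, can be sketched as follows. The "code" and all numbers are hypothetical stand-ins:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def code(x1, x2):
        """Stand-in for an expensive fuel-rod code run (hypothetical)."""
        return 900 + 120 * x1 - 40 * x2 + 15 * x1 * x2 + 8 * x1**2

    # Small set of design runs (a central composite design in practice)
    x1 = rng.uniform(-1, 1, 25)
    x2 = rng.uniform(-1, 1, 25)
    y = code(x1, x2)

    # Fit a second-order response surface by least squares
    X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)

    # Propagate input uncertainty through the cheap surrogate
    n = 100_000
    s1, s2 = rng.normal(0, 0.3, n), rng.normal(0, 0.3, n)
    S = np.column_stack([np.ones_like(s1), s1, s2, s1 * s2, s1**2, s2**2])
    pred = S @ coef
    print(f"response mean = {pred.mean():.1f}, std = {pred.std(ddof=1):.1f}")
    ```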

  18. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, appendices A and B

    International Nuclear Information System (INIS)

    Harper, F.T.; Young, M.L.; Miller, L.A.; Hora, S.C.; Lui, C.H.; Goossens, L.H.J.; Cooke, R.M.; Paesler-Sauer, J.; Helton, J.C.

    1995-01-01

    MACCS and COSYMA, two new probabilistic accident consequence codes whose development was completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Formal expert elicitation, with experts developing their distributions independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. The resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the second of a three-volume document describing the project and contains two appendices describing the rationales for the dispersion and deposition data along with short biographies of the 16 experts who participated in the project

  19. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, appendices A and B

    Energy Technology Data Exchange (ETDEWEB)

    Harper, F.T.; Young, M.L.; Miller, L.A. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States); Lui, C.H. [Nuclear Regulatory Commission, Washington, DC (United States); Goossens, L.H.J.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Paesler-Sauer, J. [Research Center, Karlsruhe (Germany); Helton, J.C. [and others

    1995-01-01

    MACCS and COSYMA, two new probabilistic accident consequence codes whose development was completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Formal expert elicitation, with experts developing their distributions independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. The resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the second of a three-volume document describing the project and contains two appendices describing the rationales for the dispersion and deposition data along with short biographies of the 16 experts who participated in the project.

  20. Uncertainty and sensitivity analysis in the scenario simulation with RELAP/SCDAP and MELCOR codes; Analisis de incertidumbre y sensibilidad en la simulacion de escenarios con los codigos RELAP/SCDAP y MELCOR

    Energy Technology Data Exchange (ETDEWEB)

    Garcia J, T.; Cardenas V, J., E-mail: tonatiuh.garcia@cnsns.gob.mx [Comision Nacional de Seguridad Nuclear y Salvaguardias, Dr. Barragan 779, Col. Narvarte, 03020 Ciudad de Mexico (Mexico)

    2015-09-15

    A methodology was implemented for uncertainty analysis in simulations of scenarios with the RELAP/SCDAP V-3.4 bi-7 and MELCOR V-2.1 codes, which are used to perform safety analysis at the Comision Nacional de Seguridad Nuclear y Salvaguardias (CNSNS). The uncertainty analysis methodology chosen is a probabilistic method of the input-uncertainty-propagation type, propagating the uncertainty of the input parameters to the output parameters. Therefore, it began with the selection of the input parameters considered uncertain and of high importance in the scenario because of their direct effect on the output variable of interest. These parameters were randomly sampled according to intervals of variation or probability distribution functions assigned by expert judgment, to generate a set of input files that were run through the simulation code to propagate the uncertainty to the output parameters. Then, through the use of order statistics and the Wilks formula, it was determined that the minimum number of code runs required to obtain uncertainty bands that include 95% of the population at a confidence level of 95% is 93; it is important to mention that with this method the number of runs does not depend on the number of selected input parameters. Routines were implemented in Fortran 90 to automate the uncertainty analysis process for transients with the RELAP/SCDAP code. In the case of the MELCOR code for severe accident analysis, automation was carried out through the Dakota Uncertainty plug-in incorporated into the SNAP platform. To test the practical application of this methodology, two analyses were performed: the first simulated a closure transient of the main steam isolation valves using the RELAP/SCDAP code, obtaining the uncertainty band of the vessel dome pressure; in the second analysis, the simulation of a total loss of power accident (SBO) was carried out with the MELCOR code, obtaining the uncertainty band for the
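
    The figure of 93 runs quoted above is the first-order two-sided Wilks result for 95%/95%. A sketch of the sample-size computation (standard Wilks formulas, not the authors' Fortran routines):

    ```python
    def wilks_runs(gamma=0.95, beta=0.95, two_sided=True):
        """Smallest number of code runs for first-order Wilks tolerance limits."""
        n = 1
        while True:
            if two_sided:
                # P(coverage >= gamma) for the (min, max) interval of n runs
                conf = 1.0 - gamma**n - n * (1.0 - gamma) * gamma**(n - 1)
            else:
                # P(coverage >= gamma) for the one-sided (max) bound of n runs
                conf = 1.0 - gamma**n
            if conf >= beta:
                return n
            n += 1

    print(wilks_runs(two_sided=False))  # 59 runs for one-sided 95%/95%
    print(wilks_runs(two_sided=True))   # 93 runs for two-sided 95%/95%
    ```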

  1. Use of Sensitivity and Uncertainty Analysis to Select Benchmark Experiments for the Validation of Computer Codes and Data

    International Nuclear Information System (INIS)

    Elam, K.R.; Rearden, B.T.

    2003-01-01

    Sensitivity and uncertainty analysis methodologies under development at Oak Ridge National Laboratory were applied to determine whether existing benchmark experiments adequately cover the area of applicability for the criticality code and data validation of PuO2 and mixed-oxide (MOX) powder systems. The study examined three PuO2 powder systems and four MOX powder systems that would be useful for establishing mass limits for a MOX fuel fabrication facility. Using traditional methods to choose experiments for criticality analysis validation, 46 benchmark critical experiments were identified as applicable to the PuO2 powder systems. However, only 14 experiments were thought to be within the area of applicability for dry MOX powder systems. The applicability of 318 benchmark critical experiments, including the 60 experiments initially identified, was assessed. Each benchmark and powder system was analyzed using the Tools for Sensitivity and UNcertainty Analysis Methodology Implementation (TSUNAMI) one-dimensional (TSUNAMI-1D) or three-dimensional (TSUNAMI-3D) sensitivity analysis sequences, which will be included in the next release of the SCALE code system. These sensitivity data and cross-section uncertainty data were then processed with TSUNAMI-IP to determine the correlation of each application to each experiment in the benchmarking set. Correlation coefficients are used to assess the similarity between systems and determine the applicability of one system for the code and data validation of another. The applicability of most of the experiments identified using traditional methods was confirmed by the TSUNAMI analysis. In addition, some PuO2 and MOX powder systems were determined to be within the area of applicability of several other benchmarks that would not have been considered using traditional methods. Therefore, the number of benchmark experiments useful for the validation of these systems exceeds the number previously expected. The TSUNAMI analysis
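
    The correlation coefficient that TSUNAMI-IP computes between an application and a benchmark, often denoted c_k, is in essence a covariance-weighted cosine between their sensitivity vectors. A minimal sketch with hypothetical sensitivity profiles and covariance data:

    ```python
    import numpy as np

    def ck(S_app, S_bench, C):
        """Covariance-weighted similarity between application and benchmark."""
        num = S_app @ C @ S_bench
        den = np.sqrt((S_app @ C @ S_app) * (S_bench @ C @ S_bench))
        return num / den

    # Hypothetical sensitivity profiles over four nuclide-reaction pairs
    S_app = np.array([0.30, -0.10, 0.05, 0.20])
    S_bench = np.array([0.28, -0.08, 0.02, 0.15])
    C = np.diag([0.001, 0.004, 0.002, 0.003])  # hypothetical covariance data

    print(f"c_k = {ck(S_app, S_bench, C):.3f}")  # values near 1 indicate similarity
    ```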

  2. Uncertainty Analysis with Considering Resonance Self-shielding Effect

    Energy Technology Data Exchange (ETDEWEB)

    Han, Tae Young [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    If infinitely diluted multi-group cross sections are used for the sensitivity, the covariance data from the evaluated nuclear data library (ENDL) can be applied directly. However, when a self-shielded multi-group cross section is used, the covariance data should be corrected for the self-shielding effect. The implicit uncertainty can thus be defined as the uncertainty change caused by the resonance self-shielding effect, as described above. MUSAD (Modules of Uncertainty and Sensitivity Analysis for DeCART) has been developed for multiplication factor and cross-section uncertainty analysis based on the generalized perturbation theory; however, it can only quantify the explicit uncertainty from the self-shielded multi-group cross sections, without considering the implicit effect. Thus, this paper addresses the implementation of an implicit uncertainty analysis module into the code, and numerical results for the verification are provided. The implicit uncertainty analysis module has been implemented into MUSAD based on the infinitely-diluted cross-section-based consistent method. The verification calculation was performed on MHTGR 350 Ex.I-1a, and the differences with the McCARD results decrease from 40% to 1% in the CZP case and 3% in the HFP case. From this study, it is expected that the MUSAD code can reasonably produce the complete uncertainty for VHTRs or LWRs, where the resonance self-shielding effect must be significantly considered.

  3. Uncertainty Analysis with Considering Resonance Self-shielding Effect

    International Nuclear Information System (INIS)

    Han, Tae Young

    2016-01-01

    If infinitely diluted multi-group cross sections are used for the sensitivity, the covariance data from the evaluated nuclear data library (ENDL) can be applied directly. However, when a self-shielded multi-group cross section is used, the covariance data should be corrected for the self-shielding effect. The implicit uncertainty can thus be defined as the uncertainty change caused by the resonance self-shielding effect, as described above. MUSAD (Modules of Uncertainty and Sensitivity Analysis for DeCART) has been developed for multiplication factor and cross-section uncertainty analysis based on the generalized perturbation theory; however, it can only quantify the explicit uncertainty from the self-shielded multi-group cross sections, without considering the implicit effect. Thus, this paper addresses the implementation of an implicit uncertainty analysis module into the code, and numerical results for the verification are provided. The implicit uncertainty analysis module has been implemented into MUSAD based on the infinitely-diluted cross-section-based consistent method. The verification calculation was performed on MHTGR 350 Ex.I-1a, and the differences with the McCARD results decrease from 40% to 1% in the CZP case and 3% in the HFP case. From this study, it is expected that the MUSAD code can reasonably produce the complete uncertainty for VHTRs or LWRs, where the resonance self-shielding effect must be significantly considered

  4. Uncertainty Evaluation of the SFR Subchannel Thermal-Hydraulic Modeling Using a Hot Channel Factors Analysis

    International Nuclear Information System (INIS)

    Choi, Sun Rock; Cho, Chung Ho; Kim, Sang Ji

    2011-01-01

    In an SFR core analysis, a hot channel factors (HCF) method is most commonly used to evaluate uncertainty. It was employed in early designs such as the CRBRP and IFR. Alternatively, the improved thermal design procedure (ITDP) is able to calculate the overall uncertainty based on the root-sum-square technique and sensitivity analyses of each design parameter. The Monte Carlo method (MCM) is also employed to estimate the uncertainties. In this method, all the input uncertainties are randomly sampled according to their probability density functions and the resulting distribution of the output quantity is analyzed. Since an uncertainty analysis is basically calculated from the temperature distribution in a subassembly, the core thermal-hydraulic modeling greatly affects the resulting uncertainty. At KAERI, the SLTHEN and MATRA-LMR codes have been utilized to analyze the SFR core thermal-hydraulics. The SLTHEN (steady-state LMR core thermal hydraulics analysis code based on the ENERGY model) code is a modified version of the SUPERENERGY2 code, which conducts a multi-assembly, steady-state calculation based on a simplified ENERGY model. The detailed subchannel analysis code MATRA-LMR (Multichannel Analyzer for Steady-State and Transients in Rod Arrays for Liquid Metal Reactors), an LMR version of MATRA, was also developed specifically for SFR core thermal-hydraulic analysis. This paper describes comparative studies of core thermal-hydraulic models. A subchannel analysis and hot channel factors based uncertainty evaluation system is established to estimate the core thermofluidic uncertainties using the MATRA-LMR code, and the results are compared to those of the SLTHEN code
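
    The contrast between stacking hot channel subfactors multiplicatively and combining them statistically by root sum square, as in the ITDP, fits in a few lines. The subfactor values below are hypothetical:

    ```python
    import numpy as np

    # Hypothetical hot-channel subfactors (fractional uncertainties on the
    # coolant temperature rise) from independent sources.
    subfactors = {"power measurement": 0.02, "flow distribution": 0.04,
                  "fabrication tolerances": 0.03, "properties": 0.015}

    f = np.array(list(subfactors.values()))

    multiplicative = np.prod(1.0 + f)              # cumulative (stacked) HCF
    root_sum_square = 1.0 + np.sqrt(np.sum(f**2))  # statistical (RSS) HCF

    print(f"cumulative HCF = {multiplicative:.4f}")
    print(f"RSS HCF        = {root_sum_square:.4f}  (less conservative)")
    ```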

  5. Stochastic methods for uncertainty treatment of functional variables in computer codes: application to safety studies

    International Nuclear Information System (INIS)

    Nanty, Simon

    2015-01-01

    This work relates to the framework of uncertainty quantification for numerical simulators, and more precisely studies two industrial applications linked to the safety studies of nuclear plants. These two applications share several features. The first is that the computer code inputs are functional and scalar variables, the functional ones being dependent. The second is that the probability distribution of the functional variables is known only through a sample of their realizations. The third, relevant to only one of the two applications, is the high computational cost of the code, which limits the number of possible simulations. The main objective of this work was to propose a complete methodology for the uncertainty analysis of numerical simulators for the two considered cases. First, we proposed a methodology to quantify the uncertainties of dependent functional random variables from a sample of their realizations. This methodology makes it possible to model both the dependency between variables and their link to another variable, called a covariate, which could be, for instance, the output of the considered code. We then developed an adaptation of a visualization tool for functional data, which makes it possible to simultaneously visualize the uncertainties and features of dependent functional variables. Second, a method to perform the global sensitivity analysis of the codes used in the two studied cases was proposed. In the case of a computationally demanding code, the direct use of quantitative global sensitivity analysis methods is intractable. To overcome this issue, the retained solution consists in building a surrogate model or metamodel, a fast-running model approximating the computationally expensive code. An optimized uniform sampling strategy for scalar and functional variables has been developed to build a learning basis for the metamodel. Finally, a new approximation approach for expensive codes with functional outputs has been

  6. Uncertainty Analysis of Few Group Cross Sections Based on Generalized Perturbation Theory

    International Nuclear Information System (INIS)

    Han, Tae Young; Lee, Hyun Chul; Noh, Jae Man

    2014-01-01

    In this paper, the methodology of the sensitivity and uncertainty analysis code based on GPT is described, and preliminary verification calculations on the PMR200 pin cell problem were carried out. The results are in good agreement with those of TSUNAMI. From this study, it is expected that the MUSAD code based on GPT can produce the uncertainty of the homogenized few-group microscopic cross sections for a core simulator. For sensitivity and uncertainty analyses of general core responses, a two-step method is available: it utilizes the generalized perturbation theory (GPT) for homogenized few-group cross sections in the first step and a stochastic sampling method for general core responses in the second step. The uncertainty analysis procedure based on GPT in the first step needs the generalized adjoint solution from a cell or lattice code. For this, the generalized adjoint solver was integrated into DeCART in our previous work. In this paper, the MUSAD (Modules of Uncertainty and Sensitivity Analysis for DeCART) code based on classical perturbation theory was expanded with a function for the sensitivity and uncertainty analysis of few-group cross sections based on GPT. First, the uncertainty analysis method based on GPT is described and, in the next section, the preliminary results of the verification calculation on a VHTR pin cell problem are compared with the results by TSUNAMI of SCALE 6.1

  7. Measures of uncertainty, importance and sensitivity of the SEDA code

    International Nuclear Information System (INIS)

    Baron, J.; Caruso, A.; Vinate, H.

    1996-01-01

    The purpose of this work is the estimation of the uncertainty of the results of the SEDA code (Sistema de Evaluacion de Dosis en Accidentes) as a function of its input data and parameters. The SEDA code has been developed by the Comision Nacional de Energia Atomica for the estimation of doses during emergencies in the vicinity of the Atucha and Embalse nuclear power plants. The user feeds the code with meteorological data, source terms and accident data (timing involved, release height, thermal content of the release, etc.). It is designed to be used during an emergency and to produce fast results that enable decision making. The uncertainty in the results of the SEDA code is quantified in the present paper. This uncertainty is associated both with the data the user inputs to the code and with the uncertain parameters of the code's own models. The method used consisted in the statistical characterization of the parameters and variables, assigning them adequate probability distributions. These distributions have been sampled with the Latin Hypercube Sampling method, which is a stratified multi-variable Monte Carlo technique. The code has been run for each of the samples and, finally, a sample of results has been obtained. These results have been characterized from the statistical point of view (obtaining their mean, most probable value, distribution shape, etc.) for several distances from the source. Finally, the Partial Correlation Coefficient and Standardized Regression Coefficient techniques have been used to obtain the relative importance of each input variable and the sensitivity of the code to its variations. The measures of importance and sensitivity have been obtained for several distances from the source and various cases of atmospheric stability, making comparisons possible. This work makes it possible to place confidence in the results of the code and to associate their uncertainty with them, as a way to know the limits within which the results can vary in a real
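
    Standardized regression coefficients, one of the importance measures used for the SEDA results, are obtained by regressing the standardized output sample on the standardized input sample. A minimal sketch with a hypothetical three-input dose model standing in for a SEDA run:

    ```python
    import numpy as np

    rng = np.random.default_rng(11)

    # Input sample (plain Monte Carlo here for brevity; LHS in the paper) and
    # a hypothetical dose model standing in for a SEDA run.
    n = 500
    X = rng.normal(size=(n, 3))  # e.g. wind speed, stability, source term
    y = 3.0 * X[:, 0] - 1.0 * X[:, 1] + 0.2 * X[:, 2] + rng.normal(0, 0.5, n)

    # Standardize inputs and output, then least-squares regression:
    Xs = (X - X.mean(0)) / X.std(0, ddof=1)
    ys = (y - y.mean()) / y.std(ddof=1)
    src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)

    for name, s in zip(["wind", "stability", "source"], src):
        print(f"SRC({name}) = {s:+.3f}")  # magnitude ranks input importance
    ```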

  8. Discussion of OECD LWR Uncertainty Analysis in Modelling Benchmark

    International Nuclear Information System (INIS)

    Ivanov, K.; Avramova, M.; Royer, E.; Gillford, J.

    2013-01-01

    The demand for best estimate calculations in nuclear reactor design and safety evaluations has increased in recent years. Uncertainty quantification has been highlighted as part of best estimate calculations. The modelling aspects of uncertainty and sensitivity analysis are to be further developed and validated on scientific grounds in support of their performance and application to multi-physics reactor simulations. The Organization for Economic Co-operation and Development (OECD) / Nuclear Energy Agency (NEA) Nuclear Science Committee (NSC) has endorsed the creation of an Expert Group on Uncertainty Analysis in Modelling (EGUAM). Within the framework of the activities of EGUAM/NSC, the OECD/NEA initiated the Benchmark for Uncertainty Analysis in Modelling for Design, Operation, and Safety Analysis of Light Water Reactors (OECD LWR UAM benchmark). The general objective of the benchmark is to propagate the predictive uncertainties of code results through complex coupled multi-physics and multi-scale simulations. The benchmark is divided into three phases, with Phase I highlighting the uncertainty propagation in stand-alone neutronics calculations, while Phases II and III are focused on uncertainty analysis of the reactor core and system, respectively. This paper discusses the progress made in the Phase I calculations, the specifications for Phase II and the incoming challenges in defining the Phase III exercises. The challenges of applying uncertainty quantification to complex code systems, in particular time-dependent coupled physics models, are the large computational burden and the utilization of non-linear models (expected due to the physics coupling). (authors)

  9. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for deposited material and external doses. Volume 2: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Boardman, J. [AEA Technology (United Kingdom); Jones, J.A. [National Radiological Protection Board (United Kingdom); Harper, F.T.; Young, M.L. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA deposited material and external dose models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on deposited material and external doses, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  10. Licensing in BE system code calculations. Applications and uncertainty evaluation by CIAU method

    International Nuclear Information System (INIS)

    Petruzzi, Alessandro; D'Auria, Francesco

    2007-01-01

    The evaluation of uncertainty constitutes the necessary supplement to Best Estimate (BE) calculations performed to understand accident scenarios in water cooled nuclear reactors. The need comes from the imperfection of computational tools on the one side and from the interest in using such tools to obtain a more precise evaluation of safety margins on the other. In the present paper the approaches to uncertainty are outlined and the CIAU (Code with capability of Internal Assessment of Uncertainty) method proposed by the University of Pisa is described, including the ideas at its basis and results from applications. Two approaches are distinguished, characterized as 'propagation of code input uncertainty' and 'propagation of code output errors'. For both methods, the thermal-hydraulic code is at the centre of the process of uncertainty evaluation: in the former case the code itself is adopted to compute the error bands and to propagate the input errors; in the latter case the errors in code applications to relevant measurements are used to derive the error bands. The CIAU method exploits the idea of the 'status approach' for identifying the thermal-hydraulic conditions of an accident in any Nuclear Power Plant (NPP). Errors in predicting such a status are derived from the comparison between predicted and measured quantities and, in the stage of the application of the method, are used to compute the uncertainty. (author)

  11. Quantifying reactor safety margins: Application of code scaling, applicability, and uncertainty evaluation methodology to a large-break, loss-of-coolant accident

    International Nuclear Information System (INIS)

    Boyack, B.; Duffey, R.; Wilson, G.; Griffith, P.; Lellouche, G.; Levy, S.; Rohatgi, U.; Wulff, W.; Zuber, N.

    1989-12-01

    The US Nuclear Regulatory Commission (NRC) has issued a revised rule for loss-of-coolant accident/emergency core cooling system (ECCS) analysis of light water reactors to allow the use of best-estimate computer codes in safety analysis as an option. A key feature of this option requires the licensee to quantify the uncertainty of the calculations and include that uncertainty when comparing the calculated results with acceptance limits provided in 10 CFR Part 50. To support the revised ECCS rule and illustrate its application, the NRC and its contractors and consultants have developed and demonstrated an uncertainty evaluation methodology called code scaling, applicability, and uncertainty (CSAU). The CSAU methodology and an example application described in this report demonstrate that uncertainties in complex phenomena can be quantified. The methodology is structured, traceable, and practical, as is needed in the regulatory arena. The methodology is systematic and comprehensive as it addresses and integrates the scenario, experiments, code, and plant to resolve questions concerned with: (a) code capability to scale-up processes from test facility to full-scale nuclear power plants; (b) code applicability to safety studies of a postulated accident scenario in a specified nuclear power plant; and (c) quantifying uncertainties of calculated results. 127 refs., 55 figs., 40 tabs

  12. Uncertainty analysis of the 35% reactor inlet header break in a CANDU 6 reactor using RELAP/SCDAPSIM/MOD4.0 with integrated uncertainty analysis option

    International Nuclear Information System (INIS)

    Dupleac, D.; Perez, M.; Reventos, F.; Allison, C.

    2011-01-01

    The RELAP/SCDAPSIM/MOD4.0 code, designed to predict the behavior of reactor systems during normal and accident conditions, is being developed as part of an international nuclear technology Software Development and Training Program (SDTP). RELAP/SCDAPSIM/MOD4.0, which is the first version of RELAP5 completely rewritten to FORTRAN 90/95/2000 standards, uses the publicly available RELAP5 and SCDAP models in combination with (a) advanced programming and numerical techniques, (b) advanced SDTP-member-developed models for LWR, HWR, and research reactor analysis, and (c) a variety of other member-developed computational packages. One such computational package is an integrated uncertainty analysis (IUA) package being developed jointly by the Technical University of Catalonia (UPC) and Innovative Systems Software (ISS). RELAP/SCDAPSIM/MOD4.0(IUA) follows the input-propagation approach, using probability distribution functions to define the uncertainty of the input parameters. The main steps of this type of methodology, often referred to as statistical approaches or Wilks’ methods, are the following: 1. Selection of the plant; 2. Selection of the scenario; 3. Selection of the safety criteria; 4. Identification and ranking of the relevant phenomena based on the safety criteria; 5. Selection of the appropriate code parameters to represent those phenomena; 6. Association of uncertainty by means of Probability Distribution Functions (PDFs) for each selected parameter; 7. Random sampling of the selected parameters according to their PDFs and performing multiple computer runs to obtain uncertainty bands with a certain percentile and confidence level; 8. Processing the results of the multiple computer runs to estimate the uncertainty bands for the computed quantities associated with the selected safety criteria. RELAP/SCDAPSIM/MOD4.0(IUA) calculates the number of required code runs given the desired percentile and confidence level, performs the sampling process for the

  13. Uncertainty analysis of the 35% reactor inlet header break in a CANDU 6 reactor using RELAP/SCDAPSIM/MOD4.0 with integrated uncertainty analysis option

    Energy Technology Data Exchange (ETDEWEB)

    Dupleac, D., E-mail: danieldu@cne.pub.ro [Politehnica Univ. of Bucharest (Romania); Perez, M.; Reventos, F., E-mail: marina.perez@upc.edu, E-mail: francesc.reventos@upc.edu [Technical Univ. of Catalonia (Spain); Allison, C., E-mail: iss@cableone.net [Innovative Systems Software (United States)

    2011-07-01

    The RELAP/SCDAPSIM/MOD4.0 code, designed to predict the behavior of reactor systems during normal and accident conditions, is being developed as part of an international nuclear technology Software Development and Training Program (SDTP). RELAP/SCDAPSIM/MOD4.0, which is the first version of RELAP5 completely rewritten to FORTRAN 90/95/2000 standards, uses the publicly available RELAP5 and SCDAP models in combination with (a) advanced programming and numerical techniques, (b) advanced SDTP-member-developed models for LWR, HWR, and research reactor analysis, and (c) a variety of other member-developed computational packages. One such computational package is an integrated uncertainty analysis (IUA) package being developed jointly by the Technical University of Catalonia (UPC) and Innovative Systems Software (ISS). RELAP/SCDAPSIM/MOD4.0(IUA) follows the input-propagation approach, using probability distribution functions to define the uncertainty of the input parameters. The main steps of this type of methodology, often referred to as statistical approaches or Wilks’ methods, are the following: 1. Selection of the plant; 2. Selection of the scenario; 3. Selection of the safety criteria; 4. Identification and ranking of the relevant phenomena based on the safety criteria; 5. Selection of the appropriate code parameters to represent those phenomena; 6. Association of uncertainty by means of Probability Distribution Functions (PDFs) for each selected parameter; 7. Random sampling of the selected parameters according to their PDFs and performing multiple computer runs to obtain uncertainty bands with a certain percentile and confidence level; 8. Processing the results of the multiple computer runs to estimate the uncertainty bands for the computed quantities associated with the selected safety criteria. RELAP/SCDAPSIM/MOD4.0(IUA) calculates the number of required code runs given the desired percentile and confidence level, performs the sampling process for the

  14. Some uncertainty results obtained by the statistical version of the KARATE code system related to core design and safety analysis

    Energy Technology Data Exchange (ETDEWEB)

    Panka, Istvan; Hegyi, Gyoergy; Maraczy, Csaba; Temesvari, Emese [Hungarian Academy of Sciences, Budapest (Hungary). Reactor Analysis Dept.

    2017-11-15

    The best-estimate KARATE code system has been widely used for core design calculations and simulations of slow transients of VVER reactors. Recently there has been an increasing need to assess the uncertainties of such calculations by propagating the basic input uncertainties of the models through the full calculation chain. In order to determine the uncertainties of quantities of interest during burnup, the statistical version of the KARATE code system has been elaborated. In the first part of the paper, the main features of the new code system are discussed. The applied statistical method is based on Monte Carlo sampling of the considered input data, taking into account mainly the covariance matrices of the cross sections and/or the technological uncertainties. In the second part of the paper, only the uncertainties of the cross sections are considered and an equilibrium cycle of a VVER-440 type reactor is investigated. The burnup dependence of the uncertainties of some safety related parameters (e.g. critical boron concentration, rod worth, feedback coefficients, assembly-wise radial power and burnup distributions) is discussed and compared to the recently used limits.
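
    Sampling input data while taking into account a covariance matrix of the cross sections amounts to drawing correlated perturbations, typically via a Cholesky factorization of the covariance matrix. A minimal sketch with a hypothetical 3x3 relative covariance matrix:

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    # Hypothetical relative covariance matrix for three cross sections
    C = np.array([
        [0.0004, 0.0001, 0.0000],
        [0.0001, 0.0009, 0.0002],
        [0.0000, 0.0002, 0.0016],
    ])
    nominal = np.array([1.20, 0.35, 2.10])  # hypothetical nominal values

    L = np.linalg.cholesky(C)               # C = L L^T
    z = rng.standard_normal((1000, 3))
    samples = nominal * (1.0 + z @ L.T)     # correlated relative perturbations

    # The sample covariance of the relative perturbations reproduces C
    print("sample relative covariance:\n", np.cov((samples / nominal - 1).T))
    ```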

  15. Uncertainty analysis for results of thermal hydraulic codes of best-estimate-type; Analisis de incertidumbre para resultados de codigos termohidraulicos de mejor estimacion

    Energy Technology Data Exchange (ETDEWEB)

    Alva N, J.

    2010-07-01

    In this thesis, some fundamental knowledge is presented about uncertainty analysis and about the diverse methodologies applied in the study of nuclear power plant transient events, particularly those related to thermal hydraulics phenomena. These concepts and methodologies come from a wide bibliographical research in the nuclear power subject. Methodologies for uncertainty analysis have been developed by quite diverse institutions, and they have been widely used worldwide for application to results from best-estimate-type computer codes in nuclear reactor thermal hydraulics and safety analysis. Also, the main uncertainty sources, types of uncertainties, and aspects related to best-estimate modeling and methods are introduced. Once the main bases of uncertainty analysis have been set and some of the known methodologies have been introduced, the CSAU methodology, which is applied in the analyses, is presented in detail. The main objective of this thesis is to compare the results of an uncertainty and sensitivity analysis using the Response Surface Technique with those obtained by applying Wilks' formula, for a loss-of-coolant experiment and a power-rise event in a BWR. Both techniques are options in the uncertainty and sensitivity part of the CSAU methodology, which was developed for the analysis of transients and accidents at nuclear power plants and is the basis of most of the methodologies used in licensing of nuclear power plants practically everywhere. Finally, the results of applying both techniques are compared and discussed. (Author)

  16. Supporting qualified database for V and V and uncertainty evaluation of best-estimate system codes

    International Nuclear Information System (INIS)

    Petruzzi, A.; D'Auria, F.

    2014-01-01

    Uncertainty evaluation constitutes a key feature of the BEPU (Best Estimate Plus Uncertainty) process. The uncertainty can be the result of a Monte Carlo type analysis involving input uncertainty parameters or the outcome of a process involving the use of experimental data and connected code calculations. Those uncertainty methods are discussed in several papers and guidelines (IAEA-SRS-52, OECD/NEA BEMUSE reports). The present paper aims at discussing the role and the depth of the analysis required for merging, on one side, suitable experimental data and, on the other side, qualified code calculation results. This aspect is mostly connected with the second approach for uncertainty mentioned above, but it can be used also in the framework of the first approach. Namely, the paper discusses the features and structure of the database, which includes the following kinds of documents: 1. The 'RDS-facility' (Reference Data Set for the selected facility): this includes the description of the facility, the geometrical characterization of any component of the facility, the instrumentation, the data acquisition system, the evaluation of pressure losses, the physical properties of the materials and the characterization of pumps, valves and heat losses; 2. The 'RDS-test' (Reference Data Set for the selected test of the facility): this includes the description of the main phenomena investigated during the test, the configuration of the facility for the selected test (possible new evaluation of pressure and heat losses if needed) and the specific boundary and initial conditions; 3. The 'QP' (Qualification Report) of the code calculation results: this includes the description of the nodalization developed following a set of homogeneous techniques, the achievement of the steady state conditions and the qualitative and quantitative analysis of the transient with the characterization of the Relevant Thermal-Hydraulic Aspects (RTA); 4. The EH (Engineering

  17. Correlated statistical uncertainties in coded-aperture imaging

    International Nuclear Information System (INIS)

    Fleenor, Matthew C.; Blackston, Matthew A.; Ziock, Klaus P.

    2015-01-01

    In nuclear security applications, coded-aperture imagers can provide a wealth of information regarding the attributes of both the radioactive and nonradioactive components of the objects being imaged. However, for optimum benefit to the community, spatial attributes need to be determined in a quantitative and statistically meaningful manner. To address a deficiency of quantifiable errors in coded-aperture imaging, we present uncertainty matrices containing covariance terms between image pixels for MURA mask patterns. We calculated these correlated uncertainties as functions of variation in mask rank, mask pattern over-sampling, and whether or not anti-mask data are included. Utilizing simulated point-source data, we found that correlations arose when two or more image pixels were summed. Furthermore, we found that the presence of correlations was heightened by the process of over-sampling, while correlations were suppressed by the inclusion of anti-mask data and by increased mask rank. As an application of this result, we explored how statistics-based alarming is impacted in a radiological search scenario.
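
    The central observation above, that correlations appear as soon as image pixels are summed, follows from the standard propagation rule: for a pixel covariance matrix C and summing weights w, the variance of the sum is w^T C w, so the off-diagonal terms cannot be dropped. A toy numeric illustration (the covariance below is invented, not derived from a MURA pattern):

        import numpy as np

        C = np.array([[1.00, 0.30, 0.05],
                      [0.30, 1.00, 0.30],
                      [0.05, 0.30, 1.00]])   # toy covariance of three image pixels
        w = np.array([1.0, 1.0, 1.0])        # sum the three pixels

        var_full = w @ C @ w                 # correct variance of the sum: 4.30
        var_naive = np.trace(C)              # diagonal-only estimate: 3.00
        print(var_full, var_naive)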

  18. Qualitative uncertainty analysis in probabilistic safety assessment context

    International Nuclear Information System (INIS)

    Apostol, M.; Constantin, M.; Turcu, I.

    2007-01-01

    In the Probabilistic Safety Assessment (PSA) context, an uncertainty analysis is performed either to estimate the uncertainty in the final results (the risk to public health and safety) or to estimate the uncertainty in some intermediate quantities (the core damage frequency, the radionuclide release frequency or the fatality frequency). The identification and evaluation of uncertainty are important tasks because they lend credibility to the results and help in the decision-making process. Uncertainty analysis can be performed qualitatively or quantitatively. This paper performs a preliminary qualitative uncertainty analysis by identifying the major uncertainties at the PSA Level 1-Level 2 interface and in the other two major procedural steps of a Level 2 PSA, i.e., the analysis of accident progression and containment behaviour, and the analysis of the source term for severe accidents. One should mention that a Level 2 PSA for a Nuclear Power Plant (NPP) involves the evaluation and quantification of the mechanisms, amounts, and probabilities of subsequent radioactive material releases from the containment. According to NUREG-1150, an important task in source term analysis is fission product transport analysis. The uncertainties related to the isotope distribution in the CANDU NPP primary circuit and the isotope masses transferred into the containment, calculated using the SOPHAEROS module of the ASTEC computer code, will also be presented. (authors)

  19. A practical sensitivity analysis method for ranking sources of uncertainty in thermal–hydraulics applications

    Energy Technology Data Exchange (ETDEWEB)

    Pourgol-Mohammad, Mohammad, E-mail: pourgolmohammad@sut.ac.ir [Department of Mechanical Engineering, Sahand University of Technology, Tabriz (Iran, Islamic Republic of); Hoseyni, Seyed Mohsen [Department of Basic Sciences, East Tehran Branch, Islamic Azad University, Tehran (Iran, Islamic Republic of); Hoseyni, Seyed Mojtaba [Building & Housing Research Center, Tehran (Iran, Islamic Republic of); Sepanloo, Kamran [Nuclear Science and Technology Research Institute, Tehran (Iran, Islamic Republic of)

    2016-08-15

    Highlights: • Existing uncertainty ranking methods prove inconsistent for TH applications. • Introduction of a new method for ranking sources of uncertainty in TH codes. • Modified PIRT qualitatively identifies and ranks uncertainty sources more precisely. • The importance of parameters is calculated by a limited number of TH code executions. • Methodology is applied successfully on LOFT-LB1 test facility. - Abstract: In application to thermal–hydraulic calculations by system codes, sensitivity analysis plays an important role in managing the uncertainties of code output and risk analysis. Sensitivity analysis is also used to confirm the results of the qualitative Phenomena Identification and Ranking Table (PIRT). Several methodologies have been developed to address uncertainty importance assessment. Generally, uncertainty importance measures, mainly devised for Probabilistic Risk Assessment (PRA) applications, are not affordable for the computationally demanding calculations of complex thermal–hydraulic (TH) system codes. In other words, for effective quantification of the degree of the contribution of each phenomenon to the total uncertainty of the output, a practical approach is needed that takes into account the high computational burden of TH calculations. This study aims primarily to show the inefficiency of the existing approaches and then introduces a solution to cope with the challenges in this area through a modification of the variance-based uncertainty importance method. Important parameters are identified qualitatively by the modified PIRT approach, and their uncertainty importance is then quantified by a local derivative index. The proposed index is attractive from a practicality point of view for TH applications: it is capable of calculating the importance of parameters with a limited number of TH code executions. Application of the proposed methodology is demonstrated on the LOFT-LB1 test facility.
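
    The local derivative index described above can be sketched as follows: each input is perturbed once around its nominal value (one extra "code run" per input) and the finite-difference derivative, scaled by the parameter's standard deviation, ranks its contribution. The function th_code below is a hypothetical stand-in for an expensive TH calculation, and all names and numbers are illustrative:

        import numpy as np

        def th_code(x):
            # placeholder response, e.g. a surrogate for peak cladding temperature
            return 600.0 + 80.0 * x[0] + 15.0 * x[1] ** 2 + 5.0 * x[0] * x[2]

        nominal = np.array([1.0, 1.0, 1.0])
        sigma = np.array([0.10, 0.20, 0.05])      # parameter standard deviations
        base = th_code(nominal)

        importance = {}
        for i, name in enumerate(["htc", "power", "gap"]):
            x = nominal.copy()
            x[i] += 1e-3                           # small local perturbation
            deriv = (th_code(x) - base) / 1e-3     # finite-difference derivative
            importance[name] = abs(deriv) * sigma[i]

        print(sorted(importance.items(), key=lambda kv: -kv[1]))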

  20. A practical sensitivity analysis method for ranking sources of uncertainty in thermal–hydraulics applications

    International Nuclear Information System (INIS)

    Pourgol-Mohammad, Mohammad; Hoseyni, Seyed Mohsen; Hoseyni, Seyed Mojtaba; Sepanloo, Kamran

    2016-01-01

    Highlights: • Existing uncertainty ranking methods prove inconsistent for TH applications. • Introduction of a new method for ranking sources of uncertainty in TH codes. • Modified PIRT qualitatively identifies and ranks uncertainty sources more precisely. • The importance of parameters is calculated by a limited number of TH code executions. • Methodology is applied successfully on LOFT-LB1 test facility. - Abstract: In application to thermal–hydraulic calculations by system codes, sensitivity analysis plays an important role in managing the uncertainties of code output and risk analysis. Sensitivity analysis is also used to confirm the results of the qualitative Phenomena Identification and Ranking Table (PIRT). Several methodologies have been developed to address uncertainty importance assessment. Generally, uncertainty importance measures, mainly devised for Probabilistic Risk Assessment (PRA) applications, are not affordable for the computationally demanding calculations of complex thermal–hydraulic (TH) system codes. In other words, for effective quantification of the degree of the contribution of each phenomenon to the total uncertainty of the output, a practical approach is needed that takes into account the high computational burden of TH calculations. This study aims primarily to show the inefficiency of the existing approaches and then introduces a solution to cope with the challenges in this area through a modification of the variance-based uncertainty importance method. Important parameters are identified qualitatively by the modified PIRT approach, and their uncertainty importance is then quantified by a local derivative index. The proposed index is attractive from a practicality point of view for TH applications: it is capable of calculating the importance of parameters with a limited number of TH code executions. Application of the proposed methodology is demonstrated on the LOFT-LB1 test facility.

  1. Development of the integrated system reliability analysis code MODULE

    International Nuclear Information System (INIS)

    Han, S.H.; Yoo, K.J.; Kim, T.W.

    1987-01-01

    The major components of a system reliability analysis are the determination of cut sets, importance measures, and uncertainty analysis. Various computer codes have been used for these purposes: for example, SETS and FTAP to determine cut sets; Importance for importance calculations; and Sample, CONINT, and MOCUP for uncertainty analysis. Problems have arisen when these codes are run one after another and their inputs and outputs are not linked, which can result in errors when preparing the input for each code. The code MODULE was developed to carry out the above calculations simultaneously, without the need to transfer inputs and outputs between other codes. MODULE can also prepare input for SETS for the case of a large fault tree that cannot be handled by MODULE itself. The flow diagram of the MODULE code is shown. To verify the MODULE code, two examples were selected and the results and computation times were compared with those of SETS, FTAP, CONINT, and MOCUP on both a Cyber 170-875 and an IBM PC/AT. The two examples are fault trees of the auxiliary feedwater systems (AFWS) of Korea Nuclear Units (KNU)-1 and -2, which have 54 gates and 115 events, and 39 gates and 92 events, respectively. The MODULE code has the advantage that it can calculate the cut sets, importances, and uncertainties in a single run with little increase in computing time over the other codes, and that it can be used on personal computers.
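
    The three analyses MODULE combines in a single run can be sketched generically: given minimal cut sets and uncertain basic-event probabilities, Monte Carlo sampling yields the top-event uncertainty, and Fussell-Vesely ratios yield cut-set importances. The event names, lognormal parameters, and cut sets below are invented for illustration, and the rare-event approximation is used:

        import numpy as np

        rng = np.random.default_rng(1)
        cut_sets = [("pump_a", "pump_b"), ("valve", "pump_a"), ("power",)]
        medians = {"pump_a": 1e-3, "pump_b": 2e-3, "valve": 5e-4, "power": 1e-5}
        error_factor = 3.0                        # lognormal 95th/50th percentile ratio

        n = 10000
        sigma = np.log(error_factor) / 1.645
        p = {e: m * np.exp(sigma * rng.standard_normal(n)) for e, m in medians.items()}

        cs_prob = [np.prod([p[e] for e in cs], axis=0) for cs in cut_sets]
        top = sum(cs_prob)                        # rare-event approximation of top event

        print("mean:", top.mean(), " 95th percentile:", np.quantile(top, 0.95))
        for cs, cp in zip(cut_sets, cs_prob):
            print(cs, "Fussell-Vesely =", (cp / top).mean())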

  2. Qualification and application of nuclear reactor accident analysis code with the capability of internal assessment of uncertainty; Qualificacao e aplicacao de codigo de acidentes de reatores nucleares com capacidade interna de avaliacao de incerteza

    Energy Technology Data Exchange (ETDEWEB)

    Borges, Ronaldo Celem

    2001-10-15

    This thesis presents an independent qualification of the CIAU code ('Code with the capability of Internal Assessment of Uncertainty'), which is part of the process of internal uncertainty evaluation with a thermal hydraulic system code on a realistic basis. This is done by combining the uncertainty methodology UMAE ('Uncertainty Methodology based on Accuracy Extrapolation') with the RELAP5/Mod3.2 code. This allows associating uncertainty band estimates with the results obtained by the realistic calculation of the code, meeting licensing requirements of safety analysis. The independent qualification is supported by simulations with RELAP5/Mod3.2 related to accident-condition tests of the LOBI experimental facility and to an event which occurred in the Angra 1 nuclear power plant, by comparison with measured results and by establishing uncertainty bands on the calculated time trends of safety parameters. These bands have indeed enveloped the measured trends. The results of this independent qualification of CIAU have made it possible to ascertain the adequate application of a systematic realistic code procedure to analyse accidents with uncertainties incorporated in the results, although there is an evident need to extend the uncertainty database. It has been verified that use of the code with this internal assessment of uncertainty is feasible in the design and licensing stages of an NPP. (author)

  3. Uncertainty analysis for the assembly and core simulation of BEAVRS at the HZP conditions

    Energy Technology Data Exchange (ETDEWEB)

    Wan, Chenghui [School of Nuclear Science and Technology, Xi’an Jiaotong University, Xi’an 710049 (China); Cao, Liangzhi, E-mail: caolz@mail.xjtu.edu.cn [School of Nuclear Science and Technology, Xi’an Jiaotong University, Xi’an 710049 (China); Wu, Hongchun [School of Nuclear Science and Technology, Xi’an Jiaotong University, Xi’an 710049 (China); Shen, Wei [School of Nuclear Science and Technology, Xi’an Jiaotong University, Xi’an 710049 (China); Canadian Nuclear Safety Commission, Ottawa, Ontario (Canada)

    2017-04-15

    Highlights: • Uncertainty analysis has been completed based on the “two-step” scheme. • Uncertainty analysis has been performed for BEAVRS at HZP. • For lattice calculations, the few-group constants’ uncertainties were quantified. • For core simulations, uncertainties of k_eff and power distributions were quantified. - Abstract: Based on the “two-step” scheme for reactor-physics calculations, the capability of uncertainty analysis for core simulations has been implemented in the UNICORN code, an in-house code for the sensitivity and uncertainty analysis of reactor-physics calculations. Applying the statistical sampling method, the nuclear-data uncertainties can be propagated to the important predictions of the core simulations. The uncertainties of the few-group constants introduced by the uncertainties of the multigroup microscopic cross sections are quantified first for the lattice calculations; the uncertainties of the few-group constants are then propagated to the core multiplication factor and core power distributions for the core simulations. Up to now, our in-house lattice code NECP-CACTI and the neutron-diffusion solver NECP-VIOLET have been implemented in UNICORN for steady-state core simulations based on the “two-step” scheme. With NECP-CACTI and NECP-VIOLET, the modeling and simulation of the steady-state BEAVRS benchmark problem at the HZP conditions was performed, and the results were compared with those obtained by CASMO-4E. Based on this modeling and simulation, the UNICORN code has been applied to perform the uncertainty analysis for BEAVRS at HZP. The uncertainty results of the eigenvalues and two-group constants for the lattice calculations, and of the multiplication factor and the power distributions for the steady-state core simulations, are obtained and analyzed in detail.

  4. Uncertainty analysis for the assembly and core simulation of BEAVRS at the HZP conditions

    International Nuclear Information System (INIS)

    Wan, Chenghui; Cao, Liangzhi; Wu, Hongchun; Shen, Wei

    2017-01-01

    Highlights: • Uncertainty analysis has been completed based on the “two-step” scheme. • Uncertainty analysis has been performed for BEAVRS at HZP. • For lattice calculations, the few-group constants’ uncertainties were quantified. • For core simulations, uncertainties of k_eff and power distributions were quantified. - Abstract: Based on the “two-step” scheme for reactor-physics calculations, the capability of uncertainty analysis for core simulations has been implemented in the UNICORN code, an in-house code for the sensitivity and uncertainty analysis of reactor-physics calculations. Applying the statistical sampling method, the nuclear-data uncertainties can be propagated to the important predictions of the core simulations. The uncertainties of the few-group constants introduced by the uncertainties of the multigroup microscopic cross sections are quantified first for the lattice calculations; the uncertainties of the few-group constants are then propagated to the core multiplication factor and core power distributions for the core simulations. Up to now, our in-house lattice code NECP-CACTI and the neutron-diffusion solver NECP-VIOLET have been implemented in UNICORN for steady-state core simulations based on the “two-step” scheme. With NECP-CACTI and NECP-VIOLET, the modeling and simulation of the steady-state BEAVRS benchmark problem at the HZP conditions was performed, and the results were compared with those obtained by CASMO-4E. Based on this modeling and simulation, the UNICORN code has been applied to perform the uncertainty analysis for BEAVRS at HZP. The uncertainty results of the eigenvalues and two-group constants for the lattice calculations, and of the multiplication factor and the power distributions for the steady-state core simulations, are obtained and analyzed in detail.

  5. One Approach to the Fire PSA Uncertainty Analysis

    International Nuclear Information System (INIS)

    Simic, Z.; Mikulicic, V.; Vukovic, I.

    2002-01-01

    Practical experience and findings from a number of fire probabilistic safety assessment (PSA) studies show that fire has a high relative importance for nuclear power plant safety. Fire PSA is a very challenging field and a number of issues are still in the area of research and development. This has a major impact on the conservatism of fire PSA findings. One way to reduce the level of conservatism is to conduct an uncertainty analysis. At the top level, the uncertainty of a fire PSA can be separated into three segments. The first segment is related to fire initiating event frequencies. The second uncertainty segment is connected to the uncertainty of fire damage. Finally, there is uncertainty related to the PSA model, which propagates this fire-initiated damage to the core damage or other analyzed risk. This paper discusses all three segments of uncertainty. Some recent experience with fire PSA study uncertainty analysis, usage of the fire analysis code COMPBRN IIIe, and the importance of uncertainty evaluation to the final result is presented. (author)
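
    The three uncertainty segments discussed above can be combined in a simple Monte Carlo, with one distribution per segment: the fire initiating frequency, the conditional probability of fire damage, and the conditional core damage probability carried by the PSA model. All distributions below are illustrative placeholders, not values from an actual fire PSA:

        import numpy as np

        rng = np.random.default_rng(7)
        n = 20000
        fire_freq = rng.lognormal(np.log(1e-2), 0.8, n)   # fires per year
        p_damage = rng.beta(2.0, 8.0, n)                  # P(critical equipment damaged)
        p_ccdp = np.minimum(rng.lognormal(np.log(5e-3), 1.0, n), 1.0)

        cdf = fire_freq * p_damage * p_ccdp               # core damage frequency samples
        print("mean CDF:", cdf.mean(), "/yr;  5th-95th:", np.quantile(cdf, [0.05, 0.95]))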

  6. Uncertainty analysis in the simulation of an HPGe detector using the Monte Carlo Code MCNP5

    International Nuclear Information System (INIS)

    Gallardo, Sergio; Pozuelo, Fausto; Querol, Andrea; Verdu, Gumersindo; Rodenas, Jose; Ortiz, J.; Pereira, Claubia

    2013-01-01

    A gamma spectrometer including an HPGe detector is commonly used for environmental radioactivity measurements. Many works have focused on the simulation of the HPGe detector using Monte Carlo codes such as MCNP5. However, the simulation of this kind of detector presents important difficulties due to the lack of information from manufacturers and the loss of intrinsic properties in aging detectors. Some parameters, such as the active volume or the Ge dead layer thickness, are often unknown and are estimated during simulations. In this work, a detailed model of an HPGe detector and of a petri dish containing a certified gamma source has been developed. The certified gamma source contains nuclides covering the energy range between 50 and 1800 keV. As a result of the simulation, the Pulse Height Distribution (PHD) is obtained, and the efficiency curve can be calculated from the net peak areas, taking into account the certified activity of the source. In order to avoid errors due to the net area calculation, the simulated PHD is treated using the GammaVision software. In addition, it is proposed to use the Noether-Wilks formula to perform an uncertainty analysis of the model, with the main goal of determining the efficiency curve of this detector and its associated uncertainty. The uncertainty analysis has been focused on the dead layer thickness at different positions of the crystal. Results confirm the important role of the dead layer thickness in the low-energy range of the efficiency curve. In the high-energy range (from 300 to 1800 keV), the main contribution to the absolute uncertainty is due to variations in the active volume. (author)

  7. Uncertainty analysis in the simulation of an HPGe detector using the Monte Carlo Code MCNP5

    Energy Technology Data Exchange (ETDEWEB)

    Gallardo, Sergio; Pozuelo, Fausto; Querol, Andrea; Verdu, Gumersindo; Rodenas, Jose, E-mail: sergalbe@upv.es [Universitat Politecnica de Valencia, Valencia, (Spain). Instituto de Seguridad Industrial, Radiofisica y Medioambiental (ISIRYM); Ortiz, J. [Universitat Politecnica de Valencia, Valencia, (Spain). Servicio de Radiaciones. Lab. de Radiactividad Ambiental; Pereira, Claubia [Universidade Federal de Minas Gerais (UFMG), Belo Horizonte, MG (Brazil). Departamento de Engenharia Nuclear

    2013-07-01

    A gamma spectrometer including an HPGe detector is commonly used for environmental radioactivity measurements. Many works have focused on the simulation of the HPGe detector using Monte Carlo codes such as MCNP5. However, the simulation of this kind of detector presents important difficulties due to the lack of information from manufacturers and the loss of intrinsic properties in aging detectors. Some parameters, such as the active volume or the Ge dead layer thickness, are often unknown and are estimated during simulations. In this work, a detailed model of an HPGe detector and of a petri dish containing a certified gamma source has been developed. The certified gamma source contains nuclides covering the energy range between 50 and 1800 keV. As a result of the simulation, the Pulse Height Distribution (PHD) is obtained, and the efficiency curve can be calculated from the net peak areas, taking into account the certified activity of the source. In order to avoid errors due to the net area calculation, the simulated PHD is treated using the GammaVision software. In addition, it is proposed to use the Noether-Wilks formula to perform an uncertainty analysis of the model, with the main goal of determining the efficiency curve of this detector and its associated uncertainty. The uncertainty analysis has been focused on the dead layer thickness at different positions of the crystal. Results confirm the important role of the dead layer thickness in the low-energy range of the efficiency curve. In the high-energy range (from 300 to 1800 keV), the main contribution to the absolute uncertainty is due to variations in the active volume. (author)
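
    The efficiency-curve calculation described in the two records above reduces, for each gamma line, to dividing the net peak area by the number of photons emitted during the count: eff(E) = N_net / (A * I_gamma * t). The sketch below uses placeholder numbers, not a calibrated source certificate:

        import numpy as np

        live_time = 3600.0                  # counting live time, s
        activity = 5.0e3                    # certified activity, Bq
        lines = {                           # energy (keV): (net peak counts, I_gamma)
            59.5:   (41000.0, 0.359),
            661.7:  (23000.0, 0.851),
            1332.5: (12000.0, 0.999),
        }

        for energy, (net, i_gamma) in lines.items():
            eff = net / (activity * i_gamma * live_time)   # full-energy peak efficiency
            u_rel = 1.0 / np.sqrt(net)                     # counting statistics only
            print(f"{energy:7.1f} keV  eff = {eff:.4e}  (+/- {100 * u_rel:.2f}% stat)")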

  8. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment. Volume 3, Appendices C, D, E, F, and G

    Energy Technology Data Exchange (ETDEWEB)

    Harper, F.T.; Young, M.L.; Miller, L.A. [Sandia National Labs., Albuquerque, NM (United States)] [and others]

    1995-01-01

    MACCS and COSYMA, two new probabilistic accident consequence codes whose development was completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, developed independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. The resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from the experts involved in the many phenomenological areas of consequence analysis. This volume is the third of a three-volume document describing the project and contains descriptions of the probability assessment principles; the expert identification and selection process; the weighting methods used; the inverse modeling methods; case structures; and summaries of the consequence codes.

  9. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment. Volume 3, Appendices C, D, E, F, and G

    International Nuclear Information System (INIS)

    Harper, F.T.; Young, M.L.; Miller, L.A.

    1995-01-01

    MACCS and COSYMA, two new probabilistic accident consequence codes whose development was completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, developed independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. The resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from the experts involved in the many phenomenological areas of consequence analysis. This volume is the third of a three-volume document describing the project and contains descriptions of the probability assessment principles; the expert identification and selection process; the weighting methods used; the inverse modeling methods; case structures; and summaries of the consequence codes.

  10. Sampling-based nuclear data uncertainty quantification for continuous energy Monte-Carlo codes

    International Nuclear Information System (INIS)

    Zhu, T.

    2015-01-01

    Research on the uncertainty of nuclear data is motivated by practical necessity. Nuclear data uncertainties can propagate through nuclear system simulations into operation- and safety-related parameters. The tolerance for uncertainties in nuclear reactor design and operation can affect the economic efficiency of nuclear power, and essentially its sustainability. The goal of the present PhD research is to establish a methodology of nuclear data uncertainty quantification (NDUQ) for MCNPX, the continuous-energy Monte-Carlo (M-C) code. The high fidelity (continuous-energy treatment and flexible geometry modelling) of MCNPX makes it the code of choice for routine criticality safety calculations at PSI/LRS, but also raises challenges for NDUQ by conventional sensitivity/uncertainty (S/U) methods. For example, only recently, in 2011, was the capability of calculating continuous-energy k_eff sensitivity to nuclear data demonstrated in certain M-C codes by using the method of iterated fission probability. The methodology developed during this PhD research is fundamentally different from the conventional S/U approach: nuclear data are treated as random variables and sampled in accordance with presumed probability distributions. When sampled nuclear data are used in repeated model calculations, the output variance is attributed to the collective uncertainties of the nuclear data. The NUSS (Nuclear data Uncertainty Stochastic Sampling) tool is based on this sampling approach and is implemented to work with MCNPX's ACE format of nuclear data, which also gives NUSS compatibility with the MCNP and SERPENT M-C codes. Multigroup uncertainties are used for the sampling of ACE-formatted pointwise-energy nuclear data in a groupwise manner, due to the more limited quantity and quality of nuclear data uncertainties. Conveniently, the usage of multigroup nuclear data uncertainties allows consistent comparison between NUSS and other methods (both S/U and sampling-based) that employ the same

  11. Sampling-based nuclear data uncertainty quantification for continuous energy Monte-Carlo codes

    Energy Technology Data Exchange (ETDEWEB)

    Zhu, T.

    2015-07-01

    Research on the uncertainty of nuclear data is motivated by practical necessity. Nuclear data uncertainties can propagate through nuclear system simulations into operation- and safety-related parameters. The tolerance for uncertainties in nuclear reactor design and operation can affect the economic efficiency of nuclear power, and essentially its sustainability. The goal of the present PhD research is to establish a methodology of nuclear data uncertainty quantification (NDUQ) for MCNPX, the continuous-energy Monte-Carlo (M-C) code. The high fidelity (continuous-energy treatment and flexible geometry modelling) of MCNPX makes it the code of choice for routine criticality safety calculations at PSI/LRS, but also raises challenges for NDUQ by conventional sensitivity/uncertainty (S/U) methods. For example, only recently, in 2011, was the capability of calculating continuous-energy k_eff sensitivity to nuclear data demonstrated in certain M-C codes by using the method of iterated fission probability. The methodology developed during this PhD research is fundamentally different from the conventional S/U approach: nuclear data are treated as random variables and sampled in accordance with presumed probability distributions. When sampled nuclear data are used in repeated model calculations, the output variance is attributed to the collective uncertainties of the nuclear data. The NUSS (Nuclear data Uncertainty Stochastic Sampling) tool is based on this sampling approach and is implemented to work with MCNPX's ACE format of nuclear data, which also gives NUSS compatibility with the MCNP and SERPENT M-C codes. Multigroup uncertainties are used for the sampling of ACE-formatted pointwise-energy nuclear data in a groupwise manner, due to the more limited quantity and quality of nuclear data uncertainties. Conveniently, the usage of multigroup nuclear data uncertainties allows consistent comparison between NUSS and other methods (both S/U and sampling-based) that employ the same
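
    The groupwise sampling idea described in the two records above can be sketched as follows: relative factors drawn from a multigroup covariance are applied uniformly to all pointwise cross-section values falling inside each group's energy bounds. The group structure, covariance, and 1/v cross section below are toy placeholders, not NUSS internals:

        import numpy as np

        rng = np.random.default_rng(3)
        group_bounds = np.array([1e-5, 1.0, 1e3, 2e7])      # eV, three coarse groups
        rel_cov = np.diag([0.02, 0.01, 0.03]) ** 2          # uncorrelated toy covariance

        energies = np.logspace(-4, 7, 12)                   # pointwise energy grid (eV)
        sigma_point = 10.0 / np.sqrt(energies)              # toy 1/v cross section

        factor = rng.multivariate_normal(np.ones(3), rel_cov)
        group_idx = np.clip(np.searchsorted(group_bounds, energies, side="right") - 1, 0, 2)
        sigma_sampled = sigma_point * factor[group_idx]     # one perturbed "ACE-like" set

        print(factor)
        print(sigma_sampled / sigma_point)                  # constant within each group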

  12. PIV Uncertainty Methodologies for CFD Code Validation at the MIR Facility

    Energy Technology Data Exchange (ETDEWEB)

    Sabharwall, Piyush [Idaho National Lab. (INL), Idaho Falls, ID (United States); Skifton, Richard [Idaho National Lab. (INL), Idaho Falls, ID (United States); Stoots, Carl [Idaho National Lab. (INL), Idaho Falls, ID (United States); Kim, Eung Soo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Conder, Thomas [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2013-12-01

    Currently, computational fluid dynamics (CFD) is widely used in the nuclear thermal hydraulics field for design and safety analyses. To validate CFD codes, high-quality multi-dimensional flow-field data are essential. The Matched Index of Refraction (MIR) Flow Facility at Idaho National Laboratory has a unique capability to contribute to the development of validated CFD codes through the use of Particle Image Velocimetry (PIV). The significance of the MIR facility is that it permits non-intrusive velocity measurement techniques, such as PIV, through complex models without requiring probes and other instrumentation that disturb the flow. At the heart of any PIV calculation is the cross-correlation, which is used to estimate the displacement of particles in some small part of the image over the time span between two images. This image displacement is indicated by the location of the largest correlation peak. In the MIR facility, uncertainty quantification is a challenging task due to the use of optical measurement techniques. This study is therefore developing a reliable method to analyze the uncertainty and sensitivity of the measured data, together with a computer code to automate that analysis. The main objective of this study is to develop a well-established uncertainty quantification method for the MIR Flow Facility, which involves many complicated uncertainty factors. In this study, the uncertainty sources are resolved in depth by categorizing them into uncertainties from the MIR flow loop and from the PIV system (including particle motion, image distortion, and data processing). Each uncertainty source is then mathematically modeled or adequately defined. Finally, this study will provide a method and procedure to quantify the experimental uncertainty in the MIR Flow Facility with sample test results.
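
    The cross-correlation step described above can be reduced to a few lines: the displacement of an interrogation window between two frames is read off the location of the correlation peak. The synthetic image below stands in for real MIR recordings, and the shift is circular for simplicity:

        import numpy as np

        rng = np.random.default_rng(0)
        frame_a = rng.random((32, 32))                      # first exposure
        frame_b = np.roll(frame_a, (3, 5), axis=(0, 1))     # particles displaced by (3, 5)

        a = frame_a - frame_a.mean()
        b = frame_b - frame_b.mean()

        # circular cross-correlation via FFT; the largest peak marks the displacement
        corr = np.fft.ifft2(np.conj(np.fft.fft2(a)) * np.fft.fft2(b)).real
        peak = np.unravel_index(np.argmax(corr), corr.shape)
        print("recovered displacement:", tuple(int(i) for i in peak))   # (3, 5)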

  13. A Proposal on the Advanced Sampling Based Sensitivity and Uncertainty Analysis Method for the Eigenvalue Uncertainty Analysis

    International Nuclear Information System (INIS)

    Kim, Song Hyun; Song, Myung Sub; Shin, Chang Ho; Noh, Jae Man

    2014-01-01

    When using perturbation theory, the uncertainty of the response can be estimated by a single transport simulation, and it therefore requires a small computational load. However, it has the disadvantage that the computational methodology must be modified whenever a different response type, such as the multiplication factor, flux, or power distribution, is estimated. Hence, it is suitable for analyzing a few responses with many perturbed parameters. The statistical approach is a sampling-based method which uses cross sections randomly sampled from covariance data for analyzing the uncertainty of the response. XSUSA is a code based on the statistical approach. The cross sections are only modified by the sampling-based method; thus, general transport codes can be directly utilized for the S/U analysis without any code modifications. However, to calculate the uncertainty distribution from the results, the code simulation must be repeated many times with randomly sampled cross sections. This inefficiency is known as a disadvantage of the stochastic method. In this study, an advanced sampling and estimation method for the cross sections is proposed and verified to increase the estimation efficiency of the sampling-based S/U method. The main feature of the proposed method is that the cross section averaged from each single sampled cross section is used. The proposed method was validated against the perturbation theory.

  14. An uncertainty analysis using the NRPB accident consequence code Marc

    International Nuclear Information System (INIS)

    Jones, J.A.; Crick, M.J.; Simmonds, J.R.

    1991-01-01

    This paper describes an uncertainty analysis of MARC calculations of the consequences of accidental releases of radioactive materials to the atmosphere. A total of 98 parameters describing the transfer of material through the environment to man, the doses received, and the health effects resulting from these doses was considered. The uncertainties in the numbers of early and late health effects, the numbers of people affected by countermeasures, the amounts of food restricted, and the economic costs of the accident were estimated. This paper concentrates on the results for early death and fatal cancer for a large hypothetical release from a PWR.

  15. Uncertainty Analysis of RBMK-Related Experimental Data

    International Nuclear Information System (INIS)

    Urbonas, Rolandas; Kaliatka, Algirdas; Liaukonis, Mindaugas

    2002-01-01

    An attempt was made to validate the state-of-the-art thermal hydraulic code ATHLET (GRS, Germany) on the basis of the E-108 test facility. Originally this code was developed and validated for reactor types other than the RBMK. Since state-of-the-art thermal hydraulic codes are widely used for the simulation of RBMK reactors, further code implementation and validation are required. The phenomena associated with channel-type flow instabilities and CHF were found to be an important element in the frame of the overall effort of state-of-the-art code validation and application for RBMK reactors. In the paper, a one-channel-approach analysis is presented; consequently, the oscillatory behaviour of the system was not detected. The results show a dependence on the nodalization used in the heated channels, the initial and boundary conditions, and the selected code models. It is shown that the code is able to predict a sudden heat structure temperature excursion when the critical heat flux is approached. The uncertainty and sensitivity methodology developed by GRS was employed in the analysis. (authors)

  16. Validation of Fuel Performance Uncertainty for RIA Safety Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Park, Nam-Gyu; Yoo, Jong-Sung; Jung, Yil-Sup [KEPCO Nuclear Fuel Co., Daejeon (Korea, Republic of)

    2016-10-15

    To achieve this, the computer code performance has to be validated against experimental results, and for the uncertainty quantification, important uncertainty parameters need to be selected and the combined uncertainty evaluated with an acceptable statistical treatment. Uncertainty parameters important to rod performance, such as fuel enthalpy, fission gas release, and cladding hoop strain, were chosen through rigorous sensitivity studies, and their validity has been assessed by utilizing the experimental results from tests performed in CABRI and NSRR. Analysis results revealed that several tested rods were not bounded within the combined fuel performance uncertainty. An assessment of fuel performance with an extended fuel power uncertainty was therefore performed for the rods tested in NSRR and CABRI; the results showed that several tested rods were still not bounded within the calculated fuel performance uncertainty. This implies that the currently considered uncertainty ranges of the parameters are not sufficient to cover the fuel performance.

  17. CSAU (code scaling, applicability and uncertainty), a tool to prioritize advanced reactor research

    International Nuclear Information System (INIS)

    Wilson, G.E.; Boyack, B.E.

    1990-01-01

    Best Estimate computer codes have been accepted by the US Nuclear Regulatory Commission as an optional tool for performing safety analysis related to the licensing and regulation of current nuclear reactors producing commercial electrical power, provided their uncertainty is quantified. In support of this policy change, the NRC and its contractors and consultants have developed and demonstrated an uncertainty quantification methodology called CSAU. At the process level, the method is generic to any application which relies on best-estimate computer code simulations to determine safe operating margins. The primary use of the CSAU methodology is to quantify safety margins for existing designs; however, the methodology can also serve an equally important role in advanced reactor research for plants not yet built. Applied early, during the period when alternate designs are being evaluated, the methodology can identify the relative importance of the sources of uncertainty in the knowledge of each plant's behavior and thereby help prioritize the research needed to bring the new designs to fruition. This paper describes the CSAU methodology at the generic process level and provides the general principles whereby it may be applied to evaluations of advanced reactor designs. 9 refs., 1 fig., 1 tab

  18. Finite mixture models for sensitivity analysis of thermal hydraulic codes for passive safety systems analysis

    Energy Technology Data Exchange (ETDEWEB)

    Di Maio, Francesco, E-mail: francesco.dimaio@polimi.it [Energy Department, Politecnico di Milano, Via La Masa 34, 20156 Milano (Italy); Nicola, Giancarlo [Energy Department, Politecnico di Milano, Via La Masa 34, 20156 Milano (Italy); Zio, Enrico [Energy Department, Politecnico di Milano, Via La Masa 34, 20156 Milano (Italy); Chair on System Science and Energetic Challenge Fondation EDF, Ecole Centrale Paris and Supelec, Paris (France); Yu, Yu [School of Nuclear Science and Engineering, North China Electric Power University, 102206 Beijing (China)

    2015-08-15

    Highlights: • Uncertainties of TH codes affect the system failure probability quantification. • We present Finite Mixture Models (FMMs) for sensitivity analysis of TH codes. • FMMs approximate the pdf of the output of a TH code with a limited number of simulations. • The approach is tested on a Passive Containment Cooling System of an AP1000 reactor. • The novel approach improves on the results of a standard variance decomposition method. - Abstract: For the safety analysis of Nuclear Power Plants (NPPs), Best Estimate (BE) Thermal Hydraulic (TH) codes are used to predict system response in normal and accidental conditions. The assessment of the uncertainties of TH codes is a critical issue for system failure probability quantification. In this paper, we consider passive safety systems of advanced NPPs and present a novel approach to Sensitivity Analysis (SA). The approach is based on Finite Mixture Models (FMMs), which approximate the probability density function (i.e., the uncertainty) of the output of the passive safety system TH code with a limited number of simulations. To keep the computational cost low, an Expectation Maximization (EM) algorithm is used to calculate the saliency of the TH code input variables, identifying those that most affect the system functional failure. The novel approach is compared with a standard variance decomposition method on a case study considering a Passive Containment Cooling System (PCCS) of the AP1000 advanced reactor.
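
    The FMM idea above can be sketched with an off-the-shelf EM fit: a small Gaussian mixture is fitted to a limited set of code outputs, and the fitted pdf then yields, for example, an exceedance (functional-failure) probability. The bimodal sample and the limit of 350 below are synthetic, not AP1000 PCCS results:

        import numpy as np
        from scipy.stats import norm
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(5)
        runs = np.concatenate([rng.normal(320.0, 8.0, 60),     # nominal behaviour
                               rng.normal(365.0, 5.0, 20)])    # degraded tail
        gmm = GaussianMixture(n_components=2, random_state=0).fit(runs.reshape(-1, 1))

        limit = 350.0
        p_fail = sum(w * norm.sf(limit, m, np.sqrt(v))         # P(output > limit)
                     for w, m, v in zip(gmm.weights_,
                                        gmm.means_.ravel(),
                                        gmm.covariances_.ravel()))
        print("approximate exceedance probability:", p_fail)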

  19. Application of code scaling, applicability and uncertainty methodology to large break LOCA analysis of two loop PWR

    International Nuclear Information System (INIS)

    Mavko, B.; Stritar, A.; Prosek, A.

    1993-01-01

    In NED 119, No. 1 (May 1990), a series of six papers published by a Technical Program Group presented a new methodology for the safety evaluation of emergency core cooling systems in nuclear power plants. This paper describes the application of that new methodology to the LB LOCA analysis of a two-loop Westinghouse power plant. Results of the original work were used wherever possible, so that the analysis was finished in less than one man-year of work. The steam generator plugging level and the safety injection flow rate were used as additional uncertainty parameters, which had not been used in the original work. The computer code RELAP5/MOD2 was used. The response surface was generated by regression analysis and by the artificial-neural-network-like Optimal Statistical Estimator method. The results were also compared to the analytical calculation. (orig.)
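
    A regression-built response surface of the kind used above can be sketched as a quadratic polynomial in two normalized inputs (say, plugging level and injection flow), fitted to a handful of code runs and then sampled cheaply in place of the code. All data below are synthetic placeholders, not RELAP5/MOD2 results:

        import numpy as np

        rng = np.random.default_rng(2)

        def design(x):      # quadratic basis: 1, x1, x2, x1^2, x2^2, x1*x2
            return np.column_stack([np.ones(len(x)), x[:, 0], x[:, 1],
                                    x[:, 0] ** 2, x[:, 1] ** 2, x[:, 0] * x[:, 1]])

        x = rng.uniform(-1.0, 1.0, (30, 2))                 # 30 "code runs"
        y = (1100.0 + 60.0 * x[:, 0] - 25.0 * x[:, 1]
             + 10.0 * x[:, 0] * x[:, 1] + rng.normal(0.0, 3.0, 30))   # e.g. PCT (K)

        coef, *_ = np.linalg.lstsq(design(x), y, rcond=None)

        s = rng.uniform(-1.0, 1.0, (100000, 2))             # cheap surrogate sampling
        print("95th percentile PCT:", np.quantile(design(s) @ coef, 0.95))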

  20. Deterministic sensitivity and uncertainty methodology for best estimate system codes applied in nuclear technology

    International Nuclear Information System (INIS)

    Petruzzi, A.; D'Auria, F.; Cacuci, D.G.

    2009-01-01

    Nuclear Power Plant (NPP) technology has been developed based on the traditional defense-in-depth philosophy, supported by deterministic and overly conservative methods for safety analysis. In the 1970s [1], conservative hypotheses were introduced for safety analyses to address existing uncertainties. Since then, intensive thermal-hydraulic experimental research has resulted in a considerable increase in knowledge and consequently in the development of best-estimate codes able to provide more realistic information about the physical behaviour and to identify the most relevant safety issues, allowing the evaluation of the actual margins existing between the results of the calculations and the acceptance criteria. However, the best-estimate calculation results from complex thermal-hydraulic system codes (like Relap5, Cathare, Athlet, Trace, etc.) are affected by unavoidable approximations that are unpredictable without the use of computational tools that account for the various sources of uncertainty. Therefore the use of best-estimate (BE) codes within reactor technology, either for design or safety purposes, implies understanding and accepting the limitations and deficiencies of those codes. Taking into consideration the above framework, a comprehensive approach for utilizing quantified uncertainties arising from Integral Test Facilities (ITFs, [2]) and Separate Effect Test Facilities (SETFs, [3]) in the process of calibrating complex computer models for application to NPP transient scenarios has been developed. The proposed methodology is capable of accommodating multiple SETFs and ITFs to learn as much as possible about the uncertain parameters, allowing for the improvement of the computer model predictions based on the available experimental evidence. The proposed methodology constitutes a major step forward with respect to the generally used expert judgment and statistical methods, as it permits a) to establish the uncertainties of any parameter

  1. Integrated uncertainty analysis using RELAP/SCDAPSIM/MOD4.0

    International Nuclear Information System (INIS)

    Perez, M.; Reventos, F.; Wagner, R.; Allison, C.

    2009-01-01

    The RELAP/SCDAPSIM/MOD4.0 code, designed to predict the behavior of reactor systems during normal and accident conditions, is being developed as part of an international nuclear technology Software Development and Training Program (SDTP). RELAP/SCDAPSIM/MOD4.0, which is the first version of RELAP5 completely rewritten to FORTRAN 90/95/2000 standards, uses the publicly available RELAP5 and SCDAP models in combination with (a) advanced programming and numerical techniques, (b) advanced SDTP-member-developed models for LWR, HWR, and research reactor analysis, and (c) a variety of other member-developed computational packages. One such computational package is an integrated uncertainty analysis package being developed jointly by the Technical University of Catalunya (UPC) and Innovative Systems Software (ISS). The integrated uncertainty analysis approach used in the package involves the following steps: 1. Selection of the plant; 2. Selection of the scenario; 3. Selection of the safety criteria; 4. Identification and ranking of the relevant phenomena based on the safety criteria; 5. Selection of the appropriate code parameters to represent those phenomena; 6. Association of uncertainty by means of Probability Distribution Functions (PDFs) for each selected parameter; 7. Random sampling of the selected parameters according to their PDFs and performance of multiple computer runs to obtain uncertainty bands with a certain percentile and confidence level; 8. Processing of the results of the multiple computer runs to estimate the uncertainty bands for the computed quantities associated with the selected safety criteria. The first four steps are performed by the user prior to the RELAP/SCDAPSIM/MOD4.0 analysis. The remaining steps are included with the MOD4.0 integrated uncertainty analysis (IUA) package. This paper briefly describes the integrated uncertainty analysis package, including (a) the features of the package, (b) the implementation of the package into RELAP/SCDAPSIM/MOD4.0, and

  2. Uncertainty analysis in the task of individual monitoring data

    International Nuclear Information System (INIS)

    Molokanov, A.; Badjin, V.; Gasteva, G.; Antipin, E.

    2003-01-01

    Assessment of internal doses is an essential component of individual monitoring programmes for workers and consists of two stages: individual monitoring measurements and interpretation of the monitoring data in terms of annual intake and/or annual internal dose. The overall uncertainty in the assessed dose is a combination of the uncertainties in these stages. An algorithm and a computer code were developed for estimating the uncertainty in the assessment of internal dose in the task of individual monitoring data interpretation. Two main influencing factors are analysed in this paper: the unknown time of the exposure and the variability of bioassay measurements. The aim of this analysis is to show that the algorithm is applicable in designing an individual monitoring programme for workers, so as to guarantee that the individual dose calculated from individual monitoring measurements does not exceed a required limit with a certain confidence probability. (author)

  3. Simulating fuel behavior under transient conditions using FRAPTRAN and uncertainty analysis using Dakota

    International Nuclear Information System (INIS)

    Gomes, Daniel S.; Teixeira, Antonio S.

    2017-01-01

    Although regulatory agencies have shown a special interest in incorporating best-estimate approaches into the fuel licensing process, fuel codes are currently licensed based only on deterministic limits, such as those in 10 CFR 50, and may therefore yield unrealistic safety margins. The concept of uncertainty analysis is employed to manage this risk more realistically. In this study, uncertainties were classified into two categories: probabilistic and epistemic (the latter owing to a lack of pre-existing knowledge in this area). Fuel rods have three sources of uncertainty: manufacturing tolerances, boundary conditions, and physical models. The first step in successfully analyzing the uncertainties involves performing a statistical analysis of the input parameters used throughout the fuel code. The response obtained from this analysis must show proportional index correlations because the uncertainties are globally propagated. The Dakota toolkit was used to analyze the FRAPTRAN transient fuel code. The subsequent sensitivity analyses helped in identifying the key parameters with the highest correlation indices, including the peak cladding temperature and the time required for cladding failure. The uncertainty analysis was performed using an IFA-650-5 fuel rod, in line with the tests performed in the Halden Project in Norway. The main objectives of the Halden project included studying the ballooning and rupture processes. The results of this experiment demonstrate the accuracy and applicability of the physical models in evaluating the thermal conductivity, mechanical model, and fuel swelling formulations. (author)

  4. Simulating fuel behavior under transient conditions using FRAPTRAN and uncertainty analysis using Dakota

    Energy Technology Data Exchange (ETDEWEB)

    Gomes, Daniel S.; Teixeira, Antonio S., E-mail: dsgomes@ipen.br, E-mail: teixeira@ipen [Instituto de Pesquisas Energéticas e Nucleares (IPEN/CNEN-SP), São Paulo, SP (Brazil)

    2017-07-01

    Although regulatory agencies have shown a special interest in incorporating best-estimate approaches into the fuel licensing process, fuel codes are currently licensed based only on deterministic limits, such as those in 10 CFR 50, and may therefore yield unrealistic safety margins. The concept of uncertainty analysis is employed to manage this risk more realistically. In this study, uncertainties were classified into two categories: probabilistic and epistemic (the latter owing to a lack of pre-existing knowledge in this area). Fuel rods have three sources of uncertainty: manufacturing tolerances, boundary conditions, and physical models. The first step in successfully analyzing the uncertainties involves performing a statistical analysis of the input parameters used throughout the fuel code. The response obtained from this analysis must show proportional index correlations because the uncertainties are globally propagated. The Dakota toolkit was used to analyze the FRAPTRAN transient fuel code. The subsequent sensitivity analyses helped in identifying the key parameters with the highest correlation indices, including the peak cladding temperature and the time required for cladding failure. The uncertainty analysis was performed using an IFA-650-5 fuel rod, in line with the tests performed in the Halden Project in Norway. The main objectives of the Halden project included studying the ballooning and rupture processes. The results of this experiment demonstrate the accuracy and applicability of the physical models in evaluating the thermal conductivity, mechanical model, and fuel swelling formulations. (author)
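
    The correlation-index step mentioned in the two records above can be sketched generically: inputs sampled from their tolerance, boundary-condition, and model distributions are compared with the output through Spearman rank correlations, which rank their influence. The linear "fuel response" below is a hypothetical stand-in for a FRAPTRAN run, not Dakota output:

        import numpy as np
        from scipy.stats import spearmanr

        rng = np.random.default_rng(11)
        n = 100
        gap = rng.normal(1.0, 0.05, n)            # manufacturing tolerance
        power = rng.normal(1.0, 0.10, n)          # boundary condition
        conductivity = rng.normal(1.0, 0.03, n)   # physical-model multiplier

        pct = (900.0 + 120.0 * power + 40.0 * gap - 60.0 * conductivity
               + rng.normal(0.0, 5.0, n))         # surrogate peak cladding temperature

        for name, x in [("power", power), ("gap", gap), ("conductivity", conductivity)]:
            rho, _ = spearmanr(x, pct)
            print(f"{name:12s} rank correlation = {rho:+.2f}")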

  5. Best estimate analysis of LOFT L2-5 with CATHARE: uncertainty and sensitivity analysis

    Energy Technology Data Exchange (ETDEWEB)

    JOUCLA, Jerome; PROBST, Pierre [Institute for Radiological Protection and Nuclear Safety, Fontenay-aux-Roses (France); FOUET, Fabrice [APTUS, Versailles (France)

    2008-07-01

    The 1988 revision of 10 CFR 50.46 made possible the use of best-estimate codes. They may be used in safety demonstration and licensing, provided that uncertainties are added to the relevant output parameters before comparing them with the acceptance criteria. In the safety analysis of the large break loss of coolant accident, it was agreed that the 95th percentile estimated with a high degree of confidence should be lower than the acceptance criteria. It appeared necessary to IRSN, the technical support organization of the French Safety Authority, to get more insight into these strategies, which are being developed not only in thermal-hydraulics but also in other fields such as neutronics. To estimate the 95th percentile with a high confidence level, we propose to use rank statistics or the bootstrap. Toward the objective of assessing uncertainty, it is useful to determine and classify the main input parameters. We suggest approximating the code by a surrogate model, the Kriging model, which is used to perform a sensitivity analysis with the Sobol methodology. This paper presents the application of these two new methodologies to the uncertainty and sensitivity analysis of the maximum peak cladding temperature of the LOFT L2-5 test with the CATHARE code. (authors)
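
    The bootstrap option mentioned above can be sketched in a few lines: a modest set of code results is resampled with replacement, and the spread of the re-estimated percentiles gives a confidence statement on the 95th percentile. The 59 "code runs" below are synthetic normal deviates, not LOFT L2-5 results:

        import numpy as np

        rng = np.random.default_rng(8)
        pct_runs = rng.normal(1050.0, 40.0, 59)       # e.g. 59 best-estimate code runs

        resamples = rng.choice(pct_runs, size=(5000, pct_runs.size), replace=True)
        boot = np.quantile(resamples, 0.95, axis=1)   # 95th percentile of each resample

        print("point estimate of the 95th percentile:", np.quantile(pct_runs, 0.95))
        print("bootstrap 95% upper confidence bound:", np.quantile(boot, 0.95))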

  6. Uncertainty and sensitivity analysis for the simulation of a station blackout scenario in the Jules Horowitz Reactor

    International Nuclear Information System (INIS)

    Ghione, Alberto; Noel, Brigitte; Vinai, Paolo; Demazière, Christophe

    2017-01-01

    Highlights: • A station blackout scenario in the Jules Horowitz Reactor is analyzed using CATHARE. • Input and model uncertainties relevant to the transient are considered. • A statistical methodology for the propagation of the uncertainties is applied. • No safety criteria are exceeded and sufficiently large safety margins are estimated. • The most influential uncertainties are determined with a sensitivity analysis. - Abstract: An uncertainty and sensitivity analysis for the simulation of a station blackout scenario in the Jules Horowitz Reactor (JHR) is presented. The JHR is a new material testing reactor under construction at CEA on the Cadarache site, France. The thermal-hydraulic system code CATHARE is applied to investigate the response of the reactor system to the scenario. The uncertainty and sensitivity study was based on a statistical methodology for code uncertainty propagation, using the 'Uncertainty and Sensitivity' platform URANIE. Accordingly, the input uncertainties relevant to the transient were identified, quantified, and propagated to the code output. The results show that the safety criteria are not exceeded and that sufficiently large safety margins exist. In addition, the most influential input uncertainties on the safety parameters were identified by means of a sensitivity analysis.
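
    As a generic sketch of such a statistical sensitivity step (not the URANIE implementation), influential inputs can be ranked with standardized regression coefficients computed from the propagated sample; all data below are synthetic.

        import numpy as np

        rng = np.random.default_rng(1)
        X = rng.normal(size=(200, 4))                  # sampled input uncertainties
        y = X @ np.array([3.0, -1.0, 0.5, 0.0]) + rng.normal(scale=0.3, size=200)

        # Standardized regression coefficients: fit a linear surrogate, then
        # scale each coefficient by std(input)/std(output) for comparability.
        A = np.column_stack([np.ones(len(y)), X])
        beta = np.linalg.lstsq(A, y, rcond=None)[0]
        src = beta[1:] * X.std(axis=0) / y.std()
        print(np.argsort(-np.abs(src)))                # inputs ordered by influence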

  7. UCODE_2005 and six other computer codes for universal sensitivity analysis, calibration, and uncertainty evaluation constructed using the JUPITER API

    Science.gov (United States)

    Poeter, Eileen E.; Hill, Mary C.; Banta, Edward R.; Mehl, Steffen; Christensen, Steen

    2006-01-01

    This report documents the computer codes UCODE_2005 and six post-processors. Together the codes can be used with existing process models to perform sensitivity analysis, data needs assessment, calibration, prediction, and uncertainty analysis. Any process model or set of models can be used; the only requirements are that models have numerical (ASCII or text only) input and output files, that the numbers in these files have sufficient significant digits, that all required models can be run from a single batch file or script, and that simulated values are continuous functions of the parameter values. Process models can include pre-processors and post-processors as well as one or more models related to the processes of interest (physical, chemical, and so on), making UCODE_2005 extremely powerful. An estimated parameter can be a quantity that appears in the input files of the process model(s), or a quantity used in an equation that produces a value that appears in the input files. In the latter situation, the equation is user-defined. UCODE_2005 can compare observations and simulated equivalents. The simulated equivalents can be any simulated value written in the process-model output files or can be calculated from simulated values with user-defined equations. The quantities can be model results, or dependent variables. For example, for ground-water models they can be heads, flows, concentrations, and so on. Prior, or direct, information on estimated parameters also can be considered. Statistics are calculated to quantify the comparison of observations and simulated equivalents, including a weighted least-squares objective function. In addition, data-exchange files are produced that facilitate graphical analysis. UCODE_2005 can be used fruitfully in model calibration through its sensitivity analysis capabilities and its ability to estimate parameter values that result in the best possible fit to the observations. Parameters are estimated using nonlinear regression: a
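
    The weighted least-squares objective and the nonlinear-regression step it drives can be sketched generically (a toy stand-in model; UCODE_2005 itself exchanges data with the process model through its input and output files):

        import numpy as np

        # Toy process model standing in for the user's simulation.
        def simulate(b):
            x = np.linspace(0.0, 1.0, 20)
            return b[0] * np.exp(-b[1] * x)

        obs = simulate(np.array([2.0, 3.0])) + np.random.default_rng(2).normal(0.0, 0.01, 20)
        w = np.full(obs.size, 1.0 / 0.01**2)       # weights = 1 / observation variance

        def gauss_newton_step(b, eps=1e-6):
            r = obs - simulate(b)                  # residuals: observed - simulated
            J = np.empty((r.size, b.size))
            for j in range(b.size):                # forward-difference sensitivities
                db = np.zeros_like(b)
                db[j] = eps
                J[:, j] = (simulate(b + db) - simulate(b)) / eps
            JW = J.T * w                           # J^T W without forming diag(w)
            return b + np.linalg.solve(JW @ J, JW @ r)

        b = np.array([1.0, 1.0])
        for _ in range(10):                        # minimizes S(b) = sum_i w_i (obs_i - sim_i(b))^2
            b = gauss_newton_step(b)
        print(b)                                   # near the generating values [2, 3]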

  8. Uncertainty Methods Framework Development for the TRACE Thermal-Hydraulics Code by the U.S.NRC

    International Nuclear Information System (INIS)

    Bajorek, Stephen M.; Gingrich, Chester

    2013-01-01

    The Code of Federal Regulations, Title 10, Part 50.46 requires that Emergency Core Cooling System (ECCS) performance be evaluated for a number of postulated Loss-Of-Coolant Accidents (LOCAs). The rule allows two methods of demonstrating compliance with the acceptance criteria: using a realistic model in the so-called 'Best Estimate' approach, or the more prescriptive approach following Appendix K to Part 50. Because of the conservatism of Appendix K, recent Evaluation Model submittals to the NRC used the realistic approach. With this approach, the Evaluation Model must demonstrate that the Peak Cladding Temperature (PCT), the Maximum Local Oxidation (MLO) and Core-Wide Oxidation (CWO) remain below their regulatory limits with a 'high probability'. Guidance for Best Estimate calculations following 50.46(a)(1) was provided by Regulatory Guide 1.157. This Guide identified a 95% probability level as being acceptable for comparisons of best-estimate predictions to the applicable regulatory limits, but was vague with respect to acceptable methods for determining the code uncertainty. Nor did it specify whether a confidence level should be determined. As a result, vendors have developed Evaluation Models utilizing several different methods to combine uncertainty parameters and determine the PCT and other variables to a high probability. In order to quantify the accuracy of TRACE calculations for a wide variety of applications and to audit Best Estimate calculations made by industry, the NRC is developing its own independent methodology to determine the peak cladding temperature and other parameters of regulatory interest to a high probability. Because several methods are in use, and each vendor's methodology ranges different parameters, the NRC method must be flexible and sufficiently general. Not only must the method apply to LOCA analysis for conventional light-water reactors, it must also be extendable to new reactor designs and types of analyses where the acceptance criteria are less

  9. Systematic Analysis Of Ocean Colour Uncertainties

    Science.gov (United States)

    Lavender, Samantha

    2013-12-01

    This paper reviews current research into the estimation of uncertainties as a pixel-based measure to aid non-specialist users of remote sensing products. An example MERIS image, captured on 28 March 2012, was processed with an above-water atmospheric correction code. This was initially based on both the Antoine & Morel Standard Atmospheric Correction, with its Bright Pixel correction component, and the Doerffer Neural Network coastal waters approach. It showed that analysis of the atmospheric by-products yields important information about the separation of the atmospheric and in-water signals, helping to signpost possible uncertainties in the atmospheric correction results. Further analysis concentrated on implementing a 'simplistic' atmospheric correction so that the impact of changing the input auxiliary data can be analysed; the influence of changing surface pressure is demonstrated. Future work will focus on automating the analysis, so that the methodology can be implemented within an operational system.

  10. Needs of the CSAU uncertainty method

    International Nuclear Information System (INIS)

    Prosek, A.; Mavko, B.

    2000-01-01

    The use of best-estimate codes for safety analysis requires quantification of the uncertainties. These uncertainties are inherently linked to the chosen safety analysis methodology. Worldwide, various methods have been proposed for this quantification. The purpose of this paper was to identify the needs of the Code Scaling, Applicability, and Uncertainty (CSAU) methodology and then to address them. The specific procedural steps were combined from other methods for uncertainty evaluation, and new tools and procedures were proposed. The uncertainty analysis approach and tools were then utilized for a confirmatory study. The uncertainty was quantified for the RELAP5/MOD3.2 thermal-hydraulic computer code. The results of the adapted CSAU approach applied to the small-break loss-of-coolant accident (SB LOCA) show that the adapted CSAU can be used for any thermal-hydraulic safety analysis with uncertainty evaluation. However, it was indicated that there are still some limitations in the CSAU approach that need to be resolved. (author)

  11. The sensitivity analysis by adjoint method for the uncertainty evaluation of the CATHARE-2 code

    Energy Technology Data Exchange (ETDEWEB)

    Barre, F.; de Crecy, A.; Perret, C. [French Atomic Energy Commission (CEA), Grenoble (France)]

    1995-09-01

    This paper presents the application of the DASM (Discrete Adjoint Sensitivity Method) to the CATHARE 2 thermal-hydraulics code. In the first part, the basis of this method is presented. The mathematical model of the CATHARE 2 code is based on the two-fluid six-equation model. It is discretized using implicit time discretization, and it is relatively easy to implement this method in the code. The DASM is the ASM applied directly to the algebraic system of the discretized code equations, which has been demonstrated to be the only solution of the mathematical model. The ASM is an integral part of the new version 1.4 of CATHARE. It acts as a post-processing module. It has been qualified by comparison with the 'brute force' technique. In the second part, an application of the DASM in CATHARE 2 is presented. It deals with the determination of the uncertainties of the constitutive relationships, which is a compulsory step for calculating the final uncertainty of a given response. First, the general principles of the method are explained: the constitutive relationships are represented by several parameters, and the aim is to calculate the variance-covariance matrix of these parameters. The experimental results of the separate-effect tests used to establish the correlations are considered. The variance of the corresponding results calculated by CATHARE is estimated by comparing experiment and calculation. A DASM calculation is carried out to provide the derivatives of the responses. The final covariance matrix is obtained by combining the variance of the responses and these derivatives. Then, the application of this method to a simple case - the blowdown Canon experiment - is presented. This application has been successfully performed.
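
    The final combination step described above can be written schematically (assumed numbers, not CATHARE 2 data): given a row vector of adjoint-derived response derivatives and the variance-covariance matrix of the constitutive-relation parameters, the response variance follows from the first-order 'sandwich' rule.

        import numpy as np

        dR_dp = np.array([0.8, -0.3, 1.5])     # hypothetical adjoint sensitivities dR/dp
        C_p = np.array([[0.04, 0.01, 0.00],    # hypothetical parameter
                        [0.01, 0.09, 0.02],    # variance-covariance matrix
                        [0.00, 0.02, 0.16]])

        var_R = dR_dp @ C_p @ dR_dp            # first-order propagation: s C s^T
        print(np.sqrt(var_R))                  # 1-sigma uncertainty of the response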

  12. The sensitivity analysis by adjoint method for the uncertainty evaluation of the CATHARE-2 code

    International Nuclear Information System (INIS)

    Barre, F.; de Crecy, A.; Perret, C.

    1995-01-01

    This paper presents the application of the DASM (Discrete Adjoint Sensitivity Method) to the CATHARE 2 thermal-hydraulics code. In the first part, the basis of this method is presented. The mathematical model of the CATHARE 2 code is based on the two-fluid six-equation model. It is discretized using implicit time discretization, and it is relatively easy to implement this method in the code. The DASM is the ASM applied directly to the algebraic system of the discretized code equations, which has been demonstrated to be the only solution of the mathematical model. The ASM is an integral part of the new version 1.4 of CATHARE. It acts as a post-processing module. It has been qualified by comparison with the 'brute force' technique. In the second part, an application of the DASM in CATHARE 2 is presented. It deals with the determination of the uncertainties of the constitutive relationships, which is a compulsory step for calculating the final uncertainty of a given response. First, the general principles of the method are explained: the constitutive relationships are represented by several parameters, and the aim is to calculate the variance-covariance matrix of these parameters. The experimental results of the separate-effect tests used to establish the correlations are considered. The variance of the corresponding results calculated by CATHARE is estimated by comparing experiment and calculation. A DASM calculation is carried out to provide the derivatives of the responses. The final covariance matrix is obtained by combining the variance of the responses and these derivatives. Then, the application of this method to a simple case - the blowdown Canon experiment - is presented. This application has been successfully performed

  13. Coupling of system thermal–hydraulics and Monte-Carlo code: Convergence criteria and quantification of correlation between statistical uncertainty and coupled error

    International Nuclear Information System (INIS)

    Wu, Xu; Kozlowski, Tomasz

    2015-01-01

    Highlights: • Coupling of the Monte Carlo code Serpent and the thermal–hydraulics code RELAP5. • A convergence criterion is developed based on the statistical uncertainty of power. • The correlation between MC statistical uncertainty and coupled error is quantified. • Both UO2 and MOX single-assembly models are used in the coupled simulation. • Validation of coupling results against the multi-group transport code DeCART. - Abstract: A coupled multi-physics approach plays an important role in improving computational accuracy. Compared with deterministic neutronics codes, Monte Carlo codes have the advantage of a higher resolution level. In the present paper, a three-dimensional continuous-energy Monte Carlo reactor physics burnup calculation code, Serpent, is coupled with a thermal–hydraulics safety analysis code, RELAP5. The coupled Serpent/RELAP5 code capability is demonstrated by the improved axial power distribution of UO2 and MOX single-assembly models, based on the OECD-NEA/NRC PWR MOX/UO2 Core Transient Benchmark. Comparisons of calculation results from the coupled code with those from deterministic methods, specifically the heterogeneous multi-group transport code DeCART, show that the coupling produces more precise results. A new convergence criterion for the coupled simulation is developed based on the statistical uncertainty in the power distribution in the Monte Carlo code, rather than the ad hoc criteria used in previous research. The new convergence criterion is shown to be more rigorous and equally convenient to use, although it requires a few more coupling steps to converge. Finally, the influence of the Monte Carlo statistical uncertainty on the coupled error of power and thermal–hydraulics parameters is quantified. The results are presented such that they can be used to find the statistical uncertainty to use in Monte Carlo in order to achieve a desired precision in the coupled simulation
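
    A criterion of this kind can be sketched as follows (a simplified illustration, not the paper's exact formula): the coupled iteration stops when the node-wise change in power between two successive iterations lies within the Monte Carlo statistical uncertainty.

        import numpy as np

        def converged(p_new, p_old, sigma, k=1.0):
            # p_new, p_old: node-wise power from two successive coupled iterations;
            # sigma: Monte Carlo 1-sigma uncertainty of p_new; k: tolerance factor.
            return bool(np.all(np.abs(p_new - p_old) <= k * sigma))

        p_old = np.array([0.95, 1.10, 1.05, 0.90])
        p_new = np.array([0.96, 1.09, 1.05, 0.90])
        sigma = np.full(4, 0.02)
        print(converged(p_new, p_old, sigma))  # True: changes lie inside 1 sigma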

  14. Overview of methods for uncertainty analysis and sensitivity analysis in probabilistic risk assessment

    International Nuclear Information System (INIS)

    Iman, R.L.; Helton, J.C.

    1985-01-01

    Probabilistic Risk Assessment (PRA) is playing an increasingly important role in the nuclear reactor regulatory process. The assessment of uncertainties associated with PRA results is widely recognized as an important part of the analysis process. One of the major criticisms of the Reactor Safety Study was that its representation of uncertainty was inadequate. The desire for the capability to treat uncertainties with the MELCOR risk code being developed at Sandia National Laboratories is indicative of the current interest in this topic. However, as yet, uncertainty analysis and sensitivity analysis in the context of PRA constitute a relatively immature field. In this paper, available methods for uncertainty analysis and sensitivity analysis in a PRA are reviewed. This review first treats methods for use with individual components of a PRA and then considers how these methods could be combined in the performance of a complete PRA. In the context of this paper, the goal of uncertainty analysis is to measure the imprecision in PRA outcomes of interest, and the goal of sensitivity analysis is to identify the major contributors to this imprecision. There are a number of areas that must be considered in uncertainty analysis and sensitivity analysis for a PRA: (1) information, (2) systems analysis, (3) thermal-hydraulic phenomena/fission product behavior, (4) health and economic consequences, and (5) display of results. Each of these areas, and their synthesis into a complete PRA, are discussed

  15. Uncertainties in source term calculations generated by the ORIGEN2 computer code for Hanford Production Reactors

    International Nuclear Information System (INIS)

    Heeb, C.M.

    1991-03-01

    The ORIGEN2 computer code is the primary calculational tool for computing isotopic source terms for the Hanford Environmental Dose Reconstruction (HEDR) Project. The ORIGEN2 code computes the amounts of radionuclides that are created or remain in spent nuclear fuel after neutron irradiation and radioactive decay have occurred as a result of nuclear reactor operation. ORIGEN2 was chosen as the primary code for these calculations because it is widely used and accepted by the nuclear industry, both in the United States and the rest of the world. Its comprehensive library of over 1,600 nuclides includes any possible isotope of interest to the HEDR Project. It is important to evaluate the uncertainties expected from use of ORIGEN2 in the HEDR Project because these uncertainties may have a pivotal impact on the final accuracy and credibility of the results of the project. There are three primary sources of uncertainty in an ORIGEN2 calculation: basic nuclear data uncertainty in neutron cross sections, radioactive decay constants, energy per fission, and fission product yields; calculational uncertainty due to input data; and code uncertainties (i.e., numerical approximations, and neutron spectrum-averaged cross-section values from the code library). 15 refs., 5 figs., 5 tabs

  16. Probabilistic accident consequence uncertainty analysis: Food chain uncertainty assessment. Volume 1: Main report

    Energy Technology Data Exchange (ETDEWEB)

    Brown, J. [National Radiological Protection Board (United Kingdom)]; Goossens, L.H.J.; Kraan, B.C.P. [Delft Univ. of Technology (Netherlands)] [and others]

    1997-06-01

    This volume is the first of a two-volume document that summarizes a joint project conducted by the US Nuclear Regulatory Commission and the European Commission to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This document reports on an ongoing project to assess uncertainty in the MACCS and COSYMA calculations for the offsite consequences of radionuclide releases by hypothetical nuclear power plant accidents. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain variables that affect calculations of offsite consequences. The expert judgment elicitation procedure and its outcomes are described in these volumes. Other panels were formed to consider uncertainty in other aspects of the codes. Their results are described in companion reports. Volume 1 contains background information and a complete description of the joint consequence uncertainty study. Volume 2 contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures for both panels, (3) the rationales and results for the panels on soil and plant transfer and animal transfer, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  17. Probabilistic accident consequence uncertainty analysis: Food chain uncertainty assessment. Volume 1: Main report

    International Nuclear Information System (INIS)

    Brown, J.; Goossens, L.H.J.; Kraan, B.C.P.

    1997-06-01

    This volume is the first of a two-volume document that summarizes a joint project conducted by the US Nuclear Regulatory Commission and the European Commission to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This document reports on an ongoing project to assess uncertainty in the MACCS and COSYMA calculations for the offsite consequences of radionuclide releases by hypothetical nuclear power plant accidents. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain variables that affect calculations of offsite consequences. The expert judgment elicitation procedure and its outcomes are described in these volumes. Other panels were formed to consider uncertainty in other aspects of the codes. Their results are described in companion reports. Volume 1 contains background information and a complete description of the joint consequence uncertainty study. Volume 2 contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures for both panels, (3) the rationales and results for the panels on soil and plant transfer and animal transfer, (4) short biographies of the experts, and (5) the aggregated results of their responses

  18. Uncertainty analysis for secondary energy distributions

    International Nuclear Information System (INIS)

    Gerstl, S.A.W.

    1978-01-01

    In many transport calculations the integral design parameter of interest (response) is determined mainly by secondary particles such as gamma rays from (n,γ) reactions or secondary neutrons from inelastic scattering events or (n,2n) reactions. Standard sensitivity analysis usually allows one to calculate the sensitivities to the production cross sections of such secondaries, but an extended formalism is needed to also obtain the sensitivities to the energy distributions of the generated secondary particles. For a 30-group standard cross-section set, 84% of all non-zero table positions pertain to the description of secondary energy distributions (SEDs) and only 16% to the actual reaction cross sections. Therefore, any sensitivity/uncertainty analysis which does not consider the effects of SEDs is incomplete and neglects most of the input data. This paper describes how sensitivity profiles for SEDs are obtained and used to estimate the uncertainty of an integral response due to uncertainties in these SEDs. The detailed theory is documented elsewhere and implemented in the LASL sensitivity code SENSIT. SED sensitivity profiles have proven particularly valuable in cross-section uncertainty analyses for fusion reactors. Even when the production cross sections for secondary neutrons were assumed to be without error, the uncertainties in the energy distribution of these secondaries produced appreciable uncertainties in the calculated tritium breeding rate. However, complete error files for SEDs are presently nonexistent. Therefore, methods are described that allow rough error estimates, based on integral SED sensitivities and estimated SED uncertainties
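
    In schematic form (our notation, not SENSIT's), such a rough estimate combines the integral SED sensitivities with estimated relative SED uncertainties, assuming uncorrelated groups:

        % Rough first-order estimate of the relative response uncertainty,
        % with S_g^SED the integral SED sensitivity of group g and
        % \Delta f_g / f_g the estimated relative SED uncertainty in that group.
        \[
          \left(\frac{\Delta R}{R}\right)^{2}
          \;\approx\; \sum_{g} \left( S_{g}^{\mathrm{SED}}\,
          \frac{\Delta f_{g}}{f_{g}} \right)^{2}
        \]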

  19. Applications of the TSUNAMI sensitivity and uncertainty analysis methodology

    International Nuclear Information System (INIS)

    Rearden, Bradley T.; Hopper, Calvin M.; Elam, Karla R.; Goluoglu, Sedat; Parks, Cecil V.

    2003-01-01

    The TSUNAMI sensitivity and uncertainty analysis tools under development for the SCALE code system have recently been applied in four criticality safety studies. TSUNAMI is used to identify applicable benchmark experiments for criticality code validation, assist in the design of new critical experiments for a particular need, reevaluate previously computed computational biases, and assess the validation coverage and propose a penalty for noncoverage for a specific application. (author)

  20. Comparative Criticality Analysis of Two Monte Carlo Codes on a Centrifugal Atomizer: MCNP5 and SCALE

    International Nuclear Information System (INIS)

    Kang, H-S; Jang, M-S; Kim, S-R; Park, J-M; Kim, K-N

    2015-01-01

    There are two well-known Monte Carlo codes for criticality analysis, MCNP5 and SCALE. MCNP5 is a general-purpose Monte Carlo N-Particle code that can be used for neutron, photon, electron, or coupled neutron/photon/electron transport, including the capability to calculate eigenvalues for critical systems, as the main analysis code. SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor physics, radiation shielding, radioactive source term characterization, and sensitivity and uncertainty analysis. SCALE was conceived and funded by the US NRC to perform standardized computer analyses for licensing evaluation and is used widely around the world. We performed a validation test of MCNP5 and a comparative analysis of the Monte Carlo codes MCNP5 and SCALE in terms of the criticality analysis of a centrifugal atomizer. In the criticality analysis using the MCNP5 code, we obtained statistically reliable results by using a large number of source histories per cycle and performing an uncertainty analysis

  1. Comparative Criticality Analysis of Two Monte Carlo Codes on a Centrifugal Atomizer: MCNP5 and SCALE

    Energy Technology Data Exchange (ETDEWEB)

    Kang, H-S; Jang, M-S; Kim, S-R [NESS, Daejeon (Korea, Republic of)]; Park, J-M; Kim, K-N [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)]

    2015-10-15

    There are two well-known Monte Carlo codes for criticality analysis, MCNP5 and SCALE. MCNP5 is a general-purpose Monte Carlo N-Particle code that can be used for neutron, photon, electron, or coupled neutron/photon/electron transport, including the capability to calculate eigenvalues for critical systems, as the main analysis code. SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor physics, radiation shielding, radioactive source term characterization, and sensitivity and uncertainty analysis. SCALE was conceived and funded by the US NRC to perform standardized computer analyses for licensing evaluation and is used widely around the world. We performed a validation test of MCNP5 and a comparative analysis of the Monte Carlo codes MCNP5 and SCALE in terms of the criticality analysis of a centrifugal atomizer. In the criticality analysis using the MCNP5 code, we obtained statistically reliable results by using a large number of source histories per cycle and performing an uncertainty analysis.

  2. Two-dimensional cross-section sensitivity and uncertainty analysis of the LBM [Lithium Blanket Module] experiments at LOTUS

    International Nuclear Information System (INIS)

    Davidson, J.W.; Dudziak, D.J.; Pelloni, S.; Stepanek, J.

    1988-01-01

    In a recent joint Los Alamos/PSI effort, a sensitivity and nuclear data uncertainty path for the modular code system AARE (Advanced Analysis for Reactor Engineering) was developed. This path includes the cross-section code TRAMIX, the one-dimensional finite difference SN-transport code ONEDANT, the two-dimensional finite element SN-transport code TRISM, and the one- and two-dimensional sensitivity and nuclear data uncertainty code SENSIBL. Within the framework of the present work, a complete set of forward and adjoint two-dimensional TRISM calculations was performed, both for the bare and for the Pb- and Be-preceded LBM, using MATXS8 libraries. Then a two-dimensional sensitivity and uncertainty analysis was performed for all cases. The goal of this analysis was the determination of the uncertainties of the calculated tritium production per source neutron from lithium along the central Li2O rod in the LBM. Considered were the contributions from 1H, 6Li, 7Li, 9Be, natC, 14N, 16O, 23Na, 27Al, natSi, natCr, natFe, natNi, and natPb. 22 refs., 1 fig., 3 tabs

  3. Nuclear data sensitivity/uncertainty analysis for XT-ADS

    International Nuclear Information System (INIS)

    Sugawara, Takanori; Sarotto, Massimo; Stankovskiy, Alexey; Van den Eynde, Gert

    2011-01-01

    Highlights: → Sensitivity and uncertainty analyses were performed to assess the reliability of the XT-ADS neutronic design. → The uncertainties deduced from the covariance data for the XT-ADS criticality were 0.94%, 1.9% and 1.1% with the SCALE 44-group, TENDL-2009 and JENDL-3.3 data, respectively. → When the target accuracy of 0.3% Δk for the criticality was considered, the uncertainties did not satisfy it. → To achieve this accuracy, the uncertainties should be reduced through experiments under adequate conditions. - Abstract: The XT-ADS, an accelerator-driven system for an experimental demonstration, has been investigated in the framework of the IP EUROTRANS FP6 project. In this study, sensitivity and uncertainty analyses were performed to assess the reliability of the XT-ADS neutronic design. The sensitivity analysis showed that the sensitivity coefficients differed significantly depending on the geometry models and calculation codes used. The uncertainty analysis confirmed that the uncertainties deduced from the covariance data varied significantly depending on the data used. The uncertainties deduced from the covariance data for the XT-ADS criticality were 0.94%, 1.9% and 1.1% with the SCALE 44-group, TENDL-2009 and JENDL-3.3 data, respectively. When the target accuracy of 0.3% Δk for the criticality was considered, the uncertainties did not satisfy it. To achieve this accuracy, the uncertainties should be reduced through experiments under adequate conditions.

  4. Hydrocoin level 3 - Testing methods for sensitivity/uncertainty analysis

    International Nuclear Information System (INIS)

    Grundfelt, B.; Lindbom, B.; Larsson, A.; Andersson, K.

    1991-01-01

    The HYDROCOIN study is an international cooperative project for testing groundwater hydrology modelling strategies for performance assessment of nuclear waste disposal. The study was initiated in 1984 by the Swedish Nuclear Power Inspectorate, and the technical work was finalised in 1987. The participating organisations are regulatory authorities as well as implementing organisations in 10 countries. The study has been performed at three levels aimed at studying computer code verification, model validation, and sensitivity/uncertainty analysis, respectively. The results from the first two levels, code verification and model validation, were published in reports in 1988 and 1990, respectively. This paper focuses on some aspects of the results from Level 3, sensitivity/uncertainty analysis, for which a final report is planned to be published during 1990. For Level 3, seven test cases were defined. Some of these were aimed at exploring the uncertainty associated with the modelling results by simply varying parameter values and conceptual assumptions. In other test cases, statistical sampling methods were applied. One of the test cases dealt with particle tracking and the uncertainty introduced by this type of post-processing. The amount of results available is substantial, although unevenly spread over the test cases. It has not been possible to cover all aspects of the results in this paper. Instead, the different methods applied are illustrated by some typical analyses. 4 figs., 9 refs

  5. May Day: A computer code to perform uncertainty and sensitivity analysis. Manuals

    International Nuclear Information System (INIS)

    Bolado, R.; Alonso, A.; Moya, J.M.

    1996-07-01

    The computer program May Day was developed to carry out uncertainty and sensitivity analysis in the evaluation of radioactive waste storage. May Day was developed by the Polytechnic University of Madrid. (Author)

  6. Information Synthesis in Uncertainty Studies: Application to the Analysis of the BEMUSE Results

    International Nuclear Information System (INIS)

    Baccou, J.; Chojnacki, E.; Destercke, S.

    2013-01-01

    Computer codes are used to demonstrate that nuclear power plants are designed to respond safely to numerous postulated accidents. The models in these computer codes are an approximation of the real physical behaviour occurring during an accident. Moreover, the data used to run these codes are also known with limited accuracy. Therefore, the code predictions are not exact but uncertain. To deal with these uncertainties, 'best estimate' codes with 'best estimate' input data are used to obtain a best estimate calculation, and it is necessary to derive the uncertainty associated with their estimates. For this reason, regulatory authorities demand, in particular from technical safety organizations such as the French Institut de Radioprotection et de Sûreté Nucléaire (IRSN), results taking into account all the uncertainty sources in order to assess whether safety quantities are below critical values. Uncertainty analysis can be seen as a problem of information treatment, and a special effort has to be made on four key methodological issues. The first is related to information modelling. In safety studies, one can distinguish two kinds of uncertainty. The first type, called aleatory uncertainty, is due to the natural variability of an observed phenomenon and cannot be reduced by the arrival of new information. The second type, called epistemic uncertainty, can arise from imprecision. Contrary to the previous one, this uncertainty can be reduced by increasing the state of knowledge. Performing relevant information modelling therefore requires working with a mathematical formalism flexible enough to faithfully treat both types of uncertainty. The second issue deals with information propagation through a computer code. It requires running the code several times and is usually achieved by coupling to statistical software. The complexity of the propagation is strongly connected to the mathematical framework used for the information modelling. The more general the

  7. Development of a nuclear data uncertainties propagation code on the residual power in fast neutron reactors

    International Nuclear Information System (INIS)

    Benoit, J.-C.

    2012-01-01

    This PhD study is in the field of nuclear energy, the back end of the nuclear fuel cycle, and uncertainty calculations. The CEA must design the prototype ASTRID, a sodium-cooled fast reactor (SFR) and one of the concepts selected by the Generation IV forum, for which the calculation of the value and the uncertainty of the decay heat has a significant impact. In this study, a code for propagating nuclear data uncertainties to the decay heat in SFRs is developed. The work took place in three stages. The first step limited the number of parameters involved in the calculation of the decay heat. For this, a decay heat experiment on the PHENIX reactor (PUIREX 2008) was studied to validate the DARWIN package experimentally for SFRs and to quantify the source terms of the decay heat. The second step aimed to develop a code for the propagation of uncertainties: CyRUS (Cycle Reactor Uncertainty and Sensitivity). A deterministic propagation method was chosen because the calculations are fast and reliable. The assumptions of linearity and normality were validated theoretically. The code was also successfully compared with a stochastic code on the example of the thermal burst fission curve of 235U. The last part was an application of the code to several experiments: the decay heat of a reactor, the isotopic composition of a fuel pin, and the burst fission curve of 235U. The code demonstrated the possibility of feedback on the nuclear data that drive the uncertainty of this problem. Two main results were highlighted. Firstly, the simplifying assumptions of deterministic codes are compatible with a precise calculation of the uncertainty of the decay heat. Secondly, the developed method is intrusive and allows feedback on nuclear data from experiments on the back end of the nuclear fuel cycle. In particular, this study showed how important it is to measure independent fission yields precisely, along with their covariance matrices, in order to improve the accuracy of the calculation of
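
    The cross-check between the deterministic and stochastic routes can be illustrated generically (a toy decay-heat model with invented numbers, not CyRUS itself): propagate an assumed 5% yield uncertainty both linearly and by Monte Carlo sampling and compare the two standard deviations.

        import numpy as np

        rng = np.random.default_rng(3)
        q = np.array([0.8, 1.2, 0.5])              # W per fission-product atom (invented)
        Y = np.array([0.06, 0.03, 0.01])           # independent fission yields (invented)
        sig = 0.05 * Y                             # assumed 5% uncorrelated uncertainties

        std_lin = np.sqrt(np.sum((q * sig) ** 2))  # deterministic (linear) propagation
        std_mc = (rng.normal(Y, sig, size=(100_000, 3)) @ q).std()
        print(std_lin, std_mc)                     # the two estimates agree for a linear model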

  8. Verification and uncertainty evaluation of CASMO-3/MASTER nuclear analysis system

    Energy Technology Data Exchange (ETDEWEB)

    Song, Jae Seung; Cho, Byung Oh; Joo, Han Kyu; Zee, Sung Quun; Lee, Chung Chan; Park, Sang Yoon

    2000-06-01

    MASTER is a nuclear design code developed by KAERI. It uses group constants generated by CASMO-3, developed by Studsvik. In this report, verification and uncertainty evaluation were performed for the application of the code system in nuclear reactor core analysis and design. The verification was performed via various benchmark comparisons for static and transient core conditions, and core-follow calculations with startup physics test predictions for a total of 14 cycles of pressurized water reactors. The benchmark calculations include comparisons with reference solutions of IAEA and OECD/NEA problems and with critical experiment measurements. The uncertainty evaluation focused on safety-related parameters such as power distribution, reactivity coefficients, control rod worth, and core reactivity. It is concluded that CASMO-3/MASTER can be applied to PWR core nuclear analysis and design without any bias factors. It is also verified that the system can be applied to the SMART core, via supplemental comparisons with reference calculations by MCNP, a probabilistic nuclear calculation code.

  9. Verification and uncertainty evaluation of CASMO-3/MASTER nuclear analysis system

    International Nuclear Information System (INIS)

    Song, Jae Seung; Cho, Byung Oh; Joo, Han Kyu; Zee, Sung Quun; Lee, Chung Chan; Park, Sang Yoon

    2000-06-01

    MASTER is a nuclear design code developed by KAERI. It uses group constants generated by CASMO-3, developed by Studsvik. In this report, verification and uncertainty evaluation were performed for the application of the code system in nuclear reactor core analysis and design. The verification was performed via various benchmark comparisons for static and transient core conditions, and core-follow calculations with startup physics test predictions for a total of 14 cycles of pressurized water reactors. The benchmark calculations include comparisons with reference solutions of IAEA and OECD/NEA problems and with critical experiment measurements. The uncertainty evaluation focused on safety-related parameters such as power distribution, reactivity coefficients, control rod worth, and core reactivity. It is concluded that CASMO-3/MASTER can be applied to PWR core nuclear analysis and design without any bias factors. It is also verified that the system can be applied to the SMART core, via supplemental comparisons with reference calculations by MCNP, a probabilistic nuclear calculation code

  10. Demonstration of uncertainty quantification and sensitivity analysis for PWR fuel performance with BISON

    International Nuclear Information System (INIS)

    Zhang, Hongbin; Zhao, Haihua; Zou, Ling; Burns, Douglas; Ladd, Jacob

    2017-01-01

    BISON is an advanced fuels performance code being developed at Idaho National Laboratory and is the code of choice for fuels performance by the U.S. Department of Energy (DOE)’s Consortium for Advanced Simulation of Light Water Reactors (CASL) Program. An approach to uncertainty quantification and sensitivity analysis with BISON was developed and a new toolkit was created. A PWR fuel rod model was developed and simulated by BISON, and uncertainty quantification and sensitivity analysis were performed with eighteen uncertain input parameters. The maximum fuel temperature and gap conductance were selected as the figures of merit (FOM). Pearson, Spearman, and partial correlation coefficients were considered for all of the figures of merit in sensitivity analysis. (author)
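
    The partial correlation coefficient used alongside Pearson and Spearman above can be computed generically as the correlation between the residuals of one input and of the output after regressing both on the remaining inputs (synthetic data; not the authors' toolkit):

        import numpy as np

        def partial_corr(X, y, j):
            # Correlate input j and output y after removing, by linear
            # regression, the influence of all other inputs on both.
            A = np.column_stack([np.ones(len(y)), np.delete(X, j, axis=1)])
            rx = X[:, j] - A @ np.linalg.lstsq(A, X[:, j], rcond=None)[0]
            ry = y - A @ np.linalg.lstsq(A, y, rcond=None)[0]
            return np.corrcoef(rx, ry)[0, 1]

        rng = np.random.default_rng(4)
        X = rng.normal(size=(500, 3))              # e.g., 3 of the 18 uncertain inputs
        y = 2.0 * X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.1, size=500)
        print([round(partial_corr(X, y, j), 2) for j in range(3)])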

  11. Demonstration of Uncertainty Quantification and Sensitivity Analysis for PWR Fuel Performance with BISON

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Hongbin; Ladd, Jacob; Zhao, Haihua; Zou, Ling; Burns, Douglas

    2015-11-01

    BISON is an advanced fuels performance code being developed at Idaho National Laboratory and is the code of choice for fuels performance by the U.S. Department of Energy (DOE)’s Consortium for Advanced Simulation of Light Water Reactors (CASL) Program. An approach to uncertainty quantification and sensitivity analysis with BISON was developed and a new toolkit was created. A PWR fuel rod model was developed and simulated by BISON, and uncertainty quantification and sensitivity analysis were performed with eighteen uncertain input parameters. The maximum fuel temperature and gap conductance were selected as the figures of merit (FOM). Pearson, Spearman, and partial correlation coefficients were considered for all of the figures of merit in sensitivity analysis.

  12. Effect of Uncertainty Parameters in Blowdown and Reflood Models for OPR1000 LBLOCA Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Huh, Byung Gil; Jin, Chang Yong; Seul, Kwangwon; Hwang, Taesuk [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)]

    2014-05-15

    KINS (Korea Institute of Nuclear Safety) has also performed the audit calculation with the KINS Realistic Evaluation Methodology (KINS-REM) to confirm the validity of the licensee's calculation. In the BEPU method, it is very important to quantify the code and model uncertainty. This is reflected in the following requirement for BE calculations in Regulatory Guide 1.157: 'the code and models used are acceptable and applicable to the specific facility over the intended operating range and must quantify the uncertainty in the specific application'. In general, the uncertainty of a model/code should be obtained through comparison with data from relevant integral- and separate-effect tests at different scales. However, it is not easy to determine this kind of uncertainty because of the difficulty of accurately evaluating the various experiments. Therefore, expert judgment has been used in many cases, even with the limitation that the resulting uncertainty ranges of important parameters can be wide and inaccurate. In the KINS-REM, six heat transfer parameters in the blowdown phase have been used to consider the uncertainty of the models. Recently, the MARS-KS code was modified to consider the uncertainty of five heat transfer parameters in the reflood phase. Accordingly, the uncertainty ranges for the parameters of the reflood models must be determined and the effect of these ranges evaluated. In this study, a large break LOCA (LBLOCA) analysis for OPR1000 was performed to identify the effect of the uncertainty parameters in the blowdown and reflood models.

  13. The role of the uncertainty in code development

    Energy Technology Data Exchange (ETDEWEB)

    Barre, F. [CEA-Grenoble (France)]

    1997-07-01

    From a general point of view, all the results of a calculation should be given with their uncertainty. This is of utmost importance in nuclear safety, where the sizing of the safety systems, and therefore the protection of the population and the environment, essentially depends on the calculation results. Until recent years, safety analysis was performed with conservative tools. Two types of criticism can be made. Firstly, conservative margins can be too large, and it may be possible to reduce the cost of the plant or its operation with a best-estimate approach. Secondly, some of the conservative hypotheses may not really be conservative over the full range of physical events which can occur during an accident. Simpson gives an interesting example: in some cases, the overestimation of the residual power during a small break LOCA can lead to an overprediction of the swell level and thus an overprediction of the core cooling, which is the opposite of a conservative prediction. A last question is: does the accumulation of conservative hypotheses for a problem always give a conservative result? Two-phase flow physics, mainly dealing with situations of mechanical and thermal non-equilibrium, is too complicated to answer these questions with simple engineering judgement. The objective of this paper is to review the quantification of the uncertainties which can be made during code development and validation.

  14. The role of the uncertainty in code development

    International Nuclear Information System (INIS)

    Barre, F.

    1997-01-01

    From a general point of view, all the results of a calculation should be given with their uncertainty. This is of utmost importance in nuclear safety, where the sizing of the safety systems, and therefore the protection of the population and the environment, essentially depends on the calculation results. Until recent years, safety analysis was performed with conservative tools. Two types of criticism can be made. Firstly, conservative margins can be too large, and it may be possible to reduce the cost of the plant or its operation with a best-estimate approach. Secondly, some of the conservative hypotheses may not really be conservative over the full range of physical events which can occur during an accident. Simpson gives an interesting example: in some cases, the overestimation of the residual power during a small break LOCA can lead to an overprediction of the swell level and thus an overprediction of the core cooling, which is the opposite of a conservative prediction. A last question is: does the accumulation of conservative hypotheses for a problem always give a conservative result? Two-phase flow physics, mainly dealing with situations of mechanical and thermal non-equilibrium, is too complicated to answer these questions with simple engineering judgement. The objective of this paper is to review the quantification of the uncertainties which can be made during code development and validation

  15. Sensitivity and uncertainty analysis applied to a repository in rock salt

    International Nuclear Information System (INIS)

    Polle, A.N.

    1996-12-01

    This document describes the sensitivity and uncertainty analysis with UNCSAM, as applied to a repository in rock salt for the EVEREST project. UNCSAM is a dedicated software package for sensitivity and uncertainty analysis, which was already used within the preceding PROSA project. UNCSAM provides a flexible interface to EMOS ECN by substituting the sampled values into the various input files to be used by EMOS ECN; the model calculations for this repository were performed with the EMOS ECN code. Preceding the sensitivity and uncertainty analysis, a number of preparations were carried out to supply EMOS ECN with the probabilistic input data. For post-processing of the EMOS ECN results, the characteristic output signals were processed. For the sensitivity and uncertainty analysis with UNCSAM, the stochastic input, i.e. the sampled values, and the output of the various EMOS ECN runs were analyzed. (orig.)

  16. Quantifying reactor safety margins: Part 1: An overview of the code scaling, applicability, and uncertainty evaluation methodology

    International Nuclear Information System (INIS)

    Boyack, B.E.; Duffey, R.B.; Griffith, P.

    1988-01-01

    In August 1988, the Nuclear Regulatory Commission (NRC) approved the final version of a revised rule on the acceptance of emergency core cooling systems (ECCS) entitled 'Emergency Core Cooling System; Revisions to Acceptance Criteria'. The revised rule states that an alternate ECCS performance analysis, based on best-estimate methods, may be used to provide more realistic estimates of plant safety margins, provided the licensee quantifies the uncertainty of the estimates and includes that uncertainty when comparing the calculated results with prescribed acceptance limits. To support the revised ECCS rule, the NRC and its contractors and consultants have developed and demonstrated a method called the Code Scaling, Applicability, and Uncertainty (CSAU) evaluation methodology. It is an auditable, traceable, and practical method for combining quantitative analyses and expert opinions to arrive at computed values of uncertainty. This paper provides an overview of the CSAU evaluation methodology and its application to a postulated cold-leg, large-break loss-of-coolant accident in a Westinghouse four-loop pressurized water reactor with 17 × 17 fuel. The code selected for this demonstration of the CSAU methodology was TRAC-PF1/MOD1, Version 14.3. 23 refs., 5 figs., 1 tab

  17. Probabilistic accident consequence uncertainty analysis -- Late health effects uncertainty assessment. Volume 2: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Little, M.P.; Muirhead, C.R. [National Radiological Protection Board (United Kingdom)]; Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands)]; Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States)]; Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)]

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences from accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA late health effects models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the expert panel on late health effects, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  18. Report of a CSNI workshop on uncertainty analysis methods. Volume 1 + 2

    International Nuclear Information System (INIS)

    Wickett, A.J.; Yadigaroglu, G.

    1994-08-01

    The OECD NEA CSNI Principal Working Group 2 (PWG2) Task Group on Thermal Hydraulic System Behaviour (TGTHSB) has, in recent years, received presentations of a variety of different methods to analyze the uncertainty in the calculations of advanced unbiased (best estimate) codes. Proposals were also made for an International Standard Problem (ISP) to compare the uncertainty analysis methods. The objectives for the Workshop were to discuss and fully understand the principles of uncertainty analysis relevant to LOCA modelling and like problems, to examine the underlying issues from first principles, in preference to comparing and contrasting the currently proposed methods, to reach consensus on the issues identified as far as possible while not avoiding the controversial aspects, to identify as clearly as possible unreconciled differences, and to issue a Status Report. Eight uncertainty analysis methods were presented. A structured discussion of various aspects of uncertainty analysis followed - the need for uncertainty analysis, identification and ranking of uncertainties, characterisation, quantification and combination of uncertainties and applications, resources and future developments. As a result, the objectives set out above were, to a very large extent, achieved. Plans for the ISP were also discussed. Volume 1 contains a record of the discussions on uncertainty methods. Volume 2 is a compilation of descriptions of the eight uncertainty analysis methods presented at the workshop

  19. Uncertainty and sensitivity analysis of parameters affecting water hammer pressure wave behaviour

    International Nuclear Information System (INIS)

    Kaliatka, A.; Uspuras, E.; Vaisnoras, M.

    2006-01-01

    Pressure surges occurring in pipeline systems may be caused by fast control interference, start-up and shutdown processes, and operational failures. They lead to water hammer upstream of the closing valve and cavitational hammer downstream of it, which may cause considerable damage to the pipeline and the support structures. The appearance of water hammer in thermal-hydraulic systems has been widely studied in many organizations employing different state-of-the-art thermal-hydraulic codes. For the analysis, a water hammer test performed at the Fraunhofer Institute for Environmental, Safety and Energy Technology (UMSICHT) in Oberhausen was considered. This paper compares calculations of the UMSICHT test facility experiment, performed with the best-estimate system code RELAP5/Mod3.3, with measured water hammer values after fast closure of a valve. The analysis revealed that the calculated first pressure peak, which has the highest value, matches the measured value very well. The results of each individual calculation in this (as in any other) analysis always contain uncertainty owing to the initial conditions of the installation, measurement system errors, errors introduced by nodalization during modelling, code correlations, etc. In this connection, results of an uncertainty and sensitivity analysis of the initial conditions and code-selected models are shown in the paper. (orig.)
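
    As a first-cut analytical check on the first (highest) pressure peak, the classical Joukowsky relation dp = rho * c * dv is commonly used (illustrative values below; this is not the RELAP5 model):

        rho = 998.0    # water density, kg/m^3
        c = 1200.0     # pressure-wave speed in the pipe, m/s (depends on pipe elasticity)
        dv = 4.0       # flow velocity change at the fast-closing valve, m/s

        dp = rho * c * dv
        print(f"Joukowsky surge: {dp / 1e5:.1f} bar")   # about 47.9 bar above line pressure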

  20. Approach to uncertainty evaluation for safety analysis

    International Nuclear Information System (INIS)

    Ogura, Katsunori

    2005-01-01

    Nuclear power plant safety has traditionally been verified and confirmed through accident simulations using computer codes, mainly because it is very difficult to perform integrated experiments or tests for the verification and validation of plant safety, owing to radioactive consequences, cost, and scaling to the actual plant. Traditionally, plant safety was secured by the sufficient safety margins provided by the conservative assumptions and models applied in those simulations. More recently, best-estimate analyses based on realistic assumptions and models supported by accumulated insights have become possible, reducing the safety margins in the analysis results and increasing the need to evaluate the reliability or uncertainty of those results. This paper introduces an approach to evaluating the uncertainty of accident simulations and their results. (Note: This research was done not in the Japan Nuclear Energy Safety Organization but in the Tokyo Institute of Technology.) (author)

  1. Uncertainty analysis of one Main Circulation Pump trip event at the Ignalina NPP

    International Nuclear Information System (INIS)

    Vileiniskis, V.; Kaliatka, A.; Uspuras, E.

    2004-01-01

    A trip of one Main Circulation Pump (MCP) is an anticipated transient with an expected frequency of approximately one event per year. There have been a few events in which one MCP was inadvertently tripped. The throughput of the remaining running pumps in the affected Main Circulation Circuit loop increased; however, the total coolant flow through the affected loop decreased. The main question is whether this coolant flow rate is sufficient for adequate core cooling. This paper presents an investigation of a one-MCP trip event at the Ignalina NPP. According to international practice, the transient analysis should consist of a deterministic analysis employing best-estimate codes and an uncertainty analysis. For that purpose, the plant's RELAP5 model and the GRS (Germany) System for Uncertainty and Sensitivity Analysis package (SUSA) were employed. An uncertainty analysis of the flow energy losses in different parts of the Main Circulation Circuit, the initial conditions, and code-selected models was performed. Such an analysis makes it possible to estimate the influence of individual parameters on the calculation results and to find the modelling parameters that have the largest impact on the event studied. On the basis of this analysis, recommendations for the further improvement of the model have been developed. (author)

  2. Results from the Application of Uncertainty Methods in the CSNI Uncertainty Methods Study (UMS)

    International Nuclear Information System (INIS)

    Glaeser, H.

    2008-01-01

    Within licensing procedures there is an incentive to replace the conservative requirements for code application by a 'best estimate' concept supplemented by an uncertainty analysis to account for predictive uncertainties of code results. Methods have been developed to quantify these uncertainties. The Uncertainty Methods Study (UMS) Group, following a mandate from CSNI, has compared five methods for calculating the uncertainty in the predictions of advanced 'best estimate' thermal-hydraulic codes. Most of the methods identify and combine input uncertainties. The major differences between the predictions of the methods came from the choice of uncertain parameters and the quantification of the input uncertainties, i.e. the width of the uncertainty ranges. Therefore, suitable experimental and analytical information has to be selected to specify these uncertainty ranges or distributions. After the closure of the Uncertainty Methods Study (UMS) and after the report was issued, comparison calculations of experiment LSTF-SB-CL-18 were performed by the University of Pisa using different versions of the RELAP5 code. It turned out that the version used by two of the participants calculated a 170 K higher peak clad temperature compared with other versions using the same input deck. This may contribute to the differences in the upper limits of the uncertainty ranges.

  3. Verification of the CONPAS (CONtainment Performance Analysis System) code package

    International Nuclear Information System (INIS)

    Kim, See Darl; Ahn, Kwang Il; Song, Yong Man; Choi, Young; Park, Soo Yong; Kim, Dong Ha; Jin, Young Ho.

    1997-09-01

    CONPAS is a computer code package that integrates the numerical, graphical, and results-oriented aspects of Level 2 probabilistic safety assessment (PSA) for nuclear power plants automatically under a PC window environment. For the integrated analysis of Level 2 PSA, the code utilizes four distinct but closely related modules: (1) ET Editor, (2) Computer, (3) Text Editor, and (4) Mechanistic Code Plotter. Compared with other existing computer codes for Level 2 PSA, the CONPAS code provides several advanced features: computational features including systematic uncertainty analysis, importance analysis, sensitivity analysis, and data interpretation; reporting features including tables and graphics; and a user-friendly interface. The computational performance of CONPAS has been verified through a Level 2 PSA for a reference plant. The results of the CONPAS code were compared with those of an existing Level 2 PSA code (NUCAP+), and the comparison shows that CONPAS is appropriate for Level 2 PSA. (author). 9 refs., 8 tabs., 14 figs

  4. Extensive neutronic sensitivity-uncertainty analysis of a fusion reactor shielding blanket

    International Nuclear Information System (INIS)

    Hogenbirk, A.

    1994-01-01

    In this paper the results are presented of an extensive neutronic sensitivity-uncertainty study performed for the design of a shielding blanket for a next-step fusion reactor, such as ITER. A code system developed at ECN Petten was used. The uncertainty in an important response parameter, the neutron heating in the inboard superconducting coils, was evaluated. Neutron transport calculations in the 100-neutron-group GAM-II structure were performed using the code ANISN. For the sensitivity and uncertainty calculations the code SUSD was used. Uncertainties due to cross-section uncertainties were taken into account, as well as uncertainties due to uncertainties in the energy and angular distributions of scattered neutrons (SED and SAD uncertainties, respectively). The subject of direct-term uncertainties (i.e. uncertainties due to uncertainties in the kerma factors of the superconducting coils) is briefly touched upon. It is shown that SAD uncertainties, which have been largely neglected until now, contribute significantly to the total uncertainty. Moreover, the contribution of direct-term uncertainties may be large. The total uncertainty in the neutron heating due to Fe cross-sections alone amounts to approximately 25%, which is rather large. However, uncertainty data are scarce and the data may very well be conservative. It is shown in this paper that with the code system used, sensitivity and uncertainty calculations can be performed in a straightforward way. Therefore, it is suggested that emphasis now be put on the generation of realistic, reliable covariance data for cross-sections as well as for angular and energy distributions. ((orig.))
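
    The folding of sensitivity profiles with covariance matrices described above follows the standard first-order "sandwich" rule, (ΔR/R)² = SᵀCS. A minimal numpy sketch of that rule is given below; the group structure, sensitivities and covariances are invented for illustration and are not the paper's data:

```python
import numpy as np

# First-order "sandwich" rule: (dR/R)^2 = S^T C S, with S the relative
# sensitivity profile of the response R (here: neutron heating) and C the
# relative covariance matrix of the cross-section data. A 5-group toy
# example; every number is invented.
S = np.array([0.05, 0.12, 0.30, 0.18, 0.07])     # relative sensitivities per group

std = np.array([0.10, 0.08, 0.15, 0.20, 0.12])   # relative standard deviations
corr = np.full((5, 5), 0.3) + 0.7 * np.eye(5)    # assumed inter-group correlations
C = np.outer(std, std) * corr                    # relative covariance matrix

rel_var = S @ C @ S
print(f"relative uncertainty of the response: {np.sqrt(rel_var):.2%}")
```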

  5. Information on Hydrologic Conceptual Models, Parameters, Uncertainty Analysis, and Data Sources for Dose Assessments at Decommissioning Sites

    International Nuclear Information System (INIS)

    Meyer, Philip D.; Gee, Glendon W.; Nicholson, Thomas J.

    1999-01-01

    This report addresses issues related to the analysis of uncertainty in dose assessments conducted as part of decommissioning analyses. The analysis is limited to the hydrologic aspects of the exposure pathway involving infiltration of water at the ground surface, leaching of contaminants, and transport of contaminants through the groundwater to a point of exposure. The basic conceptual models and mathematical implementations of three dose assessment codes are outlined along with the site-specific conditions under which the codes may provide inaccurate, potentially nonconservative results. In addition, the hydrologic parameters of the codes are identified and compared. A methodology for parameter uncertainty assessment is outlined that considers the potential data limitations and modeling needs of decommissioning analyses. This methodology uses generic parameter distributions based on national or regional databases, sensitivity analysis, probabilistic modeling, and Bayesian updating to incorporate site-specific information. Data sources for best-estimate parameter values and parameter uncertainty information are also reviewed. A follow-on report will illustrate the uncertainty assessment methodology using decommissioning test cases
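
    The Bayesian updating step of the methodology can be illustrated with a conjugate normal model in which a generic parameter distribution from a regional database serves as the prior and sparse site-specific measurements update it. The choice of a normal model and all numbers below are assumptions for illustration only:

```python
import numpy as np

# Conjugate normal-normal Bayesian update of a hydrologic parameter
# (e.g., log10 of saturated hydraulic conductivity). The generic prior
# and the site data are invented for illustration.
prior_mean, prior_var = -5.0, 1.0**2      # generic distribution from a regional database
site_data = np.array([-4.2, -4.6, -4.4])  # sparse site-specific measurements
meas_var = 0.5**2                          # assumed measurement variance

n = len(site_data)
post_var = 1.0 / (1.0 / prior_var + n / meas_var)
post_mean = post_var * (prior_mean / prior_var + site_data.sum() / meas_var)

print(f"posterior: mean={post_mean:.2f}, sd={post_var**0.5:.2f}")
```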

  6. Information on Hydrologic Conceptual Models, Parameters, Uncertainty Analysis, and Data Sources for Dose Assessments at Decommissioning Sites

    International Nuclear Information System (INIS)

    Meyer, Philip D.; Gee, Glendon W.

    2000-01-01

    This report addresses issues related to the analysis of uncertainty in dose assessments conducted as part of decommissioning analyses. The analysis is limited to the hydrologic aspects of the exposure pathway involving infiltration of water at the ground surface, leaching of contaminants, and transport of contaminants through the groundwater to a point of exposure. The basic conceptual models and mathematical implementations of three dose assessment codes are outlined along with the site-specific conditions under which the codes may provide inaccurate, potentially nonconservative results. In addition, the hydrologic parameters of the codes are identified and compared. A methodology for parameter uncertainty assessment is outlined that considers the potential data limitations and modeling needs of decommissioning analyses. This methodology uses generic parameter distributions based on national or regional databases, sensitivity analysis, probabilistic modeling, and Bayesian updating to incorporate site-specific information. Data sources for best-estimate parameter values and parameter uncertainty information are also reviewed. A follow-on report will illustrate the uncertainty assessment methodology using decommissioning test cases

  7. Improvement of Modeling HTGR Neutron Physics by Uncertainty Analysis with the Use of Cross-Section Covariance Information

    Science.gov (United States)

    Boyarinov, V. F.; Grol, A. V.; Fomichenko, P. A.; Ternovykh, M. Yu

    2017-01-01

    This work is aimed at the improvement of HTGR neutron physics design calculations by the application of uncertainty analysis with the use of cross-section covariance information. A methodology and codes for the preparation of multigroup libraries of covariance information for individual isotopes from the basic 44-group library of the SCALE-6 code system were developed. A 69-group library of covariance information in a special format for the main isotopes and elements typical of high-temperature gas-cooled reactors (HTGRs) was generated. This library can be used for the estimation of uncertainties associated with nuclear data in the analysis of HTGR neutron physics with design codes. As an example, calculations of one-group cross-section uncertainties for fission and capture reactions for the main isotopes of the MHTGR-350 benchmark, as well as uncertainties of the multiplication factor (k∞) for the MHTGR-350 fuel compact cell model and fuel block model, were performed. These uncertainties were estimated with the developed technology using the WIMS-D code and modules of the SCALE-6 code system, namely TSUNAMI, KENO-VI and SAMS. The eight most important reactions on isotopes for the MHTGR-350 benchmark were identified, namely: 10B(capt), 238U(n,γ), ν5, 235U(n,γ), 238U(el), natC(el), 235U(fiss)-235U(n,γ), 235U(fiss).

  8. Design optimization and uncertainty analysis of SMA morphing structures

    International Nuclear Information System (INIS)

    Oehler, S D; Hartl, D J; Lopez, R; Malak, R J; Lagoudas, D C

    2012-01-01

    The continuing implementation of shape memory alloys (SMAs) as lightweight solid-state actuators in morphing structures has now motivated research into finding optimized designs for use in aerospace control systems. This work proposes methods that use iterative analysis techniques to determine optimized designs for morphing aerostructures and consider the impact of uncertainty in model variables on the solution. A combination of commercially available and custom-coded tools is utilized. ModelCenter, a suite of optimization algorithms and simulation process management tools, is coupled with the Abaqus finite element analysis suite and a custom SMA constitutive model to assess morphing structure designs in an automated fashion. The chosen case study involves determining the optimized configuration of a morphing aerostructure assembly that includes SMA flexures. This is accomplished by altering design inputs representing the placement of active components so as to minimize a specified cost function. An uncertainty analysis is also conducted using design-of-experiments methods to determine the sensitivity of the solution to a set of uncertainty variables. This second study demonstrates the effective use of Monte Carlo techniques to simulate the variance of model variables representing the inherent uncertainty in component fabrication processes. This paper outlines the modeling tools used to execute each case study, details the procedures for constructing the optimization problem and uncertainty analysis, and highlights the results from both studies. (paper)

  9. Phenomenological uncertainty analysis of containment building pressure load caused by severe accident sequences

    International Nuclear Information System (INIS)

    Park, S.Y.; Ahn, K.I.

    2014-01-01

    Highlights: • Phenomenological uncertainty analysis has been applied to level 2 PSA. • The methodology provides an alternative to simple deterministic analyses and sensitivity studies. • A realistic evaluation provides a more complete characterization of risks. • Uncertain parameters of the MAAP code for early containment failure were identified. - Abstract: This paper illustrates an application of a severe accident analysis code, MAAP, to the uncertainty evaluation of early containment failure scenarios employed in the containment event tree (CET) model of a reference plant. An uncertainty analysis of containment pressure behavior during severe accidents has been performed for an optimum assessment of an early containment failure model. The present application is mainly focused on determining an estimate of the containment building pressure load caused by severe accident sequences of a nuclear power plant. Key modeling parameters and phenomenological models employed for the present uncertainty analysis are closely related to in-vessel hydrogen generation, direct containment heating, and gas combustion. The basic approach of this methodology is to (1) develop severe accident scenarios for which containment pressure load evaluations should be performed, based on a Level 2 PSA, (2) identify severe accident phenomena relevant to an early containment failure, (3) identify the MAAP input parameters, sensitivity coefficients, and modeling options that describe or influence the early containment failure phenomena, (4) prescribe the likelihood descriptions of the potential range of these parameters, and (5) evaluate the code predictions using a number of random combinations of parameter inputs sampled from the likelihood distributions
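
    Steps (4) and (5) amount to sampling random combinations of code inputs from the prescribed likelihood distributions and re-running the code. A hedged sketch of that sampling loop follows; the parameter names, distributions, and the stand-in run_maap function are invented placeholders, since the real analysis drives the MAAP code itself:

```python
import numpy as np

rng = np.random.default_rng(42)

# Invented stand-ins for MAAP modeling parameters and their likelihoods.
def sample_inputs():
    return {
        "zr_oxidation_mult": rng.lognormal(mean=0.0, sigma=0.3),   # in-vessel H2 generation
        "dch_fraction": rng.uniform(0.0, 0.6),                     # direct containment heating
        "h2_burn_completeness": rng.triangular(0.3, 0.7, 1.0),     # gas combustion
    }

def run_maap(params):
    # Placeholder for a real severe-accident code run; returns a
    # fictitious peak containment pressure in MPa.
    return 0.4 + 0.3 * params["dch_fraction"] + 0.2 * params["h2_burn_completeness"]

pressures = np.array([run_maap(sample_inputs()) for _ in range(1000)])
print(f"95th percentile pressure load: {np.percentile(pressures, 95):.2f} MPa")
```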

  10. International training program in support of safety analysis. 3D S.UN.COP-scaling uncertainty and 3D thermal-hydraulics/neutron-kinetics coupled codes seminars

    International Nuclear Information System (INIS)

    Petruzzi, Alessandro; D'Auria, Francesco; Bajs, Tomislav; Reventos, Francesc; Hassan, Yassin

    2007-01-01

    Thermal-hydraulic system computer codes are extensively used worldwide for the analysis of nuclear facilities by utilities, regulatory bodies, nuclear power plant designers and vendors, nuclear fuel companies, research organizations, consulting companies, and technical support organizations. The computer code user represents a source of uncertainty that can influence the results of system code calculations. This influence is commonly known as the 'user effect' and stems from the limitations embedded in the codes as well as from the limited capability of the analysts to use the codes. Code user training and qualification is an effective means for reducing the variation of results caused by the application of the codes by different users. This paper describes a systematic approach to training code users who, upon completion of the training, should be able to perform calculations making the best possible use of the capabilities of best estimate codes. In other words, the program aims at contributing towards solving the problem of the user effect. The 3D S.UN.COP (Scaling, Uncertainty and 3D COuPled code calculations) seminars have been organized as a follow-up of the proposal to the IAEA for a Permanent Training Course for System Code Users. Six seminars have been held at the University of Pisa (2003, 2004), at The Pennsylvania State University (2004), at the University of Zagreb (2005), at the School of Industrial Engineering of Barcelona (January-February 2006) and in Buenos Aires, Argentina (October 2006), the last one requested by ARN (Autoridad Regulatoria Nuclear), NA-SA (Nucleoelectrica Argentina S.A) and CNEA (Comision Nacional de Energia Atomica). It was recognized that such courses represent both a source of continuing education for current code users and a means for current code users to enter the formal training structure of a proposed 'permanent' stepwise approach to user training. The 3D S.UN.COP 2006 in Barcelona was successfully held with the attendance of 33

  11. Sensitivity and uncertainty studies of the CRAC2 computer code

    International Nuclear Information System (INIS)

    Kocher, D.C.; Ward, R.C.; Killough, G.G.; Dunning, D.E. Jr.; Hicks, B.B.; Hosker, R.P. Jr.; Ku, J.Y.; Rao, K.S.

    1985-05-01

    This report presents a study of the sensitivity of early fatalities, early injuries, latent cancer fatalities, and economic costs for hypothetical nuclear reactor accidents as predicted by the CRAC2 computer code (CRAC = Calculation of Reactor Accident Consequences) to uncertainties in selected models and parameters used in the code. The sources of uncertainty that were investigated in the CRAC2 sensitivity studies include (1) the model for plume rise, (2) the model for wet deposition, (3) the procedure for meteorological bin-sampling involving the selection of weather sequences that contain rain, (4) the dose conversion factors for inhalation as they are affected by uncertainties in the physical and chemical form of the released radionuclides, (5) the weathering half-time for external ground-surface exposure, and (6) the transfer coefficients for estimating exposures via terrestrial foodchain pathways. The sensitivity studies were performed for selected radionuclide releases, hourly meteorological data, land-use data, a fixed non-uniform population distribution, a single evacuation model, and various release heights and sensible heat rates. Two important general conclusions from the sensitivity and uncertainty studies are as follows: (1) The large effects on predicted early fatalities and early injuries that were observed in some of the sensitivity studies apparently are due in part to the presence of thresholds in the dose-response models. Thus, the observed sensitivities depend in part on the magnitude of the radionuclide releases. (2) Some of the effects on predicted early fatalities and early injuries that were observed in the sensitivity studies were comparable to effects that were due only to the selection of different sets of weather sequences in bin-sampling runs. 47 figs., 50 tabs

  12. Uncertainty propagation applied to multi-scale thermal-hydraulics coupled codes. A step towards validation

    Energy Technology Data Exchange (ETDEWEB)

    Geffray, Clotaire Clement

    2017-03-20

    The work presented here constitutes an important step towards the validation of the use of coupled system thermal-hydraulic and computational fluid dynamics codes for the simulation of complex flows in liquid-metal-cooled pool-type facilities. First, a set of methods suited for uncertainty and sensitivity analysis and validation activities, with regard to the specific constraints of working with coupled and expensive-to-run codes, is proposed. Then, these methods are applied to the ATHLET-ANSYS CFX model of the TALL-3D facility. Several transients performed at the latter facility are investigated. The results are presented, discussed and compared to the experimental data. Finally, assessments of the validity of the selected methods and of the quality of the model are offered.

  13. Uncertainty analysis in seismic tomography

    Science.gov (United States)

    Owoc, Bartosz; Majdański, Mariusz

    2017-04-01

    The velocity field obtained from seismic travel-time tomography depends on several factors such as regularization, inversion path, model parameterization, etc. The result also depends strongly on the initial velocity model and on the precision of travel-time picking. In this research we test the dependence on the starting model in layered tomography and compare it with the effect of picking precision. Moreover, in our analysis the uncertainty distribution for manual travel-time picking is asymmetric. This effect shifts the results toward faster velocities. For the calculations we use the JIVE3D travel-time tomographic code. We used data from geo-engineering and industrial-scale investigations, which were collected by our team from IG PAS.

  14. Uncertainty and sensitivity analysis of the nuclear fuel thermal behavior

    Energy Technology Data Exchange (ETDEWEB)

    Boulore, A., E-mail: antoine.boulore@cea.fr [Commissariat a l' Energie Atomique (CEA), DEN, Fuel Research Department, 13108 Saint-Paul-lez-Durance (France); Struzik, C. [Commissariat a l' Energie Atomique (CEA), DEN, Fuel Research Department, 13108 Saint-Paul-lez-Durance (France); Gaudier, F. [Commissariat a l' Energie Atomique (CEA), DEN, Systems and Structure Modeling Department, 91191 Gif-sur-Yvette (France)

    2012-12-15

    Highlights: • A complete quantitative method for uncertainty propagation and sensitivity analysis is applied. • The thermal conductivity of UO{sub 2} is modeled as a random variable. • The first source of uncertainty is the linear heat rate. • The second source of uncertainty is the thermal conductivity of the fuel. - Abstract: In the global framework of nuclear fuel behavior simulation, the response of the models describing the physical phenomena occurring during irradiation in the reactor is mainly conditioned by the confidence in the calculated temperature of the fuel. Amongst all the parameters influencing the temperature calculation in our fuel rod simulation code (METEOR V2), several sources of uncertainty have been identified as being the most sensitive: the thermal conductivity of UO{sub 2}, the radial distribution of power in the fuel pellet, the local linear heat rate in the fuel rod, the geometry of the pellet, and the thermal transfer in the gap. Expert judgment and inverse methods have been used to model the uncertainty of these parameters using theoretical distributions and correlation matrices. Propagation of these uncertainties in the METEOR V2 code using the URANIE framework and a Monte Carlo technique has been performed for different experimental irradiations of UO{sub 2} fuel. At every time step of the simulated experiments, we obtain a temperature statistical distribution which results from the initial distributions of the uncertain parameters. We can then estimate confidence intervals for the calculated temperature. In order to quantify the sensitivity of the calculated temperature to each of the uncertain input parameters and data, we have also performed a sensitivity analysis using first-order Sobol' indices.
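
    A minimal sketch of first-order Sobol' index estimation by the pick-freeze (Saltelli-type) Monte Carlo approach is given below, using an invented toy model in place of the METEOR V2/URANIE computation chain of the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def model(x):
    # Toy stand-in for a fuel-temperature calculation; columns are
    # (thermal conductivity factor, linear heat rate factor, gap conductance factor).
    return 1000.0 / x[:, 0] * x[:, 1] + 50.0 * x[:, 2]

n, d = 100_000, 3
A = rng.lognormal(0.0, 0.1, size=(n, d))   # two independent input sample matrices
B = rng.lognormal(0.0, 0.1, size=(n, d))
yA, yB = model(A), model(B)
var_y = np.var(np.concatenate([yA, yB]))

for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                      # "pick-freeze": swap in column i from B
    yABi = model(ABi)
    S1 = np.mean(yB * (yABi - yA)) / var_y   # Saltelli-style first-order estimator
    print(f"first-order Sobol index S{i + 1} = {S1:.2f}")
```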

  15. Development and application of objective uncertainty measures for nuclear power plant transient analysis

    International Nuclear Information System (INIS)

    Vinai, P.

    2007-10-01

    For the development, design and licensing of a nuclear power plant (NPP), a sound safety analysis is necessary to study the diverse physical phenomena involved in the system behaviour under operational and transient conditions. Such studies are based on detailed computer simulations. With the progress achieved in computer technology and the greater availability of experimental and plant data, the use of best estimate codes for safety evaluations has gained increasing acceptance. The application of best estimate safety analysis has raised new problems that need to be addressed: it has become more crucial to assess how reliable code predictions are, especially when they need to be compared against safety limits that must not be crossed. It becomes necessary to identify and quantify the various possible sources of uncertainty that affect the reliability of the results. Currently, such uncertainty evaluations are generally based on expert opinion. In the present research, a novel methodology based on a non-parametric statistical approach has been developed for the objective quantification of best-estimate code uncertainties related to the physical models used in the code. The basis is an evaluation of the accuracy of a given physical model, achieved by comparing its predictions with experimental data from an appropriate set of separate-effect tests. The differences between measurements and predictions can be considered stochastically distributed, and thus a statistical approach can be employed. The first step was the development of a procedure for investigating the dependence of a given physical model's accuracy on the experimental conditions. Each separate-effect test effectively provides a random sample of discrepancies between measurements and predictions, corresponding to a location in the state space defined by a certain number of independent system variables. As a consequence, the samples of 'errors', achieved from analysis of the entire database, are

  16. Preliminary Uncertainty Analysis for SMART Digital Core Protection and Monitoring System

    International Nuclear Information System (INIS)

    Koo, Bon Seung; In, Wang Kee; Hwang, Dae Hyun

    2012-01-01

    The Korea Atomic Energy Research Institute (KAERI) developed on-line digital core protection and monitoring systems, called SCOPS and SCOMS, as part of the SMART plant protection and monitoring system. SCOPS simplified the protection system by directly connecting the four RSPT signals to each core protection channel and eliminated the control element assembly calculator (CEAC) hardware. SCOMS adopted the DPCM3D method for synthesizing the core power distribution instead of the Fourier expansion method used in conventional PWRs. The DPCM3D method produces a synthetic 3-D power distribution by coupling a neutronics code and measured in-core detector signals. An overall uncertainty analysis methodology for statistically combining the uncertainty components of the SMART core protection and monitoring system was developed. In this paper, preliminary overall uncertainty factors for SCOPS/SCOMS of the SMART initial core are evaluated by applying the newly developed uncertainty analysis method

  17. Towards an Industrial Application of Statistical Uncertainty Analysis Methods to Multi-physical Modelling and Safety Analyses

    International Nuclear Information System (INIS)

    Zhang, Jinzhao; Segurado, Jacobo; Schneidesch, Christophe

    2013-01-01

    Since the 1980s, Tractebel Engineering (TE) has been developing and applying a multi-physical modelling and safety analysis capability, based on a code package consisting of the best estimate 3D neutronic (PANTHER), system thermal-hydraulic (RELAP5), core sub-channel thermal-hydraulic (COBRA-3C), and fuel thermal-mechanical (FRAPCON/FRAPTRAN) codes. A series of methodologies has been developed to perform and to license reactor safety analyses and core reload designs, based on the deterministic bounding approach. Following recent trends in research and development as well as in industrial applications, TE has been working since 2010 towards the application of statistical sensitivity and uncertainty analysis methods to multi-physical modelling and licensing safety analyses. In this paper, the TE multi-physical modelling and safety analysis capability is first described, followed by the proposed TE best estimate plus statistical uncertainty analysis method (BESUAM). The chosen statistical sensitivity and uncertainty analysis methods (non-parametric order statistics or bootstrap) and tool (DAKOTA) are then presented, followed by some preliminary results of their application to FRAPCON/FRAPTRAN simulation of the OECD RIA fuel rod code benchmark and RELAP5/MOD3.3 simulation of THTF tests. (authors)

  18. Bootstrap and Order Statistics for Quantifying Thermal-Hydraulic Code Uncertainties in the Estimation of Safety Margins

    Directory of Open Access Journals (Sweden)

    Enrico Zio

    2008-01-01

    In the present work, the uncertainties affecting the safety margins estimated from thermal-hydraulic code calculations are captured quantitatively by resorting to order statistics and the bootstrap technique. The proposed framework of analysis is applied to the estimation of the safety margin, with its confidence interval, of the maximum fuel cladding temperature reached during a complete group distribution blockage scenario in an RBMK-1500 nuclear reactor.
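
    As a hedged illustration of the bootstrap step (all numbers invented, not the RBMK-1500 calculations of the paper): given a small set of code-calculated peak cladding temperatures, resampling them with replacement yields a confidence interval for the estimated safety margin against a temperature limit.

```python
import numpy as np

rng = np.random.default_rng(1)

limit = 1477.0  # illustrative cladding temperature limit, K
# Invented sample of code-calculated peak cladding temperatures (K):
pct = np.array([1180., 1215., 1198., 1240., 1205., 1188., 1222., 1210.])

# Bootstrap the safety margin estimate (limit minus the sample maximum).
margins = np.empty(10_000)
for b in range(margins.size):
    resample = rng.choice(pct, size=pct.size, replace=True)
    margins[b] = limit - resample.max()

lo, hi = np.percentile(margins, [2.5, 97.5])
print(f"estimated margin: {limit - pct.max():.0f} K, 95% CI [{lo:.0f}, {hi:.0f}] K")
```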

  19. Cross-section data uncertainty and how such information is used in fusion analysis

    International Nuclear Information System (INIS)

    Kodeli, I.

    1999-01-01

    A sensitivity and uncertainty computer code package has been developed and already used extensively in analyses for pressure vessel dosimetry and benchmark experiments. The procedure is based on the SUSD3D code and the VITAMIN-J/COVA covariance matrix library. The complete covariance matrices of secondary angular distributions (SAD, file MF=34), as available in the EFF-2 evaluation in ENDF-6 format, can be treated. SAD effects can be important in particular for fusion applications. An option to perform three-dimensional sensitivity and uncertainty analysis is included in the SUSD3D code. The use of angular moment files instead of the bulky angular flux files produced by discrete ordinates transport codes considerably reduces the size of the files required and represents an acceptable approximation for the problem types analysed here. The underlying perturbation theory is reviewed and some examples of its use are given. (author)

  20. Uncertainty analysis of time-dependent nonlinear systems: theory and application to transient thermal hydraulics

    International Nuclear Information System (INIS)

    Barhen, J.; Bjerke, M.A.; Cacuci, D.G.; Mullins, C.B.; Wagschal, G.G.

    1982-01-01

    An advanced methodology for performing systematic uncertainty analysis of time-dependent nonlinear systems is presented. This methodology includes a capability for reducing uncertainties in system parameters and responses by using Bayesian inference techniques to consistently combine prior knowledge with additional experimental information. The determination of best estimates for the system parameters, for the responses, and for their respective covariances is treated as a time-dependent constrained minimization problem. Three alternative formalisms for solving this problem are developed. The two 'off-line' formalisms, with and without 'foresight' characteristics, require the generation of a complete sensitivity data base prior to performing the uncertainty analysis. The 'on-line' formalism, in which uncertainty analysis is performed interactively with the system analysis code, is best suited for the treatment of large-scale, highly nonlinear, time-dependent problems. This methodology is applied to the uncertainty analysis of a transient upflow of a high-pressure water heat transfer experiment. For comparison, an uncertainty analysis using sensitivities computed by standard response surface techniques is also performed. The results of the analysis indicate the following. A major reduction of the discrepancies in the calculation/experiment ratios is achieved by using the new methodology. Incorporation of in-bundle measurements in the uncertainty analysis significantly reduces system uncertainties. The accuracy of sensitivities generated by response-surface techniques should be carefully assessed prior to using them as a basis for uncertainty analyses of transient reactor safety problems

  1. Response surface methodology for sensitivity and uncertainty analysis: performance and perspectives

    International Nuclear Information System (INIS)

    Olivi, L.; Brunelli, F.; Cacciabue, P.C.; Parisi, P.

    1985-01-01

    Two main aspects have to be taken into account when studying a nuclear accident scenario using nuclear safety codes as an information source. The first concerns the behavior of the code response and the set of assumptions to be introduced for its modelling. The second is connected with the uncertainty features of the code input, often modelled as a probability density function (pdf). The analyst can apply two well-defined approaches depending on which of the two aspects is to receive the major emphasis. Response Surface Methodology uses polynomial and inverse polynomial models together with the theory of experimental design, expressly developed for the identification procedure. It constitutes a well-established body of techniques able to cover a wide spectrum of requirements when the first aspect plays the crucial role in the definition of the objectives. Other techniques such as Latin hypercube sampling, stratified sampling or even random sampling can fit better when the second aspect affects the reliability of the analysis. The ultimate goals of both approaches are variable selection, i.e. the identification of the code input variables with the greatest effect on the output, and uncertainty propagation, i.e. the assessment of the pdf to be attributed to the code response. The main aim of this work is to present a sensitivity analysis method, already tested on a real case, sufficiently flexible to be applied in both approaches mentioned
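
    A minimal sketch of the response-surface idea, with an invented stand-in for the safety code: fit a low-order polynomial surrogate to a few code runs arranged in a simple experimental design, then propagate the input pdf through the cheap surrogate instead of the code itself.

```python
import numpy as np

rng = np.random.default_rng(2)

def expensive_code(x1, x2):
    # Invented stand-in for a safety-code response.
    return 500.0 + 80.0 * x1 - 30.0 * x2 + 15.0 * x1 * x2

# Small factorial-style design over the normalized input ranges [-1, 1].
grid = np.array([(a, b) for a in (-1, 0, 1) for b in (-1, 0, 1)], dtype=float)
y = expensive_code(grid[:, 0], grid[:, 1])

# Quadratic response surface: 1, x1, x2, x1*x2, x1^2, x2^2
X = np.column_stack([np.ones(len(grid)), grid[:, 0], grid[:, 1],
                     grid[:, 0] * grid[:, 1], grid[:, 0]**2, grid[:, 1]**2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# Propagate assumed input pdfs through the cheap surrogate.
x1, x2 = rng.normal(0, 0.3, 100_000), rng.uniform(-1, 1, 100_000)
Xs = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
ys = Xs @ coef
print(f"surrogate response: mean={ys.mean():.1f}, std={ys.std():.1f}")
```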

  2. DNBR calculation in digital core protection system by a subchannel analysis code

    International Nuclear Information System (INIS)

    In, W. K.; Yoo, Y. J.; Hwang, T. H.; Ji, S. K.

    2001-01-01

    The DNBR calculation uncertainty and DNBR margin in a digital core protection system were evaluated with the thermal-hydraulic subchannel analysis code MATRA. A simplified thermal-hydraulic code, CETOP, is used for the on-line DNBR calculation in the core protection system of a digital PWR. A DNBR tuning process against a best-estimate subchannel analysis code is required for CETOP to ensure accurate and conservative DNBR calculations, but is not necessary for MATRA. The DNBR calculations with MATRA and CETOP were performed for a large number of operating conditions in Yonggwang nuclear units 3-4, where the digital core protection system was first implemented in Korea. MATRA resulted in a less negative mean value (i.e., reduced overconservatism) and a somewhat larger standard deviation of the DNBR error. The uncertainty-corrected minimum DNBR from MATRA was shown to be higher by 1.8%-9.9% than the CETOP DNBR

  3. International Training Program in Support of Safety Analysis: 3D S.UN.COP-Scaling, Uncertainty and 3D Thermal-Hydraulics/Neutron-Kinetics Coupled Codes Seminars

    International Nuclear Information System (INIS)

    Petruzzi, Alessandro; D'Auria, Francesco; Bajs, Tomislav; Reventos, Francesc

    2006-01-01

    Thermal-hydraulic system computer codes are extensively used worldwide for the analysis of nuclear facilities by utilities, regulatory bodies, nuclear power plant designers and vendors, nuclear fuel companies, research organizations, consulting companies, and technical support organizations. The computer code user represents a source of uncertainty that can influence the results of system code calculations. This influence is commonly known as the 'user effect' and stems from the limitations embedded in the codes as well as from the limited capability of the analysts to use the codes. Code user training and qualification is an effective means for reducing the variation of results caused by the application of the codes by different users. This paper describes a systematic approach to training code users who, upon completion of the training, should be able to perform calculations making the best possible use of the capabilities of best estimate codes. In other words, the program aims at contributing towards solving the problem of the user effect. The 3D S.UN.COP (Scaling, Uncertainty and 3D COuPled code calculations) seminars have been organized as a follow-up of the proposal to the IAEA for a Permanent Training Course for System Code Users [1]. Five seminars have been held at the University of Pisa (2003, 2004), at The Pennsylvania State University (2004), at the University of Zagreb (2005) and at the School of Industrial Engineering of Barcelona (2006). It was recognized that such courses represent both a source of continuing education for current code users and a means for current code users to enter the formal training structure of a proposed 'permanent' stepwise approach to user training. The 3D S.UN.COP 2006 was successfully held with the attendance of 33 participants coming from 18 countries and 28 different institutions (universities, vendors, national laboratories and regulatory bodies). More than 30 scientists (coming from 13 countries and 23 different institutions) were

  4. Uncertainty analysis for the BEACON-COLSS core monitoring system application

    International Nuclear Information System (INIS)

    Morita, T.; Boyd, W.A.; Seong, K.B.

    2005-01-01

    This paper covers the measurement uncertainty analysis of the BEACON-COLSS core monitoring system. The uncertainty evaluation is made using a BEACON-COLSS simulation program. By simulating BEACON on-line operation for analytically generated reactor conditions, the accuracy of the 'Measured' results can be evaluated by comparison with the analytically generated 'Truth'. The DNB power margin is evaluated based on Combustion Engineering's Modified Statistical Combination of Uncertainties (MSCU), using the CETOPD code for the DNBR calculation. A BEACON-COLSS simulation program for the uncertainty evaluation function has been established for plant applications. Qualification work has been completed for two Combustion Engineering plants. The results for the BEACON-COLSS measured peaking factors and DNBR power margin are plant-type dependent and are applicable to reload cores as long as the core geometry and detector layout are unchanged. (authors)

  5. PREMIUM - Benchmark on the quantification of the uncertainty of the physical models in the system thermal-hydraulic codes

    International Nuclear Information System (INIS)

    Skorek, Tomasz; Crecy, Agnes de

    2013-01-01

    PREMIUM (Post-BEMUSE Reflood Models Input Uncertainty Methods) is an activity launched with the aim of pushing forward the methods for quantifying physical model uncertainties in thermal-hydraulic codes. It is endorsed by OECD/NEA/CSNI/WGAMA. The PREMIUM benchmark is addressed to all who apply uncertainty evaluation methods based on input uncertainty quantification and propagation. The benchmark is based on a selected case of uncertainty analysis applied to the simulation of quench front propagation in an experimental test facility. Application to an experiment enables evaluation and confirmation of the quantified probability distribution functions on the basis of experimental data. The scope of the benchmark comprises a review of the existing methods, selection of potentially important uncertain input parameters, preliminary quantification of the ranges and distributions of the identified parameters, evaluation of the probability density functions using experimental results of tests performed on the FEBA test facility, and confirmation/validation of the performed quantification on the basis of a blind calculation of the Reflood 2-D PERICLES experiment. (authors)

  6. Uncertainty propagation in probabilistic safety analysis of nuclear power plants

    International Nuclear Information System (INIS)

    Fleming, P.V.

    1981-09-01

    Uncertainty propagation in the probabilistic safety analysis of nuclear power plants is performed. The minimal cut set methodology is implemented in the computer code SVALON, and the results for several cases are compared with corresponding results obtained with the SAMPLE code, which employs the Monte Carlo method to propagate the uncertainties. The results show that, for a relatively small number of dominant minimal cut sets (n approximately 25) and error factors (r approximately 5), the SVALON code yields results which are comparable to those obtained with SAMPLE. An analysis of the unavailability of the low pressure recirculation system of Angra 1 for both the short- and long-term recirculation phases is presented. The results for the short-term phase are in good agreement with the corresponding ones given in WASH-1400. (E.G.)
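
    A hedged sketch of the Monte Carlo side of such a comparison (the basic events, error factors and cut sets are invented): sample lognormal basic-event unavailabilities with given error factors and propagate them through the minimal cut set upper bound for the top event.

```python
import numpy as np

rng = np.random.default_rng(3)

# Invented basic events: (median unavailability, error factor EF).
# For a lognormal, EF = exp(1.645 * sigma), the 95th/50th percentile ratio.
events = {"pump_A": (1e-3, 5.0), "pump_B": (1e-3, 5.0), "valve_C": (5e-4, 3.0)}
cut_sets = [("pump_A", "pump_B"), ("valve_C",)]   # invented minimal cut sets

def sample_unavailabilities(n):
    out = {}
    for name, (median, ef) in events.items():
        sigma = np.log(ef) / 1.645
        out[name] = rng.lognormal(np.log(median), sigma, n)
    return out

q = sample_unavailabilities(100_000)
# Minimal cut set upper bound: Q_top <= 1 - prod_k (1 - prod_{i in k} q_i)
prod_terms = np.ones(100_000)
for cs in cut_sets:
    cs_prob = np.prod([q[e] for e in cs], axis=0)
    prod_terms *= (1.0 - cs_prob)
q_top = 1.0 - prod_terms

print(f"median={np.median(q_top):.2e}, 95th percentile={np.percentile(q_top, 95):.2e}")
```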

  7. Methodologies for uncertainty analysis in the level 2 PSA and their implementation procedures

    International Nuclear Information System (INIS)

    Ahn, Kwang Il; Yang, Joon Eun; Kim, Dong Ha

    2002-04-01

    The main purpose of this report is to present standardized methodologies for uncertainty analysis in Level 2 Probabilistic Safety Assessment (PSA) and their implementation procedures, based on results obtained through a critical review of the existing methodologies for the analysis of uncertainties employed in the Level 2 PSA, especially the Accident Progression Event Tree (APET). The uncertainties employed in the Level 2 PSA are quantitative expressions of the overall knowledge of the analysts and experts participating in the probabilistic quantification of phenomenological accident progressions ranging from core melt to containment failure; their numerical values are directly related to the degree of confidence the analyst has that a given phenomenological event or accident process will or will not occur, i.e. the analyst's subjective probabilities of occurrence. The results obtained from the Level 2 PSA uncertainty analysis become an essential contributor to the plant risk, in addition to the Level 1 PSA and Level 3 PSA uncertainties. The uncertainty analysis methodologies and implementation procedures presented in this report were prepared based on the following criterion: 'the uncertainty quantification process must be logical, scrutable, complete, consistent and at an appropriate level of detail, as mandated by the Level 2 PSA objectives'. For this purpose, the report deals mainly with (1) a summary of general and Level 2 PSA specific uncertainty analysis methodologies, (2) the selection of phenomenological branch events for uncertainty analysis in the APET and a methodology for the quantification of APET uncertainty inputs with its implementation procedure, (3) the statistical propagation of uncertainty inputs through the APET with its implementation procedure, and (4) a formal procedure for the quantification of APET uncertainties and source term categories (STCs) through the Level 2 PSA quantification codes

  8. Experimental data bases useful for quantification of model uncertainties in best estimate codes

    International Nuclear Information System (INIS)

    Wilson, G.E.; Katsma, K.R.; Jacobson, J.L.; Boodry, K.S.

    1988-01-01

    A data base is necessary for assessment of thermal hydraulic codes within the context of the new NRC ECCS Rule. Separate effect tests examine particular phenomena that may be used to develop and/or verify models and constitutive relationships in the code. Integral tests are used to demonstrate the capability of codes to model global characteristics and sequence of events for real or hypothetical transients. The nuclear industry has developed a large experimental data base of fundamental nuclear, thermal-hydraulic phenomena for code validation. Given a particular scenario, and recognizing the scenario's important phenomena, selected information from this data base may be used to demonstrate applicability of a particular code to simulate the scenario and to determine code model uncertainties. LBLOCA experimental data bases useful to this objective are identified in this paper. 2 tabs

  9. Case studies in Gaussian process modelling of computer codes

    International Nuclear Information System (INIS)

    Kennedy, Marc C.; Anderson, Clive W.; Conti, Stefano; O'Hagan, Anthony

    2006-01-01

    In this paper we present a number of recent applications in which an emulator of a computer code is created using a Gaussian process model. Tools are then applied to the emulator to perform sensitivity analysis and uncertainty analysis. Sensitivity analysis is used both as an aid to model improvement and as a guide to how much the output uncertainty might be reduced by learning about specific inputs. Uncertainty analysis allows us to reflect output uncertainty due to unknown input parameters, when the finished code is used for prediction. The computer codes themselves are currently being developed within the UK Centre for Terrestrial Carbon Dynamics
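
    A minimal sketch of the emulator idea using scikit-learn's Gaussian process regressor, with an invented one-dimensional "code" in place of the multi-input simulators treated in the paper:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def slow_code(x):
    # Invented stand-in for an expensive simulator.
    return np.sin(3 * x) + 0.5 * x

# A handful of training runs of the "code".
X_train = np.linspace(0, 2, 8).reshape(-1, 1)
y_train = slow_code(X_train).ravel()

gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(), normalize_y=True)
gp.fit(X_train, y_train)

# Uncertainty analysis: propagate an assumed input pdf through the cheap emulator.
rng = np.random.default_rng(4)
x_samples = rng.normal(1.0, 0.3, 5000).clip(0, 2).reshape(-1, 1)
y_mean, y_sd = gp.predict(x_samples, return_std=True)
print(f"output mean={y_mean.mean():.3f}, input-driven sd={y_mean.std():.3f}")
print(f"average emulator (code) uncertainty at the samples: {y_sd.mean():.3f}")
```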

  10. Automated sensitivity analysis of the radionuclide migration code UCBNE10.2

    International Nuclear Information System (INIS)

    Pin, F.G.; Worley, B.A.; Oblow, E.M.; Wright, R.Q.; Harper, W.V.

    1985-01-01

    The Salt Repository Project (SRP) of the US Department of Energy is performing ongoing performance assessment analyses for the eventual licensing of an underground high-level nuclear waste repository in salt. As part of these studies, sensitivity and uncertainty analysis play a major role in the identification of important parameters and in the identification of specific data needs for site characterization. Oak Ridge National Laboratory has supported the SRP in this effort, resulting in the development of an automated procedure for performing large-scale sensitivity analysis using computer calculus. GRESS, the Gradient Enhanced Software System, is a pre-compiler that can process FORTRAN computer codes and add derivative-taking capabilities to the normal calculated results. The GRESS code is described and applied to the code UCB-NE-10.2, which simulates the migration of the radionuclide members of a decay chain through an adsorptive medium. Conclusions are drawn on the applicability of GRESS to more general large-scale modeling sensitivity studies, and the role of such techniques in the overall SRP sensitivity/uncertainty program is detailed. 6 refs., 2 figs., 3 tabs

  11. Effect of activation cross section uncertainties in transmutation analysis of realistic low-activation steels for IFMIF

    Energy Technology Data Exchange (ETDEWEB)

    Cabellos, O.; García-Herranz, N.; Sanz, J. [Institute of Nuclear Fusion, UPM, Madrid (Spain); Cabellos, O.; García-Herranz, N.; Fernandez, P.; Fernandez, B. [Dept. of Nuclear Engineering, UPM, Madrid (Spain); Sanz, J. [Dept. of Power Engineering, UNED, Madrid (Spain); Reyes, S. [Safety, Environment and Health Group, ITER Joint Work Site, Cadarache Center (France)

    2008-07-01

    We address uncertainty analysis to draw conclusions on the reliability of activation calculations for the International Fusion Materials Irradiation Facility (IFMIF) under the potential impact of activation cross-section uncertainties. The Monte Carlo methodology implemented in the ACAB code gives uncertainty estimates due to the synergetic/global effect of the complete set of cross-section uncertainties. An element-by-element analysis has been demonstrated to be a helpful tool for easily analysing the transmutation performance of irradiated materials. The uncertainty analysis results showed that for times over about 24 h the relative error in the contact dose rate can be as large as 23%. We have calculated the effect of cross-section uncertainties on the IFMIF activation of all the different elements. For EUROFER, the uncertainties in H and He production are 7.3% and 5.6%, respectively. We have found significant uncertainties in the transmutation response for C, P and Nb.

  12. Benchmarking and application of the state-of-the-art uncertainty analysis methods XSUSA and SHARK-X

    International Nuclear Information System (INIS)

    Aures, A.; Bostelmann, F.; Hursin, M.; Leray, O.

    2017-01-01

    Highlights: • Application of the uncertainty analysis methods XSUSA and SHARK-X. • Propagation of nuclear data uncertainty through a PWR pin cell depletion calculation. • Uncertainty quantification of the eigenvalue, nuclide densities and Doppler coefficient. • Top contributors to the overall output uncertainty identified by sensitivity analysis. • Comparison with SAMPLER and TSUNAMI of the SCALE code package. - Abstract: This study presents collaborative work performed between GRS and PSI on the benchmarking and application of the state-of-the-art uncertainty analysis methods XSUSA and SHARK-X. Applied to a PWR pin cell depletion calculation, both methods propagate input uncertainty from nuclear data to output uncertainty. The uncertainties of the multiplication factors, nuclide densities, and fuel temperature coefficients derived by both methods are compared at various burnup steps. Comparisons of these quantities are furthermore performed with the SAMPLER module of SCALE 6.2. The perturbation-theory-based TSUNAMI module of both SCALE 6.1 and SCALE 6.2 is additionally applied for comparisons of the reactivity coefficient.

  13. Fukushima Daiichi unit 1 uncertainty analysis--Preliminary selection of uncertain parameters and analysis methodology

    Energy Technology Data Exchange (ETDEWEB)

    Cardoni, Jeffrey N.; Kalinich, Donald A.

    2014-02-01

    Sandia National Laboratories (SNL) plans to conduct uncertainty analyses (UA) of the Fukushima Daiichi unit 1 (1F1) plant with the MELCOR code. The model to be used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and the Nuclear Regulatory Commission (NRC). However, that study only examined a handful of model inputs and boundary conditions, and the predictions yielded only fair agreement with plant data and current release estimates. The goal of this uncertainty study is to perform a focused evaluation of uncertainty in core melt progression behavior and its effect on key figures of merit (e.g., hydrogen production, vessel lower head failure, etc.). In preparation for the SNL Fukushima UA work, a scoping study has been completed to identify important core melt progression parameters for the uncertainty analysis. The study also lays out a preliminary UA methodology.

  14. Deterministic methods for sensitivity and uncertainty analysis in large-scale computer models

    International Nuclear Information System (INIS)

    Worley, B.A.; Oblow, E.M.; Pin, F.G.; Maerker, R.E.; Horwedel, J.E.; Wright, R.Q.; Lucius, J.L.

    1987-01-01

    The fields of sensitivity and uncertainty analysis are dominated by statistical techniques when large-scale modeling codes are being analyzed. This paper reports on the development and availability of two systems, GRESS and ADGEN, that make use of computer calculus compilers to automate the implementation of deterministic sensitivity analysis capability into existing computer models. This automation removes the traditional limitation of deterministic sensitivity methods. The paper describes a deterministic uncertainty analysis method (DUA) that uses derivative information as a basis to propagate parameter probability distributions to obtain result probability distributions. The paper demonstrates the deterministic approach to sensitivity and uncertainty analysis as applied to a sample problem that models the flow of water through a borehole. The sample problem is used as a basis to compare the cumulative distribution function of the flow rate as calculated by the standard statistical methods and the DUA method. The DUA method gives a more accurate result based upon only two model executions compared to fifty executions in the statistical case
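
    A hedged sketch of the derivative-based propagation idea on a borehole-type flow model (the simplified formula and all numbers are illustrative, not the paper's exact sample problem): obtain sensitivities by numerical differentiation, standing in here for the automated computer calculus of GRESS/ADGEN, and propagate parameter variances to the output at first order.

```python
import numpy as np

def flow_rate(p):
    # Simplified borehole-type flow model, Q = 2*pi*T*dH / ln(r/rw)  (illustrative).
    T, dH, r, rw = p
    return 2.0 * np.pi * T * dH / np.log(r / rw)

p0 = np.array([85.0, 1000.0, 25000.0, 0.1])   # nominal parameter values (invented)
sd = np.array([10.0, 100.0, 5000.0, 0.02])    # parameter standard deviations (invented)

# Sensitivities dQ/dp_i by central finite differences (GRESS/ADGEN obtain
# such derivatives automatically from the FORTRAN source instead).
grad = np.empty_like(p0)
for i in range(len(p0)):
    dp = 1e-6 * max(abs(p0[i]), 1.0)
    up, dn = p0.copy(), p0.copy()
    up[i] += dp
    dn[i] -= dp
    grad[i] = (flow_rate(up) - flow_rate(dn)) / (2 * dp)

# First-order variance propagation, assuming independent parameters.
var_q = np.sum((grad * sd) ** 2)
print(f"Q = {flow_rate(p0):.1f}, sigma_Q = {np.sqrt(var_q):.1f}")
```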

  15. Comprehensive neutron cross-section and secondary energy distribution uncertainty analysis for a fusion reactor

    International Nuclear Information System (INIS)

    Gerstl, S.A.W.; LaBauve, R.J.; Young, P.G.

    1980-05-01

    Using the example of General Atomic's well-documented Power Generating Fusion Reactor (PGFR) design, this report exercises a comprehensive neutron cross-section and secondary energy distribution (SED) uncertainty analysis. The LASL sensitivity and uncertainty analysis code SENSIT is used to calculate reaction cross-section sensitivity profiles and integral SED sensitivity coefficients. These are then folded with covariance matrices and integral SED uncertainties to obtain the resulting uncertainties of three calculated neutronics design parameters: two critical radiation damage rates and a nuclear heating rate. The report documents the first sensitivity-based data uncertainty analysis that incorporates a quantitative treatment of the effects of SED uncertainties. The results demonstrate quantitatively that the ENDF/B-V cross-section data files for C, H, and O, including their SED data, are fully adequate for this design application, while the data for Fe and Ni are at best marginally adequate because they give rise to response uncertainties of up to 25%. Much higher response uncertainties are caused by cross-section and SED data uncertainties in Cu (26 to 45%), tungsten (24 to 54%), and Cr (up to 98%). Specific recommendations are given for re-evaluations of certain reaction cross-sections, secondary energy distributions, and uncertainty estimates

  16. Uncertainty quantification and sensitivity analysis with CASL Core Simulator VERA-CS

    International Nuclear Information System (INIS)

    Brown, C.S.; Zhang, Hongbin

    2016-01-01

    VERA-CS (Virtual Environment for Reactor Applications, Core Simulator) is a coupled neutron transport and thermal-hydraulics code under development by the Consortium for Advanced Simulation of Light Water Reactors (CASL). An approach to uncertainty quantification and sensitivity analysis with VERA-CS was developed, and a new toolkit was created to perform the analyses. A 2 × 2 fuel assembly model was developed and simulated by VERA-CS, and uncertainty quantification and sensitivity analysis were performed with fourteen uncertain input parameters. The minimum departure from nucleate boiling ratio (MDNBR), maximum fuel center-line temperature, and maximum outer clad surface temperature were chosen as the figures of merit. Pearson, Spearman, and partial correlation coefficients were considered for all of the figures of merit in the sensitivity analysis, and the coolant inlet temperature was consistently the most influential parameter. Parameters used as inputs to the critical heat flux calculation with the W-3 correlation were shown to be the most influential on the MDNBR, maximum fuel center-line temperature, and maximum outer clad surface temperature.

  17. An analysis of combined standard uncertainty for radiochemical measurements of environmental samples

    International Nuclear Information System (INIS)

    Berne, A.

    1996-01-01

    It is anticipated that future data acquisitions intended for use in radiological risk assessments will require the incorporation of uncertainty analysis. Often, only one aliquot of the sample is taken and a single determination is made. Under these circumstances, the total uncertainty is calculated using the 'propagation of errors' approach. However, there is no agreement in the radioanalytical community as to the exact equations to use. The Quality Assurance/Metrology Division of the Environmental Measurements Laboratory has developed a systematic process to compute the uncertainties in the constituent components of the analytical procedure, as well as the combined standard uncertainty (CSU). The equations for the computation are presented here, with examples of their use. They have also been incorporated into a code for use in the spreadsheet application QuattroPro. Using the spreadsheet with appropriate inputs permits an analysis of the variations in the CSU as a function of several different variables. The relative importance of the 'counting uncertainty' can also be ascertained
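
    A hedged sketch of a propagation-of-errors combination for a single radiochemical determination (the measurement equation and all component uncertainties are invented; the cited report derives the exact equations): for a result formed as a product/quotient of independent factors, relative standard uncertainties add in quadrature.

```python
import numpy as np

# Activity of the form A = net counts / (efficiency * chemical yield * mass);
# an illustrative equation only, with invented values and uncertainties.
components = {
    # name: (value, relative standard uncertainty)
    "net_counts":     (1200.0, np.sqrt(1200.0) / 1200.0),  # counting (Poisson)
    "efficiency":     (0.32, 0.03),
    "chemical_yield": (0.85, 0.04),
    "sample_mass_g":  (10.0, 0.001),
}

rel_u = {name: u for name, (_, u) in components.items()}
csu_rel = np.sqrt(sum(u**2 for u in rel_u.values()))  # combined standard uncertainty

print(f"combined standard uncertainty: {csu_rel:.1%} of the result")
for name, u in rel_u.items():
    print(f"  {name}: contributes {(u**2 / csu_rel**2):.0%} of the variance")
```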

  18. Development and application of objective uncertainty measures for nuclear power plant transient analysis [Dissertation 3897]

    Energy Technology Data Exchange (ETDEWEB)

    Vinai, P

    2007-10-15

    For the development, design and licensing of a nuclear power plant (NPP), a sound safety analysis is necessary to study the diverse physical phenomena involved in the system behaviour under operational and transient conditions. Such studies are based on detailed computer simulations. With the progress achieved in computer technology and the greater availability of experimental and plant data, the use of best estimate codes for safety evaluations has gained increasing acceptance. The application of best estimate safety analysis has raised new problems that need to be addressed: it has become more crucial to assess how reliable code predictions are, especially when they need to be compared against safety limits that must not be crossed. It becomes necessary to identify and quantify the various possible sources of uncertainty that affect the reliability of the results. Currently, such uncertainty evaluations are generally based on expert opinion. In the present research, a novel methodology based on a non-parametric statistical approach has been developed for the objective quantification of best-estimate code uncertainties related to the physical models used in the code. The basis is an evaluation of the accuracy of a given physical model, achieved by comparing its predictions with experimental data from an appropriate set of separate-effect tests. The differences between measurements and predictions can be considered stochastically distributed, and thus a statistical approach can be employed. The first step was the development of a procedure for investigating the dependence of a given physical model's accuracy on the experimental conditions. Each separate-effect test effectively provides a random sample of discrepancies between measurements and predictions, corresponding to a location in the state space defined by a certain number of independent system variables. As a consequence, the samples of 'errors', achieved from analysis of the entire

  19. CREOLE experiment study on the reactivity temperature coefficient with sensitivity and uncertainty analysis using the MCNP5 code and different neutron cross section evaluations

    International Nuclear Information System (INIS)

    Boulaich, Y.; El Bardouni, T.; Erradi, L.; Chakir, E.; Boukhal, H.; Nacir, B.; El Younoussi, C.; El Bakkari, B.; Merroun, O.; Zoubair, M.

    2011-01-01

    Highlights: → In the present work, we have analyzed the CREOLE experiment on the reactivity temperature coefficient (RTC) by using the three-dimensional continuous-energy code MCNP5 and the latest updated nuclear data evaluations. → Calculation-experiment discrepancies of the RTC were analyzed and the results have shown that the JENDL3.3 and JEFF3.1 evaluations give the most consistent values. → In order to identify the source of the relatively large discrepancy in the case of the ENDF/B-VII nuclear data evaluation, the keff discrepancy between ENDF/B-VII and JENDL3.3 was decomposed using a sensitivity and uncertainty analysis technique. - Abstract: In the present work, we analyze the CREOLE experiment on the reactivity temperature coefficient (RTC) by using the three-dimensional continuous-energy code MCNP5 and the latest updated nuclear data evaluations. This experiment, performed in the EOLE critical facility located at CEA/Cadarache, was mainly dedicated to RTC studies for both UO2 and UO2-PuO2 PWR-type lattices covering the whole temperature range from 20°C to 300°C. We have developed an accurate 3D model of the EOLE reactor using the MCNP5 Monte Carlo code, which guarantees a high level of fidelity in the description of the different configurations at various temperatures, taking into account their consequences for the neutron cross-section data and all thermal expansion effects. In this case, the remaining error between calculation and experiment can be attributed mainly to uncertainties in the nuclear data. Our own cross-section library was constructed using the NJOY99.259 code with point-wise nuclear data based on the ENDF/B-VII, JEFF3.1 and JENDL3.3 evaluation files. The MCNP model was validated through axial and radial fission rate measurements at room and hot temperatures. Calculation-experiment discrepancies of the RTC were analyzed and the results have shown that the JENDL3.3 and JEFF3.1 evaluations give the most consistent values; the discrepancy is

  20. Analysis of the impact of correlated benchmark experiments on the validation of codes for criticality safety analysis

    International Nuclear Information System (INIS)

    Bock, M.; Stuke, M.; Behler, M.

    2013-01-01

    The validation of a code for criticality safety analysis requires the recalculation of benchmark experiments. The selected benchmark experiments are chosen such that they have properties similar to the application case that has to be assessed. A common source of benchmark experiments is the 'International Handbook of Evaluated Criticality Safety Benchmark Experiments' (ICSBEP Handbook) compiled by the 'International Criticality Safety Benchmark Evaluation Project' (ICSBEP). In order to take full advantage of the information provided by the individual benchmark descriptions for the application case, the recommended procedure is to perform an uncertainty analysis. The latter is based on the uncertainties of experimental results included in most of the benchmark descriptions, and can be performed by means of the Monte Carlo sampling technique. The consideration of uncertainties is also being introduced in the supplementary sheet of DIN 25478 'Application of computer codes in the assessment of criticality safety'. However, for a correct treatment of uncertainties, taking into account only the individual uncertainties of the benchmark experiments is insufficient. In addition, correlations between benchmark experiments have to be handled correctly. Such correlations can arise, for example, when different cases of a benchmark experiment share the same components, such as fuel pins or fissile solutions. Thus, manufacturing tolerances of these components (e.g. the diameter of the fuel pellets) have to be considered in a consistent manner in all cases of the benchmark experiment. At the 2012 meeting of the Expert Group on 'Uncertainty Analysis for Criticality Safety Assessment' (UACSA) of the OECD/NEA, a benchmark proposal was outlined that aimed at determining the impact of benchmark correlations on the estimation of the computational bias of the neutron multiplication factor (k_eff). The analysis presented here is based on this proposal. (orig.)
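
    A hedged sketch of how such correlations arise in Monte Carlo sampling: when two benchmark cases share one sampled manufacturing tolerance, their computed k_eff values become correlated. The linear surrogates below merely stand in for full criticality calculations, and all coefficients are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Shared manufacturing tolerance: both benchmark cases use fuel pins from the
# same batch, so one pellet-diameter perturbation applies to both cases.
d_shared = rng.normal(0.0, 1.0, n)   # shared pellet-diameter deviation (sigma units)
m1 = rng.normal(0.0, 1.0, n)         # case-specific deviations (e.g. moderation)
m2 = rng.normal(0.0, 1.0, n)

# Cheap linear surrogates standing in for full criticality calculations.
k1 = 1.000 + 150e-5 * d_shared + 100e-5 * m1   # case 1 k_eff
k2 = 0.998 + 150e-5 * d_shared + 100e-5 * m2   # case 2 k_eff

corr = np.corrcoef(k1, k2)[0, 1]
print(f"estimated k_eff correlation between cases: {corr:.2f}")
# Expected analytically: 150^2/(150^2+100^2) ~ 0.69; ignoring this correlation
# would bias the computational-bias estimate obtained from these benchmarks.
```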

  1. Sensitivity and uncertainty studies of the CRAC2 computer code

    International Nuclear Information System (INIS)

    Kocher, D.C.; Ward, R.C.; Killough, G.G.; Dunning, D.E. Jr.; Hicks, B.B.; Hosker, R.P. Jr.; Ku, J.Y.; Rao, K.S.

    1987-01-01

    The authors have studied the sensitivity of health impacts from nuclear reactor accidents, as predicted by the CRAC2 computer code, to the following sources of uncertainty: (1) the model for plume rise, (2) the model for wet deposition, (3) the meteorological bin-sampling procedure for selecting weather sequences with rain, (4) the dose conversion factors for inhalation as affected by uncertainties in the particle size of the carrier aerosol and the clearance rates of radionuclides from the respiratory tract, (5) the weathering half-time for external ground-surface exposure, and (6) the transfer coefficients for terrestrial foodchain pathways. Predicted health impacts usually showed little sensitivity to use of an alternative plume-rise model or a modified rain-bin structure in bin-sampling. Health impacts often were quite sensitive to use of an alternative wet-deposition model in single-trial runs with rain during plume passage, but were less sensitive to the model in bin-sampling runs. Uncertainties in the inhalation dose conversion factors had important effects on early injuries in single-trial runs. Latent cancer fatalities were moderately sensitive to uncertainties in the weathering half-time for ground-surface exposures, but showed little sensitivity to the transfer coefficients for terrestrial foodchain pathways. Sensitivities of CRAC2 predictions to uncertainties in the models and parameters also depended on the magnitude of the source term, and some of the effects on early health effects were comparable to those due solely to the selection of different sets of weather sequences in bin-sampling.

  2. Development and application of best-estimate LWR safety analysis codes

    International Nuclear Information System (INIS)

    Reocreux, M.

    1997-01-01

    This paper reviews the status and future orientations of the development and application of best estimate LWR safety analysis codes. The present status of these codes shows considerable success and nearly complete fulfillment of the objectives assigned in the 1970s. The applications of best estimate codes are numerous and cover a large variety of safety questions. However, these applications have raised a number of problems. The first concerns the need for better control of the quality of the results, which implies requirements on code assessment and on uncertainty evaluation. The second concerns needs for code development, specifically regarding physical models, numerics, coupling with other codes, and programming. The analysis of the orientations for code development and application in the coming years shows that some developments should be made without delay in order to answer today's questions, whereas others are longer term and should be tested, for example in pilot programmes, before eventually being applied in main code development. Each of these development programmes is analyzed in the paper by detailing its main content and possible interest. (author)

  3. Sensitivity and Uncertainty Analysis for coolant void reactivity in a CANDU Fuel Lattice Cell Model

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Seung Yeol; Shim, Hyung Jin [Seoul National University, Seoul (Korea, Republic of)]

    2016-10-15

    In this study, the EPBM is implemented in the Seoul National University Monte Carlo (MC) code McCARD, which has a k-eigenvalue uncertainty evaluation capability based on the adjoint-weighted perturbation (AWP) method. The implementation is verified by comparing the sensitivities of the k-eigenvalue difference to the microscopic cross sections computed by the EPBM against those obtained by direct subtraction for the TMI-1 pin-cell problem. The uncertainty of the coolant void reactivity (CVR) in a CANDU fuel lattice model due to the ENDF/B-VII.1 covariance data is then calculated from the sensitivities estimated by the EPBM. The method based on the eigenvalue perturbation theory (EPBM) utilizes the first-order adjoint-weighted perturbation (AWP) technique to estimate the sensitivity of the eigenvalue difference. Furthermore, this method can be easily applied in an S/U analysis code system equipped with an eigenvalue sensitivity calculation capability. The EPBM implemented in McCARD is verified by its good agreement with the reference solution, and the McCARD S/U analysis has been performed with the EPBM module for the CVR in the CANDU fuel lattice problem. It shows that the uncertainty contributions of nu of 235U and the (n,gamma) reaction of 238U are dominant.

  4. Acquired experience on organizing 3D S.UN.COP: international course to support nuclear license by user training in the areas of scaling, uncertainty, and 3D thermal-hydraulics/neutron-kinetics coupled codes

    Energy Technology Data Exchange (ETDEWEB)

    Petruzzi, Alessandro; D'Auria, Francesco [University of Pisa, San Piero a Grado (Italy). Nuclear Research Group San Piero a Grado (GRNSPG)]; Galetti, Regina, E-mail: regina@cnen.gov.b [National Commission for Nuclear Energy (CNEN), Rio de Janeiro, RJ (Brazil)]; Bajs, Tomislav [University of Zagreb (Croatia). Fac. of Electrical Engineering and Computing. Dept. of Power Systems]; Reventos, Francesc [Technical University of Catalonia, Barcelona (Spain). Dept. of Physics and Nuclear Engineering]

    2011-07-01

    Thermal-hydraulic system computer codes are extensively used worldwide for analysis of nuclear facilities by utilities, regulatory bodies, nuclear power plant designers, vendors, and research organizations. The computer code user represents a source of uncertainty that may significantly affect the results of system code calculations. Code user training and qualification represent an effective means for reducing the variation of results caused by the application of the codes by different users. This paper describes the experience in applying a systematic approach to training code users who, upon completion of the training, should be able to perform calculations making the best possible use of the capabilities of best estimate codes. In addition, this paper presents the organization and the main features of the 3D S.UN.COP (scaling, uncertainty, and 3D coupled code calculations) seminars, during which particular emphasis is given to practical applications in connection with the licensing process of best estimate plus uncertainty methodologies, showing the designer, utility and regulatory approaches. (author)

  5. Acquired experience on organizing 3D S.UN.COP: international course to support nuclear license by user training in the areas of scaling, uncertainty, and 3D thermal-hydraulics/neutron-kinetics coupled codes

    International Nuclear Information System (INIS)

    Petruzzi, Alessandro; D'Auria, Francesco; Galetti, Regina; Bajs, Tomislav; Reventos, Francesc

    2011-01-01

    Thermal-hydraulic system computer codes are extensively used worldwide for analysis of nuclear facilities by utilities, regulatory bodies, nuclear power plant designers, vendors, and research organizations. The computer code user represents a source of uncertainty that may significantly affect the results of system code calculations. Code user training and qualification represent an effective means for reducing the variation of results caused by the application of the codes by different users. This paper describes the experience in applying a systematic approach to training code users who, upon completion of the training, should be able to perform calculations making the best possible use of the capabilities of best estimate codes. In addition, this paper presents the organization and the main features of the 3D S.UN.COP (scaling, uncertainty, and 3D coupled code calculations) seminars, during which particular emphasis is given to practical applications in connection with the licensing process of best estimate plus uncertainty methodologies, showing the designer, utility and regulatory approaches. (author)

  6. Reliability analysis under epistemic uncertainty

    International Nuclear Information System (INIS)

    Nannapaneni, Saideep; Mahadevan, Sankaran

    2016-01-01

    This paper proposes a probabilistic framework to include both aleatory and epistemic uncertainty within model-based reliability estimation of engineering systems for individual limit states. Epistemic uncertainty is considered due to both data and model sources. Sparse point and/or interval data regarding the input random variables leads to uncertainty regarding their distribution types, distribution parameters, and correlations; this statistical uncertainty is included in the reliability analysis through a combination of likelihood-based representation, Bayesian hypothesis testing, and Bayesian model averaging techniques. Model errors, which include numerical solution errors and model form errors, are quantified through Gaussian process models and included in the reliability analysis. The probability integral transform is used to develop an auxiliary variable approach that facilitates a single-level representation of both aleatory and epistemic uncertainty. This strategy results in an efficient single-loop implementation of Monte Carlo simulation (MCS) and FORM/SORM techniques for reliability estimation under both aleatory and epistemic uncertainty. Two engineering examples are used to demonstrate the proposed methodology. - Highlights: • Epistemic uncertainty due to data and model included in reliability analysis. • A novel FORM-based approach proposed to include aleatory and epistemic uncertainty. • A single-loop Monte Carlo approach proposed to include both types of uncertainties. • Two engineering examples used for illustration.
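
    A minimal single-loop sketch of the auxiliary-variable idea, assuming scipy is available and using invented numbers: the epistemic uncertainty (an imprecisely known distribution parameter) and the aleatory variability are sampled in the same Monte Carlo loop, with a uniform auxiliary variable mapped through the conditional CDF via the probability integral transform.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
n = 100_000

# Epistemic layer: the mean load is only known from sparse data, so it is
# itself given a (hypothetical) distribution rather than a fixed value.
mu_load = rng.normal(10.0, 0.5, n)           # epistemic: distribution parameter
# Aleatory layer via the probability integral transform: a single uniform
# auxiliary variable is mapped through the CDF with the sampled parameter.
u = rng.uniform(size=n)
load = norm.ppf(u, loc=mu_load, scale=1.5)   # aleatory variability of the load

capacity = 15.0                              # deterministic limit-state threshold
pf = np.mean(load > capacity)
print(f"failure probability (both uncertainty types mixed): {pf:.2e}")
```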

  7. HTGR reactor physics, thermal-hydraulics and depletion uncertainty analysis: a proposed IAEA coordinated research project

    International Nuclear Information System (INIS)

    Tyobeka, Bismark; Reitsma, Frederik; Ivanov, Kostadin

    2011-01-01

    The continued development of High Temperature Gas Cooled Reactors (HTGRs) requires verification of HTGR design and safety features with reliable high-fidelity physics models and robust, efficient, and accurate codes. The predictive capability of coupled neutronics/thermal-hydraulics and depletion simulations for reactor design and safety analysis can be assessed with sensitivity analysis and uncertainty analysis methods. In order to benefit from recent advances in modeling and simulation and the availability of new covariance data (nuclear data uncertainties), extensive sensitivity and uncertainty studies are needed to quantify the impact of different sources of uncertainty on the design and safety parameters of HTGRs. Uncertainty and sensitivity studies are an essential component of any significant effort in data and simulation improvement. In February 2009, the Technical Working Group on Gas-Cooled Reactors recommended that the proposed IAEA Coordinated Research Project (CRP) on HTGR Uncertainty Analysis in Modeling be implemented. The current status and plan are presented in this paper. The CRP will also benefit from interactions with the ongoing OECD/NEA Light Water Reactor (LWR) UAM benchmark activity by taking into consideration the peculiarities of HTGR designs and simulation requirements. (author)

  8. Probabilistic accident consequence uncertainty analysis: Food chain uncertainty assessment. Volume 2: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Brown, J. [National Radiological Protection Board (United Kingdom)]; Goossens, L.H.J.; Kraan, B.C.P. [Delft Univ. of Technology (Netherlands)] [and others]

    1997-06-01

    This volume is the second of a two-volume document that summarizes a joint project by the US Nuclear Regulatory Commission and the Commission of European Communities to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This two-volume report, which examines mechanisms and uncertainties of transfer through the food chain, is the first in a series of five such reports. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain transfer that affect calculations of offsite radiological consequences. Seven of the experts reported on transfer into the food chain through soil and plants, nine reported on transfer via food products from animals, and two reported on both. The expert judgment elicitation procedure and its outcomes are described in these volumes. This volume contains seven appendices. Appendix A presents a brief discussion of the MACCS and COSYMA model codes. Appendix B is the structure document and elicitation questionnaire for the expert panel on soils and plants. Appendix C presents the rationales and responses of each of the members of the soils and plants expert panel. Appendix D is the structure document and elicitation questionnaire for the expert panel on animal transfer. The rationales and responses of each of the experts on animal transfer are given in Appendix E. Brief biographies of the food chain expert panel members are provided in Appendix F. Aggregated results of expert responses are presented in graph format in Appendix G.

  9. Probabilistic accident consequence uncertainty analysis: Food chain uncertainty assessment. Volume 2: Appendices

    International Nuclear Information System (INIS)

    Brown, J.; Goossens, L.H.J.; Kraan, B.C.P.

    1997-06-01

    This volume is the second of a two-volume document that summarizes a joint project by the US Nuclear Regulatory Commission and the Commission of European Communities to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This two-volume report, which examines mechanisms and uncertainties of transfer through the food chain, is the first in a series of five such reports. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain transfer that affect calculations of offsite radiological consequences. Seven of the experts reported on transfer into the food chain through soil and plants, nine reported on transfer via food products from animals, and two reported on both. The expert judgment elicitation procedure and its outcomes are described in these volumes. This volume contains seven appendices. Appendix A presents a brief discussion of the MACCS and COSYMA model codes. Appendix B is the structure document and elicitation questionnaire for the expert panel on soils and plants. Appendix C presents the rationales and responses of each of the members of the soils and plants expert panel. Appendix D is the structure document and elicitation questionnaire for the expert panel on animal transfer. The rationales and responses of each of the experts on animal transfer are given in Appendix E. Brief biographies of the food chain expert panel members are provided in Appendix F. Aggregated results of expert responses are presented in graph format in Appendix G.

  10. Two-dimensional cross-section and SED uncertainty analysis for the Fusion Engineering Device (FED)

    International Nuclear Information System (INIS)

    Embrechts, M.J.; Urban, W.T.; Dudziak, D.J.

    1982-01-01

    The theory of two-dimensional cross-section and secondary-energy-distribution (SED) sensitivity was implemented by developing a two-dimensional sensitivity and uncertainty analysis code, SENSIT-2D. Analyses of the Fusion Engineering Device (FED) conceptual inboard shield indicate that, although the calculated uncertainties in the 2-D model are of the same order of magnitude as those resulting from the 1-D model, there can be severe differences. The more complex the geometry, the more necessary a 2-D analysis becomes. Specific results show that the uncertainty in the integral heating of the toroidal field (TF) coil of the FED is 114.6%. The main contributors to the cross-section uncertainty are chromium and iron. Contributions to the total uncertainty were smaller for nickel, copper, hydrogen and carbon. All analyses were performed with the Los Alamos 42-group cross-section library generated from ENDF/B-V data and the COVFILS covariance matrix library. The large uncertainties due to chromium result mainly from large covariances for the chromium total and elastic scattering cross sections.
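
    Uncertainties such as the 114.6% quoted above are typically obtained by folding sensitivity profiles with a cross-section covariance matrix (the "sandwich rule"): the relative variance of a response R is S C S^T, where S holds relative sensitivities and C the relative covariances. A small sketch with placeholder numbers:

```python
import numpy as np

# Sandwich rule: rel_var(R) = S C S^T, with S the vector of relative
# sensitivities (dR/R)/(dx/x) and C the relative covariance matrix of the
# cross sections. The numbers below are illustrative only.
S = np.array([0.8, -0.3, 0.5])        # relative sensitivities per nuclide/group
C = np.array([[0.09, 0.02, 0.00],     # relative covariance matrix
              [0.02, 0.04, 0.01],
              [0.00, 0.01, 0.16]])

rel_var = S @ C @ S
rel_std = np.sqrt(rel_var)
print(f"relative standard deviation of the response: {rel_std:.1%}")
```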

  11. Two-dimensional cross-section sensitivity and uncertainty analysis of the LBM experience at LOTUS

    International Nuclear Information System (INIS)

    Davidson, J.W.; Dudziak, D.J.; Pelloni, S.; Stepanek, J.

    1989-01-01

    In recent years, the LOTUS fusion blanket facility at IGA-EPF in Lausanne provided a series of irradiation experiments with the Lithium Blanket Module (LBM). The LBM has both realistic fusion blanket materials and a realistic configuration. It is approximately an 80-cm cube, and the breeding material is Li2O. Using the Haefely Neutron Generator (HNG), with an intensity of about 5·10^12 n/s, as the D-T neutron source, a series of experiments with the bare LBM as well as with the LBM preceded by Pb, Be and ThO2 multipliers was carried out. In a recent joint Los Alamos/PSI effort, a sensitivity and nuclear data uncertainty path for the modular code system AARE (Advanced Analysis for Reactor Engineering) was developed. This path includes the cross-section code TRAMIX, the one-dimensional finite-difference S_n transport code ONEDANT, the two-dimensional finite-element S_n transport code TRISM, and the one- and two-dimensional sensitivity and nuclear data uncertainty code SENSIBL. For the nucleonic transport calculations, three 187-neutron-group libraries are presently available: MATXS8A and MATXS8F, based on ENDF/B-V evaluations, and MAT187, based on JEF/EFF evaluations. COVFILS-2, a 74-group library of neutron cross sections, scattering matrices and covariances, is the data source for SENSIBL; the 74-group structure of COVFILS-2 is a subset of the Los Alamos 187-group structure. Within the framework of the present work, a complete set of forward and adjoint two-dimensional TRISM calculations was performed for the bare as well as for the Pb- and Be-preceded LBM using the MATXS8 libraries. A two-dimensional sensitivity and uncertainty analysis was then performed for all cases.

  12. Deterministic uncertainty analysis

    International Nuclear Information System (INIS)

    Worley, B.A.

    1987-01-01

    Uncertainties of computer results are of primary interest in applications such as high-level waste (HLW) repository performance assessment in which experimental validation is not possible or practical. This work presents an alternate deterministic approach for calculating uncertainties that has the potential to significantly reduce the number of computer runs required for conventional statistical analysis. 7 refs., 1 fig

  13. Sensitivity and uncertainty analysis

    CERN Document Server

    Cacuci, Dan G; Navon, Ionel Michael

    2005-01-01

    As computer-assisted modeling and analysis of physical processes have continued to grow and diversify, sensitivity and uncertainty analyses have become indispensable scientific tools. Sensitivity and Uncertainty Analysis. Volume I: Theory focused on the mathematical underpinnings of two important methods for such analyses: the Adjoint Sensitivity Analysis Procedure and the Global Adjoint Sensitivity Analysis Procedure. This volume concentrates on the practical aspects of performing these analyses for large-scale systems. The applications addressed include two-phase flow problems, a radiative c

  14. International training program: 3D S.UN.COP - Scaling, uncertainty and 3D thermal-hydraulics/neutron-kinetics coupled codes seminar

    International Nuclear Information System (INIS)

    Petruzzi, A.; D'Auria, F.; Bajs, T.; Reventos, F.

    2006-01-01

    Thermal-hydraulic system computer codes are extensively used worldwide for analysis of nuclear facilities by utilities, regulatory bodies, nuclear power plant designers and vendors, nuclear fuel companies, research organizations, consulting companies, and technical support organizations. The computer code user represents a source of uncertainty that can influence the results of system code calculations. This influence is commonly known as the 'user effect' and stems from the limitations embedded in the codes as well as from the limited capability of the analysts to use them. Code user training and qualification is an effective means for reducing the variation of results caused by the application of the codes by different users. This paper describes a systematic approach to training code users who, upon completion of the training, should be able to perform calculations making the best possible use of the capabilities of best estimate codes. In other words, the program aims at contributing towards solving the problem of the user effect. The 3D S.UN.COP 2005 (Scaling, Uncertainty and 3D COuPled code calculations) seminar was organized by the University of Pisa and the University of Zagreb as a follow-up to the proposal to the IAEA for a Permanent Training Course for System Code Users (D'Auria, 1998). It was recognized that such a course represents both a source of continuing education for current code users and a means for them to enter the formal training structure of a proposed 'permanent' stepwise approach to user training. The seminar was successfully held with the participation of 19 persons coming from 9 countries and 14 different institutions (universities, vendors, national laboratories and regulatory bodies). More than 15 scientists were involved in the organization of the seminar, presenting theoretical aspects of the proposed methodologies and holding the training and the final examination. A certificate (LA Code User grade) was released.

  15. Summary of existing uncertainty methods

    International Nuclear Information System (INIS)

    Glaeser, Horst

    2013-01-01

    A summary of the existing and most used uncertainty methods is presented, and their main features are compared. One of these methods is the order statistics method based on Wilks' formula. It is applied in safety research as well as in licensing. This method was first proposed by GRS for use in deterministic safety analysis and is now used by many organisations world-wide. Its advantage is that the number of potentially uncertain input and output parameters is not limited to a small number. Such a limitation was necessary for the first demonstration of the Code Scaling, Applicability and Uncertainty (CSAU) method by the United States Nuclear Regulatory Commission (USNRC). They did not apply Wilks' formula in their statistical method propagating input uncertainties to obtain the uncertainty of a single output variable, like peak cladding temperature. A Phenomena Identification and Ranking Table (PIRT) was set up in order to limit the number of uncertain input parameters and, consequently, the number of calculations to be performed. Another purpose of such a PIRT process is to identify the most important physical phenomena that a computer code should be able to calculate; the validation of the code should be focused on the identified phenomena. Response surfaces are used in some applications, replacing the computer code for performing a high number of calculations. The second well-known uncertainty method is the Uncertainty Methodology Based on Accuracy Extrapolation (UMAE) and its follow-up, the 'Code with the Capability of Internal Assessment of Uncertainty' (CIAU), developed by the University of Pisa. Unlike the statistical approaches, the CIAU compares experimental data with calculation results and does not consider uncertain input parameters; it is therefore highly dependent on the experimental database. The accuracy obtained from the comparison between experimental data and calculated results is extrapolated to obtain the uncertainty of the system code predictions
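
    Wilks' formula fixes the number of code runs needed so that an extreme (or m-th extreme) output value is a one-sided tolerance limit with given coverage and confidence. A small sketch computing the required sample size; the classic 95%/95% first-order answer is 59 runs:

```python
from math import comb

def wilks_n(beta=0.95, gamma=0.95, order=1, n_max=10_000):
    """Smallest sample size N such that the order-th largest output value is a
    one-sided upper tolerance limit with coverage beta at confidence gamma."""
    for n in range(order, n_max + 1):
        # Confidence = P(number of observations below the beta-quantile <= n-order)
        conf = sum(comb(n, k) * beta**k * (1 - beta)**(n - k)
                   for k in range(n - order + 1))
        if conf >= gamma:
            return n
    raise ValueError("n_max too small")

print(wilks_n())            # 59 runs: classic 95/95 first-order result
print(wilks_n(order=2))     # 93 runs when the second-largest value is used
```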

  16. Deterministic uncertainty analysis

    International Nuclear Information System (INIS)

    Worley, B.A.

    1987-12-01

    This paper presents a deterministic uncertainty analysis (DUA) method for calculating uncertainties that has the potential to significantly reduce the number of computer runs compared to conventional statistical analysis. The method is based upon the availability of derivative and sensitivity data such as that calculated using the well-known direct or adjoint sensitivity analysis techniques. Formation of response surfaces using derivative data and the propagation of input probability distributions are discussed relative to their role in the DUA method. A sample problem that models the flow of water through a borehole is used as a basis to compare the cumulative distribution function of the flow rate as calculated by the standard statistical methods and the DUA method. Propagation of uncertainties by the DUA method is compared for ten cases in which the number of reference model runs was varied from one to ten. The DUA method gives a more accurate representation of the true cumulative distribution of the flow rate based upon as few as two model executions, compared to fifty model executions using a statistical approach. 16 refs., 4 figs., 5 tabs

  17. BEPU methods and combining of uncertainties

    International Nuclear Information System (INIS)

    Prosek, A.; Mavko, B.

    2004-01-01

    Since the approval of the revised rule on the acceptance of emergency core cooling system (ECCS) performance in 1988, there has been significant interest in the development of codes and methodologies for best-estimate loss-of-coolant accident (LOCA) analyses. The Code Scaling, Applicability and Uncertainty (CSAU) evaluation method was developed and demonstrated for a large-break (LB) LOCA in a pressurized water reactor. Several new best estimate plus uncertainty (BEPU) methods were later developed around the world. The purpose of the paper is to identify and compare the statistical approaches of BEPU methods and to present their important plant and licensing applications. The study showed that uncertainty analysis with random sampling of input parameters and the use of order statistics for desired tolerance limits of output parameters is today a commonly accepted approach. The existing BEPU methods seem mature enough, while future research may focus on codes with internal assessment of uncertainty. (author)

  18. Uncertainty analysis of multiple canister repository model by large-scale calculation

    International Nuclear Information System (INIS)

    Tsujimoto, K.; Okuda, H.; Ahn, J.

    2007-01-01

    A prototype uncertainty analysis has been performed using the multiple-canister radionuclide transport code VR for the performance assessment of a high-level radioactive waste repository. Fractures in the host rock determine the main conduits of groundwater and thus significantly affect the magnitude of radionuclide release rates from the repository. In this study, the probability distribution function (PDF) for the number of connected canisters in the same water-bearing fracture cluster has been determined in a Monte Carlo fashion by running the FFDF code with assumed PDFs for the fracture geometry. The uncertainty in the release rate of 237Np from a hypothetical repository containing 100 canisters has been quantitatively evaluated by using the VR code with PDFs for the number of connected canisters and the near-field rock porosity. The calculation results show that the mass transport is greatly affected by (1) the magnitude of the radionuclide source determined by the number of canisters connected by the fracture cluster, and (2) the canister concentration effect in the same fracture network. The results also show two conflicting tendencies: the more fractures in the repository model space, the greater the average value but the smaller the uncertainty of the peak fractional release rate. To perform the vast amount of calculation, the Earth Simulator and SR8000 were utilized. A multi-level hybrid programming method is applied in the optimization to exploit the high performance of the Earth Simulator, and Latin hypercube sampling is used to reduce the number of samples in the Monte Carlo calculation. (authors)

  19. Sampling based uncertainty analysis of 10% hot leg break LOCA in large scale test facility

    International Nuclear Information System (INIS)

    Sengupta, Samiran; Kraina, V.; Dubey, S. K.; Rao, R. S.; Gupta, S. K.

    2010-01-01

    A sampling-based uncertainty analysis was carried out to quantify the uncertainty in predictions of the best estimate code RELAP5/MOD3.2 for a thermal-hydraulic test (10% hot leg break LOCA) performed in the Large Scale Test Facility (LSTF) as part of an IAEA coordinated research project. The nodalisation of the test facility was qualified at both the steady-state and transient level by systematically applying the procedures of the uncertainty methodology based on accuracy extrapolation (UMAE); the uncertainty analysis was carried out using the Latin hypercube sampling (LHS) method to evaluate the uncertainty of ten input parameters. Sixteen output parameters were selected for uncertainty evaluation, and the uncertainty band between the 5th and 95th percentiles of the output parameters was evaluated. It was observed that the uncertainty band for the primary pressure during two-phase blowdown is larger than that of the remaining period. Similarly, a larger uncertainty band is observed for the accumulator injection flow during the reflood phase. An importance analysis was also carried out, and standardized rank regression coefficients were computed to quantify the effect of each individual input parameter on the output parameters. It was observed that the break discharge coefficient is the most important uncertain parameter for the prediction of all the primary side parameters, and that the steam generator (SG) relief pressure setting is the most important parameter for predicting the SG secondary pressure.
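
    A compact sketch of the two ingredients named above, Latin hypercube sampling and standardized rank regression coefficients, assuming scipy >= 1.7 is available; the parameter names, ranges, and the linear response are invented stand-ins for the RELAP5/MOD3.2 model:

```python
import numpy as np
from scipy.stats import qmc, rankdata

rng = np.random.default_rng(2)
sampler = qmc.LatinHypercube(d=3, seed=2)
u = sampler.random(n=100)                 # stratified samples in [0,1)^3

# Map to hypothetical input ranges: break discharge coefficient, SG relief
# pressure setting [MPa], decay-heat multiplier (placeholders, not the LSTF set).
lo = np.array([0.6, 7.5, 0.9])
hi = np.array([1.1, 8.5, 1.1])
x = qmc.scale(u, lo, hi)

# Cheap stand-in for the code response (e.g. peak primary pressure).
y = 2.0 * x[:, 0] + 0.5 * x[:, 1] + 0.1 * x[:, 2] + rng.normal(0, 0.05, 100)

# Standardized rank regression coefficients: regress the ranked output on the
# ranked inputs; coefficient magnitudes rank the importance of each parameter.
xr = np.column_stack([rankdata(x[:, j]) for j in range(3)])
yr = rankdata(y)
xr = (xr - xr.mean(0)) / xr.std(0)
yr = (yr - yr.mean()) / yr.std()
srrc, *_ = np.linalg.lstsq(xr, yr, rcond=None)
print("SRRCs:", np.round(srrc, 2))
```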

  20. Uncertainties in calculations of nuclear design code system for the high temperature engineering test reactor (HTTR)

    International Nuclear Information System (INIS)

    Shindo, R.; Yamashita, K.; Murata, I.

    1991-01-01

    The nuclear design code system for the HTTR consists of a one-dimensional cell burnup code developed at JAERI and the TWOTRAN-2 transport code. In order to satisfy the related design criteria, the uncertainty of the calculation was investigated by comparing calculated and experimental results. The experiments were performed with a graphite-moderated critical assembly. It was confirmed that the discrepancies between calculations and experiments were small enough to be allowed for in the nuclear design of the HTTR. 8 refs, 6 figs

  1. PAPIRUS, a parallel computing framework for sensitivity analysis, uncertainty propagation, and estimation of parameter distribution

    International Nuclear Information System (INIS)

    Heo, Jaeseok; Kim, Kyung Doo

    2015-01-01

    Highlights: • We developed an interface between an engineering simulation code and statistical analysis software. • Multiple packages of sensitivity analysis, uncertainty quantification, and parameter estimation algorithms are implemented in the framework. • Parallel computing algorithms are also implemented in the framework to solve multiple computational problems simultaneously. - Abstract: This paper introduces a statistical data analysis toolkit, PAPIRUS, designed to perform model calibration, uncertainty propagation, Chi-square linearity testing, and sensitivity analysis for both linear and nonlinear problems. PAPIRUS was developed by implementing multiple packages of methodologies and building an interface between an engineering simulation code and the statistical analysis algorithms. A parallel computing framework is implemented in PAPIRUS with multiple computing resources and proper communication between the server and the clients of each processor. It was shown that even though a large amount of data is considered for the engineering calculation, the distributions of the model parameters and the calculation results can be quantified accurately with significant reductions in computational effort. A general description of PAPIRUS with its graphical user interface is presented in Section 2. Sections 2.1–2.5 present the methodologies of data assimilation, uncertainty propagation, Chi-square linearity testing, and sensitivity analysis implemented in the toolkit, with some results obtained by each module of the software. Parallel computing algorithms adopted in the framework to solve multiple computational problems simultaneously are also summarized in the paper

  2. PAPIRUS, a parallel computing framework for sensitivity analysis, uncertainty propagation, and estimation of parameter distribution

    Energy Technology Data Exchange (ETDEWEB)

    Heo, Jaeseok, E-mail: jheo@kaeri.re.kr; Kim, Kyung Doo, E-mail: kdkim@kaeri.re.kr

    2015-10-15

    Highlights: • We developed an interface between an engineering simulation code and statistical analysis software. • Multiple packages of sensitivity analysis, uncertainty quantification, and parameter estimation algorithms are implemented in the framework. • Parallel computing algorithms are also implemented in the framework to solve multiple computational problems simultaneously. - Abstract: This paper introduces a statistical data analysis toolkit, PAPIRUS, designed to perform model calibration, uncertainty propagation, Chi-square linearity testing, and sensitivity analysis for both linear and nonlinear problems. PAPIRUS was developed by implementing multiple packages of methodologies and building an interface between an engineering simulation code and the statistical analysis algorithms. A parallel computing framework is implemented in PAPIRUS with multiple computing resources and proper communication between the server and the clients of each processor. It was shown that even though a large amount of data is considered for the engineering calculation, the distributions of the model parameters and the calculation results can be quantified accurately with significant reductions in computational effort. A general description of PAPIRUS with its graphical user interface is presented in Section 2. Sections 2.1–2.5 present the methodologies of data assimilation, uncertainty propagation, Chi-square linearity testing, and sensitivity analysis implemented in the toolkit, with some results obtained by each module of the software. Parallel computing algorithms adopted in the framework to solve multiple computational problems simultaneously are also summarized in the paper.

  3. Review of best estimate plus uncertainty methods of thermal-hydraulic safety analysis

    International Nuclear Information System (INIS)

    Prosek, A.; Mavko, B.

    2003-01-01

    In 1988 the United States Nuclear Regulatory Commission approved the revised rule on the acceptance of emergency core cooling system (ECCS) performance. Since then, there has been significant interest in the development of codes and methodologies for best-estimate loss-of-coolant accident (LOCA) analyses. Several new best estimate plus uncertainty (BEPU) methods were developed around the world. The purpose of the paper is to review the developments in the direction of best estimate approaches with uncertainty quantification and to discuss the problems in practical applications of BEPU methods. In general, the licensee methods follow the original methods. The study indicated that uncertainty analysis with random sampling of input parameters and the use of order statistics for desired tolerance limits of output parameters is today a commonly accepted and mature approach. (author)

  4. Sensitivity Analysis of FEAST-Metal Fuel Performance Code: Initial Results

    International Nuclear Information System (INIS)

    Edelmann, Paul Guy; Williams, Brian J.; Unal, Cetin; Yacout, Abdellatif

    2012-01-01

    This memo documents the completion of the LANL milestone, M3FT-12LA0202041, describing methodologies and initial results using FEAST-Metal. The FEAST-Metal code calculations for this work are being conducted at LANL in support of on-going activities related to sensitivity analysis of fuel performance codes. The objective is to identify important macroscopic parameters of interest to modeling and simulation of metallic fuel performance. This report summarizes our preliminary results for the sensitivity analysis using 6 calibration datasets for metallic fuel developed at ANL for EBR-II experiments. Sensitivity ranking methodology was deployed to narrow down the selected parameters for the current study. There are approximately 84 calibration parameters in the FEAST-Metal code, of which 32 were ultimately used in Phase II of this study. Preliminary results of this sensitivity analysis led to the following ranking of FEAST models for future calibration and improvements: fuel conductivity, fission gas transport/release, fuel creep, and precipitation kinetics. More validation data is needed to validate calibrated parameter distributions for future uncertainty quantification studies with FEAST-Metal. Results of this study also served to point out some code deficiencies and possible errors, and these are being investigated in order to determine root causes and to improve upon the existing code models.

  5. Development of a System Analysis Toolkit for Sensitivity Analysis, Uncertainty Propagation, and Estimation of Parameter Distribution

    International Nuclear Information System (INIS)

    Heo, Jaeseok; Kim, Kyung Doo

    2015-01-01

    Statistical approaches to uncertainty quantification and sensitivity analysis are very important in estimating the safety margins for an engineering design application. This paper presents a system analysis and optimization toolkit developed by Korea Atomic Energy Research Institute (KAERI), which includes multiple packages of the sensitivity analysis and uncertainty quantification algorithms. In order to reduce the computing demand, multiple compute resources including multiprocessor computers and a network of workstations are simultaneously used. A Graphical User Interface (GUI) was also developed within the parallel computing framework for users to readily employ the toolkit for an engineering design and optimization problem. The goal of this work is to develop a GUI framework for engineering design and scientific analysis problems by implementing multiple packages of system analysis methods in the parallel computing toolkit. This was done by building an interface between an engineering simulation code and the system analysis software packages. The methods and strategies in the framework were designed to exploit parallel computing resources such as those found in a desktop multiprocessor workstation or a network of workstations. Available approaches in the framework include statistical and mathematical algorithms for use in science and engineering design problems. Currently the toolkit has 6 modules of the system analysis methodologies: deterministic and probabilistic approaches of data assimilation, uncertainty propagation, Chi-square linearity test, sensitivity analysis, and FFTBM

  6. Development of a System Analysis Toolkit for Sensitivity Analysis, Uncertainty Propagation, and Estimation of Parameter Distribution

    Energy Technology Data Exchange (ETDEWEB)

    Heo, Jaeseok; Kim, Kyung Doo [KAERI, Daejeon (Korea, Republic of)]

    2015-05-15

    Statistical approaches to uncertainty quantification and sensitivity analysis are very important in estimating the safety margins for an engineering design application. This paper presents a system analysis and optimization toolkit developed by Korea Atomic Energy Research Institute (KAERI), which includes multiple packages of the sensitivity analysis and uncertainty quantification algorithms. In order to reduce the computing demand, multiple compute resources including multiprocessor computers and a network of workstations are simultaneously used. A Graphical User Interface (GUI) was also developed within the parallel computing framework for users to readily employ the toolkit for an engineering design and optimization problem. The goal of this work is to develop a GUI framework for engineering design and scientific analysis problems by implementing multiple packages of system analysis methods in the parallel computing toolkit. This was done by building an interface between an engineering simulation code and the system analysis software packages. The methods and strategies in the framework were designed to exploit parallel computing resources such as those found in a desktop multiprocessor workstation or a network of workstations. Available approaches in the framework include statistical and mathematical algorithms for use in science and engineering design problems. Currently the toolkit has 6 modules of the system analysis methodologies: deterministic and probabilistic approaches of data assimilation, uncertainty propagation, Chi-square linearity test, sensitivity analysis, and FFTBM.

  7. Deterministic sensitivity and uncertainty analysis for large-scale computer models

    International Nuclear Information System (INIS)

    Worley, B.A.; Pin, F.G.; Oblow, E.M.; Maerker, R.E.; Horwedel, J.E.; Wright, R.Q.

    1988-01-01

    The fields of sensitivity and uncertainty analysis have traditionally been dominated by statistical techniques when large-scale modeling codes are being analyzed. These methods are able to estimate sensitivities, generate response surfaces, and estimate response probability distributions given the input parameter probability distributions. Because the statistical methods are computationally costly, they are usually applied only to problems with relatively small parameter sets. Deterministic methods, on the other hand, are very efficient and can handle large data sets, but generally require simpler models because of the considerable programming effort required for their implementation. The first part of this paper reports on the development and availability of two systems, GRESS and ADGEN, that make use of computer calculus compilers to automate the implementation of a deterministic sensitivity analysis capability in existing computer models; this automation removes the traditional limitation of deterministic sensitivity methods. The second part of the paper describes a deterministic uncertainty analysis method (DUA) that uses derivative information as a basis to propagate parameter probability distributions and obtain result probability distributions. The methods are applicable to low-level radioactive waste disposal system performance assessment

  8. An Integrated Approach for Characterization of Uncertainty in Complex Best Estimate Safety Assessment

    International Nuclear Information System (INIS)

    Pourgol-Mohamad, Mohammad; Modarres, Mohammad; Mosleh, Ali

    2013-01-01

    This paper discusses an approach called the Integrated Methodology for Thermal-Hydraulics Uncertainty Analysis (IMTHUA) to characterize and integrate a wide range of uncertainties associated with the best estimate models and complex system codes used for nuclear power plant safety analyses. Examples of applications include complex thermal-hydraulic and fire analysis codes. In identifying and assessing uncertainties, the proposed methodology treats the complex code as a 'white box', thus explicitly treating internal sub-model uncertainties in addition to the uncertainties related to the inputs to the code. The methodology accounts for uncertainties related to the experimental data used to develop such sub-models, and efficiently propagates all uncertainties during best estimate calculations. Uncertainties are formally analyzed and probabilistically treated using a Bayesian inference framework. This comprehensive approach presents the results in a form usable in most other safety analyses, such as probabilistic safety assessment. The code output results are further updated through additional Bayesian inference using any available experimental data, for example from thermal-hydraulic integral test facilities. The approach includes provisions to account for uncertainties associated with user-specified options, for example choices among alternative sub-models or among several different correlations. Complex time-dependent best-estimate calculations are computationally intense, and the paper presents approaches to minimize the computational burden during uncertainty propagation. Finally, the paper reports on the effectiveness and practicality of the methodology through two applications, to a complex thermal-hydraulic system code and to a complex fire simulation code. In the case of multiple alternative models, several techniques, including dynamic model switching, user-controlled model selection, and model mixing, are discussed. (authors)
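
    The Bayesian updating step described above can be illustrated, in grossly simplified form, with a conjugate normal-normal sketch: the code's best-estimate prediction with its propagated uncertainty serves as the prior, and integral-test measurements update it. All numbers are invented.

```python
import numpy as np

# Conjugate normal-normal update: prior = code prediction with propagated
# uncertainty; likelihood = test measurements with known measurement error.
mu0, sigma0 = 620.0, 25.0                  # prior: predicted peak temperature [K]
data = np.array([598.0, 605.0, 611.0])     # hypothetical test measurements [K]
sigma_m = 15.0                             # measurement standard deviation [K]

n = len(data)
prec0, prec_m = 1 / sigma0**2, n / sigma_m**2
mu_post = (prec0 * mu0 + prec_m * data.mean()) / (prec0 + prec_m)
sigma_post = (prec0 + prec_m) ** -0.5
print(f"posterior: {mu_post:.1f} +/- {sigma_post:.1f} K")
```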

  9. Uncertainty analysis of environmental models

    International Nuclear Information System (INIS)

    Monte, L.

    1990-01-01

    In the present paper an evaluation of the output uncertainty of an environmental model for assessing the transfer of 137Cs and 131I in the human food chain is carried out on the basis of a statistical analysis of data reported in the literature. The uncertainty analysis offers the opportunity of obtaining some remarkable information about the uncertainty of models predicting the migration of non-radioactive substances in the environment, mainly in relation to dry and wet deposition

  10. SCDAP: a light water reactor computer code for severe core damage analysis

    International Nuclear Information System (INIS)

    Marino, G.P.; Allison, C.M.; Majumdar, D.

    1982-01-01

    Development of the first version (MOD0) of the Severe Core Damage Analysis Package (SCDAP) computer code is described, and calculations made with SCDAP/MOD0 are presented. The objective of this computer code development program is to develop a capability for analyzing severe disruption of a light water reactor core, including fuel and cladding liquefaction, flow, and freezing; fission product release; hydrogen generation; quench-induced fragmentation; coolability of the resulting geometry; and ultimately vessel failure due to vessel-melt interaction. SCDAP will be used to identify the phenomena that control core behavior during a severe accident, to help quantify uncertainties in risk assessment analysis, and to support planning and evaluation of severe fuel damage experiments and data. SCDAP/MOD0 addresses the behavior of a single fuel bundle. Future versions will be developed with capabilities for core-wide and vessel-melt interaction analysis.

  11. International Training Program: 3D S.UN.COP - Scaling, Uncertainty and 3D Thermal-Hydraulics/Neutron-Kinetics Coupled Codes Seminar

    International Nuclear Information System (INIS)

    Petruzzi, A.; D'Auria, F.; Bajs, T.; Reventos, F.

    2006-01-01

    Thermal-hydraulic system computer codes are extensively used worldwide for analysis of nuclear facilities by utilities, regulatory bodies, nuclear power plant designers and vendors, nuclear fuel companies, research organizations, consulting companies, and technical support organizations. The computer code user represents a source of uncertainty that can influence the results of system code calculations. This influence is commonly known as the 'user effect' and stems from the limitations embedded in the codes as well as from the limited capability of the analysts to use them. Code user training and qualification is an effective means for reducing the variation of results caused by the application of the codes by different users. This paper describes a systematic approach to training code users who, upon completion of the training, should be able to perform calculations making the best possible use of the capabilities of best estimate codes. In other words, the program aims at contributing towards solving the problem of the user effect. The 3D S.UN.COP (Scaling, Uncertainty and 3D COuPled code calculations) seminars have been organized as a follow-up to the proposal to the IAEA for a Permanent Training Course for System Code Users (D'Auria, 1998). Four seminars have been held at the University of Pisa (2003, 2004), at The Pennsylvania State University (2004) and at the University of Zagreb (2005). It was recognized that such courses represent both a source of continuing education for current code users and a means for them to enter the formal training structure of a proposed 'permanent' stepwise approach to user training. The 3D S.UN.COP 2005 was successfully held with the participation of 19 persons coming from 9 countries and 14 different institutions (universities, vendors, national laboratories and regulatory bodies). More than 15 scientists were involved in the organization of the seminar, presenting theoretical aspects of the proposed methodologies and

  12. Transfer of Nuclear Data Uncertainties to the Uncertainties of Fuel Characteristic by Interval Calculation

    International Nuclear Information System (INIS)

    Ukraintsev, V.F.; Kolesov, V.V.

    2006-01-01

    For the evaluation of uncertainties in reactor functionals, perturbation theory and sensitivity analysis techniques are usually used, in their linearized form. This approach has several disadvantages, and that is why a new method, based on the application of a special interval calculation technique, has been created. Basically, the problem of how fuel cycle characteristic uncertainties depend on the uncertainties of the source group neutron cross-sections and decay parameters can be solved (to some extent) by sensitivity analysis as well. However, such a procedure is rather labor-consuming and does not give guaranteed estimates for the resulting parameters since, strictly speaking, it works only for small deviations, being initially based on linearization of the mathematical problem. The technique of estimating fuel cycle characteristic uncertainties presented here is based on so-called interval analysis (or interval calculations). The basic advantage of this technique is the possibility of deriving guaranteed estimates. It consists in introducing a special interval data type into the codes and defining all arithmetic operations for it. A technique for solving the system of linear equations (isotope kinetics) with interval arithmetic has been implemented for the fuel burnup problem. It is thus possible to compute the impact of neutron flux, fission and capture cross-section uncertainties on nuclide concentration uncertainties and on fuel cycle characteristics (such as k_eff, breeding ratio, decay heat power, etc.). A code for interval burnup calculations has by now been developed and verified
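
    A minimal sketch of the interval data type idea, not the authors' code: a small interval class with only the operations needed to propagate a guaranteed enclosure through a one-nuclide depletion step N(t) = N0·exp(-lambda·t), with invented bounds.

```python
from dataclasses import dataclass
import math

# Minimal interval type: every operation returns an interval guaranteed to
# enclose all possible results for inputs within the given bounds.
@dataclass
class Interval:
    lo: float
    hi: float
    def __mul__(self, other):
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))
    def exp_neg(self, t):
        # exp(-x*t) is monotonically decreasing in x for t > 0,
        # so the bounds simply swap.
        return Interval(math.exp(-self.hi * t), math.exp(-self.lo * t))

N0 = Interval(0.99e24, 1.01e24)     # initial concentration, +/- 1% (invented)
lam = Interval(4.8e-10, 5.2e-10)    # uncertain effective removal rate [1/s]
t = 3.15e7                          # about one year [s]

N = N0 * lam.exp_neg(t)             # enclosure of N(t) = N0*exp(-lambda*t)
print(f"guaranteed enclosure: [{N.lo:.3e}, {N.hi:.3e}]")
```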

  13. Cross-section sensitivity and uncertainty analysis of the FNG copper benchmark experiment

    Energy Technology Data Exchange (ETDEWEB)

    Kodeli, I., E-mail: ivan.kodeli@ijs.si [Jožef Stefan Institute, Jamova 39, SI-1000 Ljubljana (Slovenia)]; Kondo, K. [Karlsruhe Institute of Technology, Postfach 3640, D-76021 Karlsruhe (Germany); Japan Atomic Energy Agency, Rokkasho-mura (Japan)]; Perel, R.L. [Racah Institute of Physics, Hebrew University of Jerusalem, IL-91904 Jerusalem (Israel)]; Fischer, U. [Karlsruhe Institute of Technology, Postfach 3640, D-76021 Karlsruhe (Germany)]

    2016-11-01

    A neutronics benchmark experiment on a copper assembly was performed between the end of 2014 and the beginning of 2015 at the 14-MeV Frascati neutron generator (FNG) of ENEA Frascati, with the objective of providing the experimental database required for the validation of the copper nuclear data relevant for ITER design calculations, including the related uncertainties. The paper presents the pre- and post-analysis of the experiment performed using cross-section sensitivity and uncertainty codes, both deterministic (SUSD3D) and Monte Carlo (MCSEN5). Cumulative reaction rates and neutron flux spectra, their sensitivity to the cross sections, as well as the corresponding uncertainties were estimated for different selected detector positions up to ∼58 cm in the copper assembly. This permitted, in the pre-analysis phase, optimization of the geometry, the detector positions and the choice of activation reactions, and, in the post-analysis phase, interpretation of the results of the measurements and the calculations, conclusions on the quality of the relevant nuclear cross-section data, and estimation of the uncertainties in the calculated nuclear responses and fluxes. Large uncertainties in the calculated reaction rates and neutron spectra of up to 50%, rarely observed at this level in benchmark analyses using today's nuclear data, were predicted, particularly for fast reactions. Observed C/E (dis)agreements with values as low as 0.5 partly confirm these predictions. The benchmark results are therefore expected to contribute to the improvement of both cross-section and covariance data evaluations.

  14. CEC/USDOE workshop on uncertainty analysis

    International Nuclear Information System (INIS)

    Elderkin, C.E.; Kelly, G.N.

    1990-07-01

    Any measured or assessed quantity contains uncertainty. The quantitative estimation of such uncertainty is becoming increasingly important, especially in assuring that safety requirements are met in the design, regulation, and operation of nuclear installations. The CEC/USDOE Workshop on Uncertainty Analysis, held in Santa Fe, New Mexico, on November 13 through 16, 1989, was organized jointly by the Commission of the European Communities' (CEC) Radiation Protection Research programme, dealing with uncertainties throughout the field of consequence assessment, and DOE's Atmospheric Studies in Complex Terrain (ASCOT) program, concerned with the particular uncertainties in time- and space-variant transport and dispersion. The workshop brought together US and European scientists who have been developing or applying uncertainty analysis methodologies in a variety of contexts, often with incomplete knowledge of the work of others in this area. It was thus timely to exchange views and experience, identify limitations of approaches to uncertainty and possible improvements, and enhance the interface between developers and users of uncertainty analysis methods. Furthermore, the workshop considered the extent to which consistent, rigorous methods could be used in various applications within consequence assessment. 3 refs

  15. The STAT7 Code for Statistical Propagation of Uncertainties In Steady-State Thermal Hydraulics Analysis of Plate-Fueled Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Dunn, Floyd E. [Argonne National Lab. (ANL), Argonne, IL (United States); Hu, Lin-wen [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States). Nuclear Reactor Lab.; Wilson, Erik [Argonne National Lab. (ANL), Argonne, IL (United States)

    2016-12-01

    The STAT code was written to automate many of the steady-state thermal hydraulic safety calculations for the MIT research reactor, both for the conversion of the reactor from high-enrichment uranium fuel to low-enrichment uranium fuel and for future fuel reloads after the conversion. A Monte Carlo statistical propagation approach is used to treat uncertainties in important parameters in the analysis. These safety calculations are ultimately intended to protect against high fuel plate temperatures due to critical heat flux, departure from nucleate boiling, or onset of flow instability; additional margin is obtained by basing the limiting safety settings on avoiding the onset of nucleate boiling (ONB). STAT7 can simultaneously analyze all of the axial nodes of all of the fuel plates and all of the coolant channels for one stripe of a fuel element. The stripes run the length of the fuel, from bottom to top. Power splits are calculated for each axial node of each plate to determine how much of the power goes out each face of the plate. By running STAT7 multiple times, full-core analysis has been performed by evaluating the margin to ONB for each axial node of each stripe of each plate of each element in the core.
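
    The statistical propagation scheme described above reduces, in essence, to sampling the uncertain inputs and reading percentiles off the resulting margin distribution. The following minimal Python sketch illustrates only that idea; the margin surrogate, parameter values and distributions are invented for illustration and are not the STAT7 models or parameter set.

```python
import numpy as np

# Minimal sketch of Monte Carlo statistical propagation to a thermal
# margin, with invented inputs; NOT the STAT7 models or parameters.
rng = np.random.default_rng(42)
N = 10_000  # number of sampled cases

# Illustrative uncertain inputs: channel flow [kg/s], inlet temperature
# [C] and local heat flux [W/cm2], each with an assumed 1-sigma spread.
flow = rng.normal(0.30, 0.015, N)
t_in = rng.normal(42.0, 1.0, N)
q_pp = rng.normal(60.0, 3.0, N)

# Crude heat-balance surrogate for the wall temperature; a real analysis
# would evaluate power splits and every axial node of every plate.
t_wall = t_in + 0.225 * q_pp / flow
t_onb = 115.0                    # assumed onset-of-nucleate-boiling limit [C]
margin = t_onb - t_wall

print(f"mean ONB margin      : {margin.mean():6.2f} C")
print(f"5th-percentile margin: {np.percentile(margin, 5):6.2f} C")
print(f"P(margin < 0)        : {(margin < 0).mean():.4f}")
```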

  16. Applied research in uncertainty modeling and analysis

    CERN Document Server

    Ayyub, Bilal

    2005-01-01

    Uncertainty has been a concern to engineers, managers, and scientists for many years. For a long time uncertainty was considered synonymous with 'random', 'stochastic', 'statistical', or 'probabilistic'. Since the early 1960s, views on uncertainty have become more heterogeneous, and in the past forty years numerous tools that model uncertainty, above and beyond statistics, have been proposed by engineers and scientists. The tool or method used to model uncertainty in a specific context should be chosen by considering the features of the phenomenon under consideration, what is known about the system, and what causes the uncertainty. In this fascinating overview of the field, the authors provide broad coverage of uncertainty analysis/modeling and its application. Applied Research in Uncertainty Modeling and Analysis presents the perspectives of various researchers and practitioners on uncertainty analysis and modeling outside their own fields and domain expertise. Rather than focusing explicitly on...

  17. SCALE-6 Sensitivity/Uncertainty Methods and Covariance Data

    International Nuclear Information System (INIS)

    Williams, Mark L.; Rearden, Bradley T.

    2008-01-01

    Computational methods and data used for sensitivity and uncertainty analysis within the SCALE nuclear analysis code system are presented. The methodology used to calculate sensitivity coefficients and similarity coefficients and to perform nuclear data adjustment is discussed. A description is provided of the SCALE-6 covariance library, based on ENDF/B-VII and other nuclear data evaluations and supplemented by 'low-fidelity' approximate covariances. SCALE (Standardized Computer Analyses for Licensing Evaluation) is a modular code system developed by Oak Ridge National Laboratory (ORNL) to perform calculations for criticality safety, reactor physics, and radiation shielding applications. SCALE calculations typically use sequences that execute a predefined series of executable modules to compute particle fluxes and responses such as the critical multiplication factor. SCALE also includes modules for sensitivity and uncertainty (S/U) analysis of calculated responses. The S/U codes in SCALE are collectively referred to as TSUNAMI (Tools for Sensitivity and UNcertainty Analysis Methodology Implementation). SCALE-6, scheduled for release in 2008, contains significant new capabilities, including important enhancements in S/U methods and data. The main functions of TSUNAMI are to (a) compute nuclear data sensitivity coefficients and response uncertainties, (b) establish similarity between benchmark experiments and design applications, and (c) reduce uncertainty in calculated responses by consolidating integral benchmark experiments. TSUNAMI includes easy-to-use graphical user interfaces for defining problem input and viewing three-dimensional (3D) geometries, as well as an integrated plotting package.
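
    The first-order propagation underlying TSUNAMI-type S/U tools is the so-called sandwich rule: the response variance is the sensitivity vector folded with the nuclear-data covariance matrix. A minimal sketch follows; the three-parameter sensitivity vector and covariance matrix are invented stand-ins, not actual SCALE covariance data.

```python
import numpy as np

# Minimal sketch of the first-order "sandwich rule" used by S/U tools
# such as TSUNAMI: var(R)/R^2 = S C S^T, where S holds relative
# sensitivities (dR/R per dsigma/sigma) and C is the relative covariance
# matrix of the nuclear data. All numbers are invented for illustration.
S = np.array([0.45, -0.20, 0.08])        # k-eff sensitivities to 3 data items

C = np.array([[4.0e-4, 1.0e-4, 0.0],     # assumed relative covariance matrix
              [1.0e-4, 9.0e-4, 0.0],
              [0.0,    0.0,    2.5e-3]])

rel_var = S @ C @ S                      # the sandwich rule
print(f"relative k-eff uncertainty: {np.sqrt(rel_var) * 100:.2f} %")
```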

  18. Local conditions and uncertainty bands for Semiscale Test S-02-9

    International Nuclear Information System (INIS)

    Varacalle, D.J. Jr.

    1979-01-01

    Analysis was performed to derive local-conditions heat transfer parameters and their uncertainties for the Semiscale core during LOCA Test S-02-9, using computer codes and experimentally derived boundary conditions. The calculations consisted of nominal code cases using best-estimate input parameters, and cases in which the specified input parameters were perturbed in accordance with the response surface method of uncertainty analysis. The output parameters of interest were those used in film boiling heat transfer correlations, including enthalpy, pressure, quality, and coolant flow rate. Large uncertainty deviations occurred during low core mass flow periods, where the relative flow uncertainties were large. Utilizing the derived local conditions and their associated uncertainties, a study then showed that the uncertainty in the film boiling heat transfer coefficient varied between 5 and 250%
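
    The response surface method referred to above replaces the expensive code with a cheap fitted surrogate before sampling. The sketch below, with an invented stand-in for the thermal-hydraulics code and an assumed quadratic surface in two normalized inputs, shows only the mechanics: run the code at a few perturbed design points, fit, then sample the surface.

```python
import numpy as np

# Minimal sketch of the response surface method: run the (expensive) code
# at a few perturbed design points, fit a quadratic surface, then sample
# the cheap surface. The "code" below is an invented stand-in.
rng = np.random.default_rng(0)

def code_run(x1, x2):
    # placeholder for a best-estimate code run, inputs in sigma units
    return 100.0 + 8.0 * x1 - 5.0 * x2 + 1.5 * x1 * x2

# 3x3 factorial design: nominal and +/- 1 sigma perturbations
pts = np.array([(a, b) for a in (-1, 0, 1) for b in (-1, 0, 1)], dtype=float)
y = np.array([code_run(a, b) for a, b in pts])

def basis(x):
    # quadratic response surface: 1, x1, x2, x1*x2, x1^2, x2^2
    return np.column_stack([np.ones(len(x)), x[:, 0], x[:, 1],
                            x[:, 0] * x[:, 1], x[:, 0]**2, x[:, 1]**2])

coef, *_ = np.linalg.lstsq(basis(pts), y, rcond=None)

# Propagate uncertainty by sampling the fitted surface, not the code
x = rng.standard_normal((100_000, 2))
out = basis(x) @ coef
print(f"output mean = {out.mean():.2f}, std = {out.std():.2f}")
```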

  19. Treatment of uncertainties in the geologic disposal of radioactive waste

    International Nuclear Information System (INIS)

    Cranwell, R.M.

    1985-01-01

    Uncertainty in the analysis of geologic waste disposal is generally considered to have three primary components: (1) computer code/model uncertainty, (2) model parameter uncertainty, and (3) scenario uncertainty. Computer code/model uncertainty arises from problems associated with the determination of appropriate parameters for use in model construction, the mathematical formulation of models, and the numerical techniques used in conjunction with that mathematical formulation. Model parameter uncertainty arises from problems associated with the selection of appropriate values for model input, data interpretation and possible misuse of data, and variation of data. Scenario uncertainty arises from problems associated with the 'completeness' of scenarios, the definition of parameters which describe scenarios, and the rate or probability of scenario occurrence. These sources of uncertainty are discussed below.

  20. The Findings from the OECD/NEA/CSNI UMS (Uncertainty Method Study)

    International Nuclear Information System (INIS)

    D'Auria, F.; Glaeser, H.

    2013-01-01

    Within licensing procedures there is an incentive to replace the conservative requirements for code application by a 'best estimate' concept supplemented by an uncertainty analysis to account for the predictive uncertainties of code results. Methods have been developed to quantify these uncertainties. The Uncertainty Methods Study (UMS) Group, following a mandate from the CSNI (Committee on the Safety of Nuclear Installations) of the OECD/NEA (Organization for Economic Cooperation and Development / Nuclear Energy Agency), has compared five methods for calculating the uncertainty in the predictions of advanced 'best estimate' thermal-hydraulic codes. Most of the methods identify and combine input uncertainties. The major differences between the predictions of the methods came from the choice of uncertain parameters and from the quantification of the input uncertainties, i.e. the width of the uncertainty ranges. Suitable experimental and analytical information therefore has to be selected to specify these uncertainty ranges or distributions. After the closure of the UMS and after the report was issued, comparison calculations of experiment LSTF-SB-CL-18 were performed by the University of Pisa using different versions of the RELAP5 code. It turned out that the version used by two of the participants calculated a 170 K higher peak clad temperature than other versions using the same input deck. This may contribute to the differences in the upper limit of the uncertainty ranges. A 'bifurcation' analysis performed by the same research group provided another way of interpreting the high temperature peak calculated by two of the participants. (authors)

  1. The Generic Containment SB-LOCA accident simulation: Comparison of the parameter uncertainties and user-effect

    International Nuclear Information System (INIS)

    Povilaitis, Mantas; Kelm, Stephan; Urbonavičius, Egidijus

    2017-01-01

    Highlights: • Uncertainty and sensitivity analysis for the Generic Containment severe accident. • Comparison of the analysis results with the uncertainties arising from the user effect. • Demonstration that reducing the user effect and reducing the input uncertainties are of similar importance. - Abstract: Uncertainties in the safety assessment of nuclear power plants using computer codes come from several sources: the choice of computer code, the user effect (a strong impact of user choices on the simulation's outcome) and the uncertainty of various physical parameters. The "Generic Containment" activity was performed in the frame of the EU-FP7 project SARNET2 to investigate the influence of the user effect and of the choice of computer code on results at the nuclear power plant scale. During this activity, a Generic Containment nodalisation was developed and used for the exercise by the participants applying various computer codes. Even though the model of the Generic Containment and the transient scenario were precisely and uniquely defined, considerably different results were obtained not only among different codes but also among participants using the same code, showing a significant user effect. This paper presents an analysis which extends the "Generic Containment" benchmark and investigates the effect of input-parameter uncertainties in comparison to the user effect. Calculations were performed using the computer code ASTEC; the uncertainty and sensitivity of the results were estimated using the GRS method and the tool SUSA. The results of the present analysis show that, while there are differences between the uncertainty bands of the parameters, in general the deviation bands caused by parameter uncertainties and by the user effect are comparable and of the same order. The properties of concrete and the surface areas may have more influence on containment pressure than the user effect and the choice of computer code, as identified in the SARNET2 Generic Containment benchmark.

  2. A Two-Step Approach to Uncertainty Quantification of Core Simulators

    Directory of Open Access Journals (Sweden)

    Artem Yankov

    2012-01-01

    For the multiple sources of error introduced into the standard computational regime for simulating reactor cores, rigorous uncertainty analysis methods are available primarily to quantify the effects of cross-section uncertainties. Two methods for propagating cross-section uncertainties through core simulators are the XSUSA statistical approach and the "two-step" method. The XSUSA approach, which is based on the SUSA code package, is fundamentally a stochastic sampling method. Alternatively, the two-step method utilizes generalized perturbation theory in the first step and stochastic sampling in the second step. The consistency of these two methods in quantifying uncertainties in the multiplication factor and in the core power distribution was examined in the framework of phase I-3 of the OECD Uncertainty Analysis in Modeling benchmark. With the Three Mile Island Unit 1 core as a base model for analysis, the XSUSA and two-step methods were applied with certain limitations, and the results were compared to those produced by other stochastic sampling-based codes. Based on the uncertainty analysis results, conclusions were drawn as to which method is currently more viable for computing uncertainties in burnup and transient calculations.

  3. Code manual for CONTAIN 2.0: A computer code for nuclear reactor containment analysis

    International Nuclear Information System (INIS)

    Murata, K.K.; Williams, D.C.; Griffith, R.O.; Gido, R.G.; Tadios, E.L.; Davis, F.J.; Martinez, G.M.; Washington, K.E.; Tills, J.

    1997-12-01

    The CONTAIN 2.0 computer code is an integrated analysis tool used for predicting the physical conditions, chemical compositions, and distributions of radiological materials inside a containment building following the release of material from the primary system in a light-water reactor accident. It can also predict the source term to the environment. CONTAIN 2.0 is intended to replace the earlier CONTAIN 1.12, which was released in 1991. The purpose of this Code Manual is to provide full documentation of the features and models in CONTAIN 2.0. Besides complete descriptions of the models, this Code Manual provides a complete description of the input and output from the code. CONTAIN 2.0 is a highly flexible and modular code that can run problems that are either quite simple or highly complex. An important aspect of CONTAIN is that the interactions among thermal-hydraulic phenomena, aerosol behavior, and fission product behavior are taken into account. The code includes atmospheric models for steam/air thermodynamics, intercell flows, condensation/evaporation on structures and aerosols, aerosol behavior, and gas combustion. It also includes models for reactor cavity phenomena such as core-concrete interactions and coolant pool boiling. Heat conduction in structures, fission product decay and transport, radioactive decay heating, and the thermal-hydraulic and fission product decontamination effects of engineered safety features are also modeled. To the extent possible, the best available models for severe accident phenomena have been incorporated into CONTAIN, but it is intrinsic to the nature of accident analysis that significant uncertainty exists regarding numerous phenomena. In those cases, sensitivity studies can be performed with CONTAIN by means of user-specified input parameters. Thus, the code can be viewed as a tool designed to assist the knowledgeable reactor safety analyst in evaluating the consequences of specific modeling assumptions.

  4. Code manual for CONTAIN 2.0: A computer code for nuclear reactor containment analysis

    Energy Technology Data Exchange (ETDEWEB)

    Murata, K.K.; Williams, D.C.; Griffith, R.O.; Gido, R.G.; Tadios, E.L.; Davis, F.J.; Martinez, G.M.; Washington, K.E. [Sandia National Labs., Albuquerque, NM (United States); Tills, J. [J. Tills and Associates, Inc., Sandia Park, NM (United States)

    1997-12-01

    The CONTAIN 2.0 computer code is an integrated analysis tool used for predicting the physical conditions, chemical compositions, and distributions of radiological materials inside a containment building following the release of material from the primary system in a light-water reactor accident. It can also predict the source term to the environment. CONTAIN 2.0 is intended to replace the earlier CONTAIN 1.12, which was released in 1991. The purpose of this Code Manual is to provide full documentation of the features and models in CONTAIN 2.0. Besides complete descriptions of the models, this Code Manual provides a complete description of the input and output from the code. CONTAIN 2.0 is a highly flexible and modular code that can run problems that are either quite simple or highly complex. An important aspect of CONTAIN is that the interactions among thermal-hydraulic phenomena, aerosol behavior, and fission product behavior are taken into account. The code includes atmospheric models for steam/air thermodynamics, intercell flows, condensation/evaporation on structures and aerosols, aerosol behavior, and gas combustion. It also includes models for reactor cavity phenomena such as core-concrete interactions and coolant pool boiling. Heat conduction in structures, fission product decay and transport, radioactive decay heating, and the thermal-hydraulic and fission product decontamination effects of engineered safety features are also modeled. To the extent possible, the best available models for severe accident phenomena have been incorporated into CONTAIN, but it is intrinsic to the nature of accident analysis that significant uncertainty exists regarding numerous phenomena. In those cases, sensitivity studies can be performed with CONTAIN by means of user-specified input parameters. Thus, the code can be viewed as a tool designed to assist the knowledgeable reactor safety analyst in evaluating the consequences of specific modeling assumptions.

  5. Nuclear-data uncertainty propagations in burnup calculation for the PWR assembly

    International Nuclear Information System (INIS)

    Wan, Chenghui; Cao, Liangzhi; Wu, Hongchun; Shen, Wei

    2017-01-01

    Highlights: • DRAGON 5.0 and NECP-CACTI have been implemented in UNICORN. • The effects of different neutronics methods on S&U results were quantified. • Uncertainty analysis has been applied to the burnup calculation of a PWR assembly. • The uncertainties of the eigenvalue and few-group constants have been quantified. - Abstract: In this paper, our home-developed lattice code NECP-CACTI has been implemented into our UNICORN code to perform sensitivity and uncertainty analysis for lattice calculations. The verified multigroup cross-section perturbation model and the methods of sensitivity and uncertainty analysis are established and applied to different lattice codes in UNICORN. As both DRAGON 5.0 and NECP-CACTI are now available for the lattice calculations in UNICORN, the effects of different neutronics methods (including methods for the neutron-transport and resonance self-shielding calculations) on the results of the sensitivity and uncertainty analysis were studied in this paper. Based on NECP-CACTI, uncertainty analysis using the statistical sampling method was performed for the burnup calculation of the fresh-fueled TMI-1 assembly, propagating the nuclear-data uncertainties to k_∞ and the two-group constants of the lattice calculation with depletion. As the results show, different methods for the neutron-transport calculation introduce no differences in the results of the sensitivity and uncertainty analysis, whereas different methods for the resonance self-shielding calculation do impact the results. With depletion of the TMI-1 assembly, the relative uncertainty of k_∞ varies between 0.45% and 0.60%; for the two-group constants, the largest variation, between 0.35% and 2.56%, is found for νΣ_f,2. Moreover, the most significant contributors to the uncertainty of k_∞ and the two-group constants, and their variation with depletion, are determined.
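
    The statistical sampling method used here amounts to drawing correlated cross-section perturbations from a covariance matrix and re-evaluating the lattice response for each sample. The sketch below uses an invented five-parameter data set and a textbook two-group k-infinity formula in place of a lattice code; it is not the UNICORN/NECP-CACTI implementation.

```python
import numpy as np

# Minimal sketch of the statistical sampling method: draw correlated
# relative perturbations of few-group data from a covariance matrix and
# re-evaluate k-infinity for each sample. The five-parameter data set and
# the two-group formula are textbook stand-ins, not UNICORN/NECP-CACTI.
rng = np.random.default_rng(1)

# nominal two-group data: nu*Sf1, nu*Sf2, Sa1, Sa2, Ss1->2 [1/cm]
nominal = np.array([0.005, 0.10, 0.010, 0.080, 0.016])
rel_cov = np.diag(np.array([0.02, 0.01, 0.015, 0.01, 0.02]) ** 2)  # assumed

pert = rng.multivariate_normal(np.zeros(5), rel_cov, size=5000)
nsf1, nsf2, a1, a2, s12 = (nominal * (1.0 + pert)).T

k_inf = (nsf1 + nsf2 * s12 / a2) / (a1 + s12)  # two-group k-infinity
print(f"k-inf mean = {k_inf.mean():.4f}, "
      f"relative uncertainty = {100 * k_inf.std() / k_inf.mean():.2f} %")
```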

  6. Benchmarks for Uncertainty Analysis in Modelling (UAM) for the Design, Operation and Safety Analysis of LWRs - Volume I: Specification and Support Data for Neutronics Cases (Phase I)

    International Nuclear Information System (INIS)

    Ivanov, K.; Avramova, M.; Kamerow, S.; Kodeli, I.; Sartori, E.; Ivanov, E.; Cabellos, O.

    2013-01-01

    The objective of the OECD LWR UAM activity is to establish an internationally accepted benchmark framework to compare, assess and further develop different uncertainty analysis methods associated with the design, operation and safety of LWRs. As a result, the LWR UAM benchmark will help to address current nuclear power generation industry and regulation needs and issues related to the practical implementation of risk-informed regulation. The realistic evaluation of consequences must be made with best-estimate coupled codes, but to be meaningful, such results should be supplemented by an uncertainty analysis. The use of coupled codes allows us to avoid unnecessary penalties due to incoherent approximations in the traditional decoupled calculations, and to obtain a more accurate evaluation of margins with regard to licensing limits. This becomes important for licensing power upgrades, improved fuel assembly and control rod designs, higher burn-up and other issues related to operating LWRs, as well as to the new Generation 3+ designs being licensed now (ESBWR, AP-1000, EPR-1600, etc.). Establishing an internationally accepted LWR UAM benchmark framework offers the possibility to accelerate the licensing process when using best-estimate methods. The proposed technical approach is to establish a benchmark for uncertainty analysis in best-estimate modelling and coupled multi-physics and multi-scale LWR analysis, using as a basis a series of well-defined problems with complete sets of input specifications and reference experimental data. The objective is to determine the uncertainty in LWR system calculations at all stages of coupled reactor physics/thermal hydraulics calculations. The full chain of uncertainty propagation, from basic data and engineering uncertainties, across different scales (multi-scale) and physics phenomena (multi-physics), will be tested on a number of benchmark exercises for which experimental data are available and for which the power plant details have been made available.

  7. Sensitivity/uncertainty analysis of a borehole scenario comparing Latin Hypercube Sampling and deterministic sensitivity approaches

    International Nuclear Information System (INIS)

    Harper, W.V.; Gupta, S.K.

    1983-10-01

    A computer code was used to study steady-state flow for a hypothetical borehole scenario. The model consists of three coupled equations with only eight parameters and three dependent variables; this study focused on steady-state flow as the performance measure of interest. Two different approaches to sensitivity/uncertainty analysis were applied to this code. One approach, based on Latin Hypercube Sampling (LHS), is a statistical sampling method, whereas the second approach is based on the deterministic evaluation of sensitivities. The LHS technique is easy to apply and should work well for codes with a moderate number of parameters. Of the deterministic techniques, the direct method is preferred when there are many performance measures of interest and a moderate number of parameters. The adjoint method is recommended when there are a limited number of performance measures and an unlimited number of parameters. This capability for an unlimited number of parameters can be extremely useful for finite element or finite difference codes with a large number of grid blocks. The Office of Nuclear Waste Isolation will use the technique most appropriate for an individual situation; for example, the adjoint method may be used to reduce the scope to a size that can be readily handled by a technique such as LHS. Other techniques for sensitivity/uncertainty analysis, e.g., kriging followed by conditional simulation, will also be used. 15 references, 4 figures, 9 tables
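
    Latin Hypercube Sampling stratifies each parameter's probability range into N equal-probability bins, draws once per bin, and pairs the columns by random permutation, so even a small number of code runs covers every marginal distribution once. A minimal sketch follows; the borehole-like parameter names and ranges are illustrative assumptions, not the parameters of the study above.

```python
import numpy as np

# Minimal sketch of Latin Hypercube Sampling: split each parameter's
# probability range into N strata, draw once per stratum, and pair the
# columns by random permutation. Parameter ranges are illustrative.
rng = np.random.default_rng(7)

def lhs(n_samples, n_params, rng):
    # stratified uniform (0,1) design, one sample per stratum per column
    u = (rng.random((n_samples, n_params))
         + np.arange(n_samples)[:, None]) / n_samples
    for j in range(n_params):                 # decorrelate the columns
        u[:, j] = u[rng.permutation(n_samples), j]
    return u

N = 20                                        # affordable number of code runs
u = lhs(N, 2, rng)
perm = 10.0 ** (-14 + 2 * u[:, 0])            # log-uniform 1e-14..1e-12 m^2
poro = 0.05 + 0.20 * u[:, 1]                  # uniform 0.05..0.25
print(np.column_stack([perm, poro])[:5])      # first 5 sampled input sets
```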

  8. Uncertainty analysis of minimum vessel liquid inventory during a small-break LOCA in a B&W Plant: An application of the CSAU methodology using the RELAP5/MOD3 computer code

    International Nuclear Information System (INIS)

    Ortiz, M.G.; Ghan, L.S.

    1992-12-01

    The Nuclear Regulatory Commission (NRC) revised the emergency core cooling system licensing rule to allow the use of best-estimate computer codes, provided the uncertainties of the calculations are quantified and used in the licensing and regulation process. The NRC developed a generic methodology called Code Scaling, Applicability, and Uncertainty (CSAU) to evaluate best-estimate code uncertainties. The objective of this work was to adapt and demonstrate the CSAU methodology for a small-break loss-of-coolant accident (SBLOCA) in a Pressurized Water Reactor of the Babcock & Wilcox lowered-loop design, using RELAP5/MOD3 as the simulation tool. The CSAU methodology was successfully demonstrated for the new set of variants defined in this project (scenario, plant design, code). However, the robustness of the reactor design to this SBLOCA scenario limits the applicability of the specific results to other plants or scenarios. Several aspects of the code were not exercised because the conditions of the transient never became severe enough. The plant operator proved to be a determining factor in the course of the transient scenario, and steps were taken to include the operator in the model, simulation, and analyses

  9. Concepts involved in a proposed application of uncertainty analysis to the performance assessment of high-level nuclear waste isolation systems

    International Nuclear Information System (INIS)

    Maerker, R.E.

    1986-03-01

    This report introduces the concepts of a previously developed methodology which could readily be extended to the field of performance assessment for high-level nuclear waste isolation systems. The methodology incorporates sensitivities previously obtained with the GRESS code into an uncertainty analysis, through which uncertainties in calculated responses may be propagated from basic data uncertainties. Following a definition of terms, examples are provided illustrating commonly used conventions for describing the concepts of covariance and sensitivity. Examples of solutions to problems previously encountered in related fields involving uncertainty analysis and the use of a generalized linear least-squares adjustment procedure are also presented. 5 refs., 14 tabs

  10. Uncertainty analysis of suppression pool heating during an ATWS in a BWR-5 plant

    International Nuclear Information System (INIS)

    Wulff, W.; Cheng, H.S.; Mallen, A.N.; Johnsen, G.W.; Lellouche, G.S.

    1994-03-01

    The uncertainty of predicting the peak temperature in the suppression pool of a BWR power plant undergoing an NRC-postulated Anticipated Transient Without Scram (ATWS) has been estimated. The ATWS is initiated by recirculation-pump trips and then leads to power and flow oscillations such as occurred at the LaSalle-2 Power Station in March 1988. After limit-cycle oscillations have been established, the turbines are tripped, but without MSIV closure, allowing steam discharge through the turbine bypass into the condenser. Postulated operator actions, namely lowering the reactor vessel pressure and the level elevation in the downcomer, are simulated by a robot model which accounts for operator uncertainty. All balance-of-plant and control-system modeling uncertainties were part of the statistical uncertainty analysis, which was patterned after the Code Scaling, Applicability and Uncertainty (CSAU) evaluation methodology. The analysis showed that the predicted suppression-pool peak temperature of 329.3 K (133 degrees F) has a 95-percentile uncertainty of 14.4 K (26 degrees F), and that the size of this uncertainty bracket is dominated by the experimental uncertainty of measuring Safety and Relief Valve mass flow rates under critical-flow conditions. The analysis also showed that the probability of exceeding the suppression-pool temperature limit of 352.6 K (175 degrees F) is most likely zero (it is estimated as < 5 × 10⁻⁴). The square root of the sum of the squares of all the computed peak pool temperatures is 350.7 K (171.6 degrees F)

  11. Ethical Code Effectiveness in Football Clubs: A Longitudinal Analysis

    OpenAIRE

    Constandt, Bram; De Waegeneer, Els; Willem, Annick

    2017-01-01

    As football (soccer) clubs are facing different ethical challenges, many clubs are turning to ethical codes to counteract unethical behaviour. However, both inside and outside the sports field, uncertainty remains about the effectiveness of these ethical codes. For the first time, a longitudinal study design was adopted to evaluate code effectiveness. Specifically, a sample of non-professional football clubs formed the subject of our inquiry. Ethical code effectiveness was...

  12. Uncertainty as Knowledge: Constraints on Policy Choices Provided by Analysis of Uncertainty

    Science.gov (United States)

    Lewandowsky, S.; Risbey, J.; Smithson, M.; Newell, B. R.

    2012-12-01

    Uncertainty forms an integral part of climate science, and it is often cited in connection with arguments against mitigative action. We argue that an analysis of uncertainty must consider existing knowledge as well as uncertainty, and the two must be evaluated with respect to the outcomes and risks associated with possible policy options. Although risk judgments are inherently subjective, an analysis of the role of uncertainty within the climate system yields two constraints that are robust to a broad range of assumptions. Those constraints are that (a) greater uncertainty about the climate system is necessarily associated with greater expected damages from warming, and (b) greater uncertainty translates into a greater risk of the failure of mitigation efforts. These ordinal constraints are unaffected by subjective or cultural risk-perception factors, they are independent of the discount rate, and they are independent of the magnitude of the estimate for climate sensitivity. The constraints mean that any appeal to uncertainty must imply a stronger, rather than weaker, need to cut greenhouse gas emissions than in the absence of uncertainty.

  13. Development and assessment of best estimate integrated safety analysis code

    International Nuclear Information System (INIS)

    Chung, Bub Dong; Lee, Young Jin; Hwang, Moon Kyu

    2007-03-01

    Improvement of the integrated safety analysis code MARS3.0 has been carried out and a multi-D safety analysis application system has been established. An iterative matrix solver and a parallel processing algorithm have been introduced, and a Linux version has been generated to enable MARS to run on PC clusters. MARS variables and subroutines have been reformed and modularised to simplify code maintenance. Model uncertainty analyses have been performed for the THTF, FLECHT, NEPTUN, and LOFT experiments as well as for the APR1400 plant. Participation in international cooperative research projects such as OECD BEMUSE, SETH, PKL, BFBT, and TMI-2 has been actively pursued as part of code assessment efforts. The assessment and evaluation results and the experimental data obtained through international cooperation projects have been registered and maintained in the T/H Databank. Multi-D analyses of APR1400 LBLOCA, DVI Break, SLB, and SGTR have been carried out as part of the application efforts in multi-D safety analysis. A GUI-based 3D input generator has been developed for user convenience. Operation of the MARS Users Group (MUG) was continued, and through MUG the technology has been transferred to 24 organisations. A set of 4 volumes of user manuals has been compiled, and correction reports for the code errors reported during MARS development have been published.

  14. Development and assessment of best estimate integrated safety analysis code

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Bub Dong; Lee, Young Jin; Hwang, Moon Kyu (and others)

    2007-03-15

    Improvement of the integrated safety analysis code MARS3.0 has been carried out and a multi-D safety analysis application system has been established. An iterative matrix solver and a parallel processing algorithm have been introduced, and a Linux version has been generated to enable MARS to run on PC clusters. MARS variables and subroutines have been reformed and modularised to simplify code maintenance. Model uncertainty analyses have been performed for the THTF, FLECHT, NEPTUN, and LOFT experiments as well as for the APR1400 plant. Participation in international cooperative research projects such as OECD BEMUSE, SETH, PKL, BFBT, and TMI-2 has been actively pursued as part of code assessment efforts. The assessment and evaluation results and the experimental data obtained through international cooperation projects have been registered and maintained in the T/H Databank. Multi-D analyses of APR1400 LBLOCA, DVI Break, SLB, and SGTR have been carried out as part of the application efforts in multi-D safety analysis. A GUI-based 3D input generator has been developed for user convenience. Operation of the MARS Users Group (MUG) was continued, and through MUG the technology has been transferred to 24 organisations. A set of 4 volumes of user manuals has been compiled, and correction reports for the code errors reported during MARS development have been published.

  15. Uncertainty Evaluation with Multi-Dimensional Model of LBLOCA in OPR1000 Plant

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jieun; Oh, Deog Yeon; Seul, Kwang-Won; Lee, Jin Ho [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2016-10-15

    KINS has used KINS-REM (KINS Realistic Evaluation Methodology), developed for best-estimate (BE) calculation and uncertainty quantification in regulatory audits. This methodology has been improved continuously by numerous studies covering, for example, the uncertainty parameters and uncertainty ranges. In this study, to confirm the applicability of the improved KINS-REM to the OPR1000 plant, an uncertainty evaluation with a multi-dimensional model capturing multi-dimensional phenomena was conducted with the MARS-KS code. The reactor vessel was modeled using the MULTID component of the MARS-KS code, and a total of 29 uncertainty parameters was considered in 124 sampled calculations. Through the 124 calculations, run with the Mosaique program and the MARS-KS code, the peak cladding temperature was calculated, and the final PCT was determined by the 3rd-order Wilks' formula. The uncertainty parameters with a strong influence were identified by Pearson coefficient analysis; they were mostly related to plant operation and fuel material properties. The results of the 124 calculations and the sensitivity analysis show that the improved KINS-REM can reasonably be applied to uncertainty evaluations with multi-dimensional model calculations of OPR1000 plants.
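
    The 124-run sample size quoted above is the minimum for a one-sided 95/95 tolerance limit with the 3rd-order Wilks' formula: the third-largest of 124 outputs bounds the 95th percentile with at least 95% confidence. A minimal sketch follows; the sampled input and the PCT values are invented stand-ins for MARS-KS calculations.

```python
import numpy as np

# Minimal sketch of a one-sided 95/95 limit via the 3rd-order Wilks'
# formula: with 124 runs, the 3rd-largest output bounds the 95th
# percentile with at least 95% confidence. The sampled input and the
# PCT "code runs" below are invented, not MARS-KS results.
rng = np.random.default_rng(3)
N = 124                                   # minimum sample size, 3rd order

gap = rng.normal(1.0, 0.1, N)             # one illustrative sampled input
pct = 1150.0 + 120.0 * (gap - 1.0) + rng.normal(0.0, 20.0, N)  # PCT [K]

pct_9595 = np.sort(pct)[-3]               # 3rd-largest value = 95/95 bound
print(f"95/95 PCT estimate: {pct_9595:.1f} K")

# Pearson coefficients rank which sampled inputs drive the PCT spread
r = np.corrcoef(gap, pct)[0, 1]
print(f"Pearson coefficient (gap vs PCT): {r:+.3f}")
```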

  16. How does uncertainty shape patient experience in advanced illness? A secondary analysis of qualitative data.

    Science.gov (United States)

    Etkind, Simon Noah; Bristowe, Katherine; Bailey, Katharine; Selman, Lucy Ellen; Murtagh, Fliss Em

    2017-02-01

    Uncertainty is common in advanced illness but is infrequently studied in this context. If poorly addressed, uncertainty can lead to adverse patient outcomes. We aimed to understand patient experiences of uncertainty in advanced illness and to develop a typology of patients' responses and preferences to inform practice. Secondary analysis of qualitative interview transcripts was performed. Studies were assessed for inclusion and interviews were sampled using maximum-variation sampling. The analysis used a thematic approach, with 10% of coding cross-checked to enhance reliability. Qualitative interviews came from six studies including patients with heart failure, chronic obstructive pulmonary disease, renal disease, cancer and liver failure. A total of 30 transcripts were analysed. Median age was 75 (range 43-95); 12 patients were women. The impact of uncertainty was frequently discussed: the main related themes were engagement with illness, information needs, patient priorities and the period of time on which patients mainly focused their attention (temporal focus). A typology of patient responses to uncertainty was developed from these themes. Uncertainty influences patient experience in advanced illness by affecting patients' information needs, preferences and future priorities for care. Our typology aids understanding of how patients with advanced illness respond to uncertainty. Assessment of these three factors may be a useful starting point to guide clinical assessment and shared decision making.

  17. Application of the coupled Relap5/Panther codes for PWR steam line break accident analysis

    International Nuclear Information System (INIS)

    Guisset, J.-P.; Bosso, S.; Charlier, A.; Delhaye, X.; Ergo, O.; Ouliddren, K.; Schneidesch, C.; Zhang, J.

    2001-01-01

    A dynamic coupling between the existing 1-dimensional thermal-hydraulics system code RELAP5 and the 3-dimensional neutronics code PANTHER is applied via the transient analysis code linkage program TALINK. An interface between PANTHER and the subchannel thermal-hydraulic analysis code COBRA 3C allows direct evaluation of the Departure from Nucleate Boiling Ratio in parallel with the coupled PANTHER/RELAP5 simulation. The coupled codes are applied to develop a Final Safety Analysis Report (FSAR) accident analysis methodology for the major Steam Line Break (SLB) accident at hot zero power in a typical three-loop pressurised water reactor. In this methodology, the uncertainties related to the plant, core thermal-hydraulic and neutronic parameters are combined in a deterministic bounding approach based on sensitivity studies. The results of the coupled thermal-hydraulic and neutronic analysis of the SLB are presented and discussed. It is shown that there is an important margin in the traditional FSAR accident analysis for SLB, which can be attributed to the conservatisms introduced by de-coupling the plant sub-systems. (author)

  18. Application of the coupled Relap5/Panther codes for PWR steam line break accident analysis

    Energy Technology Data Exchange (ETDEWEB)

    Guisset, J.-P.; Bosso, S.; Charlier, A.; Delhaye, X.; Ergo, O.; Ouliddren, K.; Schneidesch, C.; Zhang, J. [Tractebel Energy Engineering, Brussels (Belgium)

    2001-07-01

    A dynamic coupling between the existing 1-dimensional thermal-hydraulics system code RELAP5 and the 3-dimensional neutronics code PANTHER is applied via the transient analysis code linkage program TALINK. An interface between PANTHER and the subchannel thermal-hydraulic analysis code COBRA 3C allows direct evaluation of the Departure from Nucleate Boiling Ratio in parallel with the coupled PANTHER/RELAP5 simulation. The coupled codes are applied to develop a Final Safety Analysis Report (FSAR) accident analysis methodology for the major Steam Line Break (SLB) accident at hot zero power in a typical three-loop pressurised water reactor. In this methodology, the uncertainties related to the plant, core thermal-hydraulic and neutronic parameters are combined in a deterministic bounding approach based on sensitivity studies. The results of the coupled thermal-hydraulic and neutronic analysis of the SLB are presented and discussed. It is shown that there is an important margin in the traditional FSAR accident analysis for SLB, which can be attributed to the conservatisms introduced by de-coupling the plant sub-systems. (author)

  19. Preliminary uncertainty analysis of OECD/UAM benchmark for the TMI-1 reactor

    International Nuclear Information System (INIS)

    Cardoso, Fabiano S.; Faria, Rochkhudson B.; Silva, Lucas M.C.; Pereira, Claubia; Fortini, Angela

    2015-01-01

    Nowadays the demand from nuclear research centers for safety, regulation and best-estimate predictions provided with confidence bounds has been increasing. Studies have pointed out that the present uncertainties in nuclear data should be significantly reduced to get the full benefit from the advanced modeling and simulation initiatives. The major outcome of the NEA/OECD (UAM) workshop, which took place in Italy in 2006, was the preparation of a benchmark work program with the steps (exercises) needed to define the uncertainty and modeling tasks. In that direction, this work was performed within the framework of UAM Exercise 1 (I-1) 'Cell Physics' to validate the study and to estimate the accuracy of the model. The objectives of this study were to make a preliminary analysis of the criticality values of the TMI-1 PWR and of the bias between the multiplication factors obtained with two different nuclear codes. The range of the bias was obtained using the deterministic codes NEWT (New ESC-based Weighting Transport code), the two-dimensional transport module that uses AMPX-formatted cross-sections processed by other SCALE modules, and WIMSD5 (Winfrith Improved Multi-Group Scheme). The WIMSD5 system consists of a simplified geometric representation of heterogeneous space zones that are coupled with each other and with the boundaries, while the properties of each spacing element are obtained from the Carlson DSN method or the collision probability method. (author)

  20. Procedures for uncertainty and sensitivity analysis in repository performance assessment

    International Nuclear Information System (INIS)

    Poern, K.; Aakerlund, O.

    1985-10-01

    The objective of the project was mainly a literature study of the available methods for the treatment of parameter uncertainty propagation and sensitivity aspects in complex models, such as those concerning the geologic disposal of radioactive waste. The study, which has run parallel with the development of a code package (PROPER) for computer-assisted analysis of function, also aims at the choice of accurate, cost-effective methods for uncertainty and sensitivity analysis. Such a choice depends on several factors, such as the number of input parameters, the capacity of the model and the computer resources required to use the model. Two basic approaches are addressed in the report. In one of these, the model of interest is directly simulated by an efficient sampling technique to generate an output distribution. Applying the other basic method, the model is replaced by an approximating analytical response surface, which is then used in the sampling phase or in moment matching to generate the output distribution. Both approaches are illustrated by simple examples in the report. (author)

  1. Supporting Qualified Database for Uncertainty Evaluation

    International Nuclear Information System (INIS)

    Petruzzi, A.; Fiori, F.; Kovtonyuk, A.; Lisovyy, O.; D'Auria, F.

    2013-01-01

    Uncertainty evaluation constitutes a key feature of the BEPU (Best Estimate Plus Uncertainty) process. The uncertainty can be the result of a Monte Carlo type analysis involving input uncertainty parameters, or the outcome of a process involving the use of experimental data and connected code calculations. Those uncertainty methods are discussed in several papers and guidelines (IAEA-SRS-52, OECD/NEA BEMUSE reports). The present paper aims at discussing the role and the depth of the analysis required for merging, on the one side, suitable experimental data and, on the other side, qualified code calculation results. This aspect is mostly connected with the second approach for uncertainty mentioned above, but it can also be used in the framework of the first approach. Namely, the paper discusses the features and structure of the database, which includes the following kinds of documents: 1. The 'RDS-facility' (Reference Data Set for the selected facility): this includes the description of the facility, the geometrical characterization of any component of the facility, the instrumentation, the data acquisition system, the evaluation of pressure losses, the physical properties of the material, and the characterization of pumps, valves and heat losses; 2. The 'RDS-test' (Reference Data Set for the selected test of the facility): this includes the description of the main phenomena investigated during the test, the configuration of the facility for the selected test (with a possible new evaluation of pressure and heat losses if needed) and the specific boundary and initial conditions; 3. The 'QR' (Qualification Report) of the code calculation results: this includes the description of the nodalization developed following a set of homogeneous techniques, the achievement of the steady-state conditions and the qualitative and quantitative analysis of the transient with the characterization of the Relevant Thermal-Hydraulics Aspects (RTA); 4. The EH (Engineering Handbook) of the input nodalization

  2. Supporting qualified database for uncertainty evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Petruzzi, A.; Fiori, F.; Kovtonyuk, A.; D'Auria, F. [Nuclear Research Group of San Piero A Grado, Univ. of Pisa, Via Livornese 1291, 56122 Pisa (Italy)

    2012-07-01

    Uncertainty evaluation constitutes a key feature of the BEPU (Best Estimate Plus Uncertainty) process. The uncertainty can be the result of a Monte Carlo type analysis involving input uncertainty parameters, or the outcome of a process involving the use of experimental data and connected code calculations. Those uncertainty methods are discussed in several papers and guidelines (IAEA-SRS-52, OECD/NEA BEMUSE reports). The present paper aims at discussing the role and the depth of the analysis required for merging, on the one side, suitable experimental data and, on the other side, qualified code calculation results. This aspect is mostly connected with the second approach for uncertainty mentioned above, but it can also be used in the framework of the first approach. Namely, the paper discusses the features and structure of the database, which includes the following kinds of documents: 1. The 'RDS-facility' (Reference Data Set for the selected facility): this includes the description of the facility, the geometrical characterization of any component of the facility, the instrumentation, the data acquisition system, the evaluation of pressure losses, the physical properties of the material, and the characterization of pumps, valves and heat losses; 2. The 'RDS-test' (Reference Data Set for the selected test of the facility): this includes the description of the main phenomena investigated during the test, the configuration of the facility for the selected test (with a possible new evaluation of pressure and heat losses if needed) and the specific boundary and initial conditions; 3. The 'QR' (Qualification Report) of the code calculation results: this includes the description of the nodalization developed following a set of homogeneous techniques, the achievement of the steady-state conditions and the qualitative and quantitative analysis of the transient with the characterization of the Relevant Thermal-Hydraulics Aspects (RTA); 4. The EH (Engineering

  3. Eigenvalue sensitivity analysis and uncertainty quantification in SCALE6.2.1 using continuous-energy Monte Carlo Method

    Energy Technology Data Exchange (ETDEWEB)

    Labarile, A.; Barrachina, T.; Miró, R.; Verdú, G., E-mail: alabarile@iqn.upv.es, E-mail: tbarrachina@iqn.upv.es, E-mail: rmiro@iqn.upv.es, E-mail: gverdu@iqn.upv.es [Institute for Industrial, Radiophysical and Environmental Safety - ISIRYM, Valencia (Spain); Pereira, C., E-mail: claubia@nuclear.ufmg.br [Universidade Federal de Minas Gerais (UFMG), Belo Horizonte, MG (Brazil). Departamento de Engenharia Nuclear

    2017-07-01

    The use of best-estimate computer codes is one of the greatest concerns in the nuclear industry, especially for licensing analysis. Of paramount importance is the estimation of the uncertainties of the whole system in order to establish safety margins based on highly reliable results. The estimation of these uncertainties should be performed by applying a methodology to propagate the uncertainties from the input parameters and the models implemented in the code to the output parameters. This study employs two different approaches for the Sensitivity Analysis (SA) and Uncertainty Quantification (UQ): the adjoint-based perturbation theory of TSUNAMI-3D and the stochastic sampling technique of SAMPLER/KENO. The cases studied are two models of Light Water Reactors in the framework of the OECD/NEA UAM-LWR benchmark, a Boiling Water Reactor (BWR) and a Pressurized Water Reactor (PWR), both at Hot Full Power (HFP) and Hot Zero Power (HZP) conditions, with and without control rods. This work presents the k_eff results from the different simulations and discusses the comparison of the two methods employed. In particular, it gives a list of the major contributors to the uncertainty of k_eff in terms of microscopic cross sections; their sensitivity coefficients; a comparison between the results of the two modules and with reference values; and statistical information from the stochastic approach, including the probability and statistical confidence reached in the simulations. The reader will find all this information discussed in this paper. (author)

  4. Uncertainty and sensitivity analysis for the modeling of transients with interaction of thermal hydraulics and neutron kinetics

    International Nuclear Information System (INIS)

    Soeren Kliem; Siegfried Mittag; Siegfried Langenbuch

    2005-01-01

    The transition from the application of conservative models to the use of best-estimate models raises the question of the uncertainty of the obtained results. This question becomes especially important if the best-estimate models are to be used for safety analyses in the field of nuclear engineering. Different methodologies have been developed to assess the uncertainty of the calculation results of computer simulation codes. One of them is the methodology developed by Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS), which uses the statistical code package SUSA. In the past, this methodology was applied to the calculation results of the advanced thermal-hydraulic system code ATHLET. In the frame of the recently finished EU FP5 funded research project VALCO, that methodology was extended and successfully applied to different coupled code systems, including the uncertainty analysis for neutronics. These code systems consist of a thermal-hydraulic system code and a 3D neutron kinetic core model. One of the code systems applied was ATHLET coupled with the Rossendorf kinetics code DYN3D. Two real transients at NPPs with VVER-type reactors, documented within the VALCO project, were selected for the analyses. One was the load drop of one of two turbines to house-load level at the Loviisa-1 NPP (VVER-440); the second was a test with the switching-off of one of two main feed water pumps at the VVER-1000 Balakovo-4 NPP. The current paper is dedicated to the different steps of the use and implementation of the GRS methodology for coupled code systems and to the assessment of the results obtained with the DYN3D/ATHLET code. Based on the relevant physical processes in both transients, lists of possible sources of uncertainty were compiled; they are specific to the two transients. Besides control parameters like control rod movement and thermal-hydraulic parameters like secondary side pressure, mass flow rates, pressurizer sprayer and heater

  5. Uncertainty analysis of the SWEPP PAN assay system for glass waste (content codes 440, 441 and 442)

    International Nuclear Information System (INIS)

    Blackwood, L.G.; Harker, Y.D.; Meachum, T.R.; Yoon, W.Y.

    1996-10-01

    INEL is being used as a temporary storage facility for transuranic waste generated by the Nuclear Weapons program at the Rocky Flats Plant. Currently, a large effort is in progress to prepare to ship this waste to WIPP. In order to meet the TRU Waste Characterization Quality Assurance Program Plan nondestructive assay compliance requirements and quality assurance objectives, it is necessary to determine the total uncertainty of the radioassay results produced by the Stored Waste Examination Pilot Plant (SWEPP) Passive-Active Neutron (PAN) radioassay system. This paper discusses a modified statistical sampling and verification approach used to determine the total uncertainty of SWEPP PAN measurements for glass waste (content codes 440, 441, and 442) contained in 208-liter drums. In this approach, the total performance of the SWEPP PAN nondestructive assay system for specifically selected waste conditions is simulated using computer models. A set of 100 cases covering the known conditions exhibited in glass waste was compiled using a combined statistical sampling and factorial experimental design approach. Parameter values assigned in each simulation were derived from reviews of approximately 100 real-time radiography video tapes of RFP glass waste drums, results from previous SWEPP PAN measurements on glass waste drums, and shipping data from RFP, where the glass waste was generated. The data in the 100 selected cases form the multi-parameter input to the simulation model. The reported plutonium masses from the simulation model are compared with the corresponding input masses. From these comparisons, the bias and total uncertainty associated with SWEPP PAN measurements on glass waste drums are estimated. The validity of the simulation approach is verified by comparing simulated output against results from calibration measurements using known plutonium sources and two glass waste calibration drums
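
    The bias and total-uncertainty estimate described above reduces to comparing the masses "reported" by the simulated assay system with the known input masses over the modeled cases. A minimal sketch with invented numbers follows; the bias and scatter values are illustrative assumptions, not SWEPP PAN calibration data.

```python
import numpy as np

# Minimal sketch of the bias / total-uncertainty estimate: compare masses
# "reported" by a simulated assay system with the known input masses over
# the modeled cases. All numbers are invented, not SWEPP PAN data.
rng = np.random.default_rng(11)
n_cases = 100                                  # simulated drum configurations

true_mass = rng.uniform(1.0, 200.0, n_cases)   # g Pu loaded into the model
# simulated response: assumed multiplicative bias plus matrix-driven scatter
reported = true_mass * rng.normal(0.97, 0.12, n_cases)

ratio = reported / true_mass
print(f"estimated bias            : {(ratio.mean() - 1.0) * 100:+.1f} %")
print(f"total relative uncertainty: {ratio.std(ddof=1) * 100:.1f} % (1 sigma)")
```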

  6. Uncertainty analysis of the SWEPP PAN assay system for glass waste (content codes 440, 441 and 442)

    Energy Technology Data Exchange (ETDEWEB)

    Blackwood, L.G.; Harker, Y.D.; Meachum, T.R.; Yoon, W.Y.

    1996-10-01

    INEL is being used as a temporary storage facility for transuranic waste generated by the Nuclear Weapons program at the Rocky Flats Plant. Currently, a large effort is in progress to prepare to ship this waste to WIPP. In order to meet the TRU Waste Characterization Quality Assurance Program Plan nondestructive assay compliance requirements and quality assurance objectives, it is necessary to determine the total uncertainty of the radioassay results produced by the Stored Waste Examination Pilot Plant (SWEPP) Passive-Active Neutron (PAN) radioassay system. This paper discusses a modified statistical sampling and verification approach used to determine the total uncertainty of SWEPP PAN measurements for glass waste (content codes 440, 441, and 442) contained in 208-liter drums. In this approach, the total performance of the SWEPP PAN nondestructive assay system for specifically selected waste conditions is simulated using computer models. A set of 100 cases covering the known conditions exhibited in glass waste was compiled using a combined statistical sampling and factorial experimental design approach. Parameter values assigned in each simulation were derived from reviews of approximately 100 real-time radiography video tapes of RFP glass waste drums, results from previous SWEPP PAN measurements on glass waste drums, and shipping data from RFP, where the glass waste was generated. The data in the 100 selected cases form the multi-parameter input to the simulation model. The reported plutonium masses from the simulation model are compared with the corresponding input masses. From these comparisons, the bias and total uncertainty associated with SWEPP PAN measurements on glass waste drums are estimated. The validity of the simulation approach is verified by comparing simulated output against results from calibration measurements using known plutonium sources and two glass waste calibration drums.

  7. Uncertainty Evaluation of Best Estimate Calculation Results

    International Nuclear Information System (INIS)

    Glaeser, H.

    2006-01-01

    Efforts are underway in Germany to perform analyses using best-estimate computer codes and to include uncertainty evaluation in licensing. The German Reactor Safety Commission (RSK) recently issued a recommendation to perform uncertainty analysis in loss-of-coolant accident (LOCA) safety analyses. A more general requirement is included in a draft revision of the German Nuclear Regulation, which is an activity of the German Ministry of Environment and Reactor Safety (BMU). According to the recommendation of the German RSK, the following deterministic requirements still have to be applied when performing safety analyses for LOCA in licensing: most unfavourable single failure; unavailability due to preventive maintenance; break location; break size and break type; double-ended break, 100 percent through 200 percent; large, medium and small breaks; loss of off-site power; core power (at accident initiation the most unfavourable conditions and values have to be assumed which may occur under normal operation, taking into account the set-points of integral power and power density control; measurement and calibration errors can be considered statistically); and time of fuel cycle. Analysis using best-estimate codes with evaluation of uncertainties is the only way to quantify conservatisms with regard to code models and uncertainties of plant parameters, fuel parameters and decay heat. This is especially the case when approaching licensing limits, e.g. due to power up-rates, higher burn-up and higher enrichment. Broader use of best-estimate analysis is therefore envisaged in the future. Since some deterministic unfavourable assumptions regarding the availability of NPP systems are still used, some conservatism in best-estimate analyses remains. Methods of uncertainty analysis have been developed and applied by the vendor Framatome ANP as well as by GRS in Germany. The GRS development was sponsored by the German Ministry of Economy and Labour (BMWA). (author)

  8. Application of RELAP/SCDAPSIM with integrated uncertainty options to research reactor systems thermal hydraulic analysis

    International Nuclear Information System (INIS)

    Allison, C.M.; Hohorst, J.K.; Perez, M.; Reventos, F.

    2010-01-01

    The RELAP/SCDAPSIM/MOD4.0 code, designed to predict the behavior of reactor systems during normal and accident conditions, is being developed as part of the international SCDAP Development and Training Program (SDTP). RELAP/SCDAPSIM/MOD4.0, which is the first version of RELAP5 completely rewritten to FORTRAN 90/95/2000 standards, uses publicly available RELAP5 and SCDAP models in combination with advanced programming and numerical techniques and other SDTP-member modeling/user options. One such member-developed option is an integrated uncertainty analysis package being developed jointly by the Technical University of Catalonia (UPC) and Innovative Systems Software (ISS). This paper briefly summarizes the features of RELAP/SCDAPSIM/MOD4.0 and the integrated uncertainty analysis package, and then presents an example of how the integrated uncertainty package can be set up and used for a simple pipe flow problem. (author)
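
    The integrated package itself is not described in detail in the abstract; as a generic illustration of input-uncertainty propagation for a simple pipe flow problem, a Monte Carlo sketch follows (all distributions and the Darcy-Weisbach closure are illustrative assumptions, not RELAP/SCDAPSIM internals):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# Uncertain inputs (illustrative distributions, not plant or code data)
friction = rng.normal(0.020, 0.002, n)   # Darcy friction factor [-]
density = rng.normal(998.0, 5.0, n)      # water density [kg/m^3]
velocity = rng.normal(3.0, 0.15, n)      # mean velocity [m/s]
length, diameter = 10.0, 0.05            # fixed pipe geometry [m]

# Darcy-Weisbach pressure drop evaluated once per sampled input set
dp = friction * (length / diameter) * 0.5 * density * velocity ** 2

lo, hi = np.percentile(dp, [2.5, 97.5])
print(f"mean dP = {dp.mean() / 1e3:.1f} kPa, "
      f"95% range = [{lo / 1e3:.1f}, {hi / 1e3:.1f}] kPa")
```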

  9. Uncertainty and sensitivity analysis of fission gas behavior in engineering-scale fuel modeling

    Energy Technology Data Exchange (ETDEWEB)

    Pastore, Giovanni, E-mail: Giovanni.Pastore@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Swiler, L.P., E-mail: LPSwile@sandia.gov [Optimization and Uncertainty Quantification, Sandia National Laboratories, P.O. Box 5800, Albuquerque, NM 87185-1318 (United States); Hales, J.D., E-mail: Jason.Hales@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Novascone, S.R., E-mail: Stephen.Novascone@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Perez, D.M., E-mail: Danielle.Perez@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Spencer, B.W., E-mail: Benjamin.Spencer@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Luzzi, L., E-mail: Lelio.Luzzi@polimi.it [Politecnico di Milano, Department of Energy, Nuclear Engineering Division, via La Masa 34, I-20156 Milano (Italy); Van Uffelen, P., E-mail: Paul.Van-Uffelen@ec.europa.eu [European Commission, Joint Research Centre, Institute for Transuranium Elements, Hermann-von-Helmholtz-Platz 1, D-76344 Karlsruhe (Germany); Williamson, R.L., E-mail: Richard.Williamson@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States)

    2015-01-15

    The role of uncertainties in fission gas behavior calculations as part of engineering-scale nuclear fuel modeling is investigated using the BISON fuel performance code with a recently implemented physics-based model for fission gas release and swelling. Through the integration of BISON with the DAKOTA software, a sensitivity analysis of the results to selected model parameters is carried out based on UO₂ single-pellet simulations covering different power regimes. The parameters are varied within ranges representative of the relative uncertainties and consistent with the information in the open literature. The study leads to an initial quantitative assessment of the uncertainty in fission gas behavior predictions with the parameter characterization presently available. Also, the relative importance of the individual parameters is evaluated. Moreover, a sensitivity analysis is carried out based on simulations of a fuel rod irradiation experiment, pointing out a significant impact of the considered uncertainties on the calculated fission gas release and cladding diametral strain. The results of the study indicate that the commonly accepted deviation between calculated and measured fission gas release by a factor of 2 approximately corresponds to the inherent modeling uncertainty at high fission gas release. Nevertheless, significantly higher deviations may be expected for values around 10% and lower. Implications are discussed in terms of directions of research for the improved modeling of fission gas behavior for engineering purposes.
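
    A sketch of the sampling-based sensitivity analysis described above, using a toy stand-in response rather than the BISON fission gas model (the parameter names, ranges, and response form are hypothetical):

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
n = 500

# Toy stand-in for a fission-gas-release response: three multiplicative
# model parameters sampled over hypothetical uncertainty ranges.
diffusivity = rng.uniform(0.1, 10.0, n)   # intra-granular diffusivity factor
resolution = rng.uniform(0.5, 2.0, n)     # re-solution parameter factor
grain_radius = rng.uniform(0.5, 2.0, n)   # grain radius factor

# Illustrative response form only -- not the BISON model
fgr = 0.15 * np.sqrt(diffusivity) / (resolution * grain_radius)

for name, x in [("diffusivity", diffusivity),
                ("re-solution", resolution),
                ("grain radius", grain_radius)]:
    rho, _ = spearmanr(x, fgr)            # rank correlation as sensitivity
    print(f"{name:12s} Spearman rho = {rho:+.2f}")
```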

  10. Uncertainty and sensitivity analysis in reactivity-initiated accident fuel modeling: synthesis of Organisation for Economic Co-operation and Development (OECD)/Nuclear Energy Agency (NEA) benchmark on reactivity-initiated accident codes Phase II

    Directory of Open Access Journals (Sweden)

    Olivier Marchand

    2018-03-01

    In the framework of the OECD/NEA Working Group on Fuel Safety, a RIA fuel-rod-code benchmark Phase I was organized in 2010–2013. It consisted of four experiments on highly irradiated fuel rodlets tested under different experimental conditions. This benchmark revealed the need to better understand the basic models incorporated in each code for realistic simulation of the complicated integral RIA tests with high-burnup fuel rods. A second phase of the benchmark (Phase II) was thus launched early in 2014 and has been organized in two complementary activities: (1) comparison of the results of different simulations on simplified cases, in order to provide additional bases for understanding the differences in modelling of the phenomena concerned; (2) assessment of the uncertainty of the results. The present paper provides a summary and conclusions of the second activity of the benchmark Phase II, which is based on the input uncertainty propagation methodology. The main conclusion is that uncertainties cannot fully explain the differences between the code predictions. Finally, based on the RIA benchmark Phase-I and Phase-II conclusions, some recommendations are made. Keywords: RIA, Codes Benchmarking, Fuel Modelling, OECD

  11. Phenomenological uncertainty analysis of early containment failure at severe accident of nuclear power plant

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Su Won

    2011-02-15

    The severe accident has inherently large uncertainty due to the wide range of accident conditions, and experiments, validation, and practical application are extremely difficult because of the high temperatures and pressures involved. Although domestic and international research has been carried out, the references used in Korean nuclear plants were foreign data from the 1980s, and safety analysis such as probabilistic safety assessment has not applied the newest methodology. In addition, the containment pressure used to identify the probability of containment failure in Level 2 PSA has been a point value taken from the thermal-hydraulic analysis. In this paper, uncertainty analysis methods for the severe accident phenomena influencing early containment failure were developed, an uncertainty analysis for Korean nuclear plants using the MELCOR code was performed, and the distribution of containment pressure is presented as a result of the uncertainty analysis. Early containment failure was selected among the various containment failure modes because it is an important contributor to the Large Early Release Frequency (LERF), which is used as a representative criterion for decision-making for nuclear power plants. Important phenomena of early containment failure in a severe accident were identified on the basis of previous research, and a seven-step methodology for evaluating the uncertainty was developed. A MELCOR input for severe accident analysis reflecting natural circulation flow was developed, and a station blackout scenario, the representative initiating event for early containment failure, was selected. By reviewing the MELCOR internal models and correlations relevant to the important phenomena of early containment failure, the factors which could affect the uncertainty were identified, and the major factors were selected through sensitivity analysis. In order to determine the total number of MELCOR calculations which can

  12. Uncertainty and Sensitivity Analyses for CFD Codes: an Attempt of a State of the Art on the Basis of the CEA Experience

    International Nuclear Information System (INIS)

    Crecy, Agnes de; Bazin, Pascal

    2013-01-01

    Uncertainty and sensitivity analyses associated with best-estimate calculations have become paramount for licensing processes and are known as BEPU (Best-Estimate Plus Uncertainties) methods. A recent activity such as the BEMUSE benchmark has shown that the present methods are mature enough for system thermal-hydraulics codes, even if issues such as the quantification of the uncertainties of the input parameters, and especially of the physical models, must still be improved. But CFD codes are more and more used for fine 3-D modelling such as, for example, that necessary in dilution or stratification problems. The application of BEPU methods to CFD codes has become an issue that must now be addressed, and that is precisely the goal of this paper. It consists of two main parts. In Chapter 2, the specificities of CFD codes with respect to BEPU methods are listed, with a focus on possible difficulties. In Chapter 3, the studies performed at CEA are described. It is important to note that CEA research in this field is only beginning and must not be viewed as a reference approach. (authors)

  13. A research on verification of the CONTAIN CODE model and the uncertainty reduction method for containment integrity

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jae-Hong; Kim, Moo-Hwan; Bae, Seong-Won; Byun, Sang-Chul [Pohang University of Science and Technology, Pohang (Korea, Republic of)

    1998-03-15

    The final objectives of this study are to establish methods for assessing the integrity of containment building structures and for safety analysis during postulated severe accidents, and to reduce the uncertainty of these methods. To that end, the CONTAIN 1.2 code models for analyzing severe accident phenomena and the heat transfer between the air inside the containment building and its inner walls have been reviewed and analyzed. For the double containment wall adopted for the next-generation nuclear reactor, which differs from previous containment types, the temperature and pressure rise histories were calculated and compared with the results for previous designs.

  14. Sensitivity and Uncertainty Analysis of IAEA CRP HTGR Benchmark Using McCARD

    International Nuclear Information System (INIS)

    Jang, Sang Hoon; Shim, Hyung Jin

    2016-01-01

    The benchmark consists of 4 phases, starting from local standalone modeling (Phase I) up to the safety calculation of the coupled system in a transient situation (Phase IV). As a preliminary study of UAM on HTGRs, this paper covers exercises 1 and 2 of Phase I, which define the unit cell and lattice geometry of the MHTGR-350 (General Atomics). The objective of these exercises is to quantify the uncertainty of the multiplication factor induced by perturbing nuclear data, as well as to analyze specific features of HTGRs such as double heterogeneity and self-shielding treatment. The uncertainty quantification of the IAEA CRP HTGR UAM benchmarks was conducted using the first-order AWP (adjoint-weighted perturbation) method in McCARD, and the uncertainty of the multiplication factor was estimated only for the microscopic cross-section perturbations. To reduce the computation time and avoid memory shortage, the recently implemented uncertainty analysis module for the MC Wielandt calculation was adopted. The covariance data of the cross sections were generated by the NJOY/ERRORR module with ENDF/B-VII.1. The numerical results were compared with those of Serpent for the eigenvalue calculation and with those of the DeCART/MUSAD code system developed by KAERI for the S/U analysis. In the eigenvalue calculation, inconsistencies were found in the results with the ENDF/B-VII.1 cross-section library, and these were traced to the thermal scattering data of graphite. As for the S/U analysis, the McCARD results matched well with DeCART/MUSAD, but showed some discrepancy in ²³⁸U capture regarding the implicit uncertainty.
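
    The cross-section uncertainty propagation referred to above rests on the first-order "sandwich" rule; a minimal sketch with illustrative two-group numbers (invented values, not McCARD output):

```python
import numpy as np

# Relative sensitivities of k-eff to two illustrative cross sections,
# S_i = (dk/k) / (dsigma_i/sigma_i) -- invented values, not McCARD output
S = np.array([-0.12, -0.35])

# Relative covariance matrix of the same cross sections (invented)
C = np.array([[4.0e-4, 1.0e-4],
              [1.0e-4, 9.0e-4]])

# First-order "sandwich" rule: (dk/k)^2 = S^T C S
rel_var = S @ C @ S
print(f"k-eff relative uncertainty = {100 * np.sqrt(rel_var):.3f} %")
```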

  15. Introduction of thermal-hydraulic analysis code and system analysis code for HTGR

    International Nuclear Information System (INIS)

    Tanaka, Mitsuhiro; Izaki, Makoto; Koike, Hiroyuki; Tokumitsu, Masashi

    1984-01-01

    Kawasaki Heavy Industries Ltd. has advanced the development and systematization of analysis codes for heat-transferring flow and control characteristics, with HTGR plants as the main object. SALE-3D, which can analyze a complex system, was developed to model the flow when shock waves propagate into heating tubes, and it is reported in this paper. Concerning the analysis code for control characteristics, the method of sensitivity analysis in a topological space is reported, including an example of application. The flow analysis code SALE-3D analyzes the flow of compressible viscous fluid in a three-dimensional system over the velocity range from the incompressibility limit to supersonic velocity. The fundamental equations and algorithm of SALE-3D, the calculation of cell volume, the plotting of perspective drawings, and the analysis of the three-dimensional behavior of shock waves propagating in heating tubes after a rupture accident are described. The method of sensitivity analysis in a topological space was added to the analysis code for control characteristics, and blowdown phenomena were analyzed by its application. (Kako, I.)

  16. Approach to uncertainty in risk analysis

    Energy Technology Data Exchange (ETDEWEB)

    Rish, W.R.

    1988-08-01

    In the Fall of 1985, EPA's Office of Radiation Programs (ORP) initiated a project to develop a formal approach to dealing with uncertainties encountered when estimating and evaluating risks to human health and the environment. Based on a literature review of modeling uncertainty, interviews with ORP technical and management staff, and input from experts on uncertainty analysis, a comprehensive approach was developed. This approach recognizes by design the constraints on budget, time, manpower, expertise, and availability of information often encountered in "real world" modeling. It is based on the observation that in practice risk modeling is usually done to support a decision process. As such, the approach focuses on how to frame a given risk modeling problem, how to use that framing to select an appropriate mixture of uncertainty analysis techniques, and how to integrate the techniques into an uncertainty assessment that effectively communicates important information and insight to decision-makers. The approach is presented in this report. Practical guidance on characterizing and analyzing uncertainties about model form and quantities, and on effectively communicating uncertainty analysis results, is included. Examples from actual applications are presented.

  17. Approach to uncertainty in risk analysis

    International Nuclear Information System (INIS)

    Rish, W.R.

    1988-08-01

    In the Fall of 1985, EPA's Office of Radiation Programs (ORP) initiated a project to develop a formal approach to dealing with uncertainties encountered when estimating and evaluating risks to human health and the environment. Based on a literature review of modeling uncertainty, interviews with ORP technical and management staff, and input from experts on uncertainty analysis, a comprehensive approach was developed. This approach recognizes by design the constraints on budget, time, manpower, expertise, and availability of information often encountered in "real world" modeling. It is based on the observation that in practice risk modeling is usually done to support a decision process. As such, the approach focuses on how to frame a given risk modeling problem, how to use that framing to select an appropriate mixture of uncertainty analysis techniques, and how to integrate the techniques into an uncertainty assessment that effectively communicates important information and insight to decision-makers. The approach is presented in this report. Practical guidance on characterizing and analyzing uncertainties about model form and quantities, and on effectively communicating uncertainty analysis results, is included. Examples from actual applications are presented.

  18. Uncertainty Evaluation for SMART Synthesized Power Distribution

    International Nuclear Information System (INIS)

    Cho, J. Y.; Song, J. S.; Lee, C. C.; Park, S. Y.; Kim, K. Y.; Lee, K. H.

    2010-07-01

    This report performs an uncertainty analysis for the SMART synthesis power distribution generated by the SSUN (SMART core SUpporting system coupled by Nuclear design code) code. SSUN runs coupled with the MASTER neutronics code and generates the core 3-D synthesis power distribution by using DPCM3D. The MASTER code provides the DPCM3D constants to the SSUN code for the current core state. The uncertainties evaluated in this report take the form of 95%/95% probability/confidence one-sided tolerance limits and can be used in conjunction with Technical Specification limits on these quantities to establish appropriate LCO (Limiting Conditions of Operation) and LSSS (Limiting Safety System Settings) limits. This report is applicable to the SMART nuclear reactor using fixed rhodium detector systems. The unknown true power distribution must be given for the uncertainty evaluation of the synthesis power distribution; this report therefore produces virtual distributions for the true power distribution by imposing the CASMO-3/MASTER uncertainty on the MASTER power distribution. Detector signals are generated from these virtual distributions, and the DPCM3D constants come from the MASTER power distribution. The SSUN code synthesizes the core 3-D power distribution by using these detector signals and the DPCM3D constants. The uncertainty evaluation procedure for the synthesis power distribution is summarized as follows: (1) generation of the 3-D power distribution by MASTER -> determination of the DPCM3D constants; (2) generation of a virtual power distribution (assumed to be the true power distribution) -> generation of detector signals; (3) generation of the synthesis power distribution; (4) uncertainty evaluation for the synthesis power distribution. A chi-square normality test rejects the hypothesis of a normal distribution for the synthesis power error distribution. Therefore, the Kruskal-Wallis test and non-parametric statistics are used for data pooling and the tolerance limits. The
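
    A sketch of the two statistical steps named above, a Kruskal-Wallis pooling test and a non-parametric one-sided 95%/95% tolerance limit from order statistics, using synthetic error samples (all data here are hypothetical, not SMART results):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Hypothetical synthesis-power error samples from three core conditions
g1 = rng.normal(0.0, 1.0, 60)
g2 = rng.normal(0.0, 1.1, 60)
g3 = rng.normal(0.1, 1.0, 60)

# Kruskal-Wallis test: may the three groups be pooled into one sample?
h, p = stats.kruskal(g1, g2, g3)
print(f"Kruskal-Wallis H = {h:.2f}, p = {p:.3f} (pool only if p is large)")

errors = np.sort(np.concatenate([g1, g2, g3]))
n = errors.size   # must be at least 59 for a 95/95 one-sided limit to exist

# Non-parametric one-sided 95%/95% upper tolerance limit: smallest order
# statistic of rank r with P(Binomial(n, 0.95) <= r - 1) >= 0.95
r = int(stats.binom.ppf(0.95, n, 0.95)) + 1
print(f"95/95 upper tolerance limit from {n} samples: {errors[r - 1]:.2f}")
```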

  19. Effects of uncertainties of experimental data in the benchmarking of a computer code

    International Nuclear Information System (INIS)

    Meulemeester, E. de; Bouffioux, P.; Demeester, J.

    1980-01-01

    Fuel rod performance modelling is sometimes approached in a purely academic way. The experience of the COMETHE code development since 1967 has clearly shown that benchmarking is the most important part of modelling development. Unfortunately, it requires well-characterized data. Although the two examples presented here were not intended for benchmarking, as the COMETHE calculations were only performed for an interpretation of the results, they illustrate the effects of a lack of fuel characterization and of power history uncertainties

  20. Technology relevance of the 'uncertainty analysis in modelling' project for nuclear reactor safety

    International Nuclear Information System (INIS)

    D'Auria, F.; Langenbuch, S.; Royer, E.; Del Nevo, A.; Parisi, C.; Petruzzi, A.

    2007-01-01

    The OECD/NEA Nuclear Science Committee (NSC) endorsed the setting up of an Expert Group on Uncertainty Analysis in Modelling (UAM) in June 2006. This Expert Group reports to the Working Party on Scientific issues in Reactor Systems (WPRS) and because it addresses multi-scale / multi-physics aspects of uncertainty analysis, it will work in close co-ordination with the benchmark groups on coupled neutronics-thermal-hydraulics and on coupled core-plant problems, and the CSNI Group on Analysis and Management of Accidents (GAMA). The NEA/NSC has endorsed that this activity be undertaken with Prof. K. Ivanov from the Pennsylvania State University (PSU) as the main coordinator and host with the assistance of the Scientific Board. The objective of the proposed work is to define, coordinate, conduct, and report an international benchmark for uncertainty analysis in best-estimate coupled code calculations for design, operation, and safety analysis of LWRs entitled 'OECD UAM LWR Benchmark'. At the First Benchmark Workshop (UAM-1) held from 10 to 11 May 2007 at the OECD/NEA, one action concerned the forming of a sub-group, led by F. D'Auria, member of CSNI, responsible for defining the objectives, the impact and benefit of the UAM for safety and licensing. This report is the result of this action by the subgroup. (authors)

  1. Report on the uncertainty methods study

    International Nuclear Information System (INIS)

    1998-06-01

    The Uncertainty Methods Study (UMS) Group, following a mandate from CSNI, has compared five methods for calculating the uncertainty in the predictions of advanced 'best estimate' thermal-hydraulic codes: the Pisa method (based on extrapolation from integral experiments) and four methods identifying and combining input uncertainties. Three of these, the GRS, IPSN and ENUSA methods, use subjective probability distributions, and one, the AEAT method, performs a bounding analysis. Each method has been used to calculate the uncertainty in specified parameters for the LSTF SB-CL-18 5% cold leg small break LOCA experiment in the ROSA-IV Large Scale Test Facility (LSTF). The uncertainty analysis was conducted essentially blind and the participants did not use experimental measurements from the test as input apart from initial and boundary conditions. Participants calculated uncertainty ranges for experimental parameters including pressurizer pressure, primary circuit inventory and clad temperature (at a specified position) as functions of time

  2. The use of Monte-Carlo simulation and order statistics for uncertainty analysis of a LBLOCA transient (LOFT-L2-5)

    International Nuclear Information System (INIS)

    Chojnacki, E.; Benoit, J.P.

    2007-01-01

    Best-estimate computer codes are increasingly used in the nuclear industry for accident management procedures and are planned to be used for licensing procedures. Contrary to conservative codes, which are supposed to give penalizing results, best-estimate codes attempt to calculate accidental transients in a realistic way. It therefore becomes of prime importance, in particular for a technical organization such as IRSN, in charge of safety assessment, to know the uncertainty in the results of such codes. Thus, CSNI sponsored a few years ago (published in 1998) the Uncertainty Methods Study (UMS) programme on uncertainty methodologies, using a SBLOCA transient (LSTF-CL-18), and is now supporting the BEMUSE programme for a LBLOCA transient (LOFT-L2-5). The large majority of BEMUSE participants (9 out of 10) use uncertainty methodologies based on probabilistic modelling, and all of them use Monte-Carlo simulations to propagate the uncertainties through their computer codes. Also, all of the 'probabilistic participants' intend to use order statistics to determine the sample size of the Monte-Carlo simulation and to derive the uncertainty ranges associated with their computer calculations. The first aim of this paper is to recall the advantages, and also the assumptions, of probabilistic modelling and more specifically of order statistics (such as Wilks' formula) in uncertainty methodologies. Indeed, Monte-Carlo methods provide flexible and extremely powerful techniques for solving many of the uncertainty propagation problems encountered in nuclear safety analysis. However, it is important to keep in mind that probabilistic methods are data intensive; they cannot produce robust results unless a considerable body of information has been collected. A main interest of the use of order statistics is to allow an unlimited number of uncertain parameters to be taken into account and, from a restricted number of code calculations, to provide statistical
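
    A minimal sketch of the order-statistics sizing the abstract refers to: the smallest number of code runs for a one-sided Wilks tolerance bound (the standard formula, not IRSN's particular implementation):

```python
from scipy.stats import binom

def wilks_n(beta: float, gamma: float, order: int = 1) -> int:
    """Smallest number of runs n such that the order-th largest result is a
    one-sided upper bound of the beta quantile with confidence gamma."""
    n = order
    # Coverage = P(at least `order` of the n samples fall above the quantile),
    # i.e. P(Binomial(n, beta) <= n - order); increase n until it reaches gamma.
    while binom.cdf(n - order, n, beta) < gamma:
        n += 1
    return n

print(wilks_n(0.95, 0.95, order=1))   # 59 runs for the classic 95%/95% bound
print(wilks_n(0.95, 0.95, order=2))   # 93 runs when the 2nd largest is used
```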

  3. Sample application of sensitivity/uncertainty analysis techniques to a groundwater transport problem. National Low-Level Waste Management Program

    International Nuclear Information System (INIS)

    Seitz, R.R.; Rood, A.S.; Harris, G.A.; Maheras, S.J.; Kotecki, M.

    1991-06-01

    The primary objective of this document is to provide sample applications of selected sensitivity and uncertainty analysis techniques within the context of the radiological performance assessment process. These applications were drawn from the companion document Guidelines for Sensitivity and Uncertainty Analyses of Low-Level Radioactive Waste Performance Assessment Computer Codes (S. Maheras and M. Kotecki, DOE/LLW-100, 1990). Three techniques are illustrated in this document: one-factor-at-a-time (OFAT) analysis, fractional factorial design, and Latin hypercube sampling. The report also illustrates the differences in sensitivity and uncertainty analysis at the early and later stages of the performance assessment process, and potential pitfalls that can be encountered when applying the techniques. The emphasis is on the application of the techniques as opposed to the actual results, since the results are hypothetical and are not based on site-specific conditions
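
    A sketch of how a Latin hypercube design can be generated in practice (the variable names and ranges are hypothetical, not those of the report):

```python
from scipy.stats import qmc   # SciPy >= 1.7

# Latin hypercube design for three groundwater-transport-like inputs
sampler = qmc.LatinHypercube(d=3, seed=0)
unit = sampler.random(n=20)          # 20 points in the unit cube

# Scale to hypothetical physical ranges: hydraulic conductivity [m/d],
# porosity [-], and distribution coefficient Kd [mL/g]
lower = [0.01, 0.05, 0.1]
upper = [10.0, 0.40, 100.0]
design = qmc.scale(unit, lower, upper)

# Each row is one code run; every input range is stratified into 20
# intervals and each interval is sampled exactly once per variable.
print(design[:3])
```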

  4. IAEA Coordinated Research Project on HTGR Reactor Physics, Thermal-hydraulics and Depletion Uncertainty Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Strydom, Gerhard [Idaho National Lab. (INL), Idaho Falls, ID (United States); Bostelmann, F. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-09-01

    The continued development of High Temperature Gas Cooled Reactors (HTGRs) requires verification of HTGR design and safety features with reliable high fidelity physics models and robust, efficient, and accurate codes. The predictive capability of coupled neutronics/thermal-hydraulics and depletion simulations for reactor design and safety analysis can be assessed with sensitivity analysis (SA) and uncertainty analysis (UA) methods. Uncertainty originates from errors in physical data, manufacturing uncertainties, and modelling and computational algorithms. (The interested reader is referred to the large body of published SA and UA literature for a more complete overview of the various types of uncertainties, methodologies and results obtained). SA is helpful for ranking the various sources of uncertainty and error in the results of core analyses. SA and UA are required to address cost, safety, and licensing needs and should be applied to all aspects of reactor multi-physics simulation. SA and UA can guide experimental, modelling, and algorithm research and development. Current SA and UA rely either on stochastic sampling methods or on derivative-based approaches such as generalized perturbation theory to obtain sensitivity coefficients. Neither approach addresses all needs. In order to benefit from recent advances in modelling and simulation and the availability of new covariance data (nuclear data uncertainties), extensive sensitivity and uncertainty studies are needed for quantification of the impact of different sources of uncertainties on the design and safety parameters of HTGRs. Only a parallel effort in advanced simulation and in nuclear data improvement will be able to provide designers with more robust and well validated calculation tools to meet design target accuracies. In February 2009, the Technical Working Group on Gas-Cooled Reactors (TWG-GCR) of the International Atomic Energy Agency (IAEA) recommended that the proposed Coordinated Research Program (CRP) on

  5. DAKOTA : a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Eldred, Michael Scott; Vigil, Dena M.; Dalbey, Keith R.; Bohnhoff, William J.; Adams, Brian M.; Swiler, Laura Painton; Lefantzi, Sophia (Sandia National Laboratories, Livermore, CA); Hough, Patricia Diane (Sandia National Laboratories, Livermore, CA); Eddy, John P.

    2011-12-01

    The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a theoretical manual for selected algorithms implemented within the DAKOTA software. It is not intended as a comprehensive theoretical treatment, since a number of existing texts cover general optimization theory, statistical analysis, and other introductory topics. Rather, this manual is intended to summarize a set of DAKOTA-related research publications in the areas of surrogate-based optimization, uncertainty quantification, and optimization under uncertainty that provide the foundation for many of DAKOTA's iterative analysis capabilities.

  6. Representation of analysis results involving aleatory and epistemic uncertainty.

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Jay Dean (ProStat, Mesa, AZ); Helton, Jon Craig (Arizona State University, Tempe, AZ); Oberkampf, William Louis; Sallaberry, Cedric J.

    2008-08-01

    Procedures are described for the representation of results in analyses that involve both aleatory uncertainty and epistemic uncertainty, with aleatory uncertainty deriving from an inherent randomness in the behavior of the system under study and epistemic uncertainty deriving from a lack of knowledge about the appropriate values to use for quantities that are assumed to have fixed but poorly known values in the context of a specific study. Aleatory uncertainty is usually represented with probability and leads to cumulative distribution functions (CDFs) or complementary cumulative distribution functions (CCDFs) for analysis results of interest. Several mathematical structures are available for the representation of epistemic uncertainty, including interval analysis, possibility theory, evidence theory and probability theory. In the presence of epistemic uncertainty, there is not a single CDF or CCDF for a given analysis result. Rather, there is a family of CDFs and a corresponding family of CCDFs that derive from epistemic uncertainty and have an uncertainty structure that derives from the particular uncertainty structure (i.e., interval analysis, possibility theory, evidence theory, probability theory) used to represent epistemic uncertainty. Graphical formats for the representation of epistemic uncertainty in families of CDFs and CCDFs are investigated and presented for the indicated characterizations of epistemic uncertainty.
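
    A minimal sketch of the nested-loop construction behind such families of CDFs, with an exponential toy model standing in for the analysis (all distributions are illustrative assumptions):

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(3)

# Outer (epistemic) loop: a fixed-but-poorly-known rate constant sampled
# from its state-of-knowledge distribution
for lam in rng.uniform(0.5, 2.0, size=20):
    # Inner (aleatory) loop: inherent randomness for this fixed value
    results = rng.exponential(1.0 / lam, size=1000)
    x = np.sort(results)
    plt.step(x, np.arange(1, x.size + 1) / x.size, color="gray", alpha=0.5)

plt.xlabel("analysis result")
plt.ylabel("cumulative probability")
plt.title("Family of CDFs: one curve per epistemic sample")
plt.show()
```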

  7. Measurement uncertainty analysis techniques applied to PV performance measurements

    International Nuclear Information System (INIS)

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; and demonstrates that quality assurance and quality control measures have been accomplished. Valid data are defined here as data having known and documented paths of origin (including theory), measurements, traceability to measurement standards, computations, and uncertainty analysis of results
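
    A sketch of one common way such measurement uncertainties are combined, an ASME-style root-sum-square of systematic and random components (the budget values are hypothetical, not from the presentation):

```python
import math

# Hypothetical uncertainty budget for a PV module power measurement:
# systematic (bias) limits B and random standard deviations S, in percent
B = [0.8, 0.5, 0.3]   # calibration, spectral mismatch, temperature correction
S = [0.2, 0.15]       # repeatability, data-acquisition noise
t95 = 2.0             # ~95% coverage factor for large samples

b_total = math.sqrt(sum(b ** 2 for b in B))          # combined systematic part
s_total = math.sqrt(sum(s ** 2 for s in S))          # combined random part
u95 = math.sqrt(b_total ** 2 + (t95 * s_total) ** 2)

print(f"U95 = {u95:.2f} % of measured power")
```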

  8. Measurement uncertainty analysis techniques applied to PV performance measurements

    Energy Technology Data Exchange (ETDEWEB)

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; and demonstrates that quality assurance and quality control measures have been accomplished. Valid data are defined here as data having known and documented paths of origin (including theory), measurements, traceability to measurement standards, computations, and uncertainty analysis of results.

  9. User's manual of SECOM2: a computer code for seismic system reliability analysis

    International Nuclear Information System (INIS)

    Uchiyama, Tomoaki; Oikawa, Tetsukuni; Kondo, Masaaki; Tamura, Kazuo

    2002-03-01

    This report is the user's manual of the seismic system reliability analysis code SECOM2 (Seismic Core Melt Frequency Evaluation Code Ver.2), developed at the Japan Atomic Energy Research Institute for systems reliability analysis, which is one of the tasks of seismic probabilistic safety assessment (PSA) of nuclear power plants (NPPs). The SECOM2 code has many functions, such as: calculation of component failure probabilities based on the response factor method; extraction of minimal cut sets (MCSs); calculation of conditional system failure probabilities for given seismic motion levels at the site of an NPP; calculation of accident sequence frequencies and the core damage frequency (CDF) with use of the seismic hazard curve; importance analysis using various indicators; uncertainty analysis; calculation of the CDF taking into account the effect of the correlations of responses and capacities of components; and efficient sensitivity analysis by changing parameters on responses and capacities of components. These analyses require as inputs the fault tree (FT) representing the occurrence conditions of system failures and core damage, information about the response and capacity of components, and the seismic hazard curve for the NPP site. This report presents the models and methods applied in the SECOM2 code and how to use those functions. (author)
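
    SECOM2's internal models are not reproduced here; as a generic sketch of the conditional-failure-probability step such seismic PSA codes perform, a lognormal fragility convolved with an illustrative hazard curve (all parameter values are assumptions):

```python
import numpy as np
from scipy.stats import norm

def fragility(a, a_med, beta):
    """Conditional failure probability at ground acceleration a for a
    lognormal fragility (median capacity a_med, log-standard deviation beta)."""
    return norm.cdf(np.log(a / a_med) / beta)

print(fragility(np.array([0.1, 0.3, 0.6, 1.0]), a_med=0.6, beta=0.4))

# Convolution with an illustrative hazard curve H(a) = 1e-3 * (a/0.1)**-2.5
# (annual exceedance frequency) gives an illustrative failure frequency.
a = np.linspace(0.05, 2.0, 400)
neg_dH_da = 2.5e-2 * (a / 0.1) ** -3.5          # -dH/da for the power law
freq = np.sum(fragility(a, 0.6, 0.4) * neg_dH_da) * (a[1] - a[0])
print(f"illustrative failure frequency ~ {freq:.2e} /yr")
```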

  10. Sensitivity functions for uncertainty analysis: Sensitivity and uncertainty analysis of reactor performance parameters

    International Nuclear Information System (INIS)

    Greenspan, E.

    1982-01-01

    This chapter presents the mathematical basis for sensitivity functions, discusses their physical meaning and the information they contain, and clarifies a number of issues concerning their application, including the definition of group sensitivities, the selection of sensitivity functions to be included in the analysis, and the limitations of sensitivity theory. It examines the theoretical foundation; criticality reset sensitivities; group sensitivities and uncertainties; the selection of sensitivities included in the analysis; and other uses and limitations of sensitivity functions. It gives the theoretical formulation of sensitivity functions pertaining to "as-built" designs for performance parameters in the form of ratios of linear flux functionals (such as reaction-rate ratios), linear adjoint functionals, bilinear functions (such as reactivity worth ratios), and for reactor reactivity. It offers a consistent procedure for reducing energy-dependent or fine-group sensitivities and uncertainties to broad-group sensitivities and uncertainties. It provides illustrations of sensitivity functions as well as references to available compilations of such functions and of total sensitivities. Finally, it indicates the limitations of sensitivity theory originating from the fact that this theory is based on first-order perturbation theory

  11. Uncertainty analysis

    International Nuclear Information System (INIS)

    Thomas, R.E.

    1982-03-01

    An evaluation is made of the suitability of analytical and statistical sampling methods for making uncertainty analyses. The adjoint method is found to be well-suited for obtaining sensitivity coefficients for computer programs involving large numbers of equations and input parameters. For this purpose the Latin Hypercube Sampling method is found to be inferior to conventional experimental designs. The Latin hypercube method can be used to estimate output probability density functions, but requires supplementary rank transformations followed by stepwise regression to obtain uncertainty information on individual input parameters. A simple Cork and Bottle problem is used to illustrate the efficiency of the adjoint method relative to certain statistical sampling methods. For linear models of the form Ax=b it is shown that a complete adjoint sensitivity analysis can be made without formulating and solving the adjoint problem. This can be done either by using a special type of statistical sampling or by reformulating the primal problem and using suitable linear programming software
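
    A minimal sketch of the point made above for linear models Ax = b: one adjoint solve yields all source sensitivities of a response R = cᵀx (the matrix and vectors are illustrative):

```python
import numpy as np

# Linear model A x = b with scalar response R = c^T x
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
b = np.array([1.0, 2.0, 0.5])
c = np.array([0.0, 1.0, 1.0])

x = np.linalg.solve(A, b)
R = c @ x

# One adjoint solve A^T lam = c gives every source sensitivity at once:
# dR/db_i = lam_i, instead of n separate perturbed forward solves.
lam = np.linalg.solve(A.T, c)
print("R =", R, " dR/db =", lam)

# Matrix-element sensitivities follow for free: dR/dA_ij = -lam_i * x_j
print("dR/dA[0,1] =", (-np.outer(lam, x))[0, 1])
```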

  12. Uncertainty in river discharge observations: a quantitative analysis

    Directory of Open Access Journals (Sweden)

    G. Di Baldassarre

    2009-06-01

    This study proposes a framework for analysing and quantifying the uncertainty of river flow data. Such uncertainty is often considered to be negligible with respect to other approximations affecting hydrological studies. Actually, given that river discharge data are usually obtained by means of the so-called rating curve method, a number of different sources of error affect the derived observations. These include: errors in the measurements of river stage and discharge used to parameterise the rating curve, interpolation and extrapolation error of the rating curve, presence of unsteady flow conditions, and seasonal variations of the state of the vegetation (i.e. roughness). This study aims at analysing these sources of uncertainty using an original methodology. The novelty of the proposed framework lies in the estimation of rating curve uncertainty, which is based on hydraulic simulations. These are carried out on a reach of the Po River (Italy) by means of a one-dimensional (1-D) hydraulic model code (HEC-RAS). The results of the study show that errors in river flow data are indeed far from negligible.
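
    A sketch of the rating-curve idea, fitting the classical power law Q = a(h - h0)^b to gaugings and propagating the parameter uncertainty into an extrapolated flood discharge (the data are invented, not Po River gaugings):

```python
import numpy as np
from scipy.optimize import curve_fit

# Invented stage-discharge gaugings (not Po River data)
stage = np.array([0.8, 1.1, 1.5, 2.0, 2.6, 3.3, 4.1])        # m
discharge = np.array([55, 110, 220, 410, 700, 1100, 1650.0])  # m^3/s

def rating(h, a, h0, b):
    return a * (h - h0) ** b          # classical power-law rating curve

p, cov = curve_fit(rating, stage, discharge, p0=[100.0, 0.2, 2.0],
                   bounds=([1.0, -1.0, 1.0], [5000.0, 0.7, 3.0]))

# Propagate parameter uncertainty into an extrapolated flood discharge
h_flood = 5.5
samples = np.random.default_rng(0).multivariate_normal(p, cov, 2000)
q = np.array([rating(h_flood, *s) for s in samples])
print(f"Q({h_flood} m) ~ {q.mean():.0f} m^3/s, 90% interval "
      f"[{np.percentile(q, 5):.0f}, {np.percentile(q, 95):.0f}]")
```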

  13. Assessing scenario and parametric uncertainties in risk analysis: a model uncertainty audit

    International Nuclear Information System (INIS)

    Tarantola, S.; Saltelli, A.; Draper, D.

    1999-01-01

    In the present study, a model audit is performed on a computational model used for predicting maximum radiological doses to humans in the field of nuclear waste disposal. Global uncertainty and sensitivity analyses are employed to assess output uncertainty and to quantify the contribution of parametric and scenario uncertainties to the model output. These tools are of fundamental importance for risk analysis and decision-making purposes

  14. PUFF-III: A Code for Processing ENDF Uncertainty Data Into Multigroup Covariance Matrices

    International Nuclear Information System (INIS)

    Dunn, M.E.

    2000-01-01

    PUFF-III is an extension of the previous PUFF-II code that was developed in the 1970s and early 1980s. The PUFF codes process the Evaluated Nuclear Data File (ENDF) covariance data and generate multigroup covariance matrices on a user-specified energy grid structure. Unlike its predecessor, PUFF-III can process the new ENDF/B-VI data formats. In particular, PUFF-III has the capability to process the spontaneous fission covariances for fission neutron multiplicity. With regard to the covariance data in File 33 of the ENDF system, PUFF-III has the capability to process short-range variance formats, as well as the lumped reaction covariance data formats that were introduced in ENDF/B-V. In addition to the new ENDF formats, a new directory feature is now available that allows the user to obtain a detailed directory of the uncertainty information in the data files without visually inspecting the ENDF data. Following the correlation matrix calculation, PUFF-III also evaluates the eigenvalues of each correlation matrix and tests each matrix for positive definiteness. Additional new features are discussed in the manual. PUFF-III has been developed for implementation in the AMPX code system, and several modifications were incorporated to improve memory allocation tasks and input/output operations. Consequently, the resulting code has a structure that is similar to other modules in the AMPX code system. With the release of PUFF-III, a new and improved covariance processing code is available to process ENDF covariance formats through Version VI
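
    A minimal sketch of the eigenvalue-based positive-definiteness test mentioned above, applied to an illustrative correlation matrix (not PUFF-III code or ENDF data):

```python
import numpy as np

# Illustrative 3-group correlation matrix (not actual ENDF covariance data)
corr = np.array([[1.00, 0.65, 0.20],
                 [0.65, 1.00, 0.55],
                 [0.20, 0.55, 1.00]])

eig = np.linalg.eigvalsh(corr)       # eigenvalues of the symmetric matrix
print("eigenvalues:", eig)
print("positive definite" if np.all(eig > 0.0) else
      "not positive definite: repair (e.g., clip negative eigenvalues) needed")
```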

  15. Fukushima Daiichi Unit 1 Uncertainty Analysis-Exploration of Core Melt Progression Uncertain Parameters-Volume II.

    Energy Technology Data Exchange (ETDEWEB)

    Denman, Matthew R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Brooks, Dusty Marie [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

    Sandia National Laboratories (SNL) has conducted an uncertainty analysis (UA) on the Fukushima Daiichi Unit 1 (1F1) accident progression with the MELCOR code. Volume I of the 1F1 UA discusses the physical modeling details and time-history results of the UA. Volume II of the 1F1 UA discusses the statistical viewpoint. The model used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and the Nuclear Regulatory Commission (NRC). The goal of this work was to perform a focused evaluation of uncertainty in core damage progression behavior and its effect on key figures-of-merit (e.g., hydrogen production, fraction of intact fuel, vessel lower head failure) and in doing so assess the applicability of traditional sensitivity analysis techniques.

  16. BEMUSE Phase III Report - Uncertainty and Sensitivity Analysis of the LOFT L2-5 Test

    International Nuclear Information System (INIS)

    Bazin, P.; Crecy, A. de; Glaeser, H.; Skorek, T.; Joucla, J.; Probst, P.; Chung, B.; Oh, D.Y.; Kyncl, M.; Pernica, R.; Macek, J.; Meca, R.; Macian, R.; D'Auria, F.; Petruzzi, A.; Perez, M.; Reventos, F.; Fujioka, K.

    2007-02-01

    This report summarises the contributions of the ten participants to Phase 3 of BEMUSE: uncertainty and sensitivity analyses of the LOFT L2-5 experiment, a Large-Break Loss-of-Coolant Accident (LB-LOCA). For this phase, precise step-by-step requirements were provided to the participants. Four main parts are defined: 1. List and uncertainties of the uncertain input parameters. 2. Uncertainty analysis results. 3. Sensitivity analysis results. 4. Improved methods, assessment of the methods (optional). The 5% and 95% percentiles have to be estimated for 6 output parameters, which are of two kinds: 1. scalar output parameters (first peak cladding temperature (PCT), second peak cladding temperature, time of accumulator injection, time of complete quenching); 2. time-trend output parameters (maximum cladding temperature, upper plenum pressure). The main lessons learnt from Phase 3 of the BEMUSE programme are the following: - For uncertainty analysis, all the participants use a probabilistic method associated with the use of Wilks' formula, except for UNIPI with its CIAU method (Code with the capability of Internal Assessment of Uncertainty). Use of both methods has been successfully mastered. - Compared with the experiment, the results of the uncertainty analysis are good on the whole. For example, for the cladding temperature-type output parameters (first PCT, second PCT, time of complete quenching, maximum cladding temperature), 8 participants out of 10 find upper and lower bounds which envelop the experimental data. - Sensitivity analysis has been successfully performed by all the participants using the probabilistic method. All the influence measures used include the range of variation of the input parameters. Synthesis tables of the most influential phenomena and parameters have been drawn up, and participants will be able to use them for the continuation of the BEMUSE programme

  17. Advanced LOCA code uncertainty assessment

    International Nuclear Information System (INIS)

    Wickett, A.J.; Neill, A.P.

    1990-11-01

    This report describes a pilot study that identified, quantified and combined uncertainties for the LOBI BL-02 3% small break test. A "dials" version of TRAC-PF1/MOD1, called TRAC-F, was used. (author)

  18. Uncertainties in Nuclear Proliferation Modeling

    International Nuclear Information System (INIS)

    Kim, Chul Min; Yim, Man-Sung; Park, Hyeon Seok

    2015-01-01

    There have been various efforts in the research community to understand the determinants of nuclear proliferation and to develop quantitative tools to predict nuclear proliferation events. Such systematic approaches have shown the potential to provide warning for the international community to prevent nuclear proliferation activities. However, there is still considerable debate over the robustness of the estimated effects of the determinants and of the projection results. Some studies have shown that several factors can cause uncertainties in previous quantitative nuclear proliferation modeling work. This paper analyzes the uncertainties in past approaches and suggests future work with respect to proliferation history, analysis methods, and variable selection. The research community still lacks knowledge of the sources of uncertainty in current models, and fundamental problems in modeling will remain even if other advanced modeling methods are developed. Before starting to develop sophisticated models based on hypotheses of time-dependent proliferation determinants, using graph theory, etc., it is important to analyze the uncertainty of current models in order to solve the fundamental problems of nuclear proliferation modeling. The uncertainty arising from different codings of proliferation history is small. More serious problems come from the limited analysis methods and from correlation among the variables. Problems in regression analysis and survival analysis cause large uncertainties even when the same dataset is used, which decreases the robustness of the results. Inaccurate variables for nuclear proliferation also increase the uncertainty. To overcome these problems, further quantitative research should focus on analyzing the knowledge offered by qualitative nuclear proliferation studies

  19. Sensitivity and uncertainty analysis for Ignalina NPP confinement in case of loss of coolant accident

    International Nuclear Information System (INIS)

    Urbonavicius, E.; Babilas, E.; Rimkevicius, S.

    2003-01-01

    At present, the best-estimate approach is widely used in the safety analysis of nuclear power plants around the world. The application of such an approach requires estimating the uncertainty of the calculated results. Various methodologies are applied in order to determine the uncertainty with the required accuracy. One of them is the statistical methodology developed at GRS mbH in Germany and integrated into the SUSA tool, which was applied for the sensitivity and uncertainty analysis of the thermal-hydraulic parameters inside the confinement (Accident Localisation System) of the Ignalina NPP with its RBMK-1500 reactor in case of the Maximum Design Basis Accident (break of a 900 mm diameter pipe). Several parameters that could potentially influence the calculated results were selected for the analysis. A set of input data with different initial values of the selected parameters was generated. In order to obtain results with 95 % probability and 95 % confidence, 100 runs were performed with the COCOSYS code developed at GRS mbH. The calculated results were processed with the SUSA tool. The performed analysis showed a rather low dispersion of the results, and only in the initial period of the accident. The analysis also showed that there is no threat to the building structures of the Ignalina NPP confinement in case of the considered accident scenario. (author)

  1. Uncertainty analysis for Ulysses safety evaluation report

    International Nuclear Information System (INIS)

    Frank, M.V.

    1991-01-01

    As part of the effort to review the Ulysses Final Safety Analysis Report and to understand the risk of plutonium release from the Ulysses spacecraft General Purpose Heat Source-Radioisotope Thermal Generator (GPHS-RTG), the Interagency Nuclear Safety Review Panel (INSRP) and the author performed an integrated, quantitative analysis of the uncertainties in the calculated risk of plutonium release from Ulysses. Using state-of-the-art probabilistic risk assessment technology, the uncertainty analysis accounted for both variability and uncertainty in the key parameters of the risk analysis. The results show that INSRP had high confidence that the risk of fatal cancers from potential plutonium release associated with the calculated launch and deployment accident scenarios is low

  2. Uncertainty calculations made easier

    International Nuclear Information System (INIS)

    Hogenbirk, A.

    1994-07-01

    The results are presented of a neutron cross-section sensitivity/uncertainty analysis performed for a complicated 2D model of the NET shielding blanket design inside the ITER torus, surrounded by the cryostat/biological shield as planned for ITER. The calculations were performed with a code system developed at ECN Petten, with which sensitivity/uncertainty calculations become relatively simple. In order to check the deterministic neutron transport calculations (performed with DORT), calculations were also performed with the Monte Carlo code MCNP. Care was taken to model the 2.0 cm wide gaps between two blanket segments, as the neutron flux behind the vacuum vessel is largely determined by neutrons streaming through these gaps. The resulting neutron flux spectra are in excellent agreement up to the end of the cryostat. It is noted that at this position the attenuation of the neutron flux is about 11 orders of magnitude. The uncertainty in the energy-integrated flux at the beginning of the vacuum vessel and at the beginning of the cryostat was determined in the calculations. The uncertainty appears to be strongly dependent on the exact geometry: if the gaps are filled with stainless steel, the neutron spectrum changes strongly, which results in an uncertainty of 70% in the energy-integrated flux at the beginning of the cryostat in the no-gap geometry, compared to an uncertainty of only 5% in the gap geometry. Therefore, it is essential to take the exact geometry into account in sensitivity/uncertainty calculations. Furthermore, this study shows that an improvement of the covariance data is urgently needed in order to obtain reliable estimates of the uncertainties in response parameters in neutron transport calculations. (orig./GL)

  3. Aleatoric and epistemic uncertainties in sampling based nuclear data uncertainty and sensitivity analyses

    International Nuclear Information System (INIS)

    Zwermann, W.; Krzykacz-Hausmann, B.; Gallner, L.; Klein, M.; Pautz, A.; Velkov, K.

    2012-01-01

    Sampling based uncertainty and sensitivity analyses due to epistemic input uncertainties, i.e. to an incomplete knowledge of uncertain input parameters, can be performed with arbitrary application programs to solve the physical problem under consideration. For the description of steady-state particle transport, direct simulations of the microscopic processes with Monte Carlo codes are often used. This introduces an additional source of uncertainty, the aleatoric sampling uncertainty, which is due to the randomness of the simulation process performed by sampling, and which adds to the total combined output sampling uncertainty. So far, this aleatoric part of the uncertainty is minimized by running a sufficiently large number of Monte Carlo histories for each sample calculation, thus making its impact negligible compared to the impact from sampling the epistemic uncertainties. Obviously, this process may cause high computational costs. The present paper shows that in many applications reliable epistemic uncertainty results can also be obtained with substantially lower computational effort by performing and analyzing two appropriately generated series of samples with a much smaller number of Monte Carlo histories each. The method is applied along with the nuclear data uncertainty and sensitivity code package XSUSA in combination with the Monte Carlo transport code KENO-Va to various critical assemblies and a full-scale reactor calculation. It is shown that the proposed method yields output uncertainties and sensitivities equivalent to the traditional approach, with a large reduction of computing time, by factors of the order of 100. (authors)
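
    One way the two-series idea can be realized is to reuse the same epistemic sample in two runs with independent Monte Carlo noise, so that the covariance between the series isolates the epistemic variance; a toy sketch under that assumption (not the XSUSA implementation):

```python
import numpy as np

rng = np.random.default_rng(11)
n = 200

# Epistemic input (e.g., a cross-section scaling factor), one value per sample
theta = rng.normal(1.0, 0.05, n)
exact = 1.2 * theta                   # hypothetical noise-free code response

# Two series that reuse the SAME epistemic samples but carry independent
# Monte Carlo noise, as from runs with few histories each
mc_sigma = 0.04
y1 = exact + rng.normal(0.0, mc_sigma, n)
y2 = exact + rng.normal(0.0, mc_sigma, n)

# The between-series covariance isolates the epistemic variance because
# the aleatoric noise is independent between the two series.
epi_var = np.cov(y1, y2)[0, 1]
tot_var = 0.5 * (y1.var(ddof=1) + y2.var(ddof=1))
print(f"epistemic sigma ~ {np.sqrt(epi_var):.4f}  (true {1.2 * 0.05:.4f})")
print(f"aleatoric sigma ~ {np.sqrt(tot_var - epi_var):.4f}  (true {mc_sigma})")
```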

  4. Perspectives on the development of next generation reactor systems safety analysis codes

    International Nuclear Information System (INIS)

    Zhang, H.

    2015-01-01

    tackle the deficiencies in the existing codes. 3) Software design of the next-generation codes needs to provide the flexibility to add new models when necessary, as well as to allow for embedded uncertainty quantification and for multi-physics coupling with other codes. 4) The next-generation codes need proper verification and validation (V & V) before they can be applied to plant analyses. New approaches need to be developed to verify and validate complex multi-physics models with multiple time and length scales and advanced modeling techniques. 5) The next-generation system analysis codes should be designed to be integrated into probabilistic evaluations to enable a risk-informed safety margin characterization (RISMC) process, in order to optimize plant safety and performance by incorporating plant impacts, aging, and degradation processes into the safety analysis. (author)

  5. Perspectives on the development of next generation reactor systems safety analysis codes

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, H., E-mail: Hongbin.Zhang@inl.gov [Idaho National Laboratory, Idaho Falls, ID (United States)

    2015-07-01

    tackle the deficiencies in the existing codes. 3) Software design of the next-generation codes needs to provide the flexibility to add new models when necessary, as well as to allow for embedded uncertainty quantification and for multi-physics coupling with other codes. 4) The next-generation codes need proper verification and validation (V & V) before they can be applied to plant analyses. New approaches need to be developed to verify and validate complex multi-physics models with multiple time and length scales and advanced modeling techniques. 5) The next-generation system analysis codes should be designed to be integrated into probabilistic evaluations to enable a risk-informed safety margin characterization (RISMC) process, in order to optimize plant safety and performance by incorporating plant impacts, aging, and degradation processes into the safety analysis. (author)

  6. Uncertainty analysis for geologic disposal of radioactive waste

    International Nuclear Information System (INIS)

    Cranwell, R.M.; Helton, J.C.

    1981-01-01

    The incorporation and representation of uncertainty in the analysis of the consequences and risks associated with the geologic disposal of high-level radioactive waste are discussed. Such uncertainty has three primary components: process modeling uncertainty, model input data uncertainty, and scenario uncertainty. The following topics are considered in connection with the preceding components: propagation of uncertainty in the modeling of a disposal site, sampling of input data for models, and uncertainty associated with model output

  7. A new approach and computational algorithm for sensitivity/uncertainty analysis for SED and SAD with applications to beryllium integral experiments

    International Nuclear Information System (INIS)

    Song, P.M.; Youssef, M.Z.; Abdou, M.A.

    1993-01-01

    A new approach for treating the sensitivity and uncertainty in the secondary energy distribution (SED) and the secondary angular distribution (SAD) has been developed, and the existing two-dimensional sensitivity/uncertainty analysis code, FORSS, was expanded to incorporate the new approach. The calculational algorithm was applied to the ⁹Be(n,2n) cross section to study the effect of the current uncertainties in the SED and SAD of neutrons emitted from this reaction on the prediction accuracy of the tritium production rate from ⁶Li (T₆) and ⁷Li (T₇) in an engineering-oriented fusion integral experiment of the US Department of Energy/Japan Atomic Energy Research Institute Collaborative Program on Fusion Neutronics, in which beryllium was used as a neutron multiplier. In addition, the analysis was extended to include the uncertainties in the integrated smooth cross sections of beryllium and the other materials that constituted the test assembly used in the experiment. This comprehensive two-dimensional cross-section sensitivity/uncertainty analysis aimed at identifying the sources of discrepancies between calculated and measured values for T₆ and T₇

  8. Demonstration of Emulator-Based Bayesian Calibration of Safety Analysis Codes: Theory and Formulation

    Directory of Open Access Journals (Sweden)

    Joseph P. Yurko

    2015-01-01

    System codes for simulation of safety performance of nuclear plants may contain parameters whose values are not known very accurately. New information from tests or operating experience is incorporated into safety codes by a process known as calibration, which reduces uncertainty in the output of the code and thereby improves its support for decision-making. The work reported here implements several improvements on classic calibration techniques afforded by modern analysis techniques. The key innovation has come from development of code surrogate model (or code emulator) construction and prediction algorithms. Use of a fast emulator makes the calibration process used here, based on Markov Chain Monte Carlo (MCMC) sampling, feasible. This work uses Gaussian Process (GP) based emulators, which have been used previously to emulate computer codes in the nuclear field. The present work describes the formulation of an emulator that incorporates GPs into a factor analysis-type or pattern recognition-type model. This "function factorization" Gaussian Process (FFGP) model allows overcoming limitations present in standard GP emulators, thereby improving both accuracy and speed of the emulator-based calibration process. Calibration of a friction-factor example using a Method of Manufactured Solution is performed to illustrate key properties of the FFGP-based process.
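
    As an illustration of the general emulator-based calibration loop (using an ordinary GP rather than the paper's FFGP model, and a hypothetical friction-factor-like toy code; the priors, noise level, and run design are illustrative assumptions), a minimal sketch:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

# Hypothetical "system code": pressure-drop-like response vs. parameter theta.
def code(theta, x):
    return theta * x**2

theta_true = 2.0
x_obs = np.linspace(0.1, 1.0, 8)
y_obs = code(theta_true, x_obs) + rng.normal(0.0, 0.01, x_obs.size)

# Build a cheap emulator of the code over (theta, x) from a design of runs.
design = np.array([(t, x) for t in np.linspace(0.5, 4.0, 12) for x in x_obs])
runs = np.array([code(t, x) for t, x in design])
gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0),
                              normalize_y=True).fit(design, runs)

def log_post(theta, sigma=0.01):
    """Log posterior with a flat prior on [0.5, 4]; the emulator replaces
    the expensive code inside the Gaussian likelihood."""
    if not 0.5 < theta < 4.0:
        return -np.inf
    pred = gp.predict(np.column_stack([np.full_like(x_obs, theta), x_obs]))
    return -0.5 * np.sum((y_obs - pred)**2) / sigma**2

# Metropolis sampling of the calibration posterior via the fast emulator.
theta, lp, chain = 1.0, log_post(1.0), []
for _ in range(5000):
    prop = theta + rng.normal(0.0, 0.1)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    chain.append(theta)
print("posterior mean theta:", np.mean(chain[1000:]))   # close to 2.0
```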

  9. Development and application of a new deterministic method for calculating computer model result uncertainties

    International Nuclear Information System (INIS)

    Maerker, R.E.; Worley, B.A.

    1989-01-01

    Interest in research into the field of uncertainty analysis has recently been stimulated by a need in high-level waste repository design assessment for uncertainty information in the form of response complementary cumulative distribution functions (CCDFs) to show compliance with regulatory requirements. The solution to this problem must rely on the analysis of computer code models, which, however, employ parameters that can have large uncertainties. The motivation for the research presented in this paper is the search for a deterministic uncertainty analysis approach that could serve as an improvement over methods that make exclusive use of statistical techniques. The deterministic uncertainty analysis (DUA) approach studied here is based on the use of first-derivative information. The method has been applied to a high-level nuclear waste repository problem involving use of the codes ORIGEN2, SAS, and BRINETEMP in series, and the resulting CDF of a BRINETEMP result of interest is compared with that obtained through a completely statistical analysis
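
    The record does not give the propagation formula; a common first-order construction from first-derivative information (the model function and numbers below are assumptions for illustration) is var(y) = Σᵢ (∂y/∂pᵢ)² var(pᵢ), with the derivatives obtained by finite differences when the code chain cannot be differentiated analytically:

```python
import numpy as np

def model(p):
    """Stand-in for a code chain (e.g., ORIGEN2 -> SAS -> BRINETEMP):
    a scalar response of several uncertain parameters."""
    return p[0] * np.exp(-p[1]) + p[2]**2

p0 = np.array([10.0, 1.0, 3.0])    # best-estimate parameter values (assumed)
sig = np.array([1.0, 0.1, 0.5])    # parameter standard deviations (assumed)

# First-derivative (sensitivity) information by central differences.
grad = np.zeros_like(p0)
for i in range(p0.size):
    h = 1e-6 * max(abs(p0[i]), 1.0)
    up, dn = p0.copy(), p0.copy()
    up[i] += h
    dn[i] -= h
    grad[i] = (model(up) - model(dn)) / (2.0 * h)

# First-order propagation, assuming independent parameters.
var_y = np.sum((grad * sig)**2)
print("response:", model(p0), " std:", np.sqrt(var_y))
```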

  10. International benchmark for coupled codes and uncertainty analysis in modelling: switching-Off of one of the four operating main circulation pumps at nominal reactor power at NPP Kalinin unit 3

    International Nuclear Information System (INIS)

    Tereshonok, V. A.; Nikonov, S. P.; Lizorkin, M. P.; Velkov, K; Pautz, A.; Ivanov, V.

    2008-01-01

    The paper briefly describes the specification of an international NEA/OECD benchmark based on measured plant data. During the commissioning tests for nominal power at NPP Kalinin Unit 3, numerous measurements of neutronic and thermal-hydraulic parameters were carried out in the reactor pressure vessel and in the primary and secondary circuits. One of the measured data sets, for the transient 'Switching-off of one Main Circulation Pump (MCP) at nominal power', has been chosen for validation of coupled thermal-hydraulic and neutron-kinetic system codes and additionally for performing uncertainty analyses as a part of the NEA/OECD Uncertainty Analysis in Modeling Benchmark. The benchmark is open to all countries and institutions. The experimental data and the final specification with the cross-section libraries will be provided to the participants by NEA/OECD only after official declaration of participation in the benchmark and delivery of the simulated results of the transient for comparison. (Author)

  11. Systematic Evaluation of Uncertainty in Material Flow Analysis

    DEFF Research Database (Denmark)

    Laner, David; Rechberger, Helmut; Astrup, Thomas Fruergaard

    2014-01-01

    Material flow analysis (MFA) is a tool to investigate material flows and stocks in defined systems as a basis for resource management or environmental pollution control. Because of the diverse nature of sources and the varying quality and availability of data, MFA results are inherently uncertain. ... Uncertainty analyses have received increasing attention in recent MFA studies, but systematic approaches for selection of appropriate uncertainty tools are missing. This article reviews existing literature related to handling of uncertainty in MFA studies and evaluates current practice of uncertainty analysis ... and exploratory MFA (identification of critical parameters and system behavior). Whereas mathematically simpler concepts focusing on data uncertainty characterization are appropriate for descriptive MFAs, statistical approaches enabling more-rigorous evaluation of uncertainty and model sensitivity are needed ...

  12. Quantification of uncertainties in source term estimates for a BWR with Mark I containment

    International Nuclear Information System (INIS)

    Khatib-Rahbar, M.; Cazzoli, E.; Davis, R.; Ishigami, T.; Lee, M.; Nourbakhsh, H.; Schmidt, E.; Unwin, S.

    1988-01-01

    A methodology for quantification and uncertainty analysis of source terms for severe accidents in light water reactors (QUASAR) has been developed. The objectives of the QUASAR program are (1) to develop a framework for performing an uncertainty evaluation of the input parameters of the phenomenological models used in the Source Term Code Package (STCP), and (2) to quantify the uncertainties in certain phenomenological aspects of source terms (that are not modeled by STCP) using state-of-the-art methods. The QUASAR methodology consists of (1) screening sensitivity analysis, where the most sensitive input variables are selected for detailed uncertainty analysis, (2) uncertainty analysis, where probability density functions (PDFs) are established for the parameters identified by the screening stage and propagated through the codes to obtain PDFs for the outputs (i.e., release fractions to the environment), and (3) distribution sensitivity analysis, which is performed to determine the sensitivity of the output PDFs to the input PDFs. In this paper attention is limited to a single accident progression sequence, namely, a station blackout accident in a BWR with a Mark I containment building. Identified as an important accident in the draft NUREG-1150, a station blackout involves loss of both off-site power and DC power, resulting in failure of the diesels to start and in the unavailability of the high-pressure injection and core isolation cooling systems.

  13. Uncertainty and sensitivity analysis of the LOFT L2-5 test: Results of the BEMUSE programme

    International Nuclear Information System (INIS)

    Crecy, A. de; Bazin, P.; Glaeser, H.; Skorek, T.; Joucla, J.; Probst, P.; Fujioka, K.; Chung, B.D.; Oh, D.Y.; Kyncl, M.; Pernica, R.; Macek, J.; Meca, R.; Macian, R.; D'Auria, F.; Petruzzi, A.; Batet, L.; Perez, M.; Reventos, F.

    2008-01-01

    This paper presents the results and the main lessons learnt from phase 3 of BEMUSE, an international benchmark activity sponsored by the Committee on the Safety of Nuclear Installations [CSNI: Committee on the Safety of Nuclear Installations (NEA, OECD), 2007. BEMUSE Phase III Report. NEA/CSNI R(2007) 4, October 2007] of the OECD/NEA. Phase 3 of BEMUSE aimed at performing uncertainty and sensitivity analyses of thermal-hydraulic codes used for the calculation of the LOFT L2-5 experiment, which simulated a Large-Break Loss-of-Coolant Accident (LB-LOCA). Eleven participants from ten organisations and eight countries took part in this benchmark. In the first section of this paper, the context of BEMUSE is described, as well as the methods used by the participants. In the second section, the results of the benchmark are presented. The majority of the participants find uncertainty bands which envelop the experimental data fairly well; however, the width of these bands varies considerably from one participant to another. A synthesis of the sensitivity analysis results has been made and is expected to provide a useful basis for further uncertainty analyses dealing with LB-LOCA. Finally, recommendations are given both for uncertainty and sensitivity analysis

  14. Quantifying phenomenological importance in best-estimate plus uncertainty analyses

    International Nuclear Information System (INIS)

    Martin, Robert P.

    2009-01-01

    This paper describes a general methodology for quantifying the importance of specific phenomenological elements to analysis measures evaluated from non-parametric best-estimate plus uncertainty evaluation methodologies. The principal objective of an importance analysis is to reveal those uncertainty contributors having the greatest influence on key analysis measures. This characterization supports the credibility of the uncertainty analysis, the applicability of the analytical tools, and even the generic evaluation methodology through the validation of the engineering judgments that guided the evaluation methodology development. A demonstration of the importance analysis is provided using data from a sample problem considered in the development of AREVA's Realistic LBLOCA methodology. The results are presented against the original large-break LOCA Phenomena Identification and Ranking Table developed by the Technical Program Group responsible for authoring the Code Scaling, Applicability and Uncertainty methodology. (author)
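
    The record does not name a specific importance measure; one simple choice commonly used with non-parametric samples (an assumption here, not the paper's method) is the rank correlation between each sampled input and the analysis measure:

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
n = 124                                   # e.g., one non-parametric sample set

# Hypothetical uncertain inputs (gap conductance, CHF multiplier, ...) and a
# PCT-like analysis measure with a known, uneven dependence on them.
X = rng.normal(0.0, 1.0, (n, 4))
pct = X @ np.array([3.0, 1.0, 0.2, 0.0]) + rng.normal(0.0, 0.5, n)

# The rank (Spearman) correlation of each input with the analysis measure
# serves as a simple importance indicator for the uncertainty contributors.
for i in range(X.shape[1]):
    rho, _ = spearmanr(X[:, i], pct)
    print(f"input {i}: rank correlation {rho:+.2f}")
```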

  15. Background and Qualification of Uncertainty Methods

    International Nuclear Information System (INIS)

    D'Auria, F.; Petruzzi, A.

    2008-01-01

    The evaluation of uncertainty constitutes the necessary supplement to best-estimate calculations performed to understand accident scenarios in water-cooled nuclear reactors. The need arises from the imperfection of computational tools on the one hand and from the interest in using such tools to obtain a more precise evaluation of safety margins on the other. The paper reviews the salient features of two independent approaches for estimating uncertainties associated with predictions of complex system codes: namely, the propagation of code input error and the propagation of the calculation output error are the key phrases identifying the methods of current interest for industrial applications. With the developed methods, uncertainty bands (both upper and lower) can be derived for any desired quantity of the transient of interest. In the second case, the uncertainty method is coupled with the thermal-hydraulic code to obtain a code with the capability of internal assessment of uncertainty, whose features are discussed in more detail.

  16. Advanced Approach to Consider Aleatory and Epistemic Uncertainties for Integral Accident Simulations

    International Nuclear Information System (INIS)

    Peschke, Joerg; Kloos, Martina

    2013-01-01

    The use of best-estimate codes together with realistic input data generally requires that all potentially important epistemic uncertainties which may affect the code prediction are considered, in order to get an adequate quantification of the epistemic uncertainty of the prediction as an expression of the existing imprecise knowledge. To facilitate the performance of the required epistemic uncertainty analyses, methods and corresponding software tools are available, for instance the GRS tool SUSA (Software for Uncertainty and Sensitivity Analysis). However, for risk-informed decision-making, the restriction to epistemic uncertainties alone is not enough. Transients and accident scenarios are also affected by aleatory uncertainties, which are due to the unpredictable nature of phenomena. It is essential that aleatory uncertainties are taken into account as well, not only in a simplified and supposedly conservative way but as realistically as possible. The additional consideration of aleatory uncertainties, for instance in the behavior of the technical system, the performance of plant operators, or the behavior of the physical process, provides a quantification of probabilistically significant accident sequences. Only if a safety analysis is able to account for both epistemic and aleatory uncertainties in a realistic manner can it provide a well-founded risk-informed answer for decision-making. At GRS, an advanced probabilistic dynamics method was developed to address this problem and to provide a more realistic modeling and assessment of transients and accident scenarios. This method allows for an integral simulation of complex dynamic processes, particularly taking into account interactions between the plant dynamics as simulated by a best-estimate code, the dynamics of operator actions, and the influence of epistemic and aleatory uncertainties. In this paper, the GRS method MCDET (Monte Carlo Dynamic Event Tree) for probabilistic dynamics analysis is explained

  17. Uncertainty analysis techniques

    International Nuclear Information System (INIS)

    Marivoet, J.; Saltelli, A.; Cadelli, N.

    1987-01-01

    The origin of the uncertainties affecting performance assessments, as well as their propagation to dose and risk results, is discussed. The analysis is focused essentially on the uncertainties introduced by the input parameters, the values of which may range over several orders of magnitude and may be given as probability distribution functions. The paper briefly reviews the existing sampling techniques used for Monte Carlo simulations and the methods for characterizing the output curves, determining their convergence and confidence limits. Annual doses, expectation values of the doses and risks are computed for a particular case of a possible repository in clay, in order to illustrate the significance of such output characteristics as the mean, the logarithmic mean and the median, as well as their ratios. The report concludes that, provisionally, due to its better robustness, an estimator such as the 90th percentile may be substituted for the arithmetic mean in comparisons of the estimated doses with acceptance criteria. In any case, the results obtained through uncertainty analyses must be interpreted with caution as long as input data distribution functions are not derived from experiments reasonably reproducing the situation in a well characterized repository and site
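
    The robustness argument for the 90th percentile over the arithmetic mean is easy to reproduce numerically; a small sketch with a lognormal dose sample (the distribution parameters are assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(5)

# A dose sample spanning several orders of magnitude, as described above.
doses = rng.lognormal(mean=-2.0, sigma=2.0, size=500)

# The arithmetic mean is dominated by a few extreme realisations, while
# the 90th percentile is a far more robust summary statistic.
print("arithmetic mean :", doses.mean())
print("median          :", np.median(doses))
print("90th percentile :", np.percentile(doses, 90))
```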

  18. Uncertainty analysis for hot channel

    International Nuclear Information System (INIS)

    Panka, I.; Kereszturi, A.

    2006-01-01

    The fulfillment of the safety analysis acceptance criteria is usually evaluated by separate hot channel calculations using the results of neutronic and/or thermal-hydraulic system calculations. In case of an ATWS event (inadvertent withdrawal of a control assembly), according to the analysis, a number of fuel rods experience DNB for a longer time and must be regarded as failed. Their number must be determined for a further evaluation of the radiological consequences. In the deterministic approach, the global power history must be multiplied by different hot channel factors (k_x) taking into account the radial power peaking factors for each fuel pin. If DNB occurs, it is necessary to perform a number of hot channel calculations to determine the limiting k_x leading just to DNB and fuel failure (the conservative DNBR limit is 1.33). Knowing the pin power distribution from the core design calculation, the number of failed fuel pins can be calculated. The above procedure can also be performed with conservative assumptions (e.g., conservative input parameters in the hot channel calculations). In case of hot channel uncertainty analysis, the relevant input parameters of the hot channel calculations (k_x, mass flow, coolant inlet temperature, pin average burnup, initial gap size, selection of the power history influencing the gap conductance value) and the DNBR limit are varied according to their respective uncertainties. An uncertainty analysis methodology was elaborated combining the response surface method with the one-sided tolerance limit method of Wilks. The results of deterministic and uncertainty hot channel calculations are compared with regard to the number of failed fuel rods, the maximum clad surface temperature and the maximum fuel temperature (Authors)
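
    The one-sided tolerance limit method of Wilks mentioned above fixes the number of code runs needed for a given coverage/confidence statement. A short sketch of the standard sample-size computation (the formula is textbook material, not taken from this record):

```python
import math

def wilks_n(gamma=0.95, beta=0.95, order=1):
    """Smallest N such that the `order`-th largest of N runs bounds the
    gamma-quantile with confidence beta; first order: 1 - gamma**N >= beta."""
    n = order
    while True:
        # Confidence from the cumulative binomial over the top `order` runs.
        conf = 1.0 - sum(math.comb(n, k) * gamma**(n - k) * (1.0 - gamma)**k
                         for k in range(order))
        if conf >= beta:
            return n
        n += 1

print(wilks_n())           # 59 runs for a 95%/95% first-order limit
print(wilks_n(order=2))    # 93 runs if the second-largest value is used
```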

  19. Uncertainty Analysis and Expert Judgment in Seismic Hazard Analysis

    Science.gov (United States)

    Klügel, Jens-Uwe

    2011-01-01

    The large uncertainty associated with the prediction of future earthquakes is usually regarded as the main reason for the increased hazard estimates which have resulted from some recent large-scale probabilistic seismic hazard analysis studies (e.g. the PEGASOS study in Switzerland and the Yucca Mountain study in the USA). It is frequently overlooked that such increased hazard estimates are characteristic of a single specific method of probabilistic seismic hazard analysis (PSHA): the traditional (Cornell-McGuire) PSHA method, which has found its highest level of sophistication in the SSHAC probability method. Based on a review of the SSHAC probability model and its application in the PEGASOS project, it is shown that the surprising results of recent PSHA studies can be explained to a large extent by the uncertainty model used in traditional PSHA, which deviates from the state of the art in mathematics and risk analysis. This uncertainty model, the Ang-Tang uncertainty model, mixes concepts of decision theory with probabilistic hazard assessment methods, leading to an overestimation of uncertainty in comparison to empirical evidence. Although expert knowledge can be a valuable source of scientific information, its incorporation into the SSHAC probability method does not resolve the issue of inflated uncertainties in PSHA results. Other, more data-driven PSHA approaches in use in some European countries are less vulnerable to this effect. The most valuable alternative to traditional PSHA is the direct probabilistic scenario-based approach, which is closely linked with emerging neo-deterministic methods based on waveform modelling.

  20. Application of data analysis techniques to nuclear reactor systems code accuracy assessment

    International Nuclear Information System (INIS)

    Kunz, R.F.; Kasmala, G.F.; Murray, C.J.; Mahaffy, J.H.

    2000-01-01

    An automated code assessment program (ACAP) has been developed by the authors to provide quantitative comparisons between nuclear reactor systems (NRS) code results and experimental measurements. This software was developed under subcontract to the United States Nuclear Regulatory Commission for use in its NRS code consolidation efforts. In this paper, background on the topic of NRS accuracy and uncertainty assessment is provided, which motivates the development of ACAP and defines its basic software requirements. A survey of data analysis techniques was performed, focusing on the applicability of methods in the construction of NRS code-data comparison measures. The results of this review process, which further defined the scope, user interface and process for using ACAP, are also summarized. A description of the software package and several sample applications to NRS data sets are provided. Its functionality and ability to provide objective accuracy assessment figures are demonstrated. (author)

  1. GAMUT: A computer code for γ-ray energy and intensity analysis

    International Nuclear Information System (INIS)

    Firestone, R.B.

    1991-05-01

    GAMUT is a computer code for analyzing γ-ray energies and intensities. It performs a linear least-squares fit of measured γ-ray energies from one or more experiments to the level scheme. GAMUT also performs a non-linear least-squares analysis of branching intensities. For both energy and intensity data, a statistical chi-square analysis is performed with an iterative uncertainty adjustment. The uncertainties of outlying measured values and of sets of measurements with χ²/f > 1 are increased, and the calculation is repeated until the uncertainties are consistent with the fitted values. GAMUT accepts input from standard or special-format ENSDF data sets. The special-format ENSDF data sets were designed to permit analysis of more than one set of measurements associated with a single ENSDF data set. GAMUT prepares a standard ENSDF-format output data set containing the adjusted values. If more than one input ENSDF data set is provided, GAMUT creates an ADOPTED LEVELS, GAMMAS data set containing the adjusted level and γ-ray energies and the branching intensities from each level normalized to 100 for the strongest γ-ray. GAMUT also provides a summary of the results and an extensive log of the iterative analysis. GAMUT is interactive, prompting the user for input and output file names and for default calculation options. This version of GAMUT has adjustable dimensions, so that the maximum numbers of data sets, levels, and γ-rays can be established at the time of implementation. 6 refs
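
    A much-simplified sketch of the iterative chi-square adjustment, reduced from a full level-scheme fit to a weighted mean of repeated measurements of a single γ-ray energy (the particular inflation rule below is an illustrative choice, not GAMUT's exact algorithm):

```python
import numpy as np

def adjusted_mean(E, dE, max_iter=50):
    """Weighted mean of measured gamma-ray energies, inflating the
    uncertainties of outliers until chi-square per degree of freedom
    drops to 1 or below (in the spirit of the adjustment above)."""
    E, dE = np.asarray(E, float), np.asarray(dE, float)
    for _ in range(max_iter):
        w = 1.0 / dE**2
        mean = np.sum(w * E) / np.sum(w)
        resid = (E - mean) / dE
        chi2_f = np.sum(resid**2) / (E.size - 1)
        if chi2_f <= 1.0:
            break
        # Inflate the uncertainties of the outlying values (|r| > 1) and retry.
        dE = np.where(np.abs(resid) > 1.0, dE * np.sqrt(chi2_f), dE)
    return mean, 1.0 / np.sqrt(np.sum(1.0 / dE**2)), chi2_f

# Three hypothetical measurements of the same line (keV).
print(adjusted_mean([1173.23, 1173.25, 1173.35], [0.02, 0.03, 0.02]))
```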

  2. Safety margins of operating reactors. Analysis of uncertainties and implications for decision making

    International Nuclear Information System (INIS)

    2003-01-01

    Maintaining safety in the design and operation of nuclear power plants (NPPs) is a very important task under the conditions of a challenging environment, affected by the deregulated electricity market and the implementation of risk-informed regulations. In Member States, advanced computer codes are widely used as safety analysis tools in the framework of licensing of new NPP projects, safety upgrading programmes of existing NPPs, periodic safety reviews, renewal of operating licences, use of safety margins for reactor power uprating, better utilization of nuclear fuel and higher operational flexibility, justification of lifetime extensions, development of new emergency operating procedures, analysis of operational events, and development of accident management programmes. The issue of inadequate quality of safety analysis is becoming important due to a general tendency to use advanced tools for better establishment and utilization of safety margins, while the existence of such margins assures that NPPs operate safely in all modes of operation and at all times. The most important safety margins relate to the physical barriers against release of radioactive material, such as the fuel matrix and fuel cladding, the reactor coolant system boundary, and the containment. Typically, safety margins are determined with the use of computational tools for safety analysis. Advanced best-estimate computer codes are suggested for current safety analyses, e.g. in the IAEA Safety Guide on Safety Assessment and Verification for Nuclear Power Plants. Such computer codes require careful application to avoid an unjustified reduction in the robustness of reactor safety. The issue of uncertainties in safety analyses and their impact on the evaluation of safety margins is addressed in a number of IAEA guidance documents, in particular in the Safety Report on Accident Analysis for Nuclear Power Plants. It is also discussed in various technical meetings and workshops devoted to this area.

  3. The uncertainty analysis of model results a practical guide

    CERN Document Server

    Hofer, Eduard

    2018-01-01

    This book is a practical guide to the uncertainty analysis of computer model applications. Used in many areas, such as engineering, ecology and economics, computer models are subject to various uncertainties at the level of model formulations, parameter values and input data. Naturally, it would be advantageous to know the combined effect of these uncertainties on the model results as well as whether the state of knowledge should be improved in order to reduce the uncertainty of the results most effectively. The book supports decision-makers, model developers and users in their argumentation for an uncertainty analysis and assists them in the interpretation of the analysis results.

  4. Development of Probabilistic Internal Dosimetry Computer Code

    Energy Technology Data Exchange (ETDEWEB)

    Noh, Siwan [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Kwon, Tae-Eun [Korea Institute of Radiological and Medical Sciences, Seoul (Korea, Republic of); Lee, Jai-Ki [Korean Association for Radiation Protection, Seoul (Korea, Republic of)

    2017-02-15

    Internal radiation dose assessment involves biokinetic models, the corresponding parameters, measured data, and many assumptions. Every component considered in the internal dose assessment has its own uncertainty, which is propagated into the intake activity and internal dose estimates. For research or scientific purposes, and for retrospective dose reconstruction for accident scenarios occurring in workplaces having a large quantity of unsealed radionuclides, such as nuclear power plants, nuclear fuel cycle facilities, and facilities in which nuclear medicine is practiced, a quantitative uncertainty assessment of the internal dose is often required. However, no calculation tools or computer codes that incorporate all the relevant processes and their corresponding uncertainties, i.e., from the measured data to the committed dose, are available. Thus, the objective of the present study is to develop an integrated probabilistic internal-dose-assessment computer code. First, the uncertainty components in internal dosimetry are identified, and quantitative uncertainty data are collected. Then, an uncertainty database is established for each component. In order to propagate these uncertainties in an internal dose assessment, a probabilistic internal-dose-assessment system that employs Bayesian and Monte Carlo methods was developed. Based on the developed system, we implemented a probabilistic internal-dose-assessment code in MATLAB so as to estimate the dose distributions from the measured data with uncertainty. Using the developed code, we calculated the internal dose distribution and statistical values (e.g. the 2.5th, 5th, median, 95th, and 97.5th percentiles) for three sample scenarios. On the basis of the distributions, we performed a sensitivity analysis to determine the influence of each component on the resulting dose, in order to identify the major component of the uncertainty in a bioassay. The results of this study can be applied to various situations.
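
    A minimal sketch of the Bayesian-plus-Monte-Carlo idea (a one-compartment retention function, a lognormal measurement model, and an assumed dose coefficient stand in for the full biokinetic models and uncertainty database of the code; all numbers are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(7)

def retention(t, lam=0.05):
    """Fraction of an acute intake remaining t days later (toy one-
    compartment model; real codes use full ICRP biokinetics)."""
    return np.exp(-lam * t)

t_meas, m_meas, gsd = 10.0, 50.0, 1.2    # bioassay: 50 Bq measured at day 10

# Grid posterior for the intake I with measurement model
# m ~ LogNormal(ln(I * retention(t)), ln(gsd)) and a flat prior on I.
I = np.linspace(1.0, 5000.0, 20_000)
mu = np.log(I * retention(t_meas))
log_lik = -0.5 * ((np.log(m_meas) - mu) / np.log(gsd))**2
post = np.exp(log_lik - log_lik.max())
post /= post.sum()

# Sample intakes from the posterior, convert to dose with an uncertain
# dose coefficient, and report the percentiles quoted in the abstract.
intakes = rng.choice(I, size=50_000, p=post)
dose_coeff = rng.lognormal(np.log(2.0e-5), 0.3, intakes.size)  # mSv/Bq, assumed
dose = intakes * dose_coeff
print(np.percentile(dose, [2.5, 5, 50, 95, 97.5]))
```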

  5. Development of Probabilistic Internal Dosimetry Computer Code

    International Nuclear Information System (INIS)

    Noh, Siwan; Kwon, Tae-Eun; Lee, Jai-Ki

    2017-01-01

    Internal radiation dose assessment involves biokinetic models, the corresponding parameters, measured data, and many assumptions. Every component considered in the internal dose assessment has its own uncertainty, which is propagated into the intake activity and internal dose estimates. For research or scientific purposes, and for retrospective dose reconstruction for accident scenarios occurring in workplaces having a large quantity of unsealed radionuclides, such as nuclear power plants, nuclear fuel cycle facilities, and facilities in which nuclear medicine is practiced, a quantitative uncertainty assessment of the internal dose is often required. However, no calculation tools or computer codes that incorporate all the relevant processes and their corresponding uncertainties, i.e., from the measured data to the committed dose, are available. Thus, the objective of the present study is to develop an integrated probabilistic internal-dose-assessment computer code. First, the uncertainty components in internal dosimetry are identified, and quantitative uncertainty data are collected. Then, an uncertainty database is established for each component. In order to propagate these uncertainties in an internal dose assessment, a probabilistic internal-dose-assessment system that employs Bayesian and Monte Carlo methods was developed. Based on the developed system, we implemented a probabilistic internal-dose-assessment code in MATLAB so as to estimate the dose distributions from the measured data with uncertainty. Using the developed code, we calculated the internal dose distribution and statistical values (e.g. the 2.5th, 5th, median, 95th, and 97.5th percentiles) for three sample scenarios. On the basis of the distributions, we performed a sensitivity analysis to determine the influence of each component on the resulting dose, in order to identify the major component of the uncertainty in a bioassay. The results of this study can be applied to various situations.

  6. Methodology to carry out a sensitivity and uncertainty analysis for cross sections using a coupled model Trace-Parcs

    International Nuclear Information System (INIS)

    Reyes F, M. C.; Del Valle G, E.; Gomez T, A. M.; Sanchez E, V.

    2015-09-01

    A methodology was implemented to carry out a sensitivity and uncertainty analysis of the cross sections used in a coupled Trace/Parcs model of a control rod drop transient in a BWR-5. A core model for the neutronic code Parcs was used, in which the assemblies located in the core are described. The thermal-hydraulic model in Trace was kept simple: a single component of type Chan was designed to represent all the core assemblies, placed within a single vessel, and boundary conditions were established. The thermal-hydraulic part was coupled with the neutronic part, first for the steady state, and then the control rod drop transient was run for the sensitivity and uncertainty analysis. To analyse the cross sections used in the coupled Trace/Parcs model during the transient, probability density functions were generated for 22 parameters selected from the full set of neutronic parameters used by Parcs, yielding 100 different cases for the coupled model, each with a different cross-section database. All these cases were executed with the coupled model, producing 100 different output files for the control rod drop transient, with emphasis on the nominal power, for which an uncertainty analysis was performed and the uncertainty band generated. With this analysis it is possible to observe the ranges of the selected responses as the selected uncertain parameters vary. The sensitivity analysis complements the uncertainty analysis by identifying the parameter or parameters with the greatest influence on the results, so that attention can be focused on them in order to better understand their effects. Beyond the results obtained, since this is not a model based on real operating data, the importance of this work lies in demonstrating the application of the methodology for carrying out sensitivity and uncertainty analyses. (Author)

  7. Uncertainty analysis and validation of environmental models. The empirically based uncertainty analysis

    International Nuclear Information System (INIS)

    Monte, Luigi; Hakanson, Lars; Bergstroem, Ulla; Brittain, John; Heling, Rudie

    1996-01-01

    The principles of Empirically Based Uncertainty Analysis (EBUA) are described. EBUA is based on the evaluation of 'performance indices' that express the level of agreement between the model and sets of empirical independent data collected in different experimental circumstances. Some of these indices may be used to evaluate the confidence limits of the model output. The method is based on the statistical analysis of the distribution of the index values and on the quantitative relationship of these values with the ratio 'experimental data/model output'. Some performance indices are described in the present paper. Among these, the so-called 'functional distance' (d) between the logarithm of the model output and the logarithm of the experimental data, defined as d² = Σ_{i=1}^{n} (ln Mᵢ − ln Oᵢ)² / n, where Mᵢ is the i-th experimental value, Oᵢ the corresponding model evaluation and n the number of couplets 'experimental value, predicted value', is an important tool for the EBUA method. From the statistical distribution of this performance index, it is possible to infer the characteristics of the distribution of the ratio 'experimental data/model output' and, consequently, to evaluate the confidence limits for the model predictions. This method was applied to calculate the uncertainty level of a model developed to predict the migration of radiocaesium in lacustrine systems. Unfortunately, performance indices are affected by the uncertainty of the experimental data used in validation. Indeed, measurement results of environmental levels of contamination are generally associated with large uncertainty due to the measurement and sampling techniques and to the large variability in space and time of the measured quantities. It is demonstrated that this undesired effect may, in some circumstances, be corrected by means of simple formulae
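
    The functional distance defined above is straightforward to compute; a small sketch with made-up model/data pairs, including the confidence-band reading suggested by the method (assuming the log ratios are roughly normally distributed with spread d):

```python
import numpy as np

def functional_distance(experimental, model_output):
    """d = sqrt( sum_i (ln M_i - ln O_i)^2 / n ) for paired values."""
    M = np.asarray(experimental, float)
    O = np.asarray(model_output, float)
    return np.sqrt(np.mean((np.log(M) - np.log(O))**2))

# Hypothetical couplets 'experimental value, predicted value'.
d = functional_distance([1.2, 0.8, 2.5, 1.0], [1.0, 1.0, 2.0, 1.3])

# If ln(data/model) ~ Normal(0, d), an approximate 95% multiplicative
# confidence band on a prediction is a factor exp(1.96 * d) either way.
print(f"d = {d:.3f}; 95% band = multiply/divide by {np.exp(1.96 * d):.2f}")
```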

  8. Small break LOCA RELAP5/MOD3 uncertainty quantification: Bias and uncertainty evaluation for important phenomena

    International Nuclear Information System (INIS)

    Ortiz, M.G.; Ghan, L.S.; Vogl, J.

    1991-01-01

    The Nuclear Regulatory Commission (NRC) revised the Emergency Core Cooling System (ECCS) licensing rule to allow the use of best-estimate (BE) computer codes, provided the uncertainty of the calculations is quantified and used in the licensing and regulation process. The NRC developed a generic methodology called Code Scaling, Applicability and Uncertainty (CSAU) to evaluate BE code uncertainties. The CSAU methodology was demonstrated with a specific application to a pressurized water reactor (PWR) experiencing a postulated large-break loss-of-coolant accident (LBLOCA). The current work is part of an effort to adapt and demonstrate the CSAU methodology for a small-break (SB) LOCA in a PWR of B and W design, using RELAP5/MOD3 as the simulation tool. The subject of this paper is the Assessment and Ranging of Parameters (Element 2 of the CSAU methodology), which determines the contribution to uncertainty of specific models in the code

  9. Sensitivity analysis and uncertainties simulation of the migration of radionuclide in the system of geological disposal-CRP-GEORC model

    International Nuclear Information System (INIS)

    Su Rui; Wang Ju; Chen Weiming; Zong Zihua; Zhao Honggang

    2008-01-01

    The CRP-GEORC concept model is an artificial geological disposal system for high-level radioactive waste. Sensitivity analysis and uncertainty simulation of the migration of the radionuclides Se-79 and I-129 in the far field of this system have been conducted using the GoldSim code. The simulation results show that the variables used to describe the geological features and the characterization of groundwater flow are the sensitive variables of the whole geological disposal system. The uncertainties of these parameters have a remarkable influence on the simulation results. (authors)

  10. Assessment of uncertainty in full core reactor physics calculations using statistical methods

    International Nuclear Information System (INIS)

    McEwan, C.

    2012-01-01

    The best estimate method of safety analysis involves choosing a realistic set of input parameters for a proposed safety case and evaluating the uncertainty in the results. Determining the uncertainty in code outputs remains a challenge and is the subject of a benchmarking exercise proposed by the Organization for Economic Cooperation and Development. The work proposed in this paper will contribute to this benchmark by assessing the uncertainty in a depletion calculation of the final nuclide concentrations for an experiment performed in the Fukushima-2 reactor. This will be done using lattice transport code DRAGON and a tool known as DINOSAUR. (author)

  11. Assessment of uncertainty in full core reactor physics calculations using statistical methods

    Energy Technology Data Exchange (ETDEWEB)

    McEwan, C., E-mail: mcewac2@mcmaster.ca [McMaster Univ., Hamilton, Ontario (Canada)

    2012-07-01

    The best estimate method of safety analysis involves choosing a realistic set of input parameters for a proposed safety case and evaluating the uncertainty in the results. Determining the uncertainty in code outputs remains a challenge and is the subject of a benchmarking exercise proposed by the Organization for Economic Cooperation and Development. The work proposed in this paper will contribute to this benchmark by assessing the uncertainty in a depletion calculation of the final nuclide concentrations for an experiment performed in the Fukushima-2 reactor. This will be done using lattice transport code DRAGON and a tool known as DINOSAUR. (author)

  12. Sensitivity coefficients of reactor parameters in fast critical assemblies and uncertainty analysis

    International Nuclear Information System (INIS)

    Aoyama, Takafumi; Suzuki, Takayuki; Takeda, Toshikazu; Hasegawa, Akira; Kikuchi, Yasuyuki.

    1986-02-01

    Sensitivity coefficients of reactor parameters in several fast critical assemblies to various cross sections were calculated in 16 energy groups by means of the SAGEP code, based on the generalized perturbation theory. The sensitivity coefficients were tabulated and the differences among them were discussed. Furthermore, the uncertainties of the calculated reactor parameters due to cross-section uncertainties were estimated using the sensitivity coefficients and cross-section covariance data. (author)
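
    Combining sensitivity coefficients with covariance data as described here is the standard "sandwich rule", var(R)/R² = S C Sᵀ. A sketch with illustrative numbers (the group structure and values below are assumptions, not data from the record):

```python
import numpy as np

# Relative sensitivity coefficients S_j = (dR/R)/(dsigma_j/sigma_j) of a
# reactor parameter R to three group cross sections (illustrative).
S = np.array([0.8, -0.3, 0.1])

# Relative covariance matrix of the group cross sections (illustrative).
C = np.array([[0.0025, 0.0010, 0.0000],
              [0.0010, 0.0040, 0.0000],
              [0.0000, 0.0000, 0.0009]])

# Sandwich rule: relative variance of R induced by the nuclear data.
rel_var = S @ C @ S
print(f"relative uncertainty of R: {np.sqrt(rel_var):.2%}")
```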

  13. SWEPP PAN assay system uncertainty analysis: Active mode measurements of solidified aqueous sludge waste

    International Nuclear Information System (INIS)

    Blackwood, L.G.; Harker, Y.D.; Meachum, T.R.

    1997-12-01

    The Idaho National Engineering and Environmental Laboratory is being used as a temporary storage facility for transuranic waste generated by the US nuclear weapons program at the Rocky Flats Plant (RFP) in Golden, Colorado. Currently, a large effort is in progress to prepare to ship this waste to the Waste Isolation Pilot Plant (WIPP) in Carlsbad, New Mexico. In order to meet the TRU Waste Characterization Quality Assurance Program Plan nondestructive assay compliance requirements and quality assurance objectives, it is necessary to determine the total uncertainty of the radioassay results produced by the Stored Waste Examination Pilot Plant (SWEPP) Passive Active Neutron (PAN) radioassay system. This paper is one of a series of reports quantifying the results of the uncertainty analysis of the PAN system measurements for specific waste types and measurement modes. In particular, this report covers active-mode measurements of weapons-grade plutonium-contaminated aqueous sludge waste contained in 208-liter drums (item description codes 1, 2, 7, 800, 803, and 807). Results of the uncertainty analysis for PAN active-mode measurements of aqueous sludge indicate that a bias correction multiplier of 1.55 should be applied to the PAN aqueous sludge measurements. With the bias correction, the uncertainty bounds on the expected bias are 0 ± 27%. These bounds meet the Quality Assurance Program Plan requirements for radioassay systems

  14. XSOR codes users manual

    International Nuclear Information System (INIS)

    Jow, Hong-Nian; Murfin, W.B.; Johnson, J.D.

    1993-11-01

    This report describes the XSOR source term estimation codes. The codes are written for three pressurized water reactors (Surry, Sequoyah, and Zion) and two boiling water reactors (Peach Bottom and Grand Gulf). The ensemble of codes has been named ''XSOR''. The purpose of the XSOR codes is to estimate the source terms which would be released to the atmosphere in severe accidents. A source term includes the release fractions of several radionuclide groups, the timing and duration of releases, the rates of energy release, and the elevation of releases. The codes have been developed by Sandia National Laboratories for the US Nuclear Regulatory Commission (NRC) in support of the NUREG-1150 program. The XSOR codes are fast-running parametric codes and are used as surrogates for detailed mechanistic codes. The XSOR codes also provide the capability to explore phenomena and their uncertainties which are not currently modeled by the mechanistic codes. The uncertainty distributions of input parameters may be used by an XSOR code to estimate the uncertainty of source terms

  15. Urban drainage models - making uncertainty analysis simple

    DEFF Research Database (Denmark)

    Vezzaro, Luca; Mikkelsen, Peter Steen; Deletic, Ana

    2012-01-01

    in each measured/observed datapoint; an issue which is commonly overlooked in the uncertainty analysis of urban drainage models. This comparison allows the user to intuitively estimate the optimum number of simulations required to conduct uncertainty analyses. The output of the method includes parameter... There is increasing awareness about uncertainties in modelling of urban drainage systems and, as such, many new methods for uncertainty analyses have been developed. Despite this, all available methods have limitations which restrict their widespread application among practitioners. Here...

  16. Automated sensitivity analysis of the radionuclide migration code UCB-NE-10.2

    International Nuclear Information System (INIS)

    Pin, F.G.; Worley, B.A.; Oblow, E.M.; Wright, R.Q.; Harper, W.V.

    1985-01-01

    The Salt Repository Project (SRP) of the U.S. Department of Energy is performing ongoing performance assessment analyses for the eventual licensing of an underground high-level nuclear waste repository in salt. As part of these studies, sensitivity and uncertainty analyses play a major role in the identification of important parameters and of specific data needs for site characterization. Oak Ridge National Laboratory has supported the SRP in this effort, resulting in the development of an automated procedure for performing large-scale sensitivity analysis using computer calculus. GRESS, the GRadient Enhanced Software System, is a pre-compiler that can process FORTRAN computer codes and add derivative-taking capabilities to the normal calculated results. The GRESS code is described and applied to the code UCB-NE-10.2, which simulates the migration of the radionuclide members of a decay chain through a sorption medium
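
    GRESS instruments FORTRAN source so that derivatives are carried alongside values; forward-mode automatic differentiation with dual numbers gives the same effect and can be shown in a few lines of Python (a toy analogy, not GRESS itself; the model function is hypothetical):

```python
from dataclasses import dataclass
import math

@dataclass
class Dual:
    """A value plus its derivative: the arithmetic a source-to-source
    tool effectively injects into an existing code."""
    val: float
    der: float = 0.0
    def __add__(s, o): o = as_dual(o); return Dual(s.val + o.val, s.der + o.der)
    def __mul__(s, o): o = as_dual(o); return Dual(s.val * o.val,
                                                   s.der * o.val + s.val * o.der)
    __radd__ = __add__
    __rmul__ = __mul__

def as_dual(x):
    return x if isinstance(x, Dual) else Dual(float(x))

def exp(x):
    x = as_dual(x)
    return Dual(math.exp(x.val), math.exp(x.val) * x.der)

# Toy retardation-like model y = k * exp(-lam), differentiated w.r.t. lam.
k = 2.0
lam = Dual(0.5, 1.0)          # seed the derivative d(lam)/d(lam) = 1
y = k * exp(-1.0 * lam)       # -1.0 * lam goes through __rmul__
print(y.val, y.der)           # value and dy/dlam from a single evaluation
```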

  17. Consistent Code Qualification Process and Application to WWER-1000 NPP

    International Nuclear Information System (INIS)

    Berthon, A.; Petruzzi, A.; Giannotti, W.; D'Auria, F.; Reventos, F.

    2006-01-01

    Calculation analyses by application of system codes are performed to evaluate NPP or facility behavior during a postulated transient or to evaluate code capability. A calculation analysis constitutes a process that involves the code itself, the data of the reference plant, the data about the transient, the nodalization, and the user. All these elements affect one another and affect the results. A major issue in the use of mathematical models is the capability of the model to reproduce the plant or facility behavior under steady-state and transient conditions. These aspects constitute two main checks that must be satisfied during the qualification process. The first of them is related to the realization of a scheme of the reference plant; the second one is related to the capability to reproduce the transient behavior. The aim of this paper is to describe the UMAE (Uncertainty Method based on Accuracy Extrapolation) methodology developed at the University of Pisa for qualifying a nodalization and analysing the calculated results, and to perform the uncertainty evaluation of the system code by the CIAU code (Code with the capability of Internal Assessment of Uncertainty). The activity consists of the re-analysis of experiment BL-44 (SBLOCA), performed in the LOBI facility, and the analysis of a Kv-scaling calculation of the WWER-1000 NPP nodalization taking test BL-44 as reference. Relap5/Mod3.3 has been used as the thermal-hydraulic system code, and the standard procedure adopted at the University of Pisa has been applied to show the capability of the code to predict the significant aspects of the transient and to obtain a qualified nodalization of the WWER-1000 through a systematic qualitative and quantitative accuracy evaluation. The qualitative accuracy evaluation is based on the selection of Relevant Thermal-hydraulic Aspects (RTAs) and is a prerequisite to the application of the Fast Fourier Transform Based Method (FFTBM), which quantifies the accuracy of the code calculation
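
    The figure of merit commonly used in FFTBM is the average amplitude, which compares the spectrum of the code-experiment error with that of the experimental signal; a minimal sketch (the signal shapes below are assumed for illustration):

```python
import numpy as np

def fftbm_average_amplitude(exp_signal, calc_signal):
    """Average amplitude AA = sum|FFT(calc - exp)| / sum|FFT(exp)|.
    Smaller AA indicates better quantitative accuracy."""
    err = np.abs(np.fft.rfft(calc_signal - exp_signal))
    ref = np.abs(np.fft.rfft(exp_signal))
    return err.sum() / ref.sum()

t = np.linspace(0.0, 100.0, 512)
exp_p = 7.0 * np.exp(-t / 40.0)     # e.g., a measured depressurisation
calc_p = 7.1 * np.exp(-t / 38.0)    # the corresponding code prediction
print(f"AA = {fftbm_average_amplitude(exp_p, calc_p):.3f}")
```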

  18. Uncertainties in thick-target PIXE analysis

    International Nuclear Information System (INIS)

    Campbell, J.L.; Cookson, J.A.; Paul, H.

    1983-01-01

    Thick-target PIXE analysis involves uncertainties arising from the calculation of thick-target X-ray production in addition to the usual PIXE uncertainties. The calculation demands knowledge of ionization cross-sections, stopping powers and photon attenuation coefficients. Information on these is reviewed critically, and a computational method is used to estimate the uncertainties transmitted from this data base into the results of thick-target PIXE analyses, with reference to particular specimen types using beams of 2-3 MeV protons. A detailed assessment of the accuracy of thick-target PIXE is presented. (orig.)

  19. Understanding and reducing statistical uncertainties in nebular abundance determinations

    Science.gov (United States)

    Wesson, R.; Stock, D. J.; Scicluna, P.

    2012-06-01

    Whenever observations are compared to theories, an estimate of the uncertainties associated with the observations is vital if the comparison is to be meaningful. However, many or even most determinations of temperatures, densities and abundances in photoionized nebulae do not quote the associated uncertainty. Those that do typically propagate the uncertainties using analytical techniques which rely on assumptions that generally do not hold. Motivated by this issue, we have developed the Nebular Empirical Analysis Tool (NEAT), a new code for calculating chemical abundances in photoionized nebulae. The code carries out a standard analysis of lists of emission lines using long-established techniques to estimate the amount of interstellar extinction, calculate representative temperatures and densities, compute ionic abundances from both collisionally excited lines and recombination lines, and finally to estimate total elemental abundances using an ionization correction scheme. NEAT uses a Monte Carlo technique to robustly propagate uncertainties from line flux measurements through to the derived abundances. We show that, for typical observational data, this approach is superior to analytic estimates of uncertainties. NEAT also accounts for the effect of upward biasing on measurements of lines with low signal-to-noise ratio, allowing us to accurately quantify the effect of this bias on abundance determinations. We find not only that the effect can result in significant overestimates of heavy element abundances derived from weak lines, but also that taking it into account reduces the uncertainty of these abundance determinations. Finally, we investigate the effect of possible uncertainties in R, the ratio of selective-to-total extinction, on abundance determinations. We find that the uncertainty due to this parameter is negligible compared to the statistical uncertainties due to typical line flux measurement uncertainties.
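
    A minimal sketch of the Monte Carlo propagation idea (hypothetical line fluxes and a simple ratio standing in for NEAT's full temperature/density/abundance chain):

```python
import numpy as np

rng = np.random.default_rng(3)

# Two hypothetical line fluxes (relative units) with measurement errors.
f4959, df4959 = 120.0, 8.0
f5007, df5007 = 360.0, 12.0

# Resample the measured fluxes within their uncertainties and push each
# realisation through the derived quantity (here a simple line ratio).
n = 100_000
ratio = rng.normal(f5007, df5007, n) / rng.normal(f4959, df4959, n)

lo, med, hi = np.percentile(ratio, [16, 50, 84])
print(f"ratio = {med:.2f} (+{hi - med:.2f} / -{med - lo:.2f})")
# For weak lines (low signal-to-noise), the resulting distribution is
# asymmetric, and simple analytic propagation would misstate these limits.
```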

  20. Use of Nuclear Data Sensitivity and Uncertainty Analysis for the Design Preparation of the HCLL Breeder Blanket Mockup Experiment for ITER

    Directory of Open Access Journals (Sweden)

    I. Kodeli

    2008-01-01

    An experiment on a mockup of the test blanket module based on the helium-cooled lithium lead (HCLL) concept will be performed in 2008 at the Frascati Neutron Generator (FNG) in order to study the neutronics characteristics of the module and the accuracy of the computational tools. With the objective of preparing and optimising the design of the mockup so as to provide maximum information on the state of the art of the cross-section data, the mockup was pre-analysed using deterministic codes for sensitivity/uncertainty analysis. The neutron fluxes and the tritium production rate (TPR), their sensitivity to the underlying basic cross-sections, as well as the corresponding uncertainties were calculated using the deterministic transport codes (DOORS package), the sensitivity/uncertainty code package SUSD3D, and the VITAMINJ/COVA covariance matrix libraries. The cross-section reactions with the largest contribution to the uncertainty of the calculated TPR were identified to be the (n,2n) and (n,3n) reactions on lead. The conclusions of this work support the main benchmark design and suggest some modifications and improvements. In particular, this study recommends the use, as far as possible, of both natural and enriched lithium pellets for the TPR measurements. The combined use is expected to provide additional and complementary information on the sensitive cross-sections.

  1. Uncertainty Propagation in Monte Carlo Depletion Analysis

    International Nuclear Information System (INIS)

    Shim, Hyung Jin; Kim, Yeong-il; Park, Ho Jin; Joo, Han Gyu; Kim, Chang Hyo

    2008-01-01

    A new formulation aimed at quantifying the uncertainties of Monte Carlo (MC) tallies such as k_eff, the microscopic reaction rates of nuclides, and nuclide number densities in MC depletion analysis, and at examining their propagation behaviour as a function of depletion time step (DTS), is presented. It is shown that the variance of a given MC tally, used as the measure of its uncertainty in this formulation, arises from four sources: the statistical uncertainty of the MC tally, uncertainties of microscopic cross sections, uncertainties of nuclide number densities, and the cross correlations between them; the contribution of the latter three sources can be determined by computing the correlation coefficients between the uncertain variables. It is also shown that the variance of any given nuclide number density at the end of each DTS stems from the uncertainties of the nuclide number densities (NND) and microscopic reaction rates (MRR) of nuclides at the beginning of the DTS, and these are determined by computing correlation coefficients between the two uncertain variables. To test the viability of the formulation, we conducted MC depletion analyses for two sample depletion problems involving a simplified 7×7 fuel assembly (FA) and a 17×17 PWR FA, determined the number densities of uranium and plutonium isotopes and their variances as well as k_∞ and its variance as a function of DTS, and demonstrated the applicability of the new formulation for the uncertainty propagation analysis needed in MC depletion computations. (authors)
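
    A first-order illustration of the variance decomposition idea, with a tally modelled as the product of a correlated reaction rate and number density (the numbers are assumed; the paper's formulation is more general):

```python
import numpy as np

rng = np.random.default_rng(11)

# Correlated samples of a microscopic reaction rate r and a nuclide number
# density n, as arise across depletion steps; the tally is t = r * n.
cov = [[0.04, 0.01],
       [0.01, 0.09]]
r, n = rng.multivariate_normal([1.0, 1.0], cov, 50_000).T
t = r * n

# First-order decomposition of var(t) into three of the sources named
# above: reaction-rate variance, number-density variance, and their
# cross correlation.
v_rate = n.mean()**2 * r.var()
v_dens = r.mean()**2 * n.var()
v_corr = 2.0 * r.mean() * n.mean() * np.cov(r, n)[0, 1]
print(t.var(), v_rate + v_dens + v_corr)   # agree up to higher-order terms
```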

  2. Fuel performance analysis code 'FAIR'

    International Nuclear Information System (INIS)

    Swami Prasad, P.; Dutta, B.K.; Kushwaha, H.S.; Mahajan, S.C.; Kakodkar, A.

    1994-01-01

    For modelling the fuel rod behaviour of water-cooled reactors under severe power manoeuvring and high burnups, a mechanistic fuel performance analysis code, FAIR, has been developed. The code incorporates a finite-element-based thermomechanical module, a physically based fission gas release module, and relevant models for fuel-related phenomena, such as pellet cracking, densification and swelling, radial flux redistribution across the pellet due to the build-up of plutonium near the pellet surface, and pellet-clad mechanical interaction/stress corrosion cracking (PCMI/SCC) failure of the sheath. The code follows the established principles of fuel rod analysis programmes, such as coupling of the thermal and mechanical solutions along with the fission gas release calculations, analysing different axial segments of the fuel rod simultaneously, and providing means for performing local analyses such as clad ridging analysis. The modular nature of the code offers flexibility in easily making modifications for modelling MOX fuels and thorium-based fuels. To allow analysis of fuel rods subjected to very long power histories within a reasonable amount of time, the code has been parallelised and commissioned on the ANUPAM parallel processing system developed at Bhabha Atomic Research Centre (BARC). (author). 37 refs

  3. The role of the PIRT process in identifying code improvements and executing code development

    International Nuclear Information System (INIS)

    Wilson, G.E.; Boyack, B.E.

    1997-01-01

    In September 1988, the USNRC issued a revised ECCS rule for light water reactors that allows, as an option, the use of best estimate (BE) plus uncertainty methods in safety analysis. The key feature of this licensing option relates to quantification of the uncertainty in the determination that an NPP has a low probability of violating the safety criteria specified in 10 CFR 50. To support the 1988 licensing revision, the USNRC and its contractors developed the CSAU evaluation methodology to demonstrate the feasibility of the BE plus uncertainty approach. The PIRT process, Step 3 in the CSAU methodology, was originally formulated to support the BE plus uncertainty licensing option as executed in the CSAU approach to safety analysis. Subsequent work has shown the PIRT process to be a much more powerful tool than conceived in its original form. Through further development and application, the PIRT process has shown itself to be a robust means to establish safety analysis computer code phenomenological requirements in their order of importance to such analyses. Used early in research directed toward these objectives, PIRT results also provide the technical basis and cost effective organization for new experimental programs needed to improve the safety analysis codes for new applications. The primary purpose of this paper is to describe the generic PIRT process, including typical and common illustrations from prior applications. The secondary objective is to provide guidance to future applications of the process to help them focus, in a graded approach, on systems, components, processes and phenomena that have been common in several prior applications

  4. Cassini Spacecraft Uncertainty Analysis Data and Methodology Review and Update/Volume 1: Updated Parameter Uncertainty Models for the Consequence Analysis

    Energy Technology Data Exchange (ETDEWEB)

    WHEELER, TIMOTHY A.; WYSS, GREGORY D.; HARPER, FREDERICK T.

    2000-11-01

    Uncertainty distributions for specific parameters of the Cassini General Purpose Heat Source Radioisotope Thermoelectric Generator (GPHS-RTG) Final Safety Analysis Report consequence risk analysis were revised and updated. The revisions and updates were done for all consequence parameters for which relevant information exists from the joint project on Probabilistic Accident Consequence Uncertainty Analysis by the United States Nuclear Regulatory Commission and the Commission of European Communities.

  5. Sensitivity analysis of fuel pin failure performance under slow-ramp type transient overpower condition by using a fuel performance analysis code FEMAXI-FBR

    International Nuclear Information System (INIS)

    Tsuboi, Yasushi; Ninokata, Hisashi; Endo, Hiroshi; Ishizu, Tomoko; Tatewaki, Isao; Saito, Hiroaki

    2012-01-01

    FEMAXI-FBR is a fuel performance analysis code that has been developed as one module of the core disruptive accident evaluation system ASTERIA-FBR. FEMAXI-FBR has reproduced the failed-pin behaviour observed during slow transient overpower. The axial location of pin failure affects the power and reactivity behaviour during a core disruptive accident, and a failure model in which pin failure occurs at the upper part of the pin is used, reflecting the results of the CABRI-2 tests. Using FEMAXI-FBR, a sensitivity analysis of the uncertainty of design parameters, such as irradiation conditions and fuel fabrication tolerances, was performed to clarify their effect on the axial location of pin failure during slow transient overpower. The sensitivity analysis showed that the uncertainty of the design parameters does not affect the failure location. This suggests that the failure model in which failure occurs at the upper part of the pin can be adopted for core disruptive accident calculations, even when design uncertainties are taken into consideration. (author)

  6. Uncertainty characteristics of EPA's ground-water transport model for low-level waste performance assessment

    International Nuclear Information System (INIS)

    Yim, Man-Sung

    1995-01-01

    Performance assessment is an essential step, both in design and in licensing processes, to ensure the safety of any proposed radioactive waste disposal facility. Since performance assessment requires the use of computer codes, understanding the characteristics of the computer models used and the uncertainties of the estimated results is important. The PRESTO-EPA code, which was the basis of the Environmental Protection Agency's analysis for low-level-waste rulemaking, is widely used for various performance assessment activities in the country, although no adequate information has been available on the uncertainty characteristics of its results. In this study, the groundwater transport model of PRESTO-EPA was examined based on the analysis of 14C transport, along with an investigation of its uncertainty characteristics.

  7. Uncertainty study of the PWR pressure vessel fluence. Adjustment of the nuclear data base

    International Nuclear Information System (INIS)

    Kodeli, I.A.

    1994-01-01

    A code system devoted to the calculation of the sensitivity and uncertainty of the neutron flux and reaction rates calculated by transport codes has been developed. Adjustment of the basic data to experimental results can be performed as well. Various sources of uncertainty can be taken into account, such as those due to the uncertainties in the cross-sections, response functions, fission spectrum and space distribution of the neutron source, as well as geometry and material composition uncertainties. Both one- and two-dimensional analyses can be performed. Linear perturbation theory is applied. The code system is sufficiently general to be used for various analyses in the fields of fission and fusion. The principal objective of our studies concerns the capsule dosimetry study realized in the framework of the 900 MWe PWR pressure vessel surveillance programme. The analysis indicates that the present calculations, performed with the code TRIPOLI-2 using the ENDF/B-IV based, non-perturbed neutron cross-section library in 315 energy groups, allow the neutron flux and the reaction rates in the surveillance capsules to be estimated; the adjustment based on the calculated and measured reaction rates permits the corresponding uncertainties to be reduced. The results obtained with the adjusted iron cross-sections, response functions and fission spectrum show that the agreement between the calculation and the experiment was improved to within approximately 10%. The neutron flux deduced from the experiment is then extrapolated from the capsule to the most exposed pressure vessel location using the calculated lead factor. The uncertainty in this factor was estimated to be about 7%. (author). 39 refs., 52 figs., 30 tabs
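
    Nuclear data adjustment of the kind described above is commonly cast as a generalized least-squares update; a minimal sketch under that assumption (generic notation, not the author's code):

```python
import numpy as np

def gls_adjust(sigma, M, S, d, V):
    """Generalized least-squares adjustment.
    sigma: prior data vector; M: its covariance; S: sensitivities of the
    measured responses to sigma; d: measured-minus-calculated discrepancies;
    V: measurement covariance."""
    K = M @ S.T @ np.linalg.inv(S @ M @ S.T + V)  # gain matrix
    return sigma + K @ d, M - K @ S @ M  # adjusted data, reduced covariance

# Toy example: two data points, one integral measurement (invented numbers)
M = np.array([[0.04, 0.01], [0.01, 0.02]])
S = np.array([[0.8, 0.3]])
sigma_post, M_post = gls_adjust(np.ones(2), M, S,
                                np.array([0.05]), np.array([[0.001]]))
print(sigma_post, np.diag(M_post))  # posterior variances shrink
```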

  8. Uncertainty Analysis of Light Water Reactor Fuel Lattices

    Directory of Open Access Journals (Sweden)

    C. Arenas

    2013-01-01

    Full Text Available The study explored the calculation of uncertainty based on available cross-section covariance data and computational tools at the fuel lattice level, which included pin cell and fuel assembly models. Uncertainty variations due to temperature changes and different fuel compositions are the main focus of this analysis. Selected assemblies and unit pin cells were analyzed according to the OECD LWR UAM benchmark specifications. Criticality and uncertainty analyses were performed using the TSUNAMI-2D sequence in SCALE 6.1. It was found that uncertainties increase with increasing temperature, while kinf decreases. This increase in the uncertainty is due to the increase in sensitivity of the largest contributing reaction, namely the neutron capture reaction 238U(n,γ), as a result of Doppler broadening. In addition, three types of fuel material composition (UOX, MOX, and UOX-Gd2O3) were analyzed. A remarkable increase in uncertainty in kinf was observed for the case of MOX fuel; the increase was nearly twice the corresponding value in UOX fuel. The neutron-nuclide reactions of 238U, mainly inelastic scattering (n,n′), contributed the most to the uncertainties in the MOX fuel, whose neutron spectrum is shifted to higher energy compared to the UOX fuel.

  9. CASL L1 Milestone report : CASL.P4.01, sensitivity and uncertainty analysis for CIPS with VIPRE-W and BOA.

    Energy Technology Data Exchange (ETDEWEB)

    Sung, Yixing (Westinghouse Electric Company LLC, Cranberry Township, PA); Adams, Brian M.; Secker, Jeffrey R. (Westinghouse Electric Company LLC, Cranberry Township, PA)

    2011-12-01

    The CASL Level 1 Milestone CASL.P4.01, successfully completed in December 2011, aimed to 'conduct, using methodologies integrated into VERA, a detailed sensitivity analysis and uncertainty quantification of a crud-relevant problem with baseline VERA capabilities (ANC/VIPRE-W/BOA).' The VUQ focus area led this effort, in partnership with AMA, and with support from VRI. DAKOTA was coupled to existing VIPRE-W thermal-hydraulics and BOA crud/boron deposit simulations representing a pressurized water reactor (PWR) that previously experienced crud-induced power shift (CIPS). This work supports understanding of CIPS by exploring the sensitivity and uncertainty in BOA outputs with respect to uncertain operating and model parameters. This report summarizes work coupling the software tools, characterizing uncertainties, and analyzing the results of iterative sensitivity and uncertainty studies. These studies focused on sensitivity and uncertainty of CIPS indicators calculated by the current version of the BOA code used in the industry. Challenges with this kind of analysis are identified to inform follow-on research goals and VERA development targeting crud-related challenge problems.

  10. CASL L1 Milestone report: CASL.P4.01, sensitivity and uncertainty analysis for CIPS with VIPRE-W and BOA

    International Nuclear Information System (INIS)

    Sung, Yixing; Adams, Brian M.; Secker, Jeffrey R.

    2011-01-01

    The CASL Level 1 Milestone CASL.P4.01, successfully completed in December 2011, aimed to 'conduct, using methodologies integrated into VERA, a detailed sensitivity analysis and uncertainty quantification of a crud-relevant problem with baseline VERA capabilities (ANC/VIPRE-W/BOA).' The VUQ focus area led this effort, in partnership with AMA, and with support from VRI. DAKOTA was coupled to existing VIPRE-W thermal-hydraulics and BOA crud/boron deposit simulations representing a pressurized water reactor (PWR) that previously experienced crud-induced power shift (CIPS). This work supports understanding of CIPS by exploring the sensitivity and uncertainty in BOA outputs with respect to uncertain operating and model parameters. This report summarizes work coupling the software tools, characterizing uncertainties, and analyzing the results of iterative sensitivity and uncertainty studies. These studies focused on sensitivity and uncertainty of CIPS indicators calculated by the current version of the BOA code used in the industry. Challenges with this kind of analysis are identified to inform follow-on research goals and VERA development targeting crud-related challenge problems.

  11. Estimating uncertainty of inference for validation

    Energy Technology Data Exchange (ETDEWEB)

    Booker, Jane M [Los Alamos National Laboratory; Langenbrunner, James R [Los Alamos National Laboratory; Hemez, Francois M [Los Alamos National Laboratory; Ross, Timothy J [UNM

    2010-09-30

    We present a validation process based upon the concept that validation is an inference-making activity. This has always been true, but the association has not been as important before as it is now. Previously, theory had been confirmed by more data, and predictions were possible based on data. The process today is to infer from theory to code and from code to prediction, making the role of prediction somewhat automatic, a machine function. Validation is defined as determining the degree to which a model and code are an accurate representation of experimental test data. Embedded in validation is the intention to use the computer code to predict. To predict is to accept the conclusion that an observable final state will manifest; therefore, prediction is an inference whose goodness relies on the validity of the code. Quantifying the uncertainty of a prediction amounts to quantifying the uncertainty of validation, and this involves the characterization of the uncertainties inherent in theory/models/codes and in the corresponding data. An introduction to inference making and its associated uncertainty is provided as a foundation for the validation problem. A mathematical construction for estimating the uncertainty in the validation inference is then presented, including a possibility distribution constructed to represent the inference uncertainty for validation under uncertainty. The estimation of inference uncertainty for validation is illustrated using data and calculations from Inertial Confinement Fusion (ICF). The ICF measurements of neutron yield and ion temperature were obtained for direct-drive inertial fusion capsules at the Omega laser facility. The glass capsules, containing the fusion gas, were systematically selected with the intent of establishing a reproducible baseline of high-yield 10^13-10^14 neutron output. The deuterium-tritium ratio in these experiments was varied to study its influence upon yield. This paper on validation inference is the

  12. Large break LOCA uncertainty evaluation and comparison with conservative calculation

    International Nuclear Information System (INIS)

    Glaeser, H.G.

    2004-01-01

    is different to the USA. Significant differences in results are presented between conservative calculations according to the USA Code of Federal Regulations, which requires the application of conservative models in conformance with the required and acceptable features of ECCS Evaluation Models, and best estimate plus uncertainty evaluations. Consequently, additional margin to licensing criteria is available by changing from conservative evaluation to best estimate calculations plus uncertainty analysis in the USA. This is not the case in other countries, where the use of best estimate computer codes is already a common practice for 'conservative' calculations. However, the uncertainty of calculation results is especially important when approaching licensing limits, e.g. due to power uprates. This is the reason why a sub-committee of the German Reactor Safety Commission recently recommended the assessment of uncertainty in calculated results in licensing

  13. An analysis of the CSNI/GREST core concrete interaction chemical thermodynamic benchmark exercise using the MPEC2 computer code

    International Nuclear Information System (INIS)

    Muramatsu, Ken; Kondo, Yasuhiko; Uchida, Masaaki; Soda, Kunihisa

    1989-01-01

    Fission product (FP) release during a core concrete interaction (CCI) is an important contributor to the uncertainty associated with source term estimation for an LWR severe accident. An analysis was made of the CCI Chemical Thermodynamic Benchmark Exercise organized by the OECD/NEA/CSNI Group of Experts on Source Terms (GREST) for investigating the uncertainty in the thermodynamic modeling of CCI. The benchmark exercise was to calculate the equilibrium FP vapor pressure for a given system of temperature, pressure, and debris composition. The benchmark consisted of two parts, A and B. Part A was a simplified problem intended to test the numerical techniques. In part B, the participants were requested to use their own best-estimate thermodynamic data base to examine the variability of the results due to differences in the thermodynamic data bases. JAERI participated in this benchmark exercise using the MPEC2 code. The chemical thermodynamic data base needed for the analysis of part B was taken from the VENESA code. This report describes the computer code used, the inputs to the code, and the results of the calculations by JAERI. The present calculation indicates that the FP vapor pressure depends strongly on the temperature and oxygen potential in the core debris, and that the pattern of dependency may be different for different FP elements. (author)

  14. Bayesian models for comparative analysis integrating phylogenetic uncertainty

    Directory of Open Access Journals (Sweden)

    Villemereuil Pierre de

    2012-06-01

    general purpose tool for phylogenetic comparative analyses, particularly for modelling in the face of phylogenetic uncertainty and accounting for measurement error or individual variation in explanatory variables. Code for all models is provided in the BUGS model description language.

  15. Bayesian models for comparative analysis integrating phylogenetic uncertainty

    Science.gov (United States)

    2012-01-01

    phylogenetic comparative analyses, particularly for modelling in the face of phylogenetic uncertainty and accounting for measurement error or individual variation in explanatory variables. Code for all models is provided in the BUGS model description language. PMID:22741602

  16. Nordic reference study on uncertainty and sensitivity analysis

    International Nuclear Information System (INIS)

    Hirschberg, S.; Jacobsson, P.; Pulkkinen, U.; Porn, K.

    1989-01-01

    This paper provides a review of the first phase of the Nordic reference study on uncertainty and sensitivity analysis. The main objective of this study is to use experiences from previous Nordic Benchmark Exercises and reference studies concerning critical modeling issues, such as common cause failures and human interactions, and to demonstrate the impact of the associated uncertainties on the uncertainty of the investigated accident sequence. This has been done independently by three working groups, which used different approaches to modeling and to uncertainty analysis. The estimated uncertainty interval for the analyzed accident sequence is large. The discrepancies between the groups are also substantial, but can be explained. The sensitivity analyses which have been carried out concern, e.g., the use of different CCF quantification models, alternative handling of CCF data, time windows for operator actions, time dependences in phased-mission operation, the impact of state-of-knowledge dependences, and the ranking of dominating uncertainty contributors. Specific findings with respect to these issues are summarized in the paper

  17. Application of uncertainty analysis method for calculations of accident conditions for RP AES-2006

    International Nuclear Information System (INIS)

    Zajtsev, S.I.; Bykov, M.A.; Zakutaev, M.O.; Siryapin, V.N.; Petkevich, I.G.; Siryapin, N.V.; Borisov, S.L.; Kozlachkov, A.N.

    2015-01-01

    An analysis of some accidents using uncertainty assessment methods is given. The list of variable parameters incorporated the model parameters of the computer codes, the initial and boundary conditions of the reactor plant, and the neutronics data. On the basis of the performed calculations of the accident conditions using the statistical method, an assessment of the errors in the determination of the main parameters comparable with the acceptance criteria is presented. It was shown that, in the investigated accidents, the values of the calculated parameters, with account taken of the errors obtained from the TRAP-KS and KORSAR/GP codes, do not exceed the established acceptance criteria. Moreover, these values do not exceed the values obtained in the conservative calculations. The possibility in principle of applying the uncertainty estimation method to justify the safety of WWER AES-2006, using the thermal-physical codes KORSAR/GP and TRAP-KS and the PANDA and SUSA programs, was shown.

  18. Uncertainty and Sensitivity Analyses Plan

    International Nuclear Information System (INIS)

    Simpson, J.C.; Ramsdell, J.V. Jr.

    1993-04-01

    Hanford Environmental Dose Reconstruction (HEDR) Project staff are developing mathematical models to be used to estimate the radiation dose that individuals may have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. An uncertainty and sensitivity analyses plan is essential to understand and interpret the predictions from these mathematical models. This is especially true in the case of the HEDR models where the values of many parameters are unknown. This plan gives a thorough documentation of the uncertainty and hierarchical sensitivity analysis methods recommended for use on all HEDR mathematical models. The documentation includes both technical definitions and examples. In addition, an extensive demonstration of the uncertainty and sensitivity analysis process is provided using actual results from the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC). This demonstration shows how the approaches used in the recommended plan can be adapted for all dose predictions in the HEDR Project

  19. Dakota, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis:version 4.0 developers manual.

    Energy Technology Data Exchange (ETDEWEB)

    Griffin, Joshua D. (Sandia National Laboratories, Livermore, CA); Eldred, Michael Scott; Martinez-Canales, Monica L. (Sandia National Laboratories, Livermore, CA); Watson, Jean-Paul; Kolda, Tamara Gibson (Sandia National Laboratories, Livermore, CA); Giunta, Anthony Andrew; Adams, Brian M.; Swiler, Laura Painton; Williams, Pamela J. (Sandia National Laboratories, Livermore, CA); Hough, Patricia Diane (Sandia National Laboratories, Livermore, CA); Gay, David M.; Dunlavy, Daniel M.; Eddy, John P.; Hart, William Eugene; Brown, Shannon L.

    2006-10-01

    The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic finite element methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a developers manual for the DAKOTA software and describes the DAKOTA class hierarchies and their interrelationships. It derives directly from annotation of the actual source code and provides detailed class documentation, including all member functions and attributes.

  20. DAKOTA : a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis. Version 5.0, developers manual.

    Energy Technology Data Exchange (ETDEWEB)

    Eldred, Michael Scott; Dalbey, Keith R.; Bohnhoff, William J.; Adams, Brian M.; Swiler, Laura Painton; Hough, Patricia Diane (Sandia National Laboratories, Livermore, CA); Gay, David M.; Eddy, John P.; Haskell, Karen H.

    2010-05-01

    The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic finite element methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a developers manual for the DAKOTA software and describes the DAKOTA class hierarchies and their interrelationships. It derives directly from annotation of the actual source code and provides detailed class documentation, including all member functions and attributes.

  1. Durability reliability analysis for corroding concrete structures under uncertainty

    Science.gov (United States)

    Zhang, Hao

    2018-02-01

    This paper presents a durability reliability analysis of reinforced concrete structures subject to the action of marine chloride. The focus is to provide insight into the role of epistemic uncertainties on durability reliability. The corrosion model involves a number of variables whose probabilistic characteristics cannot be fully determined due to the limited availability of supporting data. All sources of uncertainty, both aleatory and epistemic, should be included in the reliability analysis. Two methods are available to formulate the epistemic uncertainty: the imprecise probability-based method and the purely probabilistic method in which the epistemic uncertainties are modeled as random variables. The paper illustrates how the epistemic uncertainties are modeled and propagated in the two methods, and shows how epistemic uncertainties govern the durability reliability.

  2. Dakota, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis :

    Energy Technology Data Exchange (ETDEWEB)

    Adams, Brian M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ebeida, Mohamed Salah [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Eldred, Michael S. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jakeman, John Davis [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Swiler, Laura Painton [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Stephens, John Adam [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vigil, Dena M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wildey, Timothy Michael [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bohnhoff, William J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Eddy, John P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hu, Kenneth T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dalbey, Keith R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bauman, Lara E [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hough, Patricia Diane [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-05-01

    The Dakota (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a user's manual for the Dakota software and provides capability overviews and procedures for software execution, as well as a variety of example studies.

  3. Developing a methodology for the evaluation of results uncertainties in CFD codes; Desarrollo de una Metodologia para la Evaluacion de Incertidumbres en los Resultados de Codigos de CFD

    Energy Technology Data Exchange (ETDEWEB)

    Munoz-cobo, J. L.; Chiva, S.; Pena, C.; Vela, E.

    2014-07-01

    In this work the development of a methodology to evaluate the uncertainty in the results of CFD codes is studied, compatible with the V&V 20 standard ('Standard for Verification and Validation in Computational Fluid Dynamics and Heat Transfer') developed by the American Society of Mechanical Engineers (ASME). Likewise, the existing alternatives for obtaining the uncertainty in the results are studied to see which is the best choice from the point of view of implementation and computing time. We have developed two methods for calculating the uncertainty of the results of a CFD code. The first method is based on the use of Monte Carlo techniques for the propagation of uncertainty; for this first method we think it is preferable to use order statistics to determine the number of cases in which to execute the code, because in this way we can always determine the confidence interval at the desired level for the output quantities. The second type of method we have developed is based on non-intrusive polynomial chaos. (Author)
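
    The order-statistics argument mentioned above is usually the Wilks formula; a minimal sketch (my illustration, not the authors' implementation) of the run count for a one-sided tolerance limit:

```python
import math

def wilks_runs(coverage=0.95, confidence=0.95):
    """Smallest N such that the maximum of N random code runs bounds the
    `coverage` quantile of the output with probability `confidence`
    (first-order, one-sided Wilks criterion: 1 - coverage**N >= confidence)."""
    return math.ceil(math.log(1.0 - confidence) / math.log(coverage))

print(wilks_runs())            # 59 runs for the classic 95%/95% case
print(wilks_runs(0.95, 0.99))  # 90 runs for 95% coverage at 99% confidence
```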

  4. Uncertainty and Sensitivity Analysis Applied to the Validation of BWR Bundle Thermal-Hydraulic Calculations

    International Nuclear Information System (INIS)

    Hernandez-Solis, Augusto

    2010-04-01

    This work has two main objectives. The first is to enhance the validation process of the thermal-hydraulic features of the Westinghouse code POLCA-T. This is achieved by computing a quantitative validation limit based on statistical uncertainty analysis. This validation theory is applied to some of the benchmark cases of the following macroscopic BFBT exercises: 1) single- and two-phase bundle pressure drops, 2) steady-state cross-sectional averaged void fraction, 3) transient cross-sectional averaged void fraction, and 4) steady-state critical power tests. Sensitivity analysis is also performed to identify the most important uncertain parameters for each exercise. The second objective consists in showing the clear advantages of the quasi-random Latin Hypercube Sampling (LHS) strategy over simple random sampling (SRS). LHS allows a much better coverage of the input uncertainties than SRS because it stratifies densely across the range of each input probability distribution. The aim here is to compare both uncertainty analyses on the BWR assembly void axial profile prediction in steady state, and on the transient void fraction prediction at a certain axial level during a simulated re-circulation pump trip scenario. It is shown that the replicated void fraction mean (either in steady-state or transient conditions) has less variability when using LHS than SRS for the same number of calculations (i.e. the same input space sample size), even if the resulting void fraction axial profiles are non-monotonic. It is also shown that the void fraction uncertainty limits achieved with SRS by running 458 calculations (the sample size required to cover 95% of 8 uncertain input parameters with 95% confidence) are achieved by LHS with only 100 calculations. These are thus clear indications of the advantages of using LHS. Finally, the present study contributes to a realistic analysis of nuclear reactors, in the sense that the uncertainties of
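
    A minimal sketch of the LHS-versus-SRS comparison described above, using scipy's quasi-Monte Carlo module (my illustration; the toy model stands in for the thermal-hydraulic code):

```python
import numpy as np
from scipy.stats import qmc

n, d = 100, 8                         # 100 code runs, 8 uncertain inputs
rng = np.random.default_rng(7)

srs = rng.random((n, d))              # simple random sampling of [0,1]^d
lhs = qmc.LatinHypercube(d=d, seed=7).random(n)  # one stratum per run per dim

def model(x):                         # stand-in for the code output
    return x.sum(axis=1)

# LHS sample means scatter less around the true mean (d/2 = 4) than SRS
print(model(srs).mean(), model(lhs).mean())
```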

  5. Some reflections on uncertainty analysis and management

    International Nuclear Information System (INIS)

    Aven, Terje

    2010-01-01

    A guide to quantitative uncertainty analysis and management in industry has recently been issued. The guide provides an overall framework for uncertainty modelling and characterisations, using probabilities but also other uncertainty representations (including the Dempster-Shafer theory). A number of practical applications showing how to use the framework are presented. The guide is considered as an important contribution to the field, but there is a potential for improvements. These relate mainly to the scientific basis and clarification of critical issues, for example, concerning the meaning of a probability and the concept of model uncertainty. A reformulation of the framework is suggested using probabilities as the only representation of uncertainty. Several simple examples are included to motivate and explain the basic ideas of the modified framework.

  6. IAEA CRP on HTGR Uncertainties in Modeling: Assessment of Phase I Lattice to Core Model Uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Rouxelin, Pascal Nicolas [Idaho National Lab. (INL), Idaho Falls, ID (United States); Strydom, Gerhard [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-09-01

    Best-estimate plus uncertainty analysis of reactors is replacing the traditional conservative (stacked-uncertainty) method for safety and licensing analysis. To facilitate uncertainty analysis applications, a comprehensive approach and methodology must be developed and applied. High temperature gas cooled reactors (HTGRs) have several features that require techniques not used in light-water reactor analysis (e.g., coated-particle design and large graphite quantities at high temperatures). The International Atomic Energy Agency has therefore launched the Coordinated Research Project on HTGR Uncertainty Analysis in Modeling to study uncertainty propagation in the HTGR analysis chain. The benchmark problem defined for the prismatic design is represented by the General Atomics Modular HTGR 350. The main focus of this report is the compilation and discussion of the results obtained for various permutations of Exercise I-2c and the use of the cross section data in Exercise II-1a of the prismatic benchmark, which are defined as the last and first steps of the lattice and core simulation phases, respectively. The report summarizes the Idaho National Laboratory (INL) best estimate results obtained for Exercise I-2a (fresh single-fuel block), Exercise I-2b (depleted single-fuel block), and Exercise I-2c (super cell), in addition to the first results of an investigation into the cross section generation effects for the super-cell problem. The two-dimensional deterministic code known as New ESC-based Weighting Transport (NEWT), included in the Standardized Computer Analyses for Licensing Evaluation (SCALE) 6.1.2 package, was used for the cross section evaluation, and the results obtained were compared to the three-dimensional stochastic SCALE module KENO-VI. The NEWT cross section libraries were generated for several permutations of the current benchmark super-cell geometry and were then provided as input to the Phase II core calculation of the stand-alone neutronics Exercise

  7. The explicit treatment of model uncertainties in the presence of aleatory and epistemic parameter uncertainties in risk and reliability analysis

    International Nuclear Information System (INIS)

    Ahn, Kwang Il; Yang, Joon Eon

    2003-01-01

    In the risk and reliability analysis of complex technological systems, the primary concern of formal uncertainty analysis is to understand why uncertainties arise and to evaluate how they impact the results of the analysis. In recent times, many uncertainty analyses have focused on the parameters of the risk and reliability models, whose values are uncertain in an aleatory or an epistemic way. As the field of parametric uncertainty analysis matures, however, more attention is being paid to the explicit treatment of uncertainties that reside in the predictive model itself, as well as to the accuracy of the predictive model. The essential steps for evaluating the impacts of these model uncertainties in the presence of parameter uncertainties are to rigorously determine the various sources of uncertainty to be addressed, in the underlying model itself and in turn in the model parameters, based on our state of knowledge and relevant evidence. Answering clearly the question of how to characterize and explicitly treat the foregoing sources of uncertainty is particularly important for practical aspects such as the risk and reliability optimization of systems, as well as for more transparent risk information and decision-making under various uncertainties. The main purpose of this paper is to provide practical guidance for quantitatively treating the various model uncertainties that are often encountered in the risk and reliability modeling process of complex technological systems

  8. Uncertainty and sensitivity analysis in nuclear accident consequence assessment

    International Nuclear Information System (INIS)

    Karlberg, Olof.

    1989-01-01

    This report contains the results of a four-year project carried out under research contracts with the Nordic Cooperation in Nuclear Safety and the National Institute for Radiation Protection. An uncertainty/sensitivity analysis methodology consisting of Latin Hypercube sampling and regression analysis was applied to an accident consequence model. A number of input parameters were selected, and the uncertainties related to these parameters were estimated by a Nordic group of experts. Individual doses, collective dose, health effects and their related uncertainties were then calculated for three release scenarios and for a representative sample of meteorological situations. For two of the scenarios the acute phase after an accident was simulated, and for one the long-term consequences. The most significant parameters were identified. The outer limits of the calculated uncertainty distributions are large and grow to several orders of magnitude for the low-probability consequences. The uncertainty in the expectation values is typically a factor of 2-5 (1 sigma). The variation in the model responses due to the variation of the weather parameters is roughly equal to the variation induced by the parameter uncertainties. The most important parameters turned out to be different for each pathway of exposure, as could be expected. However, the overall most important parameters are the wet deposition coefficient and the shielding factors. A general discussion of the usefulness of uncertainty analysis in consequence analysis is also given. (au)

  9. Uncertainty Propagation in OMFIT

    Science.gov (United States)

    Smith, Sterling; Meneghini, Orso; Sung, Choongki

    2017-10-01

    A rigorous comparison of power balance fluxes and turbulent model fluxes requires the propagation of uncertainties in the kinetic profiles and their derivatives. Making extensive use of the python uncertainties package, the OMFIT framework has been used to propagate covariant uncertainties to provide an uncertainty in the power balance calculation from the ONETWO code, as well as through the turbulent fluxes calculated by the TGLF code. The covariant uncertainties arise from fitting 1D (constant on flux surface) density and temperature profiles and associated random errors with parameterized functions such as a modified tanh. The power balance and model fluxes can then be compared with quantification of the uncertainties. No effort is made at propagating systematic errors. A case study will be shown for the effects of resonant magnetic perturbations on the kinetic profiles and fluxes at the top of the pedestal. A separate attempt at modeling the random errors with Monte Carlo sampling will be compared to the method of propagating the fitting function parameter covariant uncertainties. Work supported by US DOE under DE-FC02-04ER54698, DE-FG2-95ER-54309, DE-SC 0012656.
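
    A minimal sketch of correlated-uncertainty propagation with the python uncertainties package (the package and calls are real; the fit parameters and covariance are invented for illustration):

```python
from uncertainties import correlated_values

# Hypothetical tanh-fit parameters with an invented covariance matrix
params = [1.2, 0.05]                  # e.g. pedestal height and width
cov = [[1.0e-2, 2.0e-3],
       [2.0e-3, 4.0e-4]]
height, width = correlated_values(params, cov)

# A derived quantity; the parameter correlations propagate automatically
grad = height / width
print(grad)                           # nominal value +/- standard deviation
```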

  10. The role of the PIRT process in identifying code improvements and executing code development

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, G.E. [Idaho National Engineering Lab., Idaho Falls, ID (United States); Boyack, B.E. [Los Alamos National Lab., NM (United States)

    1997-07-01

    In September 1988, the USNRC issued a revised ECCS rule for light water reactors that allows, as an option, the use of best estimate (BE) plus uncertainty methods in safety analysis. The key feature of this licensing option relates to quantification of the uncertainty in the determination that an NPP has a 'low' probability of violating the safety criteria specified in 10 CFR 50. To support the 1988 licensing revision, the USNRC and its contractors developed the CSAU evaluation methodology to demonstrate the feasibility of the BE plus uncertainty approach. The PIRT process, Step 3 in the CSAU methodology, was originally formulated to support the BE plus uncertainty licensing option as executed in the CSAU approach to safety analysis. Subsequent work has shown the PIRT process to be a much more powerful tool than conceived in its original form. Through further development and application, the PIRT process has shown itself to be a robust means to establish safety analysis computer code phenomenological requirements in their order of importance to such analyses. Used early in research directed toward these objectives, PIRT results also provide the technical basis and cost effective organization for new experimental programs needed to improve the safety analysis codes for new applications. The primary purpose of this paper is to describe the generic PIRT process, including typical and common illustrations from prior applications. The secondary objective is to provide guidance to future applications of the process to help them focus, in a graded approach, on systems, components, processes and phenomena that have been common in several prior applications.

  11. Uncertainty analysis with a view towards applications in accident consequence assessments

    International Nuclear Information System (INIS)

    Fischer, F.; Erhardt, J.

    1985-09-01

    Since the publication of the US Reactor Safety Study WASH-1400 there has been an increasing interest in developing and applying methods which allow the quantification of the uncertainty inherent in probabilistic risk assessments (PRAs) and accident consequence assessments (ACAs) for installations of the nuclear fuel cycle. Research and development in this area is driven by the fact that PRA and ACA are more and more used for comparative, decision-making and fact-finding studies initiated by industry and regulatory commissions. This report summarizes and reviews some of the main methods and gives some guidance for performing sensitivity and uncertainty analyses. Some first investigations aimed at applying the methods mentioned above to a submodel of the ACA code UFOMOD (KfK) are presented. Sensitivity analyses and some uncertainty studies of an important submodel of UFOMOD were carried out to identify the relevant parameters for subsequent uncertainty calculations. (orig./HP)

  12. Contributions to the uncertainty management in numerical modelization: wave propagation in random media and analysis of computer experiments

    International Nuclear Information System (INIS)

    Iooss, B.

    2009-01-01

    The present document constitutes my Habilitation thesis report. It recalls my scientific activity of the last twelve years, from my PhD thesis to the work completed as a research engineer at CEA Cadarache. The two main chapters of this document correspond to two different research fields, both related to the treatment of uncertainty in engineering problems. The first chapter establishes a synthesis of my work on high frequency wave propagation in random media. It relates more specifically to the study of the statistical fluctuations of acoustic wave travel-times in random and/or turbulent media. The new results mainly concern the introduction of the statistical anisotropy of the velocity field into the analytical expressions of the travel-time statistical moments as functions of those of the velocity field. This work was primarily driven by requirements in geophysics (oil exploration and seismology). The second chapter concerns the probabilistic techniques used to study the effect of input variable uncertainties in numerical models. My main applications in this chapter relate to the nuclear engineering domain, which offers a large variety of uncertainty problems to be treated. First of all, a complete synthesis is carried out of the statistical methods of sensitivity analysis and global exploration of numerical models. The construction and use of a meta-model (an inexpensive mathematical function replacing an expensive computer code) are then illustrated by my work on the Gaussian process model (kriging). Two additional topics are finally approached: the estimation of high quantiles of a computer code output and the analysis of stochastic computer codes. We conclude this memoir with some perspectives on numerical simulation and the use of predictive models in industry. This context is extremely positive for future research and application developments. (author)
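
    A minimal sketch of the metamodel idea mentioned above, fitting a Gaussian process (kriging) surrogate to a handful of expensive code runs (scikit-learn is used for illustration; the test function is invented):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def code(x):                          # stand-in for an expensive computer code
    return np.sin(3 * x) + 0.5 * x

X_train = np.linspace(0, 2, 8).reshape(-1, 1)   # 8 "code runs"
y_train = code(X_train).ravel()

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5))
gp.fit(X_train, y_train)

# The surrogate predicts cheaply, with an uncertainty band for free
X_new = np.linspace(0, 2, 50).reshape(-1, 1)
mean, std = gp.predict(X_new, return_std=True)
```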

  13. Extending CANTUP code analysis to probabilistic evaluations

    International Nuclear Information System (INIS)

    Florea, S.

    2001-01-01

    Structural analysis with numerical methods based on the finite element method plays at present a central role in the evaluation and prediction of structural systems which require safe and reliable operation in aggressive environmental conditions. This is the case for the CANDU-600 fuel channel, where, besides the corrosive and thermal aggression upon the Zr97.5Nb2.5 pressure tubes, lasting irradiation has marked consequences upon the evolution of the material properties. This results in an unavoidable spread of the material properties in time, affected by high uncertainties. Consequently, deterministic evaluations with computation codes based on the finite element method are supplemented by statistical and probabilistic methods of evaluating the response of structural components. This paper reports the work on extending the thermo-mechanical evaluation of the fuel channel components within the framework of probabilistic structural mechanics, based on statistical methods and developed upon deterministic CANTUP code analyses. The CANTUP code was ported from the LAHEY 77 platform onto the Microsoft Developer Studio - Fortran PowerStation 4.0 platform. To test the statistical evaluation of the creep behaviour of the pressure tube, the longitudinal modulus of elasticity (Young's modulus) was used as a random variable, with a normal distribution around the value used in the deterministic analyses. The influence of this random quantity upon the hog and effective stress developed in the pressure tube was studied for two time values, specific to primary and secondary creep. The results obtained after a five-year creep, corresponding to the secondary creep, are presented

  14. A methodology for uncertainty analysis of reference equations of state

    DEFF Research Database (Denmark)

    Cheung, Howard; Frutiger, Jerome; Bell, Ian H.

    We present a detailed methodology for the uncertainty analysis of reference equations of state (EOS) based on Helmholtz energy. In recent years there has been an increased interest in uncertainties of property data and process models of thermal systems. In the literature there are various...... for uncertainty analysis is suggested as a tool for EOS. The uncertainties of the EOS properties are calculated from the experimental values and the EOS model structure through the parameter covariance matrix and subsequent linear error propagation. This allows reporting the uncertainty range (95% confidence...
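
    A minimal sketch of linear error propagation through a parameter covariance matrix, as described above (the property model and covariance are placeholders, not the paper's EOS):

```python
import numpy as np

def propagate(f, theta, cov, eps=1e-6):
    """First-order error propagation var(f) = J cov J^T, with the
    Jacobian J estimated by central finite differences."""
    theta = np.asarray(theta, dtype=float)
    J = np.empty_like(theta)
    for i in range(theta.size):
        step = np.zeros_like(theta)
        step[i] = eps * max(1.0, abs(theta[i]))
        J[i] = (f(theta + step) - f(theta - step)) / (2.0 * step[i])
    return float(np.sqrt(J @ cov @ J))   # one standard uncertainty of f

def f(p):                                 # placeholder two-parameter model
    return p[0] * np.exp(-p[1])

cov = np.array([[1e-4, 1e-5], [1e-5, 4e-6]])  # invented covariance
print(propagate(f, [2.0, 0.3], cov))
```

    Under a normality assumption, a 95% confidence range like the one quoted above corresponds to roughly twice this standard uncertainty.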

  15. Application of FORSS sensitivity and uncertainty methodology to fast reactor benchmark analysis

    International Nuclear Information System (INIS)

    Weisbin, C.R.; Marable, J.H.; Lucius, J.L.; Oblow, E.M.; Mynatt, F.R.; Peelle, R.W.; Perey, F.G.

    1976-12-01

    FORSS is a code system used to study relationships between nuclear reaction cross sections, integral experiments, reactor performance parameter predictions, and associated uncertainties. This paper presents the theory and a code description, as well as the first results of applying FORSS to fast reactor benchmarks. Specifically, for various assemblies and reactor performance parameters, the nuclear data sensitivities were computed by nuclide, reaction type, and energy. Comprehensive libraries of energy-dependent coefficients have been developed in a computer-retrievable format and released for distribution by RSIC and NNCSC. Uncertainties induced by nuclear data were quantified using preliminary, energy-dependent relative covariance matrices evaluated with ENDF/B-IV expectation values and processed for 238U(n,f), 238U(n,γ), 239Pu(n,f), and 239Pu(ν). Nuclear data accuracy requirements to meet specified performance criteria at minimum experimental cost were determined

  16. A study of different approaches for multi-scale sensitivity analysis of the TALL-3D experiment using thermal-hydraulic computer codes

    International Nuclear Information System (INIS)

    Geffray, Clotaire; Macian-Juan, Rafael

    2014-01-01

    In the context of the FP7 European THINS Project, complex thermal-hydraulic phenomena relevant for Generation IV nuclear reactors are investigated. KTH (Sweden) built the TALL-3D facility to investigate the transition from forced to natural circulation of the Lead-Bismuth Eutectic (LBE) in a pool connected to a 3-leg primary circuit with two heaters and a heat exchanger. The simulation of such 3D phenomena is a challenging task. GRS (Germany) developed the coupling between the Computational Fluid Dynamics (CFD) code ANSYS CFX and the system analysis code ATHLET. Such coupled codes combine the advantages of CFD, which allows a fine resolution of 3D phenomena, and of system analysis codes, which are fast running. TUM (Germany) is responsible for the uncertainty and sensitivity analysis of the coupled ATHLET-CFX model in the THINS Project. The influence of modeling uncertainty on simulation results needs to be assessed to characterize and improve the model and, eventually, to assess its performance against experimental data. TUM has developed a computational framework capable of propagating model input uncertainty through the coupled codes. This framework can also be used to apply different approaches for assessing the influence of the uncertain input parameters on the model output (sensitivity analysis). The work reported in this paper focuses on three methods for assessing the sensitivity of the results to the modeling uncertainty. The first method (Morris) allows the computation of the Elementary Effects of the input parameters and is widely used for screening analysis. The second method (Spearman's rank correlation) relies on regression-based non-parametric measures; it is suitable if the relation between the input and the output variables is at least monotonic, with the advantage of a low computational cost. The last method (Sobol') computes so-called total effect indices which account for
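
    A minimal sketch of the second approach above (rank-based sensitivity), ranking inputs by Spearman correlation with the output (my illustration; the toy function stands in for the coupled ATHLET-CFX model):

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
n = 200                               # number of code runs
X = rng.random((n, 3))                # 3 uncertain input parameters

# Stand-in output: strong in x0, moderate in x1, weak in x2, plus noise
y = 4.0 * X[:, 0] + np.exp(X[:, 1]) + 0.1 * X[:, 2] + rng.normal(0, 0.1, n)

for i in range(X.shape[1]):           # Spearman's rho per input
    rho, p = spearmanr(X[:, i], y)
    print(f"x{i}: rho = {rho:+.2f} (p = {p:.3f})")
```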

  17. Uncertainty Analysis of In leakage Test for Pressurized Control Room Envelop

    Energy Technology Data Exchange (ETDEWEB)

    Lee, J. B. [KHNP Central Research Institute, Daejeon (Korea, Republic of)

    2013-10-15

    In-leakage tests for the control room envelopes (CRE) of newly constructed nuclear power plants are required to prove control room habitability. The results of the in-leakage tests should be analyzed using an uncertainty analysis. Test uncertainty can be an issue if the test results for pressurized CREs show low in-leakage. To provide a better understanding of the test uncertainty, a statistical model for the uncertainty analysis is described here, and a representative uncertainty analysis of a sample in-leakage test is presented. By using the statistical method, the test result can be evaluated at a given level of significance. This method is most helpful when the difference between the two mean values of the test results is small.

  18. Uncertainty Analysis of In leakage Test for Pressurized Control Room Envelop

    International Nuclear Information System (INIS)

    Lee, J. B.

    2013-01-01

    In-leakage tests for the control room envelopes (CRE) of newly constructed nuclear power plants are required to prove control room habitability. The results of the in-leakage tests should be analyzed using an uncertainty analysis. Test uncertainty can be an issue if the test results for pressurized CREs show low in-leakage. To provide a better understanding of the test uncertainty, a statistical model for the uncertainty analysis is described here, and a representative uncertainty analysis of a sample in-leakage test is presented. By using the statistical method, the test result can be evaluated at a given level of significance. This method is most helpful when the difference between the two mean values of the test results is small.

  19. Latent uncertainties of the precalculated track Monte Carlo method

    Energy Technology Data Exchange (ETDEWEB)

    Renaud, Marc-André; Seuntjens, Jan [Medical Physics Unit, McGill University, Montreal, Quebec H3G 1A4 (Canada); Roberge, David [Département de radio-oncologie, Centre Hospitalier de l’Université de Montréal, Montreal, Quebec H2L 4M1 (Canada)

    2015-01-15

    Purpose: While significant progress has been made in speeding up Monte Carlo (MC) dose calculation methods, they remain too time-consuming for the purpose of inverse planning. To achieve clinically usable calculation speeds, a precalculated Monte Carlo (PMC) algorithm for proton and electron transport was developed to run on graphics processing units (GPUs). The algorithm utilizes pregenerated particle track data from conventional MC codes for different materials, such as water, bone, and lung, to produce dose distributions in voxelized phantoms. While PMC methods have been described in the past, an explicit quantification of the latent uncertainty arising from the limited number of unique tracks in the pregenerated track bank has been missing from the literature. With a proper uncertainty analysis, an optimal number of tracks in the pregenerated track bank can be selected for a desired dose calculation uncertainty. Methods: Particle tracks were pregenerated for electrons and protons using EGSnrc and GEANT4 and saved in a database. The PMC algorithm for track selection, rotation, and transport was implemented on the Compute Unified Device Architecture (CUDA) 4.0 programming framework. PMC dose distributions were calculated in a variety of media and compared to benchmark dose distributions simulated with the corresponding general-purpose MC codes under the same conditions. A latent uncertainty metric was defined, and the analysis was performed by varying the pregenerated track bank size and the number of simulated primary particle histories and comparing dose values to a “ground truth” benchmark dose distribution calculated to 0.04% average uncertainty in voxels with dose greater than 20% of Dmax. Efficiency metrics were calculated against benchmark MC codes on a single CPU core with no variance reduction. Results: Dose distributions generated using PMC and benchmark MC codes were compared and found to be within 2% of each other in voxels with dose values greater than 20% of

  20. Latent uncertainties of the precalculated track Monte Carlo method

    International Nuclear Information System (INIS)

    Renaud, Marc-André; Seuntjens, Jan; Roberge, David

    2015-01-01

    Purpose: While significant progress has been made in speeding up Monte Carlo (MC) dose calculation methods, they remain too time-consuming for the purpose of inverse planning. To achieve clinically usable calculation speeds, a precalculated Monte Carlo (PMC) algorithm for proton and electron transport was developed to run on graphics processing units (GPUs). The algorithm utilizes pregenerated particle track data from conventional MC codes for different materials, such as water, bone, and lung, to produce dose distributions in voxelized phantoms. While PMC methods have been described in the past, an explicit quantification of the latent uncertainty arising from the limited number of unique tracks in the pregenerated track bank has been missing from the literature. With a proper uncertainty analysis, an optimal number of tracks in the pregenerated track bank can be selected for a desired dose calculation uncertainty. Methods: Particle tracks were pregenerated for electrons and protons using EGSnrc and GEANT4 and saved in a database. The PMC algorithm for track selection, rotation, and transport was implemented on the Compute Unified Device Architecture (CUDA) 4.0 programming framework. PMC dose distributions were calculated in a variety of media and compared to benchmark dose distributions simulated with the corresponding general-purpose MC codes under the same conditions. A latent uncertainty metric was defined, and the analysis was performed by varying the pregenerated track bank size and the number of simulated primary particle histories and comparing dose values to a “ground truth” benchmark dose distribution calculated to 0.04% average uncertainty in voxels with dose greater than 20% of Dmax. Efficiency metrics were calculated against benchmark MC codes on a single CPU core with no variance reduction. Results: Dose distributions generated using PMC and benchmark MC codes were compared and found to be within 2% of each other in voxels with dose values greater than 20% of the

  1. Nuclear data uncertainty analysis for the generation IV gas-cooled fast reactor

    International Nuclear Information System (INIS)

    Pelloni, S.; Mikityuk, K.

    2012-01-01

    For the European 2400 MW Gas-cooled Fast Reactor (GoFastR), this paper summarizes a priori uncertainties, i.e. without any integral experiment assessment, of the main neutronic parameters, which were obtained on the basis of the deterministic code system ERANOS (Edition 2.2-N). JEFF-3.1 cross-sections were used in conjunction with the newest ENDF/B-VII.0 based covariance library (COMMARA-2.0) resulting from a recent cooperation of the Brookhaven and Los Alamos National Laboratories within the Advanced Fuel Cycle Initiative. The basis for the analysis is the original GoFastR concept with carbide fuel pins and silicon-carbide ceramic cladding, which was developed and proposed in the first quarter of 2009 by the French Alternative Energies and Atomic Energy Commission (CEA). The main conclusions from the current study are that nuclear data uncertainties of neutronic parameters may still be too large for this Generation IV reactor, especially concerning the multiplication factor, despite the fact that the new covariance library is quite complete. These uncertainties, in relative terms, do not show the a priori expected increase with burn-up as a result of the minor actinide and fission product build-up. Indeed, they are found to be almost independent of the fuel depletion, since the uncertainty associated with 238U inelastic scattering is largely dominant. This finding clearly supports the activities of Subgroup 33 of the Working Party on International Nuclear Data Evaluation Cooperation (WPEC), i.e. methods and issues for the combined use of integral experiments and covariance data, attempting to reduce the present unbiased uncertainties on nuclear data through adjustments based on available experimental data. (authors)
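
    For context, a priori nuclear data uncertainties of the kind summarized here are conventionally propagated with the first-order "sandwich rule". The record does not spell out its exact formulation, so the following is a generic sketch, in which R is a neutronic response (e.g., the multiplication factor), x the multigroup data, S the relative sensitivity vector, and M the relative covariance matrix (e.g., from COMMARA-2.0).

    ```latex
    % First-order ("sandwich") propagation of nuclear data covariances
    \[
      S_i = \frac{x_i}{R}\,\frac{\partial R}{\partial x_i}, \qquad
      \left(\frac{\Delta R}{R}\right)^2 = S^{\mathsf T} M\, S
    \]
    ```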

  2. Uncertainty Propagation Analysis for the Monte Carlo Time-Dependent Simulations

    International Nuclear Information System (INIS)

    Shaukata, Nadeem; Shim, Hyung Jin

    2015-01-01

    In this paper, a conventional method to control the neutron population for super-critical systems is implemented. Instead of considering cycles, the simulation is divided into time intervals. At the end of each time interval, neutron population control is applied to the banked neutrons: randomly selected neutrons are discarded until the size of the neutron population matches the initial number of neutron histories at the beginning of the time simulation. A time-dependent simulation mode has also been implemented in the development version of the SERPENT 2 Monte Carlo code, in which a sequential population control mechanism has been proposed for modeling prompt super-critical systems. A Monte Carlo method has also been used in the TART code for dynamic criticality calculations: for super-critical systems, the neutron population is allowed to grow over a period of time and is then uniformly combed to return it to the population size at the beginning of the time boundary. In this study, a conventional time-dependent Monte Carlo (TDMC) algorithm is implemented. For super-critical systems there is an exponential growth of the neutron population in the estimation of the neutron density tally, and the number of neutrons being tracked can exceed the memory of the computer. In order to control this exponential growth, a conventional time cut-off population control strategy is included in TDMC at the end of each time boundary, and a scale factor is introduced to tally the desired neutron density at the end of each time boundary. The main purpose of this paper is the quantification of uncertainty propagation in neutron densities at the end of each time boundary for super-critical systems; this uncertainty is caused by the introduction of the scale factor. The effectiveness of TDMC is examined for a one-group infinite homogeneous problem (the rod model) and a two-group infinite homogeneous problem. The desired neutron density is tallied by the introduction of
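
    The combing-plus-scale-factor bookkeeping described above can be sketched in a few lines; this is a toy illustration with a list standing in for the neutron bank and a fixed growth per interval, not the TDMC implementation itself.

    ```python
    import random

    def comb_population(bank, n_target, rng):
        """Uniformly comb a grown neutron bank back to n_target survivors,
        returning the survivors and the interval's growth (scale) factor."""
        scale = len(bank) / n_target
        return rng.sample(bank, n_target), scale

    rng = random.Random(42)
    bank = list(range(1000))        # initial neutron histories
    density_scale = 1.0             # cumulative scale factor for the density tally
    for step in range(5):
        bank = [n for n in bank for _ in range(2)]   # toy growth, k ~ 2 per interval
        bank, scale = comb_population(bank, 1000, rng)
        density_scale *= scale      # true density = (combed tally) * density_scale
        print(f"interval {step + 1}: bank size {len(bank)}, "
              f"cumulative scale {density_scale:.0f}")
    ```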

  3. Uncertainty Propagation Analysis for the Monte Carlo Time-Dependent Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Shaukata, Nadeem; Shim, Hyung Jin [Seoul National University, Seoul (Korea, Republic of)

    2015-10-15

    In this paper, a conventional method to control the neutron population for super-critical systems is implemented. Instead of considering cycles, the simulation is divided into time intervals. At the end of each time interval, neutron population control is applied to the banked neutrons: randomly selected neutrons are discarded until the size of the neutron population matches the initial number of neutron histories at the beginning of the time simulation. A time-dependent simulation mode has also been implemented in the development version of the SERPENT 2 Monte Carlo code, in which a sequential population control mechanism has been proposed for modeling prompt super-critical systems. A Monte Carlo method has also been used in the TART code for dynamic criticality calculations: for super-critical systems, the neutron population is allowed to grow over a period of time and is then uniformly combed to return it to the population size at the beginning of the time boundary. In this study, a conventional time-dependent Monte Carlo (TDMC) algorithm is implemented. For super-critical systems there is an exponential growth of the neutron population in the estimation of the neutron density tally, and the number of neutrons being tracked can exceed the memory of the computer. In order to control this exponential growth, a conventional time cut-off population control strategy is included in TDMC at the end of each time boundary, and a scale factor is introduced to tally the desired neutron density at the end of each time boundary. The main purpose of this paper is the quantification of uncertainty propagation in neutron densities at the end of each time boundary for super-critical systems; this uncertainty is caused by the introduction of the scale factor. The effectiveness of TDMC is examined for a one-group infinite homogeneous problem (the rod model) and a two-group infinite homogeneous problem. The desired neutron density is tallied by the introduction of

  4. Urban drainage models simplifying uncertainty analysis for practitioners

    DEFF Research Database (Denmark)

    Vezzaro, Luca; Mikkelsen, Peter Steen; Deletic, Ana

    2013-01-01

    There is increasing awareness about uncertainties in the modelling of urban drainage systems and, as such, many new methods for uncertainty analyses have been developed. Despite this, all available methods have limitations which restrict their widespread application among practitioners. Here... in each measured/observed datapoint; an issue that is commonly overlooked in the uncertainty analysis of urban drainage models. This comparison allows the user to intuitively estimate the optimum number of simulations required to conduct uncertainty analyses. The output of the method includes parameter...

  5. The IAEA Coordinated Research Program on HTGR Reactor Physics, Thermal-hydraulics and Depletion Uncertainty Analysis: Description of the Benchmark Test Cases and Phases

    Energy Technology Data Exchange (ETDEWEB)

    Frederik Reitsma; Gerhard Strydom; Bismark Tyobeka; Kostadin Ivanov

    2012-10-01

    The continued development of High Temperature Gas Cooled Reactors (HTGRs) requires verification of design and safety features with reliable high fidelity physics models and robust, efficient, and accurate codes. The uncertainties in the HTR analysis tools are today typically assessed with sensitivity analysis, and then a few important input uncertainties (typically based on a PIRT process) are varied in the analysis to find a spread in the parameter of importance. However, one wishes to apply a more fundamental approach to determine the predictive capability and accuracies of the coupled neutronics/thermal-hydraulics and depletion simulations used for reactor design and safety assessment. Today there is broader acceptance of the use of uncertainty analysis even in safety studies, and in some cases regulators have accepted it as a replacement for the traditional conservative analysis. Finally, there is also a renewed focus on supplying reliable covariance data (nuclear data uncertainties) that can then be used in uncertainty methods. Uncertainty and sensitivity studies are therefore becoming an essential component of any significant effort in data and simulation improvement. In order to address uncertainty in analysis and methods in the HTGR community, the IAEA launched a Coordinated Research Project (CRP) on HTGR Uncertainty Analysis in Modelling early in 2012. The project is built on the experience of the OECD/NEA Light Water Reactor (LWR) Uncertainty Analysis in Best-Estimate Modelling (UAM) benchmark activity, but focuses specifically on the peculiarities of HTGR designs and their simulation requirements. Two benchmark problems were defined: the prismatic type is represented by the MHTGR-350 design from General Atomics (GA), while a 250 MW modular pebble bed design, similar to the INET (China) and indirect-cycle PBMR (South Africa) designs, is also included. In the paper more detail on the benchmark cases, the different specific phases and tasks and the latest

  6. Modified Phenomena Identification and Ranking Table (PIRT) for Uncertainty Analysis

    International Nuclear Information System (INIS)

    Gol-Mohamad, Mohammad P.; Modarres, Mohammad; Mosleh, Ali

    2006-01-01

    This paper describes a methodology for characterizing important phenomena, which is part of a broader research effort by the authors called 'Modified PIRT'. The methodology provides a robust process of phenomena identification and ranking for more precise quantification of uncertainty. It is a two-step identification and ranking methodology based on thermal-hydraulics (TH) importance as well as uncertainty importance. The Analytical Hierarchical Process (AHP) has been used as a formal approach for TH identification and ranking. A formal uncertainty importance technique is used to estimate the degree of credibility of the TH model(s) used to represent the important phenomena. This part uses subjective justification, evaluating available information and data from experiments and code predictions. The proposed methodology was demonstrated by developing a PIRT for a large break loss of coolant accident (LBLOCA) for the LOFT integral facility with highest core power (test LB-1). (authors)
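
    Since the record names AHP as the formal ranking approach, a small sketch of the standard AHP weight computation may help: the importance weights are the normalized principal eigenvector of a reciprocal pairwise comparison matrix. The matrix below is hypothetical, not taken from the paper.

    ```python
    import numpy as np

    # Hypothetical AHP pairwise comparison matrix for three phenomena
    # (a_ij = judged importance of phenomenon i relative to j; reciprocal matrix).
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)              # principal eigenvalue index
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                             # normalized importance weights

    # Consistency check (RI = 0.58 is Saaty's random index for n = 3)
    ci = (eigvals.real[k] - 3) / (3 - 1)
    print("weights:", np.round(w, 3), " consistency ratio:", round(ci / 0.58, 3))
    ```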

  7. Status of SPACE Safety Analysis Code Development

    International Nuclear Information System (INIS)

    Lee, Dong Hyuk; Yang, Chang Keun; Kim, Se Yun; Ha, Sang Jun

    2009-01-01

    In 2006, the Korean nuclear industry started developing a thermal-hydraulic analysis code for safety analysis of PWRs (Pressurized Water Reactors). The new code is named SPACE (Safety and Performance Analysis Code for Nuclear Power Plant). The SPACE code can solve two-fluid, three-field governing equations in one-dimensional or three-dimensional geometry. The SPACE code has many component models required for modeling a PWR, such as the reactor coolant pump, safety injection tank, etc. The programming language used in the new code is C++, chosen for a new generation of engineers who are more comfortable with C/C++ than with the older FORTRAN language. This paper describes the general characteristics of the SPACE code and the current status of its development

  8. A Preliminary Study on Sensitivity and Uncertainty Analysis with Statistic Method: Uncertainty Analysis with Cross Section Sampling from Lognormal Distribution

    Energy Technology Data Exchange (ETDEWEB)

    Song, Myung Sub; Kim, Song Hyun; Kim, Jong Kyung [Hanyang Univ., Seoul (Korea, Republic of); Noh, Jae Man [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2013-10-15

    Uncertainty evaluation with the statistical method is performed by repeating transport calculations with directly perturbed, sampled nuclear data; hence, a reliable uncertainty result can be obtained by analyzing the results of numerous transport calculations. One known problem in uncertainty analysis with the statistical approach is that sampling cross sections from a normal (Gaussian) distribution with a relatively large standard deviation leads to sampling errors, such as sampling negative cross sections. Some correction methods have been noted; however, they can distort the distribution of the sampled cross sections. In this study, a method for sampling the nuclear data from a lognormal distribution is proposed. The criticality calculations with the sampled nuclear data are then performed, and the results are compared with those from the normal distribution conventionally used in previous studies. The statistical sampling method with the lognormal distribution was proposed to increase the sampling accuracy without negative sampling errors, and a stochastic cross-section sampling and writing program was developed. For the sensitivity and uncertainty analysis, the cross-section sampling was pursued with both the normal and lognormal distributions. The uncertainties caused by the covariance of (n,γ) cross sections were evaluated by solving the GODIVA problem. The results show that the sampling method with the lognormal distribution can efficiently solve the negative sampling problem referred to in previous studies. It is expected that this study will contribute to increasing the accuracy of sampling-based uncertainty analysis.
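
    A minimal sketch of such lognormal sampling is given below, assuming the target is to match a given mean and relative standard deviation so that no sample can be negative; the moment-matching construction is a standard one and not necessarily the paper's exact procedure.

    ```python
    import numpy as np

    def sample_xs_lognormal(mean, rel_std, n, rng):
        """Sample cross sections from a lognormal whose arithmetic mean and
        standard deviation match the targets, so no sample can go negative."""
        var = (rel_std * mean) ** 2
        sigma2 = np.log(1.0 + var / mean**2)   # log-space variance
        mu = np.log(mean) - 0.5 * sigma2       # preserves the arithmetic mean
        return rng.lognormal(mu, np.sqrt(sigma2), n)

    rng = np.random.default_rng(0)
    xs = sample_xs_lognormal(mean=2.0, rel_std=0.5, n=100_000, rng=rng)  # 50% rsd
    print(f"mean={xs.mean():.3f}, rsd={xs.std()/xs.mean():.3f}, min={xs.min():.3f}")
    # A normal with the same 50% rsd would sample negative ~2.3% of the time.
    ```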

  9. A Preliminary Study on Sensitivity and Uncertainty Analysis with Statistic Method: Uncertainty Analysis with Cross Section Sampling from Lognormal Distribution

    International Nuclear Information System (INIS)

    Song, Myung Sub; Kim, Song Hyun; Kim, Jong Kyung; Noh, Jae Man

    2013-01-01

    Uncertainty evaluation with the statistical method is performed by repeating transport calculations with directly perturbed, sampled nuclear data; hence, a reliable uncertainty result can be obtained by analyzing the results of numerous transport calculations. One known problem in uncertainty analysis with the statistical approach is that sampling cross sections from a normal (Gaussian) distribution with a relatively large standard deviation leads to sampling errors, such as sampling negative cross sections. Some correction methods have been noted; however, they can distort the distribution of the sampled cross sections. In this study, a method for sampling the nuclear data from a lognormal distribution is proposed. The criticality calculations with the sampled nuclear data are then performed, and the results are compared with those from the normal distribution conventionally used in previous studies. The statistical sampling method with the lognormal distribution was proposed to increase the sampling accuracy without negative sampling errors, and a stochastic cross-section sampling and writing program was developed. For the sensitivity and uncertainty analysis, the cross-section sampling was pursued with both the normal and lognormal distributions. The uncertainties caused by the covariance of (n,γ) cross sections were evaluated by solving the GODIVA problem. The results show that the sampling method with the lognormal distribution can efficiently solve the negative sampling problem referred to in previous studies. It is expected that this study will contribute to increasing the accuracy of sampling-based uncertainty analysis

  10. Nuclear Energy Advanced Modeling and Simulation (NEAMS) waste Integrated Performance and Safety Codes (IPSC): gap analysis for high fidelity and performance assessment code development

    International Nuclear Information System (INIS)

    Lee, Joon H.; Siegel, Malcolm Dean; Arguello, Jose Guadalupe Jr.; Webb, Stephen Walter; Dewers, Thomas A.; Mariner, Paul E.; Edwards, Harold Carter; Fuller, Timothy J.; Freeze, Geoffrey A.; Jove-Colon, Carlos F.; Wang, Yifeng

    2011-01-01

    This report describes a gap analysis performed in the process of developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with rigorous verification, validation, and software quality requirements. The gap analyses documented in this report were performed during an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC, and during follow-on activities that delved into more detailed assessments of the various codes that were acquired, studied, and tested. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and to develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. The gap analysis indicates that significant capabilities may already exist in existing THC codes, although no single code is able to fully account for all physical and chemical processes involved in a waste disposal system. Large gaps exist in modeling chemical processes and their couplings with other processes. The coupling of chemical processes with flow transport and mechanical deformation remains challenging. The data for extreme environments (e.g., for elevated temperature and high ionic strength media) that are

  11. Nuclear Energy Advanced Modeling and Simulation (NEAMS) waste Integrated Performance and Safety Codes (IPSC) : gap analysis for high fidelity and performance assessment code development.

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Joon H.; Siegel, Malcolm Dean; Arguello, Jose Guadalupe, Jr.; Webb, Stephen Walter; Dewers, Thomas A.; Mariner, Paul E.; Edwards, Harold Carter; Fuller, Timothy J.; Freeze, Geoffrey A.; Jove-Colon, Carlos F.; Wang, Yifeng

    2011-03-01

    This report describes a gap analysis performed in the process of developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with rigorous verification, validation, and software quality requirements. The gap analyses documented in this report were performed during an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC, and during follow-on activities that delved into more detailed assessments of the various codes that were acquired, studied, and tested. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and to develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. The gap analysis indicates that significant capabilities may already exist in existing THC codes, although no single code is able to fully account for all physical and chemical processes involved in a waste disposal system. Large gaps exist in modeling chemical processes and their couplings with other processes. The coupling of chemical processes with flow transport and mechanical deformation remains challenging. The data for extreme environments (e.g., for elevated temperature and high ionic strength media) that are

  12. The role of uncertainty analysis in dose reconstruction and risk assessment

    International Nuclear Information System (INIS)

    Hoffman, F.O.; Simon, S.L.; Thiessen, K.M.

    1996-01-01

    Dose reconstruction and risk assessment rely heavily on the use of mathematical models to extrapolate information beyond the realm of direct observation. Because models are merely approximations of real systems, their predictions are inherently uncertain. As a result, full disclosure of uncertainty in dose and risk estimates is essential to achieve scientific credibility and to build public trust. The need for formal analysis of uncertainty in model predictions was presented during the nineteenth annual meeting of the NCRP. At that time, quantitative uncertainty analysis was considered a relatively new and difficult subject practiced by only a few investigators. Today, uncertainty analysis has become synonymous with the assessment process itself. When an uncertainty analysis is used iteratively within the assessment process, it can guide experimental research to refine dose and risk estimates, deferring potentially high cost or high consequence decisions until uncertainty is either acceptable or irreducible. Uncertainty analysis is now mandated for all ongoing dose reconstruction projects within the United States, a fact that distinguishes dose reconstruction from other types of exposure and risk assessments. 64 refs., 6 figs., 1 tab

  13. Application of code scaling applicability and uncertainty methodology to the large break loss of coolant

    International Nuclear Information System (INIS)

    Young, M.Y.; Bajorek, S.M.; Nissley, M.E.

    1998-01-01

    In the late 1980s, after completion of an extensive research program, the United States Nuclear Regulatory Commission (USNRC) amended its regulations (10CFR50.46) to allow the use of realistic physical models to analyze the loss of coolant accident (LOCA) in light water reactors. Prior to this time, the evaluation of this accident was subject to a prescriptive set of rules (appendix K of the regulations) requiring conservative models and assumptions to be applied simultaneously, leading to very pessimistic estimates of the impact of this accident on the reactor core. The rule change therefore promised to provide significant benefits to owners of power reactors, allowing them to increase output. In response to the rule change, a method called code scaling, applicability and uncertainty (CSAU) was developed to apply realistic methods while properly taking into account data uncertainty, uncertainty in physical modeling, and plant variability. The method was claimed to be structured, traceable, and practical, but was met with some criticism when first demonstrated. In 1996, the USNRC approved a methodology, based on CSAU, developed by a group led by Westinghouse. The lessons learned in this application of CSAU will be summarized. Some of the issues raised concerning the validity and completeness of the CSAU methodology will also be discussed. (orig.)

  14. PCT Uncertainty Analysis Using Unscented Transform with Random Orthogonal Matrix

    Energy Technology Data Exchange (ETDEWEB)

    Fynana, Douglas A.; Ahn, Kwang-Il [KAERI, Daejeon (Korea, Republic of); Lee, John C. [Univ. of Michigan, Michigan (United States)

    2015-05-15

    less statistical variation of mean and variance estimates of the output PDF. For BEPU applications, where on the order of 20 to 50 input parameter uncertainties are usually sampled, the UT with ROM requires the same order of magnitude of best-estimate code simulations as the widely used Wilks' formula. In addition to BEPU, the UT with ROM may be a useful sampling algorithm in other nuclear applications involving computer codes and uncertainties. One future area of research is nuclear data uncertainty propagation in neutronics calculations. Cross section uncertainties are conveniently available as covariance data in the evaluated nuclear data files, so the UT, requiring only covariance information, appears to be an appropriate method.
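
    A bare-bones sketch of the unscented transform itself (without the random orthogonal matrix variant, which the record does not detail) is given below; the quadratic response standing in for a best-estimate code is purely illustrative.

    ```python
    import numpy as np

    def unscented_moments(f, mean, cov, kappa=1.0):
        """Propagate mean/covariance through f with 2n+1 sigma points."""
        n = len(mean)
        L = np.linalg.cholesky((n + kappa) * cov)   # matrix square root
        pts = [mean] + [mean + L[:, i] for i in range(n)] \
                     + [mean - L[:, i] for i in range(n)]
        w = np.full(2 * n + 1, 1.0 / (2 * (n + kappa)))
        w[0] = kappa / (n + kappa)
        y = np.array([f(p) for p in pts])           # one code run per sigma point
        m = np.dot(w, y)
        v = np.dot(w, (y - m) ** 2)
        return m, v

    # Toy stand-in for a code response (e.g., PCT as a function of 2 inputs)
    f = lambda x: 800.0 + 50.0 * x[0] + 20.0 * x[0] * x[1]
    mean, cov = np.zeros(2), np.array([[1.0, 0.3], [0.3, 1.0]])
    m, v = unscented_moments(f, mean, cov)
    print(f"mean ~ {m:.1f}, sigma ~ {np.sqrt(v):.1f}")
    ```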

  15. The adjoint sensitivity method, a contribution to the code uncertainty evaluation

    International Nuclear Information System (INIS)

    Ounsy, A.; Crecy, F. de; Brun, B.

    1993-01-01

    The application of the ASM (Adjoint Sensitivity Method) to thermohydraulic codes is examined. The advantage of the method is that it consumes very little CPU time compared with the usual approach, which requires one complete code run per sensitivity determination. The mathematical aspects of the problem are first described, and the applicability of the method to the functional-type response of a thermalhydraulic model is demonstrated. The problem has been analyzed on a simple example of a nonlinear hyperbolic equation (the Burgers equation). It is shown that the formalism used in the literature treating this subject is not appropriate, and a new mathematical formalism circumventing the problem is proposed. For the discretized form of the problem, two methods are possible: the Continuous ASM and the Discrete ASM (DASM). The equivalence of both methods is demonstrated; nevertheless, only the DASM constitutes a practical solution for thermalhydraulic codes. The application of the DASM to the thermalhydraulic safety code CATHARE is then presented for two examples. They demonstrate that the ASM constitutes an efficient tool for the analysis of code sensitivity. (authors) 7 figs., 5 tabs., 8 refs
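
    To make the "one adjoint solve replaces one run per parameter" point concrete, here is a toy discrete-adjoint example on a linear system A(p)u = b with scalar response J = gᵀu; the matrix and parameters are invented for illustration and have nothing to do with CATHARE.

    ```python
    import numpy as np

    # Toy discretized model: A(p) u = b, response J = g^T u.
    def A(p):
        return np.array([[2.0 + p[0], -1.0],
                         [-1.0, 2.0 + p[1]]])

    b = np.array([1.0, 0.0])
    g = np.array([0.0, 1.0])
    p = np.array([0.1, 0.2])

    u = np.linalg.solve(A(p), b)        # one forward solve
    lam = np.linalg.solve(A(p).T, g)    # one adjoint solve

    # dJ/dp_i = -lam^T (dA/dp_i) u; here dA/dp_i has a single nonzero entry.
    dJ = np.array([-lam[i] * u[i] for i in range(2)])

    # Check against finite differences (the "one run per sensitivity" approach)
    eps = 1e-6
    for i in range(2):
        dp = p.copy(); dp[i] += eps
        fd = (g @ np.linalg.solve(A(dp), b) - g @ u) / eps
        print(f"dJ/dp{i}: adjoint={dJ[i]:+.6f}, finite diff={fd:+.6f}")
    ```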

  16. The adjoint sensitivity method. A contribution to the code uncertainty evaluation

    International Nuclear Information System (INIS)

    Ounsy, A.; Brun, B.

    1993-01-01

    The application of the ASM (Adjoint Sensitivity Method) to thermohydraulic codes is examined. The advantage of the method is that it consumes very little CPU time compared with the usual approach, which requires one complete code run per sensitivity determination. The mathematical aspects of the problem are first described, and the applicability of the method to the functional-type response of a thermalhydraulic model is demonstrated. The problem has been analyzed on a simple example of a nonlinear hyperbolic equation (the Burgers equation). It is shown that the formalism used in the literature treating this subject is not appropriate, and a new mathematical formalism circumventing the problem is proposed. For the discretized form of the problem, two methods are possible: the Continuous ASM and the Discrete ASM (DASM). The equivalence of both methods is demonstrated; nevertheless, only the DASM constitutes a practical solution for thermalhydraulic codes. The application of the DASM to the thermalhydraulic safety code CATHARE is then presented for two examples. They demonstrate that the ASM constitutes an efficient tool for the analysis of code sensitivity. (authors) 7 figs., 5 tabs., 8 refs

  17. The adjoint sensitivity method. A contribution to the code uncertainty evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Ounsy, A; Brun, B

    1994-12-31

    The application of the ASM (Adjoint Sensitivity Method) to thermohydraulic codes is examined. The advantage of the method is that it consumes very little CPU time compared with the usual approach, which requires one complete code run per sensitivity determination. The mathematical aspects of the problem are first described, and the applicability of the method to the functional-type response of a thermalhydraulic model is demonstrated. The problem has been analyzed on a simple example of a nonlinear hyperbolic equation (the Burgers equation). It is shown that the formalism used in the literature treating this subject is not appropriate, and a new mathematical formalism circumventing the problem is proposed. For the discretized form of the problem, two methods are possible: the Continuous ASM and the Discrete ASM (DASM). The equivalence of both methods is demonstrated; nevertheless, only the DASM constitutes a practical solution for thermalhydraulic codes. The application of the DASM to the thermalhydraulic safety code CATHARE is then presented for two examples. They demonstrate that the ASM constitutes an efficient tool for the analysis of code sensitivity. (authors) 7 figs., 5 tabs., 8 refs.

  18. The adjoint sensitivity method, a contribution to the code uncertainty evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Ounsy, A; Crecy, F de; Brun, B

    1994-12-31

    The application of the ASM (Adjoint Sensitivity Method) to thermohydraulic codes is examined. The advantage of the method is that it consumes very little CPU time compared with the usual approach, which requires one complete code run per sensitivity determination. The mathematical aspects of the problem are first described, and the applicability of the method to the functional-type response of a thermalhydraulic model is demonstrated. The problem has been analyzed on a simple example of a nonlinear hyperbolic equation (the Burgers equation). It is shown that the formalism used in the literature treating this subject is not appropriate, and a new mathematical formalism circumventing the problem is proposed. For the discretized form of the problem, two methods are possible: the Continuous ASM and the Discrete ASM (DASM). The equivalence of both methods is demonstrated; nevertheless, only the DASM constitutes a practical solution for thermalhydraulic codes. The application of the DASM to the thermalhydraulic safety code CATHARE is then presented for two examples. They demonstrate that the ASM constitutes an efficient tool for the analysis of code sensitivity. (authors) 7 figs., 5 tabs., 8 refs.

  19. Uncertainty Analysis of Consequence Management (CM) Data Products.

    Energy Technology Data Exchange (ETDEWEB)

    Hunt, Brian D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Eckert-Gallup, Aubrey Celia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Cochran, Lainy Dromgoole [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kraus, Terrence D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Fournier, Sean Donovan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Allen, Mark B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Schetnan, Richard Reed [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Simpson, Matthew D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Okada, Colin E. [Remote Sensing Lab. (RSL), Nellis AFB, Las Vegas, NV (United States); Bingham, Avery A. [Remote Sensing Lab. (RSL), Nellis AFB, Las Vegas, NV (United States)

    2018-01-01

    The goal of this project is to develop and execute methods for characterizing uncertainty in data products that are developed and distributed by the DOE Consequence Management (CM) Program. A global approach to this problem is necessary because multiple sources of error and uncertainty from across the CM skill sets contribute to the ultimate production of CM data products. This report presents the methods used to develop a probabilistic framework to characterize this uncertainty and provides results for an uncertainty analysis for a study scenario analyzed using this framework.

  20. Use of nuclear data sensitivity and uncertainty analysis for the design preparation of the HCLL breeder blanket mock-up experiment for ITER

    International Nuclear Information System (INIS)

    Kodeli, I.

    2007-01-01

    An experiment on a mock-up of the Test Blanket Module based on the Helium Cooled Lithium Lead (HCLL) concept will be performed in 2007 at the FNG facility in Frascati, in order to study the neutronics characteristics of the module and the performance of the computational tools in the accurate prediction of neutron transport. With the objective of preparing and optimising the design of the mock-up so as to provide maximum information on the state of the art of the cross-section data, the mock-up was pre-analysed using deterministic codes for the sensitivity/uncertainty analysis. The neutron fluxes and the tritium production rate (TPR), their sensitivity to the underlying basic cross sections, as well as the corresponding uncertainty estimations, were calculated using the deterministic transport codes (DOORS package), the sensitivity/uncertainty code package SUSD3D, and the VITAMIN-J/COVA covariance matrix libraries. The cross-section reactions with the largest contribution to the uncertainty in the calculated TPR were identified to be the (n,2n) and (n,3n) reactions on lead. The conclusions of this work support the main benchmark design and suggest some modifications and improvements. In particular, this study recommends the use, as far as possible, of both natural and enriched lithium pellets for the TPR measurements. The combined use is expected to provide additional and complementary information on the sensitive cross sections. (author)

  1. Application of status uncertainty analysis methods for AP1000 LBLOCA calculation

    International Nuclear Information System (INIS)

    Zhang Shunxiang; Liang Guoxing

    2012-01-01

    Parameter uncertainty analysis is developed by using a reasonable method to establish the response relations between input parameter uncertainties and output uncertainties. The application of parameter uncertainty analysis makes the simulation of the plant state more accurate and improves plant economy with reasonable safety assurance. The AP1000 LBLOCA was analyzed in this paper, and the results indicate that the random sampling statistical analysis method, the sensitivity analysis numerical method, and the traditional error propagation analysis method can all provide a quite large peak cladding temperature (PCT) safety margin, which is very helpful for choosing a suitable uncertainty analysis method to improve plant economy. Additionally, the random sampling statistical analysis method, applying mathematical statistics theory, provides the largest safety margin due to the reduction of conservatism. Compared with the traditional conservative bounding parameter analysis method, the random sampling method can provide a PCT margin of 100 K, while the other two methods can only provide 50-60 K. (authors)
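
    Random sampling statistical analyses of this kind are commonly sized with Wilks' nonparametric formula; the record does not state its sample size, so the sketch below simply computes the classic first-order 95/95 run count.

    ```python
    from math import ceil, log

    def wilks_first_order(gamma=0.95, beta=0.95):
        """Minimum N of code runs so the sample maximum bounds the
        gamma-quantile with confidence beta (first-order, one-sided)."""
        return ceil(log(1.0 - beta) / log(gamma))

    print(wilks_first_order())   # 59 runs for the classic 95/95 criterion
    # With N such runs, the largest calculated PCT serves as the bounding value.
    ```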

  2. Implementation of a methodology to perform the uncertainty and sensitivity analysis of the control rod drop in a BWR

    Energy Technology Data Exchange (ETDEWEB)

    Reyes F, M. del C.

    2015-07-01

    A methodology to perform uncertainty and sensitivity analysis for the cross sections used in a Trace/PARCS coupled model of a control rod drop transient in a BWR-5 reactor was implemented with the neutronics code PARCS. A model of the nuclear reactor detailing all assemblies located in the core was developed. The thermohydraulic model designed in Trace, however, was a simple one, in which a single channel represents all the assembly types in the core; it was placed inside a simple vessel model and boundary conditions were established. The thermohydraulic model was coupled with the neutronics model, first for the steady state, and then a Control Rod Drop (CRD) transient was performed in order to carry out the uncertainty and sensitivity analysis. To analyze the cross sections used in the Trace/PARCS coupled model during the transient, Probability Density Functions (PDFs) were generated for the 22 cross-section parameters selected from the neutronics parameters that PARCS requires, thus obtaining 100 different cases for the Trace/PARCS coupled model, each with a database of different cross sections. All these cases were executed with the coupled model, thereby obtaining 100 different outputs for the CRD transient, with special emphasis on 4 responses per output: 1) the reactivity, 2) the percentage of rated power, 3) the average fuel temperature, and 4) the average coolant density. For each response during the transient, an uncertainty analysis was performed in which the corresponding uncertainty bands were generated. With this analysis it is possible to observe the ranges of the chosen responses as the selected uncertainty parameters vary. This is very useful and important for maintaining safety in nuclear power plants, and for verifying whether the uncertainty band is within the safety margins. The sensitivity analysis complements the uncertainty analysis, identifying the parameter or parameters with the most influence on the
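
    The uncertainty bands mentioned here are typically percentile envelopes taken across the 100 sampled runs at each time point; the sketch below illustrates this with synthetic stand-in data, since the actual Trace/PARCS outputs are not reproduced in the record.

    ```python
    import numpy as np

    # Hypothetical stand-in for the 100 coupled-code outputs: each row is one
    # sampled cross-section case, each column a time point of one response
    # (e.g., percent of rated power during the CRD transient).
    rng = np.random.default_rng(7)
    t = np.linspace(0.0, 10.0, 201)
    runs = 100.0 * np.exp(-0.3 * t) * (1 + rng.normal(0, 0.05, (100, t.size)))

    lo, med, hi = np.percentile(runs, [2.5, 50.0, 97.5], axis=0)
    print(f"t=5s: median={med[100]:.1f}%, 95% band=[{lo[100]:.1f}, {hi[100]:.1f}]%")
    # A safety check would then verify the upper band stays inside the margins.
    ```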

  3. Implementation of a methodology to perform the uncertainty and sensitivity analysis of the control rod drop in a BWR

    International Nuclear Information System (INIS)

    Reyes F, M. del C.

    2015-01-01

    A methodology to perform uncertainty and sensitivity analysis for the cross sections used in a Trace/PARCS coupled model of a control rod drop transient in a BWR-5 reactor was implemented with the neutronics code PARCS. A model of the nuclear reactor detailing all assemblies located in the core was developed. The thermohydraulic model designed in Trace, however, was a simple one, in which a single channel represents all the assembly types in the core; it was placed inside a simple vessel model and boundary conditions were established. The thermohydraulic model was coupled with the neutronics model, first for the steady state, and then a Control Rod Drop (CRD) transient was performed in order to carry out the uncertainty and sensitivity analysis. To analyze the cross sections used in the Trace/PARCS coupled model during the transient, Probability Density Functions (PDFs) were generated for the 22 cross-section parameters selected from the neutronics parameters that PARCS requires, thus obtaining 100 different cases for the Trace/PARCS coupled model, each with a database of different cross sections. All these cases were executed with the coupled model, thereby obtaining 100 different outputs for the CRD transient, with special emphasis on 4 responses per output: 1) the reactivity, 2) the percentage of rated power, 3) the average fuel temperature, and 4) the average coolant density. For each response during the transient, an uncertainty analysis was performed in which the corresponding uncertainty bands were generated. With this analysis it is possible to observe the ranges of the chosen responses as the selected uncertainty parameters vary. This is very useful and important for maintaining safety in nuclear power plants, and for verifying whether the uncertainty band is within the safety margins. The sensitivity analysis complements the uncertainty analysis, identifying the parameter or parameters with the most influence on the

  4. BN-600 MOX Core Benchmark Analysis. Results from Phases 4 and 6 of a Coordinated Research Project on Updated Codes and Methods to Reduce the Calculational Uncertainties of the LMFR Reactivity Effects

    International Nuclear Information System (INIS)

    2013-12-01

    For those Member States that have or have had significant fast reactor development programmes, it is of utmost importance that they have validated, up-to-date codes and methods for fast reactor physics analysis in support of R and D and core design activities in the area of actinide utilization and incineration. In particular, some Member States have recently focused on fast reactor systems for minor actinide transmutation and on cores optimized for consuming rather than breeding plutonium; the physics of the breeder reactor cycle having already been widely investigated. Plutonium burning systems may have an important role in managing plutonium stocks until the time when major programmes of self-sufficient fast breeder reactors are established. For assessing the safety of these systems, it is important to determine the prediction accuracy of transient simulations and their associated reactivity coefficients. In response to Member States' expressed interest, the IAEA sponsored a coordinated research project (CRP) on Updated Codes and Methods to Reduce the Calculational Uncertainties of the LMFR Reactivity Effects. The CRP started in November 1999 and, at the first meeting, the members of the CRP endorsed a benchmark on the BN-600 hybrid core for consideration in its first studies. Benchmark analyses of the BN-600 hybrid core were performed during the first three phases of the CRP, investigating different nuclear data and levels of approximation in the calculation of safety-related reactivity effects and their influence on uncertainties in transient analysis prediction. In an additional phase of the benchmark studies, experimental data were used for the verification and validation of nuclear data libraries and methods in support of the previous three phases. The results of phases 1, 2, 3 and 5 of the CRP are reported in IAEA-TECDOC-1623, BN-600 Hybrid Core Benchmark Analyses, Results from a Coordinated Research Project on Updated Codes and Methods to Reduce the

  5. A Proposal of Estimation Methodology to Improve Calculation Efficiency of Sampling-based Method in Nuclear Data Sensitivity and Uncertainty Analysis

    International Nuclear Information System (INIS)

    Song, Myung Sub; Kim, Song Hyun; Kim, Jong Kyung; Noh, Jae Man

    2014-01-01

    The uncertainty with the sampling-based method is evaluated by repeating transport calculations with a number of cross-section data sets sampled from the covariance uncertainty data. In a transport calculation with the sampling-based method, the transport equation is not modified; therefore, all uncertainties of the responses, such as k_eff, reaction rates, flux, and power distribution, can be obtained directly, all at one time, without code modification. However, a major drawback of the sampling-based method is that it requires a heavy computational load to obtain statistically reliable results (inside confidence level 0.95) in the uncertainty analysis. The purpose of this study is to develop a method for improving the computational efficiency and obtaining highly reliable uncertainty results when using the sampling-based method with Monte Carlo simulation. The proposed method reduces the convergence time of the response uncertainty by using multiple sets of sampled group cross sections in a single Monte Carlo simulation. The proposed method was verified by estimating the GODIVA benchmark problem, and the results were compared with those of the conventional sampling-based method. In this study, a sampling-based method based on the central limit theorem is proposed to improve calculation efficiency by reducing the number of repetitive Monte Carlo transport calculations required to obtain reliable uncertainty analysis results. Each set of sampled group cross sections is assigned to an active cycle group in a single Monte Carlo simulation. The criticality uncertainty for the GODIVA problem is evaluated by the proposed and previous methods. The results show that the proposed sampling-based method can efficiently decrease the number of Monte Carlo simulations required to evaluate the uncertainty of k_eff. It is expected that the proposed method will improve the computational efficiency of uncertainty analysis with the sampling-based method
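
    The cycle-group idea can be illustrated with a toy one-group stand-in for the transport code: each sampled library drives one block of active cycles, and the spread of the per-block means estimates the response uncertainty. Every number below is invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n_sets, cycles_per_set = 50, 20

    # One sampled cross-section "set" per active-cycle group (toy 1-group model:
    # k is linear in the sampled data plus per-cycle statistical noise).
    xs_sets = rng.normal(1.0, 0.02, n_sets)            # 2% nuclear-data rsd
    k_cycles = xs_sets[:, None] + rng.normal(0, 0.01, (n_sets, cycles_per_set))

    k_groups = k_cycles.mean(axis=1)   # one estimate per sampled set
    k_mean = k_groups.mean()
    k_std = k_groups.std(ddof=1)       # spread mixes data and statistical parts
    print(f"k = {k_mean:.4f} +/- {k_std:.4f} (relative {k_std / k_mean:.2%})")
    ```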

  6. Application of FORSS sensitivity and uncertainty methodology to fast reactor benchmark analysis

    Energy Technology Data Exchange (ETDEWEB)

    Weisbin, C.R.; Marable, J.H.; Lucius, J.L.; Oblow, E.M.; Mynatt, F.R.; Peelle, R.W.; Perey, F.G.

    1976-12-01

    FORSS is a code system used to study relationships between nuclear reaction cross sections, integral experiments, reactor performance parameter predictions, and associated uncertainties. This paper presents the theory and code description as well as the first results of applying FORSS to fast reactor benchmarks. Specifically, for various assemblies and reactor performance parameters, the nuclear data sensitivities were computed by nuclide, reaction type, and energy. Comprehensive libraries of energy-dependent coefficients have been developed in a computer-retrievable format and released for distribution by RSIC and NNCSC. Uncertainties induced by nuclear data were quantified using preliminary, energy-dependent relative covariance matrices evaluated with ENDF/B-IV expectation values and processed for 238U(n,f), 238U(n,γ), 239Pu(n,f), and 239Pu(ν̄). Nuclear data accuracy requirements to meet specified performance criteria at minimum experimental cost were determined.

  7. Results for Phase I of the IAEA Coordinated Research Program on HTGR Uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Strydom, Gerhard [Idaho National Lab. (INL), Idaho Falls, ID (United States); Bostelmann, Friederike [Idaho National Lab. (INL), Idaho Falls, ID (United States); Yoon, Su Jong [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-01-01

    The quantification of uncertainties in the design and safety analysis of reactors is today not only broadly accepted, but has in many cases become the preferred replacement for traditional conservative analysis in safety and licensing studies. The use of a more fundamental methodology is also consistent with the reliable high fidelity physics models and robust, efficient, and accurate codes available today. To facilitate uncertainty analysis applications, a comprehensive approach and methodology must be developed and applied. High Temperature Gas-cooled Reactors (HTGRs) have their own peculiarities: coated particle design, large graphite quantities, different materials and high temperatures, all of which impose additional simulation requirements. The IAEA therefore launched a Coordinated Research Project (CRP) on HTGR Uncertainty Analysis in Modeling (UAM) in 2013 to study uncertainty propagation specifically in the HTGR analysis chain. Two benchmark problems are defined, with the prismatic design represented by the General Atomics (GA) MHTGR-350 and a 250 MW modular pebble bed design similar to the HTR-PM (INET, China). This report summarizes the contributions of the HTGR Methods Simulation group at Idaho National Laboratory (INL) up to this point of the CRP. The activities at INL have so far been focused on creating the problem specifications for the prismatic design, as well as providing reference solutions for the exercises defined for Phase I. An overview is provided of the HTGR UAM objectives and scope, and the detailed specifications for Exercises I-1, I-2, I-3 and I-4 are also included here for completeness. The main focus of the report is the compilation and discussion of reference results for Phase I (i.e. for input parameters at their nominal or best-estimate values), which is defined as the first step of the uncertainty quantification process. These reference results can be used by other CRP participants for comparison with other codes or their own reference

  8. Analytic uncertainty and sensitivity analysis of models with input correlations

    Science.gov (United States)

    Zhu, Yueying; Wang, Qiuping A.; Li, Wei; Cai, Xu

    2018-03-01

    Probabilistic uncertainty analysis is a common means of evaluating mathematical models. In mathematical modeling, the uncertainty in input variables is specified through distribution laws. Its contribution to the uncertainty in the model response is usually analyzed by assuming that input variables are independent of each other. However, correlated parameters often occur in practical applications. In the present paper, an analytic method is built for the uncertainty and sensitivity analysis of models in the presence of input correlations. With the method, it is straightforward to identify the importance of the independence and correlations of input variables in determining the model response. This allows one to decide whether or not the input correlations should be considered in practice. Numerical examples demonstrate the effectiveness and validity of our analytic method in the analysis of general models. A practical application of the method is also proposed for the uncertainty and sensitivity analysis of a deterministic HIV model.
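
    The standard first-order expression that such analyses generalize is the error propagation formula with covariance cross terms; the paper's exact analytic machinery may go beyond this first-order sketch.

    ```latex
    % First-order variance of Y = f(X_1,...,X_n) with correlated inputs:
    % the cross terms vanish only if Cov(X_i, X_j) = 0 for all i != j.
    \[
      \operatorname{Var}(Y) \approx
      \sum_{i=1}^{n}\sum_{j=1}^{n}
      \frac{\partial f}{\partial x_i}\,
      \frac{\partial f}{\partial x_j}\,
      \operatorname{Cov}(X_i, X_j)
    \]
    ```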

  9. Axial power monitoring uncertainty in the Savannah River Reactors

    International Nuclear Information System (INIS)

    Losey, D.C.; Revolinski, S.M.

    1990-01-01

    The results of this analysis quantified the uncertainty associated with monitoring the Axial Power Shape (APS) in the Savannah River Reactors. Thermocouples at each assembly flow exit map the radial power distribution and are the primary means of monitoring power in these reactors. The remaining uncertainty in power monitoring is associated with the relative axial power distribution. The APS is monitored by seven sensors that respond to power on each of nine vertical Axial Power Monitor (APM) rods. Computation of the APS uncertainty, for the reactor power limits analysis, started with a large database of APM rod measurements spanning several years of reactor operation. A computer algorithm was used to randomly select a sample of APSs, which were input to a code. This code modeled the thermal-hydraulic performance of a single fuel assembly during a design basis Loss-of-Coolant Accident. The assembly power limit at Onset of Significant Voiding was computed for each APS. The output was a distribution of expected assembly power limits that was adjusted to account for the biases caused by instrumentation error and by measuring 7 points rather than a continuous APS. Statistical analysis of the final assembly power limit distribution showed that reducing reactor power by approximately 3% was sufficient to account for APS variation. These data confirmed expectations that the assembly exit thermocouples provide all information needed for monitoring core power. The computational analysis results also quantified the contribution to power limits of the various uncertainties such as instrumentation error

  10. Statistical Uncertainty Quantification of Physical Models during Reflood of LBLOCA

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Deog Yeon; Seul, Kwang Won; Woo, Sweng Woong [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2015-05-15

    The use of best-estimate (BE) computer codes in safety analysis for loss-of-coolant accidents (LOCA) is the major trend in many countries to reduce significant conservatism. A key feature of this BE evaluation requires the licensee to quantify the uncertainty of the calculations, so it is very important to determine the uncertainty distributions before conducting the uncertainty evaluation. Uncertainties include those of physical models and correlations, plant operational parameters, and so forth. The quantification process is often performed mainly by subjective expert judgment, or the distributions are obtained from reference documents of the computer code. In this respect, more mathematical methods are needed to reasonably determine the uncertainty ranges. The first uncertainty quantification is performed with various increments for two influential uncertainty parameters to get the calculated responses and their derivatives. Different data sets with two influential uncertainty parameters for the FEBA tests are chosen by applying stricter criteria for selecting responses and their derivatives, which may be considered as the user's effect in the CIRCÉ applications. Finally, three influential uncertainty parameters are considered to study the effect of the number of uncertainty parameters, given the limitations of the CIRCÉ method. With the determined uncertainty ranges, uncertainty evaluations for the FEBA tests are performed to check whether the experimental responses, such as the cladding temperature or pressure drop, are inside the limits of the calculated uncertainty bounds. A confirmation step will be performed to evaluate the quality of the information in the case of the different reflooding PERICLES experiments. The uncertainty ranges of the physical models in the MARS-KS thermal-hydraulic code during reflooding were quantified by the CIRCÉ method using the FEBA experimental tests, instead of expert judgment. Also, through the uncertainty evaluation for the FEBA and PERICLES tests, it was confirmed

  11. Dakota, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis version 6.0 theory manual

    Energy Technology Data Exchange (ETDEWEB)

    Adams, Brian M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ebeida, Mohamed Salah [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Eldred, Michael S [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jakeman, John Davis [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Swiler, Laura Painton [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Stephens, John Adam [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vigil, Dena M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wildey, Timothy Michael [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bohnhoff, William J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Eddy, John P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hu, Kenneth T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dalbey, Keith R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bauman, Lara E [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hough, Patricia Diane [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-05-01

    The Dakota (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a theoretical manual for selected algorithms implemented within the Dakota software. It is not intended as a comprehensive theoretical treatment, since a number of existing texts cover general optimization theory, statistical analysis, and other introductory topics. Rather, this manual is intended to summarize a set of Dakota-related research publications in the areas of surrogate-based optimization, uncertainty quantification, and optimization under uncertainty that provide the foundation for many of Dakota's iterative analysis capabilities.

  12. Uncertainty Analysis of Seebeck Coefficient and Electrical Resistivity Characterization

    Science.gov (United States)

    Mackey, Jon; Sehirlioglu, Alp; Dynys, Fred

    2014-01-01

    In order to provide a complete description of a material's thermoelectric power factor, an uncertainty interval is required in addition to the measured nominal value. The uncertainty may contain sources of measurement error including systematic bias error and precision error of a statistical nature. The work focuses specifically on the popular ZEM-3 (Ulvac Technologies) measurement system, but the methods apply to any measurement system. The analysis accounts for sources of systematic error including sample preparation tolerance, measurement probe placement, thermocouple cold-finger effect, and measurement parameters, in addition to uncertainty of a statistical nature. Complete uncertainty analysis of a measurement system allows for more reliable comparison of measurement data between laboratories.
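
    Once the individual components are quantified, the systematic and statistical parts are conventionally combined in quadrature; the budget below is a hypothetical illustration, not ZEM-3 data.

    ```python
    from math import sqrt

    # Hypothetical uncertainty budget for a Seebeck coefficient measurement
    # (values are illustrative, not instrument specifications).
    systematic = {"probe placement": 1.5, "cold-finger effect": 2.0,
                  "sample geometry": 1.0}          # microvolts per kelvin
    statistical = 0.8                              # std. deviation of repeats

    u_sys = sqrt(sum(u ** 2 for u in systematic.values()))
    u_total = sqrt(u_sys ** 2 + statistical ** 2)  # quadrature combination
    print(f"combined standard uncertainty: {u_total:.2f} uV/K")
    ```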

  13. Parameter Uncertainty for Repository Thermal Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Hardin, Ernest [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hadgu, Teklu [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Greenberg, Harris [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Dupont, Mark [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-10-01

This report is one follow-on to a study of reference geologic disposal design concepts (Hardin et al. 2011a). Based on an analysis of maximum temperatures, that study concluded that certain disposal concepts would require extended decay storage prior to emplacement, or the use of small waste packages, or both. The study used nominal values for thermal properties of host geologic media and engineered materials, demonstrating the need for uncertainty analysis to support the conclusions. This report is a first step that identifies the input parameters of the maximum temperature calculation, surveys published data on measured values, uses an analytical approach to determine which parameters are most important, and performs an example sensitivity analysis. Using results from this first step, temperature calculations planned for FY12 can focus on only the important parameters, and can use the uncertainty ranges reported here. The survey of published information on thermal properties of geologic media and engineered materials is intended to be sufficient for use in generic calculations to evaluate the feasibility of reference disposal concepts. A full compendium of literature data is beyond the scope of this report. The term “uncertainty” is used here to represent both measurement uncertainty and spatial variability, or variability across host geologic units. For the most important parameters (e.g., buffer thermal conductivity) the extent of literature data surveyed samples these different forms of uncertainty and variability. Finally, this report is intended to be one chapter or section of a larger FY12 deliverable summarizing all the work on design concepts and thermal load management for geologic disposal (M3FT-12SN0804032, due 15Aug2012).

  14. Estimation of Peaking Factor Uncertainty due to Manufacturing Tolerance using Statistical Sampling Method

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Kyung Hoon; Park, Ho Jin; Lee, Chung Chan; Cho, Jin Young [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

The purpose of this paper is to study the effect on output parameters in the lattice physics calculation due to input uncertainty such as manufacturing deviations from nominal values for material composition and geometric dimensions. In a nuclear design and analysis, the lattice physics calculations are usually employed to generate lattice parameters for the nodal core simulation and pin power reconstruction. These lattice parameters, which consist of homogenized few-group cross-sections, assembly discontinuity factors, and form-functions, can be affected by input uncertainties which arise from three different sources: 1) multi-group cross-section uncertainties, 2) the uncertainties associated with methods and modeling approximations utilized in lattice physics codes, and 3) fuel/assembly manufacturing uncertainties. In this paper, data provided by the light water reactor (LWR) uncertainty analysis in modeling (UAM) benchmark have been used as the manufacturing uncertainties. First, the effect of each input parameter has been investigated through sensitivity calculations at the fuel assembly level. Then, the uncertainty in the prediction of the peaking factor due to the most sensitive input parameter has been estimated using the statistical sampling method, often called the brute force method. For our analysis, the two-dimensional transport lattice code DeCART2D and its ENDF/B-VII.1 based 47-group library were used to perform the lattice physics calculation. Sensitivity calculations have been performed in order to study the influence of manufacturing tolerances on the lattice parameters. The manufacturing tolerance that has the largest influence on the k-inf is the fuel density. The second most sensitive parameter is the outer clad diameter.
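A minimal sketch of the brute-force statistical sampling idea follows. A hypothetical surrogate stands in for the lattice code (the actual study used DeCART2D), and both the assumed sensitivity of the peaking factor to fuel density and the assumed tolerance are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def lattice_calc(fuel_density):
    """Stand-in for a lattice physics run; hypothetical linear response
    of the pin peaking factor to fuel density (coefficient assumed)."""
    nominal_density, nominal_fxy = 10.4, 1.450   # g/cm3, illustrative
    return nominal_fxy * (1.0 + 0.02 * (fuel_density / nominal_density - 1.0))

# Manufacturing tolerance treated as a normal perturbation (assumed sigma).
densities = rng.normal(10.4, 0.05, size=500)
fxy = np.array([lattice_calc(d) for d in densities])

print(f"peaking factor: mean = {fxy.mean():.4f}, std = {fxy.std(ddof=1):.5f}")
```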

  15. Improved Monte Carlo Method for PSA Uncertainty Analysis

    International Nuclear Information System (INIS)

    Choi, Jongsoo

    2016-01-01

The treatment of uncertainty is an important issue for regulatory decisions. Uncertainties exist from knowledge limitations. A probabilistic approach has exposed some of these limitations and provided a framework to assess their significance and assist in developing a strategy to accommodate them in the regulatory process. The uncertainty analysis (UA) is usually based on the Monte Carlo method. This paper proposes a Monte Carlo UA approach to calculate the mean risk metrics accounting for the state-of-knowledge correlation (SOKC) between basic events (including CCFs) using efficient random number generators and to meet Capability Category III of the ASME/ANS PRA standard. Audit calculation is needed in PSA regulatory reviews of uncertainty analysis results submitted for licensing. The proposed Monte Carlo UA approach provides a high degree of confidence in PSA reviews. All PSAs need to account for the SOKC between event probabilities to meet the ASME/ANS PRA standard.
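The effect of the state-of-knowledge correlation can be illustrated with a small Monte Carlo sketch: for two identical basic events whose probability comes from the same (here assumed lognormal) epistemic distribution, one draw per trial must be applied to both events, which raises the mean of their product relative to independent sampling. All distribution parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100_000

# Assumed epistemic distribution for a basic-event probability.
mu, sigma = np.log(1e-3), 0.8

# Risk metric for two identical components that must both fail: p1 * p2.
# Independent sampling ignores the state-of-knowledge correlation (SOKC).
p1, p2 = rng.lognormal(mu, sigma, N), rng.lognormal(mu, sigma, N)
mean_indep = np.mean(p1 * p2)

# SOKC: one draw applies to both events, since they share one data source.
p = rng.lognormal(mu, sigma, N)
mean_sokc = np.mean(p * p)

# With SOKC the mean is E[p^2] = exp(sigma^2) * E[p]^2, i.e. larger.
print(f"independent: {mean_indep:.3e}   with SOKC: {mean_sokc:.3e}")
```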

  16. Improved Monte Carlo Method for PSA Uncertainty Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Jongsoo [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2016-10-15

The treatment of uncertainty is an important issue for regulatory decisions. Uncertainties exist from knowledge limitations. A probabilistic approach has exposed some of these limitations and provided a framework to assess their significance and assist in developing a strategy to accommodate them in the regulatory process. The uncertainty analysis (UA) is usually based on the Monte Carlo method. This paper proposes a Monte Carlo UA approach to calculate the mean risk metrics accounting for the state-of-knowledge correlation (SOKC) between basic events (including CCFs) using efficient random number generators and to meet Capability Category III of the ASME/ANS PRA standard. Audit calculation is needed in PSA regulatory reviews of uncertainty analysis results submitted for licensing. The proposed Monte Carlo UA approach provides a high degree of confidence in PSA reviews. All PSAs need to account for the SOKC between event probabilities to meet the ASME/ANS PRA standard.

  17. Joint analysis of epistemic and aleatory uncertainty in stability analysis for geo-hazard assessments

    Science.gov (United States)

    Rohmer, Jeremy; Verdel, Thierry

    2017-04-01

Uncertainty analysis is an unavoidable task of stability analysis of any geotechnical system. Such analysis usually relies on the safety factor SF: if SF is below some specified threshold, failure is considered possible. The objective of the stability analysis is then to estimate the failure probability P for SF to be below the specified threshold. When dealing with uncertainties, two facets should be considered as outlined by several authors in the domain of geotechnics, namely "aleatoric uncertainty" (also named "randomness" or "intrinsic variability") and "epistemic uncertainty" (i.e. when facing "vague, incomplete or imprecise information" such as limited databases and observations or "imperfect" modelling). The benefits of separating both facets of uncertainty can be seen from a risk management perspective because: - Aleatoric uncertainty, being a property of the system under study, cannot be reduced. However, practical actions can be taken to circumvent the potentially dangerous effects of such variability; - Epistemic uncertainty, being due to the incomplete/imprecise nature of available information, can be reduced by e.g., increasing the number of tests (lab or in situ surveys), improving the measurement methods or evaluating calculation procedures with model tests, confronting more information sources (expert opinions, data from literature, etc.). Uncertainty treatment in stability analysis is usually restricted to the probabilistic framework to represent both facets of uncertainty. Yet, in the domain of geo-hazard assessments (like landslides, mine pillar collapse, rockfalls, etc.), the validity of this approach can be debatable. In the present communication, we propose to review the major criticisms available in the literature against the systematic use of probability in situations of high degree of uncertainty. On this basis, the feasibility of using a more flexible uncertainty representation tool is then investigated, namely Possibility distributions (e

  18. Large scale applicability of a Fully Adaptive Non-Intrusive Spectral Projection technique: Sensitivity and uncertainty analysis of a transient

    International Nuclear Information System (INIS)

    Perkó, Zoltán; Lathouwers, Danny; Kloosterman, Jan Leen; Hagen, Tim van der

    2014-01-01

    Highlights: • Grid and basis adaptive Polynomial Chaos techniques are presented for S and U analysis. • Dimensionality reduction and incremental polynomial order reduce computational costs. • An unprotected loss of flow transient is investigated in a Gas Cooled Fast Reactor. • S and U analysis is performed with MC and adaptive PC methods, for 42 input parameters. • PC accurately estimates means, variances, PDFs, sensitivities and uncertainties. - Abstract: Since the early years of reactor physics the most prominent sensitivity and uncertainty (S and U) analysis methods in the nuclear community have been adjoint based techniques. While these are very effective for pure neutronics problems due to the linearity of the transport equation, they become complicated when coupled non-linear systems are involved. With the continuous increase in computational power such complicated multi-physics problems are becoming progressively tractable, hence affordable and easily applicable S and U analysis tools also have to be developed in parallel. For reactor physics problems for which adjoint methods are prohibitive Polynomial Chaos (PC) techniques offer an attractive alternative to traditional random sampling based approaches. At TU Delft such PC methods have been studied for a number of years and this paper presents a large scale application of our Fully Adaptive Non-Intrusive Spectral Projection (FANISP) algorithm for performing the sensitivity and uncertainty analysis of a Gas Cooled Fast Reactor (GFR) Unprotected Loss Of Flow (ULOF) transient. The transient was simulated using the Cathare 2 code system and a fully detailed model of the GFR2400 reactor design that was investigated in the European FP7 GoFastR project. Several sources of uncertainty were taken into account amounting to an unusually high number of stochastic input parameters (42) and numerous output quantities were investigated. The results show consistently good performance of the applied adaptive PC

  19. Survey of sampling-based methods for uncertainty and sensitivity analysis

    International Nuclear Information System (INIS)

    Helton, J.C.; Johnson, J.D.; Sallaberry, C.J.; Storlie, C.B.

    2006-01-01

Sampling-based methods for uncertainty and sensitivity analysis are reviewed. The following topics are considered: (i) definition of probability distributions to characterize epistemic uncertainty in analysis inputs, (ii) generation of samples from uncertain analysis inputs, (iii) propagation of sampled inputs through an analysis, (iv) presentation of uncertainty analysis results, and (v) determination of sensitivity analysis results. Special attention is given to the determination of sensitivity analysis results, with brief descriptions and illustrations given for the following procedures/techniques: examination of scatterplots, correlation analysis, regression analysis, partial correlation analysis, rank transformations, statistical tests for patterns based on gridding, entropy tests for patterns based on gridding, nonparametric regression analysis, squared rank differences/rank correlation coefficient test, two-dimensional Kolmogorov-Smirnov test, tests for patterns based on distance measures, top down coefficient of concordance, and variance decomposition
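One of the listed techniques, the partial rank correlation coefficient (which combines rank transformation with partial correlation analysis), can be sketched as follows on synthetic data; the model and input distributions are illustrative only.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Synthetic "model": strong effect of x0, weaker effect of x1, none of x2.
n = 1000
X = rng.uniform(0.0, 1.0, size=(n, 3))
y = 4.0 * X[:, 0] + X[:, 1] ** 2 + 0.1 * rng.normal(size=n)

def prcc(X, y):
    """Partial rank correlation coefficient of each input column with y."""
    R = np.column_stack([stats.rankdata(col) for col in X.T])
    ry = stats.rankdata(y)
    coeffs = []
    for j in range(R.shape[1]):
        A = np.column_stack([np.ones(len(ry)), np.delete(R, j, axis=1)])
        # Correlate the parts of x_j and y not explained by the other inputs.
        res_x = R[:, j] - A @ np.linalg.lstsq(A, R[:, j], rcond=None)[0]
        res_y = ry - A @ np.linalg.lstsq(A, ry, rcond=None)[0]
        coeffs.append(stats.pearsonr(res_x, res_y)[0])
    return np.array(coeffs)

print(prcc(X, y))  # large for input 0, moderate for input 1, ~0 for input 2
```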

  20. Survey of sampling-based methods for uncertainty and sensitivity analysis.

    Energy Technology Data Exchange (ETDEWEB)

Johnson, Jay Dean; Helton, Jon Craig; Sallaberry, Cedric J.; Storlie, Curt B. (Colorado State University, Fort Collins, CO)

    2006-06-01

    Sampling-based methods for uncertainty and sensitivity analysis are reviewed. The following topics are considered: (1) Definition of probability distributions to characterize epistemic uncertainty in analysis inputs, (2) Generation of samples from uncertain analysis inputs, (3) Propagation of sampled inputs through an analysis, (4) Presentation of uncertainty analysis results, and (5) Determination of sensitivity analysis results. Special attention is given to the determination of sensitivity analysis results, with brief descriptions and illustrations given for the following procedures/techniques: examination of scatterplots, correlation analysis, regression analysis, partial correlation analysis, rank transformations, statistical tests for patterns based on gridding, entropy tests for patterns based on gridding, nonparametric regression analysis, squared rank differences/rank correlation coefficient test, two dimensional Kolmogorov-Smirnov test, tests for patterns based on distance measures, top down coefficient of concordance, and variance decomposition.

  1. The adjoint sensitivity method, a contribution to the code uncertainty evaluation

    International Nuclear Information System (INIS)

    Ounsy, A.; Brun, B.; De Crecy, F.

    1994-01-01

This paper deals with the application of the adjoint sensitivity method (ASM) to thermal hydraulic codes. The advantage of the method is that it uses little central processing unit time in comparison with the usual approach requiring one complete code run per sensitivity determination. In the first part the mathematical aspects of the problem are treated, and the applicability of the method to functional-type responses of a thermal hydraulic model is demonstrated. The problem has been analysed on a simple example of a non-linear hyperbolic equation (the Burgers equation). It is shown that the formalism used in the literature treating this subject is not appropriate. A new mathematical formalism circumventing the problem is proposed. For the discretized form of the problem, two methods are possible: the continuous ASM and the discrete ASM. The equivalence of both methods is demonstrated; nevertheless only the discrete ASM constitutes a practical solution for thermal hydraulic codes. The application of the discrete ASM to the thermal hydraulic safety code CATHARE is then presented for two examples. They demonstrate that the discrete ASM constitutes an efficient tool for the analysis of code sensitivity. ((orig.))
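The computational advantage of the adjoint approach can be sketched on a toy problem (a parameterized linear system rather than a thermal hydraulic code): one adjoint solve yields the sensitivity of a functional response to a parameter, and a finite-difference forward solve checks it. The matrices and parameterization below are assumptions for illustration.

```python
import numpy as np

# Response J(p) = c^T x with A(p) x = b; here A(p) = A0 + p * A1 (assumed).
A0 = np.array([[4.0, 1.0], [1.0, 3.0]])
A1 = np.array([[1.0, 0.0], [0.0, 2.0]])
b = np.array([1.0, 2.0])
c = np.array([1.0, 1.0])
p = 0.5

A = A0 + p * A1
x = np.linalg.solve(A, b)

# One adjoint solve gives the sensitivity for any number of parameters:
# solve A^T lam = c, then dJ/dp = -lam^T (dA/dp) x.
lam = np.linalg.solve(A.T, c)
dJdp_adjoint = -lam @ (A1 @ x)

# Check with finite differences (one extra forward solve per parameter).
eps = 1e-6
x_eps = np.linalg.solve(A0 + (p + eps) * A1, b)
dJdp_fd = (c @ x_eps - c @ x) / eps

print(dJdp_adjoint, dJdp_fd)   # the two estimates should agree to ~1e-6
```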

  2. Uncertainty modelling and code calibration for composite materials

    DEFF Research Database (Denmark)

    Toft, Henrik Stensgaard; Branner, Kim; Mishnaevsky, Leon, Jr

    2013-01-01

    and measurement uncertainties which are introduced on the different scales. Typically, these uncertainties are taken into account in the design process using characteristic values and partial safety factors specified in a design standard. The value of the partial safety factors should reflect a reasonable balance...... to wind turbine blades are calibrated for two typical lay-ups using a large number of load cases and ratios between the aerodynamic forces and the inertia forces....

  3. Estimating the measurement uncertainty in forensic blood alcohol analysis.

    Science.gov (United States)

    Gullberg, Rod G

    2012-04-01

    For many reasons, forensic toxicologists are being asked to determine and report their measurement uncertainty in blood alcohol analysis. While understood conceptually, the elements and computations involved in determining measurement uncertainty are generally foreign to most forensic toxicologists. Several established and well-documented methods are available to determine and report the uncertainty in blood alcohol measurement. A straightforward bottom-up approach is presented that includes: (1) specifying the measurand, (2) identifying the major components of uncertainty, (3) quantifying the components, (4) statistically combining the components and (5) reporting the results. A hypothetical example is presented that employs reasonable estimates for forensic blood alcohol analysis assuming headspace gas chromatography. These computations are easily employed in spreadsheet programs as well. Determining and reporting measurement uncertainty is an important element in establishing fitness-for-purpose. Indeed, the demand for such computations and information from the forensic toxicologist will continue to increase.
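A hedged sketch of the bottom-up combination described above, with a purely hypothetical uncertainty budget (the component names and values are illustrative, not forensic reference data): independent standard uncertainties are combined in quadrature and expanded with a coverage factor k = 2.

```python
import numpy as np

# Hypothetical uncertainty budget for headspace GC blood alcohol analysis.
bac = 0.0850                      # measured concentration, g/100 mL
components = {                    # relative standard uncertainties (assumed)
    "calibrator_preparation": 0.010,
    "calibration_curve":      0.008,
    "analytical_precision":   0.012,
    "sampling_and_dilution":  0.006,
}

# Combine independent components in quadrature (root sum of squares).
u_rel = np.sqrt(sum(u ** 2 for u in components.values()))
U = 2 * u_rel * bac               # expanded uncertainty, coverage factor k = 2

print(f"BAC = {bac:.4f} ± {U:.4f} g/100 mL (k=2, ~95% coverage)")
```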

  4. Uncertainty Propagation in Hypersonic Vehicle Aerothermoelastic Analysis

    Science.gov (United States)

    Lamorte, Nicolas Etienne

Hypersonic vehicles face a challenging flight environment. The aerothermoelastic analysis of their components requires numerous simplifying approximations. Identifying and quantifying the effect of uncertainties pushes the limits of the existing deterministic models, and is pursued in this work. An uncertainty quantification framework is used to propagate the effects of identified uncertainties on the stability margins and performance of the different systems considered. First, the aeroelastic stability of a typical section representative of a control surface on a hypersonic vehicle is examined. Variability in the uncoupled natural frequencies of the system is modeled to mimic the effect of aerodynamic heating. Next, the stability of an aerodynamically heated panel representing a component of the skin of a generic hypersonic vehicle is considered. Uncertainty in the location of transition from laminar to turbulent flow and the heat flux prediction is quantified using CFD. In both cases significant reductions of the stability margins are observed. A loosely coupled airframe-integrated scramjet engine is considered next. The elongated body and cowl of the engine flow path are subject to harsh aerothermodynamic loading which causes them to deform. Uncertainty associated with deformation prediction is propagated to the engine performance analysis. The cowl deformation is the main contributor to the sensitivity of the propulsion system performance. Finally, a framework for aerothermoelastic stability boundary calculation for hypersonic vehicles using CFD is developed. The use of CFD enables one to consider different turbulence conditions, laminar or turbulent, and different models of the air mixture, in particular a real gas model which accounts for dissociation of molecules at high temperature. The system is found to be sensitive to turbulence modeling as well as the location of the transition from laminar to turbulent flow. Real gas effects play a minor role in the

  5. Uncertainty Instability Risk Analysis of High Concrete Arch Dam Abutments

    Directory of Open Access Journals (Sweden)

    Xin Cao

    2017-01-01

The uncertainties associated with concrete arch dams rise with the increased height of dams. Given the uncertainties associated with influencing factors, the stability of high arch dam abutments as a fuzzy random event was studied. In addition, given the randomness and fuzziness of calculation parameters as well as the failure criterion, hazard point and hazard surface uncertainty instability risk ratio models were proposed for high arch dam abutments on the basis of credibility theory. The uncertainty instability failure criterion was derived through the analysis of the progressive instability failure process on the basis of Shannon’s entropy theory. The uncertainties associated with influencing factors were quantized by probability or possibility distribution assignments. Gaussian random theory was used to generate random realizations for influence factors with spatial variability. The uncertainty stability analysis method was proposed by combining the finite element analysis and the limit equilibrium method. The instability risk ratio was calculated using the Monte Carlo simulation method and fuzzy random postprocessing. Results corroborate that the modeling approach is sound and that the calculation method is feasible.

  6. Spectrum unfolding, sensitivity analysis and propagation of uncertainties with the maximum entropy deconvolution code MAXED

    CERN Document Server

    Reginatto, M; Neumann, S

    2002-01-01

    MAXED was developed to apply the maximum entropy principle to the unfolding of neutron spectrometric measurements. The approach followed in MAXED has several features that make it attractive: it permits inclusion of a priori information in a well-defined and mathematically consistent way, the algorithm used to derive the solution spectrum is not ad hoc (it can be justified on the basis of arguments that originate in information theory), and the solution spectrum is a non-negative function that can be written in closed form. This last feature permits the use of standard methods for the sensitivity analysis and propagation of uncertainties of MAXED solution spectra. We illustrate its use with unfoldings of NE 213 scintillation detector measurements of photon calibration spectra, and of multisphere neutron spectrometer measurements of cosmic-ray induced neutrons at high altitude (approx 20 km) in the atmosphere.
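The flavor of maximum entropy unfolding can be sketched with a simplified, penalized formulation (MAXED itself solves a constrained problem via its dual; this toy version merely trades relative entropy against a chi-squared data misfit). The response matrix, spectra, and weights are all synthetic assumptions.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)

# Toy unfolding problem: few measurements m = R @ f_true, many spectrum bins.
n_bins, n_det = 20, 5
R = rng.uniform(0.0, 1.0, size=(n_det, n_bins))          # response matrix
f_true = np.exp(-0.5 * ((np.arange(n_bins) - 8) / 3.0) ** 2)
m = R @ f_true
sigma = 0.02 * m                                          # measurement std

f0 = np.full(n_bins, f_true.mean())                       # a priori spectrum

def objective(logf):
    f = np.exp(logf)                                      # keeps f positive
    entropy = np.sum(f - f0 - f * np.log(f / f0))         # relative entropy
    chi2 = np.sum(((R @ f - m) / sigma) ** 2)
    return -entropy + 0.5 * chi2                          # maxent + data fit

res = minimize(objective, np.log(f0), method="L-BFGS-B")
f_hat = np.exp(res.x)
print("residual chi2:", np.sum(((R @ f_hat - m) / sigma) ** 2))
```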

  7. Computer codes for safety analysis

    International Nuclear Information System (INIS)

    Holland, D.F.

    1986-11-01

Computer codes for fusion safety analysis have been under development in the United States for about a decade. This paper will discuss five codes that are currently under development by the Fusion Safety Program. The purpose and capability of each code will be presented, a sample application given, followed by a discussion of the present status and future development plans

  8. CATHARE code development and assessment methodologies

    International Nuclear Information System (INIS)

    Micaelli, J.C.; Barre, F.; Bestion, D.

    1995-01-01

The CATHARE thermal-hydraulic code has been developed jointly by Commissariat a l'Energie Atomique (CEA), Electricite de France (EdF), and Framatome for safety analysis. Since the beginning of the project (September 1979), development and assessment activities have followed a methodology supported by two series of experimental tests: separate effects tests and integral effects tests. The purpose of this paper is to describe this methodology, the code assessment status, and the evolution to take into account two new components of this program: the modeling of three-dimensional phenomena and the requirements of code uncertainty evaluation

  9. On epistemic uncertainties in event tree success criteria

    International Nuclear Information System (INIS)

    Jordan Cizelj, R.; Parzer, I.

    2003-01-01

Uncertainty analysis of parameters, which are used as success criteria in PSA event trees, is presented in the paper. The influence of these parameters on the PSA model is indirect, and they are rather subject to epistemic uncertainties. Consequently, point estimates of these parameters cannot be automatically exchanged with probability distributions. For each PSA parameter, the analysis of several influencing factors is performed. As a result, recommended parameter values for sensitivity analysis of the influence of these parameters on PSA results are given. In particular, the parameters related to exposure times were investigated. The values of the exposure times are assessed using different methodologies. The analysis of three parameters is presented in the paper, based on the comparison between the results of the MAAP 3.0B and RELAP5/MOD2 codes. (author)

  10. Verification of the thermal module in the ELESIM code and the associated uncertainty analysis

    International Nuclear Information System (INIS)

    Arimescu, V.I.; Williams, A.F.; Klein, M.E.; Richmond, W.R.; Couture, M.

    1997-01-01

    Temperature is a critical parameter in fuel modelling because most of the physical processes that occur in fuel elements during irradiation are thermally activated. The focus of this paper is the temperature distribution calculation used in the computer code ELESIM, developed at AECL to model the steady state behaviour of CANDU fuel. A validation procedure for fuel codes is described and applied to ELESIM's thermal calculation

  11. Passive active neutron radioassay measurement uncertainty for combustible and glass waste matrices

    International Nuclear Information System (INIS)

    Blackwood, L.G.; Harker, Y.D.; Meachum, T.R.; Yoon, Woo Y.

    1997-01-01

Using a modified statistical sampling and verification approach, the total uncertainty of INEL's Passive Active Neutron (PAN) radioassay system was evaluated for combustible and glass content codes. Waste structure and content of 100 randomly selected drums in each of the waste categories were computer modeled based on review of real-time radiography video tapes. Specific quantities of Pu were added to the drum models according to an experimental design. These drum models were then submitted to Monte Carlo Neutron Photon (MCNP) code processing and subsequent calculations to produce simulated PAN system measurements. The reported Pu masses from the simulation runs were compared with the corresponding input masses. Analysis of the measurement errors produced uncertainty estimates. This paper presents results of the uncertainty calculations and compares them to previously reported results obtained for graphite waste

  12. Two-dimensional cross-section sensitivity and uncertainty analysis for fusion reactor blankets

    International Nuclear Information System (INIS)

    Embrechts, M.J.

    1982-02-01

    A two-dimensional sensitivity and uncertainty analysis for the heating of the TF coil for the FED (fusion engineering device) blanket was performed. The uncertainties calculated are of the same order of magnitude as those resulting from a one-dimensional analysis. The largest uncertainties were caused by the cross section uncertainties for chromium

  13. Uncertainty analysis of nuclear waste package corrosion

    International Nuclear Information System (INIS)

    Kurth, R.E.; Nicolosi, S.L.

    1986-01-01

This paper describes the results of an evaluation of three uncertainty analysis methods for assessing the possible variability in calculating the corrosion process in a nuclear waste package. The purpose of the study is the determination of how each of three uncertainty analysis methods, Monte Carlo, Latin hypercube sampling (LHS) and a modified discrete probability distribution method, performs in such calculations. The purpose is not to examine the absolute magnitude of the numbers but rather to rank the performance of each of the uncertainty methods in assessing the model variability. In this context it was found that the Monte Carlo method provided the most accurate assessment but at a prohibitively high cost. The modified discrete probability method provided accuracy close to that of the Monte Carlo for a fraction of the cost. The LHS method was found to be too inaccurate for this calculation although it would be appropriate for use in a model which requires substantially more computer time than the one studied in this paper
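The Monte Carlo versus Latin hypercube comparison can be sketched as follows; the two-input "corrosion model" and its distributions are hypothetical, and the point is the smaller replicate-to-replicate scatter of the LHS mean estimate at equal sample size.

```python
import numpy as np
from scipy.stats import norm, qmc

rng = np.random.default_rng(5)
lhs_engine = qmc.LatinHypercube(d=2, seed=5)

def corrosion_depth(x):
    """Hypothetical corrosion response: base rate amplified by pitting."""
    return x[..., 0] * (1.0 + 0.5 * x[..., 1] ** 2)

def to_inputs(u):
    # Map uniform samples to the two (assumed normal) model inputs.
    return norm.ppf(u, loc=[1.0, 0.5], scale=[0.2, 0.1])

mc_means = [corrosion_depth(to_inputs(rng.random((100, 2)))).mean()
            for _ in range(50)]
lhs_means = [corrosion_depth(to_inputs(lhs_engine.random(100))).mean()
             for _ in range(50)]

# LHS stratifies each input dimension, so the scatter of the mean
# estimate across replicates is smaller than for plain Monte Carlo.
print("MC  spread of mean:", np.std(mc_means))
print("LHS spread of mean:", np.std(lhs_means))
```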

  14. Uncertainty and validation. Effect of user interpretation on uncertainty estimates

    International Nuclear Information System (INIS)

    Kirchner, G.; Peterson, R.

    1996-11-01

Uncertainty in predictions of environmental transfer models arises from, among other sources, the adequacy of the conceptual model, the approximations made in coding the conceptual model, the quality of the input data, the uncertainty in parameter values, and the assumptions made by the user. In recent years efforts to quantify the confidence that can be placed in predictions have been increasing, but have concentrated on a statistical propagation of the influence of parameter uncertainties on the calculational results. The primary objective of this Working Group of BIOMOVS II was to test the user's influence on model predictions on a more systematic basis than has been done before. The main goals were as follows: To compare differences between predictions from different people all using the same model and the same scenario description with the statistical uncertainties calculated by the model. To investigate the main reasons for different interpretations by users. To create a better awareness of the potential influence of the user on the modeling results. Terrestrial food chain models driven by deposition of radionuclides from the atmosphere were used. Three codes were obtained and run with three scenarios by a maximum of 10 users. A number of conclusions can be drawn, some of which are general and independent of the type of models and processes studied, while others are restricted to the few processes that were addressed directly: For any set of predictions, the variation in best estimates was greater than one order of magnitude. Often the range increased from deposition to pasture to milk, probably due to additional transfer processes. The 95% confidence intervals about the predictions calculated from the parameter distributions prepared by the participants did not always overlap the observations; similarly, sometimes the confidence intervals on the predictions did not overlap. Often the 95% confidence intervals of individual predictions were smaller than the

  15. Uncertainty and validation. Effect of user interpretation on uncertainty estimates

    Energy Technology Data Exchange (ETDEWEB)

Kirchner, G. [Univ. of Bremen (Germany)]; Peterson, R. [AECL, Chalk River, ON (Canada)]; and others

    1996-11-01

Uncertainty in predictions of environmental transfer models arises from, among other sources, the adequacy of the conceptual model, the approximations made in coding the conceptual model, the quality of the input data, the uncertainty in parameter values, and the assumptions made by the user. In recent years efforts to quantify the confidence that can be placed in predictions have been increasing, but have concentrated on a statistical propagation of the influence of parameter uncertainties on the calculational results. The primary objective of this Working Group of BIOMOVS II was to test the user's influence on model predictions on a more systematic basis than has been done before. The main goals were as follows: To compare differences between predictions from different people all using the same model and the same scenario description with the statistical uncertainties calculated by the model. To investigate the main reasons for different interpretations by users. To create a better awareness of the potential influence of the user on the modeling results. Terrestrial food chain models driven by deposition of radionuclides from the atmosphere were used. Three codes were obtained and run with three scenarios by a maximum of 10 users. A number of conclusions can be drawn, some of which are general and independent of the type of models and processes studied, while others are restricted to the few processes that were addressed directly: For any set of predictions, the variation in best estimates was greater than one order of magnitude. Often the range increased from deposition to pasture to milk, probably due to additional transfer processes. The 95% confidence intervals about the predictions calculated from the parameter distributions prepared by the participants did not always overlap the observations; similarly, sometimes the confidence intervals on the predictions did not overlap. Often the 95% confidence intervals of individual predictions were smaller than the

16. Risk Characterization: description of associated uncertainties, sensitivity analysis

    International Nuclear Information System (INIS)

    Carrillo, M.; Tovar, M.; Alvarez, J.; Arraez, M.; Hordziejewicz, I.; Loreto, I.

    2013-01-01

The PowerPoint presentation addresses risks at the estimated levels of exposure, uncertainty and variability in the analysis, sensitivity analysis, risks from exposure to multiple substances, formulation of guidelines for carcinogenic and genotoxic compounds, and risks for subpopulations

  17. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis.

    Science.gov (United States)

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-03-01

GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights.
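A minimal sketch of propagating criteria-weight uncertainty through an MCDA ranking (the weights, scores, and Dirichlet perturbation model are assumptions for illustration, not the paper's data):

```python
import numpy as np

rng = np.random.default_rng(11)

# Nominal criteria weights (assumed AHP output) and alternatives' scores.
w0 = np.array([0.5, 0.3, 0.2])
scores = np.array([[0.8, 0.4, 0.9],     # alternative A
                   [0.6, 0.7, 0.5],     # alternative B
                   [0.9, 0.2, 0.6]])    # alternative C

# Monte Carlo perturbation of the weights; a Dirichlet draw keeps them
# non-negative and summing to one (concentration controls the spread).
wins = np.zeros(len(scores))
for _ in range(5000):
    w = rng.dirichlet(200 * w0)
    wins[np.argmax(scores @ w)] += 1

print("rank-1 frequency per alternative:", wins / wins.sum())
```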

  18. The role of sensitivity analysis in assessing uncertainty

    International Nuclear Information System (INIS)

    Crick, M.J.; Hill, M.D.

    1987-01-01

Outside the specialist world of those carrying out performance assessments, considerable confusion has arisen about the meanings of sensitivity analysis and uncertainty analysis. In this paper we attempt to reduce this confusion. We then go on to review approaches to sensitivity analysis within the context of assessing uncertainty, and to outline the types of test available to identify sensitive parameters, together with their advantages and disadvantages. The views expressed in this paper are those of the authors; they have not been formally endorsed by the National Radiological Protection Board and should not be interpreted as Board advice

  19. Including uncertainty in hazard analysis through fuzzy measures

    International Nuclear Information System (INIS)

    Bott, T.F.; Eisenhawer, S.W.

    1997-12-01

This paper presents a method for capturing the uncertainty expressed by a Hazard Analysis (HA) expert team when estimating the frequencies and consequences of accident sequences and provides a sound mathematical framework for propagating this uncertainty to the risk estimates for these accident sequences. The uncertainty is readily expressed as distributions that can visually aid the analyst in determining the extent and source of risk uncertainty in HA accident sequences. The results also can be expressed as single statistics of the distribution in a manner analogous to expressing a probabilistic distribution as a point-value statistic such as a mean or median. The study discussed here used data collected during the elicitation portion of an HA on a high-level waste transfer process to demonstrate the techniques for capturing uncertainty. These data came from observations of the uncertainty that HA team members expressed in assigning frequencies and consequences to accident sequences during an actual HA. This uncertainty was captured and manipulated using ideas from possibility theory. The result of this study is a practical method for displaying and assessing the uncertainty in the HA team estimates of the frequency and consequences for accident sequences. This uncertainty provides potentially valuable information about accident sequences that typically is lost in the HA process

  20. Uncertainty Assessments in Fast Neutron Activation Analysis

    International Nuclear Information System (INIS)

    W. D. James; R. Zeisler

    2000-01-01

Fast neutron activation analysis (FNAA) carried out with the use of small accelerator-based neutron generators is routinely used for major/minor element determinations in industry, mineral and petroleum exploration, and to some extent in research. While the method shares many of the operational procedures and therefore errors inherent to conventional thermal neutron activation analysis, its unique implementation gives rise to additional specific concerns that can result in errors or increased uncertainties of measured quantities. The authors were involved in a recent effort to evaluate irreversible incorporation of oxygen into a standard reference material (SRM) by direct measurement of oxygen by FNAA. That project required determination of oxygen in bottles of the SRM stored in varying environmental conditions and a comparison of the results. We recognized the need to accurately describe the total uncertainty of the measurements to accurately characterize any differences in the resulting average concentrations. It is our intent here to discuss the breadth of parameters that can contribute to the random and nonrandom errors of the method and to provide estimates of the magnitude of uncertainty introduced. In addition, we will discuss the steps taken in this recent FNAA project to control quality, assess the uncertainty of the measurements, and evaluate results based on statistical reproducibility

  1. Statistically based uncertainty assessments in nuclear risk analysis

    International Nuclear Information System (INIS)

    Spencer, F.W.; Diegert, K.V.; Easterling, R.G.

    1987-01-01

Over the last decade, the problems of estimation and uncertainty assessment in probabilistic risk assessments (PRAs) have been addressed in a variety of NRC and industry-sponsored projects. These problems have received attention because of a recognition that major uncertainties in risk estimation exist, which can be reduced by collecting more and better data and other information, and because of a recognition that better methods for assessing these uncertainties are needed. In particular, a clear understanding of the nature and magnitude of various sources of uncertainty is needed to facilitate decision-making on possible plant changes and research options. Recent PRAs have employed methods of probability propagation, sometimes involving the use of Bayes Theorem, and intended to formalize the use of "engineering judgment" or "expert opinion." All sources, or feelings, of uncertainty are expressed probabilistically, so that uncertainty analysis becomes simply a matter of probability propagation. Alternatives to forcing a probabilistic framework at all stages of a PRA are a major concern in this paper, however

  2. DS02 uncertainty analysis

    International Nuclear Information System (INIS)

    Kaul, Dean C.; Egbert, Stephen D.; Woolson, William A.

    2005-01-01

    In order to avoid the pitfalls that so discredited DS86 and its uncertainty estimates, and to provide DS02 uncertainties that are both defensible and credible, this report not only presents the ensemble uncertainties assembled from uncertainties in individual computational elements and radiation dose components but also describes how these relate to comparisons between observed and computed quantities at critical intervals in the computational process. These comparisons include those between observed and calculated radiation free-field components, where observations include thermal- and fast-neutron activation and gamma-ray thermoluminescence, which are relevant to the estimated systematic uncertainty for DS02. The comparisons also include those between calculated and observed survivor shielding, where the observations consist of biodosimetric measurements for individual survivors, which are relevant to the estimated random uncertainty for DS02. (J.P.N.)

  3. Sensitivity and uncertainty analysis of the PATHWAY radionuclide transport model

    International Nuclear Information System (INIS)

    Otis, M.D.

    1983-01-01

    Procedures were developed for the uncertainty and sensitivity analysis of a dynamic model of radionuclide transport through human food chains. Uncertainty in model predictions was estimated by propagation of parameter uncertainties using a Monte Carlo simulation technique. Sensitivity of model predictions to individual parameters was investigated using the partial correlation coefficient of each parameter with model output. Random values produced for the uncertainty analysis were used in the correlation analysis for sensitivity. These procedures were applied to the PATHWAY model which predicts concentrations of radionuclides in foods grown in Nevada and Utah and exposed to fallout during the period of atmospheric nuclear weapons testing in Nevada. Concentrations and time-integrated concentrations of iodine-131, cesium-136, and cesium-137 in milk and other foods were investigated. 9 figs., 13 tabs

  4. The characterisation and evaluation of uncertainty in probabilistic risk analysis

    International Nuclear Information System (INIS)

    Parry, G.W.; Winter, P.W.

    1980-10-01

    The sources of uncertainty in probabilistic risk analysis are discussed using the event/fault tree methodology as an example. The role of statistics in quantifying these uncertainties is investigated. A class of uncertainties is identified which is, at present, unquantifiable, using either classical or Bayesian statistics. It is argued that Bayesian statistics is the more appropriate vehicle for the probabilistic analysis of rare events and a short review is given with some discussion on the representation of ignorance. (author)

  5. Code package to analyse behavior of the WWER fuel rods in normal operation: TOPRA's code

    International Nuclear Information System (INIS)

    Scheglov, A.; Proselkov, V.

    2001-01-01

This paper briefly describes the code package intended for analysis of WWER fuel rod characteristics. The package includes two computer codes, TOPRA-1 and TOPRA-2, for full-scale fuel rod analyses, and the MRZ and MKK codes for analyzing separate sections of fuel rods in r-z and r-φ geometry. The TOPRA codes are developed on the basis of the PIN-mod2 version and verified against experimental results obtained in the MR, MIR and Halden research reactors (in the framework of the SOFIT, FGR-2 and FUMEX experimental programs). Comparative analysis of calculation results and results from post-reactor examination of WWER-440 and WWER-1000 fuel rods is also made as additional verification of these codes. To avoid enlarging the uncertainties in fuel behavior prediction as a result of simplifying the fuel geometry, the MKK and MRZ codes are developed on the basis of the finite element method with the use of three-nodal finite elements. Results obtained in the course of the code verification indicate the possibility of applying the method and the TOPRA codes for simplified engineering calculations of WWER fuel rod thermal-physical parameters. An analysis of maximum relative errors for prediction of the fuel rod characteristics in the range of the accepted parameter values is also presented in the paper

  6. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    Science.gov (United States)

    Gunar, O V; Sakhno, N G

    2015-12-30

The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. The analysis of the most important sources of variability in the quantitative microbiological methods demonstrated no effect of culture media and plate-count techniques on the estimation of microbial count, while a highly significant effect of other factors (type of microorganism, pharmaceutical product and individual reading and interpreting errors) was established. The most appropriate method of statistical analysis of such data was ANOVA, which enabled not only the effect of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These data did not exceed 35%, appropriate for traditional plate-count methods. Copyright © 2015 Elsevier B.V. All rights reserved.

  7. SWEPP PAN assay system uncertainty analysis: Passive mode measurements of graphite waste

    International Nuclear Information System (INIS)

    Blackwood, L.G.; Harker, Y.D.; Meachum, T.R.; Yoon, Woo Y.

    1997-07-01

    The Idaho National Engineering and Environmental Laboratory is being used as a temporary storage facility for transuranic waste generated by the U.S. Nuclear Weapons program at the Rocky Flats Plant (RFP) in Golden, Colorado. Currently, there is a large effort in progress to prepare to ship this waste to the Waste Isolation Pilot Plant (WIPP) in Carlsbad, New Mexico. In order to meet the TRU Waste Characterization Quality Assurance Program Plan nondestructive assay compliance requirements and quality assurance objectives, it is necessary to determine the total uncertainty of the radioassay results produced by the Stored Waste Examination Pilot Plant (SWEPP) Passive Active Neutron (PAN) radioassay system. To this end a modified statistical sampling and verification approach has been developed to determine the total uncertainty of a PAN measurement. In this approach the total performance of the PAN nondestructive assay system is simulated using computer models of the assay system and the resultant output is compared with the known input to assess the total uncertainty. This paper is one of a series of reports quantifying the results of the uncertainty analysis of the PAN system measurements for specific waste types and measurement modes. In particular this report covers passive mode measurements of weapons grade plutonium-contaminated graphite molds contained in 208 liter drums (waste code 300). The validity of the simulation approach is verified by comparing simulated output against results from measurements using known plutonium sources and a surrogate graphite waste form drum. For actual graphite waste form conditions, a set of 50 cases covering a statistical sampling of the conditions exhibited in graphite wastes was compiled using a Latin hypercube statistical sampling approach

  8. Light-water reactor safety analysis codes

    International Nuclear Information System (INIS)

    Jackson, J.F.; Ransom, V.H.; Ybarrondo, L.J.; Liles, D.R.

    1980-01-01

    A brief review of the evolution of light-water reactor safety analysis codes is presented. Included is a summary comparison of the technical capabilities of major system codes. Three recent codes are described in more detail to serve as examples of currently used techniques. Example comparisons between calculated results using these codes and experimental data are given. Finally, a brief evaluation of current code capability and future development trends is presented

  9. Development of a Prototype Model-Form Uncertainty Knowledge Base

    Science.gov (United States)

    Green, Lawrence L.

    2016-01-01

    Uncertainties are generally classified as either aleatory or epistemic. Aleatory uncertainties are those attributed to random variation, either naturally or through manufacturing processes. Epistemic uncertainties are generally attributed to a lack of knowledge. One type of epistemic uncertainty is called model-form uncertainty. The term model-form means that among the choices to be made during a design process within an analysis, there are different forms of the analysis process, which each give different results for the same configuration at the same flight conditions. Examples of model-form uncertainties include the grid density, grid type, and solver type used within a computational fluid dynamics code, or the choice of the number and type of model elements within a structures analysis. The objectives of this work are to identify and quantify a representative set of model-form uncertainties and to make this information available to designers through an interactive knowledge base (KB). The KB can then be used during probabilistic design sessions, so as to enable the possible reduction of uncertainties in the design process through resource investment. An extensive literature search has been conducted to identify and quantify typical model-form uncertainties present within aerospace design. An initial attempt has been made to assemble the results of this literature search into a searchable KB, usable in real time during probabilistic design sessions. A concept of operations and the basic structure of a model-form uncertainty KB are described. Key operations within the KB are illustrated. Current limitations in the KB, and possible workarounds are explained.

  10. Uncertainty analysis in WWTP model applications: a critical discussion using an example from design

    DEFF Research Database (Denmark)

    Sin, Gürkan; Gernaey, Krist; Neumann, Marc B.

    2009-01-01

    of design performance criteria differs significantly. The implication for the practical applications of uncertainty analysis in the wastewater industry is profound: (i) as the uncertainty analysis results are specific to the framing used, the results must be interpreted within the context of that framing......This study focuses on uncertainty analysis of WWTP models and analyzes the issue of framing and how it affects the interpretation of uncertainty analysis results. As a case study, the prediction of uncertainty involved in model-based design of a wastewater treatment plant is studied. The Monte...... to stoichiometric, biokinetic and influent parameters; (2) uncertainty due to hydraulic behaviour of the plant and mass transfer parameters; (3) uncertainty due to the combination of (1) and (2). The results demonstrate that depending on the way the uncertainty analysis is framed, the estimated uncertainty...

  11. Principles and applications of measurement and uncertainty analysis in research and calibration

    Energy Technology Data Exchange (ETDEWEB)

    Wells, C.V.

    1992-11-01

Interest in Measurement Uncertainty Analysis has grown in the past several years as it has spread to new fields of application, and research and development of uncertainty methodologies have continued. This paper discusses the subject from the perspectives of both research and calibration environments. It presents a history of the development and an overview of the principles of uncertainty analysis embodied in the United States National Standard, ANSI/ASME PTC 19.1-1985, Measurement Uncertainty. Examples are presented in which uncertainty analysis was utilized or is needed to gain further knowledge of a particular measurement process and to characterize final results. Measurement uncertainty analysis provides a quantitative estimate of the interval about a measured value or an experiment result within which the true value of that quantity is expected to lie. Years ago, Harry Ku of the United States National Bureau of Standards stated that "The informational content of the statement of uncertainty determines, to a large extent, the worth of the calibrated value." Today, that statement is just as true about calibration or research results as it was in 1968. Why is that true? What kind of information should we include in a statement of uncertainty accompanying a calibrated value? How and where do we get the information to include in an uncertainty statement? How should we interpret and use measurement uncertainty information? This discussion will provide answers to these and other questions about uncertainty in research and in calibration. The methodology to be described has been developed by national and international groups over the past nearly thirty years, and individuals were publishing information even earlier. Yet the work is largely unknown in many science and engineering arenas. I will illustrate various aspects of uncertainty analysis with some examples drawn from the radiometry measurement and calibration discipline from research activities.

  12. Principles and applications of measurement and uncertainty analysis in research and calibration

    Energy Technology Data Exchange (ETDEWEB)

    Wells, C.V.

    1992-11-01

Interest in Measurement Uncertainty Analysis has grown in the past several years as it has spread to new fields of application, and research and development of uncertainty methodologies have continued. This paper discusses the subject from the perspectives of both research and calibration environments. It presents a history of the development and an overview of the principles of uncertainty analysis embodied in the United States National Standard, ANSI/ASME PTC 19.1-1985, Measurement Uncertainty. Examples are presented in which uncertainty analysis was utilized or is needed to gain further knowledge of a particular measurement process and to characterize final results. Measurement uncertainty analysis provides a quantitative estimate of the interval about a measured value or an experiment result within which the true value of that quantity is expected to lie. Years ago, Harry Ku of the United States National Bureau of Standards stated that "The informational content of the statement of uncertainty determines, to a large extent, the worth of the calibrated value." Today, that statement is just as true about calibration or research results as it was in 1968. Why is that true? What kind of information should we include in a statement of uncertainty accompanying a calibrated value? How and where do we get the information to include in an uncertainty statement? How should we interpret and use measurement uncertainty information? This discussion will provide answers to these and other questions about uncertainty in research and in calibration. The methodology to be described has been developed by national and international groups over the past nearly thirty years, and individuals were publishing information even earlier. Yet the work is largely unknown in many science and engineering arenas. I will illustrate various aspects of uncertainty analysis with some examples drawn from the radiometry measurement and calibration discipline from research activities.

13. Error Analysis of CM Data Products: Sources of Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Hunt, Brian D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Eckert-Gallup, Aubrey Celia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Cochran, Lainy Dromgoole [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kraus, Terrence D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Allen, Mark B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Beal, Bill [National Security Technologies, Joint Base Andrews, MD (United States); Okada, Colin [National Security Technologies, LLC. (NSTec), Las Vegas, NV (United States); Simpson, Mathew [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-02-01

The goal of this project is to address the current inability to assess the overall error and uncertainty of data products developed and distributed by DOE’s Consequence Management (CM) Program. This is a widely recognized shortfall, the resolution of which would provide a great deal of value and defensibility to the analysis results, data products, and the decision making process that follows this work. A global approach to this problem is necessary because multiple sources of error and uncertainty contribute to the ultimate production of CM data products. Therefore, this project will require collaboration with subject matter experts across a wide range of FRMAC skill sets in order to quantify the types of uncertainty that each area of the CM process might contain and to understand how variations in these uncertainty sources contribute to the aggregated uncertainty present in CM data products. The ultimate goal of this project is to quantify the confidence level of CM products to ensure that appropriate public and worker protection decisions are supported by defensible analysis.

  14. Uncertainty analysis of hydrological modeling in a tropical area using different algorithms

    Science.gov (United States)

    Rafiei Emam, Ammar; Kappas, Martin; Fassnacht, Steven; Linh, Nguyen Hoang Khanh

    2018-01-01

    Hydrological modeling outputs are subject to uncertainty resulting from different sources of error (e.g., errors in input data, model structure, and model parameters), making quantification of uncertainty in hydrological modeling imperative in order to improve the reliability of modeling results. Uncertainty analysis must also contend with difficulties in the calibration of hydrological models, which increase further in areas with data scarcity. The purpose of this study is to apply four uncertainty analysis algorithms to a semi-distributed hydrological model, quantifying different sources of uncertainty (especially parameter uncertainty) and evaluating their performance. In this study, the Soil and Water Assessment Tool (SWAT) eco-hydrological model was implemented for a watershed in central Vietnam. The sensitivity of the parameters was analyzed, and the model was calibrated. The uncertainty analysis for the hydrological model was conducted with four algorithms: Generalized Likelihood Uncertainty Estimation (GLUE), Sequential Uncertainty Fitting (SUFI), the Parameter Solution method (ParaSol) and Particle Swarm Optimization (PSO). The performance of the algorithms was compared using the P-factor and R-factor, the coefficient of determination (R2), the Nash-Sutcliffe efficiency (NSE) and the percent bias (PBIAS). The results showed the high performance of SUFI and PSO, with P-factor > 0.83, R-factor of 0.91, NSE > 0.89, and PBIAS of 0.18 in the uncertainty analysis. Indeed, uncertainty analysis must be accounted for when the outcomes of the model are used for policy or management decisions.
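
    The behavioural-sampling idea behind GLUE can be illustrated with a short sketch. The toy one-parameter runoff model, the NSE threshold of 0.7 and all numbers below are invented for illustration and are not the study's SWAT setup; the P-factor (fraction of observations inside the 95% prediction band) and R-factor (band width relative to the standard deviation of the observations) are computed as defined in the abstract.

        # GLUE-style uncertainty analysis on a toy rainfall-runoff model (hypothetical data).
        import numpy as np

        rng = np.random.default_rng(0)
        rain = rng.gamma(2.0, 5.0, size=200)                 # synthetic forcing
        obs = 0.6 * rain + rng.normal(0.0, 2.0, size=200)    # synthetic "observations"

        def nse(sim):                                        # Nash-Sutcliffe efficiency
            return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

        samples = rng.uniform(0.2, 1.0, size=5000)           # prior range of the runoff coefficient
        sims = np.array([k * rain for k in samples])         # one simulation per sampled parameter
        behavioural = sims[np.array([nse(s) for s in sims]) > 0.7]

        lo, hi = np.percentile(behavioural, [2.5, 97.5], axis=0)
        p_factor = np.mean((obs >= lo) & (obs <= hi))        # coverage of the 95% band
        r_factor = np.mean(hi - lo) / obs.std()              # relative band width
        print(f"behavioural runs: {len(behavioural)}, P-factor {p_factor:.2f}, R-factor {r_factor:.2f}")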

  15. Fuzzy probability based fault tree analysis to propagate and quantify epistemic uncertainty

    International Nuclear Information System (INIS)

    Purba, Julwan Hendry; Sony Tjahyani, D.T.; Ekariansyah, Andi Sofrany; Tjahjono, Hendro

    2015-01-01

    Highlights: • Fuzzy probability based fault tree analysis evaluates epistemic uncertainty in fuzzy fault tree analysis. • Fuzzy probabilities represent the likelihood of occurrence of all events in a fault tree. • A fuzzy multiplication rule quantifies the epistemic uncertainty of minimal cut sets. • A fuzzy complement rule estimates the epistemic uncertainty of the top event. • The proposed FPFTA successfully evaluated the U.S. Combustion Engineering RPS. - Abstract: A number of fuzzy fault tree analysis approaches, which integrate fuzzy concepts into the quantitative phase of conventional fault tree analysis, have been proposed to study the reliability of engineering systems. These new approaches apply expert judgments to overcome the limitation of conventional fault tree analysis when basic events do not have probability distributions. Since expert judgments might come with epistemic uncertainty, it is important to quantify the overall uncertainties of the fuzzy fault tree analysis. Monte Carlo simulation is commonly used to quantify the overall uncertainties of conventional fault tree analysis. However, since Monte Carlo simulation is based on probability distributions, the technique is not appropriate for fuzzy fault tree analysis, which is based on fuzzy probabilities. The objective of this study is to develop a fuzzy probability based fault tree analysis (FPFTA) that overcomes this limitation. To demonstrate the applicability of the proposed approach, a case study is performed and its results are then compared to the results of a conventional fault tree analysis. The results confirm that the proposed fuzzy probability based fault tree analysis is feasible for propagating and quantifying epistemic uncertainties in fault tree analysis.
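
    The fuzzy multiplication and complement rules mentioned in the highlights can be sketched with alpha-cut interval arithmetic on triangular fuzzy probabilities. The cut sets and membership functions below are invented, and this is a generic illustration rather than the authors' exact FPFTA formulation; because both gate functions are monotone in each event probability, the alpha-cut endpoints propagate directly.

        # Alpha-cut propagation of triangular fuzzy probabilities through a small fault tree.
        import numpy as np

        def alpha_cut(tfn, alpha):
            # Interval of a triangular fuzzy number (a, m, b) at membership level alpha.
            a, m, b = tfn
            return a + alpha * (m - a), b - alpha * (b - m)

        def and_gate(events, alpha):
            # Fuzzy multiplication rule: product of event intervals (a minimal cut set).
            los, his = zip(*(alpha_cut(e, alpha) for e in events))
            return float(np.prod(los)), float(np.prod(his))

        def or_gate(cut_sets, alpha):
            # Fuzzy complement rule for the top event: 1 - prod(1 - p_i), monotone in each p_i.
            intervals = [and_gate(cs, alpha) for cs in cut_sets]
            lo = 1.0 - float(np.prod([1.0 - l for l, _ in intervals]))
            hi = 1.0 - float(np.prod([1.0 - h for _, h in intervals]))
            return lo, hi

        cs1 = [(1e-3, 2e-3, 4e-3), (5e-4, 1e-3, 2e-3)]   # hypothetical basic-event fuzzy probabilities
        cs2 = [(2e-4, 5e-4, 9e-4)]
        for alpha in (0.0, 0.5, 1.0):
            lo, hi = or_gate([cs1, cs2], alpha)
            print(f"alpha = {alpha:.1f}: top-event probability in [{lo:.3e}, {hi:.3e}]")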

  16. New challenges on uncertainty propagation assessment of flood risk analysis

    Science.gov (United States)

    Martins, Luciano; Aroca-Jiménez, Estefanía; Bodoque, José M.; Díez-Herrero, Andrés

    2016-04-01

    Natural hazards, such as floods, cause considerable damage to human life and to material and functional assets every year, around the world. Risk assessment procedures carry a set of uncertainties, mainly of two types: natural, derived from the stochastic character inherent in flood process dynamics; and epistemic, associated with lack of knowledge or with flawed procedures employed in the study of these processes. There is abundant scientific and technical literature on the estimation of uncertainties in each step of flood risk analysis (e.g., rainfall estimates, hydraulic modelling variables), but very little experience with propagating the uncertainties through the full flood risk assessment. Epistemic uncertainties are therefore the main goal of this work; in particular, we seek to understand the extent of the propagation of uncertainties throughout the process, from inundation studies through to risk analysis, and how much they can alter a proper analysis of the risk of flooding. Methodologies such as Polynomial Chaos Theory (PCT), the Method of Moments and Monte Carlo simulation are used to evaluate different sources of error, such as data records (precipitation gauges, flow gauges...), hydrologic and hydraulic modelling (inundation estimation) and socio-demographic data (damage estimation), in order to evaluate the uncertainty propagation (UP) in design flood risk estimation, in both numerical and cartographic expression. In order to consider the total uncertainty and to understand which factors contribute most to it, we used Polynomial Chaos Theory (PCT). PCT represents an interesting way to handle the inclusion of uncertainty in the modelling and simulation process: it allows the development of a probabilistic model of the system in a deterministic setting, using random variables and polynomials to handle the effects of uncertainty. Results of applying the method are more robust than those of traditional analysis.
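
    As a one-dimensional illustration of the PCT machinery (the damage curve and input distribution below are hypothetical, not the authors' flood model), the output of a function of a Gaussian variable can be projected onto probabilists' Hermite polynomials; the spectral coefficients then give the output mean and variance without Monte Carlo sampling.

        # Non-intrusive polynomial chaos expansion of Y = g(X), X ~ N(mu, sigma).
        import math
        import numpy as np
        from numpy.polynomial.hermite_e import hermegauss, hermeval

        def pce_mean_var(g, mu, sigma, order=4, nquad=20):
            z, w = hermegauss(nquad)                 # Gauss-Hermite nodes/weights, weight exp(-z^2/2)
            w = w / np.sqrt(2.0 * np.pi)             # normalise to the standard normal density
            y = g(mu + sigma * z)
            # Spectral coefficients c_n = E[g(X) He_n(Z)] / n!  (He_n orthogonal, E[He_n^2] = n!)
            coeffs = [np.sum(w * y * hermeval(z, [0.0] * n + [1.0])) / math.factorial(n)
                      for n in range(order + 1)]
            variance = sum(c * c * math.factorial(n) for n, c in enumerate(coeffs) if n > 0)
            return coeffs[0], variance               # output mean and variance

        # Hypothetical flood-damage curve driven by an uncertain peak discharge.
        damage = lambda q: 1.0e3 * np.maximum(q - 2.0, 0.0) ** 1.5
        mean, var = pce_mean_var(damage, mu=3.0, sigma=0.5)
        print(f"damage mean = {mean:.1f}, std = {np.sqrt(var):.1f}")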

  17. Developments of HTGR thermofluid dynamic analysis codes and HTGR plant dynamic simulation code

    International Nuclear Information System (INIS)

    Tanaka, Mitsuhiro; Izaki, Makoto; Koike, Hiroyuki; Tokumitsu, Masashi

    1983-01-01

    In high temperature gas-cooled reactor plants, as in other nuclear power plants, the design is largely performed on the basis of characteristics grasped through numerical simulation with analysis codes. At Kawasaki Heavy Industries Ltd., building on the system engineering experience accumulated with gas-cooled reactors over several years, the preparation and systematization of analysis codes have been advanced, with the aim of completing a suite of codes for heat-transfer flow and control characteristics, taking HTGR plants as the main object. This report describes a part of those results. Examples are given of applying the two-dimensional compressible flow analysis codes SOLA-VOF and SALE-2D, which were developed by Los Alamos National Laboratory in the USA and modified for use at Kawasaki, to HTGR systems. In addition, Kawasaki has developed the control characteristics analysis code DYSCO, in which the system composition can be changed easily and which offers high versatility. The outline, fundamental equations, fundamental algorithms and application examples of SOLA-VOF and SALE-2D, the present status of the system characteristic simulation codes, and the outline of DYSCO are described. (Kako, I.)

  18. Analysis of fission gas release in LWR fuel using the BISON code

    Energy Technology Data Exchange (ETDEWEB)

    G. Pastore; J.D. Hales; S.R. Novascone; D.M. Perez; B.W. Spencer; R.L. Williamson

    2013-09-01

    Recent advances in the development of the finite-element based, multidimensional fuel performance code BISON of Idaho National Laboratory are presented. Specifically, the development, implementation and testing of a new model for the analysis of fission gas behavior in LWR UO2 fuel during irradiation are summarized. While retaining a physics-based description of the relevant mechanisms, the model is characterized by a level of complexity suitable for application to engineering-scale nuclear fuel analysis and consistent with the uncertainties pertaining to some of its parameters. The treatment includes the fundamental features of fission gas behavior, among which are gas diffusion and precipitation in fuel grains, growth and coalescence of gas bubbles at grain faces, grain growth and grain-boundary sweeping effects, and thermal, athermal, and transient gas release. The BISON code incorporating the new model is applied to the simulation of irradiation experiments from the OECD/NEA International Fuel Performance Experiments database, also included in the IAEA coordinated research projects FUMEX-II and FUMEX-III. A comparison of the results with the available experimental data at moderate burn-up is presented, pointing to an encouraging predictive accuracy, without any fitting applied to the model parameters.
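
    The diffusion stage of such a model is often idealized with Booth's equivalent-sphere picture. The sketch below evaluates the classical series solution for release from a sphere under post-irradiation annealing, with no trapping, resolution or grain-face bubbles (all of which the BISON treatment adds on top of plain diffusion); the parameter values are invented for illustration.

        # Booth equivalent-sphere release fraction: f = 1 - (6/pi^2) * sum_n exp(-n^2 pi^2 D t / a^2) / n^2
        import numpy as np

        def booth_release_fraction(D, a, t, nterms=200):
            # D: gas diffusion coefficient [m^2/s], a: grain radius [m], t: time [s].
            tau = D * t / a**2                       # dimensionless diffusion time
            n = np.arange(1, nterms + 1)
            return 1.0 - (6.0 / np.pi**2) * np.sum(np.exp(-n**2 * np.pi**2 * tau) / n**2)

        # Hypothetical values: D = 1e-19 m^2/s, 5 micron grain radius, one year at temperature.
        f = booth_release_fraction(D=1.0e-19, a=5.0e-6, t=3.15e7)
        print(f"released fraction ~ {f:.3f}")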

  19. Uncertainty Analysis of RELAP5-3D

    Energy Technology Data Exchange (ETDEWEB)

    Alexandra E Gertman; Dr. George L Mesina

    2012-07-01

    As world-wide energy consumption continues to increase, so does the demand for alternative energy sources such as nuclear energy. Nuclear power plants currently supply over 370 gigawatts of electricity, and more than 60 new nuclear reactors have been commissioned by 15 different countries. The primary concern for nuclear power plant operation and licensing has been safety. The safe operation of nuclear power plants is no simple matter: it involves the training of operators, the design of the reactor, and equipment and design upgrades throughout the lifetime of the reactor. To safely design, operate, and understand nuclear power plants, industry and government alike have relied upon best-estimate simulation codes, which allow an accurate model of any given plant to be created with well-defined margins of safety. The most widely used of these best-estimate simulation codes in the nuclear power industry is RELAP5-3D. Our project focused on improving the modeling capabilities of RELAP5-3D by developing uncertainty estimates for its calculations. This work involved analyzing high-, medium-, and low-ranked phenomena from an INL PIRT on a small-break loss-of-coolant accident, as well as an analysis of a large-break loss-of-coolant accident. Statistical analyses were performed using correlation coefficients. To perform the studies, computer programs were written that modify a template RELAP5 input deck to produce one deck for each combination of key input parameters. Python scripting enabled the running of the generated input files with RELAP5-3D on INL’s massively parallel cluster system. Data from the studies were collected and analyzed with SAS. A summary of the results of our studies is presented.
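
    The deck-generation step described above can be sketched in a few lines. The template markers, parameter names and values below are hypothetical placeholders, not INL's actual scripts; the pattern is simply one output deck per combination of the perturbed key inputs.

        # Generate one RELAP5 input deck per combination of perturbed key parameters.
        import itertools
        from pathlib import Path

        template = Path("template.i").read_text()        # hypothetical deck with {break_area}-style markers
        parameters = {                                   # hypothetical PIRT-ranked inputs and values
            "break_area":      [0.9e-3, 1.0e-3, 1.1e-3],
            "decay_heat_mult": [0.95, 1.00, 1.05],
            "chf_mult":        [0.8, 1.0, 1.2],
        }

        for i, values in enumerate(itertools.product(*parameters.values())):
            case = dict(zip(parameters, values))
            Path(f"case_{i:04d}.i").write_text(template.format(**case))
            # each case_*.i would then be queued to RELAP5-3D on the cluster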

  20. Decay heat uncertainty quantification of MYRRHA

    Directory of Open Access Journals (Sweden)

    Fiorito Luca

    2017-01-01

    MYRRHA is a lead-bismuth cooled, MOX-fueled accelerator driven system (ADS) currently in the design phase at SCK·CEN in Belgium. The correct evaluation of the decay heat and of its uncertainty level is very important for the safety demonstration of the reactor. In the first part of this work we assessed the decay heat released by the MYRRHA core using the ALEPH-2 burnup code. The second part of the study focused on the propagation of nuclear data uncertainties and covariances to the MYRRHA decay heat. Radioactive decay data, independent fission yield and cross-section uncertainties/covariances were propagated using two nuclear data sampling codes, namely NUDUNA and SANDY. According to the results, 238U cross sections and fission yield data are the largest contributors to the MYRRHA decay heat uncertainty. The calculated uncertainty values are deemed acceptable from the safety point of view, as they are well within the available regulatory limits.

  1. Regulatory requirements to the thermal-hydraulic and thermal-mechanical computer codes

    International Nuclear Information System (INIS)

    Vitkova, M.; Kalchev, B.; Stefanova, S.

    2006-01-01

    The paper presents an overview of the regulatory requirements for the thermal-hydraulic and thermal-mechanical computer codes which are used for safety assessment of the fuel design and the fuel utilization. Some requirements for model development, and for verification and validation of the codes and the analysis of code uncertainties, are also defined. Questions concerning quality assurance during development and implementation of the codes, as well as the preparation of a detailed verification and validation plan, are briefly discussed.

  2. Deterministic 3D transport, sensitivity and uncertainty analysis of TPR and reaction rate measurements in HCPB Breeder Blanket mock-up benchmark

    International Nuclear Information System (INIS)

    Kodeli, I.

    2006-01-01

    The Helium-Cooled Pebble Bed (HCPB) Breeder Blanket mock-up benchmark experiment was analysed using the deterministic transport, sensitivity and uncertainty code system in order to determine the Tritium Production Rate (TPR) in the ceramic breeder and the neutron reaction rates in beryllium, both nominal values and the corresponding uncertainties. The experiment, performed in 2005 to validate the HCPB concept, consists of a metallic beryllium set-up with two double layers of breeder material (Li2CO3 powder). The reaction rate measurements include the Li2CO3 pellets for the tritium breeding monitoring and activation foils, inserted at several axial and lateral locations in the block. In addition to the well established and validated procedure based on the 2-dimensional (2D) code DORT, a new approach for the 3D modelling was validated based on the TORT/GRTUNCL3D transport codes. The SUSD3D code, also in 3D geometry, was used for the cross-section sensitivity and uncertainty calculations. These studies are useful for the interpretation of the experimental measurements, in particular to assess the uncertainties linked to the basic nuclear data. The TPR, the neutron activation rates and the associated uncertainties were determined using the EFF-3.0 9Be nuclear cross section and covariance data, and compared with those from other evaluations, like FENDL-2.1. Sensitivity profiles and nuclear data uncertainties of the TPR and detector reaction rates with respect to the cross-sections of 9Be, 6Li, 7Li, O and C were determined at different positions in the experimental block. (author)
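
    The propagation step performed by codes such as SUSD3D is, at first order, the standard "sandwich" rule var(R) = S^T C S, where S holds the relative sensitivities of a response (e.g., the TPR) to group-wise cross sections and C is their relative covariance matrix. A minimal sketch with fictitious four-group numbers, not the benchmark's data:

        # First-order sandwich-rule propagation of cross-section covariances to a response.
        import numpy as np

        S = np.array([0.12, -0.05, 0.30, 0.08])      # relative sensitivities, 4 groups (fictitious)
        C = np.array([                               # relative covariance matrix (fictitious)
            [0.0025, 0.0010, 0.0,    0.0   ],
            [0.0010, 0.0030, 0.0005, 0.0   ],
            [0.0,    0.0005, 0.0040, 0.0010],
            [0.0,    0.0,    0.0010, 0.0020],
        ])
        rel_var = S @ C @ S                          # var(R)/R^2 = S^T C S
        print(f"relative std. dev. of response: {np.sqrt(rel_var):.4%}")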

  3. R-matrix analysis code (RAC)

    International Nuclear Information System (INIS)

    Chen Zhenpeng; Qi Huiquan

    1990-01-01

    A comprehensive R-matrix analysis code has been developed. It is based on multichannel, multilevel R-matrix theory and runs on a VAX computer under FORTRAN-77. With this code, many kinds of experimental data for one nuclear system can be fitted simultaneously. Comparisons between the RAC code and the LANL code EDA were made; the data show that both codes produce the same results when one set of R-matrix parameters is used. The differential cross section of 10B(n,α)7Li at En = 0.4 MeV and the polarization of 16O(n,n)16O at En = 2.56 MeV are presented.

  4. Development of realistic thermal hydraulic system analysis code

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Won Jae; Chung, B. D; Kim, K. D. [and others

    2002-05-01

    The realistic safety analysis system is essential for nuclear safety research, advanced reactor development, safety analysis in the nuclear industry and 'in-house' plant design capability development. In this project, we have developed a best-estimate multi-dimensional thermal-hydraulic system code, MARS, which is based on the integrated version of the RELAP5 and COBRA-TF codes. To improve the realistic analysis capability, we have improved the models for multi-dimensional two-phase flow phenomena and for advanced two-phase flow modeling. In addition, a GUI (Graphical User Interface) was developed to enhance user convenience. To develop the coupled analysis capability, the MARS code was linked with the three-dimensional reactor kinetics code (MASTER), the core thermal analysis code (COBRA-III/CP), and the best-estimate containment analysis code (CONTEMPT), resulting in MARS/MASTER/COBRA/CONTEMPT. Currently, the MARS code system has been distributed to 18 domestic organizations, including research, industrial and regulatory organizations and universities. MARS is being widely used for safety research on existing PWRs, advanced PWRs, CANDU and research reactors, for the pre-test analysis of TH experiments, and for other applications.

  5. Development of realistic thermal hydraulic system analysis code

    International Nuclear Information System (INIS)

    Lee, Won Jae; Chung, B. D; Kim, K. D.

    2002-05-01

    The realistic safety analysis system is essential for nuclear safety research, advanced reactor development, safety analysis in the nuclear industry and 'in-house' plant design capability development. In this project, we have developed a best-estimate multi-dimensional thermal-hydraulic system code, MARS, which is based on the integrated version of the RELAP5 and COBRA-TF codes. To improve the realistic analysis capability, we have improved the models for multi-dimensional two-phase flow phenomena and for advanced two-phase flow modeling. In addition, a GUI (Graphical User Interface) was developed to enhance user convenience. To develop the coupled analysis capability, the MARS code was linked with the three-dimensional reactor kinetics code (MASTER), the core thermal analysis code (COBRA-III/CP), and the best-estimate containment analysis code (CONTEMPT), resulting in MARS/MASTER/COBRA/CONTEMPT. Currently, the MARS code system has been distributed to 18 domestic organizations, including research, industrial and regulatory organizations and universities. MARS is being widely used for safety research on existing PWRs, advanced PWRs, CANDU and research reactors, for the pre-test analysis of TH experiments, and for other applications.

  6. Uncertainty and sensitivity analysis in the neutronic parameters generation for BWR and PWR coupled thermal-hydraulic–neutronic simulations

    International Nuclear Information System (INIS)

    Ánchel, F.; Barrachina, T.; Miró, R.; Verdú, G.; Juanas, J.; Macián-Juan, R.

    2012-01-01

    Highlights: ► Best-estimate codes are affected by uncertainty in their methods and models. ► The influence of uncertainty in the macroscopic cross sections is analyzed for BWR and PWR RIA accidents. ► The fast diffusion coefficient, the scattering cross section and both fission cross sections are the most influential factors. ► The absorption cross sections have very little influence. ► With a normal pdf the results are more "conservative", in terms of the power peak reached, than with a uniform pdf. - Abstract: Best-estimate analysis consists of a coupled thermal-hydraulic and neutronic description of the nuclear system's behavior; uncertainties from both aspects should be included and jointly propagated. This paper presents a study of the influence of the uncertainty in the macroscopic neutronic information that describes a three-dimensional core model on the most relevant results of the simulation of a Reactivity Induced Accident (RIA). Analyses of a BWR RIA and a PWR RIA have been carried out with three-dimensional thermal-hydraulic and neutronic models for the coupled systems TRACE-PARCS and RELAP-PARCS. The cross-section information has been generated by the SIMTAB methodology, based on the joint use of CASMO-SIMULATE. The statistically based methodology performs Monte Carlo-type sampling of the uncertainty in the macroscopic cross sections. The size of the sample is determined by the characteristics of the tolerance intervals, applying the Noether-Wilks formulas. A number of simulations equal to the sample size have been carried out, in which the cross sections used by PARCS are directly modified with uncertainty, and non-parametric statistical methods are applied to the resulting sample of output-variable values to determine their tolerance intervals.
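
    The sample-size step can be reproduced with the standard non-parametric tolerance-limit argument behind the Wilks-type formulas: for a one-sided limit taken as the sample maximum, N runs bound the beta-quantile of an output with confidence gamma when 1 - beta**N >= gamma. A minimal first-order sketch (higher orders and two-sided intervals follow the same logic):

        # Smallest sample size for a one-sided, first-order non-parametric tolerance limit.
        import math

        def wilks_sample_size(beta=0.95, gamma=0.95):
            # Smallest N with P(max of N runs >= beta-quantile) >= gamma.
            return math.ceil(math.log(1.0 - gamma) / math.log(beta))

        print(wilks_sample_size())    # 59, the familiar 95%/95% first-order size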

  7. Planning for robust reserve networks using uncertainty analysis

    Science.gov (United States)

    Moilanen, A.; Runge, M.C.; Elith, Jane; Tyre, A.; Carmel, Y.; Fegraus, E.; Wintle, B.A.; Burgman, M.; Ben-Haim, Y.

    2006-01-01

    Planning land use for biodiversity conservation frequently involves computer-assisted reserve selection algorithms. Typically such algorithms operate on matrices of species presence/absence in sites, or on species-specific distributions of model-predicted probabilities of occurrence in grid cells. There are practically always errors in input data: erroneous species presence/absence data, structural and parametric uncertainty in predictive habitat models, and lack of correspondence between temporal presence and long-run persistence. Despite these uncertainties, typical reserve selection methods proceed as if there were no uncertainty in the data or models. Given two conservation options of apparently equal biological value, one would prefer the option whose value is relatively insensitive to errors in planning inputs. In this work we show how uncertainty analysis for reserve planning can be implemented within the framework of information-gap decision theory, generating reserve designs that are robust to uncertainty. Consideration of uncertainty involves modifications to the typical objective functions used in reserve selection. The search for robust-optimal reserve structures can still be implemented via typical reserve selection optimization techniques, including stepwise heuristics, integer programming and stochastic global search.

  8. Uncertainty Estimation of Neutron Activation Analysis in Zinc Elemental Determination in Food Samples

    International Nuclear Information System (INIS)

    Endah Damastuti; Muhayatun; Diah Dwiana L

    2009-01-01

    Besides fulfilling the requirements of the international standard ISO/IEC 17025:2005, uncertainty estimation should be performed to increase the quality of and confidence in analysis results, and to establish the traceability of those results to SI units. Neutron activation analysis is a major technique used by the radiometry analysis laboratory and is included in its scope of accreditation under ISO/IEC 17025:2005; therefore, an uncertainty estimation for neutron activation analysis needs to be carried out. Sample and standard preparation, as well as irradiation and measurement using gamma spectrometry, were the main activities contributing to the uncertainty. The components of the uncertainty sources are explained in detail. The expanded uncertainty was 4.0 mg/kg at a 95% level of confidence (coverage factor = 2) for a Zn concentration of 25.1 mg/kg. The counting statistics of the sample and the standard were the major contributors to the combined uncertainty. The uncertainty estimation is expected to increase the quality of the analysis results and can be applied further to other kinds of samples. (author)
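
    Such an uncertainty budget is typically combined by adding the relative standard uncertainties in quadrature and applying the coverage factor k = 2 for approximately 95% confidence. The component values in the sketch below are invented, chosen only so that the final line mirrors the reported result; they are not the laboratory's actual budget.

        # Combined and expanded uncertainty from a (hypothetical) NAA uncertainty budget.
        import math

        zn = 25.1                                    # measured Zn concentration, mg/kg
        components = {                               # relative standard uncertainties (hypothetical)
            "counting statistics, sample":   0.058,
            "counting statistics, standard": 0.049,
            "standard preparation":          0.020,
            "sample mass":                   0.010,
            "gamma-peak efficiency":         0.015,
        }
        u_rel = math.sqrt(sum(u * u for u in components.values()))   # root-sum-square combination
        U = 2.0 * u_rel * zn                                         # expanded uncertainty, k = 2
        print(f"Zn = {zn:.1f} +/- {U:.1f} mg/kg (k = 2)")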

  9. Erha Uncertainty Analysis: Planning for the future

    International Nuclear Information System (INIS)

    Brami, T.R.; Hopkins, D.F.; Loguer, W.L.; Cornagia, D.M.; Braisted, A.W.C.

    2002-01-01

    The Erha field (OPL 209) was discovered in 1999 approximately 100 km off the coast of Nigeria in 1,100 m of water. The discovery well (Erha-1) encountered oil and gas in deep-water clastic reservoirs. The first appraisal well (Erha-2), drilled 1.6 km downdip to the northwest, penetrated an oil-water contact and confirmed a potentially commercial discovery. However, the Erha-3 and Erha-3 ST-1 boreholes, drilled on the faulted east side of the field in 2001, encountered shallower fluid contacts. As a result of these findings, a comprehensive field-wide uncertainty analysis was performed to better understand what we know versus what we think regarding resource size and economic viability. The uncertainty analysis process applied at Erha is an integrated, scenario-based probabilistic approach to modeling resources and reserves. Its goal is to provide quantitative results for a variety of scenarios, thus allowing identification of, and focus on, the critical controls (the variables likely to impose the greatest influence). The initial focus at Erha was to incorporate the observed fluid contacts and to develop potential scenarios that cover the range of possibilities in unpenetrated portions of the field. Four potential compartmentalization scenarios were hypothesized. The uncertainty model combines these scenarios with reservoir parameters and their plausible ranges. Input data come from multiple sources, including wells, 3D seismic, reservoir flow simulation, geochemistry, fault-seal analysis, sequence stratigraphic analysis, and analogs. Once created, the model is sampled using Monte Carlo techniques to create probability density functions for a variety of variables, including oil in place and recoverable reserves. Results of the uncertainty analysis support the conclusion that, despite a thinner oil column on the faulted east side of the field, Erha is an economically attractive opportunity. Further, the results have been used to develop data acquisition plans and mitigation strategies.
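
    The scenario-based Monte Carlo step can be sketched as follows. The scenario weights, parameter distributions and volumetric relation below are all invented placeholders, not Erha data: each trial first draws a compartmentalization scenario, then samples the reservoir parameters, yielding a probability distribution for oil in place.

        # Toy scenario-weighted Monte Carlo model of oil in place.
        import numpy as np

        rng = np.random.default_rng(1)
        n = 100_000
        scenario_weight = np.array([0.4, 0.3, 0.2, 0.1])       # four hypothetical scenarios
        column_factor = np.array([1.00, 0.85, 0.70, 0.55])     # relative oil-column volume per scenario

        s = rng.choice(4, size=n, p=scenario_weight)
        grv = rng.lognormal(mean=np.log(800e6), sigma=0.25, size=n)   # gross rock volume, m^3
        ntg = rng.uniform(0.5, 0.8, size=n)                           # net-to-gross
        phi = rng.normal(0.24, 0.03, size=n)                          # porosity
        so  = rng.uniform(0.65, 0.85, size=n)                         # oil saturation
        oip = grv * column_factor[s] * ntg * phi * so                 # oil in place, m^3

        p90, p50, p10 = np.percentile(oip, [10, 50, 90])
        print(f"OIP P90/P50/P10: {p90/1e6:.0f} / {p50/1e6:.0f} / {p10/1e6:.0f} million m^3")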

  10. Uncertainty analyses of infiltration and subsurface flow and transport for SDMP sites

    International Nuclear Information System (INIS)

    Meyer, P.D.; Rockhold, M.L.; Gee, G.W.

    1997-09-01

    US Nuclear Regulatory Commission staff have identified a number of sites requiring special attention in the decommissioning process because of elevated levels of radioactive contaminants. Traits common to many of these sites include limited data characterizing the subsurface, the presence of long-lived radionuclides necessitating a long-term analysis (1,000 years or more), and potential exposure through multiple pathways. As a consequence of these traits, the uncertainty in predicted exposures can be significant. In addition, simplifications to the physical system and the transport mechanisms are often necessary to reduce the computational requirements of the analysis. Several multiple-pathway transport codes exist for estimating dose, two of which were used in this study. These two codes have built-in Monte Carlo simulation capabilities that were used for the uncertainty analysis. Several tools for improving uncertainty analyses of exposure estimates through the groundwater pathway have been developed and are discussed in this report. Generic probability distributions for unsaturated- and saturated-zone soil hydraulic parameters are presented. A method is presented to combine the generic distributions with site-specific water retention data using a Bayesian analysis. The resulting updated soil hydraulic parameter distributions can be used to obtain an updated estimate of the probability distribution of dose. The method is illustrated using a hypothetical decommissioning site.
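
    The Bayesian combination of a generic parameter distribution with site data can be sketched with a simple grid approximation. The prior, the measurements and their error below are invented for illustration, not the report's values; the updated distribution would then feed the Monte Carlo dose calculation.

        # Grid-based Bayesian update of a generic prior on log10 saturated hydraulic conductivity.
        import numpy as np

        log_k = np.linspace(-9.0, -3.0, 601)                 # grid over log10 Ks [m/s]
        prior = np.exp(-0.5 * ((log_k + 6.0) / 1.0) ** 2)    # generic prior: log10 Ks ~ N(-6, 1)

        site_data = np.array([-5.2, -5.6, -5.4])             # hypothetical site measurements, log10 m/s
        sigma_meas = 0.3                                     # assumed measurement error
        likelihood = np.ones_like(log_k)
        for d in site_data:
            likelihood *= np.exp(-0.5 * ((d - log_k) / sigma_meas) ** 2)

        posterior = prior * likelihood
        posterior /= np.trapz(posterior, log_k)              # normalise on the grid
        mean = np.trapz(log_k * posterior, log_k)
        sd = np.sqrt(np.trapz((log_k - mean) ** 2 * posterior, log_k))
        print(f"posterior log10 Ks: mean {mean:.2f}, sd {sd:.2f} (prior was -6.00, sd 1.00)")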

  11. Parallel processing of structural integrity analysis codes

    International Nuclear Information System (INIS)

    Swami Prasad, P.; Dutta, B.K.; Kushwaha, H.S.

    1996-01-01

    Structural integrity analysis plays an important role in assessing and demonstrating the safety of nuclear reactor components. The analysis is performed using analytical tools such as the Finite Element Method (FEM) with the help of digital computers. The complexity of the problems involved in nuclear engineering demands high-speed computation facilities to obtain solutions in a reasonable amount of time. Parallel processing systems such as ANUPAM provide an efficient platform for realising such high-speed computation. The development and implementation of software on parallel processing systems is an interesting and challenging task, and the data and algorithm structure of the codes plays an important role in exploiting the capabilities of a parallel processing system. Structural analysis codes based on FEM can be divided into two categories with respect to their implementation on parallel processing systems. Codes in the first category, such as those used for harmonic analysis and mechanistic fuel performance codes, do not require parallelisation of individual modules. Codes in the second category, such as conventional FEM codes, require parallelisation of individual modules; here, parallelisation of the equation solution module poses the major difficulties. Different solution schemes, such as the domain decomposition method (DDM), a parallel active column solver and substructuring, are currently used on parallel processing systems. Two codes, FAIR and TABS, belonging to each of these categories, have been implemented on ANUPAM. The implementation details of these codes and the performance of the different equation solvers are highlighted. (author). 5 refs., 12 figs., 1 tab

  12. Fukushima Daiichi Unit 1 Accident Progression Uncertainty Analysis and Implications for Decommissioning of Fukushima Reactors - Volume I.

    Energy Technology Data Exchange (ETDEWEB)

    Gauntt, Randall O. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Mattie, Patrick D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-01-01

    Sandia National Laboratories (SNL) has conducted an uncertainty analysis (UA) of the Fukushima Daiichi unit 1 (1F1) accident progression with the MELCOR code. The model used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and the Nuclear Regulatory Commission (NRC). That study focused on reconstructing the accident progression, as postulated from the limited plant data. This work focused on evaluating the uncertainty in core damage progression behavior and its effect on key figures of merit (e.g., hydrogen production, reactor damage state, fraction of intact fuel, vessel lower head failure). The primary intent of this study was to characterize the range of predicted damage states in the 1F1 reactor, considering state-of-knowledge uncertainties associated with MELCOR modeling of core damage progression, and to generate information that may be useful in informing the decommissioning activities that will be employed to defuel the damaged reactors at the Fukushima Daiichi Nuclear Power Plant. Additionally, the core damage progression variability inherent in MELCOR modeling numerics is investigated.

  13. OPR1000 RCP Flow Coastdown Analysis using SPACE Code

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Dong-Hyuk; Kim, Seyun [KHNP CRI, Daejeon (Korea, Republic of)

    2016-10-15

    The Korean nuclear industry has developed a thermal-hydraulic analysis code for the safety analysis of PWRs, named SPACE (Safety and Performance Analysis Code for Nuclear Power Plants). The current loss-of-flow transient analysis of OPR1000 uses the COAST code to calculate the transient RCS (Reactor Coolant System) flow. The COAST code calculates the RCS loop flow using pump performance curves and the RCP (Reactor Coolant Pump) inertia. In this paper, the SPACE code is used to reproduce the RCS flow rates calculated by the COAST code. A loss-of-flow transient is a transient initiated by a reduction of forced reactor coolant circulation; typical examples are complete loss of flow (CLOF) and locked rotor (LR). The OPR1000 RCP flow coastdown analysis was performed with SPACE using a simplified nodalization, and a complete loss of flow (trip of all four RCPs) was analyzed. The results show good agreement with those from the COAST code, the CE code for calculating RCS flow during loss-of-flow transients. Through this study, we confirmed that the SPACE code can be used instead of the COAST code for RCP flow coastdown analysis.
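
    The physics both codes integrate can be summarized in one line: with hydraulic torque scaling as the square of pump speed, rotor inertia gives d(omega)/dt = -(T0/I)*(omega/omega0)^2, whose solution is the hyperbolic decay omega(t) = omega0/(1 + t/tc) with tc = I*omega0/T0. A sketch with fictitious pump data (at constant relative flow, loop flow tracks pump speed):

        # Hyperbolic pump coastdown from rotor inertia and quadratic hydraulic torque.
        import numpy as np

        I, omega0, T0 = 3500.0, 125.0, 9.0e4     # inertia [kg m^2], speed [rad/s], rated torque [N m]
        tc = I * omega0 / T0                     # coastdown time constant [s]
        t = np.linspace(0.0, 60.0, 7)
        omega = omega0 / (1.0 + t / tc)          # analytic solution of the inertia equation
        for ti, wi in zip(t, omega):
            print(f"t = {ti:4.0f} s   relative flow ~ {wi / omega0:.3f}")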

  14. Use of computer codes for system reliability analysis

    International Nuclear Information System (INIS)

    Sabek, M.; Gaafar, M.; Poucet, A.

    1988-01-01

    This paper gives a collective summary of the studies performed at the JRC, Ispra, on the use of computer codes for complex system analysis. The computer codes dealt with are: the CAFTS-SALP software package, FRANTIC, FTAP, the RALLY computer code package, and the BOUNDS codes. Two reference study cases were executed with each code, and the results obtained, in terms of the logic/probabilistic analysis as well as computation time, are compared.

  15. Uncertainty Quantification of Turbulence Model Closure Coefficients for Transonic Wall-Bounded Flows

    Science.gov (United States)

    Schaefer, John; West, Thomas; Hosder, Serhat; Rumsey, Christopher; Carlson, Jan-Renee; Kleb, William

    2015-01-01

    The goal of this work was to quantify the uncertainty and sensitivity of commonly used turbulence models in Reynolds-Averaged Navier-Stokes codes due to uncertainty in the values of closure coefficients for transonic, wall-bounded flows, and to rank the contribution of each coefficient to the uncertainty in various output flow quantities of interest. Specifically, uncertainty quantification of the turbulence model closure coefficients was performed for transonic flow over an axisymmetric bump at zero degrees angle of attack and over the RAE 2822 transonic airfoil at a lift coefficient of 0.744. Three turbulence models were considered: the Spalart-Allmaras model, the Wilcox (2006) k-ω model, and the Menter Shear-Stress Transport model. The FUN3D code developed by NASA Langley Research Center was used as the flow solver. The uncertainty quantification analysis employed stochastic expansions based on non-intrusive polynomial chaos as an efficient means of uncertainty propagation. Several integrated and point quantities were considered as uncertain outputs for both CFD problems. All closure coefficients were treated as epistemic uncertain variables represented with intervals. Sobol indices were used to rank the relative contributions of each closure coefficient to the total uncertainty in the output quantities of interest. This study identified, for each turbulence model, a number of closure coefficients for which more information will significantly reduce the amount of uncertainty in the output for transonic, wall-bounded flows.
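
    The Sobol ranking can be illustrated with Saltelli's pick-freeze Monte Carlo estimator on a cheap stand-in for the CFD solver (the study itself evaluated the indices from polynomial chaos surrogates, which is far more efficient); the test function and coefficient ranges below are hypothetical.

        # First-order Sobol indices via Saltelli's pick-freeze estimator (toy model).
        import numpy as np

        rng = np.random.default_rng(2)
        d, n = 3, 200_000

        def model(x):                            # cheap stand-in for the expensive CFD output
            return np.sin(x[:, 0]) + 7.0 * np.sin(x[:, 1]) ** 2 + 0.1 * x[:, 2] ** 4 * np.sin(x[:, 0])

        A = rng.uniform(-np.pi, np.pi, (n, d))   # intervals standing in for coefficient ranges
        B = rng.uniform(-np.pi, np.pi, (n, d))
        fA, fB = model(A), model(B)
        var = np.var(np.concatenate([fA, fB]))

        for i in range(d):
            ABi = A.copy()
            ABi[:, i] = B[:, i]                  # "freeze" coordinate i, resample the others
            Si = np.mean(fB * (model(ABi) - fA)) / var
            print(f"coefficient {i}: first-order Sobol index ~ {Si:.3f}")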

  16. New strategies for quantifying and propagating nuclear data uncertainty in CUSA

    International Nuclear Information System (INIS)

    Zhao, Qiang; Zhang, Chunyan; Hao, Chen; Li, Fu; Wang, Dongyong; Yu, Yan

    2016-01-01

    Highlights: • An efficient sampling method based on LHS combined with a Cholesky decomposition conversion is proposed. • A code for generating multi-group covariance matrices has been developed. • The uncertainty and sensitivity results of CUSA agree well with TSUNAMI-1D. - Abstract: The uncertainties of nuclear cross sections are propagated to the key parameters of a nuclear reactor core through the transport calculation. The statistical sampling method can be used to quantify and propagate nuclear data uncertainty in nuclear reactor physics calculations. In order to use the statistical sampling method, two key technical problems, the generation of multi-group covariance matrices and the sampling method itself, should be treated reasonably and efficiently. In this paper, a method of transforming nuclear cross-section covariance matrices in multi-group form into users' group structures, based on the flat-flux approximation, has been studied in depth. Most notably, an efficient sampling method has been proposed, based on Latin Hypercube Sampling (LHS) combined with a Cholesky decomposition conversion. Based on those methods, two modules named T-COCCO and GUIDE have been developed and successfully added to the code for uncertainty and sensitivity analysis (CUSA). The new modules have been verified individually. Numerical results for the TMI-1 pin-cell case are presented and compared to TSUNAMI-1D. The comparison of the results further supports that the methods and the computational tool developed in this work can be used to conduct sensitivity and uncertainty analysis for nuclear cross sections.
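
    The sampling strategy in the abstract condenses to a few lines: Latin Hypercube Sampling provides stratified standard-normal scores, and a Cholesky factor of the covariance matrix imposes the correlations before the perturbed data go to the transport code. The covariance values below are fictitious, and this is a generic sketch rather than the CUSA implementation.

        # LHS with Cholesky decomposition conversion for correlated nuclear-data sampling.
        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(3)
        n, d = 100, 3                                   # samples, number of nuclear-data parameters

        # LHS in [0,1]^d: one stratum per sample per dimension, randomly permuted.
        u = (np.argsort(rng.random((n, d)), axis=0) + rng.random((n, d))) / n
        z = norm.ppf(u)                                 # stratified standard-normal scores

        cov = np.array([[4.0, 1.2, 0.6],                # fictitious relative covariance (%^2)
                        [1.2, 3.0, 0.9],
                        [0.6, 0.9, 2.5]])
        L = np.linalg.cholesky(cov)
        perturbations = z @ L.T                         # correlated relative perturbations, %
        print("sample correlation:\n", np.corrcoef(perturbations.T).round(2))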

  17. New strategies for quantifying and propagating nuclear data uncertainty in CUSA

    Energy Technology Data Exchange (ETDEWEB)

    Zhao, Qiang; Zhang, Chunyan [Fundamental Science on Nuclear Safety and Simulation Technology Laboratory, College of Nuclear Science and Technology, Harbin Engineering University, Harbin (China); Hao, Chen, E-mail: haochen.heu@163.com [Fundamental Science on Nuclear Safety and Simulation Technology Laboratory, College of Nuclear Science and Technology, Harbin Engineering University, Harbin (China); Li, Fu [Institute of Nuclear and New Energy Technology(INET), Collaborative Innovation Center of Advanced Nuclear Energy Technology, Key Laboratory of Advanced Reactor Engineering and Safety of Ministry of Education, Tsinghua University, Beijing (China); Wang, Dongyong [Fundamental Science on Nuclear Safety and Simulation Technology Laboratory, College of Nuclear Science and Technology, Harbin Engineering University, Harbin (China); School of Nuclear Science and Technology, Xi’an Jiaotong University, Xi’an (China); Yu, Yan [Fundamental Science on Nuclear Safety and Simulation Technology Laboratory, College of Nuclear Science and Technology, Harbin Engineering University, Harbin (China)

    2016-10-15

    Highlights: • An efficient sampling method based on LHS combined with a Cholesky decomposition conversion is proposed. • A code for generating multi-group covariance matrices has been developed. • The uncertainty and sensitivity results of CUSA agree well with TSUNAMI-1D. - Abstract: The uncertainties of nuclear cross sections are propagated to the key parameters of a nuclear reactor core through the transport calculation. The statistical sampling method can be used to quantify and propagate nuclear data uncertainty in nuclear reactor physics calculations. In order to use the statistical sampling method, two key technical problems, the generation of multi-group covariance matrices and the sampling method itself, should be treated reasonably and efficiently. In this paper, a method of transforming nuclear cross-section covariance matrices in multi-group form into users' group structures, based on the flat-flux approximation, has been studied in depth. Most notably, an efficient sampling method has been proposed, based on Latin Hypercube Sampling (LHS) combined with a Cholesky decomposition conversion. Based on those methods, two modules named T-COCCO and GUIDE have been developed and successfully added to the code for uncertainty and sensitivity analysis (CUSA). The new modules have been verified individually. Numerical results for the TMI-1 pin-cell case are presented and compared to TSUNAMI-1D. The comparison of the results further supports that the methods and the computational tool developed in this work can be used to conduct sensitivity and uncertainty analysis for nuclear cross sections.

  18. Aspects of uncertainty analysis in accident consequence modeling

    International Nuclear Information System (INIS)

    Travis, C.C.; Hoffman, F.O.

    1981-01-01

    Mathematical models are frequently used to determine the probable dose to man from an accidental release of radionuclides by a nuclear facility. With increased emphasis on the accuracy of these models, the incorporation of uncertainty analysis has become one of the most crucial and sensitive components in evaluating the significance of model predictions. In the present paper, we address three aspects of uncertainty in models used to assess the radiological impact to humans: uncertainties resulting from the natural variability in human biological parameters; the propagation of parameter variability by mathematical models; and the comparison of model predictions to observational data.

  19. Determination of uncertainties in energy and exergy analysis of a power plant

    International Nuclear Information System (INIS)

    Ege, Ahmet; Şahin, Hacı Mehmet

    2014-01-01

    Highlights: • Energy and exergy efficiency uncertainties in a large thermal power plant are examined. • Sensitivity analysis shows the importance of basic measurements for the efficiency analysis. • A quick and practical approach is provided for determining efficiency uncertainties. • An extreme-case analysis characterizes the maximum possible boundaries of the uncertainties. • Uncertainty determination in a plant is a dynamic process with real data. - Abstract: In this study, the energy and exergy efficiency uncertainties of a large-scale lignite-fired power plant cycle, and the sensitivities of various measurement parameters, were investigated for five different design power outputs (100%, 85%, 80%, 60% and 40%) using real data from the plant. For that purpose a black-box method was employed, considering the coal flow with its Lower Heating Value (LHV) as the single input and the electricity produced as the single output of the plant. The uncertainty of the energy and exergy efficiency of the plant was evaluated with this method by applying a sensitivity analysis of the effect of measurement parameters such as the LHV, the coal mass flow rate, and the generator output voltage/current. In addition, an extreme-case analysis was performed to determine the maximum range of the uncertainties. Results of the black-box method showed that the uncertainties varied between 1.82-1.98% for the energy efficiency and 1.32-1.43% for the exergy efficiency of the plant at operating power levels of 40-100% of full power. It was concluded that the LHV determination was the most important uncertainty source for the energy and exergy efficiency of the plant. The uncertainties of the extreme-case analysis were determined to be between 2.30% and 2.36% for the energy efficiency, and between 1.66% and 1.70% for the exergy efficiency, at 40-100% power output. The proposed method was shown to be a practical approach for understanding the major uncertainties, as well as the effects of some measurement parameters, in a large-scale thermal power plant.
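
    For the black-box model the propagation reduces to a product/quotient rule: with eta = P / (mdot * LHV), the relative standard uncertainties add in quadrature, u_eta/eta = sqrt((u_P/P)^2 + (u_mdot/mdot)^2 + (u_LHV/LHV)^2). The component values below are invented placeholders, not the plant's data.

        # First-order propagation of measurement uncertainties to plant energy efficiency.
        import math

        rel_u = {                      # relative standard uncertainties (hypothetical)
            "electrical power":  0.004,
            "coal mass flow":    0.010,
            "LHV determination": 0.015,   # dominant term, as the study concludes
        }
        u_eta_rel = math.sqrt(sum(u * u for u in rel_u.values()))
        eta = 0.38                     # nominal plant energy efficiency (illustrative)
        print(f"eta = {eta:.3f} +/- {u_eta_rel * eta:.4f} ({u_eta_rel:.2%} relative)")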

  20. Sensitivity and uncertainty studies of the CRAC2 code for selected meteorological models and parameters

    International Nuclear Information System (INIS)

    Ward, R.C.; Kocher, D.C.; Hicks, B.B.; Hosker, R.P. Jr.; Ku, J.Y.; Rao, K.S.

    1985-01-01

    We have studied the sensitivity of results from the CRAC2 computer code, which predicts health impacts from a reactor-accident scenario, to uncertainties in selected meteorological models and parameters. The sources of uncertainty examined include the models for plume rise and wet deposition and the meteorological bin-sampling procedure. An alternative plume-rise model usually had little effect on predicted health impacts. In an alternative wet-deposition model, the scavenging rate depends only on storm type, rather than on rainfall rate and atmospheric stability class as in the CRAC2 model. Use of the alternative wet-deposition model in meteorological bin-sampling runs decreased predicted mean early injuries by as much as a factor of 2-3 and, for large release heights and sensible heat rates, decreased mean early fatalities by nearly an order of magnitude. The bin-sampling procedure in CRAC2 was expanded by dividing each rain bin into four bins that depend on rainfall rate. Use of the modified bin structure in conjunction with the CRAC2 wet-deposition model changed all predicted health impacts by less than a factor of 2. 9 references

  1. Contribution of the mathematical modelling of knowledge to the evaluation of uncertainty margins of a LBLOCA transient (LOFT-L2-5)

    International Nuclear Information System (INIS)

    Baccou, J.; Chojnacki, E.

    2007-01-01

    This work is devoted to recent developments in the uncertainty analysis of computer code responses used for accident management procedures in the nuclear industry. The classical probabilistic approach to evaluating uncertainties is recalled. In this case, the statistical treatment of the code responses is based on the use of order statistics, which provides direct estimates of the statistical measures relevant for safety studies. However, lack of knowledge about the uncertainty sources can deteriorate decision-making. To respect the real state of knowledge, a second model, based on the Dempster-Shafer theory, is introduced. It allows the probabilistic approach to be combined with possibility theory, which is more appropriate when little information is available. An application of both methodologies to the uncertainty analysis of a LBLOCA transient (LOFT-L2-5) is given.

  2. Uncertainty Analysis of OC5-DeepCwind Floating Semisubmersible Offshore Wind Test Campaign: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Robertson, Amy N [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-07-26

    This paper examines how to assess the uncertainty levels for test measurements of the Offshore Code Comparison, Continued, with Correlation (OC5)-DeepCwind floating offshore wind system studied within the OC5 project. The goal of the OC5 project was to validate the accuracy of ultimate and fatigue load estimates from a numerical model of the floating semisubmersible using data measured during scaled tank testing of the system under wind and wave loading. The examination of uncertainty was done after the test, and it was found that the limited amount of data available did not allow for an acceptable uncertainty assessment. Therefore, this paper instead qualitatively examines the sources of uncertainty associated with this test, to start a discussion of how to assess uncertainty for these types of experiments and to summarize what should be done during future testing to acquire the information needed for a proper uncertainty assessment. Foremost, future validation campaigns should initiate numerical modeling before testing to guide the test campaign, which should include a rigorous assessment of uncertainty, and should perform validation during testing to ensure that the tests address all of the validation needs.

  3. Error-correction coding and decoding bounds, codes, decoders, analysis and applications

    CERN Document Server

    Tomlinson, Martin; Ambroze, Marcel A; Ahmed, Mohammed; Jibril, Mubarak

    2017-01-01

    This book discusses both the theory and practical applications of self-correcting data, commonly known as error-correcting codes. The applications included demonstrate the importance of these codes in a wide range of everyday technologies, from smartphones to secure communications and transactions. Written in a readily understandable style, the book presents the authors’ twenty-five years of research organized into five parts: Part I is concerned with the theoretical performance attainable by using error correcting codes to achieve communications efficiency in digital communications systems. Part II explores the construction of error-correcting codes and explains the different families of codes and how they are designed. Techniques are described for producing the very best codes. Part III addresses the analysis of low-density parity-check (LDPC) codes, primarily to calculate their stopping sets and low-weight codeword spectrum which determines the performance of these codes. Part IV deals with decoders desi...

  4. Sensitivity and uncertainty analysis of reactivities for UO2 and MOX fueled PWR cells

    Energy Technology Data Exchange (ETDEWEB)

    Foad, Basma [Research Institute of Nuclear Engineering, University of Fukui, Kanawa-cho 1-2-4, Tsuruga-shi, Fukui-ken, 914-0055 (Japan); Egypt Nuclear and Radiological Regulatory Authority, 3 Ahmad El Zomar St., Nasr City, Cairo, 11787 (Egypt); Takeda, Toshikazu [Research Institute of Nuclear Engineering, University of Fukui, Kanawa-cho 1-2-4, Tsuruga-shi, Fukui-ken, 914-0055 (Japan)

    2015-12-31

    The purpose of this paper is to apply our improved method for calculating sensitivities and uncertainties of reactivity responses to UO2- and MOX-fueled pressurized water reactor cells. The improved method has been used to calculate sensitivity coefficients relative to infinite-dilution cross sections, with the self-shielding effect taken into account. Two types of reactivity are considered: Doppler reactivity and coolant void reactivity; for each type, the sensitivities are calculated for small and large perturbations. The results demonstrate that reactivity responses have larger relative uncertainties than eigenvalue responses. In addition, the uncertainty of the coolant void reactivity is much greater than that of the Doppler reactivity, especially for large perturbations. The sensitivity coefficients and uncertainties of both reactivities were verified by comparison with SCALE code results using the ENDF/B-VII library, and good agreement was found.

  5. Manometer Behavior Analysis using CATHENA, RELAP and GOTHIC Codes

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yang Hoon; Han, Kee Soo; Moon, Bok Ja; Jang, Misuk [Nuclear Engineering Service and Solution Co. Ltd., Daejeon (Korea, Republic of)

    2014-05-15

    In this paper, simple thermal-hydraulic behavior is analyzed using three codes to show the possibility of using alternative codes. We established three models of a simple U-tube manometer using the three different codes CATHENA (Canadian Algorithm for Thermal hydraulic Network Analysis), RELAP (Reactor Excursion and Leak Analysis Program) and GOTHIC (Generation of Thermal Hydraulic Information for Containments). CATHENA and RELAP are widely used for the analysis of the system behavior of CANDU and PWR plants, and the GOTHIC code has been widely used for the analysis of thermal-hydraulic behavior in the containment system. The internal behavior of the U-tube manometer was analyzed with all three codes. The general transient behavior is similar among the three codes; however, the behavior simulated using GOTHIC shows a somewhat different trend from the results of the other two codes at the end of the transient. This likely results from the use of a different physical model in GOTHIC, which is specialized for the multi-phase thermal-hydraulic analysis of containment systems, unlike the other two codes.
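
    The textbook model behind this comparison is a liquid column of total length L oscillating about equilibrium, x'' + 2*zeta*omega*x' + omega^2*x = 0 with omega = sqrt(2g/L) and x the column level offset. A small Runge-Kutta integration with hypothetical dimensions shows the damped oscillation all three codes should reproduce.

        # Damped U-tube manometer oscillation, integrated with classic fourth-order Runge-Kutta.
        import numpy as np

        g, L, zeta = 9.81, 1.5, 0.05            # gravity, total column length [m], damping ratio
        omega = np.sqrt(2.0 * g / L)            # natural frequency of the liquid column

        def rhs(state):
            x, v = state
            return np.array([v, -2.0 * zeta * omega * v - omega**2 * x])

        state = np.array([0.10, 0.0])           # 10 cm initial level offset, at rest
        dt, t_end = 0.005, 6.0
        for step in range(int(t_end / dt)):
            k1 = rhs(state)
            k2 = rhs(state + 0.5 * dt * k1)
            k3 = rhs(state + 0.5 * dt * k2)
            k4 = rhs(state + dt * k3)
            state += dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
            if step % 200 == 199:
                print(f"t = {(step + 1) * dt:4.1f} s, level offset = {state[0]:+.4f} m")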

  6. Validation of coupled neutronic / thermal-hydraulic codes for VVER reactors. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Mittag, S.; Grundmann, U.; Kliem, S.; Kozmenkov, Y.; Rindelhardt, U.; Rohde, U.; Weiss, F.-P.; Langenbuch, S.; Krzykacz-Hausmann, B.; Schmidt, K.-D.; Vanttola, T.; Haemaelaeinen, A.; Kaloinen, E.; Kereszturi, A.; Hegyi, G.; Panka, I.; Hadek, J.; Strmensky, C.; Darilek, P.; Petkov, P.; Stefanova, S.; Kuchin, A.; Khalimonchuk, V.; Hlbocky, P.; Sico, D.; Danilin, S.; Ionov, V.; Nikonov, S.; Powney, D.

    2004-08-01

    In recent years, the simulation methods for the safety analysis of nuclear power plants have been continuously improved to perform realistic calculations. Therefore in VALCO work package 2 (WP 2), the usual application of coupled neutron-kinetic / thermal-hydraulic codes to VVER has been supplemented by systematic uncertainty and sensitivity analyses. A comprehensive uncertainty analysis has been carried out. The GRS uncertainty and sensitivity method based on the statistical code package SUSA was applied to the two transients studied earlier in SRR-1/95: A load drop of one turbo-generator in Loviisa-1 (VVER-440), and a switch-off of one feed water pump in Balakovo-4 (VVER-1000). The main steps of these analyses and the results obtained by applying different coupled code systems (SMABRE - HEXTRAN, ATHLET - DYN3D, ATHLET - KIKO3D, ATHLET - BIPR-8) are described in this report. The application of this method is only based on variations of input parameter values. No internal code adjustments are needed. An essential result of the analysis using the GRS SUSA methodology is the identification of the input parameters, such as the secondary-circuit pressure, the control-assembly position (as a function of time), and the control-assembly efficiency, that most sensitively affect safety-relevant output parameters, like reactor power, coolant heat-up, and primary pressure. Uncertainty bands for these output parameters have been derived. The variation of potentially uncertain input parameter values as a consequence of uncertain knowledge can activate system actions causing quite different transient evolutions. This gives indications about possible plant conditions that might be reached from the initiating event assuming only small disturbances. In this way, the uncertainty and sensitivity analysis reveals the spectrum of possible transient evolutions. Deviations of SRR-1/95 coupled code calculations from measurements also led to the objective to separate neutron kinetics from

  7. LOFT differential pressure uncertainty analysis

    International Nuclear Information System (INIS)

    Evans, R.P.; Biladeau, G.L.; Quinn, P.A.

    1977-03-01

    A performance analysis of the LOFT differential pressure (ΔP) measurement is presented. Along with complete descriptions of the test programs and theoretical studies that have been conducted on the ΔP measurement, specific sources of measurement uncertainty are identified, quantified, and combined to provide an assessment of the ability of this measurement to satisfy the SDD 1.4.1C (June 1975) requirement for the measurement of differential pressure.

  8. System transient analysis code development for low pressure and low power

    International Nuclear Information System (INIS)

    Kim, Hee Cheol

    1998-02-01

    A real-time reactor system analysis code, ARTIST, based on the drift-flux model, has been developed to investigate transient system behavior under low-pressure, low-flow and low-power conditions with noncondensable gas present in the system. The governing equations of the ARTIST code consist of three mass continuity equations (steam, liquid and noncondensable), two energy equations (gas and mixture) and one mixture momentum equation constituted with the drift-flux model. The capability of ARTIST to predict the two-phase void distribution in the system has been validated against experimental data. The axial void distributions computed by ARTIST at low pressure and low flow are far better than the results of both the homogeneous model of the TASS code and the two-fluid model of the RELAP5/MOD3 code; RELAP5/MOD3 calculations also show large-amplitude void fraction oscillations at low pressure. These results imply that the interfacial momentum transfer terms in a two-fluid model formulation should be carefully constituted, especially at low pressure, because of the large density difference between steam and water. A thermal-hydraulic state solution scheme was developed for cases in which noncondensable gas exists. Numerical consistency and convergence in obtaining the equilibrium state were tested with ideal problems for various situations, including very low partial-pressure conditions; the calculated thermal-hydraulic state for each test shows consistent and expected behaviour. A new multi-layer back-propagation network algorithm for calculating the departure from nucleate boiling ratio (DNBR) was developed and adopted in the ARTIST code in order to provide real-time DNBR evaluation, eliminating the tandem procedure of the transient DNBR calculation. The algorithm, trained on patterns generated by Latin hypercube sampling of the performance space, was tested on randomly sampled untrained data and on transient DNBR data. The uncertainty of the algorithm is
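
    The central closure of a drift-flux formulation is the void-fraction relation alpha = jg / (C0*j + Vgj), with distribution parameter C0 and drift velocity Vgj. The sketch below uses the often-quoted Zuber-Findlay churn-turbulent drift velocity as an illustration, with hypothetical low-pressure conditions; the actual ARTIST constitutive set is not reproduced here.

        # Drift-flux void fraction with a Zuber-Findlay churn-turbulent drift velocity.
        import numpy as np

        def void_fraction(jg, jf, C0, rho_g, rho_f, sigma, g=9.81):
            # Vgj = 1.41 * [sigma * g * (rho_f - rho_g) / rho_f^2]^(1/4)
            vgj = 1.41 * (sigma * g * (rho_f - rho_g) / rho_f**2) ** 0.25
            j = jg + jf                          # total superficial velocity
            return jg / (C0 * j + vgj)

        # Low-pressure steam-water (about 1 bar), hypothetical superficial velocities.
        alpha = void_fraction(jg=0.4, jf=1.0, C0=1.13, rho_g=0.6, rho_f=958.0, sigma=0.059)
        print(f"void fraction ~ {alpha:.3f}")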

  9. An introductory guide to uncertainty analysis in environmental and health risk assessment

    International Nuclear Information System (INIS)

    Hoffman, F.O.; Hammonds, J.S.

    1992-10-01

    To compensate for the potential for overly conservative estimates of risk using standard US Environmental Protection Agency methods, an uncertainty analysis should be performed as an integral part of each risk assessment. Uncertainty analyses allow one to obtain quantitative results in the form of confidence intervals that will aid in decision making and will provide guidance for the acquisition of additional data. To perform an uncertainty analysis, one must frequently rely on subjective judgment in the absence of data to estimate the range and a probability distribution describing the extent of uncertainty about a true but unknown value for each parameter of interest. This information is formulated from professional judgment based on an extensive review of literature, analysis of the data, and interviews with experts. Various analytical and numerical techniques are available to allow statistical propagation of the uncertainty in the model parameters to a statement of uncertainty in the risk to a potentially exposed individual. Although analytical methods may be straightforward for relatively simple models, they rapidly become complicated for more involved risk assessments. Because of the tedious efforts required to mathematically derive analytical approaches to propagate uncertainty in complicated risk assessments, numerical methods such as Monte Carlo simulation should be employed. The primary objective of this report is to provide an introductory guide for performing uncertainty analysis in risk assessments being performed for Superfund sites

  10. Uncertainty Analysis of the NASA Glenn 8x6 Supersonic Wind Tunnel

    Science.gov (United States)

    Stephens, Julia; Hubbard, Erin; Walter, Joel; McElroy, Tyler

    2016-01-01

    This paper presents methods and results of a detailed measurement uncertainty analysis that was performed for the 8- by 6-foot Supersonic Wind Tunnel located at the NASA Glenn Research Center. The statistical methods and engineering judgments used to estimate elemental uncertainties are described. The Monte Carlo method of propagating uncertainty was selected to determine the uncertainty of calculated variables of interest. A detailed description of the Monte Carlo method as applied for this analysis is provided. Detailed results for the uncertainty in the average free-stream Mach number, as well as in other variables of interest, are provided. All results are presented as random (variation in observed values about a true value), systematic (potential offset between observed and true value), and total (random and systematic combined) uncertainty. The largest sources contributing to uncertainty are determined and potential improvement opportunities for the facility are investigated.
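
    A minimal sketch (Python/NumPy) of the random/systematic/total decomposition used in the paper, applied to a free-stream Mach number computed from the isentropic total-to-static pressure ratio (gamma = 1.4). The pressures and elemental uncertainties are illustrative assumptions, not facility values.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 50_000

        p0, p = 101.3, 48.0            # total and static pressure, kPa (assumed)
        s_rand, s_sys = 0.15, 0.25     # elemental uncertainties, kPa (assumed)

        def mach(p0, p):
            # Isentropic relation for gamma = 1.4
            return np.sqrt(5.0 * ((p0 / p) ** (2.0 / 7.0) - 1.0))

        # Propagate each error source separately through the data-reduction equation
        u_rand = mach(p0, p + rng.normal(0.0, s_rand, n)).std()
        u_sys = mach(p0, p + rng.normal(0.0, s_sys, n)).std()
        u_total = np.hypot(u_rand, u_sys)

        print(f"M = {mach(p0, p):.3f}, random {u_rand:.4f}, "
              f"systematic {u_sys:.4f}, total {u_total:.4f}")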

  11. Selected examples of practical approaches for the assessment of model reliability - parameter uncertainty analysis

    International Nuclear Information System (INIS)

    Hofer, E.; Hoffman, F.O.

    1987-02-01

    The uncertainty analysis of model predictions has to discriminate between two fundamentally different types of uncertainty. The presence of stochastic variability (Type 1 uncertainty) necessitates the use of a probabilistic model instead of the much simpler deterministic one. Lack of knowledge (Type 2 uncertainty), however, applies to deterministic as well as to probabilistic model predictions and often dominates over uncertainties of Type 1. The term "probability" is interpreted differently in the probabilistic analysis of either type of uncertainty. After these distinctions have been explained, the discussion centers on the propagation of parameter uncertainties through the model, the derivation of quantitative uncertainty statements for model predictions, and the presentation and interpretation of the results of a Type 2 uncertainty analysis. Various alternative approaches are compared for a very simple deterministic model
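
    A minimal sketch (Python/NumPy) of one common way to keep the two types separate: a nested ("two-loop") Monte Carlo in which the outer loop samples Type 2 (lack-of-knowledge) parameters and the inner loop samples Type 1 (stochastic variability). The simple multiplicative model and all distributions are illustrative assumptions, not taken from the report.

        import numpy as np

        rng = np.random.default_rng(7)
        n_outer, n_inner = 500, 2000

        means = []
        for _ in range(n_outer):
            # Type 2: an uncertain-but-fixed transfer coefficient
            k = rng.lognormal(np.log(0.3), 0.4)
            # Type 1: individual-to-individual variability in intake
            intake = rng.lognormal(np.log(1.2), 0.6, size=n_inner)
            means.append(np.mean(k * intake))

        lo, hi = np.percentile(means, [2.5, 97.5])
        print(f"95% Type 2 interval on the population mean: [{lo:.3f}, {hi:.3f}]")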

  12. Static Code Analysis with Gitlab-CI

    CERN Document Server

    Datko, Szymon Tomasz

    2016-01-01

    Static Code Analysis is a simple but efficient way to ensure that an application’s source code is free from known flaws and security vulnerabilities. Although such analysis tools often come with more advanced code editors, there are a lot of people who prefer less complicated environments. The easiest solution would involve education – where to get and how to use the aforementioned tools. However, counting on the manual usage of such tools still does not guarantee their actual usage. On the other hand, reducing the required effort, according to the idea “setup once, use anytime without sweat”, seems like a more promising approach. In this paper, the approach to automate code scanning within the existing CERN Gitlab installation is described. To realize that project, the Gitlab-CI service (the “CI” stands for "Continuous Integration"), with Docker assistance, was employed to provide a variety of static code analysers for different programming languages. This document covers the gene...

  13. Uncertainties in modelling and scaling of critical flows and pump model in TRAC-PF1/MOD1

    International Nuclear Information System (INIS)

    Rohatgi, U.S.; Yu, Wen-Shi.

    1987-01-01

    The USNRC has established a Code Scalability, Applicability and Uncertainty (CSAU) evaluation methodology to quantify the uncertainty in the prediction of safety parameters by the best estimate codes. These codes can then be applied to evaluate the Emergency Core Cooling System (ECCS). The TRAC-PF1/MOD1 version was selected as the first code to undergo the CSAU analysis for LBLOCA applications. It was established through this methodology that break flow and pump models are among the top ranked models in the code affecting the peak clad temperature (PCT) prediction for LBLOCA. The break flow model bias or discrepancy and the uncertainty were determined by modelling the test section near the break for 12 Marviken tests. It was observed that the TRAC-PF1/MOD1 code consistently underpredicts the break flow rate and that the prediction improved with increasing pipe length (larger L/D). This is true for both subcooled and two-phase critical flows. A pump model was developed from Westinghouse (1/3 scale) data. The data represent the largest available test pump relevant to Westinghouse PWRs. It was then shown through the analysis of CE and CREARE pump data that larger pumps degrade less and also that pumps degrade less at higher pressures. Since the model developed here is based on the 1/3 scale pump and on low pressure data, it is conservative and will overpredict the degradation when applied to PWRs

  14. Approach and methods to evaluate the uncertainty in system thermalhydraulic calculations

    International Nuclear Information System (INIS)

    D'Auria, F.

    2004-01-01

    The evaluation of uncertainty constitutes the necessary supplement of Best Estimate (BE) calculations performed to understand accident scenarios in water cooled nuclear reactors. The need comes from the imperfection of computational tools on the one side and from the interest in using such tools to obtain a more precise evaluation of safety margins on the other. In the present paper the approaches to uncertainty are outlined and the CIAU (Code with capability of Internal Assessment of Uncertainty) method proposed by the University of Pisa is described, including the ideas at its basis and results from applications. An activity in progress at the International Atomic Energy Agency (IAEA) is considered. Two approaches are distinguished, characterized as 'propagation of code input uncertainty' and 'propagation of code output errors'. For both methods, the thermal-hydraulic code is at the centre of the process of uncertainty evaluation: in the former case the code itself is adopted to compute the error bands and to propagate the input errors, in the latter case the errors in code application to relevant measurements are used to derive the error bands. The CIAU method exploits the idea of the 'status approach' for identifying the thermalhydraulic conditions of an accident in any Nuclear Power Plant (NPP). Errors in predicting such a status are derived from the comparison between predicted and measured quantities and, in the stage of the application of the method, are used to compute the uncertainty. (author)

  15. The fourth research co-ordination meeting (RCM) on 'Updated codes and methods to reduce the calculational uncertainties of liquid metal fast reactors reactivity effects'. Working material

    International Nuclear Information System (INIS)

    2003-01-01

    The fourth Research Co-ordination Meeting (RCM) of the Co-ordinated Research Project (CRP) on 'Updated Codes and Methods to Reduce the Calculational Uncertainties of the LMFR Reactivity Effect' was held during 19-23 May, 2003 in Obninsk, Russian Federation. The general objective of the CRP is to validate, verify and improve methodologies and computer codes used for the calculation of reactivity coefficients in fast reactors, aiming at enhancing the utilization of plutonium and minor actinides. The first RCM took place in Vienna on 24 - 26 November 1999. The meeting was attended by 19 participants from 7 Member States and one from an international organization (France, Germany, India, Japan, Rep. of Korea, Russian Federation, the United Kingdom, and IAEA). The participants from two Member States (China and the U.S.A.) provided their results and presentation materials despite being absent from the meeting. The results for several relevant reactivity parameters, obtained by the participants with their own state-of-the-art basic data and codes, were compared in terms of calculational uncertainty, and their effects on the ULOF transient behavior of the hybrid BN-600 core were evaluated. The contributions of the participants to the benchmark analyses are shown. This report first addresses the benchmark definitions and specifications given for each Phase and briefly introduces the basic data, computer codes, and methodologies applied to the benchmark analyses by various participants. Then, the results obtained by the participants in terms of calculational uncertainty and their effect on the core transient behavior are intercompared. Finally it addresses some conclusions drawn in the benchmarks

  16. Uncertainty analysis of a low flow model for the Rhine River

    NARCIS (Netherlands)

    Demirel, M.C.; Booij, Martijn J.

    2011-01-01

    It is widely recognized that hydrological models are subject to parameter uncertainty. However, little attention has been paid so far to the uncertainty in parameters of the data-driven models like weights in neural networks. This study aims at applying a structured uncertainty analysis to a

  17. Application of uncertainty analysis in conceptual fusion reactor design

    International Nuclear Information System (INIS)

    Wu, T.; Maynard, C.W.

    1979-01-01

    The theories of sensitivity and uncertainty analysis are described and applied to a new conceptual tokamak fusion reactor design--NUWMAK. The responses investigated in this study include the tritium breeding ratio, first wall Ti dpa and gas productions, nuclear heating in the blanket, energy leakage to the magnet, and the dpa rate in the superconducting magnet aluminum stabilizer. The sensitivities and uncertainties of these responses are calculated. The cost/benefit feature of proposed integral measurements is also studied through the uncertainty reductions of these responses

  18. Module type plant system dynamics analysis code (MSG-COPD). Code manual

    International Nuclear Information System (INIS)

    Sakai, Takaaki

    2002-11-01

    MSG-COPD is a module-type plant system dynamics analysis code which includes a multi-dimensional thermal-hydraulics calculation module for the analysis of pool-type fast breeder reactors. Explanations of each module and of the methods for preparing the input data are given in this code manual. (author)

  19. DAKOTA, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis:version 4.0 reference manual

    Energy Technology Data Exchange (ETDEWEB)

    Griffin, Joshua D. (Sandia National Labs, Livermore, CA); Eldred, Michael Scott; Martinez-Canales, Monica L. (Sandia National Labs, Livermore, CA); Watson, Jean-Paul; Kolda, Tamara Gibson (Sandia National Labs, Livermore, CA); Adams, Brian M.; Swiler, Laura Painton; Williams, Pamela J. (Sandia National Labs, Livermore, CA); Hough, Patricia Diane (Sandia National Labs, Livermore, CA); Gay, David M.; Dunlavy, Daniel M.; Eddy, John P.; Hart, William Eugene; Guinta, Anthony A.; Brown, Shannon L.

    2006-10-01

    The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic finite element methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a reference manual for the commands specification for the DAKOTA software, providing input overviews, option descriptions, and example specifications.

  20. Solution weighting for the SAND-II Monte Carlo code

    International Nuclear Information System (INIS)

    Oster, C.A.; McElroy, W.N.; Simons, R.L.; Lippincott, E.P.; Odette, G.R.

    1976-01-01

    Modifications to the SAND-II Error Analysis Monte Carlo code to include solution weighting based on input data uncertainties have been made and are discussed, together with background information on the SAND-II algorithm. The new procedure permits input data having smaller uncertainties to have a greater influence on the solution spectrum than the data having larger uncertainties. The results of an in-depth study to find a practical procedure and the first results of its application to three important Interlaboratory LMFBR Reaction Rate (ILRR) program benchmark spectra (CFRMF, ΣΣ, and ²³⁵U fission) are discussed
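
    A minimal sketch (Python/NumPy) of the weighting idea: input data with smaller stated uncertainties receive larger weights in the solution. Inverse-variance weights are shown here as one common choice; the numbers are illustrative, not SAND-II data.

        import numpy as np

        rates = np.array([1.02, 0.97, 1.10])   # measured reaction rates (normalized)
        sigma = np.array([0.02, 0.05, 0.10])   # reported 1-sigma uncertainties

        w = 1.0 / sigma**2                     # smaller uncertainty -> larger weight
        weighted_mean = np.sum(w * rates) / np.sum(w)
        weighted_sigma = np.sqrt(1.0 / np.sum(w))
        print(f"weighted solution {weighted_mean:.3f} +/- {weighted_sigma:.3f}")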

  1. 76 FR 64931 - Building Energy Codes Cost Analysis

    Science.gov (United States)

    2011-10-19

    ...-0046] Building Energy Codes Cost Analysis AGENCY: Office of Energy Efficiency and Renewable Energy... reopening of the time period for submitting comments on the request for information on Building Energy Codes... the request for information on Building Energy Code Cost Analysis and provide docket number EERE-2011...

  2. Uncertainty analysis of NDA waste measurements using computer simulations

    International Nuclear Information System (INIS)

    Blackwood, L.G.; Harker, Y.D.; Yoon, W.Y.; Meachum, T.R.

    2000-01-01

    Uncertainty assessments for nondestructive radioassay (NDA) systems for nuclear waste are complicated by factors extraneous to the measurement systems themselves. Most notably, characteristics of the waste matrix (e.g., homogeneity) and radioactive source material (e.g., particle size distribution) can have great effects on measured mass values. Under these circumstances, characterizing the waste population is as important as understanding the measurement system in obtaining realistic uncertainty values. When extraneous waste characteristics affect measurement results, the uncertainty results are waste-type specific. The goal becomes to assess the expected bias and precision for the measurement of a randomly selected item from the waste population of interest. Standard propagation-of-errors methods for uncertainty analysis can be very difficult to implement in the presence of significant extraneous effects on the measurement system. An alternative approach that naturally includes the extraneous effects is as follows: (1) Draw a random sample of items from the population of interest; (2) Measure the items using the NDA system of interest; (3) Establish the true quantity being measured using a gold standard technique; and (4) Estimate bias by deriving a statistical regression model comparing the measurements on the system of interest to the gold standard values; similar regression techniques for modeling the standard deviation of the difference values give the estimated precision. Actual implementation of this method is often impractical. For example, a true gold standard confirmation measurement may not exist. A more tractable implementation is obtained by developing numerical models for both the waste material and the measurement system. A random sample of simulated waste containers generated by the waste population model serves as input to the measurement system model. This approach has been developed and successfully applied to assessing the quantity of
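
    A minimal sketch (Python/NumPy) of steps (3) and (4) above: regress the NDA measurements against gold-standard values to estimate bias, and use the residual spread to estimate precision. The simulated paired data stand in for the measurements the report describes; the 8% bias is an assumption of the sketch.

        import numpy as np

        rng = np.random.default_rng(3)

        true_mass = rng.uniform(1.0, 50.0, size=40)                  # gold-standard values, g
        measured = 1.08 * true_mass + rng.normal(0.0, 2.0, size=40)  # NDA system (assumed bias)

        # Step 4: least-squares regression model, measured = a * true + b
        a, b = np.polyfit(true_mass, measured, 1)
        residuals = measured - (a * true_mass + b)

        print(f"estimated bias: slope {a:.3f}, offset {b:.2f} g")
        print(f"estimated precision (residual std): {residuals.std(ddof=2):.2f} g")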

  3. Uncertainty correlation in stochastic safety analysis of natural circulation decay heat removal of liquid metal reactor

    International Nuclear Information System (INIS)

    Takata, Takashi; Yamaguchi, Akira

    2009-01-01

    Since various uncertainties of input variables are involved and nonlinearly correlated in a Best Estimate (BE) plant dynamics code, it is important to evaluate the influence of input uncertainty on the computational results and to estimate the accuracy of the confidence level of the results. In order to estimate the importance and the accuracy, the authors have applied a stochastic safety analysis procedure using the Latin hypercube sampling method to the Liquid Metal Reactor (LMR) natural circulation Decay Heat Removal (DHR) phenomenon in the present paper. Seventeen input variables are chosen for the analyses, and five influential variables, which affect the maximum coolant temperature at the core in a short period of time (several tens of seconds), are selected to investigate their importance by comparison with a full-scope parametric analysis. As a result, it has been demonstrated that a comparatively small number of samples is sufficient to estimate the dominant input variable and the confidence level. Furthermore, the influence of the sampling method on the accuracy of the upper tolerance limit (confidence level of 95%) has been examined based on Wilks' formula. (author)
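
    A minimal sketch (Python) of the first-order, one-sided Wilks' formula mentioned above: the smallest sample size n satisfying 1 - gamma**n >= beta gives a beta-confidence bound on the gamma-quantile of the output, independent of the number of input variables. For the 95%/95% case this yields the familiar n = 59.

        import math

        def wilks_n(gamma=0.95, beta=0.95):
            # Smallest n with 1 - gamma**n >= beta (first order, one sided)
            return math.ceil(math.log(1.0 - beta) / math.log(gamma))

        print(wilks_n())   # 59 code runs for a 95%/95% one-sided tolerance limit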

  4. LOFT uncertainty-analysis methodology

    International Nuclear Information System (INIS)

    Lassahn, G.D.

    1983-01-01

    The methodology used for uncertainty analyses of measurements in the Loss-of-Fluid Test (LOFT) nuclear-reactor-safety research program is described and compared with other methodologies established for performing uncertainty analyses

  6. Methodology, status and plans for development and assessment of Cathare code

    Energy Technology Data Exchange (ETDEWEB)

    Bestion, D.; Barre, F.; Faydide, B. [CEA - Grenoble (France)

    1997-07-01

    This paper presents the methodology, status and plans for the development, assessment and uncertainty evaluation of the Cathare code. Cathare is a thermalhydraulic code developed by CEA (DRN), IPSN, EDF and FRAMATOME for PWR safety analysis. First, the status of the code development and assessment is presented, together with the general strategy used for the development and assessment of the code. Analytical experiments with separate effect tests and component tests are used for the development and validation of closure laws. Successive Revisions of constitutive laws are implemented in successive Versions of the code and assessed. System tests or integral tests are used to validate the general consistency of the Revision. Each delivery of a code Version + Revision is fully assessed and documented. A methodology is being developed to determine the uncertainty on all constitutive laws of the code using calculations of many analytical tests and applying the Discrete Adjoint Sensitivity Method (DASM). Finally, the plans for future development of the code are presented. They concern the optimization of code performance through parallel computing (the code will be used for real-time full-scope plant simulators), the coupling with many other codes (neutronic codes, severe accident codes), and the application of the code to containment thermalhydraulics. Physical improvements are also required in the field of low-pressure transients and in the modeling for the 3-D model.

  7. Assessment of shielding analysis methods, codes, and data for spent fuel transport/storage applications

    International Nuclear Information System (INIS)

    Parks, C.V.; Broadhead, B.L.; Hermann, O.W.; Tang, J.S.; Cramer, S.N.; Gauthey, J.C.; Kirk, B.L.; Roussin, R.W.

    1988-07-01

    This report provides a preliminary assessment of the computational tools and existing methods used to obtain radiation dose rates from shielded spent nuclear fuel and high-level radioactive waste (HLW). Particular emphasis is placed on analysis tools and techniques applicable to facilities/equipment designed for the transport or storage of spent nuclear fuel or HLW. Applications to cask transport, storage, and facility handling are considered. The report reviews the analytic techniques for generating appropriate radiation sources, evaluating the radiation transport through the shield, and calculating the dose at a desired point or surface exterior to the shield. Discrete ordinates, Monte Carlo, and point kernel methods for evaluating radiation transport are reviewed, along with existing codes and data that utilize these methods. A literature survey was employed to select a cadre of codes and data libraries to be reviewed. The selection process was based on specific criteria presented in the report. Separate summaries were written for several codes (or family of codes) that provided information on the method of solution, limitations and advantages, availability, data access, ease of use, and known accuracy. For each data library, the summary covers the source of the data, applicability of these data, and known verification efforts. Finally, the report discusses the overall status of spent fuel shielding analysis techniques and attempts to illustrate areas where inaccuracy and/or uncertainty exist. The report notes the advantages and limitations of several analysis procedures and illustrates the importance of using adequate cross-section data sets. Additional work is recommended to enable final selection/validation of analysis tools that will best meet the US Department of Energy's requirements for use in developing a viable HLW management system. 188 refs., 16 figs., 27 tabs

  8. Analysis and Reduction of Complex Networks Under Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Knio, Omar M

    2014-04-09

    This is a collaborative proposal that aims at developing new methods for the analysis and reduction of complex multiscale networks under uncertainty. The approach is based on combining methods of computational singular perturbation (CSP) and probabilistic uncertainty quantification. In deterministic settings, CSP yields asymptotic approximations of reduced-dimensionality “slow manifolds” on which a multiscale dynamical system evolves. Introducing uncertainty raises fundamentally new issues, particularly concerning its impact on the topology of slow manifolds, and means to represent and quantify associated variability. To address these challenges, this project uses polynomial chaos (PC) methods to reformulate uncertain network models, and to analyze them using CSP in probabilistic terms. Specific objectives include (1) developing effective algorithms that can be used to illuminate fundamental and unexplored connections among model reduction, multiscale behavior, and uncertainty, and (2) demonstrating the performance of these algorithms through applications to model problems.

  9. Complex Visual Data Analysis, Uncertainty, and Representation

    National Research Council Canada - National Science Library

    Schunn, Christian D; Saner, Lelyn D; Kirschenbaum, Susan K; Trafton, J. G; Littleton, Eliza B

    2007-01-01

    ... (weather forecasting, submarine target motion analysis, and fMRI data analysis). Internal spatial representations are coded from spontaneous gestures made during cued-recall summaries of problem solving activities...

  10. Sensitivity and uncertainty analysis of NET/ITER shielding blankets

    International Nuclear Information System (INIS)

    Hogenbirk, A.; Gruppelaar, H.; Verschuur, K.A.

    1990-09-01

    Results are presented of sensitivity and uncertainty calculations based upon the European fusion file (EFF-1). The effect of uncertainties in Fe, Cr and Ni cross sections on the nuclear heating in the coils of a NET/ITER shielding blanket has been studied. The analysis has been performed for the total cross section as well as partial cross sections. The correct expression for the sensitivity profile was used, including the gain term. The resulting uncertainty in the nuclear heating lies between 10 and 20 per cent. (author). 18 refs.; 2 figs.; 2 tabs
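
    A minimal sketch (Python/NumPy) of the "sandwich rule" underlying such calculations: the relative variance of a response R is S^T C S, where S is the sensitivity profile of R with respect to the group cross sections and C their relative covariance matrix. The 3-group numbers are illustrative assumptions, not EFF-1 data.

        import numpy as np

        S = np.array([0.4, 0.3, 0.1])        # relative sensitivities, 3 groups (assumed)
        C = np.array([[0.04, 0.02, 0.00],    # relative covariance matrix (assumed)
                      [0.02, 0.09, 0.03],
                      [0.00, 0.03, 0.16]])

        rel_var = S @ C @ S                  # sandwich rule
        print(f"relative uncertainty in the response: {np.sqrt(rel_var):.1%}")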

  11. A structured analysis of uncertainty surrounding modeled impacts of groundwater-extraction rules

    Science.gov (United States)

    Guillaume, Joseph H. A.; Qureshi, M. Ejaz; Jakeman, Anthony J.

    2012-08-01

    Integrating economic and groundwater models for groundwater management can help improve understanding of the trade-offs between conflicting socioeconomic and biophysical objectives. However, there is significant uncertainty in most strategic decision-making situations, including in the models constructed to represent them. If not addressed, this uncertainty may be used to challenge the legitimacy of the models and of decisions made using them. In this context, a preliminary uncertainty analysis was conducted of a dynamic coupled economic-groundwater model aimed at assessing groundwater extraction rules. The analysis demonstrates how a variety of uncertainties in such a model can be addressed. A number of methods are used, including propagation of scenarios and bounds on parameters, multiple models, block bootstrap time-series sampling, and robust linear regression for model calibration. These methods are described within the context of a theoretical uncertainty management framework, using a set of fundamental uncertainty management tasks and an uncertainty typology.
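
    A minimal sketch (Python/NumPy) of the moving-block bootstrap named above: resampling contiguous blocks of the series preserves its short-range autocorrelation. The synthetic AR(1) flow series, block length and low-flow statistic are illustrative assumptions, not data from the study.

        import numpy as np

        rng = np.random.default_rng(11)

        # Synthetic autocorrelated daily flow series (AR(1))
        noise = rng.normal(0.0, 1.0, 365)
        x = np.zeros(365)
        for t in range(1, 365):
            x[t] = 0.9 * x[t - 1] + noise[t]
        flow = 100.0 + 5.0 * x

        def block_bootstrap(series, block_len):
            # Concatenate randomly chosen contiguous blocks, then trim to length
            n = len(series)
            starts = rng.integers(0, n - block_len + 1, size=n // block_len + 1)
            return np.concatenate([series[s:s + block_len] for s in starts])[:n]

        # Spread of a 7-day minimum low-flow statistic over 1000 replicates
        stats = [np.convolve(block_bootstrap(flow, 30), np.ones(7) / 7,
                             mode='valid').min() for _ in range(1000)]
        print(np.percentile(stats, [2.5, 97.5]))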

  12. Two-dimensional disruption thermal analysis code DREAM

    International Nuclear Information System (INIS)

    Yamazaki, Seiichiro; Kobayashi, Takeshi; Seki, Masahiro.

    1988-08-01

    When a plasma disruption takes place in a tokamak-type fusion reactor, plasma facing components such as the first wall and divertor/limiter are subjected to an intense heat load of very high heat flux and short duration. The wall surface temperature rises rapidly, and melting and evaporation occur, causing reduction of the wall thickness and crack initiation/propagation. As the lifetime of the components is significantly affected by these processes, transient analysis that takes account of phase changes (melting/evaporation) and radiation heat loss is required in the design of these components. This paper describes the computer code DREAM developed to perform such two-dimensional transient thermal analysis. The input and output of the code and a sample analysis of a disruption simulation experiment are also reported. The user's input manual is added as an appendix. The profiles and time variations of temperature, and the melted and evaporated thicknesses of material subjected to an intense heat load, can be obtained using this computer code. The code also gives the temperature data for elastoplastic analysis with FEM structural analysis codes (ADINA, MARC, etc.) to evaluate the thermal stress and crack propagation behavior within the wall materials. (author)
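
    A minimal 1-D sketch (Python/NumPy) of the transient conduction problem DREAM solves in 2-D, with the phase-change and radiation terms omitted for brevity. The explicit finite-difference update, the tungsten-like properties and the disruption heat pulse are illustrative assumptions, not values from the code.

        import numpy as np

        k, rho, cp = 100.0, 19.3e3, 140.0   # W/m-K, kg/m3, J/kg-K (tungsten-like)
        alpha = k / (rho * cp)

        nx, L = 100, 0.01                   # 10 mm slab, 100 nodes
        dx = L / nx
        dt = 0.4 * dx**2 / alpha            # below the explicit stability limit
        q = 100e6                           # 100 MW/m2 disruption heat flux
        t_pulse, t_end = 0.01, 0.02         # 10 ms pulse, 20 ms simulated

        T = np.full(nx, 300.0)
        t, peak = 0.0, 300.0
        while t < t_end:
            Tn = T.copy()
            T[1:-1] = Tn[1:-1] + alpha * dt / dx**2 * (Tn[2:] - 2 * Tn[1:-1] + Tn[:-2])
            flux = q if t < t_pulse else 0.0
            T[0] = T[1] + flux * dx / k     # heated-surface flux boundary condition
            T[-1] = 300.0                   # back face held at the initial temperature
            t += dt
            peak = max(peak, T[0])

        print(f"peak surface temperature ~ {peak:.0f} K")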

  13. Subchannel analysis code development for CANDU fuel channel

    International Nuclear Information System (INIS)

    Park, J. H.; Suk, H. C.; Jun, J. S.; Oh, D. J.; Hwang, D. H.; Yoo, Y. J.

    1998-07-01

    Since subchannel codes such as COBRA and TORC exist in our country for a PWR fuel channel but not for a CANDU fuel channel, a subchannel analysis code for a CANDU fuel channel was developed for the prediction of flow conditions in the subchannels and for the accurate assessment of the thermal margin, the effect of appendages, the effect of the radial/axial power profile of fuel bundles on flow conditions and CHF, and so on. In order to develop the subchannel analysis code for a CANDU fuel channel, subchannel analysis methodology and its applicability/pertinence were reviewed from the CANDU fuel channel point of view. Several thermalhydraulic and numerical models for the subchannel analysis of a CANDU fuel channel were developed. Experimental data for the CANDU fuel channel were collected, analyzed and used for validation of the subchannel analysis code developed in this work. (author). 11 refs., 3 tabs., 50 figs

  14. Accounting for Model Uncertainties Using Reliability Methods - Application to Carbon Dioxide Geologic Sequestration System. Final Report

    International Nuclear Information System (INIS)

    Mok, Chin Man; Doughty, Christine; Zhang, Keni; Pruess, Karsten; Kiureghian, Armen; Zhang, Miao; Kaback, Dawn

    2010-01-01

    A new computer code, CALRELTOUGH, which uses reliability methods to incorporate parameter sensitivity and uncertainty analysis into subsurface flow and transport models, was developed by Geomatrix Consultants, Inc. in collaboration with Lawrence Berkeley National Laboratory and the University of California at Berkeley. The CALREL reliability code was developed at the University of California at Berkeley for geotechnical applications, and the TOUGH family of codes was developed at Lawrence Berkeley National Laboratory for subsurface flow and transport applications. The integration of the two codes provides a new approach to dealing with uncertainties in flow and transport modeling of the subsurface, such as those associated with hydrogeologic parameters, boundary conditions, and initial conditions, using data from site characterization and monitoring for conditioning. The new code enables computation of the reliability of a system and of the components that make up the system, instead of calculating the complete probability distributions of model predictions at all locations at all times. The new CALRELTOUGH code has tremendous potential to advance subsurface understanding for a variety of applications including subsurface energy storage, nuclear waste disposal, carbon sequestration, extraction of natural resources, and environmental remediation. The new code was tested on a carbon sequestration problem as part of the Phase I project. Phase II was not awarded.

  15. Holistic uncertainty analysis in river basin modeling for climate vulnerability assessment

    Science.gov (United States)

    Taner, M. U.; Wi, S.; Brown, C.

    2017-12-01

    The challenges posed by an uncertain future climate are a prominent concern for water resources managers. A number of frameworks exist for assessing the impacts of climate-related uncertainty, including internal climate variability and anthropogenic climate change, such as scenario-based and vulnerability-based approaches. While in many cases climate uncertainty may be dominant, other factors such as the future evolution of the river basin, the hydrologic response and reservoir operations are potentially significant sources of uncertainty. While uncertainty associated with modeling hydrologic response has received attention, very little attention has focused on the range of uncertainty and possible effects of the water resources infrastructure and management. This work presents a holistic framework that allows analysis of climate, hydrologic and water management uncertainty in water resources systems analysis, with the aid of a water system model designed to integrate component models for hydrologic processes and water management activities. The uncertainties explored include those associated with climate variability and change, hydrologic model parameters, and water system operation rules. A Bayesian framework is used to quantify and model the uncertainties at each modeling step in an integrated fashion, including prior and likelihood information about model parameters. The framework is demonstrated in a case study for the St. Croix Basin located at the border of the United States and Canada.

  16. Uncertainty in RELAP5/MOD3.2 calculations for interfacial drag in downward two-phase flow

    International Nuclear Information System (INIS)

    Clark, Collin; Schlegel, Joshua P.; Hibiki, Takashi; Ishii, Mamoru; Kinoshita, Ikuo

    2016-01-01

    Highlights: • Uncertainty propagation is key for best estimate code reliability. • Uncertainty in drift-flux correlations is used to evaluate uncertainty in interfacial drag. • Bias and error have been compared for various models. - Abstract: RELAP5/MOD3.2 is a thermal-hydraulic system analysis code used to predict the response of nuclear reactor coolant systems in the event of certain accident scenarios. It is important that RELAP and other system analysis codes are able to accurately predict various two-phase flow phenomena, particularly the interfacial transfers between the liquid and gas phases. It is also important to understand how much uncertainty exists in these predictions due to uncertainties in the constitutive relations used to close the two-fluid model. In this paper, the uncertainty in the interfacial drag calculated by RELAP5/MOD3.2 due to errors in the drift-flux models used to close the model is evaluated and compared to the correlation developed by Goda et al. (2003). The case of downward flow is considered because of the importance of co-current and counter-current downward flow for predicting behavior in the downcomer of reactor systems during small-break Loss of Coolant Accidents (LOCAs). The overall uncertainty in the interfacial force calculations due to error in the distribution parameter models was found to have a bias of +8.1% and an error of 20.1% for the models used in RELAP5, and a bias of −30.8% and an error of 23.1% for the correlation of Goda et al. (2003). However, this analysis neglects the effects of compensating errors in the drift-flux parameters, as the drift velocity is assumed to be perfectly accurate. More physically meaningful results could be obtained if the distribution parameter and drift velocity were calculated directly from local phase concentration and velocity measurements; however, no studies were available that included all of this information.
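
    A minimal sketch (Python) of the one-dimensional drift-flux relation at issue: the void fraction follows from the gas superficial velocity, the distribution parameter C0 and the drift velocity, so an error in C0 propagates directly into the interfacial drag. The downward-flow conditions, C0 and drift velocity are illustrative assumptions, not RELAP5 or Goda et al. values.

        jg, jf = -0.8, -2.5           # gas/liquid superficial velocities, m/s (downward)
        j = jg + jf                   # total volumetric flux
        C0, vgj = 1.2, -0.25          # distribution parameter, drift velocity (assumed)

        alpha = jg / (C0 * j + vgj)   # one-dimensional drift-flux model
        print(f"void fraction = {alpha:.3f}")

        # Propagate an assumed +/-20% uncertainty in C0 into the void fraction
        for dC0 in (-0.2, 0.2):
            a = jg / (C0 * (1.0 + dC0) * j + vgj)
            print(f"C0 x {1.0 + dC0:.1f} -> void fraction = {a:.3f}")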

  17. Geological-structural models used in SR 97. Uncertainty analysis

    Energy Technology Data Exchange (ETDEWEB)

    Saksa, P.; Nummela, J. [FINTACT Oy (Finland)

    1998-10-01

    The uncertainty of geological-structural models was studied for the three sites in SR 97, called Aberg, Beberg and Ceberg. The evaluation covered both regional and site scale models, the emphasis being placed on fracture zones in the site scale. Uncertainty is a natural feature of all geoscientific investigations. It originates from measurements (errors in data, sampling limitations, scale variation) and conceptualisation (structural geometries and properties, ambiguous geometric or parametric solutions), to name the major sources. The structures of A-, B- and Ceberg are fracture zones of varying types. No major differences in the conceptualisation between the sites were noted. One source of uncertainty in the site models is the non-existence of fracture and zone information on the scale from 10 to 300 - 1000 m. At Aberg the development of the regional model has been performed very thoroughly. At the site scale one major source of uncertainty is that a clear definition of the target area is missing. Structures encountered in the boreholes are well explained, and an interdisciplinary approach in interpretation has taken place. The Beberg and Ceberg regional models contain relatively large uncertainties due to the investigation methodology and experience available at that time. At the site scale six additional structures were proposed for both Beberg and Ceberg for variant analysis of these sites. Both sites include uncertainty in the form of many non-interpreted fractured sections along the boreholes. Statistical analysis gives high occurrences of structures for all three sites: typically 20 - 30 structures/km³. Aberg has the highest structural frequency, Beberg comes next and Ceberg has the lowest. The borehole configurations, orientations and surveying goals were inspected to find whether preferences or factors causing bias were present. Data from Aberg support the conclusion that the Aespoe sub-volume would be an anomalously fractured, tectonised unit of its own. This means that

  19. Experimental transport analysis code system in JT-60

    International Nuclear Information System (INIS)

    Hirayama, Toshio; Shimizu, Katsuhiro; Tani, Keiji; Shirai, Hiroshi; Kikuchi, Mitsuru

    1988-03-01

    Transport analysis codes have been developed in order to study confinement properties related to particle and energy balance in ohmically and neutral-beam heated plasmas of JT-60. The analysis procedure is divided into three steps as follows: 1) LOOK ; The shape of the plasma boundary is identified with the fast boundary identification code FBI using magnetic data, and flux surfaces are calculated with the MHD equilibrium code SELENE. The diagnostic data are mapped onto flux surfaces for the neutral beam heating calculation and/or for the radial transport analysis. 2) OFMC ; On the basis of the transformed data, the orbit following Monte Carlo code OFMC calculates the profiles of both the power deposition and the particle source of the neutral beam injected into the plasma. 3) SCOOP ; In the last stage, the one-dimensional transport code SCOOP solves the particle and energy balance for electrons and ions, in order to evaluate transport coefficients as well as global parameters such as the energy confinement time and the stored energy. The analysis results are provided to the DARTS data bank, which is used to obtain an overview of important considerations on confinement with the regression analysis code RAC. (author)

  20. Uncertainty Management and Sensitivity Analysis

    DEFF Research Database (Denmark)

    Rosenbaum, Ralph K.; Georgiadis, Stylianos; Fantke, Peter

    2018-01-01

    Uncertainty is always there and LCA is no exception to that. The presence of uncertainties of different types and from numerous sources in LCA results is a fact, but managing them allows one to quantify and improve the precision of a study and the robustness of its conclusions. LCA practice sometimes suffers from an imbalanced perception of uncertainties, justifying modelling choices and omissions. Identifying prevalent misconceptions around uncertainties in LCA is a central goal of this chapter, aiming to establish a positive approach focusing on the advantages of uncertainty management. The main objectives of this chapter are to learn how to deal with uncertainty in the context of LCA, how to quantify it, interpret and use it, and how to communicate it. The subject is approached more holistically than just focusing on relevant statistical methods or purely mathematical aspects. This chapter