WorldWideScience

Sample records for uncertainty analysis code

  1. Uncertainty analysis of the FRAP code

    International Nuclear Information System (INIS)

    Peck, S.O.

    1978-01-01

    A user-oriented, automated uncertainty analysis capability has been built into the FRAP code (Fuel Rod Analysis Program) and applied to a PWR fuel rod undergoing a LOCA. The method of uncertainty analysis is the Response Surface Method (RSM). (author)

  2. Uncertainty analysis of the FRAP code

    International Nuclear Information System (INIS)

    Peck, S.O.

    1978-01-01

    A user-oriented, automated uncertainty analysis capability has been built into the Fuel Rod Analysis Program (FRAP) code and has been applied to a pressurized water reactor (PWR) fuel rod undergoing a loss-of-coolant accident (LOCA). The method of uncertainty analysis is the response surface method. The automated version significantly reduced the time required to complete the analysis and, at the same time, greatly increased the problem scope. Results of the analysis showed a significant difference between steady-state and transient conditions in the total and relative contributions to the uncertainty of the response parameters.
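
    The response surface approach described above can be sketched in miniature: run the code at a small designed set of input points, fit a cheap polynomial surface to the outputs, and propagate input uncertainty through the surface instead of the code. In the sketch below the `code` function and all numbers are hypothetical stand-ins, not FRAP itself.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical stand-in for an expensive fuel-rod code: a "peak clad
# temperature" response of two uncertain inputs (illustrative only).
def code(x1, x2):
    return 1100.0 + 80.0 * x1 - 30.0 * x2 + 10.0 * x1 * x2

# Small designed set of code runs: a 3x3 factorial in coded units.
levels = np.array([-1.0, 0.0, 1.0])
X = np.array([(a, b) for a in levels for b in levels])
y = np.array([code(a, b) for a, b in X])

# Fit a quadratic response surface to the code runs by least squares.
def basis(x1, x2):
    return [1.0, x1, x2, x1 * x2, x1 ** 2, x2 ** 2]

A = np.array([basis(a, b) for a, b in X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Propagate input uncertainty through the cheap surface, not the code.
inputs = rng.normal(0.0, 0.3, size=(10_000, 2))
response = np.array([basis(a, b) for a, b in inputs]) @ coef
mean, std = response.mean(), response.std()
```

    Nine code runs support a ten-thousand-sample Monte Carlo propagation, which is the economy the method trades against the surface's fitting error.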

  3. Automated uncertainty analysis methods in the FRAP computer codes

    International Nuclear Information System (INIS)

    Peck, S.O.

    1980-01-01

    A user-oriented, automated uncertainty analysis capability has been incorporated in the Fuel Rod Analysis Program (FRAP) computer codes. The FRAP codes have been developed for the analysis of Light Water Reactor fuel rod behavior during steady state (FRAPCON) and transient (FRAP-T) conditions as part of the United States Nuclear Regulatory Commission's Water Reactor Safety Research Program. The objective of uncertainty analysis of these codes is to obtain estimates of the uncertainty in computed outputs of the codes as a function of known uncertainties in input variables. This paper presents the methods used to generate an uncertainty analysis of a large computer code, discusses the assumptions that are made, and shows techniques for testing them. An uncertainty analysis of FRAP-T calculated fuel rod behavior during a hypothetical loss-of-coolant transient is presented as an example and carried through the discussion to illustrate the various concepts.

  4. Methods and computer codes for probabilistic sensitivity and uncertainty analysis

    International Nuclear Information System (INIS)

    Vaurio, J.K.

    1985-01-01

    This paper describes the methods and applications experience with two computer codes that are now available from the National Energy Software Center at Argonne National Laboratory. The purpose of the SCREEN code is to identify a group of the most important input variables of a code that has many (tens, hundreds) input variables with uncertainties, and to do this without relying on judgment or exhaustive sensitivity studies. The purpose of the PROSA-2 code is to propagate uncertainties and calculate the distributions of interesting output variable(s) of a safety analysis code using response surface techniques, based on the same runs used for screening. Several applications are discussed, but the codes are generic, not tailored to any specific safety application code. They are compatible in terms of input/output requirements but also independent of each other; e.g., PROSA-2 can be used without first using SCREEN if a set of important input variables has been selected by other methods. Also, although SCREEN can select cases to be run (by random sampling), users can select cases by other methods if they so prefer, and still use the rest of SCREEN for identifying important input variables.
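
    A much-simplified analogue of SCREEN's task, ranking many uncertain inputs by importance from a set of random-sampling runs, might look like the following. The toy model and the choice of output correlation as the importance measure are illustrative assumptions, not SCREEN internals.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy analogue of screening: a "code" with many uncertain inputs, only a
# few of which actually matter. The model and indices are made up.
n_inputs, n_runs = 10, 200
X = rng.normal(size=(n_runs, n_inputs))
y = 5.0 * X[:, 2] - 3.0 * X[:, 7] + 0.1 * X[:, 4] + rng.normal(0.0, 0.1, n_runs)

# Rank inputs by the magnitude of their correlation with the output.
corr = np.array([np.corrcoef(X[:, j], y)[0, 1] for j in range(n_inputs)])
ranking = np.argsort(-np.abs(corr))   # most important input first
```

    Two hundred runs suffice here to pull the two dominant inputs to the top of the ranking without any prior judgment about which inputs matter.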

  5. Uncertainty and sensitivity analysis using probabilistic system assessment code. 1

    International Nuclear Information System (INIS)

    Honma, Toshimitsu; Sasahara, Takashi.

    1993-10-01

    This report presents the results obtained when applying the probabilistic system assessment code under development to the PSACOIN Level 0 intercomparison exercise organized by the Probabilistic System Assessment Code User Group in the Nuclear Energy Agency (NEA) of the OECD. This exercise is one of a series designed to compare and verify probabilistic codes in the performance assessment of geological radioactive waste disposal facilities. The computations were performed using the Monte Carlo sampling code PREP and the post-processor code USAMO. The submodels in the waste disposal system were described and coded in accordance with the specification of the exercise. Besides the results required for the exercise, further uncertainty and sensitivity analyses were performed, and the details of these are also included. (author)

  6. An uncertainty analysis using the NRPB accident consequence code Marc

    International Nuclear Information System (INIS)

    Jones, J.A.; Crick, M.J.; Simmonds, J.R.

    1991-01-01

    This paper describes an uncertainty analysis of MARC calculations of the consequences of accidental releases of radioactive materials to atmosphere. A total of 98 parameters describing the transfer of material through the environment to man, the doses received, and the health effects resulting from these doses were considered. The uncertainties in the numbers of early and late health effects, the numbers of people affected by countermeasures, the amounts of food restricted, and the economic costs of the accident were estimated. This paper concentrates on the results for early death and fatal cancer for a large hypothetical release from a PWR.

  7. Code development for eigenvalue total sensitivity analysis and total uncertainty analysis

    International Nuclear Information System (INIS)

    Wan, Chenghui; Cao, Liangzhi; Wu, Hongchun; Zu, Tiejun; Shen, Wei

    2015-01-01

    Highlights: • We develop a new code for total sensitivity and uncertainty analysis. • The implicit effects of cross sections can be considered. • The results of our code agree well with TSUNAMI-1D. • Detailed analysis for origins of implicit effects is performed. - Abstract: The uncertainties of multigroup cross sections notably impact eigenvalue of neutron-transport equation. We report on a total sensitivity analysis and total uncertainty analysis code named UNICORN that has been developed by applying the direct numerical perturbation method and statistical sampling method. In order to consider the contributions of various basic cross sections and the implicit effects which are indirect results of multigroup cross sections through resonance self-shielding calculation, an improved multigroup cross-section perturbation model is developed. The DRAGON 4.0 code, with application of WIMSD-4 format library, is used by UNICORN to carry out the resonance self-shielding and neutron-transport calculations. In addition, the bootstrap technique has been applied to the statistical sampling method in UNICORN to obtain much steadier and more reliable uncertainty results. The UNICORN code has been verified against TSUNAMI-1D by analyzing the case of TMI-1 pin-cell. The numerical results show that the total uncertainty of eigenvalue caused by cross sections can reach up to be about 0.72%. Therefore the contributions of the basic cross sections and their implicit effects are not negligible
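
    The bootstrap step mentioned in the abstract can be illustrated on synthetic data: resample the sampled outputs with replacement to obtain a steadier uncertainty estimate together with a confidence band. The k-eff values below are made up (their spread merely echoes the 0.72% figure); this is a sketch, not UNICORN.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for k-eff values from 100 statistically sampled runs;
# the ~0.72% spread echoes the abstract but the data are invented.
k_eff = rng.normal(1.000, 0.0072, size=100)

# Bootstrap the relative standard deviation: resample with replacement,
# recompute the statistic, and summarize the bootstrap distribution.
n_boot = 2000
boots = np.empty(n_boot)
for i in range(n_boot):
    resample = rng.choice(k_eff, size=k_eff.size, replace=True)
    boots[i] = resample.std(ddof=1) / resample.mean() * 100.0  # percent

estimate = boots.mean()
lo, hi = np.percentile(boots, [2.5, 97.5])
```

    The band (lo, hi) makes explicit how reliable the uncertainty estimate itself is, which is the stability argument made in the abstract.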

  8. Uncertainty and sensitivity analysis applied to coupled code calculations for a VVER plant transient

    International Nuclear Information System (INIS)

    Langenbuch, S.; Krzykacz-Hausmann, B.; Schmidt, K. D.

    2004-01-01

    The development of coupled codes, combining thermal-hydraulic system codes and 3D neutron kinetics, is an important step to perform best-estimate plant transient calculations. It is generally agreed that the application of best-estimate methods should be supplemented by an uncertainty and sensitivity analysis to quantify the uncertainty of the results. The paper presents results from the application of the GRS uncertainty and sensitivity method for a VVER-440 plant transient, which was already studied earlier for the validation of coupled codes. For this application, the main steps of the uncertainty method are described. Typical results of the method applied to the analysis of the plant transient by several working groups using different coupled codes are presented and discussed. The results demonstrate the capability of an uncertainty and sensitivity analysis. (authors)

  9. Code development of total sensitivity and uncertainty analysis for reactor physics calculations

    International Nuclear Information System (INIS)

    Wan, C.; Cao, L.; Wu, H.; Zu, T.; Shen, W.

    2015-01-01

    Sensitivity and uncertainty (S&U) analysis is an essential part of risk and policy analysis for reactor systems. In this study, total sensitivity and corresponding uncertainty analysis for the responses of neutronics calculations have been accomplished through the development of the S&U analysis code named UNICORN. The UNICORN code can consider the implicit effects of multigroup cross sections on the responses. The UNICORN code has been applied to a typical pin-cell case in this paper, and its correctness has been demonstrated by comparing the results with those of the TSUNAMI-1D code. (author)

  10. Code development of total sensitivity and uncertainty analysis for reactor physics calculations

    Energy Technology Data Exchange (ETDEWEB)

    Wan, C.; Cao, L.; Wu, H.; Zu, T., E-mail: chenghuiwan@stu.xjtu.edu.cn, E-mail: caolz@mail.xjtu.edu.cn, E-mail: hongchun@mail.xjtu.edu.cn, E-mail: tiejun@mail.xjtu.edu.cn [Xi'an Jiaotong Univ., School of Nuclear Science and Technology, Xi'an (China); Shen, W., E-mail: Wei.Shen@cnsc-ccsn.gc.ca [Xi'an Jiaotong Univ., School of Nuclear Science and Technology, Xi'an (China); Canadian Nuclear Safety Commission, Ottawa, ON (Canada)

    2015-07-01

    Sensitivity and uncertainty (S&U) analysis is an essential part of risk and policy analysis for reactor systems. In this study, total sensitivity and corresponding uncertainty analysis for the responses of neutronics calculations have been accomplished through the development of the S&U analysis code named UNICORN. The UNICORN code can consider the implicit effects of multigroup cross sections on the responses. The UNICORN code has been applied to a typical pin-cell case in this paper, and its correctness has been demonstrated by comparing the results with those of the TSUNAMI-1D code. (author)

  11. Uncertainty and sensitivity analysis in the scenario simulation with RELAP/SCDAP and MELCOR codes

    International Nuclear Information System (INIS)

    Garcia J, T.; Cardenas V, J.

    2015-09-01

    A methodology was implemented for uncertainty analysis in simulations of scenarios with the RELAP/SCDAP V-3.4 bi-7 and MELCOR V-2.1 codes, which are used to perform safety analysis in the Comision Nacional de Seguridad Nuclear y Salvaguardias (CNSNS). The uncertainty analysis methodology chosen is a probabilistic method of the propagation type, in which the uncertainty of the input parameters is propagated to the output parameters. Therefore, the analysis began with the selection of the input parameters considered uncertain and of high importance in the scenario because of their direct effect on the output variable of interest. These parameters were randomly sampled according to intervals of variation or probability distribution functions assigned by expert judgment, to generate a set of input files that were run through the simulation code to propagate the uncertainty to the output parameters. Then, through the use of order statistics and the Wilks formula, it was determined that the minimum number of executions required to obtain uncertainty bands that include a population of 95% at a confidence level of 95% is 93; it is important to mention that in this method the number of executions does not depend on the number of selected input parameters. Routines in Fortran 90 were implemented to automate the uncertainty analysis of transients with the RELAP/SCDAP code. In the case of the MELCOR code for severe accident analysis, automation was carried out through the Dakota Uncertainty plug-in incorporated into the SNAP platform. To test the practical application of this methodology, two analyses were performed: the first was the simulation of a closure transient of the main steam isolation valves using the RELAP/SCDAP code, obtaining the uncertainty band of the vessel dome pressure; in the second analysis, the simulation of a station blackout (SBO) accident was carried out with the MELCOR code, obtaining the uncertainty band for the
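
    The run counts quoted in such studies follow directly from the Wilks formula for non-parametric tolerance limits. A minimal sketch (gamma is the probability content, beta the confidence level):

```python
# Minimum number of code runs given by the Wilks formula for
# non-parametric tolerance limits.
def wilks_one_sided(gamma=0.95, beta=0.95):
    # Smallest n with P(max of n runs bounds gamma of population) >= beta.
    n = 1
    while 1.0 - gamma ** n < beta:
        n += 1
    return n

def wilks_two_sided(gamma=0.95, beta=0.95):
    # Smallest n for a two-sided band from the sample min and max.
    n = 2
    while 1.0 - gamma ** n - n * (1.0 - gamma) * gamma ** (n - 1) < beta:
        n += 1
    return n
```

    `wilks_two_sided()` returns 93, the number cited in the abstract for two-sided 95%/95% bands; the corresponding one-sided limit is 59. Neither depends on the number of uncertain inputs, which is the method's main attraction.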

  12. SENSIT: a cross-section and design sensitivity and uncertainty analysis code

    International Nuclear Information System (INIS)

    Gerstl, S.A.W.

    1980-01-01

    SENSIT computes the sensitivity and uncertainty of a calculated integral response (such as a dose rate) due to input cross sections and their uncertainties. Sensitivity profiles are computed for neutron and gamma-ray reaction cross sections of standard multigroup cross section sets and for secondary energy distributions (SEDs) of multigroup scattering matrices. In the design sensitivity mode, SENSIT computes changes in an integral response due to design changes and gives the appropriate sensitivity coefficients. Cross section uncertainty analyses are performed for three types of input data uncertainties: cross-section covariance matrices for pairs of multigroup reaction cross sections, spectral shape uncertainty parameters for secondary energy distributions (integral SED uncertainties), and covariance matrices for energy-dependent response functions. For all three types of data uncertainties SENSIT computes the resulting variance and estimated standard deviation in an integral response of interest, on the basis of generalized perturbation theory. SENSIT attempts to be more comprehensive than earlier sensitivity analysis codes, such as SWANLAKE
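
    The covariance propagation described above reduces, at first order in perturbation theory, to the familiar "sandwich rule" var(R) = S C Sᵀ. A sketch with invented numbers (three energy groups, relative covariances in %²); this illustrates the formula only, not SENSIT's input format:

```python
import numpy as np

# Sandwich rule: variance of an integral response R from a vector S of
# relative sensitivity coefficients (dR/R per dsigma/sigma) and a relative
# covariance matrix C of the cross sections. All numbers are made up.
S = np.array([0.8, -0.3, 0.1])
C = np.array([[4.0, 1.0, 0.0],
              [1.0, 9.0, 2.0],
              [0.0, 2.0, 1.0]])   # must be symmetric positive semidefinite

var_R = S @ C @ S                 # relative variance of R, in %^2
std_R = np.sqrt(var_R)            # estimated relative std-dev of R, in %
```

    The off-diagonal covariance entries are what distinguish this from simply summing squared sensitivity-weighted uncertainties.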

  13. Sensitivity Analysis and Uncertainty Quantification for the LAMMPS Molecular Dynamics Simulation Code

    Energy Technology Data Exchange (ETDEWEB)

    Picard, Richard Roy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Bhat, Kabekode Ghanasham [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-07-18

    We examine sensitivity analysis and uncertainty quantification for molecular dynamics simulation. Extreme (large or small) output values for the LAMMPS code often occur at the boundaries of input regions, and uncertainties in those boundary values are overlooked by common SA methods. Similarly, input values for which code outputs are consistent with calibration data can also occur near boundaries. Upon applying approaches in the literature for imprecise probabilities (IPs), much more realistic results are obtained than for the complacent application of standard SA and code calibration.

  14. Uncertainty analysis for results of thermal hydraulic codes of best-estimate-type

    International Nuclear Information System (INIS)

    Alva N, J.

    2010-01-01

    In this thesis, some fundamental knowledge is presented about uncertainty analysis and about the diverse methodologies applied in the study of nuclear power plant transient events, particularly those related to thermal hydraulics phenomena. These concepts and methodologies come from a wide bibliographical research in the nuclear power subject. Methodologies for uncertainty analysis have been developed by quite diverse institutions, and they have been widely used worldwide for application to results from best-estimate-type computer codes in nuclear reactor thermal hydraulics and safety analysis. Also, the main uncertainty sources, types of uncertainties, and aspects related to best-estimate modeling and methods are introduced. Once the main bases of uncertainty analysis have been set, and some of the known methodologies have been introduced, the CSAU methodology, which will be applied in the analyses, is presented in detail. The main objective of this thesis is to compare the results of an uncertainty and sensitivity analysis using the Response Surface Technique with those obtained by applying the Wilks formula, for a loss-of-coolant experiment and a power-rise event in a BWR. Both techniques are options for the uncertainty and sensitivity part of the CSAU methodology, which was developed for the analysis of transients and accidents at nuclear power plants, and which is the basis of most of the methodologies used in licensing of nuclear power plants practically everywhere. Finally, the results of applying both techniques are compared and discussed. (Author)

  15. Verification of the thermal module in the ELESIM code and the associated uncertainty analysis

    International Nuclear Information System (INIS)

    Arimescu, V.I.; Williams, A.F.; Klein, M.E.; Richmond, W.R.; Couture, M.

    1997-09-01

    Temperature is a critical parameter in fuel modelling because most of the physical processes that occur in fuel elements during irradiation are thermally activated. The focus of this paper is the temperature distribution calculation used in the computer code ELESIM, developed at AECL to model the steady-state behaviour of CANDU fuel. A validation procedure for fuel codes is described and applied to ELESIM's thermal calculation. The effects of uncertainties in model parameters, like uranium dioxide thermal conductivity, and input variables, such as fuel element linear power, are accounted for through an uncertainty analysis using Response Surface and Monte Carlo techniques.

  16. Using finite mixture models in thermal-hydraulics system code uncertainty analysis

    Energy Technology Data Exchange (ETDEWEB)

    Carlos, S., E-mail: scarlos@iqn.upv.es [Department d’Enginyeria Química i Nuclear, Universitat Politècnica de València, Camí de Vera s.n, 46022 València (Spain); Sánchez, A. [Department d’Estadística Aplicada i Qualitat, Universitat Politècnica de València, Camí de Vera s.n, 46022 València (Spain); Ginestar, D. [Department de Matemàtica Aplicada, Universitat Politècnica de València, Camí de Vera s.n, 46022 València (Spain); Martorell, S. [Department d’Enginyeria Química i Nuclear, Universitat Politècnica de València, Camí de Vera s.n, 46022 València (Spain)

    2013-09-15

    Highlights: • Best-estimate code simulations need uncertainty quantification. • The output variables can present multimodal probability distributions. • The analysis of multimodal distributions is performed using finite mixture models. • Two methods to reconstruct the output variable probability distribution are used. -- Abstract: Nuclear Power Plant safety analysis is mainly based on the use of best estimate (BE) codes that predict the plant behavior under normal or accidental conditions. As the BE codes introduce uncertainties due to uncertainty in input parameters and modeling, it is necessary to perform uncertainty assessment (UA), and eventually sensitivity analysis (SA), of the results obtained. These analyses are part of the appropriate treatment of uncertainties imposed by current regulation based on the adoption of the best estimate plus uncertainty (BEPU) approach. The most popular approach for uncertainty assessment, based on Wilks’ method, obtains a tolerance/confidence interval, but it does not completely characterize the output variable behavior, which is required for an extended UA and SA. However, the development of standard UA and SA imposes a high computational cost due to the large number of simulations needed. In order to obtain more information about the output variable and, at the same time, to keep the computational cost as low as possible, there has been a recent shift toward developing metamodels (models of models), or surrogate models, that approximate or emulate complex computer codes. In this way, there exist different techniques to reconstruct the probability distribution using the information provided by a sample of values, such as, for example, finite mixture models. In this paper, the Expectation Maximization and the k-means algorithms are used to obtain a finite mixture model that reconstructs the output variable probability distribution from data obtained with RELAP-5 simulations. Both methodologies have been applied to a separated
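
    A finite-mixture reconstruction of a bimodal output can be sketched with a hand-rolled two-component Gaussian EM. The data below are synthetic, standing in for code results that cluster in two regimes; none of this reproduces the paper's RELAP-5 cases.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic bimodal "code output": e.g., a peak temperature that falls in
# one of two regimes depending on the sampled inputs (numbers invented).
data = np.concatenate([rng.normal(600.0, 15.0, 300),
                       rng.normal(750.0, 25.0, 200)])

# Minimal 1-D, two-component Gaussian EM.
w = np.array([0.5, 0.5])           # mixture weights
mu = np.array([550.0, 800.0])      # initial component means
sd = np.array([50.0, 50.0])        # initial component std-devs
for _ in range(200):
    # E-step: responsibility of each component for each point.
    pdf = np.exp(-0.5 * ((data[:, None] - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))
    r = w * pdf
    r /= r.sum(axis=1, keepdims=True)
    # M-step: re-estimate weights, means, and std-devs.
    n_k = r.sum(axis=0)
    w = n_k / data.size
    mu = (r * data[:, None]).sum(axis=0) / n_k
    sd = np.sqrt((r * (data[:, None] - mu) ** 2).sum(axis=0) / n_k)
```

    After convergence the fitted (w, mu, sd) describe the full bimodal shape of the output, which a single tolerance interval cannot.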

  17. An estimation of uncertainties in containment P/T analysis using CONTEMPT/LT code

    International Nuclear Information System (INIS)

    Kang, Y.M.; Park, G.C.; Lee, U.C.; Kang, C.S.

    1991-01-01

    In a nuclear power plant, the containment design pressure and temperature (P/T) have been established based on unrealistic conservatism, with an attendant economic penalty. Thus, it is necessary that the uncertainties of the design P/T values be well defined through an extensive uncertainty analysis with plant-specific input data and/or models used in the computer code. This study estimates plant-specific uncertainties of the containment design P/T using the Monte Carlo method for the Kori-3 reactor. Kori-3 plant parameters and the Uchida heat transfer coefficient were selected for statistical treatment after the sensitivity study. The Monte Carlo analysis has been performed based on the response surface method with the CONTEMPT/LT code and the Latin hypercube sampling technique. Finally, the design values based on 95%/95% probability are compared with worst estimated values to assess the design margin. (author)
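
    The Latin hypercube sampling step can be sketched in a few lines: split each input's range into one stratum per run, draw one point per stratum, and pair the strata at random across dimensions. A minimal sampler on the unit hypercube (an illustration of the technique, not the code used in the study):

```python
import numpy as np

rng = np.random.default_rng(3)

# Minimal Latin hypercube sampler on [0, 1)^d: each dimension is split
# into n_runs equal strata, one point is drawn per stratum, and the
# strata are paired at random across dimensions.
def latin_hypercube(n_runs, n_dims, rng):
    u = (rng.random((n_runs, n_dims)) + np.arange(n_runs)[:, None]) / n_runs
    for j in range(n_dims):
        u[:, j] = u[rng.permutation(n_runs), j]   # shuffle strata pairing
    return u

samples = latin_hypercube(20, 2, rng)
```

    Mapping the unit-cube samples through each input's inverse CDF yields the actual parameter values; the stratification guarantees every run explores a distinct slice of each input's range.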

  18. Coupled code analysis of uncertainty and sensitivity of Kalinin-3 benchmark

    Energy Technology Data Exchange (ETDEWEB)

    Pasichnyk, Ihor; Zwermann, Winfried; Velkov, Kiril [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) gGmbH, Garching (Germany); Nikonov, Sergey [VNIIAES, Moscow (Russian Federation)

    2016-09-15

    An uncertainty and sensitivity analysis is performed for the OECD/NEA coolant transient Benchmark (K-3) on measured data at Kalinin-3 Nuclear Power Plant (NPP). A switch-off of one main coolant pump (MCP) at nominal reactor power is calculated using the coupled thermal-hydraulic and neutron-kinetic ATHLET-PARCS code. The objectives are to study the uncertainty of total reactor power and to identify the main sources of reactor power uncertainty. The GRS uncertainty and sensitivity software package XSUSA is applied to propagate uncertainties in nuclear data libraries to the full core coupled transient calculations. A set of the most important thermal-hydraulic parameters of the primary circuit is identified, and a total of 23 thermal-hydraulic parameters are statistically varied using the GRS code SUSA. The ATHLET model also contains a balance-of-plant (BOP) model which is simulated using the ATHLET GCSM module. In particular, the operation of the main steam generator regulators is modelled in detail. A set of 200 varied coupled ATHLET-PARCS calculations is analyzed. The results obtained show a clustering effect in the behavior of global reactor parameters. It is found that the GCSM system together with the varied input parameters strongly influences the overall nuclear power plant behavior and can even lead to a new scenario. Possible reasons for the clustering effect are discussed in the paper. This work is a step forward in establishing a "best-estimate calculations in combination with performing uncertainty analysis" methodology for coupled full core calculations.

  19. BWR transient analysis using neutronic / thermal hydraulic coupled codes including uncertainty quantification

    International Nuclear Information System (INIS)

    Hartmann, C.; Sanchez, V.; Tietsch, W.; Stieglitz, R.

    2012-01-01

    The KIT is involved in the development and qualification of best-estimate methodologies for BWR transient analysis in cooperation with industrial partners. The goal is to establish the most advanced thermal-hydraulic system codes coupled with 3D reactor dynamics codes to be able to perform a more realistic evaluation of BWR behavior under accidental conditions. For this purpose a computational chain based on the lattice code (SCALE6/GenPMAXS), the coupled neutronic/thermal-hydraulic code (TRACE/PARCS), and a Monte Carlo-based uncertainty and sensitivity package (SUSA) has been established and applied to different kinds of transients of a Boiling Water Reactor (BWR). This paper describes the multidimensional models of the plant elaborated for TRACE and PARCS to perform the investigations mentioned before. For the uncertainty quantification of the coupled code TRACE/PARCS, and specifically to take into account the influence of the kinetics parameters in such studies, the PARCS code has been extended to facilitate the change of model parameters in such a way that the SUSA package can be used in connection with TRACE/PARCS for the uncertainty and sensitivity studies. This approach is presented in detail. The results obtained for a rod drop transient with TRACE/PARCS using the SUSA methodology clearly showed the importance of some kinetic parameters on the transient progression, demonstrating that the coupling of best-estimate coupled codes with uncertainty and sensitivity tools is very promising and of great importance for the safety assessment of nuclear reactors. (authors)

  20. Total sensitivity and uncertainty analysis for LWR pin-cells with improved UNICORN code

    International Nuclear Information System (INIS)

    Wan, Chenghui; Cao, Liangzhi; Wu, Hongchun; Shen, Wei

    2017-01-01

    Highlights: • A new model is established for the total sensitivity and uncertainty analysis. • The NR approximation applied in S&U analysis can be avoided by the new model. • Sensitivity and uncertainty analysis is performed for PWR pin-cells by the new model. • The effects of the NR approximation for the PWR pin-cells are quantified. - Abstract: In this paper, improvements to the multigroup cross-section perturbation model have been proposed and applied in the self-developed UNICORN code, which is capable of performing the total sensitivity and total uncertainty analysis for neutron-physics calculations by applying the direct numerical perturbation method and the statistical sampling method respectively. The narrow resonance (NR) approximation was applied in the multigroup cross-section perturbation model implemented in UNICORN. As an improvement on the NR approximation to refine the multigroup cross-section perturbation model, an ultrafine-group cross-section perturbation model has been established, in which the actual perturbations are applied to the ultrafine-group cross-section library and the reconstructions of the resonance cross sections are performed by solving the neutron slowing-down equation. The total sensitivity and total uncertainty analyses were then applied to the LWR pin-cells, using both the multigroup and the ultrafine-group cross-section perturbation models. The numerical results show that the NR approximation overestimates the relative sensitivity coefficients and the corresponding uncertainty results for the LWR pin-cells, and the effects of the NR approximation are significant for σ(n,γ) and σ(n,elas) of 238U. Therefore, the effects of the NR approximation applied in the total sensitivity and total uncertainty analysis for the neutron-physics calculations of LWRs should be taken into account.
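
    The direct numerical perturbation method named above amounts to a finite-difference estimate of the relative sensitivity S = (dR/R)/(dσ/σ). A sketch with a toy one-group k-infinity response (the response function and numbers are invented, not UNICORN's model):

```python
# Direct numerical perturbation: relative sensitivity of a response R to
# a parameter sigma, estimated by a central difference of relative
# perturbations. The toy response is nu * sigma_f / sigma_a.
def k_inf(sigma_f, sigma_a):
    nu = 2.43
    return nu * sigma_f / sigma_a

def rel_sensitivity(f, x0, i, h=1e-4):
    # Perturb parameter i by +/- h (relatively) and difference the response.
    xp, xm = list(x0), list(x0)
    xp[i] *= 1.0 + h
    xm[i] *= 1.0 - h
    return (f(*xp) - f(*xm)) / (2.0 * h * f(*x0))

x0 = (0.005, 0.012)
S_f = rel_sensitivity(k_inf, x0, 0)   # k proportional to sigma_f
S_a = rel_sensitivity(k_inf, x0, 1)   # k proportional to 1/sigma_a
```

    For this toy response the exact relative sensitivities are +1 and -1, so the finite-difference estimates serve as a built-in sanity check of the perturbation machinery.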

  1. Qualification and application of nuclear reactor accident analysis code with the capability of internal assessment of uncertainty

    International Nuclear Information System (INIS)

    Borges, Ronaldo Celem

    2001-10-01

    This thesis presents an independent qualification of the CIAU code ('Code with the capability of Internal Assessment of Uncertainty'), which is part of the internal uncertainty evaluation process with a thermal hydraulic system code on a realistic basis. This is done by combining the uncertainty methodology UMAE ('Uncertainty Methodology based on Accuracy Extrapolation') with the RELAP5/Mod3.2 code. This allows associating uncertainty band estimates with the results obtained by the realistic calculation of the code, meeting licensing requirements of safety analysis. The independent qualification is supported by simulations with RELAP5/Mod3.2 related to accident condition tests of the LOBI experimental facility and to an event which occurred in the Angra 1 nuclear power plant, by comparison with measured results and by establishing uncertainty bands on calculated time trends of safety parameters. These bands have indeed enveloped the measured trends. Results from this independent qualification of CIAU have made it possible to ascertain the adequate application of a systematic realistic code procedure to analyse accidents with uncertainties incorporated in the results, although there is an evident need to extend the uncertainty database. It has been verified that use of the code with this internal assessment of uncertainty is feasible in the design and licensing stages of an NPP. (author)

  2. Uncertainty analysis methods for quantification of source terms using a large computer code

    International Nuclear Information System (INIS)

    Han, Seok Jung

    1997-02-01

    Quantification of uncertainties in the source term estimations by a large computer code, such as MELCOR and MAAP, is an essential process of current probabilistic safety assessments (PSAs). The main objectives of the present study are (1) to investigate the applicability of a combined procedure of the response surface method (RSM), based on input determined from a statistical design, and the Latin hypercube sampling (LHS) technique for the uncertainty analysis of CsI release fractions under a hypothetical severe accident sequence of a station blackout at the Young-Gwang nuclear power plant, using the MAAP3.0B code as a benchmark problem; and (2) to propose a new measure of uncertainty importance based on distributional sensitivity analysis. On the basis of the results obtained in the present work, the RSM is recommended to be used as a principal tool for an overall uncertainty analysis in source term quantifications, while using the LHS in the calculations of standardized regression coefficients (SRC) and standardized rank regression coefficients (SRRC) to determine the subset of the most important input parameters in the final screening step and to check the cumulative distribution functions (cdfs) obtained by the RSM. Verification of the response surface model for its sufficient accuracy is a prerequisite for the reliability of the final results obtained by the combined procedure proposed in the present work. In the present study a new measure has been developed to utilize the metric distance obtained from cumulative distribution functions (cdfs). The measure has been evaluated for three different cases of distributions in order to assess its characteristics: in the first and second cases the distributions are known analytical distributions, while in the third case the distribution is unknown. The first case is given by symmetric analytical distributions. The second case consists of two asymmetric distributions whose skewness is nonzero
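
    The SRC/SRRC screening step described above can be sketched as a least-squares fit on standardized (or rank-transformed) data. The sample below is synthetic and the model invented; it illustrates the two coefficient types only.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy sample standing in for LHS code runs: the output depends strongly
# on input 0, moderately on input 1, and not at all on input 2.
n = 500
X = rng.normal(size=(n, 3))
y = 4.0 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(0.0, 0.5, n)

def standardized_coeffs(X, y):
    # Regress standardized output on standardized inputs.
    Xs = (X - X.mean(0)) / X.std(0)
    ys = (y - y.mean()) / y.std()
    beta, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
    return beta

def ranks(a):
    # Replace values by their ranks (per column for 2-D input).
    return np.argsort(np.argsort(a, axis=0), axis=0).astype(float)

src = standardized_coeffs(X, y)                  # SRC
srrc = standardized_coeffs(ranks(X), ranks(y))   # SRRC
```

    SRCs measure linear influence in the original scale; SRRCs, computed on ranks, remain informative when the input-output relation is monotonic but nonlinear.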

  3. Users manual for the FORSS sensitivity and uncertainty analysis code system

    International Nuclear Information System (INIS)

    Lucius, J.L.; Weisbin, C.R.; Marable, J.H.; Drischler, J.D.; Wright, R.Q.; White, J.E.

    1981-01-01

    FORSS is a code system used to study relationships between nuclear reaction cross sections, integral experiments, reactor performance parameter predictions and associated uncertainties. This report describes the computing environment and the modules currently used to implement FORSS Sensitivity and Uncertainty Methodology

  5. Uncertainty analysis in the simulation of an HPGe detector using the Monte Carlo Code MCNP5

    International Nuclear Information System (INIS)

    Gallardo, Sergio; Pozuelo, Fausto; Querol, Andrea; Verdu, Gumersindo; Rodenas, Jose; Ortiz, J.; Pereira, Claubia

    2013-01-01

    A gamma spectrometer including an HPGe detector is commonly used for environmental radioactivity measurements. Many works have focused on the simulation of the HPGe detector using Monte Carlo codes such as MCNP5. However, the simulation of this kind of detector presents important difficulties due to the lack of information from manufacturers and to the loss of intrinsic properties in aging detectors. Some parameters, such as the active volume or the Ge dead layer thickness, are often unknown and are estimated during simulations. In this work, a detailed model of an HPGe detector and a petri dish containing a certified gamma source has been developed. The certified gamma source contains nuclides covering the energy range between 50 and 1800 keV. As a result of the simulation, the Pulse Height Distribution (PHD) is obtained, and the efficiency curve can be calculated from net peak areas, taking into account the certified activity of the source. In order to avoid errors in the net area calculation, the simulated PHD is treated using the GammaVision software. Furthermore, it is proposed to use the Noether-Wilks formula to perform an uncertainty analysis of the model, with the main goal of determining the efficiency curve of this detector and its associated uncertainty. The uncertainty analysis has been focused on the dead layer thickness at different positions of the crystal. Results confirm the important role of the dead layer thickness in the low energy range of the efficiency curve. In the high energy range (from 300 to 1800 keV) the main contribution to the absolute uncertainty is due to variations in the active volume. (author)

  7. Sensitivity Analysis of Uncertainty Parameter based on MARS-LMR Code on SHRT-45R of EBR II

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Seok-Ju; Kang, Doo-Hyuk; Seo, Jae-Seung [System Engineering and Technology Co., Daejeon (Korea, Republic of); Bae, Sung-Won [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Jeong, Hae-Yong [Sejong University, Seoul (Korea, Republic of)

    2016-10-15

    In order to assess the uncertainty quantification of the MARS-LMR code, the code has been improved by modifying the source code to accommodate the calculation process required for uncertainty quantification. In the present study, a transient of Unprotected Loss of Flow (ULOF) is selected as a typical case of Anticipated Transient without Scram (ATWS), which belongs to the DEC category. The MARS-LMR input generation for EBR-II SHRT-45R and the execution work are performed using the PAPIRUS program. A sensitivity analysis is carried out on the uncertainty parameters of the MARS-LMR code for EBR-II SHRT-45R. Based on the results of the sensitivity analysis, dominant parameters with large sensitivity to the figure of merit (FoM) are picked out. The dominant parameters selected are closely related to the progression of the ULOF event.

  8. SENSIT: a cross-section and design sensitivity and uncertainty analysis code. [In FORTRAN for CDC-7600, IBM 360]

    Energy Technology Data Exchange (ETDEWEB)

    Gerstl, S.A.W.

    1980-01-01

    SENSIT computes the sensitivity and uncertainty of a calculated integral response (such as a dose rate) due to input cross sections and their uncertainties. Sensitivity profiles are computed for neutron and gamma-ray reaction cross sections of standard multigroup cross section sets and for secondary energy distributions (SEDs) of multigroup scattering matrices. In the design sensitivity mode, SENSIT computes changes in an integral response due to design changes and gives the appropriate sensitivity coefficients. Cross section uncertainty analyses are performed for three types of input data uncertainties: cross-section covariance matrices for pairs of multigroup reaction cross sections, spectral shape uncertainty parameters for secondary energy distributions (integral SED uncertainties), and covariance matrices for energy-dependent response functions. For all three types of data uncertainties SENSIT computes the resulting variance and estimated standard deviation in an integral response of interest, on the basis of generalized perturbation theory. SENSIT attempts to be more comprehensive than earlier sensitivity analysis codes, such as SWANLAKE.
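    The cross-section uncertainty propagation SENSIT performs can be illustrated by the usual first-order "sandwich" rule, var(R) = S^T C S, where S is the sensitivity profile and C the covariance matrix. The three-group numbers below are invented for the sketch:

```python
import numpy as np

def response_variance(sensitivity, covariance):
    """First-order ("sandwich") uncertainty propagation: var(R) = S^T C S."""
    s = np.asarray(sensitivity, float)
    c = np.asarray(covariance, float)
    return float(s @ c @ s)

# Toy 3-group relative sensitivity profile and cross-section covariance matrix.
S = np.array([0.5, -0.2, 0.1])
C = np.array([[0.04, 0.01, 0.00],
              [0.01, 0.09, 0.00],
              [0.00, 0.00, 0.01]])

var_R = response_variance(S, C)   # variance of the integral response
std_R = var_R ** 0.5              # estimated standard deviation
```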

  9. Quantifying reactor safety margins: Application of CSAU [Code Scalability, Applicability and Uncertainty] methodology to LBLOCA: Part 3, Assessment and ranging of parameters for the uncertainty analysis of LBLOCA codes

    International Nuclear Information System (INIS)

    Wulff, W.; Boyack, B.E.; Duffey, R.B.

    1988-01-01

    Comparisons of results from TRAC-PF1/MOD1 code calculations with measurements from Separate Effects Tests, and published experimental data for modeling parameters have been used to determine the uncertainty ranges of code input and modeling parameters which dominate the uncertainty in predicting the Peak Clad Temperature for a postulated Large Break Loss of Coolant Accident (LBLOCA) in a four-loop Westinghouse Pressurized Water Reactor. The uncertainty ranges are used for a detailed statistical analysis to calculate the probability distribution function for the TRAC code-predicted Peak Clad Temperature, as is described in an attendant paper. Measurements from Separate Effects Tests and Integral Effects Tests have been compared with results from corresponding TRAC-PF1/MOD1 code calculations to determine globally the total uncertainty in predicting the Peak Clad Temperature for LBLOCAs. This determination is in support of the detailed statistical analysis mentioned above. The analyses presented here account for uncertainties in input parameters, in modeling and scaling, in computing and in measurements. The analyses are an important part of the work needed to implement the Code Scalability, Applicability and Uncertainty (CSAU) methodology. CSAU is needed to determine the suitability of a computer code for reactor safety analyses and the uncertainty in computer predictions. The results presented here are used to estimate the safety margin of a particular nuclear reactor power plant for a postulated accident. 25 refs., 10 figs., 11 tabs

  10. Validation and uncertainty analysis of the Athlet thermal-hydraulic computer code

    International Nuclear Information System (INIS)

    Glaeser, H.

    1995-01-01

    The computer code ATHLET is being developed by GRS as an advanced best-estimate code for the simulation of breaks and transients in Pressurized Water Reactors (PWRs) and Boiling Water Reactors (BWRs), including beyond-design-basis accidents. A systematic validation of ATHLET is based on a well balanced set of integral and separate effects tests, emphasizing the German combined Emergency Core Cooling (ECC) injection system. When best-estimate codes are used to predict reactor plant states during assumed accidents, quantification of the uncertainty in these calculations is highly desirable. A method for uncertainty and sensitivity evaluation has been developed by GRS in which the computational effort is independent of the number of uncertain parameters. (author)

  11. May Day: A computer code to perform uncertainty and sensitivity analysis. Manuals

    International Nuclear Information System (INIS)

    Bolado, R.; Alonso, A.; Moya, J.M.

    1996-07-01

    The computer program May Day was developed by the Polytechnical University of Madrid to carry out uncertainty and sensitivity analysis in the evaluation of radioactive waste storage. (Author)

  12. Uncertainty analysis guide

    International Nuclear Information System (INIS)

    Andres, T.H.

    2002-05-01

    This guide applies to the estimation of uncertainty in quantities calculated by scientific analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide to the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner in which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the way in which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)
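    Two of the propagation options listed above, a first-order series approximation and Monte Carlo sampling, can be compared on a toy model; the model function and input distributions below are invented for the sketch:

```python
import numpy as np

rng = np.random.default_rng(42)

def model(x1, x2):
    """Stand-in for a code-calculated output quantity."""
    return x1 ** 2 / x2

mu1, s1 = 10.0, 0.5   # input 1: mean and standard deviation
mu2, s2 = 2.0, 0.1    # input 2

# Monte Carlo propagation: sample the inputs, push each sample through.
x1 = rng.normal(mu1, s1, 100_000)
x2 = rng.normal(mu2, s2, 100_000)
mc_std = model(x1, x2).std()

# First-order series approximation: var(y) ~ (dy/dx1*s1)^2 + (dy/dx2*s2)^2
dy_dx1 = 2 * mu1 / mu2            # partial derivatives at the input means
dy_dx2 = -mu1 ** 2 / mu2 ** 2
series_std = np.hypot(dy_dx1 * s1, dy_dx2 * s2)
```

    For this mildly nonlinear model the two estimates agree to within a couple of percent; the Monte Carlo result also captures the higher-order terms the series drops.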

  14. Post-test calculation and uncertainty analysis of the experiment QUENCH-07 with the system code ATHLET-CD

    International Nuclear Information System (INIS)

    Austregesilo, Henrique; Bals, Christine; Trambauer, Klaus

    2007-01-01

    In the frame of developmental assessment and code validation, a post-test calculation of the test QUENCH-07 was performed with ATHLET-CD. The system code ATHLET-CD is being developed for best-estimate simulation of accidents with core degradation and for the evaluation of accident management procedures. It applies the detailed models of the thermal-hydraulic code ATHLET in an efficient coupling with dedicated models for core degradation and fission product behaviour. The first step of the work was the simulation of the test QUENCH-07 applying the modelling options recommended in the code User's Manual (reference calculation). The global results of this calculation showed good agreement with the measured data. This calculation was complemented by a sensitivity analysis in order to investigate the influence of a combined variation of code input parameters on the simulation of the main phenomena observed experimentally. Results of this sensitivity analysis indicate that the main experimental measurements lie within the uncertainty range of the corresponding calculated values. Among the main contributors to the uncertainty of the code results are the heat transfer coefficient for forced convection to the superheated steam-argon mixture, the thermal conductivity of the shroud insulation and the external heater rod resistance. Uncertainties in the modelling of B4C oxidation do not significantly affect the total calculated hydrogen release rates.

  15. The sensitivity analysis by adjoint method for the uncertainty evaluation of the CATHARE-2 code

    Energy Technology Data Exchange (ETDEWEB)

    Barre, F.; de Crecy, A.; Perret, C. [French Atomic Energy Commission (CEA), Grenoble (France)

    1995-09-01

    This paper presents the application of the DASM (Discrete Adjoint Sensitivity Method) to the CATHARE 2 thermal-hydraulics code. In the first part, the basis of this method is presented. The mathematical model of the CATHARE 2 code is based on the two-fluid six-equation model. It is discretized using implicit time discretization, and it is relatively easy to implement this method in the code. The DASM is the ASM applied directly to the algebraic system of the discretized code equations, which has been demonstrated to be the only solution of the mathematical model. The ASM is an integral part of the new version 1.4 of CATHARE. It acts as a post-processing module. It has been qualified by comparison with the "brute force" technique. In the second part, an application of the DASM in CATHARE 2 is presented. It deals with the determination of the uncertainties of the constitutive relationships, which is a compulsory step for calculating the final uncertainty of a given response. First, the general principles of the method are explained: the constitutive relationships are represented by several parameters, and the aim is to calculate the variance-covariance matrix of these parameters. The experimental results of the separate effects tests used to establish the correlations are considered. The variances of the corresponding results calculated by CATHARE are estimated by comparing experiment and calculation. A DASM calculation is carried out to provide the derivatives of the responses. The final covariance matrix is obtained by combining the variances of the responses and those derivatives. Then, the application of this method to a simple case, the blowdown Canon experiment, is presented. This application has been successfully performed.

  17. Development of Evaluation Code for MUF Uncertainty

    International Nuclear Information System (INIS)

    Won, Byung Hee; Han, Bo Young; Shin, Hee Sung; Ahn, Seong-Kyu; Park, Geun-Il; Park, Se Hwan

    2015-01-01

    Material Unaccounted For (MUF) is the material balance evaluated from the measured nuclear material in a Material Balance Area (MBA). Assuming perfect measurements and no diversion from a facility, one would expect a zero MUF. However, a non-zero MUF always occurs because of measurement uncertainty, even when the facility is operating normally. Furthermore, there are many measurements made with different equipment at various Key Measurement Points (KMPs), and the MUF uncertainty is affected by the errors of all of them. Evaluating MUF uncertainty is essential for developing a safeguards system, including the nuclear measurement system for pyroprocessing, which is being developed at the Korea Atomic Energy Research Institute (KAERI) to reduce radioactive waste from spent fuel. An evaluation code for analyzing MUF uncertainty has been developed and verified using a sample problem from an IAEA reference. MUF uncertainty can be calculated simply and quickly with this evaluation code, which is built around a graphical user interface for ease of use. The code is also expected to make sensitivity analyses of the MUF uncertainty for various safeguards systems easier and more systematic. It is suitable for users who want to evaluate a conventional safeguards system as well as to develop a new one for facilities under development.
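    A minimal sketch of a material balance and its uncertainty, assuming uncorrelated measurement errors that combine in quadrature; real MUF evaluations also carry correlated systematic-error terms per KMP, and all numbers here are invented:

```python
import math

def muf(begin_inv, receipts, shipments, end_inv):
    """Material Unaccounted For over one balance period."""
    return begin_inv + receipts - shipments - end_inv

def muf_sigma(*sigmas):
    """Combine independent measurement standard deviations in quadrature."""
    return math.sqrt(sum(s * s for s in sigmas))

m = muf(begin_inv=100.0, receipts=40.0, shipments=38.0, end_inv=101.5)  # kg
sigma = muf_sigma(0.3, 0.2, 0.2, 0.3)

# A common screening test: is |MUF| large compared with its uncertainty?
significant = abs(m) > 3 * sigma
```

    Here MUF = 0.5 kg with sigma-MUF of about 0.51 kg, so the balance is consistent with pure measurement uncertainty.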

  19. Uncertainty analysis

    International Nuclear Information System (INIS)

    Thomas, R.E.

    1982-03-01

    An evaluation is made of the suitability of analytical and statistical sampling methods for making uncertainty analyses. The adjoint method is found to be well-suited for obtaining sensitivity coefficients for computer programs involving large numbers of equations and input parameters. For this purpose the Latin Hypercube Sampling method is found to be inferior to conventional experimental designs. The Latin hypercube method can be used to estimate output probability density functions, but requires supplementary rank transformations followed by stepwise regression to obtain uncertainty information on individual input parameters. A simple Cork and Bottle problem is used to illustrate the efficiency of the adjoint method relative to certain statistical sampling methods. For linear models of the form Ax=b it is shown that a complete adjoint sensitivity analysis can be made without formulating and solving the adjoint problem. This can be done either by using a special type of statistical sampling or by reformulating the primal problem and using suitable linear programming software
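    The remark about linear models Ax = b can be made concrete: for a response R = c^T x, a single adjoint solve A^T lam = c yields all the sensitivities dR/db_i = lam_i at once, whereas the brute-force route needs one forward solve per input. A sketch with an invented 2x2 system, checked against finite differences:

```python
import numpy as np

# Invented 2x2 linear model Ax = b; the response R = c.x picks out x[0].
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
b = np.array([1.0, 2.0])
c = np.array([1.0, 0.0])

x = np.linalg.solve(A, b)        # one forward solve
R = c @ x                        # the response of interest

# One adjoint solve A^T lam = c gives dR/db_i = lam_i for every input at once.
lam = np.linalg.solve(A.T, c)

# Brute-force check: perturb each b_i separately (one solve per input).
eps = 1e-6
fd = np.array([(c @ np.linalg.solve(A, b + eps * np.eye(2)[i]) - R) / eps
               for i in range(2)])
```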

  20. FERRET data analysis code

    International Nuclear Information System (INIS)

    Schmittroth, F.

    1979-09-01

    A documentation of the FERRET data analysis code is given. The code provides a way to combine related measurements and calculations in a consistent evaluation. Basically a very general least-squares code, it is oriented towards problems frequently encountered in nuclear data and reactor physics. A strong emphasis is on the proper treatment of uncertainties and correlations and in providing quantitative uncertainty estimates. Documentation includes a review of the method, structure of the code, input formats, and examples
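    The kind of least-squares combination FERRET performs can be sketched for the simplest case: repeated, correlated measurements of a single quantity merged by generalized least squares with a full covariance treatment (all values invented):

```python
import numpy as np

def gls_combine(y, cov):
    """Best linear unbiased estimate of one quantity measured n times with
    covariance matrix cov; the design matrix is a column of ones.
    Returns (estimate, standard deviation of the estimate)."""
    y = np.asarray(y, float)
    cinv = np.linalg.inv(np.asarray(cov, float))
    ones = np.ones(len(y))
    var = 1.0 / (ones @ cinv @ ones)
    est = var * (ones @ cinv @ y)
    return est, var ** 0.5

# Two correlated measurements of the same cross section (invented numbers).
est, std = gls_combine([10.0, 10.6], [[0.04, 0.01],
                                      [0.01, 0.09]])
```

    The combined value lands between the two measurements, pulled toward the more precise one, and its uncertainty is smaller than either input uncertainty.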

  1. Processing of the GALILEO fuel rod code model uncertainties within the AREVA LWR realistic thermal-mechanical analysis methodology

    International Nuclear Information System (INIS)

    Mailhe, P.; Barbier, B.; Garnier, C.; Landskron, H.; Sedlacek, R.; Arimescu, I.; Smith, M.; Bellanger, P.

    2013-01-01

    The availability of reliable tools, and of an associated methodology, able to accurately predict LWR fuel behavior in all conditions is of great importance for safe and economic fuel usage. For that purpose, AREVA has developed its new global fuel rod performance code GALILEO along with its associated realistic thermal-mechanical analysis methodology. This realistic methodology is based on Monte Carlo type random sampling of all relevant input variables. After outlining the AREVA realistic methodology, this paper focuses on the GALILEO code benchmarking process, on its extended experimental database and on the assessment of the GALILEO model uncertainties. The propagation of these model uncertainties through the AREVA realistic methodology is also presented. This processing of the GALILEO model uncertainties is of the utmost importance for accurate fuel design margin evaluation, as illustrated by some application examples. With the submittal of the GALILEO Topical Report to the U.S. NRC in 2013, GALILEO and its methodology are on the way to industrial use in a wide range of irradiation conditions. (authors)

  2. Spectrum unfolding, sensitivity analysis and propagation of uncertainties with the maximum entropy deconvolution code MAXED

    CERN Document Server

    Reginatto, M; Neumann, S

    2002-01-01

    MAXED was developed to apply the maximum entropy principle to the unfolding of neutron spectrometric measurements. The approach followed in MAXED has several features that make it attractive: it permits inclusion of a priori information in a well-defined and mathematically consistent way, the algorithm used to derive the solution spectrum is not ad hoc (it can be justified on the basis of arguments that originate in information theory), and the solution spectrum is a non-negative function that can be written in closed form. This last feature permits the use of standard methods for the sensitivity analysis and propagation of uncertainties of MAXED solution spectra. We illustrate its use with unfoldings of NE 213 scintillation detector measurements of photon calibration spectra, and of multisphere neutron spectrometer measurements of cosmic-ray induced neutrons at high altitude (approx 20 km) in the atmosphere.

  3. Application of code scaling, applicability and uncertainty methodology to large break LOCA analysis of two loop PWR

    International Nuclear Information System (INIS)

    Mavko, B.; Stritar, A.; Prosek, A.

    1993-01-01

    In NED 119, No. 1 (May 1990), a series of six papers published by a Technical Program Group presented a new methodology for the safety evaluation of emergency core cooling systems in nuclear power plants. This paper describes the application of that new methodology to the LB LOCA analysis of a two loop Westinghouse power plant. Results of the original work were used wherever possible, so that the analysis was finished in less than one man-year of work. Steam generator plugging level and safety injection flow rate were used as additional uncertainty parameters, which had not been used in the original work. The computer code RELAP5/MOD2 was used. The response surface was generated by regression analysis and by the artificial-neural-network-like Optimal Statistical Estimator method. Results were also compared to an analytical calculation. (orig.)
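    The response-surface step can be sketched as: fit a cheap polynomial surrogate to a handful of code runs by regression, then sample the surrogate heavily. The quadratic "code" below is an invented stand-in, not RELAP5/MOD2:

```python
import numpy as np

rng = np.random.default_rng(7)

def expensive_code(x):
    """Invented stand-in for a thermal-hydraulic code run:
    returns a PCT-like response (K) for a normalized uncertain input x."""
    return 900.0 + 120.0 * x + 40.0 * x ** 2

# A handful of "code runs" at design points spanning the uncertain input.
design = np.linspace(-1.0, 1.0, 7)
runs = expensive_code(design)

# Quadratic response surface fitted by least-squares regression.
coeffs = np.polyfit(design, runs, deg=2)
surface = np.poly1d(coeffs)

# The surrogate is cheap enough for a large Monte Carlo sample.
x_samples = rng.normal(0.0, 0.3, 50_000)
pct95 = float(np.quantile(surface(x_samples), 0.95))
```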

  4. Use of Sensitivity and Uncertainty Analysis to Select Benchmark Experiments for the Validation of Computer Codes and Data

    International Nuclear Information System (INIS)

    Elam, K.R.; Rearden, B.T.

    2003-01-01

    Sensitivity and uncertainty analysis methodologies under development at Oak Ridge National Laboratory were applied to determine whether existing benchmark experiments adequately cover the area of applicability for the criticality code and data validation of PuO2 and mixed-oxide (MOX) powder systems. The study examined three PuO2 powder systems and four MOX powder systems that would be useful for establishing mass limits for a MOX fuel fabrication facility. Using traditional methods to choose experiments for criticality analysis validation, 46 benchmark critical experiments were identified as applicable to the PuO2 powder systems. However, only 14 experiments were thought to be within the area of applicability for dry MOX powder systems. The applicability of 318 benchmark critical experiments, including the 60 experiments initially identified, was assessed. Each benchmark and powder system was analyzed using the Tools for Sensitivity and UNcertainty Analysis Methodology Implementation (TSUNAMI) one-dimensional (TSUNAMI-1D) or three-dimensional (TSUNAMI-3D) sensitivity analysis sequences, which will be included in the next release of the SCALE code system. These sensitivity data and cross-section uncertainty data were then processed with TSUNAMI-IP to determine the correlation of each application to each experiment in the benchmarking set. Correlation coefficients are used to assess the similarity between systems and to determine the applicability of one system for the code and data validation of another. The applicability of most of the experiments identified using traditional methods was confirmed by the TSUNAMI analysis. In addition, some PuO2 and MOX powder systems were determined to be within the area of applicability of several other benchmarks that would not have been considered using traditional methods. Therefore, the number of benchmark experiments useful for the validation of these systems exceeds the number previously expected.
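    The correlation coefficient TSUNAMI-IP computes, often written c_k, combines the sensitivity vectors of two systems through the shared cross-section covariance data. A toy three-group sketch (all numbers invented; the 0.9 cut-off is a screening value commonly quoted in the literature, not taken from this study):

```python
import numpy as np

def ck(s_app, s_exp, cov):
    """Correlation of two systems' data-induced uncertainties:
    c_k = (S_a C S_e) / sqrt((S_a C S_a)(S_e C S_e))."""
    num = s_app @ cov @ s_exp
    den = np.sqrt((s_app @ cov @ s_app) * (s_exp @ cov @ s_exp))
    return float(num / den)

cov = np.diag([0.04, 0.09, 0.01])          # toy cross-section covariances
s_application = np.array([0.6, 0.3, 0.1])  # k_eff sensitivities, application
s_benchmark   = np.array([0.5, 0.4, 0.1])  # k_eff sensitivities, experiment

similarity = ck(s_application, s_benchmark, cov)
applicable = similarity >= 0.9   # common screening threshold
```

    A c_k near 1 means the two systems share essentially the same nuclear-data uncertainty sources, which is the sense in which a benchmark "covers" an application.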

  5. UCODE_2005 and six other computer codes for universal sensitivity analysis, calibration, and uncertainty evaluation constructed using the JUPITER API

    Science.gov (United States)

    Poeter, Eileen E.; Hill, Mary C.; Banta, Edward R.; Mehl, Steffen; Christensen, Steen

    2006-01-01

    This report documents the computer codes UCODE_2005 and six post-processors. Together the codes can be used with existing process models to perform sensitivity analysis, data needs assessment, calibration, prediction, and uncertainty analysis. Any process model or set of models can be used; the only requirements are that models have numerical (ASCII or text only) input and output files, that the numbers in these files have sufficient significant digits, that all required models can be run from a single batch file or script, and that simulated values are continuous functions of the parameter values. Process models can include pre-processors and post-processors as well as one or more models related to the processes of interest (physical, chemical, and so on), making UCODE_2005 extremely powerful. An estimated parameter can be a quantity that appears in the input files of the process model(s), or a quantity used in an equation that produces a value that appears in the input files. In the latter situation, the equation is user-defined. UCODE_2005 can compare observations and simulated equivalents. The simulated equivalents can be any simulated value written in the process-model output files or can be calculated from simulated values with user-defined equations. The quantities can be model results, or dependent variables. For example, for ground-water models they can be heads, flows, concentrations, and so on. Prior, or direct, information on estimated parameters also can be considered. Statistics are calculated to quantify the comparison of observations and simulated equivalents, including a weighted least-squares objective function. In addition, data-exchange files are produced that facilitate graphical analysis. UCODE_2005 can be used fruitfully in model calibration through its sensitivity analysis capabilities and its ability to estimate parameter values that result in the best possible fit to the observations. 
Parameters are estimated using nonlinear regression: a
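The weighted least-squares objective function that drives this kind of nonlinear regression can be sketched in a few lines. Everything below is a stand-in: the `simulate` function, observation values, and weights are hypothetical, since UCODE_2005 drives an external process model through its input/output files rather than a Python function.

```python
import numpy as np

# Hypothetical process model: simulated heads as a function of one
# parameter (a hydraulic conductivity K). In UCODE the model is an
# external program; this function is only an illustrative stand-in.
def simulate(K, x):
    return 10.0 - x / K  # toy head profile, not a real model

x = np.array([100.0, 200.0, 300.0])            # observation locations
observed = np.array([9.0, 8.1, 6.9])           # measured heads
weights = 1.0 / np.array([0.1, 0.1, 0.2])**2   # 1/variance weighting

def wls_objective(K):
    """Weighted least-squares objective S(K) = sum_i w_i * (obs_i - sim_i)^2."""
    residuals = observed - simulate(K, x)
    return float(np.sum(weights * residuals**2))

# Crude 1-D search over candidate parameter values; real codes use
# modified Gauss-Newton nonlinear regression instead.
candidates = np.linspace(50.0, 200.0, 301)
best_K = min(candidates, key=wls_objective)
```

The same objective generalizes to several parameters and to prior information terms; only the search strategy changes.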

  6. Verification of the thermal module in the ELESIM code and the associated uncertainty analysis

    International Nuclear Information System (INIS)

    Arimescu, V.I.; Williams, A.F.; Klein, M.E.; Richmond, W.R.; Couture, M.

    1997-01-01

    Temperature is a critical parameter in fuel modelling because most of the physical processes that occur in fuel elements during irradiation are thermally activated. The focus of this paper is the temperature distribution calculation used in the computer code ELESIM, developed at AECL to model the steady state behaviour of CANDU fuel. A validation procedure for fuel codes is described and applied to ELESIM's thermal calculation

  7. Advanced LOCA code uncertainty assessment

    International Nuclear Information System (INIS)

    Wickett, A.J.; Neill, A.P.

    1990-11-01

    This report describes a pilot study that identified, quantified and combined uncertainties for the LOBI BL-02 3% small break test. A 'dials' version of TRAC-PF1/MOD1, called TRAC-F, was used. (author)

  8. CSAU (Code Scaling, Applicability and Uncertainty)

    International Nuclear Information System (INIS)

    Wilson, G.E.; Boyack, B.E.

    1989-01-01

    Best Estimate computer codes have been accepted by the U.S. Nuclear Regulatory Commission as an optional tool for performing safety analysis related to the licensing and regulation of current nuclear reactors producing commercial electrical power, providing their uncertainty is quantified. In support of this policy change, the NRC and its contractors and consultants have developed and demonstrated an uncertainty quantification methodology called CSAU. The primary use of the CSAU methodology is to quantify safety margins for existing designs; however, the methodology can also serve an equally important role in advanced reactor research for plants not yet built. This paper describes the CSAU methodology, at the generic process level, and provides the general principles whereby it may be applied to evaluations of advanced reactor designs

  9. Some uncertainty results obtained by the statistical version of the KARATE code system related to core design and safety analysis

    Energy Technology Data Exchange (ETDEWEB)

    Panka, Istvan; Hegyi, Gyoergy; Maraczy, Csaba; Temesvari, Emese [Hungarian Academy of Sciences, Budapest (Hungary). Reactor Analysis Dept.

    2017-11-15

    The best-estimate KARATE code system has been widely used for core design calculations and simulations of slow transients of VVER reactors. Recently there has been an increasing need for assessing the uncertainties of such calculations by propagating the basic input uncertainties of the models through the full calculation chain. In order to determine the uncertainties of quantities of interest during the burnup, the statistical version of the KARATE code system has been elaborated. In the first part of the paper, the main features of the new code system are discussed. The applied statistical method is based on Monte-Carlo sampling of the considered input data taking into account mainly the covariance matrices of the cross sections and/or the technological uncertainties. In the second part of the paper, only the uncertainties of cross sections are considered and an equilibrium cycle related to a VVER-440 type reactor is investigated. The burnup dependence of the uncertainties of some safety related parameters (e.g. critical boron concentration, rod worth, feedback coefficients, assembly-wise radial power and burnup distribution) are discussed and compared to the recently used limits.
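The Monte-Carlo propagation described above can be illustrated with a minimal sketch: correlated input perturbations are drawn from a covariance matrix and pushed through the calculation chain, and percentiles of the collected outputs give the uncertainty band. The mean vector, covariance matrix, and `core_model` function are invented stand-ins, not KARATE or VVER-440 data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2-group cross-section data: mean values and a covariance
# matrix (illustrative numbers only).
mean = np.array([1.0, 0.5])
cov = np.array([[0.0004, 0.0001],
                [0.0001, 0.0009]])

def core_model(sigma):
    """Toy stand-in for the full calculation chain: returns one scalar
    quantity of interest (e.g. a power peaking factor)."""
    return 1.4 * sigma[0] / (sigma[0] + sigma[1])

# Monte-Carlo propagation: sample correlated inputs, run the chain,
# collect the outputs.
samples = rng.multivariate_normal(mean, cov, size=1000)
outputs = np.array([core_model(s) for s in samples])

lo, hi = np.percentile(outputs, [2.5, 97.5])  # 95% uncertainty band
```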

  10. Achieving 95% probability level using best estimate codes and the code scaling, applicability and uncertainty (CSAU) [Code Scaling, Applicability and Uncertainty] methodology

    International Nuclear Information System (INIS)

    Wilson, G.E.; Boyack, B.E.; Duffey, R.B.; Griffith, P.; Katsma, K.R.; Lellouche, G.S.; Rohatgi, U.S.; Wulff, W.; Zuber, N.

    1988-01-01

    Issue of a revised rule for loss of coolant accident/emergency core cooling system (LOCA/ECCS) analysis of light water reactors will allow the use of best estimate (BE) computer codes in safety analysis, with uncertainty analysis. This paper describes a systematic methodology, CSAU (Code Scaling, Applicability and Uncertainty), which will provide uncertainty bounds in a cost effective, auditable, rational and practical manner. 8 figs., 2 tabs

  11. Uncertainty analysis for results of thermal hydraulic codes of best-estimate-type; Analisis de incertidumbre para resultados de codigos termohidraulicos de mejor estimacion

    Energy Technology Data Exchange (ETDEWEB)

    Alva N, J.

    2010-07-01

    In this thesis, some fundamental knowledge is presented about uncertainty analysis and about the diverse methodologies applied in the study of nuclear power plant transient events, particularly those related to thermal hydraulic phenomena. These concepts and methodologies come from a wide bibliographical survey of the nuclear power field. Methodologies for uncertainty analysis have been developed by quite diverse institutions, and they have been widely used worldwide for application to results from best-estimate-type computer codes in nuclear reactor thermal hydraulics and safety analysis. The main uncertainty sources, types of uncertainty, and aspects related to best estimate modeling and methods are also introduced. Once the main bases of uncertainty analysis have been set and some of the known methodologies introduced, the CSAU methodology, which is applied in the analyses, is presented in detail. The main objective of this thesis is to compare the results of an uncertainty and sensitivity analysis using the Response Surface Technique with those obtained by applying the Wilks formula, for a loss-of-coolant experiment and a power increase event in a BWR. Both techniques are options for the uncertainty and sensitivity analysis part of the CSAU methodology, which was developed for the analysis of transients and accidents at nuclear power plants and is the basis of most of the methodologies used in licensing of nuclear power plants practically everywhere. Finally, the results of applying both techniques are compared and discussed. (Author)
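The Response Surface Technique can be sketched as follows: a small number of code runs is used to fit a cheap surrogate, and Monte Carlo sampling is then performed on the surrogate instead of the code. The `expensive_code` function and all numbers below are illustrative assumptions, not results from the thesis.

```python
import numpy as np

rng = np.random.default_rng(1)

def expensive_code(p):
    """Stand-in for one best-estimate code run with input parameter p."""
    return 2.0 + 3.0 * p + 0.5 * p**2

# Step 1: a small design of code runs (the response surface approach
# needs far fewer runs than direct Monte Carlo on the code itself).
design = np.linspace(-1.0, 1.0, 5)
responses = np.array([expensive_code(p) for p in design])

# Step 2: fit a quadratic response surface to the code results.
coeffs = np.polyfit(design, responses, deg=2)
surface = np.poly1d(coeffs)

# Step 3: Monte Carlo on the cheap surrogate instead of the code.
p_samples = rng.normal(0.0, 0.3, size=10000)
y = surface(p_samples)
band = np.percentile(y, [5, 95])
```

Because the stand-in code is itself quadratic, the fitted surface is exact here; for a real code the surrogate quality must be checked before trusting the sampled band.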

  12. Validation uncertainty of MATRA code for subchannel void distributions

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Dae-Hyun; Kim, S. J.; Kwon, H.; Seo, K. W. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-10-15

    To extend code capability to whole-core subchannel analysis, pre-conditioned Krylov matrix solvers such as BiCGSTAB and GMRES were implemented in the MATRA code, as well as parallel computing algorithms using MPI and OpenMP. The code is written in Fortran 90 and has some user-friendly features such as a graphical user interface. MATRA was approved by the Korean regulatory body for design calculations of the integral-type PWR named SMART. The major role of a subchannel code is to evaluate core thermal margin through hot channel analysis and uncertainty evaluation for CHF predictions. In addition, it is potentially used for best estimation of the core thermal hydraulic field by incorporation into multi-physics and/or multi-scale code systems. In this study we examined a validation process for the subchannel code MATRA, specifically in the prediction of subchannel void distributions. The primary objective of validation is to estimate a range within which the simulation modeling error lies. The experimental data for subchannel void distributions at steady-state and transient conditions were provided in the framework of the OECD/NEA UAM benchmark program. The validation uncertainty of the MATRA code was evaluated for a specific experimental condition by comparing the simulation result and experimental data. A validation process should be preceded by code and solution verification; however, quantification of verification uncertainty was not addressed in this study. The validation uncertainty of the MATRA code for predicting subchannel void distribution was evaluated for a single data point of void fraction measurement in a 5x5 PWR test bundle in the framework of the OECD UAM benchmark program. The validation standard uncertainties were evaluated as 4.2%, 3.9%, and 2.8% with the Monte-Carlo approach at the axial levels of 2216 mm, 2669 mm, and 3177 mm, respectively. The sensitivity coefficient approach revealed similar uncertainties but did not account for the nonlinear effects on the
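One common way to combine the separate contributions into a validation standard uncertainty is quadrature summation in the style of the ASME V&V 20 standard; this is a general sketch, not necessarily the exact procedure used for MATRA, and the numbers are invented.

```python
import math

# Hypothetical standard uncertainties for one void-fraction data point
# (illustrative values, not the MATRA/UAM benchmark numbers).
u_input = 0.030   # propagated input-parameter uncertainty (Monte Carlo)
u_num   = 0.005   # numerical (discretization) uncertainty
u_exp   = 0.025   # measurement uncertainty of the void-fraction data

# Validation standard uncertainty: independent contributions are
# combined in quadrature.
u_val = math.sqrt(u_input**2 + u_num**2 + u_exp**2)
```

The comparison error (simulation minus experiment) is then judged against `u_val`: a model error is only resolvable where the error exceeds the validation uncertainty.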

  13. Development and application of methods to characterize code uncertainty

    International Nuclear Information System (INIS)

    Wilson, G.E.; Burtt, J.D.; Case, G.S.; Einerson, J.J.; Hanson, R.G.

    1985-01-01

    The United States Nuclear Regulatory Commission sponsors both international and domestic studies to assess its safety analysis codes. The Commission staff intends to use the results of these studies to quantify the uncertainty of the codes with a statistically based analysis method. Development of the methodology is underway. The Idaho National Engineering Laboratory contributions to the early development effort, and testing of two candidate methods are the subjects of this paper

  14. International training program in support of safety analysis. 3D S.UN.COP-scaling uncertainty and 3D thermal-hydraulics/neutron-kinetics coupled codes seminars

    International Nuclear Information System (INIS)

    Petruzzi, Alessandro; D'Auria, Francesco; Bajs, Tomislav; Reventos, Francesc; Hassan, Yassin

    2007-01-01

    Thermal-hydraulic system computer codes are extensively used worldwide for analysis of nuclear facilities by utilities, regulatory bodies, nuclear power plant designers and vendors, nuclear fuel companies, research organizations, consulting companies, and technical support organizations. The computer code user represents a source of uncertainty that can influence the results of system code calculations. This influence is commonly known as the 'user effect' and stems from the limitations embedded in the codes as well as from the limited capability of the analysts to use the codes. Code user training and qualification is an effective means for reducing the variation of results caused by the application of the codes by different users. This paper describes a systematic approach to training code users who, upon completion of the training, should be able to perform calculations making the best possible use of the capabilities of best estimate codes. In other words, the program aims at contributing towards solving the problem of the user effect. The 3D S.UN.COP (Scaling, Uncertainty and 3D COuPled code calculations) seminars have been organized as a follow-up of the proposal to the IAEA for the Permanent Training Course for System Code Users. Six seminars have been held at the University of Pisa (2003, 2004), at The Pennsylvania State University (2004), at the University of Zagreb (2005), at the School of Industrial Engineering of Barcelona (January-February 2006) and in Buenos Aires, Argentina (October 2006), the last one requested by ARN (Autoridad Regulatoria Nuclear), NA-SA (Nucleoelectrica Argentina S.A) and CNEA (Comision Nacional de Energia Atomica). It was recognized that such courses represent both a source of continuing education for current code users and a means for current code users to enter the formal training structure of a proposed 'permanent' stepwise approach to user training. The 3D S.UN.COP 2006 in Barcelona was successfully held with the attendance of 33 participants coming from 18 countries and 28 different institutions.

  15. International Training Program in Support of Safety Analysis: 3D S.UN.COP-Scaling, Uncertainty and 3D Thermal-Hydraulics/Neutron-Kinetics Coupled Codes Seminars

    International Nuclear Information System (INIS)

    Petruzzi, Alessandro; D'Auria, Francesco; Bajs, Tomislav; Reventos, Francesc

    2006-01-01

    Thermal-hydraulic system computer codes are extensively used worldwide for analysis of nuclear facilities by utilities, regulatory bodies, nuclear power plant designers and vendors, nuclear fuel companies, research organizations, consulting companies, and technical support organizations. The computer code user represents a source of uncertainty that can influence the results of system code calculations. This influence is commonly known as the 'user effect' and stems from the limitations embedded in the codes as well as from the limited capability of the analysts to use the codes. Code user training and qualification is an effective means for reducing the variation of results caused by the application of the codes by different users. This paper describes a systematic approach to training code users who, upon completion of the training, should be able to perform calculations making the best possible use of the capabilities of best estimate codes. In other words, the program aims at contributing towards solving the problem of the user effect. The 3D S.UN.COP (Scaling, Uncertainty and 3D COuPled code calculations) seminars have been organized as a follow-up of the proposal to the IAEA for the Permanent Training Course for System Code Users [1]. Five seminars have been held at the University of Pisa (2003, 2004), at The Pennsylvania State University (2004), at the University of Zagreb (2005) and at the School of Industrial Engineering of Barcelona (2006). It was recognized that such courses represent both a source of continuing education for current code users and a means for current code users to enter the formal training structure of a proposed 'permanent' stepwise approach to user training. The 3D S.UN.COP 2006 was successfully held with the attendance of 33 participants coming from 18 countries and 28 different institutions (universities, vendors, national laboratories and regulatory bodies). More than 30 scientists (coming from 13 countries and 23 different institutions) were

  16. Uncertainty analysis of the SWEPP PAN assay system for glass waste (content codes 440, 441 and 442)

    International Nuclear Information System (INIS)

    Blackwood, L.G.; Harker, Y.D.; Meachum, T.R.; Yoon, W.Y.

    1996-10-01

    INEL is being used as a temporary storage facility for transuranic waste generated by the Nuclear Weapons program at the Rocky Flats Plant. Currently, there is a large effort in progress to prepare to ship this waste to WIPP. In order to meet the TRU Waste Characterization Quality Assurance Program Plan nondestructive assay compliance requirements and quality assurance objectives, it is necessary to determine the total uncertainty of the radioassay results produced by the Stored Waste Examination Pilot Plant (SWEPP) Passive-Active Neutron (PAN) radioassay system. This paper discusses a modified statistical sampling and verification approach used to determine the total uncertainty of SWEPP PAN measurements for glass waste (content codes 440, 441, and 442) contained in 208 liter drums. In the modified statistical sampling and verification approach, the total performance of the SWEPP PAN nondestructive assay system for specifically selected waste conditions is simulated using computer models. A set of 100 cases covering the known conditions exhibited in glass waste was compiled using a combined statistical sampling and factorial experimental design approach. Parameter values assigned in each simulation were derived from reviews of approximately 100 real-time radiography video tapes of RFP glass waste drums, results from previous SWEPP PAN measurements on glass waste drums, and shipping data from RFP where the glass waste was generated. The data in the 100 selected cases form the multi-parameter input to the simulation model. The reported plutonium masses from the simulation model are compared with the corresponding input masses. From these comparisons, the bias and total uncertainty associated with SWEPP PAN measurements on glass waste drums are estimated. The validity of the simulation approach is verified by comparing simulated output against results from calibration measurements using known plutonium sources and two glass waste calibration drums

  18. Qualification and application of nuclear reactor accident analysis code with the capability of internal assessment of uncertainty; Qualificacao e aplicacao de codigo de acidentes de reatores nucleares com capacidade interna de avaliacao de incerteza

    Energy Technology Data Exchange (ETDEWEB)

    Borges, Ronaldo Celem

    2001-10-15

    This thesis presents an independent qualification of the CIAU code ('Code with the capability of Internal Assessment of Uncertainty'), which is part of the process of internal uncertainty evaluation with a thermal hydraulic system code on a realistic basis. This is done by combining the uncertainty methodology UMAE ('Uncertainty Methodology based on Accuracy Extrapolation') with the RELAP5/Mod3.2 code, which allows associating uncertainty band estimates with the results obtained by the realistic calculation of the code, meeting licensing requirements of safety analysis. The independent qualification is supported by simulations with RELAP5/Mod3.2 related to accident condition tests of the LOBI experimental facility and to an event which occurred in the Angra 1 nuclear power plant, by comparison with measured results and by establishing uncertainty bands on calculated time trends of safety parameters. These bands have indeed enveloped the measured trends. Results from this independent qualification of CIAU have made it possible to ascertain the adequate application of a systematic realistic code procedure to analyse accidents with uncertainties incorporated in the results, although there is an evident need to extend the uncertainty data base. It has been verified that use of the code with this internal assessment of uncertainty is feasible in the design and licensing stages of a NPP. (author)

  19. Deterministic uncertainty analysis

    International Nuclear Information System (INIS)

    Worley, B.A.

    1987-01-01

    Uncertainties of computer results are of primary interest in applications such as high-level waste (HLW) repository performance assessment in which experimental validation is not possible or practical. This work presents an alternate deterministic approach for calculating uncertainties that has the potential to significantly reduce the number of computer runs required for conventional statistical analysis. 7 refs., 1 fig

  20. The Uncertainty Test for the MAAP Computer Code

    International Nuclear Information System (INIS)

    Park, S. H.; Song, Y. M.; Park, S. Y.; Ahn, K. I.; Kim, K. R.; Lee, Y. J.

    2008-01-01

    After the Three Mile Island Unit 2 (TMI-2) and Chernobyl accidents, safety issues for severe accidents have been treated in various aspects. Major issues in our research include a level 2 PSA. The difficulty in expanding the level 2 PSA as a risk-informed activity is the uncertainty. Previously, emphasis was placed on improving the quality of the internal-event PSA, but the effort to reduce the phenomenological uncertainty in the level 2 PSA has been insufficient. In our country, the degree of uncertainty is high in the case of a level 2 PSA model, and it is necessary to secure a model that decreases the uncertainty. We have not yet acquired the uncertainty assessment technology; the assessment systems themselves come from advanced countries. In advanced countries, severe accident simulators are implemented at the hardware level, but in our case only basic functions at the software level can be implemented. Under these circumstances at home and abroad, similar instances such as UQM and MELCOR were surveyed. Referring to these instances, the SAUNA (Severe Accident UNcertainty Analysis) system is being developed in our project to assess and decrease the uncertainty in a level 2 PSA. It selects the MAAP code to analyze the uncertainty in a severe accident

  1. Simulation codes and the impact of validation/uncertainty requirements

    International Nuclear Information System (INIS)

    Sills, H.E.

    1995-01-01

    Several of the OECD/CSNI members have adopted a proposed methodology for code validation and uncertainty assessment. Although the validation process adopted by members has a high degree of commonality, the uncertainty assessment processes selected are more variable, ranging from subjective to formal. This paper describes the validation and uncertainty assessment process, the sources of uncertainty, methods of reducing uncertainty, and methods of assessing uncertainty. Examples are presented from the Ontario Hydro application of the validation methodology and uncertainty assessment to the system thermal hydraulics discipline and the TUF (1) system thermal hydraulics code. (author)

  2. Sensitivity and uncertainty analysis

    CERN Document Server

    Cacuci, Dan G; Navon, Ionel Michael

    2005-01-01

    As computer-assisted modeling and analysis of physical processes have continued to grow and diversify, sensitivity and uncertainty analyses have become indispensable scientific tools. Sensitivity and Uncertainty Analysis. Volume I: Theory focused on the mathematical underpinnings of two important methods for such analyses: the Adjoint Sensitivity Analysis Procedure and the Global Adjoint Sensitivity Analysis Procedure. This volume concentrates on the practical aspects of performing these analyses for large-scale systems. The applications addressed include two-phase flow problems, a radiative c

  3. Uncertainty and sensitivity analysis in the scenario simulation with RELAP/SCDAP and MELCOR codes; Analisis de incertidumbre y sensibilidad en la simulacion de escenarios con los codigos RELAP/SCDAP y MELCOR

    Energy Technology Data Exchange (ETDEWEB)

    Garcia J, T.; Cardenas V, J., E-mail: tonatiuh.garcia@cnsns.gob.mx [Comision Nacional de Seguridad Nuclear y Salvaguardias, Dr. Barragan 779, Col. Narvarte, 03020 Ciudad de Mexico (Mexico)

    2015-09-15

    A methodology was implemented for uncertainty analysis in simulations of scenarios with the RELAP/SCDAP V-3.4 bi-7 and MELCOR V-2.1 codes, which are used to perform safety analysis in the Comision Nacional de Seguridad Nuclear y Salvaguardias (CNSNS). The uncertainty analysis methodology chosen is a probabilistic method of the type that propagates the uncertainty of the input parameters to the output parameters. Therefore, it began with the selection of the input parameters considered uncertain and of high importance in the scenario because of their direct effect on the output variable of interest. These parameters were randomly sampled according to intervals of variation or probability distribution functions assigned by expert judgment, to generate a set of input files that were run through the simulation code to propagate the uncertainty to the output parameters. Then, through the use of order statistics and the Wilks formula, it was determined that the minimum number of runs required to obtain uncertainty bands that cover 95% of the population at a confidence level of 95% is 93; it is important to mention that in this method the number of runs does not depend on the number of selected input parameters. In the implementation, routines in Fortran 90 were generated that allowed the uncertainty analysis process for transients with the RELAP/SCDAP code to be automated. In the case of the MELCOR code for severe accident analysis, automation was carried out through the Dakota Uncertainty plug-in incorporated into the SNAP platform. To test the practical application of this methodology, two analyses were performed: the first with the simulation of a closure transient of the main steam isolation valves using the RELAP/SCDAP code, obtaining the uncertainty band of the vessel dome pressure; while in the second analysis, the simulation of a station blackout (SBO) accident was carried out with the MELCOR code, obtaining the uncertainty band for the
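The run counts quoted above follow directly from the first-order Wilks formulas: 59 runs suffice for a one-sided 95%/95% statement, while the two-sided tolerance interval built from the minimum and maximum of the sample requires 93 runs. A minimal sketch:

```python
def wilks_runs_one_sided(coverage=0.95, confidence=0.95):
    """Smallest n such that the maximum of n runs bounds the 'coverage'
    quantile with the given confidence: 1 - coverage**n >= confidence."""
    n = 1
    while 1.0 - coverage**n < confidence:
        n += 1
    return n

def wilks_runs_two_sided(coverage=0.95, confidence=0.95):
    """Smallest n for a two-sided tolerance interval from the minimum and
    maximum of n runs (first-order Wilks formula)."""
    n = 2
    while (1.0 - coverage**n
           - n * (1.0 - coverage) * coverage**(n - 1)) < confidence:
        n += 1
    return n
```

Neither count depends on the number of uncertain input parameters, which is the point made in the abstract.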

  4. SURE: a system of computer codes for performing sensitivity/uncertainty analyses with the RELAP code

    International Nuclear Information System (INIS)

    Bjerke, M.A.

    1983-02-01

    A package of computer codes has been developed to perform a nonlinear uncertainty analysis on transient thermal-hydraulic systems which are modeled with the RELAP computer code. The package has been used in uncertainty analyses of experiments in the PWR-BDHT Separate Effects Program at Oak Ridge National Laboratory. The use of FORTRAN programs running interactively on the PDP-10 computer has made the system very easy to use and has provided great flexibility in the choice of processing paths. Several experiments simulating a loss-of-coolant accident in a nuclear reactor have been successfully analyzed. It has been shown that the system can easily be automated to further simplify its use and that conversion of the entire system to a base code other than RELAP is possible

  5. Deterministic uncertainty analysis

    International Nuclear Information System (INIS)

    Worley, B.A.

    1987-12-01

    This paper presents a deterministic uncertainty analysis (DUA) method for calculating uncertainties that has the potential to significantly reduce the number of computer runs compared to conventional statistical analysis. The method is based upon the availability of derivative and sensitivity data such as that calculated using the well known direct or adjoint sensitivity analysis techniques. Formation of response surfaces using derivative data and the propagation of input probability distributions are discussed relative to their role in the DUA method. A sample problem that models the flow of water through a borehole is used as a basis to compare the cumulative distribution function of the flow rate as calculated by the standard statistical methods and the DUA method. Propagation of uncertainties by the DUA method is compared for ten cases in which the number of reference model runs was varied from one to ten. The DUA method gives a more accurate representation of the true cumulative distribution of the flow rate based upon as few as two model executions compared to fifty model executions using a statistical approach. 16 refs., 4 figs., 5 tabs
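The derivative-based propagation underlying a DUA-style method can be illustrated on a toy flow model: analytic sensitivities replace repeated model runs in a first-order estimate of the output uncertainty, cross-checked here against brute-force Monte Carlo. The model and all parameter values are invented for illustration, not the paper's borehole problem.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy model of flow through a borehole-like resistance: q = dp / (a + b)
def model(a, b, dp=10.0):
    return dp / (a + b)

# Nominal parameter values and their standard deviations (illustrative).
a0, b0 = 2.0, 3.0
sa, sb = 0.2, 0.3

# Analytic sensitivities; in a DUA these come from direct or adjoint
# sensitivity analysis rather than repeated model runs.
dq_da = -10.0 / (a0 + b0)**2
dq_db = -10.0 / (a0 + b0)**2

# First-order (derivative-based) uncertainty propagation for
# independent inputs: var(q) ~= (dq/da * sa)^2 + (dq/db * sb)^2.
var_q = (dq_da * sa)**2 + (dq_db * sb)**2
sigma_q = var_q**0.5

# Cross-check against brute-force Monte Carlo on the same model.
q_mc = model(rng.normal(a0, sa, 100000), rng.normal(b0, sb, 100000))
```

For this mildly nonlinear model the one-evaluation derivative estimate and the 100000-run Monte Carlo standard deviation agree closely, which is the kind of run-count saving the abstract describes.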

  6. Uncertainty analysis in seismic tomography

    Science.gov (United States)

    Owoc, Bartosz; Majdański, Mariusz

    2017-04-01

    Velocity fields from seismic travel-time tomography depend on several factors like regularization, inversion path, model parameterization etc. The result also strongly depends on the initial velocity model and on the precision of travel-time picking. In this research we test the dependence on the starting model in layered tomography and compare it with the effect of picking precision. Moreover, in our analysis the uncertainty distribution for manual travel-time picking is asymmetric. This effect shifts the results toward faster velocities. For the calculations we are using the JIVE3D travel-time tomographic code. We used data from geo-engineering and industrial-scale investigations, which were collected by our team from IG PAS.

  7. Uncertainty analysis techniques

    International Nuclear Information System (INIS)

    Marivoet, J.; Saltelli, A.; Cadelli, N.

    1987-01-01

    The origin of the uncertainty affecting Performance Assessments, as well as its propagation to dose and risk results, is discussed. The analysis is focused essentially on the uncertainties introduced by the input parameters, the values of which may range over several orders of magnitude and may be given as probability distribution functions. The paper briefly reviews the existing sampling techniques used for Monte Carlo simulations and the methods for characterizing the output curves, determining their convergence and confidence limits. Annual doses, expectation values of the doses and risks are computed for a particular case of a possible repository in clay, in order to illustrate the significance of such output characteristics as the mean, the logarithmic mean and the median as well as their ratios. The report concludes that, provisionally, owing to its better robustness, an estimate such as the 90th percentile may be substituted for the arithmetic mean in comparisons of the estimated doses with acceptance criteria. In any case, the results obtained through Uncertainty Analyses must be interpreted with caution as long as input data distribution functions are not derived from experiments reasonably reproducing the situation in a well characterized repository and site

  8. Fuzzy Uncertainty Evaluation for Fault Tree Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ki Beom; Shim, Hyung Jin [Seoul National University, Seoul (Korea, Republic of); Jae, Moo Sung [Hanyang University, Seoul (Korea, Republic of)

    2015-05-15

    The traditional probabilistic approach can calculate relatively accurate results. However, it requires a long time because of the repetitive computation of the MC method. In addition, when informative data for statistical analysis are not sufficient or some events are mainly caused by human error, the probabilistic approach may not be possible because the uncertainties of these events are difficult to express by probabilistic distributions. In order to reduce the computation time and to quantify the uncertainties of top events when there exist basic events whose uncertainties are difficult to express by probabilistic distributions, fuzzy uncertainty propagation based on fuzzy set theory can be applied. In this paper, we develop a fuzzy uncertainty propagation code and apply it to the fault tree of the core damage accident after the large loss of coolant accident (LLOCA). The fuzzy uncertainty propagation code is implemented and tested for the fault tree of the radiation release accident. We apply this code to the fault tree of the core damage accident after the LLOCA in three cases and compare the results with those computed by probabilistic uncertainty propagation using the MC method. The results obtained by fuzzy uncertainty propagation can be calculated in a relatively short time and cover the results obtained by probabilistic uncertainty propagation.
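Fuzzy uncertainty propagation through a fault tree is commonly implemented with alpha-cuts and interval arithmetic; a minimal sketch with hypothetical basic-event fuzzy numbers (not the paper's LLOCA data):

```python
# Alpha-cut propagation of triangular fuzzy probabilities through a
# small fault tree. All basic-event numbers are illustrative.

def alpha_cut(tri, alpha):
    """Interval [lo, hi] of a triangular fuzzy number (a, m, b) at level alpha."""
    a, m, b = tri
    return (a + alpha * (m - a), b - alpha * (b - m))

def and_gate(iv1, iv2):
    # AND gate: probabilities multiply; interval arithmetic for
    # nonnegative endpoints.
    return (iv1[0] * iv2[0], iv1[1] * iv2[1])

def or_gate(iv1, iv2):
    # OR gate: 1 - (1-p1)(1-p2), monotone increasing in both inputs.
    return (1 - (1 - iv1[0]) * (1 - iv2[0]),
            1 - (1 - iv1[1]) * (1 - iv2[1]))

# Triangular fuzzy probabilities (a, most-likely, b) for three basic events.
e1, e2, e3 = (0.01, 0.02, 0.04), (0.05, 0.10, 0.20), (0.001, 0.002, 0.003)

# Top event = (e1 AND e2) OR e3, evaluated at a few alpha levels; each
# level gives one interval of the top event's membership function.
for alpha in (0.0, 0.5, 1.0):
    top = or_gate(and_gate(alpha_cut(e1, alpha), alpha_cut(e2, alpha)),
                  alpha_cut(e3, alpha))
```

At alpha = 1 the intervals collapse to the most-likely values, and the top-event interval collapses to a single point, which is a quick sanity check on the propagation.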

  9. CREOLE experiment study on the reactivity temperature coefficient with sensitivity and uncertainty analysis using the MCNP5 code and different neutron cross section evaluations

    International Nuclear Information System (INIS)

    Boulaich, Y.; El Bardouni, T.; Erradi, L.; Chakir, E.; Boukhal, H.; Nacir, B.; El Younoussi, C.; El Bakkari, B.; Merroun, O.; Zoubair, M.

    2011-01-01

    Highlights: → In the present work, we have analyzed the CREOLE experiment on the reactivity temperature coefficient (RTC) by using the three-dimensional continuous energy code MCNP5 and the latest updated nuclear data evaluations. → Calculation-experiment discrepancies of the RTC were analyzed, and the results have shown that the JENDL3.3 and JEFF3.1 evaluations give the most consistent values. → In order to identify the source of the relatively large discrepancy in the case of the ENDF-BVII nuclear data evaluation, the k_eff discrepancy between ENDF-BVII and JENDL3.3 was decomposed by using a sensitivity and uncertainty analysis technique. - Abstract: In the present work, we analyze the CREOLE experiment on the reactivity temperature coefficient (RTC) by using the three-dimensional continuous energy code MCNP5 and the latest updated nuclear data evaluations. This experiment, performed in the EOLE critical facility located at CEA/Cadarache, was mainly dedicated to RTC studies for both UO2 and UO2-PuO2 PWR type lattices, covering the whole temperature range from 20 °C to 300 °C. We have developed an accurate 3D model of the EOLE reactor using the MCNP5 Monte Carlo code, which guarantees a high level of fidelity in the description of the different configurations at various temperatures, taking into account their consequences on neutron cross section data and all thermal expansion effects. In this case, the remaining error between calculation and experiment can be attributed mainly to uncertainties in nuclear data. Our own cross section library was constructed by using the NJOY99.259 code with point-wise nuclear data based on the ENDF-BVII, JEFF3.1 and JENDL3.3 evaluation files. The MCNP model was validated through axial and radial fission rate measurements at room and hot temperatures. Calculation-experiment discrepancies of the RTC were analyzed and the results have shown that the JENDL3.3 and JEFF3.1 evaluations give the most consistent values; the discrepancy is

  10. Striatal dopamine release codes uncertainty in pathological gambling

    DEFF Research Database (Denmark)

    Linnet, Jakob; Mouridsen, Kim; Peterson, Ericka

    2012-01-01

    Two mechanisms of midbrain and striatal dopaminergic projections may be involved in pathological gambling: hypersensitivity to reward and sustained activation toward uncertainty. The midbrain-striatal dopamine system distinctly codes reward and uncertainty, where dopaminergic activation is a linear...... function of expected reward and an inverse U-shaped function of uncertainty. In this study, we investigated the dopaminergic coding of reward and uncertainty in 18 pathological gambling sufferers and 16 healthy controls. We used positron emission tomography (PET) with the tracer [11C]raclopride to measure...... dopamine release, and we used performance on the Iowa Gambling Task (IGT) to determine overall reward and uncertainty. We hypothesized that we would find a linear function between dopamine release and IGT performance, if dopamine release coded reward in pathological gambling. If, on the other hand...

  12. Uncertainty analysis in Monte Carlo criticality computations

    International Nuclear Information System (INIS)

    Qi Ao

    2011-01-01

    Highlights: ► Two types of uncertainty methods for k_eff Monte Carlo computations are examined. ► The sampling method has the fewest restrictions on the perturbation but demands the most computing resources. ► The analytical method is limited to small perturbations of material properties. ► Practicality relies on efficiency, multiparameter applicability and data availability. - Abstract: Uncertainty analysis is imperative for nuclear criticality risk assessments when using Monte Carlo neutron transport methods to predict the effective neutron multiplication factor (k_eff) for fissionable material systems. For the validation of Monte Carlo codes for criticality computations against benchmark experiments, code accuracy and precision are measured by both the computational bias and the uncertainty in the bias. The uncertainty in the bias accounts for known or quantified experimental, computational and model uncertainties. For the application of Monte Carlo codes to criticality analysis of fissionable material systems, an administrative margin of subcriticality must be imposed to provide additional assurance of subcriticality for any unknown or unquantified uncertainties. Because of the substantial impact of the administrative margin of subcriticality on the economics and safety of nuclear fuel cycle operations, recent interest in reducing this margin has made uncertainty analysis in criticality safety computations more risk-significant. This paper provides an overview of the two most popular k_eff uncertainty analysis methods for Monte Carlo criticality computations: (1) sampling-based methods, and (2) analytical methods. Examples are given to demonstrate their usage in k_eff uncertainty analysis due to uncertainties in both neutronic and non-neutronic parameters of fissionable material systems.
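The sampling-based approach the abstract describes can be illustrated with a deliberately simple stand-in for a Monte Carlo criticality code; the one-group infinite-medium formula and the input uncertainties below are hypothetical, chosen only to show how input scatter maps to a k_eff uncertainty:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# Hypothetical one-group infinite-medium model: k_inf = nu * Sigma_f / Sigma_a.
# Nominal values and 1-sigma relative uncertainties are illustrative only.
nu      = rng.normal(2.43,  2.43  * 0.005, n)  # neutrons per fission
sigma_f = rng.normal(0.050, 0.050 * 0.02,  n)  # macroscopic fission xs (1/cm)
sigma_a = rng.normal(0.110, 0.110 * 0.02,  n)  # macroscopic absorption xs (1/cm)

# Each sample stands in for one full transport calculation with perturbed data;
# the spread of the outputs is the propagated k_eff uncertainty.
k_inf = nu * sigma_f / sigma_a
print(f"k_inf = {k_inf.mean():.4f} +/- {k_inf.std(ddof=1):.4f} (1 sigma)")
```

In a real application each "sample" would be a separate Monte Carlo transport run, so the propagated uncertainty also has to be separated from the stochastic precision of the individual runs.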

  13. The STAT7 Code for Statistical Propagation of Uncertainties In Steady-State Thermal Hydraulics Analysis of Plate-Fueled Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Dunn, Floyd E. [Argonne National Lab. (ANL), Argonne, IL (United States); Hu, Lin-wen [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States). Nuclear Reactor Lab.; Wilson, Erik [Argonne National Lab. (ANL), Argonne, IL (United States)

    2016-12-01

    The STAT code was written to automate many of the steady-state thermal hydraulic safety calculations for the MIT research reactor, both for conversion of the reactor from high enrichment uranium fuel to low enrichment uranium fuel and for future fuel re-loads after the conversion. A Monte Carlo statistical propagation approach is used to treat uncertainties in important parameters in the analysis. These safety calculations are ultimately intended to protect against high fuel plate temperatures due to critical heat flux, departure from nucleate boiling, or onset of flow instability; additional margin is obtained by basing the limiting safety settings on avoiding onset of nucleate boiling (ONB). STAT7 can simultaneously analyze all of the axial nodes of all of the fuel plates and all of the coolant channels for one stripe of a fuel element. The stripes run the length of the fuel, from the bottom to the top. Power splits are calculated for each axial node of each plate to determine how much of the power goes out each face of the plate. By running STAT7 multiple times, full core analysis has been performed by analyzing the margin to ONB for each axial node of each stripe of each plate of each element in the core.
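The statistical propagation idea can be sketched in miniature; the toy thermal model, input uncertainties and ONB limit below are invented placeholders, not the STAT7 models or MIT reactor data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Illustrative uncertain inputs (nominal value, 1-sigma); all values hypothetical.
power   = rng.normal(1.00, 0.05, n)   # channel power peaking factor (-)
flow    = rng.normal(1.00, 0.03, n)   # channel flow fraction (-)
t_inlet = rng.normal(42.0, 1.0,  n)   # coolant inlet temperature (C)

# Toy thermal model: coolant heat-up and wall film rise both scale with power/flow.
dt_coolant = 12.0 * power / flow        # coolant temperature rise (C)
dt_film    = 18.0 * power / flow**0.8   # wall superheat over coolant (C)
t_wall = t_inlet + dt_coolant + dt_film

t_onb = 78.0                            # assumed onset-of-nucleate-boiling limit (C)
p_exceed = (t_wall > t_onb).mean()
print(f"mean wall T = {t_wall.mean():.1f} C, P(wall T > ONB limit) = {p_exceed:.1e}")
```

The statistical output of interest is the fraction of sampled states that violate the ONB criterion; a limiting safety setting would be chosen so that this fraction stays below the acceptance target with margin.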

  14. Analysis of uncertainties of thermal hydraulic calculations

    International Nuclear Information System (INIS)

    Macek, J.; Vavrin, J.

    2002-12-01

    In 1993-1997 it was proposed, within OECD projects, that a common program should be set up for uncertainty analysis by a probabilistic method based on a non-parametric statistical approach for system computer codes such as RELAP, ATHLET and CATHARE, and that a method should be developed for statistical analysis of experimental databases for the preparation of the input deck and for statistical analysis of the output calculation results. Software for such statistical analyses would then have to be developed as individual tools, independent of the computer codes used for the thermal hydraulic analysis and of the programs for uncertainty analysis. In this context, a method for estimating the uncertainty of a thermal hydraulic calculation is outlined and selected methods of statistical analysis of uncertainties are described, including methods for prediction accuracy assessment based on the discrete Fourier transform principle. (author)

  15. DS02 uncertainty analysis

    International Nuclear Information System (INIS)

    Kaul, Dean C.; Egbert, Stephen D.; Woolson, William A.

    2005-01-01

    In order to avoid the pitfalls that so discredited DS86 and its uncertainty estimates, and to provide DS02 uncertainties that are both defensible and credible, this report not only presents the ensemble uncertainties assembled from uncertainties in individual computational elements and radiation dose components but also describes how these relate to comparisons between observed and computed quantities at critical intervals in the computational process. These comparisons include those between observed and calculated radiation free-field components, where observations include thermal- and fast-neutron activation and gamma-ray thermoluminescence, which are relevant to the estimated systematic uncertainty for DS02. The comparisons also include those between calculated and observed survivor shielding, where the observations consist of biodosimetric measurements for individual survivors, which are relevant to the estimated random uncertainty for DS02. (J.P.N.)

  16. Uncertainty analysis of environmental models

    International Nuclear Information System (INIS)

    Monte, L.

    1990-01-01

    In the present paper an evaluation of the output uncertainty of an environmental model for assessing the transfer of 137Cs and 131I in the human food chain is carried out on the basis of a statistical analysis of data reported in the literature. The uncertainty analysis offers the opportunity of obtaining some remarkable information about the uncertainty of models predicting the migration of non-radioactive substances in the environment, mainly in relation to dry and wet deposition.

  17. GRS Method for Uncertainty and Sensitivity Evaluation of Code Results and Applications

    International Nuclear Information System (INIS)

    Glaeser, H.

    2008-01-01

    During recent years there has been increasing interest in computational reactor safety analysis in replacing conservative evaluation model calculations by best estimate calculations supplemented by uncertainty analysis of the code results. The evaluation of the margin to acceptance criteria, for example the maximum fuel rod clad temperature, should be based on the upper limit of the calculated uncertainty range. Uncertainty analysis is needed if useful conclusions are to be obtained from best estimate thermal-hydraulic code calculations; otherwise single values of unknown accuracy would be presented for comparison with regulatory acceptance limits. Methods have been developed and presented to quantify the uncertainty of computer code results. The basic techniques proposed by GRS are presented, together with applications to a large break loss of coolant accident on a reference reactor as well as to an experiment simulating containment behaviour.
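The GRS method determines the number of code runs from Wilks' formula for non-parametric tolerance limits, which is why figures such as 59 runs for a one-sided 95%/95% statement appear throughout this literature. A small sketch of the standard formula (this is the textbook order-statistics result, not GRS code):

```python
import math

def wilks_n(gamma=0.95, beta=0.95, m=1):
    """Minimum number of code runs n so that the m-th largest output is a
    one-sided upper (gamma-coverage, beta-confidence) tolerance limit.
    Confidence = P(at least m of n samples exceed the gamma-quantile)."""
    n = m
    while True:
        conf = 1.0 - sum(math.comb(n, k) * (1 - gamma)**k * gamma**(n - k)
                         for k in range(m))
        if conf >= beta:
            return n
        n += 1

print(wilks_n())        # 59 runs: use the maximum as the 95%/95% bound
print(wilks_n(m=2))     # 93 runs: use the second-largest value instead
```

Using a higher order statistic (m > 1) costs more runs but gives a less conservative, more stable bound, which is the usual trade-off discussed in GRS applications.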

  18. The role of the uncertainty in code development

    Energy Technology Data Exchange (ETDEWEB)

    Barre, F. [CEA-Grenoble (France)

    1997-07-01

    From a general point of view, all the results of a calculation should be given with their uncertainty. This is of utmost importance in nuclear safety, where the sizing of the safety systems, and therefore the protection of the population and the environment, depends essentially on the calculation results. Until recent years, safety analysis was performed with conservative tools. Two types of criticism can be made. Firstly, conservative margins can be too large, and it may be possible to reduce the cost of the plant or its operation with a best estimate approach. Secondly, some of the conservative hypotheses may not really be conservative over the full range of physical events which can occur during an accident. Simpson gives an interesting example: in some cases, overestimation of the residual power during a small break LOCA can lead to an overprediction of the swell level and thus to an overprediction of the core cooling, which is the opposite of a conservative prediction. A last question is: does the accumulation of conservative hypotheses for a problem always give a conservative result? Two-phase flow physics, mainly dealing with situations of mechanical and thermal non-equilibrium, is too complicated to answer these questions with simple engineering judgement. The objective of this paper is to review the quantification of the uncertainties which can be made during code development and validation.


  20. Uncertainty analysis of neutron transport calculation

    International Nuclear Information System (INIS)

    Oka, Y.; Furuta, K.; Kondo, S.

    1987-01-01

    A cross section sensitivity-uncertainty analysis code, SUSD, was developed. The code calculates sensitivity coefficients for one- and two-dimensional transport problems based on first order perturbation theory. Variance and standard deviation of detector responses or design parameters can be obtained using the cross section covariance matrix. The code is able to perform sensitivity-uncertainty analysis for secondary neutron angular distributions (SAD) and secondary neutron energy distributions (SED). Covariances of 6Li and 7Li neutron cross sections in JENDL-3PR1 were evaluated, including SAD and SED. Covariances of Fe and Be were also evaluated. The uncertainty of tritium breeding ratio, fast neutron leakage flux and neutron heating was analysed for four types of blanket concepts for a commercial tokamak fusion reactor. The uncertainty of the tritium breeding ratio was less than 6 percent. Contributions from SAD/SED uncertainties are significant for some parameters. Formulas to estimate the errors of numerical solutions of the transport equation were derived based on perturbation theory. This method enables deterministic estimation of the numerical errors due to iterative solution, spatial discretization and Legendre polynomial expansion of transfer cross sections. The calculational errors of the tritium breeding ratio and the fast neutron leakage flux of the fusion blankets were analysed. (author)
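The variance propagation step that such sensitivity-uncertainty codes perform reduces to the first-order "sandwich rule", var(R)/R² = S C Sᵀ, combining sensitivity coefficients with a covariance matrix. The sensitivity vector and relative covariance matrix below are illustrative numbers, not SUSD or JENDL data:

```python
import numpy as np

# Hypothetical relative sensitivity coefficients of a response (e.g. a tritium
# breeding ratio) to three cross-section parameters: (dR/R) / (dsigma/sigma).
S = np.array([0.80, -0.25, 0.10])

# Hypothetical relative covariance matrix of those parameters.
C = np.array([[4.0, 1.0, 0.0],
              [1.0, 9.0, 0.5],
              [0.0, 0.5, 1.0]]) * 1e-4

# First-order "sandwich rule": relative variance of the response.
var = S @ C @ S
print(f"relative standard deviation of response = {np.sqrt(var):.2%}")
```

Correlations between parameters (the off-diagonal terms) can either inflate or cancel the total uncertainty depending on the signs of the sensitivities, which is why full covariance data, not just variances, are needed.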

  1. Reliability analysis under epistemic uncertainty

    International Nuclear Information System (INIS)

    Nannapaneni, Saideep; Mahadevan, Sankaran

    2016-01-01

    This paper proposes a probabilistic framework to include both aleatory and epistemic uncertainty within model-based reliability estimation of engineering systems for individual limit states. Epistemic uncertainty is considered due to both data and model sources. Sparse point and/or interval data regarding the input random variables leads to uncertainty regarding their distribution types, distribution parameters, and correlations; this statistical uncertainty is included in the reliability analysis through a combination of likelihood-based representation, Bayesian hypothesis testing, and Bayesian model averaging techniques. Model errors, which include numerical solution errors and model form errors, are quantified through Gaussian process models and included in the reliability analysis. The probability integral transform is used to develop an auxiliary variable approach that facilitates a single-level representation of both aleatory and epistemic uncertainty. This strategy results in an efficient single-loop implementation of Monte Carlo simulation (MCS) and FORM/SORM techniques for reliability estimation under both aleatory and epistemic uncertainty. Two engineering examples are used to demonstrate the proposed methodology. - Highlights: • Epistemic uncertainty due to data and model included in reliability analysis. • A novel FORM-based approach proposed to include aleatory and epistemic uncertainty. • A single-loop Monte Carlo approach proposed to include both types of uncertainties. • Two engineering examples used for illustration.
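The auxiliary-variable idea (one uniform sample picks the epistemic realisation via the probability integral transform, a second picks the aleatory one, giving a single flat loop instead of nested loops) can be sketched on a toy load-capacity limit state; the interval-valued mean, the scatter and the capacity below are all hypothetical:

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(1)
n = 100_000

# Epistemic uncertainty: the mean of the load is only known as an interval [9, 11].
# One auxiliary uniform variable selects the epistemic value through the inverse
# CDF of U(9, 11); a second selects the aleatory realisation -- a single loop.
u_epi = rng.uniform(size=n)
mu_load = 9.0 + 2.0 * u_epi

u_ale = rng.uniform(size=n)
load = np.array([NormalDist(mu, 1.5).inv_cdf(u)   # aleatory scatter about mu
                 for mu, u in zip(mu_load, u_ale)])

capacity = 16.0                                    # assumed deterministic capacity
pf = float((load > capacity).mean())
print(f"estimated failure probability = {pf:.1e}")
```

A nested ("double-loop") implementation would re-run the whole aleatory simulation for every epistemic sample; the transform to uniform auxiliary variables collapses both loops into one Monte Carlo pass, which is the efficiency gain the abstract refers to.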

  2. A statistical methodology for quantification of uncertainty in best estimate code physical models

    International Nuclear Information System (INIS)

    Vinai, Paolo; Macian-Juan, Rafael; Chawla, Rakesh

    2007-01-01

    A novel uncertainty assessment methodology, based on a statistical non-parametric approach, is presented in this paper. It achieves quantification of code physical model uncertainty by making use of model performance information obtained from studies of appropriate separate-effect tests. Uncertainties are quantified in the form of estimated probability density functions (pdf's), calculated with a newly developed non-parametric estimator. The new estimator objectively predicts the probability distribution of the model's 'error' (its uncertainty) from databases reflecting the model's accuracy on the basis of available experiments. The methodology is completed by applying a novel multi-dimensional clustering technique based on the comparison of model error samples with the Kruskal-Wallis test. This takes into account the fact that a model's uncertainty depends on system conditions, since a best estimate code can give predictions whose accuracy is affected by the regions of the physical space in which the experiments occur. The final result is an objective, rigorous and accurate manner of assigning uncertainty to code models, i.e. the input information needed by code uncertainty propagation methodologies used for assessing the accuracy of best estimate codes in nuclear systems analysis. The new methodology has been applied to the quantification of the uncertainty in the RETRAN-3D void model and then used in the analysis of an independent separate-effect experiment. This has clearly demonstrated the basic feasibility of the approach, as well as its advantages in yielding narrower uncertainty bands in quantifying the code's accuracy for void fraction predictions.
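A minimal sketch of non-parametric density estimation of a model's error, in the spirit of (though much simpler than) the estimator described above; the error samples are synthetic, standing in for measured-minus-predicted void fractions from separate-effect tests:

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic model-error samples (measured minus predicted void fraction)
# standing in for a separate-effect test database.
errors = rng.normal(0.02, 0.05, 120)

def kde(samples, grid):
    """Fixed-bandwidth Gaussian kernel density estimate of the error pdf."""
    h = 1.06 * samples.std(ddof=1) * len(samples) ** (-1 / 5)  # Silverman's rule
    z = (grid[:, None] - samples[None, :]) / h
    return np.exp(-0.5 * z**2).sum(axis=1) / (len(samples) * h * np.sqrt(2 * np.pi))

grid = np.linspace(-0.2, 0.25, 400)
pdf = kde(errors, grid)

mass = pdf.sum() * (grid[1] - grid[0])   # should be ~1 if the grid covers the support
print(f"estimated error pdf peaks near {grid[np.argmax(pdf)]:+.3f}; mass on grid = {mass:.3f}")
```

The resulting pdf of the model error is exactly the kind of object a code uncertainty propagation method would then sample from; the clustering step in the paper additionally splits the error database by system conditions before estimating separate pdf's.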

  3. Measures of uncertainty, importance and sensitivity of the SEDA code

    International Nuclear Information System (INIS)

    Baron, J.; Caruso, A.; Vinate, H.

    1996-01-01

    The purpose of this work is the estimation of the uncertainty in the results of the SEDA code (Sistema de Evaluacion de Dosis en Accidentes) as a function of its input data and parameters. The SEDA code has been developed by the Comision Nacional de Energia Atomica for the estimation of doses during emergencies in the vicinity of the Atucha and Embalse nuclear power plants. The user feeds the code meteorological data, source terms and accident data (timing involved, release height, thermal content of the release, etc.). It is designed to be used during the emergency and to provide fast results that support decision making. The uncertainty in the results of the SEDA code is quantified in the present paper. This uncertainty is associated both with the data the user inputs to the code and with the uncertain parameters of the code's own models. The method used consisted of the statistical characterization of the parameters and variables, assigning them adequate probability distributions. These distributions were sampled with the Latin Hypercube Sampling method, which is a stratified multi-variable Monte Carlo technique. The code was run for each of the samples and, finally, a sample of results was obtained. These results were characterized from the statistical point of view (obtaining their mean, most probable value, distribution shape, etc.) for several distances from the source. Finally, the Partial Correlation Coefficient and Standard Regression Coefficient techniques were used to obtain the relative importance of each input variable and the sensitivity of the code to its variations. The measures of importance and sensitivity were obtained for several distances from the source and various cases of atmospheric stability, making comparisons possible. This work builds confidence in the code results by associating an uncertainty with them, indicating the limits within which the results can vary in a real
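The combination of Latin Hypercube Sampling with Standardized Regression Coefficients described above can be sketched as follows; the toy dose model, input names and ranges are invented for illustration and are not SEDA's models:

```python
import numpy as np

rng = np.random.default_rng(3)
n, k = 500, 3

# Latin Hypercube Sampling: one stratum per sample in each dimension,
# strata shuffled independently across dimensions, jittered within strata.
strata = rng.permuted(np.tile(np.arange(n), (k, 1)), axis=1).T
u = (strata + rng.uniform(size=(n, k))) / n

# Map the unit hypercube to hypothetical input distributions.
wind   = 1.0 + 9.0 * u[:, 0]        # wind speed, U(1, 10) m/s
height = 10.0 + 90.0 * u[:, 1]      # release height, U(10, 100) m
v_dep  = 10 ** (-3 + 2 * u[:, 2])   # deposition velocity, log-uniform m/s

# Toy dose model standing in for the code output.
dose = 50.0 / (wind * height) + 5.0 * v_dep

# Standardized Regression Coefficients: regress standardized output on
# standardized inputs; coefficient magnitudes rank input importance.
X = np.column_stack([wind, height, v_dep])
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
ys = (dose - dose.mean()) / dose.std()
src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
for name, c in zip(["wind", "height", "v_dep"], src):
    print(f"SRC[{name}] = {c:+.2f}")
```

Partial correlation coefficients would be computed similarly, but after removing the linear influence of the other inputs from both sides of the regression, which makes them more robust when inputs are correlated.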

  4. Mathematical Analysis of Uncertainty

    Directory of Open Access Journals (Sweden)

    Angel GARRIDO

    2016-01-01

    Classical Logic showed early its insufficiencies for solving AI problems. The introduction of Fuzzy Logic aims at this problem. There has been research in the conventional Rough direction alone or in the Fuzzy direction alone and, more recently, attempts to combine both into Fuzzy Rough Sets or Rough Fuzzy Sets. We analyse some new and powerful tools in the study of uncertainty, such as Probabilistic Graphical Models, Chain Graphs, Bayesian Networks, and Markov Networks, integrating our knowledge of graphs and probability.

  5. Uncertainty analysis for secondary energy distributions

    International Nuclear Information System (INIS)

    Gerstl, S.A.W.

    1978-01-01

    In many transport calculations the integral design parameter of interest (response) is determined mainly by secondary particles, such as gamma rays from (n,γ) reactions or secondary neutrons from inelastic scattering events or (n,2n) reactions. Standard sensitivity analysis usually allows calculation of the sensitivities to the production cross sections of such secondaries, but an extended formalism is needed to also obtain the sensitivities to the energy distribution of the generated secondary particles. For a 30-group standard cross-section set, 84% of all non-zero table positions pertain to the description of secondary energy distributions (SED's) and only 16% to the actual reaction cross sections. Therefore, any sensitivity/uncertainty analysis which does not consider the effects of SED's is incomplete and neglects most of the input data. This paper describes the methods by which sensitivity profiles for SED's are obtained and used to estimate the uncertainty of an integral response due to uncertainties in these SED's. The detailed theory is documented elsewhere and implemented in the LASL sensitivity code SENSIT. SED sensitivity profiles have proven particularly valuable in cross-section uncertainty analyses for fusion reactors. Even when the production cross sections for secondary neutrons were assumed to be without error, the uncertainties in the energy distribution of these secondaries produced appreciable uncertainties in the calculated tritium breeding rate. However, complete error files for SED's are presently nonexistent. Therefore, methods are described that allow rough error estimates, based on integral SED sensitivities, when only estimated SED uncertainties are available.

  6. MUP, CEC-DES, STRADE. Codes for uncertainty propagation, experimental design and stratified random sampling techniques

    International Nuclear Information System (INIS)

    Amendola, A.; Astolfi, M.; Lisanti, B.

    1983-01-01

    The report describes how to use the codes: MUP (Monte Carlo Uncertainty Propagation) for uncertainty analysis by Monte Carlo simulation, including correlation analysis, extreme value identification and study of selected ranges of the variable space; CEC-DES (Central Composite Design) for building experimental matrices according to the requirements of Central Composite and Factorial Experimental Designs; and STRADE (Stratified Random Design) for experimental designs based on Latin Hypercube Sampling techniques. Application fields of the codes are probabilistic risk assessment, experimental design, sensitivity analysis and system identification problems.

  7. Optimization of FRAP uncertainty analysis option

    International Nuclear Information System (INIS)

    Peck, S.O.

    1979-10-01

    The automated uncertainty analysis option that has been incorporated in the FRAP codes (FRAP-T5 and FRAPCON-2) provides the user with a means of obtaining uncertainty bands on code-predicted variables at user-selected times during a fuel pin analysis. These uncertainty bands are obtained by multiple single fuel pin analyses to generate data which can then be analyzed by second order statistical error propagation techniques. In this process, a considerable amount of data is generated and stored on tape. The user has certain choices to make regarding which independent variables are to be used in the analysis and what order of error propagation equation should be used in modeling the output response. To aid the user in these decisions, a computer program, ANALYZ, has been written and added to the uncertainty analysis option package. A variety of considerations involved in fitting response surface equations, and certain pitfalls of which the user should be aware, are discussed. An equation is derived expressing a residual as a function of a fitted model and an assumed true model. A variety of experimental design choices are discussed, including the advantages and disadvantages of each approach. Finally, a description of the subcodes which constitute program ANALYZ is provided.
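The response-surface workflow the abstract describes (fit a low-order polynomial to a designed set of code runs, then propagate input uncertainty cheaply on the fitted surface) can be sketched as follows; the quadratic stand-in for the fuel code, the coded inputs and their distributions are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical "code runs": peak clad temperature (K) on a 5x5 factorial design
# over two coded inputs in [-1, 1]; this quadratic stands in for the fuel code.
g1, g2 = np.meshgrid(np.linspace(-1, 1, 5), np.linspace(-1, 1, 5))
x1, x2 = g1.ravel(), g2.ravel()
pct = 900 + 40 * x1 - 25 * x2 + 8 * x1 * x2 + 5 * x1**2

def basis(a, b):
    """Second-order response surface basis: 1, a, b, ab, a^2, b^2."""
    return np.column_stack([np.ones_like(a), a, b, a * b, a**2, b**2])

# Fit the response surface to the design points by least squares.
coef, *_ = np.linalg.lstsq(basis(x1, x2), pct, rcond=None)

# Cheap uncertainty propagation: Monte Carlo on the fitted surface only.
s1 = rng.normal(0, 0.3, 100_000).clip(-1, 1)
s2 = rng.normal(0, 0.3, 100_000).clip(-1, 1)
surf = basis(s1, s2) @ coef
print(f"PCT = {surf.mean():.1f} +/- {surf.std(ddof=1):.1f} K")
```

The pitfall the abstract alludes to is visible here: the surface is only trustworthy where the design covered the input space, and the residual between surface and code must be checked before the uncertainty bands are believed.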

  8. Sensitivity and uncertainty studies of the CRAC2 computer code

    International Nuclear Information System (INIS)

    Kocher, D.C.; Ward, R.C.; Killough, G.G.; Dunning, D.E. Jr.; Hicks, B.B.; Hosker, R.P. Jr.; Ku, J.Y.; Rao, K.S.

    1985-05-01

    This report presents a study of the sensitivity of early fatalities, early injuries, latent cancer fatalities, and economic costs for hypothetical nuclear reactor accidents as predicted by the CRAC2 computer code (CRAC = Calculation of Reactor Accident Consequences) to uncertainties in selected models and parameters used in the code. The sources of uncertainty that were investigated in the CRAC2 sensitivity studies include (1) the model for plume rise, (2) the model for wet deposition, (3) the procedure for meteorological bin-sampling involving the selection of weather sequences that contain rain, (4) the dose conversion factors for inhalation as they are affected by uncertainties in the physical and chemical form of the released radionuclides, (5) the weathering half-time for external ground-surface exposure, and (6) the transfer coefficients for estimating exposures via terrestrial foodchain pathways. The sensitivity studies were performed for selected radionuclide releases, hourly meteorological data, land-use data, a fixed non-uniform population distribution, a single evacuation model, and various release heights and sensible heat rates. Two important general conclusions from the sensitivity and uncertainty studies are as follows: (1) The large effects on predicted early fatalities and early injuries that were observed in some of the sensitivity studies apparently are due in part to the presence of thresholds in the dose-response models. Thus, the observed sensitivities depend in part on the magnitude of the radionuclide releases. (2) Some of the effects on predicted early fatalities and early injuries that were observed in the sensitivity studies were comparable to effects that were due only to the selection of different sets of weather sequences in bin-sampling runs. 47 figs., 50 tabs

  9. Uncertainties associated with the use of the KENO Monte Carlo criticality codes

    International Nuclear Information System (INIS)

    Landers, N.F.; Petrie, L.M.

    1989-01-01

    The KENO multi-group Monte Carlo criticality codes have earned the reputation of being efficient, user friendly tools especially suited for the analysis of situations commonly encountered in the storage and transportation of fissile materials. Throughout their twenty years of service, a continuing effort has been made to maintain and improve these codes to meet the needs of the nuclear criticality safety community. Foremost among these needs is the knowledge of how to utilize the results safely and effectively. Therefore it is important that code users be aware of uncertainties that may affect their results. These uncertainties originate from approximations in the problem data, methods used to process cross sections, and assumptions, limitations and approximations within the criticality computer code itself. 6 refs., 8 figs., 1 tab

  10. Systematic Analysis Of Ocean Colour Uncertainties

    Science.gov (United States)

    Lavender, Samantha

    2013-12-01

    This paper reviews current research into the estimation of uncertainties as a pixel-based measure to aid non-specialist users of remote sensing products. An example MERIS image, captured on 28 March 2012, was processed with above-water atmospheric correction code. This was initially based on both the Antoine & Morel Standard Atmospheric Correction, with Bright Pixel correction component, and the Doerffer Neural Network coastal waters approach. It showed that analysis of the atmospheric by-products yields important information about the separation of the atmospheric and in-water signals, helping to sign-post possible uncertainties in the atmospheric correction results. Further analysis has concentrated on implementing a 'simplistic' atmospheric correction so that the impact of changing the input auxiliary data can be analysed; the influence of changing surface pressure is demonstrated. Future work will focus on automating the analysis so that the methodology can be implemented within an operational system.

  11. Approach to uncertainty evaluation for safety analysis

    International Nuclear Information System (INIS)

    Ogura, Katsunori

    2005-01-01

    Nuclear power plant safety is generally verified and confirmed through accident simulations using computer codes, because integrated experiments or tests for the verification and validation of plant safety are very difficult to perform owing to radioactive consequences, cost, and scaling to the actual plant. Traditionally, plant safety was secured by the substantial safety margin resulting from the conservative assumptions and models applied in those simulations. More recently, best-estimate analysis based on realistic assumptions and models, supported by accumulated insights, has become possible; this reduces the safety margin in the analysis results and increases the need to evaluate the reliability or uncertainty of the analysis results. This paper introduces an approach to evaluating the uncertainty of accident simulations and their results. (Note: This research was done not at the Japan Nuclear Energy Safety Organization but at the Tokyo Institute of Technology.) (author)

  12. Sampling-based nuclear data uncertainty quantification for continuous energy Monte-Carlo codes

    International Nuclear Information System (INIS)

    Zhu, T.

    2015-01-01

    Research on the uncertainty of nuclear data is motivated by practical necessity. Nuclear data uncertainties can propagate through nuclear system simulations into operation and safety related parameters. The tolerance for uncertainties in nuclear reactor design and operation can affect the economic efficiency of nuclear power, and essentially its sustainability. The goal of the present PhD research is to establish a methodology of nuclear data uncertainty quantification (NDUQ) for MCNPX, the continuous-energy Monte-Carlo (M-C) code. The high fidelity (continuous-energy treatment and flexible geometry modelling) of MCNPX makes it the code of choice for routine criticality safety calculations at PSI/LRS, but also raises challenges for NDUQ by conventional sensitivity/uncertainty (S/U) methods. For example, only recently, in 2011, was the capability of calculating continuous-energy k_eff sensitivity to nuclear data demonstrated in certain M-C codes, using the method of iterated fission probability. The methodology developed during this PhD research is fundamentally different from the conventional S/U approach: nuclear data are treated as random variables and sampled in accordance with presumed probability distributions. When sampled nuclear data are used in repeated model calculations, the output variance is attributed to the collective uncertainties of the nuclear data. The NUSS (Nuclear data Uncertainty Stochastic Sampling) tool is based on this sampling approach and is implemented to work with MCNPX's ACE format of nuclear data, which also gives NUSS compatibility with the MCNP and SERPENT M-C codes. Multigroup uncertainties are used for the sampling of the ACE-formatted pointwise-energy nuclear data in a groupwise manner, owing to the more limited quantity and quality of available nuclear data uncertainties. Conveniently, the use of multigroup nuclear data uncertainties allows consistent comparison between NUSS and other methods (both S/U and sampling-based) that employ the same multigroup uncertainties.
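The stochastic sampling approach described above can be illustrated with a minimal sketch; the `toy_keff` ratio model and all nominal values and uncertainties below are invented stand-ins for a real MCNPX run and ACE-format data, not the NUSS implementation:

```python
import numpy as np

rng = np.random.default_rng(42)

def toy_keff(sigma_f, sigma_a):
    """Toy stand-in for an expensive continuous-energy Monte-Carlo
    criticality run: a simple ratio model, NOT a transport solver."""
    return 2.43 * sigma_f / (sigma_f + sigma_a)

# Nominal 'nuclear data' and presumed relative standard deviations
# (all values are illustrative assumptions)
sigma_f0, sigma_a0 = 1.2, 0.8
rel_sd_f, rel_sd_a = 0.02, 0.03

n_samples = 1000
keff = np.empty(n_samples)
for i in range(n_samples):
    # Treat nuclear data as random variables sampled from presumed
    # (here: normal) probability distributions
    sf = sigma_f0 * rng.normal(1.0, rel_sd_f)
    sa = sigma_a0 * rng.normal(1.0, rel_sd_a)
    keff[i] = toy_keff(sf, sa)

# The output variance of the repeated calculations is attributed to the
# collective uncertainty of the sampled nuclear data
print(f"mean k_eff = {keff.mean():.4f}, rel. std = {keff.std() / keff.mean():.4%}")
```

The same loop structure applies when the toy function is replaced by an actual code invocation; only the cost per sample changes.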

  13. Sampling-based nuclear data uncertainty quantification for continuous energy Monte-Carlo codes

    Energy Technology Data Exchange (ETDEWEB)

    Zhu, T.

    2015-07-01

    Research on the uncertainty of nuclear data is motivated by practical necessity. Nuclear data uncertainties can propagate through nuclear system simulations into operation and safety related parameters. The tolerance for uncertainties in nuclear reactor design and operation can affect the economic efficiency of nuclear power, and essentially its sustainability. The goal of the present PhD research is to establish a methodology of nuclear data uncertainty quantification (NDUQ) for MCNPX, the continuous-energy Monte-Carlo (M-C) code. The high fidelity (continuous-energy treatment and flexible geometry modelling) of MCNPX makes it the code of choice for routine criticality safety calculations at PSI/LRS, but also raises challenges for NDUQ by conventional sensitivity/uncertainty (S/U) methods. For example, only recently, in 2011, was the capability of calculating continuous-energy k_eff sensitivity to nuclear data demonstrated in certain M-C codes, using the method of iterated fission probability. The methodology developed during this PhD research is fundamentally different from the conventional S/U approach: nuclear data are treated as random variables and sampled in accordance with presumed probability distributions. When sampled nuclear data are used in repeated model calculations, the output variance is attributed to the collective uncertainties of the nuclear data. The NUSS (Nuclear data Uncertainty Stochastic Sampling) tool is based on this sampling approach and is implemented to work with MCNPX's ACE format of nuclear data, which also gives NUSS compatibility with the MCNP and SERPENT M-C codes. Multigroup uncertainties are used for the sampling of the ACE-formatted pointwise-energy nuclear data in a groupwise manner, owing to the more limited quantity and quality of available nuclear data uncertainties. Conveniently, the use of multigroup nuclear data uncertainties allows consistent comparison between NUSS and other methods (both S/U and sampling-based) that employ the same multigroup uncertainties.

  14. Sensitivity and uncertainty studies of the CRAC2 computer code

    International Nuclear Information System (INIS)

    Kocher, D.C.; Ward, R.C.; Killough, G.G.; Dunning, D.E. Jr.; Hicks, B.B.; Hosker, R.P. Jr.; Ku, J.Y.; Rao, K.S.

    1987-01-01

    The authors have studied the sensitivity of health impacts from nuclear reactor accidents, as predicted by the CRAC2 computer code, to the following sources of uncertainty: (1) the model for plume rise, (2) the model for wet deposition, (3) the meteorological bin-sampling procedure for selecting weather sequences with rain, (4) the dose conversion factors for inhalation as affected by uncertainties in the particle size of the carrier aerosol and the clearance rates of radionuclides from the respiratory tract, (5) the weathering half-time for external ground-surface exposure, and (6) the transfer coefficients for terrestrial foodchain pathways. Predicted health impacts usually showed little sensitivity to use of an alternative plume-rise model or a modified rain-bin structure in bin-sampling. Health impacts often were quite sensitive to use of an alternative wet-deposition model in single-trial runs with rain during plume passage, but were less sensitive to the model in bin-sampling runs. Uncertainties in the inhalation dose conversion factors had important effects on early injuries in single-trial runs. Latent cancer fatalities were moderately sensitive to uncertainties in the weathering half-time for ground-surface exposures, but showed little sensitivity to the transfer coefficients for terrestrial foodchain pathways. Sensitivities of CRAC2 predictions to uncertainties in the models and parameters also depended on the magnitude of the source term, and some of the effects on early health effects were comparable to those that were due only to selection of different sets of weather sequences in bin-sampling

  15. Validation of Fuel Performance Uncertainty for RIA Safety Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Park, Nam-Gyu; Yoo, Jong-Sung; Jung, Yil-Sup [KEPCO Nuclear Fuel Co., Daejeon (Korea, Republic of)]

    2016-10-15

    To achieve this, the computer code performance has to be validated against experimental results, and for the uncertainty quantification, important uncertainty parameters need to be selected and the combined uncertainty evaluated with an acceptable statistical treatment. Uncertainty parameters important to rod performance, such as fuel enthalpy, fission gas release, and cladding hoop strain, were chosen through rigorous sensitivity studies, and their validity was assessed using the experimental results from tests in CABRI and NSRR. The analysis revealed that several tested rods were not bounded within the combined fuel performance uncertainty. An assessment of fuel performance with an extended fuel power uncertainty was therefore performed on the rods tested in NSRR and CABRI; its results again showed that several tested rods were not bounded within the calculated fuel performance uncertainty. This implies that the currently considered uncertainty range of the parameters is not sufficient to cover the fuel performance.

  16. Correlated statistical uncertainties in coded-aperture imaging

    International Nuclear Information System (INIS)

    Fleenor, Matthew C.; Blackston, Matthew A.; Ziock, Klaus P.

    2015-01-01

    In nuclear security applications, coded-aperture imagers can provide a wealth of information regarding the attributes of both the radioactive and nonradioactive components of the objects being imaged. However, for optimum benefit to the community, spatial attributes need to be determined in a quantitative and statistically meaningful manner. To address a deficiency of quantifiable errors in coded-aperture imaging, we present uncertainty matrices containing covariance terms between image pixels for MURA mask patterns. We calculated these correlated uncertainties as functions of variation in mask rank, mask pattern over-sampling, and whether or not anti-mask data are included. Utilizing simulated point source data, we found that correlations arose when two or more image pixels were summed. Furthermore, we found that the presence of correlations was heightened by the process of over-sampling, while correlations were suppressed by the inclusion of anti-mask data and with increased mask rank. As an application of this result, we explored how statistics-based alarming is impacted in a radiological search scenario
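The practical consequence of the correlations reported above can be sketched with variance propagation for a sum of pixels; the 2x2 covariance matrix below is invented for illustration, not derived from an actual MURA-mask reconstruction:

```python
import numpy as np

# Hypothetical covariance matrix for two reconstructed image pixels
# (illustrative numbers, not from a real coded-aperture image)
cov = np.array([[4.0, 1.5],
                [1.5, 3.0]])

# Summing pixels with weights w propagates as Var(w.x) = w^T C w
w = np.array([1.0, 1.0])
var_sum = w @ cov @ w

# Dropping the off-diagonal covariance terms underestimates the variance
var_naive = float(np.trace(cov))

print(var_sum, var_naive)
```

With positive correlations, as found here for over-sampled reconstructions, the naive diagonal-only estimate is low, which matters when summed-region counts feed a statistics-based alarm threshold.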

  17. Uncertainty analysis for hot channel

    International Nuclear Information System (INIS)

    Panka, I.; Kereszturi, A.

    2006-01-01

    The fulfillment of the safety analysis acceptance criteria is usually evaluated by separate hot channel calculations using the results of neutronic and/or thermal-hydraulic system calculations. In the case of an ATWS event (inadvertent withdrawal of a control assembly), according to the analysis, a number of fuel rods experience DNB for a longer time and must be regarded as failed; their number must be determined for further evaluation of the radiological consequences. In the deterministic approach, the global power history must be multiplied by different hot channel factors (k_x) taking into account the radial power peaking factors for each fuel pin. If DNB occurs, a small number of hot channel calculations are needed to determine the limiting k_x that leads just to DNB and fuel failure (the conservative DNBR limit is 1.33). Knowing the pin power distribution from the core design calculation, the number of failed fuel pins can then be calculated. This procedure can also be performed with conservative assumptions (e.g. conservative input parameters in the hot channel calculations). In the hot channel uncertainty analysis, the relevant input parameters of the hot channel calculations (k_x, mass flow, inlet temperature of the coolant, pin average burnup, initial gap size, and the selection of the power history influencing the gap conductance value) and the DNBR limit are varied considering their respective uncertainties. An uncertainty analysis methodology was elaborated combining the response surface method with the one-sided tolerance limit method of Wilks. The results of the deterministic and uncertainty hot channel calculations are compared with respect to the number of failed fuel rods, the maximum temperature of the clad surface, and the maximum temperature of the fuel. (Authors)
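The one-sided tolerance limit method of Wilks mentioned above fixes how many code runs are needed so that an order statistic of the output bounds a given population quantile at a given confidence. A small sketch of the standard formula, independent of any particular hot channel code:

```python
import math

def wilks_n(coverage=0.95, confidence=0.95, order=1):
    """Smallest N such that the order-th largest of N random runs bounds
    the `coverage` quantile with at least `confidence` (one-sided Wilks)."""
    n = order
    while True:
        # P(fewer than `order` of n samples exceed the coverage quantile);
        # the number of exceedances is Binomial(n, 1 - coverage)
        tail = sum(math.comb(n, k) * (1 - coverage)**k * coverage**(n - k)
                   for k in range(order))
        if 1 - tail >= confidence:
            return n
        n += 1

print(wilks_n())          # -> 59 runs for the classic 95%/95% first-order limit
print(wilks_n(order=2))   # -> 93 runs when the second-largest output is used
```

For `order=1` the loop reduces to the familiar condition 1 - coverage**N >= confidence.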

  18. LOFT differential pressure uncertainty analysis

    International Nuclear Information System (INIS)

    Evans, R.P.; Biladeau, G.L.; Quinn, P.A.

    1977-03-01

    A performance analysis of the LOFT differential pressure (ΔP) measurement is presented. Along with complete descriptions of the test programs and theoretical studies that have been conducted on the ΔP measurement, specific sources of measurement uncertainty are identified, quantified, and combined to provide an assessment of the ability of this measurement to satisfy the SDD 1.4.1C (June 1975) requirement for the measurement of differential pressure.

  19. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, main report

    International Nuclear Information System (INIS)

    Harper, F.T.; Young, M.L.; Miller, L.A.; Hora, S.C.; Lui, C.H.; Goossens, L.H.J.; Cooke, R.M.; Paesler-Sauer, J.; Helton, J.C.

    1995-01-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The ultimate objective of the joint effort was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. Experts developed their distributions independently. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model. The resulting distributions closely replicated the aggregated elicited wet deposition distributions. To validate the distributions generated for the dispersion code input variables, samples were taken from the distributions and propagated through the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the first of a three-volume document describing the project.
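Propagating sampled input distributions through a Gaussian plume model, as done above for validation, can be sketched in a few lines; the lognormal parameters, release rate, and stack height below are invented placeholders, not the elicited distributions:

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_plume(Q, u, sigma_y, sigma_z, H):
    """Ground-level centreline concentration for a continuous release of
    Q (Bq/s) at effective height H (m) in wind speed u (m/s)."""
    return Q / (np.pi * sigma_y * sigma_z * u) * np.exp(-H**2 / (2 * sigma_z**2))

# Illustrative lognormal stand-ins for elicited dispersion-parameter
# distributions at a fixed downwind distance (invented values)
n = 10_000
sigma_y = rng.lognormal(mean=np.log(200.0), sigma=0.3, size=n)   # m
sigma_z = rng.lognormal(mean=np.log(50.0), sigma=0.4, size=n)    # m

# Propagate the parameter samples through the plume model
chi = gaussian_plume(Q=1.0e9, u=5.0, sigma_y=sigma_y, sigma_z=sigma_z, H=50.0)

# Percentiles of the propagated output distribution can then be compared
# with the aggregated elicited distribution
p05, p50, p95 = np.percentile(chi, [5, 50, 95])
print(f"5/50/95 percentiles: {p05:.3g} / {p50:.3g} / {p95:.3g} Bq/m^3")
```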

  20. Stochastic methods for uncertainty treatment of functional variables in computer codes: application to safety studies

    International Nuclear Information System (INIS)

    Nanty, Simon

    2015-01-01

    This work relates to the framework of uncertainty quantification for numerical simulators, and more precisely studies two industrial applications linked to the safety studies of nuclear plants. These two applications share several features. The first is that the computer code inputs are functional and scalar variables, the functional ones being dependent. The second is that the probability distribution of the functional variables is known only through a sample of their realizations. The third, relative to only one of the two applications, is the high computational cost of the code, which limits the number of possible simulations. The main objective of this work was to propose a complete methodology for the uncertainty analysis of numerical simulators for the two considered cases. First, we proposed a methodology to quantify the uncertainties of dependent functional random variables from a sample of their realizations. This methodology makes it possible both to model the dependency between variables and to model their link to another variable, called a covariate, which could be, for instance, the output of the considered code. We then developed an adaptation of a visualization tool for functional data, which makes it possible to simultaneously visualize the uncertainties and features of dependent functional variables. Second, a method to perform the global sensitivity analysis of the codes used in the two studied cases was proposed. In the case of a computationally demanding code, the direct use of quantitative global sensitivity analysis methods is intractable. To overcome this issue, the retained solution consists in building a surrogate model or metamodel, a fast-running model approximating the computationally expensive code. An optimized uniform sampling strategy for scalar and functional variables was developed to build a learning basis for the metamodel. Finally, a new approximation approach for expensive codes with functional outputs has been proposed.
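The surrogate-model idea can be sketched with a cheap quadratic metamodel fitted by least squares; the `expensive_code` function and the plain uniform design below are hypothetical stand-ins for the real simulator and the optimized sampling strategy described in the abstract:

```python
import numpy as np

rng = np.random.default_rng(1)

def expensive_code(x1, x2):
    """Stand-in for a costly simulator (hypothetical smooth response)."""
    return np.sin(x1) + 0.5 * x2**2 + 0.1 * x1 * x2

# Learning basis: plain uniform sampling as a simple stand-in for a
# space-filling design
n_train = 50
X = rng.uniform(-1.0, 1.0, size=(n_train, 2))
y = expensive_code(X[:, 0], X[:, 1])

# Quadratic polynomial metamodel fitted by ordinary least squares
def features(X):
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])

coef, *_ = np.linalg.lstsq(features(X), y, rcond=None)

# The fast surrogate replaces the code inside sensitivity/uncertainty loops
X_test = rng.uniform(-1.0, 1.0, size=(1000, 2))
y_true = expensive_code(X_test[:, 0], X_test[:, 1])
y_hat = features(X_test) @ coef
rmse = np.sqrt(np.mean((y_hat - y_true)**2))
print(f"surrogate RMSE: {rmse:.4f}")
```

Once validated on held-out runs, thousands of surrogate evaluations can stand in for code runs when computing global sensitivity indices.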

  1. Discussion of OECD LWR Uncertainty Analysis in Modelling Benchmark

    International Nuclear Information System (INIS)

    Ivanov, K.; Avramova, M.; Royer, E.; Gillford, J.

    2013-01-01

    The demand for best estimate calculations in nuclear reactor design and safety evaluations has increased in recent years. Uncertainty quantification has been highlighted as part of the best estimate calculations. The modelling aspects of uncertainty and sensitivity analysis are to be further developed and validated on scientific grounds in support of their performance and application to multi-physics reactor simulations. The Organization for Economic Co-operation and Development (OECD) / Nuclear Energy Agency (NEA) Nuclear Science Committee (NSC) has endorsed the creation of an Expert Group on Uncertainty Analysis in Modelling (EGUAM). Within the framework of EGUAM/NSC activities, the OECD/NEA initiated the Benchmark for Uncertainty Analysis in Modelling for Design, Operation, and Safety Analysis of Light Water Reactors (OECD LWR UAM benchmark). The general objective of the benchmark is to propagate the predictive uncertainties of code results through complex coupled multi-physics and multi-scale simulations. The benchmark is divided into three phases, with Phase I highlighting the uncertainty propagation in stand-alone neutronics calculations, while Phases II and III focus on uncertainty analysis of the reactor core and system, respectively. This paper discusses the progress made in the Phase I calculations, the specifications for Phase II, and the upcoming challenges in defining the Phase III exercises. The challenges of applying uncertainty quantification to complex code systems, in particular time-dependent coupled physics models, are the large computational burden and the use of non-linear models (expected due to the physics coupling). (authors)

  2. Applied research in uncertainty modeling and analysis

    CERN Document Server

    Ayyub, Bilal

    2005-01-01

    Uncertainty has been a concern to engineers, managers, and scientists for many years. For a long time uncertainty was considered synonymous with random, stochastic, statistical, or probabilistic. Since the early sixties, views on uncertainty have become more heterogeneous. In the past forty years, numerous tools that model uncertainty, above and beyond statistics, have been proposed by engineers and scientists. The tool or method used to model uncertainty in a specific context should be chosen by considering the features of the phenomenon under consideration, not independently of what is known about the system and what causes the uncertainty. In this fascinating overview of the field, the authors provide broad coverage of uncertainty analysis/modeling and its application. Applied Research in Uncertainty Modeling and Analysis presents the perspectives of various researchers and practitioners on uncertainty analysis and modeling outside their own fields and domain expertise. Rather than focusing explicitly on...

  3. BN-600 MOX Core Benchmark Analysis. Results from Phases 4 and 6 of a Coordinated Research Project on Updated Codes and Methods to Reduce the Calculational Uncertainties of the LMFR Reactivity Effects

    International Nuclear Information System (INIS)

    2013-12-01

    For those Member States that have or have had significant fast reactor development programmes, it is of utmost importance that they have validated, up to date codes and methods for fast reactor physics analysis in support of R and D and core design activities in the area of actinide utilization and incineration. In particular, some Member States have recently focused on fast reactor systems for minor actinide transmutation and on cores optimized for consuming rather than breeding plutonium; the physics of the breeder reactor cycle having already been widely investigated. Plutonium burning systems may have an important role in managing plutonium stocks until the time when major programmes of self-sufficient fast breeder reactors are established. For assessing the safety of these systems, it is important to determine the prediction accuracy of transient simulations and their associated reactivity coefficients. In response to Member States' expressed interest, the IAEA sponsored a coordinated research project (CRP) on Updated Codes and Methods to Reduce the Calculational Uncertainties of the LMFR Reactivity Effects. The CRP started in November 1999 and, at the first meeting, the members of the CRP endorsed a benchmark on the BN-600 hybrid core for consideration in its first studies. Benchmark analyses of the BN-600 hybrid core were performed during the first three phases of the CRP, investigating different nuclear data and levels of approximation in the calculation of safety related reactivity effects and their influence on uncertainties in transient analysis prediction. In an additional phase of the benchmark studies, experimental data were used for the verification and validation of nuclear data libraries and methods in support of the previous three phases. The results of phases 1, 2, 3 and 5 of the CRP are reported in IAEA-TECDOC-1623, BN-600 Hybrid Core Benchmark Analyses: Results from a Coordinated Research Project on Updated Codes and Methods to Reduce the Calculational Uncertainties of the LMFR Reactivity Effects.

  4. Deterministic sensitivity and uncertainty methodology for best estimate system codes applied in nuclear technology

    International Nuclear Information System (INIS)

    Petruzzi, A.; D'Auria, F.; Cacuci, D.G.

    2009-01-01

    Nuclear Power Plant (NPP) technology has been developed based on the traditional defense in depth philosophy supported by deterministic and overly conservative methods for safety analysis. In the 1970s [1], conservative hypotheses were introduced for safety analyses to address existing uncertainties. Since then, intensive thermal-hydraulic experimental research has resulted in a considerable increase in knowledge and consequently in the development of best-estimate codes able to provide more realistic information about the physical behaviour and to identify the most relevant safety issues, allowing the evaluation of the actual margins existing between the results of the calculations and the acceptance criteria. However, the best-estimate calculation results from complex thermal-hydraulic system codes (like RELAP5, CATHARE, ATHLET, TRACE, etc.) are affected by unavoidable approximations that are unpredictable without the use of computational tools that account for the various sources of uncertainty. Therefore the use of best-estimate (BE) codes within reactor technology, either for design or safety purposes, implies understanding and accepting the limitations and the deficiencies of those codes. Taking into consideration the above framework, a comprehensive approach has been developed for utilizing quantified uncertainties arising from Integral Test Facilities (ITFs, [2]) and Separate Effect Test Facilities (SETFs, [3]) in the process of calibrating complex computer models for application to NPP transient scenarios. The proposed methodology is capable of accommodating multiple SETFs and ITFs to learn as much as possible about uncertain parameters, allowing for the improvement of the computer model predictions based on the available experimental evidence. The proposed methodology constitutes a major step forward with respect to the generally used expert judgment and statistical methods, as it permits a) to establish the uncertainties of any parameter

  5. A review on the CIRCE methodology to quantify the uncertainty of the physical models of a code

    International Nuclear Information System (INIS)

    Jeon, Seong Su; Hong, Soon Joon; Bang, Young Seok

    2012-01-01

    In the field of nuclear engineering, recent regulatory audit calculations of large break loss of coolant accidents (LBLOCA) have been performed with best estimate codes such as MARS, RELAP5 and CATHARE. Since a credible regulatory audit calculation is very important in the evaluation of the safety of a nuclear power plant (NPP), there has been much research to develop rules and methodologies for the use of best estimate codes. One of the major points is the development of the best estimate plus uncertainty (BEPU) method for uncertainty analysis. As a representative BEPU method, the NRC proposes the CSAU (Code Scaling, Applicability and Uncertainty) methodology, which clearly identifies the different steps necessary for an uncertainty analysis. The general idea is (1) to determine all the sources of uncertainty in the code, also called basic uncertainties, (2) to quantify them, and (3) to combine them in order to obtain the final uncertainty for the studied application. Using an uncertainty analysis such as the CSAU methodology, an uncertainty band is calculated for a code response (calculation result) that is important from the safety point of view, and the safety margin of the NPP is quantified. An example of such a response is the peak cladding temperature (PCT) for a LBLOCA. However, there is a problem in uncertainty analysis with best estimate codes. Generally, it is very difficult to determine the uncertainties due to the empiricism of the closure laws (also called correlations or constitutive relationships). So far, the only proposed approach has been based on expert judgment. In this case, the uncertainty range of important parameters can be wide and inaccurate, so that the confidence level of the BEPU calculation results is decreased. In order to solve this problem, CEA (France) recently proposed a statistical method of data analysis, called CIRCE. The CIRCE method is intended to quantify the uncertainties of the correlations of a code. It may replace expert judgment

  6. LOFT uncertainty-analysis methodology

    International Nuclear Information System (INIS)

    Lassahn, G.D.

    1983-01-01

    The methodology used for uncertainty analyses of measurements in the Loss-of-Fluid Test (LOFT) nuclear-reactor-safety research program is described and compared with other methodologies established for performing uncertainty analyses

  7. LOFT uncertainty-analysis methodology

    International Nuclear Information System (INIS)

    Lassahn, G.D.

    1983-01-01

    The methodology used for uncertainty analyses of measurements in the Loss-of-Fluid Test (LOFT) nuclear reactor safety research program is described and compared with other methodologies established for performing uncertainty analyses

  8. Uncertainty Analysis with Considering Resonance Self-shielding Effect

    International Nuclear Information System (INIS)

    Han, Tae Young

    2016-01-01

    If infinitely diluted multi-group cross sections are used for the sensitivity, the covariance data from the evaluated nuclear data library (ENDL) can be applied directly. However, when a self-shielded multi-group cross section is used, the covariance data should be corrected for the self-shielding effect. The implicit uncertainty can thus be defined as the uncertainty change caused by the resonance self-shielding effect, as described above. MUSAD (Modules of Uncertainty and Sensitivity Analysis for DeCART) has been developed for multiplication factor and cross section uncertainty analysis based on the generalized perturbation theory; however, it can only quantify the explicit uncertainty from the self-shielded multi-group cross sections, without considering the implicit effect. This paper therefore addresses the implementation of an implicit uncertainty analysis module into the code, and the numerical results for its verification are provided. The implicit uncertainty analysis module has been implemented into MUSAD based on the infinitely diluted cross-section-based consistent method. The verification calculation was performed on MHTGR 350 Ex.I-1a, and the differences from the McCARD results decrease from 40% to 1% in the CZP case and 3% in the HFP case. From this study, it is expected that MUSAD can reasonably produce the complete uncertainty for a VHTR or LWR, where the resonance self-shielding effect must be significantly considered.
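Once sensitivities and covariance data are available, uncertainty quantification based on generalized perturbation theory combines them through the "sandwich rule". A sketch with invented numbers (not actual MUSAD sensitivities or ENDL covariances):

```python
import numpy as np

# Hypothetical relative sensitivities of k_eff to three multi-group cross
# sections (dk/k per dsigma/sigma), e.g. as produced by generalized
# perturbation theory; invented numbers for illustration
S = np.array([0.45, -0.30, 0.10])

# Hypothetical relative covariance matrix of those cross sections
# (stand-in for self-shielding-corrected covariance data)
C = np.array([[2.5e-4, 1.0e-4, 0.0],
              [1.0e-4, 4.0e-4, 0.0],
              [0.0,    0.0,    9.0e-4]])

# Sandwich rule: relative variance of k_eff = S^T C S
rel_var = S @ C @ S
print(f"k_eff relative uncertainty: {np.sqrt(rel_var):.4%}")
```

The implicit effect discussed above enters this picture by modifying both the sensitivities and the covariance matrix before the sandwich product is taken.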

  9. Uncertainty Analysis with Considering Resonance Self-shielding Effect

    Energy Technology Data Exchange (ETDEWEB)

    Han, Tae Young [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    If infinitely diluted multi-group cross sections are used for the sensitivity, the covariance data from the evaluated nuclear data library (ENDL) can be applied directly. However, when a self-shielded multi-group cross section is used, the covariance data should be corrected for the self-shielding effect. The implicit uncertainty can thus be defined as the uncertainty change caused by the resonance self-shielding effect, as described above. MUSAD (Modules of Uncertainty and Sensitivity Analysis for DeCART) has been developed for multiplication factor and cross section uncertainty analysis based on the generalized perturbation theory; however, it can only quantify the explicit uncertainty from the self-shielded multi-group cross sections, without considering the implicit effect. This paper therefore addresses the implementation of an implicit uncertainty analysis module into the code, and the numerical results for its verification are provided. The implicit uncertainty analysis module has been implemented into MUSAD based on the infinitely diluted cross-section-based consistent method. The verification calculation was performed on MHTGR 350 Ex.I-1a, and the differences from the McCARD results decrease from 40% to 1% in the CZP case and 3% in the HFP case. From this study, it is expected that MUSAD can reasonably produce the complete uncertainty for a VHTR or LWR, where the resonance self-shielding effect must be significantly considered.

  10. International benchmark for coupled codes and uncertainty analysis in modelling: switching-Off of one of the four operating main circulation pumps at nominal reactor power at NPP Kalinin unit 3

    International Nuclear Information System (INIS)

    Tereshonok, V. A.; Nikonov, S. P.; Lizorkin, M. P.; Velkov, K; Pautz, A.; Ivanov, V.

    2008-01-01

    The paper briefly describes the specification of an international NEA/OECD benchmark based on measured plant data. During the commissioning tests for nominal power at NPP Kalinin Unit 3, numerous measurements of neutronic and thermal-hydraulic parameters were carried out in the reactor pressure vessel and in the primary and secondary circuits. One of the measured data sets, for the transient 'Switching-off of one Main Circulation Pump (MCP) at nominal power', has been chosen for the validation of coupled thermal-hydraulic and neutron-kinetic system codes and, additionally, for performing uncertainty analyses as part of the NEA/OECD Uncertainty Analysis in Modelling Benchmark. The benchmark is open to all countries and institutions. The experimental data and the final specification with the cross section libraries will be provided to the participants by NEA/OECD only after an official declaration of actual participation in the benchmark and delivery of the simulated results of the transient for comparison. (Author)

  11. Qualitative uncertainty analysis in probabilistic safety assessment context

    International Nuclear Information System (INIS)

    Apostol, M.; Constantin, M; Turcu, I.

    2007-01-01

    In the Probabilistic Safety Assessment (PSA) context, an uncertainty analysis is performed either to estimate the uncertainty in the final results (the risk to public health and safety) or in some intermediate quantities (the core damage frequency, the radionuclide release frequency or the fatality frequency). The identification and evaluation of uncertainty are important tasks because they lend credibility to the results and help in the decision-making process. Uncertainty analysis can be performed qualitatively or quantitatively. This paper presents a preliminary qualitative uncertainty analysis, identifying the major uncertainties at the PSA level 1-level 2 interface and in the other two major procedural steps of a level 2 PSA, i.e. the analysis of accident progression and containment behaviour, and the analysis of the source term for severe accidents. One should mention that a level 2 PSA for a Nuclear Power Plant (NPP) involves the evaluation and quantification of the mechanisms, amounts and probabilities of subsequent radioactive material releases from the containment. According to NUREG-1150, an important task in source term analysis is fission product transport analysis. The uncertainties related to the isotope distribution in the CANDU NPP primary circuit and to the isotope masses transferred into the containment, computed with the SOPHAEROS module of the ASTEC computer code, are also presented. (authors)

  12. Uncertainty Management and Sensitivity Analysis

    DEFF Research Database (Denmark)

    Rosenbaum, Ralph K.; Georgiadis, Stylianos; Fantke, Peter

    2018-01-01

    Uncertainty is always there and LCA is no exception to that. The presence of uncertainties of different types and from numerous sources in LCA results is a fact, but managing them makes it possible to quantify and improve the precision of a study and the robustness of its conclusions. LCA practice sometimes...... suffers from an imbalanced perception of uncertainties, justifying modelling choices and omissions. Identifying prevalent misconceptions around uncertainties in LCA is a central goal of this chapter, aiming to establish a positive approach focusing on the advantages of uncertainty management. The main...... objectives of this chapter are to learn how to deal with uncertainty in the context of LCA, how to quantify it, interpret and use it, and how to communicate it. The subject is approached more holistically than just focusing on relevant statistical methods or purely mathematical aspects. This chapter...

  13. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, appendices A and B

    International Nuclear Information System (INIS)

    Harper, F.T.; Young, M.L.; Miller, L.A.; Hora, S.C.; Lui, C.H.; Goossens, L.H.J.; Cooke, R.M.; Paesler-Sauer, J.; Helton, J.C.

    1995-01-01

    Two new probabilistic accident consequence codes, MACCS and COSYMA, completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, developed independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. The resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the second of a three-volume document describing the project and contains two appendices describing the rationales for the dispersion and deposition data along with short biographies of the 16 experts who participated in the project

  14. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, appendices A and B

    Energy Technology Data Exchange (ETDEWEB)

    Harper, F.T.; Young, M.L.; Miller, L.A. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States); Lui, C.H. [Nuclear Regulatory Commission, Washington, DC (United States); Goossens, L.H.J.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Paesler-Sauer, J. [Research Center, Karlsruhe (Germany); Helton, J.C. [and others

    1995-01-01

    Two new probabilistic accident consequence codes, MACCS and COSYMA, completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, developed independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. The resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the second of a three-volume document describing the project and contains two appendices describing the rationales for the dispersion and deposition data along with short biographies of the 16 experts who participated in the project.

  15. Supporting qualified database for V and V and uncertainty evaluation of best-estimate system codes

    International Nuclear Information System (INIS)

    Petruzzi, A.; D'Auria, F.

    2014-01-01

    Uncertainty evaluation constitutes a key feature of the BEPU (Best Estimate Plus Uncertainty) process. The uncertainty can be the result of a Monte Carlo type analysis involving input uncertainty parameters, or the outcome of a process involving the use of experimental data and connected code calculations. These uncertainty methods are discussed in several papers and guidelines (IAEA-SRS-52, OECD/NEA BEMUSE reports). The present paper discusses the role and the depth of the analysis required to merge, on the one side, suitable experimental data and, on the other side, qualified code calculation results. This aspect is mostly connected with the second approach mentioned above, but it can also be used in the framework of the first approach. Namely, the paper discusses the features and structure of a database that includes the following kinds of documents: 1. The 'RDS-facility' (Reference Data Set for the selected facility): the description of the facility, the geometrical characterization of every component of the facility, the instrumentation, the data acquisition system, the evaluation of pressure losses, the physical properties of the materials, and the characterization of pumps, valves and heat losses; 2. The 'RDS-test' (Reference Data Set for the selected test of the facility): the description of the main phenomena investigated during the test, the configuration of the facility for the selected test (with a new evaluation of pressure and heat losses if needed) and the specific boundary and initial conditions; 3. The 'QR' (Qualification Report) of the code calculation results: the description of the nodalization developed following a set of homogeneous techniques, the achievement of steady state conditions and the qualitative and quantitative analysis of the transient with the characterization of the Relevant Thermal-Hydraulic Aspects (RTA); 4. The EH (Engineering

  16. Uncertainty analysis in the task of individual monitoring data

    International Nuclear Information System (INIS)

    Molokanov, A.; Badjin, V.; Gasteva, G.; Antipin, E.

    2003-01-01

    Assessment of internal doses is an essential component of individual monitoring programmes for workers and consists of two stages: individual monitoring measurements and interpretation of the monitoring data in terms of annual intake and/or annual internal dose. The overall uncertainty in the assessed dose is a combination of the uncertainties in these two stages. An algorithm and a computer code were developed for estimating the uncertainty in the assessment of internal dose in the task of individual monitoring data interpretation. Two main influencing factors are analysed in this paper: the unknown time of exposure and the variability of bioassay measurements. The aim of this analysis is to show that the algorithm is applicable in designing an individual monitoring programme for workers so as to guarantee that the individual dose calculated from individual monitoring measurements does not exceed a required limit with a certain confidence probability. (author)

  17. Uncertainty analysis of minimum vessel liquid inventory during a small-break LOCA in a B&W Plant: An application of the CSAU methodology using the RELAP5/MOD3 computer code

    International Nuclear Information System (INIS)

    Ortiz, M.G.; Ghan, L.S.

    1992-12-01

    The Nuclear Regulatory Commission (NRC) revised the emergency core cooling system licensing rule to allow the use of best estimate computer codes, provided the uncertainty of the calculations is quantified and used in the licensing and regulation process. The NRC developed a generic methodology called Code Scaling, Applicability, and Uncertainty (CSAU) to evaluate best estimate code uncertainties. The objective of this work was to adapt and demonstrate the CSAU methodology for a small-break loss-of-coolant accident (SBLOCA) in a Pressurized Water Reactor of Babcock & Wilcox lowered-loop design, using RELAP5/MOD3 as the simulation tool. The CSAU methodology was successfully demonstrated for the new set of variants defined in this project (scenario, plant design, code). However, the robustness of the reactor design to this SBLOCA scenario limits the applicability of the specific results to other plants or scenarios. Several aspects of the code were not exercised because the transient conditions never became severe enough. The plant operator proved to be a determining factor in the course of the transient scenario, and steps were taken to include the operator in the model, simulation, and analyses

  18. Computer codes for safety analysis

    International Nuclear Information System (INIS)

    Holland, D.F.

    1986-11-01

    Computer codes for fusion safety analysis have been under development in the United States for about a decade. This paper discusses five codes that are currently under development by the Fusion Safety Program. The purpose and capability of each code are presented, a sample application is given, and the present status and future development plans are discussed

  19. Applications of the TSUNAMI sensitivity and uncertainty analysis methodology

    International Nuclear Information System (INIS)

    Rearden, Bradley T.; Hopper, Calvin M.; Elam, Karla R.; Goluoglu, Sedat; Parks, Cecil V.

    2003-01-01

    The TSUNAMI sensitivity and uncertainty analysis tools under development for the SCALE code system have recently been applied in four criticality safety studies. TSUNAMI is used to identify applicable benchmark experiments for criticality code validation, assist in the design of new critical experiments for a particular need, reevaluate previously determined computational biases, and assess the validation coverage and propose a penalty for noncoverage for a specific application. (author)

  20. Uncertainty analysis of LBLOCA for Advanced Heavy Water Reactor

    International Nuclear Information System (INIS)

    Srivastava, A.; Lele, H.G.; Ghosh, A.K.; Kushwaha, H.S.

    2008-01-01

    The main objective of safety analysis is to demonstrate in a robust way that all safety requirements are met, i.e. that sufficient margins exist between the real values of important parameters and the threshold values at which the barriers against release of radioactivity would be damaged. As stated in the IAEA Safety Requirements for the Design of NPPs, 'a safety analysis of the plant design shall be conducted in which methods of both deterministic and probabilistic analysis shall be applied'. It is required that 'the computer programs, analytical methods and plant models used in the safety analysis shall be verified and validated, and adequate consideration shall be given to uncertainties'. Uncertainties are present in calculations due to the computer codes, initial and boundary conditions, plant state, fuel parameters, scaling and numerical solution algorithms. Conservative approaches, still widely used, were introduced to cover uncertainties due to the limited capability for modelling and understanding physical phenomena at the early stages of safety analysis. The results obtained by this approach are quite unrealistic, and the level of conservatism is not fully known. Another approach is the use of Best Estimate (BE) codes with realistic initial and boundary conditions. If this approach is selected, it should be based on statistically combined uncertainties for plant initial and boundary conditions, assumptions and code models. The current trend is toward best estimate codes with some conservative assumptions, realistic input data and uncertainty analysis. BE analysis with evaluation of uncertainties offers, in addition, a way to quantify the existing plant safety margins. Its broader use in the future is therefore envisaged, even though it is not always feasible because of the difficulty of quantifying code uncertainties within a sufficiently narrow range for every phenomenon and for each accident sequence. In this paper
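In BEPU practice, the "statistically combined uncertainties" mentioned above are commonly handled with non-parametric order statistics, where Wilks' formula fixes the minimum number of best estimate code runs. A sketch under that assumption (the function name is illustrative; the 95/95 figure of 59 runs is the classic result):

```python
import math

def wilks_sample_size(coverage=0.95, confidence=0.95):
    """Smallest N such that the maximum of N random code runs bounds the
    `coverage` quantile of the output with probability `confidence`
    (one-sided, first-order Wilks formula: 1 - coverage**N >= confidence)."""
    return math.ceil(math.log(1.0 - confidence) / math.log(coverage))

print(wilks_sample_size())            # classic 95/95 statement: 59 runs
print(wilks_sample_size(0.95, 0.99))  # higher confidence requires more runs
```

The appeal of this approach is that the required number of runs is independent of how many uncertain input parameters are sampled.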

  1. Probabilistic accident consequence uncertainty analysis -- Early health effects uncertainty assessment. Volume 2: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Haskin, F.E. [Univ. of New Mexico, Albuquerque, NM (United States); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Goossens, L.H.J.; Kraan, B.C.P. [Delft Univ. of Technology (Netherlands)

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA early health effects models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on early health effects, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  2. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for deposited material and external doses. Volume 2: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Boardman, J. [AEA Technology (United Kingdom); Jones, J.A. [National Radiological Protection Board (United Kingdom); Harper, F.T.; Young, M.L. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA deposited material and external dose models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on deposited material and external doses, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  3. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for internal dosimetry. Volume 2: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Harrison, J.D. [National Radiological Protection Board (United Kingdom); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

    1998-04-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA internal dosimetry models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on internal dosimetry, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  4. Uncertainty analysis in safety assessment

    International Nuclear Information System (INIS)

    Lemos, Francisco Luiz de; Sullivan, Terry

    1997-01-01

    Nuclear waste disposal is a very complex subject which requires the study of many different fields of science, such as hydrogeology, meteorology and geochemistry. In addition, waste disposal facilities are designed to last for a very long period of time. Both of these conditions make safety assessment projections filled with uncertainty. This paper addresses approaches for the treatment of uncertainties in safety assessment modeling due to the variability of data, and some current approaches used to deal with this problem. (author)

  5. Uncertainty and sensitivity analysis in reactivity-initiated accident fuel modeling: synthesis of Organisation for Economic Co-operation and Development (OECD)/Nuclear Energy Agency (NEA) benchmark on reactivity-initiated accident codes phase II

    Directory of Open Access Journals (Sweden)

    Olivier Marchand

    2018-03-01

    In the framework of the OECD/NEA Working Group on Fuel Safety, a RIA fuel-rod-code benchmark Phase I was organized in 2010–2013. It consisted of four experiments on highly irradiated fuel rodlets tested under different experimental conditions. This benchmark revealed the need to better understand the basic models incorporated in each code for realistic simulation of the complicated integral RIA tests with high burnup fuel rods. A second phase of the benchmark (Phase II) was thus launched early in 2014 and organized in two complementary activities: (1) comparison of the results of different simulations on simplified cases in order to provide additional bases for understanding the differences in modelling of the concerned phenomena; (2) assessment of the uncertainty of the results. The present paper provides a summary and conclusions of the second activity of the benchmark Phase II, which is based on the input uncertainty propagation methodology. The main conclusion is that uncertainties cannot fully explain the differences between the code predictions. Finally, based on the RIA benchmark Phase I and Phase II conclusions, some recommendations are made. Keywords: RIA, Codes Benchmarking, Fuel Modelling, OECD
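Input uncertainty propagation of the kind used in this activity amounts to sampling the uncertain inputs from assumed distributions and re-running the code once per sample. A minimal sketch with a toy surrogate standing in for a real fuel-rod code (all function names, distributions and numbers are illustrative, not taken from the benchmark):

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy stand-in for a fuel-rod code response (NOT a real RIA model):
# a scalar output driven by two uncertain inputs.
def toy_code(gap_conductance, clad_yield):
    return 120.0 + 30.0 / gap_conductance + 0.05 * clad_yield

# Step 1: sample the uncertain inputs from their assumed distributions.
n = 1000
gap = rng.normal(1.0, 0.1, n)      # relative gap conductance [-]
clad = rng.normal(500.0, 50.0, n)  # clad yield stress [MPa], illustrative

# Step 2: one code run per input sample; the spread of the outputs is
# the propagated input uncertainty.
out = toy_code(gap, clad)
lo, hi = np.percentile(out, [2.5, 97.5])
print(f"95% output band: [{lo:.1f}, {hi:.1f}]")
```

With a real code, each sample corresponds to a full transient simulation, so the cost of the analysis scales with the number of samples rather than with the number of uncertain inputs.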

  6. One Approach to the Fire PSA Uncertainty Analysis

    International Nuclear Information System (INIS)

    Simic, Z.; Mikulicic, V.; Vukovic, I.

    2002-01-01

    Practical experience and findings from a number of fire probabilistic safety assessment (PSA) studies show that fire has a high relative importance for nuclear power plant safety. Fire PSA is a very challenging field and a number of issues are still in the area of research and development. This has a major impact on the conservatism of fire PSA findings. One way to reduce the level of conservatism is to conduct uncertainty analysis. At the top level, the uncertainty of a fire PSA can be separated into three segments. The first segment is related to fire initiating event frequencies. The second segment is connected to the uncertainty of fire damage. Finally, there is uncertainty related to the PSA model, which propagates the fire-initiated damage to core damage or another analyzed risk. This paper discusses all three segments of uncertainty. Some recent experience with fire PSA uncertainty analysis, the use of the fire analysis code COMPBRN IIIe, and the importance of uncertainty evaluation to the final result is presented. (author)

  7. Uncertainty modelling and code calibration for composite materials

    DEFF Research Database (Denmark)

    Toft, Henrik Stensgaard; Branner, Kim; Mishnaevsky, Leon, Jr

    2013-01-01

    and measurement uncertainties which are introduced on the different scales. Typically, these uncertainties are taken into account in the design process using characteristic values and partial safety factors specified in a design standard. The value of the partial safety factors should reflect a reasonable balance...... to wind turbine blades are calibrated for two typical lay-ups using a large number of load cases and ratios between the aerodynamic forces and the inertia forces....

  8. CSAU (code scaling, applicability and uncertainty), a tool to prioritize advanced reactor research

    International Nuclear Information System (INIS)

    Wilson, G.E.; Boyack, B.E.

    1990-01-01

    Best Estimate computer codes have been accepted by the US Nuclear Regulatory Commission as an optional tool for performing safety analysis related to the licensing and regulation of current nuclear reactors producing commercial electrical power, provided their uncertainty is quantified. In support of this policy change, the NRC and its contractors and consultants have developed and demonstrated an uncertainty quantification methodology called CSAU. At the process level, the method is generic to any application which relies on best estimate computer code simulations to determine safe operating margins. The primary use of the CSAU methodology is to quantify safety margins for existing designs; however, the methodology can also serve an equally important role in advanced reactor research for plants not yet built. Applied early, during the period when alternative designs are being evaluated, the methodology can identify the relative importance of the sources of uncertainty in the knowledge of each plant's behavior and thereby help prioritize the research needed to bring the new designs to fruition. This paper describes the CSAU methodology at the generic process level and provides the general principles whereby it may be applied to evaluations of advanced reactor designs. 9 refs., 1 fig., 1 tab

  9. Uncertainty analysis in safety assessment

    Energy Technology Data Exchange (ETDEWEB)

    Lemos, Francisco Luiz de [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN), Belo Horizonte, MG (Brazil); Sullivan, Terry [Brookhaven National Lab., Upton, NY (United States)

    1997-12-31

    Nuclear waste disposal is a very complex subject which requires the study of many different fields of science, such as hydrogeology, meteorology and geochemistry. In addition, waste disposal facilities are designed to last for a very long period of time. Both of these conditions make safety assessment projections filled with uncertainty. This paper addresses approaches for the treatment of uncertainties in safety assessment modeling due to the variability of data, and some current approaches used to deal with this problem. (author) 13 refs.; e-mail: lemos at bnl.gov; sulliva1 at bnl.gov

  10. Uncertainty quantification and error analysis

    Energy Technology Data Exchange (ETDEWEB)

    Higdon, Dave M [Los Alamos National Laboratory; Anderson, Mark C [Los Alamos National Laboratory; Habib, Salman [Los Alamos National Laboratory; Klein, Richard [Los Alamos National Laboratory; Berliner, Mark [OHIO STATE UNIV.; Covey, Curt [LLNL; Ghattas, Omar [UNIV OF TEXAS; Graziani, Carlo [UNIV OF CHICAGO; Seager, Mark [LLNL; Sefcik, Joseph [LLNL; Stark, Philip [UC/BERKELEY; Stewart, James [SNL

    2010-01-01

    UQ studies all sources of error and uncertainty, including: systematic and stochastic measurement error; ignorance; limitations of theoretical models; limitations of numerical representations of those models; limitations on the accuracy and reliability of computations, approximations, and algorithms; and human error. A more precise definition for UQ is suggested below.

  11. Risk characterization: description of associated uncertainties, sensitivity analysis

    International Nuclear Information System (INIS)

    Carrillo, M.; Tovar, M.; Alvarez, J.; Arraez, M.; Hordziejewicz, I.; Loreto, I.

    2013-01-01

    The PowerPoint presentation covers risks at the estimated levels of exposure, uncertainty and variability in the analysis, sensitivity analysis, risks from exposure to multiple substances, the formulation of guidelines for carcinogenic and genotoxic compounds, and risks to subpopulations

  12. Uncertainty propagation applied to multi-scale thermal-hydraulics coupled codes. A step towards validation

    Energy Technology Data Exchange (ETDEWEB)

    Geffray, Clotaire Clement

    2017-03-20

    The work presented here constitutes an important step towards validating the use of coupled system thermal-hydraulics and computational fluid dynamics codes for the simulation of complex flows in liquid metal cooled pool-type facilities. First, a set of methods suited for uncertainty and sensitivity analysis and validation activities, with regard to the specific constraints of working with coupled and expensive-to-run codes, is proposed. Then, these methods are applied to the ATHLET - ANSYS CFX model of the TALL-3D facility. Several transients performed at the latter facility are investigated. The results are presented, discussed and compared to the experimental data. Finally, assessments of the validity of the selected methods and of the quality of the model are offered.

  13. Urban drainage models - making uncertainty analysis simple

    DEFF Research Database (Denmark)

    Vezzaro, Luca; Mikkelsen, Peter Steen; Deletic, Ana

    2012-01-01

    in each measured/observed datapoint; an issue which is commonly overlooked in the uncertainty analysis of urban drainage models. This comparison allows the user to intuitively estimate the optimum number of simulations required to conduct uncertainty analyses. The output of the method includes parameter......There is increasing awareness about uncertainties in modelling of urban drainage systems and, as such, many new methods for uncertainty analyses have been developed. Despite this, all available methods have limitations which restrict their widespread application among practitioners. Here...
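The idea of estimating the optimum number of simulations can be sketched by watching a result percentile stabilise as Monte Carlo runs accumulate. In this illustration, lognormal samples stand in for real drainage-model output (the distribution and all numbers are assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in Monte Carlo outputs (e.g. simulated runoff volumes); a skewed
# lognormal is typical of urban drainage quantities.
samples = rng.lognormal(mean=3.0, sigma=0.4, size=5000)

# Watch the estimated 90th percentile stabilise as runs accumulate:
# once successive estimates barely move, extra runs add little precision.
checkpoints = [100, 250, 500, 1000, 2000, 5000]
est = [np.percentile(samples[:n], 90) for n in checkpoints]
for n, e in zip(checkpoints, est):
    print(f"{n:5d} runs -> P90 estimate {e:.2f}")
```

When successive estimates change by less than a chosen tolerance (say 1%), the user can stop adding simulations, which is the intuition the paper's method formalises.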

  14. CEC/USDOE workshop on uncertainty analysis

    International Nuclear Information System (INIS)

    Elderkin, C.E.; Kelly, G.N.

    1990-07-01

    Any measured or assessed quantity contains uncertainty. The quantitative estimation of such uncertainty is becoming increasingly important, especially in assuring that safety requirements are met in the design, regulation, and operation of nuclear installations. The CEC/USDOE Workshop on Uncertainty Analysis, held in Santa Fe, New Mexico, on November 13 through 16, 1989, was organized jointly by the Commission of the European Communities' (CEC) Radiation Protection Research programme, dealing with uncertainties throughout the field of consequence assessment, and DOE's Atmospheric Studies in Complex Terrain (ASCOT) program, concerned with the particular uncertainties in time- and space-variant transport and dispersion. The workshop brought together US and European scientists who have been developing or applying uncertainty analysis methodologies in a variety of contexts, often with incomplete knowledge of the work of others in this area. Thus, it was timely to exchange views and experience, identify limitations of approaches to uncertainty and possible improvements, and enhance the interface between developers and users of uncertainty analysis methods. Furthermore, the workshop considered the extent to which consistent, rigorous methods could be used in various applications within consequence assessment. 3 refs

  15. Licensing in BE system code calculations. Applications and uncertainty evaluation by CIAU method

    International Nuclear Information System (INIS)

    Petruzzi, Alessandro; D'Auria, Francesco

    2007-01-01

    The evaluation of uncertainty constitutes the necessary supplement of Best Estimate (BE) calculations performed to understand accident scenarios in water cooled nuclear reactors. The need comes from the imperfection of computational tools on the one side and from the interest in using such tools to obtain more precise evaluations of safety margins on the other. In the present paper the approaches to uncertainty are outlined and the CIAU (Code with capability of Internal Assessment of Uncertainty) method proposed by the University of Pisa is described, including the ideas at its basis and results from applications. Two approaches are distinguished, characterized as 'propagation of code input uncertainty' and 'propagation of code output errors'. For both methods, the thermal-hydraulic code is at the centre of the process of uncertainty evaluation: in the former case the code itself is adopted to compute the error bands and to propagate the input errors; in the latter case the errors in code applications to relevant measurements are used to derive the error bands. The CIAU method exploits the idea of the 'status approach' for identifying the thermal-hydraulic conditions of an accident in any Nuclear Power Plant (NPP). Errors in predicting such status are derived from the comparison between predicted and measured quantities and, in the stage of the application of the method, are used to compute the uncertainty. (author)

  16. Nuclear data sensitivity/uncertainty analysis for XT-ADS

    International Nuclear Information System (INIS)

    Sugawara, Takanori; Sarotto, Massimo; Stankovskiy, Alexey; Van den Eynde, Gert

    2011-01-01

    Highlights: → The sensitivity and uncertainty analyses were performed to assess the reliability of the XT-ADS neutronic design. → The uncertainties deduced from the covariance data for the XT-ADS criticality were 0.94%, 1.9% and 1.1% with the SCALE 44-group, TENDL-2009 and JENDL-3.3 data, respectively. → When the target accuracy of 0.3%Δk for the criticality was considered, the uncertainties did not satisfy it. → To achieve this accuracy, the uncertainties should be reduced through experiments under adequate conditions. - Abstract: The XT-ADS, an accelerator-driven system for an experimental demonstration, has been investigated in the framework of the IP EUROTRANS FP6 project. In this study, sensitivity and uncertainty analyses were performed to assess the reliability of the XT-ADS neutronic design. The sensitivity analysis showed that the sensitivity coefficients differed significantly depending on the geometry models and calculation codes used. The uncertainty analysis confirmed that the uncertainties deduced from the covariance data varied significantly with the choice of covariance library: the uncertainties for the XT-ADS criticality were 0.94%, 1.9% and 1.1% with the SCALE 44-group, TENDL-2009 and JENDL-3.3 data, respectively. When the target accuracy of 0.3%Δk for the criticality is considered, these uncertainties do not satisfy it; to achieve this accuracy, the covariance data should be improved through experiments under adequate conditions.
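
    Covariance-based uncertainty evaluations of this kind are typically computed with the first-order "sandwich rule", var(k) = sᵀCs, combining sensitivity coefficients with the covariance data. A minimal sketch in Python, with purely illustrative sensitivity and covariance values (not XT-ADS data):

```python
import numpy as np

# First-order "sandwich rule": var(k) = s^T C s, where s holds relative
# sensitivity coefficients of the response (e.g. k-eff) to each nuclear
# data parameter and C is the relative covariance matrix of those parameters.
def propagated_relative_uncertainty(s, C):
    s = np.asarray(s, dtype=float)
    C = np.asarray(C, dtype=float)
    variance = s @ C @ s          # relative variance of the response
    return np.sqrt(variance)      # relative standard deviation

# Illustrative numbers (not from the study): two correlated parameters.
s = [0.6, -0.2]                   # sensitivity coefficients (dk/k per dp/p)
C = [[4.0e-4, 1.0e-4],
     [1.0e-4, 9.0e-4]]            # relative covariance matrix
u = propagated_relative_uncertainty(s, C)
print(f"relative uncertainty in k-eff: {u:.4%}")
```

    The off-diagonal covariance terms are what make different libraries (SCALE, TENDL, JENDL) give different propagated uncertainties for the same sensitivities.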

  17. Some reflections on uncertainty analysis and management

    International Nuclear Information System (INIS)

    Aven, Terje

    2010-01-01

    A guide to quantitative uncertainty analysis and management in industry has recently been issued. The guide provides an overall framework for uncertainty modelling and characterisations, using probabilities but also other uncertainty representations (including the Dempster-Shafer theory). A number of practical applications showing how to use the framework are presented. The guide is considered as an important contribution to the field, but there is a potential for improvements. These relate mainly to the scientific basis and clarification of critical issues, for example, concerning the meaning of a probability and the concept of model uncertainty. A reformulation of the framework is suggested using probabilities as the only representation of uncertainty. Several simple examples are included to motivate and explain the basic ideas of the modified framework.

  18. Fuel performance analysis code 'FAIR'

    International Nuclear Information System (INIS)

    Swami Prasad, P.; Dutta, B.K.; Kushwaha, H.S.; Mahajan, S.C.; Kakodkar, A.

    1994-01-01

    For modelling the behaviour of nuclear reactor fuel rods of water cooled reactors under severe power maneuvering and high burnups, a mechanistic fuel performance analysis code FAIR has been developed. The code incorporates a finite element based thermomechanical module, a physically based fission gas release module and relevant models for fuel related phenomena, such as pellet cracking, densification and swelling, radial flux redistribution across the pellet due to the build up of plutonium near the pellet surface, and pellet clad mechanical interaction/stress corrosion cracking (PCMI/SCC) failure of the sheath. The code follows the established principles of fuel rod analysis programmes, such as coupling the thermal and mechanical solutions along with the fission gas release calculations, analysing different axial segments of the fuel rod simultaneously, and providing means for performing local analyses such as clad ridging analysis. The modular nature of the code offers flexibility in making modifications to the code for modelling MOX fuels and thorium based fuels. For performing analyses of fuel rods subjected to very long power histories within a reasonable amount of time, the code has been parallelised and commissioned on the ANUPAM parallel processing system developed at Bhabha Atomic Research Centre (BARC). (author). 37 refs

  19. Uncertainties in source term calculations generated by the ORIGEN2 computer code for Hanford Production Reactors

    International Nuclear Information System (INIS)

    Heeb, C.M.

    1991-03-01

    The ORIGEN2 computer code is the primary calculational tool for computing isotopic source terms for the Hanford Environmental Dose Reconstruction (HEDR) Project. The ORIGEN2 code computes the amounts of radionuclides that are created or remain in spent nuclear fuel after neutron irradiation and radioactive decay have occurred as a result of nuclear reactor operation. ORIGEN2 was chosen as the primary code for these calculations because it is widely used and accepted by the nuclear industry, both in the United States and in the rest of the world. Its comprehensive library of over 1,600 nuclides includes any possible isotope of interest to the HEDR Project. It is important to evaluate the uncertainties expected from the use of ORIGEN2 in the HEDR Project because these uncertainties may have a pivotal impact on the final accuracy and credibility of the results of the project. There are three primary sources of uncertainty in an ORIGEN2 calculation: basic nuclear data uncertainty (neutron cross sections, radioactive decay constants, energy per fission, and fission product yields); calculational uncertainty due to input data; and code uncertainties (i.e., numerical approximations and neutron spectrum-averaged cross-section values from the code library). 15 refs., 5 figs., 5 tabs

  20. Uncertainties in calculations of nuclear design code system for the high temperature engineering test reactor (HTTR)

    International Nuclear Information System (INIS)

    Shindo, R.; Yamashita, K.; Murata, I.

    1991-01-01

    The nuclear design code system for the HTTR consists of a one-dimensional cell burnup code developed at JAERI and the TWOTRAN-2 transport code. In order to satisfy the related design criteria, the uncertainty of the calculations was investigated by comparing calculated and experimental results. The experiments were performed with a graphite moderated critical assembly. It was confirmed that the discrepancies between calculations and experiments were small enough to be acceptable in the nuclear design of the HTTR. 8 refs, 6 figs

  1. Approach to uncertainty in risk analysis

    Energy Technology Data Exchange (ETDEWEB)

    Rish, W.R.

    1988-08-01

    In the Fall of 1985 EPA's Office of Radiation Programs (ORP) initiated a project to develop a formal approach to dealing with uncertainties encountered when estimating and evaluating risks to human health and the environment. Based on a literature review of modeling uncertainty, interviews with ORP technical and management staff, and input from experts on uncertainty analysis, a comprehensive approach was developed. This approach recognizes by design the constraints on budget, time, manpower, expertise, and availability of information often encountered in "real world" modeling. It is based on the observation that in practice risk modeling is usually done to support a decision process. As such, the approach focuses on how to frame a given risk modeling problem, how to use that framing to select an appropriate mixture of uncertainty analysis techniques, and how to integrate the techniques into an uncertainty assessment that effectively communicates important information and insight to decision-makers. The approach is presented in this report. Practical guidance on characterizing and analyzing uncertainties about model form and quantities and on effectively communicating uncertainty analysis results is included. Examples from actual applications are presented.

  2. Approach to uncertainty in risk analysis

    International Nuclear Information System (INIS)

    Rish, W.R.

    1988-08-01

    In the Fall of 1985 EPA's Office of Radiation Programs (ORP) initiated a project to develop a formal approach to dealing with uncertainties encountered when estimating and evaluating risks to human health and the environment. Based on a literature review of modeling uncertainty, interviews with ORP technical and management staff, and input from experts on uncertainty analysis, a comprehensive approach was developed. This approach recognizes by design the constraints on budget, time, manpower, expertise, and availability of information often encountered in "real world" modeling. It is based on the observation that in practice risk modeling is usually done to support a decision process. As such, the approach focuses on how to frame a given risk modeling problem, how to use that framing to select an appropriate mixture of uncertainty analysis techniques, and how to integrate the techniques into an uncertainty assessment that effectively communicates important information and insight to decision-makers. The approach is presented in this report. Practical guidance on characterizing and analyzing uncertainties about model form and quantities and on effectively communicating uncertainty analysis results is included. Examples from actual applications are presented.

  3. Complex Visual Data Analysis, Uncertainty, and Representation

    National Research Council Canada - National Science Library

    Schunn, Christian D; Saner, Lelyn D; Kirschenbaum, Susan K; Trafton, J. G; Littleton, Eliza B

    2007-01-01

    ... (weather forecasting, submarine target motion analysis, and fMRI data analysis). Internal spatial representations are coded from spontaneous gestures made during cued-recall summaries of problem solving activities...

  4. Uncertainties in thick-target PIXE analysis

    International Nuclear Information System (INIS)

    Campbell, J.L.; Cookson, J.A.; Paul, H.

    1983-01-01

    Thick-target PIXE analysis involves uncertainties arising from the calculation of thick-target X-ray production in addition to the usual PIXE uncertainties. The calculation demands knowledge of ionization cross-sections, stopping powers and photon attenuation coefficients. Information on these is reviewed critically and a computational method is used to estimate the uncertainties transmitted from this data base into the results of thick-target PIXE analyses, with reference to particular specimen types using beams of 2-3 MeV protons. A detailed assessment of the accuracy of thick-target PIXE is presented. (orig.)

  5. Development of the integrated system reliability analysis code MODULE

    International Nuclear Information System (INIS)

    Han, S.H.; Yoo, K.J.; Kim, T.W.

    1987-01-01

    The major components of a system reliability analysis are the determination of cut sets, importance measures, and uncertainty analysis. Various computer codes have been used for these purposes: for example, SETS and FTAP to determine cut sets; Importance for importance calculations; and SAMPLE, CONINT, and MOCUP for uncertainty analysis. Problems arise when these codes are run in sequence, because their inputs and outputs are not linked, which can introduce errors when preparing the input for each code. The code MODULE was developed to carry out the above calculations simultaneously, without passing inputs and outputs between separate codes. MODULE can also prepare input for SETS for the case of a large fault tree that cannot be handled by MODULE itself. The flow diagram of the MODULE code is shown. To verify the MODULE code, two examples were selected and the results and computation times compared with those of SETS, FTAP, CONINT, and MOCUP on both a Cyber 170-875 and an IBM PC/AT. The two examples are fault trees of the auxiliary feedwater system (AFWS) of Korea Nuclear Units (KNU)-1 and -2, which have 54 gates and 115 events, and 39 gates and 92 events, respectively. The MODULE code has the advantage that it can calculate cut sets, importances, and uncertainties in a single run with little increase in computing time over other codes, and that it can be used on personal computers

  6. Verification of the CONPAS (CONtainment Performance Analysis System) code package

    International Nuclear Information System (INIS)

    Kim, See Darl; Ahn, Kwang Il; Song, Yong Man; Choi, Young; Park, Soo Yong; Kim, Dong Ha; Jin, Young Ho.

    1997-09-01

    CONPAS is a computer code package which automatically integrates the numerical, graphical, and results-oriented aspects of Level 2 probabilistic safety assessment (PSA) for nuclear power plants under a PC Windows environment. For the integrated analysis of Level 2 PSA, the code utilizes four distinct but closely related modules: (1) ET Editor, (2) Computer, (3) Text Editor, and (4) Mechanistic Code Plotter. Compared with other existing computer codes for Level 2 PSA, the CONPAS code provides several advanced features: computational aspects including systematic uncertainty analysis, importance analysis, sensitivity analysis and data interpretation; reporting aspects including tabulation and graphics; and a user-friendly interface. The computational performance of CONPAS has been verified through a Level 2 PSA for a reference plant. The results of the CONPAS code were compared with an existing Level 2 PSA code (NUCAP+) and the comparison proves that CONPAS is appropriate for Level 2 PSA. (author). 9 refs., 8 tabs., 14 figs

  7. Uncertainty Propagation in Monte Carlo Depletion Analysis

    International Nuclear Information System (INIS)

    Shim, Hyung Jin; Kim, Yeong-il; Park, Ho Jin; Joo, Han Gyu; Kim, Chang Hyo

    2008-01-01

    A new formulation is presented, aimed at quantifying the uncertainties of Monte Carlo (MC) tallies such as k-eff, the microscopic reaction rates of nuclides, and nuclide number densities in MC depletion analysis, and at examining their propagation behaviour as a function of depletion time step (DTS). It is shown that the variance of a given MC tally, used as a measure of its uncertainty in this formulation, arises from four sources: the statistical uncertainty of the MC tally, uncertainties of microscopic cross sections, uncertainties of nuclide number densities, and the cross correlations between them; the contribution of the latter three sources can be determined by computing the correlation coefficients between the uncertain variables. It is also shown that the variance of any given nuclide number density at the end of each DTS stems from the uncertainties of the nuclide number densities (NND) and microscopic reaction rates (MRR) of nuclides at the beginning of the DTS, and these contributions are determined by computing the correlation coefficients between the two uncertain variables. To test the viability of the formulation, MC depletion analyses were conducted for two sample depletion problems involving a simplified 7x7 fuel assembly (FA) and a 17x17 PWR FA; the number densities of uranium and plutonium isotopes and their variances, as well as k-infinity and its variance, were determined as a function of DTS, demonstrating the applicability of the new formulation to the uncertainty propagation analysis that needs to accompany MC depletion computations. (authors)
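
    The way correlated contributions combine in such a variance decomposition follows the standard first-order formula, in which the cross term is weighted by the correlation coefficient. A minimal sketch with illustrative numbers (not from the paper):

```python
import numpy as np

# First-order variance of a response f(x, y) depending on two uncertain
# inputs with standard deviations sx, sy and correlation coefficient rho:
#   var(f) ~ a^2 sx^2 + b^2 sy^2 + 2 a b rho sx sy,
# where a and b are the local sensitivities df/dx and df/dy.
def combined_variance(a, sx, b, sy, rho):
    return a**2 * sx**2 + b**2 * sy**2 + 2.0 * a * b * rho * sx * sy

# Illustrative: a number density at the end of a depletion time step driven
# by its value at the start of the step (sensitivity a) and a microscopic
# reaction rate (sensitivity b), with a mild positive correlation.
v = combined_variance(a=1.0, sx=0.02, b=-0.5, sy=0.04, rho=0.3)
print(f"combined standard deviation: {np.sqrt(v):.4f}")
```

    With rho = 0 the two contributions simply add; a nonzero correlation can either inflate or (as here, with opposite-signed sensitivities) partially cancel the total variance.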

  8. Design optimization and uncertainty analysis of SMA morphing structures

    International Nuclear Information System (INIS)

    Oehler, S D; Hartl, D J; Lopez, R; Malak, R J; Lagoudas, D C

    2012-01-01

    The continuing implementation of shape memory alloys (SMAs) as lightweight solid-state actuators in morphing structures has now motivated research into finding optimized designs for use in aerospace control systems. This work proposes methods that use iterative analysis techniques to determine optimized designs for morphing aerostructures and consider the impact of uncertainty in model variables on the solution. A combination of commercially available and custom coded tools is utilized. ModelCenter, a suite of optimization algorithms and simulation process management tools, is coupled with the Abaqus finite element analysis suite and a custom SMA constitutive model to assess morphing structure designs in an automated fashion. The chosen case study involves determining the optimized configuration of a morphing aerostructure assembly that includes SMA flexures. This is accomplished by altering design inputs representing the placement of active components to minimize a specified cost function. An uncertainty analysis is also conducted using design of experiment methods to determine the sensitivity of the solution to a set of uncertainty variables. This second study demonstrates the effective use of Monte Carlo techniques to simulate the variance of model variables representing the inherent uncertainty in component fabrication processes. This paper outlines the modeling tools used to execute each case study, details the procedures for constructing the optimization problem and uncertainty analysis, and highlights the results from both studies. (paper)

  9. Uncertainty Principles and Fourier Analysis

    Indian Academy of Sciences (India)

    analysis on the part of the reader. Those who are not familiar with Fourier analysis are encouraged to look up Box 1 along with [3]. (A) Heisenberg's inequality: Let us measure concentration in terms of standard deviation, i.e., for a square integrable function f defined on ℝ and normalized so that ∫_{-∞}^{∞} |f(x)|² dx = 1, ...

  10. Uncertainty analysis for Ulysses safety evaluation report

    International Nuclear Information System (INIS)

    Frank, M.V.

    1991-01-01

    As part of the effort to review the Ulysses Final Safety Analysis Report and to understand the risk of plutonium release from the Ulysses spacecraft General Purpose Heat Source-Radioisotope Thermoelectric Generator (GPHS-RTG), the Interagency Nuclear Safety Review Panel (INSRP) and the author performed an integrated, quantitative analysis of the uncertainties of the calculated risk of plutonium release from Ulysses. Using state-of-the-art probabilistic risk assessment technology, the uncertainty analysis accounted for both variability and uncertainty in the key parameters of the risk analysis. The results show that INSRP had high confidence that the risk of fatal cancers from potential plutonium release associated with the calculated launch and deployment accident scenarios is low

  11. A Proposal on the Advanced Sampling Based Sensitivity and Uncertainty Analysis Method for the Eigenvalue Uncertainty Analysis

    International Nuclear Information System (INIS)

    Kim, Song Hyun; Song, Myung Sub; Shin, Chang Ho; Noh, Jae Man

    2014-01-01

    When the perturbation theory is used, the uncertainty of the response can be estimated from a single transport simulation, so the computational load is small. However, it has the disadvantage that the computational methodology must be modified for each response type, such as the multiplication factor, flux, or power distribution. Hence, it is suitable for analysing a few responses with many perturbed parameters. The statistical approach is a sampling based method which uses cross sections randomly sampled from covariance data to analyse the uncertainty of the response. XSUSA is a code based on the statistical approach. Because only the cross sections are modified by the sampling, general transport codes can be directly utilized for the S/U analysis without any code modifications. However, to obtain the uncertainty distribution of the result, the code simulation must be repeated many times with the randomly sampled cross sections; this inefficiency is a known disadvantage of the stochastic method. In this study, an advanced sampling and estimation method for the cross sections is proposed and verified in order to increase the efficiency of the sampling based S/U method. The main feature of the proposed method is that the cross section averaged over the individual sampled cross sections is used. The proposed method was validated against the perturbation theory
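
    The baseline statistical approach described (the XSUSA-style scheme) can be sketched as: draw cross-section sets from a multivariate normal defined by the covariance data, run the transport calculation once per set, and take the sample statistics of the response. A toy Python sketch, where an invented one-group k-eff formula stands in for a real transport code:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a transport calculation: k-eff as a simple function of
# a fission and an absorption cross section (purely illustrative).
def keff(sigma_f, sigma_a):
    return 1.65 * sigma_f / (sigma_f + sigma_a)

mean = np.array([0.05, 0.08])               # nominal cross sections
cov = np.array([[1.0e-6, 2.0e-7],
                [2.0e-7, 4.0e-6]])          # covariance data (illustrative)

# One "transport run" per sampled cross-section set.
samples = rng.multivariate_normal(mean, cov, size=2000)
k = np.array([keff(sf, sa) for sf, sa in samples])
print(f"k-eff = {k.mean():.5f} +/- {k.std(ddof=1):.5f}")
```

    The cost is one full transport calculation per sample, which is exactly the inefficiency the proposed averaging method aims to reduce.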

  12. Hydrocoin level 3 - Testing methods for sensitivity/uncertainty analysis

    International Nuclear Information System (INIS)

    Grundfelt, B.; Lindbom, B.; Larsson, A.; Andersson, K.

    1991-01-01

    The HYDROCOIN study is an international cooperative project for testing groundwater hydrology modelling strategies for performance assessment of nuclear waste disposal. The study was initiated in 1984 by the Swedish Nuclear Power Inspectorate and the technical work was finalised in 1987. The participating organisations are regulatory authorities as well as implementing organisations in 10 countries. The study has been performed at three levels, aimed at studying computer code verification, model validation and sensitivity/uncertainty analysis respectively. The results from the first two levels, code verification and model validation, were published in reports in 1988 and 1990 respectively. This paper focuses on some aspects of the results from Level 3, sensitivity/uncertainty analysis, for which a final report is planned to be published during 1990. For Level 3, seven test cases were defined. Some of these aimed at exploring the uncertainty associated with the modelling results by simply varying parameter values and conceptual assumptions. In other test cases statistical sampling methods were applied. One of the test cases dealt with particle tracking and the uncertainty introduced by this type of post-processing. The amount of results available is substantial, although unevenly spread over the test cases. It has not been possible to cover all aspects of the results in this paper; instead, the different methods applied are illustrated by some typical analyses. 4 figs., 9 refs

  13. Uncertainty propagation in probabilistic safety analysis of nuclear power plants

    International Nuclear Information System (INIS)

    Fleming, P.V.

    1981-09-01

    The propagation of uncertainty in the probabilistic safety analysis of nuclear power plants is studied. The minimal cut set methodology is implemented in the computer code SVALON and the results for several cases are compared with the corresponding results obtained with the SAMPLE code, which employs the Monte Carlo method to propagate the uncertainties. The results show that, for a relatively small number of dominant minimal cut sets (n approximately 25) and error factors (r approximately 5), the SVALON code yields results which are comparable to those obtained with SAMPLE. An analysis of the unavailability of the low pressure recirculation system of Angra 1 for both the short and long term recirculation phases is presented. The results for the short term phase are in good agreement with the corresponding ones given in WASH-1400. (E.G.) [pt
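
    A SAMPLE-style Monte Carlo propagation can be sketched as follows: each basic event is given a lognormal distribution defined by its median and error factor, and the top-event unavailability is evaluated over the minimal cut sets for each trial. All cut sets and numbers below are invented for illustration, not Angra 1 data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Monte Carlo propagation of lognormal basic-event uncertainties through
# minimal cut sets.  An error factor EF is the ratio of the 95th percentile
# to the median of a lognormal, so its log-space sigma is ln(EF)/1.645.
def sample_top_event(cut_sets, medians, error_factors, n=5000):
    sigmas = np.log(error_factors) / 1.645
    # one lognormal draw per basic event per trial
    q = medians * np.exp(sigmas * rng.standard_normal((n, len(medians))))
    # rare-event approximation: top event ~ sum over cut sets of the
    # product of the basic-event unavailabilities in each cut set
    return sum(np.prod(q[:, list(cs)], axis=1) for cs in cut_sets)

cut_sets = [(0, 1), (2,)]               # illustrative minimal cut sets
medians = np.array([1e-2, 5e-3, 1e-4])  # basic-event median unavailabilities
efs = np.array([3.0, 3.0, 5.0])         # error factors
top = sample_top_event(cut_sets, medians, efs)
print(f"top-event unavailability: median {np.median(top):.2e}")
```

    The empirical distribution of `top` then yields whatever percentiles or error factors the analysis reports.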

  14. PIV Uncertainty Methodologies for CFD Code Validation at the MIR Facility

    Energy Technology Data Exchange (ETDEWEB)

    Sabharwall, Piyush [Idaho National Lab. (INL), Idaho Falls, ID (United States); Skifton, Richard [Idaho National Lab. (INL), Idaho Falls, ID (United States); Stoots, Carl [Idaho National Lab. (INL), Idaho Falls, ID (United States); Kim, Eung Soo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Conder, Thomas [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2013-12-01

    Currently, computational fluid dynamics (CFD) is widely used in the nuclear thermal hydraulics field for design and safety analyses. To validate CFD codes, high quality multi-dimensional flow field data are essential. The Matched Index of Refraction (MIR) Flow Facility at Idaho National Laboratory has a unique capability to contribute to the development of validated CFD codes through the use of Particle Image Velocimetry (PIV). The significance of the MIR facility is that it permits non-intrusive velocity measurement techniques, such as PIV, through complex models without requiring probes and other instrumentation that disturb the flow. At the heart of any PIV calculation is the cross-correlation, which is used to estimate the displacement of particles in some small part of the image over the time span between two images. This image displacement is indicated by the location of the largest correlation peak. In the MIR facility, uncertainty quantification is a challenging task due to the use of optical measurement techniques. This study is developing a reliable method to analyze the uncertainty and sensitivity of the measured data, together with a computer code to automate that analysis. The main objective of this study is to develop a well-established uncertainty quantification method for the MIR Flow Facility, which involves many complicated uncertainty factors. The uncertainty sources are resolved in depth by categorizing them into uncertainties from the MIR flow loop and from the PIV system (including particle motion, image distortion, and data processing); each uncertainty source is then mathematically modeled or adequately defined. Finally, this study will provide a method and procedure to quantify the experimental uncertainty in the MIR Flow Facility, with sample test results.
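
    The cross-correlation step described, locating the tallest peak to estimate particle displacement between two interrogation windows, can be sketched with an FFT-based circular cross-correlation. This toy version resolves only integer-pixel shifts; production PIV adds sub-pixel peak fitting, which is one of the uncertainty sources the study models:

```python
import numpy as np

# Estimate particle displacement between two interrogation windows by
# locating the tallest peak of their FFT-based circular cross-correlation.
def displacement(window_a, window_b):
    a = window_a - window_a.mean()
    b = window_b - window_b.mean()
    corr = np.fft.irfft2(np.conj(np.fft.rfft2(a)) * np.fft.rfft2(b))
    iy, ix = np.unravel_index(np.argmax(corr), corr.shape)
    ny, nx = corr.shape
    # wrap the peak indices to signed pixel shifts
    dy = iy if iy <= ny // 2 else iy - ny
    dx = ix if ix <= nx // 2 else ix - nx
    return dy, dx

# Synthetic check: a single "particle" shifted by (3, -2) pixels.
a = np.zeros((32, 32)); a[10, 12] = 1.0
b = np.zeros((32, 32)); b[13, 10] = 1.0
print(displacement(a, b))   # expect (3, -2)
```

    Dividing the pixel shift by the magnification and the inter-frame time then gives velocity, so any bias in the peak location propagates directly into the measured flow field.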

  15. Uncertainty and sensitivity analysis of the nuclear fuel thermal behavior

    Energy Technology Data Exchange (ETDEWEB)

    Boulore, A., E-mail: antoine.boulore@cea.fr [Commissariat a l' Energie Atomique (CEA), DEN, Fuel Research Department, 13108 Saint-Paul-lez-Durance (France); Struzik, C. [Commissariat a l' Energie Atomique (CEA), DEN, Fuel Research Department, 13108 Saint-Paul-lez-Durance (France); Gaudier, F. [Commissariat a l' Energie Atomique (CEA), DEN, Systems and Structure Modeling Department, 91191 Gif-sur-Yvette (France)

    2012-12-15

    Highlights: → A complete quantitative method for uncertainty propagation and sensitivity analysis is applied. → The thermal conductivity of UO2 is modeled as a random variable. → The first source of uncertainty is the linear heat rate. → The second source of uncertainty is the thermal conductivity of the fuel. - Abstract: In the global framework of nuclear fuel behavior simulation, the response of the models describing the physical phenomena occurring during the irradiation in reactor is mainly conditioned by the confidence in the calculated temperature of the fuel. Amongst all parameters influencing the temperature calculation in our fuel rod simulation code (METEOR V2), several sources of uncertainty have been identified as being the most sensitive: thermal conductivity of UO2, radial distribution of power in the fuel pellet, local linear heat rate in the fuel rod, geometry of the pellet and thermal transfer in the gap. Expert judgment and inverse methods have been used to model the uncertainty of these parameters using theoretical distributions and correlation matrices. Propagation of these uncertainties in the METEOR V2 code using the URANIE framework and a Monte-Carlo technique has been performed in different experimental irradiations of UO2 fuel. At every time step of the simulated experiments, we get a temperature statistical distribution which results from the initial distributions of the uncertain parameters. We can then estimate confidence intervals of the calculated temperature. In order to quantify the sensitivity of the calculated temperature to each of the uncertain input parameters and data, we have also performed a sensitivity analysis using first-order Sobol' indices.
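
    The Monte Carlo propagation with first-order Sobol' indices described here can be sketched as follows. The temperature model and input distributions below are invented for illustration; for an additive linear model the first-order index reduces to the squared input-output correlation, which is used as a cheap estimator in place of a full Sobol' pick-freeze scheme:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy fuel temperature model: linear in conductivity and linear heat rate
# (illustrative only; a real code like METEOR V2 is strongly nonlinear).
def temperature(conductivity, heat_rate):
    return 600.0 - 40.0 * conductivity + 8.0 * heat_rate

n = 20000
k = rng.normal(4.0, 0.3, n)     # thermal conductivity (uncertain input)
q = rng.normal(20.0, 1.5, n)    # local linear heat rate (uncertain input)
t = temperature(k, q)           # Monte Carlo propagation

print(f"T = {t.mean():.1f} +/- {t.std(ddof=1):.1f}")
for name, x in (("conductivity", k), ("heat rate", q)):
    s1 = np.corrcoef(x, t)[0, 1] ** 2   # first-order index (linear case)
    print(f"first-order Sobol' index, {name}: {s1:.2f}")
```

    With these invented coefficients both inputs contribute equally (index about 0.5 each); in the study the ranking came out with the linear heat rate first and the UO2 conductivity second.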

  16. CINETHICA - Core accident analysis code

    International Nuclear Information System (INIS)

    Nakata, H.

    1989-10-01

    A computer program for nuclear accident analysis has been developed based on the point-kinetics approximation and a one-dimensional heat transfer model for the reactivity feedback calculation. Hansen's method /1/ was used to solve the kinetics equations and the explicit Euler method was adopted for the thermal-hydraulic equations. The results compared favorably with those from the GAPOTKIN code /2/. (author) [pt

  17. Bootstrap and Order Statistics for Quantifying Thermal-Hydraulic Code Uncertainties in the Estimation of Safety Margins

    Directory of Open Access Journals (Sweden)

    Enrico Zio

    2008-01-01

    In the present work, the uncertainties affecting the safety margins estimated from thermal-hydraulic code calculations are captured quantitatively by resorting to order statistics and the bootstrap technique. The proposed framework of analysis is applied to the estimation of the safety margin, with its confidence interval, of the maximum fuel cladding temperature reached during a complete group distribution blockage scenario in an RBMK-1500 nuclear reactor.
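
    The order-statistics part of such an approach is commonly implemented with Wilks' formula: 59 code runs suffice for the sample maximum to bound the 95th percentile with 95% confidence, since 0.95^59 < 0.05. A bootstrap over the runs then yields a confidence interval on the estimated margin. A sketch with invented numbers, not the RBMK-1500 results:

```python
import numpy as np

rng = np.random.default_rng(3)

# Wilks 95/95 one-sided tolerance bound: with 59 runs, the maximum observed
# peak cladding temperature bounds the 95th percentile at 95% confidence.
limit_temperature = 1200.0                  # illustrative safety limit (K)
runs = rng.normal(1000.0, 40.0, size=59)    # stand-in code results (K)

wilks_bound = runs.max()
margin = limit_temperature - wilks_bound

# Bootstrap: resample the 59 runs with replacement and recompute the margin.
boot = np.array([limit_temperature - rng.choice(runs, runs.size).max()
                 for _ in range(2000)])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"95/95 margin: {margin:.1f} K, bootstrap 95% CI [{lo:.1f}, {hi:.1f}] K")
```

    The bootstrap interval quantifies how much the margin estimate itself would move under a different set of 59 runs, which is the uncertainty-on-the-margin idea of the paper.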

  18. Experimental data bases useful for quantification of model uncertainties in best estimate codes

    International Nuclear Information System (INIS)

    Wilson, G.E.; Katsma, K.R.; Jacobson, J.L.; Boodry, K.S.

    1988-01-01

    A data base is necessary for the assessment of thermal hydraulic codes within the context of the new NRC ECCS Rule. Separate effects tests examine particular phenomena; the resulting data may be used to develop and/or verify models and constitutive relationships in a code. Integral tests are used to demonstrate the capability of codes to model the global characteristics and sequence of events for real or hypothetical transients. The nuclear industry has developed a large experimental data base of fundamental nuclear and thermal-hydraulic phenomena for code validation. Given a particular scenario, and recognizing the scenario's important phenomena, selected information from this data base may be used to demonstrate the applicability of a particular code to simulate the scenario and to determine code model uncertainties. LBLOCA experimental data bases useful for this objective are identified in this paper. 2 tabs

  19. Uncertainty analysis for geologic disposal of radioactive waste

    International Nuclear Information System (INIS)

    Cranwell, R.M.; Helton, J.C.

    1981-01-01

    The incorporation and representation of uncertainty in the analysis of the consequences and risks associated with the geologic disposal of high-level radioactive waste are discussed. Such uncertainty has three primary components: process modeling uncertainty, model input data uncertainty, and scenario uncertainty. The following topics are considered in connection with the preceding components: propagation of uncertainty in the modeling of a disposal site, sampling of input data for models, and uncertainty associated with model output
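The three steps named in this abstract (sample the model input data, propagate through the disposal-site model, examine the uncertainty of the model output) can be sketched with a toy groundwater travel-time model; the model and all distributions below are illustrative assumptions, not taken from the cited analysis.

```python
# Sketch: Monte Carlo propagation of input-data uncertainty through a toy
# site model (Darcy travel time). All values/distributions are assumptions.
import random
import statistics

random.seed(7)

def travel_time(length_m, porosity, conductivity, gradient):
    """Darcy travel time t = L * n / (K * i), in years (toy site model)."""
    velocity = conductivity * gradient / porosity   # m/yr
    return length_m / velocity

times = []
for _ in range(5000):
    L = random.uniform(4000.0, 6000.0)        # path length, m
    n = random.uniform(0.05, 0.25)            # effective porosity
    K = random.lognormvariate(0.0, 0.5)       # hydraulic conductivity, m/yr
    i = random.uniform(0.001, 0.01)           # hydraulic gradient
    times.append(travel_time(L, n, K, i))

median_t = statistics.median(times)
t05 = sorted(times)[len(times) // 20]         # ~5th percentile (shortest times)
```

The spread between the 5th percentile and the median is the kind of output-uncertainty statement the abstract refers to.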

  20. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment. Volume 3, Appendices C, D, E, F, and G

    Energy Technology Data Exchange (ETDEWEB)

    Harper, F.T.; Young, M.L.; Miller, L.A. [Sandia National Labs., Albuquerque, NM (United States)] [and others]

    1995-01-01

    MACCS and COSYMA, two new probabilistic accident consequence codes whose development was completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, developed independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. The resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the third of a three-volume document describing the project and contains descriptions of the probability assessment principles; the expert identification and selection process; the weighting methods used; the inverse modeling methods; case structures; and summaries of the consequence codes.

  1. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment. Volume 3, Appendices C, D, E, F, and G

    International Nuclear Information System (INIS)

    Harper, F.T.; Young, M.L.; Miller, L.A.

    1995-01-01

    MACCS and COSYMA, two new probabilistic accident consequence codes whose development was completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, developed independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. The resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the third of a three-volume document describing the project and contains descriptions of the probability assessment principles; the expert identification and selection process; the weighting methods used; the inverse modeling methods; case structures; and summaries of the consequence codes

  2. Uncertainty Propagation in Hypersonic Vehicle Aerothermoelastic Analysis

    Science.gov (United States)

    Lamorte, Nicolas Etienne

    Hypersonic vehicles face a challenging flight environment. The aerothermoelastic analysis of their components requires numerous simplifying approximations. Identifying and quantifying the effect of uncertainties pushes the limits of the existing deterministic models and is pursued in this work. An uncertainty quantification framework is used to propagate the effects of identified uncertainties on the stability margins and performance of the different systems considered. First, the aeroelastic stability of a typical section representative of a control surface on a hypersonic vehicle is examined. Variability in the uncoupled natural frequencies of the system is modeled to mimic the effect of aerodynamic heating. Next, the stability of an aerodynamically heated panel representing a component of the skin of a generic hypersonic vehicle is considered. Uncertainty in the location of the transition from laminar to turbulent flow and in the heat flux prediction is quantified using CFD. In both cases significant reductions of the stability margins are observed. A loosely coupled airframe-integrated scramjet engine is considered next. The elongated body and cowl of the engine flow path are subject to harsh aerothermodynamic loading, which causes them to deform. The uncertainty associated with the deformation prediction is propagated to the engine performance analysis. The cowl deformation is the main contributor to the sensitivity of the propulsion system performance. Finally, a framework for aerothermoelastic stability boundary calculation for hypersonic vehicles using CFD is developed. The use of CFD enables one to consider different turbulence conditions, laminar or turbulent, and different models of the air mixture, in particular a real-gas model which accounts for the dissociation of molecules at high temperature. The system is found to be sensitive to turbulence modeling as well as to the location of the transition from laminar to turbulent flow. Real gas effects play a minor role in the

  3. Uncertainties in elemental quantitative analysis by PIXE

    International Nuclear Information System (INIS)

    Montenegro, E.C.; Baptista, G.B.; Paschoa, A.S.; Barros Leite, C.V.

    1979-01-01

    The effects of the degree of non-uniformity of the particle beam, matrix composition and matrix thickness in a quantitative elemental analysis by particle induced X-ray emission (PIXE) are discussed and a criterion to evaluate the resulting degree of uncertainty in the mass determination by this method is established. (Auth.)

  4. Uncertainty Analysis of RBMK-Related Experimental Data

    International Nuclear Information System (INIS)

    Urbonas, Rolandas; Kaliatka, Algirdas; Liaukonis, Mindaugas

    2002-01-01

    An attempt was made to validate the state-of-the-art thermal-hydraulic code ATHLET (GRS, Germany) on the basis of the E-108 test facility. Originally this code was developed and validated for reactor types other than the RBMK. Since state-of-the-art thermal-hydraulic codes are widely used for the simulation of RBMK reactors, further code implementation and validation are required. The phenomena associated with channel-type flow instabilities and CHF were found to be an important step in the frame of the overall effort of state-of-the-art code validation and application for RBMK reactors. In the paper, a one-channel approach analysis is presented; consequently, the oscillatory behaviour of the system was not detected. The results show a dependence on the nodalization used in the heated channels, on the initial and boundary conditions, and on the selected code models. It is shown that the code is able to predict a sudden heat-structure temperature excursion when the critical heat flux is approached. The GRS-developed uncertainty and sensitivity methodology was employed in the analysis. (authors)

  5. Overview of methods for uncertainty analysis and sensitivity analysis in probabilistic risk assessment

    International Nuclear Information System (INIS)

    Iman, R.L.; Helton, J.C.

    1985-01-01

    Probabilistic Risk Assessment (PRA) is playing an increasingly important role in the nuclear reactor regulatory process. The assessment of uncertainties associated with PRA results is widely recognized as an important part of the analysis process. One of the major criticisms of the Reactor Safety Study was that its representation of uncertainty was inadequate. The desire for the capability to treat uncertainties in the MELCOR risk code being developed at Sandia National Laboratories is indicative of the current interest in this topic. However, as yet, uncertainty analysis and sensitivity analysis in the context of PRA are relatively immature fields. In this paper, available methods for uncertainty analysis and sensitivity analysis in a PRA are reviewed. This review first treats methods for use with individual components of a PRA and then considers how these methods could be combined in the performance of a complete PRA. In the context of this paper, the goal of uncertainty analysis is to measure the imprecision in PRA outcomes of interest, and the goal of sensitivity analysis is to identify the major contributors to this imprecision. There are a number of areas that must be considered in uncertainty analysis and sensitivity analysis for a PRA: (1) information, (2) systems analysis, (3) thermal-hydraulic phenomena/fission product behavior, (4) health and economic consequences, and (5) display of results. Each of these areas and the synthesis of them into a complete PRA are discussed
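A sketch of the sampling-based approach reviewed here: Latin hypercube sampling of the inputs, followed by a rank (Spearman) correlation to identify the major contributors to output imprecision. The two-input linear model is an assumption chosen so that one input should clearly dominate.

```python
# Sketch: Latin hypercube sampling + Spearman rank correlation as a
# sensitivity measure. The model y = 10*a + b is an illustrative assumption.
import random

random.seed(3)

def lhs_uniform(n):
    """One Latin-hypercube-sampled column of n values on (0, 1)."""
    cells = [(k + random.random()) / n for k in range(n)]
    random.shuffle(cells)
    return cells

def ranks(xs):
    order = sorted(range(len(xs)), key=lambda k: xs[k])
    r = [0] * len(xs)
    for pos, k in enumerate(order):
        r[k] = pos
    return r

def spearman(x, y):
    """Spearman correlation = Pearson correlation of the ranks (no ties)."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mean = (n - 1) / 2.0
    num = sum((a - mean) * (b - mean) for a, b in zip(rx, ry))
    den = sum((a - mean) ** 2 for a in rx)   # rank variances are equal
    return num / den

n = 200
a, b = lhs_uniform(n), lhs_uniform(n)
y = [10.0 * ai + bi for ai, bi in zip(a, b)]

rho_a, rho_b = spearman(a, y), spearman(b, y)   # expect rho_a >> rho_b
```

Ranking inputs by |rho| is the simplest version of "identify the major contributors to this imprecision" described in the abstract.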

  6. Development of a nuclear data uncertainties propagation code on the residual power in fast neutron reactors

    International Nuclear Information System (INIS)

    Benoit, J.-C.

    2012-01-01

    This PhD study lies in the field of nuclear energy, the back end of the nuclear fuel cycle, and uncertainty calculations. The CEA must design the prototype ASTRID, a sodium-cooled fast reactor (SFR) and one of the concepts selected by the Generation IV forum, for which the value of the decay heat and its uncertainty have a significant impact. In this study, a code for propagating nuclear-data uncertainties to the decay heat in an SFR is developed. The process took place in three stages. The first step limited the number of parameters involved in the calculation of the decay heat. For this, a decay-heat experiment on the PHENIX reactor (PUIREX 2008) was studied to validate the DARWIN package experimentally for SFRs and to quantify the source terms of the decay heat. The second step aimed to develop a code for the propagation of uncertainties: CyRUS (Cycle Reactor Uncertainty and Sensitivity). A deterministic propagation method was chosen because the calculations are fast and reliable. The assumptions of linearity and normality have been validated theoretically. The code has also been successfully compared with a stochastic code on the example of the thermal burst fission curve of 235U. The last part was an application of the code to several experiments: the decay heat of a reactor, the isotopic composition of a fuel pin, and the burst fission curve of 235U. The code has demonstrated the possibility of feedback on the nuclear data impacting the uncertainty of this problem. Two main results were highlighted. Firstly, the simplifying assumptions of deterministic codes are compatible with a precise calculation of the uncertainty of the decay heat. Secondly, the developed method is intrusive and allows feedback on nuclear data from experiments on the back end of the nuclear fuel cycle. In particular, this study showed how important it is to measure independent fission yields precisely, along with their covariance matrices, in order to improve the accuracy of the calculation of
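The deterministic propagation approach chosen in this work can be illustrated with the standard first-order "sandwich" formula, variance = S V S^T, where S is the sensitivity vector of the response to the nuclear data and V their covariance matrix. The sensitivities, uncertainties, and correlation below are illustrative assumptions, not values from the study.

```python
# Sketch: deterministic ("sandwich") uncertainty propagation for a response
# P that depends linearly on nuclear-data parameters x_i. Numbers are
# illustrative assumptions.

def sandwich_variance(S, V):
    """Return S V S^T for sensitivity vector S and covariance matrix V."""
    n = len(S)
    return sum(S[i] * V[i][j] * S[j] for i in range(n) for j in range(n))

# Two fission-yield-like parameters with 5% relative uncertainty each and
# an assumed 0.5 correlation between them.
S = [1.0, 0.6]                      # relative sensitivities dP/dx_i
sig = [0.05, 0.05]
corr = [[1.0, 0.5], [0.5, 1.0]]
V = [[corr[i][j] * sig[i] * sig[j] for j in range(2)] for i in range(2)]

var_p = sandwich_variance(S, V)     # 0.0025 + 0.0015 + 0.0009 = 0.0049
std_p = var_p ** 0.5                # 7% relative uncertainty on P
```

The positive correlation term is why covariance matrices for fission yields matter: with independent parameters the combined uncertainty would be only sqrt(0.0025 + 0.0009) ≈ 5.8%.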

  7. Uncertainty analysis for the assembly and core simulation of BEAVRS at the HZP conditions

    International Nuclear Information System (INIS)

    Wan, Chenghui; Cao, Liangzhi; Wu, Hongchun; Shen, Wei

    2017-01-01

    Highlights: • Uncertainty analysis has been completed based on the “two-step” scheme. • Uncertainty analysis has been performed for BEAVRS at HZP. • For the lattice calculations, the uncertainties of the few-group constants were quantified. • For the core simulation, the uncertainties of k_eff and the power distributions were quantified. - Abstract: Based on the “two-step” scheme for reactor-physics calculations, the capability of uncertainty analysis for core simulations has been implemented in the UNICORN code, an in-house code for the sensitivity and uncertainty analysis of reactor-physics calculations. Applying the statistical sampling method, the nuclear-data uncertainties can be propagated to the important predictions of the core simulations. The uncertainties of the few-group constants introduced by the uncertainties of the multigroup microscopic cross sections are quantified first for the lattice calculations; the uncertainties of the few-group constants are then propagated to the core multiplication factor and core power distributions for the core simulations. Up to now, our in-house lattice code NECP-CACTI and the neutron-diffusion solver NECP-VIOLET have been implemented in UNICORN for the steady-state core simulations based on the “two-step” scheme. With NECP-CACTI and NECP-VIOLET, the modeling and simulation of the steady-state BEAVRS benchmark problem at the HZP conditions was performed, and the results were compared with those obtained by CASMO-4E. Based on this modeling and simulation, the UNICORN code has been applied to perform the uncertainty analysis for BEAVRS at HZP. The uncertainty results of the eigenvalues and two-group constants for the lattice calculations, and of the multiplication factor and the power distributions for the steady-state core simulations, are obtained and analyzed in detail.

  8. Uncertainty analysis for the assembly and core simulation of BEAVRS at the HZP conditions

    Energy Technology Data Exchange (ETDEWEB)

    Wan, Chenghui [School of Nuclear Science and Technology, Xi’an Jiaotong University, Xi’an 710049 (China); Cao, Liangzhi, E-mail: caolz@mail.xjtu.edu.cn [School of Nuclear Science and Technology, Xi’an Jiaotong University, Xi’an 710049 (China); Wu, Hongchun [School of Nuclear Science and Technology, Xi’an Jiaotong University, Xi’an 710049 (China); Shen, Wei [School of Nuclear Science and Technology, Xi’an Jiaotong University, Xi’an 710049 (China); Canadian Nuclear Safety Commission, Ottawa, Ontario (Canada)

    2017-04-15

    Highlights: • Uncertainty analysis has been completed based on the “two-step” scheme. • Uncertainty analysis has been performed for BEAVRS at HZP. • For the lattice calculations, the uncertainties of the few-group constants were quantified. • For the core simulation, the uncertainties of k_eff and the power distributions were quantified. - Abstract: Based on the “two-step” scheme for reactor-physics calculations, the capability of uncertainty analysis for core simulations has been implemented in the UNICORN code, an in-house code for the sensitivity and uncertainty analysis of reactor-physics calculations. Applying the statistical sampling method, the nuclear-data uncertainties can be propagated to the important predictions of the core simulations. The uncertainties of the few-group constants introduced by the uncertainties of the multigroup microscopic cross sections are quantified first for the lattice calculations; the uncertainties of the few-group constants are then propagated to the core multiplication factor and core power distributions for the core simulations. Up to now, our in-house lattice code NECP-CACTI and the neutron-diffusion solver NECP-VIOLET have been implemented in UNICORN for the steady-state core simulations based on the “two-step” scheme. With NECP-CACTI and NECP-VIOLET, the modeling and simulation of the steady-state BEAVRS benchmark problem at the HZP conditions was performed, and the results were compared with those obtained by CASMO-4E. Based on this modeling and simulation, the UNICORN code has been applied to perform the uncertainty analysis for BEAVRS at HZP. The uncertainty results of the eigenvalues and two-group constants for the lattice calculations, and of the multiplication factor and the power distributions for the steady-state core simulations, are obtained and analyzed in detail.

  9. Uncertainty Assessments in Fast Neutron Activation Analysis

    International Nuclear Information System (INIS)

    W. D. James; R. Zeisler

    2000-01-01

    Fast neutron activation analysis (FNAA), carried out with the use of small accelerator-based neutron generators, is routinely used for major/minor element determinations in industry, mineral and petroleum exploration, and to some extent in research. While the method shares many of the operational procedures, and therefore errors, inherent to conventional thermal neutron activation analysis, its unique implementation gives rise to additional specific concerns that can result in errors or increased uncertainties in measured quantities. The authors were involved in a recent effort to evaluate irreversible incorporation of oxygen into a standard reference material (SRM) by direct measurement of oxygen by FNAA. That project required determination of oxygen in bottles of the SRM stored under varying environmental conditions and a comparison of the results. We recognized the need to describe the total uncertainty of the measurements accurately in order to characterize any differences in the resulting average concentrations. It is our intent here to discuss the breadth of parameters that can contribute to the random and nonrandom errors of the method and to provide estimates of the magnitude of uncertainty introduced. In addition, we discuss the steps taken in this recent FNAA project to control quality, assess the uncertainty of the measurements, and evaluate results based on the statistical reproducibility
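One readily quantifiable contributor to FNAA uncertainty is counting statistics. A hedged sketch of the standard Poisson propagation for a net peak area follows; the count values and the 1.5% flux-monitor uncertainty are illustrative assumptions, not data from the cited project.

```python
# Sketch: Poisson counting uncertainty for a net peak area in activation
# analysis, combined in quadrature with one other (assumed) error source.

def net_counts_uncertainty(n_gross, n_bkg):
    """Net counts plus absolute and relative 1-sigma from Poisson statistics:
    sigma_net = sqrt(N_gross + N_bkg) for N_net = N_gross - N_bkg."""
    net = n_gross - n_bkg
    sigma = (n_gross + n_bkg) ** 0.5
    return net, sigma, sigma / net

net, sigma, rel = net_counts_uncertainty(n_gross=40000, n_bkg=4000)

# Combine with an assumed 1.5% flux-monitor uncertainty in quadrature.
rel_total = (rel ** 2 + 0.015 ** 2) ** 0.5
```

The relative counting uncertainty propagates directly to the elemental concentration; systematic (nonrandom) contributors discussed in the abstract would be assessed separately rather than added in quadrature.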

  10. Uncertainty Evaluation of the SFR Subchannel Thermal-Hydraulic Modeling Using a Hot Channel Factors Analysis

    International Nuclear Information System (INIS)

    Choi, Sun Rock; Cho, Chung Ho; Kim, Sang Ji

    2011-01-01

    In an SFR core analysis, a hot channel factors (HCF) method is most commonly used to evaluate uncertainty. It was employed in early designs such as the CRBRP and the IFR. Alternatively, the improved thermal design procedure (ITDP) is able to calculate the overall uncertainty based on the Root Sum Square technique and sensitivity analyses of each design parameter. The Monte Carlo method (MCM) is also employed to estimate the uncertainties. In this method, all the input uncertainties are randomly sampled according to their probability density functions and the resulting distribution of the output quantity is analyzed. Since an uncertainty analysis is basically calculated from the temperature distribution in a subassembly, the core thermal-hydraulic modeling greatly affects the resulting uncertainty. At KAERI, the SLTHEN and MATRA-LMR codes have been utilized to analyze the SFR core thermal-hydraulics. The SLTHEN (steady-state LMR core thermal-hydraulics analysis code based on the ENERGY model) code is a modified version of the SUPERENERGY2 code, which conducts a multi-assembly, steady-state calculation based on a simplified ENERGY model. The detailed subchannel analysis code MATRA-LMR (Multichannel Analyzer for Steady-State and Transients in Rod Arrays for Liquid Metal Reactors), an LMR version of MATRA, was also developed specifically for SFR core thermal-hydraulic analysis. This paper describes comparative studies of the core thermal-hydraulic models. A subchannel analysis and hot-channel-factor-based uncertainty evaluation system is established to estimate the core thermofluidic uncertainties using the MATRA-LMR code, and the results are compared to those of the SLTHEN code
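The two combination rules contrasted in this abstract can be sketched side by side: the fully multiplicative hot-channel-factor product versus an ITDP-style Root Sum Square of the statistical subfactors. All subfactor values below are illustrative assumptions.

```python
# Sketch: combining temperature-rise uncertainty subfactors two ways.
# Subfactor values (flow, power, geometry) are illustrative assumptions.

def hcf_product(subfactors):
    """Fully multiplicative (most conservative) combination."""
    f = 1.0
    for s in subfactors:
        f *= s
    return f

def rss_combination(subfactors):
    """Root Sum Square of the fractional uncertainties: 1 + sqrt(sum dk^2)."""
    return 1.0 + sum((s - 1.0) ** 2 for s in subfactors) ** 0.5

subfactors = [1.06, 1.04, 1.08]

f_direct = hcf_product(subfactors)      # 1.1906: stacks every uncertainty
f_rss = rss_combination(subfactors)     # ~1.108: statistical combination
```

The RSS result is smaller because it credits the low probability of all statistical subfactors taking their extreme values simultaneously, which is the rationale behind ITDP-type procedures.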

  11. Quantifying reactor safety margins: Application of code scaling, applicability, and uncertainty evaluation methodology to a large-break, loss-of-coolant accident

    International Nuclear Information System (INIS)

    Boyack, B.; Duffey, R.; Wilson, G.; Griffith, P.; Lellouche, G.; Levy, S.; Rohatgi, U.; Wulff, W.; Zuber, N.

    1989-12-01

    The US Nuclear Regulatory Commission (NRC) has issued a revised rule for loss-of-coolant accident/emergency core cooling system (ECCS) analysis of light water reactors to allow the use of best-estimate computer codes in safety analysis as an option. A key feature of this option requires the licensee to quantify the uncertainty of the calculations and include that uncertainty when comparing the calculated results with acceptance limits provided in 10 CFR Part 50. To support the revised ECCS rule and illustrate its application, the NRC and its contractors and consultants have developed and demonstrated an uncertainty evaluation methodology called code scaling, applicability, and uncertainty (CSAU). The CSAU methodology and an example application described in this report demonstrate that uncertainties in complex phenomena can be quantified. The methodology is structured, traceable, and practical, as is needed in the regulatory arena. The methodology is systematic and comprehensive as it addresses and integrates the scenario, experiments, code, and plant to resolve questions concerned with: (a) code capability to scale-up processes from test facility to full-scale nuclear power plants; (b) code applicability to safety studies of a postulated accident scenario in a specified nuclear power plant; and (c) quantifying uncertainties of calculated results. 127 refs., 55 figs., 40 tabs

  12. Erha Uncertainty Analysis: Planning for the future

    International Nuclear Information System (INIS)

    Brami, T.R.; Hopkins, D.F.; Loguer, W.L.; Cornagia, D.M.; Braisted, A.W.C.

    2002-01-01

    The Erha field (OPL 209) was discovered in 1999 approximately 100 km off the coast of Nigeria in 1,100 m of water. The discovery well (Erha-1) encountered oil and gas in deep-water clastic reservoirs. The first appraisal well (Erha-2), drilled 1.6 km downdip to the northwest, penetrated an oil-water contact and confirmed a potentially commercial discovery. However, the Erha-3 and Erha-3 ST-1 boreholes, drilled on the faulted east side of the field in 2001, encountered shallower fluid contacts. As a result of these findings, a comprehensive field-wide uncertainty analysis was performed to better understand what we know versus what we think regarding resource size and economic viability. The uncertainty analysis process applied at Erha is an integrated, scenario-based probabilistic approach to modeling resources and reserves. Its goal is to provide quantitative results for a variety of scenarios, thus allowing identification of and focus on critical controls (the variables that are likely to impose the greatest influence). The initial focus at Erha was to incorporate the observed fluid contacts and to develop potential scenarios that included the range of possibilities in unpenetrated portions of the field. Four potential compartmentalization scenarios were hypothesized. The uncertainty model combines these scenarios with reservoir parameters and their plausible ranges. Input data come from multiple sources including wells, 3D seismic, reservoir flow simulation, geochemistry, fault-seal analysis, sequence stratigraphic analysis, and analogs. Once created, the model is sampled using Monte Carlo techniques to create probability density functions for a variety of variables including oil in place and recoverable reserves. Results of the uncertainty analysis support that, despite a thinner oil column on the faulted east side of the field, Erha is an economically attractive opportunity. Further, the results have been used to develop data acquisition plans and mitigation strategies that

  13. Extending CANTUP code analysis to probabilistic evaluations

    International Nuclear Information System (INIS)

    Florea, S.

    2001-01-01

    Structural analysis with numerical methods based on the finite element method at present plays a central role in the evaluation and prediction of structural systems that require safe and reliable operation in aggressive environmental conditions. This is the case, too, for the CANDU-600 fuel channel where, besides the corrosive and thermal aggression upon the Zr97.5Nb2.5 pressure tubes, lasting irradiation has marked consequences upon the evolution of the material properties. This results in an unavoidable spread of the material properties in time, affected by high uncertainties. Consequently, the deterministic evaluations with computation codes based on the finite element method are supplemented by statistical and probabilistic methods for evaluating the response of structural components. This paper reports work on extending the thermo-mechanical evaluation of the fuel channel components into the frame of probabilistic structural mechanics, based on statistical methods and developed upon deterministic CANTUP code analyses. The CANTUP code was ported from the LAHEY 77 platform onto the Microsoft Developer Studio - Fortran PowerStation 4.0 platform. To test the statistical evaluation of the creep behaviour of the pressure tube, the longitudinal elasticity (Young's) modulus was used as a random variable, with a normal distribution around the value used in deterministic analyses. The influence of this random quantity upon the hog and effective stress developed in the pressure tube was studied for two time values, specific to primary and secondary creep. The results obtained after five years of creep, corresponding to secondary creep, are presented

  14. Uncertainty Analysis of Few Group Cross Sections Based on Generalized Perturbation Theory

    International Nuclear Information System (INIS)

    Han, Tae Young; Lee, Hyun Chul; Noh, Jae Man

    2014-01-01

    In this paper, the methodology of the sensitivity and uncertainty analysis code based on GPT is described, and preliminary verification calculations on the PMR200 pin-cell problem were carried out. The results are in good agreement with those from TSUNAMI. From this study, it is expected that the MUSAD code based on GPT can produce the uncertainties of the homogenized few-group microscopic cross sections for a core simulator. For sensitivity and uncertainty analyses of general core responses, a two-step method is available: it utilizes the generalized perturbation theory (GPT) for the homogenized few-group cross sections in the first step and a stochastic sampling method for the general core responses in the second step. The uncertainty analysis procedure based on GPT in the first step needs the generalized adjoint solution from a cell or lattice code. For this, the generalized adjoint solver was integrated into DeCART in our previous work. In this paper, the MUSAD (Modules of Uncertainty and Sensitivity Analysis for DeCART) code, based on the classical perturbation theory, was extended to the sensitivity and uncertainty analysis of few-group cross sections based on GPT. First, the uncertainty analysis method based on GPT is described and, in the next section, the preliminary results of the verification calculation on a VHTR pin-cell problem are compared with the results by TSUNAMI of SCALE 6.1

  15. Quantifying reactor safety margins: Part 1: An overview of the code scaling, applicability, and uncertainty evaluation methodology

    International Nuclear Information System (INIS)

    Boyack, B.E.; Duffey, R.B.; Griffith, P.

    1988-01-01

    In August 1988, the Nuclear Regulatory Commission (NRC) approved the final version of a revised rule on the acceptance of emergency core cooling systems (ECCS) entitled ''Emergency Core Cooling System; Revisions to Acceptance Criteria.'' The revised rule states that an alternate ECCS performance analysis, based on best-estimate methods, may be used to provide more realistic estimates of plant safety margins, provided the licensee quantifies the uncertainty of the estimates and includes that uncertainty when comparing the calculated results with prescribed acceptance limits. To support the revised ECCS rule, the NRC and its contractors and consultants have developed and demonstrated a method called the Code Scaling, Applicability, and Uncertainty (CSAU) evaluation methodology. It is an auditable, traceable, and practical method for combining quantitative analyses and expert opinions to arrive at computed values of uncertainty. This paper provides an overview of the CSAU evaluation methodology and its application to a postulated cold-leg, large-break loss-of-coolant accident in a Westinghouse four-loop pressurized water reactor with 17 × 17 fuel. The code selected for this demonstration of the CSAU methodology was TRAC-PF1/MOD1, Version 14.3. 23 refs., 5 figs., 1 tab

  16. Uncertainty Prediction in Passive Target Motion Analysis

    Science.gov (United States)

    2016-05-12

    Application Number: 15/152,696. Filing Date: 12 May 2016. Inventor: John G. Baylog et al. Address any questions concerning this matter to the Office of... 300118, 1 of 25. UNCERTAINTY PREDICTION IN PASSIVE TARGET MOTION ANALYSIS. STATEMENT OF GOVERNMENT INTEREST [0001] The invention described herein... at an unknown location and following an unknown course relative to an observer 12. Observer 12 has a sensor array such as a passive sonar or radar

  17. Coupling of system thermal–hydraulics and Monte-Carlo code: Convergence criteria and quantification of correlation between statistical uncertainty and coupled error

    International Nuclear Information System (INIS)

    Wu, Xu; Kozlowski, Tomasz

    2015-01-01

    Highlights: • Coupling of the Monte Carlo code Serpent and the thermal–hydraulics code RELAP5. • A convergence criterion is developed based on the statistical uncertainty of power. • The correlation between MC statistical uncertainty and coupled error is quantified. • Both UO2 and MOX single-assembly models are used in the coupled simulation. • Validation of the coupling results with the multi-group transport code DeCART. - Abstract: A coupled multi-physics approach plays an important role in improving computational accuracy. Compared with deterministic neutronics codes, Monte Carlo codes have the advantage of a higher resolution level. In the present paper, a three-dimensional continuous-energy Monte Carlo reactor physics burnup calculation code, Serpent, is coupled with a thermal–hydraulics safety analysis code, RELAP5. The coupled Serpent/RELAP5 code capability is demonstrated by the improved axial power distributions of UO2 and MOX single-assembly models, based on the OECD-NEA/NRC PWR MOX/UO2 Core Transient Benchmark. Comparisons of the calculation results using the coupled code with those from deterministic methods, specifically the heterogeneous multi-group transport code DeCART, show that the coupling produces more precise results. A new convergence criterion for the coupled simulation is developed based on the statistical uncertainty in the power distribution in the Monte Carlo code, rather than the ad-hoc criteria used in previous research. The new convergence criterion is shown to be more rigorous and equally convenient to use, though it requires a few more coupling steps to converge. Finally, the influence of the Monte Carlo statistical uncertainty on the coupled error of the power and thermal–hydraulics parameters is quantified. The results are presented such that they can be used to find the statistical uncertainty to use in Monte Carlo in order to achieve a desired precision in the coupled simulation
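The convergence criterion described in this abstract (iterate the coupled steps until the change in power between successive iterations falls below the Monte Carlo statistical uncertainty of the power tally) can be sketched with toy stand-ins for both codes; the three-node model, the 3-sigma threshold, and all numbers are illustrative assumptions, not the paper's actual codes or criterion values.

```python
# Sketch: coupled-iteration convergence judged against MC tally noise.
# Both "codes" are toy stand-ins (assumptions).
import math
import random

random.seed(42)

def mc_power(true_power, n_hist=40000):
    """Stand-in Monte Carlo tally: per-node mean and its standard error."""
    means, errs = [], []
    per_node = n_hist // len(true_power)
    for p in true_power:
        scores = [random.gauss(p, 0.3) for _ in range(per_node)]
        m = sum(scores) / len(scores)
        var = sum((s - m) ** 2 for s in scores) / (len(scores) - 1)
        means.append(m)
        errs.append(math.sqrt(var / len(scores)))
    return means, errs

def th_feedback(power):
    """Stand-in thermal-hydraulics feedback: mild flattening of the shape."""
    avg = sum(power) / len(power)
    return [0.7 * p + 0.3 * avg for p in power]

shape = [1.4, 1.0, 0.6]            # initial axial power shape (arbitrary)
for iteration in range(20):
    tallied, sigma = mc_power(shape)
    new_shape = th_feedback(tallied)
    delta = max(abs(a - b) for a, b in zip(new_shape, shape))
    shape = new_shape
    if delta < 3.0 * max(sigma):   # change is within the MC noise: stop
        break
```

Tying the stopping rule to the tally's own standard error, rather than a fixed tolerance, is the point the abstract makes: a tighter tolerance than the statistical noise cannot be meaningfully resolved.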

  18. Parameter Uncertainty for Repository Thermal Analysis

    Energy Technology Data Exchange (ETDEWEB)

Hardin, Ernest [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Hadgu, Teklu [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Greenberg, Harris [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)]; Dupont, Mark [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)]

    2015-10-01

This report is one follow-on to a study of reference geologic disposal design concepts (Hardin et al. 2011a). Based on an analysis of maximum temperatures, that study concluded that certain disposal concepts would require extended decay storage prior to emplacement, or the use of small waste packages, or both. The study used nominal values for thermal properties of host geologic media and engineered materials, demonstrating the need for uncertainty analysis to support the conclusions. This report is a first step that identifies the input parameters of the maximum temperature calculation, surveys published data on measured values, uses an analytical approach to determine which parameters are most important, and performs an example sensitivity analysis. Using results from this first step, temperature calculations planned for FY12 can focus on only the important parameters, and can use the uncertainty ranges reported here. The survey of published information on thermal properties of geologic media and engineered materials is intended to be sufficient for use in generic calculations to evaluate the feasibility of reference disposal concepts. A full compendium of literature data is beyond the scope of this report. The term “uncertainty” is used here to represent both measurement uncertainty and spatial variability, or variability across host geologic units. For the most important parameters (e.g., buffer thermal conductivity) the extent of literature data surveyed samples these different forms of uncertainty and variability. Finally, this report is intended to be one chapter or section of a larger FY12 deliverable summarizing all the work on design concepts and thermal load management for geologic disposal (M3FT-12SN0804032, due 15Aug2012).

  19. Uncertainty analysis of nuclear waste package corrosion

    International Nuclear Information System (INIS)

    Kurth, R.E.; Nicolosi, S.L.

    1986-01-01

    This paper describes the results of an evaluation of three uncertainty analysis methods for assessing the possible variability in calculating the corrosion process in a nuclear waste package. The purpose of the study is the determination of how each of three uncertainty analysis methods, Monte Carlo, Latin hypercube sampling (LHS) and a modified discrete probability distribution method, perform in such calculations. The purpose is not to examine the absolute magnitude of the numbers but rather to rank the performance of each of the uncertainty methods in assessing the model variability. In this context it was found that the Monte Carlo method provided the most accurate assessment but at a prohibitively high cost. The modified discrete probability method provided accuracy close to that of the Monte Carlo for a fraction of the cost. The LHS method was found to be too inaccurate for this calculation although it would be appropriate for use in a model which requires substantially more computer time than the one studied in this paper
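
For readers unfamiliar with the sampling schemes compared above, here is a minimal sketch of plain Monte Carlo versus Latin hypercube sampling. These are standard textbook constructions, not the implementations used in the study:

```python
import numpy as np

def lhs_uniform(n_samples, n_dims, rng):
    """Latin hypercube sample on [0,1)^d: stratify each dimension into
    n_samples equal bins, place one jittered point per bin, then shuffle
    each column independently so the strata are paired at random."""
    u = (np.arange(n_samples)[:, None] + rng.random((n_samples, n_dims))) / n_samples
    for j in range(n_dims):
        u[:, j] = rng.permutation(u[:, j])
    return u

rng = np.random.default_rng(0)
mc = rng.random((20, 2))        # plain Monte Carlo: no stratification
lhs = lhs_uniform(20, 2, rng)   # LHS: each axis stratum covered exactly once
# every one of the 20 bins along the first axis holds exactly one LHS point
counts = np.histogram(lhs[:, 0], bins=20, range=(0.0, 1.0))[0]
print(counts.min(), counts.max())  # -> 1 1
```

The stratification is why LHS usually needs far fewer samples than plain Monte Carlo for comparable accuracy, though (as the abstract notes) accuracy for a given model is not guaranteed.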

  20. A practical sensitivity analysis method for ranking sources of uncertainty in thermal–hydraulics applications

    Energy Technology Data Exchange (ETDEWEB)

Pourgol-Mohammad, Mohammad, E-mail: pourgolmohammad@sut.ac.ir [Department of Mechanical Engineering, Sahand University of Technology, Tabriz (Iran, Islamic Republic of)]; Hoseyni, Seyed Mohsen [Department of Basic Sciences, East Tehran Branch, Islamic Azad University, Tehran (Iran, Islamic Republic of)]; Hoseyni, Seyed Mojtaba [Building & Housing Research Center, Tehran (Iran, Islamic Republic of)]; Sepanloo, Kamran [Nuclear Science and Technology Research Institute, Tehran (Iran, Islamic Republic of)]

    2016-08-15

Highlights: • Existing uncertainty ranking methods prove inconsistent for TH applications. • Introduction of a new method for ranking sources of uncertainty in TH codes. • Modified PIRT qualitatively identifies and ranks uncertainty sources more precisely. • The importance of parameters is calculated by a limited number of TH code executions. • Methodology is applied successfully to the LOFT-LB1 test facility. - Abstract: In application to thermal–hydraulic calculations by system codes, sensitivity analysis plays an important role in managing the uncertainties of code output and in risk analysis. Sensitivity analysis is also used to confirm the results of the qualitative Phenomena Identification and Ranking Table (PIRT). Several methodologies have been developed to address uncertainty importance assessment. Generally, uncertainty importance measures, mainly devised for Probabilistic Risk Assessment (PRA) applications, are not affordable for the computationally demanding calculations of complex thermal–hydraulics (TH) system codes. In other words, for effective quantification of the degree of contribution of each phenomenon to the total uncertainty of the output, a practical approach is needed that accounts for the high computational burden of TH calculations. This study aims primarily to show the inefficiency of the existing approaches and then introduces a solution to cope with the challenges in this area by modifying the variance-based uncertainty importance method. Important parameters are identified qualitatively by the modified PIRT approach, and their uncertainty importance is then quantified by a local derivative index. The proposed index is attractive from a practicality point of view for TH applications. It is capable of calculating the importance of parameters with a limited number of TH code executions. Application of the proposed methodology is demonstrated on the LOFT-LB1 test facility.

  1. A practical sensitivity analysis method for ranking sources of uncertainty in thermal–hydraulics applications

    International Nuclear Information System (INIS)

    Pourgol-Mohammad, Mohammad; Hoseyni, Seyed Mohsen; Hoseyni, Seyed Mojtaba; Sepanloo, Kamran

    2016-01-01

Highlights: • Existing uncertainty ranking methods prove inconsistent for TH applications. • Introduction of a new method for ranking sources of uncertainty in TH codes. • Modified PIRT qualitatively identifies and ranks uncertainty sources more precisely. • The importance of parameters is calculated by a limited number of TH code executions. • Methodology is applied successfully to the LOFT-LB1 test facility. - Abstract: In application to thermal–hydraulic calculations by system codes, sensitivity analysis plays an important role in managing the uncertainties of code output and in risk analysis. Sensitivity analysis is also used to confirm the results of the qualitative Phenomena Identification and Ranking Table (PIRT). Several methodologies have been developed to address uncertainty importance assessment. Generally, uncertainty importance measures, mainly devised for Probabilistic Risk Assessment (PRA) applications, are not affordable for the computationally demanding calculations of complex thermal–hydraulics (TH) system codes. In other words, for effective quantification of the degree of contribution of each phenomenon to the total uncertainty of the output, a practical approach is needed that accounts for the high computational burden of TH calculations. This study aims primarily to show the inefficiency of the existing approaches and then introduces a solution to cope with the challenges in this area by modifying the variance-based uncertainty importance method. Important parameters are identified qualitatively by the modified PIRT approach, and their uncertainty importance is then quantified by a local derivative index. The proposed index is attractive from a practicality point of view for TH applications. It is capable of calculating the importance of parameters with a limited number of TH code executions. Application of the proposed methodology is demonstrated on the LOFT-LB1 test facility.
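
A local derivative importance index of the kind described above can be illustrated with a one-at-a-time central-difference scheme. The surrogate model and uncertainty values below are hypothetical stand-ins for a TH code and its PIRT-ranked parameters, not the authors' actual index:

```python
import numpy as np

def derivative_importance(model, x0, sigmas, rel_step=0.01):
    """Derivative-based local importance: estimate d(output)/d(x_i) by
    central differences around a nominal point x0, then scale by each
    input's uncertainty sigma_i so the indices are comparable.
    Needs only 2*n model (code) runs for n parameters."""
    x0 = np.asarray(x0, dtype=float)
    indices = []
    for i in range(len(x0)):
        h = rel_step * (abs(x0[i]) or 1.0)
        xp, xm = x0.copy(), x0.copy()
        xp[i] += h
        xm[i] -= h
        dydx = (model(xp) - model(xm)) / (2 * h)   # two "code runs" per input
        indices.append(abs(dydx) * sigmas[i])
    return np.array(indices)

# hypothetical surrogate for a TH output (e.g., peak cladding temperature)
model = lambda x: 600 + 50 * x[0] + 5 * x[1] ** 2 + 0.1 * x[2]
imp = derivative_importance(model, x0=[1.0, 1.0, 1.0], sigmas=[0.1, 0.2, 1.0])
print(imp.argmax())  # -> 0: the first parameter dominates under these sigmas
```

Compare this with variance-based (e.g., Sobol) indices, which capture interactions and nonlinearity but typically require hundreds or thousands of code runs.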

  2. PREMIUM - Benchmark on the quantification of the uncertainty of the physical models in the system thermal-hydraulic codes

    International Nuclear Information System (INIS)

    Skorek, Tomasz; Crecy, Agnes de

    2013-01-01

PREMIUM (Post BEMUSE Reflood Models Input Uncertainty Methods) is an activity launched with the aim of pushing forward the methods for quantifying physical model uncertainties in thermal-hydraulic codes. It is endorsed by OECD/NEA/CSNI/WGAMA. The PREMIUM benchmark is addressed to all who apply uncertainty evaluation methods based on input uncertainty quantification and propagation. The benchmark is based on a selected case of uncertainty analysis applied to the simulation of quench front propagation in an experimental test facility. Application to an experiment enables evaluation and confirmation of the quantified probability distribution functions on the basis of experimental data. The scope of the benchmark comprises a review of the existing methods, selection of potentially important uncertain input parameters, preliminary quantification of the ranges and distributions of the identified parameters, evaluation of the probability density functions using experimental results of tests performed on the FEBA test facility, and confirmation/validation of the performed quantification on the basis of a blind calculation of the Reflood 2-D PERICLES experiment. (authors)

  3. Effects of uncertainties of experimental data in the benchmarking of a computer code

    International Nuclear Information System (INIS)

    Meulemeester, E. de; Bouffioux, P.; Demeester, J.

    1980-01-01

Fuel rod performance modelling is sometimes approached in an academic way. The experience of the COMETHE code development since 1967 has clearly shown that benchmarking was the most important part of modelling development. Unfortunately, it requires well-characterized data. Although the two examples presented here were not intended for benchmarking, as the COMETHE calculations were performed only for an interpretation of the results, they illustrate the effects of a lack of fuel characterization and of power history uncertainties

  4. Report of a CSNI workshop on uncertainty analysis methods. Volume 1 + 2

    International Nuclear Information System (INIS)

    Wickett, A.J.; Yadigaroglu, G.

    1994-08-01

    The OECD NEA CSNI Principal Working Group 2 (PWG2) Task Group on Thermal Hydraulic System Behaviour (TGTHSB) has, in recent years, received presentations of a variety of different methods to analyze the uncertainty in the calculations of advanced unbiased (best estimate) codes. Proposals were also made for an International Standard Problem (ISP) to compare the uncertainty analysis methods. The objectives for the Workshop were to discuss and fully understand the principles of uncertainty analysis relevant to LOCA modelling and like problems, to examine the underlying issues from first principles, in preference to comparing and contrasting the currently proposed methods, to reach consensus on the issues identified as far as possible while not avoiding the controversial aspects, to identify as clearly as possible unreconciled differences, and to issue a Status Report. Eight uncertainty analysis methods were presented. A structured discussion of various aspects of uncertainty analysis followed - the need for uncertainty analysis, identification and ranking of uncertainties, characterisation, quantification and combination of uncertainties and applications, resources and future developments. As a result, the objectives set out above were, to a very large extent, achieved. Plans for the ISP were also discussed. Volume 1 contains a record of the discussions on uncertainty methods. Volume 2 is a compilation of descriptions of the eight uncertainty analysis methods presented at the workshop

  5. Probabilistic accident consequence uncertainty analysis: Food chain uncertainty assessment. Volume 1: Main report

    Energy Technology Data Exchange (ETDEWEB)

Brown, J. [National Radiological Protection Board (United Kingdom)]; Goossens, L.H.J.; Kraan, B.C.P. [Delft Univ. of Technology (Netherlands)] [and others]

    1997-06-01

    This volume is the first of a two-volume document that summarizes a joint project conducted by the US Nuclear Regulatory Commission and the European Commission to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This document reports on an ongoing project to assess uncertainty in the MACCS and COSYMA calculations for the offsite consequences of radionuclide releases by hypothetical nuclear power plant accidents. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain variables that affect calculations of offsite consequences. The expert judgment elicitation procedure and its outcomes are described in these volumes. Other panels were formed to consider uncertainty in other aspects of the codes. Their results are described in companion reports. Volume 1 contains background information and a complete description of the joint consequence uncertainty study. Volume 2 contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures for both panels, (3) the rationales and results for the panels on soil and plant transfer and animal transfer, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  6. Probabilistic accident consequence uncertainty analysis: Food chain uncertainty assessment. Volume 1: Main report

    International Nuclear Information System (INIS)

    Brown, J.; Goossens, L.H.J.; Kraan, B.C.P.

    1997-06-01

    This volume is the first of a two-volume document that summarizes a joint project conducted by the US Nuclear Regulatory Commission and the European Commission to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This document reports on an ongoing project to assess uncertainty in the MACCS and COSYMA calculations for the offsite consequences of radionuclide releases by hypothetical nuclear power plant accidents. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain variables that affect calculations of offsite consequences. The expert judgment elicitation procedure and its outcomes are described in these volumes. Other panels were formed to consider uncertainty in other aspects of the codes. Their results are described in companion reports. Volume 1 contains background information and a complete description of the joint consequence uncertainty study. Volume 2 contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures for both panels, (3) the rationales and results for the panels on soil and plant transfer and animal transfer, (4) short biographies of the experts, and (5) the aggregated results of their responses

  7. Light-water reactor safety analysis codes

    International Nuclear Information System (INIS)

    Jackson, J.F.; Ransom, V.H.; Ybarrondo, L.J.; Liles, D.R.

    1980-01-01

    A brief review of the evolution of light-water reactor safety analysis codes is presented. Included is a summary comparison of the technical capabilities of major system codes. Three recent codes are described in more detail to serve as examples of currently used techniques. Example comparisons between calculated results using these codes and experimental data are given. Finally, a brief evaluation of current code capability and future development trends is presented

  8. Representing uncertainty on model analysis plots

    Directory of Open Access Journals (Sweden)

    Trevor I. Smith

    2016-09-01

Model analysis provides a mechanism for representing student learning as measured by standard multiple-choice surveys. The model plot contains information regarding both how likely students in a particular class are to choose the correct answer and how likely they are to choose an answer consistent with a well-documented conceptual model. Unfortunately, Bao’s original presentation of the model plot did not include a way to represent uncertainty in these measurements. I present details of a method to add error bars to model plots by expanding the work of Sommer and Lindell. I also provide a template for generating model plots with error bars.
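
As a rough illustration of attaching error bars to a model-plot point, one can use the standard error of the mean over students. This is only a generic stand-in for the Sommer-and-Lindell treatment the abstract extends; the class data are hypothetical:

```python
import numpy as np

def model_point_with_errors(correct, model_consistent):
    """Class-level point for a model plot: mean probability of choosing the
    correct answer vs mean probability of choosing answers consistent with a
    documented incorrect model, each with standard-error-of-the-mean bars."""
    c = np.asarray(correct, dtype=float)          # per-student fractions
    m = np.asarray(model_consistent, dtype=float)
    n = len(c)
    return ((c.mean(), c.std(ddof=1) / np.sqrt(n)),
            (m.mean(), m.std(ddof=1) / np.sqrt(n)))

# hypothetical class of three students
(c_mu, c_sem), (m_mu, m_sem) = model_point_with_errors([0.5, 0.7, 0.9],
                                                       [0.4, 0.2, 0.1])
print(c_mu, c_sem)
```

The two returned pairs give the point coordinates and the horizontal/vertical error-bar half-widths for plotting, e.g. with `matplotlib.pyplot.errorbar`.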

  9. Uncertainty analysis of time-dependent nonlinear systems: theory and application to transient thermal hydraulics

    International Nuclear Information System (INIS)

    Barhen, J.; Bjerke, M.A.; Cacuci, D.G.; Mullins, C.B.; Wagschal, G.G.

    1982-01-01

An advanced methodology for performing systematic uncertainty analysis of time-dependent nonlinear systems is presented. This methodology includes a capability for reducing uncertainties in system parameters and responses by using Bayesian inference techniques to consistently combine prior knowledge with additional experimental information. The determination of best estimates for the system parameters, for the responses, and for their respective covariances is treated as a time-dependent constrained minimization problem. Three alternative formalisms for solving this problem are developed. The two "off-line" formalisms, with and without "foresight" characteristics, require the generation of a complete sensitivity data base prior to performing the uncertainty analysis. The "online" formalism, in which uncertainty analysis is performed interactively with the system analysis code, is best suited for treatment of large-scale highly nonlinear time-dependent problems. This methodology is applied to the uncertainty analysis of a transient upflow of a high pressure water heat transfer experiment. For comparison, an uncertainty analysis using sensitivities computed by standard response surface techniques is also performed. The results of the analysis indicate the following. Major reduction of the discrepancies in the calculation/experiment ratios is achieved by using the new methodology. Incorporation of in-bundle measurements in the uncertainty analysis significantly reduces system uncertainties. Accuracy of sensitivities generated by response-surface techniques should be carefully assessed prior to using them as a basis for uncertainty analyses of transient reactor safety problems
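
The core Bayesian step described above, consistently combining prior knowledge with experimental information, reduces in the simplest scalar Gaussian case to inverse-variance weighting. The numbers below are illustrative, not data from the heat transfer experiment:

```python
def bayes_update(prior, var_prior, meas, var_meas):
    """Combine a Gaussian prior estimate with a Gaussian measurement.
    The posterior mean is the inverse-variance-weighted average and the
    posterior variance is always smaller than either input variance."""
    w = var_meas / (var_prior + var_meas)          # weight on the prior
    post = w * prior + (1 - w) * meas
    var_post = var_prior * var_meas / (var_prior + var_meas)
    return post, var_post

post, var_post = bayes_update(prior=1.00, var_prior=0.04,
                              meas=1.20, var_meas=0.01)
print(post, var_post)  # posterior pulled toward the more precise measurement
```

The time-dependent constrained-minimization formalisms in the abstract generalize this idea to vectors of parameters and responses with full covariance matrices.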

  10. A research on verification of the CONTAIN CODE model and the uncertainty reduction method for containment integrity

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jae-Hong; Kim, Moo-Hwan; Bae, Seong-Won; Byun, Sang-Chul [Pohang University of Science and Technology, Pohang (Korea, Republic of)

    1998-03-15

The final objectives of this study are to establish methods for assessing the integrity of containment building structures and for safety analysis during postulated severe accidents, and to decrease the uncertainty of these methods. To that end, the CONTAIN 1.2 code models for analyzing severe accident phenomena and the heat transfer between the air inside the containment building and the inner walls have been reviewed and analyzed. For the double containment wall provided for the next-generation nuclear reactor, which differs from the previous type of containment, the temperature and pressure histories were calculated and compared with the results for previous designs.

  11. Fukushima Daiichi unit 1 uncertainty analysis--Preliminary selection of uncertain parameters and analysis methodology

    Energy Technology Data Exchange (ETDEWEB)

    Cardoni, Jeffrey N.; Kalinich, Donald A.

    2014-02-01

    Sandia National Laboratories (SNL) plans to conduct uncertainty analyses (UA) on the Fukushima Daiichi unit (1F1) plant with the MELCOR code. The model to be used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and Nuclear Regulatory Commission (NRC). However, that study only examined a handful of various model inputs and boundary conditions, and the predictions yielded only fair agreement with plant data and current release estimates. The goal of this uncertainty study is to perform a focused evaluation of uncertainty in core melt progression behavior and its effect on key figures-of-merit (e.g., hydrogen production, vessel lower head failure, etc.). In preparation for the SNL Fukushima UA work, a scoping study has been completed to identify important core melt progression parameters for the uncertainty analysis. The study also lays out a preliminary UA methodology.

  12. Status of SPACE Safety Analysis Code Development

    International Nuclear Information System (INIS)

    Lee, Dong Hyuk; Yang, Chang Keun; Kim, Se Yun; Ha, Sang Jun

    2009-01-01

In 2006, the Korean nuclear industry started developing a thermal-hydraulic analysis code for safety analysis of PWRs (Pressurized Water Reactors). The new code is named SPACE (Safety and Performance Analysis Code for Nuclear Power Plant). The SPACE code can solve the two-fluid, three-field governing equations in one-dimensional or three-dimensional geometry. The SPACE code has many component models required for modeling a PWR, such as the reactor coolant pump, safety injection tank, etc. The programming language used in the new code is C++, for a new generation of engineers who are more comfortable with C/C++ than the old FORTRAN language. This paper describes the general characteristics of the SPACE code and the current status of SPACE code development

  13. Uncertainty Analysis and Expert Judgment in Seismic Hazard Analysis

    Science.gov (United States)

    Klügel, Jens-Uwe

    2011-01-01

The large uncertainty associated with the prediction of future earthquakes is usually regarded as the main reason for the increased hazard estimates which have resulted from some recent large scale probabilistic seismic hazard analysis studies (e.g. the PEGASOS study in Switzerland and the Yucca Mountain study in the USA). It is frequently overlooked that such increased hazard estimates are characteristic of a single specific method of probabilistic seismic hazard analysis (PSHA): the traditional (Cornell-McGuire) PSHA method, which has found its highest level of sophistication in the SSHAC probability method. Based on a review of the SSHAC probability model and its application in the PEGASOS project, it is shown that the surprising results of recent PSHA studies can be explained to a large extent by the uncertainty model used in traditional PSHA, which deviates from the state of the art in mathematics and risk analysis. This uncertainty model, the Ang-Tang uncertainty model, mixes concepts of decision theory with probabilistic hazard assessment methods, leading to an overestimation of uncertainty in comparison to empirical evidence. Although expert knowledge can be a valuable source of scientific information, its incorporation into the SSHAC probability method does not resolve the issue of inflated uncertainties in PSHA results. Other, more data-driven, PSHA approaches in use in some European countries are less vulnerable to this effect. The most valuable alternative to traditional PSHA is the direct probabilistic scenario-based approach, which is closely linked with emerging neo-deterministic methods based on waveform modelling.

  14. Uncertainty analysis of the 35% reactor inlet header break in a CANDU 6 reactor using RELAP/SCDAPSIM/MOD4.0 with integrated uncertainty analysis option

    Energy Technology Data Exchange (ETDEWEB)

Dupleac, D., E-mail: danieldu@cne.pub.ro [Politehnica Univ. of Bucharest (Romania)]; Perez, M.; Reventos, F., E-mail: marina.perez@upc.edu, E-mail: francesc.reventos@upc.edu [Technical Univ. of Catalonia (Spain)]; Allison, C., E-mail: iss@cableone.net [Innovative Systems Software (United States)]

    2011-07-01

The RELAP/SCDAPSIM/MOD4.0 code, designed to predict the behavior of reactor systems during normal and accident conditions, is being developed as part of an international nuclear technology Software Development and Training Program (SDTP). RELAP/SCDAPSIM/MOD4.0, which is the first version of RELAP5 completely rewritten to FORTRAN 90/95/2000 standards, uses the publicly available RELAP5 and SCDAP models in combination with (a) advanced programming and numerical techniques, (b) advanced SDTP-member-developed models for LWR, HWR, and research reactor analysis, and (c) a variety of other member-developed computational packages. One such computational package is an integrated uncertainty analysis (IUA) package being developed jointly by the Technical University of Catalonia (UPC) and Innovative Systems Software (ISS). RELAP/SCDAPSIM/MOD4.0(IUA) follows the input-propagation approach using probability distribution functions to define the uncertainty of the input parameters. The main steps for this type of methodology, often referred to as statistical approaches or Wilks’ methods, are the ones that follow: 1. Selection of the plant; 2. Selection of the scenario; 3. Selection of the safety criteria; 4. Identification and ranking of the relevant phenomena based on the safety criteria; 5. Selection of the appropriate code parameters to represent those phenomena; 6. Association of uncertainty by means of Probability Distribution Functions (PDFs) for each selected parameter; 7. Random sampling of the selected parameters according to its PDF and performing multiple computer runs to obtain uncertainty bands with a certain percentile and confidence level; 8. Processing the results of the multiple computer runs to estimate the uncertainty bands for the computed quantities associated with the selected safety criteria. RELAP/SCDAPSIM/MOD4.0(IUA) calculates the number of required code runs given the desired percentile and confidence level, performs the sampling process for the

  15. Uncertainty analysis of the 35% reactor inlet header break in a CANDU 6 reactor using RELAP/SCDAPSIM/MOD4.0 with integrated uncertainty analysis option

    International Nuclear Information System (INIS)

    Dupleac, D.; Perez, M.; Reventos, F.; Allison, C.

    2011-01-01

The RELAP/SCDAPSIM/MOD4.0 code, designed to predict the behavior of reactor systems during normal and accident conditions, is being developed as part of an international nuclear technology Software Development and Training Program (SDTP). RELAP/SCDAPSIM/MOD4.0, which is the first version of RELAP5 completely rewritten to FORTRAN 90/95/2000 standards, uses the publicly available RELAP5 and SCDAP models in combination with (a) advanced programming and numerical techniques, (b) advanced SDTP-member-developed models for LWR, HWR, and research reactor analysis, and (c) a variety of other member-developed computational packages. One such computational package is an integrated uncertainty analysis (IUA) package being developed jointly by the Technical University of Catalonia (UPC) and Innovative Systems Software (ISS). RELAP/SCDAPSIM/MOD4.0(IUA) follows the input-propagation approach using probability distribution functions to define the uncertainty of the input parameters. The main steps for this type of methodology, often referred to as statistical approaches or Wilks’ methods, are the ones that follow: 1. Selection of the plant; 2. Selection of the scenario; 3. Selection of the safety criteria; 4. Identification and ranking of the relevant phenomena based on the safety criteria; 5. Selection of the appropriate code parameters to represent those phenomena; 6. Association of uncertainty by means of Probability Distribution Functions (PDFs) for each selected parameter; 7. Random sampling of the selected parameters according to its PDF and performing multiple computer runs to obtain uncertainty bands with a certain percentile and confidence level; 8. Processing the results of the multiple computer runs to estimate the uncertainty bands for the computed quantities associated with the selected safety criteria. RELAP/SCDAPSIM/MOD4.0(IUA) calculates the number of required code runs given the desired percentile and confidence level, performs the sampling process for the
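
The run-count calculation mentioned in the abstract follows Wilks' nonparametric tolerance-limit formula. A sketch of the standard result (not the MOD4.0 implementation itself):

```python
import math

def wilks_sample_size(beta=0.95, gamma=0.95, order=1):
    """Smallest N such that the order-th largest of N random code runs is a
    one-sided upper tolerance limit covering the beta quantile of the output
    with confidence gamma (Wilks' formula, as used in statistical
    best-estimate-plus-uncertainty methodologies)."""
    n = order
    while True:
        # confidence = P(at least `order` of the N runs exceed the beta quantile)
        conf = 1.0 - sum(math.comb(n, j) * (1 - beta) ** j * beta ** (n - j)
                         for j in range(order))
        if conf >= gamma:
            return n
        n += 1

print(wilks_sample_size())           # first-order 95/95 -> 59 runs
print(wilks_sample_size(order=2))    # second-order 95/95 -> 93 runs
```

This is why 59 code runs is the canonical sample size quoted for 95%/95% uncertainty bands in this class of methodology.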

  16. Risk uncertainty analysis methods for NUREG-1150

    International Nuclear Information System (INIS)

    Benjamin, A.S.; Boyd, G.J.

    1987-01-01

    Evaluation and display of risk uncertainties for NUREG-1150 constitute a principal focus of the Severe Accident Risk Rebaselining/Risk Reduction Program (SARRP). Some of the principal objectives of the uncertainty evaluation are: (1) to provide a quantitative estimate that reflects, for those areas considered, a credible and realistic range of uncertainty in risk; (2) to rank the various sources of uncertainty with respect to their importance for various measures of risk; and (3) to characterize the state of understanding of each aspect of the risk assessment for which major uncertainties exist. This paper describes the methods developed to fulfill these objectives

  17. Uncertainty Methods Framework Development for the TRACE Thermal-Hydraulics Code by the U.S.NRC

    International Nuclear Information System (INIS)

    Bajorek, Stephen M.; Gingrich, Chester

    2013-01-01

The Code of Federal Regulations, Title 10, Part 50.46 requires that the Emergency Core Cooling System (ECCS) performance be evaluated for a number of postulated Loss-Of-Coolant-Accidents (LOCAs). The rule allows two methods for calculating the acceptance criteria: using a realistic model in the so-called 'Best Estimate' approach, or the more prescriptive approach following Appendix K to Part 50. Because of the conservatism of Appendix K, recent Evaluation Model submittals to the NRC used the realistic approach. With this approach, the Evaluation Model must demonstrate that the Peak Cladding Temperature (PCT), the Maximum Local Oxidation (MLO) and Core-Wide Oxidation (CWO) remain below their regulatory limits with a 'high probability'. Guidance for Best Estimate calculations following 50.46(a)(1) was provided by Regulatory Guide 1.157. This Guide identified a 95% probability level as being acceptable for comparisons of best-estimate predictions to the applicable regulatory limits, but it was vague with respect to acceptable methods for determining the code uncertainty and did not specify whether a confidence level should be determined. As a result, vendors have developed Evaluation Models utilizing several different methods to combine uncertainty parameters and determine the PCT and other variables to a high probability. In order to quantify the accuracy of TRACE calculations for a wide variety of applications and to audit Best Estimate calculations made by industry, the NRC is developing its own independent methodology to determine the peak cladding temperature and other parameters of regulatory interest to a high probability. Because several methods are in use, and each vendor's methodology varies different parameters, the NRC method must be flexible and sufficiently general. 
Not only must the method apply to LOCA analysis for conventional light-water reactors, it must also be extendable to new reactor designs and type of analyses where the acceptance criteria are less

  18. Representation of analysis results involving aleatory and epistemic uncertainty.

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Jay Dean (ProStat, Mesa, AZ); Helton, Jon Craig (Arizona State University, Tempe, AZ); Oberkampf, William Louis; Sallaberry, Cedric J.

    2008-08-01

    Procedures are described for the representation of results in analyses that involve both aleatory uncertainty and epistemic uncertainty, with aleatory uncertainty deriving from an inherent randomness in the behavior of the system under study and epistemic uncertainty deriving from a lack of knowledge about the appropriate values to use for quantities that are assumed to have fixed but poorly known values in the context of a specific study. Aleatory uncertainty is usually represented with probability and leads to cumulative distribution functions (CDFs) or complementary cumulative distribution functions (CCDFs) for analysis results of interest. Several mathematical structures are available for the representation of epistemic uncertainty, including interval analysis, possibility theory, evidence theory and probability theory. In the presence of epistemic uncertainty, there is not a single CDF or CCDF for a given analysis result. Rather, there is a family of CDFs and a corresponding family of CCDFs that derive from epistemic uncertainty and have an uncertainty structure that derives from the particular uncertainty structure (i.e., interval analysis, possibility theory, evidence theory, probability theory) used to represent epistemic uncertainty. Graphical formats for the representation of epistemic uncertainty in families of CDFs and CCDFs are investigated and presented for the indicated characterizations of epistemic uncertainty.
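
    When epistemic uncertainty is represented probabilistically, the family of CDFs described above can be generated by double-loop sampling: an outer loop over epistemically uncertain quantities and an inner loop over aleatory realizations. A minimal sketch with a hypothetical model (normally distributed aleatory variability whose mean is epistemically uncertain):

```python
import numpy as np

rng = np.random.default_rng(0)

# Epistemic: the mean of the aleatory distribution is only known to lie in [1, 2]
epistemic_means = rng.uniform(1.0, 2.0, size=20)   # 20 epistemic samples

# Aleatory: for each fixed epistemic value, the result is random
grid = np.linspace(-2, 6, 200)                     # values at which CDFs are evaluated
family = []                                        # one CDF per epistemic sample
for mu in epistemic_means:
    aleatory = rng.normal(mu, 1.0, size=1000)
    cdf = (aleatory[:, None] <= grid).mean(axis=0) # empirical CDF on the grid
    family.append(cdf)
family = np.array(family)

# Summaries commonly plotted for the family: pointwise envelope and mean CDF
lower, upper = family.min(axis=0), family.max(axis=0)
mean_cdf = family.mean(axis=0)
```

    Plotting all rows of `family` on one axis gives the "spaghetti" of CDFs; `lower` and `upper` trace the envelope that the graphical formats in the report summarize.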

  19. Comprehensive neutron cross-section and secondary energy distribution uncertainty analysis for a fusion reactor

    International Nuclear Information System (INIS)

    Gerstl, S.A.W.; LaBauve, R.J.; Young, P.G.

    1980-05-01

    Using the example of General Atomic's well-documented Power Generating Fusion Reactor (PGFR) design, this report carries out a comprehensive neutron cross-section and secondary energy distribution (SED) uncertainty analysis. The LASL sensitivity and uncertainty analysis code SENSIT is used to calculate reaction cross-section sensitivity profiles and integral SED sensitivity coefficients. These are then folded with covariance matrices and integral SED uncertainties to obtain the resulting uncertainties of three calculated neutronics design parameters: two critical radiation damage rates and a nuclear heating rate. The report documents the first sensitivity-based data uncertainty analysis that incorporates a quantitative treatment of the effects of SED uncertainties. The results demonstrate quantitatively that the ENDF/B-V cross-section data files for C, H, and O, including their SED data, are fully adequate for this design application, while the data for Fe and Ni are at best marginally adequate because they give rise to response uncertainties of up to 25%. Much higher response uncertainties are caused by cross-section and SED data uncertainties in Cu (26 to 45%), W (24 to 54%), and Cr (up to 98%). Specific recommendations are given for re-evaluations of certain reaction cross-sections, secondary energy distributions, and uncertainty estimates

  20. Assessing scenario and parametric uncertainties in risk analysis: a model uncertainty audit

    International Nuclear Information System (INIS)

    Tarantola, S.; Saltelli, A.; Draper, D.

    1999-01-01

    In the present study a process of model audit is addressed on a computational model used for predicting maximum radiological doses to humans in the field of nuclear waste disposal. Global uncertainty and sensitivity analyses are employed to assess output uncertainty and to quantify the contribution of parametric and scenario uncertainties to the model output. These tools are of fundamental importance for risk analysis and decision making purposes

  1. Stability analysis by ERATO code

    International Nuclear Information System (INIS)

    Tsunematsu, Toshihide; Takeda, Tatsuoki; Matsuura, Toshihiko; Azumi, Masafumi; Kurita, Gen-ichi

    1979-12-01

    Problems in MHD stability calculations with the ERATO code are described, concerning the convergence properties of the results, the equilibrium codes, and machine optimization of the ERATO code. It is concluded that the irregularity on a convergence curve is not due to a fault of the ERATO code itself but to an inappropriate choice of the equilibrium calculation meshes. Also described are a code to calculate an equilibrium as a quasi-inverse problem and a code to calculate an equilibrium as the result of a transport process. Optimization of the code with respect to I/O operations reduced both CPU time and I/O time considerably. With the FACOM230-75 APU/CPU multiprocessor system, the performance is about 6 times as high as with the FACOM230-75 CPU alone, showing the effectiveness of a vector processing computer for this kind of MHD computation. This report is a summary of the material presented at the ERATO workshop 1979 (ORNL), supplemented with some details. (author)

  2. Reusable launch vehicle model uncertainties impact analysis

    Science.gov (United States)

    Chen, Jiaye; Mu, Rongjun; Zhang, Xin; Deng, Yanpeng

    2018-03-01

    Reusable launch vehicles (RLVs) typically feature a complex aerodynamic shape coupled with the propulsion system, and their flight environment is highly complicated and intensely changeable. The model therefore has large uncertainty, which makes the nominal system quite different from the real system. Studying the influence of these uncertainties on the stability of the control system is thus of great significance for controller design. In order to improve the performance of the RLV, this paper proposes an approach for analyzing the influence of the model uncertainties. For a typical RLV, the coupled dynamic and kinematic models are built. The different factors that introduce uncertainty during modeling are then analyzed and summarized, and the model uncertainties are expressed as an additive uncertainty model. The maximum singular value of the uncertainty matrix is chosen as the boundary model, and the norm of the uncertainty matrix is used to quantify how strongly each uncertainty factor influences the stability of the control system. The simulation results illustrate that the inertial factors have the largest influence on the stability of the system, and that it is necessary to take the model uncertainties into consideration before designing the controller for this kind of aircraft (like the RLV).

  3. Simulating fuel behavior under transient conditions using FRAPTRAN and uncertainty analysis using Dakota

    International Nuclear Information System (INIS)

    Gomes, Daniel S.; Teixeira, Antonio S.

    2017-01-01

    Although regulatory agencies have shown a special interest in incorporating best estimate approaches in the fuel licensing process, fuel codes are currently licensed based only on deterministic limits such as those in 10CFR50, and may therefore yield unrealistic safety margins. Uncertainty analysis is employed to manage this risk more realistically. In this study, uncertainties were classified into two categories: probabilistic and epistemic (owing to a lack of pre-existing knowledge). Fuel rods have three sources of uncertainty: manufacturing tolerances, boundary conditions, and physical models. The first step in analyzing the uncertainties is a statistical analysis of the input parameters used throughout the fuel code. The response obtained from this analysis must show proportional index correlations because the uncertainties are globally propagated. The Dakota toolkit was used to analyze the FRAPTRAN transient fuel code. The subsequent sensitivity analyses helped identify the key parameters with the highest correlation indices, including the peak cladding temperature and the time to cladding failure. The uncertainty analysis was performed for an IFA-650-5 fuel rod, in line with the tests performed in the Halden Project in Norway, whose main objectives included studying the ballooning and rupture processes. The results demonstrate the accuracy and applicability of the physical models in evaluating the thermal conductivity, the mechanical model, and the fuel swelling formulations. (author)

  4. Simulating fuel behavior under transient conditions using FRAPTRAN and uncertainty analysis using Dakota

    Energy Technology Data Exchange (ETDEWEB)

    Gomes, Daniel S.; Teixeira, Antonio S., E-mail: dsgomes@ipen.br, E-mail: teixeira@ipen [Instituto de Pesquisas Energéticas e Nucleares (IPEN/CNEN-SP), São Paulo, SP (Brazil)

    2017-07-01

    Although regulatory agencies have shown a special interest in incorporating best estimate approaches in the fuel licensing process, fuel codes are currently licensed based only on deterministic limits such as those in 10CFR50, and may therefore yield unrealistic safety margins. Uncertainty analysis is employed to manage this risk more realistically. In this study, uncertainties were classified into two categories: probabilistic and epistemic (owing to a lack of pre-existing knowledge). Fuel rods have three sources of uncertainty: manufacturing tolerances, boundary conditions, and physical models. The first step in analyzing the uncertainties is a statistical analysis of the input parameters used throughout the fuel code. The response obtained from this analysis must show proportional index correlations because the uncertainties are globally propagated. The Dakota toolkit was used to analyze the FRAPTRAN transient fuel code. The subsequent sensitivity analyses helped identify the key parameters with the highest correlation indices, including the peak cladding temperature and the time to cladding failure. The uncertainty analysis was performed for an IFA-650-5 fuel rod, in line with the tests performed in the Halden Project in Norway, whose main objectives included studying the ballooning and rupture processes. The results demonstrate the accuracy and applicability of the physical models in evaluating the thermal conductivity, the mechanical model, and the fuel swelling formulations. (author)

  5. International training program: 3D S.UN.COP - Scaling, uncertainty and 3D thermal-hydraulics/neutron-kinetics coupled codes seminar

    International Nuclear Information System (INIS)

    Petruzzi, A.; D'Auria, F.; Bajs, T.; Reventos, F.

    2006-01-01

    Thermal-hydraulic system computer codes are extensively used worldwide for the analysis of nuclear facilities by utilities, regulatory bodies, nuclear power plant designers and vendors, nuclear fuel companies, research organizations, consulting companies, and technical support organizations. The computer code user represents a source of uncertainty that can influence the results of system code calculations. This influence is commonly known as the 'user effect' and stems from the limitations embedded in the codes as well as from the limited capability of the analysts to use them. Code user training and qualification is an effective means of reducing the variation of results caused by the application of the codes by different users. This paper describes a systematic approach to training code users who, upon completion of the training, should be able to perform calculations making the best possible use of the capabilities of best estimate codes. In other words, the program aims at contributing towards solving the problem of the user effect. The 3D S.UN.COP 2005 (Scaling, Uncertainty and 3D COuPled code calculations) seminar was organized by the University of Pisa and the University of Zagreb as a follow-up to the proposal to the IAEA for a Permanent Training Course for System Code Users (D'Auria, 1998). It was recognized that such a course represented both a source of continuing education for current code users and a means for them to enter the formal training structure of a proposed 'permanent' stepwise approach to user training. The seminar-training was successfully held with the participation of 19 persons coming from 9 countries and 14 different institutions (universities, vendors, national laboratories and regulatory bodies). More than 15 scientists were involved in the organization of the seminar, presenting theoretical aspects of the proposed methodologies and holding the training and the final examination. A certificate (LA Code User grade) was released

  6. The uncertainty analysis of model results a practical guide

    CERN Document Server

    Hofer, Eduard

    2018-01-01

    This book is a practical guide to the uncertainty analysis of computer model applications. Used in many areas, such as engineering, ecology and economics, computer models are subject to various uncertainties at the level of model formulations, parameter values and input data. Naturally, it would be advantageous to know the combined effect of these uncertainties on the model results as well as whether the state of knowledge should be improved in order to reduce the uncertainty of the results most effectively. The book supports decision-makers, model developers and users in their argumentation for an uncertainty analysis and assists them in the interpretation of the analysis results.

  7. Finite mixture models for sensitivity analysis of thermal hydraulic codes for passive safety systems analysis

    Energy Technology Data Exchange (ETDEWEB)

    Di Maio, Francesco, E-mail: francesco.dimaio@polimi.it [Energy Department, Politecnico di Milano, Via La Masa 34, 20156 Milano (Italy); Nicola, Giancarlo [Energy Department, Politecnico di Milano, Via La Masa 34, 20156 Milano (Italy); Zio, Enrico [Energy Department, Politecnico di Milano, Via La Masa 34, 20156 Milano (Italy); Chair on System Science and Energetic Challenge Fondation EDF, Ecole Centrale Paris and Supelec, Paris (France); Yu, Yu [School of Nuclear Science and Engineering, North China Electric Power University, 102206 Beijing (China)

    2015-08-15

    Highlights: • Uncertainties of TH codes affect the system failure probability quantification. • We present Finite Mixture Models (FMMs) for sensitivity analysis of TH codes. • FMMs approximate the pdf of the output of a TH code with a limited number of simulations. • The approach is tested on a Passive Containment Cooling System of an AP1000 reactor. • The novel approach outperforms a standard variance decomposition method. - Abstract: For the safety analysis of Nuclear Power Plants (NPPs), Best Estimate (BE) Thermal Hydraulic (TH) codes are used to predict system response in normal and accidental conditions. The assessment of the uncertainties of TH codes is a critical issue for system failure probability quantification. In this paper, we consider passive safety systems of advanced NPPs and present a novel approach to Sensitivity Analysis (SA). The approach is based on Finite Mixture Models (FMMs), which approximate the probability density function (i.e., the uncertainty) of the output of the passive safety system TH code with a limited number of simulations. To keep the computational cost low, an Expectation Maximization (EM) algorithm is used to calculate the saliency of the TH code input variables and identify those that most affect the system functional failure. The novel approach is compared with a standard variance decomposition method on a case study considering the Passive Containment Cooling System (PCCS) of the AP1000 reactor.
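
    As a rough illustration of the FMM idea, the sketch below fits a two-component Gaussian mixture to a bimodal sample by Expectation Maximization; the actual method in the paper operates on TH code outputs and input saliencies, which are not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for TH code outputs with two regimes (hypothetical, not from the paper)
x = np.concatenate([rng.normal(0.0, 1.0, 300), rng.normal(5.0, 1.0, 200)])

# EM for a two-component 1-D Gaussian mixture
w = np.array([0.5, 0.5])               # mixture weights
mu = np.array([x.min(), x.max()])      # crude initial means
var = np.array([x.var(), x.var()])     # initial variances

for _ in range(100):
    # E-step: responsibility of each component for each sample
    pdf = np.exp(-(x[:, None] - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
    r = w * pdf
    r /= r.sum(axis=1, keepdims=True)
    # M-step: re-estimate weights, means, variances from responsibilities
    n_k = r.sum(axis=0)
    w = n_k / len(x)
    mu = (r * x[:, None]).sum(axis=0) / n_k
    var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / n_k
```

    The fitted `w`, `mu` and `var` give a cheap analytical approximation of the output pdf from a limited number of simulations, which is the role the FMM plays in the paper's SA method.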

  8. PUFF-III: A Code for Processing ENDF Uncertainty Data Into Multigroup Covariance Matrices

    International Nuclear Information System (INIS)

    Dunn, M.E.

    2000-01-01

    PUFF-III is an extension of the previous PUFF-II code that was developed in the 1970s and early 1980s. The PUFF codes process the Evaluated Nuclear Data File (ENDF) covariance data and generate multigroup covariance matrices on a user-specified energy grid structure. Unlike its predecessor, PUFF-III can process the new ENDF/B-VI data formats. In particular, PUFF-III has the capability to process the spontaneous fission covariances for fission neutron multiplicity. With regard to the covariance data in File 33 of the ENDF system, PUFF-III has the capability to process short-range variance formats, as well as the lumped reaction covariance data formats that were introduced in ENDF/B-V. In addition to the new ENDF formats, a new directory feature is now available that allows the user to obtain a detailed directory of the uncertainty information in the data files without visually inspecting the ENDF data. Following the correlation matrix calculation, PUFF-III also evaluates the eigenvalues of each correlation matrix and tests each matrix for positive definiteness. Additional new features are discussed in the manual. PUFF-III has been developed for implementation in the AMPX code system, and several modifications were incorporated to improve memory allocation tasks and input/output operations. Consequently, the resulting code has a structure that is similar to other modules in the AMPX code system. With the release of PUFF-III, a new and improved covariance processing code is available to process ENDF covariance formats through Version VI

  9. Methodologies for uncertainty analysis in the level 2 PSA and their implementation procedures

    International Nuclear Information System (INIS)

    Ahn, Kwang Il; Yang, Joon Eun; Kim, Dong Ha

    2002-04-01

    The main purpose of this report is to present standardized methodologies for uncertainty analysis in the Level 2 Probabilistic Safety Assessment (PSA) and their implementation procedures, based on results obtained through a critical review of the existing methodologies for the analysis of uncertainties employed in the Level 2 PSA, especially the Accident Progression Event Tree (APET). Uncertainties employed in the Level 2 PSA are quantitative expressions of the overall knowledge of the analysts and experts participating in the probabilistic quantification of phenomenological accident progressions ranging from core melt to containment failure; their numerical values are directly related to the degree of confidence that the analyst has that a given phenomenological event or accident process will or will not occur, i.e., the analyst's subjective probabilities of occurrence. The results obtained from the Level 2 PSA uncertainty analysis become an essential contributor to the plant risk, in addition to the Level 1 and Level 3 PSA uncertainties. The uncertainty analysis methodologies and implementation procedures presented in this report were prepared based on the following criterion: 'the uncertainty quantification process must be logical, scrutable, complete, consistent and in an appropriate level of detail, as mandated by the Level 2 PSA objectives'. For this purpose, the report deals mainly with (1) a summary of general and Level 2 PSA-specific uncertainty analysis methodologies, (2) the selection of phenomenological branch events for uncertainty analysis in the APET, and a methodology for the quantification of APET uncertainty inputs with its implementation procedure, (3) the statistical propagation of uncertainty inputs through the APET with its implementation procedure, and (4) a formal procedure for the quantification of APET uncertainties and source term categories (STCs) through the Level 2 PSA quantification codes
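
    The statistical propagation of uncertain branch probabilities through an APET can be sketched with Monte Carlo sampling over a toy two-branch tree; the branch-probability distributions below are purely illustrative, not taken from any PSA:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 10000

# Hypothetical two-branch accident progression event tree:
# P(early containment failure | core melt) and P(late failure | no early failure),
# with analyst uncertainty expressed as beta distributions (illustrative only)
p_early = rng.beta(2, 8, n)
p_late = rng.beta(1, 9, n)

cdf_freq = 1e-5                                   # core damage frequency (fixed here)
f_early = cdf_freq * p_early                      # early-failure source term category
f_late = cdf_freq * (1 - p_early) * p_late        # late-failure category
f_intact = cdf_freq * (1 - p_early) * (1 - p_late)  # containment-intact category

# Each source-term-category frequency now has a distribution, not a point value
summary = {k: np.percentile(v, [5, 50, 95])
           for k, v in {"early": f_early, "late": f_late, "intact": f_intact}.items()}
```

    Because every sampled tree conserves probability, the three category frequencies always sum to the core damage frequency, while their percentiles display the propagated uncertainty.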

  10. A methodology for uncertainty analysis of reference equations of state

    DEFF Research Database (Denmark)

    Cheung, Howard; Frutiger, Jerome; Bell, Ian H.

    We present a detailed methodology for the uncertainty analysis of reference equations of state (EOS) based on Helmholtz energy. In recent years there has been an increased interest in uncertainties of property data and process models of thermal systems. In the literature there are various...... for uncertainty analysis is suggested as a tool for EOS. The uncertainties of the EOS properties are calculated from the experimental values and the EOS model structure through the parameter covariance matrix and subsequent linear error propagation. This allows reporting the uncertainty range (95% confidence...
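
    The linear error propagation step described above amounts to var(y) ≈ J Σ Jᵀ, with J the sensitivity of the property to the model parameters and Σ the parameter covariance matrix. A generic numeric sketch (the model and covariance values are illustrative, not an actual Helmholtz-energy EOS):

```python
import numpy as np

def propagate_uncertainty(f, p, cov_p, h=1e-6):
    """Linear propagation of parameter covariance through a scalar model:
    var(f) ~ J cov_p J^T, with J the numeric (central-difference) Jacobian at p."""
    p = np.asarray(p, dtype=float)
    y0 = f(p)
    J = np.empty_like(p)
    for i in range(len(p)):
        dp = np.zeros_like(p)
        dp[i] = h * max(1.0, abs(p[i]))
        J[i] = (f(p + dp) - f(p - dp)) / (2 * dp[i])
    var = J @ cov_p @ J
    return y0, np.sqrt(var)

# Example: property y = 2*a + 3*b with independent parameter uncertainties
cov = np.diag([0.1**2, 0.2**2])
y, sigma = propagate_uncertainty(lambda p: 2 * p[0] + 3 * p[1], [1.0, 1.0], cov)
ci95 = 1.96 * sigma   # half-width of the ~95% confidence interval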

  11. Best estimate analysis of LOFT L2-5 with CATHARE: uncertainty and sensitivity analysis

    Energy Technology Data Exchange (ETDEWEB)

    JOUCLA, Jerome; PROBST, Pierre [Institute for Radiological Protection and Nuclear Safety, Fontenay-aux-Roses (France); FOUET, Fabrice [APTUS, Versailles (France)

    2008-07-01

    The revision of the 10 CFR50.46 in 1988 has made possible the use of best-estimate codes. They may be used in safety demonstration and licensing, provided that uncertainties are added to the relevant output parameters before comparing them with the acceptance criteria. In the safety analysis of the large break loss of coolant accident, it was agreed that the 95. percentile estimated with a high degree of confidence should be lower than the acceptance criteria. It appeared necessary to IRSN, technical support of the French Safety Authority, to get more insight into these strategies which are being developed not only in thermal-hydraulics but in other fields such as in neutronics. To estimate the 95. percentile with a high confidence level, we propose to use rank statistics or bootstrap. Toward the objective of assessing uncertainty, it is useful to determine and to classify the main input parameters. We suggest approximating the code by a surrogate model, the Kriging model, which will be used to make a sensitivity analysis with the SOBOL methodology. This paper presents the application of two new methodologies of how to make the uncertainty and sensitivity analysis on the maximum peak cladding temperature of the LOFT L2-5 test with the CATHARE code. (authors)
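
    The bootstrap route to the 95th percentile mentioned above can be sketched as follows; the 'code runs' here are synthetic stand-ins, not actual CATHARE peak cladding temperatures:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic stand-in for 200 best-estimate code runs of peak cladding temperature (K)
pct = rng.normal(1100.0, 40.0, size=200)

# Bootstrap the 95th percentile: resample runs with replacement, recompute
boot = np.array([np.percentile(rng.choice(pct, size=pct.size, replace=True), 95)
                 for _ in range(2000)])

point = np.percentile(pct, 95)      # point estimate of the 95th percentile
upper95 = np.percentile(boot, 95)   # 95% upper confidence bound from the bootstrap
```

    The upper bound, rather than the point estimate, is what would be compared with the acceptance criterion when a 95th percentile with high confidence is required.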

  12. Measurement uncertainty analysis techniques applied to PV performance measurements

    International Nuclear Information System (INIS)

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and into tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; and demonstrates that quality assurance and quality control measures have been accomplished. Valid data are defined as data having known and documented paths of origin, including theory; measurements; traceability to measurement standards; computations; and uncertainty analysis of results
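
    The interval estimate described above is typically built by combining independent uncertainty components in root-sum-square fashion and applying a coverage factor. A minimal sketch (component values are illustrative, not from the presentation):

```python
import math

# Root-sum-square combination of independent uncertainty components for a
# PV module power measurement (illustrative component values, in percent)
components = {"irradiance": 1.0, "temperature": 0.5, "data acquisition": 0.2}

combined = math.sqrt(sum(u**2 for u in components.values()))  # combined standard unc.
expanded = 2 * combined   # coverage factor k=2, roughly a 95% interval
```

    The reported result would then be stated as the measured value plus or minus `expanded` percent, together with the documented component budget.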

  13. Effect of activation cross section uncertainties in transmutation analysis of realistic low-activation steels for IFMIF

    Energy Technology Data Exchange (ETDEWEB)

    Cabellos, O.; García-Herranz, N.; Sanz, J. [Institute of Nuclear Fusion, UPM, Madrid (Spain); Cabellos, O.; García-Herranz, N.; Fernandez, P.; Fernandez, B. [Dept. of Nuclear Engineering, UPM, Madrid (Spain); Sanz, J. [Dept. of Power Engineering, UNED, Madrid (Spain); Reyes, S. [Safety, Environment and Health Group, ITER Joint Work Site, Cadarache Center (France)

    2008-07-01

    We address uncertainty analysis to draw conclusions on the reliability of the activation calculation in the International Fusion Materials Irradiation Facility (IFMIF) under the potential impact of activation cross-section uncertainties. The Monte Carlo methodology implemented in the ACAB code gives uncertainty estimates due to the synergetic/global effect of the complete set of cross-section uncertainties. An element-by-element analysis has proved a helpful tool for analysing the transmutation performance of irradiated materials. The uncertainty analysis results showed that for times beyond about 24 h the relative error in the contact dose rate can be as large as 23%. We have calculated the effect of cross-section uncertainties on the IFMIF activation of all the different elements. For EUROFER, the uncertainties for H and He are 7.3% and 5.6%, respectively. We have found significant uncertainties in the transmutation response for C, P and Nb.

  14. Application of code scaling applicability and uncertainty methodology to the large break loss of coolant

    International Nuclear Information System (INIS)

    Young, M.Y.; Bajorek, S.M.; Nissley, M.E.

    1998-01-01

    In the late 1980s, after completion of an extensive research program, the United States Nuclear Regulatory Commission (USNRC) amended its regulations (10CFR50.46) to allow the use of realistic physical models to analyze the loss of coolant accident (LOCA) in light water reactors. Prior to this time, the evaluation of this accident was subject to a prescriptive set of rules (Appendix K of the regulations) requiring conservative models and assumptions to be applied simultaneously, leading to very pessimistic estimates of the impact of this accident on the reactor core. The rule change therefore promised significant benefits to owners of power reactors, allowing them to increase output. In response to the rule change, a method called Code Scaling, Applicability and Uncertainty (CSAU) was developed to apply realistic methods while properly taking into account data uncertainty, uncertainty in physical modeling, and plant variability. The method was claimed to be structured, traceable, and practical, but was met with some criticism when first demonstrated. In 1996, the USNRC approved a methodology, based on CSAU, developed by a group led by Westinghouse. The lessons learned in this application of CSAU are summarized, and some of the issues raised concerning the validity and completeness of the CSAU methodology are also discussed. (orig.)

  15. Procedures for uncertainty and sensitivity analysis in repository performance assessment

    International Nuclear Information System (INIS)

    Poern, K.; Aakerlund, O.

    1985-10-01

    The objective of the project was mainly a literature study of available methods for the treatment of parameter uncertainty propagation and sensitivity aspects in complex models such as those concerning geologic disposal of radioactive waste. The study, which ran in parallel with the development of a code package (PROPER) for computer-assisted analysis of functions, also aims at the choice of accurate, cost-effective methods for uncertainty and sensitivity analysis. Such a choice depends on several factors, like the number of input parameters, the capacity of the model, and the computer resources required to use the model. Two basic approaches are addressed in the report. In one of these, the model of interest is directly simulated by an efficient sampling technique to generate an output distribution. In the other, the model is replaced by an approximating analytical response surface, which is then used in the sampling phase or in moment matching to generate the output distribution. Both approaches are illustrated by simple examples in the report. (author)
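
    The two basic approaches can be contrasted in a few lines: direct Monte Carlo on the model versus sampling a fitted response surface. The model below is a cheap stand-in so both can be run; in practice the response surface replaces an expensive code:

```python
import numpy as np

rng = np.random.default_rng(3)

def model(x):
    """Stand-in for an expensive model (hypothetical)."""
    return np.sin(x[0]) + 0.5 * x[1] ** 2

# Approach 1: direct Monte Carlo on the model itself (many model evaluations)
xs = rng.normal(size=(5000, 2))
direct = np.array([model(x) for x in xs])

# Approach 2: fit a quadratic response surface from a few runs, then sample it
train = rng.normal(size=(50, 2))

def features(x):
    # Quadratic basis: 1, x1, x2, x1^2, x1*x2, x2^2
    return np.array([1.0, x[0], x[1], x[0] ** 2, x[0] * x[1], x[1] ** 2])

A = np.array([features(x) for x in train])
coef, *_ = np.linalg.lstsq(A, np.array([model(x) for x in train]), rcond=None)
surrogate = np.array([features(x) @ coef for x in xs])   # cheap to sample
```

    The trade-off the report discusses is visible here: the surrogate needs only 50 model runs but inherits the approximation error of the quadratic fit.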

  16. Sampling based uncertainty analysis of 10% hot leg break LOCA in large scale test facility

    International Nuclear Information System (INIS)

    Sengupta, Samiran; Kraina, V.; Dubey, S. K.; Rao, R. S.; Gupta, S. K.

    2010-01-01

    Sampling-based uncertainty analysis was carried out to quantify the uncertainty in predictions of the best estimate code RELAP5/MOD3.2 for a thermal hydraulic test (10% hot leg break LOCA) performed in the Large Scale Test Facility (LSTF) as part of an IAEA coordinated research project. The nodalisation of the test facility was qualified at both the steady state and transient level by systematically applying the procedures of the Uncertainty Methodology based on Accuracy Extrapolation (UMAE); the uncertainty analysis was carried out using the Latin hypercube sampling (LHS) method to evaluate the uncertainty of ten input parameters. Sixteen output parameters were selected for uncertainty evaluation, and the uncertainty band between the 5th and 95th percentiles of the output parameters was evaluated. It was observed that the uncertainty band for the primary pressure during two-phase blowdown is larger than that of the remaining period. Similarly, a larger uncertainty band is observed for the accumulator injection flow during the reflood phase. An importance analysis was also carried out, and standardized rank regression coefficients were computed to quantify the effect of each individual input parameter on the output parameters. It was observed that the break discharge coefficient is the most important uncertain parameter for the prediction of all the primary side parameters, and that the steam generator (SG) relief pressure setting is the most important parameter in predicting the SG secondary pressure
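
    A minimal sketch of the two ingredients above, Latin hypercube sampling and standardized rank regression coefficients, on a hypothetical monotonic response (not the RELAP5 model):

```python
import numpy as np

rng = np.random.default_rng(4)
n, k = 100, 3   # 100 runs, 3 uncertain inputs

# Latin hypercube sample on [0, 1): one stratum per run for each input
strata = rng.permuted(np.tile(np.arange(n), (k, 1)), axis=1).T  # shape (n, k)
u = (strata + rng.random((n, k))) / n
x = u   # transform columns to the real input ranges as needed

# Hypothetical monotonic response: input 0 dominates, input 2 is inert
y = 5 * x[:, 0] + 1 * x[:, 1] + 0 * x[:, 2] + 0.1 * rng.random(n)

# Standardized rank regression coefficients: regress standardized ranks of y
# on standardized ranks of each input
rx = np.argsort(np.argsort(x, axis=0), axis=0).astype(float)
ry = np.argsort(np.argsort(y)).astype(float)
rx = (rx - rx.mean(axis=0)) / rx.std(axis=0)
ry = (ry - ry.mean()) / ry.std()
srrc, *_ = np.linalg.lstsq(rx, ry, rcond=None)
```

    Large `|srrc[i]|` flags input i as important, which is how the break discharge coefficient and the SG relief pressure setting were ranked in the study.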

  17. SASSYS LMFBR systems analysis code

    International Nuclear Information System (INIS)

    Dunn, F.E.; Prohammer, F.G.

    1982-01-01

    The SASSYS code provides detailed steady-state and transient thermal-hydraulic analyses of the reactor core, inlet and outlet coolant plenums, primary and intermediate heat-removal systems, steam generators, and emergency shut-down heat removal systems in liquid-metal-cooled fast-breeder reactors (LMFBRs). The main purpose of the code is to analyze the consequences of failures in the shut-down heat-removal system and to determine whether this system can perform its mission adequately even with some of its components inoperable. The code is not plant-specific. It is intended for use with any LMFBR, using either a loop or a pool design, a once-through steam generator or an evaporator-superheater combination, and either a homogeneous core or a heterogeneous core with internal-blanket assemblies

  18. Effect of Uncertainty Parameters in Blowdown and Reflood Models for OPR1000 LBLOCA Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Huh, Byung Gil; Jin, Chang Yong; Seul, Kwangwon; Hwang, Taesuk [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2014-05-15

    KINS (Korea Institute of Nuclear Safety) has also performed the audit calculation with the KINS Realistic Evaluation Methodology (KINS-REM) to confirm the validity of the licensee's calculation. In the BEPU method, it is very important to quantify the code and model uncertainty. This is reflected in the following requirement for BE calculations in Regulatory Guide 1.157: 'the code and models used are acceptable and applicable to the specific facility over the intended operating range and must quantify the uncertainty in the specific application'. In general, the uncertainty of a model or code should be obtained through data comparison with relevant integral- and separate-effect tests at different scales. However, it is not easy to determine these kinds of uncertainty because of the difficulty of accurately evaluating the various experiments. Therefore, expert judgment has been used in many cases, even with the limitation that the uncertainty ranges of important parameters can be wide and inaccurate. In the KINS-REM, six heat transfer parameters in the blowdown phase have been used to consider the uncertainty of the models. Recently, the MARS-KS code was modified to consider the uncertainty of five heat transfer parameters in the reflood phase. Accordingly, the uncertainty ranges for the parameters of the reflood models must be determined and their effect evaluated. In this study, a large break LOCA (LBLOCA) analysis for the OPR1000 was performed to identify the effect of the uncertainty parameters in the blowdown and reflood models.

  19. Uncertainty as Knowledge: Constraints on Policy Choices Provided by Analysis of Uncertainty

    Science.gov (United States)

    Lewandowsky, S.; Risbey, J.; Smithson, M.; Newell, B. R.

    2012-12-01

    Uncertainty forms an integral part of climate science, and it is often cited in connection with arguments against mitigative action. We argue that an analysis of uncertainty must consider existing knowledge as well as uncertainty, and the two must be evaluated with respect to the outcomes and risks associated with possible policy options. Although risk judgments are inherently subjective, an analysis of the role of uncertainty within the climate system yields two constraints that are robust to a broad range of assumptions. Those constraints are that (a) greater uncertainty about the climate system is necessarily associated with greater expected damages from warming, and (b) greater uncertainty translates into a greater risk of the failure of mitigation efforts. These ordinal constraints are unaffected by subjective or cultural risk-perception factors, they are independent of the discount rate, and they are independent of the magnitude of the estimate for climate sensitivity. The constraints mean that any appeal to uncertainty must imply a stronger, rather than weaker, need to cut greenhouse gas emissions than in the absence of uncertainty.
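    Constraint (a) rests on the convexity of damages in warming: for a convex damage function, widening the uncertainty about the climate response raises the expected damage even when the mean is held fixed. A minimal Monte Carlo illustration (the quadratic damage function and all numbers are assumptions for this sketch, not the authors' model):

```python
import random
import statistics

random.seed(0)

def expected_damage(mean_warming, spread, n=100_000):
    """Mean of a convex damage function D(T) = T**2 over normally
    distributed warming T; purely illustrative."""
    return statistics.fmean(
        random.gauss(mean_warming, spread) ** 2 for _ in range(n))

low_uncertainty  = expected_damage(3.0, 0.5)
high_uncertainty = expected_damage(3.0, 1.5)
print(low_uncertainty, high_uncertainty)  # wider spread -> larger expectation
```

    This is Jensen's inequality at work: for D(T) = T**2 the expectation is mean**2 + spread**2, so greater uncertainty necessarily raises it.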

  20. Uncertainties in Safety Analysis. A literature review

    International Nuclear Information System (INIS)

    Ekberg, C.

    1995-05-01

    The purpose of the presented work has been to give a short summary of the origins of many uncertainties arising in the design and performance assessment of a repository for spent nuclear fuel. Some different methods to treat these uncertainties are also included. The methods and conclusions are in many cases general, in the sense that they are applicable to many other disciplines where simulations are used. As a conclusion it may be noted that uncertainties of different origins have been discussed and debated, but one large group, e.g. computer simulations, for which methods for a more explicit investigation exist, has not been investigated in a satisfactory way. 50 refs

  1. Uncertainties in Safety Analysis. A literature review

    Energy Technology Data Exchange (ETDEWEB)

    Ekberg, C [Chalmers Univ. of Technology, Goeteborg (Sweden). Dept. of Nuclear Chemistry

    1995-05-01

    The purpose of the presented work has been to give a short summary of the origins of many uncertainties arising in the design and performance assessment of a repository for spent nuclear fuel. Some different methods to treat these uncertainties are also included. The methods and conclusions are in many cases general, in the sense that they are applicable to many other disciplines where simulations are used. As a conclusion it may be noted that uncertainties of different origins have been discussed and debated, but one large group, e.g. computer simulations, for which methods for a more explicit investigation exist, has not been investigated in a satisfactory way. 50 refs.

  2. Sensitivity and uncertainty analysis applied to a repository in rock salt

    International Nuclear Information System (INIS)

    Polle, A.N.

    1996-12-01

    This document describes the sensitivity and uncertainty analysis with UNCSAM, as applied to a repository in rock salt for the EVEREST project. UNCSAM is a dedicated software package for sensitivity and uncertainty analysis, which was already used within the preceding PROSA project. UNCSAM provides a flexible interface to EMOS ECN by substituting the sampled values into the various input files to be used by EMOS ECN; the model calculations for this repository were performed with the EMOS ECN code. Preceding the sensitivity and uncertainty analysis, a number of preparations were carried out to provide EMOS ECN with the probabilistic input data. For post-processing the EMOS ECN results, the characteristic output signals were processed. For the sensitivity and uncertainty analysis with UNCSAM, the stochastic input, i.e. the sampled values, and the output of the various EMOS ECN runs were analyzed. (orig.)
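    The sample-and-substitute loop described above can be sketched generically. The stratified scheme below is a plain Latin hypercube sample, a common choice for this kind of analysis; the abstract does not specify UNCSAM's actual sampling algorithm, and the parameter names and bounds are invented:

```python
import random

def latin_hypercube(n_runs, bounds, seed=42):
    """One value per equal-probability stratum for each parameter, with
    the strata shuffled independently per parameter before pairing."""
    rng = random.Random(seed)
    columns = []
    for lo, hi in bounds:
        width = (hi - lo) / n_runs
        column = [lo + (k + rng.random()) * width for k in range(n_runs)]
        rng.shuffle(column)
        columns.append(column)
    return list(zip(*columns))  # one tuple of inputs per model run

# hypothetical repository parameters: permeability (m/s), sorption coeff.
runs = latin_hypercube(5, [(1e-12, 1e-10), (0.1, 0.4)])
for inputs in runs:
    print(inputs)  # each tuple would be substituted into one input file
```

    Each generated tuple corresponds to one model run whose output is then collected for the statistical post-processing.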

  3. Parallel processing of structural integrity analysis codes

    International Nuclear Information System (INIS)

    Swami Prasad, P.; Dutta, B.K.; Kushwaha, H.S.

    1996-01-01

    Structural integrity analysis plays an important role in assessing and demonstrating the safety of nuclear reactor components. This analysis is performed using analytical tools such as the Finite Element Method (FEM) with the help of digital computers. The complexity of the problems involved in nuclear engineering demands high-speed computation facilities to obtain solutions in a reasonable amount of time. Parallel processing systems such as ANUPAM provide an efficient platform for realising high-speed computation. The development and implementation of software on parallel processing systems is an interesting and challenging task. The data and algorithm structure of the codes plays an important role in exploiting the capabilities of a parallel processing system. Structural analysis codes based on FEM can be divided into two categories with respect to their implementation on parallel processing systems. The first category, such as codes used for harmonic analysis and mechanistic fuel performance codes, does not require parallelisation of individual modules. The second category, such as conventional FEM codes, requires parallelisation of individual modules. In this category, parallelisation of the equation solution module poses major difficulties. Different solution schemes such as the domain decomposition method (DDM), parallel active column solvers and the substructuring method are currently used on parallel processing systems. Two codes, FAIR and TABS, belonging to each of these categories, have been implemented on ANUPAM. The implementation details of these codes and the performance of different equation solvers are highlighted. (author). 5 refs., 12 figs., 1 tab

  4. Sensitivity functions for uncertainty analysis: Sensitivity and uncertainty analysis of reactor performance parameters

    International Nuclear Information System (INIS)

    Greenspan, E.

    1982-01-01

    This chapter presents the mathematical basis for sensitivity functions, discusses their physical meaning and information they contain, and clarifies a number of issues concerning their application, including the definition of group sensitivities, the selection of sensitivity functions to be included in the analysis, and limitations of sensitivity theory. Examines the theoretical foundation; criticality reset sensitivities; group sensitivities and uncertainties; selection of sensitivities included in the analysis; and other uses and limitations of sensitivity functions. Gives the theoretical formulation of sensitivity functions pertaining to "as-built" designs for performance parameters of the form of ratios of linear flux functionals (such as reaction-rate ratios), linear adjoint functionals, bilinear functions (such as reactivity worth ratios), and for reactor reactivity. Offers a consistent procedure for reducing energy-dependent or fine-group sensitivities and uncertainties to broad group sensitivities and uncertainties. Provides illustrations of sensitivity functions as well as references to available compilations of such functions and of total sensitivities. Indicates limitations of sensitivity theory originating from the fact that this theory is based on a first-order perturbation theory
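    One consistent reduction of fine-group relative sensitivities to broad groups is straight summation over the fine groups contained in each broad group, so that the total sensitivity is conserved. A sketch, with invented group boundaries and values (not the chapter's worked data):

```python
# Six fine-group relative sensitivities collapsed into three broad groups
# by summation; the fine -> broad mapping is illustrative only.
fine_sens = [0.02, 0.05, -0.01, 0.08, 0.03, -0.04]
broad_of_fine = [0, 0, 0, 1, 1, 2]  # which broad group each fine group joins

broad_sens = [0.0, 0.0, 0.0]
for s, g in zip(fine_sens, broad_of_fine):
    broad_sens[g] += s

print(broad_sens)  # the summed (integral) sensitivity is unchanged
```

    The cancellation visible inside broad group 0 (positive and negative fine-group contributions partly offsetting) is exactly the information loss such a collapse entails.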

  5. Uncertainty analysis and validation of environmental models. The empirically based uncertainty analysis

    International Nuclear Information System (INIS)

    Monte, Luigi; Hakanson, Lars; Bergstroem, Ulla; Brittain, John; Heling, Rudie

    1996-01-01

    The principles of Empirically Based Uncertainty Analysis (EBUA) are described. EBUA is based on the evaluation of 'performance indices' that express the level of agreement between the model and sets of empirical independent data collected in different experimental circumstances. Some of these indices may be used to evaluate the confidence limits of the model output. The method is based on the statistical analysis of the distribution of the index values and on the quantitative relationship of these values with the ratio 'experimental data/model output'. Some performance indices are described in the present paper. Among these, the so-called 'functional distance' d between the logarithm of the model output and the logarithm of the experimental data, defined as d² = (1/n) Σᵢ₌₁ⁿ (ln Mᵢ − ln Oᵢ)², where Mᵢ is the i-th experimental value, Oᵢ the corresponding model evaluation and n the number of pairs 'experimental value, predicted value', is an important tool for the EBUA method. From the statistical distribution of this performance index, it is possible to infer the characteristics of the distribution of the ratio 'experimental data/model output' and, consequently, to evaluate the confidence limits for the model predictions. This method was applied to calculate the uncertainty level of a model developed to predict the migration of radiocaesium in lacustrine systems. Unfortunately, performance indices are affected by the uncertainty of the experimental data used in validation. Indeed, measurement results of environmental levels of contamination are generally associated with large uncertainty due to the measurement and sampling techniques and to the large variability in space and time of the measured quantities. It is demonstrated that this undesired effect, in some circumstances, may be corrected by means of simple formulae
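    The functional distance defined above is straightforward to compute. A sketch with invented data (the subsequent confidence-limit inference from the distribution of d is not reproduced here):

```python
import math

def functional_distance(experimental, predicted):
    """d such that d**2 = (1/n) * sum_i (ln M_i - ln O_i)**2, with M_i the
    experimental values and O_i the corresponding model evaluations."""
    n = len(experimental)
    return math.sqrt(sum((math.log(m) - math.log(o)) ** 2
                         for m, o in zip(experimental, predicted)) / n)

M = [120.0, 95.0, 210.0]   # hypothetical measured contamination levels
O = [100.0, 100.0, 180.0]  # the corresponding model outputs
d = functional_distance(M, O)
print(d)  # ~0.14, i.e. roughly a 14% root-mean-square log-scale mismatch
```

    Working on logarithms makes the index symmetric in over- and under-prediction, which is why it pairs naturally with the ratio 'experimental data/model output'.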

  6. The characterisation and evaluation of uncertainty in probabilistic risk analysis

    International Nuclear Information System (INIS)

    Parry, G.W.; Winter, P.W.

    1980-10-01

    The sources of uncertainty in probabilistic risk analysis are discussed using the event/fault tree methodology as an example. The role of statistics in quantifying these uncertainties is investigated. A class of uncertainties is identified which is at present unquantifiable using either classical or Bayesian statistics. It is argued that Bayesian statistics is the more appropriate vehicle for the probabilistic analysis of rare events, and a short review is given with some discussion on the representation of ignorance. (author)

  7. Cross-section data uncertainty and how such information is used in fusion analysis

    International Nuclear Information System (INIS)

    Kodeli, I.

    1999-01-01

    A sensitivity and uncertainty computer code package has been developed and already extensively used in analyses for pressure vessel dosimetry and benchmark experiments. The procedure is based on the SUSD3D code and the VITAMIN-J/COVA covariance matrix library. The complete covariance matrices of secondary angular distributions (SAD file MF=34), as available in the EFF-2 evaluation in ENDF/B-6 format, can be treated. SAD effects can be important, in particular for fusion applications. An option to perform three-dimensional sensitivity and uncertainty analysis is included in the SUSD3D code. The use of angular moment files instead of the bulky angular flux files produced by discrete ordinates transport codes considerably reduces the size of the files required and represents an acceptable approximation for the problem types analysed here. The underlying perturbation theory is reviewed and some examples of its use are listed. (author)

  8. Uncertainty analysis of energy consumption in dwellings

    Energy Technology Data Exchange (ETDEWEB)

    Pettersen, Trine Dyrstad

    1997-12-31

    This thesis presents a comprehensive study of an energy estimation model that can be used to examine the uncertainty of predicted energy consumption in a dwelling. The variation and uncertainty of input parameters due to the outdoor climate, the building construction and the inhabitants are studied as a basis for further energy evaluations. The variations of energy consumption occurring in nominally similar dwellings are also investigated in order to verify the simulated energy consumption. The main topics are (1) a study of expected variations and uncertainties both in the input parameters used in energy consumption calculations and in the energy consumption of the dwelling, (2) the development and evaluation of a simplified energy calculation model that considers uncertainties in the input parameters, (3) an evaluation of the influence of the uncertain parameters on the total variation, so that the most important parameters can be identified, and (4) the recommendation of a simplified procedure for treating uncertainties or possible deviations from average conditions. 90 refs., 182 figs., 73 tabs.
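    A simplified calculation model with uncertain inputs can be evaluated by Monte Carlo propagation. The steady-state loss model and every parameter value below are assumptions for illustration, not the model developed in the thesis:

```python
import random
import statistics

random.seed(1)

def annual_heating_kwh(ua, degree_hours):
    """Toy steady-state loss model: Q = UA * degree-hours; illustrative
    stand-in, not the thesis's energy calculation model."""
    return ua * degree_hours / 1000.0  # (W/K * Kh) / 1000 -> kWh

samples = []
for _ in range(20_000):
    ua = random.gauss(180.0, 20.0)        # overall loss coefficient, W/K
    dh = random.gauss(90_000.0, 8_000.0)  # annual heating degree-hours, Kh
    samples.append(annual_heating_kwh(ua, dh))

mean = statistics.fmean(samples)
sd = statistics.stdev(samples)
print(f"{mean:.0f} kWh +/- {sd:.0f} kWh")
```

    Repeating the propagation while freezing one input at its mean shows how much of the output spread that input explains, which is the basis for identifying the most important parameters.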

  9. Code manual for CONTAIN 2.0: A computer code for nuclear reactor containment analysis

    Energy Technology Data Exchange (ETDEWEB)

    Murata, K.K.; Williams, D.C.; Griffith, R.O.; Gido, R.G.; Tadios, E.L.; Davis, F.J.; Martinez, G.M.; Washington, K.E. [Sandia National Labs., Albuquerque, NM (United States); Tills, J. [J. Tills and Associates, Inc., Sandia Park, NM (United States)

    1997-12-01

    The CONTAIN 2.0 computer code is an integrated analysis tool used for predicting the physical conditions, chemical compositions, and distributions of radiological materials inside a containment building following the release of material from the primary system in a light-water reactor accident. It can also predict the source term to the environment. CONTAIN 2.0 is intended to replace the earlier CONTAIN 1.12, which was released in 1991. The purpose of this Code Manual is to provide full documentation of the features and models in CONTAIN 2.0. Besides complete descriptions of the models, this Code Manual provides a complete description of the input and output from the code. CONTAIN 2.0 is a highly flexible and modular code that can run problems that are either quite simple or highly complex. An important aspect of CONTAIN is that the interactions among thermal-hydraulic phenomena, aerosol behavior, and fission product behavior are taken into account. The code includes atmospheric models for steam/air thermodynamics, intercell flows, condensation/evaporation on structures and aerosols, aerosol behavior, and gas combustion. It also includes models for reactor cavity phenomena such as core-concrete interactions and coolant pool boiling. Heat conduction in structures, fission product decay and transport, radioactive decay heating, and the thermal-hydraulic and fission product decontamination effects of engineered safety features are also modeled. To the extent possible, the best available models for severe accident phenomena have been incorporated into CONTAIN, but it is intrinsic to the nature of accident analysis that significant uncertainty exists regarding numerous phenomena. In those cases, sensitivity studies can be performed with CONTAIN by means of user-specified input parameters. Thus, the code can be viewed as a tool designed to assist the knowledgeable reactor safety analyst in evaluating the consequences of specific modeling assumptions.

  10. Code manual for CONTAIN 2.0: A computer code for nuclear reactor containment analysis

    International Nuclear Information System (INIS)

    Murata, K.K.; Williams, D.C.; Griffith, R.O.; Gido, R.G.; Tadios, E.L.; Davis, F.J.; Martinez, G.M.; Washington, K.E.; Tills, J.

    1997-12-01

    The CONTAIN 2.0 computer code is an integrated analysis tool used for predicting the physical conditions, chemical compositions, and distributions of radiological materials inside a containment building following the release of material from the primary system in a light-water reactor accident. It can also predict the source term to the environment. CONTAIN 2.0 is intended to replace the earlier CONTAIN 1.12, which was released in 1991. The purpose of this Code Manual is to provide full documentation of the features and models in CONTAIN 2.0. Besides complete descriptions of the models, this Code Manual provides a complete description of the input and output from the code. CONTAIN 2.0 is a highly flexible and modular code that can run problems that are either quite simple or highly complex. An important aspect of CONTAIN is that the interactions among thermal-hydraulic phenomena, aerosol behavior, and fission product behavior are taken into account. The code includes atmospheric models for steam/air thermodynamics, intercell flows, condensation/evaporation on structures and aerosols, aerosol behavior, and gas combustion. It also includes models for reactor cavity phenomena such as core-concrete interactions and coolant pool boiling. Heat conduction in structures, fission product decay and transport, radioactive decay heating, and the thermal-hydraulic and fission product decontamination effects of engineered safety features are also modeled. To the extent possible, the best available models for severe accident phenomena have been incorporated into CONTAIN, but it is intrinsic to the nature of accident analysis that significant uncertainty exists regarding numerous phenomena. In those cases, sensitivity studies can be performed with CONTAIN by means of user-specified input parameters. Thus, the code can be viewed as a tool designed to assist the knowledgeable reactor safety analyst in evaluating the consequences of specific modeling assumptions

  11. Sensitivity and uncertainty studies of the CRAC2 code for selected meteorological models and parameters

    International Nuclear Information System (INIS)

    Ward, R.C.; Kocher, D.C.; Hicks, B.B.; Hosker, R.P. Jr.; Ku, J.Y.; Rao, K.S.

    1985-01-01

    We have studied the sensitivity of results from the CRAC2 computer code, which predicts health impacts from a reactor-accident scenario, to uncertainties in selected meteorological models and parameters. The sources of uncertainty examined include the models for plume rise and wet deposition and the meteorological bin-sampling procedure. An alternative plume-rise model usually had little effect on predicted health impacts. In an alternative wet-deposition model, the scavenging rate depends only on storm type, rather than on rainfall rate and atmospheric stability class as in the CRAC2 model. Use of the alternative wet-deposition model in meteorological bin-sampling runs decreased predicted mean early injuries by as much as a factor of 2-3 and, for large release heights and sensible heat rates, decreased mean early fatalities by nearly an order of magnitude. The bin-sampling procedure in CRAC2 was expanded by dividing each rain bin into four bins that depend on rainfall rate. Use of the modified bin structure in conjunction with the CRAC2 wet-deposition model changed all predicted health impacts by less than a factor of 2. 9 references

  12. Analysis of uncertainty in modeling perceived risks

    International Nuclear Information System (INIS)

    Melnyk, R.; Sandquist, G.M.

    2005-01-01

    Expanding on a mathematical model developed for quantifying and assessing perceived risks, the distribution functions, variances, and uncertainties associated with estimating the model parameters are quantified. The analytical model permits the identification and assignment of any number of quantifiable risk perception factors that can be incorporated within standard risk methodology. Those risk perception factors associated with major technical issues are modeled using lognormal probability density functions to span the potentially large uncertainty variations associated with these risk perceptions. The model quantifies the logic of public risk perception and provides an effective means for measuring and responding to perceived risks. (authors)
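    Risk-perception factors modelled as lognormal, as above, can be sampled directly from the standard library. The median and geometric standard deviation below are invented for illustration, not fitted values from the paper:

```python
import math
import random
import statistics

random.seed(7)

def lognormal_factor(median, gsd, n=50_000):
    """Draws for one risk-perception factor modelled as lognormal, here
    parameterised by its median and geometric standard deviation."""
    mu, sigma = math.log(median), math.log(gsd)
    return [random.lognormvariate(mu, sigma) for _ in range(n)]

draws = lognormal_factor(median=2.0, gsd=3.0)
med, mean = statistics.median(draws), statistics.fmean(draws)
print(med, mean)  # the mean exceeds the median: a long right tail
```

    The long right tail is the point of the lognormal choice: it spans the potentially large variations in perceived risk without permitting negative factor values.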

  13. Uncertainties

    Indian Academy of Sciences (India)

    To reflect this uncertainty in the climate scenarios, the use of AOGCMs that explicitly simulate the carbon cycle and chemistry of all the substances are needed. The Hadley Centre has developed a version of the climate model that allows the effect of climate change on the carbon cycle and its feedback into climate, to be ...

  14. Uncertainty

    International Nuclear Information System (INIS)

    Silva, T.A. da

    1988-01-01

    The comparison between the uncertainty method recommended by International Atomic Energy Agency (IAEA) and the and the International Weight and Measure Commitee (CIPM) are showed, for the calibration of clinical dosimeters in the secondary standard Dosimetry Laboratory (SSDL). (C.G.C.) [pt

  15. Durability reliability analysis for corroding concrete structures under uncertainty

    Science.gov (United States)

    Zhang, Hao

    2018-02-01

    This paper presents a durability reliability analysis of reinforced concrete structures subject to the action of marine chloride. The focus is to provide insight into the role of epistemic uncertainties on durability reliability. The corrosion model involves a number of variables whose probabilistic characteristics cannot be fully determined due to the limited availability of supporting data. All sources of uncertainty, both aleatory and epistemic, should be included in the reliability analysis. Two methods are available to formulate the epistemic uncertainty: the imprecise probability-based method and the purely probabilistic method in which the epistemic uncertainties are modeled as random variables. The paper illustrates how the epistemic uncertainties are modeled and propagated in the two methods, and shows how epistemic uncertainties govern the durability reliability.
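    The purely probabilistic treatment of epistemic uncertainty mentioned above is often implemented as a nested ('double-loop') Monte Carlo: an inner loop propagates aleatory variability through the limit state, while an outer loop samples the epistemically uncertain distribution parameters, yielding a spread of failure probabilities rather than a single value. The corrosion numbers below are invented for illustration, not taken from the paper:

```python
import random
import statistics

random.seed(3)

def failure_prob(mean_rate, n_inner=2_000):
    """Inner (aleatory) loop: probability that corrosion depth after 50
    years exceeds a 15 mm concrete cover; toy limit state in mm/year."""
    years, cover = 50.0, 15.0
    hits = sum(random.gauss(mean_rate, 0.05) * years > cover
               for _ in range(n_inner))
    return hits / n_inner

# Outer (epistemic) loop: the corrosion rate's mean is itself uncertain
# because the supporting data are limited.
pf = [failure_prob(random.gauss(0.28, 0.02)) for _ in range(200)]
print(min(pf), statistics.median(pf), max(pf))
```

    The width of the resulting band of failure probabilities is a direct picture of how much the epistemic uncertainty governs the reliability statement.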

  16. Systematic Evaluation of Uncertainty in Material Flow Analysis

    DEFF Research Database (Denmark)

    Laner, David; Rechberger, Helmut; Astrup, Thomas Fruergaard

    2014-01-01

    Material flow analysis (MFA) is a tool to investigate material flows and stocks in defined systems as a basis for resource management or environmental pollution control. Because of the diverse nature of sources and the varying quality and availability of data, MFA results are inherently uncertain....... Uncertainty analyses have received increasing attention in recent MFA studies, but systematic approaches for selection of appropriate uncertainty tools are missing. This article reviews existing literature related to handling of uncertainty in MFA studies and evaluates current practice of uncertainty analysis......) and exploratory MFA (identification of critical parameters and system behavior). Whereas mathematically simpler concepts focusing on data uncertainty characterization are appropriate for descriptive MFAs, statistical approaches enabling more-rigorous evaluation of uncertainty and model sensitivity are needed...

  17. Uncertainty and sensitivity analysis for the simulation of a station blackout scenario in the Jules Horowitz Reactor

    International Nuclear Information System (INIS)

    Ghione, Alberto; Noel, Brigitte; Vinai, Paolo; Demazière, Christophe

    2017-01-01

    Highlights: • A station blackout scenario in the Jules Horowitz Reactor is analyzed using CATHARE. • Input and model uncertainties relevant to the transient are considered. • A statistical methodology for the propagation of the uncertainties is applied. • No safety criteria are exceeded and sufficiently large safety margins are estimated. • The most influential uncertainties are determined with a sensitivity analysis. - Abstract: An uncertainty and sensitivity analysis for the simulation of a station blackout scenario in the Jules Horowitz Reactor (JHR) is presented. The JHR is a new material testing reactor under construction at CEA on the Cadarache site, France. The thermal-hydraulic system code CATHARE is applied to investigate the response of the reactor system to the scenario. The uncertainty and sensitivity study was based on a statistical methodology for code uncertainty propagation, and the ‘Uncertainty and Sensitivity’ platform URANIE was used. Accordingly, the input uncertainties relevant to the transient were identified, quantified, and propagated to the code output. The results show that the safety criteria are not exceeded and that sufficiently large safety margins exist. In addition, the most influential input uncertainties on the safety parameters were identified by means of a sensitivity analysis.

  18. ENSDF analysis codes. IBM version, August 1982

    International Nuclear Information System (INIS)

    Lorenz, A.

    1982-01-01

    The nuclear structure analysis programme tape consists of physics computer processing codes used in the evaluation of mass-chain structure data. This tape was generated by the National Nuclear Data Centre, Brookhaven National Laboratory in the USA. (author)

  19. ENSDF analysis codes: IBM version. August 1982

    International Nuclear Information System (INIS)

    Lorenz, A.

    1982-09-01

    The nuclear structure analysis programme tape consists of physics computer processing codes used in the evaluation of mass-chain structure data. This tape was generated by the National Nuclear Data Centre, Brookhaven National Laboratory in the USA. (author)

  20. Development of a System Analysis Toolkit for Sensitivity Analysis, Uncertainty Propagation, and Estimation of Parameter Distribution

    International Nuclear Information System (INIS)

    Heo, Jaeseok; Kim, Kyung Doo

    2015-01-01

    Statistical approaches to uncertainty quantification and sensitivity analysis are very important in estimating the safety margins for an engineering design application. This paper presents a system analysis and optimization toolkit developed by Korea Atomic Energy Research Institute (KAERI), which includes multiple packages of the sensitivity analysis and uncertainty quantification algorithms. In order to reduce the computing demand, multiple compute resources including multiprocessor computers and a network of workstations are simultaneously used. A Graphical User Interface (GUI) was also developed within the parallel computing framework for users to readily employ the toolkit for an engineering design and optimization problem. The goal of this work is to develop a GUI framework for engineering design and scientific analysis problems by implementing multiple packages of system analysis methods in the parallel computing toolkit. This was done by building an interface between an engineering simulation code and the system analysis software packages. The methods and strategies in the framework were designed to exploit parallel computing resources such as those found in a desktop multiprocessor workstation or a network of workstations. Available approaches in the framework include statistical and mathematical algorithms for use in science and engineering design problems. Currently the toolkit has 6 modules of the system analysis methodologies: deterministic and probabilistic approaches of data assimilation, uncertainty propagation, Chi-square linearity test, sensitivity analysis, and FFTBM

  1. Development of a System Analysis Toolkit for Sensitivity Analysis, Uncertainty Propagation, and Estimation of Parameter Distribution

    Energy Technology Data Exchange (ETDEWEB)

    Heo, Jaeseok; Kim, Kyung Doo [KAERI, Daejeon (Korea, Republic of)

    2015-05-15

    Statistical approaches to uncertainty quantification and sensitivity analysis are very important in estimating the safety margins for an engineering design application. This paper presents a system analysis and optimization toolkit developed by Korea Atomic Energy Research Institute (KAERI), which includes multiple packages of the sensitivity analysis and uncertainty quantification algorithms. In order to reduce the computing demand, multiple compute resources including multiprocessor computers and a network of workstations are simultaneously used. A Graphical User Interface (GUI) was also developed within the parallel computing framework for users to readily employ the toolkit for an engineering design and optimization problem. The goal of this work is to develop a GUI framework for engineering design and scientific analysis problems by implementing multiple packages of system analysis methods in the parallel computing toolkit. This was done by building an interface between an engineering simulation code and the system analysis software packages. The methods and strategies in the framework were designed to exploit parallel computing resources such as those found in a desktop multiprocessor workstation or a network of workstations. Available approaches in the framework include statistical and mathematical algorithms for use in science and engineering design problems. Currently the toolkit has 6 modules of the system analysis methodologies: deterministic and probabilistic approaches of data assimilation, uncertainty propagation, Chi-square linearity test, sensitivity analysis, and FFTBM.

  2. Uncertainty analysis of dosimetry spectrum unfolding

    International Nuclear Information System (INIS)

    Perey, F.G.

    1977-01-01

    The propagation of uncertainties in the input data is analyzed for the usual dosimetry unfolding solution. A new formulation of the dosimetry unfolding problem is proposed in which the most likely value of the spectrum is obtained. The relationship of this solution to the usual one is discussed
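    Input-uncertainty propagation through an unfolding solution can be sketched, under the common generalised-least-squares formulation, on a two-group toy problem. This illustrates covariance propagation only; it is not the paper's most-likely-value derivation, and the response matrix and covariances are invented:

```python
def transpose(m):
    return [list(row) for row in zip(*m)]

def matmul(x, y):
    return [[sum(x[i][k] * y[k][j] for k in range(len(y)))
             for j in range(len(y[0]))] for i in range(len(x))]

def inv2(m):
    (a, b), (c, d) = m
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

A = [[1.0, 0.2],    # detector response matrix (group cross sections)
     [0.3, 1.5]]
V = [[0.04, 0.00],  # covariance of the measured reaction rates
     [0.00, 0.09]]

At, Vi = transpose(A), inv2(V)
N = inv2(matmul(matmul(At, Vi), A))  # covariance of the unfolded spectrum
m = [[1.4], [3.3]]                   # rates consistent with phi = (1, 2)
phi = matmul(matmul(matmul(N, At), Vi), m)
print([p[0] for p in phi])           # the unfolded two-group spectrum
print(N)                             # the propagated input uncertainty
```

    With exact (noise-free) rates the true spectrum is recovered; the interest lies in N, which maps the measurement covariance V directly into spectrum uncertainties.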

  3. The adjoint sensitivity method, a contribution to the code uncertainty evaluation

    International Nuclear Information System (INIS)

    Ounsy, A.; Brun, B.; De Crecy, F.

    1994-01-01

    This paper deals with the application of the adjoint sensitivity method (ASM) to thermal hydraulic codes. The advantage of the method is that it uses little central processing unit time in comparison with the usual approach, which requires one complete code run per sensitivity determination. In the first part the mathematical aspects of the problem are treated, and the applicability of the method to the functional-type responses of a thermal hydraulic model is demonstrated. The problem has been analysed on a simple example of a non-linear hyperbolic equation (the Burgers equation). It is shown that the formalism used in the literature treating this subject is not appropriate. A new mathematical formalism circumventing the problem is proposed. For the discretized form of the problem, two methods are possible: the continuous ASM and the discrete ASM. The equivalence of both methods is demonstrated; nevertheless, only the discrete ASM constitutes a practical solution for thermal hydraulic codes. The application of the discrete ASM to the thermal hydraulic safety code CATHARE is then presented for two examples. They demonstrate that the discrete ASM constitutes an efficient tool for the analysis of code sensitivity. ((orig.))
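    The economy of the discrete adjoint method can be shown on a minimal linear model K(p)u = f with response J = c·u: a single adjoint solve of Kᵀλ = c yields dJ/dp = −λᵀ(∂K/∂p)u for every parameter p, where a finite-difference check needs one extra forward run per parameter. The 2×2 system below is invented for the sketch; it is not the CATHARE application:

```python
def solve2(M, rhs):
    """Direct 2x2 linear solve by Cramer's rule."""
    (a, b), (c, d) = M
    det = a * d - b * c
    return [(d * rhs[0] - b * rhs[1]) / det,
            (a * rhs[1] - c * rhs[0]) / det]

def K(p):                      # model operator, depends on parameter p
    return [[2.0 + p, -1.0],
            [-1.0,     2.0]]

dK_dp = [[1.0, 0.0],           # derivative of K with respect to p
         [0.0, 0.0]]
f = [1.0, 0.0]                 # source term
c_resp = [0.0, 1.0]            # response weights: J = u[1]
p0 = 0.5

u = solve2(K(p0), f)                                  # one forward solve
KT = [list(row) for row in zip(*K(p0))]
lam = solve2(KT, c_resp)                              # one adjoint solve
dJ_adj = -sum(lam[i] * sum(dK_dp[i][j] * u[j] for j in range(2))
              for i in range(2))

eps = 1e-6                                            # brute-force check
dJ_fd = (solve2(K(p0 + eps), f)[1] - u[1]) / eps
print(dJ_adj, dJ_fd)  # both approximate dJ/dp = -0.125
```

    With many parameters, dK_dp changes while u and lam are reused, which is precisely why the adjoint route consumes so little CPU time compared with one perturbed run per parameter.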

  4. The adjoint sensitivity method, a contribution to the code uncertainty evaluation

    International Nuclear Information System (INIS)

    Ounsy, A.; Crecy, F. de; Brun, B.

    1993-01-01

    The application of the ASM (Adjoint Sensitivity Method) to thermohydraulic codes is examined. The advantage of the method is that it consumes very little CPU time in comparison with the usual approach, which requires one complete code run per sensitivity determination. The mathematical aspects of the problem are first described, and the applicability of the method to the functional-type responses of a thermalhydraulic model is demonstrated. The problem has been analyzed on a simple example of a non-linear hyperbolic equation (the Burgers equation). It is shown that the formalism used in the literature treating this subject is not appropriate. A new mathematical formalism circumventing the problem is proposed. For the discretized form of the problem, two methods are possible: the Continuous ASM and the Discrete ASM. The equivalence of both methods is demonstrated; nevertheless, only the DASM constitutes a practical solution for thermalhydraulic codes. The application of the DASM to the thermalhydraulic safety code CATHARE is then presented for two examples. They demonstrate that the ASM constitutes an efficient tool for the analysis of code sensitivity. (authors) 7 figs., 5 tabs., 8 refs

  5. The adjoint sensitivity method. A contribution to the code uncertainty evaluation

    International Nuclear Information System (INIS)

    Ounsy, A.; Brun, B.

    1993-01-01

    The application of the ASM (Adjoint Sensitivity Method) to thermohydraulic codes is examined. The advantage of the method is that it consumes very little CPU time in comparison with the usual approach, which requires one complete code run per sensitivity determination. The mathematical aspects of the problem are first described, and the applicability of the method to the functional-type response of a thermalhydraulic model is demonstrated. The problem has been analyzed on a simple example of a non-linear hyperbolic equation (the Burgers equation). It is shown that the formalism used in the literature treating this subject is not appropriate, and a new mathematical formalism circumventing the problem is proposed. For the discretized form of the problem, two methods are possible: the Continuous ASM and the Discrete ASM. The equivalence of both methods is demonstrated; nevertheless, only the DASM constitutes a practical solution for thermalhydraulic codes. The application of the DASM to the thermalhydraulic safety code CATHARE is then presented for two examples. They demonstrate that the ASM constitutes an efficient tool for the analysis of code sensitivity. (authors) 7 figs., 5 tabs., 8 refs

  6. The adjoint sensitivity method. A contribution to the code uncertainty evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Ounsy, A; Brun, B

    1994-12-31

    The application of the ASM (Adjoint Sensitivity Method) to thermohydraulic codes is examined. The advantage of the method is that it consumes very little CPU time in comparison with the usual approach, which requires one complete code run per sensitivity determination. The mathematical aspects of the problem are first described, and the applicability of the method to the functional-type response of a thermalhydraulic model is demonstrated. The problem has been analyzed on a simple example of a non-linear hyperbolic equation (the Burgers equation). It is shown that the formalism used in the literature treating this subject is not appropriate, and a new mathematical formalism circumventing the problem is proposed. For the discretized form of the problem, two methods are possible: the Continuous ASM and the Discrete ASM. The equivalence of both methods is demonstrated; nevertheless, only the DASM constitutes a practical solution for thermalhydraulic codes. The application of the DASM to the thermalhydraulic safety code CATHARE is then presented for two examples. They demonstrate that the ASM constitutes an efficient tool for the analysis of code sensitivity. (authors) 7 figs., 5 tabs., 8 refs.

  7. The adjoint sensitivity method, a contribution to the code uncertainty evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Ounsy, A; Crecy, F de; Brun, B

    1994-12-31

    The application of the ASM (Adjoint Sensitivity Method) to thermohydraulic codes is examined. The advantage of the method is that it consumes very little CPU time in comparison with the usual approach, which requires one complete code run per sensitivity determination. The mathematical aspects of the problem are first described, and the applicability of the method to the functional-type response of a thermalhydraulic model is demonstrated. The problem has been analyzed on a simple example of a non-linear hyperbolic equation (the Burgers equation). It is shown that the formalism used in the literature treating this subject is not appropriate, and a new mathematical formalism circumventing the problem is proposed. For the discretized form of the problem, two methods are possible: the Continuous ASM and the Discrete ASM. The equivalence of both methods is demonstrated; nevertheless, only the DASM constitutes a practical solution for thermalhydraulic codes. The application of the DASM to the thermalhydraulic safety code CATHARE is then presented for two examples. They demonstrate that the ASM constitutes an efficient tool for the analysis of code sensitivity. (authors) 7 figs., 5 tabs., 8 refs.

  8. Analytic uncertainty and sensitivity analysis of models with input correlations

    Science.gov (United States)

    Zhu, Yueying; Wang, Qiuping A.; Li, Wei; Cai, Xu

    2018-03-01

    Probabilistic uncertainty analysis is a common means of evaluating mathematical models. In mathematical modeling, the uncertainty in input variables is specified through distribution laws. Its contribution to the uncertainty in the model response is usually analyzed by assuming that the input variables are independent of each other. In practical applications, however, the input parameters are often correlated. In the present paper, an analytic method is developed for the uncertainty and sensitivity analysis of models in the presence of input correlations. With the method, it is straightforward to identify the importance of the independence and correlations of input variables in determining the model response. This allows one to decide whether or not the input correlations should be considered in practice. Numerical examples demonstrate the effectiveness and validity of our analytic method in the analysis of general models. The method is also applied to the uncertainty and sensitivity analysis of a deterministic HIV model.
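    The first-order (delta-method) version of such an analysis can be sketched in a few lines: for a model y = f(x), the response variance is approximately g'Σg, where g is the gradient at the nominal point and Σ the input covariance matrix. The toy model and all numbers below are illustrative assumptions, not values from the paper.

```python
import numpy as np

# First-order (delta-method) uncertainty propagation for y = f(x1, x2)
# with correlated inputs. Illustrative linear model: y = 2*x1 + 3*x2.
g = np.array([2.0, 3.0])            # gradient df/dx at the nominal point
sigma = np.array([0.1, 0.2])        # input standard deviations
rho = 0.5                           # correlation between x1 and x2

# Full covariance matrix of the inputs
cov = np.array([[sigma[0]**2,           rho*sigma[0]*sigma[1]],
                [rho*sigma[0]*sigma[1], sigma[1]**2]])

var_correlated  = g @ cov @ g             # variance with correlations
var_independent = np.sum((g * sigma)**2)  # variance with correlations ignored

print(var_correlated, var_independent)    # 0.52 vs. 0.40
```

The gap between the two variances (0.52 vs. 0.40 here) is exactly the cross term 2*g1*g2*rho*sigma1*sigma2, which is what lets one judge whether an input correlation matters for the model response.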

  9. Verification and uncertainty evaluation of CASMO-3/MASTER nuclear analysis system

    Energy Technology Data Exchange (ETDEWEB)

    Song, Jae Seung; Cho, Byung Oh; Joo, Han Kyu; Zee, Sung Quun; Lee, Chung Chan; Park, Sang Yoon

    2000-06-01

    MASTER is a nuclear design code developed by KAERI. It uses group constants generated by CASMO-3, developed by Studsvik. In this report, verification and uncertainty evaluation were performed for the application of the code system to nuclear reactor core analysis and design. The verification is performed via various benchmark comparisons for static and transient core conditions, and via core follow calculations with startup physics test predictions for a total of 14 cycles of pressurized water reactors. The benchmark calculations include comparisons with reference solutions of IAEA and OECD/NEA problems and with critical experiment measurements. The uncertainty evaluation is focused on safety-related parameters such as power distribution, reactivity coefficients, control rod worth, and core reactivity. It is concluded that CASMO-3/MASTER can be applied to PWR core nuclear analysis and design without any bias factors. It is also verified that the system can be applied to the SMART core, via supplemental comparisons with reference calculations by MCNP, a probabilistic nuclear calculation code.

  10. Verification and uncertainty evaluation of CASMO-3/MASTER nuclear analysis system

    International Nuclear Information System (INIS)

    Song, Jae Seung; Cho, Byung Oh; Joo, Han Kyu; Zee, Sung Quun; Lee, Chung Chan; Park, Sang Yoon

    2000-06-01

    MASTER is a nuclear design code developed by KAERI. It uses group constants generated by CASMO-3, developed by Studsvik. In this report, verification and uncertainty evaluation were performed for the application of the code system to nuclear reactor core analysis and design. The verification is performed via various benchmark comparisons for static and transient core conditions, and via core follow calculations with startup physics test predictions for a total of 14 cycles of pressurized water reactors. The benchmark calculations include comparisons with reference solutions of IAEA and OECD/NEA problems and with critical experiment measurements. The uncertainty evaluation is focused on safety-related parameters such as power distribution, reactivity coefficients, control rod worth, and core reactivity. It is concluded that CASMO-3/MASTER can be applied to PWR core nuclear analysis and design without any bias factors. It is also verified that the system can be applied to the SMART core, via supplemental comparisons with reference calculations by MCNP, a probabilistic nuclear calculation code

  11. Preliminary Uncertainty Analysis for SMART Digital Core Protection and Monitoring System

    International Nuclear Information System (INIS)

    Koo, Bon Seung; In, Wang Kee; Hwang, Dae Hyun

    2012-01-01

    The Korea Atomic Energy Research Institute (KAERI) has developed on-line digital core protection and monitoring systems, called SCOPS and SCOMS, as part of the SMART plant protection and monitoring system. SCOPS simplifies the protection system by directly connecting the four RSPT signals to each core protection channel, eliminating the control element assembly calculator (CEAC) hardware. SCOMS adopts the DPCM3D method for synthesizing the core power distribution instead of the Fourier expansion method used in conventional PWRs. The DPCM3D method produces a synthetic 3-D power distribution by coupling a neutronics code with measured in-core detector signals. An overall uncertainty analysis methodology, which statistically combines the uncertainty components of the SMART core protection and monitoring system, was developed. In this paper, preliminary overall uncertainty factors for SCOPS/SCOMS of the SMART initial core are evaluated by applying the newly developed uncertainty analysis method.

  12. Uncertainty analysis of NDA waste measurements using computer simulations

    International Nuclear Information System (INIS)

    Blackwood, L.G.; Harker, Y.D.; Yoon, W.Y.; Meachum, T.R.

    2000-01-01

    Uncertainty assessments for nondestructive radioassay (NDA) systems for nuclear waste are complicated by factors extraneous to the measurement systems themselves. Most notably, characteristics of the waste matrix (e.g., homogeneity) and of the radioactive source material (e.g., particle size distribution) can have great effects on measured mass values. Under these circumstances, characterizing the waste population is as important as understanding the measurement system in obtaining realistic uncertainty values. When extraneous waste characteristics affect measurement results, the uncertainty results are waste-type specific. The goal becomes to assess the expected bias and precision for the measurement of a randomly selected item from the waste population of interest. Standard propagation-of-errors methods for uncertainty analysis can be very difficult to implement in the presence of significant extraneous effects on the measurement system. An alternative approach that naturally includes the extraneous effects is as follows: (1) draw a random sample of items from the population of interest; (2) measure the items using the NDA system of interest; (3) establish the true quantity being measured using a gold-standard technique; and (4) estimate bias by deriving a statistical regression model comparing the measurements on the system of interest to the gold-standard values; similar regression techniques for modeling the standard deviation of the difference values give the estimated precision. Actual implementation of this method is often impractical. For example, a true gold-standard confirmation measurement may not exist. A more tractable implementation is obtained by developing numerical models for both the waste material and the measurement system. A random sample of simulated waste containers generated by the waste population model serves as input to the measurement system model. This approach has been developed and successfully applied to assessing the quantity of
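    The simulation-based variant of steps (1)-(4) is easy to prototype. The sketch below uses a hypothetical instrument model (a fixed multiplicative bias plus additive noise) in place of a real waste-population and measurement-system model; the bias and noise figures are invented for illustration.

```python
import numpy as np

# Simulation-based bias/precision estimation: draw simulated waste items,
# "measure" them with a biased, noisy instrument model, then regress
# measured on true values. The fitted slope estimates the bias factor;
# the residual scatter estimates the 1-sigma precision.
rng = np.random.default_rng(0)

true_mass = rng.uniform(1.0, 100.0, size=200)                    # simulated items (g)
measured = 1.08 * true_mass + rng.normal(0.0, 2.0, size=200)     # biased + noisy

# Least-squares slope through the origin (multiplicative-bias model):
# measured = b * true
b = np.sum(true_mass * measured) / np.sum(true_mass**2)
residuals = measured - b * true_mass
precision = residuals.std(ddof=1)

print(f"estimated bias factor: {b:.3f}, precision: {precision:.2f} g")
```

With a real system model in place of the one-line instrument, the same regression step yields the waste-type-specific bias and precision the abstract describes.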

  13. Measurement uncertainty analysis techniques applied to PV performance measurements

    Energy Technology Data Exchange (ETDEWEB)

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; and demonstrates that quality assurance and quality control measures have been accomplished. Valid data are defined as data having known and documented paths of: origin, including theory; measurements; traceability to measurement standards; computations; and uncertainty analysis of results.

  14. Measurement uncertainty analysis techniques applied to PV performance measurements

    Energy Technology Data Exchange (ETDEWEB)

    Wells, C

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; and demonstrates that quality assurance and quality control measures have been accomplished. Valid data are defined as data having known and documented paths of: origin, including theory; measurements; traceability to measurement standards; computations; and uncertainty analysis of results.

  15. Phenomenological uncertainty analysis of containment building pressure load caused by severe accident sequences

    International Nuclear Information System (INIS)

    Park, S.Y.; Ahn, K.I.

    2014-01-01

    Highlights: • Phenomenological uncertainty analysis has been applied to level 2 PSA. • The methodology provides an alternative to simple deterministic analyses and sensitivity studies. • A realistic evaluation provides a more complete characterization of risks. • Uncertain parameters of the MAAP code for early containment failure were identified. - Abstract: This paper illustrates an application of a severe accident analysis code, MAAP, to the uncertainty evaluation of early containment failure scenarios employed in the containment event tree (CET) model of a reference plant. An uncertainty analysis of containment pressure behavior during severe accidents has been performed for an optimum assessment of an early containment failure model. The present application is mainly focused on determining an estimate of the containment building pressure load caused by severe accident sequences of a nuclear power plant. Key modeling parameters and phenomenological models employed for the present uncertainty analysis are closely related to in-vessel hydrogen generation, direct containment heating, and gas combustion. The basic approach of this methodology is to (1) develop severe accident scenarios for which containment pressure load evaluations should be performed based on a level 2 PSA, (2) identify severe accident phenomena relevant to an early containment failure, (3) identify the MAAP input parameters, sensitivity coefficients, and modeling options that describe or influence the early containment failure phenomena, (4) prescribe the likelihood descriptions of the potential range of these parameters, and (5) evaluate the code predictions using a number of random combinations of parameter inputs sampled from the likelihood distributions
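    Steps (4)-(5) of this approach can be sketched as a sampling loop: draw each uncertain parameter from its assigned distribution and evaluate the code response for every sample. The surrogate `pressure_model`, the parameter names, and the distributions below are all invented stand-ins for actual MAAP inputs.

```python
import numpy as np

# Sample uncertain parameters from assigned likelihood distributions and
# evaluate a stand-in response for each sample (a real study would run
# the MAAP code here instead of a one-line surrogate).
rng = np.random.default_rng(42)

def pressure_model(h2_mult, dch_frac):
    # Hypothetical surrogate: base containment pressure plus contributions
    # from in-vessel hydrogen generation and direct containment heating.
    return 0.3 + 0.25 * h2_mult + 0.4 * dch_frac   # MPa

n = 1000
h2_mult = rng.lognormal(mean=0.0, sigma=0.3, size=n)   # H2 generation multiplier
dch_frac = rng.uniform(0.0, 0.5, size=n)               # dispersed-melt fraction

loads = pressure_model(h2_mult, dch_frac)
p95 = np.percentile(loads, 95)
print(f"95th-percentile pressure load: {p95:.2f} MPa")
```

Comparing the resulting load distribution against the containment fragility curve then gives the early containment failure probability used in the CET model.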

  16. Sensitivity and uncertainty analysis for fission product decay heat calculations

    International Nuclear Information System (INIS)

    Rebah, J.; Lee, Y.K.; Nimal, J.C.; Nimal, B.; Luneville, L.; Duchemin, B.

    1994-01-01

    The calculated uncertainty in decay heat due to the uncertainty in the basic nuclear data of the CEA86 library is presented. Uncertainties in summation calculations arise from several sources: fission product yields, half-lives, and average decay energies. The correlation between basic data is taken into account. Uncertainty analyses were performed for thermal-neutron-induced fission of U235 and Pu239, for both burst fission and finite irradiation times. The decay heat calculated in this study is compared with experimental results and with a new calculation using the JEF2 library. (from authors) 6 figs., 19 refs
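    The structure of such a summation calculation can be illustrated with a toy example: the decay heat is a sum of per-nuclide terms, and (for independent data) the yield and decay-energy uncertainties of each term propagate to the total in quadrature. All nuclide data below are invented for illustration, not evaluated values, and the real analysis also carries correlation terms that this sketch omits.

```python
import numpy as np

# Toy decay-heat summation after a burst fission, with first-order
# propagation of (assumed independent) yield and decay-energy
# uncertainties: H(t) = sum_i y_i * lambda_i * E_i * exp(-lambda_i * t)
lam = np.array([0.1, 0.01, 0.001])     # decay constants (1/s), illustrative
y = np.array([0.06, 0.04, 0.02])       # fission yields, illustrative
E = np.array([1.5, 0.8, 0.5])          # average decay energies (MeV), illustrative
rel_y, rel_E = 0.05, 0.10              # relative 1-sigma data uncertainties

t = 100.0                              # seconds after the burst
terms = y * lam * E * np.exp(-lam * t) # per-nuclide contributions (MeV/s)
H = terms.sum()

# Each term is linear in y_i and E_i, so its relative variance is
# rel_y**2 + rel_E**2; independent terms combine in quadrature.
var_H = np.sum((terms**2) * (rel_y**2 + rel_E**2))
print(H, np.sqrt(var_H) / H)
```

Note that the relative uncertainty of the sum is smaller than that of any single term would suggest only to the extent that several terms contribute; here one nuclide dominates, so the total stays close to its 11% term uncertainty.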

  17. Urban drainage models simplifying uncertainty analysis for practitioners

    DEFF Research Database (Denmark)

    Vezzaro, Luca; Mikkelsen, Peter Steen; Deletic, Ana

    2013-01-01

    There is increasing awareness about uncertainties in the modelling of urban drainage systems and, as such, many new methods for uncertainty analyses have been developed. Despite this, all available methods have limitations which restrict their widespread application among practitioners. Here... in each measured/observed datapoint; an issue that is commonly overlooked in the uncertainty analysis of urban drainage models. This comparison allows the user to intuitively estimate the optimum number of simulations required to conduct uncertainty analyses. The output of the method includes parameter...

  18. Uncertainty Analysis of Consequence Management (CM) Data Products.

    Energy Technology Data Exchange (ETDEWEB)

    Hunt, Brian D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Eckert-Gallup, Aubrey Celia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Cochran, Lainy Dromgoole [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kraus, Terrence D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Fournier, Sean Donovan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Allen, Mark B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Schetnan, Richard Reed [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Simpson, Matthew D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Okada, Colin E. [Remote Sensing Lab. (RSL), Nellis AFB, Las Vegas, NV (United States); Bingham, Avery A. [Remote Sensing Lab. (RSL), Nellis AFB, Las Vegas, NV (United States)

    2018-01-01

    The goal of this project is to develop and execute methods for characterizing uncertainty in data products that are developed and distributed by the DOE Consequence Management (CM) Program. A global approach to this problem is necessary because multiple sources of error and uncertainty from across the CM skill sets contribute to the ultimate production of CM data products. This report presents the methods used to develop a probabilistic framework to characterize this uncertainty and provides results for an uncertainty analysis for a study scenario analyzed using this framework.

  19. R-matrix analysis code (RAC)

    International Nuclear Information System (INIS)

    Chen Zhenpeng; Qi Huiquan

    1990-01-01

    A comprehensive R-matrix analysis code has been developed. It is based on the multichannel and multilevel R-matrix theory and runs on a VAX computer in FORTRAN-77. With this code, many kinds of experimental data for one nuclear system can be fitted simultaneously. Comparisons between the code RAC and the code EDA of LANL have been made. The data show that both codes produce the same calculation results when one set of R-matrix parameters is used. The differential cross section of 10B(n,α)7Li for En = 0.4 MeV and the polarization of 16O(n,n)16O for En = 2.56 MeV are presented

  20. Decisions under uncertainty using Bayesian analysis

    Directory of Open Access Journals (Sweden)

    Stelian STANCU

    2006-01-01

    Full Text Available The present paper gives a short presentation of the Bayesian decision method, in which extra information lends great support to the decision-making process but also incurs new costs. In this situation, obtaining new information, generally experimentally based, contributes to diminishing the degree of uncertainty that influences the decision-making process. In a large number of decision problems, there is consequently the possibility that decision makers will revisit decisions already taken because of the advantages offered by obtaining extra information.
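    The trade-off between the value of extra information and its cost can be made concrete with the classical expected-value-of-perfect-information (EVPI) calculation: new information is worth buying only if its expected value exceeds its cost. The two-action, two-state payoffs and the prior below are invented for illustration.

```python
# EVPI for a toy decision under uncertainty.
p_good = 0.6                        # prior probability that demand is "good"
payoff = {                          # payoff[action][state]
    "invest": {"good": 100.0, "bad": -60.0},
    "hold":   {"good": 10.0,  "bad": 10.0},
}

def expected(action, p):
    return p * payoff[action]["good"] + (1 - p) * payoff[action]["bad"]

# Best expected payoff acting on the prior alone
best_prior = max(expected(a, p_good) for a in payoff)

# With perfect information we would pick the best action in each state
best_with_info = (p_good * max(v["good"] for v in payoff.values())
                  + (1 - p_good) * max(v["bad"] for v in payoff.values()))

evpi = best_with_info - best_prior   # ceiling on what any information is worth
print(evpi)   # → 28.0
```

Here acting on the prior yields 36 (invest), while perfect information yields 64 on average, so no experiment costing more than 28 is worth running, however informative it is.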

  1. Uncertainty Analysis of Light Water Reactor Fuel Lattices

    Directory of Open Access Journals (Sweden)

    C. Arenas

    2013-01-01

    Full Text Available The study explored the calculation of uncertainty based on available cross-section covariance data and computational tools at the fuel lattice level, which includes pin cell and fuel assembly models. Uncertainty variations due to temperature changes and different fuel compositions are the main focus of this analysis. Selected assemblies and unit pin cells were analyzed according to the OECD LWR UAM benchmark specifications. Criticality and uncertainty analyses were performed using the TSUNAMI-2D sequence in SCALE 6.1. It was found that uncertainties increase with increasing temperature, while kinf decreases. This increase in the uncertainty is due to the increase in sensitivity of the largest contributor to the uncertainty, namely the neutron capture reaction 238U(n,γ), due to Doppler broadening. In addition, three types of fuel material composition (UOX, MOX, and UOX-Gd2O3) were analyzed. A remarkable increase in the uncertainty in kinf was observed for the MOX fuel; it was nearly twice the corresponding value in UOX fuel. The neutron-nuclide reactions of 238U, mainly inelastic scattering (n,n′), contributed the most to the uncertainties in the MOX fuel, which shifts the neutron spectrum to higher energy compared to the UOX fuel.

  2. Integrated uncertainty analysis using RELAP/SCDAPSIM/MOD4.0

    International Nuclear Information System (INIS)

    Perez, M.; Reventos, F.; Wagner, R.; Allison, C.

    2009-01-01

    The RELAP/SCDAPSIM/MOD4.0 code, designed to predict the behavior of reactor systems during normal and accident conditions, is being developed as part of an international nuclear technology Software Development and Training Program (SDTP). RELAP/SCDAPSIM/MOD4.0, which is the first version of RELAP5 completely rewritten to FORTRAN 90/95/2000 standards, uses the publicly available RELAP5 and SCDAP models in combination with (a) advanced programming and numerical techniques, (b) advanced SDTP-member-developed models for LWR, HWR, and research reactor analysis, and (c) a variety of other member-developed computational packages. One such computational package is an integrated uncertainty analysis package being developed jointly by the Technical University of Catalunya (UPC) and Innovative Systems Software (ISS). The integrated uncertainty analysis approach used in the package uses the following steps: 1. Selection of the plant; 2. Selection of the scenario; 3. Selection of the safety criteria; 4. Identification and ranking of the relevant phenomena based on the safety criteria; 5. Selection of the appropriate code parameters to represent those phenomena; 6. Association of uncertainty by means of Probability Distribution Functions (PDFs) for each selected parameter; 7. Random sampling of the selected parameters according to their PDFs and performing multiple computer runs to obtain uncertainty bands with a certain percentile and confidence level; 8. Processing the results of the multiple computer runs to estimate the uncertainty bands for the computed quantities associated with the selected safety criteria. The first four steps are performed by the user prior to the RELAP/SCDAPSIM/MOD4.0 analysis. The remaining steps are included with the MOD4.0 integrated uncertainty analysis (IUA) package.
This paper briefly describes the integrated uncertainty analysis package including (a) the features of the package, (b) the implementation of the package into RELAP/SCDAPSIM/MOD4.0, and
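    The number of computer runs needed in steps 7-8 is commonly taken from Wilks' nonparametric tolerance-limit formula (an assumption here; the abstract does not name the sampling rule). For a one-sided, first-order statement, the largest of n runs bounds the p-quantile of the output with confidence 1 - p**n, which for a 95%-probability/95%-confidence band gives the familiar 59 runs:

```python
# Smallest number of code runs n such that the maximum of n random runs
# exceeds the 95th percentile of the output with at least 95% confidence
# (one-sided, first-order Wilks formula: confidence = 1 - p**n).
p, conf = 0.95, 0.95
n = 1
while 1 - p**n < conf:
    n += 1
print(n)   # → 59
```

Here 0.95**58 is about 0.0511 while 0.95**59 is about 0.0485, so 59 is the smallest sample size satisfying the 95/95 criterion; higher-order statements or two-sided bands require more runs.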

  3. Probabilistic accident consequence uncertainty analysis: Food chain uncertainty assessment. Volume 2: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Brown, J. [National Radiological Protection Board (United Kingdom); Goossens, L.H.J.; Kraan, B.C.P. [Delft Univ. of Technology (Netherlands)] [and others

    1997-06-01

    This volume is the second of a two-volume document that summarizes a joint project by the US Nuclear Regulatory Commission and the Commission of European Communities to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This two-volume report, which examines mechanisms and uncertainties of transfer through the food chain, is the first in a series of five such reports. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain transfer that affect calculations of offsite radiological consequences. Seven of the experts reported on transfer into the food chain through soil and plants, nine reported on transfer via food products from animals, and two reported on both. The expert judgment elicitation procedure and its outcomes are described in these volumes. This volume contains seven appendices. Appendix A presents a brief discussion of the MACCS and COSYMA model codes. Appendix B is the structure document and elicitation questionnaire for the expert panel on soils and plants. Appendix C presents the rationales and responses of each of the members of the soils and plants expert panel. Appendix D is the structure document and elicitation questionnaire for the expert panel on animal transfer. The rationales and responses of each of the experts on animal transfer are given in Appendix E. Brief biographies of the food chain expert panel members are provided in Appendix F. Aggregated results of expert responses are presented in graph format in Appendix G.

  4. Probabilistic accident consequence uncertainty analysis: Food chain uncertainty assessment. Volume 2: Appendices

    International Nuclear Information System (INIS)

    Brown, J.; Goossens, L.H.J.; Kraan, B.C.P.

    1997-06-01

    This volume is the second of a two-volume document that summarizes a joint project by the US Nuclear Regulatory Commission and the Commission of European Communities to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This two-volume report, which examines mechanisms and uncertainties of transfer through the food chain, is the first in a series of five such reports. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain transfer that affect calculations of offsite radiological consequences. Seven of the experts reported on transfer into the food chain through soil and plants, nine reported on transfer via food products from animals, and two reported on both. The expert judgment elicitation procedure and its outcomes are described in these volumes. This volume contains seven appendices. Appendix A presents a brief discussion of the MACCS and COSYMA model codes. Appendix B is the structure document and elicitation questionnaire for the expert panel on soils and plants. Appendix C presents the rationales and responses of each of the members of the soils and plants expert panel. Appendix D is the structure document and elicitation questionnaire for the expert panel on animal transfer. The rationales and responses of each of the experts on animal transfer are given in Appendix E. Brief biographies of the food chain expert panel members are provided in Appendix F. Aggregated results of expert responses are presented in graph format in Appendix G

  5. Bayesian models for comparative analysis integrating phylogenetic uncertainty

    Directory of Open Access Journals (Sweden)

    Villemereuil Pierre de

    2012-06-01

    general purpose tool for phylogenetic comparative analyses, particularly for modelling in the face of phylogenetic uncertainty and accounting for measurement error or individual variation in explanatory variables. Code for all models is provided in the BUGS model description language.

  6. Bayesian models for comparative analysis integrating phylogenetic uncertainty

    Science.gov (United States)

    2012-01-01

    phylogenetic comparative analyses, particularly for modelling in the face of phylogenetic uncertainty and accounting for measurement error or individual variation in explanatory variables. Code for all models is provided in the BUGS model description language. PMID:22741602

  7. Estimating the measurement uncertainty in forensic blood alcohol analysis.

    Science.gov (United States)

    Gullberg, Rod G

    2012-04-01

    For many reasons, forensic toxicologists are being asked to determine and report their measurement uncertainty in blood alcohol analysis. While understood conceptually, the elements and computations involved in determining measurement uncertainty are generally foreign to most forensic toxicologists. Several established and well-documented methods are available to determine and report the uncertainty in blood alcohol measurement. A straightforward bottom-up approach is presented that includes: (1) specifying the measurand, (2) identifying the major components of uncertainty, (3) quantifying the components, (4) statistically combining the components and (5) reporting the results. A hypothetical example is presented that employs reasonable estimates for forensic blood alcohol analysis assuming headspace gas chromatography. These computations are easily employed in spreadsheet programs as well. Determining and reporting measurement uncertainty is an important element in establishing fitness-for-purpose. Indeed, the demand for such computations and information from the forensic toxicologist will continue to increase.
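    The five-step bottom-up combination described above fits in a few lines, which is why the abstract notes it is easily done in a spreadsheet. The component names and values below are made-up figures for a hypothetical headspace GC measurement, not data from the paper.

```python
import math

# Bottom-up measurement uncertainty for a blood alcohol result.
mean_bac = 0.082                      # measured result (g/100 mL), illustrative

# Step 3: quantify the major components as relative standard uncertainties
components = {
    "calibrator": 0.004,
    "method_precision": 0.008,
    "sampling_dilution": 0.005,
}

# Step 4: combine independent components in quadrature (root-sum-square)
u_rel = math.sqrt(sum(u**2 for u in components.values()))

# Step 5: report an expanded uncertainty with coverage factor k = 2
# (approximately 95% coverage)
U = 2 * u_rel * mean_bac
print(f"{mean_bac:.3f} +/- {U:.4f} g/100 mL (k=2)")
```

The expanded interval is what supports a fitness-for-purpose statement, e.g. whether the result remains above a statutory limit after subtracting U.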

  8. Uncertainty about probability: a decision analysis perspective

    International Nuclear Information System (INIS)

    Howard, R.A.

    1988-01-01

    The issue of how to think about uncertainty about probability is framed and analyzed from the viewpoint of a decision analyst, using the failure of nuclear power plants as an example. The key idea is to think of probability as describing a state of information on an uncertain event, and to pose the issue of uncertainty in this quantity as uncertainty about a number that would be definitive: it has the property that you would assign it as the probability if you knew it. Logical consistency requires that the probability to assign to a single occurrence in the absence of further information be the mean of the distribution of this definitive number, not the median as is sometimes suggested. Any decision that must be made without the benefit of further information must also be made using the mean of the definitive number's distribution. With this formulation, we find further that the probability of r occurrences in n exchangeable trials will depend on the first n moments of the definitive number's distribution. In making decisions, the expected value of clairvoyance on the occurrence of the event must be at least as great as that on the definitive number. If one of the events in question occurs, then the increase in probability of another such event is readily computed. In terms of coin tossing, this means that unless one is absolutely sure of the fairness of a coin, seeing a head must increase the probability of heads, in contrast to the usual view. A numerical example for nuclear power shows that the failure of one plant of a group with a low probability of failure can significantly increase the probability that must be assigned to failure of a second plant in the group
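    The coin-tossing point can be checked with a Beta distribution over the definitive number (the long-run frequency of heads). The Beta(2, 2) prior below is an illustrative assumption; any prior short of a point mass at 0.5 behaves the same way.

```python
# A Beta(a, b) prior over phi, the "definitive number" for heads.
# The probability of heads on the next toss is the mean of this
# distribution, and observing a head updates Beta(a, b) -> Beta(a+1, b).
a, b = 2.0, 2.0                 # symmetric prior: mean 0.5, but not certain

p_heads_prior = a / (a + b)             # mean of the prior
p_heads_after = (a + 1) / (a + 1 + b)   # mean after observing one head

print(p_heads_prior, p_heads_after)     # 0.5 → 0.6
```

Seeing a head moves the assigned probability from 0.5 to 0.6; only a point-mass prior at 0.5 (absolute certainty of fairness) leaves it unchanged, which is exactly the abstract's claim.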

  9. Uncertainty analysis in the applications of nuclear probabilistic risk assessment

    International Nuclear Information System (INIS)

    Le Duy, T.D.

    2011-01-01

    The aim of this thesis is to propose an approach to modeling the parameter and model uncertainties affecting the results of risk indicators used in applications of nuclear Probabilistic Risk Assessment (PRA). After studying the limitations of the traditional probabilistic approach to representing uncertainty in PRA models, a new approach based on the Dempster-Shafer theory has been proposed. The uncertainty analysis process of the proposed approach consists of five main steps. The first step models input parameter uncertainties by belief and plausibility functions according to the data available in the PRA model. The second step involves the propagation of parameter uncertainties through the risk model to determine the uncertainties associated with the output risk indicators. Model uncertainty is then taken into account in the third step by considering possible alternative risk models. The fourth step is intended firstly to provide decision makers with the information needed for decision making under uncertainty (parametric and model) and secondly to identify the input parameters that contribute significant uncertainty to the result. The final step allows the process to be continued in a loop by updating the belief functions given new data. The proposed methodology was implemented on a real but simplified application of a PRA model. (author)

  10. Including uncertainty in hazard analysis through fuzzy measures

    International Nuclear Information System (INIS)

    Bott, T.F.; Eisenhawer, S.W.

    1997-12-01

    This paper presents a method for capturing the uncertainty expressed by a Hazard Analysis (HA) expert team when estimating the frequencies and consequences of accident sequences, and provides a sound mathematical framework for propagating this uncertainty to the risk estimates for these accident sequences. The uncertainty is readily expressed as distributions that can visually aid the analyst in determining the extent and source of risk uncertainty in HA accident sequences. The results can also be expressed as single statistics of the distribution, in a manner analogous to expressing a probabilistic distribution as a point-value statistic such as a mean or median. The study discussed here used data collected during the elicitation portion of an HA on a high-level waste transfer process to demonstrate the techniques for capturing uncertainty. These data came from observations of the uncertainty that HA team members expressed in assigning frequencies and consequences to accident sequences during an actual HA. This uncertainty was captured and manipulated using ideas from possibility theory. The result of this study is a practical method for displaying and assessing the uncertainty in the HA team's estimates of the frequencies and consequences of accident sequences. This uncertainty provides potentially valuable information about accident sequences that is typically lost in the HA process.

  11. Uncertainty analysis for the BEACON-COLSS core monitoring system application

    International Nuclear Information System (INIS)

    Morita, T.; Boyd, W.A.; Seong, K.B.

    2005-01-01

    This paper covers the measurement uncertainty analysis of the BEACON-COLSS core monitoring system. The uncertainty evaluation is made using a BEACON-COLSS simulation program. By simulating BEACON on-line operation for analytically generated reactor conditions, the accuracy of the 'measured' results can be evaluated by comparison with the analytically generated 'truth'. The DNB power margin is evaluated based on Combustion Engineering's Modified Statistical Combination of Uncertainties (MSCU) using the CETOPD code for the DNBR calculation. A BEACON-COLSS simulation program for the uncertainty evaluation function has been established for plant applications. Qualification work has been completed for two Combustion Engineering plants. Results for the BEACON-COLSS measured peaking factors and DNBR power margin are plant-type dependent and are applicable to reload cores as long as the core geometry and detector layout are unchanged. (authors)

  12. How does uncertainty shape patient experience in advanced illness? A secondary analysis of qualitative data.

    Science.gov (United States)

    Etkind, Simon Noah; Bristowe, Katherine; Bailey, Katharine; Selman, Lucy Ellen; Murtagh, Fliss Em

    2017-02-01

    Uncertainty is common in advanced illness but is infrequently studied in this context. If poorly addressed, uncertainty can lead to adverse patient outcomes. We aimed to understand patient experiences of uncertainty in advanced illness and develop a typology of patients' responses and preferences to inform practice. Secondary analysis of qualitative interview transcripts. Studies were assessed for inclusion and interviews were sampled using maximum-variation sampling. Analysis used a thematic approach, with 10% of coding cross-checked to enhance reliability. Qualitative interviews were drawn from six studies including patients with heart failure, chronic obstructive pulmonary disease, renal disease, cancer and liver failure. A total of 30 transcripts were analysed. Median age was 75 years (range 43-95); 12 patients were women. The impact of uncertainty was frequently discussed: the main related themes were engagement with illness, information needs, patient priorities and the period of time on which patients mainly focused their attention (temporal focus). A typology of patient responses to uncertainty was developed from these themes. Uncertainty influences patient experience in advanced illness by affecting patients' information needs, preferences and future priorities for care. Our typology aids understanding of how patients with advanced illness respond to uncertainty. Assessment of these three factors may be a useful starting point to guide clinical assessment and shared decision making.

  13. Uncertainty analysis technique for OMEGA Dante measurements

    Science.gov (United States)

    May, M. J.; Widmann, K.; Sorce, C.; Park, H.-S.; Schneider, M.

    2010-10-01

    The Dante is an 18 channel x-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g., hohlraums, etc.) at x-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the x-ray diodes, filters and mirrors, and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.
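
The Monte Carlo parameter variation technique can be sketched for a single channel. All numbers below are hypothetical (voltage, response sensitivity, error magnitudes), and the "unfold" is reduced to a trivial linear response; the point is the shape of the procedure: combine the calibration and unfold errors into one Gaussian, draw many test voltages, process each, and take statistics of the resulting fluxes.

```python
import random
import statistics

random.seed(1)

voltage = 2.0          # V, recorded channel signal (hypothetical)
sigma_cal = 0.05       # relative 1-sigma absolute-calibration error (hypothetical)
sigma_unfold = 0.03    # relative 1-sigma unfold-algorithm error (hypothetical)
sensitivity = 1.5e12   # flux per volt, toy linear response (hypothetical)

# Combine the two error sources into a single one-sigma Gaussian error
sigma_total = (sigma_cal ** 2 + sigma_unfold ** 2) ** 0.5

# One thousand test voltages drawn from the error function, each "unfolded"
fluxes = [random.gauss(voltage, voltage * sigma_total) * sensitivity
          for _ in range(1000)]

flux_mean = statistics.mean(fluxes)
flux_err = statistics.stdev(fluxes)    # 1-sigma error bar on the flux
print(f"flux = {flux_mean:.3e} +/- {flux_err:.3e}")
```

In the real diagnostic the unfold is a nonlinear spectral reconstruction across all 18 channels, so the spread of the resulting fluxes, not analytic propagation, is what supplies the error bars.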

  14. Uncertainty analysis technique for OMEGA Dante measurements

    International Nuclear Information System (INIS)

    May, M. J.; Widmann, K.; Sorce, C.; Park, H.-S.; Schneider, M.

    2010-01-01

    The Dante is an 18 channel x-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g., hohlraums, etc.) at x-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the x-ray diodes, filters and mirrors, and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.

  15. Uncertainty Analysis Technique for OMEGA Dante Measurements

    International Nuclear Information System (INIS)

    May, M.J.; Widmann, K.; Sorce, C.; Park, H.; Schneider, M.

    2010-01-01

    The Dante is an 18 channel X-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g. hohlraums, etc.) at X-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the X-ray diodes, filters and mirrors, and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte-Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.

  16. Assessment and uncertainty analysis of groundwater risk.

    Science.gov (United States)

    Li, Fawen; Zhu, Jingzhao; Deng, Xiyuan; Zhao, Yong; Li, Shaofei

    2018-01-01

    Groundwater, with its relatively stable quantity and quality, is commonly used by human beings. However, with the over-mining of groundwater, problems such as groundwater funnels, land subsidence and salt water intrusion have emerged. In order to avoid further deterioration of hydrogeological problems in over-mined regions, it is necessary to conduct an assessment of groundwater risk. In this paper, the risks of shallow and deep groundwater in the water intake area of the South-to-North Water Transfer Project in Tianjin, China, were evaluated. Firstly, two sets of four-level evaluation index systems were constructed based on the different characteristics of shallow and deep groundwater. Secondly, based on the normalized factor values and the synthetic weights, the risk values of shallow and deep groundwater were calculated. Lastly, the uncertainty of the groundwater risk assessment was analyzed by the indicator kriging method. The results meet the decision maker's demand for risk information, and overcome previous risk assessment results expressed in the form of deterministic point estimates, which ignore the uncertainty of risk assessment. Copyright © 2017 Elsevier Inc. All rights reserved.

  17. Development and assessment of best estimate integrated safety analysis code

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Bub Dong; Lee, Young Jin; Hwang, Moon Kyu (and others)

    2007-03-15

    Improvement of the integrated safety analysis code MARS3.0 has been carried out and a multi-D safety analysis application system has been established. An iterative matrix solver and a parallel processing algorithm have been introduced, and a LINUX version has been generated to enable MARS to run on PC clusters. MARS variables and sub-routines have been reformed and modularised to simplify code maintenance. Model uncertainty analyses have been performed for the THTF, FLECHT, NEPTUN, and LOFT experiments as well as the APR1400 plant. Participation in international cooperative research projects such as OECD BEMUSE, SETH, PKL, BFBT, and TMI-2 has been actively pursued as part of code assessment efforts. The assessment, evaluation and experimental data obtained through international cooperation projects have been registered and maintained in the T/H Databank. Multi-D analyses of APR1400 LBLOCA, DVI Break, SLB, and SGTR have been carried out as part of the application efforts in multi-D safety analysis. A GUI-based 3D input generator has been developed for user convenience. Operation of the MARS Users Group (MUG) was continued and, through MUG, the technology has been transferred to 24 organisations. A set of 4 volumes of user manuals has been compiled, and correction reports for the code errors reported during MARS development have been published.

  18. Development and assessment of best estimate integrated safety analysis code

    International Nuclear Information System (INIS)

    Chung, Bub Dong; Lee, Young Jin; Hwang, Moon Kyu

    2007-03-01

    Improvement of the integrated safety analysis code MARS3.0 has been carried out and a multi-D safety analysis application system has been established. An iterative matrix solver and a parallel processing algorithm have been introduced, and a LINUX version has been generated to enable MARS to run on PC clusters. MARS variables and sub-routines have been reformed and modularised to simplify code maintenance. Model uncertainty analyses have been performed for the THTF, FLECHT, NEPTUN, and LOFT experiments as well as the APR1400 plant. Participation in international cooperative research projects such as OECD BEMUSE, SETH, PKL, BFBT, and TMI-2 has been actively pursued as part of code assessment efforts. The assessment, evaluation and experimental data obtained through international cooperation projects have been registered and maintained in the T/H Databank. Multi-D analyses of APR1400 LBLOCA, DVI Break, SLB, and SGTR have been carried out as part of the application efforts in multi-D safety analysis. A GUI-based 3D input generator has been developed for user convenience. Operation of the MARS Users Group (MUG) was continued and, through MUG, the technology has been transferred to 24 organisations. A set of 4 volumes of user manuals has been compiled, and correction reports for the code errors reported during MARS development have been published.

  19. Uncertainty and sensitivity analysis in nuclear accident consequence assessment

    International Nuclear Information System (INIS)

    Karlberg, Olof.

    1989-01-01

    This report contains the results of a four-year project carried out under research contracts with the Nordic Cooperation in Nuclear Safety and the National Institute for Radiation Protection. An uncertainty/sensitivity analysis methodology consisting of Latin Hypercube sampling and regression analysis was applied to an accident consequence model. A number of input parameters were selected, and the uncertainties related to these parameters were estimated by a Nordic group of experts. Individual doses, collective dose, health effects and their related uncertainties were then calculated for three release scenarios and for a representative sample of meteorological situations. Two of the scenarios simulated the acute phase after an accident, and one the long-term consequences. The most significant parameters were identified. The outer limits of the calculated uncertainty distributions are large and grow to several orders of magnitude for the low-probability consequences. The uncertainty in the expectation values is typically a factor of 2-5 (1 sigma). The variation in the model responses due to the variation of the weather parameters is roughly equal to the variation induced by parameter uncertainty. The most important parameters turned out to be different for each pathway of exposure, as could be expected. Overall, however, the most important parameters are the wet deposition coefficient and the shielding factors. A general discussion of the usefulness of uncertainty analysis in consequence analysis is also given. (au)
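
The Latin Hypercube sampling plus regression/correlation workflow can be sketched as follows. The consequence model and parameter ranges below are toy stand-ins, not the report's model; the wet deposition coefficient and shielding factor are used as labels because the abstract names them as dominant inputs.

```python
import random

random.seed(0)

def lhs(n, ranges):
    """Latin Hypercube sample: one draw per equal-probability stratum,
    independently shuffled for each parameter."""
    cols = []
    for lo, hi in ranges:
        strata = [lo + (hi - lo) * (i + random.random()) / n for i in range(n)]
        random.shuffle(strata)
        cols.append(strata)
    return list(zip(*cols))

# Hypothetical uncertain inputs: wet deposition coefficient, shielding factor
samples = lhs(100, [(1e-5, 1e-4), (0.1, 0.6)])

def dose(wet_dep, shielding):
    """Toy consequence model (hypothetical), monotone in both inputs."""
    return 2.0e9 * wet_dep * shielding

doses = [dose(w, s) for w, s in samples]

def pearson(xs, ys):
    """Sample correlation coefficient, used to rank input importance."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy)

for name, col in zip(("wet_dep", "shielding"), zip(*samples)):
    print(name, round(pearson(list(col), doses), 2))
```

The regression step in the report plays the same role as the correlations here: ranking how much of the response variation each sampled input explains.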

  20. HTGR reactor physics, thermal-hydraulics and depletion uncertainty analysis: a proposed IAEA coordinated research project

    International Nuclear Information System (INIS)

    Tyobeka, Bismark; Reitsma, Frederik; Ivanov, Kostadin

    2011-01-01

    The continued development of High Temperature Gas Cooled Reactors (HTGRs) requires verification of HTGR design and safety features with reliable high-fidelity physics models and robust, efficient, and accurate codes. The predictive capability of coupled neutronics/thermal-hydraulics and depletion simulations for reactor design and safety analysis can be assessed with sensitivity analysis and uncertainty analysis methods. In order to benefit from recent advances in modeling and simulation and the availability of new covariance data (nuclear data uncertainties), extensive sensitivity and uncertainty studies are needed to quantify the impact of different sources of uncertainty on the design and safety parameters of HTGRs. Uncertainty and sensitivity studies are an essential component of any significant effort in data and simulation improvement. In February 2009, the Technical Working Group on Gas-Cooled Reactors recommended that the proposed IAEA Coordinated Research Project (CRP) on HTGR Uncertainty Analysis in Modeling be implemented. This paper presents the current status and plan. The CRP will also benefit from interactions with the ongoing OECD/NEA Light Water Reactor (LWR) UAM benchmark activity by taking into consideration the peculiarities of HTGR designs and simulation requirements. (author)

  1. International Training Program: 3D S.UN.COP - Scaling, Uncertainty and 3D Thermal-Hydraulics/Neutron-Kinetics Coupled Codes Seminar

    International Nuclear Information System (INIS)

    Pertuzzi, A.; D'Auria, F.; Bajs, T.; Reventos, F.

    2006-01-01

    Thermal-hydraulic system computer codes are extensively used worldwide for the analysis of nuclear facilities by utilities, regulatory bodies, nuclear power plant designers and vendors, nuclear fuel companies, research organizations, consulting companies, and technical support organizations. The computer code user represents a source of uncertainty that can influence the results of system code calculations. This influence is commonly known as the 'user effect' and stems from the limitations embedded in the codes as well as from the limited capability of the analysts to use them. Code user training and qualification is an effective means of reducing the variation in results caused by the application of the codes by different users. This paper describes a systematic approach to training code users who, upon completion of the training, should be able to perform calculations making the best possible use of the capabilities of best estimate codes. In other words, the program aims at contributing towards solving the problem of the user effect. The 3D S.UN.COP (Scaling, Uncertainty and 3D COuPled code calculations) seminars have been organized as a follow-up to the proposal to the IAEA for a Permanent Training Course for System Code Users (D'Auria, 1998). Four seminars have been held at the University of Pisa (2003, 2004), at The Pennsylvania State University (2004) and at the University of Zagreb (2005). It was recognized that such courses represent both a source of continuing education for current code users and a means for them to enter the formal training structure of a proposed 'permanent' stepwise approach to user training. The 3D S.UN.COP 2005 was successfully held with the participation of 19 persons coming from 9 countries and 14 different institutions (universities, vendors, national laboratories and regulatory bodies). More than 15 scientists were involved in the organization of the seminar, presenting theoretical aspects of the proposed methodologies and

  2. Probabilistic accident consequence uncertainty analysis -- Late health effects uncertainty assessment. Volume 2: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Little, M.P.; Muirhead, C.R. [National Radiological Protection Board (United Kingdom); Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA late health effects models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the expert panel on late health effects, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  3. Demonstration of uncertainty quantification and sensitivity analysis for PWR fuel performance with BISON

    International Nuclear Information System (INIS)

    Zhang, Hongbin; Zhao, Haihua; Zou, Ling; Burns, Douglas; Ladd, Jacob

    2017-01-01

    BISON is an advanced fuels performance code being developed at Idaho National Laboratory and is the code of choice for fuels performance by the U.S. Department of Energy (DOE)’s Consortium for Advanced Simulation of Light Water Reactors (CASL) Program. An approach to uncertainty quantification and sensitivity analysis with BISON was developed and a new toolkit was created. A PWR fuel rod model was developed and simulated by BISON, and uncertainty quantification and sensitivity analysis were performed with eighteen uncertain input parameters. The maximum fuel temperature and gap conductance were selected as the figures of merit (FOM). Pearson, Spearman, and partial correlation coefficients were considered for all of the figures of merit in sensitivity analysis. (author)
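
The correlation-based sensitivity measures named above (Pearson and Spearman) can be illustrated on a toy response. The input ranges and the fuel-temperature function below are hypothetical, not BISON's; they simply produce a response that rises with linear power and falls with gap conductance, so the two coefficients carry opposite signs.

```python
import random

random.seed(42)

def rank(xs):
    """1-based ranks (no ties expected for continuous samples)."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    for pos, i in enumerate(order):
        r[i] = pos + 1.0
    return r

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs)
           * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den

def spearman(xs, ys):
    """Spearman = Pearson on the ranks; robust to monotone nonlinearity."""
    return pearson(rank(xs), rank(ys))

# Hypothetical sampled inputs and a toy fuel-temperature response
power = [random.uniform(15, 25) for _ in range(200)]     # linear power, kW/m
gap_h = [random.uniform(3e3, 1e4) for _ in range(200)]   # gap cond., W/m^2-K
t_fuel = [600 + 40 * q - 0.03 * h for q, h in zip(power, gap_h)]

print("power:", round(pearson(power, t_fuel), 2), round(spearman(power, t_fuel), 2))
print("gap_h:", round(pearson(gap_h, t_fuel), 2), round(spearman(gap_h, t_fuel), 2))
```

Partial correlation, also mentioned in the abstract, extends this by correlating the residuals of input and response after regressing out the other inputs.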

  4. Demonstration of Uncertainty Quantification and Sensitivity Analysis for PWR Fuel Performance with BISON

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Hongbin; Ladd, Jacob; Zhao, Haihua; Zou, Ling; Burns, Douglas

    2015-11-01

    BISON is an advanced fuels performance code being developed at Idaho National Laboratory and is the code of choice for fuels performance by the U.S. Department of Energy (DOE)’s Consortium for Advanced Simulation of Light Water Reactors (CASL) Program. An approach to uncertainty quantification and sensitivity analysis with BISON was developed and a new toolkit was created. A PWR fuel rod model was developed and simulated by BISON, and uncertainty quantification and sensitivity analysis were performed with eighteen uncertain input parameters. The maximum fuel temperature and gap conductance were selected as the figures of merit (FOM). Pearson, Spearman, and partial correlation coefficients were considered for all of the figures of merit in sensitivity analysis.

  5. Uncertainty Analysis of RELAP5-3D

    Energy Technology Data Exchange (ETDEWEB)

    Alexandra E Gertman; Dr. George L Mesina

    2012-07-01

    As world-wide energy consumption continues to increase, so does the demand for alternative energy sources such as nuclear energy. Nuclear power plants currently supply over 370 gigawatts of electricity, and more than 60 new nuclear reactors have been commissioned by 15 different countries. The primary concern for nuclear power plant operation and licensing has been safety. The safe operation of nuclear power plants is no simple matter: it involves the training of operators, the design of the reactor, and equipment and design upgrades throughout the lifetime of the reactor. To safely design, operate, and understand nuclear power plants, industry and government alike have relied upon best-estimate simulation codes, which allow an accurate model of any given plant to be created with well-defined margins of safety. The most widely used of these best-estimate simulation codes in the nuclear power industry is RELAP5-3D. Our project focused on improving the modeling capabilities of RELAP5-3D by developing uncertainty estimates for its calculations. This work involved analyzing high-, medium-, and low-ranked phenomena from an INL PIRT on a small-break Loss-Of-Coolant Accident as well as an analysis of a large-break Loss-Of-Coolant Accident. Statistical analyses were performed using correlation coefficients. To perform the studies, computer programs were written that modify a template RELAP5 input deck to produce one deck for each combination of key input parameters. Python scripting enabled the running of the generated input files with RELAP5-3D on INL's massively parallel cluster system. Data from the studies were collected and analyzed with SAS. A summary of the results of our studies is presented.
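
The deck-generation step (one input deck per combination of key parameters) can be sketched as follows. The template card numbers, placeholder names, and parameter values below are hypothetical illustrations, not actual RELAP5 input:

```python
import itertools

# Hypothetical template deck with placeholders for the key input parameters
template = """\
* RELAP5 input deck (sketch; card numbers are hypothetical)
20100154  break_area_mult  {break_area}
20200505  discharge_coeff  {cd}
"""

# Hypothetical variation levels for each key parameter
break_areas = [0.8, 1.0, 1.2]   # multipliers on nominal break area
cds = [0.6, 0.8, 1.0]           # discharge coefficients

# One deck per combination, ready to be dispatched to a cluster
decks = {}
for i, (a, c) in enumerate(itertools.product(break_areas, cds)):
    decks[f"case_{i:03d}.i"] = template.format(break_area=a, cd=c)

print(len(decks), "input decks generated")
```

In the actual study each generated file would be submitted as a separate RELAP5-3D job, with the outputs gathered afterwards for the correlation analysis.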

  6. Centrifugal Compressor Aeroelastic Analysis Code

    Science.gov (United States)

    Keith, Theo G., Jr.; Srivastava, Rakesh

    2002-01-01

    Centrifugal compressors are very widely used in the turbomachine industry where low mass flow rates are required. Gas turbine engines for tanks, rotorcraft and small jets rely extensively on centrifugal compressors for rugged and compact design. These compressors experience problems related to the unsteadiness of flowfields, such as stall flutter, separation at the trailing edge over diffuser guide vanes, tip vortex unsteadiness, etc., leading to rotating stall and surge. Considerable interest exists among small gas turbine engine manufacturers to understand and eventually eliminate the problems related to centrifugal compressors. The geometric complexity of centrifugal compressor blades and the twisting of the blade passages make linear methods inapplicable. Advanced computational fluid dynamics (CFD) methods are needed for accurate unsteady aerodynamic and aeroelastic analysis of centrifugal compressors. Most current industrial turbomachines and small aircraft engines are designed with a centrifugal compressor. With such a large customer base, and NASA Glenn Research Center being the lead center for turbomachines, it is important that adequate emphasis be placed on this area as well. Currently, this activity is not supported under any project at NASA Glenn.

  7. NORTICA - a new code for cyclotron analysis

    International Nuclear Information System (INIS)

    Gorelov, D.; Johnson, D.; Marti, F.

    2001-01-01

    The new package NORTICA (Numerical ORbit Tracking In Cyclotrons with Analysis) of computer codes for beam dynamics simulations is under development at the NSCL. The package was started as a replacement for the code MONSTER, developed in the laboratory in the past. The new codes are capable of beam dynamics simulations in both CCF (Coupled Cyclotron Facility) accelerators, the K500 and K1200 superconducting cyclotrons. The general purpose of this package is to assist in setting up and tuning the cyclotrons, taking into account the main field and extraction channel imperfections. The computer platform for the package is an Alpha Station with the UNIX operating system and an X-Windows graphical interface. A multiple programming language approach was used in order to combine the reliability of the numerical algorithms developed over a long period of time in the laboratory with the friendliness of a modern-style user interface. This paper describes the capabilities and features of the codes in their present state.

  8. Uncertainty analysis of geothermal energy economics

    Science.gov (United States)

    Sener, Adil Caner

    This dissertation research endeavors to explore geothermal energy economics by assessing and quantifying the uncertainties associated with the nature of geothermal energy and energy investments overall. The study introduces a stochastic geothermal cost model and a valuation approach for different geothermal power plant development scenarios. The Monte Carlo simulation technique is employed to obtain probability distributions of geothermal energy development costs and project net present values. In the study, a stochastic cost model with an incorporated dependence structure is defined and compared with a model where random variables are treated as independent inputs. One of the goals of the study is to shed light on the long-standing modeling problem of dependence between random input variables, which is modeled here by employing the method of copulas. The study focuses on four main types of geothermal power generation technologies and introduces a stochastic levelized cost model for each technology. Moreover, the levelized costs of natural gas combined cycle and coal-fired power plants are compared with those of geothermal power plants. The input data used in the model rely on cost data recently reported by government agencies and non-profit organizations, such as the Department of Energy, the National Laboratories, the California Energy Commission and the Geothermal Energy Association. The second part of the study introduces the stochastic discounted cash flow valuation model for the geothermal technologies analyzed in the first phase. In this phase of the study, the Integrated Planning Model (IPM) software was used to forecast the revenue streams of geothermal assets under different price and regulation scenarios. These results are then combined to create a stochastic revenue forecast of the power plants. The uncertainties in gas prices and environmental regulations will be modeled and their potential impacts will be
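
The dependence-aware Monte Carlo idea can be sketched with a two-variable example. A full copula machinery is overkill for a sketch; the code below induces correlation between two lognormal cost inputs through correlated standard normals (the two-dimensional special case of a Gaussian copula). All costs, the correlation value, and the energy output are hypothetical round numbers, not the dissertation's data.

```python
import math
import random

random.seed(7)

rho = 0.6   # assumed dependence between drilling and plant costs (hypothetical)

def correlated_pair(rho):
    """Two standard normals with correlation rho (2-D Cholesky factor)."""
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
    return z1, rho * z1 + math.sqrt(1 - rho ** 2) * z2

lcoe = []
for _ in range(5000):
    z_drill, z_plant = correlated_pair(rho)
    drilling = 20e6 * math.exp(0.3 * z_drill)   # $, lognormal (hypothetical)
    plant = 60e6 * math.exp(0.2 * z_plant)      # $, lognormal (hypothetical)
    energy_mwh = 35e3 * 30                      # MWh over 30 years (hypothetical)
    lcoe.append((drilling + plant) / energy_mwh)

lcoe.sort()
p10, p50, p90 = lcoe[int(0.1 * 5000)], lcoe[2500], lcoe[int(0.9 * 5000)]
print(f"LCOE $/MWh: P10={p10:.0f}  P50={p50:.0f}  P90={p90:.0f}")
```

Positive dependence widens the tails of the total-cost distribution relative to the independent case, which is exactly why the dissertation contrasts the two dependence treatments.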

  9. Modified Phenomena Identification and Ranking Table (PIRT) for Uncertainty Analysis

    International Nuclear Information System (INIS)

    Gol-Mohamad, Mohammad P.; Modarres, Mohammad; Mosleh, Ali

    2006-01-01

    This paper describes a methodology for characterizing important phenomena, which is part of broader research by the authors called 'Modified PIRT'. The methodology provides a robust process of phenomena identification and ranking for more precise quantification of uncertainty. It is a two-step identification and ranking process based on thermal-hydraulic (TH) importance as well as uncertainty importance. The Analytical Hierarchical Process (AHP) has been used as a formal approach for TH identification and ranking. A formal uncertainty importance technique is used to estimate the degree of credibility of the TH model(s) used to represent the important phenomena. This part relies on subjective justification, evaluating available information and data from experiments and code predictions. The proposed methodology was demonstrated by developing a PIRT for a large-break loss-of-coolant accident (LBLOCA) for the LOFT integral facility with highest core power (test LB-1). (authors)

  10. The EURACOS activation experiments: preliminary uncertainty analysis

    International Nuclear Information System (INIS)

    Yeivin, Y.

    1982-01-01

    A sequence of counting rates of an irradiated sulphur pellet, r(tsub(i)), measured at different times after the end of the irradiation, is fitted to r(t) = A exp(-lambda t) + B. A standard adjustment procedure is applied to determine the parameters A and B, their standard deviations and correlation, and chi square. It is demonstrated that if the counting-rate uncertainties are entirely due to counting statistics, the experimental data are totally inconsistent with the ''theoretical'' model. However, assuming an additional systematic error of approximately 1%, and eliminating a few ''bad'' data, produces a data set quite consistent with the model. The dependence of chi square on the assumed systematic error and on the data elimination procedure is discussed in great detail. A review of the adjustment procedure is appended to the report
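    Once the decay constant is fixed, the adjustment described in this record is linear in A and B, so it can be reproduced with a small weighted least-squares sketch. The data below are synthetic; the half-life and count levels are assumed for illustration and are not the EURACOS measurements.

```python
import numpy as np

rng = np.random.default_rng(1)

lam = np.log(2.0) / 14.3                  # assumed half-life in days (P-32-like)
t = np.linspace(0.0, 60.0, 30)
counts = rng.poisson(5000.0 * np.exp(-lam * t) + 50.0).astype(float)

def fit(r, sig):
    """Weighted LSQ for r(t) = A exp(-lam t) + B; returns A, B, chi-square."""
    X = np.column_stack([np.exp(-lam * t), np.ones_like(t)])
    W = np.diag(1.0 / sig**2)
    A, B = np.linalg.solve(X.T @ W @ X, X.T @ W @ r)
    chi2 = float((((r - X @ np.array([A, B])) / sig) ** 2).sum())
    return A, B, chi2

A, B, chi2 = fit(counts, np.sqrt(counts))            # counting statistics only
sig_sys = np.sqrt(counts + (0.01 * counts) ** 2)     # add a 1% systematic error
A2, B2, chi2_sys = fit(counts, sig_sys)
print(A, B, chi2 / (len(t) - 2), chi2_sys / (len(t) - 2))
```

    With purely statistical errors the simulated reduced chi-square sits near one; with the real data it did not, which is what motivated the roughly 1% systematic term in the report. Enlarging the error bars always lowers chi-square, so the question studied in the record is how much enlargement restores consistency.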

  11. Uncertainty Analysis via Failure Domain Characterization: Unrestricted Requirement Functions

    Science.gov (United States)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2011-01-01

    This paper proposes an uncertainty analysis framework based on the characterization of the uncertain parameter space. This characterization enables the identification of worst-case uncertainty combinations and the approximation of the failure and safe domains with a high level of accuracy. Because these approximations are comprised of subsets of readily computable probability, they enable the calculation of arbitrarily tight upper and lower bounds to the failure probability. The methods developed herein, which are based on nonlinear constrained optimization, are applicable to requirement functions whose functional dependency on the uncertainty is arbitrary and whose explicit form may even be unknown. Some of the most prominent features of the methodology are the substantial desensitization of the calculations from the assumed uncertainty model (i.e., the probability distribution describing the uncertainty) as well as the accommodation for changes in such a model with a practically insignificant amount of computational effort.

  12. Uncertainty Analysis via Failure Domain Characterization: Polynomial Requirement Functions

    Science.gov (United States)

    Crespo, Luis G.; Munoz, Cesar A.; Narkawicz, Anthony J.; Kenny, Sean P.; Giesy, Daniel P.

    2011-01-01

    This paper proposes an uncertainty analysis framework based on the characterization of the uncertain parameter space. This characterization enables the identification of worst-case uncertainty combinations and the approximation of the failure and safe domains with a high level of accuracy. Because these approximations are comprised of subsets of readily computable probability, they enable the calculation of arbitrarily tight upper and lower bounds to the failure probability. A Bernstein expansion approach is used to size hyper-rectangular subsets while a sum of squares programming approach is used to size quasi-ellipsoidal subsets. These methods are applicable to requirement functions whose functional dependency on the uncertainty is a known polynomial. Some of the most prominent features of the methodology are the substantial desensitization of the calculations from the uncertainty model assumed (i.e., the probability distribution describing the uncertainty) as well as the accommodation for changes in such a model with a practically insignificant amount of computational effort.

  13. Ethical Code Effectiveness in Football Clubs: A Longitudinal Analysis

    OpenAIRE

    Constandt, Bram; De Waegeneer, Els; Willem, Annick

    2017-01-01

    As football (soccer) clubs are facing different ethical challenges, many clubs are turning to ethical codes to counteract unethical behaviour. However, both in- and outside the sport field, uncertainty remains about the effectiveness of these ethical codes. For the first time, a longitudinal study design was adopted to evaluate code effectiveness. Specifically, a sample of non-professional football clubs formed the subject of our inquiry. Ethical code effectiveness was...

  14. Sensitivity and uncertainty analysis of the PATHWAY radionuclide transport model

    International Nuclear Information System (INIS)

    Otis, M.D.

    1983-01-01

    Procedures were developed for the uncertainty and sensitivity analysis of a dynamic model of radionuclide transport through human food chains. Uncertainty in model predictions was estimated by propagation of parameter uncertainties using a Monte Carlo simulation technique. Sensitivity of model predictions to individual parameters was investigated using the partial correlation coefficient of each parameter with model output. Random values produced for the uncertainty analysis were used in the correlation analysis for sensitivity. These procedures were applied to the PATHWAY model which predicts concentrations of radionuclides in foods grown in Nevada and Utah and exposed to fallout during the period of atmospheric nuclear weapons testing in Nevada. Concentrations and time-integrated concentrations of iodine-131, cesium-136, and cesium-137 in milk and other foods were investigated. 9 figs., 13 tabs
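    The sensitivity measure used in this record, the partial correlation coefficient, can be computed from the same Monte Carlo sample used for uncertainty propagation. A minimal sketch with a toy three-parameter model follows; the parameters and the model are invented for illustration and are not PATHWAY's.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5000

# Three hypothetical transfer parameters with lognormal uncertainty.
x = rng.lognormal(mean=0.0, sigma=[0.5, 0.2, 0.1], size=(n, 3))
y = x[:, 0] * x[:, 1] + 0.1 * x[:, 2]          # toy model output

def pcc(xi, y, others):
    """Partial correlation of xi with y, controlling for the other inputs."""
    Z = np.column_stack([np.ones(len(y)), others])
    rx = xi - Z @ np.linalg.lstsq(Z, xi, rcond=None)[0]
    ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
    return float(np.corrcoef(rx, ry)[0, 1])

ranks = [pcc(x[:, i], y, np.delete(x, i, axis=1)) for i in range(3)]
print(ranks)      # the widest parameter (index 0) dominates
```

    Reusing the propagation sample for the correlation analysis, as the record describes, costs nothing extra: the regression residuals strip out the linear influence of the other inputs before correlating.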

  15. CATHENA 4. A thermalhydraulics network analysis code

    International Nuclear Information System (INIS)

    Aydemir, N.U.; Hanna, B.N.

    2009-01-01

    The Canadian Algorithm for THErmalhydraulic Network Analysis (CATHENA) is a one-dimensional, non-equilibrium, two-phase, two-fluid network analysis code that has been in use for over two decades by various groups in Canada and around the world. The objective of the present paper is to describe the design, application and future development plans for the CATHENA 4 thermalhydraulics network analysis code, which is a modernized version of the present frozen CATHENA 3 code. The new code is designed in modular form, using the Fortran 95 (F95) programming language. The semi-implicit numerical integration scheme of CATHENA 3 is re-written to implement a fully-implicit methodology using Newton's iterative solution scheme for nonlinear equations. The closure relations, as a first step, have been converted from the existing CATHENA 3 implementation to F95 but modularized to achieve ease of maintenance. The paper presents the field equations, followed by a description of the Newton scheme used. The finite-difference form of the field equations is given, followed by a discussion of convergence criteria. Two applications of CATHENA 4 are presented to demonstrate the temporal and spatial convergence of the new code for problems with known solutions or available experimental data. (author)

  16. Code system for fast reactor neutronics analysis

    International Nuclear Information System (INIS)

    Nakagawa, Masayuki; Abe, Junji; Sato, Wakaei.

    1983-04-01

    A code system for analysis of fast reactor neutronics has been developed for the purpose of handy use and error reduction. The JOINT code produces the input data file to be used in the neutronics calculation code and also prepares the cross section library file with an assigned format. The effective cross sections are saved in the PDS file with a unified format. At the present stage, this code system includes the following codes: SLAROM, ESELEM5 and EXPANDA-G for the production of effective cross sections, and CITATION-FBR, ANISN-JR, TWOTRAN2, PHENIX, 3DB, MORSE, CIPER and SNPERT. In the course of the development, some utility and service programs have been additionally developed. These are used for accessing the PDS file, editing the cross sections and graphic display. Included in this report are descriptions of the input data format of JOINT and the other programs, and of the function of each subroutine and utility program. The usage of the PDS file is also explained. In Appendix A, the input formats are described for the revised version of the CIPER code. (author)

  17. Uncertainty analysis with statistically correlated failure data

    International Nuclear Information System (INIS)

    Modarres, M.; Dezfuli, H.; Roush, M.L.

    1987-01-01

    Likelihood of occurrence of the top event of a fault tree or of sequences of an event tree is estimated from the failure probability of the components that constitute the events of the fault/event tree. Component failure probabilities are subject to statistical uncertainties. In addition, there are cases where the failure data are statistically correlated. At present most fault tree calculations are based on uncorrelated component failure data. This chapter describes a methodology for assessing the probability intervals for the top event failure probability of fault trees, or for the frequency of occurrence of event tree sequences, when event failure data are statistically correlated. To estimate the mean and variance of the top event, a second-order system moment method based on a Taylor series expansion is presented, which provides an alternative to the commonly used Monte Carlo method. For cases where component failure probabilities are statistically correlated, the Taylor expansion terms are treated properly. A moment matching technique is used to obtain the probability distribution function of the top event by fitting the Johnson S_B distribution. The computer program CORRELATE was developed to perform the calculations necessary for the implementation of the method. (author)
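    The Taylor-series moment propagation with correlated inputs can be illustrated on a toy top event T = p1*p2 + p3, where p1 and p2 are correlated because they share a data source. The numbers and the normal marginals below are illustrative only (real failure data would typically be lognormal), and this sketch is not the CORRELATE program.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy top event T = p1*p2 + p3: an AND gate OR-ed with a third event.
mean = np.array([1e-2, 2e-2, 1e-4])
sd = np.array([3e-3, 6e-3, 3e-5])
rho = 0.8                                  # p1 and p2 share a data source
R = np.array([[1.0, rho, 0.0], [rho, 1.0, 0.0], [0.0, 0.0, 1.0]])
cov = np.diag(sd) @ R @ np.diag(sd)

def top(p):
    return p[..., 0] * p[..., 1] + p[..., 2]

# Taylor moments about the mean. The only nonzero second derivative of T is
# d2T/dp1 dp2 = 1, so the second-order mean correction is just cov[0, 1].
grad = np.array([mean[1], mean[0], 1.0])
t_mean = top(mean) + cov[0, 1]
t_var = grad @ cov @ grad                  # first-order variance

# Monte Carlo cross-check with the same correlated inputs:
p = rng.multivariate_normal(mean, cov, size=200_000)
print(t_mean, top(p).mean(), t_var, top(p).var())
```

    The Monte Carlo check agrees with the Taylor moments to within a few percent; the small residual in the variance comes from the omitted higher-order terms (sigma1^2 sigma2^2 + cov12^2), which is why the record's method carries the expansion to second order.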

  18. Aspects of uncertainty analysis in accident consequence modeling

    International Nuclear Information System (INIS)

    Travis, C.C.; Hoffman, F.O.

    1981-01-01

    Mathematical models are frequently used to determine probable dose to man from an accidental release of radionuclides by a nuclear facility. With increased emphasis on the accuracy of these models, the incorporation of uncertainty analysis has become one of the most crucial and sensitive components in evaluating the significance of model predictions. In the present paper, we address three aspects of uncertainty in models used to assess the radiological impact to humans: uncertainties resulting from the natural variability in human biological parameters; the propagation of parameter variability by mathematical models; and comparison of model predictions to observational data

  19. New challenges on uncertainty propagation assessment of flood risk analysis

    Science.gov (United States)

    Martins, Luciano; Aroca-Jiménez, Estefanía; Bodoque, José M.; Díez-Herrero, Andrés

    2016-04-01

    Natural hazards such as floods cause considerable damage to human life and to material and functional assets every year, all around the world. Risk assessment procedures carry a set of uncertainties, mainly of two types: natural, derived from the stochastic character inherent in flood process dynamics; and epistemic, associated with lack of knowledge or with flawed procedures employed in the study of these processes. There is abundant scientific and technical literature on uncertainty estimation at each step of flood risk analysis (e.g. rainfall estimates, hydraulic modelling variables), but very little experience with the propagation of uncertainties along the whole flood risk assessment. Epistemic uncertainties are therefore the main goal of this work: in particular, understanding the extent to which uncertainties propagate throughout the process, from inundation studies to risk analysis, and how much a proper flood risk analysis varies as a result. Methodologies such as Polynomial Chaos Theory (PCT), the Method of Moments or Monte Carlo are used to evaluate different sources of error, such as data records (precipitation gauges, flow gauges...), hydrologic and hydraulic modelling (inundation estimation) and socio-demographic data (damage estimation), in order to evaluate the uncertainty propagation (UP) considered in design flood risk estimation, in both numerical and cartographic expression. To account for the total uncertainty and to understand which factors contribute most to the final uncertainty, we used Polynomial Chaos Theory (PCT). It represents an interesting way to handle the inclusion of uncertainty in the modelling and simulation process: PCT allows the development of a probabilistic model of the system in a deterministic setting, using random variables and polynomials to handle the effects of uncertainty. Results of applying the method are more robust than those of traditional analysis
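    A one-dimensional sketch of the Polynomial Chaos idea mentioned in this record: project a nonlinear response of a standard normal input onto probabilists' Hermite polynomials, then read the mean and variance directly off the expansion coefficients. The response y = exp(x) is chosen only because its exact moments are known; real flood-risk responses would come from the hydraulic model.

```python
import numpy as np
from math import factorial, sqrt, pi
from numpy.polynomial import hermite_e as He

# Gauss-Hermite quadrature nodes/weights for the N(0,1) measure.
nodes, weights = He.hermegauss(40)
weights = weights / sqrt(2.0 * pi)

P = 8                                      # truncation order of the PC expansion
coef = np.array([
    np.sum(weights * np.exp(nodes) * He.hermeval(nodes, np.eye(P + 1)[k]))
    / factorial(k)
    for k in range(P + 1)
])

pc_mean = coef[0]
pc_var = sum(coef[k] ** 2 * factorial(k) for k in range(1, P + 1))
print(pc_mean, pc_var)     # exact values: e**0.5 = 1.6487..., e**2 - e = 4.6708...
```

    The deterministic quadrature replaces random sampling entirely, which is the "probabilistic model in a deterministic setting" the abstract refers to; multi-dimensional inputs use tensorized or sparse quadrature instead.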

  20. Application of uncertainty analysis in conceptual fusion reactor design

    International Nuclear Information System (INIS)

    Wu, T.; Maynard, C.W.

    1979-01-01

    The theories of sensitivity and uncertainty analysis are described and applied to a new conceptual tokamak fusion reactor design--NUWMAK. The responses investigated in this study include the tritium breeding ratio, first wall Ti dpa and gas productions, nuclear heating in the blanket, energy leakage to the magnet, and the dpa rate in the superconducting magnet aluminum stabilizer. The sensitivities and uncertainties of these responses are calculated. The cost/benefit feature of proposed integral measurements is also studied through the uncertainty reductions of these responses

  1. Sensitivity and uncertainty analysis of NET/ITER shielding blankets

    International Nuclear Information System (INIS)

    Hogenbirk, A.; Gruppelaar, H.; Verschuur, K.A.

    1990-09-01

    Results are presented of sensitivity and uncertainty calculations based upon the European fusion file (EFF-1). The effect of uncertainties in Fe, Cr and Ni cross sections on the nuclear heating in the coils of a NET/ITER shielding blanket has been studied. The analysis has been performed for the total cross section as well as partial cross sections. The correct expression for the sensitivity profile was used, including the gain term. The resulting uncertainty in the nuclear heating lies between 10 and 20 per cent. (author). 18 refs.; 2 figs.; 2 tabs

  2. Static Code Analysis with Gitlab-CI

    CERN Document Server

    Datko, Szymon Tomasz

    2016-01-01

    Static Code Analysis is a simple but efficient way to ensure that an application's source code is free from known flaws and security vulnerabilities. Although such analysis tools often come with more advanced code editors, there are a lot of people who prefer less complicated environments. The easiest solution would involve education – where to get and how to use the aforementioned tools. However, counting on the manual usage of such tools still does not guarantee their actual usage. On the other hand, reducing the required effort, according to the idea “setup once, use anytime without sweat”, seems like a more promising approach. In this paper, the approach to automating code scanning within the existing CERN Gitlab installation is described. For the realization of the project, the Gitlab-CI service (the “CI” stands for "Continuous Integration"), with Docker assistance, was employed to provide a variety of static code analysers for different programming languages. This document covers the gene...

  3. Improved Monte Carlo Method for PSA Uncertainty Analysis

    International Nuclear Information System (INIS)

    Choi, Jongsoo

    2016-01-01

    The treatment of uncertainty is an important issue for regulatory decisions. Uncertainties exist from knowledge limitations. A probabilistic approach has exposed some of these limitations and provided a framework to assess their significance and assist in developing a strategy to accommodate them in the regulatory process. The uncertainty analysis (UA) is usually based on the Monte Carlo method. This paper proposes a Monte Carlo UA approach to calculate the mean risk metrics accounting for the state-of-knowledge correlation (SOKC) between basic events (including CCFs) using efficient random number generators, and to meet Capability Category III of the ASME/ANS PRA standard. Audit calculation is needed in PSA regulatory reviews of uncertainty analysis results submitted for licensing. The proposed Monte Carlo UA approach provides a high degree of confidence in PSA reviews. All PSAs need to account for the SOKC between event probabilities to meet the ASME/ANS PRA standard
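    The effect of the state-of-knowledge correlation (SOKC) on a mean risk metric is easy to demonstrate. In the sketch below, two identical components share one epistemic failure-probability distribution (illustrative lognormal parameters, not from the paper); sampling one value per Monte Carlo trial, as the SOKC requires, raises the mean of the AND combination above what independent sampling gives.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 1_000_000

# One epistemic distribution for the failure probability of two identical
# components (illustrative lognormal: median 1e-3, error factor ~ 3.7).
p = rng.lognormal(np.log(1e-3), 0.8, n)

and_sokc = (p * p).mean()                    # same sample drives both events
and_indep = (p * rng.permutation(p)).mean()  # SOKC ignored
print(and_sokc, and_indep)                   # E[p^2] exceeds E[p]^2 by Var(p)
```

    Because E[p^2] = E[p]^2 + Var(p), ignoring the SOKC systematically underestimates the mean frequency of cut sets containing repeated or correlated events, which is why the standard requires accounting for it.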

  4. Uncertainty Analysis of Seebeck Coefficient and Electrical Resistivity Characterization

    Science.gov (United States)

    Mackey, Jon; Sehirlioglu, Alp; Dynys, Fred

    2014-01-01

    In order to provide a complete description of a material's thermoelectric power factor, an uncertainty interval is required in addition to the measured nominal value. The uncertainty may contain sources of measurement error including systematic bias error and precision error of a statistical nature. The work focuses specifically on the popular ZEM-3 (Ulvac Technologies) measurement system, but the methods apply to any measurement system. The analysis accounts for sources of systematic error including sample preparation tolerance, measurement probe placement, thermocouple cold-finger effect, and measurement parameters, in addition to uncertainty of a statistical nature. Complete uncertainty analysis of a measurement system allows for more reliable comparison of measurement data between laboratories.

  5. Improved Monte Carlo Method for PSA Uncertainty Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Jongsoo [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2016-10-15

    The treatment of uncertainty is an important issue for regulatory decisions. Uncertainties exist from knowledge limitations. A probabilistic approach has exposed some of these limitations and provided a framework to assess their significance and assist in developing a strategy to accommodate them in the regulatory process. The uncertainty analysis (UA) is usually based on the Monte Carlo method. This paper proposes a Monte Carlo UA approach to calculate the mean risk metrics accounting for the state-of-knowledge correlation (SOKC) between basic events (including CCFs) using efficient random number generators, and to meet Capability Category III of the ASME/ANS PRA standard. Audit calculation is needed in PSA regulatory reviews of uncertainty analysis results submitted for licensing. The proposed Monte Carlo UA approach provides a high degree of confidence in PSA reviews. All PSAs need to account for the SOKC between event probabilities to meet the ASME/ANS PRA standard.

  6. Turbo Pascal Computer Code for PIXE Analysis

    International Nuclear Information System (INIS)

    Darsono

    2002-01-01

    For optimal utilization of the 150 kV ion accelerator facilities and to establish the analysis technique using the ion accelerator, research and development of low-energy PIXE technology has been carried out. R and D on the hardware of the low-energy PIXE installation at P3TM has been under way since the year 2000. To support the R and D of the PIXE accelerator facilities in harmony with the R and D of the PIXE hardware, development of PIXE analysis software is also needed. The development of a database of PIXE analysis software using a Turbo Pascal computer code is reported in this paper. The code computes the ionization cross-section, the fluorescence yield and the stopping power of elements, as well as the X-ray energy attenuation coefficients. The computer code is named PIXEDASIS, and it is part of a larger computer code planned for PIXE analysis that will be constructed in the near future. PIXEDASIS is designed to be communicative with the user: it takes input from the keyboard, and the output is shown on the PC monitor and can also be printed. Performance tests show that PIXEDASIS operates well and provides data in agreement with data from other literature. (author)

  7. Application of RELAP/SCDAPSIM with integrated uncertainty options to research reactor systems thermal hydraulic analysis

    International Nuclear Information System (INIS)

    Allison, C.M.; Hohorst, J.K.; Perez, M.; Reventos, F.

    2010-01-01

    The RELAP/SCDAPSIM/MOD4.0 code, designed to predict the behavior of reactor systems during normal and accident conditions, is being developed as part of the international SCDAP Development and Training Program (SDTP). RELAP/SCDAPSIM/MOD4.0, which is the first version of RELAP5 completely rewritten to FORTRAN 90/95/2000 standards, uses publicly available RELAP5 and SCDAP models in combination with advanced programming and numerical techniques and other SDTP-member modeling/user options. One such member-developed option is an integrated uncertainty analysis package being developed jointly by the Technical University of Catalonia (UPC) and Innovative Systems Software (ISS). This paper briefly summarizes the features of RELAP/SCDAPSIM/MOD4.0 and the integrated uncertainty analysis package, and then presents an example of how the integrated uncertainty package can be set up and used for a simple pipe flow problem. (author)

  8. PCT Uncertainty Analysis Using Unscented Transform with Random Orthogonal Matrix

    Energy Technology Data Exchange (ETDEWEB)

    Fynana, Douglas A.; Ahn, Kwang-Il [KAERI, Daejeon (Korea, Republic of); Lee, John C. [Univ. of Michigan, Michigan (United States)

    2015-05-15

    less statistical variation of mean and variance estimates of the output pdf. For BEPU applications where on the order of 20 to 50 input parameter uncertainties are usually sampled, the UT with ROM requires the same order of magnitude of best estimate code simulations as the widely used Wilks' formula. In addition to BEPU, the UT with ROM may be a useful sampling algorithm in other nuclear applications involving computer codes and uncertainties. One future area of research is nuclear data uncertainty propagation in neutronics calculations. Cross section uncertainties are conveniently available as covariance data in the evaluated nuclear data files so the UT only requiring covariance information appears to be an appropriate method.
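    The plain symmetric unscented transform (UT) that the ROM variant builds on fits in a few lines. The response function below is a made-up quadratic stand-in for a thermal-hydraulics code output, not a PCT correlation; the random orthogonal matrix simply rotates the sigma-point set while preserving its mean and covariance.

```python
import numpy as np

def unscented(mean, cov, f, Q=None):
    """2n-point symmetric unscented transform of y = f(x), x ~ N(mean, cov)."""
    n = len(mean)
    S = np.linalg.cholesky(cov)
    if Q is not None:                      # random orthogonal rotation (ROM)
        S = S @ Q
    pts = [mean + np.sqrt(n) * S[:, i] for i in range(n)]
    pts += [mean - np.sqrt(n) * S[:, i] for i in range(n)]
    y = np.array([f(p) for p in pts])
    return y.mean(), y.var()

# Made-up quadratic response of two uncertain inputs (illustrative only).
f = lambda x: 600.0 + 50.0 * x[0] ** 2 + 30.0 * x[0] * x[1]
mean = np.zeros(2)
cov = np.array([[1.0, 0.3], [0.3, 1.0]])

rng = np.random.default_rng(5)
Q, _ = np.linalg.qr(rng.standard_normal((2, 2)))   # random orthogonal matrix

m_plain, v_plain = unscented(mean, cov, f)
m_rom, v_rom = unscented(mean, cov, f, Q)
print(m_plain, m_rom)   # both equal the exact mean, 659, for a quadratic f
```

    For a quadratic response any such point set reproduces the exact mean, because the sigma points match the input covariance exactly; the variance is only approximate, which is the motivation for averaging estimates over random rotations.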

  9. Two-dimensional cross-section and SED uncertainty analysis for the Fusion Engineering Device (FED)

    International Nuclear Information System (INIS)

    Embrechts, M.J.; Urban, W.T.; Dudziak, D.J.

    1982-01-01

    The theory of two-dimensional cross-section and secondary-energy-distribution (SED) sensitivity was implemented by developing a two-dimensional sensitivity and uncertainty analysis code, SENSIT-2D. Analyses of the Fusion Engineering Device (FED) conceptual inboard shield indicate that, although the calculated uncertainties in the 2-D model are of the same order of magnitude as those resulting from the 1-D model, there might be severe differences. The more complex the geometry, the more compulsory a 2-D analysis becomes. Specific results show that the uncertainty for the integral heating of the toroidal field (TF) coil for the FED is 114.6%. The main contributors to the cross-section uncertainty are chromium and iron. Contributions to the total uncertainty were smaller for nickel, copper, hydrogen and carbon. All analyses were performed with the Los Alamos 42-group cross-section library generated from ENDF/B-V data, and the COVFILS covariance matrix library. The large uncertainties due to chromium result mainly from large covariances for the chromium total and elastic scattering cross sections

  10. Uncertainty Instability Risk Analysis of High Concrete Arch Dam Abutments

    Directory of Open Access Journals (Sweden)

    Xin Cao

    2017-01-01

    The uncertainties associated with concrete arch dams rise with the increased height of dams. Given the uncertainties associated with influencing factors, the stability of high arch dam abutments was studied as a fuzzy random event. In addition, given the randomness and fuzziness of the calculation parameters as well as of the failure criterion, hazard point and hazard surface uncertainty instability risk ratio models were proposed for high arch dam abutments on the basis of credibility theory. The uncertainty instability failure criterion was derived through analysis of the progressive instability failure process on the basis of Shannon's entropy theory. The uncertainties associated with influencing factors were quantized by probability or possibility distribution assignments. Gaussian random theory was used to generate random realizations for influencing factors with spatial variability. An uncertainty stability analysis method was proposed by combining finite element analysis and the limit equilibrium method. The instability risk ratio was calculated using the Monte Carlo simulation method and fuzzy random postprocessing. Results corroborate that the modeling approach is sound and that the calculation method is feasible.

  11. Uncertainty analysis of one Main Circulation Pump trip event at the Ignalina NPP

    International Nuclear Information System (INIS)

    Vileiniskis, V.; Kaliatka, A.; Uspuras, E.

    2004-01-01

    A trip of one Main Circulation Pump (MCP) is an anticipated transient with an expected frequency of approximately one event per year, and there have been a few events in which one MCP was inadvertently tripped. The throughput of the remaining running pumps in the affected Main Circulation Circuit loop increased; however, the total coolant flow through the affected loop decreased. The main question is whether this coolant flow rate is sufficient for adequate core cooling. This paper presents an investigation of a one-MCP trip event at the Ignalina NPP. According to international practice, the transient analysis should consist of deterministic analysis employing best-estimate codes together with uncertainty analysis. For that purpose, the plant's RELAP5 model and the GRS (Germany) System for Uncertainty and Sensitivity Analysis package (SUSA) were employed. Uncertainty analysis of the flow energy loss in different parts of the Main Circulation Circuit, of the initial conditions and of code-selected models was performed. Such analysis makes it possible to estimate the influence of separate parameters on the calculation results and to find the modelling parameters that have the largest impact on the event studied. On the basis of this analysis, recommendations for further improvement of the model have been developed. (author)

  12. Benchmark calculation of subchannel analysis codes

    International Nuclear Information System (INIS)

    1996-02-01

    In order to evaluate the analysis capabilities of various subchannel codes used in thermal-hydraulic design of light water reactors, benchmark calculations were performed. The selected benchmark problems and major findings obtained by the calculations were as follows: (1) As for single-phase flow mixing experiments between two channels, the calculated results of water temperature distribution along the flow direction agreed with experimental results when the turbulent mixing coefficients were tuned properly. However, the effect of gap width observed in the experiments could not be predicted by the subchannel codes. (2) As for two-phase flow mixing experiments between two channels, in high water flow rate cases the calculated distributions of air and water flows in each channel agreed well with the experimental results; in low water flow cases, on the other hand, the air mixing rates were underestimated. (3) As for two-phase flow mixing experiments among multi-channels, the calculated mass velocities at channel exit under steady-state conditions agreed with experimental values within about 10%. However, the predictive errors of exit qualities were as high as 30%. (4) As for critical heat flux (CHF) experiments, two different results were obtained. One code indicated that the calculated CHFs using the KfK or EPRI correlations agreed well with the experimental results, while another code suggested that the CHFs were well predicted by using the WSC-2 correlation or the Weisman-Pei mechanistic model. (5) As for droplet entrainment and deposition experiments, it was indicated that the predictive capability was significantly increased by improving correlations. On the other hand, a remarkable discrepancy between codes was observed: one code underestimated the droplet flow rate and overestimated the liquid film flow rate in high quality cases, while another code overestimated the droplet flow rate and underestimated the liquid film flow rate in low quality cases. (J.P.N.)

  13. The explicit treatment of model uncertainties in the presence of aleatory and epistemic parameter uncertainties in risk and reliability analysis

    International Nuclear Information System (INIS)

    Ahn, Kwang Il; Yang, Joon Eon

    2003-01-01

    In the risk and reliability analysis of complex technological systems, the primary concern of formal uncertainty analysis is to understand why uncertainties arise, and to evaluate how they impact the results of the analysis. In recent times, many uncertainty analyses have focused on the parameters of the risk and reliability analysis models, whose values are uncertain in an aleatory or an epistemic way. As the field of parametric uncertainty analysis matures, however, more attention is being paid to the explicit treatment of uncertainties that are embedded in the predictive model itself, as well as to the accuracy of the predictive model. The essential steps for evaluating the impacts of these model uncertainties in the presence of parameter uncertainties are to determine rigorously the various sources of uncertainty to be addressed in the underlying model itself and, in turn, in the model parameters, based on our state of knowledge and relevant evidence. Answering clearly the question of how to characterize and treat explicitly the foregoing different sources of uncertainty is particularly important for practical aspects such as risk and reliability optimization of systems, as well as for more transparent risk information and decision-making under various uncertainties. The main purpose of this paper is to provide practical guidance for quantitatively treating the various model uncertainties that are often encountered in the risk and reliability modeling process of complex technological systems

  14. Response surface methodology for sensitivity and uncertainty analysis: performance and perspectives

    International Nuclear Information System (INIS)

    Olivi, L.; Brunelli, F.; Cacciabue, P.C.; Parisi, P.

    1985-01-01

    Two main aspects have to be taken into account in studying a nuclear accident scenario when using nuclear safety codes as an information source. The first one concerns the behavior of the code response and the set of assumptions to be introduced for its modelling. The second one is connected with the uncertainty features of the code input, often modelled as a probability density function (pdf). The analyst can apply two well-defined approaches depending on whether he wants major emphasis put on either of the aspects. Response Surface Methodology uses polynomial and inverse polynomial models together with the theory of experimental design, expressly developed for the identification procedure. It constitutes a well-established body of techniques able to cover a wide spectrum of requirements, when the first aspect plays the crucial role in the definition of the objectives. Other techniques such as Latin hypercube sampling, stratified sampling or even random sampling can fit better, when the second aspect affects the reliability of the analysis. The ultimate goal for both approaches is the selection of the variable, i.e. the identification of the code input variables most effective on the output and the uncertainty propagation, i.e. the assessment of the pdf to be attributed to the code response. The main aim of this work is to present a sensitivity analysis method, already tested on a real case, sufficiently flexible to be applied in both approaches mentioned
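    Of the sampling schemes contrasted in this record, Latin hypercube sampling is the easiest to state precisely: each coordinate axis is cut into n equal-probability strata and each stratum receives exactly one point. A minimal sketch (uniform [0, 1) marginals; other marginals follow by applying their inverse CDFs to the columns):

```python
import numpy as np

def lhs(n, d, rng):
    """Latin hypercube sample of n points in [0, 1)^d."""
    strata = np.tile(np.arange(n), (d, 1))       # one row of strata per dimension
    strata = rng.permuted(strata, axis=1).T      # shuffle each dimension separately
    return (strata + rng.random((n, d))) / n     # jitter within each stratum

rng = np.random.default_rng(6)
x = lhs(1000, 2, rng)

# Every one of the 1000 strata of each coordinate holds exactly one point:
counts = np.bincount((x[:, 0] * 1000).astype(int), minlength=1000)
print(counts.min(), counts.max())   # 1 1
```

    Compared with plain random sampling, the stratification guarantees coverage of the tails of each input distribution with the same number of code runs, which is what makes it attractive when the uncertainty-propagation aspect dominates the analysis.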

  15. Sandia National Laboratories analysis code data base

    Energy Technology Data Exchange (ETDEWEB)

    Peterson, C.W.

    1994-11-01

    Sandia National Laboratories' mission is to solve important problems in the areas of national defense, energy security, environmental integrity, and industrial technology. The Laboratories' strategy for accomplishing this mission is to conduct research to provide an understanding of the important physical phenomena underlying any problem, and then to construct validated computational models of the phenomena which can be used as tools to solve the problem. In the course of implementing this strategy, Sandia's technical staff has produced a wide variety of numerical problem-solving tools which they use regularly in the design, analysis, performance prediction, and optimization of Sandia components, systems and manufacturing processes. This report provides the relevant technical and accessibility data on the numerical codes used at Sandia, including information on the technical competency or capability area that each code addresses, code 'ownership' and release status, and references describing the physical models and numerical implementation.

  16. Sandia National Laboratories analysis code data base

    Science.gov (United States)

    Peterson, C. W.

    1994-11-01

    Sandia National Laboratories' mission is to solve important problems in the areas of national defense, energy security, environmental integrity, and industrial technology. The laboratories' strategy for accomplishing this mission is to conduct research to provide an understanding of the important physical phenomena underlying any problem, and then to construct validated computational models of the phenomena which can be used as tools to solve the problem. In the course of implementing this strategy, Sandia's technical staff has produced a wide variety of numerical problem-solving tools which they use regularly in the design, analysis, performance prediction, and optimization of Sandia components, systems, and manufacturing processes. This report provides the relevant technical and accessibility data on the numerical codes used at Sandia, including information on the technical competency or capability area that each code addresses, code 'ownership' and release status, and references describing the physical models and numerical implementation.

  17. Efficient Quantification of Uncertainties in Complex Computer Code Results, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — This proposal addresses methods for efficient quantification of margins and uncertainties (QMU) for models that couple multiple, large-scale commercial or...

  18. Development of chemical equilibrium analysis code 'CHEEQ'

    International Nuclear Information System (INIS)

    Nagai, Shuichiro

    2006-08-01

    The 'CHEEQ' code, which calculates the partial pressures and masses of a system consisting of ideal gases and pure condensed-phase compounds, was developed. Its characteristics are as follows. All chemical equilibrium equations are described as formation reactions from the mono-atomic gases in order to simplify the code structure and input preparation. The chemical equilibrium conditions Σν_iμ_i = 0 for gaseous compounds and precipitated condensed-phase compounds, and Σν_iμ_i > 0 for non-precipitated condensed-phase compounds, are applied, where ν_i and μ_i are the stoichiometric coefficient and chemical potential of component i. A virtual-solid model was introduced to perform calculations under constant-partial-pressure conditions. 'CHEEQ' consists of the following three parts: (1) the analysis code, zc132.f; (2) the thermodynamic data base, zmdb01; and (3) the input data file, zindb. The code can treat a system consisting of up to 20 elements, 100 condensed-phase compounds, and 200 gaseous compounds. The thermodynamic data base zmdb01 contains about 1000 elements and compounds, some 200 of which are actinide elements and their compounds. This report describes the basic equations, the outline of the solution procedure, and instructions for preparing the input data and evaluating the calculation results. (author)

  19. Uncertainty analysis of suppression pool heating during an ATWS in a BWR-5 plant

    International Nuclear Information System (INIS)

    Wulff, W.; Cheng, H.S.; Mallen, A.N.; Johnsen, G.W.; Lellouche, G.S.

    1994-03-01

    The uncertainty in predicting the peak temperature in the suppression pool of a BWR power plant undergoing an NRC-postulated Anticipated Transient Without Scram (ATWS) has been estimated. The ATWS is initiated by recirculation-pump trips and then leads to power and flow oscillations such as those that occurred at the LaSalle-2 Power Station in March 1988. After limit-cycle oscillations have been established, the turbines are tripped, but without MSIV closure, allowing steam discharge through the turbine bypass into the condenser. Postulated operator actions, namely lowering the reactor vessel pressure and the level elevation in the downcomer, are simulated by a robot model which accounts for operator uncertainty. All balance-of-plant and control-system modeling uncertainties were part of the statistical uncertainty analysis, which was patterned after the Code Scaling, Applicability and Uncertainty (CSAU) evaluation methodology. The analysis showed that the predicted suppression-pool peak temperature of 329.3 K (133 degrees F) has a 95-percentile uncertainty of 14.4 K (26 degrees F), and that the size of this uncertainty bracket is dominated by the experimental uncertainty of measuring Safety and Relief Valve mass flow rates under critical-flow conditions. The analysis also showed that the probability of exceeding the suppression-pool temperature limit of 352.6 K (175 degrees F) is most likely zero (it is estimated as < 5×10⁻⁴). The square root of the sum of the squares of all the computed peak pool temperatures is 350.7 K (171.6 degrees F)

  20. Dealing with phenomenological uncertainty in risk analysis

    International Nuclear Information System (INIS)

    Theofanous, T.G.

    1994-01-01

    The Risk-Oriented Accident Analysis Methodology (ROAAM) is summarized and developed further towards a formal definition. The key ideas behind the methodology and these more formal aspects are also presented and discussed

  1. Uncertainty and sensitivity analysis of parameters affecting water hammer pressure wave behaviour

    International Nuclear Information System (INIS)

    Kaliatka, A.; Uspuras, E.; Vaisnoras, M.

    2006-01-01

    Pressure surges occurring in pipeline systems may be caused by fast control interventions, start-up and shut-down processes, and operational failures. They lead to water hammer upstream of the closing valve and cavitational hammer downstream of the valve, which may cause considerable damage to the pipeline and the support structures. The appearance of water hammer in thermal-hydraulic systems has been widely studied in many organizations employing different state-of-the-art thermal-hydraulic codes. For the analysis, a water hammer test performed at the Fraunhofer Institute for Environmental, Safety and Energy Technology (UMSICHT) at Oberhausen was considered. This paper presents a comparison of calculations of the UMSICHT test facility experiment, employing the best estimate system code RELAP5/Mod3.3, with the water hammer values measured after fast closure of a valve. The analysis revealed that the calculated first pressure peak, which has the highest value, matches the measured value very well. The performed analysis (as with any other analysis) always contains uncertainty in each individual calculation owing to the initial conditions of the installations, errors of the measuring systems, errors caused by the nodalization of objects in modelling, code correlations, etc. In this connection, results of an uncertainty and sensitivity analysis of the initial conditions and selected code models are shown in the paper. (orig.)

  2. Improvement of Modeling HTGR Neutron Physics by Uncertainty Analysis with the Use of Cross-Section Covariance Information

    Science.gov (United States)

    Boyarinov, V. F.; Grol, A. V.; Fomichenko, P. A.; Ternovykh, M. Yu

    2017-01-01

    This work is aimed at improving HTGR neutron physics design calculations by applying uncertainty analysis with the use of cross-section covariance information. Methodology and codes for the preparation of multigroup libraries of covariance information for individual isotopes from the basic 44-group library of the SCALE-6 code system were developed. A 69-group library of covariance information in a special format for the main isotopes and elements typical of high temperature gas cooled reactors (HTGR) was generated. This library can be used for the estimation of uncertainties associated with nuclear data in the analysis of HTGR neutron physics with design codes. As an example, calculations of one-group cross-section uncertainties for fission and capture reactions for the main isotopes of the MHTGR-350 benchmark, as well as uncertainties of the multiplication factor (k∞) for the MHTGR-350 fuel compact cell model and fuel block model, were performed. These uncertainties were estimated by the developed technology with the use of the WIMS-D code and modules of the SCALE-6 code system, namely TSUNAMI, KENO-VI and SAMS. The eight most important reactions for the MHTGR-350 benchmark were identified, namely: 10B(capt), 238U(n,γ), ν of 235U, 235U(n,γ), 238U(el), natC(el), 235U(fiss)-235U(n,γ), 235U(fiss).
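
    The propagation of cross-section covariance data to a response uncertainty, as performed by sensitivity/uncertainty tools of this kind, follows the first-order "sandwich rule". A minimal sketch; the two-parameter sensitivity vector and covariance matrix below are invented for illustration, not benchmark data:

```python
def sandwich_uncertainty(sens, cov):
    """Relative 1-sigma uncertainty of a response from the sandwich rule
    sigma^2 = S^T C S, where S is the relative sensitivity vector and
    C the relative covariance matrix of the nuclear data."""
    var = sum(sens[i] * cov[i][j] * sens[j]
              for i in range(len(sens)) for j in range(len(sens)))
    return var ** 0.5

# hypothetical: sensitivities of k to a capture and a fission cross section
S = [-0.3, 0.7]
C = [[4.0e-4, 1.0e-4],
     [1.0e-4, 2.5e-4]]
rel_sigma = sandwich_uncertainty(S, C)  # roughly a 1% relative uncertainty in k
```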

  3. The role of sensitivity analysis in assessing uncertainty

    International Nuclear Information System (INIS)

    Crick, M.J.; Hill, M.D.

    1987-01-01

    Outside the specialist world of those carrying out performance assessments, considerable confusion has arisen about the meanings of sensitivity analysis and uncertainty analysis. In this paper we attempt to reduce this confusion. We then go on to review approaches to sensitivity analysis within the context of assessing uncertainty, and to outline the types of test available to identify sensitive parameters, together with their advantages and disadvantages. The views expressed in this paper are those of the authors; they have not been formally endorsed by the National Radiological Protection Board and should not be interpreted as Board advice

  4. Analysis and Reduction of Complex Networks Under Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Knio, Omar M

    2014-04-09

    This is a collaborative proposal that aims at developing new methods for the analysis and reduction of complex multiscale networks under uncertainty. The approach is based on combining methods of computational singular perturbation (CSP) and probabilistic uncertainty quantification. In deterministic settings, CSP yields asymptotic approximations of reduced-dimensionality “slow manifolds” on which a multiscale dynamical system evolves. Introducing uncertainty raises fundamentally new issues, particularly concerning its impact on the topology of slow manifolds, and means to represent and quantify associated variability. To address these challenges, this project uses polynomial chaos (PC) methods to reformulate uncertain network models, and to analyze them using CSP in probabilistic terms. Specific objectives include (1) developing effective algorithms that can be used to illuminate fundamental and unexplored connections among model reduction, multiscale behavior, and uncertainty, and (2) demonstrating the performance of these algorithms through applications to model problems.
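
    The polynomial chaos (PC) reformulation can be illustrated by projecting a function of a standard normal input onto probabilists' Hermite polynomials, with the projection integrals estimated by Monte Carlo. This is a generic sketch, not the project's algorithms; the test function, sample size, and seed are arbitrary assumptions.

```python
import math
import random

def hermite(k, x):
    """Probabilists' Hermite polynomial He_k(x) via the recurrence
    He_{n+1}(x) = x He_n(x) - n He_{n-1}(x)."""
    h_prev, h = 1.0, x
    if k == 0:
        return h_prev
    for n in range(1, k):
        h_prev, h = h, x * h - n * h_prev
    return h

def pce_coefficients(f, order, n=100000, rng=random.Random(3)):
    """Coefficients c_k = E[f(xi) He_k(xi)] / k! for xi ~ N(0, 1),
    so that f(xi) ~ sum_k c_k He_k(xi) (non-intrusive projection)."""
    sums = [0.0] * (order + 1)
    for _ in range(n):
        xi = rng.gauss(0.0, 1.0)
        fx = f(xi)
        for k in range(order + 1):
            sums[k] += fx * hermite(k, xi)
    return [sums[k] / n / math.factorial(k) for k in range(order + 1)]

# f(xi) = xi^2 has the exact expansion He_0 + He_2, i.e. c = [1, 0, 1]
coeffs = pce_coefficients(lambda x: x * x, 2)
```

    Once the coefficients are available, moments and sensitivities of the output follow from the expansion without further runs of the underlying model.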

  5. Uncertainty analysis of power monitoring transit time ultrasonic flow meters

    International Nuclear Information System (INIS)

    Orosz, A.; Miller, D. W.; Christensen, R. N.; Arndt, S.

    2006-01-01

    A general uncertainty analysis is applied to chordal, transit time ultrasonic flow meters that are used in nuclear power plant feedwater loops. This investigation focuses on relationships between the major parameters of the flow measurement. For this study, mass flow rate is divided into three components, profile factor, density, and a form of volumetric flow rate. All system parameters are used to calculate values for these three components. Uncertainty is analyzed using a perturbation method. Sensitivity coefficients for major system parameters are shown, and these coefficients are applicable to a range of ultrasonic flow meters used in similar applications. Also shown is the uncertainty to be expected for density along with its relationship to other system uncertainties. One other conclusion is that pipe diameter sensitivity coefficients may be a function of the calibration technique used. (authors)
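
    The perturbation approach to sensitivity coefficients described above can be sketched as follows. The multiplicative mass-flow relation and all numerical values are hypothetical stand-ins for the meter's actual chordal-integration equations:

```python
import math

def mass_flow(profile_factor, density, volumetric_flow):
    # hypothetical relation m_dot = PF * rho * Q (stand-in for the
    # full chordal-integration formula of the meter)
    return profile_factor * density * volumetric_flow

def sensitivity_coefficient(f, args, index, rel_step=1e-6):
    """Dimensionless sensitivity (x_i / F) dF/dx_i by central differences."""
    x = list(args)
    h = args[index] * rel_step
    x[index] = args[index] + h
    up = f(*x)
    x[index] = args[index] - h
    down = f(*x)
    return (up - down) / (2.0 * h) * (args[index] / f(*args))

args = (1.02, 740.0, 2.5)  # profile factor, density [kg/m3], Q [m3/s]
coeffs = [sensitivity_coefficient(mass_flow, args, i) for i in range(3)]

# for a purely multiplicative model every coefficient is 1, so the
# relative uncertainties combine in quadrature
rel_unc = [0.003, 0.002, 0.004]  # assumed 1-sigma relative uncertainties
total = math.sqrt(sum((c * u) ** 2 for c, u in zip(coeffs, rel_unc)))
```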

  6. Bayesian analysis for uncertainty estimation of a canopy transpiration model

    Science.gov (United States)

    Samanta, S.; Mackay, D. S.; Clayton, M. K.; Kruger, E. L.; Ewers, B. E.

    2007-04-01

    A Bayesian approach was used to fit a conceptual transpiration model to half-hourly transpiration rates for a sugar maple (Acer saccharum) stand collected over a 5-month period and probabilistically estimate its parameter and prediction uncertainties. The model used the Penman-Monteith equation with the Jarvis model for canopy conductance. This deterministic model was extended by adding a normally distributed error term. This extension enabled using Markov chain Monte Carlo simulations to sample the posterior parameter distributions. The residuals revealed approximate conformance to the assumption of normally distributed errors. However, minor systematic structures in the residuals at fine timescales suggested model changes that would potentially improve the modeling of transpiration. Results also indicated considerable uncertainties in the parameter and transpiration estimates. This simple methodology of uncertainty analysis would facilitate the deductive step during the development cycle of deterministic conceptual models by accounting for these uncertainties while drawing inferences from data.
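
    The Markov chain Monte Carlo step described above can be sketched with a Metropolis sampler. The one-parameter linear model, flat prior, data, and error standard deviation below are invented for illustration; the actual study sampled the parameters of the Penman-Monteith/Jarvis model.

```python
import math
import random

def log_posterior(theta, xs, ys, sigma=0.2):
    # log-likelihood of y = theta * x with iid N(0, sigma^2) errors,
    # flat prior (additive constants dropped)
    return sum(-(y - theta * x) ** 2 / (2.0 * sigma ** 2)
               for x, y in zip(xs, ys))

def metropolis(xs, ys, n_iter=5000, step=0.05, rng=random.Random(1)):
    theta = 1.0                                 # arbitrary starting point
    lp = log_posterior(theta, xs, ys)
    chain = []
    for _ in range(n_iter):
        cand = theta + rng.gauss(0.0, step)     # random-walk proposal
        lp_cand = log_posterior(cand, xs, ys)
        if math.log(rng.random()) < lp_cand - lp:   # accept/reject
            theta, lp = cand, lp_cand
        chain.append(theta)
    return chain

xs = [0.5, 1.0, 1.5, 2.0]
ys = [1.1, 1.9, 3.2, 4.1]
chain = metropolis(xs, ys)
posterior_mean = sum(chain[1000:]) / len(chain[1000:])  # discard burn-in
```

    The spread of the retained chain gives the parameter uncertainty directly, and pushing chain samples through the model gives prediction uncertainty.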

  7. Nordic reference study on uncertainty and sensitivity analysis

    International Nuclear Information System (INIS)

    Hirschberg, S.; Jacobsson, P.; Pulkkinen, U.; Porn, K.

    1989-01-01

    This paper provides a review of the first phase of the Nordic reference study on uncertainty and sensitivity analysis. The main objective of this study is to use experiences from previous Nordic Benchmark Exercises and reference studies concerning critical modeling issues, such as common cause failures and human interactions, and to demonstrate the impact of the associated uncertainties on the uncertainty of the investigated accident sequence. This has been done independently by three working groups which used different approaches to modeling and to uncertainty analysis. The estimated uncertainty interval for the analyzed accident sequence is large. The discrepancies between the groups are also substantial, but can be explained. The sensitivity analyses that have been carried out concern, e.g., the use of different CCF-quantification models, alternative handling of CCF data, time windows for operator actions, time dependences in phased mission operation, the impact of state-of-knowledge dependences, and the ranking of dominating uncertainty contributors. Specific findings with respect to these issues are summarized in the paper

  8. Phenomenological uncertainty analysis of early containment failure at severe accident of nuclear power plant

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Su Won

    2011-02-15

    The severe accident has inherently large uncertainty due to the wide range of conditions involved, and performing experiments, validation and practical application is extremely difficult because of the high temperatures and pressures. Although internal and external research has been carried out, the references used in Korean nuclear plants were foreign data from the 1980s, and safety analysis such as probabilistic safety assessment has not applied the newest methodology. Also, the containment pressure used to identify the probability of containment failure in Level 2 PSA is a point value taken from the results of thermal-hydraulic analysis. In this paper, uncertainty analysis methods for the severe-accident phenomena influencing early containment failure were developed, an uncertainty analysis applying them to Korean nuclear plants using the MELCOR code was performed, and the distribution of containment pressure is presented as a result of the uncertainty analysis. Because early containment failure is an important factor in the Large Early Release Frequency (LERF) that is used as a representative criterion for decision-making in nuclear power plants, it was selected in this paper among the various modes of containment failure. The important phenomena of early containment failure in a severe accident were identified on the basis of previous research, and a seven-step methodology to evaluate the uncertainty was developed. A MELCOR input for analysis of the severe accident reflecting natural circulation flow was developed, and the accident scenario for station blackout, the representative initiating event for early containment failure, was determined. By reviewing the internal models and correlations of the MELCOR models relevant to the important phenomena of early containment failure, the factors which could affect the uncertainty were found, and the major factors were finally identified through sensitivity analysis. In order to determine the total number of MELCOR calculations which can

  9. PAPIRUS, a parallel computing framework for sensitivity analysis, uncertainty propagation, and estimation of parameter distribution

    International Nuclear Information System (INIS)

    Heo, Jaeseok; Kim, Kyung Doo

    2015-01-01

    Highlights: • We developed an interface between an engineering simulation code and statistical analysis software. • Multiple packages of sensitivity analysis, uncertainty quantification, and parameter estimation algorithms are implemented in the framework. • Parallel computing algorithms are also implemented in the framework to solve multiple computational problems simultaneously. - Abstract: This paper introduces a statistical data analysis toolkit, PAPIRUS, designed to perform model calibration, uncertainty propagation, Chi-square linearity testing, and sensitivity analysis for both linear and nonlinear problems. PAPIRUS was developed by implementing multiple methodology packages and building an interface between an engineering simulation code and the statistical analysis algorithms. A parallel computing framework is implemented in PAPIRUS with multiple computing resources and proper communication between the server and the clients of each processor. It was shown that even though a large amount of data is considered for the engineering calculation, the distributions of the model parameters and the calculation results can be quantified accurately with significant reductions in computational effort. A general description of PAPIRUS and its graphical user interface is presented in Section 2. Sections 2.1–2.5 present the methodologies of data assimilation, uncertainty propagation, Chi-square linearity testing, and sensitivity analysis implemented in the toolkit, with some results obtained by each module of the software. The parallel computing algorithms adopted in the framework to solve multiple computational problems simultaneously are also summarized in the paper

  10. PAPIRUS, a parallel computing framework for sensitivity analysis, uncertainty propagation, and estimation of parameter distribution

    Energy Technology Data Exchange (ETDEWEB)

    Heo, Jaeseok, E-mail: jheo@kaeri.re.kr; Kim, Kyung Doo, E-mail: kdkim@kaeri.re.kr

    2015-10-15

    Highlights: • We developed an interface between an engineering simulation code and statistical analysis software. • Multiple packages of sensitivity analysis, uncertainty quantification, and parameter estimation algorithms are implemented in the framework. • Parallel computing algorithms are also implemented in the framework to solve multiple computational problems simultaneously. - Abstract: This paper introduces a statistical data analysis toolkit, PAPIRUS, designed to perform model calibration, uncertainty propagation, Chi-square linearity testing, and sensitivity analysis for both linear and nonlinear problems. PAPIRUS was developed by implementing multiple methodology packages and building an interface between an engineering simulation code and the statistical analysis algorithms. A parallel computing framework is implemented in PAPIRUS with multiple computing resources and proper communication between the server and the clients of each processor. It was shown that even though a large amount of data is considered for the engineering calculation, the distributions of the model parameters and the calculation results can be quantified accurately with significant reductions in computational effort. A general description of PAPIRUS and its graphical user interface is presented in Section 2. Sections 2.1–2.5 present the methodologies of data assimilation, uncertainty propagation, Chi-square linearity testing, and sensitivity analysis implemented in the toolkit, with some results obtained by each module of the software. The parallel computing algorithms adopted in the framework to solve multiple computational problems simultaneously are also summarized in the paper.

  11. Development and application of objective uncertainty measures for nuclear power plant transient analysis [Dissertation 3897]

    Energy Technology Data Exchange (ETDEWEB)

    Vinai, P

    2007-10-15

    For the development, design and licensing of a nuclear power plant (NPP), a sound safety analysis is necessary to study the diverse physical phenomena involved in the system behaviour under operational and transient conditions. Such studies are based on detailed computer simulations. With the progress achieved in computer technology and the greater availability of experimental and plant data, the use of best estimate codes for safety evaluations has gained increasing acceptance. The application of best estimate safety analysis has raised new problems that need to be addressed: it has become more crucial to assess how reliable code predictions are, especially when they need to be compared against safety limits that must not be crossed. It becomes necessary to identify and quantify the various possible sources of uncertainty that affect the reliability of the results. Currently, such uncertainty evaluations are generally based on expert opinion. In the present research, a novel methodology based on a non-parametric statistical approach has been developed for the objective quantification of best-estimate code uncertainties related to the physical models used in the code. The basis is an evaluation of the accuracy of a given physical model, achieved by comparing its predictions with experimental data from an appropriate set of separate-effect tests. The differences between measurements and predictions can be considered stochastically distributed, and thus a statistical approach can be employed. The first step was the development of a procedure for investigating the dependence of a given physical model's accuracy on the experimental conditions. Each separate-effect test effectively provides a random sample of discrepancies between measurements and predictions, corresponding to a location in the state space defined by a certain number of independent system variables. As a consequence, the samples of 'errors', achieved from analysis of the entire

  12. Development and application of objective uncertainty measures for nuclear power plant transient analysis

    International Nuclear Information System (INIS)

    Vinai, P.

    2007-10-01

    For the development, design and licensing of a nuclear power plant (NPP), a sound safety analysis is necessary to study the diverse physical phenomena involved in the system behaviour under operational and transient conditions. Such studies are based on detailed computer simulations. With the progress achieved in computer technology and the greater availability of experimental and plant data, the use of best estimate codes for safety evaluations has gained increasing acceptance. The application of best estimate safety analysis has raised new problems that need to be addressed: it has become more crucial to assess how reliable code predictions are, especially when they need to be compared against safety limits that must not be crossed. It becomes necessary to identify and quantify the various possible sources of uncertainty that affect the reliability of the results. Currently, such uncertainty evaluations are generally based on expert opinion. In the present research, a novel methodology based on a non-parametric statistical approach has been developed for the objective quantification of best-estimate code uncertainties related to the physical models used in the code. The basis is an evaluation of the accuracy of a given physical model, achieved by comparing its predictions with experimental data from an appropriate set of separate-effect tests. The differences between measurements and predictions can be considered stochastically distributed, and thus a statistical approach can be employed. The first step was the development of a procedure for investigating the dependence of a given physical model's accuracy on the experimental conditions. Each separate-effect test effectively provides a random sample of discrepancies between measurements and predictions, corresponding to a location in the state space defined by a certain number of independent system variables. As a consequence, the samples of 'errors', achieved from analysis of the entire database, are

  13. Representing Uncertainty on Model Analysis Plots

    Science.gov (United States)

    Smith, Trevor I.

    2016-01-01

    Model analysis provides a mechanism for representing student learning as measured by standard multiple-choice surveys. The model plot contains information regarding both how likely students in a particular class are to choose the correct answer and how likely they are to choose an answer consistent with a well-documented conceptual model.…

  14. Uncertainty analysis of light water reactor unit fuel pin cells

    Energy Technology Data Exchange (ETDEWEB)

    Kamerow, S.; Ivanov, K., E-mail: sln107@PSU.EDU, E-mail: kni1@PSU.EDU [Department of Mechanical and Nuclear Engineering, The Pennsylvania State University, PA (United States); Moreno, C. Arenas, E-mail: cristina.arenas@UPC.EDU [Department of Physics and Nuclear Engineering, Technical University of Catalonia, Barcelona (Spain)

    2011-07-01

    The study explored the calculation of uncertainty based on available covariance data and computational tools. Uncertainties due to temperature changes and different fuel compositions are the main focus of this analysis. Selected unit fuel pin cells were analyzed according to the OECD LWR UAM benchmark specifications. Criticality and uncertainty analyses were performed using the TSUNAMI-1D sequence in SCALE 6.0. It was found that uncertainties increase with increasing temperature while keff decreases. This increase in the uncertainty is due to the increase in sensitivity of the largest contributor of uncertainty, namely the nuclide reaction 238U(n,γ). The sensitivity grew larger as the capture cross-section of 238U expanded due to Doppler broadening. In addition, three different fuel-cell compositions (UOx, MOx, and UOxGd2O3) were analyzed. They showed a remarkable increase in the uncertainty in keff for the MOx fuel cell and the UOxGd2O3 fuel cell. The increase in the uncertainty of keff in UOxGd2O3 fuel was nearly twice that in MOx fuel and almost four times that in UOx fuel. The components of the uncertainties in keff in each case were examined, and it was found that the neutron-nuclide reaction of 238U, mainly (n,n'), contributed the most to the uncertainties in the cases of MOx and UOxGd2O3. At higher energy, the covariance matrix of 238U(n,n') with 238U(n,n') and the 238U(n,n') cross-section showed very large values. Further examination of the UOxGd2O3 case found that 238U(n,n') became the dominant contributor to the uncertainty because most of the thermal neutrons in the cell were absorbed by gadolinium, thus shifting the neutron spectrum to higher energy. For the MOx case, on the other hand, 239Pu has a very strong absorption cross-section at low energy

  15. Uncertainty in river discharge observations: a quantitative analysis

    Directory of Open Access Journals (Sweden)

    G. Di Baldassarre

    2009-06-01

    This study proposes a framework for analysing and quantifying the uncertainty of river flow data. Such uncertainty is often considered negligible with respect to the other approximations affecting hydrological studies. In fact, given that river discharge data are usually obtained by means of the so-called rating curve method, a number of different sources of error affect the derived observations. These include: errors in the measurements of river stage and discharge used to parameterise the rating curve, interpolation and extrapolation error of the rating curve, the presence of unsteady flow conditions, and seasonal variations of the state of the vegetation (i.e. roughness). This study aims at analysing these sources of uncertainty using an original methodology. The novelty of the proposed framework lies in the estimation of rating curve uncertainty, which is based on hydraulic simulations. The latter are carried out on a reach of the Po River (Italy) by means of a one-dimensional (1-D) hydraulic model code (HEC-RAS). The results of the study show that errors in river flow data are indeed far from negligible.
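
    The propagation of stage-measurement and rating-curve errors to the derived discharge can be sketched with a small Monte Carlo experiment. The power-law curve and all numerical values below are illustrative assumptions, not the Po River data:

```python
import math
import random

def rating_curve(h, a=30.0, b=1.7, h0=0.2):
    # hypothetical power-law rating curve Q = a * (h - h0)^b
    return a * (h - h0) ** b

def discharge_uncertainty(h, h_sigma=0.02, a_rel_sigma=0.05,
                          n=20000, rng=random.Random(2)):
    """Monte Carlo propagation of stage error (h_sigma, in m) and
    rating-curve parameter error (a_rel_sigma, relative) to discharge."""
    qs = []
    for _ in range(n):
        h_sample = rng.gauss(h, h_sigma)
        a_sample = 30.0 * rng.gauss(1.0, a_rel_sigma)
        qs.append(rating_curve(h_sample, a=a_sample))
    mean = sum(qs) / n
    sd = math.sqrt(sum((q - mean) ** 2 for q in qs) / (n - 1))
    return mean, sd

mean_q, sd_q = discharge_uncertainty(2.0)  # stage of 2.0 m
```

    With these assumed error magnitudes the relative discharge uncertainty is already several percent, illustrating why such errors are far from negligible.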

  16. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    Science.gov (United States)

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. An analysis of the most important sources of variability of quantitative microbiological methods demonstrated no effect of the culture media and plate-count techniques on the estimation of microbial count, while a highly significant effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpreting errors) was established. The most appropriate method of statistical analysis of such data was ANOVA, which enabled not only the effect of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, which is appropriate for traditional plate count methods. Copyright © 2015 Elsevier B.V. All rights reserved.
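
    The one-way ANOVA used above compares between-factor variance with within-factor variance. A minimal sketch; the colony-count data for three hypothetical analysts are invented:

```python
def one_way_anova_f(groups):
    """F statistic: ratio of between-group to within-group mean squares."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g)
                    for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# hypothetical plate counts for three analysts reading the same sample;
# a large F indicates a significant analyst (reading/interpreting) effect
counts = [[52, 48, 55], [61, 58, 64], [50, 49, 53]]
f_stat = one_way_anova_f(counts)
```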

  17. Uncertainty on faecal analysis on dose assessment

    Energy Technology Data Exchange (ETDEWEB)

    Juliao, Ligia M.Q.C.; Melo, Dunstana R.; Sousa, Wanderson de O.; Santos, Maristela S.; Fernandes, Paulo Cesar P. [Instituto de Radioprotecao e Dosimetria, Comissao Nacional de Energia Nuclear, Av. Salvador Allende s/n. Via 9, Recreio, CEP 22780-160, Rio de Janeiro, RJ (Brazil)

    2007-07-01

    Monitoring programmes for internal dose assessment may need to combine bioassay techniques, e.g. urine and faecal analysis, especially in workplaces where compounds of different solubilities are handled and also in cases of accidental intakes. Faecal analysis may provide important data for the assessment of committed effective dose due to exposure to insoluble compounds, since the activity excreted in urine may not be detectable unless a very sensitive measurement system is available. This paper discusses the variability of the daily faecal excretion based on data from just one daily collection and from collection during three consecutive days, with samples analysed individually and as a pool. The results suggest that just one day's collection is not appropriate for dose assessment, since the 24 h uranium excretion may vary by a factor of 40. On the basis of this analysis, the recommendation should be faecal collection during three consecutive days, with the samples analysed as a pool, as this is more economical and faster. (authors)

  18. Sensitivity and Uncertainty Analysis for coolant void reactivity in a CANDU Fuel Lattice Cell Model

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Seung Yeol; Shim, Hyung Jin [Seoul National University, Seoul (Korea, Republic of)]

    2016-10-15

    In this study, the EPBM is implemented in the Seoul National University Monte Carlo (MC) code McCARD, which has a k-eigenvalue uncertainty evaluation capability based on the adjoint-weighted perturbation (AWP) method. The implementation is verified by comparing the sensitivities of the k-eigenvalue difference to the microscopic cross sections computed by the EPBM against those obtained by direct subtraction for the TMI-1 pin-cell problem. The uncertainty of the coolant void reactivity (CVR) in a CANDU fuel lattice model due to the ENDF/B-VII.1 covariance data is then calculated from the sensitivities estimated by the EPBM. The method based on the eigenvalue perturbation theory (EPBM) utilizes the first-order adjoint-weighted perturbation technique to estimate the sensitivity of the eigenvalue difference. Furthermore, this method can easily be applied in any S/U analysis code system equipped with an eigenvalue sensitivity calculation capability. The EPBM is implemented in the McCARD code and verified by showing good agreement with reference solutions. The McCARD S/U analysis has then been performed with the EPBM module for the CVR in the CANDU fuel lattice problem. It shows that the uncertainty contributions of nu of {sup 235}U and the gamma reaction of {sup 238}U are dominant.
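Verifying perturbation-based sensitivities against direct subtraction, as done here for the TMI-1 pin cell, can be mimicked on a toy one-group model (the model and cross-section values below are illustrative, not McCARD results):

```python
# One-group infinite-medium multiplication factor: k_inf = nu_sigma_f / sigma_a
def k_inf(nu_sigma_f, sigma_a):
    return nu_sigma_f / sigma_a

NSF, SA = 0.0050, 0.0042  # hypothetical macroscopic cross sections (1/cm)
k0 = k_inf(NSF, SA)

# Relative sensitivity S = (dk/k) / (dsigma/sigma) by direct subtraction;
# analytically S = +1 for nu_sigma_f and S = -1 for sigma_a in this model
eps = 1e-6
S_nsf = (k_inf(NSF * (1 + eps), SA) - k0) / k0 / eps
S_sa = (k_inf(NSF, SA * (1 + eps)) - k0) / k0 / eps
print(round(S_nsf, 3), round(S_sa, 3))  # 1.0 -1.0
```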

  19. Review of best estimate plus uncertainty methods of thermal-hydraulic safety analysis

    International Nuclear Information System (INIS)

    Prosek, A.; Mavko, B.

    2003-01-01

    In 1988 the United States Nuclear Regulatory Commission approved the revised rule on the acceptance of emergency core cooling system (ECCS) performance. Since then there has been significant interest in the development of codes and methodologies for best-estimate loss-of-coolant accident (LOCA) analyses. Several new best estimate plus uncertainty (BEPU) methods were developed around the world. The purpose of this paper is to review the developments in the direction of best estimate approaches with uncertainty quantification and to discuss the problems in practical applications of BEPU methods. In general, the licensee methods follow the original methods. The study indicated that uncertainty analysis with random sampling of input parameters and the use of order statistics for desired tolerance limits of output parameters is today a commonly accepted and mature approach. (author)
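The order-statistics approach mentioned in the conclusion is usually traced to Wilks' formula, which fixes the number of code runs needed for a one-sided tolerance limit; a sketch of the 95%/95% case yielding the well-known 59 runs:

```python
def wilks_sample_size(coverage=0.95, confidence=0.95):
    """Smallest n such that the largest of n random code runs bounds the
    `coverage` quantile of the output with the given one-sided confidence
    (first-order Wilks formula: 1 - coverage**n >= confidence)."""
    n = 1
    while 1.0 - coverage ** n < confidence:
        n += 1
    return n

print(wilks_sample_size(0.95, 0.95))  # 59 code runs for a 95%/95% limit
```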

  20. A Peep into the Uncertainty-Complexity-Relevance Modeling Trilemma through Global Sensitivity and Uncertainty Analysis

    Science.gov (United States)

    Munoz-Carpena, R.; Muller, S. J.; Chu, M.; Kiker, G. A.; Perz, S. G.

    2014-12-01

    Model complexity resulting from the need to integrate environmental system components cannot be overstated. In particular, additional emphasis is urgently needed on rational approaches to guide decision making through the uncertainties surrounding the integrated system across decision-relevant scales. However, in spite of the difficulties that the consideration of modeling uncertainty represents for the decision process, it should not be avoided, or the value and science behind the models will be undermined. These two issues, i.e., the need for coupled models that can answer the pertinent questions and the need for models that do so with sufficient certainty, are the key indicators of a model's relevance. Model relevance is inextricably linked with model complexity. Although model complexity has advanced greatly in recent years, there has been little work to rigorously characterize the threshold of relevance in integrated and complex models. Formally assessing the relevance of the model in the face of increasing complexity would be valuable because there is growing unease among developers and users of complex models about the cumulative effects of various sources of uncertainty on model outputs. In particular, this issue has prompted doubt over whether the considerable effort going into further elaborating complex models will in fact yield the expected payback. New approaches have been proposed recently to evaluate the uncertainty-complexity-relevance modeling trilemma (Muller, Muñoz-Carpena and Kiker, 2011) by incorporating state-of-the-art global sensitivity and uncertainty analysis (GSA/UA) in every step of model development, so as to quantify not only the uncertainty introduced by the addition of new environmental components, but also the effect that these new components have on existing components (interactions, non-linear responses). Outputs from the analysis can also be used to quantify system resilience (stability, alternative states, thresholds or tipping
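A variance-based GSA of the kind advocated here reduces, in its simplest form, to estimating first-order Sobol indices; a minimal pick-freeze sketch on a toy model with known indices (the model and sample size are illustrative):

```python
import random

random.seed(1)

def model(x1, x2):
    # Toy additive model; analytically S1 = 0.8 and S2 = 0.2
    return 2.0 * x1 + x2

N = 50000
A = [(random.random(), random.random()) for _ in range(N)]
B = [(random.random(), random.random()) for _ in range(N)]

yA = [model(*a) for a in A]
mean_yA = sum(yA) / N
var_yA = sum((y - mean_yA) ** 2 for y in yA) / N

def first_order_index(i):
    # Pick-freeze: evaluate B-points with coordinate i copied from A
    yABi = [model(*[a[j] if j == i else b[j] for j in range(2)])
            for a, b in zip(A, B)]
    cov = sum(ya * y for ya, y in zip(yA, yABi)) / N - mean_yA * (sum(yABi) / N)
    return cov / var_yA

print(round(first_order_index(0), 2), round(first_order_index(1), 2))
```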

  1. Information Synthesis in Uncertainty Studies: Application to the Analysis of the BEMUSE Results

    International Nuclear Information System (INIS)

    Baccou, J.; Chojnacki, E.; Destercke, S.

    2013-01-01

    To demonstrate that nuclear power plants are designed to respond safely to numerous postulated accidents, computer codes are used. The models in these computer codes are an approximation of the real physical behaviour occurring during an accident. Moreover, the data used to run these codes are also known with limited accuracy. Therefore the code predictions are not exact but uncertain. To deal with these uncertainties, 'best estimate' codes with 'best estimate' input data are used to obtain a best estimate calculation, and it is necessary to derive the uncertainty associated with their estimations. For this reason, regulatory authorities require in particular that technical safety organizations such as the French Institut de Radioprotection et de Surete Nucleaire (IRSN) provide results taking into account all the uncertainty sources, to assess whether safety quantities remain below critical values. Uncertainty analysis can be seen as a problem of information treatment, and a special effort on four methodological key issues has to be made. The first one is related to information modelling. In safety studies, one can distinguish two kinds of uncertainty. The first type, called aleatory uncertainty, is due to the natural variability of an observed phenomenon and cannot be reduced by the arrival of new information. The second type, called epistemic uncertainty, can arise from imprecision. Contrary to the previous one, this uncertainty can be reduced by increasing the state of knowledge. Performing a relevant information modelling therefore requires working with a mathematical formalism flexible enough to faithfully treat both types of uncertainty. The second one deals with information propagation through a computer code. It requires running the codes several times and is usually achieved through coupling to a statistical software package. The complexity of the propagation is strongly connected to the mathematical framework used for the information modelling. The more general the
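The aleatory/epistemic distinction drawn here is often handled numerically by nested (two-level) Monte Carlo propagation: an outer loop over states of knowledge, an inner loop over natural variability. A minimal sketch with an entirely hypothetical "code" and parameter ranges:

```python
import random

random.seed(2)

LIMIT = 3.0  # hypothetical safety limit on the computed response

def failure_probability(mu):
    # Inner (aleatory) loop: natural variability of the load for one
    # fixed state of knowledge about its mean
    n = 5000
    return sum(random.gauss(mu, 0.5) > LIMIT for _ in range(n)) / n

# Outer (epistemic) loop: the mean itself is only known to lie in an interval;
# new information could narrow the interval, i.e. reduce this uncertainty
p_estimates = [failure_probability(random.uniform(1.5, 2.5)) for _ in range(50)]
print(min(p_estimates), max(p_estimates))  # a band of probabilities, not one value
```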

  2. Code comparison for accelerator design and analysis

    International Nuclear Information System (INIS)

    Parsa, Z.

    1988-01-01

    We present a comparison between results obtained from standard accelerator physics codes used for the design and analysis of synchrotrons and storage rings, with the programs SYNCH, MAD, HARMON, PATRICIA, PATPET, BETA, DIMAD, MARYLIE and RACETRACK. In our analysis we have considered five lattices of various sizes with large and small bending angles, including the AGS Booster (10° bend), RHIC (2.24°), SXLS, XLS (XUV ring with 45° bend) and X-RAY rings. The differences in the integration methods used and in the treatment of the fringe fields in these codes could lead to different results. The inclusion of nonlinear (e.g., dipole) terms may be necessary in these calculations, especially for a small ring. 12 refs., 6 figs., 10 tabs

  3. Electromagnetic field and mechanical stress analysis code

    International Nuclear Information System (INIS)

    1978-01-01

    TEXMAGST is a two-stage linear finite element code for the analysis of static magnetic fields in three-dimensional structures and of the associated mechanical stresses produced by the J x B forces within these structures. The electromagnetic problem is solved in terms of the magnetic vector potential A for a given current density J as curl(1/μ curl A) = J, considering the magnetic permeability as constant. The Coulomb gauge (div A = 0) was chosen and was implemented through the use of Lagrange multipliers. The second stage of the problem, the calculation of mechanical stresses in the same three-dimensional structure, is solved by using the same code with a few modifications, through a restart card. Body forces J x B within each element are calculated from the solution of the first-stage run and represent the input to the second-stage run, which gives the solution of the stress problem

  4. Planning for robust reserve networks using uncertainty analysis

    Science.gov (United States)

    Moilanen, A.; Runge, M.C.; Elith, Jane; Tyre, A.; Carmel, Y.; Fegraus, E.; Wintle, B.A.; Burgman, M.; Ben-Haim, Y.

    2006-01-01

    Planning land-use for biodiversity conservation frequently involves computer-assisted reserve selection algorithms. Typically such algorithms operate on matrices of species presence-absence in sites, or on species-specific distributions of model-predicted probabilities of occurrence in grid cells. There are practically always errors in input data: erroneous species presence-absence data, structural and parametric uncertainty in predictive habitat models, and lack of correspondence between temporal presence and long-run persistence. Despite these uncertainties, typical reserve selection methods proceed as if there were no uncertainty in the data or models. Given two conservation options of apparently equal biological value, one would prefer the option whose value is relatively insensitive to errors in planning inputs. In this work we show how uncertainty analysis for reserve planning can be implemented within a framework of information-gap decision theory, generating reserve designs that are robust to uncertainty. Consideration of uncertainty involves modifications to the typical objective functions used in reserve selection. Search for robust-optimal reserve structures can still be implemented via typical reserve selection optimization techniques, including stepwise heuristics, integer programming and stochastic global search.
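A toy version of robust reserve selection in the info-gap spirit: degrade each predicted occurrence probability by a horizon of uncertainty before optimizing, so the chosen network maximizes worst-case value. All site data and the error bound below are hypothetical:

```python
# Hypothetical predicted occurrence probabilities (rows: sites, columns: species)
SITES = {
    "A": [0.9, 0.1, 0.4],
    "B": [0.2, 0.8, 0.6],
    "C": [0.6, 0.6, 0.6],
}
ALPHA = 0.2  # assumed worst-case error in every probability (info-gap horizon)
N_SPECIES = 3

def robust_value(selected):
    # Worst-case expected number of species represented: degrade each
    # probability by ALPHA, then combine across the selected sites
    value = 0.0
    for s in range(N_SPECIES):
        p_absent = 1.0
        for site in selected:
            p_absent *= 1.0 - max(SITES[site][s] - ALPHA, 0.0)
        value += 1.0 - p_absent
    return value

# Greedy forward selection of a two-site network under the robust objective
chosen = []
for _ in range(2):
    best = max((s for s in SITES if s not in chosen),
               key=lambda s: robust_value(chosen + [s]))
    chosen.append(best)
print(chosen, round(robust_value(chosen), 3))  # ['C', 'B'] 1.8
```

Note that without the degradation (ALPHA = 0) the greedy choice can differ, which is the point: robustness changes the optimal design.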

  5. Statistically based uncertainty assessments in nuclear risk analysis

    International Nuclear Information System (INIS)

    Spencer, F.W.; Diegert, K.V.; Easterling, R.G.

    1987-01-01

    Over the last decade, the problems of estimation and uncertainty assessment in probabilistic risk assessments (PRAs) have been addressed in a variety of NRC- and industry-sponsored projects. These problems have received attention because of a recognition that major uncertainties in risk estimation exist, which can be reduced by collecting more and better data and other information, and because of a recognition that better methods for assessing these uncertainties are needed. In particular, a clear understanding of the nature and magnitude of the various sources of uncertainty is needed to facilitate decision-making on possible plant changes and research options. Recent PRAs have employed methods of probability propagation, sometimes involving the use of Bayes' theorem, intended to formalize the use of ''engineering judgment'' or ''expert opinion.'' All sources of uncertainty are expressed probabilistically, so that uncertainty analysis becomes simply a matter of probability propagation. Alternatives to forcing a probabilistic framework at all stages of a PRA are a major concern of this paper, however

  6. Fire-accident analysis code (FIRAC) verification

    International Nuclear Information System (INIS)

    Nichols, B.D.; Gregory, W.S.; Fenton, D.L.; Smith, P.R.

    1986-01-01

    The FIRAC computer code predicts fire-induced transients in nuclear fuel cycle facility ventilation systems. FIRAC calculates simultaneously the gas-dynamic, material transport, and heat transport transients that occur in any arbitrarily connected network system subjected to a fire. The network system may include ventilation components such as filters, dampers, ducts, and blowers. These components are connected to rooms and corridors to complete the network for moving air through the facility. An experimental ventilation system has been constructed to verify FIRAC and other accident analysis codes. The design emphasizes network system characteristics and includes multiple chambers, ducts, blowers, dampers, and filters. A large industrial heater and a commercial dust feeder are used to inject thermal energy and aerosol mass. The facility is instrumented to measure volumetric flow rate, temperature, pressure, and aerosol concentration throughout the system. Aerosol release rates and mass accumulation on filters also are measured. We have performed a series of experiments in which a known rate of thermal energy is injected into the system. We then simulated these experiments with the FIRAC code. This paper compares and discusses the gas-dynamic and heat transport data obtained from the ventilation system experiments with those predicted by the FIRAC code. The numerically predicted data generally are within 10% of the experimental data

  7. Web interface for plasma analysis codes

    Energy Technology Data Exchange (ETDEWEB)

    Emoto, M. [National Institute for Fusion Science, 322-6 Oroshi, Toki, Gifu 509-5292 (Japan)], E-mail: emo@nifs.ac.jp; Murakami, S. [Kyoto University, Yoshida-Honmachi, Sakyo-ku, Kyoto 606-8501 (Japan); Yoshida, M.; Funaba, H.; Nagayama, Y. [National Institute for Fusion Science, 322-6 Oroshi, Toki, Gifu 509-5292 (Japan)

    2008-04-15

    There are many analysis codes that analyze various aspects of plasma physics. However, most of them are FORTRAN programs written to be run on supercomputers. On the other hand, many scientists use GUI (graphical user interface)-based operating systems. For those who are not familiar with supercomputers, it is a difficult task to run analysis codes on them, and they often hesitate to use these programs to substantiate their ideas. Furthermore, these analysis codes are written for personal use, and the programmers do not expect them to be run by other users. In order to make these programs widely usable, the authors developed user-friendly Web interfaces. Since the Web browser is one of the most common applications, it is convenient for both users and developers. In order to realize an interactive Web interface, the AJAX technique is widely used, and the authors also adopted AJAX. To build such an AJAX-based Web system, Ruby on Rails plays an important role. Since this application framework, which is written in Ruby, abstracts the Web interfaces necessary to implement AJAX and database functions, it enables programmers to efficiently develop Web-based applications. In this paper, the authors introduce the system and demonstrate the usefulness of this approach.

  8. Web interface for plasma analysis codes

    International Nuclear Information System (INIS)

    Emoto, M.; Murakami, S.; Yoshida, M.; Funaba, H.; Nagayama, Y.

    2008-01-01

    There are many analysis codes that analyze various aspects of plasma physics. However, most of them are FORTRAN programs written to be run on supercomputers. On the other hand, many scientists use GUI (graphical user interface)-based operating systems. For those who are not familiar with supercomputers, it is a difficult task to run analysis codes on them, and they often hesitate to use these programs to substantiate their ideas. Furthermore, these analysis codes are written for personal use, and the programmers do not expect them to be run by other users. In order to make these programs widely usable, the authors developed user-friendly Web interfaces. Since the Web browser is one of the most common applications, it is convenient for both users and developers. In order to realize an interactive Web interface, the AJAX technique is widely used, and the authors also adopted AJAX. To build such an AJAX-based Web system, Ruby on Rails plays an important role. Since this application framework, which is written in Ruby, abstracts the Web interfaces necessary to implement AJAX and database functions, it enables programmers to efficiently develop Web-based applications. In this paper, the authors introduce the system and demonstrate the usefulness of this approach

  9. Geological-structural models used in SR 97. Uncertainty analysis

    Energy Technology Data Exchange (ETDEWEB)

    Saksa, P.; Nummela, J. [FINTACT Oy (Finland)]

    1998-10-01

    The uncertainty of geological-structural models was studied for the three sites in SR 97, called Aberg, Beberg and Ceberg. The evaluation covered both regional and site scale models, the emphasis being placed on fracture zones at the site scale. Uncertainty is a natural feature of all geoscientific investigations. It originates from measurements (errors in data, sampling limitations, scale variation) and conceptualisation (structural geometries and properties, ambiguous geometric or parametric solutions), to name the major sources. The structures of A-, B- and Ceberg are fracture zones of varying types. No major differences in the conceptualisation between the sites were noted. One source of uncertainty in the site models is the non-existence of fracture and zone information at the scale from 10 to 300 - 1000 m. At Aberg the development of the regional model has been performed very thoroughly. At the site scale, one major source of uncertainty is that a clear definition of the target area is missing. Structures encountered in the boreholes are well explained, and an interdisciplinary approach to interpretation has been taken. The Beberg and Ceberg regional models contain relatively large uncertainties due to the investigation methodology and experience available at that time. At the site scale, six additional structures were proposed for both Beberg and Ceberg for variant analysis of these sites. Both sites include uncertainty in the form of many non-interpreted fractured sections along the boreholes. Statistical analysis gives high occurrences of structures for all three sites: typically 20 - 30 structures/km{sup 3}. Aberg has the highest structural frequency, Beberg comes next and Ceberg has the lowest. The borehole configuration, orientations and surveying goals were inspected to find whether preferences or factors causing bias were present. Data from Aberg support the conclusion that the Aespoe subvolume would be an anomalously fractured, tectonised unit of its own. This means that

  10. Geological-structural models used in SR 97. Uncertainty analysis

    International Nuclear Information System (INIS)

    Saksa, P.; Nummela, J.

    1998-10-01

    The uncertainty of geological-structural models was studied for the three sites in SR 97, called Aberg, Beberg and Ceberg. The evaluation covered both regional and site scale models, the emphasis being placed on fracture zones at the site scale. Uncertainty is a natural feature of all geoscientific investigations. It originates from measurements (errors in data, sampling limitations, scale variation) and conceptualisation (structural geometries and properties, ambiguous geometric or parametric solutions), to name the major sources. The structures of A-, B- and Ceberg are fracture zones of varying types. No major differences in the conceptualisation between the sites were noted. One source of uncertainty in the site models is the non-existence of fracture and zone information at the scale from 10 to 300 - 1000 m. At Aberg the development of the regional model has been performed very thoroughly. At the site scale, one major source of uncertainty is that a clear definition of the target area is missing. Structures encountered in the boreholes are well explained, and an interdisciplinary approach to interpretation has been taken. The Beberg and Ceberg regional models contain relatively large uncertainties due to the investigation methodology and experience available at that time. At the site scale, six additional structures were proposed for both Beberg and Ceberg for variant analysis of these sites. Both sites include uncertainty in the form of many non-interpreted fractured sections along the boreholes. Statistical analysis gives high occurrences of structures for all three sites: typically 20 - 30 structures/km³. Aberg has the highest structural frequency, Beberg comes next and Ceberg has the lowest. The borehole configuration, orientations and surveying goals were inspected to find whether preferences or factors causing bias were present. Data from Aberg support the conclusion that the Aespoe subvolume would be an anomalously fractured, tectonised unit of its own. This means that the

  11. Two-dimensional cross-section sensitivity and uncertainty analysis of the LBM [Lithium Blanket Module] experiments at LOTUS

    International Nuclear Information System (INIS)

    Davidson, J.W.; Dudziak, D.J.; Pelloni, S.; Stepanek, J.

    1988-01-01

    In a recent joint Los Alamos/PSI effort, a sensitivity and nuclear data uncertainty path for the modular code system AARE (Advanced Analysis for Reactor Engineering) was developed. This path includes the cross-section code TRAMIX, the one-dimensional finite difference S_N-transport code ONEDANT, the two-dimensional finite element S_N-transport code TRISM, and the one- and two-dimensional sensitivity and nuclear data uncertainty code SENSIBL. Within the framework of the present work, a complete set of forward and adjoint two-dimensional TRISM calculations was performed for the bare, as well as for the Pb- and Be-preceded, LBM using MATXS8 libraries. Then a two-dimensional sensitivity and uncertainty analysis was performed for all cases. The goal of this analysis was the determination of the uncertainties of the calculated tritium production per source neutron from lithium along the central Li₂O rod in the LBM. Considered were the contributions from ¹H, ⁶Li, ⁷Li, ⁹Be, natC, ¹⁴N, ¹⁶O, ²³Na, ²⁷Al, natSi, natCr, natFe, natNi, and natPb. 22 refs., 1 fig., 3 tabs
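Propagating cross-section covariances through sensitivities to a response uncertainty, as SENSIBL does, follows the first-order "sandwich" rule var(R) = Sᵀ C S; a sketch with an illustrative 3x3 relative covariance matrix (the numbers are not from this analysis):

```python
import math

S = [0.8, 0.3, -0.1]          # hypothetical relative sensitivity profile
C = [[0.0025, 0.0005, 0.0],   # hypothetical relative covariance matrix
     [0.0005, 0.0100, 0.0],
     [0.0,    0.0,    0.0036]]

# Sandwich rule: relative variance of the response (e.g. tritium production)
rel_var = sum(S[i] * C[i][j] * S[j] for i in range(3) for j in range(3))
print(f"relative standard uncertainty: {math.sqrt(rel_var):.1%}")  # about 5.3%
```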

  12. Introduction of thermal-hydraulic analysis code and system analysis code for HTGR

    International Nuclear Information System (INIS)

    Tanaka, Mitsuhiro; Izaki, Makoto; Koike, Hiroyuki; Tokumitsu, Masashi

    1984-01-01

    Kawasaki Heavy Industries Ltd. has advanced the development and systematization of analysis codes, aiming to line up analysis codes for heat-transfer flow and control characteristics, taking HTGR plants as the main object. In order to model the flow when shock waves propagate into heating tubes, SALE-3D, which can analyze a complex system, was developed; it is therefore reported in this paper. Concerning the analysis code for control characteristics, the method of sensitivity analysis in a topological space is reported, including an example of application. The flow analysis code SALE-3D analyzes the flow of compressible viscous fluid in a three-dimensional system over the velocity range from the incompressibility limit to supersonic velocity. The fundamental equations and algorithm of SALE-3D, the calculation of cell volume, the plotting of perspective drawings, and the analysis of the three-dimensional behavior of shock waves propagating in heating tubes after a rupture accident are described. The method of sensitivity analysis was added to the analysis code for control characteristics in a topological space, and blow-down phenomena were analyzed by its application. (Kako, I.)

  13. Two-dimensional cross-section sensitivity and uncertainty analysis of the LBM experience at LOTUS

    International Nuclear Information System (INIS)

    Davidson, J.W.; Dudziak, D.J.; Pelloni, S.; Stepanek, J.

    1989-01-01

    In recent years, the LOTUS fusion blanket facility at IGA-EPF in Lausanne provided a series of irradiation experiments with the Lithium Blanket Module (LBM). The LBM has both realistic fusion blanket materials and a realistic configuration. It is approximately an 80-cm cube, and the breeding material is Li₂O. Using as the D-T neutron source the Haefely Neutron Generator (HNG), with an intensity of about 5·10¹² n/s, a series of experiments with the bare LBM as well as with the LBM preceded by Pb, Be and ThO₂ multipliers was carried out. In a recent joint Los Alamos/PSI effort, a sensitivity and nuclear data uncertainty path for the modular code system AARE (Advanced Analysis for Reactor Engineering) was developed. This path includes the cross-section code TRAMIX, the one-dimensional finite difference S_N-transport code ONEDANT, the two-dimensional finite element S_N-transport code TRISM, and the one- and two-dimensional sensitivity and nuclear data uncertainty code SENSIBL. For the nucleonic transport calculations, three 187-neutron-group libraries are presently available: MATXS8A and MATXS8F, based on ENDF/B-V evaluations, and MAT187, based on JEF/EFF evaluations. COVFILS-2, a 74-group library of neutron cross sections, scattering matrices and covariances, is the data source for SENSIBL; the 74-group structure of COVFILS-2 is a subset of the Los Alamos 187-group structure. Within the framework of the present work, a complete set of forward and adjoint two-dimensional TRISM calculations was performed both for the bare, as well as for the Pb- and Be-preceded, LBM using MATXS8 libraries. Then a two-dimensional sensitivity and uncertainty analysis for all cases was performed

  14. Uncertainty and sensitivity analysis of fission gas behavior in engineering-scale fuel modeling

    Energy Technology Data Exchange (ETDEWEB)

    Pastore, Giovanni, E-mail: Giovanni.Pastore@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Swiler, L.P., E-mail: LPSwile@sandia.gov [Optimization and Uncertainty Quantification, Sandia National Laboratories, P.O. Box 5800, Albuquerque, NM 87185-1318 (United States); Hales, J.D., E-mail: Jason.Hales@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Novascone, S.R., E-mail: Stephen.Novascone@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Perez, D.M., E-mail: Danielle.Perez@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Spencer, B.W., E-mail: Benjamin.Spencer@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Luzzi, L., E-mail: Lelio.Luzzi@polimi.it [Politecnico di Milano, Department of Energy, Nuclear Engineering Division, via La Masa 34, I-20156 Milano (Italy); Van Uffelen, P., E-mail: Paul.Van-Uffelen@ec.europa.eu [European Commission, Joint Research Centre, Institute for Transuranium Elements, Hermann-von-Helmholtz-Platz 1, D-76344 Karlsruhe (Germany); Williamson, R.L., E-mail: Richard.Williamson@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States)

    2015-01-15

    The role of uncertainties in fission gas behavior calculations as part of engineering-scale nuclear fuel modeling is investigated using the BISON fuel performance code with a recently implemented physics-based model for fission gas release and swelling. Through the integration of BISON with the DAKOTA software, a sensitivity analysis of the results to selected model parameters is carried out based on UO{sub 2} single-pellet simulations covering different power regimes. The parameters are varied within ranges representative of the relative uncertainties and consistent with the information in the open literature. The study leads to an initial quantitative assessment of the uncertainty in fission gas behavior predictions with the parameter characterization presently available. Also, the relative importance of the individual parameters is evaluated. Moreover, a sensitivity analysis is carried out based on simulations of a fuel rod irradiation experiment, pointing out a significant impact of the considered uncertainties on the calculated fission gas release and cladding diametral strain. The results of the study indicate that the commonly accepted deviation between calculated and measured fission gas release by a factor of 2 approximately corresponds to the inherent modeling uncertainty at high fission gas release. Nevertheless, significantly higher deviations may be expected for values around 10% and lower. Implications are discussed in terms of directions of research for the improved modeling of fission gas behavior for engineering purposes.

  15. Error Analysis of CM Data Products Sources of Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Hunt, Brian D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Eckert-Gallup, Aubrey Celia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Cochran, Lainy Dromgoole [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kraus, Terrence D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Allen, Mark B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Beal, Bill [National Security Technologies, Joint Base Andrews, MD (United States); Okada, Colin [National Security Technologies, LLC. (NSTec), Las Vegas, NV (United States); Simpson, Mathew [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-02-01

    The goal of this project is to address the current inability to assess the overall error and uncertainty of data products developed and distributed by DOE's Consequence Management (CM) Program. This is a widely recognized shortfall, the resolution of which would provide a great deal of value and defensibility to the analysis results, data products, and the decision-making process that follows this work. A global approach to this problem is necessary because multiple sources of error and uncertainty contribute to the ultimate production of CM data products. Therefore, this project will require collaboration with subject matter experts across a wide range of FRMAC skill sets in order to quantify the types of uncertainty that each area of the CM process might contain and to understand how variations in these uncertainty sources contribute to the aggregated uncertainty present in CM data products. The ultimate goal of this project is to quantify the confidence level of CM products to ensure that appropriate public and worker protection decisions are supported by defensible analysis.

  16. Operational reactor physics analysis codes (ORPAC)

    International Nuclear Information System (INIS)

    Kumar, Jainendra; Singh, K.P.; Singh, Kanchhi

    2007-07-01

    For efficient, smooth and safe operation of a nuclear research reactor, many reactor physics evaluations are regularly required. As part of reactor core management, the important activities are maintaining core reactivity status, core power distribution and xenon estimations, safety evaluation of in-pile irradiation samples and experimental assemblies, and assessment of nuclear safety in fuel handling and storage. In-pile irradiation of samples requires a prior estimation of the reactivity load due to the sample, the heating rate, and the activity developed in it during irradiation. For the safety of personnel handling irradiated samples, the dose rate at the surface of the shielded flask housing the irradiated sample should be less than 200 mR/h. Therefore, proper shielding and radioactive cooling of the irradiated sample are required to meet this requirement. Knowledge of the xenon load variation with time (startup curve) helps in estimating the xenon override time. Monitoring of power in individual fuel channels during reactor operation is essential to detect any abnormal power distribution and avoid unsafe situations. The complexity of estimating the above-mentioned reactor parameters and the frequency with which they are required compel one to use computer codes to avoid possible human errors. For efficient and quick evaluation of parameters related to reactor operations, such as xenon load, critical moderator height, and nuclear heating and reactivity load of isotope samples/experimental assemblies, the computer code ORPAC (Operational Reactor Physics Analysis Codes) has been developed. This code is being used for regular assessment of reactor physics parameters in Dhruva and Cirus. The code ORPAC, written in the Visual Basic 6.0 environment, incorporates several important operational reactor physics aspects on a single platform, with graphical user interfaces (GUI) to make it more user-friendly and presentable. (author)
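The xenon startup/shutdown curve mentioned above comes from the coupled I-135/Xe-135 decay chain; after shutdown, the closed-form solution shows the characteristic xenon peak before decay. A sketch with normalized, hypothetical pre-shutdown inventories:

```python
import math

# Decay constants (1/h) from the ~6.6 h I-135 and ~9.1 h Xe-135 half-lives
LAMBDA_I = math.log(2) / 6.57
LAMBDA_X = math.log(2) / 9.14

def xenon_after_shutdown(t, i0=1.0, x0=0.3):
    """Relative Xe-135 inventory t hours after shutdown, given pre-shutdown
    iodine (i0) and xenon (x0) inventories in normalized units."""
    growth = LAMBDA_I / (LAMBDA_I - LAMBDA_X)
    return (x0 * math.exp(-LAMBDA_X * t)
            + i0 * growth * (math.exp(-LAMBDA_X * t) - math.exp(-LAMBDA_I * t)))

levels = [xenon_after_shutdown(t) for t in range(48)]
peak_hour = levels.index(max(levels))
print(peak_hour)  # xenon load peaks several hours after shutdown, then decays
```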

  17. Sensitivity/uncertainty analysis of a borehole scenario comparing Latin Hypercube Sampling and deterministic sensitivity approaches

    International Nuclear Information System (INIS)

    Harper, W.V.; Gupta, S.K.

    1983-10-01

A computer code was used to study steady-state flow for a hypothetical borehole scenario. The model consists of three coupled equations with only eight parameters and three dependent variables. This study focused on steady-state flow as the performance measure of interest. Two different approaches to sensitivity/uncertainty analysis were used on this code. One approach, based on Latin Hypercube Sampling (LHS), is a statistical sampling method, whereas the second approach is based on the deterministic evaluation of sensitivities. The LHS technique is easy to apply and should work well for codes with a moderate number of parameters. Of the deterministic techniques, the direct method is preferred when there are many performance measures of interest and a moderate number of parameters. The adjoint method is recommended when there are a limited number of performance measures and an unlimited number of parameters. This unlimited-parameter capability can be extremely useful for finite element or finite difference codes with a large number of grid blocks. The Office of Nuclear Waste Isolation will use the technique most appropriate for an individual situation. For example, the adjoint method may be used to reduce the scope to a size that can be readily handled by a technique such as LHS. Other techniques for sensitivity/uncertainty analysis, e.g., kriging followed by conditional simulation, will be used also. 15 references, 4 figures, 9 tables
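As a concrete illustration of the statistical approach, the sketch below generates a Latin Hypercube design on the unit hypercube and scales it to parameter bounds. The eight-parameter shape echoes the borehole problem, but the bounds are placeholders, not values from the study.

```python
import random

def latin_hypercube(n_samples, n_params, rng=random):
    """Latin Hypercube Sampling on the unit hypercube: each parameter's
    [0, 1) range is split into n_samples equal strata, one value is drawn
    per stratum, and the strata are shuffled independently per parameter
    so the pairing between parameters is random."""
    columns = []
    for _ in range(n_params):
        col = [(i + rng.random()) / n_samples for i in range(n_samples)]
        rng.shuffle(col)
        columns.append(col)
    return [list(row) for row in zip(*columns)]   # one row per sample point

# Scale a small design to hypothetical bounds for an 8-parameter problem
# (the bounds are placeholders, not values from the study).
bounds = [(0.05, 0.15)] * 8
scaled = [[lo + u * (hi - lo) for u, (lo, hi) in zip(row, bounds)]
          for row in latin_hypercube(8, 8)]
```

Each code run then uses one row of `scaled` as its input vector, so every stratum of every parameter is exercised exactly once.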

  18. Development of disruption thermal analysis code DREAM

    Energy Technology Data Exchange (ETDEWEB)

    Yamazaki, Seiichiro; Kobayahsi, Takeshi [Kawasaki Heavy Industries Ltd., Kobe (Japan); Seki, Masahiro

    1989-07-01

When a plasma disruption takes place in a tokamak-type fusion reactor, plasma-facing components such as the first wall and divertor/limiter are subjected to an intense heat load of short duration. At the surface of the wall, the temperature rises rapidly, and melting and evaporation occur. This causes reduction of the wall thickness and crack initiation/propagation. As the lifetime of the components is significantly affected by these effects, transient analysis taking account of phase changes and radiation heat loss is required in the design of these components. This paper describes the computer code DREAM, developed to perform the disruption thermal analysis, taking phase changes and radiation into account. (author).

  19. Development of disruption thermal analysis code DREAM

    International Nuclear Information System (INIS)

    Yamazaki, Seiichiro; Kobayahsi, Takeshi; Seki, Masahiro.

    1989-01-01

When a plasma disruption takes place in a tokamak-type fusion reactor, plasma-facing components such as the first wall and divertor/limiter are subjected to an intense heat load of short duration. At the surface of the wall, the temperature rises rapidly, and melting and evaporation occur. This causes reduction of the wall thickness and crack initiation/propagation. As the lifetime of the components is significantly affected by these effects, transient analysis taking account of phase changes and radiation heat loss is required in the design of these components. This paper describes the computer code DREAM, developed to perform the disruption thermal analysis, taking phases changes and radiation into account. (author)
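The surface-heating problem that DREAM addresses can be illustrated with a minimal explicit finite-difference sketch. This is not the DREAM code itself: phase change and radiation heat loss are omitted, and the material data are placeholder, tungsten-like values.

```python
# Minimal 1-D explicit finite-difference sketch of transient surface
# heating during a disruption-like pulse. NOT the DREAM code: phase
# change and radiation loss are omitted; material data are placeholders.

def surface_heating(q_surf=1.0e9, duration=1.0e-3, depth=0.01, nx=50):
    """Temperature profile (K) across a wall of given depth (m) after a
    heat pulse q_surf (W/m^2) of the given duration (s)."""
    k, rho, cp = 100.0, 19000.0, 140.0      # conductivity, density, heat capacity
    alpha = k / (rho * cp)                  # thermal diffusivity (m^2/s)
    dx = depth / nx
    dt = 0.4 * dx * dx / alpha              # satisfies the explicit stability limit
    T = [300.0] * nx                        # uniform initial temperature (K)
    t = 0.0
    while t < duration:
        Tn = T[:]
        for i in range(1, nx - 1):          # interior conduction update (r = 0.4)
            Tn[i] = T[i] + 0.4 * (T[i + 1] - 2.0 * T[i] + T[i - 1])
        Tn[0] = Tn[1] + q_surf * dx / k     # imposed heat flux at the heated surface
        Tn[-1] = 300.0                      # far side held at the coolant temperature
        T = Tn
        t += dt
    return T

T = surface_heating()    # surface node T[0] rises far above the initial 300 K
```

A production analysis would add the latent heat of melting and a T^4 radiation loss term at the surface, which is exactly the extension the paper describes.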

  20. Acquired experience on organizing 3D S.UN.COP: international course to support nuclear license by user training in the areas of scaling, uncertainty, and 3D thermal-hydraulics/neutron-kinetics coupled codes

    Energy Technology Data Exchange (ETDEWEB)

    Petruzzi, Alessandro; D' Auria, Francesco [University of Pisa, San Piero a Grado (Italy). Nuclear Research Group San Piero a Grado (GRNSPG); Galetti, Regina, E-mail: regina@cnen.gov.b [National Commission for Nuclear Energy (CNEN), Rio de Janeiro, RJ (Brazil); Bajs, Tomislav [University of Zagreb (Croatia). Fac. of Electrical Engineering and Computing. Dept. of Power Systems; Reventos, Francesc [Technical University of Catalonia, Barcelona (Spain). Dept. of Physics and Nuclear Engineering

    2011-07-01

Thermal-hydraulic system computer codes are extensively used worldwide for analysis of nuclear facilities by utilities, regulatory bodies, nuclear power plant designers, vendors, and research organizations. The computer code user represents a source of uncertainty that may significantly affect the results of system code calculations. Code user training and qualification represent an effective means for reducing the variation of results caused by the application of the codes by different users. This paper describes the experience in applying a systematic approach to training code users who, upon completion of the training, should be able to perform calculations making the best possible use of the capabilities of best estimate codes. In addition, this paper presents the organization and the main features of the 3D S.UN.COP (scaling, uncertainty, and 3D coupled code calculations) seminars, during which particular emphasis is given to practical applications in connection with the licensing process of best estimate plus uncertainty methodologies, showing the designer, utility and regulatory approaches. (author)

  1. Acquired experience on organizing 3D S.UN.COP: international course to support nuclear license by user training in the areas of scaling, uncertainty, and 3D thermal-hydraulics/neutron-kinetics coupled codes

    International Nuclear Information System (INIS)

    Petruzzi, Alessandro; D'Auria, Francesco; Galetti, Regina; Bajs, Tomislav; Reventos, Francesc

    2011-01-01

Thermal-hydraulic system computer codes are extensively used worldwide for analysis of nuclear facilities by utilities, regulatory bodies, nuclear power plant designers, vendors, and research organizations. The computer code user represents a source of uncertainty that may significantly affect the results of system code calculations. Code user training and qualification represent an effective means for reducing the variation of results caused by the application of the codes by different users. This paper describes the experience in applying a systematic approach to training code users who, upon completion of the training, should be able to perform calculations making the best possible use of the capabilities of best estimate codes. In addition, this paper presents the organization and the main features of the 3D S.UN.COP (scaling, uncertainty, and 3D coupled code calculations) seminars, during which particular emphasis is given to practical applications in connection with the licensing process of best estimate plus uncertainty methodologies, showing the designer, utility and regulatory approaches. (author)

  2. CONTEMPT-DG containment analysis code

    International Nuclear Information System (INIS)

    Deem, R.E.; Rousseau, K.

    1982-01-01

    The assessment of hydrogen burning in a containment building during a degraded core event requires a knowledge of various system responses. These system responses (i.e. heat sinks, fan cooler units, sprays, etc.) can have a marked effect on the overall containment integrity results during a hydrogen burn. In an attempt to properly handle the various system responses and still retain the capability to perform sensitivity analysis on various parameters, the CONTEMPT-DG computer code was developed. This paper will address the historical development of the code, its various features, and the rationale for its development. Comparisons between results from the CONTEMPT-DG analyses and results from similar MARCH analyses will also be given

  3. BEMUSE Phase III Report - Uncertainty and Sensitivity Analysis of the LOFT L2-5 Test

    International Nuclear Information System (INIS)

    Bazin, P.; Crecy, A. de; Glaeser, H.; Skorek, T.; Joucla, J.; Probst, P.; Chung, B.; Oh, D.Y.; Kyncl, M.; Pernica, R.; Macek, J.; Meca, R.; Macian, R.; D'Auria, F.; Petruzzi, A.; Perez, M.; Reventos, F.; Fujioka, K.

    2007-02-01

This report summarises the contributions of ten participants to Phase 3 of BEMUSE: uncertainty and sensitivity analyses of the LOFT L2-5 experiment, a Large-Break Loss-of-Coolant Accident (LB-LOCA). For this phase, precise step-by-step requirements were provided to the participants. Four main parts are defined: 1. List and uncertainties of the uncertain input parameters. 2. Uncertainty analysis results. 3. Sensitivity analysis results. 4. Improved methods and assessment of the methods (optional). The 5% and 95% percentiles have to be estimated for six output parameters of two kinds: 1. Scalar output parameters (first Peak Cladding Temperature (PCT), second Peak Cladding Temperature, time of accumulator injection, time of complete quenching); 2. Time-trend output parameters (maximum cladding temperature, upper plenum pressure). The main lessons learnt from Phase 3 of the BEMUSE programme are the following: - For uncertainty analysis, all the participants use a probabilistic method associated with Wilks' formula, except UNIPI with its CIAU method (Code with the capability of Internal Assessment of Uncertainty). Both methods have been successfully mastered. - Compared with the experiment, the results of the uncertainty analysis are good on the whole. For example, for the cladding-temperature-type output parameters (first PCT, second PCT, time of complete quenching, maximum cladding temperature), 8 participants out of 10 find upper and lower bounds which envelop the experimental data. - Sensitivity analysis has been successfully performed by all the participants using the probabilistic method. All the influence measures used take into account the range of variation of the input parameters. Synthesis tables of the most influential phenomena and parameters have been produced, and participants will be able to use them in the continuation of the BEMUSE programme
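The run counts implied by Wilks' formula can be reproduced directly: for a one-sided 95th-percentile/95%-confidence statement, the first-order formula requires the smallest N with 1 - 0.95**N >= 0.95, which gives 59 code runs. A sketch:

```python
def wilks_sample_size(quantile=0.95, confidence=0.95):
    """Smallest number of code runs N such that the largest of N
    independent runs bounds the given quantile with the given one-sided
    confidence (first-order Wilks: 1 - quantile**N >= confidence)."""
    n = 1
    while 1.0 - quantile ** n < confidence:
        n += 1
    return n

print(wilks_sample_size())            # 59 runs for the 95%/95% statement
print(wilks_sample_size(0.95, 0.99))  # 90 runs at 99% confidence
```

This is why probabilistic uncertainty analyses of this kind typically report 59 (or, with higher-order statistics, 93 or more) code runs regardless of the number of uncertain input parameters.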

  4. Estimating annual bole biomass production using uncertainty analysis

    Science.gov (United States)

    Travis J. Woolley; Mark E. Harmon; Kari B. O' Connell

    2007-01-01

    Two common sampling methodologies coupled with a simple statistical model were evaluated to determine the accuracy and precision of annual bole biomass production (BBP) and inter-annual variability estimates using this type of approach. We performed an uncertainty analysis using Monte Carlo methods in conjunction with radial growth core data from trees in three Douglas...
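A Monte Carlo propagation of this kind can be sketched as follows; the allometric equation and its coefficients are hypothetical stand-ins, not the model fitted in the study.

```python
import random
import statistics

def mc_bole_production(diam_cm, increment_cm, inc_sd_cm, n=5000, rng=random):
    """Monte Carlo propagation of radial-increment uncertainty through a
    hypothetical allometric bole-biomass model B = a * D**b (kg); returns
    the (mean, standard deviation) of annual bole biomass production."""
    a, b = 0.06, 2.6                                # placeholder coefficients
    draws = []
    for _ in range(n):
        inc = rng.gauss(increment_cm, inc_sd_cm)    # uncertain annual radial increment
        d0, d1 = diam_cm, diam_cm + 2.0 * inc       # diameter before/after growth
        draws.append(a * d1 ** b - a * d0 ** b)     # biomass gained this year
    return statistics.mean(draws), statistics.stdev(draws)

mean_bbp, sd_bbp = mc_bole_production(40.0, 0.25, 0.05)
```

Repeating the draw across trees and years yields the distribution of stand-level annual production, from which inter-annual variability and confidence bounds follow.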

  5. Model Uncertainty and Robustness: A Computational Framework for Multimodel Analysis

    Science.gov (United States)

    Young, Cristobal; Holsteen, Katherine

    2017-01-01

    Model uncertainty is pervasive in social science. A key question is how robust empirical results are to sensible changes in model specification. We present a new approach and applied statistical software for computational multimodel analysis. Our approach proceeds in two steps: First, we estimate the modeling distribution of estimates across all…

  6. An educational model for ensemble streamflow simulation and uncertainty analysis

    Directory of Open Access Journals (Sweden)

    A. AghaKouchak

    2013-02-01

This paper presents the hands-on modeling toolbox HBV-Ensemble, designed as a complement to theoretical hydrology lectures, to teach hydrological processes and their uncertainties. HBV-Ensemble can be used for in-class lab practice, homework assignments, and assessment of students' understanding of hydrological processes. Using this modeling toolbox, students can gain more insight into how hydrological processes (e.g., precipitation, snowmelt and snow accumulation, soil moisture, evapotranspiration and runoff generation) are interconnected. The educational toolbox includes a MATLAB Graphical User Interface (GUI) and an ensemble simulation scheme that can be used for teaching uncertainty analysis, parameter estimation, ensemble simulation and model sensitivity. HBV-Ensemble was administered in a class for both in-class instruction and a final project, and students submitted their feedback about the toolbox. The results indicate that this educational software had a positive impact on students' understanding and knowledge of uncertainty in hydrological modeling.

  7. Statistical uncertainty analysis of radon transport in nonisothermal, unsaturated soils

    International Nuclear Information System (INIS)

    Holford, D.J.; Owczarski, P.C.; Gee, G.W.; Freeman, H.D.

    1990-10-01

To accurately predict radon fluxes from soils to the atmosphere, we must know more than the radium content of the soil. Radon flux from soil is affected not only by soil properties, but also by meteorological factors such as air pressure and temperature changes at the soil surface, as well as the infiltration of rainwater. Natural variations in meteorological factors and soil properties contribute to uncertainty in subsurface model predictions of radon flux, which, when coupled with a building transport model, will also add uncertainty to predictions of radon concentrations in homes. A statistical uncertainty analysis using our Rn3D finite-element numerical model was conducted to assess the relative importance of the meteorological factors and the soil properties affecting radon transport. 10 refs., 10 figs., 3 tabs

  8. Decision analysis of shoreline protection under climate change uncertainty

    Science.gov (United States)

    Chao, Philip T.; Hobbs, Benjamin F.

    1997-04-01

If global warming occurs, it could significantly affect water resource distribution and availability. Yet it is unclear whether the prospect of such change is relevant to water resources management decisions being made today. We model a shoreline protection decision problem with a stochastic dynamic program (SDP) to determine whether consideration of the possibility of climate change would alter the decision. Three questions are addressed with the SDP: (1) How important is climate change compared to other uncertainties? (2) What is the economic loss if climate change uncertainty is ignored? (3) How does belief in climate change affect the timing of the decision? In the case study, sensitivity analysis shows that uncertainty in real discount rates has a stronger effect upon the decision than belief in climate change. Nevertheless, a strong belief in climate change makes the shoreline protection project less attractive and often alters the decision to build it.

  9. Application of uncertainty analysis method for calculations of accident conditions for RP AES-2006

    International Nuclear Information System (INIS)

    Zajtsev, S.I.; Bykov, M.A.; Zakutaev, M.O.; Siryapin, V.N.; Petkevich, I.G.; Siryapin, N.V.; Borisov, S.L.; Kozlachkov, A.N.

    2015-01-01

An analysis of some accidents using uncertainty assessment methods is given. The list of variable parameters incorporated the model parameters of the computer codes, the initial and boundary conditions of the reactor plant, and the neutronics. On the basis of the accident calculations performed with the statistical method, an assessment of the errors in the determination of the main parameters compared with the acceptance criteria is presented. It was shown that in the investigated accidents the values of the calculated parameters, with account taken of the errors obtained from the TRAP-KS and KORSAR/GP codes, do not exceed the established acceptance criteria. Moreover, these values do not exceed the values obtained in the conservative calculations. The feasibility of applying the uncertainty estimation method to justify the safety of the WWER AES-2006, using the thermal-physical codes KORSAR/GP and TRAP-KS and the PANDA and SUSA programs, was demonstrated [ru]

  10. Uncertainty propagation in a 3-D thermal code for performance assessment of a nuclear waste disposal

    International Nuclear Information System (INIS)

    Dutfoy, A.; Ritz, J.B.

    2001-01-01

Given the very large time scale involved, the performance assessment of a nuclear waste repository requires numerical modelling. Because we are uncertain of the exact values of the input parameters, we have to analyse the impact of these uncertainties on the outcome of the physical models. The EDF Research and Development Division has developed a reliability method to propagate these uncertainties, or variability, through models, requiring far fewer physical simulations than the usual simulation methods. We apply the reliability method MEFISTO to a base case modelling the heat transfers in a virtual disposal at the future site of the French underground research laboratory in the east of France. This study is carried out in collaboration with ANDRA, the French nuclear waste management agency. With this exercise, we want to evaluate the thermal behaviour of a concept in relation to the variation of the physical parameters and their uncertainty. (author)

  11. Extensive neutronic sensitivity-uncertainty analysis of a fusion reactor shielding blanket

    International Nuclear Information System (INIS)

    Hogenbirk, A.

    1994-01-01

In this paper the results are presented of an extensive neutronic sensitivity-uncertainty study performed for the design of a shielding blanket for a next-step fusion reactor such as ITER. A code system developed at ECN Petten was used. The uncertainty in an important response parameter, the neutron heating in the inboard superconducting coils, was evaluated. Neutron transport calculations in the 100-neutron-group GAM-II structure were performed using the code ANISN. For the sensitivity and uncertainty calculations the code SUSD was used. Uncertainties due to cross-section uncertainties were taken into account, as well as uncertainties due to uncertainties in the energy and angular distributions of scattered neutrons (SED and SAD uncertainties, respectively). The subject of direct-term uncertainties (i.e. uncertainties due to uncertainties in the kerma factors of the superconducting coils) is briefly touched upon. It is shown that SAD uncertainties, which have been largely neglected until now, contribute significantly to the total uncertainty. Moreover, the contribution of direct-term uncertainties may be large. The total uncertainty in the neutron heating, due to Fe cross-sections alone, amounts to approximately 25%, which is rather large. However, uncertainty data are scarce and the data may very well be conservative. It is shown in this paper that with the code system used, sensitivity and uncertainty calculations can be performed in a straightforward way. Therefore, it is suggested that emphasis now be put on the generation of realistic, reliable covariance data for cross-sections as well as for angular and energy distributions. ((orig.))
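Cross-section uncertainty propagation of the kind performed by SUSD combines sensitivity coefficients with a covariance matrix through the "sandwich rule", var(R) = S^T C S. The sketch below uses invented two-group numbers purely for illustration.

```python
def sandwich_variance(S, C):
    """Relative variance of a response R from relative sensitivity
    coefficients S and a relative covariance matrix C: var = S^T C S
    (the 'sandwich rule' of sensitivity-uncertainty analysis)."""
    n = len(S)
    return sum(S[i] * C[i][j] * S[j] for i in range(n) for j in range(n))

# Invented two-group example: sensitivities of the coil heating to two
# cross sections, and an assumed relative covariance matrix.
S = [-0.8, 0.3]
C = [[0.04, 0.01],
     [0.01, 0.09]]
rel_sd = sandwich_variance(S, C) ** 0.5    # relative standard deviation
```

In a real analysis S spans all energy groups and reactions (including the SED/SAD terms the paper highlights), and C comes from evaluated covariance files; the arithmetic is otherwise the same.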

  12. IAEA Coordinated Research Project on HTGR Reactor Physics, Thermal-hydraulics and Depletion Uncertainty Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Strydom, Gerhard [Idaho National Lab. (INL), Idaho Falls, ID (United States); Bostelmann, F. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-09-01

    The continued development of High Temperature Gas Cooled Reactors (HTGRs) requires verification of HTGR design and safety features with reliable high fidelity physics models and robust, efficient, and accurate codes. The predictive capability of coupled neutronics/thermal-hydraulics and depletion simulations for reactor design and safety analysis can be assessed with sensitivity analysis (SA) and uncertainty analysis (UA) methods. Uncertainty originates from errors in physical data, manufacturing uncertainties, modelling and computational algorithms. (The interested reader is referred to the large body of published SA and UA literature for a more complete overview of the various types of uncertainties, methodologies and results obtained). SA is helpful for ranking the various sources of uncertainty and error in the results of core analyses. SA and UA are required to address cost, safety, and licensing needs and should be applied to all aspects of reactor multi-physics simulation. SA and UA can guide experimental, modelling, and algorithm research and development. Current SA and UA rely either on derivative-based methods such as stochastic sampling methods or on generalized perturbation theory to obtain sensitivity coefficients. Neither approach addresses all needs. In order to benefit from recent advances in modelling and simulation and the availability of new covariance data (nuclear data uncertainties) extensive sensitivity and uncertainty studies are needed for quantification of the impact of different sources of uncertainties on the design and safety parameters of HTGRs. Only a parallel effort in advanced simulation and in nuclear data improvement will be able to provide designers with more robust and well validated calculation tools to meet design target accuracies. 
In February 2009, the Technical Working Group on Gas-Cooled Reactors (TWG-GCR) of the International Atomic Energy Agency (IAEA) recommended that the proposed Coordinated Research Program (CRP) on

  13. Error and Uncertainty Analysis for Ecological Modeling and Simulation

    Science.gov (United States)

    2001-12-01

    nitrate flux to the Gulf of Mexico. Nature (Brief Communication) 414: 166-167. (Uncertainty analysis done with SERDP software) Gertner, G., G...D. Goolsby 2001. Relating N inputs to the Mississippi River Basin and nitrate flux in the Lower Mississippi River: A comparison of approaches...Journal of Remote Sensing, 25(4):367-380. Wu, J., D.E. Jelinski, M. Luck, and P.T. Tueller, 2000. Multiscale analysis of landscape heterogeneity: scale

  14. A Preliminary Study on Sensitivity and Uncertainty Analysis with Statistic Method: Uncertainty Analysis with Cross Section Sampling from Lognormal Distribution

    Energy Technology Data Exchange (ETDEWEB)

    Song, Myung Sub; Kim, Song Hyun; Kim, Jong Kyung [Hanyang Univ., Seoul (Korea, Republic of); Noh, Jae Man [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2013-10-15

The uncertainty evaluation with the statistical method is performed by repeated transport calculations with directly perturbed nuclear data. Hence, a reliable uncertainty result can be obtained by analyzing the results of the numerous transport calculations. One known problem in uncertainty analysis with the statistical approach is that cross-section sampling from a normal (Gaussian) distribution with a relatively large standard deviation leads to sampling errors, such as the sampling of negative cross sections. Some correction methods have been noted; however, these methods can distort the distribution of the sampled cross sections. In this study, a method of sampling the nuclear data from a lognormal distribution is proposed. Criticality calculations with the sampled nuclear data are then performed, and the results are compared with those from the normal distribution conventionally used in previous studies. The statistical sampling method of the cross section with the lognormal distribution was proposed to increase the sampling accuracy without negative sampling errors. Also, a stochastic cross-section sampling and writing program was developed. For the sensitivity and uncertainty analysis, the cross-section sampling was pursued with the normal and lognormal distributions. The uncertainties caused by the covariance of the (n,γ) cross sections were evaluated by solving the GODIVA problem. The results show that the sampling method with the lognormal distribution can efficiently solve the negative sampling problem referred to in previous studies. It is expected that this study will contribute to increasing the accuracy of sampling-based uncertainty analysis.

  15. A Preliminary Study on Sensitivity and Uncertainty Analysis with Statistic Method: Uncertainty Analysis with Cross Section Sampling from Lognormal Distribution

    International Nuclear Information System (INIS)

    Song, Myung Sub; Kim, Song Hyun; Kim, Jong Kyung; Noh, Jae Man

    2013-01-01

The uncertainty evaluation with the statistical method is performed by repeated transport calculations with directly perturbed nuclear data. Hence, a reliable uncertainty result can be obtained by analyzing the results of the numerous transport calculations. One known problem in uncertainty analysis with the statistical approach is that cross-section sampling from a normal (Gaussian) distribution with a relatively large standard deviation leads to sampling errors, such as the sampling of negative cross sections. Some correction methods have been noted; however, these methods can distort the distribution of the sampled cross sections. In this study, a method of sampling the nuclear data from a lognormal distribution is proposed. Criticality calculations with the sampled nuclear data are then performed, and the results are compared with those from the normal distribution conventionally used in previous studies. The statistical sampling method of the cross section with the lognormal distribution was proposed to increase the sampling accuracy without negative sampling errors. Also, a stochastic cross-section sampling and writing program was developed. For the sensitivity and uncertainty analysis, the cross-section sampling was pursued with the normal and lognormal distributions. The uncertainties caused by the covariance of the (n,γ) cross sections were evaluated by solving the GODIVA problem. The results show that the sampling method with the lognormal distribution can efficiently solve the negative sampling problem referred to in previous studies. It is expected that this study will contribute to increasing the accuracy of sampling-based uncertainty analysis
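The lognormal sampling idea can be sketched by matching the mean and relative standard deviation of the nuclear-data value; the numbers below are illustrative, not from the study.

```python
import math
import random

def sample_xs_lognormal(mean, rel_sd, n, rng=random):
    """Sample a cross section from a lognormal distribution whose mean and
    relative standard deviation match the given nuclear-data values.
    Lognormal samples are strictly positive, avoiding the negative cross
    sections that Gaussian sampling can produce at large rel_sd."""
    sigma2 = math.log(1.0 + rel_sd ** 2)   # variance of the underlying normal
    mu = math.log(mean) - 0.5 * sigma2     # chosen so the lognormal mean is `mean`
    return [rng.lognormvariate(mu, math.sqrt(sigma2)) for _ in range(n)]

# Illustrative values (not from the study): 30% relative uncertainty.
samples = sample_xs_lognormal(mean=2.5, rel_sd=0.3, n=10000)
```

With a Gaussian at the same 30% relative uncertainty, a small fraction of draws would fall below zero; every lognormal draw above is positive by construction while preserving the target mean.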

  16. Preliminary uncertainty analysis of OECD/UAM benchmark for the TMI-1 reactor

    International Nuclear Information System (INIS)

    Cardoso, Fabiano S.; Faria, Rochkhudson B.; Silva, Lucas M.C.; Pereira, Claubia; Fortini, Angela

    2015-01-01

Nowadays the demand from nuclear research centers for safety, regulation and better-estimated predictions provided with confidence bounds has been increasing. Studies have pointed out that the present uncertainties in the nuclear data should be significantly reduced to get the full benefit from the advanced modeling and simulation initiatives. The major outcome of the NEA/OECD (UAM) workshop, held in Italy in 2006, was the preparation of a benchmark work program with steps (exercises) needed to define the uncertainty and modeling tasks. In that direction, this work was performed within the framework of UAM Exercise 1 (I-1), 'Cell Physics', to validate the study and to estimate the accuracy of the model. The objectives of this study were to make a preliminary analysis of the criticality values of the TMI-1 PWR and of the biases between the multiplication factors from two different nuclear codes. The range of the bias was obtained using the deterministic codes NEWT (New ESC-based Weighting Transport code), the two-dimensional transport module that uses AMPX-formatted cross sections processed by other SCALE modules, and WIMSD5 (Winfrith Improved Multi-Group Scheme). The WIMSD5 system consists of a simplified geometric representation of heterogeneous space zones that are coupled with each other and with the boundaries, while the properties of each spacing element are obtained from the Carlson DSN method or the collision probability method. (author)

  17. Use of computer codes for system reliability analysis

    International Nuclear Information System (INIS)

    Sabek, M.; Gaafar, M.; Poucet, A.

    1988-01-01

This paper gives a collective summary of the studies performed at the JRC, Ispra, on the use of computer codes for complex system analysis. The computer codes dealt with are: the CAFTS-SALP software package, FRANTIC, FTAP, the computer code package RALLY, and the BOUNDS codes. Two reference study cases were executed with each code. The results obtained from the logic/probabilistic analysis, as well as the computation times, are compared

  18. Systemization of burnup sensitivity analysis code

    International Nuclear Information System (INIS)

    Tatsumi, Masahiro; Hyoudou, Hideaki

    2004-02-01

For the practical use of fast reactors, improving the prediction accuracy of neutronic properties in LMFBR cores is a very important subject, both for improving plant efficiency through rationally high-performance cores and for improving reliability and safety margins. A distinct improvement in nuclear core design accuracy has been accomplished by the development of an adjusted nuclear library using the cross-section adjustment method, in which the results of critical experiments such as JUPITER are reflected. In the design of large LMFBR cores, however, it is important to accurately estimate not only neutronic characteristics, for example reaction rate distribution and control rod worth, but also burnup characteristics, for example burnup reactivity loss and breeding ratio. For this purpose, it is desirable to improve the prediction accuracy of burnup characteristics using data widely obtained in actual cores such as the experimental fast reactor 'JOYO'. Burnup sensitivity analysis is needed to make effective use of burnup characteristics data from actual cores based on the cross-section adjustment method. So far, a burnup sensitivity analysis code, SAGEP-BURN, has been developed and its effectiveness confirmed. However, the analysis sequence is inefficient because of the large burden placed on the user by the complexity of burnup sensitivity theory and the limitations of the system. It is also desirable to rearrange the system for future revision, since it is becoming difficult to implement new functionality in the existing large system. It is not sufficient to unify the computational components, because the computational sequence may change for each item being analyzed or for purposes such as interpretation of physical meaning. Therefore, the current burnup sensitivity analysis code needs to be systemized with functional component blocks that can be divided or assembled as the occasion demands.
For this

  19. Benchmarking and application of the state-of-the-art uncertainty analysis methods XSUSA and SHARK-X

    International Nuclear Information System (INIS)

    Aures, A.; Bostelmann, F.; Hursin, M.; Leray, O.

    2017-01-01

Highlights: • Application of the uncertainty analysis methods XSUSA and SHARK-X. • Propagation of nuclear data uncertainty through a PWR pin cell depletion calculation. • Uncertainty quantification of the eigenvalue, nuclide densities and Doppler coefficient. • Top contributors to the overall output uncertainty identified by sensitivity analysis. • Comparison with the SAMPLER and TSUNAMI modules of the SCALE code package. - Abstract: This study presents collaborative work performed between GRS and PSI on benchmarking and application of the state-of-the-art uncertainty analysis methods XSUSA and SHARK-X. Applied to a PWR pin cell depletion calculation, both methods propagate input uncertainty from nuclear data to output uncertainty. The uncertainties of the multiplication factor, nuclide densities, and fuel temperature coefficients derived by both methods are compared at various burnup steps. Comparisons of these quantities are furthermore performed with the SAMPLER module of SCALE 6.2. The perturbation-theory-based TSUNAMI module of both SCALE 6.1 and SCALE 6.2 is additionally applied for comparisons of the reactivity coefficients.

  20. Treatment of uncertainties in the IPCC: a philosophical analysis

    Science.gov (United States)

    Jebeile, J.; Drouet, I.

    2014-12-01

    The IPCC produces scientific reports out of findings on climate and climate change. Because the findings are uncertain in many respects, the production of reports requires aggregating assessments of uncertainties of different kinds. This difficult task is currently regulated by the Guidance note for lead authors of the IPCC fifth assessment report on consistent treatment of uncertainties. The note recommends that two metrics—i.e. confidence and likelihood—be used for communicating the degree of certainty in findings. Confidence is expressed qualitatively "based on the type, amount, quality, and consistency of evidence […] and the degree of agreement", while likelihood is expressed probabilistically "based on statistical analysis of observations or model results, or expert judgment". Therefore, depending on the evidence evaluated, authors have the choice to present either an assigned level of confidence or a quantified measure of likelihood. But these two kinds of uncertainty assessment express distinct and conflicting methodologies, which makes their aggregation problematic. So the question arises whether the treatment of uncertainties in the IPCC is rationally justified. In order to answer the question, it is worth comparing the IPCC procedures with the formal normative theories of epistemic rationality which have been developed by philosophers. These theories—which include contributions to the philosophy of probability and to Bayesian probabilistic confirmation theory—are relevant for our purpose because they are commonly used to assess the rationality of collective judgement formation based on uncertain knowledge. In this paper we make the comparison and pursue the following objectives: (i) we determine whether the IPCC confidence and likelihood can be compared with the notions of uncertainty targeted by or underlying the formal normative theories of epistemic rationality; (ii) we investigate whether the formal normative theories of epistemic rationality justify

  1. Systemization of burnup sensitivity analysis code. 2

    International Nuclear Information System (INIS)

    Tatsumi, Masahiro; Hyoudou, Hideaki

    2005-02-01

    Towards the practical use of fast reactors, it is a very important subject to improve prediction accuracy for neutronic properties in LMFBR cores, from the viewpoints of improved plant efficiency with rationally high-performance cores and of improved reliability and safety margins. A distinct improvement in the accuracy of nuclear core design has been accomplished by the development of an adjusted nuclear library using the cross-section adjustment method, in which the results of criticality experiments such as JUPITER are reflected. In the design of large LMFBR cores, however, it is important to accurately estimate not only neutronic characteristics, for example, reaction rate distribution and control rod worth, but also burnup characteristics, for example, burnup reactivity loss, breeding ratio and so on. For this purpose, it is desired to improve the prediction accuracy of burnup characteristics using data widely obtained in actual cores such as the experimental fast reactor 'JOYO'. Analysis of burnup characteristics is needed to effectively use burnup characteristics data from actual cores based on the cross-section adjustment method. So far, a burnup sensitivity analysis code, SAGEP-BURN, has been developed and its effectiveness confirmed. However, the analysis sequence becomes inefficient because of the large burden placed on users by the complexity of burnup sensitivity theory and the limitations of the system. It is also desired to rearrange the system for future revision, since it is becoming difficult to implement new functions in the existing large system. It is not sufficient to unify each computational component, for the computational sequence may change with each item being analyzed or with the purpose, such as interpretation of physical meaning. Therefore, the current burnup sensitivity analysis code needs to be systemized into functional component blocks that can be divided or reassembled as the occasion demands.
For

  2. Comparative Criticality Analysis of Two Monte Carlo Codes on Centrifugal Atomizer: MCNP5 and SCALE

    International Nuclear Information System (INIS)

    Kang, H-S; Jang, M-S; Kim, S-R; Park, J-M; Kim, K-N

    2015-01-01

    There are two well-known Monte Carlo codes for criticality analysis, MCNP5 and SCALE. MCNP5 is a general-purpose Monte Carlo N-Particle code that can be used for neutron, photon, electron, or coupled neutron/photon/electron transport, including the capability to calculate eigenvalues for critical systems, and serves as a main analysis code. SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor physics, radiation shielding, radioactive source term characterization, and sensitivity and uncertainty analysis. SCALE was conceived and funded by the US NRC to perform standardized computer analysis for licensing evaluation and is widely used around the world. We performed a validation test of MCNP5 and a comparative analysis of the Monte Carlo codes MCNP5 and SCALE in terms of criticality analysis of a centrifugal atomizer. In the criticality analysis using the MCNP5 code, we obtained statistically reliable results by using a large number of source histories per cycle and by performing uncertainty analysis.

  3. Comparative Criticality Analysis of Two Monte Carlo Codes on Centrifugal Atomizer: MCNP5 and SCALE

    Energy Technology Data Exchange (ETDEWEB)

    Kang, H-S; Jang, M-S; Kim, S-R [NESS, Daejeon (Korea, Republic of); Park, J-M; Kim, K-N [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    There are two well-known Monte Carlo codes for criticality analysis, MCNP5 and SCALE. MCNP5 is a general-purpose Monte Carlo N-Particle code that can be used for neutron, photon, electron, or coupled neutron/photon/electron transport, including the capability to calculate eigenvalues for critical systems, and serves as a main analysis code. SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor physics, radiation shielding, radioactive source term characterization, and sensitivity and uncertainty analysis. SCALE was conceived and funded by the US NRC to perform standardized computer analysis for licensing evaluation and is widely used around the world. We performed a validation test of MCNP5 and a comparative analysis of the Monte Carlo codes MCNP5 and SCALE in terms of criticality analysis of a centrifugal atomizer. In the criticality analysis using the MCNP5 code, we obtained statistically reliable results by using a large number of source histories per cycle and by performing uncertainty analysis.

  4. DNBR calculation in digital core protection system by a subchannel analysis code

    International Nuclear Information System (INIS)

    In, W. K.; Yoo, Y. J.; Hwang, T. H.; Ji, S. K.

    2001-01-01

    The DNBR calculation uncertainty and DNBR margin were evaluated in the digital core protection system by the thermal-hydraulic subchannel analysis code MATRA. A simplified thermal-hydraulic code, CETOP, is used to calculate on-line DNBR in the core protection system of a digital PWR. A DNBR tuning process against a best-estimate subchannel analysis code is required for CETOP to ensure accurate and conservative DNBR calculation, but is not necessary for MATRA. The DNBR calculations by MATRA and CETOP were performed for a large number of operating conditions in Yonggwang nuclear units 3-4, where the digital core protection system was first implemented in Korea. MATRA resulted in a less negative mean value (i.e., reduced overconservatism) and a somewhat larger standard deviation of the DNBR error. The uncertainty-corrected minimum DNBR by MATRA was shown to be higher by 1.8%-9.9% than the CETOP DNBR

  5. Current lead thermal analysis code 'CURRENT'

    International Nuclear Information System (INIS)

    Yamaguchi, Masahito; Tada, Eisuke; Shimamoto, Susumu; Hata, Kenichiro.

    1985-08-01

    A large gas-cooled current lead with a capacity of more than 30 kA and 22 kV is required for the superconducting toroidal and poloidal coils of fusion applications. The current lead is used to carry electrical current from the power supply system at room temperature to the superconducting coil at 4 K. Accordingly, the thermal performance of the current lead is significantly important in determining the heat load requirements of the coil system at 4 K. The Japan Atomic Energy Research Institute (JAERI) has been developing large gas-cooled current leads with an optimum condition in which the heat load is around 1 W per 1 kA at 4 K. In order to design a current lead with optimum thermal performance, JAERI developed a thermal analysis code named 'CURRENT' which can theoretically calculate the optimum geometric shape and cooling conditions of the current lead. The basic equations and the instruction manual of the analysis code are described in this report. (author)

  6. An analysis of combined standard uncertainty for radiochemical measurements of environmental samples

    International Nuclear Information System (INIS)

    Berne, A.

    1996-01-01

    It is anticipated that future data acquisitions intended for use in radiological risk assessments will require the incorporation of uncertainty analysis. Often, only one aliquot of the sample is taken and a single determination is made. Under these circumstances, the total uncertainty is calculated using the 'propagation of errors' approach. However, there is no agreement in the radioanalytical community as to the exact equations to use. The Quality Assurance/Metrology Division of the Environmental Measurements Laboratory has developed a systematic process to compute uncertainties in constituent components of the analytical procedure, as well as the combined standard uncertainty (CSU). The equations for computation are presented here, with examples of their use. They have also been incorporated into a code for use in the spreadsheet application QuattroPro. Using the spreadsheet with appropriate inputs permits an analysis of the variations in the CSU as a function of several different variables. The relative importance of the 'counting uncertainty' can also be ascertained.
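
    As a minimal illustration of the quadrature combination behind a combined standard uncertainty for uncorrelated components (the 'propagation of errors' approach mentioned above), the sketch below is hypothetical; the component names and values are invented and are not EML's actual spreadsheet equations:

```python
import math

def combined_standard_uncertainty(value, components):
    """Combine relative standard uncertainties in quadrature.

    components: list of (name, relative_uncertainty) pairs, e.g. the
    counting, chemical-yield, and calibration terms of a single
    radiochemical determination.  Returns (relative CSU, absolute CSU).
    """
    rel = math.sqrt(sum(u ** 2 for _, u in components))
    return rel, value * rel

# Hypothetical single-aliquot result of 12.0 Bq with three components
# expressed as fractions of the measured value:
rel, absolute = combined_standard_uncertainty(
    12.0,
    [("counting", 0.05), ("chemical yield", 0.03), ("calibration", 0.02)],
)
```

    Because the terms add in quadrature, the largest component (here the counting term) dominates the CSU, which is exactly the kind of variation study the spreadsheet version supports.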

  7. SWEPP PAN assay system uncertainty analysis: Active mode measurements of solidified aqueous sludge waste

    International Nuclear Information System (INIS)

    Blackwood, L.G.; Harker, Y.D.; Meachum, T.R.

    1997-12-01

    The Idaho National Engineering and Environmental Laboratory is being used as a temporary storage facility for transuranic waste generated by the US Nuclear Weapons program at the Rocky Flats Plant (RFP) in Golden, Colorado. Currently, there is a large effort in progress to prepare to ship this waste to the Waste Isolation Pilot Plant (WIPP) in Carlsbad, New Mexico. In order to meet the TRU Waste Characterization Quality Assurance Program Plan nondestructive assay compliance requirements and quality assurance objectives, it is necessary to determine the total uncertainty of the radioassay results produced by the Stored Waste Examination Pilot Plant (SWEPP) Passive Active Neutron (PAN) radioassay system. This paper is one of a series of reports quantifying the results of the uncertainty analysis of the PAN system measurements for specific waste types and measurement modes. In particular this report covers active mode measurements of weapons grade plutonium-contaminated aqueous sludge waste contained in 208 liter drums (item description codes 1, 2, 7, 800, 803, and 807). Results of the uncertainty analysis for PAN active mode measurements of aqueous sludge indicate that a bias correction multiplier of 1.55 should be applied to the PAN aqueous sludge measurements. With the bias correction, the uncertainty bounds on the expected bias are 0 ± 27%. These bounds meet the Quality Assurance Program Plan requirements for radioassay systems

  8. Uncertainty Assessment of Hydrological Frequency Analysis Using Bootstrap Method

    Directory of Open Access Journals (Sweden)

    Yi-Ming Hu

    2013-01-01

    Full Text Available Hydrological frequency analysis (HFA) is the foundation for hydraulic engineering design and water resources management. Hydrological extreme observations or samples are the basis for HFA; the representativeness of a sample series of the population distribution is extremely important for the reliability of the estimated hydrological design value or quantile. However, for most hydrological extreme data obtained in practical applications, the sample size is usually small, for example, about 40-50 years in China. Generally, samples of small size cannot completely display the statistical properties of the population distribution, thus leading to uncertainties in the estimation of hydrological design values. In this paper, a new method based on the bootstrap is put forward to analyze the impact of sampling uncertainty on the design value. By the bootstrap resampling technique, a large number of bootstrap samples are constructed from the original flood extreme observations; the corresponding design value or quantile is estimated for each bootstrap sample, so that the sampling distribution of the design value is constructed; based on the sampling distribution, the uncertainty of the quantile estimation can be quantified. Compared with the conventional approach, this method provides not only the point estimate of a design value but also a quantitative evaluation of the uncertainties of the estimation.
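
    The resampling loop described in this abstract can be sketched as follows. This is a simplified stand-in, not the authors' method: it bootstraps an empirical quantile rather than a fitted flood-frequency distribution, and the 45-year record is synthetic:

```python
import random
import statistics

def bootstrap_quantile(sample, p=0.9, n_boot=2000, seed=1):
    """Bootstrap the p-quantile of a small hydrological record.

    Returns (mean of the bootstrap estimates, their standard deviation,
    i.e. an estimate of the sampling uncertainty of the design value).
    """
    rng = random.Random(seed)
    n = len(sample)
    idx = int(p * 100) - 1  # index of the p-quantile among 99 cut points
    estimates = []
    for _ in range(n_boot):
        # Resample with replacement from the original observations.
        resample = [rng.choice(sample) for _ in range(n)]
        estimates.append(statistics.quantiles(resample, n=100)[idx])
    return statistics.mean(estimates), statistics.stdev(estimates)

# A synthetic 45-year annual-maximum flood series (m^3/s):
rng = random.Random(7)
floods = [rng.gauss(1000, 250) for _ in range(45)]
q90, q90_se = bootstrap_quantile(floods, p=0.9)
```

    The spread of the bootstrap estimates (q90_se) quantifies how much the design value could vary purely because the 45-year record is a small sample.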

  9. Uncertainty analysis of a nondestructive radioassay system for transuranic waste

    International Nuclear Information System (INIS)

    Harker, Y.D.; Blackwood, L.G.; Meachum, T.R.; Yoon, W.Y.

    1996-01-01

    Radioassay of transuranic waste in 207 liter drums currently stored at the Idaho National Engineering Laboratory is achieved using a Passive Active Neutron (PAN) nondestructive assay system. In order to meet data quality assurance requirements for shipping and eventual permanent storage of these drums at the Waste Isolation Pilot Plant in Carlsbad, New Mexico, the total uncertainty of the PAN system measurements must be assessed. In particular, the uncertainty calculations are required to include the effects of variations in waste matrix parameters and related variables on the final measurement results. Because of the complexities involved in introducing waste matrix parameter effects into the uncertainty calculations, standard methods of analysis (e.g., experimentation followed by propagation of errors) could not be implemented. Instead, a modified statistical sampling and verification approach was developed. In this modified approach the total performance of the PAN system is simulated using computer models of the assay system and the resultant output is compared with the known input to assess the total uncertainty. This paper describes the simulation process and illustrates its application to waste comprised of weapons grade plutonium-contaminated graphite molds

  10. Quantifying and managing uncertainty in operational modal analysis

    Science.gov (United States)

    Au, Siu-Kui; Brownjohn, James M. W.; Mottershead, John E.

    2018-03-01

    Operational modal analysis aims at identifying the modal properties (natural frequency, damping, etc.) of a structure using only the (output) vibration response measured under ambient conditions. Highly economical and feasible, it is becoming a common practice in full-scale vibration testing. In the absence of (input) loading information, however, the modal properties have significantly higher uncertainty than their counterparts identified from free or forced vibration (known input) tests. Mastering the relationship between identification uncertainty and test configuration is of great interest to both scientists and engineers, e.g., for achievable precision limits and test planning/budgeting. Addressing this challenge beyond the current state of the art, which is mostly concerned with identification algorithms, this work obtains closed-form analytical expressions for the identification uncertainty (variance) of modal parameters that fundamentally explain the effect of test configuration. Collectively referred to as 'uncertainty laws', these expressions are asymptotically correct for well-separated modes, small damping and long data, and remain applicable in non-asymptotic situations. They provide a scientific basis for planning and standardization of ambient vibration tests, where factors such as channel noise, sensor number and location can be quantitatively accounted for. The work is reported comprehensively with verification through synthetic and experimental data (laboratory and field), scientific implications and practical guidelines for planning ambient vibration tests.

  11. Two-dimensional cross-section sensitivity and uncertainty analysis for fusion reactor blankets

    International Nuclear Information System (INIS)

    Embrechts, M.J.

    1982-02-01

    A two-dimensional sensitivity and uncertainty analysis for the heating of the TF coil for the FED (fusion engineering device) blanket was performed. The uncertainties calculated are of the same order of magnitude as those resulting from a one-dimensional analysis. The largest uncertainties were caused by the cross section uncertainties for chromium

  12. Uncertainty and sensitivity analysis of environmental transport models

    International Nuclear Information System (INIS)

    Margulies, T.S.; Lancaster, L.E.

    1985-01-01

    An uncertainty and sensitivity analysis has been made of the CRAC-2 (Calculations of Reactor Accident Consequences) atmospheric transport and deposition models. Robustness and uncertainty aspects of air and ground deposited material and the relative contribution of input and model parameters were systematically studied. The underlying data structures were investigated using a multiway layout of factors over specified ranges generated via a Latin hypercube sampling scheme. The variables selected in our analysis include: weather bin, dry deposition velocity, rain washout coefficient/rain intensity, duration of release, heat content, sigma-z (vertical) plume dispersion parameter, sigma-y (crosswind) plume dispersion parameter, and mixing height. To determine the contributors to the output variability (versus distance from the site) step-wise regression analyses were performed on transformations of the spatial concentration patterns simulated. 27 references, 2 figures, 3 tables
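
    A Latin hypercube design of the kind used to generate the CRAC-2 input factors can be sketched as below; the two variable ranges are illustrative placeholders, not the values used in the study:

```python
import random

def latin_hypercube(n_samples, bounds, seed=0):
    """Latin hypercube design: one random point per equal-probability
    stratum of each variable, with strata shuffled independently per
    variable so every sample pairs strata at random."""
    rng = random.Random(seed)
    columns = []
    for lo, hi in bounds:
        # One point inside each of the n_samples equal-width strata.
        pts = [lo + (hi - lo) * (i + rng.random()) / n_samples
               for i in range(n_samples)]
        rng.shuffle(pts)
        columns.append(pts)
    return [tuple(row) for row in zip(*columns)]  # rows = samples

# e.g. 10 samples over a dry deposition velocity (m/s) and a mixing
# height (m) -- hypothetical ranges for illustration only:
design = latin_hypercube(10, [(0.001, 0.05), (100.0, 2000.0)])
```

    Each variable is guaranteed exactly one sample per stratum, which is why Latin hypercube sampling covers the input ranges far more evenly than simple random sampling for the same number of code runs.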

  13. Stochastic analysis in production process and ecology under uncertainty

    CERN Document Server

    Bieda, Bogusław

    2014-01-01

    The monograph addresses the problem of stochastic analysis based on uncertainty assessment by simulation, and the application of this method in ecology and the steel industry under uncertainty. The first chapter defines the Monte Carlo (MC) method and random variables in stochastic models. Chapter two deals with contaminant transport in porous media; a stochastic approach for modelling the transit time of Municipal Solid Waste contaminants using MC simulation has been worked out. The third chapter describes the risk analysis of the waste-to-energy facility proposal for the city of Konin, including the financial aspects. Environmental impact assessment of the ArcelorMittal Steel Power Plant in Kraków is given in chapter four; four scenarios of energy mix production processes were studied. Chapter five contains examples of using ecological Life Cycle Assessment (LCA)—a relatively new method of environmental impact assessment—which helps in preparing a pro-ecological strategy, and which can lead to reducing t...

  14. INTEGRATION OF SYSTEM COMPONENTS AND UNCERTAINTY ANALYSIS - HANFORD EXAMPLES

    International Nuclear Information System (INIS)

    Wood, M.I.

    2009-01-01

    • Deterministic 'one off' analyses as basis for evaluating sensitivity and uncertainty relative to the reference case • Spatial coverage identical to the reference case • Two types of analysis assumptions: minimax parameter values around reference case conditions, and 'what if' cases that change the reference case condition and associated parameter values • No conclusions about the likelihood of the estimated result other than a qualitative expectation that the actual outcome should tend toward the reference case estimate

  15. Technology relevance of the 'uncertainty analysis in modelling' project for nuclear reactor safety

    International Nuclear Information System (INIS)

    D'Auria, F.; Langenbuch, S.; Royer, E.; Del Nevo, A.; Parisi, C.; Petruzzi, A.

    2007-01-01

    The OECD/NEA Nuclear Science Committee (NSC) endorsed the setting up of an Expert Group on Uncertainty Analysis in Modelling (UAM) in June 2006. This Expert Group reports to the Working Party on Scientific issues in Reactor Systems (WPRS) and because it addresses multi-scale / multi-physics aspects of uncertainty analysis, it will work in close co-ordination with the benchmark groups on coupled neutronics-thermal-hydraulics and on coupled core-plant problems, and the CSNI Group on Analysis and Management of Accidents (GAMA). The NEA/NSC has endorsed that this activity be undertaken with Prof. K. Ivanov from the Pennsylvania State University (PSU) as the main coordinator and host with the assistance of the Scientific Board. The objective of the proposed work is to define, coordinate, conduct, and report an international benchmark for uncertainty analysis in best-estimate coupled code calculations for design, operation, and safety analysis of LWRs entitled 'OECD UAM LWR Benchmark'. At the First Benchmark Workshop (UAM-1) held from 10 to 11 May 2007 at the OECD/NEA, one action concerned the forming of a sub-group, led by F. D'Auria, member of CSNI, responsible for defining the objectives, the impact and benefit of the UAM for safety and licensing. This report is the result of this action by the subgroup. (authors)

  16. Sensitivity and uncertainty analysis for Ignalina NPP confinement in case of loss of coolant accident

    International Nuclear Information System (INIS)

    Urbonavicius, E.; Babilas, E.; Rimkevicius, S.

    2003-01-01

    At present the best-estimate approach is widely used in the safety analysis of nuclear power plants around the world. The application of such an approach requires estimating the uncertainty of the calculated results. Various methodologies are applied in order to determine the uncertainty with the required accuracy. One of them is the statistical methodology developed at GRS mbH in Germany and integrated into the SUSA tool, which was applied for the sensitivity and uncertainty analysis of the thermal-hydraulic parameters inside the confinement (Accident Localisation System) of the Ignalina NPP with its RBMK-1500 reactor in case of the Maximum Design Basis Accident (break of a 900 mm diameter pipe). Several parameters that could potentially influence the calculated results were selected for the analysis. A set of input data with different initial values of the selected parameters was generated. In order to obtain results with 95% probability content and 95% confidence, 100 runs were performed with the COCOSYS code developed at GRS mbH. The calculated results were processed with the SUSA tool. The performed analysis showed a rather low dispersion of the results, and only in the initial period of the accident. Besides, the analysis showed that there is no threat to the building structures of the Ignalina NPP confinement in case of the considered accident scenario. (author)
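
    The 95%/95% statement with about 100 code runs is consistent with Wilks' order-statistics formula commonly used by GRS-type statistical methodologies; the sketch below is an interpretation of that sample-size rule, not part of the SUSA tool itself:

```python
import math

def wilks_runs(coverage=0.95, confidence=0.95):
    """First-order, one-sided Wilks formula: the smallest number of
    random code runs n such that the largest observed output bounds
    the `coverage` quantile with probability `confidence`, i.e. the
    smallest n with 1 - coverage**n >= confidence."""
    return math.ceil(math.log(1.0 - confidence) / math.log(coverage))

# 59 runs suffice for a one-sided 95%/95% tolerance limit, so 100 runs
# provide extra margin (or allow higher-order statistics to be used).
runs_9595 = wilks_runs()
```

    Raising the confidence level raises the required run count; the run count is independent of how many uncertain input parameters are varied, which is the key practical advantage of the approach.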

  17. Cross-section sensitivity and uncertainty analysis of the FNG copper benchmark experiment

    Energy Technology Data Exchange (ETDEWEB)

    Kodeli, I., E-mail: ivan.kodeli@ijs.si [Jožef Stefan Institute, Jamova 39, SI-1000 Ljubljana (Slovenia); Kondo, K. [Karlsruhe Institute of Technology, Postfach 3640, D-76021 Karlsruhe (Germany); Japan Atomic Energy Agency, Rokkasho-mura (Japan); Perel, R.L. [Racah Institute of Physics, Hebrew University of Jerusalem, IL-91904 Jerusalem (Israel); Fischer, U. [Karlsruhe Institute of Technology, Postfach 3640, D-76021 Karlsruhe (Germany)

    2016-11-01

    A neutronics benchmark experiment on a copper assembly was performed between the end of 2014 and the beginning of 2015 at the 14-MeV Frascati neutron generator (FNG) of ENEA Frascati, with the objective of providing the experimental database required for the validation of the copper nuclear data relevant for ITER design calculations, including the related uncertainties. The paper presents the pre- and post-analysis of the experiment performed using cross-section sensitivity and uncertainty codes, both deterministic (SUSD3D) and Monte Carlo (MCSEN5). Cumulative reaction rates and neutron flux spectra, their sensitivity to the cross sections, as well as the corresponding uncertainties were estimated for different selected detector positions up to ∼58 cm in the copper assembly. This permitted, in the pre-analysis phase, the optimization of the geometry, the detector positions and the choice of activation reactions, and, in the post-analysis phase, the interpretation of the results of the measurements and the calculations, conclusions on the quality of the relevant nuclear cross-section data, and estimates of the uncertainties in the calculated nuclear responses and fluxes. Large uncertainties in the calculated reaction rates and neutron spectra of up to 50%, rarely observed at this level in benchmark analyses using today's nuclear data, were predicted, particularly for fast reactions. Observed C/E (dis)agreements with values as low as 0.5 partly confirm these predictions. Benchmark results are therefore expected to contribute to the improvement of both cross section and covariance data evaluations.
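
    The way codes such as SUSD3D fold sensitivity profiles with covariance data is the so-called sandwich rule, var(R)/R^2 = S C S^T; the following minimal sketch uses an invented two-group example, not the FNG data:

```python
def response_variance(s, cov):
    """Sandwich rule: relative variance of a response R given a
    sensitivity vector S (relative sensitivity per energy group or
    reaction) and the relative covariance matrix C of the cross
    sections: var(R)/R^2 = S C S^T."""
    n = len(s)
    return sum(s[i] * cov[i][j] * s[j]
               for i in range(n) for j in range(n))

# Hypothetical two-group example: 4% and 9% relative standard
# deviations (0.0016 and 0.0081 relative variances), uncorrelated.
var = response_variance([1.0, 0.5], [[0.0016, 0.0], [0.0, 0.0081]])
rel_std = var ** 0.5  # relative standard deviation of the response
```

    Off-diagonal covariance terms (correlations between groups or reactions) enter the same double sum and can either inflate or cancel the total uncertainty.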

  18. Performance testing of thermal analysis codes for nuclear fuel casks

    International Nuclear Information System (INIS)

    Sanchez, L.C.

    1987-01-01

    In 1982 Sandia National Laboratories held the First Industry/Government Joint Thermal and Structural Codes Information Exchange and presented the initial stages of an investigation of thermal analysis computer codes for use in the design of nuclear fuel shipping casks. The objective of the investigation was to (1) document publicly available computer codes, (2) assess code capabilities as determined from their user's manuals, and (3) assess code performance on cask-like model problems. Computer codes are required to handle the thermal phenomena of conduction, convection and radiation. Several of the available thermal computer codes were tested on a set of model problems to assess performance on cask-like problems. Solutions obtained with the computer codes for steady-state thermal analysis were in good agreement and the solutions for transient thermal analysis differed slightly among the computer codes due to modeling differences

  19. Error-correction coding and decoding bounds, codes, decoders, analysis and applications

    CERN Document Server

    Tomlinson, Martin; Ambroze, Marcel A; Ahmed, Mohammed; Jibril, Mubarak

    2017-01-01

    This book discusses both the theory and practical applications of self-correcting data, commonly known as error-correcting codes. The applications included demonstrate the importance of these codes in a wide range of everyday technologies, from smartphones to secure communications and transactions. Written in a readily understandable style, the book presents the authors’ twenty-five years of research organized into five parts: Part I is concerned with the theoretical performance attainable by using error correcting codes to achieve communications efficiency in digital communications systems. Part II explores the construction of error-correcting codes and explains the different families of codes and how they are designed. Techniques are described for producing the very best codes. Part III addresses the analysis of low-density parity-check (LDPC) codes, primarily to calculate their stopping sets and low-weight codeword spectrum which determines the performance of these codes. Part IV deals with decoders desi...

  20. 76 FR 64931 - Building Energy Codes Cost Analysis

    Science.gov (United States)

    2011-10-19

    ...-0046] Building Energy Codes Cost Analysis AGENCY: Office of Energy Efficiency and Renewable Energy... reopening of the time period for submitting comments on the request for information on Building Energy Codes... the request for information on Building Energy Code Cost Analysis and provide docket number EERE-2011...

  1. Concepts involved in a proposed application of uncertainty analysis to the performance assessment of high-level nuclear waste isolation systems

    International Nuclear Information System (INIS)

    Maerker, R.E.

    1986-03-01

    This report introduces the concepts of a previously developed methodology which could readily be extended to the field of performance assessment for high-level nuclear waste isolation systems. The methodology incorporates sensitivities previously obtained with the GRESS code into an uncertainty analysis, from which propagated uncertainties in calculated responses may be derived from basic data uncertainties. Following a definition of terms, examples are provided illustrating commonly used conventions for describing the concepts of covariance and sensitivity. Examples of solutions to problems previously encountered in related fields involving uncertainty analysis and use of a generalized linear least-squares adjustment procedure are also presented. 5 refs., 14 tabs

  2. Uncertainty quantification and sensitivity analysis with CASL Core Simulator VERA-CS

    International Nuclear Information System (INIS)

    Brown, C.S.; Zhang, Hongbin

    2016-01-01

    VERA-CS (Virtual Environment for Reactor Applications, Core Simulator) is a coupled neutron transport and thermal-hydraulics code under development by the Consortium for Advanced Simulation of Light Water Reactors (CASL). An approach to uncertainty quantification and sensitivity analysis with VERA-CS was developed and a new toolkit was created to perform uncertainty quantification and sensitivity analysis. A 2 × 2 fuel assembly model was developed and simulated by VERA-CS, and uncertainty quantification and sensitivity analysis were performed with fourteen uncertain input parameters. The minimum departure from nucleate boiling ratio (MDNBR), maximum fuel center-line temperature, and maximum outer clad surface temperature were chosen as the selected figures of merit. Pearson, Spearman, and partial correlation coefficients were considered for all of the figures of merit in sensitivity analysis and coolant inlet temperature was consistently the most influential parameter. Parameters used as inputs to the critical heat flux calculation with the W-3 correlation were shown to be the most influential on the MDNBR, maximum fuel center-line temperature, and maximum outer clad surface temperature.
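
    The Pearson and Spearman coefficients named in the sensitivity analysis can be computed directly; the sketch below is self-contained and ignores tie handling in the rank transform (partial correlation is omitted):

```python
import math

def pearson(xs, ys):
    """Pearson linear correlation coefficient between two samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def spearman(xs, ys):
    """Spearman rank correlation: Pearson applied to the ranks
    (no tie handling, for illustration only)."""
    def ranks(v):
        order = sorted(range(len(v)), key=v.__getitem__)
        r = [0] * len(v)
        for rank, i in enumerate(order):
            r[i] = rank
        return r
    return pearson(ranks(xs), ranks(ys))
```

    The distinction matters for sensitivity studies: a monotonic but nonlinear input-output relationship gives a Spearman coefficient of 1 while the Pearson coefficient stays below 1, so comparing the two hints at nonlinearity.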

  3. Deterministic methods for sensitivity and uncertainty analysis in large-scale computer models

    International Nuclear Information System (INIS)

    Worley, B.A.; Oblow, E.M.; Pin, F.G.; Maerker, R.E.; Horwedel, J.E.; Wright, R.Q.; Lucius, J.L.

    1987-01-01

    The fields of sensitivity and uncertainty analysis are dominated by statistical techniques when large-scale modeling codes are being analyzed. This paper reports on the development and availability of two systems, GRESS and ADGEN, that make use of computer calculus compilers to automate the implementation of deterministic sensitivity analysis capability into existing computer models. This automation removes the traditional limitation of deterministic sensitivity methods. The paper describes a deterministic uncertainty analysis method (DUA) that uses derivative information as a basis to propagate parameter probability distributions to obtain result probability distributions. The paper demonstrates the deterministic approach to sensitivity and uncertainty analysis as applied to a sample problem that models the flow of water through a borehole. The sample problem is used as a basis to compare the cumulative distribution function of the flow rate as calculated by the standard statistical methods and the DUA method. The DUA method gives a more accurate result based upon only two model executions compared to fifty executions in the statistical case
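
    The derivative-based propagation idea behind DUA can be illustrated with a first-order (delta-method) sketch using finite-difference derivatives; the toy model below merely stands in for the borehole flow function, which is not reproduced here:

```python
def numeric_grad(f, x, h=1e-6):
    """Central-difference partial derivatives of f at the point x."""
    g = []
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        g.append((f(xp) - f(xm)) / (2.0 * h))
    return g

def delta_variance(f, x, variances):
    """First-order propagated variance of the output, assuming
    independent inputs: var(y) ~= sum_i (df/dx_i)^2 * var(x_i).
    This is the derivative-based idea behind DUA, which needs only a
    couple of model evaluations instead of a large statistical sample."""
    return sum(g * g * v for g, v in zip(numeric_grad(f, x), variances))

# Toy stand-in model: y = x0 * x1 at the nominal point (2, 3),
# with input variances 0.1 and 0.2.
var_y = delta_variance(lambda x: x[0] * x[1], [2.0, 3.0], [0.1, 0.2])
```

    For this toy model the analytic answer is x1^2*var(x0) + x0^2*var(x1) = 9(0.1) + 4(0.2) = 1.7, so the finite-difference propagation can be checked by hand; a full distribution (rather than just a variance) requires the distribution-propagation step described in the abstract.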

  4. Summary of the CEC/USDOE workshop on uncertainty analysis

    International Nuclear Information System (INIS)

    Elderkin, C.E.; Kelly, G.N.

    1990-06-01

    There is uncertainty in all aspects of assessing the consequences of accidental releases of radioactive material, from understanding and describing the environmental and biological transfer processes to modeling emergency response. The need for an exchange of views and a comparison of approaches between the diverse disciplines led to the organization of a CEC/USDOE Workshop on Uncertainty Analysis held in Santa Fe, New Mexico, in November 1989. The workshop brought together specialists in a number of disciplines, including experts in the mathematics and statistics of uncertainty analysis, in expert judgment elicitation and evaluation, and in all aspects of assessing the radiological and environmental consequences of accidental releases of radioactive material. In addition, there was participation from users of the output of accident consequence assessment in decision-making and/or regulatory frameworks. The main conclusions that emerged from the workshop are summarized in this paper. They are discussed in the context of three different types of accident consequence assessment: probabilistic assessments of accident consequences undertaken as inputs to risk analyses of nuclear installations, assessments of accident consequences in real time to provide inputs to decisions on the introduction of countermeasures, and the reconstruction of doses and risks resulting from past releases of radioactive material.

  5. Similarity and uncertainty analysis of the ALLEGRO MOX core

    International Nuclear Information System (INIS)

    Vrban, B.; Hascik, J.; Necas, V.; Slugen, V.

    2015-01-01

    The similarity and uncertainty analysis of the ESNII+ ALLEGRO MOX core has identified specific problems and challenges in the field of neutronic calculations. The similarity assessment identified 9 partly comparable experiments, of which only one reached ck and E values over 0.9. However, the global integral index G remains low (0.75) and cannot be judged as sufficient. The total uncertainty of the calculated k-eff induced by cross-section (XS) data is, according to our calculation, 1.04%. The main contributors to this uncertainty are the 239Pu nubar and 238U inelastic scattering. The additional margin from uncovered sensitivities was determined to be 0.28%. The low number of similar experiments identified prevents the use of advanced XS adjustment and bias estimation methods; more experimental data are needed, and the presented results may serve as a basic step in the development of the necessary critical assemblies. Although exact data are not presented in the paper, the faster 44-energy-group calculation gives almost the same similarity-analysis results as the more complex 238-group calculation. Finally, it was demonstrated that the TSUNAMI-IP utility can play a significant role in future fast reactor development in Slovakia and in the Visegrad region. Clearly, further research and development and a strong effort are needed to arrive at a more complete methodology consisting of more plausible covariance data and related quantities. (authors)

  6. Sensitivity and Uncertainty Analysis of IAEA CRP HTGR Benchmark Using McCARD

    International Nuclear Information System (INIS)

    Jang, Sang Hoon; Shim, Hyung Jin

    2016-01-01

    The benchmark consists of 4 phases, starting from local standalone modeling (Phase I) and ending with the safety calculation of the coupled system under transient conditions (Phase IV). As a preliminary study of UAM on HTGRs, this paper covers Exercises 1 and 2 of Phase I, which define the unit cell and lattice geometry of the MHTGR-350 (General Atomics). The objective of these exercises is to quantify the uncertainty of the multiplication factor induced by perturbed nuclear data, as well as to analyze specific features of HTGRs such as double heterogeneity and self-shielding treatment. The uncertainty quantification of the IAEA CRP HTGR UAM benchmarks was conducted using the first-order adjoint-weighted perturbation (AWP) method in McCARD; the uncertainty of the multiplication factor was estimated for microscopic cross-section perturbations only. To reduce computation time and avoid memory shortage, the recently implemented uncertainty analysis module for the MC Wielandt calculation was adopted. The covariance data for the cross sections were generated by the NJOY/ERRORR module with ENDF/B-VII.1. The numerical results were compared with Serpent for the eigenvalue calculation and with the DeCART/MUSAD code system developed by KAERI for the S/U analysis. In the eigenvalue calculation, inconsistencies were found in the results obtained with the ENDF/B-VII.1 cross-section library; these were traced to the thermal scattering data of graphite. As for the S/U analysis, the McCARD results matched DeCART/MUSAD well but showed some discrepancy in 238U capture regarding implicit uncertainty.

  7. Uncertainty analysis on probabilistic fracture mechanics assessment methodology

    International Nuclear Information System (INIS)

    Rastogi, Rohit; Vinod, Gopika; Chandra, Vikas; Bhasin, Vivek; Babar, A.K.; Rao, V.V.S.S.; Vaze, K.K.; Kushwaha, H.S.; Venkat-Raj, V.

    1999-01-01

    Fracture mechanics has found profound use in the design of components and in assessing the fitness for purpose and residual life of operating components. Since defect size and material properties are statistically distributed, various probabilistic approaches have been employed for the computation of fracture probability; Monte Carlo simulation is one such procedure. This paper deals with uncertainty analysis using Monte Carlo simulation methods. These methods were developed on the basis of the R6 failure assessment procedure, which has been widely used in analysing the integrity of structures. The application of the method is illustrated with a case study. (author)
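    A Monte Carlo estimate of fracture probability of the kind described above can be sketched as follows. The distributions, the simplified stress-intensity relation, and all numbers are hypothetical; this is not the R6 assessment procedure itself.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Hypothetical distributions: the applied stress-intensity factor grows with
# the sampled defect size; failure occurs when it exceeds the sampled
# fracture toughness.
defect = rng.lognormal(mean=np.log(10.0), sigma=0.5, size=n)  # defect size [mm]
stress = 200.0                                                # applied stress [MPa]
k_applied = stress * np.sqrt(np.pi * defect * 1e-3)           # simplified K [MPa*sqrt(m)]
k_ic = rng.normal(50.0, 10.0, size=n)                         # toughness [MPa*sqrt(m)]

# Fracture probability = fraction of samples where the applied K exceeds toughness.
p_fail = np.mean(k_applied > k_ic)
print(f"estimated fracture probability: {p_fail:.3f}")
```

    The same sampling loop would, in a real assessment, evaluate the full failure assessment diagram rather than a single limit check.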

  8. Probabilistic Accident Consequence Uncertainty Analysis of the Food Chain Module in the COSYMA Package (invited paper)

    International Nuclear Information System (INIS)

    Brown, J.; Jones, J.A.

    2000-01-01

    This paper describes the uncertainty analysis of the food chain module of COSYMA and the uncertainty distributions on the input parameter values for the food chain model provided by the expert panels that were used for the analysis. Two expert panels were convened, covering the areas of soil and plant transfer processes and transfer to and through animals. The aggregated uncertainty distributions from the experts for the elicited variables were used in an uncertainty analysis of the food chain module of COSYMA. The main aim of the module analysis was to identify those parameters whose uncertainty makes large contributions to the overall uncertainty and so should be included in the overall analysis. (author)

  9. Analysis of uncertainties in CRAC2 calculations: the inhalation pathway

    International Nuclear Information System (INIS)

    Killough, G.G.; Dunning, D.E. Jr.

    1984-01-01

    CRAC2 is a computer code for estimating the health effects and economic costs that might result from a release of radioactivity from a nuclear reactor to the environment. This paper describes tests of sensitivity of the predicted health effects to uncertainties in parameters associated with inhalation of the released radionuclides. These parameters are the particle size of the carrier aerosol and, for each element in the release, the clearance parameters for the lung model on which the code's dose conversion factors for inhalation are based. CRAC2 uses hourly meteorological data and a straight-line Gaussian plume model to predict the transport of airborne radioactivity; it includes models for plume depletion and population evacuation, and data for the distributions of population and land use. The code can compute results for single weather sequences, or it can perform random sampling of weather sequences from the meteorological data file and compute results for each weather sequence in the sample. For the work described in this paper, we concentrated on three fixed weather sequences that represent a range of conditions. For each fixed weather sequence, we applied random sampling to joint distributions of the inhalation parameters in order to estimate the sensitivity of the predicted health effects. All sampling runs produced coefficients of variation that were less than 50%, but some differences of means between weather sequences were substantial, as were some differences between means and the corresponding CRAC2 results without random sampling. Early injuries showed differences of as much as 1 to 2 orders of magnitude, while the differences in early fatalities were less than a factor of 2. Latent cancer fatalities varied by less than 10%. 19 references, 6 figures, 3 tables

  10. The Impact of Uncertainty on Investment. A Meta-Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Koetse, M.J. [Department of Spatial Economics, Vrije Universiteit Amsterdam (Netherlands); De Groot, Henri L.F. [Tinbergen Institute, Amsterdam (Netherlands); Florax, R.J.G.M. [Department of Agricultural Economics, Purdue University, West Lafayette (United States)

    2006-07-01

    In this paper we perform a meta-analysis of empirical estimates of the impact of uncertainty on investment. Since the outcomes of primary studies are largely incomparable with respect to the magnitude of the effect, our analysis focuses on the direction and statistical significance of the relationship. The standard approach in this situation is to estimate an ordered probit model on a categorical estimate, defined in terms of the direction of the effect. The estimates are transformed into marginal effects in order to represent the changes in the probability of finding a negative significant, insignificant, or positive significant estimate. Although a meta-analysis generally does not allow for inferences on the correctness of model specifications in primary studies, our results give clear directions for model building in empirical investment research. For example, not including factor prices in investment models may seriously affect the model outcomes. Furthermore, we find that Q models produce more negative significant estimates than other models do, ceteris paribus. The outcome of a study is also affected by the type of data used in a primary study. Although it is clear that meta-analysis cannot always give decisive insights into the explanations for the variation in empirical outcomes, our meta-analysis shows that we can explain to a large extent why empirical estimates of the investment-uncertainty relationship differ.

  11. Information on Hydrologic Conceptual Models, Parameters, Uncertainty Analysis, and Data Sources for Dose Assessments at Decommissioning Sites

    International Nuclear Information System (INIS)

    Meyer, Philip D.; Gee, Glendon W.; Nicholson, Thomas J.

    1999-01-01

    This report addresses issues related to the analysis of uncertainty in dose assessments conducted as part of decommissioning analyses. The analysis is limited to the hydrologic aspects of the exposure pathway involving infiltration of water at the ground surface, leaching of contaminants, and transport of contaminants through the groundwater to a point of exposure. The basic conceptual models and mathematical implementations of three dose assessment codes are outlined along with the site-specific conditions under which the codes may provide inaccurate, potentially nonconservative results. In addition, the hydrologic parameters of the codes are identified and compared. A methodology for parameter uncertainty assessment is outlined that considers the potential data limitations and modeling needs of decommissioning analyses. This methodology uses generic parameter distributions based on national or regional databases, sensitivity analysis, probabilistic modeling, and Bayesian updating to incorporate site-specific information. Data sources for best-estimate parameter values and parameter uncertainty information are also reviewed. A follow-on report will illustrate the uncertainty assessment methodology using decommissioning test cases
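    The Bayesian updating step described above, combining a generic parameter distribution with site-specific data, can be sketched with a conjugate normal model. The prior, the measurements and the measurement variance are hypothetical values, not taken from the report's databases.

```python
import numpy as np

# Generic (prior) distribution for a hydrologic parameter, e.g. log10 of
# saturated hydraulic conductivity, from a hypothetical regional database.
prior_mean, prior_var = -4.0, 1.0       # log10(Ks [m/s])

# A few hypothetical site-specific measurements with known measurement variance.
site_data = np.array([-4.6, -4.4, -4.9])
meas_var = 0.25

# Conjugate normal update: the posterior precision is the sum of the prior
# precision and the data precision.
n = len(site_data)
post_var = 1.0 / (1.0 / prior_var + n / meas_var)
post_mean = post_var * (prior_mean / prior_var + site_data.sum() / meas_var)

print(f"posterior: mean={post_mean:.3f}, sd={np.sqrt(post_var):.3f}")
```

    The posterior variance is always smaller than the generic prior variance, which is exactly how site data narrow a generic distribution.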

  12. Information on Hydrologic Conceptual Models, Parameters, Uncertainty Analysis, and Data Sources for Dose Assessments at Decommissioning Sites

    International Nuclear Information System (INIS)

    Meyer, Philip D.; Gee, Glendon W.

    2000-01-01

    This report addresses issues related to the analysis of uncertainty in dose assessments conducted as part of decommissioning analyses. The analysis is limited to the hydrologic aspects of the exposure pathway involving infiltration of water at the ground surface, leaching of contaminants, and transport of contaminants through the groundwater to a point of exposure. The basic conceptual models and mathematical implementations of three dose assessment codes are outlined along with the site-specific conditions under which the codes may provide inaccurate, potentially nonconservative results. In addition, the hydrologic parameters of the codes are identified and compared. A methodology for parameter uncertainty assessment is outlined that considers the potential data limitations and modeling needs of decommissioning analyses. This methodology uses generic parameter distributions based on national or regional databases, sensitivity analysis, probabilistic modeling, and Bayesian updating to incorporate site-specific information. Data sources for best-estimate parameter values and parameter uncertainty information are also reviewed. A follow-on report will illustrate the uncertainty assessment methodology using decommissioning test cases

  13. Analysis of quantum error-correcting codes: Symplectic lattice codes and toric codes

    Science.gov (United States)

    Harrington, James William

    Quantum information theory is concerned with identifying how quantum mechanical resources (such as entangled quantum states) can be utilized for a number of information processing tasks, including data storage, computation, communication, and cryptography. Efficient quantum algorithms and protocols have been developed for performing some tasks (e.g., factoring large numbers, securely communicating over a public channel, and simulating quantum mechanical systems) that appear to be very difficult with just classical resources. In addition to identifying the separation between classical and quantum computational power, much of the theoretical focus in this field over the last decade has been concerned with finding novel ways of encoding quantum information that are robust against errors, which is an important step toward building practical quantum information processing devices. In this thesis I present some results on the quantum error-correcting properties of oscillator codes (also described as symplectic lattice codes) and toric codes. Any harmonic oscillator system (such as a mode of light) can be encoded with quantum information via symplectic lattice codes that are robust against shifts in the system's continuous quantum variables. I show the existence of lattice codes whose achievable rates match the one-shot coherent information over the Gaussian quantum channel. Also, I construct a family of symplectic self-dual lattices and search for optimal encodings of quantum information distributed between several oscillators. Toric codes provide encodings of quantum information into two-dimensional spin lattices that are robust against local clusters of errors and which require only local quantum operations for error correction. Numerical simulations of this system under various error models provide a calculation of the accuracy threshold for quantum memory using toric codes, which can be related to phase transitions in certain condensed matter models. I also present

  14. Uncertainty analysis in WWTP model applications: a critical discussion using an example from design

    DEFF Research Database (Denmark)

    Sin, Gürkan; Gernaey, Krist; Neumann, Marc B.

    2009-01-01

    This study focuses on uncertainty analysis of WWTP models and analyzes the issue of framing and how it affects the interpretation of uncertainty analysis results. As a case study, the prediction of uncertainty involved in model-based design of a wastewater treatment plant is studied. The Monte ... to stoichiometric, biokinetic and influent parameters; (2) uncertainty due to hydraulic behaviour of the plant and mass transfer parameters; (3) uncertainty due to the combination of (1) and (2). The results demonstrate that, depending on the way the uncertainty analysis is framed, the estimated uncertainty ... of design performance criteria differs significantly. The implication for the practical applications of uncertainty analysis in the wastewater industry is profound: (i) as the uncertainty analysis results are specific to the framing used, the results must be interpreted within the context of that framing ...

  15. Application of intelligence based uncertainty analysis for HLW disposal

    International Nuclear Information System (INIS)

    Kato, Kazuyuki

    2003-01-01

    Safety assessment for geological disposal of high-level radioactive waste inevitably involves factors that cannot be specified in a deterministic manner. These are namely: (1) 'variability', which arises from the stochastic nature of the processes and features considered, e.g., the distribution of canister corrosion times and the spatial heterogeneity of a host geological formation; (2) 'ignorance', due to incomplete or imprecise knowledge of the processes and conditions expected in the future, e.g., uncertainty in the estimation of solubilities and sorption coefficients for important nuclides. In many cases, a decision in the assessment, e.g., a selection among model options or the determination of a parameter value, is subject to both variability and ignorance in combined form. It is clearly important to evaluate the influences of both variability and ignorance on the result of a safety assessment in a consistent manner. We developed a unified methodology to handle variability and ignorance by using probabilistic and possibilistic techniques, respectively. The methodology has been applied to safety assessment of geological disposal of high-level radioactive waste. Uncertainties associated with scenarios, models and parameters were defined in terms of fuzzy membership functions derived through a series of interviews with the experts, while variability was formulated by means of probability density functions (pdfs) based on the available data set. The exercise demonstrated the applicability of the new methodology and, in particular, its advantage in quantifying uncertainties based on experts' opinions and in providing information on the dependence of the assessment result on the level of conservatism. In addition, it was also shown that sensitivity analysis could identify key parameters for reducing uncertainties associated with the overall assessment. The above information can be used to support the judgment process and guide the process of disposal system development in optimization of protection against
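    The possibilistic side of such a combined treatment can be sketched with alpha-cuts of a fuzzy membership function propagated through a monotone model. The triangular membership function and the response model below are illustrative assumptions only.

```python
# Triangular fuzzy number (low, mode, high) for a hypothetical sorption
# coefficient Kd; alpha-cuts give nested intervals of decreasing uncertainty.
def alpha_cut(low, mode, high, alpha):
    """Interval of values whose membership degree is at least alpha."""
    return (low + alpha * (mode - low), high - alpha * (high - mode))

# Hypothetical monotone response: a dose-like quantity decreasing in Kd.
def response(kd):
    return 10.0 / (1.0 + kd)

for alpha in (0.0, 0.5, 1.0):
    lo, hi = alpha_cut(1.0, 3.0, 8.0, alpha)
    # The model is decreasing in Kd, so the interval endpoints swap.
    y_lo, y_hi = response(hi), response(lo)
    print(f"alpha={alpha:.1f}: Kd in [{lo:.1f}, {hi:.1f}] -> y in [{y_lo:.2f}, {y_hi:.2f}]")
```

    At alpha = 1 the interval collapses to the most plausible value; at alpha = 0 it spans the full support, which is how the level of conservatism enters the result.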

  16. Genetic Code Analysis Toolkit: A novel tool to explore the coding properties of the genetic code and DNA sequences

    Science.gov (United States)

    Kraljić, K.; Strüngmann, L.; Fimmel, E.; Gumbel, M.

    2018-01-01

    The genetic code is degenerate and it is assumed that this redundancy provides error detection and correction mechanisms in the translation process. However, the biological meaning of the code's structure is still under current research. This paper presents the Genetic Code Analysis Toolkit (GCAT), which provides workflows and algorithms for the analysis of the structure of nucleotide sequences. In particular, sets or sequences of codons can be transformed and tested for circularity, comma-freeness, dichotomic partitions and other properties. GCAT comes with a fertile editor custom-built to work with the genetic code and a batch mode for multi-sequence processing. With the ability to read FASTA files or load sequences from GenBank, the tool can be used for the mathematical and statistical analysis of existing sequence data. GCAT is Java-based and provides a plug-in concept for extensibility. Availability: open source. Homepage: http://www.gcat.bio/
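    The comma-freeness test mentioned above can be sketched for trinucleotide codes: a code is comma-free if no codon appears at a frame-shifted position inside any concatenation of two codons. This is a minimal sketch of the property GCAT tests, not GCAT's own implementation.

```python
def is_comma_free(code):
    """Check comma-freeness of a set of trinucleotide codons.

    For every ordered pair of codons u, v, the two frame-shifted
    trinucleotides inside u+v must not themselves belong to the code.
    """
    code = set(code)
    for u in code:
        for v in code:
            pair = u + v
            if pair[1:4] in code or pair[2:5] in code:
                return False
    return True

print(is_comma_free({"ACG", "TCG"}))   # True: no shifted read is a codon
print(is_comma_free({"AAA", "ACG"}))   # False: AAA reappears in shifted frames
```

    A comma-free code lets a ribosome-like reader resynchronize after a frameshift, which is one proposed biological role of such structure.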

  17. Benchmarks for Uncertainty Analysis in Modelling (UAM) for the Design, Operation and Safety Analysis of LWRs - Volume I: Specification and Support Data for Neutronics Cases (Phase I)

    International Nuclear Information System (INIS)

    Ivanov, K.; Avramova, M.; Kamerow, S.; Kodeli, I.; Sartori, E.; Ivanov, E.; Cabellos, O.

    2013-01-01

    The objective of the OECD LWR UAM activity is to establish an internationally accepted benchmark framework to compare, assess and further develop different uncertainty analysis methods associated with the design, operation and safety of LWRs. As a result, the LWR UAM benchmark will help address current nuclear power generation industry and regulation needs and issues related to the practical implementation of risk-informed regulation. The realistic evaluation of consequences must be made with best-estimate coupled codes, but to be meaningful, such results should be supplemented by an uncertainty analysis. The use of coupled codes allows us to avoid unnecessary penalties due to incoherent approximations in the traditional decoupled calculations, and to obtain a more accurate evaluation of margins with respect to licensing limits. This becomes important for licensing power upgrades, improved fuel assembly and control rod designs, higher burn-up and other issues related to operating LWRs, as well as to the new Generation 3+ designs being licensed now (ESBWR, AP-1000, EPR-1600, etc.). Establishing an internationally accepted LWR UAM benchmark framework offers the possibility to accelerate the licensing process when using best-estimate methods. The proposed technical approach is to establish a benchmark for uncertainty analysis in best-estimate modelling and coupled multi-physics and multi-scale LWR analysis, using as a basis a series of well-defined problems with complete sets of input specifications and reference experimental data. The objective is to determine the uncertainty in LWR system calculations at all stages of coupled reactor physics/thermal-hydraulics calculations. The full chain of uncertainty propagation from basic data, engineering uncertainties, across different scales (multi-scale) and physics phenomena (multi-physics) will be tested on a number of benchmark exercises for which experimental data are available and for which the power plant details have been

  18. Biomass Thermogravimetric Analysis: Uncertainty Determination Methodology and Sampling Maps Generation

    Science.gov (United States)

    Pazó, Jose A.; Granada, Enrique; Saavedra, Ángeles; Eguía, Pablo; Collazo, Joaquín

    2010-01-01

    The objective of this study was to develop a methodology for determining the maximum sampling error and confidence intervals of thermal properties obtained from thermogravimetric analysis (TG), including moisture, volatile matter, fixed carbon and ash content. The sampling procedure of the TG analysis was of particular interest and was conducted with care. The results of the present study were compared to those of a proximate analysis, and no correlation between the mean values and maximum sampling errors of the two methods was observed. In general, low and acceptable levels of uncertainty and error were obtained, demonstrating that the properties evaluated by TG analysis were representative of the overall fuel composition. The accurate determination of the thermal properties of biomass with precise confidence intervals is of particular interest in energetic biomass applications. PMID:20717532
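    The maximum sampling error with a confidence interval described above can be sketched with the usual t-based formula; the replicate measurements below are invented values, not data from the study.

```python
import math
from statistics import mean, stdev
from scipy import stats

# Hypothetical replicate TG determinations of ash content [% dry basis]:
ash = [4.1, 4.3, 3.9, 4.2, 4.0, 4.4]

n = len(ash)
t = stats.t.ppf(0.975, df=n - 1)            # two-sided 95% Student-t quantile
max_error = t * stdev(ash) / math.sqrt(n)   # maximum sampling error of the mean
print(f"ash = {mean(ash):.2f} +/- {max_error:.2f} % (95% CI)")
```

    The same calculation applied per property (moisture, volatiles, fixed carbon, ash) yields the property-wise confidence intervals the methodology is after.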

  19. Application of coupled codes for safety analysis and licensing issues

    International Nuclear Information System (INIS)

    Langenbuch, S.; Velkov, K.

    2006-01-01

    An overview is given on the development and the advantages of coupled codes which integrate 3D neutron kinetics into thermal-hydraulic system codes. The work performed within GRS by coupling the thermal-hydraulic system code ATHLET and the 3D neutronics code QUABOX/CUBBOX is described as an example. The application of the coupled codes as best-estimate simulation tools for safety analysis is discussed. Some examples from German licensing practices are given which demonstrate how the improved analytical methods of coupled codes have contributed to solve licensing issues related to optimized and more economical use of fuel. (authors)

  20. Analytical validation of the CACECO containment analysis code

    International Nuclear Information System (INIS)

    Peak, R.D.

    1979-08-01

    The CACECO containment analysis code was developed to predict the thermodynamic responses of LMFBR containment facilities to a variety of accidents. This report covers the verification of the CACECO code by problems that can be solved by hand calculations or by reference to textbook and literature examples. The verification concentrates on the accuracy of the material and energy balances maintained by the code and on the independence of the four cells analyzed by the code so that the user can be assured that the code analyses are numerically correct and independent of the organization of the input data submitted to the code

  1. Users' guide to CACECO containment analysis code. [LMFBR

    Energy Technology Data Exchange (ETDEWEB)

    Peak, R.D.

    1979-06-01

    The CACECO containment analysis code was developed to predict the thermodynamic responses of LMFBR containment facilities to a variety of accidents. The code is included in the National Energy Software Center Library at Argonne National Laboratory as Program No. 762. This users' guide describes the CACECO code and its data input requirements. The code description covers the many mathematical models used and the approximations used in their solution. The descriptions are detailed to the extent that the user can modify the code to suit his unique needs, and, indeed, the reader is urged to consider code modification acceptable.

  2. Uncertainty analysis of multiple canister repository model by large-scale calculation

    International Nuclear Information System (INIS)

    Tsujimoto, K.; Okuda, H.; Ahn, J.

    2007-01-01

    A prototype uncertainty analysis has been made by using the multiple-canister radionuclide transport code VR for performance assessment of the high-level radioactive waste repository. Fractures in the host rock determine the main conduit of groundwater, and thus significantly affect the magnitude of radionuclide release rates from the repository. In this study, the probability distribution function (PDF) for the number of connected canisters in the same water-bearing fracture cluster has been determined in a Monte Carlo fashion by running the FFDF code with assumed PDFs for the fracture geometry. The uncertainty in the release rate of 237Np from a hypothetical repository containing 100 canisters has been quantitatively evaluated by using the VR code with PDFs for the number of connected canisters and the near-field rock porosity. The calculation results show that the mass transport is greatly affected by (1) the magnitude of the radionuclide source, determined by the number of canisters connected by the fracture cluster, and (2) the canister concentration effect in the same fracture network. The results also show two conflicting tendencies: the more fractures in the repository model space, the greater the average value, but the smaller the uncertainty, of the peak fractional release rate. To perform the vast amount of calculation, we have utilized the Earth Simulator and the SR8000. The multi-level hybrid programming method is applied in the optimization to exploit the high performance of the Earth Simulator. Latin hypercube sampling has been utilized to reduce the number of samplings in the Monte Carlo calculation. (authors)
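    Latin hypercube sampling, used above to reduce the number of Monte Carlo samplings, can be sketched directly: each dimension is split into equal-probability strata and exactly one sample is drawn per stratum. The porosity range below is an illustrative assumption.

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, rng):
    """One stratified sample per equal-probability bin in each dimension."""
    u = rng.uniform(size=(n_samples, n_dims))                    # jitter within bins
    bins = np.array([rng.permutation(n_samples) for _ in range(n_dims)]).T
    return (bins + u) / n_samples                                # values in (0, 1)

rng = np.random.default_rng(0)
x = latin_hypercube(10, 2, rng)

# Map unit-interval samples onto a physical range, e.g. a hypothetical
# near-field porosity between 0.05 and 0.30:
porosity = 0.05 + 0.25 * x[:, 0]
print(porosity.round(3))
```

    Compared with plain random sampling, every decile of each input distribution is guaranteed to be represented, so fewer model runs give comparably stable statistics.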

  3. Uranium Isotopic Analysis with the FRAM Isotopic Analysis Code

    International Nuclear Information System (INIS)

    Vo, D.T.; Sampson, T.E.

    1999-01-01

    FRAM is the acronym for Fixed-Energy Response-Function Analysis with Multiple efficiency. This software was developed at Los Alamos National Laboratory originally for plutonium isotopic analysis. Later, it was adapted for uranium isotopic analysis in addition to plutonium. It is a code based on a self-calibration using several gamma-ray peaks for determining the isotopic ratios. The versatile-parameter database structure governs all facets of the data analysis. User editing of the parameter sets allows great flexibility in handling data with different isotopic distributions, interfering isotopes, and different acquisition parameters such as energy calibration and detector type

  4. Use of error files in uncertainty analysis and data adjustment

    International Nuclear Information System (INIS)

    Chestnutt, M.M.; McCracken, A.K.

    1979-01-01

    Some results are given from uncertainty analyses on Pressurized Water Reactor (PWR) and Fast Reactor Theoretical Benchmarks. Upper limit estimates of calculated quantities are shown to be significantly reduced by the use of ENDF/B data covariance files and recently published few-group covariance matrices. Some problems in the analysis of single-material benchmark experiments are discussed with reference to the Winfrith iron benchmark experiment. Particular attention is given to the difficulty of making use of very extensive measurements which are likely to be a feature of this type of experiment. Preliminary results of an adjustment in iron are shown

  5. Uncertainty analysis in calculations of a road accident consequences

    International Nuclear Information System (INIS)

    Bonnefous, S.; Brenot, J.; Hubert, P.

    1995-01-01

    This paper develops a concrete situation, namely the search for an evacuation distance in the case of a road accident involving a chlorine tank. The methodological question is how to implement uncertainty analysis in deterministic models with random parameters. The study demonstrates a great dispersion in the results. It allows satisfactory decision rules to be established, together with a hierarchy of the parameters, which is useful for setting priorities in the search for information and for improving the treatment of these parameters. (authors). 8 refs., 1 fig., 2 tabs

  6. Codeword Structure Analysis for LDPC Convolutional Codes

    Directory of Open Access Journals (Sweden)

    Hua Zhou

    2015-12-01

    The codewords of a low-density parity-check (LDPC) convolutional code (LDPC-CC) are characterised as structured and non-structured. The number of structured codewords is dominated by the size of the polynomial syndrome former matrix H^T(D), while the number of non-structured ones depends on the particular monomials or polynomials in H^T(D). By evaluating the relationship of the codewords between the mother code and its super codes, the low-weight non-structured codewords in the super codes can be eliminated by appropriately choosing the monomials or polynomials in H^T(D), resulting in an improved distance spectrum of the mother code.
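    The syndrome relation underlying such codes can be sketched over GF(2): a vector is a codeword exactly when the parity-check matrix annihilates it. The small matrix below is a made-up example, not an actual LDPC-CC syndrome former.

```python
import numpy as np

# Hypothetical small binary parity-check matrix: a vector c is a codeword
# iff H @ c = 0 over GF(2) (i.e., every check sums to an even parity).
H = np.array([[1, 1, 0, 1, 0, 0],
              [0, 1, 1, 0, 1, 0],
              [1, 0, 1, 0, 0, 1]], dtype=int)

def is_codeword(c):
    """Syndrome test: all parity checks must vanish modulo 2."""
    return not np.any(H @ c % 2)

c = np.array([1, 1, 0, 0, 1, 1])   # satisfies all three checks
print(is_codeword(c))              # True
```

    For a convolutional LDPC code the entries of H become polynomials in the delay operator D, but the codeword condition is the same syndrome-former relation.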

  7. Additional challenges for uncertainty analysis in river engineering

    Science.gov (United States)

    Berends, Koen; Warmink, Jord; Hulscher, Suzanne

    2016-04-01

    the proposed intervention. The implicit assumption underlying such an analysis is that both models are commensurable. We hypothesize that they are commensurable only to a certain extent. In an idealised study we have demonstrated that prediction performance loss should be expected with increasingly large engineering works. When accounting for parametric uncertainty of floodplain roughness in model identification, we see uncertainty bounds for the predicted effects of interventions increase with increasing intervention scale. Calibration of these types of models therefore seems to have a shelf life, beyond which calibration no longer improves prediction. Therefore a qualification scheme for model use is required that can be linked to model validity. In this study, we characterize model use along three dimensions: extrapolation (using the model with different external drivers), extension (using the model for different output or indicators) and modification (using modified models). Such use of models is expected to have implications for the applicability of surrogate modelling for efficient uncertainty analysis as well, which is recommended for future research. Warmink, J. J.; Straatsma, M. W.; Huthoff, F.; Booij, M. J. & Hulscher, S. J. M. H. 2013. Uncertainty of design water levels due to combined bed form and vegetation roughness in the Dutch river Waal. Journal of Flood Risk Management 6, 302-318. DOI: 10.1111/jfr3.12014

  8. Sample application of sensitivity/uncertainty analysis techniques to a groundwater transport problem. National Low-Level Waste Management Program

    International Nuclear Information System (INIS)

    Seitz, R.R.; Rood, A.S.; Harris, G.A.; Maheras, S.J.; Kotecki, M.

    1991-06-01

    The primary objective of this document is to provide sample applications of selected sensitivity and uncertainty analysis techniques within the context of the radiological performance assessment process. These applications were drawn from the companion document Guidelines for Sensitivity and Uncertainty Analyses of Low-Level Radioactive Waste Performance Assessment Computer Codes (S. Maheras and M. Kotecki, DOE/LLW-100, 1990). Three techniques are illustrated in this document: one-factor-at-a-time (OFAT) analysis, fractional factorial design, and Latin hypercube sampling. The report also illustrates the differences in sensitivity and uncertainty analysis at the early and later stages of the performance assessment process, and potential pitfalls that can be encountered when applying the techniques. The emphasis is on application of the techniques as opposed to the actual results, since the results are hypothetical and are not based on site-specific conditions.
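The Latin hypercube sampling technique named above can be sketched in a few lines: the unit cube is stratified so that every parameter is sampled exactly once in each equal-probability bin. The two sampled parameters below (a distribution coefficient and an infiltration rate) and their ranges are hypothetical stand-ins for illustration, not values from the report, and the "response" is a toy expression rather than a performance-assessment model.

```python
import numpy as np

def latin_hypercube(n_samples, n_params, rng):
    """Stratified uniform(0,1) samples: one point per equal-probability bin."""
    # Divide [0,1] into n_samples strata per parameter and jitter inside each.
    u = (np.arange(n_samples)[:, None] + rng.random((n_samples, n_params))) / n_samples
    # Shuffle the strata independently per parameter to decouple the columns.
    for j in range(n_params):
        u[:, j] = rng.permutation(u[:, j])
    return u

rng = np.random.default_rng(42)
u = latin_hypercube(100, 2, rng)

# Map the unit-cube samples onto hypothetical parameter ranges, e.g. a
# sorption distribution coefficient Kd [mL/g] and an infiltration rate q [m/yr].
kd = 1.0 + 9.0 * u[:, 0]        # uniform on [1, 10]
q = 0.01 + 0.09 * u[:, 1]       # uniform on [0.01, 0.1]

# Propagate through a toy response (NOT an actual transport model).
dose = q / kd
print(dose.mean(), dose.std())
```

Compared with simple random sampling, the stratification guarantees that each parameter's full range is covered even with few samples, which is why the technique is popular for expensive performance-assessment codes.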

  9. The analysis of thermal-hydraulic models in MELCOR code

    Energy Technology Data Exchange (ETDEWEB)

    Kim, M H; Hur, C; Kim, D K; Cho, H J [Pohang University of Science and Technology, Pohang (Korea, Republic of)

    1996-07-15

    The objective of the present work is to verify the prediction and analysis capability of the MELCOR code for the progression of severe accidents in light water reactors, and to evaluate the appropriateness of the thermal-hydraulic models used in the code. To achieve this objective, results of experiments are compared with MELCOR calculations. In particular, a comparison between the CORA-13 experiment and the MELCOR code calculation was performed.

  10. Eigenvalue sensitivity analysis and uncertainty quantification in SCALE6.2.1 using continuous-energy Monte Carlo Method

    Energy Technology Data Exchange (ETDEWEB)

    Labarile, A.; Barrachina, T.; Miró, R.; Verdú, G., E-mail: alabarile@iqn.upv.es, E-mail: tbarrachina@iqn.upv.es, E-mail: rmiro@iqn.upv.es, E-mail: gverdu@iqn.upv.es [Institute for Industrial, Radiophysical and Environmental Safety - ISIRYM, Valencia (Spain); Pereira, C., E-mail: claubia@nuclear.ufmg.br [Universidade Federal de Minas Gerais (UFMG), Belo Horizonte, MG (Brazil). Departamento de Engenharia Nuclear

    2017-07-01

    The use of best-estimate computer codes is one of the greatest concerns in the nuclear industry, especially for licensing analysis. Of paramount importance is the estimation of the uncertainties of the whole system to establish safety margins based on highly reliable results. The estimation of these uncertainties should be performed by applying a methodology to propagate the uncertainties from the input parameters and the models implemented in the code to the output parameters. This study employs two different approaches for Sensitivity Analysis (SA) and Uncertainty Quantification (UQ): the adjoint-based perturbation theory of TSUNAMI-3D, and the stochastic sampling technique of SAMPLER/KENO. The cases studied are two models of light water reactors in the framework of the OECD/NEA UAM-LWR benchmark, a Boiling Water Reactor (BWR) and a Pressurized Water Reactor (PWR), both of them at Hot Full Power (HFP) and Hot Zero Power (HZP) conditions, with and without control rods. This work presents the results of k{sub eff} from the different simulations and discusses the comparison of the two methods employed. In particular, it presents a list of the major contributors to the uncertainty of k{sub eff} in terms of microscopic cross sections; their sensitivity coefficients; a comparison between the results of the two modules and with reference values; statistical information from the stochastic approach; and the probability and statistical confidence reached in the simulations. (author)
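The stochastic sampling idea behind the SAMPLER/KENO approach can be illustrated with a deliberately simplified one-group model: cross-section-like parameters are drawn from a covariance matrix, each sample is run through the model, and the spread of the resulting multiplication factor is collected. The nominal values and covariances below are illustrative placeholders, not evaluated nuclear data, and the one-group formula stands in for a full transport calculation.

```python
import numpy as np

# Toy one-group model: k_inf = nu * sigma_f / sigma_a.
nominal = np.array([2.43, 0.05, 0.10])   # nu, sigma_f [1/cm], sigma_a [1/cm]
# Relative covariance matrix of the three parameters (illustrative only).
rel_cov = np.diag([0.001**2, 0.02**2, 0.015**2])

rng = np.random.default_rng(0)
n = 1000
# Sample relative perturbations and apply them to the nominal parameters.
pert = rng.multivariate_normal(np.zeros(3), rel_cov, size=n)
samples = nominal * (1.0 + pert)
k = samples[:, 0] * samples[:, 1] / samples[:, 2]

k_mean, k_std = k.mean(), k.std(ddof=1)
print(f"k_inf = {k_mean:.4f} +/- {k_std:.4f}")
```

The sample standard deviation of k is the sampled analogue of the uncertainty that the adjoint (TSUNAMI-3D) route obtains by folding sensitivity coefficients with the covariance data.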

  11. Statistical analysis of the uncertainty related to flood hazard appraisal

    Science.gov (United States)

    Notaro, Vincenza; Freni, Gabriele

    2015-12-01

    The estimation of flood hazard frequency statistics for an urban catchment is of great interest in practice. It provides an evaluation of potential flood risk and related damage, and supports decision making for flood risk management. Flood risk is usually defined as a function of the probability that a system deficiency can cause flooding (hazard) and the expected damage due to the flooding magnitude (damage), taking into account both the exposure and the vulnerability of the goods at risk. The expected flood damage can be evaluated by an a priori estimation of potential damage caused by flooding or by interpolating real damage data. With regard to flood hazard appraisal, several procedures propose to identify a hazard indicator (HI), such as flood depth or the combination of flood depth and velocity, and to assess the flood hazard of the analyzed area by comparing the HI variables with user-defined threshold values or curves (penalty curves or matrices). However, flooding data are usually unavailable or piecemeal, which prevents a reliable flood hazard analysis; hazard analysis is therefore often performed by means of mathematical simulations aimed at evaluating water levels and flow velocities over the catchment surface. As a result, a great part of the uncertainties intrinsic to flood risk appraisal can be related to the hazard evaluation, due to the uncertainty inherent in modelling results and to the subjectivity of the user-defined hazard thresholds applied to link flood depth to a hazard level. In the present work, a statistical methodology is proposed for evaluating and reducing the uncertainties connected with hazard level estimation. The methodology has been applied to a real urban watershed as a case study.
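The threshold-based classification described above (comparing a hazard indicator against user-defined threshold values) can be sketched as follows. The depth-velocity product as HI and the threshold values are illustrative assumptions, not the paper's penalty curves.

```python
import numpy as np

# Toy hazard indicator: depth * velocity, classified against user-defined
# thresholds.  All numbers are illustrative placeholders.
depth = np.array([0.05, 0.3, 0.6, 1.2])      # flood depth [m]
velocity = np.array([0.1, 0.5, 1.0, 1.5])    # flow velocity [m/s]
hi = depth * velocity                        # hazard indicator [m^2/s]

thresholds = [0.1, 0.5, 1.0]                 # low / medium / high boundaries
hazard_level = np.digitize(hi, thresholds)   # classes 0 (none) .. 3 (high)
print(list(hazard_level))
```

The subjectivity the abstract points to lives entirely in the `thresholds` list: shifting those boundaries reclassifies cells near a threshold, which is one source of the uncertainty the proposed statistical methodology aims to quantify.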

  12. Parallelization of Subchannel Analysis Code MATRA

    International Nuclear Information System (INIS)

    Kim, Seongjin; Hwang, Daehyun; Kwon, Hyouk

    2014-01-01

    A stand-alone calculation with the MATRA code takes up considerable computing time for thermal margin calculations, and a relatively long time is needed to solve whole-core pin-by-pin problems. In addition, improving the computation speed of the MATRA code is strongly required to satisfy the overall performance of multi-physics coupling calculations. Therefore, a parallel approach to improve and optimize the computability of the MATRA code is proposed and verified in this study. The parallel algorithm was embodied in the MATRA code using the MPI communication method, and modification of the previous code structure was minimized. The improvement was confirmed by comparing the results between the single- and multiple-processor algorithms. The speedup and efficiency were also evaluated for increasing numbers of processors. The parallel algorithm was implemented in the subchannel code MATRA using MPI. The performance of the parallel algorithm was verified by comparing the results with those from MATRA with a single processor. It was also noted that the performance of the MATRA code was greatly improved by implementing the parallel algorithm for the 1/8-core and whole-core problems
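The speedup and efficiency figures mentioned above are conventionally defined as the serial-to-parallel runtime ratio and that ratio divided by the processor count. A minimal sketch, with made-up wall-clock times rather than actual MATRA measurements:

```python
def speedup(t_serial, t_parallel):
    """Ratio of serial to parallel wall-clock time."""
    return t_serial / t_parallel

def efficiency(t_serial, t_parallel, n_procs):
    """Speedup normalised by the number of processors (1.0 = ideal scaling)."""
    return speedup(t_serial, t_parallel) / n_procs

# Hypothetical wall-clock times [s] for a whole-core pin-by-pin problem;
# the numbers are invented for illustration only.
t1 = 1200.0
timings = {2: 640.0, 4: 340.0, 8: 190.0}
for p, tp in timings.items():
    print(p, round(speedup(t1, tp), 2), round(efficiency(t1, tp, p), 2))
```

Efficiency below 1.0 reflects communication overhead of the MPI exchanges, which typically grows with the processor count.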

  13. Development and application of best-estimate LWR safety analysis codes

    International Nuclear Information System (INIS)

    Reocreux, M.

    1997-01-01

    This paper is a review of the status and future orientations of the development and application of best-estimate LWR safety analysis codes. The present status of these codes shows considerable success and an almost complete fulfillment of the objectives assigned in the 1970s. The applications of best-estimate codes are numerous and cover a large variety of safety questions. However, these applications have raised a number of problems. The first concern the need for better control of the quality of the results, which imposes requirements on code assessment and on uncertainty evaluation. The second concern needs for code development, specifically regarding physical models, numerics, coupling with other codes, and programming. The analysis of the orientations for code development and application in the next years shows that some developments should be made without delay in order to address today's questions, whereas others are longer term and should be tested, for example in pilot programmes, before eventually being applied in main code development. Each of these development programmes is analyzed in the paper by detailing its main content and possible interest. (author)

  14. Module type plant system dynamics analysis code (MSG-COPD). Code manual

    International Nuclear Information System (INIS)

    Sakai, Takaaki

    2002-11-01

    MSG-COPD is a module type plant system dynamics analysis code which involves a multi-dimensional thermal-hydraulics calculation module to analyze pool type of fast breeder reactors. Explanations of each module and the methods for the input data are described in this code manual. (author)

  15. Sensitivity analysis and uncertainties simulation of the migration of radionuclide in the system of geological disposal-CRP-GEORC model

    International Nuclear Information System (INIS)

    Su Rui; Wang Ju; Chen Weiming; Zong Zihua; Zhao Honggang

    2008-01-01

    The CRP-GEORC concept model is an artificial geological disposal system for high-level radioactive waste. Sensitivity analysis and uncertainty simulation of the migration of the radionuclides Se-79 and I-129 in the far field of this system have been conducted using the GoldSim code. The simulation results show that the variables describing the geological features and the characterization of groundwater flow are the sensitive variables of the whole geological disposal system. The uncertainties of the parameters have a remarkable influence on the simulation results. (authors)

  16. Nuclear data uncertainty analysis for the generation IV gas-cooled fast reactor

    International Nuclear Information System (INIS)

    Pelloni, S.; Mikityuk, K.

    2012-01-01

    For the European 2400 MW Gas-cooled Fast Reactor (GoFastR), this paper summarizes a priori uncertainties, i.e. without any integral experiment assessment, of the main neutronic parameters, obtained on the basis of the deterministic code system ERANOS (Edition 2.2-N). JEFF-3.1 cross-sections were used in conjunction with the newest ENDF/B-VII.0 based covariance library (COMMARA-2.0) resulting from a recent cooperation of the Brookhaven and Los Alamos National Laboratories within the Advanced Fuel Cycle Initiative. The basis for the analysis is the original GoFastR concept with carbide fuel pins and silicon-carbide ceramic cladding, which was developed and proposed in the first quarter of 2009 by the French Alternative Energies and Atomic Energy Commission, CEA. The main conclusions from the current study are that nuclear data uncertainties of neutronic parameters may still be too large for this Generation IV reactor, especially concerning the multiplication factor, despite the fact that the new covariance library is quite complete. These uncertainties, in relative terms, do not show the a priori expected increase with burn-up as a result of minor actinide and fission product build-up. Indeed, they are found to be almost independent of fuel depletion, since the uncertainty associated with 238U inelastic scattering largely dominates. This finding clearly supports the activities of Subgroup 33 of the Working Party on International Nuclear Data Evaluation Cooperation (WPEC), i.e. methods and issues for the combined use of integral experiments and covariance data, which attempts to reduce the present unbiased uncertainties on nuclear data through adjustments based on available experimental data. (authors)
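Uncertainty figures of this kind are typically obtained with the first-order "sandwich rule", folding sensitivity coefficients with a covariance library: var(k) ≈ S C Sᵀ. A minimal sketch with placeholder numbers (not ERANOS sensitivities or COMMARA-2.0 data):

```python
import numpy as np

# "Sandwich rule": relative variance of k_eff = S C S^T, where S holds the
# relative sensitivity coefficients d(k)/k per d(sigma)/sigma for a few
# nuclear data parameters and C is their relative covariance matrix.
# All values below are illustrative placeholders.
S = np.array([0.30, -0.15, 0.05])
C = np.array([[4.0e-4, 1.0e-4, 0.0],
              [1.0e-4, 9.0e-4, 0.0],
              [0.0,    0.0,    1.0e-4]])

rel_var = S @ C @ S
rel_unc = np.sqrt(rel_var)          # relative uncertainty in k_eff
print(f"{100 * rel_unc:.3f} %")
```

The off-diagonal covariance terms are what make a complete library matter: with correlations present, individual contributions cannot simply be added in quadrature.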

  17. Selected examples of practical approaches for the assessment of model reliability - parameter uncertainty analysis

    International Nuclear Information System (INIS)

    Hofer, E.; Hoffman, F.O.

    1987-02-01

    The uncertainty analysis of model predictions has to discriminate between two fundamentally different types of uncertainty. The presence of stochastic variability (Type 1 uncertainty) necessitates the use of a probabilistic model instead of the much simpler deterministic one. Lack of knowledge (Type 2 uncertainty), however, applies to deterministic as well as to probabilistic model predictions and often dominates over uncertainties of Type 1. The term "probability" is interpreted differently in the probabilistic analysis of either type of uncertainty. After these distinctions have been explained, the discussion centers on the propagation of parameter uncertainties through the model, the derivation of quantitative uncertainty statements for model predictions, and the presentation and interpretation of the results of a Type 2 uncertainty analysis. Various alternative approaches are compared for a very simple deterministic model
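One common way to keep the two types separate in a computation is a two-loop (nested) Monte Carlo: an outer loop samples the imperfectly known parameters (Type 2), and an inner loop samples the stochastic variability (Type 1). The distributions below are assumed purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Type 2 (lack of knowledge): the log-mean of a lognormal transfer factor is
# itself uncertain.  Type 1 (stochastic variability): individual values
# scatter around that mean.  All distributions are illustrative.
n_outer, n_inner = 200, 500
p95 = np.empty(n_outer)
for i in range(n_outer):
    mu = rng.normal(0.0, 0.3)                              # epistemic draw
    x = rng.lognormal(mean=mu, sigma=0.5, size=n_inner)    # aleatory draws
    p95[i] = np.quantile(x, 0.95)    # a statistic of the inner variability

# The spread of the inner-loop statistic across outer draws shows how the
# Type 2 uncertainty propagates into the Type 1 summary.
print(np.quantile(p95, [0.05, 0.5, 0.95]))
```

Collapsing both loops into one would mix the two probability interpretations the abstract warns about; the nested structure keeps a distribution of distributions instead.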

  18. Sensitivity/uncertainty analysis for the Hiroshima dosimetry reevaluation effort

    International Nuclear Information System (INIS)

    Broadhead, B.L.; Lillie, R.A.; Pace, J.V. III; Cacuci, D.G.

    1987-01-01

    Uncertainty estimates and cross correlations by range/survivor location have been obtained for the free-in-air (FIA) tissue kerma for the Hiroshima atomic event. These uncertainties in the FIA kerma include contributions due to various modeling parameters and the basic cross section data and are given at three ground ranges, 700, 1000 and 1500 m. The estimated uncertainties are nearly constant over the given ground ranges and are approximately 27% for the prompt neutron kerma and secondary gamma kerma and 35% for the prompt gamma kerma. The total kerma uncertainty is dominated by the secondary gamma kerma uncertainties which are in turn largely due to the modeling parameter uncertainties

  19. OPR1000 RCP Flow Coastdown Analysis using SPACE Code

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Dong-Hyuk; Kim, Seyun [KHNP CRI, Daejeon (Korea, Republic of)

    2016-10-15

    The Korean nuclear industry developed a thermal-hydraulic analysis code for the safety analysis of PWRs, named SPACE (Safety and Performance Analysis Code for Nuclear Power Plants). The current loss-of-flow transient analysis of OPR1000 uses the COAST code to calculate the transient RCS (Reactor Coolant System) flow. The COAST code calculates RCS loop flow using pump performance curves and RCP (Reactor Coolant Pump) inertia. In this paper, the SPACE code is used to reproduce the RCS flowrates calculated by the COAST code. A loss-of-flow transient is a transient initiated by a reduction of forced reactor coolant circulation. Typical loss-of-flow transients are complete loss of flow (CLOF) and locked rotor (LR). The OPR1000 RCP flow coastdown analysis was performed with SPACE using a simplified nodalization. A complete loss of flow (4 RCP trip) was analyzed. The results show good agreement with those from the COAST code, the CE code for calculating RCS flow during loss-of-flow transients. Through this study, we confirmed that the SPACE code can be used instead of the COAST code for RCP flow coastdown analysis.
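For orientation, the shape of a pump coastdown can be sketched from a simple momentum balance: if the resistive torque scales with speed squared and loop flow follows pump speed, I dw/dt = -b w² gives the hyperbolic decay w(t) = w0/(1 + t/τ). This is a textbook idealisation, not the COAST or SPACE model, and the time constant below is a made-up illustration rather than an OPR1000 datum.

```python
import numpy as np

# Hyperbolic coastdown from I dw/dt = -b w^2, normalised to initial flow.
tau = 10.0                        # illustrative coastdown time constant [s]
t = np.linspace(0.0, 30.0, 301)
flow_frac = 1.0 / (1.0 + t / tau)   # fraction of initial loop flow
print(flow_frac[0], flow_frac[100], flow_frac[-1])
```

By construction the flow halves after one time constant; real coastdown curves computed with pump performance characteristics deviate from this idealised shape, which is precisely what the code-to-code comparison checks.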

  20. Joint analysis of epistemic and aleatory uncertainty in stability analysis for geo-hazard assessments

    Science.gov (United States)

    Rohmer, Jeremy; Verdel, Thierry

    2017-04-01

    Uncertainty analysis is an unavoidable task in the stability analysis of any geotechnical system. Such analysis usually relies on the safety factor SF: if SF is below some specified threshold, failure is possible. The objective of the stability analysis is then to estimate the failure probability P for SF to be below the specified threshold. When dealing with uncertainties, two facets should be considered, as outlined by several authors in the domain of geotechnics, namely "aleatoric uncertainty" (also named "randomness" or "intrinsic variability") and "epistemic uncertainty" (i.e. when facing "vague, incomplete or imprecise information" such as limited databases and observations or "imperfect" modelling). The benefits of separating both facets of uncertainty can be seen from a risk management perspective because: - Aleatoric uncertainty, being a property of the system under study, cannot be reduced. However, practical actions can be taken to circumvent the potentially dangerous effects of such variability; - Epistemic uncertainty, being due to the incomplete/imprecise nature of available information, can be reduced by e.g., increasing the number of tests (lab or in situ survey), improving the measurement methods or evaluating the calculation procedure with model tests, and confronting more information sources (expert opinions, data from literature, etc.). Uncertainty treatment in stability analysis is usually restricted to the probabilistic framework to represent both facets of uncertainty. Yet, in the domain of geo-hazard assessments (like landslides, mine pillar collapse, rockfalls, etc.), the validity of this approach can be debatable. In the present communication, we propose to review the major criticisms available in the literature against the systematic use of probability in situations of high degree of uncertainty. On this basis, the feasibility of using a more flexible uncertainty representation tool is then investigated, namely Possibility distributions (e
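A possibility distribution of the kind referred to above can be encoded as a fuzzy interval and queried through its alpha-cuts (nested intervals: the higher the possibility level, the tighter the interval). A minimal sketch with an illustrative trapezoidal possibility distribution for a soil friction angle; the shape parameters are assumptions, not values from the communication.

```python
# Trapezoidal possibility distribution for a friction angle [deg], encoded
# by its core (fully possible values) and support (widest plausible range).
# Both ranges are illustrative placeholders.
core = (30.0, 34.0)       # possibility = 1
support = (25.0, 40.0)    # possibility > 0

def alpha_cut(alpha):
    """Interval of values whose possibility is at least alpha (0 <= alpha <= 1)."""
    lo = support[0] + alpha * (core[0] - support[0])
    hi = support[1] - alpha * (support[1] - core[1])
    return lo, hi

for a in (0.0, 0.5, 1.0):
    print(a, alpha_cut(a))
```

Unlike a single probability law, this representation demands only bounds and a most-plausible range from the expert, which is the flexibility argued for in situations of poor information.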

  1. Stochastic dynamic analysis of marine risers considering Gaussian system uncertainties

    Science.gov (United States)

    Ni, Pinghe; Li, Jun; Hao, Hong; Xia, Yong

    2018-03-01

    This paper performs the stochastic dynamic response analysis of marine risers with material uncertainties, i.e. in the mass density and elastic modulus, by using the Stochastic Finite Element Method (SFEM) and a model reduction technique. These uncertainties are assumed to have Gaussian distributions. The random mass density and elastic modulus are represented by using the Karhunen-Loève (KL) expansion. The Polynomial Chaos (PC) expansion is adopted to represent the vibration response because the covariance of the output is unknown. Model reduction based on the Iterated Improved Reduced System (IIRS) technique is applied to eliminate the PC coefficients of the slave degrees of freedom and thus reduce the dimension of the stochastic system. Monte Carlo Simulation (MCS) is conducted to obtain the reference response statistics. Two numerical examples are studied in this paper. The response statistics from the proposed approach are compared with those from MCS. It is noted that the computational time is significantly reduced while the accuracy is maintained. The results demonstrate the efficiency of the proposed approach for stochastic dynamic response analysis of marine risers.
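The Karhunen-Loève representation used above can be sketched in its discrete form: the KL modes are the eigenpairs of the covariance matrix of the random field, and a truncated expansion combines the leading modes with independent standard normal variables. The kernel, correlation length and truncation order below are illustrative assumptions, not the paper's riser model.

```python
import numpy as np

# Discretised 1D Gaussian random field (e.g. elastic modulus along a riser)
# with an exponential covariance kernel; all numbers are illustrative.
n, L, corr_len, sigma = 200, 100.0, 20.0, 1.0
x = np.linspace(0.0, L, n)
C = sigma**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)

# KL modes: eigenpairs of the covariance matrix (discrete analogue of the
# Fredholm eigenproblem), sorted by decreasing eigenvalue.
eigval, eigvec = np.linalg.eigh(C)
idx = np.argsort(eigval)[::-1]
eigval, eigvec = eigval[idx], eigvec[:, idx]

m = 10                                   # truncation order
rng = np.random.default_rng(3)
xi = rng.standard_normal(m)              # independent standard normals
field = eigvec[:, :m] @ (np.sqrt(eigval[:m]) * xi)   # one realisation

# Fraction of the total field variance captured by the first m modes.
captured = eigval[:m].sum() / eigval.sum()
print(f"{captured:.3f}")
```

The same few random variables ξ then feed the PC expansion of the response, which is what makes the combined KL/PC approach cheaper than brute-force sampling of the full field.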

  2. Overview of hybrid subspace methods for uncertainty quantification, sensitivity analysis

    International Nuclear Information System (INIS)

    Abdel-Khalik, Hany S.; Bang, Youngsuk; Wang, Congjian

    2013-01-01

    Highlights: ► We overview the state-of-the-art in uncertainty quantification and sensitivity analysis. ► We overview new developments in above areas using hybrid methods. ► We give a tutorial introduction to above areas and the new developments. ► Hybrid methods address the explosion in dimensionality in nonlinear models. ► Representative numerical experiments are given. -- Abstract: The role of modeling and simulation has been heavily promoted in recent years to improve understanding of complex engineering systems. To realize the benefits of modeling and simulation, concerted efforts in the areas of uncertainty quantification and sensitivity analysis are required. The manuscript intends to serve as a pedagogical presentation of the material to young researchers and practitioners with little background on the subjects. We believe this is important as the role of these subjects is expected to be integral to the design, safety, and operation of existing as well as next generation reactors. In addition to covering the basics, an overview of the current state-of-the-art will be given with particular emphasis on the challenges pertaining to nuclear reactor modeling. The second objective will focus on presenting our own development of hybrid subspace methods intended to address the explosion in the computational overhead required when handling real-world complex engineering systems.
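The core idea behind subspace methods, probing a high-dimensional model with random inputs to identify a low-dimensional subspace that captures its behavior, can be sketched with a randomized range finder applied to a stand-in linear operator. The random low-rank matrix below plays the role of a code Jacobian; this is a generic illustration, not the authors' hybrid algorithm.

```python
import numpy as np

rng = np.random.default_rng(5)

# Stand-in for a linearised code response: a 300x1000 operator of exact
# rank 12 (illustrative dimensions).
n_out, n_in, rank = 300, 1000, 12
J = rng.standard_normal((n_out, rank)) @ rng.standard_normal((rank, n_in))

k = 20                                   # number of random probes >= rank
Y = J @ rng.standard_normal((n_in, k))   # "code runs" at random inputs
Q, _ = np.linalg.qr(Y)                   # orthonormal basis of the range

# Relative error of restricting the operator to the identified subspace.
err = np.linalg.norm(J - Q @ (Q.T @ J)) / np.linalg.norm(J)
print(err)
```

Because the effective rank (12) is far below the nominal input dimension (1000), twenty probing runs suffice to capture the operator essentially exactly; this collapse of dimensionality is what hybrid subspace methods exploit in nonlinear settings.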

  3. A retrospective dosimetry method and its uncertainty analysis

    International Nuclear Information System (INIS)

    Zhang, L.; Jia, D.; Dai, G.

    2000-01-01

    The main aim of a radiation epidemiological study is to assess the risk to a population exposed to ionizing radiation. The actual work of the assessment may be very difficult because dose information about the population is often indirect and incomplete. It is very important, therefore, to find a way of estimating reasonable and reliable doses for the population by a retrospective method from limited information. In order to provide reasonable dose information for the cohort study of Chinese medical diagnostic X-ray workers, a retrospective dosimetry method was established. In China, a cohort study of more than 27,000 medical diagnostic X-ray workers, with 25,000 controls, has been carried out for about fifteen years in order to assess the risk to an occupationally exposed population. Obviously, a key to the success of the study is to obtain reliable and reasonable dose estimates by the dose reconstruction method. Before 1985 there was a lack of directly measured personal dose information; however, other indirect information can be obtained. Examples are working loads from hospital documents, operational conditions of workers of different statuses from a survey of occupational history, and the exposure levels of various working conditions from simulation methods. The information for estimating organ dose can also be obtained by simulation experiments with a phantom. Based on the information mentioned above, a mathematical model and computerized system for dose reconstruction of this occupational population was designed and developed. Uncertainty analysis is very important for dose reconstruction. The sources of uncertainty in our study come from two fields: one is the model of dose reconstruction, the other is the survey of occupational history. In the results reported, the main results of the uncertainty analysis will be presented. In order to control the uncertainty of the

  4. Fast neutron analysis code SAD1

    International Nuclear Information System (INIS)

    Jung, M.; Ott, C.

    1985-01-01

    A listing and an example of the outputs of the M.C. code SAD1 are given here. This code has been used many times to predict the response of fast neutrons in hydrogenous materials (in our case emulsions or plastics) through elastic n,p scattering. It can easily be extended to other kinds of such materials and to any kind of incident fast-neutron spectrum

  5. Towards an Industrial Application of Statistical Uncertainty Analysis Methods to Multi-physical Modelling and Safety Analyses

    International Nuclear Information System (INIS)

    Zhang, Jinzhao; Segurado, Jacobo; Schneidesch, Christophe

    2013-01-01

    Since the 1980s, Tractebel Engineering (TE) has been developing and applying a multi-physical modelling and safety analysis capability, based on a code package consisting of the best-estimate 3D neutronic (PANTHER), system thermal-hydraulic (RELAP5), core sub-channel thermal-hydraulic (COBRA-3C), and fuel thermal-mechanical (FRAPCON/FRAPTRAN) codes. A series of methodologies have been developed to perform and license the reactor safety analysis and core reload design, based on the deterministic bounding approach. Following recent trends in research and development as well as in industrial applications, TE has been working since 2010 towards the application of statistical sensitivity and uncertainty analysis methods to multi-physical modelling and licensing safety analyses. In this paper, the TE multi-physical modelling and safety analysis capability is first described, followed by the proposed TE best-estimate plus statistical uncertainty analysis method (BESUAM). The chosen statistical sensitivity and uncertainty analysis methods (non-parametric order statistic method or bootstrap) and tool (DAKOTA) are then presented, followed by preliminary results of their application to FRAPCON/FRAPTRAN simulation of the OECD RIA fuel rod codes benchmark and RELAP5/MOD3.3 simulation of the THTF tests. (authors)
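The non-parametric order statistic method mentioned above is commonly based on Wilks' formula, which gives the number of code runs needed so that an order statistic of the output bounds a given quantile with given confidence. A small sketch of the one-sided case (the 95%/95% first-order answer of 59 runs is the classic result):

```python
import math

def wilks_confidence(n, coverage, order=1):
    """Confidence that the order-th largest of n samples bounds the
    coverage quantile (one-sided Wilks tolerance limit)."""
    return 1.0 - sum(math.comb(n, k) * (1 - coverage)**k * coverage**(n - k)
                     for k in range(order))

def wilks_sample_size(coverage, confidence, order=1):
    """Smallest n achieving the requested coverage/confidence pair."""
    n = order
    while wilks_confidence(n, coverage, order) < confidence:
        n += 1
    return n

print(wilks_sample_size(0.95, 0.95))           # first-order 95/95: 59 runs
print(wilks_sample_size(0.95, 0.95, order=2))  # second-order 95/95: 93 runs
```

The appeal for licensing analyses is that the required run count is independent of the number of uncertain input parameters, unlike factorial designs.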

  6. Application of data analysis techniques to nuclear reactor systems code accuracy assessment

    International Nuclear Information System (INIS)

    Kunz, R.F.; Kasmala, G.F.; Murray, C.J.; Mahaffy, J.H.

    2000-01-01

    An automated code assessment program (ACAP) has been developed by the authors to provide quantitative comparisons between nuclear reactor systems (NRS) code results and experimental measurements. This software was developed under subcontract to the United States Nuclear Regulatory Commission for use in its NRS code consolidation efforts. In this paper, background on the topic of NRS code accuracy and uncertainty assessment is provided, which motivates the development of and defines basic software requirements for ACAP. A survey of data analysis techniques was performed, focusing on the applicability of methods to the construction of NRS code-data comparison measures. The results of this review process, which further defined the scope, user interface, and process for using ACAP, are also summarized. A description of the software package and several sample applications to NRS data sets are provided. Its functionality and ability to provide objective accuracy assessment figures are demonstrated. (author)
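Code-data comparison measures of this kind can be as simple as mean error, RMS error and a normalised figure of merit. A sketch on made-up data; the metric definitions are generic textbook choices, not necessarily ACAP's exact measures.

```python
import numpy as np

# Hypothetical measured values and corresponding code predictions
# (e.g. void fractions at several axial positions); placeholders only.
data = np.array([1.00, 1.10, 1.25, 1.30, 1.20])
code = np.array([0.98, 1.15, 1.20, 1.35, 1.28])

err = code - data
mean_err = err.mean()                           # bias
rms_err = np.sqrt(np.mean(err**2))              # scatter + bias
rel_rms = rms_err / np.sqrt(np.mean(data**2))   # normalised figure of merit
print(mean_err, rms_err, rel_rms)
```

Separating bias (mean error) from scatter (RMS) matters when ranking candidate models: a model can be well centred yet noisy, or precise yet offset.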

  7. SCDAP: a light water reactor computer code for severe core damage analysis

    International Nuclear Information System (INIS)

    Marino, G.P.; Allison, C.M.; Majumdar, D.

    1982-01-01

    Development of the first code version (MODO) of the Severe Core Damage Analysis Package (SCDAP) computer code is described, and calculations made with SCDAP/MODO are presented. The objective of this computer code development program is to develop a capability for analyzing severe disruption of a light water reactor core, including fuel and cladding liquefaction, flow, and freezing; fission product release; hydrogen generation; quench-induced fragmentation; coolability of the resulting geometry; and ultimately vessel failure due to vessel-melt interaction. SCDAP will be used to identify the phenomena which control core behavior during a severe accident, to help quantify uncertainties in risk assessment analysis, and to support planning and evaluation of severe fuel damage experiments and data. SCDAP/MODO addresses the behavior of a single fuel bundle. Future versions will be developed with capabilities for core-wide and vessel-melt interaction analysis

  8. Methodology to carry out a sensitivity and uncertainty analysis for cross sections using a coupled model Trace-Parcs

    International Nuclear Information System (INIS)

    Reyes F, M. C.; Del Valle G, E.; Gomez T, A. M.; Sanchez E, V.

    2015-09-01

    A methodology was implemented to carry out a sensitivity and uncertainty analysis for the cross sections used in a coupled Trace/Parcs model of a control rod fall transient in a BWR-5. A model of the reactor core was used for the neutronic code Parcs, in which the assemblies located in the core are described. The thermo-hydraulic model in Trace was a simple one, in which only one component of type Chan was designed to represent all the core assemblies, placed within a single vessel, with boundary conditions established. The thermo-hydraulic part was coupled with the neutronic part, first for the steady state, and then the control rod fall transient was run for the sensitivity and uncertainty analysis. To analyze the cross sections used in the coupled Trace/Parcs model during the transient, probability density functions were generated for 22 parameters selected from the full set of neutronic parameters used by Parcs, yielding 100 different cases for the coupled model, each with a different cross-section database. All these cases were executed with the coupled model, producing 100 different output files for the control rod fall transient, with emphasis on the nominal power, for which an uncertainty analysis was performed and the uncertainty band generated. With this analysis it is possible to observe the ranges of the selected responses as the selected uncertain parameters vary. The sensitivity analysis complements the uncertainty analysis by identifying the parameter or parameters with the most influence on the results, so that attention can focus on these parameters in order to better understand their effects. Beyond the results obtained, since the model is not based on real operating data, the importance of this work lies in demonstrating the application of the methodology for carrying out sensitivity and uncertainty analyses. (Author)
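The uncertainty band construction described above — running an ensemble of perturbed-input cases and taking pointwise percentiles of the response — can be sketched generically. The synthetic "power traces" below are stand-ins for the 100 Trace/Parcs outputs, not actual transient results.

```python
import numpy as np

rng = np.random.default_rng(7)

# Ensemble of 100 hypothetical normalised power traces over 10 s: nominal
# power 1.0 plus a case-dependent decaying perturbation (illustrative only).
t = np.linspace(0.0, 10.0, 101)
cases = 1.0 + 0.1 * rng.standard_normal((100, 1)) * np.exp(-t)[None, :]

# Pointwise uncertainty band: 5th/95th percentiles across cases at each time.
band_lo = np.percentile(cases, 5, axis=0)
band_hi = np.percentile(cases, 95, axis=0)
print(band_hi[0] - band_lo[0], band_hi[-1] - band_lo[-1])
```

Plotting the nominal run inside this band is the usual way to present how the 22 perturbed cross-section parameters translate into output uncertainty.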

  9. Challenges on innovations of newly-developed safety analysis codes

    International Nuclear Information System (INIS)

    Yang, Yanhua; Zhang, Hao

    2016-01-01

    With the development of safety analysis methods, safety analysis codes face more challenges. Three challenges are presented in this paper: mathematical models, code design, and the user interface. Taking the self-reliant safety analysis code COSINE as an example, ways of meeting these requirements are suggested: to develop multi-phase, multi-field and multi-dimensional models; to adopt object-oriented code design ideas; and to improve modeling, calculation control and data post-processing in the user interface.

  10. Challenges on innovations of newly-developed safety analysis codes

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Yanhua [Shanghai Jiao Tong Univ. (China). School of Nuclear Science and Engineering; Zhang, Hao [State Nuclear Power Software Development Center, Beijing (China). Beijing Future Science and Technology City

    2016-05-15

    With the development of safety analysis methods, safety analysis codes face more challenges. Three challenges are presented in this paper: mathematical models, code design, and the user interface. Taking the self-reliant safety analysis code COSINE as an example, ways of meeting these requirements are suggested: to develop multi-phase, multi-field and multi-dimensional models; to adopt object-oriented code design ideas; and to improve modeling, calculation control and data post-processing in the user interface.

  11. ZERBERUS - the code for reliability analysis of crack containing structures

    International Nuclear Information System (INIS)

    Cizelj, L.; Riesch-Oppermann, H.

    1992-04-01

    A brief description of the First- and Second-Order Reliability Methods, which form the theoretical background of the code, is given. The code structure is described in detail, with special emphasis on the new application fields. The numerical example investigates the failure probability of steam generator tubing affected by stress corrosion cracking. The changes necessary to accommodate this analysis within the ZERBERUS code are explained. The analysis results are compared to different Monte Carlo techniques. (orig./HP) [de
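For the special case of a linear limit state g = R - S with independent normal resistance R and load S, the First-Order Reliability Method has a closed form: the Hasofer-Lind index is beta = (mu_R - mu_S)/sqrt(sigma_R^2 + sigma_S^2), and the failure probability is Phi(-beta). A sketch with illustrative numbers, not the steam generator tube case:

```python
import math

# FORM for a linear limit state g = R - S with independent normals.
# The means and standard deviations are illustrative placeholders.
mu_R, sd_R = 10.0, 1.5    # resistance
mu_S, sd_S = 6.0, 1.0     # load

beta = (mu_R - mu_S) / math.sqrt(sd_R**2 + sd_S**2)   # reliability index
pf = 0.5 * math.erfc(beta / math.sqrt(2.0))           # Phi(-beta)
print(beta, pf)
```

For nonlinear limit states (such as a fracture-mechanics criterion for a cracked tube) the index must be found iteratively at the design point, which is exactly the class of computation FORM/SORM codes like ZERBERUS automate; Monte Carlo then serves as the reference check, as in the abstract.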

  12. Geoengineering to Avoid Overshoot: An Analysis of Uncertainty

    Science.gov (United States)

    Tanaka, Katsumasa; Cho, Cheolhung; Krey, Volker; Patt, Anthony; Rafaj, Peter; Rao-Skirbekk, Shilpa; Wagner, Fabian

    2010-05-01

    ., 2009) is employed to calculate climate responses, including the associated uncertainty, and to estimate geoengineering profiles to cap the warming at 2°C since preindustrial times. The inversion setup for the model ACC2 is used to estimate the uncertain parameters (e.g. climate sensitivity) against associated historical observations (e.g. global-mean surface air temperature). Our preliminary results show that, under climate and scenario uncertainties, a geoengineering intervention to avoid an overshoot would be of medium intensity in the latter half of this century (≈ 1 Mt. Pinatubo eruption every 4 years in terms of stratospheric sulfur injections). The start year of the geoengineering intervention does not significantly influence the long-term geoengineering profile. However, a geoengineering intervention of medium intensity could bring about substantial environmental side effects such as the destruction of stratospheric ozone. Our results point to the necessity of persistently pursuing mainstream mitigation efforts. 2) Pollution Abatement and Geoengineering. The second study examines the potential of geoengineering combined with clean air policy. A drastic air pollution abatement might result in abrupt warming because it would suddenly remove the tropospheric aerosols that partly offset the background global warming (e.g. Andreae et al., 2005; Raddatz and Tanaka, 2010). This study investigates the magnitude of unrealized warming under a range of policy assumptions and associated uncertainties. The profile of geoengineering is then estimated to suppress the warming that would accompany clean air policy. This study is the first attempt to explore uncertainty in the warming caused by clean air policy; Kloster et al. (2009), which assesses regional changes in climate and the hydrological cycle, did not include the associated uncertainties in its analysis. A variety of policy assumptions will be devised to represent various degrees of air pollution abatement. 
These

  13. Uncertainty analysis comes to integrated assessment models for climate change…and conversely

    NARCIS (Netherlands)

    Cooke, R.M.

    2012-01-01

    This article traces the development of uncertainty analysis through three generations punctuated by large methodology investments in the nuclear sector. Driven by a very high perceived legitimation burden, these investments aimed at strengthening the scientific basis of uncertainty quantification.

  14. Analysis and evaluation of regulatory uncertainties in 10 CFR 60 subparts B and E

    International Nuclear Information System (INIS)

    Weiner, R.F.; Patrick, W.C.

    1990-01-01

    This paper presents an attribute analysis scheme for prioritizing the resolution of regulatory uncertainties. Attributes are presented which assist in identifying the need for timeliness and durability in the resolution of an uncertainty.

  15. Dynamic Simulation, Sensitivity and Uncertainty Analysis of a Demonstration Scale Lignocellulosic Enzymatic Hydrolysis Process

    DEFF Research Database (Denmark)

    Prunescu, Remus Mihail; Sin, Gürkan

    2014-01-01

    This study presents the uncertainty and sensitivity analysis of a lignocellulosic enzymatic hydrolysis model considering both model and feed parameters as sources of uncertainty. The dynamic model is parametrized for accommodating various types of biomass, and different enzymatic complexes...
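The record above combines Monte Carlo uncertainty propagation with sensitivity analysis of model and feed parameters. A common sensitivity measure in this style of analysis is the standardized regression coefficient (SRC); the sketch below, on synthetic data with hypothetical coefficients, is only an illustration of that generic technique, not of the authors' hydrolysis model.

```python
import numpy as np

def src(X, y):
    """Standardized regression coefficients: a linear global sensitivity
    measure for Monte Carlo samples of inputs X (n x k) and output y."""
    Xs = (X - X.mean(0)) / X.std(0, ddof=1)   # standardize inputs
    ys = (y - y.mean()) / y.std(ddof=1)       # standardize output
    A = np.column_stack([np.ones(len(ys)), Xs])
    beta, *_ = np.linalg.lstsq(A, ys, rcond=None)
    return beta[1:]                            # drop the intercept

# synthetic Monte Carlo sample: y depends strongly on x0, weakly on x1
rng = np.random.default_rng(3)
X = rng.normal(size=(1000, 3))
y = 2.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.1, size=1000)
coeffs = src(X, y)   # |SRC| ranks the inputs by influence
```

SRCs are reliable only when the input-output relation is close to linear; for strongly nonlinear models, variance-based indices are usually preferred.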

  16. Needs of the CSAU uncertainty method

    International Nuclear Information System (INIS)

    Prosek, A.; Mavko, B.

    2000-01-01

    The use of best-estimate codes for safety analysis requires quantification of the uncertainties. These uncertainties are inherently linked to the chosen safety analysis methodology. Worldwide, various methods have been proposed for this quantification. The purpose of this paper was to identify the needs of the Code Scaling, Applicability, and Uncertainty (CSAU) methodology and then to address them. The specific procedural steps were combined from other methods for uncertainty evaluation, and new tools and procedures were proposed. The uncertainty analysis approach and tools were then utilized for a confirmatory study. The uncertainty was quantified for the RELAP5/MOD3.2 thermal-hydraulic computer code. The results of the adapted CSAU approach applied to the small-break loss-of-coolant accident (SB LOCA) show that the adapted CSAU can be used for any thermal-hydraulic safety analysis with uncertainty evaluation. However, it was indicated that there are still some limitations in the CSAU approach that need to be resolved. (author)
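Uncertainty quantification with best-estimate codes, as in the CSAU-style analysis of this record, frequently relies on Wilks' formula to decide how many code runs are needed so that the largest observed output bounds a given quantile with given confidence. The sketch below computes the classic first-order, one-sided sample size; it is a generic illustration, not taken from the paper.

```python
def wilks_one_sided(coverage=0.95, confidence=0.95):
    """Smallest number of code runs n such that the maximum of the n
    outputs bounds the `coverage` quantile with probability `confidence`
    (first-order, one-sided Wilks criterion): 1 - coverage**n >= confidence."""
    n = 1
    while 1.0 - coverage**n < confidence:
        n += 1
    return n

print(wilks_one_sided())            # 95%/95% -> the well-known 59 runs
print(wilks_one_sided(0.95, 0.99))  # tighter confidence needs more runs
```

The sample size is independent of how many uncertain input parameters are varied, which is what makes the approach attractive for expensive thermal-hydraulic codes.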

  17. Bayesian uncertainty analysis with applications to turbulence modeling

    International Nuclear Information System (INIS)

    Cheung, Sai Hung; Oliver, Todd A.; Prudencio, Ernesto E.; Prudhomme, Serge; Moser, Robert D.

    2011-01-01

    In this paper, we apply Bayesian uncertainty quantification techniques to the processes of calibrating complex mathematical models and predicting quantities of interest (QoI's) with such models. These techniques also enable the systematic comparison of competing model classes. The processes of calibration and comparison constitute the building blocks of a larger validation process, the goal of which is to accept or reject a given mathematical model for the prediction of a particular QoI for a particular scenario. In this work, we take the first step in this process by applying the methodology to the analysis of the Spalart-Allmaras turbulence model in the context of incompressible, boundary layer flows. Three competing model classes based on the Spalart-Allmaras model are formulated, calibrated against experimental data, and used to issue a prediction with quantified uncertainty. The model classes are compared in terms of their posterior probabilities and their prediction of QoI's. The model posterior probability represents the relative plausibility of a model class given the data. Thus, it incorporates the model's ability to fit experimental observations. Alternatively, comparing models using the predicted QoI connects the process to the needs of decision makers that use the results of the model. We show that by using both the model plausibility and predicted QoI, one has the opportunity to reject some model classes after calibration, before subjecting the remaining classes to additional validation challenges.
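The model-class comparison described in this record ranks competing models by their posterior probabilities given the data. A minimal sketch of that Bayesian bookkeeping is below: given each model's (log-)evidence p(D|M_k), the posterior plausibilities follow from Bayes' theorem. The log-evidence values are hypothetical placeholders; computing them for a real turbulence model requires the full calibration machinery the paper describes.

```python
import math

def model_posteriors(log_evidences, priors=None):
    """Posterior plausibility P(M_k|D) of competing model classes from
    their log-evidences log p(D|M_k), with equal priors by default."""
    k = len(log_evidences)
    priors = priors or [1.0 / k] * k
    m = max(log_evidences)                 # shift for numerical stability
    w = [math.exp(le - m) * p for le, p in zip(log_evidences, priors)]
    z = sum(w)
    return [wi / z for wi in w]

# hypothetical log-evidences for three competing model classes
post = model_posteriors([-105.2, -103.1, -108.9])
```

As the record notes, high posterior probability only measures fit to the calibration data; comparing the models' predicted quantities of interest is a complementary check.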

  18. Quantification of Uncertainty in the Flood Frequency Analysis

    Science.gov (United States)

    Kasiapillai Sudalaimuthu, K.; He, J.; Swami, D.

    2017-12-01

    Flood frequency analysis (FFA) is usually carried out for the planning and design of water resources and hydraulic structures. Owing to the variability in sample representation, the selection of the distribution and the estimation of the distribution parameters, the estimation of flood quantiles has always been uncertain. Hence, suitable approaches must be developed to quantify the uncertainty in the form of a prediction interval, as an alternative to the deterministic approach. The framework developed in the present study to include uncertainty in FFA uses a multi-objective optimization approach to construct the prediction interval from an ensemble of flood quantiles. Through this approach, an optimal variability of distribution parameters is identified to carry out FFA. To demonstrate the proposed approach, annual maximum flow data from two gauge stations (Bow River at Calgary and at Banff, Canada) are used. The major focus of the present study was to evaluate the changes in the magnitude of flood quantiles due to the recent extreme flood event that occurred during 2013. In addition, the efficacy of the proposed method was further verified using standard bootstrap-based sampling approaches, and the proposed method was found to be more reliable in modeling extreme floods than the bootstrap methods.
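The bootstrap baseline this record compares against can be sketched compactly: fit a distribution to the annual maxima, resample with replacement, refit, and take percentiles of the resulting quantile ensemble. The example below uses a method-of-moments Gumbel fit on synthetic data; the distribution choice, parameter values and sample size are all illustrative assumptions, not the Bow River data.

```python
import numpy as np

def gumbel_quantile(sample, T):
    """T-year flood quantile from a method-of-moments Gumbel fit."""
    mean, std = sample.mean(), sample.std(ddof=1)
    beta = std * np.sqrt(6.0) / np.pi        # Gumbel scale
    mu = mean - 0.5772 * beta                # Gumbel location (Euler's gamma)
    p = 1.0 - 1.0 / T                        # non-exceedance probability
    return mu - beta * np.log(-np.log(p))

rng = np.random.default_rng(42)
flows = rng.gumbel(loc=1000.0, scale=300.0, size=60)   # synthetic annual maxima
# bootstrap ensemble of 100-year quantiles
boot = [gumbel_quantile(rng.choice(flows, flows.size), T=100)
        for _ in range(2000)]
lo, hi = np.percentile(boot, [2.5, 97.5])              # 95% interval
```

This captures sampling uncertainty for one fixed distribution; the paper's point is that distribution-choice and parameter-estimation uncertainty also need to enter the interval.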

  19. Selection of Representative Models for Decision Analysis Under Uncertainty

    Science.gov (United States)

    Meira, Luis A. A.; Coelho, Guilherme P.; Santos, Antonio Alberto S.; Schiozer, Denis J.

    2016-03-01

    The decision-making process in oil fields includes a step of risk analysis associated with the uncertainties present in the variables of the problem. Such uncertainties lead to hundreds, even thousands, of possible scenarios that are supposed to be analyzed so that an effective production strategy can be selected. Given this high number of scenarios, a technique to reduce this set to a smaller, feasible subset of representative scenarios is imperative. The selected scenarios must be representative of the original set and also free of optimistic and pessimistic bias. This paper proposes an assisted methodology to identify representative models in oil fields. To do so, a mathematical function was first developed to model the representativeness of a subset of models with respect to the full set that characterizes the problem. Then, an optimization tool was implemented to identify the representative models of any problem, considering not only the cross-plots of the main output variables, but also the risk curves and the probability distribution of the attribute levels of the problem. The proposed technique was applied to two benchmark cases and the results, evaluated by experts in the field, indicate that the obtained solutions are richer than those identified by previously adopted manual approaches. The program bytecode is available upon request.
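A crude baseline for the scenario-reduction problem this record addresses is percentile matching: pick the scenario whose output lies closest to each target percentile (e.g. P10/P50/P90) of the full ensemble. The sketch below shows that stand-in on hypothetical NPV values; the paper's method is richer, optimizing over cross-plots, risk curves and attribute distributions simultaneously.

```python
import numpy as np

def representative_models(values, probs=(0.1, 0.5, 0.9)):
    """Indices of the scenarios whose output is closest to each target
    percentile of the full ensemble (a simple percentile-matching baseline)."""
    targets = np.percentile(values, [100 * p for p in probs])
    return [int(np.argmin(np.abs(values - t))) for t in targets]

rng = np.random.default_rng(0)
npv = rng.normal(100.0, 25.0, size=500)   # hypothetical NPV of 500 scenarios
idx = representative_models(npv)          # e.g. P10, P50 and P90 scenarios
```

Matching a single output's percentiles cannot guarantee representativeness of joint behavior across several outputs, which is exactly the gap the paper's optimization formulation targets.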

  20. Uncertainty and Sensitivity Analysis Applied to the Validation of BWR Bundle Thermal-Hydraulic Calculations

    International Nuclear Information System (INIS)

    Hernandez-Solis, Augusto

    2010-04-01

    This work has two main objectives. The first is to enhance the validation process of the thermal-hydraulic features of the Westinghouse code POLCA-T. This is achieved by computing a quantitative validation limit based on statistical uncertainty analysis. This validation theory is applied to some of the benchmark cases of the following macroscopic BFBT exercises: 1) single- and two-phase bundle pressure drops, 2) steady-state cross-sectional averaged void fraction, 3) transient cross-sectional averaged void fraction and 4) steady-state critical power tests. Sensitivity analysis is also performed to identify the most important uncertain parameters for each exercise. The second objective is to show the clear advantages of using the quasi-random Latin Hypercube Sampling (LHS) strategy over simple random sampling (SRS). LHS allows a much better coverage of the input uncertainties than SRS because it stratifies densely across the range of each input probability distribution. The aim here is to compare both uncertainty analyses on the BWR assembly void axial profile prediction in steady state, and on the transient void fraction prediction at a certain axial level during a simulated re-circulation pump trip scenario. It is shown that the replicated void fraction mean (in either steady-state or transient conditions) has less variability when using LHS than SRS for the same number of calculations (i.e. the same input space sample size), even if the resulting void fraction axial profiles are non-monotonic. It is also shown that the void fraction uncertainty limits achieved with SRS by running 458 calculations (the sample size required to cover 95% of 8 uncertain input parameters with 95% confidence) are matched by LHS with only 100 calculations. These are thus clear indications of the advantages of using LHS. 
Finally, the present study contributes to a realistic analysis of nuclear reactors, in the sense that the uncertainties of
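The stratification property that gives LHS its edge over simple random sampling, as argued in this record, is easy to demonstrate: in an LHS design of n points, every dimension has exactly one point in each of its n equal-probability strata. The sketch below is a generic unit-cube implementation, not the sampling code used with POLCA-T.

```python
import numpy as np

def lhs(n, d, rng):
    """Latin Hypercube Sample: n points in [0,1)^d with exactly one point
    in each of the n equal-width strata of every dimension."""
    # place one jittered point per stratum, identically in all dimensions
    u = (rng.random((n, d)) + np.arange(n)[:, None]) / n
    for j in range(d):
        rng.shuffle(u[:, j])     # independent permutations decouple dimensions
    return u

rng = np.random.default_rng(1)
n, d = 100, 8
x_lhs = lhs(n, d, rng)           # stratified design
x_srs = rng.random((n, d))       # plain Monte Carlo for comparison
# stratum index of each LHS point in dimension 0: each of 0..n-1 appears once
counts = np.floor(x_lhs[:, 0] * n).astype(int)
```

For a code run, each row of `x_lhs` would be mapped through the inverse CDFs of the input uncertainty distributions; SRS offers no such per-dimension coverage guarantee, which is why its sample means fluctuate more at equal sample size.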

  1. Manometer Behavior Analysis using CATHENA, RELAP and GOTHIC Codes

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yang Hoon; Han, Kee Soo; Moon, Bok Ja; Jang, Misuk [Nuclear Engineering Service and Solution Co. Ltd., Daejeon (Korea, Republic of)

    2014-05-15

    In this presentation, a simple thermal-hydraulic behavior is analyzed using three codes to show the possibility of using alternative codes. We established three models of a simple U-tube manometer using three different codes: CATHENA (Canadian Algorithm for Thermal hydraulic Network Analysis), RELAP (Reactor Excursion and Leak Analysis Program) and GOTHIC (Generation of Thermal Hydraulic Information for Containments). CATHENA and RELAP are widely used codes for the analysis of the system behavior of CANDU and PWR plants, and the GOTHIC code has also been widely used for the analysis of thermal-hydraulic behavior in the containment system. In this paper, the internal behavior of a U-tube manometer was analyzed using the three codes. The general transient behavior is similar among the three codes. However, the behavior simulated using GOTHIC shows a somewhat different trend compared with the results from the other two codes at the end of the transient. This likely results from the use of a different physical model in GOTHIC, which is specialized for the multi-phase thermal-hydraulic analysis of the containment system, unlike the other two codes.
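The U-tube manometer benchmark in this record has a textbook analytic reference: an incompressible liquid column of total length L displaced by z obeys z'' = -(2g/L) z, giving the natural period T = 2π√(L/(2g)). The sketch below integrates that idealized equation, useful as a sanity check on any system code's result; the column length and initial displacement are hypothetical, and real code models add friction and form losses.

```python
import math

def manometer_period(L):
    """Undamped natural period of a U-tube liquid column of total length L."""
    g = 9.81
    return 2.0 * math.pi * math.sqrt(L / (2.0 * g))

def simulate(L=2.0, z0=0.1, dt=1e-4, t_end=2.0, damping=0.0):
    """Semi-implicit Euler integration of z'' = -(2g/L) z - damping * z',
    returning the column displacement z at t_end."""
    g, z, v, t = 9.81, z0, 0.0, 0.0
    while t < t_end:
        v += (-(2.0 * g / L) * z - damping * v) * dt   # update velocity first
        z += v * dt                                    # then displacement
        t += dt
    return z

T = manometer_period(2.0)   # ~2.0 s for a 2 m column
```

Integrating over exactly one period should return the column to (nearly) its initial displacement, which makes a convenient check of both the formula and the integrator.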

  2. Content Analysis Coding Schemes for Online Asynchronous Discussion

    Science.gov (United States)

    Weltzer-Ward, Lisa

    2011-01-01

    Purpose: Researchers commonly utilize coding-based analysis of classroom asynchronous discussion contributions as part of studies of online learning and instruction. However, this analysis is inconsistent from study to study with over 50 coding schemes and procedures applied in the last eight years. The aim of this article is to provide a basis…

  3. A Semantic Analysis Method for Scientific and Engineering Code

    Science.gov (United States)

    Stewart, Mark E. M.

    1998-01-01

    This paper develops a procedure to statically analyze aspects of the meaning or semantics of scientific and engineering code. The analysis involves adding semantic declarations to a user's code and parsing this semantic knowledge with the original code using multiple expert parsers. These semantic parsers are designed to recognize formulae in different disciplines including physical and mathematical formulae and geometrical position in a numerical scheme. In practice, a user would submit code with semantic declarations of primitive variables to the analysis procedure, and its semantic parsers would automatically recognize and document some static, semantic concepts and locate some program semantic errors. A prototype implementation of this analysis procedure is demonstrated. Further, the relationship between the fundamental algebraic manipulations of equations and the parsing of expressions is explained. This ability to locate some semantic errors and document semantic concepts in scientific and engineering code should reduce the t