WorldWideScience

Sample records for risk-based uncertainty analysis

  1. Approach to uncertainty in risk analysis

    Energy Technology Data Exchange (ETDEWEB)

    Rish, W.R.

    1988-08-01

    In the Fall of 1985, EPA's Office of Radiation Programs (ORP) initiated a project to develop a formal approach to dealing with uncertainties encountered when estimating and evaluating risks to human health and the environment. Based on a literature review of modeling uncertainty, interviews with ORP technical and management staff, and input from experts on uncertainty analysis, a comprehensive approach was developed. This approach recognizes by design the constraints on budget, time, manpower, expertise, and availability of information often encountered in "real world" modeling. It is based on the observation that in practice risk modeling is usually done to support a decision process. As such, the approach focuses on how to frame a given risk modeling problem, how to use that framing to select an appropriate mixture of uncertainty analysis techniques, and how to integrate the techniques into an uncertainty assessment that effectively communicates important information and insight to decision-makers. The approach is presented in this report. Practical guidance on characterizing and analyzing uncertainties about model form and quantities, and on effectively communicating uncertainty analysis results, is included. Examples from actual applications are presented.

  2. Approach to uncertainty in risk analysis

    International Nuclear Information System (INIS)

    Rish, W.R.

    1988-08-01

    In the Fall of 1985, EPA's Office of Radiation Programs (ORP) initiated a project to develop a formal approach to dealing with uncertainties encountered when estimating and evaluating risks to human health and the environment. Based on a literature review of modeling uncertainty, interviews with ORP technical and management staff, and input from experts on uncertainty analysis, a comprehensive approach was developed. This approach recognizes by design the constraints on budget, time, manpower, expertise, and availability of information often encountered in "real world" modeling. It is based on the observation that in practice risk modeling is usually done to support a decision process. As such, the approach focuses on how to frame a given risk modeling problem, how to use that framing to select an appropriate mixture of uncertainty analysis techniques, and how to integrate the techniques into an uncertainty assessment that effectively communicates important information and insight to decision-makers. The approach is presented in this report. Practical guidance on characterizing and analyzing uncertainties about model form and quantities, and on effectively communicating uncertainty analysis results, is included. Examples from actual applications are presented.

  3. Statistically based uncertainty assessments in nuclear risk analysis

    International Nuclear Information System (INIS)

    Spencer, F.W.; Diegert, K.V.; Easterling, R.G.

    1987-01-01

    Over the last decade, the problems of estimation and uncertainty assessment in probabilistic risk assessments (PRAs) have been addressed in a variety of NRC and industry-sponsored projects. These problems have received attention because of a recognition that major uncertainties in risk estimation exist, which can be reduced by collecting more and better data and other information, and because of a recognition that better methods for assessing these uncertainties are needed. In particular, a clear understanding of the nature and magnitude of various sources of uncertainty is needed to facilitate decision-making on possible plant changes and research options. Recent PRAs have employed methods of probability propagation, sometimes involving the use of Bayes' Theorem, and intended to formalize the use of "engineering judgment" or "expert opinion." All sources, or feelings, of uncertainty are expressed probabilistically, so that uncertainty analysis becomes simply a matter of probability propagation. Alternatives to forcing a probabilistic framework at all stages of a PRA are a major concern in this paper, however.

  4. Risk Analysis of Reservoir Flood Routing Calculation Based on Inflow Forecast Uncertainty

    Directory of Open Access Journals (Sweden)

    Binquan Li

    2016-10-01

    Possible risks in reservoir flood control and regulation cannot be objectively assessed by deterministic flood forecasts, resulting in a probability of reservoir failure. We demonstrated a risk analysis of reservoir flood routing calculation accounting for inflow forecast uncertainty in a sub-basin of the Huaihe River, China. The Xinanjiang model was used to provide deterministic flood forecasts, and was combined with the Hydrologic Uncertainty Processor (HUP) to quantify reservoir inflow uncertainty in probability density function (PDF) form. Furthermore, the PDFs of reservoir water level (RWL) and the risk rate of RWL exceeding a defined safety control level could be obtained. Results suggested that the median forecast (50th percentile) of HUP showed better agreement with observed inflows than the Xinanjiang model did in terms of the performance measures of flood process, peak, and volume. In addition, most observations (77.2%) were bracketed by the uncertainty band of the 90% confidence interval, with some small exceptions at high flows. Results proved that this framework of risk analysis could provide not only the deterministic forecasts of inflow and RWL, but also the fundamental uncertainty information (e.g., the 90% confidence band) for the reservoir flood routing calculation.
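
    The band-coverage check reported above (77.2% of observations inside the 90% interval) can be sketched in a few lines. The inflow values, predictive spread, and ensemble size below are invented for illustration, not taken from the study:

```python
import random
import statistics

random.seed(42)

def coverage_rate(obs, lower, upper):
    """Fraction of observations that fall inside the [lower, upper] band."""
    inside = sum(1 for o, lo, hi in zip(obs, lower, upper) if lo <= o <= hi)
    return inside / len(obs)

# Hypothetical probabilistic inflow forecasts: at each time step a
# predictive distribution (standing in for the HUP posterior) yields a
# 90% band; the "observation" is drawn from the same spread.
obs, lower, upper = [], [], []
for _ in range(200):
    forecast = random.gauss(100.0, 15.0)   # deterministic model output, m3/s
    sigma = 12.0                           # assumed predictive spread
    samples = [random.gauss(forecast, sigma) for _ in range(500)]
    qs = statistics.quantiles(samples, n=100)   # 99 percentile cut points
    lower.append(qs[4])                    # ~5th percentile
    upper.append(qs[94])                   # ~95th percentile
    obs.append(random.gauss(forecast, sigma))

rate = coverage_rate(obs, lower, upper)
print(f"90% band covers {rate:.1%} of observations")
```

    For a well-calibrated predictive distribution the coverage of a 90% band should sit near 90%, which is the comparison the abstract makes against its 77.2% figure.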

  5. Uncertainties in Cancer Risk Coefficients for Environmental Exposure to Radionuclides. An Uncertainty Analysis for Risk Coefficients Reported in Federal Guidance Report No. 13

    Energy Technology Data Exchange (ETDEWEB)

    Pawel, David [U.S. Environmental Protection Agency]; Leggett, Richard Wayne [ORNL]; Eckerman, Keith F. [ORNL]; Nelson, Christopher [U.S. Environmental Protection Agency]

    2007-01-01

    Federal Guidance Report No. 13 (FGR 13) provides risk coefficients for estimation of the risk of cancer due to low-level exposure to each of more than 800 radionuclides. Uncertainties in risk coefficients were quantified in FGR 13 for 33 cases (exposure to each of 11 radionuclides by each of three exposure pathways) on the basis of sensitivity analyses in which various combinations of plausible biokinetic, dosimetric, and radiation risk models were used to generate alternative risk coefficients. The present report updates the uncertainty analysis in FGR 13 for the cases of inhalation and ingestion of radionuclides and expands the analysis to all radionuclides addressed in that report. The analysis indicates that most risk coefficients for inhalation or ingestion of radionuclides are determined within a factor of 5 or less by current information. That is, application of alternate plausible biokinetic and dosimetric models and radiation risk models (based on the linear, no-threshold hypothesis with an adjustment for the dose and dose rate effectiveness factor) is unlikely to change these coefficients by more than a factor of 5. In this analysis the assessed uncertainty in the radiation risk model was found to be the main determinant of the uncertainty category for most risk coefficients, but conclusions concerning the relative contributions of risk and dose models to the total uncertainty in a risk coefficient may depend strongly on the method of assessing uncertainties in the risk model.
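
    The "factor of 5" statement can be made concrete with a small sketch: given alternative risk coefficients from plausible model combinations, take a central estimate and compute the multiplicative uncertainty factor. The coefficient values below are hypothetical, not taken from FGR 13:

```python
import math

# Hypothetical alternative risk coefficients (cancer risk per unit intake)
# produced by different plausible biokinetic/dosimetric/risk model combos.
alternatives = [4.2e-9, 6.8e-9, 9.1e-9, 1.3e-8, 1.9e-8]

# Geometric mean as the central estimate; the uncertainty factor is the
# larger of max/central and central/min.
central = math.exp(sum(math.log(v) for v in alternatives) / len(alternatives))
factor = max(max(alternatives) / central, central / min(alternatives))
print(f"central estimate {central:.2e}, uncertainty factor {factor:.1f}")
```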

  6. An introductory guide to uncertainty analysis in environmental and health risk assessment

    International Nuclear Information System (INIS)

    Hoffman, F.O.; Hammonds, J.S.

    1992-10-01

    To compensate for the potential for overly conservative estimates of risk using standard US Environmental Protection Agency methods, an uncertainty analysis should be performed as an integral part of each risk assessment. Uncertainty analyses allow one to obtain quantitative results in the form of confidence intervals that will aid in decision making and will provide guidance for the acquisition of additional data. To perform an uncertainty analysis, one must frequently rely on subjective judgment in the absence of data to estimate the range and a probability distribution describing the extent of uncertainty about a true but unknown value for each parameter of interest. This information is formulated from professional judgment based on an extensive review of literature, analysis of the data, and interviews with experts. Various analytical and numerical techniques are available to allow statistical propagation of the uncertainty in the model parameters to a statement of uncertainty in the risk to a potentially exposed individual. Although analytical methods may be straightforward for relatively simple models, they rapidly become complicated for more involved risk assessments. Because of the tedious efforts required to mathematically derive analytical approaches to propagate uncertainty in complicated risk assessments, numerical methods such as Monte Carlo simulation should be employed. The primary objective of this report is to provide an introductory guide for performing uncertainty analysis in risk assessments being performed for Superfund sites.
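
    The Monte Carlo propagation recommended above can be sketched as follows, assuming a toy linear risk model and invented lognormal parameter distributions (not the report's actual models or values):

```python
import math
import random
import statistics

random.seed(1)

def risk_model(intake, slope_factor):
    """Toy linear risk model: lifetime excess risk = intake * slope factor."""
    return intake * slope_factor

# Subjective (illustrative) lognormal distributions for the two parameters.
N = 10_000
risks = []
for _ in range(N):
    intake = random.lognormvariate(math.log(2.0), 0.5)    # mg/kg-day
    slope = random.lognormvariate(math.log(1e-3), 0.4)    # risk per mg/kg-day
    risks.append(risk_model(intake, slope))

qs = statistics.quantiles(risks, n=100)
print(f"median risk {qs[49]:.2e}, 90% interval [{qs[4]:.2e}, {qs[94]:.2e}]")
```

    The resulting percentiles are exactly the confidence-interval statements the guide argues should accompany a point estimate of risk.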

  7. Risk characterization: description of associated uncertainties, sensitivity analysis

    International Nuclear Information System (INIS)

    Carrillo, M.; Tovar, M.; Alvarez, J.; Arraez, M.; Hordziejewicz, I.; Loreto, I.

    2013-01-01

    The PowerPoint presentation covers risks from estimated levels of exposure, uncertainty and variability in the analysis, sensitivity analysis, risks from exposure to multiple substances, the formulation of guidelines for carcinogenic and genotoxic compounds, and risks to subpopulations.

  8. The explicit treatment of model uncertainties in the presence of aleatory and epistemic parameter uncertainties in risk and reliability analysis

    International Nuclear Information System (INIS)

    Ahn, Kwang Il; Yang, Joon Eon

    2003-01-01

    In the risk and reliability analysis of complex technological systems, the primary concern of formal uncertainty analysis is to understand why uncertainties arise, and to evaluate how they impact the results of the analysis. In recent times, many of the uncertainty analyses have focused on parameters of the risk and reliability analysis models, whose values are uncertain in an aleatory or an epistemic way. As the field of parametric uncertainty analysis matures, however, more attention is being paid to the explicit treatment of uncertainties that are addressed in the predictive model itself, as well as to the accuracy of the predictive model. The essential steps for evaluating the impacts of these model uncertainties in the presence of parameter uncertainties are to determine rigorously the various sources of uncertainty to be addressed in the underlying model itself and in turn in the model parameters, based on our state of knowledge and relevant evidence. Answering clearly the question of how to characterize and treat explicitly the foregoing different sources of uncertainty is particularly important for practical aspects such as risk and reliability optimization of systems, as well as for more transparent risk information and decision-making under various uncertainties. The main purpose of this paper is to provide practical guidance for quantitatively treating various model uncertainties that would often be encountered in the risk and reliability modeling process of complex technological systems.

  9. Uncertainty analysis in the applications of nuclear probabilistic risk assessment

    International Nuclear Information System (INIS)

    Le Duy, T.D.

    2011-01-01

    The aim of this thesis is to propose an approach for modeling the parameter and model uncertainties affecting the results of risk indicators used in applications of nuclear probabilistic risk assessment (PRA). After studying the limitations of the traditional probabilistic approach to representing uncertainty in PRA models, a new approach based on Dempster-Shafer theory is proposed. The uncertainty analysis process of the proposed approach consists of five main steps. The first step models input parameter uncertainties by belief and plausibility functions according to the data available for the PRA model. The second step propagates the parameter uncertainties through the risk model to lay out the uncertainties associated with the output risk indicators. Model uncertainty is then taken into account in the third step by considering possible alternative risk models. The fourth step is intended, firstly, to provide decision makers with the information needed for decision making under (parametric and model) uncertainty and, secondly, to identify the input parameters that contribute significant uncertainty to the result. The final step allows the process to be continued in a loop by studying the updating of belief functions given new data. The proposed methodology was implemented on a real but simplified application of a PRA model. (author)
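
    The belief and plausibility functions used in the first step can be illustrated with a minimal Dempster-Shafer sketch. The frame of discernment and the mass assignments below are hypothetical, chosen only to show the mechanics:

```python
# Hypothetical frame of discernment for an uncertain failure-rate category,
# with mass assigned to subsets (focal elements) rather than single outcomes.
masses = {
    frozenset({"low"}): 0.4,
    frozenset({"low", "med"}): 0.3,
    frozenset({"low", "med", "high"}): 0.3,
}

def belief(A, masses):
    """Bel(A): total mass of focal elements entirely contained in A."""
    return sum(m for B, m in masses.items() if B <= A)

def plausibility(A, masses):
    """Pl(A): total mass of focal elements that intersect A."""
    return sum(m for B, m in masses.items() if B & A)

A = frozenset({"low"})
print(f"Bel = {belief(A, masses):.2f}, Pl = {plausibility(A, masses):.2f}")
```

    The interval [Bel(A), Pl(A)] is what the approach propagates instead of a single probability, widening as the evidence becomes less specific.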

  10. The role of uncertainty analysis in dose reconstruction and risk assessment

    International Nuclear Information System (INIS)

    Hoffman, F.O.; Simon, S.L.; Thiessen, K.M.

    1996-01-01

    Dose reconstruction and risk assessment rely heavily on the use of mathematical models to extrapolate information beyond the realm of direct observation. Because models are merely approximations of real systems, their predictions are inherently uncertain. As a result, full disclosure of uncertainty in dose and risk estimates is essential to achieve scientific credibility and to build public trust. The need for formal analysis of uncertainty in model predictions was presented during the nineteenth annual meeting of the NCRP. At that time, quantitative uncertainty analysis was considered a relatively new and difficult subject practiced by only a few investigators. Today, uncertainty analysis has become synonymous with the assessment process itself. When an uncertainty analysis is used iteratively within the assessment process, it can guide experimental research to refine dose and risk estimates, deferring potentially high-cost or high-consequence decisions until uncertainty is either acceptable or irreducible. Uncertainty analysis is now mandated for all ongoing dose reconstruction projects within the United States, a fact that distinguishes dose reconstruction from other types of exposure and risk assessments. 64 refs., 6 figs., 1 tab.

  11. Assessing scenario and parametric uncertainties in risk analysis: a model uncertainty audit

    International Nuclear Information System (INIS)

    Tarantola, S.; Saltelli, A.; Draper, D.

    1999-01-01

    In the present study, a process of model audit is applied to a computational model used for predicting maximum radiological doses to humans in the field of nuclear waste disposal. Global uncertainty and sensitivity analyses are employed to assess output uncertainty and to quantify the contribution of parametric and scenario uncertainties to the model output. These tools are of fundamental importance for risk analysis and decision-making purposes.

  12. A scenario-based modeling approach for emergency evacuation management and risk analysis under multiple uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Lv, Y., E-mail: lvyying@hotmail.com [School of Traffic and Transportation, Beijing Jiaotong University, Beijing 100044 (China); Faculty of Engineering and Applied Science, University of Regina, Regina, Saskatchewan S4S 0A2 (Canada); Huang, G.H., E-mail: huang@iseis.org [Faculty of Engineering and Applied Science, University of Regina, Regina, Saskatchewan S4S 0A2 (Canada); Guo, L., E-mail: guoli8658@hotmail.com [Faculty of Engineering and Applied Science, University of Regina, Regina, Saskatchewan S4S 0A2 (Canada); Li, Y.P., E-mail: yongping.li@iseis.org [MOE Key Laboratory of Regional Energy and Environmental Systems Optimization, Resources and Environmental Research Academy, North China Electric Power University, Beijing 102206 (China); Dai, C., E-mail: daichao321@gmail.com [College of Environmental Sciences and Engineering, Peking University, Beijing 100871 (China); Wang, X.W., E-mail: wangxingwei0812@gamil.com [State Key Laboratory of Water Environment Simulation, School of Environment, Beijing Normal University, Beijing 100875 (China); Sun, W., E-mail: sunwei@iseis.org [Faculty of Engineering and Applied Science, University of Regina, Regina, Saskatchewan S4S 0A2 (Canada)

    2013-02-15

    Highlights: ► An interval-parameter joint-probabilistic integer programming method is developed. ► It is useful for nuclear emergency management practices under uncertainties. ► It can schedule optimal routes maximizing evacuees within a finite time. ► Scenario-based analysis enhances robustness in controlling system risk. ► The method will help to improve the capability of disaster responses. -- Abstract: Nuclear emergency evacuation is important to prevent radioactive harm from hazardous materials and to limit the accidents' consequences; however, uncertainties are involved in the components and processes of such a management system. In this study, an interval-parameter joint-probabilistic integer programming (IJIP) method is developed for emergency evacuation management under uncertainties. Optimization techniques of interval-parameter programming (IPP) and joint-probabilistic constrained (JPC) programming are incorporated into an integer linear programming framework, so that the approach can deal with uncertainties expressed as joint probabilities and interval values. The IJIP method can schedule the optimal routes to guarantee the maximum population evacuated away from the affected zone during a finite time. Furthermore, it can also facilitate post-optimization analysis to enhance robustness in controlling the system violation risk imposed on the joint-probabilistic constraints. The developed method has been applied to a case study of nuclear emergency management; meanwhile, a number of scenarios under different system conditions have been analyzed. It is indicated that the solutions are useful for evacuation management practices. The result of the IJIP method can not only help to raise the capability of disaster responses in a systematic manner, but also provide insight into the complex relationships among evacuation planning, resource utilization, policy requirements, and system risks.

  13. New challenges on uncertainty propagation assessment of flood risk analysis

    Science.gov (United States)

    Martins, Luciano; Aroca-Jiménez, Estefanía; Bodoque, José M.; Díez-Herrero, Andrés

    2016-04-01

    Natural hazards such as floods cause considerable damage to human life and to material and functional assets every year around the world. Risk assessment procedures carry an associated set of uncertainties, mainly of two types: natural, derived from the stochastic character inherent in flood process dynamics; and epistemic, associated with lack of knowledge or with inadequate procedures employed in the study of these processes. There is abundant scientific and technical literature on uncertainty estimation in each step of flood risk analysis (e.g. rainfall estimates, hydraulic modelling variables), but very little experience on the propagation of uncertainties along the whole flood risk assessment. Epistemic uncertainties are therefore the main goal of this work: in particular, understanding the extent of the propagation of uncertainties throughout the process, from inundability studies up to risk analysis, and how far a proper analysis of flood risk may vary as a result. Methodologies such as Polynomial Chaos Theory (PCT), the Method of Moments, or Monte Carlo are used to evaluate different sources of error, such as data records (precipitation gauges, flow gauges...), hydrologic and hydraulic modelling (inundation estimation), and socio-demographic data (damage estimation), in order to evaluate the uncertainty propagation (UP) involved in design flood risk estimation, in both numerical and cartographic expression. In order to account for the total uncertainty and to understand which factors contribute most to the final uncertainty, we used Polynomial Chaos Theory (PCT). It represents an interesting way to handle the inclusion of uncertainty in the modelling and simulation process: PCT allows for the development of a probabilistic model of the system in a deterministic setting, using random variables and polynomials to handle the effects of uncertainty. Results of applying the method are more robust than those of traditional analysis.
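
    A minimal polynomial chaos sketch, for a toy response of a single standard-normal input rather than the study's flood models: project the response onto probabilists' Hermite polynomials by Gauss-Hermite quadrature, then read the mean and variance directly off the coefficients. The model function and expansion order are assumptions of this sketch:

```python
import math
import numpy as np
from numpy.polynomial import hermite_e as He

def g(x):
    """Toy model response to a standard-normal input X."""
    return x**2 + 0.5 * x

# Project g onto probabilists' Hermite polynomials He_k with Gauss-Hermite
# quadrature (weight exp(-x^2/2); total weight sqrt(2*pi); ||He_k||^2 = k!).
nodes, weights = He.hermegauss(10)
norm = math.sqrt(2.0 * math.pi)
coeffs = []
for k in range(5):
    basis = He.hermeval(nodes, [0.0] * k + [1.0])   # He_k at the nodes
    ck = np.sum(weights * g(nodes) * basis) / (norm * math.factorial(k))
    coeffs.append(float(ck))

mean = coeffs[0]                     # E[g(X)] is the zeroth coefficient
var = sum(c**2 * math.factorial(k) for k, c in enumerate(coeffs[1:], 1))
print(f"PCE mean = {mean:.3f}, variance = {var:.3f}")
```

    Because moments come from the coefficients alone, no sampling of the model is needed once the expansion is built, which is the appeal of PCT for uncertainty propagation.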

  14. Where do uncertainties reside within environmental risk assessments? Expert opinion on uncertainty distributions for pesticide risks to surface water organisms.

    Science.gov (United States)

    Skinner, Daniel J C; Rocks, Sophie A; Pollard, Simon J T

    2016-12-01

    A reliable characterisation of uncertainties can aid uncertainty identification during environmental risk assessments (ERAs). However, typologies can be implemented inconsistently, causing uncertainties to go unidentified. We present an approach based on nine structured elicitations, in which subject-matter experts for pesticide risks to surface water organisms validate and assess three dimensions of uncertainty: its level (the severity of uncertainty, ranging from determinism to ignorance); nature (whether the uncertainty is epistemic or aleatory); and location (the data source or area in which the uncertainty arises). Risk characterisation contains the highest median levels of uncertainty, associated with estimating, aggregating and evaluating the magnitude of risks. Regarding the locations in which uncertainty is manifest, data uncertainty is dominant in problem formulation, exposure assessment and effects assessment. The comprehensive description of uncertainty presented here will enable risk analysts to prioritise the required phases, groups of tasks, or individual tasks within a risk analysis according to the highest levels of uncertainty, the potential for uncertainty to be reduced or quantified, or the types of location-based uncertainty, thus aiding uncertainty prioritisation during environmental risk assessments. In turn, it is expected to inform investment in uncertainty reduction or targeted risk management action.

  15. Risk uncertainty analysis methods for NUREG-1150

    International Nuclear Information System (INIS)

    Benjamin, A.S.; Boyd, G.J.

    1987-01-01

    Evaluation and display of risk uncertainties for NUREG-1150 constitute a principal focus of the Severe Accident Risk Rebaselining/Risk Reduction Program (SARRP). Some of the principal objectives of the uncertainty evaluation are: (1) to provide a quantitative estimate that reflects, for those areas considered, a credible and realistic range of uncertainty in risk; (2) to rank the various sources of uncertainty with respect to their importance for various measures of risk; and (3) to characterize the state of understanding of each aspect of the risk assessment for which major uncertainties exist. This paper describes the methods developed to fulfill these objectives.

  16. Assessment and uncertainty analysis of groundwater risk.

    Science.gov (United States)

    Li, Fawen; Zhu, Jingzhao; Deng, Xiyuan; Zhao, Yong; Li, Shaofei

    2018-01-01

    Groundwater with relatively stable quantity and quality is commonly used by human beings. However, with the over-mining of groundwater, problems such as groundwater funnels, land subsidence and saltwater intrusion have emerged. In order to avoid further deterioration of hydrogeological problems in over-mining regions, it is necessary to conduct an assessment of groundwater risk. In this paper, risks of shallow and deep groundwater in the water intake area of the South-to-North Water Transfer Project in Tianjin, China, were evaluated. Firstly, two sets of four-level evaluation index systems were constructed based on the different characteristics of shallow and deep groundwater. Secondly, based on the normalized factor values and the synthetic weights, the risk values of shallow and deep groundwater were calculated. Lastly, the uncertainty of the groundwater risk assessment was analyzed by the indicator kriging method. The results meet decision makers' demand for risk information, and overcome previous risk assessment results expressed in the form of deterministic point estimates, which ignore the uncertainty of risk assessment.
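
    The normalize-and-weight aggregation of the second step can be sketched as follows. The factor names, bounds, and weights are invented for illustration; the paper's actual index system has four levels and different indicators:

```python
# Illustrative aggregation: normalise raw factor values to [0, 1] against
# assumed bounds and combine them with weights into one risk value.
factors = {"exploitation_ratio": 1.3, "depth_decline_m_per_yr": 0.8,
           "salinity_mg_per_l": 900.0}
bounds = {"exploitation_ratio": (0.0, 2.0),
          "depth_decline_m_per_yr": (0.0, 2.0),
          "salinity_mg_per_l": (0.0, 2000.0)}
weights = {"exploitation_ratio": 0.5, "depth_decline_m_per_yr": 0.3,
           "salinity_mg_per_l": 0.2}

def normalise(value, lo, hi):
    """Clamp a raw factor value into [0, 1] relative to its bounds."""
    return min(max((value - lo) / (hi - lo), 0.0), 1.0)

risk = sum(weights[k] * normalise(v, *bounds[k]) for k, v in factors.items())
print(f"groundwater risk value: {risk:.3f}")
```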

  17. Uncertainty Instability Risk Analysis of High Concrete Arch Dam Abutments

    Directory of Open Access Journals (Sweden)

    Xin Cao

    2017-01-01

    The uncertainties associated with concrete arch dams rise with increased dam height. Given the uncertainties associated with influencing factors, the stability of high arch dam abutments was studied as a fuzzy random event. In addition, given the randomness and fuzziness of the calculation parameters as well as of the failure criterion, hazard-point and hazard-surface uncertainty instability risk ratio models were proposed for high arch dam abutments on the basis of credibility theory. The uncertainty instability failure criterion was derived through analysis of the progressive instability failure process on the basis of Shannon's entropy theory. The uncertainties associated with influencing factors were quantified by probability or possibility distribution assignments. Gaussian random theory was used to generate random realizations for influencing factors with spatial variability. The uncertainty stability analysis method was proposed by combining finite element analysis and the limit equilibrium method. The instability risk ratio was calculated using the Monte Carlo simulation method and fuzzy random post-processing. Results corroborate that the modeling approach is sound and that the calculation method is feasible.
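
    The Monte Carlo estimation of an instability risk ratio can be sketched with a toy limit-state function; the distributions, coefficients, and units below are illustrative assumptions, not the paper's finite element model:

```python
import random

random.seed(7)

def safety_margin(friction, cohesion, load):
    """Toy limit-state function: resistance minus driving force."""
    return friction * 800.0 + cohesion * 50.0 - load

# Illustrative parameter distributions for an abutment stability check.
N = 50_000
failures = 0
for _ in range(N):
    friction = random.gauss(0.9, 0.1)     # friction coefficient
    cohesion = random.gauss(1.2, 0.2)     # cohesion, MPa
    load = random.gauss(600.0, 80.0)      # driving force, arbitrary units
    if safety_margin(friction, cohesion, load) < 0:
        failures += 1

risk_ratio = failures / N
print(f"instability risk ratio = {risk_ratio:.4f}")
```

    The paper additionally passes each realization through a fuzzy failure criterion; this sketch uses a crisp margin < 0 criterion for brevity.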

  18. Overview of methods for uncertainty analysis and sensitivity analysis in probabilistic risk assessment

    International Nuclear Information System (INIS)

    Iman, R.L.; Helton, J.C.

    1985-01-01

    Probabilistic Risk Assessment (PRA) is playing an increasingly important role in the nuclear reactor regulatory process. The assessment of uncertainties associated with PRA results is widely recognized as an important part of the analysis process. One of the major criticisms of the Reactor Safety Study was that its representation of uncertainty was inadequate. The desire for the capability to treat uncertainties with the MELCOR risk code being developed at Sandia National Laboratories is indicative of the current interest in this topic. However, as yet, uncertainty analysis and sensitivity analysis in the context of PRA is a relatively immature field. In this paper, available methods for uncertainty analysis and sensitivity analysis in a PRA are reviewed. This review first treats methods for use with individual components of a PRA and then considers how these methods could be combined in the performance of a complete PRA. In the context of this paper, the goal of uncertainty analysis is to measure the imprecision in PRA outcomes of interest, and the goal of sensitivity analysis is to identify the major contributors to this imprecision. There are a number of areas that must be considered in uncertainty analysis and sensitivity analysis for a PRA: (1) information, (2) systems analysis, (3) thermal-hydraulic phenomena/fission product behavior, (4) health and economic consequences, and (5) display of results. Each of these areas and the synthesis of them into a complete PRA are discussed.
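
    One standard sensitivity technique in this sampling-based setting is rank (Spearman) correlation between sampled inputs and the PRA outcome, which flags the inputs contributing most to output imprecision. A self-contained sketch with a toy two-input model (the model and distributions are assumptions of the sketch):

```python
import random

random.seed(3)

def rank(xs):
    """Return the rank (0..n-1) of each element of xs."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0] * len(xs)
    for pos, i in enumerate(order):
        r[i] = pos
    return r

def spearman(xs, ys):
    """Spearman rank correlation (no tie handling; inputs are continuous)."""
    rx, ry = rank(xs), rank(ys)
    n = len(xs)
    m = (n - 1) / 2
    cov = sum((a - m) * (b - m) for a, b in zip(rx, ry))
    vx = sum((a - m) ** 2 for a in rx)
    vy = sum((b - m) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

# Toy PRA output driven by two uncertain inputs with unequal influence.
N = 2000
x1 = [random.uniform(0, 1) for _ in range(N)]
x2 = [random.uniform(0, 1) for _ in range(N)]
y = [10 * a + b + random.gauss(0, 0.5) for a, b in zip(x1, x2)]

print(f"rho(x1, y) = {spearman(x1, y):.2f}, rho(x2, y) = {spearman(x2, y):.2f}")
```

    The dominant input shows a rank correlation near 1 while the weak one stays near 0, giving the ranking of uncertainty contributors the review describes.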

  19. Integrated Risk-Capability Analysis under Deep Uncertainty : An ESDMA Approach

    NARCIS (Netherlands)

    Pruyt, E.; Kwakkel, J.H.

    2012-01-01

    Integrated risk-capability analysis methodologies for dealing with increasing degrees of complexity and deep uncertainty are urgently needed in an ever more complex and uncertain world. Although scenario approaches, risk assessment methods, and capability analysis methods are used, few organizations

  20. The characterisation and evaluation of uncertainty in probabilistic risk analysis

    International Nuclear Information System (INIS)

    Parry, G.W.; Winter, P.W.

    1980-10-01

    The sources of uncertainty in probabilistic risk analysis are discussed using the event/fault tree methodology as an example. The role of statistics in quantifying these uncertainties is investigated. A class of uncertainties is identified which is, at present, unquantifiable, using either classical or Bayesian statistics. It is argued that Bayesian statistics is the more appropriate vehicle for the probabilistic analysis of rare events and a short review is given with some discussion on the representation of ignorance. (author)
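
    The Bayesian treatment of rare events argued for here is commonly illustrated with the conjugate gamma-Poisson update of a failure rate; the prior parameters and operating experience below are hypothetical:

```python
# Conjugate gamma-Poisson update of a rare-event failure rate.
# Prior Gamma(alpha, beta) on lambda (events per unit of exposure time);
# observing k events in T units of exposure gives Gamma(alpha + k, beta + T).
alpha0, beta0 = 0.5, 100.0     # illustrative diffuse prior
k, T = 2, 1500.0               # hypothetical observed events and exposure

alpha1, beta1 = alpha0 + k, beta0 + T
prior_mean = alpha0 / beta0
post_mean = alpha1 / beta1
print(f"prior mean {prior_mean:.2e}, posterior mean {post_mean:.2e}")
```

    Even with only two observed events, the posterior mean is pulled well below the prior mean, which is how sparse rare-event data and prior judgment are combined in practice.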

  1. Uncertainty analysis in vulnerability estimations for elements at risk - a review of concepts and some examples on landslides

    Science.gov (United States)

    Ciurean, R. L.; Glade, T.

    2012-04-01

    Decision under uncertainty is a constant of everyday life and an important component of risk management and governance. Recently, experts have emphasized the importance of quantifying uncertainty in all phases of landslide risk analysis. Due to its multi-dimensional and dynamic nature, (physical) vulnerability is inherently complex, and the "degree of loss" estimates are imprecise and to some extent even subjective. Uncertainty analysis introduces quantitative modeling approaches that allow for a more explicitly objective output, improving the risk management process as well as enhancing communication between various stakeholders for better risk governance. This study presents a review of concepts for uncertainty analysis in the vulnerability of elements at risk to landslides. Different semi-quantitative and quantitative methods are compared based on their feasibility in real-world situations, hazard dependency, process stage in vulnerability assessment (i.e. input data, model, output), and applicability within an integrated landslide hazard and risk framework. The resulting observations will help to identify current gaps and future needs in vulnerability assessment, including estimation of uncertainty propagation, transferability of the methods, and development of visualization tools, and will also address basic questions such as what uncertainty is and how it can be quantified or treated in a reliable and reproducible way.

  2. Interpretations of alternative uncertainty representations in a reliability and risk analysis context

    International Nuclear Information System (INIS)

    Aven, T.

    2011-01-01

    Probability is the predominant tool used to measure uncertainties in reliability and risk analyses. However, other representations also exist, including imprecise (interval) probability, fuzzy probability and representations based on the theories of evidence (belief functions) and possibility. Many researchers in the field are strong proponents of these alternative methods, but some are also sceptical. In this paper, we address one basic requirement set for quantitative measures of uncertainty: the interpretation needed to explain what an uncertainty number expresses. We question to what extent the various measures meet this requirement. Comparisons are made with probabilistic analysis, where uncertainty is represented by subjective probabilities, using either a betting interpretation or a reference to an uncertainty standard interpretation. By distinguishing between chances (expressing variation) and subjective probabilities, new insights are gained into the link between the alternative uncertainty representations and probability.

  3. Uncertainty and sensitivity analysis of flood risk management decisions based on stationary and nonstationary model choices

    Directory of Open Access Journals (Sweden)

    Rehan Balqis M.

    2016-01-01

    Full Text Available Current practice in flood frequency analysis assumes that the stochastic properties of extreme floods follow those of stationary conditions. As the influences of human intervention and anthropogenic climate change on hydrometeorological variables are becoming evident in some places, there have been suggestions that nonstationary statistics would better represent the stochastic properties of extreme floods. The probabilistic estimation of non-stationary models, however, is surrounded by uncertainty related to scarcity of observations and modelling complexities, hence the difficulty of projecting future conditions. In the face of an uncertain future and the subjectivity of model choices, this study attempts to demonstrate the practical implications of applying a nonstationary model and compares it with a stationary model in flood risk assessment. A fully integrated framework to simulate decision makers’ behaviour in flood frequency analysis is thereby developed. The framework is applied to hypothetical flood risk management decisions and the outcomes are compared with those of known underlying future conditions. Uncertainty in the economic performance of the risk-based decisions is assessed through Monte Carlo simulations. Sensitivity of the results is also tested by varying the possible magnitude of future changes. The application provides quantitative and qualitative comparative results that satisfy a preliminary analysis of whether the nonstationary model complexity should be applied to improve the economic performance of decisions. Results obtained from the case study show that the relative differences between competing models for all considered possible future changes are small, suggesting that stationary assumptions are preferred to a shift to nonstationary statistics for practical applications of flood risk management. Nevertheless, the nonstationary assumption should also be considered during the planning stage in addition to the stationary assumption.
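
    A minimal sketch of the stationary-versus-nonstationary comparison (not the authors' integrated framework): fit a stationary Gumbel distribution to synthetic annual maxima that contain a trend, then compare the 100-year flood estimate with a trend-aware variant. The data, the trend, the method-of-moments fit, and the shortcut of treating the trend as known are all illustrative assumptions.

```python
import math
import random
import statistics

random.seed(1)

# Synthetic annual maxima with a linear trend in the Gumbel location parameter.
def sample_annual_maxima(n_years, mu0, trend, beta):
    out = []
    for t in range(n_years):
        u = random.random()
        out.append((mu0 + trend * t) - beta * math.log(-math.log(u)))
    return out

def gumbel_fit_moments(x):
    """Method-of-moments Gumbel fit: beta from the sample std, mu from the mean."""
    beta = math.sqrt(6) * statistics.stdev(x) / math.pi
    mu = statistics.fmean(x) - 0.5772 * beta
    return mu, beta

def gumbel_quantile(mu, beta, T):
    """T-year return level: non-exceedance probability 1 - 1/T."""
    return mu - beta * math.log(-math.log(1 - 1 / T))

data = sample_annual_maxima(50, mu0=100.0, trend=0.3, beta=15.0)

# Stationary fit ignores the trend; the nonstationary variant detrends the
# record and re-centres the location at the final year (a crude stand-in
# for a fitted time-varying location parameter).
mu_s, beta_s = gumbel_fit_moments(data)
q100_stationary = gumbel_quantile(mu_s, beta_s, 100)

detrended = [x - 0.3 * t for t, x in enumerate(data)]  # trend assumed known here
mu_n, beta_n = gumbel_fit_moments(detrended)
q100_nonstat = gumbel_quantile(mu_n + 0.3 * 49, beta_n, 100)
print(q100_stationary, q100_nonstat)
```

Repeating such a comparison over many Monte Carlo replicates of the record, and over varied trend magnitudes, is the kind of sensitivity exercise the abstract describes.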

  4. GENERAL RISKS AND UNCERTAINTIES OF REPORTING AND MANAGEMENT REPORTING RISKS

    Directory of Open Access Journals (Sweden)

    CAMELIA I. LUNGU

    2011-04-01

    Full Text Available Purpose: Highlighting risks and uncertainties reporting based on a literature review research. Objectives: The delimitation of risk management models and uncertainties in fundamental research. Research method: Fundamental research study directed to identify the relevant risk models presented in entities’ financial statements. Uncertainty is one of the fundamental coordinates of our world. As J.K. Galbraith (1978) showed, the world now lives in the age of uncertainty. Moreover, we can say that contemporary society's development could be achieved by taking decisions under uncertainty and, thus, risk. Growing concern for the study of uncertainty, its effects and precautions led to the rather recent emergence of a new science, the science of hazards (les cindyniques, Fr.) (Kenvern, 1991). Current analyses of risk are dominated by Beck’s (1992) notion that a risk society now exists whereby we have become more concerned about our impact upon nature than the impact of nature upon us. Clearly, risk permeates most aspects of corporate as well as everyday decision-making, and few can predict the future with any precision. Risk is almost always a major variable in real-world corporate decision-making, and managers who ignore it are in real peril. In these circumstances, a possible answer is assuming financial discipline with an appropriate system of incentives.

  5. Operational Implementation of a Pc Uncertainty Construct for Conjunction Assessment Risk Analysis

    Science.gov (United States)

    Newman, Lauri K.; Hejduk, Matthew D.; Johnson, Lauren C.

    2016-01-01

    Earlier this year the NASA Conjunction Assessment and Risk Analysis (CARA) project presented the theoretical and algorithmic aspects of a method to include the uncertainties in the calculation inputs when computing the probability of collision (Pc) between two space objects, principally uncertainties in the covariances and the hard-body radius. Rather than a single Pc value, this approach produces an entire probability density function representing the range of possible Pc values given the uncertainties in the inputs, bringing CA risk analysis methodologies more in line with modern risk management theory. The present study provides results from the exercise of this method against an extended dataset of satellite conjunctions in order to determine the effect of its use on the evaluation of conjunction assessment (CA) event risk posture. The effects are found to be considerable: a good number of events are downgraded from or upgraded to a serious risk designation on the basis of consideration of the Pc uncertainty. The findings counsel the integration of the developed methods into NASA CA operations.
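
    A drastically simplified sketch of the central idea, a Pc distribution instead of a point estimate: a common small hard-body-radius approximation for Pc under an isotropic in-plane covariance is evaluated over sampled values of the uncertain covariance scale and hard-body radius. The approximation and every parameter value are assumptions for illustration; the operational CARA computation is far more elaborate.

```python
import math
import random

random.seed(0)

def pc_point(miss, sigma, hbr):
    """Common small-HBR approximation for collision probability with an
    isotropic in-plane covariance: Pc ~ (HBR^2 / (2 sigma^2)) * exp(-d^2 / (2 sigma^2))."""
    return (hbr ** 2 / (2 * sigma ** 2)) * math.exp(-miss ** 2 / (2 * sigma ** 2))

# Instead of one Pc value, sample the uncertain inputs (covariance scale and
# hard-body radius; the numbers are illustrative) to get a Pc distribution.
samples = []
for _ in range(20000):
    sigma = 200.0 * random.lognormvariate(0.0, 0.3)   # uncertain covariance scale, m
    hbr = random.uniform(15.0, 25.0)                  # uncertain hard-body radius, m
    samples.append(pc_point(miss=500.0, sigma=sigma, hbr=hbr))

samples.sort()
median = samples[len(samples) // 2]
p95 = samples[int(0.95 * len(samples))]
print(median, p95)
```

Comparing the upper percentile of such a distribution against a risk threshold, rather than the point Pc, is what drives the upgrades and downgrades of event risk posture described in the abstract.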

  6. Analysis of uncertainty in modeling perceived risks

    International Nuclear Information System (INIS)

    Melnyk, R.; Sandquist, G.M.

    2005-01-01

    Expanding on a mathematical model developed for quantifying and assessing perceived risks, the distribution functions, variances, and uncertainties associated with estimating the model parameters are quantified. The analytical model permits the identification and assignment of any number of quantifiable risk perception factors that can be incorporated within standard risk methodology. Those risk perception factors associated with major technical issues are modeled using lognormal probability density functions to span the potentially large uncertainty variations associated with these risk perceptions. The model quantifies the logic of public risk perception and provides an effective means for measuring and responding to perceived risks. (authors)
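
    A hedged sketch of the modelling idea: risk-perception factors represented by lognormal densities multiply a baseline technical risk, and sampling yields the distribution and spread of the perceived risk. The factor names and parameters below are invented for illustration, not taken from the paper.

```python
import math
import random
import statistics

random.seed(42)

# Each perception factor multiplies a baseline technical risk; lognormal
# densities span the wide uncertainty in how the public weights each factor.
# Factor medians and log-standard-deviations are illustrative assumptions.
factors = {"dread": (2.0, 0.5), "familiarity": (1.5, 0.4), "control": (1.2, 0.3)}
baseline_risk = 1e-6

draws = []
for _ in range(10000):
    perceived = baseline_risk
    for median_f, sd_log in factors.values():
        perceived *= random.lognormvariate(math.log(median_f), sd_log)
    draws.append(perceived)

mean = statistics.fmean(draws)
sd = statistics.stdev(draws)
print(mean, sd)
```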

  7. Stochastic goal programming based groundwater remediation management under human-health-risk uncertainty

    International Nuclear Information System (INIS)

    Li, Jing; He, Li; Lu, Hongwei; Fan, Xing

    2014-01-01

    Highlights: • We propose an integrated optimal groundwater remediation design approach. • The approach can address stochasticity in carcinogenic risks. • Goal programming is used to drive the system toward ideal operation and remediation effects. • The uncertainty in the slope factor is evaluated under different confidence levels. • Optimal strategies are obtained to support remediation design under uncertainty. - Abstract: An optimal design approach for groundwater remediation is developed by incorporating numerical simulation, health risk assessment, uncertainty analysis and nonlinear optimization within a general framework. Stochastic analysis and goal programming are introduced into the framework to handle uncertainties in real-world groundwater remediation systems. Carcinogenic risks associated with remediation actions are further evaluated at four confidence levels. The differences between ideal and predicted constraints are minimized by goal programming. The approach is then applied to a contaminated site in western Canada to create a set of optimal remediation strategies. Results from the case study indicate that factors including environmental standards, health risks and technical requirements mutually affect and constrain one another. Stochastic uncertainty exists throughout the entire process of remediation optimization and should be taken into consideration in groundwater remediation design.

  8. Uncertainty Analysis of A Flood Risk Mapping Procedure Applied In Urban Areas

    Science.gov (United States)

    Krause, J.; Uhrich, S.; Bormann, H.; Diekkrüger, B.

    In the framework of the IRMA-Sponge program, the presented study was part of the joint research project FRHYMAP (flood risk and hydrological mapping). A simple conceptual flooding model (FLOODMAP) has been developed to simulate flooded areas beside rivers within cities. FLOODMAP requires a minimum of input data (digital elevation model (DEM), river line, water level plain) and parameters, and calculates the flood extent as well as the spatial distribution of flood depths. Of course, the simulated model results are affected by errors and uncertainties. Possible sources of uncertainties are the model structure, model parameters and input data. Thus, after the model validation (comparison of the simulated to the observed flood extent, taken from airborne pictures), the uncertainty of the essential input data set (digital elevation model) was analysed. Monte Carlo simulations were performed to assess the effect of uncertainties concerning the statistics of DEM quality and to derive flooding probabilities from the set of simulations. The questions concerning the minimum resolution of a DEM required for flood simulation and concerning the best aggregation procedure for a given DEM were answered by comparing the results obtained using all available standard GIS aggregation procedures. Seven different aggregation procedures were applied to high-resolution DEMs (1-2 m) in three cities (Bonn, Cologne, Luxembourg). Based on this analysis, the effect of 'uncertain' DEM data was estimated and compared with other sources of uncertainties. Especially socio-economic information and monetary transfer functions required for a damage risk analysis show a high uncertainty. Therefore, this study helps to analyse the weak points of the flood risk and damage risk assessment procedure.
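
    The Monte Carlo treatment of DEM uncertainty described above can be sketched on a toy grid: a cell floods when the water level exceeds its (noisy) elevation, and repeated perturbation of the elevations yields per-cell flooding probabilities. The elevations, water level, and error standard deviation are illustrative assumptions.

```python
import random

random.seed(7)

# Toy DEM (m a.s.l.) and a flat water level; vertical DEM error is modelled
# as zero-mean Gaussian noise whose sigma would come from the DEM quality
# statistics (value assumed here).
dem = [[51.2, 50.8, 50.3],
       [50.9, 50.4, 49.9],
       [50.5, 50.1, 49.6]]
water_level = 50.45
sigma_dem = 0.3
n_runs = 5000

rows, cols = len(dem), len(dem[0])
flood_counts = [[0] * cols for _ in range(rows)]
for _ in range(n_runs):
    for i in range(rows):
        for j in range(cols):
            z = dem[i][j] + random.gauss(0.0, sigma_dem)
            if water_level > z:
                flood_counts[i][j] += 1

flood_prob = [[c / n_runs for c in row] for row in flood_counts]
print(flood_prob)
```

Cells well below the water level come out with probabilities near one and cells well above it near zero; the interesting output is the fringe of intermediate probabilities, which is exactly where DEM quality matters.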

  9. Integrating risk analysis and multi-criteria decision support under uncertainty in electricity distribution system asset management

    International Nuclear Information System (INIS)

    Catrinu, M.D.; Nordgard, D.E.

    2011-01-01

    Asset managers in electricity distribution companies generally recognize the need for, and the challenge of, adding structure and a higher degree of formal analysis to increasingly complex asset management decisions. This implies improving present asset management practice by making the best use of available data and expert knowledge, by adopting new methods for risk analysis and decision support, and by finding better ways to document the decisions made. This paper discusses methods for integrating risk analysis and multi-criteria decision support under uncertainty in electricity distribution system asset management. The focus is on how to include the different company objectives and risk analyses in a structured decision framework when deciding how to handle the physical assets of the electricity distribution network. The paper presents an illustrative example of decision support for maintenance and reinvestment strategies based on expert knowledge, simplified risk analyses and multi-criteria decision analysis under uncertainty.

  10. Uncertainty characterization approaches for risk assessment of DBPs in drinking water: a review.

    Science.gov (United States)

    Chowdhury, Shakhawat; Champagne, Pascale; McLellan, P James

    2009-04-01

    The management of risk from disinfection by-products (DBPs) in drinking water has become a critical issue over the last three decades. The areas of concern for risk management studies include (i) human health risk from DBPs, (ii) disinfection performance, (iii) technical feasibility (maintenance, management and operation) of treatment and disinfection approaches, and (iv) cost. Human health risk assessment is typically considered to be the most important phase of the risk-based decision-making or risk management studies. The factors associated with health risk assessment and other attributes are generally prone to considerable uncertainty. Probabilistic and non-probabilistic approaches have both been employed to characterize uncertainties associated with risk assessment. The probabilistic approaches include sampling-based methods (typically Monte Carlo simulation and stratified sampling) and asymptotic (approximate) reliability analysis (first- and second-order reliability methods). Non-probabilistic approaches include interval analysis, fuzzy set theory and possibility theory. However, it is generally accepted that no single method is suitable for the entire spectrum of problems encountered in uncertainty analyses for risk assessment. Each method has its own set of advantages and limitations. In this paper, the feasibility and limitations of different uncertainty analysis approaches are outlined for risk management studies of drinking water supply systems. The findings assist in the selection of suitable approaches for uncertainty analysis in risk management studies associated with DBPs and human health risk.
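
    The contrast between a non-probabilistic and a probabilistic treatment can be sketched with a simple lifetime risk = intake x slope-factor model: interval analysis propagates only bounds, while Monte Carlo propagates full distributions. The bounds and distributions below are illustrative assumptions, not DBP-specific values.

```python
import random

random.seed(3)

# Lifetime cancer risk = CDI * SF (chronic daily intake times slope factor).
# All parameter values are illustrative assumptions.
cdi_bounds = (1e-4, 5e-4)   # mg/kg/day
sf_bounds = (5e-3, 2e-2)    # (mg/kg/day)^-1

# Interval analysis: risk bounds from the products of the endpoint values
# (valid because risk is monotone increasing in both inputs).
risk_interval = (cdi_bounds[0] * sf_bounds[0], cdi_bounds[1] * sf_bounds[1])

# Monte Carlo: uniform distributions assumed over the same ranges.
mc = sorted(random.uniform(*cdi_bounds) * random.uniform(*sf_bounds)
            for _ in range(20000))
p95 = mc[int(0.95 * len(mc))]
print(risk_interval, p95)
```

The interval result brackets every Monte Carlo draw but says nothing about how likely the extremes are; the sampled 95th percentile sits well inside the interval, which is the trade-off between the two families of methods that the review discusses.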

  11. The treatment of uncertainties in risk for regulatory decision making

    International Nuclear Information System (INIS)

    Baybutt, P.; Cox, D.C.; Denning, R.S.; Kurth, R.E.; Fraley, D.W.; Heaberlin, S.W.

    1982-01-01

    This paper describes research conducted in an ongoing program at Battelle to develop and adapt decision analysis methods for regulatory decision making. A general approach to risk-based decision making is discussed. The nature of uncertainties in risk assessment is described and methods for their inclusion in decision making are proposed. The use of decision analysis methods in regulatory decision making and the consideration of uncertainties are illustrated in a realistic case study.

  12. Analysis of parameter uncertainties in the assessment of seismic risk for nuclear power plants

    International Nuclear Information System (INIS)

    Yucemen, S.M.

    1981-04-01

    Probabilistic and statistical methods are used to develop a procedure by which the seismic risk at a specific site can be systematically analyzed. The proposed probabilistic procedure provides a consistent method for the modelling, analysis and updating of uncertainties that are involved in the seismic risk analysis for nuclear power plants. Methods are proposed for including these uncertainties in the final value of calculated risks. Two specific case studies are presented in detail to illustrate the application of the probabilistic method of seismic risk evaluation and to investigate the sensitivity of results to different assumptions.

  13. Learning Risk-Taking and Coping with Uncertainty through Experiential, Team-Based Entrepreneurship Education

    Science.gov (United States)

    Arpiainen, Riitta-Liisa; Kurczewska, Agnieszka

    2017-01-01

    This empirical study investigates how students' perceptions of risk-taking and coping with uncertainty change while they are exposed to experience-based entrepreneurship education. The aim of the study is twofold. First, the authors set out to identify the dynamics of entrepreneurial thinking among students experiencing risk and uncertainty while…

  14. Uncertainty Analysis on the Health Risk Caused by a Radiological Terror

    International Nuclear Information System (INIS)

    Jeong, Hyojoon; Hwang, Wontae; Kim, Eunhan; Han, Moonhee

    2010-01-01

    Atmospheric dispersion modeling and uncertainty analysis were carried out to support decision making in case of radiological emergency events. The risk caused by inhalation is highly dependent on the air concentrations of radionuclides; therefore, air monitoring and dispersion modeling have to be performed carefully to reduce the uncertainty of the health risk assessment for the public. When an intentional release of radioactive materials occurs in an urban area, the air quality for radioactive materials in the environment is of great importance for taking countermeasures and performing environmental risk assessments. Atmospheric modeling is part of the decision-making tasks, and it is particularly important for emergency managers, as they often need to take actions quickly on very inadequate information. In this study, we assume a 50 TBq 137Cs explosion using RDDs in the metropolitan area of Seoul, South Korea. The California Puff Model is used to calculate the atmospheric dispersion and transport of 137Cs, and environmental risk analysis is performed using the Monte Carlo method.
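
    A rough sketch of the dispersion-plus-uncertainty idea, using a generic steady-state Gaussian plume as a simple stand-in for the puff model and sampling the uncertain meteorological inputs by Monte Carlo. The release rate, receptor geometry, and dispersion parameters are illustrative assumptions, not the study's configuration.

```python
import math
import random

random.seed(5)

def plume_concentration(Q, u, sigma_y, sigma_z, y, z, H):
    """Steady-state Gaussian plume concentration (source strength Q per second,
    wind speed u, dispersion sigmas, crosswind offset y, receptor height z,
    effective release height H), with ground reflection."""
    return (Q / (2 * math.pi * u * sigma_y * sigma_z)
            * math.exp(-y ** 2 / (2 * sigma_y ** 2))
            * (math.exp(-(z - H) ** 2 / (2 * sigma_z ** 2))
               + math.exp(-(z + H) ** 2 / (2 * sigma_z ** 2))))

# Monte Carlo over uncertain meteorological inputs (all values illustrative).
concs = []
for _ in range(10000):
    u = random.lognormvariate(math.log(3.0), 0.3)   # wind speed, m/s
    sy = random.uniform(80.0, 120.0)                # sigma_y at the receptor, m
    sz = random.uniform(40.0, 60.0)                 # sigma_z at the receptor, m
    concs.append(plume_concentration(5.0e8, u, sy, sz, y=0.0, z=1.5, H=10.0))

concs.sort()
print(concs[len(concs) // 2], concs[int(0.95 * len(concs))])
```

The resulting concentration percentiles, rather than a single deterministic value, are what feed the downstream inhalation risk estimate.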

  15. A real-time, dynamic early-warning model based on uncertainty analysis and risk assessment for sudden water pollution accidents.

    Science.gov (United States)

    Hou, Dibo; Ge, Xiaofan; Huang, Pingjie; Zhang, Guangxin; Loáiciga, Hugo

    2014-01-01

    A real-time, dynamic, early-warning model (EP-risk model) is proposed to cope with sudden water quality pollution accidents affecting downstream areas with raw-water intakes (denoted as EPs). The EP-risk model outputs the risk level of water pollution at the EP by calculating the likelihood of pollution and evaluating the impact of pollution. A generalized form of the EP-risk model for river pollution accidents based on Monte Carlo simulation, the analytic hierarchy process (AHP) method, and the risk matrix method is proposed. The likelihood of water pollution at the EP is calculated by the Monte Carlo method, which is used for uncertainty analysis of pollutants' transport in rivers. The impact of water pollution at the EP is evaluated by expert knowledge and the results of Monte Carlo simulation based on the analytic hierarchy process. The final risk level of water pollution at the EP is determined by the risk matrix method. A case study of the proposed method is illustrated with a phenol spill accident in China.
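
    The final risk-matrix step can be sketched as follows: a likelihood class (e.g. from Monte Carlo exceedance frequencies) and an impact class (e.g. from an AHP-weighted score) are combined into a risk level. The class boundaries and matrix entries are illustrative assumptions, not the ones used in the study.

```python
# Sketch of the risk-matrix combination step; all thresholds are assumptions.

def likelihood_class(prob):
    if prob < 0.01:
        return 1
    if prob < 0.1:
        return 2
    if prob < 0.5:
        return 3
    return 4

def impact_class(score):
    """score in [0, 1], e.g. an AHP-weighted impact score."""
    if score < 0.25:
        return 1
    if score < 0.5:
        return 2
    if score < 0.75:
        return 3
    return 4

RISK_MATRIX = [  # rows: likelihood class, columns: impact class
    ["low",    "low",    "medium", "medium"],
    ["low",    "medium", "medium", "high"],
    ["medium", "medium", "high",   "high"],
    ["medium", "high",   "high",   "extreme"],
]

def risk_level(prob, impact_score):
    return RISK_MATRIX[likelihood_class(prob) - 1][impact_class(impact_score) - 1]

print(risk_level(0.3, 0.8))   # fairly likely and severe
```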

  16. Survey of sampling-based methods for uncertainty and sensitivity analysis

    International Nuclear Information System (INIS)

    Helton, J.C.; Johnson, J.D.; Sallaberry, C.J.; Storlie, C.B.

    2006-01-01

    Sampling-based methods for uncertainty and sensitivity analysis are reviewed. The following topics are considered: (i) definition of probability distributions to characterize epistemic uncertainty in analysis inputs; (ii) generation of samples from uncertain analysis inputs; (iii) propagation of sampled inputs through an analysis; (iv) presentation of uncertainty analysis results; and (v) determination of sensitivity analysis results. Special attention is given to the determination of sensitivity analysis results, with brief descriptions and illustrations given for the following procedures/techniques: examination of scatterplots, correlation analysis, regression analysis, partial correlation analysis, rank transformations, statistical tests for patterns based on gridding, entropy tests for patterns based on gridding, nonparametric regression analysis, squared rank differences/rank correlation coefficient test, two-dimensional Kolmogorov-Smirnov test, tests for patterns based on distance measures, top-down coefficient of concordance, and variance decomposition.
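
    One of the listed techniques, rank (Spearman) correlation between sampled inputs and the output, can be sketched on an invented three-input model; a high rank correlation flags an influential input even when the input-output relation is nonlinear but monotone.

```python
import random

random.seed(11)

def rank(xs):
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0] * len(xs)
    for pos, i in enumerate(order):
        r[i] = pos
    return r

def spearman(x, y):
    """Rank (Spearman) correlation, assuming no ties (continuous samples)."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    m = (n - 1) / 2  # mean of the ranks 0..n-1
    cov = sum((a - m) * (b - m) for a, b in zip(rx, ry))
    var = sum((a - m) ** 2 for a in rx)  # rank variance, same for rx and ry
    return cov / var

# Invented model: the output depends strongly on x1, weakly on x2, not on x3.
n = 500
x1 = [random.uniform(0, 1) for _ in range(n)]
x2 = [random.uniform(0, 1) for _ in range(n)]
x3 = [random.uniform(0, 1) for _ in range(n)]
y = [5 * a ** 2 + 0.5 * b + random.gauss(0, 0.1) for a, b in zip(x1, x2)]

results = {name: spearman(x, y) for name, x in [("x1", x1), ("x2", x2), ("x3", x3)]}
print({k: round(v, 3) for k, v in results.items()})
```

In a full sampling-based study, the same sample would also feed scatterplot examination, regression, and the other pattern tests the survey enumerates.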

  17. Survey of sampling-based methods for uncertainty and sensitivity analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Jay Dean; Helton, Jon Craig; Sallaberry, Cedric J.; Storlie, Curt B. (Colorado State University, Fort Collins, CO)

    2006-06-01

    Sampling-based methods for uncertainty and sensitivity analysis are reviewed. The following topics are considered: (1) Definition of probability distributions to characterize epistemic uncertainty in analysis inputs, (2) Generation of samples from uncertain analysis inputs, (3) Propagation of sampled inputs through an analysis, (4) Presentation of uncertainty analysis results, and (5) Determination of sensitivity analysis results. Special attention is given to the determination of sensitivity analysis results, with brief descriptions and illustrations given for the following procedures/techniques: examination of scatterplots, correlation analysis, regression analysis, partial correlation analysis, rank transformations, statistical tests for patterns based on gridding, entropy tests for patterns based on gridding, nonparametric regression analysis, squared rank differences/rank correlation coefficient test, two dimensional Kolmogorov-Smirnov test, tests for patterns based on distance measures, top down coefficient of concordance, and variance decomposition.

  18. A Framework for Understanding Uncertainty in Seismic Risk Assessment.

    Science.gov (United States)

    Foulser-Piggott, Roxane; Bowman, Gary; Hughes, Martin

    2017-10-11

    A better understanding of the uncertainty that exists in models used for seismic risk assessment is critical to improving risk-based decisions pertaining to earthquake safety. Current models estimating the probability of collapse of a building do not consider comprehensively the nature and impact of uncertainty. This article presents a model framework to enhance seismic risk assessment and thus gives decisionmakers a fuller understanding of the nature and limitations of the estimates. This can help ensure that risks are not over- or underestimated and the value of acquiring accurate data is appreciated fully. The methodology presented provides a novel treatment of uncertainties in input variables, their propagation through the model, and their effect on the results. The study presents ranges of possible annual collapse probabilities for different case studies on buildings in different parts of the world, exposed to different levels of seismicity, and with different vulnerabilities. A global sensitivity analysis was conducted to determine the significance of uncertain variables. Two key outcomes are (1) that the uncertainty in ground-motion conversion equations has the largest effect on the uncertainty in the calculation of annual collapse probability; and (2) the vulnerability of a building appears to have an effect on the range of annual collapse probabilities produced, i.e., the level of uncertainty in the estimate of annual collapse probability, with less vulnerable buildings having a smaller uncertainty. © 2017 Society for Risk Analysis.

  19. Exploring the uncertainties in cancer risk assessment using the integrated probabilistic risk assessment (IPRA) approach.

    Science.gov (United States)

    Slob, Wout; Bakker, Martine I; Biesebeek, Jan Dirk Te; Bokkers, Bas G H

    2014-08-01

    Current methods for cancer risk assessment result in single values, without any quantitative information on the uncertainties in these values. Therefore, single risk values could easily be overinterpreted. In this study, we discuss a full probabilistic cancer risk assessment approach in which all the generally recognized uncertainties in both exposure and hazard assessment are quantitatively characterized and probabilistically evaluated, resulting in a confidence interval for the final risk estimate. The methodology is applied to three example chemicals (aflatoxin, N-nitrosodimethylamine, and methyleugenol). These examples illustrate that the uncertainty in a cancer risk estimate may be huge, making single value estimates of cancer risk meaningless. Further, a risk based on linear extrapolation tends to be lower than the upper 95% confidence limit of a probabilistic risk estimate, and in that sense it is not conservative. Our conceptual analysis showed that there are two possible basic approaches for cancer risk assessment, depending on the interpretation of the dose-incidence data measured in animals. However, it remains unclear which of the two interpretations is the more adequate one, adding an additional uncertainty to the already huge confidence intervals for cancer risk estimates. © 2014 Society for Risk Analysis.

  20. Disruptive event uncertainties in a perturbation approach to nuclear waste repository risk analysis

    Energy Technology Data Exchange (ETDEWEB)

    Harvey, T.F.

    1980-09-01

    A methodology is developed for incorporating a full range of the principal forecasting uncertainties into a risk analysis of a nuclear waste repository. The result of this methodology is a set of risk curves similar to those used by Rasmussen in WASH-1400. The set of curves is partially derived from a perturbation approach to analyze potential disruptive event sequences. Such a scheme could be useful in truncating the number of disruptive event scenarios and providing guidance to those establishing data-base development priorities.

  1. Uncertainty assessment of urban pluvial flood risk in a context of climate change adaptation decision making

    DEFF Research Database (Denmark)

    Arnbjerg-Nielsen, Karsten; Zhou, Qianqian

    2014-01-01

    There has been a significant increase in climatic extremes in many regions. In Central and Northern Europe, this has led to more frequent and more severe floods. Along with improved flood modelling technologies, this has enabled the development of economic assessment of climate change adaptation to increasing urban flood risk. Assessment of adaptation strategies often requires a comprehensive risk-based economic analysis of current risk, drivers of change of risk over time, and measures to reduce the risk. However, such studies are often associated with large uncertainties. The uncertainties arise from … This study presents an uncertainty analysis which can assess and quantify the overall uncertainty in relation to climate change adaptation to urban flash floods. The analysis is based on an uncertainty cascade that, by means of Monte Carlo simulations of flood risk assessments, incorporates climate change impacts as a key driver.
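
    One link of such an uncertainty cascade can be sketched as a Monte Carlo over an uncertain climate factor feeding a crude expected-annual-damage (EAD) calculation; the damage model, return-period discretization, and all parameters are illustrative assumptions.

```python
import random
import statistics

random.seed(2)

def ead(climate_factor):
    """Crude EAD: sum damage over exceedance-probability increments for
    return periods T = 1000 down to 2 years (illustrative damage curve)."""
    total, prev_p = 0.0, 0.0
    for T in [1000, 100, 50, 20, 10, 5, 2]:
        p = 1.0 / T
        damage = max(0.0, 10.0 * climate_factor * T ** 0.4 - 5.0)  # M EUR
        total += damage * (p - prev_p)
        prev_p = p
    return total

# The climate factor scaling flood intensities is itself uncertain; sample it
# from an assumed lognormal and propagate through the EAD calculation.
samples = [ead(random.lognormvariate(0.18, 0.15)) for _ in range(10000)]
print(statistics.fmean(samples), statistics.stdev(samples))
```

The spread of the sampled EADs, not just their mean, is what a risk-based economic comparison of adaptation strategies would carry forward.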

  2. Sensitivity and uncertainty analyses in aging risk-based prioritizations

    International Nuclear Information System (INIS)

    Hassan, M.; Uryas'ev, S.; Vesely, W.E.

    1993-01-01

    Aging risk evaluations of nuclear power plants using Probabilistic Risk Analyses (PRAs) involve assessments of the impact of aging structures, systems, and components (SSCs) on plant core damage frequency (CDF). These assessments can be used to prioritize the contributors to aging risk, reflecting the relative risk potential of the SSCs. Aging prioritizations are important for identifying the SSCs contributing most to plant risk and can provide a systematic basis on which aging risk control and management strategies for a plant can be developed. However, these prioritizations are subject to variabilities arising from uncertainties in data and/or from various modeling assumptions. The objective of this paper is to present an evaluation of the sensitivity of aging prioritizations of active components to uncertainties in aging risk quantifications. Approaches for robust prioritization of SSCs that are less susceptible to these uncertainties are also presented.

  3. Uncertainty analysis and validation of environmental models. The empirically based uncertainty analysis

    International Nuclear Information System (INIS)

    Monte, Luigi; Hakanson, Lars; Bergstroem, Ulla; Brittain, John; Heling, Rudie

    1996-01-01

    The principles of Empirically Based Uncertainty Analysis (EBUA) are described. EBUA is based on the evaluation of 'performance indices' that express the level of agreement between the model and sets of empirical independent data collected in different experimental circumstances. Some of these indices may be used to evaluate the confidence limits of the model output. The method is based on the statistical analysis of the distribution of the index values and on the quantitative relationship of these values with the ratio 'experimental data/model output'. Some performance indices are described in the present paper. Among these, the so-called 'functional distance' (d) between the logarithm of the model output and the logarithm of the experimental data, defined as d^2 = (1/n) Σ_{i=1}^{n} (ln M_i − ln O_i)^2, where M_i is the i-th experimental value, O_i the corresponding model evaluation and n the number of pairs (experimental value, predicted value), is an important tool for the EBUA method. From the statistical distribution of this performance index, it is possible to infer the characteristics of the distribution of the ratio 'experimental data/model output' and, consequently, to evaluate the confidence limits for the model predictions. This method was applied to calculate the uncertainty level of a model developed to predict the migration of radiocaesium in lacustrine systems. Unfortunately, performance indices are affected by the uncertainty of the experimental data used in validation. Indeed, measurement results of environmental levels of contamination are generally associated with large uncertainty due to the measurement and sampling techniques and to the large variability in space and time of the measured quantities. It is demonstrated that this undesired effect may, in some circumstances, be corrected by means of simple formulae.
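
    The functional distance defined above is straightforward to compute; the observed and modelled values below are invented for illustration.

```python
import math

# Functional distance between measurements and model output:
# d^2 = (1/n) * sum_i (ln M_i - ln O_i)^2, with M_i measured, O_i modelled.
def functional_distance(measured, modelled):
    n = len(measured)
    return math.sqrt(sum((math.log(m) - math.log(o)) ** 2
                         for m, o in zip(measured, modelled)) / n)

measured = [120.0, 95.0, 60.0, 33.0, 21.0]   # invented validation data
modelled = [100.0, 80.0, 70.0, 40.0, 20.0]   # corresponding model evaluations

d = functional_distance(measured, modelled)
print(round(d, 3))
```

Collecting d over many independent validation datasets gives the empirical distribution from which EBUA infers confidence limits on the ratio data/model.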

  4. Risk Assessment Method for Offshore Structure Based on Global Sensitivity Analysis

    Directory of Open Access Journals (Sweden)

    Zou Tao

    2012-01-01

    Full Text Available Based on global sensitivity analysis (GSA), this paper proposes a new risk assessment method for offshore structure design. The method first quantifies the significance of all random variables and their parameters. By comparing degrees of importance, minor factors can be neglected, which simplifies the global uncertainty analysis. Global uncertainty analysis (GUA) is an effective way to study the complexity and randomness of natural events. Since field-measured data and statistical results often have inevitable errors and uncertainties, which lead to inaccurate prediction and analysis, the risk in the design stage of offshore structures caused by uncertainties in environmental loads, sea level, and marine corrosion must be taken into account. In this paper, the multivariate compound extreme value distribution model (MCEVD) is applied to predict the extreme sea state of wave, current, and wind. The maximum structural stress and deformation of a jacket platform are analyzed and compared with different design standards. The calculation results demonstrate the rationality and reliability of the new risk assessment method.
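
    A crude sketch of a first-order variance-based sensitivity measure, Var(E[Y|X_i])/Var(Y) estimated by binning the samples, applied to an invented stand-in for an offshore load model; the model and its coefficients are assumptions, and real GSA studies use more efficient estimators.

```python
import random
import statistics

random.seed(9)

# Invented stand-in "load" model: wave height dominates, wind matters a
# little, corrosion hardly at all.
def model(wave, wind, corrosion):
    return 3.0 * wave ** 2 + wind + 0.1 * corrosion

n, bins = 20000, 20
X = [[random.uniform(0, 1) for _ in range(3)] for _ in range(n)]
Y = [model(*x) for x in X]
var_y = statistics.pvariance(Y)

# First-order index per input: variance of the conditional means of Y over
# bins of that input, divided by the total variance of Y.
s1 = {}
for k, name in enumerate(["wave", "wind", "corrosion"]):
    groups = [[] for _ in range(bins)]
    for x, y in zip(X, Y):
        groups[min(int(x[k] * bins), bins - 1)].append(y)
    cond_means = [statistics.fmean(g) for g in groups if g]
    s1[name] = statistics.pvariance(cond_means) / var_y
print(s1)
```

Ranking the indices and dropping the near-zero ones is the screening step the abstract describes before the heavier global uncertainty analysis.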

  5. Uncertainty analysis in Monte Carlo criticality computations

    International Nuclear Information System (INIS)

    Qi Ao

    2011-01-01

    Highlights: ► Two types of uncertainty methods for k_eff Monte Carlo computations are examined. ► The sampling method has the fewest restrictions on perturbations but demands computing resources. ► The analytical method is limited to small perturbations of material properties. ► Practicality relies on efficiency, multiparameter applicability and data availability. - Abstract: Uncertainty analysis is imperative for nuclear criticality risk assessments when using Monte Carlo neutron transport methods to predict the effective neutron multiplication factor (k_eff) for fissionable material systems. For the validation of Monte Carlo codes for criticality computations against benchmark experiments, code accuracy and precision are measured by both the computational bias and the uncertainty in the bias. The uncertainty in the bias accounts for known or quantified experimental, computational and model uncertainties. For the application of Monte Carlo codes to criticality analysis of fissionable material systems, an administrative margin of subcriticality must be imposed to provide additional assurance of subcriticality for any unknown or unquantified uncertainties. Because of the substantial impact of the administrative margin of subcriticality on the economics and safety of nuclear fuel cycle operations, recently increasing interest in reducing this margin makes uncertainty analysis in criticality safety computations more risk-significant. This paper provides an overview of the two most popular k_eff uncertainty analysis methods for Monte Carlo criticality computations: (1) sampling-based methods, and (2) analytical methods. Examples are given to demonstrate their usage in k_eff uncertainty analysis due to uncertainties in both neutronic and non-neutronic parameters of fissionable material systems.
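
    The sampling-based route can be sketched by propagating uncertain system parameters through a surrogate k_eff model and comparing the result against a subcritical limit; the linear surrogate, parameter distributions, and 0.95 limit are illustrative assumptions, not a transport calculation.

```python
import random
import statistics

random.seed(4)

# Illustrative surrogate for k_eff as a function of normalised uncertain
# parameters (a stand-in for repeated Monte Carlo transport runs).
def keff_surrogate(enrichment, density, geometry_factor):
    return 0.30 * enrichment + 0.25 * density + 0.35 * geometry_factor

samples = []
for _ in range(5000):
    e = random.gauss(1.00, 0.01)
    d = random.gauss(1.00, 0.02)
    g = random.gauss(1.00, 0.015)
    samples.append(keff_surrogate(e, d, g))

mean_k = statistics.fmean(samples)
sd_k = statistics.stdev(samples)
upper_k = mean_k + 2 * sd_k          # simple 2-sigma upper bound
print(mean_k, sd_k, upper_k < 0.95)  # 0.95 = assumed subcritical limit
```

The appeal of the sampling approach, as the highlights note, is that it places almost no restriction on how the parameters perturb the system; the cost is one full transport run per sample.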

  6. Uncertainty as Knowledge: Constraints on Policy Choices Provided by Analysis of Uncertainty

    Science.gov (United States)

    Lewandowsky, S.; Risbey, J.; Smithson, M.; Newell, B. R.

    2012-12-01

    Uncertainty forms an integral part of climate science, and it is often cited in connection with arguments against mitigative action. We argue that an analysis of uncertainty must consider existing knowledge as well as uncertainty, and the two must be evaluated with respect to the outcomes and risks associated with possible policy options. Although risk judgments are inherently subjective, an analysis of the role of uncertainty within the climate system yields two constraints that are robust to a broad range of assumptions. Those constraints are that (a) greater uncertainty about the climate system is necessarily associated with greater expected damages from warming, and (b) greater uncertainty translates into a greater risk of the failure of mitigation efforts. These ordinal constraints are unaffected by subjective or cultural risk-perception factors, they are independent of the discount rate, and they are independent of the magnitude of the estimate for climate sensitivity. The constraints mean that any appeal to uncertainty must imply a stronger, rather than weaker, need to cut greenhouse gas emissions than in the absence of uncertainty.

  7. Uncertainty analysis guide

    International Nuclear Information System (INIS)

    Andres, T.H.

    2002-05-01

    This guide applies to the estimation of uncertainty in quantities calculated by scientific, analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner in which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the way in which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)
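    The two main propagation options the guide lists, series approximation and Monte Carlo, can be contrasted on a toy model. The model and its input uncertainty below are invented, not taken from the guide:

```python
import numpy as np

rng = np.random.default_rng(0)

def model(x):
    return x ** 2 + 3.0 * x          # stand-in for a code's calculated quantity

mu, sigma = 2.0, 0.1                 # input mean and standard uncertainty

# (a) First-order series (delta-method) approximation: var_y = (dy/dx)^2 var_x
dydx = 2.0 * mu + 3.0
sigma_series = abs(dydx) * sigma

# (b) Monte Carlo propagation of the same input uncertainty
x = rng.normal(mu, sigma, 200_000)
sigma_mc = model(x).std(ddof=1)
```

    For this mildly nonlinear model the two estimates agree closely; stronger nonlinearity or larger input uncertainty widens the gap, which is when Monte Carlo earns its cost.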

  8. Uncertainty analysis guide

    Energy Technology Data Exchange (ETDEWEB)

    Andres, T.H.

    2002-05-01

    This guide applies to the estimation of uncertainty in quantities calculated by scientific, analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide,for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner by which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)

  9. Uncertainty Analysis and Overtopping Risk Evaluation of Maroon Dam withMonte Carlo and Latin Hypercube Methods

    Directory of Open Access Journals (Sweden)

    J. M. Vali Samani

    2016-02-01

    Full Text Available Introduction: The greatest share of constructed dams are embankment dams, and there are many examples of their failure throughout history. About one-third of the world's dam failures have been caused by flood overtopping, which indicates that overtopping is an important factor affecting reservoir safety. Moreover, because of a poor understanding of the randomness of floods, reservoir water levels during flood seasons are often lowered artificially in order to avoid overtopping and protect the lives and property of downstream residents. Estimation of dam overtopping risk with regard to uncertainties is therefore more important than merely demonstrating the dam's safety. This study presents a procedure for evaluating the risk of dam overtopping due to various uncertainties in inflows and the initial reservoir condition. Materials and Methods: This study presents a practical approach and compares different uncertainty analysis methods in the evaluation of dam overtopping risk due to flood. For this purpose, Monte Carlo simulation and Latin hypercube sampling were used to calculate the overtopping risk, evaluate the uncertainty, and calculate the highest water level during different flood events. To assess these methods from a practical point of view, the Maroon dam was chosen as the case study. Figure 1 outlines the procedure, which comprises three parts: (1) identification and evaluation of factors affecting flood routing and dam overtopping; (2) data collection and analysis for reservoir routing and uncertainty analysis; (3) uncertainty and risk analysis. Figure 1 - Diagram of dam overtopping risk evaluation. Results and Discussion: Figure 2 shows the computed overtopping risks for the Maroon Dam without considering the wind effect, for an initial water level of 504 m as an example. As shown in Figure 2, the trends of the risk curves computed by the different uncertainty analysis methods are similar
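    The two sampling schemes compared in the study can be sketched on a toy overtopping problem. Everything numeric here is invented (it is not Maroon Dam data): the peak reservoir level gets a normal distribution, and the overtopping risk is the probability that it exceeds a hypothetical crest elevation.

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(1)
n = 10_000
crest = 510.0                              # hypothetical crest elevation (m)
level = NormalDist(mu=505.0, sigma=2.0)    # uncertain peak reservoir level (m)

# Monte Carlo: independent uniform draws mapped through the inverse CDF
u_mc = rng.uniform(size=n)
risk_mc = np.mean([level.inv_cdf(u) > crest for u in u_mc])

# Latin hypercube: one draw per equal-probability stratum, then shuffled
u_lhs = (np.arange(n) + rng.uniform(size=n)) / n
rng.shuffle(u_lhs)
risk_lhs = np.mean([level.inv_cdf(u) > crest for u in u_lhs])
```

    Both estimates approach the exact tail probability (about 0.0062 here), but the stratification makes the Latin hypercube estimate far less variable at the same sample size, which is why it is attractive for dam-safety studies.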

  10. Uncertainty quantification in flood risk assessment

    Science.gov (United States)

    Blöschl, Günter; Hall, Julia; Kiss, Andrea; Parajka, Juraj; Perdigão, Rui A. P.; Rogger, Magdalena; Salinas, José Luis; Viglione, Alberto

    2017-04-01

    Uncertainty is inherent to flood risk assessments because of the complexity of the human-water system, which is characterised by nonlinearities and interdependencies, because of limited knowledge about system properties and because of cognitive biases in human perception and decision-making. On top of the uncertainty associated with the assessment of the existing risk to extreme events, additional uncertainty arises because of temporal changes in the system due to climate change, modifications of the environment, population growth and the associated increase in assets. Novel risk assessment concepts are needed that take into account all these sources of uncertainty. They should be based on the understanding of how flood extremes are generated and how they change over time. They should also account for the dynamics of risk perception of decision makers and population in the floodplains. In this talk we discuss these novel risk assessment concepts through examples from Flood Frequency Hydrology, Socio-Hydrology and Predictions Under Change. We believe that uncertainty quantification in flood risk assessment should lead to a robust approach of integrated flood risk management aiming at enhancing resilience rather than searching for optimal defense strategies.

  11. Climate policy uncertainty and investment risk

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2007-06-21

    Our climate is changing. This is certain. Less certain, however, are the timing and magnitude of climate change, and the cost of the transition to a low-carbon world. Therefore, many policies and programmes are still at a formative stage, and policy uncertainty is very high. This book identifies how climate change policy uncertainty may affect investment behaviour in the power sector. For power companies, whose capital stock is intensive and long-lived, these risks rank among the biggest and can create an incentive to delay investment. Our analysis shows that the risk premiums of climate change uncertainty can add 40% to the construction costs of a plant for power investors, and 10% in price surcharges for electricity end-users. This publication describes what can be done in policy design to reduce these costs. Incorporating the results of quantitative analysis, it also shows the sensitivity of different power sector investment decisions to different risks. It compares the effects of climate policy uncertainty with energy market uncertainty, showing the relative importance of these sources of risk for different technologies in different market types. Drawing on extensive consultation with power companies and financial investors, it also assesses the implications for policy makers, allowing the key messages to be transferred into policy designs. This book is a useful tool for governments seeking to improve climate policy mechanisms and create more certainty for power investors.

  12. Economic risk-based analysis: Effect of technical and market price uncertainties on the production of glycerol-based isobutanol

    DEFF Research Database (Denmark)

    Loureiro da Costa Lira Gargalo, Carina; Gernaey, Krist; Sin, Gürkan

    2016-01-01

    to propagate the market price and technical uncertainties to the economic indicator calculations and to quantify the respective economic risk. The results clearly indicated that under the given market price uncertainties, the probability of obtaining a negative NPV is 0.95. This is a very high probability...

  13. Uncertainty analysis for Ulysses safety evaluation report

    International Nuclear Information System (INIS)

    Frank, M.V.

    1991-01-01

    As part of the effort to review the Ulysses Final Safety Analysis Report and to understand the risk of plutonium release from the Ulysses spacecraft General Purpose Heat Source-Radioisotope Thermal Generator (GPHS-RTG), the Interagency Nuclear Safety Review Panel (INSRP) and the author performed an integrated, quantitative analysis of the uncertainties in the calculated risk of plutonium release from Ulysses. Using state-of-the-art probabilistic risk assessment technology, the uncertainty analysis accounted for both variability and uncertainty in the key parameters of the risk analysis. The results show that INSRP had high confidence that the risk of fatal cancers from potential plutonium release associated with the calculated launch and deployment accident scenarios is low

  14. Fuzzy probability based fault tree analysis to propagate and quantify epistemic uncertainty

    International Nuclear Information System (INIS)

    Purba, Julwan Hendry; Sony Tjahyani, D.T.; Ekariansyah, Andi Sofrany; Tjahjono, Hendro

    2015-01-01

    Highlights: • Fuzzy probability based fault tree analysis evaluates epistemic uncertainty in fuzzy fault tree analysis. • Fuzzy probabilities represent the likelihood of occurrence of all events in a fault tree. • A fuzzy multiplication rule quantifies the epistemic uncertainty of minimal cut sets. • A fuzzy complement rule estimates the epistemic uncertainty of the top event. • The proposed FPFTA has successfully evaluated the U.S. Combustion Engineering RPS. - Abstract: A number of fuzzy fault tree analysis approaches, which integrate fuzzy concepts into the quantitative phase of conventional fault tree analysis, have been proposed to study the reliability of engineering systems. These approaches apply expert judgment to overcome the limitation of conventional fault tree analysis when basic events do not have probability distributions. Since expert judgments may carry epistemic uncertainty, it is important to quantify the overall uncertainty of the fuzzy fault tree analysis. Monte Carlo simulation is commonly used to quantify the overall uncertainty of conventional fault tree analysis. However, since Monte Carlo simulation is based on probability distributions, this technique is not appropriate for fuzzy fault tree analysis, which is based on fuzzy probabilities. The objective of this study is to develop a fuzzy probability based fault tree analysis that overcomes this limitation. To demonstrate the applicability of the proposed approach, a case study is performed and its results are compared to those of a conventional fault tree analysis. The results confirm that the proposed fuzzy probability based fault tree analysis can propagate and quantify epistemic uncertainties in fault tree analysis
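    The gate rules named in the highlights can be sketched with triangular fuzzy numbers (lower, modal, upper). The basic-event values below are invented; this is only the arithmetic skeleton, not the paper's RPS case study.

```python
# triangular fuzzy probability: (lower, modal, upper)

def fuzzy_and(p, q):
    # fuzzy multiplication rule for a minimal cut set (AND gate)
    return tuple(pi * qi for pi, qi in zip(p, q))

def fuzzy_or(p, q):
    # fuzzy complement rule for an OR gate: 1 - (1-p)(1-q), bound by bound;
    # the expression is increasing in p and q, so the ordering is preserved
    return tuple(1.0 - (1.0 - pi) * (1.0 - qi) for pi, qi in zip(p, q))

# two minimal cut sets, each the AND of two basic events (invented values)
cut1 = fuzzy_and((0.01, 0.02, 0.03), (0.10, 0.20, 0.30))
cut2 = fuzzy_and((0.005, 0.01, 0.02), (0.20, 0.30, 0.40))

top = fuzzy_or(cut1, cut2)   # fuzzy (epistemically uncertain) top-event probability
```

    The width of `top` (upper minus lower bound) is the epistemic uncertainty propagated to the top event.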

  15. Simplified quantitative treatment of uncertainty and interindividual variability in health risk assessment

    International Nuclear Information System (INIS)

    Bogen, K.T.

    1993-01-01

    A distinction between uncertainty (the extent of lack of knowledge) and interindividual variability (the extent of person-to-person heterogeneity) in the values of input variates must be maintained if a quantitative characterization of uncertainty in population risk or in individual risk is sought. Here, some practical methods are presented that should facilitate implementation of the analytic framework for uncertainty and variability proposed by Bogen and Spear. (1,2) Two types of methodology are discussed: one that facilitates the distinction between uncertainty and variability per se, and another that may be used to simplify quantitative analysis of distributed inputs representing either uncertainty or variability. A simple and a complex form of modeled increased risk are presented and then used to illustrate methods facilitating the distinction between uncertainty and variability in the characterization of both population and individual risk. Finally, a simple form of discrete probability calculus is proposed as an easily implemented, practical alternative to Monte Carlo based procedures for the quantitative integration of uncertainty and variability in risk assessment
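    The "simple form of discrete probability calculus" the abstract proposes can be sketched as exact enumeration over small discrete distributions. The three-point inputs below (one representing variability, one uncertainty) are invented for illustration:

```python
from itertools import product

def combine(dist_a, dist_b, op):
    # exact combination of two discrete distributions {value: probability}
    out = {}
    for (va, pa), (vb, pb) in product(dist_a.items(), dist_b.items()):
        v = round(op(va, vb), 10)          # round to merge equal float keys
        out[v] = out.get(v, 0.0) + pa * pb
    return out

dose  = {1.0: 0.25, 2.0: 0.50, 3.0: 0.25}   # variability across individuals
slope = {0.1: 0.30, 0.2: 0.40, 0.3: 0.30}   # uncertainty in potency

risk = combine(dose, slope, lambda d, s: d * s)   # distribution of risk
mean_risk = sum(v * p for v, p in risk.items())
```

    Keeping the "variability" and "uncertainty" distributions separate until the final combination is exactly the distinction the framework asks analysts to maintain.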

  16. Risk Assessment and Decision-Making under Uncertainty in Tunnel and Underground Engineering

    Directory of Open Access Journals (Sweden)

    Yuanpu Xia

    2017-10-01

    Full Text Available The impact of uncertainty on risk assessment and decision-making is increasingly being prioritized, especially for large geotechnical projects such as tunnels, where uncertainty is often the main source of risk. Epistemic uncertainty, which can be reduced, is the focus of attention. In this study, the existing entropy-risk decision model is first discussed and analyzed, and its deficiencies are improved upon and overcome. The study then addresses the fact that existing work considers only parameter uncertainty and ignores the influence of model uncertainty. The focus here is on model uncertainty and on differences in risk consciousness among decision-makers; utility theory is introduced into the model. Finally, a risk decision model is proposed based on sensitivity analysis and tolerance cost, which can improve decision-making efficiency. This research can provide guidance or reference for the evaluation of, and decision-making on, complex systems engineering problems, and indicates a direction for further research on risk assessment and decision-making issues.

  17. Uncertainty Analysis of Few Group Cross Sections Based on Generalized Perturbation Theory

    International Nuclear Information System (INIS)

    Han, Tae Young; Lee, Hyun Chul; Noh, Jae Man

    2014-01-01

    In this paper, the methodology of a sensitivity and uncertainty analysis code based on GPT is described, and preliminary verification calculations on the PMR200 pin cell problem are carried out. The results are in good agreement with those produced by TSUNAMI. From this study, it is expected that the MUSAD code based on GPT can produce the uncertainty of the homogenized few group microscopic cross sections for a core simulator. For sensitivity and uncertainty analyses of general core responses, a two-step method is available: it applies the generalized perturbation theory (GPT) to homogenized few group cross sections in the first step, and a stochastic sampling method to general core responses in the second step. The uncertainty analysis procedure based on GPT in the first step needs the generalized adjoint solution from a cell or lattice code. For this, the generalized adjoint solver was integrated into DeCART in our previous work. In this paper, the MUSAD (Modules of Uncertainty and Sensitivity Analysis for DeCART) code, previously based on classical perturbation theory, is extended to perform sensitivity and uncertainty analysis of few group cross sections based on GPT. First, the uncertainty analysis method based on GPT is described and, in the next section, the preliminary results of the verification calculation on a VHTR pin cell problem are compared with the results by TSUNAMI of SCALE 6.1
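    The quantity such GPT-based tools ultimately evaluate is typically the first-order "sandwich" expression: the relative variance of a response from its sensitivity coefficients and a cross-section covariance matrix. A minimal numeric sketch, with invented sensitivities and covariances (not MUSAD's data):

```python
import numpy as np

# sensitivity coefficients: (dR/R) per (dsigma/sigma), for three cross sections
S = np.array([0.8, -0.3, 0.1])

# relative covariance matrix of the three cross sections (invented values)
C = np.array([[0.0004, 0.0001, 0.0],
              [0.0001, 0.0009, 0.0],
              [0.0,    0.0,    0.0001]])

rel_var = S @ C @ S            # sandwich rule: relative variance of the response
rel_std = np.sqrt(rel_var)     # e.g. ~1.7% uncertainty on the response here
```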

  18. A risk-based method for planning of bus–subway corridor evacuation under hybrid uncertainties

    International Nuclear Information System (INIS)

    Lv, Y.; Yan, X.D.; Sun, W.; Gao, Z.Y.

    2015-01-01

    Emergencies in a bus–subway corridor system involve many processes and factors with social and economic implications, and these processes and factors and their interactions are subject to a variety of uncertainties. In this study, an interval chance-constrained integer programming (EICI) method is developed in response to such challenges for bus–subway corridor based evacuation planning. The method couples chance-constrained programming with an interval integer programming framework. It can thus deal with interval uncertainties that cannot be quantified with specified probability distribution functions, while also reflecting the stochastic features of traffic flow capacity and thereby examining the related risk of constraint violation. The EICI method is applied to a subway-incident evacuation case study. It is solved through an interactive algorithm that does not lead to complicated intermediate submodels and has a relatively low computational requirement. A number of decision alternatives can be generated directly from the results of the EICI method. The solutions can not only help decision makers identify desired population evacuation and vehicle dispatch schemes under hybrid uncertainties, but also provide a basis for in-depth analyses of tradeoffs among evacuation plans, total evacuation time, and constraint-violation risks. - Highlights: • An inexact model is developed for bus–subway corridor evacuation management. • It tackles stochastic and interval uncertainties in an integer programming problem. • It can examine the violation risk of the roadway flow capacity constraint. • It will help identify evacuation schemes under hybrid uncertainties

  19. Unsharpness-risk analysis

    International Nuclear Information System (INIS)

    Preyssl, C.

    1986-01-01

    Safety analysis provides the only tool for the evaluation and quantification of rare or hypothetical events leading to system failure. So far, probability theory has been used for the fault- and event-tree methodology. Uncertainty constitutes an important aspect of risk analysis, and uncertainties can be classified as originating from 'randomness' or from 'fuzziness'. Probability theory addresses randomness only. The use of fuzzy set theory makes it possible to include both types of uncertainty in the mathematical model of risk analysis. Thus the 'fuzzy fault tree' is expressed in 'possibilistic' terms, implying a range of simplifications and improvements: 'human failure' and 'conditionality' can be treated correctly, and only minimum-maximum relations are used to combine the possibility distributions of events. Various event classifications facilitate the interpretation of the results. The method is demonstrated by application to a TRIGA research reactor. Uncertainty, as an implicit part of 'fuzzy risk', can be quantified explicitly using an 'uncertainty measure'. Based on this, the 'degree of relative compliance' with a quantitative safety goal can be defined for a particular risk. The introduction of 'weighting functionals' guarantees consideration of the importance attached to different parts of the risk exceeding or complying with the standard. The comparison of two reference systems is demonstrated in a case study. It is concluded that any application of fuzzy risk analysis has to be free of hypostatization when reducing subjective to objective information. (Author)
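    The minimum-maximum calculus the abstract refers to is the possibility-theoretic counterpart of the probabilistic gate rules: an AND combines by minimum, an OR by maximum. A sketch with invented possibility values:

```python
# possibility (not probability) that each basic event occurs; values invented
poss = {"pump_fails": 0.3, "valve_sticks": 0.7, "operator_error": 0.5}

# AND gate: possibility of the joint event is the minimum
poss_and = min(poss["pump_fails"], poss["valve_sticks"])

# OR gate: possibility of the top event is the maximum
poss_top = max(poss_and, poss["operator_error"])
```

    Unlike the probabilistic product and complement rules, these combinations never require independence assumptions, which is part of the simplification the author claims.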

  20. Introduction to risk and uncertainty in hydrosystem engineering

    CERN Document Server

    Goodarzi, Ehsan; Teang Shui, Lee

    2013-01-01

    Water engineers require knowledge of stochastic and frequency concepts, uncertainty analysis, risk assessment, and the processes that predict unexpected events. This book presents the basics of stochastic, risk and uncertainty analysis and of random sampling techniques, in conjunction with straightforward examples that are solved step by step. In addition, appropriate Excel functions are included as an alternative way to solve the examples, and two real case studies are presented in the last chapters of the book.

  1. Train integrity detection risk analysis based on PRISM

    Science.gov (United States)

    Wen, Yuan

    2018-04-01

    GNSS based Train Integrity Monitoring System (TIMS) is an effective and low-cost detection scheme for train integrity detection. However, as an external auxiliary system of CTCS, GNSS may be influenced by external environments, such as the uncertainty of wireless communication channels, which may lead to failures of communication and positioning. In order to guarantee the reliability and safety of train operation, a risk analysis method for train integrity detection based on PRISM is proposed in this article. First, we analyze and model the risk factors in the GNSS communication process and the on-board communication process. Then, we evaluate the performance of the model in PRISM based on field data. Finally, we discuss how these risk factors influence the train integrity detection process.
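    PRISM evaluates such models in its own modelling language; the underlying computation for the kind of property described (the probability that detection eventually fails) is a reachability probability in a discrete-time Markov chain, which can be sketched directly. The states and transition probabilities below are invented, not the article's model:

```python
import numpy as np

# states: 0 = detecting, 1 = communication lost,
#         2 = integrity confirmed (absorbing), 3 = detection failed (absorbing)
P = np.array([
    [0.00, 0.05, 0.94, 0.01],   # detecting: may lose comm, confirm, or fail
    [0.80, 0.00, 0.00, 0.20],   # comm lost: recover and retry, or fail
    [0.00, 0.00, 1.00, 0.00],
    [0.00, 0.00, 0.00, 1.00],
])

# P(eventually reach "failed") from each transient state: solve (I - Q) x = r,
# with Q the transient block and r the one-step probabilities of failing
Q = P[:2, :2]
r = P[:2, 3]
x = np.linalg.solve(np.eye(2) - Q, r)
p_fail = x[0]   # risk of detection failure from the initial state
```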

  2. Uncertainty Analysis and Expert Judgment in Seismic Hazard Analysis

    Science.gov (United States)

    Klügel, Jens-Uwe

    2011-01-01

    The large uncertainty associated with the prediction of future earthquakes is usually regarded as the main reason for the increased hazard estimates which have resulted from some recent large-scale probabilistic seismic hazard analysis studies (e.g. the PEGASOS study in Switzerland and the Yucca Mountain study in the USA). It is frequently overlooked that such increased hazard estimates are characteristic of a single specific method of probabilistic seismic hazard analysis (PSHA): the traditional (Cornell-McGuire) PSHA method, which has found its highest level of sophistication in the SSHAC probability method. Based on a review of the SSHAC probability model and its application in the PEGASOS project, it is shown that the surprising results of recent PSHA studies can be explained to a large extent by the uncertainty model used in traditional PSHA, which deviates from the state of the art in mathematics and risk analysis. This uncertainty model, the Ang-Tang uncertainty model, mixes concepts of decision theory with probabilistic hazard assessment methods, leading to an overestimation of uncertainty in comparison to empirical evidence. Although expert knowledge can be a valuable source of scientific information, its incorporation into the SSHAC probability method does not resolve the issue of inflated uncertainties in PSHA results. Other, more data-driven, PSHA approaches in use in some European countries are less vulnerable to this effect. The most valuable alternative to traditional PSHA is the direct probabilistic scenario-based approach, which is closely linked with emerging neo-deterministic methods based on waveform modelling.

  3. Review of studies related to uncertainty in risk analysis

    International Nuclear Information System (INIS)

    Rish, W.R.; Marnicio, R.J.

    1988-08-01

    The Environmental Protection Agency's Office of Radiation Programs (ORP) is responsible for regulating, on a national level, the risks associated with technological sources of ionizing radiation in the environment. A critical activity of the ORP is analyzing and evaluating risk. The ORP believes that the analysis of uncertainty should be an integral part of any risk assessment; therefore, the ORP has initiated a project to develop a framework for the treatment of uncertainty in risk analysis. Summaries of recent studies done in five areas of study are presented

  4. Gambling in Latin: incorporating uncertainty in risk management

    Energy Technology Data Exchange (ETDEWEB)

    Gratt, L.B.; Levin, L. (IWG Corporation, San Diego, CA (United States))

    1994-08-01

    Risk assessment uses assumptions based on differing degrees of conservatism, which complicates the understanding of the uncertainty in the final risk estimate. Uncertainties arise from each component of the risk assessment process: source terms, atmospheric transport, exposure, and dose response. Probabilistic modeling using Monte Carlo and Latin square sampling techniques (hence the title's 'Gambling in Latin') allows for an improved approach to risk assessment and management. 16 refs., 1 fig., 1 tab.
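    The multiplicative chain listed in the abstract (source term, atmospheric transport, exposure, dose response) lends itself directly to the probabilistic treatment described. A sketch with invented lognormal factors for each link:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

# each factor of the risk chain gets an invented lognormal uncertainty
source    = rng.lognormal(mean=0.0,  sigma=0.5, size=n)  # release (rel. units)
transport = rng.lognormal(mean=-2.0, sigma=0.8, size=n)  # dilution factor
exposure  = rng.lognormal(mean=-1.0, sigma=0.3, size=n)  # intake per unit conc.
potency   = rng.lognormal(mean=-3.0, sigma=1.0, size=n)  # risk per unit dose

risk = source * transport * exposure * potency
p50, p95 = np.percentile(risk, [50, 95])   # central estimate and upper bound
```

    Reporting the full distribution (or at least a central and an upper percentile) rather than a single stacked-conservatism point estimate is the improvement over deterministic assessment that the authors argue for.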

  5. Review of studies related to uncertainty in risk analysis

    Energy Technology Data Exchange (ETDEWEB)

    Rish, W.R.; Marnicio, R.J.

    1988-08-01

    The Environmental Protection Agency's Office of Radiation Programs (ORP) is responsible for regulating, on a national level, the risks associated with technological sources of ionizing radiation in the environment. A critical activity of the ORP is analyzing and evaluating risk. The ORP believes that the analysis of uncertainty should be an integral part of any risk assessment; therefore, the ORP has initiated a project to develop a framework for the treatment of uncertainty in risk analysis. Summaries of recent studies done in five areas of study are presented.

  6. Uncertainties in risk assessment and decision making

    International Nuclear Information System (INIS)

    Starzec, Peter; Purucker, Tom; Stewart, Robert

    2008-02-01

    confidence interval under different assumptions regarding the data structure. The results stress the importance of invoking statistical methods and also illustrate how the choice of a wrong methodology may affect the quality of risk assessment and the foundations for decision making. The uncertainty in assessing the volume of contaminated soil was shown to depend only to a low extent on the interpolation technique used for the specific case study analyzed. It is, however, expected that the uncertainty may increase significantly if more restrictive risk criteria (a lower guideline value) are applied. Despite a possibly low uncertainty in assessing the contaminated soil volume, the uncertainty in its localization can be substantial. Based on the demonstration example presented, the risk-based input for decisions on soil treatment may vary depending on what assumptions were adopted during the interpolation process. Uncertainty in an ecological exposure model with regard to the movement pattern of a receptor in relation to the spatial distribution of a contaminant has been demonstrated by studies on pronghorn (Antilocapra americana). The results from numerical simulations show that a lack of knowledge of the receptor's movement routes may introduce substantial uncertainty into the exposure assessment. The presented concept is mainly applicable to 'mobile' receptors on relatively large areas. A number of statistical definitions, methods and concepts are presented in the report, some of which are not elaborated in detail; readers are referred to the relevant literature. The main goal of the study has been to shed light on aspects of uncertainty in risk assessment and to demonstrate the potential consequences of a wrong approach, rather than to provide readers with formal guidelines and recommendations. However, the outcome of the study will hopefully contribute to further work on novel approaches towards more reliable risk assessments

  7. Including model uncertainty in risk-informed decision making

    International Nuclear Information System (INIS)

    Reinert, Joshua M.; Apostolakis, George E.

    2006-01-01

    Model uncertainties can have a significant impact on decisions regarding licensing basis changes. We present a methodology to identify basic events in the risk assessment that have the potential to change the decision and are known to have significant model uncertainties. Because we work with basic event probabilities, this methodology is not appropriate for analyzing uncertainties that cause a structural change to the model, such as success criteria. We use the risk achievement worth (RAW) importance measure with respect to both the core damage frequency (CDF) and the change in core damage frequency (ΔCDF) to identify potentially important basic events. We cross-check these with generically important model uncertainties. Then, sensitivity analysis is performed on the basic event probabilities, which are used as a proxy for the model parameters, to determine how much error in these probabilities would need to be present in order to impact the decision. A previously submitted licensing basis change is used as a case study. Analysis using the SAPHIRE program identifies 20 basic events as important, four of which have model uncertainties that have been identified in the literature as generally important. The decision is fairly insensitive to uncertainties in these basic events. In three of these cases, one would need to show that model uncertainties would lead to basic event probabilities that would be between two and four orders of magnitude larger than modeled in the risk assessment before they would become important to the decision. More detailed analysis would be required to determine whether these higher probabilities are reasonable. Methods to perform this analysis from the literature are reviewed and an example is demonstrated using the case study
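    The RAW screening step described above can be sketched on a hypothetical two-cut-set model (the fault-tree structure and probabilities below are invented, not the case study's):

```python
# minimal cut sets: {A, B} and {C}; basic-event probabilities are invented
base = {"A": 1e-2, "B": 3e-3, "C": 1e-5}

def cdf(p):
    # core damage frequency as the OR of the two minimal cut sets
    return 1.0 - (1.0 - p["A"] * p["B"]) * (1.0 - p["C"])

def raw(event):
    # risk achievement worth: CDF with the event assumed failed / baseline CDF
    return cdf({**base, event: 1.0}) / cdf(base)

raws = {e: raw(e) for e in base}
# events with high RAW and known model uncertainty go on to sensitivity analysis
```

    Here the single-event cut set C dominates the ranking, illustrating why RAW, rather than baseline probability, is the right screen for "could this uncertainty change the decision".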

  8. Ignoring correlation in uncertainty and sensitivity analysis in life cycle assessment: what is the risk?

    Energy Technology Data Exchange (ETDEWEB)

    Groen, E.A., E-mail: Evelyne.Groen@gmail.com [Wageningen University, P.O. Box 338, Wageningen 6700 AH (Netherlands); Heijungs, R. [Vrije Universiteit Amsterdam, De Boelelaan 1105, Amsterdam 1081 HV (Netherlands); Leiden University, Einsteinweg 2, Leiden 2333 CC (Netherlands)

    2017-01-15

    Life cycle assessment (LCA) is an established tool to quantify the environmental impact of a product. A good assessment of uncertainty is important for making well-informed decisions in comparative LCA, as well as for correctly prioritising data collection efforts. Under- or overestimation of output uncertainty (e.g. output variance) will lead to incorrect decisions in such matters. The presence of correlations between input parameters during uncertainty propagation can increase or decrease the output variance. However, most LCA studies that include uncertainty analysis ignore correlations between input parameters during uncertainty propagation, which may lead to incorrect conclusions. Two approaches to include correlations between input parameters during uncertainty propagation and global sensitivity analysis were studied: an analytical approach and a sampling approach. The use of both approaches is illustrated for an artificial case study of electricity production. Results demonstrate that both approaches yield approximately the same output variance and sensitivity indices for this specific case study. Furthermore, we demonstrate that the analytical approach can be used to quantify the risk of ignoring correlations between input parameters during uncertainty propagation in LCA. We demonstrate that: (1) we can predict whether including correlations among input parameters in uncertainty propagation will increase or decrease the output variance; (2) we can quantify the risk of ignoring correlations on the output variance and the global sensitivity indices. Moreover, this procedure requires only little data. - Highlights: • Ignoring correlation leads to under- or overestimation of the output variance. • We demonstrate that the risk of ignoring correlation can be quantified. • The procedure proposed is generally applicable in life cycle assessment. • In some cases, ignoring correlation has a minimal effect on decision-making tools.
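    The analytical point the authors make can be reproduced in a few lines with first-order error propagation: the output variance is g·Cov·g, so dropping the off-diagonal covariance terms biases it one way or the other. The two-parameter model below is invented, not the paper's case study:

```python
import numpy as np

g = np.array([2.0, 1.0])      # partial derivatives of the output w.r.t. inputs

sd = np.array([0.3, 0.4])     # input standard deviations (invented)
rho = -0.6                    # correlation between the two inputs (invented)
cov = np.array([[sd[0]**2,            rho * sd[0] * sd[1]],
                [rho * sd[0] * sd[1], sd[1]**2]])

var_with = g @ cov @ g                  # variance with correlation included
var_without = g @ np.diag(sd**2) @ g    # variance with correlation ignored
```

    With negative correlation, ignoring it overestimates the output variance (0.52 versus 0.232 here); with positive correlation the bias flips sign, which is exactly the under- or overestimation risk the paper quantifies.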

  9. Uncertainties in risk assessment at USDOE facilities

    Energy Technology Data Exchange (ETDEWEB)

    Hamilton, L.D.; Holtzman, S.; Meinhold, A.F.; Morris, S.C.; Rowe, M.D.

    1994-01-01

    The United States Department of Energy (USDOE) has embarked on an ambitious program to remediate environmental contamination at its facilities. Decisions concerning cleanup goals, choices among cleanup technologies, and funding prioritization should be largely risk-based. Risk assessments will be used more extensively by the USDOE in the future. USDOE needs to develop and refine risk assessment methods and fund research to reduce major sources of uncertainty in risk assessments at USDOE facilities. The terms "risk assessment" and "risk management" are frequently confused. The National Research Council (1983) and the United States Environmental Protection Agency (USEPA, 1991a) described risk assessment as a scientific process that contributes to risk management. Risk assessment is the process of collecting, analyzing and integrating data and information to identify hazards, assess exposures and dose responses, and characterize risks. Risk characterization must include a clear presentation of "... the most significant data and uncertainties ..." in an assessment. Significant data and uncertainties are "... those that define and explain the main risk conclusions". Risk management integrates risk assessment information with other considerations, such as risk perceptions, socioeconomic and political factors, and statutes, to make and justify decisions. Risk assessments, as scientific processes, should be made independently of the other aspects of risk management (USEPA, 1991a), but current methods for assessing health risks are based on conservative regulatory principles, causing unnecessary public concern and misallocation of funds for remediation.

  10. Uncertainties in risk assessment at USDOE facilities

    International Nuclear Information System (INIS)

    Hamilton, L.D.; Holtzman, S.; Meinhold, A.F.; Morris, S.C.; Rowe, M.D.

    1994-01-01

    The United States Department of Energy (USDOE) has embarked on an ambitious program to remediate environmental contamination at its facilities. Decisions concerning cleanup goals, choices among cleanup technologies, and funding prioritization should be largely risk-based. Risk assessments will be used more extensively by the USDOE in the future. USDOE needs to develop and refine risk assessment methods and fund research to reduce major sources of uncertainty in risk assessments at USDOE facilities. The terms "risk assessment" and "risk management" are frequently confused. The National Research Council (1983) and the United States Environmental Protection Agency (USEPA, 1991a) described risk assessment as a scientific process that contributes to risk management. Risk assessment is the process of collecting, analyzing and integrating data and information to identify hazards, assess exposures and dose responses, and characterize risks. Risk characterization must include a clear presentation of "... the most significant data and uncertainties ..." in an assessment. Significant data and uncertainties are "... those that define and explain the main risk conclusions". Risk management integrates risk assessment information with other considerations, such as risk perceptions, socioeconomic and political factors, and statutes, to make and justify decisions. Risk assessments, as scientific processes, should be made independently of the other aspects of risk management (USEPA, 1991a), but current methods for assessing health risks are based on conservative regulatory principles, causing unnecessary public concern and misallocation of funds for remediation

  11. Risk analysis under uncertainty, the precautionary principle, and the new EU chemicals strategy.

    Science.gov (United States)

    Rogers, Michael D

    2003-06-01

    Three categories of uncertainty in relation to risk assessment are defined: uncertainty in effect, uncertainty in cause, and uncertainty in the relationship between a hypothesised cause and effect. The Precautionary Principle (PP) relates to the third type of uncertainty. Three broad descriptions of the PP are set out: uncertainty justifies action, uncertainty requires action, and uncertainty requires a reversal of the burden of proof for risk assessments. The application of the PP is controversial, but what matters in practice is the precautionary action (PA) that follows. The criteria by which PAs should be judged are detailed. This framework for risk assessment and management under uncertainty is then applied to the envisaged European system for the regulation of chemicals. A new EU regulatory system has been proposed which shifts the burden of proof concerning risk assessments from the regulator to the producer, and embodies the PP in all three of its main regulatory stages. The proposals are critically discussed in relation to three chemicals, namely, atrazine (an endocrine disrupter), cadmium (toxic and possibly carcinogenic), and hydrogen fluoride (a toxic, high-production-volume chemical). Reversing the burden of proof will speed up the regulatory process, but the examples demonstrate that applying the PP appropriately, and balancing the countervailing risks and the socio-economic benefits, will continue to be a difficult task for the regulator. The paper concludes with a discussion of the role of precaution in the management of change and of the importance of trust in the effective regulation of uncertain risks.

  12. Risk assessment under deep uncertainty: A methodological comparison

    International Nuclear Information System (INIS)

    Shortridge, Julie; Aven, Terje; Guikema, Seth

    2017-01-01

    Probabilistic Risk Assessment (PRA) has proven to be an invaluable tool for evaluating risks in complex engineered systems. However, there is increasing concern that PRA may not be adequate in situations with little underlying knowledge to support probabilistic representation of uncertainties. As analysts and policy makers turn their attention to deeply uncertain hazards such as climate change, a number of alternatives to traditional PRA have been proposed. This paper systematically compares three diverse approaches for risk analysis under deep uncertainty (qualitative uncertainty factors, probability bounds, and robust decision making) in terms of their representation of uncertain quantities, analytical output, and implications for risk management. A simple example problem is used to highlight differences in the way that each method relates to the traditional risk assessment process and fundamental issues associated with risk assessment and description. We find that the implications for decision making are not necessarily consistent between approaches, and that differences in the representation of uncertain quantities and analytical output suggest contexts in which each method may be most appropriate. Finally, each methodology demonstrates how risk assessment can inform decision making in deeply uncertain contexts, supporting more effective responses to risk problems characterized by deep uncertainty. - Highlights: • We compare three diverse approaches to risk assessment under deep uncertainty. • A simple example problem highlights differences in analytical process and results. • Results demonstrate how methodological choices can impact risk assessment results.

  13. The Uncertainties of Risk Management

    DEFF Research Database (Denmark)

    Vinnari, Eija; Skærbæk, Peter

    2014-01-01

    Purpose – The purpose of this paper is to analyse the implementation of risk management as a tool for internal audit activities, focusing on unexpected effects or uncertainties generated during its application. Design/methodology/approach – Public and confidential documents as well as semi-structured interviews are analysed through the lens of actor-network theory to identify the effects of risk management devices in a Finnish municipality. Findings – The authors found that risk management, rather than reducing uncertainty, itself created unexpected uncertainties that would otherwise not have emerged... for expanding risk management. More generally, such uncertainties relate to the professional identities and responsibilities of operational managers as defined by the framing devices. Originality/value – The paper offers three contributions to the extant literature: first, it shows how risk management itself...

  14. Evaluation of risk impact of changes to Completion Times addressing model and parameter uncertainties

    International Nuclear Information System (INIS)

    Martorell, S.; Martón, I.; Villamizar, M.; Sánchez, A.I.; Carlos, S.

    2014-01-01

    This paper presents an approach, and an example application, for evaluating the risk impact of changes to Completion Times within the License Basis of a Nuclear Power Plant, based on the use of Probabilistic Risk Assessment with identification, treatment and analysis of uncertainties addressed in an integrated manner. It allows full development of a three-tiered approach (Tiers 1-3) following the principles of risk-informed decision-making accounting for uncertainties, as proposed by many regulators. A Completion Time is the maximum time a safety-related equipment item is allowed to remain out of service, e.g. for corrective maintenance; it is established within the Limiting Conditions for Operation included in the Technical Specifications for operation of a Nuclear Power Plant. The case study focuses on a Completion Time change for the Accumulators System of a Nuclear Power Plant using a level 1 PRA, and considers several sources of model and parameter uncertainty. The results obtained show that the risk impact of the proposed CT change, including both types of epistemic uncertainty, is small compared with the current safety goals of concern to Tier 1. With respect to Tiers 2 and 3, however, the results show how the use of some traditional and uncertainty importance measures helps in identifying high-risk configurations that should be avoided in NPP Technical Specifications no matter the duration of the CT (Tier 2), and other configurations that could become part of a configuration risk management program (Tier 3). - Highlights: • New approach for evaluation of risk impact of changes to Completion Times. • Integrated treatment and analysis of model and parameter uncertainties. • PSA based application to support risk-informed decision-making. • Measures of importance for identification of risky configurations. • Management of important safety issues to accomplish safety goals
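
    The Tier 1 quantity behind such a Completion Time evaluation can be sketched with back-of-envelope arithmetic: the incremental risk for a single outage is roughly the rise in conditional core damage frequency times the allowed outage time. All numbers below are illustrative assumptions, not the paper's results.

```python
# Hypothetical figures; the structure (delta-CDF times outage duration) is
# the usual single-downtime risk measure in risk-informed CT reviews.
baseline_cdf = 2.0e-5       # core damage frequency, all trains available (/yr)
conditional_cdf = 8.0e-5    # CDF with one accumulator train out of service
ct_hours = 72.0             # proposed Completion Time

# Incremental conditional core damage probability for one outage of length CT:
icdp = (conditional_cdf - baseline_cdf) * ct_hours / 8760.0
print(f"ICDP for one outage: {icdp:.2e}")
```

The result (about 5e-7 here) would then be compared against the regulator's acceptance guideline for a single planned configuration.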

  15. Quantile uncertainty and value-at-risk model risk.

    Science.gov (United States)

    Alexander, Carol; Sarabia, José María

    2012-08-01

    This article develops a methodology for quantifying model risk in quantile risk estimates. The application of quantile estimates to risk assessment has become common practice in many disciplines, including hydrology, climate change, statistical process control, insurance and actuarial science, and the uncertainty surrounding these estimates has long been recognized. Our work is particularly important in finance, where quantile estimates (called Value-at-Risk) have been the cornerstone of banking risk management since the mid 1980s. A recent amendment to the Basel II Accord recommends additional market risk capital to cover all sources of "model risk" in the estimation of these quantiles. We provide a novel and elegant framework whereby quantile estimates are adjusted for model risk, relative to a benchmark which represents the state of knowledge of the authority that is responsible for model risk. A simulation experiment in which the degree of model risk is controlled illustrates how to quantify Value-at-Risk model risk and compute the required regulatory capital add-on for banks. An empirical example based on real data shows how the methodology can be put into practice, using only two time series (daily Value-at-Risk and daily profit and loss) from a large bank. We conclude with a discussion of potential applications to nonfinancial risks. © 2012 Society for Risk Analysis.
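
    The kind of quantile estimate the article studies can be illustrated in a few lines; the normally distributed P&L series and its scale are assumptions for illustration, not data from the paper.

```python
# Empirical 99% Value-at-Risk from one year of simulated daily P&L.
import numpy as np

rng = np.random.default_rng(0)
pnl = rng.normal(loc=0.0, scale=1e6, size=250)   # 250 trading days of P&L

# VaR is the loss threshold exceeded with 1% probability, i.e. the negated
# 1st percentile of the P&L distribution.
var_99 = -np.quantile(pnl, 0.01)
print(f"99% VaR estimate: {var_99:,.0f}")
```

The sampling error in this empirical quantile, plus the error from assuming the wrong distribution, is exactly the "model risk" the article proposes to quantify and capitalise.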

  16. Foundations of Risk Analysis

    CERN Document Server

    Aven, Terje

    2012-01-01

    Foundations of Risk Analysis presents the issues core to risk analysis - understanding what risk means, expressing risk, building risk models, addressing uncertainty, and applying probability models to real problems. The author provides the readers with the knowledge and basic thinking they require to successfully manage risk and uncertainty to support decision making. This updated edition reflects recent developments on risk and uncertainty concepts, representations and treatment. New material in Foundations of Risk Analysis includes: an up-to-date presentation of how to understand, define and

  17. Uncertainties and demonstration of compliance with numerical risk standards

    International Nuclear Information System (INIS)

    Preyssl, C.; Cullingford, M.C.

    1987-01-01

    When dealing with numerical results of a probabilistic risk analysis performed for a complex system, such as a nuclear power plant, one major objective may be to deal with the problem of compliance or non-compliance with a prefixed risk standard. The uncertainties in the risk results associated with the consequences and their probabilities of occurrence may be considered by representing the risk as a risk band. Studying the area and distance between the upper and lower bound of the risk band provides consistent information on the uncertainties in terms of risk, not by means of scalars only but also by real functions. Criteria can be defined for determining compliance with a numerical risk standard, and the 'weighting functional' method, representing a possible tool for testing compliance of risk results, is introduced. By shifting the upper confidence bound due to redefinition, part of the risk band may exceed the standard without changing the underlying results. Using the concept described it is possible to determine the amount of risk, i.e. uncertainty, exceeding the standard. The mathematical treatment of uncertainties therefore allows probabilistic risk assessment results to be compared. A realistic example illustrates the method. (author)

  18. Reliability analysis under epistemic uncertainty

    International Nuclear Information System (INIS)

    Nannapaneni, Saideep; Mahadevan, Sankaran

    2016-01-01

    This paper proposes a probabilistic framework to include both aleatory and epistemic uncertainty within model-based reliability estimation of engineering systems for individual limit states. Epistemic uncertainty is considered due to both data and model sources. Sparse point and/or interval data regarding the input random variables leads to uncertainty regarding their distribution types, distribution parameters, and correlations; this statistical uncertainty is included in the reliability analysis through a combination of likelihood-based representation, Bayesian hypothesis testing, and Bayesian model averaging techniques. Model errors, which include numerical solution errors and model form errors, are quantified through Gaussian process models and included in the reliability analysis. The probability integral transform is used to develop an auxiliary variable approach that facilitates a single-level representation of both aleatory and epistemic uncertainty. This strategy results in an efficient single-loop implementation of Monte Carlo simulation (MCS) and FORM/SORM techniques for reliability estimation under both aleatory and epistemic uncertainty. Two engineering examples are used to demonstrate the proposed methodology. - Highlights: • Epistemic uncertainty due to data and model included in reliability analysis. • A novel FORM-based approach proposed to include aleatory and epistemic uncertainty. • A single-loop Monte Carlo approach proposed to include both types of uncertainties. • Two engineering examples used for illustration.
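
    The single-level treatment of the two uncertainty types can be sketched with a toy limit state (all distributions and numbers are assumptions for illustration, not the paper's examples): the epistemic parameter, here an uncertain mean load, and the aleatory variability are sampled in one Monte Carlo loop rather than nested loops.

```python
# Single-loop Monte Carlo over both epistemic and aleatory uncertainty.
import numpy as np

rng = np.random.default_rng(11)
n = 200_000

mu_load = rng.normal(50.0, 2.0, size=n)   # epistemic: uncertain mean load
load = rng.normal(mu_load, 5.0)           # aleatory: realised load per sample
capacity = 65.0                           # deterministic capacity

# Failure probability P(load > capacity), estimated in one pass:
pf = np.mean(load > capacity)
print(f"failure probability: {pf:.4f}")
```

Each sample draws the epistemic parameter and the aleatory realisation together, so one loop of size n replaces an outer epistemic loop around an inner aleatory loop.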

  19. Improved Monte Carlo Method for PSA Uncertainty Analysis

    International Nuclear Information System (INIS)

    Choi, Jongsoo

    2016-01-01

    The treatment of uncertainty is an important issue for regulatory decisions. Uncertainties exist from knowledge limitations. A probabilistic approach has exposed some of these limitations and provided a framework to assess their significance and assist in developing a strategy to accommodate them in the regulatory process. The uncertainty analysis (UA) is usually based on the Monte Carlo method. This paper proposes a Monte Carlo UA approach to calculate the mean risk metrics accounting for the SOKC between basic events (including CCFs) using efficient random number generators and to meet Capability Category III of the ASME/ANS PRA standard. Audit calculation is needed in PSA regulatory reviews of uncertainty analysis results submitted for licensing. The proposed Monte Carlo UA approach provides a high degree of confidence in PSA reviews. All PSA needs accounting for the SOKC between event probabilities to meet the ASME/ANS PRA standard

  20. Improved Monte Carlo Method for PSA Uncertainty Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Jongsoo [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2016-10-15

    The treatment of uncertainty is an important issue for regulatory decisions. Uncertainties exist from knowledge limitations. A probabilistic approach has exposed some of these limitations and provided a framework to assess their significance and assist in developing a strategy to accommodate them in the regulatory process. The uncertainty analysis (UA) is usually based on the Monte Carlo method. This paper proposes a Monte Carlo UA approach to calculate the mean risk metrics accounting for the SOKC between basic events (including CCFs) using efficient random number generators and to meet Capability Category III of the ASME/ANS PRA standard. Audit calculation is needed in PSA regulatory reviews of uncertainty analysis results submitted for licensing. The proposed Monte Carlo UA approach provides a high degree of confidence in PSA reviews. All PSA needs accounting for the SOKC between event probabilities to meet the ASME/ANS PRA standard.

  1. Structural uncertainty in seismic risk analysis. Seismic safety margins research program

    Energy Technology Data Exchange (ETDEWEB)

    Hasselman, T K; Simonian, S S [J.H. Wiggins Company (United States)

    1980-03-01

    This report documents the formulation of a methodology for modeling and evaluating the effects of structural uncertainty on predicted modal characteristics of the major structures and substructures of commercial nuclear power plants. The uncertainties are cast in the form of normalized random variables which represent the demonstrated ability to predict modal frequencies, damping and modal response amplitudes for broad generic types of structures (steel frame, reinforced concrete and prestressed concrete). Data based on observed differences between predicted and measured structural performance at the member, substructure, and/or major structural system levels are used to quantify uncertainties and thus form the data base for statistical analysis. Proper normalization enables data from non-nuclear structures, e.g., office buildings, to be included in the data base. Numerous alternative methods are defined within the general framework of this methodology. The report also documents the results of a data survey to identify, classify and evaluate available data for the required data base. A bibliography of 95 references is included. Deficiencies in the currently identified data base are exposed, and remedial measures suggested. Recommendations are made for implementation of the methodology. (author)

  2. Accounting for predictive uncertainty in a risk analysis focusing on radiological contamination of groundwater

    International Nuclear Information System (INIS)

    Andricevic, R.; Jacobson, R.L.; Daniels, J.I.

    1994-12-01

    This study focuses on the probabilistic travel time approach for predicting the transport of radionuclides by groundwater, taking parameter uncertainty into account. The principal entity in the presented model is a travel time probability density function (pdf) conditioned on the set of parameters used to describe the different transport processes: advection, dispersion, sorption, and decay. The model is applied to predict the arrival time of radionuclides in groundwater from the Nevada Test Site (NTS) at possible nearby locations of potential human receptors. Because it does not sorb, tritium is found to pose the largest risk. Inclusion of sorption processes indicates that parameter uncertainty, and especially the negative correlation between the mean velocity and the sorption strength, is instrumental in evaluating radionuclide arrival times at the prespecified accessible environment. Our analysis of potential health risks takes into consideration uncertainties in physiological attributes, as well as in committed effective dose and the estimate of physical detriment per unit committed effective dose

  3. Use of quantitative uncertainty analysis for human health risk assessment

    International Nuclear Information System (INIS)

    Duncan, F.L.W.; Gordon, J.W.; Kelly, M.

    1994-01-01

    Current human health risk assessment methods for environmental risks typically use point estimates of risk accompanied by qualitative discussions of uncertainty. Alternatively, Monte Carlo simulations may be used with distributions for input parameters to estimate the resulting risk distribution and descriptive risk percentiles. These two techniques are applied to the ingestion of 1,1-dichloroethene in ground water. The results indicate that Monte Carlo simulations provide significantly more information for risk assessment and risk management than do point estimates.
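
    The contrast between the two techniques can be sketched as follows; the exposure model and every parameter value are illustrative assumptions, not the study's data.

```python
# Point estimate vs Monte Carlo distribution for an ingestion-risk model:
# risk = concentration * intake * slope factor / body weight.
import numpy as np

rng = np.random.default_rng(1)
n = 50_000

conc = rng.lognormal(mean=np.log(0.05), sigma=0.5, size=n)  # mg/L in water
intake = rng.normal(2.0, 0.5, size=n).clip(min=0.5)         # L/day ingested
bw = rng.normal(70.0, 10.0, size=n).clip(min=40.0)          # kg body weight
slope = 0.6                                                  # (mg/kg-day)^-1

risk = conc * intake * slope / bw            # full risk distribution
point = 0.05 * 2.0 * slope / 70.0            # single point estimate

p50, p95 = np.percentile(risk, [50, 95])
print(f"point: {point:.2e}  median: {p50:.2e}  95th percentile: {p95:.2e}")
```

The point estimate yields one number; the simulation additionally shows how far the upper percentiles of the exposed population sit above it.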

  4. Integration of expert knowledge and uncertainty in natural risk assessment

    Science.gov (United States)

    Baruffini, Mirko; Jaboyedoff, Michel

    2010-05-01

    Natural hazards occurring in alpine regions during recent decades have clearly shown that interruptions of the Swiss railway power supply and closures of the Gotthard highway due to such events have increased awareness of infrastructure vulnerability in Switzerland as well, and illustrate the potential impacts of failures on the performance of infrastructure systems. This calls for a high level of surveillance and preservation along the transalpine lines. Traditional simulation models are only partially capable of predicting the behaviour of complex systems, and the protection strategies designed and implemented on their basis cannot mitigate the full spectrum of risk consequences. They are costly, and maximal protection is most probably not economically feasible. In addition, quantitative risk assessment approaches such as fault tree analysis, event tree analysis and equivalent annual fatality analysis rely heavily on statistical information. Collecting sufficient data on which to base a statistical probability of risk is costly and, in many situations, such data do not exist; thus, expert knowledge and experience or engineering judgment can be exploited to estimate risk qualitatively. To overcome the lack of statistics, we used models based on expert knowledge that make qualitative predictions from linguistic appraisals, which are more expressive and natural in risk assessment. Fuzzy reasoning (FR) provides a mechanism of computing with words (Zadeh, 1965) for modelling qualitative human thought processes in analyzing complex systems and decisions. Uncertainty in predicting the risk levels arises in such situations because no fully formalized knowledge is available. Another possibility is to use probabilities based on a triangular probability density function (T-PDF), which can follow the same flow-chart as FR. We implemented the Swiss natural hazard recommendations with FR and with T-PDF-based probability in order to obtain hazard zoning and
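
    The T-PDF idea mentioned above amounts to letting an expert supply a minimum, most-likely and maximum value and treating these as a triangular distribution; the numbers here are invented for illustration.

```python
# Triangular PDF as a stand-in for missing statistics: sample from the
# expert's (min, most-likely, max) estimates and check the sample mean.
import random

random.seed(7)
low, mode, high = 10.0, 30.0, 80.0   # expert's min / most-likely / max

samples = [random.triangular(low, high, mode) for _ in range(20_000)]
mean_est = sum(samples) / len(samples)

# Analytical mean of a triangular distribution: (low + mode + high) / 3.
print(mean_est, (low + mode + high) / 3)
```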

  5. Statistical analysis tolerance using jacobian torsor model based on uncertainty propagation method

    Directory of Open Access Journals (Sweden)

    W Ghie

    2016-04-01

    Full Text Available One risk inherent in the use of assembly components is that the behaviour of these components is discovered only at the moment an assembly is being carried out. The objective of our work is to enable designers to use known component tolerances as parameters in models that can be used to predict properties at the assembly level. In this paper we present a statistical approach to assemblability evaluation, based on tolerance and clearance propagations. This new statistical analysis method for tolerance is based on the Jacobian-Torsor model and the uncertainty measurement approach. We show how this can be accomplished by modeling the distribution of manufactured dimensions through applying a probability density function. By presenting an example we show how statistical tolerance analysis should be used in the Jacobian-Torsor model. This work is supported by previous efforts aimed at developing a new generation of computational tools for tolerance analysis and synthesis, using the Jacobian-Torsor approach. This approach is illustrated on a simple three-part assembly, demonstrating the method's capability in handling three-dimensional geometry.

  6. An introductory guide to uncertainty analysis in environmental and health risk assessment. Environmental Restoration Program

    International Nuclear Information System (INIS)

    Hammonds, J.S.; Hoffman, F.O.; Bartell, S.M.

    1994-12-01

    This report presents guidelines for evaluating uncertainty in mathematical equations and computer models applied to assess human health and environmental risk. Uncertainty analyses involve the propagation of uncertainty in model parameters and model structure to obtain confidence statements for the estimate of risk and to identify the model components of dominant importance. Uncertainty analyses are required when there is no a priori knowledge about uncertainty in the risk estimate and when there is a chance that the failure to assess uncertainty may affect the selection of wrong options for risk reduction. Uncertainty analyses are most effective when they are conducted in an iterative mode. When the uncertainty in the risk estimate is intolerable for decision-making, additional data are acquired for the dominant model components that contribute most to uncertainty. This process is repeated until the level of residual uncertainty can be tolerated. Analytical and numerical methods for error propagation are presented, along with methods for identifying the most important contributors to uncertainty. Monte Carlo simulation with either Simple Random Sampling (SRS) or Latin Hypercube Sampling (LHS) is proposed as the most robust method for propagating uncertainty through either simple or complex models. A distinction is made between simulating a stochastically varying assessment endpoint (i.e., the distribution of individual risks in an exposed population) and quantifying uncertainty due to lack of knowledge about a fixed but unknown quantity (e.g., a specific individual, the maximally exposed individual, or the mean, median, or 95th percentile of the distribution of exposed individuals). Emphasis is placed on the need for subjective judgement to quantify uncertainty when relevant data are absent or incomplete
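
    The two sampling schemes named in the report differ only in how the unit interval is covered; a minimal one-dimensional sketch (the toy model and sample size are assumptions):

```python
# SRS draws n independent points; LHS splits [0, 1) into n equal strata and
# draws exactly one point in each, which stabilises propagated estimates.
import numpy as np

rng = np.random.default_rng(3)
n = 1000

srs = rng.random(n)                          # SRS: n independent draws
lhs = (np.arange(n) + rng.random(n)) / n     # LHS: one draw per stratum
rng.shuffle(lhs)                             # random order, as needed for
                                             # pairing with other dimensions

# Propagate both samples through a toy model y = x**2 (true mean is 1/3).
print(np.mean(srs**2), np.mean(lhs**2))
```

Because every stratum is covered exactly once, the LHS estimate of the mean is typically far closer to 1/3 than the SRS estimate at the same sample size.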

  7. A dominance-based approach to map risks of ecological invasions in the presence of severe uncertainty

    Science.gov (United States)

    Denys Yemshanov; Frank H. Koch; D. Barry Lyons; Mark Ducey; Klaus Koehler

    2012-01-01

    Aim Uncertainty has been widely recognized as one of the most critical issues in predicting the expansion of ecological invasions. The uncertainty associated with the introduction and spread of invasive organisms influences how pest management decision makers respond to expanding incursions. We present a model-based approach to map risk of ecological invasions that...

  8. Guidance for treatment of variability and uncertainty in ecological risk assessments of contaminated sites

    International Nuclear Information System (INIS)

    1998-06-01

    Uncertainty is a seemingly simple concept that has caused great confusion and conflict in the field of risk assessment. This report offers guidance for the analysis and presentation of variability and uncertainty in ecological risk assessments, an important issue in the remedial investigation and feasibility study processes. This report discusses concepts of probability in terms of variance and uncertainty, describes how these concepts differ in ecological risk assessment from human health risk assessment, and describes probabilistic aspects of specific ecological risk assessment techniques. The report ends with 17 points to consider in performing an uncertainty analysis for an ecological risk assessment of a contaminated site

  9. Advanced uncertainty modelling for container port risk analysis.

    Science.gov (United States)

    Alyami, Hani; Yang, Zaili; Riahi, Ramin; Bonsall, Stephen; Wang, Jin

    2016-08-13

    Globalization has led to a rapid increase of container movements in seaports. Risks in seaports need to be appropriately addressed to ensure economic wealth, operational efficiency, and personnel safety. As a result, the safety performance of a Container Terminal Operational System (CTOS) plays a growing role in improving the efficiency of international trade. This paper proposes a novel method to facilitate the application of Failure Mode and Effects Analysis (FMEA) in assessing the safety performance of CTOS. The new approach is developed by incorporating a Fuzzy Rule-Based Bayesian Network (FRBN) with Evidential Reasoning (ER) in a complementary manner. The former provides a realistic and flexible method to describe input failure information for risk estimates of individual hazardous events (HEs) at the bottom level of a risk analysis hierarchy. The latter is used to aggregate HE safety estimates collectively, allowing dynamic risk-based decision support in CTOS from a systematic perspective. The novel feature of the proposed method, compared to traditional port risk analysis methods, lies in a dynamic model capable of dealing with continually changing operational conditions in ports. More importantly, a new sensitivity analysis method is developed and carried out to rank the HEs by taking into account their specific risk estimations (locally) and their Risk Influence (RI) on a port's safety system (globally). Due to its generality, the new approach can be tailored for a wide range of applications in different safety and reliability engineering and management systems, particularly when real-time risk ranking is required to measure, predict, and improve the associated system safety performance. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis.

    Science.gov (United States)

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-03-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable, and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA), implemented in GIS. The methodology is composed of three phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights.
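
    The Monte Carlo weight-sensitivity step can be sketched for two map cells; the criteria, weights and scores below are invented for illustration, not the study's data.

```python
# Perturb MCDA criteria weights and check how often the susceptibility
# ranking of two cells changes under the perturbation.
import numpy as np

rng = np.random.default_rng(5)
n = 10_000

base_w = np.array([0.5, 0.3, 0.2])    # e.g. slope, lithology, land use
cell_a = np.array([0.9, 0.3, 0.4])    # criterion scores for cell A
cell_b = np.array([0.5, 0.6, 0.6])    # criterion scores for cell B

# Perturb each weight by +/-20% and renormalise so the weights sum to one.
w = base_w * rng.uniform(0.8, 1.2, size=(n, 3))
w /= w.sum(axis=1, keepdims=True)

score_a = w @ cell_a
score_b = w @ cell_b
flip_rate = np.mean(score_a < score_b)   # fraction of draws where B overtakes A

# With these margins, the ranking A > B never flips under +/-20% weight noise.
print(f"flip rate: {flip_rate:.3f}")
```

A nonzero flip rate would flag a cell pair whose relative susceptibility is sensitive to the analyst's weighting choices.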

  11. Uncertainty: lotteries and risk

    OpenAIRE

    Ávalos, Eloy

    2011-01-01

    In this paper we develop the theory of choice under uncertainty in a context where the risks assumed by the individual are measurable and manageable. We primarily use the definition of a lottery to formulate the axioms of the individual's preferences and their representation through a von Neumann-Morgenstern utility function. We study the expected utility theorem and its properties, the paradoxes of choice under uncertainty and, finally, measures of risk aversion with monetary lotteries.
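
As a worked illustration of the expected-utility machinery this record reviews (illustrative numbers, not from the paper): a risk-averse von Neumann-Morgenstern agent with a concave utility values a monetary lottery at a certainty equivalent below its expected value, the gap being the risk premium.

```python
import math

# A monetary lottery: 50% chance of 100, 50% chance of 0.
outcomes = [(0.5, 100.0), (0.5, 0.0)]

def u(x):
    """Concave (risk-averse) utility; log-utility shifted to handle x = 0."""
    return math.log(1.0 + x)

expected_value = sum(p * x for p, x in outcomes)
expected_utility = sum(p * u(x) for p, x in outcomes)
# Certainty equivalent: the sure amount with the same utility as the lottery.
certainty_equivalent = math.exp(expected_utility) - 1.0
risk_premium = expected_value - certainty_equivalent
print(expected_value, certainty_equivalent, risk_premium)
```

For this lottery the certainty equivalent works out to sqrt(101) - 1, roughly 9.05, far below the expected value of 50, which is the signature of strong risk aversion under log utility.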

  12. A Proposal on the Advanced Sampling Based Sensitivity and Uncertainty Analysis Method for the Eigenvalue Uncertainty Analysis

    International Nuclear Information System (INIS)

    Kim, Song Hyun; Song, Myung Sub; Shin, Chang Ho; Noh, Jae Man

    2014-01-01

    In using perturbation theory, the uncertainty of the response can be estimated with a single transport simulation, so the computational load is small. However, it has the disadvantage that the computational methodology must be modified whenever a different response type, such as the multiplication factor, flux, or power distribution, is to be estimated. Hence, it is suitable for analyzing a few responses with many perturbed parameters. The statistical approach is a sampling-based method which uses cross sections randomly sampled from covariance data to analyze the uncertainty of the response. XSUSA is a code based on this statistical approach. Only the cross sections are modified by the sampling; thus, general transport codes can be utilized directly for S/U analysis without any code modifications. However, to obtain the uncertainty distribution of the result, the simulation must be repeated many times with randomly sampled cross sections, and this inefficiency is a known disadvantage of the stochastic method. In this study, to increase the estimation efficiency of the sampling-based S/U method, an advanced sampling and estimation method is proposed and verified. The main feature of the proposed method is that the cross section averaged from the individual sampled cross sections is used. The proposed method was validated against perturbation theory

  13. Uncertainty analysis techniques

    International Nuclear Information System (INIS)

    Marivoet, J.; Saltelli, A.; Cadelli, N.

    1987-01-01

    The origin of the uncertainty affecting performance assessments, as well as its propagation to dose and risk results, is discussed. The analysis focuses essentially on the uncertainties introduced by the input parameters, the values of which may range over several orders of magnitude and may be given as probability distribution functions. The paper briefly reviews the existing sampling techniques used for Monte Carlo simulations and the methods for characterizing the output curves and determining their convergence and confidence limits. Annual doses, expectation values of the doses, and risks are computed for a particular case of a possible repository in clay, in order to illustrate the significance of such output characteristics as the mean, the logarithmic mean, and the median, as well as their ratios. The report concludes that, provisionally, an estimator such as the 90th percentile may, owing to its better robustness, be substituted for the arithmetic mean when comparing estimated doses with acceptance criteria. In any case, the results obtained through uncertainty analyses must be interpreted with caution as long as the input data distribution functions are not derived from experiments reasonably reproducing the situation in a well-characterized repository and site
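
The output characteristics this report compares (mean, median, high percentiles) can be reproduced on a synthetic Monte Carlo sample. A heavy-tailed lognormal "dose" distribution is assumed here purely for illustration; it is exactly the situation where a high percentile is a more robust comparison statistic than the arithmetic mean, which is dragged upward by the tail.

```python
import random

rng = random.Random(0)
# Synthetic annual-dose sample from a lognormal output distribution
# (illustrative parameters; real samples would come from the repository model).
doses = sorted(rng.lognormvariate(mu=0.0, sigma=1.5) for _ in range(10000))

n = len(doses)
mean = sum(doses) / n
median = doses[n // 2]
p90 = doses[int(0.9 * n)]
print(f"mean={mean:.2f}  median={median:.2f}  90th percentile={p90:.2f}")
```

With this skew the sample mean sits well above the median, while the 90th percentile is an order statistic and therefore much less sensitive to a few extreme realizations.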

  14. Uncertainty analysis for geologic disposal of radioactive waste

    International Nuclear Information System (INIS)

    Cranwell, R.M.; Helton, J.C.

    1981-01-01

    The incorporation and representation of uncertainty in the analysis of the consequences and risks associated with the geologic disposal of high-level radioactive waste are discussed. Such uncertainty has three primary components: process modeling uncertainty, model input data uncertainty, and scenario uncertainty. The following topics are considered in connection with these components: propagation of uncertainty in the modeling of a disposal site, sampling of input data for models, and uncertainty associated with model output.

  15. Model uncertainty in safety assessment

    International Nuclear Information System (INIS)

    Pulkkinen, U.; Huovinen, T.

    1996-01-01

    Uncertainty analyses are an essential part of any risk assessment. Usually the uncertainties of reliability model parameter values are described by probability distributions, and the uncertainty is propagated through the whole risk model. In addition to the parameter uncertainties, the assumptions behind the risk models may be based on insufficient experimental observations, and the models themselves may not be exact descriptions of the phenomena under analysis. The description and quantification of this type of uncertainty, model uncertainty, is the topic of this report. Model uncertainty is characterized, and some approaches to modelling and quantifying it are discussed. The emphasis is on so-called mixture models, which have been applied in PSAs. Some possible disadvantages of the mixture model are addressed. In addition to the quantitative analyses, qualitative analysis is also discussed briefly. To illustrate the models, two simple case studies on failure intensity and human error modeling are described. In both examples, the analysis is based on simple mixture models, which are found to be applicable in PSA analyses. (orig.) (36 refs., 6 figs., 2 tabs.)

  16. Model uncertainty in safety assessment

    Energy Technology Data Exchange (ETDEWEB)

    Pulkkinen, U; Huovinen, T [VTT Automation, Espoo (Finland). Industrial Automation

    1996-01-01

    Uncertainty analyses are an essential part of any risk assessment. Usually the uncertainties of reliability model parameter values are described by probability distributions, and the uncertainty is propagated through the whole risk model. In addition to the parameter uncertainties, the assumptions behind the risk models may be based on insufficient experimental observations, and the models themselves may not be exact descriptions of the phenomena under analysis. The description and quantification of this type of uncertainty, model uncertainty, is the topic of this report. Model uncertainty is characterized, and some approaches to modelling and quantifying it are discussed. The emphasis is on so-called mixture models, which have been applied in PSAs. Some possible disadvantages of the mixture model are addressed. In addition to the quantitative analyses, qualitative analysis is also discussed briefly. To illustrate the models, two simple case studies on failure intensity and human error modeling are described. In both examples, the analysis is based on simple mixture models, which are found to be applicable in PSA analyses. (orig.) (36 refs., 6 figs., 2 tabs.).

  17. Assessing risks and uncertainties in forest dynamics under different management scenarios and climate change

    Directory of Open Access Journals (Sweden)

    Matthias Albert

    2015-05-01

    Full Text Available. Background: Forest management faces a climate-induced shift in growth potential and increasing current and emerging new risks. Vulnerability analysis provides decision support based on projections of natural resources, taking risks and uncertainties into account. In this paper we (1) characterize differences in forest dynamics under three management scenarios, (2) analyse the effects of the three scenarios on two risk factors, windthrow and drought stress, and (3) quantify the effects and the amount of uncertainty arising from climate projections on height increment and drought stress. Methods: In four regions in northern Germany, we apply three contrasting management scenarios and project forest development under climate change until 2070. Three climate runs (minimum, median, maximum) based on the emission scenario RCP 8.5 control the site-sensitive forest growth functions. The minimum and maximum climate runs define the range of prospective climate development. Results: The projections of the different management regimes until 2070 show the diverging medium-term effects of thinnings and harvests and the long-term effects of species conversion on a regional scale. Examples of windthrow vulnerability and drought stress reveal how adaptation measures depend on the applied management path and the decision-maker's risk attitude. Uncertainty analysis shows the increasing variability of drought risk projections with time. The effects of climate projections on height growth are quantified, and uncertainty analysis reveals that the height growth of young trees is dominated by the age trend, whereas the climate signal in the height increment of older trees is decisive. Conclusions: Drought risk is a serious issue in the eastern regions independent of the applied silvicultural scenario, but adaptation measures are limited as the proportion of the most drought-tolerant species, Scots pine, is already high. Windthrow risk is no serious overall threat in any region, but adequate

  18. Deterministic uncertainty analysis

    International Nuclear Information System (INIS)

    Worley, B.A.

    1987-12-01

    This paper presents a deterministic uncertainty analysis (DUA) method for calculating uncertainties that has the potential to significantly reduce the number of computer runs compared to conventional statistical analysis. The method is based upon the availability of derivative and sensitivity data such as that calculated using the well-known direct or adjoint sensitivity analysis techniques. The formation of response surfaces using derivative data and the propagation of input probability distributions are discussed relative to their role in the DUA method. A sample problem that models the flow of water through a borehole is used as a basis to compare the cumulative distribution function of the flow rate as calculated by standard statistical methods and by the DUA method. Propagation of uncertainties by the DUA method is compared for ten cases in which the number of reference model runs was varied from one to ten. The DUA method gives a more accurate representation of the true cumulative distribution of the flow rate based upon as few as two model executions, compared to fifty model executions using a statistical approach. 16 refs., 4 figs., 5 tabs
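
The core idea, propagating input uncertainty through derivative (sensitivity) information instead of many model runs, can be sketched for a toy model (not the borehole problem itself): a first-order Taylor expansion built from derivatives at a single nominal point reproduces the output variance that a brute-force Monte Carlo needs thousands of runs to estimate.

```python
import math
import random

# Toy model standing in for the borehole flow model.
def model(x1, x2):
    return 3.0 * x1 + math.sin(x2)

# Input uncertainties: independent normals (illustrative values).
mu1, s1 = 1.0, 0.2
mu2, s2 = 0.5, 0.1

# --- Deterministic route: one nominal point plus analytic derivatives.
d1 = 3.0                 # df/dx1 at the nominal point
d2 = math.cos(mu2)       # df/dx2 at the nominal point
var_dua = (d1 * s1) ** 2 + (d2 * s2) ** 2   # first-order variance propagation

# --- Statistical route: brute-force Monte Carlo for comparison.
rng = random.Random(1)
ys = [model(rng.gauss(mu1, s1), rng.gauss(mu2, s2)) for _ in range(20000)]
m = sum(ys) / len(ys)
var_mc = sum((y - m) ** 2 for y in ys) / (len(ys) - 1)
print(f"first-order variance={var_dua:.4f}  Monte Carlo variance={var_mc:.4f}")
```

For this nearly linear model the two variance estimates agree to within a few percent, while the derivative route costs one derivative evaluation instead of 20 000 model runs.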

  19. Quantification of uncertainty in flood risk assessment for flood protection planning: a Bayesian approach

    Science.gov (United States)

    Dittes, Beatrice; Špačková, Olga; Ebrahimian, Negin; Kaiser, Maria; Rieger, Wolfgang; Disse, Markus; Straub, Daniel

    2017-04-01

    Flood risk estimates are subject to significant uncertainties, e.g. due to limited records of historic flood events, uncertainty in flood modeling, the uncertain impact of climate change, or uncertainty in the exposure and loss estimates. In the traditional design of flood protection systems, these uncertainties are typically accounted for only implicitly, based on engineering judgment. In the AdaptRisk project, we develop a fully quantitative framework for the planning of flood protection systems under current and future uncertainties using quantitative pre-posterior Bayesian decision analysis. In this contribution, we focus on the quantification of the uncertainties and study their relative influence on the flood risk estimate and on the planning of flood protection systems. The following uncertainty components are included using a Bayesian approach: 1) inherent and statistical (i.e. limited record length) uncertainty; 2) climate uncertainty that can be learned from an ensemble of GCM-RCM models; 3) estimates of climate uncertainty components not covered in 2), such as bias correction, the incomplete ensemble, and local specifics not captured by the GCM-RCM models; 4) uncertainty in the inundation modelling; 5) uncertainty in damage estimation. We also investigate how these uncertainties may be reduced in the future when new evidence - such as new climate models, observed extreme events, and socio-economic data - becomes available. Finally, we look into how this new evidence influences the risk assessment and the effectiveness of flood protection systems. We demonstrate our methodology for a pre-alpine catchment in southern Germany: the Mangfall catchment in Bavaria, which includes the city of Rosenheim, which suffered significant losses during the 2013 flood event.

  20. Bayesian-network-based safety risk analysis in construction projects

    International Nuclear Information System (INIS)

    Zhang, Limao; Wu, Xianguo; Skibniewski, Miroslaw J.; Zhong, Jingbing; Lu, Yujie

    2014-01-01

    This paper presents a systemic decision support approach for safety risk analysis under uncertainty in tunnel construction. Fuzzy Bayesian Networks (FBN) are used to investigate causal relationships between tunnel-induced damage and its influential variables based upon the risk/hazard mechanism analysis. Aiming to overcome limitations of current probability estimation, an expert confidence indicator is proposed to ensure the reliability of the surveyed data for fuzzy probability assessment of basic risk factors. A detailed fuzzy-based inference procedure is developed, which has the capacity to implement deductive reasoning, sensitivity analysis and abductive reasoning. The “3σ criterion” is adopted to calculate the characteristic values of a triangular fuzzy number in the probability fuzzification process, and the α-weighted valuation method is adopted for defuzzification. The construction safety analysis process is extended to the entire life cycle of risk-prone events, including pre-accident, continuous during-construction and post-accident control. A typical hazard concerning tunnel leakage in the construction of the Wuhan Yangtze Metro Tunnel in China is presented as a case study, in order to verify the applicability of the proposed approach. The results demonstrate the feasibility of the proposed approach and its application potential. A comparison of the advantages and disadvantages of FBN and fuzzy fault tree analysis (FFTA) as risk analysis tools is also conducted. The proposed approach can be used to provide guidelines for safety analysis and management in construction projects, and thus increase the likelihood of a successful project in a complex environment. - Highlights: • A systemic Bayesian network based approach for safety risk analysis is developed. • An expert confidence indicator for probability fuzzification is proposed. • The safety risk analysis process is extended to the entire life cycle of risk-prone events. • A typical
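
The probability-fuzzification step can be sketched as follows. The mapping below (mean plus/minus three standard deviations to the support of the triangle) follows the "3σ criterion" named in the abstract; the simple centroid defuzzification is a stand-in for the paper's α-weighted valuation method, whose exact weighting is not reproduced here.

```python
def triangular_from_3sigma(mean, std):
    """Characteristic values (a, m, b) of a triangular fuzzy number
    whose support spans mean +/- 3*std (the '3-sigma criterion')."""
    return (mean - 3.0 * std, mean, mean + 3.0 * std)

def centroid_defuzzify(tfn):
    """Centroid of a triangular fuzzy number (a, m, b)."""
    a, m, b = tfn
    return (a + m + b) / 3.0

# Hypothetical expert-surveyed failure probability: mean 0.12, std 0.02.
tfn = triangular_from_3sigma(0.12, 0.02)
crisp = centroid_defuzzify(tfn)
print(tfn, crisp)
```

For a symmetric triangle the centroid returns the modal value, here 0.12; an asymmetric (expert-skewed) triangle or a weighted defuzzification would shift the crisp probability accordingly.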

  1. Status of XSUSA for sampling based nuclear data uncertainty and sensitivity analysis

    International Nuclear Information System (INIS)

    Zwermann, W.; Gallner, L.; Klein, M.; Krzydacz-Hausmann; Pasichnyk, I.; Pautz, A.; Velkov, K.

    2013-01-01

    In the present contribution, an overview of the sampling-based XSUSA method for sensitivity and uncertainty analysis with respect to nuclear data is given. The focus is on recent developments and applications of XSUSA. These applications include calculations for critical assemblies, fuel assembly depletion calculations, and steady-state as well as transient reactor core calculations. The analyses are partially performed in the framework of international benchmark working groups (UACSA - Uncertainty Analyses for Criticality Safety Assessment, UAM - Uncertainty Analysis in Modelling). It is demonstrated that, particularly for full-scale reactor calculations, the influence of the nuclear data uncertainties on the results can be substantial. For instance, for the radial fission rate distributions of mixed UO2/MOX light water reactor cores, the 2σ uncertainties in the core centre and periphery can reach values exceeding 10%. For a fast transient, the resulting time behaviour of the reactor power was covered by a wide uncertainty band. Overall, the results confirm the necessity of adding systematic uncertainty analyses to best-estimate reactor calculations. (authors)

  2. Climate change, uncertainty and investment in flood risk reduction

    OpenAIRE

    Pol, van der, T.D.

    2015-01-01

    Economic analysis of flood risk management strategies has become more complex due to climate change. This thesis investigates the impact of climate change on investment in flood risk reduction, and applies optimisation methods to support identification of optimal flood risk management strategies. Chapter 2 provides an overview of cost-benefit analysis (CBA) of flood risk management strategies under climate change uncertainty and new information. CBA is applied to determine optimal dike height...

  3. Uncertainties in fatal cancer risk estimates used in radiation protection

    International Nuclear Information System (INIS)

    Kai, Michiaki

    1999-01-01

    Although ICRP and NCRP had not previously described in detail the uncertainties in the cancer risk estimates used in radiation protection, NCRP in 1997 first reported the results of an uncertainty analysis (NCRP Report No. 126), which is summarized in this paper. The NCRP report pointed out that there are five factors carrying uncertainty: uncertainty in the epidemiological studies, in dose assessment, in transferring the estimates to risk assessment, in risk prediction, and in extrapolation to low doses and dose rates. These individual factors were analyzed statistically to obtain the relationship between the probability of cancer death in the US population and the lifetime risk coefficient (% per Sv), which showed that, for the latter, the mean value was 3.99 x 10^-2/Sv; the median, 3.38 x 10^-2/Sv; the geometric standard deviation (GSD), 1.83; and the 95% confidence limits, 1.2-8.84 x 10^-2/Sv. The mean value is smaller than that of the ICRP recommendation (5 x 10^-2/Sv), indicating that the value carries an uncertainty factor of 2.5-3. Moreover, the most important contributor was shown to be the uncertainty in the DDREF (dose and dose-rate effectiveness factor). (K.H.)
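
The reported statistics behave like those of a lognormal risk distribution, and their internal consistency can be checked directly (treating the reported GSD as the dimensionless value 1.83): for a lognormal, mean = median × exp(ln(GSD)²/2), and the central 95% interval is roughly median divided and multiplied by GSD^1.96.

```python
import math

median = 3.38e-2   # lifetime risk coefficient, per Sv (reported median)
gsd = 1.83         # geometric standard deviation (dimensionless)

sigma = math.log(gsd)
mean = median * math.exp(sigma ** 2 / 2)   # lognormal mean from median and GSD
lo = median / gsd ** 1.96                  # ~2.5th percentile
hi = median * gsd ** 1.96                  # ~97.5th percentile
print(f"mean={mean:.3e}  95% interval=({lo:.2e}, {hi:.2e})")
```

This reproduces the reported mean (about 4.1e-2 versus 3.99e-2) and lower limit closely; the implied upper limit is somewhat higher than the reported 8.84e-2, suggesting the fitted distribution was not exactly lognormal in the upper tail.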

  4. Cassini Spacecraft Uncertainty Analysis Data and Methodology Review and Update/Volume 1: Updated Parameter Uncertainty Models for the Consequence Analysis

    Energy Technology Data Exchange (ETDEWEB)

    WHEELER, TIMOTHY A.; WYSS, GREGORY D.; HARPER, FREDERICK T.

    2000-11-01

    Uncertainty distributions for specific parameters of the Cassini General Purpose Heat Source Radioisotope Thermoelectric Generator (GPHS-RTG) Final Safety Analysis Report consequence risk analysis were revised and updated. The revisions and updates were done for all consequence parameters for which relevant information exists from the joint project on Probabilistic Accident Consequence Uncertainty Analysis by the United States Nuclear Regulatory Commission and the Commission of European Communities.

  5. The Stock Market: Risk vs. Uncertainty.

    Science.gov (United States)

    Griffitts, Dawn

    2002-01-01

    This economics education publication focuses on the U.S. stock market and the risk and uncertainty that an individual faces when investing in the market. The material explains that risk and uncertainty relate to the same underlying concept: randomness. It defines and discusses both concepts and notes that although risk is quantifiable, uncertainty…

  6. [Status Quo, Uncertainties and Trends Analysis of Environmental Risk Assessment for PFASs].

    Science.gov (United States)

    Hao, Xue-wen; Li, Li; Wang, Jie; Cao, Yan; Liu, Jian-guo

    2015-08-01

    This study systematically surveys the definitions and evolving terminology, categories, and applications of perfluoroalkyl and polyfluoroalkyl substances (PFASs) in the international academic literature, focusing on the environmental risk and exposure assessment of PFASs, in order to comprehensively analyze the current status, uncertainties, and trends of PFAS environmental risk assessment. Overall, the risk assessment of PFASs faces a complicated situation involving complex substance pedigrees, numerous types, complex derivative relations, confidential business information, and risk uncertainties. Although the environmental risk of long-chain PFASs has been widely recognized, short-chain PFASs and the short-chain fluorotelomers used as their alternatives still present many research gaps and uncertainties with respect to environmental hazards, environmental fate, and exposure risk. The appropriate scope of PFAS risk control in the international community is still open to discussion. Owing to trade secrets and market competition, the chemical structures and risk information of PFAS alternatives generally lack openness and transparency, and the environmental risk of most fluorinated and non-fluorinated alternatives is not clear. In general, international research on PFAS risk assessment is gradually shifting from the long-chain perfluoroalkyl acids (PFAAs) represented by perfluorooctane sulfonic acid (PFOS) and perfluorooctanoic acid (PFOA) to short-chain PFAAs, and then extending to other PFASs. The main problems to be solved urgently and researched continuously are: the environmental hazard assessment indexes, such as bioaccumulation and environmental migration, and their optimization; the environmental release and multimedia environmental fate of short-chain PFASs; the environmental fate of neutral PFASs and their transformation into, and contribution as precursors of, short-chain PFAAs; and the risk identification and assessment of fluorinated and non-fluorinated PFAS alternatives.

  7. A new measure of uncertainty importance based on distributional sensitivity analysis for PSA

    International Nuclear Information System (INIS)

    Han, Seok Jung; Tak, Nam Il; Chun, Moon Hyun

    1996-01-01

    The main objective of the present study is to propose a new measure of uncertainty importance based on distributional sensitivity analysis. The new measure utilizes a metric distance obtained from cumulative distribution functions (cdfs). The measure is evaluated for two cases: one in which the cdf is given by a known analytical distribution, and one in which it is given by an empirical distribution generated by a crude Monte Carlo simulation. To study its applicability, the present measure has been applied to two different cases, and the results are compared with those of three existing methods. The present approach provides a useful measure of uncertainty importance based on cdfs; it is simple and allows uncertainty importance to be calculated without any complex procedure. On the basis of the results obtained in the present work, the present method is recommended as a tool for the analysis of uncertainty importance.

  8. An enhanced unified uncertainty analysis approach based on first order reliability method with single-level optimization

    International Nuclear Information System (INIS)

    Yao, Wen; Chen, Xiaoqian; Huang, Yiyong; Tooren, Michel van

    2013-01-01

    In engineering, there exist both aleatory uncertainties, due to the inherent variation of the physical system and its operational environment, and epistemic uncertainties, due to lack of knowledge, which can be reduced with the collection of more data. To analyze the uncertain distribution of the system performance under both aleatory and epistemic uncertainties, combined probability and evidence theory can be employed to quantify the compound effects of the mixed uncertainties. The existing First Order Reliability Method (FORM) based Unified Uncertainty Analysis (UUA) approach nests the optimization-based interval analysis in the improved Hasofer-Lind-Rackwitz-Fiessler (iHLRF) algorithm based Most Probable Point (MPP) searching procedure, which is computationally prohibitive for complex systems and may encounter convergence problems as well. Therefore, in this paper it is proposed to use general optimization solvers to search for the MPP in the outer loop and then reformulate the double-loop optimization problem into an equivalent single-level optimization (SLO) problem, so as to simplify the uncertainty analysis process, improve the robustness of the algorithm, and alleviate the computational complexity. The effectiveness and efficiency of the proposed method are demonstrated with two numerical examples and one practical satellite conceptual design problem. -- Highlights: ► Uncertainty analysis under mixed aleatory and epistemic uncertainties is studied. ► A unified uncertainty analysis method is proposed with combined probability and evidence theory. ► The traditional nested analysis method is converted to single-level optimization for efficiency. ► The effectiveness and efficiency of the proposed method are demonstrated with three examples

  9. Risk-based flood protection planning under climate change and modeling uncertainty: a pre-alpine case study

    Directory of Open Access Journals (Sweden)

    B. Dittes

    2018-05-01

    Full Text Available Planning authorities are faced with a range of questions when planning flood protection measures: is the existing protection adequate for current and future demands or should it be extended? How will flood patterns change in the future? How should the uncertainty pertaining to this influence the planning decision, e.g., for delaying planning or including a safety margin? Is it sufficient to follow a protection criterion (e.g., to protect from the 100-year flood) or should the planning be conducted in a risk-based way? How important is it for flood protection planning to accurately estimate flood frequency (changes), costs and damage? These are questions that we address for a medium-sized pre-alpine catchment in southern Germany, using a sequential Bayesian decision making framework that quantitatively addresses the full spectrum of uncertainty. We evaluate different flood protection systems considered by local agencies in a test study catchment. Despite large uncertainties in damage, cost and climate, the recommendation is robust for the most conservative approach. This demonstrates the feasibility of making robust decisions under large uncertainty. Furthermore, by comparison to a previous study, it highlights the benefits of risk-based planning over the planning of flood protection to a prescribed return period.

  10. Risk-based flood protection planning under climate change and modeling uncertainty: a pre-alpine case study

    Science.gov (United States)

    Dittes, Beatrice; Kaiser, Maria; Špačková, Olga; Rieger, Wolfgang; Disse, Markus; Straub, Daniel

    2018-05-01

    Planning authorities are faced with a range of questions when planning flood protection measures: is the existing protection adequate for current and future demands or should it be extended? How will flood patterns change in the future? How should the uncertainty pertaining to this influence the planning decision, e.g., for delaying planning or including a safety margin? Is it sufficient to follow a protection criterion (e.g., to protect from the 100-year flood) or should the planning be conducted in a risk-based way? How important is it for flood protection planning to accurately estimate flood frequency (changes), costs and damage? These are questions that we address for a medium-sized pre-alpine catchment in southern Germany, using a sequential Bayesian decision making framework that quantitatively addresses the full spectrum of uncertainty. We evaluate different flood protection systems considered by local agencies in a test study catchment. Despite large uncertainties in damage, cost and climate, the recommendation is robust for the most conservative approach. This demonstrates the feasibility of making robust decisions under large uncertainty. Furthermore, by comparison to a previous study, it highlights the benefits of risk-based planning over the planning of flood protection to a prescribed return period.

  11. A multi-reservoir based water-hydroenergy management model for identifying the risk horizon of regional resources-energy policy under uncertainties

    International Nuclear Information System (INIS)

    Zeng, X.T.; Zhang, S.J.; Feng, J.; Huang, G.H.; Li, Y.P.; Zhang, P.; Chen, J.P.; Li, K.L.

    2017-01-01

    Highlights: • A multi-reservoir system can handle water/energy deficits, flood and sediment damage. • An MWH model is developed for planning a water allocation and energy generation issue. • A mixed fuzzy-stochastic risk analysis method (MFSR) can handle uncertainties in MWH. • A hybrid MWH model can plan human-resource-energy systems in a robust and effective manner. • Results can support adjusting water-energy policy to satisfy increasing demands. - Abstract: In this study, a multi-reservoir based water-hydroenergy management (MWH) model is developed for planning water allocation and hydroenergy generation (WAHG) under uncertainties. A mixed fuzzy-stochastic risk analysis method (MFSR) is introduced to handle objective and subjective uncertainties in the MWH model; it couples fuzzy credibility programming and risk management within a general two-stage context, with the aim of reflecting the infeasibility risks between expected targets and random second-stage recourse costs. The developed MWH model (embedded with the MFSR method) is applied to a practical study of a WAHG issue in the Jing River Basin (China), which encounters conflicts between human activity and a resource/energy crisis. A water-energy nexus (WEN) is constructed to reflect the integration of economic development and resource/energy conservation, while simultaneously confronting natural and man-made damages such as water deficits, electricity shortfalls, floodwater, and high sediment deposition. Meanwhile, the obtained results with various credibility levels and target-violated risk levels can support generating a robust plan associated with risk control for the identification of optimized water-allocation and hydroenergy-generation alternatives, as well as flood control. Moreover, the results can help policymakers discern optimal water/sediment release routes, reservoir storage variations (impacted by sediment deposition), electricity supply schedules and system benefits.

  12. A new uncertainty importance measure

    International Nuclear Information System (INIS)

    Borgonovo, E.

    2007-01-01

    Uncertainty in parameters is present in many risk assessment problems and leads to uncertainty in model predictions. In this work, we introduce a global sensitivity indicator which looks at the influence of input uncertainty on the entire output distribution without reference to a specific moment of the output (moment independence) and which can also be defined in the presence of correlations among the parameters. We discuss its mathematical properties and highlight the differences between the present indicator, variance-based uncertainty importance measures, and a moment-independent sensitivity indicator previously introduced in the literature. Numerical results are discussed with application to the probabilistic risk assessment model on which Iman [A matrix-based approach to uncertainty and sensitivity analysis for fault trees. Risk Anal 1987;7(1):22-33] first introduced uncertainty importance measures
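
A moment-independent indicator of this kind can be sketched with a crude histogram estimate: condition on slices of an input, and average the distance between the conditional and unconditional output distributions over those slices. The toy model and the histogram estimator below are illustrative assumptions, not the paper's exact definition or estimator.

```python
import random

def histogram(sample, edges):
    """Normalised histogram (probability mass per bin)."""
    counts = [0] * (len(edges) - 1)
    for v in sample:
        for i in range(len(edges) - 1):
            if edges[i] <= v < edges[i + 1]:
                counts[i] += 1
                break
    return [c / len(sample) for c in counts]

def delta(xs, ys, n_cond=10, n_bins=20):
    """Histogram estimate of a moment-independent (density-shift) indicator:
    half the average L1 distance between the unconditional output
    distribution and the distributions conditional on slices of the input."""
    lo, hi = min(ys), max(ys) + 1e-9
    edges = [lo + (hi - lo) * i / n_bins for i in range(n_bins + 1)]
    p_y = histogram(ys, edges)
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    m = len(order) // n_cond
    total = 0.0
    for k in range(n_cond):
        idx = order[k * m:(k + 1) * m]
        p_cond = histogram([ys[i] for i in idx], edges)
        total += 0.5 * sum(abs(a - b) for a, b in zip(p_y, p_cond))
    return total / n_cond

rng = random.Random(3)
n = 4000
x1 = [rng.uniform(-1, 1) for _ in range(n)]
x2 = [rng.uniform(-1, 1) for _ in range(n)]
ys = [a for a in x1]                 # output depends on x1 only
d_influential = delta(x1, ys)
d_dummy = delta(x2, ys)
print(d_influential, d_dummy)
```

Because conditioning on the influential input shifts the whole output distribution while conditioning on the dummy input leaves it unchanged, the indicator is large for the former and near zero (sampling noise) for the latter, without referring to any particular output moment.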

  13. Evaluating Sources of Risks in Large Engineering Projects: The Roles of Equivocality and Uncertainty

    Directory of Open Access Journals (Sweden)

    Leena Pekkinen

    2015-11-01

    Full Text Available Contemporary project risk management literature introduces uncertainty, i.e., the lack of information, as a fundamental basis of project risks. In this study the authors assert that equivocality, i.e., the existence of multiple and conflicting interpretations, can also serve as a basis of risks. Through an in-depth empirical investigation of a large, complex engineering project, the authors identified risk sources having their bases in situations where uncertainty or equivocality was the predominant attribute. Information processing theory proposes different managerial practices for risk management depending on whether the sources of risks lie in uncertainty or in equivocality.

  14. Qualitative uncertainty analysis in probabilistic safety assessment context

    International Nuclear Information System (INIS)

    Apostol, M.; Constantin, M.; Turcu, I.

    2007-01-01

    In the Probabilistic Safety Assessment (PSA) context, an uncertainty analysis is performed either to estimate the uncertainty in the final results (the risk to public health and safety) or to estimate the uncertainty in some intermediate quantities (the core damage frequency, the radionuclide release frequency or the fatality frequency). The identification and evaluation of uncertainty are important tasks because they lend credibility to the results and help in the decision-making process. Uncertainty analysis can be performed qualitatively or quantitatively. This paper performs a preliminary qualitative uncertainty analysis by identifying the major uncertainties in the PSA Level 1-Level 2 interface and in the other two major procedural steps of a Level 2 PSA, i.e. the analysis of accident progression and of the containment, and the analysis of the source term for severe accidents. One should mention that a Level 2 PSA for a Nuclear Power Plant (NPP) involves the evaluation and quantification of the mechanisms, amounts and probabilities of subsequent radioactive material releases from the containment. According to NUREG-1150, an important task in source term analysis is fission product transport analysis. The uncertainties related to the isotope distribution in the CANDU NPP primary circuit and the isotope masses transferred to the containment, obtained using the SOPHAEROS module of the ASTEC computer code, will also be presented. (authors)

  15. Uncertainties in smart grids behavior and modeling: What are the risks and vulnerabilities? How to analyze them?

    Energy Technology Data Exchange (ETDEWEB)

    Zio, Enrico, E-mail: enrico.zio@ecp.fr [Ecole Centrale Paris - Supelec, Paris (France); Politecnico di Milano, Milano (Italy); Aven, Terje, E-mail: terje.aven@uis.no [University of Stavanger, Stavanger (Norway)

    2011-10-15

    This paper looks into the future world of smart grids from a rather different perspective than usual: that of uncertainties and the related risks and vulnerabilities. An analysis of the foreseen constituents of smart grids and of the technological, operational, economic and policy-driven challenges is carried out to identify and characterize the related uncertainties and associated risks and vulnerabilities. The focus is on the challenges posed to the representation and treatment of uncertainties in the performance assessment of such systems, given their complexity and high level of integration of novel technologies. A general framework of analysis is proposed. - Highlights: > We have looked into the development and operation of smart grids for power distribution. > We have identified a number of uncertainties, which are expected to play an influential role. > We have provided some guidelines to address these issues, based on probability intervals.

  16. A new approach to reduce uncertainties in space radiation cancer risk predictions.

    Directory of Open Access Journals (Sweden)

    Francis A Cucinotta

    Full Text Available The prediction of space radiation induced cancer risk carries large uncertainties, with two of the largest being radiation quality and dose-rate effects. In risk models the ratio of the quality factor (QF) to the dose and dose-rate reduction effectiveness factor (DDREF) parameter is used to scale organ doses for cosmic ray protons and high charge and energy (HZE) particles to a hazard rate for γ-rays derived from human epidemiology data. In previous work, particle track structure concepts were used to formulate a space radiation QF function that depends on particle charge number Z and kinetic energy per atomic mass unit, E. QF uncertainties were represented by subjective probability distribution functions (PDFs) for the three QF parameters that describe its maximum value and shape parameters for the Z and E dependences. Here I report on an analysis of the maximum QF parameter and its uncertainty using mouse tumor induction data. Because experimental data for risks at low doses of γ-rays are highly uncertain, which impacts estimates of maximum values of relative biological effectiveness (RBEmax), I developed an alternate QF model, denoted QFγAcute, where QFs are defined relative to higher acute γ-ray doses (0.5 to 3 Gy). The alternate model reduces the dependence of risk projections on the DDREF; however, a DDREF is still needed for risk estimates for high-energy protons and other primary or secondary sparsely ionizing space radiation components. Risk projections (upper confidence levels (CL)) for space missions show a reduction of about 40% (CL ∼50%) using the QFγAcute model compared to the QFs based on RBEmax, and about 25% (CL ∼35%) compared to previous estimates. In addition, I discuss how a possible qualitative difference leading to increased tumor lethality for HZE particles compared to low-LET radiation and background tumors remains a large uncertainty in risk estimates.

  17. Mean-variance model for portfolio optimization with background risk based on uncertainty theory

    Science.gov (United States)

    Zhai, Jia; Bai, Manying

    2018-04-01

    The aim of this paper is to develop a mean-variance model for portfolio optimization that considers background risk, liquidity and transaction costs, based on uncertainty theory. In the portfolio selection problem, the returns of securities and asset liquidity are treated as uncertain variables because of unforeseen incidents or a lack of historical data, which are common in economic and social environments. We provide crisp forms of the model and a hybrid intelligent algorithm to solve it. Within a mean-variance framework, we analyze the portfolio frontier characteristics under independent additive background risk. In addition, we discuss some effects of background risk and liquidity constraints on portfolio selection. Finally, we demonstrate the proposed models by numerical simulations.
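    The effect of an independent additive background risk on a mean-variance frontier can be sketched in a few lines. This is a toy two-asset probabilistic illustration with made-up numbers, not the paper's formulation (which uses uncertain variables from uncertainty theory, liquidity and transaction costs):

```python
def portfolio_stats(w, mu, var, cov, bg_var=0.0):
    """Return (expected return, variance) of a two-asset portfolio with
    weight w on asset 0; an independent background risk adds bg_var."""
    r = w * mu[0] + (1 - w) * mu[1]
    v = (w ** 2) * var[0] + ((1 - w) ** 2) * var[1] + 2 * w * (1 - w) * cov + bg_var
    return r, v

mu, var, cov = (0.08, 0.03), (0.04, 0.01), 0.002   # illustrative parameters
weights = [w / 20 for w in range(21)]
frontier = [portfolio_stats(w, mu, var, cov) for w in weights]
shifted = [portfolio_stats(w, mu, var, cov, bg_var=0.005) for w in weights]
# independent background risk leaves expected returns unchanged but moves
# every frontier point to a higher variance by exactly bg_var
```

The comparison makes the qualitative point of the paper's frontier analysis visible: an independent additive background risk translates the frontier in the variance direction without changing the return axis.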

  18. Sampling based uncertainty analysis of 10% hot leg break LOCA in large scale test facility

    International Nuclear Information System (INIS)

    Sengupta, Samiran; Kraina, V.; Dubey, S. K.; Rao, R. S.; Gupta, S. K.

    2010-01-01

    A sampling based uncertainty analysis was carried out to quantify the uncertainty in the predictions of the best estimate code RELAP5/MOD3.2 for a thermal hydraulic test (10% hot leg break LOCA) performed in the Large Scale Test Facility (LSTF) as part of an IAEA coordinated research project. The nodalisation of the test facility was qualified at both the steady state and transient level by systematically applying the procedures of the uncertainty methodology based on accuracy extrapolation (UMAE); the uncertainty analysis was carried out using the Latin hypercube sampling (LHS) method to evaluate the uncertainty for ten input parameters. Sixteen output parameters were selected for uncertainty evaluation, and the uncertainty band between the 5th and 95th percentiles of each output parameter was evaluated. It was observed that the uncertainty band for the primary pressure during two-phase blowdown is larger than that of the remaining period. Similarly, a larger uncertainty band is observed for the accumulator injection flow during the reflood phase. An importance analysis was also carried out, and standardized rank regression coefficients were computed to quantify the effect of each individual input parameter on the output parameters. It was observed that the break discharge coefficient is the most important uncertain parameter for the prediction of all the primary side parameters, and that the steam generator (SG) relief pressure setting is the most important parameter in predicting the SG secondary pressure
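    The Latin hypercube step can be sketched in plain Python: each parameter's unit range is split into equal-probability strata, one point is drawn per stratum, and the strata are shuffled independently per parameter. The toy response function and sample size below are invented stand-ins for the thermal-hydraulic code and the study's actual input parameters:

```python
import random

def latin_hypercube(n_samples, n_params, seed=42):
    """n_samples points in [0,1]^n_params, one per equal-probability stratum
    in every coordinate (the stratification plain Monte Carlo lacks)."""
    rng = random.Random(seed)
    columns = []
    for _ in range(n_params):
        col = [(i + rng.random()) / n_samples for i in range(n_samples)]
        rng.shuffle(col)              # decouple the strata across parameters
        columns.append(col)
    return list(zip(*columns))        # one row per sample

def model(x):                         # stand-in for the system code run
    return 10.0 * x[0] + 2.0 * x[1] ** 2

outputs = sorted(model(x) for x in latin_hypercube(200, 2))
# 5th-95th percentile uncertainty band of the output, as in the study
band = (outputs[int(0.05 * 200)], outputs[int(0.95 * 200)])
```

In practice each unit-interval coordinate would be mapped through the inverse CDF of the corresponding input parameter's distribution before running the code.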

  19. Water shortage risk assessment considering large-scale regional transfers: a copula-based uncertainty case study in Lunan, China.

    Science.gov (United States)

    Gao, Xueping; Liu, Yinzhu; Sun, Bowen

    2018-06-05

    The risk of water shortage caused by uncertainties, such as frequent drought, varied precipitation, multiple water resources, and different water demands, brings new challenges to water transfer projects. Uncertainties exist for both transferred water and local surface water; therefore, the relationship between them should be thoroughly studied to prevent water shortage. For more effective water management, an uncertainty-based water shortage risk assessment model (UWSRAM) is developed to study the combined effect of multiple water resources and analyze the degree of shortage under uncertainty. The UWSRAM combines copula-based Monte Carlo stochastic simulation with a chance-constrained programming-stochastic multiobjective optimization model, using the Lunan water-receiving area in China as an example. Statistical copula functions are employed to estimate the joint probability of available transferred water and local surface water and to sample from the multivariate probability distribution; these samples are used as inputs for the optimization model. The approach reveals the distribution of water shortage, emphasizes the importance of improving and updating the management of transferred water and local surface water, and examines their combined influence on water shortage risk assessment. The possible available water and shortages can be calculated by applying the UWSRAM, together with the corresponding allocation measures under different water availability levels and violation probabilities. The UWSRAM is valuable for understanding the overall multi-source water availability and degree of water shortage, adapting to the uncertainty surrounding water resources, establishing effective water resource planning policies for managers and achieving sustainable development.
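    The copula sampling step can be sketched as follows. This is a minimal Gaussian-copula illustration: the correlation, the exponential marginals and the demand figure are invented, whereas the paper fits copulas to observed transferred/local water data:

```python
import math
import random

def gaussian_copula_pairs(n, rho, seed=0):
    """Sample (u, v) on [0,1]^2 whose dependence follows a Gaussian copula."""
    rng = random.Random(seed)
    ncdf = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    out = []
    for _ in range(n):
        z1 = rng.gauss(0.0, 1.0)
        z2 = rho * z1 + math.sqrt(1.0 - rho ** 2) * rng.gauss(0.0, 1.0)
        out.append((ncdf(z1), ncdf(z2)))    # correlated uniforms
    return out

def inv_exp(u, mean):                       # invented exponential marginal
    return -mean * math.log(1.0 - u)

# joint availability of transferred and local surface water (notional units)
pairs = gaussian_copula_pairs(5000, rho=0.6)
supply = [inv_exp(u, 60.0) + inv_exp(v, 70.0) for u, v in pairs]
demand = 120.0
shortage_risk = sum(1 for s in supply if s < demand) / len(supply)
```

The dependence matters: with positively correlated sources, dry years tend to coincide, so the shortage risk is higher than an independence assumption would suggest.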

  20. Methodologies for uncertainty analysis in the level 2 PSA and their implementation procedures

    International Nuclear Information System (INIS)

    Ahn, Kwang Il; Yang, Joon Eun; Kim, Dong Ha

    2002-04-01

    The main purpose of this report is to present standardized methodologies for uncertainty analysis in the Level 2 Probabilistic Safety Assessment (PSA) and their implementation procedures, based on the results of a critical review of the existing methodologies for the analysis of uncertainties employed in the Level 2 PSA, especially the Accident Progression Event Tree (APET). The uncertainties employed in the Level 2 PSA are quantitative expressions of the overall knowledge of the analysts and experts participating in the probabilistic quantification of phenomenological accident progressions ranging from core melt to containment failure; their numerical values are directly related to the degree of confidence the analyst has that a given phenomenological event or accident process will or will not occur, i.e. the analyst's subjective probabilities of occurrence. The results obtained from the Level 2 PSA uncertainty analysis become an essential contributor to the plant risk, in addition to the Level 1 PSA and Level 3 PSA uncertainties. The uncertainty analysis methodologies and implementation procedures presented in this report were prepared based on the following criterion: 'the uncertainty quantification process must be logical, scrutable, complete, consistent and in an appropriate level of detail, as mandated by the Level 2 PSA objectives'. For the aforementioned purpose, this report deals mainly with (1) a summary of general and Level 2 PSA specific uncertainty analysis methodologies, (2) the selection of phenomenological branch events for uncertainty analysis in the APET, the methodology for quantification of APET uncertainty inputs and its implementation procedure, (3) the statistical propagation of uncertainty inputs through the APET and its implementation procedure, and (4) a formal procedure for quantification of APET uncertainties and source term categories (STCs) through the Level 2 PSA quantification codes

  1. Uncertainty in exposure of underground miners to radon daughters and the effect of uncertainty on risk estimates

    International Nuclear Information System (INIS)

    1989-10-01

    Studies of underground miners provide the principal basis for assessing the risk from radon daughter exposure. An important problem in all epidemiological studies of underground miners is the reliability of the estimates of the miners' exposures. This study examines the various sources of uncertainty in exposure estimation for the principal epidemiologic studies reported in the literature including the temporal and spatial variability of radon sources and, with the passage of time, changes to both mining methods and ventilation conditions. Uncertainties about work histories and the role of other hard rock mining experience are also discussed. The report also describes two statistical approaches, both based on Bayesian methods, by which the effects on the estimated risk coefficient of uncertainty in exposure (WLM) can be examined. One approach requires only an estimate of the cumulative WLM exposure of a group of miners, an estimate of the number of (excess) lung cancers potentially attributable to that exposure, and a specification of the uncertainty about the cumulative exposure of the group. The second approach is based on a linear regression model which incorporates errors (uncertainty) in the independent variable (WLM) and allows the dependent variable (cases) to be Poisson distributed. The method permits the calculation of marginal probability distributions for either slope (risk coefficient) or intercept. The regression model approach is applied to several published data sets from epidemiological studies of miners. Specific results are provided for each data set and apparent differences in risk coefficients are discussed. The studies of U.S. uranium miners, Ontario uranium miners and Czechoslovakian uranium miners are argued to provide the best basis for risk estimation at this time. In general terms, none of the analyses performed are inconsistent with a linear exposure-effect relation. Based on analyses of the overall miner groups, the most likely ranges

  2. One Approach to the Fire PSA Uncertainty Analysis

    International Nuclear Information System (INIS)

    Simic, Z.; Mikulicic, V.; Vukovic, I.

    2002-01-01

    Practical experience and findings from a number of fire probabilistic safety assessment (PSA) studies show that fire has a high relative importance for nuclear power plant safety. Fire PSA is a very challenging phenomenon and a number of issues are still in the area of research and development. This has a major impact on the conservatism of fire PSA findings. One way to reduce the level of conservatism is to conduct an uncertainty analysis. At the top level, the uncertainty of a fire PSA can be separated into three segments. The first segment is related to fire initiating event frequencies. The second segment concerns the uncertainty of fire damage. Finally, there is uncertainty related to the PSA model, which propagates this fire-initiated damage to core damage or other analyzed risks. This paper discusses all three segments of uncertainty. Some recent experience with fire PSA uncertainty analysis, the usage of the fire analysis code COMPBRN IIIe, and the importance of the uncertainty evaluation to the final result are presented.(author)

  3. Risk and Uncertainties, Analysis and Evaluation: Lessons for Adaptation and Integration

    International Nuclear Information System (INIS)

    Yohe, G.; Dowlatabadi, H.

    1999-01-01

    This paper draws ten lessons from analyses of adaptation to climate change under conditions of risk and uncertainty: (1) Socio-economic systems will likely respond most to extreme realizations of climate change. (2) Systems have been responding to variations in climate for centuries. (3) Future change will affect future citizens and their institutions. (4) Human systems can be the sources of surprise. (5) Perceptions of risk depend upon welfare valuations that depend upon expectations. (6) Adaptive decisions will be made in response to climate change and climate change policy. (7) Analysis of adaptive decisions should recognize the second-best context of those decisions. (8) Climate change offers opportunity as well as risk. (9) All plausible futures should be explored. (10) Multiple methodological approaches should be accommodated. These lessons support two pieces of advice for the Third Assessment Report: (1) Work toward consensus, but not at the expense of thorough examination and reporting of the 'tails' of the distributions of the future. (2) Integrated assessment is only one unifying methodology; others that can better accommodate those tails should be encouraged and embraced. 12 refs

  4. Risk and Uncertainty in Production Economics | Otaha | African ...

    African Journals Online (AJOL)

    One of the most celebrated and feared concepts in the world today is risk, which is the product of uncertainty. Risk and uncertainty are often used interchangeably by many economists as if they were the same thing, but this is not true. While risk can be measured, estimated, and even insured against, uncertainty cannot.

  5. Risk Assessment Uncertainties in Cybersecurity Investments

    Directory of Open Access Journals (Sweden)

    Andrew Fielder

    2018-06-01

    Full Text Available When undertaking cybersecurity risk assessments, it is important to be able to assign numeric values to metrics to compute the final expected loss that represents the risk an organization is exposed to due to cyber threats. Even if risk assessment is motivated by real-world observations and data, there is always a high chance of assigning inaccurate values due to the different uncertainties involved (e.g., the evolving threat landscape, human errors and the natural difficulty of quantifying risk). Existing models empower organizations to compute optimal cybersecurity strategies given their financial constraints, i.e., the available cybersecurity budget. Further, a general game-theoretic model with uncertain payoffs (probability-distribution-valued payoffs) shows that such uncertainty can be incorporated in the game-theoretic model by allowing payoffs to be random. This paper extends previous work in the field to tackle uncertainties in risk assessment that affect cybersecurity investments. The findings from simulated examples indicate that although uncertainties in cybersecurity risk assessment lead, on average, to different cybersecurity strategies, they do not play a significant role in the final expected loss of the organization when utilising a game-theoretic model and methodology to derive these strategies. The model determines robust defending strategies even when knowledge regarding risk assessment values is not accurate. As a result, it is possible to show that the cybersecurity investment tool is capable of providing effective decision support.
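    The idea that sampled (rather than fixed) risk-assessment values can still yield a robust defence choice can be illustrated with a toy decision rule. The strategies, residual-risk factors and loss ranges below are entirely invented, and this simple scenario-averaging rule is far cruder than the paper's game-theoretic model:

```python
import random

def robust_strategy(loss_fn, strategies, scenarios):
    """Pick the strategy with the lowest average loss across sampled scenarios."""
    avg = {s: sum(loss_fn(s, sc) for sc in scenarios) / len(scenarios)
           for s in strategies}
    return min(avg, key=avg.get), avg

# hypothetical residual-risk factor per threat (malware, phishing) per strategy
residual = {"patch": (0.2, 0.9), "firewall": (0.6, 0.3), "mixed": (0.45, 0.45)}

def loss(strategy, scenario):
    malware, phishing = scenario
    rm, rp = residual[strategy]
    return rm * malware + rp * phishing

# the per-threat impact values are uncertain, so sample them instead of fixing them
rng = random.Random(5)
scenarios = [(rng.uniform(50, 150), rng.uniform(10, 40)) for _ in range(1000)]
best, avg_losses = robust_strategy(loss, residual, scenarios)
```

Because the choice is evaluated against the whole sampled range of assessment values rather than one point estimate, moderate inaccuracy in any single value does not change which strategy is selected.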

  6. Dealing with unquantifiable uncertainties in landslide modelling for urban risk reduction in developing countries

    Science.gov (United States)

    Almeida, Susana; Holcombe, Liz; Pianosi, Francesca; Wagener, Thorsten

    2016-04-01

    Landslides have many negative economic and societal impacts, including the potential for significant loss of life and damage to infrastructure. Slope stability assessment can be used to guide decisions about the management of landslide risk, but its usefulness can be challenged by high levels of uncertainty in predicting landslide occurrence. Prediction uncertainty may be associated with the choice of model that is used to assess slope stability, the quality of the available input data, or a lack of knowledge of how future climatic and socio-economic changes may affect future landslide risk. While some of these uncertainties can be characterised by relatively well-defined probability distributions, for other uncertainties, such as those linked to climate change, no probability distribution is available to characterise them. This latter type of uncertainty, often referred to as deep uncertainty, means that robust policies need to be developed that are expected to perform acceptably well over a wide range of future conditions. In our study the impact of deep uncertainty on slope stability predictions is assessed in a quantitative and structured manner using Global Sensitivity Analysis (GSA) and the Combined Hydrology and Stability Model (CHASM). In particular, we use several GSA methods including the Method of Morris, Regional Sensitivity Analysis and Classification and Regression Trees (CART), as well as advanced visualization tools, to assess the combination of conditions that may lead to slope failure. Our example application is a slope in the Caribbean, an area that is naturally susceptible to landslides due to a combination of high rainfall rates during the hurricane season, steep slopes, and highly weathered residual soils. Rapid unplanned urbanisation and changing climate may further exacerbate landslide risk in the future. 
Our example shows how we can gain useful information in the presence of deep uncertainty by combining physically based models with GSA in
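    A minimal one-at-a-time screening in the spirit of the Method of Morris mentioned above can be sketched as follows. The surrogate slope-stability function, parameter bounds and step size are invented, and real Morris designs use randomized trajectories rather than this simplified radial variant:

```python
import random

def morris_effects(model, bounds, n_traj=20, seed=7):
    """Mean absolute elementary effect per parameter, from one-at-a-time
    steps of size delta in the normalized [0, 1] parameter space."""
    rng = random.Random(seed)
    k = len(bounds)
    delta = 0.25
    scale = lambda x: [lo + xi * (hi - lo) for xi, (lo, hi) in zip(x, bounds)]
    effects = [[] for _ in range(k)]
    for _ in range(n_traj):
        x = [rng.uniform(0.0, 1.0 - delta) for _ in range(k)]
        base = model(scale(x))
        for i in range(k):
            xp = list(x)
            xp[i] += delta                     # perturb one parameter at a time
            effects[i].append(abs(model(scale(xp)) - base) / delta)
    return [sum(e) / len(e) for e in effects]

# hypothetical slope-stability surrogate: cohesion matters, the last input does not
def factor_of_safety(p):
    cohesion, friction_angle, noise = p
    return 0.08 * cohesion + 0.02 * friction_angle + 0.0001 * noise

mu_star = morris_effects(factor_of_safety, bounds=[(5, 40), (20, 35), (0, 1)])
# mu_star ranks cohesion far above the inert third input
```

Screening of this kind is typically a first pass: inputs with negligible mean elementary effect are fixed, and the remaining ones go into the more expensive regional sensitivity or CART analyses.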

  7. A two-stage method of quantitative flood risk analysis for reservoir real-time operation using ensemble-based hydrologic forecasts

    Science.gov (United States)

    Liu, P.

    2013-12-01

    Quantitative risk analysis for reservoir real-time operation is a hard task owing to the difficulty of accurately describing inflow uncertainties. Ensemble-based hydrologic forecasts directly depict the inflows, capturing not only their marginal distributions but also their persistence via scenarios. This motivates us to analyze the reservoir real-time operating risk with ensemble-based hydrologic forecasts as inputs. A method is developed that uses the forecast horizon point to divide the future time into two stages, the forecast lead-time and the unpredicted time. The risk within the forecast lead-time is computed by counting the number of failed forecast scenarios, and the risk in the unpredicted time is estimated using reservoir routing with the design floods and the reservoir water levels at the forecast horizon point. As a result, a two-stage risk analysis method is set up that quantifies the entire flood risk as the ratio of the number of scenarios that exceed the critical value to the total number of scenarios. China's Three Gorges Reservoir (TGR) is selected as a case study, where parameter and precipitation uncertainties are propagated to produce ensemble-based hydrologic forecasts. Bayesian inference, via Markov chain Monte Carlo, is used to account for the parameter uncertainty. Two reservoir operation schemes, the real operated and the scenario-optimized, are evaluated in terms of flood risks and hydropower profits. For the 2010 flood, it is found that improving the hydrologic forecast accuracy does not necessarily decrease the reservoir real-time operation risk, and that most of the risk arises within the forecast lead-time. It is therefore valuable to decrease the variance of ensemble-based hydrologic forecasts, while keeping their bias low, for reservoir operational purposes.
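    The scenario-counting definition of risk (the ratio of scenarios exceeding the critical value to the total number of scenarios) reduces to a few lines; the water-level trajectories and critical level below are made up for illustration:

```python
def scenario_risk(scenarios, critical_level):
    """Fraction of forecast scenarios whose peak level exceeds the critical value."""
    failures = sum(1 for traj in scenarios if max(traj) > critical_level)
    return failures / len(scenarios)

# hypothetical ensemble of forecast reservoir water-level trajectories (m)
ensemble = [
    [158.2, 160.1, 161.5],
    [158.0, 159.4, 160.2],
    [158.5, 162.3, 163.0],
    [157.9, 158.8, 159.1],
]
risk = scenario_risk(ensemble, critical_level=161.0)  # → 0.5
```

In the full method this count covers only the forecast lead-time; beyond the horizon point, each scenario's terminal water level seeds a conventional design-flood routing.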

  8. Crashworthiness uncertainty analysis of typical civil aircraft based on Box–Behnken method

    OpenAIRE

    Ren Yiru; Xiang Jinwu

    2014-01-01

    Crashworthiness is an important design factor for civil aircraft, related to occupant safety during an impact accident. It is a highly nonlinear transient dynamic problem and may be greatly influenced by uncertainty factors. A crashworthiness uncertainty analysis is conducted to investigate the effects of initial conditions, structural dimensions and material properties. A simplified finite element model is built based on the geometrical model and the basic physical phenomena. Box–Behnken...
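    For reference, a Box–Behnken design places pairs of factors at their ±1 levels while all other factors sit at the center, plus a few center runs; a sketch (the factor count and number of center runs here are arbitrary, and standard tables add blocking for larger factor counts):

```python
from itertools import combinations

def box_behnken(k, center_points=3):
    """Three-level design: a +/-1 factorial on each pair of factors with all
    other factors held at 0, plus center runs (coded units)."""
    runs = []
    for i, j in combinations(range(k), 2):
        for a in (-1, 1):
            for b in (-1, 1):
                row = [0] * k
                row[i], row[j] = a, b
                runs.append(row)
    runs += [[0] * k for _ in range(center_points)]
    return runs

design = box_behnken(3)  # 3 factors: 12 edge runs + 3 center runs = 15 runs
```

Each coded row would then be mapped to physical factor levels (e.g., impact velocity, skin thickness, material yield stress) before running the finite element model, and a quadratic response surface fitted to the outputs.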

  9. Nonlinear Uncertainty Propagation of Satellite State Error for Tracking and Conjunction Risk Assessment

    Science.gov (United States)

    2017-12-18

    AFRL-RV-PS-TR-2017-0177: Nonlinear Uncertainty Propagation of Satellite State Error for Tracking and Conjunction Risk Assessment. Contract number FA9453-16-1-0084. ...prediction and satellite conjunction analysis. Statistical approach utilizes novel methods to build better uncertainty state characterization in the context

  10. Risks, uncertainty, vagueness

    International Nuclear Information System (INIS)

    Haefele, W.; Renn, O.; Erdmann, G.

    1990-01-01

    The notion of 'risk' is discussed in its social and technological contexts, leading to an investigation of the terms factuality, hypotheticality, uncertainty, and vagueness, and to the problems of acceptance and acceptability especially in the context of political decision finding. (DG) [de

  11. Application of a Novel Dose-Uncertainty Model for Dose-Uncertainty Analysis in Prostate Intensity-Modulated Radiotherapy

    International Nuclear Information System (INIS)

    Jin Hosang; Palta, Jatinder R.; Kim, You-Hyun; Kim, Siyong

    2010-01-01

    Purpose: To analyze dose uncertainty using a previously published dose-uncertainty model, and to assess potential dosimetric risks existing in prostate intensity-modulated radiotherapy (IMRT). Methods and Materials: The dose-uncertainty model provides a three-dimensional (3D) dose-uncertainty distribution at a given confidence level (CL). For 8 retrospectively selected patients, dose-uncertainty maps were constructed using the dose-uncertainty model at the 95% CL. In addition to uncertainties inherent to the radiation treatment planning system, four scenarios of spatial errors were considered: machine only (S1), S1 + intrafraction, S1 + interfraction, and S1 + both intrafraction and interfraction errors. To evaluate the potential risks of the IMRT plans, three dose-uncertainty-based plan evaluation tools were introduced: the confidence-weighted dose-volume histogram, the confidence-weighted dose distribution, and the dose-uncertainty-volume histogram. Results: Dose uncertainty caused by interfraction setup error was more significant than that caused by intrafraction motion error. The maximum dose uncertainty (95% confidence) of the clinical target volume (CTV) was smaller than 5% of the prescribed dose in all but two cases (13.9% and 10.2%). The dose uncertainty for 95% of the CTV volume ranged from 1.3% to 2.9% of the prescribed dose. Conclusions: The dose uncertainty in prostate IMRT could be evaluated using the dose-uncertainty model. Prostate IMRT plans satisfying the same plan objectives could generate significantly different dose uncertainties because of a complex interplay of many uncertainty sources. The uncertainty-based plan evaluation contributes to generating reliable and error-resistant treatment plans.
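    Of the three evaluation tools, a dose-uncertainty-volume histogram is the simplest to sketch: for each uncertainty threshold, record the fraction of the structure's volume whose dose uncertainty meets or exceeds that threshold, by analogy with an ordinary dose-volume histogram. The per-voxel values below are invented, and the paper's exact construction may differ:

```python
def dose_uncertainty_volume_histogram(voxel_uncerts, thresholds):
    """Fraction of voxels whose dose uncertainty is at least each threshold."""
    n = len(voxel_uncerts)
    return [sum(1 for u in voxel_uncerts if u >= t) / n for t in thresholds]

# hypothetical per-voxel dose uncertainty at the 95% CL, in % of prescribed dose
ctv_uncert = [1.2, 1.5, 2.1, 2.9, 3.4, 1.1, 0.8, 2.2, 4.8, 1.9]
duvh = dose_uncertainty_volume_histogram(ctv_uncert, thresholds=[1, 2, 3, 5])
# → [0.9, 0.5, 0.2, 0.0]; the curve is non-increasing in the threshold
```

Reading the curve at a volume fraction of 0.95 gives the "dose uncertainty for 95% of the CTV volume" quoted in the Results.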

  12. Value at risk (VaR) in uncertainty: Analysis with parametric method and Black & Scholes simulations

    Directory of Open Access Journals (Sweden)

    Humberto Banda Ortiz

    2014-07-01

    Full Text Available VaR is the most accepted risk measure worldwide and the leading reference in any risk management assessment. However, its methodology has important limitations which make it unreliable in contexts of crisis or high uncertainty. For this reason, the aim of this work is to test the accuracy of VaR when it is employed in volatile contexts, for which we compare VaR outcomes in scenarios of both stability and uncertainty, using the parametric method and a historical simulation based on data generated with the Black & Scholes model. VaR's main objective is the prediction of the highest expected loss for any given portfolio, but even though it is considered a useful tool for risk management under conditions of market stability, we found that it is substantially inaccurate in contexts of crisis or high uncertainty. In addition, we found that the Black & Scholes simulations lead to underestimating the expected losses in comparison with the parametric method, and that those disparities increase substantially in times of crisis. In the first section of this work we present a brief context of risk management in finance. In Section II we present the existing literature on the VaR concept, its methods and applications. In Section III we describe the methodology and assumptions used in this work. Section IV is dedicated to the findings. And finally, in Section V we present our conclusions.
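    The parametric (variance-covariance) VaR discussed above, and its sensitivity to volatility, can be sketched as follows. The z-scores, portfolio value and simulated return streams are illustrative, and the paper's historical simulation on Black & Scholes-generated data is not reproduced here:

```python
import random
import statistics

Z = {0.95: 1.645, 0.99: 2.326}   # standard normal quantiles

def parametric_var(returns, value, confidence=0.99):
    """Variance-covariance VaR: assumes normally distributed returns, so the
    worst expected loss at the confidence level is value * (z*sigma - mu)."""
    mu = statistics.mean(returns)
    sigma = statistics.stdev(returns)
    return value * (Z[confidence] * sigma - mu)

# calm vs. volatile markets: same mean daily return, different dispersion
rng = random.Random(1)
calm = [rng.gauss(0.0005, 0.01) for _ in range(250)]
stressed = [rng.gauss(0.0005, 0.04) for _ in range(250)]
var_calm = parametric_var(calm, 1_000_000)
var_stressed = parametric_var(stressed, 1_000_000)
# the normality assumption is precisely what breaks down in crises, where
# fat-tailed returns make this figure an underestimate
```

The comparison shows the mechanics behind the paper's finding: the parametric figure scales linearly with estimated volatility, so it tracks regime changes only as fast as the volatility estimate does.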

  13. Risk Worth Taking - Entrepreneurial Behaviour When Faced with Risk and Uncertainty

    DEFF Research Database (Denmark)

    Zichella, Giulio

    theory suggests differences in risk taking due to individual characteristics. However, entrepreneurship theory did not provide empirical support for such differences. Using data from a laboratory experiment with simple money games, we observe how individuals from two different groups (entrepreneurial......-oriented, non-entrepreneurial-oriented) react to different degrees of risk and uncertainty when real monetary incentives are involved in each decision. The analysis reveals significant differences between entrepreneurial and non-entrepreneurial-oriented individuals in their decision making. In particular...

  14. Managing project risks and uncertainties

    Directory of Open Access Journals (Sweden)

    Mike Mentis

    2015-01-01

    Full Text Available This article considers threats to a project slipping on budget, schedule and fit-for-purpose. Threat is used here as the collective term for risks (quantifiable bad things that can happen) and uncertainties (poorly or not quantifiable bad possible events). Based on experience with projects in developing countries, this review considers that (a) project slippage is due to uncertainties rather than risks, (b) while the eventuation of some bad things is beyond control, managed execution and oversight are still the primary means of keeping within budget, on time and fit-for-purpose, (c) improving project delivery is less about bigger and more complex and more about coordinated focus, effectiveness and developing thought-out heuristics, and (d) projects take longer and cost more partly because threat identification is inaccurate, the scope of identified threats is too narrow, and the threat assessment product is not integrated into overall project decision-making and execution. Almost by definition, what is poorly known is likely to cause problems. Yet it is not just the unquantifiability and intangibility of uncertainties that cause project slippage, but the fact that they are insufficiently taken into account in project planning and execution that causes budget and time overruns. Improving project performance requires purpose-driven and managed deployment of scarce seasoned professionals. This can be aided by independent oversight from deeply experienced panelists who contribute technical insights and can potentially show that diligence is seen to be done.

  15. Impact of Hydrogeological Uncertainty on Estimation of Environmental Risks Posed by Hydrocarbon Transportation Networks

    Science.gov (United States)

    Ciriello, V.; Lauriola, I.; Bonvicini, S.; Cozzani, V.; Di Federico, V.; Tartakovsky, Daniel M.

    2017-11-01

    Ubiquitous hydrogeological uncertainty undermines the veracity of quantitative predictions of soil and groundwater contamination due to accidental hydrocarbon spills from onshore pipelines. Such predictions, therefore, must be accompanied by a quantification of predictive uncertainty, especially when they are used for environmental risk assessment. We quantify the impact of parametric uncertainty on quantitative forecasting of the temporal evolution of two key risk indices, the volumes of unsaturated and saturated soil contaminated by a surface spill of light nonaqueous-phase liquids. This is accomplished by treating the relevant uncertain parameters as random variables and deploying two alternative probabilistic models to estimate their effect on predictive uncertainty. A physics-based model is solved with a stochastic collocation method and is supplemented by a global sensitivity analysis. A second model represents the quantities of interest as polynomials of random inputs and has a virtually negligible computational cost, which enables one to explore any number of risk-related contamination scenarios. For a typical oil-spill scenario, our method can be used to identify the key flow and transport parameters affecting the risk indices, to elucidate the texture-dependent behavior of different soils, and to evaluate, with a degree of confidence specified by the decision-maker, the extent of contamination and the corresponding remediation costs.

  16. Deep Uncertainties in Sea-Level Rise and Storm Surge Projections: Implications for Coastal Flood Risk Management.

    Science.gov (United States)

    Oddo, Perry C; Lee, Ben S; Garner, Gregory G; Srikrishnan, Vivek; Reed, Patrick M; Forest, Chris E; Keller, Klaus

    2017-09-05

    Sea levels are rising in many areas around the world, posing risks to coastal communities and infrastructures. Strategies for managing these flood risks present decision challenges that require a combination of geophysical, economic, and infrastructure models. Previous studies have broken important new ground on the considerable tensions between the costs of upgrading infrastructure and the damages that could result from extreme flood events. However, many risk-based adaptation strategies remain silent on certain potentially important uncertainties, as well as the tradeoffs between competing objectives. Here, we implement and improve on a classic decision-analytical model (Van Dantzig 1956) to: (i) capture tradeoffs across conflicting stakeholder objectives, (ii) demonstrate the consequences of structural uncertainties in the sea-level rise and storm surge models, and (iii) identify the parametric uncertainties that most strongly influence each objective using global sensitivity analysis. We find that the flood adaptation model produces potentially myopic solutions when formulated using traditional mean-centric decision theory. Moving from a single-objective problem formulation to one with multiobjective tradeoffs dramatically expands the decision space, and highlights the need for compromise solutions to address stakeholder preferences. We find deep structural uncertainties that have large effects on the model outcome, with the storm surge parameters accounting for the greatest impacts. Global sensitivity analysis effectively identifies important parameter interactions that local methods overlook, and that could have critical implications for flood adaptation strategies. © 2017 Society for Risk Analysis.
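
    The global sensitivity analysis used above is variance-based: the first-order Sobol index of an input is the fraction of output variance explained by that input alone. A brute-force sketch with a hypothetical two-input damage model (not the study's flood model) makes the idea concrete:

```python
import random
import statistics

random.seed(1)

# Toy two-input damage model (hypothetical, for illustration only): input x1
# dominates the response, x2 contributes less. Both are uniform on (-1, 1).
def damage(x1, x2):
    return 3.0 * x1 + 1.0 * x2

# Brute-force first-order Sobol index: S_i = Var(E[y | x_i]) / Var(y),
# estimated with a double loop (outer: fix x_i; inner: average over the other).
def first_order_index(which, n_outer=1000, n_inner=200):
    cond_means, all_y = [], []
    for _ in range(n_outer):
        fixed = random.uniform(-1.0, 1.0)
        ys = []
        for _ in range(n_inner):
            free = random.uniform(-1.0, 1.0)
            ys.append(damage(fixed, free) if which == 0 else damage(free, fixed))
        cond_means.append(statistics.fmean(ys))
        all_y.extend(ys)
    return statistics.pvariance(cond_means) / statistics.pvariance(all_y)

s1 = first_order_index(0)  # analytically 9/10 for this additive model
s2 = first_order_index(1)  # analytically 1/10
```

    Production codes use far more efficient estimators (Saltelli sampling), but the double loop shows what the index measures; unlike local one-at-a-time methods, higher-order versions of the same decomposition also capture parameter interactions.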

  17. An overview of the risk uncertainty assessment process for the Cassini space mission

    International Nuclear Information System (INIS)

    Wyss, G.D.

    1996-01-01

    The Cassini spacecraft is a deep space probe whose mission is to explore the planet Saturn and its moons. Since the spacecraft's electrical requirements will be supplied by radioisotope thermoelectric generators (RTGs), the spacecraft designers and mission planners must assure that potential accidents involving the spacecraft do not pose significant human risk. The Cassini risk analysis team is seeking to perform a quantitative uncertainty analysis as a part of the overall mission risk assessment program. This paper describes the uncertainty analysis methodology to be used for the Cassini mission and compares it to the methods that were originally developed for evaluation of commercial nuclear power reactors

  18. An evaluation of the treatment of risk and uncertainties in the IPCC reports on climate change.

    Science.gov (United States)

    Aven, Terje; Renn, Ortwin

    2015-04-01

    Few global threats rival global climate change in scale and potential consequence. The principal international authority assessing climate risk is the Intergovernmental Panel on Climate Change (IPCC). Through repeated assessments the IPCC has devoted considerable effort and interdisciplinary competence to articulating a common characterization of climate risk and uncertainties. We have reviewed the assessment and its foundation for the Fifth Assessment Reports published in 2013 and 2014, in particular the guidance note for lead authors of the fifth IPCC assessment report on consistent treatment of uncertainties. Our analysis shows that the work carried out by the IPCC falls short of providing a theoretically and conceptually convincing foundation for the treatment of risk and uncertainties. The main reasons for our assessment are: (i) the concept of risk is given too narrow a definition (a function of consequences and probability/likelihood); and (ii) the reports lack precision in delineating their concepts and methods. The goal of this article is to contribute to improving the handling of uncertainty and risk in future IPCC studies, thereby obtaining a more theoretically substantiated characterization as well as enhanced scientific quality for risk analysis in this area. Several suggestions for how to improve the risk and uncertainty treatment are provided. © 2014 Society for Risk Analysis.

  19. Use of probabilistic methods for analysis of cost and duration uncertainties in a decision analysis framework

    International Nuclear Information System (INIS)

    Boak, D.M.; Painton, L.

    1995-01-01

    Probabilistic forecasting techniques have been used in many risk assessment and performance assessment applications on radioactive waste disposal projects such as Yucca Mountain and the Waste Isolation Pilot Plant (WIPP). Probabilistic techniques such as Monte Carlo and Latin Hypercube sampling methods are routinely used to treat uncertainties in physical parameters important in simulating radionuclide transport in a coupled geohydrologic system and assessing the ability of that system to comply with regulatory release limits. However, the use of probabilistic techniques in the treatment of uncertainties in the cost and duration of programmatic alternatives on risk and performance assessment projects is less common. Where significant uncertainties exist and where programmatic decisions must be made despite existing uncertainties, probabilistic techniques may yield important insights into decision options, especially when used in a decision analysis framework and when properly balanced with deterministic analyses. For relatively simple evaluations, these types of probabilistic evaluations can be made using personal computer-based software
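
    The Latin Hypercube sampling mentioned above stratifies each input into equal-probability bins so that every bin is sampled exactly once, giving better coverage than plain Monte Carlo for the same number of model runs. A minimal sketch (variable names are illustrative):

```python
import random

# Minimal Latin Hypercube sampler over the unit hypercube: one stratified
# sample per equal-probability stratum per variable, with strata randomly
# permuted across variables so the pairings are not correlated.
def latin_hypercube(n_samples, n_vars, rng):
    design = []
    for _ in range(n_vars):
        # One point drawn inside each of the n_samples strata of (0, 1).
        col = [(i + rng.random()) / n_samples for i in range(n_samples)]
        rng.shuffle(col)
        design.append(col)
    # Transpose to a list of sample points.
    return list(zip(*design))

rng = random.Random(42)
pts = latin_hypercube(10, 2, rng)
```

    For a cost-and-duration study, each unit-interval coordinate would then be mapped through the inverse CDF of the elicited distribution (e.g., a triangular cost distribution) before being fed to the decision model.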

  20. A task specific uncertainty analysis method for least-squares-based form characterization of ultra-precision freeform surfaces

    International Nuclear Information System (INIS)

    Ren, M J; Cheung, C F; Kong, L B

    2012-01-01

    In the measurement of ultra-precision freeform surfaces, least-squares-based form characterization methods are widely used to evaluate the form error of the measured surfaces. Although many methodologies have been proposed in recent years to improve the efficiency of the characterization process, relatively little research has been conducted on the analysis of associated uncertainty in the characterization results which may result from those characterization methods being used. As a result, this paper presents a task specific uncertainty analysis method with application in the least-squares-based form characterization of ultra-precision freeform surfaces. That is, the associated uncertainty in the form characterization results is estimated when the measured data are extracted from a specific surface with a specific sampling strategy. Three factors are considered in this study which include measurement error, surface form error and sample size. The task specific uncertainty analysis method has been evaluated through a series of experiments. The results show that the task specific uncertainty analysis method can effectively estimate the uncertainty of the form characterization results for a specific freeform surface measurement

  1. Sensitivity to Uncertainty in Asteroid Impact Risk Assessment

    Science.gov (United States)

    Mathias, D.; Wheeler, L.; Prabhu, D. K.; Aftosmis, M.; Dotson, J.; Robertson, D. K.

    2015-12-01

    The Engineering Risk Assessment (ERA) team at NASA Ames Research Center is developing a physics-based impact risk model for probabilistically assessing threats from potential asteroid impacts on Earth. The model integrates probabilistic sampling of asteroid parameter ranges with physics-based analyses of entry, breakup, and impact to estimate damage areas and casualties from various impact scenarios. Assessing these threats is a highly coupled, dynamic problem involving significant uncertainties in the range of expected asteroid characteristics, how those characteristics may affect the level of damage, and the fidelity of various modeling approaches and assumptions. The presented model is used to explore the sensitivity of impact risk estimates to these uncertainties in order to gain insight into what additional data or modeling refinements are most important for producing effective, meaningful risk assessments. In the extreme cases of very small or very large impacts, the results are generally insensitive to many of the characterization and modeling assumptions. However, the nature of the sensitivity can change across moderate-sized impacts. Results will focus on the value of additional information in this critical, mid-size range, and how this additional data can support more robust mitigation decisions.

  2. Uncertainty modelling and analysis of volume calculations based on a regular grid digital elevation model (DEM)

    Science.gov (United States)

    Li, Chang; Wang, Qing; Shi, Wenzhong; Zhao, Sisi

    2018-05-01

    The accuracy of earthwork calculations that compute terrain volume is critical to digital terrain analysis (DTA). The uncertainties in volume calculations (VCs) based on a DEM are primarily related to three factors: 1) model error (ME), which is caused by an adopted algorithm for a VC model, 2) discrete error (DE), which is usually caused by DEM resolution and terrain complexity, and 3) propagation error (PE), which is caused by the variables' error. Based on these factors, the uncertainty modelling and analysis of VCs based on a regular grid DEM are investigated in this paper. In particular, a method is proposed to quantify the uncertainty of VCs with a confidence interval based on truncation error (TE). In the experiments, the trapezoidal double rule (TDR) and Simpson's double rule (SDR) were used to calculate volume, where the TE is the major ME, and six simulated regular grid DEMs with different terrain complexity and resolution (i.e. DE) were generated by a Gauss synthetic surface to easily obtain the theoretical true value and eliminate the interference of data errors. For PE, Monte Carlo simulation techniques and spatial autocorrelation were used to represent DEM uncertainty. This study can enrich uncertainty modelling and analysis-related theories of geographic information science.
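
    The TDR volume calculation and the Monte Carlo treatment of propagation error can be sketched as follows. The DEM values here are synthetic and the noise model is simplified to iid Gaussian elevation error (the paper additionally models spatial autocorrelation):

```python
import random
import statistics

# Toy 5x5 regular-grid DEM (elevations in metres) with 1 m spacing; values are
# synthetic, standing in for a Gauss synthetic surface.
dem = [[0.1 * r * c for c in range(5)] for r in range(5)]
cell = 1.0

# Trapezoidal double rule (TDR): each grid cell contributes the mean of its
# four corner elevations times the cell area.
def volume_tdr(grid, dx):
    v = 0.0
    for r in range(len(grid) - 1):
        for c in range(len(grid[0]) - 1):
            corners = grid[r][c] + grid[r][c + 1] + grid[r + 1][c] + grid[r + 1][c + 1]
            v += corners / 4.0 * dx * dx
    return v

v0 = volume_tdr(dem, cell)

# Propagation error via Monte Carlo: perturb every node with iid N(0, sigma)
# elevation error and look at the spread of the resulting volumes.
random.seed(7)
sigma = 0.05
vols = []
for _ in range(2000):
    noisy = [[z + random.gauss(0.0, sigma) for z in row] for row in dem]
    vols.append(volume_tdr(noisy, cell))
spread = statistics.stdev(vols)
```

    The empirical spread of the simulated volumes is what a confidence interval on the VC would be built from; adding spatially correlated noise, as in the paper, widens that spread relative to the iid case.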

  3. Uncertainty in ecological risk assessment: A statistician's view

    International Nuclear Information System (INIS)

    Smith, E.P.

    1995-01-01

    Uncertainty is a topic that has different meanings to researchers, modelers, managers and policy makers. The perspective of this presentation will be on the modeling view of uncertainty and its quantitative assessment. The goal is to provide some insight into how a statistician visualizes and addresses the issue of uncertainty in ecological risk assessment problems. In ecological risk assessment, uncertainty arises from many sources and is of different types depending on what is studied, where it is studied and how it is studied. Some major sources and their impact are described. A variety of quantitative approaches to modeling uncertainty are characterized and a general taxonomy given. Examples of risk assessments of lake acidification, power plant impact assessment and the setting of standards for chemicals will be used to discuss approaches to quantitative assessment of uncertainty and some of the potential difficulties

  4. Risk assessment through drinking water pathway via uncertainty modeling of contaminant transport using soft computing

    International Nuclear Information System (INIS)

    Datta, D.; Ranade, A.K.; Pandey, M.; Sathyabama, N.; Kumar, Brij

    2012-01-01

    The basic objective of an environmental impact assessment (EIA) is to build guidelines to reduce the associated risk or mitigate the consequences of a reactor accident at its source: to prevent deterministic health effects, and to reduce the risk of stochastic health effects (e.g., cancer and severe hereditary effects) as much as reasonably achievable by implementing protective actions in accordance with IAEA guidance (IAEA Safety Series No. 115, 1996). Since the measure of exposure is the basic tool for any decisions related to risk reduction, EIA is traditionally expressed in terms of radiation exposure to members of the public. However, the models used to estimate the exposure received by members of the public are governed by parameters, some of which are deterministic with relative uncertainty and some of which are stochastic as well as imprecise (reflecting insufficient knowledge). In a mixed environment of this type, it is essential to assess the uncertainty of a model in order to estimate the bounds of public exposure and to support decisions during a nuclear or radiological emergency. To this end, a soft computing technique, evidence-theory-based assessment of model parameters, is applied to compute the risk or exposure to members of the public. The relevant pathway of exposure to members of the public in the aquatic food stream is the drinking of water. Accordingly, this paper presents the uncertainty analysis of exposure via uncertainty analysis of the contaminated water. Evidence theory finally expresses the uncertainty as a lower bound (belief measure) and an upper bound of exposure (plausibility measure). In this work EIA is presented using evidence theory, and a data fusion technique is used to aggregate knowledge on the uncertain information. Uncertainty of concentration and exposure is expressed as an interval bounded by belief and plausibility
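
    The belief/plausibility bounds of evidence theory can be sketched with interval-valued focal elements. The intervals and masses below are hypothetical, not the paper's data; they show how a statement such as "the dose is at most 0.3" receives a lower bound (belief) and an upper bound (plausibility):

```python
# Focal elements after fusing evidence: interval estimates of drinking-water
# dose (hypothetical units) with their basic probability assignments (BPA).
focal = [((0.0, 0.2), 0.5), ((0.1, 0.4), 0.3), ((0.3, 0.6), 0.2)]

# Belief of a query interval A: total mass of focal elements wholly inside A.
# Plausibility of A: total mass of focal elements that intersect A.
def bel_pl(A, elements):
    lo, hi = A
    bel = sum(m for (a, b), m in elements if a >= lo and b <= hi)
    pl = sum(m for (a, b), m in elements if b >= lo and a <= hi)
    return bel, pl

# Bounds on the statement "the dose lies in [0.0, 0.3]".
bel, pl = bel_pl((0.0, 0.3), focal)
```

    The gap between belief and plausibility measures the epistemic (knowledge-driven) part of the uncertainty; it shrinks as evidence accumulates, which is exactly what probability alone cannot represent.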

  5. Measurement uncertainty analysis techniques applied to PV performance measurements

    International Nuclear Information System (INIS)

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: Increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; demonstrates quality assurance and quality control measures have been accomplished; and defines valid data as data having known and documented paths of: origin, including theory; measurements; traceability to measurement standards; computations; and uncertainty analysis of results
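
    The interval described above is typically computed by GUM-style propagation of standard uncertainties. A minimal sketch for a PV power measurement P = V × I with made-up readings and uncertainties (not the author's data):

```python
import math

# Hypothetical PV power measurement P = V * I with independent standard
# uncertainties on the voltage and current readings; all numbers are
# illustrative only.
V, u_V = 35.0, 0.1    # volts
I, u_I = 8.0, 0.05    # amperes

P = V * I
# For a product of independent quantities, relative standard uncertainties
# combine in quadrature (first-order GUM propagation).
u_P = P * math.sqrt((u_V / V) ** 2 + (u_I / I) ** 2)

# Expanded uncertainty at ~95 % confidence with coverage factor k = 2, giving
# the interval within which we believe the true power lies.
U95 = 2.0 * u_P
interval = (P - U95, P + U95)
```

    Reporting the interval rather than the bare reading is what makes results comparable across labs, since each lab's interval reflects its own documented traceability chain.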

  6. Measurement uncertainty analysis techniques applied to PV performance measurements

    Energy Technology Data Exchange (ETDEWEB)

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: Increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; demonstrates quality assurance and quality control measures have been accomplished; and defines valid data as data having known and documented paths of: origin, including theory; measurements; traceability to measurement standards; computations; and uncertainty analysis of results.

  7. Measurement uncertainty analysis techniques applied to PV performance measurements

    Energy Technology Data Exchange (ETDEWEB)

    Wells, C

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: Increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; demonstrates quality assurance and quality control measures have been accomplished; and defines valid data as data having known and documented paths of: origin, including theory; measurements; traceability to measurement standards; computations; and uncertainty analysis of results.

  8. Including uncertainty in hazard analysis through fuzzy measures

    International Nuclear Information System (INIS)

    Bott, T.F.; Eisenhawer, S.W.

    1997-12-01

    This paper presents a method for capturing the uncertainty expressed by a Hazard Analysis (HA) expert team when estimating the frequencies and consequences of accident sequences and provides a sound mathematical framework for propagating this uncertainty to the risk estimates for these accident sequences. The uncertainty is readily expressed as distributions that can visually aid the analyst in determining the extent and source of risk uncertainty in HA accident sequences. The results also can be expressed as single statistics of the distribution in a manner analogous to expressing a probabilistic distribution as a point-value statistic such as a mean or median. The study discussed here used data collected during the elicitation portion of an HA on a high-level waste transfer process to demonstrate the techniques for capturing uncertainty. These data came from observations of the uncertainty that HA team members expressed in assigning frequencies and consequences to accident sequences during an actual HA. This uncertainty was captured and manipulated using ideas from possibility theory. The result of this study is a practical method for displaying and assessing the uncertainty in the HA team estimates of the frequency and consequences for accident sequences. This uncertainty provides potentially valuable information about accident sequences that typically is lost in the HA process
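
    A common possibility-theory device for propagating such elicited distributions is alpha-cut interval arithmetic. The triangular distributions below are invented for illustration (the paper's elicited data are not reproduced here); they show how fuzzy frequency and consequence estimates combine into a fuzzy risk estimate:

```python
# Triangular possibility distributions (min, most-likely, max) as might be
# elicited from an HA team: accident frequency (/yr) and consequence
# (arbitrary units). All numbers are hypothetical.
freq = (1e-4, 5e-4, 2e-3)
cons = (10.0, 50.0, 80.0)

# An alpha-cut of a triangular fuzzy number is the interval of values whose
# possibility is at least alpha.
def alpha_cut(tri, alpha):
    lo, mode, hi = tri
    return (lo + alpha * (mode - lo), hi - alpha * (hi - mode))

# Propagate through risk = frequency * consequence with interval arithmetic
# (both quantities are positive, so endpoints multiply directly).
def risk_cut(alpha):
    f_lo, f_hi = alpha_cut(freq, alpha)
    c_lo, c_hi = alpha_cut(cons, alpha)
    return (f_lo * c_lo, f_hi * c_hi)

core = risk_cut(1.0)     # the single most possible risk value
support = risk_cut(0.0)  # the full range the team considers possible
```

    Sweeping alpha from 0 to 1 traces out the full possibility distribution of the risk, which can then be summarized by a single statistic or displayed whole, as the abstract describes.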

  9. Thermal-Hydraulic Analysis for SBLOCA in OPR1000 and Evaluation of Uncertainty for PSA

    International Nuclear Information System (INIS)

    Kim, Tae Jin; Park, Goon Cherl

    2012-01-01

    Probabilistic Safety Assessment (PSA) is a mathematical tool to evaluate numerical estimates of risk for nuclear power plants (NPPs). However, PSA has quality and reliability problems because the quantification of uncertainties from thermal hydraulic (TH) analysis has not been included in the quantification of overall uncertainties in PSA. Previous research showed that quantifying the uncertainties from best-estimate LBLOCA analysis can improve PSA quality by modifying the core damage frequency (CDF) from the existing PSA report. Based on a similar concept, this study considers the quantification of SBLOCA analysis results. In this study, however, operator error parameters are also included in addition to the phenomenon parameters considered in LBLOCA analysis

  10. Application of intelligence based uncertainty analysis for HLW disposal

    International Nuclear Information System (INIS)

    Kato, Kazuyuki

    2003-01-01

    Safety assessment for geological disposal of high level radioactive waste inevitably involves factors that cannot be specified in a deterministic manner. These are namely: (1) 'variability' that arises from the stochastic nature of the processes and features considered, e.g., distribution of canister corrosion times and spatial heterogeneity of a host geological formation; (2) 'ignorance' due to incomplete or imprecise knowledge of the processes and conditions expected in the future, e.g., uncertainty in the estimation of solubilities and sorption coefficients for important nuclides. In many cases, a decision in assessment, e.g., selection among model options or determination of a parameter value, is subject to both variability and ignorance in a combined form. It is clearly important to evaluate the influences of both variability and ignorance on the result of a safety assessment in a consistent manner. We developed a unified methodology to handle variability and ignorance by using probabilistic and possibilistic techniques, respectively. The methodology has been applied to safety assessment of geological disposal of high level radioactive waste. Uncertainties associated with scenarios, models and parameters were defined in terms of fuzzy membership functions derived through a series of interviews with experts, while variability was formulated by means of probability density functions (pdfs) based on the available data set. The exercise demonstrated the applicability of the new methodology and, in particular, its advantage in quantifying uncertainties based on experts' opinions and in providing information on the dependence of the assessment result on the level of conservatism. In addition, it was also shown that sensitivity analysis could identify key parameters in reducing uncertainties associated with the overall assessment. The above information can be used to support the judgment process and guide the process of disposal system development in optimization of protection against

  11. Analysis of uncertainties of thermal hydraulic calculations

    International Nuclear Information System (INIS)

    Macek, J.; Vavrin, J.

    2002-12-01

    In 1993-1997 it was proposed, within OECD projects, that a common program should be set up for uncertainty analysis by a probabilistic method based on a non-parametric statistical approach for system computer codes such as RELAP, ATHLET and CATHARE and that a method should be developed for statistical analysis of experimental databases for the preparation of the input deck and statistical analysis of the output calculation results. Software for such statistical analyses would then have to be processed as individual tools independent of the computer codes used for the thermal hydraulic analysis and programs for uncertainty analysis. In this context, a method for estimation of a thermal hydraulic calculation is outlined and selected methods of statistical analysis of uncertainties are described, including methods for prediction accuracy assessment based on the discrete Fourier transformation principle. (author)

  12. Development of Risk Uncertainty Factors from Historical NASA Projects

    Science.gov (United States)

    Amer, Tahani R.

    2011-01-01

    NASA is a good investment of federal funds and strives to provide the best value to the nation. NASA has consistently budgeted to unrealistic cost estimates, which is evident in the cost growth of many of its programs. To develop project risk postures, NASA has been using available uncertainty factors from the Aerospace Corporation, the Air Force, and Booz Allen Hamilton. NASA has no insight into the development of these factors and, as demonstrated here, this can lead to unrealistic risk estimates for many NASA programs and projects (P/p). The primary contribution of this project is the development of NASA mission uncertainty factors, from actual historical NASA projects, to aid cost estimating as well as independent reviews, which provide NASA senior management with the information and analysis to determine the appropriate decisions regarding P/p. In general terms, this research project advances programmatic analysis for NASA projects.

  13. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, main report

    International Nuclear Information System (INIS)

    Harper, F.T.; Young, M.L.; Miller, L.A.; Hora, S.C.; Lui, C.H.; Goossens, L.H.J.; Cooke, R.M.; Paesler-Sauer, J.; Helton, J.C.

    1995-01-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The ultimate objective of the joint effort was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. Experts developed their distributions independently. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. To validate the distributions generated for the dispersion code input variables, samples were taken from the distributions and propagated through the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the first of a three-volume document describing the project

  14. Incorporating parametric uncertainty into population viability analysis models

    Science.gov (United States)

    McGowan, Conor P.; Runge, Michael C.; Larson, Michael A.

    2011-01-01

    Uncertainty in parameter estimates from sampling variation or expert judgment can introduce substantial uncertainty into ecological predictions based on those estimates. However, in standard population viability analyses, one of the most widely used tools for managing plant, fish and wildlife populations, parametric uncertainty is often ignored in or discarded from model projections. We present a method for explicitly incorporating this source of uncertainty into population models to fully account for risk in management and decision contexts. Our method involves a two-step simulation process where parametric uncertainty is incorporated into the replication loop of the model and temporal variance is incorporated into the loop for time steps in the model. Using the piping plover, a federally threatened shorebird in the USA and Canada, as an example, we compare abundance projections and extinction probabilities from simulations that exclude and include parametric uncertainty. Although final abundance was very low for all sets of simulations, estimated extinction risk was much greater for the simulation that incorporated parametric uncertainty in the replication loop. Decisions about species conservation (e.g., listing, delisting, and jeopardy) might differ greatly depending on the treatment of parametric uncertainty in population models.
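
    The two-loop structure described above can be sketched with a deliberately simplified population model (all rates and thresholds are invented, not the piping plover estimates): the outer replication loop draws the mean growth rate from its sampling distribution (parametric uncertainty), and the inner loop over time steps adds year-to-year environmental variation.

```python
import random

random.seed(3)

# Two-loop PVA sketch. Outer loop: parametric uncertainty in the mean growth
# rate; inner loop: temporal (environmental) variance around that mean.
def extinction_risk(n_reps=2000, years=50, n0=100, quasi_ext=10):
    extinct = 0
    for _ in range(n_reps):
        lam_mean = random.gauss(1.00, 0.03)      # parametric uncertainty
        n = float(n0)
        for _ in range(years):
            lam = random.gauss(lam_mean, 0.10)   # temporal variance
            n *= max(lam, 0.0)
            if n < quasi_ext:
                extinct += 1
                break
    return extinct / n_reps

risk_with = extinction_risk()

# Standard practice that the abstract criticizes: fix the growth rate at its
# point estimate and simulate only temporal variance.
def extinction_risk_fixed(n_reps=2000, years=50, n0=100, quasi_ext=10):
    extinct = 0
    for _ in range(n_reps):
        n = float(n0)
        for _ in range(years):
            n *= max(random.gauss(1.00, 0.10), 0.0)
            if n < quasi_ext:
                extinct += 1
                break
    return extinct / n_reps

risk_without = extinction_risk_fixed()
```

    As in the paper's plover example, folding parametric uncertainty into the replication loop inflates the estimated extinction risk relative to the point-estimate simulation, which can change listing or delisting conclusions.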

  15. Adaptive Governance, Uncertainty, and Risk: Policy Framing and Responses to Climate Change, Drought, and Flood.

    Science.gov (United States)

    Hurlbert, Margot; Gupta, Joyeeta

    2016-02-01

    As climate change impacts result in more extreme events (such as droughts and floods), the need to understand which policies facilitate effective climate change adaptation becomes crucial. Hence, this article answers the question: How do governments and policymakers frame policy in relation to climate change, droughts, and floods and what governance structures facilitate adaptation? This research interrogates and analyzes through content analysis, supplemented by semi-structured qualitative interviews, the policy response to climate change, drought, and flood in relation to agricultural producers in four case studies in river basins in Chile, Argentina, and Canada. First, an epistemological explanation of risk and uncertainty underscores a brief literature review of adaptive governance, followed by policy framing in relation to risk and uncertainty, and an analytical model is developed. Pertinent findings of the four cases are recounted, followed by a comparative analysis. In conclusion, recommendations are made to improve policies and expand adaptive governance to better account for uncertainty and risk. This article is innovative in that it proposes an expanded model of adaptive governance in relation to "risk" that can help bridge the barrier of uncertainty in science and policy. © 2015 Society for Risk Analysis.

  16. Sustainability Risk Evaluation for Large-Scale Hydropower Projects with Hybrid Uncertainty

    Directory of Open Access Journals (Sweden)

    Weiyao Tang

    2018-01-01

    Full Text Available As large-scale hydropower projects are influenced by many factors, risk evaluations are complex. This paper considers a hydropower project as a complex system from the perspective of sustainability risk, and divides it into three subsystems: the natural environment subsystem, the eco-environment subsystem and the socioeconomic subsystem. Risk-related factors and quantitative dimensions of each subsystem are comprehensively analyzed, with the uncertainty of some quantitative dimensions handled by hybrid uncertainty methods: fuzzy (e.g., the national health degree, the national happiness degree, the protection of cultural heritage), random (e.g., underground water levels, river width), and fuzzy random uncertainty (e.g., runoff volumes, precipitation). By calculating the sustainability risk-related degree in each of the risk-related factors, a sustainability risk-evaluation model is built. Based on the calculation results, the critical sustainability risk-related factors are identified and targeted to reduce the losses caused by sustainability risk factors of the hydropower project. A case study at the under-construction Baihetan hydropower station is presented to demonstrate the viability of the risk-evaluation model and to provide a reference for the sustainable risk evaluation of other large-scale hydropower projects.
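
    One simple way to combine the three uncertainty types listed above is Monte Carlo sampling in which fuzzy quantities are represented by triangular membership functions (sampled via random alpha-cuts), random quantities by probability distributions, and fuzzy random quantities by a product of the two. The factor names, numbers, and aggregation weights below are purely illustrative, not values from the Baihetan study.

```python
import random

def sample_triangular_fuzzy(low, mode, high, rng):
    """Draw a crisp value from a triangular fuzzy number via a random alpha-cut."""
    alpha = rng.random()                 # membership level
    lo = low + alpha * (mode - low)      # alpha-cut lower bound
    hi = high - alpha * (high - mode)    # alpha-cut upper bound
    return rng.uniform(lo, hi)

def sample_risk_degree(rng):
    # Fuzzy factor (e.g., a health-degree score on [0, 1])
    health = sample_triangular_fuzzy(0.4, 0.6, 0.8, rng)
    # Random factor (e.g., groundwater level in metres)
    water = rng.gauss(12.0, 2.0)
    # Fuzzy random factor (e.g., runoff volume): fuzzy scaling of a random draw
    runoff = sample_triangular_fuzzy(0.8, 1.0, 1.2, rng) * rng.gauss(500.0, 50.0)
    # Toy aggregation into a normalized risk-related degree
    return ((1 - health) * 0.4
            + min(water / 20.0, 1.0) * 0.3
            + min(runoff / 1000.0, 1.0) * 0.3)

rng = random.Random(0)
samples = [sample_risk_degree(rng) for _ in range(5000)]
mean_risk = sum(samples) / len(samples)
```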

  17. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis

    Science.gov (United States)

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-01-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster–Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty–sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights. PMID:25843987
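
    The second phase, analyzing susceptibility as a function of the criteria weights, can be sketched with a minimal Monte Carlo loop. The three criteria, base weights, and cell scores below are hypothetical stand-ins for the raster layers used in the paper.

```python
import random

rng = random.Random(1)

# Hypothetical base weights for three landslide criteria (slope, lithology, land use)
base_weights = [0.5, 0.3, 0.2]
# Standardized criterion scores for a few example raster cells
cells = [[0.9, 0.7, 0.4], [0.2, 0.5, 0.3], [0.6, 0.8, 0.9]]

def perturb_weights(weights, rel_sd, rng):
    """Perturb each weight, then renormalize so the set still sums to one."""
    w = [max(1e-9, rng.gauss(wi, rel_sd * wi)) for wi in weights]
    s = sum(w)
    return [wi / s for wi in w]

def susceptibility(cell, weights):
    # Weighted linear combination of standardized criterion scores
    return sum(s * w for s, w in zip(cell, weights))

# Monte Carlo loop: uncertainty of each cell's score as a function of the weights
scores = {i: [] for i in range(len(cells))}
for _ in range(2000):
    w = perturb_weights(base_weights, 0.10, rng)
    for i, cell in enumerate(cells):
        scores[i].append(susceptibility(cell, w))

mean_score = [sum(v) / len(v) for v in scores.values()]
spread = [max(v) - min(v) for v in scores.values()]
```

    Per-cell spreads of this kind are what an uncertainty map visualizes; cells whose rank changes under weight perturbation are the sensitive ones.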

  18. Uncertainties in Earthquake Loss Analysis: A Case Study From Southern California

    Science.gov (United States)

    Mahdyiar, M.; Guin, J.

    2005-12-01

    Probabilistic earthquake hazard and loss analyses play important roles in many areas of risk management, including earthquake related public policy and insurance ratemaking. Rigorous loss estimation for portfolios of properties is difficult since there are various types of uncertainties in all aspects of modeling and analysis. It is the objective of this study to investigate the sensitivity of earthquake loss estimation to uncertainties in regional seismicity, earthquake source parameters, ground motions, and sites' spatial correlation on typical property portfolios in Southern California. Southern California is an attractive region for such a study because it has a large population concentration exposed to significant levels of seismic hazard. During the last decade, there have been several comprehensive studies of most regional faults and seismogenic sources. There have also been detailed studies on regional ground motion attenuations and regional and local site responses to ground motions. This information has been used by engineering seismologists to conduct regional seismic hazard and risk analysis on a routine basis. However, one of the more difficult tasks in such studies is the proper incorporation of uncertainties in the analysis. From the hazard side, there are uncertainties in the magnitudes, rates and mechanisms of the seismic sources and local site conditions and ground motion site amplifications. From the vulnerability side, there are considerable uncertainties in estimating the state of damage of buildings under different earthquake ground motions. From an analytical side, there are challenges in capturing the spatial correlation of ground motions and building damage, and integrating thousands of loss distribution curves with different degrees of correlation. In this paper we propose to address some of these issues by conducting loss analyses of a typical small portfolio in southern California, taking into consideration various source and ground
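
    A toy illustration of why the spatial correlation of ground motions matters when aggregating portfolio losses: a shared inter-event residual inflates the dispersion of the total loss even when each site's marginal loss is unchanged. All numbers below are illustrative, not calibrated to Southern California.

```python
import math
import random

rng = random.Random(7)

def portfolio_loss_std(rho, n_sites=20, n_sims=4000):
    """Std. dev. of total portfolio loss when site ground-motion residuals
    share a common (inter-event) component with correlation rho."""
    losses = []
    for _ in range(n_sims):
        event = rng.gauss(0.0, 1.0)       # shared inter-event residual
        total = 0.0
        for _ in range(n_sites):
            site = rng.gauss(0.0, 1.0)    # independent intra-event residual
            z = math.sqrt(rho) * event + math.sqrt(1 - rho) * site
            # Toy vulnerability curve: per-site loss fraction capped at 1
            total += min(1.0, math.exp(0.5 * z - 1.5))
        losses.append(total)
    mean = sum(losses) / n_sims
    var = sum((x - mean) ** 2 for x in losses) / (n_sims - 1)
    return math.sqrt(var)

uncorrelated = portfolio_loss_std(0.0)
correlated = portfolio_loss_std(0.6)
```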

  19. An uncertainty and sensitivity analysis approach for GIS-based multicriteria landslide susceptibility mapping

    Science.gov (United States)

    Feizizadeh, Bakhtiar; Blaschke, Thomas

    2014-01-01

    GIS-based multicriteria decision analysis (MCDA) methods are increasingly being used in landslide susceptibility mapping. However, the uncertainties that are associated with MCDA techniques may significantly impact the results. This may sometimes lead to inaccurate outcomes and undesirable consequences. This article introduces a new GIS-based MCDA approach. We illustrate the consequences of applying different MCDA methods within a decision-making process through uncertainty analysis. Three GIS-MCDA methods in conjunction with Monte Carlo simulation (MCS) and Dempster–Shafer theory are analyzed for landslide susceptibility mapping (LSM) in the Urmia lake basin in Iran, which is highly susceptible to landslide hazards. The methodology comprises three stages. First, the LSM criteria are ranked and a sensitivity analysis is implemented to simulate error propagation based on the MCS. The resulting weights are expressed through probability density functions. Accordingly, within the second stage, three MCDA methods, namely analytical hierarchy process (AHP), weighted linear combination (WLC) and ordered weighted average (OWA), are used to produce the landslide susceptibility maps. In the third stage, accuracy assessments are carried out and the uncertainties of the different results are measured. We compare the accuracies of the three MCDA methods based on (1) the Dempster–Shafer theory and (2) a validation of the results using an inventory of known landslides and their respective coverage based on object-based image analysis of IRS-ID satellite images. The results of this study reveal that through the integration of GIS and MCDA models, it is possible to identify strategies for choosing an appropriate method for LSM. Furthermore, our findings indicate that the integration of MCDA and MCS can significantly improve the accuracy of the results. In LSM, the AHP method performed best, while the OWA reveals better performance in the reliability assessment. The WLC

  20. Risks Analysis of Logistics Financial Business Based on Evidential Bayesian Network

    Directory of Open Access Journals (Sweden)

    Ying Yan

    2013-01-01

    Full Text Available Risks in logistics financial business are identified and classified. Taking the failure of the business as the root node, a Bayesian network is constructed to measure the risk levels in the business. Three importance indexes are calculated to find the most important risks in the business. Moreover, considering the epistemic uncertainties in the risks, evidence theory is combined with the Bayesian network and used as an evidential network in the risk analysis of logistics finance. To find how much of the uncertainty in the root node is produced by each risk, a new index, epistemic importance, is defined. Numerical examples show that the proposed methods can provide a lot of useful information. With this information, effective approaches can be found to control and avoid these sensitive risks, thus keeping the logistics financial business working more reliably. The proposed method also gives a quantitative measure of risk levels in logistics financial business, which provides guidance for the selection of financing solutions.
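
    The basic Bayesian-network machinery, inference by enumeration with the business failure node conditioned on its parent risks, can be sketched for a two-risk toy network. The risk names, priors, and conditional probability table are hypothetical, and the importance index shown is a simple probability difference, not necessarily one of the three indexes from the paper.

```python
from itertools import product

# Hypothetical prior probabilities of two logistics-finance risks
p_risk = {"credit": 0.10, "warehouse": 0.05}
# Hypothetical CPT: P(business failure | credit state, warehouse state)
p_fail = {(0, 0): 0.01, (0, 1): 0.20, (1, 0): 0.30, (1, 1): 0.70}

def prob_failure(evidence=None):
    """Exact inference by enumeration over the two parent risk nodes."""
    evidence = evidence or {}
    total = 0.0
    norm = 0.0
    for credit, wh in product([0, 1], repeat=2):
        if evidence.get("credit", credit) != credit:
            continue
        if evidence.get("warehouse", wh) != wh:
            continue
        w = ((p_risk["credit"] if credit else 1 - p_risk["credit"]) *
             (p_risk["warehouse"] if wh else 1 - p_risk["warehouse"]))
        total += w * p_fail[(credit, wh)]
        norm += w
    return total / norm   # condition on the evidence

baseline = prob_failure()
# A simple importance index: change in failure probability when a risk is present
importance = {r: prob_failure({r: 1}) - prob_failure({r: 0}) for r in p_risk}
```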

  1. Uncertainty Analysis with Considering Resonance Self-shielding Effect

    Energy Technology Data Exchange (ETDEWEB)

    Han, Tae Young [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    If infinitely diluted multi-group cross sections were used for the sensitivity, the covariance data from the evaluated nuclear data library (ENDL) was directly applied. However, when a self-shielded multi-group cross section is used, the covariance data should be corrected to account for the self-shielding effect. Usually, implicit uncertainty can be defined as the uncertainty change caused by the resonance self-shielding effect, as described above. MUSAD (Modules of Uncertainty and Sensitivity Analysis for DeCART) has been developed for multiplication factor and cross-section uncertainty analysis based on generalized perturbation theory; however, it can only quantify the explicit uncertainty from the self-shielded multi-group cross sections without considering the implicit effect. Thus, this paper addresses the implementation of the implicit uncertainty analysis module into the code, and the numerical results for the verification are provided. The implicit uncertainty analysis module has been implemented into MUSAD based on the infinitely-diluted cross-section-based consistent method. The verification calculation was performed on MHTGR 350 Ex.I-1a, and the differences from the McCARD results decrease from 40% to 1% in the CZP case and 3% in the HFP case. From this study, it is expected that the MUSAD code can reasonably produce the complete uncertainty for VHTR or LWR problems where the resonance self-shielding effect must be significantly considered.

  2. Uncertainty Analysis with Considering Resonance Self-shielding Effect

    International Nuclear Information System (INIS)

    Han, Tae Young

    2016-01-01

    If infinitely diluted multi-group cross sections were used for the sensitivity, the covariance data from the evaluated nuclear data library (ENDL) was directly applied. However, when a self-shielded multi-group cross section is used, the covariance data should be corrected to account for the self-shielding effect. Usually, implicit uncertainty can be defined as the uncertainty change caused by the resonance self-shielding effect, as described above. MUSAD (Modules of Uncertainty and Sensitivity Analysis for DeCART) has been developed for multiplication factor and cross-section uncertainty analysis based on generalized perturbation theory; however, it can only quantify the explicit uncertainty from the self-shielded multi-group cross sections without considering the implicit effect. Thus, this paper addresses the implementation of the implicit uncertainty analysis module into the code, and the numerical results for the verification are provided. The implicit uncertainty analysis module has been implemented into MUSAD based on the infinitely-diluted cross-section-based consistent method. The verification calculation was performed on MHTGR 350 Ex.I-1a, and the differences from the McCARD results decrease from 40% to 1% in the CZP case and 3% in the HFP case. From this study, it is expected that the MUSAD code can reasonably produce the complete uncertainty for VHTR or LWR problems where the resonance self-shielding effect must be significantly considered

  3. Flood risk assessment and robust management under deep uncertainty: Application to Dhaka City

    Science.gov (United States)

    Mojtahed, Vahid; Gain, Animesh Kumar; Giupponi, Carlo

    2014-05-01

    Socio-economic changes as well as climatic changes have been the main drivers of uncertainty in environmental risk assessment, and in particular in flood risk assessment. The level of future uncertainty that researchers face when dealing with problems in a future perspective with a focus on climate change is known as Deep Uncertainty (also known as Knightian uncertainty): nobody has experienced and undergone those changes before, our knowledge is limited to the extent that we have no notion of probabilities, and therefore consolidated risk management approaches have limited potential. Deep uncertainty refers to circumstances where analysts and experts do not know, or parties to decision making cannot agree on: i) the appropriate models describing the interaction among system variables, ii) the probability distributions to represent uncertainty about key parameters in the models, and iii) how to value the desirability of alternative outcomes. The need thus emerges to assist policy-makers by providing them not with a single, optimal solution to the problem at hand, such as crisp estimates of the costs of damages from the natural hazards considered, but instead with ranges of possible future costs, based on the outcomes of ensembles of assessment models and sets of plausible scenarios. Accordingly, we need to substitute optimality as a decision criterion with robustness. Under conditions of deep uncertainty, decision-makers do not have statistical and mathematical bases to identify optimal solutions; instead they should prefer to implement "robust" decisions that perform relatively well over all conceivable outcomes of all future unknown scenarios. Under deep uncertainty, analysts cannot employ probability theory or other statistics that usually can be derived from observed historical data, and therefore we turn to non-statistical measures such as scenario analysis.
We construct several plausible scenarios with each scenario being a full description of what may happen
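
    Robustness over scenarios, rather than optimality under a single forecast, is often operationalized with criteria such as minimax regret, which needs no scenario probabilities at all. A minimal sketch with hypothetical flood-protection options and damage costs:

```python
# Hypothetical options and total costs (million USD) under four plausible
# deep-uncertainty scenarios -- no probabilities are assumed or needed.
costs = {
    "dike_low":  [120, 300, 450, 700],
    "dike_high": [260, 280, 310, 360],
    "retreat":   [400, 410, 420, 430],
}

def minimax_regret(costs):
    """Pick the option whose worst-case regret across scenarios is smallest."""
    n = len(next(iter(costs.values())))
    # Best achievable cost in each scenario, across all options
    best_per_scenario = [min(c[s] for c in costs.values()) for s in range(n)]
    regret = {
        opt: max(c[s] - best_per_scenario[s] for s in range(n))
        for opt, c in costs.items()
    }
    return min(regret, key=regret.get), regret

choice, regret = minimax_regret(costs)
```

    Here the option that is never the cheapest ("dike_high") is nonetheless the robust choice, because it performs relatively well in every scenario.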

  4. Estimating the Economic Attractiveness of Investment Projects in Conditions of Uncertainty and Risk with the Use of Sensitivity Analysis

    Directory of Open Access Journals (Sweden)

    Kotsyuba Oleksiy S.

    2018-02-01

    Full Text Available The article is concerned with the methodology of economic substantiation of real investments in case of a considerable lack of information on possible fluctuations of initial parameters and the resulting risk. The analysis of sensitivity as the main instrument for accounting for risk in the indicated problem situation is the focus of the presented research. In the publication, on the basis of the apparatus of interval mathematics, a set of models for comparative estimation of the economic attractiveness (efficiency) of alternative investment projects in conditions of uncertainty and risk is formulated, using sensitivity analysis. The developed instrumentarium assumes both mono- and poly-interval versions of the sensitivity analysis. As the risk component in the constructed models is used: in some, values of the specially developed sensitivity coefficient; in others, the worst values, which are based on the interval estimations of the partial criteria of efficiency. The sensitivity coefficient, according to the approach proposed in the publication, is the ratio of the target semi-range of variation to the increase (economy) of efficiency, which is provided when the basic level of the analyzed partial criterion of economic attractiveness is reached in comparison with some threshold (limit) value.
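
    In the spirit of the coefficient described, though not necessarily the authors' exact formulation, one can divide the interval semi-range of an efficiency criterion by the margin of its base value over a threshold. All project figures below are hypothetical.

```python
# Hypothetical interval estimate for an annual cash flow (thousand USD)
cash_low, cash_base, cash_high = 80.0, 100.0, 115.0
years, rate, outlay = 5, 0.10, 300.0

def npv(cash):
    """NPV of a constant annual cash flow against an initial outlay."""
    return sum(cash / (1 + rate) ** t for t in range(1, years + 1)) - outlay

npv_base = npv(cash_base)                              # base-level efficiency
npv_threshold = 0.0                                    # break-even threshold
semi_range = (npv(cash_high) - npv(cash_low)) / 2.0    # target semi-range

# Sensitivity coefficient: possible variation relative to the margin gained
# over the threshold; values near or above 1 flag a risky project
sensitivity = semi_range / (npv_base - npv_threshold)
```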

  5. Methods for Addressing Uncertainty and Variability to Characterize Potential Health Risk From Trichloroethylene-Contaminated Ground Water Beale Air Force Base in California: Integration of Uncertainty and Variability in Pharmacokinetics and Dose-Response; TOPICAL

    International Nuclear Information System (INIS)

    Bogen, K.T.

    1999-01-01

    Traditional estimates of health risk are typically inflated, particularly if cancer is the dominant endpoint and there is fundamental uncertainty as to mechanism(s) of action. Risk is more realistically characterized if it accounts for joint uncertainty and interindividual variability after applying a unified probabilistic approach to the distributed parameters of all (linear as well as nonlinear) risk-extrapolation models involved. Such an approach was applied to characterize risks to potential future residents posed by trichloroethylene (TCE) in ground water at an inactive landfill site on Beale Air Force Base in California. Variability and uncertainty were addressed in exposure-route-specific estimates of applied dose, in pharmacokinetically based estimates of route-specific metabolized fractions of absorbed TCE, and in corresponding biologically effective doses estimated under a genotoxic/linear (MA_g) vs. a cytotoxic/nonlinear (MA_c) mechanistic assumption for TCE-induced cancer. Increased risk conditional on effective dose was estimated under MA_g based on seven rodent-bioassay data sets, and under MA_c based on mouse hepatotoxicity data. Mean and upper-bound estimates of combined risk calculated by the unified approach were <10^-6 and <10^-4, respectively, while corresponding estimates based on traditional deterministic methods were >10^-5 and >10^-4, respectively. It was estimated that no TCE-related harm is likely to occur under any plausible residential exposure scenario involving the site. The unified approach illustrated is particularly suited to characterizing risks that involve uncertain and/or diverse mechanisms of action
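
    Jointly treating uncertainty and interindividual variability is commonly done with a two-dimensional (nested) Monte Carlo: an outer loop over epistemically uncertain parameters and an inner loop over the variable population. The distributions and parameter values below are hypothetical placeholders, not the TCE assessment's inputs.

```python
import random

def two_d_monte_carlo(n_uncertainty=200, n_variability=200, seed=3):
    """Outer loop: epistemic uncertainty (e.g., cancer potency).
    Inner loop: interindividual variability (e.g., individual dose).
    Returns (mean population risk, 95th uncertainty percentile of that mean)."""
    rng = random.Random(seed)
    mean_risks = []
    for _ in range(n_uncertainty):
        slope = rng.lognormvariate(-11.0, 0.8)    # uncertain potency (hypothetical)
        risks = []
        for _ in range(n_variability):
            dose = rng.lognormvariate(1.0, 0.5)   # variable individual dose
            risks.append(min(1.0, slope * dose))  # linear low-dose model, capped
        mean_risks.append(sum(risks) / n_variability)
    mean_risks.sort()
    mean = sum(mean_risks) / n_uncertainty
    upper = mean_risks[int(0.95 * n_uncertainty)]
    return mean, upper

mean_risk, upper_bound = two_d_monte_carlo()
```

    Reporting the mean together with an upper uncertainty bound, rather than a single conservative point estimate, is what distinguishes this approach from the traditional deterministic calculation criticized in the abstract.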

  6. Incorporating the Technology Roadmap Uncertainties into the Project Risk Assessment

    International Nuclear Information System (INIS)

    Bonnema, B.E.

    2002-01-01

    This paper describes two methods, Technology Roadmapping and Project Risk Assessment, which were used to identify and manage the technical risks relating to the treatment of sodium bearing waste at the Idaho National Engineering and Environmental Laboratory. The waste treatment technology under consideration was Direct Vitrification. The primary objective of the Technology Roadmap is to identify technical data uncertainties for the technologies involved and to prioritize the testing or development studies to fill the data gaps. Similarly, project management's objective for a multi-million dollar construction project includes managing all the key risks in accordance with DOE O 413.3 - ''Program and Project Management for the Acquisition of Capital Assets.'' In the early stages, the Project Risk Assessment is based upon a qualitative analysis of each risk's probability and consequence. In order to clearly prioritize the work to resolve the technical issues identified in the Technology Roadmap, the issues must be cross-referenced to the project's Risk Assessment. This will enable the project to get the best value for the cost to mitigate the risks

  7. A practical sensitivity analysis method for ranking sources of uncertainty in thermal–hydraulics applications

    Energy Technology Data Exchange (ETDEWEB)

    Pourgol-Mohammad, Mohammad, E-mail: pourgolmohammad@sut.ac.ir [Department of Mechanical Engineering, Sahand University of Technology, Tabriz (Iran, Islamic Republic of); Hoseyni, Seyed Mohsen [Department of Basic Sciences, East Tehran Branch, Islamic Azad University, Tehran (Iran, Islamic Republic of); Hoseyni, Seyed Mojtaba [Building & Housing Research Center, Tehran (Iran, Islamic Republic of); Sepanloo, Kamran [Nuclear Science and Technology Research Institute, Tehran (Iran, Islamic Republic of)

    2016-08-15

    Highlights: • Existing uncertainty ranking methods prove inconsistent for TH applications. • Introduction of a new method for ranking sources of uncertainty in TH codes. • Modified PIRT qualitatively identifies and ranks uncertainty sources more precisely. • The importance of parameters is calculated by a limited number of TH code executions. • Methodology is applied successfully on LOFT-LB1 test facility. - Abstract: In application to thermal–hydraulic calculations by system codes, sensitivity analysis plays an important role for managing the uncertainties of code output and risk analysis. Sensitivity analysis is also used to confirm the results of qualitative Phenomena Identification and Ranking Table (PIRT). Several methodologies have been developed to address uncertainty importance assessment. Generally, uncertainty importance measures, mainly devised for the Probabilistic Risk Assessment (PRA) applications, are not affordable for computationally demanding calculations of the complex thermal–hydraulics (TH) system codes. In other words, for effective quantification of the degree of the contribution of each phenomenon to the total uncertainty of the output, a practical approach is needed by considering high computational burden of TH calculations. This study aims primarily to show the inefficiency of the existing approaches and then introduces a solution to cope with the challenges in this area by modification of variance-based uncertainty importance method. Important parameters are identified by the modified PIRT approach qualitatively then their uncertainty importance is quantified by a local derivative index. The proposed index is attractive from its practicality point of view on TH applications. It is capable of calculating the importance of parameters by a limited number of TH code executions. Application of the proposed methodology is demonstrated on LOFT-LB1 test facility.
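
    A local derivative importance index of the kind described, one-at-a-time central differences scaled by each parameter's uncertainty, needs only 2k code executions for k parameters, which is what makes it affordable for expensive TH codes. The toy model and parameter names below are hypothetical stand-ins for a real system-code response such as peak cladding temperature.

```python
def derivative_importance(model, base, sigmas, rel_step=0.01):
    """Rank parameters by |dY/dx_i| * sigma_i using central differences.
    Needs only 2 * len(base) model evaluations."""
    importance = {}
    for name, x0 in base.items():
        h = rel_step * abs(x0) or rel_step
        up = dict(base); up[name] = x0 + h
        dn = dict(base); dn[name] = x0 - h
        dydx = (model(up) - model(dn)) / (2 * h)
        importance[name] = abs(dydx) * sigmas[name]
    return importance

# Toy stand-in for an expensive thermal-hydraulics code output
def toy_model(p):
    return 600 + 80 * p["power"] - 30 * p["flow"] + 5 * p["gap"]

base = {"power": 1.0, "flow": 1.0, "gap": 1.0}       # nominal inputs
sigmas = {"power": 0.05, "flow": 0.10, "gap": 0.50}  # parameter uncertainties
ranking = derivative_importance(toy_model, base, sigmas)
```

    Note that a large derivative alone does not make a parameter important; the index weights it by the parameter's own uncertainty, so a weakly uncertain but highly influential input can still dominate.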

  8. A practical sensitivity analysis method for ranking sources of uncertainty in thermal–hydraulics applications

    International Nuclear Information System (INIS)

    Pourgol-Mohammad, Mohammad; Hoseyni, Seyed Mohsen; Hoseyni, Seyed Mojtaba; Sepanloo, Kamran

    2016-01-01

    Highlights: • Existing uncertainty ranking methods prove inconsistent for TH applications. • Introduction of a new method for ranking sources of uncertainty in TH codes. • Modified PIRT qualitatively identifies and ranks uncertainty sources more precisely. • The importance of parameters is calculated by a limited number of TH code executions. • Methodology is applied successfully on LOFT-LB1 test facility. - Abstract: In application to thermal–hydraulic calculations by system codes, sensitivity analysis plays an important role for managing the uncertainties of code output and risk analysis. Sensitivity analysis is also used to confirm the results of qualitative Phenomena Identification and Ranking Table (PIRT). Several methodologies have been developed to address uncertainty importance assessment. Generally, uncertainty importance measures, mainly devised for the Probabilistic Risk Assessment (PRA) applications, are not affordable for computationally demanding calculations of the complex thermal–hydraulics (TH) system codes. In other words, for effective quantification of the degree of the contribution of each phenomenon to the total uncertainty of the output, a practical approach is needed by considering high computational burden of TH calculations. This study aims primarily to show the inefficiency of the existing approaches and then introduces a solution to cope with the challenges in this area by modification of variance-based uncertainty importance method. Important parameters are identified by the modified PIRT approach qualitatively then their uncertainty importance is quantified by a local derivative index. The proposed index is attractive from its practicality point of view on TH applications. It is capable of calculating the importance of parameters by a limited number of TH code executions. Application of the proposed methodology is demonstrated on LOFT-LB1 test facility.

  9. Hydroclimatic risks and uncertainty in the global power sector

    Science.gov (United States)

    Gidden, Matthew; Byers, Edward; Greve, Peter; Kahil, Taher; Parkinson, Simon; Raptis, Catherine; Rogelj, Joeri; Satoh, Yusuke; van Vliet, Michelle; Wada, Yoshide; Krey, Volker; Langan, Simon; Riahi, Keywan

    2017-04-01

    Approximately 80% of the world's electricity supply depends on reliable water resources. Thermoelectric and hydropower plants have been impacted by low flows and floods in recent years, notably in the US, Brazil, France, and China, amongst other countries. This dependence on reliable flows creates a large vulnerability in the electricity supply system due to hydrological variability and the impacts of climate change. Using an updated dataset of global electricity capacity with global climate and hydrological data from the ISI-MIP project, we present an overview analysis of power sector vulnerability to hydroclimatic risks, including low river flows and peak flows. We show how electricity generation in individual countries and transboundary river basins can be impacted, helping decision-makers identify key at-risk geographical regions. Furthermore, our use of a multi-model ensemble of climate and hydrological models allows us to quantify the uncertainty of projected impacts, such that basin-level risks and uncertainty can be compared.
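
    Quantifying uncertainty from a multi-model ensemble often reduces to simple summary statistics over the members: a central estimate, an inter-model spread, and a sign-agreement fraction. The basin values below are invented for illustration, not ISI-MIP results.

```python
# Hypothetical % change in annual low flow for one basin, one value per
# climate-hydrology model pair in an ISI-MIP-style ensemble
ensemble = [-22.0, -15.0, -8.0, -3.0, 1.0, -12.0, -18.0, -6.0]

ensemble.sort()
n = len(ensemble)
median = (ensemble[n // 2 - 1] + ensemble[n // 2]) / 2.0   # central estimate
spread = ensemble[-1] - ensemble[0]        # inter-model range as uncertainty band
agreement = sum(1 for x in ensemble if x < 0) / n          # sign agreement
```

    High sign agreement with a wide spread, as here, tells a decision-maker that the direction of change is robust even though its magnitude is not.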

  10. Model uncertainty in financial markets : Long run risk and parameter uncertainty

    NARCIS (Netherlands)

    de Roode, F.A.

    2014-01-01

    Uncertainty surrounding key parameters of financial markets, such as the in- flation and equity risk premium, constitute a major risk for institutional investors with long investment horizons. Hedging the investors’ inflation exposure can be challenging due to the lack of domestic inflation-linked

  11. Risk in technical and scientific studies: general introduction to uncertainty management and the concept of risk

    International Nuclear Information System (INIS)

    Apostolakis, G.E.

    2004-01-01

    George Apostolakis (MIT) presented an introduction to the concept of risk and uncertainty management and their use in technical and scientific studies. He noted that Quantitative Risk Assessment (QRA) provides support to the overall treatment of a system as an integrated socio-technical system. Specifically, QRA aims to answer the questions: - What can go wrong (e.g., accident sequences or scenarios)? - How likely are these sequences or scenarios? - What are the consequences of these sequences or scenarios? The Quantitative Risk Assessment deals with two major types of uncertainty. An assessment requires a 'model of the world', and this preferably would be a deterministic model based on underlying processes. In practice, there are uncertainties in this model of the world relating to variability or randomness that cannot be accounted for directly in a deterministic model and that may require a probabilistic or aleatory model. Both deterministic and aleatory models of the world have assumptions and parameters, and there are 'state-of-knowledge' or epistemic uncertainties associated with these. Sensitivity studies or eliciting expert opinion can be used to address the uncertainties in assumptions, and the level of confidence in parameter values can be characterised using probability distributions (pdfs). Overall, the distinction between aleatory and epistemic uncertainties is not always clear, and both can be treated mathematically in the same way. Lessons on safety assessments that can be learnt from experience at nuclear power plants are that beliefs about what is important can be wrong if a risk assessment is not performed. Also, precautionary approaches are not always conservative if failure modes are not identified. 
Nevertheless, it is important to recognize that uncertainties will remain despite a quantitative risk assessment: e.g., is the scenario list complete, are the models accepted as reasonable, and are parameter probability distributions representative of

  12. A risk explicit interval linear programming model for uncertainty-based environmental economic optimization in the Lake Fuxian watershed, China.

    Science.gov (United States)

    Zhang, Xiaoling; Huang, Kai; Zou, Rui; Liu, Yong; Yu, Yajuan

    2013-01-01

    The conflict between water environment protection and economic development has brought severe water pollution and restricted sustainable development in the watershed. A risk explicit interval linear programming (REILP) method was used to solve the integrated watershed environmental-economic optimization problem. Interval linear programming (ILP) and REILP models for uncertainty-based environmental economic optimization at the watershed scale were developed for the management of Lake Fuxian watershed, China. Scenario analysis was introduced into the model solution process to ensure the practicality and operability of optimization schemes. Decision makers' preferences for risk levels can be expressed through inputting different discrete aspiration level values into the REILP model in three periods under two scenarios. Through balancing the optimal system returns and corresponding system risks, decision makers can develop an efficient industrial restructuring scheme based directly on the window of "low risk and high return efficiency" in the trade-off curve. The representative schemes at the turning points of the two scenarios were interpreted and compared to identify a preferable planning alternative, which has relatively low risks and nearly maximum benefits. This study provides new insights and proposes a tool, REILP, for decision makers to develop an effective environmental economic optimization scheme in integrated watershed management.
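
    The interval bookkeeping underlying ILP/REILP-style analysis, objective bounds under interval coefficients plus a simple dimensionless risk level, can be sketched for fixed candidate schemes. This is not the REILP formulation itself, and every scheme, coefficient, and cap below is hypothetical.

```python
# Hypothetical industrial restructuring schemes: activity levels for two sectors
schemes = {"status_quo": (1.0, 1.0), "restructure": (0.6, 1.3)}
# Interval net-benefit coefficients [low, high] per unit of each sector
benefit = [(40.0, 60.0), (25.0, 35.0)]
# Interval pollutant-load coefficients and an allowed total load cap
load, cap = [(8.0, 12.0), (3.0, 5.0)], 16.0

def evaluate(x):
    """Interval objective bounds plus a simple risk level: the fraction of
    the upper-bound benefit that may be forfeited in the worst case."""
    b_low = sum(c[0] * xi for c, xi in zip(benefit, x))
    b_high = sum(c[1] * xi for c, xi in zip(benefit, x))
    worst_load = sum(c[1] * xi for c, xi in zip(load, x))
    feasible = worst_load <= cap          # robust (worst-case) feasibility
    risk = (b_high - b_low) / b_high
    return b_low, b_high, risk, feasible

results = {name: evaluate(x) for name, x in schemes.items()}
```

    Plotting risk against the benefit bounds for many such schemes produces the kind of trade-off curve from which the "low risk and high return efficiency" window is read.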

  13. A Risk Explicit Interval Linear Programming Model for Uncertainty-Based Environmental Economic Optimization in the Lake Fuxian Watershed, China

    Directory of Open Access Journals (Sweden)

    Xiaoling Zhang

    2013-01-01

    Full Text Available The conflict between water environment protection and economic development has brought severe water pollution and restricted sustainable development in the watershed. A risk explicit interval linear programming (REILP) method was used to solve the integrated watershed environmental-economic optimization problem. Interval linear programming (ILP) and REILP models for uncertainty-based environmental economic optimization at the watershed scale were developed for the management of Lake Fuxian watershed, China. Scenario analysis was introduced into the model solution process to ensure the practicality and operability of optimization schemes. Decision makers' preferences for risk levels can be expressed through inputting different discrete aspiration level values into the REILP model in three periods under two scenarios. Through balancing the optimal system returns and corresponding system risks, decision makers can develop an efficient industrial restructuring scheme based directly on the window of "low risk and high return efficiency" in the trade-off curve. The representative schemes at the turning points of the two scenarios were interpreted and compared to identify a preferable planning alternative, which has relatively low risks and nearly maximum benefits. This study provides new insights and proposes a tool, REILP, for decision makers to develop an effective environmental economic optimization scheme in integrated watershed management.

  14. A Proposal of Estimation Methodology to Improve Calculation Efficiency of Sampling-based Method in Nuclear Data Sensitivity and Uncertainty Analysis

    International Nuclear Information System (INIS)

    Song, Myung Sub; Kim, Song Hyun; Kim, Jong Kyung; Noh, Jae Man

    2014-01-01

    The uncertainty in the sampling-based method is evaluated by repeating transport calculations with a number of cross section data sets sampled from the covariance uncertainty data. In the transport calculation with the sampling-based method, the transport equation is not modified; therefore, all uncertainties of the responses, such as k_eff, reaction rates, flux, and power distribution, can be obtained directly at one time without code modification. However, a major drawback of the sampling-based method is that it requires a heavy computational load to obtain statistically reliable results (at the 0.95 confidence level) in the uncertainty analysis. The purpose of this study is to develop a method for improving the computational efficiency and obtaining highly reliable uncertainty results when using the sampling-based method with Monte Carlo simulation. The proposed method reduces the convergence time of the response uncertainty by using multiple sets of sampled group cross sections in a single Monte Carlo simulation. The proposed method was verified on the GODIVA benchmark problem, and the results were compared with those of the conventional sampling-based method. In this study, a sampling-based method based on the central limit theorem is proposed to improve calculation efficiency by reducing the number of repetitive Monte Carlo transport calculations required to obtain reliable uncertainty analysis results. Each set of sampled group cross sections is assigned to an active cycle group in a single Monte Carlo simulation. The criticality uncertainty for the GODIVA problem is evaluated by the proposed and previous methods. The results show that the proposed sampling-based method can efficiently decrease the number of Monte Carlo simulations required to evaluate the uncertainty of k_eff. It is expected that the proposed method will improve the computational efficiency of uncertainty analysis with the sampling-based method
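The sampling-based propagation described above can be sketched in a few lines; here a smooth toy function stands in for the transport solver, and the two-group cross sections and covariance values are purely illustrative, not real nuclear data:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical 2-group cross sections and their covariance (illustrative only).
mean_xs = np.array([1.20, 0.80])
cov_xs = np.array([[0.0004, 0.0001],
                   [0.0001, 0.0009]])

def toy_keff(xs):
    # Stand-in for a transport calculation: k_eff as a smooth
    # function of the group cross sections (NOT a real solver).
    return 1.0 + 0.05 * xs[0] - 0.03 * xs[1]

# Sampling-based propagation: repeat the "transport calculation"
# with cross sections drawn from the covariance data.
samples = rng.multivariate_normal(mean_xs, cov_xs, size=2000)
keff = np.array([toy_keff(s) for s in samples])

mean, std = keff.mean(), keff.std(ddof=1)
ci95 = 1.96 * std / np.sqrt(len(keff))  # confidence interval on the mean
print(f"k_eff = {mean:.5f} +/- {std:.5f} (uncertainty), mean CI +/-{ci95:.5f}")
```

The study's speedup idea amounts to reusing a single simulation's active cycles for the different sampled cross-section sets instead of rerunning the whole transport calculation per sample.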

  15. Assessing uncertainty and risk in exploited marine populations

    International Nuclear Information System (INIS)

    Fogarty, M.J.; Mayo, R.K.; O'Brien, L.; Serchuk, F.M.; Rosenberg, A.A.

    1996-01-01

    The assessment and management of exploited fish and invertebrate populations is subject to several types of uncertainty. This uncertainty translates into risk to the population in the development and implementation of fishery management advice. Here, we define risk as the probability that exploitation rates will exceed a threshold level where the long-term sustainability of the stock is threatened. We distinguish among several sources of error or uncertainty due to (a) stochasticity in demographic rates and processes, particularly in survival rates during the early life stages; (b) measurement error resulting from sampling variation in the determination of population parameters or in model estimation; and (c) the lack of complete information on population and ecosystem dynamics. The first represents a form of aleatory uncertainty, while the latter two represent forms of epistemic uncertainty. To illustrate these points, we evaluate the recent status of the Georges Bank cod stock in a risk assessment framework. Short-term stochastic projections are made accounting for uncertainty in population size and for random variability in the number of young surviving to enter the fishery. We show that recent declines in this cod stock can be attributed to exploitation rates that have substantially exceeded sustainable levels
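A minimal version of such a short-term stochastic projection might look as follows; all population parameters are hypothetical placeholders, not Georges Bank assessment values:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical stock-projection inputs (illustrative only).
n0_median = 50_000.0   # current population estimate
n0_cv = 0.25           # measurement error (CV) on that estimate
f_rate, m = 0.6, 0.2   # exploitation rate and natural mortality
years, trials = 3, 10_000

n_final = np.empty(trials)
for t in range(trials):
    n = rng.lognormal(np.log(n0_median), n0_cv)       # uncertain starting size
    for _ in range(years):
        recruits = rng.lognormal(np.log(8_000), 0.6)  # stochastic recruitment
        n = n * np.exp(-(f_rate + m)) + recruits      # survival plus recruits
    n_final[t] = n

# Risk-style summary: probability the stock ends below a reference level.
p_below = np.mean(n_final < 30_000)
print(f"P(final N < 30,000) = {p_below:.2f}")
```

With an exploitation rate well above a sustainable level, most projected trajectories end below the reference level, mirroring the qualitative conclusion of the abstract.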

  16. Risk newsboy: approach for addressing uncertainty in developing action levels and cleanup limits

    International Nuclear Information System (INIS)

    Cooke, Roger; MacDonell, Margaret

    2007-01-01

    Site cleanup decisions involve developing action levels and residual limits for key contaminants, to assure health protection during the cleanup period and into the long term. Uncertainty is inherent in the toxicity information used to define these levels, based on incomplete scientific knowledge regarding dose-response relationships across various hazards and exposures at environmentally relevant levels. This problem can be addressed by applying principles used to manage uncertainty in operations research, as illustrated by the newsboy dilemma. Each day a newsboy must balance the risk of buying more papers than he can sell against the risk of not buying enough. Setting action levels and cleanup limits involves a similar concept of balancing and distributing risks and benefits in the face of uncertainty. The newsboy approach can be applied to develop health-based target concentrations for both radiological and chemical contaminants, with stakeholder input being crucial to assessing 'regret' levels. Associated tools include structured expert judgment elicitation to quantify uncertainty in the dose-response relationship, and mathematical techniques such as probabilistic inversion and iterative proportional fitting. (authors)
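The newsboy (newsvendor) balance has a well-known closed-form solution, the critical fractile of the demand distribution; the sketch below uses illustrative regret costs and a hypothetical demand distribution, not values from the study:

```python
import numpy as np

# Newsboy dilemma: balance the regret of over-buying papers against the
# regret of under-buying, under demand uncertainty. The optimal order
# quantity is the critical fractile of the demand CDF:
#   q* = F^{-1}( c_u / (c_u + c_o) )
# c_u = unit regret for under-stocking (lost profit)
# c_o = unit regret for over-stocking (unsold paper)

c_u, c_o = 0.50, 0.25                # illustrative regret costs
fractile = c_u / (c_u + c_o)         # = 2/3

demand = np.random.default_rng(0).normal(100, 15, 50_000)  # uncertain demand
q_star = np.quantile(demand, fractile)
print(f"critical fractile = {fractile:.3f}, order quantity = {q_star:.1f}")
```

For cleanup limits the same logic applies with stakeholder-elicited "regret" costs for over- and under-protection replacing the paper costs.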

  17. Computer-Based Model Calibration and Uncertainty Analysis: Terms and Concepts

    Science.gov (United States)

    2015-07-01

    Uncertainty analyses throughout the lifecycle of planning, designing, and operating Civil Works flood risk management projects as described in... Education 140:3–14. Doherty, J. 2004. PEST: Model-independent parameter estimation, User Manual. 5th ed. Brisbane, Queensland, Australia: Watermark

  18. Uncertainty analysis of neutron transport calculation

    International Nuclear Information System (INIS)

    Oka, Y.; Furuta, K.; Kondo, S.

    1987-01-01

    A cross section sensitivity-uncertainty analysis code, SUSD, was developed. The code calculates sensitivity coefficients for one- and two-dimensional transport problems based on first-order perturbation theory. The variance and standard deviation of detector responses or design parameters can be obtained using the cross section covariance matrix. The code is able to perform sensitivity-uncertainty analysis for the secondary neutron angular distribution (SAD) and the secondary neutron energy distribution (SED). Covariances of ⁶Li and ⁷Li neutron cross sections in JENDL-3PR1 were evaluated, including SAD and SED. Covariances of Fe and Be were also evaluated. The uncertainty of the tritium breeding ratio, fast neutron leakage flux, and neutron heating was analysed for four types of blanket concepts for a commercial tokamak fusion reactor. The uncertainty of the tritium breeding ratio was less than 6 percent. Contributions from SAD/SED uncertainties are significant for some parameters. Formulas to estimate the errors of the numerical solution of the transport equation were derived based on perturbation theory. This method enables us to deterministically estimate the numerical errors due to iterative solution, spatial discretization, and Legendre polynomial expansion of transfer cross-sections. The calculational errors of the tritium breeding ratio and the fast neutron leakage flux of the fusion blankets were analysed. (author)

  19. Stochastic analysis in production process and ecology under uncertainty

    CERN Document Server

    Bieda, Bogusław

    2014-01-01

    The monograph addresses stochastic analysis based on uncertainty assessment by simulation and the application of this method to ecology and the steel industry under uncertainty. The first chapter defines the Monte Carlo (MC) method and random variables in stochastic models. Chapter two deals with contaminant transport in porous media; a stochastic approach to modeling Municipal Solid Waste contaminant transit times using MC simulation is worked out. The third chapter describes the risk analysis of the waste-to-energy facility proposal for the city of Konin, including the financial aspects. An environmental impact assessment of the ArcelorMittal Steel Power Plant in Kraków is given in chapter four, in which four scenarios of the energy mix production processes were studied. Chapter five contains examples of using ecological Life Cycle Assessment (LCA) - a relatively new method of environmental impact assessment - which help in preparing pro-ecological strategy, and which can lead to reducing t...

  20. Economic–Environmental Sustainability in Building Projects: Introducing Risk and Uncertainty in LCCE and LCCA

    Directory of Open Access Journals (Sweden)

    Elena Fregonara

    2018-06-01

    Full Text Available The aim of this paper is to propose a methodology for supporting decision-making in the design stages of new buildings or in the retrofitting of the existing building stock. The focus is on the evaluation of economic–environmental sustainability, considering the presence of risk and uncertainty. An application of risk analysis in conjunction with Life-Cycle Cost Analysis (LCCA is proposed for selecting the preferable solution among technological options, which represents a recent and poorly explored context of analysis. It is assumed that there is uncertainty in cost estimating, in terms of the Life-Cycle Cost Estimates (LCCEs, and uncertainty in the technical performance inputs of the life-cycle cost analysis. According to the probability analysis, which was solved through stochastic simulation and the Monte Carlo Method (MCM, risk and uncertainty are modeled as stochastic variables or as “stochastic relevant cost drivers”. Coherently, the economic–financial and energy–environmental sustainability is analyzed through the calculation of a conjoint “economic–environmental indicator”, in terms of the stochastic global cost. A case study of a multifunctional building glass façade project in Northern Italy is presented. The application demonstrates that introducing flexibility into the input data (the duration of the service lives of components and the economic and environmental behavior of alternative scenarios can lead to results opposite to those of a deterministic analysis. The results give full evidence of the environmental variables’ capacity to significantly perturb the model output.
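A stochastic global-cost calculation of this kind can be sketched with the Monte Carlo Method; the cost drivers, distributions, and values below are invented for illustration and are not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20_000

# Stochastic LCCA sketch: global cost = investment + discounted annual costs,
# with uncertain discount rate, energy cost, and component service life.
invest = 250_000.0
rate = rng.normal(0.03, 0.005, n)                    # uncertain discount rate
energy0 = rng.triangular(9_000, 11_000, 15_000, n)   # annual energy cost
life = rng.integers(25, 36, n)                       # service life in years

global_cost = np.empty(n)
for i in range(n):
    years = np.arange(1, life[i] + 1)
    disc = (1 + rate[i]) ** -years                   # discount factors
    global_cost[i] = invest + np.sum((energy0[i] + 1_500) * disc)

p5, p50, p95 = np.percentile(global_cost, [5, 50, 95])
print(f"stochastic global cost: median {p50:,.0f}, "
      f"90% interval [{p5:,.0f}, {p95:,.0f}]")
```

Comparing such percentile bands across design alternatives, rather than single deterministic totals, is what allows the ranking of options to flip, as the abstract notes.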

  1. [Uncertainty characterization approaches for ecological risk assessment of polycyclic aromatic hydrocarbon in Taihu Lake].

    Science.gov (United States)

    Guo, Guang-Hui; Wu, Feng-Chang; He, Hong-Ping; Feng, Cheng-Lian; Zhang, Rui-Qing; Li, Hui-Xian

    2012-04-01

    Probabilistic approaches, such as Monte Carlo Sampling (MCS) and Latin Hypercube Sampling (LHS), and non-probabilistic approaches, such as interval analysis, fuzzy set theory, and variance propagation, were used to characterize uncertainties associated with the risk assessment of ΣPAH8 in the surface water of Taihu Lake. The results from MCS and LHS were represented by probability distributions of the hazard quotient of ΣPAH8 in the surface waters of Taihu Lake, which indicated that the confidence intervals of the hazard quotient at the 90% confidence level were 0.00018-0.89 and 0.00017-0.92, with means of 0.37 and 0.35, respectively. In addition, the probabilities that the hazard quotients from MCS and LHS exceed the threshold of 1 were 9.71% and 9.68%, respectively. The sensitivity analysis suggested the toxicity data contributed the most to the resulting distribution of quotients. The hazard quotient of ΣPAH8 to aquatic organisms ranged from 0.00017 to 0.99 using interval analysis. The confidence interval at the 90% confidence level was (0.0015, 0.0163) calculated using fuzzy set theory, and (0.00016, 0.88) based on variance propagation. These results indicated that the ecological risk of ΣPAH8 to aquatic organisms was low. Each method, being based on a different theory, has its own advantages and limitations; therefore, the appropriate method should be selected case by case to quantify the effects of uncertainties on the ecological risk assessment. The approach based on probabilistic theory was selected as the most appropriate to assess the risk of ΣPAH8 in the surface water of Taihu Lake, providing an important scientific foundation for the risk management and control of organic pollutants in water.
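The MCS and LHS propagation of a hazard quotient can be illustrated as follows; the lognormal parameters are hypothetical, not the Taihu Lake data, and the LHS helper is a simple stratified-sampling sketch:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n = 10_000

# Hazard quotient HQ = exposure concentration / toxicity threshold,
# both lognormal here (illustrative parameters only).
mu_c, sd_c = np.log(0.05), 0.9   # ln-space mean/sd of concentration
mu_t, sd_t = np.log(1.0), 0.5    # ln-space mean/sd of toxicity value

# --- Monte Carlo Sampling (MCS): plain random draws ---
hq_mcs = rng.lognormal(mu_c, sd_c, n) / rng.lognormal(mu_t, sd_t, n)

# --- Latin Hypercube Sampling (LHS): one draw per probability stratum ---
def lhs_lognormal(mu, sd, n, rng):
    u = (rng.permutation(n) + rng.uniform(size=n)) / n  # stratified uniforms
    return np.exp(mu + sd * stats.norm.ppf(u))

hq_lhs = lhs_lognormal(mu_c, sd_c, n, rng) / lhs_lognormal(mu_t, sd_t, n, rng)

for name, hq in [("MCS", hq_mcs), ("LHS", hq_lhs)]:
    lo, hi = np.percentile(hq, [5, 95])
    print(f"{name}: P(HQ > 1) = {np.mean(hq > 1):.4f}, "
          f"90% interval [{lo:.4f}, {hi:.4f}]")
```

Both samplers target the same distribution; LHS simply stratifies the draws so the tails and percentiles converge with fewer samples.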

  2. Sensitivity functions for uncertainty analysis: Sensitivity and uncertainty analysis of reactor performance parameters

    International Nuclear Information System (INIS)

    Greenspan, E.

    1982-01-01

    This chapter presents the mathematical basis for sensitivity functions, discusses their physical meaning and the information they contain, and clarifies a number of issues concerning their application, including the definition of group sensitivities, the selection of sensitivity functions to be included in the analysis, and limitations of sensitivity theory. Examines the theoretical foundation; criticality reset sensitivities; group sensitivities and uncertainties; selection of sensitivities included in the analysis; and other uses and limitations of sensitivity functions. Gives the theoretical formulation of sensitivity functions pertaining to "as-built" designs for performance parameters of the form of ratios of linear flux functionals (such as reaction-rate ratios), linear adjoint functionals, bilinear functions (such as reactivity worth ratios), and for reactor reactivity. Offers a consistent procedure for reducing energy-dependent or fine-group sensitivities and uncertainties to broad-group sensitivities and uncertainties. Provides illustrations of sensitivity functions as well as references to available compilations of such functions and of total sensitivities. Indicates limitations of sensitivity theory originating from the fact that this theory is based on first-order perturbation theory

  3. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, appendices A and B

    International Nuclear Information System (INIS)

    Harper, F.T.; Young, M.L.; Miller, L.A.; Hora, S.C.; Lui, C.H.; Goossens, L.H.J.; Cooke, R.M.; Paesler-Sauer, J.; Helton, J.C.

    1995-01-01

    Two new probabilistic accident consequence codes, MACCS and COSYMA, completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, developed independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the second of a three-volume document describing the project and contains two appendices describing the rationales for the dispersion and deposition data along with short biographies of the 16 experts who participated in the project

  4. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, appendices A and B

    Energy Technology Data Exchange (ETDEWEB)

    Harper, F.T.; Young, M.L.; Miller, L.A. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States); Lui, C.H. [Nuclear Regulatory Commission, Washington, DC (United States); Goossens, L.H.J.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Paesler-Sauer, J. [Research Center, Karlsruhe (Germany); Helton, J.C. [and others

    1995-01-01

    Two new probabilistic accident consequence codes, MACCS and COSYMA, completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, developed independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the second of a three-volume document describing the project and contains two appendices describing the rationales for the dispersion and deposition data along with short biographies of the 16 experts who participated in the project.

  5. From risk management to uncertainty management: a significant change in project management

    Institute of Scientific and Technical Information of China (English)

    LI Gui-jun; ZHANG Yue-song

    2006-01-01

    Starting with the meanings of the terms "risk" and "uncertainty," the paper compares uncertainty management with risk management in project management. We question the interchangeable use of "risk" and "uncertainty" in project management and argue that their scope, methods, responses, monitoring, and control should differ as well. Illustrations are given covering terminology, description, and treatment from the different perspectives of uncertainty management and risk management. Furthermore, the paper maintains that project risk management (PRM) processes might be modified to facilitate an uncertainty management perspective, and we argue that project uncertainty management (PUM) can enlarge its contribution to improving project management performance, which will result in a significant change in emphasis compared with most risk management.

  6. Two approaches to incorporate clinical data uncertainty into multiple criteria decision analysis for benefit-risk assessment of medicinal products.

    Science.gov (United States)

    Wen, Shihua; Zhang, Lanju; Yang, Bo

    2014-07-01

    The Problem formulation, Objectives, Alternatives, Consequences, Trade-offs, Uncertainties, Risk attitude, and Linked decisions (PrOACT-URL) framework and multiple criteria decision analysis (MCDA) have been recommended by the European Medicines Agency for structured benefit-risk assessment of medicinal products undergoing regulatory review. The objective of this article was to provide solutions for incorporating the uncertainty in clinical data into the MCDA model when evaluating the overall benefit-risk profiles of different treatment options. Two statistical approaches, the δ-method approach and the Monte-Carlo approach, were proposed to construct the confidence interval of the overall benefit-risk score from the MCDA model as well as other probabilistic measures for comparing the benefit-risk profiles between treatment options. Both approaches can incorporate the correlation structure between clinical parameters (criteria) in the MCDA model and are straightforward to implement. The two proposed approaches were applied to a case study evaluating the benefit-risk profile of an add-on therapy for rheumatoid arthritis (drug X) relative to placebo. This demonstrated a straightforward way to quantify the impact of clinical-data uncertainty on the benefit-risk assessment and enabled statistical inference for evaluating the overall benefit-risk profiles of different treatment options. The δ-method approach provides a closed form to quantify the variability of the overall benefit-risk score in the MCDA model, whereas the Monte-Carlo approach is more computationally intensive but can yield the true sampling distribution for statistical inference. The confidence intervals and other probabilistic measures obtained from the two approaches enhance the benefit-risk decision making of medicinal products. Copyright © 2014 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
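The Monte-Carlo approach to propagating clinical-data uncertainty through an MCDA score can be sketched as below; the weights, criterion estimates, and standard errors are all hypothetical, not values from the case study:

```python
import numpy as np

rng = np.random.default_rng(11)
n = 20_000

# MCDA sketch: overall benefit-risk score = weighted sum of normalized
# criterion values (one benefit, two risks; weights are hypothetical).
weights = np.array([0.4, 0.3, 0.3])

# Clinical estimates (mean, SE) on a 0-1 preference scale; the Monte-Carlo
# approach propagates their sampling uncertainty into the overall score.
drug_mean, drug_se = np.array([0.70, 0.80, 0.60]), np.array([0.05, 0.04, 0.06])
plac_mean, plac_se = np.array([0.40, 0.90, 0.85]), np.array([0.05, 0.03, 0.04])

drug = rng.normal(drug_mean, drug_se, (n, 3)) @ weights
plac = rng.normal(plac_mean, plac_se, (n, 3)) @ weights

diff = drug - plac
lo, hi = np.percentile(diff, [2.5, 97.5])
print(f"score difference: mean {diff.mean():.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
print(f"P(drug score > placebo score) = {np.mean(diff > 0):.3f}")
```

Correlated criteria would simply replace the independent normal draws with draws from a joint (e.g. multivariate normal) distribution.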

  7. Uncertainties in thick-target PIXE analysis

    International Nuclear Information System (INIS)

    Campbell, J.L.; Cookson, J.A.; Paul, H.

    1983-01-01

    Thick-target PIXE analysis involves uncertainties arising from the calculation of thick-target X-ray production in addition to the usual PIXE uncertainties. The calculation demands knowledge of ionization cross-sections, stopping powers, and photon attenuation coefficients. Information on these is reviewed critically, and a computational method is used to estimate the uncertainties transmitted from this database into the results of thick-target PIXE analyses, with reference to particular specimen types using beams of 2-3 MeV protons. A detailed assessment of the accuracy of thick-target PIXE is presented. (orig.)

  8. Phenomenological uncertainty analysis of early containment failure at severe accident of nuclear power plant

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Su Won

    2011-02-15

    The severe accident has inherently significant uncertainty due to the wide range of conditions, and performing experiments, validation, and practical application is extremely difficult because of the high temperatures and pressures involved. Although internal and external research has been carried out, the references used in Korean nuclear plants were foreign data from the 1980s, and safety analysis such as probabilistic safety assessment has not applied the newest methodology. Also, containment pressure is applied as a point value from thermal-hydraulic analysis to identify the probability of containment failure in Level 2 PSA. In this paper, uncertainty analysis methods for the severe accident phenomena influencing early containment failure were developed, an uncertainty analysis applying them to Korean nuclear plants using the MELCOR code was performed, and the distribution of containment pressure is presented as a result of the uncertainty analysis. Because early containment failure is an important factor in the Large Early Release Frequency (LERF), which is used as a representative criterion for decision-making in nuclear power plants, it was selected in this paper among the various modes of containment failure. Important phenomena of early containment failure during a severe accident, based on previous research, were identified, and a seven-step methodology to evaluate uncertainty was developed. The MELCOR input for analysis of the severe accident, reflecting natural circulation flow, was developed, and the accident scenario for station blackout, the representative initiating event for early containment failure, was determined. By reviewing the internal models and correlations of the MELCOR model relevant to important phenomena of early containment failure, the factors which could affect the uncertainty were found, and the major factors were identified through sensitivity analysis. In order to determine the total number of MELCOR calculations which can

  9. Assessing uncertainty in extreme events: Applications to risk-based decision making in interdependent infrastructure sectors

    International Nuclear Information System (INIS)

    Barker, Kash; Haimes, Yacov Y.

    2009-01-01

    Risk-based decision making often relies upon expert probability assessments, particularly for the consequences of disruptive events and when such events are extreme or catastrophic in nature. Naturally, such expert-elicited probability distributions can be fraught with errors, as they describe events which occur very infrequently and for which only sparse data exist. This paper presents a quantitative framework, the extreme event uncertainty sensitivity impact method (EE-USIM), for measuring the sensitivity of extreme event consequences to uncertainties in the parameters of the underlying probability distribution. The EE-USIM is demonstrated with the Inoperability input-output model (IIM), a model with which to evaluate the propagation of inoperability throughout an interdependent set of economic and infrastructure sectors. The EE-USIM also makes use of a two-sided power distribution function generated by expert elicitation of extreme event consequences
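The IIM underlying the EE-USIM propagates inoperability through sectors via the equilibrium relation q = A*q + c*, solved as q = (I - A*)^(-1) c*; the interdependency matrix below is a hypothetical three-sector example, not data from the paper:

```python
import numpy as np

# Inoperability input-output model (IIM) sketch: the inoperability q of each
# sector satisfies q = A* q + c*, where A* is the interdependency matrix and
# c* the direct perturbation from a disruptive event (values hypothetical).
a_star = np.array([[0.0, 0.3, 0.1],
                   [0.2, 0.0, 0.2],
                   [0.1, 0.4, 0.0]])
c_star = np.array([0.10, 0.00, 0.00])  # extreme event hits sector 0 directly

q = np.linalg.solve(np.eye(3) - a_star, c_star)
print("equilibrium inoperability per sector:", np.round(q, 4))
```

An EE-USIM-style sensitivity study would then perturb the parameters of the elicited distribution for c* and re-solve, tracking how the tail consequences respond.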

  10. Probabilistic risk assessment for CO2 storage in geological formations: robust design and support for decision making under uncertainty

    Science.gov (United States)

    Oladyshkin, Sergey; Class, Holger; Helmig, Rainer; Nowak, Wolfgang

    2010-05-01

    CO2 storage in geological formations is currently being discussed intensively as a technology for mitigating CO2 emissions. However, any large-scale application requires a thorough analysis of the potential risks. Current numerical simulation models are too expensive for probabilistic risk analysis and for stochastic approaches based on brute-force repeated simulation. Even single deterministic simulations may require parallel high-performance computing. The multiphase flow processes involved are too non-linear for quasi-linear error propagation and other simplified stochastic tools. As an alternative approach, we propose a massive stochastic model reduction based on the probabilistic collocation method. The model response is projected onto an orthogonal basis of higher-order polynomials to approximate dependence on uncertain parameters (porosity, permeability, etc.) and design parameters (injection rate, depth, etc.). This allows for a non-linear propagation of model uncertainty affecting the predicted risk, ensures fast computation, and provides a powerful tool for combining design variables and uncertain variables into one approach based on an integrative response surface. Thus, the design task of finding optimal injection regimes explicitly includes uncertainty, which leads to robust designs of the non-linear system that minimize failure probability and provide valuable support for risk-informed management decisions. We validate our proposed stochastic approach by Monte Carlo simulation using a common 3D benchmark problem (Class et al., Computational Geosciences 13, 2009). A reasonable compromise between computational efforts and precision was reached already with second-order polynomials. In our case study, the proposed approach yields a significant computational speedup by a factor of 100 compared to Monte Carlo simulation. We demonstrate that, due to the non-linearity of the flow and transport processes during CO2 injection, including uncertainty in the analysis
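The collocation idea can be illustrated with a one-parameter sketch: fit a low-order polynomial surrogate to a handful of runs of an "expensive" model, then do cheap Monte Carlo on the surrogate. A real probabilistic collocation method would place the collocation points at roots of orthogonal polynomials; the model function here is a stand-in, not a CO2-storage simulator:

```python
import numpy as np

rng = np.random.default_rng(5)

def expensive_model(x):
    # Stand-in for a CO2-storage simulator: response (e.g. a leakage measure)
    # as a nonlinear function of one uncertain parameter (e.g. log-permeability).
    return np.exp(0.4 * x) + 0.1 * x**2

# Surrogate: fit a 2nd-order polynomial to five model runs at chosen
# collocation points, then sample the cheap surrogate instead of the model.
x_colloc = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
coeffs = np.polyfit(x_colloc, expensive_model(x_colloc), deg=2)

x_mc = rng.standard_normal(100_000)        # uncertain input
y_surrogate = np.polyval(coeffs, x_mc)     # 100k "runs" at negligible cost

print(f"mean response = {y_surrogate.mean():.3f}, "
      f"P(response > 2) = {np.mean(y_surrogate > 2):.4f}")
```

Five real model runs buy a response surface on which failure probabilities can be estimated by brute-force sampling, which is the source of the speedup the abstract reports.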

  11. Political uncertainty and firm risk in China

    Directory of Open Access Journals (Sweden)

    Danglun Luo

    2017-12-01

    Full Text Available The political uncertainty surrounding the turnover of government officials has a major impact on local economies and local firms. This paper investigates the relationship between the turnover of prefecture-city officials and the inherent risk faced by local firms in China. Using data from 1999 to 2012, we find that prefecture-city official turnovers significantly increased firm risk. Our results show that the political risk was mitigated when new prefecture-city officials were well connected with their provincial leaders. In addition, the impact of political uncertainty was more pronounced for regulated firms and firms residing in provinces with low market openness.

  12. Improving default risk prediction using Bayesian model uncertainty techniques.

    Science.gov (United States)

    Kazemi, Reza; Mosleh, Ali

    2012-11-01

    Credit risk is the potential exposure of a creditor to an obligor's failure or refusal to repay the debt in principal or interest. This potential exposure is measured in terms of the probability of default. Many models have been developed to estimate credit risk, with rating agencies dating back to the 19th century; they provide their assessments of the probability of default and the transition probabilities of various firms in their annual reports. Regulatory capital requirements for credit risk outlined by the Basel Committee on Banking Supervision have made it essential for banks and financial institutions to develop sophisticated models in an attempt to measure credit risk with higher accuracy. The Bayesian framework proposed in this article uses techniques developed in the physical sciences and engineering for dealing with model uncertainty and expert accuracy to obtain improved estimates of credit risk and associated uncertainties. The approach uses estimates from one or more rating agencies and incorporates their historical accuracy (past performance data) in estimating future default risk and transition probabilities. Several examples demonstrate that the proposed methodology can assess default probability with accuracy exceeding the estimations of all the individual models. Moreover, the methodology accounts for potentially significant departures from "nominal predictions" due to "upsetting events" such as the 2008 global banking crisis. © 2012 Society for Risk Analysis.
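One simple way to blend an agency's estimate with its historical track record is conjugate Bayesian updating; the Beta-Binomial sketch below is a simplified stand-in for the article's framework, with all numbers invented:

```python
import numpy as np
from scipy import stats

# Bayesian sketch: treat an agency's published default probability as a
# Beta prior whose effective sample size reflects the agency's historical
# accuracy, then update with observed default data (numbers hypothetical).
agency_pd = 0.02        # agency's annual probability-of-default estimate
accuracy_ess = 200      # pseudo-observations credited to the agency
a0, b0 = agency_pd * accuracy_ess, (1 - agency_pd) * accuracy_ess

defaults, firms = 9, 250  # observed defaults in a comparable cohort
a1, b1 = a0 + defaults, b0 + firms - defaults

posterior = stats.beta(a1, b1)
lo, hi = posterior.ppf([0.05, 0.95])
print(f"posterior PD mean = {posterior.mean():.4f}, "
      f"90% CI [{lo:.4f}, {hi:.4f}]")
```

A more accurate agency gets a larger effective sample size, so its estimate moves less when new default data arrive, which is the intuition behind weighting sources by past performance.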

  13. Joint analysis of epistemic and aleatory uncertainty in stability analysis for geo-hazard assessments

    Science.gov (United States)

    Rohmer, Jeremy; Verdel, Thierry

    2017-04-01

    Uncertainty analysis is an unavoidable task of the stability analysis of any geotechnical system. Such analysis usually relies on the safety factor SF: if SF is below some specified threshold, failure is possible. The objective of the stability analysis is then to estimate the failure probability P that SF falls below the specified threshold. When dealing with uncertainties, two facets should be considered, as outlined by several authors in the domain of geotechnics, namely "aleatoric uncertainty" (also named "randomness" or "intrinsic variability") and "epistemic uncertainty" (i.e., when facing "vague, incomplete or imprecise information" such as limited databases and observations or "imperfect" modelling). The benefits of separating both facets of uncertainty can be seen from a risk management perspective because: - Aleatoric uncertainty, being a property of the system under study, cannot be reduced; however, practical actions can be taken to circumvent the potentially dangerous effects of such variability; - Epistemic uncertainty, being due to the incomplete/imprecise nature of available information, can be reduced by, e.g., increasing the number of tests (lab or in situ survey), improving the measurement methods, evaluating the calculation procedure with model tests, or confronting more information sources (expert opinions, data from literature, etc.). Uncertainty treatment in stability analysis is usually restricted to the probabilistic framework for representing both facets of uncertainty. Yet, in the domain of geo-hazard assessments (like landslides, mine pillar collapse, rockfalls, etc.), the validity of this approach can be debatable. In the present communication, we propose to review the major criticisms available in the literature against the systematic use of probability in situations of a high degree of uncertainty. On this basis, the feasibility of using a more flexible uncertainty representation tool is then investigated, namely Possibility distributions (e

  14. STOCHASTIC METHODS IN RISK ANALYSIS

    Directory of Open Access Journals (Sweden)

    Vladimíra OSADSKÁ

    2017-06-01

    Full Text Available In this paper, we review basic stochastic methods which can be used to extend state-of-the-art deterministic analytical methods for risk analysis. We conclude that the standard deterministic analytical methods depend strongly on the practical experience and knowledge of the evaluator, and that stochastic methods should therefore be introduced. New risk analysis methods should consider the uncertainties in input values. We demonstrate how large the impact on the results of the analysis can be by solving a practical FMECA example with uncertainties modelled using Monte Carlo sampling.
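
    The FMECA-with-uncertainty idea can be sketched as follows: instead of single expert ratings for severity, occurrence and detectability, each rating is sampled from a distribution and the risk priority number becomes a distribution too. The triangular ranges below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

# FMECA risk priority number (RPN = S x O x D) with uncertain expert
# ratings: each rating is a triangular distribution (min, mode, max)
# instead of a single point value. All ratings are hypothetical.
n = 50_000
severity = rng.triangular(6, 7, 9, size=n)
occurrence = rng.triangular(3, 4, 6, size=n)
detection = rng.triangular(2, 3, 5, size=n)

rpn = severity * occurrence * detection
print(f"mean RPN = {rpn.mean():.1f}")
print(f"90th percentile RPN = {np.percentile(rpn, 90):.1f}")
print(f"P(RPN > 125) = {np.mean(rpn > 125):.3f}")
```

    Reporting a percentile or an exceedance probability rather than a single RPN is what makes the ranking robust to the evaluator's rating uncertainty.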

  15. Economics versus psychology. Risk, uncertainty and the expected utility theory

    OpenAIRE

    Schilirò, Daniele

    2017-01-01

    The present contribution examines the emergence of expected utility theory by John von Neumann and Oskar Morgenstern, the subjective expected utility theory by Savage, and the problem of choice under risk and uncertainty, focusing in particular on the seminal work "The Utility Analysis of Choices Involving Risk" (1948) by Milton Friedman and Leonard Savage, to show how the evolution of the theory of choice has determined a separation of economics from psychology.

  16. Farm decision making under risk and uncertainty.

    NARCIS (Netherlands)

    Backus, G.B.C.; Eidman, V.R.; Dijkhuizen, A.A.

    1997-01-01

    Relevant portions of the risk literature are reviewed, relating them to observed behaviour in farm decision-making. Relevant topics for applied agricultural risk research are proposed. The concept of decision making under risk and uncertainty is discussed by reviewing the theory of Subjective

  17. Treating Uncertainties in a Nuclear Seismic Probabilistic Risk Assessment by Means of the Dempster-Shafer Theory of Evidence

    International Nuclear Information System (INIS)

    Lo, Chungkung; Pedroni, N.; Zio, E.

    2014-01-01

    The analyses carried out within the Seismic Probabilistic Risk Assessments (SPRAs) of Nuclear Power Plants (NPPs) are affected by significant aleatory and epistemic uncertainties. These uncertainties have to be represented and quantified coherently with the data, information and knowledge available, to provide reasonable assurance that related decisions can be taken robustly and with confidence. The amount of data, information and knowledge available for seismic risk assessment is typically limited, so that the analysis must strongly rely on expert judgments. In this paper, a Dempster-Shafer Theory (DST) framework for handling uncertainties in NPP SPRAs is proposed and applied to an example case study. The main contributions of this paper are two: (i) applying the complete DST framework to SPRA models, showing how to build the Dempster-Shafer structures of the uncertainty parameters based on industry generic data, and (ii) embedding Bayesian updating based on plant specific data into the framework. The results of the application to a case study show that the approach is feasible and effective in (i) describing and jointly propagating aleatory and epistemic uncertainties in SPRA models and (ii) providing 'conservative' bounds on the safety quantities of interest (i.e. Core Damage Frequency, CDF) that reflect the (limited) state of knowledge of the experts about the system of interest.

  18. Treating Uncertainties in a Nuclear Seismic Probabilistic Risk Assessment by Means of the Dempster-Shafer Theory of Evidence

    Energy Technology Data Exchange (ETDEWEB)

    Lo, Chungkung [Chair on Systems Science and the Energetic Challenge, Paris (France); Pedroni, N.; Zio, E. [Politecnico di Milano, Milano (Italy)

    2014-02-15

    The analyses carried out within the Seismic Probabilistic Risk Assessments (SPRAs) of Nuclear Power Plants (NPPs) are affected by significant aleatory and epistemic uncertainties. These uncertainties have to be represented and quantified coherently with the data, information and knowledge available, to provide reasonable assurance that related decisions can be taken robustly and with confidence. The amount of data, information and knowledge available for seismic risk assessment is typically limited, so that the analysis must strongly rely on expert judgments. In this paper, a Dempster-Shafer Theory (DST) framework for handling uncertainties in NPP SPRAs is proposed and applied to an example case study. The main contributions of this paper are two: (i) applying the complete DST framework to SPRA models, showing how to build the Dempster-Shafer structures of the uncertainty parameters based on industry generic data, and (ii) embedding Bayesian updating based on plant specific data into the framework. The results of the application to a case study show that the approach is feasible and effective in (i) describing and jointly propagating aleatory and epistemic uncertainties in SPRA models and (ii) providing 'conservative' bounds on the safety quantities of interest (i.e. Core Damage Frequency, CDF) that reflect the (limited) state of knowledge of the experts about the system of interest.

  19. Risk, unexpected uncertainty, and estimation uncertainty: Bayesian learning in unstable settings.

    Directory of Open Access Journals (Sweden)

    Elise Payzan-LeNestour

    Full Text Available Recently, evidence has emerged that humans approach learning using Bayesian updating rather than (model-free) reinforcement algorithms in a six-arm restless bandit problem. Here, we investigate what this implies for the human appreciation of uncertainty. In our task, a Bayesian learner distinguishes three equally salient levels of uncertainty. First, the Bayesian perceives irreducible uncertainty or risk: even knowing the payoff probabilities of a given arm, the outcome remains uncertain. Second, there is (parameter) estimation uncertainty or ambiguity: payoff probabilities are unknown and need to be estimated. Third, the outcome probabilities of the arms change: the sudden jumps are referred to as unexpected uncertainty. We document how the three levels of uncertainty evolved during the course of our experiment and how they affected the learning rate. We then zoom in on estimation uncertainty, which has been suggested to be a driving force in exploration, in spite of evidence of widespread aversion to ambiguity. Our data corroborate the latter. We discuss neural evidence that foreshadowed the ability of humans to distinguish between the three levels of uncertainty. Finally, we investigate the boundaries of the human capacity to implement Bayesian learning. We repeat the experiment with different instructions, reflecting varying levels of structural uncertainty. Under this fourth notion of uncertainty, choices were no better explained by Bayesian updating than by (model-free) reinforcement learning. Exit questionnaires revealed that participants remained unaware of the presence of unexpected uncertainty and failed to acquire the right model with which to implement Bayesian updating.
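
    The distinction between risk and estimation uncertainty can be made concrete with a minimal Beta-Bernoulli update for one bandit arm (a textbook sketch, not the authors' actual task model): the posterior over the payoff probability narrows as outcomes accumulate, while the outcome itself stays irreducibly random.

```python
# Beta-Bernoulli updating for one bandit arm. Estimation uncertainty
# (the posterior variance over the payoff probability) shrinks with
# data; risk (the Bernoulli outcome variance) does not. The outcome
# sequence is invented for illustration.
a, b = 1.0, 1.0                 # uniform Beta(1, 1) prior
outcomes = [1, 0, 1, 1, 0, 1, 1, 1]
for r in outcomes:
    a += r
    b += 1 - r

post_mean = a / (a + b)
post_var = a * b / ((a + b) ** 2 * (a + b + 1))
risk_var = post_mean * (1 - post_mean)   # irreducible outcome variance
print(f"posterior mean = {post_mean:.3f}")
print(f"estimation variance = {post_var:.4f}, risk variance = {risk_var:.3f}")
```

    Unexpected uncertainty, the third level in the record, would correspond to resetting (a, b) toward the prior when a jump in the payoff probability is detected.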

  20. Risk Analysis and Decision Making FY 2013 Milestone Report

    Energy Technology Data Exchange (ETDEWEB)

    Engel, David W.; Dalton, Angela C.; Dale, Crystal; Jones, Edward; Thompson, J.

    2013-06-01

    Risk analysis and decision making is one of the critical objectives of CCSI, which seeks to use information from science-based models with quantified uncertainty to inform decision makers who are making large capital investments. The goal of this task is to develop tools and capabilities to facilitate the development of risk models tailored for carbon capture technologies, quantify the uncertainty of model predictions, and estimate the technical and financial risks associated with the system. This effort aims to reduce costs by identifying smarter demonstrations, which could accelerate development and deployment of the technology by several years.

  1. Statistically based uncertainty analysis for ranking of component importance in the thermal-hydraulic safety analysis of the Advanced Neutron Source Reactor

    International Nuclear Information System (INIS)

    Wilson, G.E.

    1992-01-01

    The Analytic Hierarchy Process (AHP) has been used to help determine the importance of components and phenomena in thermal-hydraulic safety analyses of nuclear reactors. The AHP results are based, in part, on expert opinion. Therefore, it is prudent to evaluate the uncertainty of the AHP ranks of importance. Prior applications have addressed uncertainty with experimental data comparisons and bounding sensitivity calculations. These methods work well when a sufficient experimental database exists to justify the comparisons. However, in the case of limited or no experimental data, the size of the uncertainty is normally made conservatively large. Accordingly, the author has taken another approach, that of performing a statistically based uncertainty analysis. The new work is based on prior evaluations of the importance of components and phenomena in the thermal-hydraulic safety analysis of the Advanced Neutron Source Reactor (ANSR), a new facility now in the design phase. The uncertainty during large-break loss-of-coolant and decay heat removal scenarios is estimated by assigning a probability distribution function (pdf) to the potential error in the initial expert estimates of pair-wise importance between the components. Using a Monte Carlo sampling technique, the error pdfs are propagated through the AHP software solutions to determine a pdf of the uncertainty in the system-wide importance of each component. To enhance the generality of the results, a study of one other problem having a different number of elements is reported, as are the effects of a larger assumed pdf error in the expert ranks. Validation of the Monte Carlo sample size and repeatability is also documented.
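
    The propagation scheme the record describes can be sketched as follows: perturb each pairwise expert judgment with an error distribution, re-solve the AHP eigenvector problem per sample, and look at the spread of the resulting priority weights. The 3x3 judgment matrix and the lognormal error magnitude are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

def ahp_weights(A):
    # Priority weights from the principal eigenvector of a pairwise matrix.
    vals, vecs = np.linalg.eig(A)
    v = np.abs(vecs[:, np.argmax(vals.real)].real)
    return v / v.sum()

# Hypothetical 3x3 pairwise-importance judgments (Saaty-style scale).
A0 = np.array([[1.0, 3.0, 5.0],
               [1 / 3, 1.0, 2.0],
               [1 / 5, 1 / 2, 1.0]])

# Assign a lognormal error pdf to each expert judgment and propagate it
# through the eigenvector solution by Monte Carlo sampling.
samples = []
for _ in range(2000):
    A = A0.copy()
    for i in range(3):
        for j in range(i + 1, 3):
            A[i, j] = A0[i, j] * rng.lognormal(0.0, 0.15)
            A[j, i] = 1.0 / A[i, j]        # keep reciprocity
    samples.append(ahp_weights(A))

w = np.array(samples)
print("mean weights:", w.mean(axis=0).round(3))
print("std of weights:", w.std(axis=0).round(3))
```

    If the weight distributions of two components overlap heavily, their ranks of importance cannot be distinguished at that level of expert error, which is the practical payoff of the analysis.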

  2. Risk aversion and uncertainty in cost-effectiveness analysis: the expected-utility, moment-generating function approach.

    Science.gov (United States)

    Elbasha, Elamin H

    2005-05-01

    The availability of patient-level data from clinical trials has spurred a lot of interest in developing methods for quantifying and presenting uncertainty in cost-effectiveness analysis (CEA). Although the majority has focused on developing methods for using sample data to estimate a confidence interval for an incremental cost-effectiveness ratio (ICER), a small strand of the literature has emphasized the importance of incorporating risk preferences and the trade-off between the mean and the variance of returns to investment in health and medicine (mean-variance analysis). This paper shows how the exponential utility-moment-generating function approach is a natural extension to this branch of the literature for modelling choices from healthcare interventions with uncertain costs and effects. The paper assumes an exponential utility function, which implies constant absolute risk aversion, and is based on the fact that the expected value of this function results in a convenient expression that depends only on the moment-generating function of the random variables. The mean-variance approach is shown to be a special case of this more general framework. The paper characterizes the solution to the resource allocation problem using standard optimization techniques and derives the summary measure researchers need to estimate for each programme, when the assumption of risk neutrality does not hold, and compares it to the standard incremental cost-effectiveness ratio. The importance of choosing the correct distribution of costs and effects and the issues related to estimation of the parameters of the distribution are also discussed. An empirical example to illustrate the methods and concepts is provided. Copyright 2004 John Wiley & Sons, Ltd
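
    The exponential utility-moment-generating function idea in the record has a simple closed form when the uncertain net benefit is normal, which a short numerical check can illustrate. The risk-aversion coefficient and the benefit distribution below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)

# Exponential utility u(x) = -exp(-r*x) with constant absolute risk
# aversion r. For a normal net benefit X ~ N(mu, sigma^2), the
# moment-generating function gives a closed form:
#   E[u(X)] = -exp(-r*mu + r^2*sigma^2 / 2)  =>  CE = mu - r*sigma^2 / 2.
# All numbers are hypothetical.
r = 0.02
mu, sigma = 100.0, 30.0

ce_closed = mu - r * sigma**2 / 2

# Monte Carlo check of the same expectation.
x = rng.normal(mu, sigma, size=1_000_000)
eu = np.mean(-np.exp(-r * x))
ce_mc = -np.log(-eu) / r

print(f"closed-form CE = {ce_closed:.2f}, Monte Carlo CE = {ce_mc:.2f}")
```

    Mean-variance analysis drops out as the special case visible in the closed form: the certainty equivalent penalizes the mean by half the risk-aversion coefficient times the variance.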

  3. Uncertainty and sensitivity analysis of electro-mechanical impedance based SHM system

    International Nuclear Information System (INIS)

    Rosiek, M; Martowicz, A; Uhl, T

    2010-01-01

    The paper deals with uncertainty and sensitivity analysis performed on FE simulations of an electro-mechanical impedance based SHM system. The measurement of electro-mechanical impedance makes it possible to follow changes in the mechanical properties of the monitored structure; it can therefore be effectively applied to conclude on the presence of damage. Coupled FE simulations have been carried out to simultaneously consider both the structural dynamics and the piezoelectric properties of a simple beam with a bonded transducer. Several indexes have been used to assess damage growth. In the paper, the results obtained with both deterministic and stochastic simulations are shown and discussed. First, the relationship between the size of the introduced damage and its indexes has been studied. Second, ranges of variation of selected model properties have been assumed in order to find relationships between them and the damage indexes. The most influential parameters have been found. Finally, the overall propagation of the considered uncertainty has been assessed and the related histograms plotted to discuss the effectiveness and robustness of the tested damage indexes based on the measurement of electro-mechanical impedance.
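
    One widely used impedance-based damage index of the kind the record evaluates is the root-mean-square deviation (RMSD) between a baseline and a current impedance signature. The signatures below are synthetic placeholders standing in for transducer measurements.

```python
import numpy as np

rng = np.random.default_rng(4)

# RMSD-type damage index comparing a baseline electro-mechanical
# impedance signature with a current measurement. The signatures here
# are synthetic placeholders; real ones come from the bonded transducer.
freq = np.linspace(20e3, 80e3, 500)                   # Hz
baseline = 100.0 + 5.0 * np.sin(freq / 3e3)           # healthy signature
current = baseline + rng.normal(0.0, 0.8, freq.size)  # perturbed signature

rmsd = np.sqrt(np.sum((current - baseline) ** 2) / np.sum(baseline ** 2))
print(f"RMSD damage index = {rmsd:.4f}")
```

    In a stochastic study like the one described, this scalar would be recomputed over sampled model properties to build the histograms used to judge the index's robustness.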

  4. Addressing uncertainties in the ERICA Integrated Approach

    International Nuclear Information System (INIS)

    Oughton, D.H.; Agueero, A.; Avila, R.; Brown, J.E.; Copplestone, D.; Gilek, M.

    2008-01-01

    Like any complex environmental problem, ecological risk assessment of the impacts of ionising radiation is confounded by uncertainty. At all stages, from problem formulation through to risk characterisation, the assessment is dependent on models, scenarios, assumptions and extrapolations. These include technical uncertainties related to the data used, conceptual uncertainties associated with models and scenarios, as well as social uncertainties such as economic impacts, the interpretation of legislation, and the acceptability of the assessment results to stakeholders. The ERICA Integrated Approach has been developed to allow an assessment of the risks of ionising radiation, and includes a number of methods that are intended to make the uncertainties and assumptions inherent in the assessment more transparent to users and stakeholders. Throughout its development, ERICA has recommended that assessors deal openly with the deeper dimensions of uncertainty and acknowledge that uncertainty is intrinsic to complex systems. Since the tool is based on a tiered approach, the approaches to dealing with uncertainty vary between the tiers, ranging from a simple but highly conservative screening to a full probabilistic risk assessment including sensitivity analysis. This paper gives an overview of the types of uncertainty that are manifest in ecological risk assessment and the ERICA Integrated Approach to dealing with some of these uncertainties.

  5. Assessment of Uncertainty-Based Screening Volumes for NASA Robotic LEO and GEO Conjunction Risk Assessment

    Science.gov (United States)

    Narvet, Steven W.; Frigm, Ryan C.; Hejduk, Matthew D.

    2011-01-01

    Conjunction Assessment operations require screening assets against the space object catalog by placing a pre-determined spatial volume around each asset and predicting when another object will violate that volume. The selection of the screening volume used for each spacecraft is a trade-off between observing all conjunction events that may pose a potential risk to the primary spacecraft and the ability to analyze those predicted events. If the screening volumes are larger, more conjunctions can be observed and the probability of a missed detection of a high-risk conjunction event is therefore small; however, the amount of data that needs to be analyzed increases. This paper characterizes the sensitivity of screening volume size to capturing typical orbit uncertainties and the expected number of conjunction events observed. These sensitivities are quantified in the form of a trade space that allows for the selection of appropriate screening volumes to fit the desired concept of operations, system limitations, and tolerable analyst workloads. This analysis specifically highlights the screening volume determination and selection process for use in the NASA Conjunction Assessment Risk Analysis process, but also provides a general framework for other Owner/Operators faced with similar decisions.

  6. Simulating fuel behavior under transient conditions using FRAPTRAN and uncertainty analysis using Dakota

    International Nuclear Information System (INIS)

    Gomes, Daniel S.; Teixeira, Antonio S.

    2017-01-01

    Although regulatory agencies have shown special interest in incorporating best-estimate approaches into the fuel licensing process, fuel codes are currently licensed based only on deterministic limits, such as those in 10CFR50, and may therefore yield unrealistic safety margins. The concept of uncertainty analysis is employed to manage this risk more realistically. In this study, uncertainties were classified into two categories: probabilistic and epistemic (owing to a lack of pre-existing knowledge in this area). Fuel rods have three sources of uncertainty: manufacturing tolerances, boundary conditions, and physical models. The first step in successfully analyzing the uncertainties involves performing a statistical analysis of the input parameters used throughout the fuel code. The response obtained from this analysis must show proportional index correlations because the uncertainties are globally propagated. The Dakota toolkit was used to analyze the FRAPTRAN transient fuel code. The subsequent sensitivity analyses helped identify the key parameters with the highest correlation indices, including the peak cladding temperature and the time required for cladding failure. The uncertainty analysis was performed using an IFA-650-5 fuel rod, in line with the tests performed in the Halden Project in Norway. The main objectives of the Halden project included studying the ballooning and rupture processes. The results of this experiment demonstrate the accuracy and applicability of the physical models in evaluating the thermal conductivity, mechanical model, and fuel swelling formulations. (author)

  7. Simulating fuel behavior under transient conditions using FRAPTRAN and uncertainty analysis using Dakota

    Energy Technology Data Exchange (ETDEWEB)

    Gomes, Daniel S.; Teixeira, Antonio S., E-mail: dsgomes@ipen.br, E-mail: teixeira@ipen [Instituto de Pesquisas Energéticas e Nucleares (IPEN/CNEN-SP), São Paulo, SP (Brazil)

    2017-07-01

    Although regulatory agencies have shown special interest in incorporating best-estimate approaches into the fuel licensing process, fuel codes are currently licensed based only on deterministic limits, such as those in 10CFR50, and may therefore yield unrealistic safety margins. The concept of uncertainty analysis is employed to manage this risk more realistically. In this study, uncertainties were classified into two categories: probabilistic and epistemic (owing to a lack of pre-existing knowledge in this area). Fuel rods have three sources of uncertainty: manufacturing tolerances, boundary conditions, and physical models. The first step in successfully analyzing the uncertainties involves performing a statistical analysis of the input parameters used throughout the fuel code. The response obtained from this analysis must show proportional index correlations because the uncertainties are globally propagated. The Dakota toolkit was used to analyze the FRAPTRAN transient fuel code. The subsequent sensitivity analyses helped identify the key parameters with the highest correlation indices, including the peak cladding temperature and the time required for cladding failure. The uncertainty analysis was performed using an IFA-650-5 fuel rod, in line with the tests performed in the Halden Project in Norway. The main objectives of the Halden project included studying the ballooning and rupture processes. The results of this experiment demonstrate the accuracy and applicability of the physical models in evaluating the thermal conductivity, mechanical model, and fuel swelling formulations. (author)
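
    The Dakota-style workflow in these two records (sample the uncertain inputs, run the code, rank inputs by correlation with the response) can be sketched with Latin hypercube sampling and Pearson correlation indices. The linear "peak cladding temperature" surrogate and its coefficients are invented stand-ins for actual FRAPTRAN runs.

```python
import numpy as np

rng = np.random.default_rng(5)

def latin_hypercube(n, d):
    # One stratified uniform sample per stratum, shuffled per column.
    u = (rng.random((n, d)) + np.arange(n)[:, None]) / n
    for j in range(d):
        u[:, j] = rng.permutation(u[:, j])
    return u

# Hypothetical linear surrogate for peak cladding temperature (PCT):
# the three normalized inputs stand in for gap size, rod power and
# cladding conductivity uncertainties (coefficients are invented).
n = 500
u = latin_hypercube(n, 3)
gap, power, cond = u[:, 0], u[:, 1], u[:, 2]
pct = 900.0 + 120.0 * power + 60.0 * gap - 40.0 * cond + rng.normal(0, 5, n)

corrs = {}
for name, x in [("gap", gap), ("power", power), ("conductivity", cond)]:
    corrs[name] = np.corrcoef(x, pct)[0, 1]
    print(f"correlation({name}, PCT) = {corrs[name]:+.2f}")
```

    The ranking by absolute correlation index is what identifies the "key parameters" the records refer to; in a real study each sample would be a full transient fuel-code run rather than a surrogate evaluation.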

  8. Risk Management and Uncertainty in Infrastructure Projects

    DEFF Research Database (Denmark)

    Harty, Chris; Neerup Themsen, Tim; Tryggestad, Kjell

    2014-01-01

    The assumption that large complex projects should be managed in order to reduce uncertainty and increase predictability is not new. What is relatively new, however, is that uncertainty reduction can and should be obtained through formal risk management approaches. We question both assumptions... by addressing a more fundamental question about the role of knowledge in current risk management practices. Inquiries into the predominant approaches to risk management in large infrastructure and construction projects reveal their assumptions about knowledge, and we discuss the ramifications these have... for project and construction management. Our argument and claim is that predominant risk management approaches tend to reinforce conventional ideas of project control whilst undermining other notions of the value and relevance of built assets and the project management process. These approaches fail to consider...

  9. Risk Aversion, Price Uncertainty and Irreversible Investments

    NARCIS (Netherlands)

    van den Goorbergh, R.W.J.; Huisman, K.J.M.; Kort, P.M.

    2003-01-01

    This paper generalizes the theory of irreversible investment under uncertainty by allowing for risk-averse investors in the absence of complete markets. Until now this theory has only been developed in the cases of risk neutrality, or risk aversion in combination with complete markets. Within a

  10. Evaluating the Impact of Contaminant Dilution and Biodegradation in Uncertainty Quantification of Human Health Risk

    Science.gov (United States)

    Zarlenga, Antonio; de Barros, Felipe; Fiori, Aldo

    2016-04-01

    We present a probabilistic framework for assessing human health risk due to groundwater contamination. Our goal is to quantify how physical hydrogeological and biochemical parameters control the magnitude and uncertainty of human health risk. Our methodology captures the whole risk chain, from aquifer contamination to tap water consumption by the human population. The contaminant concentration, the key parameter for the risk estimation, is governed by the interplay between large-scale advection, caused by heterogeneity, and degradation processes strictly related to local-scale dispersion. The core of the hazard identification, and of the methodology, is the reactive transport model: the erratic displacement of contaminant in groundwater, due to the spatial variability of hydraulic conductivity (K), is characterized by a first-order Lagrangian stochastic model; different dynamics are considered as possible modes of biodegradation under aerobic and anaerobic conditions. With the goal of quantifying uncertainty, a Beta distribution is assumed for the concentration probability density function (pdf) model, while different levels of approximation are explored for the estimation of the one-point concentration moments. The information pertaining to flow and transport is connected with a proper dose-response assessment, which generally involves the estimation of physiological parameters of the exposed population. Human health response depends on the exposed individual's metabolism (e.g. variability) and is subject to uncertainty; the health parameters are therefore intrinsically stochastic. As a consequence, we provide an integrated, global probabilistic human health risk framework which allows the propagation of uncertainty from multiple sources. The final result, the health risk pdf, is expressed as a function of a few relevant, physically based parameters such as the size of the injection area, the Péclet number, the K structure metrics and
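
    The Beta-pdf step in the record (fit a Beta model from the first two one-point concentration moments, then propagate it through a dose-response model) can be sketched with moment matching. The moment values and the lognormal slope-factor model are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(6)

# Moment-matched Beta model for a normalized concentration C/C0 in [0, 1]:
# given the first two one-point concentration moments, recover Beta(a, b).
mean, var = 0.3, 0.02
nu = mean * (1.0 - mean) / var - 1.0      # nu = a + b
a, b = mean * nu, (1.0 - mean) * nu

# Propagate to a hypothetical linear dose-response model whose slope
# factor is itself uncertain (inter-individual variability).
c = rng.beta(a, b, size=200_000)
slope = rng.lognormal(np.log(1e-2), 0.3, size=c.size)
risk = slope * c

print(f"Beta parameters: a = {a:.2f}, b = {b:.2f}")
print(f"P(risk > 1e-2) = {np.mean(risk > 1e-2):.3f}")
```

    Sampling both the concentration and the physiological slope factor is what lets the final risk pdf carry uncertainty from multiple sources, as the record describes.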

  11. Quality in environmental science for policy: Assessing uncertainty as a component of policy analysis

    International Nuclear Information System (INIS)

    Maxim, Laura; Sluijs, Jeroen P. van der

    2011-01-01

    The sheer number of attempts to define and classify uncertainty reveals an awareness of its importance in environmental science for policy, though the nature of uncertainty is often misunderstood. The interdisciplinary field of uncertainty analysis is unstable; there are currently several incomplete notions of uncertainty leading to different and incompatible uncertainty classifications. One of the most salient shortcomings of present-day practice is that most of these classifications focus on quantifying uncertainty while ignoring the qualitative aspects that tend to be decisive in the interface between science and policy. Consequently, the current practices of uncertainty analysis contribute to increasing the perceived precision of scientific knowledge, but do not adequately address its lack of socio-political relevance. The 'positivistic' uncertainty analysis models (like those that dominate the fields of climate change modelling and nuclear or chemical risk assessment) have little social relevance, as they do not influence negotiations between stakeholders. From the perspective of the science-policy interface, the current practices of uncertainty analysis are incomplete and incorrectly focused. We argue that although scientific knowledge produced and used in a context of political decision-making embodies traditional scientific characteristics, it also holds additional properties linked to its influence on social, political, and economic relations. Therefore, the significance of uncertainty cannot be assessed based on quality criteria that refer to the scientific content only; uncertainty must also include quality criteria specific to the properties and roles of this scientific knowledge within political, social, and economic contexts and processes. We propose a conceptual framework designed to account for such substantive, contextual, and procedural criteria of knowledge quality. At the same time, the proposed framework includes and synthesizes the various

  12. Analysis of algal bloom risk with uncertainties in lakes by integrating self-organizing map and fuzzy information theory

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Qiuwen, E-mail: qchen@rcees.ac.cn [RCEES, Chinese Academy of Sciences, Shuangqinglu 18, Beijing 10085 (China); China Three Gorges University, Daxuelu 8, Yichang 443002 (China); CEER, Nanjing Hydraulics Research Institute, Guangzhoulu 223, Nanjing 210029 (China); Rui, Han; Li, Weifeng; Zhang, Yanhui [RCEES, Chinese Academy of Sciences, Shuangqinglu 18, Beijing 10085 (China)

    2014-06-01

    Algal blooms are a serious problem in waters, as they damage aquatic ecosystems and threaten drinking water safety. However, the outbreak mechanism of algal blooms is very complex, with great uncertainty, especially for large water bodies where environmental conditions vary markedly in both space and time. This study developed an innovative method which integrated a self-organizing map (SOM) and fuzzy information diffusion theory to comprehensively analyze algal bloom risks with uncertainties. Lake Taihu was taken as the study case and long-term (2004–2010) on-site monitoring data were used. The results showed that algal blooms in Taihu Lake fell into four categories and exhibited obvious spatial–temporal patterns. The lake was mainly characterized by moderate blooms with high uncertainty, whereas severe blooms with low uncertainty were observed in the northwest part of the lake. The study gives insight into the spatial–temporal dynamics of algal blooms, and should help government and decision-makers outline policies and practices on bloom monitoring and prevention. The developed method provides a promising approach for estimating algal bloom risks under uncertainties. - Highlights: • An innovative method is developed to analyze algal bloom risks with uncertainties. • The algal blooms in Taihu Lake showed obvious spatial and temporal patterns. • The lake is mainly characterized as moderate bloom but with high uncertainty. • Severe bloom with low uncertainty appeared occasionally in the northwest part. • The results provide important information for bloom monitoring and management.

  13. Uncertainty Visualization Using Copula-Based Analysis in Mixed Distribution Models.

    Science.gov (United States)

    Hazarika, Subhashis; Biswas, Ayan; Shen, Han-Wei

    2018-01-01

    Distributions are often used to model uncertainty in many scientific datasets. To preserve the correlation among the spatially sampled grid locations in a dataset, various standard multivariate distribution models have been proposed in the visualization literature. These models treat each grid location as a univariate random variable which models the uncertainty at that location. Standard multivariate distributions (both parametric and nonparametric) assume that all the univariate marginals are of the same type/family of distribution. But in reality, different grid locations show different statistical behavior, which may not be modeled best by the same type of distribution. In this paper, we propose a new multivariate uncertainty modeling strategy to address the needs of uncertainty modeling in scientific datasets. Our proposed method is based on a statistically sound multivariate technique called the Copula, which makes it possible to separate the process of estimating the univariate marginals from the process of modeling dependency, unlike the standard multivariate distributions. The modeling flexibility offered by our proposed method makes it possible to design distribution fields which can have different types of distribution (Gaussian, histogram, KDE, etc.) at the grid locations while maintaining the correlation structure at the same time. Depending on the results of various standard statistical tests, we can choose an optimal distribution representation at each location, resulting in more cost-efficient modeling without significantly sacrificing analysis quality. To demonstrate the efficacy of our proposed modeling strategy, we extract and visualize uncertain features like isocontours and vortices in various real-world datasets. We also study various modeling criteria to help users in the task of univariate model selection.
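
    The copula separation the record describes can be sketched for two "grid locations" with a Gaussian copula: sample correlated normals, map them to uniforms, then push each margin through its own inverse CDF. The particular marginals (one Gaussian, one exponential) and the correlation value are illustrative assumptions.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(7)

# Gaussian copula with *different* marginals at two grid locations:
# the dependence structure is set once, the marginals independently.
rho = 0.8
z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]],
                            size=100_000)

std_norm_cdf = np.vectorize(lambda t: 0.5 * (1.0 + erf(t / sqrt(2.0))))
u = std_norm_cdf(z)                       # correlated uniforms

x0 = 5.0 + 1.0 * z[:, 0]                  # Gaussian marginal, N(5, 1)
x1 = -2.0 * np.log(1.0 - u[:, 1])         # exponential marginal, scale 2

r = np.corrcoef(x0, x1)[0, 1]
print(f"sample Pearson correlation of the mixed marginals = {r:.2f}")
```

    The sample correlation stays close to (though not exactly equal to) the copula parameter even though the two marginals belong to different families, which is exactly the flexibility the record's method exploits.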

  14. Assessment of Risks and Uncertainties in Poultry Farming in Kwara ...

    African Journals Online (AJOL)

    , identify the risks and uncertainties encountered by the farmers, determines the level of severity of the risks and uncertainties, and identifies the coping strategies employed by the farmers. Primary data obtained from 99 registered poultry ...

  15. Introduction of risk size in the determination of uncertainty factor UFL in risk assessment

    International Nuclear Information System (INIS)

    Xue Jinling; Lu Yun; Velasquez, Natalia; Hu Hongying; Yu Ruozhen; Liu Zhengtao; Meng Wei

    2012-01-01

    The methodology for using uncertainty factors in health risk assessment has been developed over several decades. A default value is usually applied for the uncertainty factor UFL, which is used to extrapolate from the LOAEL (lowest observed adverse effect level) to the NAEL (no adverse effect level). Here, we have developed a new method that establishes a linear relationship between UFL and the additional risk level at the LOAEL based on the dose–response information, which represents a very important factor that should be carefully considered. This linear formula makes it possible to select UFL properly in the additional risk range from 5.3% to 16.2%. The results also remind us that the default value 10 may not be conservative enough when the additional risk level at the LOAEL exceeds 16.2%. Furthermore, this novel method not only provides a flexible UFL instead of the traditional default value, but also ensures a conservative estimation of UFL with fewer errors, and avoids the benchmark response selection involved in the benchmark dose method. These advantages can improve the estimation of the extrapolation starting point in risk assessment. (letter)
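
    A linear UFL selection rule of the kind the record describes can be sketched over the stated validity range. The anchor points below (UFL = 1 at 5.3% additional risk, UFL = 10 at 16.2%) are assumptions chosen for illustration so that the default value 10 is reached exactly at the upper end of the range; the record's actual coefficients may differ.

```python
# Hypothetical linear relation between UFL and the additional risk level
# at the LOAEL, valid only over the 5.3%-16.2% range stated in the record.
LOW, HIGH = 0.053, 0.162

def uf_l(additional_risk):
    if not LOW <= additional_risk <= HIGH:
        raise ValueError("outside the range where the linear formula holds")
    return 1.0 + 9.0 * (additional_risk - LOW) / (HIGH - LOW)

print(f"UFL at 10% additional risk = {uf_l(0.10):.2f}")
```

    Beyond 16.2% the function refuses to extrapolate, mirroring the record's warning that the default value 10 may no longer be conservative there.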

  16. Quantitative risk assessment using the capacity-demand analysis

    International Nuclear Information System (INIS)

    Morgenroth, M.; Donnelly, C.R.; Westermann, G.D.; Huang, J.H.S.; Lam, T.M.

    1999-01-01

    The hydroelectric industry's recognition of the importance of avoiding unexpected failures, or forced outages, led to the development of probabilistic, or risk-based, methods in order to attempt to quantify exposures. Traditionally, such analysis has been carried out by qualitative assessments, relying on experience and sound engineering judgment to determine the optimum time to maintain, repair or replace a part or system. Depending on the nature of the problem, however, and the level of experience of those included in the decision making process, it is difficult to find a balance between acting proactively and accepting some amount of risk. The development of a practical means for establishing the probability of failure of any part or system, based on the determination of the statistical distribution of engineering properties such as acting stresses, is discussed. The capacity-demand analysis methodology, coupled with probabilistic, risk-based analysis, permits all the factors associated with a decision to rehabilitate or replace a part, including the risks associated with the timing of the decision, to be assessed in a transparent and defendable manner. The methodology does not eliminate judgment altogether, but does move it from the level of estimating the risk of failure to the lower level of estimating variability in material properties, uncertainty in loading, and the uncertainties inherent in any engineering analysis. The method was successfully used in 1998 to carry out a comprehensive, economic risk analysis for the entire water conveyance system of a 90 year old hydropower station. The analysis included a number of diverse parts ranging from rock slopes to aging steel and concrete conduits, and the method allowed a rational assessment of the risks associated with each of these varied parts, permitting the essential remedial works to be prioritized. 14 refs., 4 figs
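
The capacity-demand idea can be sketched numerically: the probability of failure is the probability that the acting stress (demand) exceeds the resisting stress (capacity). The distribution choices and parameters below are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Hypothetical engineering-property distributions, in MPa:
# capacity = resisting stress, demand = acting stress.
capacity = rng.normal(loc=300.0, scale=20.0, size=n)
demand = rng.normal(loc=220.0, scale=30.0, size=n)

# Probability of failure = P(demand >= capacity),
# estimated by Monte Carlo sampling.
pf = np.mean(demand >= capacity)
print(f"estimated P(failure) ~ {pf:.4f}")
```

For these normal distributions the analytic answer is P(Z >= 80/sqrt(20^2 + 30^2)) ≈ 0.013, so the Monte Carlo estimate should land near that value.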

  17. Systematic Analysis Of Ocean Colour Uncertainties

    Science.gov (United States)

    Lavender, Samantha

    2013-12-01

    This paper reviews current research into the estimation of uncertainties as a pixel-based measure to aid non-specialist users of remote sensing products. An example MERIS image, captured on 28 March 2012, was processed with above-water atmospheric correction code. This was initially based on both the Antoine & Morel Standard Atmospheric Correction, with Bright Pixel correction component, and the Doerffer Neural Network coastal waters approach. It showed that analysis of the atmospheric by-products yields important information about the separation of the atmospheric and in-water signals, helping to sign-post possible uncertainties in the atmospheric correction results. Further analysis has concentrated on implementing a 'simplistic' atmospheric correction so that the impact of changing the input auxiliary data can be analysed; the influence of changing surface pressure is demonstrated. Future work will focus on automating the analysis, so that the methodology can be implemented within an operational system.

  18. Reduction and Uncertainty Analysis of Chemical Mechanisms Based on Local and Global Sensitivities

    Science.gov (United States)

    Esposito, Gaetano

    Numerical simulations of critical reacting flow phenomena in hypersonic propulsion devices require accurate representation of finite-rate chemical kinetics. The chemical kinetic models available for hydrocarbon fuel combustion are rather large, involving hundreds of species and thousands of reactions. As a consequence, they cannot be used in multi-dimensional computational fluid dynamic calculations in the foreseeable future due to the prohibitive computational cost. In addition to the computational difficulties, it is also known that some fundamental chemical kinetic parameters of detailed models have a significant level of uncertainty due to the limited experimental data available and to poor understanding of interactions among kinetic parameters. In the present investigation, local and global sensitivity analysis techniques are employed to develop a systematic approach to reducing and analyzing detailed chemical kinetic models. Unlike previous studies in which skeletal model reduction was based on the separate analysis of simple cases, in this work a novel strategy based on Principal Component Analysis of local sensitivity values is presented. This new approach is capable of simultaneously taking into account all the relevant canonical combustion configurations over different composition, temperature and pressure conditions. Moreover, the procedure developed in this work represents the first documented inclusion of non-premixed extinction phenomena, which is of great relevance in hypersonic combustors, in an automated reduction algorithm. The application of the skeletal reduction to a detailed kinetic model consisting of 111 species in 784 reactions is demonstrated. The resulting reduced skeletal model of 37-38 species showed that the global ignition/propagation/extinction phenomena of ethylene-air mixtures can be predicted within an accuracy of 2% of the full detailed model. The problems of both understanding non-linear interactions between kinetic parameters and
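
A toy version of the Principal Component Analysis step might look like the following. The sensitivity matrix here is fabricated (not the 111-species model): rows are canonical combustion conditions, columns are reactions, and PCA of the matrix ranks reactions by their loadings on the leading components.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical local sensitivity matrix S[i, j]:
# d(target at condition i) / d(rate constant of reaction j),
# for 6 canonical conditions and 10 reactions. Reactions 0-2
# are made influential; the rest are near-negligible.
S = 0.01 * rng.standard_normal((6, 10))
S[:, :3] += rng.standard_normal((6, 3))

# Principal Component Analysis via SVD of the sensitivity matrix.
U, sigma, Vt = np.linalg.svd(S, full_matrices=False)

# Overall importance of reaction j: its loadings on the principal
# components, weighted by the singular values (this equals the
# column norm of S, aggregated over all conditions at once).
importance = np.sqrt((sigma[:, None] ** 2 * Vt ** 2).sum(axis=0))
keep = np.argsort(importance)[::-1][:3]
print(sorted(keep.tolist()))
```

The point of the PCA formulation, as in the record, is that all conditions enter one decomposition simultaneously rather than being screened case by case.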

  19. Durability reliability analysis for corroding concrete structures under uncertainty

    Science.gov (United States)

    Zhang, Hao

    2018-02-01

    This paper presents a durability reliability analysis of reinforced concrete structures subject to the action of marine chloride. The focus is to provide insight into the role of epistemic uncertainties on durability reliability. The corrosion model involves a number of variables whose probabilistic characteristics cannot be fully determined due to the limited availability of supporting data. All sources of uncertainty, both aleatory and epistemic, should be included in the reliability analysis. Two methods are available to formulate the epistemic uncertainty: the imprecise probability-based method and the purely probabilistic method in which the epistemic uncertainties are modeled as random variables. The paper illustrates how the epistemic uncertainties are modeled and propagated in the two methods, and shows how epistemic uncertainties govern the durability reliability.

  20. A methodology for uncertainty analysis of reference equations of state

    DEFF Research Database (Denmark)

    Cheung, Howard; Frutiger, Jerome; Bell, Ian H.

    We present a detailed methodology for the uncertainty analysis of reference equations of state (EOS) based on Helmholtz energy. In recent years there has been an increased interest in uncertainties of property data and process models of thermal systems. In the literature there are various...... for uncertainty analysis is suggested as a tool for EOS. The uncertainties of the EOS properties are calculated from the experimental values and the EOS model structure through the parameter covariance matrix and subsequent linear error propagation. This allows reporting the uncertainty range (95% confidence...
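
The linear error propagation step this record describes can be sketched as follows: the variance of a predicted property is J Σ J^T, with J the Jacobian of the EOS output with respect to the fitted parameters and Σ the parameter covariance matrix from the regression. The numbers below are invented for illustration.

```python
import numpy as np

# Hypothetical: property y = f(theta) with fitted parameters theta.
# J = dy/dtheta evaluated at the fit (assumed values), and
# Sigma = parameter covariance matrix from the regression (assumed).
J = np.array([0.8, -1.5, 0.3])
Sigma = np.diag([0.04, 0.01, 0.09])

# Linear error propagation: var(y) = J Sigma J^T,
# then a 95% confidence half-width of 1.96 * sqrt(var).
var_y = J @ Sigma @ J
u95 = 1.96 * np.sqrt(var_y)
print(round(float(u95), 3))
```

In practice Σ would come from the EOS fitting procedure itself, and J from differentiating the Helmholtz-energy model; here both are placeholders.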

  1. Risk-based decision analysis for groundwater operable units

    International Nuclear Information System (INIS)

    Chiaramonte, G.R.

    1995-01-01

    This document proposes a streamlined approach and methodology for performing risk assessment in support of interim remedial measure (IRM) decisions involving the remediation of contaminated groundwater on the Hanford Site. This methodology, referred to as "risk-based decision analysis," also supports the specification of target cleanup volumes and provides a basis for design and operation of the groundwater remedies. The risk-based decision analysis can be completed within a short time frame and concisely documented. The risk-based decision analysis is more versatile than the qualitative risk assessment (QRA), because it not only supports the need for IRMs, but also provides criteria for defining the success of the IRMs and provides the risk basis for decisions on final remedies. For these reasons, it is proposed that, for groundwater operable units, the risk-based decision analysis should replace the more elaborate, costly, and time-consuming QRA

  2. TREATING UNCERTAINTIES IN A NUCLEAR SEISMIC PROBABILISTIC RISK ASSESSMENT BY MEANS OF THE DEMPSTER-SHAFER THEORY OF EVIDENCE

    Directory of Open Access Journals (Sweden)

    CHUNG-KUNG LO

    2014-02-01

    Full Text Available The analyses carried out within the Seismic Probabilistic Risk Assessments (SPRAs) of Nuclear Power Plants (NPPs) are affected by significant aleatory and epistemic uncertainties. These uncertainties have to be represented and quantified coherently with the data, information and knowledge available, to provide reasonable assurance that related decisions can be taken robustly and with confidence. The amount of data, information and knowledge available for seismic risk assessment is typically limited, so the analysis must rely strongly on expert judgments. In this paper, a Dempster-Shafer Theory (DST) framework for handling uncertainties in NPP SPRAs is proposed and applied to an example case study. The main contributions of this paper are twofold: (i) applying the complete DST framework to SPRA models, showing how to build the Dempster-Shafer structures of the uncertainty parameters based on industry generic data, and (ii) embedding Bayesian updating based on plant-specific data into the framework. The results of the application to a case study show that the approach is feasible and effective in (i) describing and jointly propagating aleatory and epistemic uncertainties in SPRA models and (ii) providing 'conservative' bounds on the safety quantities of interest (i.e., Core Damage Frequency, CDF) that reflect the (limited) state of knowledge of the experts about the system of interest.
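
To make the notion of a Dempster-Shafer structure concrete, here is a minimal, purely illustrative sketch (the intervals and masses are invented, not the paper's industry data): belief and plausibility of an interval hypothesis computed from a set of focal elements.

```python
# Hypothetical body of evidence: mass assigned to intervals of a
# component failure rate (per demand). We ask for the belief and
# plausibility that the rate lies in the target interval [0, 1e-3].
focal = {
    (0.0, 1e-3): 0.5,   # mass on a narrow interval
    (0.0, 1e-2): 0.3,   # mass on a wider interval
    (0.0, 1.0): 0.2,    # residual ignorance (the whole frame)
}
target = (0.0, 1e-3)

def subset(a, b):
    """Interval a is contained in interval b."""
    return b[0] <= a[0] and a[1] <= b[1]

def intersects(a, b):
    """Intervals a and b overlap."""
    return a[0] <= b[1] and b[0] <= a[1]

# Belief: mass of focal elements entirely inside the target.
bel = sum(m for s, m in focal.items() if subset(s, target))
# Plausibility: mass of focal elements consistent with the target.
pl = sum(m for s, m in focal.items() if intersects(s, target))
print(bel, pl)
```

The gap between belief and plausibility is exactly the epistemic (ignorance-driven) component that the DST framework keeps separate from aleatory variability.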

  3. Risk analysis of gravity dam instability using credibility theory Monte Carlo simulation model.

    Science.gov (United States)

    Xin, Cao; Chongshi, Gu

    2016-01-01

    Risk analysis of gravity dam stability involves complicated uncertainty in many design parameters and measured data. The stability failure risk ratio, described jointly by probability and possibility, is deficient in characterizing the influence of fuzzy factors and in representing the likelihood of risk occurrence in practical engineering. In this article, credibility theory is applied to stability failure risk analysis of gravity dams. Stability of a gravity dam is viewed as a hybrid event, considering both the fuzziness and the randomness of the failure criterion, design parameters and measured data. A credibility distribution function is introduced as a novel way to represent the uncertainty of the factors influencing gravity dam stability. Combined with Monte Carlo simulation, a corresponding calculation method and procedure are proposed. Based on a dam section, a detailed application of the modeling approach to risk calculation of both the dam foundation and double sliding surfaces is provided. The results show that the present method is feasible for analyzing stability failure risk of gravity dams. The risk assessment obtained can reflect the influence of both sorts of uncertainty, and is suitable as an index value.
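
As a purely illustrative sketch of the credibility measure that underlies such an analysis (the triangular membership and threshold below are invented, and the paper's hybrid Monte Carlo procedure is not reproduced): the credibility of an event is the average of its possibility and necessity.

```python
# Credibility Cr = (Pos + Nec) / 2 that a triangular fuzzy variable
# xi = (a, b, c) falls at or below a threshold t. Hypothetical use:
# xi is a fuzzy safety factor and t = 1.0 marks sliding failure.
def credibility_leq(a, b, c, t):
    # Possibility that xi <= t.
    if t <= a:
        pos = 0.0
    elif t < b:
        pos = (t - a) / (b - a)
    else:
        pos = 1.0
    # Necessity that xi <= t.
    if t <= b:
        nec = 0.0
    elif t < c:
        nec = (t - b) / (c - b)
    else:
        nec = 1.0
    return 0.5 * (pos + nec)

# Fuzzy safety factor (0.8, 1.2, 1.6); failure event: factor <= 1.0.
print(credibility_leq(0.8, 1.2, 1.6, 1.0))
```

In the hybrid setting of the record, thresholds like t would themselves be driven by randomly sampled design parameters inside a Monte Carlo loop.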

  4. Uncertainty analysis of LBLOCA for Advanced Heavy Water Reactor

    International Nuclear Information System (INIS)

    Srivastava, A.; Lele, H.G.; Ghosh, A.K.; Kushwaha, H.S.

    2008-01-01

    The main objective of safety analysis is to demonstrate in a robust way that all safety requirements are met, i.e. sufficient margins exist between real values of important parameters and their threshold values at which damage of the barriers against release of radioactivity would occur. As stated in the IAEA Safety Requirements for Design of NPPs, 'a safety analysis of the plant design shall be conducted in which methods of both deterministic and probabilistic analysis shall be applied'. It is required that 'the computer programs, analytical methods and plant models used in the safety analysis shall be verified and validated, and adequate consideration shall be given to uncertainties'. Uncertainties are present in calculations due to the computer codes, initial and boundary conditions, plant state, fuel parameters, scaling and numerical solution algorithms. Conservative approaches, still widely used, were introduced to cover uncertainties caused by the limited capability for modelling and understanding of physical phenomena at the early stages of safety analysis. The results obtained by this approach are quite unrealistic and the level of conservatism is not fully known. Another approach is the use of Best Estimate (BE) codes with realistic initial and boundary conditions. If this approach is selected, it should be based on statistically combined uncertainties for plant initial and boundary conditions, assumptions and code models. The current trend is toward best estimate codes with some conservative assumptions, realistic input data and uncertainty analysis. The BE analysis with evaluation of uncertainties offers, in addition, a way to quantify the existing plant safety margins. Its broader use in the future is therefore envisaged, even though it is not always feasible because of the difficulty of quantifying code uncertainties with a sufficiently narrow range for every phenomenon and for each accident sequence. In this paper

  5. Deep Uncertainty Surrounding Coastal Flood Risk Projections: A Case Study for New Orleans

    Science.gov (United States)

    Wong, Tony E.; Keller, Klaus

    2017-10-01

    Future sea-level rise drives severe risks for many coastal communities. Strategies to manage these risks hinge on a sound characterization of the uncertainties. For example, recent studies suggest that large fractions of the Antarctic ice sheet (AIS) may rapidly disintegrate in response to rising global temperatures, leading to potentially several meters of sea-level rise during the next few centuries. It is deeply uncertain, for example, whether such an AIS disintegration will be triggered, how much this would increase sea-level rise, whether extreme storm surges intensify in a warming climate, or which emissions pathway future societies will choose. Here, we assess the impacts of these deep uncertainties on projected flooding probabilities for a levee ring in New Orleans, LA. We use 18 scenarios, presenting probabilistic projections within each one, to sample key deeply uncertain future projections of sea-level rise, radiative forcing pathways, storm surge characterization, and contributions from rapid AIS mass loss. The implications of these deep uncertainties for projected flood risk are thus characterized by a set of 18 probability distribution functions. We use a global sensitivity analysis to assess which mechanisms contribute to uncertainty in projected flood risk over the course of a 50-year design life. In line with previous work, we find that the uncertain storm surge drives the most substantial risk, followed by general AIS dynamics, in our simple model for future flood risk for New Orleans.

  6. A risk-averse optimization model for trading wind energy in a market environment under uncertainty

    International Nuclear Information System (INIS)

    Pousinho, H.M.I.; Mendes, V.M.F.; Catalao, J.P.S.

    2011-01-01

    In this paper, a stochastic programming approach is proposed for trading wind energy in a market environment under uncertainty. Uncertainty in the energy market prices is the main cause of high volatility of profits achieved by power producers. The volatile and intermittent nature of wind energy represents another source of uncertainty. Hence, each uncertain parameter is modeled by scenarios, where each scenario represents a plausible realization of the uncertain parameters with an associated occurrence probability. Also, an appropriate risk measurement is considered. The proposed approach is applied on a realistic case study, based on a wind farm in Portugal. Finally, conclusions are duly drawn. -- Highlights: → We model uncertainties on energy market prices and wind power production. → A hybrid intelligent approach generates price-wind power scenarios. → Risk aversion is also incorporated in the proposed stochastic programming approach. → A realistic case study, based on a wind farm in Portugal, is provided. → Our approach allows selecting the best solution according to the desired risk exposure level.
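
The abstract does not name the risk measure used, but in scenario-based trading models of this kind it is commonly the Conditional Value-at-Risk (CVaR). The following generic, hypothetical example computes expected profit and CVaR over a small scenario set of the kind the record describes (each scenario a plausible price-wind realization with an occurrence probability).

```python
import numpy as np

# Hypothetical scenario set for a wind producer's market offer:
# occurrence probabilities and the profit (EUR) in each scenario.
prob = np.array([0.2, 0.3, 0.3, 0.15, 0.05])
profit = np.array([120.0, 90.0, 60.0, 20.0, -40.0])

expected = prob @ profit

# CVaR at confidence alpha: expected profit over the worst
# (1 - alpha) probability mass of scenarios.
alpha = 0.9
order = np.argsort(profit)              # worst scenarios first
tail, acc = 0.0, 0.0
for p, g in zip(prob[order], profit[order]):
    take = min(p, (1 - alpha) - acc)    # probability mass still needed
    if take <= 0:
        break
    tail += take * g
    acc += take
cvar = tail / (1 - alpha)
print(expected, cvar)
```

A risk-averse producer trades some expected profit (here 70) against a better worst-case tail (here a CVaR of -10), typically by weighting the two in the objective.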

  7. The uncertainty cascade in flood risk assessment under changing climatic conditions - the Biala Tarnowska case study

    Science.gov (United States)

    Doroszkiewicz, Joanna; Romanowicz, Renata

    2016-04-01

    Uncertainty in the results of a hydraulic model is associated not only with the limitations of that model and the shortcomings of the data. An important factor with a major impact on the uncertainty of flood risk assessment under changing climate conditions is the uncertainty of future climate scenarios (IPCC WG I, 2013). Future climate projections provided by global climate models are used to generate the future runoff required as input to the hydraulic models applied in the derivation of flood risk maps. The Biala Tarnowska catchment, situated in southern Poland, is used as a case study. Future discharges at the input to the hydraulic model are obtained using the HBV model and climate projections obtained from the EUROCORDEX project. The study describes the cascade of uncertainty related to the different stages of the process of deriving flood risk maps under changing climate conditions. In this context it takes into account the uncertainty of future climate projections, the uncertainty of the flow routing model, the propagation of that uncertainty through the hydraulic model, and finally, the uncertainty related to the derivation of flood risk maps. One of the aims of this study is an assessment of the relative impact of different sources of uncertainty on the uncertainty of flood risk maps. Due to the complexity of the process, an assessment of the total uncertainty of maps of inundation probability may be very computationally demanding. As a way forward we present an application of a hydraulic model simulator based on a nonlinear transfer function model for chosen locations along the river reach. The transfer function model parameters are estimated based on simulations of the hydraulic model at each model cross-section. The study shows that the application of the simulator substantially reduces the computational requirements related to the derivation of flood risk maps under future climatic conditions. Acknowledgements: This work was supported by the

  8. Risk-based classification system of nanomaterials

    International Nuclear Information System (INIS)

    Tervonen, Tommi; Linkov, Igor; Figueira, Jose Rui; Steevens, Jeffery; Chappell, Mark; Merad, Myriam

    2009-01-01

    Various stakeholders are increasingly interested in the potential toxicity and other risks associated with nanomaterials throughout the different stages of a product's life cycle (e.g., development, production, use, disposal). Risk assessment methods and tools developed and applied to chemical and biological materials may not be readily adaptable for nanomaterials because of the current uncertainty in identifying the relevant physico-chemical and biological properties that adequately describe the materials. Such uncertainty is further driven by the substantial variations in the properties of the original material due to variable manufacturing processes employed in nanomaterial production. To guide scientists and engineers in nanomaterial research and application as well as to promote the safe handling and use of these materials, we propose a decision support system for classifying nanomaterials into different risk categories. The classification system is based on a set of performance metrics that measure both the toxicity and physico-chemical characteristics of the original materials, as well as the expected environmental impacts through the product life cycle. Stochastic multicriteria acceptability analysis (SMAA-TRI), a formal decision analysis method, was used as the foundation for this task. This method allowed us to cluster various nanomaterials in different ecological risk categories based on our current knowledge of nanomaterial physico-chemical characteristics, variation in produced material, and best professional judgments. SMAA-TRI uses Monte Carlo simulations to explore all feasible values for weights, criteria measurements, and other model parameters to assess the robustness of nanomaterial grouping for risk management purposes.

  9. Dealing with phenomenological uncertainty in risk analysis

    International Nuclear Information System (INIS)

    Theofanous, T.G.

    1994-01-01

    The Risk-Oriented Accident Analysis Methodology (ROAAM) is summarized and developed further towards a formal definition. The key ideas behind the methodology and these more formal aspects are also presented and discussed

  10. Deterministic uncertainty analysis

    International Nuclear Information System (INIS)

    Worley, B.A.

    1987-01-01

    Uncertainties of computer results are of primary interest in applications such as high-level waste (HLW) repository performance assessment in which experimental validation is not possible or practical. This work presents an alternate deterministic approach for calculating uncertainties that has the potential to significantly reduce the number of computer runs required for conventional statistical analysis. 7 refs., 1 fig

  11. A Quantitative Measure For Evaluating Project Uncertainty Under Variation And Risk Effects

    Directory of Open Access Journals (Sweden)

    A. Chenarani

    2017-10-01

    Full Text Available The effects of uncertainty on a project and the risk event as the consequence of uncertainty are analyzed. The uncertainty index is proposed as a quantitative measure for evaluating the uncertainty of a project. This is done by employing entropy as the indicator of system disorder and lack of information. By employing this index, the uncertainty of each activity and its increase due to risk effects as well as project uncertainty changes as a function of time can be assessed. The results are implemented and analyzed for a small turbojet engine development project as the case study. The results of this study can be useful for project managers and other stakeholders for selecting the most effective risk management and uncertainty controlling method.
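
A minimal sketch of an entropy-based uncertainty index follows; the duration distributions are invented, and the paper's exact formulation may differ. The idea is that a risk event flattens an activity's outcome distribution, which raises its Shannon entropy and hence the index.

```python
import math

def entropy(p):
    """Shannon entropy (bits) of a discrete distribution."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# Hypothetical discretized duration outcomes for one activity:
# before a risk event, and after the risk flattens the distribution.
baseline = [0.7, 0.2, 0.1]
with_risk = [0.4, 0.35, 0.25]

h0, h1 = entropy(baseline), entropy(with_risk)
print(round(h0, 3), round(h1, 3))
```

Summing (or otherwise aggregating) such per-activity indices over time would give the project-level uncertainty profile the record describes.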

  12. Sensitivity and uncertainty analysis

    CERN Document Server

    Cacuci, Dan G; Navon, Ionel Michael

    2005-01-01

    As computer-assisted modeling and analysis of physical processes have continued to grow and diversify, sensitivity and uncertainty analyses have become indispensable scientific tools. Sensitivity and Uncertainty Analysis. Volume I: Theory focused on the mathematical underpinnings of two important methods for such analyses: the Adjoint Sensitivity Analysis Procedure and the Global Adjoint Sensitivity Analysis Procedure. This volume concentrates on the practical aspects of performing these analyses for large-scale systems. The applications addressed include two-phase flow problems, a radiative c

  13. Validation of Fuel Performance Uncertainty for RIA Safety Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Park, Nam-Gyu; Yoo, Jong-Sung; Jung, Yil-Sup [KEPCO Nuclear Fuel Co., Daejeon (Korea, Republic of)

    2016-10-15

    To achieve this, the computer code performance has to be validated based on experimental results, and for the uncertainty quantification, important uncertainty parameters need to be selected and the combined uncertainty has to be evaluated with an acceptable statistical treatment. Uncertainty parameters important to rod performance, such as fuel enthalpy, fission gas release, and cladding hoop strain, were chosen through rigorous sensitivity studies, and their validity was assessed by utilizing experimental results from rods tested in CABRI and NSRR. An assessment of fuel performance with an extended fuel power uncertainty on the tested rods was also carried out. The analysis results showed that several tested rods were not bounded within the calculated combined fuel performance uncertainty. This implies that the currently considered uncertainty range of the parameters is not enough to cover the fuel performance sufficiently.

  14. Epistemic uncertainties and natural hazard risk assessment - Part 1: A review of the issues

    Science.gov (United States)

    Beven, K. J.; Aspinall, W. P.; Bates, P. D.; Borgomeo, E.; Goda, K.; Hall, J. W.; Page, T.; Phillips, J. C.; Rougier, J. T.; Simpson, M.; Stephenson, D. B.; Smith, P. J.; Wagener, T.; Watson, M.

    2015-12-01

    Uncertainties in natural hazard risk assessment are generally dominated by the sources arising from lack of knowledge or understanding of the processes involved. There is a lack of knowledge about frequencies, process representations, parameters, present and future boundary conditions, consequences and impacts, and the meaning of observations in evaluating simulation models. These are the epistemic uncertainties that can be difficult to constrain, especially in terms of event or scenario probabilities, even as elicited probabilities rationalized on the basis of expert judgements. This paper reviews the issues raised by trying to quantify the effects of epistemic uncertainties. Such scientific uncertainties might have significant influence on decisions that are made for risk management, so it is important to communicate the meaning of an uncertainty estimate and to provide an audit trail of the assumptions on which it is based. Some suggestions for good practice in doing so are made.

  15. Use of screening techniques to reduce uncertainty in risk assessment at a former manufactured gas plant site

    International Nuclear Information System (INIS)

    Logan, C.M.; Walden, R.H.; Baker, S.R.; Pekar, Z.; LaKind, J.S.; MacFarlane, I.D.

    1995-01-01

    Preliminary analysis of risks from a former manufactured gas plant (MGP) site revealed six media associated with potential exposure pathways: soils, air, surface water, groundwater, estuarine sediments, and aquatic biota. Contaminants of concern (COCs) include polycyclic aromatic hydrocarbons, volatile organic hydrocarbons, metals, cyanide, and PCBs. Available chemical data, including site-specific measurements and existing data from other sources (e.g., agency monitoring programs, Chesapeake Bay Program), were evaluated for potential utility in risk assessment. Where sufficient data existed, risk calculations were performed using central tendency and reasonable maximum exposure estimates. Where site-specific data were not available, risks were estimated using conservatively high default assumptions for dose and/or exposure duration. Because of the large number of potential exposure pathways and COCs, a sensitivity analysis was conducted to determine which information most influences risk assessment outcome so that any additional data collection to reduce uncertainty can be cost-effectively targeted. The sensitivity analysis utilized two types of information: (1) the impact that uncertainty in risk input values has on output risk estimates, and (2) the potential improvement in key risk input values, and consequently output values, if better site-specific data were available. A decision matrix using both quantitative and qualitative information was developed to prioritize sampling strategies to minimize uncertainty in the final risk assessment

  16. Sensitivity, uncertainty, and importance analysis of a risk assessment

    International Nuclear Information System (INIS)

    Andsten, R.S.; Vaurio, J.K.

    1992-01-01

    In this paper a number of supplementary studies and applications associated with probabilistic safety assessment (PSA) are described, including sensitivity and importance evaluations of failures, errors, systems, and groups of components. The main purpose is to illustrate the usefulness of a PSA for making decisions about safety improvements, training, allowed outage times, and test intervals. A useful measure of uncertainty importance is presented, and it points out areas needing development, such as reactor vessel aging phenomena, for reducing overall uncertainty. A time-dependent core damage frequency is also presented, illustrating the impact of testing scenarios and intervals. The methods and applications presented are based on the Level 1 PSA carried out for internal initiating events of the Loviisa 1 nuclear power station. Steam generator leakages and associated operator actions are major contributors to the current core damage frequency estimate of 2x10^-4/yr. The results are used to improve the plant and procedures and to guide future improvements

  17. Risk, uncertainty and prophet: The psychological insights of Frank H. Knight

    Directory of Open Access Journals (Sweden)

    Tim Rakow

    2010-10-01

    Full Text Available Economist Frank H. Knight (1885-1972) is commonly credited with defining the distinction between decisions under "risk" (known chance) and decisions under "uncertainty" (unmeasurable probability) in his 1921 book Risk, Uncertainty and Profit. A closer reading of Knight (1921) reveals a host of psychological insights beyond this risk-uncertainty distinction, many of which foreshadow revolutionary advances in psychological decision theory from the latter half of the 20th century. Knight's description of economic decision making shared much with Simon's (1955, 1956) notion of bounded rationality, whereby choice behavior is regulated by cognitive and environmental constraints. Knight described features of risky choice that were to become key components of prospect theory (Kahneman and Tversky, 1979): the reference-dependent valuation of outcomes, and the non-linear weighting of probabilities. Knight also discussed several biases in human decision making, and pointed to two systems of reasoning: one quick, intuitive but error prone, and a slower, more deliberate, rule-based system. A discussion of Knight's potential contribution to psychological decision theory emphasises the importance of a historical perspective on theory development, and the potential value of sourcing ideas from other disciplines or from earlier periods of time.

  18. Uncertainty analysis for secondary energy distributions

    International Nuclear Information System (INIS)

    Gerstl, S.A.W.

    1978-01-01

    In many transport calculations the integral design parameter of interest (response) is determined mainly by secondary particles such as gamma rays from (n,γ) reactions or secondary neutrons from inelastic scattering events or (n,2n) reactions. Standard sensitivity analysis usually allows one to calculate the sensitivities to the production cross sections of such secondaries, but an extended formalism is needed to also obtain the sensitivities to the energy distribution of the generated secondary particles. For a 30-group standard cross-section set, 84% of all non-zero table positions pertain to the description of secondary energy distributions (SED's) and only 16% to the actual reaction cross sections. Therefore, any sensitivity/uncertainty analysis which does not consider the effects of SED's is incomplete and neglects most of the input data. This paper describes how sensitivity profiles for SED's are obtained and used to estimate the uncertainty of an integral response due to uncertainties in these SED's. The detailed theory is documented elsewhere and implemented in the LASL sensitivity code SENSIT. SED sensitivity profiles have proven particularly valuable in cross-section uncertainty analyses for fusion reactors. Even when the production cross sections for secondary neutrons were assumed to be without error, the uncertainties in the energy distribution of these secondaries produced appreciable uncertainties in the calculated tritium breeding rate. However, complete error files for SED's are presently nonexistent. Therefore, methods will be described that allow rough error estimates due to estimated SED uncertainties based on integral SED sensitivities

  19. Uncertainty and Sensitivity of Direct Economic Flood Damages: the FloodRisk Free and Open-Source Software

    Science.gov (United States)

    Albano, R.; Sole, A.; Mancusi, L.; Cantisani, A.; Perrone, A.

    2017-12-01

    The considerable increase in flood damages over the past decades has shifted attention in Europe from protection against floods to managing flood risks. In this context, the assessment of expected damages represents a crucial piece of information within the overall flood risk management process. The present paper proposes an open source software package, called FloodRisk, that is able to operatively support stakeholders in decision-making processes with a what-if approach by carrying out rapid assessment of the flood consequences, in terms of direct economic damage and loss of human lives. The evaluation of damage scenarios, through the use of the GIS software proposed here, is essential for cost-benefit or multi-criteria analysis of risk mitigation alternatives. However, considering that quantitative assessment of flood damage scenarios is characterized by intrinsic uncertainty, a scheme has been developed to identify and quantify the role of the input parameters in the total uncertainty of flood loss model application in urban areas with mild terrain and complex topography. By the concept of parallel models, the contribution of the different modules and input parameters to the total uncertainty is quantified. The results of the present case study have exhibited a high epistemic uncertainty in the damage estimation module and, in particular, in the type and form of the damage functions used, which have been adapted and transferred from different geographic and socio-economic contexts because no depth-damage functions have been developed specifically for Italy. Considering that uncertainty and sensitivity depend considerably on local characteristics, the epistemic uncertainty associated with the risk estimate is reduced by introducing additional information into the risk analysis. In light of the results obtained, the need to produce and disseminate (open) data to develop micro-scale vulnerability curves is evident. Moreover, the urgent need to push

  20. Uncertainty Estimation Cheat Sheet for Probabilistic Risk Assessment

    Science.gov (United States)

    Britton, Paul T.; Al Hassan, Mohammad; Ring, Robert W.

    2017-01-01

    "Uncertainty analysis itself is uncertain, therefore, you cannot evaluate it exactly," Source Uncertain. Quantitative results for aerospace engineering problems are influenced by many sources of uncertainty. Uncertainty analysis aims to make a technical contribution to decision-making through the quantification of uncertainties in the relevant variables as well as through the propagation of these uncertainties up to the result. Uncertainty can be thought of as a measure of the 'goodness' of a result and is typically represented as statistical dispersion. This paper will explain common measures of centrality and dispersion and, with examples, will provide guidelines for how they may be estimated to ensure effective technical contributions to decision-making.
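
    The centrality and dispersion measures discussed above can be computed directly with Python's standard `statistics` module. The sample values below are invented for illustration; the 90% interval is read off the 5th and 95th empirical percentile cut points.

```python
import statistics

# Illustrative sample of simulated risk estimates (hypothetical values)
samples = [0.012, 0.015, 0.011, 0.020, 0.014, 0.013, 0.018, 0.016, 0.012, 0.019]

mean = statistics.mean(samples)          # centrality
median = statistics.median(samples)      # robust centrality
stdev = statistics.stdev(samples)        # dispersion (sample standard deviation)

# Empirical 90% interval from the 5th and 95th percentile cut points
cuts = statistics.quantiles(samples, n=20)
p05, p95 = cuts[0], cuts[-1]

print(f"mean={mean:.4f}  median={median:.4f}  stdev={stdev:.4f}")
print(f"90% interval: [{p05:.4f}, {p95:.4f}]")
```

    Reporting the interval alongside the mean communicates the 'goodness' of the result rather than a bare point estimate.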

  1. Uncertainty Analysis of Light Water Reactor Fuel Lattices

    Directory of Open Access Journals (Sweden)

    C. Arenas

    2013-01-01

    Full Text Available The study explored the calculation of uncertainty based on available cross-section covariance data and computational tools at the fuel lattice level, including pin cell and fuel assembly models. Uncertainty variations due to temperature changes and different fuel compositions are the main focus of this analysis. Selected assemblies and unit pin cells were analyzed according to the OECD LWR UAM benchmark specifications. Criticality and uncertainty analyses were performed using the TSUNAMI-2D sequence in SCALE 6.1. It was found that uncertainties increase with increasing temperature, while kinf decreases. This increase in the uncertainty is due to the increase in sensitivity of the largest contributing reaction, namely the neutron capture reaction 238U(n, γ), caused by Doppler broadening. In addition, three types of fuel material composition (UOX, MOX, and UOX-Gd2O3) were analyzed. A remarkable increase in the uncertainty in kinf was observed for the case of MOX fuel, nearly twice the corresponding value in UOX fuel. The neutron-nuclide reaction of 238U, mainly inelastic scattering (n, n′), contributed the most to the uncertainties in the MOX fuel, shifting the neutron spectrum to higher energy compared to the UOX fuel.

  2. Modeling the Near-Term Risk of Climate Uncertainty: Interdependencies among the U.S. States

    Science.gov (United States)

    Lowry, T. S.; Backus, G.; Warren, D.

    2010-12-01

    Decisions made to address climate change must start with an understanding of the risk of an uncertain future to human systems, which in turn means understanding both the consequence and the probability of a climate-induced impact occurring. In other words, addressing climate change is an exercise in risk-informed policy making, which implies that there is no single correct answer or even a way to be certain about a single answer; the uncertainty in future climate conditions will always be present and must be taken as a working condition for decision making. In order to better understand the implications of uncertainty on risk and to provide a near-term rationale for policy interventions, this study estimates the impacts from responses to climate change on U.S. state- and national-level economic activity by employing a risk-assessment methodology for evaluating uncertain future climatic conditions. Using the results from the Intergovernmental Panel on Climate Change's (IPCC) Fourth Assessment Report (AR4) as a proxy for climate uncertainty, changes in hydrology over the next 40 years were mapped and then modeled to determine the physical consequences on economic activity and to perform a detailed 70-industry analysis of the economic impacts among the interacting lower-48 states. The analysis determines industry-level effects, employment impacts at the state level, interstate population migration, consequences to personal income, and ramifications for the U.S. trade balance. The conclusions show that the average risk of damage to the U.S. economy from climate change is on the order of $1 trillion over the next 40 years, with losses in employment equivalent to nearly 7 million full-time jobs. Further analysis shows that an increase in uncertainty raises this risk. This paper will present the methodology behind the approach, a summary of the underlying models, as well as the path forward for improving the approach.
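
    The finding that greater uncertainty raises risk follows whenever consequences are convex in the climate variable: a mean-preserving spread in projected warming then increases expected damage (Jensen's inequality). The sketch below uses a hypothetical quadratic damage function and invented scenario probabilities, not the study's models.

```python
# Hypothetical convex (quadratic) damage function: $bn over 40 years
def damage(dT):
    return 250.0 * dT ** 2

# Two scenario sets with the same expected warming (2.0 C) but different
# spread; probabilities and temperatures are invented for illustration.
narrow = [(0.5, 1.5), (0.5, 2.5)]   # (probability, warming in C)
wide = [(0.5, 0.5), (0.5, 3.5)]

risk_narrow = sum(p * damage(dT) for p, dT in narrow)
risk_wide = sum(p * damage(dT) for p, dT in wide)

# With convex damages, the wider (more uncertain) scenario set carries
# strictly higher expected damage despite the identical mean warming.
print(risk_narrow, risk_wide)
```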

  3. A risk-based evaluation of the impact of key uncertainties on the prediction of severe accident source terms - STU

    International Nuclear Information System (INIS)

    Ang, M.L.; Grindon, E.; Dutton, L.M.C.; Garcia-Sedano, P.; Santamaria, C.S.; Centner, B.; Auglaire, M.; Routamo, T.; Outa, S.; Jokiniemi, J.; Gustavsson, V.; Wennerstrom, H.; Spanier, L.; Gren, M.; Boschiero, M-H; Droulas, J-L; Friederichs, H-G; Sonnenkalb, M.

    2001-01-01

    The purpose of this project is to address the key uncertainties associated with a number of fission product release and transport phenomena in a wider context and to assess their relevance to key severe accident sequences. This project is a broad-based analysis involving eight reactor designs representative of the reactors currently operating in the European Union (EU). In total, 20 accident sequences covering a wide range of conditions have been chosen to provide the basis for sensitivity studies. The appraisal is achieved through a systematic risk-based framework developed within this project. Specifically, this is a quantitative interpretation of the sensitivity calculations on the basis of 'significance indicators', applied above defined threshold values. These threshold values represent a good surrogate for 'large release', which is defined in a number of EU countries. In addition, the results are placed in the context of in-containment source term limits, for advanced light water reactor designs, as defined by international guidelines. Overall, despite the phenomenological uncertainties, the predicted source terms (both into the containment, and subsequently, into the environment) do not display a high degree of sensitivity to the individual fission product issues addressed in this project. This is due, mainly, to the substantial capacity for the attenuation of airborne fission products by the designed safety provisions and the natural fission product retention mechanisms within the containment

  4. Code development of total sensitivity and uncertainty analysis for reactor physics calculations

    International Nuclear Information System (INIS)

    Wan, C.; Cao, L.; Wu, H.; Zu, T.; Shen, W.

    2015-01-01

    Sensitivity and uncertainty analyses are essential parts of risk and policy analysis for reactor systems. In this study, total sensitivity and corresponding uncertainty analysis for the responses of neutronics calculations have been accomplished, and the S&U analysis code UNICORN has been developed. The UNICORN code can consider the implicit effects of the multigroup cross sections on the responses. In this paper, the UNICORN code has been applied to a typical pin-cell case and verified by comparing its results with those of the TSUNAMI-1D code. (author)

  5. Code development of total sensitivity and uncertainty analysis for reactor physics calculations

    Energy Technology Data Exchange (ETDEWEB)

    Wan, C.; Cao, L.; Wu, H.; Zu, T., E-mail: chenghuiwan@stu.xjtu.edu.cn, E-mail: caolz@mail.xjtu.edu.cn, E-mail: hongchun@mail.xjtu.edu.cn, E-mail: tiejun@mail.xjtu.edu.cn [Xi' an Jiaotong Univ., School of Nuclear Science and Technology, Xi' an (China); Shen, W., E-mail: Wei.Shen@cnsc-ccsn.gc.ca [Xi' an Jiaotong Univ., School of Nuclear Science and Technology, Xi' an (China); Canadian Nuclear Safety Commission, Ottawa, ON (Canada)

    2015-07-01

    Sensitivity and uncertainty analyses are essential parts of risk and policy analysis for reactor systems. In this study, total sensitivity and corresponding uncertainty analysis for the responses of neutronics calculations have been accomplished, and the S&U analysis code UNICORN has been developed. The UNICORN code can consider the implicit effects of the multigroup cross sections on the responses. In this paper, the UNICORN code has been applied to a typical pin-cell case and verified by comparing its results with those of the TSUNAMI-1D code. (author)

  6. Sparse grid-based polynomial chaos expansion for aerodynamics of an airfoil with uncertainties

    Directory of Open Access Journals (Sweden)

    Xiaojing WU

    2018-05-01

    Full Text Available Uncertainties can generate fluctuations in aerodynamic characteristics. Uncertainty Quantification (UQ) is applied to compute their impact on the aerodynamic characteristics, and the contribution of each uncertainty should be computed by uncertainty sensitivity analysis. Non-Intrusive Polynomial Chaos (NIPC) has been successfully applied to uncertainty quantification and uncertainty sensitivity analysis. However, the non-intrusive polynomial chaos method becomes inefficient as the number of random variables adopted to describe uncertainties increases. This deficiency becomes significant in stochastic aerodynamic analysis considering geometric uncertainty because the description of geometric uncertainty generally needs many parameters. To address this deficiency, a Sparse Grid-based Polynomial Chaos (SGPC) expansion is used to perform uncertainty quantification and sensitivity analysis for stochastic aerodynamic analysis considering geometric and operational uncertainties. It is shown that the method is more efficient than the non-intrusive polynomial chaos and Monte Carlo Simulation (MCS) methods for stochastic aerodynamic analysis. Uncertainty quantification shows that the flow characteristics of shock waves and boundary layer separation are sensitive to the geometric uncertainty in the transonic region. The uncertainty sensitivity analysis reveals the individual and coupled effects among the uncertainty parameters. Keywords: Non-intrusive polynomial chaos, Sparse grid, Stochastic aerodynamic analysis, Uncertainty sensitivity analysis, Uncertainty quantification
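
    The essence of non-intrusive polynomial chaos can be shown in one dimension: project the response onto Hermite polynomials of a standard normal input using Gauss-Hermite quadrature, then read the mean and variance off the coefficients. The sketch below uses a hypothetical quadratic response and a 3-point rule; it illustrates plain NIPC, not the sparse-grid extension proposed in the paper.

```python
import math

# 3-point Gauss-Hermite rule, exact for low-order polynomials of a
# standard normal variable xi
nodes = [-math.sqrt(3.0), 0.0, math.sqrt(3.0)]
weights = [1.0 / 6.0, 2.0 / 3.0, 1.0 / 6.0]

# Hypothetical response (e.g. a force coefficient as a function of one
# standardized uncertain input); quadratic, so the 3-point rule is exact.
def response(xi):
    return 2.0 + 3.0 * xi + xi ** 2

# Probabilists' Hermite basis He0, He1, He2 and their norms E[He_k(xi)^2]
basis = [lambda x: 1.0, lambda x: x, lambda x: x ** 2 - 1.0]
norms = [1.0, 1.0, 2.0]

# Non-intrusive projection: c_k = E[response * He_k] / E[He_k^2]
coeffs = [
    sum(w * response(x) * He(x) for x, w in zip(nodes, weights)) / nrm
    for He, nrm in zip(basis, norms)
]

pce_mean = coeffs[0]                 # mean is the He0 coefficient
pce_variance = sum(c * c * n for c, n in zip(coeffs[1:], norms[1:]))
print(pce_mean, pce_variance)
```

    The quadrature node count grows rapidly with input dimension under full tensor grids, which is exactly the inefficiency the paper's sparse-grid construction targets.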

  7. The management of uncertainties in the French regulation on deep disposal: the development of a non-risk based approach

    International Nuclear Information System (INIS)

    Raimbault, P.

    2004-01-01

    The development of a safety case for disposal of high level and medium level long-lived waste in a geological formation has to handle two main difficulties: - uncertainties associated with natural systems; - uncertainties associated with the consideration of long time scales. Licensing of the different steps leading to geological disposal thus implies that a sufficient level of confidence in the safety case must be obtained, at each step, among the different stakeholders. The confidence in the safety case relies on the whole set of arguments of different natures which complement each other and build up the safety file. This means that, to be defensible, the safety case should be organised in such a way that it can be reviewed and scrutinized in a structured manner. This also means that individual elements of the safety case will have to be considered separately even if all elements should fit well in the integrated safety case. This segregation implies some inherent decoupling of parts of the system, of its evolution over time and of the events that may impact on it. This decoupling will thus introduce inherent uncertainties that risk or non-risk based approaches have to deal with since both approaches have to introduce transparency in the analysis. In the non-risk based or deterministic approach this segregation is pushed further in order to put into perspective the different elements of appreciation that allow one to judge the safety case as a whole. The French regulation on deep disposal presented in the basic safety rule RFS III.3.f, issued in 1991, takes these points into consideration to set the basis for the safety case in the framework of a deterministic approach. This basic safety rule is currently being revised in order to clarify some concepts and to take into account the evolution of ideas at the national and international level. However the basic rationale behind the safety assessment methodology will remain the same. The approach presented in RFS III.2.f implies that at

  8. Accounting for Households' Perceived Income Uncertainty in Consumption Risk Sharing

    NARCIS (Netherlands)

    Singh, S.; Stoltenberg, C.A.

    2017-01-01

    We develop a consumption risk-sharing model that distinguishes households' perceived income uncertainty from income uncertainty as measured by an econometrician. Households receive signals on their future disposable income that can drive a gap between the two uncertainties. Accounting for the

  9. Parameter uncertainty effects on variance-based sensitivity analysis

    International Nuclear Information System (INIS)

    Yu, W.; Harris, T.J.

    2009-01-01

    In the past several years there has been considerable commercial and academic interest in methods for variance-based sensitivity analysis. The industrial focus is motivated by the importance of attributing variance contributions to input factors. A more complete understanding of these relationships enables companies to achieve goals related to quality, safety and asset utilization. In a number of applications, it is possible to distinguish between two types of input variables: regressive variables and model parameters. Regressive variables are those that can be influenced by process design or by a control strategy. With model parameters, there are typically no opportunities to directly influence their variability. In this paper, we propose a new method to perform sensitivity analysis through a partitioning of the input variables into these two groupings: regressive variables and model parameters. A sequential analysis is proposed, in which a sensitivity analysis is first performed with respect to the regressive variables. In the second step, the uncertainty effects arising from the model parameters are included. This strategy can be quite useful in understanding process variability and in developing strategies to reduce overall variability. When this method is used for nonlinear models which are linear in the parameters, analytical solutions can be utilized. In the more general case of models that are nonlinear in both the regressive variables and the parameters, either first order approximations can be used, or numerically intensive methods must be used.
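
    The two-step strategy described above can be sketched with a toy model that is linear in its parameters: first estimate the output variance with the model parameters fixed at nominal values (regressive variables only), then repeat the estimate with parameter uncertainty included. All distributions and numerical values below are illustrative assumptions.

```python
import random

random.seed(1)

# Toy model linear in its parameters: y = t1 * x1 + t2 * x2**2,
# where x1, x2 are regressive variables and t1, t2 are model parameters.
def model(x1, x2, t1=2.0, t2=0.5):
    return t1 * x1 + t2 * x2 ** 2

N = 100_000

def sample_variance(ys):
    m = sum(ys) / len(ys)
    return sum((y - m) ** 2 for y in ys) / (len(ys) - 1)

# Step 1: regressive variables only, parameters held at nominal values.
# Analytically Var = 2**2 * 1 + 0.5**2 * Var(x2**2) = 4 + 0.25 * 2 = 4.5.
ys1 = [model(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(N)]
var_regressive = sample_variance(ys1)

# Step 2: include parameter uncertainty (invented spreads) on top.
ys2 = [
    model(random.gauss(0, 1), random.gauss(0, 1),
          random.gauss(2.0, 0.5), random.gauss(0.5, 0.2))
    for _ in range(N)
]
var_total = sample_variance(ys2)

print(var_regressive, var_total)  # parameter uncertainty inflates the variance
```

    The gap between the two variances isolates how much of the overall variability is attributable to parameter uncertainty rather than to the controllable regressive variables.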

  10. Uncertainty, causality and decision: The case of social risks and nuclear risk in particular

    International Nuclear Information System (INIS)

    Lahidji, R.

    2012-01-01

    Probability and causality are two indispensable tools for addressing situations of social risk. Causal relations are the foundation for building risk assessment models and identifying risk prevention, mitigation and compensation measures. Probability enables us to quantify risk assessments and to calibrate intervention measures. It therefore seems not only natural, but also necessary to make the role of causality and probability explicit in the definition of decision problems in situations of social risk. Such is the aim of this thesis. By reviewing the terminology of risk and the logic of public interventions in various fields of social risk, we gain a better understanding of the notion and of the issues that one faces when trying to model it. We further elaborate our analysis in the case of nuclear safety, examining in detail how methods and policies have been developed in this field and how they have evolved through time. This leads to a number of observations concerning risk and safety assessments. Generalising the concept of intervention in a Bayesian network allows us to develop a variety of causal Bayesian networks adapted to our needs. In this framework, we propose a definition of risk which seems to be relevant for a broad range of issues. We then offer simple applications of our model to specific aspects of the Fukushima accident and other nuclear safety problems. In addition to specific lessons, the analysis leads to the conclusion that a systematic approach for identifying uncertainties is needed in this area. When applied to decision theory, our tool evolves into a dynamic decision model in which acts cause consequences and are causally interconnected. The model provides a causal interpretation of Savage's conceptual framework, solves some of its paradoxes and clarifies certain aspects. It leads us to considering uncertainty with regard to a problem's causal structure as the source of ambiguity in decision-making, an interpretation which corresponds to a
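
    The role of interventions in a causal Bayesian network can be illustrated with a minimal two-node example (all probabilities invented): conditioning reflects observed practice, while the do-operator forces a variable to a value, severing its usual causes.

```python
# Minimal causal network: safety system S -> accident A (invented numbers)
p_s_ok = 0.99                                 # P(S = ok) under current practice
p_accident_given_s = {True: 0.001, False: 0.2}  # P(A | S ok / S failed)

def p_accident(p_ok):
    """Marginal accident probability for a given P(S = ok)."""
    return (p_ok * p_accident_given_s[True]
            + (1 - p_ok) * p_accident_given_s[False])

observational = p_accident(p_s_ok)   # risk as currently observed
interventional = p_accident(1.0)     # do(S = ok): policy forces maintenance
print(observational, interventional)
```

    Comparing the observational and interventional probabilities quantifies the value of the prevention measure, which is exactly the kind of question the thesis frames causally.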

  11. Uncertainty analysis in estimating Japanese ingestion of global fallout Cs-137 using health risk evaluation model

    International Nuclear Information System (INIS)

    Shimada, Yoko; Morisawa, Shinsuke

    1998-01-01

    Most model-based estimates of environmental contamination include some uncertainty associated with the parameter uncertainty in the model. In this study, the uncertainty in a model for evaluating the ingestion of radionuclides caused by long-term global low-level radioactive contamination was analyzed using various uncertainty analysis methods: the percentile estimate, robustness analysis and the fuzzy estimate. The model is mainly composed of five sub-models, each of which includes its own uncertainty; we analyzed these uncertainties as well. The major findings obtained in this study include the following: the possibility of a discrepancy between the value predicted by the model simulation and the observed data is less than 10%; the uncertainty of the predicted value is higher before 1950 and after 1980; the uncertainty of the predicted value can be reduced by decreasing the uncertainty of some environmental parameters in the model; and the reliability of the model depends decisively on the following environmental factors: the direct foliar absorption coefficient, the transfer factor of radionuclides from the stratosphere down to the troposphere, the residual rate after food processing and cooking, and the transfer and sedimentation of radionuclides in the ocean. (author)

  12. Source modelling in seismic risk analysis for nuclear power plants

    International Nuclear Information System (INIS)

    Yucemen, M.S.

    1978-12-01

    The proposed probabilistic procedure provides a consistent method for the modelling, analysis and updating of uncertainties that are involved in the seismic risk analysis for nuclear power plants. The potential earthquake activity zones are idealized as point, line or area sources. For these seismic source types, expressions to evaluate their contribution to seismic risk are derived, considering all the possible site-source configurations. The seismic risk at a site is found to depend not only on the inherent randomness of the earthquake occurrences with respect to magnitude, time and space, but also on the uncertainties associated with the predicted values of the seismic and geometric parameters, as well as the uncertainty in the attenuation model. The uncertainty due to the attenuation equation is incorporated into the analysis through the use of random correction factors. The influence of the uncertainty resulting from the insufficient information on the seismic parameters and source geometry is introduced into the analysis by computing a mean risk curve averaged over the various alternative assumptions on the parameters and source geometry. Seismic risk analysis is carried out for the city of Denizli, which is located in the most seismically active zone of Turkey. The second analysis is for Akkuyu
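
    The mean risk curve described above is a weighted average of hazard curves computed under alternative assumptions. A minimal sketch with invented exceedance probabilities and subjective model weights:

```python
# Peak ground accelerations (g) and annual exceedance probabilities under
# two alternative attenuation models; all values invented for illustration.
pga = [0.1, 0.2, 0.3, 0.4]
model_a = [1e-2, 3e-3, 1e-3, 4e-4]
model_b = [2e-2, 8e-3, 3e-3, 1e-3]
w_a, w_b = 0.6, 0.4          # subjective weights on the alternatives

# Mean risk curve averaged over the alternative assumptions
mean_curve = [w_a * a + w_b * b for a, b in zip(model_a, model_b)]
print(list(zip(pga, mean_curve)))
```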

  13. Uncertainty Assessments in Fast Neutron Activation Analysis

    International Nuclear Information System (INIS)

    W. D. James; R. Zeisler

    2000-01-01

    Fast neutron activation analysis (FNAA) carried out with the use of small accelerator-based neutron generators is routinely used for major/minor element determinations in industry, mineral and petroleum exploration, and to some extent in research. While the method shares many of the operational procedures, and therefore errors, inherent to conventional thermal neutron activation analysis, its unique implementation gives rise to additional specific concerns that can result in errors or increased uncertainties of measured quantities. The authors were involved in a recent effort to evaluate irreversible incorporation of oxygen into a standard reference material (SRM) by direct measurement of oxygen by FNAA. That project required determination of oxygen in bottles of the SRM stored in varying environmental conditions and a comparison of the results. We recognized the need to accurately describe the total uncertainty of the measurements in order to characterize any differences in the resulting average concentrations. It is our intent here to discuss the breadth of parameters that can contribute to the random and nonrandom errors of the method and to provide estimates of the magnitude of uncertainty introduced. In addition, we will discuss the steps taken in this recent FNAA project to control quality, assess the uncertainty of the measurements, and evaluate results based on the statistical reproducibility.
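
    When the individual error sources are independent, their relative standard uncertainties combine in quadrature into a combined standard uncertainty, GUM-style. The component names and values below are illustrative assumptions, not the SRM study's actual uncertainty budget.

```python
import math

# Illustrative relative standard uncertainties for an FNAA measurement
# (component names and values are assumptions, not the study's budget)
components = {
    "counting statistics": 0.015,
    "neutron flux monitoring": 0.010,
    "sample geometry": 0.008,
    "standard concentration": 0.005,
}

# Independent components combine in quadrature (root sum of squares)
u_combined = math.sqrt(sum(u * u for u in components.values()))
print(f"combined relative standard uncertainty: {u_combined:.4f}")
```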

  14. Time of Concentration equations: the role of morphometric uncertainties in flood risk analysis and management

    Science.gov (United States)

    Martins, Luciano; Díez-Herrero, Andrés; Bodoque, Jose M.; Bateira, Carlos

    2016-04-01

    The perception of flood risk by the authorities responsible for flood disaster management and mitigation strategies should be based on an overall evaluation of the uncertainties associated with the procedures for risk assessment and map production. This contribution presents the results of a mapping evaluation of the time of concentration (tc). This parameter reflects the time at which a watershed responds to rainfall events; it is the most frequently utilized time parameter and is of great importance in many hydrologic analyses. Accurate estimates of tc are very important: if tc is under-estimated, the result is an over-estimated peak discharge, and vice versa, producing significant variations in the flooded areas, with potentially important consequences for land use and occupation of the territory, as well as for flood risk management itself. The methodology evaluates 20 different empirical, semi-empirical and kinematic equations for calculating tc at different cartographic scales (1:200000; 1:100000; 1:25000; LiDAR 5x5 m & 1x1 m) in two hydrographic basins with distinct dimensions and geomorphological characteristics, located in the Gredos mountain range (Spain). The results suggest that changes in cartographic scale do not have as significant an influence as one might expect. The most important variations arise from the characteristics of the equations themselves, which use different morphometric parameters in the calculations: some are based purely on geomorphological criteria, while others emphasize the hydraulic characteristics of the channels, resulting in very different tc values. However, we highlight the role of cartographic scale particularly in the application of semi-empirical equations that take into account changes in land use and occupation. In this case, the determination of parameters such as the flow coefficient, curve number and roughness coefficient is very sensitive to the cartographic scale. Sensitivity analysis
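
    Two commonly cited empirical formulas, Kirpich (1940) and the Témez equation widely applied in Spain, already show how strongly the choice of equation can dominate the estimate. The basin values below are hypothetical; units follow the usual statements of the two formulas.

```python
def tc_kirpich(length_m, slope):
    """Kirpich (1940): tc in minutes, channel length in m, slope in m/m."""
    return 0.0195 * length_m ** 0.77 * slope ** -0.385

def tc_temez(length_km, slope):
    """Temez: tc in hours, channel length in km, slope in m/m."""
    return 0.3 * (length_km / slope ** 0.25) ** 0.76

# Hypothetical basin: 5 km main channel with a 4% mean slope
L, S = 5000.0, 0.04
print(f"Kirpich: {tc_kirpich(L, S):.0f} min")
print(f"Temez:   {tc_temez(L / 1000.0, S) * 60.0:.0f} min")
```

    For this single basin the two formulas differ by roughly a factor of two, which propagates directly into the peak discharge and the mapped flood extent.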

  15. The shape of uncertainty: underwriting decisions in the face of catastrophic risk

    International Nuclear Information System (INIS)

    Keykhah, M.

    1998-01-01

    This paper will explore how insurance and re-insurance underwriters price catastrophe risk from natural perils. It will first describe the theoretical nature of pricing risk, and outline studies of underwriting that propose analyzing decision making from a behavioral rather than a rational-choice perspective. The paper then argues that in order to provide the appropriate context for probability (which is the focus of the studies on decision making under uncertainty), it may be helpful to look at the nature of choice within a market and organizational context. Moreover, the nature of probability itself is explored with a view to constructing a broader analysis. Finally, it will be argued that the causal framework of the underwriter, in addition to inductive reasoning, gives shape to uncertainty. (author)

  16. Tutorial: Parallel Computing of Simulation Models for Risk Analysis.

    Science.gov (United States)

    Reilly, Allison C; Staid, Andrea; Gao, Michael; Guikema, Seth D

    2016-10-01

    Simulation models are widely used in risk analysis to study the effects of uncertainties on outcomes of interest in complex problems. Often, these models are computationally complex and time consuming to run. This latter point may be at odds with time-sensitive evaluations or may limit the number of parameters that are considered. In this article, we give an introductory tutorial focused on parallelizing simulation code to better leverage modern computing hardware, enabling risk analysts to better utilize simulation-based methods for quantifying uncertainty in practice. This article is aimed primarily at risk analysts who use simulation methods but do not yet utilize parallelization to decrease the computational burden of these models. The discussion is focused on conceptual aspects of embarrassingly parallel computer code and software considerations. Two complementary examples are shown using the languages MATLAB and R. A brief discussion of hardware considerations is located in the Appendix. © 2016 Society for Risk Analysis.
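
    The article's examples use MATLAB and R; the same embarrassingly parallel pattern can be sketched in Python with the stdlib `multiprocessing` module: split independent Monte Carlo batches across worker processes and aggregate the results. The loss model and its parameters below are invented for illustration.

```python
import multiprocessing as mp
import random

def simulate_batch(args):
    """One independent batch of a toy Monte Carlo loss model."""
    seed, n = args
    rng = random.Random(seed)   # per-batch seed keeps batches independent
    exceedances = sum(1 for _ in range(n) if rng.gauss(100.0, 30.0) > 160.0)
    return exceedances, n

if __name__ == "__main__":
    batches = [(seed, 20_000) for seed in range(4)]
    with mp.Pool(processes=2) as pool:         # batches run in parallel
        results = pool.map(simulate_batch, batches)
    exceed = sum(e for e, _ in results)
    total = sum(n for _, n in results)
    print(f"P(loss > 160) ~ {exceed / total:.4f}")
```

    Because the batches share no state, speed-up scales with the number of workers up to the core count, which is what makes such simulations "embarrassingly" parallel.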

  17. Summary of the CEC/USDOE workshop on uncertainty analysis

    International Nuclear Information System (INIS)

    Elderkin, C.E.; Kelly, G.N.

    1990-06-01

    There is uncertainty in all aspects of assessing the consequences of accidental releases of radioactive material, from understanding and describing the environmental and biological transfer processes to modeling emergency response. The need for an exchange of views and a comparison of approaches between the diverse disciplines led to the organization of a CEC/USDOE Workshop on Uncertainty Analysis held in Santa Fe, New Mexico, in November 1989. The workshop brought together specialists in a number of disciplines, including those expert in the mathematics and statistics of uncertainty analysis, in expert judgment elicitation and evaluation, and in all aspects of assessing the radiological and environmental consequences of accidental releases of radioactive material. In addition, there was participation from users of the output of accident consequence assessment in decision making and/or regulatory frameworks. The main conclusions that emerged from the workshop are summarized in this paper. These are discussed in the context of three different types of accident consequence assessment: probabilistic assessments of accident consequences undertaken as inputs to risk analyses of nuclear installations, assessments of accident consequences in real time to provide inputs to decisions on the introduction of countermeasures, and the reconstruction of doses and risks resulting from past releases of radioactive material

  18. Scientific uncertainties associated with risk assessment of radiation

    International Nuclear Information System (INIS)

    Hubert, P.; Fagnani, F.

    1989-05-01

    The proper use and interpretation of data pertaining to biological effects of ionizing radiations is based on a continuous effort to discuss the various assumptions and uncertainties in the process of risk assessment. In this perspective, it has been considered useful by the Committee to review critically the general scientific foundations that constitute the basic framework of data for the evaluation of health effects of radiation. This review is an attempt to identify the main sources of uncertainties, to give, when possible, an order of magnitude for their relative importance, and to clarify the principal interactions between the different steps of the process of risk quantification. The discussion has been restricted to stochastic effects and especially to cancer induction in man: observations at the cellular levels and animal and in vitro experiments have not been considered. The consequences which might result from abandoning the hypothesis of linearity have not been directly examined in this draft, especially in respect to the concept of collective dose. Since another document dealing with 'Dose-response relationships for radiation-induced cancer' is in preparation, an effort has been made to avoid any overlap by making reference to that document whenever necessary

  19. Scientific uncertainties associated with risk assessment of radiation

    Energy Technology Data Exchange (ETDEWEB)

    Hubert, P; Fagnani, F

    1989-05-01

    The proper use and interpretation of data pertaining to biological effects of ionizing radiations is based on a continuous effort to discuss the various assumptions and uncertainties in the process of risk assessment. In this perspective, it has been considered useful by the Committee to review critically the general scientific foundations that constitute the basic framework of data for the evaluation of health effects of radiation. This review is an attempt to identify the main sources of uncertainties, to give, when possible, an order of magnitude for their relative importance, and to clarify the principal interactions between the different steps of the process of risk quantification. The discussion has been restricted to stochastic effects and especially to cancer induction in man: observations at the cellular levels and animal and in vitro experiments have not been considered. The consequences which might result from abandoning the hypothesis of linearity have not been directly examined in this draft, especially in respect to the concept of collective dose. Since another document dealing with 'Dose-response relationships for radiation-induced cancer' is in preparation, an effort has been made to avoid any overlap by making reference to that document whenever necessary.

  20. Cost Risk Analysis Based on Perception of the Engineering Process

    Science.gov (United States)

    Dean, Edwin B.; Wood, Darrell A.; Moore, Arlene A.; Bogart, Edward H.

    1986-01-01

    In most cost estimating applications at the NASA Langley Research Center (LaRC), it is desirable to present predicted cost as a range of possible costs rather than a single predicted cost. A cost risk analysis generates a range of cost for a project and assigns a probability level to each cost value in the range. Constructing a cost risk curve requires a good estimate of the expected cost of a project. It must also include a good estimate of the expected variance of the cost. Many cost risk analyses are based upon an expert's knowledge of the cost of similar projects in the past. In a common scenario, a manager or engineer, asked to estimate the cost of a project in his area of expertise, will gather historical cost data from a similar completed project. The cost of the completed project is adjusted using the perceived technical and economic differences between the two projects. This allows errors from at least three sources: the historical cost data may be in error by some unknown amount; the manager's evaluation of the new project and its similarity to the old project may be in error; and the factors used to adjust the cost of the old project may not correctly reflect the differences. Some risk analyses are based on untested hypotheses about the form of the statistical distribution that underlies the distribution of possible cost. The usual problem is not just to come up with an estimate of the cost of a project, but to predict the range of values into which the cost may fall, and with what level of confidence the prediction is made. Risk analysis techniques that assume the shape of the underlying cost distribution and derive the risk curve from a single estimate plus and minus some amount usually fail to take into account the actual magnitude of the uncertainty in cost due to technical factors in the project itself. This paper addresses a cost risk method that is based on parametric estimates of the technical factors involved in the project being costed. The engineering …
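
    The approach the abstract describes, deriving a cost risk curve from Monte Carlo sampling over uncertain technical factors rather than from an assumed distribution shape, can be sketched as follows. Everything concrete here (the factor names, their distributions, and the 0.05 coefficient) is a hypothetical illustration, not the paper's actual cost model.

    ```python
    import random

    def simulate_costs(n=10_000, seed=42):
        """Monte Carlo cost risk: sample uncertain technical factors and
        roll them through a purely hypothetical parametric cost model."""
        rng = random.Random(seed)
        costs = []
        for _ in range(n):
            mass = rng.gauss(1200.0, 150.0)         # kg, uncertain design mass
            complexity = rng.uniform(0.8, 1.4)      # dimensionless maturity factor
            costs.append(0.05 * mass * complexity)  # $M, toy cost relationship
        costs.sort()
        return costs

    def risk_curve(costs, levels=(0.5, 0.7, 0.9)):
        """Cost not exceeded at each confidence level (the cost risk curve)."""
        return {p: costs[int(p * (len(costs) - 1))] for p in levels}

    curve = risk_curve(simulate_costs())
    # curve[0.9] is the cost the project stays under with ~90% confidence
    ```

    Because the spread of the curve comes from the sampled technical factors themselves, the resulting range reflects the actual magnitude of technical uncertainty instead of a fixed plus/minus band.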

  1. Risk-informed regulation: handling uncertainty for a rational management of safety

    International Nuclear Information System (INIS)

    Zio, Enrico

    2008-01-01

    A risk-informed regulatory approach implies that risk insights be used to supplement deterministic information for safety decision-making purposes. In this view, the use of risk assessment techniques is expected to lead to improved safety and a more rational allocation of the limited resources available. On the other hand, it is recognized that uncertainties affect both the deterministic safety analyses and the risk assessments. For the risk-informed decision-making process to be effective, adequate representation and treatment of such uncertainties is mandatory. In this paper, the risk-informed regulatory framework is considered under the focus of the uncertainty issue. Traditionally, probability theory has provided the language and mathematics for the representation and treatment of uncertainty. More recently, other mathematical structures have been introduced. In particular, the Dempster-Shafer theory of evidence is here illustrated as a generalized framework encompassing probability theory and possibility theory. The special case of probability theory is only addressed as a term of comparison, given that it is a well-known subject, whereas the special case of possibility theory is amply illustrated. An example of the combination of probability and possibility for treating the uncertainty in the parameters of an event tree is presented.

  2. Risk-based classification system of nanomaterials

    Energy Technology Data Exchange (ETDEWEB)

    Tervonen, Tommi, E-mail: t.p.tervonen@rug.n [University of Groningen, Faculty of Economics and Business (Netherlands); Linkov, Igor, E-mail: igor.linkov@usace.army.mi [US Army Research and Development Center (United States); Figueira, Jose Rui, E-mail: figueira@ist.utl.p [Technical University of Lisbon, CEG-IST, Centre for Management Studies, Instituto Superior Tecnico (Portugal); Steevens, Jeffery, E-mail: jeffery.a.steevens@usace.army.mil; Chappell, Mark, E-mail: mark.a.chappell@usace.army.mi [US Army Research and Development Center (United States); Merad, Myriam, E-mail: myriam.merad@ineris.f [INERIS BP 2, Societal Management of Risks Unit/Accidental Risks Division (France)

    2009-05-15

    Various stakeholders are increasingly interested in the potential toxicity and other risks associated with nanomaterials throughout the different stages of a product's life cycle (e.g., development, production, use, disposal). Risk assessment methods and tools developed and applied to chemical and biological materials may not be readily adaptable for nanomaterials because of the current uncertainty in identifying the relevant physico-chemical and biological properties that adequately describe the materials. Such uncertainty is further driven by the substantial variations in the properties of the original material due to variable manufacturing processes employed in nanomaterial production. To guide scientists and engineers in nanomaterial research and application as well as to promote the safe handling and use of these materials, we propose a decision support system for classifying nanomaterials into different risk categories. The classification system is based on a set of performance metrics that measure both the toxicity and physico-chemical characteristics of the original materials, as well as the expected environmental impacts through the product life cycle. Stochastic multicriteria acceptability analysis (SMAA-TRI), a formal decision analysis method, was used as the foundation for this task. This method allowed us to cluster various nanomaterials in different ecological risk categories based on our current knowledge of nanomaterial physico-chemical characteristics, variation in produced material, and best professional judgments. SMAA-TRI uses Monte Carlo simulations to explore all feasible values for weights, criteria measurements, and other model parameters to assess the robustness of nanomaterial grouping for risk management purposes.
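
    The core of an SMAA-TRI-style analysis is a Monte Carlo loop over feasible criterion weights, recording how often an alternative lands in each category (its category acceptability index). The following is a toy sketch of that idea only; the criteria, thresholds, and the additive scoring rule are illustrative assumptions, not the actual SMAA-TRI (ELECTRE TRI based) procedure used in the paper.

    ```python
    import random

    def category_acceptability(criteria_scores, thresholds, n=5000, seed=1):
        """Sample criterion weights uniformly on the simplex and count how
        often the weighted score falls into each risk category."""
        rng = random.Random(seed)
        k = len(criteria_scores)
        counts = [0] * (len(thresholds) + 1)
        for _ in range(n):
            # uniform random weights summing to 1 (stick-breaking)
            cuts = sorted(rng.random() for _ in range(k - 1))
            weights = [b - a for a, b in zip([0.0] + cuts, cuts + [1.0])]
            score = sum(w * s for w, s in zip(weights, criteria_scores))
            category = sum(score >= t for t in thresholds)  # 0 = lowest risk
            counts[category] += 1
        return [c / n for c in counts]

    # hypothetical toxicity / persistence / exposure scores in [0, 1]
    acc = category_acceptability([0.9, 0.4, 0.7], thresholds=[0.3, 0.6])
    ```

    A nanomaterial whose acceptability mass concentrates in one category can be classified robustly; mass spread across categories signals that the grouping is sensitive to the unknown weights.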

  3. Uncertainty propagation in probabilistic risk assessment: A comparative study

    International Nuclear Information System (INIS)

    Ahmed, S.; Metcalf, D.R.; Pegram, J.W.

    1982-01-01

    Three uncertainty propagation techniques generally used in probabilistic risk assessment, namely the method of moments, discrete probability distributions (DPD), and Monte Carlo simulation, are compared and conclusions drawn in terms of the accuracy of the results. For small uncertainty in the basic event unavailabilities, the three methods give similar results. For large uncertainty, the method of moments is in error, and the appropriate approach is to propagate uncertainty in discrete form, either by the DPD method without sampling or by Monte Carlo. (orig.)
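
    The failure mode of the method of moments for large uncertainty can be reproduced in a few lines. A sketch under common PRA conventions: basic events are log-normal with a given error factor EF = q95/q50, and the top event is an AND gate (product) of two identical independent events. The moments route matches mean and variance and reads percentiles off a normal; the exact route uses the fact that a product of log-normals is log-normal.

    ```python
    import math

    def lognormal_moments(median, ef):
        """Mean and variance of a log-normal from its median and error factor."""
        sigma = math.log(ef) / 1.645
        mean = median * math.exp(sigma ** 2 / 2)
        var = (math.exp(sigma ** 2) - 1) * mean ** 2
        return mean, var

    def moments_q95(median, ef, n_events=2):
        """Method of moments for an AND gate of n identical independent events:
        multiply means, use the exact product-variance formula, then read the
        95th percentile off a normal with those two moments."""
        m, v = lognormal_moments(median, ef)
        mean_top = m ** n_events
        var_top = (v + m ** 2) ** n_events - m ** (2 * n_events)
        return mean_top + 1.645 * math.sqrt(var_top)

    def exact_q95(median, ef, n_events=2):
        """Exact 95th percentile: log-variances of independent factors add."""
        sigma_top = math.sqrt(n_events) * math.log(ef) / 1.645
        return median ** n_events * math.exp(1.645 * sigma_top)

    small = moments_q95(1e-3, 1.2) / exact_q95(1e-3, 1.2)    # close to 1
    large = moments_q95(1e-3, 10.0) / exact_q95(1e-3, 10.0)  # overshoots badly
    ```

    For EF = 1.2 the two 95th percentiles agree within a few percent; for EF = 10 the normal approximation overstates the percentile severalfold, which is the abstract's point in favor of DPD or Monte Carlo.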

  4. Risk analysis

    International Nuclear Information System (INIS)

    Baron, J.H.; Nunez McLeod, J.; Rivera, S.S.

    1997-01-01

    This book contains a selection of research works performed in the CEDIAC Institute (Cuyo National University) in the area of risk analysis, with specific orientation to the subjects of uncertainty and sensitivity studies, software reliability, severe accident modeling, etc. This volume presents important material for all those researchers who want an insight into the risk analysis field as a tool for solving several problems frequently found in engineering and the applied sciences, as well as for academic teachers who want to keep up to date with the new developments and improvements continuously arising in this field.

  5. Unquestioned answers or unanswered questions: beliefs about science guide responses to uncertainty in climate change risk communication.

    Science.gov (United States)

    Rabinovich, Anna; Morton, Thomas A

    2012-06-01

    In two experimental studies we investigated the effect of beliefs about the nature and purpose of science (classical vs. Kuhnian models of science) on responses to uncertainty in scientific messages about climate change risk. The results revealed a significant interaction between both measured (Study 1) and manipulated (Study 2) beliefs about science and the level of communicated uncertainty on willingness to act in line with the message. Specifically, messages that communicated high uncertainty were more persuasive for participants who shared an understanding of science as debate than for those who believed that science is a search for absolute truth. In addition, participants who had a concept of science as debate were more motivated by higher (rather than lower) uncertainty in climate change messages. The results suggest that achieving alignment between the general public's beliefs about science and the style of the scientific messages is crucial for successful risk communication in science. Accordingly, rather than uncertainty always undermining the effectiveness of science communication, uncertainty can enhance message effects when it fits the audience's understanding of what science is. © 2012 Society for Risk Analysis.

  6. Deterministic sensitivity and uncertainty analysis for large-scale computer models

    International Nuclear Information System (INIS)

    Worley, B.A.; Pin, F.G.; Oblow, E.M.; Maerker, R.E.; Horwedel, J.E.; Wright, R.Q.

    1988-01-01

    This paper presents a comprehensive approach to sensitivity and uncertainty analysis of large-scale computer models that is analytic (deterministic) in principle and that is firmly based on the model equations. The theory and application of two systems based upon computer calculus, GRESS and ADGEN, are discussed relative to their role in calculating model derivatives and sensitivities without a prohibitive initial manpower investment. Storage and computational requirements for these two systems are compared for a gradient-enhanced version of the PRESTO-II computer model. A Deterministic Uncertainty Analysis (DUA) method that retains the characteristics of analytically computing result uncertainties based upon parameter probability distributions is then introduced and results from recent studies are shown. 29 refs., 4 figs., 1 tab
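
    The deterministic route the abstract describes rests on a simple identity: given analytic sensitivities dy/dp (the quantities GRESS/ADGEN-style computer calculus generates automatically for a large code), a first-order uncertainty estimate for independent parameters is Var(y) ≈ Σᵢ (∂y/∂pᵢ)² Var(pᵢ), with no sampling. A minimal sketch on a toy model (the model and its parameter values are assumptions for illustration):

    ```python
    import math

    def model(p):
        """Toy response: y = p1 * exp(-p2) + p3."""
        p1, p2, p3 = p
        return p1 * math.exp(-p2) + p3

    def sensitivities(p):
        """Analytic derivatives dy/dp_i, written by hand here; for a
        large code these come from automated differentiation."""
        p1, p2, p3 = p
        return [math.exp(-p2), -p1 * math.exp(-p2), 1.0]

    def dua_variance(p, sigmas):
        """First-order deterministic uncertainty analysis:
        Var(y) ~= sum_i (dy/dp_i)^2 * Var(p_i) for independent parameters."""
        return sum((s * sig) ** 2 for s, sig in zip(sensitivities(p), sigmas))

    p = [2.0, 1.0, 0.5]
    var_y = dua_variance(p, sigmas=[0.1, 0.2, 0.05])
    ```

    The derivative evaluation replaces the many model runs a sampling method would need, which is the storage/computation trade-off the paper quantifies for PRESTO-II.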

  7. Exploiting risk-reward structures in decision making under uncertainty.

    Science.gov (United States)

    Leuker, Christina; Pachur, Thorsten; Hertwig, Ralph; Pleskac, Timothy J

    2018-06-01

    People often have to make decisions under uncertainty-that is, in situations where the probabilities of obtaining a payoff are unknown or at least difficult to ascertain. One solution to this problem is to infer the probability from the magnitude of the potential payoff and thus exploit the inverse relationship between payoffs and probabilities that occurs in many domains in the environment. Here, we investigated how the mind may implement such a solution: (1) Do people learn about risk-reward relationships from the environment-and if so, how? (2) How do learned risk-reward relationships impact preferences in decision-making under uncertainty? Across three experiments (N = 352), we found that participants can learn risk-reward relationships from being exposed to choice environments with a negative, positive, or uncorrelated risk-reward relationship. They were able to learn the associations both from gambles with explicitly stated payoffs and probabilities (Experiments 1 & 2) and from gambles about epistemic events (Experiment 3). In subsequent decisions under uncertainty, participants often exploited the learned association by inferring probabilities from the magnitudes of the payoffs. This inference systematically influenced their preferences under uncertainty: Participants who had been exposed to a negative risk-reward relationship tended to prefer the uncertain option over a smaller sure option for low payoffs, but not for high payoffs. This pattern reversed in the positive condition and disappeared in the uncorrelated condition. This adaptive change in preferences is consistent with the use of the risk-reward heuristic. Copyright © 2018 Elsevier B.V. All rights reserved.

  8. A rigorous methodology for development and uncertainty analysis of group contribution based property models

    DEFF Research Database (Denmark)

    Frutiger, Jerome; Abildskov, Jens; Sin, Gürkan

    … weighted-least-square regression. 3) Initialization of the estimation by use of linear algebra, providing a first guess. 4) Sequential and simultaneous GC parameter estimation using four different minimization algorithms. 5) Thorough uncertainty analysis: (a) based on asymptotic approximation of the parameter covariance matrix; (b) based on the bootstrap method; both providing 95%-confidence intervals of the parameters and the predicted property. 6) Performance statistics analysis and model application. The application of the methodology is shown for a new GC model built to predict the lower flammability limit (LFL) for refrigerants … their credibility and robustness in wider industrial and scientific applications.
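
    Step 5b above, bootstrap confidence intervals for regressed parameters, can be sketched generically: refit the model on resampled data and take percentiles of the refitted parameter. The one-parameter model and the data below are hypothetical stand-ins for a GC property regression.

    ```python
    import random

    def fit_slope(xs, ys):
        """Least-squares slope through the origin: b = sum(x*y) / sum(x*x)."""
        return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

    def bootstrap_ci(xs, ys, n_boot=2000, seed=3):
        """Bootstrap 95% confidence interval for the fitted slope:
        resample (x, y) pairs with replacement, refit, take percentiles."""
        rng = random.Random(seed)
        n = len(xs)
        slopes = []
        for _ in range(n_boot):
            idx = [rng.randrange(n) for _ in range(n)]
            slopes.append(fit_slope([xs[i] for i in idx],
                                    [ys[i] for i in idx]))
        slopes.sort()
        return slopes[int(0.025 * n_boot)], slopes[int(0.975 * n_boot)]

    xs = [1, 2, 3, 4, 5, 6]
    ys = [2.1, 3.9, 6.2, 8.0, 9.8, 12.3]   # roughly y = 2x plus noise
    lo, hi = bootstrap_ci(xs, ys)
    ```

    Unlike the asymptotic covariance route (step 5a), the bootstrap makes no normality assumption about the parameter estimate, at the cost of many refits.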

  9. Uncertainty analysis of the radiological characteristics of radioactive waste using a method based on log-normal distributions

    International Nuclear Information System (INIS)

    Gigase, Yves

    2007-01-01

    Available in abstract form only. Full text of publication follows: The uncertainty on characteristics of radioactive LILW waste packages is difficult to determine and often very large. This results from a lack of knowledge of the constitution of the waste package and of the composition of the radioactive sources inside. To calculate a quantitative estimate of the uncertainty on a characteristic of a waste package, one has to combine these various uncertainties. This paper discusses an approach to this problem, based on the use of the log-normal distribution, which is both elegant and easy to use. It can, for example, provide quantitative estimates of uncertainty intervals that 'make sense'. The purpose is to develop a pragmatic approach that can be integrated into existing characterization methods. In this paper we show how our method can be applied to the scaling factor method. We also explain how it can be used when estimating other, more complex characteristics such as the total uncertainty of a collection of waste packages. This method could have applications in radioactive waste management, in particular in those decision processes where the uncertainty on the amount of activity is considered to be important, such as in probabilistic risk assessment or the definition of criteria for acceptance or categorization. (author)
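
    The algebra that makes the log-normal approach convenient: for independent multiplicative factors, log-variances add, so a chain of uncertain factors is again log-normal with a computable median and error factor. A sketch with a hypothetical scaling-factor chain (the three factors and their values are invented for illustration; the EF = q95/q50 convention is assumed):

    ```python
    import math

    def combine_lognormal(factors):
        """Combine independent multiplicative uncertainty factors, each given
        as (median, error_factor) of a log-normal. Medians multiply and
        log-variances add, so the product is again log-normal."""
        mu = sum(math.log(m) for m, _ in factors)
        var = sum((math.log(ef) / 1.645) ** 2 for _, ef in factors)
        ef_total = math.exp(1.645 * math.sqrt(var))
        return math.exp(mu), ef_total   # median and error factor of product

    # hypothetical chain: key-nuclide inventory * scaling factor * recovery
    median, ef = combine_lognormal([(100.0, 2.0), (0.05, 3.0), (0.9, 1.5)])
    ```

    Note that the combined error factor exceeds the largest single factor but is far below their product, which is why such intervals tend to 'make sense'.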

  10. Sensitivity Analysis of Uncertainty Parameter based on MARS-LMR Code on SHRT-45R of EBR II

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Seok-Ju; Kang, Doo-Hyuk; Seo, Jae-Seung [System Engineering and Technology Co., Daejeon (Korea, Republic of); Bae, Sung-Won [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Jeong, Hae-Yong [Sejong University, Seoul (Korea, Republic of)

    2016-10-15

    In order to assess the uncertainty quantification of the MARS-LMR code, the code has been improved by modifying the source code to accommodate the calculation process required for uncertainty quantification. In the present study, a transient of Unprotected Loss of Flow (ULOF) is selected as a typical case of an Anticipated Transient Without Scram (ATWS), which belongs to the DEC category. The MARS-LMR input generation for EBR-II SHRT-45R and execution works are performed by using the PAPIRUS program. The sensitivity analysis is carried out with the uncertainty parameters of the MARS-LMR code for EBR-II SHRT-45R. Based on the results of the sensitivity analysis, dominant parameters with large sensitivity to the FoM are picked out. The dominant parameters selected are closely related to the development process of the ULOF event.

  11. Uncertainty analysis of environmental models

    International Nuclear Information System (INIS)

    Monte, L.

    1990-01-01

    In the present paper an evaluation of the output uncertainty of an environmental model for assessing the transfer of 137Cs and 131I in the human food chain is carried out on the basis of a statistical analysis of data reported in the literature. The uncertainty analysis offers the opportunity to obtain valuable information about the uncertainty of models predicting the migration of non-radioactive substances in the environment, mainly in relation to dry and wet deposition.

  12. The role of social cost-benefit analysis in societal decision-making under large uncertainties with application to robbery at a cash depot

    International Nuclear Information System (INIS)

    Jones-Lee, M.; Aven, T.

    2009-01-01

    Social cost-benefit analysis is a well-established method for guiding decisions about safety investments, particularly in situations in which it is possible to make accurate predictions of future performance. However, its direct applicability to situations involving large degrees of uncertainty is less obvious and this raises the question of the extent to which social cost-benefit analysis can provide a useful input to the decision framework that has been explicitly developed to deal with safety decisions in which uncertainty is a major factor, namely risk analysis. This is the main focus of the arguments developed in this paper. In particular, we provide new insights by examining the fundamentals of both approaches and our principal conclusion is that social cost-benefit analysis and risk analysis represent complementary input bases to the decision-making process, and even in the case of large uncertainties social cost-benefit analysis may provide very useful decision support. What is required is the establishment of a proper contextual framework which structures and gives adequate weight to the uncertainties. An application to the possibility of a robbery at a cash depot is examined as a practical example.

  13. Allocating risk capital for a brownfields redevelopment project under hydrogeological and financial uncertainty.

    Science.gov (United States)

    Yu, Soonyoung; Unger, Andre J A; Parker, Beth; Kim, Taehee

    2012-06-15

    In this study, we defined risk capital as the contingency fee or insurance premium that a brownfields redeveloper needs to set aside from the sale of each house in case they need to repurchase it at a later date because the indoor air has been detrimentally affected by subsurface contamination. The likelihood that indoor air concentrations will exceed a regulatory level subject to subsurface heterogeneity and source zone location uncertainty is simulated by a physics-based hydrogeological model using Monte Carlo realizations, yielding the probability of failure. The cost of failure is the future value of the house indexed to the stochastic US National Housing index. The risk capital is essentially the probability of failure times the cost of failure with a surcharge to compensate the developer against hydrogeological and financial uncertainty, with the surcharge acting as safety loading reflecting the developers' level of risk aversion. We review five methodologies taken from the actuarial and financial literature to price the risk capital for a highly stylized brownfield redevelopment project, with each method specifically adapted to accommodate our notion of the probability of failure. The objective of this paper is to develop an actuarially consistent approach for combining the hydrogeological and financial uncertainty into a contingency fee that the brownfields developer should reserve (i.e. the risk capital) in order to hedge their risk exposure during the project. Results indicate that the price of the risk capital is much more sensitive to hydrogeological rather than financial uncertainty. We use the Capital Asset Pricing Model to estimate the risk-adjusted discount rate to depreciate all costs to present value for the brownfield redevelopment project. A key outcome of this work is that the presentation of our risk capital valuation methodology is sufficiently generalized for application to a wide variety of engineering projects. Copyright © 2012 Elsevier
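
    The paper's central quantity, risk capital as probability of failure times cost of failure plus a surcharge, can be illustrated with one of the simplest actuarial pricing rules, the expected-value premium principle π = (1 + θ)·E[L]. This is a deliberately stylized stand-in for the five methodologies the paper reviews; the failure probability, house price, index parameters, and loading θ below are all invented for illustration.

    ```python
    import math
    import random

    def risk_capital(p_fail, house_price, theta=0.25, n=50_000, seed=11):
        """Expected-value premium principle: pi = (1 + theta) * E[L], where
        L is the repurchase loss. p_fail is the hydrogeological probability
        that indoor air exceeds the regulatory level; financial uncertainty
        enters through a toy log-normal housing index; the safety loading
        theta encodes the developer's risk aversion."""
        rng = random.Random(seed)
        total_loss = 0.0
        for _ in range(n):
            if rng.random() < p_fail:                    # failure event
                index = math.exp(rng.gauss(0.03, 0.10))  # uncertain growth
                total_loss += house_price * index        # cost of failure
        return (1 + theta) * total_loss / n

    capital = risk_capital(0.05, 300_000.0)   # per-house contingency fee
    ```

    The structure makes the paper's sensitivity finding intuitive: the premium scales linearly with p_fail (the hydrogeological input) while the financial index only perturbs the loss magnitude around its mean.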

  14. Uncertainty and sensitivity analysis on probabilistic safety assessment of an experimental facility

    International Nuclear Information System (INIS)

    Burgazzi, L.

    2000-01-01

    The aim of this work is to perform an uncertainty and sensitivity analysis on the probabilistic safety assessment of the International Fusion Materials Irradiation Facility (IFMIF), in order to assess the effect on the final risk values of the uncertainties associated with the generic data used for the initiating events and component reliability, and to identify the key quantities contributing to this uncertainty. The analysis is conducted on the expected frequency calculated for the accident sequences, defined through event tree (ET) modeling. This lends credibility to the ET model quantification, allows frequency distributions for the occurrence of events to be calculated and, consequently, an assessment of whether sequences have been correctly selected from the probability standpoint, and finally a verification of the fulfillment of the safety conditions. Uncertainty and sensitivity analyses are performed using Monte Carlo sampling and an importance parameter technique, respectively. (author)
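
    Monte Carlo propagation through one event-tree sequence can be sketched generically: the sequence frequency is the initiating-event frequency times the branch failure probabilities, each sampled from its generic-data distribution. The log-normal form, the common error factor, and the numeric values below are illustrative assumptions, not IFMIF data.

    ```python
    import math
    import random

    def sequence_frequency_samples(ie_freq, branch_probs, ef=3.0,
                                   n=10_000, seed=5):
        """Propagate generic-data uncertainty through one ET sequence:
        frequency = IE frequency * product of branch failure probabilities,
        each log-normal about its point value with error factor ef."""
        rng = random.Random(seed)
        sigma = math.log(ef) / 1.645
        samples = []
        for _ in range(n):
            f = ie_freq * math.exp(rng.gauss(0.0, sigma))
            for p in branch_probs:
                f *= p * math.exp(rng.gauss(0.0, sigma))
            samples.append(f)
        samples.sort()
        return samples

    freqs = sequence_frequency_samples(1e-2, [1e-2, 5e-3])
    median = freqs[len(freqs) // 2]   # point estimate is 1e-2*1e-2*5e-3
    ```

    Comparing such frequency distributions, rather than point values, against the screening criteria is what supports the sequence-selection and safety-condition checks the abstract mentions.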

  15. Stakeholder-driven multi-attribute analysis for energy project selection under uncertainty

    International Nuclear Information System (INIS)

    Read, Laura; Madani, Kaveh; Mokhtari, Soroush; Hanks, Catherine

    2017-01-01

    In practice, selecting an energy project for development requires balancing criteria and competing stakeholder priorities to identify the best alternative. Energy source selection can be modeled as a multi-criteria decision-making problem to provide quantitative support to reconcile technical, economic, environmental, social, and political factors with respect to the stakeholders' interests. Decision making among these complex interactions should also account for the uncertainty present in the input data. In response, this work develops a stochastic decision analysis framework to evaluate alternatives by involving stakeholders to identify both quantitative and qualitative selection criteria and performance metrics which carry uncertainties. The developed framework is illustrated using a case study from Fairbanks, Alaska, where decision makers and residents must decide on a new source of energy for heating and electricity. We approach this problem in a five-step methodology: (1) engaging experts (role players) to develop criteria of project performance; (2) collecting a range of quantitative and qualitative input information to determine the performance of each proposed solution according to the selected criteria; (3) performing a Monte-Carlo analysis to capture uncertainties given in the inputs; (4) applying multi-criteria decision-making, social choice (voting), and fallback bargaining methods to account for three different levels of cooperation among the stakeholders; and (5) computing an aggregate performance index (API) score for each alternative based on its performance across criteria and cooperation levels. API scores communicate relative performance between alternatives. In this way, our methodology maps uncertainty from the input data to reflect risk in the decision and incorporates varying degrees of cooperation into the analysis to identify an optimal and practical alternative. - Highlights: • We develop an applicable stakeholder-driven framework for …

  16. Fuzzy Uncertainty Evaluation for Fault Tree Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ki Beom; Shim, Hyung Jin [Seoul National University, Seoul (Korea, Republic of); Jae, Moo Sung [Hanyang University, Seoul (Korea, Republic of)

    2015-05-15

    This traditional probabilistic approach can calculate relatively accurate results. However, it requires a long time because of the repetitive computation the MC method entails. In addition, when informative data for statistical analysis are not sufficient, or when some events are mainly caused by human error, the probabilistic approach may not be possible, because the uncertainties of these events are difficult to express as probability distributions. In order to reduce the computation time and quantify the uncertainties of top events when there are basic events whose uncertainties cannot be expressed as probability distributions, fuzzy uncertainty propagation based on fuzzy set theory can be applied. In this paper, we develop a fuzzy uncertainty propagation code and apply it to the fault tree of the core damage accident after a large loss of coolant accident (LLOCA). The code is first implemented and tested on the fault tree of a radiation release accident. We then apply it to the fault tree of the core damage accident after the LLOCA in three cases and compare the results with those computed by probabilistic uncertainty propagation using the MC method. The results obtained by fuzzy uncertainty propagation can be calculated in a relatively short time, while covering the results obtained by probabilistic uncertainty propagation.
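
    The standard mechanism behind fuzzy uncertainty propagation is alpha-cut interval arithmetic: represent each basic-event probability as a fuzzy number, slice it at a few membership levels, and push the resulting intervals through the gate functions. A minimal sketch with triangular fuzzy numbers and an AND gate (the numeric values are invented; since the gate function is increasing in each argument, interval endpoints map to endpoints):

    ```python
    def alpha_cut(tri, alpha):
        """Interval of a triangular fuzzy number (low, mode, high) at
        membership level alpha in [0, 1]."""
        a, m, b = tri
        return (a + alpha * (m - a), b - alpha * (b - m))

    def fuzzy_and(p, q, alphas=(0.0, 0.5, 1.0)):
        """Fuzzy AND-gate probability via alpha-cut interval arithmetic:
        the product is increasing in both arguments, so the cut interval
        is just the product of the endpoint pairs."""
        cuts = {}
        for al in alphas:
            (pl, pu), (ql, qu) = alpha_cut(p, al), alpha_cut(q, al)
            cuts[al] = (pl * ql, pu * qu)
        return cuts

    # triangular fuzzy failure probabilities (low, mode, high)
    cuts = fuzzy_and((0.01, 0.02, 0.05), (0.1, 0.2, 0.3))
    ```

    One pass per alpha level replaces thousands of Monte Carlo trials, which is the speed advantage the abstract reports; the alpha = 0 cut gives the widest (most conservative) top-event interval.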

  17. Statistical analysis of the uncertainty related to flood hazard appraisal

    Science.gov (United States)

    Notaro, Vincenza; Freni, Gabriele

    2015-12-01

    The estimation of flood hazard frequency statistics for an urban catchment is of great interest in practice. It provides an evaluation of potential flood risk and related damage and supports decision making for flood risk management. Flood risk is usually defined as a function of the probability that a system deficiency can cause flooding (hazard) and the expected damage due to the flooding magnitude (damage), taking into account both the exposure and the vulnerability of the goods at risk. The expected flood damage can be evaluated by an a priori estimation of the potential damage caused by flooding or by interpolating real damage data. With regard to flood hazard appraisal, several procedures propose to identify some hazard indicator (HI), such as flood depth or the combination of flood depth and velocity, and to assess the flood hazard of the analyzed area by comparing the HI variables with user-defined threshold values or curves (penalty curves or matrixes). However, flooding data are usually unavailable or too piecemeal to allow a reliable flood hazard analysis; therefore, hazard analysis is often performed by means of mathematical simulations aimed at evaluating water levels and flow velocities over the catchment surface. As a result, a great part of the uncertainties intrinsic to flood risk appraisal can be related to the hazard evaluation, due to the uncertainty inherent in modeling results and to the subjectivity of the user-defined hazard thresholds applied to link flood depth to a hazard level. In the present work, a statistical methodology is proposed for evaluating and reducing the uncertainties connected with hazard level estimation. The methodology has been applied to a real urban watershed as a case study.

  18. Addressing imperfect maintenance modelling uncertainty in unavailability and cost based optimization

    International Nuclear Information System (INIS)

    Sanchez, Ana; Carlos, Sofia; Martorell, Sebastian; Villanueva, Jose F.

    2009-01-01

    Optimization of testing and maintenance activities performed in the different systems of a complex industrial plant is of great interest as the plant availability and economy strongly depend on the maintenance activities planned. Traditionally, two types of models, i.e. deterministic and probabilistic, have been considered to simulate the impact of testing and maintenance activities on equipment unavailability and the cost involved. Both models present uncertainties that are often categorized as either aleatory or epistemic uncertainties. The second group applies when there is limited knowledge on the proper model to represent a problem, and/or the values associated to the model parameters, so the results of the calculation performed with them incorporate uncertainty. This paper addresses the problem of testing and maintenance optimization based on unavailability and cost criteria and considering epistemic uncertainty in the imperfect maintenance modelling. It is framed as a multiple criteria decision making problem where unavailability and cost act as uncertain and conflicting decision criteria. A tolerance interval based approach is used to address uncertainty with regard to effectiveness parameter and imperfect maintenance model embedded within a multiple-objective genetic algorithm. A case of application for a stand-by safety related system of a nuclear power plant is presented. The results obtained in this application show the importance of considering uncertainties in the modelling of imperfect maintenance, as the optimal solutions found are associated with a large uncertainty that influences the final decision making depending on, for example, if the decision maker is risk averse or risk neutral
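
    The epistemic uncertainty at issue sits in the effectiveness parameter of the imperfect maintenance model. A heavily stylized sketch of how that uncertainty widens the unavailability estimate for a candidate maintenance interval: a proportional age-setback model with a linear aging hazard, with the effectiveness sampled over its epistemic range and a two-sided percentile band reported (all model forms and numbers here are illustrative assumptions, not the paper's model).

    ```python
    import random

    def mean_unavailability(T, eps, lam0=1e-4, alpha=1e-7):
        """Toy age-setback model: maintenance at interval T reduces the
        effective age, leaving a steady-state residual age T*(1-eps)/eps.
        Mean standby unavailability ~ lam*T/2 with hazard lam0 + alpha*age."""
        w = T * (1 - eps) / eps              # residual age after maintenance
        lam_avg = lam0 + alpha * (w + T / 2)
        return lam_avg * T / 2

    def tolerance_band(T, n=2000, seed=9):
        """Propagate epistemic uncertainty in the effectiveness,
        eps ~ U(0.5, 0.9), and return the 5%-95% unavailability band."""
        rng = random.Random(seed)
        u = sorted(mean_unavailability(T, rng.uniform(0.5, 0.9))
                   for _ in range(n))
        return u[int(0.05 * n)], u[int(0.95 * n)]

    lo, hi = tolerance_band(T=720.0)   # hours
    ```

    The width of such bands is what drives the paper's conclusion: a risk-averse decision maker may prefer a different test/maintenance interval than a risk-neutral one, because the optima carry large uncertainty.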

  19. Addressing imperfect maintenance modelling uncertainty in unavailability and cost based optimization

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez, Ana [Department of Statistics and Operational Research, Polytechnic University of Valencia, Camino de Vera, s/n, 46071 Valencia (Spain); Carlos, Sofia [Department of Chemical and Nuclear Engineering, Polytechnic University of Valencia, Camino de Vera, s/n, 46071 Valencia (Spain); Martorell, Sebastian [Department of Chemical and Nuclear Engineering, Polytechnic University of Valencia, Camino de Vera, s/n, 46071 Valencia (Spain)], E-mail: smartore@iqn.upv.es; Villanueva, Jose F. [Department of Chemical and Nuclear Engineering, Polytechnic University of Valencia, Camino de Vera, s/n, 46071 Valencia (Spain)

    2009-01-15

    Optimization of testing and maintenance activities performed in the different systems of a complex industrial plant is of great interest as the plant availability and economy strongly depend on the maintenance activities planned. Traditionally, two types of models, i.e. deterministic and probabilistic, have been considered to simulate the impact of testing and maintenance activities on equipment unavailability and the cost involved. Both models present uncertainties that are often categorized as either aleatory or epistemic uncertainties. The second group applies when there is limited knowledge on the proper model to represent a problem, and/or the values associated to the model parameters, so the results of the calculation performed with them incorporate uncertainty. This paper addresses the problem of testing and maintenance optimization based on unavailability and cost criteria and considering epistemic uncertainty in the imperfect maintenance modelling. It is framed as a multiple criteria decision making problem where unavailability and cost act as uncertain and conflicting decision criteria. A tolerance interval based approach is used to address uncertainty with regard to effectiveness parameter and imperfect maintenance model embedded within a multiple-objective genetic algorithm. A case of application for a stand-by safety related system of a nuclear power plant is presented. The results obtained in this application show the importance of considering uncertainties in the modelling of imperfect maintenance, as the optimal solutions found are associated with a large uncertainty that influences the final decision making depending on, for example, if the decision maker is risk averse or risk neutral.

  20. Uncertainty analysis for effluent trading planning using a Bayesian estimation-based simulation-optimization modeling approach.

    Science.gov (United States)

    Zhang, J L; Li, Y P; Huang, G H; Baetz, B W; Liu, J

    2017-06-01

    In this study, a Bayesian estimation-based simulation-optimization modeling approach (BESMA) is developed for identifying effluent trading strategies. BESMA incorporates nutrient fate modeling with the soil and water assessment tool (SWAT), Bayesian estimation, and probabilistic-possibilistic interval programming with fuzzy random coefficients (PPI-FRC) within a general framework. Based on the water quality protocols provided by SWAT, posterior distributions of parameters can be analyzed through Bayesian estimation, and the stochastic characteristics of nutrient loading can be investigated, providing the inputs for decision making. PPI-FRC can address multiple uncertainties in the form of intervals with fuzzy random boundaries, and the associated system risk, by incorporating the concepts of possibility and necessity measures, which are suitable for optimistic and pessimistic decision making, respectively. BESMA is applied to a real case of effluent trading planning in the Xiangxihe watershed, China. A number of decision alternatives can be obtained under different trading ratios and treatment rates. The results not only facilitate identification of optimal effluent-trading schemes, but also give insight into the effects of trading ratio and treatment rate on decision making. The results also reveal that the decision maker's preference towards risk affects the decision alternatives on the trading scheme as well as the system benefit. Compared with conventional optimization methods, BESMA proved advantageous in (i) dealing with multiple uncertainties associated with randomness and fuzziness in effluent-trading planning within a multi-source, multi-reach and multi-period context; (ii) reflecting uncertainties in nutrient transport behaviors to improve the accuracy of water quality prediction; and (iii) supporting pessimistic and optimistic decision making for effluent trading as well as promoting diversity of decision
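
    The Bayesian estimation step can be illustrated with a minimal conjugate update. The normal-normal model and all numbers below are hypothetical stand-ins for the SWAT-based posterior analysis the abstract describes:

    ```python
    def posterior_normal(prior_mu, prior_var, obs, obs_var):
        """Conjugate normal-normal update for the mean of a nutrient-loading
        rate, assuming a known observation variance."""
        n = len(obs)
        xbar = sum(obs) / n
        post_var = 1.0 / (1.0 / prior_var + n / obs_var)
        post_mu = post_var * (prior_mu / prior_var + n * xbar / obs_var)
        return post_mu, post_var

    # hypothetical prior (mean 10, variance 4) updated with three observations
    mu, var = posterior_normal(10.0, 4.0, [12.1, 11.4, 12.8], 1.0)
    ```

    The posterior mean lands between the prior mean and the sample mean, and the posterior variance shrinks below the prior variance, which is the uncertainty-reduction effect exploited downstream in the optimization.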

  1. CEC/USDOE workshop on uncertainty analysis

    International Nuclear Information System (INIS)

    Elderkin, C.E.; Kelly, G.N.

    1990-07-01

    Any measured or assessed quantity contains uncertainty. The quantitative estimation of such uncertainty is becoming increasingly important, especially in assuring that safety requirements are met in design, regulation, and operation of nuclear installations. The CEC/USDOE Workshop on Uncertainty Analysis, held in Santa Fe, New Mexico, on November 13 through 16, 1989, was organized jointly by the Commission of the European Communities (CEC) Radiation Protection Research program, dealing with uncertainties throughout the field of consequence assessment, and DOE's Atmospheric Studies in Complex Terrain (ASCOT) program, concerned with the particular uncertainties in time- and space-variant transport and dispersion. The workshop brought together US and European scientists who have been developing or applying uncertainty analysis methodologies in a variety of contexts, often with incomplete knowledge of the work of others in this area. Thus, it was timely to exchange views and experience, identify limitations of approaches to uncertainty and possible improvements, and enhance the interface between developers and users of uncertainty analysis methods. Furthermore, the workshop considered the extent to which consistent, rigorous methods could be used in various applications within consequence assessment. 3 refs

  2. Applied research in uncertainty modeling and analysis

    CERN Document Server

    Ayyub, Bilal

    2005-01-01

    Uncertainty has been a concern to engineers, managers, and scientists for many years. For a long time uncertainty was considered synonymous with random, stochastic, statistical, or probabilistic. Since the early sixties views on uncertainty have become more heterogeneous, and in the past forty years numerous tools that model uncertainty, above and beyond statistics, have been proposed by engineers and scientists. The tool or method used to model uncertainty in a specific context should be chosen by considering the features of the phenomenon under consideration, not independently of what is known about the system and what causes uncertainty. In this fascinating overview of the field, the authors provide broad coverage of uncertainty analysis/modeling and its application. Applied Research in Uncertainty Modeling and Analysis presents the perspectives of various researchers and practitioners on uncertainty analysis and modeling outside their own fields and domain expertise. Rather than focusing explicitly on...

  3. Probabilistic accident consequence uncertainty analysis: Food chain uncertainty assessment. Volume 1: Main report

    Energy Technology Data Exchange (ETDEWEB)

    Brown, J. [National Radiological Protection Board (United Kingdom)]; Goossens, L.H.J.; Kraan, B.C.P. [Delft Univ. of Technology (Netherlands)] [and others]

    1997-06-01

    This volume is the first of a two-volume document that summarizes a joint project conducted by the US Nuclear Regulatory Commission and the European Commission to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This document reports on an ongoing project to assess uncertainty in the MACCS and COSYMA calculations for the offsite consequences of radionuclide releases by hypothetical nuclear power plant accidents. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain variables that affect calculations of offsite consequences. The expert judgment elicitation procedure and its outcomes are described in these volumes. Other panels were formed to consider uncertainty in other aspects of the codes. Their results are described in companion reports. Volume 1 contains background information and a complete description of the joint consequence uncertainty study. Volume 2 contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures for both panels, (3) the rationales and results for the panels on soil and plant transfer and animal transfer, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  4. Probabilistic accident consequence uncertainty analysis: Food chain uncertainty assessment. Volume 1: Main report

    International Nuclear Information System (INIS)

    Brown, J.; Goossens, L.H.J.; Kraan, B.C.P.

    1997-06-01

    This volume is the first of a two-volume document that summarizes a joint project conducted by the US Nuclear Regulatory Commission and the European Commission to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This document reports on an ongoing project to assess uncertainty in the MACCS and COSYMA calculations for the offsite consequences of radionuclide releases by hypothetical nuclear power plant accidents. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain variables that affect calculations of offsite consequences. The expert judgment elicitation procedure and its outcomes are described in these volumes. Other panels were formed to consider uncertainty in other aspects of the codes. Their results are described in companion reports. Volume 1 contains background information and a complete description of the joint consequence uncertainty study. Volume 2 contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures for both panels, (3) the rationales and results for the panels on soil and plant transfer and animal transfer, (4) short biographies of the experts, and (5) the aggregated results of their responses

  5. Development of a Prototype Model-Form Uncertainty Knowledge Base

    Science.gov (United States)

    Green, Lawrence L.

    2016-01-01

    Uncertainties are generally classified as either aleatory or epistemic. Aleatory uncertainties are those attributed to random variation, either naturally or through manufacturing processes. Epistemic uncertainties are generally attributed to a lack of knowledge. One type of epistemic uncertainty is called model-form uncertainty. The term model-form means that among the choices to be made during a design process within an analysis, there are different forms of the analysis process, which each give different results for the same configuration at the same flight conditions. Examples of model-form uncertainties include the grid density, grid type, and solver type used within a computational fluid dynamics code, or the choice of the number and type of model elements within a structures analysis. The objectives of this work are to identify and quantify a representative set of model-form uncertainties and to make this information available to designers through an interactive knowledge base (KB). The KB can then be used during probabilistic design sessions, so as to enable the possible reduction of uncertainties in the design process through resource investment. An extensive literature search has been conducted to identify and quantify typical model-form uncertainties present within aerospace design. An initial attempt has been made to assemble the results of this literature search into a searchable KB, usable in real time during probabilistic design sessions. A concept of operations and the basic structure of a model-form uncertainty KB are described. Key operations within the KB are illustrated. Current limitations in the KB, and possible workarounds are explained.
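
    A searchable model-form-uncertainty KB of the kind described can be sketched as a list of tagged entries with a filter. The entries, field names, and spread magnitudes below are invented placeholders for illustration, not values from the actual NASA knowledge base:

    ```python
    # minimal queryable knowledge-base sketch; all entries are hypothetical
    KB = [
        {"discipline": "CFD", "choice": "grid density",
         "quantity": "drag coefficient", "spread_pct": 3.0},
        {"discipline": "CFD", "choice": "turbulence model",
         "quantity": "drag coefficient", "spread_pct": 5.0},
        {"discipline": "structures", "choice": "element type",
         "quantity": "first bending mode", "spread_pct": 2.0},
    ]

    def query(discipline=None, quantity=None):
        """Return matching entries, largest model-form spread first, so a
        designer sees where resource investment buys the most reduction."""
        hits = [e for e in KB
                if (discipline is None or e["discipline"] == discipline)
                and (quantity is None or e["quantity"] == quantity)]
        return sorted(hits, key=lambda e: -e["spread_pct"])
    ```

    During a probabilistic design session, such a query would surface the analysis choices (here, turbulence model) that dominate the model-form uncertainty of a given output quantity.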

  6. Assessment of uncertainties in risk analysis of chemical establishments. The ASSURANCE project. Final summary report

    DEFF Research Database (Denmark)

    Lauridsen, K.; Kozine, Igor; Markert, Frank

    2002-01-01

    This report summarises the results obtained in the ASSURANCE project (EU contract number ENV4-CT97-0627). Seven teams have performed risk analyses for the same chemical facility, an ammonia storage. The EC's Joint Research Centre at Ispra and Risø National Laboratory co-ordinated the exercise and led the comparison of results in order to reveal the causes for differences between the partners' results. The results of the project point to an increased awareness of the potential uncertainties in risk analyses and highlight a number of important sources of such uncertainties. In the hazard...

  7. Uncertainty analysis of the FRAP code

    International Nuclear Information System (INIS)

    Peck, S.O.

    1978-01-01

    A user oriented, automated uncertainty analysis capability has been built into the FRAP code (Fuel Rod Analysis Program) and applied to a PWR fuel rod undergoing a LOCA. The method of uncertainty analysis is the Response Surface Method (RSM). (author)
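
    The Response Surface Method named above fits a cheap surrogate to a handful of code runs, then propagates input uncertainty through the surrogate instead of the expensive code. A one-dimensional sketch; the three "code runs" and the input distribution are hypothetical, not FRAP results:

    ```python
    import random

    def fit_quadratic(xs, ys):
        """Exact quadratic through three (input, output) code runs
        (Lagrange form), standing in for the fitted response surface."""
        (x0, x1, x2), (y0, y1, y2) = xs, ys
        def surface(x):
            l0 = (x - x1) * (x - x2) / ((x0 - x1) * (x0 - x2))
            l1 = (x - x0) * (x - x2) / ((x1 - x0) * (x1 - x2))
            l2 = (x - x0) * (x - x1) / ((x2 - x0) * (x2 - x1))
            return y0 * l0 + y1 * l1 + y2 * l2
        return surface

    # three hypothetical runs: input multiplier -> peak response (e.g. clad temp, K)
    surface = fit_quadratic((0.8, 1.0, 1.2), (1100.0, 1150.0, 1230.0))

    # Monte Carlo through the cheap surrogate instead of the code itself
    rng = random.Random(0)
    samples = [surface(rng.gauss(1.0, 0.05)) for _ in range(2000)]
    mean = sum(samples) / len(samples)
    ```

    The surrogate reproduces the code runs exactly at the design points, and thousands of propagation samples cost essentially nothing compared with rerunning the code.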

  8. Uncertainty analysis for results of thermal hydraulic codes of best-estimate-type

    International Nuclear Information System (INIS)

    Alva N, J.

    2010-01-01

    In this thesis, fundamental knowledge is presented about uncertainty analysis and about the diverse methodologies applied in the study of nuclear power plant transient events, particularly those related to thermal hydraulics phenomena. The concepts and methodologies discussed come from a wide bibliographical research of the nuclear power subject. Methodologies for uncertainty analysis have been developed by quite diverse institutions, and they have been widely used worldwide for application to results from best-estimate-type computer codes in nuclear reactor thermal hydraulics and safety analysis. The main uncertainty sources, types of uncertainties, and aspects related to best-estimate modeling and methods are also introduced. Once the main bases of uncertainty analysis have been set, and some of the known methodologies have been introduced, the CSAU methodology, which is applied in the analyses, is presented in detail. The main objective of this thesis is to compare the results of an uncertainty and sensitivity analysis using the Response Surface Technique with those from the application of Wilks' formula, through a loss-of-coolant experiment and a power-rise event in a BWR. Both techniques are options for the uncertainty and sensitivity analysis step of the CSAU methodology, which was developed for the analysis of transients and accidents at nuclear power plants and is the basis of most of the methodologies used in licensing of nuclear power plants practically everywhere. Finally, the results of applying both techniques are compared and discussed. (Author)
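
    Wilks' formula, one of the two options compared in this thesis, fixes how many code runs are needed so that an order statistic of the output sample is a one-sided tolerance bound. A minimal sketch of the standard sizing rule (the order-2 variant uses the second-largest sample as the bound):

    ```python
    def wilks_runs(gamma=0.95, beta=0.95, order=1):
        """Smallest number of code runs n such that the largest (order 1) or
        second-largest (order 2) of n samples bounds the gamma-quantile of
        the output with confidence beta (one-sided Wilks formula)."""
        n = order
        while True:
            if order == 1:
                conf = 1.0 - gamma ** n
            else:  # order 2
                conf = 1.0 - gamma ** n - n * (1.0 - gamma) * gamma ** (n - 1)
            if conf >= beta:
                return n
            n += 1
    ```

    For the usual 95%/95% criterion this gives the well-known 59 runs at first order, and 93 runs at second order.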

  9. Uncertainty analysis and design optimization of hybrid rocket motor powered vehicle for suborbital flight

    Directory of Open Access Journals (Sweden)

    Zhu Hao

    2015-06-01

    In this paper, we propose an uncertainty analysis and design optimization method and present its application to a hybrid rocket motor (HRM) powered vehicle. The multidisciplinary design model of the rocket system is established and the design uncertainties are quantified. The sensitivity analysis of the uncertainties shows that the uncertainty generated from the error of the fuel regression rate model has the most significant effect on system performance. Then the differences between deterministic design optimization (DDO) and uncertainty-based design optimization (UDO) are discussed. Two newly developed uncertainty analysis methods, Kriging-based Monte Carlo simulation (KMCS) and Kriging-based Taylor series approximation (KTSA), are carried out using a global approximation Kriging modeling method. Based on the system design model and the results of the design uncertainty analysis, the design optimization of an HRM powered vehicle for suborbital flight is implemented using three design optimization methods: DDO, KMCS and KTSA. The comparisons indicate that the two UDO methods can enhance design reliability and robustness. The approaches and methods proposed in this paper can provide a better way for the general design of HRM powered vehicles.

  10. Distribution of uncertainties at the municipality level for flood risk modelling along the river Meuse: implications for policy-making

    Science.gov (United States)

    Pirotton, Michel; Stilmant, Frédéric; Erpicum, Sébastien; Dewals, Benjamin; Archambeau, Pierre

    2016-04-01

    Flood risk modelling has been conducted for the whole course of the river Meuse in Belgium. Major cities, such as Liege (200,000 inh.) and Namur (110,000 inh.), are located in the floodplains of the river Meuse. Particular attention has been paid to uncertainty analysis and its implications for decision-making. The modelling chain contains flood frequency analysis, detailed 2D hydraulic computations, damage modelling and risk calculation. The relative importance of each source of uncertainty to the overall uncertainty of the results has been estimated by considering several alternate options for each step of the analysis: different distributions were considered in the flood frequency analysis; the influence of modelling assumptions and boundary conditions (e.g., steady vs. unsteady) was taken into account for the hydraulic computation; two different land-use classifications and two sets of damage functions were used; and the number of exceedance probabilities involved in the risk calculation (by integration of the risk curves) was varied. In addition, the sensitivity of the results with respect to increases in flood discharges was assessed. The considered increases are consistent with a "wet" climate change scenario for the time horizons 2021-2050 and 2071-2100 (Detrembleur et al., 2015). The results of the hazard computation differ significantly between the upper and lower parts of the course of the river Meuse in Belgium. In the former, inundation extents grow gradually as the considered flood discharge is increased (i.e. the exceedance probability is reduced), while in the downstream part, protection structures (mainly concrete walls) prevent inundation for flood discharges corresponding to exceedance probabilities of 0.01 and above (in the present climate). For higher discharges, large inundation extents are obtained in the floodplains. The highest values of risk (mean annual damage) are obtained in the municipalities which undergo relatively frequent flooding (upper part of the

  11. Uncertainty analysis of the FRAP code

    International Nuclear Information System (INIS)

    Peck, S.O.

    1978-01-01

    A user oriented, automated uncertainty analysis capability has been built into the Fuel Rod Analysis Program (FRAP) code and has been applied to a pressurized water reactor (PWR) fuel rod undergoing a loss-of-coolant accident (LOCA). The method of uncertainty analysis is the response surface method. The automated version significantly reduced the time required to complete the analysis and, at the same time, greatly increased the problem scope. Results of the analysis showed a significant difference in the total and relative contributions to the uncertainty of the response parameters between steady state and transient conditions

  12. Prediction Model of Collapse Risk Based on Information Entropy and Distance Discriminant Analysis Method

    Directory of Open Access Journals (Sweden)

    Hujun He

    2017-01-01

    The prediction and risk classification of collapse is an important issue in the process of highway construction in mountainous regions. Based on the principles of information entropy and Mahalanobis distance discriminant analysis, we have produced a collapse hazard prediction model. We used the entropy measure method to reduce the influence indexes of collapse activity, extracting the nine main indexes affecting collapse activity as the discriminant factors of the distance discriminant analysis model: slope shape, aspect, gradient, and height, along with exposure of the structural face, stratum lithology, relationship between weakness face and free face, vegetation cover rate, and degree of rock weathering. We employ post-earthquake collapse data from the construction of the Yingxiu-Wolong highway, Hanchuan County, China, as training samples for analysis. The results were analyzed using the back-substitution estimation method, showing high accuracy and no errors, and agreed with the prediction results of the uncertainty measure method. The results show that the classification model based on information entropy and distance discriminant analysis achieves the purpose of index optimization and has excellent performance, high prediction accuracy, and a zero false-positive rate. The model can be used as a tool for future evaluation of collapse risk.
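
    The distance-discriminant step can be sketched with two tiny groups in a two-index space: a sample is assigned to the class whose mean it is closest to in Mahalanobis distance. The groups, indexes, and numbers below are invented for illustration, not the paper's nine-factor data:

    ```python
    def mean(vecs):
        n = len(vecs)
        return [sum(v[i] for v in vecs) / n for i in range(len(vecs[0]))]

    def cov2(vecs, mu):
        """2x2 sample covariance matrix."""
        n = len(vecs)
        c = [[0.0, 0.0], [0.0, 0.0]]
        for v in vecs:
            d = [v[0] - mu[0], v[1] - mu[1]]
            for i in range(2):
                for j in range(2):
                    c[i][j] += d[i] * d[j] / (n - 1)
        return c

    def mahalanobis2(x, mu, c):
        """Squared Mahalanobis distance in 2-D (closed-form 2x2 inverse)."""
        det = c[0][0] * c[1][1] - c[0][1] * c[1][0]
        inv = [[c[1][1] / det, -c[0][1] / det],
               [-c[1][0] / det, c[0][0] / det]]
        d = [x[0] - mu[0], x[1] - mu[1]]
        return sum(d[i] * inv[i][j] * d[j] for i in range(2) for j in range(2))

    def classify(x, groups):
        """Assign x to the training group with the smallest distance."""
        best, best_d = None, float("inf")
        for name, vecs in groups.items():
            mu = mean(vecs)
            dist = mahalanobis2(x, mu, cov2(vecs, mu))
            if dist < best_d:
                best, best_d = name, dist
        return best

    # two invented training groups, e.g. (slope height, vegetation index)
    groups = {
        "low risk":  [(10, 0.20), (12, 0.25), (11, 0.35), (13, 0.30)],
        "high risk": [(30, 0.80), (28, 0.75), (32, 0.85), (29, 0.90)],
    }
    ```

    Unlike Euclidean distance, the Mahalanobis form accounts for the scale and correlation of the discriminant factors, which matters when indexes have very different units.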

  13. The spectre of uncertainty in communicating technological risk

    Energy Technology Data Exchange (ETDEWEB)

    Broesius, Michael T. [Univ. of California, Livermore, CA (United States)

    1993-12-01

    The literature does not clearly describe the potential moral and ethical conflicts that can exist between technology sponsors and the technical communicators whose job it is to present potentially risky technology to the non-technical people most likely to be imperiled by such risk. Equally important, the literature does not address the issue of uncertainty: not the uncertainty likely to be experienced by the community at risk, but the unreliable processes and methodologies used by technology sponsors to define, quantify, and develop strategies to mitigate technological risks. In this paper, the author goes beyond a description of risk communication, the nature of the generally predictable interaction between technology advocates and non-technically trained individuals, and current trends in the field. That kind of information is critical to the success of any risk communication activity, and it is included where necessary to provide background and perspective; however, without knowing how and why risk assessment is done, it has limited practical applicability outside the sterile, value-free vacuum in which it is usually framed. Technical communicators, particularly those responsible for communicating potential technological risk, must also understand the social, political, economic, statistical, and ethical issues they will invariably encounter.

  14. Analysis of algal bloom risk with uncertainties in lakes by integrating self-organizing map and fuzzy information theory.

    Science.gov (United States)

    Chen, Qiuwen; Rui, Han; Li, Weifeng; Zhang, Yanhui

    2014-06-01

    Algal blooms are a serious problem in waters, damaging aquatic ecosystems and threatening drinking water safety. However, the outbreak mechanism of algal blooms is very complex and carries great uncertainty, especially for large water bodies where environmental conditions vary markedly in both space and time. This study developed an innovative method which integrates a self-organizing map (SOM) and fuzzy information diffusion theory to comprehensively analyze algal bloom risks with uncertainties. Lake Taihu was taken as the study case, using long-term (2004-2010) on-site monitoring data. The results showed that algal blooms in Lake Taihu fell into four categories and exhibited obvious spatial-temporal patterns. The lake was mainly characterized by moderate blooms with high uncertainty, whereas severe blooms with low uncertainty were observed in the northwest part of the lake. The study gives insight into the spatial-temporal dynamics of algal blooms, and should help government and decision-makers outline policies and practices for bloom monitoring and prevention. The developed method provides a promising approach to estimating algal bloom risks under uncertainties.
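
    The information-diffusion idea is to spread each of a small number of observations over a discrete grid with a kernel, so that a short record still yields a smooth probability estimate. A minimal sketch; the Gaussian kernel, the bandwidth h, and the data are assumptions (the paper's normal-diffusion bandwidth formula is not reproduced here):

    ```python
    import math

    def diffusion_estimate(obs, grid, h):
        """Spread each observation over a discrete grid with a Gaussian kernel
        (each observation contributes unit mass), then normalize to a pmf."""
        weights = [0.0] * len(grid)
        for x in obs:
            f = [math.exp(-((u - x) ** 2) / (2.0 * h * h)) for u in grid]
            s = sum(f)
            for i, v in enumerate(f):
                weights[i] += v / s
        total = sum(weights)
        return [w / total for w in weights]

    def exceedance(grid, probs, level):
        """Estimated probability that the severity index is at least `level`."""
        return sum(p for u, p in zip(grid, probs) if u >= level)

    obs = [2.0, 3.5, 4.0, 5.5]            # hypothetical yearly bloom-severity indices
    grid = [0.5 * i for i in range(17)]   # severity levels 0.0 .. 8.0
    probs = diffusion_estimate(obs, grid, h=0.8)
    ```

    The resulting mass function supports exceedance-type risk statements ("probability of severity 4 or worse") even from a four-point sample, which is the small-sample advantage diffusion theory targets.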

  15. A unified framework for risk and vulnerability analysis covering both safety and security

    International Nuclear Information System (INIS)

    Aven, Terje

    2007-01-01

    Recently, we have seen several attempts to establish adequate risk and vulnerability analysis tools and related management frameworks dealing not only with accidental events but also security problems. These attempts have been based on different analysis approaches and alternative building blocks. In this paper, we discuss some of these and show how a unified framework for such analyses and management tasks can be developed. The framework is based on the use of probability as a measure of uncertainty, as seen through the eyes of the assessor, and defines risk as the combination of possible consequences and related uncertainties. Risk and vulnerability characterizations are introduced incorporating ideas both from the vulnerability analysis literature and from the risk classification scheme introduced by Renn and Klinke

  16. Dealing with uncertainty arising out of probabilistic risk assessment

    International Nuclear Information System (INIS)

    Solomon, K.A.; Kastenberg, W.E.; Nelson, P.F.

    1984-03-01

    In addressing the area of safety goal implementation, the question of uncertainty arises. This report suggests that the Nuclear Regulatory Commission (NRC) should examine how other regulatory organizations have addressed the issue. Several examples are given from the chemical industry, and comparisons are made to nuclear power risks. Recommendations are made as to various considerations that the NRC should require in probabilistic risk assessments in order to properly treat uncertainties in the implementation of the safety goal policy. 40 references, 7 figures, 5 tables

  17. Uncertainty Analysis via Failure Domain Characterization: Unrestricted Requirement Functions

    Science.gov (United States)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2011-01-01

    This paper proposes an uncertainty analysis framework based on the characterization of the uncertain parameter space. This characterization enables the identification of worst-case uncertainty combinations and the approximation of the failure and safe domains with a high level of accuracy. Because these approximations are composed of subsets of readily computable probability, they enable the calculation of arbitrarily tight upper and lower bounds on the failure probability. The methods developed herein, which are based on nonlinear constrained optimization, are applicable to requirement functions whose functional dependency on the uncertainty is arbitrary and whose explicit form may even be unknown. Among the most prominent features of the methodology are the substantial desensitization of the calculations to the assumed uncertainty model (i.e., the probability distribution describing the uncertainty) as well as the accommodation of changes in such a model with a practically insignificant amount of computational effort.
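
    The worst-case search over the uncertain parameter box can be caricatured with plain random search; the requirement function g and the parameter box below are hypothetical, and the paper itself uses nonlinear constrained optimization rather than random search:

    ```python
    import random

    def worst_case(g, lo, hi, n=10000, seed=3):
        """Find the point in the uncertainty box [lo, hi] minimizing the
        requirement margin g (g < 0 taken as failure) by random search."""
        rng = random.Random(seed)
        best_p, best_g = None, float("inf")
        for _ in range(n):
            p = [rng.uniform(a, b) for a, b in zip(lo, hi)]
            val = g(p)
            if val < best_g:
                best_p, best_g = p, val
        return best_p, best_g

    def g(p):
        # hypothetical requirement margin: fails when p[0] * p[1] exceeds 0.5
        return 0.5 - p[0] * p[1]

    p_star, g_star = worst_case(g, [0.0, 0.0], [1.0, 1.0])
    ```

    Note that g is only evaluated, never differentiated or inspected, matching the abstract's point that the requirement function's explicit form may be unknown.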

  18. Subspace-based Inverse Uncertainty Quantification for Nuclear Data Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Khuwaileh, B.A., E-mail: bakhuwai@ncsu.edu; Abdel-Khalik, H.S.

    2015-01-15

    Safety analysis and design optimization depend on the accurate prediction of various reactor attributes. Predictions can be enhanced by reducing the uncertainty associated with the attributes of interest. An inverse problem can be defined and solved to assess the sources of uncertainty, and experimental effort can subsequently be directed to further improve the uncertainty associated with these sources. In this work a subspace-based algorithm for inverse sensitivity/uncertainty quantification (IS/UQ) has been developed to enable analysts to account for all sources of nuclear data uncertainties in support of target-accuracy-assessment-type analysis. An approximate analytical solution of the optimization problem is used to guide the search for the dominant uncertainty subspace. By limiting the search to a subspace, the degrees of freedom available for the optimization search are significantly reduced. A quarter PWR fuel assembly is modeled, and the accuracies of the multiplication factor and the fission reaction rate are used as reactor attributes whose uncertainties are to be reduced. Numerical experiments are used to demonstrate the computational efficiency of the proposed algorithm. Our ongoing work focuses on extending the proposed algorithm to account for various forms of feedback, e.g., thermal-hydraulics and depletion effects.

  19. Uncertainty analysis for the assembly and core simulation of BEAVRS at the HZP conditions

    Energy Technology Data Exchange (ETDEWEB)

    Wan, Chenghui [School of Nuclear Science and Technology, Xi’an Jiaotong University, Xi’an 710049 (China); Cao, Liangzhi, E-mail: caolz@mail.xjtu.edu.cn [School of Nuclear Science and Technology, Xi’an Jiaotong University, Xi’an 710049 (China); Wu, Hongchun [School of Nuclear Science and Technology, Xi’an Jiaotong University, Xi’an 710049 (China); Shen, Wei [School of Nuclear Science and Technology, Xi’an Jiaotong University, Xi’an 710049 (China); Canadian Nuclear Safety Commission, Ottawa, Ontario (Canada)

    2017-04-15

    Highlights: • Uncertainty analysis has been completed based on the “two-step” scheme. • Uncertainty analysis has been performed for BEAVRS at HZP. • For lattice calculations, the few-group constants' uncertainties were quantified. • For core simulation, uncertainties of k{sub eff} and power distributions were quantified. - Abstract: Based on the “two-step” scheme for reactor-physics calculations, the capability of uncertainty analysis for core simulations has been implemented in the UNICORN code, an in-house code for the sensitivity and uncertainty analysis of reactor-physics calculations. Applying the statistical sampling method, the nuclear-data uncertainties can be propagated to the important predictions of the core simulations. The uncertainties of the few-group constants introduced by the uncertainties of the multigroup microscopic cross sections are quantified first for the lattice calculations; the uncertainties of the few-group constants are then propagated to the core multiplication factor and core power distributions for the core simulations. Up to now, our in-house lattice code NECP-CACTI and the neutron-diffusion solver NECP-VIOLET have been implemented in UNICORN for the steady-state core simulations based on the “two-step” scheme. With NECP-CACTI and NECP-VIOLET, the modeling and simulation of the steady-state BEAVRS benchmark problem at the HZP conditions was performed, and the results were compared with those obtained by CASMO-4E. Based on the modeling and simulation, the UNICORN code has been applied to perform the uncertainty analysis for BEAVRS at HZP. The uncertainty results of the eigenvalues and two-group constants for the lattice calculations, and of the multiplication factor and the power distributions for the steady-state core simulations, are obtained and analyzed in detail.
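
    The statistical-sampling propagation described can be caricatured in one energy group: perturb the few-group constants with assumed relative uncertainties and collect the spread of the resulting multiplication factor. The one-group model, the constants, and the 1% uncertainties are illustrative, not BEAVRS data or UNICORN output:

    ```python
    import math
    import random

    def sample_kinf(n=5000, seed=42):
        """Sample two 'few-group constants' (nu-Sigma_f, Sigma_a) with assumed
        1% relative uncertainties and propagate to k_inf = nu*Sigma_f / Sigma_a."""
        rng = random.Random(seed)
        nu_sf0, sa0 = 0.0065, 0.0050   # illustrative one-group constants, cm^-1
        ks = []
        for _ in range(n):
            nu_sf = rng.gauss(nu_sf0, 0.01 * nu_sf0)
            sa = rng.gauss(sa0, 0.01 * sa0)
            ks.append(nu_sf / sa)
        m = sum(ks) / n
        sd = math.sqrt(sum((k - m) ** 2 for k in ks) / (n - 1))
        return m, sd

    m, sd = sample_kinf()
    ```

    With independent 1% inputs the relative uncertainty of the ratio comes out near sqrt(2) x 1%, the kind of first-order combination a sampling-based code verifies nonparametrically.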

  20. Uncertainty analysis for the assembly and core simulation of BEAVRS at the HZP conditions

    International Nuclear Information System (INIS)

    Wan, Chenghui; Cao, Liangzhi; Wu, Hongchun; Shen, Wei

    2017-01-01

    Highlights: • Uncertainty analysis has been completed based on the “two-step” scheme. • Uncertainty analysis has been performed to BEAVRS at HZP. • For lattice calculations, the few-group constants' uncertainties were quantified. • For core simulation, uncertainties of k_eff and power distributions were quantified. - Abstract: Based on the “two-step” scheme for the reactor-physics calculations, the capability of uncertainty analysis for the core simulations has been implemented in the UNICORN code, an in-house code for the sensitivity and uncertainty analysis of the reactor-physics calculations. Applying the statistical sampling method, the nuclear-data uncertainties can be propagated to the important predictions of the core simulations. The uncertainties of the few-group constants introduced by the uncertainties of the multigroup microscopic cross sections are quantified first for the lattice calculations; the uncertainties of the few-group constants are then propagated to the core multiplication factor and core power distributions for the core simulations. Up to now, our in-house lattice code NECP-CACTI and the neutron-diffusion solver NECP-VIOLET have been implemented in UNICORN for the steady-state core simulations based on the “two-step” scheme. With NECP-CACTI and NECP-VIOLET, the modeling and simulation of the steady-state BEAVRS benchmark problem at the HZP conditions was performed, and the results were compared with those obtained by CASMO-4E. Based on the modeling and simulation, the UNICORN code has been applied to perform the uncertainty analysis for BEAVRS at HZP. The uncertainty results of the eigenvalues and two-group constants for the lattice calculations and the multiplication factor and the power distributions for the steady-state core simulations are obtained and analyzed in detail.

  1. Reporting and analyzing statistical uncertainties in Monte Carlo-based treatment planning

    International Nuclear Information System (INIS)

    Chetty, Indrin J.; Rosu, Mihaela; Kessler, Marc L.; Fraass, Benedick A.; Haken, Randall K. ten; Kong, Feng-Ming; McShan, Daniel L.

    2006-01-01

    Purpose: To investigate methods of reporting and analyzing statistical uncertainties in doses to targets and normal tissues in Monte Carlo (MC)-based treatment planning. Methods and Materials: Methods for quantifying statistical uncertainties in dose, such as uncertainty specification to specific dose points, or to volume-based regions, were analyzed in MC-based treatment planning for 5 lung cancer patients. The effect of statistical uncertainties on target and normal tissue dose indices was evaluated. The concept of uncertainty volume histograms for targets and organs at risk was examined, along with its utility, in conjunction with dose volume histograms, in assessing the acceptability of the statistical precision in dose distributions. The uncertainty evaluation tools were extended to four-dimensional planning for application on multiple instances of the patient geometry. All calculations were performed using the Dose Planning Method MC code. Results: For targets, generalized equivalent uniform doses and mean target doses converged at 150 million simulated histories, corresponding to relative uncertainties of less than 2% in the mean target doses. For the normal lung tissue (a volume-effect organ), mean lung dose and normal tissue complication probability converged at 150 million histories despite the large range in the relative organ uncertainty volume histograms. For 'serial' normal tissues such as the spinal cord, large fluctuations exist in point dose relative uncertainties. Conclusions: The tools presented here provide useful means for evaluating statistical precision in MC-based dose distributions. Tradeoffs between uncertainties in doses to targets, volume-effect organs, and 'serial' normal tissues must be considered carefully in determining acceptable levels of statistical precision in MC-computed dose distributions
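
    The convergence behaviour reported above (statistical precision improving with the number of simulated histories, roughly as 1/sqrt(N)) can be sketched with a toy per-history scoring model. The exponential "energy deposition" is an assumption for illustration, not the DPM physics:

    ```python
    import math
    import random

    def rel_uncertainty(n_hist, seed=7):
        """Estimate the relative statistical uncertainty of a mean 'dose' score
        by history-by-history statistics (toy scoring model)."""
        rng = random.Random(seed)
        s1 = s2 = 0.0
        for _ in range(n_hist):
            d = rng.expovariate(1.0)   # toy per-history energy deposition
            s1 += d
            s2 += d * d
        mean = s1 / n_hist
        var_mean = (s2 / n_hist - mean * mean) / (n_hist - 1)
        return math.sqrt(var_mean) / mean

    r1 = rel_uncertainty(1000)
    r2 = rel_uncertainty(100000)
    ```

    Going from 10^3 to 10^5 histories cuts the relative uncertainty by about a factor of 10, which is why volume-averaged target doses converge long before individual "serial-organ" point doses do.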

  2. Real-Time Optimal Flood Control Decision Making and Risk Propagation Under Multiple Uncertainties

    Science.gov (United States)

    Zhu, Feilin; Zhong, Ping-An; Sun, Yimeng; Yeh, William W.-G.

    2017-12-01

Multiple uncertainties exist in the optimal flood control decision-making process, posing risks to flood control decisions. This paper defines the main steps in optimal flood control decision making that constitute the Forecast-Optimization-Decision Making (FODM) chain. We propose a framework for supporting optimal flood control decision making under multiple uncertainties and evaluate risk propagation along the FODM chain from a holistic perspective. To deal with uncertainties, we employ stochastic models at each link of the FODM chain. We generate synthetic ensemble flood forecasts via the martingale model of forecast evolution. We then establish a multiobjective stochastic programming with recourse model for optimal flood control operation. The Pareto front under uncertainty is derived via the constraint method coupled with a two-step process. We propose a novel SMAA-TOPSIS model for stochastic multicriteria decision making. We then propose a risk assessment model, based on the risk of decision-making errors and the rank uncertainty degree, to quantify risk propagation along the FODM chain. We conduct numerical experiments to investigate the effects of flood forecast uncertainty on optimal flood control decision making and risk propagation. We apply the proposed methodology to a flood control system in the Daduhe River basin in China. The results indicate that the proposed method can provide valuable risk information at each link of the FODM chain and enable risk-informed decisions with higher reliability.
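The SMAA-TOPSIS idea (rank alternatives by TOPSIS closeness to the ideal solution while Monte Carlo-sampling the uncertain criterion weights) can be sketched as follows. The decision matrix and weight prior are toy assumptions, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(0)

def topsis_rank(scores: np.ndarray, weights: np.ndarray) -> np.ndarray:
    """Rank alternatives (rows) on benefit criteria (columns) by relative
    closeness to the ideal solution; first index is the best alternative."""
    norm = scores / np.linalg.norm(scores, axis=0)
    v = norm * weights
    ideal, anti = v.max(axis=0), v.min(axis=0)
    d_best = np.linalg.norm(v - ideal, axis=1)
    d_worst = np.linalg.norm(v - anti, axis=1)
    closeness = d_worst / (d_best + d_worst)
    return np.argsort(-closeness)

# SMAA flavour: sample the uncertain criterion weights and tally how often
# each alternative ranks first (its rank-1 acceptability index).
scores = np.array([[0.9, 0.4], [0.6, 0.8], [0.3, 0.9]])
firsts = np.zeros(len(scores))
for _ in range(2000):
    w = rng.dirichlet([1.0, 1.0])
    firsts[topsis_rank(scores, w)[0]] += 1
acceptability = firsts / firsts.sum()
```

The acceptability indices quantify how robust each ranking is to weight uncertainty, which is the kind of "rank uncertainty degree" information the FODM chain propagates.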

  3. Uncertainty and risk in wildland fire management: A review

    Science.gov (United States)

    Matthew P. Thompson; Dave E. Calkin

    2011-01-01

    Wildland fire management is subject to manifold sources of uncertainty. Beyond the unpredictability of wildfire behavior, uncertainty stems from inaccurate/missing data, limited resource value measures to guide prioritization across fires and resources at risk, and an incomplete scientific understanding of ecological response to fire, of fire behavior response to...

  4. Uncertainty visualization in HARDI based on ensembles of ODFs

    KAUST Repository

    Jiao, Fangxiang; Phillips, Jeff M.; Gur, Yaniv; Johnson, Chris R.

    2012-01-01

In this paper, we propose a new and accurate technique for uncertainty analysis and uncertainty visualization based on fiber orientation distribution function (ODF) glyphs, associated with high angular resolution diffusion imaging (HARDI). Our visualization applies volume rendering techniques to an ensemble of 3D ODF glyphs, which we call SIP functions of diffusion shapes, to capture their variability due to underlying uncertainty. This rendering elucidates the complex heteroscedastic structural variation in these shapes. Furthermore, we quantify the extent of this variation by measuring the fraction of the volume of these shapes that is consistent across all noise levels, which we call the certain volume ratio. Our uncertainty analysis and visualization framework is then applied to synthetic data, as well as to HARDI human-brain data, to study the impact of various image acquisition parameters and background noise levels on the diffusion shapes. © 2012 IEEE.
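A crude stand-in for the certain volume ratio (the fraction of shape volume consistent across the ensemble) can be computed on voxelized shapes as an intersection-over-union. This is an assumption about the definition for illustration; the paper's construction on ODF glyphs is more refined.

```python
import numpy as np

rng = np.random.default_rng(3)

def certain_volume_ratio(shapes: np.ndarray) -> float:
    """Fraction of voxels occupied by every ensemble member (intersection)
    relative to voxels occupied by any member (union), as a simple proxy
    for the fraction of shape volume that is certain."""
    inside_all = shapes.all(axis=0).sum()
    inside_any = shapes.any(axis=0).sum()
    return inside_all / inside_any

# Toy ensemble: 20 spheres with noisy radii on a 32^3 voxel grid.
grid = np.indices((32, 32, 32)) - 15.5
r = np.sqrt((grid ** 2).sum(axis=0))
ensemble = np.stack([r < 10.0 + rng.normal(0.0, 1.0) for _ in range(20)])
cvr = certain_volume_ratio(ensemble)
```

A ratio near 1 means the ensemble members almost coincide (low uncertainty); a small ratio means the shapes vary strongly with the noise level.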

  5. Uncertainty visualization in HARDI based on ensembles of ODFs

    KAUST Repository

    Jiao, Fangxiang

    2012-02-01

In this paper, we propose a new and accurate technique for uncertainty analysis and uncertainty visualization based on fiber orientation distribution function (ODF) glyphs, associated with high angular resolution diffusion imaging (HARDI). Our visualization applies volume rendering techniques to an ensemble of 3D ODF glyphs, which we call SIP functions of diffusion shapes, to capture their variability due to underlying uncertainty. This rendering elucidates the complex heteroscedastic structural variation in these shapes. Furthermore, we quantify the extent of this variation by measuring the fraction of the volume of these shapes that is consistent across all noise levels, which we call the certain volume ratio. Our uncertainty analysis and visualization framework is then applied to synthetic data, as well as to HARDI human-brain data, to study the impact of various image acquisition parameters and background noise levels on the diffusion shapes. © 2012 IEEE.

  6. The role of quantitative uncertainty in the safety analysis of flammable gas accidents in Hanford waste tanks

    International Nuclear Information System (INIS)

    Bratzel, D.R.

    1998-01-01

    Following a 1990 investigation into flammable gas generation, retention, and release mechanisms within the Hanford Site high-level waste tanks, personnel concluded that the existing Authorization Basis documentation did not adequately evaluate flammable gas hazards. The US Department of Energy Headquarters subsequently declared the flammable gas hazard as an unresolved safety issue. Although work scope has been focused on resolution of the issue, it has yet to be resolved due to considerable uncertainty regarding essential technical parameters and associated risk. Resolution of the Flammable Gas Safety Issue will include the identification of a set of controls for the Authorization Basis for the tanks which will require a safety analysis of flammable gas accidents. A traditional nuclear facility safety analysis is based primarily on the analysis of a set of bounding accidents to represent the risks of the possible accidents and hazardous conditions at a facility. While this approach may provide some indication of the bounding consequences of accidents for facilities, it does not provide a satisfactory basis for identification of facility risk or safety controls when there is considerable uncertainty associated with accident phenomena and/or data as is the case with potential flammable gas accidents at the Hanford Site. This is due to the difficulties in identifying the bounding case and reaching consensus among safety analysts, facility operations and engineering, and the regulator on the implications of the safety analysis results. In addition, the bounding cases are frequently based on simplifying assumptions that make the analysis results insensitive to variations among facilities or the impact of alternative safety control strategies. The existing safety analysis of flammable gas accidents for the Tank Waste Remediation system (TWRS) at the Hanford Site has these difficulties. However, Hanford Site personnel are developing a refined safety analysis approach

  7. Health risk assessment of polycyclic aromatic hydrocarbons in the source water and drinking water of China: Quantitative analysis based on published monitoring data.

    Science.gov (United States)

    Wu, Bing; Zhang, Yan; Zhang, Xu-Xiang; Cheng, Shu-Pei

    2011-12-01

A carcinogenic risk assessment of polycyclic aromatic hydrocarbons (PAHs) in source water and drinking water of China was conducted using probabilistic techniques from a national perspective. The published monitoring data of PAHs were gathered and converted into BaP equivalent (BaP(eq)) concentrations. Based on the transformed data, a comprehensive risk assessment was performed by considering different age groups and exposure pathways. Monte Carlo simulation and sensitivity analysis were applied to quantify uncertainties of risk estimation. The risk analysis indicated that the risk values for children and teens were lower than the accepted value (1.00E-05), indicating no significant carcinogenic risk. The probability of risk values above 1.00E-05 was 5.8% and 6.7% for the adult and lifetime groups, respectively. Overall, carcinogenic risks of PAHs in source water and drinking water of China were mostly acceptable. However, specific regions, such as the Yellow River at the Lanzhou reach and the Qiantang River, warrant closer attention. Notwithstanding the uncertainties inherent in the risk assessment, this study is the first attempt to provide information on the carcinogenic risk of PAHs in source water and drinking water of China, and it might be useful for potential strategies of carcinogenic risk management and reduction. Copyright © 2011 Elsevier B.V. All rights reserved.
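The Monte Carlo step can be sketched with the standard ingestion-risk formula Risk = C x IR x EF x ED x SF / (BW x AT). Every distribution and parameter value below is an illustrative assumption for the sketch, not the paper's fitted data; the slope factor 7.3 (mg/kg-day)^-1 is the commonly cited EPA value for benzo[a]pyrene.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Illustrative input distributions (not the paper's fitted values):
c_bapeq = rng.lognormal(np.log(5e-6), 1.0, n)      # BaP-equivalent conc., mg/L
intake = rng.normal(2.0, 0.3, n).clip(0.5)         # water ingestion, L/day
bodyweight = rng.normal(60.0, 10.0, n).clip(30.0)  # body weight, kg
slope_factor = 7.3                                 # BaP oral slope factor, (mg/kg-day)^-1
ef, ed, at = 365.0, 30.0, 70.0 * 365.0             # days/yr, yr, averaging days

# Lifetime excess cancer risk per simulated individual:
risk = c_bapeq * intake * ef * ed * slope_factor / (bodyweight * at)
p_exceed = (risk > 1.0e-5).mean()  # probability of exceeding the acceptance level
```

Reporting the probability that the sampled risk exceeds 1.00E-05 is what yields statements like "5.8% for the adult group" in the abstract.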

  8. Key drivers and economic consequences of high-end climate scenarios: uncertainties and risks

    DEFF Research Database (Denmark)

    Halsnæs, Kirsten; Kaspersen, Per Skougaard; Drews, Martin

    2015-01-01

    The consequences of high-end climate scenarios and the risks of extreme events involve a number of critical assumptions and methodological challenges related to key uncertainties in climate scenarios and modelling, impact analysis, and economics. A methodological framework for integrated analysis...... of extreme events increase beyond scaling, and in combination with economic assumptions we find a very wide range of risk estimates for urban precipitation events. A sensitivity analysis addresses 32 combinations of climate scenarios, damage cost curve approaches, and economic assumptions, including risk...... aversion and equity represented by discount rates. Major impacts of alternative assumptions are investigated. As a result, this study demonstrates that in terms of decision making the actual expectations concerning future climate scenarios and the economic assumptions applied are very important...

  9. A Preliminary Study on Sensitivity and Uncertainty Analysis with Statistic Method: Uncertainty Analysis with Cross Section Sampling from Lognormal Distribution

    Energy Technology Data Exchange (ETDEWEB)

    Song, Myung Sub; Kim, Song Hyun; Kim, Jong Kyung [Hanyang Univ., Seoul (Korea, Republic of); Noh, Jae Man [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2013-10-15

Statistical uncertainty evaluation is performed by repeating the transport calculation with directly perturbed (sampled) nuclear data; a reliable uncertainty result can therefore be obtained by analyzing the results of numerous transport calculations. A known problem of the statistical approach is that sampling cross sections from a normal (Gaussian) distribution with a relatively large standard deviation leads to sampling errors, such as negative cross sections. Some correction methods have been suggested; however, they can distort the distribution of the sampled cross sections. In this study, a method for sampling nuclear data from a lognormal distribution is proposed. Criticality calculations with the sampled nuclear data are then performed, and the results are compared with those from the normal distribution conventionally used in previous studies. The statistical sampling of cross sections from the lognormal distribution increases the sampling accuracy without negative-sampling errors. A stochastic cross section sampling and writing program was also developed. For the sensitivity and uncertainty analysis, cross sections were sampled from both the normal and lognormal distributions, and the uncertainties caused by the covariance of the (n,γ) cross sections were evaluated by solving the GODIVA problem. The results show that the sampling method with the lognormal distribution efficiently solves the negative-sampling problem reported in previous studies. This study is expected to contribute to increasing the accuracy of sampling-based uncertainty analysis.
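The moment-matching construction (choose the lognormal's log-space parameters so its mean and standard deviation equal the evaluated cross section and its uncertainty) can be sketched as below. The actual sampling and writing program is of course more involved; this only shows the distributional idea.

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_xs_lognormal(mean: float, rel_std: float, size: int) -> np.ndarray:
    """Sample cross sections from a lognormal whose mean and standard deviation
    match the evaluated value and its uncertainty; negative values are impossible."""
    var_log = np.log(1.0 + rel_std ** 2)           # sigma^2 in log space
    mu_log = np.log(mean) - 0.5 * var_log          # mu in log space
    return rng.lognormal(mu_log, np.sqrt(var_log), size)

# At 50% relative uncertainty a Gaussian sample would be negative ~2.3% of
# the time, while the lognormal sample never is, yet both match mean and std.
xs = sample_xs_lognormal(mean=1.2, rel_std=0.5, size=100_000)
```

This is exactly the property exploited in the study: the negative-sampling error vanishes while the first two moments of the evaluated data are preserved.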

  10. A Preliminary Study on Sensitivity and Uncertainty Analysis with Statistic Method: Uncertainty Analysis with Cross Section Sampling from Lognormal Distribution

    International Nuclear Information System (INIS)

    Song, Myung Sub; Kim, Song Hyun; Kim, Jong Kyung; Noh, Jae Man

    2013-01-01

Statistical uncertainty evaluation is performed by repeating the transport calculation with directly perturbed (sampled) nuclear data; a reliable uncertainty result can therefore be obtained by analyzing the results of numerous transport calculations. A known problem of the statistical approach is that sampling cross sections from a normal (Gaussian) distribution with a relatively large standard deviation leads to sampling errors, such as negative cross sections. Some correction methods have been suggested; however, they can distort the distribution of the sampled cross sections. In this study, a method for sampling nuclear data from a lognormal distribution is proposed. Criticality calculations with the sampled nuclear data are then performed, and the results are compared with those from the normal distribution conventionally used in previous studies. The statistical sampling of cross sections from the lognormal distribution increases the sampling accuracy without negative-sampling errors. A stochastic cross section sampling and writing program was also developed. For the sensitivity and uncertainty analysis, cross sections were sampled from both the normal and lognormal distributions, and the uncertainties caused by the covariance of the (n,γ) cross sections were evaluated by solving the GODIVA problem. The results show that the sampling method with the lognormal distribution efficiently solves the negative-sampling problem reported in previous studies. This study is expected to contribute to increasing the accuracy of sampling-based uncertainty analysis

  11. A retrospective dosimetry method and its uncertainty analysis

    International Nuclear Information System (INIS)

    Zhang, L.; Jia, D.; Dai, G.

    2000-01-01

The main aim of a radiation epidemiological study is to assess the risk of the population exposed to ionizing radiation. The actual work of the assessment may be very difficult because dose information about the population is often indirect and incomplete. It is very important, therefore, to find a way of estimating reasonable and reliable doses of the population by a retrospective method from limited information. In order to provide reasonable dose information for the cohort study of Chinese medical diagnostic X-ray workers, a retrospective dosimetry method was established. In China, a cohort study of more than 27,000 medical diagnostic X-ray workers, with 25,000 controls, has been carried out for about fifteen years in order to assess the risk to an occupationally exposed population. Obviously, a key to the success of the study is to obtain reliable and reasonable results of dose estimation by the dose reconstruction method. Before 1985, there was a lack of directly measured personal dose information; however, other indirect information can be obtained. Examples are information about working loads from the documents of the hospitals, information about the operational conditions of workers of different statuses from a survey of occupational history, and the exposure levels of various working conditions from simulation methods. The information for estimating organ dose can also be obtained by simulating experiments with a phantom. Based on the information mentioned above, a mathematical model and computerized system for dose reconstruction of this occupational population was designed and developed. Uncertainty analysis is very important for dose reconstruction. The uncertainty in our study comes from two sources: one is the dose reconstruction model, and the other is the survey of occupational history. The main results of the uncertainty analysis will be presented. In order to control the uncertainty of the

  12. Optimal natural resources management under uncertainty with catastrophic risk

    Energy Technology Data Exchange (ETDEWEB)

    Motoh, Tsujimura [Graduate School of Economics, Kyoto University, Yoshida-honmochi, Sakyo-ku, Kyoto 606-8501 (Japan)

    2004-05-01

    We examine an optimal natural resources management problem under uncertainty with catastrophic risk and investigate the optimal rate of use of a natural resource. For this purpose, we use stochastic control theory. We assume that, until a catastrophic event occurs, the stock of the natural resource is governed by a stochastic differential equation. We describe the catastrophic phenomenon as a Poisson process. From this analysis, we show the optimal rate of use of the natural resource in explicit form. Furthermore, we present comparative static results for the optimal rate of use of the natural resource.
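The model structure described here (diffusion dynamics for the resource stock, interrupted by a Poisson-arriving catastrophe) can be sketched numerically with an Euler-Maruyama scheme. All dynamics and rates below are illustrative assumptions, not the paper's calibration or its closed-form solution.

```python
import numpy as np

rng = np.random.default_rng(7)

def simulate_stock(x0=100.0, mu=0.02, sigma=0.1, lam=0.01,
                   use_rate=0.03, horizon=50.0, dt=0.01):
    """Euler-Maruyama path of a resource stock dX = (mu - use_rate) X dt
    + sigma X dW, set to zero at the first Poisson(lam) catastrophe."""
    x = x0
    for _ in range(int(horizon / dt)):
        if rng.random() < lam * dt:          # catastrophic event arrives
            return 0.0
        x += (mu - use_rate) * x * dt + sigma * x * np.sqrt(dt) * rng.standard_normal()
        x = max(x, 0.0)
    return x

final_stocks = np.array([simulate_stock() for _ in range(50)])
```

Averaging over many such paths for different `use_rate` values is a brute-force way to compare extraction policies; the paper instead derives the optimal rate in closed form via stochastic control.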

  13. Optimal natural resources management under uncertainty with catastrophic risk

    International Nuclear Information System (INIS)

    Motoh, Tsujimura

    2004-01-01

    We examine an optimal natural resources management problem under uncertainty with catastrophic risk and investigate the optimal rate of use of a natural resource. For this purpose, we use stochastic control theory. We assume that, until a catastrophic event occurs, the stock of the natural resource is governed by a stochastic differential equation. We describe the catastrophic phenomenon as a Poisson process. From this analysis, we show the optimal rate of use of the natural resource in explicit form. Furthermore, we present comparative static results for the optimal rate of use of the natural resource

  14. A review of different perspectives on uncertainty and risk and an alternative modeling paradigm

    International Nuclear Information System (INIS)

    Samson, Sundeep; Reneke, James A.; Wiecek, Margaret M.

    2009-01-01

    The literature in economics, finance, operations research, engineering and in general mathematics is first reviewed on the subject of defining uncertainty and risk. The review goes back to 1901. Different perspectives on uncertainty and risk are examined and a new paradigm to model uncertainty and risk is proposed using relevant ideas from this study. This new paradigm is used to represent, aggregate and propagate uncertainty and interpret the resulting variability in a challenge problem developed by Oberkampf et al. [2004, Challenge problems: uncertainty in system response given uncertain parameters. Reliab Eng Syst Safety 2004; 85(1): 11-9]. The challenge problem is further extended into a decision problem that is treated within a multicriteria decision making framework to illustrate how the new paradigm yields optimal decisions under uncertainty. The accompanying risk is defined as the probability of an unsatisfactory system response quantified by a random function of the uncertainty

  15. Risk-based emergency decision support

    International Nuclear Information System (INIS)

    Koerte, Jens

    2003-01-01

In the present paper we discuss how to assist critical decisions taken under complex, contingent circumstances, with a high degree of uncertainty and short time frames. In such sharp-end decision regimes, standard rule-based decision support systems do not capture the complexity of the situation. At the same time, traditional risk analysis is of little use due to variability in the specific circumstances. How then, can an organisation provide assistance to, e.g. pilots in dealing with such emergencies? A method called 'contingent risk and decision analysis' is presented, to provide decision support for decisions under variable circumstances and short available time scales. The method consists of nine steps of definition, modelling, analysis and criteria definition to be performed 'off-line' by analysts, and procedure generation to transform the analysis result into an operational decision aid. Examples of pilots' decisions in response to sudden vibration during offshore helicopter transport are used to illustrate the approach

  16. Holistic stakeholder-oriented and case study-based risk analysis

    Science.gov (United States)

    Heisterkamp, Tobias

    2013-04-01

by evaluating the results and their correspondence to reality continuously. Using case studies can help to identify important stakeholders, notably potentially affected groups. To cover the essential interests of all important stakeholders, a wide range of vulnerabilities, regarding physical and social aspects, and including their resiliences, has to be assessed. The case studies of storm events offer a solid base for investigations, no matter which method is used. They expose shortcomings like gaps in the warning chain or misunderstandings in warning communication. Case studies of extreme events are very interesting for many stakeholders; insurers and fire brigades use them in their daily work. Thus a case study-based approach is a further chance to integrate practitioners into the analysis process and to get results which they easily understand and can transfer into application. There could be a second advantage in taking many data sets into account: each data set, like the meteorological observations of wind gust speeds, has inherent shortcomings such as limited expressiveness, significance or, especially, uncertainties. Using various approaches could frame the final result and prevent expanding biases or misinterpretations. Altogether, this work stresses the role of transdisciplinary holistic approaches in vulnerability assessments for risk analyses.

  17. Uncertainty analysis in WWTP model applications: a critical discussion using an example from design

    DEFF Research Database (Denmark)

    Sin, Gürkan; Gernaey, Krist; Neumann, Marc B.

    2009-01-01

    of design performance criteria differs significantly. The implication for the practical applications of uncertainty analysis in the wastewater industry is profound: (i) as the uncertainty analysis results are specific to the framing used, the results must be interpreted within the context of that framing......This study focuses on uncertainty analysis of WWTP models and analyzes the issue of framing and how it affects the interpretation of uncertainty analysis results. As a case study, the prediction of uncertainty involved in model-based design of a wastewater treatment plant is studied. The Monte...... to stoichiometric, biokinetic and influent parameters; (2) uncertainty due to hydraulic behaviour of the plant and mass transfer parameters; (3) uncertainty due to the combination of (1) and (2). The results demonstrate that depending on the way the uncertainty analysis is framed, the estimated uncertainty...
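The framing effect this record describes (which parameter groups are treated as uncertain determines the uncertainty one reports) can be illustrated with a toy sizing rule. This is not an ASM plant model; the sizing formula and all distributions are assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 50_000

def design_volume(q, s_in, k):
    """Toy steady-state sizing rule: reactor volume needed to remove the
    influent load at volumetric removal rate k."""
    return q * s_in / k

q = rng.normal(10_000.0, 1_000.0, n).clip(1.0)      # influent flow, m3/d
s_in = rng.normal(0.30, 0.05, n).clip(0.01)         # influent COD, kg/m3
k = rng.normal(0.5, 0.08, n).clip(0.05)             # removal rate, kg/m3/d

# Three framings of the same Monte Carlo design question:
framings = {
    "influent only": design_volume(q, s_in, 0.5),
    "kinetics only": design_volume(10_000.0, 0.30, k),
    "both": design_volume(q, s_in, k),
}
spreads = {name: np.percentile(v, 95) - np.percentile(v, 5)
           for name, v in framings.items()}
```

The 90% spread of the designed volume differs substantially between framings, which is the study's central point: uncertainty analysis results must be interpreted within the framing used.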

  18. Uncertainty estimation and risk prediction in air quality

    International Nuclear Information System (INIS)

    Garaud, Damien

    2011-01-01

This work is about uncertainty estimation and risk prediction in air quality. Firstly, we build a multi-model ensemble of air quality simulations which can take into account all uncertainty sources related to air quality modeling. Ensembles of photochemical simulations at continental and regional scales are automatically generated. Then, these ensembles are calibrated with a combinatorial optimization method. It selects a sub-ensemble which is representative of uncertainty or shows good resolution and reliability for probabilistic forecasting. This work shows that it is possible to estimate and forecast uncertainty fields related to ozone and nitrogen dioxide concentrations or to improve the reliability of threshold exceedance predictions. The approach is compared with Monte Carlo simulations, calibrated or not. The Monte Carlo approach appears to be less representative of the uncertainties than the multi-model approach. Finally, we quantify the observational error, the representativeness error and the modeling errors. The work is applied to the impact of thermal power plants, in order to quantify the uncertainty on the impact estimates. (author) [fr

  19. Phenomenological uncertainty analysis of containment building pressure load caused by severe accident sequences

    International Nuclear Information System (INIS)

    Park, S.Y.; Ahn, K.I.

    2014-01-01

    Highlights: • Phenomenological uncertainty analysis has been applied to level 2 PSA. • The methodology provides an alternative to simple deterministic analyses and sensitivity studies. • A realistic evaluation provides a more complete characterization of risks. • Uncertain parameters of MAAP code for the early containment failure were identified. - Abstract: This paper illustrates an application of a severe accident analysis code, MAAP, to the uncertainty evaluation of early containment failure scenarios employed in the containment event tree (CET) model of a reference plant. An uncertainty analysis of containment pressure behavior during severe accidents has been performed for an optimum assessment of an early containment failure model. The present application is mainly focused on determining an estimate of the containment building pressure load caused by severe accident sequences of a nuclear power plant. Key modeling parameters and phenomenological models employed for the present uncertainty analysis are closely related to the in-vessel hydrogen generation, direct containment heating, and gas combustion. The basic approach of this methodology is to (1) develop severe accident scenarios for which containment pressure loads should be performed based on a level 2 PSA, (2) identify severe accident phenomena relevant to an early containment failure, (3) identify the MAAP input parameters, sensitivity coefficients, and modeling options that describe or influence the early containment failure phenomena, (4) prescribe the likelihood descriptions of the potential range of these parameters, and (5) evaluate the code predictions using a number of random combinations of parameter inputs sampled from the likelihood distributions
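Step (5) above, evaluating the code over random combinations of parameter inputs sampled from the likelihood distributions, is often carried out with stratified (Latin hypercube) sampling. A minimal numpy-only sketch follows; the parameter names and ranges are hypothetical stand-ins, not actual MAAP inputs.

```python
import numpy as np

rng = np.random.default_rng(9)

def latin_hypercube(n_samples: int, n_params: int) -> np.ndarray:
    """Latin hypercube uniforms: one sample per equal-probability stratum
    in every parameter, with strata randomly paired across parameters."""
    u = (np.arange(n_samples)[:, None] + rng.random((n_samples, n_params))) / n_samples
    return rng.permuted(u, axis=0)  # shuffle each column independently

# Hypothetical ranges for three severe-accident inputs (illustrative only),
# e.g. in-vessel H2 generation fraction, DCH fraction, burn completeness:
lows = np.array([0.2, 0.0, 0.4])
highs = np.array([0.8, 0.5, 1.0])
samples = lows + latin_hypercube(200, 3) * (highs - lows)
```

Each row of `samples` is one input combination for a code run; stratification guarantees the full likelihood range of every parameter is covered even with modest run counts.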

  20. Uncertainty studies and risk assessment for CO2 storage in geological formations

    International Nuclear Information System (INIS)

    Walter, Lena Sophie

    2013-01-01

Carbon capture and storage (CCS) in deep geological formations is one possible option to mitigate the greenhouse gas effect by reducing CO2 emissions into the atmosphere. The assessment of the risks related to CO2 storage is an important task. Events such as CO2 leakage and brine displacement could result in hazards for human health and the environment. In this thesis, a systematic and comprehensive risk assessment concept is presented to investigate various levels of uncertainties and to assess risks using numerical simulations. Depending on the risk and the processes, which should be assessed, very complex models, large model domains, large time scales, and many simulations runs for estimating probabilities are required. To reduce the resulting high computational costs, a model reduction technique (the arbitrary polynomial chaos expansion) and a method for model coupling in space are applied. The different levels of uncertainties are: statistical uncertainty in parameter distributions, scenario uncertainty, e.g. different geological features, and recognized ignorance due to assumptions in the conceptual model set-up. Recognized ignorance and scenario uncertainty are investigated by simulating well defined model set-ups and scenarios. According to damage values, which are defined as a model output, the set-ups and scenarios can be compared and ranked. For statistical uncertainty probabilities can be determined by running Monte Carlo simulations with the reduced model. The results are presented in various ways: e.g., mean damage, probability density function, cumulative distribution function, or an overall risk value by multiplying the damage with the probability. If the model output (damage) cannot be compared to provided criteria (e.g. water quality criteria), analytical approximations are presented to translate the damage into comparable values. The overall concept is applied for the risks related to brine displacement and infiltration into drinking water

  1. Risk assessment calculations using MEPAS, an accepted screening methodology, and an uncertainty analysis for the reranking of Waste Area Groupings at Oak Ridge National Laboratory, Oak Ridge, Tennessee

    International Nuclear Information System (INIS)

    Shevenell, L.; Hoffman, F.O.; MacIntosh, D.

    1992-03-01

The Waste Area Groupings (WAGs) at the Oak Ridge National Laboratory (ORNL) were reranked with respect to on- and off-site human health risks using two different methods. Risks associated with selected contaminants from each WAG for occupants of WAG 2 or an off-site area were calculated using a modified formulation of the Multimedia Environmental Pollutant Assessment System (MEPAS) and a method suitable for screening, referred to as the ORNL/ESD method (the method developed by the Environmental Sciences Division at ORNL) in this report. Each method resulted in a different ranking of the WAGs. The rankings from the two methods are compared in this report. All risk assessment calculations, except the original MEPAS calculations, indicated that WAGs 1; 2, 6, and 7 (treated as one combined WAG); and 4 pose the greatest potential threat to human health. However, the overall rankings of the WAGs using constant parameter values in the different methods were inconclusive, because uncertainty in parameter values can change the calculated risk associated with particular pathways and, hence, the final rankings. Uncertainty analysis using uncertainties about all model parameters was used to reduce biases associated with parameter selection and to more reliably rank waste sites according to the potential risks associated with site contaminants. The uncertainty analysis indicates that the WAGs should be considered for further investigation, or remediation, in the following order: (1) WAG 1; (2) WAGs 2, 6, and 7 (combined), and WAG 4; (3) WAGs 3, 5, and 9; and (4) WAG 8

  2. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment. Volume 3, Appendices C, D, E, F, and G

    Energy Technology Data Exchange (ETDEWEB)

Harper, F.T.; Young, M.L.; Miller, L.A. [Sandia National Labs., Albuquerque, NM (United States)] [and others]

    1995-01-01

MACCS and COSYMA, two new probabilistic accident consequence codes whose development was completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, developed independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the third of a three-volume document describing the project and contains descriptions of the probability assessment principles; the expert identification and selection process; the weighting methods used; the inverse modeling methods; case structures; and summaries of the consequence codes.
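Aggregating several experts' elicited distributions into one library entry is commonly done with an equal-weight linear opinion pool, i.e. a mixture of the experts' distributions. The sketch below uses hypothetical lognormal elicitations, not the study's actual expert data or its weighting scheme.

```python
import numpy as np

rng = np.random.default_rng(11)

def pooled_samples(expert_params, n=100_000):
    """Equal-weight linear opinion pool: each draw comes from one expert's
    distribution (here lognormal), chosen uniformly at random."""
    mus = np.array([m for m, _ in expert_params])
    sigmas = np.array([s for _, s in expert_params])
    pick = rng.integers(len(expert_params), size=n)
    return rng.lognormal(mus[pick], sigmas[pick])

# Three hypothetical experts' elicitations for, e.g., a deposition parameter:
experts = [(np.log(1.0e-3), 0.5), (np.log(2.0e-3), 0.8), (np.log(5.0e-4), 0.4)]
q05, q50, q95 = np.quantile(pooled_samples(experts), [0.05, 0.5, 0.95])
```

The pooled 5th/50th/95th percentiles are the kind of aggregated distribution summary that is then propagated through the consequence code models for validation.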

  3. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment. Volume 3, Appendices C, D, E, F, and G

    International Nuclear Information System (INIS)

    Harper, F.T.; Young, M.L.; Miller, L.A.

    1995-01-01

    Two new probabilistic accident consequence codes, MACCS and COSYMA, completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, developed independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the third of a three-volume document describing the project and contains descriptions of the probability assessment principles; the expert identification and selection process; the weighting methods used; the inverse modeling methods; case structures; and summaries of the consequence codes

  4. Communicating Uncertainty about Climate Change for Application to Security Risk Management

    Science.gov (United States)

    Gulledge, J. M.

    2011-12-01

    The science of climate change has convincingly demonstrated that human activities, including the release of greenhouse gases, land-surface changes, particle emissions, and redistribution of water, are changing global and regional climates. Consequently, key institutions are now concerned about the potential social impacts of climate change. For example, the 2010 Quadrennial Defense Review Report from the U.S. Department of Defense states that "climate change, energy security, and economic stability are inextricably linked." Meanwhile, insured losses from climate and weather-related natural disasters have risen dramatically over the past thirty years. Although these losses stem largely from socioeconomic trends, insurers are concerned that climate change could exacerbate this trend and render certain types of climate risk non-diversifiable. At the same time, the climate science community, broadly defined as physical, biological, and social scientists focused on some aspect of climate change, remains largely focused on scholarly activities that are valued in the academy but not especially useful to decision makers. On the other hand, climate scientists who engage in policy discussions have generally permitted vested interests who support or oppose climate policies to frame the discussion of climate science within the policy arena. Such discussions focus on whether scientific uncertainties are sufficiently resolved to justify policy, and the vested interests overstate or understate key uncertainties to support their own agendas. Consequently, the scientific community has become absorbed in defending scientific findings to the near exclusion of developing novel tools to aid in risk-based decision-making. For example, the Intergovernmental Panel on Climate Change (IPCC), established expressly for the purpose of informing governments, has largely been engaged in attempts to reduce unavoidable uncertainties rather than helping the world's governments define a science-based risk

  5. Holistic uncertainty analysis in river basin modeling for climate vulnerability assessment

    Science.gov (United States)

    Taner, M. U.; Wi, S.; Brown, C.

    2017-12-01

    The challenges posed by an uncertain future climate are a prominent concern for water resources managers. A number of frameworks, such as scenario-based and vulnerability-based approaches, exist for assessing the impacts of climate-related uncertainty, including both internal climate variability and anthropogenic climate change. While in many cases climate uncertainty may be dominant, other factors such as the future evolution of the river basin, hydrologic response, and reservoir operations are potentially significant sources of uncertainty. While uncertainty associated with modeling hydrologic response has received attention, very little attention has focused on the range of uncertainty and possible effects of the water resources infrastructure and management. This work presents a holistic framework that allows analysis of climate, hydrologic, and water management uncertainty in water resources systems analysis with the aid of a water system model designed to integrate component models for hydrologic processes and water management activities. The uncertainties explored include those associated with climate variability and change, hydrologic model parameters, and water system operation rules. A Bayesian framework is used to quantify and model the uncertainties at each modeling step in an integrated fashion, incorporating prior and likelihood information about model parameters. The framework is demonstrated in a case study for the St. Croix Basin, located on the border of the United States and Canada.
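The Bayesian parameter update at the heart of such a framework can be illustrated with the simplest conjugate case: a normal prior on a single hydrologic parameter combined with normally distributed observations. This is a toy sketch, not the study's joint multi-component model; the prior and the synthetic observations are hypothetical.

```python
def normal_posterior(prior_mean, prior_var, obs, obs_var):
    """Posterior mean/variance for a normal prior and a normal likelihood
    with known observation variance (conjugate normal-normal update)."""
    n = len(obs)
    obs_mean = sum(obs) / n
    post_var = 1.0 / (1.0 / prior_var + n / obs_var)
    post_mean = post_var * (prior_mean / prior_var + n * obs_mean / obs_var)
    return post_mean, post_var

# Hypothetical prior belief about a runoff coefficient, then ten
# synthetic calibration observations centered on 0.62:
post_mean, post_var = normal_posterior(prior_mean=0.5, prior_var=0.04,
                                       obs=[0.62] * 10, obs_var=0.01)
```

The posterior mean falls between the prior and the data, and the posterior variance shrinks below the prior variance, which is the sense in which observations reduce parameter uncertainty.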

  6. Risk-Informed Safety Margin Characterization (RISMC): Integrated Treatment of Aleatory and Epistemic Uncertainty in Safety Analysis

    International Nuclear Information System (INIS)

    Youngblood, R.W.

    2010-01-01

    The concept of 'margin' has a long history in nuclear licensing and in the codification of good engineering practices. However, some traditional applications of 'margin' have been carried out for surrogate scenarios (such as design basis scenarios), without regard to the actual frequencies of those scenarios, and have been carried out in a systematically conservative fashion. This means that the effectiveness of the application of the margin concept is determined in part by the original choice of surrogates, and is limited in any case by the degree of conservatism imposed on the evaluation. In the RISMC project, which is part of the Department of Energy's 'Light Water Reactor Sustainability Program' (LWRSP), we are developing a risk-informed characterization of safety margin. Beginning with the traditional discussion of 'margin' in terms of a 'load' (a physical challenge to system or component function) and a 'capacity' (the capability of that system or component to accommodate the challenge), we are developing the capability to characterize probabilistic load and capacity spectra, reflecting both aleatory and epistemic uncertainty in system response. For example, the probabilistic load spectrum will reflect the frequency of challenges of a particular severity. Such a characterization is required if decision-making is to be informed optimally. However, in order to enable the quantification of probabilistic load spectra, existing analysis capability needs to be extended. Accordingly, the INL is working on a next-generation safety analysis capability whose design will allow for much more efficient parameter uncertainty analysis, and will enable a much better integration of reliability-related and phenomenology-related aspects of margin.
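The load/capacity formulation above lends itself to a direct Monte Carlo reading of margin: sample both spectra and count how often the load exceeds the capacity. A minimal sketch, assuming lognormal load and capacity distributions with hypothetical parameters (not RISMC's actual spectra):

```python
import random

def failure_probability(n_samples=100_000, seed=7):
    """Monte Carlo estimate of P(load > capacity) for probabilistic
    load and capacity spectra; both distributions here are lognormal
    with illustrative parameters."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        load = rng.lognormvariate(0.0, 0.5)      # challenge severity
        capacity = rng.lognormvariate(1.0, 0.3)  # component capability
        if load > capacity:
            failures += 1
    return failures / n_samples

p_fail = failure_probability()
```

Epistemic uncertainty would enter by also sampling the distribution parameters themselves in an outer loop, giving a distribution over `p_fail` rather than a point value.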

  7. A stochastic multicriteria model for evidence-based decision making in drug benefit-risk analysis.

    Science.gov (United States)

    Tervonen, Tommi; van Valkenhoef, Gert; Buskens, Erik; Hillege, Hans L; Postmus, Douwe

    2011-05-30

    Drug benefit-risk (BR) analysis is based on firm clinical evidence regarding various safety and efficacy outcomes. In this paper, we propose a new and more formal approach for constructing a supporting multi-criteria model that fully takes into account the evidence on efficacy and adverse drug reactions. Our approach is based on the stochastic multi-criteria acceptability analysis methodology, which allows us to compute the typical value judgments that support a decision, to quantify decision uncertainty, and to compute a comprehensive BR profile. We construct a multi-criteria model for the therapeutic group of second-generation antidepressants. We assess fluoxetine and venlafaxine together with placebo according to incidence of treatment response and three common adverse drug reactions by using data from a published study. Our model shows that there are clear trade-offs among the treatment alternatives. Copyright © 2011 John Wiley & Sons, Ltd.
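The core SMAA computation the abstract refers to, estimating rank acceptability indices by jointly sampling criteria measurements and weights, can be sketched as follows. All alternative names and (benefit, risk) values below are hypothetical stand-ins, not the antidepressant data from the published study.

```python
import random

def smaa_first_rank_acceptability(alternatives, n_iter=20_000, seed=3):
    """First-rank acceptability indices via Monte Carlo: the share of
    sampled weight/measurement combinations in which each alternative
    scores best. Criteria are (benefit, risk), with risk minimized."""
    rng = random.Random(seed)
    wins = {name: 0 for name, _ in alternatives}
    for _ in range(n_iter):
        w = rng.random()  # random weight on benefit; (1 - w) on low risk
        best, best_u = None, float("-inf")
        for name, (b_mu, b_sd, r_mu, r_sd) in alternatives:
            b = rng.gauss(b_mu, b_sd)   # sampled benefit measurement
            r = rng.gauss(r_mu, r_sd)   # sampled risk measurement
            u = w * b + (1.0 - w) * (1.0 - r)
            if u > best_u:
                best, best_u = name, u
        wins[best] += 1
    return {name: count / n_iter for name, count in wins.items()}

# (benefit mean, sd, risk mean, sd) on a 0-1 scale, hypothetical:
alts = [("placebo", (0.30, 0.05, 0.10, 0.03)),
        ("drug_a", (0.55, 0.06, 0.30, 0.05)),
        ("drug_b", (0.60, 0.06, 0.45, 0.06))]
indices = smaa_first_rank_acceptability(alts)
```

The resulting indices sum to one and quantify decision uncertainty: no alternative wins for every plausible weighting, which is the "clear trade-offs" finding in miniature.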

  8. Quantitative Analysis of Variability and Uncertainty in Environmental Data and Models. Volume 1. Theory and Methodology Based Upon Bootstrap Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Frey, H. Christopher [North Carolina State University, Raleigh, NC (United States); Rhodes, David S. [North Carolina State University, Raleigh, NC (United States)

    1999-04-30

    This is Volume 1 of a two-volume set of reports describing work conducted at North Carolina State University sponsored by Grant Number DE-FG05-95ER30250 by the U.S. Department of Energy. The title of the project is “Quantitative Analysis of Variability and Uncertainty in Acid Rain Assessments.” The work conducted under sponsorship of this grant pertains primarily to two main topics: (1) development of new methods for quantitative analysis of variability and uncertainty applicable to any type of model; and (2) analysis of variability and uncertainty in the performance, emissions, and cost of electric power plant combustion-based NOx control technologies. These two main topics are reported separately in Volumes 1 and 2.

  9. Risk Management and Uncertainty in Large Complex Public Projects

    DEFF Research Database (Denmark)

    Neerup Themsen, Tim; Harty, Chris; Tryggestad, Kjell

    Governmental actors worldwide are promoting risk management as a rational approach to manage uncertainty and improve the abilities to deliver large complex projects according to budget, time plans, and pre-set project specifications: But what do we know about the effects of risk management...... on the abilities to meet such objectives? Using Callon's (1998) twin notions of framing and overflowing we examine the implementation of risk management within the Danish public sector and the effects this generated for the management of two large complex projects. We show how the rational framing of risk...... management has generated unexpected costly outcomes such as: the undermining of the longer-term value and societal relevance of the built asset, the negligence of the wider range of uncertainties emerging during project processes, and constraining forms of knowledge. We also show how expert accountants play...

  10. Visualizing the uncertainty in the relationship between seasonal average climate and malaria risk.

    Science.gov (United States)

    MacLeod, D A; Morse, A P

    2014-12-02

    Around $1.6 billion per year is spent financing anti-malaria initiatives, and though malaria morbidity is falling, the impact of annual epidemics remains significant. Whilst malaria risk may increase with climate change, projections are highly uncertain, and to sidestep this intractable uncertainty, adaptation efforts should improve societal ability to anticipate and mitigate individual events. Anticipation of climate-related events is made possible by seasonal climate forecasting, from which warnings of anomalous seasonal average temperature and rainfall are possible months in advance. Seasonal climate hindcasts have been used to drive climate-based models for malaria, showing significant skill for observed malaria incidence. However, the relationship between seasonal average climate and malaria risk remains unquantified. Here we explore this relationship, using a dynamic weather-driven malaria model. We also quantify key uncertainty in the malaria model, by introducing variability in one of the first-order uncertainties in model formulation. Results are visualized as location-specific impact surfaces: easily integrated with ensemble seasonal climate forecasts, and intuitively communicating quantified uncertainty. Methods are demonstrated for two epidemic regions, and are not limited to malaria modeling; the visualization method could be applied to any climate impact.

  11. Modelling and propagation of uncertainties in the German Risk Study

    International Nuclear Information System (INIS)

    Hofer, E.; Krzykacz, B.

    1982-01-01

    Risk assessments are generally subject to uncertainty considerations. This is because of the various estimates that are involved. The paper points out those estimates in the so-called phase A of the German Risk Study for which uncertainties were quantified. It explains the probabilistic models applied in the assessment and their impact on the findings of the study. Finally, the resulting subjective confidence intervals of the study results are presented and their sensitivity to these probabilistic models is investigated

  12. Accounting for Epistemic Uncertainty in Mission Supportability Assessment: A Necessary Step in Understanding Risk and Logistics Requirements

    Science.gov (United States)

    Owens, Andrew; De Weck, Olivier L.; Stromgren, Chel; Goodliff, Kandyce; Cirillo, William

    2017-01-01

    Future crewed missions to Mars present a maintenance logistics challenge that is unprecedented in human spaceflight. Mission endurance – defined as the time between resupply opportunities – will be significantly longer than previous missions, and therefore logistics planning horizons are longer and the impact of uncertainty is magnified. Maintenance logistics forecasting typically assumes that component failure rates are deterministically known and uses them to represent aleatory uncertainty, or uncertainty that is inherent to the process being examined. However, failure rates cannot be directly measured; rather, they are estimated based on similarity to other components or statistical analysis of observed failures. As a result, epistemic uncertainty – that is, uncertainty in knowledge of the process – exists in failure rate estimates that must be accounted for. Analyses that neglect epistemic uncertainty tend to significantly underestimate risk. Epistemic uncertainty can be reduced via operational experience; for example, the International Space Station (ISS) failure rate estimates are refined using a Bayesian update process. However, design changes may re-introduce epistemic uncertainty. Thus, there is a tradeoff between changing a design to reduce failure rates and operating a fixed design to reduce uncertainty. This paper examines the impact of epistemic uncertainty on maintenance logistics requirements for future Mars missions, using data from the ISS Environmental Control and Life Support System (ECLS) as a baseline for a case study. Sensitivity analyses are performed to investigate the impact of variations in failure rate estimates and epistemic uncertainty on spares mass. The results of these analyses and their implications for future system design and mission planning are discussed.
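The ISS-style Bayesian refinement of failure-rate estimates mentioned above has a standard conjugate form: a gamma prior on the failure rate combined with Poisson-distributed failure counts. A simplified sketch (the prior parameters and observed data are hypothetical, not ECLS values):

```python
def gamma_poisson_update(alpha, beta, failures, exposure_time):
    """Bayesian update of a failure-rate estimate: a gamma(alpha, beta)
    prior with `failures` events observed over `exposure_time` gives a
    gamma(alpha + failures, beta + exposure_time) posterior."""
    return alpha + failures, beta + exposure_time

def gamma_mean_and_relvar(alpha, beta):
    """Mean failure rate and squared relative spread (sd/mean)^2, a
    convenient scalar measure of epistemic uncertainty."""
    return alpha / beta, 1.0 / alpha

# Hypothetical prior: ~0.5 failures/year with large epistemic spread,
# then 2 failures observed over 5 years of operation.
a0, b0 = 0.5, 1.0
a1, b1 = gamma_poisson_update(a0, b0, failures=2, exposure_time=5.0)
mean0, relvar0 = gamma_mean_and_relvar(a0, b0)
mean1, relvar1 = gamma_mean_and_relvar(a1, b1)
```

Operational experience narrows the relative epistemic uncertainty (`relvar1 < relvar0`); a design change would, in this framing, reset `alpha` and `beta` toward the wider prior, which is the tradeoff the paper examines.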

  13. A Benefit-Risk Analysis Approach to Capture Regulatory Decision-Making: Multiple Myeloma.

    Science.gov (United States)

    Raju, G K; Gurumurthi, Karthik; Domike, Reuben; Kazandjian, Dickran; Landgren, Ola; Blumenthal, Gideon M; Farrell, Ann; Pazdur, Richard; Woodcock, Janet

    2018-01-01

    Drug regulators around the world make decisions about drug approvability based on qualitative benefit-risk analysis. In this work, a quantitative benefit-risk analysis approach captures regulatory decision-making about new drugs to treat multiple myeloma (MM). MM assessments have been based on endpoints such as time to progression (TTP), progression-free survival (PFS), and objective response rate (ORR), which differ from benefit-risk analysis based on overall survival (OS). Twenty-three FDA decisions on MM drugs submitted to FDA between 2003 and 2016 were identified and analyzed. The benefits and risks were quantified relative to comparators (typically the control arm of the clinical trial) to estimate whether the median benefit-risk was positive or negative. A sensitivity analysis was demonstrated using ixazomib to explore the magnitude of uncertainty. FDA approval decision outcomes were consistent and logical using this benefit-risk framework. © 2017 American Society for Clinical Pharmacology and Therapeutics.
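One way to read "quantified relative to comparators" with uncertainty attached is a Monte Carlo estimate of the probability that the net benefit-risk difference versus the comparator is positive. This is a generic stand-in for the paper's framework; the effect sizes and the common scale below are hypothetical.

```python
import random

def prob_positive_benefit_risk(benefit_mu, benefit_sd, risk_mu, risk_sd,
                               n=50_000, seed=11):
    """Probability that net (benefit - risk) relative to the comparator
    is positive, under independent normal uncertainty on each component."""
    rng = random.Random(seed)
    positive = 0
    for _ in range(n):
        net = rng.gauss(benefit_mu, benefit_sd) - rng.gauss(risk_mu, risk_sd)
        if net > 0:
            positive += 1
    return positive / n

# E.g. a PFS gain (months) against an added toxicity burden expressed
# on the same scale, both with sampling uncertainty:
p_positive = prob_positive_benefit_risk(4.0, 1.5, 1.0, 0.8)
```

Varying the inputs over plausible ranges, as the paper does for ixazomib, turns the single probability into a sensitivity analysis.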

  14. Maritime transportation risk analysis: Review and analysis in light of some foundational issues

    International Nuclear Information System (INIS)

    Goerlandt, Floris; Montewka, Jakub

    2015-01-01

    Many methods and applications for maritime transportation risk analysis have been presented in the literature. In parallel, there is a recent focus on foundational issues in risk analysis, with calls for intensified research on fundamental concepts and principles underlying the scientific field. This paper presents a review and analysis of risk definitions, perspectives and scientific approaches to risk analysis found in the maritime transportation application area, focusing on applications addressing accidental risk of shipping in a sea area. For this purpose, a classification of risk definitions, an overview of elements in risk perspectives and a classification of approaches to risk analysis science are applied. Results reveal that in the application area, risk is strongly tied to probability, both in definitions and perspectives, while alternative views exist. A diffuse situation is also found concerning the scientific approach to risk analysis, with realist, proceduralist and constructivist foundations co-existing. Realist approaches dominate the application area. Very few applications systematically account for uncertainty, neither concerning the evidence base nor in relation to the limitations of the risk model in relation to the space of possible outcomes. Some suggestions are made to improve the current situation, aiming to strengthen the scientific basis for risk analysis. - Highlights: • Risk analyses in maritime transportation analysed in light of foundational issues. • Focus on definitions, perspectives and scientific approaches to risk analysis. • Probability-based definitions and realist approaches dominate the field. • Findings support calls for increased focus on foundational issues in risk research. • Some suggestions are made to improve the current situation

  15. Parameter Uncertainty for Repository Thermal Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Hardin, Ernest [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hadgu, Teklu [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Greenberg, Harris [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Dupont, Mark [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-10-01

    This report is one follow-on to a study of reference geologic disposal design concepts (Hardin et al. 2011a). Based on an analysis of maximum temperatures, that study concluded that certain disposal concepts would require extended decay storage prior to emplacement, or the use of small waste packages, or both. The study used nominal values for thermal properties of host geologic media and engineered materials, demonstrating the need for uncertainty analysis to support the conclusions. This report is a first step that identifies the input parameters of the maximum temperature calculation, surveys published data on measured values, uses an analytical approach to determine which parameters are most important, and performs an example sensitivity analysis. Using results from this first step, temperature calculations planned for FY12 can focus on only the important parameters, and can use the uncertainty ranges reported here. The survey of published information on thermal properties of geologic media and engineered materials is intended to be sufficient for use in generic calculations to evaluate the feasibility of reference disposal concepts. A full compendium of literature data is beyond the scope of this report. The term “uncertainty” is used here to represent both measurement uncertainty and spatial variability, or variability across host geologic units. For the most important parameters (e.g., buffer thermal conductivity) the extent of literature data surveyed samples these different forms of uncertainty and variability. Finally, this report is intended to be one chapter or section of a larger FY12 deliverable summarizing all the work on design concepts and thermal load management for geologic disposal (M3FT-12SN0804032, due 15Aug2012).

  16. Comparison of the effect of hazard and response/fragility uncertainties on core melt probability uncertainty

    International Nuclear Information System (INIS)

    Mensing, R.W.

    1985-01-01

    This report proposes a method for comparing the effects of the uncertainty in probabilistic risk analysis (PRA) input parameters on the uncertainty in the predicted risks. The proposed method is applied to compare the effect of uncertainties in the descriptions of (1) the seismic hazard at a nuclear power plant site and (2) random variations in plant subsystem responses and component fragility on the uncertainty in the predicted probability of core melt. The PRA used is that developed by the Seismic Safety Margins Research Program

  17. Sensitivity and uncertainty analysis of NET/ITER shielding blankets

    International Nuclear Information System (INIS)

    Hogenbirk, A.; Gruppelaar, H.; Verschuur, K.A.

    1990-09-01

    Results are presented of sensitivity and uncertainty calculations based upon the European fusion file (EFF-1). The effect of uncertainties in Fe, Cr and Ni cross sections on the nuclear heating in the coils of a NET/ITER shielding blanket has been studied. The analysis has been performed for the total cross section as well as partial cross sections. The correct expression for the sensitivity profile was used, including the gain term. The resulting uncertainty in the nuclear heating lies between 10 and 20 per cent. (author). 18 refs.; 2 figs.; 2 tabs
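Uncertainty propagation with sensitivity profiles of the kind described here commonly uses the first-order "sandwich" rule, var(R) = SᵀCS, with S the vector of relative sensitivities of the response (here, nuclear heating) to the cross sections and C their relative covariance matrix. A toy two-parameter sketch with hypothetical sensitivities and covariances, not the EFF-1 data:

```python
def sandwich_variance(sensitivities, covariance):
    """First-order propagated variance: var = S^T C S."""
    n = len(sensitivities)
    var = 0.0
    for i in range(n):
        for j in range(n):
            var += sensitivities[i] * covariance[i][j] * sensitivities[j]
    return var

# Relative sensitivities of heating to two cross sections (hypothetical),
# with 10% and 15% standard deviations and correlation 0.3:
S = [0.8, -0.5]
C = [[0.10 ** 2, 0.3 * 0.10 * 0.15],
     [0.3 * 0.10 * 0.15, 0.15 ** 2]]
rel_var = sandwich_variance(S, C)
rel_sd = rel_var ** 0.5  # relative uncertainty in the nuclear heating
```

Note that correlations and the sign of the sensitivities matter: here the anticorrelated contribution partially cancels, pulling the propagated uncertainty below a naive quadrature sum.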

  18. Expert judgment based multi-criteria decision model to address uncertainties in risk assessment of nanotechnology-enabled food products

    International Nuclear Information System (INIS)

    Flari, Villie; Chaudhry, Qasim; Neslo, Rabin; Cooke, Roger

    2011-01-01

    Currently, risk assessment of nanotechnology-enabled food products is considered difficult due to the large number of uncertainties involved. We developed an approach which could address some of the main uncertainties through the use of expert judgment. Our approach employs a multi-criteria decision model, based on probabilistic inversion that enables capturing experts’ preferences in regard to safety of nanotechnology-enabled food products, and identifying their opinions in regard to the significance of key criteria that are important in determining the safety of such products. An advantage of these sample-based techniques is that they provide out-of-sample validation and therefore a robust scientific basis. This validation in turn adds predictive power to the model developed. We achieved out-of-sample validation in two ways: (1) a portion of the expert preference data was excluded from the model’s fitting and was then predicted by the model fitted on the remaining rankings and (2) a (partially) different set of experts generated new scenarios, using the same criteria employed in the model, and ranked them; their ranks were compared with ranks predicted by the model. The degree of validation in each method was less than perfect but reasonably substantial. The validated model we applied captured and modelled experts’ preferences regarding safety of hypothetical nanotechnology-enabled food products. It appears therefore that such an approach can provide a promising route to explore further for assessing the risk of nanotechnology-enabled food products.

  19. Climate uncertainty and implications for U.S. state-level risk assessment through 2050.

    Energy Technology Data Exchange (ETDEWEB)

    Loose, Verne W.; Lowry, Thomas Stephen; Malczynski, Leonard A.; Tidwell, Vincent Carroll; Stamber, Kevin Louis; Kelic, Andjelka; Backus, George A.; Warren, Drake E.; Zagonel, Aldo A.; Ehlen, Mark Andrew; Klise, Geoffrey T.; Vargas, Vanessa N.

    2009-10-01

    Decisions for climate policy will need to take place in advance of climate science resolving all relevant uncertainties. Further, if the concern of policy is to reduce risk, then the best estimate of climate change impacts may not be so important as the currently understood uncertainty associated with realizable conditions having high consequence. This study focuses on one of the most uncertain aspects of future climate change, precipitation, to understand the implications of uncertainty on risk and the near-term justification for interventions to mitigate the course of climate change. We show that the mean risk of damage to the economy from climate change, at the national level, is on the order of one trillion dollars over the next 40 years, with employment impacts of nearly 7 million labor-years. At a 1% exceedance-probability, the impact is over twice the mean-risk value. Impacts at the level of individual U.S. states are then typically in the range of multiple tens of billions of dollars, with employment losses exceeding hundreds of thousands of labor-years. We used results of the Intergovernmental Panel on Climate Change's (IPCC) Fourth Assessment Report (AR4) climate-model ensemble as the referent for climate uncertainty over the next 40 years, mapped the simulated weather hydrologically to the county level for determining the physical consequence to economic activity at the state level, and then performed a detailed, seventy-industry analysis of economic impact among the interacting lower-48 states. We determined industry GDP and employment impacts at the state level, as well as interstate population migration, effect on personal income, and the consequences for the U.S. trade balance.
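The mean-risk versus 1%-exceedance comparison in the abstract is a simple pair of statistics over an ensemble of simulated damage outcomes. A sketch with a hypothetical heavy-tailed damage ensemble (the distribution and its parameters are illustrative, not the study's results):

```python
import random

def mean_and_exceedance(damages, p=0.01):
    """Mean risk and the damage level exceeded with probability p,
    computed from an ensemble of simulated damage outcomes."""
    xs = sorted(damages)
    mean = sum(xs) / len(xs)
    # Smallest value with at most a fraction p of the ensemble above it:
    idx = min(len(xs) - 1, int((1.0 - p) * len(xs)))
    return mean, xs[idx]

# Hypothetical damage ensemble ($ trillions over 40 years), heavy-tailed:
rng = random.Random(5)
ensemble = [rng.lognormvariate(-0.2, 0.8) for _ in range(10_000)]
mean_risk, x99 = mean_and_exceedance(ensemble, p=0.01)
```

With a heavy-tailed distribution the 1% exceedance value sits well above the mean, which is why the abstract's risk framing emphasizes low-probability, high-consequence outcomes rather than the best estimate.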

  20. Quantitative risk assessment via uncertainty analysis in combination with error propagation for the determination of the dynamic Design Space of the primary drying step during freeze-drying

    DEFF Research Database (Denmark)

    Van Bockstal, Pieter Jan; Mortier, Séverine Thérèse F.C.; Corver, Jos

    2017-01-01

    of a freeze-drying process, allowing to quantitatively estimate and control the risk of cake collapse (i.e., the Risk of Failure (RoF)). The propagation of the error on the estimation of the thickness of the dried layer Ldried as function of primary drying time was included in the uncertainty analysis...

  1. Principal results of uncertainty and sensitivity analysis for the generic Spanish AGP-granite

    International Nuclear Information System (INIS)

    Bolado, R.; Moya, J.A.

    1998-01-01

    Recently, ENRESA published its Performance Assessment of a deep geologic repository in granite. This paper summarises the main results of the uncertainty and sensitivity analysis performed on the data generated for the main scenario in the ENRESA Performance Assessment. The uncertainty analysis allowed us to determine the most important radionuclides, which were ¹²⁹I, ³⁶Cl, ⁷⁹Se and ¹²⁶Sn, and to estimate upper bounds for the risk due to each one of them and for the global risk. Since ¹²⁹I was the most important radionuclide, the main efforts in the sensitivity study were devoted to identifying the most influential parameters on the maximum dose due to that radionuclide. The analysis shows that the order of magnitude of the maximum dose is essentially related to geosphere transport parameters. Nevertheless, the most influential parameters, when considering only the highest values of the maximum doses, are those that control the total amount of contaminant that can be driven into the main path to the biosphere. (Author) 3 refs
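Sensitivity studies of this kind often rank input parameters by the rank correlation between sampled inputs and the output of interest (here, the maximum dose). A minimal Spearman-correlation sketch on synthetic data; the "travel time drives peak dose" relation below is a hypothetical illustration, not the ENRESA dataset.

```python
import random

def spearman_rank_correlation(xs, ys):
    """Spearman correlation between sampled input values and the model
    output, a common global sensitivity measure in performance assessment."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        for rank, i in enumerate(order):
            r[i] = float(rank)
        return r
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

# Synthetic monotone relation (longer geosphere travel time -> lower
# peak dose) with multiplicative noise:
rng = random.Random(2)
travel_time = [rng.uniform(1e3, 1e5) for _ in range(200)]
max_dose = [rng.lognormvariate(0.0, 0.2) / t for t in travel_time]
rho = spearman_rank_correlation(travel_time, max_dose)
```

A coefficient near -1 flags travel time as highly influential; repeating this for each sampled parameter yields the ranking of influential parameters the abstract describes.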

  2. A scenario-based procedure for seismic risk analysis

    International Nuclear Information System (INIS)

    Kluegel, J.-U.; Mualchin, L.; Panza, G.F.

    2006-12-01

    A new methodology for seismic risk analysis based on probabilistic interpretation of deterministic or scenario-based hazard analysis, in full compliance with the likelihood principle and therefore meeting the requirements of modern risk analysis, has been developed. The proposed methodology can easily be adjusted to deliver its output in a format required for safety analysts and civil engineers. The scenario-based approach allows the incorporation of all available information collected in a geological, seismotectonic and geotechnical database of the site of interest as well as advanced physical modelling techniques to provide a reliable and robust deterministic design basis for civil infrastructures. The robustness of this approach is of special importance for critical infrastructures. At the same time a scenario-based seismic hazard analysis allows the development of the required input for probabilistic risk assessment (PRA) as required by safety analysts and insurance companies. The scenario-based approach removes the ambiguity in the results of probabilistic seismic hazard analysis (PSHA) which relies on the projections of Gutenberg-Richter (G-R) equation. The problems in the validity of G-R projections, because of incomplete to total absence of data for making the projections, are still unresolved. Consequently, the information from G-R must not be used in decisions for design of critical structures or critical elements in a structure. The scenario-based methodology is strictly based on observable facts and data and complemented by physical modelling techniques, which can be submitted to a formalised validation process. By means of sensitivity analysis, knowledge gaps related to lack of data can be dealt with easily, due to the limited amount of scenarios to be investigated. The proposed seismic risk analysis can be used with confidence for planning, insurance and engineering applications. (author)

  3. Probabilistic evaluation of uncertainties and risks in aerospace components

    Science.gov (United States)

    Shah, A. R.; Shiao, M. C.; Nagpal, V. K.; Chamis, C. C.

    1992-01-01

    This paper summarizes a methodology developed at NASA Lewis Research Center which computationally simulates the structural, material, and load uncertainties associated with Space Shuttle Main Engine (SSME) components. The methodology was applied to evaluate the scatter in static, buckling, dynamic, fatigue, and damage behavior of the SSME turbo pump blade. Also calculated are the probability densities of typical critical blade responses, such as effective stress, natural frequency, damage initiation, most probable damage path, etc. Risk assessments were performed for different failure modes, and the effect of material degradation on the fatigue and damage behaviors of a blade were calculated using a multi-factor interaction equation. Failure probabilities for different fatigue cycles were computed and the uncertainties associated with damage initiation and damage propagation due to different load cycle were quantified. Evaluations on the effects of mistuned blades on a rotor were made; uncertainties in the excitation frequency were found to significantly amplify the blade responses of a mistuned rotor. The effects of the number of blades on a rotor were studied. The autocorrelation function of displacements and the probability density function of the first passage time for deterministic and random barriers for structures subjected to random processes also were computed. A brief discussion was included on the future direction of probabilistic structural analysis.

  4. Assessment and presentation of uncertainties in probabilistic risk assessment: how should this be done

    International Nuclear Information System (INIS)

    Garlick, A.R.; Holloway, N.J.

    1987-01-01

    Despite continuing improvements in probabilistic risk assessment (PRA) techniques, PRA results, particularly those including degraded core analysis, will have maximum uncertainties of several orders of magnitude. This makes the expression of results a matter no less important than their estimation. We put forward some ideas on the assessment and expression of highly uncertain quantities, such as probabilities of outcomes of a severe accident. These do not form a consistent set, but rather a number of alternative approaches aimed at stimulating discussion. They include non-probability expressions, such as fuzzy logic or Shafer's support and plausibility, which abandon the purely probabilistic expression of risk for a more flexible type of expression in which other types of measure are possible. The 'risk equivalent plant' concept represents the opposite approach: since uncertainty in a risk measure is in itself a form of risk, an attempt is made to define a 'risk equivalent', a risk with perfectly defined parameters regarded (by means of suitable methods of judgement) as 'equally undesirable' as the actual plant. Some guidelines are given on the use of Bayesian methods in data-free or limited-data situations. (author)

  5. An analysis of combined standard uncertainty for radiochemical measurements of environmental samples

    International Nuclear Information System (INIS)

    Berne, A.

    1996-01-01

    It is anticipated that future data acquisitions intended for use in radiological risk assessments will require the incorporation of uncertainty analysis. Often, only one aliquot of the sample is taken and a single determination is made. Under these circumstances, the total uncertainty is calculated using the 'propagation of errors' approach. However, there is no agreement in the radioanalytical community as to the exact equations to use. The Quality Assurance/Metrology Division of the Environmental Measurements Laboratory has developed a systematic process to compute uncertainties in constituent components of the analytical procedure, as well as the combined standard uncertainty (CSU). The equations for computation are presented here, with examples of their use. They have also been incorporated into a code for use in the spreadsheet application QuattroPro™. Using the spreadsheet with appropriate inputs permits an analysis of the variations in the CSU as a function of several different variables. The relative importance of the 'counting uncertainty' can also be ascertained.
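
    As a hedged sketch (not the Laboratory's actual equations), independent standard-uncertainty components can be combined in quadrature, the usual 'propagation of errors' rule; the component names and values below are hypothetical:

```python
import math

def combined_standard_uncertainty(components):
    """Combine independent standard-uncertainty components in quadrature
    (root-sum-of-squares), the usual propagation-of-errors rule for a
    product/quotient-type measurement model with relative uncertainties."""
    return math.sqrt(sum(u ** 2 for u in components))

# Hypothetical relative uncertainties for a single-aliquot determination:
# counting statistics, tracer yield, aliquot mass, detector efficiency.
components = [0.05, 0.02, 0.01, 0.03]
csu = combined_standard_uncertainty(components)
print(f"combined relative standard uncertainty: {csu:.4f}")  # ≈ 0.0624
```

    Note how the largest component (counting statistics here) dominates the result, which is exactly the kind of 'relative importance' comparison the spreadsheet analysis described above enables.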

  6. A risk assessment methodology for incorporating uncertainties using fuzzy concepts

    International Nuclear Information System (INIS)

    Cho, Hyo-Nam; Choi, Hyun-Ho; Kim, Yoon-Bae

    2002-01-01

    This paper proposes a new methodology for incorporating uncertainties into conventional risk assessment frameworks using fuzzy concepts. It also introduces new forms of fuzzy membership curves, designed to represent the uncertainty range reflecting the degree of uncertainty involved in both probabilistic parameter estimates and subjective judgments, since it is often difficult or even impossible to estimate the occurrence rate of an event precisely as one single crisp probability. It should be noted that simple linguistic variables such as 'High/Low' and 'Good/Bad' are limited in quantifying the various risks inherent in construction projects, and adequately represent only subjective mental cognition. Therefore, this paper uses statements that include some quantification by giving a specific value or scale, such as 'Close to any value' or 'Higher/Lower than analyzed value', to overcome these limitations. The proposed methodology should prove very useful for the systematic and rational risk assessment of construction projects.
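
    A minimal illustration of the idea, with an assumed triangular membership shape (the paper's membership curves differ in detail): a statement such as 'Close to any value' can be encoded as a membership function whose width reflects the uncertainty range:

```python
def close_to(value, spread):
    """Membership function for the fuzzy statement 'close to `value`',
    with `spread` setting the half-width of the uncertainty range.
    A triangular shape is assumed here purely for illustration."""
    def mu(x):
        return max(0.0, 1.0 - abs(x - value) / spread)
    return mu

mu = close_to(0.01, 0.005)  # 'occurrence rate close to 0.01'
print(mu(0.01))    # 1.0 (full membership at the stated value)
print(mu(0.0125))  # ≈ 0.5 (halfway across the uncertainty range)
print(mu(0.02))    # 0.0 (outside the range)
```

    A wider `spread` expresses greater uncertainty in the estimate, which is the quantitative nuance that plain 'High/Low' labels cannot carry.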

  7. Evidence-based quantification of uncertainties induced via simulation-based modeling

    International Nuclear Information System (INIS)

    Riley, Matthew E.

    2015-01-01

    The quantification of uncertainties in simulation-based modeling traditionally focuses upon quantifying uncertainties in the parameters input into the model, referred to as parametric uncertainties. Often neglected in such an approach are the uncertainties induced by the modeling process itself. This deficiency is often due to a lack of information regarding the problem or the models considered, which could theoretically be reduced through the introduction of additional data. Because of the nature of this epistemic uncertainty, traditional probabilistic frameworks utilized for the quantification of uncertainties are not necessarily applicable to quantify the uncertainties induced in the modeling process itself. This work develops and utilizes a methodology – incorporating aspects of Dempster–Shafer Theory and Bayesian model averaging – to quantify uncertainties of all forms for simulation-based modeling problems. The approach expands upon classical parametric uncertainty approaches, allowing for the quantification of modeling-induced uncertainties as well, ultimately providing bounds on classical probability without the loss of epistemic generality. The approach is demonstrated on two different simulation-based modeling problems: the computation of the natural frequency of a simple two degree of freedom non-linear spring mass system and the calculation of the flutter velocity coefficient for the AGARD 445.6 wing given a subset of commercially available modeling choices. - Highlights: • Modeling-induced uncertainties are often mishandled or ignored in the literature. • Modeling-induced uncertainties are epistemic in nature. • Probabilistic representations of modeling-induced uncertainties are restrictive. • Evidence theory and Bayesian model averaging are integrated. • Developed approach is applicable for simulation-based modeling problems

  8. Consideration of uncertainties in CCDF risk curves in safety oriented decision making processes

    International Nuclear Information System (INIS)

    Stern, E.; Tadmor, J.

    1988-01-01

    In recent years, some of the results of Probabilistic Risk Assessment (i.e. the magnitudes of the various adverse health effects and other effects of potential accidents in nuclear power plants) have usually been presented as Complementary Cumulative Distribution Function curves, widely known as CCDF risk curves. CCDF curves are characteristic of probabilistic accident analyses and consequence calculations, although, in many cases, the codes producing the CCDF curves consist of a mixture of both probabilistic and deterministic calculations. One of the main difficulties in the process of PRA is the problem of uncertainties associated with the risk assessments. The uncertainties, as expressed in CCDF risk curves, can be classified into two main categories: (a) uncertainties expressed by the CCDF risk curve itself, due to its probabilistic nature, and (b) the uncertainty band of CCDF risk curves. The band consists of a ''family of CCDF curves'' which represents the risks (e.g. early/late fatalities) evaluated at various levels of confidence for a specific Plant-Site Combination (PSC), i.e. a certain nuclear power plant located at a certain site. The reasons why a family of curves rather than a single curve represents the risk of a certain PSC are discussed. Generally, the uncertainty band of CCDF curves is limited by the 95% (''conservative'') and the 5% curves. In most cases the 50% (median, ''best estimate'') curve is also shown, because scientists tend to believe that it represents the ''realistic'' (or real) risk of the plant.
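
    A single CCDF curve can be sketched as follows, assuming hypothetical lognormal consequence samples standing in for the output of a Monte Carlo consequence code; repeating the calculation with sampled model parameters would yield the 5%/50%/95% family of curves described above:

```python
import numpy as np

# Hypothetical consequence samples (e.g. health effects per accident),
# standing in for the output of a probabilistic consequence calculation.
rng = np.random.default_rng(42)
consequences = rng.lognormal(mean=0.0, sigma=1.5, size=100_000)

def ccdf(samples, levels):
    """Empirical complementary cumulative distribution:
    P(consequence >= level) for each magnitude level."""
    return [float(np.mean(samples >= c)) for c in levels]

levels = [0.1, 1.0, 10.0, 100.0]
for c, p in zip(levels, ccdf(consequences, levels)):
    print(f"P(X >= {c:6.1f}) = {p:.4f}")
```

    The curve is monotonically decreasing by construction: larger consequence magnitudes are exceeded with smaller probability.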

  9. Water supply infrastructure planning under multiple uncertainties: A differentiated approach

    Science.gov (United States)

    Fletcher, S.; Strzepek, K.

    2017-12-01

    Many water planners face increased pressure on water supply systems from increasing demands from population and economic growth in combination with uncertain water supply. Supply uncertainty arises from short-term climate variability and long-term climate change as well as uncertainty in groundwater availability. Social and economic uncertainties - such as sectoral competition for water, food and energy security, urbanization, and environmental protection - compound physical uncertainty. Further, the varying risk aversion of stakeholders and water managers makes it difficult to assess the necessity of expensive infrastructure investments to reduce risk. We categorize these uncertainties on two dimensions: whether they can be updated over time by collecting additional information, and whether the uncertainties can be described probabilistically or are "deep" uncertainties whose likelihood is unknown. Based on this, we apply a decision framework that combines simulation for probabilistic uncertainty, scenario analysis for deep uncertainty, and multi-stage decision analysis for uncertainties that are reduced over time with additional information. In light of these uncertainties and the investment costs of large infrastructure, we propose the assessment of staged, modular infrastructure and information updating as a hedge against risk. We apply this framework to cases in Melbourne, Australia and Riyadh, Saudi Arabia. Melbourne is a surface water system facing uncertain population growth and variable rainfall and runoff. A severe drought from 1997 to 2009 prompted investment in a 150 MCM/y reverse osmosis desalination plant with a capital cost of 3.5 billion. Our analysis shows that flexible design, in which a smaller portion of capacity is developed initially with the option to add modular capacity in the future, can mitigate uncertainty and reduce the expected lifetime costs by up to 1 billion. In Riyadh, urban water use relies on fossil groundwater aquifers and

  10. Uncertainty Evaluation of the SFR Subchannel Thermal-Hydraulic Modeling Using a Hot Channel Factors Analysis

    International Nuclear Information System (INIS)

    Choi, Sun Rock; Cho, Chung Ho; Kim, Sang Ji

    2011-01-01

    In an SFR core analysis, a hot channel factors (HCF) method is most commonly used to evaluate uncertainty. It was employed in early designs such as the CRBRP and IFR. Alternatively, the improved thermal design procedure (ITDP) calculates the overall uncertainty based on the root-sum-square technique and sensitivity analyses of each design parameter. The Monte Carlo method (MCM) is also employed to estimate the uncertainties. In this method, all the input uncertainties are randomly sampled according to their probability density functions and the resulting distribution for the output quantity is analyzed. Since an uncertainty analysis is basically calculated from the temperature distribution in a subassembly, the core thermal-hydraulic modeling greatly affects the resulting uncertainty. At KAERI, the SLTHEN and MATRA-LMR codes have been utilized to analyze the SFR core thermal-hydraulics. The SLTHEN (steady-state LMR core thermal hydraulics analysis code based on the ENERGY model) code is a modified version of the SUPERENERGY2 code, which conducts a multi-assembly, steady-state calculation based on a simplified ENERGY model. The detailed subchannel analysis code MATRA-LMR (Multichannel Analyzer for Steady-State and Transients in Rod Arrays for Liquid Metal Reactors), an LMR version of MATRA, was also developed specifically for SFR core thermal-hydraulic analysis. This paper describes comparative studies of core thermal-hydraulic models. A subchannel analysis and hot channel factor based uncertainty evaluation system is established to estimate the core thermofluidic uncertainties using the MATRA-LMR code, and the results are compared to those of the SLTHEN code.
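
    A hedged sketch of the hot channel factors idea, with invented subfactor names and values and a root-sum-square combination of the independent statistical subfactors (the actual HCF sets used for the CRBRP, IFR or KAERI designs differ):

```python
import math

# Hypothetical hot-channel subfactors, expressed as fractional (1-sigma)
# uncertainties on the coolant temperature rise. Values are illustrative.
statistical = {
    "inlet flow maldistribution": 0.03,
    "subchannel flow area": 0.02,
    "power measurement": 0.04,
}
direct = {"physics modelling bias": 1.02}  # multiplicative, applied directly

nominal_dT = 150.0  # K, assumed nominal coolant temperature rise

# Independent statistical subfactors combine by root sum square;
# direct (bias-like) subfactors multiply the result.
f_stat = 1.0 + math.sqrt(sum(f ** 2 for f in statistical.values()))
f_direct = math.prod(direct.values())
hot_dT = nominal_dT * f_direct * f_stat
print(f"hot-channel temperature rise: {hot_dT:.1f} K")
```

    The RSS combination is what distinguishes a statistical HCF treatment (and the ITDP) from simply multiplying all subfactors together, which would be needlessly conservative for independent uncertainties.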

  11. Comparing the treatment of uncertainty in Bayesian networks and fuzzy expert systems used for a human reliability analysis application

    International Nuclear Information System (INIS)

    Baraldi, Piero; Podofillini, Luca; Mkrtchyan, Lusine; Zio, Enrico; Dang, Vinh N.

    2015-01-01

    The use of expert systems can be helpful to improve the transparency and repeatability of assessments in areas of risk analysis with limited data available. In this field, human reliability analysis (HRA) is no exception, and, in particular, dependence analysis is an HRA task strongly based on analyst judgement. The analysis of dependence among Human Failure Events refers to the assessment of the effect of an earlier human failure on the probability of the subsequent ones. This paper analyses and compares two expert systems, based on Bayesian Belief Networks and Fuzzy Logic (a Fuzzy Expert System, FES), respectively. The comparison shows that a BBN approach should be preferred in all the cases characterized by quantifiable uncertainty in the input (i.e. when probability distributions can be assigned to describe the input parameters uncertainty), since it provides a satisfactory representation of the uncertainty and its output is directly interpretable for use within PSA. On the other hand, in cases characterized by very limited knowledge, an analyst may feel constrained by the probabilistic framework, which requires assigning probability distributions for describing uncertainty. In these cases, the FES seems to lead to a more transparent representation of the input and output uncertainty. - Highlights: • We analyse treatment of uncertainty in two expert systems. • We compare a Bayesian Belief Network (BBN) and a Fuzzy Expert System (FES). • We focus on the input assessment, inference engines and output assessment. • We focus on an application problem of interest for human reliability analysis. • We emphasize the application rather than math to reach non-BBN or FES specialists

  12. Format to communicate risk and uncertainty about the disposal of radioactive waste to different stakeholders; questionnaire and analysis of the results of the questionnaire. Deliverable D8

    International Nuclear Information System (INIS)

    Bolado, R.

    2009-10-01

    This report summarises the activities performed at JRC-IE to develop a format to communicate key ideas about uncertainty and risk associated with a SNF/HLW repository. After a period of literature research in different areas related to the safety assessment of nuclear waste repositories, the following subjects were selected as the key ideas to communicate to different stakeholders: the concept of risk; what a repository is and how it works; the uncertainties involved, including their origin, classification and treatment; key numeric and graphic results of a safety assessment; and comparison with other risks. The format chosen is a verbal presentation supported by a PowerPoint file containing graphic material. This is a very flexible format that allows a lot of interaction with the audience. The format has been tested in two in-house debate sessions. The analysis of answers given by participants to a questionnaire and the notes taken during the debate held after the presentation will be used to update the format. Participants in the debate sessions were quite positive about the sections that tackled the concept of risk, the way a repository works and the comparison of safety limits used in European national regulations with the worldwide average radiation level, but they showed more criticism of the way some key results from a safety assessment were communicated. They found it especially difficult to understand some graphic results obtained via sensitivity analysis, and provided suggestions to improve some graphic representations of uncertainty.
They also advised to reduce as much as possible the use

  13. Sources of uncertainty in characterizing health risks from flare emissions

    International Nuclear Information System (INIS)

    Hrudey, S.E.

    2000-01-01

    The assessment of health risks associated with gas flaring was the focus of this paper. Health risk assessment for environmental decision-making includes the evaluation of scientific data to identify hazards and to determine dose-response assessments, exposure assessments and risk characterization. Gas flaring has been a cause of public health concern in recent years, most notably since 1996, following a published report by the Alberta Research Council. Some of the major sources of uncertainty associated with identifying hazardous contaminants in flare emissions were discussed. Methods to predict human exposures to emitted contaminants were examined, along with risk characterization of predicted exposures to several identified contaminants. Fundamental uncertainties regarding flare emissions limit the degree of reassurance that risk assessment can provide, but risk assessment can nevertheless offer some guidance to those responsible for flare emissions.

  14. Regional Frequency and Uncertainty Analysis of Extreme Precipitation in Bangladesh

    Science.gov (United States)

    Mortuza, M. R.; Demissie, Y.; Li, H. Y.

    2014-12-01

    The increased frequency of extreme precipitation events, especially those with multiday durations, is responsible for recent urban floods and the associated significant losses of lives and infrastructure in Bangladesh. Reliable and routinely updated estimates of the frequency of occurrence of such extreme precipitation events are thus important for developing up-to-date hydraulic structures and stormwater drainage systems that can effectively minimize future risk from similar events. In this study, we have updated the intensity-duration-frequency (IDF) curves for Bangladesh using daily precipitation data from 1961 to 2010 and quantified the associated uncertainties. Regional frequency analysis based on L-moments is applied to the 1-day, 2-day and 5-day annual maximum precipitation series due to its advantages over at-site estimation. The regional frequency approach pools the information from climatologically similar sites to make reliable estimates of quantiles, given that the pooling group is homogeneous and of reasonable size. We have used the region of influence (ROI) approach along with a homogeneity measure based on L-moments to identify the homogeneous pooling groups for each site. Five 3-parameter distributions (i.e., Generalized Logistic, Generalized Extreme Value, Generalized Normal, Pearson Type Three, and Generalized Pareto) are used for a thorough selection of appropriate models that fit the sample data. Uncertainties related to the selection of the distributions and the historical data are quantified using the Bayesian Model Averaging and balanced bootstrap approaches, respectively. The results from this study can be used to update the current design and management of hydraulic structures as well as to explore spatio-temporal variations of extreme precipitation and associated risk.
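
    A sketch of the first two sample L-moments via unbiased probability-weighted moments, the quantities underlying the homogeneity measures and distribution fitting described above; the annual-maximum series below is invented for illustration:

```python
import numpy as np

def sample_l_moments(x):
    """First two sample L-moments from unbiased probability-weighted
    moments b0 and b1. l1 is the L-location (equal to the mean) and l2
    the L-scale; t2 = l2/l1 is the L-CV used in homogeneity testing."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    j = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((j - 1) / (n - 1) * x) / n
    l1 = b0
    l2 = 2 * b1 - b0
    return l1, l2

# Hypothetical 1-day annual maximum precipitation series (mm)
amax = [95, 120, 88, 150, 132, 101, 175, 140, 99, 160]
l1, l2 = sample_l_moments(amax)
print(f"L-mean = {l1:.1f} mm, L-scale = {l2:.2f} mm, L-CV = {l2 / l1:.3f}")
```

    In a regional analysis, L-moment ratios like the L-CV are computed per site and compared across the pooling group to judge homogeneity before fitting the candidate distributions.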

  15. Estimate of the uncertainties in the relative risk of secondary malignant neoplasms following proton therapy and intensity-modulated photon therapy

    International Nuclear Information System (INIS)

    Fontenot, Jonas D; Bloch, Charles; Followill, David; Titt, Uwe; Newhauser, Wayne D

    2010-01-01

    Theoretical calculations have shown that proton therapy can reduce the incidence of radiation-induced secondary malignant neoplasms (SMN) compared with photon therapy for patients with prostate cancer. However, the uncertainties associated with calculations of SMN risk had not been assessed. The objective of this study was to quantify the uncertainties in projected risks of secondary cancer following contemporary proton and photon radiotherapies for prostate cancer. We performed a rigorous propagation of errors and several sensitivity tests to estimate the uncertainty in the ratio of relative risk (RRR) due to the largest contributors to the uncertainty: the radiation weighting factor for neutrons, the dose-response model for radiation carcinogenesis and interpatient variations in absorbed dose. The interval of values for the radiation weighting factor for neutrons and the dose-response model were derived from the literature, while interpatient variations in absorbed dose were taken from actual patient data. The influence of each parameter on a baseline RRR value was quantified. Our analysis revealed that the calculated RRR was insensitive to the largest contributors to the uncertainty. Uncertainties in the radiation weighting factor for neutrons, the shape of the dose-risk model and interpatient variations in therapeutic and stray doses introduced a total uncertainty of 33% to the baseline RRR calculation.

  16. Environment and Human Health: The Challenge of Uncertainty in Risk Assessment

    Directory of Open Access Journals (Sweden)

    Alex G. Stewart

    2018-01-01

    High quality and accurate environmental investigations and analysis are essential to any assessment of contamination and to the decision-making process thereafter. Remediation decisions may be focused by health outcomes, whether already present or a predicted risk. The variability inherent in environmental media and analysis can be quantified statistically; uncertainty in models can be reduced by additional research; deep uncertainty exists when environmental or biomedical processes are not understood, or agreed upon, or remain uncharacterized. Deep uncertainty is common where health and environment interact. Determinants of health operate from the individual’s genes to the international level; often several levels act synergistically. We show this in detail for lead (Pb). Pathways, exposure, dose and response also vary, modifying certainty. Multi-disciplinary approaches, built on high-quality environmental investigations, enable the management of complex and uncertain situations. High quality, accurate environmental investigations into pollution issues remain the cornerstone of understanding attributable health outcomes and developing appropriate responses and remediation. However, they are not sufficient on their own, needing careful integration with the wider contexts and stakeholder agendas, without which any response to the environmental assessment may very well founder. Such approaches may benefit more people than any other strategy.

  17. Risk analysis for earth dam overtopping

    Directory of Open Access Journals (Sweden)

    Mo Chongxun

    2008-06-01

    In this paper, a model of overtopping risk under the joint effects of floods and wind waves, based on risk analysis theory and taking into account the uncertainties of floods, wind waves, reservoir capacity and the discharge capacity of the spillway, is proposed and applied to the Chengbihe Reservoir in Baise City in Guangxi Zhuang Autonomous Region. The simulated results indicate that the flood control limiting level can be raised by 0.40 m under the condition that the reservoir overtopping risk is controlled within a mean variance of 5×10⁻⁶. As a result, the reservoir storage will increase to 16 million m³, and electrical energy generation and other functions of the reservoir will also increase greatly.
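
    The overtopping risk computation can be sketched as a Monte Carlo estimate, with invented distributions standing in for the flood, wind-wave and capacity uncertainties of the actual model (all parameter values below are illustrative, not the Chengbihe data):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 1_000_000

crest = 9.0  # assumed dam crest elevation, m

# Hypothetical uncertainty distributions:
# flood-routed maximum pool level (Gumbel) and wind-wave run-up (lognormal).
flood_level = rng.gumbel(loc=6.0, scale=0.6, size=n)
wave_runup = rng.lognormal(mean=-0.5, sigma=0.5, size=n)

# Overtopping occurs when the flood level plus wave run-up exceeds the crest.
p_overtop = float(np.mean(flood_level + wave_runup > crest))
print(f"estimated overtopping risk: {p_overtop:.2e}")
```

    Raising the flood control limiting level shifts the pool-level distribution upward, so the acceptable raise is found by repeating this estimate until the risk reaches the prescribed threshold.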

  18. Integrated uncertainty analysis using RELAP/SCDAPSIM/MOD4.0

    International Nuclear Information System (INIS)

    Perez, M.; Reventos, F.; Wagner, R.; Allison, C.

    2009-01-01

    The RELAP/SCDAPSIM/MOD4.0 code, designed to predict the behavior of reactor systems during normal and accident conditions, is being developed as part of an international nuclear technology Software Development and Training Program (SDTP). RELAP/SCDAPSIM/MOD4.0, which is the first version of RELAP5 completely rewritten to FORTRAN 90/95/2000 standards, uses the publicly available RELAP5 and SCDAP models in combination with (a) advanced programming and numerical techniques, (b) advanced SDTP-member-developed models for LWR, HWR, and research reactor analysis, and (c) a variety of other member-developed computational packages. One such computational package is an integrated uncertainty analysis package being developed jointly by the Technical University of Catalunya (UPC) and Innovative Systems Software (ISS). The integrated uncertainty analysis approach used in the package uses the following steps: 1. Selection of the plant; 2. Selection of the scenario; 3. Selection of the safety criteria; 4. Identification and ranking of the relevant phenomena based on the safety criteria; 5. Selection of the appropriate code parameters to represent those phenomena; 6. Association of uncertainty by means of Probability Distribution Functions (PDFs) for each selected parameter; 7. Random sampling of the selected parameters according to their PDFs and performing multiple computer runs to obtain uncertainty bands with a certain percentile and confidence level; 8. Processing the results of the multiple computer runs to estimate the uncertainty bands for the computed quantities associated with the selected safety criteria. The first four steps are performed by the user prior to the RELAP/SCDAPSIM/MOD4.0 analysis. The remaining steps are included with the MOD4.0 integrated uncertainty analysis (IUA) package. 
This paper briefly describes the integrated uncertainty analysis package including (a) the features of the package, (b) the implementation of the package into RELAP/SCDAPSIM/MOD4.0, and
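
    The abstract does not state how the number of runs in step 7 is chosen; a common choice for obtaining a one-sided tolerance limit with a given percentile and confidence level in this kind of analysis is Wilks' formula, sketched here:

```python
import math

def wilks_runs(p, beta):
    """Minimum number of random code runs n such that the largest
    computed value bounds the p-quantile of the output with confidence
    beta (one-sided, first-order): smallest n with 1 - p**n >= beta."""
    return math.ceil(math.log(1.0 - beta) / math.log(p))

print(wilks_runs(0.95, 0.95))  # 59 runs for a 95%/95% one-sided limit
print(wilks_runs(0.99, 0.95))  # 299 runs for a 99%/95% one-sided limit
```

    The attraction of this order-statistics result is that the required number of runs is independent of how many uncertain parameters are sampled in step 7.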

  19. Environmental impact and risk assessments and key factors contributing to the overall uncertainties

    International Nuclear Information System (INIS)

    Salbu, Brit

    2016-01-01

    There is a significant number of nuclear and radiological sources that have contributed, are still contributing, or have the potential to contribute to radioactive contamination of the environment in the future. To protect the environment from radioactive contamination, impact and risk assessments are performed prior to or during a release event, short or long term after deposition, or prior to and after the implementation of countermeasures. When environmental impact and risks are assessed, however, a series of factors contribute to the overall uncertainties. To provide environmental impact and risk assessments, information on processes, kinetics and a series of input variables is needed. Adding problems such as variability, questionable assumptions, gaps in knowledge, extrapolations and poor conceptual model structures, a series of factors contribute large and often unacceptable uncertainties to impact and risk assessments. Information on the source term and the release scenario is an essential starting point in impact and risk models; the source determines the activity concentrations and atom ratios of the radionuclides released, while the release scenario determines the physico-chemical forms of the released radionuclides, such as particle size distribution, structure and density. Releases will most often contain other contaminants such as metals, and, due to interactions, contaminated sites should be assessed as a multiple stressor scenario. Following deposition, a series of stressors, interactions and processes will influence the ecosystem transfer of radionuclide species and thereby influence biological uptake (toxicokinetics) and responses (toxicodynamics) in exposed organisms. Due to the variety of biological species, extrapolation is frequently needed to fill gaps in knowledge, e.g. from effects to no effects, from effects in one organism to others, or from one stressor to mixtures. Most toxicity tests are, however, performed as short-term exposures of adult organisms

  20. Seismic risk analysis for the Westinghouse Electric facility, Cheswick, Pennsylvania

    International Nuclear Information System (INIS)

    1977-01-01

    This report presents the results of a detailed seismic risk analysis of the Westinghouse Electric plutonium fuel development facility at Cheswick, Pennsylvania. This report focuses on earthquakes. The historical seismic record was established after a review of available literature, consultation with operators of local seismic arrays and examination of appropriate seismic databases. Because of the aseismicity of the region around the site, an analysis different from the conventional closest approach in a tectonic province was adopted. Earthquakes as far from the site as 1,000 km were included, as was the possibility of earthquakes at the site. In addition, various uncertainties in the input were explicitly considered in the analysis. For example, allowance was made for both the uncertainty in predicting maximum possible earthquakes in the region and the effect of the dispersion of data about the best-fit attenuation relation. The attenuation relationship is derived from two of the most recent, advanced studies relating earthquake intensity reports and acceleration. Results of the risk analysis, which include a Bayesian estimate of the uncertainties, are presented as return period accelerations. The best estimate curve indicates that the Westinghouse facility will experience 0.05 g every 220 years and 0.10 g every 1,400 years. The accelerations are very insensitive to the details of the source region geometries or the historical earthquake statistics in each region, and each of the source regions contributes almost equally to the cumulative risk at the site

  1. Risk classification and uncertainty propagation for virtual water distribution systems

    International Nuclear Information System (INIS)

    Torres, Jacob M.; Brumbelow, Kelly; Guikema, Seth D.

    2009-01-01

    While the secrecy of real water distribution system data is crucial, it poses difficulty for research as results cannot be publicized. This data includes topological layouts of pipe networks, pump operation schedules, and water demands. Therefore, a library of virtual water distribution systems can be an important research tool for comparative development of analytical methods. A virtual city, 'Micropolis', has been developed, including a comprehensive water distribution system, as a first entry into such a library. This virtual city of 5000 residents is fully described in both geographic information systems (GIS) and EPANet hydraulic model frameworks. A risk classification scheme and Monte Carlo analysis are employed for an attempted water supply contamination attack. Model inputs to be considered include uncertainties in: daily water demand, seasonal demand, initial storage tank levels, the time of day a contamination event is initiated, duration of contamination event, and contaminant quantity. Findings show that reasonable uncertainties in model inputs produce high variability in exposure levels. It is also shown that exposure level distributions experience noticeable sensitivities to population clusters within the contaminant spread area. High uncertainties in exposure patterns lead to greater resources needed for more effective mitigation strategies.

  2. Sensitivity of Asteroid Impact Risk to Uncertainty in Asteroid Properties and Entry Parameters

    Science.gov (United States)

    Wheeler, Lorien; Mathias, Donovan; Dotson, Jessie L.; NASA Asteroid Threat Assessment Project

    2017-10-01

    A central challenge in assessing the threat posed by asteroids striking Earth is the large amount of uncertainty inherent throughout all aspects of the problem. Many asteroid properties are not well characterized and can range widely from strong, dense, monolithic irons to loosely bound, highly porous rubble piles. Even for an object of known properties, the specific entry velocity, angle, and impact location can swing the potential consequence from no damage to causing millions of casualties. Due to the extreme rarity of large asteroid strikes, there are also large uncertainties in how different types of asteroids will interact with the atmosphere during entry, how readily they may break up or ablate, and how much surface damage will be caused by the resulting airbursts or impacts. In this work, we use our Probabilistic Asteroid Impact Risk (PAIR) model to investigate the sensitivity of asteroid impact damage to uncertainties in key asteroid properties, entry parameters, or modeling assumptions. The PAIR model combines physics-based analytic models of asteroid entry and damage in a probabilistic Monte Carlo framework to assess the risk posed by a wide range of potential impacts. The model samples from uncertainty distributions of asteroid properties and entry parameters to generate millions of specific impact cases, and models the atmospheric entry and damage for each case, including blast overpressure, thermal radiation, tsunami inundation, and global effects. To assess the risk sensitivity, we alternately fix and vary the different input parameters and compare the effect on the resulting range of damage produced. The goal of these studies is to help guide future efforts in asteroid characterization and model refinement by determining which properties most significantly affect the potential risk.
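
    The fix-and-vary procedure can be sketched with a toy damage function standing in for the PAIR entry and damage models (all names, distributions and values below are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

def damage(density, velocity, angle):
    """Toy stand-in for an entry/damage model (illustrative only)."""
    return density * velocity ** 2 * np.sin(angle)

def sample(fixed=None):
    """Sample the output with all inputs varied, or with one input
    fixed at a nominal value to measure its contribution to the spread."""
    density = np.full(N, 2.5e3) if fixed == "density" else rng.uniform(1e3, 4e3, N)
    velocity = np.full(N, 17.0) if fixed == "velocity" else rng.uniform(11.0, 25.0, N)
    angle = np.full(N, np.pi / 4) if fixed == "angle" else rng.uniform(0.1, np.pi / 2, N)
    return damage(density, velocity, angle)

baseline = float(np.std(sample()))
for name in ("density", "velocity", "angle"):
    reduction = 1.0 - float(np.std(sample(fixed=name))) / baseline
    print(f"fixing {name:8s} removes {reduction:5.1%} of output spread")
```

    Inputs whose fixing removes the most spread are the ones whose better characterization would most reduce the risk uncertainty, which is the prioritization question the study addresses.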

  3. Use of Quantitative Uncertainty Analysis to Support M&V Decisions in ESPCs

    Energy Technology Data Exchange (ETDEWEB)

    Mathew, Paul A.; Koehling, Erick; Kumar, Satish

    2005-05-11

    Measurement and Verification (M&V) is a critical element of an Energy Savings Performance Contract (ESPC) - without M&V, there is no way to confirm that the projected savings in an ESPC are in fact being realized. For any given energy conservation measure in an ESPC, there are usually several M&V choices, which will vary in terms of measurement uncertainty, cost, and technical feasibility. Typically, M&V decisions are made almost solely based on engineering judgment and experience, with little, if any, quantitative uncertainty analysis (QUA). This paper describes the results of a pilot project initiated by the Department of Energy's Federal Energy Management Program to explore the use of Monte-Carlo simulation to assess savings uncertainty and thereby augment the M&V decision-making process in ESPCs. The intent was to use QUA selectively in combination with heuristic knowledge, in order to obtain quantitative estimates of the savings uncertainty without the burden of a comprehensive "bottoms-up" QUA. This approach was used to analyze the savings uncertainty in an ESPC for a large federal agency. The QUA was seamlessly integrated into the ESPC development process and the incremental effort was relatively small with user-friendly tools that are commercially available. As the case study illustrates, in some cases the QUA simply confirms intuitive or qualitative information, while in other cases, it provides insight that suggests revisiting the M&V plan. The case study also showed that M&V decisions should be informed by the portfolio risk diversification. By providing quantitative uncertainty information, QUA can effectively augment the M&V decision-making process as well as the overall ESPC financial analysis.
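A minimal Monte-Carlo savings QUA of the kind described might look like this (the baseline and post-retrofit figures and their uncertainties are hypothetical):

```python
import random
import statistics

def savings_uncertainty(n=10_000, seed=1):
    """Monte Carlo estimate of annual savings (baseline minus post-retrofit
    energy use) when both quantities carry measurement uncertainty. Returns
    the mean savings and a 90% interval (5th/95th percentiles), in MWh."""
    rng = random.Random(seed)
    draws = []
    for _ in range(n):
        baseline = rng.gauss(1200.0, 60.0)  # metered baseline, MWh/yr
        post = rng.gauss(950.0, 40.0)       # post-retrofit use, MWh/yr
        draws.append(baseline - post)
    draws.sort()
    return statistics.mean(draws), (draws[int(0.05 * n)], draws[int(0.95 * n)])

mean_savings, (low, high) = savings_uncertainty()
```

The width of the interval, not just the mean, is what informs the choice among M&V options of differing measurement uncertainty.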

  4. Uncertainty Flow Facilitates Zero-Shot Multi-Label Learning in Affective Facial Analysis

    Directory of Open Access Journals (Sweden)

    Wenjun Bai

    2018-02-01

    Full Text Available Featured Application: The proposed Uncertainty Flow framework may benefit facial analysis with its promised gain in discriminability in multi-label affective classification tasks. Moreover, the framework also allows efficient model training and between-task knowledge transfer. Applications that rely heavily on continuous prediction of emotional valence, e.g., monitoring prisoners’ emotional stability, can benefit directly from our framework. Abstract: To move beyond the single-label dependency of affective facial analysis, multi-label affective learning is needed. The impediment to practical implementation of existing multi-label algorithms is the scarcity of scalable multi-label training datasets. To resolve this, an inductive transfer learning based framework, i.e., Uncertainty Flow, is put forward in this research to allow knowledge transfer from a single-labelled emotion recognition task to a multi-label affective recognition task. That is, the model uncertainty—which can be quantified in Uncertainty Flow—is distilled from a single-label learning task. The distilled model uncertainty enables the later efficient zero-shot multi-label affective learning. From a theoretical perspective, within our proposed Uncertainty Flow framework, the feasibility of applying weakly informative priors, e.g., uniform and Cauchy priors, is fully explored in this research. More importantly, based on the derived weight uncertainty, three sets of prediction-related uncertainty indexes, i.e., soft-max uncertainty, pure uncertainty and uncertainty plus, are proposed to produce reliable and accurate multi-label predictions. Validated on our manually annotated evaluation dataset, i.e., the multi-label annotated FER2013, our proposed Uncertainty Flow in multi-label facial expression analysis exhibited superiority over conventional multi-label learning algorithms and multi-label compatible neural networks. The success of our ...

  5. Using finite mixture models in thermal-hydraulics system code uncertainty analysis

    Energy Technology Data Exchange (ETDEWEB)

    Carlos, S., E-mail: scarlos@iqn.upv.es [Department d’Enginyeria Química i Nuclear, Universitat Politècnica de València, Camí de Vera s.n, 46022 València (Spain); Sánchez, A. [Department d’Estadística Aplicada i Qualitat, Universitat Politècnica de València, Camí de Vera s.n, 46022 València (Spain); Ginestar, D. [Department de Matemàtica Aplicada, Universitat Politècnica de València, Camí de Vera s.n, 46022 València (Spain); Martorell, S. [Department d’Enginyeria Química i Nuclear, Universitat Politècnica de València, Camí de Vera s.n, 46022 València (Spain)

    2013-09-15

    Highlights: • Best estimate code simulations need uncertainty quantification. • The output variables can present multimodal probability distributions. • The analysis of multimodal distributions is performed using finite mixture models. • Two methods to reconstruct the output variable probability distribution are used. -- Abstract: Nuclear Power Plant safety analysis is mainly based on the use of best estimate (BE) codes that predict the plant behavior under normal or accidental conditions. As the BE codes introduce uncertainties due to uncertainty in input parameters and modeling, it is necessary to perform uncertainty assessment (UA), and eventually sensitivity analysis (SA), of the results obtained. These analyses are part of the appropriate treatment of uncertainties imposed by current regulation based on the adoption of the best estimate plus uncertainty (BEPU) approach. The most popular approach for uncertainty assessment, based on Wilks’ method, obtains a tolerance/confidence interval, but it does not completely characterize the output variable behavior, which is required for an extended UA and SA. However, the development of standard UA and SA imposes a high computational cost due to the large number of simulations needed. In order to obtain more information about the output variable and, at the same time, to keep the computational cost as low as possible, there has been a recent shift toward developing metamodels (models of models), or surrogate models, that approximate or emulate complex computer codes. In this way, there exist different techniques to reconstruct the probability distribution using the information provided by a sample of values, for example, finite mixture models. In this paper, the Expectation Maximization and the k-means algorithms are used to obtain a finite mixture model that reconstructs the output variable probability distribution from data obtained with RELAP-5 simulations. Both methodologies have been applied to a separated ...
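The Expectation-Maximization step for a two-component mixture can be sketched in pure Python (a 1-D Gaussian mixture fitted to synthetic bimodal data standing in for a RELAP-5 output sample):

```python
import math
import random

def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def em_two_gaussians(data, iters=100):
    """Fit a two-component 1-D Gaussian mixture by Expectation-Maximization."""
    mu1, mu2 = min(data), max(data)
    s1 = s2 = (max(data) - min(data)) / 4.0
    w = 0.5
    for _ in range(iters):
        # E-step: responsibility of component 1 for each point
        r = []
        for x in data:
            p1 = w * normal_pdf(x, mu1, s1)
            p2 = (1.0 - w) * normal_pdf(x, mu2, s2)
            r.append(p1 / (p1 + p2))
        # M-step: re-estimate weight, means, and standard deviations
        n1 = sum(r)
        n2 = len(data) - n1
        w = n1 / len(data)
        mu1 = sum(ri * x for ri, x in zip(r, data)) / n1
        mu2 = sum((1.0 - ri) * x for ri, x in zip(r, data)) / n2
        s1 = max(math.sqrt(sum(ri * (x - mu1) ** 2 for ri, x in zip(r, data)) / n1), 1e-6)
        s2 = max(math.sqrt(sum((1.0 - ri) * (x - mu2) ** 2 for ri, x in zip(r, data)) / n2), 1e-6)
    return (w, mu1, s1), (1.0 - w, mu2, s2)

# Synthetic bimodal sample standing in for code output (means 0 and 8 by construction)
rng = random.Random(2)
data = [rng.gauss(0.0, 1.0) for _ in range(300)] + [rng.gauss(8.0, 1.0) for _ in range(300)]
comp1, comp2 = em_two_gaussians(data)
```

Each fitted component is a (weight, mean, standard deviation) triple, which together reconstruct the multimodal output distribution.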

  6. A Minimax Regret Analysis of Flood Risk Management Strategies Under Climate Change Uncertainty and Emerging Information

    NARCIS (Netherlands)

    Pol, van der T.D.; Gabbert, S.; Weikard, H.P.; Ierland, van E.C.; Hendrix, E.M.T.

    2017-01-01

    This paper studies the dynamic application of the minimax regret (MR) decision criterion to identify robust flood risk management strategies under climate change uncertainty and emerging information. An MR method is developed that uses multiple learning scenarios, for example about sea level rise ...
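The MR criterion itself is compact; a sketch with hypothetical strategy costs under two sea-level-rise scenarios:

```python
def minimax_regret(costs):
    """costs[strategy] -> list of total costs, one per climate scenario.
    Regret = cost minus the best achievable cost in that scenario; the MR
    criterion picks the strategy with the smallest worst-case regret."""
    n = len(next(iter(costs.values())))
    best = [min(c[j] for c in costs.values()) for j in range(n)]
    worst_regret = {s: max(c[j] - best[j] for j in range(n)) for s, c in costs.items()}
    return min(worst_regret, key=worst_regret.get)

# Hypothetical flood-protection costs under [modest, rapid] sea level rise
costs = {"no_upgrade": [0, 120], "raise_dikes": [30, 45], "storm_barrier": [70, 75]}
choice = minimax_regret(costs)
```

Here "raise_dikes" wins: its worst-case regret (30) beats doing nothing (75) and over-building (70). Emerging information enters by re-running the criterion with an updated scenario set.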

  7. Uncertainty studies and risk assessment for CO₂ storage in geological formations

    Energy Technology Data Exchange (ETDEWEB)

    Walter, Lena Sophie

    2013-07-01

    Carbon capture and storage (CCS) in deep geological formations is one possible option to mitigate the greenhouse gas effect by reducing CO₂ emissions into the atmosphere. The assessment of the risks related to CO₂ storage is an important task. Events such as CO₂ leakage and brine displacement could result in hazards for human health and the environment. In this thesis, a systematic and comprehensive risk assessment concept is presented to investigate various levels of uncertainties and to assess risks using numerical simulations. Depending on the risk and the processes, which should be assessed, very complex models, large model domains, large time scales, and many simulations runs for estimating probabilities are required. To reduce the resulting high computational costs, a model reduction technique (the arbitrary polynomial chaos expansion) and a method for model coupling in space are applied. The different levels of uncertainties are: statistical uncertainty in parameter distributions, scenario uncertainty, e.g. different geological features, and recognized ignorance due to assumptions in the conceptual model set-up. Recognized ignorance and scenario uncertainty are investigated by simulating well defined model set-ups and scenarios. According to damage values, which are defined as a model output, the set-ups and scenarios can be compared and ranked. For statistical uncertainty probabilities can be determined by running Monte Carlo simulations with the reduced model. The results are presented in various ways: e.g., mean damage, probability density function, cumulative distribution function, or an overall risk value by multiplying the damage with the probability. If the model output (damage) cannot be compared to provided criteria (e.g. water quality criteria), analytical approximations are presented to translate the damage into comparable values. The overall concept is applied for the risks related to brine displacement and infiltration into ...

  8. Dealing with uncertainty and pursuing superior technology options in risk management - The inherency risk analysis

    International Nuclear Information System (INIS)

    Helland, Aasgeir

    2009-01-01

    Current regulatory systems focus on the state of scientific evidence as the predominant factor for how to handle risks to human health and the environment. However, production and assessment of risk information are costly and time-consuming, and firms have an intrinsic disincentive to produce and distribute information about risks of their products as this could endanger their production opportunities and sales. An emphasis on more or better science may result in insufficient thought and attention going into the exploration of technology alternatives, and that risk management policies miss out on the possible achievement of a more favorable set of consequences. In this article, a method is proposed that combines risk assessment with the search for alternative technological options as a part of the risk management procedure. The method proposed is the inherency risk analysis where the first stage focuses on the original agent subject to investigation, the second stage focuses on identifying technological options whereas the third stage reviews the different alternatives, searching for the most attractive tradeoffs between costs and inherent safety. This is then used as a fundament for deciding which technology option to pursue. This method aims at providing a solution-focused, systematic technology-based approach for addressing and setting priorities for environmental problems. By combining risk assessment with a structured approach to identify superior technology options within a risk management system, the result could very well be a win-win situation for both company and the environment.

  9. Uncertainty theory

    CERN Document Server

    Liu, Baoding

    2015-01-01

    When no samples are available to estimate a probability distribution, we have to invite some domain experts to evaluate the belief degree that each event will happen. Perhaps some people think that the belief degree should be modeled by subjective probability or fuzzy set theory. However, it is usually inappropriate because both of them may lead to counterintuitive results in this case. In order to rationally deal with belief degrees, uncertainty theory was founded in 2007 and subsequently studied by many researchers. Nowadays, uncertainty theory has become a branch of axiomatic mathematics for modeling belief degrees. This is an introductory textbook on uncertainty theory, uncertain programming, uncertain statistics, uncertain risk analysis, uncertain reliability analysis, uncertain set, uncertain logic, uncertain inference, uncertain process, uncertain calculus, and uncertain differential equation. This textbook also shows applications of uncertainty theory to scheduling, logistics, networks, data mining, c...

  10. Incorporating uncertainties into risk assessment with an application to the exploratory studies facilities at Yucca Mountain

    International Nuclear Information System (INIS)

    Fathauer, P.M.

    1995-08-01

    A methodology that incorporates variability and reducible sources of uncertainty into the probabilistic and consequence components of risk was developed. The method was applied to the north tunnel of the Exploratory Studies Facility at Yucca Mountain in Nevada. In this assessment, variability and reducible sources of uncertainty were characterized and propagated through the risk assessment models using a Monte Carlo based software package. The results were then manipulated into risk curves at the 5% and 95% confidence levels for both the variability and overall uncertainty analyses, thus distinguishing between variability and reducible sources of uncertainty. In the Yucca Mountain application, the designation of the north tunnel as an item important to public safety, as defined by 10 CFR 60, was determined. Specifically, the annual frequency of a rock fall breaching a waste package causing an off-site dose of 500 mrem (5×10⁻³ Sv) was calculated. The annual frequency, taking variability into account, ranged from 1.9×10⁻⁹ per year at the 5% confidence level to 2.5×10⁻⁹ per year at the 95% confidence level. The frequency range after including all uncertainty was 9.5×10⁻¹⁰ to 1.8×10⁻⁸ per year. The maximum observable frequency, at the 100% confidence level, was 4.9×10⁻⁸ per year. This is below the 10⁻⁶ per year frequency criterion of 10 CFR 60. Therefore, based on this work, the north tunnel does not fall under the items important to public safety designation for the event studied.
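The two-loop structure implied above (variability propagated in an inner loop, reducible uncertainty in an outer loop) can be sketched as follows; every distribution and number here is an illustrative assumption, not taken from the Yucca Mountain assessment:

```python
import math
import random

def breach_frequency_band(n_epistemic=200, n_variability=2000, seed=3):
    """Two-loop Monte Carlo: the outer loop samples a poorly known rock-fall
    rate (reducible uncertainty); the inner loop samples event-to-event
    variability in the chance that a fall breaches a waste package. Returns
    the 5th and 95th percentiles of annual breach frequency across the
    outer-loop samples."""
    rng = random.Random(seed)
    freqs = []
    for _ in range(n_epistemic):
        rate_per_yr = rng.lognormvariate(math.log(1e-4), 0.5)  # falls per year
        breaches = sum(rng.random() < rng.betavariate(2.0, 50.0)
                       for _ in range(n_variability))
        freqs.append(rate_per_yr * breaches / n_variability)
    freqs.sort()
    return freqs[int(0.05 * n_epistemic)], freqs[int(0.95 * n_epistemic)]

freq_lo, freq_hi = breach_frequency_band()
```

Keeping the loops separate is what lets the analysis report distinct confidence levels for variability and for overall (reducible plus variable) uncertainty.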

  11. Evidence Theory Based Uncertainty Quantification in Radiological Risk due to Accidental Release of Radioactivity from a Nuclear Power Plant

    International Nuclear Information System (INIS)

    Ingale, S. V.; Datta, D.

    2010-01-01

    Consequence of the accidental release of radioactivity from a nuclear power plant is assessed in terms of exposure or dose to the members of the public. Assessment of risk is routed through this dose computation. Dose computation basically depends on the basic dose assessment model and exposure pathways. One of the exposure pathways is the ingestion of contaminated food. The aim of the present paper is to compute the uncertainty associated with the risk to the members of the public due to the ingestion of contaminated food. The governing parameters of the ingestion dose assessment model being imprecise, we have approached evidence theory to compute the bound of the risk. The uncertainty is addressed by the belief and plausibility fuzzy measures.
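A minimal sketch of the belief/plausibility computation on interval-valued focal elements (the dose intervals and expert masses are hypothetical):

```python
def belief_plausibility(focal, event):
    """Dempster-Shafer measures over interval focal elements.
    focal: list of ((lo, hi), mass) pairs; event: (lo, hi) of interest.
    Belief sums the masses of focal elements fully inside the event;
    plausibility sums the masses of those that intersect it."""
    e_lo, e_hi = event
    bel = sum(m for (lo, hi), m in focal if e_lo <= lo and hi <= e_hi)
    pl = sum(m for (lo, hi), m in focal if hi >= e_lo and lo <= e_hi)
    return bel, pl

# Hypothetical expert masses on annual ingestion dose intervals (mSv)
focal = [((0.0, 0.2), 0.5), ((0.1, 0.4), 0.3), ((0.3, 0.8), 0.2)]
bel, pl = belief_plausibility(focal, (0.0, 0.3))
```

The [belief, plausibility] pair brackets the imprecise probability that the dose falls in the event interval, which is the bound on risk the paper refers to.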

  12. Uncertainty analysis of neural network based flood forecasting models: An ensemble based approach for constructing prediction interval

    Science.gov (United States)

    Kasiviswanathan, K.; Sudheer, K.

    2013-05-01

    Artificial neural network (ANN) based hydrologic models have gained a lot of attention among water resources engineers and scientists, owing to their potential for accurate prediction of flood flows as compared to conceptual or physics based hydrologic models. The ANN approximates the non-linear functional relationship between the complex hydrologic variables in arriving at the river flow forecast values. Despite a large number of applications, there is still some criticism that ANN point predictions lack reliability since the uncertainty of the predictions is not quantified, which limits their use in practical applications. A major concern in applying traditional uncertainty analysis techniques in a neural network framework is its parallel computing architecture with large degrees of freedom, which makes the uncertainty assessment a challenging task. Very limited studies have considered assessment of the predictive uncertainty of ANN based hydrologic models. In this study, a novel method is proposed that helps construct the prediction interval of an ANN flood forecasting model during calibration itself. The method is designed to have two stages of optimization during calibration: at stage 1, the ANN model is trained with a genetic algorithm (GA) to obtain the optimal set of weights and biases, and during stage 2, the optimal variability of the ANN parameters (obtained in stage 1) is identified so as to create an ensemble of predictions. During the 2nd stage, the optimization is performed with multiple objectives: (i) minimum residual variance for the ensemble mean, (ii) maximum number of measured data points falling within the estimated prediction interval and (iii) minimum width of the prediction interval. The method is illustrated using a real world case study of an Indian basin. The method was able to produce an ensemble that has an average prediction interval width of 23.03 m³/s, with 97.17% of the total validation data points (measured) lying within the interval. The derived ...
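The two reported ensemble metrics (average interval width and percentage of observations covered) can be computed as follows, with a toy ensemble in place of the calibrated ANN:

```python
def interval_metrics(ensemble, observed):
    """ensemble: list of member forecast series; observed: measured flows.
    Returns the average prediction-interval width (ensemble min-max band)
    and the percentage of observations falling inside the band."""
    widths, covered = [], 0
    for t, obs in enumerate(observed):
        members = [series[t] for series in ensemble]
        lo, hi = min(members), max(members)
        widths.append(hi - lo)
        covered += lo <= obs <= hi
    return sum(widths) / len(widths), 100.0 * covered / len(observed)

# Toy 3-member ensemble over three time steps (m³/s); values are hypothetical
avg_width, coverage = interval_metrics(
    [[10.0, 20.0, 30.0], [12.0, 24.0, 28.0], [9.0, 22.0, 33.0]],
    [11.0, 23.0, 29.0])
```

These two quantities are exactly the competing objectives of the paper's stage-2 optimization: coverage should be high while width stays small.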

  13. Risk-based cost-benefit analysis for evaluating microbial risk mitigation in a drinking water system.

    Science.gov (United States)

    Bergion, Viktor; Lindhe, Andreas; Sokolova, Ekaterina; Rosén, Lars

    2018-04-01

    Waterborne outbreaks of gastrointestinal diseases can cause large costs to society. Risk management needs to be holistic and transparent in order to reduce these risks in an effective manner. Microbial risk mitigation measures in a drinking water system were investigated using a novel approach combining probabilistic risk assessment and cost-benefit analysis. Lake Vomb in Sweden was used to exemplify and illustrate the risk-based decision model. Four mitigation alternatives were compared, where the first three alternatives, A1-A3, represented connecting 25, 50 and 75%, respectively, of on-site wastewater treatment systems in the catchment to the municipal wastewater treatment plant. The fourth alternative, A4, represented installing a UV-disinfection unit in the drinking water treatment plant. Quantitative microbial risk assessment was used to estimate the positive health effects in terms of quality adjusted life years (QALYs), resulting from the four mitigation alternatives. The health benefits were monetised using a unit cost per QALY. For each mitigation alternative, the net present value of health and environmental benefits and investment, maintenance and running costs was calculated. The results showed that only A4 can reduce the risk (probability of infection) below the World Health Organization guidelines of 10⁻⁴ infections per person per year (looking at the 95th percentile). Furthermore, all alternatives resulted in a negative net present value. However, the net present value would be positive (looking at the 50th percentile using a 1% discount rate) if non-monetised benefits (e.g. increased property value divided evenly over the studied time horizon and reduced microbial risks posed to animals), estimated at 800-1200 SEK (€100-150) per connected on-site wastewater treatment system per year, were included. This risk-based decision model creates a robust and transparent decision support tool. It is flexible enough to be tailored and applied to local ...
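The net-present-value calculation underlying the comparison of alternatives can be sketched as follows (the QALY gain, unit cost, and cost figures are hypothetical, not the Lake Vomb numbers):

```python
def net_present_value(qaly_gain_per_yr, sek_per_qaly, annual_cost_sek,
                      investment_sek, years, discount_rate):
    """NPV of a mitigation alternative: monetised annual health benefit
    (QALYs x unit cost) minus running costs, discounted over the horizon,
    minus the up-front investment."""
    npv = -investment_sek
    for t in range(1, years + 1):
        npv += (qaly_gain_per_yr * sek_per_qaly - annual_cost_sek) / (1.0 + discount_rate) ** t
    return npv

# Hypothetical alternative: 2 QALYs/yr gained, 30-year horizon, 1% discount rate
npv_good = net_present_value(2.0, 500_000, 600_000, 5_000_000, 30, 0.01)
# Halving the health benefit flips the sign of the decision
npv_poor = net_present_value(1.0, 500_000, 600_000, 5_000_000, 30, 0.01)
```

As the abstract illustrates, the discount rate and which benefits are monetised can flip an alternative between positive and negative NPV.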

  14. Approach to uncertainty evaluation for safety analysis

    International Nuclear Information System (INIS)

    Ogura, Katsunori

    2005-01-01

    Nuclear power plant safety has generally been verified and confirmed through accident simulations using computer codes, because it is very difficult to perform integrated experiments or tests to verify and validate plant safety, owing to radioactive consequences, cost, and scaling to the actual plant. Traditionally, plant safety was secured by the substantial safety margin provided by the conservative assumptions and models applied in those simulations. More recently, best-estimate analyses based on realistic assumptions and models, supported by accumulated insights, have become possible, reducing the safety margin in the analysis results and increasing the need to evaluate the reliability or uncertainty of those results. This paper introduces an approach to evaluate the uncertainty of accident simulations and their results. (Note: This research was done not in the Japan Nuclear Energy Safety Organization but in the Tokyo Institute of Technology.) (author)

  15. Low-dose extrapolation of radiation health risks: some implications of uncertainty for radiation protection at low doses.

    Science.gov (United States)

    Land, Charles E

    2009-11-01

    Ionizing radiation is a known and well-quantified human cancer risk factor, based on a remarkably consistent body of information from epidemiological studies of exposed populations. Typical examples of risk estimation include use of Japanese atomic bomb survivor data to estimate future risk from radiation-related cancer among American patients receiving multiple computed tomography scans, persons affected by radioactive fallout, or persons whose livelihoods involve some radiation exposure, such as x-ray technicians, interventional radiologists, or shipyard workers. Our estimates of radiation-related risk are uncertain, reflecting statistical variation and our imperfect understanding of crucial assumptions that must be made if we are to apply existing epidemiological data to particular situations. Fortunately, that uncertainty is also highly quantifiable, and can be presented concisely and transparently. Radiation protection is ultimately a political process that involves consent by stakeholders, a diverse group that includes people who might be expected to be risk-averse and concerned with plausible upper limits on risk (how bad could it be?), cost-averse and concerned with lower limits on risk (can you prove there is a nontrivial risk at current dose levels?), or combining both points of view. How radiation-related risk is viewed by individuals and population subgroups also depends very much on perception of related benefit, which might be (for example) medical, economic, altruistic, or nonexistent. The following presentation follows the lead of National Council on Radiation Protection and Measurements (NCRP) Commentary 14, NCRP Report 126, and later documents in treating radiation protection from the viewpoint of quantitative uncertainty analysis.

  16. Climate change - An uncertainty factor in risk analysis of contaminated land

    International Nuclear Information System (INIS)

    Augustsson, Anna; Filipsson, Monika; Oberg, Tomas; Bergbaeck, Bo

    2011-01-01

    Metals frequently occur at contaminated sites, where their potential toxicity and persistence require risk assessments that consider possible long-term changes. Changes in climate are likely to affect the speciation, mobility, and risks associated with metals. This paper provides an example of how the climate effect can be incorporated into a commonly used exposure model, and how the exposure then changes compared to present conditions. The comparison was made for cadmium (Cd) exposure to 4-year-old children at a highly contaminated iron and steel works site in southeastern Sweden. Both deterministic and probabilistic approaches (through probability bounds analysis, PBA) were used in the exposure assessment. Potential climate-sensitive variables were determined by a literature review. Although only six of the total 39 model variables were assumed to be sensitive to a change in climate (groundwater infiltration, hydraulic conductivity, soil moisture, soil:water distribution, and two bioconcentration factors), the total exposure was clearly affected. For example, by altering the climate-sensitive variables in the order of 15% to 20%, the deterministic estimate of exposure increased by 27%. Similarly, the PBA estimate of the reasonable maximum exposure (RME, defined as the upper bound of the 95th percentile) increased by almost 20%. This means that sites where the exposure in present conditions is determined to be slightly below guideline values may in the future exceed these guidelines, and risk management decisions could thus be affected. The PBA, however, showed that there is also a possibility of lower exposure levels, which means that the changes assumed for the climate-sensitive variables increase the total uncertainty in the probabilistic calculations. This highlights the importance of considering climate as a factor in the characterization of input data to exposure assessments at contaminated sites. The variable with the strongest influence on the result was the ...

  17. Uncertainty Analysis of Power Grid Investment Capacity Based on Monte Carlo

    Science.gov (United States)

    Qin, Junsong; Liu, Bingyi; Niu, Dongxiao

    By analyzing the factors that influence the investment capacity of the power grid, an investment capacity analysis model is built with depreciation cost, sales price and sales quantity, net profit, financing, and GDP of the secondary industry as the dependent variables. After carrying out Kolmogorov-Smirnov tests, the probability distribution of each influence factor is obtained. Finally, the uncertainty analysis results for grid investment capacity are obtained by Monte Carlo simulation.

  18. Risk-Based Sampling: I Don't Want to Weight in Vain.

    Science.gov (United States)

    Powell, Mark R

    2015-12-01

    Recently, there has been considerable interest in developing risk-based sampling for food safety and animal and plant health for efficient allocation of inspection and surveillance resources. The problem of risk-based sampling allocation presents a challenge similar to financial portfolio analysis. Markowitz (1952) laid the foundation for modern portfolio theory based on mean-variance optimization. However, a persistent challenge in implementing portfolio optimization is the problem of estimation error, leading to false "optimal" portfolios and unstable asset weights. In some cases, portfolio diversification based on simple heuristics (e.g., equal allocation) has better out-of-sample performance than complex portfolio optimization methods due to estimation uncertainty. Even for portfolios with a modest number of assets, the estimation window required for true optimization may imply an implausibly long stationary period. The implications for risk-based sampling are illustrated by a simple simulation model of lot inspection for a small, heterogeneous group of producers. © 2015 Society for Risk Analysis.
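The portfolio analogy can be made concrete with two-asset minimum-variance weights (the variances, covariance, and "estimation error" below are hypothetical):

```python
def min_variance_weight(var1, var2, cov):
    """Weight on asset 1 in the two-asset minimum-variance portfolio."""
    return (var2 - cov) / (var1 + var2 - 2.0 * cov)

def portfolio_variance(w, var1, var2, cov):
    """Variance of the portfolio w * asset1 + (1 - w) * asset2."""
    return w * w * var1 + (1.0 - w) ** 2 * var2 + 2.0 * w * (1.0 - w) * cov

# True (unknown) world: identical producers, so equal allocation is optimal
w_true = min_variance_weight(0.04, 0.04, 0.01)
# A noisy variance estimate tilts the "optimal" weights...
w_est = min_variance_weight(0.06, 0.04, 0.01)
# ...and the tilted portfolio performs worse under the true parameters
v_equal = portfolio_variance(w_true, 0.04, 0.04, 0.01)
v_tilted = portfolio_variance(w_est, 0.04, 0.04, 0.01)
```

This is the paper's point in miniature: optimizing against estimated parameters can underperform the simple equal-allocation heuristic when estimation error dominates.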

  19. Analysis and Reduction of Complex Networks Under Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Knio, Omar M

    2014-04-09

    This is a collaborative proposal that aims at developing new methods for the analysis and reduction of complex multiscale networks under uncertainty. The approach is based on combining methods of computational singular perturbation (CSP) and probabilistic uncertainty quantification. In deterministic settings, CSP yields asymptotic approximations of reduced-dimensionality “slow manifolds” on which a multiscale dynamical system evolves. Introducing uncertainty raises fundamentally new issues, particularly concerning its impact on the topology of slow manifolds, and means to represent and quantify associated variability. To address these challenges, this project uses polynomial chaos (PC) methods to reformulate uncertain network models, and to analyze them using CSP in probabilistic terms. Specific objectives include (1) developing effective algorithms that can be used to illuminate fundamental and unexplored connections among model reduction, multiscale behavior, and uncertainty, and (2) demonstrating the performance of these algorithms through applications to model problems.

  20. Software for occupational health and safety risk analysis based on a fuzzy model.

    Science.gov (United States)

    Stefanovic, Miladin; Tadic, Danijela; Djapan, Marko; Macuzic, Ivan

    2012-01-01

    Risk and safety management are very important issues in healthcare systems. Those are complex systems with many entities, hazards and uncertainties. In such an environment, it is very hard to introduce a system for evaluating and simulating significant hazards. In this paper, we analyzed different types of hazards in healthcare systems and we introduced a new fuzzy model for evaluating and ranking hazards. Finally, we presented a developed software solution, based on the suggested fuzzy model for evaluating and monitoring risk.

  1. Uncertainty analysis of hydrological modeling in a tropical area using different algorithms

    Science.gov (United States)

    Rafiei Emam, Ammar; Kappas, Martin; Fassnacht, Steven; Linh, Nguyen Hoang Khanh

    2018-01-01

    Hydrological modeling outputs are subject to uncertainty resulting from different sources of error (e.g., error in input data, model structure, and model parameters), making quantification of uncertainty in hydrological modeling imperative to improve the reliability of modeling results. Uncertainty analysis must overcome difficulties in the calibration of hydrological models, which increase further in areas with data scarcity. The purpose of this study is to apply four uncertainty analysis algorithms to a semi-distributed hydrological model, quantifying different sources of uncertainty (especially parameter uncertainty) and evaluating their performance. In this study, the Soil and Water Assessment Tool (SWAT) eco-hydrological model was implemented for a watershed in the center of Vietnam. The sensitivity of parameters was analyzed, and the model was calibrated. The uncertainty analysis for the hydrological model was conducted based on four algorithms: Generalized Likelihood Uncertainty Estimation (GLUE), Sequential Uncertainty Fitting (SUFI), Parameter Solution method (ParaSol) and Particle Swarm Optimization (PSO). The performance of the algorithms was compared using P-factor and R-factor, the coefficient of determination (R²), the Nash-Sutcliffe efficiency (NSE) and Percent Bias (PBIAS). The results showed the high performance of SUFI and PSO, with P-factor > 0.83, R-factor 0.91, NSE > 0.89, and 0.18 ... Indeed, uncertainty analysis must be accounted for when the outcomes of the model are used for policy or management decisions.
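A toy sketch of the GLUE algorithm mentioned above (the linear model, observations, and NSE threshold are hypothetical stand-ins for the SWAT setup):

```python
import random

def toy_model(param, t):
    """Placeholder stand-in for a SWAT-like rainfall-runoff response."""
    return param * t

def glue_bounds(observed, n_samples=5000, nse_threshold=0.5, seed=4):
    """GLUE sketch: sample the parameter, keep 'behavioral' sets whose
    Nash-Sutcliffe efficiency exceeds the threshold, and return 5-95%
    prediction bounds from the behavioral simulations at each time step."""
    rng = random.Random(seed)
    times = range(len(observed))
    mean_obs = sum(observed) / len(observed)
    sst = sum((o - mean_obs) ** 2 for o in observed)
    behavioral = []
    for _ in range(n_samples):
        p = rng.uniform(0.5, 3.0)
        sim = [toy_model(p, t) for t in times]
        sse = sum((s - o) ** 2 for s, o in zip(sim, observed))
        if 1.0 - sse / sst > nse_threshold:
            behavioral.append(sim)
    bounds = []
    for t in times:
        vals = sorted(s[t] for s in behavioral)
        bounds.append((vals[int(0.05 * len(vals))], vals[int(0.95 * len(vals))]))
    return bounds

# Synthetic observations roughly following a slope of 2
observed = [0.0, 2.1, 3.9, 6.2, 8.0, 9.9, 12.1, 14.0, 16.2, 17.9]
bounds = glue_bounds(observed)
```

The P-factor and R-factor reported in the abstract are computed from exactly such bounds: the fraction of observations inside them and their relative width.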

  2. Automated uncertainty analysis methods in the FRAP computer codes

    International Nuclear Information System (INIS)

    Peck, S.O.

    1980-01-01

    A user-oriented, automated uncertainty analysis capability has been incorporated in the Fuel Rod Analysis Program (FRAP) computer codes. The FRAP codes have been developed for the analysis of Light Water Reactor fuel rod behavior during steady state (FRAPCON) and transient (FRAP-T) conditions as part of the United States Nuclear Regulatory Commission's Water Reactor Safety Research Program. The objective of uncertainty analysis of these codes is to obtain estimates of the uncertainty in computed outputs as a function of known uncertainties in input variables. This paper presents the methods used to generate an uncertainty analysis of a large computer code, discusses the assumptions that are made, and shows techniques for testing them. An uncertainty analysis of FRAP-T calculated fuel rod behavior during a hypothetical loss-of-coolant transient is presented as an example and carried through the discussion to illustrate the various concepts.

  3. Representation of analysis results involving aleatory and epistemic uncertainty.

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Jay Dean (ProStat, Mesa, AZ); Helton, Jon Craig (Arizona State University, Tempe, AZ); Oberkampf, William Louis; Sallaberry, Cedric J.

    2008-08-01

    Procedures are described for the representation of results in analyses that involve both aleatory uncertainty and epistemic uncertainty, with aleatory uncertainty deriving from an inherent randomness in the behavior of the system under study and epistemic uncertainty deriving from a lack of knowledge about the appropriate values to use for quantities that are assumed to have fixed but poorly known values in the context of a specific study. Aleatory uncertainty is usually represented with probability and leads to cumulative distribution functions (CDFs) or complementary cumulative distribution functions (CCDFs) for analysis results of interest. Several mathematical structures are available for the representation of epistemic uncertainty, including interval analysis, possibility theory, evidence theory and probability theory. In the presence of epistemic uncertainty, there is not a single CDF or CCDF for a given analysis result. Rather, there is a family of CDFs and a corresponding family of CCDFs that derive from epistemic uncertainty and have an uncertainty structure that derives from the particular uncertainty structure (i.e., interval analysis, possibility theory, evidence theory, probability theory) used to represent epistemic uncertainty. Graphical formats for the representation of epistemic uncertainty in families of CDFs and CCDFs are investigated and presented for the indicated characterizations of epistemic uncertainty.
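    The two-level structure described above can be sketched in a few lines: an outer loop samples the epistemic uncertainty (here, hypothetically, a fixed but poorly known mean), an inner loop samples the aleatory variability, and each epistemic sample yields one empirical CCDF, so the result is a family of CCDFs:

```python
import numpy as np

def ccdf_family(n_epistemic=50, n_aleatory=1000, seed=1):
    """One CCDF per epistemic sample: the outer loop fixes a poorly known
    parameter (the mean), the inner loop samples aleatory variability."""
    rng = np.random.default_rng(seed)
    thresholds = np.linspace(-2.0, 6.0, 41)
    family = []
    for _ in range(n_epistemic):
        mu = rng.uniform(1.0, 3.0)                    # epistemic: fixed but poorly known
        outcomes = rng.normal(mu, 1.0, n_aleatory)    # aleatory: inherent randomness
        family.append([(outcomes > t).mean() for t in thresholds])
    return thresholds, np.array(family)

thresholds, family = ccdf_family()
```

    Plotting the 50 curves together gives exactly the "family of CCDFs" display the paper investigates: the spread across curves reflects epistemic uncertainty, while the shape of each curve reflects aleatory uncertainty.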

  4. Incorporating uncertainty analysis into life cycle estimates of greenhouse gas emissions from biomass production

    International Nuclear Information System (INIS)

    Johnson, David R.; Willis, Henry H.; Curtright, Aimee E.; Samaras, Constantine; Skone, Timothy

    2011-01-01

    Before further investments are made in utilizing biomass as a source of renewable energy, both policy makers and the energy industry need estimates of the net greenhouse gas (GHG) reductions expected from substituting biobased fuels for fossil fuels. Such GHG reductions depend greatly on how the biomass is cultivated, transported, processed, and converted into fuel or electricity. Any policy aiming to reduce GHGs with biomass-based energy must account for uncertainties in emissions at each stage of production, or else it risks yielding marginal reductions, if any, while potentially imposing great costs. This paper provides a framework for incorporating uncertainty analysis specifically into estimates of the life cycle GHG emissions from the production of biomass. We outline the sources of uncertainty, discuss the implications of uncertainty and variability on the limits of life cycle assessment (LCA) models, and provide a guide for practitioners to best practices in modeling these uncertainties. The suite of techniques described herein can be used to improve the understanding and the representation of the uncertainties associated with emissions estimates, thus enabling improved decision making with respect to the use of biomass for energy and fuel production. -- Highlights: → We describe key model, scenario and data uncertainties in LCAs of biobased fuels. → System boundaries and allocation choices should be consistent with study goals. → Scenarios should be designed around policy levers that can be controlled. → We describe a new way to analyze the importance of covariance between inputs.

  5. Bayesian uncertainty analyses of probabilistic risk models

    International Nuclear Information System (INIS)

    Pulkkinen, U.

    1989-01-01

    Applications of Bayesian principles to uncertainty analysis are discussed in the paper. A short review of the most important uncertainties and their causes is provided. An application of the principle of maximum entropy to the determination of Bayesian prior distributions is described. An approach based on so-called probabilistic structures is presented in order to develop a method for the quantitative evaluation of modelling uncertainties. The method is applied to a small example case. Ideas for application areas for the proposed method are discussed
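    As a concrete sketch of the maximum-entropy step: on a finite support with a prescribed mean, the entropy-maximizing distribution has the exponential form p_i ∝ exp(λ·x_i), and λ can be found by bisection because the mean is monotone in λ. The example below (Jaynes' loaded-die illustration, not a case from the paper) applies this:

```python
import numpy as np

def maxent_pmf(support, target_mean, tol=1e-10):
    """Maximum-entropy pmf on a finite support with a fixed mean:
    p_i is proportional to exp(lam * x_i); lam is found by bisection
    (the resulting mean increases monotonically with lam)."""
    x = np.asarray(support, dtype=float)

    def pmf(lam):
        w = np.exp(lam * (x - x.mean()))   # shift exponent for numerical stability
        return w / w.sum()

    lo, hi = -50.0, 50.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if (pmf(mid) * x).sum() < target_mean:
            lo = mid
        else:
            hi = mid
    return pmf(0.5 * (lo + hi))

p = maxent_pmf([1, 2, 3, 4, 5, 6], 4.5)   # die with mean constrained to 4.5
```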

  6. Can Bayesian Belief Networks help tackling conceptual model uncertainties in contaminated site risk assessment?

    DEFF Research Database (Denmark)

    Troldborg, Mads; Thomsen, Nanna Isbak; McKnight, Ursula S.

    A key component in risk assessment of contaminated sites is the formulation of a conceptual site model. The conceptual model is a simplified representation of reality and forms the basis for the mathematical modelling of contaminant fate and transport at the site. Different conceptual models may describe the same contaminated site equally well. In many cases, conceptual model uncertainty has been shown to be one of the dominant sources of uncertainty and is therefore essential to account for when quantifying uncertainties in risk assessments. We present here a Bayesian Belief Network (BBN) approach for evaluating the uncertainty in risk assessment of groundwater contamination from contaminated sites. The approach accounts for conceptual model uncertainty by considering multiple conceptual models, each of which represents an alternative interpretation of the site...

  7. On the influence of uncertainties in estimating risk aversion and working interest

    International Nuclear Information System (INIS)

    MacKay, J.A.; Lerche, I.

    1996-01-01

    The influence of uncertainties in costs, value, success probability, risk tolerance, and mandated working interest is evaluated for its impact on assessing probable ranges of uncertainty on risk-adjusted value (RAV) using different models. The relative importance of different factors in contributing to the uncertainty in RAV is analyzed, as is the influence of different probability distributions for the intrinsic variables entering the RAV model formulae. Numerical illustrations indicate how the RAV probabilities depend not only on the model functions (Cozzolino, hyperbolic tangent) used to provide RAV estimates, but also on the intrinsic shapes of the probability distributions from which input parameter values are drawn for Monte Carlo simulations. In addition, a mandated range of working interest can be addressed as an extra variable contributing to the probabilistic range of RAV, while negative RAV values for high-cost projects can be used to assess the probable buy-out amount one should be prepared to pay, depending on corporate risk philosophy. The procedures also illustrate how the relative contributions of scientific factors influence the uncertainty of reserve assessments, allowing one to determine where to concentrate effort to improve the ranges of uncertainty. (Author)

  8. Uncertainty analysis

    International Nuclear Information System (INIS)

    Thomas, R.E.

    1982-03-01

    An evaluation is made of the suitability of analytical and statistical sampling methods for making uncertainty analyses. The adjoint method is found to be well-suited for obtaining sensitivity coefficients for computer programs involving large numbers of equations and input parameters. For this purpose the Latin Hypercube Sampling method is found to be inferior to conventional experimental designs. The Latin hypercube method can be used to estimate output probability density functions, but requires supplementary rank transformations followed by stepwise regression to obtain uncertainty information on individual input parameters. A simple Cork and Bottle problem is used to illustrate the efficiency of the adjoint method relative to certain statistical sampling methods. For linear models of the form Ax=b it is shown that a complete adjoint sensitivity analysis can be made without formulating and solving the adjoint problem. This can be done either by using a special type of statistical sampling or by reformulating the primal problem and using suitable linear programming software
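    The Latin hypercube construction evaluated above can be sketched briefly: each parameter's range is split into n equal strata, one point is drawn per stratum, and the strata are independently permuted across parameters (a synthetic unit-hypercube example, not the Cork and Bottle problem):

```python
import numpy as np

def latin_hypercube(n_samples, n_params, seed=0):
    """Latin hypercube sample on the unit hypercube: each parameter's range is
    split into n_samples equal strata, each stratum is hit exactly once, and
    the strata are independently shuffled across parameters."""
    rng = np.random.default_rng(seed)
    u = (np.arange(n_samples)[:, None] + rng.random((n_samples, n_params))) / n_samples
    for j in range(n_params):
        u[:, j] = rng.permutation(u[:, j])
    return u

x = latin_hypercube(10, 3)
```

    Model outputs computed at these points can then be rank-transformed and regressed on the inputs, as the abstract describes, to attribute output uncertainty to individual parameters.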

  9. Sunway Medical Laboratory Quality Control Plans Based on Six Sigma, Risk Management and Uncertainty.

    Science.gov (United States)

    Jairaman, Jamuna; Sakiman, Zarinah; Li, Lee Suan

    2017-03-01

    Sunway Medical Centre (SunMed) implemented Six Sigma, measurement uncertainty, and risk management following the CLSI EP23 Individualized Quality Control Plan approach. Despite the differences among the three approaches, each implementation was beneficial to the laboratory, and none was in conflict with another. A synthesis of these approaches, built on a solid foundation of quality control planning, can help build a strong quality management system for the entire laboratory.

  10. Model based climate information on drought risk in Africa

    Science.gov (United States)

    Calmanti, S.; Syroka, J.; Jones, C.; Carfagna, F.; Dell'Aquila, A.; Hoefsloot, P.; Kaffaf, S.; Nikulin, G.

    2012-04-01

    The United Nations World Food Programme (WFP) has embarked upon the endeavor of creating a sustainable Africa-wide natural disaster risk management system. A fundamental building block of this initiative is a drought impact modeling platform called Africa Risk-View, which aims to quantify and monitor weather-related food security risk in Africa. The modeling approach is based on the Water Requirement Satisfaction Index (WRSI) as the fundamental indicator of agricultural performance, and uses historical records of food assistance operations to project future potential needs for livelihood protection. By using climate change scenarios as an input to Africa Risk-View it is possible, in principle, to evaluate the future impact of climate variability on critical issues such as food security and the overall performance of the envisaged risk management system. A necessary preliminary step toward this challenging task is the exploration of the sources of uncertainty affecting assessments based on modeled climate change scenarios. For this purpose, a limited set of climate models was selected in order to verify the relevance of using climate model output data with Africa Risk-View and to explore a minimal range of possible sources of uncertainty. This first evaluation exercise started before the setup of the CORDEX framework and relied on model output available at the time. In particular, only one regional downscaling was available for the entire African continent, from the ENSEMBLES project. The analysis shows that current coarse-resolution global climate models cannot directly feed into the Africa Risk-View risk-analysis tool. However, regional downscaling may help correct the inherent biases observed in the datasets. Further analysis is performed using the first data available under the CORDEX framework. In particular, we consider a set of simulations driven with boundary conditions from the ERA-Interim reanalysis to evaluate the skill in representing drought

  11. Quantified Risk Ranking Model for Condition-Based Risk and Reliability Centered Maintenance

    Science.gov (United States)

    Chattopadhyaya, Pradip Kumar; Basu, Sushil Kumar; Majumdar, Manik Chandra

    2017-06-01

    In the recent past, the risk and reliability centered maintenance (RRCM) framework was introduced with a shift in methodological focus from reliability and probabilities (expected values) to reliability, uncertainty, and risk. In this paper the authors explain a novel methodology for risk quantification and for ranking critical items to prioritize maintenance actions on the basis of condition-based risk and reliability centered maintenance (CBRRCM). The critical items are identified through criticality analysis of the RPN values of the items of a system, and the maintenance significant precipitating factors (MSPF) of the items are evaluated. The criticality of risk is assessed using three risk coefficients. The likelihood risk coefficient treats the probability as a fuzzy number. The abstract risk coefficient deduces risk influenced by uncertainty and sensitivity, besides other factors. The third risk coefficient, called the hazardous risk coefficient, covers anticipated hazards which may occur in the future; here the risk is deduced from criteria of consequences on safety, environment, maintenance and economic risks, with corresponding costs for the consequences. The characteristic values of all three risk coefficients are obtained with a particular test. With a few more tests on the system, the values may change significantly within the controlling range of each coefficient; hence 'random number simulation' is used to obtain one distinctive value for each coefficient. The risk coefficients are statistically added to obtain the final risk coefficient of each critical item, and the final rankings of the critical items are then estimated. The prioritization in ranking of critical items using the developed mathematical model for risk assessment shall be useful in optimizing financial losses and the timing of maintenance actions.

  12. Risk, rationality, and regret: responding to the uncertainty of childhood food anaphylaxis.

    Science.gov (United States)

    Hu, W; Kerridge, I; Kemp, A

    2005-06-01

    Risk and uncertainty are unavoidable in clinical medicine. In the case of childhood food allergy, the dysphoric experience of uncertainty is heightened by the perception of unpredictable danger to young children. Medicine has tended to respond to uncertainty with forms of rational decision making. Rationality cannot, however, resolve uncertainty, and it provides an insufficient account of risk. This paper compares the medical and parental accounts of two peanut-allergic toddlers to highlight the value of emotions in decision making. One emotion in particular, regret, helps explain the actions taken to prevent allergic reactions, given the diffuse nature of responsibility for children. In this light, the assumption that doctors make rational judgments while patients have emotion-led preferences is a false dichotomy. Reconciling medical and lay accounts requires acknowledgement of the interrelationship between the rational and the emotional, and may lead to more appropriate clinical decision making under conditions of uncertainty.

  13. Ontology-based specification, identification and analysis of perioperative risks.

    Science.gov (United States)

    Uciteli, Alexandr; Neumann, Juliane; Tahar, Kais; Saleh, Kutaiba; Stucke, Stephan; Faulbrück-Röhr, Sebastian; Kaeding, André; Specht, Martin; Schmidt, Tobias; Neumuth, Thomas; Besting, Andreas; Stegemann, Dominik; Portheine, Frank; Herre, Heinrich

    2017-09-06

    Medical personnel in hospitals often work under great physical and mental strain. In medical decision-making, errors can never be completely ruled out. Several studies have shown that between 50 and 60% of adverse events could have been avoided through better organization, more attention, or more effective safety procedures. Critical situations arise especially during interdisciplinary collaboration and the use of complex medical technology, for example during surgical interventions and in perioperative settings (the period of time before, during and after surgical intervention). In this paper, we present an ontology and an ontology-based software system which can identify risks across medical processes and support the avoidance of errors, in particular in the perioperative setting. We developed a practicable definition of the risk notion, which is easily understandable by the medical staff and is usable by the software tools. Based on this definition, we developed a Risk Identification Ontology (RIO) and used it for the specification and identification of perioperative risks. An agent system was developed which gathers risk-relevant data during the whole perioperative treatment process from various sources and provides it for risk identification and analysis in a centralized fashion. The results of such an analysis are provided to the medical personnel in the form of context-sensitive hints and alerts. For the identification of the ontologically specified risks, we developed an ontology-based software module called the Ontology-based Risk Detector (OntoRiDe). About 20 risks relating to cochlear implantation (CI) have already been implemented. Comprehensive testing has indicated the correctness of the data acquisition, risk identification and analysis components, as well as the web-based visualization of results.

  14. Adjoint-Based Uncertainty Quantification with MCNP

    Energy Technology Data Exchange (ETDEWEB)

    Seifried, Jeffrey E. [Univ. of California, Berkeley, CA (United States)

    2011-09-01

    This work serves to quantify the instantaneous uncertainties in neutron transport simulations arising from nuclear data and statistical counting uncertainties. Perturbation and adjoint theories are used to derive implicit sensitivity expressions. These expressions are transformed into forms that are convenient for construction with MCNP6, creating the ability to perform adjoint-based uncertainty quantification with MCNP6. These new tools are exercised on the depleted-uranium hybrid LIFE blanket, quantifying its sensitivities and uncertainties with respect to important figures of merit. Overall, these uncertainty estimates are small (< 2%). Having quantified the sensitivities and uncertainties, physical understanding of the system is gained and some confidence in the simulation is acquired.

  15. Estimation of a quantity of interest in uncertainty analysis: Some help from Bayesian decision theory

    International Nuclear Information System (INIS)

    Pasanisi, Alberto; Keller, Merlin; Parent, Eric

    2012-01-01

    In the context of risk analysis under uncertainty, we focus here on the problem of estimating a so-called quantity of interest of an uncertainty analysis problem, i.e. a given feature of the probability distribution function (pdf) of the output of a deterministic model with uncertain inputs. We stay here in a fully probabilistic setting. A common problem is how to account for epistemic uncertainty affecting the parameters of the probability distributions of the inputs. In standard practice, this uncertainty is often neglected (the plug-in approach). When a specific uncertainty assessment is made on the basis of the available information (expertise and/or data), a common solution consists in marginalizing the joint distribution of both the observable inputs and the parameters of the probabilistic model (i.e. computing the predictive pdf of the inputs), then propagating it through the deterministic model. We reinterpret this approach in the light of Bayesian decision theory, and show that this practice leads the analyst to implicitly adopt a specific loss function which may be inappropriate for the problem under investigation, and suboptimal from a decisional perspective. These concepts are illustrated on a simple numerical example concerning a case of flood risk assessment.
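    The plug-in versus predictive distinction can be made concrete with a toy two-level Monte Carlo sketch (all numbers are hypothetical, not from the paper): the quantity of interest is an exceedance probability, the aleatory model is normal with known spread, and the epistemic uncertainty is represented by a posterior sample for the mean. Marginalizing the posterior fattens the tail relative to plugging in the posterior mean:

```python
import numpy as np

rng = np.random.default_rng(3)
sigma = 1.0                                     # known aleatory spread
threshold = 2.0                                 # quantity of interest: P(Y > threshold)
posterior_mu = rng.normal(0.0, 0.5, size=200)   # epistemic posterior sample for the mean

# Plug-in: neglect parameter uncertainty and use the posterior mean alone.
mu_hat = posterior_mu.mean()
p_plug = (rng.normal(mu_hat, sigma, size=400_000) > threshold).mean()

# Predictive: marginalize over the posterior, then propagate aleatory variability.
y = rng.normal(posterior_mu[:, None], sigma, size=(200, 2000))
p_pred = (y > threshold).mean()
```

    Here the predictive estimate exceeds the plug-in one: ignoring parameter uncertainty understates the tail probability, which is the kind of implicit choice the paper scrutinizes.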

  16. Seismic risk analysis for the Babcock and Wilcox facility, Leechburg, Pennsylvania

    International Nuclear Information System (INIS)

    1977-01-01

    The results of a detailed seismic risk analysis of the Babcock and Wilcox Plutonium Fuel Fabrication facility at Leechburg, Pennsylvania are presented. This report focuses on earthquakes; the other natural hazards, addressed in separate reports, are severe weather (strong winds and tornados) and floods. The calculational method used is based on Cornell's work (1968); it has previously been applied to safety evaluations of major projects. The historical seismic record was established after a review of the available literature, consultation with operators of local seismic arrays, and examination of appropriate seismic databases. Because of the aseismicity of the region around the site, an analysis different from the conventional closest approach in a tectonic province was adopted. Earthquakes as far from the site as 1,000 km were included, as was the possibility of earthquakes at the site. In addition, various uncertainties in the input were explicitly considered in the analysis. The results of the risk analysis, which include a Bayesian estimate of the uncertainties, are presented as return period accelerations. The best estimate curve indicates that the Babcock and Wilcox facility will experience 0.05 g every 220 years and 0.10 g every 1,400 years. The bounding curves roughly represent the one-standard-deviation confidence limits about the best estimate, reflecting the uncertainty in certain of the input. Detailed examination of the results shows that the accelerations are very insensitive to the details of the source region geometries or the historical earthquake statistics in each region, and that each of the source regions contributes almost equally to the cumulative risk at the site. If required for structural analysis, acceleration response spectra for the site can be constructed by scaling the mean response spectrum for alluvium in WASH 1255 by these peak accelerations

  17. Uncertainty analysis of time-dependent nonlinear systems: theory and application to transient thermal hydraulics

    International Nuclear Information System (INIS)

    Barhen, J.; Bjerke, M.A.; Cacuci, D.G.; Mullins, C.B.; Wagschal, G.G.

    1982-01-01

    An advanced methodology for performing systematic uncertainty analysis of time-dependent nonlinear systems is presented. This methodology includes a capability for reducing uncertainties in system parameters and responses by using Bayesian inference techniques to consistently combine prior knowledge with additional experimental information. The determination of best estimates for the system parameters, for the responses, and for their respective covariances is treated as a time-dependent constrained minimization problem. Three alternative formalisms for solving this problem are developed. The two "off-line" formalisms, with and without "foresight" characteristics, require the generation of a complete sensitivity database prior to performing the uncertainty analysis. The "on-line" formalism, in which uncertainty analysis is performed interactively with the system analysis code, is best suited for the treatment of large-scale, highly nonlinear, time-dependent problems. This methodology is applied to the uncertainty analysis of a transient upflow of a high-pressure water heat transfer experiment. For comparison, an uncertainty analysis using sensitivities computed by standard response surface techniques is also performed. The results of the analysis indicate the following. Major reduction of the discrepancies in the calculation/experiment ratios is achieved by using the new methodology. Incorporation of in-bundle measurements in the uncertainty analysis significantly reduces system uncertainties. The accuracy of sensitivities generated by response-surface techniques should be carefully assessed prior to using them as a basis for uncertainty analyses of transient reactor safety problems

  18. Development of seismic risk analysis methodologies at JAERI

    International Nuclear Information System (INIS)

    Tanaka, T.; Abe, K.; Ebisawa, K.; Oikawa, T.

    1988-01-01

    The usefulness of probabilistic safety assessment (PSA) is recognized worldwide for balanced design and regulation of nuclear power plants. In Japan, the Japan Atomic Energy Research Institute (JAERI) has been engaged in developing the methodologies necessary for carrying out PSA. The research and development program was started in 1980; at that time the effort addressed only internal-initiator PSA. In 1985 the program was expanded to include external event analysis. Although the expanded program is to cover various external initiators, the current effort is dedicated to seismic risk analysis. There are three levels of seismic PSA, as for internal-initiator PSA: Level 1, evaluation of core damage frequency; Level 2, evaluation of radioactive release frequency and source terms; and Level 3, evaluation of environmental consequences. In JAERI's program, only the methodologies for Level 1 seismic PSA are under development. The methodology development for seismic risk analysis is divided into two phases. The Phase I study is to establish a whole set of simple methodologies based on currently available data. In Phase II, a sensitivity study will be carried out to identify the parameters whose uncertainty may result in large uncertainty in seismic risk, and for such parameters the methodology will be upgraded. The Phase I study has now almost been completed. In this report, outlines of the study and some of its outcomes are described

  19. Calibration and Measurement Uncertainty Estimation of Radiometric Data: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Habte, A.; Sengupta, M.; Reda, I.; Andreas, A.; Konings, J.

    2014-11-01

    Evaluating the performance of photovoltaic cells, modules, and arrays that form large solar deployments relies on accurate measurements of the available solar resource. Determining the accuracy of these solar radiation measurements therefore provides a better understanding of investment risks. This paper provides guidelines and recommended procedures for estimating the uncertainty in radiometer calibrations and measurements using methods that follow the International Bureau of Weights and Measures Guide to the Expression of Uncertainty in Measurement (GUM). Standardized analysis based on these procedures ensures that the uncertainty quoted is well documented.
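    A GUM-style combined standard uncertainty can be sketched as the root-sum-of-squares of sensitivity coefficients times input standard uncertainties (first order, uncorrelated inputs). The measurement equation and the numbers below are hypothetical, not taken from the paper:

```python
import math

def combined_uncertainty(f, inputs, u_inputs, rel_step=1e-6):
    """GUM first-order combined standard uncertainty for uncorrelated inputs:
    u_c^2 = sum_i (c_i * u_i)^2, with sensitivity coefficients c_i = df/dx_i
    estimated by forward finite differences."""
    y0 = f(*inputs)
    u_sq = 0.0
    for i, (xi, ui) in enumerate(zip(inputs, u_inputs)):
        step = rel_step * abs(xi)
        shifted = list(inputs)
        shifted[i] = xi + step
        ci = (f(*shifted) - y0) / step   # sensitivity coefficient df/dx_i
        u_sq += (ci * ui) ** 2
    return y0, math.sqrt(u_sq)

# Hypothetical pyranometer: irradiance (W/m^2) = thermopile voltage / sensitivity
irradiance = lambda v, s: v / s
value, u_c = combined_uncertainty(irradiance, [8.0e-3, 8.0e-6], [2.0e-5, 8.0e-8])
expanded = 2.0 * u_c   # coverage factor k = 2, roughly 95% coverage
```

    Reporting the expanded uncertainty with its coverage factor, as the GUM recommends, is what makes the quoted uncertainty well documented.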

  20. Characterization, propagation and analysis of aleatory and epistemic uncertainty in the 2008 performance assessment for the proposed repository for radioactive waste at Yucca Mountain, Nevada

    International Nuclear Information System (INIS)

    Helton, Jon Craig; Sallaberry, Cedric M.; Hansen, Clifford W.

    2010-01-01

    The 2008 performance assessment (PA) for the proposed repository for high-level radioactive waste at Yucca Mountain (YM), Nevada, illustrates the conceptual structure of risk assessments for complex systems. The 2008 YM PA is based on the following three conceptual entities: a probability space that characterizes aleatory uncertainty; a function that predicts consequences for individual elements of the sample space for aleatory uncertainty; and a probability space that characterizes epistemic uncertainty. These entities and their use in the characterization, propagation and analysis of aleatory and epistemic uncertainty are described and illustrated with results from the 2008 YM PA.

  1. A tool for efficient, model-independent management optimization under uncertainty

    Science.gov (United States)

    White, Jeremy; Fienen, Michael N.; Barlow, Paul M.; Welter, Dave E.

    2018-01-01

    To fill a need for risk-based environmental management optimization, we have developed PESTPP-OPT, a model-independent tool for resource management optimization under uncertainty. PESTPP-OPT solves a sequential linear programming (SLP) problem and also implements (optional) efficient, “on-the-fly” (without user intervention) first-order, second-moment (FOSM) uncertainty techniques to estimate model-derived constraint uncertainty. Combined with a user-specified risk value, the constraint uncertainty estimates are used to form chance-constraints for the SLP solution process, so that any optimal solution includes contributions from model input and observation uncertainty. In this way, a “single answer” that includes uncertainty is yielded from the modeling analysis. PESTPP-OPT uses the familiar PEST/PEST++ model interface protocols, which makes it widely applicable to many modeling analyses. The use of PESTPP-OPT is demonstrated with a synthetic, integrated surface-water/groundwater model. The function and implications of chance constraints for this synthetic model are discussed.
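    The chance-constraint mechanism described above can be sketched simply: given a FOSM estimate of the standard deviation of a model-derived constraint, the deterministic bound is tightened by a risk-dependent normal quantile so that the original bound holds with probability 1 - risk (illustrative numbers only, not PESTPP-OPT's actual interface):

```python
from statistics import NormalDist

def chance_constrained_bound(limit, sigma_constraint, risk=0.05):
    """Shift a deterministic upper bound inward by a normal quantile times the
    FOSM-estimated standard deviation of the model-derived constraint, so the
    original bound holds with probability 1 - risk."""
    z = NormalDist().inv_cdf(1.0 - risk)
    return limit - z * sigma_constraint

# e.g. a hypothetical streamflow-depletion limit of 10 units whose simulated
# value carries a FOSM standard deviation of 1.2 units
tightened = chance_constrained_bound(10.0, 1.2, risk=0.05)
```

    Raising the user-specified risk value relaxes the bound back toward the deterministic limit; risk = 0.5 recovers it exactly.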

  2. Uncertainty Analysis via Failure Domain Characterization: Polynomial Requirement Functions

    Science.gov (United States)

    Crespo, Luis G.; Munoz, Cesar A.; Narkawicz, Anthony J.; Kenny, Sean P.; Giesy, Daniel P.

    2011-01-01

    This paper proposes an uncertainty analysis framework based on the characterization of the uncertain parameter space. This characterization enables the identification of worst-case uncertainty combinations and the approximation of the failure and safe domains with a high level of accuracy. Because these approximations are composed of subsets of readily computable probability, they enable the calculation of arbitrarily tight upper and lower bounds on the failure probability. A Bernstein expansion approach is used to size hyper-rectangular subsets, while a sum-of-squares programming approach is used to size quasi-ellipsoidal subsets. These methods are applicable to requirement functions whose functional dependency on the uncertainty is a known polynomial. Among the most prominent features of the methodology are the substantial desensitization of the calculations to the uncertainty model assumed (i.e., the probability distribution describing the uncertainty), as well as the accommodation of changes in such a model with a practically insignificant amount of computational effort.

  3. Physics-based Entry, Descent and Landing Risk Model

    Science.gov (United States)

    Gee, Ken; Huynh, Loc C.; Manning, Ted

    2014-01-01

    A physics-based risk model was developed to assess the risk associated with thermal protection system failures during the entry, descent and landing phase of a manned spacecraft mission. In the model, entry trajectories were computed using a three-degree-of-freedom trajectory tool, the aerothermodynamic heating environment was computed using an engineering-level computational tool and the thermal response of the TPS material was modeled using a one-dimensional thermal response tool. The model was capable of modeling the effect of micrometeoroid and orbital debris impact damage on the TPS thermal response. A Monte Carlo analysis was used to determine the effects of uncertainties in the vehicle state at Entry Interface, aerothermodynamic heating and material properties on the performance of the TPS design. The failure criterion was set as a temperature limit at the bondline between the TPS and the underlying structure. Both direct computation and response surface approaches were used to compute the risk. The model was applied to a generic manned space capsule design. The effect of material property uncertainty and MMOD damage on risk of failure were analyzed. A comparison of the direct computation and response surface approach was undertaken.
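    The Monte Carlo portion of such a risk model can be sketched with a toy surrogate (all distributions and the temperature limit are hypothetical, standing in for the trajectory, aerothermodynamic, and thermal-response chain):

```python
import numpy as np

def bondline_failure_probability(n=100_000, t_limit=560.0, seed=7):
    """Sample uncertain heating and material properties, push them through a
    toy surrogate for peak bondline temperature, and count exceedances of the
    bondline temperature limit (the failure criterion)."""
    rng = np.random.default_rng(seed)
    heating = rng.normal(1.0, 0.08, n)        # normalized aerothermal heating
    conductance = rng.normal(1.0, 0.05, n)    # normalized TPS material property
    t_bond = 500.0 * heating / conductance    # surrogate thermal response (K)
    return float((t_bond > t_limit).mean())

p_fail = bondline_failure_probability()
```

    In the actual model the surrogate line is replaced by the coupled engineering tools, and a response surface can stand in for them when direct computation is too expensive, which is the trade-off the paper compares.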

  4. An uncertainty inventory demonstration - a primary step in uncertainty quantification

    Energy Technology Data Exchange (ETDEWEB)

    Langenbrunner, James R. [Los Alamos National Laboratory; Booker, Jane M [Los Alamos National Laboratory; Hemez, Francois M [Los Alamos National Laboratory; Salazar, Issac F [Los Alamos National Laboratory; Ross, Timothy J [UNM

    2009-01-01

    Tools, methods, and theories for assessing and quantifying uncertainties vary by application. Uncertainty quantification tasks have unique desiderata and circumstances. To realistically assess uncertainty requires the engineer/scientist to specify mathematical models, the physical phenomena of interest, and the theory or framework for assessments. For example, Probabilistic Risk Assessment (PRA) specifically identifies uncertainties using probability theory, and therefore PRAs lack formal procedures for quantifying uncertainties that are not probabilistic. The Phenomena Identification and Ranking Technique (PIRT) proceeds by ranking phenomena using scoring criteria that result in linguistic descriptors, such as importance ranked with the words 'High/Medium/Low.' The use of words allows PIRT to be flexible, but the analysis may then be difficult to combine with other uncertainty theories. We propose that a necessary step in the development of a procedure or protocol for uncertainty quantification (UQ) is the application of an Uncertainty Inventory. An Uncertainty Inventory should be considered and performed in the earliest stages of UQ.

  5. Characterising bias in regulatory risk and decision analysis: An analysis of heuristics applied in health technology appraisal, chemicals regulation, and climate change governance.

    Science.gov (United States)

    MacGillivray, Brian H

    2017-08-01

    In many environmental and public health domains, heuristic methods of risk and decision analysis must be relied upon, either because problem structures are ambiguous, reliable data is lacking, or decisions are urgent. This introduces an additional source of uncertainty beyond model and measurement error - uncertainty stemming from relying on inexact inference rules. Here we identify and analyse heuristics used to prioritise risk objects, to discriminate between signal and noise, to weight evidence, to construct models, to extrapolate beyond datasets, and to make policy. Some of these heuristics are based on causal generalisations, yet can misfire when these relationships are presumed rather than tested (e.g. surrogates in clinical trials). Others are conventions designed to confer stability to decision analysis, yet which may introduce serious error when applied ritualistically (e.g. significance testing). Some heuristics can be traced back to formal justifications, but only subject to strong assumptions that are often violated in practical applications. Heuristic decision rules (e.g. feasibility rules) in principle act as surrogates for utility maximisation or distributional concerns, yet in practice may neglect costs and benefits, be based on arbitrary thresholds, and be prone to gaming. We highlight the problem of rule-entrenchment, where analytical choices that are in principle contestable are arbitrarily fixed in practice, masking uncertainty and potentially introducing bias. Strategies for making risk and decision analysis more rigorous include: formalising the assumptions and scope conditions under which heuristics should be applied; testing rather than presuming their underlying empirical or theoretical justifications; using sensitivity analysis, simulations, multiple bias analysis, and deductive systems of inference (e.g. 
directed acyclic graphs) to characterise rule uncertainty and refine heuristics; adopting "recovery schemes" to correct for known biases

  6. Optimizing an Investment Solution in Conditions of Uncertainty and Risk as a Multicriterial Task

    Directory of Open Access Journals (Sweden)

    Kotsyuba Oleksiy S.

    2017-10-01

    The article is concerned with methodology for optimizing investment decisions in conditions of uncertainty and risk. The subject area of the study relates, first of all, to real investment. The problem of modeling an optimal investment decision is treated as a multicriterial task. The constructive part of the publication rests on the position that the multicriteriality of investment-planning objectives results, first, from the complex nature of the category of economic attractiveness (efficiency) of real investment and, second, from the need to take into account the risk factor, which is a vector measure, when preparing an investment decision. An attempt has been made to develop an instrumentarium for optimizing investment decisions in a situation of uncertainty and the risk it engenders, based on roll-up of the local criteria. As a result, a model has been proposed whose advantage is that it takes into account, to a greater extent than standardized roll-up options do, the substantive and formal features of the local (detailed) criteria.
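The simplest roll-up of local criteria is an additive weighted sum; the article's construction is more elaborate, but the basic mechanics look like this (scores and weights are invented for illustration):

```python
def roll_up(scores, weights):
    """Additive roll-up of local criteria into one scalar index.
    Scores are assumed normalised to [0, 1]; weights sum to 1."""
    assert abs(sum(weights) - 1.0) < 1e-9
    return sum(w * s for w, s in zip(weights, scores))

# Two hypothetical projects scored on (return, risk-avoidance, liquidity):
a = roll_up([0.8, 0.4, 0.6], [0.5, 0.3, 0.2])
b = roll_up([0.6, 0.7, 0.5], [0.5, 0.3, 0.2])
best = "A" if a > b else "B"
```

The weight on the risk-avoidance criterion is where the vector risk measure enters: each component of risk would get its own local criterion and weight.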

  7. Risk Analysis for Road Tunnels – A Metamodel to Efficiently Integrate Complex Fire Scenarios

    DEFF Research Database (Denmark)

    Berchtold, Florian; Knaust, Christian; Arnold, Lukas

    2018-01-01

    Fires in road tunnels constitute complex scenarios with interactions between the fire, tunnel users and safety measures. More and more methodologies for risk analysis quantify the consequences of these scenarios with complex models. Examples for complex models are the computational fluid dynamics...... complex scenarios in risk analysis. To face this challenge, we improved the metamodel used in the methodology for risk analysis presented on ISTSS 2016. In general, a metamodel quickly interpolates the consequences of few scenarios simulated with the complex models to a large number of arbitrary scenarios...... used in risk analysis. Now, our metamodel consists of the projection array-based design, the moving least squares method, and the prediction interval to quantify the metamodel uncertainty. Additionally, we adapted the projection array-based design in two ways: the focus of the sequential refinement...
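A metamodel interpolates consequences from a few expensive simulations to many arbitrary scenarios. A degree-zero sketch of moving least squares (i.e. locally weighted averaging with Gaussian weights) conveys the idea; the paper's metamodel adds a projection array-based design and prediction intervals:

```python
import math

def mls_predict(x0, xs, ys, h=0.5):
    """Moving least squares, degree 0: a weighted mean of known
    simulation results, with Gaussian weights centred on the query
    point x0 and bandwidth h (both values here are illustrative)."""
    ws = [math.exp(-((x - x0) / h) ** 2) for x in xs]
    return sum(w * y for w, y in zip(ws, ys)) / sum(ws)

# A few hypothetical 'complex model' runs: scenario parameter -> consequence.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [0.0, 1.0, 4.0, 9.0]
y_hat = mls_predict(1.5, xs, ys)   # interpolated consequence at an unsimulated scenario
```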

  8. Scenario-based approach to risk analysis in support of cyber security

    Energy Technology Data Exchange (ETDEWEB)

    Gertman, D. I.; Folkers, R.; Roberts, J. [Idaho National Laboratory, Roberts and Folkers Associates, LLC, Idaho Falls, ID 83404 (United States)

    2006-07-01

    control systems, perpetrators will attempt to control and defeat automation systems, engineering access, control systems and protective systems implemented in today's critical infrastructures. Major systems such as supervisory control and data acquisition (SCADA) systems are likely targets for attack. Not all attack scenarios have the same expected frequency or consequence. The attacks will be directed and structured and thus cannot be characterized as random events when one considers failure probabilities. Attack types differ in their consequence as a function of the probability associated with various sub-events in the presence of specific system configurations. Ideally, a series of generic scenarios can be identified for each of the major critical infrastructure (CI) sectors. A scenario-based approach to risk assessment allows decision makers to place financial and personnel resources in place for attacks that truly matter, e.g. attacks that generate physical and economic damage. The use of scenario-based analysis allows risk reduction goals to be informed by more than consequence analysis alone. The key CI targets used in the present study were identified previously as part of a mid-level consequence analysis performed at INL by the Control System Security Program (CSSP) for the National Cyber Security Div. (NCSD) of the Dept. of Homeland Security (DHS). This paper discusses the process for and results associated with the development of scenario-based cyber attacks upon control systems, including the information and personnel requirements for scenario development. Challenges to scenario development, including completeness and uncertainty characterization, are discussed as well. Further, the scenario discussed herein is one of a number of scenarios for infrastructures currently under review. (authors)
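The scenario-based screening described here reduces, at its simplest, to likelihood-times-consequence per scenario. All scenario names and numbers below are hypothetical, and for directed attacks the 'likelihood' is an assessed figure, not a component failure rate:

```python
def scenario_risk(scenarios):
    """Risk per scenario = assessed likelihood x consequence;
    total portfolio risk is the sum over scenarios."""
    per = {name: f * c for name, (f, c) in scenarios.items()}
    return per, sum(per.values())

scenarios = {  # hypothetical SCADA attack scenarios: (likelihood/yr, damage $M)
    "hmi_takeover":   (0.02, 50.0),
    "plc_logic_bomb": (0.006, 200.0),
    "data_spoofing":  (0.10, 5.0),
}
per_scenario, total = scenario_risk(scenarios)
worst = max(per_scenario, key=per_scenario.get)   # where to place resources first
```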

  9. Scenario-based approach to risk analysis in support of cyber security

    International Nuclear Information System (INIS)

    Gertman, D. I.; Folkers, R.; Roberts, J.

    2006-01-01

    control systems, perpetrators will attempt to control and defeat automation systems, engineering access, control systems and protective systems implemented in today's critical infrastructures. Major systems such as supervisory control and data acquisition (SCADA) systems are likely targets for attack. Not all attack scenarios have the same expected frequency or consequence. The attacks will be directed and structured and thus cannot be characterized as random events when one considers failure probabilities. Attack types differ in their consequence as a function of the probability associated with various sub-events in the presence of specific system configurations. Ideally, a series of generic scenarios can be identified for each of the major critical infrastructure (CI) sectors. A scenario-based approach to risk assessment allows decision makers to place financial and personnel resources in place for attacks that truly matter, e.g. attacks that generate physical and economic damage. The use of scenario-based analysis allows risk reduction goals to be informed by more than consequence analysis alone. The key CI targets used in the present study were identified previously as part of a mid-level consequence analysis performed at INL by the Control System Security Program (CSSP) for the National Cyber Security Div. (NCSD) of the Dept. of Homeland Security (DHS). This paper discusses the process for and results associated with the development of scenario-based cyber attacks upon control systems, including the information and personnel requirements for scenario development. Challenges to scenario development, including completeness and uncertainty characterization, are discussed as well. Further, the scenario discussed herein is one of a number of scenarios for infrastructures currently under review. (authors)

  10. Erha Uncertainty Analysis: Planning for the future

    International Nuclear Information System (INIS)

    Brami, T.R.; Hopkins, D.F.; Loguer, W.L.; Cornagia, D.M.; Braisted, A.W.C.

    2002-01-01

    The Erha field (OPL 209) was discovered in 1999 approximately 100 km off the coast of Nigeria in 1,100 m of water. The discovery well (Erha-1) encountered oil and gas in deep-water clastic reservoirs. The first appraisal well (Erha-2), drilled 1.6 km downdip to the northwest, penetrated an oil-water contact and confirmed a potentially commercial discovery. However, the Erha-3 and Erha-3 ST-1 boreholes, drilled on the faulted east side of the field in 2001, encountered shallower fluid contacts. As a result of these findings, a comprehensive field-wide uncertainty analysis was performed to better understand what we know versus what we think regarding resource size and economic viability. The uncertainty analysis process applied at Erha is an integrated scenario-based probabilistic approach to model resources and reserves. Its goal is to provide quantitative results for a variety of scenarios, thus allowing identification of and focus on critical controls (the variables that are likely to impose the greatest influence). The initial focus at Erha was to incorporate the observed fluid contacts and to develop potential scenarios that included the range of possibilities in unpenetrated portions of the field. Four potential compartmentalization scenarios were hypothesized. The uncertainty model combines these scenarios with reservoir parameters and their plausible ranges. Input data come from multiple sources including wells, 3D seismic, reservoir flow simulation, geochemistry, fault-seal analysis, sequence stratigraphic analysis, and analogs. Once created, the model is sampled using Monte Carlo techniques to create probability density functions for a variety of variables, including oil in place and recoverable reserves. Results of the uncertainty analysis support that, despite a thinner oil column on the faulted east side of the field, Erha is an economically attractive opportunity. 
Further, the results have been used to develop data acquisition plans and mitigation strategies that
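The Monte Carlo step in this workflow is a standard volumetric oil-in-place calculation sampled over parameter ranges. All ranges below are invented; the 7758 bbl/acre-ft conversion is the standard field-unit factor:

```python
import random

def sample_ooip(rng):
    """One volumetric draw: OOIP = 7758 * A * h * phi * (1 - Sw) / Bo,
    in MMbbl. All parameter ranges are hypothetical stand-ins."""
    area = rng.uniform(4000, 6000)        # acres
    thick = rng.triangular(50, 150, 90)   # ft net pay (low, high, mode)
    phi = rng.uniform(0.18, 0.28)         # porosity
    sw = rng.uniform(0.20, 0.35)          # water saturation
    bo = 1.3                              # formation volume factor
    return 7758 * area * thick * phi * (1 - sw) / bo / 1e6

rng = random.Random(7)
draws = sorted(sample_ooip(rng) for _ in range(5000))
# 10th/50th/90th percentiles of the in-place distribution:
p10, p50, p90 = draws[500], draws[2500], draws[4500]
```

In practice each compartmentalization scenario would get its own parameter set, with scenario probabilities mixing the resulting distributions.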

  11. Bayesian risk-based decision method for model validation under uncertainty

    International Nuclear Information System (INIS)

    Jiang Xiaomo; Mahadevan, Sankaran

    2007-01-01

    This paper develops a decision-making methodology for computational model validation, considering the risk of using the current model, data support for the current model, and cost of acquiring new information to improve the model. A Bayesian decision theory-based method is developed for this purpose, using a likelihood ratio as the validation metric for model assessment. An expected risk or cost function is defined as a function of the decision costs, and the likelihood and prior of each hypothesis. The risk is minimized through correctly assigning experimental data to two decision regions based on the comparison of the likelihood ratio with a decision threshold. A Bayesian validation metric is derived based on the risk minimization criterion. Two types of validation tests are considered: pass/fail tests and system response value measurement tests. The methodology is illustrated for the validation of reliability prediction models in a tension bar and an engine blade subjected to high cycle fatigue. The proposed method can effectively integrate optimal experimental design into model validation to simultaneously reduce the cost and improve the accuracy of reliability model assessment
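A stripped-down version of the likelihood-ratio validation metric can be sketched as follows. The Gaussian likelihoods, the data-mean alternative hypothesis, and the threshold of 0.5 are simplifying assumptions; in the paper the threshold follows from decision costs and priors:

```python
import math
import statistics

def likelihood(data, mu, sigma):
    """Joint Gaussian likelihood of the data under mean mu."""
    return math.prod(
        math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))
        for x in data)

def validate(data, prediction, sigma=1.0, threshold=0.5):
    """Accept the model if the likelihood ratio L(H0)/L(H1) exceeds the
    decision threshold; H1 centres the density on the data mean."""
    l0 = likelihood(data, prediction, sigma)
    l1 = likelihood(data, statistics.mean(data), sigma)
    ratio = l0 / l1
    return ratio, ratio >= threshold

ratio, accepted = validate([9.8, 10.4, 10.1], prediction=10.0, sigma=0.5)
```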

  12. Cost uncertainty for different levels of technology maturity

    International Nuclear Information System (INIS)

    DeMuth, S.F.; Franklin, A.L.

    1996-01-01

    It is difficult at best to apply a single methodology for estimating cost uncertainties related to technologies of differing maturity. While highly mature technologies may have significant performance and manufacturing cost data available, less well developed technologies may be defined in only conceptual terms. Regardless of the degree of technical maturity, often a cost estimate relating to application of the technology may be required to justify continued funding for development. Yet, a cost estimate without its associated uncertainty lacks the information required to assess the economic risk. For this reason, it is important for the developer to provide some type of uncertainty along with a cost estimate. This study demonstrates how different methodologies for estimating uncertainties can be applied to cost estimates for technologies of different maturities. For a less well developed technology an uncertainty analysis of the cost estimate can be based on a sensitivity analysis, whereas an uncertainty analysis of the cost estimate for a well developed technology can be based on an error propagation technique from classical statistics. It was decided to demonstrate these uncertainty estimation techniques with (1) an investigation of the additional cost of remediation due to beyond baseline, nearly complete, waste heel retrieval from underground storage tanks (USTs) at Hanford; and (2) the cost related to the use of crystalline silicotitanate (CST) rather than the baseline CS100 ion exchange resin for cesium separation from UST waste at Hanford
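For the mature-technology case, classical first-order error propagation looks like this for a cost of the form C = unit cost x quantity (the input values are invented; a sensitivity analysis for an immature technology would instead perturb each input one at a time):

```python
import math

def propagate_cost(u, su, q, sq):
    """First-order error propagation for C = u * q with independent
    uncertainties: sC^2 = (q*su)^2 + (u*sq)^2 (partial derivatives
    of C with respect to each input, summed in quadrature)."""
    c = u * q
    sc = math.sqrt((q * su) ** 2 + (u * sq) ** 2)
    return c, sc

# hypothetical: $120/unit +/- $10, 50 units +/- 4
cost, sigma = propagate_cost(120.0, 10.0, 50.0, 4.0)
```

Equivalently, the relative uncertainties combine in quadrature: sqrt((10/120)^2 + (4/50)^2) of the central cost.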

  13. Development of Property Models with Uncertainty Estimate for Process Design under Uncertainty

    DEFF Research Database (Denmark)

    Hukkerikar, Amol; Sarup, Bent; Abildskov, Jens

    more reliable predictions with a new and improved set of model parameters for GC (group contribution) based and CI (atom connectivity index) based models and to quantify the uncertainties in the estimated property values from a process design point-of-view. This includes: (i) parameter estimation using....... The comparison of model prediction uncertainties with reported range of measurement uncertainties is presented for the properties with related available data. The application of the developed methodology to quantify the effect of these uncertainties on the design of different unit operations (distillation column......, the developed methodology can be used to quantify the sensitivity of process design to uncertainties in property estimates; obtain rationally the risk/safety factors in process design; and identify additional experimentation needs in order to reduce most critical uncertainties....

  14. Best-estimate analysis and decision making under uncertainty

    International Nuclear Information System (INIS)

    Orechwa, Y.

    2004-01-01

    In many engineering analyses of system safety the traditional reliance on conservative evaluation model calculations is being replaced with so-called best-estimate analysis. These best-estimate analyses differentiate themselves from the traditional conservative analyses through two ingredients, namely realistic models and an account of the residual uncertainty associated with the model calculations. Best-estimate analysis, in the context of this paper, refers to the numerical evaluation of system properties of interest in situations where direct confirmatory measurements are not feasible. A decision with regard to the safety of the system is then made based on the computed numerical values of the system properties of interest. These situations generally arise in the design of systems that require computed and generally nontrivial extrapolations from the available data. In the case of nuclear reactors, examples are criticality of spent fuel pools, neutronic parameters of new advanced designs where insufficient material is available for mockup critical experiments, and the large-break loss of coolant accident (LOCA). In this paper the LOCA case is taken to discuss best-estimate analysis and decision making. Central to decision making is information. Thus, of interest is the source, quantity and quality of the information obtained in a best-estimate analysis, and used to define the acceptance criteria and to formulate a decision rule. This in effect expands the problem from the calculation of a conservative margin to a predefined acceptance criterion, to the formulation of a consistent decision rule and the computation of a test statistic for application of the decision rule. The latter view is a necessary condition for developing risk informed decision rules, and, thus, the relation between design basis analysis criteria and probabilistic risk assessment criteria is key. The discussion is in the context of making a decision under uncertainty for a reactor
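A common test statistic in best-estimate-plus-uncertainty LOCA analysis (though the paper discusses decision rules more broadly) is a Wilks-type nonparametric tolerance limit, where the number of code runs is chosen so the sample maximum bounds a population quantile at a given confidence:

```python
import random

def wilks_sample_size(gamma=0.95, beta=0.95):
    """Smallest n such that the sample maximum is a one-sided upper
    tolerance bound covering fraction gamma of the population with
    confidence beta: 1 - gamma**n >= beta."""
    n = 1
    while 1 - gamma ** n < beta:
        n += 1
    return n

n = wilks_sample_size()   # classic 95/95 result: 59 runs
# Hypothetical best-estimate code outputs, e.g. peak clad temperatures (K):
rng = random.Random(3)
bound = max(rng.gauss(1200.0, 60.0) for _ in range(n))
```

The decision rule then compares `bound` against the acceptance criterion rather than a conservatively biased point calculation.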

  15. Optimization of FRAP uncertainty analysis option

    International Nuclear Information System (INIS)

    Peck, S.O.

    1979-10-01

    The automated uncertainty analysis option that has been incorporated in the FRAP codes (FRAP-T5 and FRAPCON-2) provides the user with a means of obtaining uncertainty bands on code-predicted variables at user-selected times during a fuel pin analysis. These uncertainty bands are obtained by multiple single fuel pin analyses to generate data which can then be analyzed by second-order statistical error propagation techniques. In this process, a considerable amount of data is generated and stored on tape. The user has certain choices to make regarding which independent variables are to be used in the analysis and what order of error propagation equation should be used in modeling the output response. To aid the user in these decisions, a computer program, ANALYZ, has been written and added to the uncertainty analysis option package. A variety of considerations involved in fitting response surface equations and certain pitfalls of which the user should be aware are discussed. An equation is derived expressing a residual as a function of a fitted model and an assumed true model. A variety of experimental design choices are discussed, including the advantages and disadvantages of each approach. Finally, a description of the subcodes which constitute program ANALYZ is provided
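The core of the response-surface step is an ordinary least-squares fit of a low-order polynomial to the code outputs, after which residuals reveal lack of fit. A one-input, second-order sketch (data invented; the FRAP machinery handles many inputs and the error propagation on top):

```python
def fit_quadratic(xs, ys):
    """Least-squares fit of y ~ a + b*x + c*x^2 via the normal
    equations for the basis [1, x, x^2], solved by Gaussian
    elimination with partial pivoting."""
    s = [sum(x ** k for x in xs) for k in range(5)]
    A = [[s[0], s[1], s[2]], [s[1], s[2], s[3]], [s[2], s[3], s[4]]]
    b = [sum(y * x ** k for x, y in zip(xs, ys)) for k in range(3)]
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(A[r][i]))
        A[i], A[p] = A[p], A[i]
        b[i], b[p] = b[p], b[i]
        for r in range(i + 1, 3):
            f = A[r][i] / A[i][i]
            for c in range(i, 3):
                A[r][c] -= f * A[i][c]
            b[r] -= f * b[i]
    coef = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):
        coef[i] = (b[i] - sum(A[i][c] * coef[c] for c in range(i + 1, 3))) / A[i][i]
    return coef

xs = [0.0, 0.5, 1.0, 1.5, 2.0]
ys = [1.0, 1.75, 3.0, 4.75, 7.0]          # exactly y = 1 + x + x^2
a, b_, c = fit_quadratic(xs, ys)
residuals = [y - (a + b_ * x + c * x * x) for x, y in zip(xs, ys)]
```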

  16. Communicating uncertainty in cost-benefit analysis : A cognitive psychological perspective

    NARCIS (Netherlands)

    Mouter, N.; Holleman, M.; Calvert, S.C.; Annema, J.A.

    2013-01-01

    Based on a cognitive psychological theory, this paper aims to improve the communication of uncertainty in Cost-Benefit Analysis. The theory is based on different cognitive-personality and cognitive-social psychological constructs that may help explain individual differences in the processing of

  17. A nonparametric approach to medical survival data: Uncertainty in the context of risk in mortality analysis

    International Nuclear Information System (INIS)

    Janurová, Kateřina; Briš, Radim

    2014-01-01

    Medical survival right-censored data of about 850 patients are evaluated to analyze the uncertainty related to the risk of mortality on one hand, and to compare two basic surgery techniques in the context of risk of mortality on the other hand. The colorectal data come from patients who underwent colectomy in the University Hospital of Ostrava. Two basic operating techniques are used for the colectomy: either traditional (open) or minimally invasive (laparoscopic). The basic question arising at colectomy is which type of operation to choose to guarantee longer overall survival time. Two nonparametric approaches have been used to quantify the probability of mortality with uncertainties. In fact, the complement of the probability to one, i.e. the survival function with corresponding confidence levels, is calculated and evaluated. The first approach considers standard nonparametric estimators resulting from both the Kaplan-Meier estimator of the survival function in connection with Greenwood's formula and the Nelson-Aalen estimator of the cumulative hazard function, including a confidence interval for the survival function as well. The second, innovative approach, represented by Nonparametric Predictive Inference (NPI), uses lower and upper probabilities for quantifying uncertainty and provides a model of the predictive survival function instead of the population survival function. The traditional log-rank test on one hand and the nonparametric predictive comparison of two groups of lifetime data on the other hand have been compared to evaluate risk of mortality in the context of the mentioned surgery techniques. The size of the difference between the two groups of lifetime data has been considered and analyzed as well. Both nonparametric approaches led to the same conclusion, that the minimally invasive operating technique guarantees the patient significantly longer survival time in comparison with the traditional operating technique
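The first approach in this record combines the Kaplan-Meier estimator with Greenwood's variance formula. A minimal sketch on invented toy data (one curve; each surgery group would get its own; tied event times are ignored for simplicity):

```python
import math

def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate with Greenwood standard error.
    times: follow-up times; events: 1 = death, 0 = right-censored.
    Returns (time, S(t), se) at each event time."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    s, green = 1.0, 0.0
    curve = []
    for i in order:
        if events[i]:
            s *= (at_risk - 1) / at_risk                 # KM product term
            green += 1 / (at_risk * (at_risk - 1))       # Greenwood sum
            curve.append((times[i], s, s * math.sqrt(green)))
        at_risk -= 1
    return curve

# hypothetical follow-up in months; 0 marks a censored patient
curve = kaplan_meier([2, 3, 4, 5, 8, 12], [1, 0, 1, 1, 0, 0])
```

The censored patients at 3 and 8 months leave the risk set without contributing a drop in the curve, which is exactly what distinguishes this from a naive empirical survival fraction.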

  18. Quantitative Analysis of Uncertainty in Medical Reporting: Creating a Standardized and Objective Methodology.

    Science.gov (United States)

    Reiner, Bruce I

    2018-04-01

    Uncertainty in text-based medical reports has long been recognized as problematic, frequently resulting in misunderstanding and miscommunication. One strategy for addressing the negative clinical ramifications of report uncertainty would be the creation of a standardized methodology for characterizing and quantifying uncertainty language, which could provide both the report author and reader with context related to the perceived level of diagnostic confidence and accuracy. A number of computerized strategies could be employed in the creation of this analysis, including string search, natural language processing and understanding, histogram analysis, topic modeling, and machine learning. The derived uncertainty data offer the potential to objectively analyze report uncertainty in real time and correlate it with outcomes analysis for the purpose of context- and user-specific decision support at the point of care, where intervention would have the greatest clinical impact.

  19. Uncertainties in human health risk assessment of environmental contaminants: A review and perspective.

    Science.gov (United States)

    Dong, Zhaomin; Liu, Yanju; Duan, Luchun; Bekele, Dawit; Naidu, Ravi

    2015-12-01

    Addressing uncertainties in human health risk assessment is a critical issue when evaluating the effects of contaminants on public health. A range of uncertainties exist through the source-to-outcome continuum, including exposure assessment, hazard and risk characterisation. While various strategies have been applied to characterising uncertainty, classical approaches largely rely on how to maximise the available resources. Expert judgement, defaults and tools for characterising quantitative uncertainty attempt to fill the gap between data and regulatory requirements. The experience of researching 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD) illustrates uncertainty sources and how to maximise available information to determine uncertainties, thereby providing 'adequate' protection against contaminant exposure. As regulatory requirements and recurring issues increase, the assessment of complex scenarios involving a large number of chemicals requires more sophisticated tools. Recent advances in exposure and toxicology science provide a large data set for environmental contaminants and public health. In particular, biomonitoring information, in vitro data streams and computational toxicology are the crucial factors in the NexGen risk assessment, as well as in uncertainty minimisation. Although in this review we cannot yet predict how exposure science and modern toxicology will develop in the long term, current techniques from emerging science can be integrated to improve decision-making. Copyright © 2015 Elsevier Ltd. All rights reserved.

  20. Impact of Climate Change. Policy Uncertainty in Power Investment

    International Nuclear Information System (INIS)

    Blyth, W.; Yang, M.

    2006-10-01

    Climate change policies are being introduced or actively considered in all IEA member countries, changing the investment conditions and technology choices in the energy sector. Many of these policies are at a formative stage, and policy uncertainty is currently high. The objective of this paper is to quantify the impacts of climate change policy on power investment. We use a Real Options Analysis approach and model uncertain carbon and fuel prices as stochastic variables. The analysis compares the effects of climate policy uncertainty with fuel price uncertainty, showing the relative importance of these sources of risk for different technologies. This paper considers views on the importance of climate policy risk, how it is managed, and how it might affect investment behaviour. The implications for policymakers are analyzed, allowing the key messages to be transferred into policy design decisions. We found that in many cases, the dominant risks facing base-load generation investment decisions will be market risks associated with electricity and fuel prices. However, under certain conditions and for some technologies, climate policy uncertainty can be an important risk factor, creating an incentive to delay investment and raising investment thresholds. This paper concludes that government climate change policies to promote investment in low-carbon technologies should aim to overcome this incentive to delay by sending long-term investment signals backed up by strengthened international policy action to enhance domestic policy credibility
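The incentive to delay can be seen in a deliberately tiny two-outcome example (all figures invented, no discounting, not the paper's stochastic model): committing now locks in the expected carbon cost, while waiting lets the investor avoid the bad outcome:

```python
def npv(carbon, capex=100.0, margin=20.0, years=10):
    """Toy project NPV: fixed annual margin eroded by a carbon cost,
    undiscounted for simplicity."""
    return years * (margin - carbon) - capex

def option_to_delay(carbon_up, carbon_down, p_up=0.5):
    """Compare investing now (at the expected carbon price) with
    waiting for the policy to resolve, then investing only if NPV > 0."""
    expected_carbon = p_up * carbon_up + (1 - p_up) * carbon_down
    now = npv(expected_carbon)
    wait = (p_up * max(npv(carbon_up), 0.0)
            + (1 - p_up) * max(npv(carbon_down), 0.0))
    return now, wait

now, wait = option_to_delay(carbon_up=15.0, carbon_down=5.0)
```

Here policy uncertainty alone makes waiting more valuable than investing, which is exactly the investment-threshold effect the paper quantifies with proper real-options machinery.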

  1. Advanced risk analysis of systems endangered by ESD

    International Nuclear Information System (INIS)

    Kiss, Istvan; Szedenik, Norbert; Nemeth, Balint; Gulyas, Attila; Berta, Istvan

    2008-01-01

    Even today, evaluation of industrial processes to determine the risk of fire or explosion caused by electrostatic discharge (ESD) is qualitative in most cases. Although qualitative analysis significantly helps to make an industrial process safer, it is based on a survey of the process and is strongly subjective, depending on the estimation of an expert. Fault tree analysis is a traditional method to quantify the risk; it helps to select optimal protection. However, determination of the top event, secondary events and basic events of the fault tree is difficult, especially the quantification of the probabilities of the basic events. In several cases no statistical information is available for most of the events. Using fuzzy membership functions instead of simple numbers for the quantification of probabilities makes it possible to take this uncertainty into consideration. Fuzzy-logic-based fault tree analyses of chemical processes were performed to determine the effect of basic events on the probability of the top event (explosion or fire) and its reliability.
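With triangular fuzzy probabilities (low, mode, high), the AND and OR gates of a fault tree can be propagated componentwise, because both gate functions are monotone in each argument. The event structure and all expert estimates below are hypothetical:

```python
def f_and(a, b):
    """AND gate for independent events with triangular fuzzy
    probabilities (l, m, u): p = pa * pb, combined componentwise."""
    return tuple(x * y for x, y in zip(a, b))

def f_or(a, b):
    """OR gate: p = 1 - (1 - pa)(1 - pb), also monotone, so the
    triangular endpoints map to endpoints."""
    return tuple(1 - (1 - x) * (1 - y) for x, y in zip(a, b))

charge = (0.01, 0.05, 0.10)     # electrostatic charge build-up
no_bond = (0.001, 0.01, 0.05)   # grounding/bonding failure
hot_surf = (0.001, 0.002, 0.005)  # alternative ignition source
atm = (0.1, 0.2, 0.3)           # explosive atmosphere present

spark = f_and(charge, no_bond)          # incendiary ESD needs both
ignition = f_or(spark, hot_surf)        # any ignition source
top = f_and(ignition, atm)              # fire/explosion top event
```

The spread between `top[0]` and `top[2]` carries the expert-estimation uncertainty through to the top event instead of collapsing it to a single number.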

  2. Assessing the reliability of dose coefficients for exposure to radioiodine by members of the public, accounting for dosimetric and risk model uncertainties.

    Science.gov (United States)

    Puncher, M; Zhang, W; Harrison, J D; Wakeford, R

    2017-06-26

    Assessments of risk to a specific population group resulting from internal exposure to a particular radionuclide can be used to assess the reliability of the appropriate International Commission on Radiological Protection (ICRP) dose coefficients used as a radiation protection device for the specified exposure pathway. An estimate of the uncertainty on the associated risk is important for informing judgments on reliability; a derived uncertainty factor, UF, is an estimate of the 95% probable geometric difference between the best risk estimate and the nominal risk and is a useful tool for making this assessment. This paper describes the application of parameter uncertainty analysis to quantify uncertainties resulting from internal exposures to radioiodine by members of the public, specifically 1-, 10- and 20-year-old females from the population of England and Wales. Best estimates of thyroid cancer incidence risk (lifetime attributable risk) are calculated for ingestion or inhalation of 129I and 131I, accounting for uncertainties in biokinetic model and cancer risk model parameter values. These estimates are compared with the equivalent ICRP derived nominal age-, sex- and population-averaged estimates of excess thyroid cancer incidence to obtain UFs. Derived UF values for ingestion or inhalation of 131I for 1-, 10- and 20-year-olds are around 28, 12 and 6, respectively, when compared with ICRP Publication 103 nominal values, and 9, 7 and 14, respectively, when compared with ICRP Publication 60 values. Broadly similar results were obtained for 129I. The uncertainties on risk estimates are largely determined by uncertainties on risk model parameters rather than uncertainties on biokinetic model parameters. An examination of the sensitivity of the results to the risk models and populations used in the calculations shows variations in the central estimates of risk of a factor of around 2-3. 
It is assumed that the direct proportionality of excess thyroid cancer

  3. Systematic Evaluation of Uncertainty in Material Flow Analysis

    DEFF Research Database (Denmark)

    Laner, David; Rechberger, Helmut; Astrup, Thomas Fruergaard

    2014-01-01

    Material flow analysis (MFA) is a tool to investigate material flows and stocks in defined systems as a basis for resource management or environmental pollution control. Because of the diverse nature of sources and the varying quality and availability of data, MFA results are inherently uncertain....... Uncertainty analyses have received increasing attention in recent MFA studies, but systematic approaches for selection of appropriate uncertainty tools are missing. This article reviews existing literature related to handling of uncertainty in MFA studies and evaluates current practice of uncertainty analysis......) and exploratory MFA (identification of critical parameters and system behavior). Whereas mathematically simpler concepts focusing on data uncertainty characterization are appropriate for descriptive MFAs, statistical approaches enabling more-rigorous evaluation of uncertainty and model sensitivity are needed...

  4. A probabilistic model for deriving soil quality criteria based on secondary poisoning of top predators. I. Model description and uncertainty analysis.

    Science.gov (United States)

    Traas, T P; Luttik, R; Jongbloed, R H

    1996-08-01

    In previous studies, the risk of toxicant accumulation in food chains was used to calculate quality criteria for surface water and soil. A simple algorithm was used to calculate maximum permissible concentrations [MPC = no-observed-effect concentration/bioconcentration factor (NOEC/BCF)]. These studies were limited to simple food chains. This study presents a method to calculate MPCs for more complex food webs of predators, expanding the previous method. First, toxicity data (NOECs) for several compounds were corrected for differences between laboratory animals and animals in the wild. Second, for each compound, these NOECs were assumed to be a sample of a log-logistic distribution of mammalian and avian NOECs. Third, bioaccumulation factors (BAFs) for major food items of predators were collected and assumed to derive from different log-logistic distributions of BAFs. Fourth, MPCs for each compound were calculated using Monte Carlo sampling from the NOEC and BAF distributions. An uncertainty analysis for cadmium was performed to identify the most uncertain parameters of the model. Model analysis indicated that most of the prediction uncertainty of the model can be ascribed to uncertainty in species sensitivity as expressed by NOECs. A very small proportion of model uncertainty is contributed by BAFs from food webs. Correction factors for the conversion of NOECs from laboratory conditions to the field have some influence on the final value of MPC5, but the total prediction uncertainty of the MPC is quite large. It is concluded that the uncertainty in species sensitivity is quite large. To avoid unethical toxicity testing with mammalian or avian predators, using this uncertainty in the proposed method for calculating MPC distributions cannot be avoided. The fifth percentile of the MPC is suggested as a safe value for top predators.
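The Monte Carlo step reduces to sampling NOEC and BAF distributions and taking the fifth percentile of the ratio. Lognormal distributions stand in below for the paper's log-logistic ones, and all parameter values are invented:

```python
import random

def mpc_distribution(n=10000, seed=5):
    """Sample NOEC and BAF and form MPC = NOEC / BAF for each draw
    (units notional, e.g. mg/kg). Lognormal stand-ins for the
    log-logistic distributions used in the paper."""
    rng = random.Random(seed)
    draws = []
    for _ in range(n):
        noec = rng.lognormvariate(2.0, 1.0)   # wide: species-sensitivity spread
        baf = rng.lognormvariate(0.5, 0.3)    # narrower: food-web accumulation
        draws.append(noec / baf)
    return sorted(draws)

draws = mpc_distribution()
mpc5 = draws[len(draws) // 20]   # fifth percentile: the protective criterion
```

Because the NOEC spread dominates, shrinking the BAF uncertainty barely moves `mpc5`, mirroring the paper's finding that species sensitivity drives the prediction uncertainty.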

  5. Quantification of margins and uncertainties: Alternative representations of epistemic uncertainty

    International Nuclear Information System (INIS)

    Helton, Jon C.; Johnson, Jay D.

    2011-01-01

    In 2001, the National Nuclear Security Administration of the U.S. Department of Energy in conjunction with the national security laboratories (i.e., Los Alamos National Laboratory, Lawrence Livermore National Laboratory and Sandia National Laboratories) initiated development of a process designated Quantification of Margins and Uncertainties (QMU) for the use of risk assessment methodologies in the certification of the reliability and safety of the nation's nuclear weapons stockpile. A previous presentation, 'Quantification of Margins and Uncertainties: Conceptual and Computational Basis,' describes the basic ideas that underlie QMU and illustrates these ideas with two notional examples that employ probability for the representation of aleatory and epistemic uncertainty. The current presentation introduces and illustrates the use of interval analysis, possibility theory and evidence theory as alternatives to the use of probability theory for the representation of epistemic uncertainty in QMU-type analyses. The following topics are considered: the mathematical structure of alternative representations of uncertainty, alternative representations of epistemic uncertainty in QMU analyses involving only epistemic uncertainty, and alternative representations of epistemic uncertainty in QMU analyses involving a separation of aleatory and epistemic uncertainty. Analyses involving interval analysis, possibility theory and evidence theory are illustrated with the same two notional examples used in the presentation indicated above to illustrate the use of probability to represent aleatory and epistemic uncertainty in QMU analyses.

  6. Stochastic fuzzy environmental risk characterization of uncertainty and variability in risk assessments: A case study of polycyclic aromatic hydrocarbons in soil at a petroleum-contaminated site in China

    International Nuclear Information System (INIS)

    Hu, Yan; Wang, Zesen; Wen, Jingya; Li, Yu

    2016-01-01

    Highlights: • Deals with the absence of environmental quality guidelines in risk characterization. • Quantitative representation of uncertainty from environmental quality guidelines. • Quantitative representation of variability in contaminant exposure concentrations. • Establishment of a stochastic fuzzy environmental risk characterization framework. - Abstract: Better decisions are made using risk assessment models when uncertainty and variability are explicitly acknowledged. Uncertainty caused by a lack of uniform and scientifically supported environmental quality guidelines, and variability in the degree of exposure of environmental systems to contaminants, are here incorporated in a stochastic fuzzy environmental risk characterization (SFERC) approach. The approach is based on quotient probability distribution and environmental risk level fuzzy membership function methods. The SFERC framework was used to characterize the environmental risks posed by 16 priority polycyclic aromatic hydrocarbons (PAHs) in soil at a typical petroleum-contaminated site in China, integrating data from the literature and from field and laboratory experiments. The environmental risk levels posed by the PAHs under four risk scenarios were determined using the SFERC approach, applying “residential land” and “industrial land” environmental quality guidelines under “loose” and “strict” strictness parameters. The results showed that environmental risks posed by PAHs in soil are primarily caused by oil exploitation, traffic emissions, and coal combustion. The SFERC approach is an effective tool for characterizing uncertainty and variability in environmental risk assessments and for managing contaminated sites.

  7. Stochastic fuzzy environmental risk characterization of uncertainty and variability in risk assessments: A case study of polycyclic aromatic hydrocarbons in soil at a petroleum-contaminated site in China

    Energy Technology Data Exchange (ETDEWEB)

    Hu, Yan [MOE Key Laboratory of Regional Energy Systems Optimization, Resources and Environmental Research Academy, North China Electric Power University, Beijing 102206 (China); State Key Laboratory of Environmental Criteria and Risk Assessment, Chinese Research Academy of Environment Sciences, Beijing 100012 (China); Wang, Zesen [MOE Key Laboratory of Regional Energy Systems Optimization, Resources and Environmental Research Academy, North China Electric Power University, Beijing 102206 (China); Wen, Jingya [MOE Key Laboratory of Regional Energy Systems Optimization, Resources and Environmental Research Academy, North China Electric Power University, Beijing 102206 (China); Institute of Hydropower and Environment Research, Beijing 100012 (China); Li, Yu, E-mail: liyuxx8@hotmail.com [MOE Key Laboratory of Regional Energy Systems Optimization, Resources and Environmental Research Academy, North China Electric Power University, Beijing 102206 (China)

    2016-10-05

    Highlights: • Deals with the absence of environmental quality guidelines in risk characterization. • Quantitative representation of uncertainty from environmental quality guidelines. • Quantitative representation of variability in contaminant exposure concentrations. • Establishment of a stochastic fuzzy environmental risk characterization framework. - Abstract: Better decisions are made using risk assessment models when uncertainty and variability are explicitly acknowledged. Uncertainty caused by a lack of uniform and scientifically supported environmental quality guidelines, and variability in the degree of exposure of environmental systems to contaminants, are here incorporated in a stochastic fuzzy environmental risk characterization (SFERC) approach. The approach is based on quotient probability distribution and environmental risk level fuzzy membership function methods. The SFERC framework was used to characterize the environmental risks posed by 16 priority polycyclic aromatic hydrocarbons (PAHs) in soil at a typical petroleum-contaminated site in China, integrating data from the literature and from field and laboratory experiments. The environmental risk levels posed by the PAHs under four risk scenarios were determined using the SFERC approach, applying “residential land” and “industrial land” environmental quality guidelines under “loose” and “strict” strictness parameters. The results showed that environmental risks posed by PAHs in soil are primarily caused by oil exploitation, traffic emissions, and coal combustion. The SFERC approach is an effective tool for characterizing uncertainty and variability in environmental risk assessments and for managing contaminated sites.

  8. Risk assessments of regional climate change over Europe: generation of a probabilistic ensemble and uncertainty assessment for EURO-CORDEX

    Science.gov (United States)

    Yuan, J.; Kopp, R. E.

    2017-12-01

    Quantitative risk analysis of regional climate change is crucial for risk management and for impact assessment of climate change. Two major challenges arise in assessing the risks of climate change: the CMIP5 model runs, which drive the EURO-CORDEX downscaling runs, do not cover the full range of uncertainty of future projections; and climate models may underestimate the probability of tail risks (i.e., extreme events). To overcome these difficulties, this study offers a viable avenue in which a set of probabilistic climate ensembles is generated using the Surrogate/Model Mixed Ensemble (SMME) method. The probabilistic ensembles for temperature and precipitation are used to assess the range of uncertainty covered by five bias-corrected simulations from the high-resolution (0.11°) EURO-CORDEX database, selected by the PESETA (Projection of Economic impacts of climate change in Sectors of the European Union based on bottom-up Analysis) III project. Results show that the distribution of the SMME ensemble is notably wider than both the distribution of the raw ensemble of GCMs and the spread of the five EURO-CORDEX runs under RCP8.5, and that tail risks are well represented by the SMME ensemble. Both the SMME ensemble and the EURO-CORDEX projections are aggregated to administrative level and integrated into the impact functions of PESETA III to assess climate risks in Europe. To further evaluate the uncertainties introduced by the downscaling process, we compare the five runs from EURO-CORDEX with runs from the corresponding GCMs. Time series of regional means, spatial patterns, and climate indices are examined for the future climate (2080-2099) deviating from the present climate (1981-2010). The downscaling processes do not appear to be trend-preserving; e.g., the increase in regional mean temperature from EURO-CORDEX is slower than that from the corresponding GCM. The spatial pattern comparison reveals that the differences between each pair of GCM and EURO-CORDEX are small in winter. In summer, the temperatures of EURO

  9. Uncertainty analysis for hot channel

    International Nuclear Information System (INIS)

    Panka, I.; Kereszturi, A.

    2006-01-01

    The fulfillment of the safety analysis acceptance criteria is usually evaluated by separate hot channel calculations using the results of neutronic and/or thermal-hydraulic system calculations. In the case of an ATWS event (inadvertent withdrawal of a control assembly), the analysis shows that a number of fuel rods experience DNB (departure from nucleate boiling) for a longer time and must be regarded as failed; their number must be determined for a further evaluation of the radiological consequences. In the deterministic approach, the global power history must be multiplied by different hot channel factors (kx) taking into account the radial power peaking factors for each fuel pin. If DNB occurs, a small number of hot channel calculations is performed to determine the limiting kx leading just to DNB and fuel failure (the conservative DNBR limit is 1.33). Knowing the pin power distribution from the core design calculation, the number of failed fuel pins can then be calculated. The above procedure can also be performed with conservative assumptions (e.g., conservative input parameters in the hot channel calculations). In the hot channel uncertainty analysis, the relevant input parameters of the hot channel calculations (kx, mass flow, coolant inlet temperature, pin average burnup, initial gap size, and the selection of the power history influencing the gap conductance value) and the DNBR limit are varied according to their respective uncertainties. An uncertainty analysis methodology was elaborated combining the response surface method with the one-sided tolerance limit method of Wilks. The results of the deterministic and uncertainty hot channel calculations are compared with respect to the number of failed fuel rods, the maximum clad surface temperature, and the maximum fuel temperature. (Authors)
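    The one-sided tolerance limit method of Wilks mentioned above can be sketched as follows; the hot-channel response function and the input distributions are hypothetical stand-ins for the real code, used only to show the mechanics:

```python
import math
import numpy as np

def wilks_runs(percentile=0.95, confidence=0.95):
    """Smallest N with 1 - percentile**N >= confidence (first-order, one-sided)."""
    return math.ceil(math.log(1.0 - confidence) / math.log(percentile))

n = wilks_runs()  # 59 runs for a 95%/95% statement
rng = np.random.default_rng(0)

# Stand-in for the hot-channel code: clad temperature as a function of
# sampled inputs (hot channel factor kx, relative mass flow, inlet temperature)
def clad_temperature(kx, flow, t_inlet):
    return t_inlet + 180.0 * kx / flow

kx = rng.normal(1.15, 0.05, n)
flow = rng.normal(1.00, 0.03, n)
t_inlet = rng.normal(260.0, 2.0, n)

# The maximum over the N runs is the one-sided 95/95 upper tolerance limit
upper_limit = clad_temperature(kx, flow, t_inlet).max()
print(n, round(upper_limit, 1))
```

    The appeal of the method is that N depends only on the percentile and confidence level, not on the number of uncertain input parameters.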

  10. A Benefit-Risk Analysis Approach to Capture Regulatory Decision-Making: Non-Small Cell Lung Cancer.

    Science.gov (United States)

    Raju, G K; Gurumurthi, K; Domike, R; Kazandjian, D; Blumenthal, G; Pazdur, R; Woodcock, J

    2016-12-01

    Drug regulators around the world make decisions about drug approvability based on qualitative benefit-risk analyses, and there is much interest in quantifying regulatory approaches to benefit and risk. In this work, a quantitative benefit-risk analysis was applied to regulatory decision-making about new drugs to treat advanced non-small cell lung cancer (NSCLC). Benefits and risks associated with 20 US Food and Drug Administration (FDA) decisions on a set of candidate treatments submitted between 2003 and 2015 were analyzed. For the benefit analysis, the median overall survival (OS) was used where available; when not available, OS was estimated based on the overall response rate (ORR) or progression-free survival (PFS). Risks were analyzed based on the magnitude (or severity) of harm and the likelihood of occurrence. Additionally, a sensitivity analysis was performed to demonstrate the treatment of systematic uncertainty. The FDA approval decisions considered were found to be consistent with the benefit-risk logic. © 2016 American Society for Clinical Pharmacology and Therapeutics.

  11. Probabilistic Mass Growth Uncertainties

    Science.gov (United States)

    Plumer, Eric; Elliott, Darren

    2013-01-01

    Mass has been widely used as an input parameter in Cost Estimating Relationships (CERs) for space systems. As these space systems progress from early concept studies and drawing boards to the launch pad, their masses tend to grow substantially, adversely affecting a primary input to most cost models. Modeling and predicting mass uncertainty, based on historical and analogous data, is therefore critical and is an integral part of modeling cost risk. This paper presents the results of an ongoing NASA effort to publish mass growth datasheets for adjusting single-point Technical Baseline Estimates (TBE) of the masses of space instruments and spacecraft, for both Earth-orbiting and deep space missions, at various stages of a project's lifecycle. It also discusses the long-term strategy of NASA Headquarters to publish similar results, using a variety of cost-driving metrics, on an annual basis. The paper provides quantitative results showing that mass growth uncertainty decreases as mass estimate maturity increases. The analysis is based on historical data obtained from the NASA Cost Analysis Data Requirements (CADRe) database.

  12. GRAPH THEORY APPROACH TO QUANTIFY UNCERTAINTY OF PERFORMANCE MEASURES

    Directory of Open Access Journals (Sweden)

    Sérgio D. Sousa

    2015-03-01

    In this work, the performance measurement process is studied to quantify the uncertainty induced in the resulting performance measure (PM). To that end, the causes of uncertainty are identified by analysing the activities undertaken in the three stages of the performance measurement process: design and implementation; data collection and recording; and determination and analysis. A quantitative methodology based on graph theory and on the sources of uncertainty in the performance measurement process is used to calculate an uncertainty index that evaluates the level of uncertainty of a given PM or key performance indicator (KPI). An application example is presented. Quantifying PM uncertainty can contribute to better representing the risk associated with a given decision and to improving the PM by increasing its precision and reliability.

  13. Uncertainty assessment of climate change adaptation options in urban flash floods

    DEFF Research Database (Denmark)

    Zhou, Qianqian; Arnbjerg-Nielsen, Karsten

    Adaptation is necessary to cope with the increasing flood risk in cities due to anthropogenic climate change in many regions of the world. The choice of adaptation strategies can and should be based on a comprehensive risk-based economic analysis to indicate the net benefits of proposed options...... presented is based on a flood risk framework that is in accordance with the EU flood directive, but adapted and extended to incorporate anticipated future changes due to city development and hydrologic extremes. The framework is used to study the importance of inherent uncertainties in order to find robust......-effective regardless of the uncertainties from climate change impacts and /or damage estimation procedure when considering the ability to reduce the risk of flooding. The description of the correlation structure between the key inputs proved to be important in order to obtain a correct description of the resulting...

  14. Information Uncertainty to Compare Qualitative Reasoning Security Risk Assessment Results

    Energy Technology Data Exchange (ETDEWEB)

    Chavez, Gregory M [Los Alamos National Laboratory; Key, Brian P [Los Alamos National Laboratory; Zerkle, David K [Los Alamos National Laboratory; Shevitz, Daniel W [Los Alamos National Laboratory

    2009-01-01

    Security risks associated with malevolent acts such as terrorism often lack the historical data required for a traditional PRA. Most information available for conducting security risk assessments of these malevolent acts is obtained from subject matter experts as subjective judgements. Qualitative reasoning approaches such as approximate reasoning and evidential reasoning are useful for modeling the predicted risk from information provided by subject matter experts, but these approaches lack a consistent means to compare security risk assessment results. Associated with each predicted risk reasoning result is a quantifiable amount of information uncertainty, which can be measured and used to compare the results. This paper explores using entropy measures to quantify the information uncertainty associated with conflict and non-specificity in the predicted reasoning results. The measured quantities of conflict and non-specificity can ultimately be used to compare qualitative reasoning results, which is important in triage studies and ultimately in resource allocation. Straightforward extensions of previous entropy measures are presented here to quantify the non-specificity and conflict associated with security risk assessment results obtained from qualitative reasoning models.

  15. The uncertainty analysis of model results a practical guide

    CERN Document Server

    Hofer, Eduard

    2018-01-01

    This book is a practical guide to the uncertainty analysis of computer model applications. Used in many areas, such as engineering, ecology and economics, computer models are subject to various uncertainties at the level of model formulations, parameter values and input data. Naturally, it would be advantageous to know the combined effect of these uncertainties on the model results as well as whether the state of knowledge should be improved in order to reduce the uncertainty of the results most effectively. The book supports decision-makers, model developers and users in their argumentation for an uncertainty analysis and assists them in the interpretation of the analysis results.

  16. Use of Monte Carlo analysis in a risk-based prioritization of toxic constituents in house dust.

    Science.gov (United States)

    Ginsberg, Gary L; Belleggia, Giuliana

    2017-12-01

    Many chemicals have been detected in house dust at levels of potential health concern for the general public, and particularly for young children. House dust is also an indicator of chemicals present in consumer products and the built environment that may constitute a health risk. The current analysis compiles a database of recent house dust concentrations from the United States and Canada, focusing on semi-volatile constituents. Seven constituents from the phthalate and flame retardant categories were selected for risk-based screening and prioritization: diethylhexyl phthalate (DEHP), butyl benzyl phthalate (BBzP), diisononyl phthalate (DINP), a pentabrominated diphenyl ether congener (BDE-99), hexabromocyclododecane (HBCDD), tris(1,3-dichloro-2-propyl) phosphate (TDCIPP) and tris(2-chloroethyl) phosphate (TCEP). Monte Carlo analysis was used to represent the variability in house dust concentration, as well as the uncertainty in the toxicology database, in the estimation of children's exposure and risk. Constituents were prioritized based upon the percentage of the distribution of risk results for cancer and non-cancer endpoints that exceeded a hazard quotient (HQ) of 1. The greatest percent HQ exceedances were for DEHP (cancer and non-cancer), BDE-99 (non-cancer) and TDCIPP (cancer). Current uses and the potential for reducing levels of these constituents in house dust are discussed. Exposure and risk for other phthalates and flame retardants in house dust may increase if they are used as substitutes for these prioritized constituents. Therefore, alternatives assessment and green chemistry solutions are important elements in decreasing children's exposure to chemicals of concern in the indoor environment. Copyright © 2017 Elsevier Ltd. All rights reserved.
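    The hazard-quotient screening idea above can be sketched in a few lines; the exposure and toxicity distributions below are placeholders for illustration, not the data compiled in the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 50_000

# Hypothetical lognormal dust concentration (ug/g) representing variability
conc = rng.lognormal(mean=np.log(200.0), sigma=1.0, size=N)

# Assumed child intake: 50 mg dust/day, 15 kg body weight -> g/(kg-day)
intake_dust = 0.05 / 15.0
dose = conc * intake_dust * 1e-3  # mg/(kg-day)

# Hypothetical lognormal reference dose representing toxicology uncertainty
rfd = rng.lognormal(mean=np.log(0.02), sigma=0.5, size=N)

# Hazard quotient and the fraction of iterations exceeding HQ = 1
hq = dose / rfd
pct_exceed = 100.0 * np.mean(hq > 1.0)
print(f"{pct_exceed:.1f}% of iterations exceed HQ = 1")
```

    Ranking constituents by this exceedance percentage is what drives the prioritization in the study.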

  17. The nexus between geopolitical uncertainty and crude oil markets: An entropy-based wavelet analysis

    Science.gov (United States)

    Uddin, Gazi Salah; Bekiros, Stelios; Ahmed, Ali

    2018-04-01

    The global financial crisis and the subsequent geopolitical turbulence in energy markets have brought increased attention to proper statistical modeling, especially of the crude oil markets. In particular, we utilize a time-frequency decomposition approach based on wavelet analysis to explore the inherent dynamics and the causal interrelationships between various types of geopolitical, economic and financial uncertainty indices and oil markets. Via the introduction of a mixed discrete-continuous multiresolution analysis, we employ the entropic criterion for the selection of the optimal decomposition level of a MODWT, as well as continuous-time coherency and phase measures for the detection of business cycle (a)synchronization. Overall, strong heterogeneity in the revealed interrelationships is detected over time and across scales.

  18. Demonstration of a modelling-based multi-criteria decision analysis procedure for prioritisation of occupational risks from manufactured nanomaterials.

    Science.gov (United States)

    Hristozov, Danail; Zabeo, Alex; Alstrup Jensen, Keld; Gottardo, Stefania; Isigonis, Panagiotis; Maccalman, Laura; Critto, Andrea; Marcomini, Antonio

    2016-11-01

    Several tools to facilitate the risk assessment and management of manufactured nanomaterials (MN) have been developed. Most of them require input data on physicochemical properties, toxicity and scenario-specific exposure information. However, such data are not yet readily available, and tools that can handle data gaps in a structured way to ensure transparent risk analysis for industrial and regulatory decision making are needed. This paper proposes such a quantitative risk prioritisation tool, based on a multi-criteria decision analysis algorithm, which combines advanced exposure and dose-response modelling to calculate margins of exposure (MoE) for a number of MN in order to rank their occupational risks. We demonstrated the tool in a number of workplace exposure scenarios (ES) involving the production and handling of nanoscale titanium dioxide, zinc oxide (ZnO), silver and multi-walled carbon nanotubes. The results of this application showed that bag/bin filling, manual un/loading and dumping of large amounts of dry powders led to high emissions, which resulted in high risk associated with these ES. The ZnO MN revealed considerable hazard potential in vivo, which significantly influenced the risk prioritisation results. To study how variations in the input data affect the results, we performed a probabilistic Monte Carlo sensitivity/uncertainty analysis, which demonstrated that the performance of the proposed model is stable against changes in the exposure and hazard input variables.
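    The margin-of-exposure ranking logic can be illustrated with a toy example; the scenario names and all numbers below are invented for the sketch (a smaller MoE signals a higher-priority risk):

```python
# Hypothetical margins of exposure: MoE = no-effect dose / estimated exposure
# (units cancel; both in mg/(kg-day) here). Values are illustrative only.
scenarios = {
    "ZnO bag filling":       {"noael": 0.5, "exposure": 0.050},
    "TiO2 manual unloading": {"noael": 2.0, "exposure": 0.020},
    "MWCNT dumping":         {"noael": 0.1, "exposure": 0.002},
    "Ag spraying":           {"noael": 1.0, "exposure": 0.001},
}

moe = {name: s["noael"] / s["exposure"] for name, s in scenarios.items()}

# Ascending MoE = descending occupational risk priority
ranked = sorted(moe, key=moe.get)
for name in ranked:
    print(f"{name}: MoE = {moe[name]:.0f}")
```

    In the paper both the numerator and the denominator are distributions produced by dose-response and exposure models, so the ranking is carried out on sampled MoE values rather than point estimates.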

  19. Stochastic Risk and Uncertainty Analysis for Shale Gas Extraction in the Karoo Basin of South Africa

    Directory of Open Access Journals (Sweden)

    Abdon Atangana

    2014-01-01

    We made use of groundwater flow and mass transport equations to investigate the crucial potential risk of water pollution from hydraulic fracturing, especially in the case of the Karoo system in South Africa. This paper shows that the upward migration of fluids depends on the apertures of the cement cracks and of the fractures in the rock formation: the greater the apertures, the quicker the movement of the fluid. We present a novel sampling method combining Monte Carlo and Latin hypercube sampling, which was used for the uncertainty analysis of the apertures in the groundwater flow and mass transport equations. The study reveals that, in the case of the Karoo, fracking will be successful if and only if the upward migration of methane and fracking fluid can be controlled, for example, by plugging the entire fracked reservoir with cement.
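    A basic Latin hypercube sampler of the kind combined with Monte Carlo in the study can be sketched as below; the aperture ranges are hypothetical placeholders, not values from the Karoo analysis:

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, rng):
    """One stratified uniform draw per equal-probability stratum, per dimension."""
    u = (rng.uniform(size=(n_samples, n_dims))
         + np.arange(n_samples)[:, None]) / n_samples
    for d in range(n_dims):
        u[:, d] = rng.permutation(u[:, d])  # decouple strata across dimensions
    return u

rng = np.random.default_rng(7)
n = 1000
u = latin_hypercube(n, 2, rng)

# Map the unit-cube sample to hypothetical log-uniform aperture ranges (metres)
crack_aperture = 1e-5 * 10.0 ** (2.0 * u[:, 0])     # 1e-5 .. 1e-3 m
fracture_aperture = 1e-4 * 10.0 ** (1.0 * u[:, 1])  # 1e-4 .. 1e-3 m

print(crack_aperture.min(), crack_aperture.max())
```

    Compared with plain Monte Carlo, the stratification guarantees that every equal-probability slice of each input range is sampled exactly once, which stabilises estimates for a given number of expensive transport-model runs.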

  20. Scientific uncertainties and climate risks

    International Nuclear Information System (INIS)

    Petit, M.

    2005-01-01

    thus occur. The IPCC reports are meant to be read by policy makers and business managers. However, even if they are convinced, those people cannot act against the will of voters and consumers, so a larger public has to be addressed as well. Meeting this challenge is not so difficult, since those in authority are human beings as well; the real challenge is mutual understanding between specialists and non-specialists. Early in the process of preparing the Third Assessment Report (TAR), a series of recommendations was made to ensure a consistent treatment of the uncertainties in the various chapters of the report and in the various Working Groups, which address the science of climate change, its impacts and its control. For the Fourth Assessment Report, ways of providing assessments of uncertainty that can improve risk analysis are being studied. The proposed approach included convening an IPCC workshop, which took place in Dublin in May 2004. The present issue of Comptes Rendus Geoscience gathers papers that reflect presentations made during this meeting. The papers included in this special issue were organized in a Cartesian way, going from the most theoretical to the most applied; they are published in the reverse order to make the reader's life easier, proceeding from concrete examples to concepts

  1. Synthesis of Enterprise and Value-Based Methods for Multiattribute Risk Analysis

    International Nuclear Information System (INIS)

    Kenley, C. Robert; Collins, John W.; Beck, John M.; Heydt, Harold J.; Garcia, Chad B.

    2001-01-01

    This paper describes a method for performing multiattribute decision analysis to prioritize approaches to handling risks during the development and operation of complex socio-technical systems. The method combines risk categorization based on enterprise views, risk prioritization of the categories based on the Analytic Hierarchy Process (AHP), and more standard probability-consequence rating schemes. We also apply value-based testing methods used in software development to prioritize risk-handling approaches. We describe a tool that synthesizes these methods and performs a multiattribute analysis of the technical and programmatic risks on the Next Generation Nuclear Plant (NGNP) enterprise.
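    The AHP prioritization step can be sketched as follows; the pairwise comparison matrix is a hypothetical example on Saaty's 1-9 scale over three invented risk categories, not taken from the NGNP analysis:

```python
import numpy as np

# Hypothetical pairwise comparisons of three risk categories
# (technical vs programmatic vs operational), Saaty 1-9 scale
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

# AHP priorities = principal right eigenvector, normalised to sum to 1
vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)
w = np.abs(vecs[:, k].real)
w /= w.sum()

# Consistency ratio CR = CI / RI, with Saaty's random index RI = 0.58 for n = 3;
# CR < 0.1 is the usual acceptability threshold for the judgements
ci = (vals.real[k] - len(A)) / (len(A) - 1)
cr = ci / 0.58
print(np.round(w, 3), round(cr, 3))
```

    The resulting weights are then combined with the probability-consequence ratings to score each risk-handling approach across categories.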

  2. Incorporating reliability evaluation into the uncertainty analysis of electricity market price

    International Nuclear Information System (INIS)

    Kang, Chongqing; Bai, Lichao; Xia, Qing; Jiang, Jianjian; Zhao, Jing

    2005-01-01

    A novel model and algorithm for analyzing the uncertainties in an electricity market is proposed in this paper. In this model, the bidding decision is formulated as a probabilistic model that takes into account the decision-maker's willingness to bid, risk preferences, the fluctuation of fuel prices, etc. At the same time, each generating unit's uncertain output is modeled through its forced outage rate (FOR). Based on this model, the uncertainty of the market price is then analyzed. Taking the analytical results into consideration, not only can the reliability of the power system be analyzed conventionally, but the possible distribution of market prices can also be easily obtained. The probability distribution of market prices can further be used to calculate the expected output and the sales income of each generating unit in the market. Based on these results, it is also possible to evaluate the risk borne by generating units. A simple system with four generating units is used to illustrate the proposed algorithm. The proposed algorithm and modeling technique are expected to be helpful to market participants in making their economic decisions
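    The interaction between forced outage rates and an uncertain market price can be sketched with a toy two-unit Monte Carlo example; the capacities, FOR values, and the lognormal price distribution are all hypothetical, and the paper's analytical bidding model is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(3)
N = 100_000

# Hypothetical units: availability sampled from the forced outage rate (FOR)
units = [
    {"capacity": 400.0, "FOR": 0.05},
    {"capacity": 300.0, "FOR": 0.10},
]

# Assumed lognormal market price distribution ($/MWh)
price = rng.lognormal(mean=np.log(40.0), sigma=0.3, size=N)

for u in units:
    available = rng.uniform(size=N) > u["FOR"]      # two-state outage model
    output = np.where(available, u["capacity"], 0.0)  # MW delivered
    expected_income = np.mean(output * price)         # $/h
    print(f'{u["capacity"]:.0f} MW unit: expected income = {expected_income:,.0f} $/h')
```

    With output and price sampled jointly, correlation between system-wide outages and price spikes could also be introduced, which is where the risk evaluation in the paper becomes interesting.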

  3. Uncertainty analysis of the 35% reactor inlet header break in a CANDU 6 reactor using RELAP/SCDAPSIM/MOD4.0 with integrated uncertainty analysis option

    International Nuclear Information System (INIS)

    Dupleac, D.; Perez, M.; Reventos, F.; Allison, C.

    2011-01-01

    The RELAP/SCDAPSIM/MOD4.0 code, designed to predict the behavior of reactor systems during normal and accident conditions, is being developed as part of an international nuclear technology Software Development and Training Program (SDTP). RELAP/SCDAPSIM/MOD4.0, which is the first version of RELAP5 completely rewritten to FORTRAN 90/95/2000 standards, uses the publicly available RELAP5 and SCDAP models in combination with (a) advanced programming and numerical techniques, (b) advanced SDTP-member-developed models for LWR, HWR, and research reactor analysis, and (c) a variety of other member-developed computational packages. One such computational package is an integrated uncertainty analysis (IUA) package being developed jointly by the Technical University of Catalonia (UPC) and Innovative Systems Software (ISS). RELAP/SCDAPSIM/MOD4.0(IUA) follows the input-propagation approach, using probability distribution functions to define the uncertainty of the input parameters. The main steps of such methodologies, often referred to as statistical approaches or Wilks’ methods, are as follows: 1. Selection of the plant; 2. Selection of the scenario; 3. Selection of the safety criteria; 4. Identification and ranking of the relevant phenomena based on the safety criteria; 5. Selection of the appropriate code parameters to represent those phenomena; 6. Association of uncertainty by means of probability distribution functions (PDFs) for each selected parameter; 7. Random sampling of the selected parameters according to their PDFs and performing multiple computer runs to obtain uncertainty bands with a certain percentile and confidence level; 8. Processing the results of the multiple computer runs to estimate the uncertainty bands for the computed quantities associated with the selected safety criteria. RELAP/SCDAPSIM/MOD4.0(IUA) calculates the number of required code runs given the desired percentile and confidence level, performs the sampling process for the

  4. Uncertainty analysis of the 35% reactor inlet header break in a CANDU 6 reactor using RELAP/SCDAPSIM/MOD4.0 with integrated uncertainty analysis option

    Energy Technology Data Exchange (ETDEWEB)

    Dupleac, D., E-mail: danieldu@cne.pub.ro [Politehnica Univ. of Bucharest (Romania); Perez, M.; Reventos, F., E-mail: marina.perez@upc.edu, E-mail: francesc.reventos@upc.edu [Technical Univ. of Catalonia (Spain); Allison, C., E-mail: iss@cableone.net [Innovative Systems Software (United States)

    2011-07-01

    The RELAP/SCDAPSIM/MOD4.0 code, designed to predict the behavior of reactor systems during normal and accident conditions, is being developed as part of an international nuclear technology Software Development and Training Program (SDTP). RELAP/SCDAPSIM/MOD4.0, which is the first version of RELAP5 completely rewritten to FORTRAN 90/95/2000 standards, uses the publicly available RELAP5 and SCDAP models in combination with (a) advanced programming and numerical techniques, (b) advanced SDTP-member-developed models for LWR, HWR, and research reactor analysis, and (c) a variety of other member-developed computational packages. One such computational package is an integrated uncertainty analysis (IUA) package being developed jointly by the Technical University of Catalonia (UPC) and Innovative Systems Software (ISS). RELAP/SCDAPSIM/MOD4.0(IUA) follows the input-propagation approach, using probability distribution functions to define the uncertainty of the input parameters. The main steps of such methodologies, often referred to as statistical approaches or Wilks’ methods, are as follows: 1. Selection of the plant; 2. Selection of the scenario; 3. Selection of the safety criteria; 4. Identification and ranking of the relevant phenomena based on the safety criteria; 5. Selection of the appropriate code parameters to represent those phenomena; 6. Association of uncertainty by means of probability distribution functions (PDFs) for each selected parameter; 7. Random sampling of the selected parameters according to their PDFs and performing multiple computer runs to obtain uncertainty bands with a certain percentile and confidence level; 8. Processing the results of the multiple computer runs to estimate the uncertainty bands for the computed quantities associated with the selected safety criteria. RELAP/SCDAPSIM/MOD4.0(IUA) calculates the number of required code runs given the desired percentile and confidence level, performs the sampling process for the

  5. Brine migration resulting from CO2 injection into saline aquifers – An approach to risk estimation including various levels of uncertainty

    DEFF Research Database (Denmark)

    Walter, Lena; Binning, Philip John; Oladyshkin, Sergey

    2012-01-01

    Comprehensive risk assessment is a major task for large-scale projects such as geological storage of CO2. Basic hazards are damage to the integrity of caprocks, leakage of CO2, or reduction of groundwater quality due to intrusion of fluids. This study focuses on salinization of freshwater aquifers resulting from displaced brine. Quantifying risk on the basis of numerical simulations requires consideration of different kinds of uncertainties, and this study considers both scenario uncertainty and statistical uncertainty. Addressing scenario uncertainty involves expert opinion on relevant geological... for large-scale 3D models including complex physics. Therefore, we apply a model reduction based on arbitrary polynomial chaos expansion combined with the probabilistic collocation method. It is shown that, depending on data availability, both types of uncertainty can be equally significant. The presented study...

  6. Urban drainage models - making uncertainty analysis simple

    DEFF Research Database (Denmark)

    Vezzaro, Luca; Mikkelsen, Peter Steen; Deletic, Ana

    2012-01-01

    There is increasing awareness of uncertainties in the modelling of urban drainage systems and, as such, many new methods for uncertainty analyses have been developed. Despite this, all available methods have limitations which restrict their widespread application among practitioners. Here... ...in each measured/observed datapoint; an issue which is commonly overlooked in the uncertainty analysis of urban drainage models. This comparison allows the user to intuitively estimate the optimum number of simulations required to conduct uncertainty analyses. The output of the method includes parameter...

  7. Additional challenges for uncertainty analysis in river engineering

    Science.gov (United States)

    Berends, Koen; Warmink, Jord; Hulscher, Suzanne

    2016-04-01

    the proposed intervention. The implicit assumption underlying such analysis is that both models are commensurable. We hypothesize that they are commensurable only to a certain extent. In an idealised study we have demonstrated that a loss of prediction performance should be expected for increasingly large engineering works. When accounting for parametric uncertainty of floodplain roughness in model identification, we see uncertainty bounds for predicted effects of interventions increase with increasing intervention scale. Calibration of these types of models therefore seems to have a shelf-life, beyond which further calibration no longer improves prediction. A qualification scheme for model use is therefore required that can be linked to model validity. In this study, we characterize model use along three dimensions: extrapolation (using the model with different external drivers), extension (using the model for different outputs or indicators) and modification (using modified models). Such use of models is expected to have implications for the applicability of surrogate modelling for efficient uncertainty analysis as well, which is recommended for future research. Warmink, J. J.; Straatsma, M. W.; Huthoff, F.; Booij, M. J. & Hulscher, S. J. M. H. 2013. Uncertainty of design water levels due to combined bed form and vegetation roughness in the Dutch river Waal. Journal of Flood Risk Management 6, 302-318. DOI: 10.1111/jfr3.12014

  8. The HTA Risk Analysis Chart: Visualising the Need for and Potential Value of Managed Entry Agreements in Health Technology Assessment.

    Science.gov (United States)

    Grimm, Sabine Elisabeth; Strong, Mark; Brennan, Alan; Wailoo, Allan J

    2017-12-01

    Recent changes to the regulatory landscape of pharmaceuticals may sometimes require reimbursement authorities to issue guidance on technologies that have a less mature evidence base. Decision makers need to be aware of risks associated with such health technology assessment (HTA) decisions and the potential to manage this risk through managed entry agreements (MEAs). This work develops methods for quantifying risk associated with specific MEAs and for clearly communicating this to decision makers. We develop the 'HTA risk analysis chart', in which we present the payer strategy and uncertainty burden (P-SUB) as a measure of overall risk. The P-SUB consists of the payer uncertainty burden (PUB), the risk stemming from decision uncertainty as to which is the truly optimal technology from the relevant set of technologies, and the payer strategy burden (PSB), the additional risk of approving a technology that is not expected to be optimal. We demonstrate the approach using three recent technology appraisals from the UK National Institute for Health and Clinical Excellence (NICE), each of which considered a price-based MEA. The HTA risk analysis chart was calculated using results from standard probabilistic sensitivity analyses. In all three HTAs, the new interventions were associated with substantial risk as measured by the P-SUB. For one of these technologies, the P-SUB was reduced to zero with the proposed price reduction, making this intervention cost effective with near complete certainty. For the other two, the risk reduced substantially with a much reduced PSB and a slightly increased PUB. The HTA risk analysis chart shows the risk that the healthcare payer incurs under unresolved decision uncertainty and when considering recommending a technology that is not expected to be optimal given current evidence. This allows the simultaneous consideration of financial and data-collection MEA schemes in an easily understood format. 
The use of HTA risk analysis charts will
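
    The P-SUB decomposition can be sketched directly from PSA output. In one plausible formalisation (a simplified reading of the paper's definitions; `psub` and the sample data are hypothetical), the PUB is the expected net benefit forgone through decision uncertainty (akin to the expected value of perfect information) and the PSB is the extra expected loss from approving a technology that is not expected to be optimal:

```python
def psub(nb_samples, approved):
    """nb_samples: dict mapping technology -> equal-length list of net-benefit
    PSA samples. Returns (PUB, PSB, P-SUB) where
      PUB = E[max_t NB_t] - max_t E[NB_t]   (risk from decision uncertainty)
      PSB = max_t E[NB_t] - E[NB_approved]  (extra risk of a non-optimal choice)
    """
    techs = list(nb_samples)
    n = len(nb_samples[techs[0]])
    # Expected net benefit if the truly optimal technology were known per sample
    e_max = sum(max(nb_samples[t][i] for t in techs) for i in range(n)) / n
    means = {t: sum(v) / n for t, v in nb_samples.items()}
    best = max(means.values())
    pub = e_max - best
    psb = best - means[approved]
    return pub, psb, pub + psb

samples = {"standard": [10, 30, 20, 40], "new": [35, 5, 25, 45]}
print(psub(samples, approved="new"))
```

When the approved technology is the one with the highest expected net benefit, PSB is zero and the whole P-SUB is decision-uncertainty risk; approving any other option adds a strictly positive PSB.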

  9. Management of uncertainties and the role of risk in Andra programme

    International Nuclear Information System (INIS)

    Grevoz, A.

    2004-01-01

    This paper aims at presenting some aspects of the management of uncertainties in the programmes of Andra for the study of a deep geological repository for high level, long-lived wastes. After presenting very briefly the context of risk-management in the French safety rules, it will discuss how Andra has attempted to link the notions of uncertainties and risk in its 'Dossier 2001' for the clay medium, how this could be made compatible with a deterministic safety approach, what feedback it got from this first exercise, and what are the perspectives for future work. (author)

  10. Environmental impact and risk assessments and key factors contributing to the overall uncertainties.

    Science.gov (United States)

    Salbu, Brit

    2016-01-01

    There is a significant number of nuclear and radiological sources that have contributed, are still contributing, or have the potential to contribute to radioactive contamination of the environment in the future. To protect the environment from radioactive contamination, impact and risk assessments are performed prior to or during a release event, in the short or long term after deposition, or prior to and after implementation of countermeasures. When environmental impact and risks are assessed, however, a series of factors contributes to the overall uncertainties. To provide environmental impact and risk assessments, information on processes, kinetics and a series of input variables is needed. Adding problems such as variability, questionable assumptions, gaps in knowledge, extrapolations and poor conceptual model structures, a series of factors contributes to large and often unacceptable uncertainties in impact and risk assessments. Information on the source term and the release scenario is an essential starting point in impact and risk models; the source determines the activity concentrations and atom ratios of the radionuclides released, while the release scenario determines the physico-chemical forms of the released radionuclides, such as particle size distribution, structure and density. Releases will most often contain other contaminants such as metals, and, due to interactions, contaminated sites should be assessed as a multiple-stressor scenario. Following deposition, a series of stressors, interactions and processes will influence the ecosystem transfer of radionuclide species and thereby influence biological uptake (toxicokinetics) and responses (toxicodynamics) in exposed organisms. Due to the variety of biological species, extrapolation is frequently needed to fill gaps in knowledge, e.g., from effects to no effects, from effects in one organism to others, from one stressor to mixtures. Most toxicity tests are, however, performed as short-term exposures of adult organisms

  11. Development of Uncertainty Analysis Method for SMART Digital Core Protection and Monitoring System

    International Nuclear Information System (INIS)

    Koo, Bon Seung; In, Wang Kee; Hwang, Dae Hyun

    2012-01-01

    The Korea Atomic Energy Research Institute has developed a system-integrated modular advanced reactor (SMART) for seawater desalination and electricity generation. Online digital core protection and monitoring systems, called SCOPS and SCOMS respectively, were developed. SCOPS calculates the minimum DNBR and maximum LPD based on several online-measured system parameters. SCOMS calculates the variables of limiting conditions for operation. KAERI developed an overall uncertainty analysis methodology which statistically combines the uncertainty components of the SMART core protection and monitoring system. By applying overall uncertainty factors in the online SCOPS/SCOMS calculation, the calculated LPD and DNBR are conservative at a 95/95 probability/confidence level. In this paper, the uncertainty analysis method for the SMART core protection and monitoring system is described
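
    A common way to statistically combine independent fractional uncertainty components into a single conservative factor is root-sum-square with a one-sided 95% tolerance multiplier. The sketch below illustrates the idea only; the component names and values are invented and are not SMART data, and the actual SCOPS/SCOMS methodology may combine components differently:

```python
import math

# Illustrative, independent fractional (1-sigma) uncertainty components
components = {"measurement": 0.03, "code_model": 0.05, "manufacturing": 0.02}

# Root-sum-square combination of independent components
overall_sigma = math.sqrt(sum(c * c for c in components.values()))

# One-sided 95% multiplier applied to a computed quantity such as DNBR or LPD
penalty_factor = 1.0 + 1.645 * overall_sigma

print(overall_sigma, penalty_factor)
```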

  12. Decision-making under risk and uncertainty

    International Nuclear Information System (INIS)

    Gatev, G.I.

    2006-01-01

    Fuzzy sets and interval analysis tools for performing computations and solving optimisation problems are presented. Fuzzy and interval extensions of Decision Theory criteria for decision-making under parametric uncertainty of prior information (probabilities, payoffs) are developed. An interval probability approach to the mean-value criterion is proposed. (author)

  13. Risk assessment and model for community-based construction ...

    African Journals Online (AJOL)

    It, therefore, becomes necessary to systematically manage uncertainty in community-based construction in order to increase the likelihood of meeting project objectives using necessary risk management strategies. Risk management, which is an iterative process due to the dynamic nature of many risks, follows three main ...

  14. Radical uncertainty, non-predictability, antifragility and risk-sharing Islamic finance

    Directory of Open Access Journals (Sweden)

    Umar Rafi

    2016-12-01

    Full Text Available Under conditions of radical uncertainty, risk sharing renders financial systems anti-fragile. Our goal in this paper is to show that risk-sharing Islamic finance (RSIF) shares the characteristics defined by Taleb for an anti-fragile system, by mapping some characteristics of anti-fragility onto those of risk-sharing Islamic finance. A key insight around which such a connection can be established is relating the principle of “no risk, no gain” from Islamic finance to the concept of skin-in-the-game from anti-fragility theory. The relationship is then extended to other characteristics of the two frameworks, to show that RSIF overlaps with anti-fragility over many dimensions. The broader case for an anti-fragile system includes another important characteristic, namely “soul in the game” and concern for social justice. It is the authors’ hope that emerging research on anti-fragility, combined with the emerging research on RSIF, can have a lasting impact on the field of finance by laying the foundations for a compelling case that it is time for humanity to replace the dominant debt-based risk transfer/risk shifting financial system with a system in which everyone shares the risks faced by society. JEL: D81, D89, E44, F34, G32

  15. Hybrid Structural Reliability Analysis under Multisource Uncertainties Based on Universal Grey Numbers

    Directory of Open Access Journals (Sweden)

    Xingfa Yang

    2018-01-01

    Full Text Available Nondeterministic parameters of certain distributions are employed to model structural uncertainties, which are usually assumed to be stochastic factors. However, model parameters may not be precisely represented due to factors in engineering practice, such as lack of sufficient data, data with fuzziness, and unknown-but-bounded conditions. To this end, interval and fuzzy parameters are implemented, and an efficient approach to structural reliability analysis with random-interval-fuzzy hybrid parameters is proposed in this study. Fuzzy parameters are first converted to equivalent random ones based on the equal entropy principle. The 3σ criterion is then employed to transform the equivalent random and the original random parameters to interval variables. In doing this, the hybrid reliability problem is transformed into one with only interval variables, in other words, a nonprobabilistic reliability analysis problem. Nevertheless, the problem of interval extension exists in interval arithmetic, especially for nonlinear systems. Therefore, universal grey mathematics, which can tackle the issue of interval extension, is employed to solve the nonprobabilistic reliability analysis problem. The results show that the proposed method can obtain more conservative results for the hybrid structural reliability.
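
    The first two transformations can be sketched in a few lines. The equal-entropy conversion shown here matches the differential entropies of a symmetric triangular fuzzy membership (read as a density) and a normal density, which gives σ = a/√(2π) for half-width a; this is one common convention and may differ from the paper's exact formulation. The helper names are hypothetical:

```python
import math

def triangular_to_normal_sigma(half_width: float) -> float:
    """Equal-entropy equivalent normal standard deviation for a symmetric
    triangular fuzzy number of the given half-width a.
    H_tri = 1/2 + ln(a) and H_norm = 1/2 * ln(2*pi*e*sigma^2)
    are equal when sigma = a / sqrt(2*pi)."""
    return half_width / math.sqrt(2 * math.pi)

def three_sigma_interval(mu: float, sigma: float):
    """3-sigma criterion: replace a random variable by a bounded interval."""
    return (mu - 3 * sigma, mu + 3 * sigma)

sigma = triangular_to_normal_sigma(0.6)
print(three_sigma_interval(10.0, sigma))
```

After this step every parameter is an interval, and the remaining work (handled in the paper by universal grey numbers) is controlling the overestimation that naive interval arithmetic introduces.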

  16. Quantum uncertainty relation based on the mean deviation

    OpenAIRE

    Sharma, Gautam; Mukhopadhyay, Chiranjib; Sazim, Sk; Pati, Arun Kumar

    2018-01-01

    Traditional forms of quantum uncertainty relations are invariably based on the standard deviation. This can be understood in the historical context of simultaneous development of quantum theory and mathematical statistics. Here, we present alternative forms of uncertainty relations, in both state dependent and state independent forms, based on the mean deviation. We illustrate the robustness of this formulation in situations where the standard deviation based uncertainty relation is inapplica...

  17. Integration of renewable generation uncertainties into stochastic unit commitment considering reserve and risk: A comparative study

    International Nuclear Information System (INIS)

    Quan, Hao; Srinivasan, Dipti; Khosravi, Abbas

    2016-01-01

    The uncertainties of renewable energy have brought great challenges to power system unit commitment, dispatch and reserve requirements. This paper presents a comparative study on the integration of renewable generation uncertainties into SCUC (stochastic security-constrained unit commitment) considering reserve and risk. Renewable forecast uncertainties are captured by a list of PIs (prediction intervals). A new scenario generation method is proposed to generate scenarios from these PIs. Different system uncertainties are considered as scenarios in the stochastic SCUC problem formulation. Two comparative simulations with a single source (E1: wind only) and multiple sources of uncertainty (E2: load, wind, solar and generation outages) are investigated. Five deterministic and four stochastic case studies are performed. Different generation costs, reserve strategies and associated risks are compared under various scenarios. The results indicate that the overall costs of E2 are lower than those of E1 due to the penetration of solar power, while the associated risk in the deterministic cases of E2 is higher than in E1. This implies a superimposed effect of uncertainties during uncertainty integration. The results also demonstrate that power systems run a higher level of risk during peak load hours, and that stochastic models are more robust than deterministic ones. - Highlights: • An extensive comparative study for renewable integration is presented. • A novel scenario generation method is proposed. • Wind and solar uncertainties are represented by a list of prediction intervals. • Unit commitment and dispatch costs are discussed considering reserve and risk.
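
    One simple way to turn a list of nested central prediction intervals into scenarios is to build a piecewise-linear CDF from the interval bounds and invert it. This is an illustrative construction under invented numbers, not the paper's actual scenario generation method:

```python
import random

# Nested central prediction intervals for one forecast horizon (illustrative):
# coverage level -> (lower, upper) bounds on wind power in MW
PIS = {0.50: (80, 120), 0.90: (60, 150)}

def sample_scenario(rng):
    """Draw one scenario from a piecewise-linear CDF built from the nested PIs."""
    # CDF knots (value, cumulative probability) implied by each interval bound
    knots = sorted([(lo, (1 - c) / 2) for c, (lo, hi) in PIS.items()]
                   + [(hi, 1 - (1 - c) / 2) for c, (lo, hi) in PIS.items()])
    u = rng.uniform(knots[0][1], knots[-1][1])  # stay inside the widest PI
    for (x0, p0), (x1, p1) in zip(knots, knots[1:]):
        if u <= p1:  # invert the CDF by linear interpolation on this segment
            return x0 + (u - p0) / (p1 - p0) * (x1 - x0)

rng = random.Random(3)
print([round(sample_scenario(rng), 1) for _ in range(5)])
```

Repeating the draw for every horizon (with some temporal dependence model) yields the scenario set fed into the stochastic SCUC.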

  18. A smart Monte Carlo procedure for production costing and uncertainty analysis

    International Nuclear Information System (INIS)

    Parker, C.; Stremel, J.

    1996-01-01

    Electric utilities using chronological production costing models to decide whether to buy or sell power over the next week or next few weeks need to determine potential profits or losses under a number of uncertainties. A large amount of money can be at stake--often $100,000 a day or more--and one party to the sale must always take on the risk. In the case of fixed-price ($/MWh) contracts, the seller accepts the risk; in the case of cost-plus contracts, the buyer must accept the risk. So modeling uncertainty and understanding the risk accurately can improve the competitive edge of the user. This paper investigates an efficient procedure for representing risks and costs from capacity outages. Typically, production costing models use an algorithm based on some form of random number generator to select resources as available or on outage. These algorithms allow experiments to be repeated and gains and losses to be observed in a short time. The authors perform several experiments to examine the capability of three unit-outage selection methods and measure their results. Specifically, a brute-force Monte Carlo procedure, a Monte Carlo procedure with Latin hypercube sampling, and a smart Monte Carlo procedure with cost stratification and directed sampling are examined
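
    The difference between the first two selection methods can be sketched on a toy three-unit system (capacities and forced outage rates below are invented, not the paper's data). Brute-force Monte Carlo draws each unit's up/down state independently; Latin hypercube sampling stratifies each unit's uniform draw across the trials, which pins the sampled outage frequencies to their target rates:

```python
import random

UNITS = [(200, 0.08), (150, 0.05), (400, 0.10)]  # (capacity MW, forced outage rate)

def outage_mc(n_trials, rng):
    """Brute-force Monte Carlo: independent up/down draw per unit per trial."""
    total = 0.0
    for _ in range(n_trials):
        total += sum(cap for cap, forate in UNITS if rng.random() < forate)
    return total / n_trials  # expected capacity on outage, MW

def outage_lhs(n_trials, rng):
    """Latin hypercube: stratify each unit's uniform draw into n_trials bins."""
    strata = []
    for _ in UNITS:
        u = [(k + rng.random()) / n_trials for k in range(n_trials)]
        rng.shuffle(u)  # decorrelate the units' strata orderings
        strata.append(u)
    total = 0.0
    for i in range(n_trials):
        total += sum(cap for j, (cap, forate) in enumerate(UNITS)
                     if strata[j][i] < forate)
    return total / n_trials

exact = sum(cap * forate for cap, forate in UNITS)  # analytic expectation
print(exact, outage_mc(2000, random.Random(42)), outage_lhs(2000, random.Random(5)))
```

With 2000 trials the LHS estimate of the mean is essentially exact, while the brute-force estimate still carries sampling noise; the paper's "smart" procedure goes further by directing samples toward the costly outage states.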

  19. Risk, Robustness and Water Resources Planning Under Uncertainty

    Science.gov (United States)

    Borgomeo, Edoardo; Mortazavi-Naeini, Mohammad; Hall, Jim W.; Guillod, Benoit P.

    2018-03-01

    Risk-based water resources planning is based on the premise that water managers should invest up to the point where the marginal benefit of risk reduction equals the marginal cost of achieving that benefit. However, this cost-benefit approach may not guarantee robustness under uncertain future conditions, for instance under climatic changes. In this paper, we expand risk-based decision analysis to explore possible ways of enhancing robustness in engineered water resources systems under different risk attitudes. Risk is measured as the expected annual cost of water use restrictions, while robustness is interpreted in the decision-theoretic sense as the ability of a water resource system to maintain performance—expressed as a tolerable risk of water use restrictions—under a wide range of possible future conditions. Linking risk attitudes with robustness allows stakeholders to explicitly trade off incremental increases in robustness against investment costs for a given level of risk. We illustrate the framework through a case study of London's water supply system using state-of-the-art regional climate simulations to inform the estimation of risk and robustness.

  20. Treatment of uncertainties in the IPCC: a philosophical analysis

    Science.gov (United States)

    Jebeile, J.; Drouet, I.

    2014-12-01

    The IPCC produces scientific reports out of findings on climate and climate change. Because the findings are uncertain in many respects, the production of reports requires aggregating assessments of uncertainties of different kinds. This difficult task is currently regulated by the Guidance note for lead authors of the IPCC fifth assessment report on consistent treatment of uncertainties. The note recommends that two metrics—i.e. confidence and likelihood—be used for communicating the degree of certainty in findings. Confidence is expressed qualitatively "based on the type, amount, quality, and consistency of evidence […] and the degree of agreement", while likelihood is expressed probabilistically "based on statistical analysis of observations or model results, or expert judgment". Therefore, depending on the evidence evaluated, authors have the choice to present either an assigned level of confidence or a quantified measure of likelihood. But these two kinds of uncertainty assessment express distinct and conflicting methodologies. So the question arises whether the treatment of uncertainties in the IPCC is rationally justified. In order to answer the question, it is worth comparing the IPCC procedures with the formal normative theories of epistemic rationality which have been developed by philosophers. These theories—which include contributions to the philosophy of probability and to Bayesian probabilistic confirmation theory—are relevant for our purpose because they are commonly used to assess the rationality of collective judgement formation based on uncertain knowledge. In this paper we make the comparison and pursue the following objectives: (i) we determine whether the IPCC confidence and likelihood can be compared with the notions of uncertainty targeted by or underlying the formal normative theories of epistemic rationality; (ii) we investigate whether the formal normative theories of epistemic rationality justify

  1. Comprehensive neutron cross-section and secondary energy distribution uncertainty analysis for a fusion reactor

    International Nuclear Information System (INIS)

    Gerstl, S.A.W.; LaBauve, R.J.; Young, P.G.

    1980-05-01

    Using the example of General Atomic's well-documented Power Generating Fusion Reactor (PGFR) design, this report carries out a comprehensive neutron cross-section and secondary energy distribution (SED) uncertainty analysis. The LASL sensitivity and uncertainty analysis code SENSIT is used to calculate reaction cross-section sensitivity profiles and integral SED sensitivity coefficients. These are then folded with covariance matrices and integral SED uncertainties to obtain the resulting uncertainties in three calculated neutronics design parameters: two critical radiation damage rates and a nuclear heating rate. The report documents the first sensitivity-based data uncertainty analysis which incorporates a quantitative treatment of the effects of SED uncertainties. The results demonstrate quantitatively that the ENDF/B-V cross-section data files for C, H, and O, including their SED data, are fully adequate for this design application, while the data for Fe and Ni are at best marginally adequate because they give rise to response uncertainties of up to 25%. Much higher response uncertainties are caused by cross-section and SED data uncertainties in Cu (26 to 45%), tungsten (24 to 54%), and Cr (up to 98%). Specific recommendations are given for re-evaluations of certain reaction cross-sections, secondary energy distributions, and uncertainty estimates

  2. Uncertainty Propagation in Monte Carlo Depletion Analysis

    International Nuclear Information System (INIS)

    Shim, Hyung Jin; Kim, Yeong-il; Park, Ho Jin; Joo, Han Gyu; Kim, Chang Hyo

    2008-01-01

    A new formulation is presented, aimed at quantifying the uncertainties of Monte Carlo (MC) tallies such as k-eff, the microscopic reaction rates of nuclides, and nuclide number densities in MC depletion analysis, and at examining their propagation behaviour as a function of depletion time step (DTS). It is shown that the variance of a given MC tally, used as a measure of its uncertainty in this formulation, arises from four sources: the statistical uncertainty of the MC tally, uncertainties of microscopic cross sections, uncertainties of nuclide number densities, and the cross-correlations between them; the contribution of the latter three sources can be determined by computing the correlation coefficients between the uncertain variables. It is also shown that the variance of any given nuclide number density at the end of each DTS stems from the uncertainties of the nuclide number densities (NND) and microscopic reaction rates (MRR) of nuclides at the beginning of the DTS, and these contributions are determined by computing correlation coefficients between the two uncertain variables. To test the viability of the formulation, we conducted MC depletion analyses for two sample depletion problems involving a simplified 7x7 fuel assembly (FA) and a 17x17 PWR FA, determined the number densities of uranium and plutonium isotopes and their variances as well as k-infinity and its variance as a function of DTS, and demonstrated the applicability of the new formulation to the uncertainty propagation analysis that needs to be performed in MC depletion computations. (authors)
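
    The bookkeeping behind such a decomposition is essentially the bilinear variance formula. A generic sketch for a tally linearised as T ≈ aX + bY, where X and Y stand for two uncertain inputs (e.g. a number density and a reaction rate) with a known correlation coefficient (`variance_contributions` is a hypothetical helper, not the authors' code):

```python
import math

def variance_contributions(var_x, var_y, corr_xy, a=1.0, b=1.0):
    """Decompose Var(T) for T = a*X + b*Y into the two source terms and the
    cross-correlation term, given the correlation coefficient of X and Y."""
    cov = corr_xy * math.sqrt(var_x * var_y)
    parts = {"X": a * a * var_x, "Y": b * b * var_y, "cross": 2 * a * b * cov}
    return parts, sum(parts.values())

parts, total = variance_contributions(4.0, 9.0, 0.5)
print(parts, total)
```

Setting the correlation coefficient to zero recovers the naive independent-sources estimate, which is exactly the contribution the paper shows can be misleading when cross-correlations between tallies build up over depletion steps.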

  3. Discussion of OECD LWR Uncertainty Analysis in Modelling Benchmark

    International Nuclear Information System (INIS)

    Ivanov, K.; Avramova, M.; Royer, E.; Gillford, J.

    2013-01-01

    The demand for best estimate calculations in nuclear reactor design and safety evaluations has increased in recent years. Uncertainty quantification has been highlighted as part of the best estimate calculations. The modelling aspects of uncertainty and sensitivity analysis are to be further developed and validated on scientific grounds in support of their performance and application to multi-physics reactor simulations. The Organization for Economic Co-operation and Development (OECD) / Nuclear Energy Agency (NEA) Nuclear Science Committee (NSC) has endorsed the creation of an Expert Group on Uncertainty Analysis in Modelling (EGUAM). Within the framework of activities of EGUAM/NSC the OECD/NEA initiated the Benchmark for Uncertainty Analysis in Modelling for Design, Operation, and Safety Analysis of Light Water Reactors (OECD LWR UAM benchmark). The general objective of the benchmark is to propagate the predictive uncertainties of code results through complex coupled multi-physics and multi-scale simulations. The benchmark is divided into three phases, with Phase I highlighting the uncertainty propagation in stand-alone neutronics calculations, while Phases II and III are focused on uncertainty analysis of the reactor core and system, respectively. This paper discusses the progress made in the Phase I calculations, the specifications for Phase II and the upcoming challenges in defining the Phase III exercises. The challenges of applying uncertainty quantification to complex code systems, in particular to time-dependent coupled physics models, are the large computational burden and the utilization of non-linear models (expected due to the physics coupling). (authors)

  4. Lognormal Approximations of Fault Tree Uncertainty Distributions.

    Science.gov (United States)

    El-Shanawany, Ashraf Ben; Ardron, Keith H; Walker, Simon P

    2018-01-26

    Fault trees are used in reliability modeling to create logical models of fault combinations that can lead to undesirable events. The output of a fault tree analysis (the top event probability) is expressed in terms of the failure probabilities of basic events that are input to the model. Typically, the basic event probabilities are not known exactly, but are modeled as probability distributions: therefore, the top event probability is also represented as an uncertainty distribution. Monte Carlo methods are generally used for evaluating the uncertainty distribution, but such calculations are computationally intensive and do not readily reveal the dominant contributors to the uncertainty. In this article, a closed-form approximation for the fault tree top event uncertainty distribution is developed, which is applicable when the uncertainties in the basic events of the model are lognormally distributed. The results of the approximate method are compared with results from two sampling-based methods: namely, the Monte Carlo method and the Wilks method based on order statistics. It is shown that the closed-form expression can provide a reasonable approximation to results obtained by Monte Carlo sampling, without incurring the computational expense. The Wilks method is found to be a useful means of providing an upper bound for the percentiles of the uncertainty distribution while being computationally inexpensive compared with full Monte Carlo sampling. The lognormal approximation method and Wilks's method appear attractive, practical alternatives for the evaluation of uncertainty in the output of fault trees and similar multilinear models. © 2018 Society for Risk Analysis.
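
    For an AND gate with independent lognormal basic events the closed form is immediate, since a product of lognormals is lognormal with summed parameters. The sketch below (with invented basic-event parameters) compares that closed-form 95th percentile with the Wilks 95/95 bound obtained as the largest of 59 Monte Carlo samples; this illustrates the general idea rather than the article's full approximation for arbitrary fault trees:

```python
import math, random

# (mu, sigma) of ln(probability) for two independent basic events (illustrative)
basic = [(math.log(1e-3), 0.5), (math.log(2e-3), 0.7)]

# Closed form: the AND-gate top event is the product of the basic events,
# hence lognormal with mu_top = sum(mu_i) and sigma_top^2 = sum(sigma_i^2).
mu_top = sum(m for m, s in basic)
sigma_top = math.sqrt(sum(s * s for m, s in basic))
p95_exact = math.exp(mu_top + 1.6449 * sigma_top)  # 95th percentile, z_0.95 ~ 1.6449

# Wilks: the largest of 59 samples is a 95/95 one-sided bound on the 95th percentile.
rng = random.Random(7)
samples = [math.exp(sum(rng.gauss(m, s) for m, s in basic)) for _ in range(59)]
wilks_bound = max(samples)

print(p95_exact, wilks_bound)
```

The closed form costs nothing to evaluate, while the Wilks bound stays cheap but is conservative and random; full Monte Carlo would need far more samples to pin the percentile down directly.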

  5. Coupling Uncertainties with Accuracy Assessment in Object-Based Slum Detections, Case Study: Jakarta, Indonesia

    NARCIS (Netherlands)

    Pratomo, J.; Kuffer, M.; Martinez, Javier; Kohli, D.

    2017-01-01

    Object-Based Image Analysis (OBIA) has been successfully used to map slums. In general, the occurrence of uncertainties in producing geographic data is inevitable. However, most studies have concentrated solely on assessing classification accuracy, neglecting the inherent uncertainties. Our

  6. Uncertainty and Sensitivity Analysis of Afterbody Radiative Heating Predictions for Earth Entry

    Science.gov (United States)

    West, Thomas K., IV; Johnston, Christopher O.; Hosder, Serhat

    2016-01-01

    The objective of this work was to perform sensitivity analysis and uncertainty quantification for afterbody radiative heating predictions of the Stardust capsule during Earth entry at peak afterbody radiation conditions. The radiation environment in the afterbody region poses significant challenges for accurate uncertainty quantification and sensitivity analysis due to the complexity of the flow physics, computational cost, and large number of uncertain variables. In this study, first a sparse collocation non-intrusive polynomial chaos approach along with global non-linear sensitivity analysis was used to identify the most significant uncertain variables and reduce the dimensions of the stochastic problem. Then, a total order stochastic expansion was constructed over only the important parameters for an efficient and accurate estimate of the uncertainty in radiation. Based on previous work, 388 uncertain parameters were considered in the radiation model, which came from the thermodynamics, flow field chemistry, and radiation modeling. The sensitivity analysis showed that only four of these variables contributed significantly to afterbody radiation uncertainty, accounting for almost 95% of the uncertainty. These included the electronic-impact excitation rate for N between level 2 and level 5 and rates of three chemical reactions influencing N, N(+), O, and O(+) number densities in the flow field.

  7. Comments on 'An information-theoretic basis for uncertainty analysis: application to the QUASAR severe accident study' by S.D. Unwin, E.G. Cazzoli, R.E. Davis and others, and 'Uncertainty characterization of data for probabilistic risk assessment' by R.A. Bari and C.K. Park

    International Nuclear Information System (INIS)

    Cooke, R.M.

    1989-01-01

    Comments are made on the papers by Bari and Park ('Uncertainty characterization of data for probabilistic risk assessment') and by Unwin et al ('An information-theoretic basis for uncertainty analysis'). These raise issues about the work expounded there. The main one is that entropy is not a proper measure of uncertainty, especially for a continuous variable, mainly because the entropy of a continuous distribution is not invariant and can be negative. The use of expert opinion and the method proposed for evaluating uncertainty are discussed. One comment makes three points: first, good elicitation practice is more useful than a few constraints on probability distributions; second, the maximum entropy approach can be a useful addition to good elicitation practice; third, the maximum entropy approach is misused. (UK)

  8. Blockchain to Rule the Waves - Nascent Design Principles for Reducing Risk and Uncertainty in Decentralized Environments

    DEFF Research Database (Denmark)

    Nærland, Kristoffer; Müller-Bloch, Christoph; Beck, Roman

    2017-01-01

    Many decentralized, inter-organizational environments such as supply chains are characterized by high transactional uncertainty and risk. At the same time, blockchain technology promises to mitigate these issues by introducing certainty into economic transactions. This paper discusses the findings of a Design Science Research project involving the construction and evaluation of an information technology artifact in collaboration with Maersk, a leading international shipping company, where central documents in shipping, such as the Bill of Lading, are turned into a smart contract on blockchain. Based on our insights from the project, we provide first evidence for preliminary design principles for applications that aim to mitigate the transactional risk and uncertainty in decentralized environments using blockchain. Both the artifact and the first evidence for emerging design principles are novel...

  9. Flood risk analysis for flood control and sediment transportation in sandy regions: A case study in the Loess Plateau, China

    Science.gov (United States)

    Guo, Aijun; Chang, Jianxia; Wang, Yimin; Huang, Qiang; Zhou, Shuai

    2018-05-01

    Traditional flood risk analysis focuses on the probability of flood events exceeding the design flood of downstream hydraulic structures while neglecting the influence of sedimentation in river channels on regional flood control systems. This work advances traditional flood risk analysis by proposing a univariate and copula-based bivariate hydrological risk framework which incorporates both flood control and sediment transport. In developing the framework, the conditional probabilities of different flood events under various extreme precipitation scenarios are estimated by exploiting the copula-based model. Moreover, a Monte Carlo-based algorithm is designed to quantify the sampling uncertainty associated with the univariate and bivariate hydrological risk analyses. Two catchments located on the Loess Plateau are selected as study regions: the upper catchments of the Xianyang and Huaxian stations (denoted UCX and UCH, respectively). The univariate and bivariate return periods, risk and reliability in the context of uncertainty for the purposes of flood control and sediment transport are assessed for the study regions. The results indicate that in the UCX and UCH, sedimentation triggers higher risks of damaging the safety of local flood control systems than the event that the annual maximum flood (AMF) exceeds the design flood of downstream hydraulic structures. Moreover, there is considerable sampling uncertainty affecting the univariate and bivariate hydrologic risk evaluation, which greatly challenges future flood mitigation measures. In addition, the results also confirm that the developed framework can estimate conditional probabilities associated with different flood events under various extreme precipitation scenarios for flood control and sediment transport. The proposed hydrological risk framework offers a promising technical reference for flood risk analysis in sandy regions worldwide.
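The copula-based joint probabilities behind such a bivariate framework can be illustrated with a short sketch. The Gumbel-Hougaard copula and the "AND" joint return period below (both variables, e.g. flood peak and sediment load, exceed their marginal T-year design levels in the same year) are a generic textbook construction; the parameter values are assumptions, not values fitted to the Loess Plateau data:

```python
import math

def gumbel_copula(u, v, theta):
    """Gumbel-Hougaard copula C(u, v); theta = 1 is independence,
    larger theta means stronger upper-tail dependence."""
    return math.exp(-(((-math.log(u)) ** theta
                       + (-math.log(v)) ** theta) ** (1.0 / theta)))

def and_return_period(tx, ty, theta):
    """'AND' joint return period for annual maxima: both variables
    exceed their marginal T-year levels in the same year."""
    u, v = 1.0 - 1.0 / tx, 1.0 - 1.0 / ty
    p_both = 1.0 - u - v + gumbel_copula(u, v, theta)   # P(X>x_T and Y>y_T)
    return 1.0 / p_both
```

With independence (theta = 1) the joint 50-year/50-year event has a 2500-year return period; positive dependence shortens it, which is why ignoring the flood-sediment correlation understates the joint risk.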

  10. An estimation of uncertainties in containment P/T analysis using CONTEMPT/LT code

    International Nuclear Information System (INIS)

    Kang, Y.M.; Park, G.C.; Lee, U.C.; Kang, C.S.

    1991-01-01

    In a nuclear power plant, the containment design pressure and temperature (P/T) have traditionally been established with unrealistic conservatism, at a cost in economics. It is therefore necessary that the uncertainties of the design P/T values be well defined through an extensive uncertainty analysis with plant-specific input data and models used in the computer code. This study estimates plant-specific uncertainties of the containment design P/T for the Kori-3 reactor using the Monte Carlo method. Kori-3 plant parameters and the Uchida heat transfer coefficient were selected for statistical treatment after a sensitivity study. The Monte Carlo analysis was performed with the response surface method, based on the CONTEMPT/LT code and the Latin Hypercube sampling technique. Finally, the design values based on 95%/95% probability are compared with worst-estimate values to assess the design margin. (author)
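The sampling side of such a study can be sketched in a few lines: draw Latin Hypercube samples of the uncertain inputs, push them through a response surface, and read off a 95%/95% bound. The two-input response surface below is a made-up stand-in for the CONTEMPT/LT response surface, and the 59-run size is Wilks' first-order one-sided 95%/95% criterion, used here as an assumption rather than the study's actual sample size:

```python
import numpy as np

rng = np.random.default_rng(1)

def latin_hypercube(n, dim):
    """n stratified samples on [0, 1]^dim: one point per stratum in each
    dimension, strata randomly paired across dimensions."""
    pts = (np.arange(n) + rng.uniform(size=(dim, n))) / n
    for row in pts:
        rng.shuffle(row)
    return pts.T

def peak_pressure(x):
    # Hypothetical response surface for containment peak pressure [bar] as a
    # function of two normalized uncertain inputs (e.g. heat transfer
    # coefficient, break flow) -- an assumed stand-in for code runs.
    return 3.0 + 0.8 * x[:, 0] + 0.5 * x[:, 1] ** 2

n = 59          # Wilks' first-order one-sided 95%/95% sample size (assumption)
samples = latin_hypercube(n, dim=2)
bound_95_95 = peak_pressure(samples).max()   # upper tolerance bound estimate
```

Comparing `bound_95_95` against the licensing design value is then the margin assessment the abstract describes.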

  11. Network theory-based analysis of risk interactions in large engineering projects

    International Nuclear Information System (INIS)

    Fang, Chao; Marle, Franck; Zio, Enrico; Bocquet, Jean-Claude

    2012-01-01

    This paper presents an approach based on network theory to deal with risk interactions in large engineering projects. Indeed, such projects are exposed to numerous and interdependent risks of various nature, which makes their management more difficult. In this paper, a topological analysis based on network theory is presented, which aims at identifying key elements in the structure of interrelated risks potentially affecting a large engineering project. This analysis serves as a powerful complement to classical project risk analysis. Its originality lies in the application of some network theory indicators to the project risk management field. The construction of the risk network requires the involvement of the project manager and other team members assigned to the risk management process. Its interpretation improves their understanding of risks and their potential interactions. The outcomes of the analysis provide a support for decision-making regarding project risk management. An example of application to a real large engineering project is presented. The conclusion is that some new insights can be found about risks, about their interactions and about the global potential behavior of the project. - Highlights: ► The method addresses the modeling of complexity in project risk analysis. ► Network theory indicators enable other risks than classical criticality analysis to be highlighted. ► This topological analysis improves project manager's understanding of risks and risk interactions. ► This helps project manager to make decisions considering the position in the risk network. ► An application to a real tramway implementation project in a city is provided.

  12. Advances in Financial Risk Management and Economic Policy Uncertainty: An Overview

    NARCIS (Netherlands)

    S.M. Hammoudeh (Shawkat); M.J. McAleer (Michael)

    2014-01-01

    Financial risk management is difficult at the best of times, but especially so in the presence of economic uncertainty and financial crises. The purpose of this special issue on “Advances in Financial Risk Management and Economic Policy Uncertainty” is to highlight

  13. Hazardous waste transportation risk assessment: Benefits of a combined deterministic and probabilistic Monte Carlo approach in expressing risk uncertainty

    International Nuclear Information System (INIS)

    Policastro, A.J.; Lazaro, M.A.; Cowen, M.A.; Hartmann, H.M.; Dunn, W.E.; Brown, D.F.

    1995-01-01

    This paper presents a combined deterministic and probabilistic methodology for modeling hazardous waste transportation risk and expressing the uncertainty in that risk. Both the deterministic and probabilistic methodologies are aimed at providing tools useful in the evaluation of alternative management scenarios for US Department of Energy (DOE) hazardous waste treatment, storage, and disposal (TSD). The probabilistic methodology can be used to provide perspective on and quantify uncertainties in deterministic predictions. The methodology developed has been applied to 63 DOE shipments made in fiscal year 1992, which contained poison by inhalation chemicals that represent an inhalation risk to the public. Models have been applied to simulate shipment routes, truck accident rates, chemical spill probabilities, spill/release rates, dispersion, population exposure, and health consequences. The simulation presented in this paper is specific to trucks traveling from DOE sites to their commercial TSD facilities, but the methodology is more general. Health consequences are presented as the number of people with potentially life-threatening health effects. Probabilistic distributions were developed (based on actual item data) for accident release amounts, time of day and season of the accident, and meteorological conditions

  14. Industrial water resources management based on violation risk analysis of the total allowable target on wastewater discharge.

    Science.gov (United States)

    Yue, Wencong; Cai, Yanpeng; Xu, Linyu; Yang, Zhifeng; Yin, Xin'An; Su, Meirong

    2017-07-11

    To improve the capabilities of conventional methodologies in facilitating industrial water allocation under uncertain conditions, an integrated approach was developed through the combination of operational research, uncertainty analysis, and violation risk analysis methods. The developed approach can (a) address complexities of industrial water resources management (IWRM) systems, (b) facilitate the reflection of multiple uncertainties and risks of the system and incorporate them into a general optimization framework, and (c) manage robust actions for industrial production in consideration of water supply capacity and wastewater discharge control. The developed method was then demonstrated in a water-stressed city (i.e., the City of Dalian), northeastern China. Three scenarios were proposed according to the city's industrial plans. The results indicated that in the planning year of 2020 (a) the production of civilian-used steel ships and machine-made paper & paperboard would reduce significantly, (b) the violation risk of chemical oxygen demand (COD) discharge under scenario 1 would be the most prominent, compared with those under scenarios 2 and 3, (c) the maximal total economic benefit under scenario 2 would be higher than the benefit under scenario 3, and (d) the production of rolling contact bearings, rail vehicles, and commercial vehicles would be promoted.

  15. Uncertainty analysis for the BEACON-COLSS core monitoring system application

    International Nuclear Information System (INIS)

    Morita, T.; Boyd, W.A.; Seong, K.B.

    2005-01-01

    This paper will cover the measurement uncertainty analysis of BEACON-COLSS core monitoring system. The uncertainty evaluation is made by using a BEACON-COLSS simulation program. By simulating the BEACON on-line operation for analytically generated reactor conditions, accuracy of the 'Measured' results can be evaluated by comparing to analytically generated 'Truth'. The DNB power margin is evaluated based on the Combustion Engineering's Modified Statistical Combination of Uncertainties (MSCU) using the CETOPD code for the DNBR calculation. A BEACON-COLSS simulation program for the uncertainty evaluation function has been established for plant applications. Qualification work has been completed for two Combustion Engineering plants. Results of the BEACON-COLSS measured peaking factors and DNBR power margin are plant type dependent and are applicable to reload cores as long as the core geometry and detector layout are unchanged. (authors)

  16. Addendum to ‘Understanding risks in the light of uncertainty: low-probability, high-impact coastal events in cities’

    Science.gov (United States)

    Galarraga, Ibon; Sainz de Murieta, Elisa; Markandya, Anil; María Abadie, Luis

    2018-02-01

    This addendum adds to the analysis presented in ‘Understanding risks in the light of uncertainty: low-probability, high-impact coastal events in cities’ Abadie et al (2017 Environ. Res. Lett. 12 014017). We propose to use the framework developed earlier to enhance communication and understanding of risks, with the aim of bridging the gap between the highly technical risk management discussion and the public risk aversion debate. We also propose that the framework could be used for stress-testing resilience.

  17. Some considerations on the treatment of uncertainties in risk assessment for practical decision making

    International Nuclear Information System (INIS)

    Aven, Terje; Zio, Enrico

    2011-01-01

    This paper discusses the challenges involved in the representation and treatment of uncertainties in risk assessment, taking the point of view of its use in support to decision making. Two main issues are addressed: (1) how to faithfully represent and express the knowledge available to best support the decision making and (2) how to best inform the decision maker. A general risk-uncertainty framework is presented which provides definitions and interpretations of the key concepts introduced. The framework covers probability theory as well as alternative representations of uncertainty, including interval probability, possibility and evidence theory.

  18. Assessing the Uncertainty in QUANTEC's Dose–Response Relation of Lung and Spinal Cord With a Bootstrap Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wedenberg, Minna, E-mail: minna.wedenberg@raysearchlabs.com

    2013-11-15

    Purpose: To apply a statistical bootstrap analysis to assess the uncertainty in the dose–response relation for the endpoints pneumonitis and myelopathy reported in the QUANTEC review. Methods and Materials: The bootstrap method assesses the uncertainty of the estimated population-based dose–response relation due to sample variability, which reflects the uncertainty due to the limited numbers of patients in the studies. A large number of bootstrap replicates of the original incidence data were produced by random sampling with replacement. The analysis requires only the dose, the number of patients, and the number of occurrences of the studied endpoint, for each study. Two dose–response models, a Poisson-based model and the Lyman model, were fitted to each bootstrap replicate using maximum likelihood. Results: The bootstrap analysis generates a family of curves representing the range of plausible dose–response relations, and the 95% bootstrap confidence intervals give estimated upper and lower bounds on the toxicity risk. The curve families for the 2 dose–response models overlap for doses included in the studies at hand but diverge beyond that, with the Lyman model suggesting a steeper slope. The resulting distributions of the model parameters indicate correlation and non-Gaussian distributions. For both data sets, the likelihood of the observed data was higher for the Lyman model in >90% of the bootstrap replicates. Conclusions: The bootstrap method provides a statistical analysis of the uncertainty in the estimated dose–response relation for myelopathy and pneumonitis. It suggests likely model parameter values, their confidence intervals, and how they interrelate for each model. Finally, it can be used to evaluate to what extent the data support one model over another. For both data sets considered here, the Lyman model was preferred over the Poisson-based model.
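The loop the abstract describes (resample the incidence data, refit the dose-response model, read confidence limits off the replicate distribution) can be sketched as below. The grouped data, the one-parameter model P(d) = 1 − exp(−d/d0), and the binomial resampling of event counts are simplified illustrative assumptions, not the QUANTEC data or the paper's exact Poisson-based and Lyman models:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical grouped incidence data per study: mean dose [Gy],
# number of patients, number of events. Illustrative numbers only.
dose = np.array([10.0, 20.0, 30.0, 40.0])
n = np.array([80, 60, 40, 30])
k = np.array([14, 20, 18, 17])

def fit_d0(dose, n, k):
    """Grid-search MLE of d0 in a one-parameter model
    P(dose) = 1 - exp(-dose / d0), fitted to grouped binomial data."""
    d0 = np.linspace(5.0, 200.0, 2000)
    p = np.clip(1.0 - np.exp(-dose[:, None] / d0[None, :]), 1e-9, 1 - 1e-9)
    loglik = (k[:, None] * np.log(p) + (n - k)[:, None] * np.log(1 - p)).sum(axis=0)
    return d0[np.argmax(loglik)]

d0_hat = fit_d0(dose, n, k)

# Bootstrap: resample each study's event count and refit; the spread of the
# replicates estimates the sampling uncertainty of the dose-response fit.
boot = np.array([fit_d0(dose, n, rng.binomial(n, k / n)) for _ in range(500)])
ci_low, ci_high = np.percentile(boot, [2.5, 97.5])
```

Plotting `P(d) = 1 - exp(-d / d0)` for every bootstrap `d0` gives the "family of curves" the abstract refers to.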

  19. Nordic reference study on uncertainty and sensitivity analysis

    International Nuclear Information System (INIS)

    Hirschberg, S.; Jacobsson, P.; Pulkkinen, U.; Porn, K.

    1989-01-01

    This paper provides a review of the first phase of the Nordic reference study on uncertainty and sensitivity analysis. The main objective of this study is to use experiences from previous Nordic Benchmark Exercises and reference studies concerning critical modeling issues such as common cause failures and human interactions, and to demonstrate the impact of the associated uncertainties on the uncertainty of the investigated accident sequence. This has been done independently by three working groups which used different approaches to modeling and to uncertainty analysis. The estimated uncertainty interval for the analyzed accident sequence is large. The discrepancies between the groups are also substantial, but can be explained. The sensitivity analyses which have been carried out concern, e.g., the use of different CCF-quantification models, alternative handling of CCF data, time windows for operator actions and time dependences in phased mission operation, the impact of state-of-knowledge dependences, and the ranking of dominating uncertainty contributors. Specific findings with respect to these issues are summarized in the paper

  20. Health risks of climate change: An assessment of uncertainties and its implications for adaptation policies

    Science.gov (United States)

    2012-01-01

    Background Projections of health risks of climate change are surrounded with uncertainties in knowledge. Understanding of these uncertainties will help the selection of appropriate adaptation policies. Methods We made an inventory of conceivable health impacts of climate change, explored the type and level of uncertainty for each impact, and discussed its implications for adaptation policy. A questionnaire-based expert elicitation was performed using an ordinal scoring scale. Experts were asked to indicate the level of precision with which health risks can be estimated, given the present state of knowledge. We assessed the individual scores, the expertise-weighted descriptive statistics, and the argumentation given for each score. Suggestions were made for how dealing with uncertainties could be taken into account in climate change adaptation policy strategies. Results The results showed that the direction of change could be indicated for most anticipated health effects. For several potential effects, too little knowledge exists to indicate whether any impact will occur, or whether the impact will be positive or negative. For several effects, rough ‘order-of-magnitude’ estimates were considered possible. Factors limiting health impact quantification include: lack of data, multi-causality, unknown impacts considering a high-quality health system, complex cause-effect relations leading to multi-directional impacts, possible changes of present-day response-relations, and difficulties in predicting local climate impacts. Participants considered heat-related mortality and non-endemic vector-borne diseases particularly relevant for climate change adaptation. Conclusions For possible climate related health impacts characterised by ignorance, adaptation policies that focus on enhancing the health system’s and society’s capability of dealing with possible future changes, uncertainties and surprises (e.g. through resilience, flexibility, and adaptive capacity) are

  1. Risk-Based Two-Stage Stochastic Optimization Problem of Micro-Grid Operation with Renewables and Incentive-Based Demand Response Programs

    Directory of Open Access Journals (Sweden)

    Pouria Sheikhahmadi

    2018-03-01

    The operation problem of a micro-grid (MG) in grid-connected mode is an optimization problem in which the main objective of the MG operator (MGO) is to minimize the operation cost with optimal scheduling of resources and optimal trading of energy with the main grid. The MGO can use incentive-based demand response programs (DRPs) to pay an incentive to consumers to change their demands in the peak hours. Moreover, the MGO forecasts the output power of renewable energy resources (RERs) and models their uncertainties in its problem. In this paper, the operation problem of an MGO is modeled as a risk-based two-stage stochastic optimization problem. To model the uncertainties of RERs, two-stage stochastic programming is considered, and the conditional value at risk (CVaR) index is used to manage the MGO's risk level. Moreover, non-linear economic models of incentive-based DRPs are used by the MGO to change the peak load. Numerical studies are carried out to investigate the effect of incentive-based DRPs on the operation problem of the MGO. Moreover, to show the effect of the risk-averse parameter on MGO decisions, a sensitivity analysis is carried out.
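The CVaR index used to manage the operator's risk level has a simple closed form on a discrete scenario set. A minimal sketch, where the scenario costs and probabilities are assumed inputs (e.g. second-stage costs from the stochastic model):

```python
import numpy as np

def cvar(costs, probs, alpha=0.95):
    """CVaR_alpha of a discrete scenario distribution: the expected cost
    over the worst (1 - alpha) probability tail."""
    order = np.argsort(costs)
    c = np.asarray(costs, dtype=float)[order]
    p = np.asarray(probs, dtype=float)[order]
    cum = np.cumsum(p)
    first = np.argmax(cum > alpha)      # scenario straddling the alpha level
    w = p.copy()
    w[:first] = 0.0
    w[first] = cum[first] - alpha       # keep only the mass above alpha
    return float(np.dot(w, c) / (1.0 - alpha))
```

A risk-averse operator would then trade expectation against tail risk, e.g. minimizing a weighted sum such as (1 − β)·E[cost] + β·CVaR over its scheduling decisions, with β playing the role of the risk-averse parameter in the sensitivity analysis.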

  2. Uncertainty Assessment of Hydrological Frequency Analysis Using Bootstrap Method

    Directory of Open Access Journals (Sweden)

    Yi-Ming Hu

    2013-01-01

    Hydrological frequency analysis (HFA) is the foundation for hydraulic engineering design and water resources management. Hydrological extreme observations or samples are the basis for HFA; the representativeness of a sample series with respect to the population distribution is extremely important for the reliability of the estimated hydrological design value or quantile. However, for most hydrological extreme data obtained in practical applications, the sample size is usually small, for example, about 40-50 years in China. Generally, small samples cannot completely display the statistical properties of the population distribution, leading to uncertainties in the estimation of hydrological design values. In this paper, a new method based on the bootstrap is put forward to analyze the impact of sampling uncertainty on the design value. By the bootstrap resampling technique, a large number of bootstrap samples are constructed from the original flood extreme observations; the corresponding design value or quantile is estimated for each bootstrap sample, so that the sampling distribution of the design value is constructed; based on this sampling distribution, the uncertainty of the quantile estimation can be quantified. Compared with the conventional approach, this method provides not only a point estimate of a design value but also a quantitative evaluation of the uncertainties of the estimation.
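The scheme described above can be sketched end-to-end: resample the observed annual maxima with replacement, re-estimate the design quantile for each bootstrap sample, and summarize the resulting sampling distribution. The Gumbel distribution, the method-of-moments fit, and the synthetic 45-year series are illustrative assumptions, not the paper's choices:

```python
import numpy as np

rng = np.random.default_rng(3)

def design_value(sample, t_return):
    """T-year quantile from a Gumbel fit by the method of moments."""
    scale = sample.std(ddof=1) * np.sqrt(6.0) / np.pi
    loc = sample.mean() - 0.5772 * scale          # Euler-Mascheroni constant
    return loc - scale * np.log(-np.log(1.0 - 1.0 / t_return))

# Hypothetical 45-year annual-maximum flood series [m^3/s] (synthetic data)
flows = rng.gumbel(loc=800.0, scale=250.0, size=45)

q100 = design_value(flows, 100)                   # point estimate
boot = np.array([design_value(rng.choice(flows, size=flows.size, replace=True), 100)
                 for _ in range(1000)])
ci_low, ci_high = np.percentile(boot, [2.5, 97.5])
```

The width of `[ci_low, ci_high]` is exactly the sampling uncertainty of the 100-year design value that a single point estimate hides.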

  3. Forecasting Investment Risks in Conditions of Uncertainty

    Directory of Open Access Journals (Sweden)

    Andrenko Elena A.

    2017-04-01

    The article is aimed at studying the topical problem of evaluating and forecasting risks of the investment activity of enterprises in conditions of uncertainty. A review of research on qualitative and quantitative methods for evaluating investment risks revealed certain shortcomings of the proposed approaches, noted that most publications report no results of practical application, and identified promising directions. On the basis of the theory of fuzzy sets, a model for forecasting the expected risk has been proposed, making use of the Gaussian membership function, which has certain advantages over polygonal membership functions. Dependences of the investment risk on the parameters characterizing the investment project have been obtained. Using the formulas obtained, the total risk of investing in an innovation project depending on the boundary conditions has been defined. The profitability index was selected as the target indicator. The model provides potential investors and developers with forecasts of possible scenarios of the investment process, supporting informed managerial decisions about the appropriateness of introducing and implementing a project.

  4. Too Risk Averse to Stay Honest? Business Corruption, Uncertainty and Attitudes Toward Risk

    OpenAIRE

    Soreide, T.

    2009-01-01

    The presence of business corruption forces firms to choose between legal business approaches and illegal bribery. The outcome of a chosen strategy will usually be uncertain at the time the decision is made, and a firm's decision will depend partly on its attitude toward risk. Drawing on empirical results about business corruption, this paper describes the risks, uncertainties and benefits attached to bribery, and specifies their impact on firms' propensity to offer bribes. It then de...

  5. Information-theoretic approach to uncertainty importance

    International Nuclear Information System (INIS)

    Park, C.K.; Bari, R.A.

    1985-01-01

    A method is presented for importance analysis in probabilistic risk assessments (PRA) for which the results of interest are characterized by full uncertainty distributions and not just point estimates. The method is based on information theory, in which entropy is a measure of the uncertainty of a probability density function. We define the relative uncertainty importance between two events as the ratio of the two exponents of the entropies. For the log-normal and log-uniform distributions, the importance measure comprises the median (central tendency) and the logarithm of the error factor (uncertainty). Thus, if accident sequences are ranked this way, and the error factors are not all equal, then a different rank order would result than if the sequences were ranked by the central tendency measure alone. As an illustration, the relative importance of internal events and in-plant fires was computed on the basis of existing PRA results
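For the log-normal case the measure described here (the exponent of the differential entropy, which combines the median and the error factor) can be written down directly. A sketch under the usual PRA convention that the error factor is EF = (95th percentile)/median, so that the lognormal shape parameter is σ = ln(EF)/1.645; the parameterization is an assumption consistent with, but not copied from, the paper:

```python
import math

def exp_entropy_lognormal(median, error_factor):
    """exp(differential entropy) of a lognormal variable, parameterized by
    its median and 95% error factor EF = (95th percentile) / median.
    For a lognormal, h = mu + ln(sigma * sqrt(2*pi*e)), so
    exp(h) = median * sigma * sqrt(2*pi*e)."""
    sigma = math.log(error_factor) / 1.645        # lognormal shape parameter
    return median * sigma * math.sqrt(2.0 * math.pi * math.e)

def relative_importance(med1, ef1, med2, ef2):
    """Relative uncertainty importance of event 1 over event 2:
    the ratio of the exponents of the two entropies."""
    return exp_entropy_lognormal(med1, ef1) / exp_entropy_lognormal(med2, ef2)
```

Note how the ratio scales linearly in both the median and ln(EF): two sequences with equal medians but error factors 10 and 3 differ in importance by ln(10)/ln(3), which is the re-ranking effect the abstract points out.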

  6. Risk-sensitive optimal feedback control accounts for sensorimotor behavior under uncertainty.

    Directory of Open Access Journals (Sweden)

    Arne J Nagengast

    2010-07-01

    Many aspects of human motor behavior can be understood using optimality principles such as optimal feedback control. However, these proposed optimal control models are risk-neutral; that is, they are indifferent to the variability of the movement cost. Here, we propose the use of a risk-sensitive optimal controller that incorporates movement cost variance either as an added cost (risk-averse controller) or as an added value (risk-seeking controller) to model human motor behavior in the face of uncertainty. We use a sensorimotor task to test the hypothesis that subjects are risk-sensitive. Subjects controlled a virtual ball undergoing Brownian motion towards a target. Subjects were required to minimize an explicit cost, in points, that was a combination of the final positional error of the ball and the integrated control cost. By testing subjects on different levels of Brownian motion noise and relative weighting of the position and control cost, we could distinguish between risk-sensitive and risk-neutral control. We show that subjects change their movement strategy pessimistically in the face of increased uncertainty in accord with the predictions of a risk-averse optimal controller. Our results suggest that risk-sensitivity is a fundamental attribute that needs to be incorporated into optimal feedback control models.
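A common way to make such risk-sensitivity concrete is the mean-variance form of the objective, J = E[cost] + θ·Var[cost], where θ > 0 is risk-averse and θ < 0 is risk-seeking. The one-step ball-correction simulation below is a loose illustration of evaluating controllers under this criterion, not the authors' actual model or task parameters:

```python
import numpy as np

rng = np.random.default_rng(4)

def simulate_cost(gain, noise_sd, n=20000):
    """One-step caricature of the ball-to-target task: a corrective command
    u = -gain * x is applied to a noisy ball position x, and the cost
    combines the squared final error with a quadratic control cost."""
    x = rng.normal(0.0, noise_sd, n)          # ball position before correction
    u = -gain * x
    err = x + u + rng.normal(0.0, 0.2, n)     # residual positional error
    return err ** 2 + 0.1 * u ** 2

def risk_sensitive_value(cost_samples, theta):
    """Mean-variance form of a risk-sensitive objective: theta > 0 penalizes
    cost variability (risk-averse), theta < 0 rewards it (risk-seeking)."""
    return cost_samples.mean() + theta * cost_samples.var()
```

Scoring each candidate `gain` with `risk_sensitive_value` at different θ shows how a risk-averse controller can prefer a strategy with slightly higher mean cost but lower cost variance, which is the behavioral signature the study tests for.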

  7. Facing uncertainty in ecosystem services-based resource management.

    Science.gov (United States)

    Grêt-Regamey, Adrienne; Brunner, Sibyl H; Altwegg, Jürg; Bebi, Peter

    2013-09-01

    The concept of ecosystem services is increasingly used as a support for natural resource management decisions. While the science for assessing ecosystem services is improving, appropriate methods to address uncertainties in a quantitative manner are missing. Ignoring parameter uncertainties, modeling uncertainties and uncertainties related to human-environment interactions can modify decisions and lead to overlooking important management possibilities. In this contribution, we present a new approach for mapping the uncertainties in the assessment of multiple ecosystem services. The spatially explicit risk approach links Bayesian networks to a Geographic Information System for forecasting the value of a bundle of ecosystem services and quantifies the uncertainties related to the outcomes in a spatially explicit manner. We demonstrate that mapping uncertainties in ecosystem services assessments provides key information for decision-makers seeking critical areas in the delivery of ecosystem services in a case study in the Swiss Alps. The results suggest that not only the total value of the bundle of ecosystem services is highly dependent on uncertainties, but the spatial pattern of the ecosystem services values changes substantially when considering uncertainties. This is particularly important for the long-term management of mountain forest ecosystems, which have long rotation stands and are highly sensitive to pressing climate and socio-economic changes. Copyright © 2012 Elsevier Ltd. All rights reserved.

  8. Use of meteorological information in the risk analysis of a mixed wind farm and solar power plant portfolio

    Science.gov (United States)

    Mengelkamp, H.-T.; Bendel, D.

    2010-09-01

    The renewable energy industry has rapidly developed during the last two decades, and so have the needs for high-quality, comprehensive meteorological services. It is, however, only recently that international financial institutions bundle wind farms and solar power plants and offer shares in these aggregate portfolios. The monetary value of a mixed wind farm and solar power plant portfolio is determined by legal and technical aspects, the expected annual energy production of each wind farm and solar power plant, and the associated uncertainty of the energy yield estimation, i.e. the investment risk. Building an aggregate portfolio reduces the overall uncertainty through diversification, in contrast to the single wind farm or solar power plant energy yield uncertainty. This is similar to equity funds based on a variety of companies or products. Meteorological aspects contribute to the diversification in various ways. There is the uncertainty in the estimation of the expected long-term mean energy production of the wind and solar power plants. Different components of uncertainty have to be considered depending on whether the power plant is already in operation or in the planning phase. The uncertainty related to a wind farm in the planning phase comprises the methodology of the wind potential estimation and the uncertainty of the site-specific wind turbine power curve, as well as the uncertainty of the wind farm effect calculation. The uncertainty related to a solar power plant in the pre-operational phase comprises the uncertainty of the radiation data base and that of the performance curve. The long-term mean annual energy yield of operational wind farms and solar power plants is estimated on the basis of the actual energy production and it

  9. Evaluation of severe accident risks: Methodology for the containment, source term, consequence, and risk integration analyses. Volume 1, Revision 1

    International Nuclear Information System (INIS)

    Gorham, E.D.; Breeding, R.J.; Brown, T.D.; Harper, F.T.; Helton, J.C.; Murfin, W.B.; Hora, S.C.

    1993-12-01

    NUREG-1150 examines the risk to the public from five nuclear power plants. The NUREG-1150 plant studies are Level III probabilistic risk assessments (PRAs) and, as such, they consist of four analysis components: accident frequency analysis, accident progression analysis, source term analysis, and consequence analysis. This volume summarizes the methods utilized in performing the last three components and the assembly of these analyses into an overall risk assessment. The NUREG-1150 analysis approach is based on the following ideas: (1) general and relatively fast-running models for the individual analysis components, (2) well-defined interfaces between the individual analysis components, (3) use of Monte Carlo techniques together with an efficient sampling procedure to propagate uncertainties, (4) use of expert panels to develop distributions for important phenomenological issues, and (5) automation of the overall analysis. Many features of the new analysis procedures were adopted to facilitate a comprehensive treatment of uncertainty in the complete risk analysis. Uncertainties in the accident frequency, accident progression and source term analyses were included in the overall uncertainty assessment. The uncertainties in the consequence analysis were not included in this assessment. A large effort was devoted to the development of procedures for obtaining expert opinion and the execution of these procedures to quantify parameters and phenomena for which there is large uncertainty and divergent opinions in the reactor safety community

  10. Tolerance analysis in manufacturing using process capability ratio with measurement uncertainty

    DEFF Research Database (Denmark)

    Mahshid, Rasoul; Mansourvar, Zahra; Hansen, Hans Nørgaard

    2017-01-01

    Tolerance analysis provides valuable information regarding the performance of a manufacturing process: it allows determining the maximum possible variation of a quality feature in production. Previous research has focused on the application of tolerance analysis to the design of mechanical assemblies. In this paper, a new statistical analysis was applied to manufactured products to assess achieved tolerances when the process is known, using the capability ratio and expanded uncertainty. The analysis has benefits for process planning, determining actual precision limits, process optimization, and troubleshooting malfunctioning existing parts. The capability measure is based on a number of measurements performed on the part's quality variable. Since the ratio relies on measurements, any unaccounted measurement error has a notable negative impact on results. Therefore, measurement uncertainty was used in combination with the process capability ratio.
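    One common way to fold expanded measurement uncertainty into a capability ratio is to shrink the specification band by U on each side; the sketch below assumes that convention (the paper's exact formulation may differ), with invented measurement data.

```python
import statistics

def capability_with_uncertainty(samples, lsl, usl, expanded_u):
    """Process capability ratio Cp, plus a conservative variant that
    narrows the specification band by the expanded measurement
    uncertainty U on each side (one common convention; an assumption
    here, not necessarily the paper's formulation)."""
    sigma = statistics.stdev(samples)
    cp = (usl - lsl) / (6 * sigma)
    cp_u = (usl - lsl - 2 * expanded_u) / (6 * sigma)
    return cp, cp_u

# hypothetical shaft-diameter measurements (mm)
samples = [10.01, 9.98, 10.02, 10.00, 9.99, 10.01, 10.03, 9.97]
cp, cp_u = capability_with_uncertainty(samples, lsl=9.90, usl=10.10, expanded_u=0.01)
print(cp, cp_u)  # the uncertainty-adjusted ratio is the smaller of the two
```

    Ignoring U (using cp alone) overstates capability, which is the "notable negative impact" the abstract points to.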

  11. Flood Risk Assessment Based On Security Deficit Analysis

    Science.gov (United States)

    Beck, J.; Metzger, R.; Hingray, B.; Musy, A.

    Risk is a human perception: a given risk may be considered as acceptable or unacceptable depending on the group that has to face that risk. Flood risk analysis often estimates economic losses from damages, but neglects the question of acceptable/unacceptable risk. With input from land use managers, politicians and other stakeholders, risk assessment based on security deficit analysis determines objects with unacceptable risk and their degree of security deficit. Such a risk assessment methodology, initially developed by the Swiss federal authorities, is illustrated by its application on a reach of the Alzette River (Luxembourg) in the framework of the IRMA-SPONGE FRHYMAP project. Flood risk assessment always involves a flood hazard analysis, an exposed object vulnerability analysis, and an analysis combining the results of these two previous analyses. The flood hazard analysis was done with the quasi-2D hydraulic model FldPln to produce flood intensity maps. Flood intensity was determined by the water height and velocity. Object data for the vulnerability analysis, provided by the Luxembourg government, were classified according to their potential damage. Potential damage is expressed in terms of direct, human life and secondary losses. A thematic map was produced to show the object classification. Protection goals were then attributed to the object classes. Protection goals are assigned in terms of an acceptable flood intensity for a certain flood frequency. This is where input from land use managers and politicians comes into play. The perception of risk in the region or country influences the protection goal assignment. Protection goals as used in Switzerland were used in this project. Thematic maps showing the protection goals of each object in the case study area for a given flood frequency were produced. Comparison between an object's protection goal and the intensity of the flood that touched the object determines the acceptability of the risk and the
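    The core comparison of the methodology, modelled flood intensity at an object versus the protection goal of its class, can be sketched as follows. The classes, goal values, and intensity units are illustrative placeholders, not the Swiss standard values.

```python
def risk_acceptability(objects, protection_goals, flood_intensity):
    """Classify each exposed object by comparing the modelled flood
    intensity at its location with the protection goal (acceptable
    intensity for a given flood frequency) of its object class."""
    result = {}
    for obj, cls in objects.items():
        goal = protection_goals[cls]       # acceptable intensity for this class
        intensity = flood_intensity[obj]   # from the hydraulic model
        result[obj] = "acceptable" if intensity <= goal else "security deficit"
    return result

# hypothetical object classes and modelled intensities for a 100-year flood
objects = {"school": "high_damage", "shed": "low_damage"}
protection_goals = {"high_damage": 0.0, "low_damage": 1.0}  # illustrative intensity units
flood_intensity = {"school": 0.4, "shed": 0.4}
print(risk_acceptability(objects, protection_goals, flood_intensity))
# the same intensity is a security deficit for the school but acceptable for the shed
```

    The same flood produces different verdicts per object, which is exactly why the method needs stakeholder-defined protection goals rather than a single damage threshold.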

  12. Methodology for risk-based analysis of technical specifications

    International Nuclear Information System (INIS)

    Vesely, W.E.; Gaertner, J.P.; Wagner, D.P.

    1985-01-01

    Part of the effort by EPRI to apply probabilistic risk assessment methods and results to the solution of utility problems involves the investigation of methods for risk-based analysis of technical specifications. The culmination of this investigation is the SOCRATES computer code developed by Battelle's Columbus Laboratories to assist in the evaluation of technical specifications of nuclear power plants. The program is designed to use information found in PRAs to re-evaluate risk for changes in component allowed outage times (AOTs) and surveillance test intervals (STIs). The SOCRATES program is a unique and important tool for technical specification evaluations. The detailed component unavailability model allows a detailed analysis of AOT and STI contributions to risk. Explicit equations allow fast and inexpensive calculations. Because the code is designed to accept ranges of parameters and to save results of calculations that do not change during the analysis, sensitivity studies are efficiently performed and results are clearly displayed
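    The AOT/STI dependence of component unavailability that SOCRATES evaluates can be illustrated with a simplified stand-in for its detailed model: the classic lambda*T/2 term for failures that stay undetected between surveillance tests, plus a downtime term for allowed outages. All rates below are invented.

```python
def mean_unavailability(lam, sti_hours, aot_hours, outage_rate):
    """Time-averaged component unavailability: a standing lambda*T/2 term
    for failures undetected between tests at interval T (the STI), plus
    the fraction of time spent in maintenance outages of duration AOT.
    A simplified stand-in for SOCRATES' detailed unavailability model."""
    test_term = lam * sti_hours / 2          # undetected failure between tests
    outage_term = outage_rate * aot_hours    # fraction of time in allowed outage
    return test_term + outage_term

base = mean_unavailability(lam=1e-5, sti_hours=720, aot_hours=24, outage_rate=1e-4)
longer_aot = mean_unavailability(lam=1e-5, sti_hours=720, aot_hours=72, outage_rate=1e-4)
print(base, longer_aot)  # extending the AOT raises the mean unavailability
```

    Because the expression is an explicit closed form, scanning ranges of AOT and STI values is cheap, which mirrors the "fast and inexpensive calculations" point in the abstract.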

  13. On treatment of uncertainty in system planning

    International Nuclear Information System (INIS)

    Flage, R.; Aven, T.

    2009-01-01

    In system planning and operation considerable efforts and resources are spent to reduce uncertainties, as a part of project management, uncertainty management and safety management. The basic idea seems to be that uncertainties are purely negative and should be reduced. In this paper we challenge this way of thinking, using a common industry practice as an example. In accordance with this industry practice, three uncertainty interval categories are used: ±40% intervals for the feasibility phase, ±30% intervals for the concept development phase and ±20% intervals for the engineering phase. The problem is that such a regime could easily lead to a conservative management regime encouraging the use of existing methods and tools, as new activities and novel solutions and arrangements necessarily mean increased uncertainties. In the paper we suggest an alternative approach based on uncertainty and risk descriptions, but having no predefined uncertainty reduction structures. The approach makes use of risk assessments and economic optimisation tools such as the expected net present value, but acknowledges the need for broad risk management processes which extend beyond the analyses. Different concerns need to be balanced, including economic aspects, uncertainties and risk, and practicability
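    The interplay between the phase-dependent interval categories and an expected-NPV criterion can be sketched as follows (project figures are hypothetical). With a symmetric cost interval the expected NPV barely moves across phases; only the spread shrinks, so the intervals describe risk, not expected value.

```python
import random
import statistics

def npv_distribution(cash_flows, rate, cost_uncertainty, n=5000, seed=1):
    """Sample the NPV when the upfront cost (cash_flows[0]) is known only
    to within a symmetric relative interval, e.g. the +/-40/30/20%
    categories for the feasibility, concept and engineering phases.
    Returns the mean and standard deviation of the sampled NPVs."""
    rng = random.Random(seed)
    npvs = []
    for _ in range(n):
        factor = 1 + rng.uniform(-cost_uncertainty, cost_uncertainty)
        flows = [cash_flows[0] * factor] + list(cash_flows[1:])
        npvs.append(sum(cf / (1 + rate) ** t for t, cf in enumerate(flows)))
    return statistics.mean(npvs), statistics.stdev(npvs)

flows = [-100, 30, 30, 30, 30, 30]  # hypothetical project, 8% discount rate
for u in (0.40, 0.30, 0.20):
    print(npv_distribution(flows, 0.08, u))  # mean is stable, spread shrinks
```

    This is consistent with the paper's argument: tightening the interval from one phase to the next reduces described uncertainty, but it says nothing by itself about whether the project's expected value improves.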

  14. Joint analysis of input and parametric uncertainties in watershed water quality modeling: A formal Bayesian approach

    Science.gov (United States)

    Han, Feng; Zheng, Yi

    2018-06-01

    Significant input uncertainty is a major source of error in watershed water quality (WWQ) modeling. It remains challenging to address the input uncertainty in a rigorous Bayesian framework. This study develops the Bayesian Analysis of Input and Parametric Uncertainties (BAIPU), an approach for the joint analysis of input and parametric uncertainties through a tight coupling of Markov Chain Monte Carlo (MCMC) analysis and Bayesian Model Averaging (BMA). The formal likelihood function for this approach is derived considering a lag-1 autocorrelated, heteroscedastic, and Skew Exponential Power (SEP) distributed error model. A series of numerical experiments were performed based on a synthetic nitrate pollution case and on a real study case in the Newport Bay Watershed, California. The Soil and Water Assessment Tool (SWAT) and Differential Evolution Adaptive Metropolis (DREAM(ZS)) were used as the representative WWQ model and MCMC algorithm, respectively. The major findings include the following: (1) the BAIPU can be implemented and used to appropriately identify the uncertain parameters and characterize the predictive uncertainty; (2) the compensation effect between the input and parametric uncertainties can seriously mislead modeling-based management decisions if the input uncertainty is not explicitly accounted for; (3) the BAIPU accounts for the interaction between the input and parametric uncertainties and therefore provides more accurate calibration and uncertainty results than a sequential analysis of the uncertainties; and (4) the BAIPU quantifies the credibility of different input assumptions on a statistical basis and can be implemented as an effective inverse modeling approach to the joint inference of parameters and inputs.
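    The structure of such a formal likelihood can be sketched in simplified form. The version below substitutes a Gaussian density for the SEP distribution (an assumption made purely for brevity; the paper's likelihood is SEP-based) but keeps the lag-1 autocorrelation removal and the heteroscedastic standard deviation. All data values are invented.

```python
import math

def log_likelihood(obs, sim, phi, sigma0, sigma1):
    """Log-likelihood under a lag-1 autocorrelated (coefficient phi) and
    heteroscedastic (sigma = sigma0 + sigma1 * |sim|) error model.
    Gaussian is used in place of the Skew Exponential Power density for
    brevity; the overall structure is the same."""
    raw = [o - s for o, s in zip(obs, sim)]
    # remove lag-1 autocorrelation from the raw residuals
    decorr = [raw[0]] + [raw[t] - phi * raw[t - 1] for t in range(1, len(raw))]
    ll = 0.0
    for t, e in enumerate(decorr):
        sigma = sigma0 + sigma1 * abs(sim[t])  # heteroscedastic spread
        ll += -0.5 * math.log(2 * math.pi * sigma ** 2) - e ** 2 / (2 * sigma ** 2)
    return ll

obs = [2.1, 2.5, 3.0, 2.8]   # hypothetical nitrate observations
sim = [2.0, 2.4, 3.2, 2.7]   # model output for one parameter/input sample
print(log_likelihood(obs, sim, phi=0.5, sigma0=0.1, sigma1=0.05))
```

    An MCMC sampler such as DREAM(ZS) would evaluate a function of this shape at every proposed parameter (and, in BAIPU, input) sample.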

  15. County-Level Climate Uncertainty for Risk Assessments: Volume 1.

    Energy Technology Data Exchange (ETDEWEB)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M; Walker, La Tonya Nicole; Roberts, Barry L; Malczynski, Leonard A.

    2017-06-01

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous areas on temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation, because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  16. Nuclear data sensitivity/uncertainty analysis for XT-ADS

    International Nuclear Information System (INIS)

    Sugawara, Takanori; Sarotto, Massimo; Stankovskiy, Alexey; Van den Eynde, Gert

    2011-01-01

    Highlights: → Sensitivity and uncertainty analyses were performed to assess the reliability of the XT-ADS neutronic design. → The uncertainties deduced from the covariance data for the XT-ADS criticality were 0.94%, 1.9% and 1.1% with the SCALE 44-group, TENDL-2009 and JENDL-3.3 data, respectively. → These uncertainties do not meet the target accuracy of 0.3%Δk for the criticality. → To achieve this accuracy, the uncertainties should be reduced through experiments under adequate conditions. - Abstract: The XT-ADS, an accelerator-driven system for an experimental demonstration, has been investigated in the framework of the IP EUROTRANS FP6 project. In this study, sensitivity and uncertainty analyses were performed to assess the reliability of the XT-ADS neutronic design. The sensitivity analysis showed that the sensitivity coefficients differ significantly depending on the geometry models and calculation codes used. The uncertainty analysis confirmed that the uncertainties deduced from the covariance data vary significantly with the covariance library chosen: for the XT-ADS criticality they were 0.94%, 1.9% and 1.1% with the SCALE 44-group, TENDL-2009 and JENDL-3.3 data, respectively. These uncertainties do not satisfy the target accuracy of 0.3%Δk for the criticality. To achieve this accuracy, the uncertainties should be reduced through experiments under adequate conditions.
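    The uncertainty deduced from covariance data is typically computed with the first-order "sandwich rule": the relative variance of k_eff is S C Sᵀ, where S is the sensitivity profile and C the relative covariance matrix of the nuclear data. A two-group toy example follows; the numbers are invented, not XT-ADS data.

```python
def uncertainty_from_covariance(sensitivities, covariance):
    """First-order 'sandwich rule': relative variance of the response
    (e.g. k_eff) = S * C * S^T, with S the sensitivity coefficients and
    C the relative covariance matrix of the underlying nuclear data."""
    n = len(sensitivities)
    var = sum(sensitivities[i] * covariance[i][j] * sensitivities[j]
              for i in range(n) for j in range(n))
    return var ** 0.5

# hypothetical 2-group sensitivities and relative covariance matrix
S = [0.3, -0.1]
C = [[4e-4, 1e-4],
     [1e-4, 9e-4]]
print(uncertainty_from_covariance(S, C))  # relative uncertainty of the response
```

    Running the same S against different covariance libraries is precisely how library-dependent uncertainties like the 0.94% / 1.9% / 1.1% figures above arise.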

  17. Selection of Representative Models for Decision Analysis Under Uncertainty

    Science.gov (United States)

    Meira, Luis A. A.; Coelho, Guilherme P.; Santos, Antonio Alberto S.; Schiozer, Denis J.

    2016-03-01

    The decision-making process in oil fields includes a step of risk analysis associated with the uncertainties present in the variables of the problem. Such uncertainties lead to hundreds, even thousands, of possible scenarios that must be analyzed so that an effective production strategy can be selected. Given this high number of scenarios, a technique to reduce this set to a smaller, feasible subset of representative scenarios is imperative. The selected scenarios must be representative of the original set and also free of optimistic and pessimistic bias. This paper proposes an assisted methodology to identify representative models in oil fields. To do so, first a mathematical function was developed to model the representativeness of a subset of models with respect to the full set that characterizes the problem. Then, an optimization tool was implemented to identify the representative models of any problem, considering not only the cross-plots of the main output variables, but also the risk curves and the probability distribution of the attribute-levels of the problem. The proposed technique was applied to two benchmark cases and the results, evaluated by experts in the field, indicate that the obtained solutions are richer than those identified by previously adopted manual approaches. The program bytecode is available upon request.
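    A greatly simplified version of the selection idea, choosing a subset whose statistics track the full scenario set, can be sketched as follows. This is a greedy heuristic over the mean and spread only, whereas the paper optimizes a representativeness function over cross-plots, risk curves and attribute distributions; all scenario values are invented.

```python
import statistics

def select_representative(values, k):
    """Greedily pick k representative scenarios: at each step add the
    candidate that keeps the subset's mean and spread closest to the
    full set's. A toy stand-in for the paper's richer optimization."""
    target_mean = statistics.mean(values)
    target_sd = statistics.pstdev(values)

    def mismatch(subset):
        m = statistics.mean(subset)
        sd = statistics.pstdev(subset) if len(subset) > 1 else 0.0
        return abs(m - target_mean) + abs(sd - target_sd)

    chosen, remaining = [], list(values)
    for _ in range(k):
        best = min(remaining, key=lambda v: mismatch(chosen + [v]))
        chosen.append(best)
        remaining.remove(best)
    return sorted(chosen)

npvs = [12, 35, 48, 50, 51, 53, 55, 62, 80, 110]  # hypothetical scenario NPVs
print(select_representative(npvs, 3))
```

    Matching the spread, not just the mean, is what keeps the subset free of the optimistic or pessimistic bias the abstract warns about.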

  18. Risk Based Milk Pricing Model at Dairy Farmers Level

    Directory of Open Access Journals (Sweden)

    W. Septiani

    2017-12-01

    The milk price paid by a cooperative institution to farmers does not fully cover the production cost, even though dairy farmers encounter various risks and uncertainties in conducting their business. The highest risk in the milk supply lies in the activities at the farm. This study was designed to formulate a model for calculating the milk price at the farmer's level based on risk. Risks that occur on farms include the risks of cow breeding, sanitation, health care, cattle feed management, milking and milk sales. This research used farm locations in the West Java region. There were five main stages in the preparation of this model: (1) identification and analysis of influential factors, (2) development of a conceptual model, (3) structural analysis and determination of production costs, (4) model calculation of production cost with risk factors, and (5) the risk based milk pricing model. This research built a relationship between the risks on smallholder dairy farms and the production costs to be incurred by the farmers. It also derived a formulation of the risk adjustment factor calculation for the variable costs of production on a dairy cattle farm. The difference between the production costs with risk and the total production cost without risk was about 8% to 10%. It could be concluded that the basic price of milk proposed based on the research was around IDR 4,250-IDR 4,350/L for ownership of 3 to 4 cows. Increasing farmer income was expected to be obtained by entering the value of this risk in the calculation of production costs.
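    The risk adjustment idea, inflating each variable-cost component by an activity-specific risk factor, can be sketched as follows. The activities, cost figures and factors are invented, chosen only so that the aggregate premium lands in the 8-10% range reported above; they are not the study's values.

```python
def risk_adjusted_cost(variable_costs, risk_factors):
    """Total variable production cost with each activity's cost inflated
    by its risk adjustment factor (breeding, sanitation, health care,
    feed management, milking, milk sales)."""
    return sum(c * (1 + r) for c, r in zip(variable_costs, risk_factors))

# hypothetical monthly costs (IDR thousands) and risk factors per activity
costs = [900, 150, 200, 1200, 300, 250]
risks = [0.12, 0.05, 0.08, 0.10, 0.06, 0.07]
base = sum(costs)
with_risk = risk_adjusted_cost(costs, risks)
print(with_risk / base - 1)  # relative premium, roughly 8-10% as in the study
```

    Dividing the risk-adjusted cost by monthly milk volume would then yield the risk-based floor price per litre.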

  19. Probabilistic accident consequence uncertainty analysis: Food chain uncertainty assessment. Volume 2: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Brown, J. [National Radiological Protection Board (United Kingdom); Goossens, L.H.J.; Kraan, B.C.P. [Delft Univ. of Technology (Netherlands)] [and others

    1997-06-01

    This volume is the second of a two-volume document that summarizes a joint project by the US Nuclear Regulatory Commission and the Commission of European Communities to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This two-volume report, which examines mechanisms and uncertainties of transfer through the food chain, is the first in a series of five such reports. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain transfer that affect calculations of offsite radiological consequences. Seven of the experts reported on transfer into the food chain through soil and plants, nine reported on transfer via food products from animals, and two reported on both. The expert judgment elicitation procedure and its outcomes are described in these volumes. This volume contains seven appendices. Appendix A presents a brief discussion of the MACCS and COSYMA model codes. Appendix B is the structure document and elicitation questionnaire for the expert panel on soils and plants. Appendix C presents the rationales and responses of each of the members of the soils and plants expert panel. Appendix D is the structure document and elicitation questionnaire for the expert panel on animal transfer. The rationales and responses of each of the experts on animal transfer are given in Appendix E. Brief biographies of the food chain expert panel members are provided in Appendix F. Aggregated results of expert responses are presented in graph format in Appendix G.
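    Equal-weight pooling of elicited quantiles is one standard way to aggregate such expert panels. The sketch below, a crude two-piece-uniform reading of each expert's 5th/50th/95th percentiles with invented transfer-factor values, is an assumption for illustration, not the project's actual aggregation procedure.

```python
import random

def aggregate_experts(quantile_sets, n=10000, seed=0):
    """Equal-weight aggregation of expert judgments: pick an expert
    uniformly at random, then draw from a crude two-piece-uniform
    distribution between that expert's elicited (5th, 50th, 95th)
    percentiles; return the median of the pooled samples."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        q05, q50, q95 = rng.choice(quantile_sets)
        # half the mass below the median, half above (very rough reading)
        if rng.random() < 0.5:
            samples.append(rng.uniform(q05, q50))
        else:
            samples.append(rng.uniform(q50, q95))
    samples.sort()
    return samples[len(samples) // 2]  # pooled median

# hypothetical soil-to-plant transfer factors from three experts
experts = [(1e-4, 1e-3, 1e-2), (5e-4, 2e-3, 8e-3), (2e-4, 5e-4, 5e-3)]
print(aggregate_experts(experts))
```

    Equal weighting keeps every expert's traceable distribution visible in the pooled result, which matches the project's emphasis on credible, traceable uncertainty distributions.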

  20. Probabilistic accident consequence uncertainty analysis: Food chain uncertainty assessment. Volume 2: Appendices

    International Nuclear Information System (INIS)

    Brown, J.; Goossens, L.H.J.; Kraan, B.C.P.

    1997-06-01

    This volume is the second of a two-volume document that summarizes a joint project by the US Nuclear Regulatory Commission and the Commission of European Communities to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This two-volume report, which examines mechanisms and uncertainties of transfer through the food chain, is the first in a series of five such reports. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain transfer that affect calculations of offsite radiological consequences. Seven of the experts reported on transfer into the food chain through soil and plants, nine reported on transfer via food products from animals, and two reported on both. The expert judgment elicitation procedure and its outcomes are described in these volumes. This volume contains seven appendices. Appendix A presents a brief discussion of the MACCS and COSYMA model codes. Appendix B is the structure document and elicitation questionnaire for the expert panel on soils and plants. Appendix C presents the rationales and responses of each of the members of the soils and plants expert panel. Appendix D is the structure document and elicitation questionnaire for the expert panel on animal transfer. The rationales and responses of each of the experts on animal transfer are given in Appendix E. Brief biographies of the food chain expert panel members are provided in Appendix F. Aggregated results of expert responses are presented in graph format in Appendix G.