WorldWideScience

Sample records for significant uncertainties exist

  1. Summary of existing uncertainty methods

    International Nuclear Information System (INIS)

    Glaeser, Horst

    2013-01-01

    A summary of the existing and most used uncertainty methods is presented, and their main features are compared. One of these methods is the order statistics method based on Wilks' formula. It is applied in safety research as well as in licensing. This method was first proposed by GRS for use in deterministic safety analysis, and is now used by many organisations world-wide. Its advantage is that the number of potential uncertain input and output parameters is not limited to a small number. Such a limitation was necessary for the first demonstration of the Code Scaling, Applicability and Uncertainty (CSAU) method by the United States Nuclear Regulatory Commission (USNRC). They did not apply Wilks' formula in their statistical method propagating input uncertainties to obtain the uncertainty of a single output variable, like peak cladding temperature. A Phenomena Identification and Ranking Table (PIRT) was set up in order to limit the number of uncertain input parameters and, consequently, the number of calculations to be performed. Another purpose of such a PIRT process is to identify the most important physical phenomena which a computer code should be able to calculate. The validation of the code should be focused on the identified phenomena. In some applications, response surfaces are used in place of the computer code to perform a high number of calculations. The second well-known uncertainty method is the Uncertainty Methodology Based on Accuracy Extrapolation (UMAE) and its follow-up, the Code with the Capability of Internal Assessment of Uncertainty (CIAU), developed by the University of Pisa. Unlike the statistical approaches, the CIAU compares experimental data with calculation results; it does not consider uncertain input parameters. Therefore, the CIAU is highly dependent on the experimental database. The accuracy gained from the comparison between experimental data and calculated results is extrapolated to obtain the uncertainty of the system code predictions.
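    The sample-size rule behind the order-statistics approach can be illustrated with a short sketch (a minimal illustration of the first-order, one-sided Wilks formula, not GRS's actual implementation):

```python
def wilks_sample_size(coverage, confidence):
    """Smallest number of code runs N such that the largest of N output
    samples bounds the `coverage` quantile with probability `confidence`
    (first-order, one-sided Wilks formula: 1 - coverage**N >= confidence)."""
    n = 1
    while 1.0 - coverage ** n < confidence:
        n += 1
    return n

# The classic 95%/95% statement used in LOCA analyses needs 59 runs,
# independently of how many uncertain input parameters are varied.
print(wilks_sample_size(0.95, 0.95))  # -> 59
```

    Because the bound holds regardless of the number of uncertain inputs, the required number of code runs stays fixed for a given tolerance statement, which is the advantage the abstract describes.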

  2. From risk management to uncertainty management: a significant change in project management

    Institute of Scientific and Technical Information of China (English)

    LI Gui-jun; ZHANG Yue-song

    2006-01-01

    Starting with the meanings of the terms "risk" and "uncertainty," the paper compares uncertainty management with risk management in project management. We cast some doubt on the interchangeable use of "risk" and "uncertainty" in project management and argue that their scope, methods, responses, monitoring and controlling should differ as well. Illustrations are given covering terminology, description, and treatment from the different perspectives of uncertainty management and risk management. Furthermore, the paper maintains that project risk management (PRM) processes might be modified to facilitate an uncertainty management perspective, and we argue that project uncertainty management (PUM) can enlarge its contribution to improving project management performance, which will result in a significant change in emphasis compared with most risk management.

  3. Do the Uncertainty Relations Really have Crucial Significances for Physics?

    Directory of Open Access Journals (Sweden)

    Dumitru S.

    2010-10-01

    Full Text Available It is proved that the idea that the Uncertainty Relations (UR) have crucial significance for physics is false. Additionally, one argues for the necessity of an UR-disconnected quantum philosophy.

  4. Optimized Clustering Estimators for BAO Measurements Accounting for Significant Redshift Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Ross, Ashley J. [Portsmouth U., ICG; Banik, Nilanjan [Fermilab; Avila, Santiago [Madrid, IFT; Percival, Will J. [Portsmouth U., ICG; Dodelson, Scott [Fermilab; Garcia-Bellido, Juan [Madrid, IFT; Crocce, Martin [ICE, Bellaterra; Elvin-Poole, Jack [Jodrell Bank; Giannantonio, Tommaso [Cambridge U., KICC; Manera, Marc [Cambridge U., DAMTP; Sevilla-Noarbe, Ignacio [Madrid, CIEMAT

    2017-05-15

    We determine an optimized clustering statistic to be used for galaxy samples with significant redshift uncertainty, such as those that rely on photometric redshifts. To do so, we study the BAO information content as a function of the orientation of galaxy clustering modes with respect to their angle to the line-of-sight (LOS). The clustering along the LOS, as observed in redshift space with significant redshift uncertainty, has contributions from clustering modes with a range of orientations with respect to the true LOS. For redshift uncertainty $\sigma_z \geq 0.02(1+z)$ we find that while the BAO information is confined to transverse clustering modes in the true space, it is spread nearly evenly in the observed space. Thus, measuring clustering in terms of the projected separation (regardless of the LOS) is an efficient and nearly lossless compression of the signal for $\sigma_z \geq 0.02(1+z)$. For reduced redshift uncertainty, more careful consideration is required. We then use more than 1700 realizations of galaxy simulations mimicking the Dark Energy Survey Year 1 sample to validate our analytic results and optimized analysis procedure. We find that using the correlation function binned in projected separation, we can achieve uncertainties that are within 10 per cent of those predicted by Fisher matrix forecasts. We predict that DES Y1 should achieve a 5 per cent distance measurement using our optimized methods. We expect the results presented here to be important for any future BAO measurements made using photometric redshift data.
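    The projected-separation compression described above can be sketched as follows (a toy illustration with hypothetical coordinates, not the DES analysis pipeline):

```python
import numpy as np

def projected_separations(pos, los_axis=2):
    """Pairwise projected separations, computed by dropping the
    line-of-sight (LOS) coordinate entirely, as advocated for samples
    with sigma_z >= 0.02(1+z).  pos: (N, 3) comoving coordinates."""
    sky = np.delete(pos, los_axis, axis=1)          # keep the two sky axes
    diff = sky[:, None, :] - sky[None, :, :]
    rp = np.sqrt((diff ** 2).sum(axis=-1))
    i, j = np.triu_indices(len(pos), k=1)           # unique pairs only
    return rp[i, j]

# Toy positions: the large LOS offsets (100, 50) do not affect the result.
pos = np.array([[0., 0., 0.], [3., 4., 100.], [0., 0., 50.]])
seps = projected_separations(pos)
print(seps)  # -> [5. 0. 5.]
```

    A real analysis would bin pair counts in these separations to form the correlation function; the point here is only that the LOS coordinate, and hence the redshift error, drops out.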

  5. Uncertainty, joint uncertainty, and the quantum uncertainty principle

    International Nuclear Information System (INIS)

    Narasimhachar, Varun; Poostindouz, Alireza; Gour, Gilad

    2016-01-01

    Historically, the element of uncertainty in quantum mechanics has been expressed through mathematical identities called uncertainty relations, a great many of which continue to be discovered. These relations use diverse measures to quantify uncertainty (and joint uncertainty). In this paper we use operational information-theoretic principles to identify the common essence of all such measures, thereby defining measure-independent notions of uncertainty and joint uncertainty. We find that most existing entropic uncertainty relations use measures of joint uncertainty that lend themselves to only a small class of operational interpretations. Our notion relaxes this restriction, revealing previously unexplored joint uncertainty measures. To illustrate the utility of our formalism, we derive an uncertainty relation based on one such new measure. We also use our formalism to gain insight into the conditions under which measure-independent uncertainty relations can be found. (paper)
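    For a concrete example of the kind of entropic relation discussed, the sketch below checks the well-known Maassen-Uffink bound for a qubit measured in two incompatible bases (an illustration only, not the paper's new measure):

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy in bits of a probability vector."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# |psi> = cos(t)|0> + sin(t)|1>, measured in the Z basis and the X
# (Hadamard-rotated) basis -- two incompatible measurements.
t = 0.3
psi = np.array([np.cos(t), np.sin(t)])
pz = np.abs(psi) ** 2                            # Z-basis outcome probs
hadamard = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
px = np.abs(hadamard @ psi) ** 2                 # X-basis outcome probs

# Maassen-Uffink relation: H(Z) + H(X) >= log2(1/c), where the basis
# overlap c = max |<z|x>|^2 = 1/2 here, so the bound is 1 bit.
total = shannon_entropy(pz) + shannon_entropy(px)
print(total >= 1.0)  # holds for every state psi
```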

  6. Significant uncertainty in global scale hydrological modeling from precipitation data errors

    Science.gov (United States)

    Sperna Weiland, Frederiek C.; Vrugt, Jasper A.; van Beek, Rens (L.) P. H.; Weerts, Albrecht H.; Bierkens, Marc F. P.

    2015-10-01

    In the past decades significant progress has been made in the fitting of hydrologic models to data. Most of this work has focused on simple, CPU-efficient, lumped hydrologic models using discharge, water table depth, soil moisture, or tracer data from relatively small river basins. In this paper, we focus on large-scale hydrologic modeling and analyze the effect of parameter and rainfall data uncertainty on simulated discharge dynamics with the global hydrologic model PCR-GLOBWB. We use three rainfall data products: the CFSR reanalysis, the ERA-Interim reanalysis, and a combined ERA-40 reanalysis and CRU dataset. Parameter uncertainty is derived from Latin Hypercube Sampling (LHS) using monthly discharge data from five of the largest river systems in the world. Our results demonstrate that the default parameterization of PCR-GLOBWB, derived from global datasets, can be improved by calibrating the model against monthly discharge observations. Yet, it is difficult to find a single parameterization of PCR-GLOBWB that works well for all of the five river basins considered herein and shows consistent performance during both the calibration and evaluation period. Still there may be possibilities for regionalization based on catchment similarities. Our simulations illustrate that parameter uncertainty constitutes only a minor part of predictive uncertainty. Thus, the apparent dichotomy between simulations of global-scale hydrologic behavior and actual data cannot be resolved by simply increasing the model complexity of PCR-GLOBWB and resolving sub-grid processes. Instead, it would be more productive to improve the characterization of global rainfall amounts at spatial resolutions of 0.5° and smaller.
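    The Latin Hypercube Sampling step mentioned above can be sketched as follows (a generic LHS on the unit hypercube; the actual PCR-GLOBWB parameter ranges are not reproduced here):

```python
import numpy as np

def latin_hypercube(n_samples, n_params, seed=None):
    """Latin Hypercube Sample on the unit hypercube: each parameter range
    is split into n_samples equal strata, and each stratum is sampled
    exactly once (at a random position inside the stratum)."""
    rng = np.random.default_rng(seed)
    jitter = rng.uniform(size=(n_samples, n_params))
    strata = np.array([rng.permutation(n_samples)
                       for _ in range(n_params)]).T
    return (strata + jitter) / n_samples

lhs = latin_hypercube(10, 3, seed=42)
# Each column visits every decile exactly once -- the stratification that
# lets a small sample cover a multi-dimensional parameter space.
print(np.sort(np.floor(lhs * 10).astype(int), axis=0).T)
```

    Samples on the unit cube would then be rescaled to each parameter's physical range before running the model.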

  7. Neglect Of Parameter Estimation Uncertainty Can Significantly Overestimate Structural Reliability

    Directory of Open Access Journals (Sweden)

    Rózsás Árpád

    2015-12-01

    Full Text Available Parameter estimation uncertainty is often neglected in reliability studies, i.e. point estimates of distribution parameters are used for representative fractiles, and in probabilistic models. A numerical example examines the effect of this uncertainty on structural reliability using Bayesian statistics. The study reveals that the neglect of parameter estimation uncertainty might lead to an order of magnitude underestimation of failure probability.
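    The underestimation mechanism can be reproduced with a toy calculation (hypothetical numbers; a Bayesian posterior predictive with known standard deviation and a flat prior on the mean, much simpler than the paper's full study):

```python
import math

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# Hypothetical numbers: resistance R ~ N(mu, sigma=1.0), with the mean
# estimated from only n=3 test results as mu_hat = 5.0; failure if R < 1.0.
mu_hat, sigma, load, n = 5.0, 1.0, 1.0, 3

# Point estimate: plug in mu_hat and ignore its estimation uncertainty.
pf_point = phi((load - mu_hat) / sigma)
# Bayesian posterior predictive (flat prior on the mean, sigma known):
# the predictive spread inflates to sigma * sqrt(1 + 1/n).
pf_bayes = phi((load - mu_hat) / (sigma * math.sqrt(1.0 + 1.0 / n)))

# The failure probability grows by almost an order of magnitude.
print(pf_point, pf_bayes)
```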

  8. Application of a Novel Dose-Uncertainty Model for Dose-Uncertainty Analysis in Prostate Intensity-Modulated Radiotherapy

    International Nuclear Information System (INIS)

    Jin Hosang; Palta, Jatinder R.; Kim, You-Hyun; Kim, Siyong

    2010-01-01

    Purpose: To analyze dose uncertainty using a previously published dose-uncertainty model, and to assess potential dosimetric risks existing in prostate intensity-modulated radiotherapy (IMRT). Methods and Materials: The dose-uncertainty model provides a three-dimensional (3D) dose-uncertainty distribution at a given confidence level (CL). For 8 retrospectively selected patients, dose-uncertainty maps were constructed using the dose-uncertainty model at the 95% CL. In addition to uncertainties inherent to the radiation treatment planning system, four scenarios of spatial errors were considered: machine only (S1), S1 + intrafraction, S1 + interfraction, and S1 + both intrafraction and interfraction errors. To evaluate the potential risks of the IMRT plans, three dose-uncertainty-based plan evaluation tools were introduced: confidence-weighted dose-volume histogram, confidence-weighted dose distribution, and dose-uncertainty-volume histogram. Results: Dose uncertainty caused by interfraction setup error was more significant than that caused by intrafraction motion error. The maximum dose uncertainty (95% confidence) of the clinical target volume (CTV) was smaller than 5% of the prescribed dose in all but two cases (13.9% and 10.2%). The dose uncertainty for 95% of the CTV volume ranged from 1.3% to 2.9% of the prescribed dose. Conclusions: The dose uncertainty in prostate IMRT could be evaluated using the dose-uncertainty model. Prostate IMRT plans satisfying the same plan objectives could generate significantly different dose uncertainties because of a complex interplay of many uncertainty sources. The uncertainty-based plan evaluation contributes to generating reliable and error-resistant treatment plans.
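    A dose-volume histogram of the kind underlying the confidence-weighted evaluation tools can be sketched as follows (hypothetical voxel doses and uncertainty values, not the published model):

```python
import numpy as np

def cumulative_dvh(dose_voxels, levels):
    """Cumulative dose-volume histogram: fraction of the structure volume
    receiving at least each dose level (equal-size voxels assumed)."""
    dose_voxels = np.asarray(dose_voxels, dtype=float)
    return np.array([(dose_voxels >= d).mean() for d in levels])

# Hypothetical CTV voxel doses in % of the prescription, plus a lower-bound
# map (nominal dose minus a made-up 95%-confidence dose uncertainty).
nominal = np.array([100., 101., 99., 98., 102., 100., 97., 103.])
lower = nominal - np.array([2., 3., 2., 5., 1., 2., 4., 1.])

levels = np.array([95., 98., 100.])
dvh_nominal = cumulative_dvh(nominal, levels)
dvh_lower = cumulative_dvh(lower, levels)
print(dvh_nominal)  # coverage judged on the nominal dose alone
print(dvh_lower)    # coverage guaranteed at 95% confidence is smaller
```

    Comparing the two curves shows how a plan that looks acceptable on its nominal dose can carry substantially less guaranteed target coverage once dose uncertainty is folded in.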

  9. Uncertainty in Seismic Capacity of Masonry Buildings

    Directory of Open Access Journals (Sweden)

    Nicola Augenti

    2012-07-01

    Full Text Available Seismic assessment of masonry structures is plagued by both inherent randomness and model uncertainty. The former is referred to as aleatory uncertainty, the latter as epistemic uncertainty because it depends on the knowledge level. Pioneering studies on reinforced concrete buildings have revealed a significant influence of modeling parameters on seismic vulnerability. However, confidence in mechanical properties of existing masonry buildings is much lower than in the case of reinforcing steel and concrete. This paper is aimed at assessing whether and how uncertainty propagates from material properties to seismic capacity of an entire masonry structure. A typical two-story unreinforced masonry building is analyzed. Based on previous statistical characterization of mechanical properties of existing masonry types, the following random variables have been considered in this study: unit weight, uniaxial compressive strength, shear strength at zero confining stress, Young’s modulus, shear modulus, and available ductility in shear. Probability density functions were implemented to generate a significant number of realizations and static pushover analysis of the case-study building was performed for each vector of realizations, load combination and lateral load pattern. Analysis results show a large dispersion in displacement capacity and lower dispersion in spectral acceleration capacity. This can directly affect decision-making because both design and retrofit solutions depend on seismic capacity predictions. Therefore, engineering judgment should always be used when assessing structural safety of existing masonry constructions against design earthquakes, based on a series of seismic analyses under uncertain parameters.

  10. Health significance and statistical uncertainty. The value of P-value.

    Science.gov (United States)

    Consonni, Dario; Bertazzi, Pier Alberto

    2017-10-27

    The P-value is widely used as a summary statistic of scientific results. Unfortunately, there is a widespread tendency to dichotomize its value into "P<0.05" ("statistically significant") and "P>0.05" ("statistically not significant"), with the former implying a "positive" result and the latter a "negative" one. To show the unsuitability of such an approach when evaluating the effects of environmental and occupational risk factors, we provide examples of the distorted use of the P-value and of the negative consequences for science and public health of such a black-and-white vision. The rigid interpretation of the P-value as a dichotomy favors the confusion between health relevance and statistical significance, discourages thoughtful thinking, and diverts attention from what really matters, the health significance. A much better way to express and communicate scientific results involves reporting effect estimates (e.g., risks, risk ratios or risk differences) and their confidence intervals (CI), which summarize and convey both health significance and statistical uncertainty. Unfortunately, many researchers do not usually consider the whole interval of the CI but only examine whether it includes the null value, thereby degrading this procedure to the same P-value dichotomy (statistical significance or not). When reporting statistical results of scientific research, present effect estimates with their confidence intervals and do not qualify the P-value as "significant" or "not significant".
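    The recommended reporting style can be illustrated with a small sketch (hypothetical cohort counts; the standard log-normal approximation for a risk ratio confidence interval):

```python
import math

def risk_ratio_ci(a, n1, b, n2, z=1.96):
    """Risk ratio with a 95% confidence interval (log-normal approximation).
    a of n1 exposed subjects are cases; b of n2 unexposed subjects are cases."""
    rr = (a / n1) / (b / n2)
    se_log = math.sqrt(1/a - 1/n1 + 1/b - 1/n2)
    return rr, rr * math.exp(-z * se_log), rr * math.exp(z * se_log)

# Hypothetical cohort: 30/100 exposed cases vs 20/100 unexposed cases.
rr, lo, hi = risk_ratio_ci(30, 100, 20, 100)
print(f"RR = {rr:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
# The interval crosses 1, so the P-value dichotomy would call this
# "not significant"; the CI instead shows the data are compatible with
# anything from a near-null effect up to more than a doubled risk.
```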

  11. Uncertainty enabled Sensor Observation Services

    Science.gov (United States)

    Cornford, Dan; Williams, Matthew; Bastin, Lucy

    2010-05-01

    Almost all observations of reality are contaminated with errors, which introduce uncertainties into the actual observation result. Such uncertainty is often held to be a data quality issue, and quantification of this uncertainty is essential for the principled exploitation of the observations. Many existing systems treat data quality in a relatively ad-hoc manner; however, if the observation uncertainty is a reliable estimate of the error on the observation with respect to reality, then knowledge of this uncertainty enables optimal exploitation of the observations in further processing or decision making. We would argue that the most natural formalism for expressing uncertainty is Bayesian probability theory. In this work we show how the Open Geospatial Consortium Sensor Observation Service can be implemented to support explicit uncertainty about observations. We show how the UncertML candidate standard is used to provide a rich and flexible representation of uncertainty in this context. We illustrate this on a data set of user-contributed weather data, where the INTAMAP interpolation Web Processing Service is used to help estimate the uncertainty on observations of unknown quality, using observations with known uncertainty properties. We then go on to discuss the implications of uncertainty for a range of existing Open Geospatial Consortium standards, including SWE Common and Observations and Measurements. We discuss the difficult decisions in the design of the UncertML schema and its relation to and usage within existing standards, and show various options. We conclude with some indications of the likely future directions for UncertML in the context of Open Geospatial Consortium services.

  12. Illustrative uncertainty visualization of DTI fiber pathways

    NARCIS (Netherlands)

    Brecheisen, R.; Platel, B.; Haar Romeny, B.M. Ter; Vilanova, A.

    2013-01-01

    Diffusion Tensor Imaging (DTI) and fiber tracking provide unique insight into the 3D structure of fibrous tissues in the brain. However, the output of fiber tracking contains a significant amount of uncertainty accumulated in the various steps of the processing pipeline. Existing DTI visualization

  13. The uncertainties in estimating measurement uncertainties

    International Nuclear Information System (INIS)

    Clark, J.P.; Shull, A.H.

    1994-01-01

    All measurements include some error. Whether measurements are used for accountability, environmental programs or process support, they are of little value unless accompanied by an estimate of the measurement's uncertainty. This fact is often overlooked by the individuals who need measurements to make decisions. This paper discusses the concepts of measurement, measurement errors (accuracy or bias, and precision or random error), physical and error models, measurement control programs, examples of measurement uncertainty, and uncertainty as related to measurement quality. Measurements are comparisons of unknowns to knowns, estimates of some true value plus uncertainty, and are no better than the standards to which they are compared. Direct comparisons of unknowns that match the composition of known standards will normally have small uncertainties. In the real world, measurements usually involve indirect comparisons of significantly different materials (e.g., measuring a physical property of a chemical element in a sample whose matrix is significantly different from that of the calibration standards). Consequently, there are many sources of error involved in measurement processes that can affect the quality of a measurement and its associated uncertainty. How the uncertainty estimates are determined and what they mean is as important as the measurement. The process of calculating the uncertainty of a measurement itself has uncertainties that must be handled correctly. Examples of chemistry laboratory measurements are reviewed in this report and recommendations made for improving measurement uncertainties
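    The central point, that an uncertainty estimate is itself uncertain, has a simple quantitative illustration for normally distributed measurements:

```python
import math

def sd_relative_uncertainty(n):
    """Approximate relative standard error of a sample standard deviation
    estimated from n independent normal observations: 1 / sqrt(2(n - 1))."""
    return 1.0 / math.sqrt(2.0 * (n - 1))

# With 5 replicates the quoted uncertainty is itself ~35% uncertain;
# roughly 50 replicates are needed to know it to ~10%.
for n in (5, 10, 50):
    print(n, round(sd_relative_uncertainty(n), 3))
```

    A laboratory quoting "±s" from a handful of replicates is therefore reporting a number that could easily be tens of per cent off, which is exactly why measurement control programs matter.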

  14. Uncertainties and severe-accident management

    International Nuclear Information System (INIS)

    Kastenberg, W.E.

    1991-01-01

    Severe-accident management can be defined as the use of existing and/or alternative resources, systems, and actions to prevent or mitigate a core-melt accident. Together with risk management (e.g., changes in plant operation and/or addition of equipment) and emergency planning (off-site actions), accident management provides an extension of the defense-in-depth safety philosophy for severe accidents. A significant number of probabilistic safety assessments have been completed, which yield the principal plant vulnerabilities, and can be categorized as (a) dominant sequences with respect to core-melt frequency, (b) dominant sequences with respect to various risk measures, (c) dominant threats that challenge safety functions, and (d) dominant threats with respect to failure of safety systems. Severe-accident management strategies can be generically classified as (a) use of alternative resources, (b) use of alternative equipment, and (c) use of alternative actions. For each sequence/threat and each combination of strategy, there may be several options available to the operator. Each strategy/option involves phenomenological and operational considerations regarding uncertainty. These include (a) uncertainty in key phenomena, (b) uncertainty in operator behavior, (c) uncertainty in system availability and behavior, and (d) uncertainty in information availability (i.e., instrumentation). This paper focuses on phenomenological uncertainties associated with severe-accident management strategies

  15. BEPU methods and combining of uncertainties

    International Nuclear Information System (INIS)

    Prosek, A.; Mavko, B.

    2004-01-01

    After approval of the revised rule on the acceptance of emergency core cooling system (ECCS) performance in 1988, there has been significant interest in the development of codes and methodologies for best-estimate loss-of-coolant accident (LOCA) analyses. The Code Scaling, Applicability and Uncertainty (CSAU) evaluation method was developed and demonstrated for large-break (LB) LOCA in a pressurized water reactor. Later, several new best estimate plus uncertainty (BEPU) methods were developed worldwide. The purpose of the paper is to identify and compare the statistical approaches of BEPU methods and present their important plant and licensing applications. The study showed that uncertainty analysis with random sampling of input parameters and the use of order statistics for desired tolerance limits of output parameters is today a commonly accepted approach. The existing BEPU methods seem mature enough, while future research may be focused on codes with internal assessment of uncertainty. (author)

  16. Uncertainty analysis techniques

    International Nuclear Information System (INIS)

    Marivoet, J.; Saltelli, A.; Cadelli, N.

    1987-01-01

    The origin of the uncertainty affecting performance assessments, as well as its propagation to dose and risk results, is discussed. The analysis is focused essentially on the uncertainties introduced by the input parameters, the values of which may range over some orders of magnitude and may be given as probability distribution functions. The paper briefly reviews the existing sampling techniques used for Monte Carlo simulations and the methods for characterizing the output curves, determining their convergence and confidence limits. Annual doses, expectation values of the doses and risks are computed for a particular case of a possible repository in clay, in order to illustrate the significance of such output characteristics as the mean, the logarithmic mean and the median, as well as their ratios. The report concludes that, provisionally, due to its better robustness, an estimator such as the 90th percentile may be substituted for the arithmetic mean when comparing the estimated doses with acceptance criteria. In any case, the results obtained through uncertainty analyses must be interpreted with caution as long as input data distribution functions are not derived from experiments reasonably reproducing the situation in a well characterized repository and site.
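    The contrast between the output characteristics named above (mean, logarithmic mean, median, percentiles) can be sketched on a toy heavy-tailed output (a hypothetical lognormal dose distribution, not the clay-repository case):

```python
import numpy as np

rng = np.random.default_rng(7)
# Hypothetical Monte Carlo output: annual doses spanning orders of
# magnitude, modeled here as a lognormal toy distribution.
doses = rng.lognormal(mean=0.0, sigma=2.0, size=100_000)

mean = doses.mean()                      # dominated by the upper tail
log_mean = np.exp(np.log(doses).mean())  # the "logarithmic mean"
median = np.median(doses)
p90 = np.percentile(doses, 90)           # a more robust summary statistic

print(f"mean={mean:.2f}  log-mean={log_mean:.2f}  "
      f"median={median:.2f}  90th percentile={p90:.2f}")
# For heavy-tailed outputs the arithmetic mean sits far above the median,
# illustrating why a percentile can be preferable for comparisons with
# acceptance criteria.
```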

  17. Compensation of significant parametric uncertainties using sliding mode online learning

    Science.gov (United States)

    Schnetter, Philipp; Kruger, Thomas

    An augmented nonlinear inverse dynamics (NID) flight control strategy using sliding mode online learning for a small unmanned aircraft system (UAS) is presented. Because parameter identification for this class of aircraft often is not valid throughout the complete flight envelope, aerodynamic parameters used for model-based control strategies may show significant deviations. For the concept of feedback linearization this leads to inversion errors that, in combination with the distinctive susceptibility of small UAS to atmospheric turbulence, pose a demanding control task for these systems. In this work an adaptive flight control strategy using feedforward neural networks for counteracting such nonlinear effects is augmented with the concept of sliding mode control (SMC). SMC-learning is derived from variable structure theory. It considers a neural network and its training as a control problem. It is shown that by the dynamic calculation of the learning rates, stability can be guaranteed, thus increasing robustness against external disturbances and system failures. With the resulting higher speed of convergence, a wide range of simultaneously occurring disturbances can be compensated. The SMC-based flight controller is tested and compared to the standard gradient descent (GD) backpropagation algorithm under the influence of significant model uncertainties and system failures.

  18. Survey of Existing Uncertainty Quantification Capabilities for Army Relevant Problems

    Science.gov (United States)

    2017-11-27

    first of these introductory sections is an overview of UQ and its various methods. The second of these discusses issues pertaining to the use of UQ...can be readily assessed, as well as the variance or other statistical measures of the distribu- tion of parameters. The uncertainty in the parameters is... statistics of the outputs of these methods, such as the moments of the probability distributions of model outputs. The module does not explicitly support

  19. Chemical model reduction under uncertainty

    KAUST Repository

    Najm, Habib; Galassi, R. Malpica; Valorani, M.

    2016-01-01

    We outline a strategy for chemical kinetic model reduction under uncertainty. We present highlights of our existing deterministic model reduction strategy, and describe the extension of the formulation to include parametric uncertainty in the detailed mechanism. We discuss the utility of this construction, as applied to hydrocarbon fuel-air kinetics, and the associated use of uncertainty-aware measures of error between predictions from detailed and simplified models.

  20. Chemical model reduction under uncertainty

    KAUST Repository

    Najm, Habib

    2016-01-05

    We outline a strategy for chemical kinetic model reduction under uncertainty. We present highlights of our existing deterministic model reduction strategy, and describe the extension of the formulation to include parametric uncertainty in the detailed mechanism. We discuss the utility of this construction, as applied to hydrocarbon fuel-air kinetics, and the associated use of uncertainty-aware measures of error between predictions from detailed and simplified models.

  1. Uncertainty as Knowledge: Constraints on Policy Choices Provided by Analysis of Uncertainty

    Science.gov (United States)

    Lewandowsky, S.; Risbey, J.; Smithson, M.; Newell, B. R.

    2012-12-01

    Uncertainty forms an integral part of climate science, and it is often cited in connection with arguments against mitigative action. We argue that an analysis of uncertainty must consider existing knowledge as well as uncertainty, and the two must be evaluated with respect to the outcomes and risks associated with possible policy options. Although risk judgments are inherently subjective, an analysis of the role of uncertainty within the climate system yields two constraints that are robust to a broad range of assumptions. Those constraints are that (a) greater uncertainty about the climate system is necessarily associated with greater expected damages from warming, and (b) greater uncertainty translates into a greater risk of the failure of mitigation efforts. These ordinal constraints are unaffected by subjective or cultural risk-perception factors, they are independent of the discount rate, and they are independent of the magnitude of the estimate for climate sensitivity. The constraints mean that any appeal to uncertainty must imply a stronger, rather than weaker, need to cut greenhouse gas emissions than in the absence of uncertainty.
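    Constraint (a) follows from Jensen's inequality whenever damages are convex in the uncertain quantity; a toy sketch (with a hypothetical quadratic damage function, not a calibrated climate-economics model):

```python
import numpy as np

def expected_damage(sensitivities, damage=lambda s: s ** 2):
    """Mean damage over a set of equally likely climate-sensitivity values,
    with a (hypothetical) convex damage function."""
    return float(np.mean(damage(np.asarray(sensitivities, dtype=float))))

mean_s = 3.0
narrow = mean_s + np.array([-0.5, 0.0, 0.5])   # low uncertainty
wide = mean_s + np.array([-2.0, 0.0, 2.0])     # high uncertainty, same mean

# Same best estimate, but the wider distribution yields larger expected
# damages -- Jensen's inequality for a convex damage function.
print(expected_damage(narrow), expected_damage(wide))
```

    The ordering is independent of the particular convex function chosen, which mirrors the abstract's claim that the constraint is robust to a broad range of assumptions.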

  2. Conditional uncertainty principle

    Science.gov (United States)

    Gour, Gilad; Grudka, Andrzej; Horodecki, Michał; Kłobus, Waldemar; Łodyga, Justyna; Narasimhachar, Varun

    2018-04-01

    We develop a general operational framework that formalizes the concept of conditional uncertainty in a measure-independent fashion. Our formalism is built upon a mathematical relation which we call conditional majorization. We define conditional majorization and, for the case of classical memory, we provide its thorough characterization in terms of monotones, i.e., functions that preserve the partial order under conditional majorization. We demonstrate the application of this framework by deriving two types of memory-assisted uncertainty relations, (1) a monotone-based conditional uncertainty relation and (2) a universal measure-independent conditional uncertainty relation, both of which set a lower bound on the minimal uncertainty that Bob has about Alice's pair of incompatible measurements, conditioned on arbitrary measurement that Bob makes on his own system. We next compare the obtained relations with their existing entropic counterparts and find that they are at least independent.

  3. Revisiting organizational interpretation and three types of uncertainty

    DEFF Research Database (Denmark)

    Sund, Kristian J.

    2015-01-01

    that might help explain and untangle some of the conflicting empirical results found in the extant literature. The paper illustrates how the literature could benefit from re-conceptualizing the perceived environmental uncertainty construct to take into account different types of uncertainty. Practical....... Design/methodology/approach – This conceptual paper extends existing conceptual work by distinguishing between general and issue-specific scanning and linking the interpretation process to three different types of perceived uncertainty: state, effect and response uncertainty. Findings – It is proposed...... on existing work by linking the interpretation process to three different types of uncertainty (state, effect and response uncertainty) with several novel and testable propositions. The paper also differentiates clearly general (regular) scanning from issue-specific (irregular) scanning. Finally, the paper...

  4. Improved Monte Carlo Method for PSA Uncertainty Analysis

    International Nuclear Information System (INIS)

    Choi, Jongsoo

    2016-01-01

    The treatment of uncertainty is an important issue for regulatory decisions. Uncertainties exist from knowledge limitations. A probabilistic approach has exposed some of these limitations and provided a framework to assess their significance and assist in developing a strategy to accommodate them in the regulatory process. The uncertainty analysis (UA) is usually based on the Monte Carlo method. This paper proposes a Monte Carlo UA approach to calculate the mean risk metrics accounting for the state-of-knowledge correlation (SOKC) between basic events (including common-cause failures, CCFs) using efficient random number generators and to meet Capability Category III of the ASME/ANS PRA standard. Audit calculations are needed in PSA regulatory reviews of uncertainty analysis results submitted for licensing. The proposed Monte Carlo UA approach provides a high degree of confidence in PSA reviews. All PSAs need to account for the SOKC between event probabilities to meet the ASME/ANS PRA standard

  5. Improved Monte Carlo Method for PSA Uncertainty Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Jongsoo [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2016-10-15

    The treatment of uncertainty is an important issue for regulatory decisions. Uncertainties exist from knowledge limitations. A probabilistic approach has exposed some of these limitations and provided a framework to assess their significance and assist in developing a strategy to accommodate them in the regulatory process. The uncertainty analysis (UA) is usually based on the Monte Carlo method. This paper proposes a Monte Carlo UA approach to calculate the mean risk metrics accounting for the state-of-knowledge correlation (SOKC) between basic events (including common-cause failures, CCFs) using efficient random number generators, and to meet Capability Category III of the ASME/ANS PRA standard. Audit calculation is needed in PSA regulatory reviews of uncertainty analysis results submitted for licensing. The proposed Monte Carlo UA approach provides a high degree of confidence in PSA reviews. All PSAs need to account for the SOKC between event probabilities to meet the ASME/ANS PRA standard.
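    To illustrate the state-of-knowledge correlation (SOKC) that the two records above refer to, here is a minimal pure-Python sketch (not the paper's code; the lognormal parameters and the two-pump cut set are hypothetical). Sharing one epistemic draw between identical basic events raises the mean of their product:

```python
import math
import random
import statistics

random.seed(1)

def sample_lognormal(median, error_factor):
    # Lognormal epistemic distribution, as is typical in PSA;
    # the error factor is the 95th-to-50th percentile ratio.
    sigma = math.log(error_factor) / 1.645
    return median * math.exp(random.gauss(0.0, sigma))

N = 20000
with_sokc = []
without_sokc = []
for _ in range(N):
    # Two redundant components with identical epistemic uncertainty:
    # under SOKC, one draw is shared by both basic events.
    p = sample_lognormal(1e-3, 3.0)
    with_sokc.append(p * p)
    # Ignoring SOKC: independent draws for each basic event.
    without_sokc.append(sample_lognormal(1e-3, 3.0) * sample_lognormal(1e-3, 3.0))

mean_with = statistics.mean(with_sokc)
mean_without = statistics.mean(without_sokc)
```

    Because E[p²] > (E[p])², a Monte Carlo UA that ignores the SOKC systematically underestimates the mean risk metric for cut sets containing similar components.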

  6. Interpolation in Time Series: An Introductive Overview of Existing Methods, Their Performance Criteria and Uncertainty Assessment

    Directory of Open Access Journals (Sweden)

    Mathieu Lepot

    2017-10-01

    Full Text Available. A thorough review has been performed on interpolation methods to fill gaps in time series, on efficiency criteria, and on uncertainty quantification. On one hand, there are numerous available methods: interpolation, regression, autoregressive, machine learning methods, etc. On the other hand, there are many methods and criteria to estimate the efficiency of these methods, but uncertainties on the interpolated values are rarely calculated. Furthermore, even when such uncertainties are estimated according to standard methods, the prediction uncertainty is not taken into account: a discussion is thus presented on the uncertainty estimation of interpolated/extrapolated data. Finally, some suggestions for further research and a new method are proposed.
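    As a toy illustration of gap filling and one efficiency criterion (not from the reviewed paper; the synthetic series and gap locations are invented), linear interpolation can be scored by RMSE on points that were deliberately removed:

```python
import math

def linear_interpolate(t_known, y_known, t_query):
    """Fill gap points by linear interpolation between bracketing samples
    (t_known is assumed sorted)."""
    out = []
    for t in t_query:
        for i in range(len(t_known) - 1):
            if t_known[i] <= t <= t_known[i + 1]:
                w = (t - t_known[i]) / (t_known[i + 1] - t_known[i])
                out.append((1 - w) * y_known[i] + w * y_known[i + 1])
                break
    return out

# Synthetic series with an artificial gap at t = 4, 5, 6.
t_full = list(range(11))
y_full = [math.sin(0.5 * t) for t in t_full]
gap = [4, 5, 6]
t_known = [t for t in t_full if t not in gap]
y_known = [y_full[t] for t in t_known]

y_hat = linear_interpolate(t_known, y_known, gap)
# Efficiency criterion: RMSE of the interpolated values against the
# (here known) true values that were removed.
rmse = math.sqrt(sum((a - y_full[t]) ** 2 for a, t in zip(y_hat, gap)) / len(gap))
```

    The same held-out scoring works for any of the reviewed method families; what it does not deliver, as the record stresses, is an uncertainty on each individual interpolated value.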

  7. Roadmap toward addressing and communicating uncertainty in LCA

    DEFF Research Database (Denmark)

    Laurin, Lise; Vigon, Bruce; Fantke, Peter

    2017-01-01

    -characterized uncertainty. The group has investigated current best LCA practices, such as refinements to the pedigree matrix used to assess LCI data quality. In parallel, in the frame of UNEP-SETAC Life Cycle Initiative flagship project on providing Harmonization and Global Guidance for Environmental Life Cycle Impact...... uncertainty is further related to input data, model selection and choices, amongst other aspects. Currently, methods exist to assess and assign uncertainty and variability on LCI data as well as LCIA characterization results. However, often uncertainty is only assessed and reported qualitatively......, is not comparable across impact categories and not consistently assessed and reported across levels of detail. Furthermore, many existing methods and models do not report uncertainty at all or limit their uncertainty assessment to a sensitivity analysis of selected input parameters, while ignoring variability...

  8. Significance of uncertainties derived from settling tank model structure and parameters on predicting WWTP performance - A global sensitivity analysis study

    DEFF Research Database (Denmark)

    Ramin, Elham; Sin, Gürkan; Mikkelsen, Peter Steen

    2011-01-01

    Uncertainty derived from one of the process models – such as one-dimensional secondary settling tank (SST) models – can impact the output of the other process models, e.g., biokinetic (ASM1), as well as the integrated wastewater treatment plant (WWTP) models. The model structure and parameter...... and from the last aerobic bioreactor upstream to the SST (Garrett/hydraulic method). For model structure uncertainty, two one-dimensional secondary settling tank (1-D SST) models are assessed, including a first-order model (the widely used Takács-model), in which the feasibility of using measured...... uncertainty of settler models can therefore propagate, and add to the uncertainties in prediction of any plant performance criteria. Here we present an assessment of the relative significance of secondary settling model performance in WWTP simulations. We perform a global sensitivity analysis (GSA) based...

  9. Predictive uncertainty in auditory sequence processing

    DEFF Research Database (Denmark)

    Hansen, Niels Chr.; Pearce, Marcus T

    2014-01-01

    in a melodic sequence (inferred uncertainty). Finally, we simulate listeners' perception of expectedness and uncertainty using computational models of auditory expectation. A detailed model comparison indicates which model parameters maximize fit to the data and how they compare to existing models...

  10. Oil price uncertainty in Canada

    Energy Technology Data Exchange (ETDEWEB)

    Elder, John [Department of Finance and Real Estate, 1272 Campus Delivery, Colorado State University, Fort Collins, CO 80523 (United States); Serletis, Apostolos [Department of Economics, University of Calgary, Calgary, Alberta (Canada)

    2009-11-15

    Bernanke [Bernanke, Ben S. Irreversibility, uncertainty, and cyclical investment. Quarterly Journal of Economics 98 (1983), 85-106.] shows how uncertainty about energy prices may induce optimizing firms to postpone investment decisions, thereby leading to a decline in aggregate output. Elder and Serletis [Elder, John and Serletis, Apostolos. Oil price uncertainty.] find empirical evidence that uncertainty about oil prices has tended to depress investment in the United States. In this paper we assess the robustness of these results by investigating the effects of oil price uncertainty in Canada. Our results are remarkably similar to existing results for the United States, providing additional evidence that uncertainty about oil prices may provide another explanation for why the sharp oil price declines of 1985 failed to produce rapid output growth. Impulse-response analysis suggests that uncertainty about oil prices may tend to reinforce the negative response of output to positive oil shocks. (author)

  11. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.
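    The Monte Carlo principle mentioned in the record can be sketched as follows, assuming a hypothetical measurand that is the sum of a normally distributed reading and a rectangular (uniform) resolution correction; a 95 percent coverage interval is read off the sampled distribution:

```python
import random

random.seed(42)

M = 100000
samples = []
for _ in range(M):
    x1 = random.gauss(10.0, 0.1)      # calibrated reading, standard uncertainty 0.1
    x2 = random.uniform(-0.05, 0.05)  # resolution correction, rectangular distribution
    samples.append(x1 + x2)           # measurand model: Y = X1 + X2

samples.sort()
lo = samples[int(0.025 * M)]          # 2.5th percentile
hi = samples[int(0.975 * M)]          # 97.5th percentile
# [lo, hi] is a 95% coverage interval for the measurand
```

    The interval here is roughly 10.0 ± 0.2; with more complicated, nonlinear measurand models the same sampling recipe still applies, which is the appeal of the Monte Carlo approach the book describes.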

  12. [Influence of Uncertainty and Uncertainty Appraisal on Self-management in Hemodialysis Patients].

    Science.gov (United States)

    Jang, Hyung Suk; Lee, Chang Suk; Yang, Young Hee

    2015-04-01

    This study was done to examine the relation of uncertainty, uncertainty appraisal, and self-management in patients undergoing hemodialysis, and to identify factors influencing self-management. A convenience sample of 92 patients receiving hemodialysis was selected. Data were collected using a structured questionnaire and medical records. The collected data were analyzed using descriptive statistics, t-test, ANOVA, Pearson correlations and multiple regression analysis with the SPSS/WIN 20.0 program. The participants showed a moderate level of uncertainty with the highest score being for ambiguity among the four uncertainty subdomains. Scores for uncertainty danger or opportunity appraisals were under the mid points. The participants were found to perform a high level of self-management such as diet control, management of arteriovenous fistula, exercise, medication, physical management, measurements of body weight and blood pressure, and social activity. The self-management of participants undergoing hemodialysis showed a significant relationship with uncertainty and uncertainty appraisal. The significant factors influencing self-management were uncertainty, uncertainty opportunity appraisal, hemodialysis duration, and having a spouse. These variables explained 32.8% of the variance in self-management. The results suggest that intervention programs to reduce the level of uncertainty and to increase the level of uncertainty opportunity appraisal among patients would improve the self-management of hemodialysis patients.

  13. Fundamental uncertainty limit of optical flow velocimetry according to Heisenberg's uncertainty principle.

    Science.gov (United States)

    Fischer, Andreas

    2016-11-01

    Optical flow velocity measurements are important for understanding the complex behavior of flows. Although a huge variety of methods exist, they are all based on either a Doppler or a time-of-flight measurement principle. Doppler velocimetry evaluates the velocity-dependent frequency shift of light scattered at a moving particle, whereas time-of-flight velocimetry evaluates the traveled distance of a scattering particle per time interval. Regarding the aim of achieving a minimal measurement uncertainty, it is unclear whether one principle allows lower uncertainties to be achieved or whether both principles can achieve equal uncertainties. For this reason, the natural, fundamental uncertainty limit according to Heisenberg's uncertainty principle is derived for the Doppler and time-of-flight measurement principles, respectively. The obtained limits of the velocity uncertainty are qualitatively identical, showing, e.g., a direct proportionality to the absolute value of the velocity to the power of 3/2 and an inverse proportionality to the square root of the scattered light power. Hence, both measurement principles have identical potential regarding the fundamental uncertainty limit due to the quantum mechanical behavior of photons. This fundamental limit can be attained (at least asymptotically) in reality with either Doppler or time-of-flight methods, because the respective Cramér-Rao bounds for dominating photon shot noise, which is modeled as white Poissonian noise, are identical with the conclusions from Heisenberg's uncertainty principle.
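    The stated proportionalities (uncertainty growing with |v|^(3/2) and falling with the square root of the scattered light power) can be checked numerically; the constant k below is a placeholder, not a derived value:

```python
def velocity_uncertainty(v, scattered_power, k=1.0):
    # Scaling law from the abstract:
    # sigma_v proportional to |v|**1.5 / sqrt(scattered light power).
    # k is a hypothetical proportionality constant.
    return k * abs(v) ** 1.5 / scattered_power ** 0.5

# Doubling the velocity raises the limit by 2**1.5;
# quadrupling the light power halves it.
r_v = velocity_uncertainty(2.0, 1.0) / velocity_uncertainty(1.0, 1.0)
r_p = velocity_uncertainty(1.0, 4.0) / velocity_uncertainty(1.0, 1.0)
```

    Since the Doppler and time-of-flight limits share this scaling, neither principle offers a fundamental advantage, which is the record's central claim.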

  14. Religion in the face of uncertainty: an uncertainty-identity theory account of religiousness.

    Science.gov (United States)

    Hogg, Michael A; Adelman, Janice R; Blagg, Robert D

    2010-02-01

    The authors characterize religions as social groups and religiosity as the extent to which a person identifies with a religion, subscribes to its ideology or worldview, and conforms to its normative practices. They argue that religions have attributes that make them well suited to reduce feelings of self-uncertainty. According to uncertainty-identity theory, people are motivated to reduce feelings of uncertainty about or reflecting on self; and identification with groups, particularly highly entitative groups, is a very effective way to reduce uncertainty. All groups provide belief systems and normative prescriptions related to everyday life. However, religions also address the nature of existence, invoking sacred entities and associated rituals and ceremonies. They are entitative groups that provide a moral compass and rules for living that pervade a person's life, making them particularly attractive in times of uncertainty. The authors document data supporting their analysis and discuss conditions that transform religiosity into religious zealotry and extremism.

  15. Treatment of uncertainty in low-level waste performance assessment

    International Nuclear Information System (INIS)

    Kozak, M.W.; Olague, N.E.; Gallegos, D.P.; Rao, R.R.

    1991-01-01

    Uncertainties arise from a number of different sources in low-level waste performance assessment. In this paper the types of uncertainty are reviewed, and existing methods for quantifying and reducing each type of uncertainty are discussed. These approaches are examined in the context of the current low-level radioactive waste regulatory performance objectives, which are deterministic. The types of uncertainty discussed in this paper are model uncertainty, uncertainty about future conditions, and parameter uncertainty. The advantages and disadvantages of available methods for addressing uncertainty in low-level waste performance assessment are presented. 25 refs

  16. Risk uncertainty analysis methods for NUREG-1150

    International Nuclear Information System (INIS)

    Benjamin, A.S.; Boyd, G.J.

    1987-01-01

    Evaluation and display of risk uncertainties for NUREG-1150 constitute a principal focus of the Severe Accident Risk Rebaselining/Risk Reduction Program (SARRP). Some of the principal objectives of the uncertainty evaluation are: (1) to provide a quantitative estimate that reflects, for those areas considered, a credible and realistic range of uncertainty in risk; (2) to rank the various sources of uncertainty with respect to their importance for various measures of risk; and (3) to characterize the state of understanding of each aspect of the risk assessment for which major uncertainties exist. This paper describes the methods developed to fulfill these objectives.

  17. Framework for managing uncertainty in property projects

    NARCIS (Netherlands)

    Reymen, I.M.M.J.; Dewulf, G.P.M.R.; Blokpoel, S.B.

    2008-01-01

    A primary task of property development (or real estate development, RED) is making assessments and managing risks and uncertainties. Property managers cope with a wide range of uncertainties, particularly in the early project phases. Although the existing literature addresses the management of

  18. Justification for recommended uncertainties

    International Nuclear Information System (INIS)

    Pronyaev, V.G.; Badikov, S.A.; Carlson, A.D.

    2007-01-01

    The uncertainties obtained in an earlier standards evaluation were considered to be unrealistically low by experts of the US Cross Section Evaluation Working Group (CSEWG). Therefore, the CSEWG Standards Subcommittee replaced the covariance matrices of evaluated uncertainties by expanded percentage errors that were assigned to the data over wide energy groups. There are a number of reasons that might lead to low uncertainties of the evaluated data: underestimation of the correlations existing between the results of different measurements; the presence of unrecognized systematic uncertainties in the experimental data, which can lead to biases in the evaluated data as well as to underestimation of the resulting uncertainties; and the fact that uncertainties for correlated data cannot be characterized by percentage uncertainties or variances alone. Covariances between the evaluated value at 0.2 MeV and other points obtained in model (RAC R-matrix and PADE2 analytical expansion) and non-model (GMA) fits of the ⁶Li(n,t) TEST1 data and the correlation coefficients are presented, and covariances between the evaluated value at 0.045 MeV and other points (along the line or column of the matrix) as obtained in EDA and RAC R-matrix fits of the data available for reactions that pass through the formation of the ⁷Li system are discussed. The GMA fit with the GMA database is shown for comparison. The following diagrams are discussed: percentage uncertainties of the evaluated cross section for the ⁶Li(n,t) reaction and for the ²³⁵U(n,f) reaction; the estimation given by CSEWG experts; the GMA result with the full GMA database, including experimental data for the ⁶Li(n,t), ⁶Li(n,n) and ⁶Li(n,total) reactions; uncertainties in the GMA combined fit for the standards; and the EDA and RAC R-matrix results, respectively. Uncertainties of absolute and ²⁵²Cf fission spectrum averaged cross section measurements, and deviations between measured and evaluated values for ²³⁵U(n,f) cross-sections in the neutron energy range 1

  19. Using a Meniscus to Teach Uncertainty in Measurement

    Science.gov (United States)

    Backman, Philip

    2008-01-01

    I have found that students easily understand that a measurement cannot be exact, but they often seem to lack an understanding of why it is important to know "something" about the magnitude of the uncertainty. This tends to promote an attitude that almost any uncertainty value will do. Such indifference may exist because once an uncertainty is…

  20. Marketable pollution permits with uncertainty and transaction costs

    International Nuclear Information System (INIS)

    Montero, Juan-Pablo

    1998-01-01

    Increasing interest in the use of marketable permits for pollution control has become evident in recent years. Concern regarding their performance still remains, because empirical evidence has shown transaction costs and uncertainty to be significant in past and existing marketable permit programs. In this paper we develop theoretical and numerical models that include transaction costs and uncertainty (in trade approval) to show their effects on market performance (i.e., the equilibrium price of permits and trading volume) and aggregate control costs. We also show that in the presence of transaction costs and uncertainty the initial allocation of permits may not be neutral in terms of efficiency. Furthermore, using a numerical model for a hypothetical NOx trading program in which participants have discrete control technology choices, we find that aggregate control costs and the equilibrium price of permits are sensitive to the initial allocation of permits, even for constant marginal transaction costs and certainty.

  1. Risk Assessment Uncertainties in Cybersecurity Investments

    Directory of Open Access Journals (Sweden)

    Andrew Fielder

    2018-06-01

    Full Text Available. When undertaking cybersecurity risk assessments, it is important to be able to assign numeric values to metrics to compute the final expected loss that represents the risk that an organization is exposed to due to cyber threats. Even if risk assessment is motivated by real-world observations and data, there is always a high chance of assigning inaccurate values due to the different uncertainties involved (e.g., the evolving threat landscape, human errors, and the natural difficulty of quantifying risk). Existing models empower organizations to compute optimal cybersecurity strategies given their financial constraints, i.e., their available cybersecurity budget. Further, a general game-theoretic model with uncertain payoffs (probability-distribution-valued payoffs) shows that such uncertainty can be incorporated in the game-theoretic model by allowing payoffs to be random. This paper extends previous work in the field to tackle uncertainties in risk assessment that affect cybersecurity investments. The findings from simulated examples indicate that although uncertainties in cybersecurity risk assessment lead, on average, to different cybersecurity strategies, they do not play a significant role in the final expected loss of the organization when utilising a game-theoretic model and methodology to derive these strategies. The model determines robust defending strategies even when knowledge regarding risk assessment values is not accurate. As a result, it is possible to show that the cybersecurity investments’ tool is capable of providing effective decision support.
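    A heavily simplified, hypothetical sketch of the idea of random (distribution-valued) risk-assessment values, not the paper's model: the expected loss of each candidate mitigation level is averaged over draws of the uncertain impact, and the strategy choice is read off the averages. All numbers below are invented:

```python
import random
import statistics

random.seed(7)

def expected_loss(mitigation, trials=20000):
    """Average loss when the assessed impact of an attack is itself uncertain."""
    losses = []
    for _ in range(trials):
        impact = random.gauss(100.0, 30.0)     # uncertain risk-assessment value
        p_breach = max(0.0, 0.5 - mitigation)  # mitigation lowers breach probability
        losses.append(p_breach * max(impact, 0.0))
    return statistics.mean(losses)

# Compare two hypothetical defending strategies (mitigation levels).
strategies = {0.1: expected_loss(0.1), 0.3: expected_loss(0.3)}
best = min(strategies, key=strategies.get)
```

    Consistent with the record's finding, the ranking of strategies here is stable across draws even though the individual assessed values are noisy.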

  2. Entropic uncertainty relations in the Heisenberg XXZ model and its controlling via filtering operations

    Science.gov (United States)

    Ming, Fei; Wang, Dong; Shi, Wei-Nan; Huang, Ai-Jun; Sun, Wen-Yang; Ye, Liu

    2018-04-01

    The uncertainty principle is recognized as an elementary ingredient of quantum theory and sets up a significant bound to predict the outcome of measurement for a pair of incompatible observables. In this work, we develop dynamical features of quantum memory-assisted entropic uncertainty relations (QMA-EUR) in a two-qubit Heisenberg XXZ spin chain with an inhomogeneous magnetic field. We specifically derive the dynamical evolutions of the entropic uncertainty with respect to the measurement in the Heisenberg XXZ model when spin A is initially correlated with quantum memory B. It has been found that a larger coupling strength J of the ferromagnetic (J < 0) and antiferromagnetic (J > 0) chains can effectively degrade the measuring uncertainty. Besides, it turns out that a higher temperature can induce inflation of the uncertainty, because the thermal entanglement becomes relatively weak in this scenario, and there exists a distinct dynamical behavior of the uncertainty when an inhomogeneous magnetic field emerges. With a growing magnetic field |B|, the variation of the entropic uncertainty will be non-monotonic. Meanwhile, we compare several existing optimized bounds with the initial bound proposed by Berta et al. and consequently conclude that Adabi et al.'s result is optimal. Moreover, we also investigate the mixedness of the system of interest, which is dramatically associated with the uncertainty. Remarkably, we put forward a possible physical interpretation to explain the evolutionary phenomenon of the uncertainty. Finally, we take advantage of a local filtering operation to steer the magnitude of the uncertainty. Therefore, our explorations may shed light on the entropic uncertainty under the Heisenberg XXZ model and hence be of importance to quantum precision measurement in solid-state-based quantum information processing.

  3. On the EU approach for DEMO architecture exploration and dealing with uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Coleman, M., E-mail: matti.coleman@euro-fusion.org [EUROfusion Consortium, Boltzmannstraße 2, 85748 Garching (Germany); CCFE Fusion Association, Culham Science Centre, Abingdon, Oxfordshire OX14 3DB (United Kingdom); Maviglia, F.; Bachmann, C. [EUROfusion Consortium, Boltzmannstraße 2, 85748 Garching (Germany); Anthony, J. [CCFE Fusion Association, Culham Science Centre, Abingdon, Oxfordshire OX14 3DB (United Kingdom); Federici, G. [EUROfusion Consortium, Boltzmannstraße 2, 85748 Garching (Germany); Shannon, M. [EUROfusion Consortium, Boltzmannstraße 2, 85748 Garching (Germany); CCFE Fusion Association, Culham Science Centre, Abingdon, Oxfordshire OX14 3DB (United Kingdom); Wenninger, R. [EUROfusion Consortium, Boltzmannstraße 2, 85748 Garching (Germany); Max-Planck-Institut für Plasmaphysik, 85748 Garching (Germany)

    2016-11-01

    Highlights: • The issue of epistemic uncertainties in the DEMO design basis is described. • An approach to tackle uncertainty by investigating plant architectures is proposed. • The first wall heat load uncertainty is addressed following the proposed approach. - Abstract: One of the difficulties inherent in designing a future fusion reactor is dealing with uncertainty. As the major step between ITER and the commercial exploitation of nuclear fusion energy, DEMO will have to address many challenges – the natures of which are still not fully known. Unlike fission reactors, fusion reactors suffer from the intrinsic complexity of the tokamak (numerous interdependent system parameters) and from the dependence of plasma physics on scale – prohibiting design exploration founded on incremental progression and small-scale experimentation. For DEMO, this means that significant technical uncertainties will exist for some time to come, and a systems engineering design exploration approach must be developed to explore the reactor architecture when faced with these uncertainties. Important uncertainties in the context of fusion reactor design are discussed and a strategy for dealing with these is presented, treating the uncertainty in the first wall loads as an example.

  4. Interactions between perceived uncertainty types in service dyads

    DEFF Research Database (Denmark)

    Kreye, Melanie

    2018-01-01

    to avoid business failure. A conceptual framework of four uncertainty types is investigated: environmental, technological, organisational, and relational uncertainty. We present insights from four empirical cases of service dyads collected via multiple sources of evidence including 54 semi-structured...... interviews, observations, and secondary data. The cases show seven interaction paths with direct knock-on effects between two uncertainty types and indirect knock-on effects between three or four uncertainty types. The findings suggest a causal chain from environmental, technological, organisational......, to relational uncertainty. This research contributes to the servitization literature by (i) confirming the existence of uncertainty types, (ii) providing an in-depth characterisation of technological uncertainty, and (iii) showing the interaction paths between four uncertainty types in the form of a causal...

  5. Holistic uncertainty analysis in river basin modeling for climate vulnerability assessment

    Science.gov (United States)

    Taner, M. U.; Wi, S.; Brown, C.

    2017-12-01

    The challenges posed by uncertain future climate are a prominent concern for water resources managers. A number of frameworks exist for assessing the impacts of climate-related uncertainty, including internal climate variability and anthropogenic climate change, such as scenario-based approaches and vulnerability-based approaches. While in many cases climate uncertainty may be dominant, other factors such as future evolution of the river basin, hydrologic response, and reservoir operations are potentially significant sources of uncertainty. While uncertainty associated with modeling hydrologic response has received attention, very little attention has focused on the range of uncertainty and possible effects of the water resources infrastructure and management. This work presents a holistic framework that allows analysis of climate, hydrologic and water management uncertainty in water resources systems analysis with the aid of a water system model designed to integrate component models for hydrology processes and water management activities. The uncertainties explored include those associated with climate variability and change, hydrologic model parameters, and water system operation rules. A Bayesian framework is used to quantify and model the uncertainties at each modeling step in an integrated fashion, including prior and likelihood information about model parameters. The framework is demonstrated in a case study for the St. Croix Basin, located on the border of the United States and Canada.
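    The Bayesian step can be sketched with a deliberately tiny "hydrologic model" (runoff as a linear function of rainfall; all data and the noise level are invented, not from the study): a grid posterior over one parameter is computed from prior and likelihood, then propagated to a prediction:

```python
import math

# Toy 'hydrology': runoff = coeff * rainfall, with noisy observations.
rainfall = [10.0, 20.0, 30.0]
observed = [4.2, 7.9, 12.3]    # hypothetical runoff observations
sigma = 0.5                    # assumed observation noise (std. dev.)

# Grid over the runoff coefficient; uniform prior over the grid.
grid = [0.2 + 0.001 * i for i in range(401)]   # 0.2 .. 0.6

def log_like(c):
    # Gaussian likelihood of the observations given coefficient c.
    return sum(-0.5 * ((o - c * r) / sigma) ** 2 for r, o in zip(rainfall, observed))

weights = [math.exp(log_like(c)) for c in grid]
z = sum(weights)
post = [w / z for w in weights]                # normalized posterior

# Posterior mean of the parameter, then propagate to a prediction
# for an unobserved rainfall of 40 mm.
c_mean = sum(c * p for c, p in zip(grid, post))
pred_40mm = c_mean * 40.0
```

    In the full framework the same prior-times-likelihood update runs over many hydrologic and operational parameters at once, but the mechanics per parameter are as above.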

  6. Use of probabilistic methods for analysis of cost and duration uncertainties in a decision analysis framework

    International Nuclear Information System (INIS)

    Boak, D.M.; Painton, L.

    1995-01-01

    Probabilistic forecasting techniques have been used in many risk assessment and performance assessment applications on radioactive waste disposal projects such as Yucca Mountain and the Waste Isolation Pilot Plant (WIPP). Probabilistic techniques such as Monte Carlo and Latin Hypercube sampling methods are routinely used to treat uncertainties in physical parameters important in simulating radionuclide transport in a coupled geohydrologic system and assessing the ability of that system to comply with regulatory release limits. However, the use of probabilistic techniques in the treatment of uncertainties in the cost and duration of programmatic alternatives on risk and performance assessment projects is less common. Where significant uncertainties exist and where programmatic decisions must be made despite existing uncertainties, probabilistic techniques may yield important insights into decision options, especially when used in a decision analysis framework and when properly balanced with deterministic analyses. For relatively simple evaluations, these types of probabilistic evaluations can be made using personal-computer-based software.
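    A minimal sketch of Latin Hypercube sampling applied to programmatic cost and duration uncertainties (the distribution choices and ranges are hypothetical; real analyses typically use triangular or beta distributions elicited from experts):

```python
import random
import statistics

random.seed(3)

def latin_hypercube(n):
    """One stratified sample per equal-probability bin of [0, 1), shuffled
    so the stream can be paired with other variables."""
    u = [(i + random.random()) / n for i in range(n)]
    random.shuffle(u)
    return u

n = 1000
# Hypothetical programmatic uncertainties (uniform here for brevity):
cost = [5e6 + 4e6 * u for u in latin_hypercube(n)]      # $5M .. $9M
duration = [24 + 18 * u for u in latin_hypercube(n)]    # 24 .. 42 months

mean_cost = statistics.mean(cost)
mean_duration = statistics.mean(duration)
```

    Stratifying [0, 1) into n equal-probability bins and drawing once per bin gives much tighter estimates of mean cost and duration than plain Monte Carlo with the same sample size, which is why Latin Hypercube sampling is favored when each model evaluation is expensive.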

  7. Stereo-particle image velocimetry uncertainty quantification

    International Nuclear Information System (INIS)

    Bhattacharya, Sayantan; Vlachos, Pavlos P; Charonko, John J

    2017-01-01

    Particle image velocimetry (PIV) measurements are subject to multiple elemental error sources, and thus estimating the overall measurement uncertainty is challenging. Recent advances have led to a posteriori uncertainty estimation methods for planar two-component PIV. However, no complete methodology exists for uncertainty quantification in stereo PIV. In the current work, a comprehensive framework is presented to quantify the uncertainty stemming from stereo registration error and combine it with the underlying planar velocity uncertainties. The disparity in particle locations of the dewarped images is used to estimate the positional uncertainty of the world coordinate system, which is then propagated to the uncertainty in the calibration mapping function coefficients. Next, the calibration uncertainty is combined with the planar uncertainty fields of the individual cameras through an uncertainty propagation equation, and uncertainty estimates are obtained for all three velocity components. The methodology was tested with synthetic stereo PIV data for different light sheet thicknesses, with and without registration error, and also validated with an experimental vortex ring case from the 2014 PIV challenge. A thorough sensitivity analysis was performed to assess the relative impact of the various parameters on the overall uncertainty. The results suggest that in the absence of any disparity, the stereo PIV uncertainty prediction method is more sensitive to the planar uncertainty estimates than to the angle uncertainty, although the latter is not negligible for non-zero disparity. Overall, the presented uncertainty quantification framework showed excellent agreement between the error and uncertainty RMS values for both the synthetic and the experimental data and demonstrated reliable uncertainty prediction coverage. This stereo PIV uncertainty quantification framework provides the first comprehensive treatment on the subject and potentially lays foundations applicable to volumetric
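    A generic first-order uncertainty propagation step of the kind such a framework combines (the sensitivities and input uncertainties below are hypothetical illustrations, not values from the paper):

```python
import math

def combine_uncertainties(sensitivities, uncertainties):
    """First-order (Taylor series) propagation for uncorrelated sources:
    u_y^2 = sum_i (dy/dx_i)^2 * u_i^2."""
    return math.sqrt(sum((s * u) ** 2 for s, u in zip(sensitivities, uncertainties)))

# Hypothetical reconstruction of one velocity component from two cameras'
# planar displacements plus a calibration/registration term.
u_cam1, u_cam2 = 0.10, 0.12   # planar displacement uncertainties (px)
u_calib = 0.05                # calibration/registration contribution (px)
u_w = combine_uncertainties([0.5, 0.5, 1.0], [u_cam1, u_cam2, u_calib])
```

    Correlated sources would add cross-covariance terms to the sum; the quadrature form above is the uncorrelated special case.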

  8. A Framework for Understanding Uncertainty in Seismic Risk Assessment.

    Science.gov (United States)

    Foulser-Piggott, Roxane; Bowman, Gary; Hughes, Martin

    2017-10-11

    A better understanding of the uncertainty that exists in models used for seismic risk assessment is critical to improving risk-based decisions pertaining to earthquake safety. Current models estimating the probability of collapse of a building do not consider comprehensively the nature and impact of uncertainty. This article presents a model framework to enhance seismic risk assessment and thus gives decisionmakers a fuller understanding of the nature and limitations of the estimates. This can help ensure that risks are not over- or underestimated and the value of acquiring accurate data is appreciated fully. The methodology presented provides a novel treatment of uncertainties in input variables, their propagation through the model, and their effect on the results. The study presents ranges of possible annual collapse probabilities for different case studies on buildings in different parts of the world, exposed to different levels of seismicity, and with different vulnerabilities. A global sensitivity analysis was conducted to determine the significance of uncertain variables. Two key outcomes are (1) that the uncertainty in ground-motion conversion equations has the largest effect on the uncertainty in the calculation of annual collapse probability; and (2) the vulnerability of a building appears to have an effect on the range of annual collapse probabilities produced, i.e., the level of uncertainty in the estimate of annual collapse probability, with less vulnerable buildings having a smaller uncertainty. © 2017 Society for Risk Analysis.
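    The variance-based global sensitivity analysis mentioned in the record can be sketched with a toy linear stand-in for the collapse-probability model (the weights and inputs are invented): the brute-force first-order Sobol index Var(E[y|x_i]) / Var(y) identifies the dominant uncertain input:

```python
import random
import statistics

random.seed(5)

def model(gm_conversion_err, vulnerability_err):
    # Toy stand-in: a response driven by two uncertain inputs,
    # with hypothetical weights (3.0 and 1.0).
    return 3.0 * gm_conversion_err + 1.0 * vulnerability_err

def first_order_index(which, n_outer=200, n_inner=200):
    """Brute-force first-order Sobol index: Var_x(E[y|x]) / Var(y)."""
    cond_means = []
    all_y = []
    for _ in range(n_outer):
        fixed = random.gauss(0, 1)          # freeze the input of interest
        ys = []
        for _ in range(n_inner):
            other = random.gauss(0, 1)      # vary the remaining input
            y = model(fixed, other) if which == 0 else model(other, fixed)
            ys.append(y)
        cond_means.append(statistics.mean(ys))
        all_y.extend(ys)
    return statistics.variance(cond_means) / statistics.variance(all_y)

s1 = first_order_index(0)  # index of the heavily weighted input
s2 = first_order_index(1)  # index of the lightly weighted input
# Analytically S1 = 0.9 and S2 = 0.1 for this linear model.
```

    Efficient estimators (Sobol/Saltelli sampling) need far fewer model runs, but the double loop makes the definition of the index transparent.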

  9. Uncertainty assessment for accelerator-driven systems

    International Nuclear Information System (INIS)

    Finck, P. J.; Gomes, I.; Micklich, B.; Palmiotti, G.

    1999-01-01

    The concept of a subcritical system driven by an external source of neutrons provided by an accelerator, the ADS (Accelerator Driven System), has recently been revived and is becoming more popular in the world technical community, with active programs in Europe, Russia, Japan, and the U.S. A general consensus has been reached in adopting for the subcritical component a fast-spectrum, liquid-metal-cooled configuration. Lead-bismuth eutectic, sodium, and gas are all being considered as coolants; each has advantages and disadvantages. The major expected advantage is that subcriticality avoids reactivity-induced transients. The potentially large subcriticality margin should also allow for the introduction of very significant quantities of waste products (minor actinides and fission products) which negatively impact the safety characteristics of standard cores. In the U.S. these arguments are the basis for the development of the Accelerator Transmutation of Waste (ATW), which has significant potential for reducing nuclear waste levels. Up to now, neutronic calculations have not attached uncertainties to the values of the main nuclear integral parameters that characterize the system. Many of these parameters (e.g., the degree of subcriticality) are crucial to demonstrating the validity and feasibility of this concept. In this paper we consider uncertainties related to nuclear data only. The present knowledge of the cross sections of many isotopes that are not usually utilized in existing reactors (like Bi, Pb-207, Pb-208, and also minor actinides and fission products) suggests that uncertainties in the integral parameters will be significantly larger than for conventional reactor systems, and this raises concerns about the neutronic performance of those systems

  10. Identifying significant uncertainties in thermally dependent processes for repository performance analysis

    International Nuclear Information System (INIS)

    Gansemer, J.D.; Lamont, A.

    1994-01-01

    In order to study the performance of the potential Yucca Mountain Nuclear Waste Repository, scientific investigations are being conducted to reduce the uncertainty about process models and system parameters. This paper is intended to demonstrate a method for determining a strategy for the cost effective management of these investigations. It is not meant to be a complete study of all processes and interactions, but does outline a method which can be applied to more in-depth investigations

  11. Quantifying uncertainty in Transcranial Magnetic Stimulation - A high resolution simulation study in ICBM space.

    Science.gov (United States)

    Toschi, Nicola; Keck, Martin E; Welt, Tobias; Guerrisi, Maria

    2012-01-01

    Transcranial Magnetic Stimulation offers enormous potential for noninvasive brain stimulation. While it is known that brain tissue significantly "reshapes" induced field and charge distributions, most modeling investigations to date have focused on single-subject data with limited generality. Further, the effects of the significant uncertainties that exist in the simulation setup (i.e. brain conductivity distributions) and the stimulation setup (e.g. coil positioning and orientation) have not been quantified. In this study, we construct a high-resolution anisotropic head model in standard ICBM space, which can be used as a population-representative standard for bioelectromagnetic simulations. Further, we employ Monte-Carlo simulations in order to quantify how uncertainties in conductivity values propagate all the way to induced fields and currents, demonstrating significant, regionally dependent dispersions in values which are commonly assumed "ground truth". This framework can be leveraged to quantify the effect of any type of uncertainty in noninvasive brain stimulation and bears relevance in all applications of TMS, both investigative and therapeutic.
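The Monte-Carlo propagation the abstract describes can be sketched in a few lines. The toy forward model, the conductivity distribution (mean 0.33 S/m with 10% relative standard deviation), and the sample size below are illustrative assumptions, not values or models from the study.

```python
import random
import statistics

def induced_field(sigma):
    """Toy scalar forward model standing in for the full anisotropic
    FEM solve: the induced quantity saturates with conductivity."""
    return 1.5 * sigma / (0.1 + sigma)

random.seed(0)

# Sample the uncertain conductivity and push each draw through the model.
samples = [induced_field(random.gauss(0.33, 0.033)) for _ in range(10_000)]

# The spread of the outputs quantifies the propagated uncertainty.
mean = statistics.fmean(samples)
sd = statistics.stdev(samples)
```

In the real study this scalar model is replaced by a full field solve per draw, so the output dispersion becomes regionally dependent, as the abstract reports.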

  12. Uncertainty in spatial planning proceedings

    Directory of Open Access Journals (Sweden)

    Aleš Mlakar

    2009-01-01

    Full Text Available Uncertainty is distinctive of spatial planning, as it arises from the necessity to coordinate the various interests within an area, from the urgency of adopting spatial planning decisions, from the complexity of the environment, physical space and society, from addressing the uncertainty of the future, and from the uncertainty of actually making the right decision. The response to uncertainty is a series of measures that mitigate the effects of uncertainty itself. These measures are based on two fundamental principles – standardization and optimization. They are reflected in knowledge enhancement and spatial planning comprehension, in the legal regulation of changes, in the existence of spatial planning as a means of coordinating different interests, in active planning and the constructive resolution of current spatial problems, in the integration of spatial planning and the environmental protection process, in the implementation of analysis as the foundation of spatial planners' activities, in methods of thinking outside the parameters, in forming clear spatial concepts, in creating a transparent spatial management system, and in the enforcement of participatory processes.

  13. A Bayesian statistical method for quantifying model form uncertainty and two model combination methods

    International Nuclear Information System (INIS)

    Park, Inseok; Grandhi, Ramana V.

    2014-01-01

    Apart from parametric uncertainty, model form uncertainty as well as prediction error may be involved in the analysis of engineering systems. Model form uncertainty, inherent in selecting the best approximation from a model set, cannot be ignored, especially when the predictions of competing models show significant differences. In this research, a methodology based on maximum likelihood estimation is presented to quantify model form uncertainty using the measured differences between experimental and model outcomes, and is compared with a fully Bayesian estimation to demonstrate its effectiveness. While a method called the adjustment factor approach is utilized to propagate model form uncertainty alone into the prediction of a system response, a method called model averaging is utilized to incorporate both model form uncertainty and prediction error into it. A numerical problem of concrete creep is used to demonstrate the processes for quantifying model form uncertainty and implementing the adjustment factor approach and model averaging. Finally, the presented methodology is applied to characterize the engineering benefits of a laser peening process
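The general idea of likelihood-based model weighting and model averaging can be sketched as follows. The Gaussian error model and all numbers are illustrative assumptions, not the paper's formulation.

```python
import math

def model_probabilities(preds, obs, sigma):
    """Likelihood-based weights for competing models given one observed
    outcome, assuming a Gaussian error model with known sigma
    (a simplification of the paper's MLE/Bayesian treatment)."""
    likes = [math.exp(-0.5 * ((p - obs) / sigma) ** 2) for p in preds]
    total = sum(likes)
    return [l / total for l in likes]

def averaged_prediction(preds, weights):
    """Model-averaged prediction plus the between-model variance,
    the term that captures model form uncertainty."""
    mean = sum(w * p for w, p in zip(weights, preds))
    var = sum(w * (p - mean) ** 2 for w, p in zip(weights, preds))
    return mean, var
```

Models whose predictions sit far from the measured outcome receive small weights, so they contribute little to the averaged prediction but still inflate the between-model variance.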

  14. Systematic Evaluation of Uncertainty in Material Flow Analysis

    DEFF Research Database (Denmark)

    Laner, David; Rechberger, Helmut; Astrup, Thomas Fruergaard

    2014-01-01

    Material flow analysis (MFA) is a tool to investigate material flows and stocks in defined systems as a basis for resource management or environmental pollution control. Because of the diverse nature of sources and the varying quality and availability of data, MFA results are inherently uncertain....... Uncertainty analyses have received increasing attention in recent MFA studies, but systematic approaches for selection of appropriate uncertainty tools are missing. This article reviews existing literature related to handling of uncertainty in MFA studies and evaluates current practice of uncertainty analysis......) and exploratory MFA (identification of critical parameters and system behavior). Whereas mathematically simpler concepts focusing on data uncertainty characterization are appropriate for descriptive MFAs, statistical approaches enabling more-rigorous evaluation of uncertainty and model sensitivity are needed...

  15. Can you put too much on your plate? Uncertainty exposure in servitized triads

    DEFF Research Database (Denmark)

    Kreye, Melanie E.

    2017-01-01

    -national servitized triad in a European-North African set-up which was collected through 29 semi-structured interviews and secondary data. Findings: The empirical study identified the existence of the three uncertainty types and directional knock-on effects between them. Specifically, environmental uncertainty...... relational governance reduced relational uncertainty. The knock-on effects were reduced through organisational and relational responses. Originality: This paper makes two contributions. First, a structured analysis of the uncertainty exposure in servitized triads is presented which shows the existence...... of three individual uncertainty types and the knock-on effects between them. Second, organisational responses to reduce the three uncertainty types individually and the knock-on effects between them are presented....

  16. Appropriate spatial scales to achieve model output uncertainty goals

    NARCIS (Netherlands)

    Booij, Martijn J.; Melching, Charles S.; Chen, Xiaohong; Chen, Yongqin; Xia, Jun; Zhang, Hailun

    2008-01-01

    Appropriate spatial scales of hydrological variables were determined using an existing methodology based on a balance in uncertainties from model inputs and parameters extended with a criterion based on a maximum model output uncertainty. The original methodology uses different relationships between

  17. Model Uncertainty Quantification Methods In Data Assimilation

    Science.gov (United States)

    Pathiraja, S. D.; Marshall, L. A.; Sharma, A.; Moradkhani, H.

    2017-12-01

    Data Assimilation involves utilising observations to improve model predictions in a seamless and statistically optimal fashion. Its applications are wide-ranging; from improving weather forecasts to tracking targets such as in the Apollo 11 mission. The use of Data Assimilation methods in high dimensional complex geophysical systems is an active area of research, where there exist many opportunities to enhance existing methodologies. One of the central challenges is in model uncertainty quantification; the outcome of any Data Assimilation study is strongly dependent on the uncertainties assigned to both observations and models. I focus on developing improved model uncertainty quantification methods that are applicable to challenging real world scenarios. These include developing methods for cases where the system states are only partially observed, where there is little prior knowledge of the model errors, and where the model error statistics are likely to be highly non-Gaussian.

  18. On the uncertainty principle. V

    International Nuclear Information System (INIS)

    Halpern, O.

    1976-01-01

    The treatment of ideal experiments connected with the uncertainty principle is continued. The author successively analyzes measurements of momentum and position, and discusses the common reason why the results in all cases differ from the conventional ones. A similar difference exists for the measurement of field strengths. The interpretation given by Weizsaecker, who tried to interpret Bohr's complementarity principle by introducing a multi-valued logic, is analyzed. The treatment of the uncertainty principle ΔE Δt is deferred to a later paper, as is the interpretation of the method of variation of constants. Every ideal experiment discussed shows various lower limits for the value of the uncertainty product, which depend on the experimental arrangement and are always (considerably) larger than h. (Auth.)

  19. Incorporating Forecast Uncertainty in Utility Control Center

    Energy Technology Data Exchange (ETDEWEB)

    Makarov, Yuri V.; Etingov, Pavel V.; Ma, Jian

    2014-07-09

    Uncertainties in forecasting the output of intermittent resources such as wind and solar generation, as well as system loads, are not adequately reflected in existing industry-grade tools used for transmission system management, generation commitment, dispatch and market operation. There are other sources of uncertainty such as uninstructed deviations of conventional generators from their dispatch set points, generator forced outages and failures to start up, load drops, losses of major transmission facilities and frequency variation. These uncertainties can cause deviations from the system balance, which sometimes require inefficient and costly last minute solutions in the near real-time timeframe. This Chapter considers sources of uncertainty and variability, an overall system uncertainty model, a possible plan for transition from deterministic to probabilistic methods in planning and operations, and two examples of uncertainty-based tools for grid operations. This chapter is based on work conducted at the Pacific Northwest National Laboratory (PNNL)

  20. Uncertainty representation of grey numbers and grey sets.

    Science.gov (United States)

    Yang, Yingjie; Liu, Sifeng; John, Robert

    2014-09-01

    In the literature, there is a presumption that a grey set and an interval-valued fuzzy set are equivalent. This presumption ignores the existence of discrete components in a grey number. In this paper, new measurements of uncertainties of grey numbers and grey sets, consisting of both absolute and relative uncertainties, are defined to give a comprehensive representation of uncertainties in a grey number and a grey set. Some simple examples are provided to illustrate that the proposed uncertainty measurement can give an effective representation of both absolute and relative uncertainties in a grey number and a grey set. The relationships between grey sets and interval-valued fuzzy sets are also analyzed from the point of view of the proposed uncertainty representation. The analysis demonstrates that grey sets and interval-valued fuzzy sets provide different but overlapping models for uncertainty representation in sets.

  1. Deterministic uncertainty analysis

    International Nuclear Information System (INIS)

    Worley, B.A.

    1987-01-01

    Uncertainties of computer results are of primary interest in applications such as high-level waste (HLW) repository performance assessment in which experimental validation is not possible or practical. This work presents an alternate deterministic approach for calculating uncertainties that has the potential to significantly reduce the number of computer runs required for conventional statistical analysis. 7 refs., 1 fig

  2. Sensitivity and uncertainty analysis for functionals of the time-dependent nuclide density field

    International Nuclear Information System (INIS)

    Williams, M.L.; Weisbin, C.R.

    1978-04-01

    An approach to extend the present ORNL sensitivity program to include functionals of the time-dependent nuclide density field is developed. An adjoint equation for the nuclide field was derived previously by using generalized perturbation theory; the present derivation makes use of a variational principle and results in the same equation. The physical significance of this equation is discussed and compared to that of the time-dependent neutron adjoint equation. Computational requirements for determining sensitivity profiles and uncertainties for functionals of the time-dependent nuclide density vector are developed within the framework of the existing FORSS system; in this way the current capability is significantly extended. The development, testing, and use of an adjoint version of the ORIGEN isotope generation and depletion code are documented. Finally, a sample calculation is given which estimates the uncertainty in the plutonium inventory at shutdown of a PWR due to assumed uncertainties in uranium and plutonium cross sections. 8 figures, 4 tables

  3. Utilization of Software Tools for Uncertainty Calculation in Measurement Science Education

    International Nuclear Information System (INIS)

    Zangl, Hubert; Zine-Zine, Mariam; Hoermaier, Klaus

    2015-01-01

    Despite its importance, uncertainty is often neglected by practitioners in the design of systems, even in safety-critical applications. Thus, problems arising from uncertainty may only be identified late in the design process and thus lead to additional costs. Although numerous tools exist to support uncertainty calculation, reasons for their limited usage in early design phases may be low awareness of the existence of the tools and insufficient training in their practical application. We present a teaching philosophy that addresses uncertainty from the very beginning of teaching measurement science, in particular with respect to the utilization of software tools. The developed teaching material is based on the GUM method and makes use of uncertainty toolboxes in the simulation environment. Based on examples in measurement science education we discuss advantages and disadvantages of the proposed teaching philosophy and include feedback from students
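At its core, the GUM method mentioned above applies the law of propagation of uncertainty: combine input standard uncertainties through the sensitivity coefficients of the measurement model. A textbook sketch (power P = V²/R with uncorrelated inputs; the numbers are invented, not from the paper):

```python
import math

def combined_uncertainty(V, R, u_V, u_R):
    """GUM first-order combined standard uncertainty for P = V**2 / R,
    assuming V and R are uncorrelated."""
    P = V**2 / R
    # Sensitivity coefficients (partial derivatives of the model).
    dP_dV = 2 * V / R
    dP_dR = -(V**2) / R**2
    # Law of propagation of uncertainty: quadrature sum of contributions.
    u_P = math.sqrt((dP_dV * u_V) ** 2 + (dP_dR * u_R) ** 2)
    return P, u_P
```

Software toolboxes of the kind the abstract mentions automate exactly this bookkeeping (derivatives, correlations, coverage factors) so that students can focus on the measurement model itself.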

  4. Modeling for waste management associated with environmental-impact abatement under uncertainty.

    Science.gov (United States)

    Li, P; Li, Y P; Huang, G H; Zhang, J L

    2015-04-01

    Municipal solid waste (MSW) treatment can generate significant amounts of pollutants, and thus pose a risk to human health. Besides, in MSW management, various uncertainties exist in the related costs, impact factors, and objectives, which can affect the optimization processes and the decision schemes generated. In this study, a life cycle assessment-based interval-parameter programming (LCA-IPP) method is developed for MSW management associated with environmental-impact abatement under uncertainty. The LCA-IPP can effectively examine the environmental consequences based on a number of environmental impact categories (i.e., greenhouse gas equivalent, acid gas emissions, and respiratory inorganics), through analyzing each life cycle stage and/or major contributing process related to various MSW management activities. It can also tackle uncertainties that exist in the related costs, impact factors, and objectives and are expressed as interval numbers. The LCA-IPP method is then applied to MSW management for the City of Beijing, the capital of China, where energy consumption and six environmental parameters [i.e., CO2, CO, CH4, NOX, SO2, and inhalable particles (PM10)] are used as a systematic tool to quantify environmental releases across the entire life cycle stages of waste collection, transportation, treatment, and disposal. Results associated with system cost, environmental impact, and the related policy implications are generated and analyzed. The results can help identify desired alternatives for managing MSW flows, with the advantage of providing compromise schemes under an integrated consideration of economic efficiency and environmental impact under uncertainty.
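The interval-parameter idea can be sketched with elementary interval arithmetic: every uncertain coefficient is an interval, and operations propagate the lower and upper bounds. The flows and costs below are invented for illustration and are not the Beijing case-study data.

```python
def interval_add(a, b):
    """Sum of two intervals (lo, hi)."""
    return (a[0] + b[0], a[1] + b[1])

def interval_mul(a, b):
    """Product of two intervals: bounds come from the corner products."""
    corners = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return (min(corners), max(corners))

# Illustrative inputs: waste flow (tonnes) and unit treatment cost ($/t)
# known only to within intervals, plus an interval transport cost ($).
flow = (100.0, 120.0)
unit_cost = (30.0, 45.0)
transport = (500.0, 800.0)

# Total cost is itself an interval, carrying the input uncertainty through.
total = interval_add(interval_mul(flow, unit_cost), transport)
```

An IPP model embeds such interval coefficients directly in the optimization, so the resulting decision schemes are reported as ranges rather than single values.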

  5. Uncertainty and Cognitive Control

    Directory of Open Access Journals (Sweden)

    Faisal eMushtaq

    2011-10-01

    Full Text Available A growing trend of neuroimaging, behavioural and computational research has investigated the topic of outcome uncertainty in decision-making. Although evidence to date indicates that humans are very effective in learning to adapt to uncertain situations, the nature of the specific cognitive processes involved in the adaptation to uncertainty is still a matter of debate. In this article, we review evidence suggesting that cognitive control processes are at the heart of uncertainty in decision-making contexts. Available evidence suggests that: (1) there is a strong conceptual overlap between the constructs of uncertainty and cognitive control; (2) there is a remarkable overlap between the neural networks associated with uncertainty and the brain networks subserving cognitive control; (3) the perception and estimation of uncertainty might play a key role in monitoring processes and the evaluation of the need for control; (4) potential interactions between uncertainty and cognitive control might play a significant role in several affective disorders.

  6. On treatment of uncertainty in system planning

    International Nuclear Information System (INIS)

    Flage, R.; Aven, T.

    2009-01-01

    In system planning and operation considerable efforts and resources are spent to reduce uncertainties, as a part of project management, uncertainty management and safety management. The basic idea seems to be that uncertainties are purely negative and should be reduced. In this paper we challenge this way of thinking, using a common industry practice as an example. In accordance with this industry practice, three uncertainty interval categories are used: ±40% intervals for the feasibility phase, ±30% intervals for the concept development phase and ±20% intervals for the engineering phase. The problem is that such a regime could easily lead to a conservative management regime encouraging the use of existing methods and tools, as new activities and novel solutions and arrangements necessarily mean increased uncertainties. In the paper we suggest an alternative approach based on uncertainty and risk descriptions, but having no predefined uncertainty reduction structures. The approach makes use of risk assessments and economic optimisation tools such as the expected net present value, but acknowledges the need for broad risk management processes which extend beyond the analyses. Different concerns need to be balanced, including economic aspects, uncertainties and risk, and practicability

  7. Uncertainties affecting fund collection, management and final utilisation

    International Nuclear Information System (INIS)

    Soederberg, Olof

    2006-01-01

    The paper presents, on a general level, major uncertainties in financing systems aiming at providing secure funding for future decommissioning costs. The perspective chosen is that of a fund collector/manager. The paper also contains a description of how these uncertainties are dealt with within the Swedish financing system, particularly from the perspective of the Board of the Swedish Nuclear Waste Fund. It is concluded that existing uncertainties are a good reason not to postpone decommissioning activities to a distant future. This aspect is important also when countries have in place financing systems that have been constructed in order to be robust against identified uncertainties. (author)

  8. Research of Uncertainty Reasoning in Pineapple Disease Identification System

    Science.gov (United States)

    Liu, Liqun; Fan, Haifeng

    In order to deal with the uncertainty of evidence that mostly exists in a pineapple disease identification system, a reasoning model based on an evidence credibility factor was established. The uncertainty reasoning method is discussed, including: uncertain representation of knowledge, uncertain representation of rules, uncertain representation of multiple pieces of evidence, and updating of reasoning rules. The reasoning can fully reflect the uncertainty in disease identification and reduce the influence of subjective factors on the accuracy of the system.

  9. Uncertainties in risk assessment and decision making

    International Nuclear Information System (INIS)

    Starzec, Peter; Purucker, Tom; Stewart, Robert

    2008-02-01

    The general concept for risk assessment in accordance with the Swedish model for contaminated soil implies that the toxicological reference value for a given receptor is first back-calculated to a corresponding concentration of a compound in soil and (if applicable) then modified with respect to e.g. background levels, acute toxicity, and a factor of safety. This results in a guideline value that is subsequently compared to the observed concentration levels. Many sources of uncertainty exist when assessing whether the risk for a receptor is significant or not. In this study, the uncertainty aspects have been addressed from three standpoints: 1. Uncertainty in the comparison between the level of contamination (source) and a given risk criterion (e.g. a guideline value), and the possible implications for subsequent decisions. This type of uncertainty is considered most important in situations where a contaminant is expected to be spatially heterogeneous without any tendency to form isolated clusters (hotspots) that can be easily delineated, i.e. where mean values are appropriate to compare to the risk criterion. 2. Uncertainty in the spatial distribution of a contaminant. Spatial uncertainty should be accounted for when hotspots are to be delineated and the volume of soil contaminated with levels above a stated decision criterion has to be assessed (quantified). 3. Uncertainty in an ecological exposure model with regard to the moving pattern of a receptor in relation to the spatial distribution of the contaminant in question. The study points out that the choice of methodology to characterize the relation between contaminant concentration and a pre-defined risk criterion is governed by a conceptual perception of the contaminant's spatial distribution and also depends on the structure of the collected data (observations). How uncertainty in the transition from contaminant concentration to risk criterion can be quantified was demonstrated by applying hypothesis tests and the concept of
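The hypothesis-test idea in the abstract can be illustrated with a minimal sketch: a one-sided, one-sample t-test of whether the mean soil concentration significantly exceeds a guideline value. The sample values, guideline, and critical t below are invented for illustration; a real assessment would follow the Swedish model's full procedure.

```python
import math
import statistics

def exceeds_guideline(samples, guideline, t_crit):
    """One-sided one-sample t-test: does the site mean significantly
    exceed the guideline value? t_crit is taken from a t-table for the
    chosen significance level and n-1 degrees of freedom."""
    n = len(samples)
    mean = statistics.fmean(samples)
    se = statistics.stdev(samples) / math.sqrt(n)  # standard error
    t = (mean - guideline) / se
    return t, t > t_crit
```

For n = 10 observations and a one-sided 5% level, the critical value is t(0.95, 9) ≈ 1.833; the decision then depends on both the exceedance of the mean and the spread of the data, which is exactly the source-versus-criterion uncertainty the study discusses.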

  10. Source terms: an investigation of uncertainties, magnitudes, and recommendations for research. [PWR; BWR

    Energy Technology Data Exchange (ETDEWEB)

    Levine, S.; Kaiser, G. D.; Arcieri, W. C.; Firstenberg, H.; Fulford, P. J.; Lam, P. S.; Ritzman, R. L.; Schmidt, E. R.

    1982-03-01

    The purpose of this document is to assess the state of knowledge and expert opinions that exist about fission product source terms from potential nuclear power plant accidents. This is so that recommendations can be made for research and analyses which have the potential to reduce the uncertainties in these estimated source terms and to derive improved methods for predicting their magnitudes. The main reasons for writing this report are to indicate the major uncertainties involved in defining realistic source terms that could arise from severe reactor accidents, to determine which factors would have the most significant impact on public risks and emergency planning, and to suggest research and analyses that could result in the reduction of these uncertainties. Source terms used in the conventional consequence calculations in the licensing process are not explicitly addressed.

  11. Entropic uncertainty relations-a survey

    International Nuclear Information System (INIS)

    Wehner, Stephanie; Winter, Andreas

    2010-01-01

    Uncertainty relations play a central role in quantum mechanics. Entropic uncertainty relations in particular have gained significant importance within quantum information, providing the foundation for the security of many quantum cryptographic protocols. Yet, little is known about entropic uncertainty relations with more than two measurement settings. In the present survey, we review known results and open questions.

  12. Fuzzy Uncertainty Evaluation for Fault Tree Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ki Beom; Shim, Hyung Jin [Seoul National University, Seoul (Korea, Republic of); Jae, Moo Sung [Hanyang University, Seoul (Korea, Republic of)

    2015-05-15

    This traditional probabilistic approach can calculate relatively accurate results. However, it requires a long time because of the repetitive computation of the MC method. In addition, when informative data for statistical analysis are not sufficient or some events are mainly caused by human error, the probabilistic approach may not be possible, because the uncertainties of these events are difficult to express by probabilistic distributions. In order to reduce the computation time and quantify the uncertainties of top events when there exist basic events whose uncertainties are difficult to express by probabilistic distributions, fuzzy uncertainty propagation based on fuzzy set theory can be applied. In this paper, we develop a fuzzy uncertainty propagation code and apply it to the fault tree of the core damage accident following the large loss of coolant accident (LLOCA). The fuzzy uncertainty propagation code is implemented and tested on the fault tree of the radiation release accident. We apply this code to the fault tree of the core damage accident after the LLOCA in three cases and compare the results with those computed by probabilistic uncertainty propagation using the MC method. The results obtained by fuzzy uncertainty propagation can be calculated in a relatively short time, covering the results obtained by probabilistic uncertainty propagation.
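A common way to implement fuzzy uncertainty propagation of this kind is alpha-cut interval arithmetic through the fault-tree gates: at each alpha level, every fuzzy basic-event probability becomes an interval, and the gates combine interval bounds. The triangular fuzzy numbers below are illustrative, not the LLOCA case data.

```python
def alpha_cut(tri, alpha):
    """Interval of a triangular fuzzy number (a, m, b) at level alpha
    (alpha = 1 gives the peak, alpha = 0 the full support)."""
    a, m, b = tri
    return (a + alpha * (m - a), b - alpha * (b - m))

def and_gate(cuts):
    """AND gate: multiply lower bounds and upper bounds separately."""
    lo, hi = 1.0, 1.0
    for l, h in cuts:
        lo *= l
        hi *= h
    return (lo, hi)

def or_gate(cuts):
    """OR gate: 1 - product of complements, bound by bound."""
    prod_lo, prod_hi = 1.0, 1.0
    for l, h in cuts:
        prod_lo *= (1.0 - l)
        prod_hi *= (1.0 - h)
    return (1.0 - prod_lo, 1.0 - prod_hi)
```

Sweeping alpha from 0 to 1 rebuilds the membership function of the top-event probability, without any Monte-Carlo sampling, which is where the speed advantage reported in the abstract comes from.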

  13. The Significance of an Excess in a Counting Experiment: Assessing the Impact of Systematic Uncertainties and the Case with a Gaussian Background

    Science.gov (United States)

    Vianello, Giacomo

    2018-05-01

    Several experiments in high-energy physics and astrophysics can be treated as on/off measurements, where an observation potentially containing a new source or effect (“on” measurement) is contrasted with a background-only observation free of the effect (“off” measurement). In counting experiments, the significance of the new source or effect can be estimated with a widely used formula from Li & Ma, which assumes that both measurements are Poisson random variables. In this paper we study three other cases: (i) the ideal case where the background measurement has no uncertainty, which can be used to study the maximum sensitivity that an instrument can achieve, (ii) the case where the background estimate b in the off measurement has an additional systematic uncertainty, and (iii) the case where b is a Gaussian random variable instead of a Poisson random variable. The latter case applies when b comes from a model fitted on archival or ancillary data, or from the interpolation of a function fitted on data surrounding the candidate new source/effect. Practitioners typically use a formula that is only valid when b is large and when its uncertainty is very small, while we derive a general formula that can be applied in all regimes. We also develop simple methods that can be used to assess how much an estimate of significance is sensitive to systematic uncertainties on the efficiency or on the background. Examples of applications include the detection of short gamma-ray bursts and of new X-ray or γ-ray sources. All the techniques presented in this paper are made available in a Python code that is ready to use.
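The Li & Ma formula for the Poisson on/off case (their Eq. 17) is compact enough to sketch directly; here alpha is the ratio of on-source to off-source exposure.

```python
import math

def li_ma_significance(n_on, n_off, alpha):
    """Li & Ma (1983, Eq. 17) significance in standard deviations for an
    on/off counting experiment with n_on and n_off observed counts and
    exposure ratio alpha = t_on / t_off."""
    total = n_on + n_off
    term_on = n_on * math.log((1.0 + alpha) / alpha * n_on / total)
    term_off = n_off * math.log((1.0 + alpha) * n_off / total)
    return math.sqrt(2.0 * (term_on + term_off))
```

When the on-region counts are exactly the background expectation (e.g. equal counts with alpha = 1), the significance is zero, as it should be; the paper's contribution is extending this kind of estimate to the ideal-background, systematic-uncertainty, and Gaussian-background cases.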

  14. Conquering complexity - Dealing with uncertainty and ambiguity in water management

    NARCIS (Netherlands)

    Hommes, Saskia

    2008-01-01

    Water management problems are embedded in a natural and social system that is characterized by complexity. Knowledge uncertainty and the existence of divergent actors’ perceptions contribute to this complexity. Consequently, dealing with water management issues is not just a knowledge uncertainty

  15. Cassini Spacecraft Uncertainty Analysis Data and Methodology Review and Update/Volume 1: Updated Parameter Uncertainty Models for the Consequence Analysis

    Energy Technology Data Exchange (ETDEWEB)

    WHEELER, TIMOTHY A.; WYSS, GREGORY D.; HARPER, FREDERICK T.

    2000-11-01

    Uncertainty distributions for specific parameters of the Cassini General Purpose Heat Source Radioisotope Thermoelectric Generator (GPHS-RTG) Final Safety Analysis Report consequence risk analysis were revised and updated. The revisions and updates were done for all consequence parameters for which relevant information exists from the joint project on Probabilistic Accident Consequence Uncertainty Analysis by the United States Nuclear Regulatory Commission and the Commission of European Communities.

  16. Uncertainty visualisation in the Model Web

    Science.gov (United States)

    Gerharz, L. E.; Autermann, C.; Hopmann, H.; Stasch, C.; Pebesma, E.

    2012-04-01

    Visualisation of geospatial data as maps is a common way to communicate spatially distributed information. If temporal and, furthermore, uncertainty information are included in the data, efficient visualisation methods are required. Numerous visualisation methods for uncertain spatial and spatio-temporal data have been developed and proposed, but only a few tools visualise such data in a standardised way. Furthermore, they are usually realised as thick clients and lack functionality for handling data coming from web services, as envisaged in the Model Web. We present an interactive web tool for visualisation of uncertain spatio-temporal data developed in the UncertWeb project. The client is based on the OpenLayers JavaScript library. OpenLayers provides standard map windows and navigation tools, i.e. pan and zoom in/out, to allow interactive control for the user. Further interactive methods are implemented using jStat, a JavaScript library for statistics plots developed in UncertWeb, and flot. To integrate the uncertainty information into existing standards for geospatial data, the Uncertainty Markup Language (UncertML) was applied in combination with OGC Observations & Measurements 2.0 and JavaScript Object Notation (JSON) encodings for vector data and NetCDF for raster data. The client offers methods to visualise uncertain vector and raster data with temporal information. The uncertainty information considered for the tool comprises probabilistic and quantified attribute uncertainties, which can be provided as realisations or samples, full probability distribution functions, and statistics. Visualisation is supported for uncertain continuous and categorical data. In the client, the visualisation is realised using a combination of different methods. Based on previously conducted usability studies, a differentiation between expert (in statistics or mapping) and non-expert users has been indicated as useful. Therefore, two different modes are realised together in the tool.


  17. Potential effects of organizational uncertainty on safety

    Energy Technology Data Exchange (ETDEWEB)

    Durbin, N.E. [MPD Consulting Group, Kirkland, WA (United States); Lekberg, A. [Swedish Nuclear Power Inspectorate, Stockholm (Sweden); Melber, B.D. [Melber Consulting, Seattle WA (United States)

    2001-12-01

    When organizations face significant change - reorganization, mergers, acquisitions, downsizing, plant closures or decommissioning - both the organizations and the workers in those organizations experience significant uncertainty about the future. This uncertainty affects the organization and the people working in it - adversely affecting morale, reducing concentration on safe operations, and resulting in the loss of key staff. Hence, organizations facing significant change, particularly those using high-risk technologies, need to consider and plan for the effects of organizational uncertainty on safety - as well as planning for other consequences of change: technical, economic, emotional, and productivity related. This paper reviews some of what is known about the effects of uncertainty on organizations and individuals, discusses the potential consequences of uncertainty on organizational and individual behavior, and presents some of the implications for safety professionals.

  18. Potential effects of organizational uncertainty on safety

    International Nuclear Information System (INIS)

    Durbin, N.E.; Lekberg, A.; Melber, B.D.

    2001-12-01

    When organizations face significant change - reorganization, mergers, acquisitions, downsizing, plant closures or decommissioning - both the organizations and the workers in those organizations experience significant uncertainty about the future. This uncertainty affects the organization and the people working in it - adversely affecting morale, reducing concentration on safe operations, and resulting in the loss of key staff. Hence, organizations facing significant change, particularly those using high-risk technologies, need to consider and plan for the effects of organizational uncertainty on safety - as well as planning for other consequences of change: technical, economic, emotional, and productivity related. This paper reviews some of what is known about the effects of uncertainty on organizations and individuals, discusses the potential consequences of uncertainty on organizational and individual behavior, and presents some of the implications for safety professionals.

  19. A probabilistic approach to cost and duration uncertainties in environmental decisions

    International Nuclear Information System (INIS)

    Boak, D.M.; Painton, L.

    1996-01-01

    Sandia National Laboratories has developed a method for analyzing life-cycle costs using probabilistic cost forecasting and utility theory to determine the most cost-effective alternatives for safe interim storage of radioactive materials. The method explicitly incorporates uncertainties in cost and storage duration by (1) treating uncertain component costs as random variables represented by probability distributions, (2) treating uncertain durations as chance nodes in a decision tree, and (3) using stochastic simulation tools to generate life-cycle cost forecasts for each storage alternative. The method applies utility functions to the forecasted costs to incorporate the decision maker's risk preferences, making it possible to compare alternatives on the basis of both cost and cost utility. Finally, the method is used to help identify key contributors to the uncertainty in forecasted costs, to focus efforts aimed at reducing cost uncertainties. Where significant cost and duration uncertainties exist, and where programmatic decisions must be made despite these uncertainties, probabilistic forecasting techniques can yield important insights into decision alternatives, especially when used as part of a larger decision analysis framework and when properly balanced with deterministic analyses. Although the method is built around an interim storage example, it is potentially applicable to many other environmental decision problems.
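The three-step recipe in this abstract (cost distributions, a duration chance node, stochastic simulation, then a utility transform) can be illustrated with a minimal Monte Carlo sketch. All distributions, cost figures, and the risk-aversion parameter below are hypothetical placeholders, not values from the Sandia study:

```python
import math
import random
import statistics

random.seed(42)

def life_cycle_cost():
    """One stochastic forecast of total cost (k$) for a storage alternative."""
    construction = random.triangular(800, 1500, 1000)  # hypothetical cost (k$)
    annual_ops = random.triangular(40, 90, 60)         # hypothetical cost (k$/yr)
    # Uncertain storage duration modelled as a chance node with three branches
    duration = random.choices([10, 20, 30], weights=[0.3, 0.5, 0.2])[0]
    return construction + annual_ops * duration

def utility(cost, r=0.001):
    """Exponential utility encoding the decision maker's risk aversion."""
    return -math.exp(r * cost)

costs = [life_cycle_cost() for _ in range(10_000)]
mean_cost = statistics.fmean(costs)
expected_utility = statistics.fmean(utility(c) for c in costs)
```

Running the same simulation for each alternative and ranking by expected utility (rather than mean cost alone) reproduces the basic comparison the method describes.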

  20. How uncertainty analysis of streamflow data can reduce costs and promote robust decisions in water management applications

    Science.gov (United States)

    McMillan, Hilary; Seibert, Jan; Petersen-Overleir, Asgeir; Lang, Michel; White, Paul; Snelder, Ton; Rutherford, Kit; Krueger, Tobias; Mason, Robert; Kiang, Julie

    2017-07-01

    Streamflow data are used for important environmental and economic decisions, such as specifying and regulating minimum flows, managing water supplies, and planning for flood hazards. Despite significant uncertainty in most flow data, the flow series for these applications are often communicated and used without uncertainty information. In this commentary, we argue that proper analysis of uncertainty in river flow data can reduce costs and promote robust conclusions in water management applications. We substantiate our argument by providing case studies from Norway and New Zealand where streamflow uncertainty analysis has uncovered economic costs in the hydropower industry, improved public acceptance of a controversial water management policy, and tested the accuracy of water quality trends. We discuss the need for practical uncertainty assessment tools that generate multiple flow series realizations rather than simple error bounds. Although examples of such tools are in development, considerable barriers for uncertainty analysis and communication still exist for practitioners, and future research must aim to provide easier access and usability of uncertainty estimates. We conclude that flow uncertainty analysis is critical for good water management decisions.
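The commentary's call for tools that generate multiple flow series realizations (rather than simple error bounds) can be sketched as follows. The power-law rating curve and its parameter uncertainties are illustrative assumptions, not values from the Norwegian or New Zealand case studies:

```python
import math
import random

random.seed(0)

def flow_realizations(stage_series, n=100):
    """Generate an ensemble of flow series from rating-curve uncertainty.

    Assumes a hypothetical power-law rating Q = a * h**b with uncertain
    parameters. Each realization draws one consistent parameter set, so
    errors are correlated in time - which is exactly what independent
    per-point error bars fail to capture.
    """
    ensemble = []
    for _ in range(n):
        a = random.lognormvariate(math.log(5.0), 0.1)  # uncertain coefficient
        b = random.gauss(1.6, 0.05)                    # uncertain exponent
        ensemble.append([a * h ** b for h in stage_series])
    return ensemble

ensemble = flow_realizations([0.5, 1.0, 2.0], n=200)
```

Downstream analyses (e.g. a water-quality trend test) can then be repeated on each realization to see whether conclusions are robust to flow-data uncertainty.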

  1. A Research on Uncertainty Evaluation in Verification and Calibration on LSC facility

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seung-Jin; Park, Eung-Seop; Kim, Hee-Gang [Yeong Gwang NPP Supervisory Center for Environment Radiation and Safety, Yeonggwang (Korea, Republic of); Han, Sang-Jun [Chosun Univ., Gwangju (Korea, Republic of)

    2007-10-15

    When a Liquid Scintillation Counter is calibrated with a solid H-3 standard source of 200,000 DPM (disintegrations per minute), an uncertainty arises from the geometry difference relative to the environmental samples collected around a nuclear power plant. This paper therefore investigates the root cause of this geometry-related uncertainty using a Quantulus 1220 instrument and H-3 standard sources in both solid and liquid form, with a Teflon vial used as the measurement cell. The main factors judged to cause the geometry-difference uncertainty are the plastic cell inside the Teflon vial, the difference in activity, and the difference in configuration of the H-3 standard sources; these factors are evaluated through experiment and measurement.

  2. A Research on Uncertainty Evaluation in Verification and Calibration on LSC facility

    International Nuclear Information System (INIS)

    Lee, Seung-Jin; Park, Eung-Seop; Kim, Hee-Gang; Han, Sang-Jun

    2007-01-01

    When a Liquid Scintillation Counter is calibrated with a solid H-3 standard source of 200,000 DPM (disintegrations per minute), an uncertainty arises from the geometry difference relative to the environmental samples collected around a nuclear power plant. This paper therefore investigates the root cause of this geometry-related uncertainty using a Quantulus 1220 instrument and H-3 standard sources in both solid and liquid form, with a Teflon vial used as the measurement cell. The main factors judged to cause the geometry-difference uncertainty are the plastic cell inside the Teflon vial, the difference in activity, and the difference in configuration of the H-3 standard sources; these factors are evaluated through experiment and measurement.

  3. Quantification of Safety-Critical Software Test Uncertainty

    International Nuclear Information System (INIS)

    Khalaquzzaman, M.; Cho, Jaehyun; Lee, Seung Jun; Jung, Wondea

    2015-01-01

    The method conservatively assumes that the failure probability of a software for the untested inputs is 1, and that the failure probability becomes 0 when all test cases have been passed. In reality, however, a chance of failure remains because of test uncertainty. Some studies have been carried out to identify the test attributes that affect test quality. Cao discussed the testing effort, testing coverage, and testing environment; the management of test uncertainties has also been discussed in the literature. In this study, test uncertainty has been considered in estimating the software failure probability, because the software testing process is inherently uncertain. A reliability estimate of software is very important for the probabilistic safety analysis of a digital safety-critical system of NPPs. This study focused on estimating the probability of a software failure while accounting for the uncertainty in software testing. In our study, a BBN has been employed as an example model for software test uncertainty quantification. Although it can be argued that direct expert elicitation of test uncertainty is much simpler than BBN estimation, the BBN approach provides more insights and a basis for uncertainty estimation.

  4. Accounting for uncertainty in evaluating water quality impacts of urban development plan

    International Nuclear Information System (INIS)

    Zhou Jiquan; Liu Yi; Chen Jining

    2010-01-01

    The implementation of urban development plans causes land-use change, which can have significant environmental impacts. In light of this, environmental concerns should be considered sufficiently at an early stage of the planning process. However, uncertainties in urban development plans hamper the application of strategic environmental assessment, which is used to evaluate the environmental impacts of policies, plans and programs. This study develops an integrated assessment method based on accounting for the uncertainty of environmental impacts. The proposed method consists of four main steps: (1) designing scenarios of economic scale and industrial structure, (2) sampling possible land-use layouts, (3) evaluating each sample's environmental impact, and (4) identifying environmentally sensitive industries. In doing so, the uncertainties of environmental impacts can be accounted for. The environmental risk, overall environmental pressure and potential extreme environmental impact of urban development plans can then be analyzed, and environmentally sensitive factors can be identified, especially under considerations of uncertainty. This can help decision-makers strengthen environmental considerations and take measures at an early stage of decision-making.

  5. Habitable zone dependence on stellar parameter uncertainties

    International Nuclear Information System (INIS)

    Kane, Stephen R.

    2014-01-01

    An important property of exoplanetary systems is the extent of the Habitable Zone (HZ), defined as that region where water can exist in a liquid state on the surface of a planet with sufficient atmospheric pressure. Both ground- and space-based observations have revealed a plethora of confirmed exoplanets and exoplanetary candidates, most notably from the Kepler mission using the transit detection technique. Many of these detected planets lie within the predicted HZ of their host star. However, as is the case with the derived properties of the planets themselves, the HZ boundaries depend on how well we understand the host star. Here we quantify the dependence of HZ boundary uncertainties on the parameter uncertainties of the host star. We examine the distribution of stellar parameter uncertainties from confirmed exoplanet hosts and Kepler candidate hosts and translate these into HZ boundary uncertainties. We apply this to several known systems with an HZ planet to determine the uncertainty in their HZ status.

  6. Habitable zone dependence on stellar parameter uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Kane, Stephen R., E-mail: skane@sfsu.edu [Department of Physics and Astronomy, San Francisco State University, 1600 Holloway Avenue, San Francisco, CA 94132 (United States)

    2014-02-20

    An important property of exoplanetary systems is the extent of the Habitable Zone (HZ), defined as that region where water can exist in a liquid state on the surface of a planet with sufficient atmospheric pressure. Both ground- and space-based observations have revealed a plethora of confirmed exoplanets and exoplanetary candidates, most notably from the Kepler mission using the transit detection technique. Many of these detected planets lie within the predicted HZ of their host star. However, as is the case with the derived properties of the planets themselves, the HZ boundaries depend on how well we understand the host star. Here we quantify the dependence of HZ boundary uncertainties on the parameter uncertainties of the host star. We examine the distribution of stellar parameter uncertainties from confirmed exoplanet hosts and Kepler candidate hosts and translate these into HZ boundary uncertainties. We apply this to several known systems with an HZ planet to determine the uncertainty in their HZ status.
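The translation of stellar parameter uncertainties into HZ boundary uncertainties can be illustrated with a first-order sketch based on the common scaling d = sqrt(L / S_eff). The S_eff values below are illustrative (of the order of those used for inner/outer HZ limits in the literature), and the dependence of S_eff on the stellar effective temperature is neglected:

```python
import math

def hz_distance(luminosity, s_eff):
    """HZ boundary distance in AU: d = sqrt(L / S_eff), with L in solar units."""
    return math.sqrt(luminosity / s_eff)

def hz_distance_uncertainty(luminosity, sigma_l, s_eff):
    """First-order error propagation: sigma_d = d * 0.5 * sigma_L / L.

    Simplified sketch: S_eff itself depends on T_eff, which is ignored here.
    """
    d = hz_distance(luminosity, s_eff)
    return d * 0.5 * sigma_l / luminosity

# Sun-like star with a 10% luminosity uncertainty; illustrative S_eff values
inner = hz_distance(1.0, 1.1)    # inner HZ boundary, ~0.95 AU
outer = hz_distance(1.0, 0.35)   # outer HZ boundary, ~1.69 AU
sigma_inner = hz_distance_uncertainty(1.0, 0.1, 1.1)
```

A 10% luminosity uncertainty thus maps to a ~5% uncertainty in each boundary distance, which can be enough to move a planet in or out of the nominal HZ.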

  7. Accounting for Epistemic Uncertainty in Mission Supportability Assessment: A Necessary Step in Understanding Risk and Logistics Requirements

    Science.gov (United States)

    Owens, Andrew; De Weck, Olivier L.; Stromgren, Chel; Goodliff, Kandyce; Cirillo, William

    2017-01-01

    Future crewed missions to Mars present a maintenance logistics challenge that is unprecedented in human spaceflight. Mission endurance – defined as the time between resupply opportunities – will be significantly longer than previous missions, and therefore logistics planning horizons are longer and the impact of uncertainty is magnified. Maintenance logistics forecasting typically assumes that component failure rates are deterministically known and uses them to represent aleatory uncertainty, or uncertainty that is inherent to the process being examined. However, failure rates cannot be directly measured; rather, they are estimated based on similarity to other components or statistical analysis of observed failures. As a result, epistemic uncertainty – that is, uncertainty in knowledge of the process – exists in failure rate estimates that must be accounted for. Analyses that neglect epistemic uncertainty tend to significantly underestimate risk. Epistemic uncertainty can be reduced via operational experience; for example, the International Space Station (ISS) failure rate estimates are refined using a Bayesian update process. However, design changes may re-introduce epistemic uncertainty. Thus, there is a tradeoff between changing a design to reduce failure rates and operating a fixed design to reduce uncertainty. This paper examines the impact of epistemic uncertainty on maintenance logistics requirements for future Mars missions, using data from the ISS Environmental Control and Life Support System (ECLS) as a baseline for a case study. Sensitivity analyses are performed to investigate the impact of variations in failure rate estimates and epistemic uncertainty on spares mass. The results of these analyses and their implications for future system design and mission planning are discussed.
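The Bayesian update process mentioned for ISS failure-rate estimates is commonly implemented as a conjugate Gamma-Poisson update, under which epistemic spread shrinks as operating experience accumulates. A minimal sketch (the prior and the observed data below are invented for illustration, not ISS ECLS values):

```python
def gamma_poisson_update(alpha, beta, failures, exposure_hours):
    """Conjugate Bayesian update of a failure-rate estimate.

    Prior     : rate ~ Gamma(alpha, beta), mean alpha/beta failures per hour
    Data      : `failures` observed over `exposure_hours` of operation
    Posterior : Gamma(alpha + failures, beta + exposure_hours)
    """
    return alpha + failures, beta + exposure_hours

# Vague prior: mean 1e-4 per hour with large epistemic spread
a0, b0 = 0.5, 5000.0
prior_cv = 1 / a0 ** 0.5                 # coefficient of variation ~1.41

# Observe 2 failures over 50,000 hours of operating experience
a, b = gamma_poisson_update(a0, b0, failures=2, exposure_hours=50000.0)
mean_rate = a / b                        # posterior mean failure rate
posterior_cv = 1 / a ** 0.5              # epistemic spread has shrunk
```

The coefficient of variation 1/sqrt(alpha) captures the paper's tradeoff: a design change that resets the data resets alpha toward its prior value, re-inflating epistemic uncertainty even if the new design's true failure rate is lower.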

  8. Information-theoretic approach to uncertainty importance

    International Nuclear Information System (INIS)

    Park, C.K.; Bari, R.A.

    1985-01-01

    A method is presented for importance analysis in probabilistic risk assessments (PRA) for which the results of interest are characterized by full uncertainty distributions and not just point estimates. The method is based on information theory, in which entropy is a measure of the uncertainty of a probability density function. We define the relative uncertainty importance between two events as the ratio of the two exponents of the entropies. For the log-normal and log-uniform distributions the importance measure comprises the median (central tendency) and the logarithm of the error factor (uncertainty). Thus, if accident sequences are ranked this way, and the error factors are not all equal, then a different rank order would result than if the sequences were ranked by the central tendency measure alone. As an illustration, the relative importance of internal events and in-plant fires was computed on the basis of existing PRA results.
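For lognormal uncertainty distributions the measure described here can be sketched directly. The sketch below assumes the common 95% error-factor convention sigma = ln(EF)/1.645 and the standard entropy of a lognormal; the exact definition in the paper may differ in detail:

```python
import math

def lognormal_exp_entropy(median, error_factor):
    """exp(entropy) of a lognormal given its median and 95% error factor.

    Entropy of a lognormal: H = ln(median) + 0.5 * ln(2*pi*e*sigma^2),
    so exp(H) = median * sigma * sqrt(2*pi*e), with sigma = ln(EF)/1.645.
    """
    sigma = math.log(error_factor) / 1.645
    return median * sigma * math.sqrt(2 * math.pi * math.e)

def relative_importance(median1, ef1, median2, ef2):
    """Ratio of the exponents of the entropies (the abstract's measure)."""
    return lognormal_exp_entropy(median1, ef1) / lognormal_exp_entropy(median2, ef2)
```

Note that for two sequences with equal medians the ratio reduces to ln(EF1)/ln(EF2), so a wider error factor alone can change the ranking, exactly the effect the abstract describes.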

  9. Statistically based uncertainty assessments in nuclear risk analysis

    International Nuclear Information System (INIS)

    Spencer, F.W.; Diegert, K.V.; Easterling, R.G.

    1987-01-01

    Over the last decade, the problems of estimation and uncertainty assessment in probabilistic risk assessments (PRAs) have been addressed in a variety of NRC and industry-sponsored projects. These problems have received attention because of a recognition that major uncertainties in risk estimation exist, which can be reduced by collecting more and better data and other information, and because of a recognition that better methods for assessing these uncertainties are needed. In particular, a clear understanding of the nature and magnitude of various sources of uncertainty is needed to facilitate decision-making on possible plant changes and research options. Recent PRAs have employed methods of probability propagation, sometimes involving the use of Bayes' theorem, intended to formalize the use of ''engineering judgment'' or ''expert opinion.'' All sources of uncertainty are expressed probabilistically, so that uncertainty analysis becomes simply a matter of probability propagation. Alternatives to forcing a probabilistic framework at all stages of a PRA are a major concern in this paper, however.

  10. Uncertainty in project phases: A framework for organisational change management

    DEFF Research Database (Denmark)

    Kreye, Melanie; Balangalibun, Sarah

    2015-01-01

    Uncertainty is an integral challenge when managing organisational change projects (OCPs). Current literature highlights the importance of uncertainty but falls short of giving insights into its nature and suggestions for managing it. Specifically, no insights exist on how uncertainty develops over the different phases of OCPs. This paper presents case-based evidence on different sources of uncertainty in OCPs and how these develop over the different project phases. The results showed some surprising findings, as the majority of the uncertainty did not manifest itself in the early stage of the change project but was delayed until later phases. Furthermore, the sources of uncertainty were found to be predominantly within the organisation that initiated the change project and connected to the project scope. Based on these findings, propositions for future research are defined.

  11. A novel dose uncertainty model and its application for dose verification

    International Nuclear Information System (INIS)

    Jin Hosang; Chung Heetaek; Liu Chihray; Palta, Jatinder; Suh, Tae-Suk; Kim, Siyong

    2005-01-01

    Based on a statistical approach, a novel dose uncertainty model was introduced that considers both nonspatial and spatial dose deviations. Non-space-oriented uncertainty is mainly caused by dosimetric uncertainties, and space-oriented dose uncertainty is caused by all spatial displacements. Assuming these two parts are independent, the dose difference between measurement and calculation is a linear combination of nonspatial and spatial dose uncertainties. Two assumptions were made: (1) the relative standard deviation of nonspatial dose uncertainty is inversely proportional to the dose standard deviation σ, and (2) the spatial dose uncertainty is proportional to the gradient of the dose. The total dose uncertainty is a quadratic sum of the nonspatial and spatial uncertainties. The uncertainty model provides the tolerance dose bound for comparison between calculation and measurement. In the statistical uncertainty model based on a Gaussian distribution, a confidence level of 3σ theoretically confines 99.74% of measurements within the bound. By setting the confidence limit, the tolerance bound for dose comparison can be made analogous to that of existing dose comparison methods (e.g., a composite distribution analysis, a γ test, a χ evaluation, and a normalized agreement test method). However, the model considers the inherent dose uncertainty characteristics of the test points by taking into account the space-specific history of dose accumulation, while the previous methods apply a single tolerance criterion to all points, although the dose uncertainty at each point differs significantly from the others. Three types of one-dimensional test dose distributions (a single large field, a composite flat field made by two identical beams, and three-beam intensity-modulated fields) were made to verify the robustness of the model. For each test distribution, the dose bound predicted by the uncertainty model was compared with simulated measurements. The simulated
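The quadratic-sum structure of the model can be sketched as follows. The 2% nonspatial uncertainty and 2 mm effective displacement are invented placeholders, and the nonspatial term is simplified to a constant relative uncertainty rather than the paper's σ-dependent form:

```python
import math

def dose_uncertainty(dose, grad_dose, sigma_ns_rel=0.02, delta_r=0.2):
    """Total dose uncertainty as a quadratic sum (sketch of the model's form).

    dose         : local dose (e.g. cGy)
    grad_dose    : local dose gradient (e.g. cGy/cm)
    sigma_ns_rel : relative nonspatial (dosimetric) uncertainty  [assumed 2%]
    delta_r      : effective spatial displacement in cm          [assumed 2 mm]
    The spatial term scales with the local dose gradient, so steep-gradient
    points tolerate larger dose differences than flat regions.
    """
    sigma_ns = sigma_ns_rel * dose
    sigma_sp = delta_r * abs(grad_dose)
    return math.sqrt(sigma_ns ** 2 + sigma_sp ** 2)

def tolerance_bound(dose, grad_dose, n_sigma=3):
    """3-sigma bound theoretically confining ~99.7% of measurements."""
    return n_sigma * dose_uncertainty(dose, grad_dose)
```

This point-by-point bound is what distinguishes the model from methods that apply a single tolerance criterion to every test point.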

  12. Uncertainty Propagation in Hypersonic Vehicle Aerothermoelastic Analysis

    Science.gov (United States)

    Lamorte, Nicolas Etienne

    Hypersonic vehicles face a challenging flight environment. The aerothermoelastic analysis of their components requires numerous simplifying approximations. Identifying and quantifying the effect of uncertainties pushes the limits of the existing deterministic models, and is pursued in this work. An uncertainty quantification framework is used to propagate the effects of identified uncertainties on the stability margins and performance of the different systems considered. First, the aeroelastic stability of a typical section representative of a control surface on a hypersonic vehicle is examined. Variability in the uncoupled natural frequencies of the system is modeled to mimic the effect of aerodynamic heating. Next, the stability of an aerodynamically heated panel representing a component of the skin of a generic hypersonic vehicle is considered. Uncertainty in the location of transition from laminar to turbulent flow and in the heat flux prediction is quantified using CFD. In both cases significant reductions of the stability margins are observed. A loosely coupled airframe-integrated scramjet engine is considered next. The elongated body and cowl of the engine flow path are subject to harsh aerothermodynamic loading which causes them to deform. Uncertainty associated with deformation prediction is propagated to the engine performance analysis. The cowl deformation is the main contributor to the sensitivity of the propulsion system performance. Finally, a framework for aerothermoelastic stability boundary calculation for hypersonic vehicles using CFD is developed. The use of CFD enables one to consider different turbulence conditions, laminar or turbulent, and different models of the air mixture, in particular a real gas model which accounts for dissociation of molecules at high temperature. The system is found to be sensitive to turbulence modeling as well as the location of the transition from laminar to turbulent flow. Real gas effects play a minor role in the

  13. Characterizing Epistemic Uncertainty for Launch Vehicle Designs

    Science.gov (United States)

    Novack, Steven D.; Rogers, Jim; Hark, Frank; Al Hassan, Mohammad

    2016-01-01

    NASA Probabilistic Risk Assessment (PRA) has the task of estimating the aleatory (randomness) and epistemic (lack of knowledge) uncertainty of launch vehicle loss of mission and crew risk and communicating the results. Launch vehicles are complex engineered systems designed with sophisticated subsystems that are built to work together to accomplish mission success. Some of these systems or subsystems are in the form of heritage equipment, while some have never been previously launched. For these cases, characterizing the epistemic uncertainty is of foremost importance, and it is anticipated that the epistemic uncertainty of a modified launch vehicle design would be greater than that of a design of well-understood heritage equipment. For reasons that will be discussed, standard uncertainty propagation methods using Monte Carlo simulation produce counterintuitive results and significantly underestimate epistemic uncertainty for launch vehicle models. Furthermore, standard PRA methods such as Uncertainty-Importance analyses, used to identify components that are significant contributors to uncertainty, are rendered obsolete since sensitivity to uncertainty changes is not reflected in propagation of uncertainty using Monte Carlo methods. This paper provides a basis of the uncertainty underestimation for complex systems and especially, due to nuances of launch vehicle logic, for launch vehicles. It then suggests several alternative methods for estimating uncertainty and provides examples of estimation results. Lastly, the paper shows how to implement an Uncertainty-Importance analysis using one alternative approach, describes the results, and suggests ways to reduce epistemic uncertainty by focusing on additional data or testing of selected components.

  14. Risk Assessment and Decision-Making under Uncertainty in Tunnel and Underground Engineering

    Directory of Open Access Journals (Sweden)

    Yuanpu Xia

    2017-10-01

    The impact of uncertainty on risk assessment and decision-making is increasingly being prioritized, especially for large geotechnical projects such as tunnels, where uncertainty is often the main source of risk. Epistemic uncertainty, which can be reduced, is the focus of attention. In this study, the existing entropy-risk decision model is first discussed and analyzed, and its deficiencies are addressed. The study then notes that existing work considers only parameter uncertainty and ignores the influence of model uncertainty; the focus here is therefore on model uncertainty and on differences in risk consciousness among decision-makers, for which utility theory is introduced into the model. Finally, a risk decision model is proposed based on sensitivity analysis and the tolerance cost, which can improve decision-making efficiency. This research can provide guidance or a reference for the evaluation of and decision-making on complex systems engineering problems, and indicates a direction for further research on risk assessment and decision-making issues.

  15. Uncertainties in Safety Analysis. A literature review

    International Nuclear Information System (INIS)

    Ekberg, C.

    1995-05-01

    The purpose of the presented work has been to give a short summary of the origins of many uncertainties arising in the design and performance assessment of a repository for spent nuclear fuel. Some different methods to treat these uncertainties are also included. The methods and conclusions are in many cases general, in the sense that they are applicable to many other disciplines where simulations are used. In conclusion, it may be noted that uncertainties of different origin have been discussed and debated, but one large group, e.g. computer simulations, for which methods for a more explicit investigation exist, has not been investigated in a satisfactory way. 50 refs

  16. Uncertainties in Safety Analysis. A literature review

    Energy Technology Data Exchange (ETDEWEB)

    Ekberg, C [Chalmers Univ. of Technology, Goeteborg (Sweden). Dept. of Nuclear Chemistry

    1995-05-01

    The purpose of the presented work has been to give a short summary of the origins of many uncertainties arising in the design and performance assessment of a repository for spent nuclear fuel. Some different methods to treat these uncertainties are also included. The methods and conclusions are in many cases general, in the sense that they are applicable to many other disciplines where simulations are used. In conclusion, it may be noted that uncertainties of different origin have been discussed and debated, but one large group, e.g. computer simulations, for which methods for a more explicit investigation exist, has not been investigated in a satisfactory way. 50 refs.

  17. Uncertainty quantification in flood risk assessment

    Science.gov (United States)

    Blöschl, Günter; Hall, Julia; Kiss, Andrea; Parajka, Juraj; Perdigão, Rui A. P.; Rogger, Magdalena; Salinas, José Luis; Viglione, Alberto

    2017-04-01

    Uncertainty is inherent to flood risk assessments because of the complexity of the human-water system, which is characterised by nonlinearities and interdependencies, because of limited knowledge about system properties and because of cognitive biases in human perception and decision-making. On top of the uncertainty associated with the assessment of the existing risk to extreme events, additional uncertainty arises because of temporal changes in the system due to climate change, modifications of the environment, population growth and the associated increase in assets. Novel risk assessment concepts are needed that take into account all these sources of uncertainty. They should be based on the understanding of how flood extremes are generated and how they change over time. They should also account for the dynamics of risk perception of decision makers and population in the floodplains. In this talk we discuss these novel risk assessment concepts through examples from Flood Frequency Hydrology, Socio-Hydrology and Predictions Under Change. We believe that uncertainty quantification in flood risk assessment should lead to a robust approach of integrated flood risk management aiming at enhancing resilience rather than searching for optimal defense strategies.

  18. Predictive uncertainty in auditory sequence processing

    Directory of Open Access Journals (Sweden)

    Niels Chr. Hansen

    2014-09-01

    Full Text Available Previous studies of auditory expectation have focused on the expectedness perceived by listeners retrospectively in response to events. In contrast, this research examines predictive uncertainty - a property of listeners’ prospective state of expectation prior to the onset of an event. We examine the information-theoretic concept of Shannon entropy as a model of predictive uncertainty in music cognition. This is motivated by the Statistical Learning Hypothesis, which proposes that schematic expectations reflect probabilistic relationships between sensory events learned implicitly through exposure. Using probability estimates from an unsupervised, variable-order Markov model, 12 melodic contexts high in entropy and 12 melodic contexts low in entropy were selected from two musical repertoires differing in structural complexity (simple and complex). Musicians and non-musicians listened to the stimuli and provided explicit judgments of perceived uncertainty (explicit uncertainty). We also examined an indirect measure of uncertainty computed as the entropy of expectedness distributions obtained using a classical probe-tone paradigm where listeners rated the perceived expectedness of the final note in a melodic sequence (inferred uncertainty). Finally, we simulate listeners’ perception of expectedness and uncertainty using computational models of auditory expectation. A detailed model comparison indicates which model parameters maximize fit to the data and how they compare to existing models in the literature. The results show that listeners experience greater uncertainty in high-entropy musical contexts than low-entropy contexts. This effect is particularly apparent for inferred uncertainty and is stronger in musicians than non-musicians. Consistent with the Statistical Learning Hypothesis, the results suggest that increased domain-relevant training is associated with an increasingly accurate cognitive model of probabilistic structure in music.

  19. Predictive uncertainty in auditory sequence processing.

    Science.gov (United States)

    Hansen, Niels Chr; Pearce, Marcus T

    2014-01-01

    Previous studies of auditory expectation have focused on the expectedness perceived by listeners retrospectively in response to events. In contrast, this research examines predictive uncertainty-a property of listeners' prospective state of expectation prior to the onset of an event. We examine the information-theoretic concept of Shannon entropy as a model of predictive uncertainty in music cognition. This is motivated by the Statistical Learning Hypothesis, which proposes that schematic expectations reflect probabilistic relationships between sensory events learned implicitly through exposure. Using probability estimates from an unsupervised, variable-order Markov model, 12 melodic contexts high in entropy and 12 melodic contexts low in entropy were selected from two musical repertoires differing in structural complexity (simple and complex). Musicians and non-musicians listened to the stimuli and provided explicit judgments of perceived uncertainty (explicit uncertainty). We also examined an indirect measure of uncertainty computed as the entropy of expectedness distributions obtained using a classical probe-tone paradigm where listeners rated the perceived expectedness of the final note in a melodic sequence (inferred uncertainty). Finally, we simulate listeners' perception of expectedness and uncertainty using computational models of auditory expectation. A detailed model comparison indicates which model parameters maximize fit to the data and how they compare to existing models in the literature. The results show that listeners experience greater uncertainty in high-entropy musical contexts than low-entropy contexts. This effect is particularly apparent for inferred uncertainty and is stronger in musicians than non-musicians. Consistent with the Statistical Learning Hypothesis, the results suggest that increased domain-relevant training is associated with an increasingly accurate cognitive model of probabilistic structure in music.
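The entropy measure at the core of the two records above is the standard Shannon entropy of a probability distribution over possible continuations. A minimal sketch (generic, not the authors' variable-order Markov pipeline; the example distributions are invented):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)) of a discrete distribution, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A uniform distribution over four continuations: maximal predictive uncertainty.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # → 2.0
# A peaked distribution: the next note is nearly certain, so entropy is low.
print(shannon_entropy([0.9, 0.05, 0.03, 0.02]))
```

High-entropy melodic contexts correspond to flat continuation distributions like the first example; low-entropy contexts to peaked ones like the second.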

  20. Interpolation in Time Series : An Introductive Overview of Existing Methods, Their Performance Criteria and Uncertainty Assessment

    NARCIS (Netherlands)

    Lepot, M.J.; Aubin, Jean Baptiste; Clemens, F.H.L.R.

    2017-01-01

    A thorough review has been performed on interpolation methods to fill gaps in time-series, efficiency criteria, and uncertainty quantifications. On one hand, there are numerous available methods: interpolation, regression, autoregressive, machine learning methods, etc. On the other hand, there are

  1. Uncertainties in the proton lifetime

    International Nuclear Information System (INIS)

    Ellis, J.; Nanopoulos, D.V.; Rudaz, S.; Gaillard, M.K.

    1980-04-01

    We discuss the masses of the leptoquark bosons m(X) and the proton lifetime in Grand Unified Theories based principally on SU(5). It is emphasized that estimates of m(X) based on the QCD coupling and the fine structure constant are probably more reliable than those using the experimental value of sin²θ_W. Uncertainties in the QCD Λ parameter and the correct value of α are discussed. We estimate higher order effects on the evolution of coupling constants in a momentum space renormalization scheme. It is shown that increasing the number of generations of fermions beyond the minimal three increases m(X) by almost a factor of 2 per generation. Additional uncertainties exist for each generation of technifermions that may exist. We discuss and discount the possibility that proton decay could be 'Cabibbo-rotated' away, and a speculation that Lorentz invariance may be violated in proton decay at a detectable level. We estimate that in the absence of any substantial new physics beyond that in the minimal SU(5) model the proton lifetime is 8 × 10^(30±2) years

  2. Uncertainty in hydrological signatures

    Science.gov (United States)

    McMillan, Hilary; Westerberg, Ida

    2015-04-01

    magnitude and bias, and to test how uncertainty depended on the density of the raingauge network and flow gauging station characteristics. The uncertainties were sometimes large (i.e. typical intervals of ±10-40% relative uncertainty) and highly variable between signatures. Uncertainty in the mean discharge was around ±10% for both catchments, while signatures describing the flow variability had much higher uncertainties in the Mahurangi where there was a fast rainfall-runoff response and greater high-flow rating uncertainty. Event and total runoff ratios had uncertainties from ±10% to ±15% depending on the number of rain gauges used; precipitation uncertainty was related to interpolation rather than point uncertainty. Uncertainty distributions in these signatures were skewed, and meant that differences in signature values between these catchments were often not significant. We hope that this study encourages others to use signatures in a way that is robust to data uncertainty.

  3. Some Implications of Two Forms of the Generalized Uncertainty Principle

    Directory of Open Access Journals (Sweden)

    Mohammed M. Khalil

    2014-01-01

    Full Text Available Various theories of quantum gravity predict the existence of a minimum length scale, which leads to the modification of the standard uncertainty principle to the Generalized Uncertainty Principle (GUP). In this paper, we study two forms of the GUP and calculate their implications on the energy of the harmonic oscillator and the hydrogen atom more accurately than previous studies. In addition, we show how the GUP modifies the Lorentz force law and the time-energy uncertainty principle.

  4. Uncertainty analysis of the FRAP code

    International Nuclear Information System (INIS)

    Peck, S.O.

    1978-01-01

    A user oriented, automated uncertainty analysis capability has been built into the Fuel Rod Analysis Program (FRAP) code and has been applied to a pressurized water reactor (PWR) fuel rod undergoing a loss-of-coolant accident (LOCA). The method of uncertainty analysis is the response surface method. The automated version significantly reduced the time required to complete the analysis and, at the same time, greatly increased the problem scope. Results of the analysis showed a significant difference in the total and relative contributions to the uncertainty of the response parameters between steady state and transient conditions
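The response surface method mentioned above replaces the expensive code with a cheap surrogate fitted to a small number of runs, then propagates input uncertainty through the surrogate. A minimal sketch (`expensive_code` is a hypothetical stand-in for a real simulation run, and the input distribution is invented for illustration):

```python
import random
import statistics
import numpy as np

# Hypothetical stand-in for one expensive code run (a real fuel-rod
# simulation would be far more complex).
def expensive_code(x):
    return 3.0 * x + 0.5 * x * x

# Step 1: run the code at a handful of design points and fit a quadratic
# response surface y ≈ c2*x² + c1*x + c0 to those runs.
design = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
runs = np.array([expensive_code(x) for x in design])
surrogate = np.poly1d(np.polyfit(design, runs, deg=2))

# Step 2: propagate the input uncertainty (here x ~ N(0, 0.3)) through the
# cheap surrogate instead of re-running the expensive code thousands of times.
random.seed(0)
samples = [float(surrogate(random.gauss(0.0, 0.3))) for _ in range(10_000)]
print(statistics.mean(samples), statistics.stdev(samples))
```

The gain is that the ten-thousand-sample Monte Carlo loop costs only surrogate evaluations; the expensive code was run just five times.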

  5. Uncertainty in the inelastic resonant scattering assisted by phonons

    International Nuclear Information System (INIS)

    Garcia, N.; Garcia-Sanz, J.; Solana, J.

    1977-01-01

    We have analyzed the inelastic minima observed in new results of He atoms scattered from LiF(001) surfaces. This is done considering bound state resonance processes assisted by phonons. The analysis presents large uncertainties. In the range of uncertainty, we find two 'possible' bands associated with the vibrations of F⁻ and Li⁺, respectively. Many more experimental data are necessary to confirm the existence of these processes

  6. The role of general relativity in the uncertainty principle

    International Nuclear Information System (INIS)

    Padmanabhan, T.

    1986-01-01

    The role played by general relativity in quantum mechanics (especially as regards the uncertainty principle) is investigated. It is confirmed that the validity of time-energy uncertainty does depend on gravitational time dilation. It is also shown that there exists an intrinsic lower bound to the accuracy with which acceleration due to gravity can be measured. The notion of the equivalence principle in quantum mechanics is clarified. (author)

  7. On the need for a time- and location-dependent estimation of the NDSI threshold value for reducing existing uncertainties in snow cover maps at different scales

    Science.gov (United States)

    Härer, Stefan; Bernhardt, Matthias; Siebers, Matthias; Schulz, Karsten

    2018-05-01

    Knowledge of current snow cover extent is essential for characterizing energy and moisture fluxes at the Earth's surface. The snow-covered area (SCA) is often estimated by using optical satellite information in combination with the normalized-difference snow index (NDSI). The NDSI classification uses a threshold to decide whether a satellite pixel is assumed to be snow covered or snow free. The spatiotemporal representativeness of the standard threshold of 0.4 is however questionable at the local scale. Here, we use local snow cover maps derived from ground-based photography to continuously calibrate the NDSI threshold values (NDSIthr) of Landsat satellite images at two European mountain sites over the period from 2010 to 2015. The Research Catchment Zugspitzplatt (RCZ, Germany) and Vernagtferner area (VF, Austria) are both located within a single Landsat scene. Nevertheless, the long-term analysis of the NDSIthr demonstrated that the NDSIthr values at these sites are not correlated (r = 0.17) and differ from the standard threshold of 0.4. For further comparison, a dynamic and locally optimized NDSI threshold was used, as well as another locally optimized literature threshold value (0.7). It was shown that large uncertainties in the prediction of the SCA of up to 24.1 % exist in satellite snow cover maps in cases where the standard threshold of 0.4 is used, but a newly developed calibrated quadratic polynomial model which accounts for seasonal threshold dynamics can reduce this error. The model minimizes the SCA uncertainties at the calibration site VF by 50 % in the evaluation period and also significantly improved the results at RCZ. Additionally, a scaling experiment shows that the positive effect of a locally adapted threshold diminishes using a pixel size of 500 m or larger, underlining the general applicability of the standard threshold at larger scales.
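The NDSI thresholding described above is simple to express in code. A minimal sketch (the reflectance values are invented; the index and the 0.4/0.7 thresholds follow the abstract, not any specific Landsat product):

```python
def ndsi(green, swir):
    """Normalized-difference snow index from green and shortwave-infrared reflectances."""
    return (green - swir) / (green + swir)

def snow_covered(green, swir, threshold=0.4):
    """Classify a pixel as snow using a (possibly locally calibrated) NDSI threshold."""
    return ndsi(green, swir) > threshold

# Snow reflects strongly in the green band and weakly in SWIR:
print(snow_covered(0.8, 0.1))             # → True  (NDSI ≈ 0.78, above 0.4)
print(snow_covered(0.3, 0.25))            # → False (NDSI ≈ 0.09, snow free)
print(snow_covered(0.8, 0.1, threshold=0.7))  # → True even under the stricter 0.7 threshold
```

The study's point is that the best value of `threshold` varies in time and space at the local scale, which is why a fixed 0.4 can misclassify up to 24.1 % of the snow-covered area.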

  8. Impact of dose-distribution uncertainties on rectal ntcp modeling I: Uncertainty estimates

    International Nuclear Information System (INIS)

    Fenwick, John D.; Nahum, Alan E.

    2001-01-01

    A trial of nonescalated conformal versus conventional radiotherapy treatment of prostate cancer has been carried out at the Royal Marsden NHS Trust (RMH) and Institute of Cancer Research (ICR), demonstrating a significant reduction in the rate of rectal bleeding reported for patients treated using the conformal technique. The relationship between planned rectal dose-distributions and incidences of bleeding has been analyzed, showing that the rate of bleeding falls significantly as the extent of the rectal wall receiving a planned dose-level of more than 57 Gy is reduced. Dose-distributions delivered to the rectal wall over the course of radiotherapy treatment inevitably differ from planned distributions, due to sources of uncertainty such as patient setup error, rectal wall movement and variation in the absolute rectal wall surface area. In this paper estimates of the differences between planned and treated rectal dose-distribution parameters are obtained for the RMH/ICR nonescalated conformal technique, working from a distribution of setup errors observed during the RMH/ICR trial, movement data supplied by Lebesque and colleagues derived from repeat CT scans, and estimates of rectal circumference variations extracted from the literature. Setup errors and wall movement are found to cause only limited systematic differences between mean treated and planned rectal dose-distribution parameter values, but introduce considerable uncertainties into the treated values of some dose-distribution parameters: setup errors lead to 22% and 9% relative uncertainties in the highly dosed fraction of the rectal wall and the wall average dose, respectively, with wall movement leading to 21% and 9% relative uncertainties. Estimates obtained from the literature of the uncertainty in the absolute surface area of the distensible rectal wall are of the order of 13%-18%. 
In a subsequent paper the impact of these uncertainties on analyses of the relationship between incidences of bleeding

  9. Modeling Input Errors to Improve Uncertainty Estimates for Sediment Transport Model Predictions

    Science.gov (United States)

    Jung, J. Y.; Niemann, J. D.; Greimann, B. P.

    2016-12-01

    Bayesian methods using Markov chain Monte Carlo algorithms have recently been applied to sediment transport models to assess the uncertainty in the model predictions due to the parameter values. Unfortunately, the existing approaches can only attribute overall uncertainty to the parameters. This limitation is critical because no model can produce accurate forecasts if forced with inaccurate input data, even if the model is well founded in physical theory. In this research, an existing Bayesian method is modified to consider the potential errors in input data during the uncertainty evaluation process. The input error is modeled using Gaussian distributions, and the means and standard deviations are treated as uncertain parameters. The proposed approach is tested by coupling it to the Sedimentation and River Hydraulics - One Dimension (SRH-1D) model and simulating a 23-km reach of the Tachia River in Taiwan. The Wu equation in SRH-1D is used for computing the transport capacity for a bed material load of non-cohesive material. Three types of input data are considered uncertain: (1) the input flowrate at the upstream boundary, (2) the water surface elevation at the downstream boundary, and (3) the water surface elevation at a hydraulic structure in the middle of the reach. The benefits of modeling the input errors in the uncertainty analysis are evaluated by comparing the accuracy of the most likely forecast and the coverage of the observed data by the credible intervals to those of the existing method. The results indicate that the internal boundary condition has the largest uncertainty among those considered. Overall, the uncertainty estimates from the new method are notably different from those of the existing method for both the calibration and forecast periods.
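The core idea above, treating the input-error distribution's mean and standard deviation as uncertain parameters sampled alongside the model parameters, can be sketched with a toy model and a few Metropolis steps (everything here, model, data, priors and noise scales, is invented for illustration and is far simpler than SRH-1D):

```python
import math
import random

random.seed(1)

# Toy "transport model": prediction depends linearly on the input flowrate.
def model(flow, k):
    return k * flow

# Synthetic truth: the recorded input flow contains an unknown error.
true_flow, true_k = 10.0, 2.0
obs_flow = 10.8                               # recorded input (biased)
obs_output = model(true_flow, true_k) + 0.1   # observed model target

def log_post(k, mu, sigma):
    """Log-posterior treating the input error as Gaussian N(mu, sigma)."""
    if sigma <= 0:
        return -math.inf
    est_flow = obs_flow - mu                  # correct input by the mean error
    resid = obs_output - model(est_flow, k)
    # Output likelihood (fixed noise 0.5) plus a prior tying mu to sigma.
    return -0.5 * (resid / 0.5) ** 2 - 0.5 * (mu / sigma) ** 2 - math.log(sigma)

# Random-walk Metropolis over (k, mu, sigma):
state = (1.5, 0.0, 1.0)
lp = log_post(*state)
for _ in range(5000):
    prop = tuple(s + random.gauss(0, 0.05) for s in state)
    lp_prop = log_post(*prop)
    if math.log(random.random()) < lp_prop - lp:
        state, lp = prop, lp_prop
print(state)
```

The chain explores the joint posterior, so the reported uncertainty in the forecast reflects both parameter uncertainty (`k`) and input-data uncertainty (`mu`, `sigma`), which is exactly the attribution the existing approaches cannot make.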

  10. Worry, Intolerance of Uncertainty, and Statistics Anxiety

    Science.gov (United States)

    Williams, Amanda S.

    2013-01-01

    Statistics anxiety is a problem for most graduate students. This study investigates the relationship between intolerance of uncertainty, worry, and statistics anxiety. Intolerance of uncertainty was significantly related to worry, and worry was significantly related to three types of statistics anxiety. Six types of statistics anxiety were…

  11. Inflation, inflation uncertainty and output growth in the USA

    Science.gov (United States)

    Bhar, Ramprasad; Mallik, Girijasankar

    2010-12-01

    Employing a multivariate EGARCH-M model, this study investigates the effects of inflation uncertainty and growth uncertainty on inflation and output growth in the United States. Our results show that inflation uncertainty has a positive and significant effect on the level of inflation and a negative and significant effect on the output growth. However, output uncertainty has no significant effect on output growth or inflation. The oil price also has a positive and significant effect on inflation. These findings are robust and have been corroborated by use of an impulse response function. These results have important implications for inflation-targeting monetary policy, and the aim of stabilization policy in general.

  12. Robustness of dynamic systems with parameter uncertainties

    CERN Document Server

    Balemi, S; Truöl, W

    1992-01-01

    Robust Control is one of the fastest growing and promising areas of research today. In many practical systems there exist uncertainties which have to be considered in the analysis and design of control systems. In the last decade methods were developed for dealing with dynamic systems with unstructured uncertainties such as H∞- and ℓ1-optimal control. For systems with parameter uncertainties, the seminal paper of V. L. Kharitonov has triggered a large amount of very promising research. An international workshop dealing with all aspects of robust control was successfully organized by S. P. Bhattacharyya and L. H. Keel in San Antonio, Texas, USA in March 1991. We organized the second international workshop in this area in Ascona, Switzerland in April 1992. However, this second workshop was restricted to robust control of dynamic systems with parameter uncertainties with the objective to concentrate on some aspects of robust control. This book contains a collection of papers presented at the International W...

  13. Application of uncertainty and sensitivity analysis to the air quality SHERPA modelling tool

    Science.gov (United States)

    Pisoni, E.; Albrecht, D.; Mara, T. A.; Rosati, R.; Tarantola, S.; Thunis, P.

    2018-06-01

    Air quality has significantly improved in Europe over the past few decades. Nonetheless we still find high concentrations in measurements mainly in specific regions or cities. This dimensional shift, from EU-wide to hot-spot exceedances, calls for a novel approach to regional air quality management (to complement existing EU-wide policies). The SHERPA (Screening for High Emission Reduction Potentials on Air quality) modelling tool was developed in this context. It provides an additional tool to be used in support of regional/local decision makers responsible for the design of air quality plans. It is therefore important to evaluate the quality of the SHERPA model, and its behavior in the face of various kinds of uncertainty. Uncertainty and sensitivity analysis techniques can be used for this purpose. They both reveal the links between assumptions and forecasts, help in model simplification and may highlight unexpected relationships between inputs and outputs. Thus, a policy steered SHERPA module - predicting air quality improvement linked to emission reduction scenarios - was evaluated by means of (1) uncertainty analysis (UA) to quantify uncertainty in the model output, and (2) sensitivity analysis (SA) to identify the most influential input sources of this uncertainty. The results of this study provide relevant information about the key variables driving the SHERPA output uncertainty, and advise policy-makers and modellers where to place their efforts for an improved decision-making process.

  14. Investment, regulation, and uncertainty

    Science.gov (United States)

    Smyth, Stuart J; McDonald, Jillian; Falck-Zepeda, Jose

    2014-01-01

    As with any technological innovation, time refines the technology, improving upon the original version of the innovative product. The initial GM crops had single traits for either herbicide tolerance or insect resistance. Current varieties have both of these traits stacked together and in many cases other abiotic and biotic traits have also been stacked. This innovation requires investment. While this is relatively straightforward, certain conditions need to exist such that investments can be facilitated. The principal requirement for investment is that regulatory frameworks render consistent and timely decisions. If the certainty of regulatory outcomes weakens, the potential for changes in investment patterns increases. This article provides a summary background to the leading plant breeding technologies that are either currently being used to develop new crop varieties or are in the pipeline to be applied to plant breeding within the next few years. Challenges for existing regulatory systems are highlighted. Utilizing an option value approach from investment literature, an assessment of uncertainty regarding the regulatory approval for these varying techniques is undertaken. This research highlights which technology development options have the greatest degree of uncertainty and hence, which ones might be expected to see an investment decline. PMID:24499745

  15. Uncertainty during breast diagnostic evaluation: state of the science.

    Science.gov (United States)

    Montgomery, Mariann

    2010-01-01

    To present the state of the science on uncertainty in relation to the experiences of women undergoing diagnostic evaluation for suspected breast cancer. Published articles from Medline, CINAHL, PubMed, and PsycINFO from 1983-2008 using the following key words: breast biopsy, mammography, uncertainty, reframing, inner strength, and disruption. Fifty research studies were examined, all reporting the presence of anxiety persisting throughout the diagnostic evaluation until certitude is achieved through the establishment of a definitive diagnosis. Indirect determinants of uncertainty for women undergoing breast diagnostic evaluation include measures of anxiety, depression, social support, emotional responses, defense mechanisms, and the psychological impact of events. Understanding and influencing the uncertainty experience have been suggested to be key in relieving psychosocial distress and positively influencing future screening behaviors. Several studies examine correlational relationships among anxiety, selection of coping methods, and demographic factors that influence uncertainty. A gap exists in the literature with regard to the relationship of inner strength and uncertainty. Nurses can be invaluable in assisting women in coping with the uncertainty experience by providing positive communication and support. Nursing interventions should be designed and tested for their effects on uncertainty experienced by women undergoing a breast diagnostic evaluation.

  16. Some sources of the underestimation of evaluated cross section uncertainties

    International Nuclear Information System (INIS)

    Badikov, S.A.; Gai, E.V.

    2003-01-01

    The problem of the underestimation of evaluated cross-section uncertainties is addressed. Two basic sources of the underestimation of evaluated cross-section uncertainties - a) inconsistency between declared and observable experimental uncertainties and b) inadequacy between applied statistical models and processed experimental data - are considered. Both sources of underestimation are mainly a consequence of the existence of uncertainties unrecognized by experimenters. A model of a 'constant shift' is proposed for taking unrecognized experimental uncertainties into account. The model is applied to a statistical analysis of the ²³⁸U(n,f)/²³⁵U(n,f) reaction cross-section ratio measurements. It is demonstrated that multiplication by √χ² as an instrument for correcting underestimated evaluated cross-section uncertainties fails in the case of correlated measurements. It is shown that arbitrary assignment of uncertainties and correlation in a simple least squares fit of two correlated measurements of unknown mean leads to physically incorrect evaluated results. (author)
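The failure mode described in the last sentence is easy to demonstrate numerically: with strong positive correlation, the generalized-least-squares mean of two measurements of the same quantity can fall below both of them. A minimal sketch (the measurement values and covariance are invented for illustration):

```python
import numpy as np

# Two correlated measurements of the same unknown quantity.
x = np.array([1.0, 1.5])
# Covariance matrix with a strong positive correlation (r = 0.9).
C = np.array([[0.01, 0.018],
              [0.018, 0.04]])

# Generalized least squares estimate of the common mean:
#   mean = (1ᵀ C⁻¹ x) / (1ᵀ C⁻¹ 1),  var = 1 / (1ᵀ C⁻¹ 1)
ones = np.ones(2)
Cinv = np.linalg.inv(C)
mean = ones @ Cinv @ x / (ones @ Cinv @ ones)
var = 1.0 / (ones @ Cinv @ ones)
print(mean, var ** 0.5)   # the mean lands below BOTH measurements (~0.71)
```

With these numbers the fitted mean is about 0.71, outside the interval [1.0, 1.5] spanned by the data, which is the kind of physically incorrect result the abstract warns about when correlations are assigned arbitrarily.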

  17. Methodology for qualitative uncertainty assessment of climate impact indicators

    Science.gov (United States)

    Otto, Juliane; Keup-Thiel, Elke; Rechid, Diana; Hänsler, Andreas; Pfeifer, Susanne; Roth, Ellinor; Jacob, Daniela

    2016-04-01

    The FP7 project "Climate Information Portal for Copernicus" (CLIPC) is developing an integrated platform of climate data services to provide a single point of access for authoritative scientific information on climate change and climate change impacts. In this project, the Climate Service Center Germany (GERICS) has been in charge of developing a methodology for assessing the uncertainties related to climate impact indicators. Existing climate data portals mainly treat the uncertainties in two ways: either they provide generic guidance, or they express the quantifiable fraction of the uncertainty with statistical measures. However, none of the climate data portals gives users qualitative guidance on how confident they can be in the validity of the displayed data. The need for such guidance was identified in CLIPC user consultations. Therefore, we aim to provide an uncertainty assessment that gives users climate-impact-indicator-specific guidance on the degree to which they can trust the outcome. We will present an approach that provides information on the importance of different sources of uncertainty associated with a specific climate impact indicator and how these sources affect the overall 'degree of confidence' of this respective indicator. To meet users' requirements for effective communication of uncertainties, their feedback was incorporated during the development of the methodology. Assessing and visualising the quantitative component of uncertainty is part of the qualitative guidance. As a visual analysis method, we apply the Climate Signal Maps (Pfeifer et al. 2015), which highlight only those areas with robust climate change signals. Here, robustness is defined as a combination of model agreement and the significance of the individual model projections. Reference Pfeifer, S., Bülow, K., Gobiet, A., Hänsler, A., Mudelsee, M., Otto, J., Rechid, D., Teichmann, C. and Jacob, D.: Robustness of Ensemble Climate Projections

  18. Dynamic Uncertainty for Compensated Second-Order Systems

    Directory of Open Access Journals (Sweden)

    Clemens Elster

    2010-08-01

    Full Text Available The compensation of LTI systems and the evaluation of the associated uncertainty is of growing interest in metrology. Uncertainty evaluation in metrology ought to follow specific guidelines, and recently two corresponding uncertainty evaluation schemes have been proposed for FIR and IIR filtering. We employ these schemes to compare an FIR and an IIR approach for compensating a second-order LTI system which has relevance in metrology. Our results suggest that the FIR approach is superior in the sense that it yields significantly smaller uncertainties when real-time evaluation of uncertainties is desired.

  19. Geological-structural models used in SR 97. Uncertainty analysis

    Energy Technology Data Exchange (ETDEWEB)

    Saksa, P.; Nummela, J. [FINTACT Oy (Finland)

    1998-10-01

    The uncertainty of geological-structural models was studied for the three sites in SR 97, called Aberg, Beberg and Ceberg. The evaluation covered both regional and site scale models, the emphasis being placed on fracture zones in the site scale. Uncertainty is a natural feature of all geoscientific investigations. It originates from measurements (errors in data, sampling limitations, scale variation) and conceptualisation (structural geometries and properties, ambiguous geometric or parametric solutions) to name the major ones. The structures of A-, B- and Ceberg are fracture zones of varying types. No major differences in the conceptualisation between the sites were noted. One source of uncertainty in the site models is the non-existence of fracture and zone information in the scale from 10 to 300 - 1000 m. At Aberg the development of the regional model has been performed very thoroughly. At the site scale one major source of uncertainty is that a clear definition of the target area is missing. Structures encountered in the boreholes are well explained and an interdisciplinary approach to interpretation has been taken. The Beberg and Ceberg regional models contain relatively large uncertainties due to the investigation methodology and experience available at that time. At the site scale six additional structures were proposed for both Beberg and Ceberg for the variant analysis of these sites. Both sites include uncertainty in the form of many non-interpreted fractured sections along the boreholes. Statistical analysis gives high occurrences of structures for all three sites: typically 20 - 30 structures/km³. Aberg has the highest structural frequency, Beberg comes next and Ceberg has the lowest. The borehole configuration, orientations and surveying goals were inspected to find whether preferences or factors causing bias were present. Data from Aberg supports the conclusion that the Aespoe sub volume would be an anomalously fractured, tectonised unit of its own.
This means that

  20. Geological-structural models used in SR 97. Uncertainty analysis

    International Nuclear Information System (INIS)

    Saksa, P.; Nummela, J.

    1998-10-01

    The uncertainty of geological-structural models was studied for the three sites in SR 97, called Aberg, Beberg and Ceberg. The evaluation covered both regional and site scale models, the emphasis being placed on fracture zones in the site scale. Uncertainty is a natural feature of all geoscientific investigations. It originates from measurements (errors in data, sampling limitations, scale variation) and conceptualisation (structural geometries and properties, ambiguous geometric or parametric solutions) to name the major ones. The structures of A-, B- and Ceberg are fracture zones of varying types. No major differences in the conceptualisation between the sites were noted. One source of uncertainty in the site models is the non-existence of fracture and zone information in the scale from 10 to 300 - 1000 m. At Aberg the development of the regional model has been performed very thoroughly. At the site scale one major source of uncertainty is that a clear definition of the target area is missing. Structures encountered in the boreholes are well explained and an interdisciplinary approach to interpretation has been taken. The Beberg and Ceberg regional models contain relatively large uncertainties due to the investigation methodology and experience available at that time. At the site scale six additional structures were proposed for both Beberg and Ceberg for the variant analysis of these sites. Both sites include uncertainty in the form of many non-interpreted fractured sections along the boreholes. Statistical analysis gives high occurrences of structures for all three sites: typically 20 - 30 structures/km³. Aberg has the highest structural frequency, Beberg comes next and Ceberg has the lowest. The borehole configuration, orientations and surveying goals were inspected to find whether preferences or factors causing bias were present. Data from Aberg supports the conclusion that the Aespoe sub volume would be an anomalously fractured, tectonised unit of its own.
This means that the

  1. Identification of optimal strategies for energy management systems planning under multiple uncertainties

    International Nuclear Information System (INIS)

    Cai, Y.P.; Huang, G.H.; Yang, Z.F.; Tan, Q.

    2009-01-01

    Management of energy resources is crucial for many regions throughout the world. Many economic, environmental and political factors are having significant effects on energy management practices, leading to a variety of uncertainties in relevant decision making. The objective of this research is to identify optimal strategies in the planning of energy management systems under multiple uncertainties through the development of a fuzzy-random interval programming (FRIP) model. The method is based on an integration of the existing interval linear programming (ILP), superiority-inferiority-based fuzzy-stochastic programming (SI-FSP) and mixed integer linear programming (MILP). Such a FRIP model allows multiple uncertainties, presented as interval values, possibilistic and probabilistic distributions or combinations of these, to be handled within a general optimization framework. It can also be used for facilitating capacity-expansion planning of energy-production facilities within a multi-period and multi-option context. Complexities in energy management systems can be systematically reflected, greatly enhancing the applicability of the modeling process. The developed method was then applied to a case of long-term energy management planning for a region with three cities. Useful solutions for the planning of energy management systems were generated. Interval solutions associated with different risk levels of constraint violation were obtained. They could be used for generating decision alternatives and thus help decision makers identify desired policies under various economic and system-reliability constraints. The solutions can also provide desired energy resource/service allocation and capacity-expansion plans with a minimized system cost, a maximized system reliability and a maximized energy security. 
Tradeoffs between system costs and constraint-violation risks could be successfully tackled, i.e., higher costs will increase system stability, while a desire for lower
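The interval-programming idea underlying the FRIP model can be illustrated with a minimal sketch: when each cost coefficient is known only as an interval, any fixed allocation yields an interval-valued total cost. The allocation values and unit costs below are hypothetical and stand in for the paper's energy-system data; the full model would optimize over the allocation as well.

```python
# Minimal sketch of interval-valued cost evaluation, in the spirit of
# interval linear programming (ILP): each unit-cost coefficient is known
# only as an interval [lo, hi], so a fixed allocation yields an interval
# total cost. All numbers are hypothetical.

def interval_cost(allocation, coeff_intervals):
    """Return (best_case, worst_case) total cost for nonnegative allocations
    when each unit-cost coefficient lies in a known interval."""
    lo = sum(x * c_lo for x, (c_lo, c_hi) in zip(allocation, coeff_intervals))
    hi = sum(x * c_hi for x, (c_lo, c_hi) in zip(allocation, coeff_intervals))
    return lo, hi

# hypothetical energy allocations (PJ) and unit costs ($/GJ) as intervals
allocation = [120.0, 45.0, 30.0]                 # coal, gas, renewables
coeffs = [(2.0, 2.6), (4.1, 5.0), (6.5, 8.2)]
best, worst = interval_cost(allocation, coeffs)
print(best, worst)   # interval of total system cost
```

A full ILP would solve two linear programs (one per interval bound) instead of evaluating a single fixed allocation.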

  2. A review of uncertainty research in impact assessment

    International Nuclear Information System (INIS)

    Leung, Wanda; Noble, Bram; Gunn, Jill; Jaeger, Jochen A.G.

    2015-01-01

    This paper examines uncertainty research in Impact Assessment (IA) and the focus of attention of the IA scholarly literature. We do so by first exploring ‘outside’ the IA literature, identifying three main themes of uncertainty research, and then applying these themes to examine the focus of scholarly research on uncertainty ‘inside’ IA. Based on a search of the database Scopus, we identified 134 journal papers published between 1970 and 2013 that address uncertainty in IA, 75% of which were published since 2005. We found that 90% of IA research addressing uncertainty focused on uncertainty in the practice of IA, including uncertainty in impact predictions, models and managing environmental impacts. Notwithstanding early guidance on uncertainty treatment in IA from the 1980s, we found no common, underlying conceptual framework guiding research on uncertainty in IA practice. Considerably less attention, only 9% of papers, focused on uncertainty communication, disclosure and decision-making under uncertain conditions, the majority of which focused on the need to disclose uncertainties as opposed to providing guidance on how to do so and effectively use that information to inform decisions. Finally, research focused on theory building for explaining human behavior with respect to uncertainty avoidance constituted only 1% of the IA published literature. We suggest the need for further conceptual framework development for researchers focused on identifying and addressing uncertainty in IA practice; the need for guidance on how best to communicate uncertainties in practice, versus criticizing practitioners for not doing so; research that explores how best to interpret and use disclosures about uncertainty when making decisions about project approvals, and the implications of doing so; and academic theory building and exploring the utility of existing theories to better understand and explain uncertainty avoidance behavior in IA. - Highlights: • We

  3. A review of uncertainty research in impact assessment

    Energy Technology Data Exchange (ETDEWEB)

    Leung, Wanda, E-mail: wanda.leung@usask.ca [Department of Geography and Planning, University of Saskatchewan, 117 Science Place, Saskatoon, Saskatchewan S7N 5A5 (Canada); Noble, Bram, E-mail: b.noble@usask.ca [Department of Geography and Planning, School of Environment and Sustainability, University of Saskatchewan, 117 Science Place, Saskatoon, Saskatchewan S7N 5A5 (Canada); Gunn, Jill, E-mail: jill.gunn@usask.ca [Department of Geography and Planning, University of Saskatchewan, 117 Science Place, Saskatoon, Saskatchewan S7N 5A5 (Canada); Jaeger, Jochen A.G., E-mail: jochen.jaeger@concordia.ca [Department of Geography, Planning and Environment, Concordia University, 1455 de Maisonneuve W., Suite 1255, Montreal, Quebec H3G 1M8 (Canada); Loyola Sustainability Research Centre, Concordia University, 7141 Sherbrooke W., AD-502, Montreal, Quebec H4B 1R6 (Canada)

    2015-01-15

    This paper examines uncertainty research in Impact Assessment (IA) and the focus of attention of the IA scholarly literature. We do so by first exploring ‘outside’ the IA literature, identifying three main themes of uncertainty research, and then applying these themes to examine the focus of scholarly research on uncertainty ‘inside’ IA. Based on a search of the database Scopus, we identified 134 journal papers published between 1970 and 2013 that address uncertainty in IA, 75% of which were published since 2005. We found that 90% of IA research addressing uncertainty focused on uncertainty in the practice of IA, including uncertainty in impact predictions, models and managing environmental impacts. Notwithstanding early guidance on uncertainty treatment in IA from the 1980s, we found no common, underlying conceptual framework guiding research on uncertainty in IA practice. Considerably less attention, only 9% of papers, focused on uncertainty communication, disclosure and decision-making under uncertain conditions, the majority of which focused on the need to disclose uncertainties as opposed to providing guidance on how to do so and effectively use that information to inform decisions. Finally, research focused on theory building for explaining human behavior with respect to uncertainty avoidance constituted only 1% of the IA published literature. We suggest the need for further conceptual framework development for researchers focused on identifying and addressing uncertainty in IA practice; the need for guidance on how best to communicate uncertainties in practice, versus criticizing practitioners for not doing so; research that explores how best to interpret and use disclosures about uncertainty when making decisions about project approvals, and the implications of doing so; and academic theory building and exploring the utility of existing theories to better understand and explain uncertainty avoidance behavior in IA. - Highlights: • We

  4. Uncertainties as Barriers for Knowledge Sharing with Enterprise Social Media

    DEFF Research Database (Denmark)

    Trier, Matthias; Fung, Magdalene; Hansen, Abigail

    2017-01-01

    become a barrier for the participants’ adoption. There is only limited existing research studying the types of uncertainties that employees perceive and their impact on knowledge transfer via social media. To address this gap, this article presents a qualitative interview-based study of the adoption...... of the Enterprise Social Media tool Yammer for knowledge sharing in a large global organization. We identify and categorize nine uncertainties that were perceived as barriers by the respondents. The study revealed that the uncertainty types play an important role in affecting employees’ participation...

  5. On Commitments and Other Uncertainty Reduction Tools in Joint Action

    Directory of Open Access Journals (Sweden)

    Michael John

    2015-01-01

    Full Text Available In this paper, we evaluate the proposal that a central function of commitments within joint action is to reduce various kinds of uncertainty, and that this accounts for the prevalence of commitments in joint action. While this idea is prima facie attractive, we argue that it faces two serious problems. First, commitments can only reduce uncertainty if they are credible, and accounting for the credibility of commitments proves not to be straightforward. Second, there are many other ways in which uncertainty is commonly reduced within joint actions, which raises the possibility that commitments may be superfluous. Nevertheless, we argue that the existence of these alternative uncertainty reduction processes does not make commitments superfluous after all but, rather, helps to explain how commitments may contribute in various ways to uncertainty reduction.

  6. Optimization of refinery operations when uncertainty exists: Algeria's case; Optimisation des operations du raffinage en presence d'incertitudes: Cas de l'Algerie

    Energy Technology Data Exchange (ETDEWEB)

    Benyoucef, Abderrezak; Lantz, Frederic

    2010-09-15

    The objective of this article is to analyze the development of the Algerian refining industry under uncertainty, using a dynamic linear programming model. Because of the volatility of market conditions, many parameters must be treated as uncertain. In our study, we deal mainly with uncertainties in petroleum product demand. The model gives the production levels, the utilization rates of the units and the external exchanges of products at the 2030 horizon. It thus makes it possible to assess the impact of volatility on the development of this industry.

  7. Strategic Capital Budgeting : Asset Replacement Under Uncertainty

    NARCIS (Netherlands)

    Pawlina, G.; Kort, P.M.

    2001-01-01

    We consider a firm's decision to replace an existing production technology with a new, more cost-efficient one. Kulatilaka and Perotti [1998, Management Science] find that, in a two-period model, increased product market uncertainty could encourage the firm to invest strategically in the new

  8. The economic implications of carbon cycle uncertainty

    International Nuclear Information System (INIS)

    Smith, Steven J.; Edmonds, James A.

    2006-01-01

    This paper examines the implications of uncertainty in the carbon cycle for the cost of stabilizing carbon dioxide concentrations. Using a state-of-the-art integrated assessment model, we find that uncertainty in our understanding of the carbon cycle has significant implications for the costs of a climate stabilization policy, with cost differences denominated in trillions of dollars. Uncertainty in the carbon cycle is equivalent to a change in concentration target of up to 100 ppmv. The impact of carbon cycle uncertainties is smaller than that of climate sensitivity, and broadly comparable to the effect of uncertainty in technology availability

  9. Some target assay uncertainties for passive neutron coincidence counting

    International Nuclear Information System (INIS)

    Ensslin, N.; Langner, D.G.; Menlove, H.O.; Miller, M.C.; Russo, P.A.

    1990-01-01

    This paper provides some target assay uncertainties for passive neutron coincidence counting of plutonium metal, oxide, mixed oxide, and scrap and waste. The target values are based in part on past user experience and in part on the estimated results from new coincidence counting techniques that are under development. The paper summarizes assay error sources and the new coincidence techniques, and recommends the technique that is likely to yield the lowest assay uncertainty for a given material type. These target assay uncertainties are intended to be useful for NDA instrument selection and assay variance propagation studies for both new and existing facilities. 14 refs., 3 tabs

  10. Synthesis of Optimal Processing Pathway for Microalgae-based Biorefinery under Uncertainty

    DEFF Research Database (Denmark)

    Rizwan, Muhammad; Lee, Jay H.; Gani, Rafiqul

    2015-01-01

    The research in the field of microalgae-based biofuels and chemicals is in an early phase of development, and therefore a wide range of uncertainties exist due to inconsistencies among, and shortage of, technical information. In order to handle and address these uncertainties to ensure robust decision making, we propose a systematic framework for the synthesis and optimal design of a microalgae-based processing network under uncertainty. By incorporating major uncertainties into the biorefinery superstructure model we developed previously, a stochastic mixed integer nonlinear programming (sMINLP) problem is formulated for determining the optimal biorefinery structure under given parameter uncertainties modelled as sampled scenarios. The solution to the sMINLP problem determines the optimal decisions with respect to processing technologies, material flows, and product portfolio in the presence...
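The core mechanic of scenario-based optimization under uncertainty can be sketched in a few lines: discrete decisions are evaluated against their expected cost over sampled scenarios of an uncertain parameter. The technologies, yields and costs below are invented for illustration only; the authors' sMINLP covers a full superstructure with material flows and capacity decisions.

```python
# Toy scenario-based selection under uncertainty: pick the processing
# technology that minimizes expected cost over sampled scenarios of an
# uncertain parameter (here, a hypothetical lipid-yield fraction).

scenarios = [0.20, 0.25, 0.30, 0.35]      # sampled lipid-yield fractions
probs     = [0.2, 0.3, 0.3, 0.2]          # scenario probabilities (sum to 1)

# hypothetical cost ($/kg product) of each candidate as a function of yield
technologies = {
    "wet_extraction": lambda y: 1.8 / y,          # cheap units, yield-sensitive
    "dry_extraction": lambda y: 0.9 / y + 2.0,    # extra fixed drying cost
}

def expected_cost(cost_fn):
    """Probability-weighted cost over all sampled scenarios."""
    return sum(p * cost_fn(y) for p, y in zip(probs, scenarios))

best = min(technologies, key=lambda t: expected_cost(technologies[t]))
print(best, round(expected_cost(technologies[best]), 3))
```

A real sMINLP would make such discrete choices jointly with continuous flow variables inside one solver, rather than by enumeration.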

  11. Error and uncertainty in scientific practice

    NARCIS (Netherlands)

    Boumans, M.; Hon, G.; Petersen, A.C.

    2014-01-01

    Assessment of error and uncertainty is a vital component of both natural and social science. Empirical research involves dealing with all kinds of errors and uncertainties, yet there is significant variance in how such results are dealt with. Contributors to this volume present case studies of

  12. Uncertainty analysis of LBLOCA for Advanced Heavy Water Reactor

    International Nuclear Information System (INIS)

    Srivastava, A.; Lele, H.G.; Ghosh, A.K.; Kushwaha, H.S.

    2008-01-01

    The main objective of safety analysis is to demonstrate in a robust way that all safety requirements are met, i.e. sufficient margins exist between the real values of important parameters and their threshold values at which damage of the barriers against release of radioactivity would occur. As stated in the IAEA Safety Requirements for Design of NPPs, 'a safety analysis of the plant design shall be conducted in which methods of both deterministic and probabilistic analysis shall be applied'. It is required that 'the computer programs, analytical methods and plant models used in the safety analysis shall be verified and validated, and adequate consideration shall be given to uncertainties'. Uncertainties are present in calculations due to the computer codes, initial and boundary conditions, plant state, fuel parameters, scaling and the numerical solution algorithm. Conservative approaches, still widely used, were introduced to cover uncertainties due to the limited capability for modelling and understanding of physical phenomena at the early stages of safety analysis. The results obtained by this approach are quite unrealistic and the level of conservatism is not fully known. Another approach is the use of Best Estimate (BE) codes with realistic initial and boundary conditions. If this approach is selected, it should be based on statistically combined uncertainties for plant initial and boundary conditions, assumptions and code models. The current trend is towards best-estimate codes with some conservative assumptions, realistic input data and uncertainty analysis. BE analysis with evaluation of uncertainties offers, in addition, a way to quantify the existing plant safety margins. Its broader use in the future is therefore envisaged, even though it is not always feasible because of the difficulty of quantifying code uncertainties with a sufficiently narrow range for every phenomenon and for each accident sequence. In this paper

  13. Advanced Approach to Consider Aleatory and Epistemic Uncertainties for Integral Accident Simulations

    International Nuclear Information System (INIS)

    Peschke, Joerg; Kloos, Martina

    2013-01-01

    The use of best-estimate codes together with realistic input data generally requires that all potentially important epistemic uncertainties which may affect the code prediction are considered in order to get an adequate quantification of the epistemic uncertainty of the prediction as an expression of the existing imprecise knowledge. To facilitate the performance of the required epistemic uncertainty analyses, methods and corresponding software tools are available, for instance, the GRS tool SUSA (Software for Uncertainty and Sensitivity Analysis). However, for risk-informed decision-making, the restriction to epistemic uncertainties alone is not enough. Transients and accident scenarios are also affected by aleatory uncertainties which are due to the unpredictable nature of phenomena. It is essential that aleatory uncertainties are taken into account as well, not only in a simplified and supposedly conservative way but as realistically as possible. The additional consideration of aleatory uncertainties, for instance, on the behavior of the technical system, the performance of plant operators, or the behavior of the physical process provides a quantification of probabilistically significant accident sequences. Only if a safety analysis is able to account for both epistemic and aleatory uncertainties in a realistic manner can it provide a well-founded risk-informed answer for decision-making. At GRS, an advanced probabilistic dynamics method was developed to address this problem and to provide a more realistic modeling and assessment of transients and accident scenarios. This method allows for an integral simulation of complex dynamic processes, particularly taking into account interactions between the plant dynamics as simulated by a best-estimate code, the dynamics of operator actions and the influence of epistemic and aleatory uncertainties. 
In this paper, the GRS method MCDET (Monte Carlo Dynamic Event Tree) for probabilistic dynamics analysis is explained
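The epistemic/aleatory separation described above is often illustrated with a two-loop ("nested") Monte Carlo: the outer loop samples imprecisely known model parameters, the inner loop samples stochastic variability, giving one outcome distribution per epistemic sample. The sketch below uses an invented one-line "model" as a stand-in for a best-estimate code run; it is not the MCDET method itself, which builds dynamic event trees.

```python
# Two-loop Monte Carlo sketch separating epistemic and aleatory uncertainty.
# Outer loop: epistemic (imprecisely known coefficient). Inner loop: aleatory
# (random demand). The response function is hypothetical.
import random

def peak_temperature(heat_coeff, demand):
    # hypothetical system response (stand-in for a best-estimate code run)
    return 600.0 + 80.0 * heat_coeff + 50.0 * demand

random.seed(1)
exceedance_probs = []
for _ in range(50):                        # epistemic loop
    heat_coeff = random.uniform(0.8, 1.2)
    exceed = 0
    for _ in range(400):                   # aleatory loop
        demand = random.gauss(1.0, 0.3)
        if peak_temperature(heat_coeff, demand) > 750.0:
            exceed += 1
    exceedance_probs.append(exceed / 400)

# the spread of these probabilities expresses epistemic uncertainty about
# the aleatory exceedance probability
print(min(exceedance_probs), max(exceedance_probs))
```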

  14. Generalized uncertainty principle and entropy of three-dimensional rotating acoustic black hole

    International Nuclear Information System (INIS)

    Zhao, HuiHua; Li, GuangLiang; Zhang, LiChun

    2012-01-01

    Using the new equation of state density from the generalized uncertainty principle, we investigate the statistical entropy of a 3-dimensional rotating acoustic black hole. When λ introduced in the generalized uncertainty principle takes a specific value, we obtain an area entropy and a correction term associated with the acoustic black hole. In this method no divergence arises, and the small-mass approximation of the original brick-wall model is not needed. -- Highlights: ► The statistical entropy of a 3-dimensional rotating acoustic black hole is studied. ► We obtain an area entropy and a correction term associated with it. ► We make λ introduced in the generalized uncertainty principle take a specific value. ► No divergence arises in this method.
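For orientation, the generalized uncertainty principle with parameter λ is commonly written in the following form (conventions and normalizations of λ vary between papers, so the record's exact form may differ):

```latex
\Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}\left[\,1 + \lambda\,(\Delta p)^{2}\right],
\qquad
(\Delta x)_{\min} \;=\; \hbar\sqrt{\lambda},
```

i.e. a quadratic momentum correction that implies a minimal length and modifies the density of states, which is what removes the brick-wall divergence in entropy calculations of this kind.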

  15. An adaptive-binning method for generating constant-uncertainty/constant-significance light curves with Fermi-LAT data

    International Nuclear Information System (INIS)

    Lott, B.; Escande, L.; Larsson, S.; Ballet, J.

    2012-01-01

    Here, we present a method enabling the creation of constant-uncertainty/constant-significance light curves with the data of the Fermi Large Area Telescope (LAT). The adaptive-binning method enables more information to be encapsulated within the light curve than with the fixed-binning method. Although primarily developed for blazar studies, it can be applied to any source. Furthermore, this method allows the starting and ending times of each interval to be calculated in a simple and quick way during a first step. The mean flux and spectral index (assuming the spectrum is a power-law distribution) reported for the interval are calculated via the standard LAT analysis during a second step. The absence of major caveats associated with this method has been established with Monte Carlo simulations. We present the performance of this method in determining duty cycles as well as power-density spectra relative to the traditional fixed-binning method.
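The adaptive-binning idea can be sketched under one simplifying assumption: for pure Poisson counting, the fractional uncertainty of a bin with N events is roughly 1/sqrt(N), so closing each bin once N reaches (1/target)² events gives approximately constant-uncertainty bins. The real LAT analysis estimates flux uncertainty from a likelihood fit rather than raw counts, so this is only the mechanic, not the method.

```python
# Sketch: bin an event-time series so each bin has ~constant fractional
# uncertainty, assuming Poisson statistics (frac. unc. ~ 1/sqrt(N)).

def adaptive_bins(event_times, target_frac_unc):
    """Return [(t_start, t_stop, n_events), ...]; trailing partial bin dropped."""
    n_required = int((1.0 / target_frac_unc) ** 2)
    out, start, count = [], None, 0
    for t in event_times:
        if start is None:
            start = t
        count += 1
        if count >= n_required:
            out.append((start, t, count))
            start, count = None, 0
    return out

# 400 evenly spaced hypothetical event times; 10% target => 100 events/bin
events = [i * 0.5 for i in range(400)]
bins = adaptive_bins(events, 0.10)
print(bins)
```

In a bright flare the events arrive faster, so the bins automatically shorten, which is exactly the behavior that packs more information into the light curve.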

  16. Uncertainty estimation of ultrasonic thickness measurement

    International Nuclear Information System (INIS)

    Yassir Yassen, Abdul Razak Daud; Mohammad Pauzi Ismail; Abdul Aziz Jemain

    2009-01-01

    The most important factor that should be taken into consideration when selecting an ultrasonic thickness measurement technique is its reliability. Only when the uncertainty of a measurement result is known can it be judged whether the result is adequate for the intended purpose. The objective of this study is to model the ultrasonic thickness measurement function, to identify the most contributing input uncertainty components, and to estimate the uncertainty of the ultrasonic thickness measurement results. We assumed that five error sources contribute significantly to the final error: calibration velocity, transit time, zero offset, measurement repeatability and resolution. By applying the law of propagation of uncertainty to the model function, a combined uncertainty of the ultrasonic thickness measurement was obtained. In this study the model function of ultrasonic thickness measurement was derived. By using this model, the estimation of the uncertainty of the final output result was found to be reliable. It was also found that the most contributing input uncertainty components are calibration velocity, transit time linearity and zero offset. (author)
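The propagation-of-uncertainty law mentioned above can be sketched for a plausible thickness model t = v(τ - t₀)/2 (velocity, transit time, zero offset); the paper's exact model function and values are not reproduced here, so all numbers below are illustrative.

```python
# GUM-style first-order propagation: u_c^2 = sum (df/dx_i * u_i)^2,
# plus repeatability and resolution terms added in quadrature.
# Model assumed for illustration: t = v * (tau - t0) / 2.
import math

def combined_uncertainty(v, u_v, tau, u_tau, t0, u_t0, u_rep, u_res):
    """Combined standard uncertainty of thickness t = v*(tau - t0)/2."""
    # sensitivity coefficients (partial derivatives of the model)
    c_v   = (tau - t0) / 2.0
    c_tau = v / 2.0
    c_t0  = -v / 2.0
    return math.sqrt((c_v * u_v) ** 2 + (c_tau * u_tau) ** 2
                     + (c_t0 * u_t0) ** 2 + u_rep ** 2 + u_res ** 2)

# illustrative values: v in mm/us, times in us, u's are standard uncertainties
u_c = combined_uncertainty(v=5.9, u_v=0.02, tau=3.39, u_tau=0.005,
                           t0=0.0, u_t0=0.004, u_rep=0.01, u_res=0.005)
print(round(u_c, 4))  # combined standard uncertainty in mm
```

Comparing the squared terms shows which component dominates, which is how the "most contributing" components in the study would be ranked.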

  17. A unified approach for squeal instability analysis of disc brakes with two types of random-fuzzy uncertainties

    Science.gov (United States)

    Lü, Hui; Shangguan, Wen-Bin; Yu, Dejie

    2017-09-01

    Automotive brake systems are always subjected to various types of uncertainties and two types of random-fuzzy uncertainties may exist in the brakes. In this paper, a unified approach is proposed for squeal instability analysis of disc brakes with two types of random-fuzzy uncertainties. In the proposed approach, two uncertainty analysis models with mixed variables are introduced to model the random-fuzzy uncertainties. The first one is the random and fuzzy model, in which random variables and fuzzy variables exist simultaneously and independently. The second one is the fuzzy random model, in which uncertain parameters are all treated as random variables while their distribution parameters are expressed as fuzzy numbers. Firstly, the fuzziness is discretized by using α-cut technique and the two uncertainty analysis models are simplified into random-interval models. Afterwards, by temporarily neglecting interval uncertainties, the random-interval models are degraded into random models, in which the expectations, variances, reliability indexes and reliability probabilities of system stability functions are calculated. And then, by reconsidering the interval uncertainties, the bounds of the expectations, variances, reliability indexes and reliability probabilities are computed based on Taylor series expansion. Finally, by recomposing the analysis results at each α-cut level, the fuzzy reliability indexes and probabilities can be obtained, by which the brake squeal instability can be evaluated. The proposed approach gives a general framework to deal with both types of random-fuzzy uncertainties that may exist in the brakes and its effectiveness is demonstrated by numerical examples. It will be a valuable supplement to the systematic study of brake squeal considering uncertainty.
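The α-cut technique used in the paper to discretize fuzziness can be shown in miniature: at each level α in [0, 1], a triangular fuzzy number (a, m, b) reduces to the crisp interval of values with membership at least α. The fuzzy friction coefficient below is a hypothetical stand-in; the paper's actual brake parameters are not modelled.

```python
# Alpha-cut discretization of a triangular fuzzy number (a, m, b):
# at level alpha, the cut is the interval [a + alpha*(m-a), b - alpha*(b-m)].

def alpha_cut(a, m, b, alpha):
    """Crisp interval of a triangular fuzzy number (a, m, b) at level alpha."""
    return (a + alpha * (m - a), b - alpha * (b - m))

# discretize a hypothetical fuzzy friction coefficient (0.3, 0.4, 0.5)
levels = [i / 4 for i in range(5)]        # alpha = 0, 0.25, 0.5, 0.75, 1
intervals = [alpha_cut(0.3, 0.4, 0.5, al) for al in levels]
print(intervals)   # nested intervals shrinking to the point 0.4 at alpha=1
```

Each interval then feeds an interval (or random-interval) analysis, and recomposing the per-level results reconstructs fuzzy outputs, mirroring the paper's workflow.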

  18. Collaborative framework for PIV uncertainty quantification: the experimental database

    International Nuclear Information System (INIS)

    Neal, Douglas R; Sciacchitano, Andrea; Scarano, Fulvio; Smith, Barton L

    2015-01-01

    The uncertainty quantification of particle image velocimetry (PIV) measurements has recently become a topic of great interest as shown by the recent appearance of several different methods within the past few years. These approaches have different working principles, merits and limitations, which have been speculated upon in subsequent studies. This paper reports a unique experiment that has been performed specifically to test the efficacy of PIV uncertainty methods. The case of a rectangular jet, as previously studied by Timmins et al (2012) and Wilson and Smith (2013b), is used. The novel aspect of the experiment is simultaneous velocity measurements using two different time-resolved PIV systems and a hot-wire anemometry (HWA) system. The first PIV system, called the PIV measurement system (‘PIV-MS’), is intended for nominal measurements of which the uncertainty is to be evaluated. It is based on a single camera and features a dynamic velocity range (DVR) representative of typical PIV experiments. The second PIV system, called the ‘PIV-HDR’ (high dynamic range) system, features a significantly higher DVR obtained with a higher digital imaging resolution. The hot-wire is placed in close proximity to the PIV measurement domain. The three measurement systems were carefully set to simultaneously measure the flow velocity at the same time and location. The comparison between the PIV-HDR system and the HWA provides an estimate of the measurement precision of the reference velocity for evaluation of the instantaneous error in the measurement system. The discrepancy between the PIV-MS and the reference data provides the measurement error, which is later used to assess the different uncertainty quantification methods proposed in the literature. A detailed comparison of the uncertainty estimation methods based on the present datasets is presented in a second paper from Sciacchitano et al (2015). Furthermore, this database offers the potential to be used for

  19. Nuclear data sensitivity/uncertainty analysis for XT-ADS

    International Nuclear Information System (INIS)

    Sugawara, Takanori; Sarotto, Massimo; Stankovskiy, Alexey; Van den Eynde, Gert

    2011-01-01

    Highlights: → The sensitivity and uncertainty analyses were performed to assess the reliability of the XT-ADS neutronic design. → The uncertainties deduced from the covariance data for the XT-ADS criticality were 0.94%, 1.9% and 1.1% with the SCALE 44-group, TENDL-2009 and JENDL-3.3 data, respectively. → When the target accuracy of 0.3%Δk for the criticality was considered, the uncertainties did not satisfy it. → To achieve this accuracy, the uncertainties should be reduced through experiments under adequate conditions. - Abstract: The XT-ADS, an accelerator-driven system for an experimental demonstration, has been investigated in the framework of the IP EUROTRANS FP6 project. In this study, the sensitivity and uncertainty analyses were performed to assess the reliability of the XT-ADS neutronic design. For the sensitivity analysis, it was found that the sensitivity coefficients differed significantly depending on the geometry models and calculation codes. For the uncertainty analysis, it was confirmed that the uncertainties deduced from the covariance data varied significantly with the choice of covariance data. The uncertainties deduced from the covariance data for the XT-ADS criticality were 0.94%, 1.9% and 1.1% with the SCALE 44-group, TENDL-2009 and JENDL-3.3 data, respectively. When the target accuracy of 0.3%Δk for the criticality was considered, the uncertainties did not satisfy it. To achieve this accuracy, the uncertainties should be reduced through experiments under adequate conditions.

  20. Remediation of the Faultless Underground Nuclear Test: Moving Forward in the Face of Model Uncertainty

    International Nuclear Information System (INIS)

    Chapman, J. B.; Pohlmann, K.; Pohll, G.; Hassan, A.; Sanders, P.; Sanchez, M.; Jaunarajs, S.

    2002-01-01

    parameter values and the additive effects of multiple sources of uncertainty. Ultimately, the question was whether new data collection would substantially reduce uncertainty in the model. A Data Decision Analysis (DDA) was performed to quantify uncertainty in the existing model and determine the most cost-beneficial activities for reducing uncertainty, if reduction was needed. The DDA indicated that though there is large uncertainty present in some model parameters, the overall uncertainty in the calculated contaminant boundary during the 1,000-year regulatory timeframe is relatively small. As a result, limited uncertainty reduction can be expected from expensive characterization activities. With these results, DOE and NDEP have determined that the site model is suitable for moving forward in the corrective action process. Key to this acceptance is acknowledgment that the model requires independent validation data and the site requires long-term monitoring. Developing the validation and monitoring plans, and calculating contaminant boundaries are the tasks now being pursued for the site. The significant progress made for the site is due to the close cooperation and communication of the parties involved and an acceptance and understanding of the role of uncertainty

  1. Added Value of uncertainty Estimates of SOurce term and Meteorology (AVESOME)

    DEFF Research Database (Denmark)

    Sørensen, Jens Havskov; Schönfeldt, Fredrik; Sigg, Robert

    In the early phase of a nuclear accident, two large sources of uncertainty exist: one related to the source term and one associated with the meteorological data. Operational methods are being developed in AVESOME for quantitative estimation of uncertainties in atmospheric dispersion prediction.......g. at national meteorological services, the proposed methodology is feasible for real-time use, thereby adding value to decision support. In the recent NKS-B projects MUD, FAUNA and MESO, the implications of meteorological uncertainties for nuclear emergency preparedness and management have been studied...... uncertainty in atmospheric dispersion model forecasting stemming from both the source term and the meteorological data is examined. Ways to implement the uncertainties of forecasting in DSSs, and the impacts on real-time emergency management are described. The proposed methodology allows for efficient real...

  2. Optimization Under Uncertainty for Wake Steering Strategies: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Quick, Julian [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Annoni, Jennifer [National Renewable Energy Laboratory (NREL), Golden, CO (United States); King, Ryan N [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Dykes, Katherine L [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Fleming, Paul A [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Ning, Andrew [Brigham Young University

    2017-05-01

    Wind turbines in a wind power plant experience significant power losses because of aerodynamic interactions between turbines. One control strategy to reduce these losses is known as 'wake steering,' in which upstream turbines are yawed to direct wakes away from downstream turbines. Previous wake steering research has assumed perfect information; however, there can be significant uncertainty in many aspects of the problem, including wind inflow and various turbine measurements. Uncertainty has significant implications for the performance of wake steering strategies. Consequently, the authors formulate and solve an optimization under uncertainty (OUU) problem for finding optimal wake steering strategies in the presence of yaw angle uncertainty. The OUU wake steering strategy is demonstrated on a two-turbine test case and on the utility-scale, offshore Princess Amalia Wind Farm. When we accounted for yaw angle uncertainty in the Princess Amalia Wind Farm case, inflow-direction-specific OUU solutions produced between 0% and 1.4% more power than the deterministically optimized steering strategies, resulting in an overall annual average improvement of 0.2%. More importantly, the deterministic optimization is expected to perform worse, and with more downside risk, than the OUU result when realistic uncertainty is taken into account. Additionally, the OUU solution produces fewer extreme yaw situations than the deterministic solution.
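The essence of the OUU formulation is maximizing expected power over a distribution of yaw-angle errors instead of the nominal power. The one-parameter "power response" below is invented purely to show the mechanics (it is not a wake model like the one used in the paper): it falls off faster when over-yawed, so hedging against Gaussian yaw error pulls the optimum below the perfect-information setpoint.

```python
# Toy optimization under uncertainty (OUU): choose the yaw setpoint that
# maximizes power averaged over a Gaussian yaw error, and compare it with
# the deterministic (perfect-information) optimum. Power model is invented.
import math

def power(yaw_deg):
    # hypothetical normalized power: asymmetric falloff around 20 degrees
    d = yaw_deg - 20.0
    width = 10.0 if d <= 0 else 5.0   # steeper penalty for over-yawing
    return math.exp(-(d / width) ** 2)

def expected_power(yaw_deg, sigma_deg, n=201, span=4.0):
    """Average power over a Gaussian yaw error (simple weighted quadrature)."""
    total = weight_sum = 0.0
    for i in range(n):
        e = -span * sigma_deg + 2.0 * span * sigma_deg * i / (n - 1)
        w = math.exp(-0.5 * (e / sigma_deg) ** 2)
        total += w * power(yaw_deg + e)
        weight_sum += w
    return total / weight_sum

candidates = [i * 0.5 for i in range(81)]              # 0..40 degrees
det = max(candidates, key=power)                        # deterministic optimum
ouu = max(candidates, key=lambda y: expected_power(y, sigma_deg=5.0))
print(det, ouu)   # the OUU setpoint backs off from the deterministic one
```

The same qualitative effect drives the paper's result: the robust solution trades a little nominal power for less downside under realistic yaw error.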

  3. Uncertainty of inhalation dose coefficients for representative physical and chemical forms of iodine-131

    Science.gov (United States)

    Harvey, Richard Paul, III

    Releases of radioactive material have occurred at various Department of Energy (DOE) weapons facilities and facilities associated with the nuclear fuel cycle in the generation of electricity. Many different radionuclides have been released to the environment with resulting exposure of the population to these various sources of radioactivity. Radioiodine has been released from a number of these facilities and is a potential public health concern due to its physical and biological characteristics. Iodine exists as various isotopes, but our focus is on 131I due to its relatively long half-life, its prevalence in atmospheric releases and its contribution to offsite dose. The assumption of physical and chemical form is speculated to have a profound impact on the deposition of radioactive material within the respiratory tract. In the case of iodine, it has been shown that more than one type of physical and chemical form may be released to, or exist in, the environment; iodine can exist as a particle or as a gas. The gaseous species can be further segregated based on chemical form: elemental, inorganic, and organic iodides. Chemical compounds in each class are assumed to behave similarly with respect to biochemistry. Studies at Oak Ridge National Laboratory have demonstrated that 131I is released as a particulate, as well as in elemental, inorganic and organic chemical form. The internal dose estimate from 131I may be very different depending on the effect that chemical form has on fractional deposition, gas uptake, and clearance in the respiratory tract. There are many sources of uncertainty in the estimation of environmental dose including source term, airborne transport of radionuclides, and internal dosimetry. Knowledge of uncertainty in internal dosimetry is essential for estimating dose to members of the public and for determining total uncertainty in dose estimation. An important calculational step in any lung model is the regional estimation of deposition fractions

  4. CSAU (Code Scaling, Applicability and Uncertainty)

    International Nuclear Information System (INIS)

    Wilson, G.E.; Boyack, B.E.

    1989-01-01

    Best Estimate computer codes have been accepted by the U.S. Nuclear Regulatory Commission as an optional tool for performing safety analysis related to the licensing and regulation of current nuclear reactors producing commercial electrical power, provided their uncertainty is quantified. In support of this policy change, the NRC and its contractors and consultants have developed and demonstrated an uncertainty quantification methodology called CSAU. The primary use of the CSAU methodology is to quantify safety margins for existing designs; however, the methodology can also serve an equally important role in advanced reactor research for plants not yet built. This paper describes the CSAU methodology, at the generic process level, and provides the general principles whereby it may be applied to evaluations of advanced reactor designs.

  5. Not Normal: the uncertainties of scientific measurements

    Science.gov (United States)

    Bailey, David C.

    2017-01-01

    Judging the significance and reproducibility of quantitative research requires a good understanding of relevant uncertainties, but it is often unclear how well these have been evaluated and what they imply. Reported scientific uncertainties were studied by analysing 41 000 measurements of 3200 quantities from medicine, nuclear and particle physics, and interlaboratory comparisons ranging from chemistry to toxicology. Outliers are common, with 5σ disagreements up to five orders of magnitude more frequent than naively expected. Uncertainty-normalized differences between multiple measurements of the same quantity are consistent with heavy-tailed Student's t-distributions that are often almost Cauchy, far from a Gaussian Normal bell curve. Medical research uncertainties are generally as well evaluated as those in physics, but physics uncertainty improves more rapidly, making feasible simple significance criteria such as the 5σ discovery convention in particle physics. Contributions to measurement uncertainty from mistakes and unknown problems are not completely unpredictable. Such errors appear to have power-law distributions consistent with how designed complex systems fail, and how unknown systematic errors are constrained by researchers. This better understanding may help improve analysis and meta-analysis of data, and help scientists and the public have more realistic expectations of what scientific results imply.
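
    The scale of the reported effect is easy to reproduce: for the heaviest-tailed case mentioned (an almost-Cauchy Student's t, i.e. one degree of freedom), a 5-sigma disagreement is roughly five orders of magnitude more likely than under a Gaussian. A stdlib-only sketch:

```python
import math

def normal_tail(k):
    """Two-sided P(|Z| > k) for a standard Normal, via the complementary
    error function: P = erfc(k / sqrt(2))."""
    return math.erfc(k / math.sqrt(2.0))

def cauchy_tail(k):
    """Two-sided P(|X| > k) for a standard Cauchy distribution
    (Student's t with 1 degree of freedom): P = 1 - (2/pi) * atan(k)."""
    return 1.0 - (2.0 / math.pi) * math.atan(k)

# How much more frequent 5-sigma outliers are under the heavy-tailed model
ratio = cauchy_tail(5.0) / normal_tail(5.0)
```

    The ratio comes out near 2 x 10^5, consistent with the paper's "up to five orders of magnitude" figure for the most extreme cases.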

  6. Implicit knowledge of visual uncertainty guides decisions with asymmetric outcomes

    DEFF Research Database (Denmark)

    Whiteley, Louise Emma; Sahani, Maneesh

    2008-01-01

    under conditions of uncertainty. Here we show that human observers performing a simple visual choice task under an externally imposed loss function approach the optimal strategy, as defined by Bayesian probability and decision theory (Berger, 1985; Cox, 1961). In concert with earlier work, this suggests...... are pre-existing, widespread, and can be propagated to decision-making areas of the brain....... that observers possess a model of their internal uncertainty and can utilize this model in the neural computations that underlie their behavior (Knill & Pouget, 2004). In our experiment, optimal behavior requires that observers integrate the loss function with an estimate of their internal uncertainty rather...

  7. A pseudo-statistical approach to treat choice uncertainty: the example of partitioning allocation methods

    NARCIS (Netherlands)

    Mendoza Beltran, A.; Heijungs, R.; Guinée, J.; Tukker, A.

    2016-01-01

    Purpose: Despite efforts to treat uncertainty due to methodological choices in life cycle assessment (LCA) such as standardization, one-at-a-time (OAT) sensitivity analysis, and analytical and statistical methods, no method exists that propagates this source of uncertainty for all relevant processes

  8. Developing scales measuring disorder-specific intolerance of uncertainty (DSIU) : a new perspective on transdiagnostic

    NARCIS (Netherlands)

    Thibodeau, Michel A; Carleton, R Nicholas; McEvoy, Peter M; Zvolensky, Michael J; Brandt, Charles P; Boelen, Paul A; Mahoney, Alison E J; Deacon, Brett J; Asmundson, Gordon J G

    Intolerance of uncertainty (IU) is a construct of growing prominence in literature on anxiety disorders and major depressive disorder. Existing measures of IU do not define the uncertainty that respondents perceive as distressing. To address this limitation, we developed eight scales measuring

  9. Interpretations of alternative uncertainty representations in a reliability and risk analysis context

    International Nuclear Information System (INIS)

    Aven, T.

    2011-01-01

    Probability is the predominant tool used to measure uncertainties in reliability and risk analyses. However, other representations also exist, including imprecise (interval) probability, fuzzy probability and representations based on the theories of evidence (belief functions) and possibility. Many researchers in the field are strong proponents of these alternative methods, but some are also sceptical. In this paper, we address one basic requirement set for quantitative measures of uncertainty: the interpretation needed to explain what an uncertainty number expresses. We question to what extent the various measures meet this requirement. Comparisons are made with probabilistic analysis, where uncertainty is represented by subjective probabilities, using either a betting interpretation or a reference to an uncertainty standard interpretation. By distinguishing between chances (expressing variation) and subjective probabilities, new insights are gained into the link between the alternative uncertainty representations and probability.

  10. Exploring the implication of climate process uncertainties within the Earth System Framework

    Science.gov (United States)

    Booth, B.; Lambert, F. H.; McNeal, D.; Harris, G.; Sexton, D.; Boulton, C.; Murphy, J.

    2011-12-01

    Uncertainties in the magnitude of future climate change have been a focus of a great deal of research. Much of the work with General Circulation Models has focused on the atmospheric response to changes in atmospheric composition, while other processes remain outside these frameworks. Here we introduce an ensemble of new simulations, based on an Earth System configuration of HadCM3C, designed to explore uncertainties in both physical (atmospheric, oceanic and aerosol physics) and carbon cycle processes, using perturbed-parameter approaches previously used to explore atmospheric uncertainty. Framed in the context of the climate response to future changes in emissions, the resultant future projections represent significantly broader uncertainty than existing concentration-driven GCM assessments. The systematic nature of the ensemble design enables interactions between components to be explored. For example, we show how metrics of physical processes (such as climate sensitivity) are also influenced by carbon cycle parameters. The suggestion from this work is that carbon cycle processes contribute as much to uncertainty in future climate projections as the more conventionally explored atmospheric feedbacks. The broad range of climate responses explored within these ensembles, rather than representing a reason for inaction, provides information on lower-likelihood but high-impact changes. For example, while the majority of these simulations suggest that future Amazon forest extent is resilient to the projected climate changes, a small number simulate dramatic forest dieback. This ensemble represents a framework to examine these risks, breaking them down into physical processes (such as ocean temperature drivers of rainfall change) and vegetation processes (where uncertainties point towards requirements for new observational constraints).

  11. Illness uncertainty and treatment motivation in type 2 diabetes patients.

    Science.gov (United States)

    Apóstolo, João Luís Alves; Viveiros, Catarina Sofia Castro; Nunes, Helena Isabel Ribeiro; Domingues, Helena Raquel Faustino

    2007-01-01

    To characterize uncertainty in illness and motivation for treatment, and to evaluate the relation between these variables in individuals with type 2 diabetes. A descriptive, correlational study, using a sample of 62 individuals in diabetes consultation sessions. The Uncertainty Stress Scale and the Treatment Self-Regulation Questionnaire were used. The individuals with type 2 diabetes presented low levels of uncertainty in illness and high motivation for treatment, with stronger intrinsic than extrinsic motivation. A negative correlation was found between uncertainty about prognosis and treatment and intrinsic motivation. These individuals are already adapted, acting according to the meanings they attribute to the illness. Uncertainty can function as a threat, interfering negatively with the attribution of meaning to illness-related events and with the process of adaptation and motivation to adhere to treatment. Intrinsic motivation seems to be essential for treatment adherence.

  12. How uncertainty in socio-economic variables affects large-scale transport model forecasts

    DEFF Research Database (Denmark)

    Manzo, Stefano; Nielsen, Otto Anker; Prato, Carlo Giacomo

    2015-01-01

    A strategic task assigned to large-scale transport models is to forecast the demand for transport over long periods of time to assess transport projects. However, because they model complex systems, transport models have an inherent uncertainty which increases over time. As a consequence, the longer...... the period forecasted the less reliable is the forecasted model output. Describing uncertainty propagation patterns over time is therefore important in order to provide complete information to the decision makers. Among the existing literature only few studies analyze uncertainty propagation patterns over...

  13. Reusable launch vehicle model uncertainties impact analysis

    Science.gov (United States)

    Chen, Jiaye; Mu, Rongjun; Zhang, Xin; Deng, Yanpeng

    2018-03-01

    Reusable launch vehicles (RLVs) typically combine a complex aerodynamic shape with a coupled propulsion system, and their flight environment is highly complicated and intensely changeable. The model therefore carries large uncertainty, which makes the nominal system quite different from the real system. Studying the influence of these uncertainties on the stability of the control system is thus of great significance for controller design. In order to improve the performance of RLVs, this paper proposes an approach for analyzing the influence of model uncertainties. Coupled dynamic and kinematic models are built for a typical RLV. The factors that introduce uncertainty during modelling are then analyzed and summarized, and the uncertainties are expressed with an additive uncertainty model. The maximum singular value of the uncertainty matrix is chosen as the boundary model, and the norm of the uncertainty matrix is used to show how much each uncertainty factor influences the stability of the control system. The simulation results illustrate that the inertial factors have the largest influence on the stability of the system, and that it is necessary to take the model uncertainties into consideration before designing the controller of this kind of aircraft (such as an RLV).
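
    The stability check sketched in this record, bounding the additive uncertainty by the largest singular value of the uncertainty matrix and comparing it against the nominal loop gain (a small-gain style argument), can be illustrated as follows; the matrix entries and the peak gain are invented numbers, not RLV data, and the 2x2 closed form stands in for a general SVD.

```python
import math

def max_singular_value_2x2(a, b, c, d):
    """Largest singular value of [[a, b], [c, d]], computed from the
    eigenvalues of A^T A (closed form for the 2x2 case, so no
    linear-algebra library is needed)."""
    p = a * a + c * c          # (A^T A)[0][0]
    q = a * b + c * d          # (A^T A)[0][1] = (A^T A)[1][0]
    r = b * b + d * d          # (A^T A)[1][1]
    tr, det = p + r, p * r - q * q
    lam_max = 0.5 * (tr + math.sqrt(max(tr * tr - 4.0 * det, 0.0)))
    return math.sqrt(lam_max)

# Additive uncertainty model: G_true = G_nominal + Delta.  A small-gain
# style check (sketch): the loop stays stable if sigma_max(Delta) is below
# the reciprocal of the nominal peak gain.  Illustrative numbers only.
delta_bound = max_singular_value_2x2(0.10, 0.02, 0.01, 0.08)
nominal_peak_gain = 4.0                    # hypothetical worst-case nominal gain
robustly_stable = delta_bound < 1.0 / nominal_peak_gain
```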

  14. Uncertainty and sampling issues in tank characterization

    International Nuclear Information System (INIS)

    Liebetrau, A.M.; Pulsipher, B.A.; Kashporenko, D.M.

    1997-06-01

    A defensible characterization strategy must recognize that uncertainties are inherent in any measurement or estimate of interest and must employ statistical methods for quantifying and managing those uncertainties. Estimates of risk and therefore key decisions must incorporate knowledge about uncertainty. This report focuses on statistical methods that should be employed to ensure confident decision making and appropriate management of uncertainty. Sampling is a major source of uncertainty that deserves special consideration in the tank characterization strategy. The question of whether sampling will ever provide the reliable information needed to resolve safety issues is explored. The issue of sample representativeness must be resolved before sample information is reliable. Representativeness is a relative term but can be defined in terms of bias and precision. Currently, precision can be quantified and managed through an effective sampling and statistical analysis program. Quantifying bias is more difficult and is not being addressed under the current sampling strategies. Bias could be bounded by (1) employing new sampling methods that can obtain samples from other areas in the tanks, (2) putting in new risers on some worst case tanks and comparing the results from existing risers with new risers, or (3) sampling tanks through risers under which no disturbance or activity has previously occurred. With some bound on bias and estimates of precision, various sampling strategies could be determined and shown to be either cost-effective or infeasible.

  15. Gaussian Process Interpolation for Uncertainty Estimation in Image Registration

    Science.gov (United States)

    Wachinger, Christian; Golland, Polina; Reuter, Martin; Wells, William

    2014-01-01

    Intensity-based image registration requires resampling images on a common grid to evaluate the similarity function. The uncertainty of interpolation varies across the image, depending on the location of resampled points relative to the base grid. We propose to perform Bayesian inference with Gaussian processes, where the covariance matrix of the Gaussian process posterior distribution estimates the uncertainty in interpolation. The Gaussian process replaces a single image with a distribution over images that we integrate into a generative model for registration. Marginalization over resampled images leads to a new similarity measure that includes the uncertainty of the interpolation. We demonstrate that our approach increases the registration accuracy and propose an efficient approximation scheme that enables seamless integration with existing registration methods. PMID:25333127
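
    The core idea, that interpolation uncertainty depends on where a resampled point falls relative to the base grid, can be reproduced with a minimal one-dimensional GP posterior; the RBF kernel, length scale and noise level below are illustrative choices, not the paper's settings, and the two-point conditioning keeps the matrix inverse in closed form.

```python
import math

def rbf(x1, x2, ell=1.0):
    """Squared-exponential (RBF) covariance between two scalar inputs."""
    return math.exp(-0.5 * ((x1 - x2) / ell) ** 2)

def gp_posterior_variance(x_star, x1, x2, noise=1e-6, ell=1.0):
    """Posterior variance of a zero-mean GP with RBF kernel, conditioned on
    two base-grid samples at x1 and x2 (the 2x2 inverse is written out)."""
    k11 = rbf(x1, x1, ell) + noise
    k22 = rbf(x2, x2, ell) + noise
    k12 = rbf(x1, x2, ell)
    det = k11 * k22 - k12 * k12
    k1s, k2s = rbf(x1, x_star, ell), rbf(x2, x_star, ell)
    # quadratic form k_*^T K^{-1} k_* via the explicit 2x2 inverse
    quad = (k22 * k1s * k1s - 2.0 * k12 * k1s * k2s + k11 * k2s * k2s) / det
    return rbf(x_star, x_star, ell) - quad

var_on_grid = gp_posterior_variance(0.0, 0.0, 1.0)   # resampled point on a grid node
var_between = gp_posterior_variance(0.5, 0.0, 1.0)   # resampled point between nodes
```

    The variance nearly vanishes on a grid node and grows between nodes, which is exactly the spatially varying interpolation uncertainty the registration model exploits.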

  16. Understanding uncertainty

    CERN Document Server

    Lindley, Dennis V

    2013-01-01

    Praise for the First Edition: "...a reference for everyone who is interested in knowing and handling uncertainty." -Journal of Applied Statistics The critically acclaimed First Edition of Understanding Uncertainty provided a study of uncertainty addressed to scholars in all fields, showing that uncertainty could be measured by probability, and that probability obeyed three basic rules that enabled uncertainty to be handled sensibly in everyday life. These ideas were extended to embrace the scientific method and to show how decisions, containing an uncertain element, could be rationally made.

  17. Uncertainty evaluation in normalization of isotope delta measurement results against international reference materials.

    Science.gov (United States)

    Meija, Juris; Chartrand, Michelle M G

    2018-01-01

    Isotope delta measurements are normalized against international reference standards. Although multi-point normalization is becoming a standard practice, the existing uncertainty evaluation practices are either undocumented or incomplete. For multi-point normalization, we present errors-in-variables regression models for explicit accounting of the measurement uncertainty of the international standards along with the uncertainty that is attributed to their assigned values. This manuscript presents a framework to account for the uncertainty that arises due to a small number of replicate measurements and discusses multi-laboratory data reduction while accounting for inevitable correlations between the laboratories due to the use of identical reference materials for calibration. Both frequentist and Bayesian methods of uncertainty analysis are discussed.
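
    A bare-bones version of multi-point normalization (two anchor standards, ordinary linear mapping, none of the errors-in-variables treatment the paper develops) looks like this; the numbers are hypothetical, loosely in the style of a VSMOW/SLAP-type two-point anchoring.

```python
def two_point_normalization(m1, a1, m2, a2, m_sample):
    """Map a measured delta value onto the reference scale using two anchor
    standards: measured values m1, m2 with assigned values a1, a2.
    Simple linear normalization; the uncertainty of the anchors themselves
    is ignored here, which is exactly the gap the paper addresses."""
    slope = (a2 - a1) / (m2 - m1)
    return a1 + slope * (m_sample - m1)

# Hypothetical numbers: standards measured at +0.3 and -55.1 per mil against
# assigned values of 0.0 and -55.5 per mil; sample measured at -10.2 per mil.
delta_normalized = two_point_normalization(0.3, 0.0, -55.1, -55.5, -10.2)
```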

  18. Deterministic uncertainty analysis

    International Nuclear Information System (INIS)

    Worley, B.A.

    1987-12-01

    This paper presents a deterministic uncertainty analysis (DUA) method for calculating uncertainties that has the potential to significantly reduce the number of computer runs compared to conventional statistical analysis. The method is based upon the availability of derivative and sensitivity data such as that calculated using the well-known direct or adjoint sensitivity analysis techniques. Formation of response surfaces using derivative data and the propagation of input probability distributions are discussed relative to their role in the DUA method. A sample problem that models the flow of water through a borehole is used as a basis to compare the cumulative distribution function of the flow rate as calculated by the standard statistical methods and the DUA method. Propagation of uncertainties by the DUA method is compared for ten cases in which the number of reference model runs was varied from one to ten. The DUA method gives a more accurate representation of the true cumulative distribution of the flow rate based upon as few as two model executions compared to fifty model executions using a statistical approach. 16 refs., 4 figs., 5 tabs
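
    The borehole flow problem used in this report is a standard UQ test function, so the comparison can be sketched directly: first-order (derivative-based) propagation of input variances versus plain Monte Carlo. Central finite differences stand in for the direct/adjoint sensitivities, and the nominal values and standard deviations below are illustrative, not those of the report.

```python
import math, random

def borehole(rw, Hu, Hl, L, r=3000.0, Tu=89000.0, Tl=89.0, Kw=10950.0):
    """Classic borehole water-flow test function (output in m^3/yr)."""
    lnr = math.log(r / rw)
    frac = 2.0 * L * Tu / (lnr * rw ** 2 * Kw)
    return 2.0 * math.pi * Tu * (Hu - Hl) / (lnr * (1.0 + frac + Tu / Tl))

nominal = {"rw": 0.10, "Hu": 1050.0, "Hl": 760.0, "L": 1400.0}
sigma = {"rw": 0.01, "Hu": 20.0, "Hl": 20.0, "L": 100.0}

# DUA-style first-order propagation: Var(f) ~= sum_i (df/dx_i)^2 * Var(x_i)
var_f = 0.0
for name, s in sigma.items():
    step = 1e-4 * nominal[name]
    hi = dict(nominal); hi[name] += step
    lo = dict(nominal); lo[name] -= step
    deriv = (borehole(**hi) - borehole(**lo)) / (2.0 * step)
    var_f += (deriv * s) ** 2
sd_dua = math.sqrt(var_f)

# Monte Carlo reference with independent Gaussian inputs
rng = random.Random(0)
vals = [borehole(**{k: rng.gauss(nominal[k], sigma[k]) for k in nominal})
        for _ in range(4000)]
mean = sum(vals) / len(vals)
sd_mc = math.sqrt(sum((v - mean) ** 2 for v in vals) / (len(vals) - 1))
```

    For this mildly nonlinear case the two standard deviations agree closely, while the derivative route needs only a handful of model evaluations, which is the report's central point.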

  19. An uncertainty inventory demonstration - a primary step in uncertainty quantification

    Energy Technology Data Exchange (ETDEWEB)

    Langenbrunner, James R. [Los Alamos National Laboratory; Booker, Jane M [Los Alamos National Laboratory; Hemez, Francois M [Los Alamos National Laboratory; Salazar, Issac F [Los Alamos National Laboratory; Ross, Timothy J [UNM

    2009-01-01

    Tools, methods, and theories for assessing and quantifying uncertainties vary by application. Uncertainty quantification tasks have unique desiderata and circumstances. To realistically assess uncertainty requires the engineer/scientist to specify mathematical models, the physical phenomena of interest, and the theory or framework for assessments. For example, Probabilistic Risk Assessment (PRA) specifically identifies uncertainties using probability theory, and therefore, PRAs lack formal procedures for quantifying uncertainties that are not probabilistic. The Phenomena Identification and Ranking Technique (PIRT) proceeds by ranking phenomena using scoring criteria that result in linguistic descriptors, such as importance ranked with words, 'High/Medium/Low.' The use of words allows PIRT to be flexible, but the analysis may then be difficult to combine with other uncertainty theories. We propose that a necessary step for the development of a procedure or protocol for uncertainty quantification (UQ) is the application of an Uncertainty Inventory. An Uncertainty Inventory should be considered and performed in the earliest stages of UQ.

  20. Recognizing and responding to uncertainty: a grounded theory of nurses' uncertainty.

    Science.gov (United States)

    Cranley, Lisa A; Doran, Diane M; Tourangeau, Ann E; Kushniruk, Andre; Nagle, Lynn

    2012-08-01

    There has been little research to date exploring nurses' uncertainty in their practice. Understanding nurses' uncertainty is important because it has potential implications for how care is delivered. The purpose of this study is to develop a substantive theory to explain how staff nurses experience and respond to uncertainty in their practice. Between 2006 and 2008, a grounded theory study was conducted that included in-depth semi-structured interviews. Fourteen staff nurses working in adult medical-surgical intensive care units at two teaching hospitals in Ontario, Canada, participated in the study. The theory recognizing and responding to uncertainty characterizes the processes through which nurses' uncertainty manifested and how it was managed. Recognizing uncertainty involved the processes of assessing, reflecting, questioning, and/or being unable to predict aspects of the patient situation. Nurses' responses to uncertainty highlighted the cognitive-affective strategies used to manage uncertainty. Study findings highlight the importance of acknowledging uncertainty and having collegial support to manage uncertainty. The theory adds to our understanding of the processes involved in recognizing uncertainty, the strategies and outcomes of managing uncertainty, and the influencing factors. Tailored nursing education programs should be developed to assist nurses in developing skills in articulating and managing their uncertainty. Further research is needed to extend, test and refine the theory of recognizing and responding to uncertainty to develop strategies for managing uncertainty. This theory advances the nursing perspective of uncertainty in clinical practice. The theory is relevant to nurses who are faced with uncertainty and complex clinical decisions, to managers who support nurses in their clinical decision-making, and to researchers who investigate ways to improve decision-making and care delivery. ©2012 Sigma Theta Tau International.

  1. Uncertainty quantification metrics for whole product life cycle cost estimates in aerospace innovation

    Science.gov (United States)

    Schwabe, O.; Shehab, E.; Erkoyuncu, J.

    2015-08-01

    The lack of defensible methods for quantifying cost estimate uncertainty over the whole product life cycle of aerospace innovations such as propulsion systems or airframes poses a significant challenge to the creation of accurate and defensible cost estimates. Based on the axiomatic definition of uncertainty as the actual prediction error of the cost estimate, this paper provides a comprehensive overview of metrics used for the uncertainty quantification of cost estimates based on a literature review, an evaluation of publicly funded projects such as part of the CORDIS or Horizon 2020 programs, and an analysis of established approaches used by organizations such as NASA, the U.S. Department of Defense, the ESA, and various commercial companies. The metrics are categorized based on their foundational character (foundations), their use in practice (state-of-practice), their availability for practice (state-of-art) and those suggested for future exploration (state-of-future). Insights gained were that a variety of uncertainty quantification metrics exist whose suitability depends on the volatility of available relevant information, as defined by technical and cost readiness level, and the number of whole product life cycle phases the estimate is intended to be valid for. Information volatility and number of whole product life cycle phases can hereby be considered as defining multi-dimensional probability fields admitting various uncertainty quantification metric families with identifiable thresholds for transitioning between them. The key research gaps identified were the lack of theoretically grounded guidance for the selection of uncertainty quantification metrics and the lack of practical alternatives to metrics based on the Central Limit Theorem. An innovative uncertainty quantification framework consisting of a set-theory-based typology, a data library, a classification system, and a corresponding input-output model is put forward to address this research gap as the basis

  2. pyNSMC: A Python Module for Null-Space Monte Carlo Uncertainty Analysis

    Science.gov (United States)

    White, J.; Brakefield, L. K.

    2015-12-01

    The null-space Monte Carlo technique is a non-linear uncertainty analysis technique that is well suited to high-dimensional inverse problems. While the technique is powerful, the existing workflow for completing null-space Monte Carlo is cumbersome, requiring the use of multiple command-line utilities, several sets of intermediate files and even a text editor. pyNSMC is an open-source Python module that automates the workflow of null-space Monte Carlo uncertainty analysis. The module is fully compatible with the PEST and PEST++ software suites and leverages existing functionality of pyEMU, a Python framework for linear-based uncertainty analyses. pyNSMC greatly simplifies the existing workflow for null-space Monte Carlo by taking advantage of object-oriented design facilities in Python. The core of pyNSMC is the ensemble class, which draws and stores realized random vectors and also provides functionality for exporting and visualizing results. By relieving users of the tedium associated with file handling and command-line utility execution, pyNSMC instead focuses the user on the important steps and assumptions of null-space Monte Carlo analysis. Furthermore, pyNSMC facilitates learning through flow charts and results visualization, which are available at many points in the algorithm. The ease-of-use of the pyNSMC workflow is compared to the existing workflow for null-space Monte Carlo for a synthetic groundwater model with hundreds of estimable parameters.
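
    The essence of null-space Monte Carlo, perturbing calibrated parameters only in directions the observations cannot see, fits in a few lines for a toy 2-observation, 3-parameter problem. The cross-product shortcut for the null-space basis works only in this tiny case; pyNSMC and PEST compute it from an SVD of the full Jacobian, and the numbers here are purely illustrative.

```python
import random

def cross(u, v):
    """Cross product of two 3-vectors."""
    return (u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0])

# Toy linearized inverse problem: 2 observations, 3 parameters -> 1-D null space.
J = ((1.0, 2.0, 0.0),
     (0.0, 1.0, 1.0))
nv = cross(J[0], J[1])                         # spans the null space of J
norm = sum(c * c for c in nv) ** 0.5
null_dir = tuple(c / norm for c in nv)

theta_cal = (1.0, 1.0, 1.0)                    # calibrated parameter vector
rng = random.Random(0)
ensemble = []
for _ in range(50):
    z = rng.gauss(0.0, 0.5)                    # random step along the null space
    ensemble.append(tuple(t + z * d for t, d in zip(theta_cal, null_dir)))

# Every realization reproduces the calibrated observations (to first order)
max_obs_change = max(
    abs(sum(jij * (p - t) for jij, p, t in zip(row, params, theta_cal)))
    for params in ensemble for row in J)
```

    The ensemble spreads widely in parameter space while leaving the simulated observations essentially unchanged, which is exactly what makes the realizations candidates for calibration-constrained uncertainty analysis.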

  3. Managing Uncertainty in Water Infrastructure Design Using Info-gap Robustness

    Science.gov (United States)

    Irias, X.; Cicala, D.

    2013-12-01

    Info-gap theory, a tool for managing deep uncertainty, can be of tremendous value for design of water systems in areas of high seismic risk. Maintaining reliable water service in those areas is subject to significant uncertainties including uncertainty of seismic loading, unknown seismic performance of infrastructure, uncertain costs of innovative seismic-resistant construction, unknown costs to repair seismic damage, unknown societal impacts from downtime, and more. Practically every major earthquake that strikes a population center reveals additional knowledge gaps. In situations of such deep uncertainty, info-gap can offer advantages over traditional approaches, whether deterministic approaches that use empirical safety factors to address the uncertainties involved, or probabilistic methods that attempt to characterize various stochastic properties and target a compromise between cost and reliability. The reason is that in situations of deep uncertainty, it may not be clear what safety factor would be reasonable, or even if any safety factor is sufficient to address the uncertainties, and we may lack data to characterize the situation probabilistically. Info-gap is a tool that recognizes up front that our best projection of the future may be wrong. Thus, rather than seeking a solution that is optimal for that projection, info-gap seeks a solution that works reasonably well for all plausible conditions. In other words, info-gap seeks solutions that are robust in the face of uncertainty. Info-gap has been used successfully across a wide range of disciplines including climate change science, project management, and structural design. EBMUD is currently using info-gap to help it gain insight into possible solutions for providing reliable water service to an island community within its service area. The island, containing about 75,000 customers, is particularly vulnerable to water supply disruption from earthquakes, since it has negligible water storage and is
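
    The info-gap robustness function described here can be sketched in a few lines: grow the uncertainty horizon until the worst case inside the envelope violates the performance requirement, and compare designs by how much surprise they tolerate. The demand/capacity model and all numbers below are invented for illustration and have no connection to EBMUD's analysis.

```python
def robustness(capacity, nominal_demand, required_margin, h_step=0.01, h_max=10.0):
    """Info-gap robustness (sketch): the largest uncertainty horizon h such
    that the margin requirement holds for *every* demand in the fractional
    envelope |d - nominal| <= h * nominal."""
    h = 0.0
    while h < h_max:
        worst_demand = nominal_demand * (1.0 + h + h_step)
        if capacity - worst_demand < required_margin:
            return h                           # next step would break the requirement
        h += h_step
    return h_max

# Two candidate designs: the larger one costs more but tolerates more surprise
h_small = robustness(capacity=120.0, nominal_demand=100.0, required_margin=5.0)
h_large = robustness(capacity=150.0, nominal_demand=100.0, required_margin=5.0)
```

    Rather than optimizing against one projected demand, the decision-maker compares h_small and h_large against the extra cost of the larger design, which is the info-gap trade-off between performance and robustness.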

  4. Capital flight and the uncertainty of government policies

    NARCIS (Netherlands)

    Hermes, N.; Lensink, R.

    2000-01-01

    This paper shows that policy uncertainty, measured by the uncertainty of budget deficits, tax payments, government consumption and the inflation rate, has a statistically significant positive impact on capital flight. This result remains robust after having applied stability tests.

  5. Uncertainty of Modal Parameters Estimated by ARMA Models

    DEFF Research Database (Denmark)

    Jensen, Jacob Laigaard; Brincker, Rune; Rytter, Anders

    1990-01-01

    In this paper the uncertainties of identified modal parameters such as eigenfrequencies and damping ratios are assessed. From the measured response of dynamically excited structures the modal parameters may be identified and provide important structural knowledge. However the uncertainty of the parameters...... by a simulation study of a lightly damped single-degree-of-freedom system. Identification by ARMA models has been chosen as the system identification method. It is concluded that both the sampling interval and the number of sampled points may play a significant role with respect to the statistical errors. Furthermore......, it is shown that the model errors may also contribute significantly to the uncertainty....
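
    The identification step can be illustrated end-to-end for a noise-free case: simulate the free decay of a lightly damped SDOF system, fit an AR(2) model (the AR part of the ARMA family suffices without noise), and recover the eigenfrequency and damping ratio from the discrete pole. The values chosen here (2 Hz, 2% damping, 100 Hz sampling) are arbitrary and not from the paper's simulation study.

```python
import math, cmath

# Simulate free decay of a lightly damped SDOF system (f = 2 Hz, zeta = 2%)
f_true, zeta_true, dt = 2.0, 0.02, 0.01
wn = 2.0 * math.pi * f_true
wd = wn * math.sqrt(1.0 - zeta_true ** 2)
x = [math.exp(-zeta_true * wn * n * dt) * math.cos(wd * n * dt)
     for n in range(500)]

# Fit x[n] = a1*x[n-1] + a2*x[n-2] by least squares (2x2 normal equations)
s11 = sum(v * v for v in x[1:-1])
s22 = sum(v * v for v in x[:-2])
s12 = sum(x[n - 1] * x[n - 2] for n in range(2, len(x)))
b1 = sum(x[n] * x[n - 1] for n in range(2, len(x)))
b2 = sum(x[n] * x[n - 2] for n in range(2, len(x)))
det = s11 * s22 - s12 * s12
a1 = (s22 * b1 - s12 * b2) / det
a2 = (s11 * b2 - s12 * b1) / det

# Discrete pole of z^2 - a1*z - a2 -> continuous eigenfrequency and damping
lam = (a1 + cmath.sqrt(a1 * a1 + 4.0 * a2)) / 2.0
mu = cmath.log(lam) / dt                       # mu = -zeta*wn + i*wd
f_id = abs(mu) / (2.0 * math.pi)
zeta_id = -mu.real / abs(mu)
```

    With noise added and a finite record length, f_id and zeta_id become random variables, and their scatter across repeated simulations is the uncertainty the paper studies as a function of sampling interval and record length.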

  6. Can Future Uncertainty Keep Children Out of School?

    DEFF Research Database (Denmark)

    Lilleør, Helene Bie

    that uncertainty about future returns results in a need for risk diversification, that children function as old-age security providers when there are no available pension systems, that the human capital investment decision of one child is likely to be influenced by that of his/her siblings, and that rural parents...... face a choice of investing in either specific or general human capital of their children. In this paper, I investigate the effects of future income uncertainty on the joint human capital investment decision of children in a household. I develop and calibrate a simple illustrative human capital...... portfolio model and show that existing levels of uncertainty can indeed result in less than full school enrolment within a household, even in a world of perfect credit markets. The paper thus offers an alternative explanation for why it might be optimal for rural parents not to send all of their children...

  7. Horsetail matching: a flexible approach to optimization under uncertainty

    Science.gov (United States)

    Cook, L. W.; Jarrett, J. P.

    2018-04-01

    It is important to design engineering systems to be robust with respect to uncertainties in the design process. Often, this is done by considering statistical moments, but over-reliance on statistical moments when formulating a robust optimization can produce designs that are stochastically dominated by other feasible designs. This article instead proposes a formulation for optimization under uncertainty that minimizes the difference between a design's cumulative distribution function and a target. A standard target is proposed that produces stochastically non-dominated designs, but the formulation also offers enough flexibility to recover existing approaches for robust optimization. A numerical implementation is developed that employs kernels to give a differentiable objective function. The method is applied to algebraic test problems and a robust transonic airfoil design problem where it is compared to multi-objective, weighted-sum and density matching approaches to robust optimization; several advantages over these existing methods are demonstrated.
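
    The core idea, minimizing a mismatch between the design's CDF and a target, can be sketched with a kernel-smoothed empirical CDF. This is a toy reconstruction, not the article's implementation: the quantity of interest `q`, the target, and all numbers are assumptions for illustration.

```python
import numpy as np
from scipy.special import expit
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(1)
u = rng.normal(0.0, 1.0, 200)   # fixed uncertainty samples (common random numbers)

def q(x, u):
    """Illustrative quantity of interest for design variable x (assumed)."""
    return (x - 1.0) ** 2 + 0.5 * x * u

def smooth_cdf(qs, t, h=0.1):
    """Sigmoid-kernel smoothed empirical CDF -> differentiable in the samples."""
    return np.mean(expit((t[:, None] - qs[None, :]) / h), axis=1)

t = np.linspace(-3.0, 3.0, 101)
target = (t > 0.0).astype(float)   # idealised target CDF: all mass at q <= 0

def objective(x):
    return np.mean((smooth_cdf(q(x, u), t) - target) ** 2)

res = minimize_scalar(objective, bounds=(-3.0, 3.0), method="bounded")
print(res.x, objective(res.x))
```

    Minimizing this mismatch trades off the mean of `q` against its spread, which is how a CDF-matching formulation recovers robust designs without relying on individual statistical moments.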

  8. Capital flight and the uncertainty of government policies

    NARCIS (Netherlands)

    Hermes, C.L.M.; Lensink, B.W.

    This paper shows that policy uncertainty, measured by the uncertainty of budget deficits, tax payments, government consumption and the inflation rate, has a statistically significant positive impact on capital flight. This result remains robust after applying stability tests.

  9. A practical sensitivity analysis method for ranking sources of uncertainty in thermal–hydraulics applications

    Energy Technology Data Exchange (ETDEWEB)

    Pourgol-Mohammad, Mohammad, E-mail: pourgolmohammad@sut.ac.ir [Department of Mechanical Engineering, Sahand University of Technology, Tabriz (Iran, Islamic Republic of); Hoseyni, Seyed Mohsen [Department of Basic Sciences, East Tehran Branch, Islamic Azad University, Tehran (Iran, Islamic Republic of); Hoseyni, Seyed Mojtaba [Building & Housing Research Center, Tehran (Iran, Islamic Republic of); Sepanloo, Kamran [Nuclear Science and Technology Research Institute, Tehran (Iran, Islamic Republic of)

    2016-08-15

    Highlights: • Existing uncertainty ranking methods prove inconsistent for TH applications. • Introduction of a new method for ranking sources of uncertainty in TH codes. • Modified PIRT qualitatively identifies and ranks uncertainty sources more precisely. • The importance of parameters is calculated by a limited number of TH code executions. • Methodology is applied successfully on the LOFT-LB1 test facility. - Abstract: In application to thermal–hydraulic calculations by system codes, sensitivity analysis plays an important role in managing the uncertainties of code output and risk analysis. Sensitivity analysis is also used to confirm the results of the qualitative Phenomena Identification and Ranking Table (PIRT). Several methodologies have been developed to address uncertainty importance assessment. Generally, uncertainty importance measures, mainly devised for Probabilistic Risk Assessment (PRA) applications, are not affordable for the computationally demanding calculations of complex thermal–hydraulics (TH) system codes. In other words, for effective quantification of the degree of the contribution of each phenomenon to the total uncertainty of the output, a practical approach is needed given the high computational burden of TH calculations. This study aims primarily to show the inefficiency of the existing approaches and then introduces a solution to cope with the challenges in this area by modifying the variance-based uncertainty importance method. Important parameters are first identified qualitatively by the modified PIRT approach; their uncertainty importance is then quantified by a local derivative index. The proposed index is attractive from a practical point of view for TH applications. It is capable of calculating the importance of parameters with a limited number of TH code executions. Application of the proposed methodology is demonstrated on the LOFT-LB1 test facility.
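
    A local derivative index of the kind described can be sketched as one-sided finite differences around the best estimate, scaled by each input's standard deviation, so ranking n parameters costs n+1 code runs. The model and all numbers below are illustrative stand-ins, not the paper's TH code:

```python
import numpy as np

def model(x):
    """Stand-in for an expensive TH code run (illustrative algebraic model)."""
    return x[0] ** 2 + 3.0 * x[1] + 0.1 * x[2]

x0 = np.array([1.0, 2.0, 3.0])        # best-estimate inputs (assumed)
sigma = np.array([0.1, 0.05, 1.0])    # input standard deviations (assumed)

def derivative_importance(model, x0, sigma, rel_step=1e-6):
    """|dY/dX_i| * sigma_i from n+1 model evaluations (one-sided differences)."""
    y0 = model(x0)
    imp = np.empty(len(x0))
    for i in range(len(x0)):
        x = x0.copy()
        h = rel_step * max(abs(x0[i]), 1.0)
        x[i] += h
        imp[i] = abs(model(x) - y0) / h * sigma[i]
    return imp

imp = derivative_importance(model, x0, sigma)
print(np.argsort(imp)[::-1])   # ranking, most important parameter first
```

    The key practical point mirrors the abstract: the ranking cost grows linearly in the number of parameters, not combinatorially.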

  10. A practical sensitivity analysis method for ranking sources of uncertainty in thermal–hydraulics applications

    International Nuclear Information System (INIS)

    Pourgol-Mohammad, Mohammad; Hoseyni, Seyed Mohsen; Hoseyni, Seyed Mojtaba; Sepanloo, Kamran

    2016-01-01

    Highlights: • Existing uncertainty ranking methods prove inconsistent for TH applications. • Introduction of a new method for ranking sources of uncertainty in TH codes. • Modified PIRT qualitatively identifies and ranks uncertainty sources more precisely. • The importance of parameters is calculated by a limited number of TH code executions. • Methodology is applied successfully on the LOFT-LB1 test facility. - Abstract: In application to thermal–hydraulic calculations by system codes, sensitivity analysis plays an important role in managing the uncertainties of code output and risk analysis. Sensitivity analysis is also used to confirm the results of the qualitative Phenomena Identification and Ranking Table (PIRT). Several methodologies have been developed to address uncertainty importance assessment. Generally, uncertainty importance measures, mainly devised for Probabilistic Risk Assessment (PRA) applications, are not affordable for the computationally demanding calculations of complex thermal–hydraulics (TH) system codes. In other words, for effective quantification of the degree of the contribution of each phenomenon to the total uncertainty of the output, a practical approach is needed given the high computational burden of TH calculations. This study aims primarily to show the inefficiency of the existing approaches and then introduces a solution to cope with the challenges in this area by modifying the variance-based uncertainty importance method. Important parameters are first identified qualitatively by the modified PIRT approach; their uncertainty importance is then quantified by a local derivative index. The proposed index is attractive from a practical point of view for TH applications. It is capable of calculating the importance of parameters with a limited number of TH code executions. Application of the proposed methodology is demonstrated on the LOFT-LB1 test facility.

  11. Implications of nuclear data uncertainties to reactor design

    International Nuclear Information System (INIS)

    Greebler, P.; Hutchins, B.A.; Cowan, C.L.

    1970-01-01

    Uncertainties in nuclear data require significant allowances to be made in the design and the operating conditions of reactor cores and of shielded-reactor-plant and fuel-processing systems. These allowances result in direct cost increases due to overdesign of components and equipment and reduced core and fuel operating performance. Compromising the allowances for data uncertainties has indirect cost implications due to increased risks of failure to meet plant and fuel performance objectives, with warranties involved in some cases, and to satisfy licensed safety requirements. Fast breeders are the power reactors most sensitive to the uncertainties in nuclear data over the neutron energy range of interest for fission reactors, and this paper focuses on the implications of the data uncertainties for design and operation of fast breeder reactors and fuel-processing systems. The current status of uncertainty in predicted physics parameters due to data uncertainties is reviewed and compared with the situation in 1966 and that projected for within the next two years due to anticipated data improvements. Implications of the uncertainties in the predicted physics parameters for design and operation are discussed for both a near-term prototype or demonstration breeder plant (∼300 MW(e)) and a longer-term large (∼1000 MW(e)) plant. Significant improvements in the nuclear data have been made during the past three years, the most important of these to fast power reactors being the 239Pu alpha below 15 keV. The most important remaining specific data uncertainties are illustrated by their individual contributions to the computational uncertainty of selected physics parameters, and recommended priorities and accuracy requirements for improved data are presented.

  12. Changes in Rectal Dose Due to Alterations in Beam Angles for Setup Uncertainty and Range Uncertainty in Carbon-Ion Radiotherapy for Prostate Cancer.

    Directory of Open Access Journals (Sweden)

    Yoshiki Kubota

    Carbon-ion radiotherapy of prostate cancer is challenging in patients with metal implants in one or both hips. Problems can be circumvented by using fields at oblique angles. To evaluate the influence of the setup and range uncertainties accompanying oblique field angles, we calculated rectal dose changes with oblique orthogonal field angles, using a device with fixed fields at 0° and 90° and a rotating patient couch. Dose distributions were calculated at the standard angles of 0° and 90°, and then at 30° and 60°. Setup uncertainty was simulated with changes from -2 mm to +2 mm for fields in the anterior-posterior, left-right, and cranial-caudal directions, and dose changes from range uncertainty were calculated with a 1 mm water-equivalent path length added to the target isocenter at each angle. The dose distributions for the passive irradiation method were calculated using the K2 dose algorithm. The rectal volumes with 0°, 30°, 60°, and 90° field angles at 95% of the prescription dose were 3.4±0.9 cm3, 2.8±1.1 cm3, 2.2±0.8 cm3, and 3.8±1.1 cm3, respectively. As compared with 90° fields, 30° and 60° fields had significant advantages regarding setup uncertainty and significant disadvantages regarding range uncertainty, but overall were not significantly different from the 90° fields regarding setup and range uncertainties. The setup and range uncertainties calculated at 30° and 60° field angles were not associated with a significant change in rectal dose relative to those at 90°.

  13. Determination of the reference air kerma rate for 192Ir brachytherapy sources and the related uncertainty

    International Nuclear Information System (INIS)

    Dijk, Eduard van; Kolkman-Deurloo, Inger-Karine K.; Damen, Patricia M. G.

    2004-01-01

    Different methods exist to determine the air kerma calibration factor of an ionization chamber for the spectrum of a 192Ir high-dose-rate (HDR) or pulsed-dose-rate (PDR) source. An analysis of two methods to obtain such a calibration factor was performed: (i) the method recommended by [Goetsch et al., Med. Phys. 18, 462-467 (1991)] and (ii) the method employed by the Dutch national standards institute NMi [Petersen et al., Report S-EI-94.01 (NMi, Delft, The Netherlands, 1994)]. This analysis showed a systematic difference on the order of 1% in the determination of the strength of 192Ir HDR and PDR sources depending on the method used for determining the air kerma calibration factor. The definitive significance of the difference between these methods can only be addressed after performing an accurate analysis of the associated uncertainties. For an NE 2561 (or equivalent) ionization chamber and an in-air jig, a typical uncertainty budget of 0.94% was found with the NMi method. The largest contribution in the type-B uncertainty is the uncertainty in the air kerma calibration factor for isotope i, N_k^i, as determined by the primary or secondary standards laboratories. This uncertainty is dominated by the uncertainties in the physical constants for the average mass-energy absorption coefficient ratio and the stopping power ratios. This means that it is not foreseeable that the standards laboratories can decrease the uncertainty in the air kerma calibration factors for ionization chambers in the short term. When the results of the determination of the 192Ir reference air kerma rates in, e.g., different institutes are compared, the uncertainties in the physical constants are the same. To compare the applied techniques, the ratio of the results can be judged by leaving out the uncertainties due to these physical constants. In that case an uncertainty budget of 0.40% (coverage factor=2) should be taken into account. Due to the differences in approach between the
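
    The quoted budgets (0.94%, and 0.40% with coverage factor 2) are combinations of individual standard uncertainties in quadrature, per the GUM convention. A generic sketch of such a type-B combination, with placeholder component names and values rather than the NMi budget:

```python
import numpy as np

# Illustrative budget entries (relative standard uncertainties, %);
# the labels and values are placeholders, not the paper's budget.
components = {
    "N_k calibration factor": 0.40,
    "chamber positioning": 0.15,
    "air density correction": 0.10,
    "timer and current measurement": 0.08,
}

u_c = np.sqrt(sum(u ** 2 for u in components.values()))  # combined standard uncertainty
U = 2.0 * u_c                                            # expanded, coverage factor k = 2
print(round(u_c, 3), round(U, 3))
```

    Dropping the shared physical-constant terms from `components` before combining is exactly how the abstract arrives at the smaller 0.40% budget for comparing techniques between institutes.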

  14. Probabilistic risk assessment for new and existing chemicals: Example calculations

    NARCIS (Netherlands)

    Jager T; Hollander HA den; Janssen GB; Poel P van der; Rikken MGJ; Vermeire TG; ECO; CSR; LAE; CSR

    2000-01-01

    In the risk assessment methods for new and existing chemicals in the EU, "risk" is characterised by means of the deterministic quotient of exposure and effects (PEC/PNEC or Margin of Safety). From a scientific viewpoint, the uncertainty in the risk quotient should be accounted for explicitly in the

  15. NASA/DOD Aerospace Knowledge Diffusion Research Project. Report 15: Technical uncertainty and project complexity as correlates of information use by US industry-affiliated aerospace engineers and scientists: Results of an exploratory investigation

    Science.gov (United States)

    Pinelli, Thomas E.; Glassman, Nanci A.; Affelder, Linda O.; Hecht, Laura M.; Kennedy, John M.; Barclay, Rebecca O.

    1993-01-01

    An exploratory study was conducted that investigated the influence of technical uncertainty and project complexity on information use by U.S. industry-affiliated aerospace engineers and scientists. The study utilized survey research in the form of a self-administered mail questionnaire. U.S. aerospace engineers and scientists on the Society of Automotive Engineers (SAE) mailing list served as the study population. The adjusted response rate was 67 percent. The survey instrument is appendix C to this report. Statistically significant relationships were found to exist between technical uncertainty, project complexity, and information use. Statistically significant relationships were found to exist between technical uncertainty, project complexity, and the use of federally funded aerospace R&D. The results of this investigation are relevant to researchers investigating information-seeking behavior of aerospace engineers. They are also relevant to R&D managers and policy planners concerned with transferring the results of federally funded aerospace R&D to the U.S. aerospace industry.

  16. Investment choice under uncertainty: A review essay

    Directory of Open Access Journals (Sweden)

    Trifunović Dejan

    2005-01-01

    An investment opportunity whose return is perfectly predictable hardly exists at all. Instead, the investor makes his decisions under conditions of uncertainty. The theory of expected utility is the main analytical tool for describing choice under uncertainty. Critics of the theory contend that individuals have bounded rationality and that the theory of expected utility is not correct. When agents are faced with risky decisions they behave differently, conditional on their attitude towards risk: they can be risk loving, risk averse or risk neutral. In order to make an investment decision it is necessary to compare probability distribution functions of returns. Investment decision making is much simpler if one uses expected values and variances instead of probability distribution functions.

  17. Quantification of margins and uncertainties: Alternative representations of epistemic uncertainty

    International Nuclear Information System (INIS)

    Helton, Jon C.; Johnson, Jay D.

    2011-01-01

    In 2001, the National Nuclear Security Administration of the U.S. Department of Energy in conjunction with the national security laboratories (i.e., Los Alamos National Laboratory, Lawrence Livermore National Laboratory and Sandia National Laboratories) initiated development of a process designated Quantification of Margins and Uncertainties (QMU) for the use of risk assessment methodologies in the certification of the reliability and safety of the nation's nuclear weapons stockpile. A previous presentation, 'Quantification of Margins and Uncertainties: Conceptual and Computational Basis,' describes the basic ideas that underlie QMU and illustrates these ideas with two notional examples that employ probability for the representation of aleatory and epistemic uncertainty. The current presentation introduces and illustrates the use of interval analysis, possibility theory and evidence theory as alternatives to the use of probability theory for the representation of epistemic uncertainty in QMU-type analyses. The following topics are considered: the mathematical structure of alternative representations of uncertainty, alternative representations of epistemic uncertainty in QMU analyses involving only epistemic uncertainty, and alternative representations of epistemic uncertainty in QMU analyses involving a separation of aleatory and epistemic uncertainty. Analyses involving interval analysis, possibility theory and evidence theory are illustrated with the same two notional examples used in the presentation indicated above to illustrate the use of probability to represent aleatory and epistemic uncertainty in QMU analyses.

  18. Uncertainty analysis guide

    International Nuclear Information System (INIS)

    Andres, T.H.

    2002-05-01

    This guide applies to the estimation of uncertainty in quantities calculated by scientific, analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner by which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)
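
    Of the propagation options the guide lists (exact calculation, series approximation, Monte Carlo), the Monte Carlo route is the most general. A minimal sketch with an assumed toy model and assumed input distributions, not the guide's own examples:

```python
import numpy as np

rng = np.random.default_rng(42)

def code_output(k, q):
    """Stand-in for a deterministic computer model (illustrative)."""
    return q / k   # e.g. a temperature rise for conductance k and load q

n = 10_000
k = rng.normal(5.0, 0.25, n)    # uncertain inputs sampled from assumed
q = rng.normal(100.0, 5.0, n)   # probability distributions

y = code_output(k, q)           # propagate each input sample through the model
lo, hi = np.percentile(y, [2.5, 97.5])   # 95% uncertainty interval on the output
print(y.mean(), y.std(), lo, hi)
```

    The resulting interval is the "random" part of the combined uncertainty; the systematic part, as the report notes, must be estimated separately from the model's simplifications.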

  19. Uncertainty analysis guide

    Energy Technology Data Exchange (ETDEWEB)

    Andres, T.H

    2002-05-01

    This guide applies to the estimation of uncertainty in quantities calculated by scientific, analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner by which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)

  20. Uncertainty Aware Structural Topology Optimization Via a Stochastic Reduced Order Model Approach

    Science.gov (United States)

    Aguilo, Miguel A.; Warner, James E.

    2017-01-01

    This work presents a stochastic reduced order modeling strategy for the quantification and propagation of uncertainties in topology optimization. Uncertainty aware optimization problems can be computationally complex due to the substantial number of model evaluations that are necessary to accurately quantify and propagate uncertainties. This computational complexity is greatly magnified if a high-fidelity, physics-based numerical model is used for the topology optimization calculations. Stochastic reduced order model (SROM) methods are applied here to effectively 1) alleviate the prohibitive computational cost associated with an uncertainty aware topology optimization problem; and 2) quantify and propagate the inherent uncertainties due to design imperfections. A generic SROM framework that transforms the uncertainty aware, stochastic topology optimization problem into a deterministic optimization problem that relies only on independent calls to a deterministic numerical model is presented. This approach facilitates the use of existing optimization and modeling tools to accurately solve the uncertainty aware topology optimization problems in a fraction of the computational demand required by Monte Carlo methods. Finally, an example in structural topology optimization is presented to demonstrate the effectiveness of the proposed uncertainty aware structural topology optimization approach.
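
    The essence of an SROM is to replace a full input distribution by a small set of samples with optimized weights, so the downstream model needs only m deterministic runs. A minimal one-dimensional sketch under assumed choices (toy input distribution, CDF-error objective only; a full SROM also matches moments and, in higher dimensions, correlations):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(7)
data = np.sort(rng.lognormal(0.0, 0.5, 5000))    # full input uncertainty (assumed)

m = 10
x = np.quantile(data, (np.arange(m) + 0.5) / m)  # fixed SROM sample locations
F_true = np.searchsorted(data, x, side="right") / len(data)

def cdf_error(w):
    """Mismatch between the SROM CDF (cumsum of weights) and the empirical CDF."""
    return np.sum((np.cumsum(w) - F_true) ** 2)

cons = ({"type": "eq", "fun": lambda w: w.sum() - 1.0},)
res = minimize(cdf_error, np.full(m, 1.0 / m), bounds=[(0.0, 1.0)] * m,
               constraints=cons)
w = res.x

# Propagate through a deterministic model with only m evaluations
g = lambda v: v ** 2
print(np.sum(w * g(x)), np.mean(g(data)))   # SROM estimate vs full Monte Carlo
```

    The deterministic optimization over weights is what lets existing deterministic solvers and optimizers be reused, at a fraction of the Monte Carlo cost.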

  1. An uncertainty importance measure using a distance metric for the change in a cumulative distribution function

    International Nuclear Information System (INIS)

    Chun, Moon-Hyun; Han, Seok-Jung; Tak, Nam-IL

    2000-01-01

    A simple measure of uncertainty importance using the entire change of cumulative distribution functions (CDFs) has been developed for use in probabilistic safety assessments (PSAs). The entire change of CDFs is quantified in terms of the metric distance between two CDFs. The metric distance measure developed in this study reflects the relative impact of distributional changes of inputs on the change of an output distribution, while most of the existing uncertainty importance measures reflect the magnitude of the relative contribution of input uncertainties to the output uncertainty. The present measure has been evaluated analytically for various analytical distributions to examine its characteristics. To illustrate the applicability and strength of the present measure, two examples are provided. The first example is an application of the present measure to a typical problem of a system fault tree analysis and the second one is for a hypothetical non-linear model. Comparisons of the present result with those obtained by existing uncertainty importance measures show that the metric distance measure is a useful tool to express the measure of uncertainty importance in terms of the relative impact of distributional changes of inputs on the change of an output distribution.
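
    Such a measure can be sketched as the area between the nominal output CDF and the output CDF obtained after changing one input's distribution; a larger area means that input's distributional assumptions matter more. The model, distributions, and the specific distributional change below are illustrative assumptions, not the paper's examples:

```python
import numpy as np

rng = np.random.default_rng(3)

def model(a, b):
    """Illustrative output of two uncertain inputs (assumed)."""
    return a + 2.0 * b

n = 20_000
base = model(rng.normal(0, 1, n), rng.normal(0, 1, n))

def metric_distance(y1, y2):
    """Area between two empirical CDFs (an L1 metric distance)."""
    grid = np.linspace(min(y1.min(), y2.min()), max(y1.max(), y2.max()), 400)
    F1 = np.searchsorted(np.sort(y1), grid) / len(y1)
    F2 = np.searchsorted(np.sort(y2), grid) / len(y2)
    return np.mean(np.abs(F1 - F2)) * (grid[-1] - grid[0])

# Halve each input's spread in turn and measure how far the output CDF moves
d_a = metric_distance(base, model(rng.normal(0, 0.5, n), rng.normal(0, 1, n)))
d_b = metric_distance(base, model(rng.normal(0, 1, n), rng.normal(0, 0.5, n)))
print(d_a, d_b)   # the larger distance identifies the more important input
```

    Here input b dominates through its coefficient, so shrinking its distribution moves the output CDF further, which is what the distance-based ranking captures.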

  2. Incorporating outcome uncertainty and prior outcome beliefs in stated preferences

    DEFF Research Database (Denmark)

    Lundhede, Thomas; Jacobsen, Jette Bredahl; Hanley, Nick

    2015-01-01

    Stated preference studies tell respondents that policies create environmental changes with varying levels of uncertainty. However, respondents may include their own a priori assessments of uncertainty when making choices among policy options. Using a choice experiment eliciting respondents......’ preferences for conservation policies under climate change, we find that higher outcome uncertainty reduces utility. When accounting for endogeneity, we find that prior beliefs play a significant role in this cost of uncertainty. Thus, merely stating “objective” levels of outcome uncertainty...

  3. Analyzing climate change impacts on water resources under uncertainty using an integrated simulation-optimization approach

    Science.gov (United States)

    Zhuang, X. W.; Li, Y. P.; Nie, S.; Fan, Y. R.; Huang, G. H.

    2018-01-01

    An integrated simulation-optimization (ISO) approach is developed for assessing climate change impacts on water resources. In the ISO, uncertainties presented as both interval numbers and probability distributions can be reflected. Moreover, ISO permits in-depth analyses of various policy scenarios that are associated with different levels of economic consequences when the promised water-allocation targets are violated. A snowmelt-precipitation-driven watershed (Kaidu watershed) in northwest China is selected as the study case for demonstrating the applicability of the proposed method. Results of the meteorological projections disclose increasing trends in temperature (e.g., minimum and maximum values) and precipitation. Results also reveal that (i) the system uncertainties would significantly affect the water resources allocation pattern (including target and shortage); (ii) water shortage would increase from 2016 to 2070; and (iii) the more the inflow amount decreases, the higher the estimated water shortage rates become. The ISO method is useful for evaluating climate change impacts within a watershed system with complicated uncertainties and for helping identify appropriate water resources management strategies hedging against drought.

  4. Communicating spatial uncertainty to non-experts using R

    Science.gov (United States)

    Luzzi, Damiano; Sawicka, Kasia; Heuvelink, Gerard; de Bruin, Sytze

    2016-04-01

    Effective visualisation methods are important for the efficient use of uncertainty information by various groups of users. Uncertainty propagation analysis is often used with spatial environmental models to quantify the uncertainty within the information. A challenge arises when trying to effectively communicate the uncertainty information to non-experts (not statisticians) in a wide range of cases. Due to the growing popularity and applicability of the open source programming language R, we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. The package implements Monte Carlo algorithms for uncertainty propagation, the output of which is represented by an ensemble of model outputs (i.e. a sample from a probability distribution). Numerous visualisation methods exist that aim to present such spatial uncertainty information statically, dynamically and interactively. To provide the most universal visualisation tools for non-experts, we conducted a survey among a group of 20 university students and assessed the effectiveness of selected static and interactive methods for visualising uncertainty in spatial variables such as DEM and land cover. The static methods included adjacent maps and glyphs for continuous variables. Both allow for displaying maps with information about the ensemble mean, variance/standard deviation and prediction intervals. Adjacent maps were also used for categorical data, displaying maps of the most probable class, as well as its associated probability. The interactive methods included a graphical user interface, which in addition to displaying the previously mentioned variables also allowed for comparison of joint uncertainties at multiple locations. The survey indicated that users could understand the basics of the uncertainty information displayed in the static maps, with the interactive interface allowing for more in-depth information. Subsequently, the R
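
    The adjacent-map layers described (ensemble mean, standard deviation, prediction interval) are simple summaries over a Monte Carlo ensemble. A synthetic sketch of computing those layers; the R package from the abstract is not used here, and the grid, field and noise level are assumptions:

```python
import numpy as np

rng = np.random.default_rng(11)

# Synthetic stand-in for a Monte Carlo ensemble of a spatial variable:
# 100 realisations of a 50x50 grid (e.g. a simulated DEM with additive error).
xx, yy = np.meshgrid(np.linspace(0, 1, 50), np.linspace(0, 1, 50))
truth = 100.0 + 10.0 * xx + 5.0 * yy
ensemble = truth + rng.normal(0.0, 2.0, (100, 50, 50))

mean_map = ensemble.mean(axis=0)                               # adjacent map 1
sd_map = ensemble.std(axis=0)                                  # adjacent map 2
lo_map, hi_map = np.percentile(ensemble, [2.5, 97.5], axis=0)  # 95% interval maps
print(mean_map.shape, float(sd_map.mean()))
```

    Displaying `mean_map` next to `sd_map` (or `lo_map`/`hi_map`) is the adjacent-maps scheme the survey evaluated for continuous variables.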

  5. Compilation of information on uncertainties involved in deposition modeling

    International Nuclear Information System (INIS)

    Lewellen, W.S.; Varma, A.K.; Sheng, Y.P.

    1985-04-01

    The current generation of dispersion models contains very simple parameterizations of deposition processes. The analysis here looks at the physical mechanisms governing these processes in an attempt to see if more valid parameterizations are available and what level of uncertainty is involved in either these simple parameterizations or any more advanced parameterization. The report is composed of three parts. The first, on dry deposition model sensitivity, provides an estimate of the uncertainty existing in current estimates of the deposition velocity due to uncertainties in independent variables such as meteorological stability, particle size, surface chemical reactivity and canopy structure. The range of uncertainty estimated for an appropriate dry deposition velocity for a plume generated by a nuclear power plant accident is three orders of magnitude. The second part discusses the uncertainties involved in precipitation scavenging rates for effluents resulting from a nuclear reactor accident. The conclusion is that major uncertainties are involved both as a result of the natural variability of the atmospheric precipitation process and due to our incomplete understanding of the underlying process. The third part involves a review of the important problems associated with modeling the interaction between the atmosphere and a forest. It gives an indication of the magnitude of the problem involved in modeling dry deposition in such environments. Separate abstracts were prepared for each part and are contained in the EDB.

  6. Uncertainty analyses of infiltration and subsurface flow and transport for SDMP sites

    International Nuclear Information System (INIS)

    Meyer, P.D.; Rockhold, M.L.; Gee, G.W.

    1997-09-01

    US Nuclear Regulatory Commission staff have identified a number of sites requiring special attention in the decommissioning process because of elevated levels of radioactive contaminants. Traits common to many of these sites include limited data characterizing the subsurface, the presence of long-lived radionuclides necessitating a long-term analysis (1,000 years or more), and potential exposure through multiple pathways. As a consequence of these traits, the uncertainty in predicted exposures can be significant. In addition, simplifications to the physical system and the transport mechanisms are often necessary to reduce the computational requirements of the analysis. Several multiple-pathway transport codes exist for estimating dose, two of which were used in this study. These two codes have built-in Monte Carlo simulation capabilities that were used for the uncertainty analysis. Several tools for improving uncertainty analyses of exposure estimates through the groundwater pathway have been developed and are discussed in this report. Generic probability distributions for unsaturated and saturated zone soil hydraulic parameters are presented. A method is presented to combine the generic distributions with site-specific water retention data using a Bayesian analysis. The resulting updated soil hydraulic parameter distributions can be used to obtain an updated estimate of the probability distribution of dose. The method is illustrated using a hypothetical decommissioning site.
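
    Combining a generic parameter distribution with site-specific data has a closed form in the conjugate normal case, which is enough to sketch the Bayesian step described. All numbers are assumed (the report treats water-retention parameters; a generic log-conductivity stands in here):

```python
import numpy as np

# Generic (prior) distribution for a soil hydraulic parameter, e.g. log10 of
# saturated conductivity; mean and sd are illustrative, not the report's values.
mu0, sd0 = -5.0, 1.0

# Site-specific measurements, assumed normal about the true value
site = np.array([-4.2, -4.6, -4.4])
sd_meas = 0.5

# Conjugate normal-normal update (known measurement variance)
prec = 1.0 / sd0 ** 2 + len(site) / sd_meas ** 2
mu_post = (mu0 / sd0 ** 2 + site.sum() / sd_meas ** 2) / prec
sd_post = prec ** -0.5
print(mu_post, sd_post)
```

    The posterior mean sits between the generic mean and the site data mean, and the posterior spread is narrower than either source alone; sampling this updated distribution in the Monte Carlo run yields the updated dose distribution.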

  7. Incorporating the effects of socioeconomic uncertainty into priority setting for conservation investment.

    Science.gov (United States)

    McBride, Marissa F; Wilson, Kerrie A; Bode, Michael; Possingham, Hugh P

    2007-12-01

    Uncertainty in the implementation and outcomes of conservation actions that is not accounted for leaves conservation plans vulnerable to potential changes in future conditions. We used a decision-theoretic approach to investigate the effects of two types of investment uncertainty on the optimal allocation of global conservation resources for land acquisition in the Mediterranean Basin. We considered uncertainty about (1) whether investment will continue and (2) whether the acquired biodiversity assets are secure, which we termed transaction uncertainty and performance uncertainty, respectively. We also developed and tested the robustness of different rules of thumb for guiding the allocation of conservation resources when these sources of uncertainty exist. In the presence of uncertainty in future investment ability (transaction uncertainty), the optimal strategy was opportunistic, meaning the investment priority should be to act where uncertainty is highest while investment remains possible. When there was a probability that investments would fail (performance uncertainty), the optimal solution became a complex trade-off between the immediate biodiversity benefits of acting in a region and the perceived longevity of the investment. In general, regions were prioritized for investment when they had the greatest performance certainty, even if an alternative region was highly threatened or had higher biodiversity value. The improved performance of rules of thumb when accounting for uncertainty highlights the importance of explicitly incorporating sources of investment uncertainty and evaluating potential conservation investments in the context of their likely long-term success.
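
    The trade-off in the abstract can be caricatured with two one-step scores: the expected payoff of acting now (discounted by performance uncertainty) and the expected loss from waiting (driven by transaction uncertainty). All regions, benefits, and probabilities below are hypothetical, and the scores are deliberate simplifications of the paper's decision-theoretic model:

```python
import numpy as np

# Hypothetical regions: benefit if the acquisition holds, probability the
# investment opportunity survives to the next period (transaction), and
# probability the acquired asset stays secure (performance).
benefit = np.array([6.0, 8.0, 10.0])
p_transact = np.array([0.4, 0.9, 0.95])
p_secure = np.array([0.5, 0.7, 0.95])

expected_now = benefit * p_secure               # value of acting immediately
loss_if_wait = expected_now * (1 - p_transact)  # value expected to vanish by waiting

print(np.argmax(expected_now), np.argmax(loss_if_wait))
```

    Ranking by `expected_now` favours the performance-certain region (index 2), while ranking by `loss_if_wait` is the opportunistic rule: act first where the chance to act may soon disappear (index 0), mirroring the abstract's two findings.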

  8. On entropic uncertainty relations in the presence of a minimal length

    Science.gov (United States)

    Rastegin, Alexey E.

    2017-07-01

    Entropic uncertainty relations for the position and momentum within the generalized uncertainty principle are examined. Studies of this principle are motivated by the existence of a minimal observable length. Then the position and momentum operators satisfy the modified commutation relation, for which more than one algebraic representation is known. One of them is described by auxiliary momentum so that the momentum and coordinate wave functions are connected by the Fourier transform. However, the probability density functions of the physically true and auxiliary momenta are different. As the corresponding entropies differ, known entropic uncertainty relations are changed. Using differential Shannon entropies, we give a state-dependent formulation with correction term. State-independent uncertainty relations are obtained in terms of the Rényi entropies and the Tsallis entropies with binning. Such relations allow one to take into account a finiteness of measurement resolution.
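    For orientation, the unmodified entropic uncertainty relation that such studies generalize is the Beckner–Białynicki-Birula–Mycielski bound on differential Shannon entropies, while the minimal-length scenario modifies the position–momentum commutator; one common convention (the specific beta-form below is a standard choice in the literature, not quoted from this abstract) is:

```latex
% Unmodified entropic uncertainty relation (differential Shannon entropies)
h(X) + h(P) \;\ge\; \ln(\pi e \hbar)

% Modified commutation relation with minimal-length parameter \beta > 0
[\hat{X}, \hat{P}] = i\hbar \left( 1 + \beta \hat{P}^{2} \right)
```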

  9. Impacts of Korea's Exchange Rate Uncertainty on Exports

    Directory of Open Access Journals (Sweden)

    Kwon Sik Kim

    2003-12-01

    This paper examines the effects of two types of uncertainty related to the real effective exchange rate (REER) in Korea on export trends. To decompose the uncertainty into two components, I propose an advanced generalized Markov switching model, as developed by Hamilton (1989) and then expanded by Kim and Kim (1996). The proposed model is useful in uncovering two sources of uncertainty: the permanent component of REER and the purely transitory component. The two types of uncertainty have different effects on export trends in Korea: the transitory component of REER has no effect on the export trend at the 5-percent significance level, but the permanent component does. In addition, the degree of uncertainty (low, medium, or high in the permanent component, and low, medium, or high in the transitory component of REER) also has different effects on export trends in Korea; only high uncertainty in the permanent component affects export trends. The results show that when the policy authority intends to prevent the shrinkage of exports due to deepening uncertainties in the foreign exchange market, the economic impacts of its intervention could differ according to the characteristics and degree of the uncertainties. They therefore imply that economic measures that fail to identify the sources of uncertainty properly may even impose economic costs.

  10. Ozone Decline and Recovery: The Significance of Uncertainties

    Science.gov (United States)

    Harris, N. R. P.

    2017-12-01

    Stratospheric ozone depletion has been one of the leading environmental issues of the last 40 years. It has required research scientists, industry and government to work together to address it successfully. Steps have been taken to reduce the emissions of ozone depleting substances (ODS) under successive revisions of the measures in the 30-year-old Montreal Protocol. These have led to a reduction in atmospheric ODS concentrations and so are expected, over time, to result in a reduction of chemical ozone depletion by ODS. This 'recovery' is being influenced by a number of other factors (natural variability, climate change, other changes in stratospheric chemistry), which make it hard to provide good, quantitative estimates of the impact of the recent ODS reductions on stratospheric ozone. In this presentation, I discuss how ozone trends were linked to ODS during the period of ozone depletion and during the recent period of 'recovery', i.e. before and after the peak in atmospheric ODS. It is important to be as rigorous as possible in order to give the public confidence in the advice provided through the scientific assessment process. We thus need to be as critical as possible of our analyses of the recent data, even though there is a strong expectation and hope on all sides that stratospheric ozone is recovering. I will describe in outline the main challenges that exist now and in the years ahead.

  11. THE PROPAGATION OF UNCERTAINTIES IN STELLAR POPULATION SYNTHESIS MODELING. II. THE CHALLENGE OF COMPARING GALAXY EVOLUTION MODELS TO OBSERVATIONS

    International Nuclear Information System (INIS)

    Conroy, Charlie; Gunn, James E.; White, Martin

    2010-01-01

    Models for the formation and evolution of galaxies readily predict physical properties such as star formation rates, metal-enrichment histories, and, increasingly, gas and dust content of synthetic galaxies. Such predictions are frequently compared to the spectral energy distributions of observed galaxies via the stellar population synthesis (SPS) technique. Substantial uncertainties in SPS exist, and yet their relevance to the task of comparing galaxy evolution models to observations has received little attention. In the present work, we begin to address this issue by investigating the importance of uncertainties in stellar evolution, the initial stellar mass function (IMF), and dust and interstellar medium (ISM) properties on the translation from models to observations. We demonstrate that these uncertainties translate into substantial uncertainties in the ultraviolet, optical, and near-infrared colors of synthetic galaxies. Aspects that carry significant uncertainties include the logarithmic slope of the IMF above 1 M⊙, the dust attenuation law, the molecular cloud disruption timescale, the clumpiness of the ISM, the fraction of unobscured starlight, and the treatment of advanced stages of stellar evolution including blue stragglers, the horizontal branch, and the thermally pulsating asymptotic giant branch. The interpretation of the resulting uncertainties in the derived colors is highly non-trivial because many of the uncertainties are likely systematic, and possibly correlated with the physical properties of galaxies. We therefore urge caution when comparing models to observations.

  12. The impact of inflation uncertainty on interest rates

    OpenAIRE

    Cheong, Chongcheul; Kim, Gi-Hong; Podivinsky, Jan M.

    2010-01-01

    In this paper, the impact of inflation uncertainty on interest rates is investigated for the case of the U.S. three-month Treasury bill rate. We emphasize how consistent OLS estimation can be applied to an empirical equation which includes a proxy variable for inflation uncertainty measured by an ARCH-type model. A significant negative relationship between the two variables is found. This evidence contrasts with the view of the inflation risk premium, in which inflation uncertainty positi...

  13. Do Orthopaedic Surgeons Acknowledge Uncertainty?

    Science.gov (United States)

    Teunis, Teun; Janssen, Stein; Guitton, Thierry G; Ring, David; Parisien, Robert

    2016-06-01

    R(2), 0.29). The relatively low levels of uncertainty among orthopaedic surgeons and confidence bias seem inconsistent with the paucity of definitive evidence. If patients want to be informed of the areas of uncertainty and surgeon-to-surgeon variation relevant to their care, it seems possible that a low recognition of uncertainty and surgeon confidence bias might hinder adequately informing patients, informed decisions, and consent. Moreover, limited recognition of uncertainty is associated with modifiable factors such as confidence bias, trust in orthopaedic evidence base, and statistical understanding. Perhaps improved statistical teaching in residency, journal clubs to improve the critique of evidence and awareness of bias, and acknowledgment of knowledge gaps at courses and conferences might create awareness about existing uncertainties. Level 1, prognostic study.

  14. A Peep into the Uncertainty-Complexity-Relevance Modeling Trilemma through Global Sensitivity and Uncertainty Analysis

    Science.gov (United States)

    Munoz-Carpena, R.; Muller, S. J.; Chu, M.; Kiker, G. A.; Perz, S. G.

    2014-12-01

    The model complexity resulting from the need to integrate environmental system components cannot be overstated. In particular, additional emphasis is urgently needed on rational approaches to guide decision making through the uncertainties surrounding the integrated system across decision-relevant scales. However, in spite of the difficulties that the consideration of modeling uncertainty represents for the decision process, it should not be avoided, or the value and science behind the models will be undermined. These two issues, i.e., the need for coupled models that can answer the pertinent questions and the need for models that do so with sufficient certainty, are the key indicators of a model's relevance. Model relevance is inextricably linked with model complexity. Although model complexity has advanced greatly in recent years, there has been little work to rigorously characterize the threshold of relevance in integrated and complex models. Formally assessing the relevance of the model in the face of increasing complexity would be valuable because there is growing unease among developers and users of complex models about the cumulative effects of various sources of uncertainty on model outputs. In particular, this issue has prompted doubt over whether the considerable effort going into further elaborating complex models will in fact yield the expected payback. New approaches have been proposed recently to evaluate the uncertainty-complexity-relevance modeling trilemma (Muller, Muñoz-Carpena and Kiker, 2011) by incorporating state-of-the-art global sensitivity and uncertainty analysis (GSA/UA) in every step of the model development, so as to quantify not only the uncertainty introduced by the addition of new environmental components but also the effect that these new components have on existing components (interactions, non-linear responses).
Outputs from the analysis can also be used to quantify system resilience (stability, alternative states, thresholds or tipping

  15. Quantum Action Principle with Generalized Uncertainty Principle

    OpenAIRE

    Gu, Jie

    2013-01-01

    One of the common features of all promising candidates for quantum gravity is the existence of a minimal length scale, which naturally emerges with a generalized uncertainty principle, or equivalently a modified commutation relation. Schwinger's quantum action principle was modified to incorporate this modification and was applied to the calculation of the kernel of a free particle, partly recovering the result previously obtained using the path integral.

  16. GENERAL RISKS AND UNCERTAINTIES OF REPORTING AND MANAGEMENT REPORTING RISKS

    Directory of Open Access Journals (Sweden)

    CAMELIA I. LUNGU

    2011-04-01

    Purpose: highlighting risk and uncertainty reporting based on a literature review. Objectives: the delimitation of models of risk and uncertainty management in fundamental research. Research method: a fundamental research study directed at identifying the relevant risk models presented in entities' financial statements. Uncertainty is one of the fundamental coordinates of our world. As J.K. Galbraith (1978) showed, the world now lives in the age of uncertainty. Moreover, we can say that contemporary society develops by taking decisions under uncertainty and, thus, risk. Growing concern for the study of uncertainty, its effects and precautions has led to the rather recent emergence of a new science, the science of hazards (Fr. les cindyniques) (Kervern, 1991). Current analyses of risk are dominated by Beck's (1992) notion that a risk society now exists, whereby we have become more concerned about our impact upon nature than about the impact of nature upon us. Clearly, risk permeates most aspects of corporate, but also of everyday, decision-making, and few can predict the future with any precision. Risk is almost always a major variable in real-world corporate decision-making, and managers who ignore it are in real peril. In these circumstances, a possible answer is assuming financial discipline with an appropriate system of incentives.

  17. Measurement uncertainty analysis techniques applied to PV performance measurements

    International Nuclear Information System (INIS)

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; demonstrates that quality assurance and quality control measures have been accomplished; and defines valid data as data having known and documented paths of: origin, including theory; measurements; traceability to measurement standards; computations; and uncertainty analysis of results.
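    A minimal sketch of such an analysis, using a hypothetical uncertainty budget for a PV module power measurement (every component name and value below is illustrative, not taken from this presentation):

```python
import math

# Hypothetical uncertainty budget for a PV module power measurement.
# Each entry: (source, standard uncertainty as a fraction of the reading).
budget = [
    ("irradiance reference cell calibration", 0.010),
    ("temperature correction",                0.004),
    ("I-V curve tracer accuracy",             0.005),
    ("spectral mismatch",                     0.007),
]

# Combined standard uncertainty: root-sum-of-squares of independent components.
u_c = math.sqrt(sum(u**2 for _, u in budget))

# Expanded uncertainty with coverage factor k = 2 (roughly a 95 % interval).
k = 2.0
U = k * u_c

p_measured = 250.0  # W, hypothetical reading
print(f"combined standard uncertainty: {100 * u_c:.2f} % of reading")
print(f"95 % interval: {p_measured:.1f} W ± {p_measured * U:.1f} W")
```

    A pre-test analysis of this kind immediately shows which component dominates (here, the reference cell calibration) and therefore where design changes pay off.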

  20. Uncertainty analysis in Monte Carlo criticality computations

    International Nuclear Information System (INIS)

    Qi Ao

    2011-01-01

    Highlights: ► Two types of uncertainty methods for k_eff Monte Carlo computations are examined. ► The sampling method has the fewest restrictions on perturbations but demands computing resources. ► The analytical method is limited to small perturbations of material properties. ► Practicality relies on efficiency, multiparameter applicability and data availability. - Abstract: Uncertainty analysis is imperative for nuclear criticality risk assessments when using Monte Carlo neutron transport methods to predict the effective neutron multiplication factor (k_eff) for fissionable material systems. For the validation of Monte Carlo codes for criticality computations against benchmark experiments, code accuracy and precision are measured by both the computational bias and the uncertainty in the bias. The uncertainty in the bias accounts for known or quantified experimental, computational and model uncertainties. For the application of Monte Carlo codes for criticality analysis of fissionable material systems, an administrative margin of subcriticality must be imposed to provide additional assurance of subcriticality for any unknown or unquantified uncertainties. Because of the substantial impact of the administrative margin of subcriticality on the economics and safety of nuclear fuel cycle operations, recent growing interest in reducing this margin makes uncertainty analysis in criticality safety computations more risk-significant. This paper provides an overview of the two most popular k_eff uncertainty analysis methods for Monte Carlo criticality computations: (1) sampling-based methods, and (2) analytical methods. Examples are given to demonstrate their usage in k_eff uncertainty analysis due to uncertainties in both neutronic and non-neutronic parameters of fissionable material systems.
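    A toy illustration of the sampling-based approach, with a stand-in function in place of a real Monte Carlo criticality code (the k_eff model and the input distributions are invented purely for illustration):

```python
import random
import statistics

random.seed(7)

# Stand-in for a Monte Carlo criticality code: a toy k_eff response in which
# k_eff depends linearly on fuel density and enrichment (illustrative only).
def keff(density, enrichment):
    return 0.6 * density / 10.0 + 8.0 * enrichment

# Uncertain inputs as (mean, standard deviation), e.g. from material specs.
density_dist = (10.0, 0.1)      # g/cm^3
enrich_dist = (0.045, 0.001)    # U-235 fraction

# Sampling-based uncertainty analysis: draw inputs, rerun the "code",
# and summarize the spread of the resulting k_eff values.
samples = [
    keff(random.gauss(*density_dist), random.gauss(*enrich_dist))
    for _ in range(5000)
]
k_mean = statistics.mean(samples)
k_std = statistics.stdev(samples)
print(f"k_eff = {k_mean:.4f} +/- {k_std:.4f} (1 sigma)")
```

    The same loop works unchanged for any number of uncertain parameters, which is why the sampling method places the fewest restrictions on the perturbations, at the cost of many code runs.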

  1. The method of belief scales as a means for dealing with uncertainty in tough regulatory decisions.

    Energy Technology Data Exchange (ETDEWEB)

    Pilch, Martin M.

    2005-10-01

    Modeling and simulation is playing an increasing role in supporting tough regulatory decisions, which are typically characterized by variabilities and uncertainties in the scenarios, input conditions, failure criteria, model parameters, and even model form. Variability exists when there is a statistically significant database that is fully relevant to the application. Uncertainty, on the other hand, is characterized by some degree of ignorance. A simple algebraic problem was used to illustrate how various risk methodologies address variability and uncertainty in a regulatory context. These traditional risk methodologies include probabilistic methods (including frequentist and Bayesian perspectives) and second-order methods where variabilities and uncertainties are treated separately. Representing uncertainties with (subjective) probability distributions and using probabilistic methods to propagate subjective distributions can lead to results that are not logically consistent with available knowledge and that may not be conservative. The Method of Belief Scales (MBS) is developed as a means to logically aggregate uncertain input information and to propagate that information through the model to a set of results that are scrutable, easily interpretable by the nonexpert, and logically consistent with the available input information. The MBS, particularly in conjunction with sensitivity analyses, has the potential to be more computationally efficient than other risk methodologies. The regulatory language must be tailored to the specific risk methodology if ambiguity and conflict are to be avoided.
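    The claim that propagating subjective distributions "may not be conservative" can be illustrated with a toy comparison of bounding (interval) propagation against Monte Carlo under uniform priors; the model and bounds below are hypothetical, not the report's algebraic problem:

```python
import random

random.seed(3)

# Toy model: output y = a + b, where a and b are epistemically uncertain and
# known only to lie in [0, 1] (values illustrative).
lo, hi = 0.0, 1.0

# Interval (bounding) propagation: y is guaranteed to lie in [0, 2].
y_lo, y_hi = lo + lo, hi + hi

# Probabilistic propagation with subjective uniform priors: the tails of
# y = a + b receive very little mass, so even a high percentile sits well
# below the guaranteed bound.
ys = sorted(
    random.uniform(lo, hi) + random.uniform(lo, hi) for _ in range(100000)
)
p99 = ys[int(0.99 * len(ys))]

print(f"guaranteed bounds: [{y_lo}, {y_hi}]")
print(f"99th percentile under uniform priors: {p99:.2f}")
```

    Treating ignorance about a and b as uniform randomness thus reports a 99th percentile near 1.86, noticeably inside the bound of 2.0 that the available knowledge actually supports.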

  2. Uncertainties on lung doses from inhaled plutonium.

    Science.gov (United States)

    Puncher, Matthew; Birchall, Alan; Bull, Richard K

    2011-10-01

    In a recent epidemiological study, Bayesian uncertainties on lung doses have been calculated to determine lung cancer risk from occupational exposures to plutonium. These calculations used a revised version of the Human Respiratory Tract Model (HRTM) published by the ICRP. In addition to the Bayesian analyses, which give probability distributions of doses, point estimates of doses (single estimates without uncertainty) were also provided for that study using the existing HRTM as it is described in ICRP Publication 66; these are to be used in a preliminary analysis of risk. To infer the differences between the point estimates and Bayesian uncertainty analyses, this paper applies the methodology to former workers of the United Kingdom Atomic Energy Authority (UKAEA), who constituted a subset of the study cohort. The resulting probability distributions of lung doses are compared with the point estimates obtained for each worker. It is shown that mean posterior lung doses are around two- to fourfold higher than point estimates and that uncertainties on doses vary over a wide range, greater than two orders of magnitude for some lung tissues. In addition, we demonstrate that uncertainties on the parameter values, rather than the model structure, are largely responsible for these effects. Of these it appears to be the parameters describing absorption from the lungs to blood that have the greatest impact on estimates of lung doses from urine bioassay. Therefore, accurate determination of the chemical form of inhaled plutonium and the absorption parameter values for these materials is important for obtaining reliable estimates of lung doses and hence risk from occupational exposures to plutonium.

  3. Nuclear Data Uncertainty Quantification: Past, Present and Future

    International Nuclear Information System (INIS)

    Smith, D.L.

    2015-01-01

    An historical overview is provided of the mathematical foundations of uncertainty quantification and the roles played in the more recent past by nuclear data uncertainties in nuclear data evaluations and nuclear applications. Significant advances that have established the mathematical framework for contemporary nuclear data evaluation methods, as well as the use of uncertainty information in nuclear data evaluation and nuclear applications, are described. This is followed by a brief examination of the current status concerning nuclear data evaluation methodology, covariance data generation, and the application of evaluated nuclear data uncertainties in contemporary nuclear technology. A few possible areas for future investigation of this subject are also suggested.

  5. IAEA CRP on HTGR Uncertainties in Modeling: Assessment of Phase I Lattice to Core Model Uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Rouxelin, Pascal Nicolas [Idaho National Lab. (INL), Idaho Falls, ID (United States); Strydom, Gerhard [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-09-01

    II 1a. The steady state core calculations were simulated with the INL coupled-code system known as the Parallel and Highly Innovative Simulation for INL Code System (PHISICS) and the system thermal-hydraulics code known as the Reactor Excursion and Leak Analysis Program (RELAP) 5 3D using the nuclear data libraries previously generated with NEWT. It was observed that significant differences in terms of multiplication factor and neutron flux exist between the various permutations of the Phase I super-cell lattice calculations. The use of these cross section libraries only leads to minor changes in the Phase II core simulation results for fresh fuel but shows significantly larger discrepancies for spent fuel cores. Furthermore, large incongruities were found between the SCALE NEWT and KENO VI results for the super cells, and while some trends could be identified, a final conclusion on this issue could not yet be reached. This report will be revised in mid 2016 with more detailed analyses of the super-cell problems and their effects on the core models, using the latest version of SCALE (6.2). The super-cell models seem to show substantial improvements in terms of neutron flux as compared to single-block models, particularly at thermal energies.

  6. The uncertainty of reference standards--a guide to understanding factors impacting uncertainty, uncertainty calculations, and vendor certifications.

    Science.gov (United States)

    Gates, Kevin; Chang, Ning; Dilek, Isil; Jian, Huahua; Pogue, Sherri; Sreenivasan, Uma

    2009-10-01

    Certified solution standards are widely used in forensic toxicological, clinical/diagnostic, and environmental testing. Typically, these standards are purchased as ampouled solutions with a certified concentration. Vendors present concentration and uncertainty differently on their Certificates of Analysis. Understanding the factors that impact uncertainty and which factors have been considered in the vendor's assignment of uncertainty are critical to understanding the accuracy of the standard and the impact on testing results. Understanding these variables is also important for laboratories seeking to comply with ISO/IEC 17025 requirements and for those preparing reference solutions from neat materials at the bench. The impact of uncertainty associated with the neat material purity (including residual water, residual solvent, and inorganic content), mass measurement (weighing techniques), and solvent addition (solution density) on the overall uncertainty of the certified concentration is described along with uncertainty calculations.
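    A hedged sketch of how the factors listed above combine into the uncertainty of a certified concentration, GUM-style, for a solution prepared from neat material (every number below is hypothetical):

```python
import math

# Hypothetical preparation of a ~1.0 mg/mL reference solution from neat material.
purity = 0.985          # mass fraction from the neat-material certificate
u_purity = 0.004        # standard uncertainty in purity (residual water, etc.)
mass_mg = 10.13         # weighed mass of neat material
u_mass_mg = 0.02        # balance / weighing-technique uncertainty
vol_ml = 10.00          # solvent volume (density-corrected)
u_vol_ml = 0.02         # volumetric / density uncertainty

conc = purity * mass_mg / vol_ml  # certified concentration, mg/mL

# GUM-style combination for a product/quotient: the relative uncertainties of
# independent inputs add in quadrature.
u_rel = math.sqrt(
    (u_purity / purity) ** 2
    + (u_mass_mg / mass_mg) ** 2
    + (u_vol_ml / vol_ml) ** 2
)
U = 2.0 * u_rel * conc  # expanded uncertainty, coverage factor k = 2

print(f"certified concentration: {conc:.4f} ± {U:.4f} mg/mL (k=2)")
```

    A certificate that reports only the balance uncertainty, omitting the purity term, would understate the expanded uncertainty here by roughly a factor of two, which is why knowing which factors the vendor included matters.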

  7. Heisenberg's principle of uncertainty and the uncertainty relations

    International Nuclear Information System (INIS)

    Redei, Miklos

    1987-01-01

    The usual verbal form of the Heisenberg uncertainty principle and the usual mathematical formulation (the so-called uncertainty theorem) are not equivalent. The meaning of the concept 'uncertainty' is not unambiguous and different interpretations are used in the literature. Recently a renewed interest has appeared to reinterpret and reformulate the precise meaning of Heisenberg's principle and to find adequate mathematical form. The suggested new theorems are surveyed and critically analyzed. (D.Gy.) 20 refs

  8. Measurement Uncertainty

    Science.gov (United States)

    Koch, Michael

    Measurement uncertainty is one of the key issues in quality assurance. It became increasingly important for analytical chemistry laboratories with the accreditation to ISO/IEC 17025. The uncertainty of a measurement is the most important criterion for the decision whether a measurement result is fit for purpose. It also delivers help for the decision whether a specification limit is exceeded or not. Estimation of measurement uncertainty often is not trivial. Several strategies have been developed for this purpose that will shortly be described in this chapter. In addition the different possibilities to take into account the uncertainty in compliance assessment are explained.

  9. NICE guidelines, clinical practice and antisocial personality disorder: the ethical implications of ontological uncertainty.

    Science.gov (United States)

    Pickersgill, M D

    2009-11-01

    The British National Institute for Health and Clinical Excellence (NICE) has recently (28 January 2009) released new guidelines for the diagnosis, treatment and prevention of the psychiatric category antisocial personality disorder (ASPD). Evident in these recommendations is a broader ambiguity regarding the ontology of ASPD. Although perhaps a mundane feature of much of medicine, in this case ontological uncertainty has significant ethical implications, given the profound consequences for an individual categorised with this disorder. This paper argues that in refraining from emphasising uncertainty, NICE risks reifying a controversial category. This is particularly problematic given that the guidelines recommend the identification of individuals "at risk" of raising antisocial children. Although this paper does not argue that NICE is "wrong" in any of its recommendations, more emphasis should have been placed on discussions of the ethical implications of diagnosis and treatment, especially given the multiple uncertainties associated with ASPD. It is proposed that these important issues be examined in more detail in revisions of existing NICE recommendations, and be included in upcoming guidance. This paper thus raises key questions regarding the place and role of ethics within the current and future remit of NICE.

  10. Reliability Analysis and Test Planning using CAPO-Test for Existing Structures

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Engelund, S.; Faber, Michael Havbro

    2000-01-01

    Evaluation of the reliability of existing concrete structures often requires that the compressive strength of the concrete is estimated on the basis of tests performed with concrete samples from the structure considered. In this paper the CAPO-test method is considered. The different sources of uncertainty related to this method are described. It is shown how the uncertainty in the transformation from the CAPO-test results to estimates of the concrete strength can be modeled. Further, the statistical uncertainty is modeled using Bayesian statistics. Finally, it is shown how reliability-based optimal planning of CAPO-tests can be performed taking into account the expected costs due to the CAPO-tests and possible repair or failure of the structure considered. An illustrative example is presented where the CAPO-test is compared with conventional concrete cylinder compression tests performed on cores...
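    The Bayesian modeling of statistical uncertainty mentioned above can be sketched with a conjugate normal-normal update of the mean concrete strength; the prior, the converted CAPO-test values, and the scatter below are hypothetical, not taken from the paper:

```python
import math

# Prior for the mean compressive strength (MPa): design value plus engineering
# judgment (numbers illustrative).
prior_mu, prior_sigma = 30.0, 5.0

# CAPO-test results already converted to compressive strength (MPa,
# hypothetical), with an assumed known transformation/measurement scatter.
tests = [33.1, 35.4, 31.8, 34.2, 32.9]
meas_sigma = 3.0

# Conjugate normal-normal update of the mean strength.
n = len(tests)
post_var = 1.0 / (1.0 / prior_sigma**2 + n / meas_sigma**2)
post_mu = post_var * (prior_mu / prior_sigma**2 + sum(tests) / meas_sigma**2)
post_sigma = math.sqrt(post_var)

print(f"prior:     {prior_mu:.1f} ± {prior_sigma:.1f} MPa")
print(f"posterior: {post_mu:.1f} ± {post_sigma:.1f} MPa")
```

    Each additional CAPO-test shrinks the posterior standard deviation, which is exactly the effect a reliability-based test plan trades off against the cost of testing.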

  11. An enhanced unified uncertainty analysis approach based on first order reliability method with single-level optimization

    International Nuclear Information System (INIS)

    Yao, Wen; Chen, Xiaoqian; Huang, Yiyong; Tooren, Michel van

    2013-01-01

    In engineering, there exist both aleatory uncertainties due to the inherent variation of the physical system and its operational environment, and epistemic uncertainties due to lack of knowledge, which can be reduced with the collection of more data. To analyze the uncertain distribution of the system performance under both aleatory and epistemic uncertainties, combined probability and evidence theory can be employed to quantify the compound effects of the mixed uncertainties. The existing First Order Reliability Method (FORM) based Unified Uncertainty Analysis (UUA) approach nests the optimization-based interval analysis within the improved Hasofer–Lind–Rackwitz–Fiessler (iHLRF) algorithm based Most Probable Point (MPP) searching procedure, which is computationally prohibitive for complex systems and may encounter convergence problems as well. Therefore, in this paper it is proposed to use general optimization solvers to search for the MPP in the outer loop and then reformulate the double-loop optimization problem into an equivalent single-level optimization (SLO) problem, so as to simplify the uncertainty analysis process, improve the robustness of the algorithm, and alleviate the computational complexity. The effectiveness and efficiency of the proposed method are demonstrated with two numerical examples and one practical satellite conceptual design problem. -- Highlights: ► Uncertainty analysis under mixed aleatory and epistemic uncertainties is studied. ► A unified uncertainty analysis method is proposed with combined probability and evidence theory. ► The traditional nested analysis method is converted to single-level optimization for efficiency. ► The effectiveness and efficiency of the proposed method are demonstrated with three examples
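
    The MPP search that the nested FORM-based approach repeats in its inner loop can be sketched with the classic HLRF iteration in standard normal space; the limit state and numbers below are illustrative, not the paper's formulation.

```python
# HLRF-style iteration to find the Most Probable Point (MPP) of a limit
# state g(u) = 0 in standard normal space, the inner building block of a
# FORM analysis. For a linear limit state the exact reliability index is
# known, which makes the example easy to check.
import math

def grad(g, u, h=1e-6):
    """Forward finite-difference gradient of g at u."""
    return [(g([ui + (h if i == j else 0.0) for j, ui in enumerate(u)]) -
             g(u)) / h for i in range(len(u))]

def hlrf_mpp(g, u0, iters=50):
    u = list(u0)
    for _ in range(iters):
        gu, dg = g(u), grad(g, u)
        norm2 = sum(d * d for d in dg)
        # Standard HLRF update: project onto the linearized limit state.
        scale = (sum(d * ui for d, ui in zip(dg, u)) - gu) / norm2
        u = [scale * d for d in dg]
    return u

# Linear limit state g(u) = 3 - (u1 + u2)/sqrt(2): exact beta is 3.
g = lambda u: 3.0 - (u[0] + u[1]) / math.sqrt(2.0)
mpp = hlrf_mpp(g, [0.0, 0.0])
beta = math.sqrt(sum(ui * ui for ui in mpp))
print(round(beta, 3))
```

    In the unified analysis each evidence-theory focal element would wrap an interval analysis around this search, which is exactly the inner loop the paper's single-level reformulation removes.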

  12. Uncertainties in Forecasting Streamflow using Entropy Theory

    Science.gov (United States)

    Cui, H.; Singh, V. P.

    2017-12-01

    Streamflow forecasting is essential in river restoration, reservoir operation, power generation, irrigation, navigation, and water management. However, forecasts are always accompanied by uncertainties, which may affect the forecasting results and lead to large variations. Therefore, uncertainties must be considered and assessed properly when forecasting streamflow for water management. The aim of our work is to quantify the uncertainties involved in forecasting streamflow and to provide reliable streamflow forecasts. Although streamflow time series are stochastic, they exhibit seasonal and periodic patterns. Streamflow forecasting therefore entails modeling seasonality, periodicity, and the correlation structure, and assessing uncertainties. This study applies entropy theory to forecast streamflow and to measure uncertainties during the forecasting process. To apply entropy theory to streamflow forecasting, spectral analysis is combined with time series analysis, since spectral analysis can be employed to characterize patterns of streamflow variation and identify the periodicity of streamflow; that is, it permits the extraction of significant information for understanding the streamflow process and its prediction. Application of entropy theory to streamflow forecasting involves determination of the spectral density, determination of parameters, and extension of the autocorrelation function. The uncertainties introduced by the precipitation input, the forecasting model, and the forecasted results are measured separately using entropy. Using information theory, how these uncertainties are transported and aggregated during these processes is described.
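
    As a minimal illustration of entropy as an uncertainty measure for forecasts, the sketch below compares the Shannon entropy of a sharp and a vague discrete forecast distribution; the probabilities are illustrative stand-ins, not the study's streamflow data.

```python
# Shannon entropy as a scalar measure of forecast uncertainty: a sharper
# (more peaked) forecast distribution carries less entropy. The probabilities
# stand in for a discretized streamflow forecast over flow bins.
import math

def shannon_entropy(probs):
    """Entropy in nats of a discrete probability distribution."""
    assert abs(sum(probs) - 1.0) < 1e-9
    return -sum(p * math.log(p) for p in probs if p > 0.0)

sharp_forecast = [0.05, 0.85, 0.05, 0.05]   # confident about one flow bin
vague_forecast = [0.25, 0.25, 0.25, 0.25]   # maximally uncertain over 4 bins

print(shannon_entropy(sharp_forecast) < shannon_entropy(vague_forecast))  # True
print(round(shannon_entropy(vague_forecast), 4))  # ln(4) = 1.3863
```

    Comparing such entropies for the input, the model, and the output is one simple way to attribute where uncertainty enters and grows along a forecasting chain.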

  13. Calculation of uncertainties; Calculo de incertidumbres

    Energy Technology Data Exchange (ETDEWEB)

    Diaz-Asencio, Misael [Centro de Estudios Ambientales de Cienfuegos (Cuba)

    2012-07-01

    One of the most important aspects of quality assurance in any analytical activity is the estimation of measurement uncertainty. There is general agreement that 'the expression of the result of a measurement is not complete without specifying its associated uncertainty'. An analytical process is the mechanism for obtaining methodological information (the measurand) about a material system (the population). This implies the need to define the problem, choose methods for sampling and measurement, and properly execute these activities to obtain the information. The result of a measurement is only an approximation or estimate of the value of the measurand, and it is complete only when accompanied by an estimate of the uncertainty of the analytical process. According to the 'Vocabulary of Basic and General Terms in Metrology', 'measurement uncertainty' is the parameter associated with the result of a measurement that characterizes the dispersion of the values that could reasonably be attributed to the measurand (or quantity). This parameter could be a standard deviation or a confidence interval. The uncertainty evaluation requires a detailed look at all possible sources, but not disproportionately so: we can make a good estimate of the uncertainty by concentrating efforts on the largest contributions. The key steps of the process of determining the uncertainty in measurements are: - specification of the measurand; - identification of the sources of uncertainty; - quantification of the individual components of uncertainty; - calculation of the combined standard uncertainty; - reporting of the uncertainty.
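
    The key steps above can be sketched numerically with a GUM-style combination of components into a combined and an expanded standard uncertainty; the component values and sensitivity coefficients are invented for the example.

```python
# GUM-style combination of uncertainty components: the combined standard
# uncertainty is the root sum of squares of the sensitivity-weighted
# components. Component values are illustrative.
import math

def combined_standard_uncertainty(components):
    """components: iterable of (sensitivity_coefficient, standard_uncertainty)."""
    return math.sqrt(sum((c * u) ** 2 for c, u in components))

# e.g. a result with sampling, calibration and repeatability contributions
u_c = combined_standard_uncertainty([(1.0, 0.12), (0.5, 0.20), (1.0, 0.05)])
U = 2.0 * u_c   # expanded uncertainty with coverage factor k = 2
print(round(u_c, 4), round(U, 4))
```

    The quadratic combination is why effort belongs on the largest contributions: halving a small component barely moves u_c, while halving the dominant one nearly halves it.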

  14. Critical loads - assessment of uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Barkman, A.

    1998-10-01

    The effects of data uncertainty in applications of the critical loads concept were investigated on different spatial resolutions in Sweden and northern Czech Republic. Critical loads of acidity (CL) were calculated for Sweden using the biogeochemical model PROFILE. Three methods with different structural complexity were used to estimate the adverse effects of SO{sub 2} concentrations in northern Czech Republic. Data uncertainties in the calculated critical loads/levels and exceedances (EX) were assessed using Monte Carlo simulations. Uncertainties within cumulative distribution functions (CDF) were aggregated by accounting for the overlap between site specific confidence intervals. Aggregation of data uncertainties within CDFs resulted in lower CL and higher EX best estimates in comparison with percentiles represented by individual sites. Data uncertainties were consequently found to advocate larger deposition reductions to achieve non-exceedance based on low critical loads estimates on 150 x 150 km resolution. Input data were found to impair the level of differentiation between geographical units at all investigated resolutions. Aggregation of data uncertainty within CDFs involved more constrained confidence intervals for a given percentile. Differentiation as well as identification of grid cells on 150 x 150 km resolution subjected to EX was generally improved. Calculation of the probability of EX was shown to preserve the possibility to differentiate between geographical units. Re-aggregation of the 95%-ile EX on 50 x 50 km resolution generally increased the confidence interval for each percentile. Significant relationships were found between forest decline and the three methods addressing risks induced by SO{sub 2} concentrations. Modifying SO{sub 2} concentrations by accounting for the length of the vegetation period was found to constitute the most useful trade-off between structural complexity, data availability and effects of data uncertainty. Data

  15. The application, benefits and challenges of retrofitting the existing buildings

    Science.gov (United States)

    Khairi, Muhammad; Jaapar, Aini; Yahya, Zaharah

    2017-11-01

    Sustainable development has been a main topic of debate for years in countries such as the United Kingdom, the United States of America and Malaysia. Depletion of natural resources, global warming, economic uncertainty and health issues are some of the reasons behind sustainable development movements; sustainability is not just a political debate in parliament but a collective effort across sectors to minimize the negative impact of development on the environment and other living organisms. Retrofitting an existing building is one solution for reducing dependency on constructing new buildings. There is a huge stock of existing buildings suitable for retrofitting, such as historical buildings, offices, residential buildings, warehouses, factories and vacant buildings. Therefore, the aim of this research is to provide information on the application, benefits and challenges of retrofitting an existing building. Two buildings were chosen as case studies, followed by site visits and observation of the buildings. The data were then compared in tabular form. Primary and secondary sources were also used for this research. The application of retrofit should be promoted across the construction and conservation industries since it has significant tangible and intangible benefits. It is one of the most environmentally friendly and efficient solutions for optimizing energy performance, and it can also help extend the life of existing or historical buildings while ensuring optimum thermal comfort for the occupants, which leads to higher productivity.

  16. Global impact of uncertainties in China’s gas market

    International Nuclear Information System (INIS)

    Xunpeng, Shi; Variam, Hari Malamakkavu Padinjare; Tao, Jacqueline

    2017-01-01

    This paper examines the uncertainties in Chinese gas markets, analyzes the reasons behind them and quantifies their impact on the world gas market. A literature review found significant variability among the outlooks on China's gas sector. Further assessment found that uncertainties in economic growth, structural change in markets, environmental regulations, prices and institutional changes contribute to the overall uncertainty. The analysis of China’s demand and supply uncertainties with a world gas-trading model found significant changes in global production, trade patterns and spot prices, with pipeline exporters being most affected. China's domestic production and pipeline imports from Central Asia are the major buffers that can offset much of the uncertainty. The study finds an asymmetric phenomenon: pipeline imports respond to China's uncertainties in both low and high demand scenarios, while LNG imports respond only to the high demand scenario. The major reasons are higher take-or-pay (TOP) levels and the current practice of importing only up to the minimum TOP levels for LNG, as well as a lack of liberalized gas markets. The study shows that it is necessary to create LNG markets that can respond to market dynamics, through either a reduction of TOP levels or a change of pricing mechanisms to hub indexation. - Highlights: • Economic growth, regulations, reforms and shale gas cause the uncertainties. • Pipeline exporters to China and Southeast Asian and Australian LNG exporters are affected the most. • China’s domestic production and pipeline imports offset much of the uncertainty. • Pipeline imports respond to China’s uncertainties in both low and high demand scenarios. • LNG imports respond only to the high demand scenario.

  17. Risk, unexpected uncertainty, and estimation uncertainty: Bayesian learning in unstable settings.

    Directory of Open Access Journals (Sweden)

    Elise Payzan-LeNestour

    Full Text Available Recently, evidence has emerged that humans approach learning using Bayesian updating rather than (model-free) reinforcement algorithms in a six-arm restless bandit problem. Here, we investigate what this implies for human appreciation of uncertainty. In our task, a Bayesian learner distinguishes three equally salient levels of uncertainty. First, the Bayesian perceives irreducible uncertainty or risk: even knowing the payoff probabilities of a given arm, the outcome remains uncertain. Second, there is (parameter) estimation uncertainty or ambiguity: payoff probabilities are unknown and need to be estimated. Third, the outcome probabilities of the arms change: the sudden jumps are referred to as unexpected uncertainty. We document how the three levels of uncertainty evolved during the course of our experiment and how they affected the learning rate. We then zoom in on estimation uncertainty, which has been suggested to be a driving force in exploration, in spite of evidence of widespread aversion to ambiguity. Our data corroborate the latter. We discuss neural evidence that foreshadowed the ability of humans to distinguish between the three levels of uncertainty. Finally, we investigate the boundaries of human capacity to implement Bayesian learning. We repeat the experiment with different instructions, reflecting varying levels of structural uncertainty. Under this fourth notion of uncertainty, choices were no better explained by Bayesian updating than by (model-free) reinforcement learning. Exit questionnaires revealed that participants remained unaware of the presence of unexpected uncertainty and failed to acquire the right model with which to implement Bayesian updating.
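
    The Bayesian updating studied here can be illustrated, under simplifying assumptions, with the conjugate Beta-Bernoulli update for a single bandit arm; the prior and outcome sequence are invented, not the paper's task parameters.

```python
# Beta-Bernoulli updating, the simplest instance of Bayesian learning about
# an arm's payoff probability: each pull updates a Beta posterior, and the
# posterior variance tracks the (parameter) estimation uncertainty.

def update(alpha, beta, reward):
    """One Bernoulli observation: reward is 1 (payoff) or 0 (no payoff)."""
    return alpha + reward, beta + (1 - reward)

alpha, beta = 1.0, 1.0                  # uniform prior over payoff probability
for r in [1, 0, 1, 1, 0, 1, 1, 1]:      # observed outcomes of one arm
    alpha, beta = update(alpha, beta, r)

mean = alpha / (alpha + beta)
var = alpha * beta / ((alpha + beta) ** 2 * (alpha + beta + 1))  # estimation uncertainty
print(round(mean, 3), round(var, 4))
```

    Risk would remain even if the posterior collapsed to a point (the outcome is still a coin flip), while unexpected uncertainty would correspond to the true payoff probability jumping, invalidating the accumulated counts.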

  18. Incorporating uncertainty in predictive species distribution modelling.

    Science.gov (United States)

    Beale, Colin M; Lennon, Jack J

    2012-01-19

    Motivated by the need to solve ecological problems (climate change, habitat fragmentation and biological invasions), there has been increasing interest in species distribution models (SDMs). Predictions from these models inform conservation policy, invasive species management and disease-control measures. However, predictions are subject to uncertainty, the degree and source of which is often unrecognized. Here, we review the SDM literature in the context of uncertainty, focusing on three main classes of SDM: niche-based models, demographic models and process-based models. We identify sources of uncertainty for each class and discuss how uncertainty can be minimized or included in the modelling process to give realistic measures of confidence around predictions. Because this has typically not been performed, we conclude that uncertainty in SDMs has often been underestimated and a false precision assigned to predictions of geographical distribution. We identify areas where development of new statistical tools will improve predictions from distribution models, notably the development of hierarchical models that link different types of distribution model and their attendant uncertainties across spatial scales. Finally, we discuss the need to develop more defensible methods for assessing predictive performance, quantifying model goodness-of-fit and for assessing the significance of model covariates.

  19. Economic Policy Uncertainty and Long-Run Stock Market Volatility and Correlation

    DEFF Research Database (Denmark)

    Asgharian, Hossein; Christiansen, Charlotte; Hou, Ai Jun

    We use Baker, Bloom, and Davis’s (2016) economic policy uncertainty indices in combination with the mixed data sampling (MIDAS) approach to investigate long-run stock market volatility and correlation, primarily for the US and UK. Long-run US–UK stock market correlation depends positively on US...... economic policy uncertainty shocks. The dependence is asymmetric, with only positive shocks - increasing uncertainty - being of importance. The US long-run stock market volatility depends significantly on US economic policy uncertainty shocks but not on UK shocks, while the UK long-run stock market...... volatility depends significantly on both. Allowing for US economic policy uncertainty shocks improves the out-of-sample forecasting of US–UK stock market correlation and enhances portfolio performance. Similar results apply to the long-run correlation between the US and Canada, China, and Germany....

  20. Propagation of interval and probabilistic uncertainty in cyberinfrastructure-related data processing and data fusion

    CERN Document Server

    Servin, Christian

    2015-01-01

    On various examples ranging from geosciences to environmental sciences, this book explains how to generate an adequate description of uncertainty, how to justify semiheuristic algorithms for processing uncertainty, and how to make these algorithms more computationally efficient. It explains in what sense the existing approach to uncertainty as a combination of random and systematic components is only an approximation, presents a more adequate three-component model with an additional periodic error component, and explains how uncertainty propagation techniques can be extended to this model. The book provides a justification for a practically efficient heuristic technique (based on fuzzy decision-making). It explains how the computational complexity of uncertainty processing can be reduced. The book also shows how to take into account that in real life, the information about uncertainty is often only partially known, and, on several practical examples, explains how to extract the missing information about uncer...

  1. Effects of utility demand-side management programs on uncertainty

    International Nuclear Information System (INIS)

    Hirst, E.

    1994-01-01

    Electric utilities face a variety of uncertainties that complicate their long-term resource planning. These uncertainties include future economic and load growths, fuel prices, environmental and economic regulations, performance of existing power plants, cost and availability of purchased power, and the costs and performance of new demand and supply resources. As utilities increasingly turn to demand-side management (DSM) programs to provide resources, it becomes more important to analyze the interactions between these programs and the uncertainties facing utilities. This paper uses a dynamic planning model to quantify the uncertainty effects of supply-only vs DSM + supply resource portfolios. The analysis considers four sets of uncertainties: economic growth, fuel prices, the costs to build new power plants, and the costs to operate DSM programs. The two types of portfolios are tested against these four sets of uncertainties for the period 1990 to 2010. Sensitivity, scenario, and worst-case analysis methods are used. The sensitivity analyses show that the DSM + supply resource portfolio is less sensitive to unanticipated changes in economic growth, fuel prices, and power-plant construction costs than is the supply-only portfolio. The supply-only resource mix is better only with respect to uncertainties about the costs of DSM programs. The base-case analysis shows that including DSM programs in the utility's resource portfolio reduces the net present value of revenue requirements (NPV-RR) by 490 million dollars. The scenario-analysis results show an additional 30 million dollars (6%) in benefits associated with reduction in these uncertainties. In the worst-case analysis, the DSM + supply portfolio again reduces the cost penalty associated with guessing wrong for both cases, when the utility plans for high needs and learns it has low needs and vice versa. 20 refs
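
    The NPV-RR comparison the study performs can be sketched with a toy discounted-cost calculation; the cash flows and discount rate below are illustrative, not the study's data.

```python
# Net present value of revenue requirements (NPV-RR), the metric used to
# compare a supply-only portfolio against a DSM + supply portfolio. The
# annual costs and discount rate are invented for the example.

def npv(cash_flows, rate):
    """Discount a list of annual costs (year 0 first) at the given rate."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

supply_only = [100.0] * 5           # annual revenue requirements, $M
dsm_supply = [110.0] + [95.0] * 4   # DSM costs up front, lower needs later

rate = 0.07
savings = npv(supply_only, rate) - npv(dsm_supply, rate)
print(round(savings, 2))   # positive => the DSM portfolio is cheaper
```

    Repeating this calculation across scenarios (high vs low growth, fuel prices, construction costs) and comparing the spread of NPV-RR is the essence of the sensitivity and worst-case analyses described above.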

  2. Dual long memory of inflation and test of the relationship between inflation and inflation uncertainty

    OpenAIRE

    LIU Jinquan; ZHENG Tingguo; SUI Jianli

    2008-01-01

    This paper uses the ARFIMA-FIGARCH model to investigate China's monthly inflation rate from January 1983 to October 2005. It is found that both the first moment and the second moment of inflation have remarkable long memory, indicating the existence of long memory properties in both the inflation level and inflation uncertainty. A Granger-causality test on the inflation rate and inflation uncertainty shows that the inflation level affects the inflation uncertainty and so supports Friedman hy...

  3. On uncertainty relations in quantum mechanics

    International Nuclear Information System (INIS)

    Ignatovich, V.K.

    2004-01-01

    Uncertainty relations (UR) are shown to have nothing specific for quantum mechanics (QM), being the general property valid for the arbitrary function. A wave function of a particle simultaneously having a precisely defined position and momentum in QM is demonstrated. Interference on two slits in a screen is shown to exist in classical mechanics. A nonlinear classical system of equations replacing the QM Schroedinger equation is suggested. This approach is shown to have nothing in common with the Bohm mechanics

  4. Including model uncertainty in risk-informed decision making

    International Nuclear Information System (INIS)

    Reinert, Joshua M.; Apostolakis, George E.

    2006-01-01

    Model uncertainties can have a significant impact on decisions regarding licensing basis changes. We present a methodology to identify basic events in the risk assessment that have the potential to change the decision and are known to have significant model uncertainties. Because we work with basic event probabilities, this methodology is not appropriate for analyzing uncertainties that cause a structural change to the model, such as success criteria. We use the risk achievement worth (RAW) importance measure with respect to both the core damage frequency (CDF) and the change in core damage frequency (ΔCDF) to identify potentially important basic events. We cross-check these with generically important model uncertainties. Then, sensitivity analysis is performed on the basic event probabilities, which are used as a proxy for the model parameters, to determine how much error in these probabilities would need to be present in order to impact the decision. A previously submitted licensing basis change is used as a case study. Analysis using the SAPHIRE program identifies 20 basic events as important, four of which have model uncertainties that have been identified in the literature as generally important. The decision is fairly insensitive to uncertainties in these basic events. In three of these cases, one would need to show that model uncertainties would lead to basic event probabilities that would be between two and four orders of magnitude larger than modeled in the risk assessment before they would become important to the decision. More detailed analysis would be required to determine whether these higher probabilities are reasonable. Methods to perform this analysis from the literature are reviewed and an example is demonstrated using the case study
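
    The RAW importance measure used above can be illustrated on a toy two-train model; the fault-tree logic and probabilities are invented for the example, not the case study's risk model.

```python
# Risk achievement worth (RAW) for basic events of a toy risk model:
# RAW_i = R(event i assumed failed) / R(baseline). Events with large RAW
# are candidates for the model-uncertainty sensitivity check described above.

def cdf(p_a, p_b, p_ccf):
    """Toy core damage frequency: both trains fail independently, or common cause."""
    return p_a * p_b + p_ccf

base = cdf(1e-2, 1e-2, 1e-5)

raw_a = cdf(1.0, 1e-2, 1e-5) / base       # basic event A set to failed
raw_ccf = cdf(1e-2, 1e-2, 1.0) / base     # common-cause event set to failed

print(round(raw_a, 1), round(raw_ccf, 1))
```

    The sensitivity step then asks: by how many orders of magnitude would a basic event probability have to be wrong before the decision metric crosses its acceptance threshold?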

  5. Application of fuzzy system theory in addressing the presence of uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Yusmye, A. Y. N. [Institute of Engineering Mathematics, Universiti Malaysia Perlis Kampus Pauh Putra, 02600, Arau, Perlis (Malaysia); Goh, B. Y.; Adnan, N. F.; Ariffin, A. K. [Department of Mechanical and Materials, Faculty of Engineering and Built Environment Universiti Kebangsaan Malaysia 43600 UKM Bangi, Selangor (Malaysia)

    2015-02-03

    In this paper, combinations of fuzzy system theory with finite element methods are presented and discussed as a way to deal with uncertainties. Uncertainties must be addressed in order to prevent the failure of materials in engineering. There are three types of uncertainties: stochastic, epistemic and error uncertainties. In this paper, epistemic uncertainties are considered. Epistemic uncertainty exists as a result of incomplete information and a lack of knowledge or data. Fuzzy system theory is a non-probabilistic method, and it is more appropriate than a statistical approach for interpreting uncertainty when data are lacking. Fuzzy system theory comprises a number of processes, starting with the conversion of crisp inputs to fuzzy inputs through fuzzification, followed by the main process, known as mapping. Mapping here means the logical relationship between two or more entities. In this study, the fuzzy inputs are numerically integrated based on the extension principle method. In the final stage, defuzzification is implemented. Defuzzification is an important process that converts the fuzzy outputs to crisp outputs. Several illustrative examples are given, and the simulations show that the proposed method produces more conservative results than the conventional finite element method.
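
    The fuzzification-mapping-defuzzification pipeline described above can be sketched, under simplifying assumptions, with alpha-cut interval arithmetic on a triangular fuzzy number (an approximation of the extension principle for monotone mappings); all numbers are illustrative.

```python
# Fuzzification -> mapping -> defuzzification on a triangular fuzzy input.
# The extension-principle mapping is approximated by alpha-cut interval
# arithmetic, which is exact for monotone crisp functions.

def alpha_cut(tri, a):
    """Interval of a triangular fuzzy number (lo, peak, hi) at membership a."""
    lo, peak, hi = tri
    return lo + a * (peak - lo), hi - a * (hi - peak)

def map_fuzzy(tri, f, levels=101):
    """Push a monotone crisp function f through the fuzzy number via alpha cuts."""
    cuts = []
    for k in range(levels):
        a = k / (levels - 1)
        l, r = alpha_cut(tri, a)
        cuts.append((a, min(f(l), f(r)), max(f(l), f(r))))
    return cuts

def defuzzify(cuts):
    """Simple defuzzification: average of interval midpoints over alpha levels."""
    mids = [(l + r) / 2.0 for _, l, r in cuts]
    return sum(mids) / len(mids)

# Fuzzy load "about 10" mapped through a linear response model f(x) = 2x + 1.
cuts = map_fuzzy((8.0, 10.0, 12.0), lambda x: 2.0 * x + 1.0)
print(round(defuzzify(cuts), 2))   # symmetric input + linear map -> 21.0
```

    In a fuzzy finite element setting, f would be the (much more expensive) finite element response evaluated at the alpha-cut bounds of the uncertain inputs.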

  6. Application of fuzzy system theory in addressing the presence of uncertainties

    International Nuclear Information System (INIS)

    Yusmye, A. Y. N.; Goh, B. Y.; Adnan, N. F.; Ariffin, A. K.

    2015-01-01

    In this paper, combinations of fuzzy system theory with finite element methods are presented and discussed as a way to deal with uncertainties. Uncertainties must be addressed in order to prevent the failure of materials in engineering. There are three types of uncertainties: stochastic, epistemic and error uncertainties. In this paper, epistemic uncertainties are considered. Epistemic uncertainty exists as a result of incomplete information and a lack of knowledge or data. Fuzzy system theory is a non-probabilistic method, and it is more appropriate than a statistical approach for interpreting uncertainty when data are lacking. Fuzzy system theory comprises a number of processes, starting with the conversion of crisp inputs to fuzzy inputs through fuzzification, followed by the main process, known as mapping. Mapping here means the logical relationship between two or more entities. In this study, the fuzzy inputs are numerically integrated based on the extension principle method. In the final stage, defuzzification is implemented. Defuzzification is an important process that converts the fuzzy outputs to crisp outputs. Several illustrative examples are given, and the simulations show that the proposed method produces more conservative results than the conventional finite element method.

  7. Reporting and analyzing statistical uncertainties in Monte Carlo-based treatment planning

    International Nuclear Information System (INIS)

    Chetty, Indrin J.; Rosu, Mihaela; Kessler, Marc L.; Fraass, Benedick A.; Haken, Randall K. ten; Kong, Feng-Ming; McShan, Daniel L.

    2006-01-01

    Purpose: To investigate methods of reporting and analyzing statistical uncertainties in doses to targets and normal tissues in Monte Carlo (MC)-based treatment planning. Methods and Materials: Methods for quantifying statistical uncertainties in dose, such as uncertainty specification to specific dose points, or to volume-based regions, were analyzed in MC-based treatment planning for 5 lung cancer patients. The effect of statistical uncertainties on target and normal tissue dose indices was evaluated. The concept of uncertainty volume histograms for targets and organs at risk was examined, along with its utility, in conjunction with dose volume histograms, in assessing the acceptability of the statistical precision in dose distributions. The uncertainty evaluation tools were extended to four-dimensional planning for application on multiple instances of the patient geometry. All calculations were performed using the Dose Planning Method MC code. Results: For targets, generalized equivalent uniform doses and mean target doses converged at 150 million simulated histories, corresponding to relative uncertainties of less than 2% in the mean target doses. For the normal lung tissue (a volume-effect organ), mean lung dose and normal tissue complication probability converged at 150 million histories despite the large range in the relative organ uncertainty volume histograms. For 'serial' normal tissues such as the spinal cord, large fluctuations exist in point dose relative uncertainties. Conclusions: The tools presented here provide useful means for evaluating statistical precision in MC-based dose distributions. Tradeoffs between uncertainties in doses to targets, volume-effect organs, and 'serial' normal tissues must be considered carefully in determining acceptable levels of statistical precision in MC-computed dose distributions
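
    The 1/sqrt(N) convergence underlying the reported history counts can be illustrated with a toy Monte Carlo estimator whose relative statistical uncertainty shrinks as histories are added; the scored quantity is a stand-in, not a dose calculation.

```python
# Relative statistical uncertainty of a Monte Carlo mean estimate scales as
# 1/sqrt(N): quadrupling the number of histories halves it. The "dose"
# scored here is a toy random variable, not a transport calculation.
import math
import random

def mc_mean_and_rel_uncertainty(n, rng):
    scores = [rng.uniform(0.0, 2.0) for _ in range(n)]   # toy per-history dose
    mean = sum(scores) / n
    var = sum((s - mean) ** 2 for s in scores) / (n - 1)
    sem = math.sqrt(var / n)                 # standard error of the mean
    return mean, sem / mean                  # relative uncertainty

rng = random.Random(42)
_, rel_small = mc_mean_and_rel_uncertainty(1_000, rng)
_, rel_large = mc_mean_and_rel_uncertainty(16_000, rng)
print(rel_large < rel_small)    # more histories, smaller relative uncertainty
```

    Volume-averaged indices such as the mean target dose converge much faster than point doses, which is why the abstract distinguishes volume-effect organs from 'serial' tissues when judging acceptable precision.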

  8. Charm quark mass with calibrated uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Erler, Jens [Universidad Nacional Autonoma de Mexico, Instituto de Fisica, Mexico, DF (Mexico); Masjuan, Pere [Universitat Autonoma de Barcelona, Grup de Fisica Teorica, Departament de Fisica, Barcelona (Spain); Institut de Fisica d' Altes Energies (IFAE), The Barcelona Institute of Science and Technology (BIST), Barcelona (Spain); Spiesberger, Hubert [Johannes Gutenberg-Universitaet, PRISMA Cluster of Excellence, Institut fuer Physik, Mainz (Germany); University of Cape Town, Centre for Theoretical and Mathematical Physics and Department of Physics, Rondebosch (South Africa)

    2017-02-15

    We determine the charm quark mass m{sub c} from QCD sum rules of the moments of the vector current correlator calculated in perturbative QCD at O(α{sub s}{sup 3}). Only experimental data for the charm resonances below the continuum threshold are needed in our approach, while the continuum contribution is determined by requiring self-consistency between various sum rules, including the one for the zeroth moment. Existing data from the continuum region can then be used to bound the theoretical uncertainty. Our result is m{sub c}(m{sub c}) = 1272 ± 8 MeV for α{sub s}(M{sub Z}) = 0.1182, where the central value is in very good agreement with other recent determinations based on the relativistic sum rule approach. On the other hand, there is considerably less agreement regarding the theory-dominated uncertainty, and we pay special attention to the question of how to quantify and justify it. (orig.)

  9. Generalized uncertainty principle as a consequence of the effective field theory

    Energy Technology Data Exchange (ETDEWEB)

    Faizal, Mir, E-mail: mirfaizalmir@gmail.com [Irving K. Barber School of Arts and Sciences, University of British Columbia – Okanagan, Kelowna, British Columbia V1V 1V7 (Canada); Department of Physics and Astronomy, University of Lethbridge, Lethbridge, Alberta T1K 3M4 (Canada); Ali, Ahmed Farag, E-mail: ahmed.ali@fsc.bu.edu.eg [Department of Physics, Faculty of Science, Benha University, Benha, 13518 (Egypt); Netherlands Institute for Advanced Study, Korte Spinhuissteeg 3, 1012 CG Amsterdam (Netherlands); Nassar, Ali, E-mail: anassar@zewailcity.edu.eg [Department of Physics, Zewail City of Science and Technology, 12588, Giza (Egypt)

    2017-02-10

    We will demonstrate that the generalized uncertainty principle exists because of the derivative expansion in the effective field theories. This is because in the framework of the effective field theories, the minimum measurable length scale has to be integrated away to obtain the low energy effective action. We will analyze the deformation of a massive free scalar field theory by the generalized uncertainty principle, and demonstrate that the minimum measurable length scale corresponds to a second more massive scale in the theory, which has been integrated away. We will also analyze CFT operators dual to this deformed scalar field theory, and observe that scaling of the new CFT operators indicates that they are dual to this more massive scale in the theory. We will use holographic renormalization to explicitly calculate the renormalized boundary action with counter terms for this scalar field theory deformed by generalized uncertainty principle, and show that the generalized uncertainty principle contributes to the matter conformal anomaly.

  10. Generalized uncertainty principle as a consequence of the effective field theory

    Directory of Open Access Journals (Sweden)

    Mir Faizal

    2017-02-01

    Full Text Available We will demonstrate that the generalized uncertainty principle exists because of the derivative expansion in effective field theories. This is because, in the framework of effective field theories, the minimum measurable length scale has to be integrated away to obtain the low energy effective action. We will analyze the deformation of a massive free scalar field theory by the generalized uncertainty principle, and demonstrate that the minimum measurable length scale corresponds to a second, more massive scale in the theory, which has been integrated away. We will also analyze CFT operators dual to this deformed scalar field theory, and observe that the scaling of the new CFT operators indicates that they are dual to this more massive scale in the theory. We will use holographic renormalization to explicitly calculate the renormalized boundary action with counterterms for this scalar field theory deformed by the generalized uncertainty principle, and show that the generalized uncertainty principle contributes to the matter conformal anomaly.

  11. Instrument uncertainty predictions

    International Nuclear Information System (INIS)

    Coutts, D.A.

    1991-07-01

    The accuracy of measurements and correlations should normally be provided for most experimental activities. The uncertainty is a measure of the accuracy of a stated value or equation. The uncertainty term reflects a combination of instrument errors, modeling limitations, and phenomena understanding deficiencies. This report provides several methodologies to estimate an instrument's uncertainty when used in experimental work. Methods are shown to predict both the pretest and post-test uncertainty
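
The report's methodologies are not reproduced in the abstract, but the standard way to combine independent instrument error sources into a single predicted uncertainty is the root-sum-square; a minimal sketch with invented error magnitudes:

```python
import math

# Hypothetical 1-sigma error sources for a temperature measurement, in
# degrees C: instrument calibration, readout resolution, and model bias.
sources = {"calibration": 0.50, "resolution": 0.12, "model": 0.30}

# Independent sources combine in quadrature (root-sum-square) to give
# the combined standard uncertainty of the stated value.
combined = math.sqrt(sum(u ** 2 for u in sources.values()))
print(f"combined standard uncertainty: {combined:.3f} C")
```

The same arithmetic serves for a pretest prediction (using manufacturer specifications for each source) or a post-test estimate (using observed scatter).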

  12. Entropic formulation of the uncertainty principle for the number and annihilation operators

    International Nuclear Information System (INIS)

    Rastegin, Alexey E

    2011-01-01

    An entropic approach to formulating uncertainty relations for the number-annihilation pair is considered. We construct a normal operator that traces the annihilation operator as well as commuting quadratures with a complete system of common eigenfunctions. Expanding the measured wave function with respect to them, one obtains a relevant probability distribution. Another distribution is naturally generated by measuring the number operator. Due to the Riesz-Thorin theorem, there exists a nontrivial inequality between corresponding functionals of the above distributions. We find the bound in this inequality and further derive uncertainty relations in terms of both the Rényi and Tsallis entropies. Entropic uncertainty relations for a continuous distribution as well as relations for a discretized one are presented. (comment)
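
The Rényi-entropy relations mentioned here belong to the Maassen-Uffink family of entropic uncertainty relations, whose generic shape (for conjugate entropic indices; the paper's specific bound for the number-annihilation pair will differ) is:

```latex
\frac{1}{\alpha} + \frac{1}{\beta} = 2,
\qquad
R_{\alpha}(p) + R_{\beta}(q) \;\geq\; -2 \ln c ,
```

where R_α denotes the Rényi entropy of order α of the probability distribution obtained by measuring one observable, R_β that of the other, and c is the maximal overlap between the eigenbases of the two measurements; Tsallis-entropy versions follow from the same Riesz-Thorin interpolation argument.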

  13. Uncertainty Analyses and Strategy

    International Nuclear Information System (INIS)

    Kevin Coppersmith

    2001-01-01

    performance difficult. Likewise, a demonstration of the magnitude of conservatisms in the dose estimates that result from conservative inputs is difficult to determine. To respond to these issues, the DOE explored the significance of uncertainties and the magnitude of conservatisms in the SSPA Volumes 1 and 2 (BSC 2001 [DIRS 155950]; BSC 2001 [DIRS 154659]). The three main goals of this report are: (1) To briefly summarize and consolidate the discussion of much of the work that has been done over the past few years to evaluate, clarify, and improve the representation of uncertainties in the TSPA and performance projections for a potential repository. This report does not contain any new analyses of those uncertainties, but it summarizes in one place the main findings of that work. (2) To develop a strategy for how uncertainties may be handled in the TSPA and supporting analyses and models to support a License Application, should the site be recommended. It should be noted that the strategy outlined in this report is based on current information available to DOE. The strategy may be modified pending receipt of additional pertinent information, such as the Yucca Mountain Review Plan. (3) To discuss issues related to communication about uncertainties, and propose some approaches the DOE may use in the future to improve how it communicates uncertainty in its models and performance assessments to decision-makers and to technical audiences

  14. Uncertainty analysis of neutron transport calculation

    International Nuclear Information System (INIS)

    Oka, Y.; Furuta, K.; Kondo, S.

    1987-01-01

    A cross-section sensitivity-uncertainty analysis code, SUSD, was developed. The code calculates sensitivity coefficients for one- and two-dimensional transport problems based on first-order perturbation theory. The variance and standard deviation of detector responses or design parameters can be obtained using a cross-section covariance matrix. The code is able to perform sensitivity-uncertainty analysis for the secondary neutron angular distribution (SAD) and the secondary neutron energy distribution (SED). Covariances of 6Li and 7Li neutron cross sections in JENDL-3PR1 were evaluated, including SAD and SED. Covariances of Fe and Be were also evaluated. The uncertainty of the tritium breeding ratio, fast neutron leakage flux and neutron heating was analysed for four types of blanket concepts for a commercial tokamak fusion reactor. The uncertainty of the tritium breeding ratio was less than 6 percent. Contributions from SAD/SED uncertainties are significant for some parameters. Formulas to estimate the errors of the numerical solution of the transport equation were derived based on perturbation theory. This method enables us to deterministically estimate the numerical errors due to iterative solution, spatial discretization and Legendre polynomial expansion of the transfer cross-sections. The calculational errors of the tritium breeding ratio and the fast neutron leakage flux of the fusion blankets were analysed. (author)
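
The variance propagation described here (sensitivity coefficients combined with a cross-section covariance matrix) is conventionally the first-order "sandwich rule": the relative variance of a response is SᵀCS for a relative sensitivity vector S and relative covariance matrix C. A sketch with invented numbers, not SUSD's actual data:

```python
import numpy as np

# Hypothetical relative sensitivities of the tritium breeding ratio (TBR)
# to three group cross sections, and a relative covariance matrix for them.
S = np.array([0.8, -0.3, 0.15])
C = np.array([[4.0e-4, 1.0e-4, 0.0],
              [1.0e-4, 9.0e-4, 0.0],
              [0.0,    0.0,    2.5e-3]])

# First-order ("sandwich") propagation: relative variance of the response.
rel_var = S @ C @ S
rel_std = np.sqrt(rel_var)
print(f"TBR relative standard deviation: {100 * rel_std:.2f} %")
```

The off-diagonal covariance terms can either inflate or cancel contributions, which is why a full covariance matrix, not just per-group variances, is needed.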

  15. Uncertainty and measurement

    International Nuclear Information System (INIS)

    Landsberg, P.T.

    1990-01-01

    This paper explores how the quantum mechanics uncertainty relation can be considered to result from measurements. A distinction is drawn between the uncertainties obtained by scrutinising experiments and the standard deviation type of uncertainty definition used in quantum formalism. (UK)

  16. Parametric uncertainty in optical image modeling

    Science.gov (United States)

    Potzick, James; Marx, Egon; Davidson, Mark

    2006-10-01

    Optical photomask feature metrology and wafer exposure process simulation both rely on optical image modeling for accurate results. While it is fair to question the accuracies of the available models, model results also depend on several input parameters describing the object and imaging system. Errors in these parameter values can lead to significant errors in the modeled image. These parameters include wavelength, illumination and objective NAs, magnification, focus, etc. for the optical system, and topography, complex index of refraction n and k, etc. for the object. In this paper each input parameter is varied over a range about its nominal value and the corresponding images simulated. Second order parameter interactions are not explored. Using the scenario of the optical measurement of photomask features, these parametric sensitivities are quantified by calculating the apparent change of the measured linewidth for a small change in the relevant parameter. Then, using reasonable values for the estimated uncertainties of these parameters, the parametric linewidth uncertainties can be calculated and combined to give a lower limit to the linewidth measurement uncertainty for those parameter uncertainties.
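
The procedure described, perturbing one parameter at a time about its nominal value, recording the apparent linewidth change, and combining sensitivity times parameter uncertainty in quadrature, can be sketched as follows. The image model and all numbers are invented stand-ins, not the authors' simulator:

```python
import math

def measured_linewidth(wavelength_nm, na, focus_um):
    # Toy stand-in for a full optical image simulation (hypothetical model).
    return (100.0 + 0.05 * (wavelength_nm - 193.0)
            - 40.0 * (na - 0.9) + 2.0 * focus_um ** 2)

nominal = {"wavelength_nm": 193.0, "na": 0.9, "focus_um": 0.0}
u_param = {"wavelength_nm": 0.1, "na": 0.005, "focus_um": 0.05}  # 1-sigma

w0 = measured_linewidth(**nominal)
contributions = {}
for name, u in u_param.items():
    p = dict(nominal)
    p[name] += u  # one-at-a-time perturbation at the 1-sigma offset
    contributions[name] = measured_linewidth(**p) - w0

# Combine the parametric contributions in quadrature (no interactions,
# matching the paper's neglect of second-order parameter interactions).
u_linewidth = math.sqrt(sum(d ** 2 for d in contributions.values()))
print(f"parametric linewidth uncertainty: {u_linewidth:.4f} nm")
```

A byproduct of the loop is a ranked breakdown (`contributions`) showing which parameter dominates the combined uncertainty.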

  17. Assessing the Expected Value of Research Studies in Reducing Uncertainty and Improving Implementation Dynamics.

    Science.gov (United States)

    Grimm, Sabine E; Dixon, Simon; Stevens, John W

    2017-07-01

    With low implementation of cost-effective health technologies being a problem in many health systems, it is worth considering the potential effects of research on implementation at the time of health technology assessment. Meaningful and realistic implementation estimates must be dynamic in nature. Our objective was to extend existing methods for assessing the value of research studies in terms of both reduction of uncertainty and improvement in implementation, by considering diffusion based on expert beliefs, with and without further research, conditional on the strength of evidence. We use the expected value of sample information and expected value of specific implementation measure concepts, accounting for the effects of specific research studies on implementation and the reduction of uncertainty. Diffusion theory and elicitation of expert beliefs about the shape of diffusion curves inform the implementation dynamics. We illustrate the use of the resulting dynamic expected value of research for a preterm birth screening technology, and results are compared with those from a static analysis. Allowing for diffusion based on expert beliefs had a significant impact on the expected value of research in the case study, suggesting that mistakes are made where static implementation levels are assumed. Incorporating the effects of research on implementation resulted in an increase in the expected value of research compared to the expected value of sample information alone. Assessing the expected value of research in reducing uncertainty and improving implementation dynamics has the potential to complement currently used analyses in health technology assessments, especially in recommendations for further research. The combination of expected value of research, diffusion theory, and elicitation described in this article is an important addition to the existing methods of health technology assessment.
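
The expected-value-of-information quantities used here are Monte Carlo constructions. The simplest member of the family, the expected value of perfect information (EVPI), can be sketched as below; this is a toy two-option decision with invented numbers, and the paper's EVSI and implementation-adjusted variants elaborate on the same idea:

```python
import random

random.seed(0)

# Toy decision: the incremental net benefit of a new technology is
# uncertain (Gaussian, invented parameters); the comparator's is 0.
samples = [random.gauss(500.0, 2000.0) for _ in range(100_000)]

# Value of the best decision taken now, under current uncertainty:
current = max(0.0, sum(samples) / len(samples))

# Value if uncertainty were resolved before each decision (perfect info):
perfect = sum(max(0.0, s) for s in samples) / len(samples)

evpi = perfect - current  # upper bound on the value of any further study
print(f"EVPI per patient: {evpi:.1f}")
```

EVSI replaces "perfect information" with the posterior obtained from a specific proposed study; the paper additionally scales the payoff by a dynamic, expert-elicited implementation (diffusion) curve rather than a static implementation level.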

  18. Uncertainty and Bounded Rationality: An Empirical Study in the Financial Sector

    Directory of Open Access Journals (Sweden)

    Adilson Aderito da Silva

    2012-06-01

    Full Text Available This research was developed to evaluate the impact of perceived environmental uncertainty on the levels of rationality of managers in the financial sector. The levels of uncertainty perceived by managers were estimated with the theoretical support of the Information Uncertainty perspective, focusing on the multidimensional approach proposed by Milliken (1987), which holds that there are three types of uncertainty: state uncertainty, effect uncertainty, and response uncertainty. The levels of rationality of managers were estimated as a second-order construct built from effect uncertainty and response uncertainty, with theoretical support in the definitions of bounded rationality proposed by Simon (1957). The data collected from 118 employees of the banking sector in the State of São Paulo were analyzed using Structural Equation Modeling with the SmartPLS software. The results indicated a significant influence of state uncertainty on the level of rationality of managers and make important methodological and conceptual contributions to the advancement of studies on uncertainty in decision making.

  19. Economic policy uncertainty, equity premium and dependence between their quantiles: Evidence from quantile-on-quantile approach

    Science.gov (United States)

    Raza, Syed Ali; Zaighum, Isma; Shah, Nida

    2018-02-01

    This paper examines the relationship between economic policy uncertainty (EPU) and the equity premium in G7 countries, using monthly data from January 1989 to December 2015 and a novel technique, namely the quantile-on-quantile (QQ) regression proposed by Sim and Zhou (2015). Based on the QQ approach, we estimate how the quantiles of economic policy uncertainty affect the quantiles of the equity premium. It thus provides a more comprehensive insight into the overall dependence structure between the equity premium and economic policy uncertainty than traditional techniques like OLS or quantile regression. Overall, our empirical evidence suggests the existence of a negative association between the equity premium and EPU, predominantly in all G7 countries, and especially in the extreme low and extreme high tails. However, differences exist among countries and across different quantiles of EPU and the equity premium within each country. This heterogeneity among countries is due to differences in their dependence on economic policy, on other stock markets, and in the linkages with other countries' equity markets.

  20. Propagation of registration uncertainty during multi-fraction cervical cancer brachytherapy

    Science.gov (United States)

    Amir-Khalili, A.; Hamarneh, G.; Zakariaee, R.; Spadinger, I.; Abugharbieh, R.

    2017-10-01

    Multi-fraction cervical cancer brachytherapy is a form of image-guided radiotherapy that heavily relies on 3D imaging during treatment planning, delivery, and quality control. In this context, deformable image registration can increase the accuracy of dosimetric evaluations, provided that one can account for the uncertainties associated with the registration process. To enable such capability, we propose a mathematical framework that first estimates the registration uncertainty and subsequently propagates the effects of the computed uncertainties from the registration stage through to the visualizations, organ segmentations, and dosimetric evaluations. To ensure the practicality of our proposed framework in real world image-guided radiotherapy contexts, we implemented our technique via a computationally efficient and generalizable algorithm that is compatible with existing deformable image registration software. In our clinical context of fractionated cervical cancer brachytherapy, we perform a retrospective analysis on 37 patients and present evidence that our proposed methodology for computing and propagating registration uncertainties may be beneficial during therapy planning and quality control. Specifically, we quantify and visualize the influence of registration uncertainty on dosimetric analysis during the computation of the total accumulated radiation dose on the bladder wall. We further show how registration uncertainty may be leveraged into enhanced visualizations that depict the quality of the registration and highlight potential deviations from the treatment plan prior to the delivery of radiation treatment. Finally, we show that we can improve the transfer of delineated volumetric organ segmentation labels from one fraction to the next by encoding the computed registration uncertainties into the segmentation labels.
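
The core idea, that an uncertain registration makes the accumulated dose at a point a distribution rather than a number, can be illustrated with a deliberately simplified 1-D Monte Carlo sketch (the paper's framework operates on full 3-D deformation fields; the dose profile and numbers below are invented):

```python
import random

random.seed(3)

def dose_fraction2(x_mm):
    # Invented 1-D dose profile near the point of interest (Gy).
    return 4.0 - 0.15 * x_mm

# The mapped position of a bladder-wall point in fraction 2 is uncertain.
x_registered, u_registration = 10.0, 2.0   # mm, 1-sigma registration error
dose_fraction1 = 5.0                       # Gy, known at the point itself

# Sample plausible registrations and accumulate dose for each sample.
totals = [dose_fraction1
          + dose_fraction2(random.gauss(x_registered, u_registration))
          for _ in range(20_000)]
mean = sum(totals) / len(totals)
spread = (sum((t - mean) ** 2 for t in totals) / len(totals)) ** 0.5
print(f"accumulated dose: {mean:.2f} Gy +/- {spread:.2f} Gy")
```

The spread, not just the mean, is what the paper proposes to visualize and to encode into propagated organ segmentation labels.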

  1. Significant biases affecting abundance determinations

    Science.gov (United States)

    Wesson, Roger

    2015-08-01

    I have developed two highly efficient codes to automate analyses of emission line nebulae. The tools place particular emphasis on the propagation of uncertainties. The first tool, ALFA, uses a genetic algorithm to rapidly optimise the parameters of Gaussian fits to line profiles. It can fit emission line spectra of arbitrary resolution, wavelength range and depth, with no user input at all. It is well suited to highly multiplexed spectroscopy such as that now being carried out with instruments such as MUSE at the VLT. The second tool, NEAT, carries out a full analysis of emission line fluxes, robustly propagating uncertainties using a Monte Carlo technique. Using these tools, I have found that considerable biases can be introduced into abundance determinations if the uncertainty distribution of emission lines is not well characterised. For weak lines, normally distributed uncertainties are generally assumed, though it is incorrect to do so, and significant biases can result. I discuss observational evidence of these biases. The two new codes contain routines to correctly characterise the probability distributions, giving more reliable results in analyses of emission line nebulae.
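
Monte Carlo propagation of line-flux uncertainties, the technique attributed to NEAT above, can be sketched as follows: resample each measured flux from its uncertainty distribution and rebuild the derived quantity each time. Fluxes and errors are invented for illustration:

```python
import random

random.seed(1)

# Measured fluxes (arbitrary units) and 1-sigma errors (hypothetical).
f_strong, u_strong = 3.2, 0.1    # strong line: Gaussian errors are fine
f_weak, u_weak = 0.05, 0.02      # weak line: Gaussian is a simplification

ratios = []
for _ in range(50_000):
    strong = random.gauss(f_strong, u_strong)
    weak = random.gauss(f_weak, u_weak)
    ratios.append(weak / strong)

ratios.sort()
median = ratios[len(ratios) // 2]
lo, hi = ratios[int(0.16 * len(ratios))], ratios[int(0.84 * len(ratios))]
print(f"ratio = {median:.4f} (+{hi - median:.4f} / -{median - lo:.4f})")
```

Note that resampling the weak line from a normal distribution is exactly the simplification the abstract warns against: for low signal-to-noise lines the true flux distribution is asymmetric, and assuming normality biases the derived abundances.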

  2. Toward a definition of intolerance of uncertainty: a review of factor analytical studies of the Intolerance of Uncertainty Scale.

    Science.gov (United States)

    Birrell, Jane; Meares, Kevin; Wilkinson, Andrew; Freeston, Mark

    2011-11-01

    Since its emergence in the early 1990s, a narrow but concentrated body of research has developed examining the role of intolerance of uncertainty (IU) in worry, and yet we still know little about its phenomenology. In an attempt to clarify our understanding of this construct, this paper traces the way in which our understanding and definition of IU have evolved throughout the literature. This paper also aims to further our understanding of IU by exploring the latent variables measured by the Intolerance of Uncertainty Scale (IUS; Freeston, Rheaume, Letarte, Dugas & Ladouceur, 1994). A review of the literature surrounding IU confirmed that the current definitions are categorical and lack specificity. A critical review of existing factor analytic studies was carried out in order to determine the underlying factors measured by the IUS. Systematic searches yielded 9 papers for review. Two factors with 12 consistent items emerged throughout the exploratory studies, and the stability of models containing these two factors was demonstrated in subsequent confirmatory studies. It is proposed that these factors represent (i) desire for predictability and an active engagement in seeking certainty, and (ii) paralysis of cognition and action in the face of uncertainty. It is suggested that these factors may represent approach and avoidance responses to uncertainty. Further research is required to confirm the construct validity of these factors and to determine the stability of this structure within clinical samples. Copyright © 2011 Elsevier Ltd. All rights reserved.

  3. Evaluating Sources of Risks in Large Engineering Projects: The Roles of Equivocality and Uncertainty

    Directory of Open Access Journals (Sweden)

    Leena Pekkinen

    2015-11-01

    Full Text Available Contemporary project risk management literature introduces uncertainty, i.e., the lack of information, as a fundamental basis of project risks. In this study the authors assert that equivocality, i.e., the existence of multiple and conflicting interpretations, can also serve as a basis of risks. Through an in-depth empirical investigation of a large, complex engineering project, the authors identified risk sources rooted in situations where either uncertainty or equivocality was the predominant attribute. Information processing theory proposes different managerial practices for risk management depending on whether the risks have their source in uncertainty or in equivocality.

  4. Economic–Environmental Sustainability in Building Projects: Introducing Risk and Uncertainty in LCCE and LCCA

    Directory of Open Access Journals (Sweden)

    Elena Fregonara

    2018-06-01

    Full Text Available The aim of this paper is to propose a methodology for supporting decision-making in the design stages of new buildings or in the retrofitting of existing building stock. The focus is on the evaluation of economic–environmental sustainability in the presence of risk and uncertainty. An application of risk analysis in conjunction with Life-Cycle Cost Analysis (LCCA) is proposed for selecting the preferable solution among technological options, a recent and still poorly explored context of analysis. Uncertainty is assumed to be present in cost estimating, in terms of the Life-Cycle Cost Estimates (LCCEs), and in the technical performance inputs of the life-cycle cost analysis. Following probability analysis, solved through stochastic simulation and the Monte Carlo Method (MCM), risk and uncertainty are modeled as stochastic variables, or "stochastic relevant cost drivers". Coherently, economic–financial and energy–environmental sustainability is analyzed through the calculation of a conjoint "economic–environmental indicator", in terms of the stochastic global cost. A case study of a multifunctional building glass façade project in Northern Italy is presented. The application demonstrates that introducing flexibility into the input data, the duration of the service lives of components, and the economic and environmental behavior of alternative scenarios can lead to results opposite to those of a deterministic analysis. The results give full evidence of the environmental variables' capacity to significantly perturb the model output.
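
A stochastic global-cost comparison of the kind described can be sketched with the Monte Carlo Method: sample the uncertain inputs (here, annual energy cost and component service life), compute the discounted global cost for each draw, and compare options on the resulting distributions. The cost model, discount rate, and all figures below are invented for illustration:

```python
import random

random.seed(2)

def global_cost(invest, annual_energy, service_life_years, rate=0.03):
    # Investment plus discounted annual energy costs over the service life.
    return invest + sum(annual_energy / (1 + rate) ** t
                        for t in range(1, int(service_life_years) + 1))

def expected_cost(invest, energy_mu, energy_sd, life_lo, life_hi, n=20_000):
    # Monte Carlo: stochastic energy cost (Gaussian) and service life (uniform).
    costs = [global_cost(invest,
                         random.gauss(energy_mu, energy_sd),
                         random.uniform(life_lo, life_hi))
             for _ in range(n)]
    return sum(costs) / n

option_a = expected_cost(invest=900_000, energy_mu=18_000, energy_sd=4_000,
                         life_lo=25, life_hi=35)
option_b = expected_cost(invest=1_050_000, energy_mu=11_000, energy_sd=2_000,
                         life_lo=25, life_hi=35)
print(f"expected global cost A: {option_a:,.0f}  B: {option_b:,.0f}")
```

With these invented numbers the cheaper-to-build option A narrowly wins on expected global cost, but the two distributions overlap heavily, which is the situation where a deterministic single-point comparison can rank the options the other way, as the paper observes.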

  5. Evaluation of Uncertainties in the Determination of Phosphorus by RNAA

    International Nuclear Information System (INIS)

    Rick L. Paul

    2000-01-01

    A radiochemical neutron activation analysis (RNAA) procedure for the determination of phosphorus in metals and other materials has been developed and critically evaluated. Uncertainties evaluated as type A include those arising from measurement replication, yield determination, neutron self-shielding, irradiation geometry, measurement of the quantity for concentration normalization (sample mass, area, etc.), and analysis of standards. Uncertainties evaluated as type B include those arising from beta contamination corrections, beta decay curve fitting, and beta self-absorption corrections. The evaluation of uncertainties in the determination of phosphorus is illustrated for three different materials in Table I. The metal standard reference materials (SRMs) 2175 and 861 were analyzed for value assignment of phosphorus; implanted silicon was analyzed to evaluate the technique for certification of phosphorus. The most significant difference in the error evaluation of the three materials lies in the type B uncertainties. The relatively uncomplicated matrix of the high-purity silicon allows virtually complete purification of phosphorus from other beta emitters; hence, minimal contamination correction is needed. Furthermore, because the chemistry is less rigorous, the carrier yield is more reproducible, and self-absorption corrections are less significant. Improvements in the chemical purification procedures for phosphorus in complex matrices will decrease the type B uncertainties for all samples. Uncertainties in the determination of carrier yield, the most significant type A error in the analysis of the silicon, also need to be evaluated more rigorously and minimized in the future

  6. Uncertainty Analysis with Considering Resonance Self-shielding Effect

    Energy Technology Data Exchange (ETDEWEB)

    Han, Tae Young [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    If infinitely diluted multi-group cross sections were used for the sensitivity, the covariance data from the evaluated nuclear data library (ENDL) could be applied directly. However, when a self-shielded multi-group cross section is used, the covariance data should be corrected for the self-shielding effect. Implicit uncertainty can be defined as the uncertainty change caused by the resonance self-shielding effect, as described above. MUSAD (Modules of Uncertainty and Sensitivity Analysis for DeCART) has been developed for multiplication factor and cross section uncertainty analysis based on generalized perturbation theory; however, it can only quantify the explicit uncertainty from the self-shielded multi-group cross sections, without considering the implicit effect. Thus, this paper addresses the implementation of an implicit uncertainty analysis module in the code, and numerical results for its verification are provided. The implicit uncertainty analysis module has been implemented in MUSAD based on the infinitely diluted cross-section-based consistent method. The verification calculation was performed on MHTGR 350 Ex. I-1a, and the differences from the McCARD results decrease from 40% to 1% in the CZP case and 3% in the HFP case. From this study, it is expected that the MUSAD code can reasonably produce the complete uncertainty for VHTRs or LWRs, where the resonance self-shielding effect must be significantly considered.

  7. Uncertainty Analysis with Considering Resonance Self-shielding Effect

    International Nuclear Information System (INIS)

    Han, Tae Young

    2016-01-01

    If infinitely diluted multi-group cross sections were used for the sensitivity, the covariance data from the evaluated nuclear data library (ENDL) could be applied directly. However, when a self-shielded multi-group cross section is used, the covariance data should be corrected for the self-shielding effect. Implicit uncertainty can be defined as the uncertainty change caused by the resonance self-shielding effect, as described above. MUSAD (Modules of Uncertainty and Sensitivity Analysis for DeCART) has been developed for multiplication factor and cross section uncertainty analysis based on generalized perturbation theory; however, it can only quantify the explicit uncertainty from the self-shielded multi-group cross sections, without considering the implicit effect. Thus, this paper addresses the implementation of an implicit uncertainty analysis module in the code, and numerical results for its verification are provided. The implicit uncertainty analysis module has been implemented in MUSAD based on the infinitely diluted cross-section-based consistent method. The verification calculation was performed on MHTGR 350 Ex. I-1a, and the differences from the McCARD results decrease from 40% to 1% in the CZP case and 3% in the HFP case. From this study, it is expected that the MUSAD code can reasonably produce the complete uncertainty for VHTRs or LWRs, where the resonance self-shielding effect must be significantly considered.

  8. Aspects of uncertainty analysis in accident consequence modeling

    International Nuclear Information System (INIS)

    Travis, C.C.; Hoffman, F.O.

    1981-01-01

    Mathematical models are frequently used to determine probable dose to man from an accidental release of radionuclides by a nuclear facility. With increased emphasis on the accuracy of these models, the incorporation of uncertainty analysis has become one of the most crucial and sensitive components in evaluating the significance of model predictions. In the present paper, we address three aspects of uncertainty in models used to assess the radiological impact to humans: uncertainties resulting from the natural variability in human biological parameters; the propagation of parameter variability by mathematical models; and comparison of model predictions to observational data

  9. Site utility system optimization with operation adjustment under uncertainty

    International Nuclear Information System (INIS)

    Sun, Li; Gai, Limei; Smith, Robin

    2017-01-01

    Highlights: • Uncertainties are classified into time-based and probability-based uncertain factors. • Multi-period operation and recourses deal with uncertainty implementation. • Operation scheduling is specified at the design stage to deal with uncertainties. • Steam mains superheating affects steam distribution and power generation in the system. - Abstract: Utility systems must satisfy process energy and power demands under varying conditions. System performance is determined by the system configuration and the individual operating loads of boilers, gas turbines, steam turbines, condensers, and letdown valves. Steam mains conditions, in terms of steam pressure and steam superheating, also play an important role in steam distribution in the system and in power generation by steam expansion in steam turbines, and should therefore be included in the system optimization. Uncertainties such as changes in process steam and power demand and fluctuations in electricity price should be included in the system optimization to eliminate, as far as possible, the production loss caused by steam and power deficits due to uncertainties. In this paper, uncertain factors are classified into time-based and probability-based uncertain factors, and operation scheduling, comprising multi-period equipment load sharing, redundant equipment start-up, and electricity import to compensate for power deficits, is presented to deal with the occurrence of uncertainties; it is formulated as a multi-period item and a recourse item in the optimization model. There are two case studies in this paper. One case illustrates system design to determine the system configuration, equipment selection, and system operation scheduling at the design stage to deal with uncertainties. The other case provides operational optimization scenarios for an existing system, especially when the steam superheating varies. The proposed method can provide practical guidance for improving system energy efficiency.

  10. On ISSM and leveraging the Cloud towards faster quantification of the uncertainty in ice-sheet mass balance projections

    Science.gov (United States)

    Larour, E.; Schlegel, N.

    2016-11-01

    With the Amazon EC2 Cloud becoming available as a viable platform for parallel computing, Earth System modeling groups are increasingly interested in leveraging its capabilities towards improving climate projections. In particular, faced with long wait periods on high-end clusters, the elasticity of the Cloud presents a unique opportunity of potentially "infinite" availability of small-sized clusters running on high-performance instances. Among specific applications of this new paradigm, we show here how uncertainty quantification in climate projections of polar ice sheets (Antarctica and Greenland) can be significantly accelerated using the Cloud. Indeed, small-sized clusters are very efficient at delivering sensitivity and sampling analyses, core tools of uncertainty quantification. We demonstrate how this approach was used to carry out an extensive analysis of ice-flow projections on one of the largest basins in Greenland, the North-East Greenland Glacier, using the Ice Sheet System Model, the public-domain NASA-funded ice-flow modeling software. We show how errors in the projections were accurately quantified using Monte-Carlo sampling analysis on the EC2 Cloud, and how a judicious mix of high-end parallel computing and Cloud use can best leverage existing infrastructures, significantly accelerate delivery of potentially ground-breaking climate projections, and in particular enable uncertainty quantifications that were previously impossible to achieve.

  11. Uncertainty Flow Facilitates Zero-Shot Multi-Label Learning in Affective Facial Analysis

    Directory of Open Access Journals (Sweden)

    Wenjun Bai

    2018-02-01

    Full Text Available Featured Application: The proposed Uncertainty Flow framework may benefit facial analysis with its promised elevation in discriminability in multi-label affective classification tasks. Moreover, this framework also allows efficient model training and between-task knowledge transfer. Applications that rely heavily on continuous prediction of emotional valence, e.g., monitoring prisoners' emotional stability in jail, can benefit directly from our framework. Abstract: To lower the single-label dependency of affective facial analysis, multi-label affective learning is needed. The impediment to practical implementation of existing multi-label algorithms is the scarcity of scalable multi-label training datasets. To resolve this, an inductive transfer learning based framework, i.e., Uncertainty Flow, is put forward in this research to allow knowledge transfer from a single-labelled emotion recognition task to a multi-label affective recognition task. That is, the model uncertainty, which can be quantified in Uncertainty Flow, is distilled from a single-label learning task. The distilled model uncertainty then ensures efficient zero-shot multi-label affective learning. From a theoretical perspective, within our proposed Uncertainty Flow framework, the feasibility of applying weakly informative priors, e.g., uniform and Cauchy priors, is fully explored in this research. More importantly, based on the derived weight uncertainty, three sets of prediction-related uncertainty indexes, i.e., softmax uncertainty, pure uncertainty and uncertainty plus, are proposed to produce reliable and accurate multi-label predictions. Validated on our manually annotated evaluation dataset, i.e., the multi-label annotated FER2013, our proposed Uncertainty Flow in multi-label facial expression analysis exhibited superiority to conventional multi-label learning algorithms and multi-label compatible neural networks. The success of our

  12. Exploring Best Practice Skills to Predict Uncertainties in Venture Capital Investment Decision-Making

    Science.gov (United States)

    Blum, David Arthur

    Algae biodiesel is the sole sustainable and abundant transportation fuel source that can replace petrol diesel use; however, high competition and economic uncertainties exist, influencing independent venture capital decision making. Technology, market, management, and government action uncertainties influence competition and economic uncertainties in the venture capital industry. The purpose of this qualitative case study was to identify the best practice skills at independent venture capital (IVC) firms to predict uncertainty between the early and late funding stages. The basis of the study was real options theory, a framework used to evaluate and understand the economic and competition uncertainties inherent in natural resource investment and energy derived from plant-based oils. Data were collected from interviews of 24 venture capital partners based in the United States who invest in algae and other renewable energy solutions. Data were analyzed by coding and theme development interwoven with the conceptual framework. Eight themes emerged: (a) expected returns model, (b) due diligence, (c) invest in specific sectors, (d) reduced uncertainty-late stage, (e) coopetition, (f) portfolio firm relationships, (g) differentiation strategy, and (h) modeling uncertainty and best practice. The most noteworthy finding was that predicting uncertainty at the early stage was impractical; at the expansion and late funding stages, however, predicting uncertainty was possible. The implications of these findings will affect social change by providing independent venture capitalists with best practice skills to increase successful exits, lessen uncertainty, and encourage increased funding of renewable energy firms, contributing to cleaner and healthier communities throughout the United States.

  13. Propagation of nuclear data uncertainties for fusion power measurements

    Directory of Open Access Journals (Sweden)

    Sjöstrand Henrik

    2017-01-01

    Full Text Available Neutron measurements using neutron activation systems are an essential part of the diagnostic system at large fusion machines such as JET and ITER. Nuclear data is used to infer the neutron yield. Consequently, high-quality nuclear data is essential for the proper determination of the neutron yield and fusion power. However, uncertainties due to nuclear data are not fully taken into account in uncertainty analysis for neutron yield calibrations using activation foils. This paper investigates the neutron yield uncertainty due to nuclear data using the so-called Total Monte Carlo Method. The work is performed using a detailed MCNP model of the JET fusion machine; the uncertainties due to the cross-sections and angular distributions in JET structural materials, as well as the activation cross-sections in the activation foils, are analysed. It is found that a significant contribution to the neutron yield uncertainty can come from uncertainties in the nuclear data.
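
    The Total Monte Carlo idea described above can be illustrated with a minimal sketch: the yield inferred from a foil measurement is recomputed many times, each time with the activation cross-section drawn from an assumed nuclear-data distribution, and the spread of the results is the nuclear-data contribution to the yield uncertainty. All numbers below (activity, cross-section, the 5% relative uncertainty) are hypothetical placeholders rather than values from the paper, and the real method perturbs entire nuclear data files through a detailed MCNP model instead of a single scalar.

    ```python
    import random
    import statistics

    random.seed(1)
    activity = 3.2e4          # measured foil activity, arbitrary units (assumed)
    sigma_nominal = 4.0e-25   # nominal activation cross-section in cm^2 (assumed)
    rel_unc = 0.05            # assumed 5% relative nuclear-data uncertainty

    def inferred_yield(sigma):
        # Schematic inference: yield proportional to activity / cross-section
        return activity / sigma

    # Total Monte Carlo in miniature: one random cross-section draw per history
    yields = [inferred_yield(random.gauss(sigma_nominal, rel_unc * sigma_nominal))
              for _ in range(10000)]
    mean_y = statistics.mean(yields)
    rel_spread = statistics.stdev(yields) / mean_y
    print(f"nuclear-data contribution to relative yield uncertainty: {rel_spread:.3f}")
    ```

    Because the yield depends on the reciprocal of the cross-section, the output spread is close to (slightly above) the 5% input uncertainty; in the full method the propagation through structural materials and angular distributions is far less transparent, which is why it is done numerically.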

  14. Uncertainty in social dilemmas

    OpenAIRE

    Kwaadsteniet, Erik Willem de

    2007-01-01

    This dissertation focuses on social dilemmas, and more specifically, on environmental uncertainty in these dilemmas. Real-life social dilemma situations are often characterized by uncertainty. For example, fishermen mostly do not know the exact size of the fish population (i.e., resource size uncertainty). Several researchers have therefore asked themselves the question as to how such uncertainty influences people’s choice behavior. These researchers have repeatedly concluded that uncertainty...

  15. Uncertainty theory

    CERN Document Server

    Liu, Baoding

    2015-01-01

    When no samples are available to estimate a probability distribution, we have to invite some domain experts to evaluate the belief degree that each event will happen. Perhaps some people think that the belief degree should be modeled by subjective probability or fuzzy set theory. However, it is usually inappropriate because both of them may lead to counterintuitive results in this case. In order to rationally deal with belief degrees, uncertainty theory was founded in 2007 and subsequently studied by many researchers. Nowadays, uncertainty theory has become a branch of axiomatic mathematics for modeling belief degrees. This is an introductory textbook on uncertainty theory, uncertain programming, uncertain statistics, uncertain risk analysis, uncertain reliability analysis, uncertain set, uncertain logic, uncertain inference, uncertain process, uncertain calculus, and uncertain differential equation. This textbook also shows applications of uncertainty theory to scheduling, logistics, networks, data mining, c...

  16. Rational consensus under uncertainty: Expert judgment in the EC-USNRC uncertainty study

    International Nuclear Information System (INIS)

    Cooke, R.; Kraan, B.; Goossens, L.

    1999-01-01

    Simply choosing a maximally feasible pool of experts and combining their views by some method of equal representation might achieve a form of political consensus among the experts involved, but will not achieve rational consensus. If expert viewpoints are related to the institutions at which the experts are employed, then the numerical representation of viewpoints in the pool may be, and/or may be perceived to be, influenced by the size of the interests funding the institutes. We collect a number of conclusions regarding the use of structured expert judgment. 1. Experts' subjective uncertainties may be used to advance rational consensus in the face of large uncertainties, in so far as the necessary conditions for rational consensus are satisfied. 2. Empirical control of experts' subjective uncertainties is possible. 3. Experts' performance as subjective probability assessors is not uniform; there are significant differences in performance. 4. Experts as a group may show poor performance. 5. A structured combination of expert judgment may show satisfactory performance, even though the experts individually perform poorly. 6. The performance-based combination generally outperforms the equal-weight combination. 7. The combination of experts' subjective probabilities, according to the schemes discussed here, generally has wider 90% central confidence intervals than the experts individually, particularly in the case of the equal-weight combination. We note that poor performance as a subjective probability assessor does not indicate a lack of substantive expert knowledge. Rather, it indicates unfamiliarity with quantifying subjective uncertainty in terms of subjective probability distributions. Experts were provided with training in subjective probability assessment, but of course their formal training does not (yet) prepare them for such tasks

  17. Policy Uncertainty and the US Ethanol Industry

    Directory of Open Access Journals (Sweden)

    Jason P. H. Jones

    2017-11-01

    Full Text Available The Renewable Fuel Standard (RFS2), as implemented, has introduced uncertainty for US ethanol producers and the supporting commodity markets. First, the fixed mandate for what is mainly cornstarch-based ethanol has increased feedstock price volatility and exerts a general effect across the agricultural sector. Second, the large discrepancy between the original Energy Independence and Security Act (EISA) intentions and the actual RFS2 implementation for some fuel classes has increased the investment uncertainty facing investors in biofuel production, distribution, and consumption. Here we discuss and analyze the sources of uncertainty and evaluate the effect of potential RFS2 adjustments as they influence these uncertainties. This includes the use of a flexible, production-dependent mandate on cornstarch ethanol. We find that a flexible mandate on cornstarch ethanol, relaxed during drought, could significantly reduce commodity price spikes and alleviate the decline of livestock production in cases of feedstock production shortfalls, but it would increase the risk for ethanol investors.

  18. Uncertainties in Organ Burdens Estimated from PAS

    International Nuclear Information System (INIS)

    La Bone, T.R.

    2004-01-01

    To calculate committed effective dose equivalent (CEDE), one needs to know the quantity of the radionuclide in all significantly irradiated organs (the organ burden) as a function of time following the intake. There are two major sources of uncertainty in an organ burden estimated from personal air sampling (PAS) data: (1) the uncertainty in going from the exposure measured with the PAS to the quantity of aerosol inhaled by the individual, and (2) the uncertainty in going from the intake to the organ burdens at any given time, taking into consideration the biological variability of the biokinetic models from person to person (inter-person variability) and in one person over time (intra-person variability). We have been using biokinetic modeling methods developed by researchers at the University of Florida to explore the impact of inter-person variability on the uncertainty of organ burdens estimated from PAS data. These initial studies suggest that the uncertainties are so large that PAS might be considered to be a qualitative (rather than quantitative) technique. These results indicate that more studies should be performed to properly classify the reliability and usefulness of using PAS monitoring data to estimate organ burdens, organ dose, and ultimately CEDE.

  19. Large break LOCA uncertainty evaluation and comparison with conservative calculation

    International Nuclear Information System (INIS)

    Glaeser, H.G.

    2004-01-01

    The first formulation of the USA Code of Federal Regulations (CFR) 10CFR50, with applicable sections specific to NPP licensing requirements, was released in 1976. Over a decade later, 10CFR50.46 allowed the use of best-estimate (BE) codes instead of conservative code models, but uncertainties have to be identified and quantified. Guidelines were released that described the applicable interpretations developed over the intervening years. Other countries established similar conservative procedures and acceptance criteria. Because conservative methods were used to calculate the peak values of key parameters, such as peak clad temperature (PCT), it was always acknowledged that a large margin existed between the 'conservative' calculated value and the 'true' value. Besides the USA, regulation in other countries, Germany for example, allowed the state of science and technology to be applied in licensing, i.e., the growing body of experimental evidence and progress in code development over time could be used. There was no requirement to apply a pure evaluation methodology with licensed assumptions and frozen codes. The thermal-hydraulic system codes became more and more best-estimate codes based on comprehensive validation. This development was, and is, possible because the rules and guidelines provide the necessary latitude to consider further development of safety technology. Best-estimate codes are allowed to be used in licensing in combination with conservative initial and boundary conditions; however, uncertainty quantification is not required. Since some of the initial and boundary conditions are more conservative than those used internationally (e.g., 106% reactor power instead of 102%, a single failure plus a non-availability due to preventive maintenance is assumed, etc.), it is claimed that the uncertainties of code models are covered. Since many utilities apply for power increases, calculation results come closer to some licensing criteria.
The situation in German licensing...

  20. Neural Mechanisms of Updating under Reducible and Irreducible Uncertainty.

    Science.gov (United States)

    Kobayashi, Kenji; Hsu, Ming

    2017-07-19

    Adaptive decision making depends on an agent's ability to use environmental signals to reduce uncertainty. However, because of multiple types of uncertainty, agents must take into account not only the extent to which signals violate prior expectations but also whether uncertainty can be reduced in the first place. Here we studied how human brains of both sexes respond to signals under conditions of reducible and irreducible uncertainty. We show behaviorally that subjects' value updating was sensitive to the reducibility of uncertainty, and could be quantitatively characterized by a Bayesian model in which agents ignore expectancy violations that do not update beliefs or values. Using fMRI, we found that neural processes underlying belief and value updating were separable from responses to expectancy violation, and that reducibility of uncertainty in value modulated connections from belief-updating regions to value-updating regions. Together, these results provide insights into how agents use knowledge about uncertainty to make better decisions while ignoring mere expectancy violation. SIGNIFICANCE STATEMENT To make good decisions, a person must observe the environment carefully, and use these observations to reduce uncertainty about consequences of actions. Importantly, uncertainty should not be reduced purely based on how surprising the observations are, particularly because in some cases uncertainty is not reducible. Here we show that the human brain indeed reduces uncertainty adaptively by taking into account the nature of uncertainty and ignoring mere surprise. Behaviorally, we show that human subjects reduce uncertainty in a quasi-optimal Bayesian manner. Using fMRI, we characterize brain regions that may be involved in uncertainty reduction, as well as the network they constitute, and dissociate them from brain regions that respond to mere surprise. Copyright © 2017 the authors.
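
    The behavioral model described above, a Bayesian agent who updates on diagnostic signals but ignores mere expectancy violations, can be sketched in a few lines. The two-hypothesis setup and the likelihood numbers are illustrative assumptions, not the paper's actual task.

    ```python
    def bayes_update(prior, likelihoods):
        """Posterior over hypotheses after observing a signal with given likelihoods."""
        unnorm = [p * l for p, l in zip(prior, likelihoods)]
        z = sum(unnorm)
        return [u / z for u in unnorm]

    prior = [0.5, 0.5]

    # Reducible uncertainty: a diagnostic signal (likelihoods differ across
    # hypotheses) moves beliefs toward hypothesis 1
    post_diag = bayes_update(prior, [0.8, 0.2])
    print(post_diag)

    # Irreducible uncertainty: a surprising but non-diagnostic signal (equal
    # likelihoods) violates expectations yet leaves the posterior at the prior
    post_flat = bayes_update(prior, [0.1, 0.1])
    print(post_flat)
    ```

    The second case captures the paper's key distinction: a signal can be arbitrarily surprising (low likelihood under every hypothesis) and still carry no information that reduces uncertainty, so a normative updater leaves its beliefs untouched.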

  1. Limited entropic uncertainty as new principle of quantum physics

    International Nuclear Information System (INIS)

    Ion, D.B.; Ion, M.L.

    2001-01-01

    experimental illustration of the LEU-Principle is presented for the cases y=1 and q=0.5, 1, and 2. (iii) For the nonextensive quantum systems with negative q we also proved the validity of the state-independent entropic uncertainty relations: exp{(1-2^{1-q})/(q-1)} ≤ V_{θL}(q). Moreover, in this case we find that the optimal Tsallis-like entropies (if they exist for q<0) provide an important improvement of the above state-independent entropic uncertainty relations. (authors)

  2. Moving Beyond 2% Uncertainty: A New Framework for Quantifying Lidar Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Newman, Jennifer F.; Clifton, Andrew

    2017-03-08

    Remote sensing of wind using lidar is revolutionizing wind energy. However, current generations of wind lidar are ascribed a climatic value of uncertainty, which is based on a poor description of lidar sensitivity to external conditions. In this presentation, we show how it is important to consider the complete lidar measurement process to define the measurement uncertainty, which in turn offers the ability to define a much more granular and dynamic measurement uncertainty. This approach is a progression from the 'white box' lidar uncertainty method.

  3. The effect of short-range spatial variability on soil sampling uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Perk, Marcel van der [Department of Physical Geography, Utrecht University, P.O. Box 80115, 3508 TC Utrecht (Netherlands)], E-mail: m.vanderperk@geo.uu.nl; De Zorzi, Paolo; Barbizzi, Sabrina; Belli, Maria [Agenzia per la Protezione dell'Ambiente e per i Servizi Tecnici (APAT), Servizio Laboratori, Misure ed Attività di Campo, Via di Castel Romano, 100-00128 Roma (Italy); Fajgelj, Ales; Sansone, Umberto [International Atomic Energy Agency (IAEA), Agency's Laboratories Seibersdorf, A-1400 Vienna (Austria); Jeran, Zvonka; Jaćimović, Radojko [Jozef Stefan Institute, Jamova 39, 1000 Ljubljana (Slovenia)

    2008-11-15

    This paper aims to quantify the soil sampling uncertainty arising from the short-range spatial variability of elemental concentrations in the topsoils of agricultural, semi-natural, and contaminated environments. For the agricultural site, the relative standard sampling uncertainty ranges between 1% and 5.5%. For the semi-natural area, the sampling uncertainties are 2-4 times larger than in the agricultural area. The contaminated site exhibited significant short-range spatial variability in elemental composition, which resulted in sampling uncertainties of 20-30%.

  4. The effect of short-range spatial variability on soil sampling uncertainty.

    Science.gov (United States)

    Van der Perk, Marcel; de Zorzi, Paolo; Barbizzi, Sabrina; Belli, Maria; Fajgelj, Ales; Sansone, Umberto; Jeran, Zvonka; Jaćimović, Radojko

    2008-11-01

    This paper aims to quantify the soil sampling uncertainty arising from the short-range spatial variability of elemental concentrations in the topsoils of agricultural, semi-natural, and contaminated environments. For the agricultural site, the relative standard sampling uncertainty ranges between 1% and 5.5%. For the semi-natural area, the sampling uncertainties are 2-4 times larger than in the agricultural area. The contaminated site exhibited significant short-range spatial variability in elemental composition, which resulted in sampling uncertainties of 20-30%.

  5. Managing Measurement Uncertainty in Building Acoustics

    Directory of Open Access Journals (Sweden)

    Chiara Scrosati

    2015-12-01

    Full Text Available In general, uncertainties should preferably be determined following the principles laid down in ISO/IEC Guide 98-3, the Guide to the expression of uncertainty in measurement (GUM:1995). According to current knowledge, it seems impossible to formulate such models for the different quantities in building acoustics. Therefore, the concepts of repeatability and reproducibility are necessary to determine the uncertainty of building acoustics measurements. This study shows the uncertainty of field measurements of a lightweight wall, a heavyweight floor, a façade with a single glazing window and a façade with a double glazing window, which were analyzed by a Round Robin Test (RRT) conducted in a full-scale experimental building at ITC-CNR (Construction Technologies Institute of the National Research Council of Italy). The single number quantities and their uncertainties were evaluated in both narrow and enlarged ranges, and it was shown that including or excluding the low frequencies leads to very significant differences, except in the case of the sound insulation of façades with single glazing windows. The results obtained in these RRTs were compared with other results from the literature, which confirm the increase in the uncertainty of single number quantities due to the low-frequency extension. Having stated the measurement uncertainty for a single measurement, in building acoustics it is also very important to deal with sampling for the purposes of classification of buildings or building units. Therefore, this study also shows an application of the sampling included in the Italian Standard on the acoustic classification of building units to a serial-type building consisting of 47 building units. It was found that the greatest variability is observed in the façade, and it depends both on the great variability of window typologies and on workmanship. Finally, it is suggested how to manage the uncertainty in building acoustics, both for one single...

  6. Do oil shocks predict economic policy uncertainty?

    Science.gov (United States)

    Rehman, Mobeen Ur

    2018-05-01

    Oil price fluctuations have an influential role in global economic policies for developed as well as emerging countries. I investigate the role of international oil prices disaggregated into structural (i) oil supply shocks, (ii) aggregate demand shocks and (iii) oil-market-specific demand shocks, based on the work of Kilian (2009), using a structural VAR framework, on the economic policy uncertainty of the sampled markets. Economic policy uncertainty, due to its non-linear behavior, is modeled in a regime-switching framework with the disaggregated structural oil shocks. Our results highlight that Indian, Spanish and Japanese economic policy uncertainty responds to global oil price shocks, whereas aggregate demand shocks fail to induce any change. Oil-specific demand shocks are significant only for China and India in the high-volatility state.

  7. Methodologies for uncertainty analysis in the level 2 PSA and their implementation procedures

    International Nuclear Information System (INIS)

    Ahn, Kwang Il; Yang, Joon Eun; Kim, Dong Ha

    2002-04-01

    The main purpose of this report is to present standardized methodologies for uncertainty analysis in the Level 2 Probabilistic Safety Assessment (PSA) and their implementation procedures, based on results obtained through a critical review of the existing uncertainty analysis methodologies employed in the Level 2 PSA, especially the Accident Progression Event Tree (APET). The uncertainties employed in the Level 2 PSA are quantitative expressions of the overall knowledge of the analysts and experts participating in the probabilistic quantification of phenomenological accident progressions ranging from core melt to containment failure; their numerical values are directly related to the degree of confidence that the analyst has that a given phenomenological event or accident process will or will not occur, i.e., the analyst's subjective probabilities of occurrence. The results obtained from the Level 2 PSA uncertainty analysis become an essential contributor to the plant risk, in addition to the Level 1 PSA and Level 3 PSA uncertainties. The uncertainty analysis methodologies and implementation procedures presented in this report were prepared based on the following criterion: 'the uncertainty quantification process must be logical, scrutable, complete, consistent and in an appropriate level of detail, as mandated by the Level 2 PSA objectives'. For the aforementioned purpose, this report deals mainly with (1) a summary of general and Level 2 PSA specific uncertainty analysis methodologies, (2) the selection of phenomenological branch events for uncertainty analysis in the APET, a methodology for quantification of APET uncertainty inputs and its implementation procedure, (3) the statistical propagation of uncertainty inputs through the APET and its implementation procedure, and (4) a formal procedure for quantification of APET uncertainties and source term categories (STCs) through the Level 2 PSA quantification codes.
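
    The statistical propagation of uncertainty inputs through an APET is, at its core, a Monte Carlo loop over uncertain branch probabilities. The sketch below is a toy two-branch event tree in which Beta distributions stand in for the analysts' subjective probabilities; the core damage frequency and all distribution parameters are hypothetical, not values from the report.

    ```python
    import random

    random.seed(0)
    N = 20000
    core_damage_freq = 1.0e-5                 # assumed Level 1 PSA point value (per year)

    def sample_release_frequency():
        # Two uncertain phenomenological branch points; Beta distributions stand in
        # for the analysts' subjective probabilities (parameters are hypothetical)
        p_melt_arrest_fails = random.betavariate(5, 15)
        p_containment_fails = random.betavariate(2, 18)
        return core_damage_freq * p_melt_arrest_fails * p_containment_fails

    samples = sorted(sample_release_frequency() for _ in range(N))
    median = samples[N // 2]
    p95 = samples[int(0.95 * N)]
    print(f"release frequency: median {median:.2e}/yr, 95th percentile {p95:.2e}/yr")
    ```

    A real APET has hundreds of branch points and correlated inputs, but the structure is the same: each Monte Carlo trial draws one value per uncertain branch, multiplies down each accident path, and the resulting distribution over path frequencies is what feeds the source term categories.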

  8. CHARACTERIZING AND PROPAGATING MODELING UNCERTAINTIES IN PHOTOMETRICALLY DERIVED REDSHIFT DISTRIBUTIONS

    International Nuclear Information System (INIS)

    Abrahamse, Augusta; Knox, Lloyd; Schmidt, Samuel; Thorman, Paul; Anthony Tyson, J.; Zhan Hu

    2011-01-01

    The uncertainty in the redshift distributions of galaxies has a significant potential impact on the cosmological parameter values inferred from multi-band imaging surveys. The accuracy of the photometric redshifts measured in these surveys depends not only on the quality of the flux data, but also on a number of modeling assumptions that enter into both the training set and spectral energy distribution (SED) fitting methods of photometric redshift estimation. In this work we focus on the latter, considering two types of modeling uncertainties: uncertainties in the SED template set and uncertainties in the magnitude and type priors used in a Bayesian photometric redshift estimation method. We find that SED template selection effects dominate over magnitude prior errors. We introduce a method for parameterizing the resulting ignorance of the redshift distributions, and for propagating these uncertainties to uncertainties in cosmological parameters.

  9. Development of Property Models with Uncertainty Estimate for Process Design under Uncertainty

    DEFF Research Database (Denmark)

    Hukkerikar, Amol; Sarup, Bent; Abildskov, Jens

    more reliable predictions with a new and improved set of model parameters for GC (group contribution) based and CI (atom connectivity index) based models and to quantify the uncertainties in the estimated property values from a process design point-of-view. This includes: (i) parameter estimation using....... The comparison of model prediction uncertainties with reported range of measurement uncertainties is presented for the properties with related available data. The application of the developed methodology to quantify the effect of these uncertainties on the design of different unit operations (distillation column......, the developed methodology can be used to quantify the sensitivity of process design to uncertainties in property estimates; obtain rationally the risk/safety factors in process design; and identify additional experimentation needs in order to reduce most critical uncertainties....

  10. Uncertainty Quantification for a Sailing Yacht Hull, Using Multi-Fidelity Kriging

    NARCIS (Netherlands)

    de Baar, J.H.S.; Roberts, S; Dwight, R.P.; Mallol, B.

    2015-01-01

    Uncertainty Quantification (UQ) for CFD-based ship design can require a large number of simulations, resulting in significant overall computational cost. Presently, we use an existing method, multi-fidelity Kriging, to reduce the number of simulations required for the UQ analysis of the performance of a

  11. Trans-Planckian Effects in Inflationary Cosmology and the Modified Uncertainty Principle

    DEFF Research Database (Denmark)

    F. Hassan, S.; Sloth, Martin Snoager

    2002-01-01

    There are good indications that fundamental physics gives rise to a modified space-momentum uncertainty relation that implies the existence of a minimum length scale. We implement this idea in the scalar field theory that describes density perturbations in flat Robertson-Walker space-time. This l...

  12. Differentiating intolerance of uncertainty from three related but distinct constructs.

    Science.gov (United States)

    Rosen, Natalie O; Ivanova, Elena; Knäuper, Bärbel

    2014-01-01

    Individual differences in uncertainty have been associated with heightened anxiety, stress and approach-oriented coping. Intolerance of uncertainty (IU) is a trait characteristic that arises from negative beliefs about uncertainty and its consequences. Researchers have established the central role of IU in the development of problematic worry and maladaptive coping, highlighting the importance of this construct to anxiety disorders. However, there is a need to improve our understanding of the phenomenology of IU. The goal of this paper was to present hypotheses regarding the similarities and differences between IU and three related constructs--intolerance of ambiguity, uncertainty orientation, and need for cognitive closure--and to call for future empirical studies to substantiate these hypotheses. To assist with achieving this goal, we conducted a systematic review of the literature, which also served to identify current gaps in knowledge. This paper differentiates these constructs by outlining each definition and general approaches to assessment, reviewing the existing empirical relations, and proposing theoretical similarities and distinctions. Findings may assist researchers in selecting the appropriate construct to address their research questions. Future research directions for the application of these constructs, particularly within the field of clinical and health psychology, are discussed.

  13. Mayer control problem with probabilistic uncertainty on initial positions

    Science.gov (United States)

    Marigonda, Antonio; Quincampoix, Marc

    2018-03-01

    In this paper we introduce and study an optimal control problem in Mayer form in the space of probability measures on Rn endowed with the Wasserstein distance. Our aim is to study optimality conditions when the knowledge of the initial state and velocity is subject to some uncertainty, modeled by a probability measure on Rd and by a vector-valued measure on Rd, respectively. We provide a characterization of the value function of such a problem as the unique solution of a Hamilton-Jacobi-Bellman equation in the space of measures, in a suitable viscosity sense. An application to a pursuit-evasion game with uncertainty in the state space is also discussed, proving the existence of a value for the game.

  14. Multi-scenario modelling of uncertainty in stochastic chemical systems

    International Nuclear Information System (INIS)

    Evans, R. David; Ricardez-Sandoval, Luis A.

    2014-01-01

    Uncertainty analysis has not been well studied at the molecular scale, despite extensive knowledge of uncertainty in macroscale systems. The ability to predict the effect of uncertainty allows for robust control of small scale systems such as nanoreactors, surface reactions, and gene toggle switches. However, it is difficult to model uncertainty in such chemical systems as they are stochastic in nature, and require a large computational cost. To address this issue, a new model of uncertainty propagation in stochastic chemical systems, based on the Chemical Master Equation, is proposed in the present study. The uncertain solution is approximated by a composite state comprised of the averaged effect of samples from the uncertain parameter distributions. This model is then used to study the effect of uncertainty on an isomerization system and a two gene regulation network called a repressilator. The results of this model show that uncertainty in stochastic systems is dependent on both the uncertain distribution, and the system under investigation. -- Highlights: •A method to model uncertainty on stochastic systems was developed. •The method is based on the Chemical Master Equation. •Uncertainty in an isomerization reaction and a gene regulation network was modelled. •Effects were significant and dependent on the uncertain input and reaction system. •The model was computationally more efficient than Kinetic Monte Carlo
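
    The composite-state idea above, averaging stochastic (Gillespie/SSA) trajectories over samples drawn from the uncertain parameter distribution, can be sketched for the isomerization example. The rate constants, molecule counts and Gaussian parameter distribution below are assumed for illustration only, not taken from the paper.

    ```python
    import random

    def gillespie_isomerization(k_ab, k_ba, n_a0=50, t_end=5.0):
        """Stochastic simulation (SSA) of A <-> B; returns the final count of A."""
        t, na, nb = 0.0, n_a0, 0
        while True:
            r1, r2 = k_ab * na, k_ba * nb       # propensities of A->B and B->A
            rtot = r1 + r2
            if rtot == 0.0:
                return na
            t += random.expovariate(rtot)       # time to the next reaction
            if t > t_end:
                return na
            if random.random() < r1 / rtot:     # fire A -> B
                na, nb = na - 1, nb + 1
            else:                               # fire B -> A
                na, nb = na + 1, nb - 1

    random.seed(7)
    # Composite state: average SSA trajectories over draws of the uncertain rate k_ab
    k_samples = [max(random.gauss(1.0, 0.2), 0.01) for _ in range(50)]  # assumed distribution
    finals = [gillespie_isomerization(k, 0.5) for k in k_samples for _ in range(20)]
    avg = sum(finals) / len(finals)
    print(f"composite mean of A at t_end: {avg:.1f}")
    ```

    Each inner trajectory carries intrinsic (stochastic) noise; the outer loop over `k_samples` injects parametric uncertainty, so the composite average reflects both, which is the paper's point that the uncertain solution depends on both the input distribution and the system itself.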

  15. Uncertainty in Forest Net Present Value Estimations

    Directory of Open Access Journals (Sweden)

    Ilona Pietilä

    2010-09-01

    Full Text Available Uncertainty related to inventory data, growth models and timber price fluctuation was investigated in the assessment of forest property net present value (NPV). The degree of uncertainty associated with inventory data was obtained from previous area-based airborne laser scanning (ALS) inventory studies. The study was performed applying Monte Carlo simulation, using stand-level growth and yield projection models and three alternative rates of interest (3, 4 and 5%). Timber price fluctuation was portrayed with geometric mean-reverting (GMR) price models. The analysis was conducted for four alternative forest properties having varying compartment structures: (A) a property having an even development class distribution, (B) sapling stands, (C) young thinning stands, and (D) mature stands. Simulations resulted in predicted yield value (predicted NPV) distributions at both stand and property levels. Our results showed that ALS inventory errors were the most prominent source of uncertainty, leading to a 5.1-7.5% relative deviation of property-level NPV when an interest rate of 3% was applied. Interestingly, ALS inventory led to significant biases at the property level, ranging from 8.9% to 14.1% (3% interest rate). ALS inventory-based bias was most significant in mature stand properties. Errors related to the growth predictions led to a relative standard deviation in NPV varying from 1.5% to 4.1%. Growth model-related uncertainty was most significant in sapling stand properties. Timber price fluctuation caused relative standard deviations ranging from 3.4% to 6.4% (3% interest rate). The combined relative variation caused by inventory errors, growth model errors and timber price fluctuation varied, depending on the property type and applied rate of interest, from 6.4% to 12.6%. By applying the methodology described here, one may take into account the effects of various uncertainty factors in the prediction of forest yield value and to supply the...
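
    The Monte Carlo workflow described above can be caricatured in a few lines: per realization, draw an inventory volume with an ALS-like error, a growth-multiplier error and harvest-time prices, discount the resulting harvest revenues, and read the relative NPV spread off the simulated distribution. All magnitudes (volumes, prices, harvest schedule) are invented placeholders rather than the study's data, and the GMR price paths are replaced here by independent Gaussian draws.

    ```python
    import random
    import statistics

    random.seed(42)

    def simulated_npv(rate=0.03):
        # One realization of the uncertain inputs (all magnitudes are invented):
        volume = random.gauss(200.0, 12.0)       # m3/ha with ALS-like inventory error
        growth_err = random.gauss(1.02, 0.005)   # annual growth multiplier with model error
        npv = 0.0
        for year in (10, 20, 30):                # assumed harvest schedule
            price = random.gauss(55.0, 5.0)      # EUR/m3, stand-in for GMR price paths
            harvest = 0.3 * volume * growth_err ** year
            npv += harvest * price / (1.0 + rate) ** year
        return npv

    npvs = [simulated_npv() for _ in range(5000)]
    rel_sd = statistics.stdev(npvs) / statistics.mean(npvs)
    print(f"relative std dev of property NPV: {rel_sd:.1%}")
    ```

    Note how the growth error compounds: a small per-year multiplier uncertainty grows with the projection horizon, which is one reason the study finds different dominant error sources in sapling versus mature stand properties.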

  16. A Bayesian approach for quantification of model uncertainty

    International Nuclear Information System (INIS)

    Park, Inseok; Amarchinta, Hemanth K.; Grandhi, Ramana V.

    2010-01-01

    In most engineering problems, more than one model can be created to represent an engineering system's behavior. Uncertainty is inevitably involved in selecting the best model from among the models that are possible. Uncertainty in model selection cannot be ignored, especially when the differences between the predictions of competing models are significant. In this research, a methodology is proposed to quantify model uncertainty using measured differences between experimental data and model outcomes under a Bayesian statistical framework. The adjustment factor approach is used to propagate model uncertainty into the prediction of a system response. A nonlinear vibration system is used to demonstrate the processes for implementing the adjustment factor approach. Finally, the methodology is applied to assess the engineering benefits of a laser peening process, and a confidence band for residual stresses is established to indicate the reliability of the model prediction.
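
    A minimal sketch of the Bayesian ingredients may help. The Gaussian likelihood, the two model outcomes and the measurement noise below are hypothetical; the additive adjustment-factor moments follow the generic form (posterior-weighted mean shift and spread around the best model), not the paper's specific nonlinear vibration application.

    ```python
    import math

    def model_posteriors(data, preds, sigma, priors=None):
        """Posterior model probabilities from Gaussian likelihoods (sketch)."""
        priors = priors or [1.0 / len(preds)] * len(preds)
        logls = [sum(-0.5 * ((d - y) / sigma) ** 2 for d in data) for y in preds]
        m = max(logls)
        w = [p * math.exp(l - m) for p, l in zip(priors, logls)]
        return [x / sum(w) for x in w]

    def adjustment_factor(preds, post):
        """Additive adjustment-factor moments relative to the best model."""
        best = preds[post.index(max(post))]
        mean_adj = sum(p * (y - best) for p, y in zip(post, preds))
        var_adj = sum(p * (y - best) ** 2 for p, y in zip(post, preds))
        return best + mean_adj, var_adj

    data = [1.02, 0.98, 1.05]   # hypothetical measurements
    preds = [1.00, 1.20]        # two competing model outcomes
    post = model_posteriors(data, preds, sigma=0.1)
    adj_pred, adj_var = adjustment_factor(preds, post)
    print(round(post[0], 3), round(adj_pred, 3))
    ```

    Here the data clearly favor the first model, so the adjusted prediction stays close to it while the adjustment variance records the residual model uncertainty.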

  17. Quantum time uncertainty in a gravity's rainbow formalism

    International Nuclear Information System (INIS)

    Galan, Pablo; Marugan, Guillermo A. Mena

    2004-01-01

    The existence of a minimum time uncertainty is usually argued to be a consequence of the combination of quantum mechanics and general relativity. Most of the studies that point to this result are nonetheless based on perturbative quantization approaches, in which the effect of matter on the geometry is regarded as a correction to a classical background. In this paper, we consider rainbow spacetimes constructed from doubly special relativity by using a modification of the proposals of Magueijo and Smolin. In these models, gravitational effects are incorporated (at least to a certain extent) in the definition of the energy-momentum of particles without adhering to a perturbative treatment of the backreaction. In this context, we derive and compare the expressions of the time uncertainty in quantizations that use as evolution parameter either the background or the rainbow time coordinates. These two possibilities can be regarded as corresponding to perturbative and nonperturbative quantization schemes, respectively. We show that, while a nonvanishing time uncertainty is generically unavoidable in a perturbative framework, an infinite time resolution can in fact be achieved in a nonperturbative quantization for the whole family of doubly special relativity theories with unbounded physical energy

  18. Subspace-based Inverse Uncertainty Quantification for Nuclear Data Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Khuwaileh, B.A., E-mail: bakhuwai@ncsu.edu; Abdel-Khalik, H.S.

    2015-01-15

    Safety analysis and design optimization depend on the accurate prediction of various reactor attributes. Predictions can be enhanced by reducing the uncertainty associated with the attributes of interest. An inverse problem can be defined and solved to assess the sources of uncertainty, and experimental effort can be subsequently directed to further improve the uncertainty associated with these sources. In this work a subspace-based algorithm for inverse sensitivity/uncertainty quantification (IS/UQ) has been developed to enable analysts to account for all sources of nuclear data uncertainties in support of target accuracy assessment-type analysis. An approximate analytical solution of the optimization problem is used to guide the search for the dominant uncertainty subspace. By limiting the search to a subspace, the degrees of freedom available for the optimization search are significantly reduced. A quarter PWR fuel assembly is modeled and the accuracy of the multiplication factor and the fission reaction rate are used as reactor attributes whose uncertainties are to be reduced. Numerical experiments are used to demonstrate the computational efficiency of the proposed algorithm. Our ongoing work is focusing on extending the proposed algorithm to account for various forms of feedback, e.g., thermal-hydraulics and depletion effects.

  19. Review of monitoring uncertainty requirements in the CDM

    International Nuclear Information System (INIS)

    Shishlov, Igor; Bellassen, Valentin

    2014-10-01

    In order to ensure the environmental integrity of carbon offset projects, emission reductions certified under the Clean Development Mechanism (CDM) have to be 'real, measurable and additional', which is ensured through the monitoring, reporting and verification (MRV) process. MRV, however, comes at a cost that ranges from several cents to EUR1.20 and above per ton of CO2e depending on the project type. This article analyzes monitoring uncertainty requirements for carbon offset projects with a particular focus on the trade-off between monitoring stringency and cost. To this end, we review existing literature, scrutinize both overarching monitoring guidelines and the 10 most-used methodologies, and finally we analyze four case studies. We find that there is indeed a natural trade-off between the stringency and the cost of monitoring, which if not addressed properly may become a major barrier for the implementation of offset projects in some sectors. We demonstrate that this trade-off has not been systematically addressed in the overarching CDM guidelines and that there are only limited incentives to reduce monitoring uncertainty. Some methodologies and calculation tools as well as some other offset standards, however, do incorporate provisions for a trade-off between monitoring costs and stringency. These provisions may take the form of discounting emissions reductions based on the level of monitoring uncertainty - or more implicitly through allowing a project developer to choose between monitoring a given parameter and using a conservative default value. Our findings support the introduction of an uncertainty standard under the CDM for more comprehensive, yet cost-efficient, accounting for monitoring uncertainty in carbon offset projects. (authors)

  20. Towards Thermodynamics with Generalized Uncertainty Principle

    International Nuclear Information System (INIS)

    Moussa, Mohamed; Farag Ali, Ahmed

    2014-01-01

    Various frameworks of quantum gravity predict a modification of the Heisenberg uncertainty principle to a so-called generalized uncertainty principle (GUP). Introducing the quantum gravity effect makes a considerable change in the density of states inside the volume of the phase space, which changes the statistical and thermodynamical properties of any physical system. In this paper we investigate the modification in the thermodynamic properties of ideal gases and photon gas. The partition function is calculated, and from it we find a considerable growth in the thermodynamical functions of the systems considered. The growth may happen due to an additional repulsive force between the constituents of the gases, which may be due to the existence of the GUP, hence predicting a considerable increase in the entropy of the system. Besides, by applying the GUP to an ideal gas in a trapped potential, it is found that the GUP implies a minimum measurable value of the thermal wavelength of the particles, which agrees with the discrete nature of space derived from the GUP in previous studies.

  1. Using spatial uncertainty to manipulate the size of the attention focus.

    Science.gov (United States)

    Huang, Dan; Xue, Linyan; Wang, Xin; Chen, Yao

    2016-09-01

    Preferentially processing behaviorally relevant information is vital for primate survival. In visuospatial attention studies, manipulating the spatial extent of attention focus is an important question. Although many studies have claimed to successfully adjust attention field size by either varying the uncertainty about the target location (spatial uncertainty) or adjusting the size of the cue orienting the attention focus, no systematic studies have assessed and compared the effectiveness of these methods. We used a multiple cue paradigm with 2.5° and 7.5° rings centered around a target position to measure the cue size effect, while the spatial uncertainty levels were manipulated by changing the number of cueing positions. We found that spatial uncertainty had a significant impact on reaction time during target detection, while the cue size effect was less robust. We also carefully varied the spatial scope of potential target locations within a small or large region and found that this amount of variation in spatial uncertainty can also significantly influence target detection speed. Our results indicate that adjusting spatial uncertainty is more effective than varying cue size when manipulating attention field size.

  2. Insurance Applications of Active Fault Maps Showing Epistemic Uncertainty

    Science.gov (United States)

    Woo, G.

    2005-12-01

    high deductible is in force, this requires estimation of the epistemic uncertainty on fault geometry and activity. Transport infrastructure insurance is of practical interest in seismic countries. On the North Anatolian Fault in Turkey, there is uncertainty over an unbroken segment between the eastern end of the Düzce Fault and Bolu. This may have ruptured during the 1944 earthquake. Existing hazard maps may simply use a question mark to flag uncertainty. However, a far more informative type of hazard map might express spatial variations in the confidence level associated with a fault map. Through such visual guidance, an insurance risk analyst would be better placed to price earthquake cover, allowing for epistemic uncertainty.

  3. Impact of Uncertainty on Calculations for Recovery from Loss of Offsite Power

    International Nuclear Information System (INIS)

    Kelly, Dana L.

    2010-01-01

    Uncertainty, both aleatory and epistemic, can have a significant impact on estimated probabilities of recovering from loss of offsite power within a specified time window, and such probabilities are an input to risk-informed decisions as to the significance of inspection findings in the U.S. Nuclear Regulatory Commission's Reactor Oversight Process. In particular, the choice of aleatory model for offsite power recovery time can have a significant impact on the estimated nonrecovery probability, especially if epistemic uncertainty regarding parameters in the aleatory model is accounted for properly. In past and current analyses, such uncertainty has largely been ignored. This paper examines the impact of both aleatory and epistemic uncertainty on the results, using modern open-source Bayesian inference software, which implements Markov chain Monte Carlo sampling. It also includes examples of time-dependent convolution calculations to show the impact that uncertainty can have on this increasingly common type of calculation. The results show that the 'point estimate' result, which is an input to risk-informed decisions, can easily be uncertain by a factor of 10 if both aleatory and epistemic uncertainties are considered. The paper also illustrates the use of Bayesian model selection criteria to aid in the choice of aleatory model.
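
    The sensitivity to the aleatory model can be seen in a small sketch. The four-hour window, the one-hour median recovery time and the lognormal spread below are hypothetical illustration values, not data from the paper; the point is that an exponential and a lognormal model with the same median give very different nonrecovery probabilities, and that sampling the epistemic uncertainty in the median shifts the answer again.

    ```python
    import math
    import random

    def p_nonrecovery_exponential(median, t):
        """P(recovery time > t) for an exponential model with the given median."""
        return math.exp(-math.log(2) / median * t)

    def p_nonrecovery_lognormal(median, sigma, t):
        """P(recovery time > t) for a lognormal model (log-sd sigma)."""
        z = (math.log(t) - math.log(median)) / sigma
        return 0.5 * math.erfc(z / math.sqrt(2))

    t_window = 4.0  # hours until battery depletion (hypothetical)
    p_exp = p_nonrecovery_exponential(1.0, t_window)
    p_ln = p_nonrecovery_lognormal(1.0, 1.0, t_window)

    # Epistemic uncertainty: the median itself is uncertain, sampled here
    rng = random.Random(0)
    mean_ep = sum(p_nonrecovery_lognormal(math.exp(rng.gauss(0.0, 0.5)), 1.0,
                                          t_window)
                  for _ in range(20000)) / 20000
    print(p_exp, p_ln, mean_ep)  # same median, very different tail probabilities
    ```

    The exponential model gives 2^-4 ≈ 0.06 while the heavier-tailed lognormal gives roughly 0.16 for the same median, which is why the choice of aleatory model matters so much.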

  4. Investment, regulation, and uncertainty: managing new plant breeding techniques.

    Science.gov (United States)

    Smyth, Stuart J; McDonald, Jillian; Falck-Zepeda, Jose

    2014-01-01

    As with any technological innovation, time refines the technology, improving upon the original version of the innovative product. The initial GM crops had single traits for either herbicide tolerance or insect resistance. Current varieties have both of these traits stacked together, and in many cases other abiotic and biotic traits have also been stacked. This innovation requires investment. While this is relatively straightforward, certain conditions need to exist such that investments can be facilitated. The principal requirement for investment is that regulatory frameworks render consistent and timely decisions. If the certainty of regulatory outcomes weakens, the potential for changes in investment patterns increases. This article provides a summary background to the leading plant breeding technologies that are either currently being used to develop new crop varieties or are in the pipeline to be applied to plant breeding within the next few years. Challenges for existing regulatory systems are highlighted. Utilizing an option value approach from the investment literature, an assessment of uncertainty regarding the regulatory approval for these varying techniques is undertaken. This research highlights which technology development options have the greatest degree of uncertainty and hence which ones might be expected to see an investment decline.

  5. Position-momentum uncertainty relations in the presence of quantum memory

    DEFF Research Database (Denmark)

    Furrer, Fabian; Berta, Mario; Tomamichel, Marco

    2014-01-01

    A prominent formulation of the uncertainty principle identifies the fundamental quantum feature that no particle may be prepared with certain outcomes for both position and momentum measurements. Often the statistical uncertainties are thereby measured in terms of entropies providing a clear oper....... As an illustration, we evaluate the uncertainty relations for position and momentum measurements, which is operationally significant in that it implies security of a quantum key distribution scheme based on homodyne detection of squeezed Gaussian states....

  6. [The metrology of uncertainty: a study of vital statistics from Chile and Brazil].

    Science.gov (United States)

    Carvajal, Yuri; Kottow, Miguel

    2012-11-01

    This paper addresses the issue of uncertainty in the measurements used in public health analysis and decision-making. The Shannon-Wiener entropy measure was adapted to express the uncertainty contained in counting causes of death in official vital statistics from Chile. Based on the findings, the authors conclude that metrological requirements in public health are as important as the measurements themselves. The study also considers and argues for the existence of uncertainty associated with the statistics' performative properties, both by the way the data are structured as a sort of syntax of reality and by exclusion of what remains beyond the quantitative modeling used in each case. Following the legacy of pragmatic thinking and using conceptual tools from the sociology of translation, the authors emphasize that by taking uncertainty into account, public health can contribute to a discussion on the relationship between technology, democracy, and formation of a participatory public.
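
    The entropy adaptation can be reproduced in a few lines. The cause-of-death tallies below are invented for illustration, not Chilean data; the sketch only shows how the Shannon-Wiener measure assigns more uncertainty to death counts spread across poorly resolved categories.

    ```python
    import math

    def shannon_entropy(counts):
        """Shannon-Wiener entropy (bits) of a vector of category counts."""
        total = sum(counts)
        return -sum(c / total * math.log2(c / total) for c in counts if c > 0)

    # Hypothetical tallies: well-resolved coding vs maximally dispersed coding
    h_conc = shannon_entropy([900, 50, 30, 20])
    h_disp = shannon_entropy([250, 250, 250, 250])
    print(round(h_conc, 2), h_disp)  # 0.62 2.0
    ```

    The dispersed tally reaches the maximum of 2 bits for four categories, i.e. the counting carries the greatest possible uncertainty about the cause of death.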

  7. A new measure of uncertainty importance based on distributional sensitivity analysis for PSA

    International Nuclear Information System (INIS)

    Han, Seok Jung; Tak, Nam Il; Chun, Moon Hyun

    1996-01-01

    The main objective of the present study is to propose a new measure of uncertainty importance based on distributional sensitivity analysis. The new measure is developed to utilize a metric distance obtained from cumulative distribution functions (cdfs). The measure is evaluated for two cases: one is a cdf given by a known analytical distribution and the other given by an empirical distribution generated by a crude Monte Carlo simulation. To study its applicability, the present measure has been applied to two different cases and the results compared with those of three existing methods. The present approach provides a useful measure of uncertainty importance based on cdfs; it is simple, and uncertainty importance can be calculated without any complex process. On the basis of the results obtained in the present work, the present method is recommended as a tool for the analysis of uncertainty importance.
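
    A toy version of the cdf-distance idea can be sketched directly. Everything below is hypothetical (a linear model with one strong and one weak input, and an L1 distance between empirical cdfs); the paper's own metric distance may differ, but the mechanics are the same: fix an input, compare the resulting output cdf with the base case, and read the distance as that input's uncertainty importance.

    ```python
    import random

    def ecdf_distance(xs, ys, grid_n=200):
        """Approximate L1 distance between two empirical cdfs on a shared grid."""
        lo = min(min(xs), min(ys))
        hi = max(max(xs), max(ys))
        step = (hi - lo) / grid_n
        d = 0.0
        for k in range(grid_n + 1):
            t = lo + k * step
            fx = sum(x <= t for x in xs) / len(xs)
            fy = sum(y <= t for y in ys) / len(ys)
            d += abs(fx - fy) * step
        return d

    rng = random.Random(42)
    n = 2000
    a = [rng.gauss(0, 1) for _ in range(n)]    # influential input
    b = [rng.gauss(0, 0.1) for _ in range(n)]  # weak input
    base = [x + y for x, y in zip(a, b)]       # output of a toy model a + b

    d_a = ecdf_distance(base, b)  # output cdf with input a fixed at its mean
    d_b = ecdf_distance(base, a)  # output cdf with input b fixed at its mean
    print(round(d_a, 2), round(d_b, 2))  # fixing a matters, fixing b barely does
    ```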

  8. Analogy as a strategy for supporting complex problem solving under uncertainty.

    Science.gov (United States)

    Chan, Joel; Paletz, Susannah B F; Schunn, Christian D

    2012-11-01

    Complex problem solving in naturalistic environments is fraught with uncertainty, which has significant impacts on problem-solving behavior. Thus, theories of human problem solving should include accounts of the cognitive strategies people bring to bear to deal with uncertainty during problem solving. In this article, we present evidence that analogy is one such strategy. Using statistical analyses of the temporal dynamics between analogy and expressed uncertainty in the naturalistic problem-solving conversations among scientists on the Mars Rover Mission, we show that spikes in expressed uncertainty reliably predict analogy use (Study 1) and that expressed uncertainty reduces to baseline levels following analogy use (Study 2). In addition, in Study 3, we show with qualitative analyses that this relationship between uncertainty and analogy is not due to miscommunication-related uncertainty but, rather, is primarily concentrated on substantive problem-solving issues. Finally, we discuss a hypothesis about how analogy might serve as an uncertainty reduction strategy in naturalistic complex problem solving.

  9. Results from the Application of Uncertainty Methods in the CSNI Uncertainty Methods Study (UMS)

    International Nuclear Information System (INIS)

    Glaeser, H.

    2008-01-01

    Within licensing procedures there is the incentive to replace the conservative requirements for code application by a 'best estimate' concept supplemented by an uncertainty analysis to account for predictive uncertainties of code results. Methods have been developed to quantify these uncertainties. The Uncertainty Methods Study (UMS) Group, following a mandate from CSNI, has compared five methods for calculating the uncertainty in the predictions of advanced 'best estimate' thermal-hydraulic codes. Most of the methods identify and combine input uncertainties. The major differences between the predictions of the methods came from the choice of uncertain parameters and the quantification of the input uncertainties, i.e. the width of the uncertainty ranges. Therefore, suitable experimental and analytical information has to be selected to specify these uncertainty ranges or distributions. After the closure of the Uncertainty Methods Study (UMS) and after the report was issued, comparison calculations of experiment LSTF-SB-CL-18 were performed by the University of Pisa using different versions of the RELAP5 code. It turned out that the version used by two of the participants calculated a 170 K higher peak clad temperature compared with other versions using the same input deck. This may contribute to the differences in the upper limit of the uncertainty ranges.

  10. The Uncertainty Principle in the Presence of Quantum Memory

    Science.gov (United States)

    Renes, Joseph M.; Berta, Mario; Christandl, Matthias; Colbeck, Roger; Renner, Renato

    2010-03-01

    One consequence of Heisenberg's uncertainty principle is that no observer can predict the outcomes of two incompatible measurements performed on a system to arbitrary precision. However, this implication is invalid if the observer possesses a quantum memory, a distinct possibility in light of recent technological advances. Entanglement between the system and the memory is responsible for the breakdown of the uncertainty principle, as illustrated by the EPR paradox. In this work we present an improved uncertainty principle which takes this entanglement into account. By quantifying uncertainty using entropy, we show that the sum of the entropies associated with incompatible measurements must exceed a quantity which depends on the degree of incompatibility and the amount of entanglement between system and memory. Apart from its foundational significance, the uncertainty principle motivated the first proposals for quantum cryptography, though the possibility of an eavesdropper having a quantum memory rules out using the original version to argue that these proposals are secure. The uncertainty relation introduced here alleviates this problem and paves the way for its widespread use in quantum cryptography.
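
    The memoryless limit of such entropic relations is easy to verify numerically. The sketch below checks the Maassen-Uffink bound H(Z) + H(X) ≥ 1 bit for a qubit measured in the two maximally incompatible bases; the quantum-memory term that is this paper's actual contribution is not modeled here.

    ```python
    import math

    def h2(p):
        """Binary Shannon entropy in bits."""
        if p <= 0.0 or p >= 1.0:
            return 0.0
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    def bound_holds(theta):
        """For the qubit cos(theta)|0> + sin(theta)|1>, check the memoryless
        entropic relation H(Z) + H(X) >= 1 bit for the maximally incompatible
        Z and X measurement bases."""
        p_z = math.cos(theta) ** 2             # P(Z outcome 0)
        p_x = 0.5 * (1 + math.sin(2 * theta))  # P(X outcome +)
        return h2(p_z) + h2(p_x) >= 1.0 - 1e-12

    all_hold = all(bound_holds(k * math.pi / 40) for k in range(41))
    print(all_hold)  # True
    ```

    Equality is reached for basis states (H(Z) = 0 forces H(X) = 1); entanglement with a memory is exactly what lets the left-hand side drop below this bound.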

  11. Information Seeking in Uncertainty Management Theory: Exposure to Information About Medical Uncertainty and Information-Processing Orientation as Predictors of Uncertainty Management Success.

    Science.gov (United States)

    Rains, Stephen A; Tukachinsky, Riva

    2015-01-01

    Uncertainty management theory outlines the processes through which individuals cope with health-related uncertainty. Information seeking has been frequently documented as an important uncertainty management strategy. The reported study investigates exposure to specific types of medical information during a search, and one's information-processing orientation as predictors of successful uncertainty management (i.e., a reduction in the discrepancy between the level of uncertainty one feels and the level one desires). A lab study was conducted in which participants were primed to feel more or less certain about skin cancer and then were allowed to search the World Wide Web for skin cancer information. Participants' search behavior was recorded and content analyzed. The results indicate that exposure to two health communication constructs that pervade medical forms of uncertainty (i.e., severity and susceptibility) and information-processing orientation predicted uncertainty management success.

  12. Computational Fluid Dynamics Uncertainty Analysis Applied to Heat Transfer over a Flat Plate

    Science.gov (United States)

    Groves, Curtis Edward; Ilie, Marcel; Schallhorn, Paul A.

    2013-01-01

    There have been few discussions on using Computational Fluid Dynamics (CFD) without experimental validation. Pairing experimental data, uncertainty analysis, and analytical predictions provides a comprehensive approach to verification and is the current state of the art. With pressed budgets, collecting experimental data is rare or non-existent. This paper investigates and proposes a method to perform CFD uncertainty analysis only from computational data. The method uses current CFD uncertainty techniques coupled with the Student-t distribution to predict the heat transfer coefficient over a flat plate. The inputs to the CFD model are varied from a specified tolerance or bias error and the differences in the results are used to estimate the uncertainty. The variation in each input is ranked from least to greatest to determine the order of importance. The results are compared to heat transfer correlations and conclusions drawn about the feasibility of using CFD without experimental data. The results provide a tactic to analytically estimate the uncertainty in a CFD model when experimental data is unavailable.
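
    One plausible reading of the input-variation step can be sketched as follows. The perturbed heat-transfer coefficients, the 95 % Student-t factor for 4 degrees of freedom (2.776) and the ranking rule are illustrative assumptions, not the paper's validated procedure.

    ```python
    import math
    import statistics

    def cfd_input_uncertainty(nominal, perturbed, coverage_t=2.776):
        """Sketch: treat outputs obtained by varying each input by its
        tolerance as a small sample; rank inputs by |delta| from the nominal
        result and expand the scatter with a Student-t factor (95 %, 4 dof)."""
        deltas = [abs(r - nominal) for r in perturbed]
        order = sorted(range(len(deltas)), key=lambda i: -deltas[i])
        u = coverage_t * statistics.stdev(perturbed) / math.sqrt(len(perturbed))
        return u, order

    # Hypothetical heat-transfer coefficients (W/m^2/K), one per varied input
    u, order = cfd_input_uncertainty(100.0, [101.5, 99.2, 104.0, 100.3, 98.8])
    print(round(u, 2), order[0])  # expanded uncertainty; input 2 dominates
    ```

    The Student-t factor is what compensates for the very small number of perturbation runs that a CFD budget typically allows.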

  13. The Post Keynesians and Douglas North about Uncertainty and Institutions: the Missing Link?

    Directory of Open Access Journals (Sweden)

    Ivan V. Rozmainsky

    2016-09-01

    Full Text Available The paper compares the Post Keynesian and North's approaches to the analysis of uncertainty and the uncertainty-reducing role of institutions. The author emphasizes that Post Keynesianism was the first school of economic thought to make uncertainty the starting point of its own research program. Many of North's lines of reasoning about uncertainty were mentioned by the Post Keynesians long before his works had been published. It is not fortuitous that in the course of discussing ergodicity and non-ergodicity North cites Davidson, the leader of Post Keynesianism. Both approaches conclude that Neoclassical theory cannot be applied to the solution of real-world problems because it ignores 'genuine' (fundamental) uncertainty. The paper also considers differences between these two approaches. Whereas North's theory explains the inability of many systems to grow steadily, the Post Keynesians emphasize problems of systems perceived as successful. These problems exist due to the fact that institutional evolution decreases the effectiveness of uncertainty reduction, and they are revealed in macroeconomic and financial crises.

  14. Natural gas, uncertainty, and climate policy in the US electric power sector

    International Nuclear Information System (INIS)

    Bistline, John E.

    2014-01-01

    This paper investigates how uncertainties related to natural gas prices and potential climate policies may influence capacity investments, utilization, and emissions in US electricity markets. Using a two-stage stochastic programming approach, model results suggest that climate policies are stronger drivers of greenhouse gas emission trajectories than new natural gas supplies. The dynamics of learning and irreversibility may give rise to an investment climate where strategic delay is optimal. Hedging strategies are shown to be sensitive to the specification of probability distributions for climate policy and natural gas prices, highlighting the important role of uncertainty quantification in future research. The paper also illustrates how this stochastic modeling framework could be used to quantify the value of limiting methane emissions from natural gas production. - Highlights: • This paper examines how uncertainty may impact natural gas in the power sector. • Uncertainties like gas prices, upstream emissions, and climate policy are modeled. • Climate policies are stronger drivers of emissions than gas supply conditions. • Lower gas prices are likely to spark greater utilization of existing capacity. • Irreversibility and uncertainty may make strategic delay optimal

  15. Stochastic goal programming based groundwater remediation management under human-health-risk uncertainty

    International Nuclear Information System (INIS)

    Li, Jing; He, Li; Lu, Hongwei; Fan, Xing

    2014-01-01

    Highlights: • We propose an integrated optimal groundwater remediation design approach. • The approach can address stochasticity in carcinogenic risks. • Goal programming is used to bring the system close to ideal operation and remediation effects. • The uncertainty in the slope factor is evaluated under different confidence levels. • Optimal strategies are obtained to support remediation design under uncertainty. - Abstract: An optimal design approach for groundwater remediation is developed through incorporating numerical simulation, health risk assessment, uncertainty analysis and nonlinear optimization within a general framework. Stochastic analysis and goal programming are introduced into the framework to handle uncertainties in real-world groundwater remediation systems. Carcinogenic risks associated with remediation actions are further evaluated at four confidence levels. The differences between ideal and predicted constraints are minimized by goal programming. The approach is then applied to a contaminated site in western Canada for creating a set of optimal remediation strategies. Results from the case study indicate that factors including environmental standards, health risks and technical requirements mutually affect and constrain one another. Stochastic uncertainty exists throughout the entire process of remediation optimization and should be taken into consideration in groundwater remediation design.

  16. LJUNGSKILE 1.0 A Computer Program for Investigation of Uncertainties in Chemical Speciation

    International Nuclear Information System (INIS)

    Ekberg, Christian; Oedegaard-Jensen, Arvid

    2002-11-01

    In analysing the long-term safety of nuclear waste disposal, there is a need to investigate uncertainties in chemical speciation calculations. Chemical speciation is of importance in evaluating the solubility of radionuclides, the chemical degradation of engineering materials, and chemical processes controlling groundwater composition. The uncertainties in chemical speciation may for instance be related to uncertainties in thermodynamic data, the groundwater composition, or the extrapolation to the actual temperature and ionic strength. The magnitude of such uncertainties and its implications are seldom explicitly evaluated in any detail. Commonly available chemical speciation programmes normally do not have a built-in option to include uncertainty ranges. The program developed within this project has the capability of incorporating uncertainty ranges in speciation calculations and can be used for graphical presentation of uncertainty ranges for dominant species. The program should be regarded as a starting point for assessing uncertainties in chemical speciation, since it is not yet comprehensive in its capabilities. There may be limitations in its usefulness to address various geochemical problems. The LJUNGSKILE code allows the user to select two approaches: the Monte Carlo (MC) approach and Latin Hypercube Sampling (LHS). LHS makes it possible to produce satisfactory statistics with a minimum of CPU time. It is, in general, possible to do a simple theoretical speciation calculation within seconds. There are, admittedly, alternatives to LHS, and there is criticism towards the uncritical use of LHS output because correlations commonly exist between some of the input variables. LHS, like MC, is not capable of taking these correlations into account. Such a correlation can, for example, exist between the pH of a solution and the partial pressure of CO2: higher-pH solutions may absorb larger amounts of CO2 and can reduce the CO2 partial pressure.
It is therefore of advantage to
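
    The difference between crude Monte Carlo and LHS lies in the stratification, which a short sketch makes concrete. This is a generic LHS in the unit hypercube, not the LJUNGSKILE implementation: it draws exactly one point from each of n equiprobable bins per variable, which is why good statistics are obtained with few CPU-intensive speciation runs.

    ```python
    import random

    def latin_hypercube(n, dims, rng):
        """One point per stratum: each variable's range is split into n
        equiprobable bins and the bin order is shuffled independently."""
        cols = []
        for _ in range(dims):
            perm = list(range(n))
            rng.shuffle(perm)
            cols.append([(p + rng.random()) / n for p in perm])
        return list(zip(*cols))  # n points in the unit hypercube

    rng = random.Random(7)
    pts = latin_hypercube(10, 2, rng)
    strata_ok = all(sorted(int(p[d] * 10) for p in pts) == list(range(10))
                    for d in range(2))
    print(strata_ok)  # True: every tenth of each variable's range is hit once
    ```

    As the abstract notes, this shuffling leaves the variables uncorrelated by construction, so any real correlation between inputs (such as pH and CO2 partial pressure) is not captured.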

  17. Uncertainty information in climate data records from Earth observation

    Science.gov (United States)

    Merchant, C. J.

    2017-12-01

    How to derive and present uncertainty in climate data records (CDRs) has been debated within the European Space Agency Climate Change Initiative, in search of common principles applicable across a range of essential climate variables. Various points of consensus have been reached, including the importance of improving provision of uncertainty information and the benefit of adopting international norms of metrology for language around the distinct concepts of uncertainty and error. Providing an estimate of standard uncertainty per datum (or the means to readily calculate it) emerged as baseline good practice, and should be highly relevant to users of CDRs when the uncertainty in data is variable (the usual case). Given this baseline, the role of quality flags is clarified as being complementary to and not repetitive of uncertainty information. Data with high uncertainty are not poor quality if a valid estimate of the uncertainty is available. For CDRs and their applications, the error correlation properties across spatio-temporal scales present important challenges that are not fully solved. Error effects that are negligible in the uncertainty of a single pixel may dominate uncertainty in the large-scale and long-term. A further principle is that uncertainty estimates should themselves be validated. The concepts of estimating and propagating uncertainty are generally acknowledged in geophysical sciences, but less widely practised in Earth observation and development of CDRs. Uncertainty in a CDR depends in part (and usually significantly) on the error covariance of the radiances and auxiliary data used in the retrieval. Typically, error covariance information is not available in the fundamental CDR (FCDR) (i.e., with the level-1 radiances), since provision of adequate level-1 uncertainty information is not yet standard practice. Those deriving CDRs thus cannot propagate the radiance uncertainty to their geophysical products. 
The FIDUCEO project (www.fiduceo.eu) is

  18. Uncertainties in scaling factors for ab initio vibrational zero-point energies

    Science.gov (United States)

    Irikura, Karl K.; Johnson, Russell D.; Kacker, Raghu N.; Kessel, Rüdiger

    2009-03-01

    Vibrational zero-point energies (ZPEs) determined from ab initio calculations are often scaled by empirical factors. An empirical scaling factor partially compensates for the effects arising from vibrational anharmonicity and incomplete treatment of electron correlation. These effects are not random but are systematic. We report scaling factors for 32 combinations of theory and basis set, intended for predicting ZPEs from computed harmonic frequencies. An empirical scaling factor carries uncertainty. We quantify and report, for the first time, the uncertainties associated with scaling factors for ZPE. The uncertainties are larger than generally acknowledged; the scaling factors have only two significant digits. For example, the scaling factor for B3LYP/6-31G(d) is 0.9757±0.0224 (standard uncertainty). The uncertainties in the scaling factors lead to corresponding uncertainties in predicted ZPEs. The proposed method for quantifying the uncertainties associated with scaling factors is based upon the Guide to the Expression of Uncertainty in Measurement, published by the International Organization for Standardization. We also present a new reference set of 60 diatomic and 15 polyatomic "experimental" ZPEs that includes estimated uncertainties.
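
    Applying such a scaling factor and its uncertainty is a one-line GUM propagation, sketched here with the B3LYP/6-31G(d) values quoted above (the 150 kJ/mol harmonic ZPE is a hypothetical input).

    ```python
    def scaled_zpe(zpe_harm, scale=0.9757, u_scale=0.0224):
        """Scale a harmonic ZPE and propagate the scaling-factor standard
        uncertainty: u(c*x) = x*u(c) when x itself is taken as exact (GUM)."""
        return scale * zpe_harm, u_scale * zpe_harm

    zpe, u = scaled_zpe(150.0)  # kJ/mol, hypothetical harmonic ZPE
    print(round(zpe, 1), round(u, 1))  # 146.4 3.4
    ```

    The resulting ±3.4 kJ/mol spread on a 146 kJ/mol prediction illustrates the abstract's point that scaling factors carry only two significant digits.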

  19. Extensive neutronic sensitivity-uncertainty analysis of a fusion reactor shielding blanket

    International Nuclear Information System (INIS)

    Hogenbirk, A.

    1994-01-01

    In this paper the results are presented of an extensive neutronic sensitivity-uncertainty study performed for the design of a shielding blanket for a next-step fusion reactor, such as ITER. A code system developed at ECN Petten was used. The uncertainty in an important response parameter, the neutron heating in the inboard superconducting coils, was evaluated. Neutron transport calculations in the 100-neutron-group GAM-II structure were performed using the code ANISN. For the sensitivity and uncertainty calculations the code SUSD was used. Uncertainties due to cross-section uncertainties were taken into account, as well as uncertainties due to uncertainties in the energy and angular distributions of scattered neutrons (SED and SAD uncertainties, respectively). The subject of direct-term uncertainties (i.e. uncertainties due to uncertainties in the kerma factors of the superconducting coils) is briefly touched upon. It is shown that SAD uncertainties, which have been largely neglected until now, contribute significantly to the total uncertainty. Moreover, the contribution of direct-term uncertainties may be large. The total uncertainty in the neutron heating due to Fe cross-sections alone amounts to approximately 25%, which is rather large. However, uncertainty data are scarce and the data may very well be conservative. It is shown in this paper that with the code system used, sensitivity and uncertainty calculations can be performed in a straightforward way. It is therefore suggested that emphasis now be put on the generation of realistic, reliable covariance data for cross-sections as well as for angular and energy distributions. ((orig.))
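    Propagating cross-section covariance data through sensitivities, as described here, is conventionally done with the "sandwich rule". A minimal numerical sketch (all sensitivities and covariances are invented for illustration, not data from the study):

```python
import numpy as np

# Sandwich rule for a toy 3-group problem: the relative variance of a
# response R is S^T C S, where S holds the relative sensitivities of R
# to each group cross-section and C is their relative covariance matrix.
S = np.array([0.8, -0.3, 0.1])
C = np.array([[0.04, 0.01, 0.00],
              [0.01, 0.09, 0.02],
              [0.00, 0.02, 0.16]])

rel_var = S @ C @ S          # relative variance of the response
rel_unc = np.sqrt(rel_var)   # relative standard uncertainty (~17% here)
```

    The off-diagonal terms of C are what make realistic covariance data, rather than variances alone, essential for a credible total uncertainty.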

  20. Aleatoric and epistemic uncertainties in sampling based nuclear data uncertainty and sensitivity analyses

    International Nuclear Information System (INIS)

    Zwermann, W.; Krzykacz-Hausmann, B.; Gallner, L.; Klein, M.; Pautz, A.; Velkov, K.

    2012-01-01

    Sampling-based uncertainty and sensitivity analyses due to epistemic input uncertainties, i.e. due to an incomplete knowledge of uncertain input parameters, can be performed with arbitrary application programs to solve the physical problem under consideration. For the description of steady-state particle transport, direct simulations of the microscopic processes with Monte Carlo codes are often used. This introduces an additional source of uncertainty, the aleatoric sampling uncertainty, which is due to the randomness of the simulation process performed by sampling, and which adds to the total combined output sampling uncertainty. So far, this aleatoric part of uncertainty has been minimized by running a sufficiently large number of Monte Carlo histories for each sample calculation, thus making its impact negligible compared to the impact from sampling the epistemic uncertainties. Obviously, this process may incur high computational costs. The present paper shows that in many applications reliable epistemic uncertainty results can also be obtained with substantially lower computational effort by performing and analyzing two appropriately generated series of samples with a much smaller number of Monte Carlo histories each. The method is applied along with the nuclear data uncertainty and sensitivity code package XSUSA in combination with the Monte Carlo transport code KENO-Va to various critical assemblies and a full-scale reactor calculation. It is shown that the proposed method yields output uncertainties and sensitivities equivalent to the traditional approach, with a reduction of computing time by factors on the order of 100. (authors)
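    The idea of separating the aleatoric contribution using two series of samples can be illustrated with a toy model; the distributions and magnitudes below are invented for demonstration and are not the paper's actual scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

# For each of n epistemic samples, imagine running the Monte Carlo solver
# twice with independent seeds (series A and B) but few histories. The
# paired differences isolate the aleatoric (statistical) variance, which
# can then be subtracted from the total to estimate the epistemic part.
n = 10_000
epistemic = rng.normal(1.0, 0.05, n)       # "true" epistemic spread
a = epistemic + rng.normal(0.0, 0.02, n)   # series A: + aleatoric noise
b = epistemic + rng.normal(0.0, 0.02, n)   # series B: independent noise

var_total = np.var(a, ddof=1)
var_aleatoric = 0.5 * np.var(a - b, ddof=1)  # noise variance from pairs
var_epistemic = var_total - var_aleatoric
```

    The recovered epistemic standard deviation is close to the 0.05 built into the toy model, even though each individual "run" is noisy.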

  1. Uncertainty, probability and information-gaps

    International Nuclear Information System (INIS)

    Ben-Haim, Yakov

    2004-01-01

    This paper discusses two main ideas. First, we focus on info-gap uncertainty, as distinct from probability. Info-gap theory is especially suited for modelling and managing uncertainty in system models: we invest all our knowledge in formulating the best possible model; this leaves the modeller with very faulty and fragmentary information about the variation of reality around that optimal model. Second, we examine the interdependence between uncertainty modelling and decision-making. Good uncertainty modelling requires contact with the end use, namely, with the decision-making application of the uncertainty model. The most important avenue of uncertainty propagation is from initial data and model uncertainties into uncertainty in the decision domain. Two questions arise. Is the decision robust to the initial uncertainties? Is the decision prone to opportune windfall success? We apply info-gap robustness and opportunity functions to the analysis of the representation and propagation of uncertainty in several of the Sandia Challenge Problems.
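    A toy illustration of an info-gap robustness function (the model, names and numbers are invented, not taken from the paper): take revenue r(u) = u * price with demand u uncertain around a nominal estimate u0. The robustness of a decision is the largest fractional horizon h such that every u in [u0*(1-h), u0*(1+h)] still meets a critical requirement r_c.

```python
def robustness(u0, price, r_c):
    """Largest fractional deviation h of demand from its nominal value u0
    such that worst-case revenue u0 * (1 - h) * price still reaches r_c.
    Solving u0 * (1 - h) * price = r_c gives h = 1 - r_c / (u0 * price)."""
    return max(1.0 - r_c / (u0 * price), 0.0)
```

    For example, robustness(100.0, 2.0, 150.0) is 0.25: nominal revenue 200 tolerates up to a 25% demand shortfall before the requirement fails, with no probability distribution assumed anywhere.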

  2. Real Options Effect of Uncertainty and Labor Demand Shocks on the Housing Market

    OpenAIRE

    Lee, Gabriel; Nguyen Thanh, Binh; Strobel, Johannes

    2016-01-01

    This paper shows that uncertainty affects the housing market in two significant ways. First, uncertainty shocks adversely affect housing prices but not the quantities that are traded. Controlling for a broad set of variables in fixed-effects regressions, we find that uncertainty shocks reduce housing prices and median sales prices by 1.4% and 1.8%, respectively, but the effect is not statistically significant for the percentage changes of all homes sold. Second, when...

  3. Uncertainty and sensitivity studies supporting the interpretation of the results of TVO I/II PRA

    International Nuclear Information System (INIS)

    Holmberg, J.

    1992-01-01

    A comprehensive Level 1 probabilistic risk assessment (PRA) has been performed for the TVO I/II nuclear power units. As a part of the PRA project, the uncertainties of the risk models and methods were systematically studied in order to describe them and to demonstrate their impact on the results. The uncertainty study was divided into two phases: a qualitative and a quantitative study. The qualitative study comprised the identification of uncertainties and qualitative assessments of their importance. The PRA was introduced, and the identified assumptions and uncertainties behind the models were documented. The most significant uncertainties were selected by importance measures or other judgements for further quantitative study. The quantitative study included sensitivity studies and the propagation of uncertainty ranges. In the sensitivity studies, uncertain assumptions or parameters were varied in order to illustrate the sensitivity of the models. The propagation of the uncertainty ranges demonstrated the impact of the statistical uncertainties of the parameter values. The Monte Carlo method was used as the propagation method. The most significant uncertainties were those involved in modelling human interactions, dependences and common cause failures (CCFs), loss of coolant accident (LOCA) frequencies and pressure suppression. The qualitative mapping of the uncertainty factors turned out to be useful in planning the quantitative studies. It also served as an internal review of the assumptions made in the PRA. The sensitivity studies were perhaps the most advantageous part of the quantitative study because they allowed individual analyses of the significance of the uncertainty sources identified. The uncertainty study proved a reasonable way to systematically and critically assess uncertainties in a risk analysis.
The usefulness of this study depends on the decision maker (power company) since uncertainty studies are primarily carried out to support decision making when uncertainties are
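    Monte Carlo propagation of parameter uncertainty ranges of the kind described in this record can be sketched with a toy fault tree; the structure and lognormal parameters below are invented for illustration, not data from the TVO I/II study.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy fault tree: the top event occurs if component A fails AND (B OR C)
# fails, all independent. Basic-event probabilities are sampled from
# lognormal distributions representing their statistical uncertainty.
n = 100_000
pA = rng.lognormal(np.log(1e-3), 0.5, n)
pB = rng.lognormal(np.log(2e-2), 0.7, n)
pC = rng.lognormal(np.log(5e-3), 0.7, n)

top = pA * (pB + pC - pB * pC)            # P(A and (B or C)) per sample
lo, med, hi = np.percentile(top, [5, 50, 95])
```

    The 5th/50th/95th percentiles of the top-event probability summarize how the parameter uncertainties propagate through the risk model, which is the kind of output a decision maker would be shown.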

  4. Uncertainty analysis in the applications of nuclear probabilistic risk assessment

    International Nuclear Information System (INIS)

    Le Duy, T.D.

    2011-01-01

    The aim of this thesis is to propose an approach for modelling parameter and model uncertainties affecting the results of risk indicators used in the applications of nuclear Probabilistic Risk Assessment (PRA). After studying the limitations of the traditional probabilistic approach to representing uncertainty in a PRA model, a new approach based on Dempster-Shafer theory is proposed. The uncertainty analysis process of the proposed approach consists of five main steps. The first step models input parameter uncertainties by belief and plausibility functions according to the data available in the PRA model. The second step propagates the parameter uncertainties through the risk model to lay out the uncertainties associated with the output risk indicators. The model uncertainty is then taken into account in the third step by considering possible alternative risk models. The fourth step is intended firstly to provide decision makers with the information needed for decision making under uncertainty (parametric and model) and secondly to identify the input parameters that contribute significant uncertainty to the result. The final step allows the process to be continued in a loop by updating the belief functions given new data. The proposed methodology was implemented on a real but simplified PRA model. (author)
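    The first step, representing an uncertain parameter by belief and plausibility functions, can be shown with a minimal Dempster-Shafer sketch; the frame of discernment and mass values below are invented for illustration.

```python
# Mass assignment over focal subsets of a toy frame {"low", "high"}.
# Mass on the whole frame expresses "don't know" (epistemic ignorance).
masses = {
    frozenset({"low"}): 0.5,
    frozenset({"high"}): 0.2,
    frozenset({"low", "high"}): 0.3,
}

def belief(A):
    """Total mass of focal sets entirely inside A (lower bound on P(A))."""
    return sum(m for B, m in masses.items() if B <= A)

def plausibility(A):
    """Total mass of focal sets intersecting A (upper bound on P(A))."""
    return sum(m for B, m in masses.items() if B & A)
```

    Here Bel({"low"}) = 0.5 and Pl({"low"}) = 0.8, so the interval [0.5, 0.8] brackets the unknown probability, which is exactly the extra expressiveness over a single probability value that motivates the approach.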

  5. Uncertainty Analysis of In leakage Test for Pressurized Control Room Envelop

    Energy Technology Data Exchange (ETDEWEB)

    Lee, J. B. [KHNP Central Research Institute, Daejeon (Korea, Republic of)

    2013-10-15

    In-leakage tests for the control room envelopes (CREs) of newly constructed nuclear power plants are required to prove control room habitability, and the test results should be analyzed using an uncertainty analysis. Test uncertainty can be an issue if the results for pressurized CREs show low in-leakage. To better characterize the test uncertainty, a statistical model for the uncertainty analysis is described here and a representative uncertainty analysis of a sample in-leakage test is presented. With this statistical method, the test result can be evaluated at a chosen level of significance, which is especially helpful when the difference between the two mean values in the test result is small.

  6. Uncertainty Analysis of In leakage Test for Pressurized Control Room Envelop

    International Nuclear Information System (INIS)

    Lee, J. B.

    2013-01-01

    In-leakage tests for the control room envelopes (CREs) of newly constructed nuclear power plants are required to prove control room habitability, and the test results should be analyzed using an uncertainty analysis. Test uncertainty can be an issue if the results for pressurized CREs show low in-leakage. To better characterize the test uncertainty, a statistical model for the uncertainty analysis is described here and a representative uncertainty analysis of a sample in-leakage test is presented. With this statistical method, the test result can be evaluated at a chosen level of significance, which is especially helpful when the difference between the two mean values in the test result is small.
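    A statistical comparison of mean in-leakage values from two test runs can be sketched as follows; the normal approximation, the function name and the sample values are illustrative assumptions, not the method or data of the paper.

```python
import math
from statistics import NormalDist, mean, stdev

def mean_diff_ci(x, y, level=0.95):
    """Confidence interval for the difference of two sample means
    (normal approximation, adequate for moderately large samples)."""
    d = mean(x) - mean(y)
    se = math.sqrt(stdev(x) ** 2 / len(x) + stdev(y) ** 2 / len(y))
    z = NormalDist().inv_cdf(0.5 + level / 2)
    return d - z * se, d + z * se

# Hypothetical in-leakage measurements (cfm) from two test runs:
run1 = [102.1, 98.7, 101.4, 99.9, 100.6, 101.0]
run2 = [100.2, 99.1, 100.8, 98.5, 99.7, 100.3]
lo, hi = mean_diff_ci(run1, run2)
```

    If the interval contains zero, the difference between the two mean values is not significant at the chosen level, which is precisely the small-difference situation the abstract flags as needing a formal uncertainty treatment.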

  7. Research on uncertainty evaluation measure and method of voltage sag severity

    Science.gov (United States)

    Liu, X. N.; Wei, J.; Ye, S. Y.; Chen, B.; Long, C.

    2018-01-01

    Voltage sag is an unavoidable and serious power-quality problem in power systems. This paper provides a general summary and review of the concepts, indices and evaluation methods concerning voltage sag severity. Considering the complexity and uncertainty of the influencing factors and damage degree, and the characteristics and requirements of voltage sag severity on the power source, network and load sides, the measure concepts and the conditions under which they hold, together with the evaluation indices and methods of voltage sag severity, are analyzed. Current evaluation techniques, such as stochastic theory and fuzzy logic, as well as their fusion, are reviewed in detail. An index system for voltage sag severity is provided for comprehensive study. The main aim of this paper is to propose ideas and methods for severity research based on advanced uncertainty theory and uncertainty measures. This study may serve as a valuable guide for researchers interested in the domain of voltage sag severity.

  8. Role of uncertainty in the basalt waste isolation project

    International Nuclear Information System (INIS)

    Knepp, A.J.; Dahlem, D.H.

    1989-01-01

    The current national Civilian Radioactive Waste Management (CRWM) Program to select a mined geologic repository will likely require the extensive use of probabilistic techniques to quantify uncertainty in predictions of repository isolation performance. The performance of nonhomogeneous geologic, hydrologic, and chemical systems must be predicted over time frames of thousands of years and will therefore likely contain significant uncertainty. A qualitative assessment of our limited ability to interrogate the site in a nondestructive manner, coupled with the early stage of development of the pertinent geosciences, supports this statement. The success of the approach for incorporating what currently appears to be an appreciable element of uncertainty into the predictions of repository performance will play an important role in acquiring a license to operate and in establishing the level of safety associated with the concept of long-term geologic storage of nuclear waste. This paper presents a brief background on the Hanford Site and the repository program, references the sources that establish the legislative requirement to quantify uncertainties in performance predictions, and summarizes the present and future program at the Hanford Site in this area. The decision to quantify significant sources of uncertainty has had a major impact on the direction of the site characterization program at Hanford. The paper concludes with a number of observations on the impacts of this decision.

  9. Fungal communities in wheat grain show significant co-existence patterns among species

    DEFF Research Database (Denmark)

    Nicolaisen, M.; Justesen, A. F.; Knorr, K.

    2014-01-01

    identified as ‘core’ OTUs as they were found in all or almost all samples and accounted for almost 99 % of all sequences. The remaining OTUs were only sporadically found and only in small amounts. Cluster and factor analyses showed patterns of co-existence among the core species. Cluster analysis grouped...... the 21 core OTUs into three clusters: cluster 1 consisting of saprotrophs, cluster 2 consisting mainly of yeasts and saprotrophs and cluster 3 consisting of wheat pathogens. Principal component extraction showed that the Fusarium graminearum group was inversely related to OTUs of clusters 1 and 2....

  10. Cost-effective conservation of an endangered frog under uncertainty.

    Science.gov (United States)

    Rose, Lucy E; Heard, Geoffrey W; Chee, Yung En; Wintle, Brendan A

    2016-04-01

    How should managers choose among conservation options when resources are scarce and there is uncertainty regarding the effectiveness of actions? Well-developed tools exist for prioritizing areas for one-time and binary actions (e.g., protect vs. not protect), but methods for prioritizing incremental or ongoing actions (such as habitat creation and maintenance) remain uncommon. We devised an approach that combines metapopulation viability and cost-effectiveness analyses to select among alternative conservation actions while accounting for uncertainty. In our study, cost-effectiveness is the ratio between the benefit of an action and its economic cost, where benefit is the change in metapopulation viability. We applied the approach to the case of the endangered growling grass frog (Litoria raniformis), which is threatened by urban development. We extended a Bayesian model to predict metapopulation viability under 9 urbanization and management scenarios and incorporated the full probability distribution of possible outcomes for each scenario into the cost-effectiveness analysis. This allowed us to discern between cost-effective alternatives that were robust to uncertainty and those with a relatively high risk of failure. We found a relatively high risk of extinction following urbanization if the only action was reservation of core habitat; habitat creation actions performed better than enhancement actions; and cost-effectiveness ranking changed depending on the consideration of uncertainty. Our results suggest that creation and maintenance of wetlands dedicated to L. raniformis is the only cost-effective action likely to result in a sufficiently low risk of extinction. To our knowledge, this is the first study to use Bayesian metapopulation viability analysis to explicitly incorporate parametric and demographic uncertainty into a cost-effectiveness evaluation of conservation actions. The approach offers guidance to decision makers aiming to achieve cost
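    Ranking actions by benefit-per-cost while tracking the risk of failure can be sketched as follows; all action names, costs and benefit distributions are invented for illustration and are not the study's model or data.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical actions: (cost, mean viability gain, std of gain).
actions = {
    "create_wetlands": (5.0, 0.30, 0.05),
    "enhance_habitat": (3.0, 0.10, 0.08),
    "reserve_core":    (1.0, 0.02, 0.04),
}

n = 100_000
results = {}
for name, (cost, mu, sd) in actions.items():
    gain = rng.normal(mu, sd, n)            # uncertain viability benefit
    results[name] = {
        "cost_effectiveness": gain.mean() / cost,
        "p_no_benefit": (gain <= 0).mean(),  # risk the action fails
    }

ranking = sorted(results, key=lambda a: -results[a]["cost_effectiveness"])
```

    Carrying the full distribution of outcomes, rather than a point estimate, is what lets this kind of analysis distinguish cost-effective actions that are robust to uncertainty from those with a high chance of delivering no benefit.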

  11. From Rupture to Resonance: Uncertainty and Scholarship in Fine Art Research Degrees

    Science.gov (United States)

    Simmons, Beverley; Holbrook, Allyson

    2013-01-01

    This article focuses on the phenomenon of "rupture" identified in student narratives of uncertainty and scholarship experienced during the course of Fine Art research degrees in two Australian universities. Rupture captures the phenomenon of severe disruption or discontinuity in existing knowledge and typically signifies epistemological…

  12. Economic uncertainty and its impact on the Croatian economy

    Directory of Open Access Journals (Sweden)

    Petar Soric

    2017-12-01

    The aim of this paper is to quantify institutional (political and fiscal) and non-institutional uncertainty (economic policy uncertainty, the Economists’ recession index, natural-disaster-related uncertainty, and several disagreement measures). The stated indicators are based on articles from highly popular Croatian news portals, the repository of law amendments (Narodne novine), and Business and Consumer Surveys. We also introduce a composite uncertainty indicator, obtained by the principal components method. The analysis of a structural VAR model of the Croatian economy (both with fixed and time-varying parameters) shows that most of the analysed indicators are significant predictors of economic activity. It is demonstrated that their impact on industrial production is strongest at the onset of a crisis. On the other hand, the influence of fiscal uncertainty exhibits just the opposite tendency: it strengthens with the intensification of economic activity, which partially exculpates the possible utilization of fiscal expansion as a counter-crisis tool.

  13. A Variation on Uncertainty Principle and Logarithmic Uncertainty Principle for Continuous Quaternion Wavelet Transforms

    Directory of Open Access Journals (Sweden)

    Mawardi Bahri

    2017-01-01

    The continuous quaternion wavelet transform (CQWT) is a generalization of the classical continuous wavelet transform within the context of quaternion algebra. First of all, we show that the directional quaternion Fourier transform (QFT) uncertainty principle can be obtained using the component-wise QFT uncertainty principle. Based on this method, the directional QFT uncertainty principle using the representation in polar coordinate form is easily derived. We derive a variation on the uncertainty principle related to the QFT. We state that the CQWT of a quaternion function can be written in terms of the QFT and obtain a variation on the uncertainty principle related to the CQWT. Finally, we apply the extended uncertainty principles and properties of the CQWT to establish logarithmic uncertainty principles related to the generalized transform.

  14. Realising the Uncertainty Enabled Model Web

    Science.gov (United States)

    Cornford, D.; Bastin, L.; Pebesma, E. J.; Williams, M.; Stasch, C.; Jones, R.; Gerharz, L.

    2012-12-01

    The FP7 funded UncertWeb project aims to create the "uncertainty enabled model web". The central concept here is that geospatial models and data resources are exposed via standard web service interfaces, such as the Open Geospatial Consortium (OGC) suite of encodings and interface standards, allowing the creation of complex workflows combining both data and models. The focus of UncertWeb is on the issue of managing uncertainty in such workflows, and providing the standards, architecture, tools and software support necessary to realise the "uncertainty enabled model web". In this paper we summarise the developments in the first two years of UncertWeb, illustrating several key points with examples taken from the use case requirements that motivate the project. Firstly we address the issue of encoding specifications. We explain the usage of UncertML 2.0, a flexible encoding for representing uncertainty based on a probabilistic approach. This is designed to be used within existing standards such as Observations and Measurements (O&M) and data quality elements of ISO19115 / 19139 (geographic information metadata and encoding specifications) as well as more broadly outside the OGC domain. We show profiles of O&M that have been developed within UncertWeb and how UncertML 2.0 is used within these. We also show encodings based on NetCDF and discuss possible future directions for encodings in JSON. We then discuss the issues of workflow construction, considering discovery of resources (both data and models). We discuss why a brokering approach to service composition is necessary in a world where the web service interfaces remain relatively heterogeneous, including many non-OGC approaches, in particular the more mainstream SOAP and WSDL approaches. We discuss the trade-offs between delegating uncertainty management functions to the service interfaces themselves and integrating the functions in the workflow management system. We describe two utility services to address

  15. Position-momentum uncertainty relations in the presence of quantum memory

    Energy Technology Data Exchange (ETDEWEB)

    Furrer, Fabian, E-mail: furrer@eve.phys.s.u-tokyo.ac.jp [Department of Physics, Graduate School of Science, University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-0033 (Japan); Berta, Mario [Institute for Quantum Information and Matter, Caltech, Pasadena, California 91125 (United States); Institute for Theoretical Physics, ETH Zurich, Wolfgang-Pauli-Str. 27, 8093 Zürich (Switzerland); Tomamichel, Marco [School of Physics, The University of Sydney, Sydney 2006 (Australia); Centre for Quantum Technologies, National University of Singapore, Singapore 117543 (Singapore); Scholz, Volkher B. [Institute for Theoretical Physics, ETH Zurich, Wolfgang-Pauli-Str. 27, 8093 Zürich (Switzerland); Christandl, Matthias [Institute for Theoretical Physics, ETH Zurich, Wolfgang-Pauli-Str. 27, 8093 Zürich (Switzerland); Department of Mathematical Sciences, University of Copenhagen, Universitetsparken 5, 2100 Copenhagen (Denmark)

    2014-12-15

    A prominent formulation of the uncertainty principle identifies the fundamental quantum feature that no particle may be prepared with certain outcomes for both position and momentum measurements. Often the statistical uncertainties are thereby measured in terms of entropies providing a clear operational interpretation in information theory and cryptography. Recently, entropic uncertainty relations have been used to show that the uncertainty can be reduced in the presence of entanglement and to prove security of quantum cryptographic tasks. However, much of this recent progress has been focused on observables with only a finite number of outcomes not including Heisenberg’s original setting of position and momentum observables. Here, we show entropic uncertainty relations for general observables with discrete but infinite or continuous spectrum that take into account the power of an entangled observer. As an illustration, we evaluate the uncertainty relations for position and momentum measurements, which is operationally significant in that it implies security of a quantum key distribution scheme based on homodyne detection of squeezed Gaussian states.

  16. On the relationship between aerosol model uncertainty and radiative forcing uncertainty.

    Science.gov (United States)

    Lee, Lindsay A; Reddington, Carly L; Carslaw, Kenneth S

    2016-05-24

    The largest uncertainty in the historical radiative forcing of climate is caused by the interaction of aerosols with clouds. Historical forcing is not a directly measurable quantity, so reliable assessments depend on the development of global models of aerosols and clouds that are well constrained by observations. However, there has been no systematic assessment of how reduction in the uncertainty of global aerosol models will feed through to the uncertainty in the predicted forcing. We use a global model perturbed parameter ensemble to show that tight observational constraint of aerosol concentrations in the model has a relatively small effect on the aerosol-related uncertainty in the calculated forcing between preindustrial and present-day periods. One factor is the low sensitivity of present-day aerosol to natural emissions that determine the preindustrial aerosol state. However, the major cause of the weak constraint is that the full uncertainty space of the model generates a large number of model variants that are equally acceptable compared to present-day aerosol observations. The narrow range of aerosol concentrations in the observationally constrained model gives the impression of low aerosol model uncertainty. However, these multiple "equifinal" models predict a wide range of forcings. To make progress, we need to develop a much deeper understanding of model uncertainty and ways to use observations to constrain it. Equifinality in the aerosol model means that tuning of a small number of model processes to achieve model-observation agreement could give a misleading impression of model robustness.

  17. Essentialist beliefs, sexual identity uncertainty, internalized homonegativity and psychological wellbeing in gay men.

    Science.gov (United States)

    Morandini, James S; Blaszczynski, Alexander; Ross, Michael W; Costa, Daniel S J; Dar-Nimrod, Ilan

    2015-07-01

    The present study examined essentialist beliefs about sexual orientation and their implications for sexual identity uncertainty, internalized homonegativity and psychological wellbeing in a sample of gay men. A combination of targeted sampling and snowball strategies was used to recruit 639 gay-identifying men for a cross-sectional online survey. Participants completed a questionnaire assessing sexual orientation beliefs, sexual identity uncertainty, internalized homonegativity, and psychological wellbeing outcomes. Structural equation modeling was used to test whether essentialist beliefs were associated with psychological wellbeing indirectly via their effect on sexual identity uncertainty and internalized homonegativity. A unique pattern of direct and indirect effects was observed in which facets of essentialism predicted sexual identity uncertainty, internalized homonegativity and psychological wellbeing. Of note, viewing sexual orientation as immutable/biologically based and as existing in discrete categories was associated with less sexual identity uncertainty. On the other hand, these beliefs had divergent relationships with internalized homonegativity, with immutability/biological beliefs associated with lower, and discreteness beliefs associated with greater, internalized homonegativity. Of interest, although sexual identity uncertainty was associated with poorer psychological wellbeing via its contribution to internalized homonegativity, there was no direct relationship between identity uncertainty and psychological wellbeing. Findings indicate that essentializing sexual orientation has mixed implications for sexual identity uncertainty, internalized homonegativity and wellbeing in gay men. Those undertaking educational and clinical interventions with gay men should be aware of the benefits and caveats of essentialist theories of homosexuality for this population. (c) 2015 APA, all rights reserved.

  18. Robust Path Planning and Feedback Design Under Stochastic Uncertainty

    Science.gov (United States)

    Blackmore, Lars

    2008-01-01

    Autonomous vehicles require optimal path planning algorithms to achieve mission goals while avoiding obstacles and remaining robust to uncertainties. The uncertainties arise from exogenous disturbances, modeling errors, and sensor noise, which can be characterized via stochastic models. Previous work defined a notion of robustness in a stochastic setting by using the concept of chance constraints. This requires that mission constraint violation can occur only with a probability less than a prescribed value. In this paper we describe a novel method for optimal chance-constrained path planning with feedback design. The approach optimizes both the reference trajectory to be followed and the feedback controller used to reject uncertainty. Our method extends recent results in constrained control synthesis based on convex optimization to solve control problems with nonconvex constraints. This extension is essential for path planning problems, which inherently have nonconvex obstacle avoidance constraints. Unlike previous approaches to chance-constrained path planning, the new approach optimizes the feedback gain as well as the reference trajectory. The key idea is to couple a fast, nonconvex solver that does not take uncertainty into account with existing robust approaches that apply only to convex feasible regions. By alternating between robust and nonrobust solutions, the new algorithm guarantees convergence to a global optimum. We apply the new method to an unmanned aircraft and show simulation results that demonstrate the efficacy of the approach.
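    A single linear chance constraint under Gaussian position uncertainty illustrates the basic trade between violation probability and constraint tightening; this is a generic sketch, not the paper's algorithm, and the names and numbers are invented.

```python
import math
from statistics import NormalDist

def tightened_bound(a, Sigma, b, delta):
    """Deterministic bound on a . mu such that P(a . x > b) <= delta when
    x ~ N(mu, Sigma) in 2-D: a . mu <= b - z_{1-delta} * sqrt(a^T Sigma a)."""
    z = NormalDist().inv_cdf(1.0 - delta)
    quad = (a[0] * a[0] * Sigma[0][0]
            + 2.0 * a[0] * a[1] * Sigma[0][1]
            + a[1] * a[1] * Sigma[1][1])
    return b - z * math.sqrt(quad)
```

    With a = (1, 0), Sigma = [[0.04, 0], [0, 0.04]], b = 1 and delta = 0.05, the mean must satisfy a . mu <= about 0.671: the planner keeps the nominal trajectory roughly 1.64 standard deviations clear of the constraint boundary, and reducing the feedback gain's residual covariance directly relaxes this tightening.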

  19. Quantifying and reducing model-form uncertainties in Reynolds-averaged Navier–Stokes simulations: A data-driven, physics-informed Bayesian approach

    International Nuclear Information System (INIS)

    Xiao, H.; Wu, J.-L.; Wang, J.-X.; Sun, R.; Roy, C.J.

    2016-01-01

    Despite their well-known limitations, Reynolds-Averaged Navier–Stokes (RANS) models are still the workhorse tools for turbulent flow simulations in today's engineering analysis, design and optimization. While the predictive capability of RANS models depends on many factors, for many practical flows the turbulence models are by far the largest source of uncertainty. As RANS models are used in the design and safety evaluation of many mission-critical systems such as airplanes and nuclear power plants, quantifying their model-form uncertainties has significant implications in enabling risk-informed decision-making. In this work we develop a data-driven, physics-informed Bayesian framework for quantifying model-form uncertainties in RANS simulations. Uncertainties are introduced directly to the Reynolds stresses and are represented with compact parameterization accounting for empirical prior knowledge and physical constraints (e.g., realizability, smoothness, and symmetry). An iterative ensemble Kalman method is used to assimilate the prior knowledge and observation data in a Bayesian framework, and to propagate them to posterior distributions of velocities and other Quantities of Interest (QoIs). We use two representative cases, the flow over periodic hills and the flow in a square duct, to evaluate the performance of the proposed framework. Both cases are challenging for standard RANS turbulence models. Simulation results suggest that, even with very sparse observations, the obtained posterior mean velocities and other QoIs have significantly better agreement with the benchmark data compared to the baseline results. At most locations the posterior distribution adequately captures the true model error within the developed model form uncertainty bounds. The framework is a major improvement over existing black-box, physics-neutral methods for model-form uncertainty quantification, where prior knowledge and details of the models are not exploited. This approach has
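    A single ensemble Kalman analysis step, the building block of the iterative method described here, can be sketched as follows; this is a generic perturbed-observation update with invented dimensions and values, not the paper's full RANS framework.

```python
import numpy as np

rng = np.random.default_rng(7)

# Update an ensemble of uncertain parameters x given sparse observations
# y = H x + noise. Columns of X are ensemble members.
n_ens, n_x, n_obs = 50, 4, 2
X = rng.normal(0.0, 1.0, (n_x, n_ens))        # prior ensemble
H = np.array([[1.0, 0.0, 0.0, 0.0],
              [0.0, 1.0, 0.0, 0.0]])          # observation operator
R = 0.1 * np.eye(n_obs)                        # observation-error covariance
y = np.array([0.5, -0.2])                      # observed values

A = X - X.mean(axis=1, keepdims=True)          # ensemble anomalies
P = A @ A.T / (n_ens - 1)                      # sample covariance
K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)   # Kalman gain
Y = y[:, None] + rng.multivariate_normal(np.zeros(n_obs), R, n_ens).T
X_post = X + K @ (Y - H @ X)                   # perturbed-observation update
```

    The posterior ensemble mean is pulled toward the observations and its spread shrinks in the observed directions, which is how sparse data can tighten the prior (e.g., Reynolds-stress) uncertainty in such a framework.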

  20. Quantifying and reducing model-form uncertainties in Reynolds-averaged Navier–Stokes simulations: A data-driven, physics-informed Bayesian approach

    Energy Technology Data Exchange (ETDEWEB)

    Xiao, H., E-mail: hengxiao@vt.edu; Wu, J.-L.; Wang, J.-X.; Sun, R.; Roy, C.J.

    2016-11-01

    Despite their well-known limitations, Reynolds-Averaged Navier–Stokes (RANS) models are still the workhorse tools for turbulent flow simulations in today's engineering analysis, design and optimization. While the predictive capability of RANS models depends on many factors, for many practical flows the turbulence models are by far the largest source of uncertainty. As RANS models are used in the design and safety evaluation of many mission-critical systems such as airplanes and nuclear power plants, quantifying their model-form uncertainties has significant implications in enabling risk-informed decision-making. In this work we develop a data-driven, physics-informed Bayesian framework for quantifying model-form uncertainties in RANS simulations. Uncertainties are introduced directly to the Reynolds stresses and are represented with compact parameterization accounting for empirical prior knowledge and physical constraints (e.g., realizability, smoothness, and symmetry). An iterative ensemble Kalman method is used to assimilate the prior knowledge and observation data in a Bayesian framework, and to propagate them to posterior distributions of velocities and other Quantities of Interest (QoIs). We use two representative cases, the flow over periodic hills and the flow in a square duct, to evaluate the performance of the proposed framework. Both cases are challenging for standard RANS turbulence models. Simulation results suggest that, even with very sparse observations, the obtained posterior mean velocities and other QoIs have significantly better agreement with the benchmark data compared to the baseline results. At most locations the posterior distribution adequately captures the true model error within the developed model form uncertainty bounds. The framework is a major improvement over existing black-box, physics-neutral methods for model-form uncertainty quantification, where prior knowledge and details of the models are not exploited. This approach
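The iterative ensemble Kalman update at the core of the framework described above can be sketched in a few lines. This is an illustrative toy, not the authors' code: the linear observation map `H`, ensemble size, noise levels, and iteration count are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def enkf_update(ensemble, observe, y_obs, obs_cov):
    """One ensemble Kalman analysis step: nudge each parameter sample
    toward the observations using sample covariances (sketch only)."""
    X = ensemble                                # (n_members, n_params)
    Y = np.array([observe(x) for x in X])       # predicted observations
    A, B = X - X.mean(0), Y - Y.mean(0)
    n = X.shape[0]
    Cxy = A.T @ B / (n - 1)                     # param/obs cross-covariance
    Cyy = B.T @ B / (n - 1) + obs_cov           # obs covariance + noise
    K = Cxy @ np.linalg.inv(Cyy)                # Kalman gain
    # perturb observations so the posterior spread stays consistent
    y_pert = y_obs + rng.multivariate_normal(np.zeros(len(y_obs)), obs_cov, n)
    return X + (y_pert - Y) @ K.T

# toy setup: two parameters observed through an assumed linear map H
H = np.array([[1.0, 0.5], [0.2, 1.0]])
observe = lambda x: H @ x
truth = np.array([1.0, -1.0])
R = 0.01 * np.eye(2)
post = rng.normal(0.0, 2.0, size=(200, 2))      # broad prior ensemble
for _ in range(3):                              # iterate, as in the paper
    post = enkf_update(post, observe, observe(truth), R)
print(post.mean(0))                             # posterior mean near truth
```

The same update applies unchanged when `observe` is an expensive RANS solve; only the cost of evaluating the ensemble changes.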

  1. Decision-making under great uncertainty

    International Nuclear Information System (INIS)

    Hansson, S.O.

    1992-01-01

    Five types of decision-uncertainty are distinguished: uncertainty of consequences, of values, of demarcation, of reliance, and of co-ordination. Strategies are proposed for each type of uncertainty. The general conclusion is that it is meaningful for decision theory to treat cases with greater uncertainty than the textbook case of 'decision-making under uncertainty'. (au)

  2. Quantifying uncertainty in NDSHA estimates due to earthquake catalogue

    Science.gov (United States)

    Magrin, Andrea; Peresan, Antonella; Vaccari, Franco; Panza, Giuliano

    2014-05-01

The procedure for the neo-deterministic seismic zoning, NDSHA, is based on the calculation of synthetic seismograms by the modal summation technique. This approach makes use of information about the space distribution of large magnitude earthquakes, which can be defined based on seismic history and seismotectonics, as well as incorporating information from a wide set of geological and geophysical data (e.g., morphostructural features and ongoing deformation processes identified by earth observations). Hence the method does not make use of attenuation models (GMPE), which may be unable to account for the complexity of the product between the seismic source tensor and the medium Green function, and are often poorly constrained by the available observations. NDSHA defines the hazard from the envelope of the values of ground motion parameters determined considering a wide set of scenario earthquakes; accordingly, the simplest outcome of this method is a map where the maximum of a given seismic parameter is associated with each site. In NDSHA uncertainties are not statistically treated as in PSHA, where aleatory uncertainty is traditionally handled with probability density functions (e.g., for magnitude and distance random variables) and epistemic uncertainty is considered by applying logic trees that allow the use of alternative models and alternative parameter values of each model; instead, the treatment of uncertainties is performed by sensitivity analyses for key modelling parameters. Fixing the uncertainty related to a particular input parameter is an important component of the procedure. The input parameters must account for the uncertainty in the prediction of fault radiation and in the use of Green functions for a given medium. A key parameter is the magnitude of the sources used in the simulation, which is based on catalogue information, seismogenic zones and seismogenic nodes. Because the largest part of the existing catalogues is based on macroseismic intensity, a rough estimate

  3. Uncertainties in human health risk assessment of environmental contaminants: A review and perspective.

    Science.gov (United States)

    Dong, Zhaomin; Liu, Yanju; Duan, Luchun; Bekele, Dawit; Naidu, Ravi

    2015-12-01

Addressing uncertainties in human health risk assessment is a critical issue when evaluating the effects of contaminants on public health. A range of uncertainties exist through the source-to-outcome continuum, including exposure assessment, hazard and risk characterisation. While various strategies have been applied to characterising uncertainty, classical approaches largely rely on maximising the available resources. Expert judgement, defaults and tools for characterising quantitative uncertainty attempt to fill the gap between data and regulatory requirements. Experience from research on 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD) illustrates the sources of uncertainty and how available information can be maximised to determine uncertainties, and thereby provide 'adequate' protection against contaminant exposure. As regulatory requirements and recurring issues increase, the assessment of complex scenarios involving a large number of chemicals requires more sophisticated tools. Recent advances in exposure and toxicology science provide a large data set for environmental contaminants and public health. In particular, biomonitoring information, in vitro data streams and computational toxicology are crucial factors in NexGen risk assessment, as well as in minimising uncertainties. Although in this review we cannot yet predict how exposure science and modern toxicology will develop in the long term, current techniques from emerging science can be integrated to improve decision-making. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. Cost uncertainty for different levels of technology maturity

    International Nuclear Information System (INIS)

    DeMuth, S.F.; Franklin, A.L.

    1996-01-01

    It is difficult at best to apply a single methodology for estimating cost uncertainties related to technologies of differing maturity. While highly mature technologies may have significant performance and manufacturing cost data available, less well developed technologies may be defined in only conceptual terms. Regardless of the degree of technical maturity, often a cost estimate relating to application of the technology may be required to justify continued funding for development. Yet, a cost estimate without its associated uncertainty lacks the information required to assess the economic risk. For this reason, it is important for the developer to provide some type of uncertainty along with a cost estimate. This study demonstrates how different methodologies for estimating uncertainties can be applied to cost estimates for technologies of different maturities. For a less well developed technology an uncertainty analysis of the cost estimate can be based on a sensitivity analysis; whereas, an uncertainty analysis of the cost estimate for a well developed technology can be based on an error propagation technique from classical statistics. It was decided to demonstrate these uncertainty estimation techniques with (1) an investigation of the additional cost of remediation due to beyond baseline, nearly complete, waste heel retrieval from underground storage tanks (USTs) at Hanford; and (2) the cost related to the use of crystalline silico-titanate (CST) rather than the baseline CS100 ion exchange resin for cesium separation from UST waste at Hanford
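The classical error-propagation route mentioned above for mature technologies can be sketched as a first-order variance sum. The cost model and input uncertainties below are hypothetical, chosen only to show the mechanics.

```python
import numpy as np

def propagate_uncertainty(f, x0, sigmas, eps=1e-6):
    """First-order (classical) error propagation for independent inputs:
    sigma_f^2 = sum_i (df/dx_i * sigma_i)^2, derivatives by finite differences."""
    x0 = np.asarray(x0, dtype=float)
    f0 = f(x0)
    var = 0.0
    for i, s in enumerate(sigmas):
        xp = x0.copy()
        xp[i] += eps
        var += (((f(xp) - f0) / eps) * s) ** 2
    return np.sqrt(var)

# hypothetical cost model for a mature technology: unit cost * volume + fixed
cost = lambda x: x[0] * x[1] + x[2]
nominal = [120.0, 50.0, 400.0]   # $/unit, units, fixed $ (assumed)
sigmas = [10.0, 5.0, 50.0]       # 1-sigma input uncertainties (assumed)
sigma_cost = propagate_uncertainty(cost, nominal, sigmas)
print(f"cost = {cost(nominal):.0f} ± {sigma_cost:.0f}")
```

For a less mature technology, the same loop doubles as a sensitivity analysis: the individual `(df/dx_i * sigma_i)**2` terms rank which inputs dominate the uncertainty.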

  5. A location-inventory model for distribution centers in a three-level supply chain under uncertainty

    OpenAIRE

    Ali Bozorgi-Amiri; M. Saeed Jabalameli; Sara Gharegozloo Hamedani

    2013-01-01

We study a location-inventory problem in a three-level supply chain network under uncertainty, which leads to risk. The (r,Q) inventory control policy is applied for this problem. Besides, uncertainty exists in different parameters such as procurement, transportation costs, supply, demand and the capacity of different facilities (due to disasters, man-made events, etc.). We present a robust optimization model, which concurrently specifies: locations of distribution centers to be opened, inve...

  6. DS02 uncertainty analysis

    International Nuclear Information System (INIS)

    Kaul, Dean C.; Egbert, Stephen D.; Woolson, William A.

    2005-01-01

    In order to avoid the pitfalls that so discredited DS86 and its uncertainty estimates, and to provide DS02 uncertainties that are both defensible and credible, this report not only presents the ensemble uncertainties assembled from uncertainties in individual computational elements and radiation dose components but also describes how these relate to comparisons between observed and computed quantities at critical intervals in the computational process. These comparisons include those between observed and calculated radiation free-field components, where observations include thermal- and fast-neutron activation and gamma-ray thermoluminescence, which are relevant to the estimated systematic uncertainty for DS02. The comparisons also include those between calculated and observed survivor shielding, where the observations consist of biodosimetric measurements for individual survivors, which are relevant to the estimated random uncertainty for DS02. (J.P.N.)

  7. An EPGPT-based approach for uncertainty quantification

    International Nuclear Information System (INIS)

    Wang, C.; Abdel-Khalik, H. S.

    2012-01-01

    Generalized Perturbation Theory (GPT) has been widely used by many scientific disciplines to perform sensitivity analysis and uncertainty quantification. This manuscript employs recent developments in GPT theory, collectively referred to as Exact-to-Precision Generalized Perturbation Theory (EPGPT), to enable uncertainty quantification for computationally challenging models, e.g. nonlinear models associated with many input parameters and many output responses and with general non-Gaussian parameters distributions. The core difference between EPGPT and existing GPT is in the way the problem is formulated. GPT formulates an adjoint problem that is dependent on the response of interest. It tries to capture via the adjoint solution the relationship between the response of interest and the constraints on the state variations. EPGPT recasts the problem in terms of a smaller set of what is referred to as the 'active' responses which are solely dependent on the physics model and the boundary and initial conditions rather than on the responses of interest. The objective of this work is to apply an EPGPT methodology to propagate cross-sections variations in typical reactor design calculations. The goal is to illustrate its use and the associated impact for situations where the typical Gaussian assumption for parameters uncertainties is not valid and when nonlinear behavior must be considered. To allow this demonstration, exaggerated variations will be employed to stimulate nonlinear behavior in simple prototypical neutronics models. (authors)

  8. Fuzzy uncertainty modeling applied to AP1000 nuclear power plant LOCA

    International Nuclear Information System (INIS)

    Ferreira Guimaraes, Antonio Cesar; Franklin Lapa, Celso Marcelo; Lamego Simoes Filho, Francisco Fernando; Cabral, Denise Cunha

    2011-01-01

Research highlights: → This article presents an uncertainty modelling study using a fuzzy approach. → The AP1000 Westinghouse NPP, which is provided with passive safety systems, was used. → The use of advanced passive safety systems in NPPs has limited operational experience. → Failure rates and basic event probabilities are used in the fault tree analysis. → A fuzzy uncertainty approach was applied to the reliability of the AP1000 large LOCA. - Abstract: This article presents an uncertainty modeling study using a fuzzy approach applied to the Westinghouse advanced nuclear reactor. The AP1000 Westinghouse Nuclear Power Plant (NPP) is provided with passive safety systems, based on thermophysical phenomena, that require no operator actions soon after an incident has been detected. The use of advanced passive safety systems in NPPs has limited operational experience. As occurs in any reliability study, statistically non-significant event reports introduce a significant level of uncertainty in the failure rates and basic event probabilities used in the fault tree analysis (FTA). In order to model this uncertainty, a fuzzy approach was employed in the reliability analysis of the AP1000 large break Loss of Coolant Accident (LOCA). The final results reveal that the proposed approach may be successfully applied to the modeling of uncertainties in safety studies.

  9. Uncertainty: a discriminator for above and below boiling repository design decisions

    International Nuclear Information System (INIS)

    Wilder, D G; Lin, W; Buscheck, T A; Wolery, T J; Francis, N D

    2000-01-01

The US nuclear waste disposal program is evaluating the Yucca Mountain (YM) site for possible disposal of nuclear waste. Radioactive decay of the waste, particularly spent fuel, generates sufficient heat to significantly raise repository temperatures. Environmental conditions in the repository system evolve in response to this heat. The amount of temperature increase, and thus environmental changes, depends on repository design and operations. Because the evolving environment cannot be directly measured until after waste is emplaced, licensing decisions must be based upon model and analytical projections of the environmental conditions. These analyses have inherent uncertainties. There is concern that elevated temperatures increase uncertainty, because most chemical reaction rates increase with temperature and boiling introduces the additional complexity of vapor-phase reactions and transport. This concern was expressed by the NWTRB, particularly for above-boiling temperatures. They state that "the cooler the repository, the lower the uncertainty about heat-driven water migration and the better the performance of waste package materials. Above this temperature, technical uncertainties tend to be significantly higher than those associated with below-boiling conditions." (Cohon 1999). However, not all uncertainties are reduced by lower temperatures; indeed, some may even be increased. This paper addresses the impacts of temperature on uncertainties

  10. Uncertainty Quantification of Multi-Phase Closures

    Energy Technology Data Exchange (ETDEWEB)

    Nadiga, Balasubramanya T. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Baglietto, Emilio [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States)

    2017-10-27

In the ensemble-averaged dispersed phase formulation used for CFD of multiphase flows in nuclear reactor thermohydraulics, closures of interphase transfer of mass, momentum, and energy constitute, by far, the biggest source of error and uncertainty. Reliable estimators of this source of error and uncertainty are currently non-existent. Here, we report on how modern Validation and Uncertainty Quantification (VUQ) techniques can be leveraged to not only quantify such errors and uncertainties, but also to uncover (unintended) interactions between closures of different phenomena. As such, this approach serves as a valuable aid in the research and development of multiphase closures. The joint modeling of lift, drag, wall lubrication, and turbulent dispersion (forces that lead to transfer of momentum between the liquid and gas phases) is examined in the framework of validation of the adiabatic but turbulent experiments of Liu and Bankoff, 1993. An extensive calibration study is undertaken with a popular combination of closure relations and the popular k-ϵ turbulence model in a Bayesian framework. When a wide range of superficial liquid and gas velocities and void fractions is considered, it is found that this set of closures can be validated against the experimental data only by allowing large variations in the coefficients associated with the closures. We argue that such an extent of variation is a measure of uncertainty induced by the chosen set of closures. We also find that while mean fluid velocity and void fraction profiles are properly fit, fluctuating fluid velocity may or may not be properly fit. This aspect needs to be investigated further. The popular set of closures considered contains ad-hoc components and is undesirable from a predictive modeling point of view. Consequently, we next consider improvements that are being developed by the MIT group under CASL and which remove the ad-hoc elements. We use non-intrusive methodologies for sensitivity analysis and calibration (using

  11. A Statistical Modeling Framework for Characterising Uncertainty in Large Datasets: Application to Ocean Colour

    Directory of Open Access Journals (Sweden)

    Peter E. Land

    2018-05-01

Uncertainty estimation is crucial to establishing confidence in any data analysis, and this is especially true for Essential Climate Variables, including ocean colour. Methods for deriving uncertainty vary greatly across data types, so a generic statistics-based approach applicable to multiple data types is an advantage that simplifies the use and understanding of uncertainty data. Progress towards rigorous uncertainty analysis of ocean colour has been slow, in part because of the complexity of ocean colour processing. Here, we present a general approach to uncertainty characterisation, using a database of satellite-in situ matchups to generate a statistical model of satellite uncertainty as a function of its contributing variables. With an example NASA MODIS-Aqua chlorophyll-a matchup database mostly covering the north Atlantic, we demonstrate a model that explains 67% of the squared error in log(chlorophyll-a) as a potentially correctable bias, with the remaining uncertainty being characterised as standard deviation and standard error at each pixel. The method is quite general, depending only on the existence of a suitable database of matchups or reference values, and can be applied to other sensors and data types such as other satellite-observed Essential Climate Variables, empirical algorithms derived from in situ data, or even model data.
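The matchup-based idea above can be illustrated with a deliberately simplified sketch: a synthetic satellite/in situ matchup set, a fitted bias correction, and the split of squared error into correctable bias and residual per-pixel uncertainty. All numbers are invented, and the single-predictor linear fit stands in for the paper's multivariate statistical model.

```python
import numpy as np

rng = np.random.default_rng(1)

# synthetic matchup "database": in situ truth vs. a biased, noisy
# satellite retrieval, both in log10(chlorophyll-a); values are illustrative
truth = rng.uniform(-1.5, 0.5, 500)
sat = 0.6 * truth - 0.3 + rng.normal(0.0, 0.1, 500)

# fit a linear bias-correction model truth ≈ a*sat + b from the matchups
a, b = np.polyfit(sat, truth, 1)
corrected = a * sat + b

resid_raw = sat - truth            # error before bias correction
resid_cor = corrected - truth      # error after bias correction
explained = 1.0 - np.mean(resid_cor ** 2) / np.mean(resid_raw ** 2)
print(f"bias model removes {explained:.0%} of the squared error")
print(f"residual per-pixel uncertainty ≈ {resid_cor.std():.3f}")
```

In the real method the correction and residual statistics would be modelled as functions of several contributing variables rather than a single global fit.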

  12. Multifidelity, Multidisciplinary Design Under Uncertainty with Non-Intrusive Polynomial Chaos

    Science.gov (United States)

    West, Thomas K., IV; Gumbert, Clyde

    2017-01-01

The primary objective of this work is to develop an approach for multifidelity uncertainty quantification and to lay the framework for future design under uncertainty efforts. In this study, multifidelity is used to describe both the fidelity of the modeling of the physical systems and the difference in the uncertainty in each of the models. For computational efficiency, a multifidelity surrogate modeling approach based on non-intrusive polynomial chaos using the point-collocation technique is developed for the treatment of both multifidelity modeling and multifidelity uncertainty modeling. Two stochastic model problems are used to demonstrate the developed methodologies: a transonic airfoil model and a multidisciplinary aircraft analysis model. The results of both showed that the multifidelity modeling approach was able to predict the output uncertainty of the high-fidelity model with a significant reduction in computational cost.
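Non-intrusive polynomial chaos by point collocation, the building block named above, can be sketched for a single standard-normal input. The toy response function, degree, and sample count are assumptions for illustration; the moment formulas follow from the orthogonality of the probabilists' Hermite polynomials.

```python
import math
import numpy as np
from numpy.polynomial.hermite_e import hermevander

rng = np.random.default_rng(2)

def pce_fit(model, degree, n_samples):
    """Non-intrusive polynomial chaos by point collocation: sample the
    standard-normal input, evaluate the model at those points, and
    least-squares fit the probabilists' Hermite coefficients."""
    xi = rng.standard_normal(n_samples)
    V = hermevander(xi, degree)              # columns He_0(xi) .. He_deg(xi)
    coef, *_ = np.linalg.lstsq(V, model(xi), rcond=None)
    return coef

model = lambda xi: np.exp(0.3 * xi)          # toy stochastic response
coef = pce_fit(model, degree=4, n_samples=200)

# moments fall out of the coefficients (orthogonal basis):
#   E[y] = c_0,   Var[y] = sum_{k>=1} k! * c_k^2
pce_mean = coef[0]
pce_var = sum(math.factorial(k) * coef[k] ** 2 for k in range(1, len(coef)))
print(pce_mean, pce_var)   # analytic: e^0.045 ≈ 1.046, ≈ 0.103
```

A multifidelity variant would fit one such expansion to the low-fidelity model and a second, cheaper one to the discrepancy against sparse high-fidelity evaluations.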

  13. Ruminations On NDA Measurement Uncertainty Compared TO DA Uncertainty

    International Nuclear Information System (INIS)

    Salaymeh, S.; Ashley, W.; Jeffcoat, R.

    2010-01-01

It is difficult to overestimate the importance that physical measurements performed with nondestructive assay instruments play throughout the nuclear fuel cycle. They underpin decision making in many areas and support: criticality safety, radiation protection, process control, safeguards, facility compliance, and waste measurements. No physical measurement is complete, or indeed meaningful, without a defensible and appropriate accompanying statement of uncertainties and how they combine to define the confidence in the results. The uncertainty budget should also be broken down in sufficient detail for the subsequent uses to which the nondestructive assay (NDA) results will be applied. Creating an uncertainty budget and estimating the total measurement uncertainty can often be an involved process, especially for non-routine situations. This is because data interpretation often involves complex algorithms and logic combined in a highly intertwined way. The methods often call on a multitude of input data subject to human oversight. These characteristics can be confusing and pose a barrier to developing an understanding between experts and data consumers. ASTM subcommittee C26-10 recognized this problem in the context of how to summarize and express precision and bias performance across the range of standards and guides it maintains. In order to create a unified approach consistent with modern practice and embracing the continuous improvement philosophy, a consensus arose to prepare a procedure covering the estimation and reporting of uncertainties in nondestructive assay of nuclear materials. This paper outlines the needs analysis, objectives and on-going development efforts. In addition to emphasizing some of the unique challenges and opportunities facing the NDA community, we hope this article will encourage dialog and sharing of best practice, and furthermore motivate developers to revisit the treatment of measurement uncertainty.
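An uncertainty budget of the kind discussed above is, at its simplest, a list of independent components combined in quadrature. The component names and magnitudes below are hypothetical, purely to show the arithmetic.

```python
import math

def combine_uncertainty(components):
    """Combine independent 1-sigma uncertainty components in quadrature
    (root-sum-of-squares), the usual rule for an uncertainty budget."""
    return math.sqrt(sum(u ** 2 for u in components))

# hypothetical budget for a single assay result (all in percent, 1-sigma)
budget = {
    "counting statistics": 2.0,
    "calibration": 1.5,
    "geometry/matrix": 3.0,
    "interference correction": 1.0,
}
total = combine_uncertainty(budget.values())
print(f"combined standard uncertainty: {total:.2f}%")
```

Listing the components separately, as in the dictionary above, is what lets a data consumer see which terms dominate and whether the budget suits their intended use.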

  14. RUMINATIONS ON NDA MEASUREMENT UNCERTAINTY COMPARED TO DA UNCERTAINTY

    Energy Technology Data Exchange (ETDEWEB)

    Salaymeh, S.; Ashley, W.; Jeffcoat, R.

    2010-06-17

It is difficult to overestimate the importance that physical measurements performed with nondestructive assay instruments play throughout the nuclear fuel cycle. They underpin decision making in many areas and support: criticality safety, radiation protection, process control, safeguards, facility compliance, and waste measurements. No physical measurement is complete, or indeed meaningful, without a defensible and appropriate accompanying statement of uncertainties and how they combine to define the confidence in the results. The uncertainty budget should also be broken down in sufficient detail for the subsequent uses to which the nondestructive assay (NDA) results will be applied. Creating an uncertainty budget and estimating the total measurement uncertainty can often be an involved process, especially for non-routine situations. This is because data interpretation often involves complex algorithms and logic combined in a highly intertwined way. The methods often call on a multitude of input data subject to human oversight. These characteristics can be confusing and pose a barrier to developing an understanding between experts and data consumers. ASTM subcommittee C26-10 recognized this problem in the context of how to summarize and express precision and bias performance across the range of standards and guides it maintains. In order to create a unified approach consistent with modern practice and embracing the continuous improvement philosophy, a consensus arose to prepare a procedure covering the estimation and reporting of uncertainties in nondestructive assay of nuclear materials. This paper outlines the needs analysis, objectives and on-going development efforts. In addition to emphasizing some of the unique challenges and opportunities facing the NDA community, we hope this article will encourage dialog and sharing of best practice, and furthermore motivate developers to revisit the treatment of measurement uncertainty.

  15. Understanding and reducing statistical uncertainties in nebular abundance determinations

    Science.gov (United States)

    Wesson, R.; Stock, D. J.; Scicluna, P.

    2012-06-01

Whenever observations are compared to theories, an estimate of the uncertainties associated with the observations is vital if the comparison is to be meaningful. However, many or even most determinations of temperatures, densities and abundances in photoionized nebulae do not quote the associated uncertainty. Those that do typically propagate the uncertainties using analytical techniques which rely on assumptions that generally do not hold. Motivated by this issue, we have developed the Nebular Empirical Analysis Tool (NEAT), a new code for calculating chemical abundances in photoionized nebulae. The code carries out a standard analysis of lists of emission lines using long-established techniques to estimate the amount of interstellar extinction, calculate representative temperatures and densities, compute ionic abundances from both collisionally excited lines and recombination lines, and finally to estimate total elemental abundances using an ionization correction scheme. NEAT uses a Monte Carlo technique to robustly propagate uncertainties from line flux measurements through to the derived abundances. We show that, for typical observational data, this approach is superior to analytic estimates of uncertainties. NEAT also accounts for the effect of upward biasing on measurements of lines with low signal-to-noise ratio, allowing us to accurately quantify the effect of this bias on abundance determinations. We find not only that the effect can result in significant overestimates of heavy element abundances derived from weak lines, but also that taking it into account reduces the uncertainty of these abundance determinations. Finally, we investigate the effect of possible uncertainties in R, the ratio of selective-to-total extinction, on abundance determinations. We find that the uncertainty due to this parameter is negligible compared to the statistical uncertainties due to typical line flux measurement uncertainties.
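The Monte Carlo propagation described above is easy to sketch: perturb each measured line flux by its error, push every realisation through the (possibly nonlinear) derivation, and read error bars off the resulting distribution. The two-line "diagnostic" below is a hypothetical stand-in for a real abundance calculation.

```python
import numpy as np

rng = np.random.default_rng(3)

def mc_propagate(fluxes, errors, derive, n=5000):
    """Monte Carlo uncertainty propagation: draw Gaussian realisations of
    the measured line fluxes, apply the derivation to each, and report the
    median with 16th/84th-percentile error bars."""
    draws = rng.normal(fluxes, errors, size=(n, len(fluxes)))
    results = np.array([derive(d) for d in draws])
    lo, med, hi = np.percentile(results, [16, 50, 84])
    return med, med - lo, hi - med

# hypothetical diagnostic: ratio of a weak line to a strong line
derive = lambda f: f[0] / f[1]
med, minus, plus = mc_propagate(np.array([5.0, 100.0]),
                                np.array([1.0, 3.0]), derive)
print(med, minus, plus)
```

Because the error bars come from percentiles of the simulated distribution rather than a linearised formula, they remain meaningful even when the derivation is strongly nonlinear or the result distribution is skewed, which is the regime where the analytic techniques criticised above break down.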

  16. Fundamental uncertainty and stock market volatility

    NARCIS (Netherlands)

    Arnold, I.J.M.; Vrugt, E.B.

    2008-01-01

    We provide empirical evidence on the link between stock market volatility and macroeconomic uncertainty. We show that US stock market volatility is significantly related to the dispersion in economic forecasts from participants in the Survey of Professional Forecasters over the period 1969 to 1996.

  17. Risk management for existing energy facilities. A global approach to numerical safety goals

    International Nuclear Information System (INIS)

    Pate-Cornell, M.E.

    1993-01-01

This paper presents a structured set of numerical safety goals for risk management of existing energy facilities. The rationale behind these safety goals is based on principles of equity and economic efficiency. Some of the issues involved when using probabilistic risk analysis results for safety decisions are discussed. A brief review of existing safety targets and "floating numbers" is presented, and a set of safety goals for industrial risk management is proposed. Relaxation of these standards for existing facilities, the relevance of the lifetime of the plant, the treatment of uncertainties, and problems of failure dependencies are discussed briefly. 17 refs., 1 fig

  18. Treatment of uncertainties in the existence of free berths with risk analysis techniques. Establishment of policies in port of Cadiz (SPAIN)

    Energy Technology Data Exchange (ETDEWEB)

    Awad Nuñez, S.; Camarero Orive, A.; Romero Sanchez-Brunete, M.; Camarero Orive, A.; Gonzalez Cancelas, N.

    2016-07-01

    This research discusses the challenges involved in the treatment of uncertainties in the existence of free berths during the arrival of cruise ships at seaports. Pursuing this goal, a three-step methodology is adopted: 1) Identifying risk sources and critical risk variables and how they are related; 2) Fitting the Probability Distribution Functions that best represent the behaviour of each critical risk variable; and 3) Simulating the probability of a ship having to wait because there are no free berths using a technique that combines statistical concepts (random sampling) with the ability of computers to generate pseudo-random numbers and automate estimations of the values of the set of critical risk variables. The innovative use of risk analysis techniques in this field allows the establishment of policies to improve the planning and management of port infrastructure, for example, deciding when it is necessary to work to increase the number of berths. As a case of study, we applied this methodology to study whether the enlargement of the wharf in the port of Cadiz (Spain) is necessary right now considering the number of cruise ships that have arrived at the port in the past three years, their date and hour of arrival, their length and draught, the duration of their stay in port and their waiting time before being able to enter the port. This action would require moving logistics activities to a new terminal, but would bring to the city the opportunity to rethink the seafront, introducing new cruiser links with the city centre and developing a better seaport-city integration. (Author)
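Step 3 of the methodology above, estimating the probability that an arriving ship finds no free berth by random sampling, can be sketched with toy distributions. The Poisson/uniform/exponential choices and all parameter values are illustrative assumptions, not the fitted distributions from the Cadiz study.

```python
import numpy as np

rng = np.random.default_rng(4)

def p_no_free_berth(n_berths, arrivals_per_day, stay_mean_h, n_days=2000):
    """Estimate the probability that an arriving cruise ship finds all
    berths occupied, by random sampling of arrivals and stay durations.
    Each day is simulated independently for simplicity."""
    waits = trials = 0
    for _ in range(n_days):
        n = int(rng.poisson(arrivals_per_day))
        arr = rng.uniform(0, 24, n)                   # arrival hour
        dep = arr + rng.exponential(stay_mean_h, n)   # departure hour
        for i in range(n):
            # ships already in port when ship i arrives
            occupied = np.sum((arr < arr[i]) & (dep > arr[i]))
            trials += 1
            waits += occupied >= n_berths
    return waits / max(trials, 1)

p = p_no_free_berth(n_berths=2, arrivals_per_day=3, stay_mean_h=10)
print(f"estimated P(no free berth on arrival) ≈ {p:.2f}")
```

Re-running the estimate with `n_berths + 1` quantifies how much an enlargement would reduce waiting, which is exactly the kind of comparison the policy decision rests on.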

  19. Damage assessment of composite plate structures with material and measurement uncertainty

    Science.gov (United States)

    Chandrashekhar, M.; Ganguli, Ranjan

    2016-06-01

Composite materials are very useful in structural engineering, particularly in weight-sensitive applications. Two different test models of the same structure made from composite materials can display very different dynamic behavior due to large uncertainties associated with composite material properties. Also, composite structures can suffer from pre-existing imperfections like delaminations, voids or cracks introduced during fabrication. In this paper, we show that modeling and material uncertainties in composite structures can cause considerable problems in damage assessment. A recently developed C0 shear deformable locking-free refined composite plate element is employed in the numerical simulations to alleviate modeling uncertainty. A qualitative estimate of the impact of modeling uncertainty on the damage detection problem is made. A robust Fuzzy Logic System (FLS) with a sliding window defuzzifier is used for delamination damage detection in composite plate type structures. The FLS is designed using variations in modal frequencies due to randomness in material properties. Probabilistic analysis is performed using Monte Carlo Simulation (MCS) on a composite plate finite element model. It is demonstrated that the FLS shows excellent robustness in delamination detection at very high levels of randomness in input data.

  20. Embracing uncertainty in applied ecology.

    Science.gov (United States)

    Milner-Gulland, E J; Shea, K

    2017-12-01

Applied ecologists often face uncertainty that hinders effective decision-making. Common traps that may catch the unwary are: ignoring uncertainty, acknowledging uncertainty but ploughing on, focussing on trivial uncertainties, believing your models, and unclear objectives. We integrate research insights and examples from a wide range of applied ecological fields to illustrate advances that are generally underused, but could facilitate ecologists' ability to plan and execute research to support management. Recommended approaches to avoid uncertainty traps are: embracing models, using decision theory, using models more effectively, thinking experimentally, and being realistic about uncertainty. Synthesis and applications. Applied ecologists can become more effective at informing management by using approaches that explicitly take account of uncertainty.

  1. Decision-Making under Criteria Uncertainty

    Science.gov (United States)

    Kureychik, V. M.; Safronenkova, I. B.

    2018-05-01

    Uncertainty is an essential part of any decision-making procedure. The paper deals with the problem of decision-making under criteria uncertainty. In this context, decision-making under uncertainty and the types and conditions of uncertainty are examined, and the decision-making problem under uncertainty is formalized. A modification of a mathematical decision support method under uncertainty via ontologies is proposed; a critical distinction of the developed method is its use of ontologies as base elements. The goal of this work is the development of a decision-making method under criteria uncertainty that uses ontologies in the area of multilayer board design, oriented towards improving the technical-economic characteristics of the examined domain.

  2. Physical Uncertainty Bounds (PUB)

    Energy Technology Data Exchange (ETDEWEB)

    Vaughan, Diane Elizabeth [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Preston, Dean L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-03-19

    This paper introduces and motivates the need for a new methodology for determining upper bounds on the uncertainties in simulations of engineered systems due to limited fidelity in the composite continuum-level physics models needed to simulate the systems. We show that traditional uncertainty quantification methods provide, at best, a lower bound on this uncertainty. We propose to obtain bounds on the simulation uncertainties by first determining bounds on the physical quantities or processes relevant to system performance. By bounding these physics processes, as opposed to carrying out statistical analyses of the parameter sets of specific physics models or simply switching out the available physics models, one can obtain upper bounds on the uncertainties in simulated quantities of interest.

  3. Sparse grid-based polynomial chaos expansion for aerodynamics of an airfoil with uncertainties

    Directory of Open Access Journals (Sweden)

    Xiaojing WU

    2018-05-01

    Full Text Available Uncertainties can generate fluctuations in aerodynamic characteristics. Uncertainty Quantification (UQ) is applied to compute their impact on the aerodynamic characteristics; in addition, the contribution of each uncertainty to the aerodynamic characteristics should be computed by uncertainty sensitivity analysis. Non-Intrusive Polynomial Chaos (NIPC) has been successfully applied to uncertainty quantification and uncertainty sensitivity analysis. However, the non-intrusive polynomial chaos method becomes inefficient as the number of random variables adopted to describe uncertainties increases. This deficiency becomes significant in stochastic aerodynamic analysis considering geometric uncertainty, because the description of geometric uncertainty generally needs many parameters. To address this deficiency, a Sparse Grid-based Polynomial Chaos (SGPC) expansion is used for uncertainty quantification and sensitivity analysis in stochastic aerodynamic analysis considering geometric and operational uncertainties. It is shown that the method is more efficient than the non-intrusive polynomial chaos and Monte Carlo Simulation (MCS) methods for stochastic aerodynamic analysis. The uncertainty quantification shows that the flow characteristics of shock waves and boundary layer separation are sensitive to the geometric uncertainty in the transonic region. The uncertainty sensitivity analysis reveals the individual and coupled effects among the uncertainty parameters. Keywords: Non-intrusive polynomial chaos, Sparse grid, Stochastic aerodynamic analysis, Uncertainty sensitivity analysis, Uncertainty quantification
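    As a hedged illustration of the polynomial chaos machinery this abstract builds on: plain non-intrusive spectral projection for a 1-D toy model f(ξ) = exp(ξ) with a standard normal input, with coefficients estimated by sampling. This is not the paper's sparse-grid variant, which exists precisely to make this projection cheap in many dimensions.

```python
import math
import random

def hermite_e(k, x):
    """Probabilists' Hermite polynomial He_k(x) via He_{n+1} = x He_n - n He_{n-1}."""
    if k == 0:
        return 1.0
    h_prev, h = 1.0, x
    for n in range(1, k):
        h_prev, h = h, x * h - n * h_prev
    return h

random.seed(0)
N, order = 100_000, 3
xi = [random.gauss(0.0, 1.0) for _ in range(N)]
f = [math.exp(x) for x in xi]

# c_k = E[f(xi) He_k(xi)] / k!  (orthogonality of He_k under N(0,1)).
coeffs = []
for k in range(order + 1):
    proj = sum(fi * hermite_e(k, x) for fi, x in zip(f, xi)) / N
    coeffs.append(proj / math.factorial(k))

pce_mean = coeffs[0]                  # analytic value: exp(0.5)
pce_var = sum(c * c * math.factorial(k) for k, c in enumerate(coeffs) if k > 0)
print(pce_mean, pce_var)
```

The mean and variance fall out of the coefficients directly, which is what makes chaos expansions attractive for sensitivity analysis once the coefficient computation is affordable.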

  4. Effect of minimal length uncertainty on the mass-radius relation of white dwarfs

    Science.gov (United States)

    Mathew, Arun; Nandy, Malay K.

    2018-06-01

    The generalized uncertainty relation, which carries the imprint of quantum gravity, introduces a minimal length scale into the description of space-time. It effectively changes the invariant measure of the phase space through a factor (1 + βp²)⁻³, so that the equation of state for an electron gas undergoes a significant modification from the ideal case. It has been shown in the literature (Rashidi 2016) that the ideal Chandrasekhar limit ceases to exist when the modified equation of state due to the generalized uncertainty is taken into account. To assess the situation in a more complete fashion, we analyze in detail the mass-radius relation of Newtonian white dwarfs whose hydrostatic equilibria are governed by the equation of state of the degenerate relativistic electron gas subjected to the generalized uncertainty principle. As the constraint of minimal length imposes a severe restriction on the availability of high momentum states, it is speculated that the central Fermi momentum cannot have values arbitrarily higher than p_max ∼ β^(−1/2). When this restriction is imposed, it is found that the system approaches limiting mass values higher than the Chandrasekhar mass upon decreasing the parameter β to a value given by a legitimate upper bound. Instead, when the more realistic restriction due to inverse β-decay is considered, it is found that the mass and radius approach the values 1.4518 M⊙ and 601.18 km near the legitimate upper bound for the parameter β.

  5. Another two dark energy models motivated from Karolyhazy uncertainty relation

    Energy Technology Data Exchange (ETDEWEB)

    Sun, Cheng-Yi; Yang, Wen-Li; Song, Yu. [Northwest University, Institute of Modern Physics, Xian (China); Yue, Rui-Hong [Ningbo University, Faculty of Science, Ningbo (China)

    2012-03-15

    The Karolyhazy uncertainty relation indicates that there exists a minimal detectable cell δt³ over the region t³ in Minkowski space-time. Due to the energy-time uncertainty relation, the energy of the cell δt³ cannot be less than δt⁻¹. We thus obtain a new energy density of the metric fluctuations of Minkowski space-time, δt⁻⁴. Motivated by this energy density, we propose two new dark-energy models: one characterized by the age of the universe and the other by the conformal age of the universe. We find that in both models the dark energy mimics a cosmological constant at late times. (orig.)

  6. An Approximation Solution to Refinery Crude Oil Scheduling Problem with Demand Uncertainty Using Joint Constrained Programming

    OpenAIRE

    Duan, Qianqian; Yang, Genke; Xu, Guanglin; Pan, Changchun

    2014-01-01

    This paper is devoted to developing an approximation method for scheduling refinery crude oil operations that takes demand uncertainty into consideration. In the stochastic model, the demand uncertainty is modeled as random variables which follow a joint multivariate distribution with a specific correlation structure. Compared to the deterministic models in existing works, the stochastic model can be more practical for optimizing crude oil operations. Using joint chance constraints, the demand unc...

  7. Uncertainties in the simulation of groundwater recharge at different scales

    Directory of Open Access Journals (Sweden)

    H. Bogena

    2005-01-01

    Full Text Available Digital spatial data always imply some kind of uncertainty. The source of this uncertainty lies in their compilation as well as in their conceptual design, which yields a more or less exact abstraction of the real world depending on the scale under consideration. Within the framework of hydrological modelling, in which numerous data sets from diverse sources of uneven quality are combined, the various uncertainties accumulate. In this study, the GROWA model is taken as an example to examine the effects of different types of uncertainties on the calculated groundwater recharge. Distributed input errors are determined for the parameters slope and aspect using a Monte Carlo approach. Land-cover classification uncertainties are analysed using the conditional probabilities of a remote sensing classification procedure, and the uncertainties of data ensembles at different scales and study areas are discussed. The present uncertainty analysis showed that the Gaussian error propagation method is a useful technique for analysing the influence of input data on the simulated groundwater recharge. The uncertainties involved in the land use classification procedure and the digital elevation model can be significant in some parts of the study area. However, for the specific model used in this study it was shown that the precipitation uncertainties have the greatest impact on the total groundwater recharge error.
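    The two techniques the study combines, Gaussian (first-order) error propagation and Monte Carlo simulation, can be sketched on a deliberately simplified water balance R = P − ET − Q (recharge = precipitation minus evapotranspiration minus direct runoff). All numbers are invented for illustration.

```python
import math
import random

P, sP = 800.0, 40.0      # precipitation (mm/yr) and its 1-sigma uncertainty
ET, sET = 500.0, 25.0    # evapotranspiration
Q, sQ = 150.0, 10.0      # direct runoff

# Gaussian error propagation: for a linear model the partial derivatives
# are +/-1, so the input variances simply add.
s_gauss = math.sqrt(sP**2 + sET**2 + sQ**2)

# Monte Carlo check with independent normal inputs.
random.seed(1)
rs = [random.gauss(P, sP) - random.gauss(ET, sET) - random.gauss(Q, sQ)
      for _ in range(50000)]
m = sum(rs) / len(rs)
s_mc = math.sqrt(sum((r - m)**2 for r in rs) / (len(rs) - 1))
print(s_gauss, s_mc)     # the two estimates agree for a linear model
```

For a linear balance the two methods coincide; the value of the Monte Carlo route is that it still works when the model is nonlinear or the input errors are spatially distributed, as in the study.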

  8. Uncertainty and sensitivity analysis for the simulation of a station blackout scenario in the Jules Horowitz Reactor

    International Nuclear Information System (INIS)

    Ghione, Alberto; Noel, Brigitte; Vinai, Paolo; Demazière, Christophe

    2017-01-01

    Highlights: • A station blackout scenario in the Jules Horowitz Reactor is analyzed using CATHARE. • Input and model uncertainties relevant to the transient are considered. • A statistical methodology for the propagation of the uncertainties is applied. • No safety criteria are exceeded and sufficiently large safety margins are estimated. • The most influential uncertainties are determined with a sensitivity analysis. - Abstract: An uncertainty and sensitivity analysis for the simulation of a station blackout scenario in the Jules Horowitz Reactor (JHR) is presented. The JHR is a new material testing reactor under construction at CEA on the Cadarache site, France. The thermal-hydraulic system code CATHARE is applied to investigate the response of the reactor system to the scenario. The uncertainty and sensitivity study was based on a statistical methodology for code uncertainty propagation, using the 'Uncertainty and Sensitivity' platform URANIE. Accordingly, the input uncertainties relevant to the transient were identified, quantified, and propagated to the code output. The results show that the safety criteria are not exceeded and that sufficiently large safety margins exist. In addition, the most influential input uncertainties on the safety parameters were found by means of a sensitivity analysis.

  9. Uncertainty Propagation in OMFIT

    Science.gov (United States)

    Smith, Sterling; Meneghini, Orso; Sung, Choongki

    2017-10-01

    A rigorous comparison of power balance fluxes and turbulent model fluxes requires the propagation of uncertainties in the kinetic profiles and their derivatives. Making extensive use of the python uncertainties package, the OMFIT framework has been used to propagate covariant uncertainties to provide an uncertainty in the power balance calculation from the ONETWO code, as well as through the turbulent fluxes calculated by the TGLF code. The covariant uncertainties arise from fitting 1D (constant on flux surface) density and temperature profiles and associated random errors with parameterized functions such as a modified tanh. The power balance and model fluxes can then be compared with quantification of the uncertainties. No effort is made at propagating systematic errors. A case study will be shown for the effects of resonant magnetic perturbations on the kinetic profiles and fluxes at the top of the pedestal. A separate attempt at modeling the random errors with Monte Carlo sampling will be compared to the method of propagating the fitting function parameter covariant uncertainties. Work supported by US DOE under DE-FC02-04ER54698, DE-FG2-95ER-54309, DE-SC 0012656.
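    The covariant propagation idea can be sketched by hand for a toy linear fit y = a·x + b with correlated parameters. The covariance values below are invented, and the python `uncertainties` package the abstract mentions is not used here; this shows only the underlying J·Cov·Jᵀ arithmetic.

```python
import math

a, b = 2.0, 1.0
cov = [[0.04, -0.01],     # var(a), cov(a, b)
       [-0.01, 0.09]]     # cov(b, a), var(b)

def sigma_y(x):
    """1-sigma uncertainty of y = a*x + b via J . Cov . J^T with J = [x, 1]."""
    jac = [x, 1.0]
    var = sum(jac[i] * cov[i][j] * jac[j]
              for i in range(2) for j in range(2))
    return math.sqrt(var)

# Negative correlation between a and b reduces the combined uncertainty
# relative to treating the fit parameters as independent.
indep = math.sqrt(0.04 * 1.0**2 + 0.09)
print(sigma_y(1.0), indep)
```

Ignoring the off-diagonal terms, as a naive fit-then-propagate workflow would, overstates the profile uncertainty here; with positively correlated parameters it would understate it instead.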

  10. Quantification of uncertainty in flood risk assessment for flood protection planning: a Bayesian approach

    Science.gov (United States)

    Dittes, Beatrice; Špačková, Olga; Ebrahimian, Negin; Kaiser, Maria; Rieger, Wolfgang; Disse, Markus; Straub, Daniel

    2017-04-01

    Flood risk estimates are subject to significant uncertainties, e.g. due to limited records of historic flood events, uncertainty in flood modeling, the uncertain impact of climate change, or uncertainty in the exposure and loss estimates. In the traditional design of flood protection systems, these uncertainties are typically accounted for only implicitly, based on engineering judgment. In the AdaptRisk project, we develop a fully quantitative framework for the planning of flood protection systems under current and future uncertainties using quantitative pre-posterior Bayesian decision analysis. In this contribution, we focus on the quantification of the uncertainties and study their relative influence on the flood risk estimate and on the planning of flood protection systems. The following uncertainty components are included using a Bayesian approach: 1) inherent and statistical (i.e. limited record length) uncertainty; 2) climate uncertainty that can be learned from an ensemble of GCM-RCM models; 3) estimates of climate uncertainty components not covered in 2), such as bias correction, incomplete ensembles, and local specifics not captured by the GCM-RCM models; 4) uncertainty in the inundation modelling; 5) uncertainty in damage estimation. We also investigate how these uncertainties may be reduced in the future when new evidence - such as new climate models, observed extreme events, and socio-economic data - becomes available. Finally, we look into how this new evidence influences the risk assessment and the effectiveness of flood protection systems. We demonstrate our methodology for a pre-alpine catchment in southern Germany: the Mangfall catchment in Bavaria, which includes the city of Rosenheim and suffered significant losses during the 2013 flood event.
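    One ingredient of such an analysis, the statistical uncertainty from a limited flood record (component 1 above), can be sketched with the standard conjugate Gamma prior for a Poisson rate of threshold exceedances. The prior and the record below are invented; the AdaptRisk framework itself is far richer, adding climate ensembles, inundation and damage models.

```python
prior_alpha, prior_beta = 2.0, 20.0   # prior belief: roughly 0.1 exceedances/yr
k_obs, n_years = 4, 30                # observed: 4 exceedances in 30 years

# Conjugate Gamma-Poisson update of the annual exceedance rate.
post_alpha = prior_alpha + k_obs
post_beta = prior_beta + n_years
post_mean = post_alpha / post_beta            # posterior mean rate (1/yr)
post_sd = post_alpha**0.5 / post_beta         # sd of Gamma(a, b) is sqrt(a)/b

# A longer record (larger n_years) shrinks post_sd: the reduction of
# statistical uncertainty with new evidence discussed in the abstract.
print(post_mean, post_sd)
```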

  11. On the connection between complementarity and uncertainty principles in the Mach–Zehnder interferometric setting

    International Nuclear Information System (INIS)

    Bosyk, G M; Portesi, M; Holik, F; Plastino, A

    2013-01-01

    We revisit the connection between the complementarity and uncertainty principles of quantum mechanics within the framework of Mach–Zehnder interferometry. We focus our attention on the trade-off relation between complementary path information and fringe visibility. This relation is equivalent to the uncertainty relation of Schrödinger and Robertson for a suitably chosen pair of observables. We show that it is equivalent as well to the uncertainty inequality provided by Landau and Pollak. We also study the relationship of this trade-off relation with a family of entropic uncertainty relations based on Rényi entropies. There is no equivalence in this case, but the different values of the entropic parameter define regimes that provide us with a tool to discriminate between non-trivial states of minimum uncertainty. The existence of such regimes agrees with previous results of Luis (2011 Phys. Rev. A 84 034101), although their meaning was not sufficiently clear. We discuss the origin of these regimes with the intention of gaining a deeper understanding of entropic measures. (paper)

  12. Methodologies of Uncertainty Propagation Calculation

    International Nuclear Information System (INIS)

    Chojnacki, Eric

    2002-01-01

    After recalling the theoretical principles and practical difficulties of uncertainty propagation methodologies, the author discussed how to propagate input uncertainties. He said there were two kinds of input uncertainty: variability, i.e. uncertainty due to heterogeneity, and lack of knowledge, i.e. uncertainty due to ignorance. It was therefore necessary to use two different propagation methods. He demonstrated this in a simple example which he then generalised, treating the variability uncertainty with probability theory and the lack-of-knowledge uncertainty with fuzzy theory. He cautioned, however, against the systematic use of probability theory, which may lead to unjustifiably precise answers. Mr Chojnacki's conclusions were that the importance of distinguishing variability from lack of knowledge increases as the problem becomes more complex in terms of the number of parameters or time steps, and that it is necessary to develop uncertainty propagation methodologies combining probability theory and fuzzy theory
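    The two-track propagation advocated here can be sketched on a toy model z = x + y, treating x probabilistically (variability) and y as a triangular fuzzy number handled by alpha-cuts (lack of knowledge). All numbers are illustrative.

```python
import random

random.seed(7)
xs = [random.gauss(10.0, 1.0) for _ in range(10000)]   # variability in x

def alpha_cut(lo, peak, hi, alpha):
    """Interval of a triangular fuzzy number (lo, peak, hi) at level alpha."""
    return (lo + alpha * (peak - lo), hi - alpha * (hi - peak))

# Combine: take a probabilistic summary of x (here its 95th percentile)
# and propagate the fuzzy y through z = x + y by interval arithmetic,
# one interval per membership level.
xs.sort()
x95 = xs[int(0.95 * len(xs))]
for alpha in (0.0, 0.5, 1.0):
    y_lo, y_hi = alpha_cut(4.0, 5.0, 7.0, alpha)
    print(f"alpha={alpha}: z95 in [{x95 + y_lo:.2f}, {x95 + y_hi:.2f}]")
```

The result is not a single precise number but a nested family of intervals, which is exactly the caution against "unjustifiably precise answers": the ignorance in y is never converted into a probability distribution it does not deserve.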

  13. Stock market volatility and macroeconomic uncertainty

    NARCIS (Netherlands)

    Arnold, I.J.M.; Vrugt, E.B.

    2006-01-01

    This paper provides empirical evidence on the link between stock market volatility and macroeconomic uncertainty. We show that US stock market volatility is significantly related to the dispersion in economic forecasts from SPF survey participants over the period from 1969 to 1996. This link is much

  14. Quantifying uncertainties in the structural response of SSME blades

    Science.gov (United States)

    Nagpal, Vinod K.

    1987-01-01

    To quantify the uncertainties associated with the geometry and material properties of a Space Shuttle Main Engine (SSME) turbopump blade, a computer code known as STAEBL was used. A finite element model of the blade used 80 triangular shell elements with 55 nodes and five degrees of freedom per node. The whole study was simulated on the computer; no physical experiments were conducted. The structural response was evaluated in terms of three variables: natural frequencies, root (maximum) stress, and blade tip displacements. The results of the study indicate that only the geometric uncertainties have significant effects on the response; uncertainties in material properties have insignificant effects.

  15. Quantifying measurement uncertainty and spatial variability in the context of model evaluation

    Science.gov (United States)

    Choukulkar, A.; Brewer, A.; Pichugina, Y. L.; Bonin, T.; Banta, R. M.; Sandberg, S.; Weickmann, A. M.; Djalalova, I.; McCaffrey, K.; Bianco, L.; Wilczak, J. M.; Newman, J. F.; Draxl, C.; Lundquist, J. K.; Wharton, S.; Olson, J.; Kenyon, J.; Marquis, M.

    2017-12-01

    In an effort to improve wind forecasts for the wind energy sector, the Department of Energy and the NOAA funded the second Wind Forecast Improvement Project (WFIP2). As part of the WFIP2 field campaign, a large suite of in-situ and remote sensing instrumentation was deployed to the Columbia River Gorge in Oregon and Washington from October 2015 - March 2017. The array of instrumentation deployed included 915-MHz wind profiling radars, sodars, wind-profiling lidars, and scanning lidars. The role of these instruments was to provide wind measurements at high spatial and temporal resolution for model evaluation and improvement of model physics. To properly determine model errors, the uncertainties in instrument-model comparisons need to be quantified accurately. These uncertainties arise from several factors such as measurement uncertainty, spatial variability, and interpolation of model output to instrument locations, to name a few. In this presentation, we will introduce a formalism to quantify measurement uncertainty and spatial variability. The accuracy of this formalism will be tested using existing datasets such as the eXperimental Planetary boundary layer Instrumentation Assessment (XPIA) campaign. Finally, the uncertainties in wind measurement and the spatial variability estimates from the WFIP2 field campaign will be discussed to understand the challenges involved in model evaluation.

  16. The state of the art of the impact of sampling uncertainty on measurement uncertainty

    Science.gov (United States)

    Leite, V. J.; Oliveira, E. C.

    2018-03-01

    The measurement uncertainty is a parameter that characterizes reliability and can be divided into two large groups: sampling and analytical variations. Analytical uncertainty arises from a controlled process performed in the laboratory. The same does not occur with sampling uncertainty which, because it faces several obstacles and there is no clarity on how to perform the procedures, has been neglected, although it is admittedly indispensable to the measurement process. This paper describes the state of the art of sampling uncertainty and assesses its relevance to measurement uncertainty.

  17. Uncertainty and sensitivity analysis in nuclear accident consequence assessment

    International Nuclear Information System (INIS)

    Karlberg, Olof.

    1989-01-01

    This report contains the results of a four-year project carried out under research contracts with the Nordic Cooperation in Nuclear Safety and the National Institute for Radiation Protection. An uncertainty/sensitivity analysis methodology consisting of Latin Hypercube sampling and regression analysis was applied to an accident consequence model. A number of input parameters were selected, and the uncertainties related to these parameters were estimated by a Nordic group of experts. Individual doses, collective dose, health effects and their related uncertainties were then calculated for three release scenarios and for a representative sample of meteorological situations. Two of the scenarios simulated the acute phase after an accident, and one the long-term consequences. The most significant parameters were identified. The outer limits of the calculated uncertainty distributions are large and grow to several orders of magnitude for the low-probability consequences. The uncertainty in the expectation values is typically a factor of 2-5 (1 sigma). The variation in the model responses due to the variation of the weather parameters is roughly equal to the variation induced by parameter uncertainty. The most important parameters turned out to be different for each pathway of exposure, as could be expected. Overall, however, the most important parameters are the wet deposition coefficient and the shielding factors. A general discussion of the usefulness of uncertainty analysis in consequence analysis is also given. (au)
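    The sampling half of the methodology, Latin Hypercube sampling, can be sketched in a few lines (the regression step is omitted). Each of the n strata of [0, 1) receives exactly one sample per dimension, which is what gives LHS better space-filling than plain random sampling for the same budget of model runs.

```python
import random

def latin_hypercube(n, dims, rng):
    """n samples in [0, 1)^dims, one per stratum in every dimension."""
    samples = [[0.0] * dims for _ in range(n)]
    for d in range(dims):
        strata = list(range(n))
        rng.shuffle(strata)                  # random pairing of strata
        for i in range(n):
            samples[i][d] = (strata[i] + rng.random()) / n
    return samples

rng = random.Random(3)
design = latin_hypercube(10, 2, rng)
print(design[0])
```

Each row of `design` would then be mapped through the inverse CDFs of the expert-elicited input distributions before running the consequence model.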

  18. Uncertainty and Sensitivity Analysis of Afterbody Radiative Heating Predictions for Earth Entry

    Science.gov (United States)

    West, Thomas K., IV; Johnston, Christopher O.; Hosder, Serhat

    2016-01-01

    The objective of this work was to perform sensitivity analysis and uncertainty quantification for afterbody radiative heating predictions of the Stardust capsule during Earth entry at peak afterbody radiation conditions. The radiation environment in the afterbody region poses significant challenges for accurate uncertainty quantification and sensitivity analysis due to the complexity of the flow physics, the computational cost, and the large number of uncertain variables. In this study, a sparse-collocation non-intrusive polynomial chaos approach along with global non-linear sensitivity analysis was first used to identify the most significant uncertain variables and reduce the dimensions of the stochastic problem. Then, a total-order stochastic expansion was constructed over only the important parameters for an efficient and accurate estimate of the uncertainty in radiation. Based on previous work, 388 uncertain parameters were considered in the radiation model, coming from the thermodynamics, flow field chemistry, and radiation modeling. The sensitivity analysis showed that only four of these variables contributed significantly to the afterbody radiation uncertainty, accounting for almost 95% of it. These were the electronic-impact excitation rate for N between level 2 and level 5 and the rates of three chemical reactions influencing the N, N(+), O, and O(+) number densities in the flow field.

  19. Type-2 fuzzy elliptic membership functions for modeling uncertainty

    DEFF Research Database (Denmark)

    Kayacan, Erdal; Sarabakha, Andriy; Coupland, Simon

    2018-01-01

    Whereas type-1 and type-2 membership functions (MFs) are the core of any fuzzy logic system, there are no performance criteria available to evaluate the goodness or correctness of the fuzzy MFs. In this paper, we make extensive analysis in terms of the capability of type-2 elliptic fuzzy MFs...... in modeling uncertainty. Having decoupled parameters for its support and width, elliptic MFs are unique amongst existing type-2 fuzzy MFs. In this investigation, the uncertainty distribution along the elliptic MF support is studied, and a detailed analysis is given to compare and contrast its performance...... advantages mentioned above, elliptic MFs have comparable prediction results when compared to Gaussian and triangular MFs. Finally, in order to test the performance of fuzzy logic controller with elliptic interval type-2 MFs, extensive real-time experiments are conducted for the 3D trajectory tracking problem...

  20. Return Predictability, Model Uncertainty, and Robust Investment

    DEFF Research Database (Denmark)

    Lukas, Manuel

    Stock return predictability is subject to great uncertainty. In this paper we use the model confidence set approach to quantify uncertainty about expected utility from investment, accounting for potential return predictability. For monthly US data and six representative return prediction models, we...... find that confidence sets are very wide, change significantly with the predictor variables, and frequently include expected utilities for which the investor prefers not to invest. The latter motivates a robust investment strategy maximizing the minimal element of the confidence set. The robust investor...... allocates a much lower share of wealth to stocks compared to a standard investor....

  1. Summary from the epistemic uncertainty workshop: consensus amid diversity

    International Nuclear Information System (INIS)

    Ferson, Scott; Joslyn, Cliff A.; Helton, Jon C.; Oberkampf, William L.; Sentz, Kari

    2004-01-01

    The 'Epistemic Uncertainty Workshop' sponsored by Sandia National Laboratories was held in Albuquerque, New Mexico, on 6-7 August 2002. The workshop was organized around a set of Challenge Problems involving both epistemic and aleatory uncertainty that the workshop participants were invited to solve and discuss. This concluding article in a special issue of Reliability Engineering and System Safety based on the workshop discusses the intent of the Challenge Problems, summarizes some discussions from the workshop, and provides a technical comparison among the papers in this special issue. The Challenge Problems were computationally simple models that were intended as vehicles for the illustration and comparison of conceptual and numerical techniques for use in analyses that involve: (i) epistemic uncertainty, (ii) aggregation of multiple characterizations of epistemic uncertainty, (iii) combination of epistemic and aleatory uncertainty, and (iv) models with repeated parameters. There was considerable diversity of opinion at the workshop about both methods and fundamental issues, and yet substantial consensus about what the answers to the problems were, and even about how each of the four issues should be addressed. Among the technical approaches advanced were probability theory, Dempster-Shafer evidence theory, random sets, sets of probability measures, imprecise coherent probabilities, coherent lower previsions, probability boxes, possibility theory, fuzzy sets, joint distribution tableaux, polynomial chaos expansions, and info-gap models. Although some participants maintained that a purely probabilistic approach is fully capable of accounting for all forms of uncertainty, most agreed that the treatment of epistemic uncertainty introduces important considerations and that the issues underlying the Challenge Problems are legitimate and significant. Topics identified as meriting additional research include elicitation of uncertainty representations, aggregation of

  2. Analysis of Uncertainty in Dynamic Processes Development of Banks Functioning

    Directory of Open Access Journals (Sweden)

    Aleksei V. Korovyakovskii

    2013-01-01

    Full Text Available The paper offers an approach to estimating the measure of uncertainty in the dynamic processes of bank functioning, using statistical data on indicators of different banking operations. To calculate the measure of uncertainty in these dynamic processes, the phase images of the relevant sets of statistical data are considered. It is shown that the form of the phase image of the studied sets of statistical data can serve as a basis for estimating the measure of uncertainty in the dynamic processes of bank functioning. A set of analytical characteristics is offered to formalize the definition of the form of the phase image of the studied sets of statistical data. It is shown that the offered analytical characteristics capture the unevenness of changes in the values of the studied sets of statistical data, which is one of the ways uncertainty manifests itself in the development of dynamic processes. Invariant estimates of the measure of uncertainty in the dynamic processes of bank functioning were obtained, accounting for significant differences in the absolute values of the same indicators across different banks. Examples of calculating the measure of uncertainty in the dynamic processes of the functioning of specific banks are given.

  3. Thermal-Hydraulic Analysis for SBLOCA in OPR1000 and Evaluation of Uncertainty for PSA

    International Nuclear Information System (INIS)

    Kim, Tae Jin; Park, Goon Cherl

    2012-01-01

    Probabilistic Safety Assessment (PSA) is a mathematical tool to evaluate numerical estimates of risk for nuclear power plants (NPPs). However, PSA has problems of quality and reliability, since the quantification of uncertainties from thermal-hydraulic (TH) analysis has not been included in the quantification of the overall uncertainties in PSA. Previous research proved that quantifying the uncertainties from best-estimate LBLOCA analysis can improve PSA quality by modifying the core damage frequency (CDF) from the existing PSA report. Based on a similar concept, this study considers the quantification of SBLOCA analysis results. In this study, however, operator error parameters are also included in addition to the phenomenological parameters considered in the LBLOCA analysis

  4. Quantifying structural uncertainty on fault networks using a marked point process within a Bayesian framework

    Science.gov (United States)

    Aydin, Orhun; Caers, Jef Karel

    2017-08-01

    Faults are one of the building blocks of subsurface modeling studies. Incomplete observations of subsurface fault networks lead to uncertainty pertaining to the location, geometry and existence of faults. In practice, gaps in incomplete fault network observations are filled based on tectonic knowledge and the interpreter's intuition about fault relationships. Modeling fault network uncertainty with realistic models that represent tectonic knowledge is still a challenge. Although methods exist that address specific sources of fault network uncertainty and complexities of fault modeling, a unifying framework is still lacking. In this paper, we propose a rigorous approach to quantify fault network uncertainty. Fault pattern and intensity information are expressed by means of a marked point process, the marked Strauss point process. Fault network information is constrained to fault surface observations (complete or partial) within a Bayesian framework. A structural prior model is defined to quantitatively express fault patterns, geometries and relationships within the Bayesian framework. Structural relationships between faults, in particular fault abutting relations, are represented with a level-set based approach. A Markov Chain Monte Carlo sampler is used to sample posterior fault network realizations that reflect tectonic knowledge and honor fault observations. We apply the methodology to a field study from the Nankai Trough & Kumano Basin. The target for uncertainty quantification is a deep site with attenuated seismic data, with only partially visible faults and many faults missing from the survey or interpretation. A structural prior model is built from shallow analog sites that are believed to have undergone tectonics similar to those of the site of study. Fault network uncertainty for the field is quantified with fault network realizations that are conditioned to structural rules, tectonic information and partially observed fault surfaces. We show the proposed

  5. Incorporating forecast uncertainties into EENS for wind turbine studies

    Energy Technology Data Exchange (ETDEWEB)

    Toh, G.K.; Gooi, H.B. [School of EEE, Nanyang Technological University, Singapore 639798 (Singapore)

    2011-02-15

    The rapid increase in wind power generation around the world has stimulated the development of applicable technologies to model the uncertainties of wind power resulting from the stochastic nature of wind and fluctuations of demand for integration of wind turbine generators (WTGs). In this paper the load and wind power forecast errors are integrated into the expected energy not served (EENS) formulation through determination of probabilities using the normal distribution approach. The effects of forecast errors and wind energy penetration in the power system are examined. The impact of wind energy penetration on system reliability, total cost for energy and reserve procurement is then studied for a conventional power system. The results show a degradation of system reliability with significant wind energy penetration in the generation system. This work provides a useful insight into system reliability and economics for the independent system operator (ISO) to deploy energy/reserve providers when WTGs are integrated into the existing power system. (author)
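
A minimal sketch of the core idea of folding a normally distributed net-load forecast error into an expected-energy-not-served term. The function name and parameters below are hypothetical; the paper's full formulation (unit commitment, reserve deployment) is not reproduced.

```python
import math

def normal_cdf(x, mu, sigma):
    """Cumulative distribution function of N(mu, sigma^2)."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def eens_mwh(capacity_mw, load_forecast_mw, sigma_mw, hours=1.0,
             grid=2001, span=6.0):
    """Expected energy not served for one period: the net-load forecast
    error is modelled as N(0, sigma^2) and the shortfall
    max(0, load - capacity) is integrated numerically over the error
    distribution (midpoint rule over +/- span standard deviations)."""
    lo, hi = -span * sigma_mw, span * sigma_mw
    step = (hi - lo) / (grid - 1)
    eens = 0.0
    for k in range(grid - 1):
        e0, e1 = lo + k * step, lo + (k + 1) * step
        p = normal_cdf(e1, 0.0, sigma_mw) - normal_cdf(e0, 0.0, sigma_mw)
        load = load_forecast_mw + 0.5 * (e0 + e1)  # midpoint of error bin
        eens += p * max(0.0, load - capacity_mw) * hours
    return eens
```

With ample capacity the EENS contribution vanishes; when capacity equals the forecast load it reduces to the expected positive part of the error, sigma/sqrt(2*pi).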

  6. Calculation of the detection limit in radiation measurements with systematic uncertainties

    International Nuclear Information System (INIS)

    Kirkpatrick, J.M.; Russ, W.; Venkataraman, R.; Young, B.M.

    2015-01-01

    The detection limit (L D ) or Minimum Detectable Activity (MDA) is an a priori evaluation of assay sensitivity intended to quantify the suitability of an instrument or measurement arrangement for the needs of a given application. Traditional approaches as pioneered by Currie rely on Gaussian approximations to yield simple, closed-form solutions, and neglect the effects of systematic uncertainties in the instrument calibration. These approximations are applicable over a wide range of applications, but are of limited use in low-count applications, when high confidence values are required, or when systematic uncertainties are significant. One proposed modification to the Currie formulation attempts to account for systematic uncertainties within a Gaussian framework. We have previously shown that this approach results in an approximation formula that works best only for small values of the relative systematic uncertainty, for which the modification of Currie's method is the least necessary, and that it significantly overestimates the detection limit or gives infinite or otherwise non-physical results for larger systematic uncertainties where such a correction would be the most useful. We have developed an alternative approach for calculating detection limits based on realistic statistical modeling of the counting distributions that accurately represents statistical and systematic uncertainties. Instead of a closed form solution, numerical and iterative methods are used to evaluate the result. Accurate detection limits can be obtained by this method for the general case
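
For context, the traditional Currie-style limits that the abstract says break down under systematic uncertainty can be sketched as follows (Gaussian approximation, paired blank with well-known mean background, coverage factor z = 1.645 for 95% confidence). The authors' numerical-iterative method is not shown here.

```python
import math

def currie_limits(background_counts, z=1.645):
    """Classic Currie detection limits for a paired-blank counting
    measurement with well-known mean background B (in counts), under a
    Gaussian approximation and with no systematic calibration uncertainty.
    Returns (critical level L_C, detection limit L_D) in net counts."""
    b = background_counts
    l_c = z * math.sqrt(2.0 * b)   # decision threshold for "detected"
    l_d = z * z + 2.0 * l_c        # ~ 2.71 + 4.65*sqrt(B) for z = 1.645
    return l_c, l_d
```

Dividing L_D by counting time, efficiency and yield converts net counts into an MDA in activity units; it is exactly those calibration factors whose systematic uncertainties this formula neglects.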

  7. Uncertainties in hydrogen combustion

    International Nuclear Information System (INIS)

    Stamps, D.W.; Wong, C.C.; Nelson, L.S.

    1988-01-01

    Three important areas of hydrogen combustion with uncertainties are identified: high-temperature combustion, flame acceleration and deflagration-to-detonation transition, and aerosol resuspension during hydrogen combustion. The uncertainties associated with high-temperature combustion may affect at least three different accident scenarios: the in-cavity oxidation of combustible gases produced by core-concrete interactions, the direct containment heating hydrogen problem, and the possibility of local detonations. How these uncertainties may affect the sequence of various accident scenarios is discussed and recommendations are made to reduce these uncertainties. 40 references

  8. Uncertainty in artificial intelligence

    CERN Document Server

    Kanal, LN

    1986-01-01

    How to deal with uncertainty is a subject of much controversy in Artificial Intelligence. This volume brings together a wide range of perspectives on uncertainty, many of the contributors being the principal proponents in the controversy. Some of the notable issues which emerge from these papers revolve around an interval-based calculus of uncertainty, the Dempster-Shafer Theory, and probability as the best numeric model for uncertainty. There remain strong dissenting opinions not only about probability but even about the utility of any numeric method in this context.
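
One of the calculi debated in the volume, Dempster-Shafer theory, combines evidence from two sources with Dempster's rule of combination. A minimal sketch follows; the dict-of-frozensets representation is an implementation choice for illustration, not something taken from the volume.

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination for two basic probability assignments.
    Focal elements are frozensets over a common frame of discernment; mass
    falling on conflicting (empty) intersections is renormalised away."""
    combined = {}
    conflict = 0.0
    for a, wa in m1.items():
        for b, wb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}
```

The renormalisation by 1/(1 - conflict) is precisely the step critics of the rule (several of them represented in this volume) have questioned when conflict is high.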

  9. Uncertainty analysis in raw material and utility cost of biorefinery synthesis and design

    DEFF Research Database (Denmark)

    Cheali, Peam; Quaglia, Alberto; Gernaey, Krist

    2014-01-01

    are characterized by considerable uncertainty. These uncertainties might have significant impact on the results of the design problem, and therefore need to be carefully evaluated and managed, in order to generate candidates for robust design. In this contribution, we study the effect of data uncertainty (raw...... material price and utility cost) on the design of a biorefinery process network....

  10. Estimation of sampling error uncertainties in observed surface air temperature change in China

    Science.gov (United States)

    Hua, Wei; Shen, Samuel S. P.; Weithmann, Alexander; Wang, Huijun

    2017-08-01

    This study examines the sampling error uncertainties in the monthly surface air temperature (SAT) change in China over recent decades, focusing on the uncertainties of gridded data, national averages, and linear trends. Results indicate that large sampling error variances appear at the station-sparse area of northern and western China with the maximum value exceeding 2.0 K² while small sampling error variances are found at the station-dense area of southern and eastern China with most grid values being less than 0.05 K². In general, negative temperature anomalies existed in each month prior to the 1980s; warming began thereafter and accelerated in the early and mid-1990s. The increasing trend in the SAT series was observed for each month of the year with the largest temperature increase and highest uncertainty of 0.51 ± 0.29 K (10 year)⁻¹ occurring in February and the weakest trend and smallest uncertainty of 0.13 ± 0.07 K (10 year)⁻¹ in August. The sampling error uncertainties in the national average annual mean SAT series are not sufficiently large to alter the conclusion of the persistent warming in China. In addition, the sampling error uncertainties in the SAT series show a clear variation compared with other uncertainty estimation methods, which is a plausible reason for the inconsistent variations between our estimate and other studies during this period.
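
A trend figure like "0.51 ± 0.29 K (10 year)⁻¹" pairs an OLS slope with its standard error. A generic sketch of that calculation (not the authors' sampling-error-variance method, which additionally accounts for station coverage):

```python
import math

def linear_trend_per_decade(years, values):
    """Ordinary least-squares slope and its standard error for an annual
    series, reported per decade (e.g. '0.51 +/- 0.29 K (10 year)^-1')."""
    n = len(years)
    xbar = sum(years) / n
    ybar = sum(values) / n
    sxx = sum((x - xbar) ** 2 for x in years)
    sxy = sum((x - xbar) * (y - ybar) for x, y in zip(years, values))
    slope = sxy / sxx
    intercept = ybar - slope * xbar
    resid_ss = sum((y - (intercept + slope * x)) ** 2
                   for x, y in zip(years, values))
    stderr = math.sqrt(resid_ss / (n - 2) / sxx)  # SE of the slope
    return 10.0 * slope, 10.0 * stderr            # convert per year -> per decade
```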

  11. Uncertainty in prostate cancer. Ethnic and family patterns.

    Science.gov (United States)

    Germino, B B; Mishel, M H; Belyea, M; Harris, L; Ware, A; Mohler, J

    1998-01-01

    Prostate cancer occurs 37% more often in African-American men than in white men. Patients and their family care providers (FCPs) may have different experiences of cancer and its treatment. This report addresses two questions: 1) What is the relationship of uncertainty to family coping, psychological adjustment to illness, and spiritual factors? and 2) Are these patterns of relationship similar for patients and their family care givers and for whites and African-Americans? A sample of white and African-American men and their family care givers (N = 403) was drawn from an ongoing study, testing the efficacy of an uncertainty management intervention with men with stage B prostate cancer. Data were collected at study entry, either 1 week after post-surgical catheter removal or at the beginning of primary radiation treatment. Measures of uncertainty, adult role behavior, problem solving, social support, importance of God in one's life, family coping, psychological adjustment to illness, and perceptions of health and illness met standard criteria for internal consistency. Analyses of baseline data using Pearson's product moment correlations were conducted to examine the relationships of person, disease, and contextual factors to uncertainty. For family coping, uncertainty was significantly and positively related to two domains in white family care providers only. In African-American and white family care providers, the more uncertainty experienced, the less positive they felt about treatment. Uncertainty for all care givers was related inversely to positive feelings about the patient recovering from the illness. For all patients and for white family members, uncertainty was related inversely to the quality of the domestic environment. For everyone, uncertainty was related inversely to psychological distress. Higher levels of uncertainty were related to a poorer social environment for African-American patients and for white family members. For white patients and their

  12. Representing uncertainty in objective functions: extension to include the influence of serial correlation

    Science.gov (United States)

    Croke, B. F.

    2008-12-01

    The role of performance indicators is to give an accurate indication of the fit between a model and the system being modelled. As all measurements have an associated uncertainty (determining the significance that should be given to the measurement), performance indicators should take into account uncertainties in the observed quantities being modelled as well as in the model predictions (due to uncertainties in inputs, model parameters and model structure). In the presence of significant uncertainty in observed and modelled output of a system, failure to adequately account for variations in the uncertainties means that the objective function only gives a measure of how well the model fits the observations, not how well the model fits the system being modelled. Since in most cases, the interest lies in fitting the system response, it is vital that the objective function(s) be designed to account for these uncertainties. Most objective functions (e.g. those based on the sum of squared residuals) assume homoscedastic uncertainties. If model contribution to the variations in residuals can be ignored, then transformations (e.g. Box-Cox) can be used to remove (or at least significantly reduce) heteroscedasticity. An alternative which is more generally applicable is to explicitly represent the uncertainties in the observed and modelled values in the objective function. Previous work on this topic addressed the modifications to standard objective functions (Nash-Sutcliffe efficiency, RMSE, chi-squared, coefficient of determination) using the optimal weighted averaging approach. This paper extends this previous work; addressing the issue of serial correlation. A form for an objective function that includes serial correlation will be presented, and the impact on model fit discussed.
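
The idea of explicitly representing both uncertainties in the objective function can be sketched, under the simplifying assumptions of independent Gaussian errors and no serial correlation (the extension this abstract addresses), as a variance-weighted sum of squares; names are illustrative.

```python
def weighted_sse(obs, mod, var_obs, var_mod):
    """Sum-of-squares objective in which each residual is weighted by the
    combined observation and model-prediction variance, so heteroscedastic
    uncertainty in both series is represented explicitly. Serial
    correlation between residuals is ignored in this sketch."""
    return sum((o - m) ** 2 / (vo + vm)
               for o, m, vo, vm in zip(obs, mod, var_obs, var_mod))
```

With constant variances this reduces to an ordinary (scaled) sum of squared residuals; the serial-correlation extension replaces the per-point weights with a full residual covariance.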

  13. A structured analysis of uncertainty surrounding modeled impacts of groundwater-extraction rules

    Science.gov (United States)

    Guillaume, Joseph H. A.; Qureshi, M. Ejaz; Jakeman, Anthony J.

    2012-08-01

    Integrating economic and groundwater models for groundwater-management can help improve understanding of trade-offs involved between conflicting socioeconomic and biophysical objectives. However, there is significant uncertainty in most strategic decision-making situations, including in the models constructed to represent them. If not addressed, this uncertainty may be used to challenge the legitimacy of the models and decisions made using them. In this context, a preliminary uncertainty analysis was conducted of a dynamic coupled economic-groundwater model aimed at assessing groundwater extraction rules. The analysis demonstrates how a variety of uncertainties in such a model can be addressed. A number of methods are used including propagation of scenarios and bounds on parameters, multiple models, block bootstrap time-series sampling and robust linear regression for model calibration. These methods are described within the context of a theoretical uncertainty management framework, using a set of fundamental uncertainty management tasks and an uncertainty typology.
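
Among the methods listed, block bootstrap time-series sampling can be sketched as follows (moving-block variant; the block length and function names are illustrative assumptions).

```python
import random

def moving_block_bootstrap(series, block_len, n_samples, seed=0):
    """Moving-block bootstrap: resample overlapping blocks of consecutive
    observations so that short-range serial dependence in the original
    series survives the resampling."""
    rng = random.Random(seed)
    n = len(series)
    starts = n - block_len + 1           # admissible block start positions
    out = []
    for _ in range(n_samples):
        sample = []
        while len(sample) < n:
            s = rng.randrange(starts)
            sample.extend(series[s:s + block_len])
        out.append(sample[:n])           # trim the last partial block
    return out
```

Each bootstrap replicate can then be pushed through the coupled economic-groundwater model to obtain an empirical spread of outcomes.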

  14. Expanding Uncertainty Principle to Certainty-Uncertainty Principles with Neutrosophy and Quad-stage Method

    Directory of Open Access Journals (Sweden)

    Fu Yuhua

    2015-03-01

    Full Text Available The most famous contribution of Heisenberg is the uncertainty principle. But the original uncertainty principle is improper. Considering all the possible situations (including the case that people can create laws) and applying Neutrosophy and the Quad-stage Method, this paper presents "certainty-uncertainty principles" with a general form and a variable dimension fractal form. According to the classification of Neutrosophy, "certainty-uncertainty principles" can be divided into three principles in different conditions: the "certainty principle", namely a particle’s position and momentum can be known simultaneously; the "uncertainty principle", namely a particle’s position and momentum cannot be known simultaneously; and the neutral (fuzzy) "indeterminacy principle", namely whether or not a particle’s position and momentum can be known simultaneously is undetermined. The special cases of "certainty-uncertainty principles" include the original uncertainty principle and the Ozawa inequality. In addition, in accordance with the original uncertainty principle, discussing a high-speed particle’s speed and track with Newton mechanics is unreasonable; but according to "certainty-uncertainty principles", Newton mechanics can be used to discuss the problem of gravitational deflection of a photon orbit around the Sun (it gives the same result of deflection angle as given by general relativity). Finally, for the reason that in physics the principles, laws and the like that are regardless of the principle (law) of conservation of energy may be invalid, "certainty-uncertainty principles" should be restricted (or constrained) by the principle (law) of conservation of energy, and thus they can satisfy the principle (law) of conservation of energy.
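
For reference, the two special cases the abstract names can be written in their standard forms (assuming, as in Ozawa's work, that ε denotes measurement error and η the disturbance caused by the measurement):

```latex
% Robertson's form of the original (preparation) uncertainty principle:
% standard deviations of position and momentum in a given state
\sigma(q)\,\sigma(p) \;\ge\; \frac{\hbar}{2}

% Ozawa's inequality for a measurement with error eps(q) on position
% and disturbance eta(p) on momentum
\varepsilon(q)\,\eta(p) \;+\; \varepsilon(q)\,\sigma(p) \;+\; \sigma(q)\,\eta(p)
  \;\ge\; \frac{\hbar}{2}
```

Ozawa's relation permits ε(q)η(p) < ħ/2 in some measurements, which is why the abstract treats it as a distinct special case rather than a restatement of the original principle.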

  15. A stochastic optimization model under modeling uncertainty and parameter certainty for groundwater remediation design-Part I. Model development

    Energy Technology Data Exchange (ETDEWEB)

    He, L., E-mail: li.he@ryerson.ca [Department of Civil Engineering, Faculty of Engineering, Architecture and Science, Ryerson University, 350 Victoria Street, Toronto, Ontario, M5B 2K3 (Canada); Huang, G.H. [Environmental Systems Engineering Program, Faculty of Engineering, University of Regina, Regina, Saskatchewan, S4S 0A2 (Canada); College of Urban Environmental Sciences, Peking University, Beijing 100871 (China); Lu, H.W. [Environmental Systems Engineering Program, Faculty of Engineering, University of Regina, Regina, Saskatchewan, S4S 0A2 (Canada)

    2010-04-15

    Solving groundwater remediation optimization problems based on proxy simulators can usually yield optimal solutions differing from the 'true' ones of the problem. This study presents a new stochastic optimization model under modeling uncertainty and parameter certainty (SOMUM) and the associated solution method for simultaneously addressing modeling uncertainty associated with simulator residuals and optimizing groundwater remediation processes. This is a new attempt different from the previous modeling efforts. The previous ones focused on addressing uncertainty in physical parameters (i.e. soil porosity) while this one aims to deal with uncertainty in mathematical simulator (arising from model residuals). Compared to the existing modeling approaches (i.e. only parameter uncertainty is considered), the model has the advantages of providing mean-variance analysis for contaminant concentrations, mitigating the effects of modeling uncertainties on optimal remediation strategies, offering confidence level of optimal remediation strategies to system designers, and reducing computational cost in optimization processes.

  16. Using finite mixture models in thermal-hydraulics system code uncertainty analysis

    Energy Technology Data Exchange (ETDEWEB)

    Carlos, S., E-mail: scarlos@iqn.upv.es [Department d’Enginyeria Química i Nuclear, Universitat Politècnica de València, Camí de Vera s.n, 46022 València (Spain); Sánchez, A. [Department d’Estadística Aplicada i Qualitat, Universitat Politècnica de València, Camí de Vera s.n, 46022 València (Spain); Ginestar, D. [Department de Matemàtica Aplicada, Universitat Politècnica de València, Camí de Vera s.n, 46022 València (Spain); Martorell, S. [Department d’Enginyeria Química i Nuclear, Universitat Politècnica de València, Camí de Vera s.n, 46022 València (Spain)

    2013-09-15

    Highlights: • Best estimate codes simulation needs uncertainty quantification. • The output variables can present multimodal probability distributions. • The analysis of multimodal distribution is performed using finite mixture models. • Two methods to reconstruct output variable probability distribution are used. -- Abstract: Nuclear Power Plant safety analysis is mainly based on the use of best estimate (BE) codes that predict the plant behavior under normal or accidental conditions. As the BE codes introduce uncertainties due to uncertainty in input parameters and modeling, it is necessary to perform uncertainty assessment (UA), and eventually sensitivity analysis (SA), of the results obtained. These analyses are part of the appropriate treatment of uncertainties imposed by current regulation based on the adoption of the best estimate plus uncertainty (BEPU) approach. The most popular approach for uncertainty assessment, based on Wilks’ method, obtains a tolerance/confidence interval, but it does not completely characterize the output variable behavior, which is required for an extended UA and SA. However, the development of standard UA and SA imposes a high computational cost due to the large number of simulations needed. In order to obtain more information about the output variable and, at the same time, to keep computational cost as low as possible, there has been a recent shift toward developing metamodels (models of models), or surrogate models, that approximate or emulate complex computer codes. In this way, there exist different techniques to reconstruct the probability distribution using the information provided by a sample of values, such as finite mixture models. In this paper, the Expectation Maximization and the k-means algorithms are used to obtain a finite mixture model that reconstructs the output variable probability distribution from data obtained with RELAP-5 simulations. Both methodologies have been applied to a separated
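
For context, Wilks' method only fixes a sample size (for a one-sided 95%/95% tolerance bound, 59 runs, since 1 − 0.95⁵⁹ ≥ 0.95), whereas reconstructing the full output distribution needs something like the finite mixture models discussed here. Below is a stdlib-only Expectation-Maximization sketch for a two-component 1-D Gaussian mixture; it is not the paper's RELAP-5 workflow, and the min/max initialisation is a crude assumption.

```python
import math

def _pdf(x, mu, var):
    """Gaussian density N(mu, var) evaluated at x."""
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def em_gmm_1d(data, n_iter=100):
    """EM for a two-component 1-D Gaussian mixture, as one way to
    reconstruct a bimodal output-variable distribution from code runs.
    Returns (weights, means, variances)."""
    mu = [min(data), max(data)]                  # crude initialisation
    mean = sum(data) / len(data)
    v0 = sum((x - mean) ** 2 for x in data) / len(data)
    var, w = [v0, v0], [0.5, 0.5]
    for _ in range(n_iter):
        # E-step: responsibility of each component for each point
        resp = []
        for x in data:
            p = [w[k] * _pdf(x, mu[k], var[k]) for k in range(2)]
            s = p[0] + p[1]
            resp.append([p[0] / s, p[1] / s])
        # M-step: re-estimate weights, means, variances
        for k in range(2):
            nk = sum(r[k] for r in resp)
            w[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = max(sum(r[k] * (x - mu[k]) ** 2
                             for r, x in zip(resp, data)) / nk, 1e-9)
    return w, mu, var
```

In practice k-means is often used, as in the paper, to initialise the component means before running EM.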

  17. Influence of measurement uncertainty on classification of thermal environment in buildings according to European Standard EN 15251

    DEFF Research Database (Denmark)

    Kolarik, Jakub; Olesen, Bjarne W.

    2015-01-01

    European Standard EN 15251 in its current version does not provide any guidance on how to handle uncertainty of long term measurements of indoor environmental parameters used for classification of buildings. The objective of the study was to analyse the uncertainty for field measurements...... measurements of operative temperature at two measuring points (south/south-west and north/northeast orientation). Results of the present study suggest that measurement uncertainty needs to be considered during assessment of thermal environment in existing buildings. When expanded standard uncertainty was taken...... into account in categorization of thermal environment according to EN 15251, the difference in prevalence of exceeded category limits were up to 17.3%, 8.3% and 2% of occupied hours for category I, II and III respectively....
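
The effect of an expanded measurement uncertainty on category classification can be sketched as follows. The category limits and U value used in the example are illustrative assumptions, not values taken from EN 15251.

```python
def exceeded_fraction(temps, lower, upper, u_expanded):
    """Fraction of occupied hours outside a comfort category, without and
    with the expanded uncertainty U = k * u_c applied: under a worst-case
    reading, a measurement within U of a limit can no longer be classified
    inside the category with confidence, so the limits shrink by U."""
    n = len(temps)
    plain = sum(1 for t in temps if t < lower or t > upper)
    with_u = sum(1 for t in temps
                 if t < lower + u_expanded or t > upper - u_expanded)
    return plain / n, with_u / n
```

Comparing the two fractions over a season of hourly data gives exactly the kind of prevalence difference (e.g. the 17.3% reported for category I) the abstract describes.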

  18. Assessing scenario and parametric uncertainties in risk analysis: a model uncertainty audit

    International Nuclear Information System (INIS)

    Tarantola, S.; Saltelli, A.; Draper, D.

    1999-01-01

    In the present study a process of model audit is addressed on a computational model used for predicting maximum radiological doses to humans in the field of nuclear waste disposal. Global uncertainty and sensitivity analyses are employed to assess output uncertainty and to quantify the contribution of parametric and scenario uncertainties to the model output. These tools are of fundamental importance for risk analysis and decision making purposes

  19. Uncertainty Analysis of Resistance Tests in Ata Nutku Ship Model Testing Laboratory of Istanbul Technical University

    Directory of Open Access Journals (Sweden)

    Cihad DELEN

    2015-12-01

    Full Text Available In this study, some systematic resistance tests, which were performed in the Ata Nutku Ship Model Testing Laboratory of Istanbul Technical University (ITU), have been included in order to determine the uncertainties. Experiments conducted in the framework of mathematical and physical rules for the solution of engineering problems, together with the associated measurements and calculations, include uncertainty. To question the reliability of the obtained values, the existing uncertainties should be expressed as quantities. If the uncertainty of a measurement system is not known, its results do not carry a universal value. On the other hand, resistance is one of the most important parameters that should be considered in the process of ship design. Ship resistance during the design phase cannot be determined precisely and reliably due to the uncertainty sources that must be taken into account in determining the resistance value. This may make it harder to meet the required specifications in later design steps. The uncertainty arising from the resistance test has been estimated and compared for a displacement-type ship and high-speed marine vehicles according to the ITTC 2002 and ITTC 2014 regulations, which are related to uncertainty analysis methods. Also, the advantages and disadvantages of both ITTC uncertainty analysis methods have been discussed.
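
In outline, the older ITTC-style combination of elemental bias limits with a precision (repeatability) limit is a root-sum-square. This is a generic sketch of that combination, not the specific ITTC 2002 or 2014 procedures, which differ in how the elemental terms are defined and estimated.

```python
import math

def combined_uncertainty(bias_limits, precision_limit):
    """Root-sum-square combination for a test result: elemental bias
    limits are combined into a total bias limit B, then combined with the
    precision limit P as U = sqrt(B^2 + P^2)."""
    b_total = math.sqrt(sum(b * b for b in bias_limits))
    return math.sqrt(b_total ** 2 + precision_limit ** 2)
```

For a resistance test the elemental terms would cover, e.g., model geometry, speed, temperature/density, and dynamometer calibration, each propagated to the total resistance coefficient before being combined.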

  20. Bayesian models for comparative analysis integrating phylogenetic uncertainty

    Directory of Open Access Journals (Sweden)

    Villemereuil Pierre de

    2012-06-01

    Full Text Available Abstract Background Uncertainty in comparative analyses can come from at least two sources: a) phylogenetic uncertainty in the tree topology or branch lengths, and b) uncertainty due to intraspecific variation in trait values, either due to measurement error or natural individual variation. Most phylogenetic comparative methods do not account for such uncertainties. Not accounting for these sources of uncertainty leads to false perceptions of precision (confidence intervals will be too narrow) and inflated significance in hypothesis testing (e.g. p-values will be too small). Although there is some application-specific software for fitting Bayesian models accounting for phylogenetic error, more general and flexible software is desirable. Methods We developed models to directly incorporate phylogenetic uncertainty into a range of analyses that biologists commonly perform, using a Bayesian framework and Markov Chain Monte Carlo analyses. Results We demonstrate applications in linear regression, quantification of phylogenetic signal, and measurement error models. Phylogenetic uncertainty was incorporated by applying a prior distribution for the phylogeny, where this distribution consisted of the posterior tree sets from Bayesian phylogenetic tree estimation programs. The models were analysed using simulated data sets, and applied to a real data set on plant traits, from rainforest plant species in Northern Australia. Analyses were performed using the free and open source software OpenBUGS and JAGS. Conclusions Incorporating phylogenetic uncertainty through an empirical prior distribution of trees leads to more precise estimation of regression model parameters than using a single consensus tree and enables a more realistic estimation of confidence intervals. In addition, models incorporating measurement errors and/or individual variation, in one or both variables, are easily formulated in the Bayesian framework. We show that BUGS is a useful, flexible

  1. Bayesian models for comparative analysis integrating phylogenetic uncertainty

    Science.gov (United States)

    2012-01-01

    Background Uncertainty in comparative analyses can come from at least two sources: a) phylogenetic uncertainty in the tree topology or branch lengths, and b) uncertainty due to intraspecific variation in trait values, either due to measurement error or natural individual variation. Most phylogenetic comparative methods do not account for such uncertainties. Not accounting for these sources of uncertainty leads to false perceptions of precision (confidence intervals will be too narrow) and inflated significance in hypothesis testing (e.g. p-values will be too small). Although there is some application-specific software for fitting Bayesian models accounting for phylogenetic error, more general and flexible software is desirable. Methods We developed models to directly incorporate phylogenetic uncertainty into a range of analyses that biologists commonly perform, using a Bayesian framework and Markov Chain Monte Carlo analyses. Results We demonstrate applications in linear regression, quantification of phylogenetic signal, and measurement error models. Phylogenetic uncertainty was incorporated by applying a prior distribution for the phylogeny, where this distribution consisted of the posterior tree sets from Bayesian phylogenetic tree estimation programs. The models were analysed using simulated data sets, and applied to a real data set on plant traits, from rainforest plant species in Northern Australia. Analyses were performed using the free and open source software OpenBUGS and JAGS. Conclusions Incorporating phylogenetic uncertainty through an empirical prior distribution of trees leads to more precise estimation of regression model parameters than using a single consensus tree and enables a more realistic estimation of confidence intervals. In addition, models incorporating measurement errors and/or individual variation, in one or both variables, are easily formulated in the Bayesian framework. We show that BUGS is a useful, flexible general purpose tool for

  2. Real-Time Optimal Flood Control Decision Making and Risk Propagation Under Multiple Uncertainties

    Science.gov (United States)

    Zhu, Feilin; Zhong, Ping-An; Sun, Yimeng; Yeh, William W.-G.

    2017-12-01

    Multiple uncertainties exist in the optimal flood control decision-making process, introducing risk into flood control decisions. This paper defines the main steps in optimal flood control decision making that constitute the Forecast-Optimization-Decision Making (FODM) chain. We propose a framework for supporting optimal flood control decision making under multiple uncertainties and evaluate risk propagation along the FODM chain from a holistic perspective. To deal with uncertainties, we employ stochastic models at each link of the FODM chain. We generate synthetic ensemble flood forecasts via the martingale model of forecast evolution. We then establish a multiobjective stochastic programming with recourse model for optimal flood control operation. The Pareto front under uncertainty is derived via the constraint method coupled with a two-step process. We propose a novel SMAA-TOPSIS model for stochastic multicriteria decision making. Then we propose the risk assessment model, the risk of decision-making errors and rank uncertainty degree to quantify the risk propagation process along the FODM chain. We conduct numerical experiments to investigate the effects of flood forecast uncertainty on optimal flood control decision making and risk propagation. We apply the proposed methodology to a flood control system in the Daduhe River basin in China. The results indicate that the proposed method can provide valuable risk information in each link of the FODM chain and enable risk-informed decisions with higher reliability.
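
The martingale model of forecast evolution mentioned above can be caricatured in an additive form: each forecast revision is a zero-mean innovation, so the expected next forecast equals the current one. Below is a simplified generator of synthetic ensemble forecast paths; the function name, the backward construction from the realised value, and the constant update variance are all assumptions for illustration, not the paper's formulation.

```python
import random

def synthetic_forecast_paths(final_value, n_leads, sigma_update,
                             n_paths, seed=0):
    """Synthetic ensemble forecasts under a simplified additive martingale
    model of forecast evolution: starting from the realised value and
    working backwards, each earlier forecast differs by an independent
    zero-mean Gaussian update. Index 0 of a path is the longest lead."""
    rng = random.Random(seed)
    paths = []
    for _ in range(n_paths):
        path = [final_value]
        for _ in range(n_leads - 1):
            path.append(path[-1] + rng.gauss(0.0, sigma_update))
        path.reverse()                   # longest lead first
        paths.append(path)
    return paths
```

Ensembles generated this way are unbiased at every lead and sharpen as the event approaches, which is the property exploited when propagating forecast uncertainty through the optimization and decision steps.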

  3. Uncertainty Communication. Issues and good practice

    International Nuclear Information System (INIS)

    Kloprogge, P.; Van der Sluijs, J.; Wardekker, A.

    2007-12-01

    In 2003 the Netherlands Environmental Assessment Agency (MNP) published the RIVM/MNP Guidance for Uncertainty Assessment and Communication. The Guidance assists in dealing with uncertainty in environmental assessments. Dealing with uncertainty is essential because assessment results regarding complex environmental issues are of limited value if the uncertainties have not been taken into account adequately. A careful analysis of uncertainties in an environmental assessment is required, but even more important is the effective communication of these uncertainties in the presentation of assessment results. The Guidance yields rich and differentiated insights in uncertainty, but the relevance of this uncertainty information may vary across audiences and uses of assessment results. Therefore, the reporting of uncertainties is one of the six key issues that is addressed in the Guidance. In practice, users of the Guidance felt a need for more practical assistance in the reporting of uncertainty information. This report explores the issue of uncertainty communication in more detail, and contains more detailed guidance on the communication of uncertainty. In order to make this a 'stand alone' document several questions that are mentioned in the detailed Guidance have been repeated here. This document thus has some overlap with the detailed Guidance. Part 1 gives a general introduction to the issue of communicating uncertainty information. It offers guidelines for (fine)tuning the communication to the intended audiences and context of a report, discusses how readers of a report tend to handle uncertainty information, and ends with a list of criteria that uncertainty communication needs to meet to increase its effectiveness. Part 2 helps writers to analyze the context in which communication takes place, and helps to map the audiences, and their information needs. It further helps to reflect upon anticipated uses and possible impacts of the uncertainty information on the

  4. Statistical Uncertainty Quantification of Physical Models during Reflood of LBLOCA

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Deog Yeon; Seul, Kwang Won; Woo, Sweng Woong [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2015-05-15

    The use of best-estimate (BE) computer codes in safety analysis for loss-of-coolant accidents (LOCA) is the major trend in many countries to reduce significant conservatism. A key feature of this BE evaluation requires the licensee to quantify the uncertainty of the calculations. It is therefore very important how to determine the uncertainty distributions before conducting the uncertainty evaluation. Uncertainty includes that of physical models and correlations, plant operational parameters, and so forth. The quantification process is often performed mainly by subjective expert judgment or obtained from reference documents of the computer code. In this respect, more mathematical methods are needed to reasonably determine the uncertainty ranges. The first uncertainty quantification is performed with various increments for two influential uncertainty parameters to get the calculated responses and their derivatives. Different data sets with two influential uncertainty parameters for the FEBA tests are chosen by applying stricter criteria for selecting responses and their derivatives, which may be considered as the user’s effect in the CIRCÉ applications. Finally, three influential uncertainty parameters are considered to study the effect of the number of uncertainty parameters, given the limitations of the CIRCÉ method. With the determined uncertainty ranges, uncertainty evaluations for the FEBA tests are performed to check whether the experimental responses such as the cladding temperature or pressure drop are inside the limits of the calculated uncertainty bounds. A confirmation step will be performed to evaluate the quality of the information in the case of the different reflooding PERICLES experiments. The uncertainty ranges of physical models in the MARS-KS thermal-hydraulic code during reflooding were quantified by the CIRCÉ method using FEBA experiment tests, instead of expert judgment. Also, through the uncertainty evaluation for the FEBA and PERICLES tests, it was confirmed

  5. Uncertainty in Simulating Wheat Yields Under Climate Change

    Energy Technology Data Exchange (ETDEWEB)

    Asseng, S.; Ewert, F.; Rosenzweig, C.; Jones, J.W.; Hatfield, Jerry; Ruane, Alex; Boote, K. J.; Thorburn, Peter; Rotter, R.P.; Cammarano, D.; Brisson, N.; Basso, B.; Martre, P.; Aggarwal, P.K.; Angulo, C.; Bertuzzi, P.; Biernath, C.; Challinor, AJ; Doltra, J.; Gayler, S.; Goldberg, R.; Grant, Robert; Heng, L.; Hooker, J.; Hunt, L.A.; Ingwersen, J.; Izaurralde, Roberto C.; Kersebaum, K.C.; Mueller, C.; Naresh Kumar, S.; Nendel, C.; O' Leary, G.O.; Olesen, JE; Osborne, T.; Palosuo, T.; Priesack, E.; Ripoche, D.; Semenov, M.A.; Shcherbak, I.; Steduto, P.; Stockle, Claudio O.; Stratonovitch, P.; Streck, T.; Supit, I.; Tao, F.; Travasso, M.; Waha, K.; Wallach, D.; White, J.W.; Williams, J.R.; Wolf, J.

    2013-09-01

    Anticipating the impacts of climate change on crop yields is critical for assessing future food security. Process-based crop simulation models are the most commonly used tools in such assessments1,2. Analysis of uncertainties in future greenhouse gas emissions and their impacts on future climate change has been increasingly described in the literature3,4, while assessments of the uncertainty in crop responses to climate change are very rare. Systematic and objective comparisons across impact studies are difficult, and thus have not been fully realized5. Here we present the largest coordinated and standardized crop model intercomparison for climate change impacts on wheat production to date. We found that several individual crop models are able to reproduce measured grain yields under current diverse environments, particularly if sufficient details are provided to execute them. However, simulated climate change impacts can vary across models due to differences in model structures and algorithms. The crop-model component of uncertainty in climate change impact assessments was considerably larger than the climate-model component from Global Climate Models (GCMs). Model responses to high temperatures and temperature-by-CO2 interactions are identified as major sources of simulated impact uncertainties. Significant reductions in impact uncertainties through model improvements in these areas and improved quantification of uncertainty through multi-model ensembles are urgently needed for a more reliable translation of climate change scenarios into agricultural impacts in order to develop adaptation strategies and aid policymaking.

  6. Reliability analysis under epistemic uncertainty

    International Nuclear Information System (INIS)

    Nannapaneni, Saideep; Mahadevan, Sankaran

    2016-01-01

    This paper proposes a probabilistic framework to include both aleatory and epistemic uncertainty within model-based reliability estimation of engineering systems for individual limit states. Epistemic uncertainty is considered due to both data and model sources. Sparse point and/or interval data regarding the input random variables leads to uncertainty regarding their distribution types, distribution parameters, and correlations; this statistical uncertainty is included in the reliability analysis through a combination of likelihood-based representation, Bayesian hypothesis testing, and Bayesian model averaging techniques. Model errors, which include numerical solution errors and model form errors, are quantified through Gaussian process models and included in the reliability analysis. The probability integral transform is used to develop an auxiliary variable approach that facilitates a single-level representation of both aleatory and epistemic uncertainty. This strategy results in an efficient single-loop implementation of Monte Carlo simulation (MCS) and FORM/SORM techniques for reliability estimation under both aleatory and epistemic uncertainty. Two engineering examples are used to demonstrate the proposed methodology. - Highlights: • Epistemic uncertainty due to data and model included in reliability analysis. • A novel FORM-based approach proposed to include aleatory and epistemic uncertainty. • A single-loop Monte Carlo approach proposed to include both types of uncertainties. • Two engineering examples used for illustration.
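    The auxiliary-variable idea above can be sketched in a few lines: an epistemic quantity (here, an interval-valued distribution mean, a made-up example rather than the paper's formulation) is mapped through a uniform auxiliary variable, so one Monte Carlo loop covers both uncertainty types:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200_000

# Epistemic uncertainty: the mean of the load X is only known to lie
# in [9, 11] (interval data). An auxiliary uniform variable maps this
# interval into the same sampling loop as the aleatory variability.
u_aux = rng.uniform(size=N)      # auxiliary variable for the mean
mu = 9.0 + 2.0 * u_aux           # probability integral transform
x = rng.normal(mu, 1.0)          # aleatory scatter about each sampled mean

# Hypothetical limit state: failure if the load exceeds the capacity.
capacity = 14.0
pf = np.mean(x > capacity)       # single-loop failure-probability estimate
```

The point of the single-loop structure is that no nested "epistemic outer loop / aleatory inner loop" is needed; both are sampled together.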

  7. Simplified propagation of standard uncertainties

    International Nuclear Information System (INIS)

    Shull, A.H.

    1997-01-01

    An essential part of any measurement control program is adequate knowledge of the uncertainties of the measurement system standards. Only with an estimate of the standards' uncertainties can one determine if the standard is adequate for its intended use or calculate the total uncertainty of the measurement process. Purchased standards usually have estimates of uncertainty on their certificates. However, when standards are prepared and characterized by a laboratory, variance propagation is required to estimate the uncertainty of the standard. Traditional variance propagation typically involves tedious use of partial derivatives, unfriendly software and the availability of statistical expertise. As a result, the uncertainty of prepared standards is often not determined or is determined incorrectly. For situations meeting stated assumptions, easier shortcut methods of estimation are now available which eliminate the need for partial derivatives and require only a spreadsheet or calculator. A system that simplifies the calculations by dividing uncertainties into subgroups of absolute and relative uncertainties is utilized. These methods also incorporate the International Standards Organization (ISO) concepts for combining systematic and random uncertainties as published in their Guide to the Expression of Measurement Uncertainty. Details of the simplified methods and examples of their use are included in the paper.
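    For the "relative" subgroup mentioned above, the shortcut reduces to a root-sum-square of relative standard uncertainties, valid for products and quotients of independent quantities. The component percentages below are invented for illustration:

```python
import math

def combine_relative(rel_uncertainties):
    """Root-sum-square combination of relative standard uncertainties,
    appropriate for a result formed as a product/quotient of
    independent quantities (the 'relative' subgroup)."""
    return math.sqrt(sum(r * r for r in rel_uncertainties))

# Example: a standard prepared as mass * concentration / volume, with
# 0.1% balance, 0.2% stock-solution, and 0.05% volumetric uncertainties.
rel_u = combine_relative([0.001, 0.002, 0.0005])
# sqrt(1e-6 + 4e-6 + 2.5e-7) = sqrt(5.25e-6), roughly 0.23%
```

The analogous "absolute" subgroup (sums and differences) combines absolute standard uncertainties in quadrature instead.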

  8. Communicating diagnostic uncertainty in surgical pathology reports: disparities between sender and receiver.

    Science.gov (United States)

    Lindley, Sarah W; Gillies, Elizabeth M; Hassell, Lewis A

    2014-10-01

    Surgical pathologists use a variety of phrases to communicate varying degrees of diagnostic certainty, which have the potential to be interpreted differently than intended. This study sought to: (1) assess the setting, varieties and frequency of use of phrases of diagnostic uncertainty in the diagnostic line of surgical pathology reports, (2) evaluate use of uncertainty expressions by experience and gender, (3) determine how these phrases are interpreted by clinicians and pathologists, and (4) assess solutions to this communication problem. We evaluated 1500 surgical pathology reports to determine frequency of use of uncertainty terms, identified those most commonly used, and looked for variations in usage rates on the basis of case type, experience and gender. We surveyed 76 physicians at tumor boards who were asked to assign a percentage of certainty to diagnoses containing expressions of uncertainty. We found expressions of uncertainty in 35% of diagnostic reports, with no statistically significant difference in usage based on age or gender. We found wide variation in the percentage of certainty clinicians assigned to the phrases studied. We conclude that non-standardized language used in the communication of diagnostic uncertainty is a significant source of miscommunication, both amongst pathologists and between pathologists and clinicians. Copyright © 2014 The Authors. Published by Elsevier GmbH. All rights reserved.

  9. Evaluating prediction uncertainty

    International Nuclear Information System (INIS)

    McKay, M.D.

    1995-03-01

    The probability distribution of a model prediction is presented as a proper basis for evaluating the uncertainty in a model prediction that arises from uncertainty in input values. Determination of important model inputs and subsets of inputs is made through comparison of the prediction distribution with conditional prediction probability distributions. Replicated Latin hypercube sampling and variance ratios are used in estimation of the distributions and in construction of importance indicators. The assumption of a linear relation between model output and inputs is not necessary for the indicators to be effective. A sequential methodology, which includes an independent validation step, is applied in two analysis applications to select subsets of input variables which are the dominant causes of uncertainty in the model predictions. Comparison with results from methods which assume linearity shows how those methods may fail. Finally, suggestions for treating structural uncertainty for submodels are presented.
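    A minimal sketch of the sampling-and-variance-ratio idea (not McKay's exact replicated-LHS estimator): a basic Latin hypercube sample is drawn and a binned correlation-ratio importance indicator V(E[Y|Xi])/V(Y) is computed for a made-up two-input model. All names and the test model are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

def lhs(n, d):
    """Basic Latin hypercube sample of n points in d dimensions on [0,1)."""
    u = np.column_stack([rng.permutation(n) for _ in range(d)])
    return (u + rng.uniform(size=(n, d))) / n

def variance_ratio(x, y, bins=20):
    """Crude estimate of V(E[Y|X]) / V(Y) by binning X -- an importance
    indicator in the spirit of the variance-ratio approach above."""
    edges = np.quantile(x, np.linspace(0.0, 1.0, bins + 1))
    idx = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, bins - 1)
    cond_means = np.array([y[idx == b].mean() for b in range(bins)])
    counts = np.array([(idx == b).sum() for b in range(bins)])
    v_cond = np.average((cond_means - y.mean()) ** 2, weights=counts)
    return v_cond / y.var()

# Hypothetical nonlinear model: y depends strongly on x1, weakly on x2.
X = lhs(5000, 2)
y = 10.0 * X[:, 0] + 0.5 * X[:, 1] ** 2

r1 = variance_ratio(X[:, 0], y)   # dominant input -> ratio near 1
r2 = variance_ratio(X[:, 1], y)   # minor input -> ratio near 0
```

Note that the nonlinear x2 term is handled without any linearity assumption, which is the abstract's point about where regression-based methods may fail.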

  10. Robust guaranteed cost tracking control of quadrotor UAV with uncertainties.

    Science.gov (United States)

    Xu, Zhiwei; Nian, Xiaohong; Wang, Haibo; Chen, Yinsheng

    2017-07-01

    In this paper, a robust guaranteed cost controller (RGCC) is proposed for quadrotor UAV system with uncertainties to address set-point tracking problem. A sufficient condition of the existence for RGCC is derived by Lyapunov stability theorem. The designed RGCC not only guarantees the whole closed-loop system asymptotically stable but also makes the quadratic performance level built for the closed-loop system have an upper bound irrespective to all admissible parameter uncertainties. Then, an optimal robust guaranteed cost controller is developed to minimize the upper bound of performance level. Simulation results verify the presented control algorithms possess small overshoot and short setting time, with which the quadrotor has ability to perform set-point tracking task well. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  11. An evaluation of uncertainties in radioecological models

    International Nuclear Information System (INIS)

    Hoffmann, F.O.; Little, C.A.; Miller, C.W.; Dunning, D.E. Jr.; Rupp, E.M.; Shor, R.W.; Schaeffer, D.L.; Baes, C.F. III

    1978-01-01

    The paper presents results of analyses for seven selected parameters commonly used in environmental radiological assessment models, assuming that the available data are representative of the true distribution of parameter values and that their respective distributions are lognormal. Estimates of the most probable, median, mean, and 99th percentile for each parameter are given and compared to U.S. NRC default values. The regulatory default values are generally greater than the median values for the selected parameters, but some are associated with percentiles significantly less than the 50th. The largest uncertainties appear to be associated with aquatic bioaccumulation factors for fresh water fish. Approximately one order of magnitude separates median values and values of the 99th percentile. The uncertainty is also estimated for the annual dose rate predicted by a multiplicative chain model for the transport of molecular iodine-131 via the air-pasture-cow-milk-child's thyroid pathway. The value for the 99th percentile is ten times larger than the median value of the predicted dose normalized for a given air concentration of ¹³¹I₂. About 72% of the uncertainty in this model is contributed by the dose conversion factor and the milk transfer coefficient. Considering the difficulties in obtaining a reliable quantification of the true uncertainties in model predictions, methods for taking these uncertainties into account when determining compliance with regulatory statutes are discussed. (orig./HP) [de]
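    The percentile arithmetic behind a multiplicative chain of lognormal factors can be checked directly: log-variances add, so the ratio of the 99th percentile to the median is exp(2.326·σ_total). The σ values below are invented for illustration, not the paper's fitted values:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical log-space sigmas for a multiplicative chain of transfer
# factors (air -> pasture -> cow -> milk -> thyroid dose).
sigmas = np.array([0.5, 0.6, 0.4, 0.7])

# For a product of independent lognormals the log-variances add:
sigma_tot = np.sqrt(np.sum(sigmas ** 2))
ratio_analytic = np.exp(2.326 * sigma_tot)   # 99th percentile / median

# Monte Carlo check of the same ratio.
samples = np.exp(rng.normal(0.0, sigmas, size=(200_000, 4))).prod(axis=1)
ratio_mc = np.quantile(samples, 0.99) / np.median(samples)
```

With these illustrative sigmas the 99th-percentile-to-median ratio exceeds an order of magnitude, consistent with the kind of spread the abstract reports.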

  12. Effects of Economic Policy Uncertainty Shocks on the Long-Run US-UK Stock Market Correlation

    DEFF Research Database (Denmark)

    Asgharian, Hossein; Christiansen, Charlotte; Gupta, Rangan

    We use the economic policy uncertainty indices of Baker, Bloom, and Davis (2016) in combination with the mixed data sampling (MIDAS) approach to investigate the US and UK stock market movements. The long-run US-UK stock market correlation depends positively on US economic policy uncertainty shocks. … The US long-run stock market volatility depends significantly on US economic policy uncertainty shocks but not on UK shocks, while the UK depends significantly on both. …

  13. The relationship of parental overprotection, perceived child vulnerability, and parenting stress to uncertainty in youth with chronic illness.

    Science.gov (United States)

    Mullins, Larry L; Wolfe-Christensen, Cortney; Pai, Ahna L Hoff; Carpentier, Melissa Y; Gillaspy, Stephen; Cheek, Jeff; Page, Melanie

    2007-09-01

    To examine the relationship of parent-reported overprotection (OP), perceived child vulnerability (PCV), and parenting stress (PS) to youth-reported illness uncertainty, and to explore potential developmental differences. Eighty-two children and 82 adolescents (n = 164) diagnosed with Type 1 diabetes mellitus (DM1) or asthma completed a measure of illness uncertainty, while their parents completed measures of OP, PCV, and PS. After controlling for demographic and illness parameters, both PCV and PS significantly predicted youth illness uncertainty in the combined sample. Within the child group, only PS significantly predicted illness uncertainty, whereas only PCV significantly predicted uncertainty for adolescents. Specific parenting variables are associated with youth-reported illness uncertainty; however, their relationship varies according to developmental level. Although OP has been identified as a predictor of child psychological outcomes in other studies, it does not appear to be associated with illness uncertainty in youth with DM1 or asthma.

  14. Uncertainty analysis in WWTP model applications: a critical discussion using an example from design

    DEFF Research Database (Denmark)

    Sin, Gürkan; Gernaey, Krist; Neumann, Marc B.

    2009-01-01

    This study focuses on uncertainty analysis of WWTP models and analyzes the issue of framing and how it affects the interpretation of uncertainty analysis results. As a case study, the prediction of uncertainty involved in model-based design of a wastewater treatment plant is studied. The Monte Carlo … (1) uncertainty due to stoichiometric, biokinetic and influent parameters; (2) uncertainty due to hydraulic behaviour of the plant and mass transfer parameters; (3) uncertainty due to the combination of (1) and (2). The results demonstrate that depending on the way the uncertainty analysis is framed, the estimated uncertainty of design performance criteria differs significantly. The implication for the practical applications of uncertainty analysis in the wastewater industry is profound: (i) as the uncertainty analysis results are specific to the framing used, the results must be interpreted within the context of that framing …

  15. A technique for improved stability of adaptive feedforward controllers without detailed uncertainty measurements

    International Nuclear Information System (INIS)

    Berkhoff, A P

    2012-01-01

    Model errors in adaptive controllers for the reduction of broadband noise and vibrations may lead to unstable systems or increased error signals. Previous research on active structures with small damping has shown that the addition of a low-authority controller which increases damping in the system may lead to improved performance of an adaptive, high-authority controller. Other researchers have suggested the use of frequency dependent regularization based on measured uncertainties. In this paper an alternative method is presented that avoids the disadvantages of these methods, namely the additional complex hardware and the need to obtain detailed information on the uncertainties. An analysis is made of an adaptive feedforward controller in which a difference exists between the secondary path and the model as used in the controller. The real parts of the eigenvalues that determine the stability of the system are expressed in terms of the amount of uncertainty and the singular values of the secondary path. Modifications of the feedforward control scheme are suggested that aim to improve performance without requiring detailed uncertainty measurements. (paper)

  16. Eliciting geologists' tacit model of the uncertainty of mapped geological boundaries

    Science.gov (United States)

    Lark, R. M.; Lawley, R. S.; Barron, A. J. M.; Aldiss, D. T.; Ambrose, K.; Cooper, A. H.; Lee, J. R.; Waters, C. N.

    2015-01-01

    It is generally accepted that geological linework, such as mapped boundaries, is uncertain for various reasons. It is difficult to quantify this uncertainty directly, because the investigation of error in a boundary at a single location may be costly and time consuming, and many such observations are needed to estimate an uncertainty model with confidence. However, it is also recognized across many disciplines that experts generally have a tacit model of the uncertainty of information that they produce (interpretations, diagnoses etc.), and formal methods exist to extract this model in usable form by elicitation. In this paper we report a trial in which uncertainty models for mapped boundaries in six geological scenarios were elicited from a group of five experienced geologists. In five cases a consensus distribution was obtained, which reflected both the initial individually elicited distributions and a structured process of group discussion in which individuals revised their opinions. In a sixth case a consensus was not reached. This concerned a boundary between superficial deposits where the geometry of the contact is hard to visualize. The trial showed that the geologists' tacit model of uncertainty in mapped boundaries reflects factors in addition to the cartographic error usually treated by buffering linework or in written guidance on its application. It suggests that further application of elicitation, to scenarios at an appropriate level of generalization, could be useful to provide working error models for the application and interpretation of linework.

  17. Additivity of entropic uncertainty relations

    Directory of Open Access Journals (Sweden)

    René Schwonnek

    2018-03-01

    We consider the uncertainty between two pairs of local projective measurements performed on a multipartite system. We show that the optimal bound in any linear uncertainty relation, formulated in terms of the Shannon entropy, is additive. This directly implies, against naive intuition, that the minimal entropic uncertainty can always be realized by fully separable states. Hence, in contradiction to proposals by other authors, no entanglement witness can be constructed solely by comparing the attainable uncertainties of entangled and separable states. However, our result gives rise to a huge simplification for computing global uncertainty bounds as they now can be deduced from local ones. Furthermore, we provide the natural generalization of the Maassen and Uffink inequality for linear uncertainty relations with arbitrary positive coefficients.
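    For a single qubit measured in two mutually unbiased bases, the Maassen-Uffink bound H(A) + H(B) ≥ -2·log₂ c evaluates to exactly 1 bit (maximum overlap c = 1/√2). A quick numerical check with an arbitrary test state (the state and angle are illustrative choices):

```python
import numpy as np

def shannon(p):
    """Shannon entropy in bits, ignoring zero-probability outcomes."""
    p = p[p > 1e-12]
    return -np.sum(p * np.log2(p))

# Qubit state |psi> = cos(t)|0> + sin(t)|1>, measured in the Z and X bases.
t = 0.3
psi = np.array([np.cos(t), np.sin(t)])

pz = np.abs(psi) ** 2                                # Z-basis probabilities
x_basis = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)
px = np.abs(x_basis @ psi) ** 2                      # X-basis probabilities

# Maassen-Uffink: H(Z) + H(X) >= -2 log2(1/sqrt(2)) = 1 bit.
total = shannon(pz) + shannon(px)
```

The abstract's additivity result says that for pairs of such local measurements on a multipartite system, the optimal global bound is just the sum of these local bounds.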

  18. Calibration Uncertainties in the Droplet Measurement Technologies Cloud Condensation Nuclei Counter

    Science.gov (United States)

    Hibert, Kurt James

    Cloud condensation nuclei (CCN) serve as the nucleation sites for the condensation of water vapor in Earth's atmosphere and are important for their effect on climate and weather. The influence of CCN on cloud radiative properties (aerosol indirect effect) is the most uncertain of quantified radiative forcing changes that have occurred since pre-industrial times. CCN influence the weather because intrinsic and extrinsic aerosol properties affect cloud formation and precipitation development. To quantify these effects, it is necessary to accurately measure CCN, which requires accurate calibrations using a consistent methodology. Furthermore, the calibration uncertainties are required to compare measurements from different field projects. CCN uncertainties also aid the integration of CCN measurements with atmospheric models. The commercially available Droplet Measurement Technologies (DMT) CCN Counter is used by many research groups, so it is important to quantify its calibration uncertainty. Uncertainties in the calibration of the DMT CCN counter exist in the flow rate and supersaturation values. The concentration depends on the accuracy of the flow rate calibration, which does not have a large (4.3 %) uncertainty. The supersaturation depends on chamber pressure, temperature, and flow rate. The supersaturation calibration is a complex process since the chamber's supersaturation must be inferred from a temperature difference measurement. Additionally, calibration errors can result from the Köhler theory assumptions, fitting methods utilized, the influence of multiply-charged particles, and calibration points used. In order to determine the calibration uncertainties and the pressure dependence of the supersaturation calibration, three calibrations are done at each pressure level: 700, 840, and 980 hPa. Typically 700 hPa is the pressure used for aircraft measurements in the boundary layer, 840 hPa is the calibration pressure at DMT in Boulder, CO, and 980 hPa is the

  19. Uncertainties in coupled thermal-hydrological processes associated with the drift scale test at Yucca Mountain, Nevada

    International Nuclear Information System (INIS)

    Mukhopadhyay, Sumitra; Tsang, Y.W.

    2002-01-01

    Understanding thermally driven coupled hydrological, mechanical, and chemical processes in unsaturated fractured tuff is essential for evaluating the performance of the potential radioactive waste repository at Yucca Mountain, Nevada. The Drift Scale Test (DST), intended to provide such an understanding, has generated a huge volume of temperature and moisture redistribution data. Sophisticated thermal-hydrological (TH) conceptual models have yielded a good fit between simulation results and the measured data. However, some uncertainties in understanding the TH processes associated with the DST still exist. This paper evaluates these uncertainties and provides quantitative estimates of their range. Of particular interest for the DST are the uncertainties resulting from the unmonitored loss of vapor through an open bulkhead of the test. There was concern that the outcome of the test might have been significantly altered by these losses. Using alternative conceptual models, we illustrate that predicted mean temperatures from the DST are within 1 degree C of the measured mean temperatures through the first two years of heating. The simulated spatial and temporal evolution of drying and condensation fronts is found to be qualitatively consistent with measured saturation data. Energy and mass balance computation shows that no more than 13 percent of the input energy is lost because of vapor leaving the test domain through the bulkhead. The change in average saturation in fractures is also relatively small. For a hypothetical situation in which no vapor is allowed to exit through the bulkhead, the simulated average fracture saturation is not different enough to be discerned by measured moisture redistribution data. This leads us to conclude that the DST, despite the uncertainties associated with open field testing, has provided an excellent understanding of the TH processes.

  20. Uncertainty, robustness, and the value of information in managing a population of northern bobwhites

    Science.gov (United States)

    Johnson, Fred A.; Hagan, Greg; Palmer, William E.; Kemmerer, Michael

    2014-01-01

    The abundance of northern bobwhites (Colinus virginianus) has decreased throughout their range. Managers often respond by considering improvements in harvest and habitat management practices, but this can be challenging if substantial uncertainty exists concerning the cause(s) of the decline. We were interested in how application of decision science could be used to help managers on a large, public management area in southwestern Florida where the bobwhite is a featured species and where abundance has severely declined. We conducted a workshop with managers and scientists to elicit management objectives, alternative hypotheses concerning population limitation in bobwhites, potential management actions, and predicted management outcomes. Using standard and robust approaches to decision making, we determined that improved water management and perhaps some changes in hunting practices would be expected to produce the best management outcomes in the face of uncertainty about what is limiting bobwhite abundance. We used a criterion called the expected value of perfect information to determine that a robust management strategy may perform nearly as well as an optimal management strategy (i.e., a strategy that is expected to perform best, given the relative importance of different management objectives) with all uncertainty resolved. We used the expected value of partial information to determine that management performance could be increased most by eliminating uncertainty over excessive-harvest and human-disturbance hypotheses. Beyond learning about the factors limiting bobwhites, adoption of a dynamic management strategy, which recognizes temporal changes in resource and environmental conditions, might produce the greatest management benefit. Our research demonstrates that robust approaches to decision making, combined with estimates of the value of information, can offer considerable insight into preferred management approaches when great uncertainty exists about
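    The expected value of perfect information used above has a simple form for a discrete decision table: EVPI = E[best payoff per hypothesis] minus the best expected payoff under uncertainty. The payoffs and prior weights below are purely illustrative, not the workshop's elicited values:

```python
import numpy as np

# Hypothetical payoff table: rows = management actions, columns = competing
# hypotheses about what limits bobwhite abundance (numbers are invented).
payoff = np.array([
    [8.0, 2.0, 4.0],   # improve water management
    [3.0, 7.0, 3.0],   # restrict harvest
    [4.0, 4.0, 5.0],   # reduce human disturbance
])
prior = np.array([0.5, 0.3, 0.2])   # belief weights on the hypotheses

# Best action under uncertainty: maximize expected payoff over actions.
expected = payoff @ prior
best_under_uncertainty = expected.max()

# With perfect information we would learn the true hypothesis first and
# pick the best action for it, then average over the prior.
with_perfect_info = (payoff.max(axis=0) * prior).sum()

evpi = with_perfect_info - best_under_uncertainty   # always >= 0
```

The expected value of partial information follows the same logic, restricted to resolving only a subset of the hypotheses (e.g., the excessive-harvest and human-disturbance hypotheses mentioned above).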

  1. Uncertainty quantification in computational fluid dynamics and aircraft engines

    CERN Document Server

    Montomoli, Francesco; D'Ammaro, Antonio; Massini, Michela; Salvadori, Simone

    2015-01-01

    This book introduces novel design techniques developed to increase the safety of aircraft engines. The authors demonstrate how the application of uncertainty methods can overcome problems in the accurate prediction of engine lift caused by manufacturing error. This in turn ameliorates the difficulty of achieving required safety margins imposed by limits in current design and manufacturing methods. This text shows that even state-of-the-art computational fluid dynamics (CFD) methods are not able to predict the same performance measured in experiments; CFD methods assume idealised geometries, but ideal geometries do not exist, cannot be manufactured, and their performance differs from real-world ones. By applying geometrical variations of a few microns, the agreement with experiments improves dramatically, but unfortunately the manufacturing errors in engines or in experiments are unknown. In order to overcome this limitation, uncertainty quantification considers the probability density functions of manufacturing errors...

  2. “Stringy” coherent states inspired by generalized uncertainty principle

    Science.gov (United States)

    Ghosh, Subir; Roy, Pinaki

    2012-05-01

    Coherent States with Fractional Revival property, that explicitly satisfy the Generalized Uncertainty Principle (GUP), have been constructed in the context of Generalized Harmonic Oscillator. The existence of such states is essential in motivating the GUP based phenomenological results present in the literature which otherwise would be of purely academic interest. The effective phase space is Non-Canonical (or Non-Commutative in popular terminology). Our results have a smooth commutative limit, equivalent to Heisenberg Uncertainty Principle. The Fractional Revival time analysis yields an independent bound on the GUP parameter. Using this and similar bounds obtained here, we derive the largest possible value of the (GUP induced) minimum length scale. Mandel parameter analysis shows that the statistics is Sub-Poissonian. Correspondence Principle is deformed in an interesting way. Our computational scheme is very simple as it requires only first order corrected energy values and undeformed basis states.
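    One commonly used form of the GUP referred to above modifies the canonical commutator with a quadratic momentum term; the parameter β and the exact form vary across the GUP literature, so this is a representative convention rather than the specific one used in the paper:

```latex
[\hat{x},\hat{p}] = i\hbar\left(1 + \beta\,\hat{p}^{2}\right)
\quad\Longrightarrow\quad
\Delta x\,\Delta p \;\ge\; \frac{\hbar}{2}\left(1 + \beta\,(\Delta p)^{2}\right).
```

Minimizing the right-hand side over Δp (at Δp = 1/√β) gives Δx_min = ħ√β, the GUP-induced minimum length scale whose largest possible value the abstract bounds via the fractional revival time analysis. The smooth commutative limit β → 0 recovers the ordinary Heisenberg relation.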

  3. “Stringy” coherent states inspired by generalized uncertainty principle

    International Nuclear Information System (INIS)

    Ghosh, Subir; Roy, Pinaki

    2012-01-01

    Coherent States with Fractional Revival property, that explicitly satisfy the Generalized Uncertainty Principle (GUP), have been constructed in the context of Generalized Harmonic Oscillator. The existence of such states is essential in motivating the GUP based phenomenological results present in the literature which otherwise would be of purely academic interest. The effective phase space is Non-Canonical (or Non-Commutative in popular terminology). Our results have a smooth commutative limit, equivalent to Heisenberg Uncertainty Principle. The Fractional Revival time analysis yields an independent bound on the GUP parameter. Using this and similar bounds obtained here, we derive the largest possible value of the (GUP induced) minimum length scale. Mandel parameter analysis shows that the statistics is Sub-Poissonian. Correspondence Principle is deformed in an interesting way. Our computational scheme is very simple as it requires only first order corrected energy values and undeformed basis states.

  4. Political uncertainty and firm risk in China

    Directory of Open Access Journals (Sweden)

    Danglun Luo

    2017-12-01

    The political uncertainty surrounding the turnover of government officials has a major impact on local economies and local firms. This paper investigates the relationship between the turnover of prefecture-city officials and the inherent risk faced by local firms in China. Using data from 1999 to 2012, we find that prefecture-city official turnovers significantly increased firm risk. Our results show that the political risk was mitigated when new prefecture-city officials were well connected with their provincial leaders. In addition, the impact of political uncertainty was more pronounced for regulated firms and firms residing in provinces with low market openness.

  5. Accounting for data uncertainties in comparing risks from energy systems

    International Nuclear Information System (INIS)

    Hauptmanns, Ulrich

    1998-01-01

    Data and models for risk comparisons are uncertain, all the more so the longer the time horizon contemplated. Statistical methods are presented for dealing with data uncertainties, thus providing a broader foundation for decisions. Nevertheless, it has to be borne in mind that no method exists to account for the 'unforeseeable' which is always present in decision making with respect to the far future. (author)

  6. A stochastic optimization model under modeling uncertainty and parameter certainty for groundwater remediation design--part I. Model development.

    Science.gov (United States)

    He, L; Huang, G H; Lu, H W

    2010-04-15

    Solving groundwater remediation optimization problems based on proxy simulators can usually yield optimal solutions differing from the "true" ones of the problem. This study presents a new stochastic optimization model under modeling uncertainty and parameter certainty (SOMUM) and the associated solution method for simultaneously addressing modeling uncertainty associated with simulator residuals and optimizing groundwater remediation processes. This is a new attempt, differing from previous modeling efforts, which focused on addressing uncertainty in physical parameters (e.g. soil porosity), whereas this one aims to deal with uncertainty in the mathematical simulator (arising from model residuals). Compared to the existing modeling approaches (i.e. only parameter uncertainty is considered), the model has the advantages of providing mean-variance analysis for contaminant concentrations, mitigating the effects of modeling uncertainties on optimal remediation strategies, offering confidence levels of optimal remediation strategies to system designers, and reducing computational cost in optimization processes. 2009 Elsevier B.V. All rights reserved.

  7. Multimodel Uncertainty Changes in Simulated River Flows Induced by Human Impact Parameterizations

    Science.gov (United States)

Liu, Xingcai; Tang, Qiuhong; Cui, Huijuan; Mu, Mengfei; Gerten, Dieter; Gosling, Simon; Masaki, Yoshimitsu; Satoh, Yusuke; Wada, Yoshihide

    2017-01-01

Human impacts increasingly affect the global hydrological cycle and indeed dominate hydrological changes in some regions. Hydrologists have sought to identify human-impact-induced hydrological variations by parameterizing anthropogenic water uses in global hydrological models (GHMs). The consequently increased model complexity is likely to introduce additional uncertainty among GHMs. Here, using four GHMs, between-model uncertainties are quantified in terms of the signal-to-noise ratio (SNR) for average river flow during 1971-2000 simulated in two experiments, with representation of human impacts (VARSOC) and without (NOSOC). This is the first quantitative investigation of between-model uncertainty resulting from the inclusion of human impact parameterizations. Results show that the between-model uncertainties in terms of SNRs in the VARSOC annual flow are larger (about 2 for the globe, with varied magnitudes for different basins) than those in the NOSOC, which is particularly significant in most areas of Asia and in areas north of the Mediterranean Sea. The SNR differences are mostly negative (-20 to 5, indicating higher uncertainty) for basin-averaged annual flow. The VARSOC high flow shows slightly lower uncertainties than NOSOC simulations, with SNR differences mostly ranging from -20 to 20. The uncertainty differences between the two experiments are significantly related to the fraction of irrigated area in each basin. The large additional uncertainties in VARSOC simulations introduced by the inclusion of parameterizations of human impacts point to an urgent need for GHM development based on a better understanding of human impacts. Differences in the parameterizations of irrigation, reservoir regulation and water withdrawals are discussed as potential directions of improvement for future GHM development.
We also discuss the advantages of statistical approaches to reduce the between-model uncertainties, and the importance of calibration of GHMs for not only
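The SNR bookkeeping described in this abstract can be sketched in a few lines. The flows, basins and model count below are invented, and SNR is taken here as the ensemble mean divided by the between-model standard deviation (one common convention, not necessarily the study's exact definition):

```python
import numpy as np

# Hypothetical annual-mean river flow (m^3/s) for 3 basins simulated by 4 GHMs,
# once without human impacts (NOSOC) and once with them (VARSOC).
# All numbers are illustrative, not from the study.
nosoc = np.array([[120., 118., 125., 122.],   # basin A: 4 models
                  [ 40.,  55.,  48.,  42.],   # basin B
                  [300., 310., 295., 305.]])  # basin C
varsoc = np.array([[110., 100., 121., 115.],
                   [ 30.,  52.,  41.,  28.],
                   [280., 305., 260., 300.]])

def snr(flows):
    """Signal-to-noise ratio across the model dimension:
    ensemble mean divided by between-model standard deviation."""
    return flows.mean(axis=1) / flows.std(axis=1, ddof=1)

# A negative SNR difference means the human-impact runs are *more* uncertain.
snr_diff = snr(varsoc) - snr(nosoc)
for name, d in zip("ABC", snr_diff):
    print(f"basin {name}: SNR(VARSOC) - SNR(NOSOC) = {d:+.1f}")
```

With these invented numbers every basin shows a negative difference, i.e. the human-impact parameterizations widen the between-model spread.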

  8. Tolerance analysis in manufacturing using process capability ratio with measurement uncertainty

    DEFF Research Database (Denmark)

    Mahshid, Rasoul; Mansourvar, Zahra; Hansen, Hans Nørgaard

    2017-01-01

Tolerance analysis provides valuable information regarding the performance of a manufacturing process. It allows determining the maximum possible variation of a quality feature in production. Previous research has focused on the application of tolerance analysis to the design of mechanical assemblies. In this paper, a new statistical analysis was applied to manufactured products to assess achieved tolerances when the process is known, using the capability ratio and expanded uncertainty. The analysis has benefits for process planning, determining actual precision limits, process optimization, and troubleshooting malfunctioning existing parts. The capability measure is based on a number of measurements performed on the part's quality variable. Since the ratio relies on measurements, any unaccounted-for measurement error has a notable negative impact on the results. Therefore, measurement uncertainty was used in combination with the process...
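The capability-ratio idea combined with expanded uncertainty can be sketched as follows. The guard-banding convention (shrinking the tolerance zone by the expanded uncertainty U on each side) is one common approach, not necessarily the paper's exact formulation, and all numbers are invented:

```python
# Illustrative sketch: process capability ratio Cp with tolerance limits
# tightened by the expanded measurement uncertainty U (all numbers invented).
USL, LSL = 10.05, 9.95     # upper/lower tolerance limits, mm
sigma = 0.012              # process standard deviation, mm
U = 0.004                  # expanded measurement uncertainty, mm (k = 2)

cp_plain = (USL - LSL) / (6 * sigma)
# Conservative convention: shrink the usable tolerance zone by U on each
# side (guard banding) before computing capability.
cp_guarded = ((USL - U) - (LSL + U)) / (6 * sigma)

print(f"Cp ignoring measurement uncertainty:   {cp_plain:.2f}")
print(f"Cp with guard-banded tolerance limits: {cp_guarded:.2f}")
```

Ignoring the measurement uncertainty overstates the capability of the process; the guard-banded ratio is always the smaller of the two.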

  9. Model uncertainty and probability

    International Nuclear Information System (INIS)

    Parry, G.W.

    1994-01-01

This paper discusses the issue of model uncertainty. The use of probability as a measure of an analyst's uncertainty as well as a means of describing random processes has caused some confusion, even though the two uses represent different types of uncertainty with respect to modeling a system. The importance of maintaining the distinction between the two types is illustrated with a simple example

  10. Uncertainties in Steric Sea Level Change Estimation During the Satellite Altimeter Era: Concepts and Practices

    Science.gov (United States)

    MacIntosh, C. R.; Merchant, C. J.; von Schuckmann, K.

    2017-01-01

    This article presents a review of current practice in estimating steric sea level change, focussed on the treatment of uncertainty. Steric sea level change is the contribution to the change in sea level arising from the dependence of density on temperature and salinity. It is a significant component of sea level rise and a reflection of changing ocean heat content. However, tracking these steric changes still remains a significant challenge for the scientific community. We review the importance of understanding the uncertainty in estimates of steric sea level change. Relevant concepts of uncertainty are discussed and illustrated with the example of observational uncertainty propagation from a single profile of temperature and salinity measurements to steric height. We summarise and discuss the recent literature on methodologies and techniques used to estimate steric sea level in the context of the treatment of uncertainty. Our conclusions are that progress in quantifying steric sea level uncertainty will benefit from: greater clarity and transparency in published discussions of uncertainty, including exploitation of international standards for quantifying and expressing uncertainty in measurement; and the development of community "recipes" for quantifying the error covariances in observations and from sparse sampling and for estimating and propagating uncertainty across spatio-temporal scales.
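The paper's worked example, propagating observational uncertainty from a single temperature/salinity profile to steric height, can be sketched with a toy Monte Carlo calculation. The linearised equation of state below is a crude stand-in for a real one (e.g. TEOS-10), and every number is illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy propagation of observational uncertainty from one temperature/salinity
# profile to steric height. Coefficients and profiles are invented.
z = np.linspace(0.0, 2000.0, 41)        # depth levels, m
T = 20.0 * np.exp(-z / 700.0)           # temperature profile, degC
S = 35.0 + 0.5 * np.exp(-z / 1000.0)    # salinity profile, psu
sigma_T, sigma_S = 0.01, 0.005          # per-level measurement uncertainties

alpha, beta = 2e-4, 7.6e-4              # thermal expansion / haline contraction

def steric_height(T, S, z):
    """Integrate the (linearised) density anomaly over depth."""
    drho_over_rho = -alpha * T + beta * (S - 35.0)
    dz = z[1] - z[0]
    return -np.sum(drho_over_rho) * dz  # metres

# Monte Carlo propagation, assuming uncorrelated errors between levels
samples = np.array([steric_height(T + rng.normal(0, sigma_T, z.size),
                                  S + rng.normal(0, sigma_S, z.size), z)
                    for _ in range(2000)])
print(f"steric height = {steric_height(T, S, z):.3f} m "
      f"+/- {samples.std():.4f} m (1-sigma)")
```

Correlated errors between levels, which the article emphasises via error covariances, would inflate the spread well beyond this uncorrelated baseline.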

  11. Deterministic methods for sensitivity and uncertainty analysis in large-scale computer models

    International Nuclear Information System (INIS)

    Worley, B.A.; Oblow, E.M.; Pin, F.G.; Maerker, R.E.; Horwedel, J.E.; Wright, R.Q.; Lucius, J.L.

    1987-01-01

    The fields of sensitivity and uncertainty analysis are dominated by statistical techniques when large-scale modeling codes are being analyzed. This paper reports on the development and availability of two systems, GRESS and ADGEN, that make use of computer calculus compilers to automate the implementation of deterministic sensitivity analysis capability into existing computer models. This automation removes the traditional limitation of deterministic sensitivity methods. The paper describes a deterministic uncertainty analysis method (DUA) that uses derivative information as a basis to propagate parameter probability distributions to obtain result probability distributions. The paper demonstrates the deterministic approach to sensitivity and uncertainty analysis as applied to a sample problem that models the flow of water through a borehole. The sample problem is used as a basis to compare the cumulative distribution function of the flow rate as calculated by the standard statistical methods and the DUA method. The DUA method gives a more accurate result based upon only two model executions compared to fifty executions in the statistical case
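The derivative-based propagation idea (DUA) can be sketched on the classic borehole-flow test function mentioned in the abstract. This is a first-order sketch with finite-difference derivatives, not the GRESS/ADGEN machinery itself, and the nominal values and input uncertainties are illustrative:

```python
import math

# Sketch of derivative-based uncertainty propagation on the borehole-flow
# test function. Nominal values and uncertainties are illustrative.
def borehole(rw, r, Tu, Hu, Tl, Hl, L, Kw):
    """Water flow rate through a borehole, m^3/yr."""
    ln_r = math.log(r / rw)
    return (2 * math.pi * Tu * (Hu - Hl)) / (
        ln_r * (1 + 2 * L * Tu / (ln_r * rw**2 * Kw) + Tu / Tl))

nominal = dict(rw=0.10, r=500.0, Tu=70000.0, Hu=1050.0,
               Tl=80.0, Hl=760.0, L=1400.0, Kw=10000.0)
sigma = dict(rw=0.01, Hu=20.0, Hl=20.0, Kw=500.0)   # uncertain inputs only

f0 = borehole(**nominal)

# first-order propagation: var(f) ~ sum_i (df/dx_i)^2 * sigma_i^2,
# with central finite-difference derivatives
var = 0.0
for name, s in sigma.items():
    hi, lo = dict(nominal), dict(nominal)
    hi[name] += 1e-3 * s
    lo[name] -= 1e-3 * s
    dfdx = (borehole(**hi) - borehole(**lo)) / (2e-3 * s)
    var += (dfdx * s) ** 2

print(f"flow = {f0:.1f} m^3/yr, first-order 1-sigma = {math.sqrt(var):.1f} m^3/yr")
```

The key economy the abstract describes is visible even here: once derivatives are available, the distribution of the output follows from a handful of model evaluations rather than a large statistical sample.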

  12. Experiences of Uncertainty in Men With an Elevated PSA.

    Science.gov (United States)

    Biddle, Caitlin; Brasel, Alicia; Underwood, Willie; Orom, Heather

    2015-05-15

A significant proportion of men ages 50 to 70 years have had, and continue to receive, prostate-specific antigen (PSA) tests to screen for prostate cancer (PCa). Approximately 70% of men with an elevated PSA level will not subsequently be diagnosed with PCa. Semistructured interviews were conducted with 13 men with an elevated PSA level who had not been diagnosed with PCa. Uncertainty was prominent in men's reactions to the PSA results, stemming from unanswered questions about the PSA test, PCa risk, and confusion about their management plan. Uncertainty was exacerbated or reduced depending on whether health care providers communicated in lay and empathetic ways and provided opportunities for question asking. To manage uncertainty, men engaged in information and health care seeking, self-monitoring, and defensive cognition. Results inform strategies for meeting the informational needs of men with an elevated PSA and confirm the primary importance of physician communication behavior for open information exchange and uncertainty reduction. © The Author(s) 2015.

  13. Modeling theoretical uncertainties in phenomenological analyses for particle physics

    Energy Technology Data Exchange (ETDEWEB)

    Charles, Jerome [CNRS, Aix-Marseille Univ, Universite de Toulon, CPT UMR 7332, Marseille Cedex 9 (France); Descotes-Genon, Sebastien [CNRS, Univ. Paris-Sud, Universite Paris-Saclay, Laboratoire de Physique Theorique (UMR 8627), Orsay Cedex (France); Niess, Valentin [CNRS/IN2P3, UMR 6533, Laboratoire de Physique Corpusculaire, Aubiere Cedex (France); Silva, Luiz Vale [CNRS, Univ. Paris-Sud, Universite Paris-Saclay, Laboratoire de Physique Theorique (UMR 8627), Orsay Cedex (France); Univ. Paris-Sud, CNRS/IN2P3, Universite Paris-Saclay, Groupe de Physique Theorique, Institut de Physique Nucleaire, Orsay Cedex (France); J. Stefan Institute, Jamova 39, P. O. Box 3000, Ljubljana (Slovenia)

    2017-04-15

    The determination of the fundamental parameters of the Standard Model (and its extensions) is often limited by the presence of statistical and theoretical uncertainties. We present several models for the latter uncertainties (random, nuisance, external) in the frequentist framework, and we derive the corresponding p values. In the case of the nuisance approach where theoretical uncertainties are modeled as biases, we highlight the important, but arbitrary, issue of the range of variation chosen for the bias parameters. We introduce the concept of adaptive p value, which is obtained by adjusting the range of variation for the bias according to the significance considered, and which allows us to tackle metrology and exclusion tests with a single and well-defined unified tool, which exhibits interesting frequentist properties. We discuss how the determination of fundamental parameters is impacted by the model chosen for theoretical uncertainties, illustrating several issues with examples from quark flavor physics. (orig.)

  14. A new uncertainty importance measure

    International Nuclear Information System (INIS)

    Borgonovo, E.

    2007-01-01

    Uncertainty in parameters is present in many risk assessment problems and leads to uncertainty in model predictions. In this work, we introduce a global sensitivity indicator which looks at the influence of input uncertainty on the entire output distribution without reference to a specific moment of the output (moment independence) and which can be defined also in the presence of correlations among the parameters. We discuss its mathematical properties and highlight the differences between the present indicator, variance-based uncertainty importance measures and a moment independent sensitivity indicator previously introduced in the literature. Numerical results are discussed with application to the probabilistic risk assessment model on which Iman [A matrix-based approach to uncertainty and sensitivity analysis for fault trees. Risk Anal 1987;7(1):22-33] first introduced uncertainty importance measures
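A minimal sketch of a moment-independent ("delta"-style) indicator of this kind: half the expected L1 distance between the unconditional output density and the density obtained when one input is fixed. This is a crude histogram estimator on an invented model, not the paper's estimator:

```python
import numpy as np

rng = np.random.default_rng(1)

# Moment-independent sensitivity sketch: delta_i ~ (1/2) E[ integral |f_Y - f_{Y|X_i}| dy ]
def model(x1, x2):
    return x1 + 0.1 * x2            # x1 should matter far more than x2

n, m = 20000, 20
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = model(x1, x2)
bins = np.linspace(y.min(), y.max(), 60)
width = np.diff(bins)
f_y, _ = np.histogram(y, bins=bins, density=True)

def delta(xi, other, fix_first):
    shift = 0.0
    for q in np.linspace(0.05, 0.95, m):        # fix x_i at m quantile values
        v = np.quantile(xi, q)
        y_c = model(v, other) if fix_first else model(other, v)
        f_c, _ = np.histogram(y_c, bins=bins, density=True)
        shift += 0.5 * np.sum(np.abs(f_y - f_c) * width)
    return shift / m

d1 = delta(x1, x2, fix_first=True)
d2 = delta(x2, x1, fix_first=False)
print(f"delta(x1) ~ {d1:.2f}, delta(x2) ~ {d2:.2f}")
```

Because the indicator compares whole densities rather than a single moment, it ranks x1 far above x2 without ever computing a variance decomposition.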

  15. In pursuit of a fit-for-purpose uncertainty guide

    Science.gov (United States)

    White, D. R.

    2016-08-01

    Measurement uncertainty is a measure of the quality of a measurement; it enables users of measurements to manage the risks and costs associated with decisions influenced by measurements, and it supports metrological traceability by quantifying the proximity of measurement results to true SI values. The Guide to the Expression of Uncertainty in Measurement (GUM) ensures uncertainty statements meet these purposes and encourages the world-wide harmony of measurement uncertainty practice. Although the GUM is an extraordinarily successful document, it has flaws, and a revision has been proposed. Like the already-published supplements to the GUM, the proposed revision employs objective Bayesian statistics instead of frequentist statistics. This paper argues that the move away from a frequentist treatment of measurement error to a Bayesian treatment of states of knowledge is misguided. The move entails changes in measurement philosophy, a change in the meaning of probability, and a change in the object of uncertainty analysis, all leading to different numerical results, increased costs, increased confusion, a loss of trust, and, most significantly, a loss of harmony with current practice. Recommendations are given for a revision in harmony with the current GUM and allowing all forms of statistical inference.
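The frequentist GUM mechanics the author defends are mechanically simple: combine the standard uncertainties of the inputs through sensitivity coefficients, then apply a coverage factor. The measurement model and all numbers below are illustrative:

```python
import math

# Minimal GUM-style combined standard uncertainty for an uncorrelated
# measurement model R = V / I (resistance from voltage and current).
V, u_V = 10.0, 0.02      # volts, standard uncertainty
I, u_I = 2.0, 0.005      # amperes, standard uncertainty

R = V / I
# sensitivity coefficients dR/dV and dR/dI
c_V, c_I = 1 / I, -V / I**2
u_R = math.sqrt((c_V * u_V) ** 2 + (c_I * u_I) ** 2)
U = 2 * u_R              # expanded uncertainty, coverage factor k = 2

print(f"R = {R:.3f} ohm, u = {u_R:.4f} ohm, U(k=2) = {U:.4f} ohm")
```

The debate the paper addresses is not about this arithmetic, which both camps produce, but about what the resulting interval is taken to mean.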

  16. Concept of uncertainty in relation to the foresight research

    Directory of Open Access Journals (Sweden)

    Magruk Andrzej

    2017-03-01

Uncertainty is one of the most important features of many areas of social and economic life, especially in the forward-looking context. On the one hand, the degree of uncertainty is associated with the objective randomness of a phenomenon; on the other, with the subjective perspective of a person. Future-oriented perception of human activities is laden with incomplete specificity of the analysed phenomena, their volatility, and lack of continuity. A person is unable to determine, with complete certainty, the further course of these phenomena. According to the author of this article, in order to significantly reduce uncertainty while making strategic decisions in a complex environment, we should focus our actions on the future through systemic foresight research. This article attempts to answer the following research questions: 1) What is the relationship of foresight studies, in the system perspective, to studies of uncertainty? 2) What classes of foresight methods enable the study of uncertainty in the process of systemic inquiry into the future? The study employed deductive reasoning based on the results of the analysis and criticism of the literature.

  17. Statistical uncertainties and unrecognized relationships

    International Nuclear Information System (INIS)

    Rankin, J.P.

    1985-01-01

    Hidden relationships in specific designs directly contribute to inaccuracies in reliability assessments. Uncertainty factors at the system level may sometimes be applied in attempts to compensate for the impact of such unrecognized relationships. Often uncertainty bands are used to relegate unknowns to a miscellaneous category of low-probability occurrences. However, experience and modern analytical methods indicate that perhaps the dominant, most probable and significant events are sometimes overlooked in statistical reliability assurances. The author discusses the utility of two unique methods of identifying the otherwise often unforeseeable system interdependencies for statistical evaluations. These methods are sneak circuit analysis and a checklist form of common cause failure analysis. Unless these techniques (or a suitable equivalent) are also employed along with the more widely-known assurance tools, high reliability of complex systems may not be adequately assured. This concern is indicated by specific illustrations. 8 references, 5 figures

  18. Uncertainty calculations made easier

    International Nuclear Information System (INIS)

    Hogenbirk, A.

    1994-07-01

The results are presented of a neutron cross section sensitivity/uncertainty analysis performed in a complicated 2D model of the NET shielding blanket design inside the ITER torus design, surrounded by the cryostat/biological shield as planned for ITER. The calculations were performed with a code system developed at ECN Petten, with which sensitivity/uncertainty calculations become relatively simple. In order to check the deterministic neutron transport calculations (performed with DORT), calculations were also performed with the Monte Carlo code MCNP. Care was taken to model the 2.0 cm wide gaps between two blanket segments, as the neutron flux behind the vacuum vessel is largely determined by neutrons streaming through these gaps. The resulting neutron flux spectra are in excellent agreement up to the end of the cryostat. It is noted that at this position the attenuation of the neutron flux is about 11 orders of magnitude. The uncertainty in the energy-integrated flux at the beginning of the vacuum vessel and at the beginning of the cryostat was determined in the calculations. The uncertainty appears to be strongly dependent on the exact geometry: if the gaps are filled with stainless steel, the neutron spectrum changes strongly, which results in an uncertainty of 70% in the energy-integrated flux at the beginning of the cryostat in the no-gap geometry, compared to an uncertainty of only 5% in the gap geometry. Therefore, it is essential to take into account the exact geometry in sensitivity/uncertainty calculations. Furthermore, this study shows that an improvement of the covariance data is urgently needed in order to obtain reliable estimates of the uncertainties in response parameters in neutron transport calculations. (orig./GL)

  19. The Impact of Economic Parameter Uncertainty Growth on Regional Energy Demand Assessment

    Directory of Open Access Journals (Sweden)

    Olga Vasilyevna Mazurova

    2017-06-01

The article deals with forecasting studies of energy demand and prices in a region, in view of the complex interconnections between the economy and energy and the growing uncertainty of the future development of the country and its territories. The authors propose a methodological approach which combines the assessment of the price elasticity of energy demand with the optimization of regional energy and fuel supply. In this case, the price elasticity of demand is determined taking into account a comparison of the cost-effectiveness of using different types of fuel and energy by different consumers. The originality of the proposed approach consists in simulating the behaviour of suppliers (energy companies) and large customers (power plants, boiler rooms, industry, transport, the population) depending on energy price changes, existing and new technologies, energy-saving activities and restrictions on fuel supplies. To take into account the uncertainty of future economic and energy conditions, some parameters, such as prospective technical and economic parameters, prices, and technological parameters, are set as intervals of possible values with different probability levels. This approach allows multivariate studies with different combinations of the expected conditions, yielding a range of projected values of the studied indicators. The multivariate calculations show that fuel demand has a nonlinear dependence on consumer characteristics, pricing, the projection horizon, and the nature of the uncertainty of future conditions. The authors show that this effect can be significant and should be considered in forecasts of the development of the fuel and energy sector. The methodological approach and quantitative evaluations can be used to improve the economic and energy development strategies of the country and its regions.
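The interval treatment of uncertain parameters described in this abstract can be sketched with a corner enumeration. The constant-elasticity demand model below is an assumption for illustration, not the authors' model, and all numbers are invented:

```python
import itertools

# Interval sketch: each uncertain parameter is an interval; demand is
# evaluated at every corner combination to bound the projected range.
def demand(base, price, base_price, elasticity):
    return base * (price / base_price) ** elasticity

base, base_price = 100.0, 50.0          # current demand (PJ) and price
price_iv = (55.0, 70.0)                 # projected price interval
elast_iv = (-0.6, -0.2)                 # price-elasticity interval

corners = [demand(base, p, base_price, e)
           for p, e in itertools.product(price_iv, elast_iv)]
print(f"projected demand range: {min(corners):.1f}..{max(corners):.1f} PJ")
```

Because this toy model is monotone in both uncertain parameters, the corner enumeration bounds the full interval range; for non-monotone models an interior search or sampling would be needed.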

  20. Uncertainty from the choice of microphysics scheme in convection-permitting models significantly exceeds aerosol effects

    Directory of Open Access Journals (Sweden)

    B. White

    2017-10-01

This study investigates hydrometeor development and the response to cloud droplet number concentration (CDNC) perturbations in convection-permitting model configurations. We present results from a real-data simulation of deep convection in the Congo basin, an idealised supercell case, and a warm-rain large-eddy simulation (LES). In each case we compare two frequently used double-moment bulk microphysics schemes and investigate the response to CDNC perturbations. We find that the variability between the two schemes, including the response to aerosol, differs widely between these cases. In all cases, differences in the simulated cloud morphology and precipitation are found to be significantly greater between the microphysics schemes than due to CDNC perturbations within each scheme. Further, we show that the response of the hydrometeors to CDNC perturbations differs strongly not only between microphysics schemes, but the inter-scheme variability also differs between cases of convection. Sensitivity tests show that the representation of autoconversion is the dominant factor driving differences in rain production between the microphysics schemes in the idealised precipitating shallow cumulus case and in a subregion of the Congo basin simulations dominated by liquid-phase processes. In this region, rain mass is also shown to be relatively insensitive to the radiative effects of an overlying layer of ice-phase cloud. The conversion of cloud ice to snow is the process responsible for differences in cold cloud bias between the schemes in the Congo. In the idealised supercell case, thermodynamic impacts on the storm system using different microphysics parameterisations can equal those due to aerosol effects. These results highlight the large uncertainty in cloud and precipitation responses to aerosol in convection-permitting simulations and have important implications not only for process studies of aerosol–convection interaction, but also for

  1. Two-point method uncertainty during control and measurement of cylindrical element diameters

    Science.gov (United States)

    Glukhov, V. I.; Shalay, V. V.; Radev, H.

    2018-04-01

The article is devoted to the urgent problem of the reliability of measurements of the geometric specifications of technical products. The purpose of the article is to improve the quality of control of parts' linear sizes by the two-point measurement method. The article's task is to investigate methodical expanded uncertainties in measuring the linear sizes of cylindrical elements. The investigation method is geometric modeling of the shape and location deviations of element surfaces in a rectangular coordinate system. The studies were carried out for elements of various service uses, taking into account their informativeness, corresponding to the kinematic pair classes in theoretical mechanics and the number of constrained degrees of freedom in the datum element function. Cylindrical elements with informativeness of 4, 2, 1 and 0 (zero) were investigated. The estimation of uncertainties in two-point measurements was made by comparing the results of linear dimension measurements with the maximum and minimum functional diameters of the element material. Methodical uncertainty is formed when cylindrical elements with maximum informativeness have shape deviations of the cut and curvature types. Methodical uncertainty is formed by measuring the element's average size for all types of shape deviations. The two-point measurement method cannot take into account the location deviations of a dimensional element, so its use for elements with informativeness less than the maximum creates unacceptable methodical uncertainties in measurements of the maximum, minimum and medium linear dimensions. Similar methodical uncertainties also exist in the arbitration control of the linear dimensions of cylindrical elements by limiting two-point gauges.

  2. Uncertainty estimation of core safety parameters using cross-correlations of covariance matrix

    International Nuclear Information System (INIS)

    Yamamoto, Akio; Yasue, Yoshihiro; Endo, Tomohiro; Kodama, Yasuhiro; Ohoka, Yasunori; Tatsumi, Masahiro

    2013-01-01

An uncertainty reduction method for core safety parameters, for which measurement values are not obtained, is proposed. We empirically recognize that there exist correlations among the prediction errors of core safety parameters, e.g., a correlation between the control rod worth and the assembly relative power at the corresponding position. Correlations of errors among core safety parameters are theoretically estimated using the covariance of cross sections and the sensitivity coefficients of core parameters. The estimated correlations of errors among core safety parameters are verified through the direct Monte Carlo sampling method. Once the correlation of errors among core safety parameters is known, we can estimate the uncertainty of a safety parameter for which a measurement value is not obtained. (author)
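The core of this idea, using a known error correlation to shrink the uncertainty of an unmeasured parameter, reduces to a conditional-variance formula for jointly Gaussian errors. The parameters and numbers below are illustrative:

```python
import numpy as np

# If the prediction errors of two core parameters are correlated, using a
# measurement of one reduces the uncertainty of the other. For jointly
# Gaussian errors: sigma_b|a = sigma_b * sqrt(1 - rho^2). Numbers invented.
sigma_a = 0.04     # uncertainty of the measured parameter (e.g. assembly power)
sigma_b = 0.05     # prior uncertainty of the unmeasured one (e.g. rod worth)
rho = 0.8          # error correlation estimated from covariance data

sigma_b_post = sigma_b * np.sqrt(1 - rho**2)
print(f"rod-worth uncertainty: {sigma_b:.3f} -> {sigma_b_post:.3f} "
      f"after using the correlated measurement")
```

The stronger the estimated correlation, the larger the reduction; at rho = 0 the measurement tells us nothing about the unmeasured parameter.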

  3. Unexpected uncertainty, volatility and decision-making

    Directory of Open Access Journals (Sweden)

    Amy Rachel Bland

    2012-06-01

The study of uncertainty in decision making is receiving greater attention in the fields of cognitive and computational neuroscience. Several lines of evidence are beginning to elucidate different variants of uncertainty. In particular, risk, ambiguity, and expected and unexpected forms of uncertainty are well articulated in the literature. In this article we review both empirical and theoretical evidence arguing for a potential distinction between three forms of uncertainty: expected uncertainty, unexpected uncertainty and volatility. Particular attention is devoted to exploring the distinction between unexpected uncertainty and volatility, which has been less appreciated in the literature. This includes evidence from computational modelling, neuromodulation, neuroimaging and electrophysiological studies. We further address the possible differentiation of cognitive control mechanisms used to deal with these forms of uncertainty. In particular, we explore a role for conflict monitoring and for the temporal integration of information into working memory. Finally, we explore whether the Dual Modes of Control theory provides a theoretical framework for understanding the distinction between unexpected uncertainty and volatility.

  4. Saccadic Suppression of Flash Detection: the Uncertainty Theory VS. Alternative Theories.

    Science.gov (United States)

    Greenhouse, Daniel Stephen

Helmholtz [1] and others have proposed that when a saccadic eye movement occurs, stability of the visual world is maintained by a process that utilizes a corollary of the efferent motor signal for the eye movement, allowing the visual frame of reference to translate equal in magnitude, but opposite in sign, to the movement itself. This process is now known to be synchronous neither with the saccadic trajectory [2,3] nor in all parts of the visual field [4]. In addition, this process has been shown to have variability [2] whereby the perceived visual direction of a flash presented to a fixed retinal locus during a saccade may change from trial to trial. Hence, uncertainty with respect to the visual location of a stimulus may exist during and just before a saccade. It has been established for normal vision that uncertainty produces a decline in detectability of a weak stimulus [5,6,7]. The research reported in this dissertation was performed to test the notion, first suggested by L. Matin [8], that uncertainty is responsible for saccadic suppression, the decline in detectability that has been reported [9,10,11] for a brief flash presented during a saccade. After having established the existence of suppression under the conditions we employed (a 1° foveal flash occurring 2.5° into a 10° voluntary saccade, presented against an illuminated background), we conducted an initial test of the uncertainty theory. We employed a pedestal (a flash at the spatial, temporal, and chromatic locus of the stimulus, occurring on all trials, and sufficiently intense as to be visible during saccades) in an attempt to reduce uncertainty. Suppression was nearly eliminated for all subjects. We interpreted this result in terms of the uncertainty theory, but were unable to reject alternative theories of suppression, which include forms of neural inhibition [10,11], increased noise level in the retina during saccades [12], and metacontrast masking [13]. The next experiment

  5. The Harm that Underestimation of Uncertainty Does to Our Community: A Case Study Using Sunspot Area Measurements

    Science.gov (United States)

    Munoz-Jaramillo, Andres

    2017-08-01

Data products in heliospheric physics are very often provided without clear estimates of uncertainty. From helioseismology in the solar interior all the way to in situ solar wind measurements beyond 1 AU, uncertainty estimates are typically hard for users to find (buried inside long documents that are separate from the data products) or simply non-existent. There are two main reasons why uncertainty measurements are hard to find: (1) understanding instrumental systematic errors is given a much higher priority inside instrument teams; and (2) the desire to perfectly understand all sources of uncertainty postpones indefinitely the actual quantification of uncertainty in our measurements. Using the cross-calibration of 200 years of sunspot area measurements as a case study, in this presentation we will discuss the negative impact that inadequate measurements of uncertainty have on users, through the appearance of toxic and unnecessary controversies, and on data providers, through the creation of unrealistic expectations regarding the information that can be extracted from their data. We will discuss how empirical estimates of uncertainty represent a very good alternative to not providing any estimates at all, and finalize by discussing the bare essentials that should become our standard practice for future instruments and surveys.

  6. Uncertainties in s-process nucleosynthesis in massive stars determined by Monte Carlo variations

    Science.gov (United States)

    Nishimura, N.; Hirschi, R.; Rauscher, T.; St. J. Murphy, A.; Cescutti, G.

    2017-08-01

    The s-process in massive stars produces the weak component of the s-process (nuclei up to A ˜ 90), in amounts that match solar abundances. For heavier isotopes, such as barium, production through neutron capture is significantly enhanced in very metal-poor stars with fast rotation. However, detailed theoretical predictions for the resulting final s-process abundances have important uncertainties caused both by the underlying uncertainties in the nuclear physics (principally neutron-capture reaction and β-decay rates) as well as by the stellar evolution modelling. In this work, we investigated the impact of nuclear-physics uncertainties relevant to the s-process in massive stars. Using a Monte Carlo based approach, we performed extensive nuclear reaction network calculations that include newly evaluated upper and lower limits for the individual temperature-dependent reaction rates. We found that most of the uncertainty in the final abundances is caused by uncertainties in the neutron-capture rates, while β-decay rate uncertainties affect only a few nuclei near s-process branchings. The s-process in rotating metal-poor stars shows quantitatively different uncertainties and key reactions, although the qualitative characteristics are similar. We confirmed that our results do not significantly change at different metallicities for fast rotating massive stars in the very low metallicity regime. We highlight which of the identified key reactions are realistic candidates for improved measurement by future experiments.
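The Monte Carlo rate-variation approach described here can be sketched on a toy scale: vary the rates of a tiny capture chain within uncertainty bands and look at the spread in a final abundance. The chain, rates and factor-of-two bands below are invented for illustration, far simpler than a real s-process network:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy Monte Carlo rate variation on a 3-isotope chain A -> B -> C.
def burn(rate_ab, rate_bc, t=2.0, steps=1000):
    """Explicit-Euler integration of dA/dt = -k_AB*A, dB/dt = k_AB*A - k_BC*B."""
    a, b, dt = 1.0, 0.0, t / steps
    for _ in range(steps):
        a, b = a - rate_ab * a * dt, b + (rate_ab * a - rate_bc * b) * dt
    return b

nominal = burn(1.0, 0.5)
# log-uniform sampling of a multiplication factor in [1/2, 2] for each rate
factors = 2.0 ** rng.uniform(-1.0, 1.0, size=(400, 2))
final_b = np.array([burn(f1, 0.5 * f2) for f1, f2 in factors])
print(f"final B: nominal {nominal:.3f}, "
      f"90% MC band {np.percentile(final_b, 5):.3f}..{np.percentile(final_b, 95):.3f}")
```

Ranking which sampled rate factor correlates most strongly with the output is, in miniature, how such studies identify key reactions for future measurement.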

  7. Artificial neural network surrogate development of equivalence models for nuclear data uncertainty propagation in scenario studies

    Directory of Open Access Journals (Sweden)

    Krivtchik Guillaume

    2017-01-01

Scenario studies simulate the whole fuel cycle over a period of time, from the extraction of natural resources to geological storage. Through the comparison of different reactor fleet evolutions and fuel management options, they constitute a decision-making support. Consequently, uncertainty propagation studies, which are necessary to assess the robustness of such studies, are strategic. Among the numerous types of physical models in scenario computation that generate uncertainty, the equivalence models, built for calculating fresh fuel enrichment (for instance, plutonium content in PWR MOX) so as to be representative of nominal fuel behavior, are very important. The equivalence condition is generally formulated in terms of end-of-cycle mean core reactivity. As this results from a physical computation, it is therefore associated with an uncertainty. A state of the art of equivalence models is presented and discussed. It is shown that the existing equivalence models implemented in scenario codes, such as COSI6, are not suited to uncertainty propagation computation, for the following reasons: (i) existing analytical models neglect irradiation, which has a strong impact on the result and its uncertainty; (ii) current black-box models are not suited to cross-section perturbation management; and (iii) models based on transport and depletion codes are too time-consuming for stochastic uncertainty propagation. A new type of equivalence model based on Artificial Neural Networks (ANN) has been developed, constructed with data calculated with neutron transport and depletion codes. The model inputs are the fresh fuel isotopy, the irradiation parameters (burnup, core fractionation, etc.), cross-section perturbations and the equivalence criterion (for instance, the core target reactivity in pcm at the end of the irradiation cycle). The model output is the fresh fuel content such that the target reactivity is reached at the end of the irradiation cycle. Those models are built and
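The surrogate idea can be sketched with a minimal ANN-style fit: a single random tanh hidden layer with a least-squares output layer (extreme-learning-machine style, not the paper's training scheme), mapping (burnup, target reactivity) to fresh-fuel plutonium content. The smooth function standing in for the expensive transport/depletion code, and all numbers, are invented:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for the expensive transport/depletion code (invented, near-linear).
def code(burnup, rho_target):
    return 0.05 + 0.002 * burnup + 0.08 * rho_target

X = rng.uniform([20.0, 0.0], [60.0, 0.5], size=(500, 2))   # (GWd/t, arbitrary)
y = code(X[:, 0], X[:, 1])                                 # Pu mass fraction

Xn = (X - X.mean(0)) / X.std(0)            # normalise inputs
W1 = rng.normal(size=(2, 32))              # random fixed hidden layer
b1 = rng.normal(size=32)
H = np.tanh(Xn @ W1 + b1)                  # hidden features
# fit the output layer (plus bias) by least squares
w2, *_ = np.linalg.lstsq(np.c_[H, np.ones(len(y))], y, rcond=None)

pred = np.c_[H, np.ones(len(y))] @ w2
rmse = np.sqrt(np.mean((pred - y) ** 2))
print(f"surrogate RMSE on training data: {rmse:.5f} (Pu mass fraction)")
```

Once fitted, evaluating the surrogate costs microseconds, which is what makes stochastic uncertainty propagation over many scenario runs tractable.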

  8. Model uncertainty in safety assessment

    International Nuclear Information System (INIS)

    Pulkkinen, U.; Huovinen, T.

    1996-01-01

    Uncertainty analyses are an essential part of any risk assessment. Usually the uncertainties of reliability model parameter values are described by probability distributions, and the uncertainty is propagated through the whole risk model. In addition to the parameter uncertainties, the assumptions behind the risk models may be based on insufficient experimental observations, and the models themselves may not be exact descriptions of the phenomena under analysis. The description and quantification of this type of uncertainty, model uncertainty, is the topic of this report. Model uncertainty is characterized and some approaches to model and quantify it are discussed. The emphasis is on so-called mixture models, which have been applied in PSAs. Some possible disadvantages of the mixture model are addressed. In addition to quantitative analyses, qualitative analysis is also discussed briefly. To illustrate the models, two simple case studies on failure intensity and human error modeling are described. In both examples, the analysis is based on simple mixture models, which are observed to be applicable in PSA analyses. (orig.) (36 refs., 6 figs., 2 tabs.)
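
The mixture-model treatment of model uncertainty can be shown with a minimal numerical sketch: the predictive distribution is a weighted mixture over candidate models, so the mixture variance contains a between-model term in addition to the within-model variances. The weights and failure-intensity parameters below are purely illustrative, not values from the report.

```python
import math

# Two candidate models for a failure intensity (per hour), with weights
# expressing model uncertainty; all numbers are illustrative.
models = [
    {"w": 0.7, "mean": 1.2e-3, "sd": 0.3e-3},  # model A
    {"w": 0.3, "mean": 2.0e-3, "sd": 0.5e-3},  # model B
]

# Mixture mean, and mixture variance = within-model + between-model parts.
mix_mean = sum(m["w"] * m["mean"] for m in models)
mix_var = sum(m["w"] * (m["sd"] ** 2 + (m["mean"] - mix_mean) ** 2)
              for m in models)
mix_sd = math.sqrt(mix_var)
```

The between-model term is what distinguishes model uncertainty from pure parameter uncertainty: here `mix_sd` exceeds the standard deviation of either component model alone because the two models disagree.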

  9. Model uncertainty in safety assessment

    Energy Technology Data Exchange (ETDEWEB)

    Pulkkinen, U; Huovinen, T [VTT Automation, Espoo (Finland). Industrial Automation

    1996-01-01

    Uncertainty analyses are an essential part of any risk assessment. Usually the uncertainties of reliability model parameter values are described by probability distributions, and the uncertainty is propagated through the whole risk model. In addition to the parameter uncertainties, the assumptions behind the risk models may be based on insufficient experimental observations, and the models themselves may not be exact descriptions of the phenomena under analysis. The description and quantification of this type of uncertainty, model uncertainty, is the topic of this report. Model uncertainty is characterized and some approaches to model and quantify it are discussed. The emphasis is on so-called mixture models, which have been applied in PSAs. Some possible disadvantages of the mixture model are addressed. In addition to quantitative analyses, qualitative analysis is also discussed briefly. To illustrate the models, two simple case studies on failure intensity and human error modeling are described. In both examples, the analysis is based on simple mixture models, which are observed to be applicable in PSA analyses. (orig.) (36 refs., 6 figs., 2 tabs.).

  10. Estimating the uncertainty in thermochemical calculations for oxygen-hydrogen combustors

    Science.gov (United States)

    Sims, Joseph David

    The thermochemistry program CEA2 was combined with the statistical thermodynamics program PAC99 in a Monte Carlo simulation to determine the uncertainty in several CEA2 output variables due to uncertainty in thermodynamic reference values for the reactant and combustion species. In all, six typical performance parameters were examined, along with the required intermediate calculations (five gas properties and eight stoichiometric coefficients), for three hydrogen-oxygen combustors: a main combustor, an oxidizer preburner and a fuel preburner. The three combustors were analyzed in two different modes: design mode, where, for the first time, the uncertainty in thermodynamic reference values---taken from the literature---was considered (inputs to CEA2 were specified and so had no uncertainty); and data reduction mode, where inputs to CEA2 did have uncertainty. The inputs to CEA2 were contrived experimental measurements that were intended to represent the typical combustor testing facility. In design mode, uncertainties in the performance parameters were on the order of 0.1% for the main combustor, on the order of 0.05% for the oxidizer preburner and on the order of 0.01% for the fuel preburner. Thermodynamic reference values for H2O were the dominant sources of uncertainty, as was the assigned enthalpy for liquid oxygen. In data reduction mode, uncertainties in performance parameters increased significantly as a result of the uncertainties in experimental measurements compared to uncertainties in thermodynamic reference values. Main combustor and fuel preburner theoretical performance values had uncertainties of about 0.5%, while the oxidizer preburner had nearly 2%. Associated experimentally-determined performance values for all three combustors were 3% to 4%. The dominant sources of uncertainty in this mode were the propellant flowrates. These results only apply to hydrogen-oxygen combustors and should not be generalized to every propellant combination. 
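
The design-mode procedure described above can be sketched with a toy stand-in for CEA2: sample a thermodynamic reference value (here the H2O formation enthalpy) within an assumed standard uncertainty, rerun the (fictitious) combustor calculation, and read off the spread of the output. The `combustor_model` function and all numbers below are invented for illustration; only the Monte Carlo structure mirrors the study.

```python
import numpy as np

rng = np.random.default_rng(1)

# Fictitious stand-in for a CEA2-style calculation: chamber temperature as a
# smooth function of the H2O formation enthalpy (kJ/mol) and mixture ratio.
def combustor_model(dhf_h2o, mixture_ratio):
    return 3500.0 * (1.0 + 0.004 * (dhf_h2o + 241.8)) \
                  * (1.0 - 0.01 * abs(mixture_ratio - 8.0))

# Monte Carlo over the thermodynamic reference value
# (assumed 1-sigma uncertainty of 0.05 kJ/mol, illustrative).
n = 20_000
dhf_samples = rng.normal(-241.8, 0.05, n)
tc = combustor_model(dhf_samples, 6.0)
rel_uncertainty = float(tc.std() / tc.mean())  # relative output uncertainty
```

In design mode only the reference data vary; the data-reduction mode of the study would additionally perturb the simulated measurements (flowrates, pressures) fed to the code.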

  11. Uncertainty Management and Sensitivity Analysis

    DEFF Research Database (Denmark)

    Rosenbaum, Ralph K.; Georgiadis, Stylianos; Fantke, Peter

    2018-01-01

    Uncertainty is always there and LCA is no exception to that. The presence of uncertainties of different types and from numerous sources in LCA results is a fact, but managing them allows one to quantify and improve the precision of a study and the robustness of its conclusions. LCA practice sometimes...... suffers from an imbalanced perception of uncertainties, justifying modelling choices and omissions. Identifying prevalent misconceptions around uncertainties in LCA is a central goal of this chapter, aiming to establish a positive approach focusing on the advantages of uncertainty management. The main...... objectives of this chapter are to learn how to deal with uncertainty in the context of LCA, how to quantify it, interpret and use it, and how to communicate it. The subject is approached more holistically than just focusing on relevant statistical methods or purely mathematical aspects. This chapter...

  12. Effect of Streamflow Forecast Uncertainty on Real-Time Reservoir Operation

    Science.gov (United States)

    Zhao, T.; Cai, X.; Yang, D.

    2010-12-01

    Various hydrological forecast products have been applied to real-time reservoir operation, including deterministic streamflow forecast (DSF), DSF-based probabilistic streamflow forecast (DPSF), and ensemble streamflow forecast (ESF), which represent forecast uncertainty in the form of deterministic forecast error, deterministic forecast error-based uncertainty distribution, and ensemble forecast errors, respectively. Compared to previous studies that treat these forecast products as ad hoc inputs for reservoir operation models, this paper attempts to model the uncertainties involved in the various forecast products and explores their effect on real-time reservoir operation decisions. In hydrology, there are various indices reflecting the magnitude of streamflow forecast uncertainty; meanwhile, few models illustrate the forecast uncertainty evolution process. This research introduces Martingale Model of Forecast Evolution (MMFE) from supply chain management and justifies its assumptions for quantifying the evolution of uncertainty in streamflow forecast as time progresses. Based on MMFE, this research simulates the evolution of forecast uncertainty in DSF, DPSF, and ESF, and applies the reservoir operation models (dynamic programming, DP; stochastic dynamic programming, SDP; and standard operation policy, SOP) to assess the effect of different forms of forecast uncertainty on real-time reservoir operation. Through a hypothetical single-objective real-time reservoir operation model, the results illustrate that forecast uncertainty exerts significant effects. Reservoir operation efficiency, as measured by a utility function, decreases as the forecast uncertainty increases. Meanwhile, these effects also depend on the type of forecast product being used. In general, the utility of reservoir operation with ESF is nearly as high as the utility obtained with a perfect forecast; the utilities of DSF and DPSF are similar to each other but not as efficient as ESF. Moreover
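
The core assumption of MMFE, that a forecast for a fixed target period is revised at each stage by an independent, zero-mean update until the period arrives, can be simulated directly. The stage count, per-stage update standard deviations and initial forecast below are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(2)

# MMFE sketch: the forecast of one future streamflow value is revised at each
# stage by an independent zero-mean update; the realization is the final forecast.
n_paths, n_stages = 5000, 6
update_sd = np.array([5.0, 4.0, 3.0, 2.0, 1.0, 0.5])  # assumed update sd per stage
initial_forecast = 100.0

updates = rng.normal(0.0, update_sd, size=(n_paths, n_stages))
forecasts = initial_forecast + np.cumsum(updates, axis=1)
realization = forecasts[:, -1]

# Forecast error shrinks as the target period approaches: this is the
# "evolution of forecast uncertainty" the abstract refers to.
history = [np.full(n_paths, initial_forecast)] \
        + [forecasts[:, k] for k in range(n_stages - 1)]
error_sd = [float(np.std(realization - f)) for f in history]
```

In this framing, a DSF corresponds to using only the current forecast value, a DPSF attaches the remaining-update distribution to it, and an ESF would carry an ensemble of such paths forward.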

  13. Uncertainty analysis of NDA waste measurements using computer simulations

    International Nuclear Information System (INIS)

    Blackwood, L.G.; Harker, Y.D.; Yoon, W.Y.; Meachum, T.R.

    2000-01-01

    Uncertainty assessments for nondestructive radioassay (NDA) systems for nuclear waste are complicated by factors extraneous to the measurement systems themselves. Most notably, characteristics of the waste matrix (e.g., homogeneity) and radioactive source material (e.g., particle size distribution) can have great effects on measured mass values. Under these circumstances, characterizing the waste population is as important as understanding the measurement system in obtaining realistic uncertainty values. When extraneous waste characteristics affect measurement results, the uncertainty results are waste-type specific. The goal becomes to assess the expected bias and precision for the measurement of a randomly selected item from the waste population of interest. Standard propagation-of-errors methods for uncertainty analysis can be very difficult to implement in the presence of significant extraneous effects on the measurement system. An alternative approach that naturally includes the extraneous effects is as follows: (1) Draw a random sample of items from the population of interest; (2) Measure the items using the NDA system of interest; (3) Establish the true quantity being measured using a gold standard technique; and (4) Estimate bias by deriving a statistical regression model comparing the measurements on the system of interest to the gold standard values; similar regression techniques for modeling the standard deviation of the difference values gives the estimated precision. Actual implementation of this method is often impractical. For example, a true gold standard confirmation measurement may not exist. A more tractable implementation is obtained by developing numerical models for both the waste material and the measurement system. A random sample of simulated waste containers generated by the waste population model serves as input to the measurement system model. This approach has been developed and successfully applied to assessing the quantity of
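
Step (4) of the outline, estimating bias and precision by regressing system measurements against gold-standard values, is easy to sketch on simulated containers. Here the containers are generated directly from an invented bias-plus-noise model rather than a physics-based waste and measurement model.

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated "gold standard" masses and biased, noisy NDA measurements
# (units and coefficients are illustrative).
true_mass = rng.uniform(1.0, 50.0, 100)
measured = 1.08 * true_mass - 0.4 + rng.normal(0.0, 1.5, 100)

# Regression of measurement on truth: slope/intercept estimate the bias model,
# the residual standard deviation estimates the precision.
slope, intercept = np.polyfit(true_mass, measured, 1)
residuals = measured - (intercept + slope * true_mass)
precision = float(residuals.std(ddof=2))   # ddof=2: two fitted parameters
```

With a numerical waste-population model in place of the invented generator, the same regression step yields the waste-type-specific bias and precision the abstract describes.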

  14. Detailed modeling of the statistical uncertainty of Thomson scattering measurements

    International Nuclear Information System (INIS)

    Morton, L A; Parke, E; Hartog, D J Den

    2013-01-01

    The uncertainty of electron density and temperature fluctuation measurements is determined by statistical uncertainty introduced by multiple noise sources. In order to quantify these uncertainties precisely, a simple but comprehensive model was made of the noise sources in the MST Thomson scattering system and of the resulting variance in the integrated scattered signals. The model agrees well with experimental and simulated results. The signal uncertainties are then used by our existing Bayesian analysis routine to find the most likely electron temperature and density, with confidence intervals. In the model, photonic noise from scattered light and plasma background light is multiplied by the noise enhancement factor (F) of the avalanche photodiode (APD). Electronic noise from the amplifier and digitizer is added. The amplifier response function shapes the signal and induces correlation in the noise. The data analysis routine fits a characteristic pulse to the digitized signals from the amplifier, giving the integrated scattered signals. A finite digitization rate loses information and can cause numerical integration error. We find a formula for the variance of the scattered signals in terms of the background and pulse amplitudes, and three calibration constants. The constants are measured easily under operating conditions, resulting in accurate estimation of the scattered signals' uncertainty. We measure F ≈ 3 for our APDs, in agreement with other measurements for similar APDs. This value is wavelength-independent, simplifying analysis. The correlated noise we observe is reproduced well using a Gaussian response function. Numerical integration error can be made negligible by using an interpolated characteristic pulse, allowing digitization rates as low as the detector bandwidth. The effect of background noise is also determined
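
The variance formula described, photonic noise from signal and background enhanced by the APD excess-noise factor plus an electronic floor, has the following generic shape. The calibration constants here are hypothetical placeholders, not the MST values.

```python
import math

F = 3.0               # APD noise enhancement factor (value measured in the paper)
c_photon = 0.8        # hypothetical calibration constant (variance per amplitude)
var_electronic = 4.0  # hypothetical electronic-noise variance (amp + digitizer)

def signal_variance(pulse_amp, background_amp):
    """Variance of an integrated scattered signal: enhanced shot noise
    from signal plus background light, plus an electronic floor."""
    return F * c_photon * (pulse_amp + background_amp) + var_electronic

def snr(pulse_amp, background_amp):
    return pulse_amp / math.sqrt(signal_variance(pulse_amp, background_amp))
```

Measuring the calibration constants under operating conditions then gives a per-shot uncertainty estimate for each scattered signal, which feeds the Bayesian temperature and density fit.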

  15. Optimisation of decision making under uncertainty throughout field lifetime: A fractured reservoir example

    Science.gov (United States)

    Arnold, Dan; Demyanov, Vasily; Christie, Mike; Bakay, Alexander; Gopa, Konstantin

    2016-10-01

    Assessing the change in uncertainty in reservoir production forecasts over field lifetime is rarely undertaken because of the complexity of joining together the individual workflows. This becomes particularly important in complex fields such as naturally fractured reservoirs. The impact of this problem has been identified in previous studies, and many solutions have been proposed but never implemented on complex reservoir problems due to the computational cost of quantifying uncertainty and optimising the reservoir development, specifically knowing how many and what kind of simulations to run. This paper demonstrates a workflow that propagates uncertainty throughout field lifetime, and into the decision-making process, by a combination of a metric-based approach, multi-objective optimisation and Bayesian estimation of uncertainty. The workflow propagates uncertainty estimates from appraisal into initial development optimisation, then updates uncertainty through history matching and finally propagates it into late-life optimisation. The combination of techniques applied, namely the metric approach and multi-objective optimisation, helps evaluate development options under uncertainty. This was achieved with a significantly reduced number of flow simulations, such that the combined workflow is computationally feasible to run for a real-field problem. This workflow is applied to two synthetic naturally fractured reservoir (NFR) case studies in appraisal, field development, history matching and mid-life EOR stages. The first is a simple sector model, while the second is a more complex full-field example based on a real-life analogue. This study infers geological uncertainty from an ensemble of models based on a carbonate Brazilian outcrop, which is propagated through the field lifetime, before and after the start of production, with the inclusion of production data significantly collapsing the spread of P10-P90 in reservoir forecasts. The workflow links uncertainty

  16. Agriculture-driven deforestation in the tropics from 1990-2015: emissions, trends and uncertainties

    Science.gov (United States)

    Carter, Sarah; Herold, Martin; Avitabile, Valerio; de Bruin, Sytze; De Sy, Veronique; Kooistra, Lammert; Rufino, Mariana C.

    2018-01-01

    Limited data exists on emissions from agriculture-driven deforestation, and available data are typically uncertain. In this paper, we provide comparable estimates of emissions from both all deforestation and agriculture-driven deforestation, with uncertainties for 91 countries across the tropics between 1990 and 2015. Uncertainties associated with input datasets (activity data and emissions factors) were used to combine the datasets, where most certain datasets contribute the most. This method utilizes all the input data, while minimizing the uncertainty of the emissions estimate. The uncertainty of input datasets was influenced by the quality of the data, the sample size (for sample-based datasets), and the extent to which the timeframe of the data matches the period of interest. Area of deforestation, and the agriculture-driver factor (extent to which agriculture drives deforestation), were the most uncertain components of the emissions estimates, thus improvement in the uncertainties related to these estimates will provide the greatest reductions in uncertainties of emissions estimates. Over the period of the study, Latin America had the highest proportion of deforestation driven by agriculture (78%), and Africa had the lowest (62%). Latin America had the highest emissions from agriculture-driven deforestation, and these peaked at 974 ± 148 Mt CO2 yr-1 in 2000-2005. Africa saw a continuous increase in emissions between 1990 and 2015 (from 154 ± 21 to 412 ± 75 Mt CO2 yr-1), so mitigation initiatives could be prioritized there. Uncertainties for emissions from agriculture-driven deforestation are ± 62.4% (average over 1990-2015), and uncertainties were highest in Asia and lowest in Latin America. Uncertainty information is crucial for transparency when reporting, and gives credibility to related mitigation initiatives. We demonstrate that uncertainty data can also be useful when combining multiple open datasets, so we recommend new data
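
Combining datasets "where most certain datasets contribute the most" is the classic inverse-variance weighting scheme, sketched here with two invented emission estimates:

```python
# Inverse-variance weighted combination of independent estimates
# (illustrative values in Mt CO2 per year, not figures from the paper).
estimates = [(900.0, 150.0), (1050.0, 90.0)]  # (mean, 1-sigma uncertainty)

weights = [1.0 / sd ** 2 for _, sd in estimates]
combined = sum(w * mean for (mean, _), w in zip(estimates, weights)) / sum(weights)
combined_sd = (1.0 / sum(weights)) ** 0.5  # smaller than either input uncertainty
```

The combined estimate is pulled toward the more certain dataset, and the combined uncertainty is smaller than that of any single input, which is why using all the input data minimizes the uncertainty of the final estimate.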

  17. Incorporation of various uncertainties in dependent failure-probability estimation

    International Nuclear Information System (INIS)

    Samanta, P.K.; Mitra, S.P.

    1982-01-01

    This paper describes an approach that allows the incorporation of various types of uncertainties in the estimation of dependent failure (common mode failure) probability. The types of uncertainties considered are attributable to data, modeling and coupling. The method developed is applied to a class of dependent failures, i.e., multiple human failures during testing, maintenance and calibration. Estimation of these failures is critical as they have been shown to be significant contributors to core melt probability in pressurized water reactors
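
One simple way to combine the three uncertainty types named above (data, modeling, coupling) is to give each its own distribution and propagate them jointly by Monte Carlo. The beta-factor-style structure and every distribution below are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

rng = np.random.default_rng(6)

n = 50_000
# Data uncertainty on the single-event failure probability (lognormal).
p_single = rng.lognormal(mean=np.log(1e-3), sigma=0.5, size=n)
# Coupling uncertainty on the fraction of failures that are dependent (beta).
coupling = rng.beta(2.0, 18.0, size=n)
# Modeling uncertainty as a multiplicative correction (uniform).
model_factor = rng.uniform(0.8, 1.2, size=n)

p_dependent = model_factor * coupling * p_single
lo, med, hi = np.percentile(p_dependent, [5.0, 50.0, 95.0])
```

The percentile spread of `p_dependent` then carries all three uncertainty sources into any downstream core-melt-probability calculation.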

  18. A Short Review of FDTD-Based Methods for Uncertainty Quantification in Computational Electromagnetics

    Directory of Open Access Journals (Sweden)

    Theodoros T. Zygiridis

    2017-01-01

    Full Text Available We provide a review of selected computational methodologies that are based on the deterministic finite-difference time-domain algorithm and are suitable for the investigation of electromagnetic problems involving uncertainties. As will become apparent, several alternatives capable of performing uncertainty quantification exist for a variety of cases, each one exhibiting different qualities and ranges of applicability, which we intend to point out here. Given the numerous available approaches, the purpose of this paper is to clarify the main strengths and weaknesses of the described methodologies and to help potential readers select the most suitable approach for their problem under consideration.

  19. Robust stabilisation of time-varying delay systems with probabilistic uncertainties

    Science.gov (United States)

    Jiang, Ning; Xiong, Junlin; Lam, James

    2016-09-01

    For robust stabilisation of time-varying delay systems, only sufficient conditions are available to date. A natural question is as follows: if the existing sufficient conditions are not satisfied, and hence no controllers can be found, what can one do to improve the stability performance of time-varying delay systems? This question is addressed in this paper when there is a probabilistic structure on the parameter uncertainty set. A randomised algorithm is proposed to design a state-feedback controller, which stabilises the system over the uncertainty domain in a probabilistic sense. The capability of the designed controller is quantified by the probability of stability of the resulting closed-loop system. The accuracy of the solution obtained from the randomised algorithm is also analysed. Finally, numerical examples are used to illustrate the effectiveness and advantages of the developed controller design approach.
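
The probabilistic notion of stability used here can be illustrated with a scalar toy system: sample the uncertain parameter, close the loop with a candidate gain, and count the fraction of stable outcomes. The system x' = (a - k)x and the uniform uncertainty model are invented for illustration; a time-delay system would require a numerical stability test per sample instead of the closed-form condition.

```python
import random

random.seed(4)

# Scalar closed loop x' = (a - k) x: stable iff a - k < 0, i.e. a < k.
k = 1.5            # candidate state-feedback gain (hypothetical)
n_samples = 100_000

# Uncertain open-loop pole a with an assumed uniform probabilistic structure.
stable = sum(random.uniform(0.0, 2.0) < k for _ in range(n_samples))
probability_of_stability = stable / n_samples   # estimates P(a < k) = 0.75
```

A randomised design algorithm iterates this idea: propose a controller, estimate its probability of stability from samples of the uncertainty set, and keep the candidate with the highest estimate.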

  20. Simulating fuel behavior under transient conditions using FRAPTRAN and uncertainty analysis using Dakota

    International Nuclear Information System (INIS)

    Gomes, Daniel S.; Teixeira, Antonio S.

    2017-01-01

    Although regulatory agencies have shown a special interest in incorporating best-estimate approaches in the fuel licensing process, fuel codes are currently licensed based only on deterministic limits, such as those in 10 CFR 50, and may therefore yield unrealistic safety margins. The concept of uncertainty analysis is employed to manage this risk more realistically. In this study, uncertainties were classified into two categories: probabilistic and epistemic (owing to a lack of pre-existing knowledge in this area). Fuel rods have three sources of uncertainty: manufacturing tolerances, boundary conditions, and physical models. The first step in successfully analyzing the uncertainties involves performing a statistical analysis of the input parameters used throughout the fuel code. The response obtained from this analysis must show proportional index correlations because the uncertainties are globally propagated. The Dakota toolkit was used to analyze the FRAPTRAN transient fuel code. The subsequent sensitivity analyses helped in identifying the key parameters with the highest correlation indices, including the peak cladding temperature and the time required for cladding failure. The uncertainty analysis was performed using an IFA-650-5 fuel rod, in line with the tests performed in the Halden Project in Norway. The main objectives of the Halden project included studying the ballooning and rupture processes. The results of this experiment demonstrate the accuracy and applicability of the physical models in evaluating the thermal conductivity, mechanical model, and fuel swelling formulations. (author)
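
The sensitivity step, ranking uncertain inputs by their correlation with a response, can be sketched without FRAPTRAN or Dakota by sampling a hypothetical peak-cladding-temperature model. All coefficients and distributions below are invented.

```python
import numpy as np

rng = np.random.default_rng(5)

n = 2000
gap = rng.normal(80.0, 8.0, n)           # gap conductance, illustrative units
power = rng.normal(40.0, 2.0, n)         # linear rod power
conductivity = rng.normal(3.0, 0.1, n)   # fuel thermal conductivity

# Hypothetical peak cladding temperature response (not FRAPTRAN).
pct = 1000.0 + 6.0 * power - 2.0 * gap - 50.0 * conductivity \
      + rng.normal(0.0, 5.0, n)

# Correlation index of each input with the response, ranked by magnitude.
inputs = {"power": power, "gap": gap, "conductivity": conductivity}
corr = {name: float(np.corrcoef(vals, pct)[0, 1]) for name, vals in inputs.items()}
ranked = sorted(corr, key=lambda name: abs(corr[name]), reverse=True)
```

Dakota automates the same loop at scale (sampling, running the code, and computing the correlation or rank-correlation indices used to identify the key parameters).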

  1. Simulating fuel behavior under transient conditions using FRAPTRAN and uncertainty analysis using Dakota

    Energy Technology Data Exchange (ETDEWEB)

    Gomes, Daniel S.; Teixeira, Antonio S., E-mail: dsgomes@ipen.br, E-mail: teixeira@ipen [Instituto de Pesquisas Energéticas e Nucleares (IPEN/CNEN-SP), São Paulo, SP (Brazil)

    2017-07-01

    Although regulatory agencies have shown a special interest in incorporating best-estimate approaches in the fuel licensing process, fuel codes are currently licensed based only on deterministic limits, such as those in 10 CFR 50, and may therefore yield unrealistic safety margins. The concept of uncertainty analysis is employed to manage this risk more realistically. In this study, uncertainties were classified into two categories: probabilistic and epistemic (owing to a lack of pre-existing knowledge in this area). Fuel rods have three sources of uncertainty: manufacturing tolerances, boundary conditions, and physical models. The first step in successfully analyzing the uncertainties involves performing a statistical analysis of the input parameters used throughout the fuel code. The response obtained from this analysis must show proportional index correlations because the uncertainties are globally propagated. The Dakota toolkit was used to analyze the FRAPTRAN transient fuel code. The subsequent sensitivity analyses helped in identifying the key parameters with the highest correlation indices, including the peak cladding temperature and the time required for cladding failure. The uncertainty analysis was performed using an IFA-650-5 fuel rod, in line with the tests performed in the Halden Project in Norway. The main objectives of the Halden project included studying the ballooning and rupture processes. The results of this experiment demonstrate the accuracy and applicability of the physical models in evaluating the thermal conductivity, mechanical model, and fuel swelling formulations. (author)

  2. African anthropogenic combustion emission inventory: specificities and uncertainties

    Science.gov (United States)

    Sekou, K.; Liousse, C.; Eric-michel, A.; Veronique, Y.; Thierno, D.; Roblou, L.; Toure, E. N.; Julien, B.

    2015-12-01

    Fossil fuel and biofuel emissions of gases and particles in Africa are expected to increase significantly in the near future, particularly due to the growth of African cities. In addition, large African savannah fires occur each year during the dry season, mainly for socio-economic purposes. In this study, we will present the most recent developments of African anthropogenic combustion emission inventories, stressing African specificities. (1) A regional fossil fuel and biofuel inventory for gases and particulates will be presented for Africa at a resolution of 0.25° x 0.25° from 1990 to 2012. For this purpose, the original database of Liousse et al. (2014) has been used after modification of emission factors and of updated regional fuel consumption, including new emitter categories (waste burning, flaring) and new activity sectors (i.e. disaggregation of transport into sub-sectors including two-wheelers). In terms of emission factors, new measured values will be presented and compared to the literature, with a focus on aerosols. They result from measurement campaigns organized in the frame of the DACCIWA European program for each kind of Africa-specific anthropogenic source in 2015, in Abidjan (Ivory Coast), Cotonou (Benin) and in the Laboratoire d'Aérologie combustion chamber. Finally, a more detailed spatial distribution of emissions will be proposed at the country level to better take into account road distributions and population densities. (2) Large uncertainties still remain in biomass burning emission inventory estimates, especially over Africa, between different datasets such as GFED and AMMABB. Sensitivity tests will be presented to investigate uncertainties in the emission inventories, applying the methodologies used for the AMMABB and GFED inventories respectively. Then, the relative importance of each source (fossil fuel, biofuel and biomass burning inventories) on the budgets of carbon monoxide, nitrogen oxides, sulfur dioxide, black and organic carbon, and volatile

  3. Uncertainty in Measurement: Procedures for Determining Uncertainty With Application to Clinical Laboratory Calculations.

    Science.gov (United States)

    Frenkel, Robert B; Farrance, Ian

    2018-01-01

    The "Guide to the Expression of Uncertainty in Measurement" (GUM) is the foundational document of metrology. Its recommendations apply to all areas of metrology including metrology associated with the biomedical sciences. When the output of a measurement process depends on the measurement of several inputs through a measurement equation or functional relationship, the propagation of uncertainties in the inputs to the uncertainty in the output demands a level of understanding of the differential calculus. This review is intended as an elementary guide to the differential calculus and its application to uncertainty in measurement. The review is in two parts. In Part I, Section 3, we consider the case of a single input and introduce the concepts of error and uncertainty. Next we discuss, in the following sections in Part I, such notions as derivatives and differentials, and the sensitivity of an output to errors in the input. The derivatives of functions are obtained using very elementary mathematics. The overall purpose of this review, here in Part I and subsequently in Part II, is to present the differential calculus for those in the medical sciences who wish to gain a quick but accurate understanding of the propagation of uncertainties. © 2018 Elsevier Inc. All rights reserved.
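
The propagation the review builds up to can be stated compactly: for y = f(x1, ..., xn) with independent inputs, u(y)^2 = sum over i of (df/dxi)^2 u(xi)^2. A minimal numeric sketch for a ratio-type laboratory calculation (all values illustrative), with the sensitivity coefficients obtained by finite differences:

```python
import math

def f(urine_conc, plasma_conc):
    # Illustrative ratio-type result, e.g. a clearance-style calculation.
    return urine_conc / plasma_conc

x = (8.0, 0.10)    # measured input values (illustrative units)
u = (0.2, 0.005)   # standard uncertainties of the inputs

def sensitivity(i, h=1e-6):
    """Central finite-difference estimate of the partial derivative df/dx_i."""
    lo, hi = list(x), list(x)
    lo[i] -= h
    hi[i] += h
    return (f(*hi) - f(*lo)) / (2.0 * h)

# GUM law of propagation of uncertainty (independent inputs).
u_y = math.sqrt(sum((sensitivity(i) * u[i]) ** 2 for i in range(len(x))))
rel = u_y / f(*x)   # relative combined standard uncertainty
```

For y = x1/x2 this reproduces the familiar relative-uncertainty rule (u(y)/y)^2 = (u(x1)/x1)^2 + (u(x2)/x2)^2, which is the kind of result derived step by step in the review.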

  4. Wind energy: Overcoming inadequate wind and modeling uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Kane, Vivek

    2010-09-15

    'Green Energy' is the call of the day, and the significance of wind energy can never be overemphasized. But the key question here is: what if the wind resources are inadequate? Studies reveal that the probability of finding favorable wind at a given place on land is only 15%. Moreover, there are inherent uncertainties associated with the wind business. Can we overcome inadequate wind resources? Can we scientifically quantify uncertainty and model it to make business sense? This paper proposes a solution by way of breakthrough wind technologies, combined with advanced tools for financial modeling, enabling vital business decisions.

  5. The effect of uncertainty and cooperative behavior on operational performance: Evidence from Brazilian firms

    Directory of Open Access Journals (Sweden)

    Eliane Pereira Zamith Brito

    2017-12-01

    Full Text Available This study aims to examine the effect of managers' uncertainty on cooperative behavior in interorganizational relationships, and how this affects operational performance. We conducted a survey with 225 Brazilian managers, and analyzed the data using confirmatory factor analysis and structural equation modelling. Results show: (a) a negative influence of uncertainty of state on operational performance; (b) a positive influence of uncertainty of effect on uncertainty of response; (c) a significant influence of uncertainty of response on cooperative behavior; and (d) a positive influence of cooperative behavior on performance. The results indicated that cooperation and uncertainty accounted for 18.8% of the variability of operational performance. Considering the uncertainty that plagues Latin societies, this study can help to create more efficient ways to deal with the phenomenon. Rather than turning a blind eye to uncertainty, our study underscores it and treats it like any other business environment issue.

  6. Uncertainty of a hydrological climate change impact assessment - Is it really all about climate uncertainty?

    Science.gov (United States)

    Honti, Mark; Reichert, Peter; Scheidegger, Andreas; Stamm, Christian

    2013-04-01

    Climate change impact assessments have become more and more popular in hydrology since the mid-1980s, with another boost after the publication of the IPCC AR4 report. Over hundreds of impact studies a quasi-standard methodology has emerged, shaped mainly by the growing public demand for predicting how water resources management or flood protection should change in the near future. The "standard" workflow considers future climate under a specific IPCC emission scenario simulated by global circulation models (GCMs), possibly downscaled by a regional climate model (RCM) and/or a stochastic weather generator. The output from the climate models is typically corrected for bias before feeding it into a calibrated hydrological model, which is run on the past and future meteorological data to analyse the impacts of climate change on the hydrological indicators of interest. The impact predictions are as uncertain as any forecast that tries to describe the behaviour of an extremely complex system decades into the future. Future climate predictions are uncertain due to scenario uncertainty and to GCM model uncertainty, which is obvious at any resolution finer than continental scale. As in any hierarchical model system, uncertainty propagates through the descendant components. Downscaling increases uncertainty through the deficiencies of RCMs and/or weather generators. Bias correction adds a strong deterministic shift to the input data. Finally, the predictive uncertainty of the hydrological model ends the cascade that leads to the total uncertainty of the hydrological impact assessment. There is an emerging consensus between many studies on the relative importance of the different uncertainty sources. The prevailing perception is that GCM uncertainty dominates hydrological impact studies. There are only a few studies which found that the predictive uncertainty of hydrological models can be in the same range as, or even larger than, climatic uncertainty. We carried out a

  7. A Survey of Clinical Uncertainty from the Paediatric Basic Specialist Trainee Perspective

    LENUS (Irish Health Repository)

    O’Neill, MB

    2017-06-01

    This study was undertaken to evaluate uncertainty from the Basic Specialist Trainee perspective. The survey of trainees explored 1) factors in decision making, 2) the personal impact of uncertainty, 3) the responses to both clinical errors and challenges to their decision making, and 4) the potential strategies to address uncertainty. Forty-one (93%) of the trainees surveyed responded. Important factors in decision making were clinical knowledge and senior colleagues' opinions. Sixty percent experienced significant anxiety post call as a consequence of their uncertainty. When errors are made by colleagues, the trainees' responses are acceptance (52.5%) and sympathy (32%). Trainees are strongly influenced by the opinions of senior colleagues, often changing their opinions even after making confident decisions. Solutions to address uncertainty include enhanced knowledge translation and, to a lesser extent, enhanced personal awareness and resilience awareness. To enhance the training experience for BSTs and lessen the uncertainty they experience, these strategies need to be enacted within the training milieu.

  8. Uncertainty and simulation

    International Nuclear Information System (INIS)

    Depres, B.; Dossantos-Uzarralde, P.

    2009-01-01

    More than 150 researchers and engineers from universities and the industrial world met to discuss the new methodologies developed for assessing uncertainty. About 20 papers were presented, and the main topics were: methods to study the propagation of uncertainties, sensitivity analysis, nuclear data covariances and multi-parameter optimisation. This report gathers the contributions of CEA researchers and engineers.

  9. Decoherence effect on quantum-memory-assisted entropic uncertainty relations

    Science.gov (United States)

    Ming, Fei; Wang, Dong; Huang, Ai-Jun; Sun, Wen-Yang; Ye, Liu

    2018-01-01

    The uncertainty principle provides a bound on the attainable precision of measurements of any two incompatible observables, and thereby plays a nontrivial role in quantum precision measurement. In this work, we observe the dynamical features of the quantum-memory-assisted entropic uncertainty relations (EUR) for a pair of incompatible measurements in an open system characterized by local generalized amplitude damping (GAD) noises. Herein, we derive the dynamical evolution of the entropic uncertainty with respect to the measurement affected by the canonical GAD noises when particle A is initially entangled with quantum memory B. Specifically, we examine the dynamics of the EUR in three realistic scenarios: in the first, particle A is affected by environmental (GAD) noise while particle B, the quantum memory, is free from any noise; in the second, particle B is affected by the external noise while particle A is not; and in the third, both particles suffer from the noise. By analytical methods, it turns out that the uncertainty is not fully dependent on the quantum correlation evolution of the composite system consisting of A and B, but rather on the minimal conditional entropy of the measured subsystem. Furthermore, we present a possible physical interpretation of the behavior of the uncertainty evolution in terms of the mixedness of the observed system; we argue that the uncertainty may be dramatically correlated with the systematic mixedness. We also put forward a simple and effective strategy to reduce the measurement uncertainty of interest using quantum partially collapsed measurements. Therefore, our explorations might offer an insight into the dynamics of the entropic uncertainty relation in a realistic system, and be of importance to quantum precision measurement during quantum information processing.
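
    For reference, the quantum-memory-assisted entropic uncertainty relation studied in records like this one is usually written in the Berta et al. form (a standard result of the field, not a claim specific to this abstract):

    ```latex
    S(Q|B) + S(R|B) \;\ge\; \log_2 \frac{1}{c} + S(A|B),
    \qquad
    c = \max_{i,j} \bigl|\langle \psi_i | \phi_j \rangle\bigr|^2 ,
    ```

    where S(Q|B) and S(R|B) are the conditional von Neumann entropies after measuring the incompatible observables Q or R on particle A, c quantifies the overlap of their eigenbases, and the memory term S(A|B) can be negative for entangled states, tightening the bound.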

  10. Radiotherapy for breast cancer: respiratory and set-up uncertainties

    International Nuclear Information System (INIS)

    Saliou, M.G.; Giraud, P.; Simon, L.; Fournier-Bidoz, N.; Fourquet, A.; Dendale, R.; Rosenwald, J.C.; Cosset, J.M.

    2005-01-01

    Adjuvant radiotherapy has been shown to significantly reduce locoregional recurrence, but this advantage is associated with increased cardiovascular and pulmonary morbidity. All uncertainties inherent to conformal radiation therapy must be identified in order to increase the precision of treatment; misestimation of these uncertainties increases the potential risk of geometrical misses with, as a consequence, under-dosage of the tumor and/or overdosage of healthy tissues. Geometric uncertainties due to respiratory movements or set-up errors are well known. Two strategies have been proposed to limit their effect: quantification of these uncertainties, which are then taken into account in the final calculation of safety margins, and/or reduction of respiratory and set-up uncertainties by efficient immobilization or gating systems. Measured on portal films with two tangential fields, the CLD (central lung distance), defined as the distance between the deep field edge and the interior chest wall at the central axis, seems to be the best predictor of set-up uncertainties. Using the CLD, estimated mean set-up errors from the literature are 3.8 and 3.2 mm for the systematic and random errors, respectively. These depend partly on the type of immobilization device and could be reduced by the use of portal imaging systems. Furthermore, the breast is mobile during respiration, with motion amplitudes as high as 0.8 to 10 mm in the anteroposterior direction. Respiratory gating techniques, currently under evaluation, have the potential to reduce the effect of these movements. Each radiotherapy department should perform its own assessments and determine the geometric uncertainties with respect to the equipment used and its particular treatment practices. This paper is a review of the main geometric uncertainties in breast treatment, due to respiration and set-up, and of the solutions proposed to limit their impact. (author)
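
    The abstract quotes mean systematic (3.8 mm) and random (3.2 mm) set-up errors. A widely cited population-based recipe for turning such numbers into a CTV-to-PTV safety margin is the van Herk formula M = 2.5Σ + 0.7σ; the recipe itself is not from this paper, so the sketch below is illustrative only:

    ```python
    # Hedged sketch: combining systematic (Sigma) and random (sigma) set-up
    # errors into a CTV-to-PTV margin using the van Herk recipe
    # M = 2.5 * Sigma + 0.7 * sigma. The recipe is NOT taken from the paper
    # summarised above; only the error magnitudes are.

    def van_herk_margin(sigma_systematic_mm: float, sigma_random_mm: float) -> float:
        """Margin (mm) aiming to give 90% of patients at least 95% of the dose."""
        return 2.5 * sigma_systematic_mm + 0.7 * sigma_random_mm

    margin = van_herk_margin(3.8, 3.2)  # values quoted in the abstract
    print(f"CTV-to-PTV margin: {margin:.1f} mm")  # → 11.7 mm
    ```

    The systematic term dominates because systematic errors shift the whole dose distribution for every fraction, while random errors merely blur it.
    
    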

  11. Doppler reactivity uncertainties and their effect upon a hypothetical LOF accident

    International Nuclear Information System (INIS)

    Malloy, D.J.

    1976-01-01

    The statistical uncertainties and the major methodological errors which contribute to the Doppler feedback uncertainty were reviewed and investigated. Improved estimates for the magnitude of each type of uncertainty were established. The generally applied reactivity feedback methodology has been extended by explicitly treating the coupling effect which exists between the various feedback components. The improved methodology was specifically applied to the coupling of Doppler and sodium void reactivities. In addition, the description of the temperature dependence of the Doppler feedback has been improved by the use of a two-constant formula on a global and regional basis. Feedback and coupling coefficients are presented as a first comparison of the improved and the currently applied methods. Further, the energy release which results from hypothetical disassembly accidents was simulated with a special response surface in the parametric safety evaluation code PARSEC. The impact of the improved feedback methodology and of Doppler coefficient uncertainties was illustrated by the usual parametric relationship between available work-energy and the Doppler coefficient. The work-energy was calculated with the VENUS-II disassembly code and was represented as a response surface in PARSEC. Additionally, the probability distribution for available work-energy, which results from the statistical uncertainty of the Doppler coefficient, was calculated for the current and the improved feedback methodology. The improved feedback description yielded about a 16 percent higher average value for the work-energy. A substantially larger increase is found at the high-yield end of the spectrum: the probability of work-energy above 500 MJ increased by about a factor of ten.

  12. Gauge theories under incorporation of a generalized uncertainty principle

    International Nuclear Information System (INIS)

    Kober, Martin

    2010-01-01

    We consider an extension of gauge theories based on the assumption of a generalized uncertainty principle implying a minimal length scale. A modification of the usual uncertainty principle implies an extended shape of matter field equations such as the Dirac equation. If invariance of such a generalized field equation under local gauge transformations is postulated, the usual covariant derivative containing the gauge potential has to be replaced by a generalized covariant derivative. This leads to a generalized interaction between the matter field and the gauge field as well as to an additional self-interaction of the gauge field. Since the existence of a minimal length scale seems to be a necessary assumption of any consistent quantum theory of gravity, since the gauge principle is a constitutive ingredient of the standard model, and since even gravity can be described as a gauge theory of local translations or Lorentz transformations, the presented extension of gauge theories appears to be a very important consideration.
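
    A commonly used form of the generalized uncertainty principle implying such a minimal length scale (standard in the literature, not a formula taken from this particular paper) is:

    ```latex
    \Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}\left(1 + \beta\,(\Delta p)^2\right),
    \qquad
    \Delta x_{\min} = \hbar\sqrt{\beta},
    ```

    which follows from the modified commutator [x̂, p̂] = iħ(1 + β p̂²); minimizing the right-hand side over Δp yields the minimal length ħ√β.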

  13. A Study of the Impact of Underground Economy on Integral Tax Burden in the Proportional Growth Model under Uncertainty

    Directory of Open Access Journals (Sweden)

    Akif Musayev

    2018-01-01

    Full Text Available Economic processes are naturally characterized by imprecise and uncertain relevant information. One of the main reasons is the existence of an underground economy. However, existing works do not take the real-world imprecision and uncertainty of economic conditions into account. In this paper we consider the problem of calculating a taxation base to assess the tax burden for a proportionally growing economy under uncertainty. In order to account for the imprecision and uncertainty of economic processes, we use the theory of fuzzy sets. A fuzzy integral equation is used to identify an integral tax burden taking into account the contribution of the underground economy for a certain financial (tax) year. It is also assumed that the dynamics of gross domestic product are modeled by a fuzzy linear differential equation. An optimal value of the tax burden is determined as a solution to the considered fuzzy integral equation. An example is provided to illustrate the validity of the proposed study.
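
    The basic machinery behind fuzzy models like this one is alpha-cut interval arithmetic on fuzzy numbers. A minimal sketch, assuming triangular fuzzy numbers and purely illustrative figures (the "underground share" variable and all values are hypothetical, not from the paper):

    ```python
    # Hedged sketch: alpha-cut interval arithmetic for triangular fuzzy
    # numbers (a, b, c) with peak b. The GDP and underground-economy share
    # below are illustrative stand-ins, not data from the study above.

    def alpha_cut(tri, alpha):
        """Interval [lo, hi] of a triangular fuzzy number at membership level alpha."""
        a, b, c = tri
        return (a + alpha * (b - a), c - alpha * (c - b))

    gdp = (95.0, 100.0, 105.0)    # fuzzy GDP (e.g. billions)
    share = (0.15, 0.20, 0.25)    # fuzzy underground-economy share

    for alpha in (0.0, 0.5, 1.0):
        glo, ghi = alpha_cut(gdp, alpha)
        slo, shi = alpha_cut(share, alpha)
        # interval product (all endpoints positive here)
        print(alpha, (glo * slo, ghi * shi))
    ```

    At alpha = 1 the cut collapses to the crisp peak values; decreasing alpha widens the interval, expressing growing uncertainty about the underground contribution.
    
    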

  14. Sensitivity and uncertainty analysis of nuclear responses in the EU HCLL TBM of ITER

    International Nuclear Information System (INIS)

    Leichtle, Dieter; Fischer, Ulrich; Perel, Reuven L.; Serikov, Arkady

    2011-01-01

    Within the EU Fusion Technology Programme, dedicated theoretical and experimental efforts are conducted to provide reliable nuclear data and computational tools for design analyses of fusion devices like ITER, including qualified uncertainty estimates. In this respect, the present paper reports on sensitivity and uncertainty analyses for the EU HCLL Test Blanket Module (TBM) of ITER. Neutron flux spectra and tritium production rates have been calculated using MCNP with a modified version of the ITER Alite torus sector model with integrated TBMs. Sensitivities of these parameters to the nuclear cross sections of isotopes contained in the TBM as well as in the ITER device have been calculated using the Monte Carlo code MCSEN. Uncertainties could be obtained by using existing covariance data of the important nuclear cross section files, mainly from ENDF/B-VI and SCALE6.0, but also from recent JEFF/EFF evaluations. As in the HCLL mock-up experiment, two positions, at the front and the back of the TBM, have been selected. In both cases the calculated uncertainties of the responses (tritium production rate, neutron flux) are in the range of 2-4%.
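
    Propagating cross-section covariances through sensitivities, as in the MCSEN analysis above, is conventionally done with the "sandwich rule" var(R)/R² = SᵀCS. A minimal sketch with illustrative numbers (the sensitivities and covariances below are hypothetical, not values from the paper):

    ```python
    # Hedged sketch of the sandwich rule used in sensitivity/uncertainty
    # analyses: S holds relative sensitivities (dR/R per dX/X) of a response
    # R to each cross section, C is their relative covariance matrix.

    def response_uncertainty(S, C):
        """Relative standard deviation of the response via var = S^T C S."""
        n = len(S)
        var = sum(S[i] * C[i][j] * S[j] for i in range(n) for j in range(n))
        return var ** 0.5

    S = [0.6, -0.3]            # illustrative sensitivities to two cross sections
    C = [[0.0016, 0.0004],     # 4% and 5% standard deviations with a small
         [0.0004, 0.0025]]     # positive cross-correlation (illustrative)

    print(f"relative uncertainty: {response_uncertainty(S, C):.2%}")
    ```

    Note how the opposite signs of the two sensitivities combined with the positive correlation partially cancel, pulling the combined uncertainty below the naive root-sum-of-squares.
    
    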

  15. UNCERTAINTY IN THE DEVELOPMENT AND USE OF EQUATION OF STATE MODELS

    KAUST Repository

    Weirs, V. Gregory; Fabian, Nathan; Potter, Kristin; McNamara, Laura; Otahal, Thomas

    2013-01-01

    In this paper we present the results from a series of focus groups on the visualization of uncertainty in equation-of-state (EOS) models. The initial goal was to identify the most effective ways to present EOS uncertainty to analysts, code developers, and material modelers. Four prototype visualizations were developed to present EOS surfaces in a three-dimensional, thermodynamic space. Focus group participants, primarily from Sandia National Laboratories, evaluated particular features of the various techniques for different use cases and discussed their individual workflow processes, experiences with other visualization tools, and the impact of uncertainty on their work. Related to our prototypes, we found the 3D presentations to be helpful for seeing a large amount of information at once and for a big-picture view; however, participants also desired relatively simple, two-dimensional graphics for better quantitative understanding and because these plots are part of the existing visual language for material models. In addition to feedback on the prototypes, several themes and issues emerged that are as compelling as the original goal and will eventually serve as a starting point for further development of visualization and analysis tools. In particular, a distributed workflow centered around material models was identified. Material model stakeholders contribute and extract information at different points in this workflow depending on their role, but encounter various institutional and technical barriers which restrict the flow of information. An effective software tool for this community must be cognizant of this workflow and alleviate the bottlenecks and barriers within it. Uncertainty in EOS models is defined and interpreted differently at the various stages of the workflow. In this context, uncertainty propagation is difficult to reduce to the mathematical problem of estimating the uncertainty of an output from uncertain inputs.

  16. Study on uncertainty evaluation methodology related to hydrological parameter of regional groundwater flow analysis model

    International Nuclear Information System (INIS)

    Sakai, Ryutaro; Munakata, Masahiro; Ohoka, Masao; Kameya, Hiroshi

    2009-11-01

    In the safety assessment for geological disposal of radioactive waste, it is important to develop a methodology for the long-term estimation of regional groundwater flow, from data acquisition to numerical analyses. Among the uncertainties associated with the estimation of regional groundwater flow, there are those concerning parameters and those concerning the hydrogeological evolution. The uncertainties of parameters include measurement errors and their heterogeneity. The authors discussed the uncertainties of hydraulic conductivity as a significant parameter for regional groundwater flow analysis. This study suggests that the hydraulic conductivities of a rock mass are controlled by rock characteristics such as fractures and porosity and by test conditions such as hydraulic gradient, water quality and water temperature, and that variations of more than a factor of ten in hydraulic conductivity exist due to differences in test conditions such as hydraulic gradient, or due to rock type variations such as fractures and porosity. In addition, this study demonstrated that confining pressure changes caused by uplift and subsidence, and changes of hydraulic gradient under the long-term evolution of the hydrogeological environment, could produce variations of more than a factor of ten in hydraulic conductivity. It was also shown that the effect of water quality changes on hydraulic conductivity is not negligible and that the replacement of fresh water and saline water caused by sea level change could reduce current hydraulic conductivities to 0.6 times their value in the case of the Horonobe site. (author)

  17. The uncertainty budget in pharmaceutical industry

    DEFF Research Database (Denmark)

    Heydorn, Kaj

    of their uncertainty, exactly as described in GUM [2]. Pharmaceutical industry has therefore over the last 5 years shown increasing interest in accreditation according to ISO 17025 [3], and today uncertainty budgets are being developed for all so-called critical measurements. The uncertainty of results obtained...... that the uncertainty of a particular result is independent of the method used for its estimation. Several examples of uncertainty budgets for critical parameters based on the bottom-up procedure will be discussed, and it will be shown how the top-down method is used as a means of verifying uncertainty budgets, based...

  18. LOFT uncertainty-analysis methodology

    International Nuclear Information System (INIS)

    Lassahn, G.D.

    1983-01-01

    The methodology used for uncertainty analyses of measurements in the Loss-of-Fluid Test (LOFT) nuclear-reactor-safety research program is described and compared with other methodologies established for performing uncertainty analyses

  19. LOFT uncertainty-analysis methodology

    International Nuclear Information System (INIS)

    Lassahn, G.D.

    1983-01-01

    The methodology used for uncertainty analyses of measurements in the Loss-of-Fluid Test (LOFT) nuclear reactor safety research program is described and compared with other methodologies established for performing uncertainty analyses

  20. Do Orthopaedic Surgeons Acknowledge Uncertainty?

    NARCIS (Netherlands)

    Teunis, Teun; Janssen, Stein; Guitton, Thierry G.; Ring, David; Parisien, Robert

    2016-01-01

    Much of the decision-making in orthopaedics rests on uncertain evidence. Uncertainty is therefore part of our normal daily practice, and yet physician uncertainty regarding treatment could diminish patients' health. It is not known if physician uncertainty is a function of the evidence alone or if

  1. Parameter estimation techniques and uncertainty in ground water flow model predictions

    International Nuclear Information System (INIS)

    Zimmerman, D.A.; Davis, P.A.

    1990-01-01

    Quantification of uncertainty in predictions of nuclear waste repository performance is a requirement of Nuclear Regulatory Commission regulations governing the licensing of proposed geologic repositories for high-level radioactive waste disposal. One of the major uncertainties in these predictions is in estimating the ground-water travel time of radionuclides migrating from the repository to the accessible environment. The cause of much of this uncertainty has been attributed to a lack of knowledge about the hydrogeologic properties that control the movement of radionuclides through the aquifers. A major reason for this lack of knowledge is the paucity of data that is typically available for characterizing complex ground-water flow systems. Because of this, considerable effort has been put into developing parameter estimation techniques that infer property values in regions where no measurements exist. Currently, no single technique has been shown to be superior or even consistently conservative with respect to predictions of ground-water travel time. This work was undertaken to compare a number of parameter estimation techniques and to evaluate how differences in the parameter estimates and the estimation errors are reflected in the behavior of the flow model predictions. That is, we wished to determine to what degree uncertainties in flow model predictions may be affected simply by the choice of parameter estimation technique used. 3 refs., 2 figs

  2. Ethics under uncertainty: the morality and appropriateness of utilitarianism when outcomes are uncertain.

    Science.gov (United States)

    Kortenkamp, Katherine V; Moore, Colleen F

    2014-01-01

    Real-life moral dilemmas inevitably involve uncertainty, yet research has not considered how uncertainty affects utilitarian moral judgments. In addition, even though moral dilemma researchers regularly ask respondents, "What is appropriate?" but interpret it to mean, "What is moral?," little research has examined whether a difference exists between asking these 2 types of questions. In this study, 140 college students read moral dilemmas that contained certain or uncertain consequences and then responded as to whether it was appropriate and whether it was moral to kill 1 to save many (a utilitarian choice). Ratings of the appropriateness and morality of the utilitarian choice were lower under uncertainty than certainty. A follow-up experiment found that these results could not be explained entirely by a change in the expected values of the outcomes or a desire to avoid the worst-case scenario. In addition, the utilitarian choice to kill 1 to save many was rated as more appropriate than moral. The results imply that moral decision making may depend critically on whether uncertainties in outcomes are admitted and whether people are asked about appropriateness or morality.

  3. Inexact Multistage Stochastic Chance Constrained Programming Model for Water Resources Management under Uncertainties

    Directory of Open Access Journals (Sweden)

    Hong Zhang

    2017-01-01

    Full Text Available In order to formulate water allocation schemes under uncertainties in the water resources management systems, an inexact multistage stochastic chance constrained programming (IMSCCP model is proposed. The model integrates stochastic chance constrained programming, multistage stochastic programming, and inexact stochastic programming within a general optimization framework to handle the uncertainties occurring in both constraints and objective. These uncertainties are expressed as probability distributions, interval with multiply distributed stochastic boundaries, dynamic features of the long-term water allocation plans, and so on. Compared with the existing inexact multistage stochastic programming, the IMSCCP can be used to assess more system risks and handle more complicated uncertainties in water resources management systems. The IMSCCP model is applied to a hypothetical case study of water resources management. In order to construct an approximate solution for the model, a hybrid algorithm, which incorporates stochastic simulation, back propagation neural network, and genetic algorithm, is proposed. The results show that the optimal value represents the maximal net system benefit achieved with a given confidence level under chance constraints, and the solutions provide optimal water allocation schemes to multiple users over a multiperiod planning horizon.
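
    The chance constraints at the heart of an IMSCCP-style model require P(constraint holds) ≥ α rather than certainty. A minimal Monte Carlo check of a single chance constraint, with an entirely illustrative demand distribution and numbers (none of them from the paper):

    ```python
    # Hedged sketch: verifying a chance constraint P(demand * x <= capacity) >= alpha
    # by Monte Carlo sampling, for a hypothetical water allocation x. The
    # normal demand model and all numbers are illustrative stand-ins.
    import random

    random.seed(42)

    def chance_constraint_holds(x, alpha, n=100_000):
        """Does allocation x satisfy the chance constraint at confidence alpha?"""
        capacity = 10.0
        ok = sum(random.gauss(2.0, 0.3) * x <= capacity for _ in range(n))
        return ok / n >= alpha

    print(chance_constraint_holds(3.0, 0.95))   # small allocation: almost surely feasible
    print(chance_constraint_holds(5.5, 0.95))   # large allocation: violated too often
    ```

    In a full IMSCCP model such checks are embedded in the optimization itself (e.g. via the hybrid simulation/neural-network/genetic algorithm mentioned above), but the feasibility test per candidate solution has exactly this shape.
    
    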

  4. Uncertainty in geological and hydrogeological data

    Directory of Open Access Journals (Sweden)

    B. Nilsson

    2007-09-01

    Full Text Available Uncertainty in conceptual model structure and in environmental data is of essential interest when dealing with uncertainty in water resources management. To make quantification of uncertainty possible, it is necessary to identify and characterise the uncertainty in geological and hydrogeological data. This paper discusses a range of available techniques to describe the uncertainty related to geological model structure and scale of support. Literature examples of uncertainty in hydrogeological variables such as saturated hydraulic conductivity, specific yield, specific storage, effective porosity and dispersivity are given. Field data usually have a spatial and temporal scale of support that is different from the one on which numerical models for water resources management operate. Uncertainty in hydrogeological data variables is characterised and assessed within the methodological framework of the HarmoniRiB classification.

  5. Socializing Identity Through Practice: A Mixed Methods Approach to Family Medicine Resident Perspectives on Uncertainty.

    Science.gov (United States)

    Ledford, Christy J W; Cafferty, Lauren A; Seehusen, Dean A

    2015-01-01

    Uncertainty is a central theme in the practice of medicine and particularly primary care. This study explored how family medicine resident physicians react to uncertainty in their practice. This study incorporated a two-phase mixed methods approach, including semi-structured personal interviews (n=21) and longitudinal self-report surveys (n=21) with family medicine residents. Qualitative analysis showed that though residents described uncertainty as an implicit part of their identity, they still developed tactics to minimize or manage uncertainty in their practice. Residents described increasing comfort with uncertainty the longer they practiced and anticipated that growth continuing throughout their careers. Quantitative surveys showed that reactions to uncertainty were more positive over time; however, the difference was not statistically significant. Qualitative and quantitative results show that as family medicine residents practice medicine their perception of uncertainty changes. To reduce uncertainty, residents use relational information-seeking strategies. From a broader view of practice, residents describe uncertainty neutrally, asserting that uncertainty is simply part of the practice of family medicine.

  6. Inflation and Inflation Uncertainty Revisited: Evidence from Egypt

    Directory of Open Access Journals (Sweden)

    Mesbah Fathy Sharaf

    2015-07-01

    Full Text Available The welfare costs of inflation and inflation uncertainty are well documented in the literature, and empirical evidence on the link between the two is sparse in the case of Egypt. This paper investigates the causal relationship between inflation and inflation uncertainty in Egypt using monthly time series data during the period January 1974–April 2015. To endogenously control for any potential structural breaks in the inflation time series, Zivot and Andrews (2002) and Clemente–Montanes–Reyes (1998) unit root tests are used. The inflation–inflation uncertainty relation is modeled by the standard two-step approach as well as simultaneously using various versions of the GARCH-M model to control for any potential feedback effects. The analyses explicitly control for the effect of the Economic Reform and Structural Adjustment Program (ERSAP) undertaken by the Egyptian government in the early 1990s, which affected the inflation rate and its associated volatility. Results show a high degree of inflation–volatility persistence in the response to inflationary shocks. A Granger-causality test along with symmetric and asymmetric GARCH-M models indicates a statistically significant bi-directional positive relationship between inflation and inflation uncertainty, supporting both the Friedman–Ball and the Cukierman–Meltzer hypotheses. The findings are robust to the various estimation methods and model specifications. The findings of this paper support the view of adopting an inflation-targeting policy in Egypt, after fulfilling its preconditions, to reduce the welfare cost of inflation and its related uncertainties. Monetary authorities in Egypt should enhance the credibility of monetary policy and attempt to reduce inflation uncertainty, which will help lower inflation rates.
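
    In GARCH-M studies like this one, "inflation uncertainty" is the conditional variance produced by a recursion such as GARCH(1,1): sigma2_t = omega + alpha·eps²_{t-1} + beta·sigma2_{t-1}. A minimal sketch with illustrative parameters (not estimates for Egyptian data):

    ```python
    # Hedged sketch: the GARCH(1,1) conditional-variance recursion underlying
    # GARCH-M measures of inflation uncertainty. Parameter values are
    # illustrative; persistence is alpha + beta (< 1 for stationarity).

    def garch_variance(residuals, omega=0.1, alpha=0.1, beta=0.8):
        """Conditional variance path, started at the unconditional variance."""
        sigma2 = omega / (1 - alpha - beta)
        path = [sigma2]
        for eps in residuals:
            sigma2 = omega + alpha * eps**2 + beta * sigma2
            path.append(sigma2)
        return path

    path = garch_variance([0.0, 3.0, 0.0, 0.0])  # one large inflation shock at t = 1
    print([round(v, 3) for v in path])
    ```

    The shock at t = 1 raises the conditional variance sharply, after which it decays geometrically at rate alpha + beta = 0.9 — the "volatility persistence" the abstract refers to.
    
    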

  7. International survey for good practices in forecasting uncertainty assessment and communication

    Science.gov (United States)

    Berthet, Lionel; Piotte, Olivier

    2014-05-01

    Achieving technically sound flood forecasts is a crucial objective for forecasters, but forecasts remain of poor use if users do not understand their significance properly and do not use them properly in decision making. One usual way to make the forecasts' limitations precise is to communicate some information about their uncertainty. Uncertainty assessment and communication to stakeholders are thus important issues for operational flood forecasting services (FFS), but remain open fields for research. The French FFS wants to publish graphical streamflow and level forecasts along with uncertainty assessments in the near future on its website (available to the general public). In order to choose the technical options best adapted to its operational context, it carried out a survey among more than 15 fellow institutions. Most of these provide forecasts and warnings to civil protection officers, while some work mostly for hydroelectricity suppliers. A questionnaire was prepared in order to standardize the analysis of the practices of the surveyed institutions. The survey was conducted by gathering information from technical reports or from the scientific literature, as well as through 'interviews' conducted by phone, email discussions or meetings. The questionnaire helped in the exploration of practices in uncertainty assessment, evaluation and communication. In the analysis drawn from the raw results, attention was paid to the particular context within which every institution works. Results show that most services interviewed assess the uncertainty of their forecasts. However, practices can differ significantly from one country to another. Popular techniques are ensemble approaches, which allow several uncertainty sources to be taken into account. Statistical analyses of past forecasts (such as quantile regressions) are also commonly used. Contrary to what was expected, only a few services emphasize the role of the forecaster (subjective assessment). Similar contrasts can be observed in uncertainty

  8. Adjoint-Based Uncertainty Quantification with MCNP

    Energy Technology Data Exchange (ETDEWEB)

    Seifried, Jeffrey E. [Univ. of California, Berkeley, CA (United States)

    2011-09-01

    This work serves to quantify the instantaneous uncertainties in neutron transport simulations born from nuclear data and statistical counting uncertainties. Perturbation and adjoint theories are used to derive implicit sensitivity expressions. These expressions are transformed into forms that are convenient for construction with MCNP6, creating the ability to perform adjoint-based uncertainty quantification with MCNP6. These new tools are exercised on the depleted-uranium hybrid LIFE blanket, quantifying its sensitivities and uncertainties to important figures of merit. Overall, these uncertainty estimates are small (< 2%). Having quantified the sensitivities and uncertainties, physical understanding of the system is gained and some confidence in the simulation is acquired.

  9. Linear Programming Problems for Generalized Uncertainty

    Science.gov (United States)

    Thipwiwatpotjana, Phantipa

    2010-01-01

    Uncertainty occurs when there is more than one realization that can represent an information. This dissertation concerns merely discrete realizations of an uncertainty. Different interpretations of an uncertainty and their relationships are addressed when the uncertainty is not a probability of each realization. A well known model that can handle…

  10. Uncertainty and validation. Effect of model complexity on uncertainty estimates

    Energy Technology Data Exchange (ETDEWEB)

    Elert, M. [Kemakta Konsult AB, Stockholm (Sweden)] [ed.

    1996-09-01

    In the Model Complexity subgroup of BIOMOVS II, models of varying complexity have been applied to the problem of downward transport of radionuclides in soils. A scenario describing a case of surface contamination of a pasture soil was defined. Three different radionuclides with different environmental behaviour and radioactive half-lives were considered: Cs-137, Sr-90 and I-129. The intention was to give a detailed specification of the parameters required by different kinds of model, together with reasonable values for the parameter uncertainty. A total of seven modelling teams participated in the study, using 13 different models. Four of the modelling groups performed uncertainty calculations using nine different modelling approaches. The models used range in complexity from analytical solutions of a 2-box model using annual average data to numerical models coupling hydrology and transport using data varying on a daily basis. The complex models needed to consider all aspects of radionuclide transport in a soil with a variable hydrology are often impractical to use in safety assessments. Instead simpler models, often box models, are preferred. The comparison of predictions made with the complex models and the simple models for this scenario shows that the predictions are in many cases very similar, e.g. in the predictions of the evolution of the root zone concentration. However, in other cases differences of many orders of magnitude can appear. One example is the prediction of the flux to the groundwater of radionuclides being transported through the soil column. Some issues have come into focus in this study: there are large differences in the predicted soil hydrology and, as a consequence, also in the radionuclide transport, which suggests that there are large uncertainties in the calculation of effective precipitation and evapotranspiration.
The approach used for modelling the water transport in the root zone has an impact on the predictions of the decline in root
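
    The kind of simple 2-box model compared against the complex codes above can be sketched in a few lines: a radionuclide leaches from a root zone to a deeper zone at annual rates while decaying. The transfer rates below are illustrative stand-ins; only the Sr-90 half-life (about 29.1 years) is a known physical constant:

    ```python
    # Hedged sketch: a 2-box soil model with first-order leaching and
    # radioactive decay, integrated by forward Euler. k12 and k2out are
    # illustrative annual rates, not parameters from the BIOMOVS II study.
    import math

    def two_box(years, k12=0.05, k2out=0.02, half_life=29.1):  # Sr-90 half-life
        lam = math.log(2) / half_life        # decay constant (1/y)
        c1, c2 = 1.0, 0.0                    # initial inventory all in the root zone
        dt = 0.01
        for _ in range(int(years / dt)):
            d1 = -(lam + k12) * c1
            d2 = k12 * c1 - (lam + k2out) * c2
            c1 += d1 * dt
            c2 += d2 * dt
        return c1, c2

    c1, c2 = two_box(10.0)
    print(f"root zone: {c1:.3f}, deep zone: {c2:.3f}")
    ```

    Even this toy model reproduces the qualitative behaviour the study compares across codes: a declining root-zone concentration feeding a slowly growing deeper compartment.
    
    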

  11. Efficient uncertainty quantification in fully-integrated surface and subsurface hydrologic simulations

    Science.gov (United States)

    Miller, K. L.; Berg, S. J.; Davison, J. H.; Sudicky, E. A.; Forsyth, P. A.

    2018-01-01

    241 different times each. Numerical experiments show that polynomial chaos is an effective and robust method for quantifying uncertainty in fully-integrated hydrologic simulations, which provides a rich set of features and is computationally efficient. Our approach has the potential for significant speedup over existing sampling based methods when the number of uncertain model parameters is modest ( ≤ 20). To our knowledge, this is the first implementation of the algorithm in a comprehensive, fully-integrated, physically-based three-dimensional hydrosystem model.
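The non-intrusive polynomial chaos idea behind this kind of speedup can be sketched for a single standard-normal input. The "model" below is a cheap analytic stand-in (exp(0.3*xi)) for an expensive hydrosystem simulation, and the expansion order and quadrature size are illustrative choices.

```python
import math
import numpy as np

# Non-intrusive polynomial chaos for y = f(xi), xi ~ N(0,1): project the
# model output onto probabilists' Hermite polynomials He_n via Gauss-Hermite
# quadrature, then read mean and variance off the coefficients.

def model(xi):
    return np.exp(0.3 * xi)        # placeholder for an expensive simulation

order = 6
# Gauss-Hermite(e) nodes/weights for the weight exp(-x^2/2); normalize the
# weights so they integrate against the standard normal density.
nodes, weights = np.polynomial.hermite_e.hermegauss(30)
weights = weights / np.sqrt(2.0 * np.pi)

coeffs = []
for n in range(order + 1):
    He_n = np.polynomial.hermite_e.HermiteE.basis(n)(nodes)
    # <He_n, He_n> = n! under the Gaussian measure
    coeffs.append(float(np.sum(weights * model(nodes) * He_n)) / math.factorial(n))

pce_mean = coeffs[0]                                  # first coefficient = mean
pce_var = sum(c * c * math.factorial(n)               # variance from the rest
              for n, c in enumerate(coeffs[1:], start=1))
```

For this lognormal test case the exact moments are known (mean exp(0.045), variance exp(0.09)*(exp(0.09)-1)), so the truncation quality can be checked directly; a real application would evaluate the hydrologic code at the quadrature nodes instead.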

  12. Towards minimizing measurement uncertainty in total petroleum hydrocarbon determination by GC-FID

    Energy Technology Data Exchange (ETDEWEB)

    Saari, E.

    2009-07-01

    Despite tightened environmental legislation, spillages of petroleum products remain a serious problem worldwide. The environmental impacts of these spillages are always severe and reliable methods for the identification and quantitative determination of petroleum hydrocarbons in environmental samples are therefore needed. Great improvements in the definition and analysis of total petroleum hydrocarbons (TPH) were finally introduced by international organizations for standardization in 2004. This brought some coherence to the determination and, nowadays, most laboratories seem to employ ISO/DIS 16703:2004, ISO 9377-2:2000 and CEN prEN 14039:2004:E draft international standards for analysing TPH in soil. The implementation of these methods, however, usually fails because the reliability of petroleum hydrocarbon determination has proved to be poor. This thesis describes the assessment of measurement uncertainty for TPH determination in soil. Chemometric methods were used to both estimate the main uncertainty sources and identify the most significant factors affecting these uncertainty sources. The method used for the determinations was based on gas chromatography utilizing flame ionization detection (GC-FID). Chemometric methodology applied in estimating measurement uncertainty for TPH determination showed that the measurement uncertainty is in actual fact dominated by the analytical uncertainty. Within the specific concentration range studied, the analytical uncertainty accounted for as much as 68-80% of the measurement uncertainty. The robustness of the analytical method used for petroleum hydrocarbon determination was then studied in more detail. A two-level Plackett-Burman design and a D-optimal design were utilized to assess the main analytical uncertainty sources of the sample treatment and GC determination procedures. It was also found that matrix-induced systematic error may significantly reduce the reliability of petroleum hydrocarbon determination.
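A two-level Plackett-Burman screening of the kind used in the robustness study can be sketched as follows. The eight-run design construction is standard, but the responses and factor effects below are synthetic stand-ins, not measured TPH results.

```python
import numpy as np

# Eight-run, up-to-seven-factor Plackett-Burman design: cyclic shifts of a
# standard generator row, plus a final all-low run. Columns are orthogonal,
# so main effects can be estimated independently.

gen = [1, 1, 1, -1, 1, -1, -1]               # standard N=8 PB generator row
rows = [np.roll(gen, i) for i in range(7)]   # cyclic shifts
design = np.array(rows + [[-1] * 7])         # final run: all factors low

# Synthetic response: factor 0 has a strong effect, factor 3 a weak one.
rng = np.random.default_rng(1)
y = 10 + 3.0 * design[:, 0] + 0.5 * design[:, 3] + rng.normal(0, 0.1, 8)

# main effect of each factor = mean(y at +1) - mean(y at -1)
effects = design.T @ y / 4
```

Ranking the absolute effects identifies which steps of the sample treatment and GC procedure dominate the analytical uncertainty, which is the screening role the design plays above.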

  13. Framing of Uncertainty in Scientific Publications: Towards Recommendations for Decision Support

    Science.gov (United States)

    Guillaume, J. H. A.; Helgeson, C.; Elsawah, S.; Jakeman, A. J.; Kummu, M.

    2016-12-01

    Uncertainty is recognised as an essential issue in environmental decision making and decision support. As modellers, we notably use a variety of tools and techniques within an analysis, for example related to uncertainty quantification and model validation. We also address uncertainty by how we present results. For example, experienced modellers are careful to distinguish robust conclusions from those that need further work, and the precision of quantitative results is tailored to their accuracy. In doing so, the modeller frames how uncertainty should be interpreted by their audience. This is an area which extends beyond modelling to fields such as philosophy of science, semantics, discourse analysis, intercultural communication and rhetoric. We propose that framing of uncertainty deserves greater attention in the context of decision support, and that there are opportunities in this area for fundamental research, synthesis and knowledge transfer, development of teaching curricula, and significant advances in managing uncertainty in decision making. This presentation reports preliminary results of a study of framing practices. Specifically, we analyse the framing of uncertainty that is visible in the abstracts from a corpus of scientific articles. We do this through textual analysis of the content and structure of those abstracts. Each finding that appears in an abstract is classified according to the uncertainty framing approach used, using a classification scheme that was iteratively revised based on reflection and comparison amongst three coders. This analysis indicates how frequently the different framing approaches are used, and provides initial insights into relationships between frames, how the frames relate to interpretation of uncertainty, and how rhetorical devices are used by modellers to communicate uncertainty in their work. 
We propose initial hypotheses for how the resulting insights might influence decision support, and help advance decision making to

  14. Analysis of the performance of a H-Darrieus rotor under uncertainty using Polynomial Chaos Expansion

    International Nuclear Information System (INIS)

    Daróczy, László; Janiga, Gábor; Thévenin, Dominique

    2016-01-01

    Due to the growing importance of wind energy, improving the efficiency of energy conversion is essential. Horizontal Axis Wind Turbines are the most widespread, but H-Darrieus turbines are becoming popular as well due to their simple design and easier integration. Due to the high efficiency of existing wind turbines, further improvements require numerical optimization. One important aspect is to find a better configuration that is also robust, i.e., a configuration that retains its performance under uncertainties. For this purpose, forward uncertainty propagation has to be applied. In the present work, an Uncertainty Quantification (UQ) method, Polynomial Chaos Expansion, is applied to transient, turbulent flow simulations of a variable-speed H-Darrieus turbine, taking into account uncertainty in the preset pitch angle and in the angular velocity. The resulting uncertainty of the performance coefficient and of the quasi-periodic torque curve are quantified. In the presence of stall the instantaneous torque coefficients tend to show asymmetric distributions, meaning that error bars cannot be correctly reconstructed using only the mean value and standard deviation. The expected performance was always found to be smaller than in computations without UQ techniques, corresponding to up to 10% of relative losses for λ = 2.5. - Highlights: • Uncertainty Quantification/Polynomial Chaos Expansion successfully applied to H-rotor. • Accounting simultaneously for uncertainty in pitch angle and angular velocity. • Performance coefficient decreases by up to 10% when accounting for uncertainty. • For low tip-speed-ratio, high-order polynomials are needed. • Polynomial order 4 is sufficient to reconstruct distribution at higher TSR.
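The point about asymmetric distributions can be illustrated with a short sketch: for a skewed output, symmetric mean-plus/minus-std error bars misplace the probable range that percentiles recover. A lognormal sample stands in here for an instantaneous torque coefficient; nothing below is the paper's data.

```python
import numpy as np

# Compare a symmetric "1-sigma error bar" with percentile bounds of matching
# nominal coverage on a right-skewed sample.

rng = np.random.default_rng(0)
ct = rng.lognormal(mean=-1.0, sigma=0.8, size=100_000)

mu, sd = ct.mean(), ct.std()
lo_sym, hi_sym = mu - sd, mu + sd                    # symmetric error bar
lo_pct, hi_pct = np.percentile(ct, [15.87, 84.13])   # 1-sigma-equivalent coverage

# mass actually cut off below the symmetric lower bar (should be ~15.9%
# for the percentile bound, but is far less for the symmetric one)
frac_below_sym = (ct < lo_sym).mean()
```

The symmetric lower bound sits deep in the thin left tail while the upper bound understates the heavy right tail, which is why the authors report reconstructed distributions rather than mean and standard deviation alone.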

  15. SCALE-6 Sensitivity/Uncertainty Methods and Covariance Data

    International Nuclear Information System (INIS)

    Williams, Mark L.; Rearden, Bradley T.

    2008-01-01

    Computational methods and data used for sensitivity and uncertainty analysis within the SCALE nuclear analysis code system are presented. The methodology used to calculate sensitivity coefficients and similarity coefficients and to perform nuclear data adjustment is discussed. A description is provided of the SCALE-6 covariance library based on ENDF/B-VII and other nuclear data evaluations, supplemented by 'low-fidelity' approximate covariances. SCALE (Standardized Computer Analyses for Licensing Evaluation) is a modular code system developed by Oak Ridge National Laboratory (ORNL) to perform calculations for criticality safety, reactor physics, and radiation shielding applications. SCALE calculations typically use sequences that execute a predefined series of executable modules to compute particle fluxes and responses like the critical multiplication factor. SCALE also includes modules for sensitivity and uncertainty (S/U) analysis of calculated responses. The S/U codes in SCALE are collectively referred to as TSUNAMI (Tools for Sensitivity and UNcertainty Analysis Methodology Implementation). SCALE-6, scheduled for release in 2008, contains significant new capabilities, including important enhancements in S/U methods and data. The main functions of TSUNAMI are to (a) compute nuclear data sensitivity coefficients and response uncertainties, (b) establish similarity between benchmark experiments and design applications, and (c) reduce uncertainty in calculated responses by consolidating integral benchmark experiments. TSUNAMI includes easy-to-use graphical user interfaces for defining problem input and viewing three-dimensional (3D) geometries, as well as an integrated plotting package.
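The core propagation step behind this kind of S/U analysis is the first-order "sandwich rule": the relative variance of a response is S C Sᵀ, where S holds relative sensitivity coefficients and C is the relative covariance matrix of the nuclear data. The three-parameter numbers below are purely illustrative, not SCALE library values.

```python
import numpy as np

# Sandwich rule for first-order uncertainty propagation:
# relative variance of response R = S @ C @ S^T.

S = np.array([0.9, -0.3, 0.05])          # relative sensitivities (dR/R)/(ds/s)

C = np.array([[4.0e-4, 1.0e-4, 0.0],     # relative covariance matrix of the
              [1.0e-4, 9.0e-4, 0.0],     # uncertain data parameters
              [0.0,    0.0,    2.5e-3]])

rel_var = float(S @ C @ S)               # sandwich rule
rel_unc_pct = 100.0 * np.sqrt(rel_var)   # relative response uncertainty in %
```

With these illustrative inputs the response uncertainty comes out near 1.9%; in TSUNAMI the same algebra is applied with sensitivity profiles resolved by nuclide, reaction and energy group.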

  16. Uncertainty and sensitivity analyses for age-dependent unavailability model integrating test and maintenance

    International Nuclear Information System (INIS)

    Kančev, Duško; Čepin, Marko

    2012-01-01

    Highlights: ► Application of an analytical unavailability model integrating T and M, ageing, and test strategy. ► Ageing data uncertainty propagation on system level assessed via Monte Carlo simulation. ► Uncertainty impact is growing with the extension of the surveillance test interval. ► Calculated system unavailability dependence on two different sensitivity study ageing databases. ► System unavailability sensitivity insights regarding specific groups of BEs as test intervals extend. - Abstract: The interest in operational lifetime extension of the existing nuclear power plants is growing. Consequently, plant life management programs, considering safety components ageing, are being developed and employed. Ageing represents a gradual degradation of the physical properties and functional performance of different components, consequently implying their reduced availability. Analyses made in support of nuclear power plant lifetime extension are based upon component ageing management programs. On the other hand, the large uncertainties of the ageing parameters as well as the uncertainties associated with most of the reliability data collections are widely acknowledged. This paper addresses the uncertainty and sensitivity analyses conducted utilizing a previously developed age-dependent unavailability model, integrating effects of test and maintenance activities, for a selected stand-by safety system in a nuclear power plant. The most important problem is the lack of data concerning the effects of ageing as well as the relatively high uncertainty associated with these data, which would correspond to more detailed modelling of ageing. A standard Monte Carlo simulation was coded for the purpose of this paper and utilized in the process of assessment of the component ageing parameters uncertainty propagation on system level. The obtained results from the uncertainty analysis indicate the extent to which the uncertainty of the selected
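A stripped-down version of the Monte Carlo propagation described above can be sketched for a single standby component with a linear ageing model and periodic testing. The distributions, rates and test interval are illustrative assumptions, not plant data.

```python
import numpy as np

# Propagate uncertain ageing parameters to time-averaged standby
# unavailability. Linear ageing: lambda(t) = lam0 + a*t over a surveillance
# test interval Ti; the interval-averaged unavailability is
#   U = (1/Ti) * int_0^Ti (lam0*t + a*t^2/2) dt = lam0*Ti/2 + a*Ti^2/6.

rng = np.random.default_rng(42)
n = 20_000
lam0 = rng.lognormal(np.log(1e-5), 0.5, n)   # standby failure rate [1/h]
a = rng.lognormal(np.log(1e-9), 1.0, n)      # ageing rate [1/h^2]
Ti = 720.0                                   # surveillance test interval [h]

U = lam0 * Ti / 2.0 + a * Ti**2 / 6.0        # one draw of U per parameter sample

mean_U = U.mean()
p95_U = np.percentile(U, 95)                 # spread from parameter uncertainty
```

Rerunning with a longer Ti shows the ageing term (growing with Ti²) taking a larger share of both the mean and the spread, consistent with the highlighted finding that uncertainty impact grows as the test interval is extended.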

  17. Measurement uncertainty: Friend or foe?

    Science.gov (United States)

    Infusino, Ilenia; Panteghini, Mauro

    2018-02-02

    The definition and enforcement of a reference measurement system, based on the implementation of metrological traceability of patients' results to higher order reference methods and materials, together with a clinically acceptable level of measurement uncertainty, are fundamental requirements to produce accurate and equivalent laboratory results. The uncertainty associated with each step of the traceability chain should be governed to obtain a final combined uncertainty on clinical samples fulfilling the requested performance specifications. It is important that end-users (i.e., clinical laboratory) may know and verify how in vitro diagnostics (IVD) manufacturers have implemented the traceability of their calibrators and estimated the corresponding uncertainty. However, full information about traceability and combined uncertainty of calibrators is currently very difficult to obtain. Laboratory professionals should investigate the need to reduce the uncertainty of the higher order metrological references and/or to increase the precision of commercial measuring systems. Accordingly, the measurement uncertainty should not be considered a parameter to be calculated by clinical laboratories just to fulfil the accreditation standards, but it must become a key quality indicator to describe both the performance of an IVD measuring system and the laboratory itself. Copyright © 2018 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  18. Comparison of the effect of hazard and response/fragility uncertainties on core melt probability uncertainty

    International Nuclear Information System (INIS)

    Mensing, R.W.

    1985-01-01

    This report proposes a method for comparing the effects of the uncertainty in probabilistic risk analysis (PRA) input parameters on the uncertainty in the predicted risks. The proposed method is applied to compare the effect of uncertainties in the descriptions of (1) the seismic hazard at a nuclear power plant site and (2) random variations in plant subsystem responses and component fragility on the uncertainty in the predicted probability of core melt. The PRA used is that developed by the Seismic Safety Margins Research Program

  19. Contribution to uncertainties evaluation for fast reactors neutronic cross sections

    International Nuclear Information System (INIS)

    Privas, Edwin

    2015-01-01

    The thesis has been motivated by a wish to increase the knowledge of nuclear data uncertainties for safety criteria. It focuses on the cross sections required by core calculations for sodium fast reactors (SFR), and on new tools to evaluate them. The main objective of this work is to provide new tools in order to create coherent evaluated files, with reliable and mastered uncertainties. To address these problems, several methods have been implemented within the CONRAD code, which is developed at CEA Cadarache. After a summary of all the elements required to understand the evaluation world, stochastic methods are presented as a means of solving the Bayesian inference problem. They give the evaluator more information about the probability density and can also be used as validation tools. The algorithms have been successfully tested, despite long calculation times. Then, microscopic constraints have been implemented in CONRAD. They are defined as new information that should be taken into account during the evaluation process. An algorithm has been developed in order to solve, for example, continuity issues between two energy domains, with the Lagrange multiplier formalism. Another method uses a marginalization procedure, in order to either complete an existing evaluation with new covariances or add systematic uncertainty on an experiment described by two theories. The algorithms are demonstrated on examples, such as the 238U total cross section. The last parts focus on the integral data feedback, using methods of integral data assimilation to reduce the uncertainties on cross sections. This work ends with uncertainty reduction on key nuclear reactions, such as the capture and fission cross sections of 238U and 239Pu, thanks to the PROFIL and PROFIL-2 experiments in Phenix and the Jezebel benchmark. (author)

  20. Uncertainty Characterization of Reactor Vessel Fracture Toughness

    International Nuclear Information System (INIS)

    Li, Fei; Modarres, Mohammad

    2002-01-01

    To perform fracture mechanics analysis of a reactor vessel, fracture toughness (K_Ic) at various temperatures would be necessary. In a best estimate approach, K_Ic uncertainties resulting from both lack of sufficient knowledge and randomness in some of the variables of K_Ic must be characterized. Although it may be argued that there is only one type of uncertainty, which is lack of perfect knowledge about the subject under study, as a matter of practice K_Ic uncertainties can be divided into two types: aleatory and epistemic. Aleatory uncertainty is related to uncertainty that is very difficult to reduce, if not impossible; epistemic uncertainty, on the other hand, can be practically reduced. Distinction between aleatory and epistemic uncertainties facilitates decision-making under uncertainty and allows for proper propagation of uncertainties in the computation process. Typically, epistemic uncertainties representing, for example, parameters of a model are sampled (to generate a 'snapshot', single value of the parameters), but the totality of aleatory uncertainties is carried through the calculation as available. In this paper a description of an approach to account for these two types of uncertainties associated with K_Ic has been provided. (authors)
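The epistemic/aleatory separation described above is commonly implemented as a nested ("double-loop") Monte Carlo: epistemic parameters of the K_Ic distribution are sampled once per outer iteration (a "snapshot"), and the aleatory material scatter is sampled in the inner loop. The toy fracture criterion and all numbers below are illustrative, not a real vessel model.

```python
import numpy as np

# Double-loop Monte Carlo: outer loop = epistemic uncertainty about the
# K_Ic distribution parameters, inner loop = aleatory scatter of K_Ic itself.

rng = np.random.default_rng(7)
n_epi, n_ale = 200, 2000
applied_k = 60.0                             # deterministic applied K [MPa*sqrt(m)]

p_fail = np.empty(n_epi)
for i in range(n_epi):
    median_kic = rng.normal(100.0, 10.0)     # epistemic: uncertain median
    sigma_ln = rng.uniform(0.10, 0.25)       # epistemic: uncertain scatter
    kic = rng.lognormal(np.log(median_kic), sigma_ln, n_ale)  # aleatory
    p_fail[i] = (kic < applied_k).mean()     # aleatory failure probability

# epistemic distribution over the aleatory failure probability
p5, p95 = np.percentile(p_fail, [5, 95])
```

The output is not a single failure probability but a distribution of probabilities, which is precisely what keeping the two uncertainty types separate buys in decision-making.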

  1. Range uncertainties in proton therapy and the role of Monte Carlo simulations

    International Nuclear Information System (INIS)

    Paganetti, Harald

    2012-01-01

    The main advantages of proton therapy are the reduced total energy deposited in the patient as compared to photon techniques and the finite range of the proton beam. The latter adds an additional degree of freedom to treatment planning. The range in tissue is associated with considerable uncertainties caused by imaging, patient setup, beam delivery and dose calculation. Reducing the uncertainties would allow a reduction of the treatment volume and thus allow a better utilization of the advantages of protons. This paper summarizes the role of Monte Carlo simulations when aiming at a reduction of range uncertainties in proton therapy. Differences in dose calculation when comparing Monte Carlo with analytical algorithms are analyzed as well as range uncertainties due to material constants and CT conversion. Range uncertainties due to biological effects and the role of Monte Carlo for in vivo range verification are discussed. Furthermore, the current range uncertainty recipes used at several proton therapy facilities are revisited. We conclude that a significant impact of Monte Carlo dose calculation can be expected in complex geometries where local range uncertainties due to multiple Coulomb scattering will reduce the accuracy of analytical algorithms. In these cases Monte Carlo techniques might reduce the range uncertainty by several mm. (topical review)

  2. Quantum-memory-assisted entropic uncertainty relation in a Heisenberg XYZ chain with an inhomogeneous magnetic field

    Science.gov (United States)

    Wang, Dong; Huang, Aijun; Ming, Fei; Sun, Wenyang; Lu, Heping; Liu, Chengcheng; Ye, Liu

    2017-06-01

    The uncertainty principle provides a nontrivial bound on the precision of the outcomes of measurements on a pair of incompatible observables in a quantum system. It is therefore of essential importance for quantum precision measurement in the area of quantum information processing. Herein, we investigate the quantum-memory-assisted entropic uncertainty relation (QMA-EUR) in a two-qubit Heisenberg XYZ spin chain. Specifically, we observe the dynamics of the QMA-EUR in a realistic model in which two correlated sites are linked by a thermal entanglement in the spin chain with an inhomogeneous magnetic field. It turns out that the temperature, the external inhomogeneous magnetic field and the field inhomogeneity can lift the uncertainty of the measurement due to the reduction of the thermal entanglement; explicitly, higher temperature, stronger magnetic field or larger inhomogeneity of the field can result in inflation of the uncertainty. Besides, it is found that there exist distinct dynamical behaviors of the uncertainty for ferromagnetic (J < 0) and antiferromagnetic (J > 0) chains. Moreover, we also verify that the measurement uncertainty is dramatically anti-correlated with the purity of the bipartite spin system: greater purity results in lower measurement uncertainty, and vice versa. Therefore, our observations might provide a better understanding of the dynamics of the entropic uncertainty in the Heisenberg spin chain, and thus shed light on quantum precision measurement in versatile systems, particularly solid states.
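The QMA-EUR studied here builds on the Berta et al. bound S(X|B) + S(Z|B) ≥ log2(1/c) + S(A|B), where c is the maximal overlap of the two measurement eigenbases. A minimal sketch, evaluating the right-hand side for the textbook case of Pauli X and Z measurements with a maximally entangled quantum memory:

```python
import numpy as np

# Right-hand side of the quantum-memory-assisted entropic uncertainty bound
# for Pauli X and Z: c = max_{i,j} |<x_i|z_j>|^2, and S(A|B) = -1 for a
# maximally entangled (Bell) state shared with the memory.

x_basis = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)  # X eigenvectors (rows)
z_basis = np.eye(2)                                           # Z eigenvectors (rows)

c = max(abs(np.vdot(x, z)) ** 2 for x in x_basis for z in z_basis)
complementarity = np.log2(1.0 / c)   # = 1 bit for mutually unbiased bases

S_A_given_B = -1.0                   # conditional entropy of a Bell state
bound = complementarity + S_A_given_B
```

The bound drops to zero when the memory is maximally entangled with the measured qubit, which is the mechanism behind the anti-correlation between entanglement (or purity) and measurement uncertainty reported in the abstract.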

  3. Sources of uncertainty in characterizing health risks from flare emissions

    International Nuclear Information System (INIS)

    Hrudey, S.E.

    2000-01-01

    The assessment of health risks associated with gas flaring was the focus of this paper. Health risk assessments for environmental decision-making include the evaluation of scientific data to identify hazards and to determine dose-response assessments, exposure assessments and risk characterization. Gas flaring has been the cause for public health concerns in recent years, most notably since 1996 after a published report by the Alberta Research Council. Some of the major sources of uncertainty associated with identifying hazardous contaminants in flare emissions were discussed. Methods to predict human exposures to emitted contaminants were examined along with risk characterization of predicted exposures to several identified contaminants. One of the problems is that elemental uncertainties exist regarding flare emissions, which limits the degree of reassurance that risk assessment can provide; nevertheless, risk assessment can offer some guidance to those responsible for flare emissions

  4. Uncertainties in risk assessment at USDOE facilities

    Energy Technology Data Exchange (ETDEWEB)

    Hamilton, L.D.; Holtzman, S.; Meinhold, A.F.; Morris, S.C.; Rowe, M.D.

    1994-01-01

    The United States Department of Energy (USDOE) has embarked on an ambitious program to remediate environmental contamination at its facilities. Decisions concerning cleanup goals, choices among cleanup technologies, and funding prioritization should be largely risk-based. Risk assessments will be used more extensively by the USDOE in the future. USDOE needs to develop and refine risk assessment methods and fund research to reduce major sources of uncertainty in risk assessments at USDOE facilities. The terms 'risk assessment' and 'risk management' are frequently confused. The National Research Council (1983) and the United States Environmental Protection Agency (USEPA, 1991a) described risk assessment as a scientific process that contributes to risk management. Risk assessment is the process of collecting, analyzing and integrating data and information to identify hazards, assess exposures and dose responses, and characterize risks. Risk characterization must include a clear presentation of "... the most significant data and uncertainties..." in an assessment. Significant data and uncertainties are "...those that define and explain the main risk conclusions". Risk management integrates risk assessment information with other considerations, such as risk perceptions, socioeconomic and political factors, and statutes, to make and justify decisions. Risk assessments, as scientific processes, should be made independently of the other aspects of risk management (USEPA, 1991a), but current methods for assessing health risks are based on conservative regulatory principles, causing unnecessary public concern and misallocation of funds for remediation.

  5. Uncertainties in risk assessment at USDOE facilities

    International Nuclear Information System (INIS)

    Hamilton, L.D.; Holtzman, S.; Meinhold, A.F.; Morris, S.C.; Rowe, M.D.

    1994-01-01

    The United States Department of Energy (USDOE) has embarked on an ambitious program to remediate environmental contamination at its facilities. Decisions concerning cleanup goals, choices among cleanup technologies, and funding prioritization should be largely risk-based. Risk assessments will be used more extensively by the USDOE in the future. USDOE needs to develop and refine risk assessment methods and fund research to reduce major sources of uncertainty in risk assessments at USDOE facilities. The terms 'risk assessment' and 'risk management' are frequently confused. The National Research Council (1983) and the United States Environmental Protection Agency (USEPA, 1991a) described risk assessment as a scientific process that contributes to risk management. Risk assessment is the process of collecting, analyzing and integrating data and information to identify hazards, assess exposures and dose responses, and characterize risks. Risk characterization must include a clear presentation of "... the most significant data and uncertainties..." in an assessment. Significant data and uncertainties are "...those that define and explain the main risk conclusions". Risk management integrates risk assessment information with other considerations, such as risk perceptions, socioeconomic and political factors, and statutes, to make and justify decisions. Risk assessments, as scientific processes, should be made independently of the other aspects of risk management (USEPA, 1991a), but current methods for assessing health risks are based on conservative regulatory principles, causing unnecessary public concern and misallocation of funds for remediation

  6. PREMIUM - Benchmark on the quantification of the uncertainty of the physical models in the system thermal-hydraulic codes

    International Nuclear Information System (INIS)

    Skorek, Tomasz; Crecy, Agnes de

    2013-01-01

    PREMIUM (Post BEMUSE Reflood Models Input Uncertainty Methods) is an activity launched with the aim to push forward the methods of quantification of physical model uncertainties in thermal-hydraulic codes. It is endorsed by OECD/NEA/CSNI/WGAMA. The benchmark PREMIUM is addressed to all who apply uncertainty evaluation methods based on input uncertainty quantification and propagation. The benchmark is based on a selected case of uncertainty analysis application to the simulation of quench front propagation in an experimental test facility. Application to an experiment enables evaluation and confirmation of the quantified probability distribution functions on the basis of experimental data. The scope of the benchmark comprises a review of the existing methods, selection of potentially important uncertain input parameters, preliminary quantification of the ranges and distributions of the identified parameters, evaluation of the probability density function using experimental results of tests performed on the FEBA test facility and confirmation/validation of the performed quantification on the basis of blind calculation of the Reflood 2-D PERICLES experiment. (authors)

  7. Strategy under uncertainty.

    Science.gov (United States)

    Courtney, H; Kirkland, J; Viguerie, P

    1997-01-01

    At the heart of the traditional approach to strategy lies the assumption that by applying a set of powerful analytic tools, executives can predict the future of any business accurately enough to allow them to choose a clear strategic direction. But what happens when the environment is so uncertain that no amount of analysis will allow us to predict the future? What makes for a good strategy in highly uncertain business environments? The authors, consultants at McKinsey & Company, argue that uncertainty requires a new way of thinking about strategy. All too often, they say, executives take a binary view: either they underestimate uncertainty to come up with the forecasts required by their companies' planning or capital-budgeting processes, or they overestimate it, abandon all analysis, and go with their gut instinct. The authors outline a new approach that begins by making a crucial distinction among four discrete levels of uncertainty that any company might face. They then explain how a set of generic strategies--shaping the market, adapting to it, or reserving the right to play at a later time--can be used in each of the four levels. And they illustrate how these strategies can be implemented through a combination of three basic types of actions: big bets, options, and no-regrets moves. The framework can help managers determine which analytic tools can inform decision making under uncertainty--and which cannot. At a broader level, it offers executives a discipline for thinking rigorously and systematically about uncertainty and its implications for strategy.

  8. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    Science.gov (United States)

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods, used in pharmaceutical analysis, consists of several components. The analysis of the most important sources of the quantitative microbiological methods' variability demonstrated no effect of culture media and plate-count techniques in the estimation of microbial count, while a highly significant effect of other factors (type of microorganism, pharmaceutical product and individual reading and interpreting errors) was established. The most appropriate method of statistical analysis of such data was ANOVA, which enabled not only the effect of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, appropriate for traditional plate-count methods. Copyright © 2015 Elsevier B.V. All rights reserved.
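The final step described above, combining uncertainty components mathematically, is normally done in quadrature per the GUM. A minimal sketch; the component magnitudes are illustrative stand-ins for the factors named in the abstract, chosen so the result stays near the reported ~35% ceiling.

```python
import math

# Combine independent relative standard uncertainty components in quadrature:
# u_combined = sqrt(sum(u_i^2)).

components = {
    "microorganism_type": 0.20,   # relative standard uncertainties (assumed)
    "product_matrix":     0.18,
    "reading_error":      0.15,
    "plating":            0.08,
}

u_combined = math.sqrt(sum(u * u for u in components.values()))
U_expanded = 2.0 * u_combined     # coverage factor k = 2 (~95 % confidence)
```

Because the components add in quadrature, the largest one dominates; an ANOVA that isolates and shrinks that component gives the biggest reduction in the combined value.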

  9. Where do uncertainties reside within environmental risk assessments? Expert opinion on uncertainty distributions for pesticide risks to surface water organisms.

    Science.gov (United States)

    Skinner, Daniel J C; Rocks, Sophie A; Pollard, Simon J T

    2016-12-01

    A reliable characterisation of uncertainties can aid uncertainty identification during environmental risk assessments (ERAs). However, typologies can be implemented inconsistently, causing uncertainties to go unidentified. We present an approach based on nine structured elicitations, in which subject-matter experts, for pesticide risks to surface water organisms, validate and assess three dimensions of uncertainty: its level (the severity of uncertainty, ranging from determinism to ignorance); nature (whether the uncertainty is epistemic or aleatory); and location (the data source or area in which the uncertainty arises). Risk characterisation contains the highest median levels of uncertainty, associated with estimating, aggregating and evaluating the magnitude of risks. Regarding the locations in which uncertainty is manifest, data uncertainty is dominant in problem formulation, exposure assessment and effects assessment. The comprehensive description of uncertainty described will enable risk analysts to prioritise the required phases, groups of tasks, or individual tasks within a risk analysis according to the highest levels of uncertainty, the potential for uncertainty to be reduced or quantified, or the types of location-based uncertainty, thus aiding uncertainty prioritisation during environmental risk assessments. In turn, it is expected to inform investment in uncertainty reduction or targeted risk management action. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  10. Uncertainty budget for k0-NAA

    International Nuclear Information System (INIS)

    Robouch, P.; Arana, G.; Eguskiza, M.; Etxebarria, N.

    2000-01-01

    The concepts of the Guide to the Expression of Uncertainty in Measurement (GUM) for chemical measurements and the recommendations of the Eurachem document 'Quantifying Uncertainty in Analytical Measurement' are applied to set up the uncertainty budget for k0-NAA. The 'universally applicable spreadsheet technique', described by Kragten, is applied to the k0-NAA basic equations for the computation of uncertainties. The variance components - individual standard uncertainties - highlight the contribution and the importance of the different parameters to be taken into account. (author)
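Kragten's spreadsheet technique approximates each variance contribution by re-evaluating the measurement equation with one input shifted by its standard uncertainty, avoiding analytic partial derivatives. The sketch below uses a generic y = a*b/c as a stand-in for the k0-NAA basic equation; all values are illustrative.

```python
import math

# Kragten's numerical uncertainty budget: contribution of input x_i is
# f(..., x_i + u(x_i), ...) - f(...), an approximation of (df/dx_i)*u(x_i).

def model(a, b, c):
    return a * b / c       # placeholder measurement equation

values = {"a": 10.0, "b": 2.0, "c": 4.0}
uncerts = {"a": 0.10, "b": 0.04, "c": 0.08}   # standard uncertainties

y0 = model(**values)
contrib = {}
for name in values:
    shifted = dict(values)
    shifted[name] += uncerts[name]
    contrib[name] = model(**shifted) - y0     # ~ (dy/dx_i) * u(x_i)

u_y = math.sqrt(sum(d * d for d in contrib.values()))       # combined std. unc.
variance_share = {k: d * d / (u_y * u_y) for k, d in contrib.items()}
```

The `variance_share` column is exactly what the abstract calls the variance components: it ranks the inputs by their contribution to the combined uncertainty.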

  11. Uncertainty vs. learning in climate policy: Some classical results and new directions

    Energy Technology Data Exchange (ETDEWEB)

    Lange, A. [Univ. of Maryland (United States)]; Treich, N. [Univ. of Toulouse (France)]

    2007-07-01

    Climate policy decisions today must be made under substantial uncertainty: the impact of accumulating greenhouse gases in the atmosphere is not perfectly known, and the future economic and social consequences of climate change, in particular the valuation of possible damages, are uncertain. However, learning will change the basis for future decisions on abatement policies. These important issues of uncertainty and learning are often discussed only informally, and two opposing effects are typically put forward. First, uncertainty about future climate damage, often associated with the possibility of a catastrophic scenario, is said to give a premium to slowing down global warming and therefore to increase abatement efforts today. Second, learning opportunities will reduce scientific uncertainty about climate damage over time; this is often used as an argument to postpone abatement efforts until new information is received. The effects of uncertainty and learning on the optimal design of current climate policy are still much debated in both the academic and the political arena. In this paper, the authors study and contrast the effects of uncertainty and learning in a two-decision model that encompasses most existing microeconomic models of climate change. They first consider the common expected utility framework: while uncertainty generally has no effect or a negative effect on welfare, learning always has a positive, and thus opposite, effect. The effects of both uncertainty and learning on decisions are less clear: neither uncertainty nor learning can be used as an argument to increase or reduce emissions today, independently of the degree of risk aversion of the decision-maker and of the nature of irreversibility constraints. The authors then deviate from the expected utility framework and consider a model with ambiguity aversion. The model accounts well for situations of imprecise or multiple probability distributions, as present in the context of climate
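
    The contrast between deciding under uncertainty and deciding after learning can be made concrete with a toy two-period abatement model. The numbers and functional form below are purely illustrative, not the authors' model:

```python
# Toy model: damage severity theta is 'low' or 'high' with equal
# probability; total cost = convex abatement cost + residual damage.
P = {"low": 0.5, "high": 0.5}
ABATE = [0.0, 0.5, 1.0]          # feasible abatement levels

def cost(a, theta):
    damage = {"low": 1.0, "high": 4.0}[theta]
    return a**2 + damage * (1.0 - a)

# No learning: a single abatement level must cover both states.
cost_no_learn = min(sum(P[t] * cost(a, t) for t in P) for a in ABATE)

# Learning: the state is observed first, so abatement adapts to it.
cost_learn = sum(P[t] * min(cost(a, t) for a in ABATE) for t in P)

value_of_information = cost_no_learn - cost_learn   # always >= 0
```

    The value of information is non-negative here, mirroring the paper's point that learning always has a (weakly) positive welfare effect, even though its implication for today's abatement choice is ambiguous.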

  12. Uncertainty governance: an integrated framework for managing and communicating uncertainties

    International Nuclear Information System (INIS)

    Umeki, H.; Naito, M.; Takase, H.

    2004-01-01

    Treatment of uncertainty, or in other words reasoning with imperfect information, is widely recognised as being of great importance within performance assessment (PA) of geological disposal, mainly because of the time scale of interest and the spatial heterogeneity that the geological environment exhibits. A wide range of formal methods have been proposed for the optimal processing of incomplete information. Many of these methods rely on the use of numerical information, the frequency-based concept of probability in particular, to handle the imperfections. However, taking quantitative information as a base for models that solve the problem of handling imperfect information merely creates another problem, i.e., how to provide the quantitative information. In many situations this second problem proves more resistant to solution, and in recent years several authors have explored a particularly ingenious alternative: semi-quantitative approaches built on well-founded formalisms such as Bayesian probability theory, possibility theory, and the Dempster-Shafer theory of evidence. Those methods, while drawing inspiration from quantitative methods, do not require the kind of complete numerical information that quantitative methods demand. Instead they provide information that, though less precise than that provided by quantitative techniques, is often, if not sufficient, the best that can be achieved. Rather than searching for the best method for handling all imperfect information, our strategy for uncertainty management (that is, recognition and evaluation of uncertainties associated with PA, followed by planning and implementation of measures to reduce them) is to use whichever method best fits the problem at hand. Such an eclectic position leads naturally to integration of the different formalisms. While uncertainty management based on the combination of semi-quantitative methods forms an important part of our framework for uncertainty governance, it only solves half of the problem
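
    Of the formalisms named in the abstract, Dempster-Shafer evidence theory is the least familiar; its core operation, Dempster's rule of combination, can be sketched briefly. The two-hypothesis frame and mass assignments below are hypothetical illustrations:

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions whose
    focal elements are frozensets over a common frame of discernment.
    Mass falling on the empty intersection is treated as conflict and
    renormalised away."""
    combined, conflict = {}, 0.0
    for b, mb in m1.items():
        for c, mc in m2.items():
            a = b & c
            if a:
                combined[a] = combined.get(a, 0.0) + mb * mc
            else:
                conflict += mb * mc
    if conflict >= 1.0:
        raise ValueError("total conflict: sources are incompatible")
    return {a: m / (1.0 - conflict) for a, m in combined.items()}

# Hypothetical frame: is a barrier parameter 'safe' or 'unsafe'?
S, U = frozenset({"safe"}), frozenset({"unsafe"})
SU = S | U                           # 'unknown' (the full frame)
m1 = {S: 0.6, SU: 0.4}               # evidence source 1
m2 = {S: 0.5, U: 0.3, SU: 0.2}       # evidence source 2

m = dempster_combine(m1, m2)
```

    Note how each source can leave mass on the full frame `SU`: this is exactly the "less complete numerical information" the abstract describes, which a purely probabilistic model would force into precise probabilities.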

  13. Assessment of wind turbine seismic risk : existing literature and simple study of tower moment demand.

    Energy Technology Data Exchange (ETDEWEB)

    Prowell, Ian (University of California, San Diego, CA); Veers, Paul S.

    2009-03-01

    Various sources of risk exist for all civil structures, one of which is seismic risk. As structures change in scale, the magnitude of seismic risk changes relative to risk from other sources. This paper presents an introduction to seismic hazard as applied to wind turbine structures. The existing design methods and research regarding seismic risk for wind turbines are then summarized. Finally, a preliminary assessment is made based on current guidelines to understand how tower moment demand scales as rated power increases. Potential areas of uncertainty in the application of the current guidelines are summarized.

  14. Verification of uncertainty budgets

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Madsen, B.S.

    2005-01-01

    , and therefore it is essential that the applicability of the overall uncertainty budget to actual measurement results be verified on the basis of current experimental data. This should be carried out by replicate analysis of samples taken in accordance with the definition of the measurand, but representing...... the full range of matrices and concentrations for which the budget is assumed to be valid. In this way the assumptions made in the uncertainty budget can be experimentally verified, both as regards sources of variability that are assumed negligible, and dominant uncertainty components. Agreement between...
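
    One simple way to verify a budget against replicate data, in the spirit of the abstract, is to compare the observed spread of results with the budgeted standard uncertainty through a chi-squared-type statistic. This is a generic sketch with illustrative numbers, not the authors' exact procedure:

```python
def budget_verification_T(results, u_budget):
    """Compare the observed spread of replicate results with the
    spread predicted by an uncertainty budget. If the budget is valid,
    T is approximately chi-squared distributed with n-1 degrees of
    freedom."""
    n = len(results)
    mean = sum(results) / n
    T = sum((x - mean) ** 2 for x in results) / u_budget**2
    return T, n - 1

# Illustrative replicate results (mg/kg) and a claimed budget value:
replicates = [10.2, 9.8, 10.1, 9.9, 10.0]
u = 0.2                     # standard uncertainty from the budget
T, dof = budget_verification_T(replicates, u)
# Rule of thumb: T far above dof suggests the budget underestimates
# variability; T far below dof suggests it overestimates it.
```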

  15. Effect of precipitation spatial distribution uncertainty on the uncertainty bounds of a snowmelt runoff model output

    Science.gov (United States)

    Jacquin, A. P.

    2012-04-01

    goodness of fit of the model realizations. GLUE-type uncertainty bounds during the verification period are derived at the probability levels p=85%, 90% and 95%. Results indicate that, as expected, prediction uncertainty bounds indeed change if precipitation factors FPi are estimated a priori rather than being allowed to vary, but that this change is not dramatic. Firstly, the width of the uncertainty bounds at the same probability level reduces only slightly compared to the case where precipitation factors are allowed to vary. Secondly, the ability to enclose the observations improves, but the decrease in the fraction of outliers is not significant. These results are probably due to the narrow range of variability allowed for the precipitation factors FPi in the first experiment, which implies that although they indicate the shape of the functional relationship between precipitation and height, the magnitude of the precipitation estimates was mainly determined by the magnitude of the observations at the available raingauge. It is probable that a situation with no prior information on the realistic ranges of variation of the precipitation factors, together with the inclusion of precipitation data uncertainty, would have led to a different conclusion. Acknowledgements: This research was funded by FONDECYT, Research Project 1110279.
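
    GLUE-type bounds at a given probability level are read off the likelihood-weighted cumulative distribution of the behavioural simulations. A minimal single-time-step sketch, generic rather than tied to the snowmelt model above:

```python
def glue_bounds(sims, likelihoods, p=0.90):
    """GLUE-style uncertainty bounds at one time step: sort the
    simulated values, accumulate their normalised likelihood weights,
    and read off the central p-probability interval."""
    total = sum(likelihoods)
    pairs = sorted(zip(sims, likelihoods))
    lo_q, hi_q = (1 - p) / 2, (1 + p) / 2
    cum, lo, hi = 0.0, pairs[0][0], pairs[-1][0]
    lo_set = False
    for value, weight in pairs:
        cum += weight / total
        if not lo_set and cum >= lo_q:
            lo, lo_set = value, True
        if cum >= hi_q:
            hi = value
            break
    return lo, hi
```

    With equal likelihoods the bounds reduce to plain percentiles of the ensemble; non-uniform likelihoods pull the bounds toward the better-performing realizations, which is the point of the GLUE weighting.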

  16. Assessing climate change and socio-economic uncertainties in long term management of water resources

    Science.gov (United States)

    Jahanshahi, Golnaz; Dawson, Richard; Walsh, Claire; Birkinshaw, Stephen; Glenis, Vassilis

    2015-04-01

    Long term management of water resources is challenging for decision makers given the range of uncertainties that exist. Such uncertainties are a function of long term drivers of change, such as climate, environmental loadings, demography, land use and other socio-economic drivers. Impacts of climate change on the frequency of extreme events such as drought make it a serious threat to water resources and water security. The release of probabilistic climate information, such as the UKCP09 scenarios, provides improved understanding of some uncertainties in climate models. This has motivated a more rigorous approach to dealing with other uncertainties, in order to understand the sensitivity of investment decisions to future uncertainty and to identify adaptation options that are as far as possible robust. We have developed and coupled a system of models that includes a weather generator, simulations of catchment hydrology, demand for water and the water resource system. This integrated model has been applied in the Thames catchment, which supplies the city of London, UK. This region is one of the driest in the UK and hence sensitive to water availability. In addition, it is one of the fastest growing parts of the UK and plays an important economic role. Key uncertainties in long term water resources in the Thames catchment, many of which result from earth system processes, are identified and quantified. The implications of these uncertainties are explored using a combination of uncertainty analysis and sensitivity testing. The analysis shows considerable uncertainty in future rainfall, river flow and consequently water resources. For example, results indicate that by the 2050s, low flow (Q95) in the Thames catchment will range from -44 to +9% compared with the control scenario (1970s). Consequently, by the 2050s the average number of drought days is expected to increase 4-6 times relative to the 1970s. Uncertainties associated with urban growth increase these risks further

  17. Propagation of dynamic measurement uncertainty

    International Nuclear Information System (INIS)

    Hessling, J P

    2011-01-01

    The time-dependent measurement uncertainty has been evaluated in a number of recent publications, starting from a known uncertain dynamic model. This could be defined as the 'downward' propagation of uncertainty from the model to the targeted measurement. The propagation of uncertainty 'upward' from the calibration experiment to a dynamic model traditionally belongs to system identification. The use of different representations (time, frequency, etc) is ubiquitous in dynamic measurement analyses. An expression of uncertainty in dynamic measurements is formulated for the first time in this paper independent of representation, joining upward as well as downward propagation. For applications in metrology, the high quality of the characterization may be prohibitive for any reasonably large and robust model to pass the whiteness test. This test is therefore relaxed by not directly requiring small systematic model errors in comparison to the randomness of the characterization. Instead, the systematic error of the dynamic model is propagated to the uncertainty of the measurand, analogously to, but separately from, the way stochastic contributions are propagated. The pass criterion of the model is thereby transferred from the identification to acceptance of the total accumulated uncertainty of the measurand. This increases the relevance of the test of the model, as it relates to its final use rather than to the quality of the calibration. The propagation of uncertainty hence includes the propagation of systematic model errors. For illustration, the 'upward' propagation of uncertainty is applied to determine if an appliance box is damaged in an earthquake experiment. In this case, relaxation of the whiteness test was required to reach a conclusive result.
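
    'Downward' propagation from an uncertain dynamic model to a measurand can be illustrated with Monte Carlo sampling of a first-order sensor model. The model form and numbers below are assumed for illustration and are unrelated to the paper's earthquake application:

```python
import math
import random

random.seed(42)

# First-order step response y(t) = 1 - exp(-t / tau), with calibration
# giving tau = 2.0 s and a standard uncertainty of 0.1 s (illustrative).
TAU_MEAN, TAU_STD, T = 2.0, 0.1, 1.0

def response(tau, t):
    return 1.0 - math.exp(-t / tau)

# Propagate the parameter uncertainty to the measurand at time T
# by sampling tau and recomputing the response.
samples = [response(random.gauss(TAU_MEAN, TAU_STD), T)
           for _ in range(20000)]
mean_y = sum(samples) / len(samples)
u_y = (sum((s - mean_y) ** 2 for s in samples)
       / (len(samples) - 1)) ** 0.5
```

    For this nearly linear case the Monte Carlo result agrees with the first-order (sensitivity coefficient) propagation, u_y ≈ |dy/dτ| · u_τ ≈ 0.015; a known systematic model error would be carried along as an additional term rather than folded into the random spread, as the abstract describes.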

  18. Uncertainties and novel prospects in the study of the soil carbon dynamics

    International Nuclear Information System (INIS)

    Yang Wang; Yuch-Ping Hsieh

    2002-01-01

    Establishment of the Kyoto Protocol has resulted in an effort to look towards living biomass and soils for carbon sequestration. In order for carbon credits to be meaningful, sustained carbon sequestration for decades or longer is required. It has been speculated that improved land management could result in sequestration of a substantial amount of carbon in soils within several decades and could therefore be an important option in reducing atmospheric CO 2 concentration. However, evaluation of soil carbon sources and sinks is difficult because the dynamics of soil carbon storage and release are complex and still not well understood. There has been rapid development of quantitative techniques over the past two decades for measuring the component fluxes of the global carbon cycle and for studying the soil carbon cycle. The most significant development in soil carbon cycle studies is the application of accelerator mass spectrometry (AMS) to radiocarbon measurements. This has made it possible to unravel rates of carbon cycling in soils by studying natural levels of radiocarbon in soil organic matter and soil CO 2 . Despite the advances in the study of the soil carbon cycle in recent decades, tremendous uncertainties exist in the sizes and turnover times of soil carbon pools. The uncertainties result from a lack of standard methods and an incomplete understanding of soil organic carbon dynamics, compounded by natural variability in soil carbon and carbon isotopic content even within the same ecosystem. Many fundamental questions concerning the dynamics of the soil carbon cycle have yet to be answered. This paper reviews and synthesizes the isotopic approaches to the study of the soil carbon cycle. We will focus on uncertainties and limitations associated with these approaches and point out areas where more research is needed to improve our understanding of this important component of the global carbon cycle. (author)
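
    Turnover times are commonly inferred from AMS radiocarbon data with a steady-state one-pool model: the pool's fraction modern F satisfies F = k/(k + λ), where k is the turnover rate and λ the 14C decay constant, so τ = 1/k. The sketch below shows that calculation; the one-pool, steady-state assumption is itself one of the simplifications whose limits the paper discusses:

```python
# 14C decay constant per year (mean life 5730 / ln 2 ~ 8267 yr).
LAM = 1.0 / 8267.0

def turnover_time(fraction_modern):
    """Turnover time (years) of a single well-mixed soil carbon pool
    at steady state, from its pre-bomb radiocarbon fraction modern.
    Solving F = k / (k + LAM) for k gives k = F * LAM / (1 - F)."""
    if not 0.0 < fraction_modern < 1.0:
        raise ValueError("fraction modern must lie strictly in (0, 1)")
    k = fraction_modern * LAM / (1.0 - fraction_modern)
    return 1.0 / k

# e.g. an illustrative pre-bomb fraction modern of 0.95 implies
# tau = 8267 * (1 - 0.95) / 0.95, i.e. roughly 435 years.
tau = turnover_time(0.95)
```

    The same measured F is consistent with very different pool structures (for instance, a fast pool mixed with a passive one), which is one source of the "tremendous uncertainties" in pool sizes and turnover times noted above.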

  19. Applying the conservativeness principle to REDD to deal with the uncertainties of the estimates

    International Nuclear Information System (INIS)

    Grassi, Giacomo; Monni, Suvi; Achard, Frederic; Mollicone, Danilo; Federici, Sandro

    2008-01-01

    A common paradigm, when the reduction of emissions from deforestation is estimated for the purpose of promoting it as a mitigation option in the context of the United Nations Framework Convention on Climate Change (UNFCCC), is that high uncertainties in input data (i.e., area change and C stock change per unit area) may seriously undermine the credibility of the estimates, and therefore of reduced deforestation as a mitigation option. In this paper, we show how a series of concepts and methodological tools, already existing in UNFCCC decisions and IPCC guidance documents, may greatly help to deal with the uncertainties of the estimates of reduced emissions from deforestation.

  20. Applying the conservativeness principle to REDD to deal with the uncertainties of the estimates

    Energy Technology Data Exchange (ETDEWEB)

    Grassi, Giacomo; Monni, Suvi; Achard, Frederic [Institute for Environment and Sustainability, Joint Research Centre of the European Commission, I-21020 Ispra (Italy); Mollicone, Danilo [Department of Geography, University of Alcala de Henares, Madrid (Spain); Federici, Sandro

    2008-07-15

    A common paradigm, when the reduction of emissions from deforestation is estimated for the purpose of promoting it as a mitigation option in the context of the United Nations Framework Convention on Climate Change (UNFCCC), is that high uncertainties in input data (i.e., area change and C stock change per unit area) may seriously undermine the credibility of the estimates, and therefore of reduced deforestation as a mitigation option. In this paper, we show how a series of concepts and methodological tools, already existing in UNFCCC decisions and IPCC guidance documents, may greatly help to deal with the uncertainties of the estimates of reduced emissions from deforestation.
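
    A minimal way to encode the conservativeness principle, much simpler than the paper's actual factor tables but in the same spirit, is to credit the lower confidence bound of the estimated reduction so that larger uncertainty automatically yields a smaller credit. The function and numbers below are illustrative assumptions:

```python
def conservative_credit(reduction, rel_uncertainty_95):
    """Credit a conservatively discounted emission reduction.

    reduction          -- estimated emission reduction (e.g. tCO2/yr)
    rel_uncertainty_95 -- half-width of the 95% confidence interval,
                          expressed as a fraction of the estimate
    """
    return max(0.0, reduction * (1.0 - rel_uncertainty_95))

# A precisely measured reduction keeps most of its credit; a vaguely
# measured one of the same size is discounted much more heavily.
credit_precise = conservative_credit(1_000_000, 0.10)  # ~900,000 tCO2/yr
credit_vague = conservative_credit(1_000_000, 0.40)    # ~600,000 tCO2/yr
```

    The incentive structure is the point: under such a rule, investing in better monitoring directly increases the creditable reduction, rather than uncertainty merely undermining credibility.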