WorldWideScience

Sample records for avt-147 computational uncertainty

  1. Uncertainty In Quantum Computation

    OpenAIRE

    Kak, Subhash

    2002-01-01

    We examine the effect of previous history on starting a computation on a quantum computer. Specifically, we assume that the quantum register has some unknown state on it, and it is required that this state be cleared and replaced by a specific superposition state without any phase uncertainty, as needed by quantum algorithms. We show that, in general, this task is computationally impossible.

  2. S-parameter uncertainty computations

    DEFF Research Database (Denmark)

    Vidkjær, Jens

A method for computing uncertainties of measured s-parameters is presented. Unlike the specification software provided with network analyzers, the new method is capable of calculating the uncertainties of arbitrary s-parameter sets and instrument settings.

  3. Numerical uncertainty in computational engineering and physics

    Energy Technology Data Exchange (ETDEWEB)

    Hemez, Francois M [Los Alamos National Laboratory

    2009-01-01

Obtaining a solution that approximates ordinary or partial differential equations on a computational mesh or grid does not necessarily mean that the solution is accurate or even 'correct'. Unfortunately, assessing the quality of discrete solutions by questioning the role played by spatial and temporal discretizations generally comes as a distant third to test-analysis comparison and model calibration. This publication aims to raise awareness of the fact that discrete solutions introduce numerical uncertainty. This uncertainty may, in some cases, overwhelm in complexity and magnitude other sources of uncertainty that include experimental variability, parametric uncertainty and modeling assumptions. The concepts of consistency, convergence and truncation error are reviewed to explain the articulation between the exact solution of the continuous equations, the solution of the modified equations and discrete solutions computed by a code. The current state of the practice of code and solution verification activities is discussed. An example in the discipline of hydrodynamics illustrates the significant effect that meshing can have on the quality of code predictions. A simple method is proposed to derive bounds of solution uncertainty in cases where the exact solution of the continuous equations, or of its modified equations, is unknown. It is argued that numerical uncertainty originating from mesh discretization should always be quantified and accounted for in the overall uncertainty 'budget' that supports decision-making for applications in computational physics and engineering.
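The mesh-discretization uncertainty discussed in this record is commonly bounded by Richardson extrapolation from two grid solutions. The sketch below is illustrative only (the numeric values are invented, not from the publication) and assumes smooth convergence at the scheme's formal order:

```python
def richardson_error_estimate(f_coarse, f_fine, r, p):
    """Estimate the discretization error of the fine-grid solution.

    f_coarse, f_fine : solutions on grids with spacing h and h/r
    r : grid refinement ratio (> 1)
    p : formal order of accuracy of the scheme
    """
    # Richardson extrapolation: error of the fine solution ~ (f_coarse - f_fine) / (r**p - 1)
    return (f_coarse - f_fine) / (r**p - 1)

# Example: a 2nd-order scheme, solutions on h and h/2 grids (values assumed)
err = richardson_error_estimate(1.10, 1.02, r=2, p=2)
# Extrapolated ("mesh-free") estimate of the exact solution:
f_exact_est = 1.02 - err
```

The error estimate itself becomes a term in the overall uncertainty budget the abstract argues for.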

  4. Uncertainty in biology a computational modeling approach

    CERN Document Server

    Gomez-Cabrero, David

    2016-01-01

Computational modeling of biomedical processes is gaining more and more weight in current research into the etiology of biomedical problems and potential treatment strategies. Computational modeling makes it possible to reduce, refine and replace animal experimentation as well as to translate findings obtained in these experiments to the human background. However, these biomedical problems are inherently complex, with a myriad of influencing factors, which strongly complicates the model building and validation process. This book addresses four main issues related to the building and validation of computational models of biomedical processes: model establishment under uncertainty; model selection and parameter fitting; sensitivity analysis and model adaptation; and model predictions under uncertainty. In each of the abovementioned areas, the book discusses a number of key techniques by means of a general theoretical description followed by one or more practical examples. This book is intended for graduate students...

  5. Efficient Quantification of Uncertainties in Complex Computer Code Results Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Propagation of parameter uncertainties through large computer models can be very resource intensive. Frameworks and tools for uncertainty quantification are...

  6. Probabilistic numerics and uncertainty in computations

    Science.gov (United States)

    Hennig, Philipp; Osborne, Michael A.; Girolami, Mark

    2015-01-01

We deliver a call to arms for probabilistic numerical methods: algorithms for numerical tasks, including linear algebra, integration, optimization and solving differential equations, that return uncertainties in their calculations. Such uncertainties, arising from the loss of precision induced by numerical calculation with limited time or hardware, are important for much contemporary science and industry. Within applications such as climate science and astrophysics, the need to make decisions on the basis of computations with large and complex data has led to a renewed focus on the management of numerical uncertainty. We describe how several seminal classic numerical methods can be interpreted naturally as probabilistic inference. We then show that the probabilistic view suggests new algorithms that can flexibly be adapted to suit application specifics, while delivering improved empirical performance. We provide concrete illustrations of the benefits of probabilistic numeric algorithms on real scientific problems from astrometry and astronomical imaging, while highlighting open problems with these new algorithms. Finally, we describe how probabilistic numerical methods provide a coherent framework for identifying the uncertainty in calculations performed with a combination of numerical algorithms (e.g. both numerical optimizers and differential equation solvers), potentially allowing the diagnosis (and control) of error sources in computations. PMID:26346321
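A minimal illustration of a numerical routine that reports its own uncertainty, in the spirit of this record (this is plain Monte Carlo with a standard error, not the probabilistic-numerics machinery of the paper):

```python
import math
import random

def mc_integrate(f, a, b, n=10000, seed=0):
    """Monte Carlo integral of f on [a, b], returning (estimate, standard_error).

    The standard error quantifies the numerical uncertainty of the result,
    so downstream decisions can account for computation-induced error.
    """
    rng = random.Random(seed)
    xs = [a + (b - a) * rng.random() for _ in range(n)]
    ys = [f(x) for x in xs]
    mean = sum(ys) / n
    var = sum((y - mean) ** 2 for y in ys) / (n - 1)
    estimate = (b - a) * mean
    std_err = (b - a) * math.sqrt(var / n)
    return estimate, std_err

est, se = mc_integrate(math.sin, 0.0, math.pi)
# The true value is 2; `se` tells the caller how far off the estimate may be.
```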

  7. Current status of uncertainty analysis methods for computer models

    International Nuclear Information System (INIS)

    This report surveys several existing uncertainty analysis methods for estimating computer output uncertainty caused by input uncertainties, illustrating application examples of those methods to three computer models, MARCH/CORRAL II, TERFOC and SPARC. Merits and limitations of the methods are assessed in the application, and recommendation for selecting uncertainty analysis methods is provided. (author)

  8. Reliable Computational Predictions by Modeling Uncertainties Using Arbitrary Polynomial Chaos

    OpenAIRE

    Witteveen, J.A.S.; Bijl, H

    2006-01-01

    Inherent physical uncertainties can have a significant influence on computational predictions. It is therefore important to take physical uncertainties into account to obtain more reliable computational predictions. The Galerkin polynomial chaos method is a commonly applied uncertainty quantification method. However, the polynomial chaos expansion has some limitations. Firstly, the polynomial chaos expansion based on classical polynomials can achieve exponential convergence for a limited set ...
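A non-intrusive Galerkin-style polynomial chaos sketch for a scalar Gaussian input, to make the expansion concrete; the model function and parameter values are assumptions for illustration, not taken from the thesis:

```python
import math
import numpy as np
from numpy.polynomial import hermite_e as He

def pce_gaussian(f, mu, sigma, order=4, nquad=12):
    """Non-intrusive polynomial chaos for y = f(x), x ~ N(mu, sigma^2).

    Expands f in probabilists' Hermite polynomials of xi ~ N(0, 1) and
    returns the PC estimates of the mean and variance of y.
    """
    # Gauss-Hermite(e) nodes/weights for the weight exp(-xi^2/2); weights sum to sqrt(2*pi)
    xi, w = He.hermegauss(nquad)
    y = f(mu + sigma * xi)
    coeffs = []
    for k in range(order + 1):
        Hk = He.hermeval(xi, [0] * k + [1])  # He_k(xi)
        ck = (w * y * Hk).sum() / (math.sqrt(2 * math.pi) * math.factorial(k))
        coeffs.append(ck)
    mean = coeffs[0]
    var = sum(c ** 2 * math.factorial(k) for k, c in enumerate(coeffs) if k > 0)
    return mean, var

# Quadratic model: exact mean = mu^2 + sigma^2, variance = 4*mu^2*sigma^2 + 2*sigma^4
m, v = pce_gaussian(lambda x: x ** 2, mu=1.0, sigma=0.5)
```

For a polynomial model and enough quadrature points, the PC moments are exact, which is the "exponential convergence" property the abstract refers to.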

  9. Efficient uncertainty quantification in computational fluid dynamics

    NARCIS (Netherlands)

    Loeven, G.J.A.

    2010-01-01

When modeling physical systems, several sources of uncertainty are present. For example, variability in boundary conditions like free stream velocity or ambient pressure is always present. Furthermore, uncertainties in geometry arise from production tolerances, wear or unknown deformations under load...

  10. Understanding Theoretical Uncertainties in Perturbative QCD Computations

    DEFF Research Database (Denmark)

    Jenniches, Laura Katharina

To compare theoretical predictions and experimental results, we require not only a precise knowledge of the observables themselves, but also a good understanding of the uncertainty introduced by missing higher orders in perturbative QCD. In this work, we present a method which combines... The second project focuses on theoretical uncertainties in perturbative QCD. We perform a study of theoretical uncertainties obtained using the traditional scale-variation approach and the Cacciari-Houdeau approach [1], which uses Bayesian statistics to estimate missing-higher-order uncertainties. In addition, we...

  11. Multifactorial Uncertainty Assessment for Monitoring Population Abundance using Computer Vision

    NARCIS (Netherlands)

    Beauxis-Aussalet, E.M.A.L.; Hardman, L.

    2015-01-01

Computer vision enables in-situ monitoring of animal populations at a lower cost and with less ecosystem disturbance than with human observers. However, computer vision uncertainty may not be fully understood by end-users, and the uncertainty assessments performed by technology experts may not fully address end-user needs.

  12. Methodology for characterizing modeling and discretization uncertainties in computational simulation

    Energy Technology Data Exchange (ETDEWEB)

    ALVIN,KENNETH F.; OBERKAMPF,WILLIAM L.; RUTHERFORD,BRIAN M.; DIEGERT,KATHLEEN V.

    2000-03-01

    This research effort focuses on methodology for quantifying the effects of model uncertainty and discretization error on computational modeling and simulation. The work is directed towards developing methodologies which treat model form assumptions within an overall framework for uncertainty quantification, for the purpose of developing estimates of total prediction uncertainty. The present effort consists of work in three areas: framework development for sources of uncertainty and error in the modeling and simulation process which impact model structure; model uncertainty assessment and propagation through Bayesian inference methods; and discretization error estimation within the context of non-deterministic analysis.

  13. Multifactorial Uncertainty Assessment for Monitoring Population Abundance using Computer Vision

    OpenAIRE

    Beauxis-Aussalet, Emmanuelle; Hardman, Hazel Lynda

    2015-01-01

    Computer vision enables in-situ monitoring of animal populations at a lower cost and with less ecosystem disturbance than with human observers. However, computer vision uncertainty may not be fully understood by end-users, and the uncertainty assessments performed by technology experts may not fully address end-user needs. This knowledge gap can yield misinterpretations of computer vision data, and trust issues impeding the transfer of valuable technologies. We bridge this gap with a user-cen...

  14. Uncertainty in biology: a computational modeling approach

    OpenAIRE

    2015-01-01

Computational modeling of biomedical processes is gaining more and more weight in current research into the etiology of biomedical problems and potential treatment strategies. Computational modeling makes it possible to reduce, refine and replace animal experimentation as well as to translate findings obtained in these experiments to the human background. However, these biomedical problems are inherently complex, with a myriad of influencing factors, which strongly complicates the model building...

  15. MOUSE (MODULAR ORIENTED UNCERTAINTY SYSTEM): A COMPUTERIZED UNCERTAINTY ANALYSIS SYSTEM (FOR MICRO- COMPUTERS)

    Science.gov (United States)

Environmental engineering calculations involving uncertainties, either in the model itself or in the data, are far beyond the capabilities of conventional analysis for any but the simplest of models. There exist a number of general-purpose computer simulation languages, using Mon...

  16. Propagation of Computational Uncertainty Using the Modern Design of Experiments

    Science.gov (United States)

    DeLoach, Richard

    2007-01-01

    This paper describes the use of formally designed experiments to aid in the error analysis of a computational experiment. A method is described by which the underlying code is approximated with relatively low-order polynomial graduating functions represented by truncated Taylor series approximations to the true underlying response function. A resource-minimal approach is outlined by which such graduating functions can be estimated from a minimum number of case runs of the underlying computational code. Certain practical considerations are discussed, including ways and means of coping with high-order response functions. The distributional properties of prediction residuals are presented and discussed. A practical method is presented for quantifying that component of the prediction uncertainty of a computational code that can be attributed to imperfect knowledge of independent variable levels. This method is illustrated with a recent assessment of uncertainty in computational estimates of Space Shuttle thermal and structural reentry loads attributable to ice and foam debris impact on ascent.
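The surrogate-based propagation described in this record can be sketched as fitting a low-order polynomial graduating function to a handful of code runs and pushing input uncertainty through it; the "code run" data and uncertainty levels below are invented for illustration:

```python
import numpy as np

# Hypothetical "code runs": a few evaluations of an expensive simulation y(x)
x_runs = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
y_runs = np.array([1.00, 1.35, 1.95, 2.80, 3.90])  # assumed outputs

# Low-order polynomial graduating function (truncated-Taylor-series surrogate)
coef = np.polyfit(x_runs, y_runs, deg=2)
surrogate = np.poly1d(coef)

# Propagate uncertainty in the independent variable through the surrogate:
# with x ~ N(x0, sx^2), first-order propagation gives sy ~ |dy/dx| * sx
x0, sx = 1.0, 0.05  # assumed nominal level and its standard uncertainty
slope = surrogate.deriv()(x0)
sy = abs(slope) * sx
```

Once the cheap surrogate exists, this propagation costs nothing compared with rerunning the underlying code.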

  17. Soft computing approaches to uncertainty propagation in environmental risk management

    OpenAIRE

    Kumar, Vikas

    2008-01-01

Real-world problems, especially those that involve natural systems, are complex and composed of many nondeterministic components having non-linear coupling. It turns out that in dealing with such systems, one has to face a high degree of uncertainty and tolerate imprecision. Classical system models based on numerical analysis, crisp logic or binary logic have characteristics of precision and categoricity and are classified as hard computing approaches. In contrast, soft computing approaches like pro...

  18. Computational principles of sensorimotor control that minimize uncertainty and variability

    OpenAIRE

    Bays, P. M.; Wolpert, D.M.

    2007-01-01

    Sensory and motor noise limits the precision with which we can sense the world and act upon it. Recent research has begun to reveal computational principles by which the central nervous system reduces the sensory uncertainty and movement variability arising from this internal noise. Here we review the role of optimal estimation and sensory filtering in extracting the sensory information required for motor planning, and the role of optimal control, motor adaptation and impedance control in the...
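The optimal-estimation principle in this record is usually illustrated by minimum-variance cue combination, where each noisy sensory estimate is weighted by its reliability; the numbers below are a made-up vision/proprioception example, not data from the review:

```python
def combine_estimates(x1, var1, x2, var2):
    """Minimum-variance (maximum-likelihood) fusion of two noisy estimates.

    Each cue is weighted by its inverse variance (reliability); the fused
    variance is never larger than either input variance, which is how the
    CNS can reduce uncertainty arising from internal noise.
    """
    w1 = (1 / var1) / (1 / var1 + 1 / var2)
    x = w1 * x1 + (1 - w1) * x2
    var = 1 / (1 / var1 + 1 / var2)
    return x, var

# Precise visual estimate (10.0, var 1.0) vs noisy proprioceptive one (12.0, var 4.0)
x, var = combine_estimates(10.0, 1.0, 12.0, 4.0)
# The fused estimate lies closer to the more reliable cue.
```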

  19. Computer Model Inversion and Uncertainty Quantification in the Geosciences

    Science.gov (United States)

    White, Jeremy T.

    The subject of this dissertation is use of computer models as data analysis tools in several different geoscience settings, including integrated surface water/groundwater modeling, tephra fallout modeling, geophysical inversion, and hydrothermal groundwater modeling. The dissertation is organized into three chapters, which correspond to three individual publication manuscripts. In the first chapter, a linear framework is developed to identify and estimate the potential predictive consequences of using a simple computer model as a data analysis tool. The framework is applied to a complex integrated surface-water/groundwater numerical model with thousands of parameters. Several types of predictions are evaluated, including particle travel time and surface-water/groundwater exchange volume. The analysis suggests that model simplifications have the potential to corrupt many types of predictions. The implementation of the inversion, including how the objective function is formulated, what minimum of the objective function value is acceptable, and how expert knowledge is enforced on parameters, can greatly influence the manifestation of model simplification. Depending on the prediction, failure to specifically address each of these important issues during inversion is shown to degrade the reliability of some predictions. In some instances, inversion is shown to increase, rather than decrease, the uncertainty of a prediction, which defeats the purpose of using a model as a data analysis tool. In the second chapter, an efficient inversion and uncertainty quantification approach is applied to a computer model of volcanic tephra transport and deposition. The computer model simulates many physical processes related to tephra transport and fallout. The utility of the approach is demonstrated for two eruption events. In both cases, the importance of uncertainty quantification is highlighted by exposing the variability in the conditioning provided by the observations used for

  20. Uncertainty quantification in computational fluid dynamics and aircraft engines

    CERN Document Server

    Montomoli, Francesco; D'Ammaro, Antonio; Massini, Michela; Salvadori, Simone

    2015-01-01

    This book introduces novel design techniques developed to increase the safety of aircraft engines. The authors demonstrate how the application of uncertainty methods can overcome problems in the accurate prediction of engine lift, caused by manufacturing error. This in turn ameliorates the difficulty of achieving required safety margins imposed by limits in current design and manufacturing methods. This text shows that even state-of-the-art computational fluid dynamics (CFD) are not able to predict the same performance measured in experiments; CFD methods assume idealised geometries but ideal geometries do not exist, cannot be manufactured and their performance differs from real-world ones. By applying geometrical variations of a few microns, the agreement with experiments improves dramatically, but unfortunately the manufacturing errors in engines or in experiments are unknown. In order to overcome this limitation, uncertainty quantification considers the probability density functions of manufacturing errors...

  1. Error Estimation and Uncertainty Propagation in Computational Fluid Mechanics

    Science.gov (United States)

    Zhu, J. Z.; He, Guowei; Bushnell, Dennis M. (Technical Monitor)

    2002-01-01

    Numerical simulation has now become an integral part of engineering design process. Critical design decisions are routinely made based on the simulation results and conclusions. Verification and validation of the reliability of the numerical simulation is therefore vitally important in the engineering design processes. We propose to develop theories and methodologies that can automatically provide quantitative information about the reliability of the numerical simulation by estimating numerical approximation error, computational model induced errors and the uncertainties contained in the mathematical models so that the reliability of the numerical simulation can be verified and validated. We also propose to develop and implement methodologies and techniques that can control the error and uncertainty during the numerical simulation so that the reliability of the numerical simulation can be improved.

  2. Approaches for computing uncertainties in predictions of complex-codes

    International Nuclear Information System (INIS)

Uncertainty analysis aims at characterizing the errors associated with experiments and predictions of computer codes, in contradistinction with sensitivity analysis, which aims at determining the rate of change (i.e., derivative) in the predictions of codes when one or more (typically uncertain) input parameters vary within their range of interest. This paper reviews the salient features of three independent approaches for estimating uncertainties associated with predictions of complex system codes. The first approach reviewed in this paper, the prototype for propagation of code input errors, is the so-called 'GRS method', which includes the so-called 'CSAU method' (Code Scaling, Applicability and Uncertainty) and the majority of methods adopted by the nuclear industry. Although the entire set of input parameters for a typical NPP (Nuclear Power Plant) input deck, ranging up to about 10^5 input parameters, could theoretically be considered as uncertainty sources by these methods, only a 'manageable' number (of the order of several tens) is actually taken into account in practice. Ranges of variation, together with suitable PDFs (Probability Density Functions), are then assigned to each of the uncertain input parameters actually considered in the analysis. The number of computations using the code under investigation needed for obtaining the desired confidence in the results can be determined theoretically (it is of the order of 100). Subsequently, an additional number of computations (ca. 100) with the code are performed to propagate the uncertainties inside the code, from inputs to outputs (results). The second approach reviewed in this paper is the propagation of code output errors, as representatively illustrated by the UMAE-CIAU (Uncertainty Method based upon Accuracy Extrapolation 'embedded' into the Code with capability of Internal Assessment of Uncertainty). Note that this class of methods includes only a few applications from industry
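The "order of 100" run count quoted for GRS-type input-error propagation comes from Wilks' nonparametric tolerance-limit formula, which can be reproduced directly:

```python
def wilks_one_sided(coverage=0.95, confidence=0.95):
    """Smallest number of code runs n such that the largest of n random
    outputs bounds the `coverage` quantile with probability `confidence`
    (first-order, one-sided Wilks formula used by GRS-type methods)."""
    n = 1
    while 1 - coverage ** n < confidence:
        n += 1
    return n

def wilks_two_sided(coverage=0.95, confidence=0.95):
    """First-order, two-sided Wilks sample size (sample min/max as bounds)."""
    n = 2
    while 1 - coverage ** n - n * (1 - coverage) * coverage ** (n - 1) < confidence:
        n += 1
    return n

# Classic results for 95 % coverage / 95 % confidence: 59 runs one-sided, 93 two-sided,
# independent of how many uncertain input parameters are sampled.
```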

  3. A High Performance Bayesian Computing Framework for Spatiotemporal Uncertainty Modeling

    Science.gov (United States)

    Cao, G.

    2015-12-01

All types of spatiotemporal measurements are subject to uncertainty. As spatiotemporal data become increasingly involved in scientific research and decision making, it is important to appropriately model the impact of uncertainty. Quantitatively modeling spatiotemporal uncertainty, however, is a challenging problem considering the complex dependence and data heterogeneities. State-space models provide a unifying and intuitive framework for dynamic systems modeling. In this paper, we aim to extend the conventional state-space models for uncertainty modeling in space-time contexts while accounting for spatiotemporal effects and data heterogeneities. Gaussian Markov Random Field (GMRF) models, also known as conditional autoregressive models, are arguably the most commonly used methods for modeling spatially dependent data. GMRF models basically assume that a geo-referenced variable primarily depends on its neighborhood (the Markov property), and the spatial dependence structure is described via a precision matrix. Recent studies have shown that GMRFs are efficient approximations to the commonly used Gaussian fields (e.g., Kriging), and compared with Gaussian fields, GMRFs enjoy a series of appealing features, such as fast computation and easy accommodation of heterogeneities in spatial data (e.g., point and areal). This paper represents each spatial dataset as a GMRF and integrates them into a state-space form to statistically model the temporal dynamics. Different types of spatial measurements (e.g., categorical, count or continuous) can be accounted for by corresponding link functions. A fast alternative to the MCMC framework, the so-called Integrated Nested Laplace Approximation (INLA), was adopted for model inference. Preliminary case studies will be conducted to showcase the advantages of the described framework. In the first case, we apply the proposed method to modeling the water table elevation of the Ogallala aquifer over the past decades. In the second case, we analyze the
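A toy version of the GMRF construction mentioned in this record, for a 1-D chain rather than a spatial grid; the precision values are arbitrary choices, not from the paper:

```python
import numpy as np

def gmrf_precision_1d(n, kappa=1.0, tau=4.0):
    """Precision matrix of a simple 1-D GMRF (first-order neighbor coupling
    tau plus a small nugget kappa). Zeros off the tridiagonal encode the
    Markov property: each node depends only on its neighbors."""
    Q = np.zeros((n, n))
    for i in range(n):
        Q[i, i] = kappa + tau * ((i > 0) + (i < n - 1))
        if i > 0:
            Q[i, i - 1] = Q[i - 1, i] = -tau
    return Q

Q = gmrf_precision_1d(5)
cov = np.linalg.inv(Q)  # dense covariance implied by the sparse precision
# Conditional mean of node i given the rest uses only -Q[i, j] / Q[i, i] for
# the two adjacent nodes j, which is what makes GMRF computation fast.
```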

  4. Verification and uncertainty analysis of fuel codes using distributed computing

    International Nuclear Information System (INIS)

Of late, nuclear safety analysis computer codes have been held to increasingly high standards of quality assurance. As well, best-estimate-with-uncertainty analysis is taking a more prominent role, displacing to some extent the idea of a limit-consequence analysis. In turn, these activities have placed ever-increasing burdens on available computing resources. A recent project at Ontario Hydro has been the development of the capability of using the workstations on our Windows NT LAN as a distributed batch queue. The application developed is called SheepDog. This paper reports on the challenges and opportunities met in this project, as well as the experience gained in applying this method to verification and uncertainty analysis of fuel codes. SheepDog has been applied to performing uncertainty analysis, in a basically CSAU-like method, of fuel behaviour during postulated accident scenarios at a nuclear power station. For each scenario, several hundred cases were selected according to a Latin Hypercube scheme, and used to construct a response surface surrogate for the codes. Residual disparities between code predictions and response surfaces led to the suspicion that there were discontinuities in the predictions of the analysis codes. This led to the development of 'stress testing' procedures. This refers to two procedures: coarsely scanning through several input parameters in combination, and finely scanning individual input parameters. For either procedure, the number of code runs required is several hundred. In order to be able to perform stress testing in a reasonable time, SheepDog was applied. The results are examined for such considerations as continuity, smoothness, and physical reasonableness of trends and interactions. In several cases, this analysis uncovered previously unknown errors in analysis codes, and allowed pinpointing the parts of the codes that needed to be modified. The challenges involved include the following: the usual choices of development
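The Latin Hypercube scheme used above to select cases for the response surface can be sketched as follows (a basic stratified design, not Ontario Hydro's actual implementation):

```python
import random

def latin_hypercube(n, d, seed=0):
    """n samples in [0, 1]^d with exactly one sample per equal-probability
    stratum along every dimension (a basic Latin Hypercube design)."""
    rng = random.Random(seed)
    samples = [[0.0] * d for _ in range(n)]
    for j in range(d):
        # one point in each of the n strata, assigned in shuffled order
        perm = list(range(n))
        rng.shuffle(perm)
        for i in range(n):
            samples[i][j] = (perm[i] + rng.random()) / n
    return samples

pts = latin_hypercube(10, 3)
# Each coordinate column hits each interval [k/10, (k+1)/10) exactly once,
# giving good coverage of the input space from relatively few code runs.
```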

  5. Contribution to uncertainties computing: application to aerosol nanoparticles metrology

    International Nuclear Information System (INIS)

This thesis aims to provide SMPS users with a methodology to compute uncertainties associated with the estimation of aerosol size distributions. An SMPS selects and detects airborne particles with a Differential Mobility Analyser (DMA) and a Condensation Particle Counter (CPC), respectively. The on-line measurement provides particle counting over a large measuring range. Recovering the aerosol size distribution from CPC measurements then leads to an inverse problem under uncertainty. A review of models representing CPC measurements as a function of the aerosol size distribution is presented in the first chapter, showing that competing theories exist to model the physics involved in the measurement, and showing in the meantime the necessity of modelling parameters and other functions as uncertain. The physical model we established was designed first to accurately represent the physics and second to be computationally inexpensive. The first requirement is obvious, as it characterizes the performance of the model. On the other hand, the time constraint is common to all large-scale problems for which an evaluation of the uncertainty is sought. To perform the estimation of the size distribution, a new criterion that couples regularization techniques and decomposition on a wavelet basis is described. Regularization is widely used to solve ill-posed problems. The regularized solution is computed as a trade-off between fidelity to the data and a prior on the solution to be rebuilt, the trade-off being represented by a scalar known as the regularization parameter. Nevertheless, when dealing with size distributions showing both broad and sharp profiles, a homogeneous prior is no longer suitable. The main improvement of this work addresses such situations. The multi-scale approach we propose for the definition of the new prior is an alternative that enables the weights of the regularization to be adjusted on each scale of the signal. The method is tested against common regularization
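The fidelity-versus-prior trade-off described above is classical Tikhonov regularization; a minimal sketch on an invented ill-conditioned system (not the thesis's wavelet-based criterion):

```python
import numpy as np

def tikhonov_solve(A, b, lam):
    """Regularized solution of A x = b: minimize ||A x - b||^2 + lam * ||x||^2.

    lam is the regularization parameter trading data fidelity against the
    prior on the solution; solved here via the normal equations."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

# Nearly singular toy inversion: a small lam stabilizes the solution while
# barely perturbing the well-determined components.
A = np.array([[1.0, 1.0], [1.0, 1.0001]])
b = np.array([2.0, 2.0001])
x = tikhonov_solve(A, b, lam=1e-6)
```

Choosing lam per wavelet scale, rather than one global scalar, is the multi-scale refinement the abstract describes.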

  6. Computational Methods for Sensitivity and Uncertainty Analysis in Criticality Safety

    International Nuclear Information System (INIS)

    Interest in the sensitivity methods that were developed and widely used in the 1970s (the FORSS methodology at ORNL among others) has increased recently as a result of potential use in the area of criticality safety data validation procedures to define computational bias, uncertainties and area(s) of applicability. Functional forms of the resulting sensitivity coefficients can be used as formal parameters in the determination of applicability of benchmark experiments to their corresponding industrial application areas. In order for these techniques to be generally useful to the criticality safety practitioner, the procedures governing their use had to be updated and simplified. This paper will describe the resulting sensitivity analysis tools that have been generated for potential use by the criticality safety community

  7. Uncertainty analysis of NDA waste measurements using computer simulations

    International Nuclear Information System (INIS)

    Uncertainty assessments for nondestructive radioassay (NDA) systems for nuclear waste are complicated by factors extraneous to the measurement systems themselves. Most notably, characteristics of the waste matrix (e.g., homogeneity) and radioactive source material (e.g., particle size distribution) can have great effects on measured mass values. Under these circumstances, characterizing the waste population is as important as understanding the measurement system in obtaining realistic uncertainty values. When extraneous waste characteristics affect measurement results, the uncertainty results are waste-type specific. The goal becomes to assess the expected bias and precision for the measurement of a randomly selected item from the waste population of interest. Standard propagation-of-errors methods for uncertainty analysis can be very difficult to implement in the presence of significant extraneous effects on the measurement system. An alternative approach that naturally includes the extraneous effects is as follows: (1) Draw a random sample of items from the population of interest; (2) Measure the items using the NDA system of interest; (3) Establish the true quantity being measured using a gold standard technique; and (4) Estimate bias by deriving a statistical regression model comparing the measurements on the system of interest to the gold standard values; similar regression techniques for modeling the standard deviation of the difference values gives the estimated precision. Actual implementation of this method is often impractical. For example, a true gold standard confirmation measurement may not exist. A more tractable implementation is obtained by developing numerical models for both the waste material and the measurement system. A random sample of simulated waste containers generated by the waste population model serves as input to the measurement system model. This approach has been developed and successfully applied to assessing the quantity of
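The simulation-based bias/precision procedure in steps (1)-(4) can be sketched with synthetic data standing in for the waste-population and measurement-system models (all values below are invented):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical simulated waste items: true source masses and modeled NDA
# measurements with a matrix-dependent multiplicative bias plus scatter
true_mass = rng.uniform(1.0, 10.0, size=200)
measured = 1.1 * true_mass + rng.normal(0.0, 0.3, size=200)

# Step (4): regress measurements on the known true values to estimate bias...
slope, intercept = np.polyfit(true_mass, measured, 1)
# ...and model the scatter about the bias line as the precision
residuals = measured - (slope * true_mass + intercept)
precision = residuals.std(ddof=2)
```

Because the "gold standard" values come from the waste-population model rather than a confirmation measurement, the resulting bias and precision are waste-type specific, as the abstract emphasizes.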

  8. Computations of uncertainty mediate acute stress responses in humans.

    Science.gov (United States)

    de Berker, Archy O; Rutledge, Robb B; Mathys, Christoph; Marshall, Louise; Cross, Gemma F; Dolan, Raymond J; Bestmann, Sven

    2016-01-01

    The effects of stress are frequently studied, yet its proximal causes remain unclear. Here we demonstrate that subjective estimates of uncertainty predict the dynamics of subjective and physiological stress responses. Subjects learned a probabilistic mapping between visual stimuli and electric shocks. Salivary cortisol confirmed that our stressor elicited changes in endocrine activity. Using a hierarchical Bayesian learning model, we quantified the relationship between the different forms of subjective task uncertainty and acute stress responses. Subjective stress, pupil diameter and skin conductance all tracked the evolution of irreducible uncertainty. We observed a coupling between emotional and somatic state, with subjective and physiological tuning to uncertainty tightly correlated. Furthermore, the uncertainty tuning of subjective and physiological stress predicted individual task performance, consistent with an adaptive role for stress in learning under uncertain threat. Our finding that stress responses are tuned to environmental uncertainty provides new insight into their generation and likely adaptive function. PMID:27020312
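The irreducible uncertainty tracked by the physiological measures in this study is, for a probabilistic shock outcome, just the entropy of the outcome distribution; a minimal sketch (the hierarchical Bayesian model itself is not reproduced here):

```python
import math

def irreducible_uncertainty(p):
    """Entropy (in bits) of a Bernoulli outcome with shock probability p:
    the uncertainty that remains even when p is known exactly.
    It is maximal at p = 0.5 and zero when the outcome is certain."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

# A stimulus with a 50 % shock probability is maximally uncertain on this account:
u_mid = irreducible_uncertainty(0.5)
u_sure = irreducible_uncertainty(1.0)
```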

  9. Efficient Quantification of Uncertainties in Complex Computer Code Results Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This proposal addresses methods for efficient quantification of margins and uncertainties (QMU) for models that couple multiple, large-scale commercial or...

  10. A best-estimate plus uncertainty type analysis for computing accurate critical channel power uncertainties

    International Nuclear Information System (INIS)

This paper provides a Critical Channel Power (CCP) uncertainty analysis methodology based on a Monte-Carlo approach. This Monte-Carlo method includes the identification of the sources of uncertainty and the development of error models for the characterization of epistemic and aleatory uncertainties associated with the CCP parameter. Furthermore, the proposed method provides a means to use actual operational data, leading to improvements over traditional methods (e.g., sensitivity analysis) that assume parametric models which may not accurately capture the possibly complex statistical structure of the system inputs and responses. (author)
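A minimal Monte-Carlo sketch of the idea: sample the aleatory and epistemic error models, push the samples through the model, and read off percentile bounds. The correlation and all distributions below are invented placeholders, not an actual CCP model:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Hypothetical error models: aleatory scatter on measured inputs plus an
# epistemic model-form factor (all numbers illustrative).
flow = rng.normal(24.0, 0.3, n)        # kg/s, aleatory measurement scatter
t_inlet = rng.normal(266.0, 1.0, n)    # deg C
model_bias = rng.normal(1.0, 0.02, n)  # epistemic model-form uncertainty

# Illustrative stand-in for a CCP correlation (not a real correlation).
ccp = model_bias * (0.35 * flow - 0.012 * (t_inlet - 260.0))  # MW

lo, hi = np.percentile(ccp, [2.5, 97.5])
print(f"CCP 95% interval: [{lo:.2f}, {hi:.2f}] MW")
```

The paper's point is that the sampled inputs can be drawn from actual operational data instead of the assumed normal distributions used here.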

  11. The Uncertainty Test for the MAAP Computer Code

    Energy Technology Data Exchange (ETDEWEB)

    Park, S. H.; Song, Y. M.; Park, S. Y.; Ahn, K. I.; Kim, K. R.; Lee, Y. J. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2008-10-15

After the Three Mile Island Unit 2 (TMI-2) and Chernobyl accidents, severe accident safety issues have been treated from various aspects. A major issue in our part of the research is the Level 2 PSA. The main difficulty in extending the Level 2 PSA to risk-informed activities is its uncertainty. In the past, emphasis was placed on improving the quality of the internal-events PSA, but the effort to reduce the phenomenological uncertainty in the Level 2 PSA has been insufficient. In our country, the degree of uncertainty in Level 2 PSA models is high, and it is necessary to secure a model that reduces this uncertainty. We do not yet have experience with uncertainty assessment technology; the assessment systems themselves depend on advanced countries. In advanced countries, severe accident simulators are implemented at the hardware level, whereas in our case basic functions can be implemented at the software level. Under these circumstances at home and abroad, similar instances such as UQM and MELCOR were surveyed. Based on these instances, the SAUNA (Severe Accident UNcertainty Analysis) system is being developed in our project to assess and reduce the uncertainty in a Level 2 PSA. The MAAP code was selected to analyze the uncertainty in a severe accident.

  12. Fuzzy randomness uncertainty in civil engineering and computational mechanics

    CERN Document Server

    Möller, Bernd

    2004-01-01

This book, for the first time, provides a coherent, overall concept for taking account of uncertainty in the analysis, the safety assessment, and the design of structures. The reader is introduced to the problem of uncertainty modeling and familiarized with particular uncertainty models. For simultaneously considering stochastic and non-stochastic uncertainty, the superordinated uncertainty model fuzzy randomness, which contains real-valued random variables as well as fuzzy variables as special cases, is presented. For this purpose, basic mathematical knowledge concerning fuzzy set theory and the theory of fuzzy random variables is imparted. The body of the book comprises the appropriate quantification of uncertain structural parameters, fuzzy and fuzzy probabilistic structural analysis, fuzzy probabilistic safety assessment, and fuzzy cluster structural design. The completely new algorithms are described in detail and illustrated by way of demonstrative examples.

  13. Fast Computation of Hemodynamic Sensitivity to Lumen Segmentation Uncertainty.

    Science.gov (United States)

    Sankaran, Sethuraman; Grady, Leo; Taylor, Charles A

    2015-12-01

    Patient-specific blood flow modeling combining imaging data and computational fluid dynamics can aid in the assessment of coronary artery disease. Accurate coronary segmentation and realistic physiologic modeling of boundary conditions are important steps to ensure a high diagnostic performance. Segmentation of the coronary arteries can be constructed by a combination of automated algorithms with human review and editing. However, blood pressure and flow are not impacted equally by different local sections of the coronary artery tree. Focusing human review and editing towards regions that will most affect the subsequent simulations can significantly accelerate the review process. We define geometric sensitivity as the standard deviation in hemodynamics-derived metrics due to uncertainty in lumen segmentation. We develop a machine learning framework for estimating the geometric sensitivity in real time. Features used include geometric and clinical variables, and reduced-order models. We develop an anisotropic kernel regression method for assessment of lumen narrowing score, which is used as a feature in the machine learning algorithm. A multi-resolution sensitivity algorithm is introduced to hierarchically refine regions of high sensitivity so that we can quantify sensitivities to a desired spatial resolution. We show that the mean absolute error of the machine learning algorithm compared to 3D simulations is less than 0.01. We further demonstrate that sensitivity is not predicted simply by anatomic reduction but also encodes information about hemodynamics which in turn depends on downstream boundary conditions. This sensitivity approach can be extended to other systems such as cerebral flow, electro-mechanical simulations, etc. PMID:26087484
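The kernel-regression ingredient can be sketched with a plain (isotropic) Nadaraya-Watson estimator; the paper's method is anisotropic and uses richer features, so the single feature and score below are invented placeholders:

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical training data: a geometric feature (e.g., a lumen narrowing
# measure in [0, 1]) paired with a sensitivity score from 3D simulations.
x_train = rng.uniform(0.0, 1.0, 40)
y_train = x_train ** 2 + rng.normal(0.0, 0.01, 40)

def predict(x, bandwidth=0.1):
    """Nadaraya-Watson kernel regression with a Gaussian kernel."""
    w = np.exp(-0.5 * ((x - x_train) / bandwidth) ** 2)
    return (w * y_train).sum() / w.sum()

print(f"predicted sensitivity at feature value 0.5: {predict(0.5):.3f}")
```

In the real framework this regressor would be one feature feeding a larger machine learning model rather than the predictor itself.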

  14. Binding in light nuclei: Statistical NN uncertainties vs Computational accuracy

    CERN Document Server

    Perez, R Navarro; Amaro, J E; Arriola, E Ruiz

    2016-01-01

We analyse the impact of the statistical uncertainties of the nucleon-nucleon interaction, based on the Granada-2013 np-pp database, on the binding energies of the triton and the alpha particle using a bootstrap method, by solving the Faddeev equations for $^3$H and the Yakubovsky equations for $^4$He, respectively. We check that in practice about 30 samples prove enough for a reliable error estimate. An extrapolation of the well-fulfilled Tjon-line correlation predicts the experimental binding of the alpha particle within uncertainties.
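The bootstrap mechanics can be illustrated with a one-parameter stand-in for the interaction: resample the parameter from its fit uncertainty, recompute the observable, and read off the spread. The actual analysis resamples the full Granada-2013 potential and solves the few-body equations; the functional form below is invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

def binding_energy(g):
    """Hypothetical smooth dependence of a binding energy (MeV) on one
    interaction parameter g; a placeholder for a Faddeev/Yakubovsky solve."""
    return -8.48 * g ** 1.5

# Draw ~30 bootstrap samples of the parameter from its (assumed Gaussian)
# statistical uncertainty; the paper finds ~30 samples already suffice.
g_samples = rng.normal(1.0, 0.01, size=30)
be = binding_energy(g_samples)
print(f"B = {be.mean():.3f} +/- {be.std(ddof=1):.3f} MeV")
```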

  15. Computing Statistics under Interval and Fuzzy Uncertainty Applications to Computer Science and Engineering

    CERN Document Server

    Nguyen, Hung T; Wu, Berlin; Xiang, Gang

    2012-01-01

In many practical situations, we are interested in statistics characterizing a population of objects: e.g. in the mean height of people from a certain area. Most algorithms for estimating such statistics assume that the sample values are exact. In practice, sample values come from measurements, and measurements are never absolutely accurate. Sometimes, we know the exact probability distribution of the measurement inaccuracy, but often, we only know the upper bound on this inaccuracy. In this case, we have interval uncertainty: e.g. if the measured value is 1.0, and the inaccuracy is bounded by 0.1, then the actual (unknown) value of the quantity can be anywhere between 1.0 - 0.1 = 0.9 and 1.0 + 0.1 = 1.1. In other cases, the values are expert estimates, and we only have fuzzy information about the estimation inaccuracy. This book shows how to compute statistics under such interval and fuzzy uncertainty. The resulting methods are applied to computer science (optimal scheduling of different processors), to in...
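For the simplest statistic, the mean, interval uncertainty propagates directly: averaging the interval endpoints gives the exact range of possible means. A minimal sketch using the 1.0 ± 0.1 style of bounds from the example above (the sample values are made up):

```python
# Each sample value x_i is only known to lie in [x_i - d_i, x_i + d_i].
# The sample mean then lies in the interval obtained by averaging endpoints.
samples = [(1.0, 0.1), (2.3, 0.2), (1.7, 0.1)]  # (measured value, error bound)

lo = sum(x - d for x, d in samples) / len(samples)
hi = sum(x + d for x, d in samples) / len(samples)
print(f"mean lies in [{lo:.4f}, {hi:.4f}]")
```

More complex statistics are harder: computing tight bounds on, say, the variance of interval data is substantially more difficult in general, which is a central theme of the book.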

  16. May Day: A computer code to perform uncertainty and sensitivity analysis. Manuals

    International Nuclear Information System (INIS)

The computer program May Day was developed to carry out uncertainty and sensitivity analysis in the evaluation of radioactive waste storage. May Day was developed at the Polytechnic University of Madrid. (Author)

  17. Computer simulations in room acoustics: concepts and uncertainties.

    Science.gov (United States)

    Vorländer, Michael

    2013-03-01

Geometrical acoustics is used as the standard model for room acoustic design and consulting. Research on room acoustic simulation focuses on more accurate modeling of propagation effects such as diffraction and other wave effects in rooms, and on scattering. Much progress has been made in this field, so that wave models (for example, the boundary element method and finite differences in the time domain) can now also be used for higher frequencies. The concepts and implementations of room simulation methods are briefly reviewed. Simulations in architectural acoustics are indeed powerful tools, but their reliability depends on the skill of the operator, who has to create an adequate polygon model and choose correct input data for boundary conditions such as absorption and scattering. Very little is known about the uncertainty of these input data. With the theory of error propagation of uncertainties, it can be shown that predicting reverberation times to an accuracy better than the just-noticeable difference requires input data of a quality that is not available from reverberation room measurements. PMID:23463991
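The error-propagation argument can be sketched with Sabine's formula, T60 = 0.161 V / A: when the volume and areas are exact, the relative uncertainty of T60 equals that of the total absorption A. The room, absorption coefficients, and input uncertainties below are illustrative, not measured data:

```python
import math

# Sabine: T60 = 0.161 * V / A, with total absorption A = sum(S_i * alpha_i).
# First-order propagation: u(T)/T = u(A)/A when V and the areas S_i are exact.
V = 5000.0                         # room volume, m^3
surfaces = [(1200.0, 0.30, 0.05),  # (area m^2, alpha, u(alpha))
            (800.0, 0.10, 0.03),
            (600.0, 0.60, 0.08)]

A = sum(s * a for s, a, _ in surfaces)
uA = math.sqrt(sum((s * ua) ** 2 for s, _, ua in surfaces))
T = 0.161 * V / A
uT = T * uA / A
print(f"T60 = {T:.2f} s +/- {uT:.2f} s ({100 * uT / T:.1f}%)")
```

With these (quite ordinary) absorption uncertainties the predicted reverberation time is uncertain by roughly 10%, well beyond a just-noticeable difference of about 5%, which is the abstract's point.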

  18. A computer simulation platform for the estimation of measurement uncertainties in dimensional X-ray computed tomography

    DEFF Research Database (Denmark)

    Hiller, Jochen; Reindl, Leonard M

    2012-01-01

The knowledge of measurement uncertainty is of great importance in conformance testing in production. The tolerance limit for production must be reduced by the amount of measurement uncertainty to ensure that the parts are in fact within tolerance. Over the last 5 years, industrial X-ray computed tomography (CT) has become an important technology for dimensional quality control. In this paper a computer simulation platform is presented which is able to investigate error sources in dimensional CT measurements. The typical workflow in industrial CT metrology is described and methods for estimating measurement uncertainties are briefly discussed. As we will show, the developed virtual CT (VCT) simulator can be adapted to various scanner systems, providing realistic CT data. Using the Monte Carlo method (MCM), measurement uncertainties for a given measuring task can be estimated, taking into...

  19. Dissertation Defense: Computational Fluid Dynamics Uncertainty Analysis for Payload Fairing Spacecraft Environmental Control Systems

    Science.gov (United States)

    Groves, Curtis Edward

    2014-01-01

Spacecraft thermal protection systems are at risk of being damaged due to airflow produced from Environmental Control Systems. There are inherent uncertainties and errors associated with using Computational Fluid Dynamics to predict the airflow field around a spacecraft from the Environmental Control System. This paper describes an approach to quantify the uncertainty in using Computational Fluid Dynamics to predict airflow speeds around an encapsulated spacecraft without the use of test data. Quantifying the uncertainty in analytical predictions is imperative to the success of any simulation-based product. The method could provide an alternative to the traditional "validation by test only" mentality. This method could be extended to other disciplines and has potential to provide uncertainty for any numerical simulation, thus lowering the cost of performing these verifications while increasing the confidence in those predictions. Spacecraft requirements can include a maximum airflow speed to protect delicate instruments during ground processing. Computational Fluid Dynamics can be used to verify these requirements; however, the model must be validated by test data. This research includes the following three objectives and methods. Objective one is to develop, model, and perform a Computational Fluid Dynamics analysis of three (3) generic, non-proprietary, environmental control systems and spacecraft configurations. Several commercially available and open source solvers have the capability to model the turbulent, highly three-dimensional, incompressible flow regime. The proposed method uses FLUENT, STARCCM+, and OPENFOAM. Objective two is to perform an uncertainty analysis of the Computational Fluid Dynamics model using the methodology found in "Comprehensive Approach to Verification and Validation of Computational Fluid Dynamics Simulations". This method requires three separate grids and solutions, which quantify the error bars around Computational Fluid Dynamics predictions

  1. A computational framework for uncertainty integration in stochastic unit commitment with intermittent renewable energy sources

    International Nuclear Information System (INIS)

Highlights: • A computational framework is proposed for uncertainty integration. • A new scenario generation method is proposed for renewable energy. • Prediction intervals are used to capture uncertainties of wind and solar power. • Load, wind, solar and generator outage uncertainties are integrated together. • Different generation costs and reserves are discussed for decision making. - Abstract: The penetration of intermittent renewable energy sources (IRESs) into power grids has increased in the last decade. Integration of wind farms and solar systems as the major IRESs has significantly boosted the level of uncertainty in the operation of power systems. This paper proposes a comprehensive computational framework for the quantification and integration of uncertainties in distributed power systems (DPSs) with IRESs. Different sources of uncertainty in DPSs, such as electrical load, wind and solar power forecasts and generator outages, are covered by the proposed framework. Load forecast uncertainty is assumed to follow a normal distribution. Wind and solar forecasts are represented by a list of prediction intervals (PIs) ranging from 5% to 95%. Their uncertainties are further represented as scenarios using a scenario generation method. Generator outage uncertainty is modeled as discrete scenarios. The integrated uncertainties are further incorporated into a stochastic security-constrained unit commitment (SCUC) problem, and a heuristic genetic algorithm is utilized to solve this stochastic SCUC problem. To demonstrate the effectiveness of the proposed method, five deterministic and four stochastic case studies are implemented. Generation costs as well as different reserve strategies are discussed from the perspectives of system economics and reliability. Comparative results indicate that the planned generation costs and reserves differ from the realized ones. The stochastic models show better robustness than deterministic ones. Power systems run a higher
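The uncertainty-integration step can be sketched as follows: a normal load forecast, wind scenarios drawn from prediction intervals, and Bernoulli outage scenarios combined into a net-load scenario set. All distributions, interval bounds, and unit sizes are illustrative, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(3)
n_scen = 200

# Load: normal distribution around its forecast (MW).
load = rng.normal(1000.0, 30.0, n_scen)

# Wind: prediction intervals at levels 5%..95%; a scenario picks a PI level
# at random and samples a power value inside its (illustrative) bounds.
pi_levels = np.arange(0.05, 1.0, 0.05)
wind_lower = 150.0 - 120.0 * pi_levels
wind_upper = 150.0 + 120.0 * pi_levels
idx = rng.integers(0, len(pi_levels), n_scen)
wind = rng.uniform(wind_lower[idx], wind_upper[idx])

# Generator outages: discrete scenarios with a 2% forced-outage rate,
# each outage removing a hypothetical 400 MW unit.
outage = rng.random(n_scen) < 0.02
net_load = load - wind + 400.0 * outage

print(f"net load: mean {net_load.mean():.0f} MW, "
      f"p95 {np.percentile(net_load, 95):.0f} MW")
```

In the paper these scenarios feed the stochastic SCUC problem; here they are only tabulated.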

  2. Estimation of measurement uncertainties in X-ray computed tomography metrology using the substitution method

    DEFF Research Database (Denmark)

    Müller, Pavel; Hiller, Jochen; Dai, Y.;

    2014-01-01

This paper presents the application of the substitution method for the estimation of measurement uncertainties using calibrated workpieces in X-ray computed tomography (CT) metrology. We have shown that this well-accepted method for uncertainty estimation using tactile coordinate measuring machines can be applied to dimensional CT measurements. The method is based on repeated measurements carried out on a calibrated master piece. The master piece is a component of a dose engine from an insulin pen. Measurement uncertainties estimated from the repeated measurements of the master piece were...

  3. Consideration of uncertainties in soil-structure interaction computations

    Energy Technology Data Exchange (ETDEWEB)

    Costantino, C.J.; Miller, C.A. [City Coll., New York, NY (United States). Earthquake Research Center

    1992-12-01

This report summarizes the results obtained in a study conducted to evaluate and quantify some important effects of soil-structure interaction on the seismic response of Category I facilities. The specific areas addressed in this study have to do with the following: The specification of criteria needed to develop reliable results when using the large computer codes in SSI studies; the development of recommendations for specification of control point location for generic soil sites needed to generate input motions to the SSI analyses; the development of specific criteria to allow the Staff to judge the adequacy of fixed base structural analyses; the development of expanded guidelines for inclusion of variability in soil properties in the SSI calculations; and the development of estimates of radiation damping inherent in the detailed numerical analyses performed for SSI evaluations of typical Category I structures.

  4. In depth uncertainty estimation of the neutron computational tools

    International Nuclear Information System (INIS)

Verification and validation processes are required to demonstrate the ability of simulation tools to accomplish the task they have been designed for. The values of the characteristics of both 3rd+ generation PWR cores and 4th generation fast neutron reactor cores should be associated with a validation dossier. The new APOLLO3® code, under development at CEA, has to undergo such a validation process, in particular when treating reactor cores with advanced features. In order to prepare this task, the way validation was done in the past for a code like ECCO is being revisited. The aim of this work is to illustrate the change in paradigm between what was done 25 years ago and what is currently done. Two ways to quantify the bias introduced in deterministic calculations exist: 1. Performing deterministic calculations in which some approximations are eliminated or reduced. 2. Comparing deterministic calculations with stochastic ones (TRIPOLI4). The first approach was the one used for ECCO, since it was feasible with the computers available at the time, but it is not sufficiently independent from the tool being tested. The method could identify the origin of discrepancies through the use of the perturbation method, so as to upgrade the method and/or track possible compensating errors. The second approach is often carried out by comparing integral results, such as reactivity or sodium void worth, obtained with deterministic codes against those of Monte Carlo codes using continuous-energy libraries. This approach somewhat lacks in-depth understanding of the origin of discrepancies, and hence is not very instructive for finding a way to improve the computational scheme. In this paper, perturbation theory is applied between ECCO/ERANOS deterministic results and TRIPOLI4 Monte Carlo results for the reactivity or sodium void worth of Sodium Fast Reactors (SFRs), so as to illustrate this innovative approach. (author)

  5. Modeling Multibody Systems with Uncertainties. Part I: Theoretical and Computational Aspects

    International Nuclear Information System (INIS)

    This study explores the use of generalized polynomial chaos theory for modeling complex nonlinear multibody dynamic systems in the presence of parametric and external uncertainty. The polynomial chaos framework has been chosen because it offers an efficient computational approach for the large, nonlinear multibody models of engineering systems of interest, where the number of uncertain parameters is relatively small, while the magnitude of uncertainties can be very large (e.g., vehicle-soil interaction). The proposed methodology allows the quantification of uncertainty distributions in both time and frequency domains, and enables the simulations of multibody systems to produce results with 'error bars'. The first part of this study presents the theoretical and computational aspects of the polynomial chaos methodology. Both unconstrained and constrained formulations of multibody dynamics are considered. Direct stochastic collocation is proposed as less expensive alternative to the traditional Galerkin approach. It is established that stochastic collocation is equivalent to a stochastic response surface approach. We show that multi-dimensional basis functions are constructed as tensor products of one-dimensional basis functions and discuss the treatment of polynomial and trigonometric nonlinearities. Parametric uncertainties are modeled by finite-support probability densities. Stochastic forcings are discretized using truncated Karhunen-Loeve expansions. The companion paper 'Modeling Multibody Dynamic Systems With Uncertainties. Part II: Numerical Applications' illustrates the use of the proposed methodology on a selected set of test problems. The overall conclusion is that despite its limitations, polynomial chaos is a powerful approach for the simulation of multibody systems with uncertainties
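Direct stochastic collocation for a single Gaussian-distributed parameter can be sketched with Gauss-Hermite quadrature: evaluate the model at the quadrature nodes and combine the outputs with the weights, with no random sampling. The one-line "model" below is a placeholder for a multibody solve, and all numbers are illustrative:

```python
import numpy as np

def model(k):
    """Placeholder response of a system with uncertain parameter k,
    standing in for an expensive multibody dynamics simulation."""
    return 1.0 / k

# Uncertain parameter k ~ N(mu, sigma^2).
mu, sigma = 4.0, 0.2

# Probabilists' Gauss-Hermite nodes/weights (weight function exp(-x^2/2)).
nodes, weights = np.polynomial.hermite_e.hermegauss(7)

k = mu + sigma * nodes          # map standard-normal nodes to k-space
y = model(k)                    # 7 deterministic model evaluations
w = weights / weights.sum()     # normalize to probability weights
mean = (w * y).sum()
var = (w * (y - mean) ** 2).sum()
print(f"mean = {mean:.5f}, std = {np.sqrt(var):.5f}")
```

Seven model runs here replace thousands of Monte Carlo samples, which is the efficiency argument the study makes for collocation over sampling when the number of uncertain parameters is small.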

  6. Computer-assisted uncertainty assessment of k0-NAA measurement results

    International Nuclear Information System (INIS)

In quantifying the measurement uncertainty of results obtained by the k0-based neutron activation analysis (k0-NAA), a number of parameters should be considered and appropriately combined in deriving the final budget. To facilitate this process, a program ERON (ERror propagatiON) was developed, which computes uncertainty propagation factors from the relevant formulae and calculates the combined uncertainty. The program calculates the uncertainty of the final result (the mass fraction of an element in the measured sample), taking into account the relevant neutron flux parameters such as α and f, including their uncertainties. Nuclear parameters and their uncertainties are taken from the IUPAC database (V.P. Kolotov and F. De Corte, Compilation of k0 and related data for NAA). Furthermore, the program allows for uncertainty calculations of the measured parameters needed in k0-NAA: α (determined with either the Cd-ratio or the Cd-covered multi-monitor method), f (using the Cd-ratio or the bare method), Q0 (using the Cd-ratio or internal comparator method) and k0 (using the Cd-ratio, internal comparator or the Cd-subtraction method). The results of calculations can be printed or exported to text or MS Excel format for further analysis. Special care was taken to make the calculation engine portable, with the possibility of incorporating it into other applications (e.g., DLL and WWW server). The theoretical basis and the program are described in detail, and typical results obtained under real measurement conditions are presented
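The combination step can be sketched as a quadrature sum of propagation factors times relative uncertainties, u_c² = Σ (Z_i u_i)². The parameter list, propagation factors, and percentages below are illustrative placeholders, not IUPAC data or ERON output:

```python
import math

# Combined relative uncertainty from relative input uncertainties u_i (%)
# and dimensionless propagation factors Z_i: u_c^2 = sum (Z_i * u_i)^2.
components = [  # (parameter, Z_i, u_i in %) -- all values illustrative
    ("peak area",     1.0, 1.2),
    ("k0 factor",     1.0, 0.8),
    ("flux ratio f",  0.3, 2.0),
    ("alpha",         0.1, 5.0),
    ("detector eff.", 1.0, 1.5),
]

u_c = math.sqrt(sum((z * u) ** 2 for _, z, u in components))
print(f"combined relative uncertainty: {u_c:.2f}%")
```

Note how the small propagation factors damp the large relative uncertainties of f and α, which is exactly the information the propagation factors computed by ERON convey.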

  7. PAPIRUS, a parallel computing framework for sensitivity analysis, uncertainty propagation, and estimation of parameter distribution

    Energy Technology Data Exchange (ETDEWEB)

    Heo, Jaeseok, E-mail: jheo@kaeri.re.kr; Kim, Kyung Doo, E-mail: kdkim@kaeri.re.kr

    2015-10-15

    Highlights: • We developed an interface between an engineering simulation code and statistical analysis software. • Multiple packages of the sensitivity analysis, uncertainty quantification, and parameter estimation algorithms are implemented in the framework. • Parallel computing algorithms are also implemented in the framework to solve multiple computational problems simultaneously. - Abstract: This paper introduces a statistical data analysis toolkit, PAPIRUS, designed to perform the model calibration, uncertainty propagation, Chi-square linearity test, and sensitivity analysis for both linear and nonlinear problems. The PAPIRUS was developed by implementing multiple packages of methodologies, and building an interface between an engineering simulation code and the statistical analysis algorithms. A parallel computing framework is implemented in the PAPIRUS with multiple computing resources and proper communications between the server and the clients of each processor. It was shown that even though a large amount of data is considered for the engineering calculation, the distributions of the model parameters and the calculation results can be quantified accurately with significant reductions in computational effort. A general description about the PAPIRUS with a graphical user interface is presented in Section 2. Sections 2.1–2.5 present the methodologies of data assimilation, uncertainty propagation, Chi-square linearity test, and sensitivity analysis implemented in the toolkit with some results obtained by each module of the software. Parallel computing algorithms adopted in the framework to solve multiple computational problems simultaneously are also summarized in the paper.

  9. Analysis of computational problems expressing the overall uncertainties: Photons, neutrons and electrons

    International Nuclear Information System (INIS)

    In the frame of the EU Coordination Action CONRAD (coordinated network for radiation dosimetry), WP4 was dedicated to work on computational dosimetry with an action entitled 'Uncertainty assessment in computational dosimetry: an intercomparison of approaches'. Participants attempted one or more of eight problems. This paper presents the results from problems 4-8 - dealing with the overall uncertainty budget estimate; a short overview of each problem is presented together with a discussion of the most significant results and conclusions. The scope of the problems discussed here are: the study of a 137Cs calibration irradiator; the manganese bath technique; the iron sphere experiment using neutron time-of-flight technique; the energy response of a RADFET detector and finally the sensitivity and uncertainty analysis for the recoil-proton telescope discussed in the companion paper. (authors)

  10. Efficient Methods for Bayesian Uncertainty Analysis and Global Optimization of Computationally Expensive Environmental Models

    Science.gov (United States)

    Shoemaker, Christine; Espinet, Antoine; Pang, Min

    2015-04-01

Models of complex environmental systems can be computationally expensive because they describe the dynamic interactions of many components over sizeable time periods. Diagnostics of these systems can include forward simulations of calibrated models under uncertainty and analysis of alternatives for systems management. This discussion focuses on applications of new surrogate optimization and uncertainty analysis methods to environmental models, which can enhance our ability to extract information and understanding. For complex models, optimization and especially uncertainty analysis can require a large number of model simulations, which is not feasible for computationally expensive models. Surrogate response surfaces can be used in global optimization and uncertainty methods to obtain accurate answers with far fewer model evaluations, which makes the methods practical for computationally expensive models for which conventional methods are not feasible. In this paper we discuss the application of the SOARS surrogate method for estimating Bayesian posterior density functions of model parameters for a TOUGH2 model of geologic carbon sequestration. We also briefly discuss a new parallel surrogate global optimization algorithm, applied to two groundwater remediation sites, that was implemented on a supercomputer with up to 64 processors. The applications illustrate the use of these methods to predict the impact of monitoring and management on subsurface contaminants.
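The surrogate idea can be illustrated with a Gaussian radial basis function fit to a cheap stand-in for an expensive simulation. SOARS itself uses more elaborate machinery (and TOUGH2 runs rather than a one-line function), so everything below is a simplified sketch:

```python
import numpy as np

def expensive_model(x):
    """Cheap stand-in for a costly simulation (e.g., one TOUGH2 run)."""
    return np.sin(3 * x) + 0.5 * x

# Fit a Gaussian radial basis function surrogate from 8 model evaluations.
x_train = np.linspace(0.0, 2.0, 8)
y_train = expensive_model(x_train)
ell = 0.4  # kernel length scale (assumed)
K = np.exp(-((x_train[:, None] - x_train[None, :]) / ell) ** 2)
coeff = np.linalg.solve(K + 1e-10 * np.eye(len(x_train)), y_train)

def surrogate(x):
    """Predict the model response without running the expensive model."""
    return np.exp(-((x - x_train) / ell) ** 2) @ coeff

x_test = 1.37
print(f"model: {expensive_model(x_test):.4f}  surrogate: {surrogate(x_test):.4f}")
```

Once fitted, the surrogate is essentially free to evaluate, so an optimizer or MCMC sampler can query it thousands of times while the expensive model is run only at a few carefully chosen points.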

  11. Measurement Uncertainty Evaluation in Dimensional X-ray Computed Tomography Using the Bootstrap Method

    DEFF Research Database (Denmark)

    Hiller, Jochen; Genta, Gianfranco; Barbato, Giulio;

    2014-01-01

Industrial applications of computed tomography (CT) for dimensional metrology on various components are increasing fast, owing to a number of favorable properties such as the capability of non-destructive internal measurements. Uncertainty evaluation is, however, more complex than in conventional measurement processes, e.g., with tactile systems, also due to factors related to systematic errors, mainly caused by specific CT image characteristics. In this paper we propose a simulation-based framework for measurement uncertainty evaluation in dimensional CT using the bootstrap method. In a case study the problem concerning measurement uncertainties was addressed with the bootstrap and successfully applied to ball-bar CT measurements. Results obtained enabled extension to more complex shapes such as actual industrial components, as we show by tests on a hollow cylinder workpiece.

  12. Risk assessment through drinking water pathway via uncertainty modeling of contaminant transport using soft computing

    International Nuclear Information System (INIS)

    The basic objective of an environmental impact assessment (EIA) is to build guidelines to reduce the associated risk or mitigate the consequences of a reactor accident at its source: to prevent deterministic health effects, and to reduce the risk of stochastic health effects (e.g., cancer and severe hereditary effects) as much as reasonably achievable by implementing protective actions in accordance with IAEA guidance (IAEA Safety Series No. 115, 1996). Since the measure of exposure is the basic tool for taking any appropriate decision related to risk reduction, EIA is traditionally expressed in terms of the radiation exposure to members of the public. However, the models used to estimate the exposure received by members of the public are governed by parameters, some of which are deterministic with relative uncertainty and some of which are stochastic as well as imprecise (insufficient knowledge). In a mixed environment of this type, it is essential to assess the uncertainty of a model in order to estimate the bounds of the exposure to the public and so inform decisions during a nuclear or radiological emergency. With this in view, a soft computing technique, evidence theory based assessment of model parameters, is applied to compute the risk or exposure to members of the public. A principal pathway of exposure to the public in the aquatic food stream is the drinking of water. Accordingly, this paper presents the uncertainty analysis of exposure via uncertainty analysis of the contaminated water. Evidence theory expresses the uncertainty in terms of a lower bound, the belief measure, and an upper bound of exposure, the plausibility measure. In this work EIA is presented using evidence theory. A data fusion technique is used to aggregate the knowledge on the uncertain information. Uncertainty of concentration and exposure is expressed as an interval bounded by belief and plausibility.
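
A minimal sketch of the belief/plausibility bounds that evidence theory yields, assuming hypothetical interval-valued focal elements and masses (a basic probability assignment):

```python
# Dempster-Shafer belief and plausibility for interval-valued evidence.
# Focal elements are intervals of contaminant concentration (hypothetical
# units) with basic probability assignments (masses) summing to 1.
focal = [((0.0, 2.0), 0.5),   # (interval, mass)
         ((1.0, 3.0), 0.3),
         ((0.5, 4.0), 0.2)]

def belief(query, focal):
    # Sum of masses of focal elements entirely inside the query interval.
    a, b = query
    return sum(m for (lo, hi), m in focal if a <= lo and hi <= b)

def plausibility(query, focal):
    # Sum of masses of focal elements that intersect the query interval.
    a, b = query
    return sum(m for (lo, hi), m in focal if hi >= a and lo <= b)

bel = belief((0.0, 2.5), focal)        # lower bound on P(concentration <= 2.5)
pl = plausibility((0.0, 2.5), focal)   # upper bound
```

Belief counts only evidence wholly inside the query set (the lower bound); plausibility counts all evidence consistent with it (the upper bound), so bel <= pl always, which is exactly the interval of belief and plausibility the abstract refers to.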

  13. A Novel Method for the Evaluation of Uncertainty in Dose-Volume Histogram Computation

    International Nuclear Information System (INIS)

    Purpose: Dose-volume histograms (DVHs) are a useful tool in state-of-the-art radiotherapy treatment planning, and it is essential to recognize their limitations. Even after a specific dose-calculation model is optimized, dose distributions computed by using treatment-planning systems are affected by several sources of uncertainty, such as algorithm limitations, measurement uncertainty in the data used to model the beam, and residual differences between measured and computed dose. This report presents a novel method to take them into account. Methods and Materials: To take into account the effect of associated uncertainties, a probabilistic approach using a new kind of histogram, a dose-expected volume histogram, is introduced. The expected value of the volume in the region of interest receiving an absorbed dose equal to or greater than a certain value is found by using the probability distribution of the dose at each point. A rectangular probability distribution is assumed for this point dose, and a formulation that accounts for uncertainties associated with point dose is presented for practical computations. Results: This method is applied to a set of DVHs for different regions of interest, including 6 brain patients, 8 lung patients, 8 pelvis patients, and 6 prostate patients planned for intensity-modulated radiation therapy. Conclusions: Results show a greater effect on planning target volume coverage than on organs at risk. In cases of steep DVH gradients, such as planning target volumes, this new method shows the largest differences from the corresponding DVH; thus, the effect of the uncertainty is larger
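
Under the rectangular (uniform) point-dose distribution described above, the dose-expected volume reduces to clipping a linear probability at each point and averaging. A toy sketch with hypothetical point doses:

```python
import numpy as np

def expected_volume_fraction(point_doses, D, delta):
    """Expected fraction of volume receiving >= D, assuming each point dose
    is uniformly distributed in [d - delta, d + delta] (rectangular pdf)."""
    p = np.clip((point_doses + delta - D) / (2.0 * delta), 0.0, 1.0)
    return p.mean()

# Hypothetical point doses (Gy) in a region of interest.
doses = np.array([58.0, 59.5, 60.0, 60.5, 61.0, 62.0])

v_plain = np.mean(doses >= 60.0)                     # conventional DVH value
v_exp = expected_volume_fraction(doses, 60.0, 1.0)   # dose-expected value
```

Where the dose gradient is steep relative to delta, many points have probabilities strictly between 0 and 1, so the expected-volume curve departs most from the conventional DVH, matching the conclusion quoted in the record.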

  14. Prediction and Uncertainty in Computational Modeling of Complex Phenomena: A Whitepaper

    Energy Technology Data Exchange (ETDEWEB)

    Trucano, T.G.

    1999-01-20

    This report summarizes some challenges associated with the use of computational science to predict the behavior of complex phenomena. As such, the document is a compendium of ideas that have been generated by various staff at Sandia. The report emphasizes key components of the use of computational science to predict complex phenomena, including computational complexity and correctness of implementations, the nature of the comparison with data, the importance of uncertainty quantification in comprehending what the prediction is telling us, and the role of risk in making and using computational predictions. Both broad and more narrowly focused technical recommendations for research are given. Several computational problems are summarized that help to illustrate the issues we have emphasized. The tone of the report is informal, with virtually no mathematics. However, we have attempted to provide a useful bibliography that would assist the interested reader in pursuing the content of this report in greater depth.

  15. Numerical Uncertainty Analysis for Computational Fluid Dynamics using Student T Distribution -- Application of CFD Uncertainty Analysis Compared to Exact Analytical Solution

    Science.gov (United States)

    Groves, Curtis E.; Ilie, Marcel; Shallhorn, Paul A.

    2014-01-01

    Computational Fluid Dynamics (CFD) is the standard numerical tool used by fluid dynamicists to estimate solutions to many problems in academia, government, and industry. CFD is known to have errors and uncertainties, and there is no universally adopted method to estimate such quantities. This paper describes an approach to estimate CFD uncertainties strictly numerically, using inputs and the Student-T distribution. The approach is compared to an exact analytical solution of fully developed, laminar flow between infinite, stationary plates. It is shown that treating all CFD input parameters as oscillatory uncertainty terms coupled with the Student-T distribution can encompass the exact solution.
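
The core of the approach, a Student-t confidence interval on a small sample of CFD results, can be sketched as follows; the sample values are hypothetical and the critical value is the standard two-sided 95% entry for 5 degrees of freedom.

```python
import math

# Hypothetical CFD pressure-drop results (Pa) from runs with perturbed inputs.
samples = [104.2, 103.8, 104.5, 104.0, 103.9, 104.3]

n = len(samples)
mean = sum(samples) / n
# Sample standard deviation (n - 1 divisor).
s = math.sqrt(sum((x - mean) ** 2 for x in samples) / (n - 1))

t95 = 2.571  # two-sided 95% Student-t critical value for n - 1 = 5 dof
half_width = t95 * s / math.sqrt(n)
interval = (mean - half_width, mean + half_width)
```

The Student-t factor inflates the interval to account for estimating the standard deviation from few runs, which is the usual situation when each CFD run is expensive.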

  16. Validation and uncertainty analysis of the Athlet thermal-hydraulic computer code

    International Nuclear Information System (INIS)

    The computer code ATHLET is being developed by GRS as an advanced best-estimate code for the simulation of breaks and transients in Pressurized Water Reactors (PWRs) and Boiling Water Reactors (BWRs), including beyond-design-basis accidents. A systematic validation of ATHLET is based on a well-balanced set of integral and separate effects tests, emphasizing the German combined Emergency Core Cooling (ECC) injection system. When best-estimate codes are used for predictions of reactor plant states during assumed accidents, quantification of the uncertainty in these calculations is highly desirable. A method for uncertainty and sensitivity evaluation has been developed by GRS in which the computational effort is independent of the number of uncertain parameters. (author)
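
The property that the computational effort is independent of the number of uncertain parameters is characteristic of order-statistics results such as Wilks' formula, on which the GRS method rests; a minimal sketch of the required sample size:

```python
import math

def wilks_sample_size(gamma=0.95, beta=0.95):
    """Smallest number of code runs n such that the largest of n random
    output samples bounds the gamma-quantile of the output with confidence
    beta (first-order, one-sided Wilks formula): 1 - gamma**n >= beta."""
    return math.ceil(math.log(1.0 - beta) / math.log(gamma))

n_95_95 = wilks_sample_size()  # the classic 95%/95% one-sided case
```

The classic one-sided 95%/95% case yields 59 code runs no matter how many input parameters are varied simultaneously, which is exactly the independence the abstract highlights.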

  17. Bayesian uncertainty analysis for complex physical systems modelled by computer simulators with applications to tipping points

    Science.gov (United States)

    Caiado, C. C. S.; Goldstein, M.

    2015-09-01

    In this paper we present and illustrate basic Bayesian techniques for the uncertainty analysis of complex physical systems modelled by computer simulators. We focus on emulation and history matching and also discuss the treatment of observational errors and structural discrepancies in time series. We exemplify such methods using a four-box model for the thermohaline circulation. We show how these methods may be applied to systems containing tipping points and how to treat possible discontinuities using multiple emulators.

  18. Hilbert's Second Problems and Uncertainty Computing, from HCP Logic's Point of View

    Institute of Scientific and Technical Information of China (English)

    James Kuodo Huang

    2006-01-01

    Hilbert's complete perfect (HCP) logic is introduced. Gödel's incompleteness theorem discloses the limits of logic. Huang's universal consistent theorem and relative consistent theorem extend the limits of logic. The proofs of these theorems are in 2-valued logic, but the completeness can be extended in the three-valued HCP logic. The author proposes HCP logic as a foundation for uncertainty computing as well.

  19. A novel method for the evaluation of uncertainty in dose volume histogram computation

    CERN Document Server

    Cutanda-Henriquez, Francisco

    2007-01-01

    Dose volume histograms are a useful tool in state-of-the-art radiotherapy planning, and it is essential to be aware of their limitations. Dose distributions computed by treatment planning systems are affected by several sources of uncertainty, such as algorithm limitations, measurement uncertainty in the data used to model the beam, and residual differences between measured and computed dose once the model is optimized. In order to take into account the effect of uncertainty, a probabilistic approach is proposed and a new kind of histogram, a dose-expected volume histogram, is introduced. The expected value of the volume in the region of interest receiving an absorbed dose equal to or greater than a certain value is found using the probability distribution of the dose at each point. A rectangular probability distribution is assumed for this point dose, and a relationship is given for practical computations. This method is applied to a set of dose volume histograms for different regions of interest for 6 brain pat...

  20. Deterministic methods for sensitivity and uncertainty analysis in large-scale computer models

    International Nuclear Information System (INIS)

    The fields of sensitivity and uncertainty analysis have traditionally been dominated by statistical techniques when large-scale modeling codes are being analyzed. These methods are able to estimate sensitivities, generate response surfaces, and estimate response probability distributions given the input parameter probability distributions. Because the statistical methods are computationally costly, they are usually applied only to problems with relatively small parameter sets. Deterministic methods, on the other hand, are very efficient and can handle large data sets, but generally require simpler models because of the considerable programming effort required for their implementation. The first part of this paper reports on the development and availability of two systems, GRESS and ADGEN, that make use of computer calculus compilers to automate the implementation of deterministic sensitivity analysis capability into existing computer models. This automation removes the traditional limitation of deterministic sensitivity methods. The second part of the paper describes a Deterministic Uncertainty Analysis (DUA) method that uses derivative information as a basis to propagate parameter probability distributions to obtain result probability distributions. The last part of the paper demonstrates the deterministic approach to sensitivity and uncertainty analysis as applied to a sample problem that models the flow of water through a borehole
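
The derivative-based propagation step of a DUA-style method can be sketched with first-order error propagation; here central finite differences stand in for the automatically generated derivatives that a system like GRESS/ADGEN would supply, and the two-parameter flow model is a toy assumption.

```python
import math

def toy_flow_model(k, h):
    # Toy stand-in for a borehole flow model: flux = conductivity * gradient.
    return k * h

def first_order_uncertainty(f, x, sigmas, eps=1e-6):
    """First-order, derivative-based uncertainty propagation:
    Var(y) ~= sum_i (df/dx_i)^2 * sigma_i^2."""
    y0 = f(*x)
    var = 0.0
    for i, sigma in enumerate(sigmas):
        xp, xm = list(x), list(x)
        xp[i] += eps
        xm[i] -= eps
        dfdx = (f(*xp) - f(*xm)) / (2.0 * eps)  # central finite difference
        var += (dfdx * sigma) ** 2
    return y0, math.sqrt(var)

y0, sigma_y = first_order_uncertainty(toy_flow_model, [2.0, 3.0], [0.1, 0.2])
```

The cost is one model evaluation per derivative rather than thousands of random samples, which is the efficiency argument the abstract makes for deterministic methods.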

  1. How does our ignorance of rainfall affect the uncertainty of hydrological computations?

    Science.gov (United States)

    Lebecherel, Laure; Andréassian, Vazken

    2014-05-01

    Precipitation is an essential input to hydrological studies: fundamental for water balance assessment, hydrological simulation and forecasting. Since precipitation can be spatially and temporally variable, the configuration of the raingauge network can have a major impact on the accuracy of hydrological computations. Hydrological good sense tells us that the less we know about catchment rainfall, the more uncertain our hydrological computations will be. Quantifying this trend, i.e. the sensitivity of our computations to the design of the rainfall measurement network, is essential in a context of increasing demands and decreasing funding. We keep hearing about the need to "rationalize" observation networks. However, this rationalization, which often means a reduction of network density, can degrade our knowledge of rainfall and, in particular, increase the uncertainty of hydrological computations. We evaluate here, on a large set of French catchments, the impact of rain gauge density and rain gauge network configuration on the uncertainty of several hydrological computations, based on the GR4J daily rainfall-runoff model [Perrin et al., 2003]. Four hydrological applications are considered: (i) daily runoff simulation, (ii) long-term average streamflow assessment, (iii) high-flow quantile assessment, and (iv) low-flow quantile assessment. Perrin, C., C. Michel, and V. Andréassian (2003), Improvement of a parsimonious model for streamflow simulation, Journal of Hydrology, 279(1-4), 275-289, doi: 10.1016/s0022-1694(03)00225-7.

  2. Flood risk assessment at the regional scale: Computational challenges and the monster of uncertainty

    Science.gov (United States)

    Efstratiadis, Andreas; Papalexiou, Simon-Michael; Markonis, Yiannis; Koukouvinos, Antonis; Vasiliades, Lampros; Papaioannou, George; Loukas, Athanasios

    2016-04-01

    We present a methodological framework for flood risk assessment at the regional scale, developed within the implementation of the EU Directive 2007/60 in Greece. This comprises three phases: (a) statistical analysis of extreme rainfall data, resulting in spatially-distributed parameters of intensity-duration-frequency (IDF) relationships and their confidence intervals, (b) hydrological simulations, using event-based semi-distributed rainfall-runoff approaches, and (c) hydraulic simulations, employing the propagation of flood hydrographs across the river network and the mapping of inundated areas. The flood risk assessment procedure is employed over the River Basin District of Thessaly, Greece, which requires schematization and modelling of hundreds of sub-catchments, each one examined for several risk scenarios. This is a challenging task, involving multiple computational issues to handle, such as the organization, control and processing of a huge amount of hydrometeorological and geographical data, the configuration of model inputs and outputs, and the co-operation of several software tools. In this context, we have developed supporting applications allowing massive data processing and effective model coupling, thus drastically reducing the need for manual interventions and, consequently, the time of the study. Within the flood risk computations we also account for three major sources of uncertainty, in an attempt to provide upper and lower confidence bounds of flood maps: (a) statistical uncertainty of IDF curves, (b) structural uncertainty of hydrological models, due to varying antecedent soil moisture conditions, and (c) parameter uncertainty of hydraulic models, with emphasis on roughness coefficients.
Our investigations indicate that the combined effect of the above uncertainties (which are certainly not the only ones) results in extremely large bounds of potential inundation, thus raising many questions about the interpretation and usefulness of current flood

  3. A sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory.

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, J. D. (Prostat, Mesa, AZ); Oberkampf, William Louis; Helton, Jon Craig (Arizona State University, Tempe, AZ); Storlie, Curtis B. (North Carolina State University, Raleigh, NC)

    2006-10-01

    Evidence theory provides an alternative to probability theory for the representation of epistemic uncertainty in model predictions that derives from epistemic uncertainty in model inputs, where the descriptor epistemic is used to indicate uncertainty that derives from a lack of knowledge with respect to the appropriate values to use for various inputs to the model. The potential benefit, and hence appeal, of evidence theory is that it allows a less restrictive specification of uncertainty than is possible within the axiomatic structure on which probability theory is based. Unfortunately, the propagation of an evidence theory representation for uncertainty through a model is more computationally demanding than the propagation of a probabilistic representation for uncertainty, with this difficulty constituting a serious obstacle to the use of evidence theory in the representation of uncertainty in predictions obtained from computationally intensive models. This presentation describes and illustrates a sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory. Preliminary trials indicate that the presented strategy can be used to propagate uncertainty representations based on evidence theory in analysis situations where naive sampling-based (i.e., unsophisticated Monte Carlo) procedures are impracticable due to computational cost.

  4. Anthropometric approaches and their uncertainties to assigning computational phantoms to individual patients in pediatric dosimetry studies

    Energy Technology Data Exchange (ETDEWEB)

    Whalen, Scott [Department of Nuclear and Radiological Engineering, University of Florida, Gainesville, FL 32611 (United States); Lee, Choonsik [Department of Nuclear and Radiological Engineering, University of Florida, Gainesville, FL 32611 (United States); Williams, Jonathan L [Department of Radiology, University of Florida, Gainesville, FL 32611 (United States); Bolch, Wesley E [Departments of Nuclear and Radiological and Biomedical Engineering, University of Florida, Gainesville, FL 32611 (United States)

    2008-01-21

    Current efforts to reconstruct organ doses in children undergoing diagnostic imaging or therapeutic interventions using ionizing radiation typically rely upon the use of reference anthropomorphic computational phantoms coupled to Monte Carlo radiation transport codes. These phantoms are generally matched to individual patients based upon nearest age or sometimes total body mass. In this study, we explore alternative methods of phantom-to-patient matching with the goal of identifying those methods which yield the lowest residual errors in internal organ volumes. Various thoracic and abdominal organs were segmented and organ volumes obtained from chest-abdominal-pelvic (CAP) computed tomography (CT) image sets from 38 pediatric patients ranging in age from 2 months to 15 years. The organs segmented included the skeleton, heart, kidneys, liver, lungs and spleen. For each organ, least-squares regression lines, 95th percentile confidence intervals and 95th percentile prediction intervals were established as a function of patient age, trunk volume, estimated trunk mass, trunk height, and three estimates of the ventral body cavity volume based on trunk height alone, or in combination with circumferential, width and/or breadth measurements in the mid-chest of the patient. When matching phantom to patient based upon age, residual uncertainties in organ volumes ranged from 53% (lungs) to 33% (kidneys), and when trunk mass was used (surrogate for total body mass as we did not have images of patient head, arms or legs), these uncertainties ranged from 56% (spleen) to 32% (liver). When trunk height is used as the matching parameter, residual uncertainties in organ volumes were reduced to between 21 and 29% for all organs except the spleen (40%). In the case of the lungs and skeleton, the two-fold reduction in organ volume uncertainties was seen in moving from patient age to trunk height, a parameter easily measured in the clinic. When ventral body cavity volumes were used
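
The regression-based matching described above amounts to fitting least-squares lines of organ volume against a candidate matching parameter and examining the residual spread. A sketch with hypothetical (not the study's) trunk-height and liver-volume data:

```python
import numpy as np

# Hypothetical pediatric data: trunk height (cm) vs. liver volume (cm^3).
trunk_height = np.array([30., 35., 40., 45., 50., 55., 60., 65.])
liver_volume = np.array([255., 415., 595., 755., 935., 1095., 1275., 1435.])

# Least-squares regression of organ volume on the matching parameter.
A = np.vstack([trunk_height, np.ones_like(trunk_height)]).T
(slope, intercept), *_ = np.linalg.lstsq(A, liver_volume, rcond=None)

# Residual spread quantifies the organ-volume uncertainty left after matching.
predicted = slope * trunk_height + intercept
residual_pct = 100.0 * np.abs(liver_volume - predicted) / liver_volume
```

Comparing the residual percentages across candidate parameters (age, trunk mass, trunk height, cavity volume) is how the study ranks matching strategies; smaller residuals mean a better phantom-to-patient match.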

  5. Anthropometric approaches and their uncertainties to assigning computational phantoms to individual patients in pediatric dosimetry studies

    International Nuclear Information System (INIS)

    Current efforts to reconstruct organ doses in children undergoing diagnostic imaging or therapeutic interventions using ionizing radiation typically rely upon the use of reference anthropomorphic computational phantoms coupled to Monte Carlo radiation transport codes. These phantoms are generally matched to individual patients based upon nearest age or sometimes total body mass. In this study, we explore alternative methods of phantom-to-patient matching with the goal of identifying those methods which yield the lowest residual errors in internal organ volumes. Various thoracic and abdominal organs were segmented and organ volumes obtained from chest-abdominal-pelvic (CAP) computed tomography (CT) image sets from 38 pediatric patients ranging in age from 2 months to 15 years. The organs segmented included the skeleton, heart, kidneys, liver, lungs and spleen. For each organ, least-squares regression lines, 95th percentile confidence intervals and 95th percentile prediction intervals were established as a function of patient age, trunk volume, estimated trunk mass, trunk height, and three estimates of the ventral body cavity volume based on trunk height alone, or in combination with circumferential, width and/or breadth measurements in the mid-chest of the patient. When matching phantom to patient based upon age, residual uncertainties in organ volumes ranged from 53% (lungs) to 33% (kidneys), and when trunk mass was used (surrogate for total body mass as we did not have images of patient head, arms or legs), these uncertainties ranged from 56% (spleen) to 32% (liver). When trunk height is used as the matching parameter, residual uncertainties in organ volumes were reduced to between 21 and 29% for all organs except the spleen (40%). In the case of the lungs and skeleton, the two-fold reduction in organ volume uncertainties was seen in moving from patient age to trunk height, a parameter easily measured in the clinic. When ventral body cavity volumes were used

  6. Anthropometric approaches and their uncertainties to assigning computational phantoms to individual patients in pediatric dosimetry studies

    Science.gov (United States)

    Whalen, Scott; Lee, Choonsik; Williams, Jonathan L.; Bolch, Wesley E.

    2008-01-01

    Current efforts to reconstruct organ doses in children undergoing diagnostic imaging or therapeutic interventions using ionizing radiation typically rely upon the use of reference anthropomorphic computational phantoms coupled to Monte Carlo radiation transport codes. These phantoms are generally matched to individual patients based upon nearest age or sometimes total body mass. In this study, we explore alternative methods of phantom-to-patient matching with the goal of identifying those methods which yield the lowest residual errors in internal organ volumes. Various thoracic and abdominal organs were segmented and organ volumes obtained from chest-abdominal-pelvic (CAP) computed tomography (CT) image sets from 38 pediatric patients ranging in age from 2 months to 15 years. The organs segmented included the skeleton, heart, kidneys, liver, lungs and spleen. For each organ, least-squares regression lines, 95th percentile confidence intervals and 95th percentile prediction intervals were established as a function of patient age, trunk volume, estimated trunk mass, trunk height, and three estimates of the ventral body cavity volume based on trunk height alone, or in combination with circumferential, width and/or breadth measurements in the mid-chest of the patient. When matching phantom to patient based upon age, residual uncertainties in organ volumes ranged from 53% (lungs) to 33% (kidneys), and when trunk mass was used (surrogate for total body mass as we did not have images of patient head, arms or legs), these uncertainties ranged from 56% (spleen) to 32% (liver). When trunk height is used as the matching parameter, residual uncertainties in organ volumes were reduced to between 21 and 29% for all organs except the spleen (40%). In the case of the lungs and skeleton, the two-fold reduction in organ volume uncertainties was seen in moving from patient age to trunk height, a parameter easily measured in the clinic. When ventral body cavity volumes were used

  7. Uncertainties in the interpretation of computer results on the behaviour of LMFBR fuel pins

    International Nuclear Information System (INIS)

    For about ten years, a large number of computer programs have been developed for the simulation of the behaviour of fuel pins in an LMFBR. The applications of these programs are manifold. They can be used to explain the results of the post-irradiation examination of test pins and to improve the understanding of the irradiation behaviour of these pins. They give valuable assistance in the preparation and optimal design of irradiation experiments. They are indispensable in the detailed analysis of the influence of individual material phenomena. Moreover, they can be used to analyse the influence of individual design parameters on the fuel pin behaviour, with the aim of optimizing pin performance and establishing appropriate specifications. Furthermore, they can be used to predict the behaviour of fuel pins in nuclear power reactors under scheduled operation and to analyse the consequences of off-normal operation, with the aim of identifying critical conditions, whether for reasons of safety or for the optimization of operation. A computer program can only serve these purposes if several conditions are fulfilled. Sufficient knowledge of all the data necessary for its application is one of these conditions. The experience gained with many computer runs has led to some critical considerations. It follows that the knowledge of those data is still insufficient in many cases, and large uncertainties have to be accepted in the interpretation of computed results. Some sources of uncertainties are demonstrated and their influence on the computed results is discussed. It is clear that, due to the complexity of the problem, only some selected examples can be treated. The computer calculation results presented have been provided by the computer programs SATURN and TEXZ, respectively

  8. Uncertainty analysis methods for quantification of source terms using a large computer code

    International Nuclear Information System (INIS)

    Quantification of uncertainties in the source term estimations by a large computer code, such as MELCOR and MAAP, is an essential part of current probabilistic safety assessments (PSAs). The main objectives of the present study are (1) to investigate the applicability of a combined procedure of the response surface method (RSM), based on input determined from a statistical design, and the Latin hypercube sampling (LHS) technique to the uncertainty analysis of CsI release fractions under a hypothetical severe accident sequence of a station blackout at the Young-Gwang nuclear power plant, using the MAAP3.0B code as a benchmark problem; and (2) to propose a new measure of uncertainty importance based on distributional sensitivity analysis. On the basis of the results obtained in the present work, the RSM is recommended as a principal tool for overall uncertainty analysis in source term quantification, while the LHS is used in the calculations of standardized regression coefficients (SRC) and standardized rank regression coefficients (SRRC) to determine the subset of the most important input parameters in the final screening step and to check the cumulative distribution functions (cdfs) obtained by the RSM. Verification of the response surface model for sufficient accuracy is a prerequisite for the reliability of the final results obtained by the combined procedure proposed in the present work. In the present study a new measure has been developed that utilizes the metric distance obtained from cumulative distribution functions (cdfs). The measure has been evaluated for three different cases of distributions in order to assess its characteristics: in the first two cases the distribution is known analytically, while in the third it is unknown. The first case uses symmetric analytical distributions.
The second case consists of two asymmetric distributions whose skewness is non-zero
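
The Latin hypercube sampling (LHS) step used alongside the response surface method can be sketched directly: each dimension is split into n equal-probability strata, one sample is drawn per stratum, and the strata are shuffled independently per dimension.

```python
import numpy as np

rng = np.random.default_rng(7)

def latin_hypercube(n, d):
    """n samples in d dimensions on [0, 1): one sample per equal-probability
    stratum in every dimension, with strata randomly permuted per dimension."""
    cut = (np.arange(n) + rng.random((d, n))) / n   # jittered strata, shape (d, n)
    for row in cut:
        rng.shuffle(row)                            # decouple the dimensions
    return cut.T                                    # shape (n, d)

samples = latin_hypercube(10, 3)

# Stratification check: each dimension has exactly one sample per decile.
counts = [np.bincount((samples[:, j] * 10).astype(int), minlength=10)
          for j in range(3)]
```

Because every input's marginal distribution is covered evenly even for small n, LHS gives more stable regression coefficients (SRC/SRRC) per code run than simple Monte Carlo, which is why it pairs well with expensive codes like MAAP.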

  9. Deterministic sensitivity and uncertainty analysis for large-scale computer models

    International Nuclear Information System (INIS)

    The fields of sensitivity and uncertainty analysis have traditionally been dominated by statistical techniques when large-scale modeling codes are being analyzed. These methods are able to estimate sensitivities, generate response surfaces, and estimate response probability distributions given the input parameter probability distributions. Because the statistical methods are computationally costly, they are usually applied only to problems with relatively small parameter sets. Deterministic methods, on the other hand, are very efficient and can handle large data sets, but generally require simpler models because of the considerable programming effort required for their implementation. The first part of this paper reports on the development and availability of two systems, GRESS and ADGEN, that make use of computer calculus compilers to automate the implementation of deterministic sensitivity analysis capability into existing computer models. This automation removes the traditional limitation of deterministic sensitivity methods. The second part of the paper describes a deterministic uncertainty analysis (DUA) method that uses derivative information as a basis to propagate parameter probability distributions to obtain result probability distributions. The method is applicable to low-level radioactive waste disposal system performance assessment

  10. Uncertainty quantification based on pillars of experiment, theory, and computation. Part I: Data analysis

    Science.gov (United States)

    Elishakoff, I.; Sarlin, N.

    2016-06-01

    In this paper we provide a general methodology for the analysis and design of systems involving uncertainties. Available experimental data are enclosed by geometric figures (triangle, rectangle, ellipse, parallelogram, super ellipse) of minimum area. These areas are then inflated, resorting to the Chebyshev inequality, in order to take into account the forecasted data. The next step consists of evaluating the response of the system when uncertainties are confined to one of the above five suitably inflated geometric figures. This step involves a combined theoretical and computational analysis. We evaluate the maximum response of the system subjected to variation of uncertain parameters in each hypothesized region. The results of the triangular, interval, ellipsoidal, parallelogram, and super ellipsoidal calculi are compared with the view of identifying the region that leads to the minimum of the maximum response. That response is identified as a result of the suggested predictive inference. The methodology thus synthesizes the probabilistic notion with each of the five calculi. Using the term "pillar" in the title was inspired by the News Release (2013) on according the Honda Prize to J. Tinsley Oden, stating, among others, that "Dr. Oden refers to computational science as the 'third pillar' of scientific inquiry, standing beside theoretical and experimental science. Computational science serves as a new paradigm for acquiring knowledge and informing decisions important to humankind". Analysis of systems with uncertainties necessitates employment of all three pillars. The analysis is based on the assumption that the five shapes are each different conservative estimates of the true bounding region. The smallest of the maximal displacements in x and y directions (for a 2D system) therefore provides the closest estimate of the true displacements based on the above assumption.
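
The Chebyshev inflation step mentioned above is distribution-free: to guarantee coverage of at least a fraction p of the probability mass, an interval around the mean must be inflated to k = 1/sqrt(1 - p) standard deviations. A sketch (numbers hypothetical):

```python
import math

def chebyshev_interval(mean, std, coverage=0.95):
    """Distribution-free interval containing at least `coverage` of the
    probability mass by Chebyshev's inequality:
    P(|X - mean| >= k*std) <= 1/k**2, so take k = 1/sqrt(1 - coverage)."""
    k = 1.0 / math.sqrt(1.0 - coverage)
    return mean - k * std, mean + k * std

lo, hi = chebyshev_interval(10.0, 0.5, 0.95)  # k = sqrt(20) ~ 4.47
```

The bound holds for any distribution with the given mean and standard deviation, which is why it suits forecasted data whose distribution is unknown; the price is a much wider interval than, say, the Gaussian k = 1.96.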

  11. Evaluating Statistical Process Control (SPC) techniques and computing the uncertainty of force calibrations

    Science.gov (United States)

    Navard, Sharon E.

    1989-01-01

    In recent years there has been a push within NASA to use statistical techniques to improve the quality of production. Two areas where statistics are used are in establishing product and process quality control of flight hardware and in evaluating the uncertainty of instrument calibration. The Flight Systems Quality Engineering branch is responsible for developing and assuring the quality of all flight hardware; the statistical process control methods it employs are reviewed and evaluated. The Measurement Standards and Calibration Laboratory performs the calibration of all instruments used on-site at JSC as well as those used by all off-site contractors. These calibrations must be performed so as to be traceable to the national standards maintained by the National Institute of Standards and Technology, and they must meet a four-to-one ratio of instrument specification to calibrating-standard uncertainty. In some instances this ratio is not met, and in these cases it is desirable to compute the exact uncertainty of the calibration and determine ways of reducing it. A particular example where this problem is encountered is a machine that performs automatic force calibrations. The process of force calibration using the United Force Machine is described in detail. The sources of error are identified and quantified where possible. Suggestions for improvement are made.

  12. Uncertainty studies of real anode surface area in computational analysis for molten salt electrorefining

    International Nuclear Information System (INIS)

    Highlights: → Numerical electrochemo-fluid modeling of pyrochemical electrorefining, cross-compared with 2D and 3D analysis models. → Benchmark study on the cell potential of molten LiCl-KCl electrorefining with the Mark-IV electrorefiner containing EBR-II spent fuel. → Determination of a real anode surface area profile model governing electrorefining performance. → Identification of uncertainty factors in electrorefining causing disagreements between simulation and experiment. → Fully transient performance analysis of 80 hours of Mark-IV electrorefining with a multi-species multi-reaction 1D model. - Abstract: This study examines how much the cell potential changes under five differently assumed real anode surface area cases. Determining the real anode surface area is a significant issue to be resolved for precisely modeling molten salt electrorefining. Based on a three-dimensional electrorefining model, calculated cell potentials are compared with the experimental cell potential variation over 80 h of operation of the Mark-IV electrorefiner with driver fuel from the Experimental Breeder Reactor II. We succeeded in achieving good agreement with the overall trend of the experimental data through appropriate selection of a model for the real anode surface area, but local inconsistencies between theoretical calculation and experimental observation remain. In addition, the results were validated and compared with two-dimensional results to identify possible uncertainty factors that had to be further considered in a computational electrorefining analysis. These uncertainty factors include material properties, heterogeneous material distribution, surface roughness, and current efficiency. Zirconium's abundance and complex behavior have more impact on uncertainty towards the latter period of electrorefining for a given batch of fuel. The benchmark results found that anode materials would be dissolved from both axial and radial directions, at least for low burn-up metallic fuels after active

  13. NASTRAN variance analysis and plotting of HBDY elements. [analysis of uncertainties of the computer results as a function of uncertainties in the input data

    Science.gov (United States)

    Harder, R. L.

    1974-01-01

    The NASTRAN Thermal Analyzer has been extended to perform variance analysis and to plot the thermal boundary (HBDY) elements. The objective of the variance analysis addition is to assess the sensitivity of temperature variances resulting from uncertainties inherent in the input parameters for heat conduction analysis. The plotting capability provides the ability to check the geometry (location, size and orientation) of the boundary elements of a model in relation to the conduction elements. Variance analysis is the study of uncertainties in the computed results as a function of uncertainties in the input data. To study this problem using NASTRAN, one solution is made for the expected values of all inputs, plus one additional solution for each uncertain variable. A variance analysis module subtracts the results to form derivatives, and then determines the expected deviations of the output quantities.
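    The perturb-and-subtract procedure described in this record amounts to first-order variance propagation: one nominal run plus one perturbed run per uncertain input, with the output variance accumulated from the squared derivative-times-sigma terms. A hedged sketch follows; the `temp` model and all numbers are invented for illustration, not taken from NASTRAN.

```python
def variance_analysis(model, nominal, sigmas, rel_step=1e-6):
    """First-order variance propagation in the perturb-and-subtract spirit:
    one nominal solution plus one perturbed solution per uncertain input.
    Returns (nominal output, standard deviation of the output)."""
    f0 = model(nominal)
    var = 0.0
    for i, (x, s) in enumerate(zip(nominal, sigmas)):
        step = rel_step * (abs(x) or 1.0)
        xp = list(nominal)
        xp[i] = x + step
        deriv = (model(xp) - f0) / step   # finite-difference sensitivity
        var += (deriv * s) ** 2
    return f0, var ** 0.5

# hypothetical steady-state node temperature: T = Q/(h*A) + T_inf, A = 0.5
def temp(p):
    Q, h, T_inf = p
    return Q / (h * 0.5) + T_inf

T0, sigma_T = variance_analysis(temp, [100.0, 25.0, 20.0], [5.0, 2.0, 1.0])
```

For these made-up inputs the analytic propagation gives sigma_T = sqrt(0.16 + 0.4096 + 1.0), which the finite-difference version reproduces to several digits.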

  14. Π4U: A high performance computing framework for Bayesian uncertainty quantification of complex models

    Energy Technology Data Exchange (ETDEWEB)

    Hadjidoukas, P.E.; Angelikopoulos, P. [Computational Science and Engineering Laboratory, ETH Zürich, CH-8092 (Switzerland); Papadimitriou, C. [Department of Mechanical Engineering, University of Thessaly, GR-38334 Volos (Greece); Koumoutsakos, P., E-mail: petros@ethz.ch [Computational Science and Engineering Laboratory, ETH Zürich, CH-8092 (Switzerland)

    2015-03-01

    We present Π4U, an extensible framework for non-intrusive Bayesian Uncertainty Quantification and Propagation (UQ+P) of complex and computationally demanding physical models that can exploit massively parallel computer architectures. The framework incorporates Laplace asymptotic approximations as well as stochastic algorithms, along with distributed numerical differentiation and task-based parallelism for heterogeneous clusters. Sampling is based on the Transitional Markov Chain Monte Carlo (TMCMC) algorithm and its variants. The optimization tasks associated with the asymptotic approximations are treated via the Covariance Matrix Adaptation Evolution Strategy (CMA-ES). A modified subset simulation method is used for posterior reliability measurements of rare events. The framework accommodates scheduling of multiple physical model evaluations based on an adaptive load balancing library and shows excellent scalability. In addition to the software framework, we also provide guidelines as to the applicability and efficiency of Bayesian tools when applied to computationally demanding physical models. Theoretical and computational developments are demonstrated with applications drawn from molecular dynamics, structural dynamics and granular flow.

  15. Π4U: A high performance computing framework for Bayesian uncertainty quantification of complex models

    International Nuclear Information System (INIS)

    We present Π4U, an extensible framework for non-intrusive Bayesian Uncertainty Quantification and Propagation (UQ+P) of complex and computationally demanding physical models that can exploit massively parallel computer architectures. The framework incorporates Laplace asymptotic approximations as well as stochastic algorithms, along with distributed numerical differentiation and task-based parallelism for heterogeneous clusters. Sampling is based on the Transitional Markov Chain Monte Carlo (TMCMC) algorithm and its variants. The optimization tasks associated with the asymptotic approximations are treated via the Covariance Matrix Adaptation Evolution Strategy (CMA-ES). A modified subset simulation method is used for posterior reliability measurements of rare events. The framework accommodates scheduling of multiple physical model evaluations based on an adaptive load balancing library and shows excellent scalability. In addition to the software framework, we also provide guidelines as to the applicability and efficiency of Bayesian tools when applied to computationally demanding physical models. Theoretical and computational developments are demonstrated with applications drawn from molecular dynamics, structural dynamics and granular flow

  16. Π4U: A high performance computing framework for Bayesian uncertainty quantification of complex models

    Science.gov (United States)

    Hadjidoukas, P. E.; Angelikopoulos, P.; Papadimitriou, C.; Koumoutsakos, P.

    2015-03-01

    We present Π4U, an extensible framework for non-intrusive Bayesian Uncertainty Quantification and Propagation (UQ+P) of complex and computationally demanding physical models that can exploit massively parallel computer architectures. The framework incorporates Laplace asymptotic approximations as well as stochastic algorithms, along with distributed numerical differentiation and task-based parallelism for heterogeneous clusters. Sampling is based on the Transitional Markov Chain Monte Carlo (TMCMC) algorithm and its variants. The optimization tasks associated with the asymptotic approximations are treated via the Covariance Matrix Adaptation Evolution Strategy (CMA-ES). A modified subset simulation method is used for posterior reliability measurements of rare events. The framework accommodates scheduling of multiple physical model evaluations based on an adaptive load balancing library and shows excellent scalability. In addition to the software framework, we also provide guidelines as to the applicability and efficiency of Bayesian tools when applied to computationally demanding physical models. Theoretical and computational developments are demonstrated with applications drawn from molecular dynamics, structural dynamics and granular flow.
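    A miniature sketch of the TMCMC sampler named in these records may help fix ideas. This is not the Π4U implementation: it uses a fixed five-stage tempering schedule, a single random-walk Metropolis refresh per stage, and omits the prior density from the acceptance ratio (harmless here only because the toy prior is flat over a wide interval).

```python
import math, random

def tmcmc(log_likelihood, prior_sample, n=500, stages=5, seed=1):
    """Minimal Transitional MCMC sketch: anneal from the prior (beta = 0)
    to the posterior (beta = 1), reweighting and resampling the population
    at each stage, then refreshing it with one Metropolis step."""
    rng = random.Random(seed)
    samples = [prior_sample(rng) for _ in range(n)]
    beta_prev = 0.0
    for s in range(1, stages + 1):
        beta = s / stages
        # plausibility weights for the tempering increment
        logw = [(beta - beta_prev) * log_likelihood(x) for x in samples]
        m = max(logw)
        samples = rng.choices(samples,
                              weights=[math.exp(v - m) for v in logw], k=n)
        # Metropolis refresh at the current tempered target
        refreshed = []
        for x in samples:
            cand = x + rng.gauss(0.0, 0.2)
            logr = beta * (log_likelihood(cand) - log_likelihood(x))
            refreshed.append(cand if math.log(rng.random()) < logr else x)
        samples = refreshed
        beta_prev = beta
    return samples

# toy 1-D problem: standard-normal log-likelihood, broad uniform prior
post = tmcmc(lambda x: -0.5 * x * x, lambda rng: rng.uniform(-5.0, 5.0))
mean = sum(post) / len(post)
var = sum((x - mean) ** 2 for x in post) / len(post)
```

Each stage reweights the population by the likelihood raised to the tempering increment, resamples, and perturbs, so the population drifts from prior draws toward posterior samples.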

  17. UNCERTAINTY MANAGEMENT IN SEISMIC VULNERABILITY ASSESSMENT USING GRANULAR COMPUTING BASED ON COVERING OF UNIVERSE

    Directory of Open Access Journals (Sweden)

    F. Khamespanah

    2013-05-01

    Granular computing (GrC) concentrates on a general theory and methodology for problem solving and information processing by assuming multiple levels of granularity. The basic elements in granular computing are subsets, classes, and clusters of a universe, called granules. In this research, GrC is used to extract classification rules for seismic vulnerability with minimum entropy, in order to handle the uncertainty in earthquake data. Tehran was selected as the study area. In our previous research, a granular computing model based on a partition of the universe was employed. That model has limitations in defining similarity between elements of the universe and in defining granules: similarity between elements is defined by an equivalence relation, under which two objects are similar with respect to some attributes only if, for each attribute, the values of those objects are equal. In this research, a general relation for defining similarity between elements of the universe is proposed, and granulation is performed using a covering of the universe instead of a partition. As a result of the study, a physical seismic vulnerability map of Tehran has been produced using the covering-based granular computing model, and its accuracy evaluated. A comparison between this model and the partition-based granular computing model verified the superiority of GrC based on a covering of the universe in terms of the match between the achieved results and those confirmed by the related experts' judgments.

  18. Analysis of the CONRAD computational problems expressing only stochastic uncertainties: Neutrons and protons

    International Nuclear Information System (INIS)

    Within the scope of CONRAD (A Coordinated Action for Radiation Dosimetry), Work Package 4 (WP4) on Computational Dosimetry collaborated with the other research actions on internal dosimetry, complex mixed radiation fields at workplaces, and medical staff dosimetry. Besides these collaborative actions, WP4 promoted an international comparison on eight problems with their associated experimental data. A first set of three problems, whose results are summarised here, dealt only with the expression of the stochastic uncertainties of the results: the analysis of the response function of a proton recoil telescope detector, the study of a Bonner sphere neutron spectrometer, and the analysis of the neutron spectrum and the dosimetric quantity Hp(10) in a thermal neutron facility operated by IRSN Cadarache (the SIGMA facility). A second paper will summarise the results of the other five problems, which dealt with the full uncertainty budget estimate. A third paper will present the results of a comparison on in vivo measurements of the 241Am bone-seeking nuclide distributed in the knee. All detailed papers will be presented in the WP4 Final Workshop Proceedings. (authors)

  19. Dynamic rating curve assessment for hydrometric stations computation of the associated uncertainties: Quality and station management indicators

    Science.gov (United States)

    Morlot, Thomas; Perret, Christian; Favre, Anne-Catherine; Despax, Aurélien; Hauet, Alexandre; Sevrez, Damien; Belleville, Arnaud

    2015-04-01

    Whether for safety reasons, energy production or regulation, water resources management is one of the main concerns of EDF (the French hydropower company). To meet these needs, EDF-DTG has operated since the fifties a hydrometric network that includes more than 350 hydrometric stations. The data collected allow real-time monitoring of rivers (hydro-meteorological forecasts at points of interest), as well as hydrological studies and the sizing of structures. Ensuring the quality of streamflow data is a priority. A rating curve is used to indirectly estimate the discharge in rivers based on water level measurements. The discharge values obtained from a rating curve include uncertainties related to the direct stage-discharge measurements (gaugings) used to build the curves, the quality of fit of the curve to these measurements, and the constant changes in river bed morphology. Moreover, the uncertainty of discharges estimated from a rating curve increases with the "age" of the rating curve, so the level of uncertainty at a given point in time is particularly difficult to assess. In addition, the current capacity to produce a rating curve is not suited to the frequency of change of the stage-discharge relationship: the present method does not take into consideration the variation of the flow conditions and the modifications of the river bed which occur due to natural processes such as erosion, sedimentation and seasonal vegetation growth. A "dynamic" method has therefore been developed to compute rating curves while calculating the associated uncertainties, thus making it possible to regenerate streamflow data with uncertainty estimates. The method is based on historical gaugings at hydrometric stations. A rating curve is computed for each gauging and a model of the uncertainty is fitted for each of them.
The model of uncertainty takes into account the uncertainties in the measurement of the water level, the quality of fit of the curve, the uncertainty of gaugings and the

  20. Dose computation in conformal radiation therapy including geometric uncertainties: Methods and clinical implications

    Science.gov (United States)

    Rosu, Mihaela

    The aim of any radiotherapy is to tailor the tumoricidal radiation dose to the target volume and to deliver as little radiation dose as possible to all other normal tissues. However, the motion and deformation induced in human tissue by ventilatory motion is a major issue, as standard practice usually uses only one computed tomography (CT) scan (and hence one instance of the patient's anatomy) for treatment planning. The intrafraction movement that occurs due to physiological processes over time scales shorter than the delivery of one treatment fraction leads to differences between the planned and delivered dose distributions. Because of the influence of these differences on tumors and normal tissues, the tumor control probabilities and normal tissue complication probabilities are likely to be affected by organ motion. In this thesis we apply several methods to compute dose distributions that include the effects of the treatment geometric uncertainties by using time-varying anatomical information as an alternative to the conventional Planning Target Volume (PTV) approach. The proposed methods depend on the model used to describe the patient's anatomy. The dose and fluence convolution approaches for rigid organ motion are discussed first, with application to liver tumors and to the rigid component of lung tumor movements. For non-rigid behavior, a dose reconstruction method that allows the accumulation of the dose to the deforming anatomy is introduced and applied to lung tumor treatments. Furthermore, we apply the cumulative dose approach to investigate how much information regarding the deforming patient anatomy is needed at the time of treatment planning for tumors located in the thorax. The results are evaluated from a clinical perspective.
All dose calculations are performed using a Monte Carlo based algorithm to ensure more realistic and more accurate handling of tissue heterogeneities, which is of particular importance in lung cancer treatment planning.
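    The dose/fluence convolution idea for rigid motion mentioned above reduces, in one dimension, to convolving the static dose profile with the probability distribution of the displacement. A toy sketch with an invented beam profile and motion distribution:

```python
def convolve_dose(dose, pdf):
    """Expected (blurred) dose under rigid motion: discrete convolution of
    a 1-D static dose profile with the motion probability mass function.
    `pdf` maps integer voxel shifts to probabilities summing to 1."""
    out = []
    n = len(dose)
    for i in range(n):
        d = 0.0
        for shift, p in pdf.items():
            j = i - shift
            if 0 <= j < n:          # dose outside the grid treated as zero
                d += p * dose[j]
        out.append(d)
    return out

# hypothetical 1-D beam profile and a breathing-motion PMF over voxel shifts
static = [0, 0, 10, 10, 10, 10, 0, 0]
motion = {-1: 0.25, 0: 0.5, 1: 0.25}
blurred = convolve_dose(static, motion)
```

The blurred profile keeps the total integral dose but smears the field edges, which is exactly why static plans over- or underestimate dose near penumbrae when motion is ignored.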

  1. A Bootstrap Approach to Computing Uncertainty in Inferred Oil and Gas Reserve Estimates

    International Nuclear Information System (INIS)

    This study develops confidence intervals for estimates of inferred oil and gas reserves based on bootstrap procedures. Inferred reserves are expected additions to proved reserves in previously discovered conventional oil and gas fields. Estimates of inferred reserves accounted for 65% of the total oil and 34% of the total gas assessed in the U.S. Geological Survey's 1995 National Assessment of oil and gas in US onshore and State offshore areas. When the same computational methods used in the 1995 Assessment are applied to more recent data, the 80-year (from 1997 through 2076) inferred reserve estimates for pre-1997 discoveries located in the lower 48 onshore and state offshore areas amounted to a total of 39.7 billion barrels of oil (BBO) and 293 trillion cubic feet (TCF) of gas. The 90% confidence interval about the oil estimate derived from the bootstrap approach is 22.4 BBO to 69.5 BBO. The comparable 90% confidence interval for the inferred gas reserve estimate is 217 TCF to 413 TCF. The 90% confidence interval describes the uncertainty that should be attached to the estimates. It also provides a basis for developing scenarios to explore the implications for energy policy analysis
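    The bootstrap idea behind these intervals is straightforward to sketch: resample the data with replacement, recompute the estimate many times, and read the confidence limits off the percentiles of the replicates. The data below are invented stand-ins, not USGS assessment values.

```python
import random

def bootstrap_ci(data, estimator, level=0.90, n_boot=2000, seed=7):
    """Percentile-bootstrap confidence interval: resample with replacement,
    recompute the estimate, and take the tail quantiles of the replicates."""
    rng = random.Random(seed)
    n = len(data)
    reps = sorted(estimator([rng.choice(data) for _ in range(n)])
                  for _ in range(n_boot))
    alpha = (1.0 - level) / 2.0
    lo = reps[int(alpha * n_boot)]
    hi = reps[int((1.0 - alpha) * n_boot) - 1]
    return lo, hi

# hypothetical per-field reserve-growth estimates (BBO)
growth = [0.8, 1.1, 0.6, 2.4, 1.3, 0.9, 1.7, 0.5, 1.0, 1.2]
mean = lambda xs: sum(xs) / len(xs)
lo, hi = bootstrap_ci(growth, mean)
```

The interval (lo, hi) brackets the point estimate and widens or narrows with the spread of the data, which is the sense in which the 90% interval in the record "describes the uncertainty that should be attached to the estimates".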

  2. Expressing Uncertainty in Computer-Mediated Discourse: Language as a Marker of Intellectual Work

    Science.gov (United States)

    Jordan, Michelle E.; Schallert, Diane L.; Park, Yangjoo; Lee, SoonAh; Chiang, Yueh-hui Vanessa; Cheng, An-Chih Janne; Song, Kwangok; Chu, Hsiang-Ning Rebecca; Kim, Taehee; Lee, Haekyung

    2012-01-01

    Learning and dialogue may naturally engender feelings and expressions of uncertainty for a variety of reasons and purposes. Yet, little research has examined how patterns of linguistic uncertainty are enacted and changed over time as students reciprocally influence one another and the dialogical system they are creating. This study describes the…

  3. Evaluation of nuclear safety from the outputs of computer codes in the presence of uncertainties

    International Nuclear Information System (INIS)

    We apply methods from order statistics to the problem of satisfying regulations that specify individual criteria to be met by each of a number of outputs, k, from a computer code simulating nuclear accidents. The regulations are assumed to apply to an 'extent', γk (such as 95%), of the cumulative probability distribution of each output, k, obtained by randomly varying the inputs to the code over their ranges of uncertainty. We use a 'bracketing' approach to obtain expressions for the confidence, β, or probability that these desired extents will be covered in N runs of the code. Detailed results are obtained for k=1,2,3 with equal extents, γ, and are shown to depend on the degree of correlation of the outputs. They reduce to the proper expressions in limiting cases. These limiting cases are also analyzed for an arbitrary number of outputs, k. The bracketing methodology is contrasted with the traditional 'coverage' approach, in which the objective is to obtain a range of outputs that encloses a total fraction, γ, of all possible outputs, without regard to the extent of individual outputs. For the case of two outputs we develop an alternate formulation and show that the confidence, β, depends on the degree of correlation between the outputs. The alternate formulation reduces to the single-output case when the outputs are so well correlated that the coverage criterion is always met in a single run of the code if either output lies beyond an extent γ; it reduces to Wilks' expression for uncorrelated variables when the outputs are independent; and it reduces to Wald's result when the outputs are so negatively correlated that the coverage criterion could never be met by the two outputs of a single run of the code. The predictions of both formulations are validated by comparison with Monte Carlo simulations
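    For the single-output case (k = 1), the confidence that the largest of N runs bounds the γ quantile is the classic Wilks expression β = 1 - γ^N, so the smallest adequate N is the ceiling of ln(1 - β)/ln(γ). A sketch of that first-order, one-sided calculation:

```python
import math

def wilks_runs(gamma=0.95, beta=0.95):
    """Smallest number of code runs N such that the maximum of N outputs
    bounds the gamma quantile with confidence beta (one-sided, first order):
    solve 1 - gamma**N >= beta for integer N."""
    return math.ceil(math.log(1.0 - beta) / math.log(gamma))

n = wilks_runs()          # classic 95%/95% case
conf = 1.0 - 0.95 ** n    # achieved confidence with n runs
```

For the standard 95%/95% criterion this gives the familiar 59 runs; 58 runs would fall just short of 95% confidence.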

  4. CQL: a computational program for uncertainty evaluation and quality control of metrology laboratories

    International Nuclear Information System (INIS)

    We present software, developed at CRCN-CNEN, to evaluate uncertainties and support quality control in metrology laboratories. Although it was built for ionizing radiation measurements, the program is adaptable to other types of laboratories, such as chemical analysis laboratories. The program was developed based on the ISO Guide to the Expression of Uncertainty in Measurement. Its interface was designed in C++ with database tools and a visual environment to display the two statistical quality control graphs. Besides providing a value for the expanded uncertainty to be reported, it also allows a statistical evaluation of the uncertainty, either by varying the input influence quantities or by choosing a coverage probability. (author)
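    The GUM combination such a program implements can be sketched in a few lines: square each sensitivity-weighted standard uncertainty, sum, take the root, and scale by a coverage factor. The budget entries below are invented for illustration, not taken from the CQL program.

```python
def expanded_uncertainty(contributions, k=2.0):
    """GUM-style combination: root-sum-square of (sensitivity coefficient
    times standard uncertainty) terms, scaled by a coverage factor k
    (k = 2 corresponds to roughly 95% coverage for a normal distribution)."""
    u_c = sum((c * u) ** 2 for c, u in contributions) ** 0.5
    return u_c, k * u_c

# hypothetical budget for a dose-rate measurement:
# (sensitivity coefficient, standard uncertainty of the input)
budget = [(1.0, 0.012),   # calibration factor
          (1.0, 0.008),   # counting statistics
          (0.5, 0.010)]   # temperature correction
u_c, U = expanded_uncertainty(budget)
```

Varying the influence quantities or the coverage factor k is exactly the kind of statistical evaluation the record describes.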

  5. A methodology for computing uncertainty bounds of multivariable systems based on sector stability theory concepts

    Science.gov (United States)

    Waszak, Martin R.

    1992-01-01

    The application of a sector-based stability theory approach to the formulation of useful uncertainty descriptions for linear, time-invariant, multivariable systems is explored. A review of basic sector properties and the sector-based approach is presented first. The sector-based approach is then applied to several general forms of parameter uncertainty to investigate its advantages and limitations. The results indicate that the sector uncertainty bound can be used effectively to evaluate the impact of parameter uncertainties on the frequency response of the design model. Inherent conservatism is a potential limitation of the sector-based approach, especially for highly dependent uncertain parameters. In addition, the representation of the system dynamics can affect the amount of conservatism reflected in the sector bound. Careful application of the method can help to reduce this conservatism, however, and the solution approach has some degrees of freedom that may be further exploited to reduce it.

  6. Computing continuous record of discharge with quantified uncertainty using index velocity observations: A probabilistic machine learning approach

    Science.gov (United States)

    Farahmand, Touraj; Hamilton, Stuart

    2016-04-01

    Application of the index velocity method for computing continuous records of discharge has become increasingly common, especially since the introduction of low-cost acoustic Doppler velocity meters (ADVMs). In general, the index velocity method can be used at locations where stage-discharge methods are used, but it is especially appropriate and recommended when more than one discharge can occur for a given stage, such as under backwater and unsteady flow conditions caused by (but not limited to): stream confluences; streams flowing into lakes or reservoirs; tide-affected streams; regulated streamflows (dams or control structures); or streams affected by meteorological forcing, such as strong prevailing winds. In existing index velocity modeling techniques, two models (ratings) are required: an index velocity model and a stage-area model. The outputs of these models, mean channel velocity (Vm) and cross-sectional area (A), are then multiplied together to compute discharge. Mean channel velocity (Vm) can generally be determined by a multivariate parametric regression model, such as linear regression in the simplest case. The main challenges in existing index velocity modeling techniques are: 1) preprocessing and QA/QC of continuous index velocity data and synchronizing them with discharge measurements; 2) the nonlinear relationship between mean velocity and index velocity, which is not uncommon at monitoring locations; 3) model exploration and analysis in order to find the optimal regression model predictor(s) and model type (linear vs. nonlinear and, if nonlinear, the number of parameters);
4) model changes caused by dynamical changes in the environment (geomorphic, biological) over time; 5) deployment of the final model into the Data Management Systems (DMS) for real-time discharge calculation; and 6) objective estimation of the uncertainty caused by: field measurement errors; structural uncertainty; parameter uncertainty; and continuous sensor data
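    The two-rating structure (index velocity rating times stage-area rating) can be sketched as follows; the linear ratings, rectangular channel, and all numbers are illustrative assumptions, not a site calibration.

```python
def fit_linear(x, y):
    """Ordinary least squares for the index-velocity rating Vm = a + b*Vi."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

# hypothetical calibration pairs: ADVM index velocity vs. measured mean velocity
vi = [0.20, 0.45, 0.70, 0.95, 1.20]
vm = [0.25, 0.52, 0.80, 1.07, 1.35]
a, b = fit_linear(vi, vm)

def discharge(index_velocity, stage, width=30.0, datum=1.0):
    """Q = Vm * A with a linear index-velocity rating and a rectangular
    stage-area rating (area = width * depth). Both ratings are stand-ins
    for site-specific ones."""
    v_mean = a + b * index_velocity
    area = width * (stage - datum)
    return v_mean * area

q = discharge(0.80, 3.0)
```

In practice each rating carries its own uncertainty, and propagating them through the product Vm * A is challenge 6) in the list above.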

  7. PABS: A Computer Program to Normalize Emission Probabilities and Calculate Realistic Uncertainties

    International Nuclear Information System (INIS)

    The program PABS normalizes relative particle emission probabilities to an absolute scale and calculates the relevant uncertainties on this scale. The program is written in Java using the JDK 1.6 library. For additional information about system requirements, the code itself, and compiling from source, see the README file distributed with this program. The mathematical procedures used are given.
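    The normalization such a program performs can be illustrated generically: scale the relative intensities by the reciprocal of their sum and propagate the input uncertainties through that scaling to first order. This sketch is not the PABS code itself; it assumes uncorrelated inputs and invented intensities.

```python
def normalize_emissions(rel, sig):
    """Normalize relative emission probabilities r_i to absolute values
    p_i = r_i / sum(r), propagating independent input uncertainties sig_i
    through the normalization to first order."""
    S = sum(rel)
    probs, sigmas = [], []
    for i, (ri, si) in enumerate(zip(rel, sig)):
        var = 0.0
        for j, sj in enumerate(sig):
            # partial derivative of p_i with respect to r_j
            dpdr = (S - ri) / S**2 if j == i else -ri / S**2
            var += (dpdr * sj) ** 2
        probs.append(ri / S)
        sigmas.append(var ** 0.5)
    return probs, sigmas

# hypothetical relative emission intensities and their uncertainties
p, u = normalize_emissions([100.0, 40.0, 10.0], [2.0, 1.5, 0.5])
```

Note that the normalization constraint makes the absolute probabilities anti-correlated, which is why the propagated sigmas differ from the naively scaled input uncertainties (this is the "realistic uncertainties" aspect the title refers to).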

  8. PABS: A Computer Program to Normalize Emission Probabilities and Calculate Realistic Uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Caron, D. S.; Browne, E.; Norman, E. B.

    2009-08-21

    The program PABS normalizes relative particle emission probabilities to an absolute scale and calculates the relevant uncertainties on this scale. The program is written in Java using the JDK 1.6 library. For additional information about system requirements, the code itself, and compiling from source, see the README file distributed with this program. The mathematical procedures used are given below.

  9. LJUNGSKILE 1.0 A Computer Program for Investigation of Uncertainties in Chemical Speciation

    International Nuclear Information System (INIS)

    In analysing the long-term safety of nuclear waste disposal, there is a need to investigate uncertainties in chemical speciation calculations. Chemical speciation is important in evaluating the solubility of radionuclides, the chemical degradation of engineering materials, and the chemical processes controlling groundwater composition. The uncertainties in chemical speciation may, for instance, be related to uncertainties in thermodynamic data, in the groundwater composition, or in the extrapolation to the actual temperature and ionic strength. The magnitude of such uncertainties and their implications are seldom explicitly evaluated in any detail, and commonly available chemical speciation programmes normally do not have a built-in option to include uncertainty ranges. The program developed within this project can incorporate uncertainty ranges in speciation calculations and can be used for graphical presentation of uncertainty ranges for dominant species. The program should be regarded as a starting point for assessing uncertainties in chemical speciation, since it is not yet comprehensive in its capabilities, and there may be limitations in its usefulness for various geochemical problems. The LJUNGSKILE code allows the user to select two approaches: Monte Carlo (MC) sampling and Latin Hypercube Sampling (LHS). LHS produces satisfactory statistics with a minimum of CPU time; in general, a simple theoretical speciation calculation can be done within seconds. There are, admittedly, alternatives to LHS, and there is criticism of its uncritical use because correlations between some of the input variables commonly exist. LHS, like MC, is not capable of taking these correlations into account. Such a correlation can, for example, exist between the pH of a solution and the partial pressure of CO2: higher pH solutions may absorb larger amounts of CO2 and can reduce the CO2 partial pressure. It is therefore of advantage to
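    The LHS scheme mentioned above is easy to sketch: split each input range into n equiprobable strata, draw one value per stratum, and pair strata across variables by independent random permutations. The pH and pCO2 ranges below are illustrative only, not from the LJUNGSKILE documentation.

```python
import random

def latin_hypercube(n, ranges, seed=3):
    """Latin Hypercube Sampling: each variable's range is split into n
    equiprobable strata and each stratum is sampled exactly once; strata
    are paired across variables by independent random permutations."""
    rng = random.Random(seed)
    columns = []
    for lo, hi in ranges:
        strata = list(range(n))
        rng.shuffle(strata)                 # random pairing across variables
        col = [lo + (hi - lo) * (k + rng.random()) / n for k in strata]
        columns.append(col)
    return list(zip(*columns))              # n sample points, one per row

# hypothetical speciation inputs: pH and log10 pCO2
points = latin_hypercube(10, [(6.0, 9.0), (-4.0, -2.0)])
```

Because every stratum of every variable is hit exactly once, 10 LHS points cover each one-dimensional range far more evenly than 10 plain Monte Carlo draws, which is the CPU-time advantage the record cites.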

  10. Optimal Technological Portfolios for Climate-Change Policy under Uncertainty: A Computable General Equilibrium Approach

    OpenAIRE

    David F. Bradford; Seung-Rae Kim; Klaus Keller

    2004-01-01

    When exploring solutions to long-term environmental problems such as climate change, it is crucial to understand how the rates and directions of technological change may interact with environmental policies in the presence of uncertainty. This paper analyzes optimal technological portfolios for global carbon emissions reductions in an integrated assessment model of the coupled social-natural system. The model used here is a probabilistic, two-technology extension of Nordhaus' earlier model (N...

  11. Contributions to the uncertainty management in numerical modelization: wave propagation in random media and analysis of computer experiments

    International Nuclear Information System (INIS)

    This document constitutes my Habilitation thesis report. It reviews my scientific activity over the last twelve years, from my PhD thesis to the work completed as a research engineer at CEA Cadarache. The two main chapters of this document correspond to two different research fields, both related to the treatment of uncertainty in engineering problems. The first chapter is a synthesis of my work on high-frequency wave propagation in random media, and more specifically the study of the statistical fluctuations of acoustic wave travel times in random and/or turbulent media. The new results mainly concern the introduction of the statistical anisotropy of the velocity field into the analytical expressions relating the statistical moments of the travel times to those of the velocity field. This work was primarily driven by requirements in geophysics (oil exploration and seismology). The second chapter concerns probabilistic techniques for studying the effect of input-variable uncertainties in numerical models. My main applications in this chapter relate to the nuclear engineering domain, which offers a large variety of uncertainty problems to be treated. First, a complete synthesis is carried out of the statistical methods for sensitivity analysis and global exploration of numerical models. The construction and use of a meta-model (an inexpensive mathematical function replacing an expensive computer code) are then illustrated by my work on the Gaussian process model (kriging). Two additional topics are finally addressed: the estimation of high quantiles of a computer code output, and the analysis of stochastic computer codes. We conclude this report with some perspectives on numerical simulation and the use of predictive models in industry. This context is extremely positive for future research and application developments. (author)
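    A kriging meta-model of the kind discussed in the second chapter can be sketched with a squared-exponential covariance and a handful of design points. The correlation length, nugget, and the sine function standing in for an "expensive code" are all illustrative choices, not taken from the thesis.

```python
import math

def solve(A, b):
    """Gaussian elimination with partial pivoting (small systems only)."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k]
                              for k in range(r + 1, n))) / M[r][r]
    return x

def kriging(xs, ys, length=0.5, nugget=1e-10):
    """Simple-kriging (zero-mean GP) interpolator with a squared-exponential
    covariance; returns a cheap predictor function (the meta-model)."""
    cov = lambda a, b: math.exp(-0.5 * ((a - b) / length) ** 2)
    K = [[cov(a, b) + (nugget if i == j else 0.0)
          for j, b in enumerate(xs)] for i, a in enumerate(xs)]
    w = solve(K, ys)    # weights = K^{-1} y
    return lambda x: sum(wi * cov(x, xi) for wi, xi in zip(w, xs))

# meta-model of an "expensive" code evaluated at 5 design points
xs = [0.0, 0.25, 0.5, 0.75, 1.0]
ys = [math.sin(2 * math.pi * x) for x in xs]
surrogate = kriging(xs, ys)
```

The surrogate reproduces the design points (up to the nugget) and interpolates smoothly between them, which is what makes it usable for sensitivity analysis or quantile estimation in place of the expensive code.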

  12. Estimation and Uncertainty Analysis of Flammability Properties for Computer-aided molecular design of working fluids for thermodynamic cycles

    DEFF Research Database (Denmark)

    Frutiger, Jerome; Abildskov, Jens; Sin, Gürkan

Computer Aided Molecular Design (CAMD) is an important tool to generate, test and evaluate promising chemical products. CAMD can be used in thermodynamic cycles for the design of pure-component or mixture working fluids in order to improve the heat transfer capacity of the system. The safety...... analysis. In this case property prediction models such as group contribution (GC) models can estimate flammability data. The estimation needs to be accurate, reliable and as time-efficient as possible [1]. However, GC property prediction methods frequently lack rigorous uncertainty analysis. Hence, there...

  13. Uncertainty analysis of computational methods for deriving sensible heat flux values from scintillometer measurements

    Directory of Open Access Journals (Sweden)

    P. A. Solignac

    2009-11-01

Full Text Available The use of scintillometers to determine sensible heat fluxes is now common in studies of land-atmosphere interactions. The main interest in these instruments is due to their ability to quantify energy distributions at the landscape scale, as they can calculate sensible heat flux values over long distances, in contrast to Eddy Covariance systems. However, scintillometer data do not provide a direct measure of sensible heat flux, but require additional data, such as the Bowen ratio (β), to provide flux values. The Bowen ratio can either be measured using Eddy Covariance systems or derived from the energy balance closure. In this work, specific requirements for estimating energy fluxes using a scintillometer were analyzed, as well as the accuracy of two flux calculation methods. We first focused on the classical method (used in standard software) and analysed the impact of the Bowen ratio on flux value and uncertainty. For instance, an averaged Bowen ratio (β) of less than 1 proved to be a significant source of measurement uncertainty. An alternative method, called the "β-closure method", for which the Bowen ratio measurement is not necessary, was also tested. In this case, it was observed that even for low β values, flux uncertainties were reduced and scintillometer data were well correlated with the Eddy Covariance results. In principle, both methods should tend to the same results, but the second one slightly underestimates H as β decreases (<5%).
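The sensitivity of the flux to the Bowen ratio follows directly from the energy-balance closure Rn - G = H + LE with β = H/LE, which gives H = (Rn - G)·β/(1 + β). A minimal sketch with first-order error propagation (the flux values and 10% β error are illustrative, not data from the study):

```python
def sensible_heat(rn, g, beta):
    """Sensible heat flux from available energy and the Bowen ratio:
    Rn - G = H + LE and beta = H/LE  =>  H = (Rn - G) * beta / (1 + beta)."""
    return (rn - g) * beta / (1.0 + beta)

def h_uncertainty(rn, g, beta, dbeta):
    """First-order propagation of a Bowen-ratio error dbeta into H:
    dH/dbeta = (Rn - G) / (1 + beta)^2."""
    return abs(rn - g) / (1.0 + beta) ** 2 * dbeta

rn, g = 500.0, 50.0            # W/m^2, illustrative values
for beta in (0.3, 1.0, 3.0):
    h = sensible_heat(rn, g, beta)
    dh = h_uncertainty(rn, g, beta, dbeta=0.1 * beta)  # 10% error in beta
    # relative uncertainty dh/h = 0.1/(1+beta): largest when beta < 1
```

The relative uncertainty dh/h equals dbeta/(β(1+β)), which is why β values below 1 dominate the measurement uncertainty reported above.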

  14. Uncertainty analysis of computational methods for deriving sensible heat flux values from scintillometer measurements

    Directory of Open Access Journals (Sweden)

    P. A. Solignac

    2009-06-01

Full Text Available The use of scintillometers to determine sensible heat fluxes is now common in studies of land-atmosphere interactions. The main interest in these instruments is due to their ability to quantify energy distributions at the landscape scale, as they can calculate sensible heat flux values over long distances, in contrast to Eddy Correlation systems. However, scintillometer data do not provide a direct measure of sensible heat flux, but require additional data, such as the Bowen ratio (β), to provide flux values. The Bowen ratio can either be measured using Eddy Correlation systems or derived from the energy balance closure. In this work, specific requirements for estimating energy fluxes using a scintillometer were analyzed, as well as the accuracy of two flux calculation methods. We first focused on the classical method (used in standard software). We analysed the impact of the Bowen ratio according to both time averaging and ratio values; for instance, an averaged Bowen ratio (β) of less than 1 proved to be a significant source of measurement uncertainty. An alternative method, called the "β-closure method", for which the Bowen ratio measurement is not necessary, was also tested. In this case, it was observed that even for low β values, flux uncertainties were reduced and scintillometer data were well correlated with the Eddy Correlation results.

  15. Development of a Computational Framework for Stochastic Co-optimization of Water and Energy Resource Allocations under Climatic Uncertainty

    Science.gov (United States)

    Xuan, Y.; Mahinthakumar, K.; Arumugam, S.; DeCarolis, J.

    2015-12-01

Owing to the lack of a consistent approach to assimilate probabilistic forecasts for water and energy systems, utilization of climate forecasts for conjunctive management of these two systems is very limited. Prognostic management of these two systems presents a stochastic co-optimization problem that seeks to determine reservoir releases and power allocation strategies while minimizing the expected operational costs subject to probabilistic climate forecast constraints. To address these issues, we propose a high performance computing (HPC) enabled computational framework for stochastic co-optimization of water and energy resource allocations under climate uncertainty. The computational framework embodies a new paradigm shift in which attributes of climate (e.g., precipitation, temperature) and their forecasted probability distributions are employed conjointly to inform seasonal water availability and electricity demand. The HPC-enabled cyberinfrastructure framework is developed to perform detailed stochastic analyses, and to better quantify and reduce the uncertainties associated with water and power systems management by utilizing improved hydro-climatic forecasts. In this presentation, our stochastic multi-objective solver, extended from Optimus (Optimization Methods for Universal Simulators), is introduced. The solver uses a parallel cooperative multi-swarm method for the efficient solution of large-scale simulation-optimization problems on parallel supercomputers. The cyberinfrastructure harnesses HPC resources to perform intensive computations using ensemble forecast models of streamflow and power demand. The stochastic multi-objective particle swarm optimizer we developed is used to co-optimize water and power system models under constraints over a large number of ensembles. The framework sheds light on the application of climate forecasts and cyber-innovation frameworks to improve management and promote the sustainability of water and energy systems.
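The parallel cooperative multi-swarm machinery is specific to the Optimus solver, but the underlying particle-swarm update it builds on can be sketched in serial form; the inertia/cognitive/social parameters and the test objective below are generic textbook choices, not those of the actual solver:

```python
import numpy as np

def pso(f, dim=2, n_particles=30, iters=200, seed=0):
    """Minimal global-best particle swarm optimizer (minimization)."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dim))    # positions
    v = np.zeros_like(x)                          # velocities
    pbest = x.copy()                              # personal bests
    pbest_val = np.apply_along_axis(f, 1, x)
    g = pbest[np.argmin(pbest_val)].copy()        # global best
    w, c1, c2 = 0.7, 1.5, 1.5                     # inertia, cognitive, social
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = x + v
        vals = np.apply_along_axis(f, 1, x)
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        g = pbest[np.argmin(pbest_val)].copy()
    return g, pbest_val.min()

best_x, best_val = pso(lambda p: np.sum((p - 1.0) ** 2))
```

In a simulation-optimization setting each call to `f` would be a full ensemble model run, which is why the framework distributes these evaluations across HPC nodes.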

  16. Computing the uncertainty associated with the control of ecological and biological systems

    Directory of Open Access Journals (Sweden)

    Alessandro Ferrarini

    2013-09-01

Full Text Available Recently, I showed that ecological and biological networks can be controlled by coupling their dynamics to evolutionary modelling. This provides numerous solutions to the goal of guiding a system's behaviour towards the desired result. In this paper, I address another important question: how reliable is the achieved solution? In other words, what is the degree of uncertainty about achieving the desired result if the values of edges and nodes were slightly different from the optimized ones? This is a pivotal question, because it is not assured that, while managing a certain system, we are able to impose on nodes and edges exactly the optimized values we would need in order to achieve the desired results. To address this topic, I have formulated a three-part framework (network dynamics - genetic optimization - stochastic simulations) and, using an illustrative example, I have been able to detect the most reliable solution to the goal of network control. The proposed framework could be used to: (a) counteract damages to ecological and biological networks, (b) safeguard rare and endangered species, (c) manage systems at the least possible cost, and (d) plan optimized bio-manipulations.
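The stochastic-simulation part of such a framework can be sketched as a Monte Carlo reliability estimate: perturb the optimized edge values and count how often the target is still reached. The toy two-node dynamics, noise level, and tolerance below are hypothetical stand-ins for an optimized ecological network:

```python
import numpy as np

def simulate(weights, x0, steps=20):
    """Toy network dynamics: repeated application of the edge-weight matrix."""
    x = x0.copy()
    for _ in range(steps):
        x = weights @ x
    return x

def reliability(weights, x0, target, tol, noise_sd, n_trials=2000, seed=1):
    """Fraction of perturbed-weight trials still reaching the target:
    a Monte Carlo estimate of the uncertainty of the optimized solution."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n_trials):
        w = weights + rng.normal(0.0, noise_sd, weights.shape)
        if np.linalg.norm(simulate(w, x0) - target) < tol:
            hits += 1
    return hits / n_trials

# Hypothetical "optimized" 2-node network whose dynamics preserve the state
w_opt = np.array([[1.0, 0.0], [0.0, 1.0]])
x0 = np.array([1.0, 2.0])
target = simulate(w_opt, x0)
p = reliability(w_opt, x0, target, tol=0.5, noise_sd=0.002)
```

Comparing `p` across candidate solutions identifies the most robust one, which is the question the record poses.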

  17. Bayesian Uncertainty Analysis SOARS for Computationally Expensive Simulation Models with Application to Contaminant Hydrology in the Cannonsville Watershed

    Science.gov (United States)

    Shoemaker, C. A.; Cowan, D.; Woodbury, J.; Ruppert, D.; Bliznyuk, N.; Wang, Y.; Li, Y.

    2009-12-01

This paper presents the application of a new computationally efficient method, SOARS, for statistically rigorous assessment of uncertainty in parameters and model output when the model is calibrated to field data. The SOARS method is general and is here applied to watershed problems. The innovative aspect of this procedure is that an optimization method is first used to find the maximum likelihood estimator, and the costly simulations done during the optimization are then re-used to build a response surface model of the likelihood function. Markov Chain Monte Carlo is then applied to the response surface model to obtain the posterior distributions of the model parameters and the appropriate transformations to correct for non-normal error. On a hazardous spill-in-channel problem and on a small watershed (37 km2), the computational effort to obtain roughly the same accuracy of solution is 150 model simulations for the SOARS method versus 10,000 simulations for conventional MCMC analysis, more than a 60-fold reduction in computational effort. For the larger Cannonsville Watershed (1200 km2) the method is expanded to provide posterior densities not only on parameter values but also on multiple model predictions. Available software for the method will be discussed, as well as SOARS's use for assessing the impact of climate change on hydrology and water-borne pollutant transport in the Cannonsville basin and other watersheds.
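The core idea of re-using costly likelihood evaluations to build a response surface, then running MCMC on the cheap surrogate, can be sketched for one parameter. The quadratic surface and the Gaussian stand-in for an expensive simulation are illustrative simplifications of the actual SOARS method:

```python
import numpy as np

rng = np.random.default_rng(0)

def expensive_loglik(theta):
    """Stand-in for a costly simulation: Gaussian log-likelihood, mode at 2."""
    return -0.5 * (theta - 2.0) ** 2

# 1. A few "costly" evaluations, as if re-used from an optimization run
design = np.linspace(-1.0, 5.0, 9)
values = expensive_loglik(design)

# 2. Quadratic response surface fitted to the log-likelihood
coef = np.polyfit(design, values, 2)
surrogate = lambda t: np.polyval(coef, t)

# 3. Metropolis sampling on the cheap surrogate instead of the simulator
samples, theta = [], 0.0
for _ in range(20000):
    prop = theta + rng.normal(0.0, 1.0)
    if np.log(rng.random()) < surrogate(prop) - surrogate(theta):
        theta = prop
    samples.append(theta)
post = np.array(samples[2000:])   # discard burn-in
```

Only 9 "simulations" were spent here; the 18,000 MCMC steps cost essentially nothing, which is the source of the 60-fold savings the record reports.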

  18. Computation of strain and rotation tensor as well as their uncertainties for small arrays in spherical coordinate system

    Institute of Scientific and Technical Information of China (English)

    MENG Guo-jie; REN Jin-wei; WU Ji-cang; SHEN Xu-hui

    2008-01-01

Based on Taylor series expansion and the strain component expressions of elastic mechanics, we derive formulae for the strain and rotation tensors of small arrays in a spherical coordinate system. By linearizing the formulae, we also derive expressions for the uncertainties of the strain components and the Euler vector for subnets, using the law of error propagation. Taking the GPS velocity field in the Sichuan-Yunnan area as an example, we compute the dilatation rate and maximum shear strain rate fields using the above procedure, and their characteristics are preliminarily analyzed. Limits of the strain model for small arrays are also discussed. We give detailed explanations of the small-array method and the choice of small arrays. How to set the weights of GPS observations is further discussed. Moreover, the relationship between strain and the radius of GPS subnets is also analyzed.
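The central computation, a least-squares velocity gradient for a small array from which strain, rotation, and the derived invariants follow, can be sketched in a local planar approximation (the paper works in spherical coordinates; the synthetic four-station array is illustrative):

```python
import numpy as np

def strain_from_velocities(xy, uv):
    """Least-squares velocity gradient for a small GPS array (local plane):
    u_i = u0 + du/dx * x_i + du/dy * y_i, and likewise for v."""
    n = len(xy)
    G = np.column_stack([np.ones(n), xy[:, 0], xy[:, 1]])
    cu, *_ = np.linalg.lstsq(G, uv[:, 0], rcond=None)
    cv, *_ = np.linalg.lstsq(G, uv[:, 1], rcond=None)
    L = np.array([[cu[1], cu[2]], [cv[1], cv[2]]])   # velocity gradient
    E = 0.5 * (L + L.T)                              # strain-rate tensor
    dilat = np.trace(E)                              # dilatation rate
    max_shear = np.sqrt(((E[0, 0] - E[1, 1]) / 2) ** 2 + E[0, 1] ** 2)
    rotation = 0.5 * (L[1, 0] - L[0, 1])             # vertical-axis rotation
    return E, dilat, max_shear, rotation

# Synthetic 4-station array under pure uniform extension along x
xy = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
uv = np.column_stack([0.1 * xy[:, 0], np.zeros(4)])  # u = 0.1 x, v = 0
E, dilat, max_shear, rot = strain_from_velocities(xy, uv)
```

Propagating the covariance of `uv` through the same least-squares operator gives the strain-component uncertainties the paper derives analytically.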

  19. Numerical study of premixed HCCI engine combustion and its sensitivity to computational mesh and model uncertainties

    Science.gov (United States)

    Kong, Song-Charng; Reitz, Rolf D.

    2003-06-01

This study used a numerical model to investigate the combustion process in a premixed iso-octane homogeneous charge compression ignition (HCCI) engine. The engine was a supercharged Cummins C engine operated under HCCI conditions. The CHEMKIN code was implemented into an updated KIVA-3V code so that the combustion could be modelled using detailed chemistry in the context of engine CFD simulations. The model was able to accurately simulate the ignition timing and combustion phasing for various engine conditions. The unburned hydrocarbon emissions were also well predicted, while the carbon monoxide emissions were underpredicted. Model results showed that the majority of unburned hydrocarbon is located in the piston-ring crevice region and the carbon monoxide resides in the vicinity of the cylinder walls. A sensitivity study of the computational grid resolution indicated that the combustion predictions were relatively insensitive to the grid density. However, the piston-ring crevice region needed to be simulated with high resolution to obtain accurate emissions predictions. The model results also indicated that HCCI combustion and emissions are very sensitive to the initial mixture temperature. The computations also show that the carbon monoxide emissions prediction can be significantly improved by modifying a key oxidation reaction rate constant.

  20. Reducing annotation cost and uncertainty in computer-aided diagnosis through selective iterative classification

    Science.gov (United States)

    Riely, Amelia; Sablan, Kyle; Xiaotao, Thomas; Furst, Jacob; Raicu, Daniela

    2015-03-01

Medical imaging technology has always provided radiologists with the opportunity to view and keep records of patient anatomy. With the development of machine learning and intelligent computing, these images can be used to create Computer-Aided Diagnosis (CAD) systems, which can assist radiologists in analyzing image data in various ways to provide better health care to patients. This paper looks at increasing accuracy and reducing cost in creating CAD systems, specifically in predicting the malignancy of lung nodules in the Lung Image Database Consortium (LIDC). Much of the cost in creating an accurate CAD system stems from the need for multiple radiologist diagnoses or annotations of each image, since there is rarely a ground truth diagnosis and even different radiologists' diagnoses of the same nodule often disagree. To resolve this issue, this paper outlines a method of selective iterative classification that predicts lung nodule malignancy by using multiple radiologist diagnoses only for the cases that can benefit from them. Our method achieved 81% accuracy at only 46% of the cost of a method that indiscriminately used all annotations, which achieved a lower accuracy of 70%.

  1. UCODE_2005 and six other computer codes for universal sensitivity analysis, calibration, and uncertainty evaluation constructed using the JUPITER API

    Science.gov (United States)

    Poeter, Eileen E.; Hill, Mary C.; Banta, Edward R.; Mehl, Steffen; Christensen, Steen

    2006-01-01

This report documents the computer codes UCODE_2005 and six post-processors. Together the codes can be used with existing process models to perform sensitivity analysis, data needs assessment, calibration, prediction, and uncertainty analysis. Any process model or set of models can be used; the only requirements are that models have numerical (ASCII or text only) input and output files, that the numbers in these files have sufficient significant digits, that all required models can be run from a single batch file or script, and that simulated values are continuous functions of the parameter values. Process models can include pre-processors and post-processors as well as one or more models related to the processes of interest (physical, chemical, and so on), making UCODE_2005 extremely powerful. An estimated parameter can be a quantity that appears in the input files of the process model(s), or a quantity used in an equation that produces a value that appears in the input files. In the latter situation, the equation is user-defined. UCODE_2005 can compare observations and simulated equivalents. The simulated equivalents can be any simulated value written in the process-model output files or can be calculated from simulated values with user-defined equations. The quantities can be model results, or dependent variables. For example, for ground-water models they can be heads, flows, concentrations, and so on. Prior, or direct, information on estimated parameters also can be considered. Statistics are calculated to quantify the comparison of observations and simulated equivalents, including a weighted least-squares objective function. In addition, data-exchange files are produced that facilitate graphical analysis. UCODE_2005 can be used fruitfully in model calibration through its sensitivity analysis capabilities and its ability to estimate parameter values that result in the best possible fit to the observations. Parameters are estimated using nonlinear regression: a
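The weighted least-squares objective function mentioned in this record has the standard form S(b) = Σᵢ wᵢ (yᵢ − y′ᵢ(b))²; a minimal sketch, with illustrative observation values and the common choice of inverse-variance weights:

```python
import numpy as np

def wls_objective(obs, sim, weights):
    """Weighted least-squares objective of the kind UCODE minimizes:
    S(b) = sum_i w_i * (y_i - y'_i(b))^2."""
    r = np.asarray(obs) - np.asarray(sim)
    return float(np.sum(np.asarray(weights) * r ** 2))

# Weights are commonly 1/variance of each observation's error
obs = [10.2, 9.8, 15.1]                          # observed heads/flows
sim = [10.0, 10.0, 15.0]                         # simulated equivalents
w = [1 / 0.1 ** 2, 1 / 0.1 ** 2, 1 / 0.2 ** 2]   # inverse error variances
s = wls_objective(obs, sim, w)
```

Nonlinear regression then adjusts the parameter vector b to minimize S(b); the weighting makes poorly measured observations count less in the fit.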

  2. Model Uncertainty

    OpenAIRE

    Clyde, Merlise; George, Edward I.

    2004-01-01

    The evolution of Bayesian approaches for model uncertainty over the past decade has been remarkable. Catalyzed by advances in methods and technology for posterior computation, the scope of these methods has widened substantially. Major thrusts of these developments have included new methods for semiautomatic prior specification and posterior exploration. To illustrate key aspects of this evolution, the highlights of some of these developments are described.

  3. The Personality Trait of Intolerance to Uncertainty Affects Behavior in a Novel Computer-Based Conditioned Place Preference Task

    Science.gov (United States)

    Radell, Milen L.; Myers, Catherine E.; Beck, Kevin D.; Moustafa, Ahmed A.; Allen, Michael Todd

    2016-01-01

    Recent work has found that personality factors that confer vulnerability to addiction can also affect learning and economic decision making. One personality trait which has been implicated in vulnerability to addiction is intolerance to uncertainty (IU), i.e., a preference for familiar over unknown (possibly better) options. In animals, the motivation to obtain drugs is often assessed through conditioned place preference (CPP), which compares preference for contexts where drug reward was previously received. It is an open question whether participants with high IU also show heightened preference for previously rewarded contexts. To address this question, we developed a novel computer-based CPP task for humans in which participants guide an avatar through a paradigm in which one room contains frequent reward (i.e., rich) and one contains less frequent reward (i.e., poor). Following exposure to both contexts, subjects are assessed for preference to enter the previously rich and previously poor room. Individuals with low IU showed little bias to enter the previously rich room first, and instead entered both rooms at about the same rate which may indicate a foraging behavior. By contrast, those with high IU showed a strong bias to enter the previously rich room first. This suggests an increased tendency to chase reward in the intolerant group, consistent with previously observed behavior in opioid-addicted individuals. Thus, the personality factor of high IU may produce a pre-existing cognitive bias that provides a mechanism to promote decision-making processes that increase vulnerability to addiction. PMID:27555829

  4. Reservoir souring: Problems, uncertainties and modelling. Part I: Problems and uncertainty involved in prediction. Part II: Preliminary investigations of a computational model

    International Nuclear Information System (INIS)

The paper relates to improved oil recovery (IOR) techniques by mathematical modelling. The uncertainty involved in modelling of reservoir souring is discussed. IOR processes are speculated to influence a souring process in a positive direction. Most models do not take into account pH in reservoir fluids, and thus do not account for the partitioning behaviour of sulfide. Also, sulfide is antagonistic to bacterial metabolism and impedes the sulfate reduction rate; this may be an important factor in modelling. Biofilms are thought to play a crucial role in a reservoir souring process. Biofilm in a reservoir matrix is different from biofilm in open systems. This has a major impact on microbial transport and behaviour. Studies on microbial activity in reservoir matrices must be carried out with model cores, in order to mimic a realistic situation. Sufficient data do not exist today. The main conclusion is that a model does not reflect a true situation before the nature of these elements is understood. A simplified version of a Norwegian-developed biofilm model is discussed. The model incorporates all the important physical phenomena studied in the above references, such as bacteria growth limited by nutrients and/or energy sources and hydrogen sulfide adsorption. 18 refs., 8 figs., 1 tab

  5. Uncertainty analysis guide

    Energy Technology Data Exchange (ETDEWEB)

    Andres, T.H

    2002-05-01

This guide applies to the estimation of uncertainty in quantities calculated by scientific, analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner by which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)
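The Monte Carlo propagation of input uncertainties described in the guide can be sketched generically: sample each input from its distribution, run the model, and summarize the output. The toy model and input distributions below are illustrative assumptions, not taken from the guide:

```python
import numpy as np

def propagate(model, input_dists, n=100000, seed=0):
    """Monte Carlo propagation of input uncertainty through a model:
    sample each input from its distribution, evaluate, summarize output."""
    rng = np.random.default_rng(seed)
    samples = {name: draw(rng, n) for name, draw in input_dists.items()}
    out = model(**samples)
    return out.mean(), out.std()

# Toy model: output = k * q**2 with uncertain inputs k and q (illustrative)
model = lambda k, q: k * q ** 2
dists = {
    "k": lambda rng, n: rng.normal(2.0, 0.1, n),   # mean 2.0, sd 0.1
    "q": lambda rng, n: rng.normal(3.0, 0.2, n),   # mean 3.0, sd 0.2
}
mean, sd = propagate(model, dists)
```

The sample standard deviation of the output is the random-uncertainty estimate; the guide's systematic components must then be combined with it separately.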

  6. Uncertainty analysis guide

    International Nuclear Information System (INIS)

This guide applies to the estimation of uncertainty in quantities calculated by scientific, analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner by which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)

  7. Pitfalls and modelling inconsistencies in computational radiation dosimetry: Lessons learnt from the QUADOS intercomparison. Part I: Neutrons and uncertainties

    International Nuclear Information System (INIS)

The QUADOS EU cost-shared action conducted an intercomparison on the use of numerical methods in radiation protection and dosimetry. The eight problems proposed were intended to test the use of Monte Carlo and deterministic methods by assessing the accuracy with which the codes are applied, as well as the methods used to evaluate the uncertainty in the answers obtained. The overall objective was to spread good practice throughout the community and give users information on how to assess the uncertainties associated with their calculated results. (authors)

  8. Uncertainty analysis using evidence theory - confronting level-1 and level-2 approaches with data availability and computational constraints

    International Nuclear Information System (INIS)

    Dempster-Shafer Theory of Evidence (DST), as an alternative or complementary approach to the representation of uncertainty, is gradually being explored with complex practical applications beyond purely algebraic examples. This paper reviews literature documenting such complex applications and studies its applicability from the point of view of the nature and amount of data that is typically available in industrial risk analysis: medium-size frequential observations for aleatory components, small noised datasets for model parameters and expert judgment for other components. On the basis of a simple flood model encoding typical risk analysis features, different approaches to quantify uncertainty in DST are reviewed and benchmarked in that perspective: (i) combining all sources of uncertainty under a single-level DST model; (ii) separating aleatory and epistemic uncertainties, respectively, modeled with a first probabilistic layer and a second one under DST. Methods for handling data in probabilistic studies such as Kolmogorov-Smirnov tests and quantile-quantile plots are transferred to the domain of DST. We illustrate how data availability guides the choice of the settings and how results and sensitivity analyses can be interpreted in the domain of DST, concluding with recommendations for industrial practice.
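The belief and plausibility measures that replace a single probability in DST can be computed directly from a basic mass assignment: Bel(A) sums the masses of focal elements contained in A, Pl(A) those intersecting A. The frame and masses below are hypothetical:

```python
def belief_plausibility(masses, event):
    """Belief and plausibility of an event under a basic mass assignment.
    masses: dict mapping frozenset (focal element) -> mass, summing to 1.
    Bel(A) = sum of m(B) for B a subset of A;
    Pl(A)  = sum of m(B) for B intersecting A."""
    A = frozenset(event)
    bel = sum(m for B, m in masses.items() if B <= A)
    pl = sum(m for B, m in masses.items() if B & A)
    return bel, pl

# Hypothetical mass assignment over a frame {low, medium, high}
m = {
    frozenset({"low"}): 0.3,
    frozenset({"low", "medium"}): 0.4,
    frozenset({"low", "medium", "high"}): 0.3,  # share assigned to ignorance
}
bel, pl = belief_plausibility(m, {"low", "medium"})
```

The gap Pl − Bel quantifies epistemic ignorance, which is exactly what a single-level probabilistic model cannot represent and what the paper's two-level comparison exploits.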

  9. Uncertainty and Cognitive Control

    Directory of Open Access Journals (Sweden)

    Faisal eMushtaq

    2011-10-01

Full Text Available A growing trend of neuroimaging, behavioural and computational research has investigated the topic of outcome uncertainty in decision-making. Although evidence to date indicates that humans are very effective in learning to adapt to uncertain situations, the nature of the specific cognitive processes involved in the adaptation to uncertainty is still a matter of debate. In this article, we reviewed evidence suggesting that cognitive control processes are at the heart of uncertainty in decision-making contexts. Available evidence suggests that: (1) there is a strong conceptual overlap between the constructs of uncertainty and cognitive control; (2) there is a remarkable overlap between the neural networks associated with uncertainty and the brain networks subserving cognitive control; (3) the perception and estimation of uncertainty might play a key role in monitoring processes and the evaluation of the need for control; (4) potential interactions between uncertainty and cognitive control might play a significant role in several affective disorders.

  10. Uncertainty and Cognitive Control

    OpenAIRE

    Mushtaq, Faisal; Bland, Amy R; Schaefer, Alexandre

    2011-01-01

A growing trend of neuroimaging, behavioral, and computational research has investigated the topic of outcome uncertainty in decision-making. Although evidence to date indicates that humans are very effective in learning to adapt to uncertain situations, the nature of the specific cognitive processes involved in the adaptation to uncertainty is still a matter of debate. In this article, we reviewed evidence suggesting that cognitive control processes are at the heart of uncertainty in decisi...

  11. Uncertainty analysis

    Energy Technology Data Exchange (ETDEWEB)

    Thomas, R.E.

    1982-03-01

    An evaluation is made of the suitability of analytical and statistical sampling methods for making uncertainty analyses. The adjoint method is found to be well-suited for obtaining sensitivity coefficients for computer programs involving large numbers of equations and input parameters. For this purpose the Latin Hypercube Sampling method is found to be inferior to conventional experimental designs. The Latin hypercube method can be used to estimate output probability density functions, but requires supplementary rank transformations followed by stepwise regression to obtain uncertainty information on individual input parameters. A simple Cork and Bottle problem is used to illustrate the efficiency of the adjoint method relative to certain statistical sampling methods. For linear models of the form Ax=b it is shown that a complete adjoint sensitivity analysis can be made without formulating and solving the adjoint problem. This can be done either by using a special type of statistical sampling or by reformulating the primal problem and using suitable linear programming software.
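The Latin Hypercube Sampling method compared in this record stratifies each input dimension into equiprobable bins, draws one sample per bin, then shuffles the bins independently across dimensions; a minimal sketch on the unit hypercube:

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, seed=0):
    """Latin hypercube sample on [0,1)^d: exactly one point per stratum
    in every dimension, with strata paired at random across dimensions."""
    rng = np.random.default_rng(seed)
    # one uniform draw inside each of the n_samples strata, per dimension
    u = (np.arange(n_samples)[:, None]
         + rng.random((n_samples, n_dims))) / n_samples
    for j in range(n_dims):
        u[:, j] = rng.permutation(u[:, j])   # decouple strata across dims
    return u

x = latin_hypercube(10, 3)
```

Mapping each column through an inverse CDF turns this into samples of arbitrary marginals; the rank transformations and stepwise regression the record mentions then recover per-input uncertainty information.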

  12. Uncertainty budget for a whole body counter in the scan geometry and computer simulation of the calibration phantoms

    International Nuclear Information System (INIS)

At the Austrian Research Centers Seibersdorf (ARCS), a whole body counter (WBC) in the scan geometry is used to perform routine measurements for the determination of radioactive intake of workers. The calibration of the WBC is made using bottle phantoms with a homogeneous activity distribution. The same calibration procedures have been simulated using the Monte Carlo N-Particle (MCNP) code and FLUKA, and the resulting full energy peak efficiencies for eight energies and five phantoms have been compared with the experimental results. The deviation between experiment and simulation results is within 10%. Furthermore, uncertainty budget evaluations have been performed to find out which parameters make substantial contributions to these differences. Therefore, statistical errors of the Monte Carlo simulation, uncertainties in the cross section tables and differences due to geometrical considerations have been taken into account. Comparisons between these results and those obtained with inhomogeneous distributions, for which the activity is concentrated only in certain parts of the body (such as the head, lungs, arms and legs), have been performed. The maximum deviation of 43% from the homogeneous case has been found when the activity is concentrated on the arms. (authors)

  13. A new surrogate modeling technique combining Kriging and polynomial chaos expansions – Application to uncertainty analysis in computational dosimetry

    OpenAIRE

    Kersaudy, Pierric; Sudret, Bruno; Varsier, Nadège; Picon, Odile; Wiart, Joe

    2015-01-01

    In numerical dosimetry, the recent advances in high performance computing led to a strong reduction of the required computational time to assess the specific absorption rate (SAR) characterizing the human exposure to electromagnetic waves. However, this procedure remains time-consuming and a single simulation can request several hours. As a consequence, the influence of uncertain input parameters onto the SAR cannot be analyzed using crude Monte Carlo simulation. The solution presented here t...
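The polynomial chaos component of the combined surrogate can be sketched as a non-intrusive regression fit onto probabilists' Hermite polynomials, after which the output mean and variance follow directly from the coefficients. The toy model below is illustrative, not the dosimetry code:

```python
import numpy as np
from math import factorial

def hermite_probabilists(x, order):
    """Probabilists' Hermite polynomials He_0..He_order via the recurrence
    He_{k+1}(x) = x He_k(x) - k He_{k-1}(x)."""
    H = [np.ones_like(x), x.copy()]
    for k in range(1, order):
        H.append(x * H[k] - k * H[k - 1])
    return np.column_stack(H[: order + 1])

# Regression-based PCE of Y = f(X) with X ~ N(0,1) (non-intrusive approach)
rng = np.random.default_rng(0)
x = rng.normal(size=2000)
f = lambda x: x ** 2 + 0.5 * x          # toy stand-in "model"
order = 3
Phi = hermite_probabilists(x, order)
coef, *_ = np.linalg.lstsq(Phi, f(x), rcond=None)

# Output moments follow directly from the orthogonality of the basis:
pce_mean = coef[0]
pce_var = sum(coef[k] ** 2 * factorial(k) for k in range(1, order + 1))
```

This coefficient-to-moment shortcut is precisely what makes PCE attractive when each model run (here, an SAR simulation) takes hours.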

  14. Artifacts in conventional computed tomography (CT) and free breathing four-dimensional CT induce uncertainty in gross tumor volume determination

    DEFF Research Database (Denmark)

    Persson, Gitte Fredberg; Nygaard, Ditte Eklund; Munck af Rosenschöld, Per;

    2011-01-01

Purpose Artifacts impacting the imaged tumor volume can be seen in conventional three-dimensional CT (3DCT) scans for planning of lung cancer radiotherapy but can be reduced with the use of respiration-correlated imaging, i.e., 4DCT or breathhold CT (BHCT) scans. The aim of this study was to compare delineated gross tumor volume (GTV) sizes in 3DCT, 4DCT, and BHCT scans of patients with lung tumors. Methods and Materials A total of 36 patients with 46 tumors referred for stereotactic radiotherapy of lung tumors were included. All patients underwent positron emission tomography (PET)/CT, 4DCT, and BHCT scans. GTVs in all CT scans of individual patients were delineated during one session by a single physician to minimize systematic delineation uncertainty. The GTV size from the BHCT was considered the closest to true tumor volume and was chosen as the reference. The reference GTV size was...

  15. Understanding uncertainty

    CERN Document Server

    Lindley, Dennis V

    2013-01-01

    Praise for the First Edition: "...a reference for everyone who is interested in knowing and handling uncertainty." - Journal of Applied Statistics. The critically acclaimed First Edition of Understanding Uncertainty provided a study of uncertainty addressed to scholars in all fields, showing that uncertainty could be measured by probability, and that probability obeyed three basic rules that enabled uncertainty to be handled sensibly in everyday life. These ideas were extended to embrace the scientific method and to show how decisions, containing an uncertain element, could be rationally made.

  16. Selection of low activation materials for fusion power plants using ACAB system: the effect of computational methods and cross section uncertainties on waste management assessment

    Energy Technology Data Exchange (ETDEWEB)

    Alonso, M.; Sanz, J.; Rodriguez, A.; Falquina, R. [Universidad Nacional de Educacion a Distancia (UNED), Dept. of Power Engineering, Madrid (Spain); Cabellos, O.; Sanz, J. [Universidad Politecnica de Madrid, Instituto de Fusion Nuclear (UPM) (Spain)

    2003-07-01

    The feasibility of nuclear fusion as a realistic option for energy generation depends on its radioactive waste management assessment. In this respect, the production of high level waste is to be avoided and the reduction of low level waste volumes is to be enhanced. Three different waste management options are commonly considered for fusion plants: Hands-on Recycling, Remote Recycling and Shallow Land Burial (SLB). Important research work has therefore been undertaken to find low activation structural materials. In performing this task, a major issue is to compute the concentration limits (CLs) for all natural elements, which are used to select the intended constituent elements of a particular Low Activation Material (LAM) and to assess how much impurities can degrade the waste management properties. Nevertheless, the reliable computation of CLs depends on the accuracy of nuclear data (mainly activation cross-sections) and on the suitability of the computational method for both inertial and magnetic fusion environments. In this paper the importance of nuclear data uncertainties and of the mathematical algorithms used in different activation calculations for waste management purposes is studied. Our work is centred on the study of {sup 186}W activation under first structural wall conditions of the Hylife-II inertial fusion reactor design. The importance of the dominant transmutation/decay sequence has been documented in several publications. From a practical point of view, W is used in low activation materials for fusion applications (Cr-W ferritic/martensitic steels), and the need to better compute its activation has been assessed, in particular in relation to the cross-section uncertainties for reactions leading to Ir isotopes. {sup 192n}Ir and {sup 192}Ir reach a secular equilibrium, and {sup 192n}Ir is the critical one for waste management, with a half-life of 241 years. From a theoretical point of view, this is one of the most complex chains appearing in

  17. Computing the Risk of Postprandial Hypo- and Hyperglycemia in Type 1 Diabetes Mellitus Considering Intrapatient Variability and Other Sources of Uncertainty

    Science.gov (United States)

    García-Jaramillo, Maira; Calm, Remei; Bondia, Jorge; Tarín, Cristina; Vehí, Josep

    2009-01-01

    Objective The objective of this article was to develop a methodology to quantify the risk of suffering different grades of hypo- and hyperglycemia episodes in the postprandial state. Methods Interval predictions of patient postprandial glucose were performed during a 5-hour period after a meal for a set of 3315 scenarios. Uncertainty in the patient's insulin sensitivities and carbohydrate (CHO) contents of the planned meal was considered. A normalized area under the curve of the worst-case predicted glucose excursion for severe and mild hypo- and hyperglycemia glucose ranges was obtained and weighted according to their importance. As a result, a comprehensive risk measure was obtained. A reference model of preprandial glucose values representing the behavior in different ranges was chosen by a χ2 test. The relationship between the computed risk index and the probability of occurrence of events was analyzed for these reference models through 19,500 Monte Carlo simulations. Results The obtained reference models for each preprandial glucose range were 100, 160, and 220 mg/dl. A relationship between the risk index ranges and the probability of occurrence of mild and severe postprandial hyper- and hypoglycemia can be derived. Conclusions When intrapatient variability and uncertainty in the CHO content of the meal are considered, a safer prediction of possible hyper- and hypoglycemia episodes induced by the tested insulin therapy can be calculated. PMID:20144339
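
As a rough illustration of a risk index built from a worst-case postprandial glucose prediction, the sketch below weights the time spent in hypothetical glycemic bands. The band limits and weights are illustrative, and weighted time-in-range is used as a simplified stand-in for the paper's weighted normalized area under the curve:

```python
import numpy as np

# Hypothetical worst-case predicted glucose excursion (mg/dl) over 5 h after a meal.
t = np.linspace(0.0, 300.0, 61)                          # minutes
glucose = 140.0 + 80.0 * np.exp(-((t - 60.0) / 60.0) ** 2)

# Illustrative glycemic bands (low, high, weight); higher weight = more severe.
bands = {
    "mild_hyper":   (180.0, 250.0, 1.0),
    "severe_hyper": (250.0, 600.0, 2.0),
    "mild_hypo":    (54.0, 70.0, 2.0),
    "severe_hypo":  (0.0, 54.0, 4.0),
}

# Weighted fraction of time spent in each band (a simplification of the
# normalized area under the worst-case excursion used in the paper).
risk = 0.0
for low, high, weight in bands.values():
    in_band = (glucose >= low) & (glucose < high)
    risk += weight * in_band.mean()
print(f"risk index = {risk:.3f}")
```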

  18. Uncertainty in Environmental Economics

    OpenAIRE

    Robert S. Pindyck

    2006-01-01

    In a world of certainty, the design of environmental policy is relatively straightforward, and boils down to maximizing the present value of the flow of social benefits minus costs. But the real world is one of considerable uncertainty -- over the physical and ecological impact of pollution, over the economic costs and benefits of reducing it, and over the discount rates that should be used to compute present values. The implications of uncertainty are complicated by the fact that most enviro...

  19. A research on the verification of models used in the computational codes and the uncertainty reduction method for the containment integrity evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Moo Hwan; Seo, Kyoung Woo [POSTECH, Pohang (Korea, Republic of)

    2001-03-15

    In the probability approach, the calculated CCFPs of all the scenarios were zero, which meant that for all accident scenarios the maximum pressure load induced by DCH was expected to be lower than the containment failure pressure obtained from the fragility curve. Thus, it can be stated that the KSNP containment is robust to the DCH threat. The uncertainty of the computer codes used in the two (deterministic and probabilistic) approaches was reduced by sensitivity tests and by the verification and comparison of the DCH models in each code. This research thus provided an integrated evaluation of the DCH issue and set out an accurate methodology to assess containment integrity for operating PWRs in Korea.

  20. Minimising uncertainty induced by temperature extrapolations of thermodynamic data: a pragmatic view on the integration of thermodynamic databases into geochemical computer codes

    International Nuclear Information System (INIS)

    Incorporation of temperature corrections is gaining priority in geochemical modelling computer codes with built-in thermodynamic databases used in performance assessment for nuclear waste management. As no experimental data at elevated temperatures are available, e.g. for many actinide and lanthanide species, the simplest one-term extrapolations of equilibrium constants are usually assumed in practice. Such extrapolations, if set inappropriately, may accumulate large additional uncertainty at temperatures above 100 deg C. Such errors can be avoided because one-, two- and three-term extrapolations have great predictive potential for isoelectric/iso-coulombic reactions, which has to be explored and extensively used in geochemical modelling by LMA and/or GEM algorithms. This can be done efficiently and consistently by implementing a built-in 'hybrid' database combining 'kernel' thermochemical/EoS data for substances with 'extension' reaction-defined data for other species. (author)
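
The simplest one-term extrapolation mentioned above can be illustrated with the van't Hoff relation under the assumption of a temperature-independent reaction enthalpy; the function name and the example values are hypothetical:

```python
import math

R = 8.314462618  # gas constant, J/(mol*K)

def logk_extrapolated(logk_25, delta_h_r, temp_k):
    """One-term van't Hoff extrapolation of log10 K from 25 degC to temp_k,
    assuming a temperature-independent reaction enthalpy delta_h_r (J/mol).
    Two- and three-term forms add heat-capacity corrections on top of this.

        log10 K(T) = log10 K(T0) - dH / (R ln 10) * (1/T - 1/T0)
    """
    t0 = 298.15
    return logk_25 - delta_h_r / (R * math.log(10)) * (1.0 / temp_k - 1.0 / t0)
```

For an endothermic reaction (positive enthalpy), the extrapolated log K rises with temperature, e.g. `logk_extrapolated(-10.0, 50e3, 373.15)` gives roughly -8.24 for these made-up inputs.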

  1. A new surrogate modeling technique combining Kriging and polynomial chaos expansions – Application to uncertainty analysis in computational dosimetry

    Energy Technology Data Exchange (ETDEWEB)

    Kersaudy, Pierric, E-mail: pierric.kersaudy@orange.com [Orange Labs, 38 avenue du Général Leclerc, 92130 Issy-les-Moulineaux (France); Whist Lab, 38 avenue du Général Leclerc, 92130 Issy-les-Moulineaux (France); ESYCOM, Université Paris-Est Marne-la-Vallée, 5 boulevard Descartes, 77700 Marne-la-Vallée (France); Sudret, Bruno [ETH Zürich, Chair of Risk, Safety and Uncertainty Quantification, Stefano-Franscini-Platz 5, 8093 Zürich (Switzerland); Varsier, Nadège [Orange Labs, 38 avenue du Général Leclerc, 92130 Issy-les-Moulineaux (France); Whist Lab, 38 avenue du Général Leclerc, 92130 Issy-les-Moulineaux (France); Picon, Odile [ESYCOM, Université Paris-Est Marne-la-Vallée, 5 boulevard Descartes, 77700 Marne-la-Vallée (France); Wiart, Joe [Orange Labs, 38 avenue du Général Leclerc, 92130 Issy-les-Moulineaux (France); Whist Lab, 38 avenue du Général Leclerc, 92130 Issy-les-Moulineaux (France)

    2015-04-01

    In numerical dosimetry, recent advances in high performance computing have led to a strong reduction of the computational time required to assess the specific absorption rate (SAR) characterizing human exposure to electromagnetic waves. However, this procedure remains time-consuming and a single simulation can require several hours. As a consequence, the influence of uncertain input parameters on the SAR cannot be analyzed using crude Monte Carlo simulation. The solution presented here to perform such an analysis is surrogate modeling. This paper proposes a novel approach to build such a surrogate model from a design of experiments. Considering a sparse representation of the polynomial chaos expansions using least-angle regression as a selection algorithm to retain the most influential polynomials, this paper proposes to use the selected polynomials as regression functions for the universal Kriging model. Leave-one-out cross validation is used to select the optimal number of polynomials in the deterministic part of the Kriging model. The proposed approach, called LARS-Kriging-PC modeling, is applied to three benchmark examples and then to a full-scale metamodeling problem involving the exposure of a numerical fetus model to a femtocell device. The performance of LARS-Kriging-PC is compared to that of an ordinary Kriging model and of a classical sparse polynomial chaos expansion. LARS-Kriging-PC appears to perform better than the two other approaches. A significant accuracy improvement is observed compared to the ordinary Kriging or to the sparse polynomial chaos, depending on the studied case. This approach seems to be an optimal solution between the two other classical approaches. A global sensitivity analysis is finally performed on the LARS-Kriging-PC model of the fetus exposure problem.
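
A minimal sketch of the two-stage idea (least-angle regression to select polynomial regressors, then a Kriging-type model) is given below. It uses scikit-learn, which does not expose universal Kriging with an arbitrary polynomial trend, so the trend and a Gaussian process on its residuals are fitted separately; this is an approximation of the method described above, not the authors' implementation, and the toy model and design are made up:

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import Lars, LinearRegression
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(60, 2))     # design of experiments, 2 uncertain inputs
y = X[:, 0] ** 2 + 0.5 * X[:, 1] + 0.1 * np.sin(5.0 * X[:, 0])  # toy model output

# 1) Candidate polynomial basis (a stand-in for an orthonormal chaos basis).
poly = PolynomialFeatures(degree=3, include_bias=False)
P = poly.fit_transform(X)

# 2) Least-angle regression retains the most influential polynomials.
selected = np.flatnonzero(Lars(n_nonzero_coefs=4).fit(P, y).coef_)

# 3) Two-stage stand-in for universal Kriging: least-squares trend on the
# selected polynomials, then a GP fitted to the trend residuals.
trend = LinearRegression().fit(P[:, selected], y)
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), alpha=1e-8)
gp.fit(X, y - trend.predict(P[:, selected]))

def surrogate(X_new):
    """Surrogate prediction = polynomial trend + GP correction."""
    return trend.predict(poly.transform(X_new)[:, selected]) + gp.predict(X_new)
```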

  2. A new surrogate modeling technique combining Kriging and polynomial chaos expansions – Application to uncertainty analysis in computational dosimetry

    International Nuclear Information System (INIS)

    In numerical dosimetry, recent advances in high performance computing have led to a strong reduction of the computational time required to assess the specific absorption rate (SAR) characterizing human exposure to electromagnetic waves. However, this procedure remains time-consuming and a single simulation can require several hours. As a consequence, the influence of uncertain input parameters on the SAR cannot be analyzed using crude Monte Carlo simulation. The solution presented here to perform such an analysis is surrogate modeling. This paper proposes a novel approach to build such a surrogate model from a design of experiments. Considering a sparse representation of the polynomial chaos expansions using least-angle regression as a selection algorithm to retain the most influential polynomials, this paper proposes to use the selected polynomials as regression functions for the universal Kriging model. Leave-one-out cross validation is used to select the optimal number of polynomials in the deterministic part of the Kriging model. The proposed approach, called LARS-Kriging-PC modeling, is applied to three benchmark examples and then to a full-scale metamodeling problem involving the exposure of a numerical fetus model to a femtocell device. The performance of LARS-Kriging-PC is compared to that of an ordinary Kriging model and of a classical sparse polynomial chaos expansion. LARS-Kriging-PC appears to perform better than the two other approaches. A significant accuracy improvement is observed compared to the ordinary Kriging or to the sparse polynomial chaos, depending on the studied case. This approach seems to be an optimal solution between the two other classical approaches. A global sensitivity analysis is finally performed on the LARS-Kriging-PC model of the fetus exposure problem.

  3. Conundrums with uncertainty factors.

    Science.gov (United States)

    Cooke, Roger

    2010-03-01

    The practice of uncertainty factors as applied to noncancer endpoints in the IRIS database harkens back to traditional safety factors. In the era before risk quantification, these were used to build in a "margin of safety." As risk quantification takes hold, the safety factor methods yield to quantitative risk calculations to guarantee safety. Many authors believe that uncertainty factors can be given a probabilistic interpretation as ratios of response rates, and that the reference values computed according to the IRIS methodology can thus be converted to random variables whose distributions can be computed with Monte Carlo methods, based on the distributions of the uncertainty factors. Recent proposals from the National Research Council echo this view. Based on probabilistic arguments, several authors claim that the current practice of uncertainty factors is overprotective. When interpreted probabilistically, uncertainty factors entail very strong assumptions on the underlying response rates. For example, the factor for extrapolating from animal to human is the same whether the dosage is chronic or subchronic. Together with independence assumptions, these assumptions entail that the covariance matrix of the logged response rates is singular. In other words, the accumulated assumptions entail a log-linear dependence between the response rates. This in turn means that any uncertainty analysis based on these assumptions is ill-conditioned; it effectively computes uncertainty conditional on a set of zero probability. The practice of uncertainty factors is due for a thorough review. Two directions are briefly sketched, one based on standard regression models, and one based on nonparametric continuous Bayesian belief nets. PMID:20030767
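
The probabilistic reinterpretation of uncertainty factors discussed above can be illustrated with a small Monte Carlo sketch; the lognormal parameters and factor values are hypothetical, chosen only to show how a deterministic reference value turns into a distribution:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Point of departure (e.g. a NOAEL, mg/kg-day) divided by uncertainty factors.
noael = 10.0
rfd_deterministic = noael / (10 * 10 * 3)   # animal-to-human, inter-human, database

# Probabilistic reinterpretation: each factor becomes a lognormal ratio whose
# median equals the traditional factor, assumed independent (the assumptions
# criticized in the abstract above).
uf_animal = rng.lognormal(mean=np.log(10), sigma=0.4, size=n)
uf_human  = rng.lognormal(mean=np.log(10), sigma=0.4, size=n)
uf_db     = rng.lognormal(mean=np.log(3),  sigma=0.3, size=n)
rfd = noael / (uf_animal * uf_human * uf_db)

print(f"deterministic RfD: {rfd_deterministic:.4f}")
print(f"5th-95th percentile RfD: {np.percentile(rfd, [5, 95])}")
```

By construction the median of the simulated distribution matches the deterministic value, while the tails show how wide the implied uncertainty becomes.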

  4. PIV uncertainty propagation

    Science.gov (United States)

    Sciacchitano, Andrea; Wieneke, Bernhard

    2016-08-01

    This paper discusses the propagation of the instantaneous uncertainty of PIV measurements to statistical and instantaneous quantities of interest derived from the velocity field. The expression of the uncertainty of vorticity, velocity divergence, mean value and Reynolds stresses is derived. It is shown that the uncertainty of vorticity and velocity divergence requires the knowledge of the spatial correlation between the error of the x and y particle image displacement, which depends upon the measurement spatial resolution. The uncertainty of statistical quantities is often dominated by the random uncertainty due to the finite sample size and decreases with the square root of the effective number of independent samples. Monte Carlo simulations are conducted to assess the accuracy of the uncertainty propagation formulae. Furthermore, three experimental assessments are carried out. In the first experiment, a turntable is used to simulate a rigid rotation flow field. The estimated uncertainty of the vorticity is compared with the actual vorticity error root-mean-square, with differences between the two quantities within 5–10% for different interrogation window sizes and overlap factors. A turbulent jet flow is investigated in the second experimental assessment. The reference velocity, which is used to compute the reference value of the instantaneous flow properties of interest, is obtained with an auxiliary PIV system, which features a higher dynamic range than the measurement system. Finally, the uncertainty quantification of statistical quantities is assessed via PIV measurements in a cavity flow. The comparison between estimated uncertainty and actual error demonstrates the accuracy of the proposed uncertainty propagation methodology.
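
The statement above that the random uncertainty of a statistical quantity decreases with the square root of the effective number of independent samples can be illustrated on a synthetic time series (all values hypothetical):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic time series of one velocity component at a point (m/s): a true
# mean plus turbulent fluctuations plus instantaneous measurement noise.
n = 2000
u = 5.0 + 0.3 * rng.standard_normal(n) + 0.05 * rng.standard_normal(n)

# For statistically independent samples N_eff = N; temporal correlation
# between samples would reduce N_eff below N.
n_eff = n
u_mean = u.mean()
u_mean_uncertainty = u.std(ddof=1) / np.sqrt(n_eff)  # random uncertainty of the mean
print(f"mean = {u_mean:.3f} +/- {u_mean_uncertainty:.3f}")
```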

  5. Calibration uncertainty

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Anglov, Thomas

    2002-01-01

    Methods recommended by the International Standardization Organisation and Eurachem are not satisfactory for the correct estimation of calibration uncertainty. A novel approach is introduced and tested on actual calibration data for the determination of Pb by ICP-AES. The improved calibration uncertainty was verified from independent measurements of the same sample by demonstrating statistical control of analytical results and the absence of bias. The proposed method takes into account uncertainties of the measurement, as well as of the amount of calibrant. It is applicable to all types of…

  6. Introduction to uncertainty quantification

    CERN Document Server

    Sullivan, T J

    2015-01-01

    Uncertainty quantification is a topic of increasing practical importance at the intersection of applied mathematics, statistics, computation, and numerous application areas in science and engineering. This text provides a framework in which the main objectives of the field of uncertainty quantification are defined, and an overview of the range of mathematical methods by which they can be achieved. Complete with exercises throughout, the book will equip readers with both theoretical understanding and practical experience of the key mathematical and algorithmic tools underlying the treatment of uncertainty in modern applied mathematics. Students and readers alike are encouraged to apply the mathematical methods discussed in this book to their own favourite problems to understand their strengths and weaknesses, also making the text suitable as a self-study. This text is designed as an introduction to uncertainty quantification for senior undergraduate and graduate students with a mathematical or statistical back...

  7. Uncertainty theory

    CERN Document Server

    Liu, Baoding

    2015-01-01

    When no samples are available to estimate a probability distribution, we have to invite some domain experts to evaluate the belief degree that each event will happen. Perhaps some people think that the belief degree should be modeled by subjective probability or fuzzy set theory. However, it is usually inappropriate because both of them may lead to counterintuitive results in this case. In order to rationally deal with belief degrees, uncertainty theory was founded in 2007 and subsequently studied by many researchers. Nowadays, uncertainty theory has become a branch of axiomatic mathematics for modeling belief degrees. This is an introductory textbook on uncertainty theory, uncertain programming, uncertain statistics, uncertain risk analysis, uncertain reliability analysis, uncertain set, uncertain logic, uncertain inference, uncertain process, uncertain calculus, and uncertain differential equation. This textbook also shows applications of uncertainty theory to scheduling, logistics, networks, data mining, c...

  8. Sensitivity and uncertainty analysis

    CERN Document Server

    Cacuci, Dan G; Navon, Ionel Michael

    2005-01-01

    As computer-assisted modeling and analysis of physical processes have continued to grow and diversify, sensitivity and uncertainty analyses have become indispensable scientific tools. Sensitivity and Uncertainty Analysis. Volume I: Theory focused on the mathematical underpinnings of two important methods for such analyses: the Adjoint Sensitivity Analysis Procedure and the Global Adjoint Sensitivity Analysis Procedure. This volume concentrates on the practical aspects of performing these analyses for large-scale systems. The applications addressed include two-phase flow problems, a radiative c

  9. Calibration Under Uncertainty.

    Energy Technology Data Exchange (ETDEWEB)

    Swiler, Laura Painton; Trucano, Timothy Guy

    2005-03-01

    This report is a white paper summarizing the literature and different approaches to the problem of calibrating computer model parameters in the face of model uncertainty. Model calibration is often formulated as finding the parameters that minimize the squared difference between the model-computed data (the predicted data) and the actual experimental data. This approach does not allow for explicit treatment of uncertainty or error in the model itself: the model is considered the %22true%22 deterministic representation of reality. While this approach does have utility, it is far from an accurate mathematical treatment of the true model calibration problem in which both the computed data and experimental data have error bars. This year, we examined methods to perform calibration accounting for the error in both the computer model and the data, as well as improving our understanding of its meaning for model predictability. We call this approach Calibration under Uncertainty (CUU). This talk presents our current thinking on CUU. We outline some current approaches in the literature, and discuss the Bayesian approach to CUU in detail.
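
A minimal sketch of Bayesian calibration in the spirit of CUU is given below, using a toy linear model and a random-walk Metropolis sampler; the model, noise level, and sampler settings are illustrative only, and model error and measurement error are lumped into a single Gaussian likelihood:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy experiment: data generated from y = 2.5*x with measurement noise; the
# "computer model" is m(x, theta) = theta*x. We sample the posterior of theta
# with random-walk Metropolis instead of minimizing a squared difference.
x = np.linspace(0.0, 1.0, 20)
y_obs = 2.5 * x + 0.05 * rng.standard_normal(x.size)
sigma = 0.05  # assumed combined model + measurement standard deviation

def log_post(theta):
    resid = y_obs - theta * x
    return -0.5 * np.sum((resid / sigma) ** 2)   # flat prior on theta

samples, theta = [], 1.0
for _ in range(20_000):
    prop = theta + 0.1 * rng.standard_normal()
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop
    samples.append(theta)
post = np.array(samples[5000:])  # discard burn-in
print(f"posterior theta: {post.mean():.3f} +/- {post.std():.3f}")
```

The point of the posterior spread is exactly the CUU message above: the calibrated parameter carries an uncertainty, not a single deterministic best-fit value.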

  10. Demand Uncertainty

    DEFF Research Database (Denmark)

    Nguyen, Daniel Xuyen

    This paper presents a model of trade that explains why firms wait to export and why many exporters fail. Firms face uncertain demands that are only realized after the firm enters the destination. The model retools the timing of uncertainty resolution found in productivity heterogeneity models. This … untested destinations. The option to forecast demands causes firms to delay exporting in order to gather more information about foreign demand. Third, since uncertainty is resolved after entry, many firms enter a destination and then exit after learning that they cannot profit. This prediction reconciles the high rate of exit seen in the first years of exporting. Finally, when faced with multiple countries in which to export, some firms will choose to sequentially export in order to slowly learn more about their chances for success in untested markets.

  11. Orbital State Uncertainty Realism

    Science.gov (United States)

    Horwood, J.; Poore, A. B.

    2012-09-01

    Fundamental to the success of the space situational awareness (SSA) mission is the rigorous inclusion of uncertainty in the space surveillance network. The *proper characterization of uncertainty* in the orbital state of a space object is a common requirement to many SSA functions including tracking and data association, resolution of uncorrelated tracks (UCTs), conjunction analysis and probability of collision, sensor resource management, and anomaly detection. While tracking environments, such as air and missile defense, make extensive use of Gaussian and local linearity assumptions within algorithms for uncertainty management, space surveillance is inherently different due to long time gaps between updates, high misdetection rates, nonlinear and non-conservative dynamics, and non-Gaussian phenomena. The latter implies that "covariance realism" is not always sufficient. SSA also requires "uncertainty realism"; the proper characterization of both the state and covariance and all non-zero higher-order cumulants. In other words, a proper characterization of a space object's full state *probability density function (PDF)* is required. In order to provide a more statistically rigorous treatment of uncertainty in the space surveillance tracking environment and to better support the aforementioned SSA functions, a new class of multivariate PDFs are formulated which more accurately characterize the uncertainty of a space object's state or orbit. The new distribution contains a parameter set controlling the higher-order cumulants which gives the level sets a distinctive "banana" or "boomerang" shape and degenerates to a Gaussian in a suitable limit. Using the new class of PDFs within the general Bayesian nonlinear filter, the resulting filter prediction step (i.e., uncertainty propagation) is shown to have the *same computational cost as the traditional unscented Kalman filter* with the former able to maintain a proper characterization of the uncertainty for up to *ten

  12. Uncertainty-induced quantum nonlocality

    Science.gov (United States)

    Wu, Shao-xiong; Zhang, Jun; Yu, Chang-shui; Song, He-shan

    2014-01-01

    Based on the skew information, we present a quantity, uncertainty-induced quantum nonlocality (UIN) to measure the quantum correlation. It can be considered as the updated version of the original measurement-induced nonlocality (MIN) preserving the good computability but eliminating the non-contractivity problem. For 2×d-dimensional state, it is shown that UIN can be given by a closed form. In addition, we also investigate the maximal uncertainty-induced nonlocality.

  13. Deterministic uncertainty analysis

    International Nuclear Information System (INIS)

    This paper presents a deterministic uncertainty analysis (DUA) method for calculating uncertainties that has the potential to significantly reduce the number of computer runs compared to conventional statistical analysis. The method is based upon the availability of derivative and sensitivity data such as that calculated using the well known direct or adjoint sensitivity analysis techniques. Formation of response surfaces using derivative data and the propagation of input probability distributions are discussed relative to their role in the DUA method. A sample problem that models the flow of water through a borehole is used as a basis to compare the cumulative distribution function of the flow rate as calculated by the standard statistical methods and the DUA method. Propagation of uncertainties by the DUA method is compared for ten cases in which the number of reference model runs was varied from one to ten. The DUA method gives a more accurate representation of the true cumulative distribution of the flow rate based upon as few as two model executions compared to fifty model executions using a statistical approach. 16 refs., 4 figs., 5 tabs
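
The core DUA idea, propagating input uncertainty through derivative information instead of many model executions, can be sketched on a toy response (a hypothetical stand-in for the borehole model, with made-up nominal values and spreads):

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy response standing in for the borehole flow model: flow ~ k * dp / ln(r/rw).
def flow(k, dp):
    return k * dp / np.log(100.0)

# Nominal inputs (permeability-like k, pressure drop dp) and standard deviations.
k0, dp0 = 1.0e-5, 2.0e5
sk, sdp = 1.0e-6, 1.0e4

# Derivative-based propagation: one nominal run plus analytic (or
# adjoint-computed) sensitivities give the first-order response variance.
df_dk = dp0 / np.log(100.0)
df_ddp = k0 / np.log(100.0)
sigma_dua = np.sqrt((df_dk * sk) ** 2 + (df_ddp * sdp) ** 2)

# Reference: brute-force Monte Carlo with many model executions.
q = flow(k0 + sk * rng.standard_normal(100_000),
         dp0 + sdp * rng.standard_normal(100_000))
print(f"DUA sigma = {sigma_dua:.4e}, MC sigma = {q.std():.4e}")
```

For this nearly linear toy response the derivative-based estimate matches the Monte Carlo result at a tiny fraction of the number of model runs, which is the advantage claimed above.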

  14. Uncertainty analysis of minimum vessel liquid inventory during a small-break LOCA in a B&W Plant: An application of the CSAU methodology using the RELAP5/MOD3 computer code

    International Nuclear Information System (INIS)

    The Nuclear Regulatory Commission (NRC) revised the emergency core cooling system licensing rule to allow the use of best estimate computer codes, provided the uncertainties of the calculations are quantified and used in the licensing and regulation process. The NRC developed a generic methodology called Code Scaling, Applicability, and Uncertainty (CSAU) to evaluate best estimate code uncertainties. The objective of this work was to adapt and demonstrate the CSAU methodology for a small-break loss-of-coolant accident (SBLOCA) in a Pressurized Water Reactor of Babcock & Wilcox Company lowered loop design using RELAP5/MOD3 as the simulation tool. The CSAU methodology was successfully demonstrated for the new set of variants defined in this project (scenario, plant design, code). However, the robustness of the reactor design to this SBLOCA scenario limits the applicability of the specific results to other plants or scenarios. Several aspects of the code were not exercised because the conditions of the transient never reached sufficient severity. The plant operator proved to be a determining factor in the course of the transient scenario, and steps were taken to include the operator in the model, simulation, and analyses

  15. Uncertainty quantification and error analysis

    Energy Technology Data Exchange (ETDEWEB)

    Higdon, Dave M [Los Alamos National Laboratory; Anderson, Mark C [Los Alamos National Laboratory; Habib, Salman [Los Alamos National Laboratory; Klein, Richard [Los Alamos National Laboratory; Berliner, Mark [OHIO STATE UNIV.; Covey, Curt [LLNL; Ghattas, Omar [UNIV OF TEXAS; Graziani, Carlo [UNIV OF CHICAGO; Seager, Mark [LLNL; Sefcik, Joseph [LLNL; Stark, Philip [UC/BERKELEY; Stewart, James [SNL

    2010-01-01

    UQ studies all sources of error and uncertainty, including: systematic and stochastic measurement error; ignorance; limitations of theoretical models; limitations of numerical representations of those models; limitations on the accuracy and reliability of computations, approximations, and algorithms; and human error. A more precise definition for UQ is suggested below.

  16. Knowledge Uncertainty and Composed Classifier

    Czech Academy of Sciences Publication Activity Database

    Klimešová, Dana; Ocelíková, E.

    2007-01-01

    Vol. 1, No. 2 (2007), pp. 101-105. ISSN 1998-0140. Institutional research plan: CEZ:AV0Z10750506. Keywords: Boosting architecture * contextual modelling * composed classifier * knowledge management * knowledge * uncertainty. Subject RIV: IN - Informatics, Computer Science

  17. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  18. Uncertainty Characterization of Reactor Vessel Fracture Toughness

    International Nuclear Information System (INIS)

    To perform fracture mechanics analysis of a reactor vessel, the fracture toughness (KIc) at various temperatures is necessary. In a best estimate approach, KIc uncertainties resulting from both lack of sufficient knowledge and randomness in some of the variables of KIc must be characterized. Although it may be argued that there is only one type of uncertainty, namely lack of perfect knowledge about the subject under study, as a matter of practice KIc uncertainties can be divided into two types: aleatory and epistemic. Aleatory uncertainty is related to uncertainty that is very difficult, if not impossible, to reduce; epistemic uncertainty, on the other hand, can be practically reduced. Distinction between aleatory and epistemic uncertainties facilitates decision-making under uncertainty and allows for proper propagation of uncertainties in the computation process. Typically, epistemic uncertainties representing, for example, parameters of a model are sampled (to generate a 'snapshot', i.e., a single value of the parameters), while the totality of aleatory uncertainties is carried through the calculation as available. In this paper a description of an approach to account for these two types of uncertainties associated with KIc has been provided. (authors)
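
The sampling strategy described at the end of the abstract (epistemic parameters fixed per outer draw, aleatory scatter carried through the inner loop) can be sketched as a two-loop Monte Carlo; all distributions and numbers below are hypothetical, not actual KIc data:

```python
import numpy as np

rng = np.random.default_rng(3)

# Two-loop sketch: each outer draw fixes one "snapshot" of the epistemic
# parameters (here, the median and scatter of a toy fracture-toughness model);
# the inner loop then carries the full aleatory scatter through the calculation.
n_outer, n_inner = 200, 1000
failure_fractions = np.empty(n_outer)
k_applied = 60.0  # hypothetical applied stress intensity, MPa*sqrt(m)

for i in range(n_outer):
    mu = rng.normal(80.0, 5.0)       # epistemic: uncertain median toughness
    cov = rng.uniform(0.10, 0.20)    # epistemic: uncertain scatter
    kic = rng.lognormal(np.log(mu), cov, size=n_inner)  # aleatory scatter
    failure_fractions[i] = np.mean(kic < k_applied)

print(f"failure probability: median {np.median(failure_fractions):.3f}, "
      f"95th percentile {np.percentile(failure_fractions, 95):.3f}")
```

The output is not one failure probability but a distribution over them: the spread across outer draws reflects epistemic uncertainty, while each inner-loop fraction reflects the aleatory scatter.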

  19. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year in which the primary emphasis of the project moves towards steady operations. Following the very successful completion of the Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in the computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies, ramping to 50M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS-specific tests to be included in Service Availa...

  20. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  1. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity for discussing the impact of, and addressing issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  2. Needs of the CSAU uncertainty method

    International Nuclear Information System (INIS)

    The use of best-estimate codes for safety analysis requires quantification of the uncertainties. These uncertainties are inherently linked to the chosen safety analysis methodology. Worldwide, various methods have been proposed for this quantification. The purpose of this paper was to identify the needs of the Code Scaling, Applicability, and Uncertainty (CSAU) methodology and then to address them. The specific procedural steps were combined from other methods for uncertainty evaluation, and new tools and procedures were proposed. The uncertainty analysis approach and tools were then utilized for a confirmatory study. The uncertainty was quantified for the RELAP5/MOD3.2 thermal-hydraulic computer code. The results of the adapted CSAU approach applied to the small-break loss-of-coolant accident (SB LOCA) show that the adapted CSAU can be used for any thermal-hydraulic safety analysis with uncertainty evaluation. However, some limitations of the CSAU approach remain to be resolved. (author)
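
    Best-estimate-plus-uncertainty methods in this family often fix the number of code runs with Wilks' nonparametric tolerance-limit formula; whether this particular CSAU adaptation does so is not stated in the abstract, so the following is only an illustrative sketch of that common ingredient:

```python
def wilks_sample_size(coverage=0.95, confidence=0.95):
    """Smallest number of code runs n such that the largest of n random runs
    bounds the `coverage` quantile of the output with probability `confidence`
    (first-order, one-sided Wilks formula: 1 - coverage**n >= confidence)."""
    n = 1
    while 1.0 - coverage**n < confidence:
        n += 1
    return n

print(wilks_sample_size())            # 59 runs for a 95%/95% statement
print(wilks_sample_size(0.95, 0.99))  # 90 runs at 99% confidence
```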

  3. Generalized Uncertainty Principle and Angular Momentum

    CERN Document Server

    Bosso, Pasquale

    2016-01-01

    Various models of quantum gravity suggest a modification of Heisenberg's uncertainty principle between position and momentum to the so-called Generalized Uncertainty Principle. In this work we show how this modification influences the theory of angular momentum in quantum mechanics. In particular, we compute Planck-scale corrections to the angular momentum eigenvalues, the hydrogen atom spectrum, the Stern-Gerlach experiment and the Clebsch-Gordan coefficients. We also examine the effects of the Generalized Uncertainty Principle on multi-particle systems.
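
    A frequently used form of the GUP (the quadratic model; the parametrisation used in the paper may differ) modifies the canonical commutator, which in turn implies a minimal measurable length:

```latex
[\hat{x}, \hat{p}] = i\hbar \left( 1 + \beta \hat{p}^{\,2} \right)
\;\;\Longrightarrow\;\;
\Delta x \, \Delta p \;\ge\; \frac{\hbar}{2} \left( 1 + \beta (\Delta p)^{2} \right)
\;\;\Longrightarrow\;\;
\Delta x_{\min} = \hbar \sqrt{\beta}
```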

  4. Fuzzy Uncertainty Evaluation for Fault Tree Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ki Beom; Shim, Hyung Jin [Seoul National University, Seoul (Korea, Republic of); Jae, Moo Sung [Hanyang University, Seoul (Korea, Republic of)

    2015-05-15

    The traditional probabilistic approach can calculate relatively accurate results, but it requires a long time because of the repetitive computation of the MC method. In addition, when informative data for statistical analysis are not sufficient, or some events are mainly caused by human error, the probabilistic approach may not be possible, because the uncertainties of these events are difficult to express as probability distributions. In order to reduce the computation time, and to quantify the uncertainties of top events when the fault tree contains basic events whose uncertainties are difficult to express as probability distributions, fuzzy uncertainty propagation based on fuzzy set theory can be applied. In this paper, we develop a fuzzy uncertainty propagation code and apply it to the fault tree of the core damage accident after a large loss of coolant accident (LLOCA). The fuzzy uncertainty propagation code is implemented and tested on the fault tree of the radiation release accident. We apply this code to the fault tree of the core damage accident after the LLOCA in three cases and compare the results with those computed by probabilistic uncertainty propagation using the MC method. The results of the fuzzy uncertainty propagation can be calculated in relatively short time, and they cover the results obtained by the probabilistic uncertainty propagation.
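
    Fuzzy propagation through a fault tree is typically carried out with α-cuts: at each membership level the fuzzy basic-event probabilities reduce to intervals, which pass through AND/OR gates by interval arithmetic (valid because the gate formulas are monotone in each probability). A small sketch with invented triangular fuzzy numbers; the events and values are not taken from the paper:

```python
def alpha_cut(tri, alpha):
    """Interval of a triangular fuzzy number (low, mode, high) at level alpha."""
    low, mode, high = tri
    return (low + alpha * (mode - low), high - alpha * (high - mode))

def and_gate(intervals):
    """AND gate: product of event probabilities (independent events)."""
    lo, hi = 1.0, 1.0
    for a, b in intervals:
        lo, hi = lo * a, hi * b
    return (lo, hi)

def or_gate(intervals):
    """OR gate: 1 - product of complements; monotone, so endpoints map to endpoints."""
    lo, hi = 1.0, 1.0
    for a, b in intervals:
        lo, hi = lo * (1.0 - a), hi * (1.0 - b)
    return (1.0 - lo, 1.0 - hi)

# Hypothetical triangular fuzzy probabilities of two basic events:
e1 = (0.01, 0.02, 0.04)
e2 = (0.005, 0.01, 0.02)

# Top event = e1 AND e2, evaluated cut by cut:
for alpha in (0.0, 0.5, 1.0):
    cuts = [alpha_cut(e1, alpha), alpha_cut(e2, alpha)]
    print(alpha, and_gate(cuts))
```

    At α = 1 the cuts collapse to the modes and the top-event interval degenerates to a point; stacking the intervals over all α levels reconstructs the fuzzy top-event probability.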

  5. Fuzzy Uncertainty Evaluation for Fault Tree Analysis

    International Nuclear Information System (INIS)

    The traditional probabilistic approach can calculate relatively accurate results, but it requires a long time because of the repetitive computation of the MC method. In addition, when informative data for statistical analysis are not sufficient, or some events are mainly caused by human error, the probabilistic approach may not be possible, because the uncertainties of these events are difficult to express as probability distributions. In order to reduce the computation time, and to quantify the uncertainties of top events when the fault tree contains basic events whose uncertainties are difficult to express as probability distributions, fuzzy uncertainty propagation based on fuzzy set theory can be applied. In this paper, we develop a fuzzy uncertainty propagation code and apply it to the fault tree of the core damage accident after a large loss of coolant accident (LLOCA). The fuzzy uncertainty propagation code is implemented and tested on the fault tree of the radiation release accident. We apply this code to the fault tree of the core damage accident after the LLOCA in three cases and compare the results with those computed by probabilistic uncertainty propagation using the MC method. The results of the fuzzy uncertainty propagation can be calculated in relatively short time, and they cover the results obtained by the probabilistic uncertainty propagation.

  6. Uncertainty of empirical correlation equations

    Science.gov (United States)

    Feistel, R.; Lovell-Smith, J. W.; Saunders, P.; Seitz, S.

    2016-08-01

    The International Association for the Properties of Water and Steam (IAPWS) has published a set of empirical reference equations of state, forming the basis of the 2010 Thermodynamic Equation of Seawater (TEOS-10), from which all thermodynamic properties of seawater, ice, and humid air can be derived in a thermodynamically consistent manner. For each of the equations of state, the parameters have been found by simultaneously fitting equations for a range of different derived quantities using large sets of measurements of these quantities. In some cases, uncertainties in these fitted equations have been assigned based on the uncertainties of the measurement results. However, because uncertainties in the parameter values have not been determined, it is not possible to estimate the uncertainty in many of the useful quantities that can be calculated using the parameters. In this paper we demonstrate how the method of generalised least squares (GLS), in which the covariance of the input data is propagated into the values calculated by the fitted equation, and in particular into the covariance matrix of the fitted parameters, can be applied to one of the TEOS-10 equations of state, namely IAPWS-95 for fluid pure water. Using the calculated parameter covariance matrix, we provide some preliminary estimates of the uncertainties in derived quantities, namely the second and third virial coefficients for water. We recommend further investigation of the GLS method for use as a standard method for calculating and propagating the uncertainties of values computed from empirical equations.
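
    For a linear model the GLS estimate and its parameter covariance have closed forms, and the parameter covariance can then be propagated to any derived quantity via its Jacobian. The sketch below uses a toy straight-line fit with invented data and a known, uncorrelated measurement variance (so GLS reduces to weighted least squares), not the IAPWS-95 equation itself:

```python
import math

# Invented data for y = b0 + b1*x, with known measurement std dev sigma.
x = [0.0, 0.25, 0.5, 0.75, 1.0]
y = [1.02, 1.48, 2.01, 2.49, 3.00]
sigma = 0.02

n = len(x)
Sx = sum(x); Sxx = sum(v * v for v in x)
Sy = sum(y); Sxy = sum(a * b for a, b in zip(x, y))
det = n * Sxx - Sx * Sx

# Least-squares estimates (GLS with identical variances):
b1 = (n * Sxy - Sx * Sy) / det
b0 = (Sy - b1 * Sx) / n

# Parameter covariance matrix, sigma^2 * (X^T X)^{-1}, written out for 2x2:
var_b0 = sigma**2 * Sxx / det
var_b1 = sigma**2 * n / det
cov_b0b1 = -sigma**2 * Sx / det

# Propagate into a derived quantity, here the fitted value at x0 = 0.5:
x0 = 0.5
var_y0 = var_b0 + 2 * x0 * cov_b0b1 + x0**2 * var_b1
print(b0, b1, math.sqrt(var_y0))
```

    The essential point is the last step: once the parameter covariance matrix is available, the uncertainty of any quantity computed from the fitted parameters follows from the usual Jacobian sandwich, which is exactly what is missing when only the fitted parameter values are published.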

  7. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources such as the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 TB. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  8. Uncertainty and Engagement with Learning Games

    Science.gov (United States)

    Howard-Jones, Paul A.; Demetriou, Skevi

    2009-01-01

    Uncertainty may be an important component of the motivation provided by learning games, especially when associated with gaming rather than learning. Three studies are reported that explore the influence of gaming uncertainty on engagement with computer-based learning games. In the first study, children (10-11 years) played a simple maths quiz.…

  9. Computer

    CERN Document Server

    Atkinson, Paul

    2011-01-01

    The pixelated rectangle we spend most of our day staring at in silence is not the television, as many long feared, but the computer: the ubiquitous portal of work and personal lives. At this point, the computer is so common we hardly notice it in our view. It's difficult to envision that, not that long ago, it was a gigantic, room-sized structure accessible to only a few, inspiring as much awe and respect as fear and mystery. Now that the machine has decreased in size and increased in popular use, the computer has become a prosaic appliance, little more noted than a toaster. These dramati...

  10. Quantification of uncertainties in composites

    Science.gov (United States)

    Liaw, D. G.; Singhal, S. N.; Murthy, P. L. N.; Chamis, Christos C.

    1993-01-01

    An integrated methodology is developed for computationally simulating the probabilistic composite material properties at all composite scales. The simulation requires minimum input consisting of the description of uncertainties at the lowest scale (fiber and matrix constituents) of the composite and in the fabrication process variables. The methodology allows the determination of the sensitivity of the composite material behavior to all the relevant primitive variables. This information is crucial for reducing the undesirable scatter in composite behavior at its macro scale by reducing the uncertainties in the most influential primitive variables at the micro scale. The methodology is computationally efficient. The computational time required by the methodology described herein is an order of magnitude less than that for Monte Carlo Simulation. The methodology has been implemented into the computer code PICAN (Probabilistic Integrated Composite ANalyzer). The accuracy and efficiency of the methodology/code are demonstrated by simulating the uncertainties in the heat-transfer, thermal, and mechanical properties of a typical laminate and comparing the results with the Monte Carlo simulation method and experimental data. The important observation is that the computational simulation for probabilistic composite mechanics has sufficient flexibility to capture the observed scatter in composite properties.
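
    Probabilistic composite codes typically beat Monte Carlo on cost by replacing sampling with approximate probability integration. As a stand-in illustration (the actual algorithms in PICAN are not described here), the sketch below propagates primitive-variable scatter through a toy rule-of-mixtures modulus with a first-order second-moment approximation and checks it against a Monte Carlo reference; all numbers are invented:

```python
import math
import random

def laminate_modulus(Ef, Em, Vf):
    """Toy rule-of-mixtures longitudinal modulus (GPa) of a composite ply."""
    return Vf * Ef + (1.0 - Vf) * Em

# Invented means and standard deviations of the primitive variables:
means  = {"Ef": 230.0, "Em": 3.5, "Vf": 0.60}
sigmas = {"Ef": 10.0,  "Em": 0.3, "Vf": 0.02}

# First-order second-moment (FOSM) propagation: one gradient evaluation
# at the mean point instead of thousands of samples.
dE_dEf = means["Vf"]
dE_dEm = 1.0 - means["Vf"]
dE_dVf = means["Ef"] - means["Em"]
var_fosm = ((dE_dEf * sigmas["Ef"]) ** 2
            + (dE_dEm * sigmas["Em"]) ** 2
            + (dE_dVf * sigmas["Vf"]) ** 2)

# Monte Carlo reference for comparison:
random.seed(0)
samples = [laminate_modulus(random.gauss(means["Ef"], sigmas["Ef"]),
                            random.gauss(means["Em"], sigmas["Em"]),
                            random.gauss(means["Vf"], sigmas["Vf"]))
           for _ in range(20000)]
mean_mc = sum(samples) / len(samples)
var_mc = sum((s - mean_mc) ** 2 for s in samples) / (len(samples) - 1)
print(math.sqrt(var_fosm), math.sqrt(var_mc))
```

    The squared gradient terms also rank the primitive variables by influence, which is the sensitivity information the methodology uses to decide where reducing micro-scale uncertainty pays off most.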

  11. Collaborative Project: The problem of bias in defining uncertainty in computationally enabled strategies for data-driven climate model development. Final Technical Report.

    Energy Technology Data Exchange (ETDEWEB)

    Huerta, Gabriel [Univ. of New Mexico, Albuquerque, NM (United States)

    2016-05-10

    The objective of the project is to develop strategies for better representing scientific sensibilities within statistical measures of model skill that can then be used within a Bayesian statistical framework for data-driven climate model development and improved measures of model scientific uncertainty. One of the thorny issues in model evaluation is quantifying the effect of biases on climate projections. While no bias is desirable, only those biases that affect feedbacks contribute to scatter in climate projections. The effort at the University of Texas is to analyze previously calculated ensembles of CAM3.1 with perturbed parameters to discover how biases affect projections of global warming. The hypothesis is that compensating errors in the control model can be identified by their effect on a combination of processes, and that developing metrics that are sensitive to dependencies among state variables would provide a way to select versions of climate models that may reduce scatter in climate projections. Gabriel Huerta at the University of New Mexico is responsible for developing statistical methods for evaluating these field dependencies. The UT effort will incorporate these developments into MECS, a set of Python scripts being developed at the University of Texas for managing the workflow associated with data-driven climate model development on HPC resources. This report reflects the main activities at the University of New Mexico, where the PI (Huerta) and the postdocs (Nosedal, Hattab and Karki) worked on the project.

  12. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  13. COMPUTING

    CERN Multimedia

    M. Kasemann, P. McBride. Edited by M-C. Sawley with contributions from P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini and M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  14. The uncertainty of valuation

    OpenAIRE

    French, N.; L. Gabrielli

    2005-01-01

    Valuation is often said to be “an art not a science”, but this relates to the techniques employed to calculate value, not to the underlying concept itself. Valuation is the process of estimating price in the market place. Yet such an estimation will be affected by uncertainties: uncertainty in the comparable information available; uncertainty in the current and future market conditions; and uncertainty in the specific inputs for the subject property. These input uncertainties will translate int...

  15. Modeling Model Uncertainty

    OpenAIRE

    Onatski, Alexei; Williams, Noah

    2003-01-01

    Recently there has been much interest in studying monetary policy under model uncertainty. We develop methods to analyze different sources of uncertainty in one coherent structure useful for policy decisions. We show how to estimate the size of the uncertainty based on time series data, and incorporate this uncertainty in policy optimization. We propose two different approaches to modeling model uncertainty. The first is model error modeling, which imposes additional structure on the errors o...

  16. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and to make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid-year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2s are associated with Physics Groups. Such associations are decided twice per ye...

  17. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing Readiness Challenges (CCRC’08), the CMS computing team achieved very good results. The link between the detector site and the Tier-0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier-0, processing a massive number of very large files and with a high writing speed to tapes. Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier-1s: response time, data transfer rate and success rate for Tape-to-Buffer staging of files kept exclusively on tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  18. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  19. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise, and has been working with CMS sites on improving open issues relevant for data taking. At the same time, operations for MC production, real data reconstruction and re-reconstructions, and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave a good indication of the readiness of the WLCG infrastructure, with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated that they are fully capable of performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  20. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  1. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  2. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  3. Assessment of the computational uncertainty of temperature rise and SAR in the eyes and brain under far-field exposure from 1 to 10 GHz

    International Nuclear Information System (INIS)

    This paper presents finite-difference time-domain (FDTD) calculations of specific absorption rate (SAR) values in the head under plane-wave exposure from 1 to 10 GHz using a resolution of 0.5 mm in adult male and female voxel models. Temperature rise due to the power absorption is calculated by the bioheat equation using a multigrid method solver. The computational accuracy is investigated by repeating the calculations with resolutions of 1 mm and 2 mm and comparing the results. Cubically averaged 10 g SAR in the eyes and brain and eye-averaged SAR are calculated and compared to the corresponding temperature rise as well as the recommended limits for exposure. The results suggest that 2 mm resolution should only be used for frequencies smaller than 2.5 GHz, and 1 mm resolution only under 5 GHz. Morphological differences in models seemed to be an important cause of variation: differences in results between the two different models were usually larger than the computational error due to the grid resolution, and larger than the difference between the results for open and closed eyes. Limiting the incident plane-wave power density to smaller than 100 W m⁻² was sufficient for ensuring that the temperature rise in the eyes and brain were less than 1 °C in the whole frequency range.
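
    Comparing solutions at successively doubled resolutions, as done here, is the standard setting for Richardson extrapolation: three nested grids give an observed convergence order and a grid-independent estimate of the solution. A sketch with invented SAR values (the paper's numbers are not reproduced):

```python
import math

def observed_order(f_h, f_2h, f_4h, r=2.0):
    """Observed convergence order from solutions on three nested grids."""
    return math.log((f_4h - f_2h) / (f_2h - f_h)) / math.log(r)

def richardson(f_h, f_2h, p, r=2.0):
    """Extrapolated grid-independent estimate and the fine-grid error."""
    f_exact = f_h + (f_h - f_2h) / (r**p - 1.0)
    return f_exact, f_exact - f_h

# Hypothetical SAR values (W/kg) at 0.5 mm, 1 mm and 2 mm resolutions:
f_h, f_2h, f_4h = 1.000, 1.040, 1.200
p = observed_order(f_h, f_2h, f_4h)
f_ext, err = richardson(f_h, f_2h, p)
print(p, f_ext, err)
```

    With these made-up values the observed order is 2, and the extrapolation attaches a numerical-error estimate to the fine-grid result rather than leaving the resolution comparison qualitative.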

  4. Assessment of the computational uncertainty of temperature rise and SAR in the eyes and brain under far-field exposure from 1 to 10 GHz

    Science.gov (United States)

    Laakso, Ilkka

    2009-06-01

    This paper presents finite-difference time-domain (FDTD) calculations of specific absorption rate (SAR) values in the head under plane-wave exposure from 1 to 10 GHz using a resolution of 0.5 mm in adult male and female voxel models. Temperature rise due to the power absorption is calculated by the bioheat equation using a multigrid method solver. The computational accuracy is investigated by repeating the calculations with resolutions of 1 mm and 2 mm and comparing the results. Cubically averaged 10 g SAR in the eyes and brain and eye-averaged SAR are calculated and compared to the corresponding temperature rise as well as the recommended limits for exposure. The results suggest that 2 mm resolution should only be used for frequencies smaller than 2.5 GHz, and 1 mm resolution only under 5 GHz. Morphological differences in models seemed to be an important cause of variation: differences in results between the two different models were usually larger than the computational error due to the grid resolution, and larger than the difference between the results for open and closed eyes. Limiting the incident plane-wave power density to smaller than 100 W m-2 was sufficient for ensuring that the temperature rise in the eyes and brain were less than 1 °C in the whole frequency range.

  5. Assessment of the computational uncertainty of temperature rise and SAR in the eyes and brain under far-field exposure from 1 to 10 GHz

    Energy Technology Data Exchange (ETDEWEB)

    Laakso, Ilkka [Department of Radio Science and Engineering, Helsinki University of Technology, Otakaari 5 A, 02150 Espoo (Finland)], E-mail: ilkka.laakso@tkk.fi

    2009-06-07

    This paper presents finite-difference time-domain (FDTD) calculations of specific absorption rate (SAR) values in the head under plane-wave exposure from 1 to 10 GHz using a resolution of 0.5 mm in adult male and female voxel models. Temperature rise due to the power absorption is calculated by the bioheat equation using a multigrid method solver. The computational accuracy is investigated by repeating the calculations with resolutions of 1 mm and 2 mm and comparing the results. Cubically averaged 10 g SAR in the eyes and brain and eye-averaged SAR are calculated and compared to the corresponding temperature rise as well as the recommended limits for exposure. The results suggest that 2 mm resolution should only be used for frequencies smaller than 2.5 GHz, and 1 mm resolution only under 5 GHz. Morphological differences in models seemed to be an important cause of variation: differences in results between the two different models were usually larger than the computational error due to the grid resolution, and larger than the difference between the results for open and closed eyes. Limiting the incident plane-wave power density to smaller than 100 W m⁻² was sufficient for ensuring that the temperature rise in the eyes and brain were less than 1 °C in the whole frequency range.

  6. Experimental uncertainty estimation and statistics for data having interval uncertainty.

    Energy Technology Data Exchange (ETDEWEB)

    Kreinovich, Vladik (Applied Biomathematics, Setauket, New York); Oberkampf, William Louis (Applied Biomathematics, Setauket, New York); Ginzburg, Lev (Applied Biomathematics, Setauket, New York); Ferson, Scott (Applied Biomathematics, Setauket, New York); Hajagos, Janos (Applied Biomathematics, Setauket, New York)

    2007-05-01

    This report addresses the characterization of measurements that include epistemic uncertainties in the form of intervals. It reviews the application of basic descriptive statistics to data sets which contain intervals rather than exclusively point estimates. It describes algorithms to compute various means, the median and other percentiles, variance, interquartile range, moments, confidence limits, and other important statistics and summarizes the computability of these statistics as a function of sample size and characteristics of the intervals in the data (degree of overlap, size and regularity of widths, etc.). It also reviews the prospects for analyzing such data sets with the methods of inferential statistics such as outlier detection and regressions. The report explores the tradeoff between measurement precision and sample size in statistical results that are sensitive to both. It also argues that an approach based on interval statistics could be a reasonable alternative to current standard methods for evaluating, expressing and propagating measurement uncertainties.
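
    For statistics that are monotone in each data value, such as the mean and the median, the exact bounds over all point datasets consistent with the intervals are obtained simply by applying the statistic to the lower and upper endpoints separately (variance and other non-monotone statistics are much harder, as the report discusses). A minimal sketch with invented interval data:

```python
import statistics

def interval_mean(intervals):
    """Bounds of the mean over all datasets consistent with the intervals."""
    n = len(intervals)
    return (sum(a for a, _ in intervals) / n,
            sum(b for _, b in intervals) / n)

def interval_median(intervals):
    """Bounds of the median: medians of the lower and upper endpoints."""
    return (statistics.median(a for a, _ in intervals),
            statistics.median(b for _, b in intervals))

data = [(1.0, 1.2), (0.9, 1.5), (1.1, 1.3), (0.8, 1.0)]
print(interval_mean(data))
print(interval_median(data))
```

    The resulting statistics are themselves intervals, which is the sense in which interval statistics keeps the epistemic measurement uncertainty visible all the way through the analysis.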

  7. Uncertainty quantification theory, implementation, and applications

    CERN Document Server

    Smith, Ralph C

    2014-01-01

    The field of uncertainty quantification is evolving rapidly because of increasing emphasis on models that require quantified uncertainties for large-scale applications, novel algorithm development, and new computational architectures that facilitate implementation of these algorithms. Uncertainty Quantification: Theory, Implementation, and Applications provides readers with the basic concepts, theory, and algorithms necessary to quantify input and response uncertainties for simulation models arising in a broad range of disciplines. The book begins with a detailed discussion of applications where uncertainty quantification is critical for both scientific understanding and policy. It then covers concepts from probability and statistics, parameter selection techniques, frequentist and Bayesian model calibration, propagation of uncertainties, quantification of model discrepancy, surrogate model construction, and local and global sensitivity analysis. The author maintains a complementary web page where readers ca...

  8. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and a pass through the High Level Trigger (HLT). The data were then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  9. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns lead by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  10. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period into the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing a full-scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  11. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  12. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  13. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing operation has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013 with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand with a peak of over 750 Million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   The tape utilisation was a focus for the operation teams with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  14. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  15. Using dynamical uncertainty models estimating uncertainty bounds on power plant performance prediction

    DEFF Research Database (Denmark)

    Odgaard, Peter Fogh; Stoustrup, Jakob; Mataji, B.

    2007-01-01

    Predicting the performance of large scale plants can be difficult due to model uncertainties etc., meaning that one can be almost certain that the prediction will diverge from the plant performance with time. In this paper output multiplicative uncertainty models are used as dynamical models of the prediction error. These proposed dynamical uncertainty models result in an upper and lower bound on the predicted performance of the plant. The dynamical uncertainty models are used to estimate the uncertainty of the predicted performance of a coal-fired power plant. The proposed scheme, which uses dynamical models, is applied to two different sets of measured plant data. The computed uncertainty bounds cover the measured plant output, while the nominal prediction is outside these uncertainty bounds for some samples in these examples.
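How an output-multiplicative uncertainty description yields prediction bounds can be sketched minimally (the paper estimates the relative bound from prediction-error data with dynamical models; here `delta` is simply given, and the function names are mine):

```python
def prediction_envelope(y_nominal, delta):
    """Upper/lower bounds y*(1 +/- delta) from an output-multiplicative
    uncertainty model; delta[k] is the relative model-error bound at
    sample k."""
    upper = [y * (1 + d) for y, d in zip(y_nominal, delta)]
    lower = [y * (1 - d) for y, d in zip(y_nominal, delta)]
    return lower, upper

def envelope_covers(y_measured, lower, upper):
    """True if every measured sample lies inside the computed bounds."""
    return all(lo <= y <= hi
               for y, lo, hi in zip(y_measured, lower, upper))
```

The paper's validation criterion corresponds to `envelope_covers` returning True for the measured plant output while the nominal prediction itself may fall outside the bounds.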

  16. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, are provided. The GlideInWMS and components installation are now deployed at CERN, which is added to the GlideInWMS factory placed in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each others time zones by monitoring/debugging pilot jobs sent from the facto...

  17. Estimating uncertainties in complex joint inverse problems

    Science.gov (United States)

    Afonso, Juan Carlos

    2016-04-01

    Sources of uncertainty affecting geophysical inversions can be classified either as reflective (i.e. the practitioner is aware of her/his ignorance) or non-reflective (i.e. the practitioner does not know that she/he does not know!). Although we should always be conscious of the latter, the former are the ones that, in principle, can be estimated either empirically (by making measurements or collecting data) or subjectively (based on the experience of the researchers). For complex parameter estimation problems in geophysics, subjective estimation of uncertainty is the most common type. In this context, probabilistic (aka Bayesian) methods are commonly claimed to offer a natural and realistic platform from which to estimate model uncertainties. This is because in the Bayesian approach, errors (whatever their nature) can be naturally included as part of the global statistical model, the solution of which represents the actual solution to the inverse problem. However, although we agree that probabilistic inversion methods are the most powerful tool for uncertainty estimation, the common claim that they produce "realistic" or "representative" uncertainties is not always justified. Typically, all uncertainty estimates are model dependent, and therefore, besides a thorough characterization of experimental uncertainties, particular care must be paid to the uncertainty arising from model errors and input uncertainties. We recall here two quotes by G. Box and M. Gunzburger, respectively, of special significance for inversion practitioners and for this session: "…all models are wrong, but some are useful" and "computational results are believed by no one, except the person who wrote the code". In this presentation I will discuss and present examples of some problems associated with the estimation and quantification of uncertainties in complex multi-observable probabilistic inversions, and how to address them. Although the emphasis will be on sources of uncertainty related

  18. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase the availability of more sites such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a four-fold increase in throughput with respect to the LCG Resource Broker was observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhEDEx topology. Since mid-February, a transfer volume of about 12 P...

  19. Pore Velocity Estimation Uncertainties

    Science.gov (United States)

    Devary, J. L.; Doctor, P. G.

    1982-08-01

    Geostatistical data analysis techniques were used to stochastically model the spatial variability of groundwater pore velocity in a potential waste repository site. Kriging algorithms were applied to Hanford Reservation data to estimate hydraulic conductivities, hydraulic head gradients, and pore velocities. A first-order Taylor series expansion for pore velocity was used to statistically combine hydraulic conductivity, hydraulic head gradient, and effective porosity surfaces and uncertainties to characterize the pore velocity uncertainty. Use of these techniques permits the estimation of pore velocity uncertainties when pore velocity measurements do not exist. Large pore velocity estimation uncertainties were found to be located in the region where the hydraulic head gradient relative uncertainty was maximal.
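The first-order Taylor combination described above can be sketched for the pore velocity v = K·i/n (hydraulic conductivity K, hydraulic head gradient i, effective porosity n), assuming independent inputs; the function name and all numerical values in the usage note are illustrative, not from the study:

```python
import math

def pore_velocity_uncertainty(K, i, n, sK, si, sn):
    """First-order (Taylor-series) uncertainty propagation for the pore
    velocity v = K*i/n, assuming the three inputs are uncorrelated.
    Returns (v, standard deviation of v)."""
    v = K * i / n
    dv_dK = i / n            # partial derivatives at the mean point
    dv_di = K / n
    dv_dn = -K * i / n**2
    var = (dv_dK * sK)**2 + (dv_di * si)**2 + (dv_dn * sn)**2
    return v, math.sqrt(var)
```

Because the model is purely multiplicative, the relative variances simply add: (s_v/v)² = (s_K/K)² + (s_i/i)² + (s_n/n)², so three inputs known to 10 % each give roughly a 17 % pore-velocity uncertainty.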

  20. Fuel behavior model development: FRAPCON-2 uncertainty analysis option

    International Nuclear Information System (INIS)

    An automated uncertainty analysis option has been added to the FRAPCON Computer Code. The option allows the user to obtain estimates of the uncertainty in computer output variables of the code as a function of known uncertainties in input variables. The method of uncertainty analysis used is the Response Surface Method (RSM). Results of the uncertainty analysis option include estimates of the mean and variance of the output variables plus fractional contributions to that variance due to each of the input variables. An example application of the method to FRAPCON standard problem number one is presented
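A heavily simplified sketch of the Response Surface idea: replace the code by a local surrogate built from a few runs, propagate input variances analytically, and report each input's fractional contribution to the output variance. FRAPCON-2's option builds full polynomial response surfaces from designed code runs; the linear surrogate and function name below are my simplification:

```python
def linear_response_surface(model, x0, sigmas, h=1e-4):
    """Build a local linear surrogate of `model` at x0 by central
    differences, then return (mean, variance, fractional contributions)
    for independent inputs with the given standard deviations."""
    slopes = []
    for j in range(len(x0)):
        xp = list(x0); xp[j] += h
        xm = list(x0); xm[j] -= h
        slopes.append((model(xp) - model(xm)) / (2.0 * h))
    mean = model(list(x0))
    terms = [(s * sig)**2 for s, sig in zip(slopes, sigmas)]
    var = sum(terms)
    fractions = [t / var for t in terms]
    return mean, var, fractions
```

The `fractions` output mirrors the "fractional contributions to that variance due to each of the input variables" the abstract describes.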

  1. Calculation of uncertainties

    International Nuclear Information System (INIS)

    One of the most important aspects of quality assurance in any analytical activity is the estimation of measurement uncertainty. There is general agreement that 'the expression of the result of a measurement is not complete without specifying its associated uncertainty'. An analytical process is the mechanism for obtaining methodological information (the measurand) about a material system (the population). This implies the need for the definition of the problem, the choice of methods for sampling and measurement, and proper execution of these activities for obtaining information. The result of a measurement is only an approximation or estimate of the value of the measurand, which is complete only when accompanied by an estimate of the uncertainty of the analytical process. According to the 'Vocabulary of Basic and General Terms in Metrology', 'measurement uncertainty' is the parameter associated with the result of a measurement that characterizes the dispersion of the values that could reasonably be attributed to the measurand (or magnitude). This parameter could be a standard deviation or a confidence interval. The uncertainty evaluation requires a detailed look at all possible sources, but not a disproportionate one: we can make a good estimate of the uncertainty by concentrating efforts on the largest contributions. The key steps of the process of determining the uncertainty in the measurements are: specification of the measurand; identification of the sources of uncertainty; quantification of the individual components of uncertainty; calculation of the combined standard uncertainty; and reporting of the uncertainty.
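The combination step described above is, in the standard GUM-style treatment, a root-sum-square of sensitivity-weighted standard uncertainties; a minimal sketch, assuming uncorrelated input quantities (function names are mine):

```python
import math

def combined_standard_uncertainty(components):
    """components: (sensitivity coefficient c_i, standard uncertainty u_i)
    pairs for uncorrelated input quantities.
    Returns u_c = sqrt(sum((c_i * u_i)**2))."""
    return math.sqrt(sum((c * u)**2 for c, u in components))

def expanded_uncertainty(u_c, k=2.0):
    """Expanded uncertainty U = k * u_c; k = 2 gives roughly 95 %
    coverage when the output distribution is approximately normal."""
    return k * u_c
```

For correlated inputs the quadrature sum must be extended with covariance cross-terms, which is where the "detailed look at all possible sources" becomes essential.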

  2. Treatment of uncertainties in the geologic disposal of radioactive waste

    International Nuclear Information System (INIS)

    Uncertainty in the analysis of geologic waste disposal is generally considered to have three primary components: (1) computer code/model uncertainty, (2) model parameter uncertainty, and (3) scenario uncertainty. Computer code/model uncertainty arises from problems associated with determination of appropriate parameters for use in model construction, mathematical formulation of models, and numerical techniques used in conjunction with the mathematical formulation of models. Model parameter uncertainty arises from problems associated with selection of appropriate values for model input, data interpretation and possible misuse of data, and variation of data. Scenario uncertainty arises from problems associated with the 'completeness' of scenarios, the definition of parameters which describe scenarios, and the rate or probability of scenario occurrence. The preceding sources of uncertainty are discussed below

  3. Treatment of uncertainties in the geologic disposal of radioactive waste

    Energy Technology Data Exchange (ETDEWEB)

    Cranwell, R.M.

    1985-12-31

    Uncertainty in the analysis of geologic waste disposal is generally considered to have three primary components: (1) computer code/model uncertainty, (2) model parameter uncertainty, and (3) scenario uncertainty. Computer code/model uncertainty arises from problems associated with determination of appropriate parameters for use in model construction, mathematical formulation of models, and numerical techniques used in conjunction with the mathematical formulation of models. Model parameter uncertainty arises from problems associated with selection of appropriate values for model input, data interpretation and possible misuse of data, and variation of data. Scenario uncertainty arises from problems associated with the "completeness" of scenarios, the definition of parameters which describe scenarios, and the rate or probability of scenario occurrence. The preceding sources of uncertainty are discussed below.

  4. Uncertainty Quantification in Climate Modeling

    Science.gov (United States)

    Sargsyan, K.; Safta, C.; Berry, R.; Debusschere, B.; Najm, H.

    2011-12-01

    We address challenges that sensitivity analysis and uncertainty quantification methods face when dealing with complex computational models. In particular, climate models are computationally expensive and typically depend on a large number of input parameters. We consider the Community Land Model (CLM), which consists of a nested computational grid hierarchy designed to represent the spatial heterogeneity of the land surface. Each computational cell can be composed of multiple land types, and each land type can incorporate one or more sub-models describing the spatial and depth variability. Even for simulations at a regional scale, the computational cost of a single run is quite high and the number of parameters that control the model behavior is very large. Therefore, the parameter sensitivity analysis and uncertainty propagation face significant difficulties for climate models. This work employs several algorithmic avenues to address some of the challenges encountered by classical uncertainty quantification methodologies when dealing with expensive computational models, specifically focusing on the CLM as a primary application. First of all, since the available climate model predictions are extremely sparse due to the high computational cost of model runs, we adopt a Bayesian framework that effectively incorporates this lack-of-knowledge as a source of uncertainty, and produces robust predictions with quantified uncertainty even if the model runs are extremely sparse. In particular, we infer Polynomial Chaos spectral expansions that effectively encode the uncertain input-output relationship and allow efficient propagation of all sources of input uncertainties to outputs of interest. Secondly, the predictability analysis of climate models strongly suffers from the curse of dimensionality, i.e. the large number of input parameters. While single-parameter perturbation studies can be efficiently performed in a parallel fashion, the multivariate uncertainty analysis
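The Polynomial Chaos idea mentioned above can be sketched in one dimension with probabilists' Hermite polynomials: project the model onto He_k by quadrature, then read the mean and variance directly off the coefficients. (The CLM work uses high-dimensional, sparsity-aware constructions; this toy shows only the 1-D principle, and the helper names are mine.)

```python
import math

# 3-point Gauss-Hermite rule for a standard-normal weight
# (exact for polynomials up to degree 5)
_NODES = [(-math.sqrt(3.0), 1.0 / 6.0),
          (0.0, 2.0 / 3.0),
          (math.sqrt(3.0), 1.0 / 6.0)]

def _hermite(k, x):
    """Probabilists' Hermite polynomials He_0, He_1, He_2."""
    return (1.0, x, x * x - 1.0)[k]

def pc_expand(f, order=2):
    """PC coefficients c_k = E[f(X) He_k(X)] / k!  for X ~ N(0, 1),
    computed by Gauss-Hermite quadrature."""
    return [sum(w * f(x) * _hermite(k, x) for x, w in _NODES)
            / math.factorial(k)
            for k in range(order + 1)]

def pc_moments(coeffs):
    """Moments implied by the expansion: mean = c_0,
    var = sum_{k>=1} k! * c_k**2 (Hermite orthogonality)."""
    mean = coeffs[0]
    var = sum(math.factorial(k) * c * c for k, c in enumerate(coeffs) if k)
    return mean, var
```

For f(x) = x² the expansion is exact at order 2 (c = [1, 0, 1]), reproducing the analytic mean 1 and variance 2 of a chi-squared(1) variable; no further model evaluations are needed to propagate the input uncertainty.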

  5. Probabilistic Methods for Uncertainty Propagation Applied to Aircraft Design

    Science.gov (United States)

    Green, Lawrence L.; Lin, Hong-Zong; Khalessi, Mohammad R.

    2002-01-01

    Three methods of probabilistic uncertainty propagation and quantification (the method of moments, Monte Carlo simulation, and a nongradient simulation search method) are applied to an aircraft analysis and conceptual design program to demonstrate design under uncertainty. The chosen example problems appear to have discontinuous design spaces and thus these examples pose difficulties for many popular methods of uncertainty propagation and quantification. However, specific implementation features of the first and third methods chosen for use in this study enable successful propagation of small uncertainties through the program. Input uncertainties in two configuration design variables are considered. Uncertainties in aircraft weight are computed. The effects of specifying required levels of constraint satisfaction with specified levels of input uncertainty are also demonstrated. The results show, as expected, that the designs under uncertainty are typically heavier and more conservative than those in which no input uncertainties exist.
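The second of the three methods, plain Monte Carlo propagation, can be sketched as follows (the aircraft analysis code is replaced by an arbitrary callable; function name and test model are mine):

```python
import random

def monte_carlo_moments(f, means, sigmas, n=20000, seed=1):
    """Propagate independent normal input uncertainties through f by
    Monte Carlo sampling; returns (sample mean, sample variance) of the
    output.  A fixed seed makes the sketch reproducible."""
    rng = random.Random(seed)
    out = [f([rng.gauss(m, s) for m, s in zip(means, sigmas)])
           for _ in range(n)]
    mean = sum(out) / n
    var = sum((y - mean)**2 for y in out) / (n - 1)
    return mean, var
```

Unlike the method of moments, sampling needs no derivative information, which is precisely why it tolerates the discontinuous design spaces the abstract mentions, at the cost of many function evaluations.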

  6. Computational Sustainability

    OpenAIRE

    Eaton, Eric; University of Pennsylvania; Gomes, Carla P.; Cornell University; Williams, Brian; Massachusetts Institute of Technology

    2014-01-01

    Computational sustainability problems, which exist in dynamic environments with high amounts of uncertainty, provide a variety of unique challenges to artificial intelligence research and the opportunity for significant impact upon our collective future. This editorial provides an overview of artificial intelligence for computational sustainability, and introduces this special issue of AI Magazine.

  7. Background and Qualification of Uncertainty Methods

    International Nuclear Information System (INIS)

    The evaluation of uncertainty constitutes the necessary supplement of Best Estimate calculations performed to understand accident scenarios in water cooled nuclear reactors. The need arises from the imperfection of computational tools on the one side and from the interest in using such tools to get a more precise evaluation of safety margins on the other. The paper reviews the salient features of two independent approaches for estimating uncertainties associated with predictions of complex system codes. Namely, the propagation of code input error and the propagation of the calculation output error constitute the key words for identifying the methods of current interest for industrial applications. Throughout the developed methods, uncertainty bands can be derived (both upper and lower) for any desired quantity of the transient of interest. For the second case, the uncertainty method is coupled with the thermal-hydraulic code to obtain a code with the capability of internal assessment of uncertainty, whose features are discussed in more detail.

  8. Interpolation Method Needed for Numerical Uncertainty

    Science.gov (United States)

    Groves, Curtis E.; Ilie, Marcel; Schallhorn, Paul A.

    2014-01-01

    Using Computational Fluid Dynamics (CFD) to predict a flow field is an approximation to the exact problem, and uncertainties exist. The errors in CFD can be approximated via Richardson extrapolation, a method based on progressive grid refinement. To estimate the errors, the analyst must interpolate between at least three grids. This paper describes a study to find an appropriate interpolation scheme that can be used in Richardson extrapolation or another uncertainty method to approximate errors.
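Richardson extrapolation on three grids with a constant refinement ratio r can be sketched as follows; f1 is the finest-grid value. (The interpolation question the paper studies arises because real grids do not share nodes; this sketch sidesteps that by assuming co-located point values, and the function name is mine.)

```python
import math

def richardson(f1, f2, f3, r=2.0):
    """Observed order of convergence p and extrapolated 'exact' value
    from fine, medium and coarse solutions f1, f2, f3 obtained with a
    constant grid refinement ratio r:
        p       = ln((f3 - f2) / (f2 - f1)) / ln(r)
        f_exact = f1 + (f1 - f2) / (r**p - 1)"""
    p = math.log((f3 - f2) / (f2 - f1)) / math.log(r)
    f_exact = f1 + (f1 - f2) / (r**p - 1.0)
    return p, f_exact
```

A manufactured check: solutions behaving like f(h) = 1 + 0.5 h² on grids h = 0.1, 0.2, 0.4 should recover the observed order p = 2 and the grid-free value 1.0, and |f1 - f_exact| then serves as the discretization-error estimate.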

  9. Uncertainties in thick-target PIXE analysis

    International Nuclear Information System (INIS)

    Thick-target PIXE analysis involves uncertainties arising from the calculation of thick-target X-ray production in addition to the usual PIXE uncertainties. The calculation demands knowledge of ionization cross-sections, stopping powers and photon attenuation coefficients. Information on these is reviewed critically, and a computational method is used to estimate the uncertainties transmitted from this data base into results of thick-target PIXE analyses with reference to particular specimen types using beams of 2-3 MeV protons. A detailed assessment of the accuracy of thick-target PIXE is presented. (orig.)

  10. Uncertainties in Safety Analysis. A literature review

    Energy Technology Data Exchange (ETDEWEB)

    Ekberg, C. [Chalmers Univ. of Technology, Goeteborg (Sweden). Dept. of Nuclear Chemistry

    1995-05-01

    The purpose of the presented work has been to give a short summary of the origins of many uncertainties arising in the design and performance assessment of a repository for spent nuclear fuel. Some different methods to treat these uncertainties are also included. The methods and conclusions are in many cases general in the sense that they are applicable to many other disciplines where simulations are used. As a conclusion it may be noted that uncertainties of different origin have been discussed and debated, but one large group, e.g. computer simulations, for which methods for a more explicit investigation exist, has not been investigated in a satisfying way. 50 refs.

  11. Uncertainties in Safety Analysis. A literature review

    International Nuclear Information System (INIS)

    The purpose of the presented work has been to give a short summary of the origins of many uncertainties arising in the design and performance assessment of a repository for spent nuclear fuel. Some different methods to treat these uncertainties are also included. The methods and conclusions are in many cases general in the sense that they are applicable to many other disciplines where simulations are used. As a conclusion it may be noted that uncertainties of different origin have been discussed and debated, but one large group, e.g. computer simulations, for which methods for a more explicit investigation exist, has not been investigated in a satisfying way. 50 refs

  12. Uncertainty in hydrological signatures

    Science.gov (United States)

    Westerberg, I. K.; McMillan, H. K.

    2015-09-01

    Information about rainfall-runoff processes is essential for hydrological analyses, modelling and water-management applications. A hydrological, or diagnostic, signature quantifies such information from observed data as an index value. Signatures are widely used, e.g. for catchment classification, model calibration and change detection. Uncertainties in the observed data - including measurement inaccuracy and representativeness as well as errors relating to data management - propagate to the signature values and reduce their information content. Subjective choices in the calculation method are a further source of uncertainty. We review the uncertainties relevant to different signatures based on rainfall and flow data. We propose a generally applicable method to calculate these uncertainties based on Monte Carlo sampling and demonstrate it in two catchments for common signatures including rainfall-runoff thresholds, recession analysis and basic descriptive signatures of flow distribution and dynamics. Our intention is to contribute to awareness and knowledge of signature uncertainty, including typical sources, magnitude and methods for its assessment. We found that the uncertainties were often large (i.e. typical intervals of ±10-40 % relative uncertainty) and highly variable between signatures. There was greater uncertainty in signatures that use high-frequency responses, small data subsets, or subsets prone to measurement errors. There was lower uncertainty in signatures that use spatial or temporal averages. Some signatures were sensitive to particular uncertainty types such as rating-curve form. We found that signatures can be designed to be robust to some uncertainty sources. Signature uncertainties of the magnitudes we found have the potential to change the conclusions of hydrological and ecohydrological analyses, such as cross-catchment comparisons or inferences about dominant processes.
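The Monte Carlo approach proposed above can be sketched for one simple signature (the runoff ratio sum(Q)/sum(P)) with a single multiplicative data-uncertainty source; the error model and function name below are placeholder assumptions of mine, not the paper's:

```python
import random

def signature_uncertainty(flows, rainfall, rel_err=0.1, n=5000, seed=42):
    """Monte Carlo uncertainty interval for the runoff-ratio signature.
    Each realization perturbs the whole flow series by one multiplicative
    rating-curve-style error drawn from N(1, rel_err); returns the
    (2.5 %, nominal, 97.5 %) values of the signature."""
    rng = random.Random(seed)
    base = sum(flows) / sum(rainfall)
    vals = sorted(base * rng.gauss(1.0, rel_err) for _ in range(n))
    return vals[int(0.025 * n)], base, vals[int(0.975 * n)]
```

With a 10 % relative flow error the 95 % interval spans roughly ±20 % of the signature value, which is within the ±10-40 % range of relative uncertainties the abstract reports for common signatures.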

  13. Inventories and sales uncertainty

    OpenAIRE

    Caglayan, M.; Maioli, S. (Silvia); Mateut, S.

    2011-01-01

    We investigate the empirical linkages between sales uncertainty and firms' inventory investment behavior while controlling for firms' financial strength. Using large panels of manufacturing firms from several European countries we find that higher sales uncertainty leads to larger stocks of inventories. We also identify an indirect effect of sales uncertainty on inventory accumulation through the financial strength of firms. Our results provide evidence that financial strength mitigates the a...

  14. Uncertainty in artificial intelligence

    CERN Document Server

    Kanal, LN

    1986-01-01

    How to deal with uncertainty is a subject of much controversy in Artificial Intelligence. This volume brings together a wide range of perspectives on uncertainty, many of the contributors being the principal proponents in the controversy.Some of the notable issues which emerge from these papers revolve around an interval-based calculus of uncertainty, the Dempster-Shafer Theory, and probability as the best numeric model for uncertainty. There remain strong dissenting opinions not only about probability but even about the utility of any numeric method in this context.

  15. Measurement Uncertainty and Probability

    Science.gov (United States)

    Willink, Robin

    2013-02-01

    Part I. Principles: 1. Introduction; 2. Foundational ideas in measurement; 3. Components of error or uncertainty; 4. Foundational ideas in probability and statistics; 5. The randomization of systematic errors; 6. Beyond the standard confidence interval; Part II. Evaluation of Uncertainty: 7. Final preparation; 8. Evaluation using the linear approximation; 9. Evaluation without the linear approximations; 10. Uncertainty information fit for purpose; Part III. Related Topics: 11. Measurement of vectors and functions; 12. Why take part in a measurement comparison?; 13. Other philosophies; 14. An assessment of objective Bayesian methods; 15. A guide to the expression of uncertainty in measurement; 16. Measurement near a limit - an insoluble problem?; References; Index.

  16. Radon surveys and uncertainties

    International Nuclear Information System (INIS)

    Radon surveys are made primarily for estimating the radon risk for the population of an area but also for predicting the risk for inhabitants of future buildings. Therefore it is of essential importance to know the uncertainties of such predictions. The outcome of radon surveys is strongly influenced by many factors with partly large uncertainties. In most cases passive radon detectors are exposed for some weeks or months (up to one-year measurements). In these cases, the contribution of uncertainties in the calibration of the detectors to the total uncertainty is most often of less importance. The main contribution to the uncertainties comes from the unknown treatment of the detectors by the inhabitants during the exposure and by the natural fluctuation of the indoor radon concentration in time. The latter is also true for one-year measurements. Additional uncertainties are introduced when the measured data are normalized to some time period (e.g. one-year mean) or to some standardized measurement situation. Generally, it is of crucial importance to know the probability for a possible underestimation of the radon risk for an area. The main contributions to the final uncertainties, their sizes and the mathematical procedures which were used during the Austrian Radon Project (ARP) to estimate the uncertainties in the final categorization of areas in radon potential classes will be discussed. In addition, procedures which can be used to reduce some uncertainties will be presented. (author)

  17. Uncertainty and simulation

    International Nuclear Information System (INIS)

    More than 150 researchers and engineers from universities and the industrial world met to discuss the new methodologies developed around assessing uncertainty. About 20 papers were presented and the main topics were: methods to study the propagation of uncertainties, sensitivity analysis, nuclear data covariances or multi-parameter optimisation. This report gathers the contributions of CEA researchers and engineers.

  18. Uncertainty in flood risk mapping

    Science.gov (United States)

    Gonçalves, Luisa M. S.; Fonte, Cidália C.; Gomes, Ricardo

    2014-05-01

    A flood refers to a sharp increase of water level or volume in rivers and seas caused by sudden rainstorms or melting ice due to natural factors. In this paper, the flooding of riverside urban areas caused by sudden rainstorms will be studied. In this context, flooding occurs when the water runs above the level of the minor river bed and enters the major river bed. The level of the major bed determines the magnitude and risk of the flooding. The prediction of the flooding extent is usually deterministic, and corresponds to the expected limit of the flooded area. However, there are many sources of uncertainty in the process of obtaining these limits, which influence the obtained flood maps used for watershed management or as instruments for territorial and emergency planning. In addition, small variations in the delineation of the flooded area can be translated into erroneous risk prediction. Therefore, maps that reflect the uncertainty associated with the flood modeling process have started to be developed, associating a degree of likelihood with the boundaries of the flooded areas. In this paper an approach is presented that enables the influence of parameter uncertainty, dependent on the type of Land Cover Map (LCM) and Digital Elevation Model (DEM), on the estimated values of the peak flow and the delineation of flooded areas to be evaluated (different peak flows correspond to different flood areas). The approach requires modeling the DEM uncertainty and its propagation to the catchment delineation. The results obtained in this step enable a catchment with fuzzy geographical extent to be generated, where a degree of possibility of belonging to the basin is assigned to each elementary spatial unit. Since the fuzzy basin may be considered as a fuzzy set, the fuzzy area of the basin may be computed, generating a fuzzy number. The catchment peak flow is then evaluated using fuzzy arithmetic. With this methodology a fuzzy number is obtained for the peak flow
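
The fuzzy-arithmetic step described above can be sketched with triangular fuzzy numbers and alpha-cuts. This is an illustrative sketch only; the triangular representation, the runoff coefficient, and all numbers below are invented for the example and are not the authors' implementation:

```python
def alpha_cut(tfn, alpha):
    """Interval [lo, hi] of a triangular fuzzy number (left, mode, right)
    at membership level alpha."""
    left, mode, right = tfn
    lo = left + alpha * (mode - left)
    hi = right - alpha * (right - mode)
    return lo, hi

def fuzzy_scale(tfn, c):
    """Multiply a triangular fuzzy number by a positive crisp constant c."""
    left, mode, right = tfn
    return (c * left, c * mode, c * right)

# Hypothetical fuzzy basin area (km^2) and a crisp runoff coefficient:
area = (42.0, 50.0, 61.0)
peak_flow = fuzzy_scale(area, 0.8)   # fuzzy peak flow (arbitrary units)

print(alpha_cut(peak_flow, 1.0))   # the core: degenerates to a single crisp value
print(alpha_cut(peak_flow, 0.0))   # the support: the widest interval
```

Propagating through more realistic rainfall-runoff formulas works the same way, applying interval arithmetic at each alpha level and reassembling the fuzzy result.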

  19. Predictive uncertainty in auditory sequence processing

    Directory of Open Access Journals (Sweden)

    Niels Chr. Hansen

    2014-09-01

    Full Text Available Previous studies of auditory expectation have focused on the expectedness perceived by listeners retrospectively in response to events. In contrast, this research examines predictive uncertainty - a property of listeners’ prospective state of expectation prior to the onset of an event. We examine the information-theoretic concept of Shannon entropy as a model of predictive uncertainty in music cognition. This is motivated by the Statistical Learning Hypothesis, which proposes that schematic expectations reflect probabilistic relationships between sensory events learned implicitly through exposure. Using probability estimates from an unsupervised, variable-order Markov model, 12 melodic contexts high in entropy and 12 melodic contexts low in entropy were selected from two musical repertoires differing in structural complexity (simple and complex). Musicians and non-musicians listened to the stimuli and provided explicit judgments of perceived uncertainty (explicit uncertainty). We also examined an indirect measure of uncertainty computed as the entropy of expectedness distributions obtained using a classical probe-tone paradigm where listeners rated the perceived expectedness of the final note in a melodic sequence (inferred uncertainty). Finally, we simulate listeners’ perception of expectedness and uncertainty using computational models of auditory expectation. A detailed model comparison indicates which model parameters maximize fit to the data and how they compare to existing models in the literature. The results show that listeners experience greater uncertainty in high-entropy musical contexts than low-entropy contexts. This effect is particularly apparent for inferred uncertainty and is stronger in musicians than non-musicians. Consistent with the Statistical Learning Hypothesis, the results suggest that increased domain-relevant training is associated with an increasingly accurate cognitive model of probabilistic structure in music.

  20. Predictive uncertainty in auditory sequence processing.

    Science.gov (United States)

    Hansen, Niels Chr; Pearce, Marcus T

    2014-01-01

    Previous studies of auditory expectation have focused on the expectedness perceived by listeners retrospectively in response to events. In contrast, this research examines predictive uncertainty-a property of listeners' prospective state of expectation prior to the onset of an event. We examine the information-theoretic concept of Shannon entropy as a model of predictive uncertainty in music cognition. This is motivated by the Statistical Learning Hypothesis, which proposes that schematic expectations reflect probabilistic relationships between sensory events learned implicitly through exposure. Using probability estimates from an unsupervised, variable-order Markov model, 12 melodic contexts high in entropy and 12 melodic contexts low in entropy were selected from two musical repertoires differing in structural complexity (simple and complex). Musicians and non-musicians listened to the stimuli and provided explicit judgments of perceived uncertainty (explicit uncertainty). We also examined an indirect measure of uncertainty computed as the entropy of expectedness distributions obtained using a classical probe-tone paradigm where listeners rated the perceived expectedness of the final note in a melodic sequence (inferred uncertainty). Finally, we simulate listeners' perception of expectedness and uncertainty using computational models of auditory expectation. A detailed model comparison indicates which model parameters maximize fit to the data and how they compare to existing models in the literature. The results show that listeners experience greater uncertainty in high-entropy musical contexts than low-entropy contexts. This effect is particularly apparent for inferred uncertainty and is stronger in musicians than non-musicians. Consistent with the Statistical Learning Hypothesis, the results suggest that increased domain-relevant training is associated with an increasingly accurate cognitive model of probabilistic structure in music. PMID:25295018
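
The use of Shannon entropy as a measure of predictive uncertainty, as in the two records above, can be illustrated in a few lines. The two next-note distributions below are invented for the example and are not taken from the study:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy (in bits) of a discrete next-event distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical next-note distributions over four candidate continuations:
low_uncertainty  = [0.85, 0.05, 0.05, 0.05]  # one note strongly expected
high_uncertainty = [0.25, 0.25, 0.25, 0.25]  # all continuations equally likely

print(shannon_entropy(low_uncertainty))    # well below 2 bits
print(shannon_entropy(high_uncertainty))   # exactly 2 bits for 4 equiprobable events
```

A context model such as the variable-order Markov model mentioned above supplies these distributions; the entropy of each distribution is then the model's predictive uncertainty prior to the event.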

  1. Physical Uncertainty Bounds (PUB)

    Energy Technology Data Exchange (ETDEWEB)

    Vaughan, Diane Elizabeth [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Preston, Dean L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-03-19

    This paper introduces and motivates the need for a new methodology for determining upper bounds on the uncertainties in simulations of engineered systems due to limited fidelity in the composite continuum-level physics models needed to simulate the systems. We show that traditional uncertainty quantification methods provide, at best, a lower bound on this uncertainty. We propose to obtain bounds on the simulation uncertainties by first determining bounds on the physical quantities or processes relevant to system performance. By bounding these physics processes, as opposed to carrying out statistical analyses of the parameter sets of specific physics models or simply switching out the available physics models, one can obtain upper bounds on the uncertainties in simulated quantities of interest.

  2. Nanoparticles: Uncertainty Risk Analysis

    DEFF Research Database (Denmark)

    Grieger, Khara Deanne; Hansen, Steffen Foss; Baun, Anders

    2012-01-01

    Scientific uncertainty plays a major role in assessing the potential environmental risks of nanoparticles. Moreover, there is uncertainty within fundamental data and information regarding the potential environmental and health risks of nanoparticles, hampering risk assessments based on standard approaches. To date, there have been a number of different approaches to assess uncertainty of environmental risks in general, and some have also been proposed in the case of nanoparticles and nanomaterials. In recent years, others have proposed that broader assessments of uncertainty are needed in order to handle the complex potential risks of nanoparticles, including more descriptive characterizations of uncertainty. Some of these approaches are presented and discussed herein, in which the potential strengths and limitations of these approaches are identified along with further challenges for...

  3. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.

  4. Economic uncertainty and econophysics

    Science.gov (United States)

    Schinckus, Christophe

    2009-10-01

    The objective of this paper is to provide a methodological link between econophysics and economics. I will study a key notion of both fields: uncertainty and the ways of thinking about it developed by the two disciplines. After having presented the main economic theories of uncertainty (provided by Knight, Keynes and Hayek), I show how this notion is paradoxically excluded from the economic field. In economics, uncertainty is totally reduced by an a priori Gaussian framework-in contrast to econophysics, which does not use a priori models because it works directly on data. Uncertainty is then not shaped by a specific model, and is partially and temporally reduced as models improve. This way of thinking about uncertainty has echoes in the economic literature. By presenting econophysics as a Knightian method, and a complementary approach to a Hayekian framework, this paper shows that econophysics can be methodologically justified from an economic point of view.

  5. Information-theoretic approach to uncertainty importance

    Energy Technology Data Exchange (ETDEWEB)

    Park, C.K.; Bari, R.A.

    1985-01-01

    A method is presented for importance analysis in probabilistic risk assessments (PRA) for which the results of interest are characterized by full uncertainty distributions and not just point estimates. The method is based on information theory in which entropy is a measure of uncertainty of a probability density function. We define the relative uncertainty importance between two events as the ratio of the two exponents of the entropies. For the log-normal and log-uniform distributions the importance measure is comprised of the median (central tendency) and of the logarithm of the error factor (uncertainty). Thus, if accident sequences are ranked this way, and the error factors are not all equal, then a different rank order would result than if the sequences were ranked by the central tendency measure alone. As an illustration, the relative importance of internal events and in-plant fires was computed on the basis of existing PRA results.

  6. Information-theoretic approach to uncertainty importance

    International Nuclear Information System (INIS)

    A method is presented for importance analysis in probabilistic risk assessments (PRA) for which the results of interest are characterized by full uncertainty distributions and not just point estimates. The method is based on information theory in which entropy is a measure of uncertainty of a probability density function. We define the relative uncertainty importance between two events as the ratio of the two exponents of the entropies. For the log-normal and log-uniform distributions the importance measure is comprised of the median (central tendency) and of the logarithm of the error factor (uncertainty). Thus, if accident sequences are ranked this way, and the error factors are not all equal, then a different rank order would result than if the sequences were ranked by the central tendency measure alone. As an illustration, the relative importance of internal events and in-plant fires was computed on the basis of existing PRA results
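
The entropy-based importance measure described in the two records above can be sketched for the lognormal case. The closed-form entropy and the conversion from a 95% error factor to sigma (z = 1.645) are standard results assumed here for illustration; the sequence medians and error factors are hypothetical:

```python
import math

# For a lognormal with parameters mu (log-median) and sigma, the
# differential entropy is H = mu + ln(sigma) + 0.5*ln(2*pi*e), so
# exp(H) is proportional to median * sigma.  The relative uncertainty
# importance of event 1 over event 2 is the ratio exp(H1) / exp(H2).

def lognormal_entropy(median, error_factor, z=1.645):
    """Differential entropy of a lognormal given its median and 95% error factor."""
    mu = math.log(median)
    sigma = math.log(error_factor) / z
    return mu + math.log(sigma) + 0.5 * math.log(2 * math.pi * math.e)

def relative_importance(median1, ef1, median2, ef2):
    """Ratio of the exponents of the two entropies."""
    return math.exp(lognormal_entropy(median1, ef1)) / math.exp(lognormal_entropy(median2, ef2))

# Two hypothetical accident sequences with equal medians but different
# error factors rank differently once uncertainty is accounted for:
print(relative_importance(1e-5, 10.0, 1e-5, 3.0))  # > 1: sequence 1 more important
```

With equal medians the ratio reduces to sigma1/sigma2, which is exactly the point the abstract makes: ranking by central tendency alone can differ from ranking that includes the error factors.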

  7. Uncertainty, rationality, and agency

    CERN Document Server

    Hoek, Wiebe van der

    2006-01-01

    Goes across 'classical' borderlines of disciplines; unifies logic, game theory, and epistemics and studies them in an agent setting; combines classical and novel approaches to uncertainty, rationality, and agency.

  8. Uncertainty in oil projects

    International Nuclear Information System (INIS)

    This report presents an oil project valuation under uncertainty by means of two well-known financial techniques: The Capital Asset Pricing Model (CAPM) and The Black-Scholes Option Pricing Formula. CAPM gives a linear positive relationship between expected rate of return and risk but does not take into consideration the aspect of flexibility, which is crucial for an irreversible investment such as an oil project. Introducing investment decision flexibility by using real options can increase the oil project value substantially. Some simple tests of the importance of stock market uncertainty for oil investments are performed. Uncertainty in stock returns is correlated with aggregate product market uncertainty according to Pindyck (1991). The results of the tests are not satisfactory due to the short data series, but introducing two additional explanatory variables, the interest rate and Gross Domestic Product, improves the situation. 36 refs., 18 figs., 6 tabs

  9. Evaluating prediction uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    McKay, M.D. [Los Alamos National Lab., NM (United States)

    1995-03-01

    The probability distribution of a model prediction is presented as a proper basis for evaluating the uncertainty in a model prediction that arises from uncertainty in input values. Determination of important model inputs and subsets of inputs is made through comparison of the prediction distribution with conditional prediction probability distributions. Replicated Latin hypercube sampling and variance ratios are used in estimation of the distributions and in construction of importance indicators. The assumption of a linear relation between model output and inputs is not necessary for the indicators to be effective. A sequential methodology which includes an independent validation step is applied in two analysis applications to select subsets of input variables which are the dominant causes of uncertainty in the model predictions. Comparison with results from methods which assume linearity shows how those methods may fail. Finally, suggestions for treating structural uncertainty for submodels are presented.

  10. Evaluating prediction uncertainty

    International Nuclear Information System (INIS)

    The probability distribution of a model prediction is presented as a proper basis for evaluating the uncertainty in a model prediction that arises from uncertainty in input values. Determination of important model inputs and subsets of inputs is made through comparison of the prediction distribution with conditional prediction probability distributions. Replicated Latin hypercube sampling and variance ratios are used in estimation of the distributions and in construction of importance indicators. The assumption of a linear relation between model output and inputs is not necessary for the indicators to be effective. A sequential methodology which includes an independent validation step is applied in two analysis applications to select subsets of input variables which are the dominant causes of uncertainty in the model predictions. Comparison with results from methods which assume linearity shows how those methods may fail. Finally, suggestions for treating structural uncertainty for submodels are presented
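
The variance-ratio importance indicator described in the two records above can be sketched with plain nested Monte Carlo sampling (standing in for the paper's replicated Latin hypercube design, which is more efficient). The toy model and sample sizes are invented for the example:

```python
import random

random.seed(0)

def model(x1, x2):
    # A deliberately nonlinear toy model: x1 dominates the output.
    return x1 ** 2 + 0.1 * x2

def variance(vals):
    m = sum(vals) / len(vals)
    return sum((v - m) ** 2 for v in vals) / len(vals)

def importance(which, n_outer=200, n_inner=200):
    """Estimate Var(E[Y | X_which]) / Var(Y) by nested sampling.
    No linearity assumption is needed for this indicator to work."""
    cond_means, all_y = [], []
    for _ in range(n_outer):
        fixed = random.uniform(0, 1)
        ys = []
        for _ in range(n_inner):
            free = random.uniform(0, 1)
            x1, x2 = (fixed, free) if which == 1 else (free, fixed)
            ys.append(model(x1, x2))
        cond_means.append(sum(ys) / len(ys))
        all_y.extend(ys)
    return variance(cond_means) / variance(all_y)

print(importance(1))  # large: x1 dominates the prediction uncertainty
print(importance(2))  # small: x2 contributes little
```

A regression-based importance measure would understate x1 here because the x1-squared term is poorly captured by a linear fit, illustrating the abstract's point about methods that assume linearity.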

  11. A surrogate-based uncertainty quantification with quantifiable errors

    Energy Technology Data Exchange (ETDEWEB)

    Bang, Y.; Abdel-Khalik, H. S. [North Carolina State Univ., Raleigh, NC 27695 (United States)

    2012-07-01

    Surrogate models are often employed to reduce the computational cost required to complete uncertainty quantification, where one is interested in propagating input parameter uncertainties throughout a complex engineering model to estimate response uncertainties. An improved surrogate construction approach is introduced here which places a premium on reducing the associated computational cost. Unlike existing methods, where the surrogate is constructed first and then employed to propagate uncertainties, the new approach combines both sensitivity and uncertainty information to render further reduction in the computational cost. Mathematically, the reduction is described by a range-finding algorithm that identifies a subspace in the parameter space whereby parameter uncertainties orthogonal to the subspace contribute a negligible amount to the propagated uncertainties. Moreover, the error resulting from the reduction can be upper-bounded. The new approach is demonstrated using a realistic nuclear assembly model and compared to existing methods in terms of computational cost and accuracy of uncertainties. Although we believe the algorithm is general, it will be applied here for linear-based surrogates and Gaussian parameter uncertainties. The generalization to nonlinear models will be detailed in a separate article. (authors)

  12. A surrogate-based uncertainty quantification with quantifiable errors

    International Nuclear Information System (INIS)

    Surrogate models are often employed to reduce the computational cost required to complete uncertainty quantification, where one is interested in propagating input parameter uncertainties throughout a complex engineering model to estimate response uncertainties. An improved surrogate construction approach is introduced here which places a premium on reducing the associated computational cost. Unlike existing methods, where the surrogate is constructed first and then employed to propagate uncertainties, the new approach combines both sensitivity and uncertainty information to render further reduction in the computational cost. Mathematically, the reduction is described by a range-finding algorithm that identifies a subspace in the parameter space whereby parameter uncertainties orthogonal to the subspace contribute a negligible amount to the propagated uncertainties. Moreover, the error resulting from the reduction can be upper-bounded. The new approach is demonstrated using a realistic nuclear assembly model and compared to existing methods in terms of computational cost and accuracy of uncertainties. Although we believe the algorithm is general, it will be applied here for linear-based surrogates and Gaussian parameter uncertainties. The generalization to nonlinear models will be detailed in a separate article. (authors)
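
The subspace idea in the two records above can be illustrated with a drastically simplified stand-in for a range-finding algorithm: sample finite-difference gradients of a model, and treat directions in which every sampled gradient is near zero as the negligible subspace. The model and all numbers are invented for the example:

```python
import random

def model(p):
    x, y, z = p
    return (x + 2 * y) ** 2   # effectively depends on x + 2y only; z is inactive

def gradient(p, h=1e-6):
    """Forward finite-difference gradient of the model at point p."""
    base = model(p)
    grad = []
    for i in range(len(p)):
        q = list(p)
        q[i] += h
        grad.append((model(q) - base) / h)
    return grad

random.seed(1)
samples = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(50)]
grads = [gradient(p) for p in samples]

# Largest absolute sensitivity observed for each parameter:
max_sens = [max(abs(g[i]) for g in grads) for i in range(3)]
print(max_sens)  # the third entry is ~0: z spans the negligible subspace
```

Uncertainty in z can therefore be dropped from the propagation with (in this toy case) zero induced error; the paper's contribution is doing this rigorously, with an upper bound on the error of the reduction.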

  13. Decision making under uncertainty

    International Nuclear Information System (INIS)

    The theory of evidence and the theory of possibility are considered by some analysts as potential models for uncertainty. This paper discusses two issues: how formal probability theory has been relaxed to develop these uncertainty models; and the degree to which these models can be applied to risk assessment. The scope of the second issue is limited to an investigation of their compatibility for combining various pieces of evidence, which is an important problem in PRA

  14. Uncertainty calculations made easier

    Energy Technology Data Exchange (ETDEWEB)

    Hogenbirk, A.

    1994-07-01

    The results are presented of a neutron cross section sensitivity/uncertainty analysis performed in a complicated 2D model of the NET shielding blanket design inside the ITER torus design, surrounded by the cryostat/biological shield as planned for ITER. The calculations were performed with a code system developed at ECN Petten, with which sensitivity/uncertainty calculations become relatively simple. In order to check the deterministic neutron transport calculations (performed with DORT), calculations were also performed with the Monte Carlo code MCNP. Care was taken to model the 2.0 cm wide gaps between two blanket segments, as the neutron flux behind the vacuum vessel is largely determined by neutrons streaming through these gaps. The resulting neutron flux spectra are in excellent agreement up to the end of the cryostat. It is noted that at this position the attenuation of the neutron flux is about 11 orders of magnitude. The uncertainty in the energy integrated flux at the beginning of the vacuum vessel and at the beginning of the cryostat was determined in the calculations. The uncertainty appears to be strongly dependent on the exact geometry: if the gaps are filled with stainless steel, the neutron spectrum changes strongly, which results in an uncertainty of 70% in the energy integrated flux at the beginning of the cryostat in the no-gap-geometry, compared to an uncertainty of only 5% in the gap-geometry. Therefore, it is essential to take into account the exact geometry in sensitivity/uncertainty calculations. Furthermore, this study shows that an improvement of the covariance data is urgently needed in order to obtain reliable estimates of the uncertainties in response parameters in neutron transport calculations. (orig./GL).

  15. Uncertainty calculations made easier

    International Nuclear Information System (INIS)

    The results are presented of a neutron cross section sensitivity/uncertainty analysis performed in a complicated 2D model of the NET shielding blanket design inside the ITER torus design, surrounded by the cryostat/biological shield as planned for ITER. The calculations were performed with a code system developed at ECN Petten, with which sensitivity/uncertainty calculations become relatively simple. In order to check the deterministic neutron transport calculations (performed with DORT), calculations were also performed with the Monte Carlo code MCNP. Care was taken to model the 2.0 cm wide gaps between two blanket segments, as the neutron flux behind the vacuum vessel is largely determined by neutrons streaming through these gaps. The resulting neutron flux spectra are in excellent agreement up to the end of the cryostat. It is noted that at this position the attenuation of the neutron flux is about 11 orders of magnitude. The uncertainty in the energy integrated flux at the beginning of the vacuum vessel and at the beginning of the cryostat was determined in the calculations. The uncertainty appears to be strongly dependent on the exact geometry: if the gaps are filled with stainless steel, the neutron spectrum changes strongly, which results in an uncertainty of 70% in the energy integrated flux at the beginning of the cryostat in the no-gap-geometry, compared to an uncertainty of only 5% in the gap-geometry. Therefore, it is essential to take into account the exact geometry in sensitivity/uncertainty calculations. Furthermore, this study shows that an improvement of the covariance data is urgently needed in order to obtain reliable estimates of the uncertainties in response parameters in neutron transport calculations. (orig./GL)

  16. Data assimilation for kinetic parameters uncertainty analysis

    International Nuclear Information System (INIS)

    Several years ago, the OECD/NEA Nuclear Science Committee (NSC) established the Expert Group on Uncertainty Analysis in Modeling (UAM-LWR) after thorough discussions of the demands from nuclear research, industry, safety and regulation to provide best-estimate predictions of nuclear system parameters with their confidence bounds. UAM objectives include, among others, the quantification of uncertainties of neutronic calculations with respect to their value for multi-physics analysis. Since the kinetic parameters and their uncertainties are of particular interest for these studies, deterministic approaches for the analysis of uncertainties in nuclear reactor kinetic parameters (neutron generation lifetime and delayed neutron effective fraction) have been developed in the frame of the UAM-LWR. The approach uses a combination of a generalization of perturbation theory for reactivity analysis and the Generalized Perturbation Theory (GPT) for sensitivity computation. It has been applied to the UAM complementary fast neutron SNEAK test case, which has a unique set of experimental data for βeff. In this example, the covariance matrices of nuclear data were derived from the COMMARA library by Bayesian adjustment upon a set of fast neutron integral benchmarks with the BERING code package. Then, the kinetic parameter uncertainties with their correlations were applied to a simplified model of a reactivity insertion transient, where the relative uncertainty of the power peak was taken as the figure of merit. The results demonstrate that the uncertainties due to nuclear data significantly impact the energy release in coupled transient modeling. It was also found that such uncertainties become higher if the correlations between uncertainties of different lumped parameters are taken into account. (author)

  17. Gravitational tests of the Generalized Uncertainty Principle

    OpenAIRE

    Scardigli, Fabio; Casadio, Roberto

    2014-01-01

    We compute the corrections to the Schwarzschild metric necessary to reproduce the Hawking temperature derived from a generalized uncertainty principle (GUP), so that the GUP deformation parameter is directly linked to the deformation of the metric. Using this modified Schwarzschild metric, we compute corrections to the standard general relativistic predictions for the light deflection and perihelion precession, both for planets in the solar system and for binary pulsars. This analysis allows ...

  18. Uncertainty relations and precession of perihelion

    Science.gov (United States)

    Scardigli, Fabio; Casadio, Roberto

    2016-03-01

    We compute the corrections to the Schwarzschild metric necessary to reproduce the Hawking temperature derived from a Generalized Uncertainty Principle (GUP), so that the GUP deformation parameter is directly linked to the deformation of the metric. Using this modified Schwarzschild metric, we compute corrections to the standard General Relativistic predictions for the perihelion precession for planets in the solar system, and for binary pulsars. This analysis allows us to set bounds for the GUP deformation parameter from well-known astronomical measurements.

  19. Uncertainty reasoning in hierarchical visual evidence space

    OpenAIRE

    Qian, Jianzhong

    1990-01-01

    One of the major problems in computer vision involves dealing with uncertain information. Occlusion, dissimilar views, insufficient illumination, insufficient resolution, and degradation give rise to imprecise data. At the same time, incomplete or local knowledge of the scene gives rise to imprecise interpretation rules. Uncertainty arises at different processing levels of computer vision either because of the imprecise data or because of the imprecise interpretation rules. It is natural t...

  20. Uncertainty Analysis for a Jet Flap Airfoil

    Science.gov (United States)

    Green, Lawrence L.; Cruz, Josue

    2006-01-01

    An analysis of variance (ANOVA) study was performed to quantify the potential uncertainties of lift and pitching moment coefficient calculations from a computational fluid dynamics code, relative to an experiment, for a jet flap airfoil configuration. Uncertainties due to a number of factors including grid density, angle of attack and jet flap blowing coefficient were examined. The ANOVA software produced a numerical model of the input coefficient data, as functions of the selected factors, to a user-specified order (linear, 2-factor interaction, quadratic, or cubic). Residuals between the model and actual data were also produced at each of the input conditions, and uncertainty confidence intervals (in the form of Least Significant Differences or LSD) for experimental, computational, and combined experimental/computational data sets were computed. The LSD bars indicate the smallest resolvable differences in the functional values (lift or pitching moment coefficient) attributable solely to changes in independent variable, given just the input data points from selected data sets. The software also provided a collection of diagnostics which evaluate the suitability of the input data set for use within the ANOVA process, and which examine the behavior of the resultant data, possibly suggesting transformations which should be applied to the data to reduce the LSD. The results illustrate some of the key features of, and results from, the uncertainty analysis studies, including the use of both numerical (continuous) and categorical (discrete) factors, the effects of the number and range of the input data points, and the effects of the number of factors considered simultaneously.

  1. Decision-making under risk and uncertainty

    International Nuclear Information System (INIS)

    Fuzzy sets and interval analysis tools to make computations and solve optimisation problems are presented. Fuzzy and interval extensions of Decision Theory criteria for decision-making under parametric uncertainty of prior information (probabilities, payoffs) are developed. An interval probability approach to the mean-value criterion is proposed. (author)

  2. Robust Linear Programming with Norm Uncertainty

    OpenAIRE

    Lei Wang; Hong Luo

    2014-01-01

    We consider the linear programming problem with an uncertainty set described by the $(p,w)$-norm. We suggest that the robust counterpart of this problem is equivalent to a computationally tractable convex optimization problem. We provide probabilistic guarantees on the feasibility of an optimal robust solution when the uncertain coefficients obey independent and identically distributed normal distributions.
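The norm-uncertainty robust counterpart can be illustrated in its unweighted form. This sketch assumes the classical duality result that the worst case of $(a+d)^T x \le b$ over $\|d\|_p \le \rho$ is $a^T x + \rho \|x\|_q \le b$ with $1/p + 1/q = 1$; the $(p,w)$-weighted variant of the paper reduces to this when all weights equal 1:

```python
import math

def robust_feasible(a, b, x, rho, p):
    """Worst-case feasibility of (a + d)'x <= b over all ||d||_p <= rho.

    By Hoelder's inequality the worst case equals a'x + rho * ||x||_q <= b,
    where q is the Hoelder conjugate of p (unweighted sketch only).
    """
    q = p / (p - 1.0) if p > 1 else math.inf
    nominal = sum(ai * xi for ai, xi in zip(a, x))
    xq = (max(abs(xi) for xi in x) if q == math.inf
          else sum(abs(xi) ** q for xi in x) ** (1.0 / q))
    return nominal + rho * xq <= b

# small budget of uncertainty: still feasible; large budget: not
ok = robust_feasible(a=[1.0, 1.0], b=3.0, x=[1.0, 1.0], rho=0.4, p=2.0)
bad = robust_feasible(a=[1.0, 1.0], b=3.0, x=[1.0, 1.0], rho=1.1, p=2.0)
```

The deterministic reformulation is what makes the robust counterpart computationally tractable: no optimization over the uncertainty set is needed at evaluation time.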

  3. Remaining Useful Life Estimation in Prognosis: An Uncertainty Propagation Problem

    Science.gov (United States)

    Sankararaman, Shankar; Goebel, Kai

    2013-01-01

    The estimation of remaining useful life is significant in the context of prognostics and health monitoring, and the prediction of remaining useful life is essential for online operations and decision-making. However, it is challenging to accurately predict the remaining useful life in practical aerospace applications due to the presence of various uncertainties that affect prognostic calculations, and in turn, render the remaining useful life prediction uncertain. It is challenging to identify and characterize the various sources of uncertainty in prognosis, understand how each of these sources of uncertainty affect the uncertainty in the remaining useful life prediction, and thereby compute the overall uncertainty in the remaining useful life prediction. In order to achieve these goals, this paper proposes that the task of estimating the remaining useful life must be approached as an uncertainty propagation problem. In this context, uncertainty propagation methods which are available in the literature are reviewed, and their applicability to prognostics and health monitoring are discussed.

  4. Uncertainty Estimates for Theoretical Atomic and Molecular Data

    CERN Document Server

    Chung, H -K; Bartschat, K; Csaszar, A G; Drake, G W F; Kirchner, T; Kokoouline, V; Tennyson, J

    2016-01-01

    Sources of uncertainty are reviewed for calculated atomic and molecular data that are important for plasma modeling: atomic and molecular structure and cross sections for electron-atom, electron-molecule, and heavy particle collisions. We concentrate on model uncertainties due to approximations to the fundamental many-body quantum mechanical equations and we aim to provide guidelines to estimate uncertainties as a routine part of computations of data for structure and scattering.

  5. About uncertainties in practical salinity calculations

    Directory of Open Access Journals (Sweden)

    M. Le Menn

    2009-10-01

    Full Text Available Salinity is a quantity computed, in the present state of the art, from conductivity ratio measurements, knowing the temperature and pressure at the time of the measurement and using the Practical Salinity Scale algorithm of 1978 (PSS-78), which gives practical salinity values S. The uncertainty expected on PSS-78 values is ±0.002, but the method for working out this uncertainty, and the sources of error to include in the calculation, have never been detailed. Following a guide edited by the Bureau International des Poids et Mesures (BIPM), this paper assesses, by two independent methods, the uncertainties of salinity values obtained from a laboratory salinometer and from Conductivity-Temperature-Depth (CTD) measurements after laboratory calibration of a conductivity cell. The results show that the contribution of the PSS-78 relation fits is sometimes as significant as that of the instruments. This is particularly the case with CTD measurements, where correlations between the variables contribute to a large decrease of the uncertainty on S, even when the expanded uncertainties of the conductivity cell calibrations are well above 0.002 mS/cm. The relations given in this publication, obtained with the normalized GUM method, allow a real analysis of the uncertainty sources, and they can be used in a more general way with instruments having different specifications.
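The GUM law of propagation used in the paper can be sketched generically. This simplified version assumes uncorrelated inputs (the paper shows correlation terms matter for CTD data) and uses hypothetical numbers, not the PSS-78 relations themselves:

```python
def combined_uncertainty(f, x, u, h=1e-6):
    """GUM law of propagation for uncorrelated inputs:
    u_c^2 = sum_i (df/dx_i)^2 * u_i^2, with central-difference
    sensitivity coefficients. Correlation terms are omitted here."""
    uc2 = 0.0
    for xi, ui in zip(x, u):
        xp = list(x)
        xm = list(x)
        i = x.index(xi)
        xp[i] = xi + h
        xm[i] = xi - h
        ci = (f(xp) - f(xm)) / (2.0 * h)  # sensitivity coefficient
        uc2 += (ci * ui) ** 2
    return uc2 ** 0.5

# toy model: f(x1, x2) = x1 * x2 at x = (2, 3) with u = (0.1, 0.2)
# analytic result: sqrt((3*0.1)^2 + (2*0.2)^2) = 0.5
u_c = combined_uncertainty(lambda v: v[0] * v[1], [2.0, 3.0], [0.1, 0.2])
```

Numerical sensitivity coefficients are convenient when the measurement function, like PSS-78, is too cumbersome to differentiate by hand.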

  6. Verification of uncertainty budgets

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Madsen, B.S.

    2005-01-01

    The uncertainty budget should be verified over the full range of matrices and concentrations for which the budget is assumed to be valid. In this way the assumptions made in the uncertainty budget can be experimentally verified, both as regards sources of variability that are assumed negligible, and dominant uncertainty components. Agreement between observed and expected variability is tested by means of the T-test, which follows a chi-square distribution with a number of degrees of freedom determined by the number of replicates. Significant deviations between predicted and observed variability may be caused by a variety of effects, and examples will be given. Some effects are not easily detected, because their influence requires samples taken at long intervals, e.g., the acquisition of a new calibrant. It is therefore recommended to include verification of the uncertainty budget in the continuous QA/QC monitoring; this will eventually lead to a test also for such rarely occurring effects.
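The chi-square comparison of observed and budgeted variability can be sketched as follows, with invented replicate data and critical values hard-coded from standard tables (not the specific T-test construction of the paper):

```python
def budget_verification_statistic(replicates, u_budget):
    """Statistic comparing observed variability with an uncertainty
    budget: T = (n-1) * s^2 / u_budget^2, which follows a chi-square
    distribution with n-1 degrees of freedom if the budget is correct."""
    n = len(replicates)
    mean = sum(replicates) / n
    s2 = sum((x - mean) ** 2 for x in replicates) / (n - 1)
    return (n - 1) * s2 / u_budget ** 2

# ten hypothetical replicate measurements against a budget of u = 0.02
data = [10.02, 9.98, 10.01, 9.97, 10.03, 10.00, 9.99, 10.01, 10.02, 9.98]
T = budget_verification_statistic(data, u_budget=0.02)

# two-sided 95% acceptance region for chi-square with 9 degrees of
# freedom (critical values 2.70 and 19.02, taken from tables)
budget_ok = 2.70 <= T <= 19.02
```

A value of T far below the lower bound would suggest an overstated budget; far above the upper bound, an understated one.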

  7. Commonplaces and social uncertainty

    DEFF Research Database (Denmark)

    Lassen, Inger

    2008-01-01

    This article explores the concept of uncertainty in four focus group discussions about genetically modified food. In the discussions, members of the general public interact with food biotechnology scientists while negotiating their attitudes towards genetic engineering. Their discussions offer an example of risk discourse in which the use of commonplaces seems to be a central feature (Myers 2004: 81). My analyses support earlier findings that commonplaces serve important interactional purposes (Barton 1999) and that they are used for mitigating disagreement, for closing topics and for facilitating risk discourse (Myers 2005; 2007). In addition, however, I argue that commonplaces are used to mitigate feelings of insecurity caused by uncertainty and to negotiate new codes of moral conduct. Keywords: uncertainty, commonplaces, risk discourse, focus groups, appraisal

  8. Serenity in political uncertainty.

    Science.gov (United States)

    Doumit, Rita; Afifi, Rema A; Devon, Holli A

    2015-01-01

    College students are often faced with academic and personal stressors that threaten their well-being. Added to that may be political and environmental stressors such as acts of violence on the streets, interruptions in schooling, car bombings, targeted religious intimidations, financial hardship, and uncertainty of obtaining a job after graduation. Research on how college students adapt to the latter stressors is limited. The aims of this study were (1) to investigate the associations between stress, uncertainty, resilience, social support, withdrawal coping, and well-being for Lebanese youth during their first year of college and (2) to determine whether these variables predicted well-being. A sample of 293 first-year students enrolled in a private university in Lebanon completed a self-reported questionnaire in the classroom setting. The mean age of sample participants was 18.1 years, with nearly an equal percentage of males and females (53.2% vs 46.8%), who lived with their family (92.5%), and whose family reported high income levels (68.4%). Multiple regression analyses revealed that the best determinants of well-being were resilience, uncertainty, social support, and gender, which together accounted for 54.1% of the variance. Despite living in an environment of frequent violence and political uncertainty, Lebanese youth in this study have a strong sense of well-being and are able to go on with their lives. This research adds to our understanding of how adolescents can adapt to stressors of frequent violence and political uncertainty. Further research is recommended to understand the mechanisms through which young people cope with political uncertainty and violence. PMID:25658930

  9. Uncertainty in artificial intelligence

    CERN Document Server

    Levitt, TS; Lemmer, JF; Shachter, RD

    1990-01-01

    Clearly illustrated in this volume is the current relationship between Uncertainty and AI. It has been said that research in AI revolves around five basic questions asked relative to some particular domain: What knowledge is required? How can this knowledge be acquired? How can it be represented in a system? How should this knowledge be manipulated in order to provide intelligent behavior? How can the behavior be explained? In this volume, all of these questions are addressed. From the perspective of the relationship of uncertainty to the basic questions of AI, the book divides naturally i

  10. Traceability and Measurement Uncertainty

    DEFF Research Database (Denmark)

    Tosello, Guido; De Chiffre, Leonardo

    2004-01-01

    and motivating to this important group. The developed e-learning system consists of 12 different chapters dealing with the following topics: 1. Basics 2. Traceability and measurement uncertainty 3. Coordinate metrology 4. Form measurement 5. Surface testing 6. Optical measurement and testing 7. Measuring rooms 8. Machine tool testing 9. The role of manufacturing metrology for QM 10. Inspection planning 11. Quality management of measurements incl. documentation 12. Advanced manufacturing measurement technology. The present report (which represents section 2 - Traceability and Measurement Uncertainty - of the e

  11. Weighted Uncertainty Relations

    Science.gov (United States)

    Xiao, Yunlong; Jing, Naihuan; Li-Jost, Xianqing; Fei, Shao-Ming

    2016-03-01

    Recently, Maccone and Pati have given two stronger uncertainty relations based on the sum of variances and one of them is nontrivial when the quantum state is not an eigenstate of the sum of the observables. We derive a family of weighted uncertainty relations to provide an optimal lower bound for all situations and remove the restriction on the quantum state. Generalization to multi-observable cases is also given and an optimal lower bound for the weighted sum of the variances is obtained in general quantum situation.
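For reference, the unweighted Maccone and Pati sum relation that the record generalizes reads as follows (stated from the literature as an illustration; here $|\psi^{\perp}\rangle$ is any state orthogonal to the system state $|\psi\rangle$):

```latex
\Delta A^{2} + \Delta B^{2} \;\ge\; \pm\, i\,\langle [A,B] \rangle
  \;+\; \bigl|\langle \psi |\, (A \pm iB) \,| \psi^{\perp} \rangle\bigr|^{2}
```

The weighted family described in the record replaces the equal weighting of the two variances with tunable positive weights, which is what yields an optimal lower bound for all situations and removes the eigenstate restriction.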

  12. Comment on "A procedure for the estimation of the numerical uncertainty of CFD calculations based on grid refinement studies" (L. Eça and M. Hoekstra, Journal of Computational Physics 262 (2014) 104-130)

    Science.gov (United States)

    Xing, Tao; Stern, Frederick

    2015-11-01

    Eça and Hoekstra [1] proposed a procedure for the estimation of the numerical uncertainty of CFD calculations based on the least squares root (LSR) method. We believe that the LSR method has potential value for providing an extended Richardson-extrapolation solution verification procedure for mixed monotonic and oscillatory or only oscillatory convergent solutions (based on the usual systematic grid-triplet convergence condition R). Current Richardson-extrapolation solution verification procedures [2-7] are restricted to monotonically convergent solutions (0 < R < 1). A block diagram summarizes the LSR procedure and options, including some with which we are in disagreement. Compared to the grid-triplet, three-step procedure followed by most solution verification methods (convergence condition followed by error and uncertainty estimates), the LSR method follows a four-grid (minimum), four-step procedure (error estimate, data range parameter Δϕ, FS, and uncertainty estimate).
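The grid-triplet quantities mentioned above (convergence ratio R, observed order of accuracy, and Richardson error estimate) can be sketched as follows. This illustrates the standard three-grid procedure with synthetic second-order solutions, not the four-grid LSR method itself:

```python
import math

def grid_triplet_verification(phi1, phi2, phi3, r):
    """Standard grid-triplet solution verification.

    phi1, phi2, phi3 : fine, medium, coarse grid solutions
    r                : constant grid refinement ratio
    Returns (R, p, err): convergence ratio, observed order, and the
    Richardson error estimate for the fine-grid solution.
    """
    e21 = phi2 - phi1
    e32 = phi3 - phi2
    R = e21 / e32                          # monotonic convergence: 0 < R < 1
    p = math.log(e32 / e21) / math.log(r)  # observed order of accuracy
    err = e21 / (r ** p - 1.0)             # Richardson error estimate
    return R, p, err

# synthetic solutions phi = 1 + h^2 on grids h = 0.1, 0.2, 0.4
R, p, err = grid_triplet_verification(1.01, 1.04, 1.16, r=2.0)
```

For these manufactured values the procedure recovers the exact behavior: R = 0.25, observed order p = 2, and phi1 - err equals the exact solution 1.0.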

  13. Probabilistic condition monitoring counting with information uncertainty

    Czech Academy of Sciences Publication Activity Database

    Ettler, P.; Dedecius, Kamil

    Athens: National Technical University of Athens, 2015, pp. 1-7. ISBN 978-960-99994-9-6. [1st International Conference on Uncertainty Quantification in Computational Sciences and Engineering (UNCECOMP). Hersonissos, Crete (GR), 25.05.2015-27.05.2015] R&D Projects: GA MŠk 7D12004 Institutional support: RVO:67985556 Keywords: uncertainty quantification * condition monitoring * fault detection Subject RIV: BB - Applied Statistics, Operational Research http://library.utia.cas.cz/separaty/2015/AS/dedecius-0445804.pdf

  14. Uncertainty Analyses and Strategy

    International Nuclear Information System (INIS)

    The DOE identified a variety of uncertainties, arising from different sources, during its assessment of the performance of a potential geologic repository at the Yucca Mountain site. In general, the number and detail of process models developed for the Yucca Mountain site, and the complex coupling among those models, make the direct incorporation of all uncertainties difficult. The DOE has addressed these issues in a number of ways using an approach to uncertainties that is focused on producing a defensible evaluation of the performance of a potential repository. The treatment of uncertainties oriented toward defensible assessments has led to analyses and models with so-called ''conservative'' assumptions and parameter bounds, where conservative implies lower performance than might be demonstrated with a more realistic representation. The varying maturity of the analyses and models, and uneven level of data availability, result in total system level analyses with a mix of realistic and conservative estimates (for both probabilistic representations and single values). That is, some inputs have realistically represented uncertainties, and others are conservatively estimated or bounded. However, this approach is consistent with the ''reasonable assurance'' approach to compliance demonstration, which was called for in the U.S. Nuclear Regulatory Commission's (NRC) proposed 10 CFR Part 63 regulation (64 FR 8640 [DIRS 101680]). A risk analysis that includes conservatism in the inputs will result in conservative risk estimates. Therefore, the approach taken for the Total System Performance Assessment for the Site Recommendation (TSPA-SR) provides a reasonable representation of processes and conservatism for purposes of site recommendation. However, mixing unknown degrees of conservatism in models and parameter representations reduces the transparency of the analysis and makes the development of coherent and consistent probability statements about projected repository

  15. Quantification of Uncertainties in Integrated Spacecraft System Models Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed effort is to investigate a novel uncertainty quantification (UQ) approach based on non-intrusive polynomial chaos (NIPC) for computationally efficient...

  16. Uncertainty and the Great Recession

    OpenAIRE

    Born, Benjamin; Breuer, Sebastian; Elstner, Steffen

    2014-01-01

    Has heightened uncertainty been a major contributor to the Great Recession and the slow recovery in the U.S.? To answer this question, we identify exogenous changes in six uncertainty proxies and quantify their contributions to GDP growth and the unemployment rate. Our results are threefold. First, only a minor part of the rise in uncertainty measures during the Great Recession was driven by exogenous uncertainty shocks. Second, while increased uncertainty explains less than one percentage po...

  17. Uncertainties in repository modeling

    International Nuclear Information System (INIS)

    The distant future is very difficult to predict. Unfortunately, our regulators are being encouraged to extend the regulatory period from the standard 10,000 years to 1 million years. Such overconfidence is not justified due to uncertainties in dating, calibration, and modeling

  18. Risks, uncertainty, vagueness

    International Nuclear Information System (INIS)

    The notion of 'risk' is discussed in its social and technological contexts, leading to an investigation of the terms factuality, hypotheticality, uncertainty, and vagueness, and to the problems of acceptance and acceptability especially in the context of political decision finding. (DG)

  19. Uncertainties in repository modeling

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, J.R.

    1996-12-31

    The distant future is very difficult to predict. Unfortunately, our regulators are being encouraged to extend the regulatory period from the standard 10,000 years to 1 million years. Such overconfidence is not justified due to uncertainties in dating, calibration, and modeling.

  20. Proportional Representation with Uncertainty

    OpenAIRE

    Francesco De Sinopoli; Giovanna Iannantuoni; Elena Manzoni; Carlos Pimienta

    2014-01-01

    We introduce a model with strategic voting in a parliamentary election with proportional representation and uncertainty about voters’ preferences. In any equilibrium of the model, most voters only vote for those parties whose positions are extreme. In the resulting parliament, a consensus government forms and the policy maximizing the sum of utilities of the members of the government is implemented.

  1. Uncertainty and nonseparability

    Science.gov (United States)

    de La Torre, A. C.; Catuogno, P.; Ferrando, S.

    1989-06-01

    A quantum covariance function is introduced whose real and imaginary parts are related to the independent contributions to the uncertainty principle: noncommutativity of the operators and nonseparability. It is shown that factorizability of states is a sufficient but not necessary condition for separability. It is suggested that all quantum effects could be considered to be a consequence of nonseparability alone.

  2. Investment and uncertainty

    DEFF Research Database (Denmark)

    Greasley, David; Madsen, Jakob B.

    2006-01-01

    A severe collapse of fixed capital formation distinguished the onset of the Great Depression from other investment downturns between the world wars. Using a model estimated for the years 1890-2000, we show that the expected profitability of capital measured by Tobin's q, and the uncertainty surrounding ... depression: rather, its slump helped to propel the wider collapse...

  3. GIS and uncertainty management

    Czech Academy of Sciences Publication Activity Database

    Klimešová, Dana

    Enschede: ISPRS ITC, 2006, pp. 98-101. [ISPRS Mid-term Symposium 2006. Enschede (NL), 08.05.2006-11.05.2006] Institutional research plan: CEZ:AV0Z10750506 Keywords: spatial information sciences * GIS * uncertainty correction * web services Subject RIV: BC - Control Systems Theory

  4. manage employee uncertainty

    Institute of Scientific and Technical Information of China (English)

    范梦璇

    2015-01-01

    Employee change-related uncertainty is a condition in which, under the current continually changing business environment, organizations also have to change; the changes include strategic direction, structure, and staffing levels, to help the company remain competitive (Armenakis & Bedeian, 1999); however, these...

  5. Risk, Uncertainty and Entrepreneurship

    DEFF Research Database (Denmark)

    Koudstaal, Martin; Sloof, Randolph; Van Praag, Mirjam

    Theory predicts that entrepreneurs have distinct attitudes towards risk and uncertainty, but empirical evidence is mixed. To better understand the unique behavioral characteristics of entrepreneurs and the causes of these mixed results, we perform a large ‘lab-in-the-field’ experiment comparing e...

  6. A Representation of Uncertainty to Aid Insight into Decision Models

    OpenAIRE

    Jimison, Holly B.

    2013-01-01

    Many real world models can be characterized as weak, meaning that there is significant uncertainty in both the data input and inferences. This lack of determinism makes it especially difficult for users of computer decision aids to understand and have confidence in the models. This paper presents a representation for uncertainty and utilities that serves as a framework for graphical summary and computer-generated explanation of decision models. The application described that tests the methodo...

  7. Meteorological Uncertainty of atmospheric Dispersion model results (MUD)

    DEFF Research Database (Denmark)

    Havskov Sørensen, Jens; Amstrup, Bjarne; Feddersen, Henrik;

    uncertainties of the meteorological model results. These uncertainties stem from e.g. limits in meteorological observations used to initialise meteorological forecast series. By perturbing e.g. the initial state of an NWP model run in agreement with the available observational data, an ensemble of meteorological forecasts is produced, from which uncertainties in the various meteorological parameters are estimated, e.g. probabilities for rain. Corresponding ensembles of atmospheric dispersion can now be computed, from which uncertainties of predicted radionuclide concentration and deposition patterns can be...
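In the simplest frequency interpretation, the ensemble-based probabilities mentioned in the record reduce to counting members beyond a threshold. A toy sketch with invented ensemble values (not the NWP/dispersion machinery of the project):

```python
def exceedance_probability(ensemble_values, threshold):
    """Probability estimate from an ensemble of model runs: the
    fraction of members whose predicted value (e.g. deposition at a
    grid point) exceeds a given threshold."""
    n = len(ensemble_values)
    return sum(1 for v in ensemble_values if v > threshold) / n

# ten hypothetical ensemble members' predicted deposition values
members = [0.2, 1.4, 0.9, 2.1, 0.3, 1.1, 0.8, 1.9, 0.5, 1.2]
p_exceed = exceedance_probability(members, threshold=1.0)
```

Applied at every grid point, this yields probability maps of exceeding a concentration or deposition limit instead of a single deterministic field.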

  8. Fluid flow dynamics under location uncertainty

    CERN Document Server

    Mémin, Etienne

    2013-01-01

    We present a derivation of a stochastic model of the Navier-Stokes equations that relies on a decomposition of the velocity fields into a differentiable drift component and a time-uncorrelated uncertainty random term. This type of decomposition is reminiscent in spirit of the classical Reynolds decomposition. However, the random velocity fluctuations considered here are not differentiable with respect to time, and they must be handled through stochastic calculus. The dynamics associated with the differentiable drift component is derived from a stochastic version of the Reynolds transport theorem. It includes in its general form an uncertainty-dependent "subgrid" bulk formula that cannot be immediately related to the usual Boussinesq eddy viscosity assumption constructed from the thermal molecular agitation analogy. This formulation, emerging from uncertainties on the fluid parcels' location, explains from another viewpoint some subgrid eddy diffusion models currently used in computational fluid dynamics or in geophysi...

  9. Multi-scenario modelling of uncertainty in stochastic chemical systems

    International Nuclear Information System (INIS)

    Uncertainty analysis has not been well studied at the molecular scale, despite extensive knowledge of uncertainty in macroscale systems. The ability to predict the effect of uncertainty allows for robust control of small scale systems such as nanoreactors, surface reactions, and gene toggle switches. However, it is difficult to model uncertainty in such chemical systems as they are stochastic in nature, and require a large computational cost. To address this issue, a new model of uncertainty propagation in stochastic chemical systems, based on the Chemical Master Equation, is proposed in the present study. The uncertain solution is approximated by a composite state comprised of the averaged effect of samples from the uncertain parameter distributions. This model is then used to study the effect of uncertainty on an isomerization system and a two gene regulation network called a repressilator. The results of this model show that uncertainty in stochastic systems is dependent on both the uncertain distribution, and the system under investigation. -- Highlights: •A method to model uncertainty on stochastic systems was developed. •The method is based on the Chemical Master Equation. •Uncertainty in an isomerization reaction and a gene regulation network was modelled. •Effects were significant and dependent on the uncertain input and reaction system. •The model was computationally more efficient than Kinetic Monte Carlo
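The composite-state idea can be caricatured by averaging stochastic simulations over samples of an uncertain rate constant. This sketch uses a plain Gillespie SSA for the isomerization A <-> B with invented rates and sample values, not the Chemical Master Equation machinery of the paper:

```python
import random

def ssa_isomerization(n_a, n_b, kf, kr, t_end, rng):
    """Gillespie SSA for the isomerization A <-> B up to time t_end."""
    t = 0.0
    while True:
        a1, a2 = kf * n_a, kr * n_b   # reaction propensities
        a0 = a1 + a2
        if a0 == 0.0:
            break
        t += rng.expovariate(a0)      # time to next reaction
        if t > t_end:
            break
        if rng.random() * a0 < a1:    # pick which reaction fires
            n_a -= 1; n_b += 1
        else:
            n_a += 1; n_b -= 1
    return n_a, n_b

def composite_mean_b(kf_samples, runs_per_sample, rng):
    """Average the stochastic solution over samples of the uncertain
    forward rate constant (a heavily simplified composite state)."""
    total, runs = 0.0, 0
    for kf in kf_samples:
        for _ in range(runs_per_sample):
            _, nb = ssa_isomerization(100, 0, kf, 1.0, 5.0, rng)
            total += nb
            runs += 1
    return total / runs

rng = random.Random(0)
mean_b = composite_mean_b([0.5, 1.0, 1.5], 20, rng)
```

With 100 molecules and these rates the system is near equilibrium by t = 5, so the composite mean of B sits close to the average of the per-sample equilibria (about 33, 50, and 60 molecules).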

  10. Multi-scenario modelling of uncertainty in stochastic chemical systems

    Energy Technology Data Exchange (ETDEWEB)

    Evans, R. David [Waterloo Institute for Nanotechnology, University of Waterloo, 200 University Avenue West, Waterloo, ON, N2L 3G1 (Canada); Ricardez-Sandoval, Luis A., E-mail: laricardezsandoval@uwaterloo.ca [Department of Chemical Engineering, University of Waterloo, 200 University Avenue West, Waterloo, ON, N2L 3G1 (Canada); Waterloo Institute for Nanotechnology, University of Waterloo, 200 University Avenue West, Waterloo, ON, N2L 3G1 (Canada)

    2014-09-15

    Uncertainty analysis has not been well studied at the molecular scale, despite extensive knowledge of uncertainty in macroscale systems. The ability to predict the effect of uncertainty allows for robust control of small scale systems such as nanoreactors, surface reactions, and gene toggle switches. However, it is difficult to model uncertainty in such chemical systems as they are stochastic in nature, and require a large computational cost. To address this issue, a new model of uncertainty propagation in stochastic chemical systems, based on the Chemical Master Equation, is proposed in the present study. The uncertain solution is approximated by a composite state comprised of the averaged effect of samples from the uncertain parameter distributions. This model is then used to study the effect of uncertainty on an isomerization system and a two gene regulation network called a repressilator. The results of this model show that uncertainty in stochastic systems is dependent on both the uncertain distribution, and the system under investigation. -- Highlights: •A method to model uncertainty on stochastic systems was developed. •The method is based on the Chemical Master Equation. •Uncertainty in an isomerization reaction and a gene regulation network was modelled. •Effects were significant and dependent on the uncertain input and reaction system. •The model was computationally more efficient than Kinetic Monte Carlo.

  11. Uncertainty Analysis via Failure Domain Characterization: Polynomial Requirement Functions

    Science.gov (United States)

    Crespo, Luis G.; Munoz, Cesar A.; Narkawicz, Anthony J.; Kenny, Sean P.; Giesy, Daniel P.

    2011-01-01

    This paper proposes an uncertainty analysis framework based on the characterization of the uncertain parameter space. This characterization enables the identification of worst-case uncertainty combinations and the approximation of the failure and safe domains with a high level of accuracy. Because these approximations are comprised of subsets of readily computable probability, they enable the calculation of arbitrarily tight upper and lower bounds to the failure probability. A Bernstein expansion approach is used to size hyper-rectangular subsets while a sum of squares programming approach is used to size quasi-ellipsoidal subsets. These methods are applicable to requirement functions whose functional dependency on the uncertainty is a known polynomial. Some of the most prominent features of the methodology are the substantial desensitization of the calculations from the uncertainty model assumed (i.e., the probability distribution describing the uncertainty) as well as the accommodation for changes in such a model with a practically insignificant amount of computational effort.
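The Bernstein expansion approach rests on the range-enclosure property of Bernstein coefficients. A one-dimensional sketch on [0, 1] (the paper works with hyper-rectangular subsets in many dimensions, and the conversion formula below is the standard power-to-Bernstein basis change):

```python
from math import comb

def bernstein_bounds(a):
    """Range-enclosing bounds for p(x) = sum_j a[j] * x**j on [0, 1].

    The Bernstein coefficients b_i satisfy
    min(b) <= p(x) <= max(b) for all x in [0, 1],
    with equality at the endpoints (b_0 = p(0), b_n = p(1))."""
    n = len(a) - 1
    b = [sum(comb(i, j) / comb(n, j) * a[j] for j in range(i + 1))
         for i in range(n + 1)]
    return min(b), max(b)

# p(x) = 2x^2 - 2x + 1; its true range on [0, 1] is [0.5, 1]
lo, hi = bernstein_bounds([1.0, -2.0, 2.0])
```

Here the bounds [0, 1] enclose the true range [0.5, 1]; subdividing [0, 1] and recomputing coefficients on each piece tightens the enclosure, which is how such expansions can size subsets of the uncertain parameter space.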

  12. Uncertainty Analysis via Failure Domain Characterization: Unrestricted Requirement Functions

    Science.gov (United States)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2011-01-01

    This paper proposes an uncertainty analysis framework based on the characterization of the uncertain parameter space. This characterization enables the identification of worst-case uncertainty combinations and the approximation of the failure and safe domains with a high level of accuracy. Because these approximations are comprised of subsets of readily computable probability, they enable the calculation of arbitrarily tight upper and lower bounds to the failure probability. The methods developed herein, which are based on nonlinear constrained optimization, are applicable to requirement functions whose functional dependency on the uncertainty is arbitrary and whose explicit form may even be unknown. Some of the most prominent features of the methodology are the substantial desensitization of the calculations from the assumed uncertainty model (i.e., the probability distribution describing the uncertainty) as well as the accommodation for changes in such a model with a practically insignificant amount of computational effort.

  13. Deconvolution of variability and uncertainty in the Cassini safety analysis

    International Nuclear Information System (INIS)

    The standard method for propagation of uncertainty in a risk analysis requires rerunning the risk calculation numerous times with model parameters chosen from their uncertainty distributions. This was not practical for the Cassini nuclear safety analysis, due to the computationally intense nature of the risk calculation. A less computationally intense procedure was developed which requires only two calculations for each accident case. The first of these is the standard 'best-estimate' calculation. In the second calculation, variables and parameters are varied simultaneously. The mathematical technique of deconvolution is then used to separate out an uncertainty multiplier distribution, which can be used to calculate distribution functions at various levels of confidence

  14. Uncertainties about climate

    International Nuclear Information System (INIS)

    Like meteorology, climatology is not an exact science: climate change forecasts necessarily include a share of uncertainty. It is precisely this uncertainty which is brandished and exploited by the opponents of the global warming theory to call into question the estimations of its future consequences. Is it legitimate to predict the future using past climate data (well documented up to 100,000 years BP) or the climates of other planets, taking into account the imprecision of the measurements and the intrinsic complexity of the Earth's machinery? How is it possible to model such a huge and interwoven system, for which any exact description has become impossible? Why do water and precipitation play such an important role in local and global forecasts, and how should they be treated? This book, written by two physicists, answers these delicate questions with simplicity in order to give anyone the possibility to build his own opinion about global warming and the need to act rapidly

  15. Uncertainty in adaptive capacity

    International Nuclear Information System (INIS)

    The capacity to adapt is a critical element of the process of adaptation: it is the vector of resources that represent the asset base from which adaptation actions can be made. Adaptive capacity can in theory be identified and measured at various scales, from the individual to the nation. The assessment of uncertainty within such measures comes from the contested knowledge domain and theories surrounding the nature of the determinants of adaptive capacity and the human action of adaptation. While generic adaptive capacity at the national level, for example, is often postulated as being dependent on health, governance and political rights, and literacy, and economic well-being, the determinants of these variables at national levels are not widely understood. We outline the nature of this uncertainty for the major elements of adaptive capacity and illustrate these issues with the example of a social vulnerability index for countries in Africa. (authors)

  16. Uncertainty Principle Respects Locality

    CERN Document Server

    Wang, Dongsheng

    2013-01-01

    The notion of nonlocality implicitly implies that there might be some kind of spooky action at a distance in nature; however, the validity of quantum mechanics has been well tested up to now. In this work it is argued that the notion of nonlocality is physically improper and that the basic principle of locality in nature is well respected by quantum mechanics, namely through the uncertainty principle. We show that the quantum bound on the Clauser, Horne, Shimony, and Holt (CHSH) inequality can be recovered from the uncertainty relation in a multipartite setting, and that the same bound exists classically, which indicates that nonlocality does not capture the essence of the quantum and thus does not properly distinguish quantum mechanics from classical mechanics. We further argue that the super-quantum correlation demonstrated by the nonlocal box is not physically comparable with the quantum one; as a result, the physical foundation for the existence of nonlocality is falsified. The origin of the quantum structure of nature still remains to be exp...

  17. Estimating uncertainty of inference for validation

    Energy Technology Data Exchange (ETDEWEB)

    Booker, Jane M [Los Alamos National Laboratory; Langenbrunner, James R [Los Alamos National Laboratory; Hemez, Francois M [Los Alamos National Laboratory; Ross, Timothy J [UNM

    2010-09-30

    We present a validation process based upon the concept that validation is an inference-making activity. This has always been true, but the association has not been as important before as it is now. Previously, theory had been confirmed by more data, and predictions were possible based on data. The process today is to infer from theory to code and from code to prediction, making the role of prediction somewhat automatic, a machine function. Validation is defined as determining the degree to which a model and its code are an accurate representation of experimental test data. Embedded in validation is the intention to use the computer code to predict. To predict is to accept the conclusion that an observable final state will manifest; therefore, prediction is an inference whose goodness relies on the validity of the code. Quantifying the uncertainty of a prediction amounts to quantifying the uncertainty of validation, and this involves the characterization of uncertainties inherent in theory/models/codes and the corresponding data. An introduction to inference making and its associated uncertainty is provided as a foundation for the validation problem. A mathematical construction for estimating the uncertainty in the validation inference is then presented, including a possibility distribution constructed to represent the inference uncertainty for validation under uncertainty. The estimation of inference uncertainty for validation is illustrated using data and calculations from Inertial Confinement Fusion (ICF). The ICF measurements of neutron yield and ion temperature were obtained for direct-drive inertial fusion capsules at the Omega laser facility. The glass capsules, containing the fusion gas, were systematically selected with the intent of establishing a reproducible baseline of high-yield 10{sup 13}-10{sup 14} neutron output. The deuterium-tritium ratio in these experiments was varied to study its influence upon yield. This paper on validation inference is the

  18. SENSIT: a cross-section and design sensitivity and uncertainty analysis code

    International Nuclear Information System (INIS)

    SENSIT computes the sensitivity and uncertainty of a calculated integral response (such as a dose rate) due to input cross sections and their uncertainties. Sensitivity profiles are computed for neutron and gamma-ray reaction cross sections of standard multigroup cross section sets and for secondary energy distributions (SEDs) of multigroup scattering matrices. In the design sensitivity mode, SENSIT computes changes in an integral response due to design changes and gives the appropriate sensitivity coefficients. Cross section uncertainty analyses are performed for three types of input data uncertainties: cross-section covariance matrices for pairs of multigroup reaction cross sections, spectral shape uncertainty parameters for secondary energy distributions (integral SED uncertainties), and covariance matrices for energy-dependent response functions. For all three types of data uncertainties SENSIT computes the resulting variance and estimated standard deviation in an integral response of interest, on the basis of generalized perturbation theory. SENSIT attempts to be more comprehensive than earlier sensitivity analysis codes, such as SWANLAKE
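
    The generalized-perturbation "sandwich rule" that codes like SENSIT implement can be sketched in a few lines: the relative variance of an integral response is S^T C S, where S is the vector of relative sensitivity coefficients and C the relative covariance matrix of the cross sections. The function name and all numbers below are illustrative, not taken from SENSIT.

```python
def response_variance(S, C):
    """First-order propagated relative variance of a response: S^T C S."""
    n = len(S)
    return sum(S[i] * C[i][j] * S[j] for i in range(n) for j in range(n))

# Three-group example: a sensitivity profile and a partially correlated
# relative covariance matrix (roughly 2-4% standard deviations per group).
S = [0.8, 0.5, 0.2]
C = [[0.0004, 0.0002, 0.0000],
     [0.0002, 0.0009, 0.0001],
     [0.0000, 0.0001, 0.0016]]

var = response_variance(S, C)   # relative variance of the response
std = var ** 0.5                # estimated relative standard deviation
```

The same quadratic form covers all three uncertainty types the abstract lists; only the contents of S and C change (reaction cross sections, SED parameters, or response functions).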

  19. Uncertainty in artificial intelligence

    CERN Document Server

    Shachter, RD; Henrion, M; Lemmer, JF

    1990-01-01

    This volume, like its predecessors, reflects the cutting edge of research on the automation of reasoning under uncertainty. A more pragmatic emphasis is evident, for although some papers address fundamental issues, the majority address practical issues. Topics include the relations between alternative formalisms (including possibilistic reasoning), Dempster-Shafer belief functions, non-monotonic reasoning, Bayesian and decision theoretic schemes, and new inference techniques for belief nets. New techniques are applied to important problems in medicine, vision, robotics, and natural language und

  20. Choice under Partial Uncertainty.

    OpenAIRE

    Kelsey, David

    1993-01-01

    This paper analyzes problems of choice under uncertainty where a decisionmaker does not use subjective probabilities. The decisionmaker has a set of beliefs about which states are more likely than others, but his beliefs cannot be represented as subjective probabilities. Three main kinds of decision rules are possible in this framework: maximin-type rules, maximax-type rules, and choosing the action that gives the highest payoff in the state which the decisionmaker believes to be most likely....

  1. Aeroelastic/Aeroservoelastic Uncertainty and Reliability of Advanced Aerospace Vehicles in Flight and Ground Operations Project

    Data.gov (United States)

    National Aeronautics and Space Administration — ASSURE - Aeroelastic / Aeroservoelastic (AE/ASE) Uncertainty and Reliability Engineering capability - is a set of probabilistic computer programs for isolating...

  2. Uncertainties in parton distribution functions

    Energy Technology Data Exchange (ETDEWEB)

    Martin, A.D. [Deptartment of Physics, University of Durham, Durham DH1 3LE (United Kingdom); Roberts, R.G. [Rutherford Appleton Laboratory, Chilton, Didcot, Oxon OX11 0QX (United Kingdom); Stirling, W.J. [Department of Physics, University of Durham, Durham DH1 3LE (United Kingdom); Department of Mathematical Sciences, University of Durham, Durham DH1 3LE (United Kingdom); Thorne, R.S. [Jesus College, University of Oxford, Oxford OX1 3DW (United Kingdom)

    2000-05-01

    We discuss the uncertainty in the predictions for hard scattering cross sections at hadron colliders due to uncertainties in the input parton distributions, using W production at the LHC as an example. (author)

  3. Uncertainty propagation in locally damped dynamic systems

    OpenAIRE

    Cortes Mochales, Lluis; Ferguson, Neil S.; Bhaskar, Atul

    2012-01-01

    In the field of stochastic structural dynamics, perturbation methods are widely used to estimate the response statistics of uncertain systems. When large built-up systems are to be modelled in the mid-frequency range, perturbation methods are often combined with finite element model reduction techniques in order to considerably reduce the computation time of the response. Existing methods based on Component Mode Synthesis (CMS) allow the uncertainties in the system parameters to be treated ...

  4. Inflow Uncertainty in Hydropower Markets

    OpenAIRE

    2007-01-01

    In order to analyse the consequences of uncertainty for prices and efficiency in a hydropower system, we apply a two-period model with uncertainty in water inflow. We study three different market structures (perfect competition, monopoly, and oligopoly) and stress the importance of the shape of the demand function under different distributions of water inflow. The uncertainty element creates possibilities of exercising market power depending on the distribution of uncertainty among producers. T...

  5. Legal uncertainty and contractual innovation

    OpenAIRE

    Yaron Leitner

    2005-01-01

    Although innovative contracts are important for economic growth, when firms face uncertainty as to whether contracts will be enforced, they may choose not to innovate. Legal uncertainty can arise if a judge interprets the terms of a contract in a way that is antithetical to the intentions of the parties to the contract. Or sometimes a judge may understand the contract but overrule it for other reasons. How does legal uncertainty affect firms’ decisions to innovate? In “Legal Uncertainty and C...

  6. Gravitational tests of the generalized uncertainty principle

    International Nuclear Information System (INIS)

    We compute the corrections to the Schwarzschild metric necessary to reproduce the Hawking temperature derived from a generalized uncertainty principle (GUP), so that the GUP deformation parameter is directly linked to the deformation of the metric. Using this modified Schwarzschild metric, we compute corrections to the standard general relativistic predictions for the light deflection and perihelion precession, both for planets in the solar system and for binary pulsars. This analysis allows us to set bounds for the GUP deformation parameter from well-known astronomical measurements. (orig.)

  7. Gravitational tests of the Generalized Uncertainty Principle

    CERN Document Server

    Scardigli, Fabio

    2014-01-01

    We compute the corrections to the Schwarzschild metric necessary to reproduce the Hawking temperature derived from a Generalized Uncertainty Principle (GUP), so that the GUP deformation parameter is directly linked to the deformation of the metric. Using this modified Schwarzschild metric, we compute corrections to the standard General Relativistic predictions for the light deflection and perihelion precession, both for planets in the solar system and for binary pulsars. This analysis allows us to set bounds for the GUP deformation parameter from well-known astronomical measurements.

  8. Gravitational tests of the generalized uncertainty principle

    Science.gov (United States)

    Scardigli, Fabio; Casadio, Roberto

    2015-09-01

    We compute the corrections to the Schwarzschild metric necessary to reproduce the Hawking temperature derived from a generalized uncertainty principle (GUP), so that the GUP deformation parameter is directly linked to the deformation of the metric. Using this modified Schwarzschild metric, we compute corrections to the standard general relativistic predictions for the light deflection and perihelion precession, both for planets in the solar system and for binary pulsars. This analysis allows us to set bounds for the GUP deformation parameter from well-known astronomical measurements.

  9. Gravitational tests of the generalized uncertainty principle

    Energy Technology Data Exchange (ETDEWEB)

    Scardigli, Fabio [American University of the Middle East, Department of Mathematics, College of Engineering, P.O. Box 220, Dasman (Kuwait); Politecnico di Milano, Dipartimento di Matematica, Milan (Italy); Casadio, Roberto [Alma Mater Universita di Bologna, Dipartimento di Fisica e Astronomia, Bologna (Italy); INFN, Sezione di Bologna, Bologna (Italy)

    2015-09-15

    We compute the corrections to the Schwarzschild metric necessary to reproduce the Hawking temperature derived from a generalized uncertainty principle (GUP), so that the GUP deformation parameter is directly linked to the deformation of the metric. Using this modified Schwarzschild metric, we compute corrections to the standard general relativistic predictions for the light deflection and perihelion precession, both for planets in the solar system and for binary pulsars. This analysis allows us to set bounds for the GUP deformation parameter from well-known astronomical measurements. (orig.)

  10. Optimizing production under uncertainty

    DEFF Research Database (Denmark)

    Rasmussen, Svend

    This Working Paper derives criteria for optimal production under uncertainty based on the state-contingent approach (Chambers and Quiggin, 2000), and discusses potential problems involved in applying the state-contingent approach in a normative context. The analytical approach uses the concept...... of state-contingent production functions and a definition of inputs including both sort of input, activity and allocation technology. It also analyses production decisions where production is combined with trading in state-contingent claims such as insurance contracts. The final part discusses

  11. Principles of Uncertainty

    CERN Document Server

    Kadane, Joseph B

    2011-01-01

    An intuitive and mathematical introduction to subjective probability and Bayesian statistics. An accessible, comprehensive guide to the theory of Bayesian statistics, Principles of Uncertainty presents the subjective Bayesian approach, which has played a pivotal role in game theory, economics, and the recent boom in Markov Chain Monte Carlo methods. Both rigorous and friendly, the book contains: Introductory chapters examining each new concept or assumption Just-in-time mathematics -- the presentation of ideas just before they are applied Summary and exercises at the end of each chapter Discus

  12. CANDU lattice uncertainties during burnup

    International Nuclear Information System (INIS)

    Uncertainties associated with fundamental nuclear data accompany evaluated nuclear data libraries in the form of covariance matrices. As nuclear data are important parameters in reactor physics calculations, any associated uncertainty causes a loss of confidence in the calculation results. The quantification of output uncertainties is necessary to adequately establish safety margins of nuclear facilities. In this work, microscopic cross-section uncertainties have been propagated through lattice burnup calculations applied to a generic CANDU® model. It was found that substantial uncertainty emerges during burnup even when fission yield fraction and decay rate uncertainties are neglected. (author)

  13. Optimality of entropic uncertainty relations

    Energy Technology Data Exchange (ETDEWEB)

    Abdelkhalek, Kais; Duhme, Joerg; Schwonnek, Rene; Werner, Reinhard F. [Institut fuer Theoretische Physik, Leibniz Universitaet Hannover, Hannover (Germany); Englert, Berthold-Georg; Raynal, Philippe [Centre for Quantum Technologies and Department of Physics, National University of Singapore (Singapore); Furrer, Fabian [Department of Physics, Graduate School of Science, University of Tokyo, Tokyo (Japan); Maassen, Hans [Department of Mathematics, Radboud University, Nijmegen (Netherlands)

    2014-07-01

    We investigate the optimality of the entropic uncertainty relation proven by Maassen and Uffink and its generalisation to side information from Berta et al for observables, for which the lower bound attains its maximal value. Here, we call an uncertainty relation optimal if the lower bound can be attained for any value of either of the corresponding uncertainties. We show that the uncertainty relation with side information cannot be optimised. In the case of the Maassen-Uffink uncertainty relation, we disprove a conjecture by Englert et al and provide a characterisation of those states that parametrise the optimal lower bound. This leads to a new conjecture.
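
    The Maassen-Uffink bound discussed above, H(A) + H(B) >= -2 log2 c with c the largest overlap between eigenbases of the two observables, is easy to evaluate directly. The sketch below (an assumed illustration, restricted to real-valued bases) computes it for the mutually unbiased Pauli Z and X bases in dimension 2, where the bound equals 1 bit.

```python
import math

def mu_bound(basis_a, basis_b):
    """Maassen-Uffink lower bound -2*log2(c) for two real orthonormal bases."""
    c = max(abs(sum(ai * bj for ai, bj in zip(a, b)))
            for a in basis_a for b in basis_b)
    return -2.0 * math.log2(c)

z_basis = [(1.0, 0.0), (0.0, 1.0)]
s = 1.0 / math.sqrt(2.0)
x_basis = [(s, s), (s, -s)]          # eigenbasis of Pauli X

bound = mu_bound(z_basis, x_basis)   # 1.0 bit for mutually unbiased bases
```

For identical bases the overlap c is 1 and the bound degenerates to zero, reflecting that compatible observables impose no entropic trade-off.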

  14. Strain Gauge Balance Uncertainty Analysis at NASA Langley: A Technical Review

    Science.gov (United States)

    Tripp, John S.

    1999-01-01

    This paper describes a method to determine the uncertainties of measured forces and moments from multi-component force balances used in wind tunnel tests. A multivariate regression technique is first employed to estimate the uncertainties of the six balance sensitivities and 156 interaction coefficients derived from established balance calibration procedures. These uncertainties are then employed to calculate the uncertainties of force-moment values computed from observed balance output readings obtained during tests. Confidence and prediction intervals are obtained for each computed force and moment as functions of the actual measurands. Techniques are discussed for separate estimation of balance bias and precision uncertainties.
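
    The chain from calibration regression to measurement uncertainty can be illustrated with a one-variable analogue: fit a single sensitivity by least squares and propagate the residual variance into standard errors of the fitted coefficients. This is a hypothetical sketch, not the 156-coefficient NASA balance model.

```python
import math

def calibrate(loads, readings):
    """Fit reading = a + b*load by least squares; return (a, b, se_a, se_b)."""
    n = len(loads)
    mx = sum(loads) / n
    my = sum(readings) / n
    sxx = sum((x - mx) ** 2 for x in loads)
    sxy = sum((x - mx) * (y - my) for x, y in zip(loads, readings))
    b = sxy / sxx                     # estimated sensitivity
    a = my - b * mx                   # estimated zero offset
    # residual variance with n - 2 degrees of freedom
    s2 = sum((y - (a + b * x)) ** 2 for x, y in zip(loads, readings)) / (n - 2)
    se_b = math.sqrt(s2 / sxx)                        # sensitivity uncertainty
    se_a = math.sqrt(s2 * (1.0 / n + mx ** 2 / sxx))  # offset uncertainty
    return a, b, se_a, se_b

# Made-up calibration data: applied loads vs. bridge readings.
a, b, se_a, se_b = calibrate([0.0, 1.0, 2.0, 3.0], [1.0, 3.05, 4.95, 7.0])
```

In the multivariate balance setting the scalar s2/sxx generalizes to the parameter covariance s2*(X^T X)^-1, from which the confidence and prediction intervals for computed forces and moments follow.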

  15. Data uncertainty reduction in high converter reactor designs using PROTEUS phase II integral experiments

    International Nuclear Information System (INIS)

    A detailed sensitivity and uncertainty analysis was performed for several parameters of interest in the design of the high conversion reactor (HCR) concept. The main goals of this work were to determine the response standard deviation due to basic nuclear data uncertainties, and to incorporate integral experiment information from the PROTEUS facility to reduce the computed uncertainties wherever possible. This paper highlights the results for K and five important reaction rate ratios that were part of the measurement program in the PROTEUS phase II experiments. The computed correlation coefficients between the PROTEUS and HCR models were uniformly high, which indicates that a considerable reduction in uncertainty can be achieved (within measurement uncertainties)

  16. Integrating Out Astrophysical Uncertainties

    CERN Document Server

    Fox, Patrick J; Weiner, Neal

    2010-01-01

    Underground searches for dark matter involve a complicated interplay of particle physics, nuclear physics, atomic physics and astrophysics. We attempt to remove the uncertainties associated with astrophysics by developing the means to map the observed signal in one experiment directly into a predicted rate at another. We argue that it is possible to make experimental comparisons that are completely free of astrophysical uncertainties by focusing on {\\em integral} quantities, such as $g(v_{min})=\\int_{v_{min}} dv\\, f(v)/v $ and $\\int_{v_{thresh}} dv\\, v g(v)$. Direct comparisons are possible when the $v_{min}$ space probed by different experiments overlap. As examples, we consider the possible dark matter signals at CoGeNT, DAMA and CRESST-Oxygen. We find that expected rate from CoGeNT in the XENON10 experiment is higher than observed, unless scintillation light output is low. Moreover, we determine that S2-only analyses are constraining, unless the charge yield $Q_y< 2.4 {\\, \\rm electrons/keV}$. For DAMA t...

  17. Oil price uncertainty in Canada

    Energy Technology Data Exchange (ETDEWEB)

    Elder, John [Department of Finance and Real Estate, 1272 Campus Delivery, Colorado State University, Fort Collins, CO 80523 (United States); Serletis, Apostolos [Department of Economics, University of Calgary, Calgary, Alberta (Canada)

    2009-11-15

    Bernanke [Bernanke, Ben S. Irreversibility, uncertainty, and cyclical investment. Quarterly Journal of Economics 98 (1983), 85-106.] shows how uncertainty about energy prices may induce optimizing firms to postpone investment decisions, thereby leading to a decline in aggregate output. Elder and Serletis [Elder, John and Serletis, Apostolos. Oil price uncertainty.] find empirical evidence that uncertainty about oil prices has tended to depress investment in the United States. In this paper we assess the robustness of these results by investigating the effects of oil price uncertainty in Canada. Our results are remarkably similar to existing results for the United States, providing additional evidence that uncertainty about oil prices may provide another explanation for why the sharp oil price declines of 1985 failed to produce rapid output growth. Impulse-response analysis suggests that uncertainty about oil prices may tend to reinforce the negative response of output to positive oil shocks. (author)

  18. Oil price uncertainty in Canada

    International Nuclear Information System (INIS)

    Bernanke [Bernanke, Ben S. Irreversibility, uncertainty, and cyclical investment. Quarterly Journal of Economics 98 (1983), 85-106.] shows how uncertainty about energy prices may induce optimizing firms to postpone investment decisions, thereby leading to a decline in aggregate output. Elder and Serletis [Elder, John and Serletis, Apostolos. Oil price uncertainty.] find empirical evidence that uncertainty about oil prices has tended to depress investment in the United States. In this paper we assess the robustness of these results by investigating the effects of oil price uncertainty in Canada. Our results are remarkably similar to existing results for the United States, providing additional evidence that uncertainty about oil prices may provide another explanation for why the sharp oil price declines of 1985 failed to produce rapid output growth. Impulse-response analysis suggests that uncertainty about oil prices may tend to reinforce the negative response of output to positive oil shocks. (author)

  19. Uncertainty Analysis of Light Water Reactor Fuel Lattices

    Directory of Open Access Journals (Sweden)

    C. Arenas

    2013-01-01

    Full Text Available The study explored the calculation of uncertainty based on available cross-section covariance data and computational tools at the fuel lattice level, which included pin cell and fuel assembly models. Uncertainty variations due to temperature changes and different fuel compositions are the main focus of this analysis. Selected assemblies and unit pin cells were analyzed according to the OECD LWR UAM benchmark specifications. Criticality and uncertainty analyses were performed using the TSUNAMI-2D sequence in SCALE 6.1. It was found that uncertainties increase with increasing temperature, while kinf decreases. This increase in the uncertainty is due to the increase in sensitivity of the largest contributing reaction, namely the neutron capture reaction 238U(n, γ), due to Doppler broadening. In addition, three types of fuel material composition (UOX, MOX, and UOX-Gd2O3) were analyzed. A remarkable increase in the uncertainty in kinf was observed for the case of MOX fuel: the increase in the uncertainty of kinf in MOX fuel was nearly twice the corresponding value in UOX fuel. The neutron-nuclide reactions of 238U, mainly inelastic scattering (n, n′), contributed the most to the uncertainties in the MOX fuel, which shifts the neutron spectrum to higher energy compared to the UOX fuel.

  20. Uncertainty Quantification of Physical Model for Best Estimate Safety Analysis

    International Nuclear Information System (INIS)

    In this work, model calibration was done to reduce the uncertainties of the simulation code's input parameters, and subsequently the code's prediction uncertainties for design-constraining responses. Each parameter's/physical model's fidelity was identified as well, to determine the major sources of modeling uncertainty. This analysis is important in deciding where additional effort should be given to improve the simulation model. The goal of this work is to develop a higher fidelity model by completing experiments and doing uncertainty quantification. Thermal hydraulic parameters were adjusted for both mildly nonlinear and highly nonlinear systems, and their a posteriori parameter uncertainties were propagated through the simulation model to predict a posteriori uncertainties of the key system attributes. To solve both the highly nonlinear and the mildly nonlinear problem, both deterministic and probabilistic methods were used to complete the uncertainty quantification. To accomplish this, the Bayesian approach modified by regularization is used for the mildly nonlinear problem to incorporate available information in quantifying uncertainties. The a priori information considered comprises the parameters and the experimental data together with their uncertainties. The results indicate that substantial reductions in the uncertainties of the system responses can be achieved by using experimental data to obtain a posteriori input parameter uncertainty distributions. The MCMC method was used for the highly nonlinear transient. Due to the computational burden, this method would not be applicable if there are many parameters, but it can provide the best solution since the algorithm does not approximate the responses, while the deterministic approach assumes linearity of the responses with regard to dependencies on the parameters. Using MCMC, non-Gaussian a posteriori distributions of the parameters with reduced uncertainties were obtained due to the nonlinearity of the system sensitivity

  1. Uncertainty analyses in systems modeling process

    International Nuclear Information System (INIS)

    In the context of Probabilistic Safety Assessment (PSA), uncertainty analyses play an important role. The objective is to ensure the qualitative evaluation and quantitative estimation of PSA level 1 results (the core damage frequency, the accident sequence frequencies, the top event probabilities, etc.). An application that enables uncertainty calculations by propagating probability distributions through the fault tree model has been developed. It uses the moment method and the Monte Carlo method. The application has been integrated into a computer program that allocates the reliability data, quantifies the human errors and labels the components in a unique way. The reliability data used in the Institute for Nuclear Research (INR) Pitesti for the Cernavoda Probabilistic Safety Evaluation (CPSE) studies is a generic database. Taking into account the status of the reliability database and the cases by which an error factor for a failure rate lognormal distribution is calculated, the database has been completed with an error factor for each record. This paper presents the module that performs the uncertainty analysis and an example of uncertainty analysis at the fault tree level. (authors)
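
    The distribution-propagation step described above can be sketched with a minimal two-event fault tree: basic-event probabilities are drawn from lognormal distributions parameterized by a median and an error factor (EF, the 95th/50th percentile ratio, so sigma = ln(EF)/1.645) and pushed through an OR gate. All component data here are invented for illustration.

```python
import math
import random

def sample_lognormal(median, error_factor, rng):
    """Draw one lognormal sample given the median and the error factor EF."""
    sigma = math.log(error_factor) / 1.645
    return median * math.exp(rng.gauss(0.0, sigma))

def top_event_samples(n, rng):
    """Monte Carlo samples of a two-basic-event OR-gate top probability."""
    out = []
    for _ in range(n):
        p1 = sample_lognormal(1e-3, 3.0, rng)    # made-up component data
        p2 = sample_lognormal(5e-4, 10.0, rng)
        out.append(1.0 - (1.0 - p1) * (1.0 - p2))  # OR gate (union)
    return out

samples = sorted(top_event_samples(20000, random.Random(0)))
mean = sum(samples) / len(samples)
p95 = samples[int(0.95 * len(samples))]   # empirical 95th percentile
```

The moment method mentioned in the abstract would instead combine the analytic lognormal moments; Monte Carlo trades that algebra for sampling error.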

  2. LCA data quality: sensitivity and uncertainty analysis.

    Science.gov (United States)

    Guo, M; Murphy, R J

    2012-10-01

    Life cycle assessment (LCA) data quality issues were investigated by using case studies on products from starch-polyvinyl alcohol based biopolymers and petrochemical alternatives. The time horizon chosen for the characterization models was shown to be an important sensitive parameter for the environmental profiles of all the polymers. In the global warming potential and the toxicity potential categories the comparison between biopolymers and petrochemical counterparts altered as the time horizon extended from 20 years to infinite time. These case studies demonstrated that the use of a single time horizon provides only one perspective on the LCA outcomes, which could introduce an inadvertent bias into LCA outcomes, especially in toxicity impact categories; thus dynamic LCA characterization models with varying time horizons are recommended as a measure of the robustness of LCAs, especially comparative assessments. This study also presents an approach to integrate statistical methods into LCA models for analyzing uncertainty in industrial and computer-simulated datasets. We calibrated probabilities for the LCA outcomes for biopolymer products arising from uncertainty in the inventory and from data variation characteristics; this has enabled us to assign confidence to the LCIA outcomes in specific impact categories for the biopolymer vs. petrochemical polymer comparisons undertaken. Uncertainty analysis combined with the sensitivity analysis carried out in this study has led to a transparent increase in confidence in the LCA findings. We conclude that LCAs lacking explicit interpretation of the degree of uncertainty and sensitivities are of limited value as robust evidence for decision making or comparative assertions. PMID:22854094

  3. Uncertainty Quantification for Cargo Hold Fires

    CERN Document Server

    DeGennaro, Anthony M; Martinelli, Luigi; Rowley, Clarence W

    2015-01-01

    The purpose of this study is twofold: first, to introduce the application of high-order discontinuous Galerkin methods to buoyancy-driven cargo hold fire simulations; second, to explore statistical variation in the fluid dynamics of a cargo hold fire given parameterized uncertainty in the fire source location and temperature. Cargo hold fires represent a class of problems that require highly accurate computational methods to simulate faithfully. Hence, we use an in-house discontinuous Galerkin code to treat these flows. Cargo hold fires also exhibit a large amount of uncertainty with respect to the boundary conditions. Thus, the second aim of this paper is to quantify the resulting uncertainty in the flow, using tools from the uncertainty quantification community to ensure that our efforts require a minimal number of simulations. We expect that the results of this study will provide statistical insight into the effects of fire location and temperature on cargo fires, and also assist in the optimization of f...

  4. Uncertainty relation in Schwarzschild spacetime

    Science.gov (United States)

    Feng, Jun; Zhang, Yao-Zhong; Gould, Mark D.; Fan, Heng

    2015-04-01

    We explore the entropic uncertainty relation in the curved background outside a Schwarzschild black hole, and find that Hawking radiation introduces a nontrivial modification of the uncertainty bound for a particular observer; it could therefore be witnessed experimentally by a proper uncertainty game. We first investigate an uncertainty game between a free-falling observer and his static partner holding a quantum memory initially entangled with the quantum system to be measured. Due to the information loss from Hawking decoherence, we find an inevitable increase of the uncertainty on the outcome of measurements in the view of the static observer, which is dependent on the mass of the black hole, the distance of the observer from the event horizon, and the mode frequency of the quantum memory. To illustrate the generality of this paradigm, we relate the entropic uncertainty bound to other uncertainty probes, e.g., time-energy uncertainty. In an alternative game between two static players, we show that the quantum information of a qubit can be transferred to the quantum memory through a bath of fluctuating quantum fields outside the black hole. For a particular choice of initial state, we show that the Hawking decoherence cannot counteract entanglement generation after the dynamical evolution of the system, which triggers an effectively reduced uncertainty bound that violates the intrinsic limit -log2 c. Numerical estimation for a proper choice of initial state shows that our result is comparable with possible real experiments. Finally, a discussion of the black hole firewall paradox in the context of the entropic uncertainty relation is given.

  5. On the worst case uncertainty and its evaluation

    International Nuclear Information System (INIS)

    The paper is a review of the worst case uncertainty (WCU) concept, neglected in the Guide to the Expression of Uncertainty in Measurement (GUM) but necessary for a correct uncertainty assessment in a number of practical cases involving distributions with compact support. First, it is highlighted that knowledge of the WCU is necessary to choose a sensible coverage factor, associated with a sensible coverage probability: the Maximum Acceptable Coverage Factor (MACF) is introduced as a convenient index to guide this choice. Second, propagation rules for the worst-case uncertainty are provided in matrix and scalar form. It is highlighted that when WCU propagation cannot be computed, the Monte Carlo approach is the only way to obtain a correct expanded uncertainty assessment, in contrast to what can be inferred from the GUM. Third, examples of applications of the formulae to ordinary instruments and measurements are given. An example taken from the GUM is also discussed, underlining some inconsistencies in it

  6. Parameter Uncertainty for Aircraft Aerodynamic Modeling using Recursive Least Squares

    Science.gov (United States)

    Grauer, Jared A.; Morelli, Eugene A.

    2016-01-01

    A real-time method was demonstrated for determining accurate uncertainty levels of stability and control derivatives estimated using recursive least squares and time-domain data. The method uses a recursive formulation of the residual autocorrelation to account for colored residuals, which are routinely encountered in aircraft parameter estimation and change the predicted uncertainties. Simulation data and flight test data for a subscale jet transport aircraft were used to demonstrate the approach. Results showed that the corrected uncertainties matched the observed scatter in the parameter estimates, and did so more accurately than conventional uncertainty estimates that assume white residuals. Only small differences were observed between batch estimates and recursive estimates at the end of the maneuver. It was also demonstrated that the autocorrelation could be reduced to a small number of lags to minimize computation and memory storage requirements without significantly degrading the accuracy of predicted uncertainty levels.
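
    The recursive estimator underlying the method can be sketched for a scalar parameter; the covariance P is what yields the conventional (white-residual) parameter uncertainty that the paper then corrects using the residual autocorrelation. This is a generic RLS sketch under an assumed unit measurement-noise variance, not the paper's corrected formulation.

```python
def rls_update(theta, P, x, y):
    """One recursive least-squares step for the scalar model y = theta*x + e,
    assuming unit measurement-noise variance."""
    k = P * x / (1.0 + x * P * x)        # gain
    theta = theta + k * (y - theta * x)  # innovation update
    P = (1.0 - k * x) * P                # covariance update
    return theta, P

theta, P = 0.0, 100.0                    # vague prior on the parameter
for x, y in [(1.0, 2.1), (2.0, 3.9), (3.0, 6.0), (4.0, 8.1)]:
    theta, P = rls_update(theta, P, x, y)
# theta approaches the underlying slope (about 2); sqrt(P) shrinks with data
```

Colored residuals inflate the true scatter of theta relative to sqrt(P), which is why the autocorrelation correction matters in practice.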

  7. On the worst case uncertainty and its evaluation

    Science.gov (United States)

    Fabbiano, L.; Giaquinto, N.; Savino, M.; Vacca, G.

    2016-02-01

    The paper is a review of the worst-case uncertainty (WCU) concept, neglected in the Guide to the Expression of Uncertainty in Measurement (GUM) but necessary for a correct uncertainty assessment in a number of practical cases involving distributions with compact support. First, it is highlighted that knowledge of the WCU is necessary to choose a sensible coverage factor, associated with a sensible coverage probability: the Maximum Acceptable Coverage Factor (MACF) is introduced as a convenient index to guide this choice. Second, propagation rules for the worst-case uncertainty are provided in matrix and scalar form. It is highlighted that when WCU propagation cannot be computed, the Monte Carlo approach is the only way to obtain a correct expanded uncertainty assessment, in contrast to what can be inferred from the GUM. Third, examples of applications of the formulae to ordinary instruments and measurements are given. An example taken from the GUM is also discussed, underlining some inconsistencies in it.
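
    For a first-order model y = f(x1, ..., xn), the scalar worst-case propagation rule contrasted with the GUM root-sum-square combination can be sketched like this (a generic illustration of the idea, not the paper's matrix formulation; all sensitivities and bounds are hypothetical):

```python
import numpy as np

def combine(c, u):
    """c: sensitivity coefficients dy/dx_i; u: input uncertainties (half-widths).
    Returns (GUM root-sum-square combination, worst-case sum)."""
    c = np.abs(np.asarray(c, dtype=float))
    u = np.asarray(u, dtype=float)
    u_rss = np.sqrt(np.sum((c * u) ** 2))  # GUM combined standard uncertainty
    u_wc = np.sum(c * u)                   # worst case: all errors at their bound, same sign
    return u_rss, u_wc

# y = x1 + x2 + x3, unit sensitivities, equal half-widths of 1
u_rss, u_wc = combine([1.0, 1.0, 1.0], [1.0, 1.0, 1.0])
# Any expanded uncertainty k*u_rss exceeding u_wc is physically meaningless,
# which is what bounds the sensible coverage factor (the idea behind the MACF).
```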

  8. Sustainability and uncertainty

    DEFF Research Database (Denmark)

    Jensen, Karsten Klint

    2007-01-01

    The widely used concept of sustainability is seldom precisely defined, and its clarification involves making up one's mind about a range of difficult questions. One line of research (bottom-up) takes sustaining a system over time as its starting point and then infers prescriptions from this...... requirement. Another line (top-down) takes an economic interpretation of the Brundtland Commission's suggestion that the present generation's need-satisfaction should not compromise the need-satisfaction of future generations as its starting point. It then measures sustainability at the level of society and...... clarified ethical goal, disagreements can arise. At present we do not know what substitutions will be possible in the future. This uncertainty clearly affects the prescriptions that follow from the measure of sustainability. Consequently, decisions about how to make future agriculture sustainable are...

  9. Risk, Uncertainty, and Entrepreneurship

    DEFF Research Database (Denmark)

    Koudstaal, martin; Sloof, Randolph; Van Praag, Mirjam

    2015-01-01

    Theory predicts that entrepreneurs have distinct attitudes toward risk and uncertainty, but empirical evidence is mixed. To better understand these mixed results, we perform a large “lab-in-the-field” experiment comparing entrepreneurs to managers (a suitable comparison group) and employees (n = 21,288). The results indicate that entrepreneurs perceive themselves as less risk averse than managers and employees, in line with common wisdom. However, when using experimental incentivized measures, the differences are subtler. Entrepreneurs are only found to be unique in their lower degree of loss aversion, and not in their risk or ambiguity aversion. This combination of results might be explained by our finding that perceived risk attitude is not only correlated to risk aversion but also to loss aversion. Overall, we therefore suggest using a broader definition of risk that captures this unique...

  10. Uncertainty as Certainty

    Science.gov (United States)

    Petzinger, Tom

    I am trying to make money in the biotech industry from complexity science. And I am doing it with inspiration that I picked up on the edge of Appalachia spending time with June Holley and ACEnet when I was a Wall Street Journal reporter. I took some of those ideas to Pittsburgh, in biotechnology, in a completely private setting with an economic development focus, but also with a mission to return profit to private capital. And we are doing that. I submit as a hypothesis, something we are figuring out in the post-industrial era, that business evolves. It is not the definition of business, but business critically involves the design of systems in which uncertainty is treated as a certainty. That is what I have seen and what I have tried to put into practice.

  11. An uncertainty inventory demonstration - a primary step in uncertainty quantification

    Energy Technology Data Exchange (ETDEWEB)

    Langenbrunner, James R. [Los Alamos National Laboratory; Booker, Jane M [Los Alamos National Laboratory; Hemez, Francois M [Los Alamos National Laboratory; Salazar, Issac F [Los Alamos National Laboratory; Ross, Timothy J [UNM

    2009-01-01

    Tools, methods, and theories for assessing and quantifying uncertainties vary by application. Uncertainty quantification tasks have unique desiderata and circumstances. To realistically assess uncertainty requires the engineer/scientist to specify mathematical models, the physical phenomena of interest, and the theory or framework for assessments. For example, Probabilistic Risk Assessment (PRA) specifically identifies uncertainties using probability theory, and therefore PRAs lack formal procedures for quantifying uncertainties that are not probabilistic. The Phenomena Identification and Ranking Technique (PIRT) proceeds by ranking phenomena using scoring criteria that result in linguistic descriptors, such as importance ranked with words: 'High/Medium/Low.' The use of words allows PIRT to be flexible, but the analysis may then be difficult to combine with other uncertainty theories. We propose that a necessary step in the development of a procedure or protocol for uncertainty quantification (UQ) is the application of an Uncertainty Inventory. An Uncertainty Inventory should be considered and performed in the earliest stages of UQ.

  12. Pragmatic aspects of uncertainty propagation: A conceptual review

    KAUST Repository

    Thacker, W.Carlisle

    2015-09-11

    When quantifying the uncertainty of the response of a computationally costly oceanographic or meteorological model stemming from the uncertainty of its inputs, practicality demands getting the most information using the fewest simulations. It is widely recognized that, by interpolating the results of a small number of simulations, results of additional simulations can be inexpensively approximated to provide a useful estimate of the variability of the response. Even so, as computing the simulations to be interpolated remains the biggest expense, the choice of these simulations deserves attention. When making this choice, two requirements should be considered: (i) the nature of the interpolation and (ii) the available information about input uncertainty. Examples comparing polynomial interpolation and Gaussian process interpolation are presented for three different views of input uncertainty.
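
    The interpolation idea can be sketched with a one-input toy problem; the "expensive" model, the design points, and the input distribution below are all hypothetical stand-ins:

```python
import numpy as np

# Stand-in for a computationally costly model with one uncertain input x
def expensive_model(x):
    return np.exp(-x) * np.sin(3 * x)

# 1) Run only a handful of simulations at chosen design points
design = np.linspace(-1.0, 1.0, 7)
runs = expensive_model(design)

# 2) Interpolate them with a polynomial surrogate (degree 6 through 7 points)
coeffs = np.polynomial.polynomial.polyfit(design, runs, deg=6)

# 3) Sample the input's distribution and evaluate the cheap surrogate instead
rng = np.random.default_rng(1)
x_samples = rng.normal(0.0, 0.3, size=100_000)
y_samples = np.polynomial.polynomial.polyval(x_samples, coeffs)
mean, std = y_samples.mean(), y_samples.std()
```

    A Gaussian process surrogate would replace step 2 and additionally supply an estimate of its own interpolation error; the choice of design points in step 1 is exactly what the abstract argues deserves attention.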

  13. Uncertainty analysis in the task of individual monitoring data

    International Nuclear Information System (INIS)

    Assessment of internal doses is an essential component of individual monitoring programmes for workers and consists of two stages: individual monitoring measurements and interpretation of the monitoring data in terms of annual intake and/or annual internal dose. The overall uncertainty in the assessed dose is a combination of the uncertainties in these stages. An algorithm and a computer code were developed for estimating the uncertainty in the assessment of internal dose in the task of individual monitoring data interpretation. Two main influencing factors are analysed in this paper: the unknown time of the exposure and the variability of bioassay measurements. The aim of this analysis is to show that the algorithm is applicable in designing an individual monitoring programme for workers so as to guarantee that the individual dose calculated from individual monitoring measurements does not exceed a required limit with a certain confidence probability. (author)

  14. Scientific visualization uncertainty, multifield, biomedical, and scalable visualization

    CERN Document Server

    Chen, Min; Johnson, Christopher; Kaufman, Arie; Hagen, Hans

    2014-01-01

    Based on the seminar that took place in Dagstuhl, Germany in June 2011, this contributed volume studies four important topics within the scientific visualization field: uncertainty visualization, multifield visualization, biomedical visualization and scalable visualization. • Uncertainty visualization deals with uncertain data from simulations or sampled data, uncertainty due to the mathematical processes operating on the data, and uncertainty in the visual representation, • Multifield visualization addresses the need to depict multiple data at individual locations and the combination of multiple datasets, • Biomedical visualization is a vast field with select subtopics addressed from scanning methodologies to structural applications to biological applications, • Scalability in scientific visualization is critical as data grows and computational devices range from hand-held mobile devices to exascale computational platforms. Scientific Visualization will be useful to practitioners of scientific visualization, ...

  15. Comprehensive Approach to Verification and Validation of CFD Simulations Applied to Backward Facing Step-Application of CFD Uncertainty Analysis

    Science.gov (United States)

    Groves, Curtis E.; LLie, Marcel; Shallhorn, Paul A.

    2012-01-01

    There are inherent uncertainties and errors associated with using Computational Fluid Dynamics (CFD) to predict the flow field, and there is no standard method for evaluating uncertainty in the CFD community. This paper describes an approach to validate the uncertainty in using CFD. The method will use the state-of-the-art uncertainty analysis applying different turbulence models and draw conclusions on which models provide the least uncertainty and which models most accurately predict the flow of a backward facing step.

  16. Summary from the epistemic uncertainty workshop: consensus amid diversity

    International Nuclear Information System (INIS)

    The 'Epistemic Uncertainty Workshop' sponsored by Sandia National Laboratories was held in Albuquerque, New Mexico, on 6-7 August 2002. The workshop was organized around a set of Challenge Problems involving both epistemic and aleatory uncertainty that the workshop participants were invited to solve and discuss. This concluding article in a special issue of Reliability Engineering and System Safety based on the workshop discusses the intent of the Challenge Problems, summarizes some discussions from the workshop, and provides a technical comparison among the papers in this special issue. The Challenge Problems were computationally simple models that were intended as vehicles for the illustration and comparison of conceptual and numerical techniques for use in analyses that involve: (i) epistemic uncertainty, (ii) aggregation of multiple characterizations of epistemic uncertainty, (iii) combination of epistemic and aleatory uncertainty, and (iv) models with repeated parameters. There was considerable diversity of opinion at the workshop about both methods and fundamental issues, and yet substantial consensus about what the answers to the problems were, and even about how each of the four issues should be addressed. Among the technical approaches advanced were probability theory, Dempster-Shafer evidence theory, random sets, sets of probability measures, imprecise coherent probabilities, coherent lower previsions, probability boxes, possibility theory, fuzzy sets, joint distribution tableaux, polynomial chaos expansions, and info-gap models. Although some participants maintained that a purely probabilistic approach is fully capable of accounting for all forms of uncertainty, most agreed that the treatment of epistemic uncertainty introduces important considerations and that the issues underlying the Challenge Problems are legitimate and significant. Topics identified as meriting additional research include elicitation of uncertainty representations, aggregation of

  17. Project scheduling under uncertainty: survey and research potentials.

    OpenAIRE

    Herroelen, Willy; Leus, Roel

    2005-01-01

    The vast majority of the research efforts in project scheduling assume complete information about the scheduling problem to be solved and a static deterministic environment within which the pre-computed baseline schedule will be executed. However, in the real world, project activities are subject to considerable uncertainty, which is gradually resolved during project execution. In this survey we review the fundamental approaches for scheduling under uncertainty: reactive scheduling, stochasti...

  18. The Uncertainties of Risk Management

    DEFF Research Database (Denmark)

    Vinnari, Eija; Skærbæk, Peter

    2014-01-01

    Purpose – The purpose of this paper is to analyse the implementation of risk management as a tool for internal audit activities, focusing on unexpected effects or uncertainties generated during its application. Design/methodology/approach – Public and confidential documents as well as semi-structured interviews are analysed through the lens of actor-network theory to identify the effects of risk management devices in a Finnish municipality. Findings – The authors found that risk management, rather than reducing uncertainty, itself created unexpected uncertainties that would otherwise not have emerged....... These include uncertainties relating to legal aspects of risk management solutions, in particular the issue concerning which types of document are considered legally valid; uncertainties relating to the definition and operationalisation of risk management; and uncertainties relating to the resources available...

  19. Thermodynamics of Black Holes and the Symmetric Generalized Uncertainty Principle

    Science.gov (United States)

    Dutta, Abhijit; Gangopadhyay, Sunandan

    2016-06-01

    In this paper, we have investigated the thermodynamics of Schwarzschild and Reissner-Nordström black holes using the symmetric generalised uncertainty principle, which contains correction terms involving momentum and position uncertainty. The mass-temperature relationship and the heat capacity for these black holes have been computed, from which the critical and remnant masses have been obtained. The entropy is found to satisfy the area law up to leading-order logarithmic corrections and corrections of the form A² (which is a new finding in this paper) from the symmetric generalised uncertainty principle.

  20. Uncertainty-like relations of the relative entropy of coherence

    Science.gov (United States)

    Liu, Feng; Li, Fei; Chen, Jun; Xing, Wei

    2016-08-01

    Quantum coherence is an important physical resource in quantum computation and quantum information processing. In this paper, we first obtain an uncertainty-like expression relating the coherences contained in the corresponding local subsystems of a bipartite quantum system. This uncertainty-like inequality shows that the larger the coherence of one subsystem, the less coherence is contained in the other subsystem. Further, we discuss in detail the uncertainty-like relation among three single-partite quantum systems. We show that the coherence contained in a pure tripartite quantum system is greater than the sum of the coherences of all local subsystems.

  1. Inflow uncertainty in hydropower markets

    OpenAIRE

    Hansen, Petter Vegard

    2007-01-01

    Abstract: In order to analyse the consequences of uncertainty for prices and efficiency in a hydropower system, we apply a two-period model with uncertainty in water inflow. We study three different market structures, perfect competition, monopoly and oligopoly and stress the importance of the shape of the demand function under different distributions of water inflow. The uncertainty element creates possibilities of exercising market power depending on the distribution of uncer...

  2. Decommissioning funding: ethics, implementation, uncertainties

    International Nuclear Information System (INIS)

    This status report on Decommissioning Funding: Ethics, Implementation, Uncertainties also draws on the experience of the NEA Working Party on Decommissioning and Dismantling (WPDD). The report offers, in a concise form, an overview of relevant considerations on decommissioning funding mechanisms with regard to ethics, implementation and uncertainties. Underlying ethical principles found in international agreements are identified, and factors influencing the accumulation and management of funds for decommissioning nuclear facilities are discussed together with the main sources of uncertainties of funding systems. (authors)

  4. Economic recovery and policy uncertainty

    OpenAIRE

    Baker, Scott R.; Davis, Steven J.; Bloom, Nick; Van Reenen, John

    2012-01-01

    Until some political mechanism creates incentives to elect moderate representatives who can reach across the ideological divide, the US seems destined to heightened levels of policy uncertainty for many years to come. Some research suggests that such uncertainty, particularly over economic policy, partly explains the sluggish nature of the recovery in America – and - according to one recent study, restoring policy uncertainty to levels that prevailed before the financial crisis would raise em...

  5. The Uncertainty of Measurement Results

    International Nuclear Information System (INIS)

    Factors affecting the uncertainty of measurement are explained, basic statistical formulae given, and the theoretical concept explained in the context of pesticide formulation analysis. Practical guidance is provided on how to determine individual uncertainty components within an analytical procedure. An extended and comprehensive table containing the relevant mathematical/statistical expressions elucidates the relevant underlying principles. Appendix I provides a practical elaborated example on measurement uncertainty estimation, above all utilizing experimental repeatability and reproducibility laboratory data. (author)
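
    The combined standard uncertainty discussed above follows the GUM law of propagation of uncertainty, u_c(y)² = Σ (∂f/∂x_i)² u(x_i)². A minimal numeric sketch, with sensitivity coefficients estimated by central differences; the measurement function and the figures are hypothetical, not taken from the referenced appendix:

```python
import numpy as np

def combined_uncertainty(f, x, u, h=1e-6):
    """GUM law of propagation: u_c(y)^2 = sum_i (df/dx_i)^2 * u(x_i)^2,
    with sensitivity coefficients estimated by central differences."""
    x = np.asarray(x, dtype=float)
    var = 0.0
    for i, ui in enumerate(u):
        dx = np.zeros_like(x)
        dx[i] = h
        ci = (f(x + dx) - f(x - dx)) / (2 * h)  # sensitivity coefficient df/dx_i
        var += (ci * ui) ** 2
    return np.sqrt(var)

# Hypothetical example: analyte content c = m_analyte / m_sample (masses in mg)
f = lambda x: x[0] / x[1]
u_c = combined_uncertainty(f, x=[5.0, 100.0], u=[0.05, 0.5])
```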

  6. On Uncertainty Quantification in Particle Accelerators Modelling

    CERN Document Server

    Adelmann, Andreas

    2015-01-01

    Using a cyclotron-based model problem, we demonstrate for the first time the applicability and usefulness of an uncertainty quantification (UQ) approach in order to construct surrogate models for quantities such as emittance and energy spread, but also the halo parameter, and construct a global sensitivity analysis together with error propagation and $L_{2}$ error analysis. The model problem is selected in a way that it represents a template for general high-intensity particle accelerator modelling tasks. The presented physics problem has to be seen as hypothetical, with the aim of demonstrating the usefulness and applicability of the presented UQ approach, and not of solving a particular problem. The proposed UQ approach is based on sparse polynomial chaos expansions and relies on a small number of high-fidelity particle accelerator simulations. Within this UQ framework, the identification of the most important uncertainty sources is achieved by performing a global sensitivity analysis via computing the so-called Sobol' ...

  7. PIV uncertainty quantification from correlation statistics

    International Nuclear Information System (INIS)

    The uncertainty of a PIV displacement field is estimated using a generic post-processing method based on statistical analysis of the correlation process using differences in the intensity pattern in the two images. First the second image is dewarped back onto the first one using the computed displacement field which provides two almost perfectly matching images. Differences are analyzed regarding the effect of shifting the peak of the correlation function. A relationship is derived between the standard deviation of intensity differences in each interrogation window and the expected asymmetry of the correlation peak, which is then converted to the uncertainty of a displacement vector. This procedure is tested with synthetic data for various types of noise and experimental conditions (pixel noise, out-of-plane motion, seeding density, particle image size, etc) and is shown to provide an accurate estimate of the true error. (paper)

  8. Uncertainty quantification in resonance absorption

    International Nuclear Information System (INIS)

    We assess the uncertainty in the resonance escape probability due to uncertainty in the neutron and radiation line widths for the first 21 resonances in 232Th as given by . Simulation, quadrature and polynomial chaos methods are used and the resonance data are assumed to obey a beta distribution. We find the uncertainty in the total resonance escape probability to be the equivalent, in reactivity, of 75–130 pcm. Also shown are pdfs of the resonance escape probability for each resonance and the variation of the uncertainty with temperature. The viability of the polynomial chaos expansion method is clearly demonstrated.
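
    The sampling step of such a simulation approach can be sketched as follows. The width values, the beta parameters, and the toy escape-probability mapping are illustrative placeholders, not the paper's 232Th data:

```python
import numpy as np

rng = np.random.default_rng(3)
N = 100_000

def beta_on(lo, hi, a=2.0, b=2.0, size=None):
    """Beta(a, b) sample rescaled to the compact interval [lo, hi]."""
    return lo + (hi - lo) * rng.beta(a, b, size=size)

# Hypothetical neutron and radiation widths (eV), each known to +/-10%
gn = beta_on(0.9 * 2.0e-3, 1.1 * 2.0e-3, size=N)
gg = beta_on(0.9 * 2.5e-2, 1.1 * 2.5e-2, size=N)

# The resonance integral scales with gn*gg/(gn+gg); map it to a toy
# escape probability normalized so the mean resonance effect is ~5%
I = gn * gg / (gn + gg)
p = np.exp(-I / I.mean() * 0.05)

mean_p, std_p = p.mean(), p.std()   # spread of the escape probability
```

    In the paper, quadrature and polynomial chaos expansions replace this brute-force sampling; the beta distribution is retained because the line widths have compact support.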

  9. Uncertainty analysis of two-phase flow pressure drop calculations

    Energy Technology Data Exchange (ETDEWEB)

    Siqueira, Cezar A.M.; Costa, Bruno M.P.; Fonseca Junior, Roberto da; Gonçalves, Marcelo de A.L. [PETROBRAS, Rio de Janeiro, RJ (Brazil)

    2004-07-01

    The simulation of multiphase flow in pipes is usually performed by petroleum engineers with two main purposes: design of new pipelines and production systems, and diagnosis of flow assurance problems in existing systems. The tools used for this calculation are computer codes that use published pressure drop correlations developed for steady-state two-phase flow, such as Hagedorn-Brown, Beggs and Brill, and others. Each one of these correlations is best suited for a given situation, and the engineer must find out the best option for each particular case, based on his experience. In order to select the best correlation to use and to analyze the results of the calculation, the engineer must determine the reliability of computed values. The uncertainty of the computation is obtained by considering the uncertainties of the correlation adopted, of the calculation algorithm and of the input data. This paper proposes a method to evaluate the uncertainties of this type of calculation and presents an analysis of these uncertainties. The uncertainty analysis also allows the identification of the parameters that are most significant for the final uncertainty of the simulation. It therefore makes it possible to determine which input parameters must be determined with higher accuracy and which may have lower accuracy, without reducing the reliability of the results. (author)
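
    A Monte Carlo version of such an analysis can be sketched with a single-phase Darcy-Weisbach stand-in (the actual two-phase correlations such as Beggs and Brill are far more involved, and all input distributions below are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(42)
N = 200_000

# Uncertain inputs (hypothetical distributions)
rho = rng.normal(850.0, 10.0, N)    # fluid density, kg/m^3
v   = rng.normal(2.0, 0.1, N)       # mean velocity, m/s
D   = rng.normal(0.10, 0.001, N)    # pipe diameter, m
fD  = rng.normal(0.02, 0.002, N)    # Darcy friction factor (stands in for correlation error)
L   = 1000.0                        # pipe length, m, assumed exact

# Darcy-Weisbach pressure drop for each sampled input set
dp = fD * (L / D) * 0.5 * rho * v**2   # Pa

mean, std = dp.mean(), dp.std()
# Rank inputs by influence: |correlation| of each input with the output
contrib = {name: abs(np.corrcoef(x, dp)[0, 1])
           for name, x in [("rho", rho), ("v", v), ("D", D), ("fD", fD)]}
```

    Ranking the `contrib` values identifies which inputs dominate the output uncertainty and therefore deserve more accurate determination, which is exactly the use the abstract describes.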

  10. Spatial data uncertainty management

    Czech Academy of Sciences Publication Activity Database

    Klimešová, Dana

    B4/37, č. 4 (2008), s. 209-213. ISSN 1682-1750 Institutional research plan: CEZ:AV0Z10750506 Keywords : Knowledge * Classification * Knowledge Management * Contextual Modelling * Temporal Modelling * Decision Support Subject RIV: IN - Informatics, Computer Science

  11. Are models, uncertainty, and dispute resolution compatible?

    Science.gov (United States)

    Anderson, J. D.; Wilson, J. L.

    2013-12-01

    Models and their uncertainty often move from an objective use in planning and decision making into the regulatory environment, then sometimes on to dispute resolution through litigation or other legal forums. Through this last transition whatever objectivity the models and uncertainty assessment may have once possessed becomes biased (or more biased) as each party chooses to exaggerate either the goodness of a model, or its worthlessness, depending on which view is in its best interest. If worthlessness is desired, then what was uncertain becomes unknown, or even unknowable. If goodness is desired, then precision and accuracy are often exaggerated and uncertainty, if it is explicitly recognized, encompasses only some parameters or conceptual issues, ignores others, and may minimize the uncertainty that it accounts for. In dispute resolution, how well is the adversarial process able to deal with these biases? The challenge is that they are often cloaked in computer graphics and animations that appear to lend realism to what could be mostly fancy, or even a manufactured outcome. While junk science can be challenged through appropriate motions in federal court, and in most state courts, it is not unusual for biased or even incorrect modeling results, or conclusions based on incorrect results, to be permitted to be presented at trial. Courts allow opinions that are based on a "reasonable degree of scientific certainty," but when that 'certainty' is grossly exaggerated by an expert, one way or the other, how well do the courts determine that someone has stepped over the line? Trials are based on the adversary system of justice, so opposing and often irreconcilable views are commonly allowed, leaving it to the judge or jury to sort out the truth. Can advances in scientific theory and engineering practice, related to both modeling and uncertainty, help address this situation and better ensure that juries and judges see more objective modeling results, or at least see

  12. Uncertainty Evaluation of Best Estimate Calculation Results

    International Nuclear Information System (INIS)

    Efforts are underway in Germany to perform analysis using best estimate computer codes and to include uncertainty evaluation in licensing. The German Reactor Safety Commission (RSK) issued a recommendation to perform uncertainty analysis in loss of coolant accident safety analyses (LOCA), recently. A more general requirement is included in a draft revision of the German Nuclear Regulation which is an activity of the German Ministry of Environment and Reactor Safety (BMU). According to the recommendation of the German RSK to perform safety analyses for LOCA in licensing the following deterministic requirements have still to be applied: Most unfavourable single failure, Unavailability due to preventive maintenance, Break location, Break size and break type, Double ended break, 100 percent through 200 percent, Large, medium and small break, Loss of off-site power, Core power (at accident initiation the most unfavourable conditions and values have to be assumed which may occur under normal operation taking into account the set-points of integral power and power density control. Measurement and calibration errors can be considered statistically), Time of fuel cycle. Analysis using best estimate codes with evaluation of uncertainties is the only way to quantify conservatisms with regard to code models and uncertainties of plant, fuel parameters and decay heat. This is especially the case for approaching licensing limits, e.g. due to power up-rates, higher burn-up and higher enrichment. Broader use of best estimate analysis is therefore envisaged in the future. Since some deterministic unfavourable assumptions regarding availability of NPP systems are still used, some conservatism in best-estimate analyses remains. Methods of uncertainty analyses have been developed and applied by the vendor Framatome ANP as well as by GRS in Germany. The GRS development was sponsored by the German Ministry of Economy and Labour (BMWA). (author)

  13. Climate model uncertainty vs. conceptual geological uncertainty in hydrological modeling

    Directory of Open Access Journals (Sweden)

    T. O. Sonnenborg

    2015-04-01

    Projections of climate change impact are associated with a cascade of uncertainties including CO2 emission scenario, climate model, downscaling and impact model. The relative importance of the individual uncertainty sources is expected to depend on several factors including the quantity that is projected. In the present study the impacts of climate model uncertainty and geological model uncertainty on hydraulic head, stream flow, travel time and capture zones are evaluated. Six versions of a physically based and distributed hydrological model, each containing a unique interpretation of the geological structure of the model area, are forced by 11 climate model projections. Each projection of future climate is a result of a GCM-RCM model combination (from the ENSEMBLES project) forced by the same CO2 scenario (A1B). The changes from the reference period (1991–2010) to the future period (2081–2100) in projected hydrological variables are evaluated and the effects of geological model and climate model uncertainties are quantified. The results show that uncertainty propagation is context dependent. While the geological conceptualization is the dominating uncertainty source for projection of travel time and capture zones, the uncertainty due to the climate models is more important for groundwater hydraulic heads and stream flow.

  14. Climate model uncertainty versus conceptual geological uncertainty in hydrological modeling

    Science.gov (United States)

    Sonnenborg, T. O.; Seifert, D.; Refsgaard, J. C.

    2015-09-01

    Projections of climate change impact are associated with a cascade of uncertainties including in CO2 emission scenarios, climate models, downscaling and impact models. The relative importance of the individual uncertainty sources is expected to depend on several factors including the quantity that is projected. In the present study the impacts of climate model uncertainty and geological model uncertainty on hydraulic head, stream flow, travel time and capture zones are evaluated. Six versions of a physically based and distributed hydrological model, each containing a unique interpretation of the geological structure of the model area, are forced by 11 climate model projections. Each projection of future climate is a result of a GCM-RCM model combination (from the ENSEMBLES project) forced by the same CO2 scenario (A1B). The changes from the reference period (1991-2010) to the future period (2081-2100) in projected hydrological variables are evaluated and the effects of geological model and climate model uncertainties are quantified. The results show that uncertainty propagation is context-dependent. While the geological conceptualization is the dominating uncertainty source for projection of travel time and capture zones, the uncertainty due to the climate models is more important for groundwater hydraulic heads and stream flow.

  15. Optimizing Integrated Terminal Airspace Operations Under Uncertainty

    Science.gov (United States)

    Bosson, Christabelle; Xue, Min; Zelinski, Shannon

    2014-01-01

    In the terminal airspace, integrated departures and arrivals have the potential to increase operations efficiency. Recent research has developed genetic-algorithm-based schedulers for integrated arrival and departure operations under uncertainty. This paper presents an alternate method using a machine job-shop scheduling formulation to model the integrated airspace operations. A multistage stochastic programming approach is chosen to formulate the problem and candidate solutions are obtained by solving sample average approximation problems with finite sample size. Because approximate solutions are computed, the proposed algorithm incorporates the computation of statistical bounds to estimate the optimality of the candidate solutions. A proof-of-concept study is conducted on a baseline implementation of a simple problem considering a fleet mix of 14 aircraft evolving in a model of the Los Angeles terminal airspace. A more thorough statistical analysis is also performed to evaluate the impact of the number of scenarios considered in the sampled problem. To handle extensive sampling computations, a multithreading technique is introduced.
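
    The sample average approximation idea can be illustrated with a one-variable scheduling toy, a newsvendor-style problem; the costs, the ready-time distribution, and the grid search are hypothetical simplifications of the paper's job-shop formulation:

```python
import numpy as np

rng = np.random.default_rng(7)

# One aircraft must be assigned a scheduled slot time t; its actual ready
# time is uncertain. Waiting (slot later than ready time) is assumed
# cheaper than delay (slot earlier than ready time). Numbers illustrative.
ready = rng.lognormal(mean=np.log(10.0), sigma=0.3, size=500)  # sampled scenarios

def expected_cost(t, ready, c_wait=1.0, c_delay=4.0):
    wait = np.maximum(t - ready, 0.0)    # aircraft ready before its slot
    delay = np.maximum(ready - t, 0.0)   # aircraft ready after its slot
    return np.mean(c_wait * wait + c_delay * delay)  # empirical mean over scenarios

# Sample average approximation: minimize the empirical mean cost over a grid
grid = np.linspace(5.0, 20.0, 301)
costs = [expected_cost(t, ready) for t in grid]
t_star = grid[int(np.argmin(costs))]
```

    Because the empirical mean is computed from a finite sample, the minimizer is itself approximate; the paper's statistical bounds quantify how far such a candidate solution can be from the true stochastic optimum.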

  16. Climate Certainties and Uncertainties

    International Nuclear Information System (INIS)

    In issue 380 of Futuribles in December 2011, Antonin Pottier analysed in detail the workings of what is today termed 'climate scepticism' - namely the propensity of certain individuals to contest the reality of climate change on the basis of pseudo-scientific arguments. He emphasized particularly that what fuels the debate on climate change is, largely, the degree of uncertainty inherent in the consequences to be anticipated from observation of the facts, not the description of the facts itself. In his view, the main aim of climate sceptics is to block the political measures for combating climate change. However, since they do not admit to this political posture, they choose instead to deny the scientific reality. This month, Futuribles complements this socio-psychological analysis of climate-sceptical discourse with an - in this case, wholly scientific - analysis of what we know (or do not know) about climate change on our planet. Pierre Morel gives a detailed account of the state of our knowledge in the climate field and what we are able to predict in the medium/long-term. After reminding us of the influence of atmospheric meteorological processes on the climate, he specifies the extent of global warming observed since 1850 and the main origin of that warming, as revealed by the current state of knowledge: the increase in the concentration of greenhouse gases. He then describes the changes in meteorological regimes (showing also the limits of climate simulation models), the modifications of hydrological regimes, and also the prospects for rises in sea levels. He also specifies the mechanisms that may potentially amplify all these phenomena and the climate disasters that might ensue. Lastly, he identifies the scientific data that cannot be disregarded and whose consequences are now inescapable (melting of the ice-caps, rises in sea level etc.), the only remaining uncertainty in this connection being the date at which these things will happen. 'In this

  17. Uncertainty quantification of neutronics characteristics using Latin hypercube sampling method

    International Nuclear Information System (INIS)

    The Latin Hypercube Sampling (LHS) method is applied to the uncertainty quantification of neutronics characteristics based on the Monte-Carlo based sampling method. The Monte-Carlo based sampling method is one of the uncertainty quantification methods for neutronics characteristics (e.g. the neutron multiplication factor) due to cross-section uncertainties. The sampling method has the advantage that burnup and thermal-hydraulics feedback effects are easily considered in complicated light water reactor core analysis. Conversely, the sampling method has the disadvantage that statistical errors are involved in the estimated uncertainties of neutronics characteristics, since the sampling method is a probabilistic approach. Although the statistical errors can be reduced by increasing the number of samples (e.g. the number of perturbed cross section libraries), the computational cost also increases. Therefore the development of an efficient uncertainty quantification method, which provides a smaller statistical error in the estimated uncertainty while reducing the number of samples, is highly desirable. In the present study, the adequacy and performance of the LHS method are investigated as an efficient Monte-Carlo based sampling method. Uncertainty quantification of the multiplication factor for a BWR fuel assembly is carried out with the LHS method. The results indicate that uncertainty quantification using the LHS method is more efficient than that using the conventional sampling method, which utilizes simple random sampling of cross sections. (author)
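
    The LHS construction itself can be sketched in a few lines of pure Python: each dimension of the unit hypercube is split into n equal strata and every stratum receives exactly one sample. The toy response surface for the multiplication factor below is invented, not a reactor model.

```python
import random

def latin_hypercube(n, d, rng):
    """n points in [0,1)^d with exactly one point per equal-width
    stratum along every dimension (the defining LHS property)."""
    pts = [[0.0] * d for _ in range(n)]
    for dim in range(d):
        strata = list(range(n))
        rng.shuffle(strata)                # random pairing of strata across dims
        for i, s in enumerate(strata):
            pts[i][dim] = (s + rng.random()) / n
    return pts

# Propagate an illustrative 2-parameter cross-section perturbation through
# a toy linear response surface for the multiplication factor.
rng = random.Random(42)
samples = latin_hypercube(200, 2, rng)
k_eff = [1.0 + 0.05 * (x - 0.5) - 0.02 * (y - 0.5) for x, y in samples]
k_mean = sum(k_eff) / len(k_eff)
```

    Because every stratum is hit exactly once, the sample mean of each input is already close to 0.5 at small n, which is where the variance reduction over simple random sampling comes from.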

  18. Strain gauge measurement uncertainties on hydraulic turbine runner blade

    International Nuclear Information System (INIS)

    Strains experimentally measured with strain gauges can differ from those evaluated using the Finite Element (FE) method. This difference is due mainly to the assumptions and uncertainties inherent to each method. To circumvent this difficulty, we developed a numerical method based on Monte Carlo simulations to evaluate measurement uncertainties produced by the behaviour of a unidirectional welded gauge, its position uncertainty and its integration effect. This numerical method uses the displacement fields of the studied part evaluated by an FE analysis. The paper presents a study case using in situ data measured on a hydraulic turbine runner. The FE analysis of the turbine runner blade was performed, and our numerical method was used to evaluate uncertainties on strains measured at five locations with welded strain gauges. The measured strains and their uncertainty ranges were then compared to the FE-estimated strains. The uncertainty ranges obtained extended from 74 με to 165 με. Furthermore, the biases observed between the median of the uncertainty ranges and the FE strains varied from −36 to 36 με. Note that strain gauge measurement uncertainties depend mainly on displacement fields and gauge geometry.
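
    The Monte Carlo treatment of the gauge position uncertainty can be sketched as follows; a hypothetical one-dimensional strain field stands in for the FE displacement fields, and the position sigma is invented.

```python
import math, random

def strain_at(x):
    """Hypothetical strain field along the blade (microstrain), standing
    in for a strain computed from the FE displacement field."""
    return 150.0 * math.cos(4.0 * x)

rng = random.Random(1)
x0, sigma_pos = 0.30, 0.01          # nominal gauge position and its 1-sigma (m)
draws = sorted(strain_at(rng.gauss(x0, sigma_pos)) for _ in range(20000))
lo, hi = draws[500], draws[19499]   # empirical ~95% uncertainty range
```

    The spread (lo, hi) is the position-induced contribution to the measurement uncertainty; the paper additionally samples the gauge behaviour and integration effect in the same loop.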

  19. Quantifying and Reducing Curve-Fitting Uncertainty in Isc

    Energy Technology Data Exchange (ETDEWEB)

    Campanelli, Mark; Duck, Benjamin; Emery, Keith

    2015-06-14

    Current-voltage (I-V) curve measurements of photovoltaic (PV) devices are used to determine performance parameters and to establish traceable calibration chains. Measurement standards specify localized curve fitting methods, e.g., straight-line interpolation/extrapolation of the I-V curve points near short-circuit current, Isc. By considering such fits as statistical linear regressions, uncertainties in the performance parameters are readily quantified. However, the legitimacy of such a computed uncertainty requires that the model be a valid (local) representation of the I-V curve and that the noise be sufficiently well characterized. Using more data points often has the advantage of lowering the uncertainty. However, more data points can make the uncertainty in the fit arbitrarily small, and this fit uncertainty misses the dominant residual uncertainty due to so-called model discrepancy. Using objective Bayesian linear regression for straight-line fits for Isc, we investigate an evidence-based method to automatically choose data windows of I-V points with reduced model discrepancy. We also investigate noise effects. Uncertainties, aligned with the Guide to the Expression of Uncertainty in Measurement (GUM), are quantified throughout.
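
    The baseline the abstract starts from is a straight-line regression near short circuit with a Type A standard error for the intercept. The sketch below is that classical GUM-style fit, not the paper's objective Bayesian method, and the I-V points are invented.

```python
import math

def isc_fit(V, I):
    """Straight-line fit I = a + b*V near short circuit; returns the
    intercept (estimated Isc), its standard error, and the slope."""
    n = len(V)
    vbar, ibar = sum(V) / n, sum(I) / n
    sxx = sum((v - vbar) ** 2 for v in V)
    b = sum((v - vbar) * (i - ibar) for v, i in zip(V, I)) / sxx
    a = ibar - b * vbar
    s2 = sum((i - a - b * v) ** 2 for v, i in zip(V, I)) / (n - 2)  # residual variance
    return a, math.sqrt(s2 * (1.0 / n + vbar ** 2 / sxx)), b

# Invented I-V points close to short circuit (V in volts, I in amperes).
V = [0.00, 0.02, 0.04, 0.06, 0.08]
I = [5.000, 4.996, 4.991, 4.987, 4.982]
isc, u_isc, slope = isc_fit(V, I)
```

    As the abstract warns, this standard error only reflects scatter about the assumed line; it says nothing about model discrepancy if the window extends into curved parts of the I-V characteristic.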

  20. Spatial Uncertainty Model for Visual Features Using a Kinect™ Sensor

    Directory of Open Access Journals (Sweden)

    Jae-Han Park

    2012-06-01

    This study proposes a mathematical uncertainty model for the spatial measurement of visual features using Kinect™ sensors. This model can provide qualitative and quantitative analysis for the utilization of Kinect™ sensors as 3D perception sensors. In order to achieve this objective, we derived the propagation relationship of the uncertainties between the disparity image space and the real Cartesian space with the mapping function between the two spaces. Using this propagation relationship, we obtained the mathematical model for the covariance matrix of the measurement error, which represents the uncertainty for spatial position of visual features from Kinect™ sensors. In order to derive the quantitative model of spatial uncertainty for visual features, we estimated the covariance matrix in the disparity image space using collected visual feature data. Further, we computed the spatial uncertainty information by applying the covariance matrix in the disparity image space and the calibrated sensor parameters to the proposed mathematical model. This spatial uncertainty model was verified by comparing the uncertainty ellipsoids for spatial covariance matrices and the distribution of scattered matching visual features. We expect that this spatial uncertainty model and its analyses will be useful in various Kinect™ sensor applications.
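
    The propagation step can be sketched with the usual first-order rule Σ_xyz = J Σ_img Jᵀ. The pinhole/disparity mapping and the Kinect-like intrinsics below are assumptions for illustration; the paper uses its own calibrated parameters and mapping function.

```python
def propagate_covariance(J, S):
    """First-order propagation: Sigma_out = J * Sigma_in * J^T (3x3 lists)."""
    def matmul(A, B):
        return [[sum(A[i][k] * B[k][j] for k in range(3))
                 for j in range(3)] for i in range(3)]
    JT = [[J[j][i] for j in range(3)] for i in range(3)]
    return matmul(matmul(J, S), JT)

# Assumed disparity model: z = f*b/d, x = (u-cx)*z/f, y = (v-cy)*z/f.
f, b = 580.0, 0.075           # focal length (px) and baseline (m) -- illustrative
cx, cy = 320.0, 240.0         # principal point (px) -- illustrative
u, v, d = 400.0, 300.0, 30.0  # a feature's pixel coordinates and disparity

z = f * b / d
J = [[z / f, 0.0, -(u - cx) * z / (f * d)],   # d(x)/d(u,v,d)
     [0.0, z / f, -(v - cy) * z / (f * d)],   # d(y)/d(u,v,d)
     [0.0, 0.0, -z / d]]                      # d(z)/d(u,v,d)
S_img = [[0.5, 0.0, 0.0],
         [0.0, 0.5, 0.0],
         [0.0, 0.0, 0.25]]    # assumed pixel/disparity covariance
S_xyz = propagate_covariance(J, S_img)
```

    The resulting 3x3 covariance defines the uncertainty ellipsoid for the feature's Cartesian position; note how the depth variance scales with z²/d², so uncertainty grows rapidly with range.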

  1. Discussion of OECD LWR Uncertainty Analysis in Modelling Benchmark

    International Nuclear Information System (INIS)

    The demand for best estimate calculations in nuclear reactor design and safety evaluations has increased in recent years. Uncertainty quantification has been highlighted as part of the best estimate calculations. The modelling aspects of uncertainty and sensitivity analysis are to be further developed and validated on scientific grounds in support of their performance and application to multi-physics reactor simulations. The Organization for Economic Co-operation and Development (OECD) / Nuclear Energy Agency (NEA) Nuclear Science Committee (NSC) has endorsed the creation of an Expert Group on Uncertainty Analysis in Modelling (EGUAM). Within the framework of activities of EGUAM/NSC the OECD/NEA initiated the Benchmark for Uncertainty Analysis in Modelling for Design, Operation, and Safety Analysis of Light Water Reactor (OECD LWR UAM benchmark). The general objective of the benchmark is to propagate the predictive uncertainties of code results through complex coupled multi-physics and multi-scale simulations. The benchmark is divided into three phases, with Phase I highlighting the uncertainty propagation in stand-alone neutronics calculations, while Phases II and III focus on uncertainty analysis of the reactor core and system, respectively. This paper discusses the progress made in Phase I calculations, the specifications for Phase II, and the upcoming challenges in defining the Phase III exercises. The challenges of applying uncertainty quantification to complex code systems, in particular to time-dependent coupled physics models, are the large computational burden and the utilization of non-linear models (expected due to the physics coupling). (authors)

  2. Quantifying and Reducing Curve-Fitting Uncertainty in Isc: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Campanelli, Mark; Duck, Benjamin; Emery, Keith

    2015-09-28

    Current-voltage (I-V) curve measurements of photovoltaic (PV) devices are used to determine performance parameters and to establish traceable calibration chains. Measurement standards specify localized curve fitting methods, e.g., straight-line interpolation/extrapolation of the I-V curve points near short-circuit current, Isc. By considering such fits as statistical linear regressions, uncertainties in the performance parameters are readily quantified. However, the legitimacy of such a computed uncertainty requires that the model be a valid (local) representation of the I-V curve and that the noise be sufficiently well characterized. Using more data points often has the advantage of lowering the uncertainty. However, more data points can make the uncertainty in the fit arbitrarily small, and this fit uncertainty misses the dominant residual uncertainty due to so-called model discrepancy. Using objective Bayesian linear regression for straight-line fits for Isc, we investigate an evidence-based method to automatically choose data windows of I-V points with reduced model discrepancy. We also investigate noise effects. Uncertainties, aligned with the Guide to the Expression of Uncertainty in Measurement (GUM), are quantified throughout.

  3. Housing Uncertainty and Childhood Impatience

    Science.gov (United States)

    Anil, Bulent; Jordan, Jeffrey L.; Zahirovic-Herbert, Velma

    2011-01-01

    The study demonstrates a direct link between housing uncertainty and children's time preferences, or patience. We show that students who face housing uncertainties through mortgage foreclosures and eviction learn impatient behavior and are therefore at greater risk of making poor intertemporal choices such as dropping out of school. We find that…

  4. Uncertainty in hydrological change modelling

    DEFF Research Database (Denmark)

    Seaby, Lauren Paige

    and future conditions was found to be modest across multiple climate models and bias correction methods. The choice of RCM contributes almost all of the uncertainty across precipitation inputs, explaining 99% of uncertainty in total annual precipitation, while choice of bias correction method...

  5. Mama Software Features: Uncertainty Testing

    Energy Technology Data Exchange (ETDEWEB)

    Ruggiero, Christy E. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Porter, Reid B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-05-30

    This document reviews how the uncertainty in the calculations is being determined with test image data. The results of this testing give an ‘initial uncertainty’ number that can be used to estimate the ‘back end’ uncertainty in digital image quantification. Statisticians are refining these numbers as part of a UQ effort.

  6. AGRICULTURAL ECONOMICS, INTERDEPENDENCE AND UNCERTAINTY

    OpenAIRE

    Anderson, Jock R.

    1982-01-01

    Interdependence has always been central to economics but assumes pressing importance for agricultural economists as they deal with industrialising agricultures. Continued unresolvable uncertainties, when properly recognised, also add to the challenge of relevant work in agricultural economics. The related roles of interdependence and uncertainty are illustrated through examples from the progress of agricultural technology and enhancement of food security.

  7. Uncertainty in Integrated Assessment Scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Mort Webster

    2005-10-17

    The determination of climate policy is a decision under uncertainty. The uncertainty in future climate change impacts is large, as is the uncertainty in the costs of potential policies. Rational and economically efficient policy choices will therefore seek to balance the expected marginal costs with the expected marginal benefits. This approach requires that the risks of future climate change be assessed. The decision process need not be formal or quantitative for descriptions of the risks to be useful. Whatever the decision procedure, a useful starting point is to have as accurate a description of climate risks as possible. Given the goal of describing uncertainty in future climate change, we need to characterize the uncertainty in the main causes of uncertainty in climate impacts. One of the major drivers of uncertainty in future climate change is the uncertainty in future emissions, both of greenhouse gases and other radiatively important species such as sulfur dioxide. In turn, the drivers of uncertainty in emissions are uncertainties in the determinants of the rate of economic growth and in the technologies of production and how those technologies will change over time. This project uses historical experience and observations from a large number of countries to construct statistical descriptions of variability and correlation in labor productivity growth and in AEEI. The observed variability then provides a basis for constructing probability distributions for these drivers. The variance of uncertainty in growth rates can be further modified by expert judgment if it is believed that future variability will differ from the past. But often, expert judgment is more readily applied to projected median or expected paths through time. Analysis of past variance and covariance provides initial assumptions about future uncertainty for quantities that are less intuitive and difficult for experts to estimate, and these variances can be normalized and then applied to mean

  8. Reformulating the Quantum Uncertainty Relation

    Science.gov (United States)

    Li, Jun-Li; Qiao, Cong-Feng

    2015-08-01

    The uncertainty principle is one of the cornerstones of quantum theory. In the literature, there are two types of uncertainty relations: the operator form concerning the variances of physical observables, and the entropy form related to entropic quantities. Both these forms are inequalities involving pairwise observables, and are found to be nontrivial to extend to multiple observables. In this work we introduce a new form of uncertainty relation which may give complete trade-off relations for variances of observables in pure and mixed quantum systems. Unlike the prevailing uncertainty relations, which are either quantum state dependent or not directly measurable, our bounds for variances of observables are quantum state independent and immune from the “triviality” problem of having zero expectation values. Furthermore, the new uncertainty relation may provide a geometric explanation for the reason why there are limitations on the simultaneous determination of different observables in N-dimensional Hilbert space.

  9. Uncertainty relation in Schwarzschild spacetime

    CERN Document Server

    Feng, Jun; Gould, Mark D; Fan, Heng

    2015-01-01

    We explore the entropic uncertainty relation in the curved background outside a Schwarzschild black hole, and find that Hawking radiation introduces a nontrivial modification of the uncertainty bound for a particular observer, which could therefore be witnessed by a proper uncertainty game experimentally. We first investigate an uncertainty game between a free-falling observer and his static partner holding a quantum memory initially entangled with the quantum system to be measured. Due to the information loss from Hawking decoherence, we find an inevitable increase of the uncertainty on the outcome of measurements in the view of the static observer, which is dependent on the mass of the black hole, the distance of the observer from the event horizon, and the mode frequency of the quantum memory. In an alternative game between two static players, we show that quantum information of a qubit can be transferred to the quantum memory through a bath of fluctuating quantum fields outside the black hole, which triggers an effectively reduced uncert...

  10. Uncertainty relation in Schwarzschild spacetime

    Directory of Open Access Journals (Sweden)

    Jun Feng

    2015-04-01

    We explore the entropic uncertainty relation in the curved background outside a Schwarzschild black hole, and find that Hawking radiation introduces a nontrivial modification of the uncertainty bound for a particular observer; it could therefore be witnessed by a proper uncertainty game experimentally. We first investigate an uncertainty game between a free-falling observer and his static partner holding a quantum memory initially entangled with the quantum system to be measured. Due to the information loss from Hawking decoherence, we find an inevitable increase of the uncertainty on the outcome of measurements in the view of the static observer, which is dependent on the mass of the black hole, the distance of the observer from the event horizon, and the mode frequency of the quantum memory. To illustrate the generality of this paradigm, we relate the entropic uncertainty bound to other uncertainty probes, e.g., the time–energy uncertainty. In an alternative game between two static players, we show that quantum information of a qubit can be transferred to the quantum memory through a bath of fluctuating quantum fields outside the black hole. For a particular choice of initial state, we show that the Hawking decoherence cannot counteract entanglement generation after the dynamical evolution of the system, which triggers an effectively reduced uncertainty bound that violates the intrinsic limit −log2⁡c. Numerical estimation for a proper choice of initial state shows that our result is comparable with possible real experiments. Finally, a discussion of the black hole firewall paradox in the context of the entropic uncertainty relation is given.
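
    The "intrinsic limit −log2 c" referred to in this abstract is presumably the bound of the standard memory-assisted entropic uncertainty relation; under that assumption, the relation reads:

```latex
% Memory-assisted entropic uncertainty relation; Q and R are the two
% measurements on system A, B is the quantum memory, and c is the
% maximal overlap of the measurement eigenbases.
S(Q|B) + S(R|B) \;\ge\; -\log_2 c + S(A|B),
\qquad c = \max_{i,j}\,\bigl|\langle q_i | r_j \rangle\bigr|^2
```

    When A and B are sufficiently entangled, the conditional entropy S(A|B) is negative, which is what allows the total uncertainty to drop below −log2 c.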

  11. Uncertainties in Atomic Data and Their Propagation Through Spectral Models. I.

    Science.gov (United States)

    Bautista, M. A.; Fivet, V.; Quinet, P.; Dunn, J.; Gull, T. R.; Kallman, T. R.; Mendoza, C.

    2013-01-01

    We present a method for computing uncertainties in spectral models, i.e., level populations, line emissivities, and emission line ratios, based upon the propagation of uncertainties originating from atomic data. We provide analytic expressions, in the form of linear sets of algebraic equations, for the coupled uncertainties among all levels. These equations can be solved efficiently for any set of physical conditions and uncertainties in the atomic data. We illustrate our method applied to spectral models of O III and Fe II and discuss the impact of the uncertainties on atomic systems under different physical conditions. As to intrinsic uncertainties in theoretical atomic data, we propose that these uncertainties can be estimated from the dispersion in the results from various independent calculations. This technique provides excellent results for the uncertainties in A-values of forbidden transitions in [Fe II]. Key words: atomic data - atomic processes - line: formation - methods: data analysis - molecular data - molecular processes - techniques: spectroscopic
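
    The flavour of the propagation can be shown on a two-level toy atom, far simpler than the paper's O III and Fe II models: a fractional uncertainty on an A-value maps linearly onto the level populations and line emissivity. All rates below are invented for illustration.

```python
# Two-level toy atom (illustrative only): steady state gives
#   n2/n1 = C12 / (A21 + C21)
# with collisional excitation C12, de-excitation C21, radiative decay A21.
A21, u_A21 = 1.0e-3, 0.10    # A-value (s^-1) and its relative uncertainty
C12, C21 = 1.0e-4, 5.0e-4    # assumed collision rates (s^-1)

ratio = C12 / (A21 + C21)                # level population ratio n2/n1
# Linearized propagation (a one-parameter version of the paper's coupled
# linear equations for level-population uncertainties):
u_ratio = (A21 / (A21 + C21)) * u_A21    # relative uncertainty of n2/n1
u_emis = (C21 / (A21 + C21)) * u_A21     # of line emissivity j ∝ n2 * A21
```

    Note the partial cancellation in the emissivity: j is proportional to A21 but n2 decreases with A21, so a 10% A-value uncertainty yields only ~3% emissivity uncertainty in this regime.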

  12. UNCERTAINTIES IN ATOMIC DATA AND THEIR PROPAGATION THROUGH SPECTRAL MODELS. I

    International Nuclear Information System (INIS)

    We present a method for computing uncertainties in spectral models, i.e., level populations, line emissivities, and emission line ratios, based upon the propagation of uncertainties originating from atomic data. We provide analytic expressions, in the form of linear sets of algebraic equations, for the coupled uncertainties among all levels. These equations can be solved efficiently for any set of physical conditions and uncertainties in the atomic data. We illustrate our method applied to spectral models of O III and Fe II and discuss the impact of the uncertainties on atomic systems under different physical conditions. As to intrinsic uncertainties in theoretical atomic data, we propose that these uncertainties can be estimated from the dispersion in the results from various independent calculations. This technique provides excellent results for the uncertainties in A-values of forbidden transitions in [Fe II].

  13. Uncertainties in Atomic Data and Their Propagation Through Spectral Models. I

    CERN Document Server

    Bautista, Manuel A.; Quinet, Pascal; Dunn, Jay; Gull, Timothy R.; Kallman, Theodore R.; Mendoza, Claudio

    2013-01-01

    We present a method for computing uncertainties in spectral models, i.e. level populations, line emissivities, and emission line ratios, based upon the propagation of uncertainties originating from atomic data. We provide analytic expressions, in the form of linear sets of algebraic equations, for the coupled uncertainties among all levels. These equations can be solved efficiently for any set of physical conditions and uncertainties in the atomic data. We illustrate our method applied to spectral models of O III and Fe II and discuss the impact of the uncertainties on atomic systems under different physical conditions. As to intrinsic uncertainties in theoretical atomic data, we propose that these uncertainties can be estimated from the dispersion in the results from various independent calculations. This technique provides excellent results for the uncertainties in A-values of forbidden transitions in [Fe II].

  14. UNCERTAINTIES IN ATOMIC DATA AND THEIR PROPAGATION THROUGH SPECTRAL MODELS. I

    Energy Technology Data Exchange (ETDEWEB)

    Bautista, M. A.; Fivet, V. [Department of Physics, Western Michigan University, Kalamazoo, MI 49008 (United States); Quinet, P. [Astrophysique et Spectroscopie, Universite de Mons-UMONS, B-7000 Mons (Belgium); Dunn, J. [Physical Science Department, Georgia Perimeter College, Dunwoody, GA 30338 (United States); Gull, T. R. [Code 667, NASA Goddard Space Flight Center, Greenbelt, MD 20771 (United States); Kallman, T. R. [Code 662, NASA Goddard Space Flight Center, Greenbelt, MD 20771 (United States); Mendoza, C., E-mail: manuel.bautista@wmich.edu [Centro de Fisica, Instituto Venezolano de Investigaciones Cientificas (IVIC), P.O. Box 20632, Caracas 1020A (Venezuela, Bolivarian Republic of)

    2013-06-10

    We present a method for computing uncertainties in spectral models, i.e., level populations, line emissivities, and emission line ratios, based upon the propagation of uncertainties originating from atomic data. We provide analytic expressions, in the form of linear sets of algebraic equations, for the coupled uncertainties among all levels. These equations can be solved efficiently for any set of physical conditions and uncertainties in the atomic data. We illustrate our method applied to spectral models of O III and Fe II and discuss the impact of the uncertainties on atomic systems under different physical conditions. As to intrinsic uncertainties in theoretical atomic data, we propose that these uncertainties can be estimated from the dispersion in the results from various independent calculations. This technique provides excellent results for the uncertainties in A-values of forbidden transitions in [Fe II].

  15. Entropic uncertainty relation based on generalized uncertainty principle

    CERN Document Server

    Kawamoto, Shoichi; Wen, Wen-Yu

    2016-01-01

    We explore the modification of the entropic formulation of the uncertainty principle in quantum mechanics, which measures the incompatibility of measurements in terms of Shannon entropy. The deformation in question is of the type known as the generalized uncertainty principle, which is motivated by thought experiments in quantum gravity and string theory and is characterized by a parameter of Planck scale. The corrections are evaluated for small deformation parameters by use of the Gaussian wave function and numerical calculation. As the generalized uncertainty principle has proven to be useful in the study of the quantum nature of black holes, this study would be a step toward introducing an information-theoretic viewpoint to black hole physics.
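
    For reference, the generalized uncertainty principle discussed here is usually written in its common one-parameter form (this is the standard textbook expression, assumed rather than quoted from the paper):

```latex
% Generalized uncertainty principle (GUP); beta is the Planck-scale
% deformation parameter, recovering Heisenberg's relation as beta -> 0.
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}\Bigl(1 + \beta\,(\Delta p)^2\Bigr)
```

    The quadratic term implies a minimal measurable length of order \(\hbar\sqrt{\beta}\), which is the feature the entropic reformulation inherits.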

  16. SENSIT: a cross-section and design sensitivity and uncertainty analysis code. [In FORTRAN for CDC-7600, IBM 360]

    Energy Technology Data Exchange (ETDEWEB)

    Gerstl, S.A.W.

    1980-01-01

    SENSIT computes the sensitivity and uncertainty of a calculated integral response (such as a dose rate) due to input cross sections and their uncertainties. Sensitivity profiles are computed for neutron and gamma-ray reaction cross sections of standard multigroup cross section sets and for secondary energy distributions (SEDs) of multigroup scattering matrices. In the design sensitivity mode, SENSIT computes changes in an integral response due to design changes and gives the appropriate sensitivity coefficients. Cross section uncertainty analyses are performed for three types of input data uncertainties: cross-section covariance matrices for pairs of multigroup reaction cross sections, spectral shape uncertainty parameters for secondary energy distributions (integral SED uncertainties), and covariance matrices for energy-dependent response functions. For all three types of data uncertainties SENSIT computes the resulting variance and estimated standard deviation in an integral response of interest, on the basis of generalized perturbation theory. SENSIT attempts to be more comprehensive than earlier sensitivity analysis codes, such as SWANLAKE.
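
    At first order, the cross-section uncertainty analysis described here reduces to the "sandwich rule" var(R) ≈ SᵀCS, combining sensitivity coefficients with a covariance matrix. The sensitivities and covariances below are invented, not SENSIT data.

```python
def response_variance(S, C):
    """First-order 'sandwich rule': var(R) = S^T C S for relative
    sensitivities S and a relative covariance matrix C."""
    n = len(S)
    return sum(S[i] * C[i][j] * S[j] for i in range(n) for j in range(n))

S = [0.8, -0.3, 0.1]            # invented relative sensitivity coefficients
C = [[0.010, 0.002, 0.000],     # invented relative covariance matrix
     [0.002, 0.040, 0.001],
     [0.000, 0.001, 0.090]]
var_R = response_variance(S, C)
rel_sd = var_R ** 0.5           # relative standard deviation of the response
```

    The off-diagonal covariance terms are why codes like SENSIT need full covariance matrices rather than independent per-reaction uncertainties.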

  17. Development of a Prototype Model-Form Uncertainty Knowledge Base

    Science.gov (United States)

    Green, Lawrence L.

    2016-01-01

    Uncertainties are generally classified as either aleatory or epistemic. Aleatory uncertainties are those attributed to random variation, either naturally or through manufacturing processes. Epistemic uncertainties are generally attributed to a lack of knowledge. One type of epistemic uncertainty is called model-form uncertainty. The term model-form means that among the choices to be made during a design process within an analysis, there are different forms of the analysis process, which each give different results for the same configuration at the same flight conditions. Examples of model-form uncertainties include the grid density, grid type, and solver type used within a computational fluid dynamics code, or the choice of the number and type of model elements within a structures analysis. The objectives of this work are to identify and quantify a representative set of model-form uncertainties and to make this information available to designers through an interactive knowledge base (KB). The KB can then be used during probabilistic design sessions, so as to enable the possible reduction of uncertainties in the design process through resource investment. An extensive literature search has been conducted to identify and quantify typical model-form uncertainties present within aerospace design. An initial attempt has been made to assemble the results of this literature search into a searchable KB, usable in real time during probabilistic design sessions. A concept of operations and the basic structure of a model-form uncertainty KB are described. Key operations within the KB are illustrated. Current limitations in the KB, and possible workarounds are explained.
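
    One minimal way such a knowledge base could be structured is a list of tagged records with a ranked query. This is purely illustrative; the field names and spread values are invented, not taken from the actual KB.

```python
# Hypothetical model-form uncertainty KB: each entry records a modeling
# choice, the quantity it affects, and the observed spread across choices.
KB = [
    {"discipline": "CFD", "choice": "grid density",
     "quantity": "drag coefficient", "spread_pct": 3.0},
    {"discipline": "CFD", "choice": "turbulence model",
     "quantity": "drag coefficient", "spread_pct": 8.0},
    {"discipline": "structures", "choice": "element type",
     "quantity": "tip deflection", "spread_pct": 2.0},
]

def query(discipline):
    """Return KB entries for one discipline, largest spread first, so a
    design session sees the dominant model-form uncertainty on top."""
    hits = [e for e in KB if e["discipline"] == discipline]
    return sorted(hits, key=lambda e: e["spread_pct"], reverse=True)
```

    Ranking by spread supports the resource-investment use case described above: the top entry is where refinement buys the most uncertainty reduction.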

  18. Uncertainty propagation within an integrated model of climate change

    International Nuclear Information System (INIS)

    This paper demonstrates a methodology whereby stochastic dynamical systems are used to investigate a climate model's inherent capacity to propagate uncertainty over time. The usefulness of the methodology stems from its ability to identify the variables that account for most of the model's uncertainty. We accomplish this by reformulating a deterministic dynamical system capturing the structure of an integrated climate model into a stochastic dynamical system. Then, via the use of computational techniques of stochastic differential equations accurate uncertainty estimates of the model's variables are determined. The uncertainty is measured in terms of properties of probability distributions of the state variables. The starting characteristics of the uncertainty of the initial state and the random fluctuations are derived from estimates given in the literature. Two aspects of uncertainty are investigated: (1) the dependence on environmental scenario - which is determined by technological development and actions towards environmental protection; and (2) the dependence on the magnitude of the initial state measurement error determined by the progress of climate change and the total magnitude of the system's random fluctuations as well as by our understanding of the climate system. Uncertainty of most of the system's variables is found to be nearly independent of the environmental scenario for the time period under consideration (1990-2100). Even conservative uncertainty estimates result in scenario overlap of several decades during which the consequences of any actions affecting the environment could be very difficult to identify with a sufficient degree of confidence. This fact may have fundamental consequences on the level of social acceptance of any restrictive measures against accelerating global warming. In general, the stochastic fluctuations contribute more to the uncertainty than the initial state measurements. The variables coupling all major climate elements
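
    The reformulation into a stochastic dynamical system can be illustrated with an Euler–Maruyama ensemble for a toy one-variable relaxation model; the drift, noise level, and equilibrium value below are invented stand-ins for the integrated climate model's equations.

```python
import random, statistics

def simulate(x0, drift, sigma, dt, steps, rng):
    """Euler-Maruyama integration of dX = drift(X) dt + sigma dW."""
    x = x0
    for _ in range(steps):
        x += drift(x) * dt + sigma * rng.gauss(0.0, 1.0) * dt ** 0.5
    return x

rng = random.Random(7)
# Toy drift: relaxation toward an equilibrium temperature anomaly of 2.0.
drift = lambda x: -0.1 * (x - 2.0)
finals = [simulate(0.0, drift, 0.3, 0.1, 1000, rng) for _ in range(500)]
mean, sd = statistics.mean(finals), statistics.pstdev(finals)
```

    The ensemble spread sd is the propagated uncertainty at the final time; in the paper's setup, comparing such spreads across state variables identifies which variables dominate the model's total uncertainty.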

  19. Uncertainty analysis of nuclear waste package corrosion

    International Nuclear Information System (INIS)

    This paper describes the results of an evaluation of three uncertainty analysis methods for assessing the possible variability in calculating the corrosion process in a nuclear waste package. The purpose of the study is the determination of how each of three uncertainty analysis methods, Monte Carlo, Latin hypercube sampling (LHS) and a modified discrete probability distribution method, perform in such calculations. The purpose is not to examine the absolute magnitude of the numbers but rather to rank the performance of each of the uncertainty methods in assessing the model variability. In this context it was found that the Monte Carlo method provided the most accurate assessment but at a prohibitively high cost. The modified discrete probability method provided accuracy close to that of the Monte Carlo for a fraction of the cost. The LHS method was found to be too inaccurate for this calculation, although it would be appropriate for use in a model which requires substantially more computer time than the one studied in this paper.
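
    The kind of ranking exercise described here can be reproduced qualitatively on a toy problem: repeat a small-sample estimate of a mean response under simple Monte Carlo and under LHS, and compare how much the estimates scatter across repeats. The response function and sample sizes below are invented, not the paper's corrosion model.

```python
import math, random, statistics

def lhs_1d(n, rng):
    """One LHS sample per equal-width stratum of [0, 1)."""
    cells = list(range(n))
    rng.shuffle(cells)
    return [(c + rng.random()) / n for c in cells]

def estimate(points):
    # Toy corrosion-depth response, smooth and monotone in the random input.
    return statistics.mean(math.exp(-2.0 * u) for u in points)

rng = random.Random(3)
mc_runs = [estimate([rng.random() for _ in range(20)]) for _ in range(300)]
lhs_runs = [estimate(lhs_1d(20, rng)) for _ in range(300)]
mc_sd, lhs_sd = statistics.pstdev(mc_runs), statistics.pstdev(lhs_runs)
```

    For a smooth monotone response like this, the LHS estimates scatter far less than the Monte Carlo ones at the same sample size; the paper's finding that LHS was "too inaccurate" concerns its particular model, where accuracy of the whole distribution, not just the mean, mattered.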

  20. Uncertainties in Nuclear Proliferation Modeling

    International Nuclear Information System (INIS)

    There have been various efforts in the research community to understand the determinants of nuclear proliferation and develop quantitative tools to predict nuclear proliferation events. Such systematic approaches have shown the possibility to provide warning for the international community to prevent nuclear proliferation activities. However, there are still large debates for the robustness of the actual effect of determinants and projection results. Some studies have shown that several factors can cause uncertainties in previous quantitative nuclear proliferation modeling works. This paper analyzes the uncertainties in the past approaches and suggests future works in the view of proliferation history, analysis methods, and variable selection. The research community still lacks the knowledge for the source of uncertainty in current models. Fundamental problems in modeling will remain even other advanced modeling method is developed. Before starting to develop fancy model based on the time dependent proliferation determinants' hypothesis, using graph theory, etc., it is important to analyze the uncertainty of current model to solve the fundamental problems of nuclear proliferation modeling. The uncertainty from different proliferation history coding is small. Serious problems are from limited analysis methods and correlation among the variables. Problems in regression analysis and survival analysis cause huge uncertainties when using the same dataset, which decreases the robustness of the result. Inaccurate variables for nuclear proliferation also increase the uncertainty. To overcome these problems, further quantitative research should focus on analyzing the knowledge suggested on the qualitative nuclear proliferation studies

  1. TSUNAMI Primer: A Primer for Sensitivity/Uncertainty Calculations with SCALE

    Energy Technology Data Exchange (ETDEWEB)

    Rearden, Bradley T [ORNL]; Mueller, Don [ORNL]; Bowman, Stephen M [ORNL]; Busch, Robert D. [University of New Mexico, Albuquerque]; Emerson, Scott [University of New Mexico, Albuquerque]

    2009-01-01

    This primer presents examples in the application of the SCALE/TSUNAMI tools to generate k{sub eff} sensitivity data for one- and three-dimensional models using TSUNAMI-1D and -3D and to examine uncertainties in the computed k{sub eff} values due to uncertainties in the cross-section data used in their calculation. The proper use of unit cell data and need for confirming the appropriate selection of input parameters through direct perturbations are described. The uses of sensitivity and uncertainty data to identify and rank potential sources of computational bias in an application system and TSUNAMI tools for assessment of system similarity using sensitivity and uncertainty criteria are demonstrated. Uses of these criteria in trending analyses to assess computational biases, bias uncertainties, and gap analyses are also described. Additionally, an application of the data adjustment tool TSURFER is provided, including identification of specific details of sources of computational bias.
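The propagation of cross-section uncertainty into k-eff that TSUNAMI performs is, conceptually, the "sandwich rule": the relative variance of k is S C S-transpose, where S is the sensitivity profile and C the relative covariance matrix. A minimal sketch with invented three-group numbers (not actual SCALE covariance data):

```python
# Hypothetical sensitivity profile S (dk/k per fractional cross-section
# change) and relative covariance matrix C for three energy groups.
S = [0.31, 0.12, -0.05]
C = [[4.0e-4, 1.0e-4, 0.0],
     [1.0e-4, 9.0e-4, 2.0e-4],
     [0.0,    2.0e-4, 1.6e-3]]

# Sandwich rule: var(k)/k^2 = S * C * S^T
var_k = sum(S[i] * C[i][j] * S[j] for i in range(3) for j in range(3))
rel_unc = var_k ** 0.5               # relative standard uncertainty in k_eff
```

Because the sensitivities enter linearly, the same S vectors also drive the similarity indices used for benchmark selection.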

  2. Entropic uncertainty relation based on generalized uncertainty principle

    OpenAIRE

    Kawamoto, Shoichi; Hsu, Li-Yi; Wen, Wen-Yu

    2016-01-01

    We explore the modification of the entropic formulation of uncertainty principle in quantum mechanics which measures the incompatibility of measurements in terms of Shannon entropy. The deformation in question is the type so called generalized uncertainty principle that is motivated by thought experiments in quantum gravity and string theory and is characterized by a parameter of Planck scale. The corrections are evaluated for small deformation parameters by use of the Gaussian wave function ...

  3. Uncertainty in Regional Air Quality Modeling

    Science.gov (United States)

    Digar, Antara

    Effective pollution mitigation is the key to successful air quality management. Although states invest millions of dollars to predict future air quality, the regulatory modeling and analysis process that informs pollution control strategy remains uncertain. Traditionally, deterministic ‘bright-line’ tests are applied to evaluate the sufficiency of a control strategy to attain an air quality standard. A critical part of regulatory attainment demonstration is the prediction of future pollutant levels using photochemical air quality models. However, because the models are themselves uncertain, deterministic tests convey a false sense of precision, as if the pollutant response to emission controls were perfectly known, and may eventually mislead the selection of control policies. These uncertainties in turn affect the health impact assessment of air pollution control strategies. This thesis explores beyond the conventional practice of deterministic attainment demonstration and presents novel approaches that yield probabilistic representations of pollutant response to emission controls by accounting for uncertainties in regional air quality planning. Computationally efficient methods are developed and validated to characterize uncertainty in the prediction of secondary pollutant (ozone and particulate matter) sensitivities to precursor emissions in the presence of uncertainties in model assumptions and input parameters. We also introduce impact factors that enable identification of model inputs and scenarios that strongly influence pollutant concentrations and their sensitivity to precursor emissions. We demonstrate how these probabilistic approaches could be applied to determine the likelihood that any control measure will yield regulatory attainment, or could be extended to evaluate probabilistic health benefits of emission controls, considering uncertainties in both air quality models and epidemiological concentration-response relationships. Finally, ground-level observations for pollutant (ozone) and precursor

  4. Extended Forward Sensitivity Analysis for Uncertainty Quantification

    Energy Technology Data Exchange (ETDEWEB)

    Haihua Zhao; Vincent A. Mousseau

    2008-09-01

    This report presents the forward sensitivity analysis method as a means for quantifying uncertainty in system analysis. The traditional approach to uncertainty quantification is based on a “black box” approach: the simulation tool is treated as an unknown signal generator, a distribution of inputs according to assumed probability density functions is sent in, and the distribution of the outputs is measured and correlated back to the original input distribution. This approach requires a large number of simulation runs and therefore has a high computational cost. In contrast to the “black box” method, a more efficient sensitivity approach can take advantage of intimate knowledge of the simulation code. In this approach, equations for the propagation of uncertainty are constructed and the sensitivities are solved for as variables in the same simulation. This “glass box” method can generate sensitivity information similar to that of the “black box” approach with only a few runs to cover a large uncertainty region. Because only a small number of runs is required, those runs can be done with high accuracy in space and time, ensuring that the uncertainty of the physical model is being measured and not simply the numerical error caused by coarse discretization. In the forward sensitivity method, the model is differentiated with respect to each parameter to yield an additional system of the same size as the original one, the solution of which is the parameter sensitivity. The sensitivity of any output variable can then be obtained directly from these sensitivities by applying the chain rule of differentiation. We extend the forward sensitivity method to include time and spatial steps as special parameters so that the numerical errors can be quantified against other physical parameters. This extension makes the forward sensitivity method a much more powerful tool for uncertainty analysis. By knowing the relative sensitivity of time and space steps with other
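The "glass box" idea can be sketched for a one-parameter ODE, dx/dt = -k x: differentiating the model equation with respect to k gives a companion equation for the sensitivity s = dx/dk, which is integrated alongside the state in the same run (explicit Euler here for brevity; coefficients are illustrative).

```python
def forward_sensitivity(k=0.8, x0=1.0, t_end=2.0, dt=1e-4):
    """Integrate dx/dt = -k*x together with its sensitivity equation
    ds/dt = -x - k*s, where s = dx/dk and s(0) = 0."""
    x, s = x0, 0.0
    for _ in range(int(t_end / dt)):
        # one explicit Euler step for the coupled (state, sensitivity) system
        x, s = x + dt * (-k * x), s + dt * (-x - k * s)
    return x, s

x, s = forward_sensitivity()
# Analytically x(t) = exp(-k*t) and s(t) = -t*exp(-k*t), so at t = 2,
# k = 0.8 a single run yields both x ~ 0.2019 and s ~ -0.4038.
```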

  5. Decision-making under great uncertainty

    International Nuclear Information System (INIS)

    Five types of decision-uncertainty are distinguished: uncertainty of consequences, of values, of demarcation, of reliance, and of co-ordination. Strategies are proposed for each type of uncertainty. The general conclusion is that it is meaningful for decision theory to treat cases with greater uncertainty than the textbook case of 'decision-making under uncertainty'. (au)

  6. Optimal uncertainty relations in a modified Heisenberg algebra

    CERN Document Server

    Abdelkhalek, Kais; Fiedler, Leander; Mangano, Gianpiero; Schwonnek, René

    2016-01-01

    Various theories that aim at unifying gravity with quantum mechanics suggest modifications of the Heisenberg algebra for position and momentum. From the perspective of quantum mechanics, such modifications lead to new uncertainty relations which are thought (but not proven) to imply the existence of a minimal observable length. Here we prove this statement in a framework of sufficient physical and structural assumptions. Moreover, we present a general method that allows one to formulate optimal and state-independent variance-based uncertainty relations. In addition, instead of variances, we make use of entropies as a measure of uncertainty and provide uncertainty relations in terms of min- and Shannon entropies. We compute the corresponding entropic minimal lengths and find that the minimal length in terms of min-entropy is exactly one bit.

  7. Analysis and Reduction of Complex Networks Under Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Knio, Omar M

    2014-04-09

    This is a collaborative proposal that aims at developing new methods for the analysis and reduction of complex multiscale networks under uncertainty. The approach is based on combining methods of computational singular perturbation (CSP) and probabilistic uncertainty quantification. In deterministic settings, CSP yields asymptotic approximations of reduced-dimensionality “slow manifolds” on which a multiscale dynamical system evolves. Introducing uncertainty raises fundamentally new issues, particularly concerning its impact on the topology of slow manifolds, and means to represent and quantify associated variability. To address these challenges, this project uses polynomial chaos (PC) methods to reformulate uncertain network models, and to analyze them using CSP in probabilistic terms. Specific objectives include (1) developing effective algorithms that can be used to illuminate fundamental and unexplored connections among model reduction, multiscale behavior, and uncertainty, and (2) demonstrating the performance of these algorithms through applications to model problems.

  8. Collaborative framework for PIV uncertainty quantification: comparative assessment of methods

    International Nuclear Information System (INIS)

    A posteriori uncertainty quantification of particle image velocimetry (PIV) data is essential to obtain accurate estimates of the uncertainty associated with a given experiment. This is particularly relevant when measurements are used to validate computational models or in design and decision processes. In spite of the importance of the subject, the first PIV uncertainty quantification (PIV-UQ) methods have been developed only in the last three years. The present work is a comparative assessment of four approaches recently proposed in the literature: the uncertainty surface method (Timmins et al 2012), the particle disparity approach (Sciacchitano et al 2013), the peak ratio criterion (Charonko and Vlachos 2013) and the correlation statistics method (Wieneke 2015). The analysis is based upon experiments conducted for this specific purpose, where several measurement techniques are employed simultaneously. The performances of the above approaches are surveyed across different measurement conditions and flow regimes. (paper)

  9. Computational movement analysis

    CERN Document Server

    Laube, Patrick

    2014-01-01

    This SpringerBrief discusses the characteristics of spatiotemporal movement data, including uncertainty and scale. It investigates three core aspects of Computational Movement Analysis: Conceptual modeling of movement and movement spaces, spatiotemporal analysis methods aiming at a better understanding of movement processes (with a focus on data mining for movement patterns), and using decentralized spatial computing methods in movement analysis. The author presents Computational Movement Analysis as an interdisciplinary umbrella for analyzing movement processes with methods from a range of fi

  10. Climate Projections and Uncertainty Communication.

    Science.gov (United States)

    Joslyn, Susan L; LeClerc, Jared E

    2016-01-01

    Lingering skepticism about climate change might be due in part to the way climate projections are perceived by members of the public. Variability between scientists' estimates might give the impression that scientists disagree about the fact of climate change rather than about details concerning the extent or timing. Providing uncertainty estimates might clarify that the variability is due in part to quantifiable uncertainty inherent in the prediction process, thereby increasing people's trust in climate projections. This hypothesis was tested in two experiments. Results suggest that including uncertainty estimates along with climate projections leads to an increase in participants' trust in the information. Analyses explored the roles of time, place, demographic differences (e.g., age, gender, education level, political party affiliation), and initial belief in climate change. Implications are discussed in terms of the potential benefit of adding uncertainty estimates to public climate projections. PMID:26695995

  11. Uncertainty product of composite signals

    International Nuclear Information System (INIS)

    The well known uncertainty product of communication theory for a signal in the time domain and its Fourier transform in the frequency domain is studied for a 'composite signal', i.e. a 'pure' signal to which a time-delayed replica is added. This uncertainty product shows the appearance of local maxima and minima as a function of the time delay, leading to the following conjecture: the uncertainty product of a non-Gaussian composite signal can be smaller than that of the 'pure' signal. As an example this conjecture will be proven for the derivative of the Gaussian signal and for the Cauchy distribution. The effect on the uncertainty product of adding a delayed scaled replica of a signal to the original signal in the time domain leads to an important possibility for interpretation in the study of the reverberation phenomenon in echo-location signals of dolphins. (author). Letter-to-the-editor
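The time-bandwidth product in the abstract can be checked numerically. For a real signal, Parseval's theorem gives the mean-square frequency as the energy of the derivative divided by the energy of the signal, so the product can be evaluated on a grid; this sketch uses a unit Gaussian and a delayed, scaled replica with an invented delay and scale.

```python
import math

def uncertainty_product(f, t_lo=-20.0, t_hi=20.0, n=8000):
    """Delta_t * Delta_omega for a real signal f(t), using
    <w^2> = integral |f'|^2 / integral |f|^2 (central differences)."""
    dt = (t_hi - t_lo) / n
    ts = [t_lo + i * dt for i in range(n + 1)]
    fs = [f(t) for t in ts]
    norm = sum(v * v for v in fs) * dt
    t_mean = sum(t * v * v for t, v in zip(ts, fs)) * dt / norm
    t_var = sum((t - t_mean) ** 2 * v * v for t, v in zip(ts, fs)) * dt / norm
    dfs = [(fs[i + 1] - fs[i - 1]) / (2 * dt) for i in range(1, n)]
    w_var = sum(v * v for v in dfs) * dt / norm
    return math.sqrt(t_var * w_var)

gauss = lambda t: math.exp(-t * t / 2)
pure = uncertainty_product(gauss)      # ~0.5, the Gaussian minimum
comp = uncertainty_product(lambda t: gauss(t) + 0.5 * gauss(t - 3.0))
# For a Gaussian pure signal the composite product exceeds 1/2; the
# conjecture in the abstract concerns non-Gaussian pure signals.
```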

  12. Uncertainty relations for gravitomagnetic fields

    International Nuclear Information System (INIS)

    Full text: In his book The Physical Principles of the Quantum Theory from 1929 Heisenberg derived the uncertainty relation between the components of electrical and magnetic field in an operational way. The main idea is that any experimental procedure designed to measure simultaneously the field components necessarily involves probe particles, whose intrinsic uncertainty in the values of position and momentum imposes a fundamental limit on how accurate the field components can be measured. Following this approach we derive the uncertainty relation between derivatives of the gravitational metric tensor on the basis of the analogy between Maxwell equations for electromagnetism and the linearized Einstein's field equations. The existence of uncertainty relation opens up a possibility of identifying complementary observables for quantized gravitational field. (author)

  13. Numerical modeling of economic uncertainty

    DEFF Research Database (Denmark)

    Schjær-Jacobsen, Hans

    2007-01-01

    Representation and modeling of economic uncertainty is addressed by different modeling methods, namely stochastic variables and probabilities, interval analysis, and fuzzy numbers, in particular triple estimates. Focusing on discounted cash flow analysis numerical results are presented, comparisons...
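One of the interval-analysis ideas mentioned in the abstract can be sketched for discounted cash flow: when the discount rate is known only as an interval, the net present value is bracketed by evaluating at the interval endpoints (valid here because NPV is monotone in the rate for a conventional cash-flow profile; all figures below are invented).

```python
def interval_npv(cash_flows, rate_lo, rate_hi):
    """Bounds on net present value when the discount rate lies in
    [rate_lo, rate_hi]. With a negative outlay at t = 0 followed by
    non-negative inflows, NPV decreases monotonically in the rate,
    so the endpoints give the exact NPV interval."""
    npv = lambda r: sum(cf / (1 + r) ** t for t, cf in enumerate(cash_flows))
    return npv(rate_hi), npv(rate_lo)

lo, hi = interval_npv([-1000.0, 400.0, 400.0, 400.0], 0.05, 0.10)
# Here the NPV interval straddles zero: at this level of rate uncertainty
# the profitability of the project is undecidable.
```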

  14. Fundamental uncertainties in lung counting.

    Science.gov (United States)

    Kramer, Gary H; Hauck, Barry M

    2007-10-01

    The HML has investigated the uncertainty introduced into an activity estimate from a lung count due to 1) replicate counts and 2) lung set variability. Replicate counts in the HML seem to be affected only by random statistics, as the uncertainty can be predicted by Monte Carlo simulations. The findings from the lung set variability experiments suggest that a lung set has an unquantified uncertainty on its activity that adds a component to the uncertainty on the counting efficiency, and ultimately the activity estimate: lung sets can differ by as much as 30% at 17.5 keV, or about 13% at 185.7 keV, when one expects only a 3% difference. PMID:17846529

  15. Uncertainty quantification of squeal instability via surrogate modelling

    Science.gov (United States)

    Nobari, Amir; Ouyang, Huajiang; Bannister, Paul

    2015-08-01

    One of the major issues that car manufacturers are facing is the noise and vibration of brake systems. Of the different sorts of noise and vibration which a brake system may generate, squeal, an irritating high-frequency noise, costs the manufacturers significantly. Despite considerable research on brake squeal, its root cause is still not fully understood; the most common assumption, however, is mode-coupling. Complex eigenvalue analysis is the most widely used approach to the analysis of brake squeal problems. One of its major drawbacks, nevertheless, is that the effects of variability and uncertainty are not included in the results. Uncertainty and variability are two inseparable parts of any brake system: uncertainty is mainly caused by friction, contact, wear and thermal effects, while variability mostly stems from the manufacturing process, material properties and component geometries. Evaluating the effects of uncertainty and variability in the complex eigenvalue analysis improves the predictability of noise propensity and helps produce a more robust design. The biggest hurdle in the uncertainty analysis of brake systems is the computational cost and time, since most uncertainty analysis techniques rely on the results of many deterministic analyses. A full finite element model of a brake system typically consists of millions of degrees-of-freedom and many load cases, and the running time of such models is so long that the automotive industry is reluctant to perform many deterministic analyses. This paper, instead, proposes an efficient method of uncertainty propagation via surrogate modelling. A surrogate model of a brake system is constructed in order to reproduce the outputs of the large-scale finite element model and overcome the issue of computational workloads. The probability distribution of the real part of an unstable mode can then be obtained by using the surrogate model with a massive saving of
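The surrogate idea can be sketched end to end with a stand-in for the expensive model (the function, design points, input distribution and threshold below are all hypothetical, not taken from the paper): a few deterministic "runs" train a cheap interpolant, and Monte Carlo is then performed on the interpolant instead of the finite element model.

```python
import random

def expensive_model(mu):
    """Stand-in for a full finite-element complex-eigenvalue run: a
    hypothetical map from friction coefficient mu to the real part of
    the least stable mode (larger = more squeal-prone)."""
    return 0.2 + 1.5 * mu + 3.0 * mu * mu

# A handful of deterministic "runs" at design points ...
xs = [0.2, 0.4, 0.6]
ys = [expensive_model(x) for x in xs]

def surrogate(x):
    """Quadratic (Lagrange) interpolant through the three design points."""
    (x0, x1, x2), (y0, y1, y2) = xs, ys
    return (y0 * (x - x1) * (x - x2) / ((x0 - x1) * (x0 - x2))
            + y1 * (x - x0) * (x - x2) / ((x1 - x0) * (x1 - x2))
            + y2 * (x - x0) * (x - x1) / ((x2 - x0) * (x2 - x1)))

# ... then cheap Monte Carlo on the surrogate instead of the FE model:
rng = random.Random(1)
n = 20000
p_unstable = sum(surrogate(rng.gauss(0.4, 0.05)) > 1.3 for _ in range(n)) / n
```

The 20000 surrogate evaluations replace 20000 finite element solves, which is where the computational saving comes from; in practice the surrogate is validated against additional held-out model runs.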

  16. Supply chain collaboration under uncertainty

    OpenAIRE

    Hasan, Saad; Eckert, Claudia; Earl, Chris

    2012-01-01

    In fluctuating economic conditions such as global recession, supply chains operate under changing conditions of uncertainty. The impact of this uncertainty and associated risk might be mitigated by collaboration. This paper proposes a model of supply chain collaboration based on information exchange and decision coordination at both the strategic and tactical levels. However, a collaborative supply chain can be exposed to associated risks such as the failure of individual actors. Governance r...

  17. Uncertainty analysis and probabilistic modelling

    International Nuclear Information System (INIS)

    Many factors affect the accuracy and precision of probabilistic assessment. This report discusses sources of uncertainty and ways of addressing them. Techniques for propagating uncertainties in model input parameters through to model prediction are discussed as are various techniques for examining how sensitive and uncertain model predictions are to one or more input parameters. Various statements of confidence which can be made concerning the prediction of a probabilistic assessment are discussed as are several matters of potential regulatory interest. 55 refs

  18. Uncertainty, investment, and industry evolution

    OpenAIRE

    Caballero, Ricardo J; Robert S. Pindyck

    1992-01-01

    We study the effects of aggregate and idiosyncratic uncertainty on the entry of firms, total investment, and prices in a competitive industry with irreversible investment. We first use standard dynamic programming methods to determine firms' entry decisions, and we describe the resulting industry equilibrium and its characteristics, emphasizing the effects of different sources of uncertainty. We then show how the conditional distribution of prices can be used as an alternative means of determ...

  19. Investment Under Uncertainty: A Theory

    OpenAIRE

    Mellati, Ali

    2008-01-01

    There must be a restricted time horizon within which investors trust their anticipations in an uncertain condition. In this circumstance investors are concerned about what happens if the worst condition (i.e. decreasing prices) occurs. A best-worst strategy in a discounted payback period framework is applied to examine the effect of uncertainty on the time horizon and investment. The model shows that increasing uncertainty will reduce the time horizon as well as investment. Moreover, the calc...

  20. Uncertainty quantification in reacting flow modeling.

    Energy Technology Data Exchange (ETDEWEB)

    Le Maître, Olivier P. (Université d'Evry Val d'Essonne, Evry, France); Reagan, Matthew T.; Knio, Omar M. (Johns Hopkins University, Baltimore, MD); Ghanem, Roger Georges (Johns Hopkins University, Baltimore, MD); Najm, Habib N.

    2003-10-01

    Uncertainty quantification (UQ) in the computational modeling of physical systems is important for scientific investigation, engineering design, and model validation. In this work we develop techniques for UQ based on spectral and pseudo-spectral polynomial chaos (PC) expansions, and we apply these constructions in computations of reacting flow. We develop and compare both intrusive and non-intrusive spectral PC techniques. In the intrusive construction, the deterministic model equations are reformulated using Galerkin projection into a set of equations for the time evolution of the field variable PC expansion mode strengths. The mode strengths relate specific parametric uncertainties to their effects on model outputs. The non-intrusive construction uses sampling of many realizations of the original deterministic model, and projects the resulting statistics onto the PC modes, arriving at the PC expansions of the model outputs. We investigate and discuss the strengths and weaknesses of each approach, and identify their utility under different conditions. We also outline areas where ongoing and future research are needed to address challenges with both approaches.
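The non-intrusive construction can be sketched for a scalar toy model Y = exp(a*xi) with xi standard normal (the model and the value of a are invented for illustration): sample the deterministic model and project onto the first few probabilists' Hermite polynomials to recover the PC mode strengths.

```python
import math
import random

rng = random.Random(7)
a = 0.3
g = lambda xi: math.exp(a * xi)        # deterministic model output vs. input

# Probabilists' Hermite polynomials psi_k and their norms E[psi_k^2] = k!
psi = [lambda x: 1.0, lambda x: x, lambda x: x * x - 1.0]
norms = [1.0, 1.0, 2.0]

# Non-intrusive projection: c_k = E[g(xi) * psi_k(xi)] / E[psi_k^2],
# estimated by sampling many realizations of the deterministic model.
n = 100000
samples = [rng.gauss(0.0, 1.0) for _ in range(n)]
coeffs = [sum(g(x) * p(x) for x in samples) / (n * nk)
          for p, nk in zip(psi, norms)]
# For this model the exact mode strengths are c_k = a**k * exp(a*a/2) / k!
```

The intrusive route would instead substitute the PC expansion into the governing equations and Galerkin-project, yielding coupled equations for the same coefficients without sampling.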

  1. Deconvolution of variability and uncertainty in the Cassini safety analysis

    International Nuclear Information System (INIS)

    The standard method for propagation of uncertainty in a risk analysis requires rerunning the risk calculation numerous times with model parameters chosen from their uncertainty distributions. This was not practical for the Cassini nuclear safety analysis, due to the computationally intense nature of the risk calculation. A less computationally intense procedure was developed which requires only two calculations for each accident case. The first of these is the standard "best-estimate" calculation. In the second calculation, variables and parameters change simultaneously. The mathematical technique of deconvolution is then used to separate out an uncertainty multiplier distribution, which can be used to calculate distribution functions at various levels of confidence. copyright 1998 American Institute of Physics

  2. Advanced Concepts in Fuzzy Logic and Systems with Membership Uncertainty

    CERN Document Server

    Starczewski, Janusz T

    2013-01-01

    This book generalizes fuzzy logic systems for different types of uncertainty, including semantic ambiguity resulting from limited perception or lack of knowledge about exact membership functions; lack of attributes or granularity arising from discretization of real data; imprecise description of membership functions; and vagueness perceived as fuzzification of conditional attributes. Consequently, the membership uncertainty can be modeled by combining methods of conventional and type-2 fuzzy logic, rough set theory and possibility theory. In particular, this book provides a number of formulae for implementing the operation extended on fuzzy-valued fuzzy sets and presents some basic structures of generalized uncertain fuzzy logic systems, as well as introducing several methods to generate fuzzy membership uncertainty. It is suitable as a reference book for undergraduates, master's and doctoral students in courses on computer science, computational intelligence, or...

  3. Uncertainties in land use data

    Directory of Open Access Journals (Sweden)

    G. Castilla

    2007-11-01

    This paper deals with the description and assessment of uncertainties in land use data derived from Remote Sensing observations, in the context of hydrological studies. Land use is a categorical regionalised variable reporting the main socio-economic role each location has, where the role is inferred from the pattern of occupation of land. The properties of this pattern that are relevant to hydrological processes have to be known with some accuracy in order to obtain reliable results; hence, uncertainty in land use data may lead to uncertainty in model predictions. There are two main uncertainties surrounding land use data, positional and categorical. The first one is briefly addressed and the second one is explored in more depth, including the factors that influence it. We (1) argue that the conventional method used to assess categorical uncertainty, the confusion matrix, is insufficient to propagate uncertainty through distributed hydrologic models; (2) report some alternative methods to tackle this and other insufficiencies; (3) stress the role of metadata as a more reliable means to assess the degree of distrust with which these data should be used; and (4) suggest some practical recommendations.
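The confusion matrix that the authors critique summarizes categorical accuracy as a handful of scalar scores (a hypothetical three-class example; rows are mapped classes, columns the reference classes, and the class names are invented):

```python
# Hypothetical error matrix for three land-use classes.
cm = [[45,  4,  1],    # mapped "urban"
      [ 6, 38,  6],    # mapped "agriculture"
      [ 2,  5, 43]]    # mapped "forest"

total = sum(sum(row) for row in cm)
overall_acc = sum(cm[i][i] for i in range(3)) / total
# User's accuracy: of the cells mapped as class i, how many are correct?
users_acc = [cm[i][i] / sum(cm[i]) for i in range(3)]
# Producer's accuracy: of the reference cells of class i, how many found?
producers_acc = [cm[i][i] / sum(cm[j][i] for j in range(3)) for i in range(3)]
```

These scalar summaries carry no spatial structure, which is precisely why the paper argues they are insufficient for propagating categorical uncertainty through a distributed hydrologic model.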

  4. Uncertainty and Its Description in Decision Models

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    Various perspectives in uncertainty are briefly summarized in this paper, and the classification of uncertainties is discussed. The descriptions of different classes of uncertainty are also presented. Furthermore, a risk representation model for decision-making analysis is provided.

  5. A Bayesian approach to model uncertainty

    International Nuclear Information System (INIS)

    A Bayesian approach to model uncertainty is taken. For the case of a finite number of alternative models, the model uncertainty is equivalent to parameter uncertainty. A derivation based on Savage's partition problem is given
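For a finite model set, the stated equivalence to parameter uncertainty means each candidate model receives a posterior weight from Bayes' rule; a minimal sketch with two hypothetical models, Gaussian predictive errors, and a single observation (all numbers invented):

```python
import math

def normal_pdf(x, mu, sd):
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

# Two candidate models with prior weights and Gaussian predictive errors.
models = {
    "M1": {"prior": 0.5, "pred": 10.0, "sd": 1.0},
    "M2": {"prior": 0.5, "pred": 12.0, "sd": 1.0},
}
observed = 10.5

# Posterior model probability: prior times likelihood, normalized.
unnorm = {k: m["prior"] * normal_pdf(observed, m["pred"], m["sd"])
          for k, m in models.items()}
z = sum(unnorm.values())
posterior = {k: v / z for k, v in unnorm.items()}
# Model-averaged prediction weights each model by its posterior probability.
bma_pred = sum(posterior[k] * models[k]["pred"] for k in models)
```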

  6. [DNA computing].

    Science.gov (United States)

    Błasiak, Janusz; Krasiński, Tadeusz; Popławski, Tomasz; Sakowski, Sebastian

    2011-01-01

    Biocomputers may be an alternative to traditional "silicon-based" computers, whose continued development may be limited by further miniaturization (imposed by the Heisenberg uncertainty principle) and by the growing amount of information exchanged between the central processing unit and the main memory (the von Neumann bottleneck). The idea of DNA computing came true for the first time in 1994, when Adleman solved the Hamiltonian Path Problem using short DNA oligomers and DNA ligase. In the early 2000s a series of biocomputer models was presented, including the seminal work of Shapiro and his colleagues, who presented a molecular two-state finite automaton in which the restriction enzyme FokI constituted the hardware and short DNA oligomers served as software as well as input/output signals. DNA molecules also provided the energy for this machine. DNA computing can be exploited in many applications, from studies of gene expression patterns to the diagnosis and therapy of cancer. Research on DNA computing is still in progress both in vitro and in vivo, and the promising results of this research give hope for a breakthrough in computer science. PMID:21735816

  7. Applications of the TSUNAMI sensitivity and uncertainty analysis methodology

    International Nuclear Information System (INIS)

    The TSUNAMI sensitivity and uncertainty analysis tools under development for the SCALE code system have recently been applied in four criticality safety studies. TSUNAMI is used to identify applicable benchmark experiments for criticality code validation, assist in the design of new critical experiments for a particular need, reevaluate previously computed computational biases, and assess the validation coverage and propose a penalty for noncoverage for a specific application. (author)

  8. Numerical solution of dynamic equilibrium models under Poisson uncertainty

    DEFF Research Database (Denmark)

    Posch, Olaf; Trimborn, Timo

    2013-01-01

    We propose a simple and powerful numerical algorithm to compute the transition process in continuous-time dynamic equilibrium models with rare events. In this paper we transform the dynamic system of stochastic differential equations into a system of functional differential equations of the retar...... solution to Lucas' endogenous growth model under Poisson uncertainty are used to compute the exact numerical error. We show how (potential) catastrophic events such as rare natural disasters substantially affect the economic decisions of households....
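The kind of rare-event dynamics involved can be illustrated with a simple simulation (this is not the authors' algorithm, which solves the equilibrium system of functional differential equations): a growth path interrupted by Poisson-timed disasters, with invented rates.

```python
import random

def poisson_path(rng, t_end=50.0, dt=0.01, growth=0.02,
                 jump_rate=0.05, jump_loss=0.3, x0=1.0):
    """Euler path of dX = growth*X dt, interrupted by rare events:
    with intensity jump_rate per unit time, X loses fraction jump_loss."""
    x, n_jumps = x0, 0
    for _ in range(int(t_end / dt)):
        x *= 1.0 + growth * dt
        if rng.random() < jump_rate * dt:   # a disaster arrives this step
            x *= 1.0 - jump_loss
            n_jumps += 1
    return x, n_jumps
```

Averaged over many paths, the jump count approaches jump_rate * t_end (2.5 here), and the mean level is depressed relative to the disaster-free path, which is how rare catastrophic events feed back into household decisions in such models.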

  9. Cognitive ability and the effect of strategic uncertainty

    OpenAIRE

    Hanaki, Nobuyuki; Jacquemet, Nicolas; Luchini, Stéphane; Zylbersztejn, Adam

    2014-01-01

    How is one's cognitive ability related to the way one responds to strategic uncertainty? We address this question by conducting a set of experiments in simple 2 × 2 dominance solvable coordination games. Our experiments involve two main treatments: one in which two human subjects interact, and another in which one human subject interacts with a computer program whose behavior is known. By making the behavior of the computer perfectly predictable, the latter treatment eliminates strategic unce...

  10. Cognitive ability and the effect of strategic uncertainty

    OpenAIRE

    Hanaki, Nobuyuki; Jacquemet, Nicolas; Luchini, Stéphane; Zylbersztejn, Adam

    2016-01-01

    How is one's cognitive ability related to the way one responds to strategic uncertainty? We address this question by conducting a set of experiments in simple 2 × 2 dominance solvable coordination games. Our experiments involve two main treatments: one in which two human subjects interact, and another in which one human subject interacts with a computer program whose behavior is known. By making the behavior of the computer perfectly predictable, the latter treatment...

  11. Cognitive ability and the effect of strategic uncertainty

    OpenAIRE

    Hanaki, Nobuyuki; Jacquemet, Nicolas; Luchini, Stéphane; Zylbersztejn, Adam

    2015-01-01

    How is one's cognitive ability related to the way one responds to strategic uncertainty? We address this question by conducting a set of experiments in simple 2 x 2 dominance solvable coordination games. Our experiments involve two main treatments: one in which two human subjects interact, and another in which one human subject interacts with a computer program whose behavior is known. By making the behavior of the computer perfectly predictable, the latter treatment eliminates strategic unce...

  12. Cognitive Ability and the Effect of Strategic Uncertainty

    OpenAIRE

    Nobuyuki Hanaki; Nicolas Jacquemet; Stéphane Luchini; Adam Zylberstejn

    2014-01-01

    How is one’s cognitive ability related to the way one responds to strategic uncertainty? We address this question by conducting a set of experiments in simple 2 x 2 dominance solvable coordination games. Our experiments involve two main treatments: one in which two human subjects interact, and another in which one human subject interacts with a computer program whose behavior is known. By making the behavior of the computer perfectly predictable, the latter treatment eliminates strategic unce...

  13. Statistical Modeling of Uncertainty in Photovoltaic Energy Forecasts

    Czech Academy of Sciences Publication Activity Database

    Brabec, Marek; Pelikán, Emil; Krč, Pavel; Eben, Kryštof; Malý, Marek; Resler, Jaroslav; Juruš, Pavel

    Prague: Institute of Computer Science AS CR, 2012, s. 1-51. [CCECE 2012. IEEE Canadian Conference on Electrical Engineering /25./. REFDE Workshop. Quebec (CA), 29.04.2012] Institutional research plan: CEZ:AV0Z10300504 Keywords : uncertainty modeling * semiparametric models * quantile regression Subject RIV: JE - Non-nuclear Energetics, Energy Consumption ; Use

  14. Axial power monitoring uncertainty in the Savannah River Reactors

    International Nuclear Information System (INIS)

    The results of this analysis quantified the uncertainty associated with monitoring the Axial Power Shape (APS) in the Savannah River Reactors. Thermocouples at each assembly flow exit map the radial power distribution and are the primary means of monitoring power in these reactors. The remaining uncertainty in power monitoring is associated with the relative axial power distribution. The APS is monitored by seven sensors that respond to power on each of nine vertical Axial Power Monitor (APM) rods. Computation of the APS uncertainty, for the reactor power limits analysis, started with a large database of APM rod measurements spanning several years of reactor operation. A computer algorithm was used to randomly select a sample of APSs, which were input to a code that modeled the thermal-hydraulic performance of a single fuel assembly during a design-basis Loss-of-Coolant Accident. The assembly power limit at Onset of Significant Voiding was computed for each APS. The output was a distribution of expected assembly power limits, which was adjusted to account for the biases caused by instrumentation error and by measuring 7 points rather than a continuous APS. Statistical analysis of the final assembly power limit distribution showed that reducing reactor power by approximately 3% was sufficient to account for APS variation. These data confirmed expectations that the assembly exit thermocouples provide all information needed for monitoring core power. The computational analysis results also quantified the contribution to power limits of the various uncertainties, such as instrumentation error.
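    The sampling-and-percentile logic described in this abstract can be sketched in a few lines. Everything below is a toy stand-in: the shape database, the `power_limit` model, and the 5th-percentile choice are invented placeholders, not the Savannah River code or data.

```python
import random
import statistics

def power_limit(shape):
    # Hypothetical stand-in for the thermal-hydraulic code: more peaked
    # axial shapes (higher max-to-mean ratio) give a lower power limit.
    peak = max(shape) / (sum(shape) / len(shape))
    return 100.0 / peak  # arbitrary units

random.seed(1)

# Synthetic database of 7-point axial power shapes, standing in for
# years of APM rod measurements; randomly sample shapes as the abstract
# describes, and compute the limit for each.
database = [[1.0 + 0.3 * random.random() for _ in range(7)] for _ in range(500)]
limits = sorted(power_limit(s) for s in random.sample(database, 200))

# A conservative operating limit covers most sampled shapes, e.g. 95%.
p5 = limits[int(0.05 * len(limits))]
margin = 1.0 - p5 / statistics.mean(limits)
print(f"5th-percentile limit: {p5:.1f}, margin below the mean: {margin:.1%}")
```

The margin printed at the end plays the role of the roughly 3% power reduction quoted in the abstract.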

  15. Pseudoharmonic oscillator in quantum mechanics with a generalized uncertainty principle

    CERN Document Server

    Boukhellout, Abdelmalek

    2013-01-01

    The pseudoharmonic oscillator potential is studied in quantum mechanics with a generalized uncertainty relation characterized by the existence of a minimal length. Using the perturbative approach of Brau, we compute the correction to the energy spectrum to first order in the minimal-length parameter β. The effect of the minimal length on the vibration-rotation of diatomic molecules is discussed.
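    For context, the minimal-length scenario invoked here is commonly modeled by a deformed commutator; the relations below are a standard gloss on that framework (the abstract itself does not state them):

```latex
[\hat{x},\hat{p}] = i\hbar\left(1 + \beta \hat{p}^2\right)
\quad\Longrightarrow\quad
\Delta x\,\Delta p \ \ge\ \frac{\hbar}{2}\left(1 + \beta(\Delta p)^2\right),
```

which implies a minimal observable length of order $\hbar\sqrt{\beta}$.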

  16. Bounds in the generalized Weber problem under locational uncertainty

    DEFF Research Database (Denmark)

    Juel, Henrik

    1981-01-01

    An existing analysis of the bounds on the Weber problem solution under uncertainty is incorrect. For the generalized problem with arbitrary measures of distance, we give easily computable ranges on the bounds and state the conditions under which the exact values of the bounds can be found with...

  18. Propagation of interval and probabilistic uncertainty in cyberinfrastructure-related data processing and data fusion

    CERN Document Server

    Servin, Christian

    2015-01-01

    On various examples ranging from geosciences to environmental sciences, this book explains how to generate an adequate description of uncertainty, how to justify semiheuristic algorithms for processing uncertainty, and how to make these algorithms more computationally efficient. It explains in what sense the existing approach to uncertainty as a combination of random and systematic components is only an approximation, presents a more adequate three-component model with an additional periodic error component, and explains how uncertainty propagation techniques can be extended to this model. The book provides a justification for a practically efficient heuristic technique (based on fuzzy decision-making). It explains how the computational complexity of uncertainty processing can be reduced. The book also shows how to take into account that in real life, the information about uncertainty is often only partially known, and, on several practical examples, explains how to extract the missing information about uncer...

  19. Power system transient stability simulation under uncertainty based on Taylor model arithmetic

    Institute of Scientific and Technical Information of China (English)

    Shouxiang WANG; Zhijie ZHENG; Chengshan WANG

    2009-01-01

    Taylor model arithmetic is introduced to deal with uncertainty. The uncertainty of model parameters is described by Taylor models, and each variable in functions is replaced with its Taylor model (TM). Thus, time domain simulation under uncertainty is transformed into the integration of TM-based differential equations. In this paper, the Taylor series method is employed to compute the differential equations; moreover, power system time domain simulation under uncertainty based on the Taylor model method is presented. This method allows a rigorous estimation of the influence of either form of uncertainty and needs only one simulation. It is computationally fast compared with the Monte Carlo method, which is another technique for uncertainty analysis. The proposed method has been tested on the 39-bus New England system. The test results illustrate the effectiveness and practical value of the approach by comparison with the results of Monte Carlo simulation and traditional time domain simulation.
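    A drastically simplified caricature of the single-simulation idea: propagate an uncertain parameter through a differential equation using the nominal trajectory plus a first-order sensitivity term, then compare the resulting enclosure against many Monte Carlo runs. This is not the rigorous Taylor model arithmetic with remainder bounds used in the paper; the scalar test system and all constants are assumptions for illustration.

```python
import random

# Toy system dx/dt = -k*x with uncertain k in [k0 - dk, k0 + dk].
k0, dk, x0, dt, steps = 0.5, 0.05, 1.0, 0.01, 200

# One forward-Euler simulation carrying both the nominal state x and the
# sensitivity s = dx/dk (governed by ds/dt = -x - k0*s).
x, s = x0, 0.0
for _ in range(steps):
    x, s = x + dt * (-k0 * x), s + dt * (-x - k0 * s)
lo, hi = x - abs(s) * dk, x + abs(s) * dk  # first-order enclosure

# Monte Carlo over k for comparison: many simulations instead of one.
random.seed(0)
samples = []
for _ in range(300):
    k = random.uniform(k0 - dk, k0 + dk)
    xm = x0
    for _ in range(steps):
        xm += dt * (-k * xm)
    samples.append(xm)

print(f"first-order enclosure [{lo:.4f}, {hi:.4f}] vs "
      f"MC range [{min(samples):.4f}, {max(samples):.4f}]")
```

The single-simulation enclosure tracks the Monte Carlo spread at a fraction of the cost, which is the efficiency argument the abstract makes.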

  20. Determination of a PWR key neutron parameters uncertainties and conformity studies applications

    International Nuclear Information System (INIS)

    The aim of this thesis was to evaluate the uncertainties of key neutron parameters of slab reactors. These uncertainties have many origins: technological, for fabrication parameters, and physical, for nuclear data. First, each contribution to the uncertainty is calculated; finally, an uncertainty factor is associated with each key slab parameter, such as reactivity, isothermal reactivity coefficient, control rod efficiency, power form factor before irradiation, and lifetime. These uncertainty factors were computed by Generalized Perturbation Theory in the case of step 0 and by direct calculations in the case of irradiation problems. One application of neutronics conformity concerned adjusting the target precision of fabrication parameters and nuclear data. Statistical (uncertainties) and deterministic (deviations) approaches were studied. The uncertainties of the key slab neutron parameters were thereby reduced and nuclear performance optimised. (author)

  1. Total Measurement Uncertainty for the Plutonium Finishing Plant (PFP) Segmented Gamma Scan Assay System

    CERN Document Server

    Fazzari, D M

    2001-01-01

    This report presents the results of an evaluation of the Total Measurement Uncertainty (TMU) for the Canberra manufactured Segmented Gamma Scanner Assay System (SGSAS) as employed at the Hanford Plutonium Finishing Plant (PFP). In this document, TMU embodies the combined uncertainties due to all of the individual random and systematic sources of measurement uncertainty. It includes uncertainties arising from corrections and factors applied to the analysis of transuranic waste to compensate for inhomogeneities and interferences from the waste matrix and radioactive components. These include uncertainty components for any assumptions contained in the calibration of the system or computation of the data. Uncertainties are propagated at 1 sigma. The final total measurement uncertainty value is reported at the 95% confidence level. The SGSAS is a gamma assay system that is used to assay plutonium and uranium waste. The SGSAS system can be used in a stand-alone mode to perform the NDA characterization of a containe...

  2. RUMINATIONS ON NDA MEASUREMENT UNCERTAINTY COMPARED TO DA UNCERTAINTY

    Energy Technology Data Exchange (ETDEWEB)

    Salaymeh, S.; Ashley, W.; Jeffcoat, R.

    2010-06-17

    It is difficult to overestimate the importance that physical measurements performed with nondestructive assay instruments play throughout the nuclear fuel cycle. They underpin decision making in many areas and support: criticality safety, radiation protection, process control, safeguards, facility compliance, and waste measurements. No physical measurement is complete, or indeed meaningful, without a defensible and appropriate accompanying statement of uncertainties and how they combine to define the confidence in the results. The uncertainty budget should also be broken down in sufficient detail for the subsequent uses to which the nondestructive assay (NDA) results will be applied. Creating an uncertainty budget and estimating the total measurement uncertainty can often be an involved process, especially for non-routine situations, because data interpretation often involves complex algorithms and logic combined in a highly intertwined way. The methods often call on a multitude of input data subject to human oversight. These characteristics can be confusing and pose a barrier to developing understanding between experts and data consumers. ASTM subcommittee C26-10 recognized this problem in the context of how to summarize and express precision and bias performance across the range of standards and guides it maintains. In order to create a unified approach consistent with modern practice and embracing the continuous-improvement philosophy, a consensus arose to prepare a procedure covering the estimation and reporting of uncertainties in nondestructive assay of nuclear materials. This paper outlines the needs analysis, objectives, and ongoing development efforts. In addition to emphasizing some of the unique challenges and opportunities facing the NDA community, we hope this article will encourage dialog and sharing of best practice, and furthermore motivate developers to revisit the treatment of measurement uncertainty.

  4. Critical loads - assessment of uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Barkman, A.

    1998-10-01

    The effects of data uncertainty in applications of the critical loads concept were investigated at different spatial resolutions in Sweden and the northern Czech Republic. Critical loads of acidity (CL) were calculated for Sweden using the biogeochemical model PROFILE. Three methods with different structural complexity were used to estimate the adverse effects of SO2 concentrations in the northern Czech Republic. Data uncertainties in the calculated critical loads/levels and exceedances (EX) were assessed using Monte Carlo simulations. Uncertainties within cumulative distribution functions (CDF) were aggregated by accounting for the overlap between site-specific confidence intervals. Aggregation of data uncertainties within CDFs resulted in lower CL and higher EX best estimates in comparison with percentiles represented by individual sites. Data uncertainties were consequently found to advocate larger deposition reductions to achieve non-exceedance based on low critical loads estimates at 150 x 150 km resolution. Input data were found to impair the level of differentiation between geographical units at all investigated resolutions. Aggregation of data uncertainty within CDFs involved more constrained confidence intervals for a given percentile. Differentiation as well as identification of grid cells at 150 x 150 km resolution subject to EX was generally improved. Calculation of the probability of EX was shown to preserve the possibility to differentiate between geographical units. Re-aggregation of the 95%-ile EX at 50 x 50 km resolution generally increased the confidence interval for each percentile. Significant relationships were found between forest decline and the three methods addressing risks induced by SO2 concentrations. Modifying SO2 concentrations by accounting for the length of the vegetation period was found to constitute the most useful trade-off between structural complexity, data availability, and effects of data uncertainty.

  5. Analyses of the effect of parameter uncertainty in NPP design on the computation of floor response spectra

    Institute of Scientific and Technical Information of China (English)

    李建波; 林皋; 钟红; 胡志强

    2011-01-01

    Floor response spectra are the basis for the seismic design of structures and equipment in a nuclear power plant (NPP). In current NPP design practice, deterministic methods are generally employed to compute floor spectra, in which a plausible yet conservative estimate of loadings and structural parameters has to be chosen, yielding results that are difficult to relate reliably to the complicated real situation. To address this problem, a probabilistic and statistical method based on Monte Carlo simulation is employed to analyze the floor response spectra of this complicated stochastic system, building on dynamic interaction analyses. This procedure accommodates irregular probability density functions of the physical parameters and yields floor spectra at different confidence levels. Numerical examples show that the model is reasonable and that the results can be used to evaluate quantitatively the confidence level of conventional deterministic analyses, providing a better understanding of the safety of NPP structures designed by the conventional deterministic method.

  6. Uncertainty modeling and decision support

    International Nuclear Information System (INIS)

    We first formulate the problem of decision making under uncertainty. The importance of the representation of our knowledge about the uncertainty in formulating a decision process is pointed out. We begin with a brief discussion of the case of probabilistic uncertainty. Next, in considerable detail, we discuss the case of decision making under ignorance. For this case the fundamental role of the attitude of the decision maker is noted and its subjective nature is emphasized. Next the case in which a Dempster-Shafer belief structure is used to model our knowledge of the uncertainty is considered. Here we also emphasize the subjective choices the decision maker must make in formulating a decision function. The case in which the uncertainty is represented by a fuzzy measure (monotonic set function) is then investigated. We then return to the Dempster-Shafer belief structure and show its relationship to the fuzzy measure. This relationship allows us to get a deeper understanding of the formulation of the decision function used in the Dempster-Shafer framework. We discuss how this deeper understanding allows a decision analyst to better make the subjective choices needed in the formulation of the decision function.

  7. Entropic uncertainty and measurement reversibility

    Science.gov (United States)

    Berta, Mario; Wehner, Stephanie; Wilde, Mark M.

    2016-07-01

    The entropic uncertainty relation with quantum side information (EUR-QSI) from (Berta et al 2010 Nat. Phys. 6 659) is a unifying principle relating two distinctive features of quantum mechanics: quantum uncertainty due to measurement incompatibility, and entanglement. In these relations, quantum uncertainty takes the form of preparation uncertainty where one of two incompatible measurements is applied. In particular, the ‘uncertainty witness’ lower bound in the EUR-QSI is not a function of a post-measurement state. An insightful proof of the EUR-QSI from (Coles et al 2012 Phys. Rev. Lett. 108 210405) makes use of a fundamental mathematical consequence of the postulates of quantum mechanics known as the non-increase of quantum relative entropy under quantum channels. Here, we exploit this perspective to establish a tightening of the EUR-QSI which adds a new state-dependent term in the lower bound, related to how well one can reverse the action of a quantum measurement. As such, this new term is a direct function of the post-measurement state and can be thought of as quantifying how much disturbance a given measurement causes. Our result thus quantitatively unifies this feature of quantum mechanics with the others mentioned above. We have experimentally tested our theoretical predictions on the IBM quantum experience and find reasonable agreement between our predictions and experimental outcomes.
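    For reference, the EUR-QSI of Berta et al (2010), which this paper tightens with a state-dependent reversibility term, can be paraphrased as follows (our gloss, not text from the abstract):

```latex
H(X|B) + H(Z|B) \ \ge\ \log_2 \frac{1}{c} + H(A|B),
\qquad
c \equiv \max_{x,z} \left|\langle \psi_x | \phi_z \rangle\right|^2 ,
```

where $X$ and $Z$ are incompatible measurements on system $A$, $B$ is the quantum memory, and $c$ quantifies the overlap between the two measurement bases.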

  8. Extending BEAMS to incorporate correlated systematic uncertainties

    International Nuclear Information System (INIS)

    New supernova surveys such as the Dark Energy Survey, Pan-STARRS and the LSST will produce an unprecedented number of photometric supernova candidates, most with no spectroscopic data. Avoiding biases in cosmological parameters due to the resulting inevitable contamination from non-Ia supernovae can be achieved with the BEAMS formalism, allowing for fully photometric supernova cosmology studies. Here we extend BEAMS to deal with the case in which the supernovae are correlated by systematic uncertainties. The analytical form of the full BEAMS posterior requires evaluating 2^N terms, where N is the number of supernova candidates. This 'exponential catastrophe' is computationally infeasible even for N of order 100. We circumvent the exponential catastrophe by marginalising numerically instead of analytically over the possible supernova types: we augment the cosmological parameters with nuisance parameters describing the covariance matrix and the types of all the supernovae, τ_i, that we include in our MCMC analysis. We show that this method deals well even with large, unknown systematic uncertainties without a major increase in computational time, whereas ignoring the correlations can lead to significant biases and incorrect credible contours. We then compare the numerical marginalisation technique with a perturbative expansion of the posterior based on the insight that future surveys will have exquisite light curves and hence the probability that a given candidate is a Type Ia will be close to unity or zero for most objects. Although this perturbative approach changes computation of the posterior from a 2^N problem into an N^2 or N^3 one, we show that it leads to biases in general through a small number of misclassifications, implying that numerical marginalisation is superior.

  9. Modeling multibody systems with uncertainties. Part II: Numerical applications

    International Nuclear Information System (INIS)

    This study applies generalized polynomial chaos theory to model complex nonlinear multibody dynamic systems operating in the presence of parametric and external uncertainty. Theoretical and computational aspects of this methodology are discussed in the companion paper 'Modeling Multibody Dynamic Systems With Uncertainties. Part I: Theoretical and Computational Aspects'. In this paper we illustrate the methodology on selected test cases. The combined effects of parametric and forcing uncertainties are studied for a quarter-car model. The uncertainty distributions in the system response in both time and frequency domains are validated against Monte Carlo simulations. Results indicate that polynomial chaos is more efficient than Monte Carlo and more accurate than statistical linearization. The results of the direct collocation approach are similar to the ones obtained with the Galerkin approach. A stochastic terrain model is constructed using a truncated Karhunen-Loève expansion. The application of polynomial chaos to differential-algebraic systems is illustrated using the constrained pendulum problem. Limitations of the polynomial chaos approach are studied on two different test problems, one with multiple attractor points, and the second with a chaotic evolution and a nonlinear attractor set. The overall conclusion is that, despite its limitations, generalized polynomial chaos is a powerful approach for the simulation of multibody dynamic systems with uncertainties.
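    To make the projection step concrete, here is a minimal one-dimensional sketch of spectral projection onto probabilists' Hermite polynomials for a smooth response of a single standard-normal parameter. The response function and the degree-2 truncation are illustrative assumptions, unrelated to the quarter-car model in the paper.

```python
import math

# Hypothetical smooth scalar response y = f(xi), xi ~ N(0, 1).
def f(xi):
    return math.exp(0.3 * xi)

# 3-point Gauss-Hermite rule (probabilists' convention: weight exp(-x^2/2)).
nodes = [0.0, math.sqrt(3.0), -math.sqrt(3.0)]
weights = [2.0 / 3.0, 1.0 / 6.0, 1.0 / 6.0]
He = [lambda x: 1.0, lambda x: x, lambda x: x * x - 1.0]  # He0, He1, He2
norms = [1.0, 1.0, 2.0]                                    # E[He_n(xi)^2] = n!

# Spectral (Galerkin) projection: c_n = E[f(xi) He_n(xi)] / n!
c = [sum(w * f(x) * He[n](x) for x, w in zip(nodes, weights)) / norms[n]
     for n in range(3)]

# Moments come directly from the chaos coefficients.
mean = c[0]
var = c[1] ** 2 * norms[1] + c[2] ** 2 * norms[2]
print(f"gPC mean={mean:.4f} (exact {math.exp(0.045):.4f}), var={var:.4f}")
```

Three deterministic model evaluations recover the mean and variance to better than a percent here, which illustrates why gPC can beat Monte Carlo for smooth responses.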

  10. Uncertainty Estimation Improves Energy Measurement and Verification Procedures

    Energy Technology Data Exchange (ETDEWEB)

    Walter, Travis; Price, Phillip N.; Sohn, Michael D.

    2014-05-14

    Implementing energy conservation measures in buildings can reduce energy costs and environmental impacts, but such measures cost money to implement so intelligent investment strategies require the ability to quantify the energy savings by comparing actual energy used to how much energy would have been used in absence of the conservation measures (known as the baseline energy use). Methods exist for predicting baseline energy use, but a limitation of most statistical methods reported in the literature is inadequate quantification of the uncertainty in baseline energy use predictions. However, estimation of uncertainty is essential for weighing the risks of investing in retrofits. Most commercial buildings have, or soon will have, electricity meters capable of providing data at short time intervals. These data provide new opportunities to quantify uncertainty in baseline predictions, and to do so after shorter measurement durations than are traditionally used. In this paper, we show that uncertainty estimation provides greater measurement and verification (M&V) information and helps to overcome some of the difficulties with deciding how much data is needed to develop baseline models and to confirm energy savings. We also show that cross-validation is an effective method for computing uncertainty. In so doing, we extend a simple regression-based method of predicting energy use using short-interval meter data. We demonstrate the methods by predicting energy use in 17 real commercial buildings. We discuss the benefits of uncertainty estimates which can provide actionable decision making information for investing in energy conservation measures.
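    The cross-validation idea advocated here can be illustrated compactly: fit the baseline model on part of the data and collect out-of-fold residuals, whose spread estimates prediction uncertainty. The linear temperature model and the synthetic meter data below are invented placeholders, not the authors' method or buildings.

```python
import random
import statistics

random.seed(42)

# Synthetic short-interval meter data: energy vs. outdoor temperature.
temp = [random.uniform(0, 30) for _ in range(120)]
energy = [50 + 2.5 * t + random.gauss(0, 4) for t in temp]

def fit(xs, ys):
    # Ordinary least squares for a degree-1 baseline model.
    mx, my = statistics.mean(xs), statistics.mean(ys)
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

# k-fold cross-validation: predict each point with a model that never saw it.
k = 5
residuals = []
for fold in range(k):
    test_idx = set(range(fold, len(temp), k))
    xtr = [t for i, t in enumerate(temp) if i not in test_idx]
    ytr = [e for i, e in enumerate(energy) if i not in test_idx]
    a, b = fit(xtr, ytr)
    residuals += [energy[i] - (a + b * temp[i]) for i in test_idx]

# Out-of-fold RMSE is an honest estimate of baseline-prediction uncertainty.
rmse = (sum(r * r for r in residuals) / len(residuals)) ** 0.5
print(f"cross-validated prediction RMSE ~ {rmse:.2f} energy units")
```

Unlike the in-sample fit error, this RMSE does not shrink artificially as the model flexibility grows, which is what makes it usable for weighing retrofit investment risk.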

  11. Planning for robust reserve networks using uncertainty analysis

    Science.gov (United States)

    Moilanen, A.; Runge, M.C.; Elith, J.; Tyre, A.; Carmel, Y.; Fegraus, E.; Wintle, B.A.; Burgman, M.; Ben-Haim, Y.

    2006-01-01

    Planning land-use for biodiversity conservation frequently involves computer-assisted reserve selection algorithms. Typically such algorithms operate on matrices of species presence/absence in sites, or on species-specific distributions of model-predicted probabilities of occurrence in grid cells. There are practically always errors in input data: erroneous species presence/absence data, structural and parametric uncertainty in predictive habitat models, and lack of correspondence between temporal presence and long-run persistence. Despite these uncertainties, typical reserve selection methods proceed as if there were no uncertainty in the data or models. Having two conservation options of apparently equal biological value, one would prefer the option whose value is relatively insensitive to errors in planning inputs. In this work we show how uncertainty analysis for reserve planning can be implemented within a framework of information-gap decision theory, generating reserve designs that are robust to uncertainty. Consideration of uncertainty involves modifications to the typical objective functions used in reserve selection. Search for robust-optimal reserve structures can still be implemented via typical reserve selection optimization techniques, including stepwise heuristics, integer programming and stochastic global search.

  12. Davis-Besse uncertainty study

    International Nuclear Information System (INIS)

    The uncertainties of calculations of loss-of-feedwater transients at Davis-Besse Unit 1 were determined to address concerns of the US Nuclear Regulatory Commission relative to the effectiveness of feed and bleed cooling. Davis-Besse Unit 1 is a pressurized water reactor of the raised-loop Babcock and Wilcox design. A detailed, quality-assured RELAP5/MOD2 model of Davis-Besse was developed at the Idaho National Engineering Laboratory. The model was used to perform an analysis of the loss-of-feedwater transient that occurred at Davis-Besse on June 9, 1985. A loss-of-feedwater transient followed by feed and bleed cooling was also calculated. The evaluation of uncertainty was based on the comparisons of calculations and data, comparisons of different calculations of the same transient, sensitivity calculations, and the propagation of the estimated uncertainty in initial and boundary conditions to the final calculated results

  13. Decommissioning Funding: Ethics, Implementation, Uncertainties

    International Nuclear Information System (INIS)

    This status report on decommissioning funding: ethics, implementation, uncertainties is based on a review of recent literature and materials presented at NEA meetings in 2003 and 2004, and particularly at a topical session organised in November 2004 on funding issues associated with the decommissioning of nuclear power facilities. The report also draws on the experience of the NEA Working Party on Decommissioning and Dismantling (WPDD). This report offers, in a concise form, an overview of relevant considerations on decommissioning funding mechanisms with regard to ethics, implementation and uncertainties. Underlying ethical principles found in international agreements are identified, and factors influencing the accumulation and management of funds for decommissioning nuclear facilities are discussed together with the main sources of uncertainties of funding systems

  14. Uncertainty and Sensitivity Analyses Plan

    Energy Technology Data Exchange (ETDEWEB)

    Simpson, J.C.; Ramsdell, J.V. Jr.

    1993-04-01

    Hanford Environmental Dose Reconstruction (HEDR) Project staff are developing mathematical models to be used to estimate the radiation dose that individuals may have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. An uncertainty and sensitivity analyses plan is essential to understand and interpret the predictions from these mathematical models. This is especially true in the case of the HEDR models where the values of many parameters are unknown. This plan gives a thorough documentation of the uncertainty and hierarchical sensitivity analysis methods recommended for use on all HEDR mathematical models. The documentation includes both technical definitions and examples. In addition, an extensive demonstration of the uncertainty and sensitivity analysis process is provided using actual results from the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC). This demonstration shows how the approaches used in the recommended plan can be adapted for all dose predictions in the HEDR Project.
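    As a generic illustration of the kind of combined uncertainty and sensitivity analysis such a plan prescribes (the toy dose model and parameter ranges below are assumptions for illustration, not HEDR models), one can propagate inputs by Monte Carlo and rank them by output-input correlation:

```python
import random

random.seed(7)

# Toy dose model: dose increases with release and dispersion, decreases
# with shielding. Purely illustrative.
def dose(release, dispersion, shielding):
    return release * dispersion / shielding

names = ["release", "dispersion", "shielding"]

# Monte Carlo propagation of uniform input uncertainty.
samples = [[random.uniform(0.5, 1.5) for _ in range(3)] for _ in range(2000)]
outputs = [dose(*s) for s in samples]

def corr(xs, ys):
    # Pearson correlation, computed directly for portability.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

# Simple sensitivity ranking: which input correlates most with the output?
sens = {name: corr([s[i] for s in samples], outputs)
        for i, name in enumerate(names)}
print({k: round(v, 2) for k, v in sens.items()})
```

The sign and magnitude of each correlation identify which parameters drive the output uncertainty, the basic question a hierarchical sensitivity analysis answers.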

  15. Uncertainty formulations for multislit interferometry

    Science.gov (United States)

    Biniok, Johannes C. G.

    2014-12-01

    In the context of (far-field) multislit interferometry we investigate the utility of two formulations of uncertainty in accounting for the complementarity of spatial localization and fringe width. We begin with a characterization of the relevant observables and general considerations regarding the suitability of different types of measures. The detailed analysis shows that both of the discussed uncertainty formulations yield qualitatively similar results, confirming that they correctly capture the relevant tradeoff. One approach, based on an idea of Aharonov and co-workers, is intuitively appealing and relies on a modification of the Heisenberg uncertainty relation. The other approach, developed by Uffink and Hilgevoord for single- and double-slit experiments, is readily applied to multislits. However, it is found that one of the underlying concepts requires generalization and that the choice of the parameters requires more consideration than was known.

  16. Word learning under infinite uncertainty

    CERN Document Server

    Blythe, Richard A; Smith, Kenny

    2014-01-01

    Language learners learn the meanings of many thousands of words, despite encountering them in complex environments where infinitely many meanings might be inferred by the learner as their true meaning. This problem of infinite referential uncertainty is often attributed to Willard Van Orman Quine. We provide a mathematical formalisation of an ideal cross-situational learner attempting to learn under infinite referential uncertainty, and identify conditions under which this can happen. As Quine's intuitions suggest, learning under infinite uncertainty is possible, provided that learners have some means of ranking candidate word meanings in terms of their plausibility; furthermore, our analysis shows that this ranking could in fact be exceedingly weak, implying that constraints allowing learners to infer the plausibility of candidate word meanings could also be weak.

  17. Uncertainty and Sensitivity Analyses Plan

    International Nuclear Information System (INIS)

    Hanford Environmental Dose Reconstruction (HEDR) Project staff are developing mathematical models to be used to estimate the radiation dose that individuals may have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. An uncertainty and sensitivity analyses plan is essential to understand and interpret the predictions from these mathematical models. This is especially true in the case of the HEDR models where the values of many parameters are unknown. This plan gives a thorough documentation of the uncertainty and hierarchical sensitivity analysis methods recommended for use on all HEDR mathematical models. The documentation includes both technical definitions and examples. In addition, an extensive demonstration of the uncertainty and sensitivity analysis process is provided using actual results from the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC). This demonstration shows how the approaches used in the recommended plan can be adapted for all dose predictions in the HEDR Project

  18. Users manual for the FORSS sensitivity and uncertainty analysis code system

    Energy Technology Data Exchange (ETDEWEB)

    Lucius, J.L.; Weisbin, C.R.; Marable, J.H.; Drischler, J.D.; Wright, R.Q.; White, J.E.

    1981-01-01

    FORSS is a code system used to study relationships between nuclear reaction cross sections, integral experiments, reactor performance parameter predictions and associated uncertainties. This report describes the computing environment and the modules currently used to implement FORSS Sensitivity and Uncertainty Methodology.

  19. Corrections to the Cardy-Verlinde formula from the generalized uncertainty principle

    OpenAIRE

    Setare, M.R.(Department of Science, University of Kurdistan, Campus of Bijar, Bijar, Iran)

    2004-01-01

    In this letter, we compute the corrections to the Cardy-Verlinde formula of the $d$-dimensional Schwarzschild black hole. These corrections stem from the generalized uncertainty principle. We then show that the generalized uncertainty principle corrections to the Cardy-Verlinde entropy formula can be taken into account by redefining the Virasoro operator $L_0$ and the central charge $c$.

  20. Users manual for the FORSS sensitivity and uncertainty analysis code system

    International Nuclear Information System (INIS)

    FORSS is a code system used to study relationships between nuclear reaction cross sections, integral experiments, reactor performance parameter predictions and associated uncertainties. This report describes the computing environment and the modules currently used to implement FORSS Sensitivity and Uncertainty Methodology

  1. Uncertainty Reduction using Bayesian Inference and Sensitivity Analysis: A Sequential Approach to the NASA Langley Uncertainty Quantification Challenge

    Science.gov (United States)

    Sankararaman, Shankar

    2016-01-01

    This paper presents a computational framework for uncertainty characterization and propagation, and sensitivity analysis under the presence of aleatory and epistemic uncertainty, and develops a rigorous methodology for efficient refinement of epistemic uncertainty by identifying important epistemic variables that significantly affect the overall performance of an engineering system. The proposed methodology is illustrated using the NASA Langley Uncertainty Quantification Challenge (NASA-LUQC) problem that deals with uncertainty analysis of a generic transport model (GTM). First, Bayesian inference is used to infer subsystem-level epistemic quantities using the subsystem-level model and corresponding data. Second, tools of variance-based global sensitivity analysis are used to identify four important epistemic variables (this limitation specified in the NASA-LUQC is reflective of practical engineering situations where not all epistemic variables can be refined due to time/budget constraints) that significantly affect system-level performance. The most significant contribution of this paper is the development of the sequential refinement methodology, where epistemic variables for refinement are not identified all-at-once. Instead, only one variable is first identified, and then, Bayesian inference and global sensitivity calculations are repeated to identify the next important variable. This procedure is continued until all 4 variables are identified and the refinement in the system-level performance is computed. The advantages of the proposed sequential refinement methodology over the all-at-once uncertainty refinement approach are explained, and then applied to the NASA Langley Uncertainty Quantification Challenge problem.
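The sequential identification of important epistemic variables described above rests on variance-based global sensitivity measures. As an illustrative sketch (not the paper's implementation), the first-order index S_i = Var(E[Y|X_i])/Var(Y) can be estimated by simple binning, and the variable with the largest index would be selected for refinement first:

```python
import numpy as np

# Crude binned estimator of the first-order variance-based sensitivity index
# S_i = Var(E[Y|X_i]) / Var(Y); a stand-in for proper Sobol' estimators.
def first_order(x, y, bins=50):
    edges = np.quantile(x, np.linspace(0, 1, bins + 1))
    idx = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, bins - 1)
    cond_means = np.array([y[idx == b].mean() for b in range(bins)])
    return cond_means.var() / y.var()

rng = np.random.default_rng(2)
n = 100_000
x1, x2 = rng.random(n), rng.random(n)
y = 4 * x1 + x2                      # x1 dominates the output variance

s1, s2 = first_order(x1, y), first_order(x2, y)
print(s1 > s2)  # True: x1 would be refined first
```

Here the analytical indices are roughly 16/17 and 1/17, so the binned estimates clearly separate the two inputs.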

  2. Sensitivity- and Uncertainty-Based Criticality Safety Validation Techniques

    International Nuclear Information System (INIS)

    The theoretical basis for the application of sensitivity and uncertainty (S/U) analysis methods to the validation of benchmark data sets for use in criticality safety applications is developed. Sensitivity analyses produce energy-dependent sensitivity coefficients that give the relative change in the system multiplication factor keff value as a function of relative changes in the cross-section data by isotope, reaction, and energy. Integral indices are then developed that utilize the sensitivity information to quantify similarities between pairs of systems, typically a benchmark experiment and design system. Uncertainty analyses provide an estimate of the uncertainties in the calculated values of the system keff due to cross-section uncertainties, as well as correlation in the keff uncertainties between systems. These uncertainty correlations provide an additional measure of system similarity. The use of the similarity measures from both S/U analyses in the formal determination of areas of applicability for benchmark experiments is developed. Furthermore, the use of these similarity measures as a trending parameter for the estimation of the computational bias and uncertainty is explored. The S/U analysis results, along with the calculated and measured keff values and estimates of uncertainties in the measurements, were used in this work to demonstrate application of the generalized linear-least-squares methodology (GLLSM) to data validation for criticality safety studies. An illustrative example is used to demonstrate the application of these S/U analysis procedures to actual criticality safety problems. Computational biases, uncertainties, and the upper subcritical limit for the example applications are determined with the new methods and compared to those obtained through traditional criticality safety analysis validation techniques. The GLLSM procedure is also applied to determine cutoff values for the similarity indices such that applicability of a benchmark
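The correlation of keff uncertainties between systems mentioned above can be sketched as a normalized bilinear form of two sensitivity vectors through a shared cross-section covariance matrix. All vectors and numbers below are hypothetical two-group illustrations, not actual nuclear data:

```python
import numpy as np

def ck_index(s_app, s_exp, cov):
    """Correlation of keff uncertainties between an application and an
    experiment, given energy-group sensitivity vectors (dk/k per dsigma/sigma)
    and a shared relative covariance matrix."""
    num = s_app @ cov @ s_exp
    den = np.sqrt((s_app @ cov @ s_app) * (s_exp @ cov @ s_exp))
    return num / den

# Hypothetical two-group data: identical sensitivities give perfect correlation.
cov = np.array([[0.04, 0.01],
                [0.01, 0.09]])
s = np.array([0.3, 0.5])
print(round(ck_index(s, s, cov), 6))  # 1.0
```

An index near 1 indicates the experiment exercises the same cross-section uncertainties as the application, which is the sense in which it measures "similarity" for validation.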

  3. Review on Generalized Uncertainty Principle

    CERN Document Server

    Tawfik, Abdel Nasser

    2015-01-01

    Based on string theory, black hole physics, doubly special relativity and some "thought" experiments, minimal distance and/or maximum momentum are proposed. As alternatives to the generalized uncertainty principle (GUP), the modified dispersion relation, the space noncommutativity, the Lorentz invariance violation, and the quantum-gravity-induced birefringence effects are summarized. The origin of minimal measurable quantities and the different GUP approaches are reviewed and the corresponding observations are analysed. Bounds on the GUP parameter are discussed and implemented in understanding recent PLANCK observations on the cosmic inflation. The higher-order GUP approaches predict minimal length uncertainty with and without maximum momenta.
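One widely used quadratic GUP form, illustrating how a minimal measurable length arises (conventions and the value of the parameter beta vary across the approaches reviewed):

```latex
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}\left(1 + \beta\,(\Delta p)^2\right)
\quad\Longrightarrow\quad
\Delta x_{\min} \;=\; \hbar\sqrt{\beta},
\qquad \text{attained at } \Delta p = 1/\sqrt{\beta}.
```

Minimizing the right-hand side over Delta p shows the position uncertainty can never fall below hbar sqrt(beta), which is the minimal length the abstract refers to.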

  4. Uncertainties in offsite consequence analysis

    Energy Technology Data Exchange (ETDEWEB)

    Young, M.L.; Harper, F.T.; Lui, C.H.

    1996-03-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences from the accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the U.S. Nuclear Regulatory Commission and the European Commission began co-sponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables using a formal expert judgment elicitation and evaluation process. This paper focuses on the methods used in and results of this on-going joint effort.

  5. Statistics, Uncertainty, and Transmitted Variation

    Energy Technology Data Exchange (ETDEWEB)

    Wendelberger, Joanne Roth [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-11-05

    The field of Statistics provides methods for modeling and understanding data and making decisions in the presence of uncertainty. When examining response functions, variation present in the input variables will be transmitted via the response function to the output variables. This phenomenon can potentially have significant impacts on the uncertainty associated with results from subsequent analysis. This presentation will examine the concept of transmitted variation, its impact on designed experiments, and a method for identifying and estimating sources of transmitted variation in certain settings.
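Transmission of input variation through a response function can be sketched with the first-order (delta-method) approximation sd(f(X)) ~ |f'(mu)| * sigma, checked here against Monte Carlo on a hypothetical quadratic response:

```python
import numpy as np

def transmitted_sd(f, mu, sigma, h=1e-5):
    """First-order approximation of the sd of f(X) for X ~ N(mu, sigma^2),
    using a central-difference estimate of f'(mu)."""
    deriv = (f(mu + h) - f(mu - h)) / (2 * h)
    return abs(deriv) * sigma

f = lambda x: x ** 2          # hypothetical response function
mu, sigma = 10.0, 0.1

rng = np.random.default_rng(0)
mc_sd = np.std(f(rng.normal(mu, sigma, 200_000)))  # Monte Carlo check

print(round(transmitted_sd(f, mu, sigma), 3))  # 2.0, since f'(10) = 20
print(round(mc_sd, 2))
```

The input sd of 0.1 is amplified twenty-fold by the local slope of the response, which is exactly the phenomenon of transmitted variation the presentation examines.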

  6. Regulating renewable resources under uncertainty

    DEFF Research Database (Denmark)

    Hansen, Lars Gårn

    Renewable natural resources (like water, fish and wildlife stocks, forests and grazing lands) are critical for the livelihood of millions of people and understanding how they can be managed efficiently is an important economic problem. I show how regulator uncertainty about different economic and......) that a pro-quota result under uncertainty about prices and marginal costs is unlikely, requiring that the resource growth function is highly concave locally around the optimum and, 3) that quotas are always preferred if uncertainly about underlying structural economic parameters dominates. These...

  7. Survey and Evaluate Uncertainty Quantification Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Guang; Engel, David W.; Eslinger, Paul W.

    2012-02-01

    The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and academic institutions that will develop and deploy state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technologies from discovery to development, demonstration, and ultimately the widespread deployment to hundreds of power plants. The CCSI Toolset will provide end users in industry with a comprehensive, integrated suite of scientifically validated models with uncertainty quantification, optimization, risk analysis and decision making capabilities. The CCSI Toolset will incorporate commercial and open-source software currently in use by industry and will also develop new software tools as necessary to fill technology gaps identified during execution of the project. The CCSI Toolset will (1) enable promising concepts to be more quickly identified through rapid computational screening of devices and processes; (2) reduce the time to design and troubleshoot new devices and processes; (3) quantify the technical risk in taking technology from laboratory-scale to commercial-scale; and (4) stabilize deployment costs more quickly by replacing some of the physical operational tests with virtual power plant simulations. The goal of CCSI is to deliver a toolset that can simulate the scale-up of a broad set of new carbon capture technologies from laboratory scale to full commercial scale. To provide a framework around which the toolset can be developed and demonstrated, we will focus on three Industrial Challenge Problems (ICPs) related to carbon capture technologies relevant to U.S. pulverized coal (PC) power plants. Post combustion capture by solid sorbents is the technology focus of the initial ICP (referred to as ICP A). The goal of the uncertainty quantification (UQ) task (Task 6) is to provide a set of capabilities to the user community for the quantification of uncertainties associated with the carbon

  8. Linear Programming Problems for Generalized Uncertainty

    Science.gov (United States)

    Thipwiwatpotjana, Phantipa

    2010-01-01

    Uncertainty occurs when there is more than one realization that can represent the available information. This dissertation concerns only discrete realizations of an uncertainty. Different interpretations of an uncertainty and their relationships are addressed when the uncertainty is not a probability of each realization. A well known model that can handle…

  9. Uncertainty as a parameter for decision making

    International Nuclear Information System (INIS)

    Uncertainty is an important measure that can contribute to proper decision making. Uncertainty also serves as a measurable quantity that can be used as a quality-describing factor. It is therefore important for quality assurance of the research process

  10. Assessment of uncertainty in parameter evaluation and prediction.

    Science.gov (United States)

    Meinrath, G; Ekberg, C; Landgren, A; Liljenzin, J O

    2000-02-01

    As in all experimental science, chemical data are affected by the limited precision of the measurement process. Quality control and traceability of experimental data require suitable approaches to express properly the degree of uncertainty. Noise and bias are nuisance effects reducing the information extractable from experimental data. However, because of the complexity of the numerical data evaluation in many chemical fields, often only mean values from data analysis, e.g. multi-parametric curve fitting, are reported. Relevant information on the interpretation limits, e.g. standard deviations or confidence limits, is either omitted or estimated. Modern techniques for handling uncertainty in both parameter evaluation and prediction rely strongly on the calculation power of computers. Advantageously, computer-intensive methods like Monte Carlo resampling and Latin hypercube sampling do not require sophisticated and often unavailable mathematical treatment. The statistical concepts are introduced. Applications of some computer-intensive statistical techniques to chemical problems are demonstrated. PMID:18967855
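As a sketch of why such computer-intensive sampling needs little sophisticated mathematics, a minimal Latin hypercube sampler (one stratified draw per equal-probability interval and dimension) fits in a few lines:

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, rng):
    """One uniform draw per equal-probability stratum, per dimension;
    strata are then shuffled independently across dimensions."""
    u = (rng.random((n_samples, n_dims)) + np.arange(n_samples)[:, None]) / n_samples
    for j in range(n_dims):
        u[:, j] = rng.permutation(u[:, j])  # decouple the dimensions
    return u

rng = np.random.default_rng(1)
pts = latin_hypercube(10, 2, rng)
# Each of the 10 intervals [i/10, (i+1)/10) contains exactly one point per dimension
print(sorted(np.floor(pts[:, 0] * 10).astype(int).tolist()))  # [0, 1, ..., 9]
```

Compared with plain Monte Carlo, the stratification guarantees coverage of each marginal distribution, which typically reduces the variance of propagated uncertainty estimates for the same sample count.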

  11. Systemic change increases model projection uncertainty

    Science.gov (United States)

    Verstegen, Judith; Karssenberg, Derek; van der Hilst, Floor; Faaij, André

    2014-05-01

    the neighbourhood doubled, while the influence of slope and potential yield decreased by 75% and 25% respectively. Allowing these systemic changes to occur in our CA in the future (up to 2022) resulted in an increase in model projection uncertainty by a factor of two compared to the assumption of a stationary system. This means that the assumption of a constant model structure is not adequate and largely underestimates uncertainty in the projection. References Verstegen, J.A., Karssenberg, D., van der Hilst, F., Faaij, A.P.C., 2014. Identifying a land use change cellular automaton by Bayesian data assimilation. Environmental Modelling & Software 53, 121-136. Verstegen, J.A., Karssenberg, D., van der Hilst, F., Faaij, A.P.C., 2012. Spatio-Temporal Uncertainty in Spatial Decision Support Systems: a Case Study of Changing Land Availability for Bioenergy Crops in Mozambique. Computers, Environment and Urban Systems 36, 30-42. Wald, A., Wolfowitz, J., 1940. On a test whether two samples are from the same population. The Annals of Mathematical Statistics 11, 147-162.

  12. Uncertainty about probability: a decision analysis perspective

    Energy Technology Data Exchange (ETDEWEB)

    Howard, R.A.

    1988-03-01

    The issue of how to think about uncertainty about probability is framed and analyzed from the viewpoint of a decision analyst. The failure of nuclear power plants is used as an example. The key idea is to think of probability as describing a state of information on an uncertain event, and to pose the issue of uncertainty in this quantity as uncertainty about a number that would be definitive: it has the property that you would assign it as the probability if you knew it. Logical consistency requires that the probability to assign to a single occurrence in the absence of further information be the mean of the distribution of this definitive number, not the median as is sometimes suggested. Any decision that must be made without the benefit of further information must also be made using the mean of the definitive number's distribution. With this formulation, they find further that the probability of r occurrences in n exchangeable trials will depend on the first n moments of the definitive number's distribution. In making decisions, the expected value of clairvoyance on the occurrence of the event must be at least as great as that on the definitive number. If one of the events in question occurs, then the increase in probability of another such event is readily computed. This means, in terms of coin tossing, that unless one is absolutely sure of the fairness of a coin, seeing a head must increase the probability of heads, in distinction to usual thought. A numerical example for nuclear power shows that the failure of one plant of a group with a low probability of failure can significantly increase the probability that must be assigned to failure of a second plant in the group.
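The claim that observing an occurrence must raise the assigned probability can be illustrated with a simple conjugate model, where a Beta prior stands in for the distribution of the "definitive number" (the numbers here are purely illustrative):

```python
# Beta(a, b) prior over the "definitive" probability of heads (or of failure).
# The probability assigned now is the prior *mean*; after observing one head
# it becomes the posterior mean, which is strictly larger unless the prior
# is a point mass (i.e., unless one is absolutely sure of fairness).
a, b = 2, 2                                  # near-fair, but not certain, coin
p_head = a / (a + b)                         # assign the mean: 0.5
p_head_after_head = (a + 1) / (a + b + 1)    # posterior mean: 0.6 > 0.5

print(p_head, p_head_after_head)  # 0.5 0.6
```

The same arithmetic underlies the nuclear example: one observed failure shifts the distribution of the definitive failure probability upward, so the probability assigned to a second failure increases.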

  13. Uncertainty about probability: a decision analysis perspective

    International Nuclear Information System (INIS)

    The issue of how to think about uncertainty about probability is framed and analyzed from the viewpoint of a decision analyst. The failure of nuclear power plants is used as an example. The key idea is to think of probability as describing a state of information on an uncertain event, and to pose the issue of uncertainty in this quantity as uncertainty about a number that would be definitive: it has the property that you would assign it as the probability if you knew it. Logical consistency requires that the probability to assign to a single occurrence in the absence of further information be the mean of the distribution of this definitive number, not the median as is sometimes suggested. Any decision that must be made without the benefit of further information must also be made using the mean of the definitive number's distribution. With this formulation, they find further that the probability of r occurrences in n exchangeable trials will depend on the first n moments of the definitive number's distribution. In making decisions, the expected value of clairvoyance on the occurrence of the event must be at least as great as that on the definitive number. If one of the events in question occurs, then the increase in probability of another such event is readily computed. This means, in terms of coin tossing, that unless one is absolutely sure of the fairness of a coin, seeing a head must increase the probability of heads, in distinction to usual thought. A numerical example for nuclear power shows that the failure of one plant of a group with a low probability of failure can significantly increase the probability that must be assigned to failure of a second plant in the group

  14. Large-Scale Inverse Problems and Quantification of Uncertainty

    CERN Document Server

    Biegler, Lorenz; Ghattas, Omar

    2010-01-01

    Large-scale inverse problems and associated uncertainty quantification has become an important area of research, central to a wide range of science and engineering applications. Written by leading experts in the field, Large-scale Inverse Problems and Quantification of Uncertainty focuses on the computational methods used to analyze and simulate inverse problems. The text provides PhD students, researchers, advanced undergraduate students, and engineering practitioners with the perspectives of researchers in areas of inverse problems and data assimilation, ranging from statistics and large-sca

  15. Uncertainty Modeling Based on Bayesian Network in Ontology Mapping

    Institute of Scientific and Technical Information of China (English)

    LI Yuhua; LIU Tao; SUN Xiaolin

    2006-01-01

    How to deal with uncertainty is crucial in exact concept mapping between ontologies. This paper presents a new framework for modeling uncertainty in ontologies based on Bayesian networks (BNs). In our approach, the Web Ontology Language (OWL) is extended to add probabilistic markups for attaching probability information; the source and target ontologies (expressed in the extended OWL) are translated into BNs, and the mapping between the two ontologies can be extracted by constructing the conditional probability tables (CPTs) of the BN using an improved algorithm, named I-IPFP, based on the iterative proportional fitting procedure (IPFP). The basic idea of this framework and algorithm is validated by positive results from computer experiments.
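The iterative proportional fitting procedure underlying the I-IPFP variant mentioned above alternately rescales a joint probability table so that its marginals match given targets. A minimal two-dimensional sketch (classic IPFP, not the paper's I-IPFP itself):

```python
import numpy as np

def ipfp(table, row_marg, col_marg, iters=100):
    """Rescale a 2-D joint table to match target row and column marginals."""
    t = table.astype(float).copy()
    for _ in range(iters):
        t *= (row_marg / t.sum(axis=1))[:, None]  # match row sums
        t *= (col_marg / t.sum(axis=0))[None, :]  # match column sums
    return t

t0 = np.ones((2, 2))  # uninformative starting table
fitted = ipfp(t0, row_marg=np.array([0.3, 0.7]), col_marg=np.array([0.6, 0.4]))
print(fitted.sum(axis=1).round(6))  # [0.3 0.7]
print(fitted.sum(axis=0).round(6))  # [0.6 0.4]
```

In the BN setting, the same alternating projection idea is used to make CPT entries consistent with probability constraints attached to the ontologies.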

  16. Uncertainty quantification in lattice QCD calculations for nuclear physics

    Energy Technology Data Exchange (ETDEWEB)

    Beane, Silas R. [Univ. of Washington, Seattle, WA (United States); Detmold, William [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Orginos, Kostas [College of William and Mary, Williamsburg, VA (United States); Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States); Savage, Martin J. [Institute for Nuclear Theory, Seattle, WA (United States)

    2015-02-05

    The numerical technique of Lattice QCD holds the promise of connecting the nuclear forces, nuclei, the spectrum and structure of hadrons, and the properties of matter under extreme conditions with the underlying theory of the strong interactions, quantum chromodynamics. A distinguishing, and thus far unique, feature of this formulation is that all of the associated uncertainties, both statistical and systematic, can, in principle, be systematically reduced to any desired precision with sufficient computational and human resources. We review the sources of uncertainty inherent in Lattice QCD calculations for nuclear physics, and discuss how each is quantified in current efforts.

  17. Controlling Uncertainty Decision Making and Learning in Complex Worlds

    CERN Document Server

    Osman, Magda

    2010-01-01

    Controlling Uncertainty: Decision Making and Learning in Complex Worlds reviews and discusses the most current research relating to the ways we can control the uncertain world around us. It features reviews and discussions of the most current research in a number of fields relevant to controlling uncertainty, such as psychology, neuroscience, computer science and engineering; presents a new framework designed to integrate a variety of disparate fields of research; and represents the first book of its kind to provide a general overview of work related to understanding control

  18. Microscopic black hole stabilization via the uncertainty principle

    International Nuclear Information System (INIS)

    Due to the Heisenberg uncertainty principle, gravitational confinement of two- or three-rotating particle systems can lead to microscopic Planckian or sub-Planckian black holes with a size of order their Compton wavelength. Some properties of such states are discussed in terms of the Schwarzschild geodesics of general relativity and compared with properties computed via the combination of special relativity, equivalence principle, Newton's gravitational law and Compton wavelength. It is shown that the generalized uncertainty principle (GUP) provides a satisfactory fit of the Schwarzschild radius and Compton wavelength of such microscopic, particle-like, black holes

  19. Uncertainty Issues in Automating Process Connecting Web and User

    Czech Academy of Sciences Publication Activity Database

    Eckhardt, A.; Horváth, T.; Maruščák, D.; Novotný, P.; Vojtáš, Peter

    Busan: -, 2007 - (Pobillo, F.), s. 1-12 [URSW 2007. Uncertainty Reasoning for the Semantic Web Workshop. Busan (KR), 12.11.2007] R&D Projects: GA AV ČR 1ET100300517; GA AV ČR 1ET100300419 Other grants: VEGA(SK) 1/3129/06 Source of funding: V - other public sources Keywords: uncertainty modeling * uncertain reasoning * world wide web * web content mining * user profile mining Subject RIV: IN - Informatics, Computer Science

  20. Microscopic black hole stabilization via the uncertainty principle

    Science.gov (United States)

    Vayenas, Constantinos G.; Grigoriou, Dimitrios

    2015-01-01

    Due to the Heisenberg uncertainty principle, gravitational confinement of two- or three-rotating particle systems can lead to microscopic Planckian or sub-Planckian black holes with a size of order their Compton wavelength. Some properties of such states are discussed in terms of the Schwarzschild geodesics of general relativity and compared with properties computed via the combination of special relativity, equivalence principle, Newton's gravitational law and Compton wavelength. It is shown that the generalized uncertainty principle (GUP) provides a satisfactory fit of the Schwarzschild radius and Compton wavelength of such microscopic, particle-like, black holes.

  1. Uncertainty of climate change impacts and consequences on the prediction of future hydrological trends

    International Nuclear Information System (INIS)

    In the future, water is very likely to be the resource that will be most severely affected by climate change. It has been shown that small perturbations in precipitation frequency and/or quantity can result in significant impacts on the mean annual discharge. Moreover, modest changes in natural inflows result in larger changes in reservoir storage. There is however great uncertainty linked to changes in both the magnitude and direction of future hydrological trends. This presentation discusses the various sources of this uncertainty and their potential impact on the prediction of future hydrological trends. A companion paper will look at adaptation potential, taking into account some of the sources of uncertainty discussed in this presentation. Uncertainty is separated into two main components: climatic uncertainty and 'model and methods' uncertainty. Climatic uncertainty is linked to uncertainty in future greenhouse gas emission scenarios (GHGES) and to general circulation models (GCMs), whose representation of topography and climate processes is imperfect, in large part due to computational limitations. The uncertainty linked to natural variability (which may or may not increase) is also part of the climatic uncertainty. 'Model and methods' uncertainty comprises the uncertainty linked to the different approaches and models needed to transform climate data so that they can be used by hydrological models (such as downscaling methods) and the uncertainty of the models themselves and of their use in a changed climate. The impacts of the various sources of uncertainty on the hydrology of a watershed are demonstrated on the Peribonka River basin (Quebec, Canada). The results indicate that all sources of uncertainty can be important and underline the importance of taking these sources into account for any impact and adaptation studies. Recommendations are outlined for such studies. (author)

  2. IPA Phase 2 sensitivity and uncertainty analysis

    International Nuclear Information System (INIS)

    The NRC's Phase 2 Iterative Performance Assessment (IPA) used Monte Carlo techniques to propagate uncertainty for up to 297 independent variables and nine scenarios through computer models representing the performance of the Yucca Mountain repository. The NRC staff explored the use of a number of parametric and non-parametric tests and graphical methods to display the probabilistic results. Parametric tests included regression and differential analysis. Non-parametric tests included the Kolmogorov-Smirnov test and Sign test. Graphical methods included the Complementary Cumulative Distribution Function (CCDF), hair diagram, scatter plots, histograms and box plots. Multiple linear regression of raw, ranked, standardized and other transformed variables determined the gross sensitivity over the parameter space. CCDFs were also generated from subsets of the 400 vector sets formed by screening the vectors according to values of derived variables related to the behavior of the engineered and natural systems. While no single statistical or graphical technique proved to be useful in all cases, diverse methods of sensitivity and uncertainty analysis identified the same important input parameters
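Two of the techniques named above can be sketched on a toy model (the model and data here are hypothetical, not IPA results): standardized regression coefficients as a gross sensitivity measure, and an empirical CCDF of the output:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 5000
x1, x2, x3 = rng.random(n), rng.random(n), rng.random(n)
y = 10 * x1 + x2 + 0.1 * x3          # x1 dominates the toy response

# Standardized regression coefficients: regress standardized y on standardized X
X = np.column_stack([x1, x2, x3])
Xs = (X - X.mean(0)) / X.std(0)
ys = (y - y.mean()) / y.std()
src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
print(int(np.argmax(np.abs(src))))   # 0 -> x1 identified as most important

# Empirical complementary cumulative distribution function: P(Y > v)
ccdf_at = lambda v: np.mean(y > v)
print(round(ccdf_at(np.median(y)), 2))  # 0.5
```

Rank-transforming the variables before the regression (as the IPA staff also did) makes the same measure robust to monotone nonlinearities.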

  3. Adaptive Strategies for Materials Design using Uncertainties

    Science.gov (United States)

    Balachandran, Prasanna V.; Xue, Dezhen; Theiler, James; Hogden, John; Lookman, Turab

    2016-01-01

    We compare several adaptive design strategies using a data set of 223 M2AX family of compounds for which the elastic properties [bulk (B), shear (G), and Young’s (E) modulus] have been computed using density functional theory. The design strategies are decomposed into an iterative loop with two main steps: machine learning is used to train a regressor that predicts elastic properties in terms of elementary orbital radii of the individual components of the materials; and a selector uses these predictions and their uncertainties to choose the next material to investigate. The ultimate goal is to obtain a material with desired elastic properties in as few iterations as possible. We examine how the choice of data set size, regressor and selector impact the design. We find that selectors that use information about the prediction uncertainty outperform those that don’t. Our work is a step in illustrating how adaptive design tools can guide the search for new materials with desired properties.
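The finding that selectors using prediction uncertainty outperform purely exploitative ones can be sketched with a toy upper-confidence selector. The candidate means, uncertainties, and the exploration factor of 2 are all illustrative, not the paper's actual regressor or selector:

```python
import numpy as np

pred = np.array([1.0, 0.9, 0.8])     # regressor means for three candidate materials
sigma = np.array([0.01, 0.5, 0.01])  # prediction uncertainties for each candidate

exploit = int(np.argmax(pred))             # 0: highest mean, ignores uncertainty
ucb = int(np.argmax(pred + 2 * sigma))     # 1: trades off mean against uncertainty
print(exploit, ucb)  # 0 1
```

The uncertainty-aware selector gambles on the poorly characterized candidate, whose true property could plausibly exceed the current best; repeated over iterations, this is what lets such selectors reach the target property in fewer experiments.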

  4. Does inflation uncertainty increase with inflation?

    OpenAIRE

    John E. Golob

    1994-01-01

    One of the most important costs of inflation is the uncertainty it creates about future inflation. This uncertainty clouds the decisionmaking of consumers and businesses and reduces economic well-being. Without this uncertainty, consumers and businesses could better plan for the future. According to many analysts, uncertainty about future inflation rises as inflation rises. As a result, these analysts argue that the Federal Reserve could reduce inflation uncertainty by reducing inflation. Oth...

  5. Addressing Replication and Model Uncertainty

    DEFF Research Database (Denmark)

    Ebersberger, Bernd; Galia, Fabrice; Laursen, Keld;

    Many fields of strategic management are subject to an important degree of model uncertainty. This is because the true model, and therefore the selection of appropriate explanatory variables, is essentially unknown. Drawing on the literature on the determinants of innovation, and by analyzing...

  6. Entropy and the uncertainty principle

    CERN Document Server

    Frank, Rupert L

    2011-01-01

    We generalize, improve and unify theorems of Rumin, and Maassen-Uffink about classical entropies associated to quantum density matrices. These theorems refer to the classical entropies of the diagonals of a density matrix in two different bases. Thus they provide a kind of uncertainty principle. Our inequalities are sharp because they are exact in the high-temperature or semi-classical limit.

  7. Complementarity and the uncertainty relations

    CERN Document Server

    Björk, G; Trifonov, A S; Tsegaye, T; Karlsson, A M; Bjork, Gunnar; Soderholm, Jonas; Trifonov, Alexei; Tsegaye, Tedros; Karlsson, Anders

    1999-01-01

    We formulate a general complementarity relation starting from any Hermitian operator with discrete non-degenerate eigenvalues. We then elucidate the relationship between quantum complementarity and the Heisenberg-Robertson uncertainty relation. We show that they are intimately connected. Finally we exemplify the general theory with some specific suggested experiments.

  8. Uncertainty assessment using uncalibrated objects:

    DEFF Research Database (Denmark)

    Meneghello, R.; Savio, Enrico; Larsen, Erik; De Chiffre, Leonardo

    This report is made as a part of the project Easytrac, an EU project under the programme: Competitive and Sustainable Growth: Contract No: G6RD-CT-2000-00188, coordinated by UNIMETRIK S.A. (Spain). The project is concerned with low uncertainty calibrations on coordinate measuring machines. The Ce...

  9. Exploring Uncertainty with Projectile Launchers

    Science.gov (United States)

    Orzel, Chad; Reich, Gary; Marr, Jonathan

    2012-01-01

    The proper choice of a measurement technique that minimizes systematic and random uncertainty is an essential part of experimental physics. These issues are difficult to teach in the introductory laboratory, though. Because most experiments involve only a single measurement technique, students are often unable to make a clear distinction between…

  10. Knowledge Uncertainty and Contextual Modelling

    Czech Academy of Sciences Publication Activity Database

    Klimešová, Dana; Ocelíková, E.

    Kréta: WSEAS Press, 2007, s. 326-330. ISBN 978-960-8457-90-4. [CSCC 2007. Agios Nikolaos (GR), 23.07.2007-28.07.2007] Institutional research plan: CEZ:AV0Z10750506 Keywords : Knowledge * uncertainty * knowledge management * contextual modelling * temporal modelling Subject RIV: BD - Theory of Information

  11. Uncertainties in radiation flow experiments

    Science.gov (United States)

    Fryer, C. L.; Dodd, E.; Even, W.; Fontes, C. J.; Greeff, C.; Hungerford, A.; Kline, J.; Mussack, K.; Tregillis, I.; Workman, J. B.; Benstead, J.; Guymer, T. M.; Moore, A. S.; Morton, J.

    2016-03-01

    Although the fundamental physics behind radiation and matter flow is understood, many uncertainties remain in the exact behavior of macroscopic fluids in systems ranging from pure turbulence to coupled radiation hydrodynamics. Laboratory experiments play an important role in studying this physics to allow scientists to test their macroscopic models of these phenomena. However, because the fundamental physics is well understood, precision experiments are required to validate existing codes already tested by a suite of analytic, manufactured and convergence solutions. To conduct such high-precision experiments requires a detailed understanding of the experimental errors and the nature of their uncertainties on the observed diagnostics. In this paper, we study the uncertainties plaguing many radiation-flow experiments, focusing on those using a hohlraum (dynamic or laser-driven) source and a foam-density target. This study focuses on the effect these uncertainties have on the breakout time of the radiation front. We find that, even if the errors in the initial conditions and numerical methods are Gaussian, the errors in the breakout time are asymmetric, leading to a systematic bias in the observed data. We must understand these systematics to produce the high-precision experimental results needed to study this physics.

  12. Competitive Capacity Investment under Uncertainty

    NARCIS (Netherlands)

    X. Li (Xishu); R.A. Zuidwijk (Rob); M.B.M. de Koster (René); R. Dekker (Rommert)

    2016-01-01

    We consider a long-term capacity investment problem in a competitive market under demand uncertainty. Two firms move sequentially in the competition and a firm’s capacity decision interacts with the other firm’s current and future capacity. Throughout the investment race, a firm can eith

  13. Uncertainty propagation in nuclear forensics

    International Nuclear Information System (INIS)

    Uncertainty propagation formulae are presented for age dating in support of nuclear forensics. The age of radioactive material in this context refers to the time elapsed since a particular radionuclide was chemically separated from its decay product(s). The decay of the parent radionuclide and ingrowth of the daughter nuclide are governed by statistical decay laws. Mathematical equations allow calculation of the age of specific nuclear material through the atom ratio between parent and daughter nuclides, or through the activity ratio provided that the daughter nuclide is also unstable. The derivation of the uncertainty formulae of the age may present some difficulty to the user community and so the exact solutions, some approximations, a graphical representation and their interpretation are presented in this work. Typical nuclides of interest are actinides in the context of non-proliferation commitments. The uncertainty analysis is applied to a set of important parent–daughter pairs and the need for more precise half-life data is examined. - Highlights: • Uncertainty propagation formulae for age dating with nuclear chronometers. • Applied to parent–daughter pairs used in nuclear forensics. • Investigated need for better half-life data
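
    The atom-ratio age equation and its uncertainty can be sketched as below. The half-lives are illustrative round values for a 241Pu/241Am-like chronometer, a 1% ratio uncertainty is invented, and a numerical derivative stands in for the exact analytical propagation formulae presented in the paper:

```python
import math

def age_from_atom_ratio(R, lam_p, lam_d):
    """Age (time since chemical separation) from the daughter/parent atom
    ratio R = N_d/N_p, assuming the daughter was fully removed at t = 0."""
    d = lam_d - lam_p
    return -math.log(1.0 - R * d / lam_p) / d

def age_uncertainty(R, sR, lam_p, lam_d, h=1e-9):
    """First-order uncertainty propagation via a numerical derivative dt/dR."""
    dtdR = (age_from_atom_ratio(R + h, lam_p, lam_d)
            - age_from_atom_ratio(R - h, lam_p, lam_d)) / (2.0 * h)
    return abs(dtdR) * sR

lam_p = math.log(2) / 14.29   # parent decay constant (1/y), e.g. 241Pu
lam_d = math.log(2) / 432.2   # daughter decay constant (1/y), e.g. 241Am
d = lam_d - lam_p
true_age = 30.0
R = lam_p / d * (1.0 - math.exp(-d * true_age))   # forward (ingrowth) model
t_hat = age_from_atom_ratio(R, lam_p, lam_d)       # recovers the age
u_t = age_uncertainty(R, 0.01 * R, lam_p, lam_d)   # 1% ratio uncertainty
```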

  14. Uncertainty covariances in robotics applications

    International Nuclear Information System (INIS)

    The application of uncertainty covariance matrices in the analysis of robot trajectory errors is explored. First, relevant statistical concepts are reviewed briefly. Then, a simple, hypothetical robot model is considered to illustrate methods for error propagation and performance test data evaluation. The importance of including error correlations is emphasized

  15. UncertWeb: chaining web services accounting for uncertainty

    Science.gov (United States)

    Cornford, Dan; Jones, Richard; Bastin, Lucy; Williams, Matthew; Pebesma, Edzer; Nativi, Stefano

    2010-05-01

    The development of interoperable services that permit access to data and processes, typically using web service based standards opens up the possibility for increasingly complex chains of data and processes, which might be discovered and composed in increasingly automatic ways. This concept, sometimes referred to as the "Model Web", offers the promise of integrated (Earth) system models, with pluggable web service based components which can be discovered, composed and evaluated dynamically. A significant issue with such service chains, indeed in any composite model composed of coupled components, is that in all interesting (non-linear) cases the effect of uncertainties on inputs, or components within the chain will have complex, potentially unexpected effects on the outputs. Within the FP7 UncertWeb project we will be developing a mechanism and an accompanying set of tools to enable rigorous uncertainty management in web based service chains involving both data and processes. The project will exploit and extend the UncertML candidate standard to flexibly propagate uncertainty through service chains, including looking at mechanisms to develop uncertainty enabled profiles of existing Open Geospatial Consortium services. To facilitate the use of such services we will develop tools to address the definition of the input uncertainties (elicitation), manage the uncertainty propagation (emulation), undertake uncertainty and sensitivity analysis and visualise the output uncertainty. In this talk we will outline the challenges of the UncertWeb project, illustrating this with a prototype service chain we have created for correcting station level pressure to sea-level pressure, which accounts for the various uncertainties involved. In particular we will discuss some of the challenges of chaining Open Geospatial Consortium services using the Business Process Execution Language. We will also address the issue of computational cost and communication bandwidth requirements for

  16. Uncertainty governance: an integrated framework for managing and communicating uncertainties

    International Nuclear Information System (INIS)

    Treatment of uncertainty, or in other words, reasoning with imperfect information is widely recognised as being of great importance within performance assessment (PA) of the geological disposal mainly because of the time scale of interest and spatial heterogeneity that geological environment exhibits. A wide range of formal methods have been proposed for the optimal processing of incomplete information. Many of these methods rely on the use of numerical information, the frequency based concept of probability in particular, to handle the imperfections. However, taking quantitative information as a base for models that solve the problem of handling imperfect information merely creates another problem, i.e., how to provide the quantitative information. In many situations this second problem proves more resistant to solution, and in recent years several authors have looked at a particularly ingenious way in accordance with the rules of well-founded methods such as Bayesian probability theory, possibility theory, and the Dempster-Shafer theory of evidence. Those methods, while drawing inspiration from quantitative methods, do not require the kind of complete numerical information required by quantitative methods. Instead they provide information that, though less precise than that provided by quantitative techniques, is often, if not sufficient, the best that could be achieved. Rather than searching for the best method for handling all imperfect information, our strategy for uncertainty management, that is recognition and evaluation of uncertainties associated with PA followed by planning and implementation of measures to reduce them, is to use whichever method best fits the problem at hand. Such an eclectic position leads naturally to integration of the different formalisms. While uncertainty management based on the combination of semi-quantitative methods forms an important part of our framework for uncertainty governance, it only solves half of the problem

  17. Decision making uncertainty, imperfection, deliberation and scalability

    CERN Document Server

    Kárný, Miroslav; Wolpert, David

    2015-01-01

    This volume focuses on uncovering the fundamental forces underlying dynamic decision making among multiple interacting, imperfect and selfish decision makers. The chapters are written by leading experts from different disciplines, all considering the many sources of imperfection in decision making, and always with an eye to decreasing the myriad discrepancies between theory and real world human decision making. Topics addressed include uncertainty, deliberation cost and the complexity arising from the inherent large computational scale of decision making in these systems. In particular, analyses and experiments are presented which concern: • task allocation to maximize “the wisdom of the crowd”; • design of a society of “edutainment” robots who account for one another's emotional states; • recognizing and counteracting seemingly non-rational human decision making; • coping with extreme scale when learning causality in networks; • efficiently incorporating expert knowledge in personalized...

  18. UNCERTAINTY EVALUATION OF AVAILABLE ENERGY AND POWER

    Energy Technology Data Exchange (ETDEWEB)

    Jon P. Christophersen; John L. Morrison

    2006-05-01

    The Idaho National Laboratory conducts extensive testing and evaluation of advanced technology batteries and ultracapacitors for applications in electric and hybrid vehicles. The testing essentially involves acquiring time records of voltage, current and temperature from a variety of charge and discharge time profiles. From these three basic measured parameters, a complex assortment of derived parameters (resistance, power, etc.) is computed. Derived parameters are in many cases functions of multiple layers of other derived parameters that eventually work back to the three basic measured parameters. The purpose of this paper is to document the methodology used for the uncertainty analysis of the most complicated derived parameters, broadly grouped as available energy and available power. This work is an analytical derivation. Future work will report the implementation of algorithms based upon this effort.
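
    A minimal example of propagating measured-parameter uncertainty into a derived parameter, here only the simplest layer, P = V·I, with invented measurement values (the paper's available-energy and available-power derivations stack many such layers):

```python
import math

def power_uncertainty(v, u_v, i, u_i):
    """Combined standard uncertainty of the derived parameter P = V * I,
    assuming independent voltage and current errors (first-order propagation)."""
    p = v * i
    u_p = math.sqrt((i * u_v) ** 2 + (v * u_i) ** 2)
    return p, u_p

# Hypothetical measured values: 3.6 V +/- 0.01 V and 50 A +/- 0.25 A
p, u_p = power_uncertainty(3.6, 0.01, 50.0, 0.25)
```

Deeper derived parameters (energy, resistance) would call such propagation functions on the outputs of other propagation functions, layer by layer.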

  19. Uncertainty Analyses in the Finite-Difference Time-Domain Method

    OpenAIRE

    Edwards, R. S.; Marvin, A. C.; Porter, S J

    2010-01-01

    Providing estimates of the uncertainty in results obtained by Computational Electromagnetic (CEM) simulations is essential when determining the acceptability of the results. The Monte Carlo method (MCM) has been previously used to quantify the uncertainty in CEM simulations. Other computationally efficient methods have been investigated more recently, such as the polynomial chaos method (PCM) and the method of moments (MoM). This paper introduces a novel implementation of the PCM and the MoM ...

  20. A preliminary uncertainty analysis of phenomenological inputs in TEXAS-V code

    Energy Technology Data Exchange (ETDEWEB)

    Park, S. H.; Kim, H. D.; Ahn, K. I. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2010-10-15

    Uncertainty analysis is an important step in the safety analysis of nuclear power plants. Best-estimate computer codes are increasingly used in place of conservative codes. These efforts aim at a more precise evaluation of safety margins and at determining the rate of change in code predictions as one or more input parameters vary within their ranges of interest. From this point of view, a severe accident uncertainty analysis system, SAUNA, has been improved for TEXAS-V FCI uncertainty analysis. The main objective of this paper is to present the TEXAS FCI uncertainty analysis results implemented through the SAUNA code

  1. Uncertainty and validation. Effect of model complexity on uncertainty estimates

    International Nuclear Information System (INIS)

    In the Model Complexity subgroup of BIOMOVS II, models of varying complexity have been applied to the problem of downward transport of radionuclides in soils. A scenario describing a case of surface contamination of a pasture soil was defined. Three different radionuclides with different environmental behavior and radioactive half-lives were considered: Cs-137, Sr-90 and I-129. The intention was to give a detailed specification of the parameters required by different kinds of model, together with reasonable values for the parameter uncertainty. A total of seven modelling teams participated in the study using 13 different models. Four of the modelling groups performed uncertainty calculations using nine different modelling approaches. The models used range in complexity from analytical solutions of a 2-box model using annual average data to numerical models coupling hydrology and transport using data varying on a daily basis. The complex models needed to consider all aspects of radionuclide transport in a soil with a variable hydrology are often impractical to use in safety assessments. Instead simpler models, often box models, are preferred. The comparison of predictions made with the complex models and the simple models for this scenario shows that the predictions in many cases are very similar, e.g. in the predictions of the evolution of the root zone concentration. However, in other cases differences of many orders of magnitude can appear. One example is the prediction of the flux to the groundwater of radionuclides being transported through the soil column. Some issues that have come to focus in this study: There are large differences in the predicted soil hydrology and as a consequence also in the radionuclide transport, which suggests that there are large uncertainties in the calculation of effective precipitation and evapotranspiration.
The approach used for modelling the water transport in the root zone has an impact on the predictions of the decline in root

  2. Groundwater management under sustainable yield uncertainty

    Science.gov (United States)

    Delottier, Hugo; Pryet, Alexandre; Dupuy, Alain

    2015-04-01

    The definition of the sustainable yield (SY) of a groundwater system consists in adjusting pumping rates so as to avoid groundwater depletion and preserve environmental flows. Once stakeholders have defined which impacts can be considered as "acceptable" for both environmental and societal aspects, hydrogeologists use groundwater models to estimate the SY. Yet, these models are based on a simplification of actual groundwater systems, whose hydraulic properties are largely unknown. As a result, the estimated SY is subject to "predictive" uncertainty. We illustrate the issue with a synthetic homogeneous aquifer system in interaction with a stream for steady state and transient conditions. Simulations are conducted with the USGS MODFLOW finite difference model with the river-package. A synthetic dataset is first generated with the numerical model that will further be considered as the "observed" state. In a second step, we conduct the calibration operation as hydrogeologists dealing with real-world, unknown groundwater systems. The RMSE between simulated hydraulic heads and the synthetic "observed" values is used as the objective function. But instead of simply "calibrating" model parameters, we explore the value of the objective function in the parameter space (hydraulic conductivity, storage coefficient and total recharge). We highlight the occurrence of an ellipsoidal "null space", where distinct parameter sets lead to equally low values for the objective function. The optimum of the objective function is not unique, which leads to a range of possible values for the SY. With a large confidence interval for the SY, the use of modeling results for decision-making is challenging. We argue that prior to modeling operations, efforts must be invested so as to narrow the intervals of likely parameter values. Parameter space exploration is effective to estimate SY uncertainty, but not efficient because of its computational burden and is therefore inapplicable for real world
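
    The "null space" effect can be reproduced with a deliberately simple sketch: a toy 1-D aquifer whose heads depend only on the recharge/conductivity ratio, so distinct parameter sets minimize the RMSE objective equally well. All values are invented; this is not the MODFLOW setup of the study:

```python
import itertools
import math

L_dom = 100.0                          # aquifer length (arbitrary units)
xs = [10.0, 30.0, 50.0, 70.0, 90.0]    # observation points

def heads(K, R):
    # Toy 1-D confined aquifer with uniform recharge R and conductivity K:
    # h(x) = R/(2K) * (L_dom^2 - x^2), so heads depend only on the ratio R/K
    return [R / (2.0 * K) * (L_dom ** 2 - x ** 2) for x in xs]

obs = heads(10.0, 1e-3)   # synthetic "observed" state, as in the study design

def rmse(K, R):
    return math.sqrt(sum((s - o) ** 2 for s, o in zip(heads(K, R), obs)) / len(obs))

# Explore the objective function over a coarse parameter grid
grid = list(itertools.product([5.0, 10.0, 20.0], [5e-4, 1e-3, 2e-3]))
null_space = [(K, R) for K, R in grid if rmse(K, R) < 1e-9]
```

Every pair with R/K = 1e-4 fits the observations equally well, so the "calibrated" parameters, and any yield estimate derived from them, are non-unique.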

  3. Quantum Computation and Quantum Information

    OpenAIRE

    Wang, Yazhen

    2012-01-01

    Quantum computation and quantum information are of great current interest in computer science, mathematics, physical sciences and engineering. They will likely lead to a new wave of technological innovations in communication, computation and cryptography. As the theory of quantum physics is fundamentally stochastic, randomness and uncertainty are deeply rooted in quantum computation, quantum simulation and quantum information. Consequently quantum algorithms are random in nature, and quantum ...

  4. Maxallent: Maximizers of all Entropies and Uncertainty of Uncertainty

    CERN Document Server

    Gorban, A N

    2013-01-01

    The entropy maximum approach (Maxent) was developed as a minimization of the subjective uncertainty measured by the Boltzmann--Gibbs--Shannon entropy. Many new entropies have been invented in the second half of the 20th century. Now there exists a rich choice of entropies for fitting needs. This diversity of entropies gave rise to a Maxent "anarchism". Maxent approach is now the conditional maximization of an appropriate entropy for the evaluation of the probability distribution when our information is partial and incomplete. The rich choice of non-classical entropies causes a new problem: which entropy is better for a given class of applications? We understand entropy as a measure of uncertainty which increases in Markov processes. In this work, we describe the most general ordering of the distribution space, with respect to which all continuous-time Markov processes are monotonic (the Markov order). For inference, this approach results in a set of conditionally "most random" distributions. Each ...

  5. Ideas underlying the Quantification of Margins and Uncertainties

    International Nuclear Information System (INIS)

    Key ideas underlying the application of Quantification of Margins and Uncertainties (QMU) to nuclear weapons stockpile lifecycle decisions are described. While QMU is a broad process and methodology for generating critical technical information to be used in U.S. nuclear weapon stockpile management, this paper emphasizes one component, which is information produced by computational modeling and simulation. In particular, the following topics are discussed: (i) the key principles of developing QMU information in the form of Best Estimate Plus Uncertainty, (ii) the need to separate aleatory and epistemic uncertainty in QMU, and (iii) the properties of risk-informed decision making (RIDM) that are best suited for effective application of QMU. The paper is written at a high level, but provides an extensive bibliography of useful papers for interested readers to deepen their understanding of the presented ideas.
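
    The separation of aleatory and epistemic uncertainty mentioned in (ii) is often implemented as a two-loop (nested) sampling scheme; the following is a minimal sketch with an invented performance model and invented distributions, not any actual QMU calculation:

```python
import random

random.seed(11)

def failure_probability(threshold, n_aleatory=2000):
    """Inner (aleatory) loop: irreducible variability of the performance margin
    is sampled and summarized as a frequency of falling below the threshold."""
    hits = sum(1 for _ in range(n_aleatory) if random.gauss(1.0, 0.1) < threshold)
    return hits / n_aleatory

# Outer (epistemic) loop: the threshold itself is imperfectly known, so each
# epistemic sample yields its own aleatory failure probability
probs = [failure_probability(random.uniform(0.7, 0.9)) for _ in range(20)]
p_lo, p_hi = min(probs), max(probs)
```

Keeping the loops separate yields a band of probabilities [p_lo, p_hi] (epistemic spread over aleatory frequencies) rather than a single mixed number, which is the point of the separation.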

  6. Handling uncertainty and networked structure in robot control

    CERN Document Server

    Tamás, Levente

    2015-01-01

    This book focuses on two challenges posed in robot control by the increasing adoption of robots in the everyday human environment: uncertainty and networked communication. Part I of the book describes learning control to address environmental uncertainty. Part II discusses state estimation, active sensing, and complex scenario perception to tackle sensing uncertainty. Part III completes the book with control of networked robots and multi-robot teams. Each chapter features in-depth technical coverage and case studies highlighting the applicability of the techniques, with real robots or in simulation. Platforms include mobile ground, aerial, and underwater robots, as well as humanoid robots and robot arms. Source code and experimental data are available at http://extras.springer.com. The text gathers contributions from academic and industry experts, and offers a valuable resource for researchers or graduate students in robot control and perception. It also benefits researchers in related areas, such as computer...

  7. LDRD Final Report: Capabilities for Uncertainty in Predictive Science.

    Energy Technology Data Exchange (ETDEWEB)

    Phipps, Eric T.; Eldred, Michael S.; Salinger, Andrew G.; Webster, Clayton G.

    2008-10-01

    Predictive simulation of systems comprised of numerous interconnected, tightly coupled components promises to help solve many problems of scientific and national interest. However, predictive simulation of such systems is extremely challenging due to the coupling of a diverse set of physical and biological length and time scales. This report investigates uncertainty quantification methods for such systems that attempt to exploit their structure to gain computational efficiency. The traditional layering of uncertainty quantification around nonlinear solution processes is inverted to allow for heterogeneous uncertainty quantification methods to be applied to each component in a coupled system. Moreover, this approach allows stochastic dimension reduction techniques to be applied at each coupling interface. The mathematical feasibility of these ideas is investigated in this report, and mathematical formulations for the resulting stochastically coupled nonlinear systems are developed.

  8. Monte Carlo uncertainty analysis for an iron shielding benchmark experiment

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, U.; Tsige-Tamirat, H. [Association Euratom-FZK Forschungszentrum Karlsruhe (Germany); Perel, R.L. [Hebrew Univ., Jerusalem (Israel); Wu, Y. [Institute of Plasma Physics, Heifi (China)

    1998-07-01

    This work is devoted to the computational uncertainty analysis of an iron benchmark experiment performed previously at the Technical University of Dresden (TUD). The analysis is based on the use of a novel Monte Carlo approach for calculating sensitivities of point detectors and focuses on the new {sup 56}Fe evaluation of the European Fusion File EFF-3. The calculated uncertainties of the neutron leakage fluxes are shown to be significantly smaller than with previous data. Above 5 MeV the calculated uncertainties are larger than the experimental ones. As the measured neutron leakage fluxes are underestimated by about 10-20% in that energy range, it is concluded that the {sup 56}Fe cross-section data have to be further improved. (authors)

  9. Uncertainty and Sensitivity Analyses of Model Predictions of Solute Transport

    Science.gov (United States)

    Skaggs, T. H.; Suarez, D. L.; Goldberg, S. R.

    2012-12-01

    Soil salinity reduces crop production on about 50% of irrigated lands worldwide. One roadblock to increased use of advanced computer simulation tools for better managing irrigation water and soil salinity is that the models usually do not provide an estimate of the uncertainty in model predictions, which can be substantial. In this work, we investigate methods for putting confidence bounds on HYDRUS-1D simulations of solute leaching in soils. Uncertainties in model parameters estimated with pedotransfer functions are propagated through simulation model predictions using Monte Carlo simulation. Generalized sensitivity analyses indicate which parameters are most significant for quantifying uncertainty. The simulation results are compared with experimentally observed transport variability in a number of large, replicated lysimeters.
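
    The Monte Carlo propagation of parameter uncertainty into prediction bounds can be sketched as follows; this uses a toy first-order leaching surrogate with an invented lognormal parameter distribution, not HYDRUS-1D or actual pedotransfer estimates:

```python
import numpy as np

rng = np.random.default_rng(1)

def leached_fraction(k, t=10.0):
    # Toy first-order surrogate for the fraction of solute leached by time t
    return 1.0 - np.exp(-k * t)

# Parameter uncertainty (e.g. as estimated from a pedotransfer function),
# represented here as a lognormal leaching-rate distribution
k_samples = rng.lognormal(mean=np.log(0.1), sigma=0.3, size=2000)
outputs = leached_fraction(k_samples)
lo95, hi95 = np.percentile(outputs, [2.5, 97.5])   # 95% confidence bounds
```

The interval [lo95, hi95] is the kind of confidence band that could accompany a point prediction; a sensitivity analysis would repeat this while varying one parameter distribution at a time.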

  10. Measuring the uncertainties of discharge measurements: interlaboratory experiments in hydrometry

    Science.gov (United States)

    Le Coz, Jérôme; Blanquart, Bertrand; Pobanz, Karine; Dramais, Guillaume; Pierrefeu, Gilles; Hauet, Alexandre; Despax, Aurélien

    2015-04-01

    Quantifying the uncertainty of streamflow data is key for hydrological sciences. The conventional uncertainty analysis based on error propagation techniques is restricted by the absence of traceable discharge standards and by the weight of difficult-to-predict errors related to the operator, procedure and measurement environment. Field interlaboratory experiments recently emerged as an efficient, standardized method to 'measure' the uncertainties of a given streamgauging technique in given measurement conditions. Both uncertainty approaches are compatible and should be developed jointly in the field of hydrometry. In recent years, several interlaboratory experiments have been reported by different hydrological services. They involved different streamgauging techniques, including acoustic profilers (ADCP), current-meters and handheld radars (SVR). Uncertainty analysis was not always their primary goal: most often, testing the proficiency and homogeneity of instruments, makes and models, procedures and operators was the original motivation. When interlaboratory experiments are processed for uncertainty analysis, once outliers have been discarded, all participants are assumed to be equally skilled and to apply the same streamgauging technique in equivalent conditions. A universal requirement is that all participants simultaneously measure the same discharge, which shall be kept constant within negligible variations. To the best of our knowledge, we were the first to apply the interlaboratory method for computing the uncertainties of streamgauging techniques, according to the authoritative international documents (ISO standards). Several specific issues arise due to the measurement conditions in outdoor canals and rivers. The main limitation is that the best available river discharge references are usually too uncertain to quantify the bias of the streamgauging technique, i.e. the systematic errors that are common to all participants in the experiment. A reference or a
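
    Under the equal-skill assumption, the interlaboratory uncertainty estimate essentially reduces to the spread of the participants' simultaneous measurements. A minimal sketch with invented discharge values follows; the actual ISO-based procedures add corrections (for discharge variation, number of participants, bias terms) that this omits:

```python
import statistics

# Discharges (m3/s) measured simultaneously by eight participating teams
# (toy values, outliers assumed already discarded)
discharges = [10.2, 9.8, 10.5, 10.1, 9.9, 10.3, 10.0, 10.2]

mean_q = statistics.mean(discharges)
s_inter = statistics.stdev(discharges)   # interlaboratory standard deviation
u_rel = 100.0 * s_inter / mean_q         # relative standard uncertainty (%)
U95_rel = 2.0 * u_rel                    # expanded uncertainty, coverage factor k = 2
```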

  11. An algorithm to improve sampling efficiency for uncertainty propagation using sampling based method

    Energy Technology Data Exchange (ETDEWEB)

    Campolina, Daniel; Lima, Paulo Rubens I., E-mail: campolina@cdtn.br, E-mail: pauloinacio@cpejr.com.br [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil). Servico de Tecnologia de Reatores; Pereira, Claubia; Veloso, Maria Auxiliadora F., E-mail: claubia@nuclear.ufmg.br, E-mail: dora@nuclear.ufmg.br [Universidade Federal de Minas Gerais (UFMG), Belo Horizonte, MG (Brazil). Dept. de Engenharia Nuclear

    2015-07-01

    Sample size and computational uncertainty were varied in order to investigate sampling efficiency and convergence of the sampling-based method for uncertainty propagation. The transport code MCNPX was used to simulate a LWR model and allow the mapping from uncertain inputs of the benchmark experiment to uncertain outputs. Random sampling efficiency was improved through the use of an algorithm for selecting distributions. Mean range, standard deviation range and skewness were verified in order to obtain a better representation of the uncertainty figures. A standard deviation of 5 pcm in the propagated uncertainties over 10 n-sample replicates was adopted as the convergence criterion for the method. An uncertainty of 75 pcm on the reactor k{sub eff} was estimated by using a sample of size 93 and a computational uncertainty of 28 pcm to propagate the 1σ uncertainty of the burnable poison radius. For a fixed computational time, in order to reduce the variance of the propagated uncertainty, it was found for the example under investigation that it is preferable to double the sample size rather than to double the number of particles followed by the Monte Carlo process in the MCNPX code. (author)
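
    The replicate-based convergence check can be sketched as follows, with a toy linear response standing in for the MCNPX transport calculation, invented input numbers, and an arbitrary threshold mimicking the 5 pcm-style criterion:

```python
import random
import statistics

random.seed(7)

def propagate_once(n_samples):
    """One replicate of the sampling-based propagation: sample the uncertain
    input, run a toy model, return the output standard deviation (the
    propagated uncertainty)."""
    outs = []
    for _ in range(n_samples):
        radius = random.gauss(0.5, 0.01)      # 1-sigma uncertainty on the input
        keff = 1.0 - 0.2 * (radius - 0.5)     # toy linear reactivity response
        outs.append(keff)
    return statistics.stdev(outs)

# Repeat the whole propagation 10 times; the replicate-to-replicate spread of
# the propagated uncertainty serves as the convergence measure
replicates = [propagate_once(93) for _ in range(10)]
spread = statistics.stdev(replicates)
converged = spread < 0.001   # arbitrary analogue of the 5 pcm criterion
```

If `converged` were False, one would enlarge the sample size (here, before adding inner-model fidelity), mirroring the trade-off the authors report.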

  12. An algorithm to improve sampling efficiency for uncertainty propagation using sampling based method

    International Nuclear Information System (INIS)

    Sample size and computational uncertainty were varied in order to investigate sampling efficiency and convergence of the sampling-based method for uncertainty propagation. The transport code MCNPX was used to simulate a LWR model and allow the mapping from uncertain inputs of the benchmark experiment to uncertain outputs. Random sampling efficiency was improved through the use of an algorithm for selecting distributions. Mean range, standard deviation range and skewness were verified in order to obtain a better representation of the uncertainty figures. A standard deviation of 5 pcm in the propagated uncertainties over 10 n-sample replicates was adopted as the convergence criterion for the method. An uncertainty of 75 pcm on the reactor keff was estimated by using a sample of size 93 and a computational uncertainty of 28 pcm to propagate the 1σ uncertainty of the burnable poison radius. For a fixed computational time, in order to reduce the variance of the propagated uncertainty, it was found for the example under investigation that it is preferable to double the sample size rather than to double the number of particles followed by the Monte Carlo process in the MCNPX code. (author)

  13. Applied research in uncertainty modeling and analysis

    CERN Document Server

    Ayyub, Bilal

    2005-01-01

    Uncertainty has been a concern to engineers, managers, and scientists for many years. For a long time uncertainty was considered synonymous with random, stochastic, statistical, or probabilistic. Since the early sixties, views on uncertainty have become more heterogeneous. In the past forty years numerous tools that model uncertainty, above and beyond statistics, have been proposed by engineers and scientists. The tool or method used to model uncertainty in a specific context should be chosen by considering the features of the phenomenon under consideration, not independently of what is known about the system and what causes its uncertainty. In this fascinating overview of the field, the authors provide broad coverage of uncertainty analysis/modeling and its application. Applied Research in Uncertainty Modeling and Analysis presents the perspectives of various researchers and practitioners on uncertainty analysis and modeling outside their own fields and domain expertise. Rather than focusing explicitly on...

  14. Probabilistic forecasts based on radar rainfall uncertainty

    Science.gov (United States)

    Liguori, S.; Rico-Ramirez, M. A.

    2012-04-01

    The potential advantages resulting from integrating weather radar rainfall estimates into hydro-meteorological forecasting systems are limited by the inherent uncertainty affecting radar rainfall measurements, which is due to various sources of error [1-3]. The improvement of quality control and correction techniques is recognized to play a role in the future improvement of radar-based flow predictions. However, knowledge of the uncertainty affecting radar rainfall data can also be used effectively to build a hydro-meteorological forecasting system in a probabilistic framework. This work discusses the results of the implementation of a novel probabilistic forecasting system developed to improve ensemble predictions over a small urban area located in the North of England. An ensemble of radar rainfall fields can be determined as the sum of a deterministic component and a perturbation field, the latter being informed by knowledge of the spatial-temporal characteristics of the radar error assessed with reference to rain-gauge measurements. This approach is similar to the REAL system [4] developed for use in the Southern Alps. The radar uncertainty estimate can then be propagated with a nowcasting model, used to extrapolate an ensemble of radar rainfall forecasts, which can ultimately drive hydrological ensemble predictions. A radar ensemble generator has been calibrated using radar rainfall data made available by the UK Met Office after applying post-processing and correction algorithms [5-6]. One-hour rainfall accumulations from 235 rain gauges recorded for the year 2007 have provided the reference to determine the radar error. Statistics describing the spatial characteristics of the error (i.e. mean and covariance) have been computed off-line at gauge locations, along with the parameters describing the error temporal correlation. 
A system has then been set up to impose the space-time error properties to stochastic perturbations, generated in real-time at
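Imposing a mean and covariance on stochastic perturbations, as this record describes, amounts to factoring the error covariance; a minimal two-gauge sketch (all numbers hypothetical, with a hand-written 2x2 Cholesky factor):

```python
import math
import random

def cholesky2(var1, var2, corr):
    """Cholesky factor of a 2x2 covariance matrix with correlation corr."""
    s1, s2 = math.sqrt(var1), math.sqrt(var2)
    return [[s1, 0.0],
            [s2 * corr, s2 * math.sqrt(1.0 - corr * corr)]]

def perturbation(mean, L, rng):
    """Draw one spatially correlated error field at two gauge sites:
    x = mean + L z, with z a vector of independent standard normals."""
    z = [rng.gauss(0, 1), rng.gauss(0, 1)]
    return [mean[i] + sum(L[i][j] * z[j] for j in range(2)) for i in range(2)]

rng = random.Random(1)
L = cholesky2(0.4, 0.4, 0.8)          # hypothetical radar-error covariance
fields = [perturbation([0.1, 0.1], L, rng) for _ in range(5000)]
```

Each draw is one member of the perturbation ensemble; in the full system the same construction runs over the whole radar grid with a space-time covariance model.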

  15. Computational Modeling of Uncertainty Avoidance in Consumer Behavior

    NARCIS (Netherlands)

    Roozmand, O.; Ghasem-Aghaee, N.; Nematbakhsh, M.A.; Baraani, A.; Hofstede, G.J.

    2011-01-01

    Human purchasing behavior is affected by many influential factors. Culture at macro-level and personality at micro-level influence consumer purchasing behavior. People of different cultures tend to accept the values of their own group and consequently have different purchasing behavior. Also, people

  16. An Uncertainty Quantification Framework for Prognostics and Condition-Based Monitoring

    Science.gov (United States)

    Sankararaman, Shankar; Goebel, Kai

    2014-01-01

    This paper presents a computational framework for uncertainty quantification in prognostics in the context of condition-based monitoring of aerospace systems. The different sources of uncertainty and the various uncertainty quantification activities in condition-based prognostics are outlined in detail, and it is demonstrated that the Bayesian subjective approach is suitable for interpreting uncertainty in online monitoring. A state-space model-based framework for prognostics that can rigorously account for the various sources of uncertainty is presented. Prognostics consists of two important steps. First, the state of the system is estimated using Bayesian tracking; then, the future states of the system are predicted until failure, thereby computing the remaining useful life of the system. The proposed framework is illustrated using the power system of a planetary rover test-bed, which is being developed and studied at NASA Ames Research Center.

  17. Generalized likelihood uncertainty estimation (GLUE) using adaptive Markov chain Monte Carlo sampling

    DEFF Research Database (Denmark)

    Blasone, Roberta-Serena; Vrugt, Jasper A.; Madsen, Henrik;

    2008-01-01

    … estimate of the associated uncertainty. This uncertainty arises from incomplete process representation, uncertainty in initial conditions, input, output and parameter error. The generalized likelihood uncertainty estimation (GLUE) framework was one of the first attempts to represent prediction uncertainty within the context of Monte Carlo (MC) analysis coupled with Bayesian estimation and propagation of uncertainty. Because of its flexibility, ease of implementation and its suitability for parallel implementation on distributed computer systems, the GLUE method has been used in a wide variety of applications. However, the MC-based sampling strategy of the prior parameter space typically utilized in GLUE is not particularly efficient in finding behavioral simulations. This becomes especially problematic for high-dimensional parameter estimation problems, and in the case of complex simulation models…
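A minimal sketch of the GLUE idea this record refers to, assuming a one-parameter toy model, a flat prior, and an informal likelihood with a behavioral threshold (all of these choices are hypothetical illustrations, not the paper's setup):

```python
import random

def glue(simulate, observed, n_draws=20000, threshold=0.3, seed=0):
    """Minimal GLUE sketch: sample the prior, keep 'behavioral' parameter
    sets whose informal likelihood exceeds a threshold, and weight
    predictions by that likelihood."""
    rng = random.Random(seed)
    behavioral = []
    for _ in range(n_draws):
        theta = rng.uniform(0.0, 2.0)           # flat prior on one parameter
        pred = simulate(theta)
        like = max(0.0, 1.0 - (pred - observed) ** 2)   # informal likelihood
        if like > threshold:
            behavioral.append((theta, like, pred))
    total = sum(w for _, w, _ in behavioral)
    mean_pred = sum(w * p for _, w, p in behavioral) / total
    return behavioral, mean_pred

sim = lambda theta: theta ** 2                  # toy model
sets, mp = glue(sim, observed=1.0)
```

The inefficiency criticized in the abstract is visible here: plain MC sampling of the prior discards every non-behavioral draw, which is what the adaptive MCMC sampling is meant to avoid.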

  18. Local conditions and uncertainty bands for Semiscale Test S-02-9

    International Nuclear Information System (INIS)

    Analysis was performed to derive local-conditions heat transfer parameters and their uncertainties, using computer codes and experimentally derived boundary conditions, for the Semiscale core for LOCA Test S-02-9. The calculations performed consisted of nominal code cases using best-estimate input parameters, and cases in which the specified input parameters were perturbed in accordance with the response surface method of uncertainty analysis. The output parameters of interest were those used in film boiling heat transfer correlations, including enthalpy, pressure, quality, and coolant flow rate. Large uncertainty deviations occurred during low core mass flow periods, where the relative flow uncertainties were large. Utilizing the derived local conditions and their associated uncertainties, a study was then made which showed that the uncertainty in the film boiling heat transfer coefficient varied between 5 and 250%

  19. Local conditions and uncertainty bands for Semiscale Test S-02-9. [PWR

    Energy Technology Data Exchange (ETDEWEB)

    Varacalle, Jr, D J

    1979-01-01

    Analysis was performed to derive local-conditions heat transfer parameters and their uncertainties, using computer codes and experimentally derived boundary conditions, for the Semiscale core for LOCA Test S-02-9. The calculations performed consisted of nominal code cases using best-estimate input parameters, and cases in which the specified input parameters were perturbed in accordance with the response surface method of uncertainty analysis. The output parameters of interest were those used in film boiling heat transfer correlations, including enthalpy, pressure, quality, and coolant flow rate. Large uncertainty deviations occurred during low core mass flow periods, where the relative flow uncertainties were large. Utilizing the derived local conditions and their associated uncertainties, a study was then made which showed that the uncertainty in the film boiling heat transfer coefficient varied between 5 and 250%.

  20. Validation uncertainty of MATRA code for subchannel void distributions

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Dae-Hyun; Kim, S. J.; Kwon, H.; Seo, K. W. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-10-15

    To extend code capability to whole-core subchannel analysis, pre-conditioned Krylov matrix solvers such as BiCGSTAB and GMRES were implemented in the MATRA code, as well as parallel computing algorithms using MPI and OpenMP. The code is written in Fortran 90 and has some user-friendly features such as a graphical user interface. The MATRA code was approved by the Korean regulatory body for design calculations of the integral-type PWR named SMART. The major role of a subchannel code is to evaluate the core thermal margin through hot channel analysis and uncertainty evaluation for CHF predictions. In addition, it can potentially be used for best estimation of the core thermal-hydraulic field by incorporation into multi-physics and/or multi-scale code systems. In this study we examined a validation process for the subchannel code MATRA, specifically in the prediction of subchannel void distributions. The primary objective of validation is to estimate a range within which the simulation modeling error lies. The experimental data for subchannel void distributions at steady-state and transient conditions were provided in the framework of the OECD/NEA UAM benchmark program. The validation uncertainty of the MATRA code was evaluated for a specific experimental condition by comparing the simulation result and the experimental data. A validation process should be preceded by code and solution verification; however, quantification of the verification uncertainty was not addressed in this study. The validation uncertainty of the MATRA code for predicting subchannel void distribution was evaluated for a single data point of void fraction measurement in a 5x5 PWR test bundle in the framework of the OECD UAM benchmark program. The validation standard uncertainties were evaluated as 4.2%, 3.9%, and 2.8% with the Monte Carlo approach at the axial levels of 2216 mm, 2669 mm, and 3177 mm, respectively. The sensitivity coefficient approach yielded similar uncertainties but did not account for the nonlinear effects on the

  1. Validation uncertainty of MATRA code for subchannel void distributions

    International Nuclear Information System (INIS)

    To extend code capability to whole-core subchannel analysis, pre-conditioned Krylov matrix solvers such as BiCGSTAB and GMRES were implemented in the MATRA code, as well as parallel computing algorithms using MPI and OpenMP. The code is written in Fortran 90 and has some user-friendly features such as a graphical user interface. The MATRA code was approved by the Korean regulatory body for design calculations of the integral-type PWR named SMART. The major role of a subchannel code is to evaluate the core thermal margin through hot channel analysis and uncertainty evaluation for CHF predictions. In addition, it can potentially be used for best estimation of the core thermal-hydraulic field by incorporation into multi-physics and/or multi-scale code systems. In this study we examined a validation process for the subchannel code MATRA, specifically in the prediction of subchannel void distributions. The primary objective of validation is to estimate a range within which the simulation modeling error lies. The experimental data for subchannel void distributions at steady-state and transient conditions were provided in the framework of the OECD/NEA UAM benchmark program. The validation uncertainty of the MATRA code was evaluated for a specific experimental condition by comparing the simulation result and the experimental data. A validation process should be preceded by code and solution verification; however, quantification of the verification uncertainty was not addressed in this study. The validation uncertainty of the MATRA code for predicting subchannel void distribution was evaluated for a single data point of void fraction measurement in a 5x5 PWR test bundle in the framework of the OECD UAM benchmark program. The validation standard uncertainties were evaluated as 4.2%, 3.9%, and 2.8% with the Monte Carlo approach at the axial levels of 2216 mm, 2669 mm, and 3177 mm, respectively. The sensitivity coefficient approach yielded similar uncertainties but did not account for the nonlinear effects on the
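The record does not reproduce its formulas; one common convention (the ASME V&V 20 standard, an assumption here) combines the numerical, input, and experimental contributions to the validation standard uncertainty in quadrature and compares it with the simulation-minus-data error:

```python
import math

def validation_uncertainty(u_num, u_input, u_exp):
    """Standard validation uncertainty: numerical, input, and experimental
    contributions combined in quadrature (ASME V&V 20 convention)."""
    return math.sqrt(u_num**2 + u_input**2 + u_exp**2)

# Hypothetical void-fraction values and uncertainties (absolute, in %):
E = 25.3 - 23.1                     # comparison error: simulation - data
u_val = validation_uncertainty(u_num=1.0, u_input=3.5, u_exp=2.0)
# |E| < 2*u_val suggests the model error cannot be distinguished from the
# combined uncertainty at roughly the 95% level.
```

This is the sense in which validation "estimates a range within which the simulation modeling error lies": E is the point estimate of that error and u_val its standard uncertainty.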

  2. SENSIT: a cross-section and design sensitivity and uncertainty analysis code

    International Nuclear Information System (INIS)

    SENSIT computes the sensitivity and uncertainty of a calculated integral response (such as a dose rate) due to input cross sections and their uncertainties. Sensitivity profiles are computed for neutron and gamma-ray reaction cross sections (of standard multigroup cross-section sets) and for secondary energy distributions (SEDs) of multigroup scattering matrices. In the design sensitivity mode, SENSIT computes changes in an integral response due to design changes and gives the appropriate sensitivity coefficients. Cross-section uncertainty analyses are performed for three types of input data uncertainties: cross-section covariance matrices for pairs of multigroup reaction cross sections, spectral shape uncertainty parameters for secondary energy distributions (integral SED uncertainties), and covariance matrices for energy-dependent response functions. For all three types of data uncertainties, SENSIT computes the resulting variance and expected standard deviation in an integral response of interest, on the basis of generalized perturbation theory. SENSIT uses angular-flux files from one-dimensional discrete-ordinates codes like ONETRAN, ANISN, and DTF, and reads multigroup cross-section sets in three different formats. This report gives detailed input specifications; precise definitions of all input and output arrays; a discussion of the underlying theory; and details of program flow, data management, and storage requirements. Eight sample problems are described in detail, for which complete input files and selected output prints are listed. 4 figures, 3 tables
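The variance computation described above can be illustrated with the first-order "sandwich rule", which maps a sensitivity vector and a covariance matrix to a response variance; the numbers below are hypothetical:

```python
def sandwich_variance(S, C):
    """First-order uncertainty propagation ("sandwich rule"):
    var(R) = S^T C S for sensitivity vector S and covariance matrix C."""
    n = len(S)
    return sum(S[i] * C[i][j] * S[j] for i in range(n) for j in range(n))

# Two correlated multigroup cross sections (hypothetical numbers):
S = [0.8, -0.3]                          # relative sensitivities (dR/R)/(dx/x)
C = [[0.04, 0.01],                       # relative covariance matrix
     [0.01, 0.09]]
var = sandwich_variance(S, C)
std = var ** 0.5                         # expected relative standard deviation
```

Note how the off-diagonal covariance term reduces the variance here because the two sensitivities have opposite signs; this interplay is exactly what a covariance-aware code like SENSIT accounts for.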

  3. Adjoint-Based Uncertainty Quantification with MCNP

    Energy Technology Data Exchange (ETDEWEB)

    Seifried, Jeffrey E. [Univ. of California, Berkeley, CA (United States)

    2011-09-01

    This work serves to quantify the instantaneous uncertainties in neutron transport simulations born from nuclear data and statistical counting uncertainties. Perturbation and adjoint theories are used to derive implicit sensitivity expressions. These expressions are transformed into forms that are convenient for construction with MCNP6, creating the ability to perform adjoint-based uncertainty quantification with MCNP6. These new tools are exercised on the depleted-uranium hybrid LIFE blanket, quantifying its sensitivities and uncertainties to important figures of merit. Overall, these uncertainty estimates are small (< 2%). Having quantified the sensitivities and uncertainties, physical understanding of the system is gained and some confidence in the simulation is acquired.

  4. Efficient uncertainty analysis using optimal statistical estimator

    International Nuclear Information System (INIS)

    When performing best-estimate calculations, uncertainty needs to be quantified. Different uncertainty methods have been developed around the world for uncertainty evaluation. In the present work an optimal statistical estimator algorithm was adapted, extended and used for response surface generation. The objective of the study was to demonstrate its applicability for uncertainty evaluation of single-valued or continuous-valued parameters. The results showed that the optimal statistical estimator is efficient in predicting lower and upper uncertainty bounds. This makes it possible to apply the CsA method for uncertainty evaluation of any kind of transient. (author)

  5. Integrated Arrival and Departure Schedule Optimization Under Uncertainty

    Science.gov (United States)

    Xue, Min; Zelinski, Shannon

    2014-01-01

    In terminal airspace, integrating arrivals and departures with shared waypoints provides the potential of improving operational efficiency by allowing direct routes when possible. Incorporating stochastic evaluation as a post-analysis process of deterministic optimization, and imposing a safety buffer in deterministic optimization, are two ways to learn and alleviate the impact of uncertainty and to avoid unexpected outcomes. This work presents a third and direct way to take uncertainty into consideration during the optimization. The impact of uncertainty was incorporated into cost evaluations when searching for the optimal solutions. The controller intervention count was computed using a heuristic model and served as another stochastic cost besides total delay. Costs under uncertainty were evaluated using Monte Carlo simulations. The Pareto fronts that contain a set of solutions were identified, and the trade-off between delays and controller intervention count was shown. Solutions that shared similar delays but had different intervention counts were investigated. The results showed that optimization under uncertainty could identify compromise solutions on Pareto fronts, which is better than deterministic optimization with extra safety buffers. It helps decision-makers reduce controller intervention while achieving low delays.
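Identifying the Pareto front over the two stochastic costs (total delay and controller intervention count) reduces to filtering non-dominated solutions; a minimal sketch with made-up cost pairs, both objectives minimized:

```python
def pareto_front(solutions):
    """Keep solutions not dominated in (delay, intervention_count):
    a solution is dropped if another one is at least as good in both
    objectives and is a different point."""
    front = []
    for s in solutions:
        dominated = any(o[0] <= s[0] and o[1] <= s[1] and o != s
                        for o in solutions)
        if not dominated:
            front.append(s)
    return sorted(front)

# Hypothetical (delay, intervention count) pairs from Monte Carlo scoring:
candidates = [(10, 5), (12, 3), (11, 4), (10, 6), (15, 2), (12, 4)]
front = pareto_front(candidates)
```

In the paper's setting each candidate's costs would themselves be Monte Carlo averages; the dominance filter is unchanged.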

  6. Hydrocarbon exploration risk evaluation through uncertainty and sensitivity analyses techniques

    International Nuclear Information System (INIS)

    The evaluation of the exploration risk in the oil industry is a fundamental component of the decision process related to the exploratory phase. In this paper the two basic components of the exploratory risk, trap geometry and trapped hydrocarbon quantities (fluid), are compounded in a single coherent uncertainty and sensitivity approach. The results clarify that the model geometry influences each Petroleum System Modeling step and that the geometric uncertainty is correlated with the fluid uncertainty. The geometric uncertainty evaluation makes use of geostatistical techniques that produce a number of possible realizations of the trap geometry, all compatible with the available data. The evaluation of the fluid uncertainty, through a Monte Carlo methodology, allows us to compute the possible quantities of oil and gas generated in a basin and migrated from the hydrocarbon source location to each single trap. The final result is the probability distribution of oil and gas for each trap in the basin, together with other useful indicators such as the hydrocarbon filling probability map, the closure probability map, the drainage area probability map, the spilling-path probabilities, and the trap-filling scenarios
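The Monte Carlo fluid-uncertainty step can be sketched with a standard volumetric product; every distribution and parameter below is a hypothetical illustration, not the paper's calibration:

```python
import math
import random

def oil_in_place(rng):
    """One volumetric realization: OOIP = GRV * NtG * phi * (1 - Sw) / Bo."""
    grv = rng.lognormvariate(math.log(5e7), 0.3)   # gross rock volume, m3
    ntg = rng.uniform(0.6, 0.9)                    # net-to-gross ratio
    phi = rng.gauss(0.22, 0.03)                    # porosity
    sw = rng.uniform(0.2, 0.4)                     # water saturation
    bo = rng.uniform(1.1, 1.3)                     # formation volume factor
    return grv * ntg * phi * (1.0 - sw) / bo

rng = random.Random(7)
samples = sorted(oil_in_place(rng) for _ in range(10000))
p90, p50, p10 = samples[999], samples[4999], samples[8999]  # low/mid/high
```

The empirical distribution of `samples` plays the role of the per-trap probability distribution of hydrocarbon quantities described in the abstract.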

  7. Uncertainty analysis of dose estimation

    International Nuclear Information System (INIS)

    This article approaches radiation dose measurement from the standpoint of quantifying the associated uncertainty. Before introducing uncertainty quantification, it is therefore useful to recap a few common terminologies of radiation dosimetry. We know that when radiation interacts with matter, electrons are removed from atoms through the process called ionization. When the energy of the radiation is not strong enough, electron excitation (jumps from lower shells to upper shells) may occur instead. The excitation of molecules or the breaking of molecular bonds can also occur, causing damage to living cells. When the energy of the radiation is large enough to produce ionizations, the radiation is called ionizing radiation; otherwise, it is called non-ionizing radiation. When dealing with the interaction of gamma and X-rays in air, the term exposure is used. Exposure measures the electric charge (positive or negative) produced by electromagnetic radiation in a unit mass of air at normal atmospheric conditions

  8. Uncertainty of the calibration factor

    International Nuclear Information System (INIS)

    According to present definitions, an error is the difference between a measured value and the ''true'' value. Thus an error has both a numerical value and a sign. In contrast, the uncertainty associated with a measurement is a parameter that characterizes the dispersion of the values ''that could reasonably be attributed to the measurand''. This parameter is normally an estimated standard deviation. An uncertainty, therefore, has no known sign and is usually assumed to be symmetrical. It is a measure of our lack of exact knowledge, after all recognized ''systematic'' effects have been eliminated by applying appropriate corrections. If errors were known exactly, the true value could be determined and there would be no problem left. In reality, errors are estimated in the best possible way and corrections made for them. Therefore, after application of all known corrections, errors need no further consideration (their expectation value being zero) and the only quantities of interest are uncertainties. 3 refs, 2 figs
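The distinction drawn here, applying known corrections first and then quoting a signless dispersion, can be made concrete with a short example; the readings and the correction value are invented:

```python
import statistics

readings = [10.21, 10.19, 10.24, 10.18, 10.22, 10.20]   # repeated readings
correction = -0.05   # known systematic effect, applied WITH its sign

corrected = [r + correction for r in readings]
best_estimate = statistics.mean(corrected)

# Type A standard uncertainty of the mean: a signless dispersion measure,
# left over after all recognized systematic effects have been corrected.
u_A = statistics.stdev(corrected) / len(corrected) ** 0.5
```

The correction carries a sign because it compensates a known error; `u_A` carries none because it describes residual dispersion, not a known offset.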

  9. Visualizing Summary Statistics and Uncertainty

    KAUST Repository

    Potter, K.

    2010-08-12

    The graphical depiction of uncertainty information is emerging as a problem of great importance. Scientific data sets are not considered complete without indications of error, accuracy, or levels of confidence. The visual portrayal of this information is a challenging task. This work takes inspiration from graphical data analysis to create visual representations that show not only the data value, but also important characteristics of the data including uncertainty. The canonical box plot is reexamined and a new hybrid summary plot is presented that incorporates a collection of descriptive statistics to highlight salient features of the data. Additionally, we present an extension of the summary plot to two dimensional distributions. Finally, a use-case of these new plots is presented, demonstrating their ability to present high-level overviews as well as detailed insight into the salient features of the underlying data distribution. © 2010 The Eurographics Association and Blackwell Publishing Ltd.
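The descriptive statistics behind the canonical box plot reexamined in this work can be computed directly; the data vector is made up, and the quartiles follow Python's default "exclusive" quantile method:

```python
import statistics

data = [2.1, 3.4, 3.9, 4.2, 4.4, 4.7, 5.1, 5.6, 6.0, 9.8]

q1, q2, q3 = statistics.quantiles(data, n=4)        # quartiles (median = q2)
iqr = q3 - q1                                       # interquartile range

# Whiskers extend to the most extreme points within 1.5*IQR of the box;
# anything beyond is flagged as an outlier.
whisker_lo = min(x for x in data if x >= q1 - 1.5 * iqr)
whisker_hi = max(x for x in data if x <= q3 + 1.5 * iqr)
outliers = [x for x in data if x < q1 - 1.5 * iqr or x > q3 + 1.5 * iqr]
```

A hybrid summary plot like the one proposed layers further descriptors (e.g. mean, skewness, modality) on top of this five-number core.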

  10. Extended uncertainty from first principles

    Directory of Open Access Journals (Sweden)

    Raimundo N. Costa Filho

    2016-04-01

    A translation operator acting in a space with a diagonal metric is introduced to describe the motion of a particle in a quantum system. We show that the momentum operator and, as a consequence, the uncertainty relation now depend on the metric. It is also shown that, for any metric expanded up to second order, this formalism naturally leads to an extended uncertainty principle (EUP) with a minimum momentum dispersion. The Ehrenfest theorem is modified to include an additional term related to a tidal force arising from the space curvature introduced by the metric. For one-dimensional systems, we show how to map a harmonic potential to an effective potential in Euclidean space using different metrics.

  11. Extended uncertainty from first principles

    Science.gov (United States)

    Costa Filho, Raimundo N.; Braga, João P. M.; Lira, Jorge H. S.; Andrade, José S.

    2016-04-01

    A translation operator acting in a space with a diagonal metric is introduced to describe the motion of a particle in a quantum system. We show that the momentum operator and, as a consequence, the uncertainty relation now depend on the metric. It is also shown that, for any metric expanded up to second order, this formalism naturally leads to an extended uncertainty principle (EUP) with a minimum momentum dispersion. The Ehrenfest theorem is modified to include an additional term related to a tidal force arising from the space curvature introduced by the metric. For one-dimensional systems, we show how to map a harmonic potential to an effective potential in Euclidean space using different metrics.
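The abstracts do not reproduce the inequality itself; a commonly quoted form of an EUP with a minimum momentum dispersion (the coefficient α, parametrizing the curvature correction, is an assumption here, not taken from the paper) is:

```latex
\Delta x\,\Delta p \;\ge\; \frac{\hbar}{2}\Bigl(1 + \alpha\,(\Delta x)^2\Bigr)
\quad\Longrightarrow\quad
\Delta p \;\ge\; \frac{\hbar}{2}\left(\frac{1}{\Delta x} + \alpha\,\Delta x\right),
```

whose right-hand side is minimized at $\Delta x = 1/\sqrt{\alpha}$, giving the minimum momentum dispersion $\Delta p_{\min} = \hbar\sqrt{\alpha}$.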

  12. NASA Team 2 Sea Ice Concentration Algorithm Retrieval Uncertainty

    Science.gov (United States)

    Brucker, Ludovic; Cavalieri, Donald J.; Markus, Thorsten; Ivanoff, Alvaro

    2014-01-01

    Satellite microwave radiometers are widely used to estimate sea ice cover properties (concentration, extent, and area) through the use of sea ice concentration (IC) algorithms. Rare are the algorithms providing associated IC uncertainty estimates. Algorithm uncertainty estimates are needed to assess accurately global and regional trends in IC (and thus extent and area), and to improve sea ice predictions on seasonal to interannual timescales using data assimilation approaches. This paper presents a method to provide relative IC uncertainty estimates using the enhanced NASA Team (NT2) IC algorithm. The proposed approach takes advantage of the NT2 calculations and solely relies on the brightness temperatures (TBs) used as input. NT2 IC and its associated relative uncertainty are obtained for both the Northern and Southern Hemispheres using the Advanced Microwave Scanning Radiometer for the Earth Observing System (AMSR-E) TB. NT2 IC relative uncertainties estimated on a footprint-by-footprint swath-by-swath basis were averaged daily over each 12.5-km grid cell of the polar stereographic grid. For both hemispheres and throughout the year, the NT2 relative uncertainty is less than 5%. In the Southern Hemisphere, it is low in the interior ice pack, and it increases in the marginal ice zone up to 5%. In the Northern Hemisphere, areas with high uncertainties are also found in the high IC area of the Central Arctic. Retrieval uncertainties are greater in areas corresponding to NT2 ice types associated with deep snow and new ice. Seasonal variations in uncertainty show larger values in summer as a result of melt conditions and greater atmospheric contributions. Our analysis also includes an evaluation of the NT2 algorithm sensitivity to AMSR-E sensor noise. 
There is a 60% probability that the IC does not change (to within the computed retrieval precision of 1%) due to sensor noise, and the cumulated probability shows that there is a 90% chance that the IC varies by less than

  13. 'spup' - an R package for uncertainty propagation in spatial environmental modelling

    Science.gov (United States)

    Sawicka, Kasia; Heuvelink, Gerard

    2016-04-01

    Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Currently, advances in uncertainty propagation and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition as universally applicable, including for case studies with spatial models and spatial model inputs. Due to the growing popularity and applicability of the open source R programming language, we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining the uncertainty propagation starting from input data and model parameters, via the environmental model, onto model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo (MC) techniques, as well as several uncertainty visualization functions. Uncertain environmental variables are represented in the package as objects whose attribute values may be uncertain and described by probability distributions. Both numerical and categorical data types are handled. Spatial auto-correlation within an attribute and cross-correlation between attributes are also accommodated. For uncertainty propagation the package implements the MC approach with efficient sampling algorithms, i.e. stratified random sampling and Latin hypercube sampling. The design includes facilitation of parallel computing to speed up MC computation. The MC realizations may be used as an input to the environmental models called from R, or externally. 
Selected static and interactive visualization methods that are understandable by non-experts with limited background in
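One of the sampling algorithms mentioned in this record, Latin hypercube sampling, can be sketched in a few lines of pure Python on the unit cube ('spup' itself is not used here; it is an illustration of the technique, not of the package's API):

```python
import random

def latin_hypercube(n, dims, rng):
    """Latin hypercube sample on the unit cube: along each dimension the
    interval [0,1) is cut into n strata and each stratum receives exactly
    one point, giving better space coverage than plain random sampling."""
    sample = []
    for _ in range(dims):
        strata = [(i + rng.random()) / n for i in range(n)]  # one per stratum
        rng.shuffle(strata)                                  # decorrelate dims
        sample.append(strata)
    return list(zip(*sample))          # n points, each of length dims

rng = random.Random(3)
pts = latin_hypercube(10, 2, rng)
```

Stratification guarantees marginal coverage per dimension; the shuffle randomizes how strata are paired across dimensions.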

  14. Uncertainties in effective dose estimates of adult CT head scans: The effect of head size

    International Nuclear Information System (INIS)

    Purpose: This study is an extension of a previous study where the uncertainties in effective dose estimates from adult CT head scans were calculated using four CT effective dose estimation methods, three of which were computer programs (CT-EXPO, CTDOSIMETRY, and IMPACTDOSE) and one that involved the dose length product (DLP). However, that study did not include the uncertainty contribution due to variations in head sizes. Methods: The uncertainties due to head size variations were estimated by first using the computer program data to calculate doses to small and large heads. These doses were then compared with doses calculated for the phantom heads used by the computer programs. An uncertainty was then assigned based on the difference between the small and large head doses and the doses of the phantom heads. Results: The uncertainties due to head size variations alone were found to be between 4% and 26% depending on the method used and the patient gender. When these uncertainties were included with the results of the previous study, the overall uncertainties in effective dose estimates (stated at the 95% confidence interval) were 20%-31% (CT-EXPO), 15%-30% (CTDOSIMETRY), 20%-36% (IMPACTDOSE), and 31%-40% (DLP). Conclusions: For the computer programs, the lower overall uncertainties were still achieved when measured values of CT dose index were used rather than tabulated values. For DLP dose estimates, head size variations made the largest (for males) and second largest (for females) contributions to effective dose uncertainty. An improvement in the uncertainty of the DLP method dose estimates will be achieved if head size variation can be taken into account.
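Combining independent relative uncertainty contributions in quadrature and expanding with a coverage factor of k = 2 for roughly 95% confidence follows the usual GUM convention; the three contributions below are hypothetical, not the study's values:

```python
import math

def combined_expanded(rel_uncertainties, k=2.0):
    """Combine independent relative standard uncertainties in quadrature
    and expand with coverage factor k (k = 2 ~ 95% confidence)."""
    u_c = math.sqrt(sum(u * u for u in rel_uncertainties))
    return k * u_c

# Hypothetical contributions to an effective dose estimate, in %
# (e.g. conversion coefficient, head-size variation, CTDI measurement):
U95 = combined_expanded([10.0, 13.0, 4.0])
```

Because contributions add in quadrature, the largest term dominates: that is why a 4-26% head-size contribution can noticeably widen the overall 95% interval for some methods and barely move it for others.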

  15. Bidding wind energy under uncertainty

    OpenAIRE

    Usaola, Julio; Angarita, Jorge

    2007-01-01

    The integration of wind energy into electricity markets implies that wind energy producers must commit their production for a given time period. This requires the use of short-term wind power prediction tools to prepare bids for the spot market. The output of these tools has limited accuracy and, therefore, these predictions are uncertain. Optimal bids must take this uncertainty into account in order to obtain the maximum revenue from the sale of energy, minimizing losses due to imbal...
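Under a simple imbalance-cost model (asymmetric per-MWh penalties, a pinball-loss assumption not spelled out in this truncated abstract), the revenue-maximizing bid is a quantile of the probabilistic forecast:

```python
def optimal_bid(forecast_samples, penalty_short, penalty_surplus):
    """Minimizing expected imbalance cost
        c_s * E[(bid - W)+] + c_p * E[(W - bid)+]
    gives the bid at the quantile F(bid) = c_p / (c_s + c_p) of the
    forecast distribution of production W."""
    tau = penalty_surplus / (penalty_short + penalty_surplus)
    s = sorted(forecast_samples)
    idx = min(len(s) - 1, int(tau * len(s)))
    return s[idx]

# Hypothetical forecast scenarios (MWh) and penalties (EUR/MWh):
forecast = [80, 95, 100, 105, 110, 115, 120, 130, 140, 160]
bid = optimal_bid(forecast, penalty_short=30.0, penalty_surplus=10.0)
```

When falling short of the bid is costlier than spilling surplus, tau is small and the optimal bid sits in the lower tail of the forecast, i.e. the producer deliberately under-bids its expected output.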

  16. International Capital Movements Under Uncertainty

    OpenAIRE

    1983-01-01

    In this paper, we analyze the determinants of international movements of physical capital in a model with uncertainty and international trade in goods and securities. In our model, the world allocation of capital is governed, to some extent, by the asset preferences of risk-averse consumer-investors. In a one-good variant in the spirit of the MacDougall model, we find that relative factor abundance, relative labor force size and relative production riskiness have separate but interrelated infl...

  17. Information, uncertainty and holographic action

    CERN Document Server

    Dikken, Robbert-Jan

    2016-01-01

    In this short note we show, through a simple derivation, the explicit relation between information flow and theories of the emergence of space-time and gravity, specifically for Newton's second law of motion. Next, in a rather straightforward derivation, the Heisenberg uncertainty relation is uncovered from the universal bound on information flow. A relation between the universal bound on information flow and the change in bulk action is also shown to exist.

  18. Conditional Independence in Uncertainty Theories

    OpenAIRE

    Shenoy, Prakash P.

    2013-01-01

    This paper introduces the notions of independence and conditional independence in valuation-based systems (VBS). VBS is an axiomatic framework capable of representing many different uncertainty calculi. We define independence and conditional independence in terms of factorization of the joint valuation. The definitions of independence and conditional independence in VBS generalize the corresponding definitions in probability theory. Our definitions apply not only to probability theory, but al...

  19. Uncertainty analysis for hot channel

    International Nuclear Information System (INIS)

The fulfillment of the safety analysis acceptance criteria is usually evaluated by separate hot channel calculations using the results of neutronic and/or thermohydraulic system calculations. In the case of an ATWS event (inadvertent withdrawal of a control assembly), according to the analysis, a number of fuel rods experience DNB for a longer time and must be regarded as failed. Their number must be determined for a further evaluation of the radiological consequences. In the deterministic approach, the global power history must be multiplied by different hot channel factors (kx) taking into account the radial power peaking factors for each fuel pin. If DNB occurs, it is necessary to perform a number of hot channel calculations to determine the limiting kx leading just to DNB and fuel failure (the conservative DNBR limit is 1.33). Knowing the pin power distribution from the core design calculation, the number of failed fuel pins can be calculated. The above procedure can also be performed with conservative assumptions (e.g. conservative input parameters in the hot channel calculations). In the case of hot channel uncertainty analysis, the relevant input parameters of the hot channel calculations (kx, mass flow, inlet temperature of the coolant, pin average burnup, initial gap size, selection of the power history influencing the gap conductance value) and the DNBR limit are varied considering the respective uncertainties. An uncertainty analysis methodology was elaborated combining the response surface method with the one-sided tolerance limit method of Wilks. The results of deterministic and uncertainty hot channel calculations are compared with regard to the number of failed fuel rods, the maximum clad surface temperature and the maximum fuel temperature (Authors)
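The one-sided tolerance limit method of Wilks mentioned above fixes the number of code runs needed so that the largest computed result bounds a given quantile with a given confidence. A sketch of the standard first-order sample-size calculation (a textbook formulation, not taken from this particular analysis):

```python
import math

def wilks_sample_size(beta=0.95, gamma=0.95):
    """Smallest n such that the largest of n independent random samples
    bounds the beta quantile of the output with confidence gamma
    (first-order, one-sided Wilks formula: 1 - beta**n >= gamma)."""
    n = 1
    while 1.0 - beta ** n < gamma:
        n += 1
    return n

# The classic 95%/95% criterion requires 59 code runs.
print(wilks_sample_size())  # -> 59
```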

  20. Knowledge, decision making, and uncertainty

    International Nuclear Information System (INIS)

Artificial intelligence (AI) systems depend heavily upon the ability to make decisions. Decisions require knowledge, yet there is no knowledge-based theory of decision making. To the extent that AI uses a theory of decision making, it adopts components of the traditional statistical view in which choices are made by maximizing some function of the probabilities of decision options. A knowledge-based scheme for reasoning about uncertainty is proposed, which extends the traditional framework but is compatible with it.

  1. Aggregate Uncertainty, Money and Banking

    OpenAIRE

    Hongfei Sun

    2006-01-01

    This paper studies the problem of monitoring the monitor in a model of money and banking with aggregate uncertainty. It shows that when inside money is required as a means of bank loan repayment, a market of inside money is entailed at the repayment stage and generates information-revealing prices that perfectly discipline the bank. The incentive problem of a bank is costlessly overcome simply by involving inside money in repayment. Inside money distinguishes itself from outside money by its ...

  2. CRISIS FOCUS Uncertainty and Flexibility

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    As the world continued to watch the unfolding financial and economic crises this month, Robert Zoellick, President of the World Bank, arrived in China for discussions on how the country can help support the global economy and the efforts it has taken to strengthen its own recovery. Zoellick, who had seen many uncertainties in 2009, called for China to be flexible with its macroeconomic policy. He made the following comments at a press conference in Beijing on December 15. Edited excerpts follow:

  3. Social Preferences and Strategic Uncertainty

    DEFF Research Database (Denmark)

    Cabrales, Antonio; Miniaci, Raffaele; Piovesan, Marco; Ponti, Giovanni

    chosen contract. We find that (heterogeneous) social preferences are significant determinants of choices in all phases of the experiment. Since the available contracts display a trade-off between fairness and strategic uncertainty, we observe that the latter is a much stronger determinant of choices, for...... both principals and agents. Finally, we also see that social preferences explain, to a large extent, matching between principals and agents, since agents display a marked propensity to work for principals with similar social preferences...

  4. Cautious operation planning under uncertainties

    OpenAIRE

    Capitanescu, Florin; Fliscounakis, Stéphane; Panciatici, Patrick; Wehenkel, Louis

    2012-01-01

    This paper deals with day-ahead power systems security planning under uncertainties, by posing an optimization problem over a set of power injection scenarios that could show up the next day and modeling the next day's real-time control strategies aiming at ensuring security with respect to contingencies by a combination of preventive and corrective controls. We seek to determine whether and which day-ahead decisions must be taken so that for scenarios over the next day there still ex...

  5. Competitive Capacity Investment under Uncertainty

    OpenAIRE

    Li, Xishu; Zuidwijk, Rob; Koster, René; Dekker, Rommert

    2016-01-01

We consider a long-term capacity investment problem in a competitive market under demand uncertainty. Two firms move sequentially in the competition and a firm’s capacity decision interacts with the other firm’s current and future capacity. Throughout the investment race, a firm can either choose to plan its investments proactively, taking into account possible responses from the other firm, or decide to respond reactively to the competition. In both cases, the optimal decision at...

  6. Learning about Fiscal Policy Uncertainty

    OpenAIRE

    Matthes, Christian; Sablik, Timothy

    2014-01-01

    In response to the financial crisis and recession of 2007-09, the federal government enacted a number of emergency fiscal policies intended to aid recovery. These included short-term stimulus measures, such as the American Recovery and Reinvestment Act of 2009, and temporary tax reductions, such as the payroll tax cut in 2010. However, the unconventional and transitory nature of these fiscal policies may have contributed to greater economic uncertainty. Given the slow recovery that has follow...

  7. Word learning under infinite uncertainty.

    Science.gov (United States)

    Blythe, Richard A; Smith, Andrew D M; Smith, Kenny

    2016-06-01

    Language learners must learn the meanings of many thousands of words, despite those words occurring in complex environments in which infinitely many meanings might be inferred by the learner as a word's true meaning. This problem of infinite referential uncertainty is often attributed to Willard Van Orman Quine. We provide a mathematical formalisation of an ideal cross-situational learner attempting to learn under infinite referential uncertainty, and identify conditions under which word learning is possible. As Quine's intuitions suggest, learning under infinite uncertainty is in fact possible, provided that learners have some means of ranking candidate word meanings in terms of their plausibility; furthermore, our analysis shows that this ranking could in fact be exceedingly weak, implying that constraints which allow learners to infer the plausibility of candidate word meanings could themselves be weak. This approach lifts the burden of explanation from 'smart' word learning constraints in learners, and suggests a programme of research into weak, unreliable, probabilistic constraints on the inference of word meaning in real word learners. PMID:26927884
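The paper's formalisation is probabilistic and allows infinitely many ranked candidate meanings; the basic cross-situational idea it builds on can nonetheless be illustrated with a toy intersection learner (the example data are invented):

```python
def cross_situational_learn(exposures):
    """Intersect candidate-meaning sets across exposures to one word.
    Each exposure is the set of meanings inferable from that context;
    meanings absent from any context are ruled out."""
    candidates = set(exposures[0])
    for context in exposures[1:]:
        candidates &= set(context)
    return candidates

# Toy example: three noisy contexts in which the word "dog" is heard.
contexts = [{"dog", "ball", "grass"}, {"dog", "leash"}, {"dog", "ball"}]
print(cross_situational_learn(contexts))  # -> {'dog'}
```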

  8. Blade tip timing (BTT) uncertainties

    Science.gov (United States)

    Russhard, Pete

    2016-06-01

Blade Tip Timing (BTT) is an alternative technique for characterising blade vibration in which non-contact timing probes (e.g. capacitance or optical probes), typically mounted on the engine casing (figure 1), are used to measure the time at which a blade passes each probe. This time is compared with the time at which the blade would have passed the probe if it had been undergoing no vibration. For a number of years the aerospace industry has been sponsoring research into Blade Tip Timing technologies, which have been developed as tools to obtain rotor blade tip deflections. These have been successful in demonstrating the potential of the technology, but have rarely produced quantitative data along with a demonstration of a traceable value for measurement uncertainty. BTT technologies have been developed under a cloak of secrecy by the gas turbine OEMs because of the competitive advantage they offered if they could be shown to work. BTT measurements are sensitive to many variables, and there is a need to quantify the measurement uncertainty of the complete technology and to define a set of guidelines as to how BTT should be applied to different vehicles. The data shown in figure 2 were developed from a US government-sponsored program that brought together four different tip timing systems and a gas turbine engine test. Comparisons showed that they were just capable of obtaining measurements within a +/-25% uncertainty band when compared to strain gauges, even when using the same input data sets.
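The arrival-time difference described above converts to a tip deflection through the tip speed. A sketch of that conversion under the usual small-deflection assumption (the function and all numbers are hypothetical, not from the sponsored program):

```python
import math

def btt_deflection(t_measured, t_expected, rpm, tip_radius):
    """Blade tip deflection (m) inferred from the difference between the
    measured and the no-vibration ("expected") arrival times (s)."""
    omega = rpm * 2.0 * math.pi / 60.0  # shaft angular speed, rad/s
    return tip_radius * omega * (t_measured - t_expected)

# Hypothetical case: blade arrives 10 microseconds late at 10,000 rpm
# with a 0.3 m tip radius.
d = btt_deflection(1.010e-3, 1.000e-3, 10000.0, 0.3)
print(round(d * 1000, 3))  # deflection in mm -> 3.142
```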

  9. Uncertainty compliant design flood estimation

    Science.gov (United States)

    Botto, A.; Ganora, D.; Laio, F.; Claps, P.

    2014-05-01

Hydraulic infrastructures are commonly designed with reference to target values of flood peak, estimated using probabilistic techniques such as flood frequency analysis. The application of these techniques entails levels of uncertainty which are sometimes quantified but normally not accounted for explicitly in the decision regarding design discharges. The present approach aims at defining a procedure which enables the definition of Uncertainty Compliant Design (UNCODE) values of flood peaks. To pursue this goal, we first demonstrate the equivalence of the Standard design based on the return period and the cost-benefit procedure when linear cost and damage functions are used. We then use this result to assign an expected cost to estimation errors, thus setting a framework to obtain a design flood estimator which minimizes the total expected cost. This procedure properly accounts for the uncertainty which is inherent in the frequency curve estimation. Application of the UNCODE procedure to real cases leads to remarkable displacement of the design flood from the Standard values. UNCODE estimates are systematically larger than the Standard ones, with substantial differences (up to 55%) when large return periods or short data samples are considered.
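For context, the Standard design value against which the UNCODE estimates are compared is a return-period quantile of the fitted flood frequency curve. A sketch for a Gumbel fit (the distribution choice and parameter values are illustrative; the paper does not prescribe them here):

```python
import math

def gumbel_design_flood(mu, beta, T):
    """Standard design flood for return period T from a fitted Gumbel
    distribution: q_T = mu - beta * ln(-ln(1 - 1/T))."""
    return mu - beta * math.log(-math.log(1.0 - 1.0 / T))

# Hypothetical fit: location 100 m3/s, scale 30 m3/s, 100-year event.
print(round(gumbel_design_flood(100.0, 30.0, 100), 1))  # -> 238.0
```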

  10. Uncertainty quantification in aerosol dynamics

    International Nuclear Information System (INIS)

The influence of uncertainty in coagulation and deposition mechanisms, as well as in the initial conditions, on the solution of the aerosol dynamic equation has been assessed using polynomial chaos theory. In this way, large uncertainties can be incorporated into the equations and their propagation as a function of space and time studied. We base our calculations on the simplified point model dynamic equation which includes coagulation and deposition removal mechanisms. Results are given for the stochastic mean aerosol density as a function of time as well as its variance. The stochastic mean and deterministic mean are shown to differ, and the associated uncertainty, in the form of a sensitivity coefficient, is obtained as a function of time. In addition, we obtain the probability density function of the aerosol density and show how this varies with time. In view of the generally uncertain nature of an accidental aerosol release in a nuclear reactor accident, the polynomial chaos method is a particularly useful technique as it allows one to deal with a very large spread of input data and examine the effect this has on the quantities of interest. Convergence matters are studied and numerical values given.
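In a polynomial chaos expansion, the stochastic mean and variance follow directly from the expansion coefficients, which is how quantities like the mean aerosol density and its variance are extracted. A sketch for a one-dimensional probabilists' Hermite basis (the coefficients are invented, and this is the generic recipe rather than the paper's implementation):

```python
import math

def pce_mean_variance(coeffs):
    """Mean and variance of a quantity expanded in probabilists' Hermite
    polynomials, q = sum_k c_k He_k(xi) with xi ~ N(0,1):
    mean = c_0, variance = sum_{k>=1} c_k**2 * k!  (since <He_k^2> = k!)."""
    mean = coeffs[0]
    var = sum(c ** 2 * math.factorial(k) for k, c in enumerate(coeffs) if k > 0)
    return mean, var

# Hypothetical coefficients for an aerosol density at one time point.
m, v = pce_mean_variance([1.0, 0.5, 0.25])
print(m, v)  # -> 1.0 0.375
```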

  11. Uncertainty of climate change impact on groundwater reserves - Application to a chalk aquifer

    Science.gov (United States)

    Goderniaux, Pascal; Brouyère, Serge; Wildemeersch, Samuel; Therrien, René; Dassargues, Alain

    2015-09-01

Recent studies have evaluated the impact of climate change on groundwater resources for different geographical and climatic contexts. However, most studies have either not estimated the uncertainty around projected impacts or have limited the analysis to the uncertainty related to climate models. In this study, the uncertainties around impact projections from several sources (climate models, natural variability of the weather, hydrological model calibration) are calculated and compared for the Geer catchment (465 km2) in Belgium. We use a surface-subsurface integrated model implemented using the finite element code HydroGeoSphere, coupled with climate change scenarios (2010-2085) and the UCODE_2005 inverse model, to assess the uncertainty related to the calibration of the hydrological model. This integrated model provides a more realistic representation of the water exchanges between surface and subsurface domains and further constrains the calibration through the use of both surface and subsurface observed data. Sensitivity and uncertainty analyses were performed on the predictions. The linear uncertainty analysis is approximate for this nonlinear system, but it provides some measure of uncertainty for computationally demanding models. Results show that, for the Geer catchment, the most important uncertainty is related to the calibration of the hydrological model. The total uncertainty associated with the prediction of groundwater levels remains large. By the end of the century, however, the uncertainty becomes smaller than the predicted decline in groundwater levels.

  12. Measurement uncertainty from In-Situ gamma spectroscopy of nonhomogeneous containers and from Laboratory Assay

    International Nuclear Information System (INIS)

During a D&D or ER process, containers of radioactive waste are normally generated. The activity can commonly be determined by gamma spectroscopy, but frequently the measurement conditions are not conducive to precise sample-detector geometries, and usually the radioactive material is not in a homogeneous distribution. What is the best method to accurately assay these containers - sampling followed by laboratory analysis, or in-situ spectroscopy? What is the uncertainty of the final result? To help answer these questions, the Canberra tool ISOCS Uncertainty Estimator [IUE] was used to mathematically simulate and evaluate several different measurement scenarios and to estimate the uncertainty of the measurement and the sampling process. Several representative containers and source distributions were mathematically defined and evaluated to determine the in-situ measurement uncertainty due to sample non-uniformity. In the first example, a typical field situation requiring the measurement of 200-liter drums was evaluated. A sensitivity analysis was done to show which parameters contributed the most to the uncertainty. Then an efficiency uncertainty calculation was performed. In the second example, a group of 200-liter drums with various types of non-homogeneous distributions was created, and then measurements were simulated with different detector arrangements to see how the uncertainty varied. In the third example, a truck filled with non-uniform soil was first measured with multiple in-situ detectors to determine the measurement uncertainty. Then composite samples were extracted and the sampling uncertainty computed for comparison to the field measurement uncertainty. (authors)

  13. Modeling Heterogeneity in Networks using Uncertainty Quantification Tools

    CERN Document Server

    Rajendran, Karthikeyan; Siettos, Constantinos I; Laing, Carlo R; Kevrekidis, Ioannis G

    2015-01-01

    Using the dynamics of information propagation on a network as our illustrative example, we present and discuss a systematic approach to quantifying heterogeneity and its propagation that borrows established tools from Uncertainty Quantification. The crucial assumption underlying this mathematical and computational "technology transfer" is that the evolving states of the nodes in a network quickly become correlated with the corresponding node "identities": features of the nodes imparted by the network structure (e.g. the node degree, the node clustering coefficient). The node dynamics thus depend on heterogeneous (rather than uncertain) parameters, whose distribution over the network results from the network structure. Knowing these distributions allows us to obtain an efficient coarse-grained representation of the network state in terms of the expansion coefficients in suitable orthogonal polynomials. This representation is closely related to mathematical/computational tools for uncertainty quantification (th...

  14. Probabilistic Load Flow Considering Wind Generation Uncertainty

    Directory of Open Access Journals (Sweden)

    R. Ramezani

    2011-10-01

Renewable energy sources, such as wind, solar and hydro, are increasingly incorporated into power grids as a direct consequence of energy and environmental issues. These types of energy are variable and intermittent by nature, and their exploitation introduces uncertainties into the power grid. Therefore, probabilistic analysis of the system performance is of significant interest. This paper describes a new approach to Probabilistic Load Flow (PLF) by modifying the Two Point Estimation Method (2PEM) to cover some drawbacks of other currently used methods. The proposed method is examined using two case studies, the IEEE 9-bus and the IEEE 57-bus test systems. In order to justify the effectiveness of the method, a numerical comparison with the Monte Carlo Simulation (MCS) method is presented. Simulation results indicate that the proposed method significantly reduces the computational burden while maintaining a high level of accuracy. Moreover, the unsymmetrical 2PEM has a higher level of accuracy than the symmetrical 2PEM with equal computing burden when the Probability Density Function (PDF) of the uncertain variables is asymmetric.
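The two-point estimate method evaluates the load flow at just two deterministic points per uncertain variable, with locations shifted by the variable's skewness, which is why the unsymmetrical variant handles asymmetric PDFs better. A single-variable sketch of the standard (Hong-type) construction, not the paper's multi-variable implementation:

```python
import math

def two_point_estimate(g, mu, sigma, skew=0.0):
    """Two-point estimate of E[g(X)] for one random input X with mean mu,
    standard deviation sigma and skewness coefficient skew. The two
    evaluation points and their weights match X's first three moments."""
    half = skew / 2.0
    xi1 = half + math.sqrt(1.0 + half ** 2)   # standardized locations
    xi2 = half - math.sqrt(1.0 + half ** 2)
    p1 = -xi2 / (xi1 - xi2)                   # weights (sum to 1)
    p2 = xi1 / (xi1 - xi2)
    return p1 * g(mu + xi1 * sigma) + p2 * g(mu + xi2 * sigma)

# Sanity check: for g(x) = x**2 and X with mu=0, sigma=1, skew=0,
# the two evaluations reproduce E[X^2] = 1 exactly.
print(two_point_estimate(lambda x: x ** 2, 0.0, 1.0))  # -> 1.0
```

With zero skewness the points collapse to mu +/- sigma with equal weights (the symmetrical 2PEM); a nonzero skew shifts both points, which is the unsymmetrical variant referred to in the abstract.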

  15. Stochastic and epistemic uncertainty propagation in LCA

    DEFF Research Database (Denmark)

    Clavreul, Julie; Guyonnet, Dominique; Tonini, Davide;

    2013-01-01

    When performing uncertainty propagation, most LCA practitioners choose to represent uncertainties by single probability distributions and to propagate them using stochastic methods. However, the selection of single probability distributions appears often arbitrary when faced with scarce informati...

  16. Heisenberg's Uncertainty Relations and Quantum Optics

    OpenAIRE

    Agarwal, G. S.

    2002-01-01

    We present a brief review of the impact of the Heisenberg uncertainty relations on quantum optics. In particular we demonstrate how almost all coherent and nonclassical states of quantum optics can be derived from uncertainty relations.

  17. Cost Estimating Risk and Cost Estimating Uncertainty Guidelines

    OpenAIRE

    Anderson, Timothy P.; Cherwonik, Jeffrey S.

    1997-01-01

    The Memorandum of Agreement signed by the Assistant Secretaries of the Navy for Research, Development, and Acquisition (ASN[RD&A]) and for Financial Management and Comptroller (FM&C) in June 1996 committed the Naval Center for Cost Analysis (NCCA) to improve cost analyses by helping program managers prepare better cost estimates. Recent computing advances make development of meaningful risk and uncertainty analyses easier, and these analyses can help managers do their job be...

  18. Propagation of Uncertainty in Rigid Body Attitude Flows

    OpenAIRE

    Lee, Taeyoung; Chaturvedi, Nalin A.; Sanyal, Amit K.; Leok, Melvin; McClamroch, N. Harris

    2007-01-01

Motivated by attitude control and attitude estimation problems for a rigid body, computational methods are proposed to propagate uncertainties in the angular velocity and the attitude. The nonlinear attitude flow is determined by Euler-Poincaré equations that describe the rotational dynamics of the rigid body acting under the influence of an attitude dependent potential and by a reconstruction equation that describes the kinematics expressed in terms of an orthogonal matrix representing the...

  19. Investment Decisions in Small Ethanol Plant under Risk and Uncertainty

    OpenAIRE

    Wamisho, Kassu; Ripplinger, David

    2015-01-01

This study evaluates optimal investment decision rules for an energy beet ethanol firm to simultaneously exercise the options to invest, mothball, reactivate and exit the ethanol market, considering uncertainty and volatility in the market price of ethanol and irreversible investment. A real options framework is employed to compute the prices of ethanol that trigger entry into and exit from the ethanol market. Results show that hysteresis is found to be significant even with modest volatility...

  20. Analysis of uncertainties and geometric tolerances in assemblies of parts

    OpenAIRE

    Fleming, Alan Duncan

    1988-01-01

    Computer models of the geometry of the real world have a tendency to assume that the shapes and positions of objects can be described exactly. However, real surfaces are subject to irregularities such as bumps and undulations and so do not have perfect, mathematically definable forms. Engineers recognise this fact and so assign tolerance specifications to their designs. This thesis develops a representation of geometric tolerance and uncertainty in assemblies of rigid par...