WorldWideScience

Sample records for avt-147 computational uncertainty

  1. Uncertainty In Quantum Computation

    OpenAIRE

    Kak, Subhash

    2002-01-01

    We examine the effect of previous history on starting a computation on a quantum computer. Specifically, we assume that the quantum register has some unknown state on it, and it is required that this state be cleared and replaced by a specific superposition state without any phase uncertainty, as needed by quantum algorithms. We show that, in general, this task is computationally impossible.

  2. S-parameter uncertainty computations

    DEFF Research Database (Denmark)

    Vidkjær, Jens

    1993-01-01

    A method for computing uncertainties of measured s-parameters is presented. Unlike the specification software provided with network analyzers, the new method is capable of calculating the uncertainties of arbitrary s-parameter sets and instrument settings.

  3. Numerical uncertainty in computational engineering and physics

    Energy Technology Data Exchange (ETDEWEB)

    Hemez, Francois M [Los Alamos National Laboratory

    2009-01-01

    Obtaining a solution that approximates ordinary or partial differential equations on a computational mesh or grid does not necessarily mean that the solution is accurate or even 'correct'. Unfortunately, assessing the quality of discrete solutions by questioning the role played by spatial and temporal discretizations generally comes as a distant third to test-analysis comparison and model calibration. This publication aims to raise awareness of the fact that discrete solutions introduce numerical uncertainty. This uncertainty may, in some cases, overwhelm in complexity and magnitude other sources of uncertainty, including experimental variability, parametric uncertainty and modeling assumptions. The concepts of consistency, convergence and truncation error are reviewed to explain the relationship between the exact solution of the continuous equations, the solution of the modified equations and the discrete solutions computed by a code. The current state of the practice of code and solution verification activities is discussed. An example from hydrodynamics illustrates the significant effect that meshing can have on the quality of code predictions. A simple method is proposed to derive bounds of solution uncertainty in cases where the exact solution of the continuous equations, or of its modified equations, is unknown. It is argued that numerical uncertainty originating from mesh discretization should always be quantified and accounted for in the overall uncertainty 'budget' that supports decision-making for applications in computational physics and engineering.
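
    The mesh-induced uncertainty bound described above can be illustrated with Richardson extrapolation over successively refined grids. The sketch below is only a minimal illustration under assumed grid sizes, solution values, observed order and safety factor; none of the numbers come from the report.

      import numpy as np

      # Illustrative sketch: estimate discretization uncertainty from three grid levels
      # using Richardson extrapolation (assumed refinement ratio and sample values).
      h = np.array([0.04, 0.02, 0.01])          # mesh sizes, coarse to fine (assumed)
      f = np.array([1.9120, 1.9760, 1.9940])    # computed quantity on each mesh (assumed)

      r = h[0] / h[1]                           # refinement ratio (here 2)
      # Observed order of convergence from the three solutions
      p = np.log((f[0] - f[1]) / (f[1] - f[2])) / np.log(r)
      # Richardson-extrapolated estimate of the mesh-free solution
      f_exact = f[2] + (f[2] - f[1]) / (r**p - 1.0)
      # Simple bound on numerical uncertainty for the finest mesh,
      # inflated by a safety factor as is common in grid-convergence practice.
      Fs = 3.0
      u_num = Fs * abs(f[2] - f_exact)

      print(f"observed order p      = {p:.2f}")
      print(f"extrapolated solution = {f_exact:.4f}")
      print(f"numerical uncertainty = +/- {u_num:.4f}")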

  4. Uncertainty in biology a computational modeling approach

    CERN Document Server

    Gomez-Cabrero, David

    2016-01-01

    Computational modeling of biomedical processes is gaining more and more weight in current research into the etiology of biomedical problems and potential treatment strategies. Computational modeling makes it possible to reduce, refine and replace animal experimentation as well as to translate findings obtained in these experiments to the human background. However, these biomedical problems are inherently complex, with a myriad of influencing factors, which strongly complicates the model building and validation process. This book addresses four main issues related to the building and validation of computational models of biomedical processes: model establishment under uncertainty; model selection and parameter fitting; sensitivity analysis and model adaptation; and model predictions under uncertainty. In each of the abovementioned areas, the book discusses a number of key techniques by means of a general theoretical description followed by one or more practical examples. This book is intended for graduate stude...

  5. Efficient Quantification of Uncertainties in Complex Computer Code Results Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Propagation of parameter uncertainties through large computer models can be very resource intensive. Frameworks and tools for uncertainty quantification are...

  6. Reliable Computational Predictions by Modeling Uncertainties Using Arbitrary Polynomial Chaos

    OpenAIRE

    Witteveen, J.A.S.; Bijl, H.

    2006-01-01

    Inherent physical uncertainties can have a significant influence on computational predictions. It is therefore important to take physical uncertainties into account to obtain more reliable computational predictions. The Galerkin polynomial chaos method is a commonly applied uncertainty quantification method. However, the polynomial chaos expansion has some limitations. Firstly, the polynomial chaos expansion based on classical polynomials can achieve exponential convergence for a limited set ...
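
    The flavour of a chaos expansion can be shown with a minimal non-intrusive sketch: a standard-normal input is propagated through an assumed toy model, and the mean and variance are recovered from Hermite-chaos coefficients. The model function, truncation order and quadrature size below are illustrative assumptions, not material from the paper.

      import numpy as np
      from numpy.polynomial import hermite_e as H
      from math import factorial, sqrt, pi

      # Non-intrusive polynomial chaos sketch (assumed toy model, not from the paper):
      # propagate a standard-normal input xi through f and recover mean/variance
      # from the probabilists' Hermite chaos coefficients.
      def f(xi):
          return np.exp(0.3 * xi) + 0.5 * xi**2   # assumed nonlinear model

      order = 6                                    # truncation order (assumption)
      nodes, weights = H.hermegauss(order + 5)     # Gauss-Hermite_e quadrature

      coeffs = np.zeros(order + 1)
      for n in range(order + 1):
          He_n = H.hermeval(nodes, [0] * n + [1])  # Hermite polynomial He_n
          # c_n = E[f(xi) He_n(xi)] / n!, expectation evaluated by quadrature
          coeffs[n] = np.sum(weights * f(nodes) * He_n) / (sqrt(2 * pi) * factorial(n))

      mean_pce = coeffs[0]
      var_pce = sum(coeffs[n]**2 * factorial(n) for n in range(1, order + 1))

      # Monte Carlo reference to check the chaos expansion
      xi = np.random.default_rng(0).standard_normal(200_000)
      print(f"PCE mean {mean_pce:.4f} vs MC {f(xi).mean():.4f}")
      print(f"PCE var  {var_pce:.4f} vs MC {f(xi).var():.4f}")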

  7. Efficient uncertainty quantification in computational fluid dynamics

    NARCIS (Netherlands)

    Loeven, G.J.A.

    2010-01-01

    When modeling physical systems, several sources of uncertainty are present. For example, variability in boundary conditions like free stream velocity or ambient pressure are always present. Furthermore, uncertainties in geometry arise from production tolerances, wear or unknown deformations under lo

  8. Multifactorial Uncertainty Assessment for Monitoring Population Abundance using Computer Vision

    NARCIS (Netherlands)

    Beauxis-Aussalet, E.M.A.L.; Hardman, L.

    2015-01-01

    Computer vision enables in-situ monitoring of animal populations at a lower cost and with less ecosystem disturbance than with human observers. However, computer vision uncertainty may not be fully understood by end-users, and the uncertainty assessments performed by technology experts may not fully

  9. Methodology for characterizing modeling and discretization uncertainties in computational simulation

    Energy Technology Data Exchange (ETDEWEB)

    ALVIN,KENNETH F.; OBERKAMPF,WILLIAM L.; RUTHERFORD,BRIAN M.; DIEGERT,KATHLEEN V.

    2000-03-01

    This research effort focuses on methodology for quantifying the effects of model uncertainty and discretization error on computational modeling and simulation. The work is directed towards developing methodologies which treat model form assumptions within an overall framework for uncertainty quantification, for the purpose of developing estimates of total prediction uncertainty. The present effort consists of work in three areas: framework development for sources of uncertainty and error in the modeling and simulation process which impact model structure; model uncertainty assessment and propagation through Bayesian inference methods; and discretization error estimation within the context of non-deterministic analysis.

  10. Multifactorial Uncertainty Assessment for Monitoring Population Abundance using Computer Vision

    OpenAIRE

    Beauxis-Aussalet, Emmanuelle; Hardman, Hazel Lynda

    2015-01-01

    Computer vision enables in-situ monitoring of animal populations at a lower cost and with less ecosystem disturbance than with human observers. However, computer vision uncertainty may not be fully understood by end-users, and the uncertainty assessments performed by technology experts may not fully address end-user needs. This knowledge gap can yield misinterpretations of computer vision data, and trust issues impeding the transfer of valuable technologies. We bridge this gap with a user-cen...

  11. Uncertainty in biology: a computational modeling approach

    OpenAIRE

    2015-01-01

    Computational modeling of biomedical processes is gaining more and more weight in current research into the etiology of biomedical problems and potential treatment strategies. Computational modeling makes it possible to reduce, refine and replace animal experimentation as well as to translate findings obtained in these experiments to the human background. However, these biomedical problems are inherently complex, with a myriad of influencing factors, which strongly complicates the model building...

  12. A Monomial Chaos Approach for Efficient Uncertainty Quantification in Computational Fluid Dynamics

    NARCIS (Netherlands)

    Witteveen, J.A.S.; Bijl, H.

    2006-01-01

    A monomial chaos approach is proposed for efficient uncertainty quantification in nonlinear computational problems. Propagating uncertainty through nonlinear equations can still be computationally intensive for existing uncertainty quantification methods. It usually results in a set of nonlinear equ

  13. MOUSE (MODULAR ORIENTED UNCERTAINTY SYSTEM): A COMPUTERIZED UNCERTAINTY ANALYSIS SYSTEM (FOR MICROCOMPUTERS)

    Science.gov (United States)

    Environmental engineering calculations involving uncertainties, either in the model itself or in the data, are far beyond the capabilities of conventional analysis for any but the simplest of models. There exist a number of general-purpose computer simulation languages, using Mon...

  14. Soft computing approaches to uncertainty propagation in environmental risk mangement

    OpenAIRE

    Kumar, Vikas

    2008-01-01

    Real-world problems, especially those that involve natural systems, are complex and composed of many nondeterministic components having non-linear coupling. It turns out that in dealing with such systems, one has to face a high degree of uncertainty and tolerate imprecision. Classical system models based on numerical analysis, crisp logic or binary logic have the characteristics of precision and categoricity and are classified as hard computing approaches. In contrast, soft computing approaches like pro...

  15. Uncertainty quantification in computational fluid dynamics and aircraft engines

    CERN Document Server

    Montomoli, Francesco; D'Ammaro, Antonio; Massini, Michela; Salvadori, Simone

    2015-01-01

    This book introduces novel design techniques developed to increase the safety of aircraft engines. The authors demonstrate how the application of uncertainty methods can overcome problems in the accurate prediction of engine lift caused by manufacturing error. This in turn ameliorates the difficulty of achieving required safety margins imposed by limits in current design and manufacturing methods. This text shows that even state-of-the-art computational fluid dynamics (CFD) methods are not able to predict the same performance measured in experiments; CFD methods assume idealised geometries, but ideal geometries do not exist, cannot be manufactured, and their performance differs from real-world ones. By applying geometrical variations of a few microns, the agreement with experiments improves dramatically, but unfortunately the manufacturing errors in engines or in experiments are unknown. In order to overcome this limitation, uncertainty quantification considers the probability density functions of manufacturing errors...

  16. A High Performance Bayesian Computing Framework for Spatiotemporal Uncertainty Modeling

    Science.gov (United States)

    Cao, G.

    2015-12-01

    All types of spatiotemporal measurements are subject to uncertainty. As spatiotemporal data become increasingly involved in scientific research and decision making, it is important to appropriately model the impact of uncertainty. Quantitatively modeling spatiotemporal uncertainty, however, is a challenging problem considering the complex dependence structures and data heterogeneities. State-space models provide a unifying and intuitive framework for dynamic systems modeling. In this paper, we aim to extend the conventional state-space models for uncertainty modeling in space-time contexts while accounting for spatiotemporal effects and data heterogeneities. Gaussian Markov Random Field (GMRF) models, also known as conditional autoregressive models, are arguably the most commonly used methods for modeling spatially dependent data. GMRF models basically assume that a geo-referenced variable primarily depends on its neighborhood (Markov property), and the spatial dependence structure is described via a precision matrix. Recent studies have shown that GMRFs are an efficient approximation to the commonly used Gaussian fields (e.g., Kriging), and compared with Gaussian fields, GMRFs enjoy a series of appealing features, such as fast computation and easy accommodation of heterogeneities in spatial data (e.g., point and areal). This paper represents each spatial dataset as a GMRF and integrates them into a state-space form to statistically model the temporal dynamics. Different types of spatial measurements (e.g., categorical, count or continuous) can be accounted for through appropriate link functions. A fast alternative to the MCMC framework, the Integrated Nested Laplace Approximation (INLA), was adopted for model inference. Preliminary case studies will be conducted to showcase the advantages of the described framework. In the first case, we apply the proposed method for modeling the water table elevation of the Ogallala aquifer over the past decades. In the second case, we analyze the
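
    A minimal sketch of the sparse-precision idea behind GMRFs is given below: a first-order random-walk prior encoded in a sparse precision matrix is combined with a handful of noisy point observations, and the posterior mean follows from one sparse solve. The grid size, precision parameters, observation noise and latent field are all assumptions for illustration, and the full state-space/INLA machinery of the abstract is not reproduced.

      import numpy as np
      import scipy.sparse as sp
      from scipy.sparse.linalg import spsolve

      # Minimal GMRF smoothing sketch (all parameters are assumptions for illustration).
      # A first-order random-walk prior is encoded in a sparse precision matrix Q, and the
      # posterior mean given noisy point observations is obtained with one sparse solve,
      # which is what makes GMRFs cheaper than dense-covariance Kriging.
      n = 200
      kappa, sigma_obs = 100.0, 0.3                     # prior precision, obs. noise (assumed)

      D = sp.diags([np.ones(n - 1), -np.ones(n - 1)], [0, 1], shape=(n - 1, n))
      Q = kappa * (D.T @ D) + 1e-6 * sp.identity(n)     # sparse prior precision

      rng = np.random.default_rng(0)
      truth = np.sin(np.linspace(0, 3 * np.pi, n))      # assumed latent field
      obs_idx = rng.choice(n, size=40, replace=False)   # sparse, irregular observations
      y = truth[obs_idx] + sigma_obs * rng.standard_normal(obs_idx.size)

      A = sp.csr_matrix((np.ones(obs_idx.size), (np.arange(obs_idx.size), obs_idx)),
                        shape=(obs_idx.size, n))        # observation operator
      Q_post = Q + (A.T @ A) / sigma_obs**2             # posterior precision (still sparse)
      mu_post = spsolve(sp.csc_matrix(Q_post), (A.T @ y) / sigma_obs**2)

      print("RMSE of posterior mean vs truth:", np.sqrt(np.mean((mu_post - truth)**2)))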

  17. Computational uncertainty principle in nonlinear ordinary differential equations

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    The error propagation of general numerical methods for ordinary differential equations (ODEs) is studied. Three kinds of convergence (theoretical, numerical and actual) are presented. The various components of round-off error occurring in floating-point computation are fully detailed. By introducing a new kind of recurrent inequality, the classical error bounds for linear multistep methods are essentially improved, and by invoking probability theory the “normal” growth of accumulated round-off error is derived. Moreover, a unified estimate for the total error of a general method is given. On the basis of these results, we rationally interpret the various phenomena found in the numerical experiments in part I of this paper and derive two universal relations which are independent of the type of ODE, the initial values and the numerical scheme, and which are consistent with the numerical results. Furthermore, we give the explicit mathematical expression of the computational uncertainty principle and expound the intrinsic relation between the two uncertainties which result from the inaccuracies of the numerical method and the calculating machine.
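
    The competition between the two uncertainties named above (method inaccuracy versus machine round-off) can be seen in a much simpler setting than ODE integration, namely numerical differentiation at shrinking step sizes. The sketch below is only an illustration of the principle, not the paper's analysis.

      import numpy as np

      # Illustration of the computational uncertainty principle in a simpler setting than
      # ODEs: for a forward-difference derivative, truncation error shrinks with the step
      # size h while round-off error grows, so the total error has a floor that no choice
      # of h can beat.
      f, df_exact, x0 = np.sin, np.cos(1.0), 1.0

      for h in [10.0**(-k) for k in range(1, 16)]:
          df_num = (f(x0 + h) - f(x0)) / h
          print(f"h = {h:.0e}   total error = {abs(df_num - df_exact):.3e}")
      # The error decreases roughly like h down to h ~ 1e-8 (square root of machine
      # epsilon) and then grows again as round-off dominates: the two uncertainties
      # cannot be driven to zero simultaneously.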

  18. Contribution to uncertainties computing: application to aerosol nanoparticles metrology

    International Nuclear Information System (INIS)

    This thesis aims to provide SMPS users with a methodology to compute the uncertainties associated with the estimation of aerosol size distributions. An SMPS selects and detects airborne particles with a Differential Mobility Analyser (DMA) and a Condensation Particle Counter (CPC), respectively. The on-line measurement provides particle counting over a large measuring range. Recovering the aerosol size distribution from CPC measurements therefore requires solving an inverse problem under uncertainty. A review of models that represent CPC measurements as a function of the aerosol size distribution is presented in the first chapter, showing that competing theories exist to model the physics involved in the measurement and, at the same time, that parameters and other functions need to be modelled as uncertain. The physical model we established was designed first to represent the physics accurately and second to be computationally inexpensive. The first requirement is obvious, as it characterizes the performance of the model; the time constraint is common to all large-scale problems for which an evaluation of the uncertainty is sought. To estimate the size distribution, a new criterion that couples regularization techniques with a decomposition on a wavelet basis is described. Regularization is widely used to solve ill-posed problems. The regularized solution is computed as a trade-off between fidelity to the data and a prior on the solution to be rebuilt, the trade-off being represented by a scalar known as the regularization parameter. Nevertheless, when dealing with size distributions showing both broad and sharp features, a homogeneous prior is no longer suitable, and the main improvement of this work addresses such situations. The multi-scale approach we propose for the definition of the new prior is an alternative that enables the regularization weights to be adjusted on each scale of the signal. The method is tested against common regularization
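
    A generic Tikhonov-regularized inversion conveys the idea of trading data fidelity against a smoothness prior. The kernel, noise level and regularization parameter below are assumptions, and the wavelet-based multi-scale prior developed in the thesis is not reproduced.

      import numpy as np

      # Generic Tikhonov-regularized inversion sketch (kernel, noise and lambda assumed;
      # the thesis' wavelet multi-scale prior is not reproduced here).
      rng = np.random.default_rng(0)
      n = 120
      x_grid = np.linspace(0, 1, n)
      truth = (np.exp(-0.5 * ((x_grid - 0.35) / 0.05)**2)
               + 0.4 * np.exp(-0.5 * ((x_grid - 0.7) / 0.12)**2))

      # Smoothing kernel standing in for the instrument response (assumption)
      K = np.exp(-0.5 * ((x_grid[:, None] - x_grid[None, :]) / 0.03)**2)
      K /= K.sum(axis=1, keepdims=True)

      y = K @ truth + 0.01 * rng.standard_normal(n)      # noisy "measurements"

      # Naive inversion is unstable; Tikhonov trades data fidelity against smoothness.
      x_naive = np.linalg.solve(K, y)
      L = np.diff(np.eye(n), n=2, axis=0)                # second-difference operator
      lam = 1e-3                                         # regularization parameter (assumed)
      x_reg = np.linalg.solve(K.T @ K + lam * (L.T @ L), K.T @ y)

      print("max error, naive inversion      :", np.abs(x_naive - truth).max())
      print("max error, Tikhonov regularized :", np.abs(x_reg - truth).max())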

  19. Efficient Quantification of Uncertainties in Complex Computer Code Results Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This proposal addresses methods for efficient quantification of margins and uncertainties (QMU) for models that couple multiple, large-scale commercial or...

  20. A best-estimate plus uncertainty type analysis for computing accurate critical channel power uncertainties

    International Nuclear Information System (INIS)

    This paper provides a Critical Channel Power (CCP) uncertainty analysis methodology based on a Monte-Carlo approach. This Monte-Carlo method includes the identification of the sources of uncertainty and the development of error models for the characterization of epistemic and aleatory uncertainties associated with the CCP parameter. Furthermore, the proposed method facilitates a means to use actual operational data leading to improvements over traditional methods (e.g., sensitivity analysis) which assume parametric models that may not accurately capture the possible complex statistical structures in the system input and responses. (author)
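
    A generic two-loop Monte Carlo sketch of this kind of best-estimate-plus-uncertainty propagation is shown below. The "CCP" response surface, the distributions and the sample sizes are placeholder assumptions, not the paper's error models.

      import numpy as np

      # Generic Monte Carlo sketch of a best-estimate-plus-uncertainty calculation.
      # The "critical channel power" model and all distributions below are placeholder
      # assumptions, not the methodology of the paper.
      rng = np.random.default_rng(42)

      def ccp_model(flow, inlet_temp, bias):
          # hypothetical response surface standing in for the real CCP calculation
          return 7.0 + 0.08 * (flow - 24.0) - 0.02 * (inlet_temp - 265.0) + bias

      n_outer, n_inner = 200, 500               # epistemic / aleatory sample sizes (assumed)
      p05 = []
      for _ in range(n_outer):
          bias = rng.normal(0.0, 0.05)          # epistemic: unknown model bias
          flow = rng.normal(24.0, 0.4, n_inner)       # aleatory: operational variability
          temp = rng.normal(265.0, 1.5, n_inner)
          ccp = ccp_model(flow, temp, bias)
          p05.append(np.percentile(ccp, 5))     # inner loop: 5th percentile of CCP

      # Outer-loop spread shows how epistemic uncertainty shifts that percentile.
      print(f"5th-percentile CCP: {np.mean(p05):.3f} (+/- {np.std(p05):.3f} over epistemic draws)")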

  1. The Uncertainty Test for the MAAP Computer Code

    Energy Technology Data Exchange (ETDEWEB)

    Park, S. H.; Song, Y. M.; Park, S. Y.; Ahn, K. I.; Kim, K. R.; Lee, Y. J. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2008-10-15

    After the Three Mile Island Unit 2 (TMI-2) and Chernobyl accidents, safety issues for severe accidents have been treated from various perspectives. Major issues in our part of the research include the level 2 PSA. The difficulty in expanding the level 2 PSA as a risk-informed activity is the uncertainty. In the past, emphasis was placed on improving the quality of the internal accident PSA, but the effort to reduce the phenomenological uncertainty in the level 2 PSA has been insufficient. In our country, the degree of uncertainty in the level 2 PSA model is high, and it is necessary to secure a model to decrease this uncertainty. We have not yet mastered the uncertainty assessment technology, and the assessment system itself depends on advanced nations. In advanced nations, severe accident simulators are implemented at the hardware level, whereas in our case basic functions can be implemented at the software level. Under these circumstances at home and abroad, similar instances such as UQM and MELCOR were surveyed. Based on these instances, the SAUNA (Severe Accident UNcertainty Analysis) system is being developed in our project to assess and decrease the uncertainty in the level 2 PSA. The MAAP code is selected to analyze the uncertainty in a severe accident.

  2. Fuzzy randomness uncertainty in civil engineering and computational mechanics

    CERN Document Server

    Möller, Bernd

    2004-01-01

    This book, for the first time, provides a coherent, overall concept for taking account of uncertainty in the analysis, the safety assessment, and the design of structures. The reader is introduced to the problem of uncertainty modeling and familiarized with particular uncertainty models. For simultaneously considering stochastic and non-stochastic uncertainty the superordinated uncertainty model fuzzy randomness, which contains real valued random variables as well as fuzzy variables as special cases, is presented. For this purpose basic mathematical knowledge concerning the fuzzy set theory and the theory of fuzzy random variables is imparted. The body of the book comprises the appropriate quantification of uncertain structural parameters, the fuzzy and fuzzy probabilistic structural analysis, the fuzzy probabilistic safety assessment, and the fuzzy cluster structural design. The completely new algorithms are described in detail and illustrated by way of demonstrative examples.

  3. Computational uncertainty principle in nonlinear ordinary differential equations

    Institute of Scientific and Technical Information of China (English)

    LI; Jianping

    2001-01-01

    [1] Li Jianping, Zeng Qingcun, Chou Jifan, Computational Uncertainty Principle in Nonlinear Ordinary Differential Equations I. Numerical Results, Science in China, Ser. E, 2000, 43(5): 449. [2] Henrici, P., Discrete Variable Methods in Ordinary Differential Equations, New York: John Wiley, 1962, 1; 187. [3] Henrici, P., Error Propagation for Difference Methods, New York: John Wiley, 1963. [4] Gear, C. W., Numerical Initial Value Problems in Ordinary Differential Equations, Englewood Cliffs, NJ: Prentice-Hall, 1971, 1; 72. [5] Hairer, E., Nørsett, S. P., Wanner, G., Solving Ordinary Differential Equations I. Nonstiff Problems, 2nd ed., Berlin-Heidelberg-New York: Springer-Verlag, 1993, 130. [6] Stoer, J., Bulirsch, R., Introduction to Numerical Analysis, 2nd ed., Vol. 1, Berlin-Heidelberg-New York: Springer-Verlag (reprinted in China by Beijing World Publishing Corporation), 1998, 428. [7] Li Qingyang, Numerical Methods in Ordinary Differential Equations (Stiff Problems and Boundary Value Problems) (in Chinese), Beijing: Higher Education Press, 1991, 1. [8] Li Ronghua, Weng Guochen, Numerical Methods in Differential Equations (in Chinese), 3rd ed., Beijing: Higher Education Press, 1996, 1. [9] Dahlquist, G., Convergence and stability in the numerical integration of ordinary differential equations, Math. Scandinavica, 1956, 4: 33. [10] Dahlquist, G., 33 years of numerical instability, Part I, BIT, 1985, 25: 188. [11] Heisenberg, W., The Physical Principles of Quantum Theory, Chicago: University of Chicago Press, 1930. [12] McMurry, S. M., Quantum Mechanics, London: Addison-Wesley Longman Ltd (reprinted in China by Beijing World Publishing Corporation), 1998.

  4. Binding in light nuclei: Statistical NN uncertainties vs Computational accuracy

    CERN Document Server

    Perez, R Navarro; Amaro, J E; Arriola, E Ruiz

    2016-01-01

    We analyse the impact of the statistical uncertainties of the nucleon-nucleon interaction, based on the Granada-2013 np-pp database, on the binding energies of the triton and the alpha particle using a bootstrap method, by solving the Faddeev equations for $^3$H and the Yakubovsky equations for $^4$He, respectively. We check that in practice about 30 samples prove enough for a reliable error estimate. An extrapolation of the well fulfilled Tjon-line correlation predicts the experimental binding of the alpha particle within uncertainties.

  5. Computing Statistics under Interval and Fuzzy Uncertainty Applications to Computer Science and Engineering

    CERN Document Server

    Nguyen, Hung T; Wu, Berlin; Xiang, Gang

    2012-01-01

    In many practical situations, we are interested in statistics characterizing a population of objects: e.g. in the mean height of people from a certain area.   Most algorithms for estimating such statistics assume that the sample values are exact. In practice, sample values come from measurements, and measurements are never absolutely accurate. Sometimes, we know the exact probability distribution of the measurement inaccuracy, but often, we only know the upper bound on this inaccuracy. In this case, we have interval uncertainty: e.g. if the measured value is 1.0, and inaccuracy is bounded by 0.1, then the actual (unknown) value of the quantity can be anywhere between 1.0 - 0.1 = 0.9 and 1.0 + 0.1 = 1.1. In other cases, the values are expert estimates, and we only have fuzzy information about the estimation inaccuracy.   This book shows how to compute statistics under such interval and fuzzy uncertainty. The resulting methods are applied to computer science (optimal scheduling of different processors), to in...
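
    The book's starting point can be illustrated with the simplest interval statistic: bounding a sample mean when each observation is only known to within an interval. The data values below are made up.

      import numpy as np

      # Interval-uncertainty sketch: each measured value is only known to lie in an
      # interval [x - delta, x + delta]. For the mean, the tightest bounds come from
      # taking all lower endpoints and all upper endpoints; for the variance the
      # problem is much harder in general, which is a central theme of the book.
      # Data values are made up.
      measured = np.array([1.0, 1.2, 0.9, 1.1, 1.05])
      delta    = np.array([0.1, 0.1, 0.05, 0.1, 0.05])   # measurement inaccuracy bounds

      lower, upper = measured - delta, measured + delta
      print(f"sample mean lies in [{lower.mean():.3f}, {upper.mean():.3f}]")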

  6. Computational Fluid Dynamics Uncertainty Analysis for Payload Fairing Spacecraft Environmental Control Systems

    Science.gov (United States)

    Groves, Curtis; Ilie, Marcel; Schallhorn, Paul

    2014-01-01

    Spacecraft components may be damaged due to airflow produced by Environmental Control Systems (ECS). There are uncertainties and errors associated with using Computational Fluid Dynamics (CFD) to predict the flow field around a spacecraft from the ECS System. This paper describes an approach to estimate the uncertainty in using CFD to predict the airflow speeds around an encapsulated spacecraft.

  7. Computational methods estimating uncertainties for profile reconstruction in scatterometry

    Science.gov (United States)

    Gross, H.; Rathsfeld, A.; Scholze, F.; Model, R.; Bär, M.

    2008-04-01

    The solution of the inverse problem in scatterometry, i.e. the determination of periodic surface structures from light diffraction patterns, is incomplete without knowledge of the uncertainties associated with the reconstructed surface parameters. With decreasing feature sizes of lithography masks, increasing demands on metrology techniques arise. Scatterometry as a non-imaging indirect optical method is applied to periodic line-space structures in order to determine geometric parameters like side-wall angles, heights, top and bottom widths and to evaluate the quality of the manufacturing process. The numerical simulation of the diffraction process is based on the finite element solution of the Helmholtz equation. The inverse problem seeks to reconstruct the grating geometry from measured diffraction patterns. Restricting the class of gratings and the set of measurements, this inverse problem can be reformulated as a non-linear operator equation in Euclidean spaces. The operator maps the grating parameters to the efficiencies of diffracted plane wave modes. We employ a Gauss-Newton type iterative method to solve this operator equation and end up minimizing the deviation of the measured efficiency or phase shift values from the simulated ones. The reconstruction properties and the convergence of the algorithm, however, are controlled by the local conditioning of the non-linear mapping and the uncertainties of the measured efficiencies or phase shifts. In particular, the uncertainties of the reconstructed geometric parameters essentially depend on the uncertainties of the input data and can be estimated by various methods. We compare the results obtained from a Monte Carlo procedure to the estimations gained from the approximative covariance matrix of the profile parameters close to the optimal solution and apply them to EUV masks illuminated by plane waves with wavelengths in the range of 13 nm.
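
    The Gauss-Newton iteration and the approximate covariance of the reconstructed parameters can be sketched on a toy nonlinear least-squares problem; the exponential forward model, noise level and starting guess below are assumptions standing in for the finite-element scatterometry solver.

      import numpy as np

      # Gauss-Newton sketch with an approximate covariance of the reconstructed parameters.
      # The forward model here is a toy exponential, standing in for the finite-element
      # scatterometry solver; the noise level and starting guess are assumptions.
      rng = np.random.default_rng(3)

      def forward(theta, t):
          a, b = theta
          return a * np.exp(-b * t)

      def jacobian(theta, t):
          a, b = theta
          return np.column_stack([np.exp(-b * t), -a * t * np.exp(-b * t)])

      t = np.linspace(0.0, 2.0, 30)
      theta_true = np.array([2.0, 1.3])
      sigma = 0.02
      y = forward(theta_true, t) + sigma * rng.standard_normal(t.size)

      theta = np.array([1.0, 0.5])                       # starting guess (assumed)
      for _ in range(20):
          r = y - forward(theta, t)                      # residuals
          J = jacobian(theta, t)
          step, *_ = np.linalg.lstsq(J, r, rcond=None)   # Gauss-Newton step
          theta = theta + step

      # Approximate covariance of the estimate: sigma^2 (J^T J)^{-1}, the quantity
      # that is compared against the Monte Carlo estimates in the paper.
      J = jacobian(theta, t)
      cov = sigma**2 * np.linalg.inv(J.T @ J)
      print("estimate:", theta, " 1-sigma uncertainties:", np.sqrt(np.diag(cov)))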

  8. May Day: A computer code to perform uncertainty and sensitivity analysis. Manuals

    International Nuclear Information System (INIS)

    The computer program May Day was developed by the Polytechnic University of Madrid to carry out uncertainty and sensitivity analyses in the evaluation of radioactive waste storage. (Author)

  9. Computer simulations in room acoustics: concepts and uncertainties.

    Science.gov (United States)

    Vorländer, Michael

    2013-03-01

    Geometrical acoustics is used as a standard model for room acoustic design and consulting. Research on room acoustic simulation focuses on more accurate modeling of propagation effects such as diffraction and other wave effects in rooms, and on scattering. Much progress has been made in this field, so that wave models (for example, the boundary element method and finite differences in the time domain) can now also be used for higher frequencies. The concepts and implementations of room simulation methods are briefly reviewed. Simulations in architectural acoustics are indeed powerful tools, but their reliability depends on the skills of the operator, who has to create an adequate polygon model and has to choose the correct input data for boundary conditions such as absorption and scattering. Very little is known about the uncertainty of this input data. With the theory of error propagation of uncertainties, it can be shown that prediction of reverberation times with an accuracy better than the just noticeable difference requires input data of a quality that is not available from reverberation room measurements. PMID:23463991
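
    The error-propagation argument can be made concrete with Sabine's reverberation formula and first-order propagation of an uncertain absorption coefficient. The room dimensions and uncertainty values below are assumed numbers chosen only for illustration.

      import numpy as np

      # First-order error propagation sketch for a reverberation-time prediction using
      # Sabine's formula T60 = 0.161 V / (S * alpha). Room volume, surface area,
      # absorption coefficient and its uncertainty are assumed numbers, chosen only to
      # show how an uncertain alpha propagates into the predicted T60.
      V, S = 3000.0, 1300.0              # room volume [m^3] and total surface area [m^2]
      alpha, u_alpha = 0.25, 0.03        # mean absorption coefficient and its uncertainty

      T60 = 0.161 * V / (S * alpha)
      # dT60/dalpha = -0.161 V / (S alpha^2), so relative uncertainties are equal in size
      u_T60 = abs(-0.161 * V / (S * alpha**2)) * u_alpha

      print(f"T60 = {T60:.2f} s  +/- {u_T60:.2f} s ({100 * u_T60 / T60:.0f}%)")
      # A ~12% relative uncertainty in alpha gives ~12% in T60, well above the roughly
      # 5% just-noticeable difference, which is the point the abstract makes.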

  10. A computer simulation platform for the estimation of measurement uncertainties in dimensional X-ray computed tomography

    DEFF Research Database (Denmark)

    Hiller, Jochen; Reindl, Leonard M

    2012-01-01

    The knowledge of measurement uncertainty is of great importance in conformance testing in production. The tolerance limit for production must be reduced by the amounts of measurement uncertainty to ensure that the parts are in fact within the tolerance. Over the last 5 years, industrial X-ray computed tomography (CT) has become an important technology for dimensional quality control. In this paper a computer simulation platform is presented which is able to investigate error sources in dimensional CT measurements. The typical workflow in industrial CT metrology is described and methods for estimating measurement uncertainties are briefly discussed. As we will show, the developed virtual CT (VCT) simulator can be adapted to various scanner systems, providing realistic CT data. Using the Monte Carlo method (MCM), measurement uncertainties for a given measuring task can be estimated, taking...

  11. Dissertation Defense: Computational Fluid Dynamics Uncertainty Analysis for Payload Fairing Spacecraft Environmental Control Systems

    Science.gov (United States)

    Groves, Curtis Edward

    2014-01-01

    Spacecraft thermal protection systems are at risk of being damaged due to airflow produced from Environmental Control Systems. There are inherent uncertainties and errors associated with using Computational Fluid Dynamics to predict the airflow field around a spacecraft from the Environmental Control System. This paper describes an approach to quantify the uncertainty in using Computational Fluid Dynamics to predict airflow speeds around an encapsulated spacecraft without the use of test data. Quantifying the uncertainty in analytical predictions is imperative to the success of any simulation-based product. The method could provide an alternative to the traditional "validation by test only" mentality. This method could be extended to other disciplines and has the potential to provide uncertainty for any numerical simulation, thus lowering the cost of performing these verifications while increasing the confidence in those predictions. Spacecraft requirements can include a maximum airflow speed to protect delicate instruments during ground processing. Computational Fluid Dynamics can be used to verify these requirements; however, the model must be validated by test data. This research includes the following three objectives and methods. Objective one is to develop, model, and perform a Computational Fluid Dynamics analysis of three (3) generic, non-proprietary, environmental control systems and spacecraft configurations. Several commercially available and open source solvers have the capability to model the turbulent, highly three-dimensional, incompressible flow regime. The proposed method uses FLUENT, STARCCM+, and OPENFOAM. Objective two is to perform an uncertainty analysis of the Computational Fluid Dynamics model using the methodology found in "Comprehensive Approach to Verification and Validation of Computational Fluid Dynamics Simulations". This method requires three separate grids and solutions, which quantify the error bars around Computational Fluid Dynamics predictions

  12. Dissertation Defense Computational Fluid Dynamics Uncertainty Analysis for Payload Fairing Spacecraft Environmental Control Systems

    Science.gov (United States)

    Groves, Curtis Edward

    2014-01-01

    Spacecraft thermal protection systems are at risk of being damaged due to airflow produced from Environmental Control Systems. There are inherent uncertainties and errors associated with using Computational Fluid Dynamics to predict the airflow field around a spacecraft from the Environmental Control System. This paper describes an approach to quantify the uncertainty in using Computational Fluid Dynamics to predict airflow speeds around an encapsulated spacecraft without the use of test data. Quantifying the uncertainty in analytical predictions is imperative to the success of any simulation-based product. The method could provide an alternative to the traditional "validation by test only" mentality. This method could be extended to other disciplines and has the potential to provide uncertainty for any numerical simulation, thus lowering the cost of performing these verifications while increasing the confidence in those predictions. Spacecraft requirements can include a maximum airflow speed to protect delicate instruments during ground processing. Computational Fluid Dynamics can be used to verify these requirements; however, the model must be validated by test data. This research includes the following three objectives and methods. Objective one is to develop, model, and perform a Computational Fluid Dynamics analysis of three (3) generic, non-proprietary, environmental control systems and spacecraft configurations. Several commercially available and open source solvers have the capability to model the turbulent, highly three-dimensional, incompressible flow regime. The proposed method uses FLUENT, STARCCM+, and OPENFOAM. Objective two is to perform an uncertainty analysis of the Computational Fluid Dynamics model using the methodology found in "Comprehensive Approach to Verification and Validation of Computational Fluid Dynamics Simulations". This method requires three separate grids and solutions, which quantify the error bars around Computational Fluid Dynamics

  13. A computational framework for uncertainty integration in stochastic unit commitment with intermittent renewable energy sources

    International Nuclear Information System (INIS)

    Highlights: • A computational framework is proposed for uncertainty integration. • A new scenario generation method is proposed for renewable energy. • Prediction intervals are used to capture uncertainties of wind and solar power. • Load, wind, solar and generator outage uncertainties are integrated together. • Different generation costs and reserves are discussed for decision making. - Abstract: The penetration of intermittent renewable energy sources (IRESs) into power grids has increased in the last decade. Integration of wind farms and solar systems as the major IRESs have significantly boosted the level of uncertainty in operation of power systems. This paper proposes a comprehensive computational framework for quantification and integration of uncertainties in distributed power systems (DPSs) with IRESs. Different sources of uncertainties in DPSs such as electrical load, wind and solar power forecasts and generator outages are covered by the proposed framework. Load forecast uncertainty is assumed to follow a normal distribution. Wind and solar forecast are implemented by a list of prediction intervals (PIs) ranging from 5% to 95%. Their uncertainties are further represented as scenarios using a scenario generation method. Generator outage uncertainty is modeled as discrete scenarios. The integrated uncertainties are further incorporated into a stochastic security-constrained unit commitment (SCUC) problem and a heuristic genetic algorithm is utilized to solve this stochastic SCUC problem. To demonstrate the effectiveness of the proposed method, five deterministic and four stochastic case studies are implemented. Generation costs as well as different reserve strategies are discussed from the perspectives of system economics and reliability. Comparative results indicate that the planned generation costs and reserves are different from the realized ones. The stochastic models show better robustness than deterministic ones. Power systems run a higher

  14. Estimation of measurement uncertainties in X-ray computed tomography metrology using the substitution method

    DEFF Research Database (Denmark)

    Müller, Pavel; Hiller, Jochen; Dai, Y.;

    2014-01-01

    This paper presents the application of the substitution method for the estimation of measurement uncertainties using calibrated workpieces in X-ray computed tomography (CT) metrology. We have shown that this well-accepted method for uncertainty estimation using tactile coordinate measuring machines can be applied to dimensional CT measurements. The method is based on repeated measurements carried out on a calibrated master piece. The master piece is a component of a dose engine from an insulin pen. Measurement uncertainties estimated from the repeated measurements of the master piece were transferred onto additionally scanned uncalibrated workpieces, which provided the necessary link for achieving traceable measurements. © 2014 CIRP.
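
    A sketch of the substitution-method uncertainty budget, in the spirit of ISO 15530-3, is given below: the spread of repeated measurements on the calibrated master piece is combined with the calibration uncertainty and the observed bias. All numbers are invented for illustration.

      import numpy as np

      # Substitution-method sketch in the spirit of ISO 15530-3: repeated CT measurements
      # of a calibrated master piece characterize the measurement procedure, and that
      # uncertainty is then applied to similar uncalibrated parts. All numbers invented.
      y_cal = 5.000                                      # calibrated value of the feature [mm]
      u_cal = 0.0008                                     # std. uncertainty of calibration [mm]
      repeats = np.array([5.0021, 5.0018, 5.0025, 5.0019, 5.0023, 5.0020])  # CT repeats [mm]

      b = repeats.mean() - y_cal                         # systematic deviation (bias)
      u_p = repeats.std(ddof=1)                          # spread of the measurement procedure
      u_w = 0.0005                                       # workpiece-related contribution (assumed)

      k = 2.0                                            # coverage factor
      U = k * np.sqrt(u_cal**2 + u_p**2 + u_w**2) + abs(b)
      print(f"bias = {b*1000:.1f} um, expanded uncertainty U = {U*1000:.1f} um (k=2)")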

  15. Consideration of uncertainties in soil-structure interaction computations

    Energy Technology Data Exchange (ETDEWEB)

    Costantino, C.J.; Miller, C.A. [City Coll., New York, NY (United States). Earthquake Research Center

    1992-12-01

    This report summarizes the results obtained in a study conducted to evaluate and quantify some important effects of soil-structure interaction on the seismic response of Category I facilities. The specific areas addressed in this study are the following: the specification of criteria needed to develop reliable results when using the large computer codes in SSI studies; the development of recommendations for the specification of control point location for generic soil sites needed to generate input motions to the SSI analyses; the development of specific criteria to allow the Staff to judge the adequacy of fixed base structural analyses; the development of expanded guidelines for the inclusion of variability in soil properties in the SSI calculations; and the development of estimates of radiation damping inherent in the detailed numerical analyses performed for SSI evaluations of typical Category I structures.

  16. Computer-assisted uncertainty assessment of k0-NAA measurement results

    Science.gov (United States)

    Bučar, T.; Smodiš, B.

    2008-10-01

    In quantifying the measurement uncertainty of results obtained by the k0-based neutron activation analysis (k0-NAA), a number of parameters should be considered and appropriately combined in deriving the final budget. To facilitate this process, a program ERON (ERror propagatiON) was developed, which computes uncertainty propagation factors from the relevant formulae and calculates the combined uncertainty. The program calculates the uncertainty of the final result—the mass fraction of an element in the measured sample—taking into account the relevant neutron flux parameters such as α and f, including their uncertainties. Nuclear parameters and their uncertainties are taken from the IUPAC database (V.P. Kolotov and F. De Corte, Compilation of k0 and related data for NAA). Furthermore, the program allows for uncertainty calculations of the measured parameters needed in k0-NAA: α (determined with either the Cd-ratio or the Cd-covered multi-monitor method), f (using the Cd-ratio or the bare method), Q0 (using the Cd-ratio or internal comparator method) and k0 (using the Cd-ratio, internal comparator or the Cd subtraction method). The results of calculations can be printed or exported to text or MS Excel format for further analysis. Special care was taken to make the calculation engine portable by allowing its incorporation into other applications (e.g., DLL and WWW server). The theoretical basis and the program are described in detail, and typical results obtained under real measurement conditions are presented.

  17. PAPIRUS, a parallel computing framework for sensitivity analysis, uncertainty propagation, and estimation of parameter distribution

    Energy Technology Data Exchange (ETDEWEB)

    Heo, Jaeseok, E-mail: jheo@kaeri.re.kr; Kim, Kyung Doo, E-mail: kdkim@kaeri.re.kr

    2015-10-15

    Highlights: • We developed an interface between an engineering simulation code and statistical analysis software. • Multiple packages of the sensitivity analysis, uncertainty quantification, and parameter estimation algorithms are implemented in the framework. • Parallel computing algorithms are also implemented in the framework to solve multiple computational problems simultaneously. - Abstract: This paper introduces a statistical data analysis toolkit, PAPIRUS, designed to perform the model calibration, uncertainty propagation, Chi-square linearity test, and sensitivity analysis for both linear and nonlinear problems. The PAPIRUS was developed by implementing multiple packages of methodologies, and building an interface between an engineering simulation code and the statistical analysis algorithms. A parallel computing framework is implemented in the PAPIRUS with multiple computing resources and proper communications between the server and the clients of each processor. It was shown that even though a large amount of data is considered for the engineering calculation, the distributions of the model parameters and the calculation results can be quantified accurately with significant reductions in computational effort. A general description about the PAPIRUS with a graphical user interface is presented in Section 2. Sections 2.1–2.5 present the methodologies of data assimilation, uncertainty propagation, Chi-square linearity test, and sensitivity analysis implemented in the toolkit with some results obtained by each module of the software. Parallel computing algorithms adopted in the framework to solve multiple computational problems simultaneously are also summarized in the paper.

  18. PAPIRUS, a parallel computing framework for sensitivity analysis, uncertainty propagation, and estimation of parameter distribution

    International Nuclear Information System (INIS)

    Highlights: • We developed an interface between an engineering simulation code and statistical analysis software. • Multiple packages of the sensitivity analysis, uncertainty quantification, and parameter estimation algorithms are implemented in the framework. • Parallel computing algorithms are also implemented in the framework to solve multiple computational problems simultaneously. - Abstract: This paper introduces a statistical data analysis toolkit, PAPIRUS, designed to perform the model calibration, uncertainty propagation, Chi-square linearity test, and sensitivity analysis for both linear and nonlinear problems. The PAPIRUS was developed by implementing multiple packages of methodologies, and building an interface between an engineering simulation code and the statistical analysis algorithms. A parallel computing framework is implemented in the PAPIRUS with multiple computing resources and proper communications between the server and the clients of each processor. It was shown that even though a large amount of data is considered for the engineering calculation, the distributions of the model parameters and the calculation results can be quantified accurately with significant reductions in computational effort. A general description about the PAPIRUS with a graphical user interface is presented in Section 2. Sections 2.1–2.5 present the methodologies of data assimilation, uncertainty propagation, Chi-square linearity test, and sensitivity analysis implemented in the toolkit with some results obtained by each module of the software. Parallel computing algorithms adopted in the framework to solve multiple computational problems simultaneously are also summarized in the paper

  19. Efficient Methods for Bayesian Uncertainty Analysis and Global Optimization of Computationally Expensive Environmental Models

    Science.gov (United States)

    Shoemaker, Christine; Espinet, Antoine; Pang, Min

    2015-04-01

    Models of complex environmental systems can be computationally expensive in order to describe the dynamic interactions of the many components over a sizeable time period. Diagnostics of these systems can include forward simulations of calibrated models under uncertainty and analysis of alternatives of systems management. This discussion will focus on applications of new surrogate optimization and uncertainty analysis methods to environmental models that can enhance our ability to extract information and understanding. For complex models, optimization and especially uncertainty analysis can require a large number of model simulations, which is not feasible for computationally expensive models. Surrogate response surfaces can be used in Global Optimization and Uncertainty methods to obtain accurate answers with far fewer model evaluations, which made the methods practical for computationally expensive models for which conventional methods are not feasible. In this paper we will discuss the application of the SOARS surrogate method for estimating Bayesian posterior density functions for model parameters for a TOUGH2 model of geologic carbon sequestration. We will also briefly discuss new parallel surrogate global optimization algorithm applied to two groundwater remediation sites that was implemented on a supercomputer with up to 64 processors. The applications will illustrate the use of these methods to predict the impact of monitoring and management on subsurface contaminants.

  20. Measurement Uncertainty Evaluation in Dimensional X-ray Computed Tomography Using the Bootstrap Method

    DEFF Research Database (Denmark)

    Hiller, Jochen; Genta, Gianfranco; Barbato, Giulio;

    2014-01-01

    Industrial applications of computed tomography (CT) for dimensional metrology on various components are fast increasing, owing to a number of favorable properties such as the capability of non-destructive internal measurements. Uncertainty evaluation is, however, more complex than in conventional measurement processes, e.g., with tactile systems, also due to factors related to systematic errors, mainly caused by specific CT image characteristics. In this paper we propose a simulation-based framework for measurement uncertainty evaluation in dimensional CT using the bootstrap method. In a case study the problem concerning measurement uncertainties was addressed with bootstrap and successfully applied to ball-bar CT measurements. Results obtained enabled extension to more complex shapes such as actual industrial components, as we show by tests on a hollow cylinder workpiece.
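
    A generic bootstrap sketch of the resampling idea is shown below: repeated measurements (invented values in place of the ball-bar data) are resampled with replacement to obtain a nonparametric interval for the mean. The paper's CT-specific simulation framework is not reproduced.

      import numpy as np

      # Generic bootstrap sketch: resample repeated measurements (values invented) to
      # characterize the uncertainty of the mean without assuming a distribution.
      rng = np.random.default_rng(7)
      measurements = np.array([24.9981, 24.9987, 24.9975, 24.9990, 24.9984,
                               24.9979, 24.9986, 24.9982, 24.9988, 24.9977])  # mm

      n_boot = 10_000
      boot_means = np.array([rng.choice(measurements, size=measurements.size,
                                        replace=True).mean()
                             for _ in range(n_boot)])

      lo, hi = np.percentile(boot_means, [2.5, 97.5])
      print(f"mean = {measurements.mean():.4f} mm, "
            f"95% bootstrap interval = [{lo:.4f}, {hi:.4f}] mm")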

  1. A Novel Method for the Evaluation of Uncertainty in Dose-Volume Histogram Computation

    International Nuclear Information System (INIS)

    Purpose: Dose-volume histograms (DVHs) are a useful tool in state-of-the-art radiotherapy treatment planning, and it is essential to recognize their limitations. Even after a specific dose-calculation model is optimized, dose distributions computed by using treatment-planning systems are affected by several sources of uncertainty, such as algorithm limitations, measurement uncertainty in the data used to model the beam, and residual differences between measured and computed dose. This report presents a novel method to take them into account. Methods and Materials: To take into account the effect of associated uncertainties, a probabilistic approach using a new kind of histogram, a dose-expected volume histogram, is introduced. The expected value of the volume in the region of interest receiving an absorbed dose equal to or greater than a certain value is found by using the probability distribution of the dose at each point. A rectangular probability distribution is assumed for this point dose, and a formulation that accounts for uncertainties associated with point dose is presented for practical computations. Results: This method is applied to a set of DVHs for different regions of interest, including 6 brain patients, 8 lung patients, 8 pelvis patients, and 6 prostate patients planned for intensity-modulated radiation therapy. Conclusions: Results show a greater effect on planning target volume coverage than in organs at risk. In cases of steep DVH gradients, such as planning target volumes, this new method shows the largest differences with the corresponding DVH; thus, the effect of the uncertainty is larger
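
    The dose-expected-volume idea can be sketched directly: each point dose is assigned a rectangular (uniform) distribution, and the expected volume receiving at least a dose D is the average of the per-voxel probabilities. The synthetic dose values, uncertainty half-width and dose level below are assumptions, not clinical data.

      import numpy as np

      # Dose-expected-volume sketch: each voxel dose is only known to within a rectangular
      # (uniform) distribution of half-width delta, and the expected volume receiving at
      # least dose D is the average of the per-voxel probabilities. Dose values and delta
      # are synthetic assumptions, not clinical data.
      rng = np.random.default_rng(5)
      dose = rng.normal(60.0, 2.0, size=20_000)           # synthetic point doses [Gy]
      delta = 1.5                                         # half-width of dose uncertainty [Gy]

      dose_axis = np.linspace(50.0, 70.0, 201)
      # Conventional (crisp) DVH: fraction of voxels with dose >= D
      dvh = (dose[None, :] >= dose_axis[:, None]).mean(axis=1)
      # Expected-volume histogram: P(point dose >= D) under the rectangular distribution
      prob = np.clip((dose[None, :] + delta - dose_axis[:, None]) / (2 * delta), 0.0, 1.0)
      devh = prob.mean(axis=1)

      i = np.searchsorted(dose_axis, 63.0)                # assumed dose level to compare
      print(f"V(>=63 Gy): crisp DVH = {dvh[i]:.3f}, expected-volume DVH = {devh[i]:.3f}")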

  2. Prediction and Uncertainty in Computational Modeling of Complex Phenomena: A Whitepaper

    Energy Technology Data Exchange (ETDEWEB)

    Trucano, T.G.

    1999-01-20

    This report summarizes some challenges associated with the use of computational science to predict the behavior of complex phenomena. As such, the document is a compendium of ideas that have been generated by various staff at Sandia. The report emphasizes key components of the use of computational science to predict complex phenomena, including computational complexity and correctness of implementations, the nature of the comparison with data, the importance of uncertainty quantification in comprehending what the prediction is telling us, and the role of risk in making and using computational predictions. Both broad and more narrowly focused technical recommendations for research are given. Several computational problems are summarized that help to illustrate the issues we have emphasized. The tone of the report is informal, with virtually no mathematics. However, we have attempted to provide a useful bibliography that would assist the interested reader in pursuing the content of this report in greater depth.

  3. Numerical Uncertainty Analysis for Computational Fluid Dynamics using Student T Distribution -- Application of CFD Uncertainty Analysis Compared to Exact Analytical Solution

    Science.gov (United States)

    Groves, Curtis E.; Ilie, marcel; Shallhorn, Paul A.

    2014-01-01

    Computational Fluid Dynamics (CFD) is the standard numerical tool used by fluid dynamicists to estimate solutions to many problems in academia, government, and industry. CFD is known to have errors and uncertainties, and there is no universally adopted method to estimate such quantities. This paper describes an approach to estimate CFD uncertainties strictly numerically using inputs and the Student-T distribution. The approach is compared to an exact analytical solution of fully developed, laminar flow between infinite, stationary plates. It is shown that treating all CFD input parameters as oscillatory uncertainty terms coupled with the Student-T distribution can encompass the exact solution.
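
    The basic Student-T recipe, building a confidence interval from a small number of CFD realizations, can be sketched as follows; the sample values and confidence level are assumptions, and this is not the paper's full input-perturbation procedure.

      import numpy as np
      from scipy import stats

      # Student-T sketch: treat a handful of CFD results (invented centerline velocities
      # from perturbed-input runs) as a small sample and report an uncertainty band using
      # the t distribution. This is the general recipe only, not the paper's procedure.
      u = np.array([1.492, 1.507, 1.485, 1.511, 1.498, 1.503])   # m/s, assumed CFD outputs

      n = u.size
      mean, s = u.mean(), u.std(ddof=1)
      t_crit = stats.t.ppf(0.975, df=n - 1)                      # 95% two-sided
      half_width = t_crit * s / np.sqrt(n)

      print(f"velocity = {mean:.3f} m/s +/- {half_width:.3f} m/s (95% confidence, n={n})")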

  4. Hilbert's Second Problems and Uncertainty Computing, from HCP Logic's Point of View

    Institute of Scientific and Technical Information of China (English)

    James Kuodo Huang

    2006-01-01

    Hilbert's complete perfect (HCP) logic is introduced. Gödel's incompleteness theorem discloses the limit of logic. Huang's universal consistent theorem and relative consistent theorem extend the limit of logic. The proofs of these theorems are in 2-valued logic, but the completeness can be extended in the three-valued HCP logic. The author proposes HCP logic as a foundation for uncertainty computing as well.

  5. A novel method for the evaluation of uncertainty in dose volume histogram computation

    CERN Document Server

    Cutanda-Henriquez, Francisco

    2007-01-01

    Dose volume histograms are a useful tool in state-of-the-art radiotherapy planning, and it is essential to be aware of their limitations. Dose distributions computed by treatment planning systems are affected by several sources of uncertainty such as algorithm limitations, measurement uncertainty in the data used to model the beam and residual differences between measured and computed dose, once the model is optimized. In order to take into account the effect of uncertainty, a probabilistic approach is proposed and a new kind of histogram, a dose-expected volume histogram, is introduced. The expected value of the volume in the region of interest receiving an absorbed dose equal or greater than a certain value is found using the probability distribution of the dose at each point. A rectangular probability distribution is assumed for this point dose, and a relationship is given for practical computations. This method is applied to a set of dose volume histograms for different regions of interest for 6 brain pat...

  6. How does our ignorance of rainfall affect the uncertainty of hydrological computations?

    Science.gov (United States)

    Lebecherel, Laure; Andréassian, Vazken

    2014-05-01

    Precipitation is an essential input to hydrological studies, fundamental for water balance studies and for hydrological simulation and forecasting. Since precipitation can be spatially and temporally variable, the configuration of the raingauge network can have a major impact on the accuracy of hydrological computations. Hydrological good sense tells us that the less we know about catchment rainfall, the more uncertain our hydrological computations will be. Quantifying this trend, i.e. the sensitivity of our computations to the design of the rainfall measurement network, is essential in a context of increasing demands and decreasing funding. We keep hearing about the need to "rationalize" observation networks. However, this rationalization, which often means a reduction of network density, can degrade our knowledge of rainfall and, in particular, can increase the uncertainty of hydrological computations. We evaluate here, on a large set of French catchments, the impact of rain gauge density and rain gauge network configuration on the uncertainty of several hydrological computations, based on the GR4J daily rainfall-runoff model [Perrin et al., 2003]. Four hydrological applications are considered: (i) daily runoff simulation, (ii) long-term average streamflow assessment, (iii) high-flow quantile assessment, and (iv) low-flow quantile assessment. Perrin, C., C. Michel, and V. Andréassian (2003), Improvement of a parsimonious model for streamflow simulation, Journal of Hydrology, 279(1-4), 275-289, doi: 10.1016/s0022-1694(03)00225-7.

  7. Flood risk assessment at the regional scale: Computational challenges and the monster of uncertainty

    Science.gov (United States)

    Efstratiadis, Andreas; Papalexiou, Simon-Michael; Markonis, Yiannis; Koukouvinos, Antonis; Vasiliades, Lampros; Papaioannou, George; Loukas, Athanasios

    2016-04-01

    We present a methodological framework for flood risk assessment at the regional scale, developed within the implementation of the EU Directive 2007/60 in Greece. This comprises three phases: (a) statistical analysis of extreme rainfall data, resulting in spatially distributed parameters of intensity-duration-frequency (IDF) relationships and their confidence intervals, (b) hydrological simulations, using event-based semi-distributed rainfall-runoff approaches, and (c) hydraulic simulations, employing the propagation of flood hydrographs across the river network and the mapping of inundated areas. The flood risk assessment procedure is employed over the River Basin District of Thessaly, Greece, which requires schematization and modelling of hundreds of sub-catchments, each one examined for several risk scenarios. This is a challenging task, involving multiple computational issues to handle, such as the organization, control and processing of a huge amount of hydrometeorological and geographical data, the configuration of model inputs and outputs, and the co-operation of several software tools. In this context, we have developed supporting applications allowing massive data processing and effective model coupling, thus drastically reducing the need for manual interventions and, consequently, the time of the study. Within the flood risk computations we also account for three major sources of uncertainty, in an attempt to provide upper and lower confidence bounds of flood maps, i.e. (a) statistical uncertainty of IDF curves, (b) structural uncertainty of hydrological models, due to varying antecedent soil moisture conditions, and (c) parameter uncertainty of hydraulic models, with emphasis on roughness coefficients. Our investigations indicate that the combined effect of the above uncertainties (which are certainly not the only ones) results in extremely large bounds of potential inundation, thus raising many questions about the interpretation and usefulness of current flood

  8. A sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory.

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, J. D. (Prostat, Mesa, AZ); Oberkampf, William Louis; Helton, Jon Craig (Arizona State University, Tempe, AZ); Storlie, Curtis B. (North Carolina State University, Raleigh, NC)

    2006-10-01

    Evidence theory provides an alternative to probability theory for the representation of epistemic uncertainty in model predictions that derives from epistemic uncertainty in model inputs, where the descriptor epistemic is used to indicate uncertainty that derives from a lack of knowledge with respect to the appropriate values to use for various inputs to the model. The potential benefit, and hence appeal, of evidence theory is that it allows a less restrictive specification of uncertainty than is possible within the axiomatic structure on which probability theory is based. Unfortunately, the propagation of an evidence theory representation for uncertainty through a model is more computationally demanding than the propagation of a probabilistic representation for uncertainty, with this difficulty constituting a serious obstacle to the use of evidence theory in the representation of uncertainty in predictions obtained from computationally intensive models. This presentation describes and illustrates a sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory. Preliminary trials indicate that the presented strategy can be used to propagate uncertainty representations based on evidence theory in analysis situations where naive sampling-based (i.e., unsophisticated Monte Carlo) procedures are impracticable due to computational cost.
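
    As a rough illustration of the kind of propagation described above, the sketch below pushes an evidence-theory (Dempster-Shafer) description of two inputs through a cheap stand-in model by sampling within each joint focal element, and accumulates belief and plausibility for an output threshold being exceeded. The model, focal elements and masses are invented for the example; the report's actual strategy is more sophisticated than this naive enumeration.

```python
"""Sketch: sampling-based propagation of an evidence-theory input description (illustrative)."""
import itertools
import numpy as np

rng = np.random.default_rng(0)

def f(x, y):                      # stand-in for an expensive simulation model
    return x**2 + np.sin(3 * y)

# Each input is described by focal elements: (interval, basic probability assignment)
focal_x = [((0.0, 1.0), 0.6), ((0.5, 2.0), 0.4)]
focal_y = [((-1.0, 0.0), 0.3), ((-0.5, 1.5), 0.7)]

threshold = 1.0                   # question: how likely is f(x, y) > threshold ?
belief = plausibility = 0.0

for (ix, mx), (iy, my) in itertools.product(focal_x, focal_y):
    mass = mx * my                # joint BPA under an independence assumption
    # Sample within the joint focal element to approximate the output range
    xs = rng.uniform(*ix, size=500)
    ys = rng.uniform(*iy, size=500)
    out = f(xs, ys)
    if out.min() > threshold:     # the whole focal element supports the event
        belief += mass
    if out.max() > threshold:     # the focal element is at least consistent with it
        plausibility += mass

print(f"Bel(f > {threshold}) ~ {belief:.2f},  Pl(f > {threshold}) ~ {plausibility:.2f}")
```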

  9. Uncertainty Management in Seismic Vulnerability Assessment Using Granular Computing Based on Covering of Universe

    Science.gov (United States)

    Khamespanah, F.; Delavar, M. R.; Zare, M.

    2013-05-01

    An earthquake is an abrupt displacement of the earth's crust caused by the release of strain accumulated along faults or by volcanic eruptions. As a recurring natural cataclysm, earthquakes have always been a matter of concern in Tehran, the capital of Iran, a city lying on a number of known and unknown faults. Earthquakes can cause severe physical, psychological and financial damage. Consequently, procedures should be developed to assist in modelling the potential casualties and their spatial uncertainty. One of these procedures is the production of seismic vulnerability maps, which support preventive measures to mitigate the human and financial losses of future earthquakes. Since vulnerability assessment is a multi-criteria decision-making problem depending on several parameters and experts' judgments, it is undoubtedly characterized by intrinsic uncertainties. In this study, a granular computing (GrC) model based on a covering of the universe is used to handle the spatial uncertainty. Granular computing concentrates on a general theory and methodology for problem solving and information processing by assuming multiple levels of granularity; its basic elements are subsets, classes, and clusters of a universe of elements. In this research, GrC is used to extract classification rules with minimum entropy for seismic vulnerability, in order to handle uncertainty related to earthquake data. Tehran was selected as the study area. In our previous research, a granular computing model based on a partition of the universe was employed. That model has some limitations in defining similarity between elements of the universe and in defining granules: similarity between elements is defined by an equivalence relation, under which two objects are similar with respect to some attributes only if their values for each of these attributes are equal. In this research, a general relation for defining similarity between elements of the universe is proposed.

  10. Quantifying the Contribution of Post-Processing in Computed Tomography Measurement Uncertainty

    DEFF Research Database (Denmark)

    Stolfi, Alessandro; Thompson, Mary Kathryn; Carli, Lorenzo;

    2016-01-01

    This paper evaluates and quantifies the repeatability of post-processing settings, such as surface determination, data fitting, and the definition of the datum system, on the uncertainties of Computed Tomography (CT) measurements. The influence of post-processing contributions was determined by calculating the standard deviation of 10 repeated measurement evaluations on the same data set. The evaluations were performed on an industrial assembly. Each evaluation includes several dimensional and geometrical measurands that were expected to have different responses to the various post-processing settings. It was found that the definition of the datum system had the largest impact on the uncertainty, with a standard deviation of a few microns. The surface determination and data fitting had smaller contributions, with sub-micron repeatability.

  11. Uncertainty quantification based on pillars of experiment, theory, and computation. Part I: Data analysis

    Science.gov (United States)

    Elishakoff, I.; Sarlin, N.

    2016-06-01

    In this paper we provide a general methodology for the analysis and design of systems involving uncertainties. Available experimental data are enclosed by geometric figures (triangle, rectangle, ellipse, parallelogram, super ellipse) of minimum area. These areas are then inflated, resorting to the Chebyshev inequality, in order to take into account the forecasted data. The next step consists in evaluating the response of the system when uncertainties are confined to one of the above five suitably inflated geometric figures. This step involves a combined theoretical and computational analysis. We evaluate the maximum response of the system subjected to variation of uncertain parameters in each hypothesized region. The results of triangular, interval, ellipsoidal, parallelogram, and super ellipsoidal calculi are compared with the view of identifying the region that leads to the minimum of the maximum response. That response is identified as a result of the suggested predictive inference. The methodology thus synthesizes the probabilistic notion with each of the five calculi. Using the term "pillar" in the title was inspired by the News Release (2013) on according the Honda Prize to J. Tinsley Oden, stating, among others, that "Dr. Oden refers to computational science as the "third pillar" of scientific inquiry, standing beside theoretical and experimental science. Computational science serves as a new paradigm for acquiring knowledge and informing decisions important to humankind". Analysis of systems with uncertainties necessitates employment of all three pillars. The analysis is based on the assumption that the five shapes are each different conservative estimates of the true bounding region. The smallest of the maximal displacements in the x and y directions (for a 2D system) therefore provides the closest estimate of the true displacements based on the above assumption.
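
    A minimal sketch of the first two steps, under simplifying assumptions: the data are enclosed in a minimum axis-aligned rectangle (one of the five figures), the rectangle is inflated using a Chebyshev-type bound, and a worst-case response of a linear output is evaluated over the inflated region. The paper's exact inflation rule and anti-optimization procedure may differ; everything below is illustrative.

```python
"""Sketch: minimum bounding rectangle of 2-D data, Chebyshev inflation, worst-case response."""
import numpy as np

rng = np.random.default_rng(1)
data = rng.normal(loc=[2.0, -1.0], scale=[0.3, 0.5], size=(40, 2))  # measured pairs

# Minimum axis-aligned rectangle containing the observed data
lo, hi = data.min(axis=0), data.max(axis=0)

# Chebyshev: P(|X - mu| >= k*sigma) <= 1/k**2, so k = 1/sqrt(alpha) covers
# at least (1 - alpha) of future (forecast) data in each coordinate.
alpha = 0.05
k = 1.0 / np.sqrt(alpha)
mu, sigma = data.mean(axis=0), data.std(axis=0, ddof=1)

lo_infl = np.minimum(lo, mu - k * sigma)
hi_infl = np.maximum(hi, mu + k * sigma)

print("observed rectangle:", lo, hi)
print("inflated rectangle:", lo_infl, hi_infl)

# A worst-case (anti-optimization) response over the inflated rectangle, e.g.
# for a linear response g(x) = c @ x, is attained at a vertex of the rectangle:
c = np.array([1.5, -2.0])
vertices = np.array([[a, b] for a in (lo_infl[0], hi_infl[0])
                            for b in (lo_infl[1], hi_infl[1])])
print("max response over rectangle:", (vertices @ c).max())
```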

  12. Uncertainty studies of real anode surface area in computational analysis for molten salt electrorefining

    International Nuclear Information System (INIS)

    Highlights: → Numerical electrochemo-fluid modeling of pyrochemical electrorefining in cross comparison with 2D and 3D analysis models. → Benchmark study on cell potential of molten LiCl-KCl electrorefining with the Mark-IV electrorefiner containing EBR-II spent fuel. → Determination of a real anode surface area profile model governing electrorefining performance. → Identification of uncertainty factors in electrorefining causing disagreements between simulation and experiment. → Fully transient performance analysis of 80 hours of Mark-IV electrorefining with a multi-species multi-reaction 1D model. - Abstract: This study examines how much the cell potential changes with five differently assumed real anode surface area cases. Determining the real anode surface area is a significant issue to be resolved for precisely modeling molten salt electrorefining. Based on a three-dimensional electrorefining model, calculated cell potentials are compared with the experimental cell potential variation over 80 h of operation of the Mark-IV electrorefiner with driver fuel from the Experimental Breeder Reactor II. We succeeded in achieving good agreement with the overall trend of the experimental data through appropriate selection of a model for the real anode surface area, but there are still local inconsistencies between theoretical calculation and experimental observation. In addition, the results were validated and compared with two-dimensional results to identify possible uncertainty factors that had to be further considered in a computational electrorefining analysis. These uncertainty factors include material properties, heterogeneous material distribution, surface roughness, and current efficiency. Zirconium's abundance and complex behavior have more impact on uncertainty towards the latter period of electrorefining for a given batch of fuel. The benchmark results found that anode materials would be dissolved from both axial and radial directions at least for low burn-up metallic fuels after active

  13. UNCERTAINTY MANAGEMENT IN SEISMIC VULNERABILITY ASSESSMENT USING GRANULAR COMPUTING BASED ON COVERING OF UNIVERSE

    Directory of Open Access Journals (Sweden)

    F. Khamespanah

    2013-05-01

    Granular computing concentrates on a general theory and methodology for problem solving and information processing by assuming multiple levels of granularity. Basic elements in granular computing are subsets, classes, and clusters of a universe of elements. In this research, GrC is used to extract classification rules with minimum entropy for seismic vulnerability, in order to handle uncertainty related to earthquake data. Tehran was selected as the study area. In our previous research, a granular computing model based on a partition of the universe was employed. That model has some limitations in defining similarity between elements of the universe and in defining granules: similarity between elements is defined by an equivalence relation, under which two objects are similar with respect to some attributes only if their values for each of these attributes are equal. In this research, a general relation for defining similarity between elements of the universe is proposed. The general relation is used for defining similarity and, instead of partitioning the universe, granulation is done based on a covering of the universe. As a result of the study, a physical seismic vulnerability map of Tehran has been produced based on the granular computing model, and its accuracy is evaluated using the granular computing model based on a covering of the universe. A comparison between this model and the granular computing model based on a partition of the universe was undertaken, which verified the superiority of GrC based on a covering of the universe in terms of the match between the achieved results and those confirmed by the related experts' judgments.

  14. Analysis of the CONRAD computational problems expressing only stochastic uncertainties: neutrons and protons.

    Science.gov (United States)

    Gualdrini, G; Tanner, R J; Agosteo, S; Pola, A; Bedogni, R; Ferrari, P; Lacoste, V; Bordy, J-M; Chartier, J-L; de Carlan, L; Gomez Ros, J-M; Grosswendt, B; Kodeli, I; Price, R A; Rollet, S; Schultz, F; Siebert, B; Terrissol, M; Zankl, M

    2008-01-01

    Within the scope of CONRAD (A Coordinated Action for Radiation Dosimetry), Work Package 4 (WP4) on Computational Dosimetry collaborated with the other research actions on internal dosimetry, complex mixed radiation fields at workplaces and medical staff dosimetry. Besides these collaborative actions, WP4 promoted an international comparison on eight problems with their associated experimental data. A first set of three problems, the results of which are herewith summarised, dealt only with the expression of the stochastic uncertainties of the results: the analysis of the response function of a proton recoil telescope detector, the study of a Bonner sphere neutron spectrometer and the analysis of the neutron spectrum and the dosimetric quantity Hp(10) in a thermal neutron facility operated by IRSN Cadarache (the SIGMA facility). A second paper will summarise the results of the other five problems, which dealt with the full uncertainty budget estimate. A third paper will present the results of a comparison on in vivo measurements of the (241)Am bone-seeker nuclide distributed in the knee. All the detailed papers will be presented in the WP4 Final Workshop Proceedings.

  15. Overcoming computational uncertainties to reveal chemical sensitivity in single molecule conduction calculations.

    Science.gov (United States)

    Solomon, Gemma C; Reimers, Jeffrey R; Hush, Noel S

    2005-06-01

    In the calculation of conduction through single molecules, approximations about the geometry and electronic structure of the system are usually made in order to simplify the problem. Previously [G. C. Solomon, J. R. Reimers, and N. S. Hush, J. Chem. Phys. 121, 6615 (2004)], we have shown that, in calculations employing cluster models for the electrodes, proper treatment of the open-shell nature of the clusters is the most important computational feature required to make the results sensitive to variations in the structural and chemical features of the system. Here, we expand this and establish a general hierarchy of requirements involving the treatment of geometrical approximations. These approximations are categorized into two classes: those associated with finite-dimensional methods for representing the semi-infinite electrodes, and those associated with the chemisorption topology. We show that ca. 100 unique atoms are required in order to properly characterize each electrode: using fewer atoms leads to nonsystematic variations in conductivity that can overwhelm the subtler changes. The choice of binding site is shown to be the next most important feature, while some effects that are difficult to control experimentally, concerning the orientations at each binding site, are actually shown to be insignificant. Verification of this result provides a general test for the precision of computational procedures for molecular conductivity. Predictions concerning the dependence of conduction on substituent and other effects on the central molecule are found to be meaningful only when they exceed the uncertainties of the effects associated with binding-site variation.

  16. Differential effects of reward and punishment in decision making under uncertainty: a computational study.

    Directory of Open Access Journals (Sweden)

    Elaine eDuffin

    2014-02-01

    Full Text Available Computational models of learning have proved largely successful in characterising potential mechanisms which allow humans to make decisions in uncertain and volatile contexts. We report here findings that extend existing knowledge and show that a modified reinforcement learning model which differentiates between prior reward and punishment can provide the best fit to human behaviour in decision making under uncertainty. More specifically, we examined the fit of our modified reinforcement learning model to human behavioural data in a probabilistic two-alternative decision making task with rule reversals. Our results demonstrate that this model predicted human behaviour better than a series of other models based on reinforcement learning or Bayesian reasoning. Unlike the Bayesian models, our modified reinforcement learning model does not include any representation of rule switches. When our task is considered purely as a machine learning task, to gain as many rewards as possible without trying to describe human behaviour, the performance of modified reinforcement learning and Bayesian methods is similar. Others have used various computational models to describe human behaviour in similar tasks, however, we are not aware of any who have compared Bayesian reasoning with reinforcement learning modified to differentiate rewards and punishments.
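
    The core modelling idea, separate treatment of rewards and punishments, can be sketched with a simple simulated agent: a Q-learning-style update with one learning rate for rewarded trials and another for punished trials, applied to a probabilistic two-alternative task with a rule reversal. This is only a generic sketch under assumed parameter values, not the authors' fitted model or task.

```python
"""Sketch: reinforcement learning with separate learning rates for reward and punishment."""
import numpy as np

rng = np.random.default_rng(7)

alpha_reward, alpha_punish, beta = 0.30, 0.10, 3.0   # assumed parameters
q = np.zeros(2)                                      # action values, 2 options
p_reward = np.array([0.8, 0.2])                      # rich vs poor option

choices = []
for t in range(400):
    if t == 200:                                     # rule reversal halfway through
        p_reward = p_reward[::-1]
    # Softmax choice between the two options
    p_choice = np.exp(beta * q) / np.exp(beta * q).sum()
    a = rng.choice(2, p=p_choice)
    r = 1.0 if rng.random() < p_reward[a] else -1.0  # reward or punishment
    # Key feature: different learning rates for positive and negative outcomes
    alpha = alpha_reward if r > 0 else alpha_punish
    q[a] += alpha * (r - q[a])
    choices.append(a)

print("accuracy before reversal:", np.mean(np.array(choices[:200]) == 0))
print("accuracy after  reversal:", np.mean(np.array(choices[200:]) == 1))
```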

  17. Computational uncertainty principle in nonlinear ordinary differential equations--Numerical results

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    In a majority of cases of long-time numerical integration for initial-value problems, round-off error has received little attention. Using twenty-nine numerical methods, the influence of round-off error on numerical solutions is studied through a large number of numerical experiments. We find that, in solving nonlinear ordinary differential equations (ODEs) in finite machine precision, there exist a strong dependence on machine precision (a new kind of dependence, different from the sensitive dependence on initial conditions), a maximally effective computation time (MECT) and an optimal stepsize (OS). An optimal searching method for evaluating MECT and OS under finite machine precision is presented. The relationships between MECT, OS, the order of the numerical method and the machine precision are found. Numerical results show that round-off error plays a significant role in the above phenomena. Moreover, we find two universal relations which are independent of the types of ODEs, initial values and numerical schemes. Based on the results of numerical experiments, we present a computational uncertainty principle, which is a great challenge to the reliability of long-time numerical integration for nonlinear ODEs.
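
    The interplay between truncation error and round-off error behind the "optimal stepsize" can be glimpsed with a deliberately crude experiment: integrate a simple ODE with explicit Euler in single precision and watch the error as the step size shrinks. The ODE, scheme and step sizes are arbitrary illustrative choices, and the turning point is not guaranteed to appear exactly within the printed range.

```python
"""Sketch: finite machine precision limits long-time integration accuracy (illustrative)."""
import numpy as np

def euler_float32(t_end, h):
    y = np.float32(1.0)
    h32 = np.float32(h)
    n = int(round(t_end / h))
    for _ in range(n):
        y = y + h32 * (-y)          # all arithmetic kept in single precision
    return float(y)

t_end, exact = 10.0, np.exp(-10.0)   # y' = -y, y(0) = 1
for h in (1e-1, 1e-2, 1e-3, 1e-4, 1e-5):
    err = abs(euler_float32(t_end, h) - exact)
    print(f"h = {h:7.0e}   |error| = {err:.3e}")
# Truncation error shrinks with h while accumulated round-off grows with the
# number of steps, so the error typically stops improving below some step size:
# that turning point plays the role of the 'optimal stepsize' discussed above.
```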

  18. Fast computation of statistical uncertainty for spatiotemporal distributions estimated directly from dynamic cone beam SPECT projections

    Energy Technology Data Exchange (ETDEWEB)

    Reutter, Bryan W.; Gullberg, Grant T.; Huesman, Ronald H.

    2001-04-09

    The estimation of time-activity curves and kinetic model parameters directly from projection data is potentially useful for clinical dynamic single photon emission computed tomography (SPECT) studies, particularly in those clinics that have only single-detector systems and thus are not able to perform rapid tomographic acquisitions. Because the radiopharmaceutical distribution changes while the SPECT gantry rotates, projections at different angles come from different tracer distributions. A dynamic image sequence reconstructed from the inconsistent projections acquired by a slowly rotating gantry can contain artifacts that lead to biases in kinetic parameters estimated from time-activity curves generated by overlaying regions of interest on the images. If cone beam collimators are used and the focal point of the collimators always remains in a particular transaxial plane, additional artifacts can arise in other planes reconstructed using insufficient projection samples [1]. If the projection samples truncate the patient's body, this can result in additional image artifacts. To overcome these sources of bias in conventional image-based dynamic data analysis, we and others have been investigating the estimation of time-activity curves and kinetic model parameters directly from dynamic SPECT projection data by modeling the spatial and temporal distribution of the radiopharmaceutical throughout the projected field of view [2-8]. In our previous work we developed a computationally efficient method for fully four-dimensional (4-D) direct estimation of spatiotemporal distributions from dynamic SPECT projection data [5], which extended Formiconi's least squares algorithm for reconstructing temporally static distributions [9]. In addition, we studied the biases that result from modeling various orders of temporal continuity and using various time samplings [5]. In the present work, we address computational issues associated with evaluating the statistical uncertainty of

  19. Evaluation of nuclear safety from the outputs of computer codes in the presence of uncertainties

    International Nuclear Information System (INIS)

    We apply methods from order statistics to the problem of satisfying regulations that specify individual criteria to be met by each of a number of outputs, k, from a computer code simulating nuclear accidents. The regulations are assumed to apply to an 'extent', γk (such as 95%), of the cumulative probability distribution of each output, k, that is obtained by randomly varying the inputs to the code over their ranges of uncertainty. We use a 'bracketing' approach to obtain expressions for the confidence, β, or probability that these desired extents will be covered in N runs of the code. Detailed results are obtained for k=1,2,3, with equal extents, γ, and are shown to depend on the degree of correlation of the outputs. They reduce to the proper expressions in limiting cases. These limiting cases are also analyzed for an arbitrary number of outputs, k. The bracketing methodology is contrasted with the traditional 'coverage' approach, in which the objective is to obtain a range of outputs that encloses a total fraction, γ, of all possible outputs, without regard to the extent of individual outputs. For the case of two outputs we develop an alternate formulation and show that the confidence, β, depends on the degree of correlation between the outputs. The alternate formulation reduces to the single-output case when the outputs are so well correlated that the coverage criterion is always met in a single run of the code if either output lies beyond an extent γ; it reduces to Wilks' expression for uncorrelated variables when the outputs are independent; and it reduces to Wald's result when the outputs are so negatively correlated that the coverage criterion could never be met by the two outputs of a single run of the code. The predictions of both formulations are validated by comparison with Monte Carlo simulations
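
    For orientation, the familiar first-order Wilks-type run count can be computed in a few lines. For a single output, the confidence that the largest of N random runs exceeds the γ-quantile is β = 1 - γ^N; the sketch below also evaluates the simple (1 - γ^N)^k extension under an assumed independence of the k outputs, which is exactly the kind of assumption the correlation analysis above refines.

```python
"""Sketch: first-order Wilks-type run counts for code-output tolerance limits."""

def runs_needed(gamma=0.95, beta=0.95, k=1):
    """Smallest N with (1 - gamma**N)**k >= beta (independent outputs assumed)."""
    n = 1
    while (1.0 - gamma**n) ** k < beta:
        n += 1
    return n

for k in (1, 2, 3):
    print(f"k = {k}: N = {runs_needed(0.95, 0.95, k)} runs for 95%/95%")
# k = 1 reproduces the familiar Wilks result of 59 runs; correlated outputs,
# as analysed in the record above, change these numbers.
```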

  20. Computing continuous record of discharge with quantified uncertainty using index velocity observations: A probabilistic machine learning approach

    Science.gov (United States)

    Farahmand, Touraj; Hamilton, Stuart

    2016-04-01

    Application of the index velocity method for computing continuous records of discharge has become increasingly common, especially since the introduction of low-cost acoustic Doppler velocity meters (ADVMs). In general, the index velocity method can be used at locations where stage-discharge methods are used, but it is especially appropriate and recommended when more than one specific discharge can be measured for a specific stage, such as under backwater and unsteady flow conditions caused by (but not limited to) stream confluences, streams flowing into lakes or reservoirs, tide-affected streams, regulated streamflows (dams or control structures), or streams affected by meteorological forcing, such as strong prevailing winds. In existing index velocity modeling techniques, two models (ratings) are required: an index velocity model and a stage-area model. The outputs from each of these models, mean channel velocity (Vm) and cross-sectional area (A), are then multiplied together to compute a discharge. Mean channel velocity (Vm) can generally be determined by a multivariate parametric regression model, such as linear regression in the simplest case. The main challenges in the existing index velocity modeling techniques are: 1) preprocessing and QA/QC of continuous index velocity data and synchronizing them with discharge measurements; 2) the nonlinear relationship between mean velocity and index velocity, which is not uncommon at monitoring locations; 3) model exploration and analysis in order to find the optimal regression model predictor(s) and model type (linear vs. nonlinear and, if nonlinear, the number of parameters); 4) model changes caused by dynamical changes in the environment (geomorphic, biological) over time; 5) deployment of the final model into the Data Management Systems (DMS) for real-time discharge calculation; and 6) objective estimation of uncertainty caused by field measurement errors, structural uncertainty, parameter uncertainty, and continuous sensor data
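
    The conventional baseline that the abstract builds on, two ratings multiplied together, is easy to sketch. Below, a linear index-velocity rating and a linear stage-area rating are fitted to synthetic calibration data, combined as Q = Vm · A, and bootstrapped for a crude uncertainty estimate. The data, model forms and numbers are invented; this is not the probabilistic machine-learning approach of the abstract itself.

```python
"""Sketch: index velocity method ratings with a simple bootstrap uncertainty (synthetic data)."""
import numpy as np

rng = np.random.default_rng(3)

# Synthetic calibration measurements (e.g. from discharge gaugings)
v_index = rng.uniform(0.1, 1.5, 60)                      # m/s from the ADVM
v_mean = 0.05 + 0.9 * v_index + rng.normal(0, 0.03, 60)  # measured mean velocity
stage = rng.uniform(1.0, 3.0, 60)                        # m
area = 20.0 + 15.0 * stage + rng.normal(0, 0.5, 60)      # measured cross-section area

# Fit the two ratings by least squares
b_v = np.polyfit(v_index, v_mean, 1)     # index-velocity rating
b_a = np.polyfit(stage, area, 1)         # stage-area rating

def discharge(vi, h, bv=b_v, ba=b_a):
    return np.polyval(bv, vi) * np.polyval(ba, h)

# Bootstrap the calibration data to attach an uncertainty to a prediction
vi_new, h_new = 0.8, 2.2
qs = []
for _ in range(1000):
    i = rng.integers(0, 60, 60)
    bv = np.polyfit(v_index[i], v_mean[i], 1)
    ba = np.polyfit(stage[i], area[i], 1)
    qs.append(discharge(vi_new, h_new, bv, ba))
print(f"Q ~ {np.mean(qs):.1f} m3/s "
      f"(95% interval {np.percentile(qs, 2.5):.1f} to {np.percentile(qs, 97.5):.1f})")
```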

  1. PABS: A Computer Program to Normalize Emission Probabilities and Calculate Realistic Uncertainties

    International Nuclear Information System (INIS)

    The program PABS normalizes relative particle emission probabilities to an absolute scale and calculates the relevant uncertainties on this scale. The program is written in Java using the JDK 1.6 library. For additional information about system requirements, the code itself, and compiling from source, see the README file distributed with this program. The mathematical procedures used are given.
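
    The basic normalization step that such a program performs can be illustrated as follows: relative intensities are scaled so they sum to one, and first-order error propagation (assuming uncorrelated inputs) attaches an uncertainty to each normalized probability. PABS itself may include further terms (e.g. an uncertainty on the normalization factor); the numbers below are invented.

```python
"""Sketch: normalizing relative emission probabilities with first-order uncertainty propagation."""
import numpy as np

r = np.array([100.0, 55.0, 12.0, 3.0])      # relative emission intensities (invented)
sr = np.array([2.0, 1.5, 0.6, 0.3])         # their absolute uncertainties (invented)

s = r.sum()
p = r / s                                    # normalized (absolute-scale) probabilities

# d p_i / d r_i = (1 - p_i)/S   and   d p_i / d r_j = -p_i/S  for j != i
var_p = np.empty_like(p)
for i in range(len(r)):
    grad = -p[i] / s * np.ones_like(r)
    grad[i] = (1.0 - p[i]) / s
    var_p[i] = np.sum((grad * sr) ** 2)      # uncorrelated inputs assumed

for pi, si in zip(p, np.sqrt(var_p)):
    print(f"p = {pi:.4f} +/- {si:.4f}")
```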

  2. LJUNGSKILE 1.0 A Computer Program for Investigation of Uncertainties in Chemical Speciation

    International Nuclear Information System (INIS)

    In analysing the long-term safety of nuclear waste disposal, there is a need to investigate uncertainties in chemical speciation calculations. Chemical speciation is of importance in evaluating the solubility of radionuclides, the chemical degradation of engineering materials, and the chemical processes controlling groundwater composition. The uncertainties in chemical speciation may for instance be related to uncertainties in thermodynamic data, the groundwater composition, or the extrapolation to the actual temperature and ionic strength. The magnitude of such uncertainties and their implications are seldom explicitly evaluated in any detail. Commonly available chemical speciation programmes normally do not have a built-in option to include uncertainty ranges. The program developed within this project has the capability of incorporating uncertainty ranges in speciation calculations and can be used for graphical presentation of uncertainty ranges for dominant species. The program should be regarded as a starting point for assessing uncertainties in chemical speciation, since it is not yet comprehensive in its capabilities, and there may be limitations in its usefulness for addressing various geochemical problems. The LJUNGSKILE code allows the user to select two approaches: the Monte Carlo (MC) approach and Latin Hypercube Sampling (LHS). LHS makes it possible to produce satisfactory statistics with a minimum of CPU time; it is, in general, possible to do a simple theoretical speciation calculation within seconds. There are, admittedly, alternatives to LHS, and there is criticism of the uncritical use of LHS output because correlations between some of the input variables commonly exist. LHS, like MC, is not capable of taking these correlations into account. Such a correlation can, for example, exist between the pH of a solution and the partial pressure of CO2: higher pH solutions may absorb larger amounts of CO2 and can reduce the CO2 partial pressure. It is therefore of advantage to
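
    The difference between the two sampling approaches mentioned above can be shown with a toy stand-in for a speciation calculation: LHS stratifies each uncertain input, which usually stabilises the output statistics at small sample sizes compared with plain Monte Carlo. The two uncertain inputs (a log K and a pH) and the response function are invented for the example, and, as the abstract notes, neither plain scheme accounts for correlations between inputs.

```python
"""Sketch: Latin Hypercube Sampling vs plain Monte Carlo on a toy speciation-like quantity."""
import numpy as np
from scipy.stats import qmc

def toy_species_fraction(log_k, ph):
    """Stand-in for a speciation calculation: fraction of a complexed species."""
    return 1.0 / (1.0 + 10.0 ** (log_k - ph))

rng = np.random.default_rng(5)
n = 50                                         # few samples, where LHS helps most
bounds_lo, bounds_hi = np.array([6.0, 7.0]), np.array([8.0, 9.0])

# Latin Hypercube Sampling: stratified in each input dimension
lhs = qmc.LatinHypercube(d=2, seed=5).random(n)
x_lhs = qmc.scale(lhs, bounds_lo, bounds_hi)

# Plain Monte Carlo over the same uniform ranges
x_mc = rng.uniform(bounds_lo, bounds_hi, size=(n, 2))

for name, x in (("LHS", x_lhs), ("MC ", x_mc)):
    f = toy_species_fraction(x[:, 0], x[:, 1])
    print(f"{name}: mean = {f.mean():.3f}, std = {f.std(ddof=1):.3f}")
# Note (as in the abstract): neither plain LHS nor MC accounts for correlations
# between inputs such as pH and CO2 partial pressure.
```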

  3. Optimal Technological Portfolios for Climate-Change Policy under Uncertainty: A Computable General Equilibrium Approach

    OpenAIRE

    David F. Bradford; Seung-Rae Kim; Klaus Keller

    2004-01-01

    When exploring solutions to long-term environmental problems such as climate change, it is crucial to understand how the rates and directions of technological change may interact with environmental policies in the presence of uncertainty. This paper analyzes optimal technological portfolios for global carbon emissions reductions in an integrated assessment model of the coupled social-natural system. The model used here is a probabilistic, two-technology extension of Nordhaus' earlier model (N...

  4. Contributions to the uncertainty management in numerical modelization: wave propagation in random media and analysis of computer experiments

    International Nuclear Information System (INIS)

    The present document constitutes my Habilitation thesis report. It recalls my scientific activity of the last twelve years, from my PhD thesis until the work completed as a research engineer at CEA Cadarache. The two main chapters of this document correspond to two different research fields, both related to the treatment of uncertainty in engineering problems. The first chapter establishes a synthesis of my work on high frequency wave propagation in random media. It more specifically relates to the study of the statistical fluctuations of acoustic wave travel-times in random and/or turbulent media. The new results mainly concern the introduction of the statistical anisotropy of the velocity field into the analytical expressions of the travel-time statistical moments as functions of those of the velocity field. This work was primarily driven by requirements in geophysics (oil exploration and seismology). The second chapter is concerned with probabilistic techniques for studying the effect of input variable uncertainties in numerical models. My main applications in this chapter relate to the nuclear engineering domain, which offers a large variety of uncertainty problems to be treated. First of all, a complete synthesis is carried out on the statistical methods of sensitivity analysis and global exploration of numerical models. The construction and the use of a meta-model (an inexpensive mathematical function replacing an expensive computer code) are then illustrated by my work on the Gaussian process model (kriging). Two additional topics are finally approached: the estimation of high quantiles of a computer code output and the analysis of stochastic computer codes. We conclude this memoir with some perspectives about numerical simulation and the use of predictive models in industry. This context is extremely positive for future research and application developments. (author)
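
    As a concrete illustration of the kriging meta-model idea mentioned above, the sketch below fits a Gaussian-process surrogate to a handful of runs of a cheap stand-in "code" and queries both its prediction and its predictive standard deviation. Library, kernel and design choices are assumptions made for the example, not those of the original work.

```python
"""Sketch: a Gaussian-process (kriging) metamodel of an expensive computer code (illustrative)."""
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def expensive_code(x):                      # stand-in for the simulator
    return np.sin(3 * x[:, 0]) + 0.5 * x[:, 1] ** 2

rng = np.random.default_rng(11)
x_train = rng.uniform(0, 1, size=(30, 2))   # small design of experiments
y_train = expensive_code(x_train)

gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(length_scale=[0.2, 0.2]),
                              normalize_y=True)
gp.fit(x_train, y_train)

# The metamodel gives both a prediction and a measure of its own uncertainty,
# which is what makes it useful for sensitivity analysis or quantile estimation.
x_new = rng.uniform(0, 1, size=(5, 2))
mean, std = gp.predict(x_new, return_std=True)
for m, s, truth in zip(mean, std, expensive_code(x_new)):
    print(f"kriging: {m:6.3f} +/- {s:.3f}   code: {truth:6.3f}")
```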

  5. Uncertainty analysis of computational methods for deriving sensible heat flux values from scintillometer measurements

    Directory of Open Access Journals (Sweden)

    P. A. Solignac

    2009-11-01

    Full Text Available The use of scintillometers to determine sensible heat fluxes is now common in studies of land-atmosphere interactions. The main interest in these instruments is due to their ability to quantify energy distributions at the landscape scale, as they can calculate sensible heat flux values over long distances, in contrast to Eddy Covariance systems. However, scintillometer data do not provide a direct measure of sensible heat flux, but require additional data, such as the Bowen ratio (β), to provide flux values. The Bowen ratio can either be measured using Eddy Covariance systems or derived from the energy balance closure. In this work, specific requirements for estimating energy fluxes using a scintillometer were analyzed, as well as the accuracy of two flux calculation methods. We first focused on the classical method (used in standard software) and analysed the impact of the Bowen ratio on the flux value and its uncertainty. For instance, an averaged Bowen ratio (β) of less than 1 proved to be a significant source of measurement uncertainty. An alternative method, called the "β-closure method", for which the Bowen ratio measurement is not necessary, was also tested. In this case, it was observed that even for low β values, flux uncertainties were reduced and scintillometer data were well correlated with the Eddy Covariance results. Moreover, both methods should tend to the same results, but the second one slightly underestimates H as β decreases (<5%).

  6. Real-time, mixed-mode computing architecture for waveform-resolved lidar systems with total propagated uncertainty

    Science.gov (United States)

    Ortman, Robert L.; Carr, Domenic A.; James, Ryan; Long, Daniel; O'Shaughnessy, Matthew R.; Valenta, Christopher R.; Tuell, Grady H.

    2016-05-01

    We have developed a prototype real-time computer for a bathymetric lidar capable of producing point clouds attributed with total propagated uncertainty (TPU). This real-time computer employs a "mixed-mode" architecture comprised of an FPGA, CPU, and GPU. Noise reduction and ranging are performed in the digitizer's user-programmable FPGA, and coordinates and TPU are calculated on the GPU. A Keysight M9703A digitizer with user-programmable Xilinx Virtex 6 FPGAs digitizes as many as eight channels of lidar data, performs ranging, and delivers the data to the CPU via PCIe. The floating-point-intensive coordinate and TPU calculations are performed on an NVIDIA Tesla K20 GPU. Raw data and computed products are written to an SSD RAID, and an attributed point cloud is displayed to the user. This prototype computer has been tested using 7m-deep waveforms measured at a water tank on the Georgia Tech campus, and with simulated waveforms to a depth of 20m. Preliminary results show the system can compute, store, and display about 20 million points per second.

  7. Mathematical and Computational Tools for Predictive Simulation of Complex Coupled Systems under Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Ghanem, Roger

    2013-03-25

    Methods and algorithms are developed to enable the accurate analysis of problems that exhibit interacting physical processes with uncertainties. These uncertainties can pertain either to each of the physical processes or to the manner in which they depend on each other. These problems are cast within a polynomial chaos framework and their solution then involves either solving a large system of algebraic equations or a high-dimensional numerical quadrature. In both cases, the curse of dimensionality is manifested. Procedures are developed for the efficient evaluation of the resulting linear equations that take advantage of the block-sparse structure of these equations, resulting in a block recursive Schur complement construction. In addition, embedded quadratures are constructed that permit the evaluation of very high-dimensional integrals using low-dimensional quadratures adapted to particular quantities of interest. The low-dimensional integration is carried out in a transformed measure space in which the quantity of interest is low-dimensional. Finally, a procedure is also developed to discover a low-dimensional manifold, embedded in the initial high-dimensional one, in which scalar quantities of interest exist. This approach permits the functional expression of the reduced space in terms of the original space, thus permitting cross-scale sensitivity analysis.
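
    To make the polynomial chaos framework concrete in its simplest form, the sketch below expands a toy function of a single standard-normal input in probabilists' Hermite polynomials, with the coefficients obtained by Gauss-Hermite quadrature, and checks the resulting mean and variance against Monte Carlo. The coupled-physics, high-dimensional machinery of the report (block Schur solvers, embedded quadratures) is far beyond this one-dimensional example; the function f is invented.

```python
"""Sketch: 1-D polynomial chaos expansion by Gauss-Hermite projection (illustrative only)."""
import math
import numpy as np
from numpy.polynomial import hermite_e as He

def f(xi):                                   # toy model of one standard-normal input
    return np.exp(0.3 * xi) + 0.1 * xi ** 2

order = 5
nodes, weights = He.hermegauss(20)           # Gauss quadrature for weight exp(-x^2/2)
norm = np.sqrt(2 * np.pi)                    # so that sum(weights)/norm = 1

# Projection: c_n = E[f(xi) He_n(xi)] / n!   (probabilists' Hermite polynomials)
coeffs = []
for n in range(order + 1):
    hen = He.hermeval(nodes, [0] * n + [1])  # He_n evaluated at the quadrature nodes
    coeffs.append(np.sum(weights * f(nodes) * hen) / norm / math.factorial(n))

# PCE mean is c_0; PCE variance is sum_{n>=1} n! c_n^2 (orthogonality of He_n)
mean_pce = coeffs[0]
var_pce = sum(math.factorial(n) * c ** 2 for n, c in enumerate(coeffs) if n > 0)

xi_mc = np.random.default_rng(4).normal(size=200_000)
print(f"mean: PCE {mean_pce:.4f}  vs  MC {f(xi_mc).mean():.4f}")
print(f"var : PCE {var_pce:.4f}  vs  MC {f(xi_mc).var():.4f}")
```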

  8. Uncertainty analysis of computational methods for deriving sensible heat flux values from scintillometer measurements

    Directory of Open Access Journals (Sweden)

    P. A. Solignac

    2009-06-01

    Full Text Available The use of scintillometers to determine sensible heat fluxes is now common in studies of land-atmosphere interactions. The main interest in these instruments is due to their ability to quantify energy distributions at the landscape scale, as they can calculate sensible heat flux values over long distances, in contrast to Eddy Correlation systems. However, scintillometer data do not provide a direct measure of sensible heat flux, but require additional data, such as the Bowen ratio (β), to provide flux values. The Bowen ratio can either be measured using Eddy Correlation systems or derived from the energy balance closure. In this work, specific requirements for estimating energy fluxes using a scintillometer were analyzed, as well as the accuracy of two flux calculation methods. We first focused on the classical method (used in standard software). We analysed the impact of the Bowen ratio according to both the time averaging and the ratio values; for instance, an averaged Bowen ratio (β) of less than 1 proved to be a significant source of measurement uncertainty. An alternative method, called the "β-closure method", for which the Bowen ratio measurement is not necessary, was also tested. In this case, it was observed that even for low β values, flux uncertainties were reduced and scintillometer data were well correlated with the Eddy Correlation results.

  9. Development of a Computational Framework for Stochastic Co-optimization of Water and Energy Resource Allocations under Climatic Uncertainty

    Science.gov (United States)

    Xuan, Y.; Mahinthakumar, K.; Arumugam, S.; DeCarolis, J.

    2015-12-01

    Owing to the lack of a consistent approach to assimilating probabilistic forecasts for water and energy systems, the utilization of climate forecasts for conjunctive management of these two systems is very limited. Prognostic management of these two systems presents a stochastic co-optimization problem that seeks to determine reservoir releases and power allocation strategies while minimizing the expected operational costs subject to probabilistic climate forecast constraints. To address these issues, we propose a high performance computing (HPC) enabled computational framework for stochastic co-optimization of water and energy resource allocations under climate uncertainty. The computational framework embodies a new paradigm shift in which attributes of climate (e.g., precipitation, temperature) and their forecasted probability distributions are employed conjointly to inform seasonal water availability and electricity demand. The HPC enabled cyberinfrastructure framework is developed to perform detailed stochastic analyses, and to better quantify and reduce the uncertainties associated with water and power systems management by utilizing improved hydro-climatic forecasts. In this presentation, our stochastic multi-objective solver, extended from Optimus (Optimization Methods for Universal Simulators), is introduced. The solver uses a parallel cooperative multi-swarm method for the efficient solution of large-scale simulation-optimization problems on parallel supercomputers. The cyberinfrastructure harnesses HPC resources to perform intensive computations using ensemble forecast models of streamflow and power demand. The stochastic multi-objective particle swarm optimizer we developed is used to co-optimize water and power system models under constraints over a large number of ensembles. The framework sheds light on the application of climate forecasts and a cyber-innovation framework to improve management and promote the sustainability of water and energy systems.

  10. Alleviating Search Uncertainty through Concept Associations: Automatic Indexing, Co-Occurrence Analysis, and Parallel Computing.

    Science.gov (United States)

    Chen, Hsinchun; Martinez, Joanne; Kirchhoff, Amy; Ng, Tobun D.; Schatz, Bruce R.

    1998-01-01

    Grounded on object filtering, automatic indexing, and co-occurrence analysis, an experiment was performed using a parallel supercomputer to analyze over 400,000 abstracts in an INSPEC computer engineering collection. A user evaluation revealed that system-generated thesauri were better than the human-generated INSPEC subject thesaurus in concept…

  11. Computing the uncertainty associated with the control of ecological and biological systems

    Directory of Open Access Journals (Sweden)

    Alessandro Ferrarini

    2013-09-01

    Full Text Available Recently, I showed that ecological and biological networks can be controlled by coupling their dynamics to evolutionary modelling. This provides numerous solutions to the goal of guiding a system's behaviour towards the desired result. In this paper, I face another important question: how reliable is the achieved solution? In other words, what is the degree of uncertainty about getting the desired result if the values of edges and nodes were a bit different from the optimized ones? This is a pivotal question, because it is not assured that, while managing a certain system, we are able to impose on nodes and edges exactly the optimized values we would need in order to achieve the desired results. In order to address this topic, I have formulated here a 3-part framework (network dynamics - genetic optimization - stochastic simulations) and, using an illustrative example, I have been able to detect the most reliable solution to the goal of network control. The proposed framework could be used to: a) counteract damages to ecological and biological networks, b) safeguard rare and endangered species, c) manage systems at the least possible cost, and d) plan optimized bio-manipulations.
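
    The reliability question posed above lends itself to a simple stochastic-simulation sketch: perturb the optimized node and edge values with noise, rerun the dynamics many times, and report the fraction of runs that still reach the target. The toy linear dynamics, target criterion and noise level below are invented for illustration and are much simpler than the framework's genetic-optimization stage.

```python
"""Sketch: reliability of an optimized network-control solution under parameter noise."""
import numpy as np

rng = np.random.default_rng(2)

# "Optimized" interaction matrix (edges) and initial state (nodes) of a toy 3-node network
w_opt = np.array([[0.0, -0.3, 0.1],
                  [0.2,  0.0, -0.4],
                  [0.1,  0.3,  0.0]])
x0 = np.array([1.0, 0.8, 0.5])

def simulate(w, x, steps=50, dt=0.1):
    x = x.copy()
    for _ in range(steps):
        x = x + dt * w @ x           # simple linear dynamics as a placeholder
    return x

target_ok = lambda x: x[2] > 0.4     # desired outcome: keep node 3 above a threshold

# Stochastic simulations: how robust is the result to imperfect implementation?
hits = 0
for _ in range(2000):
    w = w_opt * (1 + rng.normal(0, 0.05, w_opt.shape))   # 5% noise on each edge
    x = x0 * (1 + rng.normal(0, 0.05, x0.shape))         # 5% noise on each node
    hits += target_ok(simulate(w, x))
print(f"estimated reliability of the optimized solution: {hits / 2000:.1%}")
```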

  12. Bayesian Uncertainty Analysis SOARS for Computationally Expensive Simulation Models with Application to Contaminant Hydrology in the Cannonsville Watershed

    Science.gov (United States)

    Shoemaker, C. A.; Cowan, D.; Woodbury, J.; Ruppert, D.; Bliznyuk, N.; Wang, Y.; Li, Y.

    2009-12-01

    This paper presents the application of a new computationally efficient method, SOARS, for statistically rigorous assessment of uncertainty in parameters and model output when the model is calibrated to field data. The SOARS method is general and is here applied to watershed problems. The innovative aspect of this procedure is that an optimization method is first used to find the maximum likelihood estimator, and then the costly simulations done during the optimization are re-used to build a response surface model of the likelihood function. Markov Chain Monte Carlo is then applied to the response surface model to obtain the posterior distributions of the model parameters and the appropriate transformations to correct for non-normal error. On a hazardous spill-in-channel problem and on a small watershed (37 km2), the computational effort to obtain roughly the same accuracy of solution is 150 model simulations for the SOARS method versus 10,000 simulations for conventional MCMC analysis, which is more than a 60-fold reduction in computational effort. For the larger Cannonsville Watershed (1200 km2) the method is expanded to provide posterior densities not only on parameter values but also on multiple model predictions. Available software for the method will be discussed, as well as SOARS' use for assessing the impact of climate change on hydrology and water-borne pollutant transport in the Cannonsville basin and other watersheds.

  13. Computation of strain and rotation tensor as well as their uncertainties for small arrays in spherical coordinate system

    Institute of Scientific and Technical Information of China (English)

    MENG Guo-jie; REN Jin-wei; WU Ji-cang; SHEN Xu-hui

    2008-01-01

    Based on Taylor series expansion and the strain component expressions of elastic mechanics, we derive formulae for the strain and rotation tensors of small arrays in a spherical coordinate system. By linearizing the formulae, we also derive expressions for the uncertainties of the strain components and the Euler vector of subnets, using the law of error propagation. Taking the GPS velocity field in the Sichuan-Yunnan area as an example, we compute the dilatation rate and maximum shear strain rate fields using the above procedure, and their characteristics are preliminarily analysed. Limits of the strain model for small arrays are also discussed, with detailed explanations of the small-array method and the choice of small arrays. How to set the weights of GPS observations is further discussed, and the relationship between strain and the radius of GPS subnets is also analysed.

  14. Reducing annotation cost and uncertainty in computer-aided diagnosis through selective iterative classification

    Science.gov (United States)

    Riely, Amelia; Sablan, Kyle; Xiaotao, Thomas; Furst, Jacob; Raicu, Daniela

    2015-03-01

    Medical imaging technology has always provided radiologists with the opportunity to view and keep records of the anatomy of the patient. With the development of machine learning and intelligent computing, these images can be used to create Computer-Aided Diagnosis (CAD) systems, which can assist radiologists in analyzing image data in various ways to provide better health care to patients. This paper looks at increasing accuracy and reducing cost in creating CAD systems, specifically in predicting the malignancy of lung nodules in the Lung Image Database Consortium (LIDC). Much of the cost in creating an accurate CAD system stems from the need for multiple radiologist diagnoses or annotations of each image, since there is rarely a ground truth diagnosis and even different radiologists' diagnoses of the same nodule often disagree. To resolve this issue, this paper outlines a method of selective iterative classification that predicts lung nodule malignancy by using multiple radiologist diagnoses only for cases that can benefit from them. Our method achieved 81% accuracy at only 46% of the cost of the method that indiscriminately used all annotations, which achieved a lower accuracy of 70%.

  15. UCODE_2005 and six other computer codes for universal sensitivity analysis, calibration, and uncertainty evaluation constructed using the JUPITER API

    Science.gov (United States)

    Poeter, Eileen E.; Hill, Mary C.; Banta, Edward R.; Mehl, Steffen; Christensen, Steen

    2006-01-01

    This report documents the computer codes UCODE_2005 and six post-processors. Together the codes can be used with existing process models to perform sensitivity analysis, data needs assessment, calibration, prediction, and uncertainty analysis. Any process model or set of models can be used; the only requirements are that models have numerical (ASCII or text only) input and output files, that the numbers in these files have sufficient significant digits, that all required models can be run from a single batch file or script, and that simulated values are continuous functions of the parameter values. Process models can include pre-processors and post-processors as well as one or more models related to the processes of interest (physical, chemical, and so on), making UCODE_2005 extremely powerful. An estimated parameter can be a quantity that appears in the input files of the process model(s), or a quantity used in an equation that produces a value that appears in the input files. In the latter situation, the equation is user-defined. UCODE_2005 can compare observations and simulated equivalents. The simulated equivalents can be any simulated value written in the process-model output files or can be calculated from simulated values with user-defined equations. The quantities can be model results, or dependent variables. For example, for ground-water models they can be heads, flows, concentrations, and so on. Prior, or direct, information on estimated parameters also can be considered. Statistics are calculated to quantify the comparison of observations and simulated equivalents, including a weighted least-squares objective function. In addition, data-exchange files are produced that facilitate graphical analysis. UCODE_2005 can be used fruitfully in model calibration through its sensitivity analysis capabilities and its ability to estimate parameter values that result in the best possible fit to the observations. Parameters are estimated using nonlinear regression: a

  16. Computational uncertainty principle in nonlinear ordinary differential equations (I)——Numerical results

    Institute of Scientific and Technical Information of China (English)

    李建平; 曾庆存; 丑纪范

    2000-01-01

    In a majority of cases of long-time numerical integration for initial-value problems, round-off error has received little attention. Using twenty-nine numerical methods, the influence of round-off error on numerical solutions is studied through a large number of numerical experiments. We find that, in solving nonlinear ordinary differential equations (ODEs) in finite machine precision, there exist a strong dependence on machine precision (a new kind of dependence, different from the sensitive dependence on initial conditions), a maximally effective computation time (MECT) and an optimal stepsize (OS). An optimal searching method for evaluating MECT and OS under finite machine precision is presented. The relationships between MECT, OS, the order of the numerical method and the machine precision are found. Numerical results show that round-off error plays a significant role in the above phenomena. Moreover, we find two universal relations which are independent of the types of ODEs, initial values and numerical schemes.

  17. Uncertainties in radiative transfer computations: consequences on the ocean color products

    Science.gov (United States)

    Dilligeard, Eric; Zagolski, Francis; Fischer, Juergen; Santer, Richard P.

    2003-05-01

    Operational MERIS (MEdium Resolution Imaging Spectrometer) level-2 processing uses auxiliary data generated by two radiative transfer tools. These two codes simulate upwelling radiances within a coupled 'Atmosphere-Ocean' system, using different approaches based on the matrix-operator method (MOMO) and the successive orders (SO) technique. Intervalidation of these two radiative transfer codes was performed in order to implement them in the MERIS level-2 processing. MOMO and SO simulations were then conducted on a set of representative test cases. The results showed good agreement for all test cases: the scattering processes are retrieved within a few tenths of a percent. Nevertheless, some substantial discrepancies occur if polarization is not taken into account, mainly in the Rayleigh scattering computations. A preliminary study indicates that the impact of the code inaccuracy on the retrieval of water-leaving radiances (a level-2 MERIS product) is large, up to 50% in relative difference. Applying the OC2 algorithm, the effect on the retrieved chlorophyll concentration is less than 10%.

  18. The VIMOS VLT Deep Survey: Computing the two point correlation statistics and associated uncertainties

    CERN Document Server

    Pollo, A; Guzzo, L; Lefèvre, O; Blaizot, J P; Cappi, A; Iovino, A; Marinoni, C; McCracken, H J; Bottini, D; Garilli, B; Le Brun, V; MacCagni, D; Picat, J P; Scaramella, R; Scodeggio, M; Tresse, L; Vettolani, G; Zanichelli, A; Adami, C; Bardelli, S; Bolzonella, M; Charlot, S; Contini, T; Foucaud, S; Franzetti, P; Gavignaud, I; Ilbert, O; Marano, B; Mathez, G; Mazure, A; Merighi, R; Paltani, S; Pellò, R; Pozzetti, L; Radovich, M; Zamorani, G; Zucca, E; Bondi, M; Bongiorno, A; Busarello, G; Ciliegi, P; Mellier, Y; Merluzzi, P; Ripepi, V; Rizzo, D

    2004-01-01

    We are presenting in this paper a detailed account of the methods used to compute the three-dimensional two-point galaxy correlation function in the VIMOS-VLT deep survey (VVDS). We investigate how instrumental selection effects and observational biases affect the measurements and identify the methods to correct them. We quantify the accuracy of our correction method using an ensemble of fifty mock galaxy surveys generated with the GalICS semi-analytic model of galaxy formation which incorporate the same selection biases and tiling strategy as the real data does. We demonstrate that we are able to recover the real-space two-point correlation function xi(s) to an accuracy better than 10% on scales larger than 1 h^{-1} Mpc, and of about 30% on scales below 1 h^{-1} Mpc, with the sampling strategy used for the first epoch VVDS data. The projected correlation function w_p(r_p) is recovered with an accuracy better than 10% on all scales 0.1 <= r <= 10 h^{-1} Mpc. There is a tendency for a small but systemati...
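
    The statistic at the heart of this record, the two-point correlation function, is commonly estimated with the Landy-Szalay estimator, sketched below on toy 3-D point sets. Survey selection effects, the tiling strategy and the mock-based corrections that the paper is actually about are not modelled here; the box size, bins and catalogues are arbitrary.

```python
"""Sketch: Landy-Szalay two-point correlation estimator on toy point catalogues."""
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(9)
box = 100.0
data = rng.uniform(0, box, size=(2000, 3))       # stand-in "galaxy" positions
rand = rng.uniform(0, box, size=(4000, 3))       # random (unclustered) catalogue

bins = np.linspace(1.0, 20.0, 11)                # separation bins (arbitrary units)

def pair_counts(a, b, edges):
    """Differential pair counts between point sets a and b, per separation bin."""
    cum = cKDTree(a).count_neighbors(cKDTree(b), edges)
    return np.diff(cum).astype(float)

dd = pair_counts(data, data, bins) / (len(data) * (len(data) - 1))
rr = pair_counts(rand, rand, bins) / (len(rand) * (len(rand) - 1))
dr = pair_counts(data, rand, bins) / (len(data) * len(rand))

xi = (dd - 2 * dr + rr) / rr                     # Landy & Szalay (1993) estimator
for lo, hi, x in zip(bins[:-1], bins[1:], xi):
    print(f"{lo:5.1f}-{hi:5.1f}: xi = {x:+.3f}")  # ~0 for these unclustered points
```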

  19. Construction and experimental identification of an uncertain model in computational dynamics using a generalized probabilistic approach of uncertainties

    OpenAIRE

    Batou, Anas; SOIZE, Christian; Corus, M.

    2010-01-01

    We are interested in constructing an uncertain model of a nominal motor CSS of a pressurized water reactor using a generalized probabilistic approach of uncertainties, and in identifying this model using experimental measurements of the first eigenfrequencies. This generalized probabilistic approach of uncertainties allows both model-parameter uncertainties and model uncertainties to be taken into account and identified separately in the context of the experimental modal analysis.

  20. Model Uncertainty

    OpenAIRE

    Clyde, Merlise; George, Edward I.

    2004-01-01

    The evolution of Bayesian approaches for model uncertainty over the past decade has been remarkable. Catalyzed by advances in methods and technology for posterior computation, the scope of these methods has widened substantially. Major thrusts of these developments have included new methods for semiautomatic prior specification and posterior exploration. To illustrate key aspects of this evolution, the highlights of some of these developments are described.

  1. The Personality Trait of Intolerance to Uncertainty Affects Behavior in a Novel Computer-Based Conditioned Place Preference Task

    Science.gov (United States)

    Radell, Milen L.; Myers, Catherine E.; Beck, Kevin D.; Moustafa, Ahmed A.; Allen, Michael Todd

    2016-01-01

    Recent work has found that personality factors that confer vulnerability to addiction can also affect learning and economic decision making. One personality trait which has been implicated in vulnerability to addiction is intolerance to uncertainty (IU), i.e., a preference for familiar over unknown (possibly better) options. In animals, the motivation to obtain drugs is often assessed through conditioned place preference (CPP), which compares preference for contexts where drug reward was previously received. It is an open question whether participants with high IU also show heightened preference for previously rewarded contexts. To address this question, we developed a novel computer-based CPP task for humans in which participants guide an avatar through a paradigm in which one room contains frequent reward (i.e., rich) and one contains less frequent reward (i.e., poor). Following exposure to both contexts, subjects are assessed for preference to enter the previously rich and previously poor room. Individuals with low IU showed little bias to enter the previously rich room first, and instead entered both rooms at about the same rate which may indicate a foraging behavior. By contrast, those with high IU showed a strong bias to enter the previously rich room first. This suggests an increased tendency to chase reward in the intolerant group, consistent with previously observed behavior in opioid-addicted individuals. Thus, the personality factor of high IU may produce a pre-existing cognitive bias that provides a mechanism to promote decision-making processes that increase vulnerability to addiction. PMID:27555829

  2. The Personality Trait of Intolerance to Uncertainty Affects Behavior in a Novel Computer-Based Conditioned Place Preference Task.

    Science.gov (United States)

    Radell, Milen L; Myers, Catherine E; Beck, Kevin D; Moustafa, Ahmed A; Allen, Michael Todd

    2016-01-01

    Recent work has found that personality factors that confer vulnerability to addiction can also affect learning and economic decision making. One personality trait which has been implicated in vulnerability to addiction is intolerance to uncertainty (IU), i.e., a preference for familiar over unknown (possibly better) options. In animals, the motivation to obtain drugs is often assessed through conditioned place preference (CPP), which compares preference for contexts where drug reward was previously received. It is an open question whether participants with high IU also show heightened preference for previously rewarded contexts. To address this question, we developed a novel computer-based CPP task for humans in which participants guide an avatar through a paradigm in which one room contains frequent reward (i.e., rich) and one contains less frequent reward (i.e., poor). Following exposure to both contexts, subjects are assessed for preference to enter the previously rich and previously poor room. Individuals with low IU showed little bias to enter the previously rich room first, and instead entered both rooms at about the same rate which may indicate a foraging behavior. By contrast, those with high IU showed a strong bias to enter the previously rich room first. This suggests an increased tendency to chase reward in the intolerant group, consistent with previously observed behavior in opioid-addicted individuals. Thus, the personality factor of high IU may produce a pre-existing cognitive bias that provides a mechanism to promote decision-making processes that increase vulnerability to addiction. PMID:27555829

  3. Uncertainty analysis guide

    Energy Technology Data Exchange (ETDEWEB)

    Andres, T.H

    2002-05-01

    This guide applies to the estimation of uncertainty in quantities calculated by scientific, analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner by which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)
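
    Two of the propagation routes named in the guide, a first-order (series) approximation and Monte Carlo sampling, are contrasted in the sketch below on a toy model output with two uncorrelated uncertain inputs. The model and the input uncertainties are invented for illustration; the guide itself covers systematic uncertainties and validation questions that are not touched here.

```python
"""Sketch: first-order (series) vs Monte Carlo propagation of input uncertainties."""
import numpy as np

rng = np.random.default_rng(8)

def model(x):
    """Toy 'computer program' output as a function of two uncertain inputs."""
    return x[0] ** 2 / (1.0 + x[1])

mu = np.array([2.0, 0.5])         # best-estimate inputs
sigma = np.array([0.1, 0.05])     # their standard uncertainties (uncorrelated)

# Series (first-order) approximation: sigma_y^2 = sum_i (df/dx_i)^2 sigma_i^2
eps = 1e-6
grad = np.array([(model(mu + eps * np.eye(2)[i]) - model(mu)) / eps for i in range(2)])
sigma_series = np.sqrt(np.sum((grad * sigma) ** 2))

# Monte Carlo propagation of the same input uncertainties
samples = rng.normal(mu, sigma, size=(100_000, 2)).T
sigma_mc = model(samples).std(ddof=1)

print(f"first-order: {sigma_series:.4f}   Monte Carlo: {sigma_mc:.4f}")
```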

  4. Uncertainty analysis guide

    International Nuclear Information System (INIS)

    This guide applies to the estimation of uncertainty in quantities calculated by scientific, analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner by which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)

  5. Uncertainty and Cognitive Control

    Directory of Open Access Journals (Sweden)

    Faisal eMushtaq

    2011-10-01

    Full Text Available A growing trend of neuroimaging, behavioural and computational research has investigated the topic of outcome uncertainty in decision-making. Although evidence to date indicates that humans are very effective in learning to adapt to uncertain situations, the nature of the specific cognitive processes involved in the adaptation to uncertainty is still a matter of debate. In this article, we reviewed evidence suggesting that cognitive control processes are at the heart of uncertainty in decision-making contexts. Available evidence suggests that: (1) There is a strong conceptual overlap between the constructs of uncertainty and cognitive control; (2) There is a remarkable overlap between the neural networks associated with uncertainty and the brain networks subserving cognitive control; (3) The perception and estimation of uncertainty might play a key role in monitoring processes and the evaluation of the need for control; (4) Potential interactions between uncertainty and cognitive control might play a significant role in several affective disorders.

  6. A study on evaluation strategies in dimensional X-ray computed tomography by estimation of measurement uncertainties

    DEFF Research Database (Denmark)

    Müller, Pavel; Hiller, Jochen; Cantatore, Angela;

    2012-01-01

    …measurement results using different measuring strategies applied in different inspection software packages for volume and surface data analysis. The strategy influence is determined by calculating the measurement uncertainty. This investigation includes measurements of two industrial items, an aluminium pipe connector and a plastic toggle, a hearing aid component. These are measured using a commercial CT scanner. Traceability is transferred using tactile and optical coordinate measuring machines, which are used to produce reference measurements. Results show that measurements of diameter for both parts resulted in smaller systematic errors compared to distance and height measurements. It was found that uncertainties of all measurands evaluated on surface data were generally greater compared to measurements performed on volume data.

  7. Uncertainty analysis

    Energy Technology Data Exchange (ETDEWEB)

    Thomas, R.E.

    1982-03-01

    An evaluation is made of the suitability of analytical and statistical sampling methods for making uncertainty analyses. The adjoint method is found to be well-suited for obtaining sensitivity coefficients for computer programs involving large numbers of equations and input parameters. For this purpose the Latin Hypercube Sampling method is found to be inferior to conventional experimental designs. The Latin hypercube method can be used to estimate output probability density functions, but requires supplementary rank transformations followed by stepwise regression to obtain uncertainty information on individual input parameters. A simple Cork and Bottle problem is used to illustrate the efficiency of the adjoint method relative to certain statistical sampling methods. For linear models of the form Ax=b it is shown that a complete adjoint sensitivity analysis can be made without formulating and solving the adjoint problem. This can be done either by using a special type of statistical sampling or by reformulating the primal problem and using suitable linear programming software.
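
    A rough sketch of the statistical-sampling route discussed above: a hand-rolled Latin hypercube design feeding a toy two-input model, with Spearman rank correlations standing in for the rank-transformation/stepwise-regression step. The model and sample size are invented for the example and are not the report's Cork and Bottle problem.

        import numpy as np
        from scipy.stats import norm, spearmanr

        rng = np.random.default_rng(0)

        def latin_hypercube(n_samples, n_dims):
            """One point per stratum in each dimension, strata paired by random permutation."""
            u = (rng.random((n_samples, n_dims)) + np.arange(n_samples)[:, None]) / n_samples
            for j in range(n_dims):
                u[:, j] = rng.permutation(u[:, j])
            return u  # uniform(0, 1) samples with Latin hypercube structure

        n = 500
        u = latin_hypercube(n, 2)
        x1 = norm.ppf(u[:, 0], loc=1.0, scale=0.2)   # illustrative input distributions
        x2 = norm.ppf(u[:, 1], loc=5.0, scale=1.0)

        y = x1 ** 2 + 0.1 * x2                        # toy model

        # Rank correlations as a crude stand-in for the rank-regression step in the abstract
        for name, x in [("x1", x1), ("x2", x2)]:
            rho, _ = spearmanr(x, y)
            print(f"Spearman rank correlation of {name} with output: {rho:+.2f}")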

  8. Uncertainty budget for a whole body counter in the scan geometry and computer simulation of the calibration phantoms

    International Nuclear Information System (INIS)

    At the Austrian Research Centers Seibersdorf (ARCS), a whole body counter (WBC) in the scan geometry is used to perform routine measurements for the determination of radioactive intake of workers. The calibration of the WBC is made using bottle phantoms with a homogeneous activity distribution. The same calibration procedures have been simulated using Monte Carlo N-Particle (MCNP) code and FLUKA and the results of the full energy peak efficiencies for eight energies and five phantoms have been compared with the experimental results. The deviation between experiment and simulation results is within 10%. Furthermore, uncertainty budget evaluations have been performed to find out which parameters make substantial contributions to these differences. Therefore, statistical errors of the Monte Carlo simulation, uncertainties in the cross section tables and differences due to geometrical considerations have been taken into account. Comparisons between these results and the one with inhomogeneous distribution, for which the activity is concentrated only in certain parts of the body (such as head, lung, arms and legs), have been performed. The maximum deviation of 43% from the homogeneous case has been found when the activity is concentrated on the arms. (authors)

  9. A new surrogate modeling technique combining Kriging and polynomial chaos expansions – Application to uncertainty analysis in computational dosimetry

    OpenAIRE

    Kersaudy, Pierric; Sudret, Bruno; Varsier, Nadège; Picon, Odile; Wiart, Joe

    2015-01-01

    In numerical dosimetry, the recent advances in high performance computing led to a strong reduction of the required computational time to assess the specific absorption rate (SAR) characterizing the human exposure to electromagnetic waves. However, this procedure remains time-consuming and a single simulation can request several hours. As a consequence, the influence of uncertain input parameters onto the SAR cannot be analyzed using crude Monte Carlo simulation. The solution presented here t...

  10. Measurement uncertainty.

    Science.gov (United States)

    Bartley, David; Lidén, Göran

    2008-08-01

    The reporting of measurement uncertainty has recently undergone a major harmonization whereby characteristics of a measurement method obtained during establishment and application are combined componentwise. For example, the sometimes-pesky systematic error is included. A bias component of uncertainty can often be easily established as the uncertainty in the bias. However, beyond simply arriving at a value for uncertainty, meaning can sometimes be given to this uncertainty, if needed, in terms of prediction confidence in uncertainty-based intervals covering what is to be measured. To this end, a link between concepts of accuracy and uncertainty is established through a simple yet accurate approximation to a random variable known as the non-central Student's t-distribution. Without a measureless and perpetual uncertainty, the drama of human life would be destroyed. Winston Churchill.
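
    A minimal sketch of the componentwise (root-sum-square) combination described above, with the uncertainty of the bias included as one component; the numerical values and the coverage factor k = 2 are illustrative only, not taken from the paper.

        import math

        # Illustrative uncertainty components for a single measurand (same units)
        u_repeatability = 0.8    # standard deviation of repeated measurements
        u_calibration   = 0.5    # from the calibration certificate
        bias            = 1.2    # estimated systematic error
        u_bias          = 0.6    # uncertainty of that bias estimate

        # Componentwise (root-sum-square) combination of the uncertainty components
        u_combined = math.sqrt(u_repeatability**2 + u_calibration**2 + u_bias**2)
        U_expanded = 2.0 * u_combined          # coverage factor k = 2 (~95 % for a normal model)

        print(f"combined standard uncertainty: {u_combined:.2f}")
        print(f"expanded uncertainty (k=2):    {U_expanded:.2f}")
        print(f"uncorrected bias to report alongside the result: {bias:.2f}")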

  11. Understanding uncertainty

    CERN Document Server

    Lindley, Dennis V

    2013-01-01

    Praise for the First Edition ""...a reference for everyone who is interested in knowing and handling uncertainty.""-Journal of Applied Statistics The critically acclaimed First Edition of Understanding Uncertainty provided a study of uncertainty addressed to scholars in all fields, showing that uncertainty could be measured by probability, and that probability obeyed three basic rules that enabled uncertainty to be handled sensibly in everyday life. These ideas were extended to embrace the scientific method and to show how decisions, containing an uncertain element, could be rationally made.

  12. Estimation and Uncertainty Analysis of Flammability Properties for Computer-aided molecular design of working fluids for thermodynamic cycles

    DEFF Research Database (Denmark)

    Frutiger, Jerome; Abildskov, Jens; Sin, Gürkan

    Computer Aided Molecular Design (CAMD) is an important tool to generate, test and evaluate promising chemical products. CAMD can be used in thermodynamic cycles for the design of pure-component or mixture working fluids in order to improve the heat transfer capacity of the system. The safety…

  13. Selection of low activation materials for fusion power plants using ACAB system: the effect of computational methods and cross section uncertainties on waste management assessment

    Energy Technology Data Exchange (ETDEWEB)

    Alonso, M.; Sanz, J.; Rodriguez, A.; Falquina, R. [Universidad Nacional de Educacion a Distancia (UNED), Dept. of Power Engineering, Madrid (Spain); Cabellos, O.; Sanz, J. [Universidad Politecnica de Madrid, Instituto de Fusion Nuclear (UPM) (Spain)

    2003-07-01

    The feasibility of nuclear fusion as a realistic option for energy generation depends on its radioactive waste management assessment. In this respect, the production of high level waste is to be avoided and the reduction of low level waste volumes is to be enhanced. Three different waste management options are commonly regarded in fusion plants: Hands-on Recycling, Remote Recycling and Shallow Land Burial (SLB). Therefore, important research work has been undertaken to find low activation structural materials. In performing this task, a major issue is to compute the concentration limits (CLs) for all natural elements, which will be used to select the intended constituent elements of a particular Low Activation Material (LAM) and assess how much the impurities can deteriorate the waste management properties. Nevertheless, the reliable computation of CLs depends on the accuracy of nuclear data (mainly activation cross-sections) and the suitability of the computational method both for inertial and magnetic fusion environments. In this paper the importance of nuclear data uncertainties and mathematical algorithms used in different activation calculations for waste management purposes will be studied. Our work is centred on the study of ¹⁸⁶W activation under first structural wall conditions of Hylife-II inertial fusion reactor design. The importance of the dominant transmutation/decay sequence has been documented in several publications. From a practical point of view, W is used in low activation materials for fusion applications: Cr-W ferritic/martensitic steels, and the need to better compute its activation has been assessed, in particular in relation to the cross-section uncertainties for reactions leading to Ir isotopes. ¹⁹²ⁿIr and ¹⁹²Ir reach a secular equilibrium, and ¹⁹²ⁿIr is the critical one for waste management, with a half life of 241 years. From a theoretical point of view, this is one of the most complex chains appearing in

  14. Computing the Risk of Postprandial Hypo- and Hyperglycemia in Type 1 Diabetes Mellitus Considering Intrapatient Variability and Other Sources of Uncertainty

    Science.gov (United States)

    García-Jaramillo, Maira; Calm, Remei; Bondia, Jorge; Tarín, Cristina; Vehí, Josep

    2009-01-01

    Objective The objective of this article was to develop a methodology to quantify the risk of suffering different grades of hypo- and hyperglycemia episodes in the postprandial state. Methods Interval predictions of patient postprandial glucose were performed during a 5-hour period after a meal for a set of 3315 scenarios. Uncertainty in the patient's insulin sensitivities and carbohydrate (CHO) contents of the planned meal was considered. A normalized area under the curve of the worst-case predicted glucose excursion for severe and mild hypo- and hyperglycemia glucose ranges was obtained and weighted according to their importance. As a result, a comprehensive risk measure was obtained. A reference model of preprandial glucose values representing the behavior in different ranges was chosen by a χ² test. The relationship between the computed risk index and the probability of occurrence of events was analyzed for these reference models through 19,500 Monte Carlo simulations. Results The obtained reference models for each preprandial glucose range were 100, 160, and 220 mg/dl. A relationship between the risk index ranges 120 and the probability of occurrence of mild and severe postprandial hyper- and hypoglycemia can be derived. Conclusions When intrapatient variability and uncertainty in the CHO content of the meal are considered, a safer prediction of possible hyper- and hypoglycemia episodes induced by the tested insulin therapy can be calculated. PMID:20144339
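
    The risk index described above, a weighted and normalized area under the worst-case predicted glucose excursion within each glycaemic band, can be sketched roughly as follows; the band limits, weights and the synthetic glucose trace are hypothetical stand-ins, not the authors' values.

        import numpy as np

        # Hypothetical worst-case postprandial glucose prediction over 5 h (mg/dl, 5-min grid)
        t = np.arange(0, 300 + 1, 5)
        glucose = 160 + 80 * np.exp(-((t - 90) / 60.0) ** 2)  # synthetic excursion peaking near 240

        # Illustrative glycaemic bands (lo, hi), weights and direction of penetration
        bands = [
            ("severe_hypo",  0,   50,  4.0, "below"),
            ("mild_hypo",    50,  70,  2.0, "below"),
            ("mild_hyper",   180, 250, 1.0, "above"),
            ("severe_hyper", 250, 400, 3.0, "above"),
        ]

        risk = 0.0
        for name, lo, hi, weight, side in bands:
            clipped = np.clip(glucose, lo, hi)
            depth = (clipped - lo) if side == "above" else (hi - clipped)  # penetration into the band
            area = np.trapz(depth, t)                                      # (mg/dl) * min
            risk += weight * area / (t[-1] * (hi - lo))                    # normalize per band

        print(f"composite risk index: {risk:.3f}")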

  15. Uncertainty in Environmental Economics

    OpenAIRE

    Robert S. Pindyck

    2006-01-01

    In a world of certainty, the design of environmental policy is relatively straightforward, and boils down to maximizing the present value of the flow of social benefits minus costs. But the real world is one of considerable uncertainty -- over the physical and ecological impact of pollution, over the economic costs and benefits of reducing it, and over the discount rates that should be used to compute present values. The implications of uncertainty are complicated by the fact that most enviro...

  16. Minimising uncertainty induced by temperature extrapolations of thermodynamic data: a pragmatic view on the integration of thermodynamic databases into geochemical computer codes

    International Nuclear Information System (INIS)

    Incorporation of temperature corrections is gaining priority regarding geochemical modelling computer codes with built-in thermodynamic databases related to performance assessment in nuclear waste management. As no experimental data at elevated temperatures are available e.g. for many actinide and lanthanide species, the simplest one-term extrapolations of equilibrium constants are usually assumed in practice. Such extrapolations, if set inappropriately, may accumulate large additional uncertainty at temperatures above 100 deg C. Such errors can be avoided because one-, two- and three-term extrapolations have great predictive potential for isoelectric/iso-coulombic reactions, a potential that should be explored and used extensively in geochemical modelling by LMA and/or GEM algorithms. This can be done efficiently and consistently via implementing a built-in 'hybrid' database combining 'kernel' thermochemical/EoS data for substances with the 'extension' reaction-defined data for other species. (author)
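
    A minimal sketch of the one-term extrapolation mentioned above, i.e. a van't Hoff-type correction assuming a temperature-independent reaction enthalpy; the reaction constants used are placeholders, not data from the paper.

        import math

        R = 8.314462618  # J mol-1 K-1

        def logK_one_term(logK_ref, delta_H_ref, T_kelvin, T_ref=298.15):
            """One-term extrapolation: assumes the reaction enthalpy is constant with temperature.

            log K(T) = log K(T_ref) - (delta_H / (R * ln 10)) * (1/T - 1/T_ref)
            """
            return logK_ref - (delta_H_ref / (R * math.log(10.0))) * (1.0 / T_kelvin - 1.0 / T_ref)

        # Placeholder reaction: log K(25 C) = -5.0, delta_rH(25 C) = +40 kJ/mol
        for T_C in (25, 60, 100, 150):
            T = T_C + 273.15
            print(f"{T_C:4d} C  log K = {logK_one_term(-5.0, 40e3, T):7.3f}")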

  17. A research on the verification of models used in the computational codes and the uncertainty reduction method for the containment integrity evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Moo Hwan; Seo, Kyoung Woo [POSTECH, Pohang (Korea, Republic of)

    2001-03-15

    In the probability approach, the calculated CCFPs of all the scenarios were zero, which meant that it was expected that for all the accident scenarios the maximum pressure load induced by DCH was lower than the containment failure pressure obtained from the fragility curve. Thus, it can be stated that the KSNP containment is robust to the DCH threat. The uncertainty of the computer codes used in the two (deterministic and probabilistic) approaches was reduced through sensitivity tests and through the verification and comparison of the DCH models in each code. Overall, this research evaluated the DCH issue comprehensively and established an accurate methodology to assess the containment integrity of operating PWRs in Korea.

  18. Measurement Uncertainty

    Science.gov (United States)

    Koch, Michael

    Measurement uncertainty is one of the key issues in quality assurance. It became increasingly important for analytical chemistry laboratories with the accreditation to ISO/IEC 17025. The uncertainty of a measurement is the most important criterion for the decision whether a measurement result is fit for purpose. It also delivers help for the decision whether a specification limit is exceeded or not. Estimation of measurement uncertainty often is not trivial. Several strategies have been developed for this purpose that will shortly be described in this chapter. In addition the different possibilities to take into account the uncertainty in compliance assessment are explained.

  19. A new surrogate modeling technique combining Kriging and polynomial chaos expansions – Application to uncertainty analysis in computational dosimetry

    Energy Technology Data Exchange (ETDEWEB)

    Kersaudy, Pierric, E-mail: pierric.kersaudy@orange.com [Orange Labs, 38 avenue du Général Leclerc, 92130 Issy-les-Moulineaux (France); Whist Lab, 38 avenue du Général Leclerc, 92130 Issy-les-Moulineaux (France); ESYCOM, Université Paris-Est Marne-la-Vallée, 5 boulevard Descartes, 77700 Marne-la-Vallée (France); Sudret, Bruno [ETH Zürich, Chair of Risk, Safety and Uncertainty Quantification, Stefano-Franscini-Platz 5, 8093 Zürich (Switzerland); Varsier, Nadège [Orange Labs, 38 avenue du Général Leclerc, 92130 Issy-les-Moulineaux (France); Whist Lab, 38 avenue du Général Leclerc, 92130 Issy-les-Moulineaux (France); Picon, Odile [ESYCOM, Université Paris-Est Marne-la-Vallée, 5 boulevard Descartes, 77700 Marne-la-Vallée (France); Wiart, Joe [Orange Labs, 38 avenue du Général Leclerc, 92130 Issy-les-Moulineaux (France); Whist Lab, 38 avenue du Général Leclerc, 92130 Issy-les-Moulineaux (France)

    2015-04-01

    In numerical dosimetry, the recent advances in high performance computing led to a strong reduction of the required computational time to assess the specific absorption rate (SAR) characterizing the human exposure to electromagnetic waves. However, this procedure remains time-consuming and a single simulation can request several hours. As a consequence, the influence of uncertain input parameters on the SAR cannot be analyzed using crude Monte Carlo simulation. The solution presented here to perform such an analysis is surrogate modeling. This paper proposes a novel approach to build such a surrogate model from a design of experiments. Considering a sparse representation of the polynomial chaos expansions using least-angle regression as a selection algorithm to retain the most influential polynomials, this paper proposes to use the selected polynomials as regression functions for the universal Kriging model. The leave-one-out cross validation is used to select the optimal number of polynomials in the deterministic part of the Kriging model. The proposed approach, called LARS-Kriging-PC modeling, is applied to three benchmark examples and then to a full-scale metamodeling problem involving the exposure of a numerical fetus model to a femtocell device. The performances of the LARS-Kriging-PC are compared to an ordinary Kriging model and to a classical sparse polynomial chaos expansion. The LARS-Kriging-PC appears to have better performances than the two other approaches. A significant accuracy improvement is observed compared to the ordinary Kriging or to the sparse polynomial chaos depending on the studied case. This approach seems to be an optimal solution between the two other classical approaches. A global sensitivity analysis is finally performed on the LARS-Kriging-PC model of the fetus exposure problem.

  20. Conundrums with uncertainty factors.

    Science.gov (United States)

    Cooke, Roger

    2010-03-01

    The practice of uncertainty factors as applied to noncancer endpoints in the IRIS database harkens back to traditional safety factors. In the era before risk quantification, these were used to build in a "margin of safety." As risk quantification takes hold, the safety factor methods yield to quantitative risk calculations to guarantee safety. Many authors believe that uncertainty factors can be given a probabilistic interpretation as ratios of response rates, and that the reference values computed according to the IRIS methodology can thus be converted to random variables whose distributions can be computed with Monte Carlo methods, based on the distributions of the uncertainty factors. Recent proposals from the National Research Council echo this view. Based on probabilistic arguments, several authors claim that the current practice of uncertainty factors is overprotective. When interpreted probabilistically, uncertainty factors entail very strong assumptions on the underlying response rates. For example, the factor for extrapolating from animal to human is the same whether the dosage is chronic or subchronic. Together with independence assumptions, these assumptions entail that the covariance matrix of the logged response rates is singular. In other words, the accumulated assumptions entail a log-linear dependence between the response rates. This in turn means that any uncertainty analysis based on these assumptions is ill-conditioned; it effectively computes uncertainty conditional on a set of probability zero. The practice of uncertainty factors is due for a thorough review. Two directions are briefly sketched, one based on standard regression models, and one based on nonparametric continuous Bayesian belief nets. PMID:20030767
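
    The probabilistic reading of uncertainty factors discussed above treats the reference value as a point of departure divided by a product of random factors; the lognormal parameters below are purely illustrative, and the sketch inherits exactly the independence assumptions the article calls into question.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 200_000

        point_of_departure = 10.0  # mg/kg-day, hypothetical NOAEL-type value

        # Illustrative lognormal uncertainty factors (geometric mean ~ 3, spread chosen arbitrarily)
        uf_animal_to_human = rng.lognormal(mean=np.log(3.0), sigma=0.4, size=n)
        uf_human_variability = rng.lognormal(mean=np.log(3.0), sigma=0.4, size=n)
        uf_subchronic_to_chronic = rng.lognormal(mean=np.log(3.0), sigma=0.4, size=n)

        # Reference value as point of departure divided by the product of independent factors
        reference_value = point_of_departure / (
            uf_animal_to_human * uf_human_variability * uf_subchronic_to_chronic
        )

        print(f"median reference value: {np.median(reference_value):.3f} mg/kg-day")
        print(f"5th percentile:         {np.percentile(reference_value, 5):.3f} mg/kg-day")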

  1. PIV uncertainty propagation

    Science.gov (United States)

    Sciacchitano, Andrea; Wieneke, Bernhard

    2016-08-01

    This paper discusses the propagation of the instantaneous uncertainty of PIV measurements to statistical and instantaneous quantities of interest derived from the velocity field. The expression of the uncertainty of vorticity, velocity divergence, mean value and Reynolds stresses is derived. It is shown that the uncertainty of vorticity and velocity divergence requires the knowledge of the spatial correlation between the error of the x and y particle image displacement, which depends upon the measurement spatial resolution. The uncertainty of statistical quantities is often dominated by the random uncertainty due to the finite sample size and decreases with the square root of the effective number of independent samples. Monte Carlo simulations are conducted to assess the accuracy of the uncertainty propagation formulae. Furthermore, three experimental assessments are carried out. In the first experiment, a turntable is used to simulate a rigid rotation flow field. The estimated uncertainty of the vorticity is compared with the actual vorticity error root-mean-square, with differences between the two quantities within 5-10% for different interrogation window sizes and overlap factors. A turbulent jet flow is investigated in the second experimental assessment. The reference velocity, which is used to compute the reference value of the instantaneous flow properties of interest, is obtained with an auxiliary PIV system, which features a higher dynamic range than the measurement system. Finally, the uncertainty quantification of statistical quantities is assessed via PIV measurements in a cavity flow. The comparison between estimated uncertainty and actual error demonstrates the accuracy of the proposed uncertainty propagation methodology.
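
    A simplified sketch of the propagation ideas above: uncertainty of the mean velocity from the finite sample size, and vorticity uncertainty from central differences assuming uncorrelated displacement errors (the paper shows the spatial correlation term must be added in practice); all fields and uncertainty values here are synthetic.

        import numpy as np

        rng = np.random.default_rng(3)

        # Synthetic instantaneous velocity fields (n_samples, ny, nx); values are invented
        ny, nx, n_samples = 32, 32, 400
        dx = dy = 1.0e-3                              # vector spacing in metres
        u = rng.normal(2.0, 0.10, (n_samples, ny, nx))
        v = rng.normal(0.0, 0.10, (n_samples, ny, nx))
        U_u = np.full((ny, nx), 0.05)                 # instantaneous uncertainty of u (m/s)
        U_v = np.full((ny, nx), 0.05)                 # instantaneous uncertainty of v (m/s)

        # Uncertainty of the mean velocity: random part scaling with 1/sqrt(N_eff).
        # Here all samples are treated as independent (N_eff = N), which is a simplification.
        N_eff = n_samples
        U_mean_u = u.std(axis=0, ddof=1) / np.sqrt(N_eff)

        # Vorticity from the mean field by central differences, with uncertainty propagated
        # under the assumption of uncorrelated errors between neighbouring vectors.
        u_mean, v_mean = u.mean(axis=0), v.mean(axis=0)
        vorticity = np.gradient(v_mean, dx, axis=1) - np.gradient(u_mean, dy, axis=0)
        U_vort = np.sqrt(
            (U_v[1:-1, 2:] ** 2 + U_v[1:-1, :-2] ** 2) / (2.0 * dx) ** 2
            + (U_u[2:, 1:-1] ** 2 + U_u[:-2, 1:-1] ** 2) / (2.0 * dy) ** 2
        )

        print(f"typical uncertainty of mean u: {U_mean_u.mean():.4f} m/s")
        print(f"typical vorticity uncertainty (interior points): {U_vort.mean():.2f} 1/s")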

  2. Introduction to uncertainty quantification

    CERN Document Server

    Sullivan, T J

    2015-01-01

    Uncertainty quantification is a topic of increasing practical importance at the intersection of applied mathematics, statistics, computation, and numerous application areas in science and engineering. This text provides a framework in which the main objectives of the field of uncertainty quantification are defined, and an overview of the range of mathematical methods by which they can be achieved. Complete with exercises throughout, the book will equip readers with both theoretical understanding and practical experience of the key mathematical and algorithmic tools underlying the treatment of uncertainty in modern applied mathematics. Students and readers alike are encouraged to apply the mathematical methods discussed in this book to their own favourite problems to understand their strengths and weaknesses, also making the text suitable as a self-study. This text is designed as an introduction to uncertainty quantification for senior undergraduate and graduate students with a mathematical or statistical back...

  3. Calibration uncertainty

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Anglov, Thomas

    2002-01-01

    Methods recommended by the International Standardization Organisation and Eurachem are not satisfactory for the correct estimation of calibration uncertainty. A novel approach is introduced and tested on actual calibration data for the determination of Pb by ICP-AES. The improved calibration uncertainty was verified from independent measurements of the same sample by demonstrating statistical control of analytical results and the absence of bias. The proposed method takes into account uncertainties of the measurement, as well as of the amount of calibrant. It is applicable to all types…

  4. Uncertainty theory

    CERN Document Server

    Liu, Baoding

    2015-01-01

    When no samples are available to estimate a probability distribution, we have to invite some domain experts to evaluate the belief degree that each event will happen. Perhaps some people think that the belief degree should be modeled by subjective probability or fuzzy set theory. However, it is usually inappropriate because both of them may lead to counterintuitive results in this case. In order to rationally deal with belief degrees, uncertainty theory was founded in 2007 and subsequently studied by many researchers. Nowadays, uncertainty theory has become a branch of axiomatic mathematics for modeling belief degrees. This is an introductory textbook on uncertainty theory, uncertain programming, uncertain statistics, uncertain risk analysis, uncertain reliability analysis, uncertain set, uncertain logic, uncertain inference, uncertain process, uncertain calculus, and uncertain differential equation. This textbook also shows applications of uncertainty theory to scheduling, logistics, networks, data mining, c...

  5. Sensitivity and uncertainty analysis

    CERN Document Server

    Cacuci, Dan G; Navon, Ionel Michael

    2005-01-01

    As computer-assisted modeling and analysis of physical processes have continued to grow and diversify, sensitivity and uncertainty analyses have become indispensable scientific tools. Sensitivity and Uncertainty Analysis. Volume I: Theory focused on the mathematical underpinnings of two important methods for such analyses: the Adjoint Sensitivity Analysis Procedure and the Global Adjoint Sensitivity Analysis Procedure. This volume concentrates on the practical aspects of performing these analyses for large-scale systems. The applications addressed include two-phase flow problems, a radiative c

  6. Teaching Uncertainties

    Science.gov (United States)

    Duerdoth, Ian

    2009-01-01

    The subject of uncertainties (sometimes called errors) is traditionally taught (to first-year science undergraduates) towards the end of a course on statistics that defines probability as the limit of many trials, and discusses probability distribution functions and the Gaussian distribution. We show how to introduce students to the concepts of…

  7. Calibration Under Uncertainty.

    Energy Technology Data Exchange (ETDEWEB)

    Swiler, Laura Painton; Trucano, Timothy Guy

    2005-03-01

    This report is a white paper summarizing the literature and different approaches to the problem of calibrating computer model parameters in the face of model uncertainty. Model calibration is often formulated as finding the parameters that minimize the squared difference between the model-computed data (the predicted data) and the actual experimental data. This approach does not allow for explicit treatment of uncertainty or error in the model itself: the model is considered the "true" deterministic representation of reality. While this approach does have utility, it is far from an accurate mathematical treatment of the true model calibration problem in which both the computed data and experimental data have error bars. This year, we examined methods to perform calibration accounting for the error in both the computer model and the data, as well as improving our understanding of its meaning for model predictability. We call this approach Calibration under Uncertainty (CUU). This talk presents our current thinking on CUU. We outline some current approaches in the literature, and discuss the Bayesian approach to CUU in detail.
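
    A loose sketch contrasting plain least-squares calibration with a calibration-under-uncertainty flavoured variant in which an assumed model-form error widens the likelihood; the one-parameter simulator, the data and both noise levels are invented for illustration and are not from the report.

        import numpy as np

        rng = np.random.default_rng(7)

        def simulator(theta, x):
            """Hypothetical computer model with one calibration parameter theta."""
            return theta * x ** 2

        # Synthetic experimental data with known noise
        x_obs = np.linspace(0.0, 1.0, 15)
        theta_true = 2.5
        sigma_exp = 0.05                           # experimental error bar
        y_obs = simulator(theta_true, x_obs) + rng.normal(0.0, sigma_exp, x_obs.size)

        # Deterministic calibration: minimize the squared misfit (no treatment of model error)
        thetas = np.linspace(1.0, 4.0, 2001)
        sse = np.array([np.sum((y_obs - simulator(t, x_obs)) ** 2) for t in thetas])
        theta_ls = thetas[np.argmin(sse)]

        # Calibration under uncertainty: Gaussian likelihood whose variance combines experimental
        # noise and an assumed model-form error, giving a posterior for theta (flat prior on grid)
        sigma_model = 0.03                         # assumed, not fitted, model inadequacy
        var_total = sigma_exp ** 2 + sigma_model ** 2
        log_post = -0.5 * sse / var_total
        post = np.exp(log_post - log_post.max())
        post /= np.trapz(post, thetas)

        mean_theta = np.trapz(thetas * post, thetas)
        std_theta = np.sqrt(np.trapz((thetas - mean_theta) ** 2 * post, thetas))
        print(f"least-squares theta: {theta_ls:.3f}")
        print(f"posterior theta:     {mean_theta:.3f} +/- {std_theta:.3f}")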

  8. Demand Uncertainty

    DEFF Research Database (Denmark)

    Nguyen, Daniel Xuyen

    This paper presents a model of trade that explains why firms wait to export and why many exporters fail. Firms face uncertain demands that are only realized after the firm enters the destination. The model retools the timing of uncertainty resolution found in productivity heterogeneity models...... the high rate of exit seen in the first years of exporting. Finally, when faced with multiple countries in which to export, some firms will choose to sequentially export in order to slowly learn more about their chances for success in untested markets.

  9. Orbital State Uncertainty Realism

    Science.gov (United States)

    Horwood, J.; Poore, A. B.

    2012-09-01

    Fundamental to the success of the space situational awareness (SSA) mission is the rigorous inclusion of uncertainty in the space surveillance network. The *proper characterization of uncertainty* in the orbital state of a space object is a common requirement to many SSA functions including tracking and data association, resolution of uncorrelated tracks (UCTs), conjunction analysis and probability of collision, sensor resource management, and anomaly detection. While tracking environments, such as air and missile defense, make extensive use of Gaussian and local linearity assumptions within algorithms for uncertainty management, space surveillance is inherently different due to long time gaps between updates, high misdetection rates, nonlinear and non-conservative dynamics, and non-Gaussian phenomena. The latter implies that "covariance realism" is not always sufficient. SSA also requires "uncertainty realism"; the proper characterization of both the state and covariance and all non-zero higher-order cumulants. In other words, a proper characterization of a space object's full state *probability density function (PDF)* is required. In order to provide a more statistically rigorous treatment of uncertainty in the space surveillance tracking environment and to better support the aforementioned SSA functions, a new class of multivariate PDFs are formulated which more accurately characterize the uncertainty of a space object's state or orbit. The new distribution contains a parameter set controlling the higher-order cumulants which gives the level sets a distinctive "banana" or "boomerang" shape and degenerates to a Gaussian in a suitable limit. Using the new class of PDFs within the general Bayesian nonlinear filter, the resulting filter prediction step (i.e., uncertainty propagation) is shown to have the *same computational cost as the traditional unscented Kalman filter* with the former able to maintain a proper characterization of the uncertainty for up to *ten

  10. A monomial chaos approach for efficient uncertainty quantification on nonlinear problems

    NARCIS (Netherlands)

    Witteveen, J.A.S.; Bijl, H.

    2008-01-01

    A monomial chaos approach is presented for efficient uncertainty quantification in nonlinear computational problems. Propagating uncertainty through nonlinear equations can be computationally intensive for existing uncertainty quantification methods. It usually results in a set of nonlinear equation

  11. Uncertainty-induced quantum nonlocality

    Science.gov (United States)

    Wu, Shao-xiong; Zhang, Jun; Yu, Chang-shui; Song, He-shan

    2014-01-01

    Based on the skew information, we present a quantity, uncertainty-induced quantum nonlocality (UIN) to measure the quantum correlation. It can be considered as the updated version of the original measurement-induced nonlocality (MIN) preserving the good computability but eliminating the non-contractivity problem. For 2×d-dimensional state, it is shown that UIN can be given by a closed form. In addition, we also investigate the maximal uncertainty-induced nonlocality.

  12. An assessment of uncertainties in using volume-area modelling for computing the twenty-first century glacier contribution to sea-level change

    NARCIS (Netherlands)

    Slangen, A.B.A.; van de Wal, R.S.W.

    2011-01-01

    A large part of present-day sea-level change is formed by the melt of glaciers and ice caps (GIC). This study focuses on the uncertainties in the calculation of the GIC contribution on a century timescale. The model used is based on volume-area scaling, combined with the mass balance sensitivity o
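
    Volume-area scaling has the generic form V = c A^gamma; the sketch below turns scaled volumes into a sea-level equivalent. The exponent gamma = 1.375 is a commonly quoted value for glaciers, but the prefactor c and the glacier areas are placeholders, so the numbers are purely illustrative and not results from the paper.

        # Minimal volume-area scaling sketch: V = c * A**gamma, then a sea-level equivalent.
        OCEAN_AREA_M2 = 3.62e14      # m^2
        RHO_ICE = 900.0              # kg m^-3
        RHO_WATER = 1000.0           # kg m^-3

        def glacier_volume_m3(area_m2, c=0.034, gamma=1.375):
            """Volume-area scaling with area in km^2 internally; c is a placeholder constant."""
            area_km2 = area_m2 / 1e6
            volume_km3 = c * area_km2 ** gamma
            return volume_km3 * 1e9

        areas_m2 = [5e6, 2e7, 1.5e8]                     # three hypothetical glaciers
        total_volume = sum(glacier_volume_m3(a) for a in areas_m2)

        # Sea-level equivalent if the ice were fully melted and spread over the ocean area
        sle_m = total_volume * RHO_ICE / RHO_WATER / OCEAN_AREA_M2
        print(f"total ice volume: {total_volume:.3e} m^3")
        print(f"sea-level equivalent: {sle_m * 1000:.4f} mm")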

  13. Uncertainty in Air Quality Modeling.

    Science.gov (United States)

    Fox, Douglas G.

    1984-01-01

    Under the direction of the AMS Steering Committee for the EPA Cooperative Agreement on Air Quality Modeling, a small group of scientists convened to consider the question of uncertainty in air quality modeling. Because the group was particularly concerned with the regulatory use of models, its discussion focused on modeling tall stack, point source emissions. The group agreed that air quality model results should be viewed as containing both reducible error and inherent uncertainty. Reducible error results from improper or inadequate meteorological and air quality data inputs, and from inadequacies in the models. Inherent uncertainty results from the basic stochastic nature of the turbulent atmospheric motions that are responsible for transport and diffusion of released materials. Modelers should acknowledge that all their predictions to date contain some associated uncertainty and strive also to quantify uncertainty. How can the uncertainty be quantified? There was no consensus from the group as to precisely how uncertainty should be calculated. One subgroup, which addressed statistical procedures, suggested that uncertainty information could be obtained from comparisons of observations and predictions. Following recommendations from a previous AMS workshop on performance evaluation (Fox, 1981), the subgroup suggested construction of probability distribution functions from the differences between observations and predictions. Further, they recommended that relatively new computer-intensive statistical procedures be considered to improve the quality of uncertainty estimates for the extreme value statistics of interest in regulatory applications. A second subgroup, which addressed the basic nature of uncertainty in a stochastic system, also recommended that uncertainty be quantified by consideration of the differences between observations and predictions. They suggested that the average of the difference squared was appropriate to isolate the inherent uncertainty that

  14. Deterministic uncertainty analysis

    International Nuclear Information System (INIS)

    This paper presents a deterministic uncertainty analysis (DUA) method for calculating uncertainties that has the potential to significantly reduce the number of computer runs compared to conventional statistical analysis. The method is based upon the availability of derivative and sensitivity data such as that calculated using the well known direct or adjoint sensitivity analysis techniques. Formation of response surfaces using derivative data and the propagation of input probability distributions are discussed relative to their role in the DUA method. A sample problem that models the flow of water through a borehole is used as a basis to compare the cumulative distribution function of the flow rate as calculated by the standard statistical methods and the DUA method. Propagation of uncertainties by the DUA method is compared for ten cases in which the number of reference model runs was varied from one to ten. The DUA method gives a more accurate representation of the true cumulative distribution of the flow rate based upon as few as two model executions compared to fifty model executions using a statistical approach. 16 refs., 4 figs., 5 tabs
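
    A loose sketch of the derivative-based idea: estimate sensitivities at a reference point, build a first-order response surface, and propagate assumed input distributions through it, with a brute-force Monte Carlo run for comparison; the simplified flow model below is a stand-in, not the report's borehole problem.

        import numpy as np

        rng = np.random.default_rng(11)

        def model(x):
            """Simplified stand-in for a flow-rate model, x = (k, dp, L)."""
            k, dp, L = x
            return k * dp / L

        x_ref = np.array([1.0e-5, 2.0e5, 100.0])           # reference point
        sigmas = np.array([1.0e-6, 2.0e4, 5.0])            # assumed input standard deviations

        # Sensitivities at the reference point via central finite differences
        grad = np.empty(3)
        for i in range(3):
            step = 1e-4 * x_ref[i]
            xp, xm = x_ref.copy(), x_ref.copy()
            xp[i] += step
            xm[i] -= step
            grad[i] = (model(xp) - model(xm)) / (2 * step)

        # Deterministic (first-order) propagation using the derivative-based response surface
        y_ref = model(x_ref)
        var_first_order = np.sum((grad * sigmas) ** 2)
        print(f"first-order: mean={y_ref:.4e}, std={np.sqrt(var_first_order):.4e}")

        # Reference Monte Carlo propagation for comparison
        samples = rng.normal(x_ref, sigmas, size=(100_000, 3))
        y_mc = samples[:, 0] * samples[:, 1] / samples[:, 2]
        print(f"Monte Carlo: mean={y_mc.mean():.4e}, std={y_mc.std(ddof=1):.4e}")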

  15. Computational strategy for uncertainty importance measure ranking based on norm

    Institute of Scientific and Technical Information of China (English)

    许鑫; 吕震宙; 罗晓鹏

    2011-01-01

    The probability density function integral in Borgonovo's input uncertainty importance measure is hard to calculate. Based on the definition of the input uncertainty importance measure, the concept of norm is introduced into uncertainty importance ranking analysis for the first time, and a new importance ranking computational strategy is developed, for which several equivalent norms are selected. This strategy replaces the integral by an equivalent norm and, at the same time, introduces a regularized computing method for the uncertainty importance measure, which is more widely applicable. Theoretically, this strategy can provide many kinds of equivalent norms for estimating the uncertainty importance ranking. Comparisons of the present work with the Borgonovo method and the Liu method show that the new method is the simplest of the three. Finally, two examples illustrate the feasibility of the present work.
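
    The replacement of the density-difference integral by an equivalent norm can be illustrated roughly as follows: estimate the unconditional and conditional output densities from samples and compare them with an L1 norm averaged over slices of each input; the toy model, binning and slicing scheme are choices made for this example, not the authors' estimator.

        import numpy as np

        rng = np.random.default_rng(5)
        n = 200_000

        # Toy model with two inputs of clearly different influence
        x1 = rng.normal(0.0, 1.0, n)
        x2 = rng.normal(0.0, 1.0, n)
        y = 4.0 * x1 + 1.0 * x2

        bins = np.linspace(y.min(), y.max(), 80)

        def density(values):
            hist, edges = np.histogram(values, bins=bins, density=True)
            return hist, 0.5 * (edges[:-1] + edges[1:])

        f_y, centers = density(y)

        def importance(x, n_slices=20):
            """Average L1 norm between the unconditional density and slice-conditioned densities."""
            edges = np.quantile(x, np.linspace(0, 1, n_slices + 1))
            total = 0.0
            for lo, hi in zip(edges[:-1], edges[1:]):
                mask = (x >= lo) & (x < hi)
                f_cond, _ = density(y[mask])
                total += np.trapz(np.abs(f_y - f_cond), centers)
            return 0.5 * total / n_slices        # 0.5 gives a Borgonovo-style shift measure

        print(f"importance of x1: {importance(x1):.3f}")
        print(f"importance of x2: {importance(x2):.3f}")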

  16. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  17. Uncertainty quantification and error analysis

    Energy Technology Data Exchange (ETDEWEB)

    Higdon, Dave M [Los Alamos National Laboratory; Anderson, Mark C [Los Alamos National Laboratory; Habib, Salman [Los Alamos National Laboratory; Klein, Richard [Los Alamos National Laboratory; Berliner, Mark [OHIO STATE UNIV.; Covey, Curt [LLNL; Ghattas, Omar [UNIV OF TEXAS; Graziani, Carlo [UNIV OF CHICAGO; Seager, Mark [LLNL; Sefcik, Joseph [LLNL; Stark, Philip [UC/BERKELEY; Stewart, James [SNL

    2010-01-01

    UQ studies all sources of error and uncertainty, including: systematic and stochastic measurement error; ignorance; limitations of theoretical models; limitations of numerical representations of those models; limitations on the accuracy and reliability of computations, approximations, and algorithms; and human error. A more precise definition for UQ is suggested below.

  18. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity for discussing the impact of, and for addressing issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  19. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  20. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  1. Generalized Uncertainty Principle and Angular Momentum

    CERN Document Server

    Bosso, Pasquale

    2016-01-01

    Various models of quantum gravity suggest a modification of Heisenberg's Uncertainty Principle between position and momentum to the so-called Generalized Uncertainty Principle. In this work we show how this modification influences the theory of angular momentum in Quantum Mechanics. In particular, we compute Planck scale corrections to angular momentum eigenvalues, the Hydrogen atom spectrum, the Stern-Gerlach experiment and the Clebsch-Gordan coefficients. We also examine effects of the Generalized Uncertainty Principle on multi-particle systems.

  2. Fuzzy Uncertainty Evaluation for Fault Tree Analysis

    International Nuclear Information System (INIS)

    This traditional probabilistic approach can calculate relatively accurate results. However, it requires a long time because of repetitive computation due to the MC method. In addition, when informative data for statistical analysis are not sufficient or some events are mainly caused by human error, the probabilistic approach may not be possible because uncertainties of these events are difficult to be expressed by probabilistic distributions. In order to reduce the computation time and quantify uncertainties of top events when basic events whose uncertainties are difficult to be expressed by probabilistic distributions exist, the fuzzy uncertainty propagation based on fuzzy set theory can be applied. In this paper, we develop a fuzzy uncertainty propagation code and apply it to the fault tree of the core damage accident after the large loss of coolant accident (LLOCA). The fuzzy uncertainty propagation code is implemented and tested for the fault tree of the radiation release accident. We apply this code to the fault tree of the core damage accident after the LLOCA in three cases and compare the results with those computed by the probabilistic uncertainty propagation using the MC method. The results obtained by the fuzzy uncertainty propagation can be calculated in a relatively short time, covering the results obtained by the probabilistic uncertainty propagation

  3. Fuzzy Uncertainty Evaluation for Fault Tree Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ki Beom; Shim, Hyung Jin [Seoul National University, Seoul (Korea, Republic of); Jae, Moo Sung [Hanyang University, Seoul (Korea, Republic of)

    2015-05-15

    This traditional probabilistic approach can calculate relatively accurate results. However, it requires a long time because of repetitive computation due to the MC method. In addition, when informative data for statistical analysis are not sufficient or some events are mainly caused by human error, the probabilistic approach may not be possible because uncertainties of these events are difficult to be expressed by probabilistic distributions. In order to reduce the computation time and quantify uncertainties of top events when basic events whose uncertainties are difficult to be expressed by probabilistic distributions exist, the fuzzy uncertainty propagation based on fuzzy set theory can be applied. In this paper, we develop a fuzzy uncertainty propagation code and apply it to the fault tree of the core damage accident after the large loss of coolant accident (LLOCA). The fuzzy uncertainty propagation code is implemented and tested for the fault tree of the radiation release accident. We apply this code to the fault tree of the core damage accident after the LLOCA in three cases and compare the results with those computed by the probabilistic uncertainty propagation using the MC method. The results obtained by the fuzzy uncertainty propagation can be calculated in a relatively short time, covering the results obtained by the probabilistic uncertainty propagation.
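
    A minimal sketch of alpha-cut propagation of triangular fuzzy basic-event probabilities through AND/OR gates; the two-gate tree and the fuzzy numbers are invented and do not correspond to the LLOCA fault tree analysed in the paper.

        import numpy as np

        ALPHAS = np.linspace(0.0, 1.0, 11)

        def tri_cut(low, mode, high, alpha):
            """Alpha-cut interval of a triangular fuzzy probability."""
            return (low + alpha * (mode - low), high - alpha * (high - mode))

        def and_gate(*intervals):
            lo = np.prod([i[0] for i in intervals])
            hi = np.prod([i[1] for i in intervals])
            return (lo, hi)

        def or_gate(*intervals):
            lo = 1.0 - np.prod([1.0 - i[0] for i in intervals])
            hi = 1.0 - np.prod([1.0 - i[1] for i in intervals])
            return (lo, hi)

        # Hypothetical fuzzy basic events (low, mode, high probabilities)
        A = (1e-3, 2e-3, 4e-3)
        B = (5e-4, 1e-3, 2e-3)
        C = (1e-2, 2e-2, 3e-2)

        # Top event = (A AND B) OR C, evaluated interval-wise at each alpha level
        for alpha in ALPHAS:
            a, b, c = (tri_cut(*e, alpha) for e in (A, B, C))
            top = or_gate(and_gate(a, b), c)
            print(f"alpha={alpha:.1f}  top-event probability in [{top[0]:.3e}, {top[1]:.3e}]")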

  4. Uncertainty of empirical correlation equations

    Science.gov (United States)

    Feistel, R.; Lovell-Smith, J. W.; Saunders, P.; Seitz, S.

    2016-08-01

    The International Association for the Properties of Water and Steam (IAPWS) has published a set of empirical reference equations of state, forming the basis of the 2010 Thermodynamic Equation of Seawater (TEOS-10), from which all thermodynamic properties of seawater, ice, and humid air can be derived in a thermodynamically consistent manner. For each of the equations of state, the parameters have been found by simultaneously fitting equations for a range of different derived quantities using large sets of measurements of these quantities. In some cases, uncertainties in these fitted equations have been assigned based on the uncertainties of the measurement results. However, because uncertainties in the parameter values have not been determined, it is not possible to estimate the uncertainty in many of the useful quantities that can be calculated using the parameters. In this paper we demonstrate how the method of generalised least squares (GLS), in which the covariance of the input data is propagated into the values calculated by the fitted equation, and in particular into the covariance matrix of the fitted parameters, can be applied to one of the TEOS-10 equations of state, namely IAPWS-95 for fluid pure water. Using the calculated parameter covariance matrix, we provide some preliminary estimates of the uncertainties in derived quantities, namely the second and third virial coefficients for water. We recommend further investigation of the GLS method for use as a standard method for calculating and propagating the uncertainties of values computed from empirical equations.
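
    A compact sketch of the generalised least squares step described above: the covariance of the input data is propagated into the covariance matrix of the fitted parameters and from there into a derived quantity; the polynomial model, data covariance and derived quantity are invented, not the IAPWS-95 case.

        import numpy as np

        rng = np.random.default_rng(2)

        # Synthetic correlated measurements of y(t) = a + b*t + c*t**2
        t = np.linspace(0.0, 1.0, 25)
        X = np.vstack([np.ones_like(t), t, t ** 2]).T
        beta_true = np.array([1.0, -2.0, 0.5])

        # Assumed data covariance: correlated noise (exponential correlation between points)
        sigma = 0.05
        C = sigma ** 2 * np.exp(-np.abs(t[:, None] - t[None, :]) / 0.2)
        y = X @ beta_true + rng.multivariate_normal(np.zeros(t.size), C)

        # Generalised least squares: beta = (X^T C^-1 X)^-1 X^T C^-1 y
        Cinv = np.linalg.inv(C)
        cov_beta = np.linalg.inv(X.T @ Cinv @ X)       # parameter covariance matrix
        beta = cov_beta @ X.T @ Cinv @ y

        # Propagate parameter covariance into a derived quantity, e.g. y(0.5) = g^T beta
        g = np.array([1.0, 0.5, 0.25])
        value = g @ beta
        u_value = np.sqrt(g @ cov_beta @ g)
        print("fitted parameters:", np.round(beta, 3))
        print(f"derived quantity y(0.5) = {value:.3f} +/- {u_value:.3f}")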

  5. COMPUTING

    CERN Document Server

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources like the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February collecting almost 500 T. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  6. Computer

    CERN Document Server

    Atkinson, Paul

    2011-01-01

    The pixelated rectangle we spend most of our day staring at in silence is not the television as many long feared, but the computer-the ubiquitous portal of work and personal lives. At this point, the computer is almost so common we don't notice it in our view. It's difficult to envision that not that long ago it was a gigantic, room-sized structure only to be accessed by a few inspiring as much awe and respect as fear and mystery. Now that the machine has decreased in size and increased in popular use, the computer has become a prosaic appliance, little-more noted than a toaster. These dramati

  7. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger at roughly 11 MB per event of RAW. The central collisions are more complex and...

  8. COMPUTING

    CERN Multimedia

    M. Kasemann P. McBride Edited by M-C. Sawley with contributions from: P. Kreuzer D. Bonacorsi S. Belforte F. Wuerthwein L. Bauerdick K. Lassila-Perini M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  9. Dyke leakage localization and hydraulic permeability estimation through self-potential and hydro-acoustic measurements: Self-potential 'abacus' diagram for hydraulic permeability estimation and uncertainty computation

    Science.gov (United States)

    Bolève, A.; Vandemeulebrouck, J.; Grangeon, J.

    2012-11-01

    In the present study, we propose the combination of two geophysical techniques, which we have applied to a dyke located in southeastern France that has a visible downstream flood area: the self-potential (SP) and hydro-acoustic methods. These methods are sensitive to two different types of signals: electric signals and water-soil pressure disturbances, respectively. The advantages of the SP technique lie in the high rate of data acquisition, which allows assessment of long dykes, and direct diagnosis in terms of leakage area delimitation and quantification. Coupled with punctual hydro-acoustic cartography, a leakage position can be precisely located, therefore allowing specific remediation decisions with regard to the results of the geophysical investigation. Here, the precise localization of leakage from an earth dyke has been identified using SP and hydro-acoustic signals, with the permeability of the preferential fluid flow area estimated by forward SP modeling. Moreover, we propose a general 'abacus' diagram for the estimation of hydraulic permeability of dyke leakage according to the magnitude of over water SP anomalies and the associated uncertainty.

  10. Collaborative Project: The problem of bias in defining uncertainty in computationally enabled strategies for data-driven climate model development. Final Technical Report.

    Energy Technology Data Exchange (ETDEWEB)

    Huerta, Gabriel [Univ. of New Mexico, Albuquerque, NM (United States)

    2016-05-10

    The objective of the project is to develop strategies for better representing scientific sensibilities within statistical measures of model skill that then can be used within a Bayesian statistical framework for data-driven climate model development and improved measures of model scientific uncertainty. One of the thorny issues in model evaluation is quantifying the effect of biases on climate projections. While any bias is not desirable, only those biases that affect feedbacks affect scatter in climate projections. The effort at the University of Texas is to analyze previously calculated ensembles of CAM3.1 with perturbed parameters to discover how biases affect projections of global warming. The hypothesis is that compensating errors in the control model can be identified by their effect on a combination of processes and that developing metrics that are sensitive to dependencies among state variables would provide a way to select versions of climate models that may reduce scatter in climate projections. Gabriel Huerta at the University of New Mexico is responsible for developing statistical methods for evaluating these field dependencies. The UT effort will incorporate these developments into MECS, which is a set of python scripts being developed at the University of Texas for managing the workflow associated with data-driven climate model development over HPC resources. This report reflects the main activities at the University of New Mexico where the PI (Huerta) and the Postdocs (Nosedal, Hattab and Karki) worked on the project.

  11. The uncertainty of valuation

    OpenAIRE

    French, N.; L. Gabrielli

    2005-01-01

    Valuation is often said to be “an art not a science” but this relates to the techniques employed to calculate value not to the underlying concept itself. Valuation is the process of estimating price in the market place. Yet, such an estimation will be affected by uncertainties. Uncertainty in the comparable information available; uncertainty in the current and future market conditions and uncertainty in the specific inputs for the subject property. These input uncertainties will translate int...

  12. Modeling Model Uncertainty

    OpenAIRE

    Onatski, Alexei; Williams, Noah

    2003-01-01

    Recently there has been much interest in studying monetary policy under model uncertainty. We develop methods to analyze different sources of uncertainty in one coherent structure useful for policy decisions. We show how to estimate the size of the uncertainty based on time series data, and incorporate this uncertainty in policy optimization. We propose two different approaches to modeling model uncertainty. The first is model error modeling, which imposes additional structure on the errors o...

  13. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time, operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave a good indication of the readiness of the WLCG infrastructure, with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s proved to be fully capable of performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  14. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  15. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing Readiness Challenge (CCRC’08), the CMS computing team achieved very good results. The link between the detector site and the Tier-0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier-0, processing a massive number of very large files with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier-1s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful tests prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  16. COMPUTING

    CERN Document Server

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  17. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  18. COMPUTING

    CERN Document Server

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  19. COMPUTING

    CERN Document Server

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  20. Assessment of the computational uncertainty of temperature rise and SAR in the eyes and brain under far-field exposure from 1 to 10 GHz

    Science.gov (United States)

    Laakso, Ilkka

    2009-06-01

    This paper presents finite-difference time-domain (FDTD) calculations of specific absorption rate (SAR) values in the head under plane-wave exposure from 1 to 10 GHz using a resolution of 0.5 mm in adult male and female voxel models. Temperature rise due to the power absorption is calculated by the bioheat equation using a multigrid method solver. The computational accuracy is investigated by repeating the calculations with resolutions of 1 mm and 2 mm and comparing the results. Cubically averaged 10 g SAR in the eyes and brain and eye-averaged SAR are calculated and compared to the corresponding temperature rise as well as the recommended limits for exposure. The results suggest that 2 mm resolution should only be used for frequencies below 2.5 GHz, and 1 mm resolution only below 5 GHz. Morphological differences in models seemed to be an important cause of variation: differences in results between the two different models were usually larger than the computational error due to the grid resolution, and larger than the difference between the results for open and closed eyes. Limiting the incident plane-wave power density to below 100 W m-2 was sufficient for ensuring that the temperature rise in the eyes and brain was less than 1 °C in the whole frequency range.
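
As a rough illustration of the grid-convergence check described above, the following Python sketch compares a quantity of interest (here a peak 10 g SAR value) computed at the finest 0.5 mm resolution with coarser-grid results; the numerical values are invented placeholders, not results from the paper.

```python
# Sketch: quantify grid-resolution sensitivity of a computed quantity (e.g. 10 g SAR)
# by comparing results obtained at successively coarser FDTD resolutions.
# The SAR values below are placeholders, not data from the paper.

def relative_difference(coarse: float, fine: float) -> float:
    """Relative deviation of a coarse-grid result from the finest-grid reference."""
    return abs(coarse - fine) / abs(fine)

# hypothetical peak 10 g SAR (W/kg) in the eye at one frequency, per grid step
sar_by_resolution_mm = {0.5: 1.00, 1.0: 1.06, 2.0: 1.21}

reference = sar_by_resolution_mm[0.5]          # finest grid taken as reference
for h, sar in sorted(sar_by_resolution_mm.items()):
    print(f"h = {h:>3} mm: SAR = {sar:.2f} W/kg, "
          f"deviation from 0.5 mm grid = {100 * relative_difference(sar, reference):.1f}%")
```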

  1. Assessment of the computational uncertainty of temperature rise and SAR in the eyes and brain under far-field exposure from 1 to 10 GHz

    Energy Technology Data Exchange (ETDEWEB)

    Laakso, Ilkka [Department of Radio Science and Engineering, Helsinki University of Technology, Otakaari 5 A, 02150 Espoo (Finland)], E-mail: ilkka.laakso@tkk.fi

    2009-06-07

    This paper presents finite-difference time-domain (FDTD) calculations of specific absorption rate (SAR) values in the head under plane-wave exposure from 1 to 10 GHz using a resolution of 0.5 mm in adult male and female voxel models. Temperature rise due to the power absorption is calculated by the bioheat equation using a multigrid method solver. The computational accuracy is investigated by repeating the calculations with resolutions of 1 mm and 2 mm and comparing the results. Cubically averaged 10 g SAR in the eyes and brain and eye-averaged SAR are calculated and compared to the corresponding temperature rise as well as the recommended limits for exposure. The results suggest that 2 mm resolution should only be used for frequencies below 2.5 GHz, and 1 mm resolution only below 5 GHz. Morphological differences in models seemed to be an important cause of variation: differences in results between the two different models were usually larger than the computational error due to the grid resolution, and larger than the difference between the results for open and closed eyes. Limiting the incident plane-wave power density to below 100 W m-2 was sufficient for ensuring that the temperature rise in the eyes and brain was less than 1 °C in the whole frequency range.

  2. Assessment of the computational uncertainty of temperature rise and SAR in the eyes and brain under far-field exposure from 1 to 10 GHz

    International Nuclear Information System (INIS)

    This paper presents finite-difference time-domain (FDTD) calculations of specific absorption rate (SAR) values in the head under plane-wave exposure from 1 to 10 GHz using a resolution of 0.5 mm in adult male and female voxel models. Temperature rise due to the power absorption is calculated by the bioheat equation using a multigrid method solver. The computational accuracy is investigated by repeating the calculations with resolutions of 1 mm and 2 mm and comparing the results. Cubically averaged 10 g SAR in the eyes and brain and eye-averaged SAR are calculated and compared to the corresponding temperature rise as well as the recommended limits for exposure. The results suggest that 2 mm resolution should only be used for frequencies below 2.5 GHz, and 1 mm resolution only below 5 GHz. Morphological differences in models seemed to be an important cause of variation: differences in results between the two different models were usually larger than the computational error due to the grid resolution, and larger than the difference between the results for open and closed eyes. Limiting the incident plane-wave power density to below 100 W m-2 was sufficient for ensuring that the temperature rise in the eyes and brain was less than 1 °C in the whole frequency range.

  3. Uncertainty quantification theory, implementation, and applications

    CERN Document Server

    Smith, Ralph C

    2014-01-01

    The field of uncertainty quantification is evolving rapidly because of increasing emphasis on models that require quantified uncertainties for large-scale applications, novel algorithm development, and new computational architectures that facilitate implementation of these algorithms. Uncertainty Quantification: Theory, Implementation, and Applications provides readers with the basic concepts, theory, and algorithms necessary to quantify input and response uncertainties for simulation models arising in a broad range of disciplines. The book begins with a detailed discussion of applications where uncertainty quantification is critical for both scientific understanding and policy. It then covers concepts from probability and statistics, parameter selection techniques, frequentist and Bayesian model calibration, propagation of uncertainties, quantification of model discrepancy, surrogate model construction, and local and global sensitivity analysis. The author maintains a complementary web page where readers ca...

  4. Experimental uncertainty estimation and statistics for data having interval uncertainty.

    Energy Technology Data Exchange (ETDEWEB)

    Kreinovich, Vladik (Applied Biomathematics, Setauket, New York); Oberkampf, William Louis (Applied Biomathematics, Setauket, New York); Ginzburg, Lev (Applied Biomathematics, Setauket, New York); Ferson, Scott (Applied Biomathematics, Setauket, New York); Hajagos, Janos (Applied Biomathematics, Setauket, New York)

    2007-05-01

    This report addresses the characterization of measurements that include epistemic uncertainties in the form of intervals. It reviews the application of basic descriptive statistics to data sets which contain intervals rather than exclusively point estimates. It describes algorithms to compute various means, the median and other percentiles, variance, interquartile range, moments, confidence limits, and other important statistics and summarizes the computability of these statistics as a function of sample size and characteristics of the intervals in the data (degree of overlap, size and regularity of widths, etc.). It also reviews the prospects for analyzing such data sets with the methods of inferential statistics such as outlier detection and regressions. The report explores the tradeoff between measurement precision and sample size in statistical results that are sensitive to both. It also argues that an approach based on interval statistics could be a reasonable alternative to current standard methods for evaluating, expressing and propagating measurement uncertainties.
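
The following Python sketch illustrates the basic idea of descriptive statistics for interval data: because the mean and the median are monotone in each observation, tight bounds are obtained by evaluating them on the lower and upper endpoints. The intervals are made up for illustration, and the snippet is not the report's algorithm for the harder statistics (e.g. the variance), whose bounds require more work.

```python
# Sketch: descriptive statistics for interval-valued data, where each measurement is
# only known to lie in [lo, hi]. For monotone statistics such as the mean and the
# median, tight bounds follow from the endpoint samples. Intervals are hypothetical.

from statistics import mean, median

intervals = [(1.2, 1.8), (0.9, 1.1), (1.5, 2.3), (1.0, 1.6)]   # hypothetical data

lows  = [lo for lo, _ in intervals]
highs = [hi for _, hi in intervals]

mean_bounds   = (mean(lows), mean(highs))     # tight bounds on the sample mean
median_bounds = (median(lows), median(highs)) # tight bounds on the sample median

print("mean   in", mean_bounds)
print("median in", median_bounds)
```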

  5. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  6. COMPUTING

    CERN Document Server

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  7. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  8. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing operation has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013 with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand with a peak of over 750 Million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   The tape utilisation was a focus for the operation teams with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  9. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns led by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier-1 and Tier-2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking, the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure, as well as interfacing to the data operations tasks, are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  10. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format, and running the samples through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  11. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort during the last period was focused on the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing a full-scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  12. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. The GlideInWMS and components installation are now deployed at CERN, in addition to the GlideInWMS factory placed in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  13. Estimating uncertainties in complex joint inverse problems

    Science.gov (United States)

    Afonso, Juan Carlos

    2016-04-01

    Sources of uncertainty affecting geophysical inversions can be classified either as reflective (i.e. the practitioner is aware of her/his ignorance) or non-reflective (i.e. the practitioner does not know that she/he does not know!). Although we should be always conscious of the latter, the former are the ones that, in principle, can be estimated either empirically (by making measurements or collecting data) or subjectively (based on the experience of the researchers). For complex parameter estimation problems in geophysics, subjective estimation of uncertainty is the most common type. In this context, probabilistic (aka Bayesian) methods are commonly claimed to offer a natural and realistic platform from which to estimate model uncertainties. This is because in the Bayesian approach, errors (whatever their nature) can be naturally included as part of the global statistical model, the solution of which represents the actual solution to the inverse problem. However, although we agree that probabilistic inversion methods are the most powerful tool for uncertainty estimation, the common claim that they produce "realistic" or "representative" uncertainties is not always justified. Typically, ALL UNCERTAINTY ESTIMATES ARE MODEL DEPENDENT, and therefore, besides a thorough characterization of experimental uncertainties, particular care must be paid to the uncertainty arising from model errors and input uncertainties. We recall here two quotes by G. Box and M. Gunzburger, respectively, of special significance for inversion practitioners and for this session: "…all models are wrong, but some are useful" and "computational results are believed by no one, except the person who wrote the code". In this presentation I will discuss and present examples of some problems associated with the estimation and quantification of uncertainties in complex multi-observable probabilistic inversions, and how to address them. Although the emphasis will be on sources of uncertainty related

  14. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase availability of more sites such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned, it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a 4 times increase in throughput with respect to LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhE-DEx topology. Since mid-February, a transfer volume of about 12 P...

  15. Treatment of uncertainties in the geologic disposal of radioactive waste

    International Nuclear Information System (INIS)

    Uncertainty in the analysis of geologic waste disposal is generally considered to have three primary components: (1) computer code/model uncertainty, (2) model parameter uncertainty, and (3) scenario uncertainty. Computer code/model uncertainty arises from problems associated with determination of appropriate parameters for use in model construction, mathematical formulation of models, and numerical techniques used in conjunction with the mathematical formulation of models. Model parameter uncertainty arises from problems associated with selection of appropriate values for model input, data interpretation and possible misuse of data, and variation of data. Scenario uncertainty arises from problems associated with the 'completeness' of scenarios, the definition of parameters which describe scenarios, and the rate or probability of scenario occurrence. The preceding sources of uncertainty are discussed below.

  16. Treatment of uncertainties in the geologic disposal of radioactive waste

    Energy Technology Data Exchange (ETDEWEB)

    Cranwell, R.M.

    1985-12-31

    Uncertainty in the analysis of geologic waste disposal is generally considered to have three primary components: (1) computer code/model uncertainty, (2) model parameter uncertainty, and (3) scenario uncertainty. Computer code/model uncertainty arises from problems associated with determination of appropriate parameters for use in model construction, mathematical formulation of models, and numerical techniques used in conjunction with the mathematical formulation of models. Model parameter uncertainty arises from problems associated with selection of appropriate values for model input, data interpretation and possible misuse of data, and variation of data. Scenario uncertainty arises from problems associated with the 'completeness' of scenarios, the definition of parameters which describe scenarios, and the rate or probability of scenario occurrence. The preceding sources of uncertainty are discussed below.

  17. Pore Velocity Estimation Uncertainties

    Science.gov (United States)

    Devary, J. L.; Doctor, P. G.

    1982-08-01

    Geostatistical data analysis techniques were used to stochastically model the spatial variability of groundwater pore velocity in a potential waste repository site. Kriging algorithms were applied to Hanford Reservation data to estimate hydraulic conductivities, hydraulic head gradients, and pore velocities. A first-order Taylor series expansion for pore velocity was used to statistically combine hydraulic conductivity, hydraulic head gradient, and effective porosity surfaces and uncertainties to characterize the pore velocity uncertainty. Use of these techniques permits the estimation of pore velocity uncertainties when pore velocity measurements do not exist. Large pore velocity estimation uncertainties were found to be located in the region where the hydraulic head gradient relative uncertainty was maximal.
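
A minimal sketch of the first-order Taylor-series propagation mentioned above, for a pore velocity of the form v = K i / n (hydraulic conductivity times head gradient divided by effective porosity), assuming independent inputs; all numbers are illustrative, not Hanford data.

```python
# Sketch: first-order (Taylor-series) propagation of uncertainty for pore velocity
# v = K * i / n, with K hydraulic conductivity, i hydraulic head gradient and
# n effective porosity, assuming the three inputs are independent.

import math

K, sK = 1.0e-5, 2.0e-6       # conductivity (m/s) and its standard deviation
i, si = 1.0e-3, 2.0e-4       # head gradient (-) and its standard deviation
n, sn = 0.20, 0.03           # effective porosity (-) and its standard deviation

v = K * i / n                # nominal pore velocity

# partial derivatives evaluated at the nominal point
dv_dK = i / n
dv_di = K / n
dv_dn = -K * i / n**2

var_v = (dv_dK * sK)**2 + (dv_di * si)**2 + (dv_dn * sn)**2
print(f"v = {v:.3e} m/s +/- {math.sqrt(var_v):.3e} m/s (1 sigma, first order)")
```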

  18. Verification of uncertainty budgets

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Madsen, B.S.

    2005-01-01

    The quality of analytical results is expressed by their uncertainty, as it is estimated on the basis of an uncertainty budget; little effort is, however, often spent on ascertaining the quality of the uncertainty budget. The uncertainty budget is based on circumstantial or historical data, and therefore it is essential that the applicability of the overall uncertainty budget to actual measurement results be verified on the basis of current experimental data. This should be carried out by replicate analysis of samples taken in accordance with the definition of the measurand, but representing the full range of matrices and concentrations for which the budget is assumed to be valid. In this way the assumptions made in the uncertainty budget can be experimentally verified, both as regards sources of variability that are assumed negligible, and dominant uncertainty components. Agreement between...

  19. Computational Sustainability

    OpenAIRE

    Eaton, Eric; University of Pennsylvania; Gomes, Carla P.; Cornell University; Williams, Brian; Massachusetts Institute of Technology

    2014-01-01

    Computational sustainability problems, which exist in dynamic environments with high amounts of uncertainty, provide a variety of unique challenges to artificial intelligence research and the opportunity for significant impact upon our collective future. This editorial provides an overview of artificial intelligence for computational sustainability, and introduces this special issue of AI Magazine.

  20. Uncertainties in Safety Analysis. A literature review

    Energy Technology Data Exchange (ETDEWEB)

    Ekberg, C. [Chalmers Univ. of Technology, Goeteborg (Sweden). Dept. of Nuclear Chemistry]

    1995-05-01

    The purpose of the presented work has been to give a short summary of the origins of many uncertainties arising in the design and performance assessment of a repository for spent nuclear fuel. Some methods to treat these uncertainties are also included. The methods and conclusions are in many cases general, in the sense that they are applicable to many other disciplines where simulations are used. As a conclusion it may be noted that uncertainties of different origins have been discussed and debated, but one large group, e.g. those arising from computer simulations, for which methods for a more explicit investigation exist, has not been investigated in a satisfactory way. 50 refs.

  1. Uncertainties in Safety Analysis. A literature review

    International Nuclear Information System (INIS)

    The purpose of the presented work has been to give a short summary of the origins of many uncertainties arising in the design and performance assessment of a repository for spent nuclear fuel. Some methods to treat these uncertainties are also included. The methods and conclusions are in many cases general, in the sense that they are applicable to many other disciplines where simulations are used. As a conclusion it may be noted that uncertainties of different origins have been discussed and debated, but one large group, e.g. those arising from computer simulations, for which methods for a more explicit investigation exist, has not been investigated in a satisfactory way. 50 refs.

  2. Measurement Uncertainty and Probability

    Science.gov (United States)

    Willink, Robin

    2013-02-01

    Part I. Principles: 1. Introduction; 2. Foundational ideas in measurement; 3. Components of error or uncertainty; 4. Foundational ideas in probability and statistics; 5. The randomization of systematic errors; 6. Beyond the standard confidence interval; Part II. Evaluation of Uncertainty: 7. Final preparation; 8. Evaluation using the linear approximation; 9. Evaluation without the linear approximations; 10. Uncertainty information fit for purpose; Part III. Related Topics: 11. Measurement of vectors and functions; 12. Why take part in a measurement comparison?; 13. Other philosophies; 14. An assessment of objective Bayesian methods; 15. A guide to the expression of uncertainty in measurement; 16. Measurement near a limit - an insoluble problem?; References; Index.

  3. Uncertainty in artificial intelligence

    CERN Document Server

    Kanal, LN

    1986-01-01

    How to deal with uncertainty is a subject of much controversy in Artificial Intelligence. This volume brings together a wide range of perspectives on uncertainty, many of the contributors being the principal proponents in the controversy.Some of the notable issues which emerge from these papers revolve around an interval-based calculus of uncertainty, the Dempster-Shafer Theory, and probability as the best numeric model for uncertainty. There remain strong dissenting opinions not only about probability but even about the utility of any numeric method in this context.

  4. Uncertainty in hydrological signatures

    Science.gov (United States)

    McMillan, Hilary; Westerberg, Ida

    2015-04-01

    Information that summarises the hydrological behaviour or flow regime of a catchment is essential for comparing responses of different catchments to understand catchment organisation and similarity, and for many other modelling and water-management applications. Such information types derived as an index value from observed data are known as hydrological signatures, and can include descriptors of high flows (e.g. mean annual flood), low flows (e.g. mean annual low flow, recession shape), the flow variability, flow duration curve, and runoff ratio. Because the hydrological signatures are calculated from observed data such as rainfall and flow records, they are affected by uncertainty in those data. Subjective choices in the method used to calculate the signatures create a further source of uncertainty. Uncertainties in the signatures may affect our ability to compare different locations, to detect changes, or to compare future water resource management scenarios. The aim of this study was to contribute to the hydrological community's awareness and knowledge of data uncertainty in hydrological signatures, including typical sources, magnitude and methods for its assessment. We proposed a generally applicable method to calculate these uncertainties based on Monte Carlo sampling and demonstrated it for a variety of commonly used signatures. The study was made for two data rich catchments, the 50 km2 Mahurangi catchment in New Zealand and the 135 km2 Brue catchment in the UK. For rainfall data the uncertainty sources included point measurement uncertainty, the number of gauges used in calculation of the catchment spatial average, and uncertainties relating to lack of quality control. For flow data the uncertainty sources included uncertainties in stage/discharge measurement and in the approximation of the true stage-discharge relation by a rating curve. The resulting uncertainties were compared across the different signatures and catchments, to quantify uncertainty
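
The following Python sketch shows a Monte Carlo treatment in the spirit described above, applied to one simple signature (the runoff ratio): flow and rainfall series are perturbed with multiplicative errors of assumed magnitude and the spread of the resulting signature values is summarised. The series and the error magnitudes are synthetic and only illustrate the mechanics.

```python
# Sketch: Monte Carlo estimate of data uncertainty in a hydrological signature.
# Signature: runoff ratio (total flow / total rainfall). Flow and rainfall series
# are perturbed with multiplicative errors of assumed magnitude; all values synthetic.

import random

random.seed(1)
rain = [random.uniform(0.0, 20.0) for _ in range(365)]        # mm/day, synthetic
flow = [0.4 * r + random.uniform(0.0, 2.0) for r in rain]     # mm/day, synthetic

def runoff_ratio(q, p):
    return sum(q) / sum(p)

samples = []
for _ in range(2000):
    # assumed relative standard uncertainties: 10% on flow, 15% on rainfall
    qf = random.gauss(1.0, 0.10)
    pf = random.gauss(1.0, 0.15)
    samples.append(runoff_ratio([qf * q for q in flow], [pf * p for p in rain]))

samples.sort()
lo, hi = samples[int(0.025 * len(samples))], samples[int(0.975 * len(samples))]
print(f"runoff ratio: nominal {runoff_ratio(flow, rain):.2f}, 95% interval [{lo:.2f}, {hi:.2f}]")
```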

  5. Understanding Theoretical Uncertainties in Perturbative QCD Computations

    DEFF Research Database (Denmark)

    Jenniches, Laura Katharina

    effective field theories and perturbative QCD to predict the effect of New Physics on measurements at the LHC and at other future colliders. We use heavy-quark, heavy-scalar and soft-collinear effective theory to calculate a three-body cascade decay at NLO QCD in the expansion-by-regions formalism...

  6. Predictive uncertainty in auditory sequence processing.

    Science.gov (United States)

    Hansen, Niels Chr; Pearce, Marcus T

    2014-01-01

    Previous studies of auditory expectation have focused on the expectedness perceived by listeners retrospectively in response to events. In contrast, this research examines predictive uncertainty-a property of listeners' prospective state of expectation prior to the onset of an event. We examine the information-theoretic concept of Shannon entropy as a model of predictive uncertainty in music cognition. This is motivated by the Statistical Learning Hypothesis, which proposes that schematic expectations reflect probabilistic relationships between sensory events learned implicitly through exposure. Using probability estimates from an unsupervised, variable-order Markov model, 12 melodic contexts high in entropy and 12 melodic contexts low in entropy were selected from two musical repertoires differing in structural complexity (simple and complex). Musicians and non-musicians listened to the stimuli and provided explicit judgments of perceived uncertainty (explicit uncertainty). We also examined an indirect measure of uncertainty computed as the entropy of expectedness distributions obtained using a classical probe-tone paradigm where listeners rated the perceived expectedness of the final note in a melodic sequence (inferred uncertainty). Finally, we simulate listeners' perception of expectedness and uncertainty using computational models of auditory expectation. A detailed model comparison indicates which model parameters maximize fit to the data and how they compare to existing models in the literature. The results show that listeners experience greater uncertainty in high-entropy musical contexts than low-entropy contexts. This effect is particularly apparent for inferred uncertainty and is stronger in musicians than non-musicians. Consistent with the Statistical Learning Hypothesis, the results suggest that increased domain-relevant training is associated with an increasingly accurate cognitive model of probabilistic structure in music.

  7. Predictive uncertainty in auditory sequence processing

    Directory of Open Access Journals (Sweden)

    Niels Chr. eHansen

    2014-09-01

    Full Text Available Previous studies of auditory expectation have focused on the expectedness perceived by listeners retrospectively in response to events. In contrast, this research examines predictive uncertainty - a property of listeners’ prospective state of expectation prior to the onset of an event. We examine the information-theoretic concept of Shannon entropy as a model of predictive uncertainty in music cognition. This is motivated by the Statistical Learning Hypothesis, which proposes that schematic expectations reflect probabilistic relationships between sensory events learned implicitly through exposure. Using probability estimates from an unsupervised, variable-order Markov model, 12 melodic contexts high in entropy and 12 melodic contexts low in entropy were selected from two musical repertoires differing in structural complexity (simple and complex). Musicians and non-musicians listened to the stimuli and provided explicit judgments of perceived uncertainty (explicit uncertainty). We also examined an indirect measure of uncertainty computed as the entropy of expectedness distributions obtained using a classical probe-tone paradigm where listeners rated the perceived expectedness of the final note in a melodic sequence (inferred uncertainty). Finally, we simulate listeners’ perception of expectedness and uncertainty using computational models of auditory expectation. A detailed model comparison indicates which model parameters maximize fit to the data and how they compare to existing models in the literature. The results show that listeners experience greater uncertainty in high-entropy musical contexts than low-entropy contexts. This effect is particularly apparent for inferred uncertainty and is stronger in musicians than non-musicians. Consistent with the Statistical Learning Hypothesis, the results suggest that increased domain-relevant training is associated with an increasingly accurate cognitive model of probabilistic structure in music.

  8. Predictive uncertainty in auditory sequence processing.

    Science.gov (United States)

    Hansen, Niels Chr; Pearce, Marcus T

    2014-01-01

    Previous studies of auditory expectation have focused on the expectedness perceived by listeners retrospectively in response to events. In contrast, this research examines predictive uncertainty-a property of listeners' prospective state of expectation prior to the onset of an event. We examine the information-theoretic concept of Shannon entropy as a model of predictive uncertainty in music cognition. This is motivated by the Statistical Learning Hypothesis, which proposes that schematic expectations reflect probabilistic relationships between sensory events learned implicitly through exposure. Using probability estimates from an unsupervised, variable-order Markov model, 12 melodic contexts high in entropy and 12 melodic contexts low in entropy were selected from two musical repertoires differing in structural complexity (simple and complex). Musicians and non-musicians listened to the stimuli and provided explicit judgments of perceived uncertainty (explicit uncertainty). We also examined an indirect measure of uncertainty computed as the entropy of expectedness distributions obtained using a classical probe-tone paradigm where listeners rated the perceived expectedness of the final note in a melodic sequence (inferred uncertainty). Finally, we simulate listeners' perception of expectedness and uncertainty using computational models of auditory expectation. A detailed model comparison indicates which model parameters maximize fit to the data and how they compare to existing models in the literature. The results show that listeners experience greater uncertainty in high-entropy musical contexts than low-entropy contexts. This effect is particularly apparent for inferred uncertainty and is stronger in musicians than non-musicians. Consistent with the Statistical Learning Hypothesis, the results suggest that increased domain-relevant training is associated with an increasingly accurate cognitive model of probabilistic structure in music. PMID:25295018
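
A minimal Python sketch of the entropy computation underlying the notion of predictive uncertainty used in these studies: the Shannon entropy of a next-event probability distribution, such as one produced by a variable-order Markov model, is higher when many continuations are plausible. The two distributions below are invented for illustration.

```python
# Sketch: Shannon entropy of a next-event distribution as a measure of predictive
# uncertainty. The distributions stand in for the output of a variable-order Markov
# model in a low-entropy and a high-entropy melodic context (values invented).

import math

def shannon_entropy(p):
    """Entropy in bits of a discrete probability distribution."""
    return -sum(x * math.log2(x) for x in p if x > 0.0)

low_entropy_context  = [0.85, 0.10, 0.03, 0.02]           # one continuation dominates
high_entropy_context = [0.25, 0.25, 0.20, 0.15, 0.15]     # many plausible continuations

print(f"low-entropy context:  H = {shannon_entropy(low_entropy_context):.2f} bits")
print(f"high-entropy context: H = {shannon_entropy(high_entropy_context):.2f} bits")
```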

  9. Uncertainty in flood risk mapping

    Science.gov (United States)

    Gonçalves, Luisa M. S.; Fonte, Cidália C.; Gomes, Ricardo

    2014-05-01

    A flood refers to a sharp increase of water level or volume in rivers and seas caused by sudden rainstorms or melting ice due to natural factors. In this paper, the flooding of riverside urban areas caused by sudden rainstorms will be studied. In this context, flooding occurs when the water runs above the level of the minor river bed and enters the major river bed. The level of the major bed determines the magnitude and risk of the flooding. The prediction of the flooding extent is usually deterministic, and corresponds to the expected limit of the flooded area. However, there are many sources of uncertainty in the process of obtaining these limits, which influence the obtained flood maps used for watershed management or as instruments for territorial and emergency planning. In addition, small variations in the delineation of the flooded area can be translated into erroneous risk prediction. Therefore, maps that reflect the uncertainty associated with the flood modeling process have started to be developed, associating a degree of likelihood with the boundaries of the flooded areas. In this paper an approach is presented that enables the influence of the parameters uncertainty to be evaluated, dependent on the type of Land Cover Map (LCM) and Digital Elevation Model (DEM), on the estimated values of the peak flow and the delineation of flooded areas (different peak flows correspond to different flood areas). The approach requires modeling the DEM uncertainty and its propagation to the catchment delineation. The results obtained in this step enable a catchment with fuzzy geographical extent to be generated, where a degree of possibility of belonging to the basin is assigned to each elementary spatial unit. Since the fuzzy basin may be considered as a fuzzy set, the fuzzy area of the basin may be computed, generating a fuzzy number. The catchment peak flow is then evaluated using fuzzy arithmetic. With this methodology a fuzzy number is obtained for the peak flow
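
A minimal sketch of the fuzzy-arithmetic step described above, assuming a triangular fuzzy number for the catchment area and the simple rational formula Q = C i A / 3.6 for the peak flow; alpha-cuts of the fuzzy area are propagated by interval arithmetic. All values are illustrative and the hydrological model is deliberately simpler than the one used in the study.

```python
# Sketch: fuzzy arithmetic with alpha-cuts, in the spirit of evaluating a peak flow
# from a catchment area known only as a fuzzy number. Peak flow model: rational
# formula Q = C * i * A / 3.6 (A in km^2, i in mm/h, Q in m^3/s); numbers illustrative.

def alpha_cut(tri, alpha):
    """Interval of a triangular fuzzy number (a, b, c) at membership level alpha."""
    a, b, c = tri
    return (a + alpha * (b - a), c - alpha * (c - b))

area_km2 = (48.0, 50.0, 53.0)      # fuzzy catchment area: (min, most likely, max)
C, i = 0.35, 25.0                  # runoff coefficient (-), rainfall intensity (mm/h)

for alpha in (0.0, 0.5, 1.0):
    lo, hi = alpha_cut(area_km2, alpha)
    print(f"alpha = {alpha:.1f}: peak flow in "
          f"[{C * i * lo / 3.6:.1f}, {C * i * hi / 3.6:.1f}] m^3/s")
```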

  10. Feedback versus uncertainty

    NARCIS (Netherlands)

    Van Nooyen, R.R.P.; Hrachowitz, M.; Kolechkina, A.G.

    2014-01-01

    Even without uncertainty about the model structure or parameters, the output of a hydrological model run still contains several sources of uncertainty. These are: measurement errors affecting the input, the transition from continuous time and space to discrete time and space, which causes loss of in

  11. Chance and Uncertainty

    NARCIS (Netherlands)

    Capel, H.W.; Cramer, J.S.; Estevez-Uscanga, O.

    1995-01-01

    'Uncertainty and chance' is a subject with a broad span, in that there is no academic discipline or walk of life that is not beset by uncertainty and chance. In this book a range of approaches is represented by authors from varied disciplines: natural sciences, mathematics, social sciences and medic

  12. Uncertainty and simulation

    International Nuclear Information System (INIS)

    More than 150 researchers and engineers from universities and the industrial world met to discuss on the new methodologies developed around assessing uncertainty. About 20 papers were presented and the main topics were: methods to study the propagation of uncertainties, sensitivity analysis, nuclear data covariances or multi-parameter optimisation. This report gathers the contributions of CEA researchers and engineers

  13. Physical Uncertainty Bounds (PUB)

    Energy Technology Data Exchange (ETDEWEB)

    Vaughan, Diane Elizabeth [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Preston, Dean L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-03-19

    This paper introduces and motivates the need for a new methodology for determining upper bounds on the uncertainties in simulations of engineered systems due to limited fidelity in the composite continuum-level physics models needed to simulate the systems. We show that traditional uncertainty quantification methods provide, at best, a lower bound on this uncertainty. We propose to obtain bounds on the simulation uncertainties by first determining bounds on the physical quantities or processes relevant to system performance. By bounding these physics processes, as opposed to carrying out statistical analyses of the parameter sets of specific physics models or simply switching out the available physics models, one can obtain upper bounds on the uncertainties in simulated quantities of interest.

  14. Nanoparticles: Uncertainty Risk Analysis

    DEFF Research Database (Denmark)

    Grieger, Khara Deanne; Hansen, Steffen Foss; Baun, Anders

    2012-01-01

    Scientific uncertainty plays a major role in assessing the potential environmental risks of nanoparticles. Moreover, there is uncertainty within fundamental data and information regarding the potential environmental and health risks of nanoparticles, hampering risk assessments based on standard approaches. To date, there have been a number of different approaches to assess uncertainty of environmental risks in general, and some have also been proposed in the case of nanoparticles and nanomaterials. In recent years, others have also proposed that broader assessments of uncertainty are also needed in order to handle the complex potential risks of nanoparticles, including more descriptive characterizations of uncertainty. Some of these approaches are presented and discussed herein, in which the potential strengths and limitations of these approaches are identified along with further challenges...

  15. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.
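
As a hedged illustration of the Monte Carlo principle mentioned in the description, the sketch below propagates the uncertainties of two input quantities through a simple measurement model (a resistance computed as voltage over current) and reads off a standard uncertainty and a 95% interval; the values are invented.

```python
# Sketch: Monte Carlo evaluation of measurement uncertainty. A measurand Y = V / I is
# simulated by sampling the input quantities from distributions that encode their
# standard uncertainties; the input values are hypothetical.

import random, statistics

random.seed(0)
N = 100_000
samples = []
for _ in range(N):
    V = random.gauss(10.00, 0.02)          # volts, standard uncertainty 0.02 V
    I = random.gauss(2.000, 0.005)         # amperes, standard uncertainty 5 mA
    samples.append(V / I)                  # resistance in ohms

samples.sort()
mean = statistics.fmean(samples)
u = statistics.stdev(samples)
ci = (samples[int(0.025 * N)], samples[int(0.975 * N)])
print(f"R = {mean:.4f} ohm, u(R) = {u:.4f} ohm, 95% interval = [{ci[0]:.4f}, {ci[1]:.4f}]")
```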

  16. Economic uncertainty and econophysics

    Science.gov (United States)

    Schinckus, Christophe

    2009-10-01

    The objective of this paper is to provide a methodological link between econophysics and economics. I will study a key notion of both fields: uncertainty and the ways of thinking about it developed by the two disciplines. After having presented the main economic theories of uncertainty (provided by Knight, Keynes and Hayek), I show how this notion is paradoxically excluded from the economic field. In economics, uncertainty is totally reduced by an a priori Gaussian framework-in contrast to econophysics, which does not use a priori models because it works directly on data. Uncertainty is then not shaped by a specific model, and is partially and temporally reduced as models improve. This way of thinking about uncertainty has echoes in the economic literature. By presenting econophysics as a Knightian method, and a complementary approach to a Hayekian framework, this paper shows that econophysics can be methodologically justified from an economic point of view.

  17. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  18. Using dynamical uncertainty models estimating uncertainty bounds on power plant performance prediction

    DEFF Research Database (Denmark)

    Odgaard, Peter Fogh; Stoustrup, Jakob; Mataji, B.

    2007-01-01

    Predicting the performance of large scale plants can be difficult due to model uncertainties etc., meaning that one can be almost certain that the prediction will diverge from the plant performance with time. In this paper output multiplicative uncertainty models are used as dynamical models of the prediction error. These proposed dynamical uncertainty models result in an upper and lower bound on the predicted performance of the plant. The dynamical uncertainty models are used to estimate the uncertainty of the predicted performance of a coal-fired power plant. The proposed scheme, which uses dynamical uncertainty models, is applied to two different sets of measured plant data. The computed uncertainty bounds cover the measured plant output, while the nominal prediction is outside these uncertainty bounds for some samples in these examples.
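
A minimal sketch of how an output-multiplicative uncertainty description yields bounds on a prediction: given a nominal predicted output and a relative uncertainty weight (in the paper a dynamical model estimated from prediction-error data, here just a hypothetical sequence that grows with the prediction horizon), the bounds are the nominal value times (1 ± weight).

```python
# Sketch: turning an output-multiplicative uncertainty model into upper and lower
# bounds on a predicted plant output. Nominal prediction and relative weight are
# placeholders; in the paper the weight is a dynamical model estimated from data.

nominal_prediction = [100.0, 102.0, 105.0, 103.0, 99.0]   # e.g. MW, illustrative
relative_weight    = [0.02, 0.025, 0.03, 0.035, 0.04]     # grows with horizon

for k, (y, w) in enumerate(zip(nominal_prediction, relative_weight)):
    lower, upper = y * (1.0 - w), y * (1.0 + w)
    print(f"step {k}: nominal {y:6.1f}, bounds [{lower:6.1f}, {upper:6.1f}]")
```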

  19. Predictive uncertainty in auditory sequence processing

    DEFF Research Database (Denmark)

    Hansen, Niels Chr.; Pearce, Marcus T

    2014-01-01

    Previous studies of auditory expectation have focused on the expectedness perceived by listeners retrospectively in response to events. In contrast, this research examines predictive uncertainty, a property of listeners' prospective state of expectation prior to the onset of an event. We examine the information-theoretic concept of Shannon entropy as a model of predictive uncertainty in music cognition. This is motivated by the Statistical Learning Hypothesis, which proposes that schematic expectations reflect probabilistic relationships between sensory events learned implicitly through exposure. Using probability estimates from an unsupervised, variable-order Markov model, 12 melodic contexts high in entropy and 12 melodic contexts low in entropy were selected from two musical repertoires differing in structural complexity (simple and complex). Musicians and non-musicians listened to the stimuli and provided explicit judgments of perceived uncertainty (explicit uncertainty). We also examined an indirect measure of uncertainty computed as the entropy of expectedness distributions obtained using a classical probe-tone paradigm where listeners rated the perceived expectedness of the final note in a melodic sequence (inferred uncertainty).

  20. Optimal Universal Uncertainty Relations

    Science.gov (United States)

    Li, Tao; Xiao, Yunlong; Ma, Teng; Fei, Shao-Ming; Jing, Naihuan; Li-Jost, Xianqing; Wang, Zhi-Xi

    2016-01-01

    We study universal uncertainty relations and present a method called joint probability distribution diagram to improve the majorization bounds constructed independently in [Phys. Rev. Lett. 111, 230401 (2013)] and [J. Phys. A. 46, 272002 (2013)]. The results give rise to state independent uncertainty relations satisfied by any nonnegative Schur-concave functions. On the other hand, a remarkable recent result of entropic uncertainty relation is the direct-sum majorization relation. In this paper, we illustrate our bounds by showing how they provide a complement to that in [Phys. Rev. A. 89, 052115 (2014)]. PMID:27775010
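
The record above concerns majorization-based universal uncertainty relations; as a small, hedged numerical companion, the sketch below checks the classic Maassen-Uffink entropic bound H(Z) + H(X) >= 1 bit for a qubit measured in the Z and X bases, the kind of entropic relation those results complement. The state used is arbitrary.

```python
# Sketch: numerical check of the Maassen-Uffink entropic uncertainty bound
# H(A) + H(B) >= log2(1/c) for a qubit measured in the Z and X bases (c = 1/2,
# so the bound is 1 bit). The state below is an arbitrary example.

import cmath, math

def entropy(p):
    return -sum(x * math.log2(x) for x in p if x > 1e-12)

# qubit state cos(t)|0> + e^{i phi} sin(t)|1>
t, phi = 0.4, 0.7
a0, a1 = math.cos(t), cmath.exp(1j * phi) * math.sin(t)

pz = [abs(a0)**2, abs(a1)**2]                          # Z-basis probabilities
px = [abs(a0 + a1)**2 / 2.0, abs(a0 - a1)**2 / 2.0]    # X-basis probabilities

print(f"H(Z) + H(X) = {entropy(pz) + entropy(px):.3f} bits  (bound: 1 bit)")
```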

  1. A surrogate-based uncertainty quantification with quantifiable errors

    Energy Technology Data Exchange (ETDEWEB)

    Bang, Y.; Abdel-Khalik, H. S. [North Carolina State Univ., Raleigh, NC 27695 (United States)

    2012-07-01

    Surrogate models are often employed to reduce the computational cost required to complete uncertainty quantification, where one is interested in propagating input parameter uncertainties through a complex engineering model to estimate response uncertainties. An improved surrogate construction approach is introduced here which places a premium on reducing the associated computational cost. Unlike existing methods where the surrogate is constructed first and then employed to propagate uncertainties, the new approach combines both sensitivity and uncertainty information to render further reduction in the computational cost. Mathematically, the reduction is described by a range-finding algorithm that identifies a subspace in the parameter space whereby parameter uncertainties orthogonal to the subspace contribute a negligible amount to the propagated uncertainties. Moreover, the error resulting from the reduction can be upper-bounded. The new approach is demonstrated using a realistic nuclear assembly model and compared to existing methods in terms of computational cost and accuracy of uncertainties. Although we believe the algorithm is general, it is applied here to linear-based surrogates and Gaussian parameter uncertainties. The generalization to nonlinear models will be detailed in a separate article. (authors)
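
    The general flavor of such a reduction can be conveyed with a small sketch: build a dominant subspace from sensitivity (gradient) samples via an SVD and propagate a Gaussian parameter covariance through a linear surrogate restricted to that subspace. Everything below (problem sizes, energy tolerance, covariance, the low-rank toy sensitivities) is an illustrative assumption, not the authors' range-finding algorithm.

```python
# Sketch of the general idea: identify a dominant subspace of parameter space
# from sensitivity samples, then propagate a Gaussian covariance through a
# linear surrogate restricted to that subspace. All values are invented.
import numpy as np

rng = np.random.default_rng(0)
n_params = 50
directions = rng.normal(size=(3, n_params))     # pretend the model varies mostly in 3 directions
G = rng.normal(size=(20, 3)) @ directions + 0.01 * rng.normal(size=(20, n_params))

U, s, Vt = np.linalg.svd(G, full_matrices=False)
energy = np.cumsum(s**2) / np.sum(s**2)
r = int(np.searchsorted(energy, 0.99)) + 1      # rank capturing ~99% of the variation
V_r = Vt[:r].T                                  # basis of the retained subspace

C = (0.02**2) * np.eye(n_params)                # assumed Gaussian parameter covariance
g = G.mean(axis=0)                              # gradient of a linear surrogate
var_full = g @ C @ g                            # full linear propagation
var_red = (V_r.T @ g) @ (V_r.T @ C @ V_r) @ (V_r.T @ g)   # propagation in the subspace

print(f"rank {r}: full variance {var_full:.3e}, reduced variance {var_red:.3e}")
```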

  2. Uncertainty in oil projects

    International Nuclear Information System (INIS)

    This report presents an oil project valuation under uncertainty by means of two well-known financial techniques: the Capital Asset Pricing Model (CAPM) and the Black-Scholes option pricing formula. CAPM gives a linear positive relationship between expected rate of return and risk but does not take into consideration the aspect of flexibility, which is crucial for an irreversible investment such as an oil project. Introducing investment decision flexibility by using real options can increase the oil project value substantially. Some simple tests for the importance of stock market uncertainty for oil investments are performed. Uncertainty in stock returns is correlated with aggregate product market uncertainty according to Pindyck (1991). The results of the tests are not satisfactory due to the short data series, but introducing two other explanatory variables, the interest rate and Gross Domestic Product, improves the situation. 36 refs., 18 figs., 6 tabs
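
    For concreteness, the two valuation tools named above reduce to short closed-form computations; the sketch below uses invented numbers (risk-free rate, beta, volatility, project and exercise values) purely for illustration, not figures from the report.

```python
# Worked example of the two valuation tools mentioned: CAPM expected return and
# the Black-Scholes value of a (real) call option. All inputs are illustrative.
from math import log, sqrt, exp, erf

def norm_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def capm(rf, beta, market_return):
    """Expected rate of return: linear in systematic risk (beta)."""
    return rf + beta * (market_return - rf)

def black_scholes_call(S, K, T, r, sigma):
    """Value of the option to invest: S = project value, K = investment cost."""
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

print(capm(rf=0.04, beta=1.2, market_return=0.10))                  # expected return
print(black_scholes_call(S=100, K=90, T=3.0, r=0.04, sigma=0.35))   # option value of waiting
```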

  3. Evaluating prediction uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    McKay, M.D. [Los Alamos National Lab., NM (United States)

    1995-03-01

    The probability distribution of a model prediction is presented as a proper basis for evaluating the uncertainty in a model prediction that arises from uncertainty in input values. Determination of important model inputs and subsets of inputs is made through comparison of the prediction distribution with conditional prediction probability distributions. Replicated Latin hypercube sampling and variance ratios are used in estimation of the distributions and in construction of importance indicators. The assumption of a linear relation between model output and inputs is not necessary for the indicators to be effective. A sequential methodology which includes an independent validation step is applied in two analysis applications to select subsets of input variables which are the dominant causes of uncertainty in the model predictions. Comparison with results from methods which assume linearity shows how those methods may fail. Finally, suggestions for treating structural uncertainty for submodels are presented.
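
    The variance-ratio importance indicator at the heart of this approach can be illustrated with a small sketch: estimate Var(E[Y|Xi])/Var(Y) for each input by averaging the prediction within bins of that input. The toy model, plain random sampling in place of replicated Latin hypercube sampling, and the bin count are all simplifying assumptions, not the report's methodology.

```python
# Illustrative variance-ratio (correlation-ratio) importance indicator: the share
# of prediction variance explained by each input, estimated by binning samples.
import numpy as np

rng = np.random.default_rng(1)

def model(x):                      # toy nonlinear model prediction
    return np.sin(x[:, 0]) + 0.3 * x[:, 1]**2 + 0.05 * x[:, 2]

X = rng.uniform(-np.pi, np.pi, size=(20000, 3))
Y = model(X)

def variance_ratio(xi, y, bins=40):
    """Estimate Var(E[Y|Xi]) / Var(Y) by averaging Y within quantile bins of Xi."""
    edges = np.quantile(xi, np.linspace(0, 1, bins + 1))
    idx = np.clip(np.searchsorted(edges, xi, side="right") - 1, 0, bins - 1)
    cond_means = np.array([y[idx == b].mean() for b in range(bins)])
    counts = np.array([(idx == b).sum() for b in range(bins)])
    return np.average((cond_means - y.mean())**2, weights=counts) / y.var()

for i in range(3):
    print(f"input {i}: importance ~ {variance_ratio(X[:, i], Y):.2f}")
```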

  4. Commonplaces and social uncertainty

    DEFF Research Database (Denmark)

    Lassen, Inger

    2008-01-01

    This article explores the concept of uncertainty in four focus group discussions about genetically modified food. In the discussions, members of the general public interact with food biotechnology scientists while negotiating their attitudes towards genetic engineering. Their discussions offer...

  5. Trajectories without quantum uncertainties

    CERN Document Server

    Polzik, Eugene S

    2014-01-01

    Common knowledge suggests that trajectories of particles in quantum mechanics always have quantum uncertainties. These quantum uncertainties, set by the Heisenberg uncertainty principle, limit the precision of measurements of fields and forces, and ultimately give rise to the standard quantum limit in metrology. With the rapid development of measurement sensitivity, these limits have been approached in various types of measurements, including measurements of fields and acceleration. Here we show that a quantum trajectory of one system, measured relative to another "reference system" with an effective negative mass, can be free of quantum uncertainty. The method crucially relies on the generation of an Einstein-Podolsky-Rosen entangled state of two objects, one of which has an effective negative mass. From a practical perspective these ideas open the way towards force and acceleration measurements at new levels of sensitivity far below the standard quantum limit.

  6. Uncertainty, rationality, and agency

    CERN Document Server

    Hoek, Wiebe van der

    2006-01-01

    Goes across 'classical' borderlines of disciplines. Unifies logic, game theory, and epistemics and studies them in an agent setting. Combines classical and novel approaches to uncertainty, rationality, and agency.

  7. Sustainability and uncertainty

    DEFF Research Database (Denmark)

    Jensen, Karsten Klint

    2007-01-01

    The widely used concept of sustainability is seldom precisely defined, and its clarification involves making up one's mind about a range of difficult questions. One line of research (bottom-up) takes sustaining a system over time as its starting point and then infers prescriptions from this requirement... ...are decisions under uncertainty. There might be different judgments on likelihoods; but even given some set of probabilities, there might be disagreement on the right level of precaution in the face of the uncertainty.

  8. Uncertainty calculations made easier

    Energy Technology Data Exchange (ETDEWEB)

    Hogenbirk, A.

    1994-07-01

    The results are presented of a neutron cross section sensitivity/uncertainty analysis performed in a complicated 2D model of the NET shielding blanket design inside the ITER torus design, surrounded by the cryostat/biological shield as planned for ITER. The calculations were performed with a code system developed at ECN Petten, with which sensitivity/uncertainty calculations become relatively simple. In order to check the deterministic neutron transport calculations (performed with DORT), calculations were also performed with the Monte Carlo code MCNP. Care was taken to model the 2.0 cm wide gaps between two blanket segments, as the neutron flux behind the vacuum vessel is largely determined by neutrons streaming through these gaps. The resulting neutron flux spectra are in excellent agreement up to the end of the cryostat. It is noted that at this position the attenuation of the neutron flux is about 11 orders of magnitude. The uncertainty in the energy-integrated flux at the beginning of the vacuum vessel and at the beginning of the cryostat was determined in the calculations. The uncertainty appears to be strongly dependent on the exact geometry: if the gaps are filled with stainless steel, the neutron spectrum changes strongly, which results in an uncertainty of 70% in the energy-integrated flux at the beginning of the cryostat in the no-gap geometry, compared to an uncertainty of only 5% in the gap geometry. Therefore, it is essential to take into account the exact geometry in sensitivity/uncertainty calculations. Furthermore, this study shows that an improvement of the covariance data is urgently needed in order to obtain reliable estimates of the uncertainties in response parameters in neutron transport calculations. (orig./GL).

  9. Uncertainty calculations made easier

    International Nuclear Information System (INIS)

    The results are presented of a neutron cross section sensitivity/uncertainty analysis performed in a complicated 2D model of the NET shielding blanket design inside the ITER torus design, surrounded by the cryostat/biological shield as planned for ITER. The calculations were performed with a code system developed at ECN Petten, with which sensitivity/uncertainty calculations become relatively simple. In order to check the deterministic neutron transport calculations (performed with DORT), calculations were also performed with the Monte Carlo code MCNP. Care was taken to model the 2.0 cm wide gaps between two blanket segments, as the neutron flux behind the vacuum vessel is largely determined by neutrons streaming through these gaps. The resulting neutron flux spectra are in excellent agreement up to the end of the cryostat. It is noted that at this position the attenuation of the neutron flux is about 11 orders of magnitude. The uncertainty in the energy-integrated flux at the beginning of the vacuum vessel and at the beginning of the cryostat was determined in the calculations. The uncertainty appears to be strongly dependent on the exact geometry: if the gaps are filled with stainless steel, the neutron spectrum changes strongly, which results in an uncertainty of 70% in the energy-integrated flux at the beginning of the cryostat in the no-gap geometry, compared to an uncertainty of only 5% in the gap geometry. Therefore, it is essential to take into account the exact geometry in sensitivity/uncertainty calculations. Furthermore, this study shows that an improvement of the covariance data is urgently needed in order to obtain reliable estimates of the uncertainties in response parameters in neutron transport calculations. (orig./GL)

  10. Generalized uncertainty principles

    CERN Document Server

    Machluf, Ronny

    2008-01-01

    The phenomenon in the essence of classical uncertainty principles has been well known since the thirties of the last century. We introduce a new phenomenon which is in the essence of a new notion that we introduce: "Generalized Uncertainty Principles". We show the relation between classical uncertainty principles and generalized uncertainty principles. We generalize the "Landau-Pollak-Slepian" uncertainty principle. Our generalization relates the following two quantities and two scaling parameters: 1) the weighted time spreading $\int_{-\infty}^\infty |f(x)|^2 w_1(x)\,dx$ ($w_1(x)$ is a non-negative function); 2) the weighted frequency spreading $\int_{-\infty}^\infty |\hat{f}(\omega)|^2 w_2(\omega)\,d\omega$; 3) the time weight scale $a$, ${w_1}_a(x)=w_1(xa^{-1})$; and 4) the frequency weight scale $b$, ${w_2}_b(\omega)=w_2(\omega b^{-1})$. A "Generalized Uncertainty Principle" is an inequality that summarizes the constraints on the relations between the two spreading quantities and two scaling parameters. For any two reason...

  11. Data assimilation for kinetic parameters uncertainty analysis

    International Nuclear Information System (INIS)

    Several years ago, the OECD/NEA Nuclear Science Committee (NSC) established the Expert Group on Uncertainty Analysis in Modeling (UAM-LWR) after thorough discussions of the demands from nuclear research, industry, safety and regulation to provide best-estimate predictions of nuclear systems parameters with their confidence bounds. UAM objectives include, among others, the quantification of uncertainties of neutronic calculations with respect to their value for multi-physics analysis. Since the kinetics parameters and their uncertainties are of particular interest for these studies, deterministic approaches for the analysis of uncertainties in nuclear reactor kinetic parameters (neutron generation lifetime and delayed neutron effective fraction) have been developed in the frame of the UAM-LWR. The approach uses a combination of a generalization of perturbation theory for reactivity analysis and the Generalized Perturbation Theory (GPT) for sensitivity computation. It has been applied to the UAM complementary fast-neutron SNEAK test case, which has a unique set of experimental data for βeff. In this example, the covariance matrices of nuclear data have been derived from the COMMARA library by Bayesian adjustment upon the set of fast-neutron integral benchmarks with the BERING code package. Then, the kinetic parameter uncertainties with their correlations have been applied to a simplified model of a reactivity insertion transient where the relative uncertainty of the power peak was taken as the figure of merit. The results demonstrate that the uncertainties due to nuclear data significantly impact the energy release in coupled transient modeling. It was also found that such uncertainties become higher if the correlations between uncertainties of different lumped parameters are taken into account. (author)

  12. Remaining Useful Life Estimation in Prognosis: An Uncertainty Propagation Problem

    Science.gov (United States)

    Sankararaman, Shankar; Goebel, Kai

    2013-01-01

    The estimation of remaining useful life is significant in the context of prognostics and health monitoring, and the prediction of remaining useful life is essential for online operations and decision-making. However, it is challenging to accurately predict the remaining useful life in practical aerospace applications due to the presence of various uncertainties that affect prognostic calculations, and in turn, render the remaining useful life prediction uncertain. It is challenging to identify and characterize the various sources of uncertainty in prognosis, understand how each of these sources of uncertainty affect the uncertainty in the remaining useful life prediction, and thereby compute the overall uncertainty in the remaining useful life prediction. In order to achieve these goals, this paper proposes that the task of estimating the remaining useful life must be approached as an uncertainty propagation problem. In this context, uncertainty propagation methods which are available in the literature are reviewed, and their applicability to prognostics and health monitoring are discussed.

  13. Measurement uncertainty of lactase-containing tablets analyzed with FTIR.

    Science.gov (United States)

    Paakkunainen, Maaret; Kohonen, Jarno; Reinikainen, Satu-Pia

    2014-01-01

    Uncertainty is one of the most critical aspects in the determination of measurement reliability. In order to ensure accurate measurements, results need to be traceable and uncertainty measurable. In this study, the homogeneity of FTIR samples is determined with a combination of a variographic and a multivariate approach. An approach for estimation of uncertainty within an individual sample, as well as within repeated samples, is introduced. FTIR samples containing two commercial pharmaceutical lactase products (LactaNON and Lactrase) are applied as an example of the procedure. The results showed that the approach is suitable for the purpose. The sample pellets were quite homogeneous, since the total uncertainty of each pellet varied between 1.5% and 2.5%. The heterogeneity within a tablet strip was found to be dominant, as 15-20 tablets have to be analyzed in order to reach this uncertainty level. Uncertainty arising from the FTIR instrument itself was also assessed; the uncertainty estimates are computed directly from FTIR spectra without any concentration information of the analyte.

  14. The uncertainties of magnetic properties measurements of electrical sheet steel

    CERN Document Server

    Ahlers, H

    2000-01-01

    In this work, uncertainties in measurements of magnetic properties of Epstein- and single-sheet samples have been determined according to the 'Guide To The Expression Of Uncertainty In Measurement', [International Organization for Standardization (1993)]. They were calculated for the results at predicted values of parameters taking into account the non-linear dependences. The measurement results and the uncertainties are calculated simultaneously by a computer program.

  15. Estimating the measurement uncertainty in forensic blood alcohol analysis.

    Science.gov (United States)

    Gullberg, Rod G

    2012-04-01

    For many reasons, forensic toxicologists are being asked to determine and report their measurement uncertainty in blood alcohol analysis. While understood conceptually, the elements and computations involved in determining measurement uncertainty are generally foreign to most forensic toxicologists. Several established and well-documented methods are available to determine and report the uncertainty in blood alcohol measurement. A straightforward bottom-up approach is presented that includes: (1) specifying the measurand, (2) identifying the major components of uncertainty, (3) quantifying the components, (4) statistically combining the components and (5) reporting the results. A hypothetical example is presented that employs reasonable estimates for forensic blood alcohol analysis assuming headspace gas chromatography. These computations are easily employed in spreadsheet programs as well. Determining and reporting measurement uncertainty is an important element in establishing fitness-for-purpose. Indeed, the demand for such computations and information from the forensic toxicologist will continue to increase.
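
    As a hedged illustration of the bottom-up budget outlined above, the sketch below combines hypothetical relative standard uncertainties in quadrature and reports an expanded uncertainty with a coverage factor k = 2; the component names and magnitudes are not Gullberg's actual figures, and the same arithmetic transfers directly to a spreadsheet.

```python
# Bottom-up uncertainty budget sketch: combine relative standard uncertainties
# in quadrature and report an expanded uncertainty (k = 2). Values are invented.
import math

components = {                 # relative standard uncertainties (hypothetical)
    "calibrator": 0.008,
    "method reproducibility": 0.015,
    "sampling / dilution": 0.006,
    "reference material": 0.005,
}
combined = math.sqrt(sum(u**2 for u in components.values()))
k = 2.0                        # coverage factor for ~95 % coverage
measured_bac = 0.087           # g/100 mL, hypothetical result
print(f"combined relative u = {combined:.3%}")
print(f"reported: {measured_bac:.3f} +/- {k * combined * measured_bac:.3f} g/100 mL (k=2)")
```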

  16. Network planning under uncertainties

    Science.gov (United States)

    Ho, Kwok Shing; Cheung, Kwok Wai

    2008-11-01

    One of the main focuses of network planning is the optimization of the network resources required to build a network under a certain traffic demand projection. Traditionally, the inputs to this type of network planning problem are treated as deterministic. In reality, the varying traffic requirements and fluctuations in network resources can cause uncertainties in the decision models. The failure to include the uncertainties in the network design process can severely affect the feasibility and economics of the network. Therefore, it is essential to find a solution that can be insensitive to the uncertain conditions during the network planning process. As early as the 1960's, a network planning problem with varying traffic requirements over time had been studied. Up to now, this kind of network planning problem is still being actively researched, especially for VPN network design. Another kind of network planning problem under uncertainty that has been studied actively in the past decade addresses the fluctuations in network resources. One such hotly pursued research topic is survivable network planning. It considers the design of a network, under uncertainties brought by fluctuations in topology, to meet the requirement that the network remains intact up to a certain number of faults occurring anywhere in the network. Recently, the authors proposed a new planning methodology called Generalized Survivable Network that tackles the network design problem under both varying traffic requirements and fluctuations of topology. Although all the above network planning problems handle various kinds of uncertainties, it is hard to find a generic framework under more general uncertainty conditions that allows a more systematic way to solve the problems. With a unified framework, the seemingly diverse models and algorithms can be intimately related and possibly more insights and improvements can be brought out for solving the problem. This motivates us to seek a...

  17. Interpreting uncertainty terms.

    Science.gov (United States)

    Holtgraves, Thomas

    2014-08-01

    Uncertainty terms (e.g., some, possible, good, etc.) are words that do not have a fixed referent and hence are relatively ambiguous. A model is proposed that specifies how, from the hearer's perspective, recognition of facework as a potential motive for the use of an uncertainty term results in a calibration of the intended meaning of that term. Four experiments are reported that examine the impact of face threat, and the variables that affect it (e.g., power), on the manner in which a variety of uncertainty terms (probability terms, quantifiers, frequency terms, etc.) are interpreted. Overall, the results demonstrate that increased face threat in a situation will result in a more negative interpretation of an utterance containing an uncertainty term. That the interpretation of so many different types of uncertainty terms is affected in the same way suggests the operation of a fundamental principle of language use, one with important implications for the communication of risk, subjective experience, and so on.

  18. About uncertainties in practical salinity calculations

    Directory of Open Access Journals (Sweden)

    M. Le Menn

    2009-10-01

    Salinity is a quantity computed, in the present state of the art, from conductivity ratio measurements, knowing the temperature and pressure at the time of the measurement and using the Practical Salinity Scale algorithm of 1978 (PSS-78), which gives practical salinity values S. The uncertainty expected on PSS-78 values is ±0.002, but nothing has ever been detailed about the method to work out this uncertainty, or about the sources of error to include in this calculation. Following a guide edited by the Bureau International des Poids et Mesures (BIPM), this paper assesses, by two independent methods, the uncertainties of salinity values obtained from a laboratory salinometer and from Conductivity-Temperature-Depth (CTD) measurements after laboratory calibration of a conductivity cell. The results show that the part due to the fit of the PSS-78 relations is sometimes as significant as that of the instruments. This is particularly the case with CTD measurements, where correlations between the variables contribute to largely decrease the uncertainty on S, even when the expanded uncertainties on conductivity cell calibrations are well above 0.002 mS/cm. The relations given in this publication, obtained with the normalized GUM method, allow a real analysis of the uncertainty sources, and they can be used in a more general way with instruments having different specifications.
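
    The GUM-style propagation referred to above, including the effect of correlations between inputs, can be sketched generically; the sensitivity coefficients, uncertainties, and correlation coefficient below are invented and are not the PSS-78 values.

```python
# Generic first-order GUM propagation for S = f(R, T, P), showing how a
# correlation between inputs can reduce the combined uncertainty. Invented values.
import numpy as np

c = np.array([40.0, -0.05, 0.001])        # dS/dR, dS/dT, dS/dP (assumed)
u = np.array([5e-5, 2e-3, 1.0])           # standard uncertainties of R, T, P

corr_indep = np.eye(3)
corr_with = np.eye(3)
corr_with[0, 1] = corr_with[1, 0] = 0.8   # assumed R-T correlation

for label, r in [("independent", corr_indep), ("correlated", corr_with)]:
    C = np.outer(u, u) * r                # covariance matrix of the inputs
    u_S = np.sqrt(c @ C @ c)              # combined standard uncertainty of S
    print(f"{label}: u(S) = {u_S:.5f}")
```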

  19. Measurement uncertainty relations

    Energy Technology Data Exchange (ETDEWEB)

    Busch, Paul, E-mail: paul.busch@york.ac.uk [Department of Mathematics, University of York, York (United Kingdom); Lahti, Pekka, E-mail: pekka.lahti@utu.fi [Turku Centre for Quantum Physics, Department of Physics and Astronomy, University of Turku, FI-20014 Turku (Finland); Werner, Reinhard F., E-mail: reinhard.werner@itp.uni-hannover.de [Institut für Theoretische Physik, Leibniz Universität, Hannover (Germany)

    2014-04-15

    Measurement uncertainty relations are quantitative bounds on the errors in an approximate joint measurement of two observables. They can be seen as a generalization of the error/disturbance tradeoff first discussed heuristically by Heisenberg. Here we prove such relations for the case of two canonically conjugate observables like position and momentum, and establish a close connection with the more familiar preparation uncertainty relations constraining the sharpness of the distributions of the two observables in the same state. Both sets of relations are generalized to means of order α rather than the usual quadratic means, and we show that the optimal constants are the same for preparation and for measurement uncertainty. The constants are determined numerically and compared with some bounds in the literature. In both cases, the near-saturation of the inequalities entails that the state (resp. observable) is uniformly close to a minimizing one.

  20. SAGD optimization under uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Gossuin, J.; Naccache, P. [Schlumberger SIS, Abingdon (United Kingdom); Bailley, W.; Couet, B. [Schlumberger-Doll Research, Cambridge, MA, (United States)

    2011-07-01

    In the heavy oil industry, the steam assisted gravity drainage process is often used to enhance oil recovery but this is a costly method and ways to make it more efficient are needed. Multiple methods have been developed to optimize the SAGD process but none of them explicitly considered uncertainty. This paper presents an optimization method in the presence of reservoir uncertainty. This process was tested on an SAGD model where three equi-probable geological models are possible. Preparatory steps were first performed to identify key variables and the optimization model was then proposed. The method was shown to be successful in handling a significant number of uncertainties, optimizing the SAGD process and preventing premature steam channels that can choke production. The optimization method presented herein was successfully applied to an SAGD process and was shown to provide better strategies than sensitivity analysis while handling more complex problems.

  1. Serenity in political uncertainty.

    Science.gov (United States)

    Doumit, Rita; Afifi, Rema A; Devon, Holli A

    2015-01-01

    College students are often faced with academic and personal stressors that threaten their well-being. Added to that may be political and environmental stressors such as acts of violence on the streets, interruptions in schooling, car bombings, targeted religious intimidations, financial hardship, and uncertainty of obtaining a job after graduation. Research on how college students adapt to the latter stressors is limited. The aims of this study were (1) to investigate the associations between stress, uncertainty, resilience, social support, withdrawal coping, and well-being for Lebanese youth during their first year of college and (2) to determine whether these variables predicted well-being. A sample of 293 first-year students enrolled in a private university in Lebanon completed a self-reported questionnaire in the classroom setting. The mean age of sample participants was 18.1 years, with nearly an equal percentage of males and females (53.2% vs 46.8%), who lived with their family (92.5%), and whose family reported high income levels (68.4%). Multiple regression analyses revealed that the best determinants of well-being are resilience, uncertainty, social support, and gender, which together accounted for 54.1% of the variance. Despite living in an environment of frequent violence and political uncertainty, Lebanese youth in this study have a strong sense of well-being and are able to go on with their lives. This research adds to our understanding of how adolescents can adapt to stressors of frequent violence and political uncertainty. Further research is recommended to understand the mechanisms through which young people cope with political uncertainty and violence. PMID:25658930

  2. Comment on "A procedure for the estimation of the numerical uncertainty of CFD calculations based on grid refinement studies" (L. Eça and M. Hoekstra, Journal of Computational Physics 262 (2014) 104-130)

    Science.gov (United States)

    Xing, Tao; Stern, Frederick

    2015-11-01

    Eça and Hoekstra [1] proposed a procedure for the estimation of the numerical uncertainty of CFD calculations based on the least squares root (LSR) method. We believe that the LSR method has potential value for providing an extended Richardson-extrapolation solution verification procedure for mixed monotonic and oscillatory or only oscillatory convergent solutions (based on the usual systematic grid-triplet convergence condition R). Current Richardson-extrapolation solution verification procedures [2-7] are restricted to monotonically convergent solutions with 0 < R < 1. Eça and Hoekstra summarize the LSR procedure and its options in a block diagram, including some options with which we are in disagreement. Compared to the grid-triplet and three-step procedure followed by most solution verification methods (convergence condition followed by error and uncertainty estimates), the LSR method follows a four-grid (minimum) and four-step procedure (error estimate, data range parameter Δϕ, FS, and uncertainty estimate).
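
    For readers unfamiliar with the grid-triplet procedure the comment refers to, a standard Richardson-extrapolation estimate of the observed order of accuracy, the extrapolated value, and a GCI-style uncertainty can be sketched as follows; the three grid solutions, refinement ratio, and safety factor are invented for illustration and are not from either paper.

```python
# Grid-triplet Richardson extrapolation sketch: observed order p, extrapolated
# solution, and a GCI-style relative uncertainty. Inputs are made-up values.
import math

def grid_triplet(phi1, phi2, phi3, r=2.0, Fs=1.25):
    """phi1 = finest-grid solution; r = constant grid refinement ratio."""
    eps21, eps32 = phi2 - phi1, phi3 - phi2
    p = math.log(abs(eps32 / eps21)) / math.log(r)   # observed order of accuracy
    phi_ext = phi1 - eps21 / (r**p - 1)              # Richardson-extrapolated value
    gci = Fs * abs(eps21 / phi1) / (r**p - 1)        # relative uncertainty estimate
    return p, phi_ext, gci

p, phi_ext, gci = grid_triplet(0.9713, 0.9702, 0.9658)
print(f"observed order p = {p:.2f}, extrapolated = {phi_ext:.5f}, GCI = {gci:.2%}")
```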

  3. Weighted Uncertainty Relations

    Science.gov (United States)

    Xiao, Yunlong; Jing, Naihuan; Li-Jost, Xianqing; Fei, Shao-Ming

    2016-03-01

    Recently, Maccone and Pati have given two stronger uncertainty relations based on the sum of variances, one of which is nontrivial when the quantum state is not an eigenstate of the sum of the observables. We derive a family of weighted uncertainty relations to provide an optimal lower bound for all situations and remove the restriction on the quantum state. A generalization to multi-observable cases is also given, and an optimal lower bound for the weighted sum of the variances is obtained in the general quantum setting.

  4. Uncertainty in artificial intelligence

    CERN Document Server

    Levitt, TS; Lemmer, JF; Shachter, RD

    1990-01-01

    Clearly illustrated in this volume is the current relationship between Uncertainty and AI. It has been said that research in AI revolves around five basic questions asked relative to some particular domain: What knowledge is required? How can this knowledge be acquired? How can it be represented in a system? How should this knowledge be manipulated in order to provide intelligent behavior? How can the behavior be explained? In this volume, all of these questions are addressed. From the perspective of the relationship of uncertainty to the basic questions of AI, the book divides naturally ...

  5. Quantification of Uncertainties in Integrated Spacecraft System Models Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed effort is to investigate a novel uncertainty quantification (UQ) approach based on non-intrusive polynomial chaos (NIPC) for computationally efficient...

  6. Investment and uncertainty

    DEFF Research Database (Denmark)

    Greasley, David; Madsen, Jakob B.

    2006-01-01

    A severe collapse of fixed capital formation distinguished the onset of the Great Depression from other investment downturns between the world wars. Using a model estimated for the years 1890-2000, we show that the expected profitability of capital measured by Tobin's q, and the uncertainty...

  7. Risks, uncertainty, vagueness

    International Nuclear Information System (INIS)

    The notion of 'risk' is discussed in its social and technological contexts, leading to an investigation of the terms factuality, hypotheticality, uncertainty, and vagueness, and to the problems of acceptance and acceptability especially in the context of political decision finding. (DG)

  8. Proportional Representation with Uncertainty

    OpenAIRE

    Francesco De Sinopoli; Giovanna Iannantuoni; Elena Manzoni; Carlos Pimienta

    2014-01-01

    We introduce a model with strategic voting in a parliamentary election with proportional representation and uncertainty about voters’ preferences. In any equilibrium of the model, most voters only vote for those parties whose positions are extreme. In the resulting parliament, a consensus government forms and the policy maximizing the sum of utilities of the members of the government is implemented.

  9. Risk, Uncertainty, and Entrepreneurship

    DEFF Research Database (Denmark)

    Koudstaal, martin; Sloof, Randolph; Van Praag, Mirjam

    2015-01-01

    Theory predicts that entrepreneurs have distinct attitudes toward risk and uncertainty, but empirical evidence is mixed. To better understand these mixed results, we perform a large “lab-in-the-field” experiment comparing entrepreneurs to managers (a suitable comparison group) and employees (n = ...

  10. Manage employee uncertainty

    Institute of Scientific and Technical Information of China (English)

    范梦璇

    2015-01-01

    Employee change-related uncertainty arises because, under the current continually changing business environment, organizations also have to change; the changes include strategic direction, structure, and staffing levels to help the company stay competitive (Armenakis & Bedeian, 1999). However, these...

  11. Justice under uncertainty

    NARCIS (Netherlands)

    Cettolin, E.; Riedl, A.M.

    2013-01-01

    An important element for the public support of policies is their perceived justice. At the same time most policy choices have uncertain outcomes. We report the results of a first experiment investigating just allocations of resources when some recipients are exposed to uncertainty. Although, under c

  12. Inspection Uncertainty and Model Uncertainty Updating for Ship Structures Subjected to Corrosion Deterioration

    Institute of Scientific and Technical Information of China (English)

    LI Dian-qing; ZHANG Sheng-kun

    2004-01-01

    The classical probability theory cannot effectively quantify the parameter uncertainty in probability of detection. Furthermore, the conventional data analytic method and expert judgment method fail to handle the problem of model uncertainty updating with the information from nondestructive inspection. To overcome these disadvantages, a Bayesian approach was proposed to quantify the parameter uncertainty in probability of detection. Furthermore, the formulae of the multiplication factors to measure the statistical uncertainties in the probability of detection following the Weibull distribution were derived. A Bayesian updating method was applied to compute the posterior probabilities of model weights and the posterior probability density functions of distribution parameters of probability of detection. A total probability model method was proposed to analyze the problem of multi-layered model uncertainty updating. This method was then applied to the problem of multi-layered corrosion model uncertainty updating for ship structures. The results indicate that the proposed method is very effective in analyzing the problem of multi-layered model uncertainty updating.
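
    A much-simplified sketch of the kind of Bayesian model-weight update described above: posterior weights for two competing probability-of-detection (POD) curves given binary inspection outcomes. The Weibull-type POD parameterization, the candidate parameter sets, and the inspection data are all hypothetical, not the paper's formulation.

```python
# Simplified Bayesian model-weight update for competing POD models (illustrative).
import numpy as np

def pod_weibull(a, scale, shape):
    """POD(a) for a Weibull-type curve in defect size a."""
    return 1.0 - np.exp(-(a / scale) ** shape)

# inspection data: defect sizes (mm) and whether each was detected (1) or missed (0)
sizes    = np.array([1.0, 1.5, 2.0, 2.5, 3.0, 4.0, 5.0])
detected = np.array([0,   0,   1,   1,   1,   1,   1  ])

models = {"optimistic": (1.8, 3.0), "conservative": (2.8, 2.0)}   # (scale, shape)
prior = {name: 0.5 for name in models}

posterior = {}
for name, (scale, shape) in models.items():
    p = pod_weibull(sizes, scale, shape)
    likelihood = np.prod(np.where(detected == 1, p, 1.0 - p))
    posterior[name] = prior[name] * likelihood
total = sum(posterior.values())
for name in posterior:
    print(f"{name}: posterior weight {posterior[name] / total:.2f}")
```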

  13. Strategy under uncertainty.

    Science.gov (United States)

    Courtney, H; Kirkland, J; Viguerie, P

    1997-01-01

    At the heart of the traditional approach to strategy lies the assumption that by applying a set of powerful analytic tools, executives can predict the future of any business accurately enough to allow them to choose a clear strategic direction. But what happens when the environment is so uncertain that no amount of analysis will allow us to predict the future? What makes for a good strategy in highly uncertain business environments? The authors, consultants at McKinsey & Company, argue that uncertainty requires a new way of thinking about strategy. All too often, they say, executives take a binary view: either they underestimate uncertainty to come up with the forecasts required by their companies' planning or capital-budgeting processes, or they overestimate it, abandon all analysis, and go with their gut instinct. The authors outline a new approach that begins by making a crucial distinction among four discrete levels of uncertainty that any company might face. They then explain how a set of generic strategies--shaping the market, adapting to it, or reserving the right to play at a later time--can be used in each of the four levels. And they illustrate how these strategies can be implemented through a combination of three basic types of actions: big bets, options, and no-regrets moves. The framework can help managers determine which analytic tools can inform decision making under uncertainty--and which cannot. At a broader level, it offers executives a discipline for thinking rigorously and systematically about uncertainty and its implications for strategy.

  14. Fluid flow dynamics under location uncertainty

    CERN Document Server

    Mémin, Etienne

    2013-01-01

    We present a derivation of a stochastic model of Navier Stokes equations that relies on a decomposition of the velocity fields into a differentiable drift component and a time uncorrelated uncertainty random term. This type of decomposition is reminiscent in spirit to the classical Reynolds decomposition. However, the random velocity fluctuations considered here are not differentiable with respect to time, and they must be handled through stochastic calculus. The dynamics associated with the differentiable drift component is derived from a stochastic version of the Reynolds transport theorem. It includes in its general form an uncertainty dependent "subgrid" bulk formula that cannot be immediately related to the usual Boussinesq eddy viscosity assumption constructed from thermal molecular agitation analogy. This formulation, emerging from uncertainties on the fluid parcels location, explains with another viewpoint some subgrid eddy diffusion models currently used in computational fluid dynamics or in geophysi...

  15. Uncertainty Analysis via Failure Domain Characterization: Polynomial Requirement Functions

    Science.gov (United States)

    Crespo, Luis G.; Munoz, Cesar A.; Narkawicz, Anthony J.; Kenny, Sean P.; Giesy, Daniel P.

    2011-01-01

    This paper proposes an uncertainty analysis framework based on the characterization of the uncertain parameter space. This characterization enables the identification of worst-case uncertainty combinations and the approximation of the failure and safe domains with a high level of accuracy. Because these approximations are comprised of subsets of readily computable probability, they enable the calculation of arbitrarily tight upper and lower bounds to the failure probability. A Bernstein expansion approach is used to size hyper-rectangular subsets while a sum of squares programming approach is used to size quasi-ellipsoidal subsets. These methods are applicable to requirement functions whose functional dependency on the uncertainty is a known polynomial. Some of the most prominent features of the methodology are the substantial desensitization of the calculations from the uncertainty model assumed (i.e., the probability distribution describing the uncertainty) as well as the accommodation for changes in such a model with a practically insignificant amount of computational effort.

  16. SENSIT: a cross-section and design sensitivity and uncertainty analysis code

    International Nuclear Information System (INIS)

    SENSIT computes the sensitivity and uncertainty of a calculated integral response (such as a dose rate) due to input cross sections and their uncertainties. Sensitivity profiles are computed for neutron and gamma-ray reaction cross sections of standard multigroup cross section sets and for secondary energy distributions (SEDs) of multigroup scattering matrices. In the design sensitivity mode, SENSIT computes changes in an integral response due to design changes and gives the appropriate sensitivity coefficients. Cross section uncertainty analyses are performed for three types of input data uncertainties: cross-section covariance matrices for pairs of multigroup reaction cross sections, spectral shape uncertainty parameters for secondary energy distributions (integral SED uncertainties), and covariance matrices for energy-dependent response functions. For all three types of data uncertainties SENSIT computes the resulting variance and estimated standard deviation in an integral response of interest, on the basis of generalized perturbation theory. SENSIT attempts to be more comprehensive than earlier sensitivity analysis codes, such as SWANLAKE
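
    The variance propagation such codes perform is, at its core, the "sandwich rule" var(R) = S^T C S, where S is the sensitivity profile of the integral response and C the cross-section covariance matrix. The sketch below uses an invented four-group sensitivity vector and covariance matrix purely to make the arithmetic concrete.

```python
# Minimal "sandwich rule" sketch: response variance from a sensitivity profile
# and a covariance matrix. The four-group numbers are made up for illustration.
import numpy as np

S = np.array([0.12, -0.05, 0.30, 0.08])        # relative sensitivities, 4 groups
rel_std = np.array([0.03, 0.05, 0.02, 0.10])   # relative standard deviations
corr = np.array([[1.0, 0.4, 0.0, 0.0],
                 [0.4, 1.0, 0.2, 0.0],
                 [0.0, 0.2, 1.0, 0.1],
                 [0.0, 0.0, 0.1, 1.0]])
C = np.outer(rel_std, rel_std) * corr          # relative covariance matrix

var_R = S @ C @ S
print(f"relative uncertainty in response: {np.sqrt(var_R):.3%}")
```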

  17. Estimating uncertainty of inference for validation

    Energy Technology Data Exchange (ETDEWEB)

    Booker, Jane M [Los Alamos National Laboratory; Langenbrunner, James R [Los Alamos National Laboratory; Hemez, Francois M [Los Alamos National Laboratory; Ross, Timothy J [UNM

    2010-09-30

    We present a validation process based upon the concept that validation is an inference-making activity. This has always been true, but the association has not been as important before as it is now. Previously, theory had been confirmed by more data, and predictions were possible based on data. The process today is to infer from theory to code and from code to prediction, making the role of prediction somewhat automatic, and a machine function. Validation is defined as determining the degree to which a model and code is an accurate representation of experimental test data. Imbedded in validation is the intention to use the computer code to predict. To predict is to accept the conclusion that an observable final state will manifest; therefore, prediction is an inference whose goodness relies on the validity of the code. Quantifying the uncertainty of a prediction amounts to quantifying the uncertainty of validation, and this involves the characterization of uncertainties inherent in theory/models/codes and the corresponding data. An introduction to inference making and its associated uncertainty is provided as a foundation for the validation problem. A mathematical construction for estimating the uncertainty in the validation inference is then presented, including a possibility distribution constructed to represent the inference uncertainty for validation under uncertainty. The estimation of inference uncertainty for validation is illustrated using data and calculations from Inertial Confinement Fusion (ICF). The ICF measurements of neutron yield and ion temperature were obtained for direct-drive inertial fusion capsules at the Omega laser facility. The glass capsules, containing the fusion gas, were systematically selected with the intent of establishing a reproducible baseline of high-yield 10^13-10^14 neutron output. The deuterium-tritium ratio in these experiments was varied to study its influence upon yield. This paper on validation inference is the...

  18. Uncertainty Analysis of Air Radiation for Lunar Return Shock Layers

    Science.gov (United States)

    Kleb, Bil; Johnston, Christopher O.

    2008-01-01

    By leveraging a new uncertainty markup technique, two risk analysis methods are used to compute the uncertainty of lunar-return shock layer radiation predicted by the High temperature Aerothermodynamic Radiation Algorithm (HARA). The effects of epistemic uncertainty, or uncertainty due to a lack of knowledge, are considered for the following modeling parameters: atomic line oscillator strengths, atomic line Stark broadening widths, atomic photoionization cross sections, negative ion photodetachment cross sections, molecular band oscillator strengths, and electron impact excitation rates. First, a simplified shock layer problem consisting of two constant-property equilibrium layers is considered. The results of this simplified problem show that the atomic nitrogen oscillator strengths and Stark broadening widths in both the vacuum ultraviolet and infrared spectral regions, along with the negative ion continuum, are the dominant uncertainty contributors. Next, three variable-property stagnation-line shock layer cases are analyzed: a typical lunar return case and two Fire II cases. For the near-equilibrium lunar return and Fire 1643-second cases, the resulting uncertainties are very similar to the simplified case. Conversely, the relatively nonequilibrium 1636-second case shows significantly larger influence from electron impact excitation rates of both atoms and molecules. For all cases, the total uncertainty in radiative heat flux to the wall due to epistemic uncertainty in modeling parameters is 30%, as opposed to the erroneously small uncertainty levels (plus or minus 6%) found when treating model parameter uncertainties as aleatory (due to chance) instead of epistemic (due to lack of knowledge).

  19. Uncertainty Principle Respects Locality

    CERN Document Server

    Wang, Dongsheng

    2013-01-01

    The notion of nonlocality implicitly implies there might be some kind of spooky action at a distance in nature; however, the validity of quantum mechanics has been well tested up to now. In this work it is argued that the notion of nonlocality is physically improper and that the basic principle of locality in nature is well respected by quantum mechanics, namely, through the uncertainty principle. We show that the quantum bound on the Clauser, Horne, Shimony, and Holt (CHSH) inequality can be recovered from the uncertainty relation in a multipartite setting, and that the same bound exists classically, which indicates that nonlocality does not capture the essence of the quantum and thus does not properly distinguish quantum mechanics from classical mechanics. We further argue that the super-quantum correlation demonstrated by the nonlocal box is not physically comparable with the quantum one; as a result, the physical foundation for the existence of nonlocality is falsified. The origin of the quantum structure of nature still remains to be exp...

  20. Aeroelastic/Aeroservoelastic Uncertainty and Reliability of Advanced Aerospace Vehicles in Flight and Ground Operations Project

    Data.gov (United States)

    National Aeronautics and Space Administration — ASSURE - Aeroelastic / Aeroservoelastic (AE/ASE) Uncertainty and Reliability Engineering capability - is a set of probabilistic computer programs for isolating...

  1. Uncertainty in artificial intelligence

    CERN Document Server

    Shachter, RD; Henrion, M; Lemmer, JF

    1990-01-01

    This volume, like its predecessors, reflects the cutting edge of research on the automation of reasoning under uncertainty. A more pragmatic emphasis is evident, for although some papers address fundamental issues, the majority address practical issues. Topics include the relations between alternative formalisms (including possibilistic reasoning), Dempster-Shafer belief functions, non-monotonic reasoning, Bayesian and decision theoretic schemes, and new inference techniques for belief nets. New techniques are applied to important problems in medicine, vision, robotics, and natural language understanding.

  2. Uncertainty propagation in locally damped dynamic systems

    OpenAIRE

    Cortes Mochales, Lluis; Ferguson, Neil S.; Bhaskar, Atul

    2012-01-01

    In the field of stochastic structural dynamics, perturbation methods are widely used to estimate the response statistics of uncertain systems. When large built up systems are to be modelled in the mid-frequency range, perturbation methods are often combined with finite element model reduction techniques in order to considerably reduce the computation time of the response. Existing methods based on Component Mode Synthesis(CMS) allow the uncertainties in the system parameters to be treated ...

  3. Uncertainty in magnetic activity indices

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Magnetic activity indices are widely used in theoretical studies of solar-terrestrial coupling and space weather prediction. However, the indices suffer from various uncertainties, which limit their application and can even mislead to incorrect conclusions. In this paper we analyze the three most popular indices: Kp, AE and Dst. Three categories of uncertainties in magnetic indices are discussed: "data uncertainty" originating from inadequate data processing, "station uncertainty" caused by incomplete station coverage, and "physical uncertainty" stemming from unclear physical mechanisms. A comparison between magnetic disturbances and the related indices indicates that the residual Sq will cause an uncertainty of 1-2 in the K measurement, the uncertainty in saturated AE is as much as 50%, and the uncertainty in the Dst index caused by the partial ring currents is about a half of the partial ring current.

  4. Uncertainties in parton distribution functions

    Energy Technology Data Exchange (ETDEWEB)

    Martin, A.D. [Deptartment of Physics, University of Durham, Durham DH1 3LE (United Kingdom); Roberts, R.G. [Rutherford Appleton Laboratory, Chilton, Didcot, Oxon OX11 0QX (United Kingdom); Stirling, W.J. [Department of Physics, University of Durham, Durham DH1 3LE (United Kingdom); Department of Mathematical Sciences, University of Durham, Durham DH1 3LE (United Kingdom); Thorne, R.S. [Jesus College, University of Oxford, Oxford OX1 3DW (United Kingdom)

    2000-05-01

    We discuss the uncertainty in the predictions for hard scattering cross sections at hadron colliders due to uncertainties in the input parton distributions, using W production at the LHC as an example. (author)

  5. Legal uncertainty and contractual innovation

    OpenAIRE

    Yaron Leitner

    2005-01-01

    Although innovative contracts are important for economic growth, when firms face uncertainty as to whether contracts will be enforced, they may choose not to innovate. Legal uncertainty can arise if a judge interprets the terms of a contract in a way that is antithetical to the intentions of the parties to the contract. Or sometimes a judge may understand the contract but overrule it for other reasons. How does legal uncertainty affect firms’ decisions to innovate? In “Legal Uncertainty and C...

  6. The Uncertainties of Risk Management

    DEFF Research Database (Denmark)

    Vinnari, Eija; Skærbæk, Peter

    2014-01-01

    ... These include uncertainties relating to legal aspects of risk management solutions, in particular the issue concerning which types of document are considered legally valid; uncertainties relating to the definition and operationalisation of risk management; and uncertainties relating to the resources available...

  7. Gravitational tests of the generalized uncertainty principle

    Energy Technology Data Exchange (ETDEWEB)

    Scardigli, Fabio [American University of the Middle East, Department of Mathematics, College of Engineering, P.O. Box 220, Dasman (Kuwait); Politecnico di Milano, Dipartimento di Matematica, Milan (Italy); Casadio, Roberto [Alma Mater Universita di Bologna, Dipartimento di Fisica e Astronomia, Bologna (Italy); INFN, Sezione di Bologna, Bologna (Italy)

    2015-09-15

    We compute the corrections to the Schwarzschild metric necessary to reproduce the Hawking temperature derived from a generalized uncertainty principle (GUP), so that the GUP deformation parameter is directly linked to the deformation of the metric. Using this modified Schwarzschild metric, we compute corrections to the standard general relativistic predictions for the light deflection and perihelion precession, both for planets in the solar system and for binary pulsars. This analysis allows us to set bounds for the GUP deformation parameter from well-known astronomical measurements. (orig.)

  8. Uncertainty Quantification in Climate Modeling and Projection

    Energy Technology Data Exchange (ETDEWEB)

    Qian, Yun; Jackson, Charles S.; Giorgi, Filippo; Booth, Ben; Duan, Qingyun; Forest , Chris; Higdon, Dave; Hou, Zhangshuan; Huerta, Gabriel

    2016-05-01

    The projection of future climate is one of the most complex problems undertaken by the scientific community. Although scientists have been striving to better understand the physical basis of the climate system and to improve climate models, the overall uncertainty in projections of future climate has not been significantly reduced (e.g., from the IPCC AR4 to AR5). With the rapid increase of complexity in Earth system models, reducing uncertainties in climate projections becomes extremely challenging. Since uncertainties always exist in climate models, interpreting the strengths and limitations of future climate projections is key to evaluating risks, and climate change information for use in Vulnerability, Impact, and Adaptation (VIA) studies should be provided with both well-characterized and well-quantified uncertainty. The workshop aimed at providing participants, many of them from developing countries, information on strategies to quantify the uncertainty in climate model projections and assess the reliability of climate change information for decision-making. The program included a mixture of lectures on fundamental concepts in Bayesian inference and sampling, applications, and hands-on computer laboratory exercises employing software packages for Bayesian inference, Markov Chain Monte Carlo methods, and global sensitivity analyses. The lectures covered a range of scientific issues underlying the evaluation of uncertainties in climate projections, such as the effects of uncertain initial and boundary conditions, uncertain physics, and limitations of observational records. Progress in quantitatively estimating uncertainties in hydrologic, land surface, and atmospheric models at both regional and global scales was also reviewed. The application of Uncertainty Quantification (UQ) concepts to coupled climate system models is still in its infancy. The Coupled Model Intercomparison Project (CMIP) multi-model ensemble currently represents the primary data for

  9. Strain Gauge Balance Uncertainty Analysis at NASA Langley: A Technical Review

    Science.gov (United States)

    Tripp, John S.

    1999-01-01

    This paper describes a method to determine the uncertainties of measured forces and moments from multi-component force balances used in wind tunnel tests. A multivariate regression technique is first employed to estimate the uncertainties of the six balance sensitivities and 156 interaction coefficients derived from established balance calibration procedures. These uncertainties are then employed to calculate the uncertainties of force-moment values computed from observed balance output readings obtained during tests. Confidence and prediction intervals are obtained for each computed force and moment as functions of the actual measurands. Techniques are discussed for separate estimation of balance bias and precision uncertainties.
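
    The underlying statistical step, fitting a calibration by least squares and attaching confidence/prediction intervals to values computed from new readings, can be sketched for a single output and a single load term; the real balance involves six outputs and 156 interaction coefficients, and the data below are simulated, not NASA Langley calibration data.

```python
# Least-squares calibration with a 95% prediction interval for a new reading
# (single output vs. single load); data are simulated for illustration only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
load = np.linspace(0, 100, 25)                           # applied calibration loads
reading = 0.02 * load + rng.normal(0, 0.03, load.size)   # simulated bridge output

X = np.column_stack([np.ones_like(load), load])          # intercept + slope model
beta, res, *_ = np.linalg.lstsq(X, reading, rcond=None)
dof = load.size - 2
s2 = res[0] / dof                                        # residual variance
XtX_inv = np.linalg.inv(X.T @ X)

x_new = np.array([1.0, 55.0])                            # predict at 55 load units
y_hat = x_new @ beta
se_pred = np.sqrt(s2 * (1.0 + x_new @ XtX_inv @ x_new))
t = stats.t.ppf(0.975, dof)
print(f"predicted reading {y_hat:.4f} +/- {t * se_pred:.4f} (95% prediction interval)")
```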

  10. Risk, Uncertainty and Entrepreneurship

    DEFF Research Database (Denmark)

    Koudstaal, Martin; Sloof, Randolph; Van Praag, Mirjam

    Theory predicts that entrepreneurs have distinct attitudes towards risk and uncertainty, but empirical evidence is mixed. To better understand the unique behavioral characteristics of entrepreneurs and the causes of these mixed results, we perform a large ‘lab-in-the-field’ experiment comparing entrepreneurs to managers – a suitable comparison group – and employees (n = 2288). The results indicate that entrepreneurs perceive themselves as less risk averse than managers and employees, in line with common wisdom. However, when using experimental incentivized measures, the differences are subtler...

  11. Principles of Uncertainty

    CERN Document Server

    Kadane, Joseph B

    2011-01-01

    An intuitive and mathematical introduction to subjective probability and Bayesian statistics. An accessible, comprehensive guide to the theory of Bayesian statistics, Principles of Uncertainty presents the subjective Bayesian approach, which has played a pivotal role in game theory, economics, and the recent boom in Markov Chain Monte Carlo methods. Both rigorous and friendly, the book contains: introductory chapters examining each new concept or assumption; just-in-time mathematics -- the presentation of ideas just before they are applied; summary and exercises at the end of each chapter; discus...

  12. Collective Uncertainty Entanglement Test

    CERN Document Server

    Rudnicki, Łukasz; Życzkowski, Karol

    2011-01-01

    For a given pure state of a composite quantum system we analyze the product of its projections onto a set of locally orthogonal separable pure states. We derive a bound for this product analogous to the entropic uncertainty relations. For bipartite systems the bound is saturated for maximally entangled states, and it allows us to construct a family of entanglement measures, which we shall call collectibility. As these quantities are experimentally accessible, the approach advocated contributes to the task of experimental quantification of quantum entanglement, while for a three-qubit system it is capable of identifying genuine three-party entanglement.

  13. Uncertainty Analysis of Light Water Reactor Fuel Lattices

    Directory of Open Access Journals (Sweden)

    C. Arenas

    2013-01-01

    The study explored the calculation of uncertainty based on available cross-section covariance data and computational tools at the fuel lattice level, which includes pin cell and fuel assembly models. Uncertainty variations due to temperature changes and different fuel compositions are the main focus of this analysis. Selected assemblies and unit pin cells were analyzed according to the OECD LWR UAM benchmark specifications. Criticality and uncertainty analyses were performed using the TSUNAMI-2D sequence in SCALE 6.1. It was found that uncertainties increase with increasing temperature, while kinf decreases. This increase in the uncertainty is due to the increase in sensitivity of the largest contributing reaction to the uncertainty, namely the neutron capture reaction 238U(n, γ), due to Doppler broadening. In addition, three types of fuel material composition (UOX, MOX, and UOX-Gd2O3) were analyzed. A remarkable increase in the uncertainty in kinf was observed for the case of MOX fuel. The increase in the uncertainty of kinf in MOX fuel was nearly twice the corresponding value in UOX fuel. The neutron-nuclide reaction of 238U, mainly inelastic scattering (n, n′), contributed the most to the uncertainties in the MOX fuel, which shifts the neutron spectrum to higher energy compared to the UOX fuel.

  14. Oil price uncertainty in Canada

    Energy Technology Data Exchange (ETDEWEB)

    Elder, John [Department of Finance and Real Estate, 1272 Campus Delivery, Colorado State University, Fort Collins, CO 80523 (United States); Serletis, Apostolos [Department of Economics, University of Calgary, Calgary, Alberta (Canada)

    2009-11-15

    Bernanke [Bernanke, Ben S. Irreversibility, uncertainty, and cyclical investment. Quarterly Journal of Economics 98 (1983), 85-106.] shows how uncertainty about energy prices may induce optimizing firms to postpone investment decisions, thereby leading to a decline in aggregate output. Elder and Serletis [Elder, John and Serletis, Apostolos. Oil price uncertainty.] find empirical evidence that uncertainty about oil prices has tended to depress investment in the United States. In this paper we assess the robustness of these results by investigating the effects of oil price uncertainty in Canada. Our results are remarkably similar to existing results for the United States, providing additional evidence that uncertainty about oil prices may provide another explanation for why the sharp oil price declines of 1985 failed to produce rapid output growth. Impulse-response analysis suggests that uncertainty about oil prices may tend to reinforce the negative response of output to positive oil shocks. (author)

  15. Integrating Out Astrophysical Uncertainties

    CERN Document Server

    Fox, Patrick J; Weiner, Neal

    2010-01-01

    Underground searches for dark matter involve a complicated interplay of particle physics, nuclear physics, atomic physics and astrophysics. We attempt to remove the uncertainties associated with astrophysics by developing the means to map the observed signal in one experiment directly into a predicted rate at another. We argue that it is possible to make experimental comparisons that are completely free of astrophysical uncertainties by focusing on {\em integral} quantities, such as $g(v_{min})=\int_{v_{min}} dv\, f(v)/v $ and $\int_{v_{thresh}} dv\, v g(v)$. Direct comparisons are possible when the $v_{min}$ space probed by different experiments overlap. As examples, we consider the possible dark matter signals at CoGeNT, DAMA and CRESST-Oxygen. We find that the expected rate from CoGeNT in the XENON10 experiment is higher than observed, unless scintillation light output is low. Moreover, we determine that S2-only analyses are constraining, unless the charge yield $Q_y< 2.4 {\, \rm electrons/keV}$. For DAMA t...
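
    The halo-independent comparison rests on evaluating integral quantities such as g(v_min) = ∫_{v_min} dv f(v)/v. The sketch below is an editorial illustration only: it assumes a truncated Maxwellian f(v) with typical halo parameters (v0 and v_esc are assumptions, not values from the paper) and evaluates g(v_min) numerically.

      import numpy as np

      # Assumed (illustrative) truncated Maxwellian speed distribution f(v).
      v0, v_esc = 220.0, 544.0            # km/s, typical halo values (assumption)
      v = np.linspace(1.0, v_esc, 4000)   # km/s
      f = v**2 * np.exp(-(v / v0) ** 2)

      def trapz(y, x):
          """Simple trapezoidal rule, kept local to avoid version-specific numpy names."""
          return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

      f = f / trapz(f, v)                 # normalise so that integral of f(v) dv = 1

      def g(v_min):
          """g(v_min) = int_{v_min}^{v_esc} dv f(v) / v."""
          m = v >= v_min
          return trapz(f[m] / v[m], v[m])

      for v_min in (100.0, 300.0, 500.0):
          print(f"g({v_min:.0f} km/s) = {g(v_min):.3e} (km/s)^-1")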

  16. LCA data quality: sensitivity and uncertainty analysis.

    Science.gov (United States)

    Guo, M; Murphy, R J

    2012-10-01

    Life cycle assessment (LCA) data quality issues were investigated using case studies on products from starch-polyvinyl alcohol based biopolymers and petrochemical alternatives. The time horizon chosen for the characterization models was shown to be an important sensitive parameter for the environmental profiles of all the polymers. In the global warming potential and the toxicity potential categories, the comparison between biopolymers and petrochemical counterparts altered as the time horizon extended from 20 years to infinite time. These case studies demonstrated that the use of a single time horizon provides only one perspective on the LCA outcomes, which could introduce an inadvertent bias, especially in toxicity impact categories; dynamic LCA characterization models with varying time horizons are therefore recommended as a measure of robustness for LCAs, especially comparative assessments. This study also presents an approach to integrate statistical methods into LCA models for analyzing uncertainty in industrial and computer-simulated datasets. We calibrated probabilities for the LCA outcomes for biopolymer products arising from uncertainty in the inventory and from data variation characteristics; this has enabled us to assign confidence to the LCIA outcomes in specific impact categories for the biopolymer vs. petrochemical polymer comparisons undertaken. Uncertainty analysis combined with the sensitivity analysis carried out in this study has led to a transparent increase in confidence in the LCA findings. We conclude that LCAs lacking explicit interpretation of the degree of uncertainty and sensitivities are of limited value as robust evidence for decision making or comparative assertions.
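
    A minimal sketch of the kind of Monte Carlo propagation described here is given below. The inventory items, their lognormal spreads and the characterisation factors are hypothetical placeholders, not data from the study.

      import numpy as np

      rng = np.random.default_rng(1)
      n = 20_000  # Monte Carlo samples

      # Hypothetical inventory for 1 kg of polymer: (item, mean amount, relative sd,
      # GWP characterisation factor in kg CO2-eq per unit).  All numbers are invented.
      inventory = [
          ("electricity, kWh", 2.0, 0.10, 0.50),
          ("steam, MJ",        8.0, 0.15, 0.07),
          ("feedstock, kg",    1.1, 0.05, 1.80),
      ]

      gwp = np.zeros(n)
      for name, mean, rel_sd, cf in inventory:
          # Lognormal sampling keeps amounts positive, a common choice in LCA studies.
          sigma = np.sqrt(np.log(1.0 + rel_sd**2))
          mu = np.log(mean) - 0.5 * sigma**2
          gwp += rng.lognormal(mu, sigma, n) * cf

      lo, med, hi = np.percentile(gwp, [2.5, 50, 97.5])
      print(f"GWP: median {med:.2f} kg CO2-eq, 95% interval [{lo:.2f}, {hi:.2f}]")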

  17. Uncertainty Quantification for Cargo Hold Fires

    CERN Document Server

    DeGennaro, Anthony M; Martinelli, Luigi; Rowley, Clarence W

    2015-01-01

    The purpose of this study is twofold -- first, to introduce the application of high-order discontinuous Galerkin methods to buoyancy-driven cargo hold fire simulations, second, to explore statistical variation in the fluid dynamics of a cargo hold fire given parameterized uncertainty in the fire source location and temperature. Cargo hold fires represent a class of problems that require highly-accurate computational methods to simulate faithfully. Hence, we use an in-house discontinuous Galerkin code to treat these flows. Cargo hold fires also exhibit a large amount of uncertainty with respect to the boundary conditions. Thus, the second aim of this paper is to quantify the resulting uncertainty in the flow, using tools from the uncertainty quantification community to ensure that our efforts require a minimal number of simulations. We expect that the results of this study will provide statistical insight into the effects of fire location and temperature on cargo fires, and also assist in the optimization of f...

  19. Parameter Uncertainty for Aircraft Aerodynamic Modeling using Recursive Least Squares

    Science.gov (United States)

    Grauer, Jared A.; Morelli, Eugene A.

    2016-01-01

    A real-time method was demonstrated for determining accurate uncertainty levels of stability and control derivatives estimated using recursive least squares and time-domain data. The method uses a recursive formulation of the residual autocorrelation to account for colored residuals, which are routinely encountered in aircraft parameter estimation and change the predicted uncertainties. Simulation data and flight test data for a subscale jet transport aircraft were used to demonstrate the approach. Results showed that the corrected uncertainties matched the observed scatter in the parameter estimates, and did so more accurately than conventional uncertainty estimates that assume white residuals. Only small differences were observed between batch estimates and recursive estimates at the end of the maneuver. It was also demonstrated that the autocorrelation could be reduced to a small number of lags to minimize computation and memory storage requirements without significantly degrading the accuracy of predicted uncertainty levels.
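
    The sketch below illustrates the underlying idea with a deliberately simplified stand-in: an ordinary least-squares fit whose parameter covariance is inflated by an AR(1) factor estimated from the lag-1 residual autocorrelation. It is not the recursive formulation of the paper, but it shows why colored residuals enlarge the predicted uncertainties.

      import numpy as np

      rng = np.random.default_rng(0)

      # Simulated regression data with AR(1) (colored) residuals.
      N = 500
      X = np.column_stack([np.ones(N), rng.uniform(-1, 1, N)])
      theta_true = np.array([0.3, -1.2])
      e = np.zeros(N)
      for k in range(1, N):                      # colored noise: e_k = 0.8 e_{k-1} + w_k
          e[k] = 0.8 * e[k - 1] + 0.05 * rng.standard_normal()
      y = X @ theta_true + e

      # Ordinary least-squares estimate and its "white-noise" covariance.
      XtX_inv = np.linalg.inv(X.T @ X)
      theta = XtX_inv @ X.T @ y
      r = y - X @ theta
      s2 = r @ r / (N - X.shape[1])
      cov_white = s2 * XtX_inv

      # Simple correction for colored residuals via the lag-1 autocorrelation
      # (an AR(1) variance-inflation factor, a crude stand-in for the
      # autocorrelation-based correction described in the abstract).
      rho = np.corrcoef(r[:-1], r[1:])[0, 1]
      cov_colored = cov_white * (1 + rho) / (1 - rho)

      print("white-residual sigmas :", np.sqrt(np.diag(cov_white)))
      print("corrected sigmas      :", np.sqrt(np.diag(cov_colored)))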

  20. On the worst case uncertainty and its evaluation

    Science.gov (United States)

    Fabbiano, L.; Giaquinto, N.; Savino, M.; Vacca, G.

    2016-02-01

    The paper is a review of the worst case uncertainty (WCU) concept, neglected in the Guide to the Expression of Uncertainty in Measurement (GUM), but necessary for a correct uncertainty assessment in a number of practical cases involving distributions with compact support. First, it is highlighted that knowledge of the WCU is necessary to choose a sensible coverage factor, associated with a sensible coverage probability: the Maximum Acceptable Coverage Factor (MACF) is introduced as a convenient index to guide this choice. Second, propagation rules for the worst-case uncertainty are provided in matrix and scalar form. It is highlighted that when WCU propagation cannot be computed, the Monte Carlo approach is the only way to obtain a correct expanded uncertainty assessment, in contrast to what can be inferred from the GUM. Third, examples of applications of the formulae to ordinary instruments and measurements are given. An example taken from the GUM is also discussed, underlining some inconsistencies in it.
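
    As a hedged illustration of the scalar propagation idea discussed here (the exact notation and definitions in the paper may differ), the sketch below propagates instrument error bounds through P = V·I, computes a worst-case uncertainty, and compares it with the GUM-style combined standard uncertainty to bound the sensible coverage factor. All numbers are invented.

      import numpy as np

      # Example: power P = V * I measured with instruments specified by
      # maximum (worst-case) errors, i.e. distributions with compact support.
      V, I = 10.0, 2.0            # nominal readings
      a_V, a_I = 0.05, 0.01       # half-widths of the instrument error bounds (assumed)

      # Sensitivity coefficients c_i = dP/dx_i evaluated at the nominal point.
      c = np.array([I, V])
      a = np.array([a_V, a_I])

      # Worst-case uncertainty: absolute-value (linear) propagation of the bounds.
      wcu = np.sum(np.abs(c) * a)

      # GUM-style combined standard uncertainty, assuming uniform distributions.
      u = a / np.sqrt(3.0)
      u_c = np.sqrt(np.sum((c * u) ** 2))

      # A coverage factor larger than wcu / u_c would claim an expanded uncertainty
      # wider than the worst case, which is physically meaningless here.
      macf = wcu / u_c
      print(f"WCU = {wcu:.3f} W, u_c = {u_c:.3f} W, maximum sensible k = {macf:.2f}")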

  1. Scientific visualization uncertainty, multifield, biomedical, and scalable visualization

    CERN Document Server

    Chen, Min; Johnson, Christopher; Kaufman, Arie; Hagen, Hans

    2014-01-01

    Based on the seminar that took place in Dagstuhl, Germany in June 2011, this contributed volume studies the four important topics within the scientific visualization field: uncertainty visualization, multifield visualization, biomedical visualization and scalable visualization. • Uncertainty visualization deals with uncertain data from simulations or sampled data, uncertainty due to the mathematical processes operating on the data, and uncertainty in the visual representation, • Multifield visualization addresses the need to depict multiple data at individual locations and the combination of multiple datasets, • Biomedical visualization is a vast field with selected subtopics addressed, from scanning methodologies to structural applications to biological applications, • Scalability in scientific visualization is critical as data grows and computational devices range from hand-held mobile devices to exascale computational platforms. Scientific Visualization will be useful to practitioners of scientific visualization, ...

  2. Pragmatic aspects of uncertainty propagation: A conceptual review

    KAUST Repository

    Thacker, W.Carlisle

    2015-09-11

    When quantifying the uncertainty of the response of a computationally costly oceanographic or meteorological model stemming from the uncertainty of its inputs, practicality demands getting the most information using the fewest simulations. It is widely recognized that, by interpolating the results of a small number of simulations, results of additional simulations can be inexpensively approximated to provide a useful estimate of the variability of the response. Even so, as computing the simulations to be interpolated remains the biggest expense, the choice of these simulations deserves attention. When making this choice, two requirements should be considered: (i) the nature of the interpolation and (ii) the available information about input uncertainty. Examples comparing polynomial interpolation and Gaussian process interpolation are presented for three different views of input uncertainty.
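
    A toy version of the comparison is sketched below: a cheap analytic function stands in for the costly ocean/atmosphere model, and the same five "simulations" are interpolated once with a global polynomial and once with a simple Gaussian-process (squared-exponential) interpolant. The function, sample points and kernel length scale are arbitrary choices for illustration.

      import numpy as np

      def model(x):
          """Stand-in for an expensive simulation: response vs. one uncertain input."""
          return np.sin(3.0 * x) + 0.5 * x

      # A handful of "simulations" at chosen input values.
      x_train = np.array([-1.0, -0.5, 0.0, 0.4, 1.0])
      y_train = model(x_train)
      x_new = np.linspace(-1, 1, 5)

      # (i) Global polynomial interpolation through the sample points.
      coeffs = np.polyfit(x_train, y_train, deg=len(x_train) - 1)
      y_poly = np.polyval(coeffs, x_new)

      # (ii) Gaussian-process (kriging) interpolation with a squared-exponential kernel.
      def kernel(a, b, length=0.4):
          return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length**2)

      K = kernel(x_train, x_train) + 1e-10 * np.eye(len(x_train))
      Ks = kernel(x_new, x_train)
      y_gp = Ks @ np.linalg.solve(K, y_train)

      print("truth     :", np.round(model(x_new), 3))
      print("polynomial:", np.round(y_poly, 3))
      print("GP        :", np.round(y_gp, 3))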

  3. Comprehensive Approach to Verification and Validation of CFD Simulations Applied to Backward Facing Step-Application of CFD Uncertainty Analysis

    Science.gov (United States)

    Groves, Curtis E.; Ilie, Marcel; Shallhorn, Paul A.

    2012-01-01

    There are inherent uncertainties and errors associated with using Computational Fluid Dynamics (CFD) to predict the flow field, and there is no standard method for evaluating uncertainty in the CFD community. This paper describes an approach to validate the uncertainty in using CFD. The method will use state-of-the-art uncertainty analysis applying different turbulence models and draw conclusions on which models provide the least uncertainty and which models most accurately predict the flow over a backward facing step.

  4. An uncertainty inventory demonstration - a primary step in uncertainty quantification

    Energy Technology Data Exchange (ETDEWEB)

    Langenbrunner, James R. [Los Alamos National Laboratory; Booker, Jane M [Los Alamos National Laboratory; Hemez, Francois M [Los Alamos National Laboratory; Salazar, Issac F [Los Alamos National Laboratory; Ross, Timothy J [UNM

    2009-01-01

    Tools, methods, and theories for assessing and quantifying uncertainties vary by application. Uncertainty quantification tasks have unique desiderata and circumstances. To realistically assess uncertainty requires the engineer/scientist to specify mathematical models, the physical phenomena of interest, and the theory or framework for assessments. For example, Probabilistic Risk Assessment (PRA) specifically identifies uncertainties using probability theory, and therefore, PRAs lack formal procedures for quantifying uncertainties that are not probabilistic. The Phenomena Identification and Ranking Technique (PIRT) proceeds by ranking phenomena using scoring criteria that result in linguistic descriptors, such as importance ranked with words, 'High/Medium/Low.' The use of words allows PIRT to be flexible, but the analysis may then be difficult to combine with other uncertainty theories. We propose that a necessary step for the development of a procedure or protocol for uncertainty quantification (UQ) is the application of an Uncertainty Inventory. An Uncertainty Inventory should be considered and performed in the earliest stages of UQ.

  5. Project scheduling under uncertainty: survey and research potentials.

    OpenAIRE

    Herroelen, Willy; Leus, Roel

    2005-01-01

    The vast majority of the research efforts in project scheduling assume complete information about the scheduling problem to be solved and a static deterministic environment within which the pre-computed baseline schedule will be executed. However, in the real world, project activities are subject to considerable uncertainty, which is gradually resolved during project execution. In this survey we review the fundamental approaches for scheduling under uncertainty: reactive scheduling, stochasti...

  6. Thermodynamics of Black Holes and the Symmetric Generalized Uncertainty Principle

    Science.gov (United States)

    Dutta, Abhijit; Gangopadhyay, Sunandan

    2016-06-01

    In this paper, we have investigated the thermodynamics of Schwarzschild and Reissner-Nordström black holes using the symmetric generalised uncertainty principle, which contains correction terms involving momentum and position uncertainty. The mass-temperature relationship and the heat capacity for these black holes have been computed, from which the critical and remnant masses have been obtained. The entropy is found to satisfy the area law up to leading-order logarithmic corrections and corrections of the form A^2 (which is a new finding in this paper) from the symmetric generalised uncertainty principle.

  7. Uncertainty-like relations of the relative entropy of coherence

    Science.gov (United States)

    Liu, Feng; Li, Fei; Chen, Jun; Xing, Wei

    2016-08-01

    Quantum coherence is an important physical resource in quantum computation and quantum information processing. In this paper, we first obtain an uncertainty-like expression relating the coherences contained in the local subsystems of a bipartite quantum system. This uncertainty-like inequality shows that the larger the coherence of one subsystem, the less coherence is contained in the other subsystem. Further, we discuss in detail the uncertainty-like relation among three single-partite quantum systems. We show that the coherence contained in a pure tripartite quantum system is greater than the sum of the coherences of all local subsystems.

  8. Relational uncertainty in service dyads

    DEFF Research Database (Denmark)

    Kreye, Melanie

    2016-01-01

    Purpose: Relational uncertainty determines how relationships develop because it enables the building of trust and commitment. However, relational uncertainty has not been explored in an inter-organisational setting. This paper investigates how organisations experience relational uncertainty... via semi-structured interviews and secondary data. Findings: The findings suggest that relational uncertainty is caused by the partner’s unresolved organisational uncertainty, i.e. their lacking capabilities to deliver or receive (parts of) the service. Furthermore, we found that resolving... and explain the actions of a partnering organisation due to a lack of knowledge about their abilities and intentions. Second, we present suitable organisational responses to relational uncertainty and their effect on service quality...

  9. Impact of discharge data uncertainty on nutrient load uncertainty

    Science.gov (United States)

    Westerberg, Ida; Gustavsson, Hanna; Sonesten, Lars

    2016-04-01

    Uncertainty in the rating-curve model of the stage-discharge relationship leads to uncertainty in discharge time series. These uncertainties in turn affect many other analyses based on discharge data, such as nutrient load estimations. It is important to understand how large the impact of discharge data uncertainty is on such analyses, since they are often used as the basis to take important environmental management decisions. In the Baltic Sea basin, nutrient load estimates from river mouths are a central information basis for managing and reducing eutrophication in the Baltic Sea. In this study we investigated rating curve uncertainty and its propagation to discharge data uncertainty and thereafter to uncertainty in the load of phosphorous and nitrogen for twelve Swedish river mouths. We estimated rating curve uncertainty using the Voting Point method, which accounts for random and epistemic errors in the stage-discharge relation and allows drawing multiple rating-curve realisations consistent with the total uncertainty. We sampled 40,000 rating curves, and for each sampled curve we calculated a discharge time series from 15-minute water level data for the period 2005-2014. Each discharge time series was then aggregated to daily scale and used to calculate the load of phosphorous and nitrogen from linearly interpolated monthly water samples, following the currently used methodology for load estimation. Finally the yearly load estimates were calculated and we thus obtained distributions with 40,000 load realisations per year - one for each rating curve. We analysed how the rating curve uncertainty propagated to the discharge time series at different temporal resolutions, and its impact on the yearly load estimates. Two shorter periods of daily water quality sampling around the spring flood peak allowed a comparison of load uncertainty magnitudes resulting from discharge data with those resulting from the monthly water quality sampling.
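
    The sketch below mimics the propagation chain on synthetic data: rating-curve realisations (here a crude power law Q = a(h − h0)^b with randomly sampled a and b standing in for the Voting Point realisations) are applied to a year of stage data and combined with a smooth concentration series to give a distribution of annual loads. All numbers are invented, not data from the Swedish stations.

      import numpy as np

      rng = np.random.default_rng(3)
      n_curves = 5000

      # Hypothetical rating curve Q = a * (h - h0)^b with uncertain a and b.
      a = rng.normal(12.0, 1.0, n_curves)
      b = rng.normal(1.8, 0.1, n_curves)
      h0 = 0.2  # m, assumed known cease-to-flow level

      # One year of synthetic daily stage (m) and nutrient concentration (mg/l).
      days = 365
      h = 0.6 + 0.3 * np.sin(2 * np.pi * np.arange(days) / days) + rng.normal(0, 0.02, days)
      conc = 1.5 + 0.5 * np.cos(2 * np.pi * np.arange(days) / days)

      # Propagate every rating-curve realisation to an annual load.
      Q = a[:, None] * (h[None, :] - h0) ** b[:, None]          # m3/s, shape (n_curves, days)
      load = (Q * conc[None, :] * 86400).sum(axis=1) * 1e-6     # 1 mg/l = 1 g/m3; grams -> tonnes

      lo, med, hi = np.percentile(load, [5, 50, 95])
      print(f"annual load: median {med:.1f} t, 90% interval [{lo:.1f}, {hi:.1f}] t")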

  10. Uncertainty analysis of two-phase flow pressure drop calculations

    Energy Technology Data Exchange (ETDEWEB)

    Siqueira, Cezar A.M.; Costa, Bruno M.P.; Fonseca Junior, Roberto da; Goncalves, Marcelo de A.L. [PETROBRAS, Rio de Janeiro, RJ (Brazil)

    2004-07-01

    The simulation of multiphase flow in pipes is usually performed by petroleum engineers with two main purposes: the design of new pipelines and production systems, and the diagnosis of flow assurance problems in existing systems. The tools used for this calculation are computer codes that use published pressure drop correlations developed for steady-state two-phase flow, such as Hagedorn-Brown, Beggs and Brill and others. Each one of these correlations is best suited for a given situation and the engineer must find the best option for each particular case, based on his experience. In order to select the best correlation to use and to analyze the results of the calculation, the engineer must determine the reliability of the computed values. The uncertainty of the computation is obtained by considering the uncertainties of the correlation adopted, of the calculation algorithm and of the input data. This paper proposes a method to evaluate the uncertainties of this type of calculation and presents an analysis of these uncertainties. The uncertainty analysis also allows the identification of the parameters that are most significant for the final uncertainty of the simulation. It therefore makes it possible to determine which input parameters must be determined with higher accuracy and which may have lower accuracy, without reducing the reliability of the results. (author)

  11. The Uncertainty of Measurement Results

    International Nuclear Information System (INIS)

    Factors affecting the uncertainty of measurement are explained, basic statistical formulae given, and the theoretical concept explained in the context of pesticide formulation analysis. Practical guidance is provided on how to determine individual uncertainty components within an analytical procedure. An extended and comprehensive table containing the relevant mathematical/statistical expressions elucidates the relevant underlying principles. Appendix I provides a practical elaborated example on measurement uncertainty estimation, above all utilizing experimental repeatability and reproducibility laboratory data. (author)

  12. Decommissioning funding: ethics, implementation, uncertainties

    International Nuclear Information System (INIS)

    This status report on Decommissioning Funding: Ethics, Implementation, Uncertainties also draws on the experience of the NEA Working Party on Decommissioning and Dismantling (WPDD). The report offers, in a concise form, an overview of relevant considerations on decommissioning funding mechanisms with regard to ethics, implementation and uncertainties. Underlying ethical principles found in international agreements are identified, and factors influencing the accumulation and management of funds for decommissioning nuclear facilities are discussed together with the main sources of uncertainties of funding systems. (authors)

  13. PIV uncertainty quantification from correlation statistics

    International Nuclear Information System (INIS)

    The uncertainty of a PIV displacement field is estimated using a generic post-processing method based on statistical analysis of the correlation process using differences in the intensity pattern in the two images. First the second image is dewarped back onto the first one using the computed displacement field which provides two almost perfectly matching images. Differences are analyzed regarding the effect of shifting the peak of the correlation function. A relationship is derived between the standard deviation of intensity differences in each interrogation window and the expected asymmetry of the correlation peak, which is then converted to the uncertainty of a displacement vector. This procedure is tested with synthetic data for various types of noise and experimental conditions (pixel noise, out-of-plane motion, seeding density, particle image size, etc) and is shown to provide an accurate estimate of the true error. (paper)

  14. On Uncertainty Quantification in Particle Accelerators Modelling

    CERN Document Server

    Adelmann, Andreas

    2015-01-01

    Using a cyclotron-based model problem, we demonstrate for the first time the applicability and usefulness of an uncertainty quantification (UQ) approach in order to construct surrogate models for quantities such as emittance and energy spread, but also the halo parameter, and to construct a global sensitivity analysis together with error propagation and $L_{2}$ error analysis. The model problem is selected in a way that it represents a template for general high-intensity particle accelerator modelling tasks. The presented physics problem has to be seen as hypothetical, with the aim of demonstrating the usefulness and applicability of the presented UQ approach and not of solving a particular problem. The proposed UQ approach is based on sparse polynomial chaos expansions and relies on a small number of high-fidelity particle accelerator simulations. Within this UQ framework, the identification of the most important uncertainty sources is achieved by performing a global sensitivity analysis via computing the so-called Sobols' ...
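
    To make the output of such a global sensitivity analysis concrete, the sketch below computes first-order Sobol indices by brute-force Monte Carlo (a Saltelli-type estimator) for a cheap analytic stand-in surrogate. The sparse polynomial chaos construction of the paper is not reproduced, and the stand-in response has nothing to do with a real cyclotron.

      import numpy as np

      rng = np.random.default_rng(7)

      def surrogate(x):
          """Cheap stand-in for an accelerator surrogate: an emittance-like response
          of three normalised machine parameters in [0, 1] (purely illustrative)."""
          return 1.0 + 2.0 * x[:, 0] + 0.5 * x[:, 1] ** 2 + 0.2 * x[:, 0] * x[:, 2]

      N, d = 20_000, 3
      A = rng.uniform(0, 1, (N, d))
      B = rng.uniform(0, 1, (N, d))
      yA, yB = surrogate(A), surrogate(B)
      var_y = np.var(np.concatenate([yA, yB]))

      # Saltelli-style estimator of the first-order Sobol index S_i = V_i / Var(Y).
      for i in range(d):
          ABi = A.copy()
          ABi[:, i] = B[:, i]
          Vi = np.mean(yB * (surrogate(ABi) - yA))
          print(f"S_{i + 1} = {Vi / var_y:.3f}")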

  15. Optimizing Integrated Terminal Airspace Operations Under Uncertainty

    Science.gov (United States)

    Bosson, Christabelle; Xue, Min; Zelinski, Shannon

    2014-01-01

    In the terminal airspace, integrated departures and arrivals have the potential to increase operations efficiency. Recent research has developed genetic-algorithm-based schedulers for integrated arrival and departure operations under uncertainty. This paper presents an alternate method using a machine job-shop scheduling formulation to model the integrated airspace operations. A multistage stochastic programming approach is chosen to formulate the problem and candidate solutions are obtained by solving sample average approximation problems with finite sample size. Because approximate solutions are computed, the proposed algorithm incorporates the computation of statistical bounds to estimate the optimality of the candidate solutions. A proof-of-concept study is conducted on a baseline implementation of a simple problem considering a fleet mix of 14 aircraft evolving in a model of the Los Angeles terminal airspace. A more thorough statistical analysis is also performed to evaluate the impact of the number of scenarios considered in the sampled problem. To handle extensive sampling computations, a multithreading technique is introduced.

  16. Are models, uncertainty, and dispute resolution compatible?

    Science.gov (United States)

    Anderson, J. D.; Wilson, J. L.

    2013-12-01

    Models and their uncertainty often move from an objective use in planning and decision making into the regulatory environment, then sometimes on to dispute resolution through litigation or other legal forums. Through this last transition, whatever objectivity the models and uncertainty assessment may have once possessed becomes biased (or more biased) as each party chooses to exaggerate either the goodness of a model, or its worthlessness, depending on which view is in its best interest. If worthlessness is desired, then what was uncertain becomes unknown, or even unknowable. If goodness is desired, then precision and accuracy are often exaggerated and uncertainty, if it is explicitly recognized, encompasses only some parameters or conceptual issues, ignores others, and may minimize the uncertainty that it accounts for. In dispute resolution, how well is the adversarial process able to deal with these biases? The challenge is that they are often cloaked in computer graphics and animations that appear to lend realism to what could be mostly fancy, or even a manufactured outcome. While junk science can be challenged through appropriate motions in federal court, and in most state courts, it is not unusual for biased or even incorrect modeling results, or conclusions based on incorrect results, to be permitted to be presented at trial. Courts allow opinions that are based on a "reasonable degree of scientific certainty," but when that 'certainty' is grossly exaggerated by an expert, one way or the other, how well do the courts determine that someone has stepped over the line? Trials are based on the adversary system of justice, so opposing and often irreconcilable views are commonly allowed, leaving it to the judge or jury to sort out the truth. Can advances in scientific theory and engineering practice, related to both modeling and uncertainty, help address this situation and better ensure that juries and judges see more objective modeling results, or at least see

  17. Quantifying and Reducing Curve-Fitting Uncertainty in Isc

    Energy Technology Data Exchange (ETDEWEB)

    Campanelli, Mark; Duck, Benjamin; Emery, Keith

    2015-06-14

    Current-voltage (I-V) curve measurements of photovoltaic (PV) devices are used to determine performance parameters and to establish traceable calibration chains. Measurement standards specify localized curve fitting methods, e.g., straight-line interpolation/extrapolation of the I-V curve points near short-circuit current, Isc. By considering such fits as statistical linear regressions, uncertainties in the performance parameters are readily quantified. However, the legitimacy of such a computed uncertainty requires that the model be a valid (local) representation of the I-V curve and that the noise be sufficiently well characterized. Using more data points often has the advantage of lowering the uncertainty. However, more data points can make the uncertainty in the fit arbitrarily small, and this fit uncertainty misses the dominant residual uncertainty due to so-called model discrepancy. Using objective Bayesian linear regression for straight-line fits for Isc, we investigate an evidence-based method to automatically choose data windows of I-V points with reduced model discrepancy. We also investigate noise effects. Uncertainties, aligned with the Guide to the Expression of Uncertainty in Measurement (GUM), are quantified throughout.
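
    A reduced sketch of the straight-line idea is given below, using ordinary least squares rather than the objective Bayesian regression and evidence-based window selection described in the abstract: fit I = b0 + b1·V to synthetic points near short circuit and report Isc with the fit-only standard uncertainty, which by construction excludes model discrepancy. The data are invented.

      import numpy as np

      rng = np.random.default_rng(4)

      # Synthetic I-V points near short circuit (V close to 0), with measurement noise.
      V = np.linspace(-0.02, 0.06, 15)
      I = 5.000 - 1.2 * V + rng.normal(0, 0.002, V.size)   # amps, illustrative

      # Straight-line fit I = b0 + b1*V by ordinary least squares.
      X = np.column_stack([np.ones_like(V), V])
      beta, res, *_ = np.linalg.lstsq(X, I, rcond=None)
      r = I - X @ beta
      s2 = r @ r / (V.size - 2)
      cov = s2 * np.linalg.inv(X.T @ X)

      # Isc is the intercept (I at V = 0); its standard uncertainty is sqrt(cov[0, 0]).
      isc, u_isc = beta[0], np.sqrt(cov[0, 0])
      print(f"Isc = {isc:.4f} A +/- {u_isc:.4f} A (1-sigma, fit only; model discrepancy excluded)")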

  18. Strain gauge measurement uncertainties on hydraulic turbine runner blade

    International Nuclear Information System (INIS)

    Strains experimentally measured with strain gauges can differ from those evaluated using the Finite Element (FE) method. This difference is due mainly to the assumptions and uncertainties inherent to each method. To circumvent this difficulty, we developed a numerical method based on Monte Carlo simulations to evaluate measurement uncertainties produced by the behaviour of a unidirectional welded gauge, its position uncertainty and its integration effect. This numerical method uses the displacement fields of the studied part evaluated by an FE analysis. The paper presents a study case using in situ data measured on a hydraulic turbine runner. The FE analysis of the turbine runner blade was computed, and our numerical method used to evaluate uncertainties on strains measured at five locations with welded strain gauges. Then, measured strains and their uncertainty ranges are compared to the estimated strains. The uncertainty ranges obtained extended from 74 με to 165 με. Furthermore, the biases observed between the median of the uncertainty ranges and the FE strains varied from −36 to 36 με. Note that strain gauge measurement uncertainties depend mainly on displacement fields and gauge geometry.

  19. Quantifying and Reducing Curve-Fitting Uncertainty in Isc: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Campanelli, Mark; Duck, Benjamin; Emery, Keith

    2015-09-28

    Current-voltage (I-V) curve measurements of photovoltaic (PV) devices are used to determine performance parameters and to establish traceable calibration chains. Measurement standards specify localized curve fitting methods, e.g., straight-line interpolation/extrapolation of the I-V curve points near short-circuit current, Isc. By considering such fits as statistical linear regressions, uncertainties in the performance parameters are readily quantified. However, the legitimacy of such a computed uncertainty requires that the model be a valid (local) representation of the I-V curve and that the noise be sufficiently well characterized. Using more data points often has the advantage of lowering the uncertainty. However, more data points can make the uncertainty in the fit arbitrarily small, and this fit uncertainty misses the dominant residual uncertainty due to so-called model discrepancy. Using objective Bayesian linear regression for straight-line fits for Isc, we investigate an evidence-based method to automatically choose data windows of I-V points with reduced model discrepancy. We also investigate noise effects. Uncertainties, aligned with the Guide to the Expression of Uncertainty in Measurement (GUM), are quantified throughout.

  20. Spatial Uncertainty Model for Visual Features Using a Kinect™ Sensor

    Directory of Open Access Journals (Sweden)

    Jae-Han Park

    2012-06-01

    Full Text Available This study proposes a mathematical uncertainty model for the spatial measurement of visual features using Kinect™ sensors. This model can provide qualitative and quantitative analysis for the utilization of Kinect™ sensors as 3D perception sensors. In order to achieve this objective, we derived the propagation relationship of the uncertainties between the disparity image space and the real Cartesian space with the mapping function between the two spaces. Using this propagation relationship, we obtained the mathematical model for the covariance matrix of the measurement error, which represents the uncertainty for spatial position of visual features from Kinect™ sensors. In order to derive the quantitative model of spatial uncertainty for visual features, we estimated the covariance matrix in the disparity image space using collected visual feature data. Further, we computed the spatial uncertainty information by applying the covariance matrix in the disparity image space and the calibrated sensor parameters to the proposed mathematical model. This spatial uncertainty model was verified by comparing the uncertainty ellipsoids for spatial covariance matrices and the distribution of scattered matching visual features. We expect that this spatial uncertainty model and its analyses will be useful in various Kinect™ sensor applications.
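
    The core operation is a first-order covariance propagation, Σ_XYZ = J Σ_uvd Jᵀ, through the disparity-to-Cartesian mapping. The sketch below assumes a generic pinhole/disparity model with invented intrinsics and measurement covariance; it is not the calibrated Kinect™ model of the study.

      import numpy as np

      # Assumed pinhole/disparity model (parameters illustrative):
      #   Z = f*b/d,  X = (u - cx) * Z / f,  Y = (v - cy) * Z / f
      f, b = 580.0, 0.075          # focal length [px], baseline [m]
      cx, cy = 320.0, 240.0        # principal point [px]

      def propagate(u, v, d, cov_uvd):
          """Return the 3D point (X, Y, Z) and its covariance J * cov_uvd * J^T."""
          Z = f * b / d
          X = (u - cx) * Z / f
          Y = (v - cy) * Z / f
          dZdd = -f * b / d**2
          J = np.array([[Z / f, 0.0,   (u - cx) / f * dZdd],
                        [0.0,   Z / f, (v - cy) / f * dZdd],
                        [0.0,   0.0,   dZdd]])
          return np.array([X, Y, Z]), J @ cov_uvd @ J.T

      # Assumed measurement covariance in (u, v, disparity) space, in px^2.
      cov_uvd = np.diag([0.5**2, 0.5**2, 0.3**2])
      p, cov_xyz = propagate(400.0, 260.0, 40.0, cov_uvd)
      print("point [m] :", np.round(p, 3))
      print("1-sigma [m]:", np.round(np.sqrt(np.diag(cov_xyz)), 4))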

  1. Climate model uncertainty vs. conceptual geological uncertainty in hydrological modeling

    Directory of Open Access Journals (Sweden)

    T. O. Sonnenborg

    2015-04-01

    Full Text Available Projections of climate change impact are associated with a cascade of uncertainties including CO2 emission scenario, climate model, downscaling and impact model. The relative importance of the individual uncertainty sources is expected to depend on several factors including the quantity that is projected. In the present study the impacts of climate model uncertainty and geological model uncertainty on hydraulic head, stream flow, travel time and capture zones are evaluated. Six versions of a physically based and distributed hydrological model, each containing a unique interpretation of the geological structure of the model area, are forced by 11 climate model projections. Each projection of future climate is a result of a GCM-RCM model combination (from the ENSEMBLES project) forced by the same CO2 scenario (A1B). The changes from the reference period (1991–2010) to the future period (2081–2100) in projected hydrological variables are evaluated and the effects of geological model and climate model uncertainties are quantified. The results show that uncertainty propagation is context dependent. While the geological conceptualization is the dominating uncertainty source for projection of travel time and capture zones, the uncertainty on the climate models is more important for groundwater hydraulic heads and stream flow.

  2. Climate Certainties and Uncertainties

    International Nuclear Information System (INIS)

    In issue 380 of Futuribles in December 2011, Antonin Pottier analysed in detail the workings of what is today termed 'climate scepticism' - namely the propensity of certain individuals to contest the reality of climate change on the basis of pseudo-scientific arguments. He emphasized particularly that what fuels the debate on climate change is, largely, the degree of uncertainty inherent in the consequences to be anticipated from observation of the facts, not the description of the facts itself. In his view, the main aim of climate sceptics is to block the political measures for combating climate change. However, since they do not admit to this political posture, they choose instead to deny the scientific reality. This month, Futuribles complements this socio-psychological analysis of climate-sceptical discourse with an - in this case, wholly scientific - analysis of what we know (or do not know) about climate change on our planet. Pierre Morel gives a detailed account of the state of our knowledge in the climate field and what we are able to predict in the medium/long-term. After reminding us of the influence of atmospheric meteorological processes on the climate, he specifies the extent of global warming observed since 1850 and the main origin of that warming, as revealed by the current state of knowledge: the increase in the concentration of greenhouse gases. He then describes the changes in meteorological regimes (showing also the limits of climate simulation models), the modifications of hydrological regimes, and also the prospects for rises in sea levels. He also specifies the mechanisms that may potentially amplify all these phenomena and the climate disasters that might ensue. Lastly, he shows what are the scientific data that cannot be disregarded, the consequences of which are now inescapable (melting of the ice-caps, rises in sea level etc.), the only remaining uncertainty in this connection being the date at which these things will happen. 'In this

  3. Extended Forward Sensitivity Analysis for Uncertainty Quantification

    Energy Technology Data Exchange (ETDEWEB)

    Haihua Zhao; Vincent A. Mousseau

    2011-09-01

    Verification and validation (V&V) are playing more important roles in quantifying uncertainties and realizing high fidelity simulations in engineering system analyses, such as transients occurring in a complex nuclear reactor system. Traditional V&V in reactor system analysis focused more on the validation part or did not differentiate verification from validation. The traditional approach to uncertainty quantification is based on a 'black box' approach: the simulation tool is treated as an unknown signal generator, a distribution of inputs according to assumed probability density functions is sent in, and the distribution of the outputs is measured and correlated back to the original input distribution. The 'black box' method mixes numerical errors with all other uncertainties. It is also not efficient for performing sensitivity analysis. Contrary to the 'black box' method, a more efficient sensitivity approach can take advantage of intimate knowledge of the simulation code. In these types of approaches, equations for the propagation of uncertainty are constructed and the sensitivities are directly solved for as variables in the simulation. This paper presents the forward sensitivity analysis as a method to help uncertainty quantification. By including time step and potentially spatial step as special sensitivity parameters, the forward sensitivity method is extended as one method to quantify numerical errors. Note that by integrating local truncation errors over the whole system through the forward sensitivity analysis process, the generated time step and spatial step sensitivity information reflects global numerical errors. The discretization errors can be systematically compared against uncertainties due to other physical parameters. This extension makes the forward sensitivity method a much more powerful tool to help uncertainty quantification. By knowing the relative sensitivity of time and space steps with other interested physical
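
    A minimal example of the forward ("glass box") sensitivity idea is sketched below for a single ODE, dy/dt = -p*y: the sensitivity s = dy/dp is integrated alongside the state using its own equation ds/dt = (df/dy)*s + df/dp. This is an editorial toy with explicit Euler time stepping, not the report's reactor-scale implementation.

      import numpy as np

      # Forward sensitivity for dy/dt = -p * y, y(0) = 1: solve the model together
      # with s = dy/dp, which obeys ds/dt = -p * s - y.
      p, y, s = 0.7, 1.0, 0.0
      dt, t_end = 1e-3, 2.0

      t = 0.0
      while t < t_end:
          dy = -p * y
          ds = -p * s - y
          y += dt * dy
          s += dt * ds
          t += dt

      # Analytic check: y = exp(-p t), so dy/dp = -t exp(-p t).
      print(f"numerical s = {s:.5f}, analytic dy/dp = {-t_end * np.exp(-p * t_end):.5f}")

      # A parameter uncertainty delta_p maps (to first order) to |s| * delta_p in the output.
      delta_p = 0.05
      print(f"first-order output uncertainty: {abs(s) * delta_p:.5f}")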

  4. Housing Uncertainty and Childhood Impatience

    Science.gov (United States)

    Anil, Bulent; Jordan, Jeffrey L.; Zahirovic-Herbert, Velma

    2011-01-01

    The study demonstrates a direct link between housing uncertainty and children's time preferences, or patience. We show that students who face housing uncertainties through mortgage foreclosures and eviction learn impatient behavior and are therefore at greater risk of making poor intertemporal choices such as dropping out of school. We find that…

  5. AGRICULTURAL ECONOMICS, INTERDEPENDENCE AND UNCERTAINTY

    OpenAIRE

    Anderson, Jock R.

    1982-01-01

    Interdependence has always been central to economics but assumes pressing importance for agricultural economists as they deal with industrialising agricultures. Continued unresolvable uncertainties, when properly recognised, also add to the challenge of relevant work in agricultural economics. The related roles of interdependence and uncertainty are illustrated through examples from the progress of agricultural technology and enhancement of food security.

  6. Mama Software Features: Uncertainty Testing

    Energy Technology Data Exchange (ETDEWEB)

    Ruggiero, Christy E. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Porter, Reid B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-05-30

    This document reviews how the uncertainty in the calculations is being determined with test image data. The results of this testing give an ‘initial uncertainty’ number that can be used to estimate the ‘back end’ uncertainty in digital image quantification in images. Statisticians are refining these numbers as part of a UQ effort.

  7. Uncertainty in Integrated Assessment Scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Mort Webster

    2005-10-17

    The determination of climate policy is a decision under uncertainty. The uncertainty in future climate change impacts is large, as is the uncertainty in the costs of potential policies. Rational and economically efficient policy choices will therefore seek to balance the expected marginal costs with the expected marginal benefits. This approach requires that the risks of future climate change be assessed. The decision process need not be formal or quantitative for descriptions of the risks to be useful. Whatever the decision procedure, a useful starting point is to have as accurate a description of climate risks as possible. Given the goal of describing uncertainty in future climate change, we need to characterize the uncertainty in the main causes of uncertainty in climate impacts. One of the major drivers of uncertainty in future climate change is the uncertainty in future emissions, both of greenhouse gases and other radiatively important species such as sulfur dioxide. In turn, the drivers of uncertainty in emissions are uncertainties in the determinants of the rate of economic growth and in the technologies of production and how those technologies will change over time. This project uses historical experience and observations from a large number of countries to construct statistical descriptions of variability and correlation in labor productivity growth and in AEEI. The observed variability then provides a basis for constructing probability distributions for these drivers. The variance of uncertainty in growth rates can be further modified by expert judgment if it is believed that future variability will differ from the past. But often, expert judgment is more readily applied to projected median or expected paths through time. Analysis of past variance and covariance provides initial assumptions about future uncertainty for quantities that are less intuitive and difficult for experts to estimate, and these variances can be normalized and then applied to mean

  8. Reformulating the Quantum Uncertainty Relation.

    Science.gov (United States)

    Li, Jun-Li; Qiao, Cong-Feng

    2015-01-01

    The uncertainty principle is one of the cornerstones of quantum theory. In the literature, there are two types of uncertainty relations, the operator form concerning the variances of physical observables and the entropy form related to entropic quantities. Both these forms are inequalities involving pairwise observables, and it is found to be nontrivial to incorporate multiple observables into them. In this work we introduce a new form of uncertainty relation which may give complete trade-off relations for variances of observables in pure and mixed quantum systems. Unlike the prevailing uncertainty relations, which are either quantum state dependent or not directly measurable, our bounds for variances of observables are quantum state independent and immune from the "triviality" problem of having zero expectation values. Furthermore, the new uncertainty relation may provide a geometric explanation for the reason why there are limitations on the simultaneous determination of different observables in N-dimensional Hilbert space. PMID:26234197

  9. Uncertainty relation in Schwarzschild spacetime

    CERN Document Server

    Feng, Jun; Gould, Mark D; Fan, Heng

    2015-01-01

    We explore the entropic uncertainty relation in the curved background outside a Schwarzschild black hole, and find that Hawking radiation introduces a nontrivial modification of the uncertainty bound for a particular observer, which could therefore be witnessed experimentally by a proper uncertainty game. We first investigate an uncertainty game between a free-falling observer and his static partner holding a quantum memory initially entangled with the quantum system to be measured. Due to the information loss from Hawking decoherence, we find an inevitable increase of the uncertainty on the outcome of measurements in the view of the static observer, which is dependent on the mass of the black hole, the distance of the observer from the event horizon, and the mode frequency of the quantum memory. In an alternative game between two static players, we show that the quantum information of a qubit can be transferred to the quantum memory through a bath of fluctuating quantum fields outside the black hole, which triggers an effectively reduced uncert...

  10. Uncertainties in Atomic Data and Their Propagation Through Spectral Models. I

    CERN Document Server

    Bautista, Manuel A; Quinet, Pascal; Dunn, Jay; Gull, Theodore R; Kallman, Timothy R; Mendoza, Claudio

    2013-01-01

    We present a method for computing uncertainties in spectral models, i.e. level populations, line emissivities, and emission line ratios, based upon the propagation of uncertainties originating from atomic data. We provide analytic expressions, in the form of linear sets of algebraic equations, for the coupled uncertainties among all levels. These equations can be solved efficiently for any set of physical conditions and uncertainties in the atomic data. We illustrate our method applied to spectral models of O III and Fe II and discuss the impact of the uncertainties on atomic systems under different physical conditions. As to intrinsic uncertainties in theoretical atomic data, we propose that these uncertainties can be estimated from the dispersion in the results from various independent calculations. This technique provides excellent results for the uncertainties in A-values of forbidden transitions in [Fe II].

  11. UNCERTAINTIES IN ATOMIC DATA AND THEIR PROPAGATION THROUGH SPECTRAL MODELS. I

    International Nuclear Information System (INIS)

    We present a method for computing uncertainties in spectral models, i.e., level populations, line emissivities, and emission line ratios, based upon the propagation of uncertainties originating from atomic data. We provide analytic expressions, in the form of linear sets of algebraic equations, for the coupled uncertainties among all levels. These equations can be solved efficiently for any set of physical conditions and uncertainties in the atomic data. We illustrate our method applied to spectral models of O III and Fe II and discuss the impact of the uncertainties on atomic systems under different physical conditions. As to intrinsic uncertainties in theoretical atomic data, we propose that these uncertainties can be estimated from the dispersion in the results from various independent calculations. This technique provides excellent results for the uncertainties in A-values of forbidden transitions in [Fe II].

  12. UNCERTAINTIES IN ATOMIC DATA AND THEIR PROPAGATION THROUGH SPECTRAL MODELS. I

    Energy Technology Data Exchange (ETDEWEB)

    Bautista, M. A.; Fivet, V. [Department of Physics, Western Michigan University, Kalamazoo, MI 49008 (United States); Quinet, P. [Astrophysique et Spectroscopie, Universite de Mons-UMONS, B-7000 Mons (Belgium); Dunn, J. [Physical Science Department, Georgia Perimeter College, Dunwoody, GA 30338 (United States); Gull, T. R. [Code 667, NASA Goddard Space Flight Center, Greenbelt, MD 20771 (United States); Kallman, T. R. [Code 662, NASA Goddard Space Flight Center, Greenbelt, MD 20771 (United States); Mendoza, C., E-mail: manuel.bautista@wmich.edu [Centro de Fisica, Instituto Venezolano de Investigaciones Cientificas (IVIC), P.O. Box 20632, Caracas 1020A (Venezuela, Bolivarian Republic of)

    2013-06-10

    We present a method for computing uncertainties in spectral models, i.e., level populations, line emissivities, and emission line ratios, based upon the propagation of uncertainties originating from atomic data. We provide analytic expressions, in the form of linear sets of algebraic equations, for the coupled uncertainties among all levels. These equations can be solved efficiently for any set of physical conditions and uncertainties in the atomic data. We illustrate our method applied to spectral models of O III and Fe II and discuss the impact of the uncertainties on atomic systems under different physical conditions. As to intrinsic uncertainties in theoretical atomic data, we propose that these uncertainties can be estimated from the dispersion in the results from various independent calculations. This technique provides excellent results for the uncertainties in A-values of forbidden transitions in [Fe II].
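
    The paper derives analytic linear equations for the coupled uncertainties; the sketch below instead takes a cruder Monte Carlo route on a toy three-level atom with invented rates, to show the same end product: an uncertainty on a line-emissivity ratio induced by uncertainties in the atomic data.

      import numpy as np

      rng = np.random.default_rng(5)

      def line_ratio(A31, A21, C):
          """Steady-state emissivity ratio (photon energies ignored) of a toy 3-level atom.
          A31, A21 : radiative decay rates (s^-1); C[i, j] : collision rate from level j to i."""
          # Rate matrix R for dN/dt = R N = 0: R[i, j] is the rate j -> i for i != j,
          # and each diagonal entry is minus the total rate out of that level.
          R = np.array([
              [-(C[1, 0] + C[2, 0]),  A21 + C[0, 1],              A31 + C[0, 2]],
              [C[1, 0],              -(A21 + C[0, 1] + C[2, 1]),  C[1, 2]],
              [C[2, 0],               C[2, 1],                   -(A31 + C[0, 2] + C[1, 2])],
          ])
          # Replace one balance equation by the normalisation sum(N) = 1 and solve.
          M = R.copy(); M[-1, :] = 1.0
          N = np.linalg.solve(M, np.array([0.0, 0.0, 1.0]))
          return (N[2] * A31) / (N[1] * A21)

      # Nominal (illustrative) atomic data and assumed 10% / 20% relative uncertainties.
      A21, A31 = 1.0e-2, 3.0e-3
      C = np.full((3, 3), 1.0e-4); np.fill_diagonal(C, 0.0)

      ratios = [line_ratio(A31 * rng.normal(1, 0.1), A21 * rng.normal(1, 0.1),
                           C * rng.normal(1, 0.2)) for _ in range(5000)]
      print(f"line ratio = {np.mean(ratios):.3f} +/- {np.std(ratios):.3f}")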

  13. SENSIT: a cross-section and design sensitivity and uncertainty analysis code. [In FORTRAN for CDC-7600, IBM 360

    Energy Technology Data Exchange (ETDEWEB)

    Gerstl, S.A.W.

    1980-01-01

    SENSIT computes the sensitivity and uncertainty of a calculated integral response (such as a dose rate) due to input cross sections and their uncertainties. Sensitivity profiles are computed for neutron and gamma-ray reaction cross sections of standard multigroup cross section sets and for secondary energy distributions (SEDs) of multigroup scattering matrices. In the design sensitivity mode, SENSIT computes changes in an integral response due to design changes and gives the appropriate sensitivity coefficients. Cross section uncertainty analyses are performed for three types of input data uncertainties: cross-section covariance matrices for pairs of multigroup reaction cross sections, spectral shape uncertainty parameters for secondary energy distributions (integral SED uncertainties), and covariance matrices for energy-dependent response functions. For all three types of data uncertainties SENSIT computes the resulting variance and estimated standard deviation in an integral response of interest, on the basis of generalized perturbation theory. SENSIT attempts to be more comprehensive than earlier sensitivity analysis codes, such as SWANLAKE.

  14. Uncertainty propagation within an integrated model of climate change

    International Nuclear Information System (INIS)

    This paper demonstrates a methodology whereby stochastic dynamical systems are used to investigate a climate model's inherent capacity to propagate uncertainty over time. The usefulness of the methodology stems from its ability to identify the variables that account for most of the model's uncertainty. We accomplish this by reformulating a deterministic dynamical system capturing the structure of an integrated climate model into a stochastic dynamical system. Then, via the use of computational techniques of stochastic differential equations, accurate uncertainty estimates of the model's variables are determined. The uncertainty is measured in terms of properties of probability distributions of the state variables. The starting characteristics of the uncertainty of the initial state and the random fluctuations are derived from estimates given in the literature. Two aspects of uncertainty are investigated: (1) the dependence on environmental scenario - which is determined by technological development and actions towards environmental protection; and (2) the dependence on the magnitude of the initial state measurement error determined by the progress of climate change and the total magnitude of the system's random fluctuations as well as by our understanding of the climate system. Uncertainty of most of the system's variables is found to be nearly independent of the environmental scenario for the time period under consideration (1990-2100). Even conservative uncertainty estimates result in scenario overlap of several decades during which the consequences of any actions affecting the environment could be very difficult to identify with a sufficient degree of confidence. This fact may have fundamental consequences on the level of social acceptance of any restrictive measures against accelerating global warming. In general, the stochastic fluctuations contribute more to the uncertainty than the initial state measurements. The variables coupling all major climate elements
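
    The sketch below shows the mechanics in miniature: a toy energy-balance temperature equation is promoted to a stochastic differential equation and an ensemble is integrated with the Euler-Maruyama scheme, so that the spread of the ensemble in 2100 plays the role of the propagated uncertainty. The parameters, forcing path and initial-state spread are all invented, not those of the integrated model studied here.

      import numpy as np

      rng = np.random.default_rng(6)

      # Toy energy-balance temperature-anomaly model, made stochastic:
      #   dT = (forcing(t) - lam * T) / Cheat * dt + sigma * dW
      lam, Cheat = 1.1, 8.0          # feedback (W m-2 K-1), heat capacity (W yr m-2 K-1)
      sigma = 0.05                   # strength of random fluctuations (K / sqrt(yr))
      forcing = lambda t: 0.04 * t   # linearly growing radiative forcing (W m-2)

      n_paths, dt, years = 2000, 0.1, 110.0   # e.g. 1990-2100
      steps = int(years / dt)

      T = rng.normal(0.4, 0.1, n_paths)       # uncertain initial state (K)
      for k in range(steps):
          t = k * dt
          dW = rng.normal(0.0, np.sqrt(dt), n_paths)
          T += (forcing(t) - lam * T) / Cheat * dt + sigma * dW

      lo, med, hi = np.percentile(T, [5, 50, 95])
      print(f"anomaly in 2100: median {med:.2f} K, 90% band [{lo:.2f}, {hi:.2f}] K")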

  15. Development of a Prototype Model-Form Uncertainty Knowledge Base

    Science.gov (United States)

    Green, Lawrence L.

    2016-01-01

    Uncertainties are generally classified as either aleatory or epistemic. Aleatory uncertainties are those attributed to random variation, either naturally or through manufacturing processes. Epistemic uncertainties are generally attributed to a lack of knowledge. One type of epistemic uncertainty is called model-form uncertainty. The term model-form means that among the choices to be made during a design process within an analysis, there are different forms of the analysis process, which each give different results for the same configuration at the same flight conditions. Examples of model-form uncertainties include the grid density, grid type, and solver type used within a computational fluid dynamics code, or the choice of the number and type of model elements within a structures analysis. The objectives of this work are to identify and quantify a representative set of model-form uncertainties and to make this information available to designers through an interactive knowledge base (KB). The KB can then be used during probabilistic design sessions, so as to enable the possible reduction of uncertainties in the design process through resource investment. An extensive literature search has been conducted to identify and quantify typical model-form uncertainties present within aerospace design. An initial attempt has been made to assemble the results of this literature search into a searchable KB, usable in real time during probabilistic design sessions. A concept of operations and the basic structure of a model-form uncertainty KB are described. Key operations within the KB are illustrated. Current limitations in the KB, and possible workarounds are explained.

  16. TSUNAMI Primer: A Primer for Sensitivity/Uncertainty Calculations with SCALE

    Energy Technology Data Exchange (ETDEWEB)

    Rearden, Bradley T [ORNL; Mueller, Don [ORNL; Bowman, Stephen M [ORNL; Busch, Robert D. [University of New Mexico, Albuquerque; Emerson, Scott [University of New Mexico, Albuquerque

    2009-01-01

    This primer presents examples in the application of the SCALE/TSUNAMI tools to generate keff sensitivity data for one- and three-dimensional models using TSUNAMI-1D and -3D and to examine uncertainties in the computed keff values due to uncertainties in the cross-section data used in their calculation. The proper use of unit cell data and need for confirming the appropriate selection of input parameters through direct perturbations are described. The uses of sensitivity and uncertainty data to identify and rank potential sources of computational bias in an application system and TSUNAMI tools for assessment of system similarity using sensitivity and uncertainty criteria are demonstrated. Uses of these criteria in trending analyses to assess computational biases, bias uncertainties, and gap analyses are also described. Additionally, an application of the data adjustment tool TSURFER is provided, including identification of specific details of sources of computational bias.

  17. Uncertainties in Nuclear Proliferation Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Chul Min; Yim, Man-Sung; Park, Hyeon Seok [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of)

    2015-05-15

    There have been various efforts in the research community to understand the determinants of nuclear proliferation and to develop quantitative tools to predict nuclear proliferation events. Such systematic approaches have shown the potential to provide warning to the international community and thereby help prevent nuclear proliferation activities. However, there is still considerable debate about the robustness of the estimated effects of the determinants and of the projection results. Some studies have shown that several factors can cause uncertainties in previous quantitative nuclear proliferation modeling work. This paper analyzes the uncertainties in past approaches and suggests future work with respect to proliferation history, analysis methods, and variable selection. The research community still lacks knowledge of the sources of uncertainty in current models, and fundamental modeling problems will remain even if more advanced modeling methods are developed. Before developing more elaborate models based on hypotheses of time-dependent proliferation determinants, graph theory, and the like, it is important to analyze the uncertainty of current models and resolve these fundamental problems of nuclear proliferation modeling. The uncertainty arising from different codings of proliferation history is small; the serious problems stem from the limited analysis methods and from correlation among the variables. Problems in regression analysis and survival analysis cause large uncertainties even when the same dataset is used, which decreases the robustness of the results, and inaccurate variables for nuclear proliferation increase the uncertainty further. To overcome these problems, further quantitative research should focus on analyzing the knowledge suggested by qualitative nuclear proliferation studies.

  18. Computational movement analysis

    CERN Document Server

    Laube, Patrick

    2014-01-01

    This SpringerBrief discusses the characteristics of spatiotemporal movement data, including uncertainty and scale. It investigates three core aspects of Computational Movement Analysis: Conceptual modeling of movement and movement spaces, spatiotemporal analysis methods aiming at a better understanding of movement processes (with a focus on data mining for movement patterns), and using decentralized spatial computing methods in movement analysis. The author presents Computational Movement Analysis as an interdisciplinary umbrella for analyzing movement processes with methods from a range of fi

  19. Extended Forward Sensitivity Analysis for Uncertainty Quantification

    Energy Technology Data Exchange (ETDEWEB)

    Haihua Zhao; Vincent A. Mousseau

    2008-09-01

    This report presents the forward sensitivity analysis method as a means for the quantification of uncertainty in system analysis. The traditional approach to uncertainty quantification is based on a “black box” approach: the simulation tool is treated as an unknown signal generator, a distribution of inputs according to assumed probability density functions is sent in, and the distribution of the outputs is measured and correlated back to the original input distribution. This approach requires a large number of simulation runs and therefore has a high computational cost. Contrary to the “black box” method, a more efficient sensitivity approach can take advantage of intimate knowledge of the simulation code. In this approach, equations for the propagation of uncertainty are constructed and the sensitivities are solved for as variables in the same simulation. This “glass box” method can generate sensitivity information similar to that of the “black box” approach above with only a handful of runs to cover a large uncertainty region. Because only a small number of runs is required, those runs can be done with high accuracy in space and time, ensuring that the uncertainty of the physical model is being measured and not simply the numerical error caused by coarse discretization. In the forward sensitivity method, the model is differentiated with respect to each parameter to yield an additional system of the same size as the original one, the solution of which is the solution sensitivity. The sensitivity of any output variable can then be directly obtained from these sensitivities by applying the chain rule of differentiation. We extend the forward sensitivity method to include the time and spatial steps as special parameters so that the numerical errors can be quantified against the other physical parameters. This extension makes the forward sensitivity method a much more powerful tool for uncertainty analysis. By knowing the relative sensitivity of time and space steps with other
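
    As a minimal illustration of the "glass box" idea (using a simple decay equation as a stand-in for a real simulation code; the parameter value and time horizon are arbitrary), the model equation can be augmented with its forward sensitivity equation and both integrated in a single run:

      # Sketch: forward sensitivity for dy/dt = -lam*y. The sensitivity s = dy/dlam
      # obeys ds/dt = df/dy * s + df/dlam = -lam*s - y and is integrated alongside
      # the state, so one run yields both the solution and its parameter sensitivity.
      import numpy as np
      from scipy.integrate import solve_ivp

      lam, y0 = 0.3, 2.0

      def rhs(t, z):
          y, s = z
          dy = -lam * y                 # original model equation
          ds = -lam * s - y             # forward sensitivity equation
          return [dy, ds]

      sol = solve_ivp(rhs, (0.0, 5.0), [y0, 0.0], rtol=1e-8, atol=1e-10)
      y_end, s_end = sol.y[:, -1]
      exact = -5.0 * y0 * np.exp(-lam * 5.0)   # analytic dy/dlam at t = 5
      print("computed dy/dlam %.6f   analytic %.6f" % (s_end, exact))

    The same pattern adds one extra sensitivity system per parameter, including, as the report proposes, time and space step sizes treated as special parameters.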

  20. Uncertainty in Regional Air Quality Modeling

    Science.gov (United States)

    Digar, Antara

    Effective pollution mitigation is the key to successful air quality management. Although states invest millions of dollars to predict future air quality, the regulatory modeling and analysis process used to inform pollution control strategy remains uncertain. Traditionally, deterministic ‘bright-line’ tests are applied to evaluate the sufficiency of a control strategy to attain an air quality standard. A critical part of regulatory attainment demonstration is the prediction of future pollutant levels using photochemical air quality models. However, because models are uncertain, deterministic demonstrations convey a false sense of precision, as if the pollutant response to emission controls were perfectly known, and may eventually mislead the selection of control policies. These uncertainties in turn affect the health impact assessment of air pollution control strategies. This thesis goes beyond the conventional practice of deterministic attainment demonstration and presents novel approaches that yield probabilistic representations of pollutant response to emission controls by accounting for uncertainties in regional air quality planning. Computationally efficient methods are developed and validated to characterize uncertainty in the prediction of secondary pollutant (ozone and particulate matter) sensitivities to precursor emissions in the presence of uncertainties in model assumptions and input parameters. We also introduce impact factors that enable identification of model inputs and scenarios that strongly influence pollutant concentrations and sensitivity to precursor emissions. We demonstrate how these probabilistic approaches could be applied to determine the likelihood that any control measure will yield regulatory attainment, or could be extended to evaluate probabilistic health benefits of emission controls, considering uncertainties in both air quality models and epidemiological concentration-response relationships. Finally, ground-level observations for pollutant (ozone) and precursor

  1. Analysis and Reduction of Complex Networks Under Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Knio, Omar M

    2014-04-09

    This is a collaborative proposal that aims at developing new methods for the analysis and reduction of complex multiscale networks under uncertainty. The approach is based on combining methods of computational singular perturbation (CSP) and probabilistic uncertainty quantification. In deterministic settings, CSP yields asymptotic approximations of reduced-dimensionality “slow manifolds” on which a multiscale dynamical system evolves. Introducing uncertainty raises fundamentally new issues, particularly concerning its impact on the topology of slow manifolds, and means to represent and quantify associated variability. To address these challenges, this project uses polynomial chaos (PC) methods to reformulate uncertain network models, and to analyze them using CSP in probabilistic terms. Specific objectives include (1) developing effective algorithms that can be used to illuminate fundamental and unexplored connections among model reduction, multiscale behavior, and uncertainty, and (2) demonstrating the performance of these algorithms through applications to model problems.
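
    A minimal non-intrusive polynomial chaos (PC) sketch is given below for orientation; the toy one-dimensional model, the Hermite basis and the expansion order are assumptions made for illustration and stand in for the project's uncertain network models.

      # Sketch: expand an uncertain model output g(xi), xi ~ N(0,1), in probabilists'
      # Hermite polynomials by least squares, then read mean and variance off the
      # coefficients (E[He_m He_n] = n! delta_mn for a standard normal input).
      import numpy as np
      from numpy.polynomial.hermite_e import hermevander
      from math import factorial

      rng = np.random.default_rng(1)

      def g(xi):                       # toy "model" with one uncertain input
          return np.exp(0.3 * xi) + 0.1 * xi**2

      order, n_train = 6, 400
      xi = rng.standard_normal(n_train)
      A = hermevander(xi, order)                        # design matrix He_0..He_order
      coeffs, *_ = np.linalg.lstsq(A, g(xi), rcond=None)

      mean_pc = coeffs[0]
      var_pc = sum(coeffs[n]**2 * factorial(n) for n in range(1, order + 1))

      xi_mc = rng.standard_normal(200_000)              # Monte Carlo cross-check
      print("PC  mean %.4f  var %.4f" % (mean_pc, var_pc))
      print("MC  mean %.4f  var %.4f" % (g(xi_mc).mean(), g(xi_mc).var()))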

  2. Collaborative framework for PIV uncertainty quantification: comparative assessment of methods

    International Nuclear Information System (INIS)

    A posteriori uncertainty quantification of particle image velocimetry (PIV) data is essential to obtain accurate estimates of the uncertainty associated with a given experiment. This is particularly relevant when measurements are used to validate computational models or in design and decision processes. In spite of the importance of the subject, the first PIV uncertainty quantification (PIV-UQ) methods have been developed only in the last three years. The present work is a comparative assessment of four approaches recently proposed in the literature: the uncertainty surface method (Timmins et al 2012), the particle disparity approach (Sciacchitano et al 2013), the peak ratio criterion (Charonko and Vlachos 2013) and the correlation statistics method (Wieneke 2015). The analysis is based upon experiments conducted for this specific purpose, where several measurement techniques are employed simultaneously. The performances of the above approaches are surveyed across different measurement conditions and flow regimes. (paper)

  3. Meteorological Uncertainty of atmospheric Dispersion model results (MUD)

    DEFF Research Database (Denmark)

    Havskov Sørensen, Jens; Amstrup, Bjarne; Feddersen, Henrik;

    ’ dispersion scenario. However, recent developments in numerical weather prediction (NWP) include probabilistic forecasting techniques, which can be utilised also for long-range atmospheric dispersion models. The ensemble statistical methods developed and applied to NWP models aim at describing the inherent...... uncertainties of the meteorological model results. These uncertainties stem from e.g. limits in meteorological observations used to initialise meteorological forecast series. By perturbing e.g. the initial state of an NWP model run in agreement with the available observational data, an ensemble...... of meteorological forecasts is produced from which uncertainties in the various meteorological parameters are estimated, e.g. probabilities for rain. Corresponding ensembles of atmospheric dispersion can now be computed from which uncertainties of predicted radionuclide concentration and deposition patterns can...

  4. Optimal uncertainty relations in a modified Heisenberg algebra

    CERN Document Server

    Abdelkhalek, Kais; Fiedler, Leander; Mangano, Gianpiero; Schwonnek, René

    2016-01-01

    Various theories that aim at unifying gravity with quantum mechanics suggest modifications of the Heisenberg algebra for position and momentum. From the perspective of quantum mechanics, such modifications lead to new uncertainty relations which are thought (but not proven) to imply the existence of a minimal observable length. Here we prove this statement within a framework of sufficient physical and structural assumptions. Moreover, we present a general method that allows one to formulate optimal and state-independent variance-based uncertainty relations. In addition, instead of variances, we make use of entropies as a measure of uncertainty and provide uncertainty relations in terms of min- and Shannon entropies. We compute the corresponding entropic minimal lengths and find that the minimal length in terms of min-entropy is exactly one bit.

  5. Research on uncertainty modeling of complex engineering structures based on deterministic computational response surfaces

    Institute of Scientific and Technical Information of China (English)

    朱跃; 张令弥

    2011-01-01

    Aiming at the many uncertain parameters and strong nonlinearity of complex engineering structures, this paper studies how to build high-precision response-surface models relating the response characteristics to the different input variables using deterministic computations. Taking two typical nonlinear explicit test functions and a bolted structure with several uncertain parameters as examples, response surface models were constructed with high-order power polynomials and with LS-SVM (Least Squares Support Vector Machines), and the modeling procedure based on LS-SVM is presented. Compared with the high-order polynomial method, LS-SVM yields a high-precision response surface from fewer sample points and handles the high-dimensional, nonlinear problems of complex engineering structure response surfaces more effectively, making it well suited to subsequent uncertainty analysis.
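
    A minimal sketch of fitting an LS-SVM response surface from a handful of deterministic runs is shown below; the RBF kernel width, the regularisation value and the toy target function are illustrative assumptions, not the bolted-structure model of the paper.

      # Sketch: LS-SVM regression. The dual problem reduces to one linear system,
      # so a surrogate can be fitted from relatively few deterministic samples.
      import numpy as np

      rng = np.random.default_rng(2)

      def rbf_kernel(X1, X2, width=0.5):
          d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
          return np.exp(-d2 / (2.0 * width**2))

      def expensive_run(X):                 # stand-in for a deterministic FE run
          return np.sin(3.0 * X[:, 0]) + X[:, 1] ** 2

      X = rng.uniform(-1.0, 1.0, size=(30, 2))   # small design of experiments
      y = expensive_run(X)

      gamma = 100.0                         # regularisation parameter (assumed)
      n = len(y)
      A = np.zeros((n + 1, n + 1))
      A[0, 1:] = 1.0
      A[1:, 0] = 1.0
      A[1:, 1:] = rbf_kernel(X, X) + np.eye(n) / gamma
      sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
      b, alpha = sol[0], sol[1:]            # bias term and dual coefficients

      X_new = rng.uniform(-1.0, 1.0, size=(5, 2))
      y_surrogate = rbf_kernel(X_new, X) @ alpha + b
      print(np.column_stack([y_surrogate, expensive_run(X_new)]))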

  6. Realising the Uncertainty Enabled Model Web

    Science.gov (United States)

    Cornford, D.; Bastin, L.; Pebesma, E. J.; Williams, M.; Stasch, C.; Jones, R.; Gerharz, L.

    2012-12-01

    conversion between uncertainty types, and between the spatial / temporal support of service inputs / outputs. Finally we describe the tools being generated within the UncertWeb project, considering three main aspects: i) Elicitation of uncertainties on model inputs. We are developing tools to enable domain experts to provide judgements about input uncertainties from UncertWeb model components (e.g. parameters in meteorological models) which allow panels of experts to engage in the process and reach a consensus view on the current knowledge / beliefs about that parameter or variable. We are developing systems for continuous and categorical variables as well as stationary spatial fields. ii) Visualisation of the resulting uncertain outputs from the end of the workflow, but also at intermediate steps. At this point we have prototype implementations driven by the requirements from the use cases that motivate UncertWeb. iii) Sensitivity and uncertainty analysis on model outputs. Here we show the design of the overall system we are developing, including the deployment of an emulator framework to allow computationally efficient approaches. We conclude with a summary of the open issues and remaining challenges we are facing in UncertWeb, and provide a brief overview of how we plan to tackle these.

  7. Uncertainty product of composite signals

    International Nuclear Information System (INIS)

    The well known uncertainty product of communication theory for a signal in the time domain and its Fourier transform in the frequency domain is studied for a 'composite signal', i.e. a 'pure' signal to which a time-delayed replica is added. This uncertainty product shows the appearance of local maxima and minima as a function of the time delay, leading to the following conjecture: the uncertainty product of a non-Gaussian composite signal can be smaller than that of the 'pure' signal. As an example this conjecture will be proven for the derivative of the Gaussian signal and for the Cauchy distribution. The effect on the uncertainty product of adding a delayed scaled replica of a signal to the original signal in the time domain leads to an important possibility for interpretation in the study of the reverberation phenomenon in echo-location signals of dolphins. (author). Letter-to-the-editor
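
    The behaviour described above can be checked numerically. In the sketch below, the RMS time and frequency widths are computed on a grid for a derivative-of-Gaussian pulse and for composites obtained by adding a delayed replica; the pulse shape, grid and delay values are illustrative choices rather than the paper's exact signals.

      # Sketch: time-bandwidth (uncertainty) product of a pure pulse and of
      # composites s(t) + s(t - tau), using second moments of |s|^2 and |S|^2.
      import numpy as np

      t = np.linspace(-40.0, 40.0, 8192)
      dt = t[1] - t[0]
      omega = 2.0 * np.pi * np.fft.fftfreq(t.size, d=dt)

      def uncertainty_product(s):
          p_t = np.abs(s) ** 2
          p_t /= p_t.sum()
          t_mean = np.sum(t * p_t)
          dt_rms = np.sqrt(np.sum((t - t_mean) ** 2 * p_t))

          p_w = np.abs(np.fft.fft(s)) ** 2
          p_w /= p_w.sum()
          w_mean = np.sum(omega * p_w)
          dw_rms = np.sqrt(np.sum((omega - w_mean) ** 2 * p_w))
          return dt_rms * dw_rms

      pure = -t * np.exp(-t**2 / 2.0)          # derivative-of-Gaussian pulse
      print("pure signal:        %.4f" % uncertainty_product(pure))
      for tau in (1.0, 2.0, 4.0):
          comp = pure + np.interp(t - tau, t, pure, left=0.0, right=0.0)
          print("composite tau=%.1f:  %.4f" % (tau, uncertainty_product(comp)))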

  8. Climate Projections and Uncertainty Communication.

    Science.gov (United States)

    Joslyn, Susan L; LeClerc, Jared E

    2016-01-01

    Lingering skepticism about climate change might be due in part to the way climate projections are perceived by members of the public. Variability between scientists' estimates might give the impression that scientists disagree about the fact of climate change rather than about details concerning the extent or timing. Providing uncertainty estimates might clarify that the variability is due in part to quantifiable uncertainty inherent in the prediction process, thereby increasing people's trust in climate projections. This hypothesis was tested in two experiments. Results suggest that including uncertainty estimates along with climate projections leads to an increase in participants' trust in the information. Analyses explored the roles of time, place, demographic differences (e.g., age, gender, education level, political party affiliation), and initial belief in climate change. Implications are discussed in terms of the potential benefit of adding uncertainty estimates to public climate projections. PMID:26695995

  9. Exposing Position Uncertainty in Middleware

    DEFF Research Database (Denmark)

    Langdal, Jakob; Kjærgaard, Mikkel Baun; Toftkjær, Thomas;

    2010-01-01

    Traditionally, the goal for positioning middleware is to provide developers with seamless position transparency, i.e., providing a connection between the application domain and the positioning sensors while hiding the complexity of the positioning technologies in use. A key part of the hidden...... complexity is the uncertainty associated to positions caused by inherent limitations when using sensors to convert physical phenomena to digital representations. We propose to use the notion of seamful design for developers to design a positioning middleware that provides transparent positioning and still...... allows developers some control of the uncertainty aspects of the positioning process. The design presented in this paper shows how uncertainty of positioning can be conceptualized and internalized into a positioning middleware. Furthermore, we argue that a developer who is interacting with uncertainty...

  10. The Time Energy Uncertainty Relation

    CERN Document Server

    Busch, P

    2001-01-01

    The time energy uncertainty relation has been a controversial issue since the advent of quantum theory, with respect to appropriate formalisation, validity and possible meanings. A comprehensive account of the development of this subject up to the 1980s is provided by a combination of the reviews of Jammer (1974), Bauer and Mello (1978), and Busch (1990). More recent reviews are concerned with different specific aspects of the subject. The purpose of this chapter is to show that different types of time energy uncertainty relation can indeed be deduced in specific contexts, but that there is no unique universal relation that could stand on equal footing with the position-momentum uncertainty relation. To this end, we will survey the various formulations of a time energy uncertainty relation, with a brief assessment of their validity, and along the way we will indicate some new developments that emerged since the 1990s.

  11. Climate Projections and Uncertainty Communication.

    Science.gov (United States)

    Joslyn, Susan L; LeClerc, Jared E

    2016-01-01

    Lingering skepticism about climate change might be due in part to the way climate projections are perceived by members of the public. Variability between scientists' estimates might give the impression that scientists disagree about the fact of climate change rather than about details concerning the extent or timing. Providing uncertainty estimates might clarify that the variability is due in part to quantifiable uncertainty inherent in the prediction process, thereby increasing people's trust in climate projections. This hypothesis was tested in two experiments. Results suggest that including uncertainty estimates along with climate projections leads to an increase in participants' trust in the information. Analyses explored the roles of time, place, demographic differences (e.g., age, gender, education level, political party affiliation), and initial belief in climate change. Implications are discussed in terms of the potential benefit of adding uncertainty estimates to public climate projections.

  12. [DNA computing].

    Science.gov (United States)

    Błasiak, Janusz; Krasiński, Tadeusz; Popławski, Tomasz; Sakowski, Sebastian

    2011-01-01

    Biocomputers can be an alternative to traditional "silicon-based" computers, whose continued development may be limited by further miniaturization (imposed by the Heisenberg Uncertainty Principle) and by the growing amount of information transferred between the central processing unit and the main memory (the von Neumann bottleneck). The idea of DNA computing came true for the first time in 1994, when Adleman solved the Hamiltonian Path Problem using short DNA oligomers and DNA ligase. In the early 2000s a series of biocomputer models was presented, starting with the seminal work of Shapiro and his colleagues, who presented a molecular two-state finite automaton in which the restriction enzyme FokI constituted the hardware and short DNA oligomers served as the software as well as the input/output signals; DNA molecules also provided the energy for this machine. DNA computing can be exploited in many applications, from studies of gene expression patterns to the diagnosis and therapy of cancer. Research on DNA computing is still in progress, both in vitro and in vivo, and its promising results give hope for a breakthrough in computer science. PMID:21735816

  13. Uncertainty relation for N fermions

    Energy Technology Data Exchange (ETDEWEB)

    Goodmanson, D.; Taylor, J.R.; Campesino-Romeo, E.

    1979-01-15

    We define uncertainties in position and momentum for states of N identical fermions of spin s. We prove that in a one-dimensional world the product of these uncertainties is greater than or equal to (N/(2s + 1))h/2. We generalize this to arbitrary numbers of dimensions; in particular, the corresponding result in three dimensions has the factor h/2 replaced by (3/4)(h/2)^3.

  14. Supply chain collaboration under uncertainty

    OpenAIRE

    Hasan, Saad; Eckert, Claudia; Earl, Chris

    2012-01-01

    In fluctuating economic conditions such as global recession, supply chains operate under changing conditions of uncertainty. The impact of this uncertainty and associated risk might be mitigated by collaboration. This paper proposes a model of supply chain collaboration based on information exchange and decision coordination at both the strategic and tactical levels. However, a collaborative supply chain can be exposed to associated risks such as the failure of individual actors. Governance r...

  15. Uncertainty, investment, and industry evolution

    OpenAIRE

    Caballero, Ricardo J; Robert S. Pindyck

    1992-01-01

    We study the effects of aggregate and idiosyncratic uncertainty on the entry of firms, total investment, and prices in a competitive industry with irreversible investment. We first use standard dynamic programming methods to determine firms' entry decisions, and we describe the resulting industry equilibrium and its characteristics, emphasizing the effects of different sources of uncertainty. We then show how the conditional distribution of prices can be used as an alternative means of determ...

  16. Cognitive ability and the effect of strategic uncertainty

    OpenAIRE

    Hanaki, Nobuyuki; Jacquemet, Nicolas; Luchini, Stéphane; Zylbersztejn, Adam

    2014-01-01

    How is one's cognitive ability related to the way one responds to strategic uncertainty? We address this question by conducting a set of experiments in simple 2 × 2 dominance solvable coordination games. Our experiments involve two main treatments: one in which two human subjects interact, and another in which one human subject interacts with a computer program whose behavior is known. By making the behavior of the computer perfectly predictable, the latter treatment eliminates strategic unce...

  17. Cognitive ability and the effect of strategic uncertainty

    OpenAIRE

    Hanaki, Nobuyuki; Jacquemet, Nicolas; Luchini, Stéphane; Zylbersztejn, Adam

    2016-01-01

    How is one's cognitive ability related to the way one responds to strategic uncertainty? We address this question by conducting a set of experiments in simple 2 × 2 dominance solvable coordination games. Our experiments involve two main treatments: one in which two human subjects interact, and another in which one human subject interacts with a computer program whose behavior is known. By making the behavior of the computer perfectly predictable, the latter treatment...

  18. Cognitive ability and the effect of strategic uncertainty

    OpenAIRE

    Hanaki, Nobuyuki; Jacquemet, Nicolas; Luchini, Stéphane; Zylbersztejn, Adam

    2015-01-01

    How is one's cognitive ability related to the way one responds to strategic uncertainty? We address this question by conducting a set of experiments in simple 2 x 2 dominance solvable coordination games. Our experiments involve two main treatments: one in which two human subjects interact, and another in which one human subject interacts with a computer program whose behavior is known. By making the behavior of the computer perfectly predictable, the latter treatment eliminates strategic unce...

  19. Cognitive Ability and the Effect of Strategic Uncertainty

    OpenAIRE

    Nobuyuki Hanaki; Nicolas Jacquemet; Stéphane Luchini; Adam Zylberstejn

    2014-01-01

    How is one’s cognitive ability related to the way one responds to strategic uncertainty? We address this question by conducting a set of experiments in simple 2 x 2 dominance solvable coordination games. Our experiments involve two main treatments: one in which two human subjects interact, and another in which one human subject interacts with a computer program whose behavior is known. By making the behavior of the computer perfectly predictable, the latter treatment eliminates strategic unce...

  20. Numerical solution of dynamic equilibrium models under Poisson uncertainty

    DEFF Research Database (Denmark)

    Posch, Olaf; Trimborn, Timo

    2013-01-01

    We propose a simple and powerful numerical algorithm to compute the transition process in continuous-time dynamic equilibrium models with rare events. In this paper we transform the dynamic system of stochastic differential equations into a system of functional differential equations of the retar...... solution to Lucas' endogenous growth model under Poisson uncertainty are used to compute the exact numerical error. We show how (potential) catastrophic events such as rare natural disasters substantially affect the economic decisions of households....

  1. Uncertainties in land use data

    Directory of Open Access Journals (Sweden)

    G. Castilla

    2007-11-01

    This paper deals with the description and assessment of uncertainties in land use data derived from Remote Sensing observations, in the context of hydrological studies. Land use is a categorical regionalised variable reporting the main socio-economic role each location has, where the role is inferred from the pattern of occupation of land. The properties of this pattern that are relevant to hydrological processes have to be known with some accuracy in order to obtain reliable results; hence, uncertainty in land use data may lead to uncertainty in model predictions. There are two main uncertainties surrounding land use data, positional and categorical. The first one is briefly addressed and the second one is explored in more depth, including the factors that influence it. We (1) argue that the conventional method used to assess categorical uncertainty, the confusion matrix, is insufficient to propagate uncertainty through distributed hydrologic models; (2) report some alternative methods to tackle this and other insufficiencies; (3) stress the role of metadata as a more reliable means to assess the degree of distrust with which these data should be used; and (4) suggest some practical recommendations.

  2. Wildfire Decision Making Under Uncertainty

    Science.gov (United States)

    Thompson, M.

    2013-12-01

    Decisions relating to wildfire management are subject to multiple sources of uncertainty, and are made by a broad range of individuals, across a multitude of environmental and socioeconomic contexts. In this presentation I will review progress towards identification and characterization of uncertainties and how this information can support wildfire decision-making. First, I will review a typology of uncertainties common to wildfire management, highlighting some of the more salient sources of uncertainty and how they present challenges to assessing wildfire risk. This discussion will cover the expanding role of burn probability modeling, approaches for characterizing fire effects, and the role of multi-criteria decision analysis, and will provide illustrative examples of integrated wildfire risk assessment across a variety of planning scales. Second, I will describe a related uncertainty typology that focuses on the human dimensions of wildfire management, specifically addressing how social, psychological, and institutional factors may impair cost-effective risk mitigation. This discussion will encompass decision processes before, during, and after fire events, with a specific focus on active management of complex wildfire incidents. An improved ability to characterize uncertainties faced in wildfire management could lead to improved delivery of decision support, targeted communication strategies, and ultimately to improved wildfire management outcomes.

  3. A Bayesian approach to model uncertainty

    International Nuclear Information System (INIS)

    A Bayesian approach to model uncertainty is taken. For the case of a finite number of alternative models, the model uncertainty is equivalent to parameter uncertainty. A derivation based on Savage's partition problem is given
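
    A minimal sketch of this equivalence for a finite model set is shown below: the model index is treated as a discrete uncertain parameter and updated by Bayes' rule. The three candidate Poisson rate models and the data are invented purely for illustration.

      # Sketch: Bayesian treatment of a finite set of alternative models. Posterior
      # model probabilities follow from Bayes' rule; predictions are model-averaged.
      import numpy as np
      from scipy.stats import poisson

      lambdas = np.array([0.5, 1.0, 2.0])     # candidate models: assumed event rates
      prior = np.array([1/3, 1/3, 1/3])       # prior model probabilities
      data = np.array([1, 0, 2, 1, 1])        # observed event counts per interval

      # likelihood of the data under each candidate model
      like = np.array([poisson.pmf(data, lam).prod() for lam in lambdas])
      posterior = prior * like
      posterior /= posterior.sum()

      predicted_rate = np.sum(posterior * lambdas)   # model-averaged prediction
      print("posterior model probabilities:", np.round(posterior, 3))
      print("model-averaged rate: %.3f" % predicted_rate)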

  4. Uncertainty and Its Description in Decision Models

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    Various perspectives on uncertainty are briefly summarized in this paper, and the classification of uncertainties is discussed. Descriptions of the different classes of uncertainty are also presented. Furthermore, a risk representation model for decision-making analysis is provided.

  5. Axial power monitoring uncertainty in the Savannah River Reactors

    International Nuclear Information System (INIS)

    The results of this analysis quantified the uncertainty associated with monitoring the Axial Power Shape (APS) in the Savannah River Reactors. Thermocouples at each assembly flow exit map the radial power distribution and are the primary means of monitoring power in these reactors. The remaining uncertainty in power monitoring is associated with the relative axial power distribution. The APS is monitored by seven sensors that respond to power on each of nine vertical Axial Power Monitor (APM) rods. Computation of the APS uncertainty, for the reactor power limits analysis, started with a large database of APM rod measurements spanning several years of reactor operation. A computer algorithm was used to randomly select a sample of APSs which were input to a code. This code modeled the thermal-hydraulic performance of a single fuel assembly during a design basis Loss-of Coolant Accident. The assembly power limit at Onset of Significant Voiding was computed for each APS. The output was a distribution of expected assembly power limits that was adjusted to account for the biases caused by instrumentation error and by measuring 7 points rather than a continuous APS. Statistical analysis of the final assembly power limit distribution showed that reducing reactor power by approximately 3% was sufficient to account for APS variation. This data confirmed expectations that the assembly exit thermocouples provide all information needed for monitoring core power. The computational analysis results also quantified the contribution to power limits of the various uncertainties such as instrumentation error
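
    The Monte Carlo workflow described above can be sketched as follows; the axial-shape generator, the power_limit surrogate and the error magnitudes below are placeholders (the real limits come from the thermal-hydraulic accident code), so only the structure of the calculation is illustrated.

      # Sketch: sample axial power shapes, evaluate an assembly power limit for each,
      # fold in an instrumentation-error term, and read a limit off the distribution.
      import numpy as np

      rng = np.random.default_rng(3)
      z = np.linspace(0.0, 1.0, 7)                   # seven axial sensor elevations

      def sample_shape():
          skew = rng.normal(0.0, 0.3)                # assumed shape-to-shape variation
          shape = 1.0 + 0.4 * np.sin(np.pi * z) + skew * (z - 0.5)
          return shape / shape.mean()

      def power_limit(shape):                        # placeholder for the T/H code
          return 1.0 / shape.max()                   # limit driven by the peak node

      limits = np.array([power_limit(sample_shape()) for _ in range(20_000)])
      limits *= rng.normal(1.0, 0.01, size=limits.size)   # assumed 1% instrument error

      p05 = np.percentile(limits, 5)                 # lower 5th percentile
      print("median limit %.3f, conservative limit %.3f (%.1f%% reduction)"
            % (np.median(limits), p05, 100 * (1 - p05 / np.median(limits))))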

  6. Bounds in the generalized Weber problem under locational uncertainty

    DEFF Research Database (Denmark)

    Juel, Henrik

    1981-01-01

    An existing analysis of the bounds on the Weber problem solution under uncertainty is incorrect. For the generalized problem with arbitrary measures of distance, we give easily computable ranges on the bounds and state the conditions under which the exact values of the bounds can be found...

  7. Quantification of Modelling Uncertainties in Turbulent Flow Simulations

    NARCIS (Netherlands)

    Edeling, W.N.

    2015-01-01

    The goal of this thesis is to make predictive simulations with Reynolds-Averaged Navier-Stokes (RANS) turbulence models, i.e. simulations with a systematic treatment of model and data uncertainties and their propagation through a computational model to produce predictions of quantities of interest w

  8. Propagation of interval and probabilistic uncertainty in cyberinfrastructure-related data processing and data fusion

    CERN Document Server

    Servin, Christian

    2015-01-01

    On various examples ranging from geosciences to environmental sciences, this book explains how to generate an adequate description of uncertainty, how to justify semiheuristic algorithms for processing uncertainty, and how to make these algorithms more computationally efficient. It explains in what sense the existing approach to uncertainty as a combination of random and systematic components is only an approximation, presents a more adequate three-component model with an additional periodic error component, and explains how uncertainty propagation techniques can be extended to this model. The book provides a justification for a practically efficient heuristic technique (based on fuzzy decision-making). It explains how the computational complexity of uncertainty processing can be reduced. The book also shows how to take into account that in real life, the information about uncertainty is often only partially known, and, on several practical examples, explains how to extract the missing information about uncer...

  9. Power system transient stability simulation under uncertainty based on Taylor model arithmetic

    Institute of Scientific and Technical Information of China (English)

    Shouxiang WANG; Zhijie ZHENG; Chengshan WANG

    2009-01-01

    The Taylor model arithmetic is introduced to deal with uncertainty. The uncertainty of model parameters is described by Taylor models, and each variable in the functions is replaced with its Taylor model (TM). Thus, time domain simulation under uncertainty is transformed into the integration of TM-based differential equations. In this paper, the Taylor series method is employed to integrate these differential equations; moreover, power system time domain simulation under uncertainty based on the Taylor model method is presented. This method allows a rigorous estimation of the influence of either form of uncertainty and needs only one simulation. It is computationally fast compared with the Monte Carlo method, which is another technique for uncertainty analysis. The proposed method has been tested on the 39-bus New England system. The test results illustrate the effectiveness and practical value of the approach by comparison with the results of Monte Carlo simulation and traditional time domain simulation.

  10. Total Measurement Uncertainty for the Plutonium Finishing Plant (PFP) Segmented Gamma Scan Assay System

    CERN Document Server

    Fazzari, D M

    2001-01-01

    This report presents the results of an evaluation of the Total Measurement Uncertainty (TMU) for the Canberra manufactured Segmented Gamma Scanner Assay System (SGSAS) as employed at the Hanford Plutonium Finishing Plant (PFP). In this document, TMU embodies the combined uncertainties due to all of the individual random and systematic sources of measurement uncertainty. It includes uncertainties arising from corrections and factors applied to the analysis of transuranic waste to compensate for inhomogeneities and interferences from the waste matrix and radioactive components. These include uncertainty components for any assumptions contained in the calibration of the system or computation of the data. Uncertainties are propagated at 1 sigma. The final total measurement uncertainty value is reported at the 95% confidence level. The SGSAS is a gamma assay system that is used to assay plutonium and uranium waste. The SGSAS system can be used in a stand-alone mode to perform the NDA characterization of a containe...

  11. Probabilistic uncertainty analysis of an FRF of a structure using a Gaussian process emulator

    Science.gov (United States)

    Fricker, Thomas E.; Oakley, Jeremy E.; Sims, Neil D.; Worden, Keith

    2011-11-01

    This paper introduces methods for probabilistic uncertainty analysis of a frequency response function (FRF) of a structure obtained via a finite element (FE) model. The methods are applicable to computationally expensive FE models, making use of a Bayesian metamodel known as an emulator. The emulator produces fast predictions of the FE model output, but also accounts for the additional uncertainty induced by only having a limited number of model evaluations. Two approaches to the probabilistic uncertainty analysis of FRFs are developed. The first considers the uncertainty in the response at discrete frequencies, giving pointwise uncertainty intervals. The second considers the uncertainty in an entire FRF across a frequency range, giving an uncertainty envelope function. The methods are demonstrated and compared to alternative approaches in a practical case study.
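
    A minimal Gaussian process emulator sketch is given below: a GP is fitted to a few "expensive" runs and returns both a fast prediction and the extra uncertainty due to the limited number of runs. The one-dimensional toy simulator and the fixed kernel hyperparameters are assumptions; an emulator of an FE-based FRF would be built over frequency with estimated hyperparameters.

      # Sketch: GP regression with an RBF kernel; the posterior variance quantifies
      # the emulator (code) uncertainty induced by having only a few model runs.
      import numpy as np

      def rbf(a, b, ell=0.6):
          return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell**2)

      def simulator(x):                      # stand-in for the expensive FE model
          return np.sin(4.0 * x) + 0.5 * x

      X = np.linspace(0.0, 2.0, 8)           # limited number of training runs
      y = simulator(X)
      jitter = 1e-8                          # small nugget for numerical stability

      K = rbf(X, X) + jitter * np.eye(len(X))
      L = np.linalg.cholesky(K)
      alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))

      Xs = np.linspace(0.0, 2.0, 201)
      Ks = rbf(Xs, X)
      mean = Ks @ alpha                              # fast emulator prediction
      v = np.linalg.solve(L, Ks.T)
      var = 1.0 - np.sum(v**2, axis=0)               # posterior (emulator) variance
      print("max posterior std between runs: %.3f" % np.sqrt(var.max()))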

  12. A comparative study of new non-linear uncertainty propagation methods for space surveillance

    Science.gov (United States)

    Horwood, Joshua T.; Aristoff, Jeffrey M.; Singh, Navraj; Poore, Aubrey B.

    2014-06-01

    We propose a unified testing framework for assessing uncertainty realism during non-linear uncertainty propagation under the perturbed two-body problem of celestial mechanics, with an accompanying suite of metrics and benchmark test cases on which to validate different methods. We subsequently apply the testing framework to different combinations of uncertainty propagation techniques and coordinate systems for representing the uncertainty. In particular, we recommend the use of a newly-derived system of orbital element coordinates that mitigate the non-linearities in uncertainty propagation and the recently-developed Gauss von Mises filter which, when used in tandem, provide uncertainty realism over much longer periods of time compared to Gaussian representations of uncertainty in Cartesian spaces, at roughly the same computational cost.

  13. RUMINATIONS ON NDA MEASUREMENT UNCERTAINTY COMPARED TO DA UNCERTAINTY

    Energy Technology Data Exchange (ETDEWEB)

    Salaymeh, S.; Ashley, W.; Jeffcoat, R.

    2010-06-17

    It is difficult to overestimate the importance that physical measurements performed with nondestructive assay instruments play throughout the nuclear fuel cycle. They underpin decision making in many areas and support: criticality safety, radiation protection, process control, safeguards, facility compliance, and waste measurements. No physical measurement is complete or indeed meaningful, without a defensible and appropriate accompanying statement of uncertainties and how they combine to define the confidence in the results. The uncertainty budget should also be broken down in sufficient detail suitable for subsequent uses to which the nondestructive assay (NDA) results will be applied. Creating an uncertainty budget and estimating the total measurement uncertainty can often be an involved process, especially for non routine situations. This is because data interpretation often involves complex algorithms and logic combined in a highly intertwined way. The methods often call on a multitude of input data subject to human oversight. These characteristics can be confusing and pose a barrier to developing and understanding between experts and data consumers. ASTM subcommittee C26-10 recognized this problem in the context of how to summarize and express precision and bias performance across the range of standards and guides it maintains. In order to create a unified approach consistent with modern practice and embracing the continuous improvement philosophy a consensus arose to prepare a procedure covering the estimation and reporting of uncertainties in non destructive assay of nuclear materials. This paper outlines the needs analysis, objectives and on-going development efforts. In addition to emphasizing some of the unique challenges and opportunities facing the NDA community we hope this article will encourage dialog and sharing of best practice and furthermore motivate developers to revisit the treatment of measurement uncertainty.

  14. Analyses of the effect of parameter uncertainty in NPP design on the computation of floor response spectra

    Institute of Scientific and Technical Information of China (English)

    李建波; 林皋; 钟红; 胡志强

    2011-01-01

    The floor response spectrum is the basis for the seismic design of structures and equipment in a nuclear power plant (NPP). In the current routine of NPP design, deterministic methods are generally employed for the computation of floor spectra, in which one particular, relatively conservative estimate of the loadings and structural parameters has to be chosen; it is then difficult to judge reliably how well the results reflect the complicated real situation. To address the parameter uncertainty in NPP structural design, a probabilistic and statistical method based on Monte Carlo simulation is employed here for the analysis of floor response spectra, building on dynamic soil-structure interaction analyses. This procedure can accommodate arbitrary, irregular probability density functions of the physical parameters and yields floor spectra at different confidence (guarantee) levels. Numerical examples show that the model is reasonable and that the results can be used to make a quantitative judgement of the confidence level of conventional deterministic analyses, providing a deeper understanding of the safety of NPP structures designed with the conventional deterministic method.
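
    A minimal sketch of such a Monte Carlo floor-spectrum study is given below. Parameter uncertainty is represented crudely as a random shift of the floor motion's dominant frequency and of the equipment damping, and percentile (guarantee-level) spectra are extracted; the synthetic floor motion and the assumed distributions are placeholders for the results of a real soil-structure interaction analysis.

      # Sketch: Monte Carlo floor response spectra. For each sample, a pseudo-
      # acceleration spectrum is computed with Newmark average-acceleration
      # integration of a unit-mass SDOF oscillator, then percentile spectra are taken.
      import numpy as np

      rng = np.random.default_rng(4)
      dt = 0.005
      t = np.arange(0.0, 10.0, dt)
      periods = np.linspace(0.05, 1.0, 30)          # oscillator periods of interest (s)

      def floor_motion(f_dom):
          env = np.exp(-((t - 4.0) / 2.0) ** 2)     # simple amplitude envelope
          return env * np.sin(2.0 * np.pi * f_dom * t)

      def spectrum(ag, zeta):
          sa = np.empty(periods.size)
          for i, T in enumerate(periods):
              w = 2.0 * np.pi / T
              k, c = w * w, 2.0 * zeta * w
              u = v = umax = 0.0
              a = -ag[0]
              for p in -ag[1:]:
                  a_new = (p - c * (v + 0.5 * dt * a)
                           - k * (u + dt * v + 0.25 * dt**2 * a)) \
                          / (1.0 + 0.5 * dt * c + 0.25 * dt**2 * k)
                  u += dt * v + 0.25 * dt**2 * (a + a_new)
                  v += 0.5 * dt * (a + a_new)
                  a = a_new
                  umax = max(umax, abs(u))
              sa[i] = w * w * umax                  # pseudo-acceleration
          return sa

      spectra = np.array([spectrum(floor_motion(rng.normal(4.0, 0.4)),
                                   rng.uniform(0.03, 0.07)) for _ in range(100)])

      sa_mean = spectra.mean(axis=0)
      sa_84 = np.percentile(spectra, 84, axis=0)    # 84% non-exceedance spectrum
      print("peak mean Sa %.3f, peak 84%% Sa %.3f" % (sa_mean.max(), sa_84.max()))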

  15. Sustainable energy development under uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Fuss, S.

    2008-04-24

    This thesis has contributed to investment decision-making under uncertainty and irreversibility, with particular focus on the transition towards a more sustainable mix of electricity-generating technologies. The models introduced here thus provide important insights not only for investors, but also for policy makers interested in curtailing greenhouse gas (GHG) emissions for the sake of a deceleration of global warming, who therefore need to understand how investors respond to uncertainties, climate change policy and other factors. Part 1 is partly devoted to frameworks based on principles from real options theory. Chapter 3 presents a model in which we deal with different types of uncertainty affecting the investor. In liberalized electricity markets, investors nowadays face uncertainty not only from volatile electricity prices, but also from the possibility of stricter climate change policy. We investigate this in a real options framework with two types of power plants (both coal-fired, but one with a carbon capture and storage module, which therefore emits less CO2), where the prices of electricity and CO2 emissions are stochastic. In particular, we analyze the response of long-term investment to (higher) uncertainty about CO2 prices, which can come from two sources: price fluctuations around an average, rising price that might as well be market-driven, and uncertainty about the actions of the government, which can lead to sudden price jumps or drops. We find that producers facing market uncertainty optimize under incomplete information and invest in the carbon-saving technology earlier than they would have done had they known the prices exactly, while policy uncertainty leads to postponement of investment, as the option value of waiting for the revelation of the policy outcome more than outweighs the losses associated with ongoing, continuously rising CO2 costs. Chapter 4 is about a real options model, which considers both fuel price risk and

  16. Critical loads - assessment of uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Barkman, A.

    1998-10-01

    The effects of data uncertainty in applications of the critical loads concept were investigated on different spatial resolutions in Sweden and northern Czech Republic. Critical loads of acidity (CL) were calculated for Sweden using the biogeochemical model PROFILE. Three methods with different structural complexity were used to estimate the adverse effects of SO{sub 2} concentrations in northern Czech Republic. Data uncertainties in the calculated critical loads/levels and exceedances (EX) were assessed using Monte Carlo simulations. Uncertainties within cumulative distribution functions (CDF) were aggregated by accounting for the overlap between site specific confidence intervals. Aggregation of data uncertainties within CDFs resulted in lower CL and higher EX best estimates in comparison with percentiles represented by individual sites. Data uncertainties were consequently found to advocate larger deposition reductions to achieve non-exceedance based on low critical loads estimates on 150 x 150 km resolution. Input data were found to impair the level of differentiation between geographical units at all investigated resolutions. Aggregation of data uncertainty within CDFs involved more constrained confidence intervals for a given percentile. Differentiation as well as identification of grid cells on 150 x 150 km resolution subjected to EX was generally improved. Calculation of the probability of EX was shown to preserve the possibility to differentiate between geographical units. Re-aggregation of the 95%-ile EX on 50 x 50 km resolution generally increased the confidence interval for each percentile. Significant relationships were found between forest decline and the three methods addressing risks induced by SO{sub 2} concentrations. Modifying SO{sub 2} concentrations by accounting for the length of the vegetation period was found to constitute the most useful trade-off between structural complexity, data availability and effects of data uncertainty. Data

  17. Communicating Uncertainties on Climate Change

    Science.gov (United States)

    Planton, S.

    2009-09-01

    The term 'uncertainty' is confusing in common language, since in one of its most usual senses it refers to what cannot be known in advance or what is subject to doubt. Its definition in mathematics is unambiguous but not widely shared. It is thus difficult to communicate this notion to a wide public through the media. From its scientific basis to impact assessment, the climate change issue is subject to a large number of sources of uncertainty. In this case, the definition of the term is close to its mathematical sense, but the diversity of disciplines involved in the analysis process implies a great diversity of approaches to the notion. Faced with this diversity of approaches, communicating uncertainties on climate change is thus a great challenge. It is further complicated by the diversity of the audiences for communication on climate change, from stakeholders and policy makers to the wide public. We will present the process chosen by the IPCC to communicate uncertainties in its assessment reports, taking the example of the guidance note to lead authors of the fourth assessment report. Concerning the communication of uncertainties to a wide public, we will give some examples illustrating how to avoid the above-mentioned ambiguity in this kind of communication.

  18. Entropic uncertainty and measurement reversibility

    Science.gov (United States)

    Berta, Mario; Wehner, Stephanie; Wilde, Mark M.

    2016-07-01

    The entropic uncertainty relation with quantum side information (EUR-QSI) from (Berta et al 2010 Nat. Phys. 6 659) is a unifying principle relating two distinctive features of quantum mechanics: quantum uncertainty due to measurement incompatibility, and entanglement. In these relations, quantum uncertainty takes the form of preparation uncertainty where one of two incompatible measurements is applied. In particular, the ‘uncertainty witness’ lower bound in the EUR-QSI is not a function of a post-measurement state. An insightful proof of the EUR-QSI from (Coles et al 2012 Phys. Rev. Lett. 108 210405) makes use of a fundamental mathematical consequence of the postulates of quantum mechanics known as the non-increase of quantum relative entropy under quantum channels. Here, we exploit this perspective to establish a tightening of the EUR-QSI which adds a new state-dependent term in the lower bound, related to how well one can reverse the action of a quantum measurement. As such, this new term is a direct function of the post-measurement state and can be thought of as quantifying how much disturbance a given measurement causes. Our result thus quantitatively unifies this feature of quantum mechanics with the others mentioned above. We have experimentally tested our theoretical predictions on the IBM quantum experience and find reasonable agreement between our predictions and experimental outcomes.
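
    For orientation, the tripartite EUR-QSI that the paper tightens is commonly written as follows (restated here from the general literature in standard notation; the paper's additional, state-dependent reversibility term is not reproduced):

      % X and Z are measurements on system A of a tripartite state rho_ABC, with
      % eigenbases {|phi_x>} and {|psi_z>}; c quantifies their incompatibility.
      \[
        H(X|B) + H(Z|C) \;\ge\; \log_2 \frac{1}{c},
        \qquad
        c = \max_{x,z} \bigl|\langle \phi_x \mid \psi_z \rangle\bigr|^2 .
      \]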

  19. Uncertainty Estimation Improves Energy Measurement and Verification Procedures

    Energy Technology Data Exchange (ETDEWEB)

    Walter, Travis; Price, Phillip N.; Sohn, Michael D.

    2014-05-14

    Implementing energy conservation measures in buildings can reduce energy costs and environmental impacts, but such measures cost money to implement so intelligent investment strategies require the ability to quantify the energy savings by comparing actual energy used to how much energy would have been used in absence of the conservation measures (known as the baseline energy use). Methods exist for predicting baseline energy use, but a limitation of most statistical methods reported in the literature is inadequate quantification of the uncertainty in baseline energy use predictions. However, estimation of uncertainty is essential for weighing the risks of investing in retrofits. Most commercial buildings have, or soon will have, electricity meters capable of providing data at short time intervals. These data provide new opportunities to quantify uncertainty in baseline predictions, and to do so after shorter measurement durations than are traditionally used. In this paper, we show that uncertainty estimation provides greater measurement and verification (M&V) information and helps to overcome some of the difficulties with deciding how much data is needed to develop baseline models and to confirm energy savings. We also show that cross-validation is an effective method for computing uncertainty. In so doing, we extend a simple regression-based method of predicting energy use using short-interval meter data. We demonstrate the methods by predicting energy use in 17 real commercial buildings. We discuss the benefits of uncertainty estimates which can provide actionable decision making information for investing in energy conservation measures.
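
    A minimal sketch of the cross-validation idea is given below: a simple regression baseline is fitted to synthetic hourly meter data, and the distribution of out-of-fold errors serves as an empirical uncertainty estimate. The feature set and the synthetic data generator are illustrative assumptions, not the paper's regression model.

      # Sketch: k-fold cross-validation of a baseline energy model; the spread of
      # out-of-fold residuals gives an empirical uncertainty band for predictions.
      import numpy as np

      rng = np.random.default_rng(5)
      n = 24 * 90                                   # ~3 months of hourly data
      hour = np.arange(n) % 24
      temp = 15 + 10 * np.sin(2 * np.pi * np.arange(n) / (24 * 365)) + rng.normal(0, 2, n)
      occupied = ((8 <= hour) & (hour <= 18)).astype(float)
      energy = 50 + 2.0 * np.maximum(temp - 18, 0) + 5 * occupied + rng.normal(0, 3, n)

      X = np.column_stack([np.ones(n), np.maximum(temp - 18, 0), occupied])

      k = 10
      folds = np.array_split(rng.permutation(n), k)
      errors = []
      for test_idx in folds:
          train = np.setdiff1d(np.arange(n), test_idx)
          beta, *_ = np.linalg.lstsq(X[train], energy[train], rcond=None)
          errors.append(energy[test_idx] - X[test_idx] @ beta)
      errors = np.concatenate(errors)

      print("cross-validated RMSE: %.2f" % np.sqrt(np.mean(errors**2)))
      print("approx. 95%% band on hourly baseline: +/- %.2f" % (1.96 * errors.std()))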

  20. GCR environmental models II: Uncertainty propagation methods for GCR environments

    Science.gov (United States)

    Slaba, Tony C.; Blattnig, Steve R.

    2014-04-01

    In order to assess the astronaut exposure received within vehicles or habitats, accurate models of the ambient galactic cosmic ray (GCR) environment are required. Many models have been developed and compared to measurements, with uncertainty estimates often stated to be within 15%. However, intercode comparisons can lead to differences in effective dose exceeding 50%. This is the second of three papers focused on resolving this discrepancy. The first paper showed that GCR heavy ions with boundary energies below 500 MeV/n induce less than 5% of the total effective dose behind shielding. Yet, due to limitations on available data, model development and validation are heavily influenced by comparisons to measurements taken below 500 MeV/n. In the current work, the focus is on developing an efficient method for propagating uncertainties in the ambient GCR environment to effective dose values behind shielding. A simple approach utilizing sensitivity results from the first paper is described and shown to be equivalent to a computationally expensive Monte Carlo uncertainty propagation. The simple approach allows a full uncertainty propagation to be performed once GCR uncertainty distributions are established. This rapid analysis capability may be integrated into broader probabilistic radiation shielding analysis and also allows error bars (representing boundary condition uncertainty) to be placed around point estimates of effective dose.

  1. Users manual for the FORSS sensitivity and uncertainty analysis code system

    International Nuclear Information System (INIS)

    FORSS is a code system used to study relationships between nuclear reaction cross sections, integral experiments, reactor performance parameter predictions and associated uncertainties. This report describes the computing environment and the modules currently used to implement FORSS Sensitivity and Uncertainty Methodology

  2. Users manual for the FORSS sensitivity and uncertainty analysis code system

    Energy Technology Data Exchange (ETDEWEB)

    Lucius, J.L.; Weisbin, C.R.; Marable, J.H.; Drischler, J.D.; Wright, R.Q.; White, J.E.

    1981-01-01

    FORSS is a code system used to study relationships between nuclear reaction cross sections, integral experiments, reactor performance parameter predictions and associated uncertainties. This report describes the computing environment and the modules currently used to implement FORSS Sensitivity and Uncertainty Methodology.

  3. Word learning under infinite uncertainty

    CERN Document Server

    Blythe, Richard A; Smith, Kenny

    2014-01-01

    Language learners learn the meanings of many thousands of words, despite encountering them in complex environments where infinitely many meanings might be inferred by the learner as their true meaning. This problem of infinite referential uncertainty is often attributed to Willard Van Orman Quine. We provide a mathematical formalisation of an ideal cross-situational learner attempting to learn under infinite referential uncertainty, and identify conditions under which this can happen. As Quine's intuitions suggest, learning under infinite uncertainty is possible, provided that learners have some means of ranking candidate word meanings in terms of their plausibility; furthermore, our analysis shows that this ranking could in fact be exceedingly weak, implying that constraints allowing learners to infer the plausibility of candidate word meanings could also be weak.

  4. Uncertainty in hydrological change modelling

    DEFF Research Database (Denmark)

    Seaby, Lauren Paige

    Hydrological change modelling methodologies generally use climate model outputs to force hydrological simulations under changed conditions. There are nested sources of uncertainty throughout this methodology, including the choice of climate model and subsequent bias correction methods. This Ph.D. study evaluates the uncertainty of the impact of climate change in hydrological simulations given multiple climate models and bias correction methods of varying complexity. Three distribution based scaling methods (DBS) were developed and benchmarked against a more simplistic and commonly used delta change (DC) approach. These climate model projections were then used to force hydrological simulations under climate change for the island Sjælland in Denmark to analyse the contribution of different climate models and bias correction methods to overall uncertainty in the hydrological change modelling...
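
    A minimal sketch of the simpler delta change (DC) approach mentioned above is shown below; the synthetic precipitation series and the monthly scaling factors are illustrative assumptions, and the distribution-based scaling (DBS) methods of the study are more involved than this.

      # Sketch: delta change bias correction. Observed daily precipitation is scaled
      # month-by-month with the ratio of the climate model's future to present-day
      # monthly means, giving a "changed" series to force the hydrological model.
      import numpy as np

      rng = np.random.default_rng(7)
      months = np.tile(np.repeat(np.arange(1, 13), 30), 10)       # 10 idealised years

      obs = rng.gamma(shape=0.8, scale=4.0, size=months.size)     # observed precip (mm)
      rcm_present = rng.gamma(shape=0.8, scale=4.0, size=months.size)
      rcm_future = rng.gamma(shape=0.8, scale=4.6, size=months.size)  # wetter future

      delta = {m: rcm_future[months == m].mean() / rcm_present[months == m].mean()
               for m in range(1, 13)}
      factors = np.array([delta[int(m)] for m in months])
      obs_changed = obs * factors

      print("annual mean change applied: %+.1f%%"
            % (100 * (obs_changed.mean() / obs.mean() - 1)))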

  5. Uncertainty and Sensitivity Analyses Plan

    Energy Technology Data Exchange (ETDEWEB)

    Simpson, J.C.; Ramsdell, J.V. Jr.

    1993-04-01

    Hanford Environmental Dose Reconstruction (HEDR) Project staff are developing mathematical models to be used to estimate the radiation dose that individuals may have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. An uncertainty and sensitivity analyses plan is essential to understand and interpret the predictions from these mathematical models. This is especially true in the case of the HEDR models where the values of many parameters are unknown. This plan gives a thorough documentation of the uncertainty and hierarchical sensitivity analysis methods recommended for use on all HEDR mathematical models. The documentation includes both technical definitions and examples. In addition, an extensive demonstration of the uncertainty and sensitivity analysis process is provided using actual results from the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC). This demonstration shows how the approaches used in the recommended plan can be adapted for all dose predictions in the HEDR Project.

  6. Decommissioning Funding: Ethics, Implementation, Uncertainties

    International Nuclear Information System (INIS)

    This status report on decommissioning funding: ethics, implementation, uncertainties is based on a review of recent literature and materials presented at NEA meetings in 2003 and 2004, and particularly at a topical session organised in November 2004 on funding issues associated with the decommissioning of nuclear power facilities. The report also draws on the experience of the NEA Working Party on Decommissioning and Dismantling (WPDD). This report offers, in a concise form, an overview of relevant considerations on decommissioning funding mechanisms with regard to ethics, implementation and uncertainties. Underlying ethical principles found in international agreements are identified, and factors influencing the accumulation and management of funds for decommissioning nuclear facilities are discussed together with the main sources of uncertainties of funding systems

  7. Uncertainty and Sensitivity Analyses Plan

    International Nuclear Information System (INIS)

    Hanford Environmental Dose Reconstruction (HEDR) Project staff are developing mathematical models to be used to estimate the radiation dose that individuals may have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. An uncertainty and sensitivity analyses plan is essential to understand and interpret the predictions from these mathematical models. This is especially true in the case of the HEDR models where the values of many parameters are unknown. This plan gives a thorough documentation of the uncertainty and hierarchical sensitivity analysis methods recommended for use on all HEDR mathematical models. The documentation includes both technical definitions and examples. In addition, an extensive demonstration of the uncertainty and sensitivity analysis process is provided using actual results from the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC). This demonstration shows how the approaches used in the recommended plan can be adapted for all dose predictions in the HEDR Project

  8. Davis-Besse uncertainty study

    Energy Technology Data Exchange (ETDEWEB)

    Davis, C B

    1987-08-01

    The uncertainties of calculations of loss-of-feedwater transients at Davis-Besse Unit 1 were determined to address concerns of the US Nuclear Regulatory Commission relative to the effectiveness of feed and bleed cooling. Davis-Besse Unit 1 is a pressurized water reactor of the raised-loop Babcock and Wilcox design. A detailed, quality-assured RELAP5/MOD2 model of Davis-Besse was developed at the Idaho National Engineering Laboratory. The model was used to perform an analysis of the loss-of-feedwater transient that occurred at Davis-Besse on June 9, 1985. A loss-of-feedwater transient followed by feed and bleed cooling was also calculated. The evaluation of uncertainty was based on the comparisons of calculations and data, comparisons of different calculations of the same transient, sensitivity calculations, and the propagation of the estimated uncertainty in initial and boundary conditions to the final calculated results.

  9. Uncertainty Reduction using Bayesian Inference and Sensitivity Analysis: A Sequential Approach to the NASA Langley Uncertainty Quantification Challenge

    Science.gov (United States)

    Sankararaman, Shankar

    2016-01-01

    This paper presents a computational framework for uncertainty characterization and propagation, and sensitivity analysis under the presence of aleatory and epistemic uncertainty, and develops a rigorous methodology for efficient refinement of epistemic uncertainty by identifying important epistemic variables that significantly affect the overall performance of an engineering system. The proposed methodology is illustrated using the NASA Langley Uncertainty Quantification Challenge (NASA-LUQC) problem that deals with uncertainty analysis of a generic transport model (GTM). First, Bayesian inference is used to infer subsystem-level epistemic quantities using the subsystem-level model and corresponding data. Second, tools of variance-based global sensitivity analysis are used to identify four important epistemic variables (this limitation specified in the NASA-LUQC is reflective of practical engineering situations where not all epistemic variables can be refined due to time/budget constraints) that significantly affect system-level performance. The most significant contribution of this paper is the development of the sequential refinement methodology, where epistemic variables for refinement are not identified all at once. Instead, only one variable is first identified, and then Bayesian inference and global sensitivity calculations are repeated to identify the next important variable. This procedure is continued until all four variables are identified and the refinement in the system-level performance is computed. The advantages of the proposed sequential refinement methodology over the all-at-once uncertainty refinement approach are explained, and the methodology is then applied to the NASA Langley Uncertainty Quantification Challenge problem.
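
    A minimal Python sketch of the sequential-refinement idea is given below, with a hypothetical algebraic model standing in for the engineering system and a crude binning estimator of variance-based first-order sensitivity indices; the Bayesian updating step is reduced to simply shrinking the selected input's range.

        import numpy as np

        def first_order_indices(f, samplers, n=20000, bins=40, seed=0):
            """Crude variance-based first-order sensitivity indices via binning:
            S_i ~ Var(E[Y | X_i]) / Var(Y)."""
            rng = np.random.default_rng(seed)
            X = np.column_stack([s(rng, n) for s in samplers])
            Y = f(X)
            total_var = Y.var()
            indices = []
            for i in range(X.shape[1]):
                edges = np.quantile(X[:, i], np.linspace(0.0, 1.0, bins + 1))
                which = np.clip(np.digitize(X[:, i], edges[1:-1]), 0, bins - 1)
                cond_means = np.array([Y[which == b].mean() for b in range(bins)])
                indices.append(cond_means.var() / total_var)
            return np.array(indices)

        # Hypothetical stand-in for the system model and its epistemic inputs.
        def model(X):
            return X[:, 0] ** 2 + 0.5 * X[:, 1] + 0.1 * X[:, 2] * X[:, 1]

        samplers = [lambda rng, n: rng.uniform(-1.0, 1.0, n) for _ in range(3)]

        # Sequential refinement: repeatedly refine only the most influential input.
        for step in range(2):
            S = first_order_indices(model, samplers, seed=step)
            top = int(np.argmax(S))
            print(f"step {step}: indices {np.round(S, 3)}, refine variable {top}")
            # 'Refinement' stands in for Bayesian updating with new data: here we
            # simply shrink that input's range, as a placeholder for the posterior.
            samplers[top] = (lambda rng, n: rng.uniform(-0.2, 0.2, n))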

  10. Survey and Evaluate Uncertainty Quantification Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Guang; Engel, David W.; Eslinger, Paul W.

    2012-02-01

    The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and academic institutions that will develop and deploy state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technologies from discovery to development, demonstration, and ultimately the widespread deployment to hundreds of power plants. The CCSI Toolset will provide end users in industry with a comprehensive, integrated suite of scientifically validated models with uncertainty quantification, optimization, risk analysis and decision making capabilities. The CCSI Toolset will incorporate commercial and open-source software currently in use by industry and will also develop new software tools as necessary to fill technology gaps identified during execution of the project. The CCSI Toolset will (1) enable promising concepts to be more quickly identified through rapid computational screening of devices and processes; (2) reduce the time to design and troubleshoot new devices and processes; (3) quantify the technical risk in taking technology from laboratory-scale to commercial-scale; and (4) stabilize deployment costs more quickly by replacing some of the physical operational tests with virtual power plant simulations. The goal of CCSI is to deliver a toolset that can simulate the scale-up of a broad set of new carbon capture technologies from laboratory scale to full commercial scale. To provide a framework around which the toolset can be developed and demonstrated, we will focus on three Industrial Challenge Problems (ICPs) related to carbon capture technologies relevant to U.S. pulverized coal (PC) power plants. Post combustion capture by solid sorbents is the technology focus of the initial ICP (referred to as ICP A). The goal of the uncertainty quantification (UQ) task (Task 6) is to provide a set of capabilities to the user community for the quantification of uncertainties associated with the carbon

  11. Numerical Continuation Methods for Intrusive Uncertainty Quantification Studies

    Energy Technology Data Exchange (ETDEWEB)

    Safta, Cosmin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Najm, Habib N. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Phipps, Eric Todd [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-09-01

    Rigorous modeling of engineering systems relies on efficient propagation of uncertainty from input parameters to model outputs. In recent years, there has been substantial development of probabilistic polynomial chaos (PC) Uncertainty Quantification (UQ) methods, enabling studies in expensive computational models. One approach, termed "intrusive", involving reformulation of the governing equations, has been found to have superior computational performance compared to non-intrusive sampling-based methods in relevant large-scale problems, particularly in the context of emerging architectures. However, the utility of intrusive methods has been severely limited due to detrimental numerical instabilities associated with strong nonlinear physics. Previous methods for stabilizing these constructions tend to add unacceptably high computational costs, particularly in problems with many uncertain parameters. In order to address these challenges, we propose to adapt and improve numerical continuation methods for the robust time integration of intrusive PC system dynamics. We propose adaptive methods, starting with a small uncertainty for which the model has stable behavior and gradually moving to larger uncertainty where the instabilities are rampant, in a manner that provides a suitable solution.

  12. Large-Scale Inverse Problems and Quantification of Uncertainty

    CERN Document Server

    Biegler, Lorenz; Ghattas, Omar

    2010-01-01

    Large-scale inverse problems and associated uncertainty quantification has become an important area of research, central to a wide range of science and engineering applications. Written by leading experts in the field, Large-scale Inverse Problems and Quantification of Uncertainty focuses on the computational methods used to analyze and simulate inverse problems. The text provides PhD students, researchers, advanced undergraduate students, and engineering practitioners with the perspectives of researchers in areas of inverse problems and data assimilation, ranging from statistics and large-sca

  13. Uncertainty Modeling Based on Bayesian Network in Ontology Mapping

    Institute of Scientific and Technical Information of China (English)

    LI Yuhua; LIU Tao; SUN Xiaolin

    2006-01-01

    How to deal with uncertainty is crucial in exact concept mapping between ontologies. This paper presents a new framework for modeling uncertainty in ontologies based on Bayesian networks (BN). In our approach, the Web Ontology Language (OWL) is extended to add probabilistic markups for attaching probability information, the source and target ontologies (expressed in the extended OWL) are translated into Bayesian networks, and the mapping between the two ontologies is derived by constructing the conditional probability tables (CPTs) of the BN using an improved algorithm, named I-IPFP, based on the iterative proportional fitting procedure (IPFP). The basic ideas of this framework and algorithm are validated by positive results from computer experiments.
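
    For readers unfamiliar with IPFP, the following Python sketch shows the classical iterative proportional fitting step on a two-dimensional joint distribution (not the paper's I-IPFP variant or its BN construction); the initial table and target marginals are invented for illustration.

        import numpy as np

        def ipfp(joint, row_marginal, col_marginal, n_iter=100, tol=1e-10):
            """Iterative proportional fitting: rescale a 2-D joint distribution so its
            marginals match the targets while preserving its interaction structure."""
            p = joint / joint.sum()
            for _ in range(n_iter):
                p *= (row_marginal / p.sum(axis=1))[:, None]   # fit the row marginal
                p *= (col_marginal / p.sum(axis=0))[None, :]   # fit the column marginal
                if (np.abs(p.sum(axis=1) - row_marginal).max() < tol and
                        np.abs(p.sum(axis=0) - col_marginal).max() < tol):
                    break
            return p

        # Toy example: initial joint over two binary concepts and target marginals
        # that might come from probabilistic markups attached to the two ontologies.
        joint0 = np.array([[0.3, 0.2],
                           [0.1, 0.4]])
        p = ipfp(joint0, row_marginal=np.array([0.6, 0.4]),
                 col_marginal=np.array([0.5, 0.5]))
        print(p, p.sum(axis=1), p.sum(axis=0))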

  14. Uncertainty quantification in lattice QCD calculations for nuclear physics

    Energy Technology Data Exchange (ETDEWEB)

    Beane, Silas R. [Univ. of Washington, Seattle, WA (United States); Detmold, William [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Orginos, Kostas [College of William and Mary, Williamsburg, VA (United States); Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States); Savage, Martin J. [Institute for Nuclear Theory, Seattle, WA (United States)

    2015-02-05

    The numerical technique of Lattice QCD holds the promise of connecting the nuclear forces, nuclei, the spectrum and structure of hadrons, and the properties of matter under extreme conditions with the underlying theory of the strong interactions, quantum chromodynamics. A distinguishing, and thus far unique, feature of this formulation is that all of the associated uncertainties, both statistical and systematic, can, in principle, be systematically reduced to any desired precision with sufficient computational and human resources. As a result, we review the sources of uncertainty inherent in Lattice QCD calculations for nuclear physics, and discuss how each is quantified in current efforts.

  15. Adaptive second-order sliding mode control with uncertainty compensation

    Science.gov (United States)

    Bartolini, G.; Levant, A.; Pisano, A.; Usai, E.

    2016-09-01

    This paper endows the second-order sliding mode control (2-SMC) approach with additional capabilities of learning and control adaptation. We present a 2-SMC scheme that estimates and compensates for the uncertainties affecting the system dynamics. It also adjusts the discontinuous control effort online, so that it can be reduced to arbitrarily small values. The proposed scheme is particularly useful when the available information regarding the uncertainties is conservative, and the classical 'fixed-gain' SMC would inevitably lead to largely oversized discontinuous control effort. Benefits from the viewpoint of chattering reduction are obtained, as confirmed by computer simulations.

  16. Controlling Uncertainty Decision Making and Learning in Complex Worlds

    CERN Document Server

    Osman, Magda

    2010-01-01

    Controlling Uncertainty: Decision Making and Learning in Complex Worlds reviews and discusses the most current research relating to the ways we can control the uncertain world around us. It features reviews and discussions of the most current research in a number of fields relevant to controlling uncertainty, such as psychology, neuroscience, computer science and engineering; presents a new framework that is designed to integrate a variety of disparate fields of research; and represents the first book of its kind to provide a general overview of work related to understanding control.

  17. Statistics, Uncertainty, and Transmitted Variation

    Energy Technology Data Exchange (ETDEWEB)

    Wendelberger, Joanne Roth [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-11-05

    The field of Statistics provides methods for modeling and understanding data and making decisions in the presence of uncertainty. When examining response functions, variation present in the input variables will be transmitted via the response function to the output variables. This phenomenon can potentially have significant impacts on the uncertainty associated with results from subsequent analysis. This presentation will examine the concept of transmitted variation, its impact on designed experiments, and a method for identifying and estimating sources of transmitted variation in certain settings.
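
    The following Python sketch illustrates transmitted variation for a hypothetical response function: input variation is propagated by Monte Carlo and compared with a first-order (delta-method) approximation. The function and standard deviations are assumptions for illustration only, not from the presentation.

        import numpy as np

        rng = np.random.default_rng(1)

        # Hypothetical response function and input variation.
        def response(x1, x2):
            return x1 ** 2 + 3.0 * x2

        mu = np.array([2.0, 1.0])        # nominal input settings
        sigma = np.array([0.1, 0.05])    # input standard deviations

        # Monte Carlo: variation in the inputs is transmitted through the response.
        x1 = rng.normal(mu[0], sigma[0], 100_000)
        x2 = rng.normal(mu[1], sigma[1], 100_000)
        mc_sd = response(x1, x2).std()

        # First-order propagation: Var(Y) ~ sum_i (df/dx_i)^2 * Var(x_i).
        grad = np.array([2.0 * mu[0], 3.0])
        delta_sd = np.sqrt(np.sum((grad * sigma) ** 2))

        print(f"transmitted SD: Monte Carlo {mc_sd:.4f}, first-order {delta_sd:.4f}")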

  18. Review on Generalized Uncertainty Principle

    CERN Document Server

    Tawfik, Abdel Nasser

    2015-01-01

    Based on string theory, black hole physics, doubly special relativity and some "thought" experiments, minimal distance and/or maximum momentum are proposed. As alternatives to the generalized uncertainty principle (GUP), the modified dispersion relation, the space noncommutativity, the Lorentz invariance violation, and the quantum-gravity-induced birefringence effects are summarized. The origin of minimal measurable quantities and the different GUP approaches are reviewed and the corresponding observations are analysed. Bounds on the GUP parameter are discussed and implemented in understanding recent PLANCK observations on the cosmic inflation. The higher-order GUP approaches predict minimal length uncertainty with and without maximum momenta.

  19. Awe, uncertainty, and agency detection.

    Science.gov (United States)

    Valdesolo, Piercarlo; Graham, Jesse

    2014-01-01

    Across five studies, we found that awe increases both supernatural belief (Studies 1, 2, and 5) and intentional-pattern perception (Studies 3 and 4), two phenomena that have been linked to agency detection, or the tendency to interpret events as the consequence of intentional and purpose-driven agents. Effects were both directly and conceptually replicated, and mediational analyses revealed that these effects were driven by the influence of awe on tolerance for uncertainty. Experiences of awe decreased tolerance for uncertainty, which, in turn, increased the tendency to believe in nonhuman agents and to perceive human agency in random events. PMID:24247728

  20. Uncertainties in offsite consequence analysis

    Energy Technology Data Exchange (ETDEWEB)

    Young, M.L.; Harper, F.T.; Lui, C.H.

    1996-03-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences from the accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the U.S. Nuclear Regulatory Commission and the European Commission began co-sponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables using a formal expert judgment elicitation and evaluation process. This paper focuses on the methods used in and results of this on-going joint effort.

  1. Regulating renewable resources under uncertainty

    DEFF Research Database (Denmark)

    Hansen, Lars Gårn

    Renewable natural resources (like water, fish and wildlife stocks, forests and grazing lands) are critical for the livelihood of millions of people and understanding how they can be managed efficiently is an important economic problem. I show how regulator uncertainty about different economic and ecological parts of the harvesting system affects the optimal choice of instrument for regulating harvesters. I bring prior results into a unified framework and add to these by showing that: 1) quotas are preferred under ecological uncertainty if there are substantial diseconomies of scale in harvesting, 2...

  2. Systemic change increases model projection uncertainty

    Science.gov (United States)

    Verstegen, Judith; Karssenberg, Derek; van der Hilst, Floor; Faaij, André

    2014-05-01

    the neighbourhood doubled, while the influence of slope and potential yield decreased by 75% and 25% respectively. Allowing these systemic changes to occur in our CA in the future (up to 2022) resulted in an increase in model projection uncertainty by a factor of two compared to the assumption of a stationary system. This means that the assumption of a constant model structure is not adequate and largely underestimates uncertainty in the projection. References: Verstegen, J.A., Karssenberg, D., van der Hilst, F., Faaij, A.P.C., 2014. Identifying a land use change cellular automaton by Bayesian data assimilation. Environmental Modelling & Software 53, 121-136. Verstegen, J.A., Karssenberg, D., van der Hilst, F., Faaij, A.P.C., 2012. Spatio-Temporal Uncertainty in Spatial Decision Support Systems: a Case Study of Changing Land Availability for Bioenergy Crops in Mozambique. Computers, Environment and Urban Systems 36, 30-42. Wald, A., Wolfowitz, J., 1940. On a test whether two samples are from the same population. The Annals of Mathematical Statistics 11, 147-162.

  3. Uncertainty of climate change impacts and consequences on the prediction of future hydrological trends

    International Nuclear Information System (INIS)

    In the future, water is very likely to be the resource that will be most severely affected by climate change. It has been shown that small perturbations in precipitation frequency and/or quantity can result in significant impacts on the mean annual discharge. Moreover, modest changes in natural inflows result in larger changes in reservoir storage. There is, however, great uncertainty linked to changes in both the magnitude and direction of future hydrological trends. This presentation discusses the various sources of this uncertainty and their potential impact on the prediction of future hydrological trends. A companion paper will look at adaptation potential, taking into account some of the sources of uncertainty discussed in this presentation. Uncertainty is separated into two main components: climatic uncertainty and 'model and methods' uncertainty. Climatic uncertainty is linked to uncertainty in future greenhouse gas emission scenarios (GHGES) and to general circulation models (GCMs), whose representation of topography and climate processes is imperfect, in large part due to computational limitations. The uncertainty linked to natural variability (which may or may not increase) is also part of the climatic uncertainty. 'Model and methods' uncertainty groups together the uncertainty linked to the different approaches and models needed to transform climate data so that they can be used by hydrological models (such as downscaling methods) and the uncertainty of the models themselves and of their use in a changed climate. The impacts of the various sources of uncertainty on the hydrology of a watershed are demonstrated on the Peribonka River basin (Quebec, Canada). The results indicate that all sources of uncertainty can be important and outline the importance of taking these sources into account for any impact and adaptation studies. Recommendations are outlined for such studies. (author)

  4. Uncertainty in Vs30-based site response

    Science.gov (United States)

    Thompson, Eric; Wald, David J.

    2016-01-01

    Methods that account for site response range in complexity from simple linear categorical adjustment factors to sophisticated nonlinear constitutive models. Seismic‐hazard analysis usually relies on ground‐motion prediction equations (GMPEs); within this framework site response is modeled statistically with simplified site parameters that include the time‐averaged shear‐wave velocity to 30 m (VS30) and basin depth parameters. Because VS30 is not known in most locations, it must be interpolated or inferred through secondary information such as geology or topography. In this article, we analyze a subset of stations for which VS30 has been measured to address effects of VS30 proxies on the uncertainty in the ground motions as modeled by GMPEs. The stations we analyze also include multiple recordings, which allow us to compute the repeatable site effects (or empirical amplification factors [EAFs]) from the ground motions. Although all methods exhibit similar bias, the proxy methods only reduce the ground‐motion standard deviations at long periods when compared to GMPEs without a site term, whereas measured VS30 values reduce the standard deviations at all periods. The standard deviation of the ground motions are much lower when the EAFs are used, indicating that future refinements of the site term in GMPEs have the potential to substantially reduce the overall uncertainty in the prediction of ground motions by GMPEs.

  5. Uncertainty in Greenland glacial isostatic adjustment (Invited)

    Science.gov (United States)

    Milne, G. A.; Lecavalier, B.; Kjeldsen, K. K.; Kjaer, K.; Wolstencroft, M.; Wake, L. M.; Simpson, M. J.; Long, A. J.; Woodroffe, S.; Korsgaard, N. J.; Bjork, A. A.; Khan, S. A.

    2013-12-01

    It is well known that the interpretation of geodetic data in Greenland to constrain recent ice mass changes requires knowledge of isostatic land motion associated with past changes in the ice sheet. In this talk we will consider a variety of factors that limit how well the signal due to past mass changes (commonly referred to as glacial isostatic adjustment (GIA)) can be defined. Predictions based on a new model of Greenland GIA will be shown. Using these predictions as a reference, we will consider the influence of plausible variations in some key aspects of both the Earth and ice load components of the GIA model on predictions of land motion and gravity changes. The sensitivity of model output to plausible variations in both depth-dependent and lateral viscosity structure will be considered. With respect to the ice model, we will compare the relative contributions of loading during key periods of the ice history with a focus on the past few thousand years. In particular, we will show predictions of contemporary land motion and gravity changes due to loading changes following the Little Ice Age computed using a new reconstruction of ice thickness changes based largely on empirical data. A primary contribution of this work will be the identification of dominant sources of uncertainty in current models of Greenland GIA and the regions most significantly affected by this uncertainty.

  6. Adaptive Strategies for Materials Design using Uncertainties

    Science.gov (United States)

    Balachandran, Prasanna V.; Xue, Dezhen; Theiler, James; Hogden, John; Lookman, Turab

    2016-01-01

    We compare several adaptive design strategies using a data set of 223 M2AX family of compounds for which the elastic properties [bulk (B), shear (G), and Young’s (E) modulus] have been computed using density functional theory. The design strategies are decomposed into an iterative loop with two main steps: machine learning is used to train a regressor that predicts elastic properties in terms of elementary orbital radii of the individual components of the materials; and a selector uses these predictions and their uncertainties to choose the next material to investigate. The ultimate goal is to obtain a material with desired elastic properties in as few iterations as possible. We examine how the choice of data set size, regressor and selector impact the design. We find that selectors that use information about the prediction uncertainty outperform those that don’t. Our work is a step in illustrating how adaptive design tools can guide the search for new materials with desired properties.
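
    A toy Python version of such an adaptive design loop is sketched below: a cheap ensemble regressor supplies predictions with uncertainties, and an uncertainty-aware selector (an upper-confidence-bound rule, not one of the selectors compared in the paper) picks the next candidate to evaluate. The one-dimensional 'property' function stands in for the DFT-computed moduli.

        import numpy as np

        rng = np.random.default_rng(0)

        # Hypothetical stand-in for the computed property of each candidate material.
        def true_property(x):
            return -(x - 0.7) ** 2 + 1.0

        pool = np.linspace(0.0, 1.0, 201)                         # candidate materials
        idx = list(rng.choice(len(pool), size=5, replace=False))  # initial training set

        def loo_predict(x_train, y_train, x_query, deg=2):
            """Leave-one-out ensemble of polynomial fits; the spread of the member
            predictions is used as a simple uncertainty estimate."""
            preds = []
            for i in range(len(x_train)):
                keep = np.arange(len(x_train)) != i
                coef = np.polyfit(x_train[keep], y_train[keep], deg)
                preds.append(np.polyval(coef, x_query))
            preds = np.asarray(preds)
            return preds.mean(axis=0), preds.std(axis=0)

        for it in range(10):
            x_train, y_train = pool[idx], true_property(pool[idx])
            mu, sd = loo_predict(x_train, y_train, pool)
            score = mu + 1.5 * sd                 # uncertainty-aware selector (UCB-style)
            score[idx] = -np.inf                  # never re-select an evaluated material
            idx.append(int(np.argmax(score)))     # "measure" (here: evaluate) the pick

        best = pool[idx][np.argmax(true_property(pool[idx]))]
        print(f"best candidate found near x = {best:.3f} (true optimum at 0.7)")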

  7. UncertWeb: chaining web services accounting for uncertainty

    Science.gov (United States)

    Cornford, Dan; Jones, Richard; Bastin, Lucy; Williams, Matthew; Pebesma, Edzer; Nativi, Stefano

    2010-05-01

    The development of interoperable services that permit access to data and processes, typically using web service based standards opens up the possibility for increasingly complex chains of data and processes, which might be discovered and composed in increasingly automatic ways. This concept, sometimes referred to as the "Model Web", offers the promise of integrated (Earth) system models, with pluggable web service based components which can be discovered, composed and evaluated dynamically. A significant issue with such service chains, indeed in any composite model composed of coupled components, is that in all interesting (non-linear) cases the effect of uncertainties on inputs, or components within the chain will have complex, potentially unexpected effects on the outputs. Within the FP7 UncertWeb project we will be developing a mechanism and an accompanying set of tools to enable rigorous uncertainty management in web based service chains involving both data and processes. The project will exploit and extend the UncertML candidate standard to flexibly propagate uncertainty through service chains, including looking at mechanisms to develop uncertainty enabled profiles of existing Open Geospatial Consortium services. To facilitate the use of such services we will develop tools to address the definition of the input uncertainties (elicitation), manage the uncertainty propagation (emulation), undertake uncertainty and sensitivity analysis and visualise the output uncertainty. In this talk we will outline the challenges of the UncertWeb project, illustrating this with a prototype service chain we have created for correcting station level pressure to sea-level pressure, which accounts for the various uncertainties involved. In particular we will discuss some of the challenges of chaining Open Geospatial Consortium services using the Business Process Execution Language. We will also address the issue of computational cost and communication bandwidth requirements for

  8. Does inflation uncertainty increase with inflation?

    OpenAIRE

    John E. Golob

    1994-01-01

    One of the most important costs of inflation is the uncertainty it creates about future inflation. This uncertainty clouds the decisionmaking of consumers and businesses and reduces economic well-being. Without this uncertainty, consumers and businesses could better plan for the future. According to many analysts, uncertainty about future inflation rises as inflation rises. As a result, these analysts argue that the Federal Reserve could reduce inflation uncertainty by reducing inflation. Oth...

  9. Uncertainty Analyses in the Finite-Difference Time-Domain Method

    OpenAIRE

    Edwards, R. S.; Marvin, A. C.; Porter, S J

    2010-01-01

    Providing estimates of the uncertainty in results obtained by Computational Electromagnetic (CEM) simulations is essential when determining the acceptability of the results. The Monte Carlo method (MCM) has been previously used to quantify the uncertainty in CEM simulations. Other computationally efficient methods have been investigated more recently, such as the polynomial chaos method (PCM) and the method of moments (MoM). This paper introduces a novel implementation of the PCM and the MoM ...

  10. Quantum Computation and Quantum Information

    OpenAIRE

    Wang, Yazhen

    2012-01-01

    Quantum computation and quantum information are of great current interest in computer science, mathematics, physical sciences and engineering. They will likely lead to a new wave of technological innovations in communication, computation and cryptography. As the theory of quantum physics is fundamentally stochastic, randomness and uncertainty are deeply rooted in quantum computation, quantum simulation and quantum information. Consequently quantum algorithms are random in nature, and quantum ...

  11. Entropy and the uncertainty principle

    CERN Document Server

    Frank, Rupert L

    2011-01-01

    We generalize, improve and unify theorems of Rumin and of Maassen-Uffink about classical entropies associated to quantum density matrices. These theorems refer to the classical entropies of the diagonals of a density matrix in two different bases. Thus they provide a kind of uncertainty principle. Our inequalities are sharp because they are exact in the high-temperature or semi-classical limit.

  12. Uncertainty propagation in nuclear forensics

    International Nuclear Information System (INIS)

    Uncertainty propagation formulae are presented for age dating in support of nuclear forensics. The age of radioactive material in this context refers to the time elapsed since a particular radionuclide was chemically separated from its decay product(s). The decay of the parent radionuclide and ingrowth of the daughter nuclide are governed by statistical decay laws. Mathematical equations allow calculation of the age of specific nuclear material through the atom ratio between parent and daughter nuclides, or through the activity ratio provided that the daughter nuclide is also unstable. The derivation of the uncertainty formulae of the age may present some difficulty to the user community and so the exact solutions, some approximations, a graphical representation and their interpretation are presented in this work. Typical nuclides of interest are actinides in the context of non-proliferation commitments. The uncertainty analysis is applied to a set of important parent–daughter pairs and the need for more precise half-life data is examined. - Highlights: • Uncertainty propagation formulae for age dating with nuclear chronometers. • Applied to parent–daughter pairs used in nuclear forensics. • Investigated need for better half-life data
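
    As a concrete illustration of the simplest chronometer case (a stable, initially absent daughter), the Python sketch below computes the age from the daughter/parent atom ratio, t = ln(1 + R)/λ, and propagates the ratio and half-life uncertainties to first order; the numerical values are illustrative and not taken from the paper.

        import numpy as np

        def age_and_uncertainty(ratio, u_ratio, half_life, u_half_life):
            """Age from a daughter/parent atom ratio R for a stable daughter that was
            completely removed at separation: t = ln(1 + R) / lambda, with first-order
            propagation of the ratio and half-life standard uncertainties."""
            lam = np.log(2.0) / half_life
            u_lam = lam * u_half_life / half_life
            t = np.log(1.0 + ratio) / lam
            dt_dR = 1.0 / (lam * (1.0 + ratio))    # sensitivity to the atom ratio
            dt_dlam = -t / lam                     # sensitivity to the decay constant
            u_t = np.sqrt((dt_dR * u_ratio) ** 2 + (dt_dlam * u_lam) ** 2)
            return t, u_t

        # Illustrative numbers only: an atom ratio of 2.0e-4 known to 1 %, and a
        # 24110-year half-life known to 0.3 %.
        t, u_t = age_and_uncertainty(ratio=2.0e-4, u_ratio=2.0e-6,
                                     half_life=24110.0, u_half_life=72.0)
        print(f"age = {t:.2f} y +/- {u_t:.2f} y")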

  13. Numerical modeling of economic uncertainty

    DEFF Research Database (Denmark)

    Schjær-Jacobsen, Hans

    2007-01-01

    Representation and modeling of economic uncertainty is addressed by different modeling methods, namely stochastic variables and probabilities, interval analysis, and fuzzy numbers, in particular triple estimates. Focusing on discounted cash flow analysis, numerical results are presented, comparisons are made between alternative modeling methods, and characteristics of the methods are discussed...
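
    As a small illustration of the interval-analysis option mentioned above, the Python sketch below bounds the net present value of a project whose cash flows and discount rate are given as intervals; the monotonicity argument used for the bounds holds only for non-negative future cash flows, and all figures are invented.

        def interval_npv(cash_flows, rate):
            """Interval NPV for non-negative future cash flows: the lower bound pairs the
            low cash-flow estimates with the high discount rate, and vice versa.
            cash_flows: list of (low, high) per period t = 1..n; rate: (low, high)."""
            lo = sum(c_lo / (1.0 + rate[1]) ** t
                     for t, (c_lo, _) in enumerate(cash_flows, start=1))
            hi = sum(c_hi / (1.0 + rate[0]) ** t
                     for t, (_, c_hi) in enumerate(cash_flows, start=1))
            return lo, hi

        # Triple-estimate style inputs (pessimistic, optimistic) for a 3-period project.
        flows = [(90.0, 110.0), (140.0, 160.0), (180.0, 220.0)]
        investment = 350.0
        lo, hi = interval_npv(flows, rate=(0.05, 0.10))
        print(f"NPV interval: [{lo - investment:.1f}, {hi - investment:.1f}]")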

  14. Addressing Replication and Model Uncertainty

    DEFF Research Database (Denmark)

    Ebersberger, Bernd; Galia, Fabrice; Laursen, Keld;

    Many fields of strategic management are subject to an important degree of model uncertainty. This is because the true model, and therefore the selection of appropriate explanatory variables, is essentially unknown. Drawing on the literature on the determinants of innovation, and by analyzing...

  15. Uncertainties in radiation flow experiments

    Science.gov (United States)

    Fryer, C. L.; Dodd, E.; Even, W.; Fontes, C. J.; Greeff, C.; Hungerford, A.; Kline, J.; Mussack, K.; Tregillis, I.; Workman, J. B.; Benstead, J.; Guymer, T. M.; Moore, A. S.; Morton, J.

    2016-03-01

    Although the fundamental physics behind radiation and matter flow is understood, many uncertainties remain in the exact behavior of macroscopic fluids in systems ranging from pure turbulence to coupled radiation hydrodynamics. Laboratory experiments play an important role in studying this physics to allow scientists to test their macroscopic models of these phenomena. However, because the fundamental physics is well understood, precision experiments are required to validate existing codes already tested by a suite of analytic, manufactured and convergence solutions. To conduct such high-precision experiments requires a detailed understanding of the experimental errors and the nature of their uncertainties on the observed diagnostics. In this paper, we study the uncertainties plaguing many radiation-flow experiments, focusing on those using a hohlraum (dynamic or laser-driven) source and a foam-density target. This study focuses on the effect these uncertainties have on the breakout time of the radiation front. We find that, even if the errors in the initial conditions and numerical methods are Gaussian, the errors in the breakout time are asymmetric, leading to a systematic bias in the observed data. We must understand these systematics to produce the high-precision experimental results needed to study this physics.

  16. Model uncertainty in growth empirics

    NARCIS (Netherlands)

    Prüfer, P.

    2008-01-01

    This thesis applies so-called Bayesian model averaging (BMA) to three different economic questions substantially exposed to model uncertainty. Chapter 2 addresses a major issue of modern development economics: the analysis of the determinants of pro-poor growth (PPG), which seeks to combine high gro

  17. Competitive Capacity Investment under Uncertainty

    NARCIS (Netherlands)

    X. Li (Xishu); R.A. Zuidwijk (Rob); M.B.M. de Koster (René); R. Dekker (Rommert)

    2016-01-01

    We consider a long-term capacity investment problem in a competitive market under demand uncertainty. Two firms move sequentially in the competition and a firm’s capacity decision interacts with the other firm’s current and future capacity. Throughout the investment race, a firm can eith

  18. A preliminary uncertainty analysis of phenomenological inputs in TEXAS-V code

    Energy Technology Data Exchange (ETDEWEB)

    Park, S. H.; Kim, H. D.; Ahn, K. I. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2010-10-15

    Uncertainty analysis is an important step in the safety analysis of nuclear power plants. Best-estimate computer codes are increasingly used in place of conservative codes. These efforts aim at a more precise evaluation of safety margins and at determining the rate of change in code predictions as one or more input parameters vary within their ranges of interest. From this point of view, a severe accident uncertainty analysis system, SAUNA, has been improved for TEXAS-V FCI uncertainty analysis. The main objective of this paper is to present the TEXAS FCI uncertainty analysis results implemented through the SAUNA code.

  19. Planners face a veritable 'cocktail' of uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Jones, C.

    1999-01-01

    Welcome to the US EPA's regulatory cocktail party - it seemingly has no end, you are required to attend, and the outcome is unpredictable. The cocktail's recipe is complicated: one part SOx, two parts NOx, a dash of toxics - heavy on mercury - with a splash of particulate. Planners are intoxicated by uncertainty. The idiosyncrasies of the US emissions trading system are described. While some plants are easily in compliance, others struggle to attain the targets. All have to run complex computer software to monitor emissions. Strategies taken on NOx and ozone are described; wide regulatory uncertainty makes planning compliance strategies even more difficult. 2 figs.

  20. Uncertainty Evaluation of Available Energy and Power

    Energy Technology Data Exchange (ETDEWEB)

    Jon P. Christophersen; John L. Morrison

    2006-05-01

    The Idaho National Laboratory does extensive testing and evaluation of advanced technology batteries and ultracapacitors for applications in electric and hybrid vehicles. The testing is essentially acquiring time records of voltage, current and temperature from a variety of charge and discharge time profiles. From these three basic measured parameters, a complex assortment of derived parameters (resistance, power, etc.) is computed. Derived parameters are in many cases functions of multiple layers of other derived parameters that eventually work back to the three basic measured parameters. The purpose of this paper is to document the methodology used for the uncertainty analysis of the most complicated derived parameters broadly grouped as available energy and available power. This work is an analytical derivation. Future work will report the implementation of algorithms based upon this effort.
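
    The lowest layer of such an analysis can be illustrated with a short Python sketch: a derived parameter (here instantaneous power P = V·I) and its combined standard uncertainty obtained by first-order propagation of the measured voltage and current uncertainties. The channel uncertainties used are assumptions, not the laboratory's values.

        import numpy as np

        def power_with_uncertainty(v, u_v, i, u_i):
            """Derived parameter P = V * I and its combined standard uncertainty from
            uncorrelated voltage and current measurements (first-order propagation):
            u_P^2 = (I * u_V)^2 + (V * u_I)^2."""
            p = v * i
            u_p = np.sqrt((i * u_v) ** 2 + (v * u_i) ** 2)
            return p, u_p

        # Illustrative measurement-channel uncertainties (assumed values).
        p, u_p = power_with_uncertainty(v=3.600, u_v=0.002, i=25.0, u_i=0.05)
        print(f"P = {p:.2f} W +/- {u_p:.2f} W")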

  1. Decision making uncertainty, imperfection, deliberation and scalability

    CERN Document Server

    Kárný, Miroslav; Wolpert, David

    2015-01-01

    This volume focuses on uncovering the fundamental forces underlying dynamic decision making among multiple interacting, imperfect and selfish decision makers. The chapters are written by leading experts from different disciplines, all considering the many sources of imperfection in decision making, and always with an eye to decreasing the myriad discrepancies between theory and real world human decision making. Topics addressed include uncertainty, deliberation cost and the complexity arising from the inherent large computational scale of decision making in these systems. In particular, analyses and experiments are presented which concern: • task allocation to maximize “the wisdom of the crowd”; • design of a society of “edutainment” robots who account for one another's emotional states; • recognizing and counteracting seemingly non-rational human decision making; • coping with extreme scale when learning causality in networks; • efficiently incorporating expert knowledge in personalized...

  2. Groundwater management under sustainable yield uncertainty

    Science.gov (United States)

    Delottier, Hugo; Pryet, Alexandre; Dupuy, Alain

    2015-04-01

    The definition of the sustainable yield (SY) of a groundwater system consists in adjusting pumping rates so as to avoid groundwater depletion and preserve environmental flows. Once stakeholders have defined which impacts can be considered as "acceptable" for both environmental and societal aspects, hydrogeologists use groundwater models to estimate the SY. Yet, these models are based on a simplification of actual groundwater systems, whose hydraulic properties are largely unknown. As a result, the estimated SY is subject to "predictive" uncertainty. We illustrate the issue with a synthetic homogeneous aquifer system in interaction with a stream for steady state and transient conditions. Simulations are conducted with the USGS MODFLOW finite difference model with the river-package. A synthetic dataset is first generated with the numerical model that will further be considered as the "observed" state. In a second step, we conduct the calibration operation as hydrogeologists dealing with real-world, unknown groundwater systems. The RMSE between simulated hydraulic heads and the synthetic "observed" values is used as the objective function. But instead of simply "calibrating" model parameters, we explore the value of the objective function in the parameter space (hydraulic conductivity, storage coefficient and total recharge). We highlight the occurrence of an ellipsoidal "null space", where distinct parameter sets lead to equally low values for the objective function. The optimum of the objective function is not unique, which leads to a range of possible values for the SY. With a large confidence interval for the SY, the use of modeling results for decision-making is challenging. We argue that prior to modeling operations, efforts must be invested so as to narrow the intervals of likely parameter values. Parameter space exploration is effective for estimating SY uncertainty, but not efficient because of its computational burden and is therefore inapplicable for real world

  3. Handling uncertainty and networked structure in robot control

    CERN Document Server

    Tamás, Levente

    2015-01-01

    This book focuses on two challenges posed in robot control by the increasing adoption of robots in the everyday human environment: uncertainty and networked communication. Part I of the book describes learning control to address environmental uncertainty. Part II discusses state estimation, active sensing, and complex scenario perception to tackle sensing uncertainty. Part III completes the book with control of networked robots and multi-robot teams. Each chapter features in-depth technical coverage and case studies highlighting the applicability of the techniques, with real robots or in simulation. Platforms include mobile ground, aerial, and underwater robots, as well as humanoid robots and robot arms. Source code and experimental data are available at http://extras.springer.com. The text gathers contributions from academic and industry experts, and offers a valuable resource for researchers or graduate students in robot control and perception. It also benefits researchers in related areas, such as computer...

  4. LDRD Final Report: Capabilities for Uncertainty in Predictive Science.

    Energy Technology Data Exchange (ETDEWEB)

    Phipps, Eric Todd; Eldred, Michael S; Salinger, Andrew G.; Webster, Clayton G.

    2008-10-01

    Predictive simulation of systems comprised of numerous interconnected, tightly coupled components promises to help solve many problems of scientific and national interest. However, predictive simulation of such systems is extremely challenging due to the coupling of a diverse set of physical and biological length and time scales. This report investigates uncertainty quantification methods for such systems that attempt to exploit their structure to gain computational efficiency. The traditional layering of uncertainty quantification around nonlinear solution processes is inverted to allow for heterogeneous uncertainty quantification methods to be applied to each component in a coupled system. Moreover, this approach allows stochastic dimension reduction techniques to be applied at each coupling interface. The mathematical feasibility of these ideas is investigated in this report, and mathematical formulations for the resulting stochastically coupled nonlinear systems are developed.

  5. Measuring the uncertainties of discharge measurements: interlaboratory experiments in hydrometry

    Science.gov (United States)

    Le Coz, Jérôme; Blanquart, Bertrand; Pobanz, Karine; Dramais, Guillaume; Pierrefeu, Gilles; Hauet, Alexandre; Despax, Aurélien

    2015-04-01

    Quantifying the uncertainty of streamflow data is key for hydrological sciences. The conventional uncertainty analysis based on error propagation techniques is restricted by the absence of traceable discharge standards and by the weight of difficult-to-predict errors related to the operator, procedure and measurement environment. Field interlaboratory experiments recently emerged as an efficient, standardized method to 'measure' the uncertainties of a given streamgauging technique in given measurement conditions. Both uncertainty approaches are compatible and should be developed jointly in the field of hydrometry. In the recent years, several interlaboratory experiments have been reported by different hydrological services. They involved different streamgauging techniques, including acoustic profilers (ADCP), current-meters and handheld radars (SVR). Uncertainty analysis was not always their primary goal: most often, testing the proficiency and homogeneity of instruments, makes and models, procedures and operators was the original motivation. When interlaboratory experiments are processed for uncertainty analysis, once outliers have been discarded all participants are assumed to be equally skilled and to apply the same streamgauging technique in equivalent conditions. A universal requirement is that all participants simultaneously measure the same discharge, which shall be kept constant within negligible variations. To our best knowledge, we were the first to apply the interlaboratory method for computing the uncertainties of streamgauging techniques, according to the authoritative international documents (ISO standards). Several specific issues arise due to the measurements conditions in outdoor canals and rivers. The main limitation is that the best available river discharge references are usually too uncertain to quantify the bias of the streamgauging technique, i.e. the systematic errors that are common to all participants in the experiment. A reference or a
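
    In the spirit of the interlaboratory approach described above, the Python sketch below decomposes simultaneous gaugings of the same constant discharge into within-participant and between-participant components and reports a reproducibility-type uncertainty; the data are invented and the analysis is a simplified one-way layout, not the full ISO procedure.

        import numpy as np

        # Simultaneous gaugings of the same (constant) discharge: rows = participants,
        # columns = repeated measurements. Values are illustrative, in m3/s.
        q = np.array([[10.2, 10.4, 10.1],
                      [ 9.8,  9.9, 10.0],
                      [10.6, 10.5, 10.7],
                      [10.1, 10.0, 10.3]])

        p, r = q.shape
        within_var = q.var(axis=1, ddof=1).mean()                 # repeatability component
        between_var = q.mean(axis=1).var(ddof=1) - within_var / r # between-participant part
        between_var = max(between_var, 0.0)
        u = np.sqrt(within_var + between_var)                     # reproducibility std dev
        U = 2.0 * u                                               # expanded uncertainty, k = 2
        print(f"u = {u:.2f} m3/s, expanded U (k=2) = {U:.2f} m3/s "
              f"({100 * U / q.mean():.1f} % of the measured discharge)")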

  6. An algorithm to improve sampling efficiency for uncertainty propagation using sampling based method

    Energy Technology Data Exchange (ETDEWEB)

    Campolina, Daniel; Lima, Paulo Rubens I., E-mail: campolina@cdtn.br, E-mail: pauloinacio@cpejr.com.br [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil). Servico de Tecnologia de Reatores; Pereira, Claubia; Veloso, Maria Auxiliadora F., E-mail: claubia@nuclear.ufmg.br, E-mail: dora@nuclear.ufmg.br [Universidade Federal de Minas Gerais (UFMG), Belo Horizonte, MG (Brazil). Dept. de Engenharia Nuclear

    2015-07-01

    Sample size and computational uncertainty were varied in order to investigate the sample efficiency and convergence of the sampling-based method for uncertainty propagation. The transport code MCNPX was used to simulate an LWR model and allow the mapping from uncertain inputs of the benchmark experiment to uncertain outputs. Random sampling efficiency was improved through the use of an algorithm for selecting distributions. Mean range, standard deviation range and skewness were verified in order to obtain a better representation of the uncertainty figures. A standard deviation of 5 pcm in the propagated uncertainties over 10 replicates of n samples was adopted as the convergence criterion for the method. An uncertainty of 75 pcm on the reactor k-eff was estimated using a sample of size 93 and a computational uncertainty of 28 pcm to propagate the 1σ uncertainty of the burnable poison radius. For a fixed computational time, it was found for the example under investigation that, in order to reduce the variance of the propagated uncertainty, it is preferable to double the sample size rather than to double the number of particles followed by the Monte Carlo process in the MCNPX code. (author)
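
    A generic Python sketch of the replicate-based convergence check described above follows; a simple analytic function with added statistical noise stands in for the MCNPX transport calculation, and the input uncertainty, sensitivity and convergence criterion are illustrative.

        import numpy as np

        rng = np.random.default_rng(7)

        def model_keff(radius):
            """Hypothetical stand-in for a transport-code run: k-eff response to the
            burnable poison radius plus a small statistical (computational) uncertainty."""
            return 1.0050 - 0.8 * (radius - 0.45) + rng.normal(0.0, 28e-5, radius.shape)

        def propagated_uncertainty(n_samples):
            radius = rng.normal(0.45, 9e-4, n_samples)    # assumed 1-sigma input uncertainty
            return model_keff(radius).std(ddof=1)         # propagated k-eff uncertainty

        n_samples, criterion_pcm = 93, 5.0
        replicates = np.array([propagated_uncertainty(n_samples) for _ in range(10)])
        spread_pcm = replicates.std(ddof=1) * 1e5
        status = "converged" if spread_pcm < criterion_pcm else "increase the sample size"
        print(f"propagated u(k-eff) ~ {replicates.mean() * 1e5:.0f} pcm, "
              f"replicate spread {spread_pcm:.1f} pcm ({status})")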

  7. Maxallent: Maximizers of all Entropies and Uncertainty of Uncertainty

    CERN Document Server

    Gorban, A N

    2013-01-01

    The entropy maximum approach (Maxent) was developed as a minimization of the subjective uncertainty measured by the Boltzmann-Gibbs-Shannon entropy. Many new entropies have been invented in the second half of the 20th century. Now there exists a rich choice of entropies for fitting needs. This diversity of entropies gave rise to a Maxent "anarchism". Maxent approach is now the conditional maximization of an appropriate entropy for the evaluation of the probability distribution when our information is partial and incomplete. The rich choice of non-classical entropies causes a new problem: which entropy is better for a given class of applications? We understand entropy as a measure of uncertainty which increases in Markov processes. In this work, we describe the most general ordering of the distribution space, with respect to which all continuous-time Markov processes are monotonic (the Markov order). For inference, this approach results in a set of conditionally "most random" distributions. Each ...

  8. Applied research in uncertainty modeling and analysis

    CERN Document Server

    Ayyub, Bilal

    2005-01-01

    Uncertainty has been a concern to engineers, managers, and scientists for many years. For a long time uncertainty has been considered synonymous with random, stochastic, statistic, or probabilistic. Since the early sixties views on uncertainty have become more heterogeneous. In the past forty years numerous tools that model uncertainty, above and beyond statistics, have been proposed by several engineers and scientists. The tool/method to model uncertainty in a specific context should really be chosen by considering the features of the phenomenon under consideration, not independent of what is known about the system and what causes uncertainty. In this fascinating overview of the field, the authors provide broad coverage of uncertainty analysis/modeling and its application. Applied Research in Uncertainty Modeling and Analysis presents the perspectives of various researchers and practitioners on uncertainty analysis and modeling outside their own fields and domain expertise. Rather than focusing explicitly on...

  9. Local conditions and uncertainty bands for Semiscale Test S-02-9

    International Nuclear Information System (INIS)

    Analysis was performed to derive local conditions heat transfer parameters and their uncertainties using computer codes and experimentally derived boundary conditions for the Semiscale core for LOCA Test S-02-9. Calculations performed consisted of nominal code cases using best-estimate input parameters and cases where the specified input parameters were perturbed in accordance with the response surface method of uncertainty analysis. The output parameters of interest were those that are used in film boiling heat transfer correlations including enthalpy, pressure, quality, and coolant flow rate. Large uncertainty deviations occurred during low core mass flow periods where the relative flow uncertainties were large. Utilizing the derived local conditions and their associated uncertainties, a study was then made which showed the uncertainty in film boiling heat transfer coefficient varied between 5 and 250%

  10. Local conditions and uncertainty bands for Semiscale Test S-02-9. [PWR]

    Energy Technology Data Exchange (ETDEWEB)

    Varacalle, Jr, D J

    1979-01-01

    Analysis was performed to derive local conditions heat transfer parameters and their uncertainties using computer codes and experimentally derived boundary conditions for the Semiscale core for LOCA Test S-02-9. Calculations performed consisted of nominal code cases using best-estimate input parameters and cases where the specified input parameters were perturbed in accordance with the response surface method of uncertainty analysis. The output parameters of interest were those that are used in film boiling heat transfer correlations including enthalpy, pressure, quality, and coolant flow rate. Large uncertainty deviations occurred during low core mass flow periods where the relative flow uncertainties were large. Utilizing the derived local conditions and their associated uncertainties, a study was then made which showed the uncertainty in film boiling heat transfer coefficient varied between 5 and 250%.
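
    A generic Python sketch of the response surface idea used in these analyses is shown below: the code (here a cheap stand-in function) is run at nominal and perturbed input settings, a quadratic surface is fitted to the results, and the input uncertainties are then propagated by sampling the surface instead of the code. All inputs and coefficients are invented.

        import numpy as np
        from itertools import product

        rng = np.random.default_rng(3)

        def code_run(mass_flow, pressure):
            """Hypothetical stand-in for an expensive thermal-hydraulics calculation
            returning a film boiling heat transfer coefficient (arbitrary units)."""
            return 120.0 + 40.0 * mass_flow + 0.8 * pressure + 15.0 * mass_flow ** 2

        # Perturb each input about its best-estimate value on a small factorial grid.
        nominal = np.array([1.0, 70.0])
        deltas = np.array([0.2, 5.0])
        levels = [-1, 0, 1]
        X = np.array([nominal + np.array(k) * deltas for k in product(levels, levels)])
        y = np.array([code_run(*x) for x in X])

        # Fit a quadratic response surface to the code results.
        def design(X):
            x1, x2 = X[:, 0], X[:, 1]
            return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])

        beta, *_ = np.linalg.lstsq(design(X), y, rcond=None)

        # Propagate input uncertainties through the cheap surface instead of the code.
        samples = np.column_stack([rng.normal(nominal[0], 0.1, 50_000),
                                   rng.normal(nominal[1], 2.5, 50_000)])
        h = design(samples) @ beta
        print(f"heat transfer coefficient: {h.mean():.1f} +/- {h.std():.1f}")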

  11. Accounting for Calibration Uncertainties in X-ray Analysis: Effective Areas in Spectral Fitting

    CERN Document Server

    Lee, Hyunsook; van Dyk, David A; Connors, Alanna; Drake, Jeremy J; Izem, Rima; Meng, Xiao-Li; Min, Shandong; Park, Taeyoung; Ratzlaff, Pete; Siemiginowska, Aneta; Zezas, Andreas

    2011-01-01

    While considerable advances have been made in accounting for statistical uncertainties in astronomical analyses, systematic instrumental uncertainties have generally been ignored. This can be crucial to a proper interpretation of analysis results because instrumental calibration uncertainty is a form of systematic uncertainty. Ignoring it can underestimate error bars and introduce bias into the fitted values of model parameters. Accounting for such uncertainties currently requires extensive case-specific simulations if using existing analysis packages. Here we present general statistical methods that incorporate calibration uncertainties into spectral analysis of high-energy data. We first present a method based on multiple imputation that can be applied with any fitting method, but is necessarily approximate. We then describe a more exact Bayesian approach that works in conjunction with a Markov chain Monte Carlo based fitting. We explore methods for improving computational efficiency, and in particular detail a ...
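
    The multiple-imputation route can be illustrated with a short Python sketch using Rubin-style combining rules: the analysis is repeated with several plausible calibration products, and the between-analysis spread is added to the average statistical variance. The fitted values below are invented, and the combining rule shown is the standard one rather than necessarily the exact prescription of the paper.

        import numpy as np

        def combine_imputations(estimates, variances):
            """Rubin-style combination of M analyses, each run with a different plausible
            calibration product (e.g., a sampled effective-area curve): the pooled
            variance adds the between-imputation spread to the average within-imputation
            variance, so calibration uncertainty inflates the final error bar."""
            estimates, variances = np.asarray(estimates), np.asarray(variances)
            m = len(estimates)
            pooled = estimates.mean()
            within = variances.mean()
            between = estimates.var(ddof=1)
            total_var = within + (1.0 + 1.0 / m) * between
            return pooled, np.sqrt(total_var)

        # Illustrative fitted photon indices and their statistical variances from five
        # spectral fits, each using a different sampled effective-area curve.
        gamma, err = combine_imputations([1.82, 1.79, 1.85, 1.80, 1.84],
                                         [0.03 ** 2] * 5)
        print(f"photon index = {gamma:.3f} +/- {err:.3f}")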

  12. Computational Modeling of Uncertainty Avoidance in Consumer Behavior

    NARCIS (Netherlands)

    Roozmand, O.; Ghasem-Aghaee, N.; Nematbakhsh, M.A.; Baraani, A.; Hofstede, G.J.

    2011-01-01

    Human purchasing behavior is affected by many influential factors. Culture at macro-level and personality at micro-level influence consumer purchasing behavior. People of different cultures tend to accept the values of their own group and consequently have different purchasing behavior. Also, people

  13. Validation uncertainty of MATRA code for subchannel void distributions

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Dae-Hyun; Kim, S. J.; Kwon, H.; Seo, K. W. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-10-15

    To extend the code capability to whole-core subchannel analysis, pre-conditioned Krylov matrix solvers such as BiCGSTAB and GMRES are implemented in the MATRA code, as well as parallel computing algorithms using MPI and OPENMP. It is coded in Fortran 90 and has some user-friendly features such as a graphical user interface. The MATRA code was approved by the Korean regulatory body for design calculations of the integral-type PWR named SMART. The major role of the subchannel code is to evaluate the core thermal margin through hot channel analysis and uncertainty evaluation for CHF predictions. In addition, it is potentially used for the best estimation of the core thermal hydraulic field by incorporation into multiphysics and/or multi-scale code systems. In this study we examined a validation process for the subchannel code MATRA, specifically in the prediction of subchannel void distributions. The primary objective of validation is to estimate a range within which the simulation modeling error lies. The experimental data for subchannel void distributions at steady state and transient conditions were provided within the framework of the OECD/NEA UAM benchmark program. The validation uncertainty of the MATRA code was evaluated for a specific experimental condition by comparing the simulation result and experimental data. A validation process should be preceded by code and solution verification. However, quantification of verification uncertainty was not addressed in this study. The validation uncertainty of the MATRA code for predicting subchannel void distribution was evaluated for a single data point of void fraction measurement at a 5x5 PWR test bundle within the framework of the OECD UAM benchmark program. The validation standard uncertainties were evaluated as 4.2%, 3.9%, and 2.8% with the Monte-Carlo approach at the axial levels of 2216 mm, 2669 mm, and 3177 mm, respectively. The sensitivity coefficient approach revealed similar results of uncertainties but did not account for the nonlinear effects on the

  14. Estimation of uncertainty for fatigue growth rate at cryogenic temperatures

    Science.gov (United States)

    Nyilas, Arman; Weiss, Klaus P.; Urbach, Elisabeth; Marcinek, Dawid J.

    2014-01-01

    Fatigue crack growth rate (FCGR) measurement data for high-strength austenitic alloys in cryogenic environments generally suffer from a high degree of data scatter, in particular in the ΔK regime below 25 MPa√m. Standard mathematical smoothing techniques ultimately force a linear relationship in the stage II regime (crack propagation rate versus ΔK) in a double-log field, called the Paris law. However, the bandwidth of uncertainty then relies somewhat arbitrarily upon the researcher's interpretation. The present paper deals with the use of the uncertainty concept on FCGR data as given by the GUM (Guide to the Expression of Uncertainty in Measurement), which since 1993 has been the recommended procedure to avoid subjective estimation of error bands. Within this context, in the absence of a true value, the best estimate is evaluated by a statistical method that uses the crack propagation law as the mathematical measurement model equation and identifies all input parameters. Each parameter necessary for the measurement technique was processed using the Gaussian distribution law, with partial differentiation of the terms to estimate the sensitivity coefficients. The combined standard uncertainty determined from each term with its computed sensitivity coefficient finally gives the measurement uncertainty of the FCGR test result. The described uncertainty procedure has been applied, within the framework of ITER, to a recent FCGR measurement on high-strength, high-toughness Type 316LN material tested at 7 K using a standard ASTM proportional compact tension specimen. The determined values of the Paris law constants, such as C0 and the exponent m, as best estimates along with their uncertainty values may serve as a realistic basis for the life expectancy of cyclically loaded members.
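
    A minimal Python sketch of GUM-style first-order propagation applied to the Paris law da/dN = C·ΔK^m is given below; the sensitivity coefficients are the partial derivatives with respect to C, m and ΔK, the (often strong) correlation between the fitted C and m is neglected, and the numerical values are illustrative rather than the ITER 316LN results.

        import numpy as np

        def paris_rate_uncertainty(C, u_C, m, u_m, dK, u_dK):
            """First-order (GUM) combined standard uncertainty of the Paris-law crack
            growth rate da/dN = C * dK**m, with C-m correlation neglected."""
            rate = C * dK ** m
            s_C = dK ** m                      # d(rate)/dC
            s_m = rate * np.log(dK)            # d(rate)/dm
            s_dK = C * m * dK ** (m - 1.0)     # d(rate)/d(dK)
            u_rate = np.sqrt((s_C * u_C) ** 2 + (s_m * u_m) ** 2 + (s_dK * u_dK) ** 2)
            return rate, u_rate

        # Illustrative values only (C in mm/cycle units).
        rate, u = paris_rate_uncertainty(C=1.0e-9, u_C=2.0e-10, m=3.0, u_m=0.1,
                                         dK=30.0, u_dK=0.6)
        print(f"da/dN = {rate:.3e} +/- {u:.3e} mm/cycle at dK = 30 MPa*sqrt(m)")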

  15. 'spup' - an R package for uncertainty propagation in spatial environmental modelling

    Science.gov (United States)

    Sawicka, Kasia; Heuvelink, Gerard

    2016-04-01

    Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Currently, advances in uncertainty propagation and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition as universally applicable, including for case studies with spatial models and spatial model inputs. Due to the growing popularity and applicability of the open source R programming language, we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining uncertainty propagation from input data and model parameters, via the environmental model, onto model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo (MC) techniques, as well as several uncertainty visualization functions. Uncertain environmental variables are represented in the package as objects whose attribute values may be uncertain and described by probability distributions. Both numerical and categorical data types are handled. Spatial auto-correlation within an attribute and cross-correlation between attributes are also accommodated. For uncertainty propagation the package implements the MC approach with efficient sampling algorithms, i.e. stratified random sampling and Latin hypercube sampling. The design includes facilitation of parallel computing to speed up MC computation. The MC realizations may be used as input to environmental models called from R, or externally. Selected static and interactive visualization methods that are understandable by non-experts with limited background in
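
    The general Monte Carlo workflow that such a package supports can be sketched as follows, here using Latin hypercube sampling from SciPy; the model and input distributions are hypothetical, and this is not the 'spup' API.

    # Generic uncertainty-propagation sketch: Latin hypercube sampling of two
    # uncertain inputs, a toy model run per realization, and output summaries.
    import numpy as np
    from scipy.stats import norm, qmc

    sampler = qmc.LatinHypercube(d=2, seed=1)
    u = sampler.random(n=500)                          # stratified samples in [0, 1)^2

    rainfall = norm(loc=50.0, scale=5.0).ppf(u[:, 0])  # mm, assumed Gaussian input
    runoff_coeff = 0.2 + 0.2 * u[:, 1]                 # assumed Uniform(0.2, 0.4)

    runoff = rainfall * runoff_coeff                   # hypothetical environmental model
    print("mean =", runoff.mean(), " 90% interval =", np.percentile(runoff, [5, 95]))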

  16. An Uncertainty-Aware Approach for Exploratory Microblog Retrieval.

    Science.gov (United States)

    Liu, Mengchen; Liu, Shixia; Zhu, Xizhou; Liao, Qinying; Wei, Furu; Pan, Shimei

    2016-01-01

    Although there has been a great deal of interest in analyzing customer opinions and breaking news in microblogs, progress has been hampered by the lack of an effective mechanism to discover and retrieve data of interest from microblogs. To address this problem, we have developed an uncertainty-aware visual analytics approach to retrieve salient posts, users, and hashtags. We extend an existing ranking technique to compute a multifaceted retrieval result: the mutual reinforcement rank of a graph node, the uncertainty of each rank, and the propagation of uncertainty among different graph nodes. To illustrate the three facets, we have also designed a composite visualization with three visual components: a graph visualization, an uncertainty glyph, and a flow map. The graph visualization with glyphs, the flow map, and the uncertainty analysis together enable analysts to effectively find the most uncertain results and interactively refine them. We have applied our approach to several Twitter datasets. Qualitative evaluation and two real-world case studies demonstrate the promise of our approach for retrieving high-quality microblog data. PMID:26529705

  17. NASA Team 2 Sea Ice Concentration Algorithm Retrieval Uncertainty

    Science.gov (United States)

    Brucker, Ludovic; Cavalieri, Donald J.; Markus, Thorsten; Ivanoff, Alvaro

    2014-01-01

    Satellite microwave radiometers are widely used to estimate sea ice cover properties (concentration, extent, and area) through the use of sea ice concentration (IC) algorithms. Rare are the algorithms providing associated IC uncertainty estimates. Algorithm uncertainty estimates are needed to assess accurately global and regional trends in IC (and thus extent and area), and to improve sea ice predictions on seasonal to interannual timescales using data assimilation approaches. This paper presents a method to provide relative IC uncertainty estimates using the enhanced NASA Team (NT2) IC algorithm. The proposed approach takes advantage of the NT2 calculations and solely relies on the brightness temperatures (TBs) used as input. NT2 IC and its associated relative uncertainty are obtained for both the Northern and Southern Hemispheres using the Advanced Microwave Scanning Radiometer for the Earth Observing System (AMSR-E) TB. NT2 IC relative uncertainties estimated on a footprint-by-footprint swath-by-swath basis were averaged daily over each 12.5-km grid cell of the polar stereographic grid. For both hemispheres and throughout the year, the NT2 relative uncertainty is less than 5%. In the Southern Hemisphere, it is low in the interior ice pack, and it increases in the marginal ice zone up to 5%. In the Northern Hemisphere, areas with high uncertainties are also found in the high IC area of the Central Arctic. Retrieval uncertainties are greater in areas corresponding to NT2 ice types associated with deep snow and new ice. Seasonal variations in uncertainty show larger values in summer as a result of melt conditions and greater atmospheric contributions. Our analysis also includes an evaluation of the NT2 algorithm sensitivity to AMSR-E sensor noise. There is a 60% probability that the IC does not change (to within the computed retrieval precision of 1%) due to sensor noise, and the cumulated probability shows that there is a 90% chance that the IC varies by less than

  18. Adjoint-Based Uncertainty Quantification with MCNP

    Energy Technology Data Exchange (ETDEWEB)

    Seifried, Jeffrey E. [Univ. of California, Berkeley, CA (United States)

    2011-09-01

    This work serves to quantify the instantaneous uncertainties in neutron transport simulations born from nuclear data and statistical counting uncertainties. Perturbation and adjoint theories are used to derive implicit sensitivity expressions. These expressions are transformed into forms that are convenient for construction with MCNP6, creating the ability to perform adjoint-based uncertainty quantification with MCNP6. These new tools are exercised on the depleted-uranium hybrid LIFE blanket, quantifying its sensitivities and uncertainties to important figures of merit. Overall, these uncertainty estimates are small (< 2%). Having quantified the sensitivities and uncertainties, physical understanding of the system is gained and some confidence in the simulation is acquired.
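
    The first-order propagation that underlies this kind of adjoint-based uncertainty quantification is the standard "sandwich" rule, written here generically rather than in the specific form derived in the report:

    \[
    u^{2}(R) \;\approx\; \mathbf{S}^{\mathsf{T}}\,\mathbf{C}_{\sigma}\,\mathbf{S},
    \qquad
    S_{i} = \frac{\partial R}{\partial \sigma_{i}},
    \]

    where R is the figure of merit (e.g. the multiplication factor), the S_i are its sensitivities to the nuclear data σ_i obtained from perturbation/adjoint theory, and C_σ is the nuclear-data covariance matrix.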

  19. Roughness coefficient and its uncertainty in gravel-bed river

    Institute of Scientific and Technical Information of China (English)

    Ji-Sung KIM; Chan-Joo LEE; Won KIM; Yong-Jeon KIM

    2010-01-01

    Manning's roughness coefficient was estimated for a gravel-bed river reach using field measurements of water level and discharge, and the applicability of various methods used for estimation of the roughness coefficient was evaluated. Results show that the roughness coefficient tends to decrease with increasing discharge and water depth, and over a certain range it appears to remain constant. Comparison of roughness coefficients calculated by field measurement data with those estimated by other methods shows that, although the field-measured values provide approximate roughness coefficients for relatively large discharge, there seems to be rather high uncertainty due to the difference in resultant values. For this reason, uncertainty related to the roughness coefficient was analyzed in terms of change in computed variables. On average, a 20% increase of the roughness coefficient causes a 7% increase in the water depth and an 8% decrease in velocity, but there may be about a 15% increase in the water depth and an equivalent decrease in velocity for certain cross-sections in the study reach. Finally, the validity of estimated roughness coefficient based on field measurements was examined. A 10% error in discharge measurement may lead to more than 10% uncertainty in roughness coefficient estimation, but corresponding uncertainty in computed water depth and velocity is reduced to approximately 5%. Conversely, the necessity for roughness coefficient estimation by field measurement is confirmed.
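
    The sensitivity discussed above can be illustrated with an idealized calculation that assumes a wide rectangular channel (hydraulic radius approximated by flow depth) and Manning's equation at fixed discharge; the numbers are illustrative and are not those of the study reach.

    # Idealized propagation of a roughness-coefficient error to depth and velocity.
    def depth_and_velocity(n, q=5.0, slope=0.001):
        # q: discharge per unit width (m^2/s); from q = (1/n) * h**(5/3) * S**0.5
        h = (n * q / slope ** 0.5) ** 0.6   # solve Manning's equation for depth
        v = q / h                           # continuity for a unit-width channel
        return h, v

    h0, v0 = depth_and_velocity(0.035)          # nominal Manning's n
    h1, v1 = depth_and_velocity(0.035 * 1.20)   # 20% larger roughness coefficient
    print(f"depth change: {100 * (h1 / h0 - 1):+.1f}%, "
          f"velocity change: {100 * (v1 / v0 - 1):+.1f}%")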

  20. Automated Generation of Tabular Equations of State with Uncertainty Information

    Science.gov (United States)

    Carpenter, John H.; Robinson, Allen C.; Debusschere, Bert J.; Mattsson, Ann E.

    2015-06-01

    As computational science pushes toward higher fidelity prediction, understanding the uncertainty associated with closure models, such as the equation of state (EOS), has become a key focus. Traditional EOS development often involves a fair amount of art, where expert modelers may appear as magicians, providing what is felt to be the closest possible representation of the truth. Automation of the development process gives a means by which one may demystify the art of EOS, while simultaneously obtaining uncertainty information in a manner that is both quantifiable and reproducible. We describe our progress on the implementation of such a system to provide tabular EOS tables with uncertainty information to hydrocodes. Key challenges include encoding the artistic expert opinion into an algorithmic form and preserving the analytic models and uncertainty information in a manner that is both accurate and computationally efficient. Results are demonstrated on a multi-phase aluminum model. *Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.

  1. Measurement uncertainty from In-Situ gamma spectroscopy of nonhomogeneous containers and from Laboratory Assay

    International Nuclear Information System (INIS)

    During a D&D or ER process, containers of radioactive waste are normally generated. The activity can commonly be determined by gamma spectroscopy, but frequently the measurement conditions are not conducive to precise sample-detector geometries, and usually the radioactive material is not in a homogeneous distribution. What is the best method to accurately assay these containers - sampling followed by laboratory analysis, or in-situ spectroscopy? What is the uncertainty of the final result? To help answer these questions, the Canberra tool ISOCS Uncertainty Estimator [IUE] was used to mathematically simulate and evaluate several different measurement scenarios and to estimate the uncertainty of the measurement and the sampling process. Several representative containers and source distributions were mathematically defined and evaluated to determine the in-situ measurement uncertainty due to the sample non-uniformity. In the first example, a typical field situation requiring the measurement of 200-liter drums was evaluated; a sensitivity analysis was done to show which parameters contributed the most to the uncertainty, and then an efficiency uncertainty calculation was performed. In the second example, a group of 200-liter drums with various types of non-homogeneous distributions was created, and then measurements were simulated with different detector arrangements to see how the uncertainty varied. In the third example, a truck filled with non-uniform soil was first measured with multiple in-situ detectors to determine the measurement uncertainty; composite samples were then extracted and the sampling uncertainty computed for comparison with the field measurement uncertainty. (authors)

  2. Visualizing Summary Statistics and Uncertainty

    KAUST Repository

    Potter, K.

    2010-08-12

    The graphical depiction of uncertainty information is emerging as a problem of great importance. Scientific data sets are not considered complete without indications of error, accuracy, or levels of confidence. The visual portrayal of this information is a challenging task. This work takes inspiration from graphical data analysis to create visual representations that show not only the data value, but also important characteristics of the data including uncertainty. The canonical box plot is reexamined and a new hybrid summary plot is presented that incorporates a collection of descriptive statistics to highlight salient features of the data. Additionally, we present an extension of the summary plot to two dimensional distributions. Finally, a use-case of these new plots is presented, demonstrating their ability to present high-level overviews as well as detailed insight into the salient features of the underlying data distribution. © 2010 The Eurographics Association and Blackwell Publishing Ltd.

  3. Uncertainty analysis of dose estimation

    International Nuclear Information System (INIS)

    This article addresses the quantification of the uncertainty associated with the measurement of radiation dose. Before introducing uncertainty quantification, it is helpful to recap a few common terms of radiation dosimetry. We know that when radiation interacts with matter, electrons are removed from atoms through the process called ionization. When the energy of the radiation is not strong enough, electron excitation (jumps from lower shells to upper shells) may occur. The excitation of molecules or the breaking of molecular bonds can also occur, causing damage to living cells. When the energy of radiation is large enough to produce ionizations, the radiation is called ionizing radiation; otherwise, it is called non-ionizing radiation. When dealing with the interaction of gamma and X-rays in air, the term exposure is used. Exposure measures the electric charge (positive or negative) produced by electromagnetic radiation in a unit mass of air, at normal atmospheric conditions

  4. Aspects of complementarity and uncertainty

    Science.gov (United States)

    Vathsan, Radhika; Qureshi, Tabish

    2016-08-01

    The two-slit experiment with quantum particles provides many insights into the behavior of quantum mechanics, including Bohr’s complementarity principle. Here, we analyze Einstein’s recoiling slit version of the experiment and show how the inevitable entanglement between the particle and the recoiling slit as a which-way detector is responsible for complementarity. We derive the Englert-Greenberger-Yasin duality from this entanglement, which can also be thought of as a consequence of sum-uncertainty relations between certain complementary observables of the recoiling slit. Thus, entanglement is an integral part of the which-way detection process, and so is uncertainty, though in a completely different way from that envisaged by Bohr and Einstein.
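
    For reference, the duality relation referred to above can be stated compactly in its standard form (the paper derives it from the particle-slit entanglement):

    \[
    \mathcal{D}^{2} + \mathcal{V}^{2} \;\le\; 1,
    \]

    where \(\mathcal{D}\) is the which-way distinguishability provided by the recoiling slit and \(\mathcal{V}\) is the visibility of the interference fringes, with equality for pure states of the combined system.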

  5. Extended uncertainty from first principles

    Directory of Open Access Journals (Sweden)

    Raimundo N. Costa Filho

    2016-04-01

    Full Text Available A translation operator acting in a space with a diagonal metric is introduced to describe the motion of a particle in a quantum system. We show that the momentum operator and, as a consequence, the uncertainty relation now depend on the metric. It is also shown that, for any metric expanded up to second order, this formalism naturally leads to an extended uncertainty principle (EUP with a minimum momentum dispersion. The Ehrenfest theorem is modified to include an additional term related to a tidal force arriving from the space curvature introduced by the metric. For one-dimensional systems, we show how to map a harmonic potential to an effective potential in Euclidean space using different metrics.

  6. Extended uncertainty from first principles

    Science.gov (United States)

    Costa Filho, Raimundo N.; Braga, João P. M.; Lira, Jorge H. S.; Andrade, José S.

    2016-04-01

    A translation operator acting in a space with a diagonal metric is introduced to describe the motion of a particle in a quantum system. We show that the momentum operator and, as a consequence, the uncertainty relation now depend on the metric. It is also shown that, for any metric expanded up to second order, this formalism naturally leads to an extended uncertainty principle (EUP) with a minimum momentum dispersion. The Ehrenfest theorem is modified to include an additional term related to a tidal force arriving from the space curvature introduced by the metric. For one-dimensional systems, we show how to map a harmonic potential to an effective potential in Euclidean space using different metrics.
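
    A commonly quoted form of such an extended uncertainty principle is shown below; the precise coefficients in the paper depend on the chosen metric, so this is only the generic shape of the relation:

    \[
    \Delta x\,\Delta p \;\ge\; \frac{\hbar}{2}\left(1 + \beta\,(\Delta x)^{2}\right)
    \quad\Longrightarrow\quad
    \Delta p \;\ge\; \hbar\sqrt{\beta},
    \]

    i.e. a positive deformation parameter \(\beta\) implies a minimum momentum dispersion \(\Delta p_{\min} = \hbar\sqrt{\beta}\).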

  7. International Capital Movements Under Uncertainty

    OpenAIRE

    1983-01-01

    In this paper, we analyze the determinants of international movements of physical capital in a model with uncertainty and international trade in goods and securities.In our model, the world allocation of capital is governed, to some extent, by the asset preferences of risk averse consumer-investors. In a one-good variant in the spirit of the MacDougall model, we find that relative factor abundance, relative labor force size and relative production riskiness have separate but interrelated infl...

  8. Quantifying uncertainty from material inhomogeneity.

    Energy Technology Data Exchange (ETDEWEB)

    Battaile, Corbett Chandler; Emery, John M.; Brewer, Luke N.; Boyce, Brad Lee

    2009-09-01

    Most engineering materials are inherently inhomogeneous in their processing, internal structure, properties, and performance. Their properties are therefore statistical rather than deterministic. These inhomogeneities manifest across multiple length and time scales, leading to variabilities, i.e. statistical distributions, that are necessary to accurately describe each stage in the process-structure-properties hierarchy, and are ultimately the primary source of uncertainty in performance of the material and component. When localized events are responsible for component failure, or when component dimensions are on the order of microstructural features, this uncertainty is particularly important. For ultra-high reliability applications, the uncertainty is compounded by a lack of data describing the extremely rare events. Hands-on testing alone cannot supply sufficient data for this purpose. To date, there is no robust or coherent method to quantify this uncertainty so that it can be used in a predictive manner at the component length scale. The research presented in this report begins to address this lack of capability through a systematic study of the effects of microstructure on the strain concentration at a hole. To achieve the strain concentration, small circular holes (approximately 100 µm in diameter) were machined into brass tensile specimens using a femto-second laser. The brass was annealed at 450 °C, 600 °C, and 800 °C to produce three hole-to-grain size ratios of approximately 7, 1, and 1/7. Electron backscatter diffraction experiments were used to guide the construction of digital microstructures for finite element simulations of uniaxial tension. Digital image correlation experiments were used to qualitatively validate the numerical simulations. The simulations were performed iteratively to generate statistics describing the distribution of plastic strain at the hole in varying microstructural environments. In both the experiments and simulations, the

  9. Information, uncertainty and holographic action

    CERN Document Server

    Dikken, Robbert-Jan

    2016-01-01

    In this short note we show through simple derivation the explicit relation between information flow and the theories of the emergence of space-time and gravity, specifically for Newton's second law of motion. Next, in a rather straightforward derivation the Heisenberg uncertainty relation is uncovered from the universal bound on information flow. A relation between the universal bound on information flow and the change in bulk action is also shown to exist.

  10. Strong majorization entropic uncertainty relations

    Energy Technology Data Exchange (ETDEWEB)

    Rudnicki, Lukasz [Freiburg Institute for Advanced Studies, Albert-Ludwigs University of Freiburg, Albertstrasse 19, 79104 Freiburg (Germany); Center for Theoretical Physics, Polish Academy of Sciences, Aleja Lotnikow 32/46, PL-02-668 Warsaw (Poland); Puchala, Zbigniew [Institute of Theoretical and Applied Informatics, Polish Academy of Sciences, Baltycka 5, 44-100 Gliwice (Poland); Institute of Physics, Jagiellonian University, ul Reymonta 4, 30-059 Krakow (Poland); Zyczkowski, Karol [Center for Theoretical Physics, Polish Academy of Sciences, Aleja Lotnikow 32/46, PL-02-668 Warsaw (Poland); Institute of Physics, Jagiellonian University, ul Reymonta 4, 30-059 Krakow (Poland)

    2014-07-01

    We present new entropic uncertainty relations in a finite-dimensional Hilbert space. Using the majorization technique we derive several explicit lower bounds for the sum of two Renyi entropies of the same order. Obtained bounds are expressed in terms of the largest singular values of given unitary matrices. Numerical simulations with random unitary matrices show that our bound is almost always stronger than the well known result of Maassen and Uffink.
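
    The Maassen-Uffink result mentioned above reads, in its standard form (the paper's majorization bounds are different and often stronger):

    \[
    H(A) + H(B) \;\ge\; -2\log c,
    \qquad
    c = \max_{j,k}\bigl|\langle a_{j}\mid b_{k}\rangle\bigr|,
    \]

    where H(A) and H(B) are the Shannon entropies of the outcome distributions in the orthonormal bases \(\{|a_{j}\rangle\}\) and \(\{|b_{k}\rangle\}\), with the logarithm taken in the same base as the entropies.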

  11. Conditional Independence in Uncertainty Theories

    OpenAIRE

    Shenoy, Prakash P.

    2013-01-01

    This paper introduces the notions of independence and conditional independence in valuation-based systems (VBS). VBS is an axiomatic framework capable of representing many different uncertainty calculi. We define independence and conditional independence in terms of factorization of the joint valuation. The definitions of independence and conditional independence in VBS generalize the corresponding definitions in probability theory. Our definitions apply not only to probability theory, but al...

  12. Competitive Capacity Investment under Uncertainty

    OpenAIRE

    Li, Xishu; Zuidwijk, Rob; Koster, René; Dekker, Rommert

    2016-01-01

    We consider a long-term capacity investment problem in a competitive market under demand uncertainty. Two firms move sequentially in the competition and a firm's capacity decision interacts with the other firm's current and future capacity. Throughout the investment race, a firm can either choose to plan its investments proactively, taking into account possible responses from the other firm, or decide to respond reactively to the competition. In both cases, the optimal decision at...

  13. CRISIS FOCUS Uncertainty and Flexibility

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    As the world continued to watch the unfolding financial and economic crises this month, Robert Zoellick, President of the World Bank, arrived in China for discussions on how the country can help support the global economy and the efforts it has taken to strengthen its own recovery. Zoellick, who had seen many uncertainties in 2009, called for China to be flexible with its macroeconomic policy. He made the following comments at a press conference in Beijing on December 15. Edited excerpts follow:

  14. Modelling of Transport Projects Uncertainties

    DEFF Research Database (Denmark)

    Salling, Kim Bang; Leleur, Steen

    2012-01-01

    This paper proposes a new way of handling the uncertainties present in transport decision making based on infrastructure appraisals. The paper suggests combining the principle of Optimism Bias, which depicts the historical tendency of overestimating transport-related benefits and underestimating ......-based graphs which function as risk-related decision support for the appraised transport infrastructure project. The presentation of RSF is demonstrated by using an appraisal case concerning a new airfield in the capital of Greenland, Nuuk.

  15. Blade tip timing (BTT) uncertainties

    Science.gov (United States)

    Russhard, Pete

    2016-06-01

    Blade Tip Timing (BTT) is an alternative technique for characterising blade vibration in which non-contact timing probes (e.g. capacitance or optical probes), typically mounted on the engine casing (figure 1), are used to measure the time at which a blade passes each probe. This time is compared with the time at which the blade would have passed the probe if it had been undergoing no vibration. For a number of years the aerospace industry has been sponsoring research into Blade Tip Timing technologies that have been developed as tools to obtain rotor blade tip deflections. These have been successful in demonstrating the potential of the technology, but have rarely produced quantitative data together with a demonstrated, traceable value for measurement uncertainty. BTT technologies have been developed under a cloak of secrecy by the gas turbine OEMs due to the competitive advantage the technique offered if it could be shown to work. BTT measurements are sensitive to many variables, and there is a need to quantify the measurement uncertainty of the complete technology and to define a set of guidelines as to how BTT should be applied to different vehicles. The data shown in figure 2 were developed from a US government sponsored program that brought together four different tip timing systems and a gas turbine engine test. Comparisons showed that the systems were just capable of obtaining measurements within a +/-25% uncertainty band when compared to strain gauges, even when using the same input data sets.

  16. Uncertainty compliant design flood estimation

    Science.gov (United States)

    Botto, A.; Ganora, D.; Laio, F.; Claps, P.

    2014-05-01

    Hydraulic infrastructures are commonly designed with reference to target values of flood peak, estimated using probabilistic techniques, such as flood frequency analysis. The application of these techniques involves levels of uncertainty, which are sometimes quantified but normally not accounted for explicitly in the decision regarding design discharges. The present approach aims at defining a procedure which enables the definition of Uncertainty Compliant Design (UNCODE) values of flood peaks. To pursue this goal, we first demonstrate the equivalence of the Standard design based on the return period and the cost-benefit procedure, when linear cost and damage functions are used. We then use this result to assign an expected cost to estimation errors, thus setting a framework to obtain a design flood estimator which minimizes the total expected cost. This procedure properly accounts for the uncertainty which is inherent in the frequency curve estimation. Applications of the UNCODE procedure to real cases lead to a remarkable displacement of the design flood from the Standard values. UNCODE estimates are systematically larger than the Standard ones, with substantial differences (up to 55%) when large return periods or short data samples are considered.

  17. Characterizing Epistemic Uncertainty for Launch Vehicle Designs

    Science.gov (United States)

    Novack, Steven D.; Rogers, Jim; Hark, Frank; Al Hassan, Mohammad

    2016-01-01

    NASA Probabilistic Risk Assessment (PRA) has the task of estimating the aleatory (randomness) and epistemic (lack of knowledge) uncertainty of launch vehicle loss of mission and crew risk and communicating the results. Launch vehicles are complex engineered systems designed with sophisticated subsystems that are built to work together to accomplish mission success. Some of these systems or subsystems are in the form of heritage equipment, while some have never been previously launched. For these cases, characterizing the epistemic uncertainty is of foremost importance, and it is anticipated that the epistemic uncertainty would be greater for a modified launch vehicle design than for a design built from well-understood heritage equipment. For reasons that will be discussed, standard uncertainty propagation methods using Monte Carlo simulation produce counterintuitive results and significantly underestimate epistemic uncertainty for launch vehicle models. Furthermore, standard PRA methods such as Uncertainty-Importance analyses used to identify components that are significant contributors to uncertainty are rendered obsolete, since sensitivities to uncertainty changes are not reflected in propagation of uncertainty using Monte Carlo methods. This paper explains the basis of this uncertainty underestimation for complex systems and especially, due to nuances of launch vehicle logic, for launch vehicles. It then suggests several alternative methods for estimating uncertainty and provides examples of estimation results. Lastly, the paper shows how to implement an Uncertainty-Importance analysis using one alternative approach, describes the results, and suggests ways to reduce epistemic uncertainty by focusing on additional data or testing of selected components.

  18. Probabilistic Load Flow Considering Wind Generation Uncertainty

    Directory of Open Access Journals (Sweden)

    R. Ramezani

    2011-10-01

    Full Text Available Renewable energy sources, such as wind, solar and hydro, are increasingly incorporated into power grids, as a direct consequence of energy and environmental issues. These types of energy are variable and intermittent by nature, and their exploitation introduces uncertainties into the power grid. Therefore, probabilistic analysis of the system performance is of significant interest. This paper describes a new approach to Probabilistic Load Flow (PLF) by modifying the Two Point Estimation Method (2PEM) to cover some drawbacks of other currently used methods. The proposed method is examined using two case studies, the IEEE 9-bus and the IEEE 57-bus test systems. In order to justify the effectiveness of the method, a numerical comparison with the Monte Carlo Simulation (MCS) method is presented. Simulation results indicate that the proposed method significantly reduces the computational burden while maintaining a high level of accuracy. Moreover, the unsymmetrical 2PEM has a higher level of accuracy than the symmetrical 2PEM with equal computing burden when the Probability Density Function (PDF) of the uncertain variables is asymmetric.
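
    The deterministic sampling idea behind the method can be sketched for the symmetric case (Hong's 2n-point scheme for inputs with zero skewness); the "model" below is a toy nonlinear function, not a load-flow solver, and the numbers are illustrative.

    # Symmetric two-point estimate method: each uncertain input is concentrated at
    # mu +/- sqrt(n)*sigma (others held at their means), with weights 1/(2n).
    import numpy as np

    def model(x):
        # Toy output, e.g. a line loading as a nonlinear function of two injections.
        return 1.5 * x[0] ** 2 + 0.5 * x[0] * x[1]

    mu = np.array([1.0, 2.0])
    sigma = np.array([0.1, 0.3])
    n = len(mu)

    m1, m2 = 0.0, 0.0                       # running estimates of E[Y] and E[Y^2]
    for k in range(n):
        for s in (+1.0, -1.0):
            x = mu.copy()
            x[k] = mu[k] + s * np.sqrt(n) * sigma[k]
            y = model(x)
            m1 += y / (2 * n)
            m2 += y ** 2 / (2 * n)

    print("E[Y] ~", m1, " std[Y] ~", np.sqrt(m2 - m1 ** 2))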

  19. Improved Approximations for Multiprocessor Scheduling Under Uncertainty

    CERN Document Server

    Crutchfield, Christopher; Fineman, Jeremy T; Karger, David R; Scott, Jacob

    2008-01-01

    This paper presents improved approximation algorithms for the problem of multiprocessor scheduling under uncertainty, or SUU, in which the execution of each job may fail probabilistically. This problem is motivated by the increasing use of distributed computing to handle large, computationally intensive tasks. In the SUU problem we are given n unit-length jobs and m machines, a directed acyclic graph G of precedence constraints among jobs, and unrelated failure probabilities q_{ij} for each job j when executed on machine i for a single timestep. Our goal is to find a schedule that minimizes the expected makespan, which is the expected time at which all jobs complete. Lin and Rajaraman gave the first approximations for this NP-hard problem for the special cases of independent jobs, precedence constraints forming disjoint chains, and precedence constraints forming trees. In this paper, we present asymptotically better approximation algorithms. In particular, we give an O(loglog min(m,n))-approximation for indep...

  20. Cost Estimating Risk and Cost Estimating Uncertainty Guidelines

    OpenAIRE

    Anderson, Timothy P.; Cherwonik, Jeffrey S.

    1997-01-01

    The Memorandum of Agreement signed by the Assistant Secretaries of the Navy for Research, Development, and Acquisition (ASN[RD&A]) and for Financial Management and Comptroller (FM&C) in June 1996 committed the Naval Center for Cost Analysis (NCCA) to improve cost analyses by helping program managers prepare better cost estimates. Recent computing advances make development of meaningful risk and uncertainty analyses easier, and these analyses can help managers do their job be...

  1. Propagation of Uncertainty in Rigid Body Attitude Flows

    OpenAIRE

    Lee, Taeyoung; Chaturvedi, Nalin A.; Sanyal, Amit K.; Leok, Melvin; McClamroch, N. Harris

    2007-01-01

    Motivated by attitude control and attitude estimation problems for a rigid body, computational methods are proposed to propagate uncertainties in the angular velocity and the attitude. The nonlinear attitude flow is determined by Euler-Poincaré equations that describe the rotational dynamics of the rigid body acting under the influence of an attitude-dependent potential and by a reconstruction equation that describes the kinematics expressed in terms of an orthogonal matrix representing the...

  2. Investment Decisions in Small Ethanol Plant under Risk and Uncertainty

    OpenAIRE

    Wamisho, Kassu; Ripplinger, David

    2015-01-01

    This study evaluates optimal investment decision rules for an energy beet ethanol firms to simultaneously exercise the option to invest, mothball, reactivate and exit the ethanol market, considering uncertainty and volatility in the market price of ethanol and irreversible investment. A real options framework is employed to compute the prices of ethanol that trigger entry into and exit from the ethanol market. Results show that hysteresis is found to be significant even with modest volatility...

  3. Model Uncertainty for Bilinear Hysteretic Systems

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Thoft-Christensen, Palle

    1984-01-01

    In structural reliability analysis at least three types of uncertainty must be considered, namely physical uncertainty, statistical uncertainty, and model uncertainty (see e.g. Thoft-Christensen & Baker [1]). The physical uncertainty is usually modelled by a number of basic variables ... The statistical uncertainty, due to lack of information, can e.g. be taken into account by describing the variables by predictive density functions, Veneziano [2]. In general, model uncertainty is the uncertainty connected with mathematical modelling of the physical reality. When structural reliability analysis ... loaded structures, it is often assumed that the loading and the response can be modelled by stationary stochastic processes. Further, it is assumed that the structures can be modelled by non-linear systems showing hysteresis. This non-linear behaviour is essential to the design procedure from an economic...

  4. Stochastic and epistemic uncertainty propagation in LCA

    DEFF Research Database (Denmark)

    Clavreul, Julie; Guyonnet, Dominique; Tonini, Davide;

    2013-01-01

    When performing uncertainty propagation, most LCA practitioners choose to represent uncertainties by single probability distributions and to propagate them using stochastic methods. However, the selection of single probability distributions appears often arbitrary when faced with scarce informati...

  5. Heisenberg's Uncertainty Relations and Quantum Optics

    OpenAIRE

    Agarwal, G. S.

    2002-01-01

    We present a brief review of the impact of the Heisenberg uncertainty relations on quantum optics. In particular we demonstrate how almost all coherent and nonclassical states of quantum optics can be derived from uncertainty relations.

  6. PCT Uncertainty Analysis Using Unscented Transform with Random Orthogonal Matrix

    International Nuclear Information System (INIS)

    Most Best Estimate Plus Uncertainty (BEPU) methods employ nonparametric order statistics through Wilks' formula to quantify uncertainties of best estimate simulations of nuclear power plant (NPP) transients. 95%/95% limits, the 95th percentile at a 95% confidence level, are obtained by randomly sampling all uncertainty contributors through conventional Monte Carlo (MC). Advantages are simple implementation of MC sampling of input probability density functions (pdfs) and limited computational expense of 1st, 2nd, and 3rd order Wilks' formula requiring only 59, 93, or 124 simulations, respectively. A disadvantage of small sample size is large sample to sample variation of statistical estimators. This paper presents a new efficient sampling-based algorithm for accurate estimation of mean and variance of the output parameter pdf. The algorithm combines a deterministic sampling method, the unscented transform (UT), with random sampling through the generation of a random orthogonal matrix (ROM). The UT guarantees the mean, covariance, and 3rd order moments of the multivariate input parameter distributions are exactly preserved by the sampled input points, and the orthogonal transformation of the points by a ROM guarantees the sample errors of all 4th order and higher moments are unbiased. The UT with ROM algorithm is applied to the uncertainty quantification of the peak clad temperature (PCT) during a large break loss-of-coolant accident (LBLOCA) in an OPR1000 NPP to demonstrate the applicability of the new algorithm to BEPU. This paper presented a new algorithm combining the UT with ROM for efficient multivariate parameter sampling that ensures sample input covariance and 3rd order moments are exactly preserved and 4th moment errors are small and unbiased. The advantageous sample properties guarantee higher order accuracy and less statistical variation of mean and
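
    A basic symmetric unscented transform, which reproduces the input mean and covariance exactly with 2n sigma points, can be sketched as below; the peak-clad-temperature response is a toy placeholder, and the random-orthogonal-matrix step of the paper's algorithm is not included.

    # Generic unscented-transform sketch for propagating mean and variance.
    import numpy as np

    def unscented_transform(func, mean, cov):
        n = len(mean)
        L = np.linalg.cholesky(cov)                  # cov = L @ L.T
        sigma_pts = np.vstack([mean + np.sqrt(n) * L[:, i] for i in range(n)] +
                              [mean - np.sqrt(n) * L[:, i] for i in range(n)])
        y = np.array([func(p) for p in sigma_pts])   # model evaluations
        w = 1.0 / (2 * n)                            # equal weights
        y_mean = w * y.sum()
        y_var = w * ((y - y_mean) ** 2).sum()
        return y_mean, y_var

    mean = np.array([500.0, 10.0])       # e.g. core power (MW), inlet subcooling (K)
    cov = np.diag([25.0, 0.5 ** 2])      # assumed input covariance
    pct = lambda x: 600.0 + 0.8 * x[0] + 5.0 * x[1] + 0.001 * x[0] * x[1]  # toy PCT model
    print(unscented_transform(pct, mean, cov))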

  7. Measuring, Estimating, and Deciding under Uncertainty.

    Science.gov (United States)

    Michel, Rolf

    2016-03-01

    The problem of uncertainty as a general consequence of incomplete information and the approach to quantify uncertainty in metrology is addressed. Then, this paper discusses some of the controversial aspects of the statistical foundation of the concepts of uncertainty in measurements. The basics of the ISO Guide to the Expression of Uncertainty in Measurement as well as of characteristic limits according to ISO 11929 are described and the needs for a revision of the latter standard are explained. PMID:26688360

  8. The Black Hole Uncertainty Principle Correspondence

    OpenAIRE

    Carr, B.J.

    2014-01-01

    The Black Hole Uncertainty Principle correspondence proposes a connection between the Uncertainty Principle on microscopic scales and black holes on macroscopic scales. This is manifested in a unified expression for the Compton wavelength and Schwarzschild radius. It is a natural consequence of the Generalized Uncertainty Principle, which suggests corrections to the Uncertainty Principle as the energy increases towards the Planck value. It also entails corrections to the event horizon size as...

  9. Probabilistic evaluation of uncertainties and risks in aerospace components

    Science.gov (United States)

    Shah, A. R.; Shiao, M. C.; Nagpal, V. K.; Chamis, C. C.

    1992-01-01

    This paper summarizes a methodology developed at NASA Lewis Research Center which computationally simulates the structural, material, and load uncertainties associated with Space Shuttle Main Engine (SSME) components. The methodology was applied to evaluate the scatter in static, buckling, dynamic, fatigue, and damage behavior of the SSME turbo pump blade. Also calculated are the probability densities of typical critical blade responses, such as effective stress, natural frequency, damage initiation, most probable damage path, etc. Risk assessments were performed for different failure modes, and the effect of material degradation on the fatigue and damage behaviors of a blade were calculated using a multi-factor interaction equation. Failure probabilities for different fatigue cycles were computed and the uncertainties associated with damage initiation and damage propagation due to different load cycle were quantified. Evaluations on the effects of mistuned blades on a rotor were made; uncertainties in the excitation frequency were found to significantly amplify the blade responses of a mistuned rotor. The effects of the number of blades on a rotor were studied. The autocorrelation function of displacements and the probability density function of the first passage time for deterministic and random barriers for structures subjected to random processes also were computed. A brief discussion was included on the future direction of probabilistic structural analysis.

  10. Probabilistic evaluation of uncertainties and risks in aerospace components

    Science.gov (United States)

    Shah, A. R.; Shiao, M. C.; Nagpal, V. K.; Chamis, C. C.

    1992-03-01

    This paper summarizes a methodology developed at NASA Lewis Research Center which computationally simulates the structural, material, and load uncertainties associated with Space Shuttle Main Engine (SSME) components. The methodology was applied to evaluate the scatter in static, buckling, dynamic, fatigue, and damage behavior of the SSME turbo pump blade. Also calculated are the probability densities of typical critical blade responses, such as effective stress, natural frequency, damage initiation, most probable damage path, etc. Risk assessments were performed for different failure modes, and the effect of material degradation on the fatigue and damage behaviors of a blade were calculated using a multi-factor interaction equation. Failure probabilities for different fatigue cycles were computed and the uncertainties associated with damage initiation and damage propagation due to different load cycle were quantified. Evaluations on the effects of mistuned blades on a rotor were made; uncertainties in the excitation frequency were found to significantly amplify the blade responses of a mistuned rotor. The effects of the number of blades on a rotor were studied. The autocorrelation function of displacements and the probability density function of the first passage time for deterministic and random barriers for structures subjected to random processes also were computed. A brief discussion was included on the future direction of probabilistic structural analysis.

  11. Analysis of uncertainty propagation in nuclear fuel cycle scenarios

    International Nuclear Information System (INIS)

    Nuclear scenario studies model the nuclear fleet over a given period. They enable the comparison of different options for the evolution of the reactor fleet and for the management of future fuel cycle materials, from mining to disposal, based on criteria such as installed capacity per reactor technology and mass inventories and flows in the fuel cycle and in the waste. Uncertainties associated with nuclear data and scenario parameters (fuel, reactor and facility characteristics) propagate along the isotopic chains in depletion calculations and throughout the scenario history, which reduces the precision of the results. The aim of this work is to develop, implement and use a stochastic uncertainty propagation methodology adapted to scenario studies. The chosen method is based on the development of surrogate models of the depletion computation, which reduce the computation time of scenario studies and whose parameters include perturbations of the depletion model, and on the construction of an equivalence model which takes cross-section perturbations into account when computing the fresh fuel enrichment. The uncertainty propagation methodology is then applied to different scenarios of interest, considering different evolution options for the French PWR fleet with SFR deployment. (author)

  12. Errors and Uncertainty in Physics Measurement.

    Science.gov (United States)

    Blasiak, Wladyslaw

    1983-01-01

    Classifies errors as either systematic or blunder and uncertainties as either systematic or random. Discusses use of error/uncertainty analysis in direct/indirect measurement, describing the process of planning experiments to ensure lowest possible uncertainty. Also considers appropriate level of error analysis for high school physics students'…

  13. On Uncertainty of Compton Backscattering Process

    CERN Document Server

    Mo, X H

    2013-01-01

    The uncertainty of the Compton backscattering process is studied by means of analytical formulas, and the specific effects of variant energy spread and energy drift on the systematic uncertainty estimate are studied with a Monte Carlo sampling technique. These quantitative conclusions are especially important for understanding the uncertainty of the beam energy measurement system.

  14. Flood modelling: Parameterisation and inflow uncertainty

    NARCIS (Netherlands)

    Mukolwe, M.M.; Di Baldassarre, G.; Werner, M.; Solomatine, D.P.

    2014-01-01

    This paper presents an analysis of uncertainty in hydraulic modelling of floods, focusing on the inaccuracy caused by inflow errors and parameter uncertainty. In particular, the study develops a method to propagate the uncertainty induced by, firstly, application of a stage–discharge rating curve an

  15. Directional Uncertainty Principle for Quaternion Fourier Transform

    OpenAIRE

    Hitzer, Eckhard

    2013-01-01

    This paper derives a new directional uncertainty principle for quaternion valued functions subject to the quaternion Fourier transformation. This can be generalized to establish directional uncertainty principles in Clifford geometric algebras with quaternion subalgebras. We demonstrate this with the example of a directional spacetime algebra function uncertainty principle related to multivector wave packets.

  16. Tax strategy : How to deal with uncertainty?

    NARCIS (Netherlands)

    Hafkenscheid, R.P.F.M.; Janssen, C.M.L.

    2009-01-01

    The article discusses tax strategy under conditions of uncertainty. It explains the three levels of uncertainty that affect corporate and tax strategies, namely, clear trends, alternate trends and residual uncertainty. The author suggests tax strategies that a company may adopt when dealing with the

  17. Propagation of uncertainty in system parameters of a LWR model by sampling MCNPX calculations - Burnup analysis

    International Nuclear Information System (INIS)

    For all the physical components that comprise a nuclear system there is an uncertainty. Assessing the impact of uncertainties in the simulation of fissionable material systems is essential for best-estimate calculations, which have been replacing conservative model calculations as computational power increases. The propagation of uncertainty in a simulation with a Monte Carlo code by sampling the input parameters is recent because of the huge computational effort required. In this work a sample space of MCNPX calculations was used to propagate the uncertainty. The sample size was optimized using the Wilks formula for the 95th percentile and a two-sided statistical tolerance interval of 95%. The input-parameter uncertainties of the reactor considered included geometry dimensions and densities. The work demonstrated the capability of the sampling-based method for burnup calculations when the sample size is optimized and many parameter uncertainties are investigated together in the same input. In particular, it was shown that during burnup the variance obtained when all parameter uncertainties are considered together is equivalent to the sum of the variances obtained when the parameter uncertainties are sampled separately.
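
    For reference, the nonparametric sample sizes given by Wilks' formula can be reproduced with a short calculation; the one-sided 95%/95% sizes for orders 1-3 are the well-known 59, 93, and 124 runs, and the two-sided first-order interval used in studies like the one above requires 93 runs.

    # Wilks' formula: smallest number of code runs for a 95%/95% tolerance bound.
    from math import comb

    def wilks_one_sided(gamma=0.95, beta=0.95, order=1):
        n = order
        while True:
            n += 1
            # Probability that fewer than 'order' of the n samples exceed the gamma quantile.
            tail = sum(comb(n, j) * gamma ** j * (1 - gamma) ** (n - j)
                       for j in range(n - order + 1, n + 1))
            if 1 - tail >= beta:
                return n

    def wilks_two_sided_first_order(gamma=0.95, beta=0.95):
        n = 1
        while True:
            n += 1
            if 1 - gamma ** n - n * gamma ** (n - 1) * (1 - gamma) >= beta:
                return n

    print([wilks_one_sided(order=k) for k in (1, 2, 3)])  # [59, 93, 124]
    print(wilks_two_sided_first_order())                  # 93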

  18. Uncertainty and validation. Effect of user interpretation on uncertainty estimates

    International Nuclear Information System (INIS)

    Uncertainty in predictions of environmental transfer models arises from, among other sources, the adequacy of the conceptual model, the approximations made in coding the conceptual model, the quality of the input data, the uncertainty in parameter values, and the assumptions made by the user. In recent years efforts to quantify the confidence that can be placed in predictions have been increasing, but have concentrated on a statistical propagation of the influence of parameter uncertainties on the calculational results. The primary objective of this Working Group of BIOMOVS II was to test the user's influence on model predictions on a more systematic basis than had been done before. The main goals were as follows: to compare differences between predictions from different people all using the same model and the same scenario description with the statistical uncertainties calculated by the model; to investigate the main reasons for different interpretations by users; and to create a better awareness of the potential influence of the user on the modeling results. Terrestrial food chain models driven by deposition of radionuclides from the atmosphere were used. Three codes were obtained and run with three scenarios by a maximum of 10 users. A number of conclusions can be drawn, some of which are general and independent of the type of models and processes studied, while others are restricted to the few processes that were addressed directly: For any set of predictions, the variation in best estimates was greater than one order of magnitude. Often the range increased from deposition to pasture to milk, probably due to additional transfer processes. The 95% confidence intervals about the predictions calculated from the parameter distributions prepared by the participants did not always overlap the observations; similarly, sometimes the confidence intervals on the predictions did not overlap. Often the 95% confidence intervals of individual predictions were smaller than the

  19. Measurement uncertainty evaluation of conicity error inspected on CMM

    Science.gov (United States)

    Wang, Dongxia; Song, Aiguo; Wen, Xiulan; Xu, Youxiong; Qiao, Guifang

    2016-01-01

    The cone is widely used in mechanical design for rotation, centering and fixing. Whether the conicity error can be measured and evaluated accurately will directly influence assembly accuracy and working performance. According to the new-generation geometrical product specification (GPS), the error and its measurement uncertainty should be evaluated together. The mathematical model of the minimum zone conicity error is established and an improved immune evolutionary algorithm (IIEA) is proposed to search for the conicity error. In the IIEA, initial antibodies are first generated using quasi-random sequences and two kinds of affinities are calculated. Then antibody clones are generated and self-adaptively mutated so as to maintain diversity; similar antibodies are suppressed and new random antibodies are generated. Because the mathematical model of conicity error is strongly nonlinear and the input quantities are not independent, it is difficult to use the Guide to the Expression of Uncertainty in Measurement (GUM) method to evaluate the measurement uncertainty. An adaptive Monte Carlo method (AMCM) is proposed to estimate the measurement uncertainty, in which the number of Monte Carlo trials is selected adaptively and the quality of the numerical results is directly controlled. The cone parts were machined on a CK6140 lathe and measured on a Miracle NC 454 Coordinate Measuring Machine (CMM). The experimental results confirm that the proposed method not only can search for the approximate solution of the minimum zone conicity error (MZCE) rapidly and precisely, but also can evaluate the measurement uncertainty and give control variables with an expected numerical tolerance. The conicity errors computed by the proposed method are 20%-40% less than those computed by the NC454 CMM software, and the evaluation accuracy improves significantly.

  20. Connectivity graphs of uncertainty regions

    CERN Document Server

    Chambers, Erin; Lenchner, Jonathan; Sember, Jeff; Srinivasan, Venkatesh; Stege, Ulrike; Stolpner, Svetlana; Weibel, Christophe; Whitesides, Sue

    2010-01-01

    We study a generalization of the well known bottleneck spanning tree problem called "Best Case Connectivity with Uncertainty": Given a family of geometric regions, choose one point per region, such that the length of the longest edge in a spanning tree of a disc intersection graph is minimized. We show that this problem is NP-hard even for very simple scenarios such as line segments and squares. We also give exact and approximation algorithms for the case of line segments and unit discs respectively.

  1. Uncertainty assessment using uncalibrated objects:

    DEFF Research Database (Denmark)

    Meneghello, R.; Savio, Enrico; Larsen, Erik;

    This report is made as a part of the project Easytrac, an EU project under the programme: Competitive and Sustainable Growth: Contract No: G6RD-CT-2000-00188, coordinated by UNIMETRIK S.A. (Spain). The project is concerned with low uncertainty calibrations on coordinate measuring machines....... The Centre for Geometrical Metrology (CGM) at the Technical University of Denmark takes care of free form measurements, in collaboration with DIMEG, University of Padova, Italy and Unilab Laboratori Industriali Srl, Italy. The present report describes the calibration of a bevel gear using the method...

  2. Oil, Uncertainty, and Gasoline Prices

    OpenAIRE

    Dongfeng (Karen) Chang; Apostolos Serletis

    2015-01-01

    In this paper we investigate the relationship between crude oil and gasoline prices and also examine the effect of oil price uncertainty on gasoline prices. The empirical model is based on a structural vector autoregression that is modified to accommodate multivariate GARCH-in-Mean errors, as detailed in Elder (2004) and Elder and Serletis (2010). We use monthly data for the United States, over the period from January 1976 to September 2014. We find that there is an asymmetric relationship...

  3. Sibling Dependence, Uncertainty and Education

    DEFF Research Database (Denmark)

    Lilleør, Helene Bie

    in the educational decision, which is consistent with a human capital portfolio theory of risk diversification and which cannot be explained by sibling rivalry over scarce resources for credit-constrained households. The paper thus provides a complementary explanation of why enrolment rates in developing countries...... future agricultural employment. Given this dichotomy, the question is then: does future income uncertainty influence the joint educational choice made by parents on behalf of their children, and is it possible to test this on simple cross-sectional data? I extend a simple human capital portfolio model...

  4. Social preferences and strategic uncertainty

    DEFF Research Database (Denmark)

    Cabrales, Antonio; Miniaci, Raffaele; Piovesan, Marco;

    2010-01-01

    This paper reports a three-phase experiment on a stylized labor market. In the first two phases, agents face simple games, which we use to estimate subjects' social and reciprocity concerns. In the last phase, four principals compete by offering agents a contract from a fixed menu. Then, agents "choose to work" for a principal by selecting one of the available contracts. We find that (i) (heterogeneous) social preferences are significant determinants of choices, (ii) for both principals and agents, strategic uncertainty aversion is a stronger determinant of choices than fairness, and (iii) agents display a marked propensity to work for principals with similar distributional concerns.

  5. Computational Complexity on Signed Numbers

    OpenAIRE

    Jaeger, Stefan

    2011-01-01

    This paper presents a new representation of natural numbers and discusses its consequences for computability and computational complexity. The paper argues that the introduction of the first Peano axiom in the traditional definition of natural numbers is not essential. It claims that natural numbers remain usable in traditional ways without assuming the existence of at least one natural number. However, the uncertainty about the existence of natural numbers translates into every computation a...

  6. Uncertainty of temperature measurement with thermal cameras

    Science.gov (United States)

    Chrzanowski, Krzysztof; Matyszkiel, Robert; Fischer, Joachim; Barela, Jaroslaw

    2001-06-01

    All main international metrological organizations propose a parameter called uncertainty as a measure of the accuracy of measurements. A mathematical model that enables the calculation of the uncertainty of temperature measurement with thermal cameras is presented. The standard uncertainty or the expanded uncertainty of the temperature measurement of the tested object can be calculated when the bounds within which the real object effective emissivity εr, the real effective background temperature Tba(r), and the real effective atmospheric transmittance τa(r) are located can be estimated, and when the intrinsic uncertainty of the thermal camera and the relative spectral sensitivity of the thermal camera are known.
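
    A simplified numerical illustration of such a calculation is sketched below: assumed bounds on emissivity, background temperature, and atmospheric transmittance are propagated by Monte Carlo through a Stefan-Boltzmann (T^4) band approximation. The radiometric model and all numbers are illustrative and are not the model of the paper.

    # Monte Carlo propagation of emissivity, background and transmittance bounds
    # into the object-temperature estimate of a thermal camera (toy band model).
    import numpy as np

    rng = np.random.default_rng(7)
    s = lambda T: T ** 4      # band-integrated radiance proxy (Stefan-Boltzmann form)

    def object_temperature(s_meas, eps, T_bg, tau, T_atm=293.0):
        # Invert s_meas = tau*eps*s(T_ob) + tau*(1 - eps)*s(T_bg) + (1 - tau)*s(T_atm)
        s_ob = (s_meas - tau * (1 - eps) * s(T_bg) - (1 - tau) * s(T_atm)) / (tau * eps)
        return s_ob ** 0.25

    # Synthetic "measured" signal for a 350 K object with nominal parameter values.
    s_meas = 0.98 * 0.90 * s(350.0) + 0.98 * 0.10 * s(293.0) + 0.02 * s(293.0)

    n = 10000
    eps = rng.uniform(0.85, 0.95, n)     # bounds on object effective emissivity
    T_bg = rng.uniform(288.0, 298.0, n)  # bounds on effective background temperature (K)
    tau = rng.uniform(0.96, 1.00, n)     # bounds on atmospheric transmittance

    T_ob = object_temperature(s_meas, eps, T_bg, tau)
    print(f"T_ob = {T_ob.mean():.1f} K, standard uncertainty = {T_ob.std(ddof=1):.2f} K")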

  7. Uncertainty Communication. Issues and good practice

    International Nuclear Information System (INIS)

    In 2003 the Netherlands Environmental Assessment Agency (MNP) published the RIVM/MNP Guidance for Uncertainty Assessment and Communication. The Guidance assists in dealing with uncertainty in environmental assessments. Dealing with uncertainty is essential because assessment results regarding complex environmental issues are of limited value if the uncertainties have not been taken into account adequately. A careful analysis of uncertainties in an environmental assessment is required, but even more important is the effective communication of these uncertainties in the presentation of assessment results. The Guidance yields rich and differentiated insights in uncertainty, but the relevance of this uncertainty information may vary across audiences and uses of assessment results. Therefore, the reporting of uncertainties is one of the six key issues that is addressed in the Guidance. In practice, users of the Guidance felt a need for more practical assistance in the reporting of uncertainty information. This report explores the issue of uncertainty communication in more detail, and contains more detailed guidance on the communication of uncertainty. In order to make this a 'stand alone' document several questions that are mentioned in the detailed Guidance have been repeated here. This document thus has some overlap with the detailed Guidance. Part 1 gives a general introduction to the issue of communicating uncertainty information. It offers guidelines for (fine)tuning the communication to the intended audiences and context of a report, discusses how readers of a report tend to handle uncertainty information, and ends with a list of criteria that uncertainty communication needs to meet to increase its effectiveness. Part 2 helps writers to analyze the context in which communication takes place, and helps to map the audiences, and their information needs. It further helps to reflect upon anticipated uses and possible impacts of the uncertainty information on the

  8. Parameter uncertainty analysis for simulating streamflow in a river catchment of Vietnam

    Directory of Open Access Journals (Sweden)

    Dao Nguyen Khoi

    2015-07-01

    Full Text Available Hydrological models play vital roles in management of water resources. However, the calibration of hydrological models is a major challenge because of the uncertainty involved in the large number of parameters. In this study, four uncertainty analysis methods, including Generalized Likelihood Uncertainty Estimation (GLUE), Parameter Solution (ParaSol), Particle Swarm Optimization (PSO), and Sequential Uncertainty Fitting (SUFI-2), were employed to perform parameter uncertainty analysis of streamflow simulation in the Srepok River Catchment by using the Soil and Water Assessment Tool (SWAT) model. The four methods were compared in terms of the model prediction uncertainty, the model performance, and the computational efficiency. The results showed that the SUFI-2 method has advantages in model calibration and uncertainty analysis: it could be run with the smallest number of simulation runs while still achieving good prediction uncertainty bands and model performance.
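
    As a rough illustration of how one of the listed methods works, the sketch below implements the basic GLUE mechanics (sample parameters, score them with an informal likelihood, keep behavioural sets, form weighted prediction bounds) on a deliberately trivial one-parameter stand-in model; it is not SWAT and not the Srepok catchment setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "hydrological" model: one parameter k maps rainfall to streamflow.
# This stands in for SWAT purely to illustrate the GLUE mechanics.
def toy_model(k, rain):
    return k * rain

rain = rng.gamma(2.0, 5.0, size=200)
q_obs = toy_model(0.6, rain) + rng.normal(0.0, 1.0, size=200)  # synthetic observations

# 1. Sample parameter sets from a prior range.
k_samples = rng.uniform(0.1, 1.5, size=5000)

# 2. Nash-Sutcliffe efficiency as the informal GLUE likelihood.
def nse(sim, obs):
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

scores = np.array([nse(toy_model(k, rain), q_obs) for k in k_samples])

# 3. Keep "behavioural" parameter sets above a likelihood threshold.
behavioural = k_samples[scores > 0.5]
weights = scores[scores > 0.5]
weights = weights / weights.sum()

# 4. Likelihood-weighted prediction bounds for a new rainfall value.
rain_new = 10.0
preds = toy_model(behavioural, rain_new)
order = np.argsort(preds)
cdf = np.cumsum(weights[order])
lo = preds[order][np.searchsorted(cdf, 0.05)]
hi = preds[order][np.searchsorted(cdf, 0.95)]
print(f"{len(behavioural)} behavioural sets, 90% prediction band: [{lo:.2f}, {hi:.2f}]")
```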

  9. Uncertainty and Sensitivity Analysis in Performance Assessment for the Waste Isolation Pilot Plant

    Energy Technology Data Exchange (ETDEWEB)

    Helton, J.C.

    1998-12-17

    The Waste Isolation Pilot Plant (WIPP) is under development by the U.S. Department of Energy (DOE) for the geologic (deep underground) disposal of transuranic (TRU) waste. This development has been supported by a sequence of performance assessments (PAs) carried out by Sandia National Laboratories (SNL) to assess what is known about the WIPP and to provide guidance for future DOE research and development activities. Uncertainty and sensitivity analysis procedures based on Latin hypercube sampling and regression techniques play an important role in these PAs by providing an assessment of the uncertainty in important analysis outcomes and identifying the sources of this uncertainty. Performance assessments for the WIPP are conceptually and computationally interesting due to regulatory requirements to assess and display the effects of both stochastic (i.e., aleatory) and subjective (i.e., epistemic) uncertainty, where stochastic uncertainty arises from the possible disruptions that could occur over the 10,000 yr regulatory period associated with the WIPP and subjective uncertainty arises from an inability to unambiguously characterize the many models and associated parameters required in a PA for the WIPP. The interplay between uncertainty analysis, sensitivity analysis, stochastic uncertainty and subjective uncertainty is discussed and illustrated in the context of a recent PA carried out by SNL to support an application by the DOE to the U.S. Environmental Protection Agency for the certification of the WIPP for the disposal of TRU waste.
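
    The sampling-plus-regression machinery mentioned above can be sketched in a few lines: draw a Latin hypercube sample over the uncertain inputs, evaluate a response, and rank the inputs by standardized rank regression coefficients. The response function and input ranges below are placeholders, not the WIPP performance-assessment models.

```python
import numpy as np
from scipy.stats import qmc, rankdata

rng = np.random.default_rng(1)

# Latin hypercube sample over three uncertain inputs (illustrative ranges).
sampler = qmc.LatinHypercube(d=3, seed=1)
unit = sampler.random(n=200)
lower = np.array([0.1, 10.0, 0.0])
upper = np.array([1.0, 50.0, 5.0])
x = qmc.scale(unit, lower, upper)

# Stand-in response function; a real PA would run the system model here.
y = 3.0 * x[:, 0] ** 2 + 0.2 * x[:, 1] + rng.normal(0, 0.5, size=200)

# Rank-transform the data (as in standardized rank regression) to cope with
# nonlinearity, then fit a linear regression and read off the coefficients.
xr = np.column_stack([rankdata(x[:, j]) for j in range(x.shape[1])])
yr = rankdata(y)
xs = (xr - xr.mean(axis=0)) / xr.std(axis=0)
ys = (yr - yr.mean()) / yr.std()
coef, *_ = np.linalg.lstsq(xs, ys, rcond=None)
for j, c in enumerate(coef):
    print(f"input {j}: standardized rank regression coefficient = {c:+.2f}")
```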

  10. Uncertainty in geological and hydrogeological data

    Directory of Open Access Journals (Sweden)

    B. Nilsson

    2007-09-01

    Full Text Available Uncertainty in conceptual model structure and in environmental data is of essential interest when dealing with uncertainty in water resources management. To make quantification of uncertainty possible is it necessary to identify and characterise the uncertainty in geological and hydrogeological data. This paper discusses a range of available techniques to describe the uncertainty related to geological model structure and scale of support. Literature examples on uncertainty in hydrogeological variables such as saturated hydraulic conductivity, specific yield, specific storage, effective porosity and dispersivity are given. Field data usually have a spatial and temporal scale of support that is different from the one on which numerical models for water resources management operate. Uncertainty in hydrogeological data variables is characterised and assessed within the methodological framework of the HarmoniRiB classification.

  11. Uncertainty in Resistance Models for Steel Members

    Directory of Open Access Journals (Sweden)

    Nadolski Vitali

    2014-12-01

    Full Text Available Resistance of steel structures is primarily dependent on material properties, geometry and uncertainties related to an applied model. While materials and geometry can be relatively well described, the uncertainties in resistance models are not yet well understood. In many cases significant efforts are spent to improve resistance models and reduce the uncertainty associated with outcomes of the model. However, these achievements are then inadequately reflected in the values of partial factors. That is why the present paper clarifies the notion of model uncertainty and its quantification. Initially, a general concept of model uncertainty is proposed. Influences affecting results obtained by tests and models, and influences of actual structural conditions, are overviewed. Statistical characteristics of the uncertainties in resistance of steel members are then provided. Simple engineering formulas, mostly based on the EN 1993-1-1 models, are taken into account. To facilitate practical applications, the partial factors for the model uncertainties are derived using a semiprobabilistic approach.
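
    One common semi-probabilistic recipe for turning model-uncertainty statistics into a partial factor is the design-value method; a minimal sketch follows, with assumed statistics and assumed sensitivity/reliability parameters rather than values taken from the paper.

```python
import math

# One common semi-probabilistic recipe (design-value method) for deriving a
# partial factor for resistance model uncertainty from test-based statistics.
# All numbers below are assumed for illustration, not taken from the paper.
mean_theta = 1.1    # mean of model uncertainty theta = R_test / R_model
V_theta    = 0.10   # coefficient of variation of theta (lognormal assumed)
beta       = 3.8    # target reliability index
alpha_R    = 0.32   # sensitivity factor for a non-dominant resistance variable

# Design value of a lognormal theta (small-V approximation) and the resulting
# partial factor, referred to a nominal model bias of 1.0.
theta_d  = mean_theta * math.exp(-alpha_R * beta * V_theta)
gamma_Rd = 1.0 / theta_d
print(f"design value of theta  : {theta_d:.3f}")
print(f"partial factor gamma_Rd: {gamma_Rd:.3f}")
```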

  12. Uncertainty and Sensitivity Analyses of Duct Propagation Models

    Science.gov (United States)

    Nark, Douglas M.; Watson, Willie R.; Jones, Michael G.

    2008-01-01

    This paper presents results of uncertainty and sensitivity analyses conducted to assess the relative merits of three duct propagation codes. Results from this study are intended to support identification of a "working envelope" within which to use the various approaches underlying these propagation codes. This investigation considers a segmented liner configuration that models the NASA Langley Grazing Incidence Tube, for which a large set of measured data was available. For the uncertainty analysis, the selected input parameters (source sound pressure level, average Mach number, liner impedance, exit impedance, static pressure and static temperature) are randomly varied over a range of values. Uncertainty limits (95% confidence levels) are computed for the predicted values from each code, and are compared with the corresponding 95% confidence intervals in the measured data. Generally, the mean values of the predicted attenuation are observed to track the mean values of the measured attenuation quite well and predicted confidence intervals tend to be larger in the presence of mean flow. A two-level, six factor sensitivity study is also conducted in which the six inputs are varied one at a time to assess their effect on the predicted attenuation. As expected, the results demonstrate the liner resistance and reactance to be the most important input parameters. They also indicate the exit impedance is a significant contributor to uncertainty in the predicted attenuation.
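
    The two-level, one-factor-at-a-time screening described above can be sketched as follows; the input list and the toy attenuation function are assumptions standing in for the duct propagation codes.

```python
# Two-level, one-factor-at-a-time screening: move each input between a low and
# a high level while the others stay at baseline, and record the change in the
# response. The toy attenuation function is a stand-in for a duct propagation code.
def attenuation(p):
    return (30.0 - 8.0 * p["liner resistance"] - 5.0 * p["liner reactance"]
            + 2.0 * p["mach number"] + 0.01 * p["source SPL"]
            + 0.002 * p["static pressure"] + 0.005 * p["static temperature"])

baseline = {"source SPL": 130.0, "mach number": 0.3, "liner resistance": 1.0,
            "liner reactance": 0.5, "static pressure": 101.3,
            "static temperature": 300.0}

effects = {}
for name, value in baseline.items():
    lo = dict(baseline, **{name: 0.9 * value})
    hi = dict(baseline, **{name: 1.1 * value})
    effects[name] = attenuation(hi) - attenuation(lo)

for name, eff in sorted(effects.items(), key=lambda kv: -abs(kv[1])):
    print(f"{name:20s} effect on predicted attenuation: {eff:+.3f} dB")
```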

  13. Uncertainty quantification in virtual surgery predictions for single ventricle palliation

    Science.gov (United States)

    Schiavazzi, Daniele; Marsden, Alison

    2014-11-01

    Hemodynamic results from numerical simulations of physiology in patients are invariably presented as deterministic quantities without assessment of associated confidence. Recent advances in cardiovascular simulation and Uncertainty Analysis can be leveraged to challenge this paradigm and to quantify the variability of output quantities of interest, of paramount importance to complement clinical decision making. Physiological variability and errors are responsible for the uncertainty typically associated with measurements in the clinic; starting from a characterization of these quantities in probability, we present applications in the context of estimating the distributions of lumped parameters in 0D models of single-ventricle circulation. We also present results in virtual Fontan palliation surgery, where the variability of both local and systemic hemodynamic indicators is inferred from the uncertainty in pre-operative clinical measurements. Efficient numerical algorithms are required to mitigate the computational cost of propagating the uncertainty through multiscale coupled 0D-3D models of pulsatile flow at the cavopulmonary connection. This work constitutes a first step towards systematic application of robust numerical simulations to virtual surgery predictions.

  14. Uncertainty quantification for large-scale ocean circulation predictions.

    Energy Technology Data Exchange (ETDEWEB)

    Safta, Cosmin; Debusschere, Bert J.; Najm, Habib N.; Sargsyan, Khachik

    2010-09-01

    Uncertainty quantification in climate models is challenged by the sparsity of the available climate data due to the high computational cost of the model runs. Another feature that prevents classical uncertainty analyses from being easily applicable is the bifurcative behavior in the climate data with respect to certain parameters. A typical example is the Meridional Overturning Circulation in the Atlantic Ocean. The maximum overturning stream function exhibits discontinuity across a curve in the space of two uncertain parameters, namely climate sensitivity and CO2 forcing. We develop a methodology that performs uncertainty quantification in the presence of limited data that have discontinuous character. Our approach is two-fold. First we detect the discontinuity location with Bayesian inference, thus obtaining a probabilistic representation of the discontinuity curve location in the presence of arbitrarily distributed input parameter values. Furthermore, we developed a spectral approach that relies on Polynomial Chaos (PC) expansions on each side of the discontinuity curve, leading to an averaged-PC representation of the forward model that allows efficient uncertainty quantification and propagation. The methodology is tested on synthetic examples of discontinuous data with adjustable sharpness and structure.

  15. Relationships for Cost and Uncertainty of Decision Trees

    KAUST Repository

    Chikalov, Igor

    2013-01-01

    This chapter is devoted to the design of new tools for the study of decision trees. These tools are based on a dynamic programming approach and need the consideration of subtables of the initial decision table, so this approach is applicable only to relatively small decision tables. The considered tools allow us to compute: 1. The minimum cost of an approximate decision tree for a given uncertainty value and a cost function. 2. The minimum number of nodes in an exact decision tree whose depth is at most a given value. For the first tool we considered various cost functions such as: depth and average depth of a decision tree and number of nodes (and number of terminal and nonterminal nodes) of a decision tree. The uncertainty of a decision table is equal to the number of unordered pairs of rows with different decisions. The uncertainty of an approximate decision tree is equal to the maximum uncertainty of a subtable corresponding to a terminal node of the tree. In addition to the algorithms for such tools we also present experimental results applied to various datasets acquired from the UCI ML Repository [4]. © Springer-Verlag Berlin Heidelberg 2013.
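
    The uncertainty measure defined above (the number of unordered pairs of rows with different decisions) is easy to compute directly; a small sketch with a hypothetical decision table:

```python
from itertools import combinations

# Uncertainty of a decision table as described above: the number of unordered
# pairs of rows that carry different decisions. The table here is hypothetical.
rows = [
    # (attribute values ..., decision)
    ((0, 1, 0), "a"),
    ((1, 1, 0), "a"),
    ((1, 0, 1), "b"),
    ((0, 0, 1), "b"),
    ((1, 1, 1), "c"),
]

def uncertainty(table):
    return sum(1 for (_, d1), (_, d2) in combinations(table, 2) if d1 != d2)

print("uncertainty of the table:", uncertainty(rows))   # 8 for this example
```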

  16. Comparison of nuclear data uncertainty propagation methodologies for PWR burn-up simulations

    CERN Document Server

    Diez, Carlos Javier; Hoefer, Axel; Porsch, Dieter; Cabellos, Oscar

    2014-01-01

    Several methodologies using different levels of approximations have been developed for propagating nuclear data uncertainties in nuclear burn-up simulations. Most methods fall into the two broad classes of Monte Carlo approaches, which are exact apart from statistical uncertainties but require additional computation time, and first order perturbation theory approaches, which are efficient for not too large numbers of considered response functions but only applicable for sufficiently small nuclear data uncertainties. Some methods neglect isotopic composition uncertainties induced by the depletion steps of the simulations, others neglect neutron flux uncertainties, and the accuracy of a given approximation is often very hard to quantify. In order to get a better sense of the impact of different approximations, this work aims to compare results obtained based on different approximate methodologies with an exact method, namely the NUDUNA Monte Carlo based approach developed by AREVA GmbH. In addition, the impact ...
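
    The Monte Carlo class of methods referred to above can be sketched as follows: draw correlated nuclear-data perturbations from a covariance matrix, evaluate a response for each draw, and read the uncertainty off the output spread. The covariance matrix and the response function below are assumptions, not NUDUNA or a real depletion calculation.

```python
import numpy as np

rng = np.random.default_rng(2)

# Assumed relative covariance matrix for three nuclear data parameters
# (e.g. cross sections); in a real study this comes from evaluated files.
rel_cov = np.array([[0.0004, 0.0001, 0.0000],
                    [0.0001, 0.0009, 0.0002],
                    [0.0000, 0.0002, 0.0016]])
nominal = np.array([1.0, 1.0, 1.0])      # work with relative perturbation factors

# Stand-in response (e.g. an end-of-cycle nuclide concentration); a real
# calculation would run the depletion code once per sample.
def response(sigma):
    return 100.0 * sigma[0] ** 1.2 * sigma[1] ** -0.5 * np.exp(0.3 * sigma[2])

samples = rng.multivariate_normal(nominal, rel_cov, size=5000)
outputs = np.array([response(s) for s in samples])

print(f"nominal response     : {response(nominal):.2f}")
print(f"Monte Carlo mean     : {outputs.mean():.2f}")
print(f"relative uncertainty : {100 * outputs.std() / outputs.mean():.2f} %")
```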

  17. Localizing the Latent Structure Canonical Uncertainty: Entropy Profiles for Hidden Markov Models

    CERN Document Server

    Durand, Jean-Baptiste

    2012-01-01

    This report addresses state inference for hidden Markov models. These models rely on unobserved states, which often have a meaningful interpretation. This makes it necessary to develop diagnostic tools for quantification of state uncertainty. The entropy of the state sequence that explains an observed sequence for a given hidden Markov chain model can be considered as the canonical measure of state sequence uncertainty. This canonical measure of state sequence uncertainty is not reflected by the classic multivariate state profiles computed by the smoothing algorithm, which summarizes the possible state sequences. Here, we introduce a new type of profiles which have the following properties: (i) these profiles of conditional entropies are a decomposition of the canonical measure of state sequence uncertainty along the sequence and makes it possible to localize this uncertainty, (ii) these profiles are univariate and thus remain easily interpretable on tree structures. We show how to extend the smoothing algori...

  18. Development, qualification and use of a code with the capability of internal assessment of uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Petruzzi, A. [Pennsylvania State Univ., Dept. of Mechanical and Nuclear Engineering, University Park, Pennsylvania (United States)]|[Univ. of Pisa, Dipartimento di Ingegneria Meccanica, Nucleare e della Produzione, Pisa (Italy)]. E-mail: axp46@psu.edu; Giannotti, W.; D' Auria, F. [Univ. of Pisa, Dipartimento di Ingegneria Meccanica, Nucleare e della Produzione, Pisa (Italy)]. E-mail: w.giannotti@ing.unipi.it; dauria@ing.unipi.it; Ivanov, K. [Pennsylvania State Univ., Dept. of Mechanical and Nuclear Engineering, University Park, Pennsylvania (United States)]. E-mail: knil@psu.edu

    2004-07-01

    The best-estimate calculation results from complex system codes are affected by approximations that are unpredictable without the use of computational tools that account for the various sources of uncertainty. The Code with (the capability of) Internal Assessment of Uncertainty (CIAU) has been proposed by the University of Pisa to realize the integration between a qualified best-estimate thermal-hydraulic system code and a suitable uncertainty methodology and to supply proper uncertainty bands each time a nuclear power plant transient scenario is calculated. In the frame of a cooperation with The Pennsylvania State University, the CIAU code has been recently extended to evaluate the uncertainty of coupled 3-D neutronics/thermal-hydraulics calculations. The result of this effort is CIAU-TN. The derivation of the methodology and the adopted qualification processes ('internal' and 'external') are described in the paper. (author)

  19. A Multi-Model Approach for Uncertainty Propagation and Model Calibration in CFD Applications

    CERN Document Server

    Wang, Jian-xun; Xiao, Heng

    2015-01-01

    Proper quantification and propagation of uncertainties in computational simulations are of critical importance. This issue is especially challenging for CFD applications. A particular obstacle for uncertainty quantifications in CFD problems is the large model discrepancies associated with the CFD models used for uncertainty propagation. Neglecting or improperly representing the model discrepancies leads to inaccurate and distorted uncertainty distribution for the Quantities of Interest. High-fidelity models, being accurate yet expensive, can accommodate only a small ensemble of simulations and thus lead to large interpolation errors and/or sampling errors; low-fidelity models can propagate a large ensemble, but can introduce large modeling errors. In this work, we propose a multi-model strategy to account for the influences of model discrepancies in uncertainty propagation and to reduce their impact on the predictions. Specifically, we take advantage of CFD models of multiple fidelities to estimate the model ...

  20. Dosimetric uncertainty in prostate cancer proton radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Lin Liyong; Vargas, Carlos; Hsi Wen; Indelicato, Daniel; Slopsema, Roelf; Li Zuofeng; Yeung, Daniel; Horne, Dave; Palta, Jatinder [University of Florida Proton Therapy Institute, Jacksonville, Florida 32206 (United States)

    2008-11-15

    Purpose: The authors evaluate the uncertainty in proton therapy dose distribution for prostate cancer due to organ displacement, varying penumbra width of proton beams, and the amount of rectal gas inside the rectum. Methods and Materials: Proton beam treatment plans were generated for ten prostate patients with a minimum dose of 74.1 cobalt gray equivalent (CGE) to the planning target volume (PTV) while 95% of the PTV received 78 CGE. Two lateral or lateral oblique proton beams were used for each plan. The authors investigated the uncertainty in dose to the rectal wall (RW) and the bladder wall (BW) due to organ displacement by comparing the dose-volume histograms (DVH) calculated with the original or shifted contours. The variation between DVHs was also evaluated for patients with and without rectal gas in the rectum for five patients who had 16 to 47 cc of visible rectal gas in their planning computed tomography (CT) imaging set. The uncertainty due to the varying penumbra width of the delivered protons for different beam setting options on the proton delivery system was also evaluated. Results: For a 5 mm anterior shift, the relative change in the RW volume receiving 70 CGE dose (V70) was 37.9% (an absolute change of 5.0% against a mean V70 of 13.2%). The relative change in the BW volume receiving 70 CGE dose (V70) was 20.9% (an absolute change of 4.3% against a mean V70 of 20.6%) with a 5 mm inferior shift. A 2 mm penumbra difference in beam setting options on the proton delivery system resulted in relative variations of 6.1% (0.8% absolute change) and 4.4% (0.9% absolute change) in V70 of RW and BW, respectively. The data show that the organ displacements produce absolute DVH changes that generally shift the entire isodose line while maintaining the same shape. The overall shape of the DVH curve for each organ is determined by the penumbra and the distance of the target in beam's eye view (BEV) from the block edge. The beam setting
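
    The relative changes quoted above are simply the absolute V70 changes divided by the corresponding mean V70; a quick check with the numbers reported in the abstract:

```python
# Relative change = absolute change in V70 divided by the mean V70, using the
# values quoted in the abstract above.
cases = [
    ("rectal wall, 5 mm anterior shift",   5.0, 13.2),  # -> ~37.9 %
    ("bladder wall, 5 mm inferior shift",  4.3, 20.6),  # -> ~20.9 %
    ("rectal wall, 2 mm penumbra change",  0.8, 13.2),  # -> ~6.1 %
    ("bladder wall, 2 mm penumbra change", 0.9, 20.6),  # -> ~4.4 %
]
for name, absolute, mean_v70 in cases:
    print(f"{name:38s} relative change = {100 * absolute / mean_v70:.1f} %")
```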

  1. Lithological Uncertainty Expressed by Normalized Compression Distance

    Science.gov (United States)

    Jatnieks, J.; Saks, T.; Delina, A.; Popovs, K.

    2012-04-01

    prediction by partial matching (PPM), used for computing the NCD metric, is highly dependent on context. We assign unique symbols for aggregate lithology types and serialize the borehole logs into text strings, where the string length represents a normalized borehole depth. This encoding ensures that lithology types as well as the depth and sequence of strata are comparable in a form most native to the universal data compression software that calculates the pairwise NCD dissimilarity matrix. The NCD results can be used for generalization of the Quaternary structure using spatial clustering followed by a Voronoi tessellation using boreholes as generator points. After dissolving cluster membership identifiers of the borehole Voronoi polygons in a GIS environment, regions representing similar lithological structure can be visualized. The exact number of regions and their homogeneity depends on parameters of the clustering solution. This study is supported by the European Social Fund project No. 2009/0212/1DP/1.1.1.2.0/09/APIA/VIAA/060 Keywords: geological uncertainty, lithological uncertainty, generalization, information distance, normalized compression distance, data compression
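
    The NCD itself is straightforward to compute with any off-the-shelf compressor; the sketch below uses zlib rather than a PPM compressor and hypothetical serialized borehole strings, so it only illustrates the mechanics of the metric.

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance with a general-purpose compressor."""
    cx, cy = len(zlib.compress(x, 9)), len(zlib.compress(y, 9))
    cxy = len(zlib.compress(x + y, 9))
    return (cxy - min(cx, cy)) / max(cx, cy)

# Hypothetical serialized borehole logs: one symbol per aggregated lithology
# type, string length proportional to normalized depth.
log_a = b"SSSSCCCCGGGGTTTT" * 8     # sand / clay / gravel / till
log_b = b"SSSSCCCGGGGGTTTT" * 8     # a similar sequence
log_c = b"TTTTGGGGCCCCSSSS" * 8     # reversed stratigraphy

for name, pair in [("a-b", (log_a, log_b)), ("a-c", (log_a, log_c))]:
    print(f"NCD({name}) = {ncd(*pair):.3f}")
```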

  2. Evaluating the uncertainty of input quantities in measurement models

    Science.gov (United States)

    Possolo, Antonio; Elster, Clemens

    2014-06-01

    The Guide to the Expression of Uncertainty in Measurement (GUM) gives guidance about how values and uncertainties should be assigned to the input quantities that appear in measurement models. This contribution offers a concrete proposal for how that guidance may be updated in light of the advances in the evaluation and expression of measurement uncertainty that were made in the course of the twenty years that have elapsed since the publication of the GUM, and also considering situations that the GUM does not yet contemplate. Our motivation is the ongoing conversation about a new edition of the GUM. While generally we favour a Bayesian approach to uncertainty evaluation, we also recognize the value that other approaches may bring to the problems considered here, and focus on methods for uncertainty evaluation and propagation that are widely applicable, including to cases that the GUM has not yet addressed. In addition to Bayesian methods, we discuss maximum-likelihood estimation, robust statistical methods, and measurement models where values of nominal properties play the same role that input quantities play in traditional models. We illustrate these general-purpose techniques in concrete examples, employing data sets that are realistic but that also are of conveniently small sizes. The supplementary material available online lists the R computer code that we have used to produce these examples (stacks.iop.org/Met/51/3/339/mmedia). Although we strive to stay close to clause 4 of the GUM, which addresses the evaluation of uncertainty for input quantities, we depart from it as we review the classes of measurement models that we believe are generally useful in contemporary measurement science. We also considerably expand and update the treatment that the GUM gives to Type B evaluations of uncertainty: reviewing the state-of-the-art, disciplined approach to the elicitation of expert knowledge, and its encapsulation in probability distributions that are usable in
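
    The supplementary R code mentioned in the record is not reproduced here, but the Monte Carlo propagation advocated in GUM Supplement 1 can be sketched generically as follows, for a made-up measurement model and assumed state-of-knowledge distributions.

```python
import numpy as np

rng = np.random.default_rng(3)
M = 200_000

# Simple measurement model Y = X1 / (1 + X2) with assumed state-of-knowledge
# distributions for the input quantities (illustrative, not from the GUM):
#   X1 ~ Normal       (Type A evaluation from repeated indications)
#   X2 ~ Rectangular  (Type B evaluation from manufacturer limits)
x1 = rng.normal(10.0, 0.05, size=M)
x2 = rng.uniform(-0.002, 0.002, size=M)
y = x1 / (1.0 + x2)

estimate = y.mean()
u_y = y.std(ddof=1)
lo, hi = np.percentile(y, [2.5, 97.5])   # probabilistically symmetric 95 % interval
print(f"y = {estimate:.4f}, u(y) = {u_y:.4f}, "
      f"95 % coverage interval = [{lo:.4f}, {hi:.4f}]")
```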

  3. Application of perturbation theory methods to nuclear data uncertainty propagation using the collision probability method

    International Nuclear Information System (INIS)

    This thesis presents a comprehensive study of sensitivity/uncertainty analysis for reactor performance parameters (e.g. the k-effective) to the base nuclear data from which they are computed. The analysis starts at the fundamental step, the Evaluated Nuclear Data File and the uncertainties inherently associated with the data they contain, available in the form of variance/covariance matrices. We show that when a methodical and consistent computation of sensitivity is performed, conventional deterministic formalisms can be sufficient to propagate nuclear data uncertainties with the level of accuracy obtained by the most advanced tools, such as state-of-the-art Monte Carlo codes. By applying our developed methodology to three exercises proposed by the OECD (Uncertainty Analysis for Criticality Safety Assessment Benchmarks), we provide insights of the underlying physical phenomena associated with the used formalisms. (author)
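
    The deterministic propagation referred to above typically reduces to the first-order "sandwich rule" u^2(R) = S^T C S, with S the vector of relative sensitivities and C the relative nuclear-data covariance matrix; a minimal sketch with assumed numbers:

```python
import numpy as np

# "Sandwich rule" for first-order uncertainty propagation: the relative
# variance of a response (e.g. k-effective) is S^T C S, where S holds relative
# sensitivities and C the relative covariance matrix of the nuclear data.
# The values below are assumptions for illustration only.
S = np.array([0.9, -0.3, 0.15])                 # relative sensitivities
C = np.array([[4.0e-4, 1.0e-4, 0.0],
              [1.0e-4, 9.0e-4, 2.0e-4],
              [0.0,    2.0e-4, 2.5e-3]])        # relative covariance matrix

rel_var = S @ C @ S
print(f"relative standard uncertainty of the response: "
      f"{100 * np.sqrt(rel_var):.3f} %")
```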

  4. A computer scientist looks at game theory

    OpenAIRE

    Halpern, Joseph Y.

    2002-01-01

    I consider issues in distributed computation that should be of relevance to game theory. In particular, I focus on (a) representing knowledge and uncertainty, (b) dealing with failures, and (c) specification of mechanisms.

  5. The Role of Uncertainty in Climate Science

    Science.gov (United States)

    Oreskes, N.

    2012-12-01

    Scientific discussions of climate change place considerable weight on uncertainty. The research frontier, by definition, rests at the interface between the known and the unknown and our scientific investigations necessarily track this interface. Yet, other areas of active scientific research are not necessarily characterized by a similar focus on uncertainty; previous assessments of science for policy, for example, do not reveal such extensive efforts at uncertainty quantification. Why has uncertainty loomed so large in climate science? This paper argues that the extensive discussions of uncertainty surrounding climate change are at least in part a response to the social and political context of climate change. Skeptics and contrarians focus on uncertainty as a political strategy, emphasizing or exaggerating uncertainties as a means to undermine public concern about climate change and delay policy action. The strategy works in part because it appeals to a certain logic: if our knowledge is uncertain, then it makes sense to do more research. Change, as the tobacco industry famously realized, requires justification; doubt favors the status quo. However, the strategy also works by pulling scientists into an "uncertainty framework," inspiring them to respond to the challenge by addressing and quantifying the uncertainties. The problem is that all science is uncertain—nothing in science is ever proven absolutely, positively—so as soon as one uncertainty is addressed, another can be raised, which is precisely what contrarians have done over the past twenty years.

  6. Uncertainty in gridded CO2 emissions estimates

    Science.gov (United States)

    Hogue, Susannah; Marland, Eric; Andres, Robert J.; Marland, Gregg; Woodard, Dawn

    2016-05-01

    We are interested in the spatial distribution of fossil-fuel-related emissions of CO2 for both geochemical and geopolitical reasons, but it is important to understand the uncertainty that exists in spatially explicit emissions estimates. Working from one of the widely used gridded data sets of CO2 emissions, we examine the elements of uncertainty, focusing on gridded data for the United States at the scale of 1° latitude by 1° longitude. Uncertainty is introduced in the magnitude of total United States emissions, the magnitude and location of large point sources, the magnitude and distribution of non-point sources, and from the use of proxy data to characterize emissions. For the United States, we develop estimates of the contribution of each component of uncertainty. At 1° resolution, in most grid cells, the largest contribution to uncertainty comes from how well the distribution of the proxy (in this case population density) represents the distribution of emissions. In other grid cells, the magnitude and location of large point sources make the major contribution to uncertainty. Uncertainty in population density can be important where a large gradient in population density occurs near a grid cell boundary. Uncertainty is strongly scale-dependent with uncertainty increasing as grid size decreases. Uncertainty for our data set with 1° grid cells for the United States is typically on the order of ±150%, but this is perhaps not excessive in a data set where emissions per grid cell vary over 8 orders of magnitude.

  7. Linear minimax estimation for random vectors with parametric uncertainty

    KAUST Repository

    Bitar, E

    2010-06-01

    In this paper, we take a minimax approach to the problem of computing a worst-case linear mean squared error (MSE) estimate of X given Y , where X and Y are jointly distributed random vectors with parametric uncertainty in their distribution. We consider two uncertainty models, PA and PB. Model PA represents X and Y as jointly Gaussian whose covariance matrix Λ belongs to the convex hull of a set of m known covariance matrices. Model PB characterizes X and Y as jointly distributed according to a Gaussian mixture model with m known zero-mean components, but unknown component weights. We show: (a) the linear minimax estimator computed under model PA is identical to that computed under model PB when the vertices of the uncertain covariance set in PA are the same as the component covariances in model PB, and (b) the problem of computing the linear minimax estimator under either model reduces to a semidefinite program (SDP). We also consider the dynamic situation where x(t) and y(t) evolve according to a discrete-time LTI state space model driven by white noise, the statistics of which is modeled by PA and PB as before. We derive a recursive linear minimax filter for x(t) given y(t).
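
    The building block shared by both uncertainty models is the linear MMSE estimator for a given joint covariance; the sketch below computes that piece for one assumed covariance and leaves out the minimax optimization over the covariance set (which the paper casts as an SDP).

```python
import numpy as np

# Linear MMSE estimate of X from Y for one candidate joint covariance; the
# minimax step would optimize over such covariances (a semidefinite program),
# which is not reproduced here. All numbers are illustrative.
cov = np.array([[2.0, 0.8, 0.5],
                [0.8, 1.5, 0.3],
                [0.5, 0.3, 1.0]])
nx = 1                                   # X is the first component, Y the rest
Lxx = cov[:nx, :nx]
Lxy = cov[:nx, nx:]
Lyy = cov[nx:, nx:]

K = Lxy @ np.linalg.inv(Lyy)             # estimator gain: x_hat = K @ y (zero means)
mse = Lxx - K @ Lxy.T                    # resulting MSE matrix

y_obs = np.array([0.4, -1.2])
print("x_hat =", K @ y_obs)
print("MSE   =", mse.ravel())
```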

  8. Accommodating Uncertainty in ADAS Models and Data

    International Nuclear Information System (INIS)

    The Atomic Data and Analysis Structure (ADAS) is an interconnected set of computer codes and data collections for modelling the radiating properties of ions and atoms in plasmas and for assisting in the analysis and interpretation of spectral measurements. Fundamental atomic data is mediated via collisional-radiative population models to produce the effective coefficients for practical use in impurity transport modelling, influx estimation, beam stopping efficiency and active charge exchange quantification. Away from the coronal picture, the simple connection between fundamental processes and the derived population or emissivity is lost. Instead these coefficients become dependent on many fundamental processes with unknown weightings. The challenge is threefold: to develop methods for assigning an uncertainty to the fundamental data, to propagate these through the population models and to enable techniques for using, within plasma models, atomic data that comes with an accompanying error surface. Pure atomic properties, such as energy levels and some transition probabilities, are measurable and are known to high precision. However most of the fundamental data required for fusion work, such as excitation and state selective charge transfer cross sections, ionisation and recombination rates (for both electron and ion drivers) remain the result of theoretical calculations and this is not likely to change. The envelope of variation between different methods and calculations is the simplest approach to ascribing an error and is useful in identifying the principal contributors to the final quantity of interest and in assessing its domain of influence. A more refined approach, based on variation of the control parameters of the fundamental ab initio codes, is adopted for this smaller set of significant cross sections. The error for the derived quantities is computed via a statistical sampling methodology assuming an error distribution within the ascribed uncertainty

  9. Optimal arbitrage under model uncertainty

    CERN Document Server

    Fernholz, Daniel; 10.1214/10-AAP755

    2012-01-01

    In an equity market model with "Knightian" uncertainty regarding the relative risk and covariance structure of its assets, we characterize in several ways the highest return relative to the market that can be achieved using nonanticipative investment rules over a given time horizon, and under any admissible configuration of model parameters that might materialize. One characterization is in terms of the smallest positive supersolution to a fully nonlinear parabolic partial differential equation of the Hamilton--Jacobi--Bellman type. Under appropriate conditions, this smallest supersolution is the value function of an associated stochastic control problem, namely, the maximal probability with which an auxiliary multidimensional diffusion process, controlled in a manner which affects both its drift and covariance structures, stays in the interior of the positive orthant through the end of the time-horizon. This value function is also characterized in terms of a stochastic game, and can be used to generate an in...

  10. Uncertainty, incompleteness, chance, and design

    CERN Document Server

    Sols, Fernando

    2013-01-01

    The 20th century has revealed two important limitations of scientific knowledge. On the one hand, the combination of Poincaré's nonlinear dynamics and Heisenberg's uncertainty principle leads to a world picture where physical reality is, in many respects, intrinsically undetermined. On the other hand, Gödel's incompleteness theorems reveal the existence of mathematical truths that cannot be demonstrated. More recently, Chaitin has proved that, from the incompleteness theorems, it follows that the random character of a given mathematical sequence cannot be proved in general (it is 'undecidable'). I reflect here on the consequences derived from the indeterminacy of the future and the undecidability of randomness, concluding that the question of the presence or absence of finality in nature is fundamentally outside the scope of the scientific method.

  11. Dopamine, uncertainty and TD learning

    Directory of Open Access Journals (Sweden)

    Duff Michael O

    2005-05-01

    Full Text Available Abstract Substantial evidence suggests that the phasic activities of dopaminergic neurons in the primate midbrain represent a temporal difference (TD) error in predictions of future reward, with increases above and decreases below baseline consequent on positive and negative prediction errors, respectively. However, dopamine cells have very low baseline activity, which implies that the representation of these two sorts of error is asymmetric. We explore the implications of this seemingly innocuous asymmetry for the interpretation of dopaminergic firing patterns in experiments with probabilistic rewards which bring about persistent prediction errors. In particular, we show that when averaging the non-stationary prediction errors across trials, a ramping in the activity of the dopamine neurons should be apparent, whose magnitude is dependent on the learning rate. This exact phenomenon was observed in a recent experiment, though being interpreted there in antipodal terms as a within-trial encoding of uncertainty.
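
    The persistence of prediction errors under probabilistic reward is easy to reproduce with a bare TD(0) learner; the sketch below is deliberately simplified (no asymmetric scaling of negative errors) and only shows that errors do not vanish at asymptote and that the fluctuation of the learned value grows with the learning rate.

```python
import numpy as np

rng = np.random.default_rng(4)

# TD(0) value learning for a single cue followed by a probabilistic reward
# (p = 0.5). Prediction errors persist at asymptote, and the trial-to-trial
# fluctuation of the learned value grows with the learning rate.
def run(alpha, trials=20000, tail=5000):
    v = 0.0
    values, deltas = [], []
    for _ in range(trials):
        r = float(rng.random() < 0.5)
        delta = r - v                 # prediction error at reward time
        v += alpha * delta
        values.append(v)
        deltas.append(delta)
    values, deltas = np.array(values[-tail:]), np.array(deltas[-tail:])
    return values.mean(), values.std(), deltas.mean()

for alpha in (0.05, 0.2, 0.5):
    v_mean, v_std, d_mean = run(alpha)
    print(f"alpha = {alpha:4.2f}: value = {v_mean:.2f} +/- {v_std:.2f}, "
          f"mean prediction error = {d_mean:+.3f}")
```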

  12. Handling uncertainties in SVM classification

    CERN Document Server

    Niaf, Emilie; Lartizien, Carole; Canu, Stéphane

    2011-01-01

    This paper addresses the pattern classification problem arising when available target data include some uncertainty information. Target data considered here are either qualitative (a class label) or quantitative (an estimation of the posterior probability). Our main contribution is an SVM-inspired formulation of this problem that takes class labels into account through a hinge loss and probability estimates through an epsilon-insensitive cost function, together with a minimum norm (maximum margin) objective. This formulation shows a dual form leading to a quadratic problem and allows the use of a representer theorem and associated kernel. The solution provided can be used for both decision and posterior probability estimation. Based on empirical evidence, our method outperforms regular SVM in terms of probability predictions and classification performances.

  13. Bayesian Mars for uncertainty quantification in stochastic transport problems

    International Nuclear Information System (INIS)

    We present a method for estimating solutions to partial differential equations with uncertain parameters using a modification of the Bayesian Multivariate Adaptive Regression Splines (BMARS) emulator. The BMARS algorithm uses Markov chain Monte Carlo (MCMC) to construct a basis function composed of polynomial spline functions, for which derivatives and integrals are straightforward to compute. We use these calculations and a modification of the curve-fitting BMARS algorithm to search for a basis function (response surface) which, in combination with its derivatives/integrals, satisfies a governing differential equation and specified boundary condition. We further show that this fit can be improved by enforcing a conservation or other physics-based constraint. Our results indicate that estimates to solutions of simple first order partial differential equations (without uncertainty) can be efficiently computed with very little regression error. We then extend the method to estimate uncertainties in the solution to a pure absorber transport problem in a medium with uncertain cross-section. We describe and compare two strategies for propagating the uncertain cross-section through the BMARS algorithm; the results from each method are in close comparison with analytic results. We discuss the scalability of the algorithm to parallel architectures and the applicability of the two strategies to larger problems with more degrees of uncertainty. (author)

  14. Assessing and propagating uncertainty in model inputs in corsim

    Energy Technology Data Exchange (ETDEWEB)

    Molina, G.; Bayarri, M. J.; Berger, J. O.

    2001-07-01

    CORSIM is a large simulator for vehicular traffic, and is being studied with respect to its ability to successfully model and predict behavior of traffic in a 36 block section of Chicago. Inputs to the simulator include information about street configuration, driver behavior, traffic light timing, turning probabilities at each corner and distributions of traffic ingress into the system. This work is described in more detail in the article Fast Simulators for Assessment and Propagation of Model Uncertainty also in these proceedings. The focus of this conference poster is on the computational aspects of this problem. In particular, we address the description of the full conditional distributions needed for implementation of the MCMC algorithm and, in particular, how the constraints can be incorporated; details concerning the run time and convergence of the MCMC algorithm; and utilisation of the MCMC output for prediction and uncertainty analysis concerning the CORSIM computer model. As this last is the ultimate goal, it is worth emphasizing that the incorporation of all uncertainty concerning inputs can significantly affect the model predictions. (Author)

  15. Efficient Characterization of Parametric Uncertainty of Complex (Bio)chemical Networks.

    Directory of Open Access Journals (Sweden)

    Claudia Schillings

    2015-08-01

    Full Text Available Parametric uncertainty is a particularly challenging and relevant aspect of systems analysis in domains such as systems biology where, both for inference and for assessing prediction uncertainties, it is essential to characterize the system behavior globally in the parameter space. However, current methods based on local approximations or on Monte-Carlo sampling cope only insufficiently with high-dimensional parameter spaces associated with complex network models. Here, we propose an alternative deterministic methodology that relies on sparse polynomial approximations. We propose a deterministic computational interpolation scheme which identifies the most significant expansion coefficients adaptively. We present its performance in kinetic model equations from computational systems biology with several hundred parameters and state variables, leading to numerical approximations of the parametric solution on the entire parameter space. The scheme is based on adaptive Smolyak interpolation of the parametric solution at judiciously and adaptively chosen points in parameter space. Like Monte-Carlo sampling, it is "non-intrusive" and well-suited for massively parallel implementation, but affords higher convergence rates. This opens up new avenues for large-scale dynamic network analysis by enabling scaling for many applications, including parameter estimation, uncertainty quantification, and systems design.

  16. Constrained quantities in uncertainty quantification. Ambiguity and tips to follow

    International Nuclear Information System (INIS)

    The nuclear community relies heavily on computer codes and numerical tools. The results of such computations can only be trusted if they are augmented by proper sensitivity and uncertainty (S and U) studies. This paper presents some aspects of S and U analysis when constrained quantities are involved, such as the fission spectrum or the isotopic distribution of elements. A consistent theory is given for the derivation and interpretation of constrained sensitivities as well as the corresponding covariance matrix normalization procedures. It is shown that if the covariance matrix violates the “generic zero column and row sum” condition, normalizing it is equivalent to constraining the sensitivities, but since both can be done in many ways, different sensitivity coefficients and uncertainties can be derived. This makes results ambiguous, underlining the need for proper covariance data. It is also highlighted that the use of constrained sensitivity coefficients derived with a constraining procedure that is not idempotent can lead to biased results in uncertainty propagation. The presented theory is demonstrated on an analytical case and a numerical example involving the fission spectrum, both confirming the main conclusions of this research. (author)
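
    One idempotent constraining procedure that enforces a zero column and row sum on a covariance matrix is projection with P = I - (1/n)·11^T; the sketch below demonstrates it on a random matrix. This is only one possible normalization, assumed here for illustration; the appropriate procedure depends on how the constraint on the underlying quantity is defined.

```python
import numpy as np

# One idempotent way to impose the "zero column and row sum" condition on a
# covariance matrix of a normalized quantity: project with P = I - (1/n) 11^T.
# This is one possible normalization (assumed here for illustration); the
# paper's preferred constraining procedure may differ.
n = 4
rng = np.random.default_rng(5)
A = rng.normal(size=(n, n))
C = A @ A.T                                    # some unconstrained covariance

P = np.eye(n) - np.ones((n, n)) / n            # idempotent: P @ P == P
C_constrained = P @ C @ P.T

print("row sums before :", np.round(C.sum(axis=1), 3))
print("row sums after  :", np.round(C_constrained.sum(axis=1), 3))
print("P idempotent    :", np.allclose(P @ P, P))
```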

  17. Optimisation of decision making under uncertainty throughout field lifetime: A fractured reservoir example

    Science.gov (United States)

    Arnold, Dan; Demyanov, Vasily; Christie, Mike; Bakay, Alexander; Gopa, Konstantin

    2016-10-01

    Assessing the change in uncertainty in reservoir production forecasts over field lifetime is rarely undertaken because of the complexity of joining together the individual workflows. This becomes particularly important in complex fields such as naturally fractured reservoirs. The impact of this problem has been identified in previous studies, and many solutions have been proposed but never implemented on complex reservoir problems due to the computational cost of quantifying uncertainty and optimising the reservoir development, specifically knowing how many and what kind of simulations to run. This paper demonstrates a workflow that propagates uncertainty throughout field lifetime, and into the decision making process by a combination of a metric-based approach, multi-objective optimisation and Bayesian estimation of uncertainty. The workflow propagates uncertainty estimates from appraisal into initial development optimisation, then updates uncertainty through history matching and finally propagates it into late-life optimisation. The combination of techniques applied, namely the metric approach and multi-objective optimisation, helps evaluate development options under uncertainty. This was achieved with a significantly reduced number of flow simulations, such that the combined workflow is computationally feasible to run for a real-field problem. This workflow is applied to two synthetic naturally fractured reservoir (NFR) case studies in appraisal, field development, history matching and mid-life EOR stages. The first is a simple sector model, while the second is a more complex full field example based on a real life analogue. This study infers geological uncertainty from an ensemble of models based on the Brazilian carbonate outcrop, which is propagated through the field lifetime, before and after the start of production, with the inclusion of production data significantly collapsing the spread of P10-P90 in reservoir forecasts. The workflow links uncertainty

  18. LILW repository in Slovenia: uncertainty treatment in performance assessment of Slovenian generic safety case

    International Nuclear Information System (INIS)

    The Slovenian Agency for Radwaste Management has founded and developed a PA team, trained in part through IAEA scholarships and other support, to perform performance assessment calculations for the Slovene LILW repository safety case. The team has gone through several steps of the performance assessment procedure, starting with scenario generation and ending with radionuclide transport calculations for each of the selected scenarios. To do this, two Slovene sites have been chosen and preliminary designs of a near surface and an underground repository have been made. In 2004, the team was asked to concentrate on the part of the confidence building process concerning the treatment of uncertainty for the post closure assessment. The team has thus treated all kinds of uncertainty in PA/SA: scenario uncertainty, model (conceptual model, mathematical model, computer code) uncertainty, data/parameter uncertainty and subjective uncertainty, each of these groups in a specific way, recognised in international practice. We used deterministic calculations to address scenario and model uncertainties. The scenarios calculated were: a normal evolution scenario with progressive engineered barrier system degradation, a cap failure scenario, instant failure of all engineered barriers, a climate-change scenario, and an inadvertent human intrusion scenario - all of them assuming different paths through the biosphere. Model uncertainties have also been addressed by deterministic calculations: calculations of different conceptual models and comparison of results, combining numerical solutions with theoretical ones and using different computer codes for the same conceptual models. Data/parameter uncertainties have been treated through sensitivity analysis (single parameter variation - deterministic) and multi-parameter variation (stochastic calculations), using all available site-specific data. Subjective uncertainty we have addressed through thorough international

  19. Attitudes, beliefs, uncertainty and risk

    Energy Technology Data Exchange (ETDEWEB)

    Greenhalgh, Geoffrey [Down Park Place, Crawley Down (United Kingdom)

    2001-07-01

    There is now unmistakable evidence of a widening split within the Western industrial nations arising from conflicting views of society; for and against change. The argument is over the benefits of 'progress' and growth. On one side are those who seek more jobs, more production and consumption, higher standards of living, an ever-increasing GNP with an increasing globalisation of production and welcome the advances of science and technology confident that any temporary problems that arise can be solved by further technological development - possible energy shortages as a growing population increases energy usage can be met by nuclear power development; food shortages by the increased yields of GM crops. In opposition are those who put the quality of life before GNP, advocate a more frugal life-style, reducing needs and energy consumption, and, pointing to the harm caused by increasing pollution, press for cleaner air and water standards. They seek to reduce the pressure of an ever-increasing population and above all to preserve the natural environment. This view is associated with a growing uncertainty as the established order is challenged with the rise in status of 'alternative' science and medicine. This paper argues that these conflicting views reflect instinctive attitudes. These in turn draw support from beliefs selected from those which uncertainty offers. Where there is scope for argument over the truth or validity of a 'fact', the choice of which of the disputed views to believe will be determined by a value judgement. This applies to all controversial social and political issues. Nuclear waste disposal and biotechnology are but two particular examples in the technological field; joining the EMU is a current political controversy where value judgements based on attitudes determine beliefs. When, or if, a controversy is finally resolved the judgement arrived at will be justified by the belief that the consequences of the course

  20. Attitudes, beliefs, uncertainty and risk

    International Nuclear Information System (INIS)

    There is now unmistakable evidence of a widening split within the Western industrial nations arising from conflicting views of society; for and against change. The argument is over the benefits of 'progress' and growth. On one side are those who seek more jobs, more production and consumption, higher standards of living, an ever-increasing GNP with an increasing globalisation of production and welcome the advances of science and technology confident that any temporary problems that arise can be solved by further technological development - possible energy shortages as a growing population increases energy usage can be met by nuclear power development; food shortages by the increased yields of GM crops. In opposition are those who put the quality of life before GNP, advocate a more frugal life-style, reducing needs and energy consumption, and, pointing to the harm caused by increasing pollution, press for cleaner air and water standards. They seek to reduce the pressure of an ever-increasing population and above all to preserve the natural environment. This view is associated with a growing uncertainty as the established order is challenged with the rise in status of 'alternative' science and medicine. This paper argues that these conflicting views reflect instinctive attitudes. These in turn draw support from beliefs selected from those which uncertainty offers. Where there is scope for argument over the truth or validity of a 'fact', the choice of which of the disputed views to believe will be determined by a value judgement. This applies to all controversial social and political issues. Nuclear waste disposal and biotechnology are but two particular examples in the technological field; joining the EMU is a current political controversy where value judgements based on attitudes determine beliefs. When, or if, a controversy is finally resolved the judgement arrived at will be justified by the belief that the consequences of the course chosen will be more favourable

  1. Uncertainty Quantification for Safeguards Measurements

    International Nuclear Information System (INIS)

    Part of the scientific method requires all calculated and measured results to be accompanied by a description that meets user needs and provides an adequate statement of the confidence one can have in the results. The scientific art of generating quantitative uncertainty statements is closely related to the mathematical disciplines of applied statistics, sensitivity analysis, optimization, and inversion, but in the field of non-destructive assay, also often draws heavily on expert judgment based on experience. We call this process uncertainty quantification (UQ). Philosophical approaches to UQ along with the formal tools available for UQ have advanced considerably over recent years and these advances, we feel, may be useful to include in the analysis of data gathered from safeguards instruments. This paper sets out what we hope to achieve during a three year US DOE NNSA research project recently launched to address the potential of advanced UQ to improve safeguards conclusions. By way of illustration we discuss measurement of uranium enrichment by the enrichment meter principle (also known as the infinite thickness technique), which relies on gamma counts near the 186 keV peak directly from 235U. This method has strong foundations in fundamental physics and so we have a basis for the choice of response model — although in some implementations, peak area extraction may result in a bias when applied over a wide dynamic range. It also allows us to describe a common but usually neglected aspect of applying a calibration curve, namely the error structure in the predictors. We illustrate this using a combination of measured data and simulation. (author)

  2. Algorithms for propagating uncertainty across heterogeneous domains

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Heyrim; Yang, Xiu; Venturi, D.; Karniadakis, George E.

    2015-12-30

    We address an important research area in stochastic multi-scale modeling, namely the propagation of uncertainty across heterogeneous domains characterized by partially correlated processes with vastly different correlation lengths. This class of problems arises very often when computing stochastic PDEs and particle models with stochastic/stochastic domain interaction but also with stochastic/deterministic coupling. The domains may be fully embedded, adjacent or partially overlapping. The fundamental open question we address is the construction of proper transmission boundary conditions that preserve global statistical properties of the solution across different subdomains. Often, the codes that model different parts of the domains are black-box and hence a domain decomposition technique is required. No rigorous theory or even effective empirical algorithms have yet been developed for this purpose, although interfaces defined in terms of functionals of random fields (e.g., multi-point cumulants) can overcome the computationally prohibitive problem of preserving sample-path continuity across domains. The key idea of the different methods we propose relies on combining local reduced-order representations of random fields with multi-level domain decomposition. Specifically, we propose two new algorithms: The first one enforces the continuity of the conditional mean and variance of the solution across adjacent subdomains by using Schwarz iterations. The second algorithm is based on PDE-constrained multi-objective optimization, and it allows us to set more general interface conditions. The effectiveness of these new algorithms is demonstrated in numerical examples involving elliptic problems with random diffusion coefficients, stochastically advected scalar fields, and nonlinear advection-reaction problems with random reaction rates.

  3. Present theoretical uncertainties on charm hadroproduction in QCD and prompt neutrino fluxes

    Directory of Open Access Journals (Sweden)

    Garzelli M.V.

    2016-01-01

    Full Text Available Prompt neutrino fluxes are basic backgrounds in the search of high-energy neutrinos of astrophysical origin, performed by means of full-size neutrino telescopes located at Earth, under ice or under water. Predictions for these fluxes are provided on the basis of up-to-date theoretical results for charm hadroproduction in perturbative QCD, together with a comprehensive discussion of the various sources of theoretical uncertainty affecting their computation, and a quantitative estimate of each uncertainty contribution.

  4. Optimal Climate Protection Policies Under Uncertainty

    Science.gov (United States)

    Weber, M.; Barth, V.; Hasselmann, K.; Hooss, G.

    A cost-benefit analysis for greenhouse warming based on the globally integrated coupled climate-macroeconomic cost model SIAM2 (Structural Integrated Assessment Model) is used to compute optimal paths of global CO2 emissions. The aim of the model is to minimize the net time-integrated sum of climate damage and mitigation costs (or to maximize economic and social welfare). The climate model is represented by a nonlinear impulse-response model (NICCS) calibrated against a coupled ocean-atmosphere general circulation model and a three-dimensional global carbon cycle model. The latest version of the economic module is based on a macroeconomic growth model, which is designed to capture not only the interactions between climate damages and economic development, but also the conflicting goals of individual firms and society (government). The model includes unemployment, limited fossil fuel resources, and endogenous and stochastic exogenous technological development (unpredictable labor or fuel efficiency innovations of random impact amplitude at random points in time). One objective of the project is to examine optimal climate protection policies in the presence of uncertainty. A stochastic model is introduced to simulate the development of technology as well as climate change and climate damages. In response to this (stochastic) prediction, the fiscal policy is adjusted gradually in a series of discrete steps. The stochastic module includes probability-based methods, sensitivity studies and formal scenario analysis.

  5. Low cost high performance uncertainty quantification

    KAUST Repository

    Bekas, C.

    2009-01-01

    Uncertainty quantification in risk analysis has become a key application. In this context, computing the diagonal of inverse covariance matrices is of paramount importance. Standard techniques, which employ matrix factorizations, incur a cubic cost that quickly becomes intractable with the current explosion of data sizes. In this work we reduce this complexity to quadratic with the synergy of two algorithms that gracefully complement each other and lead to a radically different approach. First, we turn to stochastic estimation of the diagonal. This allows us to cast the problem as a linear system with a relatively small number of multiple right-hand sides. Second, for this linear system we develop a novel, mixed-precision, iterative refinement scheme, which uses iterative solvers instead of matrix factorizations. We demonstrate that the new framework not only achieves the much-needed quadratic cost but in addition offers excellent opportunities for scaling in massively parallel environments. We based our implementation on BLAS 3 kernels that ensure very high processor performance. We achieved a peak performance of 730 TFlops on 72 BG/P racks, with a sustained performance of 73% of the theoretical peak. We stress that the techniques presented in this work are quite general and applicable to several other important applications. Copyright © 2009 ACM.
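
    A minimal sketch of the first ingredient, stochastic estimation of the diagonal of a matrix inverse, is given below. It uses Rademacher probe vectors and an off-the-shelf conjugate-gradient solver in place of the paper's mixed-precision iterative refinement, and the sparse SPD matrix is an arbitrary stand-in for an inverse-covariance operator.

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import cg

rng = np.random.default_rng(0)

# Arbitrary sparse SPD stand-in; the goal is to estimate diag(A^{-1}) without
# ever factorizing or explicitly inverting A.
n = 500
A = diags([-0.3 * np.ones(n - 1), 2.0 * np.ones(n), -0.3 * np.ones(n - 1)],
          offsets=[-1, 0, 1], format="csr")

def stochastic_diag_inverse(A, n_probes=200):
    """Hutchinson-style estimator: diag(A^{-1}) ~ E[v * (A^{-1} v)], v Rademacher."""
    acc = np.zeros(A.shape[0])
    for _ in range(n_probes):
        v = rng.choice([-1.0, 1.0], size=A.shape[0])
        x, info = cg(A, v)            # iterative solve instead of a factorization
        acc += v * x
    return acc / n_probes

estimate = stochastic_diag_inverse(A)
exact = np.diag(np.linalg.inv(A.toarray()))   # brute force, for comparison only
print("max relative error:", np.max(np.abs(estimate - exact) / exact))
```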

  6. Measurement Uncertainty for Finite Quantum Observables

    Directory of Open Access Journals (Sweden)

    René Schwonnek

    2016-06-01

    Measurement uncertainty relations are lower bounds on the errors of any approximate joint measurement of two or more quantum observables. The aim of this paper is to provide methods to compute optimal bounds of this type. The basic method is semidefinite programming, which we apply to arbitrary finite collections of projective observables on a finite-dimensional Hilbert space. The quantification of errors is based on an arbitrary cost function, which assigns a penalty to getting result x rather than y, for any pair (x, y). This induces a notion of optimal transport cost for a pair of probability distributions, and we include an Appendix with a short summary of optimal transport theory as needed in our context. There are then different ways to form an overall figure of merit from the comparison of distributions. We consider three, which are related to different physical testing scenarios. The most thorough test compares the transport distances between the marginals of a joint measurement and the reference observables for every input state. Less demanding is a test just on the states for which a “true value” is known in the sense that the reference observable yields a definite outcome. Finally, we can measure a deviation as a single expectation value by comparing the two observables on the two parts of a maximally-entangled state. All three error quantities have the property that they vanish if and only if the tested observable is equal to the reference. The theory is illustrated with some characteristic examples.

  7. Unscented transform-based uncertainty analysis of rotating coil transducers for field mapping

    Science.gov (United States)

    Arpaia, P.; De Matteis, E.; Schiano Lo Moriello, R.

    2016-03-01

    The uncertainty of a rotating coil transducer for magnetic field mapping is analyzed. The unscented transform and statistical design of experiments are combined to determine the magnetic field expectation, the standard uncertainty, and the separate contributions of the uncertainty sources. For nonlinear measurement models, the unscented-transform-based approach is more error-proof than the linearization underlying the "Guide to the Expression of Uncertainty in Measurement" (GUM), owing to the absence of model approximations and derivative computations. When the GUM assumptions are not met, the deterministic sampling strategy strongly reduces the computational burden with respect to the Monte Carlo-based methods proposed by Supplement 1 of the GUM. Furthermore, the design of experiments and the associated statistical analysis allow the domain of the uncertainty sources to be explored efficiently, as well as their significance and individual contributions to be assessed for an effective setup configuration. A straightforward experimental case study highlights that a one-order-of-magnitude reduction in the relative uncertainty of the coil area produces a decrease in the uncertainty of the field mapping transducer by a factor of 25 with respect to the worst condition. Moreover, about 700 trials and the related processing achieve results corresponding to 5 × 10^6 brute-force Monte Carlo simulations.
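
    The following sketch shows the textbook unscented transform applied to a toy nonlinear measurement model (not the actual coil-transducer model): sigma points are generated from the input mean and covariance, pushed through the model, and recombined with the standard weights, with a brute-force Monte Carlo run as a cross-check. The model, the input uncertainties, and the scaling parameters are all illustrative.

```python
import numpy as np

def unscented_transform(f, mean, cov, alpha=0.1, beta=2.0, kappa=0.0):
    """Propagate (mean, cov) through a nonlinear f with the standard 2n+1 sigma points."""
    n = len(mean)
    lam = alpha**2 * (n + kappa) - n
    L = np.linalg.cholesky((n + lam) * cov)          # matrix square root
    sigma = np.vstack([mean, mean + L.T, mean - L.T])
    wm = np.full(2 * n + 1, 1.0 / (2.0 * (n + lam)))
    wc = wm.copy()
    wm[0] = lam / (n + lam)
    wc[0] = lam / (n + lam) + (1.0 - alpha**2 + beta)
    y = np.array([f(p) for p in sigma])
    y_mean = wm @ y
    y_cov = sum(w * np.outer(yi - y_mean, yi - y_mean) for w, yi in zip(wc, y))
    return y_mean, y_cov

# Toy nonlinear "transducer" model with two uncertain inputs (illustrative only)
f = lambda x: np.array([0.5 * x[0]**2 + 2.0 * x[1]])
mean = np.array([3.0, 1.0])
cov = np.diag([0.05**2, 0.01**2])

ut_mean, ut_cov = unscented_transform(f, mean, cov)

rng = np.random.default_rng(0)
samples = rng.multivariate_normal(mean, cov, size=200_000)
mc = 0.5 * samples[:, 0]**2 + 2.0 * samples[:, 1]

print("unscented transform:", ut_mean[0], np.sqrt(ut_cov[0, 0]))
print("Monte Carlo        :", mc.mean(), mc.std())
```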

  8. A Cascade Approach to Uncertainty Estimation for the Hydrological Simulation of Droughts

    Science.gov (United States)

    Smith, Katie; Tanguy, Maliko; Parry, Simon; Prudhomme, Christel

    2016-04-01

    Uncertainty poses a significant challenge in environmental research, and the characterisation and quantification of uncertainty have become a research priority over the past decade. Studies of extreme events are particularly affected by issues of uncertainty. This study focusses on the sources of uncertainty in the modelling of streamflow droughts in the United Kingdom. Droughts are a poorly understood natural hazard with no universally accepted definition. Meteorological, hydrological and agricultural droughts have different meanings and vary both spatially and temporally, yet each is inextricably linked to the others. The work presented here is part of two extensive interdisciplinary projects investigating drought reconstruction and drought forecasting capabilities in the UK. Lumped catchment models are applied to simulate streamflow drought, and uncertainties from five different sources are investigated: climate input data, potential evapotranspiration (PET) method, hydrological model, within-model structure, and model parameterisation. Latin Hypercube sampling is applied to develop large parameter ensembles for each model structure, which are run using parallel computing on a high-performance computer cluster. Parameterisations are assessed using multi-objective evaluation criteria which include both general and drought performance metrics. The effect of different climate input data and PET methods on model output is then considered using the accepted model parameterisations. The uncertainty from each of the sources creates a cascade, and when presented as such the relative importance of each aspect of uncertainty can be determined.
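
    A parameter ensemble of the kind described can be generated with SciPy's quasi-Monte Carlo module (assuming SciPy >= 1.7); the sketch below draws a Latin Hypercube sample and rescales it to the parameter ranges. The parameter names and bounds are placeholders, not those of the models used in the study.

```python
import numpy as np
from scipy.stats import qmc

# Hypothetical bounds for four lumped-model parameters (illustrative names only):
# storage capacity, routing constant, PET scaling factor, baseflow index.
l_bounds = [50.0, 0.5, 0.8, 0.1]
u_bounds = [500.0, 10.0, 1.2, 0.9]

sampler = qmc.LatinHypercube(d=4, seed=42)
unit_sample = sampler.random(n=10_000)                  # stratified points in [0, 1]^4
ensemble = qmc.scale(unit_sample, l_bounds, u_bounds)   # rescale to parameter ranges

# Each row is one candidate parameterisation; in a study like this, each would be
# run through the hydrological model (in parallel) and scored against both general
# and drought-specific performance metrics.
print(ensemble.shape)
print(ensemble.min(axis=0), ensemble.max(axis=0))
```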

  9. Main results of the OECD best estimate methods, uncertainty and sensitivity evaluation (BEMUSE) programme

    International Nuclear Information System (INIS)

    The BEMUSE (Best Estimate Methods - Uncertainty and Sensitivity Evaluation) Programme - promoted by the Working Group on Analysis and Management of Accidents (WGAMA) and endorsed by the Committee on the Safety of Nuclear Installations (CSNI) - represents an important step towards the reliable application of high-quality best-estimate and uncertainty and sensitivity evaluation methods. The methods used in this activity are considered to be mature for application, including in licensing processes. The users' skill, experience and knowledge of both the computer code applied and the uncertainty method used are important for the quality of the results. (author)

  10. Dakota uncertainty quantification methods applied to the NEK-5000 SAHEX model.

    Energy Technology Data Exchange (ETDEWEB)

    Weirs, V. Gregory

    2014-03-01

    This report summarizes the results of a NEAMS project focused on the use of uncertainty and sensitivity analysis methods within the NEK-5000 and Dakota software framework for assessing failure probabilities as part of probabilistic risk assessment. NEK-5000 is a software tool under development at Argonne National Laboratory to perform computational fluid dynamics calculations for applications such as thermohydraulics of nuclear reactor cores. Dakota is a software tool developed at Sandia National Laboratories containing optimization, sensitivity analysis, and uncertainty quantification algorithms. The goal of this work is to demonstrate the use of uncertainty quantification methods in Dakota with NEK-5000.

  11. Verification of the uncertainty principle by using diffraction of light waves

    Energy Technology Data Exchange (ETDEWEB)

    Nikolic, D [Grammar School Pirot, 18 300 Pirot (Serbia); Nesic, Lj, E-mail: gisanikolic@yahoo.com [Faculty of Sciences and Mathematics, University of Nis, 18 000 Nis (Serbia)

    2011-03-15

    We describe a simple idea for experimental verification of the uncertainty principle for light waves. We used single-slit diffraction of a laser beam to measure the angular width of the zero-order diffraction maximum and obtained the corresponding wave-number uncertainty, taking the uncertainty in position to be the slit width. A computer was used for the acquisition of the experimental data and their further analysis. Because of its simplicity this experiment is very suitable for demonstration, as well as for a quantitative exercise at universities and in the final year of high school.
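
    The back-of-the-envelope estimate behind the experiment can be written out directly: taking the slit width as the position uncertainty and the transverse momentum spread implied by the first diffraction minimum (sin(theta) = lambda/a) as the momentum uncertainty, the product comes out at about h, comfortably above the hbar/2 bound. The wavelength and slit width below are typical illustrative values, not the authors' measurements.

```python
import numpy as np

lam = 632.8e-9            # He-Ne laser wavelength (m), illustrative
a = 100e-6                # slit width (m), taken as the position uncertainty Delta x
hbar = 1.054571817e-34    # J*s
h = 2.0 * np.pi * hbar

# First diffraction minimum at sin(theta) = lambda / a, so the photon picks up a
# transverse momentum spread Delta p ~ p * sin(theta) = (h / lambda) * (lambda / a) = h / a.
delta_x = a
delta_p = h / a

print(f"Delta x * Delta p = {delta_x * delta_p:.3e} J*s (= h)")
print(f"hbar / 2          = {hbar / 2.0:.3e} J*s  -> the bound is easily satisfied")
```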

  12. Evacuation decision-making: process and uncertainty

    International Nuclear Information System (INIS)

    The purpose was to describe the process of evacuation decision-making, identify and document uncertainties in that process, and discuss implications for federal assumption of liability for precautionary evacuations at nuclear facilities under the Price-Anderson Act. Four major categories of uncertainty are identified, concerning the interpretation of hazard, communication problems, perceived impacts of evacuation decisions and exogenous influences. Over 40 historical accounts are reviewed and cases of these uncertainties are documented. The major findings are that all levels of government, including federal agencies, experience uncertainties in some evacuation situations. Second, private sector organizations are subject to uncertainties at a variety of decision points. Third, uncertainties documented in the historical record have provided grounds for liability, although few legal actions have ensued. Finally, it is concluded that if liability for evacuations is assumed by the federal government, the concept of a "precautionary" evacuation is not useful in establishing criteria for that assumption. 55 refs., 1 fig., 4 tabs

  13. Incorporating Forecast Uncertainty in Utility Control Center

    Energy Technology Data Exchange (ETDEWEB)

    Makarov, Yuri V.; Etingov, Pavel V.; Ma, Jian

    2014-07-09

    Uncertainties in forecasting the output of intermittent resources such as wind and solar generation, as well as system loads, are not adequately reflected in the existing industry-grade tools used for transmission system management, generation commitment, dispatch and market operation. There are other sources of uncertainty, such as uninstructed deviations of conventional generators from their dispatch set points, generator forced outages and failures to start up, load drops, losses of major transmission facilities and frequency variation. These uncertainties can cause deviations from the system balance, which sometimes require inefficient and costly last-minute solutions in the near-real-time timeframe. This chapter considers sources of uncertainty and variability, an overall system uncertainty model, a possible plan for the transition from deterministic to probabilistic methods in planning and operations, and two examples of uncertainty-based tools for grid operations. This chapter is based on work conducted at the Pacific Northwest National Laboratory (PNNL).

  14. Uncertainty Quantification with Applications to Engineering Problems

    DEFF Research Database (Denmark)

    Bigoni, Daniele

    The systematic quantification of the uncertainties affecting dynamical systems and the characterization of the uncertainty of their outcomes is critical for engineering design and analysis, where risks must be reduced as much as possible. Uncertainties stem naturally from our limitations in measurements, predictions and manufacturing, and we can say that any dynamical system used in engineering is subject to some of these uncertainties. The first part of this work presents an overview of the mathematical framework used in Uncertainty Quantification (UQ) analysis and introduces the spectral tensor-train (STT) decomposition, a novel high-order method for the effective propagation of uncertainties which aims at providing an exponential convergence rate while tackling the curse of dimensionality. The curse of dimensionality is a problem that afflicts many methods based on meta-models, for which ...

  15. Quantifying Mixed Uncertainties in Cyber Attacker Payoffs

    Energy Technology Data Exchange (ETDEWEB)

    Chatterjee, Samrat; Halappanavar, Mahantesh; Tipireddy, Ramakrishna; Oster, Matthew R.; Saha, Sudip

    2015-04-15

    Representation and propagation of uncertainty in cyber attacker payoffs is a key aspect of security games. Past research has primarily focused on representing the defender’s beliefs about attacker payoffs as point utility estimates. More recently, within the physical security domain, attacker payoff uncertainties have been represented as uniform and Gaussian probability distributions, and as intervals. In cyber settings, continuous probability distributions may still be appropriate for addressing statistical (aleatory) uncertainties, where the defender may assume that the attacker’s payoffs differ over time. However, systematic (epistemic) uncertainties may exist, where the defender may not have sufficient knowledge or there is insufficient information about the attacker’s payoff generation mechanism. Such epistemic uncertainties are more suitably represented as probability boxes with intervals. In this study, we explore the mathematical treatment of such mixed payoff uncertainties.
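
    A probability box of the kind mentioned can be built as the pointwise envelope of the cumulative distribution functions of all distributions compatible with interval-valued parameters. The sketch below does this on a grid of admissible parameters for a normally distributed payoff whose mean and standard deviation are only known to lie in intervals; the model and numbers are purely illustrative.

```python
import numpy as np
from scipy import stats

# Hypothetical attacker-payoff model: the defender only knows that the payoff is
# normally distributed with mean in [4, 6] and standard deviation in [0.5, 1.5].
mu_interval = (4.0, 6.0)
sigma_interval = (0.5, 1.5)

x = np.linspace(0.0, 10.0, 501)

# Enumerate a grid of admissible (mu, sigma) pairs and take the pointwise
# envelope of their CDFs: the result approximates a probability box (p-box).
cdfs = [stats.norm(mu, s).cdf(x)
        for mu in np.linspace(*mu_interval, 11)
        for s in np.linspace(*sigma_interval, 11)]
lower_cdf = np.min(cdfs, axis=0)   # lower bound on P(payoff <= x)
upper_cdf = np.max(cdfs, axis=0)   # upper bound on P(payoff <= x)

# Interval-valued probability that the payoff does not exceed 5:
i = np.searchsorted(x, 5.0)
print(f"P(payoff <= 5) lies in [{lower_cdf[i]:.3f}, {upper_cdf[i]:.3f}]")
```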

  16. Uncertainty in tsunami sediment transport modeling

    Science.gov (United States)

    Jaffe, Bruce E.; Goto, Kazuhisa; Sugawara, Daisuke; Gelfenbaum, Guy R.; La Selle, SeanPaul M.

    2016-01-01

    Erosion and deposition from tsunamis record information about tsunami hydrodynamics and size that can be interpreted to improve tsunami hazard assessment. We explore sources and methods for quantifying uncertainty in tsunami sediment transport modeling. Uncertainty varies with tsunami, study site, available input data, sediment grain size, and model. Although uncertainty has the potential to be large, published case studies indicate that both forward and inverse tsunami sediment transport models perform well enough to be useful for deciphering tsunami characteristics, including size, from deposits. New techniques for quantifying uncertainty, such as Ensemble Kalman Filtering inversion, and more rigorous reporting of uncertainties will advance the science of tsunami sediment transport modeling. Uncertainty may be decreased with additional laboratory studies that increase our understanding of the semi-empirical parameters and physics of tsunami sediment transport, standardized benchmark tests to assess model performance, and development of hybrid modeling approaches to exploit the strengths of forward and inverse models.

  17. Bayesian Uncertainty Analyses Via Deterministic Model

    Science.gov (United States)

    Krzysztofowicz, R.

    2001-05-01

    Rational decision-making requires that the total uncertainty about a variate of interest (a predictand) be quantified in terms of a probability distribution, conditional on all available information and knowledge. Suppose the state-of-knowledge is embodied in a deterministic model, which is imperfect and outputs only an estimate of the predictand. Fundamentals are presented of three Bayesian approaches to producing a probability distribution of the predictand via any deterministic model. The Bayesian Processor of Output (BPO) quantifies the total uncertainty in terms of a posterior distribution, conditional on model output. The Bayesian Processor of Ensemble (BPE) quantifies the total uncertainty in terms of a posterior distribution, conditional on an ensemble of model output. The Bayesian Forecasting System (BFS) decomposes the total uncertainty into input uncertainty and model uncertainty, which are characterized independently and then integrated into a predictive distribution.
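
    As a minimal stand-in for the idea of conditioning a predictand on deterministic model output (this is a plain normal-normal illustration, not Krzysztofowicz's meta-Gaussian BPO), the sketch below combines a Gaussian climatological prior with a linear-Gaussian likelihood whose parameters would, in practice, be estimated from a historical sample of (model output, observation) pairs; all numbers are hypothetical.

```python
import numpy as np

# Prior (climatological) distribution of the predictand W
mu_w, sigma_w = 10.0, 3.0

# Likelihood: model output X given W is assumed X ~ N(a + b*W, sigma_e^2), with
# a, b, sigma_e calibrated from a historical verification sample (hypothetical values).
a, b, sigma_e = 0.5, 0.95, 1.2

def posterior(x_obs):
    """Posterior N(mu_post, sigma_post^2) of W given today's model output x_obs."""
    prec = 1.0 / sigma_w**2 + b**2 / sigma_e**2
    mu_post = (mu_w / sigma_w**2 + b * (x_obs - a) / sigma_e**2) / prec
    return mu_post, np.sqrt(1.0 / prec)

mu_post, sigma_post = posterior(x_obs=13.0)
print(f"posterior: mean = {mu_post:.2f}, std = {sigma_post:.2f} (prior std was {sigma_w})")
```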

  18. Report on the uncertainty methods study

    International Nuclear Information System (INIS)

    The Uncertainty Methods Study (UMS) Group, following a mandate from CSNI, has compared five methods for calculating the uncertainty in the predictions of advanced 'best estimate' thermal-hydraulic codes: the Pisa method (based on extrapolation from integral experiments) and four methods identifying and combining input uncertainties. Three of these, the GRS, IPSN and ENUSA methods, use subjective probability distributions, and one, the AEAT method, performs a bounding analysis. Each method has been used to calculate the uncertainty in specified parameters for the LSTF SB-CL-18 5% cold leg small break LOCA experiment in the ROSA-IV Large Scale Test Facility (LSTF). The uncertainty analysis was conducted essentially blind and the participants did not use experimental measurements from the test as input apart from initial and boundary conditions. Participants calculated uncertainty ranges for experimental parameters including pressurizer pressure, primary circuit inventory and clad temperature (at a specified position) as functions of time

  19. Visual Semiotics & Uncertainty Visualization: An Empirical Study.

    Science.gov (United States)

    MacEachren, A M; Roth, R E; O'Brien, J; Li, B; Swingley, D; Gahegan, M

    2012-12-01

    This paper presents two linked empirical studies focused on uncertainty visualization. The experiments are framed from two conceptual perspectives. First, a typology of uncertainty is used to delineate kinds of uncertainty matched with the space, time, and attribute components of data. Second, concepts from visual semiotics are applied to characterize the kind of visual signification that is appropriate for representing those different categories of uncertainty. This framework guided the two experiments reported here. The first addresses representation intuitiveness, considering both visual variables and iconicity of representation. The second addresses the relative performance of the most intuitive abstract and iconic representations of uncertainty on a map reading task. Combined results suggest initial guidelines for representing uncertainty, and the discussion focuses on the practical applicability of the results.

  20. Uncertainty quantification for proton-proton fusion in chiral effective field theory

    Science.gov (United States)

    Acharya, B.; Carlsson, B. D.; Ekström, A.; Forssén, C.; Platter, L.

    2016-09-01

    We compute the S-factor of the proton-proton (pp) fusion reaction using chiral effective field theory (χEFT) up to next-to-next-to-leading order (NNLO) and perform a rigorous uncertainty analysis of the results. We quantify the uncertainties due to (i) the computational method used to compute the pp cross section in momentum space, (ii) the statistical uncertainties in the low-energy coupling constants of χEFT, (iii) the systematic uncertainty due to the χEFT cutoff, and (iv) systematic variations in the database used to calibrate the nucleon-nucleon interaction. We also examine the robustness of the polynomial extrapolation procedure, which is commonly used to extract the threshold S-factor and its energy derivatives. By performing a statistical analysis of the polynomial fit of the energy-dependent S-factor at several different energy intervals, we eliminate a systematic uncertainty that can arise from the choice of the fit interval in our calculations. In addition, we explore the statistical correlations between the S-factor and few-nucleon observables such as the binding energies and point-proton radii of 2,3H and 3He as well as the D-state probability and quadrupole moment of 2H, and the β-decay of 3H. We find that, with the state-of-the-art optimization of the nuclear Hamiltonian, the statistical uncertainty in the threshold S-factor cannot be reduced beyond 0.7%.
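
    The sensitivity of the threshold extrapolation to the choice of fit interval can be illustrated with a synthetic S-factor curve (the functional form and numbers below are placeholders, not χEFT results): fit a low-order polynomial on several energy windows and compare the extracted S(0) and S'(0).

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a computed S-factor curve: a smooth quadratic in energy
# plus a small amount of numerical noise, in arbitrary units.
E = np.linspace(0.0, 0.1, 101)                     # MeV
S_true = 4.0e-23 * (1.0 + 11.0 * E + 150.0 * E**2)
S_calc = S_true * (1.0 + 1e-4 * rng.standard_normal(E.size))

# Fit a quadratic on several energy windows and compare the extrapolated
# threshold value S(0) and its first energy derivative S'(0).
for e_max in (0.02, 0.05, 0.10):
    mask = E <= e_max
    coeffs = np.polynomial.polynomial.polyfit(E[mask], S_calc[mask], deg=2)
    print(f"fit window [0, {e_max:.2f}] MeV: S(0) = {coeffs[0]:.4e}, S'(0) = {coeffs[1]:.4e}")
```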

  1. An Analysis of Optimal Advertising Under Uncertainty

    OpenAIRE

    Dung Nguyen

    1985-01-01

    We examine the firm's optimal advertising behavior under conditions of uncertainty. For the static one-period model, we show that the firm's attitude toward risk may be responsible for the potential divergence between advertising decisions under uncertainty and those under deterministic conditions. For the dynamic multi-period model, the ultimate impact of uncertainty on advertising is further complicated when the sales response function contains an unknown parameter, and the firm wishes to g...

  2. Uncertainty propagation with functionally correlated quantities

    CERN Document Server

    Giordano, Mosè

    2016-01-01

    Many uncertainty propagation software packages exist, written in different programming languages, but not all of them are able to handle functional correlation between quantities. In this paper we review one strategy to deal with uncertainty propagation for quantities that are functionally correlated, and introduce a new software package offering this feature: the Julia package Measurements.jl. It supports real and complex numbers with uncertainty, arbitrary-precision calculations, and mathematical and linear algebra operations with matrices and arrays.
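
    The mechanism that makes functional correlation work can be sketched in a few lines of plain Python (an illustration of the underlying idea, not the API of Measurements.jl): each uncertain number carries its first-order derivatives with respect to a set of independent base variables, so expressions such as x - x come out with exactly zero uncertainty instead of the spurious quadrature sum a naive approach would give.

```python
import math

class UFloat:
    """Tiny first-order propagation demo: each value stores derivatives with
    respect to independent base variables, so functionally correlated
    quantities are handled exactly."""
    _next_id = 0

    def __init__(self, value, std=None, deriv=None):
        self.value = value
        if deriv is not None:
            self.deriv = deriv
        else:
            UFloat._next_id += 1
            self.deriv = {UFloat._next_id: std}   # d(self)/d(base variable)

    @property
    def std(self):
        return math.sqrt(sum(d * d for d in self.deriv.values()))

    def _combine(self, other, dself, dother):
        deriv = {k: dself * v for k, v in self.deriv.items()}
        for k, v in other.deriv.items():
            deriv[k] = deriv.get(k, 0.0) + dother * v
        return deriv

    def __add__(self, other):
        o = other if isinstance(other, UFloat) else UFloat(other, 0.0)
        return UFloat(self.value + o.value, deriv=self._combine(o, 1.0, 1.0))

    def __sub__(self, other):
        o = other if isinstance(other, UFloat) else UFloat(other, 0.0)
        return UFloat(self.value - o.value, deriv=self._combine(o, 1.0, -1.0))

    def __mul__(self, other):
        o = other if isinstance(other, UFloat) else UFloat(other, 0.0)
        return UFloat(self.value * o.value, deriv=self._combine(o, o.value, self.value))

    def __repr__(self):
        return f"{self.value:.4g} +/- {self.std:.2g}"

x = UFloat(5.0, 0.1)
y = UFloat(2.0, 0.2)
print("x - x     =", x - x)            # 0 +/- 0: full functional correlation
print("x*y - x*y =", x * y - x * y)    # also exactly zero
print("x + y     =", x + y)            # 7 +/- 0.22: quadrature of independent errors
```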

  3. Research of relationship between uncertainty and investment

    Institute of Scientific and Technical Information of China (English)

    MENG Li; WANG Ding-wei

    2005-01-01

    This study focuses on revealing the relationship between uncertainty and investment probability through a real option model involving the investment critical trigger and project earnings. Numerical experiments carried out in Matlab show that project-earning volatility influences the investment probability and lead the authors to conclude that the notion that increasing uncertainty always has an inhibiting effect on investment is not correct: in certain situations, increasing uncertainty actually increases the investment probability and so has a positive impact on investment.

  4. Whitepaper on Uncertainty Quantification for MPACT

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Mark L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-12-17

    The MPACT code provides the ability to perform high-fidelity deterministic calculations to obtain a wide variety of detailed results for very complex reactor core models. However, MPACT currently does not have the capability to propagate the effects of input data uncertainties to provide uncertainties in the calculated results. This white paper discusses a potential method for MPACT uncertainty quantification (UQ) based on stochastic sampling.

  5. Assessing uncertainty in stormwater quality modelling.

    Science.gov (United States)

    Wijesiri, Buddhi; Egodawatta, Prasanna; McGree, James; Goonetilleke, Ashantha

    2016-10-15

    Designing effective stormwater pollution mitigation strategies is a challenge in urban stormwater management. This is primarily due to the limited reliability of catchment-scale stormwater quality modelling tools. As such, assessing the uncertainty associated with the information generated by stormwater quality models is important for informed decision making. Quantitative assessment of build-up and wash-off process uncertainty, which arises from the variability associated with these processes, is a major concern, as typical uncertainty assessment approaches do not adequately account for process uncertainty. The research study undertaken found that the variability of build-up and wash-off processes for different particle size ranges leads to process uncertainty. Once variability and the resulting process uncertainties are accurately characterised, they can be incorporated into catchment stormwater quality predictions. Accounting for process uncertainty influences the uncertainty limits associated with predicted stormwater quality. The impact of build-up process uncertainty on stormwater quality predictions is greater than that of wash-off process uncertainty. Accordingly, decision making should facilitate the design of mitigation strategies which specifically address variations in the load and composition of pollutants accumulated during dry weather periods. Moreover, the study found that the influence of process uncertainty differs for stormwater quality predictions corresponding to storm events with different intensity, duration and runoff volume generated. These storm events were also found to be significantly different in terms of the Runoff-Catchment Area ratio. As such, the selection of storm events in the context of designing stormwater pollution mitigation strategies needs to take into consideration not only the storm event characteristics, but also the influence of process uncertainty on stormwater quality predictions. PMID:27423532

  6. Managing Uncertainty for an Integrated Fishery

    OpenAIRE

    MB Hasan

    2012-01-01

    This paper investigates ways to deal with the uncertainties in fishing trawler scheduling and production planning in a quota-based integrated commercial fishery. A commercial fishery faces uncertainty mainly from variation in catch rate, which may be due to weather, and other environmental factors. The firm tries to manage this uncertainty through planning co-ordination of fishing trawler scheduling, catch quota, processing and labour allocation, and inventory control. Scheduling must necessa...

  7. Inflation, inflation uncertainty and output in Tunisia

    OpenAIRE

    Hachicha, Ahmed; Lean Hooi Hooi

    2013-01-01

    This study investigates the relationship between inflation, inflation uncertainty and output in Tunisia using real and nominal data. GARCH-in-mean model with lagged variance equation is employed for the analysis. The result shows that inflation uncertainty has a positive and significant effect on the level of inflation only in the real term. Moreover, inflation uncertainty Granger-causes inflation and economic growth respectively. These results have important implications for the monetary pol...

  8. Inflation and Inflation Uncertainty in Latvia

    OpenAIRE

    Viktors Ajevskis

    2007-01-01

    The paper considers interrelation between inflation and inflation uncertainty in Latvia. The monthly growth in CPI in the period from January 1994 to June 2007 has been used as an inflation measure. The application of the GARCH-M model with lagged inflation in GARCH equation proves that a positive relationship between inflation and inflation uncertainty does exist. It suggests that increased inflation uncertainty raises inflation, and, vice versa, increased inflation is a cause for higher unc...

  9. Uncertainty Relations for General Unitary Operators

    OpenAIRE

    Bagchi, Shrobona; Pati, Arun Kumar

    2015-01-01

    We derive several uncertainty relations for two arbitrary unitary operators acting on physical states of any Hilbert space (finite or infinite dimensional). We show that our bounds are tighter in various cases than the ones existing in the current literature. With regard to the minimum uncertainty state for the cases of both the finite as well as the infinite dimensional unitary operators, we derive the minimum uncertainty state equation by the analytic method. As an application of this, we f...

  10. Uncertainty quantification of effective nuclear interactions

    CERN Document Server

    Perez, R Navarro; Arriola, E Ruiz

    2016-01-01

    We give a brief review of the development of phenomenological NN interactions and the corresponding quantification of statistical uncertainties. We look into the uncertainty of effective interactions broadly used in mean-field calculations, through the Skyrme parameters and effective field theory counter-terms, by estimating both statistical and systematic uncertainties stemming from the NN interaction. We also comment on the role played by different fitting strategies in the light of recent developments.

  11. An approximation approach for uncertainty quantification using evidence theory

    Energy Technology Data Exchange (ETDEWEB)

    Bae, Ha-Rok; Grandhi, Ramana V.; Canfield, Robert A

    2004-12-01

    Over the last two decades, uncertainty quantification (UQ) in engineering systems has been performed within the popular framework of probability theory. However, many scientific and engineering communities realize that there are limitations in using only one framework for quantifying the uncertainty experienced in engineering applications. Recently evidence theory, also called Dempster-Shafer theory, was proposed to handle limited and imprecise data situations as an alternative to classical probability theory. Adaptation of this theory to large-scale engineering structures is a challenge due to the implicit nature of simulations and excessive computational costs. In this work, an approximation approach is developed to improve the practical utility of evidence theory in UQ analysis. The techniques are demonstrated on composite material structures and an airframe wing aeroelastic design problem.
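
    The basic evidence-theory bookkeeping can be shown with a toy mass function (the frame of discernment and the masses are invented for illustration, not taken from the composite-structure application): belief sums the mass of every focal set that implies an event, plausibility sums the mass of every set consistent with it, and the [Bel, Pl] pair brackets the imprecisely known probability.

```python
# Frame of discernment for a hypothetical failure mode of a structural component
frame = frozenset({"safe", "degraded", "failed"})

# Basic probability assignment (mass function) from limited/imprecise evidence;
# mass on non-singleton sets expresses partial or total ignorance.
m = {
    frozenset({"safe"}):               0.5,
    frozenset({"degraded", "failed"}): 0.3,   # evidence cannot tell these apart
    frame:                             0.2,   # total ignorance
}

def belief(A):
    """Bel(A): total mass of evidence that implies A (focal sets contained in A)."""
    return sum(v for B, v in m.items() if B <= A)

def plausibility(A):
    """Pl(A): total mass of evidence consistent with A (focal sets intersecting A)."""
    return sum(v for B, v in m.items() if B & A)

A = frozenset({"degraded", "failed"})
print(f"Bel = {belief(A):.2f}, Pl = {plausibility(A):.2f}")   # probability lies in [0.30, 0.50]
```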

  12. Nuclear fuel cycle cost in the face of uncertainty

    International Nuclear Information System (INIS)

    Long-term planning and decision making are a major part of the utility fuel manager's responsibilities. The proper performance of these responsibilities is made that much more challenging by the many uncertainties that encumber the world in which the manager must work. A proper appreciation of the sensitivity of nuclear fuel costs to the myriad of factors that can and do affect these costs can best be achieved through the application of probabilistic modeling techniques. The methodology to perform such an analysis has been formalized in the menu-driven, personal-computer-based program PROBCOST. PROBCOST facilitates a quantitative analysis of these uncertainties and enables the fuel manager to better evaluate the relative contribution of various factors to the total nuclear fuel cost.

  13. Cumulative theoretical uncertainties in lithium depletion boundary age

    CERN Document Server

    Tognelli, Emanuele; Degl'Innocenti, Scilla

    2015-01-01

    We performed a detailed analysis of the main theoretical uncertainties affecting the age at the lithium depletion boundary (LDB). To do so, we computed almost 12000 pre-main-sequence models with masses in the range [0.06, 0.4] M_sun, varying the input physics (nuclear reaction cross-sections, plasma electron screening, outer boundary conditions, equation of state, and radiative opacity), the initial chemical element abundances (total metallicity, helium and deuterium abundances, and heavy-element mixture), and the convection efficiency (mixing length parameter, alpha_ML). As a first step, we studied the effect of varying these quantities individually within their extreme values. Then, we analysed the impact of simultaneously perturbing the main inputs/parameters without an a priori assumption of independence. Such an approach allowed us to build for the first time the cumulative error stripe, which defines the edges of the maximum uncertainty region in the theoretical LDB age. We found that the cumulative error stripe ...

  14. Vibration and stress analysis in the presence of structural uncertainty

    Science.gov (United States)

    Langley, R. S.

    2009-08-01

    At medium to high frequencies the dynamic response of a built-up engineering system, such as an automobile, can be sensitive to small random manufacturing imperfections. Ideally the statistics of the system response in the presence of these uncertainties should be computed at the design stage, but in practice this is an extremely difficult task. In this paper a brief review of the methods available for the analysis of systems with uncertainty is presented, and attention is then focused on two particular "non-parametric" methods: statistical energy analysis (SEA), and the hybrid method. The main governing equations are presented, and a number of example applications are considered, ranging from academic benchmark studies to industrial design studies.

  15. Vibration and stress analysis in the presence of structural uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Langley, R S, E-mail: RSL21@eng.cam.ac.u [Department of Engineering, University of Cambridge, Cambridge CB2 1PZ (United Kingdom)

    2009-08-01

    At medium to high frequencies the dynamic response of a built-up engineering system, such as an automobile, can be sensitive to small random manufacturing imperfections. Ideally the statistics of the system response in the presence of these uncertainties should be computed at the design stage, but in practice this is an extremely difficult task. In this paper a brief review of the methods available for the analysis of systems with uncertainty is presented, and attention is then focused on two particular "non-parametric" methods: statistical energy analysis (SEA), and the hybrid method. The main governing equations are presented, and a number of example applications are considered, ranging from academic benchmark studies to industrial design studies.

  16. Uncertainty propagation from raw data to final results

    International Nuclear Information System (INIS)

    Reduction of data from raw numbers (counts per channel) to physically meaningful quantities (such as cross sections) is in itself a complicated procedure. Propagation of experimental uncertainties through that reduction process has sometimes been perceived as even more difficult, if not impossible. At the Oak Ridge Electron Linear Accelerator, a computer code ALEX has been developed to assist in the propagation process. The purpose of ALEX is to carefully and correctly propagate all experimental uncertainties through the entire reduction procedure, yielding the complete covariance matrix for the reduced data, while requiring little additional input from the experimentalist beyond that which is needed for the data reduction itself. The theoretical method used in ALEX is described, with emphasis on transmission measurements. Application to the natural iron and natural nickel measurements of D.C. Larson is shown
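
    The core of such a propagation is the first-order "sandwich" rule C_out = J C_raw J^T, where J is the Jacobian of the reduction. The sketch below applies it, with a finite-difference Jacobian, to a toy transmission reduction; the reduction formula, counts and Poisson covariance are illustrative and are not ALEX's actual equations, but the result shows how the full covariance matrix of the reduced quantities, off-diagonal terms included, falls out.

```python
import numpy as np

def propagate_covariance(reduce_fn, raw, cov_raw, eps=1e-6):
    """First-order ("sandwich") propagation C_out = J C_raw J^T with a
    finite-difference Jacobian of the data-reduction function."""
    raw = np.asarray(raw, dtype=float)
    out0 = np.atleast_1d(reduce_fn(raw))
    J = np.zeros((out0.size, raw.size))
    for j in range(raw.size):
        step = np.zeros_like(raw)
        step[j] = eps * max(1.0, abs(raw[j]))
        J[:, j] = (np.atleast_1d(reduce_fn(raw + step)) - out0) / step[j]
    return out0, J @ cov_raw @ J.T

# Toy transmission reduction: T = (C_sample - B) / (C_open - B) and
# sigma_total = -ln(T) / (n * t) for a hypothetical areal density n*t.
n_t = 0.05   # atoms/barn, hypothetical

def reduction(raw):
    c_sample, c_open, background = raw
    T = (c_sample - background) / (c_open - background)
    return np.array([T, -np.log(T) / n_t])

raw = np.array([8.0e4, 2.0e5, 1.0e3])      # raw counts per channel group (illustrative)
cov_raw = np.diag(raw)                      # Poisson counting statistics: var = counts
values, cov_out = propagate_covariance(reduction, raw, cov_raw)

print("T, sigma    :", values)
print("std devs    :", np.sqrt(np.diag(cov_out)))
print("correlation :", cov_out[0, 1] / np.sqrt(cov_out[0, 0] * cov_out[1, 1]))
```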

  17. Uncertainty, reward, and attention in the Bayesian brain

    DEFF Research Database (Denmark)

    Whiteley, Louise Emma

    2008-01-01

    The ‘Bayesian Coding Hypothesis’ formalises the classic Helmholtzian picture of perception as inverse inference, stating that the brain uses Bayes’ rule to compute posterior belief distributions over states of the world. There is much behavioural evidence that human observers can behave Bayes...... function with their uncertainty about a very simple stimulus, but behave suboptimally with respect to highly complex stimuli. Second, we use the same paradigm in a collaborative fMRI study, asking where along the path from sensory to motor areas a loss function is integrated with sensory uncertainty. Our...... in the focus of attention. When faced instead with a complex scene, the brain can’t be Bayes-optimal everywhere. We suggest that a general limitation on the representation of complex posteriors causes the brain to make approximations, which are then locally refined by attention. This framework extends ideas...

  18. A surrogate accelerated multicanonical Monte Carlo method for uncertainty quantification

    Science.gov (United States)

    Wu, Keyi; Li, Jinglai

    2016-09-01

    In this work we consider a class of uncertainty quantification problems where the system performance or reliability is characterized by a scalar parameter y. The performance parameter y is random due to the presence of various sources of uncertainty in the system, and our goal is to estimate the probability density function (PDF) of y. We propose to use the multicanonical Monte Carlo (MMC) method, a special type of adaptive importance sampling algorithm, to compute the PDF of interest. Moreover, we develop an adaptive algorithm to construct local Gaussian process surrogates to further accelerate the MMC iterations. With numerical examples we demonstrate that the proposed method can achieve several orders of magnitude of speedup over standard Monte Carlo methods.

  19. Dealing with uncertainties in angles-only initial orbit determination

    Science.gov (United States)

    Armellin, Roberto; Di Lizia, Pierluigi; Zanetti, Renato

    2016-08-01

    A method to deal with uncertainties in initial orbit determination (IOD) is presented. This is based on the use of Taylor differential algebra (DA) to nonlinearly map uncertainties from the observation space to the state space. When a minimum set of observations is available, DA is used to expand the solution of the IOD problem in Taylor series with respect to measurement errors. When more observations are available, high-order inversion tools are exploited to obtain full-state pseudo-observations at a common epoch. The mean and covariance of these pseudo-observations are nonlinearly computed by evaluating the expectation of high-order Taylor polynomials. Finally, a linear scheme is employed to update the current knowledge of the orbit. Angles-only observations are considered and simplified Keplerian dynamics are adopted to ease the explanation. Three test cases of orbit determination of artificial satellites in different orbital regimes are presented to discuss the features and performance of the proposed methodology.

  20. Predicting uncertainty in future marine ice sheet volume using Bayesian statistical methods

    Science.gov (United States)

    Davis, A. D.

    2015-12-01

    The marine ice instability can trigger rapid retreat of marine ice streams. Recent observations suggest that marine ice systems in West Antarctica have begun retreating. However, unknown ice dynamics, computationally intensive mathematical models, and uncertain parameters in these models make predicting retreat rate and ice volume difficult. In this work, we fuse current observational data with ice stream/shelf models to develop probabilistic predictions of future grounded ice sheet volume. Given observational data (e.g., thickness, surface elevation, and velocity) and a forward model that relates uncertain parameters (e.g., basal friction and basal topography) to these observations, we use a Bayesian framework to define a posterior distribution over the parameters. A stochastic predictive model then propagates uncertainties in these parameters to uncertainty in a particular quantity of interest (QoI)---here, the volume of grounded ice at a specified future time. While the Bayesian approach can in principle characterize the posterior predictive distribution of the QoI, the computational cost of both the forward and predictive models makes this effort prohibitively expensive. To tackle this challenge, we introduce a new Markov chain Monte Carlo method that constructs convergent approximations of the QoI target density in an online fashion, yielding accurate characterizations of future ice sheet volume at significantly reduced computational cost. Our second goal is to attribute uncertainty in these Bayesian predictions to uncertainties in particular parameters. Doing so can help target data collection, for the purpose of constraining the parameters that contribute most strongly to uncertainty in the future volume of grounded ice. For instance, smaller uncertainties in parameters to which the QoI is highly sensitive may account for more variability in the prediction than larger uncertainties in parameters to which the QoI is less sensitive. We use global sensitivity