Greenhouse Gas Source Attribution: Measurements, Modeling, and Uncertainty Quantification
Liu, Zhen [Sandia National Lab. (SNL-CA), Livermore, CA (United States)]; Safta, Cosmin [Sandia National Lab. (SNL-CA), Livermore, CA (United States)]; Sargsyan, Khachik [Sandia National Lab. (SNL-CA), Livermore, CA (United States)]; Najm, Habib N. [Sandia National Lab. (SNL-CA), Livermore, CA (United States)]; van Bloemen Waanders, Bart Gustaaf [Sandia National Lab. (SNL-CA), Livermore, CA (United States)]; LaFranchi, Brian W. [Sandia National Lab. (SNL-CA), Livermore, CA (United States)]; Ivey, Mark D. [Sandia National Lab. (SNL-CA), Livermore, CA (United States)]; Schrader, Paul E. [Sandia National Lab. (SNL-CA), Livermore, CA (United States)]; Michelsen, Hope A. [Sandia National Lab. (SNL-CA), Livermore, CA (United States)]; Bambha, Ray P. [Sandia National Lab. (SNL-CA), Livermore, CA (United States)]
2014-09-01
In this project we have developed atmospheric measurement capabilities and a suite of atmospheric modeling and analysis tools that are well suited for verifying emissions of greenhouse gases (GHGs) on an urban-through-regional scale. We have for the first time applied the Community Multiscale Air Quality (CMAQ) model to simulate atmospheric CO_{2}. This will allow for the examination of regional-scale transport and distribution of CO_{2}, along with air pollutants traditionally studied using CMAQ, at relatively high spatial and temporal resolution, with the goal of leveraging emissions verification efforts for both air quality and climate. We have developed a bias-enhanced Bayesian inference approach that can remedy the well-known problem of transport model errors in atmospheric CO_{2} inversions. We have tested the approach using data and model outputs from the TransCom3 global CO_{2} inversion comparison project. We have also performed two prototyping studies on inversion approaches in the generalized convection-diffusion context. One of these studies employed Polynomial Chaos Expansion to accelerate the evaluation of a regional transport model and enable efficient Markov Chain Monte Carlo sampling of the posterior for Bayesian inference. The other approach uses deterministic inversion of a convection-diffusion-reaction system in the presence of uncertainty. These approaches should, in principle, be applicable to realistic atmospheric problems with moderate adaptation. We outline a regional greenhouse gas source inference system that integrates (1) two approaches to atmospheric dispersion simulation and (2) a class of Bayesian inference and uncertainty quantification algorithms. We use two different and complementary approaches to simulate atmospheric dispersion. Specifically, we use an Eulerian chemical transport model, CMAQ, and a Lagrangian particle dispersion model, FLEXPART-WRF. These two models share the same WRF
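As a hedged illustration of the Bayesian-inference step described in this abstract, the sketch below infers a single source strength for a toy one-dimensional transport kernel by Metropolis MCMC. The forward model, all parameter values, and the noise level are invented for illustration; the actual project used CMAQ and FLEXPART-WRF transport with far richer parameterizations.

```python
import numpy as np

# Toy 1-D steady transport kernel: concentration at receptor x from a
# point source of strength q. This kernel is purely illustrative.
def forward(q, x, u=2.0, K=5.0):
    return q / np.sqrt(4 * np.pi * K) * np.exp(-u * x / (2 * K))

rng = np.random.default_rng(0)
x_obs = np.array([1.0, 2.0, 4.0])          # receptor locations (assumed)
q_true, sigma = 10.0, 0.05                 # true strength, obs noise (assumed)
y_obs = forward(q_true, x_obs) + rng.normal(0.0, sigma, x_obs.size)

def log_post(q):
    if q <= 0:
        return -np.inf                     # positivity prior on the source
    resid = y_obs - forward(q, x_obs)
    return -0.5 * np.sum(resid**2) / sigma**2

# Metropolis random-walk sampling of the posterior over q
q, lp = 5.0, log_post(5.0)
samples = []
for _ in range(20000):
    q_new = q + rng.normal(0.0, 0.5)
    lp_new = log_post(q_new)
    if np.log(rng.uniform()) < lp_new - lp:
        q, lp = q_new, lp_new
    samples.append(q)
post = np.array(samples[5000:])            # discard burn-in
print(post.mean(), post.std())             # posterior centers near q_true
```

In the real system the scalar q would be replaced by a gridded emission field and the forward model by a transport simulation, which is exactly where the Polynomial Chaos surrogate mentioned above pays off.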
Uranium Measurements and Attributes
It may be necessary to find the means to determine unclassified attributes of uranium in nuclear weapons or their components for future transparency initiatives. We briefly describe the desired characteristics of attribute measurement systems for transparency. The determination of uranium attributes, in particular by passive gamma-ray detection, is a formidable challenge.
The attribute measurement technique
Any verification measurement performed on potentially classified nuclear material must satisfy two seemingly contradictory constraints. First and foremost, no classified information can be released. At the same time, the monitoring party must have confidence in the veracity of the measurement. An information barrier (IB) is included in the measurement system to protect the potentially classified information while allowing sufficient information transfer to occur for the monitoring party to gain confidence that the material being measured is consistent with the host's declarations concerning that material. The attribute measurement technique incorporates an IB and addresses both concerns by measuring several attributes of the nuclear material and displaying unclassified results through green (indicating that the material does possess the specified attribute) and red (indicating that the material does not possess the specified attribute) lights. The attribute measurement technique has been implemented in the AVNG, an attribute measuring system described in other presentations at this conference. In this presentation, we will discuss four techniques used in the AVNG: (1) the IB, (2) the attribute measurement technique, (3) the use of open and secure modes to increase confidence in the displayed results, and (4) the joint design as a method for addressing both host and monitor needs.
Measurement Uncertainty and Probability
Willink, Robin
2013-02-01
Part I. Principles: 1. Introduction; 2. Foundational ideas in measurement; 3. Components of error or uncertainty; 4. Foundational ideas in probability and statistics; 5. The randomization of systematic errors; 6. Beyond the standard confidence interval; Part II. Evaluation of Uncertainty: 7. Final preparation; 8. Evaluation using the linear approximation; 9. Evaluation without the linear approximations; 10. Uncertainty information fit for purpose; Part III. Related Topics: 11. Measurement of vectors and functions; 12. Why take part in a measurement comparison?; 13. Other philosophies; 14. An assessment of objective Bayesian methods; 15. A guide to the expression of uncertainty in measurement; 16. Measurement near a limit - an insoluble problem?; References; Index.
Measurement uncertainty and probability
Willink, Robin
2013-01-01
A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.
Traceability and Measurement Uncertainty
Tosello, Guido; De Chiffre, Leonardo
2004-01-01
and motivating to this important group. The developed e-learning system consists of 12 different chapters dealing with the following topics: 1. Basics; 2. Traceability and measurement uncertainty; 3. Coordinate metrology; 4. Form measurement; 5. Surface testing; 6. Optical measurement and testing; 7. Measuring rooms; 8. Machine tool testing; 9. The role of manufacturing metrology for QM; 10. Inspection planning; 11. Quality management of measurements incl. documentation; 12. Advanced manufacturing measurement technology. The present report represents section 2, Traceability and Measurement Uncertainty, of the e-learning system.
Attempting measurement of psychological attributes.
Salzberger, Thomas
2013-01-01
Measures of psychological attributes abound in the social sciences, as much as measures of physical properties do in the physical sciences. However, the scientific underpinnings of measurement differ crucially between the two domains. While measurement in the physical sciences is supported by empirical evidence that demonstrates the quantitative nature of the property assessed, measurement in the social sciences is, in large part, made possible only by a vague, discretionary definition of measurement that places hardly any restrictions on empirical data. Traditional psychometric analyses fail to address the requirements of measurement as defined more rigorously in the physical sciences. The construct definitions do not allow for testable predictions, and content validity becomes a matter of highly subjective judgment. In order to improve measurement of psychological attributes, it is suggested, first, to readopt the definition of measurement used in the physical sciences; second, to devise an elaborate theory of the construct to be measured that includes the hypothesis of a quantitative attribute; and third, to test the data for the structure implied by the hypothesis of quantity as well as predictions derived from the theory of the construct. PMID:23550264
The Uncertainty of Measurement Results
Factors affecting the uncertainty of measurement are explained, basic statistical formulae given, and the theoretical concept explained in the context of pesticide formulation analysis. Practical guidance is provided on how to determine individual uncertainty components within an analytical procedure. An extended and comprehensive table containing the relevant mathematical/statistical expressions elucidates the relevant underlying principles. Appendix I provides a practical elaborated example on measurement uncertainty estimation, above all utilizing experimental repeatability and reproducibility laboratory data. (author)
Entropic uncertainty and measurement reversibility
Berta, Mario; Wehner, Stephanie; Wilde, Mark M.
2016-07-01
The entropic uncertainty relation with quantum side information (EUR-QSI) from (Berta et al 2010 Nat. Phys. 6 659) is a unifying principle relating two distinctive features of quantum mechanics: quantum uncertainty due to measurement incompatibility, and entanglement. In these relations, quantum uncertainty takes the form of preparation uncertainty where one of two incompatible measurements is applied. In particular, the ‘uncertainty witness’ lower bound in the EUR-QSI is not a function of a post-measurement state. An insightful proof of the EUR-QSI from (Coles et al 2012 Phys. Rev. Lett. 108 210405) makes use of a fundamental mathematical consequence of the postulates of quantum mechanics known as the non-increase of quantum relative entropy under quantum channels. Here, we exploit this perspective to establish a tightening of the EUR-QSI which adds a new state-dependent term in the lower bound, related to how well one can reverse the action of a quantum measurement. As such, this new term is a direct function of the post-measurement state and can be thought of as quantifying how much disturbance a given measurement causes. Our result thus quantitatively unifies this feature of quantum mechanics with the others mentioned above. We have experimentally tested our theoretical predictions on the IBM quantum experience and find reasonable agreement between our predictions and experimental outcomes.
Incentive salience attribution under reward uncertainty: A Pavlovian model.
Anselme, Patrick
2015-02-01
There is a vast literature on the behavioural effects of partial reinforcement in Pavlovian conditioning. Compared with animals receiving continuous reinforcement, partially rewarded animals typically show (a) a slower development of the conditioned response (CR) early in training and (b) a higher asymptotic level of the CR later in training. This phenomenon is known as the partial reinforcement acquisition effect (PRAE). Learning models of Pavlovian conditioning fail to account for it. In accordance with the incentive salience hypothesis, it is here argued that incentive motivation (or 'wanting') plays a more direct role in controlling behaviour than does learning, and reward uncertainty is shown to have an excitatory effect on incentive motivation. The psychological origin of that effect is discussed and a computational model integrating this new interpretation is developed. Many features of CRs under partial reinforcement emerge from this model. PMID:25444780
Measuring the uncertainty of coupling
Zhao, Xiaojun; Shang, Pengjian
2015-06-01
A new information-theoretic measure, called coupling entropy, is proposed here to detect the causal links in complex systems by taking into account the inner composition alignment of temporal structure. It is a permutation-based asymmetric association measure to infer the uncertainty of coupling between two time series. The coupling entropy is found to be effective in the analysis of Hénon maps, where different noises are added to test its accuracy and sensitivity. The coupling entropy is also applied to analyze the relationship between unemployment rate and CPI change in the U.S., where the CPI change turns out to be the driving variable while the unemployment rate is the responding one.
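The coupling entropy above is permutation-based. As a minimal sketch of the underlying permutation-pattern machinery (not the paper's asymmetric two-series coupling measure itself), the Bandt-Pompe permutation entropy of a single series can be computed as follows:

```python
import math
from collections import Counter

import numpy as np

def permutation_entropy(x, order=3):
    """Normalized permutation entropy (Bandt-Pompe) of a 1-D series."""
    # Count ordinal patterns of consecutive windows of length `order`
    patterns = Counter(
        tuple(np.argsort(x[i:i + order])) for i in range(len(x) - order + 1)
    )
    n = sum(patterns.values())
    h = -sum((c / n) * math.log(c / n) for c in patterns.values())
    return h / math.log(math.factorial(order))  # normalize to [0, 1]

rng = np.random.default_rng(1)
pe_ordered = permutation_entropy(np.arange(100.0))
pe_noise = permutation_entropy(rng.standard_normal(5000))
print(pe_ordered, pe_noise)  # a monotone series gives 0; white noise is near 1
```

The coupling entropy extends this idea by comparing the temporal ordering of one series against the values of another, which is what makes the measure asymmetric and able to indicate a driving direction.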
Uncertainty Quantification for Safeguards Measurements
Part of the scientific method requires all calculated and measured results to be accompanied by a description that meets user needs and provides an adequate statement of the confidence one can have in the results. The scientific art of generating quantitative uncertainty statements is closely related to the mathematical disciplines of applied statistics, sensitivity analysis, optimization, and inversion, but in the field of non-destructive assay it also often draws heavily on expert judgment based on experience. We call this process uncertainty quantification (UQ). Philosophical approaches to UQ, along with the formal tools available for UQ, have advanced considerably over recent years, and these advances, we feel, may be useful to include in the analysis of data gathered from safeguards instruments. This paper sets out what we hope to achieve during a three-year US DOE NNSA research project recently launched to address the potential of advanced UQ to improve safeguards conclusions. By way of illustration we discuss measurement of uranium enrichment by the enrichment meter principle (also known as the infinite thickness technique), which relies on gamma counts near the 186 keV peak directly from 235U. This method has strong foundations in fundamental physics, so we have a basis for the choice of response model, although in some implementations peak area extraction may result in a bias when applied over a wide dynamic range. It also allows us to describe a common but usually neglected aspect of applying a calibration curve, namely the error structure in the predictors. We illustrate this using a combination of measured data and simulation. (author)
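The "error structure in the predictors" mentioned above can be illustrated with a generic simulation (invented numbers, not the paper's enrichment data): when the reference values used to fit a calibration line themselves carry error, ordinary least squares attenuates the fitted slope.

```python
import numpy as np

rng = np.random.default_rng(0)
x_true = np.array([1.0, 2.0, 3.0, 4.0, 5.0])  # reference values (illustrative)
a, b = 5.0, 2.0                               # true calibration line y = a + b*x
sx, sy = 1.0, 0.2                             # predictor / response uncertainties

slopes = []
for _ in range(5000):
    x = x_true + rng.normal(0.0, sx, x_true.size)  # error in the predictors
    y = a + b * x_true + rng.normal(0.0, sy, x_true.size)
    slopes.append(np.polyfit(x, y, 1)[0])          # ordinary least squares
slopes = np.asarray(slopes)
print(slopes.mean())  # systematically below the true slope b = 2.0
```

The classical attenuation factor is roughly var(x)/(var(x) + sx^2), so with var(x) = 2 and sx = 1 the fitted slope shrinks toward about two thirds of its true value, which is why calibration fits that ignore predictor errors can be biased.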
Uncertainty of temperature measurement with thermal cameras
Chrzanowski, Krzysztof; Matyszkiel, Robert; Fischer, Joachim; Barela, Jaroslaw
2001-06-01
All main international metrological organizations propose a parameter called uncertainty as a measure of the accuracy of measurements. A mathematical model that enables the calculation of the uncertainty of temperature measurement with thermal cameras is presented. The standard uncertainty or the expanded uncertainty of temperature measurement of the tested object can be calculated when the bounds within which the real object effective emissivity εr, the real effective background temperature Tba(r), and the real effective atmospheric transmittance τa(r) are located can be estimated, and when the intrinsic uncertainty of the thermal camera and the relative spectral sensitivity of the thermal camera are known.
Measuring, Estimating, and Deciding under Uncertainty.
Michel, Rolf
2016-03-01
The problem of uncertainty as a general consequence of incomplete information and the approach to quantify uncertainty in metrology is addressed. Then, this paper discusses some of the controversial aspects of the statistical foundation of the concepts of uncertainty in measurements. The basics of the ISO Guide to the Expression of Uncertainty in Measurement as well as of characteristic limits according to ISO 11929 are described and the needs for a revision of the latter standard are explained. PMID:26688360
Gomez, Jose Alfonso; Owens, Phillip N.; Koiter, Alex J.; Lobb, David
2016-04-01
One of the major sources of uncertainty in attributing sediment sources in fingerprinting studies is the uncertainty in determining the concentrations of the elements used in the mixing model due to the variability of the concentrations of these elements in the source materials (e.g., Kraushaar et al., 2015). The uncertainty in determining the "true" concentration of a given element in each one of the source areas depends on several factors, among them the spatial variability of that element, the sampling procedure and sampling density. Researchers have limited control over these factors, and usually sampling density tends to be sparse, limited by time and the resources available. Monte Carlo analysis has been used regularly in fingerprinting studies to explore the probable solutions within the measured variability of the elements in the source areas, providing an appraisal of the probability of the different solutions (e.g., Collins et al., 2012). This problem can be considered analogous to the propagation of uncertainty in hydrologic models due to uncertainty in the determination of the values of the model parameters, and there are many examples of Monte Carlo analysis of this uncertainty (e.g., Freeze, 1980; Gómez et al., 2001). Some of these model analyses rely on the simulation of "virtual" situations that were calibrated from parameter values found in the literature, with the purpose of providing insight about the response of the model to different configurations of input parameters. This approach - evaluating the answer for a "virtual" problem whose solution could be known in advance - might be useful in evaluating the propagation of uncertainty in mixing models in sediment fingerprinting studies. In this communication, we present the preliminary results of an on-going study evaluating the effect of variability of element concentrations in source materials, sampling density, and the number of elements included in the mixing models. For this study a virtual
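A minimal sketch of the Monte Carlo approach described in this communication, for a hypothetical two-source, one-tracer mixing model (all concentrations invented): the variability of the element concentration in each source material propagates into a distribution of apportioned source proportions.

```python
import numpy as np

rng = np.random.default_rng(42)
# Hypothetical tracer concentrations in the two source materials
mu_a, sd_a = 10.0, 1.0        # source A: mean and spatial variability
mu_b, sd_b = 30.0, 2.0        # source B
c_mix = 0.7 * mu_a + 0.3 * mu_b   # mixture from a "virtual" 70/30 split = 16.0

props = []
for _ in range(10000):
    # Draw plausible source concentrations from their variability
    ca = rng.normal(mu_a, sd_a)
    cb = rng.normal(mu_b, sd_b)
    if abs(cb - ca) < 1e-6:
        continue
    p = (c_mix - cb) / (ca - cb)   # solve c_mix = p*ca + (1 - p)*cb
    if 0.0 <= p <= 1.0:
        props.append(p)
props = np.array(props)
print(props.mean(), props.std())   # spread = uncertainty from source variability
```

Because the "true" split is known in advance in such a virtual problem, the recovered distribution of p directly shows how source variability, and by extension sampling density, limits the attribution.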
GM Counters: Potential Measurement Uncertainty Sources
This paper gives a theoretical description of potential measurement uncertainty sources in radiation detection by GM counters. The procedure for obtaining expanded and combined uncertainties is shown experimentally for four technologically different types of GM counters. Based on the experimental results obtained, it has been established that the uncertainties of the influencing random variables depend on the technological solution of the counter reading system and contribute in different ways to the expanded and combined uncertainty of the applied types of GM counters. (author)
The impact of uncertainty and risk measures
Jo, Soojin
2012-01-01
This dissertation seeks to better understand how uncertainty impacts a variety of economic activities and how to measure systemic risk. The first chapter, "The effects of oil price uncertainty on the macroeconomy," focuses on oil price uncertainty and how it affects global economic growth. In particular, I define oil price uncertainty as the time-varying standard deviation of the one-quarter-ahead forecasting error that follows stochastic volatility. Then I use a quarterly VAR with stoch...
RUMINATIONS ON NDA MEASUREMENT UNCERTAINTY COMPARED TO DA UNCERTAINTY
Salaymeh, S.; Ashley, W.; Jeffcoat, R.
2010-06-17
It is difficult to overestimate the importance of the physical measurements performed with nondestructive assay instruments throughout the nuclear fuel cycle. They underpin decision making in many areas and support criticality safety, radiation protection, process control, safeguards, facility compliance, and waste measurements. No physical measurement is complete, or indeed meaningful, without a defensible and appropriate accompanying statement of uncertainties and how they combine to define the confidence in the results. The uncertainty budget should also be broken down in sufficient detail for the subsequent uses to which the nondestructive assay (NDA) results will be applied. Creating an uncertainty budget and estimating the total measurement uncertainty can often be an involved process, especially for non-routine situations, because data interpretation often involves complex algorithms and logic combined in a highly intertwined way. The methods often call on a multitude of input data subject to human oversight. These characteristics can be confusing and pose a barrier to understanding between experts and data consumers. ASTM subcommittee C26-10 recognized this problem in the context of how to summarize and express precision and bias performance across the range of standards and guides it maintains. In order to create a unified approach consistent with modern practice and embracing the continuous improvement philosophy, a consensus arose to prepare a procedure covering the estimation and reporting of uncertainties in nondestructive assay of nuclear materials. This paper outlines the needs analysis, objectives, and ongoing development efforts. In addition to emphasizing some of the unique challenges and opportunities facing the NDA community, we hope this article will encourage dialog and sharing of best practice, and furthermore motivate developers to revisit the treatment of measurement uncertainty.
Measurement Theory, Nomological Machine and Measurement Uncertainties (in Classical Physics)
Ave Mets
2012-12-01
Measurement is said to be the basis of the exact sciences as the process of assigning numbers to matter (things or their attributes), thus making it possible to apply the mathematically formulated laws of nature to the empirical world. Mathematics and empiria are best accorded to each other in laboratory experiments, which function as what Nancy Cartwright calls a nomological machine: an arrangement generating (mathematical) regularities. On the basis of accounts of measurement errors and uncertainties, I will argue for two claims: (1) both fundamental laws of physics, corresponding to an ideal nomological machine, and phenomenological laws, corresponding to a material nomological machine, lie, being highly idealised relative to empirical reality; likewise, laboratory measurement data do not describe properties inherent to the world independently of human understanding of it. (2) Therefore the naive, representational view of measurement and experimentation should be replaced with a more pragmatic or practice-based view.
Uncertainty reconciles complementarity with joint measurability
The fundamental principles of complementarity and uncertainty are shown to be related to the possibility of joint unsharp measurements of pairs of noncommuting quantum observables. A joint measurement scheme for complementary observables is proposed. The measured observables are represented as positive operator valued measures (POVMs), whose intrinsic fuzziness parameters are found to satisfy an intriguing pay-off relation reflecting the complementarity. At the same time, this relation represents an instance of a Heisenberg uncertainty relation for measurement imprecisions. A model-independent consideration shows that this uncertainty relation is logically connected with the joint measurability of the POVMs in question
Unsharpness of generalized measurement and its effects in entropic uncertainty relations
Baek, Kyunghyun; Son, Wonmin
2016-01-01
Under the scenario of generalized measurements, it can be asked how much of quantum uncertainty can be attributed to the measuring device, independent of the uncertainty in the measured system. On the course to answering this question, we suggest a new class of entropic uncertainty relation that differentiates quantum uncertainty from device imperfection due to the unsharpness of measurement. In order to quantify the unsharpness, we suggest and analyze the quantity that characterizes the uncer...
Kleim, B; Gonzalo, D; Ehlers, A
2011-01-01
A depressogenic attributional style, i.e., internal, stable and global causal interpretations of negative events, is a stable vulnerability factor for depression. Current measures of pessimistic attributional style can be time-consuming to complete, and some are designed for specific use with student populations. We developed and validated a new short questionnaire suitable for the measurement of depressogenic attributions in clinical settings, the Depressive Attributions Questionnaire (DAQ)....
Uncertainty estimation of ultrasonic thickness measurement
The most important factor that should be taken into consideration when selecting an ultrasonic thickness measurement technique is its reliability. Only when the uncertainty of a measurement result is known can it be judged whether the result is adequate for the intended purpose. The objectives of this study are to model the ultrasonic thickness measurement function, to identify the most significant input uncertainty components, and to estimate the uncertainty of the ultrasonic thickness measurement results. We assumed that five error sources contribute significantly to the final error: calibration velocity, transit time, zero offset, measurement repeatability, and resolution. By applying the law of propagation of uncertainty to the model function, a combined uncertainty of the ultrasonic thickness measurement was obtained. In this study the modeling function of ultrasonic thickness measurement was derived. By using this model, the estimation of the uncertainty of the final output result was found to be reliable. It was also found that the most significant input uncertainty components are calibration velocity, transit time linearity, and zero offset. (author)
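A hedged numerical sketch of the propagation-of-uncertainty law for the pulse-echo thickness model d = v(t − t0)/2 follows; the symbols match the error sources listed in the abstract, but every numerical value is assumed purely for illustration.

```python
import math

# d = v * (t - t0) / 2: v calibrated sound velocity, t transit time,
# t0 zero offset. All numbers below are illustrative assumptions.
v, u_v = 5900.0, 10.0          # m/s (roughly steel) and its uncertainty
t, u_t = 3.4e-6, 2e-9          # s
t0, u_t0 = 0.1e-6, 2e-9        # s
u_rep = 5e-6                   # m, repeatability (type A, assumed)
u_res = 1e-5 / math.sqrt(12)   # m, 0.01 mm display resolution, rectangular

d = v * (t - t0) / 2
# Sensitivity coefficients are the partial derivatives of the model:
c_v = (t - t0) / 2             # ∂d/∂v
c_t = v / 2                    # ∂d/∂t
c_t0 = -v / 2                  # ∂d/∂t0
u_c = math.sqrt((c_v * u_v)**2 + (c_t * u_t)**2 + (c_t0 * u_t0)**2
                + u_rep**2 + u_res**2)
print(f"d = {d*1000:.3f} mm, U = {2 * u_c * 1000:.3f} mm (k=2)")
```

With these assumed numbers the velocity term dominates the budget, consistent with the abstract's finding that calibration velocity is among the largest contributors.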
Uncertainty evaluation in electrochemical noise resistance measurement
The uncertainty in statistical noise resistance measurement was evaluated for a type 316 stainless steel in NaCl solutions at room temperature. Sensitivity coefficients were determined for variables such as NaCl concentration, pH, solution temperature, surface roughness, inert gas flow rate, and bias potential amplitude. The coefficients were larger for NaCl concentration, pH, inert gas flow rate, and solution temperature, and these were the major factors increasing the combined standard uncertainty of noise resistance. However, the contribution to the uncertainty in noise resistance measurement from the above variables was remarkably low compared to that from repeated measurements of noise resistance; thus, it is difficult to lower the uncertainty in noise resistance measurement significantly by lowering the uncertainties related to NaCl concentration, pH, inert gas flow rate, and solution temperature. In addition, the uncertainty in noise resistance measurement was high, amounting to 17.3% of the mean, indicating that the reliability of noise resistance measurement is low.
An approach to multi-attribute utility analysis under parametric uncertainty
The techniques of cost-benefit analysis and multi-attribute analysis provide a useful basis for informing decisions in situations where a number of potentially conflicting opinions or interests need to be considered, and where there are a number of possible decisions that could be adopted. When the input data to such decision-making processes are uniquely specified, cost-benefit analysis and multi-attribute utility analysis provide unambiguous guidance on the preferred decision option. However, when the data are not uniquely specified, application and interpretation of these techniques is more complex. Herein, an approach to multi-attribute utility analysis (and hence, as a special case, cost-benefit analysis) when input data are subject to parametric uncertainty is presented. The approach is based on the use of a Monte Carlo technique, and has recently been applied to options for the remediation of former uranium mining liabilities in a number of Central and Eastern European States
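A minimal sketch of the Monte Carlo approach to multi-attribute utility analysis under parametric uncertainty described above (all scores, spreads, and weights invented): sampling the uncertain attribute scores turns a single "preferred option" into a probability that each option is preferred.

```python
import numpy as np

rng = np.random.default_rng(7)
# Hypothetical 2-option, 3-attribute problem with parametric uncertainty:
# attribute scores are normally distributed around assumed means.
means = np.array([[0.6, 0.8, 0.5],    # option A, score per attribute
                  [0.7, 0.5, 0.6]])   # option B
sds = np.full_like(means, 0.1)        # parametric uncertainty (assumed)
weights = np.array([0.5, 0.3, 0.2])   # fixed weights (could be sampled too)

wins = np.zeros(2)
for _ in range(10000):
    scores = rng.normal(means, sds)
    utils = scores @ weights          # additive multi-attribute utility
    wins[np.argmax(utils)] += 1
frac = wins / wins.sum()
print(frac)  # probability that each option has the highest utility
```

With these assumed numbers the expected utilities are close (0.64 vs 0.62), so neither option dominates; the Monte Carlo output makes that ambiguity explicit instead of hiding it behind a point estimate.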
Subjective judgment on measure of data uncertainty
Integral parameters are considered which can be derived from the covariance matrix of the uncertainties and can serve as a general measure of uncertainties in comparisons of different fits. Using realistic examples and simple data-model fits with a variable number of parameters, the author was able to show that the sum of all elements of the covariance matrix is the best general measure for characterizing and comparing uncertainties obtained in different model and non-model fits. Discussions also included the problem of non-positive definiteness of the covariance matrix of the uncertainty of the cross sections, obtained from the covariance matrix of the uncertainty of the parameters, in cases where the number of parameters is less than the number of cross-section points. Because the numerical inaccuracy of the calculations is always many orders of magnitude larger than the machine representation of zero, it was concluded that the calculated eigenvalues of semi-positive definite matrices have no machine zeros. These covariance matrices can therefore be inverted when they are used in the error propagation equations, so the procedure of transforming semi-positive definite matrices to positive definite ones by introducing minimal changes into the matrix (changes equivalent to introducing additional non-informative parameters in the model) is generally not needed. But caution should be observed, because there can be cases where uncertainties are unphysical, e.g. integral parameters estimated with formally non-positive-definite covariance matrices.
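The two observations above, that the sum of all covariance elements is a scalar uncertainty measure and that rank-deficient covariance matrices arise when few parameters are propagated to many points, can be checked directly on an invented example:

```python
import numpy as np

# The sum of all covariance-matrix elements equals the variance of the
# unweighted sum of the quantities, the candidate scalar measure above.
def total_variance(cov):
    return float(np.sum(cov))

# A rank-deficient (semi-positive definite) covariance, as arises when
# propagating 1 parameter to 3 cross-section points via a Jacobian:
J = np.array([[1.0], [2.0], [3.0]])   # illustrative Jacobian
cov = J @ J.T * 0.01                  # rank 1, hence formally singular
eig = np.linalg.eigvalsh(cov)         # ascending eigenvalues
print(eig)              # two eigenvalues are zero in exact arithmetic;
                        # floating point leaves only tiny values, as noted above
print(total_variance(cov))
```

Here total_variance(cov) = 0.36 because summing all nine elements is the same as computing var(x1 + x2 + x3) from the covariance, and the largest eigenvalue equals the trace 0.14 since the matrix has rank one.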
Measuring the uncertainty of tapping torque
Belluco, Walter; De Chiffre, Leonardo
An uncertainty budget is carried out for torque measurements performed at the Institut for Procesteknik for the evaluation of cutting fluids. Thirty test blanks were machined with one tool and one fluid, torque diagrams were recorded and the repeatability of single torque measurements was estimat...
Teaching Measurement and Uncertainty the GUM Way
Buffler, Andy; Allie, Saalih; Lubben, Fred
2008-01-01
This paper describes a course aimed at developing understanding of measurement and uncertainty in the introductory physics laboratory. The course materials, in the form of a student workbook, are based on the probabilistic framework for measurement as recommended by the International Organization for Standardization in their publication "Guide to…
Uncertainty Measures of Regional Flood Frequency Estimators
Rosbjerg, Dan; Madsen, Henrik
1995-01-01
Regional flood frequency models have different assumptions regarding homogeneity and inter-site independence. Thus, uncertainty measures of T-year event estimators are not directly comparable. However, having chosen a particular method, the reliability of the estimate should always be stated, e...
Methods for Attribute Measurement and Alternatives to Multiplicity Counting
The Attribute Measurement System with Information Barrier (AMS/IB) specification is being developed in support of the Defense Threat Reduction Agency's (DTRA's) Cooperative Threat Reduction (CTR) program for the Mayak Fissile Material Storage Facility. This document discusses the technologies available for attribute measurement, and advantages and disadvantages of alternatives
Using MINITAB software for teaching measurement uncertainty
The concept of measurement uncertainty involves not only doubt about the validity of a measurement result but also the quantification of that doubt. In this sense, measurement uncertainty is the parameter associated with the result that characterizes the dispersion of the values that could reasonably be assigned to the measurand (or, more properly, to its representation through a model). This parameter may be, for example, a multiple of the standard deviation or, especially and more importantly, the half-width of an interval with a predetermined level of confidence. In these terms, this paper uses MINITAB software to analyze this parameter: simple and quick operations evaluate the mean, the standard deviation, and the confidence interval, supported by several plotted graphs
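The quantities the abstract computes with MINITAB (mean, standard deviation, confidence interval) can equally be sketched with Python's standard library. The readings and the t-factor for 7 degrees of freedom are illustrative:

```python
import math
import statistics

# Replicate measurements of a hypothetical measurand (arbitrary units).
readings = [10.2, 10.4, 10.1, 10.5, 10.3, 10.2, 10.4, 10.3]

mean = statistics.mean(readings)
s = statistics.stdev(readings)          # sample standard deviation
u = s / math.sqrt(len(readings))        # standard uncertainty of the mean

# Half-width of a 95 % confidence interval, using the t-factor for
# n - 1 = 7 degrees of freedom (t ≈ 2.365).
t95 = 2.365
half_width = t95 * u

print(f"mean = {mean:.3f}")
print(f"standard uncertainty = {u:.4f}")
print(f"95% CI half-width = {half_width:.4f}")
```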
Quantifying uncertainty in nuclear analytical measurements
The lack of international consensus on the expression of uncertainty in measurements was recognised by the late 1970s and led, after the issuance of a series of rather generic recommendations, to a general publication known as GUM, the Guide to the Expression of Uncertainty in Measurement. This publication, issued in 1993, was based on co-operation over several years by the Bureau International des Poids et Mesures, the International Electrotechnical Commission, the International Federation of Clinical Chemistry, the International Organization for Standardization (ISO), the International Union of Pure and Applied Chemistry, the International Union of Pure and Applied Physics and the Organisation internationale de metrologie legale. The purpose was to promote full information on how uncertainty statements are arrived at and to provide a basis for harmonized reporting and the international comparison of measurement results. The need to provide more specific guidance to different measurement disciplines was soon recognized, and the field of analytical chemistry was addressed by EURACHEM in 1995 in the first edition of a guidance report on Quantifying Uncertainty in Analytical Measurements, produced by a group of experts from the field. That publication translated the general concepts of the GUM into specific applications for analytical laboratories and illustrated the principles with a series of selected examples as a didactic tool. Based on feedback from actual practice, the EURACHEM publication was extensively reviewed in 1997-1999 under the auspices of the Co-operation on International Traceability in Analytical Chemistry (CITAC), and a second edition was published in 2000. Still, except for a single example on the measurement of radioactivity in GUM, the field of nuclear and radiochemical measurements was not covered. The explicit requirement of ISO standard 17025:1999, General Requirements for the Competence of Testing and Calibration
Uncertainties in the attribution of greenhouse gas warming and implications for climate prediction
Jones, Gareth S; Mitchell, John F B
2016-01-01
Using optimal detection techniques with climate model simulations, most of the observed increase of near surface temperatures over the second half of the twentieth century is attributed to anthropogenic influences. However, the partitioning of the anthropogenic influence to individual factors, such as greenhouse gases and aerosols, is much less robust. Differences in how forcing factors are applied, in their radiative influence and in models' climate sensitivities, substantially influence the response patterns. We find standard optimal detection methodologies cannot fully reconcile this response diversity. By selecting a set of experiments to enable the diagnosing of greenhouse gases and the combined influence of other anthropogenic and natural factors, we find robust detections of well mixed greenhouse gases across a large ensemble of models. Of the observed warming over the 20th century of 0.65K/century we find, using a multi model mean not incorporating pattern uncertainty, a well mixed greenhouse gas warm...
Inconclusive quantum measurements and decisions under uncertainty
Yukalov, V I
2016-01-01
We give a mathematical definition for the notion of inconclusive quantum measurements. In physics, such measurements occur at intermediate stages of a complex measurement procedure, with the final measurement result being operationally testable. Since the mathematical structure of Quantum Decision Theory has been developed in analogy with the theory of quantum measurements, the inconclusive quantum measurements correspond, in Quantum Decision Theory, to intermediate stages of decision making in the process of taking decisions under uncertainty. The general form of the quantum probability for a composite event is the sum of a utility factor, describing a rational evaluation of the considered prospect, and of an attraction factor, characterizing irrational, subconscious attitudes of the decision maker. Despite the involved irrationality, the probability of prospects can be evaluated. This is equivalent to the possibility of calculating quantum probabilities without specifying hidden variables. We formulate a ge...
Measuring the uncertainties of discharge measurements: interlaboratory experiments in hydrometry
Le Coz, Jérôme; Blanquart, Bertrand; Pobanz, Karine; Dramais, Guillaume; Pierrefeu, Gilles; Hauet, Alexandre; Despax, Aurélien
2015-04-01
Quantifying the uncertainty of streamflow data is key for hydrological sciences. The conventional uncertainty analysis based on error propagation techniques is restricted by the absence of traceable discharge standards and by the weight of difficult-to-predict errors related to the operator, procedure and measurement environment. Field interlaboratory experiments recently emerged as an efficient, standardized method to 'measure' the uncertainties of a given streamgauging technique in given measurement conditions. Both uncertainty approaches are compatible and should be developed jointly in the field of hydrometry. In recent years, several interlaboratory experiments have been reported by different hydrological services. They involved different streamgauging techniques, including acoustic profilers (ADCP), current-meters and handheld radars (SVR). Uncertainty analysis was not always their primary goal: most often, testing the proficiency and homogeneity of instruments, makes and models, procedures and operators was the original motivation. When interlaboratory experiments are processed for uncertainty analysis, once outliers have been discarded, all participants are assumed to be equally skilled and to apply the same streamgauging technique in equivalent conditions. A universal requirement is that all participants simultaneously measure the same discharge, which shall be kept constant within negligible variations. To the best of our knowledge, we were the first to apply the interlaboratory method for computing the uncertainties of streamgauging techniques, according to the authoritative international documents (ISO standards). Several specific issues arise due to the measurement conditions in outdoor canals and rivers. The main limitation is that the best available river discharge references are usually too uncertain to quantify the bias of the streamgauging technique, i.e. the systematic errors that are common to all participants in the experiment. A reference or a
Estimation of dose uncertainty measurement in a Nuclear Medicine Service
Full text: The accuracy of activity measurement is the first step in the safety and radiation protection of the patient. The uncertainty of an activity measurement has many aspects to take into account: calibration factors, geometry of the sample, position, stability of the equipment parameters, etc. Moreover, operational parameters change between different pieces of equipment; for that reason, guaranteeing the traceability of measurement is very important in quality assurance. The objective of this study was to determine the combined uncertainty of activity measurement for the isotopes 131I and 99mTc, using dose calibrators well established in Latin American countries, the Capintec CRC-15R and PTW Curimentor 3, both present in our Nuclear Medicine Service. Uncertainty can be defined as a parameter associated with the result of a measurement that characterizes the dispersion of the values that could reasonably be attributed to the measurand. The parameter may be, for example, a standard deviation (or a given multiple of it), or the width of a confidence interval. Uncertainty of measurement comprises, in general, many components. Some of these components may be evaluated from the statistical distribution of the results of series of measurements and can be characterized by standard deviations. The other components, which can also be characterized by standard deviations, are evaluated from assumed probability distributions based on experience or other information. These cases are defined as Type A and Type B estimations, respectively. The combined uncertainty is calculated as the square root of the sum of the squares of all uncertainty components. These uncertainties are associated with the calibration factor, linearity, reproducibility, radioactive background, stability, radioactive decay correction, etc. For that reason the accuracy, precision, linearity of activity response, and reproducibility were studied for 6 months. In order to check the precision and accuracy
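The root-sum-of-squares combination described above is a one-liner once the components are tabulated. The component names mirror the abstract, but the numeric values below are placeholders, not the clinic's actual budget:

```python
import math

# Illustrative relative standard uncertainties (in %) for an activity
# measurement; placeholder values, not measured data.
components = {
    "calibration_factor": 1.5,
    "linearity": 0.8,
    "reproducibility": 0.6,
    "background": 0.3,
    "decay_correction": 0.4,
}

# Combined uncertainty: square root of the sum of squared components.
combined = math.sqrt(sum(u ** 2 for u in components.values()))
# Expanded uncertainty with coverage factor k = 2 (≈95 % confidence).
expanded = 2.0 * combined

print(f"combined = {combined:.2f}%, expanded (k=2) = {expanded:.2f}%")
```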
METHOD OF DYNAMIC EXPRESSION OF UNCERTAINTY OF MEASUREMENT
O. M. Vasilevskyi
2015-03-01
Full Text Available A method for expressing the dynamic uncertainty of measurement is presented that uses the spectral function of the input signal and the frequency response of the measurement tools to assess uncertainty during dynamic operation.
Inconclusive quantum measurements and decisions under uncertainty
Yukalov, Vyacheslav; Sornette, Didier
2016-04-01
We give a mathematical definition for the notion of inconclusive quantum measurements. In physics, such measurements occur at intermediate stages of a complex measurement procedure, with the final measurement result being operationally testable. Since the mathematical structure of Quantum Decision Theory has been developed in analogy with the theory of quantum measurements, the inconclusive quantum measurements correspond, in Quantum Decision Theory, to intermediate stages of decision making in the process of taking decisions under uncertainty. The general form of the quantum probability for a composite event is the sum of a utility factor, describing a rational evaluation of the considered prospect, and of an attraction factor, characterizing irrational, subconscious attitudes of the decision maker. Despite the involved irrationality, the probability of prospects can be evaluated. This is equivalent to the possibility of calculating quantum probabilities without specifying hidden variables. We formulate a general way of evaluation, based on the use of non-informative priors. As an example, we suggest the explanation of the decoy effect. Our quantitative predictions are in very good agreement with experimental data.
Attribute measurement systems prototypes and equipment in the United States
Since the fall of 1997, the United States has been developing prototypical attribute verification technology for potential use by the International Atomic Energy Agency (IAEA) under the Trilateral Initiative. The first attribute measurement equipment demonstration took place in December 1997 at the Lawrence Livermore National Laboratory. This demonstration led to a series of joint Russian Federation/US/IAEA technical discussions that focused on attribute measurement technology that could be applied to plutonium-bearing items having classified characteristics. A first prototype attribute verification system with an information barrier was demonstrated at a Trilateral Technical Workshop in June 1999 at Los Alamos. This prototype nourished further fruitful discussions between the three parties, which in turn led to the documents discussed in a previous paper. Prototype development has continued in the US, under other initiatives, using an integrated approach that includes the Trilateral Initiative. Specifically for the Trilateral Initiative, US development has turned to peripheral equipment that would support verifications by the IAEA. This equipment includes an authentication tool for measurement systems with information barriers and in situ probes that would facilitate inspections by reducing the need to move material out of storage locations for reverification. In this paper, we will first summarize the development of attribute verification measurement system technology in the US and then report on the status of the development of other equipment to support the Trilateral Initiative.
Expanded uncertainty in measurements of Geiger-Mueller's counter
This paper explains the procedure for obtaining the expanded uncertainty in measurement for four types of GM counters sharing the same counter tube, in cases where the contributors to measurement uncertainty are cosmic background radiation and the induced-overvoltage phenomenon. The experimental results established that the uncertainties of the influencing random variables depend on the technological design of the counter and therefore contribute differently to the expanded uncertainty in measurement of the applied GM counters
Measurement uncertainties in science and technology
Grabe, Michael
2014-01-01
This book recasts the classical Gaussian error calculus from scratch, treating both random and unknown systematic errors. The idea of this book is to create a formalism fit to localize the true values of physical quantities considered – true with respect to the set of predefined physical units. Remarkably enough, the prevailingly practiced forms of error calculus do not feature this property, which however proves, in every respect, to be physically indispensable. The amended formalism, termed Generalized Gaussian Error Calculus by the author, treats unknown systematic errors as biases and brings random errors to bear via enhanced confidence intervals as laid down by Student. The significantly extended second edition thoroughly restructures and systematizes the text as a whole and illustrates the formalism with numerous numerical examples. They demonstrate the basic principles of how to understand uncertainties in order to localize the true values of measured quantities - a perspective decisive in vi...
Uncertainty in outdoor noise measurement and prediction
Wilson, D. Keith
2005-09-01
Standards for outdoor noise are intended to ensure that (1) measurements are representative of actual exposure and (2) noise prediction procedures are consistent and scientifically defensible. Attainment of these worthwhile goals is hindered by the many complexities of sound interaction with the local atmosphere and terrain. The paradigm predominant in current standards might be described as measuring/predicting "somewhat worse than average" conditions. Measurements/predictions are made for moderate downward refraction conditions, since that is when noise annoyance is most often expected to occur. This paradigm is reasonable and practical, although one might argue that current standards could implement it better. A different, potentially more rigorous, paradigm is to explicitly treat the statistical nature of noise immissions as produced by variability in the atmospheric environment and by uncertainties in its characterization. For example, measurements and prediction techniques could focus on exceedance levels. For this to take place, a better conceptual framework must be developed for predictions that are averaged over environmental states, frequency bands, and various time intervals. Another increasingly important issue is the role of computer models. As these models continue to grow in fidelity and capability, there will be increasing pressure to abandon standard calculations in many applications.
Measurement Uncertainty for Finite Quantum Observables
René Schwonnek
2016-06-01
Full Text Available Measurement uncertainty relations are lower bounds on the errors of any approximate joint measurement of two or more quantum observables. The aim of this paper is to provide methods to compute optimal bounds of this type. The basic method is semidefinite programming, which we apply to arbitrary finite collections of projective observables on a finite dimensional Hilbert space. The quantification of errors is based on an arbitrary cost function, which assigns a penalty to getting result x rather than y, for any pair (x, y). This induces a notion of optimal transport cost for a pair of probability distributions, and we include an Appendix with a short summary of optimal transport theory as needed in our context. There are then different ways to form an overall figure of merit from the comparison of distributions. We consider three, which are related to different physical testing scenarios. The most thorough test compares the transport distances between the marginals of a joint measurement and the reference observables for every input state. Less demanding is a test just on the states for which a “true value” is known in the sense that the reference observable yields a definite outcome. Finally, we can measure a deviation as a single expectation value by comparing the two observables on the two parts of a maximally-entangled state. All three error quantities have the property that they vanish if and only if the tested observable is equal to the reference. The theory is illustrated with some characteristic examples.
Automating Measurement for Software Process Models using Attribute Grammar Rules
Abdul Azim Abd. Ghani
2007-08-01
Full Text Available The modelling concept is well accepted in the software engineering discipline. Some software models are built either to control the development stages, to measure program quality, or to serve as a medium that gives better understanding of the actual software systems. Software process modelling has nowadays reached a level that allows software designs to be transformed into programming languages, such as architecture design languages and the unified modelling language. This paper describes the adaptation of the attribute grammar approach to measuring software process models. A tool, called the Software Process Measurement Application, was developed to enable measurement according to specified attribute grammar rules. A context-free grammar to read the process model is derived from the IDEF3 standard, and rules were attached to enable the calculation of measurement metrics. The measurement metric values collected were used to aid in decomposing and structuring the processes of the proposed software systems.
Estimating discharge measurement uncertainty using the interpolated variance estimator
Cohn, T.; Kiang, J.; Mason, R., Jr.
2012-01-01
Methods for quantifying the uncertainty in discharge measurements typically identify various sources of uncertainty and then estimate the uncertainty from each of these sources by applying the results of empirical or laboratory studies. If actual measurement conditions are not consistent with those encountered in the empirical or laboratory studies, these methods may give poor estimates of discharge uncertainty. This paper presents an alternative method for estimating discharge measurement uncertainty that uses statistical techniques and at-site observations. This Interpolated Variance Estimator (IVE) estimates uncertainty based on the data collected during the streamflow measurement and therefore reflects the conditions encountered at the site. The IVE has the additional advantage of capturing all sources of random uncertainty in the velocity and depth measurements. It can be applied to velocity-area discharge measurements that use a velocity meter to measure point velocities at multiple vertical sections in a channel cross section.
Uncertainty of dose measurement in radiation processing
Miller, A.
1996-01-01
running debate and presents the author's view, which is based upon experience in radiation processing dosimetry. The origin of all uncertainty components must be identified and can be classified according to Type A and Type B, but it is equally important to separate the uncertainty components into those...
Quantum measurement and uncertainty relations in photon polarization
Edamatsu, Keiichi
2016-07-01
Recent theoretical and experimental studies have given rise to new aspects of quantum measurements and error-disturbance uncertainty relations. After a brief review of these issues, we present an experimental test of the error-disturbance uncertainty relations in photon polarization measurements. Using a generalized, strength-variable measurement of a single photon polarization state, we experimentally evaluate the error and disturbance in the measurement process and demonstrate the validity of recently proposed uncertainty relations.
Using a Meniscus to Teach Uncertainty in Measurement
Backman, Philip
2008-01-01
I have found that students easily understand that a measurement cannot be exact, but they often seem to lack an understanding of why it is important to know "something" about the magnitude of the uncertainty. This tends to promote an attitude that almost any uncertainty value will do. Such indifference may exist because once an uncertainty is…
Overall measurement uncertainty of k0-based neutron activation analysis
Special aspects of the uncertainty quantification in k0-NAA are discussed and applied in accordance with the Guide to the Expression of Uncertainty in Measurement (GUM), on a model case. The uncertainty budget is calculated highlighting the contribution and the importance of the different parameters to be taken into account. The importance of the nuclide-specific and neutron fluence-specific approach in estimating individual uncertainty contributions is emphasized and demonstrated by examples of Au, Cr, Rb, and Sb determinations. (author)
Plutonium Attribute Estimation From Passive NMIS Measurements at VNIIEF
Currently, the most relevant application of NMIS for plutonium attribute estimation stems from measurements performed jointly by Oak Ridge National Laboratory (ORNL) and Russian Federal Nuclear Center, All-Russia Scientific Research Institute of Experimental Physics (RFNC-VNIIEF) personnel at RFNC-VNIIEF facilities in Sarov, Russia in June and July 2000. During these measurements at VNIIEF, NMIS was applied in its passive mode to eight unclassified plutonium spherical shells. The shells' properties spanned the following ranges: Composition: δ-phase plutonium metal, constant; Relative 240Pu content (f240Pu): f240Pu = 1.77% (g 240Pu/g Pu), constant; Inner radius (r1): 10.0 mm ≤ r1 ≤ 53.5 mm, mean r1 = 33.5 mm; Outer radius (r2): 31.5 mm ≤ r2 ≤ 60.0 mm, mean r2 = 46.6 mm; Radial thickness (Δr): 6.4 mm ≤ Δr ≤ 30.2 mm, mean Δr = 13.1 mm; and Plutonium mass (mPu): 1829 g ≤ mPu ≤ 4468 g, mean mPu = 3265 g. The features of these measurements were analyzed to extract the attributes of each plutonium shell. Given that the samples measured were of constant composition, geometry, and relative 240Pu content, each shell is completely described by any two of the following four properties: inner radius r1; outer radius r2; mass m, one of 239Pu mass m239Pu, 240Pu mass m240Pu, or total Pu mass mPu; and radial thickness Δr. Of these, generally only mass is acknowledged as an attribute of interest; the second property (whichever is chosen) can be considered a parameter of the attribute-estimation procedure, much as multiplication is a parameter necessary to accurately estimate fissile mass via most neutron measurements
Hwang, Rong-Jen; Rogers, Craig; Beltran, Jada; Razatos, Gerasimos; Avery, Jason
2016-06-01
Reporting a measurement of uncertainty helps to determine the limitations of the method of analysis and aids in laboratory accreditation. This laboratory has conducted a study to estimate a reasonable uncertainty for the mass concentration of vaporous ethanol, in g/210 L, measured by the Intoxilyzer® 8000 breath analyzer. The uncertainty sources used were: gas chromatograph (GC) calibration adjustment, GC analytical, certified reference material, Intoxilyzer® 8000 calibration adjustment, and Intoxilyzer® 8000 analytical. Standard uncertainties attributed to these sources were calculated and separated into proportional and constant standard uncertainties. The combined proportional and constant standard uncertainties were further combined into an expanded uncertainty, expressed both as a percentage and as a unit. To prevent any under-reporting of the expanded uncertainty, 0.10 g/210 L was chosen as the defining point for expressing the expanded uncertainty. For the Intoxilyzer® 8000, for all vaporous ethanol results at or above 0.10 g/210 L the expanded uncertainty will be reported as ±3.6% at a confidence level of 95% (k = 2); for vaporous ethanol results below 0.10 g/210 L, the expanded uncertainty will be reported as ±0.0036 g/210 L at a confidence level of 95% (k = 2). PMID:27107099
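The reporting rule stated in the abstract (relative uncertainty at or above 0.10 g/210 L, absolute below) can be captured in a small helper; the function name is our own:

```python
def expanded_uncertainty(result_g_per_210L: float) -> str:
    """Report the expanded uncertainty (95 %, k = 2) using the paper's
    switch-over at 0.10 g/210 L: relative above, absolute below."""
    if result_g_per_210L >= 0.10:
        return "±3.6%"
    return "±0.0036 g/210 L"

print(expanded_uncertainty(0.15))   # → ±3.6%
print(expanded_uncertainty(0.08))   # → ±0.0036 g/210 L
```

The switch-over prevents the relative figure from shrinking the reported uncertainty below a defensible floor at low concentrations.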
Adaptive framework for uncertainty analysis in electromagnetic field measurements
Misinterpretation of uncertainty in the measurement of the electromagnetic field (EMF) strength may lead to an underestimation of exposure risk or an overestimation of required measurements. The Guide to the Expression of Uncertainty in Measurement (GUM) has internationally been adopted as a de facto standard for uncertainty assessment. However, analyses under such an approach commonly assume unrealistic static models or neglect relevant prior information, resulting in non-robust uncertainties. This study proposes a principled and systematic framework for uncertainty analysis that fuses information from current measurements and prior knowledge. Such a framework dynamically adapts to data by exploiting a likelihood function based on kernel mixtures and incorporates flexible choices of prior information by applying importance sampling. The validity of the proposed techniques is assessed from measurements performed with a broadband radiation meter and an isotropic field probe. The developed framework significantly outperforms the GUM approach, achieving a reduction of 28 % in measurement uncertainty. (authors)
Use of Commercially Available Software in an Attribute Measurement System
MacArthur, D. W. (Duncan W.); Bracken, D. S. (David S.); Carrillo, L. A. (Louis A.); Elmont, T. H. (Timothy H.); Frame, K. C. (Katherine C.); Hirsch, K. L. (Karen L.)
2005-01-01
A major issue in international safeguards of nuclear materials is the ability to verify that processes and materials in nuclear facilities are consistent with declaration without revealing sensitive information. An attribute measurement system (AMS) is a non-destructive assay (NDA) system that utilizes an information barrier to protect potentially sensitive information about the measurement item. A key component is the software utilized for operator interface, data collection, analysis, and attribute determination, as well as the operating system under which they are implemented. Historically, custom software has been used almost exclusively in transparency applications, and it is unavoidable that some amount of custom software is needed. The focus of this paper is to explore the extent to which commercially available software may be used and the relative merits.
Uncertainty Measures in Ordered Information System Based on Approximation Operators
Bingjiao Fan
2014-01-01
Full Text Available This paper focuses on constructing uncertainty measures by the pure rough set approach in ordered information systems. Four types of definitions of lower and upper approximations and the corresponding uncertainty measurement concepts, including accuracy, roughness, approximation quality, approximation accuracy, dependency degree, and importance degree, are investigated. Theoretical analysis indicates that all four types can be used to evaluate the uncertainty in ordered information systems; in particular, we find that the first and third types are essentially the same. To interpret and help understand the approach, experiments on real-life data sets were conducted to test the four types of uncertainty measures. The results show that these measures can effectively quantify the uncertainty in ordered information systems.
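Two of the measures named above, accuracy and roughness, have simple set-theoretic definitions: accuracy is the ratio of the sizes of the lower and upper approximations, and roughness is its complement. The sets below are toy examples, not from the paper's data:

```python
# Rough-set accuracy and roughness for a target concept.
# lower: objects certainly belonging to the target set.
# upper: objects possibly belonging to the target set.
lower = {1, 2}
upper = {1, 2, 3, 4, 5, 6}

accuracy = len(lower) / len(upper)   # |lower| / |upper|
roughness = 1.0 - accuracy           # uncertainty of the description

print(f"accuracy = {accuracy:.3f}, roughness = {roughness:.3f}")
```

A crisp (exactly definable) set has lower = upper, hence accuracy 1 and roughness 0; the wider the boundary region, the rougher the concept.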
Image Reinforcement or Impairment: The Effects of Co-Branding on Attribute Uncertainty
Tansev Geylani; J. Jeffrey Inman; Frenkel Ter Hofstede
2008-01-01
Co-branding is often used by companies to reinforce the image of their brands. In this paper, we investigate the conditions under which a brand's image is reinforced or impaired as a result of co-branding, and the characteristics of a good partner for a firm considering co-branding for image reinforcement. We address these issues by conceptualizing attribute beliefs as two-dimensional constructs: The first dimension reflects the expected value of the attribute, while the second dimension refl...
Uncertainty Quantification for Quantitative Imaging Holdup Measurements
Bevill, Aaron M [ORNL; Bledsoe, Keith C [ORNL
2016-01-01
In nuclear fuel cycle safeguards, special nuclear material "held up" in pipes, ducts, and glove boxes causes significant uncertainty in material-unaccounted-for estimates. Quantitative imaging is a proposed non-destructive assay technique with potential to estimate the holdup mass more accurately and reliably than current techniques. However, uncertainty analysis for quantitative imaging remains a significant challenge. In this work we demonstrate an analysis approach for data acquired with a fast-neutron coded aperture imager. The work includes a calibrated forward model of the imager. Cross-validation indicates that the forward model predicts the imager data typically within 23%; further improvements are forthcoming. A new algorithm based on the chi-squared goodness-of-fit metric then uses the forward model to calculate a holdup confidence interval. The new algorithm removes geometry approximations that previous methods require, making it a more reliable uncertainty estimator.
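The chi-squared confidence-interval idea described in the abstract can be sketched as a scan over candidate holdup masses. This is a hedged illustration only: the linear "forward model", per-pixel sensitivities, and simulated data below are stand-ins, not the calibrated imager model of the paper.

```python
# Sketch: chi-squared goodness-of-fit scan over candidate holdup
# masses, keeping masses within chi2_min + 1 as a ~68% interval
# for one free parameter (usual asymptotic assumptions).

def forward_model(mass):
    """Hypothetical detector response: counts proportional to mass."""
    return [mass * s for s in (2.0, 3.0, 5.0)]  # assumed sensitivities

observed = [20.4, 31.2, 49.9]  # simulated measurement (true mass ~10)

def chi_squared(mass):
    return sum((o - p) ** 2 / p
               for o, p in zip(observed, forward_model(mass)))

masses = [m / 100 for m in range(800, 1201)]   # scan 8.00 .. 12.00
chi2 = {m: chi_squared(m) for m in masses}
chi2_min = min(chi2.values())
interval = [m for m in masses if chi2[m] <= chi2_min + 1.0]
print(min(interval), max(interval))
```

The resulting interval brackets the true mass without any geometry approximation, which is the advantage the abstract claims over earlier methods.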
GRAPH THEORY APPROACH TO QUANTIFY UNCERTAINTY OF PERFORMANCE MEASURES
Sérgio D. Sousa
2015-03-01
In this work, the performance measurement process is studied to quantify the uncertainty induced in the resulting performance measure (PM). To that end, the causes of uncertainty are identified by analysing the activities undertaken in the three stages of the performance measurement process: design and implementation, data collection and recording, and determination and analysis. A quantitative methodology based on graph theory and on the sources of uncertainty of the performance measurement process is used to calculate an uncertainty index that evaluates the level of uncertainty of a given PM or key performance indicator. An application example is presented. Quantifying PM uncertainty could better represent the risk associated with a given decision and could also help improve the PM's precision and reliability.
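Graph-theoretic index methods of this kind are often computed as the permanent of a matrix whose diagonal carries per-stage scores and whose off-diagonal entries weight interdependencies. The sketch below assumes that formulation; the 3x3 values are illustrative and not taken from the paper.

```python
from itertools import permutations

# Sketch: an uncertainty index as the permanent of a "variable
# permanent matrix": stage scores on the diagonal, coupling weights
# off the diagonal. All numeric values are assumptions.

def permanent(matrix):
    """Permanent of a square matrix (O(n!) -- fine for small n)."""
    n = len(matrix)
    total = 0
    for perm in permutations(range(n)):
        product = 1
        for row, col in enumerate(perm):
            product *= matrix[row][col]
        total += product
    return total

# Stages: design/implementation, data collection, analysis.
M = [
    [3, 1, 0],
    [1, 2, 1],
    [0, 1, 4],
]
index = permanent(M)
print(index)  # -> 31
```

Unlike the determinant, the permanent keeps all terms positive, so every source and coupling contributes additively to the index.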
OPEN PUBLIC SPACE ATTRIBUTES AND CATEGORIES – COMPLEXITY AND MEASURABILITY
Ljiljana Čavić
2014-12-01
Within the field of architectural and urban research, this work addresses the complexity of contemporary public space, in both a conceptual and a concrete sense. It aims at systematizing spatial attributes and their categories and at discussing spatial complexity and measurability, in order to reach a more comprehensive understanding, description, and analysis of public space. Our aim is to improve the everyday usage of open public space, and we acknowledge users as its crucial factor. There are numerous investigations of the complex urban and architectural reality of public space that recognise the importance of users; however, we did not find any that holistically account for what users find essential in public space. Based on the incompleteness of existing approaches to open public space and on the importance of users for its success, this paper proposes a user-orientated approach. Through an initial survey directed at users, we collected the aspects of public spaces that contemporary users consider most important. The gathered data are analysed and coded into spatial attributes, from which their role in the complexity and measurability of open public space is discussed. The work results in an inventory of attributes that users find salient in public spaces. It does not discuss their qualitative values or their contribution to generating spatial realities; it aims to define them clearly so that any further logical argumentation on open space concerning users may be solidly constructed. Finally, through a categorisation of attributes, it proposes the disciplinary levels necessary for the analysis of complex urban-architectural reality.
On the different approaches of measuring uncertainty shocks
Strobel, Johannes
2015-01-01
As uncertainty has become an increasingly prominent source of business cycle fluctuations, various uncertainty proxies have been proposed in the literature. This paper shows that uncertainty measures based on realized variables fluctuate more than the measures that are based on forecasts. More precisely, the variation in the realized cross-sectional standard deviation of profit growth and stock returns is larger than the variation in the forecast standard deviation. Moreover, the forecast sta...
Attributes measurements by calorimetry in 15 to 30 minutes
An analysis of the early portion of the power-history data collected with both of the IAEA's air-cooled bulk calorimeters has demonstrated that such calorimeters can measure the power from preheated containers of plutonium oxide with an accuracy of 2-5% in 15 to 30 minutes. Material accountancy at plutonium facilities has a need for such a capability for measurement of Pu scrap. Also, the IAEA could use just two calorimeters and a gamma-ray assay system for reliable variables and attributes measurements of plutonium mass during a two-day physical-inventory verification (PIV) at a mixed-oxide (MOX) fuel-fabrication facility. The assay results would be free of the concerns about sample moisture, impurities, and geometry that previously have limited the accuracy of assays based on neutron measurements
Designing a 3rd generation, authenticatable attribute measurement system
Attribute measurement systems (AMS) are designed to measure potentially sensitive items containing Special Nuclear Materials to determine if the items possess attributes which fall within an agreed-upon range. Such systems could be used in a treaty to inspect and verify the identity of items in storage without revealing any sensitive information associated with the item. An AMS needs to satisfy two constraints: the host party needs to be sure that none of their sensitive information is released, while the inspecting party wants to have confidence that the limited amount of information they see accurately reflects the properties of the item being measured. The former involves 'certifying' the system and the latter 'authenticating' it. Previous work on designing and building AMS systems has focused more on the questions of certifiability than on the questions of authentication, although a few approaches have been investigated. The next step is to build a 3rd generation AMS which (1) makes the appropriate measurements, (2) can be certified, and (3) can be authenticated (the three generations). This paper will discuss the ideas, options, and process of producing a design for a 3rd generation AMS.
Li, Shuai; Xiong, Lihua; Li, Hongyi; Leung, Lai-Yung R.; Demissie, Yonas
2016-01-08
Hydrological simulations to delineate the impacts of climate variability and human activities are subjected to uncertainties related to both parameter and structure of the hydrological models. To analyze the impact of these uncertainties on the model performance and to yield more reliable simulation results, a global calibration and multimodel combination method that integrates the Shuffled Complex Evolution Metropolis (SCEM) and Bayesian Model Averaging (BMA) of four monthly water balance models was proposed. The method was applied to the Weihe River Basin (WRB), the largest tributary of the Yellow River, to determine the contribution of climate variability and human activities to runoff changes. The change point, which was used to determine the baseline period (1956-1990) and human-impacted period (1991-2009), was derived using both cumulative curve and Pettitt’s test. Results show that the combination method from SCEM provides more skillful deterministic predictions than the best calibrated individual model, resulting in the smallest uncertainty interval of runoff changes attributed to climate variability and human activities. This combination methodology provides a practical and flexible tool for attribution of runoff changes to climate variability and human activities by hydrological models.
Instrumental measurement of beer taste attributes using an electronic tongue
Rudnitskaya, Alisa, E-mail: alisa.rudnitskaya@gmail.com [Chemistry Department, University of Aveiro, Aveiro (Portugal); Laboratory of Chemical Sensors, Chemistry Department, St. Petersburg University, St. Petersburg (Russian Federation); Polshin, Evgeny [Laboratory of Chemical Sensors, Chemistry Department, St. Petersburg University, St. Petersburg (Russian Federation); BIOSYST/MeBioS, Catholic University of Leuven, W. De Croylaan 42, B-3001 Leuven (Belgium); Kirsanov, Dmitry [Laboratory of Chemical Sensors, Chemistry Department, St. Petersburg University, St. Petersburg (Russian Federation); Lammertyn, Jeroen; Nicolai, Bart [BIOSYST/MeBioS, Catholic University of Leuven, W. De Croylaan 42, B-3001 Leuven (Belgium); Saison, Daan; Delvaux, Freddy R.; Delvaux, Filip [Centre for Malting and Brewing Sciences, Katholieke Universiteit Leuven, Heverelee (Belgium); Legin, Andrey [Laboratory of Chemical Sensors, Chemistry Department, St. Petersburg University, St. Petersburg (Russian Federation)
2009-07-30
The present study deals with the evaluation of an electronic tongue multisensor system as an analytical tool for the rapid assessment of the taste and flavour of beer. Fifty samples of Belgian and Dutch beers of different types (lager beers, ales, wheat beers, etc.), which were characterized with respect to their sensory properties, were measured using the electronic tongue (ET) based on potentiometric chemical sensors developed in the Laboratory of Chemical Sensors of St. Petersburg University. The analysis of the sensory data and the calculation of the compromise average scores were made using STATIS. The beer samples were discriminated using both sensory-panel and ET data based on PCA, and the two data sets were compared using Canonical Correlation Analysis. The ET data were related to the sensory beer attributes using Partial Least Squares regression for each attribute separately. Validation was done on a test set comprising one-third of all samples. The ET was capable of predicting with good precision 20 sensory attributes of beer, including bitter, sweet, sour, fruity, caramel, artificial, burnt, intensity, and body.
Uncertainty determination demonstration program on MC and A measurement systems
Statistically propagated limits of error (LOE) for accountability measurements are usually smaller than LOEs derived from historical data. Laboratory measurement quality control programs generate estimates of random and systematic errors for the LOE calculations. The uncertainty of measurement-system standards and instrument calibrations is often not included in the estimates from measurement quality control (QC) programs (MCPs); the uncertainty associated with a measurement system is therefore usually underestimated. A program was conducted at the Savannah River Site (SRS) to evaluate a commercial measurement assurance program software package (JTIPMAP™) that records, charts, and analyzes control-standard measurements to determine and control total measurement uncertainty. The software uniquely uses the uncertainty of the standards, the calibration histories, and routine QC data to estimate the total uncertainty of a measurement process. The demonstration program involved: training measurement personnel on the principles of process measurement assurance (PMAP) and the use of the software; technical support in setting up PMAPs on gas mass spectrometry, calorimetry, Fourier-transform infrared (FTIR) spectrometry, alpha PHA spectrometry, diode-array spectrophotometry, and mass-standards calibration measurement systems; and determining and evaluating uncertainty estimates for each system. Results of the demonstration program are described and the uncertainties for these measurement systems are summarized below. The demonstration showed that the training and software provide several useful functions, such as uncertainty determinations that include the standards used and independent standards that reveal systematic errors in the measurement process; these produced larger uncertainty estimates than current MCPs. The software will be tested further in pilot programs for D2O measurements, calorimetry, and mass-standards calibrations.
Relating confidence to measured information uncertainty in qualitative reasoning
Chavez, Gregory M [Los Alamos National Laboratory; Zerkle, David K [Los Alamos National Laboratory; Key, Brian P [Los Alamos National Laboratory; Shevitz, Daniel W [Los Alamos National Laboratory
2010-10-07
Qualitative reasoning makes use of qualitative assessments provided by subject matter experts to model factors such as security risk. Confidence in a result is important and useful when comparing competing results. Quantifying the confidence in an evidential reasoning result must be consistent and based on the available information. A novel method is proposed to relate confidence to the available information uncertainty in the result using fuzzy sets. Information uncertainty can be quantified through measures of non-specificity and conflict. Fuzzy values for confidence are established from information uncertainty values that lie between the measured minimum and maximum information uncertainty values.
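The mapping from measured information uncertainty to confidence can be sketched very simply: confidence is highest at the minimum attainable uncertainty and lowest at the maximum. The linear membership function below is a stand-in for the paper's fuzzy-set construction, and all values are assumptions.

```python
# Sketch: confidence from information uncertainty, interpolated
# between the measured minimum and maximum uncertainty values.

def confidence(u, u_min, u_max):
    """Confidence is 1 at minimum uncertainty, 0 at maximum."""
    if u_max == u_min:
        return 1.0
    u = max(u_min, min(u, u_max))          # clamp into the range
    return (u_max - u) / (u_max - u_min)   # linear fuzzy membership

print(confidence(2.0, 1.0, 5.0))  # -> 0.75
```

A full fuzzy treatment would replace the single linear membership with overlapping fuzzy sets over the non-specificity and conflict measures, but the interpolation idea is the same.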
UNCERTAINTY AND ITS IMPACT ON THE QUALITY OF MEASUREMENT
Adel Elahdi M. Yahya
2012-01-01
In current practice, laboratory measurement and calibration should be accredited by national or international bodies and should comply with the requirements of ISO 17025 for the accreditation of competent laboratories. Those requirements include stating, in the measurement certificate, the limits of doubt of the testing or measurement process, which allows the customer to judge the quality and efficiency of the measurement. In this study we clarify what uncertainty in measurement is, the standard types of uncertainty, and how to calculate an uncertainty budget, and we give examples of the calculation for length measurements in the laboratory. Analysing the results obtained with a CMM, we found that the non-statistical (Type B) uncertainty in measuring a piece one metre in length was ±1.9257 µm, and that when using the configuration measuring device the expanded combined standard uncertainty was ±2.030 µm when measuring a screw value of 1.2707 mm. We conclude that uncertainty has a greater impact on results measured to a high degree of fineness and less impact on pieces of low fineness. Careful calibration of measuring instruments and equipment against measurement standards is of the utmost importance, and laboratories must calculate the uncertainty budget as part of measurement evaluation to provide high-quality measurement results.
Strain gauge measurement uncertainties on hydraulic turbine runner blade
Strains experimentally measured with strain gauges can differ from those evaluated using the Finite Element (FE) method. This difference is due mainly to the assumptions and uncertainties inherent to each method. To circumvent this difficulty, we developed a numerical method based on Monte Carlo simulations to evaluate measurement uncertainties produced by the behaviour of a unidirectional welded gauge, its position uncertainty and its integration effect. This numerical method uses the displacement fields of the studied part evaluated by an FE analysis. The paper presents a study case using in situ data measured on a hydraulic turbine runner. The FE analysis of the turbine runner blade was computed, and our numerical method used to evaluate uncertainties on strains measured at five locations with welded strain gauges. Then, measured strains and their uncertainty ranges are compared to the estimated strains. The uncertainty ranges obtained extended from 74 με to 165 με. Furthermore, the biases observed between the median of the uncertainty ranges and the FE strains varied from −36 to 36 με. Note that strain gauge measurement uncertainties depend mainly on displacement fields and gauge geometry.
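The Monte Carlo idea above can be sketched by sampling the gauge position from its tolerance distribution and propagating each sample through the strain field. The polynomial strain field, nominal position, and tolerance below are illustrative assumptions, not the FE displacement fields of the study.

```python
import random
import statistics

# Sketch: Monte Carlo propagation of gauge-position uncertainty
# into the measured strain. All numeric values are assumptions.

def strain_field(x):
    """Hypothetical strain (microstrain) along the blade at x (mm)."""
    return 100.0 + 8.0 * x - 0.3 * x ** 2

random.seed(1)
nominal_x, pos_sigma = 5.0, 0.5   # gauge position and its 1-sigma (mm)
samples = [strain_field(random.gauss(nominal_x, pos_sigma))
           for _ in range(20_000)]
mean = statistics.fmean(samples)
sigma = statistics.stdev(samples)
print(round(mean, 1), round(sigma, 1))
```

The full method in the paper also samples gauge orientation and integrates the strain over the gauge grid area; the position term shown here is just the simplest of those contributions.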
Extended component importance measures considering aleatory and epistemic uncertainties
Sallak, Mohamed; Schon, Walter; Aguirre, Felipe
2013-01-01
In this paper, extended component importance measures (Birnbaum importance, RAW, RRW, and Criticality importance) considering aleatory and epistemic uncertainties are introduced. The D-S theory, which is considered a less restricted extension of probability theory, is proposed as a framework for taking into account both aleatory and epistemic uncertainties. The epistemic uncertainty defined in this paper is the total lack of knowledge of the component state. The...
Measurement uncertainty in pharmaceutical analysis and its application
Marcus Augusto Lyrio Traple; Alessandro Morais Saviano; Fabiane Lacerda Francisco; Felipe Rebello Lourenço
2014-01-01
The measurement uncertainty provides complete information about an analytical result. This is very important because several decisions of compliance or non-compliance are based on analytical results in pharmaceutical industries. The aim of this work was to evaluate and discuss the estimation of uncertainty in pharmaceutical analysis. The uncertainty is a useful tool in the assessment of compliance or non-compliance of in-process and final pharmaceutical products as well as in the assessment o...
Calibration and Measurement Uncertainty Estimation of Radiometric Data: Preprint
Habte, A.; Sengupta, M.; Reda, I.; Andreas, A.; Konings, J.
2014-11-01
Evaluating the performance of photovoltaic cells, modules, and arrays that form large solar deployments relies on accurate measurements of the available solar resource. Therefore, determining the accuracy of these solar radiation measurements provides a better understanding of investment risks. This paper provides guidelines and recommended procedures for estimating the uncertainty in radiometer calibrations and measurements using methods that follow the International Bureau of Weights and Measures Guide to the Expression of Uncertainty in Measurement (GUM). Standardized analysis based on these procedures ensures that the uncertainty quoted is well documented.
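A GUM-style estimate combines independent standard-uncertainty components in quadrature and multiplies by a coverage factor. The sketch below assumes a set of hypothetical radiometer components; the names and percentages are illustrative, not values from the report.

```python
import math

# Sketch: GUM-style combination of independent standard uncertainties
# for a radiometer calibration. Component values are assumptions.

components = {            # standard uncertainties, % of reading
    "reference_irradiance": 0.35,
    "data_acquisition": 0.10,
    "temperature_response": 0.20,
    "cosine_response": 0.25,
}
# Combined standard uncertainty: root-sum-of-squares of components.
u_combined = math.sqrt(sum(u ** 2 for u in components.values()))
# Expanded uncertainty at ~95% confidence (coverage factor k = 2).
U_expanded = 2 * u_combined
print(round(u_combined, 3), round(U_expanded, 3))
```

Quadrature addition assumes the components are uncorrelated; correlated terms would need covariance cross-terms per the GUM.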
Point-In-Time Measurement Uncertainty Recapture for RCS Flow
Jung, Byung Ryul; Jang, Ho Cheol; Yune, Seok Jeong; Kim, Eun Kee [Korea Power Engineering and Construction Company, Inc., Daejeon (Korea, Republic of)
2014-10-15
In nuclear power plants, the RCS flow measurement uncertainty plays an important role in establishing flow acceptance criteria. A narrow band of acceptance criteria, based on the design limiting uncertainty of the measured RCS flow, may lead to a point-in-time violation of the criteria when the measured flow is too close to the upper limit of the allowable RCS flow operating band. The measured RCS flow may also approach the lower limit of the acceptance criteria as the operating cycle proceeds. Several possible and practical methods are proposed for recapturing RCS flow measurement uncertainty in a point-in-time situation that fails to meet the acceptance criteria, and a combination of these methods can be utilized to establish a design limiting measurement uncertainty; they can also serve as a design-basis methodology for establishing that uncertainty. It is worth noting that the hot- and cold-leg temperatures have additional redundancy, such as wide-range instrument channels, so the measured operating condition for RCS flow has potential for further recapture. The more of these recapturing methods are applied, the more the uncertainty recapture can be improved.
Detailed Uncertainty Analysis of the ZEM-3 Measurement System
Mackey, Jon; Sehirlioglu, Alp; Dynys, Fred
2014-01-01
The measurement of the Seebeck coefficient and electrical resistivity is critical to the investigation of all thermoelectric systems. It follows that the measurement uncertainty must be well understood to report ZT values that are accurate and trustworthy. A detailed uncertainty analysis of the ZEM-3 measurement system has been performed. The analysis calculates error in the electrical resistivity measurement as a result of sample geometry tolerance, probe geometry tolerance, statistical error, and multimeter uncertainty. The uncertainty on the Seebeck coefficient includes probe-wire correction factors, statistical error, multimeter uncertainty, and, most importantly, the cold-finger effect. The cold-finger effect plagues all potentiometric (four-probe) Seebeck measurement systems, as heat parasitically transfers through the thermocouple probes; it leads to an asymmetric over-estimation of the Seebeck coefficient. A thermal finite element analysis allows for quantification of the phenomenon and provides an estimate of the uncertainty of the Seebeck coefficient. The thermoelectric power factor was found to have an uncertainty of ±9-14% at high temperature and ±9% near room temperature.
Measurement uncertainty in Total Reflection X-ray Fluorescence
Floor, G.H., E-mail: geerke.floor@gfz-potsdam.de [GFZ German Research Centre for Geosciences Section 3.4. Earth Surface Geochemistry, Telegrafenberg, 14473 Postdam (Germany); Queralt, I. [Institute of Earth Sciences Jaume Almera ICTJA-CSIC, Solé Sabaris s/n, 08028 Barcelona (Spain); Hidalgo, M.; Marguí, E. [Department of Chemistry, University of Girona, Campus Montilivi s/n, 17071 Girona (Spain)
2015-09-01
Total Reflection X-ray Fluorescence (TXRF) spectrometry is a multi-elemental technique using micro-volumes of sample. This work assessed the components contributing to the combined uncertainty budget associated with TXRF measurements using Cu and Fe concentrations in different spiked and natural water samples as an example. The results showed that an uncertainty estimation based solely on the count statistics of the analyte is not a realistic estimation of the overall uncertainty, since the depositional repeatability and the relative sensitivity between the analyte and the internal standard are important contributions to the uncertainty budget. The uncertainty on the instrumental repeatability and sensitivity factor could be estimated and as such, potentially relatively straightforward implemented in the TXRF instrument software. However, the depositional repeatability varied significantly from sample to sample and between elemental ratios and the controlling factors are not well understood. By a lack of theoretical prediction of the depositional repeatability, the uncertainty budget can be based on repeat measurements using different reflectors. A simple approach to estimate the uncertainty was presented. The measurement procedure implemented and the uncertainty estimation processes developed were validated from the agreement with results obtained by inductively coupled plasma — optical emission spectrometry (ICP-OES) and/or reference/calculated values. - Highlights: • The uncertainty of TXRF cannot be realistically described by the counting statistics. • The depositional repeatability is an important contribution to the uncertainty. • Total combined uncertainties for Fe and Cu in waste/mine water samples were 4–8%. • Obtained concentrations agree within uncertainty with reference values.
Attributes and templates from active measurements with 252Cf
Active neutron interrogation is useful for the detection of shielded HEU and could also be used for Pu. In an active technique, fissile material is stimulated by an external neutron source to produce fission with the emanation of neutrons and gamma rays. The time distribution of particles leaving the fissile material is measured with respect to the source emission in a variety of ways. A variety of accelerator and radioactive sources can be used. Active interrogation of nuclear weapons/components can be used in two ways: template matching or attribute estimation. Template matching compares radiation signatures with known reference signatures and for treaty applications has the problem of authentication of the reference signatures along with storage and retrieval of templates. Attribute estimation determines, for example, the fissile mass from various features of the radiation signatures and does not require storage of radiation signatures but does require calibration, which can be repeated as necessary. A nuclear materials identification system (NMIS) has been in use at the Oak Ridge Y-12 Plant for verification of weapons components being received and in storage by template matching and has been used with calibrations for attribute (fissile mass) estimation for HEU metal. NMIS employs a 252Cf source of low intensity (6 n/sec) such that the dose at 1 m is approximately twice that on a commercial airline at altitude. The use of such a source presents no significant safety concerns either for personnel or nuclear explosive safety, and has been approved for use at the Pantex Plant on fully assembled weapons systems
Uncertainty and sensitivity analysis and its applications in OCD measurements
Vagos, Pedro; Hu, Jiangtao; Liu, Zhuan; Rabello, Silvio
2009-03-01
This article describes an Uncertainty & Sensitivity Analysis package, a mathematical tool that can be an effective time-shortcut for optimizing OCD models. By including real system noises in the model, an accurate method for predicting measurements uncertainties is shown. The assessment, in an early stage, of the uncertainties, sensitivities and correlations of the parameters to be measured drives the user in the optimization of the OCD measurement strategy. Real examples are discussed revealing common pitfalls like hidden correlations and simulation results are compared with real measurements. Special emphasis is given to 2 different cases: 1) the optimization of the data set of multi-head metrology tools (NI-OCD, SE-OCD), 2) the optimization of the azimuth measurement angle in SE-OCD. With the uncertainty and sensitivity analysis result, the right data set and measurement mode (NI-OCD, SE-OCD or NI+SE OCD) can be easily selected to achieve the best OCD model performance.
Assessment of dose measurement uncertainty using RisøScan
Helt-Hansen, J.; Miller, A.
2006-01-01
The dose measurement uncertainty of the dosimeter system RisøScan, an office scanner, and Risø B3 dosimeters has been assessed by comparison with spectrophotometer measurements of the same dosimeters. The reproducibility and the combined uncertainty were found to be approximately 2% and 4%, respectively, at one standard deviation. The subroutine in RisøScan for electron energy measurement is shown to give results that are equivalent to measurements with a scanning spectrophotometer.
Measuring the Gas Constant "R": Propagation of Uncertainty and Statistics
Olsen, Robert J.; Sattar, Simeen
2013-01-01
Determining the gas constant "R" by measuring the properties of hydrogen gas collected in a gas buret is well suited for comparing two approaches to uncertainty analysis using a single data set. The brevity of the experiment permits multiple determinations, allowing for statistical evaluation of the standard uncertainty u[subscript…
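The propagation-of-uncertainty side of this experiment can be sketched for R = PV/(nT): for a pure product/quotient, relative standard uncertainties add in quadrature. The measured values and their uncertainties below are assumptions chosen to resemble a gas-buret experiment, not data from the article.

```python
import math

# Sketch: R = PV/(nT) with GUM-style propagation of the standard
# uncertainties of each measured quantity. Values are assumptions.

P, u_P = 101_325.0, 40.0      # pressure, Pa
V, u_V = 4.25e-5, 5e-8        # volume, m^3
n, u_n = 1.74e-3, 4e-6        # amount of gas, mol
T, u_T = 296.5, 0.3           # temperature, K

R = P * V / (n * T)
# For a product/quotient, relative uncertainties add in quadrature.
rel_u = math.sqrt((u_P / P) ** 2 + (u_V / V) ** 2
                  + (u_n / n) ** 2 + (u_T / T) ** 2)
u_R = R * rel_u
print(round(R, 3), round(u_R, 3))
```

The statistical alternative in the experiment is simply to repeat the determination and take the standard deviation of the set of R values, then compare it with this propagated estimate.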
Evaluation of the uncertainty of environmental measurements of radioactivity
Results obtained by measurement of radioactivity have traditionally been associated with an expression of their uncertainty, based on the so-called counting statistics. This is calculated together with the actual result on the assumption that the number of counts observed has a Poisson distribution with equal mean and variance. Most of the nuclear scientific community has, therefore, assumed that it already complied with the latest ISO 17025 requirements. Counting statistics, however, express only the variability observed among repeated measurements of the same sample under the same counting conditions, which is equivalent to the term repeatability used in quantitative analysis. Many other sources of uncertainty need to be taken into account before a statement of the uncertainty of the actual result can be made. As the first link in the traceability chain, calibration is always an important uncertainty component in any kind of measurement. For radioactivity measurements in particular, counting geometry assumes the greatest importance, because it is often not possible to measure a standard and a control sample under exactly the same conditions. In the case of large samples there are additional uncertainty components associated with sample heterogeneity and its influence on self-absorption and counting efficiency. An uncertainty budget is prepared for existing data for 137Cs in Danish soil, which is shown to account adequately for all sources of uncertainty. (author)
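The abstract's central point, that counting statistics alone understate the uncertainty, can be sketched with a quadrature combination; the component magnitudes below are illustrative assumptions, not the paper's 137Cs budget:

```python
import math

# Hypothetical gamma-counting result for a soil sample.
gross_counts = 10_000
# Poisson "counting statistics" (repeatability): relative standard
# uncertainty of the count is sqrt(N)/N.
u_counting = math.sqrt(gross_counts) / gross_counts

# Additional relative standard uncertainties beyond counting statistics
# (illustrative magnitudes, not from the cited budget):
u_calibration = 0.03   # traceability of the calibration standard
u_geometry    = 0.04   # counting-geometry difference, sample vs. standard
u_selfabs     = 0.02   # self-absorption in a large, heterogeneous sample

# Combined relative standard uncertainty, assuming uncorrelated components.
u_combined = math.sqrt(u_counting ** 2 + u_calibration ** 2
                       + u_geometry ** 2 + u_selfabs ** 2)
print(f"counting alone: {u_counting:.1%}, full budget: {u_combined:.1%}")
```

With these assumed values, the full budget is several times larger than the counting statistics alone, which is the paper's argument for preparing a complete uncertainty budget.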
The largest source of uncertainty in the calculation of reactor power can be attributed to the limited accuracy and potential for undetected degradation of conventional flow nozzles and venturis used for feedwater flow measurement. UFM installations have been carried out with regulatory approval in PWRs and BWRs in the USA; regulatory approval is being progressed for trial installations on commercial nuclear units in Japan; and installations are being considered for PHWRs in Canada. Installations use permanently mounted chordal measurement transducer arrays in laboratory calibrated pipe spools to achieve a measurement accuracy of ±0.28%. In addition to high accuracy, measurement systems have evolved to be highly reliable, with redundancy and self-checking features built in to eliminate failures and the potential for drift and inadvertent overpower conditions. Outputs can be used for thermal power measurement and for feedwater flow process control. Measurement frequency can be set to be compatible with existing systems for thermal power measurement and process control. Contributors to thermal power measurement uncertainty are examined, and the range of potential measurement uncertainty recapture (MUR) is identified. Using industry-accepted practices to carry out MUR calculations, the available thermal power uprate can be predicted. Based on the combined uncertainty of all of the process parameters used in on-line thermal power calculations and the uncertainty assumed in the original licensing basis, available thermal power uprates vary between 1.5 and 2.5% of full power (FP). As the year-to-year power demand in Canada increases, nuclear energy continues to play an essential role in providing secure, stable and affordable electricity. Nuclear energy remains cost-competitive compared to other energy resources while eliminating greenhouse gas emissions. In the last decade, great progress has been achieved in developing new technologies applicable to NPPs, especially
An entropic uncertainty principle for positive operator valued measures
Rumin, Michel
2011-01-01
Extending a recent result by Frank and Lieb, we show an entropic uncertainty principle for mixed states in a Hilbert space relatively to pairs of positive operator valued measures that are independent in some sense.
Measurement Uncertainty in Decision Assessment for Telecommunication Systems
Moschitta, Antonio; Pianegiani, Fernando; Petri, Dario
2004-01-01
This paper deals with the effects of measurement uncertainty on decision making. In particular, conformance tests for communication systems equipment are considered, with respect to both consumer and producer risk and the effects of such parameters on the overall costs.
Uncertainty issues on S-CO2 compressor performance measurement
This is related to the property variation, pressure ratio, and measurement method. Since the SCO2PE facility operates near the critical point with a low-pressure-ratio compressor, one of the solutions to improve measurement uncertainty is utilizing a density meter. However, two additional density meters at the compressor inlet and outlet did not provide remarkable improvement in the overall uncertainty. Thus, the authors think that a different approach to the performance measurement is required to secure measurement confidence. As further work, identifying an appropriate approximation for the efficiency equation and applying direct measurement of compressor shaft power for the efficiency calculation will be considered
Point cloud uncertainty analysis for laser radar measurement system based on error ellipsoid model
Zhengchun, Du; Zhaoyong, Wu; Jianguo, Yang
2016-04-01
Three-dimensional laser scanning has become an increasingly popular measurement method in industrial fields as it provides a non-contact means of measuring large objects, whereas the conventional methods are contact-based. However, the data acquisition process is subject to many interference factors, which inevitably cause errors. Therefore, it is necessary to precisely evaluate the accuracy of the measurement results. In this study, an error-ellipsoid-based uncertainty model was applied to 3D laser radar measurement system (LRMS) data. First, a spatial point uncertainty distribution map was constructed according to the error ellipsoid attributes. The single-point uncertainty ellipsoid model was then extended to point-point, point-plane, and plane-plane situations, and the corresponding distance uncertainty models were derived. Finally, verification experiments were performed by using an LRMS to measure the height of a cubic object, and the measurement accuracies were evaluated. The results show that the plane-plane distance uncertainties determined based on the ellipsoid model are comparable to those obtained by actual distance measurements. Thus, this model offers solid theoretical support to enable further LRMS measurement accuracy improvement.
On the Uncertainty Principle for Continuous Quantum Measurement
Miao, Haixing
2016-01-01
We revisit the Heisenberg uncertainty principle for continuous quantum measurement with the detector describable by linear response. When the detector is at the quantum limit with minimum uncertainty, the fluctuation and response of a single-input single-output detector are shown to be related via two equalities. We illustrate the result by applying it to an optomechanical device--a typical continuous measurement setup.
Uncertainty Estimation Improves Energy Measurement and Verification Procedures
Walter, Travis; Price, Phillip N.; Sohn, Michael D.
2014-05-14
Implementing energy conservation measures in buildings can reduce energy costs and environmental impacts, but such measures cost money to implement, so intelligent investment strategies require the ability to quantify the energy savings by comparing actual energy used to how much energy would have been used in the absence of the conservation measures (known as the baseline energy use). Methods exist for predicting baseline energy use, but a limitation of most statistical methods reported in the literature is inadequate quantification of the uncertainty in baseline energy use predictions. However, estimation of uncertainty is essential for weighing the risks of investing in retrofits. Most commercial buildings have, or soon will have, electricity meters capable of providing data at short time intervals. These data provide new opportunities to quantify uncertainty in baseline predictions, and to do so after shorter measurement durations than are traditionally used. In this paper, we show that uncertainty estimation provides greater measurement and verification (M&V) information and helps to overcome some of the difficulties with deciding how much data is needed to develop baseline models and to confirm energy savings. We also show that cross-validation is an effective method for computing uncertainty. In so doing, we extend a simple regression-based method of predicting energy use using short-interval meter data. We demonstrate the methods by predicting energy use in 17 real commercial buildings. We discuss the benefits of uncertainty estimates which can provide actionable decision making information for investing in energy conservation measures.
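The cross-validation idea can be sketched on synthetic meter data: fit a baseline model on part of the data, predict the held-out part, and use the out-of-sample residual spread as the prediction uncertainty. The linear temperature model and all numbers are illustrative assumptions, not the paper's 17-building data set:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for short-interval meter data: energy use vs.
# outdoor temperature, with a known noise level of 5 energy units.
temp = rng.uniform(0, 30, 200)
energy = 50 + 2.0 * temp + rng.normal(0, 5, 200)

# k-fold cross-validation: fit on k-1 folds, predict the held-out fold,
# pool the out-of-sample residuals.
k = 5
folds = np.array_split(rng.permutation(len(temp)), k)
residuals = []
for i in range(k):
    test = folds[i]
    train = np.concatenate([folds[j] for j in range(k) if j != i])
    coef = np.polyfit(temp[train], energy[train], 1)  # simple linear baseline
    residuals.append(energy[test] - np.polyval(coef, temp[test]))

u_pred = np.std(np.concatenate(residuals))  # CV-based standard uncertainty
print(f"cross-validated prediction uncertainty: {u_pred:.2f} energy units")
```

On this synthetic example the cross-validated uncertainty recovers roughly the injected noise level, which is the property that makes CV useful for M&V baselines.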
Navacerrada Saturio, Maria Angeles; Díaz Sanchidrián, César; Pedrero González, Antonio; Iglesias Martínez, Luis
2008-01-01
The new Spanish Regulation in Building Acoustics establishes values and limits for the different acoustic magnitudes whose fulfilment can be verified by means of field measurements. In this sense, an essential aspect of a field measurement is to give the measured magnitude and the uncertainty associated with that magnitude. In the calculation of the uncertainty it is usual to follow the uncertainty propagation method as described in the Guide to the Expression of Uncertainty in Measurements (GUM...
A Unified Approach for Reporting ARM Measurement Uncertainties Technical Report
Campos, E [Argonne National Laboratory]; Sisterson, DL [Argonne National Laboratory]
2015-10-01
The Atmospheric Radiation Measurement (ARM) Climate Research Facility is observationally based, and quantifying the uncertainty of its measurements is critically important. With over 300 widely differing instruments providing over 2,500 datastreams, concise expression of measurement uncertainty is quite challenging. The ARM Facility currently provides data and supporting metadata (information about the data or data quality) to its users through a number of sources. Because the continued success of the ARM Facility depends on the known quality of its measurements, the Facility relies on instrument mentors and the ARM Data Quality Office (DQO) to ensure, assess, and report measurement quality. Therefore, an easily-accessible, well-articulated estimate of ARM measurement uncertainty is needed.
Uncertainty Analysis of Sonic Boom Levels Measured in a Simulator at NASA Langley
Rathsam, Jonathan; Ely, Jeffry W.
2012-01-01
A sonic boom simulator has been constructed at NASA Langley Research Center for testing the human response to sonic booms heard indoors. Like all measured quantities, sonic boom levels in the simulator are subject to systematic and random errors. To quantify these errors, and their net influence on the measurement result, a formal uncertainty analysis is conducted. Knowledge of the measurement uncertainty, or range of values attributable to the quantity being measured, enables reliable comparisons among measurements at different locations in the simulator as well as comparisons with field data or laboratory data from other simulators. The analysis reported here accounts for acoustic excitation from two sets of loudspeakers: one loudspeaker set at the facility exterior that reproduces the exterior sonic boom waveform and a second set of interior loudspeakers for reproducing indoor rattle sounds. The analysis also addresses the effect of pressure fluctuations generated when exterior doors of the building housing the simulator are opened. An uncertainty budget is assembled to document each uncertainty component, its sensitivity coefficient, and the combined standard uncertainty. The latter quantity will be reported alongside measurement results in future research reports to indicate data reliability.
Measurement uncertainty in pharmaceutical analysis and its application
Marcus Augusto Lyrio Traple; Alessandro Morais Saviano; Fabiane Lacerda Francisco; Felipe Rebello Lourençon
2014-01-01
The measurement uncertainty provides complete information about an analytical result. This is very important because several decisions of compliance or non-compliance are based on analytical results in pharmaceutical industries. The aim of this work was to evaluate and discuss the estimation of uncertainty in pharmaceutical analysis. The uncertainty is a useful tool in the assessment of compliance or non-compliance of in-process and final pharmaceutical products as well as in the assessment of pharmaceutical equivalence and stability study of drug products.
Measurement uncertainties physical parameters and calibration of instruments
Gupta, S V
2012-01-01
This book fulfills the global need to evaluate measurement results along with the associated uncertainty. In the book, together with the details of uncertainty calculations for many physical parameters, probability distributions and their properties are discussed. Definitions of various terms are given and will help practicing metrologists to grasp the subject. The book helps to establish international standards for evaluating the quality of raw data obtained from various laboratories and for interpreting the results of various national metrology institutes in international inter-comparisons. For the routine calibration of instruments, a new idea for the use of pooled variance is introduced. The uncertainty calculations are explained for (i) independent linear inputs, (ii) non-linear inputs and (iii) correlated inputs. The merits and limitations of the Guide to the Expression of Uncertainty in Measurement (GUM) are discussed. Monte Carlo methods for the derivation of the output distribution from the...
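The pooled-variance idea mentioned for routine calibration can be sketched directly: each calibration session yields a small sample, and pooling the session variances gives a more stable repeatability estimate than any single session. The readings below are illustrative assumptions, not examples from the book:

```python
import math

# Repeated routine calibrations of the same instrument on different days
# (illustrative readings).
sessions = [
    [10.02, 10.05, 9.98],          # day 1
    [10.00, 10.04, 10.01, 9.99],   # day 2
    [10.03, 9.97, 10.02],          # day 3
]

num = 0.0   # accumulates (n_i - 1) * s_i^2
den = 0     # accumulates (n_i - 1) degrees of freedom
for s in sessions:
    n = len(s)
    mean = sum(s) / n
    var = sum((x - mean) ** 2 for x in s) / (n - 1)
    num += (n - 1) * var
    den += n - 1

pooled_sd = math.sqrt(num / den)
print(f"pooled standard deviation: {pooled_sd:.4f} (on {den} degrees of freedom)")
```

The extra degrees of freedom (7 here, versus 2 or 3 per session) are what make the pooled estimate useful for routine work.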
Evaluation of measuring results, statement of uncertainty in dosimeter calibrations
The method described starts from the requirement that the quantitative statement of a measuring result in dosimetry should contain at least three figures: 1) the measured value, or the best estimate of the quantity to be measured; 2) the uncertainty of this value, given by a figure which indicates a certain range around the measured value; and, strongly linked with it, 3) a figure for the confidence level of this range, i.e. the probability that the (unknown) correct value is embraced by the given uncertainty range. How the figures 2) and 3) can be obtained, and how they should be quoted in calibration certificates, is the subject of these lectures. In addition, the means by which the method may be extended to determining the uncertainty of a measurement performed under conditions which deviate from the calibration conditions is briefly described. (orig.)
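The three required figures can be assembled from a measured value, its combined standard uncertainty, and a coverage factor chosen for the stated confidence level; the numbers below are illustrative assumptions:

```python
# 1) best estimate of the measured quantity (illustrative dose ratio)
value = 1.024
# 2) combined standard uncertainty (1 sigma)
u_combined = 0.008

# 3) confidence level: for an approximately normal distribution,
# a coverage factor k = 2 gives roughly 95 % confidence.
k = 2
U = k * u_combined            # expanded uncertainty
low, high = value - U, value + U
print(f"result: {value} +/- {U} "
      f"(approx. 95 % confidence interval: {low:.3f} to {high:.3f})")
```

A certificate entry would then quote all three: the value, the range, and the confidence level attached to that range.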
Uncertainty evaluation in radon concentration measurement using charcoal canister
Active charcoal detectors are used for testing the concentration of radon in dwellings. The method of measurement is based on radon adsorption on coal and measurement of gamma radiation of the radon daughters. The contributions to the final measurement uncertainty are identified, based on the equation for radon activity concentration calculation. Different methods for setting the region of interest for gamma spectrometry of canisters were discussed and evaluated. The obtained radon activity concentrations and uncertainties do not depend on the peak area determination method. - Highlights: • Measurement uncertainty budget for radon activity concentration established. • Three different methods for ROI selection are used and compared. • Recommended to use one continuous ROI, which is less sensitive to gamma spectrometry system instabilities
Gamma Attribute Measurements - Pu300, Pu600, Pu900
Gamma rays are ideal probes for the determination of information about the special nuclear material that is in the transparency regime. Gamma rays are good probes because they interact relatively weakly with the containers that surround the SNM under investigation. In addition, gamma rays carry a great deal of information about the material under investigation. We have leveraged these two characteristics to develop three technologies that have proven useful for the measurements of various attributes of plutonium. These technologies are Pu-300, Pu-600 and Pu-900. These technologies obtain the age, isotopics and presence/absence of oxide of a plutonium sample, respectively. Pu-300 obtains the time since the last 241Am separation for a sample of plutonium. This is accomplished by looking at the 241Am/241Pu ratio in the energy region from 330-350 keV, hence the name Pu-300. Pu-600 determines the isotopics of the plutonium sample under consideration. More specifically, it determines the 240Pu/239Pu ratio to determine if the plutonium sample is of weapons quality or not. This analysis is carried out in the energy region from 630-670 keV. Pu-900 determines the absence of PuO2 by searching for a peak at 870.7 keV. If this peak is absent then there is no oxide in the sample. This peak arises from the de-excitation of the first excited state of 17O. The assumption being made is that this state is populated by means of the 17O(α,α') reaction. The first excited state of 17O could also be populated by means of the 14N(α,p) reaction, which might indicate that this is not a good signature for the absence of PuO2; however, in the samples we have measured this peak is visible in oxide samples and is absent in other samples. In this paper we will discuss the physics details of these technologies and also present results of various measurements
Scepkowski, Lisa A; Wiegel, Markus; Bach, Amy K; Weisberg, Risa B; Brown, Timothy A; Barlow, David H
2004-12-01
This study investigated the attributional styles of men with and without sexual dysfunction for both positive and negative sexual and general events using a sex-specific version of the Attributional Style Questionnaire (Sex-ASQ), and ascertained the preliminary psychometric properties of the measure. The Sex-ASQ was created by embedding 8 hypothetical sexual events (4 positive, 4 negative) among the original 12 events in the Attributional Style Questionnaire (ASQ; C. Peterson, A. Semmel, C. von Baeyer, L. Y. Abramson, G. I. Metalsky, & M. E. Seligman, 1982). The Sex-ASQ was completed by 21 men with a principal DSM-IV diagnosis of Male Erectile Disorder (MED) and 32 male control participants. The psychometrics of the Sex-ASQ were satisfactory, but with the positive sexual event scales found to be less stable and internally consistent than the negative sexual event scales. Reasons for modest reliability of the positive event scales are discussed in terms of the original ASQ. As expected, men with MED did not differ significantly from men without sexual dysfunction in their causal attributions for general events, indicating that both groups exhibited an optimistic attributional style in general. Also as predicted, men with MED made more internal and stable causal attributions for negative sexual events than men without sexual dysfunction, and also rated negative sexual events as more important. For positive sexual events, the 2 groups did not differ in attributional style, with both groups making more external/unstable/specific causal attributions than for positive general events. Differences between explanatory style for sexual versus nonsexual events found in both sexually functional and dysfunctional men lend support for explanatory style models that propose both cross-situational consistency and situational specificity. PMID:15483370
Estimate of the uncertainty in measurement for the determination of mercury in seafood by TDA AAS.
Torres, Daiane Placido; Olivares, Igor R B; Queiroz, Helena Müller
2015-01-01
An approach for estimating the uncertainty in measurement, considering the individual sources related to the different steps of the method under evaluation as well as the uncertainties estimated from the validation data, for the determination of mercury in seafood by thermal decomposition/amalgamation atomic absorption spectrometry (TDA AAS) is proposed. The considered method has been fully optimized and validated in an official laboratory of the Ministry of Agriculture, Livestock and Food Supply of Brazil, in order to comply with national and international food regulations and quality assurance. The referred method has been accredited under the ISO/IEC 17025 norm since 2010. To estimate the uncertainty in measurement, the present work considered six sources of uncertainty for mercury determination in seafood by TDA AAS, following the validation process: linear least-squares regression, repeatability, intermediate precision, correction factor of the analytical curve, sample mass, and standard reference solution. Those that most influenced the uncertainty in measurement were sample weight, repeatability, intermediate precision and calibration curve. The estimated uncertainty in measurement reached a value of 13.39%, which complies with the European Regulation EC 836/2011. This figure represents a very realistic estimate of the routine conditions, since it fairly encompasses the dispersion obtained from the value attributed to the sample and the value measured by the laboratory analysts. From this outcome, it is possible to infer that the validation data (based on calibration curve, recovery and precision), together with the variation in sample mass, can offer a proper estimate of uncertainty in measurement. PMID:26065523
Stigsson, Martin; Munier, Raymond
2013-07-01
Measurements of structure orientations are afflicted with uncertainties which arise from many sources. Commonly, such uncertainties involve instrument imprecision, external disturbances and human factors. The aggregated uncertainty depends on the uncertainty of each of the sources. The orientation of an object measured in a borehole (e.g. a fracture) is calculated using four parameters: the bearing and inclination of the borehole and two relative angles of the measured object to the borehole. Each parameter may be a result of one or several measurements. The aim of this paper is to develop a method to both calculate and visualize the aggregated uncertainty resulting from the uncertainty in each of the four geometrical constituents. Numerical methods were used to develop a VBA-application in Microsoft Excel to calculate the aggregated uncertainty. The code calculates two different representations of the aggregated uncertainty: a 1-parameter uncertainty, the ‘minimum dihedral angle’, denoted by Ω; and, a non-parametric visual representation of the uncertainty, denoted by χ. The simple 1-parameter uncertainty algorithm calculates the minimum dihedral angle accurately, but overestimates the probability space that plots as an ellipsoid on a lower hemisphere stereonet. The non-parametric representation plots the uncertainty probability space accurately, usually as a sector of an annulus for steeply inclined boreholes, but is difficult to express numerically. The 1-parameter uncertainty can be used for evaluating statistics of large datasets whilst the non-parametric representation is useful when scrutinizing single or a few objects.
Measurement uncertainty evaluation of conicity error inspected on CMM
Wang, Dongxia; Song, Aiguo; Wen, Xiulan; Xu, Youxiong; Qiao, Guifang
2016-01-01
The cone is widely used in mechanical design for rotation, centering and fixing. Whether the conicity error can be measured and evaluated accurately will directly influence its assembly accuracy and working performance. According to the new generation geometrical product specification (GPS), the error and its measurement uncertainty should be evaluated together. The mathematical model of the minimum zone conicity error is established and an improved immune evolutionary algorithm (IIEA) is proposed to search for the conicity error. In the IIEA, initial antibodies are firstly generated by using quasi-random sequences and two kinds of affinities are calculated. Then, each antibody clone is generated and they are self-adaptively mutated so as to maintain diversity. Similar antibodies are suppressed and new random antibodies are generated. Because the mathematical model of conicity error is strongly nonlinear and the input quantities are not independent, it is difficult to use the Guide to the Expression of Uncertainty in Measurement (GUM) method to evaluate measurement uncertainty. An adaptive Monte Carlo method (AMCM) is proposed to estimate measurement uncertainty, in which the number of Monte Carlo trials is selected adaptively and the quality of the numerical results is directly controlled. The cone part was machined on a CK6140 lathe and measured on a Miracle NC 454 coordinate measuring machine (CMM). The experimental results confirm that the proposed method not only can search for the approximate solution of the minimum zone conicity error (MZCE) rapidly and precisely, but also can evaluate measurement uncertainty and give control variables with an expected numerical tolerance. The conicity errors computed by the proposed method are 20%-40% less than those computed by the NC454 CMM software and the evaluation accuracy improves significantly.
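The Monte Carlo propagation underlying such an evaluation can be sketched with a fixed trial count (the paper's AMCM additionally adapts the number of trials until a numerical tolerance is met). The measurand here is a cone half-angle from two measured diameters and their axial separation; all values are illustrative assumptions, not the paper's CMM data:

```python
import numpy as np

rng = np.random.default_rng(2)

# Assumed input quantities: two diameters and their axial separation, mm.
D, u_D = 50.012, 0.002
d, u_d = 30.008, 0.002
L, u_L = 40.000, 0.005

# Fixed-trial Monte Carlo (GUM Supplement 1 style): sample the inputs,
# push each draw through the nonlinear model.
n = 200_000
Ds = rng.normal(D, u_D, n)
ds = rng.normal(d, u_d, n)
Ls = rng.normal(L, u_L, n)
alpha = np.degrees(np.arctan((Ds - ds) / (2 * Ls)))  # half-angle, degrees

est = alpha.mean()
u = alpha.std(ddof=1)
lo, hi = np.percentile(alpha, [2.5, 97.5])  # 95 % coverage interval
print(f"half-angle = {est:.4f} deg, u = {u:.4f} deg, "
      f"95 % interval [{lo:.4f}, {hi:.4f}]")
```

An adaptive variant would repeat blocks of trials and stop once the estimates of `est`, `u`, `lo` and `hi` are stable to the requested tolerance.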
Evaluating the uncertainty of input quantities in measurement models
Possolo, Antonio; Elster, Clemens
2014-06-01
The Guide to the Expression of Uncertainty in Measurement (GUM) gives guidance about how values and uncertainties should be assigned to the input quantities that appear in measurement models. This contribution offers a concrete proposal for how that guidance may be updated in light of the advances in the evaluation and expression of measurement uncertainty that were made in the course of the twenty years that have elapsed since the publication of the GUM, and also considering situations that the GUM does not yet contemplate. Our motivation is the ongoing conversation about a new edition of the GUM. While generally we favour a Bayesian approach to uncertainty evaluation, we also recognize the value that other approaches may bring to the problems considered here, and focus on methods for uncertainty evaluation and propagation that are widely applicable, including to cases that the GUM has not yet addressed. In addition to Bayesian methods, we discuss maximum-likelihood estimation, robust statistical methods, and measurement models where values of nominal properties play the same role that input quantities play in traditional models. We illustrate these general-purpose techniques in concrete examples, employing data sets that are realistic but that also are of conveniently small sizes. The supplementary material available online lists the R computer code that we have used to produce these examples (stacks.iop.org/Met/51/3/339/mmedia). Although we strive to stay close to clause 4 of the GUM, which addresses the evaluation of uncertainty for input quantities, we depart from it as we review the classes of measurement models that we believe are generally useful in contemporary measurement science. We also considerably expand and update the treatment that the GUM gives to Type B evaluations of uncertainty: reviewing the state-of-the-art, disciplined approach to the elicitation of expert knowledge, and its encapsulation in probability distributions that are usable in
Measurement uncertainty issues in freeze-drying Processes
Vallan, Alberto; Carullo, Alessio
2012-01-01
This paper deals with problems that have to be faced when performing mass and temperature measurements of substances subjected to freeze-drying processes. A brief description of a lyophilization process is initially presented and a deep investigation is performed in order to identify the main uncertainty contributions that affect mass and temperature measurements. A measurement system is then described that has been specifically conceived to work inside a freeze-dryer. Experimental results ar...
Habte, A.; Sengupta, M.; Reda, I.
2015-03-01
Radiometric data with known and traceable uncertainty are essential for climate change studies to better understand cloud-radiation interactions and the earth radiation budget. Further, adopting a known and traceable method of estimating uncertainty with respect to SI ensures that the uncertainty quoted for radiometric measurements can be compared based on documented methods of derivation. Therefore, statements about the overall measurement uncertainty can only be made on an individual basis, taking all relevant factors into account. This poster provides guidelines and recommended procedures for estimating the uncertainty in calibrations and measurements from radiometers. The approach follows the Guide to the Expression of Uncertainty in Measurement (GUM).
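The GUM procedure referred to above can be sketched for a single radiometer calibration using the law of propagation of uncertainty with explicit sensitivity coefficients; the responsivity model and all numbers are illustrative assumptions, not values from the poster:

```python
import math

# Illustrative calibration model: responsivity R = V / E, where V is the
# radiometer signal and E the reference irradiance.
V, u_V = 8.45e-3, 2.0e-5   # signal, volts
E, u_E = 1000.0, 4.0       # reference irradiance, W m^-2

R = V / E                  # responsivity, V / (W m^-2)

# GUM sensitivity coefficients (partial derivatives of the model):
c_V = 1 / E                # dR/dV
c_E = -V / E ** 2          # dR/dE

# Law of propagation of uncertainty, uncorrelated inputs:
u_R = math.sqrt((c_V * u_V) ** 2 + (c_E * u_E) ** 2)
U_R = 2 * u_R              # expanded uncertainty, k = 2 (~95 %)
print(f"R = {R:.3e} +/- {U_R:.1e} V/(W m^-2) (k=2)")
```

Each additional component in a full radiometer budget (thermal offsets, cosine response, data-logger accuracy, and so on) would enter the same quadrature sum through its own sensitivity coefficient.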
Measurement Of Beer Taste Attributes Using An Electronic Tongue
Polshin, Evgeny; Rudnitskaya, Alisa; Kirsanov, Dmitry; Lammertyn, Jeroen; Nicolaï, Bart; Saison, Daan; Delvaux, Freddy R.; Delvaux, Filip; Legin, Andrey
2009-05-01
The present work deals with the results of the application of an electronic tongue system as an analytical tool for rapid assessment of beer flavour. Fifty samples of Belgian and Dutch beers of different types, characterized with respect to sensory properties and bitterness, were analyzed using the electronic tongue (ET) based on potentiometric chemical sensors. The ET was capable of predicting 10 sensory attributes of beer with good precision, including sweetness, sourness, intensity, body, etc., as well as the most important instrumental parameter, bitterness. These results show good promise for the further development of the ET as a new analytical technique for the fast assessment of taste attributes, and bitterness in particular, in the food and brewing industries.
USGS Polar Temperature Logging System, Description and Measurement Uncertainties
Clow, Gary D.
2008-01-01
This paper provides an updated technical description of the USGS Polar Temperature Logging System (PTLS) and a complete assessment of the measurement uncertainties. This measurement system is used to acquire subsurface temperature data for climate-change detection in the polar regions and for reconstructing past climate changes using the 'borehole paleothermometry' inverse method. Specifically designed for polar conditions, the PTLS can measure temperatures as low as -60 degrees Celsius with a sensitivity ranging from 0.02 to 0.19 millikelvin (mK). A modular design allows the PTLS to reach depths as great as 4.5 kilometers with a skid-mounted winch unit or 650 meters with a small helicopter-transportable unit. The standard uncertainty (uT) of the ITS-90 temperature measurements obtained with the current PTLS range from 3.0 mK at -60 degrees Celsius to 3.3 mK at 0 degrees Celsius. Relative temperature measurements used for borehole paleothermometry have a standard uncertainty (urT) whose upper limit ranges from 1.6 mK at -60 degrees Celsius to 2.0 mK at 0 degrees Celsius. The uncertainty of a temperature sensor's depth during a log depends on specific borehole conditions and the temperature near the winch and thus must be treated on a case-by-case basis. However, recent experience indicates that when logging conditions are favorable, the 4.5-kilometer system is capable of producing depths with a standard uncertainty (uZ) on the order of 200-250 parts per million.
CO2 Flux Measurement Uncertainty Estimates for NACP
Barr, A.; Hollinger, D.; Richardson, A. D.
2009-12-01
We evaluated the uncertainties in eddy-covariance net ecosystem exchange NEE, total ecosystem respiration RE and gross primary production GPP associated with (a) random measurement error and (b) uncertainties in the u* (friction velocity) threshold u*Th for all site-years in the NACP site-level synthesis. The analyses required automated evaluation of the u*Th filter used to identify and reject bad NEE measurements during low-turbulence periods at night. The u*Th detection algorithm was adapted from Papale et al. (2006), modified to use a standard change-point detection algorithm. Uncertainty in the u*Th was estimated by bootstrapping, conducted annually with 1,000 draws per site-year, then pooling all years and calculating the lower and upper 95% confidence intervals from the median and 2.5 and 97.5 percentiles of the pooled u*Th values. Random uncertainties in NEE, RE and GPP were estimated following Richardson et al. (2007). The NEE random uncertainty characteristic curve, which characterizes random uncertainty in NEE as a function of NEE, was estimated for each site-year based on the differences between the measured data and the output of a simple and robust gap-filling model. The estimation procedure began with synthetic NEE data generated by the gap-filling model, introduced gaps (as in the measured data after u*Th filtering), added synthetic noise (defined by the NEE random uncertainty characteristic curve using a Monte-Carlo approach), then filled the gaps in the noisy, gappy synthetic data. The process was repeated 1,000 times for each site-year, and the random uncertainty was estimated from median and the 2.5 and 97.5 percentiles of the gap-filled data. The uncertainties in NEE, RE and GPP associated with uncertainties in the u*Th were evaluated by running the gap-filling routine at 1,000 u*Th values, drawn randomly from the pooled annual bootstrapping estimates. This produced 1,000 realizations of the gap-filled NEE, RE and GPP time series. The
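The bootstrap step for the u* threshold can be sketched as a percentile confidence interval over resampled detections. The "detector" below is a hypothetical stand-in returning a sample percentile (the actual analysis applies change-point detection to NEE vs. u* classes, after Papale et al. 2006), and the data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic nighttime friction-velocity data, m/s (illustrative).
u_star = rng.gamma(shape=4.0, scale=0.08, size=500)

def detect_threshold(x):
    # Hypothetical stand-in for the change-point u*Th detector.
    return np.percentile(x, 40)

# Bootstrap: resample the data with replacement, re-detect the threshold
# on each draw, and summarize the pooled estimates by percentiles.
draws = 1000
boot = np.array([
    detect_threshold(rng.choice(u_star, size=u_star.size, replace=True))
    for _ in range(draws)
])

median = np.median(boot)
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"u*Th = {median:.3f} m/s, 95 % CI [{lo:.3f}, {hi:.3f}]")
```

Drawing u*Th values from such a pooled bootstrap distribution, rather than using a single threshold, is what propagates the filtering uncertainty into the gap-filled NEE, RE and GPP series.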
Toward an uncertainty budget for measuring nanoparticles by AFM
Delvallée, A.; Feltin, N.; Ducourtieux, S.; Trabelsi, M.; Hochepied, J. F.
2016-02-01
This article reports on the evaluation of an uncertainty budget associated with the measurement of the mean diameter of a nanoparticle (NP) population by Atomic Force Microscopy. The measurement principle consists in measuring the height of a population of spherical-like NPs to determine the mean diameter and the size distribution. This method assumes that the NPs are well dispersed on the substrate and isolated enough to avoid measurement errors due to agglomeration phenomena. Since the measurement is directly affected by the substrate roughness, the NPs were deposited on a mica sheet presenting a very low roughness. A complete metrological characterization of the instrument has been carried out and the main error sources have been evaluated. The measuring method has been tested on a population of SiO2 NPs. Homemade software was used to build the height-distribution histogram, taking into account only isolated NPs. Finally, the uncertainty budget including the main components has been established for the mean diameter measurement of this NP population. The most important components of this uncertainty budget are the calibration process along the Z-axis, the influence of scanning speed, and the vertical noise level.
Research on a method of shielded HEU attribution measurement
A special assay method is presented that determines the high-energy 2614.75 keV gamma ray from 208Tl, a daughter of 232U, from which the attributes of uranium can be learned. Theoretical analysis and experimental results prove that HEU from the U-Pu cycle contains a small amount of 232U. The 2614.75 keV gamma ray can be detected even under thick shielding. This proves that the method is feasible for identifying HEU
Uncertainty analysis of NDA waste measurements using computer simulations
Uncertainty assessments for nondestructive radioassay (NDA) systems for nuclear waste are complicated by factors extraneous to the measurement systems themselves. Most notably, characteristics of the waste matrix (e.g., homogeneity) and radioactive source material (e.g., particle size distribution) can have great effects on measured mass values. Under these circumstances, characterizing the waste population is as important as understanding the measurement system in obtaining realistic uncertainty values. When extraneous waste characteristics affect measurement results, the uncertainty results are waste-type specific. The goal becomes to assess the expected bias and precision for the measurement of a randomly selected item from the waste population of interest. Standard propagation-of-errors methods for uncertainty analysis can be very difficult to implement in the presence of significant extraneous effects on the measurement system. An alternative approach that naturally includes the extraneous effects is as follows: (1) Draw a random sample of items from the population of interest; (2) Measure the items using the NDA system of interest; (3) Establish the true quantity being measured using a gold standard technique; and (4) Estimate bias by deriving a statistical regression model comparing the measurements on the system of interest to the gold standard values; similar regression techniques for modeling the standard deviation of the difference values give the estimated precision. Actual implementation of this method is often impractical. For example, a true gold standard confirmation measurement may not exist. A more tractable implementation is obtained by developing numerical models for both the waste material and the measurement system. A random sample of simulated waste containers generated by the waste population model serves as input to the measurement system model. This approach has been developed and successfully applied to assessing the quantity of
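Step (4) of the scheme above, regressing the system of interest against gold-standard values to estimate bias and modeling the residual scatter for precision, can be sketched on simulated data. All numbers here (masses, bias factor, noise level) are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated population: "true" masses (gold standard) and NDA readings
# carrying a multiplicative bias plus matrix-driven scatter (hypothetical).
true_mass = rng.uniform(1.0, 50.0, 200)           # grams
nda = 1.15 * true_mass + rng.normal(0, 2.0, 200)  # 15% high bias, sigma = 2 g

# Regress measured on true to estimate the bias ...
slope, intercept = np.polyfit(true_mass, nda, 1)

# ... and model the scatter of the differences for the precision estimate
resid = nda - (slope * true_mass + intercept)
precision = resid.std(ddof=2)

print(f"estimated bias factor: {slope:.3f}")
print(f"estimated precision (1-sigma): {precision:.2f} g")
```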
Generalized uncertainty relations and efficient measurements in quantum systems
Belavkin, V. P.
2004-01-01
We consider two variants of a quantum-statistical generalization of the Cramer-Rao inequality that establishes an invariant lower bound on the mean square error of a generalized quantum measurement. The proposed complex variant of this inequality leads to a precise formulation of a generalized uncertainty principle for arbitrary states, in contrast to Helstrom's symmetric variant in which these relations are obtained only for pure states. A notion of canonical states is introduced and the low...
Uncertainties and re-analysis of glacier mass balance measurements
Zemp, M.; E. Thibert; Huss, M.; Stumm, D.; Rolstad Denby, C.; Nuth, C.; S. U. Nussbaumer; G. Moholdt; A. Mercer; Mayer, C.; Joerg, P. C.; P. Jansson; B. Hynek; Fischer, A.; Escher-Vetter, H.
2013-01-01
Glacier-wide mass balance has been measured for more than sixty years and is widely used as an indicator of climate change and to assess the glacier contribution to runoff and sea level rise. Until present, comprehensive uncertainty assessments have rarely been carried out and mass balance data have often been applied using rough error estimation or without error considerations. In this study, we propose a framework for re-analyzing glacier mass balance series including conceptual and ...
UNCERTAINTIES OF ANION AND TOC MEASUREMENTS AT THE DWPF LABORATORY
Edwards, T.
2011-04-07
The Savannah River Remediation (SRR) Defense Waste Processing Facility (DWPF) has identified a technical issue related to the amount of antifoam added to the Chemical Process Cell (CPC). Specifically, due to the long duration of the concentration and reflux cycles for the Sludge Receipt and Adjustment Tank (SRAT), additional antifoam has been required. The additional antifoam has been found to impact the melter flammability analysis as an additional source of carbon and hydrogen. To better understand and control the carbon and hydrogen contributors to the melter flammability analysis, SRR's Waste Solidification Engineering (WSE) has requested, via a Technical Task Request (TTR), that the Savannah River National Laboratory (SRNL) conduct an error evaluation of the measurements of key Slurry Mix Evaporator (SME) anions. SRNL issued a Task Technical and Quality Assurance Plan (TTQAP) [2] in response to that request, and the work reported here was conducted under the auspices of that TTQAP. The TTR instructs SRNL to conduct an error evaluation of anion measurements generated by the DWPF Laboratory using Ion Chromatography (IC) performed on SME samples. The anions of interest include nitrate, oxalate, and formate. Recent measurements of SME samples for these anions as well as measurements of total organic carbon (TOC) were provided to SRNL by DWPF Laboratory Operations (Lab OPS) personnel for this evaluation. This work was closely coordinated with the efforts of others within SRNL that are investigating the Chemical Process Cell (CPC) contributions to the melter flammability. The objective of that investigation was to develop a more comprehensive melter flammability control strategy that when implemented in DWPF will rely on process measurements. Accounting for the uncertainty of the measurements is necessary for successful implementation. The error evaluations conducted as part of this task will facilitate the integration of appropriate uncertainties for the
Measurement uncertainty. A practical guide for Secondary Standards Dosimetry Laboratories
The need for international traceability for radiation dose measurements has been understood since the early nineteen-sixties. The benefits of high dosimetric accuracy were recognized, particularly in radiotherapy, where the outcome of treatments is dependent on the radiation dose delivered to patients. When considering radiation protection dosimetry, the uncertainty may be greater than for therapy, but proper traceability of the measurements is no less important. To ensure harmonization and consistency in radiation measurements, the International Atomic Energy Agency (IAEA) and the World Health Organization (WHO) created a Network of Secondary Standards Dosimetry Laboratories (SSDLs) in 1976. An SSDL is a laboratory that has been designated by the competent national authorities to undertake the duty of providing the necessary link in the traceability chain of radiation dosimetry to the international measurement system (SI, for Systeme International) for radiation metrology users. The role of the SSDLs is crucial in providing traceable calibrations; they disseminate calibrations at specific radiation qualities appropriate for the use of radiation measuring instruments. Historically, although the first SSDLs were established mainly to provide radiotherapy level calibrations, the scope of their work has expanded over the years. Today, many SSDLs provide traceability for radiation protection measurements and diagnostic radiology in addition to radiotherapy. Some SSDLs, with the appropriate facilities and expertise, also conduct quality audits of the clinical use of the calibrated dosimeters - for example, by providing postal dosimeters for dose comparisons for medical institutions or on-site dosimetry audits with an ion chamber and other appropriate equipment. The requirements for traceable and reliable calibrations are becoming more important. For example, for international trade where radiation products are manufactured within strict quality control systems, it is
Measurement of nuclear activity with Ge detectors and its uncertainty
The objective of this work is to analyse the influence quantities that affect the activity measurement of isolated gamma-emitting radioactive sources prepared by the gravimetric method, and to determine the uncertainty of such a measurement when it is carried out with a gamma spectrometry system with a germanium detector. The work is developed in five chapters. The first, named Basic principles, gives a brief description of the meaning of the word measurement and its implications, and presents the concepts used in this work. The second chapter describes the gravimetric method used to manufacture the isolated gamma-emitting radioactive sources, addresses the problem of determining the main influence quantities that affect the measurement of their activity, and derives the respective correction factors and their uncertainties. The third chapter describes the gamma spectrometry system used in this work for the measurement of the activity of isolated sources, as well as its performance and the experimental arrangement employed. In the fourth chapter the three previous items are applied to determine the uncertainty that would be obtained in the measurement of an isolated radioactive source prepared by the gravimetric method under the least favourable experimental conditions predicted from the results of chapter two. The conclusions are presented in the fifth chapter and are applied to establish the optimum conditions for measuring the activity of an isolated gamma-emitting radioactive source with a germanium-detector spectrometer. (Author)
Using measurement uncertainty in decision-making and conformity assessment
Pendrill, L. R.
2014-08-01
Measurements often provide an objective basis for making decisions, perhaps when assessing whether a product conforms to requirements or whether one set of measurements differs significantly from another. There is increasing appreciation of the need to account for the role of measurement uncertainty when making decisions, so that a ‘fit-for-purpose’ level of measurement effort can be set prior to performing a given task. Better mutual understanding between the metrologist and those ordering such tasks about the significance and limitations of the measurements when making decisions of conformance will be especially useful. Decisions of conformity are, however, currently made in many important application areas, such as when addressing the grand challenges (energy, health, etc), without a clear and harmonized basis for sharing the risks that arise from measurement uncertainty between the consumer, supplier and third parties. In reviewing, in this paper, the state of the art of the use of uncertainty evaluation in conformity assessment and decision-making, two aspects in particular—the handling of qualitative observations and of impact—are considered key to bringing more order to the present diverse rules of thumb of more or less arbitrary limits on measurement uncertainty and percentage risk in the field. (i) Decisions of conformity can be made on a more or less quantitative basis—referred in statistical acceptance sampling as by ‘variable’ or by ‘attribute’ (i.e. go/no-go decisions)—depending on the resources available or indeed whether a full quantitative judgment is needed or not. There is, therefore, an intimate relation between decision-making, relating objects to each other in terms of comparative or merely qualitative concepts, and nominal and ordinal properties. (ii) Adding measures of impact, such as the costs of incorrect decisions, can give more objective and more readily appreciated bases for decisions for all parties concerned. Such
Measuring uncertainty in modeling toxic concentrations in the Niagara River
Franceschini, S.; Tsai, C.
2004-12-01
degree of accuracy in the estimation of the first few statistical moments of a model output distribution. Furthermore, the probabilistic analysis can be used as a more rigorous method to compare the modeled results with established water quality criteria. In this study, the toxic concentrations computed at the end of the Niagara River and their estimated variability will be compared with field data measurements. The purpose of this comparison is two-fold: (a) to evaluate the accuracy of the Modified Rosenblueth method in measuring the uncertainty of toxic concentration in the Niagara River and (b) to quantify the risk of exceeding established water quality standards when such uncertainty is accounted for.
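The point-estimate idea behind the Modified Rosenblueth method, recovering the first statistical moments of a model output from a handful of deterministic runs, can be illustrated with the classical two-point scheme. This is a sketch for uncorrelated, symmetric inputs, and the toy first-order decay model is a hypothetical stand-in for the river transport model:

```python
import itertools
import numpy as np

def rosenblueth_moments(model, means, stds):
    """Two-point estimate method: evaluate the model at every mu_i +/- sigma_i
    corner (2^n points, equal weights for uncorrelated symmetric inputs) and
    form the first two moments of the output from those values."""
    n = len(means)
    vals = np.array([
        model(np.array(means) + np.array(signs) * np.array(stds))
        for signs in itertools.product([-1.0, 1.0], repeat=n)
    ])
    mean = vals.mean()
    var = (vals ** 2).mean() - mean ** 2
    return mean, np.sqrt(var)

# toy decaying toxic load: C = C0 * exp(-k * t), evaluated at t = 2 days
model = lambda x: x[0] * np.exp(-x[1] * 2.0)
mean, std = rosenblueth_moments(model, means=[10.0, 0.5], stds=[1.0, 0.1])
print(f"C(2 d): mean = {mean:.2f}, std = {std:.2f}")
```

With the output mean and standard deviation in hand, the probability of exceeding a water quality standard can then be estimated under an assumed output distribution, which is the comparison the abstract describes.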
Measurement Uncertainty Analysis of the Strain Gauge Based Stabilographic Platform
Walendziuk Wojciech
2014-08-01
The present article describes the construction of a stabilographic platform which records a standing patient's deflection from their point of balance. The constructed device is composed of a toughened glass slab supported by four force sensors. The transducers are connected to a measurement system based on a 24-bit ADC, which acquires the slight body movements of a patient. The data are then transferred to the computer in real time, where data analysis is conducted. The article explains the principle of operation as well as the algorithm of measurement uncertainty for the centre of pressure (COP) surface coordinates (x, y).
Object-oriented software for evaluating measurement uncertainty
An earlier publication (Hall 2006 Metrologia 43 L56–61) introduced the notion of an uncertain number that can be used in data processing to represent quantity estimates with associated uncertainty. The approach can be automated, allowing data processing algorithms to be decomposed into convenient steps, so that complicated measurement procedures can be handled. This paper illustrates the uncertain-number approach using several simple measurement scenarios and two different software tools. One is an extension library for Microsoft Excel®. The other is a special-purpose calculator using the Python programming language. (paper)
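The uncertain-number idea can be illustrated with a minimal sketch: a value paired with a standard uncertainty that propagates through arithmetic by first-order (GUM) rules. This is an illustration of the concept only, assuming uncorrelated inputs; it is not the API of the Excel extension or the Python calculator described in the paper:

```python
import math

class UN:
    """Minimal uncertain number: value x with standard uncertainty u.
    First-order (GUM) propagation; inputs assumed uncorrelated."""
    def __init__(self, x, u):
        self.x, self.u = x, u
    def __add__(self, other):
        # absolute uncertainties add in quadrature for a sum
        return UN(self.x + other.x, math.hypot(self.u, other.u))
    def __mul__(self, other):
        # relative uncertainties add in quadrature for a product
        x = self.x * other.x
        return UN(x, abs(x) * math.hypot(self.u / self.x, other.u / other.x))
    def __repr__(self):
        return f"{self.x:.4g} +/- {self.u:.2g}"

# V = I * R with I = 2.00(2) A and R = 5.00(5) ohm: two 1% inputs
V = UN(2.00, 0.02) * UN(5.00, 0.05)
print(V)
```

Decomposing a longer data-processing algorithm into such steps, with the uncertainty carried along automatically at each step, is the automation the paper refers to.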
Measurement Uncertainty Estimation of a Robust Photometer Circuit
Jesús de Vicente
2009-04-01
In this paper the uncertainty of a robust photometer circuit (RPC) was estimated. Here, the RPC was considered as a measurement system, having input quantities that were inexactly known and output quantities that consequently were also inexactly known. Input quantities represent information obtained from calibration certificates, manufacturers' specifications, and tabulated data. Output quantities describe the transfer function of the electrical part of the photodiode. The input quantities were the electronic components of the RPC, the parameters of the photodiode model, and its sensitivity at 670 nm. The output quantities were the coefficients of both the numerator and the denominator of the closed-loop transfer function of the RPC. As an example, the gain and phase shift of the RPC versus frequency were evaluated from the transfer function, with their uncertainties and correlation coefficient. The results confirm the robustness of the photodiode design.
Measurement uncertainties in regression analysis with scarcity of data
The evaluation of measurement uncertainty, in certain fields of science, faces the problem of scarcity of data. This is certainly the case in the testing of geological soils in civil engineering, where tests can take several days or weeks and where the same sample is not available for further testing, being destroyed during the experiment. In this particular study, attention will be paid to triaxial compression tests used to typify particular soils. The purpose of the testing is to determine two parameters that characterize the soil, namely, cohesion and friction angle. These parameters are defined in terms of the intercept and slope of a straight line fitted to a small number of points (usually three) derived from experimental data. The use of ordinary least squares to obtain uncertainties associated with estimates of the two parameters would be unreliable if there were only three points (and no replicates) and hence only one degree of freedom.
Romdhani, Fekria; Hennebelle, François; Ge, Min; Juillion, Patrick; Coquet, Richard; François Fontaine, Jean
2014-12-01
Articulated Arm Coordinate Measuring Machines (AACMMs) have gradually evolved and are increasingly used in the mechanical industry. At present, measurement uncertainties relating to the use of these devices are not yet well quantified. The work carried out consists of determining the measurement uncertainties of a mechanical part measured by an AACMM. The studies aiming to develop a model of measurement uncertainty are based on the Monte Carlo method described in Supplement 1 of the Guide to the Expression of Uncertainty in Measurement [1], but also on identifying and characterizing the main sources of uncertainty. A multi-level Monte Carlo approach has been developed which characterizes the possible evolution of the AACMM during the measurement and, at a second level, quantifies the uncertainty of the considered measurand. The first Monte Carlo level is the most complex and is thus divided into three sub-levels, namely characterization of the positioning error of a point, estimation of calibration errors, and evaluation of fluctuations of the 'localization point'. The global method is presented, and the results of the first sub-level are developed in particular. The main sources of uncertainty, including AACMM deformations, are identified.
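The GUM Supplement 1 idea underlying this approach, propagating input distributions through the measurement model by Monte Carlo sampling, can be sketched for a single level. The point-coordinate error model and all numerical values below are hypothetical, not the AACMM model developed by the authors:

```python
import numpy as np

rng = np.random.default_rng(2)
N = 100_000

# Measurand: distance between two probed points, with hypothetical per-axis
# positioning errors (normal) and a small uniform calibration offset (mm).
p1 = np.array([0.0, 0.0, 0.0]) + rng.normal(0, 0.02, (N, 3))
p2 = np.array([100.0, 0.0, 0.0]) + rng.normal(0, 0.02, (N, 3))
cal = rng.uniform(-0.01, 0.01, N)

# Propagate every draw through the measurement model; the spread of the
# resulting sample is the standard uncertainty of the measurand.
d = np.linalg.norm(p2 - p1, axis=1) + cal
print(f"d = {d.mean():.3f} mm, u(d) = {d.std(ddof=1):.3f} mm")
```

In the multi-level variant described in the abstract, the output sample of one such level (e.g., positioning error of a point) feeds the input distributions of the next.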
A comparison of protocols and observer precision for measuring physical stream attributes
Whitacre, H.W.; Roper, B.B.; Kershner, J.L.
2007-01-01
Stream monitoring programs commonly measure physical attributes to assess the effect of land management on stream habitat. Variability associated with the measurement of these attributes has been linked to a number of factors, but few studies have evaluated variability due to differences in protocols. We compared six protocols, five used by the U.S. Department of Agriculture Forest Service and one by the U.S. Environmental Protection Agency, on six streams in Oregon and Idaho to determine whether differences in protocol affect values for 10 physical stream attributes. Results from Oregon and Idaho were combined for groups participating in both states, with significant differences in attribute means for 9 out of the 10 stream attributes. Significant differences occurred in 5 of 10 in Idaho, and 10 of 10 in Oregon. Coefficients of variation, signal-to-noise ratio, and root mean square error were used to evaluate measurement precision. There were differences among protocols for all attributes when states were analyzed separately and as a combined dataset. Measurement differences were influenced by choice of instruments, measurement method, measurement location, attribute definitions, and training approach. Comparison of data gathered by observers using different protocols will be difficult unless a core set of protocols for commonly measured stream attributes can be standardized among monitoring programs.
Estimation of measuring uncertainty for optical micro-coordinate measuring machine
Kang Song(宋康); Zhuangde Jiang(蒋庄德)
2004-01-01
Based on the evaluation principle of the measuring uncertainty of a traditional coordinate measuring machine (CMM), an analysis and evaluation of the measuring uncertainty of an optical micro-CMM has been made. The optical micro-CMM is an integrated measuring system with optical, mechanical, and electronic components, each of which may influence its measuring uncertainty. If the influence of laser speckle is taken into account, the longitudinal measuring uncertainty is 2.0 μm; otherwise it is 0.88 μm. The estimation of the combined uncertainty for the optical micro-CMM is shown to be correct and reliable by measuring standard reference materials and simulating the influence of the laser-beam diameter. Based on Heisenberg's uncertainty principle and quantum mechanics, a method for improving the measuring accuracy of the optical micro-CMM by adding a diaphragm at the receiving end of the light path is proposed, and the measuring results are verified by experiments.
TOTAL MEASUREMENT UNCERTAINTY IN HOLDUP MEASUREMENTS AT THE PLUTONIUM FINISHING PLANT (PFP)
An approach to determine the total measurement uncertainty (TMU) associated with Generalized Geometry Holdup (GGH) [1,2,3] measurements was developed and implemented in 2004 and 2005 [4]. This paper describes a condensed version of the TMU calculational model, including recent developments. Recent modifications to the TMU calculation model include a change in the attenuation uncertainty, clarifying the definition of the forward background uncertainty, reducing conservatism in the random uncertainty by selecting either a propagation of counting statistics or the standard deviation of the mean, and considering uncertainty in the width and height as a part of the self-attenuation uncertainty. In addition, a detection limit is calculated for point sources using equations derived from summary equations contained in Chapter 20 of MARLAP [5]. The Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 2007-1 to the Secretary of Energy identified a lack of requirements and a lack of standardization for performing measurements across the U.S. Department of Energy (DOE) complex. The DNFSB also recommended that guidance be developed for a consistent application of uncertainty values. As such, the recent modifications to the TMU calculational model described in this paper have not yet been implemented. The Plutonium Finishing Plant (PFP) is continuing to perform uncertainty calculations as per Reference 4. Publication at this time is intended so that these concepts can be considered in developing a consensus methodology across the complex
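Two ingredients named in this abstract can be given a minimal sketch: combining independent relative uncertainty components in quadrature into a total measurement uncertainty, and a Currie-style detection limit of the kind derived from summary equations in MARLAP Chapter 20. All component values and the background count are hypothetical:

```python
import math

# Hypothetical component uncertainties (relative, 1-sigma) for one item
components = {
    "calibration": 0.05,
    "attenuation": 0.08,
    "self_attenuation": 0.06,
    "forward_background": 0.03,
    "counting_statistics": 0.04,
}

# Independent components combine in quadrature into the total measurement
# uncertainty (the usual propagation assumption; correlations would add terms)
tmu = math.sqrt(sum(u * u for u in components.values()))
print(f"relative TMU = {tmu:.3f}")

# Currie-style critical level and detection limit from a paired-blank
# background of B counts, at the conventional k = 1.645 (5% error rates)
B = 400.0
Lc = 1.645 * math.sqrt(2.0 * B)   # decision threshold (counts)
Ld = 2.706 + 2.0 * Lc             # approximate detection limit (counts)
print(f"Lc = {Lc:.1f} counts, Ld = {Ld:.1f} counts")
```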
Measurement of nuclear activity with Ge detectors and its uncertainty
Cortes, C A P
1999-01-01
Applying the Implicit Association Test to Measure Intolerance of Uncertainty.
Mosca, Oriana; Dentale, Francesco; Lauriola, Marco; Leone, Luigi
2016-08-01
Intolerance of Uncertainty (IU) is a key trans-diagnostic personality construct strongly associated with anxiety symptoms. Traditionally, IU is measured through self-report measures that are prone to bias effects due to impression-management concerns and introspective difficulties. Moreover, self-report scales are not able to capture the automatic associations that are assumed to be the main determinants of several spontaneous responses (e.g., emotional reactions). In order to overcome these limitations, the Implicit Association Test (IAT) was applied to measure IU, with a particular focus on reliability and criterion validity issues. The IU-IAT and the Intolerance of Uncertainty Inventory (IUI) were administered to an undergraduate student sample (54 females and 10 males) with a mean age of 23 years (SD = 1.7). Subsequently, participants were asked to provide an individually chosen uncertain event from their own lives that might occur in the future and were requested to identify a number of potential negative consequences of it. Participants' responses in terms of cognitive thoughts (i.e., cognitive appraisal) and worry reactions toward these events were assessed using the two subscales of the Worry and Intolerance of Uncertainty Beliefs Questionnaire. The IU-IAT showed an adequate level of internal consistency and a nonsignificant correlation with the IUI. A path analysis model, accounting for 35% of event-related worry, revealed that the IUI had a significant indirect effect on the dependent variable through event-related IU thoughts. By contrast, as expected, the IU-IAT predicted event-related worry independently of IU thoughts. In accordance with dual models of social cognition, these findings suggest that IU can influence event-related worry through two different processing pathways (automatic vs. deliberative), supporting the criterion and construct validity of the IU-IAT. The potential role of the IU-IAT in clinical applications is discussed. PMID:27451266
Lidar Uncertainty Measurement Experiment (LUMEX) - Understanding Sampling Errors
Choukulkar, A.; Brewer, W. A.; Banta, R. M.; Hardesty, M.; Pichugina, Y.; Senff, Christoph; Sandberg, S.; Weickmann, A.; Carroll, B.; Delgado, R.; Muschinski, A.
2016-06-01
Coherent Doppler LIDAR (Light Detection and Ranging) has been widely used to provide measurements of several boundary layer parameters such as profiles of wind speed, wind direction, vertical velocity statistics, mixing layer heights and turbulent kinetic energy (TKE). An important aspect of providing this wide range of meteorological data is to properly characterize the uncertainty associated with these measurements. With the above intent in mind, the Lidar Uncertainty Measurement Experiment (LUMEX) was conducted at Erie, Colorado during the period June 23rd to July 13th, 2014. The major goals of this experiment were the following:
- Characterize sampling error for vertical velocity statistics
- Analyze sensitivities of different Doppler lidar systems
- Compare various single and dual Doppler retrieval techniques
- Characterize the error of spatial representativeness for separation distances up to 3 km
- Validate turbulence analysis techniques and retrievals from Doppler lidars
This experiment brought together 5 Doppler lidars, both commercial and research grade, for a period of three weeks for a comprehensive intercomparison study. The Doppler lidars were deployed at the Boulder Atmospheric Observatory (BAO) site in Erie, site of a 300 m meteorological tower. This tower was instrumented with six sonic anemometers at levels from 50 m to 300 m with 50 m vertical spacing. A brief overview of the experiment outline and deployment will be presented. Results from the sampling error analysis and its implications on scanning strategy will be discussed.
Measures of uncertainty, importance and sensitivity of the SEDA code
The purpose of this work is to estimate the uncertainty in the results of the SEDA code (Sistema de Evaluacion de Dosis en Accidentes) arising from its input data and parameters. The SEDA code has been developed by the Comision Nacional de Energia Atomica for the estimation of doses during emergencies in the vicinity of the Atucha and Embalse nuclear power plants. The user feeds the code with meteorological data, source terms and accident data (timing involved, release height, thermal content of the release, etc.). It is designed to be used during the emergency and to deliver fast results that support decision-making. The uncertainty in the results of the SEDA code is quantified in the present paper. This uncertainty is associated both with the data the user supplies to the code and with the uncertain parameters of the code's own models. The method used consisted of the statistical characterization of the parameters and variables, assigning them adequate probability distributions. These distributions were sampled with the Latin Hypercube Sampling method, a stratified multi-variable Monte Carlo technique. The code was run for each of the samples and, finally, a sample of results was obtained. These results were characterized statistically (mean, most probable value, distribution shape, etc.) for several distances from the source. Finally, the Partial Correlation Coefficient and Standardized Regression Coefficient techniques were used to obtain the relative importance of each input variable and the sensitivity of the code to its variations. The measures of importance and sensitivity were obtained for several distances from the source and various cases of atmospheric stability, making comparisons possible. This work makes it possible to place confidence in the results of the code and to associate their uncertainty with them, as a way of knowing the limits within which the results can vary in a real
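The sampling-and-regression machinery described here, Latin Hypercube Sampling of the inputs followed by standardized regression coefficients as importance measures, can be sketched as follows. The plume-like dose model and the parameter ranges are hypothetical stand-ins for the SEDA models:

```python
import numpy as np

rng = np.random.default_rng(3)
N, k = 500, 3

# Latin Hypercube Sampling: one draw per equal-probability stratum,
# with the strata independently shuffled for each variable
u = (rng.permuted(np.tile(np.arange(N), (k, 1)), axis=1).T
     + rng.uniform(size=(N, k))) / N

# map the unit hypercube to (hypothetical) dispersion inputs
wind = 1.0 + 9.0 * u[:, 0]        # wind speed, m/s
height = 10.0 + 90.0 * u[:, 1]    # release height, m
source = 1e2 * u[:, 2]            # source term, arbitrary units

# toy plume-like dose response at a fixed distance
dose = source / (wind * np.sqrt(height))

# standardized regression coefficients as an importance measure
X = np.column_stack([wind, height, source])
Xs = (X - X.mean(0)) / X.std(0)
ys = (dose - dose.mean()) / dose.std()
coef, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
for name, c in zip(["wind", "height", "source"], coef):
    print(f"SRC({name}) = {c:+.2f}")
```

The signs and magnitudes of the coefficients rank the inputs: here the source term dominates, with wind speed and release height acting to dilute the dose.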
Range and number-of-levels effects in derived and stated measures of attribute importance
Verlegh, PWJ; Schifferstein, HNJ; Wittink, DR
2002-01-01
We study how the range of variation and the number of attribute levels affect five measures of attribute importance: full-profile conjoint estimates, ranges in attribute-level attractiveness ratings, regression coefficients, graded paired comparisons, and self-reported ratings. We find that all impor
O'Connor Daniel P
2011-07-01
Background: Physical activity (PA) adoption is essential for obesity prevention and control, yet ethnic-minority women report lower levels of PA and are at higher risk for obesity and its comorbidities compared with Caucasians. Epidemiological studies and ecologic models of health behavior suggest that built-environment factors are associated with health behaviors like PA, but few studies have examined the association between built-environment attribute concordance and PA, and no known studies have examined attribute concordance and PA adoption. Purpose: The purpose of this study was to associate the degree of concordance between directly and indirectly measured built-environment attributes with changes in PA over time among African American and Hispanic Latina women participating in a PA intervention. Method: Women (N = 410) completed measures of PA at Time 1 (T1) and Time 2 (T2); environmental data collected at T1 were used to compute concordance between directly and indirectly measured built-environment attributes. The association between changes in PA and the degree of concordance between each directly and indirectly measured environmental attribute was assessed using repeated-measures analyses. Results: There were no significant associations between built-environment attribute concordance values and change in self-reported or objectively measured PA. Self-reported PA significantly increased over time (F(1,184) = 7.82, p = .006), but this increase did not vary by ethnicity or any built-environment attribute concordance variable. Conclusions: Built-environment attribute concordance may not be associated with PA changes over time among minority women. In an effort to promote PA, investigators should clarify which specific built-environment attributes are important for PA adoption and whether accurate perceptions of these attributes are necessary, particularly among the vulnerable population of minority women.
The study realised includes several phases: the delimitation of the field of the study, the identification of the paramount parameters, the determination of the variations intervals of the paramount parameters, the analysis of the sensitivity and finally the analysis of uncertainty. (N.C.)
SI2N overview paper: ozone profile measurements: techniques, uncertainties and availability
Hassler, B.; Petropavlovskikh, I.; Staehelin, J.; August, T.; Bhartia, P. K.; Clerbaux, C.; Degenstein, D.; De Mazière, M.; Dinelli, B. M.; Dudhia, A.; Dufour, G.; Frith, S. M.; Froidevaux, L.; Godin-Beekmann, S.; Granville, J.; Harris, N. R. P.; Hoppel, K.; Hubert, D.; Kasai, Y.; Kurylo, M. J.; Kyrölä, E.; Lambert, J.-C.; Levelt, P. F.; McElroy, C. T.; McPeters, R. D.; Munro, R.; Nakajima, H.; Parrish, A.; Raspollini, P.; Remsberg, E. E.; Rosenlof, K. H.; Rozanov, A.; Sano, T.; Sasano, Y.; Shiotani, M.; Smit, H. G. J.; Stiller, G.; Tamminen, J.; Tarasick, D. W.; Urban, J.; van der A, R. J.; Veefkind, J. P.; Vigouroux, C.; von Clarmann, T.; von Savigny, C.; Walker, K. A.; Weber, M.; Wild, J.; Zawodny, J.
2013-11-01
Peak stratospheric chlorofluorocarbon (CFC) and other ozone depleting substance (ODS) concentrations were reached in the mid- to late 1990s. Detection and attribution of the expected recovery of the stratospheric ozone layer in an atmosphere with reduced ODSs, as well as efforts to understand the evolution of stratospheric ozone in the presence of increasing greenhouse gases, are key current research topics. These require a critical examination of the ozone changes with an accurate knowledge of the spatial (geographical and vertical) and temporal ozone response. For such an examination, it is vital that the quality of the measurements used be as high as possible and the measurement uncertainties well quantified. In preparation for the 2014 United Nations Environment Programme (UNEP)/World Meteorological Organization (WMO) Scientific Assessment of Ozone Depletion, the SPARC/IO3C/IGACO-O3/NDACC (SI2N) initiative was designed to study and document changes in the global ozone profile distribution. This requires assessing long-term ozone profile data sets with regard to measurement stability and uncertainty characteristics. The ultimate goal is to establish their suitability for estimating long-term ozone trends to contribute to ozone recovery studies. Some of the data sets have been improved as part of this initiative, with updated versions now available. This summary presents an overview of stratospheric ozone profile measurement data sets (ground- and satellite-based) available for ozone recovery studies. Here we document measurement techniques, spatial and temporal coverage, vertical resolution, native units and measurement uncertainties. In addition, the latest data versions are briefly described (including data version updates as well as detailing multiple retrievals when available for a given satellite instrument). Archive location information for each data set is also given.
Leff, Stephen S.; Lefler, Elizabeth K.; Khera, Gagan S.; Paskewich, Brooke; Jawad, Abbas F.
2014-01-01
The current study illustrates how researchers developed and validated a cartoon-based adaptation of a written hostile attributional bias measure for a sample of urban, low-income, African American boys. A series of studies were conducted to develop cartoon illustrations to accompany a standard written hostile attributional bias vignette measure (Study 1), to determine initial psychometric properties (Study 2) and acceptability (Study 3), and to conduct a test-retest reliability trial of the adapted measure in a separate sample (Study 4). These studies utilize a participatory action research approach to measurement design and adaptation, and suggest that collaborations between researchers and key school stakeholders can lead to measures that are psychometrically strong, developmentally appropriate, and culturally sensitive. In addition, the cartoon-based hostile attributional bias measure appears to have promise as an assessment and/or outcome measure for aggression and bullying prevention programs conducted with urban African American boys. PMID:21800228
The neutron activation technique is routinely used in fusion experiments to measure neutron yields. This paper investigates the uncertainty of these measurements arising from the uncertainties in dosimetry and activation reactions. For this purpose, activation cross sections were taken from the International Reactor Dosimetry and Fusion File (IRDFF-v1.05) in 640-group ENDF-6 format for several reactions of interest for both 2.5 and 14 MeV neutrons. Activation coefficients (reaction rates) have been calculated using the neutron flux spectra at the JET vacuum vessel, for both DD and DT plasmas, calculated by MCNP in the required 640-energy-group format. The related uncertainties for the JET neutron spectra are evaluated as well, using the covariance data available in the library. These uncertainties are in general small, but not negligible when high accuracy is required in the determination of the fusion neutron yields.
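As a rough illustration of how a reaction rate and its cross-section-driven uncertainty can be combined, here is a minimal NumPy sketch. The three-group fluxes, cross sections and covariance values below are invented for illustration; the actual work uses 640-group IRDFF data.

```python
import numpy as np

# Hypothetical 3-group illustration (the real analysis uses 640 groups).
phi = np.array([1.0e10, 5.0e9, 2.0e9])        # group fluxes [n/cm^2/s]
sigma = np.array([1.2e-24, 3.4e-24, 8.9e-24])  # group cross sections [cm^2]
# Assumed relative covariance matrix of the cross sections.
rel_cov = np.array([[0.04, 0.01, 0.00],
                    [0.01, 0.03, 0.01],
                    [0.00, 0.01, 0.05]])

rate = phi @ sigma                      # reaction rate per target atom [1/s]
cov = rel_cov * np.outer(sigma, sigma)  # absolute covariance of sigma
var = phi @ cov @ phi                   # variance of the rate
rel_unc = np.sqrt(var) / rate           # relative standard uncertainty
```

The quadratic form `phi @ cov @ phi` is the standard sandwich rule for a rate that is linear in the group cross sections; with real library covariances the structure is the same, only larger.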
Reconsideration of the uncertainty relations and quantum measurements
Dumitru, Spiridon
2012-01-01
Discussions on uncertainty relations (UR) and quantum measurements (QMS) persist to this day in publications about quantum mechanics (QM). They originate mainly from the conventional interpretation of UR (CIUR). Most of the QM literature underestimates the fact that, over the years, many deficiencies of CIUR have been signaled. As a rule, the alluded deficiencies were noted disparately and discussed as isolated, non-essential questions. Here we investigate the mentioned deficiencies collected into a conclusive ensemble. Subsequently we present a reconsideration of the major problems referring to UR and QMS. We reveal that all the basic presumptions of CIUR are troubled by insurmountable deficiencies which entail the indubitable failure of CIUR and its necessary abandonment. Therefore the UR must be deprived of their status as crucial pieces of physics. So, the original versions of UR appear as being either (i) thought-experimental fictions or (ii)...
This paper presents continuous and discrete equations for the propagation of uncertainty applied to inverse kinetics and shows that the uncertainty of a measurement can be minimized by the proper choice of frequency from the perturbing reactivity waveform. (authors)
Physics and Operational Research: measure of uncertainty via Nonlinear Programming
Davizon-Castillo, Yasser A.
2008-03-01
Physics and Operational Research present an interdisciplinary interaction in problems such as Quantum Mechanics, Classical Mechanics and Statistical Mechanics. The nonlinear nature of the physical phenomena in single-well and double-well quantum systems is resolved via Nonlinear Programming (NLP) techniques (Kuhn-Tucker conditions, Dynamic Programming), subject to the Heisenberg Uncertainty Principle and an extended equality uncertainty relation that exploits the NLP Lagrangian method. This review addresses problems in Kinematics and Thermal Physics, developing uncertainty relations for each case of study through a novel way to quantify uncertainty.
Including uncertainty in hazard analysis through fuzzy measures
This paper presents a method for capturing the uncertainty expressed by a Hazard Analysis (HA) expert team when estimating the frequencies and consequences of accident sequences, and provides a sound mathematical framework for propagating this uncertainty to the risk estimates for these accident sequences. The uncertainty is readily expressed as distributions that can visually aid the analyst in determining the extent and source of risk uncertainty in HA accident sequences. The results can also be expressed as single statistics of the distribution, in a manner analogous to expressing a probabilistic distribution as a point-value statistic such as a mean or median. The study discussed here used data collected during the elicitation portion of an HA on a high-level waste transfer process to demonstrate the techniques for capturing uncertainty. These data came from observations of the uncertainty that HA team members expressed in assigning frequencies and consequences to accident sequences during an actual HA. This uncertainty was captured and manipulated using ideas from possibility theory. The result of this study is a practical method for displaying and assessing the uncertainty in the HA team estimates of the frequency and consequences of accident sequences. This uncertainty provides potentially valuable information about accident sequences that is typically lost in the HA process.
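A minimal sketch of the possibility-theory style of propagation mentioned above, using alpha-cuts of triangular fuzzy numbers. The model risk = frequency x consequence and all numbers below are assumptions for illustration, not values from the study.

```python
# Alpha-cut propagation of triangular possibility distributions through
# risk = frequency * consequence. All numbers are illustrative.

def alpha_cut(tri, alpha):
    """Interval of a triangular fuzzy number (low, mode, high) at level alpha."""
    low, mode, high = tri
    return (low + alpha * (mode - low), high - alpha * (high - mode))

freq = (1e-4, 1e-3, 1e-2)    # expert's fuzzy accident frequency [1/yr]
cons = (10.0, 50.0, 200.0)   # expert's fuzzy consequence

for alpha in (0.0, 0.5, 1.0):
    f_lo, f_hi = alpha_cut(freq, alpha)
    c_lo, c_hi = alpha_cut(cons, alpha)
    # Both inputs are positive, so the product interval is endpoint-wise.
    risk_lo, risk_hi = f_lo * c_lo, f_hi * c_hi
```

At alpha = 1 the interval collapses to the product of the modes; lower alpha levels give progressively wider risk intervals, which is the distributional picture the paper describes.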
This paper is intended to identify the uncertainties of activities in environmental samples measured with gamma-ray spectrometry that result from uncertainties in matrix composition, density and geometrical dimensions of the sample. For that purpose efficiencies were calculated for a wide range of environmental matrices such as fresh and ashed food samples, water samples and soil samples. Compositions were mainly taken from literature. Densities and geometry parameters were varied in a range occurring in practice. Considered energies cover a range from 46.5 keV to 2000 keV. Finally, a couple of recommendations in respect to gamma-ray spectrometric measurements of environmental samples are given. - Highlights: • Uncertainties of gamma-ray spectrometry measurements were assessed. • Efficiencies were calculated for a wide range of environmental matrices. • The effect of matrix compositions and density on efficiency was studied. • The effect of geometry parameters on efficiency was considered
Solecky, Eric; Archie, Chas; Sendelbach, Matthew; Fiege, Ron; Zaitz, Mary; Shneyder, Dmitriy; Strocchia-rivera, Carlos; Munoz, Andres; Rangarajan, Srinivasan; Muth, William; Brendler, Andrew; Banke, Bill; Schulz, Bernd; Hartig, Carsten; Hoeft, Jon-Tobias; Vaid, Alok; Kelling, Mark; Bunday, Benjamin; Allgair, John
2009-03-01
Ever shrinking measurement uncertainty requirements are difficult to achieve for a typical metrology toolset, especially over the entire expected life of the fleet. Many times, acceptable performance can be demonstrated during brief evaluation periods on a tool or two in the fleet. Over time and across the rest of the fleet, the most demanding processes often have measurement uncertainty concerns that prevent optimal process control, thereby limiting premium part yield, especially on the most aggressive technology nodes. Current metrology statistical process control (SPC) monitoring techniques focus on maintaining the performance of the fleet where toolset control chart limits are derived from a stable time period. These tools are prevented from measuring product when a statistical deviation is detected. Lastly, these charts are primarily concerned with daily fluctuations and do not consider the overall measurement uncertainty. It is possible that the control charts implemented for a given toolset suggest a healthy fleet while many of these demanding processes continue to suffer measurement uncertainty issues. This is especially true when extendibility is expected in a given generation of toolset. With this said, there is a need to continually improve the measurement uncertainty of the fleet until it can no longer meet the needed requirements at which point new technology needs to be entertained. This paper explores new methods in analyzing existing SPC monitor data to assess the measurement performance of the fleet and look for opportunities to drive improvements. Long term monitor data from a fleet of overlay and scatterometry tools will be analyzed. The paper also discusses using other methods besides SPC monitors to ensure the fleet stays matched; a set of SPC monitors provides a good baseline of fleet stability but it cannot represent all measurement scenarios happening in product recipes. 
The analyses presented deal with measurement uncertainty on non-measurement
Weight sensitivity measurement, analysis, and application in multi-attribute evaluation
Zhao, Yong; Huang, Chongyin; Chen, Yang
2013-11-01
Weights are used to measure relative importance of multiple attributes or objectives, which influence evaluation or decision results to a great degree. Thus, analyzing weight sensitivity is an important work for a multi-attribute evaluation or decision. A measuring method based on the inclined angle of two vectors is proposed in this paper in order to solve the weight sensitivity of a multi-attribute evaluation with isotonicity characteristic. This method uses the cosine of the inclined angle to measure the weight sensitivity based on preferences or preference combinations. Concepts of sensitivity space, degree, and angle are given, and the relevant measurement method is discussed and proved. Also, this method is used for the choice of the water environment protection projects in Heyuan City.
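The cosine-of-inclined-angle measure described in this record can be sketched in a few lines; the weight vectors below are hypothetical examples, not data from the Heyuan City study.

```python
import numpy as np

def weight_sensitivity(w_base, w_perturbed):
    """Cosine of the inclined angle between two weight vectors;
    values near 1 mean the evaluation is insensitive to the change."""
    w1 = np.asarray(w_base, dtype=float)
    w2 = np.asarray(w_perturbed, dtype=float)
    return float(w1 @ w2 / (np.linalg.norm(w1) * np.linalg.norm(w2)))

# Hypothetical baseline weights and a perturbed preference combination.
base = [0.4, 0.3, 0.2, 0.1]
tweak = [0.35, 0.35, 0.2, 0.1]
cos_angle = weight_sensitivity(base, tweak)
```

A small perturbation keeps the cosine close to 1; the sensitivity degree and angle the paper defines are simple transforms of this quantity.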
During a D and D or ER process, containers of radioactive waste are normally generated. The activity can commonly be determined by gamma spectroscopy, but frequently the measurement conditions are not conducive to precise sample-detector geometries, and usually the radioactive material is not in a homogeneous distribution. What is the best method to accurately assay these containers - sampling followed by laboratory analysis, or in-situ spectroscopy? What is the uncertainty of the final result? To help answer these questions, the Canberra tool ISOCS Uncertainty Estimator [IUE] was used to mathematically simulate and evaluate several different measurement scenarios and to estimate the uncertainty of the measurement and the sampling process. Several representative containers and source distributions were mathematically defined and evaluated to determine the in-situ measurement uncertainty due to the sample non-uniformity. In the first example, a typical field situation requiring the measurement of 200-liter drums was evaluated. A sensitivity analysis was done to show which parameters contributed the most to the uncertainty. Then an efficiency uncertainty calculation was performed. In the second example, a group of 200-liter drums with various types of non-homogeneous distributions was created, and then measurements were simulated with different detector arrangements to see how the uncertainty varied. In the third example, a truck filled with non-uniform soil was first measured with multiple in-situ detectors to determine the measurement uncertainty. Then composite samples were extracted and the sampling uncertainty computed for comparison to the field measurement uncertainty. (authors)
Dynamic risk measuring under model uncertainty: taking advantage of the hidden probability measure
Bion-Nadal, Jocelyne
2010-01-01
We study dynamic risk measures in a very general framework enabling to model uncertainty and processes with jumps. We previously showed the existence of a canonical equivalence class of probability measures hidden behind a given set of probability measures possibly non dominated. Taking advantage of this result, we exhibit a dual representation that completely characterizes the dynamic risk measure. We prove continuity and characterize time consistency. Then, we prove regularity for all processes associated to time consistent convex dynamic risk measures. We also study factorization through time for sublinear risk measures. Finally we consider examples (uncertain volatility and G-expectations).
Guitar Chords Classification Using Uncertainty Measurements of Frequency Bins
Jesus Guerrero-Turrubiates
2015-01-01
This paper presents a method to perform chord classification from recorded audio. The signal harmonics are obtained by using the Fast Fourier Transform, and timbral information is suppressed by spectral whitening. A multiple fundamental frequency estimation of whitened data is achieved by adding attenuated harmonics by a weighting function. This paper proposes a method that performs feature selection by using a thresholding of the uncertainty of all frequency bins. Those measurements under the threshold are removed from the signal in the frequency domain. This allows a reduction of 95.53% of the signal characteristics, and the other 4.47% of frequency bins are used as enhanced information for the classifier. An Artificial Neural Network was utilized to classify four types of chords: major, minor, major 7th, and minor 7th. Those, played in the twelve musical notes, give a total of 48 different chords. Two reference methods (based on Hidden Markov Models) were compared with the method proposed in this paper by having the same database for the evaluation test. In most of the performed tests, the proposed method achieved a reasonably high performance, with an accuracy of 93%.
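A toy sketch of the front end of such a pipeline: an FFT magnitude spectrum followed by threshold-based bin selection. The sampling rate, the synthetic three-note "chord", and the plain magnitude cut standing in for the paper's uncertainty measure are all assumptions made for this sketch.

```python
import numpy as np

fs = 8000                                  # sample rate [Hz], assumed
t = np.arange(0, 0.5, 1 / fs)
# Toy "chord": three sinusoids (roughly C major fundamentals) plus noise.
signal = (np.sin(2 * np.pi * 261.6 * t)
          + np.sin(2 * np.pi * 329.6 * t)
          + np.sin(2 * np.pi * 392.0 * t)
          + 0.1 * np.random.default_rng(0).normal(size=t.size))

spectrum = np.abs(np.fft.rfft(signal))
# Keep only bins whose magnitude clears a threshold; a simple magnitude
# cut stands in here for the paper's per-bin uncertainty measurement.
threshold = 0.05 * spectrum.max()
selected = spectrum >= threshold
reduction = 1.0 - selected.mean()          # fraction of bins discarded
```

As in the paper, the vast majority of bins fall below the threshold, so only a small set of peak bins would be passed on to a classifier.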
Reconsideration of the Uncertainty Relations and Quantum Measurements
Dumitru S.
2008-04-01
Discussions on uncertainty relations (UR) and quantum measurements (QMS) persist to this day in publications about quantum mechanics (QM). They originate mainly from the conventional interpretation of UR (CIUR). Most of the QM literature underestimates the fact that, over the years, many deficiencies of CIUR have been signaled. As a rule, the alluded deficiencies were noted disparately and discussed as isolated, non-essential questions. Here we investigate the mentioned deficiencies collected into a conclusive ensemble. Subsequently we present a reconsideration of the major problems referring to UR and QMS. We reveal that all the basic presumptions of CIUR are troubled by insurmountable deficiencies which entail the indubitable failure of CIUR and its necessary abandonment. Therefore the UR must be deprived of their status as crucial pieces of physics. So, the original versions of UR appear as being either (i) thought-experimental fictions or (ii) simple QM formulae, and any other versions of them have no connection with the QMS. Then the QMS must be viewed as an additional subject compared with the usual questions of QM. For a theoretical description of QMS we propose an information-transmission model, in which the quantum observables are considered as random variables. Our approach leads to natural solutions and simplifications for many problems regarding UR and QMS.
One of the most important aspects of quality assurance in any analytical activity is the estimation of measurement uncertainty. There is general agreement that 'the expression of the result of a measurement is not complete without specifying its associated uncertainty'. An analytical process is the mechanism for obtaining methodological information (measurand) about a material system (population). This implies the need to define the problem, to choose the methods for sampling and measurement, and to execute these activities properly in order to obtain the information. The result of a measurement is only an approximation or estimate of the value of the measurand, and it is complete only when accompanied by an estimate of the uncertainty of the analytical process. According to the 'Vocabulary of Basic and General Terms in Metrology', measurement uncertainty is the parameter, associated with the result of a measurement, that characterizes the dispersion of the values that could reasonably be attributed to the measurand; this parameter could be a standard deviation or a confidence interval. The uncertainty evaluation requires a detailed look at all possible sources, but not disproportionately so: a good estimate of the uncertainty can be made by concentrating efforts on the largest contributions. The key steps in determining the uncertainty of a measurement are: specification of the measurand; identification of the sources of uncertainty; quantification of the individual uncertainty components; calculation of the combined standard uncertainty; and reporting of the uncertainty.
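The key steps listed in this record can be illustrated with a minimal GUM-style example. The mass/volume model and every numeric value below are assumptions for illustration, not from the source.

```python
import math

# Measurand: a concentration y = m / v from a mass and a volume.
m, u_m = 100.0, 0.5        # mass [mg] and its standard uncertainty
v, u_v = 50.0, 0.25        # volume [mL] and its standard uncertainty

y = m / v                                  # measurand estimate [mg/mL]
# For a pure product/quotient model, the relative standard
# uncertainties of the inputs combine in quadrature.
rel_u = math.sqrt((u_m / m) ** 2 + (u_v / v) ** 2)
u_c = y * rel_u                            # combined standard uncertainty
U = 2.0 * u_c                              # expanded uncertainty, k = 2
```

The report would then quote y together with U (and the coverage factor k), which is exactly the "result plus uncertainty" completeness requirement the abstract insists on.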
The 1993 ISO Guide to the expression of uncertainty in measurement applied to NAA
Principles of the expression of uncertainty in measurements are briefly reviewed and special aspects of the uncertainty quantification in NAA are discussed in detail regarding the relative and k0-standardization in both modes of the technique, i.e., INAA and RNAA. A survey of uncertainty sources is presented and calculation of the combined uncertainty is demonstrated by an example of manganese determination in biological material by RNAA. (author)
Přečková Petra
2012-04-01
Background Narrative medical reports do not use standardized terminology and often bring insufficient information for statistical processing and medical decision making. Objectives of the paper are to propose a method for measuring diversity in medical reports written in any language, to compare diversities in narrative and structured medical reports and to map attributes and terms to selected classification systems. Methods A new method based on a general concept of f-diversity is proposed for measuring diversity of medical reports in any language. The method is based on categorized attributes recorded in narrative or structured medical reports and on international classification systems. Values of categories are expressed by terms. Using SNOMED CT and ICD 10 we are mapping attributes and terms to predefined codes. We use f-diversities of Gini-Simpson and Number of Categories types to compare diversities of narrative and structured medical reports. The comparison is based on attributes selected from the Minimal Data Model for Cardiology (MDMC). Results We compared diversities of 110 Czech narrative medical reports and 1119 Czech structured medical reports. Selected categorized attributes of MDMC had mostly different numbers of categories and used different terms in narrative and structured reports. We found more than 60% of MDMC attributes in SNOMED CT. We showed that attributes in narrative medical reports had greater diversity than the same attributes in structured medical reports. Further, we replaced each value of category (term) used for attributes in narrative medical reports by the closest term and the category used in MDMC for structured medical reports. We found that relative Gini-Simpson diversities in structured medical reports were significantly smaller than those in narrative medical reports except the "Allergy" attribute. Conclusions Terminology in narrative medical reports is not standardized. Therefore it is nearly
Jones, D.W.
2002-05-16
In previous reports, we have identified two potentially important issues, solutions to which would increase the attractiveness of DOE-developed technologies in commercial buildings energy systems. One issue concerns the fact that in addition to saving energy, many new technologies offer non-energy benefits that contribute to building productivity (firm profitability). The second issue is that new technologies are typically unproven in the eyes of decision makers and must bear risk premiums that offset cost advantages resulting from laboratory calculations. Even though a compelling case can be made for the importance of these issues, for building decision makers to incorporate them in business decisions and for DOE to use them in R&D program planning there must be robust empirical evidence of their existence and size. This paper investigates how such measurements could be made and offers recommendations as to preferred options. There is currently little systematic information on either of these concepts in the literature. Of the two there is somewhat more information on non-energy benefits, but little as regards office buildings. Office building productivity impacts can be observed casually, but must be estimated statistically, because buildings have many interacting attributes and observations based on direct behavior can easily confuse the process of attribution. For example, absenteeism can be easily observed. However, absenteeism may be down because a more healthy space conditioning system was put into place, because the weather was milder, or because firm policy regarding sick days had changed. There is also a general dearth of appropriate information for purposes of estimation. To overcome these difficulties, we propose developing a new data base and applying the technique of hedonic price analysis. This technique has been used extensively in the analysis of residential dwellings. There is also a literature on its application to commercial and industrial
Morace, Renata Erica; Hansen, Hans Nørgaard; De Chiffre, Leonardo
2005-01-01
This paper deals with the uncertainty estimation of measurements performed on optical coordinate measuring machines (CMMs). Two different methods were used to assess the uncertainty of circle diameter measurements using an optical CMM: the sensitivity analysis developing an uncertainty budget and...... the substitution method based on measuring calibrated workpieces. Three holes with nominal diameter values in the range from 2 mm to 6 mm were measured on an optical CMM equipped with a CCD sensor and expanded measuring uncertainties were estimated to be in the range of 1-2 µm....
Evaluation of Measurement Uncertainty in Neutron Activation Analysis using Research Reactor
Chung, Y. S.; Moon, J. H.; Sun, G. M.; Kim, S. H.; Baek, S. Y.; Lim, J. M.; Lee, Y. N.; Kim, H. R.
2007-02-15
This report summarizes the general and technical requirements, methods, and results of the measurement uncertainty assessment that should be performed in the NAA technique using the HANARO research reactor to maintain quality assurance and traceability. It will serve as basic information to support accredited analytical services effectively in the future. For the assessment of measurement uncertainty, environmental certified reference materials are analyzed and the results evaluated using the ISO-GUM and Monte Carlo simulation (MCS) methods. First, the standard uncertainties of the predominant parameters in an NAA measurement are evaluated quantitatively for the measured element values; the combined uncertainty is then calculated by applying the rule of uncertainty propagation. In addition, the contribution of each individual standard uncertainty to the combined uncertainty is estimated, and ways to minimize the dominant contributions are reviewed.
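The two evaluation routes named in this abstract, ISO-GUM propagation and Monte Carlo simulation, can be sketched for a simple multiplicative measurement model. The model and every numeric value below are illustrative assumptions, not the report's NAA data:

```python
import math
import random

# Hypothetical NAA-style measurement model: the element concentration is a
# ratio of measured activities times a reference value,
#   c = (A_sample / A_standard) * c_ref
# Relative standard uncertainties of the inputs (illustrative numbers only).
u_rel = {"A_sample": 0.020, "A_standard": 0.015, "c_ref": 0.010}

# GUM law of propagation for a purely multiplicative model:
# relative standard uncertainties add in quadrature.
u_c_gum = math.sqrt(sum(u**2 for u in u_rel.values()))

# Monte Carlo check: perturb each input with a Gaussian of the stated
# relative width and look at the spread of the resulting ratio.
random.seed(0)
vals = []
for _ in range(200_000):
    a_s = 1.0 + random.gauss(0, u_rel["A_sample"])
    a_std = 1.0 + random.gauss(0, u_rel["A_standard"])
    c_ref = 1.0 + random.gauss(0, u_rel["c_ref"])
    vals.append(a_s / a_std * c_ref)
mean = sum(vals) / len(vals)
u_c_mc = math.sqrt(sum((v - mean) ** 2 for v in vals) / (len(vals) - 1))

print(f"GUM combined relative uncertainty: {u_c_gum:.4f}")
print(f"Monte Carlo relative uncertainty:  {u_c_mc:.4f}")
```

For a near-linear model like this, the two routes agree closely; the MCS route becomes the more trustworthy one when the model is strongly nonlinear or the inputs are non-Gaussian.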
The uncertainty in physical measurements an introduction to data analysis in the physics laboratory
Fornasini, Paolo
2008-01-01
All measurements of physical quantities are affected by uncertainty. Understanding the origin of uncertainty, evaluating its extent and suitably taking it into account in data analysis is essential for assessing the degree of accuracy of phenomenological relationships and physical laws in both scientific research and technological applications. The Uncertainty in Physical Measurements: An Introduction to Data Analysis in the Physics Laboratory presents an introduction to uncertainty and to some of the most common procedures of data analysis. This book will serve the reader well by filling the gap between tutorial textbooks and highly specialized monographs. The book is divided into three parts. The first part is a phenomenological introduction to measurement and uncertainty: properties of instruments, different causes and corresponding expressions of uncertainty, histograms and distributions, and unified expression of uncertainty. The second part contains an introduction to probability theory, random variable...
Fazzari, D M
2001-01-01
This report presents the results of an evaluation of the Total Measurement Uncertainty (TMU) for the Canberra manufactured Segmented Gamma Scanner Assay System (SGSAS) as employed at the Hanford Plutonium Finishing Plant (PFP). In this document, TMU embodies the combined uncertainties due to all of the individual random and systematic sources of measurement uncertainty. It includes uncertainties arising from corrections and factors applied to the analysis of transuranic waste to compensate for inhomogeneities and interferences from the waste matrix and radioactive components. These include uncertainty components for any assumptions contained in the calibration of the system or computation of the data. Uncertainties are propagated at 1 sigma. The final total measurement uncertainty value is reported at the 95% confidence level. The SGSAS is a gamma assay system that is used to assay plutonium and uranium waste. The SGSAS system can be used in a stand-alone mode to perform the NDA characterization of a containe...
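The propagation scheme described above, components combined at 1 sigma and the result reported at the 95% confidence level, can be sketched as follows. The component names and values are hypothetical, not taken from the SGSAS evaluation:

```python
import math

# Illustrative uncertainty components for a gamma assay result, each
# expressed as a relative standard uncertainty at 1 sigma (made-up values).
components = {
    "counting statistics (random)": 0.030,
    "calibration (systematic)": 0.020,
    "matrix correction": 0.040,
    "geometry / inhomogeneity": 0.025,
}

# Propagate at 1 sigma: independent components add in quadrature.
u_combined = math.sqrt(sum(u**2 for u in components.values()))

# Report at the ~95% confidence level using coverage factor k = 2.
k = 2.0
U_expanded = k * u_combined

print(f"combined (1 sigma):     {u_combined:.3f}")
print(f"expanded (k=2, ~95%):   {U_expanded:.3f}")
```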
Uncertainty of the beam energy measurement in the e+e- collision using Compton backscattering
Mo, Xiao-Hu
2014-10-01
The beam energy is measured in the e+e- collision by using Compton backscattering. The uncertainty of this measurement process is studied by means of analytical formulas, and the effects of variable energy spread and energy drift on the systematic uncertainty estimate are also studied with a Monte Carlo sampling technique. These quantitative conclusions are especially important for understanding the uncertainty of the beam energy measurement system.
Zhan Zhiqiang
2016-01-01
In the traceability of digital modulation quality parameters, the Error Vector Magnitude, Magnitude Error, and Phase Error must be traced, and the measurement uncertainty of these parameters needs to be assessed. Although the calibration specification JJF1128-2004, Calibration Specification for Vector Signal Analyzers, has been published domestically, its measurement uncertainty evaluation is unreasonable: the parameters selected are incorrect, and not all error terms are included in the evaluation. This article lists the formulas for magnitude error and phase error, and then presents the measurement uncertainty evaluation processes for both.
Radiation-induced statistical uncertainty in the threshold voltage measurement of MOSFET dosimeters
The results of a recent study on the limiting uncertainties in the measurement of photon radiation dose with MOSFET dosimeters are reported. The statistical uncertainty in dose measurement from a single device has been measured before and after irradiation. The resulting increase in 1/f noise with radiation dose has been investigated via various analytical models. The limit of uncertainty in the ubiquitous linear trend of threshold voltage with dose has been measured and compared to two nonlinear models. Inter-device uncertainty has been investigated in a group of 40 devices, and preliminary evidence for kurtosis and skewness in the distributions for devices without external bias has been observed.
Ling Mingxiang
2014-12-01
Measurement uncertainty evaluation based on the Monte Carlo method (MCM), with the assumption that all uncertainty sources are independent, is common. For some measurement problems, however, the correlation between input quantities is of great importance and even essential. The purpose of this paper is to provide an MCM-based uncertainty evaluation method that can handle correlated cases, especially measurements in which the uncertainty sources are correlated and follow non-Gaussian distributions. In this method, a linear-nonlinear transformation technique was developed to generate correlated random variable sampling sequences with prescribed marginal probability distributions and correlation coefficients. Measurement of the arm stretch of a precision centrifuge of 10⁻⁶ order was implemented by a high-precision approach, and the associated uncertainty evaluation was carried out using both the proposed method and the method of the Guide to the Expression of Uncertainty in Measurement (GUM). The results obtained are compared and discussed.
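A minimal sketch of Monte Carlo uncertainty evaluation with correlated inputs, restricted to the Gaussian special case (the paper's linear-nonlinear transformation for non-Gaussian marginals is not reproduced here, and all values are illustrative):

```python
import numpy as np

# Measurand (illustrative only): y = x1 + x2, with correlated Gaussian inputs.
rng = np.random.default_rng(1)

u1, u2, rho = 0.5, 0.8, 0.6           # standard uncertainties and correlation
cov = np.array([[u1**2,      rho*u1*u2],
                [rho*u1*u2,  u2**2]])

# Generate correlated Gaussian samples via the Cholesky factor of the
# covariance matrix: if z ~ N(0, I) then z @ L.T ~ N(0, cov).
L = np.linalg.cholesky(cov)
z = rng.standard_normal((200_000, 2))
x = z @ L.T

y = x[:, 0] + x[:, 1]
u_mc = y.std(ddof=1)

# Analytical check from the propagation law with a covariance term.
u_exact = np.sqrt(u1**2 + u2**2 + 2 * rho * u1 * u2)
print(f"MC: {u_mc:.3f}  analytical: {u_exact:.3f}")
```

Ignoring the correlation (i.e. using sqrt(u1² + u2²) ≈ 0.94) would understate the uncertainty here by roughly 20%, which is the kind of error the correlated method is designed to avoid.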
Rastegin, Alexey E.
2015-01-01
We study uncertainty and certainty relations for two successive measurements of two-dimensional observables. Uncertainties in successive measurement are considered within the following two scenarios. In the first scenario, the second measurement is performed on the quantum state generated after the first measurement with completely erased information. In the second scenario, the second measurement is performed on the post-first-measurement state conditioned on the actual measurement outcome. ...
Uncertainties in eddy covariance flux measurements assessed from CH4 and N2O observations
The uncertainty in eddy covariance (EC) flux measurements is assessed for CH4 and N2O using data measured at a dairy farm site in the Netherlands in 2006 and 2007. An overview is given of the contributing uncertainties and their magnitude. The relative and absolute uncertainty of a 30 min EC flux are estimated for CH4 and N2O using N = 2185 EC fluxes. The average absolute uncertainty and its standard deviation are 500 ± 400 ng C m-2 s-1 for CH4 and 100 ± 100 ng N m-2 s-1 for N2O. The corresponding relative uncertainties have 95% confidence interval ranging from 20% to 300% for CH4 and from 30% to 1800% for N2O. The large relative uncertainties correspond to relatively small EC fluxes. The uncertainties are mainly caused by the uncertainty due to one-point sampling which contributes on average more than 90% to the total uncertainty. The other 10% includes the uncertainty in the correction algorithm for the systematic errors. The uncertainty in a daily and monthly averaged EC flux are estimated for several flux magnitude ranges. The daily and monthly average uncertainty are smaller than 25% and 10% for CH4 and smaller than 50% and 10% for N2O, respectively, based on fluxes larger than 100 ng C m-2 s-1 and 15 ng N m-2 s-1.
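The reduction from a single 30-min flux uncertainty to daily and monthly averages can be illustrated under the assumption of independent random errors. The numbers loosely echo the magnitudes quoted in the abstract but are not the paper's data:

```python
import math

# Hypothetical values in the spirit of the abstract (not the paper's data).
u_single = 500.0   # ng C m-2 s-1, absolute uncertainty of one 30-min CH4 flux
flux = 800.0       # ng C m-2 s-1, an assumed mean CH4 flux

n_day = 48         # number of 30-min fluxes per day
n_month = 48 * 30  # per month

# For independent random errors, the uncertainty of the mean of N fluxes
# shrinks as 1/sqrt(N); systematic components would not average out this way.
u_day = u_single / math.sqrt(n_day)
u_month = u_single / math.sqrt(n_month)

print(f"daily relative uncertainty:   {u_day / flux:.1%}")
print(f"monthly relative uncertainty: {u_month / flux:.1%}")
```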
How Should Attributions Be Measured? A Reanalysis of Data from Elig and Frieze.
Maruyama, Geoffrey
1982-01-01
T.W. Elig and I.H. Frieze used a multitrait, multimethod approach to contrast three methods for measuring attributions: unstructured/open-ended, structured/unidimensional, and structured/ipsative. This paper reanalyzed their data using confirmatory factor analysis techniques. (Author/PN)
The Attributive Theory of Quality: A Model for Quality Measurement in Higher Education.
Afshar, Arash
A theoretical basis for defining and measuring the quality of institutions of higher education, namely for accreditation purposes, is developed. The theory, the Attributive Theory of Quality, is illustrated using a calculation model that is based on general systems theory. The theory postulates that quality only exists in relation to the…
Groschl, Andreas; Gotz, Jurgen; Loderer, Andreas; Bills, Paul J.; Hausotte, Tino
2015-01-01
In verifying the tolerance specification and identifying the zone of conformity of a particular component an adequate determination of the task-related measurement uncertainty relevant to the utilized measurement method is required, in accordance with part one of the standard “Geometrical Product Specifications” as well as with the “Guide to the Expression of Uncertainty in Measurement”. Although, measurement uncertainty is a central subject in the field of metrology and is certainly consider...
Uncertainty of power curve measurement with a two-beam nacelle-mounted lidar
Wagner, Rozenn; Courtney, Michael Stephen; Friis Pedersen, Troels;
2015-01-01
already been demonstrated to be suitable for use in power performance measurements. To be considered as a professional tool, however, power curve measurements performed using these instruments require traceable calibrated measurements and the quantification of the wind speed measurement uncertainty. Here...... uncertainty lies between 1 and 2% for the wind speed range between cut-in and rated wind speed. Finally, the lidar was mounted on the nacelle of a wind turbine in order to perform a power curve measurement. The wind speed was simultaneously measured with a mast-top mounted cup anemometer placed two rotor...... diameters upwind of the turbine. The wind speed uncertainty related to the lidar tilting was calculated based on the tilt angle uncertainty derived from the inclinometer calibration and the deviation of the measurement height from hub height. The resulting combined uncertainty in the power curve using the...
Ospina, José; Canuto, Enrico
2008-08-01
The paper deals with the uncertainty of differential measurements, obtained from the subtraction of a pair of absolute measurements. It is shown that if the same sensor is used to perform both measurements, a model of the sensor will reveal a correlation component between the uncertainty of each absolute measurement, reducing the uncertainty on its subtraction. The procedure followed is based on the Gauss-Markov estimation method, showing that differential measurement uncertainty vanishes when the gradient to be measured is zero. If the two absolute measurements are to be performed using different sensors, a calibration by comparison between them will result in a similar uncertainty reduction. Finally, a simulated example based on commercially available thermistor data is included.
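The covariance mechanism described above can be sketched with the standard propagation formula for a difference; the values are illustrative:

```python
import math

# Uncertainty of a differential measurement d = m2 - m1 when the two absolute
# measurements have standard uncertainties u1, u2 and correlation rho:
#   u(d)^2 = u1^2 + u2^2 - 2*rho*u1*u2
# With the same sensor used for both measurements, rho approaches 1 and the
# common (systematic) part of the uncertainty cancels.
def diff_uncertainty(u1, u2, rho):
    # max() guards against a tiny negative argument from rounding
    return math.sqrt(max(0.0, u1**2 + u2**2 - 2 * rho * u1 * u2))

u1 = u2 = 0.10  # illustrative standard uncertainty of each absolute measurement
for rho in (0.0, 0.5, 0.9, 1.0):
    print(f"rho={rho:.1f}  u(d)={diff_uncertainty(u1, u2, rho):.4f}")
```

At rho = 1 with equal uncertainties the differential uncertainty vanishes entirely, which mirrors the paper's observation that it vanishes when the gradient to be measured is zero.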
This report describes the software development for the plutonium attribute verification system--AVNG. A brief synopsis of the technical solution for the measurement system is presented. The main tasks for the software development that is underway are formulated. The development tasks are shown in software structural flowcharts, measurement system state diagram and a description of the software. The current status of the AVNG software development is elucidated
A systematic review of quality attributes and measures for software product lines
Montagud Gregori, Sonia; Abrahao Gonzales, Silvia Mara; Insfrán Pelozo, César Emilio
2012-01-01
It is widely accepted that software measures provide an appropriate mechanism for understanding, monitoring, controlling, and predicting the quality of software development projects. In software product lines (SPL), quality is even more important than in a single software product since, owing to systematic reuse, a fault or an inadequate design decision could be propagated to several products in the family. Over the last few years, a great number of quality attributes and measures for assessi...
Müller, Pavel; Hiller, Jochen; Dai, Y.;
2014-01-01
This paper presents the application of the substitution method for the estimation of measurement uncertainties using calibrated workpieces in X-ray computed tomography (CT) metrology. We have shown that this well-accepted method for uncertainty estimation using tactile coordinate measuring...... machines can be applied to dimensional CT measurements. The method is based on repeated measurements carried out on a calibrated master piece. The master piece is a component of a dose engine from an insulin pen. Measurement uncertainties estimated from the repeated measurements of the master piece were...
Thibodeau, Michel A; Carleton, R Nicholas; McEvoy, Peter M; Zvolensky, Michael J; Brandt, Charles P; Boelen, Paul A; Mahoney, Alison E J; Deacon, Brett J; Asmundson, Gordon J G
2015-01-01
Intolerance of uncertainty (IU) is a construct of growing prominence in literature on anxiety disorders and major depressive disorder. Existing measures of IU do not define the uncertainty that respondents perceive as distressing. To address this limitation, we developed eight scales measuring disor
Uncertainty in Citizen Science observations: from measurement to user perception
Lahoz, William; Schneider, Philipp; Castell, Nuria
2016-04-01
Citizen Science activities concern general public engagement in scientific research activities when citizens actively contribute to science either with their intellectual effort or surrounding knowledge or with their tools and resources. The advent of technologies such as the Internet and smartphones, and the growth in their usage, has significantly increased the potential benefits from Citizen Science activities. Citizen Science observations from low-cost sensors, smartphones and Citizen Observatories, provide a novel and recent development in platforms for observing the Earth System, with the opportunity to extend the range of observational platforms available to society to spatio-temporal scales (10-100s m; 1 hr or less) highly relevant to citizen needs. The potential value of Citizen Science is high, with applications in science, education, social aspects, and policy aspects, but this potential, particularly for citizens and policymakers, remains largely untapped. Key areas where Citizen Science data start to have demonstrable benefits include GEOSS Societal Benefit Areas such as Health and Weather. Citizen Science observations have many challenges, including simulation of smaller spatial scales, noisy data, combination with traditional observational methods (satellite and in situ data), and assessment, representation and visualization of uncertainty. Within these challenges, that of the assessment and representation of uncertainty and its communication to users is fundamental, as it provides qualitative and/or quantitative information that influences the belief users will have in environmental information. This presentation will discuss the challenges in assessment and representation of uncertainty in Citizen Science observations, its communication to users, including the use of visualization, and the perception of this uncertainty information by users of Citizen Science observations.
The calculation methods of the capability of measurement processes in the automotive industry differ from each other. There are three main calculation methods: MSA, VDA 5 and the international standard, ISO 22514–7. During this research our aim was to compare the capability calculation methods in a case study. Two types of automotive parts (ten pieces of each) are chosen to examine the behaviour of the manufacturing process and to measure the required characteristics of the measurement process being evaluated. The measurement uncertainty of the measuring process is calculated according to the VDA 5 and ISO 22514–7, and MSA guidelines. In this study the conformance of a measurement process in an automotive manufacturing process is determined, and the similarities and the differences between the methods used are shown. (paper)
Within the framework of TO No. 007 between ORNL and VNIIEF on mastering the Nuclear Materials Identification System (NMIS) at VNIIEF, joint measurements were completed in July 2000, using NMIS equipment placed at VNIIEF's disposal by ORNL, as well as VNIIEF-produced unclassified samples of fissile materials. The report presents results of preliminary processing of the experimental data to obtain absolute values of some attributes used in measurements of plutonium shells: their mass and thickness. The possibility of obtaining absolute values of fissile material parameters from measurement data substantially widens the applicability of NMIS to tasks relevant to inspections of these materials
Nottrott, A.; Hoffnagle, J.; Farinas, A.; Rella, C.
2014-12-01
Carbon monoxide (CO) is an urban pollutant generated by internal combustion engines which contributes to the formation of ground level ozone (smog). CO is also an excellent tracer for emissions from mobile combustion sources. In this work we present an optimized spectroscopic sampling scheme that enables enhanced precision CO measurements. The scheme was implemented on the Picarro G2401 Cavity Ring-Down Spectroscopy (CRDS) analyzer which measures CO2, CO, CH4 and H2O at 0.2 Hz. The optimized scheme improved the raw precision of CO measurements by 40% from 5 ppb to 3 ppb. Correlations of measured CO2, CO, CH4 and H2O from an urban tower were partitioned by wind direction and combined with a concentration footprint model for source attribution. The application of a concentration footprint for source attribution has several advantages. The upwind extent of the concentration footprint for a given sensor is much larger than the flux footprint. Measurements of mean concentration at the sensor location can be used to estimate source strength from a concentration footprint, while measurements of the vertical concentration flux are necessary to determine source strength from the flux footprint. Direct measurement of vertical concentration flux requires high frequency temporal sampling and increases the cost and complexity of the measurement system.
An attribute verification system (AVNG) with information barriers for mass and isotopics measurements has been designed and its fabrication is nearly completed. The AVNG is being built by scientists at the Russian Federal Nuclear Center-VNIIEF, with support of Los Alamos National Laboratory (LANL) and Lawrence Livermore National Laboratory (LLNL). Such a system could be used to verify the presence of several unclassified attributes of classified material with no classified information release. The system is comprised of a neutron multiplicity counter and gamma-spectrometry system based on a high purity germanium gamma detector (nominal relative efficiency @ 1332 keV 50%) and digital gamma-ray spectrometer DSPECPLUS. The neutron multiplicity counter is a three ring counter with 164 3He tubes. The system was designed to measure prototype containers 491 mm in diameter and 503 mm high. This paper provides a brief history of the project and documents the progress of this effort with drawings and photographs.
Doubt-free uncertainty in measurement an introduction for engineers and students
Ratcliffe, Colin
2015-01-01
This volume presents measurement uncertainty and uncertainty budgets in a form accessible to practicing engineers and engineering students from across a wide range of disciplines. The book gives a detailed explanation of the methods presented by NIST in the “GUM” – Guide to Uncertainty of Measurement. Emphasis is placed on explaining the background and meaning of the topics, while keeping the level of mathematics at the minimum level necessary. Dr. Colin Ratcliffe, USNA, and Bridget Ratcliffe, Johns Hopkins, develop uncertainty budgets and explain their use. In some examples, the budget may show a process is already adequate and where costs can be saved. In other examples, the budget may show the process is inadequate and needs improvement. The book demonstrates how uncertainty budgets help identify the most cost effective place to make changes. In addition, an extensive fully-worked case study leads readers through all issues related to an uncertainty analysis, including a variety of different types of...
Real Graphs from Real Data: Experiencing the Concepts of Measurement and Uncertainty
Farmer, Stuart
2012-01-01
A simple activity using cheap and readily available materials is described that allows students to experience first hand many of the concepts of measurement, uncertainty and graph drawing without laborious measuring or calculation. (Contains 9 figures.)
Payman Moallem
2007-01-01
Fuzzy Attribute Graph (FAG) is a powerful tool for the representation and recognition of structural patterns. The conventional framework for the similarity measure of FAGs is based on equivalent fuzzy attributes, but in the fuzzy world some attributes are more important than others. In this paper, a modified recognition framework, using linguistic modifiers for matching fuzzy attribute graphs, is introduced. Then an algorithm for automatic selection of fuzzy modifiers based on the learning patterns is proposed. So...
Measurement and Segmentation of College Students' Noncognitive Attributes: A Targeted Review
Ann E. Person; Scott E. Baumgartner; Kristin Hallgren; Betsy Santos
2014-01-01
This report presents findings from a targeted document review and expert interviews conducted as part of the Student Segmentation Initiative, which was funded by the Bill & Melinda Gates Foundation's Postsecondary Success strategy. The review addresses three questions relevant to the initiative: (1) What instruments and measures are available to assess postsecondary students' noncognitive attributes? (2) To what extent are these instruments used to classify or segment student populations?...
J. K. Spiegel; Zieger, P.; Bukowiecki, N.; E. Hammer; Weingartner, E.; W. Eugster
2012-01-01
Droplet size spectra measurements are crucial to obtain a quantitative microphysical description of clouds and fog. However, cloud droplet size measurements are subject to various uncertainties. This work focuses on the evaluation of two key measurement uncertainties arising during cloud droplet size measurements with a conventional droplet size spectrometer (FM-100): first, we addressed the precision with which droplets can be sized with the FM-100 on the basis of Mie theory. We deduced erro...
Biedermann, Eric; Jauriqui, Leanne; Aldrin, John C.; Mayes, Alexander; Williams, Tom; Mazdiyasni, Siamack
2016-02-01
Resonant Ultrasound Spectroscopy (RUS) is a nondestructive evaluation (NDE) method which can be used for material characterization, defect detection, process control and life monitoring for critical components in gas turbine engines, aircraft and other systems. Accurate forward and inverse modeling for RUS requires a proper accounting of the propagation of uncertainty due to the model and measurement sources. A process for quantifying the propagation of uncertainty to RUS frequency results for models and measurements was developed. Epistemic and aleatory sources of uncertainty were identified for forward model parameters, forward model material property and geometry inputs, inverse model parameters, and physical RUS measurements. RUS model parametric studies were then conducted for simple geometric samples to determine the sensitivity of RUS frequencies and model inversion results to the various sources of uncertainty. The results of these parametric studies were used to calculate uncertainty bounds associated with each source. Uncertainty bounds were then compared to assess the relative impact of the various sources of uncertainty, and mitigations were identified. The elastic material property inputs for forward models, such as Young's Modulus, were found to be the most significant source of uncertainty in these studies. The end result of this work was the development of an uncertainty quantification process that can be adapted to a broad range of components and materials.
One of the most common and popular practices for measuring non-ionising electric and/or magnetic field strength employs field meters and the appropriate electric and/or magnetic field strength sensors. These measurements have to meet several requirements proposed by specific guidelines or standards. On the other hand, performing non-ionising exposure assessment using real measurement data can be a very difficult task due to instrumentation limits and uncertainties. In addition, each measuring technique, practice and recommendation has its own drawbacks. In this paper, a methodology for estimating the overall uncertainty of such measurements, including uncertainty estimation of spatially averaged values of electric or magnetic field strengths, is proposed. Estimating and reporting measurement uncertainty are of great importance, especially when the measured values are very close to the established limits of human exposure to non-ionising electromagnetic fields. (authors)
It is now widely recognized that, when all of the known or suspected components of errors have been evaluated and corrected, there still remains an uncertainty, that is, a doubt about how well the result of the measurement represents the value of the quantity being measured. Evaluation of measurement data - Guide to the expression of Uncertainty in Measurement (GUM) is a guidance document, the purpose of which is to promote full information on how uncertainty statements are arrived at and to provide a basis for the international comparison of measurement results. In this paper, uncertainty estimations following GUM guidelines have been made for the measured values of online thoron concentrations using Lucas scintillation cell to prove that the correction for disequilibrium between 220Rn and 216Po is significant in online 220Rn measurements
Measuring the performance of sensors that report uncertainty
Martin, A D; Parry, M
2014-01-01
We provide methods to validate and compare sensor outputs, or inference algorithms applied to sensor data, by adapting statistical scoring rules. The reported output should either be in the form of a prediction interval or of a parameter estimate with corresponding uncertainty. Using knowledge of the `true' parameter values, scoring rules provide a method of ranking different sensors or algorithms for accuracy and precision. As an example, we apply the scoring rules to the inferred masses of cattle from ground force data and draw conclusions on which rules are most meaningful and in which way.
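One concrete scoring rule for prediction-interval outputs of the kind described above is the interval score. The sketch below assumes a 95% interval and entirely made-up sensor reports, not the paper's cattle-mass data:

```python
# Interval score for a (1 - alpha) prediction interval [l, u] and truth y:
# the interval's width plus a penalty, scaled by 2/alpha, for missing the
# observation. Lower is better; it rewards intervals that are both sharp
# (narrow) and calibrated (cover the truth).
def interval_score(l, u, y, alpha=0.05):
    score = u - l
    if y < l:
        score += (2 / alpha) * (l - y)
    elif y > u:
        score += (2 / alpha) * (y - u)
    return score

# Hypothetical example: three sensors report 95% intervals for a true
# mass of 412 kg (values invented for illustration).
true_mass = 412.0
print(interval_score(405.0, 420.0, true_mass))  # covers, narrow -> best
print(interval_score(350.0, 480.0, true_mass))  # covers, but very wide
print(interval_score(416.0, 426.0, true_mass))  # narrow, but misses low
```

Averaging such scores over many observations ranks sensors or inference algorithms by accuracy and precision jointly, which is the kind of comparison the abstract describes.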
Comparison of model predictions with measurements using the improved spent fuel attribute tester
Design improvements for the International Atomic Energy Agency's Spent Fuel Attribute Tester, recommended on the basis of an optimization study, were incorporated into a new instrument fabricated under the Finnish Support Programme. The new instrument was tested at a spent fuel storage pool on September 8 and 9, 1993. The results of two of the measurements have been compared with calculations. In both cases the calculated and measured pulse-height spectra are in good agreement, and the 137Cs gamma-peak signature from the target spent fuel element is present
Realistic uncertainties on Hapke model parameters from photometric measurement
Schmidt, Frederic
2015-01-01
Hapke proposed a convenient and widely used analytical model to describe the spectro-photometry of granular materials. Using a compilation of the published data, Hapke (2012, Icarus, 221, 1079-1083) recently studied the relationship of b and c for natural examples and proposed the hockey stick relation (excluding b>0.5 and c>0.5). For the moment, there is no theoretical explanation for this relationship. One goal of this article is to study a possible bias due to the retrieval method. We expand here an innovative Bayesian inversion method in order to study in detail the uncertainties of the retrieved parameters. On Emission Phase Function (EPF) data, we demonstrate that the uncertainties of the retrieved parameters follow the same hockey stick relation, suggesting that this relation is due to the fact that b and c are coupled parameters in the Hapke model rather than a natural phenomenon. Nevertheless, the data used in the Hapke (2012) compilation generally are full Bidirectional Reflectance Diffusion Function (B...
Yu Rao
2012-01-01
Full Text Available Liquid crystal thermography is an advanced nonintrusive measurement technique, which is capable of providing a high-accuracy continuous temperature field measurement, especially for a complex structured heat transfer surface. The first part of the paper presents a comprehensive introduction to the thermochromic liquid crystal material and the related liquid crystal thermography technique. Then, based on the authors' experiences in using the liquid crystal thermography for the heat transfer measurement, the parameters affecting the measurement uncertainty of the liquid crystal thermography have been discussed in detail through an experimental study. The final part of the paper describes the applications of the steady and transient liquid crystal thermography technique in the study of the turbulent flow heat transfer related to the aeroengine turbine blade cooling.
Although the detection techniques used for measuring classified materials are very similar to those used in unclassified measurements, the surrounding packaging is generally very different. If a classified item is to be measured, an information barrier is required to protect any classified data acquired. This information barrier must protect the classified information while giving the inspector confidence that the unclassified outputs accurately reflect the classified inputs. Both information barrier and authentication considerations must be addressed during all phases of system design and fabrication. One example of such a measurement system is the attribute measurement system (termed the AVNG) designed for the Trilateral Initiative. We will discuss the integration of information barrier components into this system as well as the effects of information barrier (including authentication) concerns on the implementation of the detector systems.
The CSGU: a measure of controllability, stability, globality, and universality attributions.
Coffee, Pete; Rees, Tim
2008-10-01
This article reports initial evidence of construct validity for a four-factor measure of attributions assessing the dimensions of controllability, stability, globality, and universality (the CSGU). In Study 1, using confirmatory factor analysis, factors were confirmed across least successful and most successful conditions. In Study 2, following less successful performances, correlations supported hypothesized relationships between subscales of the CSGU and subscales of the CDSII (McAuley, Duncan, & Russell, 1992). In Study 3, following less successful performances, moderated hierarchical regression analyses demonstrated that individuals have higher subsequent self-efficacy when they perceive causes of performance as controllable, and/or specific, and/or universal. An interaction for controllability and stability demonstrated that if causes are perceived as likely to recur, it is important to perceive that causes are controllable. Researchers are encouraged to use the CSGU to examine main and interactive effects of controllability and generalizability attributions upon outcomes such as self-efficacy, emotions, and performance. PMID:18971514
High Speed Railway Environment Safety Evaluation Based on Measurement Attribute Recognition Model
Qizhou Hu
2014-01-01
Full Text Available In order to rationally evaluate the operational safety level of high-speed railways, an environmental safety evaluation index system should be established by analyzing the impact mechanisms of severe weather such as rain, thunder, lightning, earthquakes, wind, and snow. Attribute recognition is then used to determine the similarity between samples and their corresponding attribute classes in multidimensional space, based on the Mahalanobis distance measurement function, which has the advantage of being insensitive to correlation and to differences in dimension among the indices. On this basis, the environmental safety situation of China's high-speed railways is evaluated with the suggested methods. The detailed analysis shows that the evaluation largely matches the actual situation and could lay a scientific foundation for high-speed railway operational safety.
Uncertainty analysis of steady state incident heat flux measurements in hydrocarbon fuel fires.
Nakos, James Thomas
2005-12-01
The objective of this report is to develop uncertainty estimates for three heat flux measurement techniques used for the measurement of incident heat flux in a combined radiative and convective environment. This is related to the measurement of heat flux to objects placed inside hydrocarbon fuel (diesel, JP-8 jet fuel) fires, which is very difficult to make accurately (e.g., less than 10%). Three methods will be discussed: a Schmidt-Boelter heat flux gage; a calorimeter and inverse heat conduction method; and a thin plate and energy balance method. Steady state uncertainties were estimated for two types of fires (i.e., calm wind and high winds) at three times (early in the fire, late in the fire, and at an intermediate time). Results showed a large uncertainty for all three methods. Typical uncertainties for a Schmidt-Boelter gage ranged from ±23% for high wind fires to ±39% for low wind fires. For the calorimeter/inverse method the uncertainties were ±25% to ±40%. For the thin plate/energy balance method the uncertainties ranged from ±21% to ±42%. The 23-39% uncertainties for the Schmidt-Boelter gage are much larger than the quoted uncertainty for a radiative only environment (i.e., ±3%). This large difference is due to the convective contribution and because the gage sensitivities to radiative and convective environments are not equal. All these values are larger than desired, which suggests the need for improvements in heat flux measurements in fires.
Measuring Research Data Uncertainty in the 2010 NRC Assessment of Geography Graduate Education
Shortridge, Ashton; Goldsberry, Kirk; Weessies, Kathleen
2011-01-01
This article characterizes and measures errors in the 2010 National Research Council (NRC) assessment of research-doctorate programs in geography. This article provides a conceptual model for data-based sources of uncertainty and reports on a quantitative assessment of NRC research data uncertainty for a particular geography doctoral program.…
A model for the time uncertainty measurements in the Auger surface detector array
Bonifazi, C.; Letessier-Selvon, A.; Santos, E. M.
2007-01-01
The precise determination of the arrival direction of cosmic rays is a fundamental prerequisite for the search for sources or the study of their anisotropies on the sky. One of the most important aspects to achieve an optimal measurement of these directions is to properly take into account the measurement uncertainties in the estimation procedure. In this article we present a model for the uncertainties associated with the time measurements in the Auger surface detector array. We show that th...
Helium Mass Spectrometer Leak Detection: A Method to Quantify Total Measurement Uncertainty
Mather, Janice L.; Taylor, Shawn C.
2015-01-01
In applications where leak rates of components or systems are evaluated against a leak rate requirement, the uncertainty of the measured leak rate must be included in the reported result. However, in the helium mass spectrometer leak detection method, the sensitivity, or resolution, of the instrument is often the only component of the total measurement uncertainty noted when reporting results. To address this shortfall, a measurement uncertainty analysis method was developed that includes the leak detector unit's resolution, repeatability, hysteresis, and drift, along with the uncertainty associated with the calibration standard. In a step-wise process, the method identifies the bias and precision components of the calibration standard, the measurement correction factor (K-factor), and the leak detector unit. Together these individual contributions to error are combined and the total measurement uncertainty is determined using the root-sum-square method. It was found that the precision component contributes more to the total uncertainty than the bias component, but the bias component is not insignificant. For helium mass spectrometer leak rate tests where unit sensitivity alone is not enough, a thorough evaluation of the measurement uncertainty such as the one presented herein should be performed and reported along with the leak rate value.
Realistic uncertainties on Hapke model parameters from photometric measurement
Schmidt, Frédéric; Fernando, Jennifer
2015-11-01
The single particle phase function describes the manner in which an average element of a granular material diffuses the light in the angular space usually with two parameters: the asymmetry parameter b describing the width of the scattering lobe and the backscattering fraction c describing the main direction of the scattering lobe. Hapke proposed a convenient and widely used analytical model to describe the spectro-photometry of granular materials. Using a compilation of the published data, Hapke (Hapke, B. [2012]. Icarus 221, 1079-1083) recently studied the relationship of b and c for natural examples and proposed the hockey stick relation (excluding b > 0.5 and c > 0.5). For the moment, there is no theoretical explanation for this relationship. One goal of this article is to study a possible bias due to the retrieval method. We expand here an innovative Bayesian inversion method in order to study in detail the uncertainties of retrieved parameters. On Emission Phase Function (EPF) data, we demonstrate that the uncertainties of the retrieved parameters follow the same hockey stick relation, suggesting that this relation is due to the fact that b and c are coupled parameters in the Hapke model rather than a natural phenomenon. Nevertheless, the data used in the Hapke (Hapke, B. [2012]. Icarus 221, 1079-1083) compilation generally are full Bidirectional Reflectance Distribution Function (BRDF) data that are shown not to be subject to this artifact. Moreover, the Bayesian method is a good tool to test if the sampling geometry is sufficient to constrain the parameters (single scattering albedo, surface roughness, b, c, opposition effect). We performed sensitivity tests by mimicking various surface scattering properties and various single image-like/disk resolved image, EPF-like and BRDF-like geometric sampling conditions. The second goal of this article is to estimate the favorable geometric conditions for an accurate estimation of photometric parameters in order to provide
WILLS, C.E.
2000-02-24
The Waste Receiving and Processing (WRAP) facility, located on the Hanford Site in southeast Washington, is a key link in the certification of Hanford's transuranic (TRU) waste for shipment to the Waste Isolation Pilot Plant (WIPP). Waste characterization is one of the vital functions performed at WRAP, and nondestructive assay (NDA) measurement of TRU waste containers is one of two required methods used for waste characterization (Reference 1). Various programs exist to ensure the validity of waste characterization data; all of these cite the need for clearly defined knowledge of the uncertainty associated with any measurements taken. All measurements have an inherent uncertainty associated with them. The combined effect of all uncertainties associated with a measurement is referred to as the Total Measurement Uncertainty (TMU). The NDA measurement uncertainties can be numerous and complex. In addition to system-induced measurement uncertainty, other factors contribute to the TMU, each associated with a particular measurement. The NDA measurements at WRAP are based on processes (radioactive decay and induced fission) which are statistical in nature. As a result, the proper statistical summation of the various uncertainty components is essential. This report examines the contributing factors to NDA measurement uncertainty at WRAP. The significance of each factor for the TMU is analyzed, and a final method is given for determining the TMU for NDA measurements at WRAP. As more data become available, and WRAP gains in operational experience, this report will be reviewed semi-annually and updated as necessary. This report also includes the data flow paths for the analytical process in the radiometric determinations.
Alternative risk measure for decision-making under uncertainty in water management
Yueping Xu; YeouKoung Tung; Jia Li; Shaofeng Niu
2009-01-01
Taking into account uncertainties in water management remains a challenge due to social, economic, and environmental changes. Often, uncertainty creates difficulty in ranking or comparing multiple water management options, possibly leading to a wrong decision. In this paper, an alternative risk measure is proposed to facilitate the ranking or comparison of water management options under uncertainty by using the concepts of conditional expected loss and partial mean. This measure has the advantages of being more intuitive and general, and it can be related to many other measures of risk in the literature. The application of the risk measure is demonstrated through a case study on the evaluation of flood mitigation projects. The results show that the new measure is applicable to a general decision-making process under uncertainty.
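The abstract invokes conditional expected loss and partial mean without defining them; a generic sample-based sketch of the two quantities follows (these are textbook definitions, which may differ in detail from the paper's; names are illustrative):

```python
def partial_mean(losses, threshold):
    """Partial mean: expected exceedance of losses over a threshold,
    averaged over ALL outcomes (zero when no loss exceeds it)."""
    if not losses:
        return 0.0
    return sum(x - threshold for x in losses if x > threshold) / len(losses)

def conditional_expected_loss(losses, threshold):
    """Expected loss conditional on the loss exceeding the threshold."""
    tail = [x for x in losses if x > threshold]
    return sum(tail) / len(tail) if tail else 0.0
```

Ranking management options by such tail-focused statistics, rather than by the plain expected loss, is one way to make comparisons under uncertainty less sensitive to the bulk of benign outcomes.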
Role and Significance of Uncertainty in HV Measurement of Porcelain Insulators - a Case Study
Choudhary, Rahul Raj; Bhardwaj, Pooja; Dayama, Ravindra
The improved safety margins in complex systems have attained prime importance in the modern scientific environment. The analysis and implementation of complex systems demand well-quantified accuracy and capability of measurements. Careful measurements with properly identified and quantified uncertainties can lead to genuine discoveries, which may in turn contribute to social development. Unfortunately, most scientists and students are passively taught to ignore the possibility of definition problems in the field of measurement, and such problems are often a source of great argument. Recognizing this issue, ISO has initiated the standardisation of methodologies, but its Guide to the Expression of Uncertainty in Measurement (GUM) has yet to be adopted seriously in tertiary education institutions for teaching the concept of uncertainty. This paper focuses on the concepts of measurement and uncertainty. A case study on the calculation and quantification of the uncertainty of measurement (UOM) for high-voltage electrical testing of ceramic insulators is then presented.
Utilization of Software Tools for Uncertainty Calculation in Measurement Science Education
Despite its importance, uncertainty is often neglected by practitioners in the design of systems, even in safety-critical applications. Problems arising from uncertainty may then only be identified late in the design process and lead to additional costs. Although numerous tools exist to support uncertainty calculation, reasons for their limited use in early design phases may be low awareness of the tools' existence and insufficient training in their practical application. We present a teaching philosophy that addresses uncertainty from the very beginning of teaching measurement science, in particular with respect to the utilization of software tools. The developed teaching material is based on the GUM method and makes use of uncertainty toolboxes in the simulation environment. Based on examples in measurement science education, we discuss advantages and disadvantages of the proposed teaching philosophy and include feedback from students.
Bruschewski, Martin; Freudenhammer, Daniel; Buchenberg, Waltraud B.; Schiffer, Heinz-Peter; Grundmann, Sven
2016-05-01
Velocity measurements with magnetic resonance velocimetry offer outstanding possibilities for experimental fluid mechanics. The purpose of this study was to provide practical guidelines for the estimation of the measurement uncertainty in such experiments. Based on various test cases, it is shown that the uncertainty estimate can vary substantially depending on how the uncertainty is obtained. The conventional approach to estimate the uncertainty from the noise in the artifact-free background can lead to wrong results. A deviation of up to -75 % is observed with the presented experiments. In addition, a similarly high deviation is demonstrated with the data from other studies. As a more accurate approach, the uncertainty is estimated directly from the image region with the flow sample. Two possible estimation methods are presented.
Computer-assisted uncertainty assessment of k0-NAA measurement results
In quantifying measurement uncertainty of measurement results obtained by the k0-based neutron activation analysis (k0-NAA), a number of parameters should be considered and appropriately combined in deriving the final budget. To facilitate this process, a program ERON (ERror propagatiON) was developed, which computes uncertainty propagation factors from the relevant formulae and calculates the combined uncertainty. The program calculates the uncertainty of the final result (the mass fraction of an element in the measured sample), taking into account the relevant neutron flux parameters such as α and f, including their uncertainties. Nuclear parameters and their uncertainties are taken from the IUPAC database (V.P. Kolotov and F. De Corte, Compilation of k0 and related data for NAA). Furthermore, the program allows for uncertainty calculations of the measured parameters needed in k0-NAA: α (determined with either the Cd-ratio or the Cd-covered multi-monitor method), f (using the Cd-ratio or the bare method), Q0 (using the Cd-ratio or internal comparator method) and k0 (using the Cd-ratio, internal comparator or the Cd subtraction method). The results of calculations can be printed or exported to text or MS Excel format for further analysis. Special care was taken to make the calculation engine portable, allowing its incorporation into other applications (e.g., DLL and WWW server). Theoretical basis and the program are described in detail, and typical results obtained under real measurement conditions are presented
Full text: A team of technical experts from the Russian Federation, the International Atomic Energy Agency (IAEA) and the United States have been working for almost five years on the development of a tool kit of instruments that could be used to verify plutonium-bearing items that have classified characteristics in nuclear weapons states. This suite of instruments is similar in many ways to standard safeguards equipment and includes high-resolution gamma-ray spectrometers, neutron multiplicity counters, gross neutron counters and gross gamma-ray detectors. In safeguards applications, this equipment is known to be robust, and authentication methods are well understood. This equipment is very intrusive, however, and a traditional safeguards application of such equipment for verification of materials with classified characteristics would reveal classified information to the inspector. Several enabling technologies have been or are being developed to facilitate the use of these trusted but intrusive technologies. In this paper, these technologies will be described. One of the new technologies is called an 'Attribute Verification System with an Information Barrier Utilizing Neutron Multiplicity Counting and High-Resolution Gamma-Ray Spectrometry', or AVNG. The radiation measurement equipment, comprising a neutron multiplicity counter and high-resolution gamma-ray spectrometer, is standard safeguards-type equipment with information security features added. The information barrier is a combination of technical and procedural methods that protect classified information while allowing the inspector to have confidence that the measurement equipment is providing authentic results. The approach is to reduce the radiation data collected by the measurement equipment to a simple 'yes/no' result regarding attributes of the plutonium-bearing item. The 'yes/no' result is unclassified by design so that it can be shared with an inspector. The attributes that the Trilateral Initiative
Universal Uncertainty Principle, Simultaneous Measurability, and Weak Values
Ozawa, Masanao
2011-01-01
In the conventional formulation, it is broadly accepted that simultaneous measurability and commutativity of observables are equivalent. However, several objections have claimed that there are cases in which even nowhere commuting observables can be measured simultaneously. Here, we outline a new theory of simultaneous measurements based on a state-dependent formulation, in which nowhere commuting observables are shown to have simultaneous measurements in some states, so that the known o...
The grey relational approach for evaluating measurement uncertainty with poor information
The Guide to the Expression of Uncertainty in Measurement (GUM) is the master document for measurement uncertainty evaluation. However, the GUM may encounter problems and does not work well when the measurement data have poor information. In most cases, poor information means a small data sample and an unknown probability distribution. In these cases, the evaluation of measurement uncertainty has become a bottleneck in practical measurement. To solve this problem, a novel method called the grey relational approach (GRA), different from the statistical theory, is proposed in this paper. The GRA does not require a large sample size or probability distribution information of the measurement data. Mathematically, the GRA can be divided into three parts. Firstly, according to grey relational analysis, the grey relational coefficients between the ideal and the practical measurement output series are obtained. Secondly, the weighted coefficients and the measurement expectation function will be acquired based on the grey relational coefficients. Finally, the measurement uncertainty is evaluated based on grey modeling. In order to validate the performance of this method, simulation experiments were performed and the evaluation results show that the GRA can keep the average error around 5%. Besides, the GRA was also compared with the grey method, the Bessel method, and the Monte Carlo method by a real stress measurement. Both the simulation experiments and real measurement show that the GRA is appropriate and effective to evaluate the measurement uncertainty with poor information. (paper)
Andres, T.H
2002-05-01
This guide applies to the estimation of uncertainty in quantities calculated by scientific analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner by which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)
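Of the propagation approaches listed in the report, the Monte Carlo method is the easiest to sketch: sample the uncertain inputs, run the model, and summarize the spread of the outputs. The Gaussian input model and function names below are illustrative choices, not the report's:

```python
import random
import statistics

def propagate_uncertainty(model, input_dists, n_samples=10000, seed=0):
    """Monte Carlo propagation of input uncertainty through a model.

    input_dists is a list of (mean, std) pairs, one per model argument,
    sampled here as independent Gaussians.
    Returns the sample mean and standard deviation of the model output.
    """
    rng = random.Random(seed)
    outputs = [
        model(*(rng.gauss(mu, sigma) for mu, sigma in input_dists))
        for _ in range(n_samples)
    ]
    return statistics.mean(outputs), statistics.stdev(outputs)
```

The output standard deviation estimates the random uncertainty only; systematic uncertainty from model simplifications, as the report notes, must be estimated separately and combined.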
Dieisson Pivoto
2016-04-01
Full Text Available ABSTRACT: The study aimed to (i) quantify the measurement uncertainty in the physical tests of rice and beans for a hypothetical defect, (ii) verify whether homogenization and sample reduction in the physical classification tests of rice and beans are effective in reducing the measurement uncertainty of the process, and (iii) determine whether increasing the size of the bean sample significantly increases accuracy and reduces measurement uncertainty. Hypothetical defects in rice and beans with different damage levels were simulated according to the testing methodology determined by the Normative Ruling for each product. The homogenization and sample reduction in the physical classification of rice and beans are not effective, transferring a high measurement uncertainty to the final test result. The sample size indicated by the Normative Ruling did not allow an appropriate homogenization and should be increased.
Xue, Zhenyu; Vlachos, Pavlos P
2014-01-01
In particle image velocimetry (PIV) the measurement signal is contained in the recorded intensity of the particle image pattern superimposed on a variety of noise sources. The signal-to-noise ratio (SNR) strength governs the resulting PIV cross correlation and ultimately the accuracy and uncertainty of the resulting PIV measurement. Hence we posit that correlation SNR metrics calculated from the correlation plane can be used to quantify the quality of the correlation and the resulting uncertainty of an individual measurement. In this paper we present a framework for evaluating the correlation SNR using a set of different metrics, which in turn are used to develop models for uncertainty estimation. The SNR metrics and corresponding models presented herein are expanded to be applicable to both standard and filtered correlations. In addition, the notion of a valid measurement is redefined with respect to the correlation peak width in order to be consistent with uncertainty quantification principles and distinct ...
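One commonly used correlation-plane SNR metric is the primary peak ratio (PPR): the height of the tallest correlation peak divided by that of the second tallest. A plain-Python sketch follows (the Chebyshev exclusion zone and its default radius are illustrative choices, not the paper's exact procedure):

```python
def primary_peak_ratio(corr_plane, exclusion_radius=2):
    """Primary peak ratio (PPR) of a 2D correlation plane: tallest peak
    divided by the tallest value outside a small exclusion zone around
    the primary peak. Higher PPR suggests a more trustworthy vector."""
    rows, cols = len(corr_plane), len(corr_plane[0])
    # Locate the primary peak.
    pi, pj, p1 = 0, 0, float("-inf")
    for i in range(rows):
        for j in range(cols):
            if corr_plane[i][j] > p1:
                pi, pj, p1 = i, j, corr_plane[i][j]
    # Tallest value outside a Chebyshev neighborhood of the primary peak.
    p2 = float("-inf")
    for i in range(rows):
        for j in range(cols):
            if max(abs(i - pi), abs(j - pj)) > exclusion_radius:
                p2 = max(p2, corr_plane[i][j])
    return p1 / p2
```

Mapping such a ratio to an uncertainty estimate then requires a calibrated model, which is the substance of the framework the abstract describes.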
郭凯红; 李文立
2012-01-01
The previous study shows that the evidential reasoning algorithm is an effective and rational method for solving MADM (Multiple Attribute Decision Making) problems under uncertainty. However, the method requires that attribute weights be deterministic and that the evaluation grades used to assess basic attributes and general attributes be consistent. These constraints often do not hold in actual decision-making problems, especially for basic qualitative attributes. Existing subjective and objective methods for determining basic attribute weights have shortcomings, and most methods simply assume that the grades are the same for basic and general attributes. These methods are therefore of limited use in supporting the decision-making process. In view of these weaknesses, this study proposes a method based on evidential reasoning for MADM under uncertainty, with the goal of extending the evidential reasoning algorithm to a more general decision environment. The first part determines basic attribute weights. We briefly introduce the evidential reasoning algorithm and discuss two major issues for its effective application to MADM under uncertainty: (1) how to determine basic attribute weights, and (2) how to fully implement the transformation of distributed assessments from basic attributes into general attributes. To address the first issue, we calculate basic attribute weights using the information entropy of the decision matrix. In the second part, we implement the equivalent transformation of distributed assessments from basic attributes into general attributes without assuming that the evaluation grades assessing basic attributes and general attributes are the same. We first fuzzify the distributed assessments of basic attributes according to the data types of the basic attribute values, and then implement, based on fuzzy transformation theory, the unified form of general distributed
On the Uncertainties of Stellar Mass Estimates via Colour Measurements
Roediger, Joel C
2015-01-01
Mass-to-light versus colour relations (MLCRs), derived from stellar population synthesis models, are widely used to estimate galaxy stellar masses (M$_*$) yet a detailed investigation of their inherent biases and limitations is still lacking. We quantify several potential sources of uncertainty, using optical and near-infrared (NIR) photometry for a representative sample of nearby galaxies from the Virgo cluster. Our method for combining multi-band photometry with MLCRs yields robust stellar masses, while errors in M$_*$ decrease as more bands are simultaneously considered. The prior assumptions in one's stellar population modelling dominate the error budget, creating a colour-dependent bias of up to 0.6 dex if NIR fluxes are used (0.3 dex otherwise). This matches the systematic errors associated with the method of spectral energy distribution (SED) fitting, indicating that MLCRs do not suffer from much additional bias. Moreover, MLCRs and SED fitting yield similar degrees of random error ($\\sim$0.1-0.14 dex)...
Validity of Willingness to Pay Measures under Preference Uncertainty
Braun, Carola; Rehdanz, Katrin; Schmidt, Ulrich
2016-01-01
Recent studies in the marketing literature developed a new method for eliciting willingness to pay (WTP) with an open-ended elicitation format: the Range-WTP method. In contrast to the traditional approach of eliciting WTP as a single value (Point-WTP), Range-WTP explicitly allows for preference uncertainty in responses. The aim of this paper is to apply Range-WTP to the domain of contingent valuation and to test for its theoretical validity and robustness in comparison to the Point-WTP. Using data from two novel large-scale surveys on the perception of solar radiation management (SRM), a little-known technique for counteracting climate change, we compare the performance of both methods in the field. In addition to the theoretical validity (i.e. the degree to which WTP values are consistent with theoretical expectations), we analyse the test-retest reliability and stability of our results over time. Our evidence suggests that the Range-WTP method clearly outperforms the Point-WTP method. PMID:27096163
Benchmarking laboratory observation uncertainty for in-pipe storm sewer discharge measurements
Aguilar, Marcus F.; McDonald, Walter M.; Dymond, Randel L.
2016-03-01
The uncertainty associated with discharge measurement in storm sewer systems is of fundamental importance for hydrologic/hydraulic model calibration and pollutant load estimation, although it is difficult to determine because field benchmarks are generally impractical. This study benchmarks discharge uncertainty in several commonly used sensors by laboratory flume testing with and without a woody debris model. The sensors were then installed at a field location, where the laboratory-benchmarked uncertainty was applied to field measurements. Combined depth and velocity uncertainties from the laboratory ranged over ±0.207-0.710 in. and ±0.176-0.631 fps, respectively, and, when propagated and applied to discharge estimation in the field, resulted in field discharge uncertainties of between 13% and 256% of the observation. Average daily volume calculations based on these observations had uncertainties of between 58% and 99% of the estimated value, and the uncertainty bounds of storm flow volume and peak flow for nine storm events constituted 31-84% and 13-48% of the estimated value, respectively. Subsequently, the implications of these observational uncertainties for stormwater best-management practice evaluation, hydrologic modeling, and Total Maximum Daily Load development are considered.
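The propagation of depth and velocity uncertainty into discharge uncertainty can be sketched with a first-order (GUM-style) sum of squares. For clarity the example uses a rectangular section A = width * depth rather than the circular pipes of the study, and all numbers are invented.

```python
import math

def discharge_uncertainty(depth, u_depth, vel, u_vel, width):
    """First-order propagation of independent depth and velocity
    uncertainties to discharge Q = v * A, for a rectangular section
    A = width * depth (a simplification of the study's circular pipes).
    Partial derivatives: dQ/dd = v*width, dQ/dv = width*depth."""
    q = vel * width * depth
    u_q = math.sqrt((vel * width * u_depth) ** 2 + (width * depth * u_vel) ** 2)
    return q, u_q

# invented sensor readings: 0.5 m depth +/- 0.05 m, 2.0 m/s velocity +/- 0.3 m/s
q, u_q = discharge_uncertainty(depth=0.5, u_depth=0.05, vel=2.0, u_vel=0.3, width=1.0)
print(f"Q = {q:.2f} +/- {u_q:.2f} m^3/s")
```

Even modest relative input uncertainties combine to a double-digit relative discharge uncertainty here, consistent with the large field uncertainties the study reports.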
Payman Moallem
2007-09-01
Fuzzy Attribute Graph (FAG) is a powerful tool for the representation and recognition of structural patterns. The conventional framework for similarity measures of FAGs treats fuzzy attributes as equally important, but in the fuzzy world some attributes are more important than others. In this paper, a modified recognition framework that uses linguistic modifiers for matching fuzzy attribute graphs is introduced, followed by an algorithm for the automatic selection of fuzzy modifiers based on the learning patterns. Examples of the conventional and modified frameworks for FAG similarity measurement are studied, and the potential of the proposed framework for FAG matching is shown.
Uncertainty issues on S-CO2 compressor performance measurement
Lee, Jekyoung; Cho, Seongkuk; Lee, Jeong Ik [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of)
2014-10-15
This is related to property variation, the pressure ratio, and the measurement method. Since the SCO2PE facility operates near the critical point with a low-pressure-ratio compressor, one way to improve measurement uncertainty is to utilize a density meter. However, adding two density meters at the compressor inlet and outlet did not provide a remarkable improvement in the overall uncertainty. The authors therefore conclude that a different approach to performance measurement is required to secure measurement confidence. As further work, identifying an appropriate approximation for the efficiency equation and applying direct measurement of compressor shaft power to the efficiency calculation will be considered.
Uncertainty of nitrate and sulphate measured by ion chromatography in wastewater samples
Tepuš, Brigita; Simonič, Marjana
2012-01-01
This paper presents an evaluation of measurement uncertainty regarding the results of anion (nitrate and sulphate) concentrations in wastewater. Anions were determined by ion chromatography (EN ISO 10304-2, 1996). The major sources of uncertainty regarding the measurement results were identified as contributions to linear least-square or weighted regression lines, precision, trueness, storage conditions, and sampling. Determination of anions in wastewater is very important for the purificatio...
UNCERTAINTY AND ITS IMPACT ON THE QUALITY OF MEASUREMENT
Adel Elahdi M. Yahya; Martin Halaj
2012-01-01
In current practice, laboratory measurements and calibrations should be accredited by national or international bodies and should comply with the requirements of the ISO 17025 specification for the accreditation of competent laboratories. Those requirements cover the testing process and the uncertainty limits stated in the measurement certificate, which the customer relies on to achieve quality and efficiency in the measurement process. In th...
Improvement of process monitoring uncertainty by the use of Diverse Measurement Methods
Primary coolant flow monitoring margin for one of the plants Westinghouse services was approaching 0% due to steam generator tube plugging and design changes. The approved methodology requires calibrating the primary coolant flow to the calorimetrically measured flow, but the calorimetric flow measurement has a high uncertainty and suffers errors due to process variations. The path identified to regain margin was to develop a new method to reduce the uncertainty of the reference flow measurement: determining the reference flow from diverse, independent indications of flow. Utilizing a variance-weighted averaging technique, the method produces a more accurate best-estimate reference flow. Our team used three alternate indications available in the plant, plus a simulation of the flow loop, as the diverse indications of flow. The benefit to the plant was a 60% reduction in uncertainty. Introduction: the standard monitoring method is based on reactor coolant pump (RCP) pressure differentials, periodically calibrated to the calorimetrically measured flow. The difficulty with an RCP DP-based flow measurement is that it must be calibrated to a reference flow, since the conditions of performance testing, if it is done at all, differ from the operating conditions. The difficulty with the calorimetric flow measurement is its high uncertainty, due to process noise and biasing from process variations. A cycle-independent calibration, applied at various plants, would not be adequate at a plant without replacement steam generators due to the high uncertainty and low monitored flow; increased resistance could put plant operability at risk. The reference flow uncertainty was identified as the dominant source of margin reduction. The proposed solution is to use diverse methods to reduce uncertainty and obtain a more accurate best-estimate flow, then calibrate the pump DP data to the best-estimate flow to allow the plant to not have to
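The variance-weighted averaging technique mentioned above can be sketched as an inverse-variance weighted mean, whose combined standard uncertainty is always smaller than that of the best single indication. The flow values and uncertainties below are hypothetical.

```python
def variance_weighted_mean(values, uncertainties):
    """Combine diverse, independent indications of the same quantity by
    inverse-variance weighting: each value is weighted by 1/u^2, and the
    combined standard uncertainty is 1/sqrt(sum of weights)."""
    weights = [1.0 / u ** 2 for u in uncertainties]
    total = sum(weights)
    best = sum(w * v for w, v in zip(weights, values)) / total
    u_best = (1.0 / total) ** 0.5
    return best, u_best

# four hypothetical flow indications (arbitrary units) with 1-sigma uncertainties
best, u_best = variance_weighted_mean([100.0, 98.0, 101.0, 99.5], [2.0, 1.5, 3.0, 1.0])
print(best, u_best)  # u_best is smaller than 1.0, the best single indication
```

This is the mechanism by which combining several mediocre reference-flow indications can beat a single high-uncertainty calorimetric measurement.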
Hauck, Danielle K [Los Alamos National Laboratory; Bracken, David S [Los Alamos National Laboratory; Mac Arthur, Duncan W [Los Alamos National Laboratory; Santi, Peter A [Los Alamos National Laboratory; Thron, Jonathan [Los Alamos National Laboratory
2010-01-01
The attribute measurement technique provides a method for determining whether or not an item containing special nuclear material (SNM) possesses attributes that fall within an agreed upon range of values. One potential attribute is whether the mass of an SNM item is larger than some threshold value that has been negotiated as part of a nonproliferation treaty. While the historical focus on measuring mass attributes has been on using neutron measurements, calorimetry measurements may be a viable alternative for measuring mass attributes for plutonium-bearing items. Traditionally, calorimetry measurements have provided a highly precise and accurate determination of the thermal power that is being generated by an item. In order to achieve this high level of precision and accuracy, the item must reach thermal equilibrium inside the calorimeter prior to determining the thermal power of the item. Because the approach to thermal equilibrium is exponential in nature, a large portion of the time spent approaching equilibrium is spent with the measurement being within ∼10% of its final equilibrium value inside the calorimeter. Since a mass attribute measurement only needs to positively determine if the mass of a given SNM item is greater than a threshold value, performing a short calorimetry measurement to determine how the system is approaching thermal equilibrium may provide sufficient information to determine if an item has a larger mass than the agreed upon threshold. In previous research into a fast calorimetry attribute technique, a two-dimensional heat flow model of a calorimeter was used to investigate the possibility of determining a mass attribute for plutonium-bearing items using this technique. While the results of this study looked favorable for developing a fast calorimetry attribute technique, additional work was needed to determine the accuracy of the model used to make the calculations. In this paper, the results from the current work investigating
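A minimal sketch of the fast-calorimetry idea, assuming a single-exponential approach to equilibrium (the paper's heat-flow model is two-dimensional and more detailed) and two readings taken at times t and 2t, from which the equilibrium power can be extrapolated without waiting for equilibrium:

```python
import math

def extrapolate_equilibrium(p1, p2):
    """Extrapolate the equilibrium calorimeter power from two early readings
    taken at times t and 2t, assuming P(t) = P_eq * (1 - exp(-t/tau)).
    With x = exp(-t/tau): p2/p1 = (1 - x**2)/(1 - x) = 1 + x,
    so x = p2/p1 - 1 and P_eq = p1 / (1 - x)."""
    x = p2 / p1 - 1.0
    return p1 / (1.0 - x)

# synthetic run: P_eq = 10 W, tau = 30 min, readings at 10 and 20 min
p_eq, tau = 10.0, 30.0
p1 = p_eq * (1 - math.exp(-10 / tau))
p2 = p_eq * (1 - math.exp(-20 / tau))
est = extrapolate_equilibrium(p1, p2)
threshold = 8.0  # hypothetical treaty threshold (W)
print(est, est > threshold)  # recovers ~10 W; attribute decision: True
```

In practice measurement noise on p1 and p2 propagates into the extrapolated power, which is why the accuracy study the abstract describes is needed before the short measurement can support an attribute decision.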
Improved sample size determination for attributes and variables sampling
Earlier INMM papers have addressed the attributes/variables problem and, under conservative/limiting approximations, have reported analytical solutions for the attributes and variables sample sizes. Through computer simulation of this problem, the authors have calculated attributes and variables sample sizes as a function of falsification, measurement uncertainties, and required detection probability without using approximations. Using realistic assumptions for the uncertainty parameters of measurement, the simulation results support two conclusions: (1) the previously used conservative approximations can be expensive because they lead to larger sample sizes than needed, and (2) the optimal verification strategy, as well as the falsification strategy, is highly dependent on the underlying uncertainty parameters of the measurement instruments.
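The simulation approach can be sketched as a Monte Carlo estimate of detection probability for an idealised attributes test (perfect measurement, sampling without replacement); the population and falsification numbers below are illustrative, not from the paper.

```python
import random

def detection_probability(pop_size, n_falsified, sample_size, trials=20000, seed=1):
    """Monte Carlo estimate of the probability that a random sample of
    sample_size items (drawn without replacement) contains at least one of
    the n_falsified items in a population of pop_size, assuming a perfect
    attributes measurement (no false positives or negatives)."""
    rng = random.Random(seed)
    items = list(range(pop_size))
    hits = 0
    for _ in range(trials):
        sample = rng.sample(items, sample_size)
        if any(i < n_falsified for i in sample):  # items 0..n_falsified-1 are falsified
            hits += 1
    return hits / trials

# how does detection grow with sample size for 10 falsified items out of 100?
p10 = detection_probability(100, 10, 10)
p26 = detection_probability(100, 10, 26)
print(p10, p26)  # detection probability grows with sample size
```

Running the same loop over a grid of sample sizes, falsification levels, and (in the full problem) measurement-error models reproduces the kind of sample-size tables the paper derives without approximation.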
In particle image velocimetry (PIV) the measurement signal is contained in the recorded intensity of the particle image pattern superimposed on a variety of noise sources. The signal-to-noise-ratio (SNR) strength governs the resulting PIV cross correlation and ultimately the accuracy and uncertainty of the resulting PIV measurement. Hence we posit that correlation SNR metrics calculated from the correlation plane can be used to quantify the quality of the correlation and the resulting uncertainty of an individual measurement. In this paper we extend the original work by Charonko and Vlachos and present a framework for evaluating the correlation SNR using a set of different metrics, which in turn are used to develop models for uncertainty estimation. Several corrections have been applied in this work. The SNR metrics and corresponding models presented herein are expanded to be applicable to both standard and filtered correlations by applying a subtraction of the minimum correlation value to remove the effect of the background image noise. In addition, the notion of a ‘valid’ measurement is redefined with respect to the correlation peak width in order to be consistent with uncertainty quantification principles and distinct from an ‘outlier’ measurement. Finally the type and significance of the error distribution function is investigated. These advancements lead to more robust and reliable uncertainty estimation models compared with the original work by Charonko and Vlachos. The models are tested against both synthetic benchmark data as well as experimental measurements. In this work, U68.5 uncertainties are estimated at the 68.5% confidence level while U95 uncertainties are estimated at 95% confidence level. For all cases the resulting calculated coverage factors approximate the expected theoretical confidence intervals, thus demonstrating the applicability of these new models for estimation of uncertainty for individual PIV measurements. (paper)
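One of the correlation-plane SNR metrics discussed above, the primary peak ratio, can be sketched as follows. The toy plane below treats the two largest values as the two peaks and ignores the peak-neighbourhood exclusion a real PIV implementation would apply when locating the second peak.

```python
def peak_to_peak_ratio(corr):
    """Primary peak ratio (PPR), a correlation-plane SNR metric: height of
    the tallest correlation value divided by the second-tallest, after
    subtracting the plane minimum to suppress background-noise offset
    (the minimum-subtraction step mirrors the correction in the paper)."""
    floor = min(min(row) for row in corr)
    flat = sorted((v - floor for row in corr for v in row), reverse=True)
    return flat[0] / flat[1]

# toy 3x3 correlation plane with one dominant displacement peak
plane = [[0.1, 0.2, 0.1],
         [0.2, 1.0, 0.3],
         [0.1, 0.3, 0.1]]
ppr = peak_to_peak_ratio(plane)
print(ppr)  # a high PPR indicates a trustworthy vector with low uncertainty
```

Uncertainty-estimation models of the kind described in the paper then map such SNR metrics to confidence intervals for the individual displacement vector.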
A generalized measurement model to quantify health: the multi-attribute preference response model.
Paul F M Krabbe
After 40 years of deriving metric values for health status or health-related quality of life, the effective quantification of subjective health outcomes is still a challenge. Here, two of the best measurement tools, the discrete choice model and the Rasch model, are combined to create a new model for deriving health values. First, existing techniques to value health states are briefly discussed, followed by a reflection on the recent revival of interest in patients' experience with regard to its possible role in health measurement. Subsequently, three basic principles for valid health measurement are reviewed, namely unidimensionality, interval level, and invariance. In the main section, the basic operation of measurement is then discussed in the framework of probabilistic discrete choice analysis (the random utility model) and the psychometric Rasch model. It is then shown how combining the main features of these two models yields an integrated measurement model, called the multi-attribute preference response (MAPR) model, which is introduced here. This new model transforms subjective individual rank data into a metric scale using responses from patients who have experienced certain health states. Its measurement mechanism largely prevents biases such as adaptation and coping. Several extensions of the MAPR model are presented. The MAPR model can be applied to a wide range of research problems. If extended with the self-selection of relevant health domains for the individual patient, this model will be more valid than existing valuation techniques.
Multi-attribute integrated measurement of node importance in complex networks
Wang, Shibo; Zhao, Jinlou
2015-11-01
The measurement of node importance in complex networks is very important for research on network stability and robustness, and it can also help ensure the security of the whole network. Most researchers have used a single indicator to measure node importance, so the results reflect only certain aspects of the network, with a loss of information. Meanwhile, because network topologies differ, node importance should be described in a way that accounts for the character of the network topology. Most existing evaluation algorithms cannot completely reflect the circumstances of complex networks, so this paper takes into account degree centrality, relative closeness centrality, clustering coefficient, and topology potential, and proposes an integrated method for measuring node importance. The method reflects both the internal and external attributes of nodes and eliminates the influence of network structure on node importance. Experiments on the karate network and the dolphin network show that the integrated topology-based measure yields a smaller range of measured results than any single indicator and is more universally applicable. Further experiments show that attacks on the North American power grid and the Internet network guided by this method converge faster than those guided by other methods.
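A toy version of such an integrated measure, combining only normalised degree and closeness centrality (the paper additionally folds in clustering coefficient and topology potential; two indicators keep the sketch short), might look like this:

```python
from collections import deque

def closeness(adj, v):
    """Closeness centrality of v: (n-1) / sum of BFS shortest-path lengths."""
    dist = {v: 0}
    queue = deque([v])
    while queue:
        u = queue.popleft()
        for w in adj[u]:
            if w not in dist:
                dist[w] = dist[u] + 1
                queue.append(w)
    return (len(adj) - 1) / sum(dist.values())

def integrated_importance(adj, alpha=0.5):
    """Multi-attribute node importance as a weighted sum of normalised
    degree centrality and closeness centrality (a two-indicator sketch
    of the paper's four-indicator integrated measure)."""
    n = len(adj)
    return {v: alpha * len(adj[v]) / (n - 1) + (1 - alpha) * closeness(adj, v)
            for v in adj}

# star graph: the hub should dominate every indicator, and so the combination
star = {0: [1, 2, 3, 4], 1: [0], 2: [0], 3: [0], 4: [0]}
s = integrated_importance(star)
print(max(s, key=s.get))  # node 0, the hub
```

Because each indicator is normalised to [0, 1] before combination, no single indicator's scale dominates, which is the point of the integrated measure.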
Morrell Jane
2009-05-01
Background: Multi-attribute utility measures are preference-based health-related quality of life measures that have been developed to inform economic evaluations of health care interventions. The objective of this study was to compare the empirical validity of two multi-attribute utility measures (EQ-5D and SF-6D) based on hypothetical preferences in a large maternity population in England. Methods: Women who participated in a randomised controlled trial of additional postnatal support provided by trained community support workers represented the study population for this investigation. The women were asked to complete the EQ-5D descriptive system (which defines health-related quality of life in terms of five dimensions: mobility, self care, usual activities, pain/discomfort and anxiety/depression) and the SF-36 (which defines health-related quality of life, using 36 items, across eight dimensions: physical functioning, role limitations (physical), social functioning, bodily pain, general health, mental health, vitality and role limitations (emotional)) at six months postpartum. Their responses were converted into utility scores using the York A1 tariff set and the SF-6D utility algorithm, respectively. One-way analysis of variance was used to test the hypothetically-constructed preference rule that each set of utility scores differs significantly by self-reported health status (categorised as excellent, very good, good, fair or poor). The degree to which EQ-5D and SF-6D utility scores reflected alternative dichotomous configurations of self-reported health status and the Edinburgh Postnatal Depression Scale score was tested using the relative efficiency statistic and receiver operating characteristic (ROC) curves. Results: The mean utility score for the EQ-5D was 0.861 (95% CI: 0.844, 0.877), whilst the mean utility score for the SF-6D was 0.809 (95% CI: 0.796, 0.822), representing a mean difference in utility score of 0.052 (95% CI: 0.040, 0
Uncertainties of DS86 and prospects for residual radioactivity measurement.
Shizuma, K; Hoshi, M; Hasai, H
1999-12-01
Residual radioactivity data for 152Eu, 60Co and 36Cl have been accumulated, and they reveal that, in the thermal neutron region, a systematic discrepancy exists between the measured data and activation calculations based on the DS86 neutrons in Hiroshima. Recently, 63Ni produced in copper samples by the fast neutron reaction 63Cu(n,p)63Ni has been of interest for the evaluation of fast neutrons. A re-evaluation of atomic-bomb neutrons and prospects based on residual activity measurements are discussed. PMID:10805002
Uncertainty in SMAP Soil Moisture Measurements Caused by Dew
Soil moisture is an important reservoir of the hydrologic cycle that regulates the exchange of moisture and energy between the land surface and the atmosphere. Two satellite missions will soon make the first global measurements of soil moisture at the optimal microwave wavelength within L-band: ESA's So...
Toward a Characterization of Uncertainty Measure for the Dempster-Shafer Theory
Harmanec, David
2013-01-01
This is a working paper summarizing results of an ongoing research project whose aim is to uniquely characterize the uncertainty measure for the Dempster-Shafer Theory. A set of intuitive axiomatic requirements is presented, some of their implications are shown, and the proof is given of the minimality of recently proposed measure AU among all measures satisfying the proposed requirements.
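The belief and plausibility functions over which Dempster-Shafer uncertainty measures such as AU are defined can be sketched directly from a basic mass assignment. This illustrates the setting of the theory, not the AU maximisation itself.

```python
def belief(mass, subset):
    """Bel(A): total mass of focal elements fully contained in A."""
    return sum(m for s, m in mass.items() if s <= subset)

def plausibility(mass, subset):
    """Pl(A): total mass of focal elements that intersect A."""
    return sum(m for s, m in mass.items() if s & subset)

# basic mass assignment on the frame {a, b, c}; frozensets are focal elements
mass = {frozenset('a'): 0.5, frozenset('bc'): 0.3, frozenset('abc'): 0.2}
A = frozenset('a')
print(belief(mass, A), plausibility(mass, A))  # 0.5 and 0.7
```

The gap between Bel(A) and Pl(A) is the non-specificity that a Dempster-Shafer uncertainty measure must account for; AU resolves it by maximising Shannon entropy over all probability distributions consistent with Bel.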
Role of uncertainty in the measurement of crack length by compliance techniques
An experimental program is underway to investigate the effect of thermal treatment and electrochemical potential on the cyclic crack growth behaviour of Inconel-600 and Inconel X-750 in deoxygenated high purity water at 290 °C. As part of the program an investigation has been conducted to determine an approximation for the degrees of uncertainty in the elastic compliance technique used for determining crack length. Preliminary results indicate that for room temperature-air crack growth measurements an uncertainty of approximately 1.5% can be expected for the value of measured compliance. For the specimen geometry used this translates to an uncertainty in the effective crack length of 0.25 mm. In an aqueous environment at 290 °C and 10.34 MPa the estimated uncertainty in compliance measurement can be as much as 6.5%, which translates to a crack length uncertainty of 1.83 mm. These uncertainty values have a significant impact on the measurement intervals required for statistically meaningful crack growth rate data generation
Measuring Young’s modulus the easy way, and tracing the effects of measurement uncertainties
Nunn, John
2015-09-01
The speed of sound in a solid is determined by the density and elasticity of the material. Young’s modulus can therefore be calculated once the density and the speed of sound in the solid are measured. The density can be measured relatively easily, and the speed of sound through a rod can be measured very inexpensively by setting up a longitudinal standing wave and using a microphone to record its frequency. This is a simplified version of a technique called ‘impulse excitation’. It is a good educational technique for school pupils. This paper includes the description and the free provision of custom software to calculate the frequency spectrum of a recorded sound so that the resonant peaks can be readily identified. A discussion of the effect of measurement uncertainties is included to help the more thorough student improve the accuracy of their method. The technique is sensitive enough to be able to detect changes in the elasticity modulus with a temperature change of just a few degrees.
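The calculation described above reduces to v = 2*L*f for the fundamental free-free longitudinal resonance of a rod and E = rho * v**2 in the thin-rod approximation. The rod parameters below are illustrative, roughly aluminium-like, and the input uncertainties are invented.

```python
import math

def youngs_modulus(density, length, fundamental_freq):
    """Young's modulus from impulse excitation of a rod: the fundamental
    free-free longitudinal resonance gives sound speed v = 2*L*f, and
    E = rho * v**2 (thin-rod approximation)."""
    v = 2.0 * length * fundamental_freq
    return density * v ** 2

# aluminium-like rod: rho = 2700 kg/m^3, L = 0.5 m, measured f ~ 5080 Hz
E = youngs_modulus(2700, 0.5, 5080)
print(E / 1e9)  # modulus in GPa

# tracing measurement uncertainties: since E = 4*rho*L^2*f^2, the relative
# uncertainty is u_E/E = sqrt(u_rho^2 + (2*u_L)^2 + (2*u_f)^2) in relative terms
u_rel = math.sqrt(0.01 ** 2 + (2 * 0.002) ** 2 + (2 * 0.001) ** 2)
print(u_rel)  # combined relative uncertainty in E
```

Note that the length and frequency uncertainties enter doubled because E depends on their squares, which is why careful length measurement matters more than it first appears.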
Michael F. Frimpon
2012-01-01
The selection of a school leader is a multi-attribute problem that needs to be addressed taking into consideration the peculiar needs of an institution. This paper is intended to specify the critical success factors (CSFs) of college leaders as perceived by students. A survey comprising the 37 attributes of the Leaders Attributes Inventory (LAI) of Moss was given to the students in a local university to determine their best 10. The 10 selected attributes were mapped onto the Leadership Effectiven...
Hiller, Jochen; Genta, Gianfranco; Barbato, Giulio;
2014-01-01
Industrial applications of computed tomography (CT) for dimensional metrology on various components are fast increasing, owing to a number of favorable properties such as the capability of non-destructive internal measurements. Uncertainty evaluation is however more complex than in conventional measurement processes, e.g., with tactile systems, also due to factors related to systematic errors, mainly caused by specific CT image characteristics. In this paper we propose a simulation-based framework for measurement uncertainty evaluation in dimensional CT using the bootstrap method. In a case study the problem concerning measurement uncertainties was addressed with bootstrap and successfully applied to ball-bar CT measurements. Results obtained enabled extension to more complex shapes such as actual industrial components, as we show by tests on a hollow cylinder workpiece.
Measurement of grating pitch by optical diffraction is one of the few methods currently available for establishing traceability to the definition of the meter on the nanoscale; therefore, understanding all aspects of the measurement is imperative for accurate dissemination of the SI meter. A method for evaluating the component of measurement uncertainty associated with coherent scattering in the diffractometer instrument is presented. The model equation for grating pitch calibration by optical diffraction is an example where Monte Carlo (MC) methods can vastly simplify evaluation of measurement uncertainty. This paper includes discussion of the practical aspects of implementing MC methods for evaluation of measurement uncertainty in grating pitch calibration by diffraction. Downloadable open-source software is demonstrated. (technical design note)
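A minimal Monte Carlo evaluation in the spirit of the note, for the normal-incidence grating equation p = m * lambda / sin(theta), with invented input values and uncertainties:

```python
import math
import random
import statistics

def pitch_mc(wavelength, u_wl, angle_deg, u_angle_deg, order=1, n=50000, seed=2):
    """Monte Carlo evaluation of grating-pitch uncertainty for the
    normal-incidence diffraction equation p = m*lambda/sin(theta):
    draw the inputs from normal distributions and summarise the output
    by its mean and standard deviation (the combined standard uncertainty)."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        wl = rng.gauss(wavelength, u_wl)
        theta = math.radians(rng.gauss(angle_deg, u_angle_deg))
        samples.append(order * wl / math.sin(theta))
    return statistics.mean(samples), statistics.stdev(samples)

# 633 nm laser; first-order angle near 39.27 deg corresponds to a ~1000 nm pitch
mean_p, u_p = pitch_mc(wavelength=633.0, u_wl=0.1, angle_deg=39.27, u_angle_deg=0.01)
print(mean_p, u_p)  # pitch in nm with its combined standard uncertainty
```

The MC route avoids hand-deriving sensitivity coefficients for the nonlinear 1/sin(theta) dependence, which is exactly the simplification the note highlights.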
Evaluation of the uncertainty of electrical impedance measurements: the GUM and its Supplement 2
Electrical impedance is not a scalar but a complex quantity. Thus, evaluating the uncertainty of its value involves a model whose output is complex-valued. In this paper, the evaluation of the uncertainty of the measurement of the electrical impedance of a simple electric circuit using the GUM is compared with an evaluation using a Monte Carlo method according to Supplement 2 of the GUM.
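A Supplement-2-style Monte Carlo for a complex-valued measurand can be sketched for Z = V/I. The phasor convention (V real, current phasor I = |I| * exp(j*phase)) and the input values and uncertainties below are assumptions for illustration.

```python
import cmath
import random
import statistics

def impedance_mc(v_mag, u_v, i_mag, u_i, phase, u_phase, n=50000, seed=3):
    """Monte Carlo propagation for the complex measurand Z = V/I, with V
    taken as real and the current phasor I = i_mag*exp(j*phase); the output
    is summarised per real and imaginary component, as Supplement 2 of the
    GUM prescribes for bivariate (complex) measurands."""
    rng = random.Random(seed)
    re, im = [], []
    for _ in range(n):
        v = rng.gauss(v_mag, u_v)
        i = rng.gauss(i_mag, u_i) * cmath.exp(1j * rng.gauss(phase, u_phase))
        z = v / i
        re.append(z.real)
        im.append(z.imag)
    return ((statistics.mean(re), statistics.stdev(re)),
            (statistics.mean(im), statistics.stdev(im)))

# invented inputs: 10 V, 0.1 A at 0.5 rad phase -> |Z| ~ 100 ohm
(re_m, re_u), (im_m, im_u) = impedance_mc(10.0, 0.01, 0.1, 0.0005, 0.5, 0.002)
print(re_m, re_u, im_m, im_u)
```

The GUM's linear propagation would need a 2x2 sensitivity matrix here; the MC route sidesteps it and also captures any correlation between the real and imaginary parts.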
Terminological aspects of the Guide to the Expression of Uncertainty in Measurement (GUM)
Ehrlich, Charles
2014-08-01
The Guide to the Expression of Uncertainty in Measurement (GUM) provided for the first time an international consensus on how to approach the widespread difficulties associated with conveying information about how reliable the value resulting from a measurement is thought to be. This paper examines the evolution in thinking and its impact on the terminology that accompanied the development of the GUM. Particular emphasis is put on the very clear distinction in the GUM between measurement uncertainty and measurement error, and on the reasons that even though ‘true value’ and ‘error’ are considered in the GUM to be ‘unknowable’ and, sometimes by implication, of little (or even no) use in measurement analysis, they remain as key concepts, especially when considering the objective of measurement. While probability theory in measurement analysis from a frequentist perspective was in widespread use prior to the publication of the GUM, a key underpinning principle of the GUM was to instead consider probability as a ‘degree of belief.’ The terminological changes necessary to make this transition are also covered. Even twenty years after the publication of the GUM, the scientific and metrology literatures sometimes contain uncertainty analyses, or discussions of measurement uncertainty, that are not terminologically consistent with the GUM, leading to the inability of readers to fully understand what has been done and what is intended in the associated measurements. This paper concludes with a discussion of the importance of using proper methodology and terminology for reporting measurement results.
Coherent Uncertainty Analysis of Aerosol Measurements from Multiple Satellite Sensors
Petrenko, M.; Ichoku, C.
2013-01-01
Aerosol retrievals from multiple spaceborne sensors, including MODIS (on Terra and Aqua), MISR, OMI, POLDER, CALIOP, and SeaWiFS (altogether, a total of 11 different aerosol products), were comparatively analyzed using data collocated with ground-based aerosol observations from Aerosol Robotic Network (AERONET) stations within the Multi-sensor Aerosol Products Sampling System (MAPSS, http://giovanni.gsfc.nasa.gov/mapss/ and http://giovanni.gsfc.nasa.gov/aerostat/). The analysis was performed by comparing quality-screened satellite aerosol optical depth or thickness (AOD or AOT) retrievals during 2006-2010 to available collocated AERONET measurements globally, regionally, and seasonally, and deriving a number of statistical measures of accuracy. We used a robust statistical approach to detect and remove possible outliers in the collocated data that could bias the results of the analysis. Overall, the proportion of outliers in each of the quality-screened AOD products was within 12%. Squared correlation coefficient (R2) values of the satellite AOD retrievals relative to AERONET exceeded 0.6, with R2 for most of the products exceeding 0.7 over land and 0.8 over ocean. Root mean square error (RMSE) values for most of the AOD products were within 0.15 over land and 0.09 over ocean. We have been able to generate global maps showing regions where the different products present advantages over the others, as well as the relative performance of each product over different landcover types. It was observed that while MODIS, MISR, and SeaWiFS provide accurate retrievals over most landcover types, multi-angle capabilities make MISR the only sensor able to retrieve reliable AOD over barren and snow/ice surfaces. Likewise, active sensing enables CALIOP to retrieve aerosol properties over bright-surface shrublands more accurately than the other sensors, while POLDER, the only one of the sensors capable of measuring polarized aerosols, outperforms other sensors in
In making low-level radioactivity measurements of populations, it is commonly observed that a substantial portion of net results are negative. Furthermore, the observed variance of the measurement results arises from a combination of measurement uncertainty and population variability. This paper presents a method for disaggregating measurement uncertainty from population variability to produce a probability density function (PDF) of possibly true results. To do this, simple, justifiable, and reasonable assumptions are made about the relationship of the measurements to the measurands (the 'true values'). The measurements are assumed to be unbiased, that is, that their average value is the average of the measurands. Using traditional estimates of each measurement's uncertainty to disaggregate population variability from measurement uncertainty, a PDF of measurands for the population is produced. Then, using Bayes's theorem, the same assumptions, and all the data from the population of individuals, a prior PDF is computed for each individual's measurand. These PDFs are non-negative, and their average is equal to the average of the measurement results for the population. The uncertainty in these Bayesian posterior PDFs is all Berkson with no remaining classical component. The methods are applied to baseline bioassay data from the Hanford site. The data include 90Sr urinalysis measurements on 128 people, 137Cs in vivo measurements on 5,337 people, and 239Pu urinalysis measurements on 3,270 people. The method produces excellent results for the 90Sr and 137Cs measurements, since there are nonzero concentrations of these global fallout radionuclides in people who have not been occupationally exposed. The method does not work for the 239Pu measurements in non-occupationally exposed people because the population average is essentially zero.
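A much-simplified, non-Bayesian stand-in for the disaggregation step described above is the method-of-moments subtraction of the average measurement variance from the observed variance across the population. The data below are invented; the paper's full treatment builds Bayesian posteriors on top of this kind of population estimate.

```python
import statistics

def population_variance(results, meas_sigmas):
    """Method-of-moments disaggregation: the variance observed across a
    population of unbiased measurement results is approximately the true
    between-person variance plus the average measurement variance, so the
    population variance is estimated by subtraction (floored at zero).
    A simplified stand-in for the paper's full Bayesian treatment."""
    observed = statistics.variance(results)
    meas = statistics.fmean(s ** 2 for s in meas_sigmas)
    return max(observed - meas, 0.0)

# net results scatter below zero even though the measurands cannot be negative;
# the disaggregated population variance is much smaller than the raw scatter
results = [-1.0, 0.5, 2.0, -0.5, 1.5, 3.0, 0.0, 1.0]
sigmas = [1.2] * len(results)
print(population_variance(results, sigmas))
```

This is why a substantial fraction of negative net results is consistent with a non-negative population of measurands: the raw scatter is dominated by measurement uncertainty, not population variability.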
Comparison of different approaches to estimate uncertainty budget in k0-INAA measurement
Three CRMs of different matrix composition were analysed, representing an environmental matrix sample (BCR-320R Channel Sediment), a botanical matrix sample (SRM 1547 Peach Leaves) and a zoological matrix sample (SRM 1566b Oyster Tissue). The element mass fractions were obtained using the KayWin program. Analytical measurement uncertainty was determined by two approaches: (1) the routine procedure, combining an overall uncertainty u(m) = 3.5 % with the statistical uncertainty of the peak-area determination, and (2) the procedure applying the dedicated ERON program for calculating uncertainty. The performance of a total of 31 certified values was tested by calculating En numbers. For the remaining 52 non-certified values, the uncertainties obtained by the two approaches were compared. With the first approach, the En number showed satisfactory performance in 28 cases; with the second approach, in 27 cases. None of the unsatisfactory performances (En > 1) appeared to be of a systematic nature. The uncertainties obtained by the two approaches were to a large extent consistent. As the present nuclear database lacks much of the data that serve as input to the ERON program, in particular the uncertainties of Q0 factors, estimates had to be introduced for the missing values, emphasising the urgent need to upgrade the database with the missing data. (author)
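The En score used above can be computed directly from a measured value, a certified value, and their uncertainties. A minimal sketch, assuming the conventional coverage factor k = 2 for the expanded uncertainties (the abstract does not state k):

```python
def en_number(x_lab, u_lab, x_ref, u_ref, k=2.0):
    """E_n score: |E_n| <= 1 indicates agreement between the measured
    and certified values within their expanded (k * u) uncertainties."""
    U_lab = k * u_lab  # expanded uncertainty of the laboratory result
    U_ref = k * u_ref  # expanded uncertainty of the certified value
    return (x_lab - x_ref) / (U_lab ** 2 + U_ref ** 2) ** 0.5
```

For example, a result of 10.2 with standard uncertainty 0.2 against a certified 10.0 with uncertainty 0.1 is satisfactory, while the same uncertainties with a result of 11.0 would not be.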
Ren Bo; Lu Zhenzhou; Zhou Changcong
2014-06-01
For structural systems with both epistemic and aleatory uncertainties, research on quantifying the contributions of the epistemic and aleatory uncertainties to the failure probability of the systems is conducted. Based on the method of separating epistemic and aleatory uncertainties in a variable, the core idea is first to establish a novel deterministic transition model linking auxiliary variables, distribution parameters, random variables, and the failure probability, and then to propose an improved importance sampling (IS) method to solve the transition model. Furthermore, the distribution parameters and auxiliary variables are sampled simultaneously and independently; therefore, the inefficient sampling procedure of traditional methods, with an "inner loop" for epistemic uncertainty and an "outer loop" for aleatory uncertainty, is avoided. Since the proposed method combines fast-converging estimators with an efficient search for failure samples in the regions of interest, it is more efficient than traditional methods for variance-based failure-probability sensitivity measures in the presence of epistemic and aleatory uncertainties. Two numerical examples and one engineering example demonstrate the efficiency and precision of the proposed method for structural systems with both epistemic and aleatory uncertainties.
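The single-loop idea, sampling the epistemic distribution parameters and the aleatory variables simultaneously rather than in nested inner/outer loops, can be illustrated with plain Monte Carlo on a hypothetical limit state. This sketch deliberately omits the paper's improved importance sampling, which would additionally concentrate samples near the failure region:

```python
import random

def failure_probability(n=100_000, seed=0):
    """Single-loop Monte Carlo sketch for a system whose aleatory
    variable x has an epistemically uncertain mean mu.

    Hypothetical setup: mu ~ Uniform(1, 2) (epistemic), x ~ N(mu, 1)
    (aleatory), limit state g(x) = 5 - x, failure when g < 0.
    """
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        mu = rng.uniform(1.0, 2.0)  # epistemic: distribution parameter
        x = rng.gauss(mu, 1.0)      # aleatory: random variable, same loop
        if 5.0 - x < 0.0:           # limit-state evaluation
            failures += 1
    return failures / n
```

The estimate averages over both uncertainty sources in one pass; a traditional nested scheme would rerun the entire inner aleatory loop for every epistemic sample of mu.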
Determination of uncertainties associated with the in vivo measurement of iodine-131 in the thyroid.
Dantas, B M; Lima, F F; Dantas, A L; Lucena, E A; Gontijo, R M G; Carvalho, C B; Hazin, C
2016-07-01
Intakes of radionuclides can be estimated through in vivo measurements, and the uncertainties associated with the measured activities should be clearly stated in monitoring program reports. This study aims to evaluate the uncertainties of in vivo monitoring of iodine-131 in the thyroid. The reference values for high-energy photons are based on the IDEAS Guide. Measurements were performed at the In Vivo Monitoring Laboratory of the Institute of Radiation Protection and Dosimetry (IRD) and at the Internal Dosimetry Laboratory of the Regional Center of Nuclear Sciences (CRCN-NE). In both institutions, the experiment was performed using a 3″ × 3″ NaI(Tl) scintillation detector and a neck-thyroid phantom. Scattering factors were calculated and compared for different counting geometries. The results show that the technique's reproducibility is equivalent to the values suggested in the IDEAS Guide and that the measurement uncertainties are comparable to international quality standards for this type of in vivo monitoring. PMID:27108067
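The counting-statistics part of such an uncertainty budget can be sketched as a simple GUM-style first-order propagation; all input values here are hypothetical, and a real budget would add further terms for geometry, phantom, and background variability:

```python
def thyroid_activity(gross, background, t_live, efficiency, u_eff_rel):
    """Activity (Bq) from a scintillation-detector thyroid count,
    with combined standard uncertainty from Poisson counting
    statistics and calibration-efficiency uncertainty.

    gross, background: counts over the live time t_live (s);
    efficiency: counts per decay; u_eff_rel: relative uncertainty
    of the efficiency (all values hypothetical).
    """
    net_rate = (gross - background) / t_live
    activity = net_rate / efficiency
    # Poisson counting uncertainty on the net count rate
    u_rate = (gross + background) ** 0.5 / t_live
    # combine relative uncertainties in quadrature (GUM first order)
    u_rel = ((u_rate / net_rate) ** 2 + u_eff_rel ** 2) ** 0.5
    return activity, activity * u_rel
```

With a 5 % relative efficiency uncertainty and a well-above-background count, the calibration term dominates the combined uncertainty, which mirrors the finding that reproducibility (geometry) effects, not counting statistics, drive the budget.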
Possolo, Antonio
2016-02-01
The current guidelines for the evaluation and expression of the uncertainty of NIST measurement results were originally published in 1993 as NIST Technical Note 1297, which was last revised in 1994. NIST is now updating its principles and procedures for uncertainty evaluation to address current and emerging needs in measurement science that Technical Note 1297 could not have anticipated or contemplated when it was first conceived. Although progressive and forward-looking, this update is also conservative because it does not require that current practices for uncertainty evaluation be abandoned or modified where they are fit for purpose and when there is no compelling reason to do otherwise. The updated guidelines are offered as a Simple Guide intended to be deployed under the NIST policy on Measurement Quality, and are accompanied by a rich collection of examples of application drawn from many different fields of measurement science.