WorldWideScience

Sample records for attribution measuring uncertainty

  1. Greenhouse Gas Source Attribution: Measurements Modeling and Uncertainty Quantification

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Zhen [Sandia National Lab. (SNL-CA), Livermore, CA (United States)]; Safta, Cosmin [Sandia National Lab. (SNL-CA), Livermore, CA (United States)]; Sargsyan, Khachik [Sandia National Lab. (SNL-CA), Livermore, CA (United States)]; Najm, Habib N. [Sandia National Lab. (SNL-CA), Livermore, CA (United States)]; van Bloemen Waanders, Bart Gustaaf [Sandia National Lab. (SNL-CA), Livermore, CA (United States)]; LaFranchi, Brian W. [Sandia National Lab. (SNL-CA), Livermore, CA (United States)]; Ivey, Mark D. [Sandia National Lab. (SNL-CA), Livermore, CA (United States)]; Schrader, Paul E. [Sandia National Lab. (SNL-CA), Livermore, CA (United States)]; Michelsen, Hope A. [Sandia National Lab. (SNL-CA), Livermore, CA (United States)]; Bambha, Ray P. [Sandia National Lab. (SNL-CA), Livermore, CA (United States)]

    2014-09-01

    In this project we have developed atmospheric measurement capabilities and a suite of atmospheric modeling and analysis tools that are well suited for verifying emissions of greenhouse gases (GHGs) on an urban-through-regional scale. We have for the first time applied the Community Multiscale Air Quality (CMAQ) model to simulate atmospheric CO2. This will allow for the examination of regional-scale transport and distribution of CO2, along with air pollutants traditionally studied using CMAQ, at relatively high spatial and temporal resolution, with the goal of leveraging emissions verification efforts for both air quality and climate. We have developed a bias-enhanced Bayesian inference approach that can remedy the well-known problem of transport model errors in atmospheric CO2 inversions. We have tested the approach using data and model outputs from the TransCom3 global CO2 inversion comparison project. We have also performed two prototyping studies on inversion approaches in the generalized convection-diffusion context. One of these studies employed Polynomial Chaos Expansion to accelerate the evaluation of a regional transport model and enable efficient Markov Chain Monte Carlo sampling of the posterior for Bayesian inference. The other approach uses deterministic inversion of a convection-diffusion-reaction system in the presence of uncertainty. These approaches should, in principle, be applicable to realistic atmospheric problems with moderate adaptation. We outline a regional greenhouse gas source inference system that integrates (1) two approaches of atmospheric dispersion simulation and (2) a class of Bayesian inference and uncertainty quantification algorithms. We use two different and complementary approaches to simulate atmospheric dispersion. Specifically, we use an Eulerian chemical transport model, CMAQ, and a Lagrangian particle dispersion model, FLEXPART-WRF. These two models share the same WRF
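
    As a rough, hedged illustration of the Bayesian inversion idea summarized above, the Python sketch below infers a single source strength with a Metropolis sampler. The linear forward model, the observation, and the error standard deviation are invented stand-ins, not CMAQ, FLEXPART-WRF, or the project's actual data.

    ```python
    import math
    import random

    def forward(s):
        # Stand-in "transport model": observed concentration = 0.8 * source.
        return 0.8 * s

    obs, sigma = 4.1, 0.3  # invented observation and observation error

    def log_post(s):
        # Flat prior on s >= 0 times a Gaussian likelihood.
        if s < 0:
            return -math.inf
        return -0.5 * ((obs - forward(s)) / sigma) ** 2

    s, chain = 1.0, []
    for _ in range(20_000):
        prop = s + random.gauss(0.0, 0.5)          # random-walk proposal
        delta = log_post(prop) - log_post(s)
        if random.random() < math.exp(min(0.0, delta)):
            s = prop                               # Metropolis accept
        chain.append(s)

    post = chain[5_000:]                           # discard burn-in
    print(f"posterior mean source strength: {sum(post) / len(post):.2f}")
    ```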

  2. Uranium Measurements and Attributes

    International Nuclear Information System (INIS)

    It may be necessary to find the means to determine unclassified attributes of uranium in nuclear weapons or their components for future transparency initiatives. We briefly describe the desired characteristics of attribute measurement systems for transparency. The determination of uranium attributes, in particular by passive gamma-ray detection, is a formidable challenge.

  3. The attribute measurement technique

    International Nuclear Information System (INIS)

    Any verification measurement performed on potentially classified nuclear material must satisfy two seemingly contradictory constraints. First and foremost, no classified information can be released. At the same time, the monitoring party must have confidence in the veracity of the measurement. An information barrier (IB) is included in the measurement system to protect the potentially classified information while allowing sufficient information transfer to occur for the monitoring party to gain confidence that the material being measured is consistent with the host's declarations concerning that material. The attribute measurement technique incorporates an IB and addresses both concerns by measuring several attributes of the nuclear material and displaying unclassified results through green (indicating that the material does possess the specified attribute) and red (indicating that the material does not possess the specified attribute) lights. The attribute measurement technique has been implemented in the AVNG, an attribute measuring system described in other presentations at this conference. In this presentation, we will discuss four techniques used in the AVNG: (1) the IB, (2) the attribute measurement technique, (3) the use of open and secure modes to increase confidence in the displayed results, and (4) the joint design as a method for addressing both host and monitor needs.
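
    To make the red/green display logic concrete, here is a minimal Python sketch of the idea: only boolean attribute results cross the information barrier, never the measured values. The attribute names and thresholds are invented for illustration and are not the AVNG's actual attributes.

    ```python
    def attribute_lights(measurements, criteria):
        # Reduce classified measurements to unclassified pass/fail results.
        return {name: test(measurements[name]) for name, test in criteria.items()}

    classified = {"pu_mass_kg": 2.4, "pu240_to_pu239": 0.08}  # stays inside the IB
    criteria = {
        "pu_mass_kg": lambda m: m >= 2.0,         # mass above threshold?
        "pu240_to_pu239": lambda r: r <= 0.1,     # isotopic ratio consistent?
    }

    for name, passed in attribute_lights(classified, criteria).items():
        print(name, "GREEN" if passed else "RED")  # only the lights are displayed
    ```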

  4. Measurement Uncertainty and Probability

    Science.gov (United States)

    Willink, Robin

    2013-02-01

    Part I. Principles: 1. Introduction; 2. Foundational ideas in measurement; 3. Components of error or uncertainty; 4. Foundational ideas in probability and statistics; 5. The randomization of systematic errors; 6. Beyond the standard confidence interval; Part II. Evaluation of Uncertainty: 7. Final preparation; 8. Evaluation using the linear approximation; 9. Evaluation without the linear approximations; 10. Uncertainty information fit for purpose; Part III. Related Topics: 11. Measurement of vectors and functions; 12. Why take part in a measurement comparison?; 13. Other philosophies; 14. An assessment of objective Bayesian methods; 15. A guide to the expression of uncertainty in measurement; 16. Measurement near a limit - an insoluble problem?; References; Index.

  5. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.

  6. Traceability and Measurement Uncertainty

    DEFF Research Database (Denmark)

    Tosello, Guido; De Chiffre, Leonardo

    2004-01-01

    and motivating to this important group. The developed e-learning system consists of 12 chapters dealing with the following topics: 1. Basics; 2. Traceability and measurement uncertainty; 3. Coordinate metrology; 4. Form measurement; 5. Surface testing; 6. Optical measurement and testing; 7. Measuring rooms; 8. Machine tool testing; 9. The role of manufacturing metrology for QM; 10. Inspection planning; 11. Quality management of measurements incl. documentation; 12. Advanced manufacturing measurement technology. The present report (which represents section 2 – Traceability and Measurement Uncertainty – of the e...

  7. Attempting Measurement of Psychological Attributes

    OpenAIRE

    Salzberger, Thomas

    2013-01-01

    Measures of psychological attributes abound in the social sciences as much as measures of physical properties do in the physical sciences. However, there are crucial differences between the scientific underpinning of measurement. While measurement in the physical sciences is supported by empirical evidence that demonstrates the quantitative nature of the property assessed, measurement in the social sciences is, in large part, made possible only by a vague, discretionary definition of measurem...

  8. Attempting measurement of psychological attributes.

    Science.gov (United States)

    Salzberger, Thomas

    2013-01-01

    Measures of psychological attributes abound in the social sciences as much as measures of physical properties do in the physical sciences. However, there are crucial differences between the scientific underpinning of measurement. While measurement in the physical sciences is supported by empirical evidence that demonstrates the quantitative nature of the property assessed, measurement in the social sciences is, in large part, made possible only by a vague, discretionary definition of measurement that places hardly any restrictions on empirical data. Traditional psychometric analyses fail to address the requirements of measurement as defined more rigorously in the physical sciences. The construct definitions do not allow for testable predictions; and content validity becomes a matter of highly subjective judgment. In order to improve measurement of psychological attributes, it is suggested to, first, readopt the definition of measurement in the physical sciences; second, to devise an elaborate theory of the construct to be measured that includes the hypothesis of a quantitative attribute; and third, to test the data for the structure implied by the hypothesis of quantity as well as predictions derived from the theory of the construct. PMID:23550264

  9. The Uncertainty of Measurement Results

    International Nuclear Information System (INIS)

    Factors affecting the uncertainty of measurement are explained, basic statistical formulae given, and the theoretical concept explained in the context of pesticide formulation analysis. Practical guidance is provided on how to determine individual uncertainty components within an analytical procedure. An extended and comprehensive table containing the relevant mathematical/statistical expressions elucidates the relevant underlying principles. Appendix I provides a practical elaborated example on measurement uncertainty estimation, above all utilizing experimental repeatability and reproducibility laboratory data. (author)
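
    As a worked illustration of the estimation described above, the Python sketch below combines a Type A repeatability component with two assumed Type B components in quadrature, per the GUM; all numbers are invented, not taken from the pesticide example.

    ```python
    import math

    # Hypothetical replicate results from a formulation assay (g/kg).
    replicates = [250.3, 249.8, 250.9, 250.1, 249.6, 250.4]

    n = len(replicates)
    mean = sum(replicates) / n
    s = math.sqrt(sum((x - mean) ** 2 for x in replicates) / (n - 1))
    u_repeat = s / math.sqrt(n)      # Type A: standard uncertainty of the mean

    # Assumed Type B standard uncertainties, e.g. from certificates (g/kg).
    u_balance, u_purity = 0.12, 0.20

    u_combined = math.sqrt(u_repeat**2 + u_balance**2 + u_purity**2)
    U_expanded = 2 * u_combined      # coverage factor k = 2, ~95 % confidence
    print(f"mean = {mean:.2f} g/kg, u_c = {u_combined:.2f}, U = {U_expanded:.2f}")
    ```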

  10. Entropic uncertainty and measurement reversibility

    Science.gov (United States)

    Berta, Mario; Wehner, Stephanie; Wilde, Mark M.

    2016-07-01

    The entropic uncertainty relation with quantum side information (EUR-QSI) from (Berta et al 2010 Nat. Phys. 6 659) is a unifying principle relating two distinctive features of quantum mechanics: quantum uncertainty due to measurement incompatibility, and entanglement. In these relations, quantum uncertainty takes the form of preparation uncertainty where one of two incompatible measurements is applied. In particular, the ‘uncertainty witness’ lower bound in the EUR-QSI is not a function of a post-measurement state. An insightful proof of the EUR-QSI from (Coles et al 2012 Phys. Rev. Lett. 108 210405) makes use of a fundamental mathematical consequence of the postulates of quantum mechanics known as the non-increase of quantum relative entropy under quantum channels. Here, we exploit this perspective to establish a tightening of the EUR-QSI which adds a new state-dependent term in the lower bound, related to how well one can reverse the action of a quantum measurement. As such, this new term is a direct function of the post-measurement state and can be thought of as quantifying how much disturbance a given measurement causes. Our result thus quantitatively unifies this feature of quantum mechanics with the others mentioned above. We have experimentally tested our theoretical predictions on the IBM quantum experience and find reasonable agreement between our predictions and experimental outcomes.
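
    For reference, the bipartite EUR-QSI being tightened can be written as below (notation reconstructed from the standard statement of the relation, not copied from this paper): X and Z are the two incompatible measurements on system A, B is the quantum memory, and c is the maximum overlap of the measurement bases.

    ```latex
    H(X|B) + H(Z|B) \;\ge\; \log_2\frac{1}{c} + H(A|B),
    \qquad c = \max_{x,z} \bigl|\langle \psi_x | \phi_z \rangle\bigr|^2
    ```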

  11. Incentive salience attribution under reward uncertainty: A Pavlovian model.

    Science.gov (United States)

    Anselme, Patrick

    2015-02-01

    There is a vast literature on the behavioural effects of partial reinforcement in Pavlovian conditioning. Compared with animals receiving continuous reinforcement, partially rewarded animals typically show (a) a slower development of the conditioned response (CR) early in training and (b) a higher asymptotic level of the CR later in training. This phenomenon is known as the partial reinforcement acquisition effect (PRAE). Learning models of Pavlovian conditioning fail to account for it. In accordance with the incentive salience hypothesis, it is here argued that incentive motivation (or 'wanting') plays a more direct role in controlling behaviour than does learning, and reward uncertainty is shown to have an excitatory effect on incentive motivation. The psychological origin of that effect is discussed and a computational model integrating this new interpretation is developed. Many features of CRs under partial reinforcement emerge from this model. PMID:25444780

  12. Measuring the uncertainty of coupling

    Science.gov (United States)

    Zhao, Xiaojun; Shang, Pengjian

    2015-06-01

    A new information-theoretic measure, called coupling entropy, is proposed here to detect the causal links in complex systems by taking into account the inner composition alignment of temporal structure. It is a permutation-based asymmetric association measure to infer the uncertainty of coupling between two time series. The coupling entropy is found to be effective in the analysis of Hénon maps, where different noises are added to test its accuracy and sensitivity. The coupling entropy is also applied to analyze the relationship between unemployment rate and CPI change in the U.S., where the CPI change turns out to be the driving variable while the unemployment rate is the responding one.
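
    The Python sketch below is only loosely related to the paper's construction: it computes a simple permutation-based asymmetric association, the conditional entropy of one series' ordinal patterns given the other's, to show the ingredients involved. It is not the inner-composition-alignment definition of coupling entropy.

    ```python
    from collections import Counter
    from math import log2

    def patterns(x, m=3):
        # Ordinal (permutation) patterns with embedding dimension m.
        return [tuple(sorted(range(m), key=lambda k: x[i + k]))
                for i in range(len(x) - m + 1)]

    def cond_entropy(px, py):
        # H(Y-pattern | X-pattern), an asymmetric association in bits.
        joint, marg, n = Counter(zip(px, py)), Counter(px), len(px)
        return -sum(c / n * log2(c / marg[a]) for (a, _), c in joint.items())

    x = [0.1, 0.5, 0.2, 0.9, 0.4, 0.7, 0.3, 0.8, 0.6, 0.2]
    y = x[1:] + [0.5]   # y crudely "driven" by x (shifted copy)
    print(f"H(y|x) = {cond_entropy(patterns(x), patterns(y)):.3f} bits")
    ```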

  13. Uncertainty Quantification for Safeguards Measurements

    International Nuclear Information System (INIS)

    Part of the scientific method requires all calculated and measured results to be accompanied by a description that meets user needs and provides an adequate statement of the confidence one can have in the results. The scientific art of generating quantitative uncertainty statements is closely related to the mathematical disciplines of applied statistics, sensitivity analysis, optimization, and inversion, but in the field of non-destructive assay, also often draws heavily on expert judgment based on experience. We call this process uncertainty quantification (UQ). Philosophical approaches to UQ along with the formal tools available for UQ have advanced considerably over recent years and these advances, we feel, may be useful to include in the analysis of data gathered from safeguards instruments. This paper sets out what we hope to achieve during a three-year US DOE NNSA research project recently launched to address the potential of advanced UQ to improve safeguards conclusions. By way of illustration we discuss measurement of uranium enrichment by the enrichment meter principle (also known as the infinite thickness technique), which relies on gamma counts near the 186 keV peak directly from 235U. This method has strong foundations in fundamental physics and so we have a basis for the choice of response model — although in some implementations, peak area extraction may result in a bias when applied over a wide dynamic range. It also allows us to describe a common but usually neglected aspect of applying a calibration curve, namely the error structure in the predictors. We illustrate this using a combination of measured data and simulation. (author)
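
    The sketch below shows one way the error structure in the predictor can be carried through a calibration: Monte Carlo draws over both the calibration slope and the measured net 186 keV count rate. The slope, rate, and uncertainties are invented placeholders, not an actual instrument calibration.

    ```python
    import random

    slope, u_slope = 0.052, 0.001   # % enrichment per (counts/s), assumed
    rate, u_rate = 90.0, 2.0        # measured net 186 keV rate and uncertainty

    draws = [random.gauss(slope, u_slope) * random.gauss(rate, u_rate)
             for _ in range(100_000)]
    mean = sum(draws) / len(draws)
    u = (sum((d - mean) ** 2 for d in draws) / (len(draws) - 1)) ** 0.5
    print(f"enrichment = {mean:.2f} +/- {u:.2f} % 235U")
    ```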

  14. Uncertainty of temperature measurement with thermal cameras

    Science.gov (United States)

    Chrzanowski, Krzysztof; Matyszkiel, Robert; Fischer, Joachim; Barela, Jaroslaw

    2001-06-01

    All main international metrological organizations are proposing a parameter called uncertainty as a measure of the accuracy of measurements. A mathematical model that enables the calculation of the uncertainty of temperature measurement with thermal cameras is presented. The standard uncertainty or the expanded uncertainty of temperature measurement of the tested object can be calculated when the bounds within which the real object effective emissivity ε(r), the real effective background temperature T_ba(r), and the real effective atmospheric transmittance τ_a(r) are located can be estimated, and when the intrinsic uncertainty of the thermal camera and the relative spectral sensitivity of the thermal camera are known.
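
    A hedged sketch of how such bounds can be propagated: the Python below inverts a grossly simplified grey-body measurement equation, with radiance taken as T^4 rather than the camera's actual band radiance or the authors' model, while sampling emissivity, background temperature, and transmittance uniformly within assumed bounds.

    ```python
    import random

    def radiance(T):
        # Crude grey-body stand-in for the camera's band radiance.
        return T ** 4

    def object_temp(L_meas, eps, T_ba, tau, T_atm=293.0):
        # Invert L_meas = tau*(eps*L(T) + (1-eps)*L(T_ba)) + (1-tau)*L(T_atm).
        L_obj = ((L_meas - (1 - tau) * radiance(T_atm)) / tau
                 - (1 - eps) * radiance(T_ba)) / eps
        return L_obj ** 0.25

    L_meas = radiance(320.0)  # pretend signal from a 320 K blackbody
    samples = [object_temp(L_meas,
                           random.uniform(0.90, 0.98),    # emissivity bounds
                           random.uniform(288.0, 298.0),  # background T, K
                           random.uniform(0.92, 0.99))    # transmittance
               for _ in range(50_000)]
    mean = sum(samples) / len(samples)
    u = (sum((s - mean) ** 2 for s in samples) / (len(samples) - 1)) ** 0.5
    print(f"T_obj = {mean:.1f} K, standard uncertainty ~ {u:.1f} K")
    ```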

  15. Measuring, Estimating, and Deciding under Uncertainty.

    Science.gov (United States)

    Michel, Rolf

    2016-03-01

    The problem of uncertainty as a general consequence of incomplete information and the approach to quantify uncertainty in metrology is addressed. Then, this paper discusses some of the controversial aspects of the statistical foundation of the concepts of uncertainty in measurements. The basics of the ISO Guide to the Expression of Uncertainty in Measurement as well as of characteristic limits according to ISO 11929 are described and the needs for a revision of the latter standard are explained. PMID:26688360

  16. Exploring the uncertainty in attributing sediment contributions in fingerprinting studies due to uncertainty in determining element concentrations in source areas.

    Science.gov (United States)

    Gomez, Jose Alfonso; Owens, Phillip N.; Koiter, Alex J.; Lobb, David

    2016-04-01

    One of the major sources of uncertainty in attributing sediment sources in fingerprinting studies is the uncertainty in determining the concentrations of the elements used in the mixing model due to the variability of the concentrations of these elements in the source materials (e.g., Kraushaar et al., 2015). The uncertainty in determining the "true" concentration of a given element in each one of the source areas depends on several factors, among them the spatial variability of that element, the sampling procedure and sampling density. Researchers have limited control over these factors, and usually sampling density tends to be sparse, limited by time and the resources available. Monte Carlo analysis has been used regularly in fingerprinting studies to explore the probable solutions within the measured variability of the elements in the source areas, providing an appraisal of the probability of the different solutions (e.g., Collins et al., 2012). This problem can be considered analogous to the propagation of uncertainty in hydrologic models due to uncertainty in the determination of the values of the model parameters, and there are many examples of Monte Carlo analysis of this uncertainty (e.g., Freeze, 1980; Gómez et al., 2001). Some of these model analyses rely on the simulation of "virtual" situations that were calibrated from parameter values found in the literature, with the purpose of providing insight about the response of the model to different configurations of input parameters. This approach - evaluating the answer for a "virtual" problem whose solution could be known in advance - might be useful in evaluating the propagation of uncertainty in mixing models in sediment fingerprinting studies. In this communication, we present the preliminary results of an on-going study evaluating the effect of variability of element concentrations in source materials, sampling density, and the number of elements included in the mixing models. For this study a virtual
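
    For a flavor of the Monte Carlo treatment described above, the sketch below unmixes an invented two-source, one-tracer system, resampling the source concentrations from assumed distributions; real fingerprinting studies use many tracers and a full mixing model.

    ```python
    import random

    # Mixture concentration c_mix = p*c_A + (1-p)*c_B, solved for the source
    # proportion p while the source concentrations vary (all numbers invented).
    c_mix, trials = 42.0, []
    for _ in range(20_000):
        c_A = random.gauss(60.0, 6.0)   # source A tracer conc., mg/kg
        c_B = random.gauss(30.0, 4.0)   # source B tracer conc., mg/kg
        p = (c_mix - c_B) / (c_A - c_B)
        if 0.0 <= p <= 1.0:             # keep physically meaningful solutions
            trials.append(p)

    trials.sort()
    lo, med, hi = (trials[int(f * (len(trials) - 1))] for f in (0.05, 0.5, 0.95))
    print(f"source A proportion: median {med:.2f}, 90% interval [{lo:.2f}, {hi:.2f}]")
    ```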

  17. GM Counters: Potential Measurement Uncertainty Sources

    International Nuclear Information System (INIS)

    This paper describes theoretically the potential measurement uncertainty sources in radiation detection by GM counters. The procedure of obtaining expanded and combined uncertainties is shown experimentally for four technologically different types of GM counters. Based on the experimental results obtained, it has been established that the uncertainties of the influenced random variables depend on the technological solution of the counter reading system and contribute in different ways to the expanded and combined uncertainty of the applied types of GM counters. (author)

  18. The impact of uncertainty and risk measures

    OpenAIRE

    Jo, Soojin

    2012-01-01

    This dissertation seeks to better understand how uncertainty impacts a variety of economic activities and how to measure systemic risk. The first chapter, "The effects of oil price uncertainty on the macroeconomy", focuses on oil price uncertainty and how it affects global economic growth. In particular, I define oil price uncertainty as the time-varying standard deviation of the one-quarter-ahead forecasting error that follows stochastic volatility. Then I use a quarterly VAR with stoch...

  19. RUMINATIONS ON NDA MEASUREMENT UNCERTAINTY COMPARED TO DA UNCERTAINTY

    Energy Technology Data Exchange (ETDEWEB)

    Salaymeh, S.; Ashley, W.; Jeffcoat, R.

    2010-06-17

    It is difficult to overestimate the importance that physical measurements performed with nondestructive assay instruments play throughout the nuclear fuel cycle. They underpin decision making in many areas and support: criticality safety, radiation protection, process control, safeguards, facility compliance, and waste measurements. No physical measurement is complete or indeed meaningful without a defensible and appropriate accompanying statement of uncertainties and how they combine to define the confidence in the results. The uncertainty budget should also be broken down in sufficient detail suitable for the subsequent uses to which the nondestructive assay (NDA) results will be applied. Creating an uncertainty budget and estimating the total measurement uncertainty can often be an involved process, especially for non-routine situations. This is because data interpretation often involves complex algorithms and logic combined in a highly intertwined way. The methods often call on a multitude of input data subject to human oversight. These characteristics can be confusing and pose a barrier to understanding between experts and data consumers. ASTM subcommittee C26-10 recognized this problem in the context of how to summarize and express precision and bias performance across the range of standards and guides it maintains. In order to create a unified approach consistent with modern practice and embracing the continuous improvement philosophy, a consensus arose to prepare a procedure covering the estimation and reporting of uncertainties in nondestructive assay of nuclear materials. This paper outlines the needs analysis, objectives and on-going development efforts. In addition to emphasizing some of the unique challenges and opportunities facing the NDA community, we hope this article will encourage dialog and sharing of best practice and furthermore motivate developers to revisit the treatment of measurement uncertainty.

  20. Ruminations On NDA Measurement Uncertainty Compared TO DA Uncertainty

    International Nuclear Information System (INIS)

    It is difficult to overestimate the importance that physical measurements performed with nondestructive assay instruments play throughout the nuclear fuel cycle. They underpin decision making in many areas and support: criticality safety, radiation protection, process control, safeguards, facility compliance, and waste measurements. No physical measurement is complete or indeed meaningful without a defensible and appropriate accompanying statement of uncertainties and how they combine to define the confidence in the results. The uncertainty budget should also be broken down in sufficient detail suitable for the subsequent uses to which the nondestructive assay (NDA) results will be applied. Creating an uncertainty budget and estimating the total measurement uncertainty can often be an involved process, especially for non-routine situations. This is because data interpretation often involves complex algorithms and logic combined in a highly intertwined way. The methods often call on a multitude of input data subject to human oversight. These characteristics can be confusing and pose a barrier to understanding between experts and data consumers. ASTM subcommittee C26-10 recognized this problem in the context of how to summarize and express precision and bias performance across the range of standards and guides it maintains. In order to create a unified approach consistent with modern practice and embracing the continuous improvement philosophy, a consensus arose to prepare a procedure covering the estimation and reporting of uncertainties in nondestructive assay of nuclear materials. This paper outlines the needs analysis, objectives and on-going development efforts. In addition to emphasizing some of the unique challenges and opportunities facing the NDA community, we hope this article will encourage dialog and sharing of best practice and furthermore motivate developers to revisit the treatment of measurement uncertainty.

  1. Measurement Theory, Nomological Machine And Measurement Uncertainties (In Classical Physics)

    Directory of Open Access Journals (Sweden)

    Ave Mets

    2012-12-01

    Measurement is said to be the basis of exact sciences as the process of assigning numbers to matter (things or their attributes), thus making it possible to apply the mathematically formulated laws of nature to the empirical world. Mathematics and empiria are best accorded to each other in laboratory experiments, which function as what Nancy Cartwright calls a nomological machine: an arrangement generating (mathematical) regularities. On the basis of accounts of measurement errors and uncertainties, I will argue for two claims: (1) both fundamental laws of physics, corresponding to the ideal nomological machine, and phenomenological laws, corresponding to the material nomological machine, lie, being highly idealised relative to the empirical reality; and laboratory measurement data also do not describe properties inherent to the world independently of human understanding of it. (2) Therefore the naive, representational view of measurement and experimentation should be replaced with a more pragmatic or practice-based view.

  2. Uncertainty reconciles complementarity with joint measurability

    International Nuclear Information System (INIS)

    The fundamental principles of complementarity and uncertainty are shown to be related to the possibility of joint unsharp measurements of pairs of noncommuting quantum observables. A joint measurement scheme for complementary observables is proposed. The measured observables are represented as positive operator valued measures (POVMs), whose intrinsic fuzziness parameters are found to satisfy an intriguing pay-off relation reflecting the complementarity. At the same time, this relation represents an instance of a Heisenberg uncertainty relation for measurement imprecisions. A model-independent consideration shows that this uncertainty relation is logically connected with the joint measurability of the POVMs in question

  3. Unsharpness of generalized measurement and its effects in entropic uncertainty relations

    OpenAIRE

    Baek, Kyunghyun; Son, Wonmin

    2016-01-01

    Under the scenario of generalized measurements, it can be questioned how much of quantum uncertainty can be attributed to the measuring device, independent of the uncertainty in the measured system. On the course to answer the question, we suggest a new class of entropic uncertainty relation that differentiates quantum uncertainty from device imperfection due to the unsharpness of measurement. In order to quantify the unsharpness, we suggest and analyze the quantity that characterizes the uncer...

  4. The Depressive Attributions Questionnaire (DAQ): Development of a Short Self-Report Measure of Depressogenic Attributions

    OpenAIRE

    Kleim, B; Gonzalo, D; Ehlers, A

    2011-01-01

    A depressogenic attributional style, i.e., internal, stable and global causal interpretations of negative events, is a stable vulnerability factor for depression. Current measures of pessimistic attributional style can be time-consuming to complete, and some are designed for specific use with student populations. We developed and validated a new short questionnaire suitable for the measurement of depressogenic attributions in clinical settings, the Depressive Attributions Questionnaire (DAQ)....

  5. Uncertainty estimation of ultrasonic thickness measurement

    International Nuclear Information System (INIS)

    The most important factor that should be taken into consideration when selecting an ultrasonic thickness measurement technique is its reliability. Only when the uncertainty of a measurement result is known can it be judged whether the result is adequate for the intended purpose. The objective of this study is to model the ultrasonic thickness measurement function, to identify the most contributing input uncertainty components, and to estimate the uncertainty of the ultrasonic thickness measurement results. We assumed that five error sources contribute significantly to the final error: calibration velocity, transit time, zero offset, measurement repeatability and resolution. By applying the law of propagation of uncertainty to the model function, a combined uncertainty of the ultrasonic thickness measurement was obtained. In this study the model function of ultrasonic thickness measurement was derived. By using this model, the estimation of the uncertainty of the final output result was found to be reliable. It was also found that the most contributing input uncertainty components are calibration velocity, transit time linearity and zero offset. (author)
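
    A minimal worked version of that propagation, assuming the usual pulse-echo model d = v (t - t0) / 2 and invented input values (the abstract gives none); repeatability and resolution terms would enter the quadrature sum in the same way.

    ```python
    import math

    v, u_v = 5920.0, 30.0       # calibration velocity in steel, m/s (assumed)
    t, u_t = 6.8e-6, 2.0e-8     # transit time and its uncertainty, s
    t0, u_t0 = 1.0e-7, 1.5e-8   # zero offset and its uncertainty, s

    d = v * (t - t0) / 2        # model function

    # Sensitivity coefficients: partial derivatives of the model function.
    c_v, c_t, c_t0 = (t - t0) / 2, v / 2, -v / 2

    u_d = math.sqrt((c_v * u_v) ** 2 + (c_t * u_t) ** 2 + (c_t0 * u_t0) ** 2)
    print(f"d = {d * 1e3:.3f} mm, u(d) = {u_d * 1e3:.3f} mm")
    ```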

  6. Uncertainty evaluation in electrochemical noise resistance measurement

    International Nuclear Information System (INIS)

    The uncertainty in statistical noise resistance measurement was evaluated for a type 316 stainless steel in NaCl solutions at room temperature. Sensitivity coefficients were determined for measurement variables such as NaCl concentration, pH, solution temperature, surface roughness, inert gas flow rate and bias potential amplitude. The coefficients were larger for variables such as NaCl concentration, pH, inert gas flow rate and solution temperature, and these were the major factors increasing the combined standard uncertainty of noise resistance. However, the contribution to the uncertainty in noise resistance measurement from the above variables was remarkably low compared to that from repeated measurements of noise resistance, and thus it is difficult to lower the uncertainty in noise resistance measurement significantly by lowering the uncertainties related to NaCl concentration, pH, inert gas flow rate and solution temperature. In addition, the uncertainty in noise resistance measurement was high, amounting to 17.3 % of the mean, indicating that the reliability of noise resistance measurement is low

  7. An approach to multi-attribute utility analysis under parametric uncertainty

    International Nuclear Information System (INIS)

    The techniques of cost-benefit analysis and multi-attribute analysis provide a useful basis for informing decisions in situations where a number of potentially conflicting opinions or interests need to be considered, and where there are a number of possible decisions that could be adopted. When the input data to such decision-making processes are uniquely specified, cost-benefit analysis and multi-attribute utility analysis provide unambiguous guidance on the preferred decision option. However, when the data are not uniquely specified, application and interpretation of these techniques is more complex. Herein, an approach to multi-attribute utility analysis (and hence, as a special case, cost-benefit analysis) when input data are subject to parametric uncertainty is presented. The approach is based on the use of a Monte Carlo technique, and has recently been applied to options for the remediation of former uranium mining liabilities in a number of Central and Eastern European States
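
    The sketch below illustrates the Monte Carlo idea on an invented two-option, two-attribute example; the option names, attribute distributions, weight range, and the additive utility form are all hypothetical, not those of the remediation study.

    ```python
    import random

    options = {
        "cap in place": {"cost": (2.0, 0.3), "benefit": (0.6, 0.10)},
        "excavate":     {"cost": (5.0, 1.0), "benefit": (0.9, 0.05)},
    }

    def utility(attrs, w):
        # Simple additive utility with uncertain attribute values.
        cost = random.gauss(*attrs["cost"])
        benefit = random.gauss(*attrs["benefit"])
        return w * benefit - (1 - w) * cost / 5.0   # cost scaled to [0, 1]-ish

    runs, wins = 10_000, {name: 0 for name in options}
    for _ in range(runs):
        w = random.uniform(0.4, 0.8)                # uncertain weight
        best = max(options, key=lambda n: utility(options[n], w))
        wins[best] += 1

    for name, count in wins.items():
        print(f"{name}: preferred in {100 * count / runs:.0f}% of samples")
    ```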

  8. Subjective judgment on measure of data uncertainty

    International Nuclear Information System (INIS)

    Integral parameters are considered which can be derived from the covariance matrix of the uncertainties and can serve as a general measure of uncertainties in comparisons of different fits. Using realistic examples and simple data model fits with a variable number of parameters, he was able to show that the sum of all elements of the covariance matrix is the best general measure for characterizing and comparing uncertainties obtained in different model and non-model fits. Discussions also included the problem of non-positive definiteness of the covariance matrix of the uncertainty of the cross sections obtained from the covariance matrix of the uncertainty of the parameters in cases where the number of parameters is less than the number of cross section points. Because the numerical inaccuracy of the calculations is always many orders of magnitude larger than machine zero, it was concluded that the calculated eigenvalues of semi-positive definite matrices have no machine zeros. These covariance matrices can be inverted when they are used in the error propagation equations, so the procedure for transforming semi-positive definite matrices into positive definite ones by introducing minimal changes into the matrix (changes that are equivalent to introducing additional non-informative parameters in the model) is generally not needed. But caution should be observed, because there can be cases where uncertainties can be unphysical, e.g. integral parameters estimated with formally non-positive-definite covariance matrices
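
    Why the sum of all covariance elements is a natural overall measure: it equals the variance of the sum of the underlying quantities. The tiny Python check below illustrates this with a made-up 3x3 covariance matrix.

    ```python
    # Var(x1 + x2 + ... + xn) = sum over all elements of the covariance matrix.
    cov = [[ 4.0, 1.2, -0.5],
           [ 1.2, 2.5,  0.8],
           [-0.5, 0.8,  3.0]]

    total = sum(sum(row) for row in cov)   # variance of the summed quantity
    print(f"overall measure: variance {total:.2f}, std. dev. {total ** 0.5:.2f}")
    ```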

  9. Measuring the uncertainty of tapping torque

    DEFF Research Database (Denmark)

    Belluco, Walter; De Chiffre, Leonardo

    An uncertainty budget is carried out for torque measurements performed at the Institut for Procesteknik for the evaluation of cutting fluids. Thirty test blanks were machined with one tool and one fluid, torque diagrams were recorded and the repeatability of single torque measurements was estimat...

  10. Teaching Measurement and Uncertainty the GUM Way

    Science.gov (United States)

    Buffler, Andy; Allie, Saalih; Lubben, Fred

    2008-01-01

    This paper describes a course aimed at developing understanding of measurement and uncertainty in the introductory physics laboratory. The course materials, in the form of a student workbook, are based on the probabilistic framework for measurement as recommended by the International Organization for Standardization in their publication "Guide to…

  11. Uncertainty Measures of Regional Flood Frequency Estimators

    DEFF Research Database (Denmark)

    Rosbjerg, Dan; Madsen, Henrik

    1995-01-01

    Regional flood frequency models have different assumptions regarding homogeneity and inter-site independence. Thus, uncertainty measures of T-year event estimators are not directly comparable. However, having chosen a particular method, the reliability of the estimate should always be stated, e...

  12. Methods for Attribute Measurement and Alternatives to Multiplicity Counting

    International Nuclear Information System (INIS)

    The Attribute Measurement System with Information Barrier (AMS/IB) specification is being developed in support of the Defense Threat Reduction Agency's (DTRA's) Cooperative Threat Reduction (CTR) program for the Mayak Fissile Material Storage Facility. This document discusses the technologies available for attribute measurement, and advantages and disadvantages of alternatives

  13. Using MINITAB software for teaching measurement uncertainty

    International Nuclear Information System (INIS)

    The concept of measurement uncertainty should be regarded as related not only to the concept of doubt about the validity of the measurement result, but also to the quantification of this concept. In this sense the measurement uncertainty is the parameter, associated with the result, characterizing the dispersion of the values that could reasonably be assigned to the measurand (or, more properly, to its representation through a model). This parameter may be, for example, a multiple of the standard deviation but especially, and more importantly, the half width of an interval with a predetermined level of confidence. In these terms, in this paper I attempt, with the help of MINITAB software, to analyze this parameter: with simple and quick operations to evaluate the mean, the standard deviation and the confidence interval, and by the use of several plotted graphs.
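
    The same three operations the abstract performs in MINITAB, shown here as an equivalent hedged Python sketch with invented replicate data; 2.365 is Student's t for 95 % coverage at n - 1 = 7 degrees of freedom.

    ```python
    from math import sqrt
    from statistics import mean, stdev

    data = [9.94, 10.02, 9.98, 10.05, 9.97, 10.01, 9.99, 10.03]
    m, s, n = mean(data), stdev(data), len(data)

    t95 = 2.365                      # Student t, 95 %, 7 degrees of freedom
    half_width = t95 * s / sqrt(n)   # the "half width of an interval" above
    print(f"mean = {m:.3f}, s = {s:.3f}, "
          f"95% CI = [{m - half_width:.3f}, {m + half_width:.3f}]")
    ```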

  14. Quantifying uncertainty in nuclear analytical measurements

    International Nuclear Information System (INIS)

    The lack of international consensus on the expression of uncertainty in measurements was recognised by the late 1970s and led, after the issuance of a series of rather generic recommendations, to the publication of a general publication, known as GUM, the Guide to the Expression of Uncertainty in Measurement. This publication, issued in 1993, was based on co-operation over several years by the Bureau International des Poids et Mesures, the International Electrotechnical Commission, the International Federation of Clinical Chemistry, the International Organization for Standardization (ISO), the International Union of Pure and Applied Chemistry, the International Union of Pure and Applied Physics and the Organisation internationale de metrologie legale. The purpose was to promote full information on how uncertainty statements are arrived at and to provide a basis for harmonized reporting and the international comparison of measurement results. The need to provide more specific guidance to different measurement disciplines was soon recognized and the field of analytical chemistry was addressed by EURACHEM in 1995 in the first edition of a guidance report on Quantifying Uncertainty in Analytical Measurements, produced by a group of experts from the field. That publication translated the general concepts of the GUM into specific applications for analytical laboratories and illustrated the principles with a series of selected examples as a didactic tool. Based on feedback from the actual practice, the EURACHEM publication was extensively reviewed in 1997-1999 under the auspices of the Co-operation on International Traceability in Analytical Chemistry (CITAC), and a second edition was published in 2000. Still, except for a single example on the measurement of radioactivity in GUM, the field of nuclear and radiochemical measurements was not covered. The explicit requirement of ISO standard 17025:1999, General Requirements for the Competence of Testing and Calibration

  15. Uncertainties in the attribution of greenhouse gas warming and implications for climate prediction

    CERN Document Server

    Jones, Gareth S; Mitchell, John F B

    2016-01-01

    Using optimal detection techniques with climate model simulations, most of the observed increase of near surface temperatures over the second half of the twentieth century is attributed to anthropogenic influences. However, the partitioning of the anthropogenic influence to individual factors, such as greenhouse gases and aerosols, is much less robust. Differences in how forcing factors are applied, in their radiative influence and in models' climate sensitivities, substantially influence the response patterns. We find standard optimal detection methodologies cannot fully reconcile this response diversity. By selecting a set of experiments to enable the diagnosing of greenhouse gases and the combined influence of other anthropogenic and natural factors, we find robust detections of well mixed greenhouse gases across a large ensemble of models. Of the observed warming over the 20th century of 0.65 K/century we find, using a multi-model mean not incorporating pattern uncertainty, a well mixed greenhouse gas warm...

  16. Inconclusive quantum measurements and decisions under uncertainty

    CERN Document Server

    Yukalov, V I

    2016-01-01

    We give a mathematical definition for the notion of inconclusive quantum measurements. In physics, such measurements occur at intermediate stages of a complex measurement procedure, with the final measurement result being operationally testable. Since the mathematical structure of Quantum Decision Theory has been developed in analogy with the theory of quantum measurements, the inconclusive quantum measurements correspond, in Quantum Decision Theory, to intermediate stages of decision making in the process of taking decisions under uncertainty. The general form of the quantum probability for a composite event is the sum of a utility factor, describing a rational evaluation of the considered prospect, and of an attraction factor, characterizing irrational, subconscious attitudes of the decision maker. Despite the involved irrationality, the probability of prospects can be evaluated. This is equivalent to the possibility of calculating quantum probabilities without specifying hidden variables. We formulate a ge...

  17. Measuring the uncertainties of discharge measurements: interlaboratory experiments in hydrometry

    Science.gov (United States)

    Le Coz, Jérôme; Blanquart, Bertrand; Pobanz, Karine; Dramais, Guillaume; Pierrefeu, Gilles; Hauet, Alexandre; Despax, Aurélien

    2015-04-01

    Quantifying the uncertainty of streamflow data is key for hydrological sciences. The conventional uncertainty analysis based on error propagation techniques is restricted by the absence of traceable discharge standards and by the weight of difficult-to-predict errors related to the operator, procedure and measurement environment. Field interlaboratory experiments recently emerged as an efficient, standardized method to 'measure' the uncertainties of a given streamgauging technique in given measurement conditions. Both uncertainty approaches are compatible and should be developed jointly in the field of hydrometry. In recent years, several interlaboratory experiments have been reported by different hydrological services. They involved different streamgauging techniques, including acoustic profilers (ADCP), current-meters and handheld radars (SVR). Uncertainty analysis was not always their primary goal: most often, testing the proficiency and homogeneity of instruments, makes and models, procedures and operators was the original motivation. When interlaboratory experiments are processed for uncertainty analysis, once outliers have been discarded, all participants are assumed to be equally skilled and to apply the same streamgauging technique in equivalent conditions. A universal requirement is that all participants simultaneously measure the same discharge, which shall be kept constant within negligible variations. To the best of our knowledge, we were the first to apply the interlaboratory method for computing the uncertainties of streamgauging techniques, according to the authoritative international documents (ISO standards). Several specific issues arise due to the measurement conditions in outdoor canals and rivers. The main limitation is that the best available river discharge references are usually too uncertain to quantify the bias of the streamgauging technique, i.e. the systematic errors that are common to all participants in the experiment. A reference or a
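
    A hedged sketch of the core computation: all participants gauge the same constant discharge, and after outlier screening the spread between their results estimates the technique's uncertainty. The gauging values below are invented.

    ```python
    from statistics import mean, stdev

    gaugings = [14.8, 15.3, 15.1, 14.6, 15.4, 15.0, 14.9, 15.2]  # m3/s, invented

    q_bar = mean(gaugings)
    s_R = stdev(gaugings)    # between-participant standard deviation
    u_rel = s_R / q_bar      # relative standard uncertainty of one gauging
    print(f"mean {q_bar:.2f} m3/s, expanded uncertainty (k=2) ~ {200 * u_rel:.1f} %")
    ```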

  18. Estimation of dose uncertainty measurement in a Nuclear Medicine Service

    International Nuclear Information System (INIS)

    Full text: The accuracy of activity measurement is the first step in the safety and radiation protection of the patient. The uncertainty of activity measurement involves many aspects: calibration factors, geometry of the sample, position, stability of the equipment parameters, etc. Moreover, operational parameters change between different pieces of equipment, and for that reason guaranteeing the traceability of measurements is very important in quality assurance. The objective of this study was to determine the combined uncertainty of activity measurement for the isotopes 131I and 99mTc, using dose calibrators well established in Latin American countries, the Capintec CRC-15R and PTW Curimentor 3, present in our Nuclear Medicine Service. The uncertainty can be defined as a parameter, associated with the result of a measurement, which characterizes the dispersion of the values that could reasonably be attributed to the measurand. The parameter may be, for example, a standard deviation (or a given multiple of it), or the width of a confidence interval. Uncertainty of measurement comprises, in general, many components. Some of these components may be evaluated from the statistical distribution of the results of series of measurements and can be characterized by standard deviations. The other components, which can also be characterized by standard deviations, are evaluated from assumed probability distributions based on experience or other information. These two cases are defined as Type A and Type B estimations, respectively. The combined uncertainty is calculated as the square root of the sum of the squares of all uncertainty components. These uncertainties are associated with the calibration factor, linearity, reproducibility, radioactive background, stability, radioactive decay correction, etc. For that reason the performance of accuracy, precision, linearity of activity response and reproducibility was studied during six months. In order to check the precision and accuracy
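
    A minimal sketch of the root-sum-square combination the abstract describes, with invented relative uncertainties standing in for the components named there.

    ```python
    import math

    components = {                # relative standard uncertainties, % (assumed)
        "calibration factor": 1.5,
        "linearity": 0.8,
        "reproducibility": 1.0,
        "background": 0.3,
        "decay correction": 0.5,
    }
    u_c = math.sqrt(sum(u ** 2 for u in components.values()))
    print(f"combined: {u_c:.2f} %, expanded (k=2): {2 * u_c:.2f} %")
    ```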

  19. METHOD OF DYNAMIC EXPRESSION OF UNCERTAINTY OF MEASUREMENT

    Directory of Open Access Journals (Sweden)

    O. M. Vasilevskyi

    2015-03-01

    A way of expressing the dynamic uncertainty of measurement is proposed that uses the spectral function of the input signal and the frequency response of the measurement tools to assess the uncertainty in dynamic operation.
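
    One hedged reading of that recipe in Python: weight invented input spectral components by an assumed first-order frequency response and take the un-transmitted remainder as the dynamic error contribution. The bandwidth, spectrum, and error measure are illustrative, not the paper's formulation.

    ```python
    import math

    freqs = [1.0, 5.0, 10.0, 50.0, 100.0]   # Hz, invented input spectrum
    amps  = [1.0, 0.8, 0.5, 0.2, 0.1]       # spectral amplitudes, signal units
    f_c   = 20.0                            # assumed instrument bandwidth, Hz

    def H(f):
        # First-order frequency-response magnitude of the instrument.
        return 1.0 / math.sqrt(1.0 + (f / f_c) ** 2)

    # RMS of the un-transmitted part of each component as a dynamic-error measure.
    dyn_err = math.sqrt(sum((a * (1.0 - H(f))) ** 2
                            for f, a in zip(freqs, amps)) / 2.0)
    print(f"dynamic error estimate: {dyn_err:.3f} (signal units)")
    ```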

  20. Inconclusive quantum measurements and decisions under uncertainty

    Directory of Open Access Journals (Sweden)

    Vyacheslav I. Yukalov

    2016-04-01

    We give a mathematical definition for the notion of inconclusive quantum measurements. In physics, such measurements occur at intermediate stages of a complex measurement procedure, with the final measurement result being operationally testable. Since the mathematical structure of Quantum Decision Theory has been developed in analogy with the theory of quantum measurements, the inconclusive quantum measurements correspond, in Quantum Decision Theory, to intermediate stages of decision making in the process of taking decisions under uncertainty. The general form of the quantum probability for a composite event is the sum of a utility factor, describing a rational evaluation of the considered prospect, and of an attraction factor, characterizing irrational, subconscious attitudes of the decision maker. Despite the involved irrationality, the probability of prospects can be evaluated. This is equivalent to the possibility of calculating quantum probabilities without specifying hidden variables. We formulate a general way of evaluation, based on the use of non-informative priors. As an example, we suggest the explanation of the decoy effect. Our quantitative predictions are in very good agreement with experimental data.

  1. Inconclusive quantum measurements and decisions under uncertainty

    Science.gov (United States)

    Yukalov, Vyacheslav; Sornette, Didier

    2016-04-01

    We give a mathematical definition for the notion of inconclusive quantum measurements. In physics, such measurements occur at intermediate stages of a complex measurement procedure, with the final measurement result being operationally testable. Since the mathematical structure of Quantum Decision Theory has been developed in analogy with the theory of quantum measurements, the inconclusive quantum measurements correspond, in Quantum Decision Theory, to intermediate stages of decision making in the process of taking decisions under uncertainty. The general form of the quantum probability for a composite event is the sum of a utility factor, describing a rational evaluation of the considered prospect, and of an attraction factor, characterizing irrational, subconscious attitudes of the decision maker. Despite the involved irrationality, the probability of prospects can be evaluated. This is equivalent to the possibility of calculating quantum probabilities without specifying hidden variables. We formulate a general way of evaluation, based on the use of non-informative priors. As an example, we suggest the explanation of the decoy effect. Our quantitative predictions are in very good agreement with experimental data.

  2. Attribute measurement systems prototypes and equipment in the United States

    International Nuclear Information System (INIS)

    Since the fall of 1997, the United States has been developing prototypical attribute verification technology for potential use by the International Atomic Energy Agency (IAEA) under the Trilateral Initiative. The first attribute measurement equipment demonstration took place in December 1997 at the Lawrence Livermore National Laboratory. This demonstration led to a series of joint Russian Federation/US/IAEA technical discussions that focused on attribute measurement technology that could be applied to plutonium-bearing items having classified characteristics. A first prototype attribute verification system with an information barrier was demonstrated at a Trilateral Technical Workshop in June 1999 at Los Alamos. This prototype nourished further fruitful discussions between the three parties that have in turn led to the documents discussed in a previous paper. Prototype development has continued in the US, under other initiatives, using an integrated approach that includes the Trilateral Initiative. Specifically for the Trilateral Initiative, US development has turned to some peripheral equipment that would support verifications by the IAEA. This equipment includes an authentication tool for measurement systems with information barriers and in situ probes that would facilitate inspections by reducing the need to move material out of storage locations for reverification. In this paper, we will first summarize the development of attribute verification measurement system technology in the US and then report on the status of the development of other equipment to support the Trilateral Initiative.

  3. Expanded uncertainty in measurements of Geiger-Mueller's counter

    International Nuclear Information System (INIS)

    This paper explains the procedure for obtaining the expanded uncertainty in measurement for four types of GM counters with the same counter tube, in cases where the contributors to the uncertainty in measurement are cosmic background radiation and the induced-overvoltage phenomenon. From the experimental results obtained, it is established that the uncertainties of the influenced random variables depend on the technological solution of the counter and therefore contribute differently to the expanded uncertainty in measurement of the applied GM counters

  4. Measurement uncertainties in science and technology

    CERN Document Server

    Grabe, Michael

    2014-01-01

    This book recasts the classical Gaussian error calculus from scratch, the inducements concerning both random and unknown systematic errors. The idea of this book is to create a formalism fit to localize the true values of physical quantities considered – true with respect to the set of predefined physical units. Remarkably enough, the prevailingly practiced forms of error calculus do not feature this property, which however proves, in every respect, to be physically indispensable. The amended formalism, termed Generalized Gaussian Error Calculus by the author, treats unknown systematic errors as biases and brings random errors to bear via enhanced confidence intervals as laid down by Student. The significantly extended second edition thoroughly restructures and systematizes the text as a whole and illustrates the formalism by numerous numerical examples. They demonstrate the basic principles of how to understand uncertainties to localize the true values of measured quantities - a perspective decisive in vi...

  5. Uncertainty in outdoor noise measurement and prediction

    Science.gov (United States)

    Wilson, D. Keith

    2005-09-01

    Standards for outdoor noise are intended to ensure that (1) measurements are representative of actual exposure and (2) noise prediction procedures are consistent and scientifically defensible. Attainment of these worthwhile goals is hindered by the many complexities of sound interaction with the local atmosphere and terrain. The paradigm predominant in current standards might be described as measuring/predicting "somewhat worse than average" conditions. Measurements/predictions are made for moderate downward-refraction conditions, since that is when noise annoyance is most often expected to occur. This paradigm is reasonable and practical, although one might argue that current standards could implement it better. A different, potentially more rigorous, paradigm is to explicitly treat the statistical nature of noise immissions as produced by variability in the atmospheric environment and by uncertainties in its characterization. For example, measurements and prediction techniques could focus on exceedance levels. For this to take place, a better conceptual framework must be developed for predictions that are averaged over environmental states, frequency bands, and various time intervals. Another increasingly important issue is the role of computer models. As these models continue to grow in fidelity and capability, there will be increasing pressure to abandon standard calculations in many applications.

  6. Measurement Uncertainty for Finite Quantum Observables

    Directory of Open Access Journals (Sweden)

    René Schwonnek

    2016-06-01

    Measurement uncertainty relations are lower bounds on the errors of any approximate joint measurement of two or more quantum observables. The aim of this paper is to provide methods to compute optimal bounds of this type. The basic method is semidefinite programming, which we apply to arbitrary finite collections of projective observables on a finite dimensional Hilbert space. The quantification of errors is based on an arbitrary cost function, which assigns a penalty to getting result x rather than y, for any pair (x, y). This induces a notion of optimal transport cost for a pair of probability distributions, and we include an Appendix with a short summary of optimal transport theory as needed in our context. There are then different ways to form an overall figure of merit from the comparison of distributions. We consider three, which are related to different physical testing scenarios. The most thorough test compares the transport distances between the marginals of a joint measurement and the reference observables for every input state. Less demanding is a test just on the states for which a “true value” is known in the sense that the reference observable yields a definite outcome. Finally, we can measure a deviation as a single expectation value by comparing the two observables on the two parts of a maximally-entangled state. All three error quantities have the property that they vanish if and only if the tested observable is equal to the reference. The theory is illustrated with some characteristic examples.

  7. Automating Measurement for Software Process Models using Attribute Grammar Rules

    Directory of Open Access Journals (Sweden)

    Abdul Azim Abd. Ghani

    2007-08-01

    The modelling concept is well accepted in the software engineering discipline. Some software models are built to control the development stages, to measure program quality, or to serve as a medium that gives a better understanding of the actual software systems. Software process modelling nowadays has reached a level that allows software designs to be transformed into programming languages, such as architecture design languages and the unified modelling language. This paper describes the adaptation of the attribute grammar approach to measuring software process models. A tool, called Software Process Measurement Application, was developed to enable measurement according to specified attribute grammar rules. A context-free grammar to read the process model is derived from the IDEF3 standard, and rules were attached to enable the calculation of measurement metrics. The measurement metric values collected were used to aid in determining the decomposition and structuring of processes for the proposed software systems.
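
    As a toy illustration of attaching a measurement rule to grammar productions, a synthesized attribute can propagate a metric up the parse tree; the node format and the metric below are invented, not the tool's IDEF3 grammar.

    ```python
    # Count activity nodes in a nested, IDEF3-like process description.
    process = ("seq",
               ("act", "receive order"),
               ("par",
                ("act", "check stock"),
                ("act", "check credit")),
               ("act", "ship"))

    def count_activities(node):
        # Synthesized attribute: each production sums its children's counts.
        kind, *rest = node
        if kind == "act":
            return 1
        return sum(count_activities(child) for child in rest)

    print(count_activities(process))   # -> 4
    ```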

  8. Estimating discharge measurement uncertainty using the interpolated variance estimator

    Science.gov (United States)

    Cohn, T.; Kiang, J.; Mason, R., Jr.

    2012-01-01

    Methods for quantifying the uncertainty in discharge measurements typically identify various sources of uncertainty and then estimate the uncertainty from each of these sources by applying the results of empirical or laboratory studies. If actual measurement conditions are not consistent with those encountered in the empirical or laboratory studies, these methods may give poor estimates of discharge uncertainty. This paper presents an alternative method for estimating discharge measurement uncertainty that uses statistical techniques and at-site observations. This Interpolated Variance Estimator (IVE) estimates uncertainty based on the data collected during the streamflow measurement and therefore reflects the conditions encountered at the site. The IVE has the additional advantage of capturing all sources of random uncertainty in the velocity and depth measurements. It can be applied to velocity-area discharge measurements that use a velocity meter to measure point velocities at multiple vertical sections in a channel cross section.
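
    The published IVE procedure is not reproduced in this record; the following loose sketch (made-up data, simplified interpolation) only conveys the underlying idea of estimating random uncertainty from at-site deviations between measured verticals and values interpolated from their neighbours.

```python
# Loose sketch of the interpolated-variance idea for a velocity-area
# discharge measurement; not the published IVE algorithm.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])     # station positions (m)
d = np.array([0.5, 0.9, 1.2, 1.1, 0.8, 0.4])     # depths (m)
v = np.array([0.3, 0.55, 0.7, 0.65, 0.5, 0.25])  # point velocities (m/s)

q = d * v  # unit discharge at each vertical (m^2/s)
# Interpolate each interior vertical from its two neighbours; the residuals
# reflect the random scatter encountered at the site.
residuals = q[1:-1] - 0.5 * (q[:-2] + q[2:])
var_est = np.var(residuals, ddof=1)   # interpolated variance estimate
Q = np.sum(q * np.gradient(x))        # midsection-style total discharge
print(Q, np.sqrt(var_est) / np.mean(q))  # discharge and relative scatter
```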

  9. Uncertainty of dose measurement in radiation processing

    DEFF Research Database (Denmark)

    Miller, A.

    1996-01-01

    running debate and presents the author's view, which is based upon experience in radiation processing dosimetry. The origin of all uncertainty components must be identified and can be classified according to Type A and Type B, but it is equally important to separate the uncertainty components into those...

  10. Quantum measurement and uncertainty relations in photon polarization

    Science.gov (United States)

    Edamatsu, Keiichi

    2016-07-01

    Recent theoretical and experimental studies have given rise to new aspects in quantum measurements and error-disturbance uncertainty relations. After a brief review of these issues, we present an experimental test of the error-disturbance uncertainty relations in photon polarization measurements. Using a generalized, strength-variable measurement of a single photon polarization state, we experimentally evaluate the error and disturbance in the measurement process and demonstrate the validity of recently proposed uncertainty relations.

  11. Using a Meniscus to Teach Uncertainty in Measurement

    Science.gov (United States)

    Backman, Philip

    2008-01-01

    I have found that students easily understand that a measurement cannot be exact, but they often seem to lack an understanding of why it is important to know "something" about the magnitude of the uncertainty. This tends to promote an attitude that almost any uncertainty value will do. Such indifference may exist because once an uncertainty is…

  12. Overall measurement uncertainty of k0-based neutron activation analysis

    International Nuclear Information System (INIS)

    Special aspects of the uncertainty quantification in k0-NAA are discussed and applied in accordance with the Guide to the Expression of Uncertainty in Measurement (GUM), on a model case. The uncertainty budget is calculated highlighting the contribution and the importance of the different parameters to be taken into account. The importance of the nuclide-specific and neutron fluence-specific approach in estimating individual uncertainty contributions is emphasized and demonstrated by examples of Au, Cr, Rb, and Sb determinations. (author)
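
    A generic GUM-style combination step might look as follows (a minimal sketch with illustrative component values, not the paper's k0-NAA budget): relative standard uncertainties of uncorrelated inputs are combined in quadrature, here with unit sensitivity coefficients.

```python
# Hedged sketch of a GUM uncertainty budget; component names and values
# are illustrative, not taken from the paper.
import math

budget = {
    "counting statistics": 0.010,   # relative standard uncertainties
    "k0 factor":           0.008,
    "neutron flux ratio":  0.015,
    "detector efficiency": 0.012,
}
u_c = math.sqrt(sum(u**2 for u in budget.values()))  # combined (quadrature)
U = 2 * u_c                                          # expanded, k = 2
print(f"combined: {u_c:.3%}, expanded (k=2): {U:.3%}")
```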

  13. Plutonium Attribute Estimation From Passive NMIS Measurements at VNIIEF

    International Nuclear Information System (INIS)

    Currently, the most relevant application of NMIS for plutonium attribute estimation stems from measurements performed jointly by Oak Ridge National Laboratory (ORNL) and Russian Federal Nuclear Center, All-Russia Scientific Research Institute of Experimental Physics (RFNC-VNIIEF) personnel at RFNC-VNIIEF facilities in Sarov, Russia in June and July 2000. During these measurements at VNIIEF, NMIS was applied in its passive mode to eight unclassified plutonium spherical shells. The shells' properties spanned the following ranges: Composition: δ-phase plutonium metal, constant; Relative 240Pu content (f240Pu): f240Pu = 1.77% (g 240Pu/g Pu), constant; Inner radius (r1): 10.0 mm ≤ r1 ≤ 53.5 mm, mean r1 = 33.5 mm; Outer radius (r2): 31.5 mm ≤ r2 ≤ 60.0 mm, mean r2 = 46.6 mm; Radial thickness (Δr): 6.4 mm ≤ Δr ≤ 30.2 mm, mean Δr = 13.1 mm; and Plutonium mass (mPu): 1829 g ≤ mPu ≤ 4468 g, mean mPu = 3265 g. The features of these measurements were analyzed to extract the attributes of each plutonium shell. Given that the samples measured were of constant composition, geometry, and relative 240Pu content, each shell is completely described by any two of the following four properties: inner radius r1; outer radius r2; mass (one of 239Pu mass m239Pu, 240Pu mass m240Pu, or total Pu mass mPu); and radial thickness Δr. Of these, generally only mass is acknowledged as an attribute of interest; the second property (whichever is chosen) can be considered to be a parameter of the attribute-estimation procedure, much as multiplication is a parameter necessary to accurately estimate fissile mass via most neutron measurements

  14. Measurement of Uncertainty for Vaporous Ethanol Concentration Analyzed by Intoxilyzer® 8000 Instruments.

    Science.gov (United States)

    Hwang, Rong-Jen; Rogers, Craig; Beltran, Jada; Razatos, Gerasimos; Avery, Jason

    2016-06-01

    Reporting a measurement of uncertainty helps to determine the limitations of the method of analysis and aids in laboratory accreditation. This laboratory has conducted a study to estimate a reasonable uncertainty for the mass concentration of vaporous ethanol, in g/210 L, by the Intoxilyzer® 8000 breath analyzer. The uncertainty sources used were: gas chromatograph (GC) calibration adjustment, GC analytical, certified reference material, Intoxilyzer® 8000 calibration adjustment and Intoxilyzer® 8000 analytical. Standard uncertainties attributed to these sources were calculated and separated into proportional and constant standard uncertainties. Both the combined proportional and the constant standard uncertainties were further combined into an expanded uncertainty, expressed both as a percentage and as a unit. To prevent any underreporting of the expanded uncertainty, 0.10 g/210 L was chosen as the defining point for expressing the expanded uncertainty. For the Intoxilyzer® 8000, for all vaporous ethanol results at or above 0.10 g/210 L, the expanded uncertainty will be reported as ±3.6% at a confidence level of 95% (k = 2); for vaporous ethanol results below 0.10 g/210 L, the expanded uncertainty will be reported as ±0.0036 g/210 L at a confidence level of 95% (k = 2). PMID:27107099
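
    The reporting rule described above can be expressed directly (a small sketch; the function name is hypothetical, but the thresholds and coverage factor come from the abstract):

```python
# Sketch of the two-regime expanded-uncertainty reporting rule:
# proportional at or above 0.10 g/210 L, constant below it (both k = 2).
def expanded_uncertainty(result_g_per_210L):
    if result_g_per_210L >= 0.10:
        return result_g_per_210L * 0.036   # +/-3.6%
    return 0.0036                          # +/-0.0036 g/210 L

for r in (0.08, 0.10, 0.25):
    print(r, "+/-", round(expanded_uncertainty(r), 4), "g/210 L")
```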

  15. Adaptive framework for uncertainty analysis in electromagnetic field measurements

    International Nuclear Information System (INIS)

    Misinterpretation of uncertainty in the measurement of the electromagnetic field (EMF) strength may lead to an underestimation of exposure risk or an overestimation of required measurements. The Guide to the Expression of Uncertainty in Measurement (GUM) has internationally been adopted as a de facto standard for uncertainty assessment. However, analyses under such an approach commonly assume unrealistic static models or neglect relevant prior information, resulting in non-robust uncertainties. This study proposes a principled and systematic framework for uncertainty analysis that fuses information from current measurements and prior knowledge. Such a framework dynamically adapts to data by exploiting a likelihood function based on kernel mixtures and incorporates flexible choices of prior information by applying importance sampling. The validity of the proposed techniques is assessed from measurements performed with a broadband radiation meter and an isotropic field probe. The developed framework significantly outperforms the GUM approach, achieving a reduction of 28 % in measurement uncertainty. (authors)

  16. Use of Commercially Available Software in an Attribute Measurement System.

    Energy Technology Data Exchange (ETDEWEB)

    MacArthur, D. W. (Duncan W.); Bracken, D. S. (David S.); Carrillo, L. A. (Louis A.); Elmont, T. H. (Timothy H.); Frame, K. C. (Katherine C.); Hirsch, K. L. (Karen L.)

    2005-01-01

    A major issue in international safeguards of nuclear materials is the ability to verify that processes and materials in nuclear facilities are consistent with declaration without revealing sensitive information. An attribute measurement system (AMS) is a non-destructive assay (NDA) system that utilizes an information barrier to protect potentially sensitive information about the measurement item. A key component is the software utilized for operator interface, data collection, analysis, and attribute determination, as well as the operating system under which they are implemented. Historically, custom software has been used almost exclusively in transparency applications, and it is unavoidable that some amount of custom software is needed. The focus of this paper is to explore the extent to which commercially available software may be used and the relative merits.

  17. Use of Commercially Available Software in an Attribute Measurement System

    International Nuclear Information System (INIS)

    A major issue in international safeguards of nuclear materials is the ability to verify that processes and materials in nuclear facilities are consistent with declaration without revealing sensitive information. An attribute measurement system (AMS) is a non-destructive assay (NDA) system that utilizes an information barrier to protect potentially sensitive information about the measurement item. A key component is the software utilized for operator interface, data collection, analysis, and attribute determination, as well as the operating system under which they are implemented. Historically, custom software has been used almost exclusively in transparency applications, and it is unavoidable that some amount of custom software is needed. The focus of this paper is to explore the extent to which commercially available software may be used and the relative merits.

  18. Uncertainty Measures in Ordered Information System Based on Approximation Operators

    Directory of Open Access Journals (Sweden)

    Bingjiao Fan

    2014-01-01

    This paper focuses on constructing uncertainty measures by the pure rough set approach in ordered information systems. Four types of definitions of lower and upper approximations and the corresponding uncertainty measurement concepts, including accuracy, roughness, approximation quality, approximation accuracy, dependency degree, and importance degree, are investigated. Theoretical analysis indicates that all four types can be used to evaluate the uncertainty in ordered information systems; in particular, we find that the first and third types are essentially the same. To interpret and help understand the approach, experiments on real-life data sets have been conducted to test the four types of uncertainty measures. The results show that these uncertainty measures can effectively measure the uncertainty in ordered information systems.
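
    For readers unfamiliar with the underlying machinery, here is a minimal sketch of lower/upper approximations and the accuracy measure in the classical (unordered) rough set setting; the dominance-based versions studied in the paper follow the same pattern. All data below are made up.

```python
# Sketch: rough set approximations and accuracy for a toy concept.
def approximations(classes, target):
    lower, upper = set(), set()
    for c in classes:
        if c <= target:        # class wholly inside the concept
            lower |= c
        if c & target:         # class overlapping the concept
            upper |= c
    return lower, upper

classes = [{1, 2}, {3, 4}, {5, 6}]   # indiscernibility classes of U
X = {1, 2, 3}                        # target concept
lo, up = approximations(classes, X)
accuracy = len(lo) / len(up)         # accuracy measure
print(lo, up, accuracy, 1 - accuracy)  # roughness = 1 - accuracy
```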

  19. Image Reinforcement or Impairment: The Effects of Co-Branding on Attribute Uncertainty

    OpenAIRE

    Tansev Geylani; J. Jeffrey Inman; Frenkel Ter Hofstede

    2008-01-01

    Co-branding is often used by companies to reinforce the image of their brands. In this paper, we investigate the conditions under which a brand's image is reinforced or impaired as a result of co-branding, and the characteristics of a good partner for a firm considering co-branding for image reinforcement. We address these issues by conceptualizing attribute beliefs as two-dimensional constructs: The first dimension reflects the expected value of the attribute, while the second dimension refl...

  20. Uncertainty Quantification for Quantitative Imaging Holdup Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Bevill, Aaron M [ORNL; Bledsoe, Keith C [ORNL

    2016-01-01

    In nuclear fuel cycle safeguards, special nuclear material "held up" in pipes, ducts, and glove boxes causes significant uncertainty in material-unaccounted-for estimates. Quantitative imaging is a proposed non-destructive assay technique with potential to estimate the holdup mass more accurately and reliably than current techniques. However, uncertainty analysis for quantitative imaging remains a significant challenge. In this work we demonstrate an analysis approach for data acquired with a fast-neutron coded aperture imager. The work includes a calibrated forward model of the imager. Cross-validation indicates that the forward model predicts the imager data typically within 23%; further improvements are forthcoming. A new algorithm based on the chi-squared goodness-of-fit metric then uses the forward model to calculate a holdup confidence interval. The new algorithm removes geometry approximations that previous methods require, making it a more reliable uncertainty estimator.
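
    The chi-squared interval idea can be illustrated schematically (the forward model below is a stand-in toy, not the calibrated imager model; counts and threshold are illustrative): candidate holdup masses are scanned, and those whose misfit stays below the chi-squared threshold form the confidence interval.

```python
# Schematic sketch of a chi-squared goodness-of-fit confidence interval.
import numpy as np
from scipy.stats import chi2

def forward_model(mass):            # hypothetical stand-in forward model:
    return mass * np.array([1.0, 2.0, 1.5, 0.5])   # predicted pixel counts

observed = np.array([10.0, 21.0, 14.0, 6.0])
sigma = np.sqrt(observed)           # Poisson-like counting uncertainty

masses = np.linspace(5, 15, 1001)
chisq = np.array([np.sum(((observed - forward_model(m)) / sigma) ** 2)
                  for m in masses])
threshold = chi2.ppf(0.95, df=len(observed) - 1)
accepted = masses[chisq <= threshold]
print(accepted.min(), accepted.max())   # 95% holdup mass interval
```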

  1. GRAPH THEORY APPROACH TO QUANTIFY UNCERTAINTY OF PERFORMANCE MEASURES

    Directory of Open Access Journals (Sweden)

    Sérgio D. Sousa

    2015-03-01

    In this work, the performance measurement process is studied to quantify the uncertainty induced in the resulting performance measure (PM). To that end, the causes of uncertainty are identified by analysing the activities undertaken in the three stages of the performance measurement process: design and implementation; data collection and recording; and determination and analysis. A quantitative methodology based on graph theory and on the sources of uncertainty of the performance measurement process is used to calculate an uncertainty index to evaluate the level of uncertainty of a given PM or key performance indicator (KPI). An application example is presented. The quantification of PM uncertainty could contribute to better representing the risk associated with a given decision and also to improving the PM to increase its precision and reliability.

  2. OPEN PUBLIC SPACE ATTRIBUTES AND CATEGORIES – COMPLEXITY AND MEASURABILITY

    Directory of Open Access Journals (Sweden)

    Ljiljana Čavić

    2014-12-01

    Within the field of architectural and urban research, this work addresses the complexity of contemporary public space, both in a conceptual and a concrete sense. It aims at systematizing spatial attributes and their categories and at discussing spatial complexity and measurability, in order to reach a more comprehensive understanding, description and analysis of public space. Our aim is to improve the everyday usage of open public space, and we acknowledge users as its crucial factor. There are numerous investigations of the complex urban and architectural reality of public space that recognise the importance of users. However, we did not find any that would holistically account for what users find essential in public space. Based on the incompleteness of existing approaches to open public space and the importance of users for its success, this paper proposes a user-orientated approach. Through an initial survey directed to users, we collected the most important aspects of public spaces in the way that contemporary users see them. The gathered data are analysed and coded into spatial attributes, from which their role in the complexity of open public space and their measurability are discussed. The work results in an inventory of attributes that users find salient in public spaces. It does not discuss their qualitative values or contribution in generating spatial realities; it aims to define them clearly so that any further logical argumentation on open space concerning users may be solidly constructed. Finally, through a categorisation of the attributes, it proposes the disciplinary levels necessary for the analysis of complex urban-architectural reality.

  3. On the different approaches of measuring uncertainty shocks

    OpenAIRE

    Strobel, Johannes

    2015-01-01

    As uncertainty has become an increasingly prominent source of business cycle fluctuations, various uncertainty proxies have been proposed in the literature. This paper shows that uncertainty measures based on realized variables fluctuate more than the measures that are based on forecasts. More precisely, the variation in the realized cross-sectional standard deviation of profit growth and stock returns is larger than the variation in the forecast standard deviation. Moreover, the forecast sta...

  4. Attributes measurements by calorimetry in 15 to 30 minutes

    International Nuclear Information System (INIS)

    An analysis of the early portion of the power-history data collected with both of the IAEA's air-cooled bulk calorimeters has demonstrated that such calorimeters can measure the power from preheated containers of plutonium oxide with an accuracy of 2-5% in 15 to 30 minutes. Material accountancy at plutonium facilities has a need for such a capability for measurement of Pu scrap. Also, the IAEA could use just two calorimeters and a gamma-ray assay system for reliable variables and attributes measurements of plutonium mass during a two-day physical-inventory verification (PIV) at a mixed-oxide (MOX) fuel-fabrication facility. The assay results would be free of the concerns about sample moisture, impurities, and geometry that previously have limited the accuracy of assays based on neutron measurements

  5. Designing a 3rd generation, authenticatable attribute measurement system

    International Nuclear Information System (INIS)

    Attribute measurement systems (AMS) are designed to measure potentially sensitive items containing Special Nuclear Materials to determine if the items possess attributes which fall within an agreed-upon range. Such systems could be used in a treaty to inspect and verify the identity of items in storage without revealing any sensitive information associated with the item. An AMS needs to satisfy two constraints: the host party needs to be sure that none of their sensitive information is released, while the inspecting party wants to have confidence that the limited amount of information they see accurately reflects the properties of the item being measured. The former involves 'certifying' the system and the latter 'authenticating' it. Previous work into designing and building AMS systems has focused more on the questions of certifiability than on the questions of authentication, although a few approaches have been investigated. The next step is to build a 3rd generation AMS which (1) makes the appropriate measurements, (2) can be certified, and (3) can be authenticated (the three generations). This paper will discuss the ideas, options, and process of producing a design for a 3rd generation AMS.

  6. Attributing runoff changes to climate variability and human activities: Uncertainty analysis using four monthly water balance models

    Energy Technology Data Exchange (ETDEWEB)

    Li, Shuai; Xiong, Lihua; Li, Hongyi; Leung, Lai-Yung R.; Demissie, Yonas

    2016-01-08

    Hydrological simulations to delineate the impacts of climate variability and human activities are subject to uncertainties related to both the parameters and the structure of the hydrological models. To analyze the impact of these uncertainties on model performance and to yield more reliable simulation results, a global calibration and multimodel combination method was proposed that integrates the Shuffled Complex Evolution Metropolis (SCEM) algorithm and Bayesian Model Averaging (BMA) of four monthly water balance models. The method was applied to the Weihe River Basin (WRB), the largest tributary of the Yellow River, to determine the contribution of climate variability and human activities to runoff changes. The change point, which was used to determine the baseline period (1956-1990) and the human-impacted period (1991-2009), was derived using both the cumulative curve and Pettitt's test. Results show that the SCEM-based combination provides more skillful deterministic predictions than the best calibrated individual model, resulting in the smallest uncertainty interval of runoff changes attributed to climate variability and human activities. This combination methodology provides a practical and flexible tool for the attribution of runoff changes to climate variability and human activities using hydrological models.
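
    The BMA combination step itself is straightforward; a compact sketch with hypothetical weights and model outputs (not the calibrated SCEM-BMA posterior) follows. The total predictive variance splits into a between-model term and a weighted within-model term.

```python
# Hedged sketch of the Bayesian Model Averaging combination step.
import numpy as np

runoff = np.array([            # runoff simulated by 4 monthly models (mm)
    [32.0, 45.0, 51.0],
    [30.0, 47.0, 49.0],
    [35.0, 44.0, 53.0],
    [31.0, 46.0, 50.0],
])
weights = np.array([0.35, 0.25, 0.15, 0.25])   # BMA posterior model weights
within_var = np.array([4.0, 5.0, 6.0, 4.5])    # per-model predictive variance

bma_mean = weights @ runoff                    # combined prediction
between = weights @ (runoff - bma_mean) ** 2   # between-model spread
total_var = between + float(weights @ within_var)
print(bma_mean, np.sqrt(total_var))            # mean and predictive std
```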

  7. Instrumental measurement of beer taste attributes using an electronic tongue

    Energy Technology Data Exchange (ETDEWEB)

    Rudnitskaya, Alisa, E-mail: alisa.rudnitskaya@gmail.com [Chemistry Department, University of Aveiro, Aveiro (Portugal); Laboratory of Chemical Sensors, Chemistry Department, St. Petersburg University, St. Petersburg (Russian Federation); Polshin, Evgeny [Laboratory of Chemical Sensors, Chemistry Department, St. Petersburg University, St. Petersburg (Russian Federation); BIOSYST/MeBioS, Catholic University of Leuven, W. De Croylaan 42, B-3001 Leuven (Belgium); Kirsanov, Dmitry [Laboratory of Chemical Sensors, Chemistry Department, St. Petersburg University, St. Petersburg (Russian Federation); Lammertyn, Jeroen; Nicolai, Bart [BIOSYST/MeBioS, Catholic University of Leuven, W. De Croylaan 42, B-3001 Leuven (Belgium); Saison, Daan; Delvaux, Freddy R.; Delvaux, Filip [Centre for Malting and Brewing Sciences, Katholieke Universiteit Leuven, Heverelee (Belgium); Legin, Andrey [Laboratory of Chemical Sensors, Chemistry Department, St. Petersburg University, St. Petersburg (Russian Federation)

    2009-07-30

    The present study deals with the evaluation of the electronic tongue multisensor system as an analytical tool for the rapid assessment of taste and flavour of beer. Fifty samples of Belgian and Dutch beers of different types (lager beers, ales, wheat beers, etc.), which were characterized with respect to the sensory properties, were measured using the electronic tongue (ET) based on potentiometric chemical sensors developed in Laboratory of Chemical Sensors of St. Petersburg University. The analysis of the sensory data and the calculation of the compromise average scores was made using STATIS. The beer samples were discriminated using both sensory panel and ET data based on PCA, and both data sets were compared using Canonical Correlation Analysis. The ET data were related to the sensory beer attributes using Partial Least Square regression for each attribute separately. Validation was done based on a test set comprising one-third of all samples. The ET was capable of predicting with good precision 20 sensory attributes of beer, including bitter, sweet, sour, fruity, caramel, artificial, burnt, intensity and body.
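
    A generic sketch of the calibration step, using synthetic data rather than the beer dataset, could use scikit-learn's PLS implementation with a held-out test set of one-third of the samples, as in the study:

```python
# Hedged sketch: PLS regression relating sensor responses to one sensory
# attribute; data are synthetic stand-ins, not the beer measurements.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 20))             # 50 samples x 20 sensor potentials
y = X[:, :3] @ [0.5, -0.2, 0.8] + rng.normal(scale=0.1, size=50)  # "bitter"

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=1/3, random_state=0)
pls = PLSRegression(n_components=3).fit(X_tr, y_tr)
print("R^2 on test set:", pls.score(X_te, y_te))
```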

  8. Instrumental measurement of beer taste attributes using an electronic tongue

    International Nuclear Information System (INIS)

    The present study deals with the evaluation of the electronic tongue multisensor system as an analytical tool for the rapid assessment of taste and flavour of beer. Fifty samples of Belgian and Dutch beers of different types (lager beers, ales, wheat beers, etc.), which were characterized with respect to the sensory properties, were measured using the electronic tongue (ET) based on potentiometric chemical sensors developed in Laboratory of Chemical Sensors of St. Petersburg University. The analysis of the sensory data and the calculation of the compromise average scores was made using STATIS. The beer samples were discriminated using both sensory panel and ET data based on PCA, and both data sets were compared using Canonical Correlation Analysis. The ET data were related to the sensory beer attributes using Partial Least Square regression for each attribute separately. Validation was done based on a test set comprising one-third of all samples. The ET was capable of predicting with good precision 20 sensory attributes of beer, including bitter, sweet, sour, fruity, caramel, artificial, burnt, intensity and body.

  9. Uncertainty determination demonstration program on MC and A measurement systems

    International Nuclear Information System (INIS)

    Statistically propagated limits of error (LOE) for accountability measurements are usually smaller than LOEs derived from historical data. Laboratory measurement quality control programs generate estimates of random and systematic errors for the LOE calculations. The uncertainties of measurement system standards and instrument calibrations are often not included in the estimates of measurement quality control (QC) programs (MCPs); therefore, the uncertainty associated with a measurement system is usually underestimated. A program was conducted at the Savannah River Site (SRS) to evaluate a commercial measurement assurance program software package (JTIPMAP™) that records, charts, and analyzes control standard measurements to determine and control total measurement uncertainty. The software uniquely uses the uncertainty of the standards, the calibration histories and routine QC data to estimate the total uncertainty of a measurement process. The demonstration program involved: training measurement personnel on the principles of process measurement assurance (PMAP) and the use of the software; technical support in setting up PMAPs on gas mass spectrometry, calorimetry, Fourier transform infrared (FTIR) spectrometry, alpha PHA spectrometry, diode array spectrophotometry, and mass standards calibration measurement systems; and determining and evaluating uncertainty estimates for each system. Results of the demonstration program are described and the uncertainties for these measurement systems are summarized in the paper below. The demonstration showed that the training and software provided several useful functions, such as uncertainty determinations that include the standards used, and measurements of independent standards that reveal systematic errors of the measurement process; these produced larger uncertainty estimates than current MCPs. The software will be tested further in pilot programs for D2O measurements, calorimetry and mass standards calibrations

  10. Relating confidence to measured information uncertainty in qualitative reasoning

    Energy Technology Data Exchange (ETDEWEB)

    Chavez, Gregory M [Los Alamos National Laboratory; Zerkle, David K [Los Alamos National Laboratory; Key, Brian P [Los Alamos National Laboratory; Shevitz, Daniel W [Los Alamos National Laboratory

    2010-10-07

    Qualitative reasoning makes use of qualitative assessments provided by subject matter experts to model factors such as security risk. Confidence in a result is important and useful when comparing competing results. Quantifying the confidence in an evidential reasoning result must be consistent and based on the available information. A novel method is proposed to relate confidence to the available information uncertainty in the result using fuzzy sets. Information uncertainty can be quantified through measures of non-specificity and conflict. Fuzzy values for confidence are established from information uncertainty values that lie between the measured minimum and maximum information uncertainty values.
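
    One standard information uncertainty measure mentioned above, non-specificity, is easy to compute for a Dempster-Shafer body of evidence (the formula is the standard generalized Hartley measure; the focal sets and mass values below are hypothetical):

```python
# Illustrative computation of non-specificity N(m) = sum_A m(A) * log2(|A|)
# for a Dempster-Shafer mass assignment over a qualitative "risk" frame.
import math

masses = {
    frozenset({"low"}): 0.5,
    frozenset({"low", "medium"}): 0.3,
    frozenset({"low", "medium", "high"}): 0.2,
}
nonspecificity = sum(m * math.log2(len(A)) for A, m in masses.items())
print(nonspecificity)  # 0 bits = fully specific; log2(3) = maximal here
```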

  11. UNCERTAINTY AND ITS IMPACT ON THE QUALITY OF MEASUREMENT

    Directory of Open Access Journals (Sweden)

    Adel Elahdi M. Yahya

    2012-01-01

    Current practice requires that laboratory measurement and calibration be accredited by national or international bodies and comply with the requirements of ISO 17025 for the accreditation of competent laboratories. Those requirements include stating, in the measurement certificate, the limits of doubt of the testing or measurement process, which allows the customer to judge the quality and efficiency of the measurement. In this study we clarify, from a theoretical standpoint, what uncertainty in measurement is, the standard types of uncertainty, and how to calculate an uncertainty budget, and we show worked examples of the budget calculation for length measurements in the laboratory. After analysing the results obtained during measurement with a coordinate measuring machine (CMM), we found that the Type B (non-statistical) uncertainty in measuring a piece one metre in length was ±1.9257 µm, and that, when using the configured measuring device to measure a screw value of 1.2707 mm, the expanded combined standard uncertainty was ±2.030 µm. We conclude that uncertainty has a greater impact on results measured to a high degree of fineness and a lesser impact on pieces of low fineness, that careful calibration of measuring instruments and equipment against measurement standards is of the utmost importance, and that laboratories must calculate an uncertainty budget as part of measurement evaluation to provide high-quality measurement results.

  12. Strain gauge measurement uncertainties on hydraulic turbine runner blade

    International Nuclear Information System (INIS)

    Strains experimentally measured with strain gauges can differ from those evaluated using the finite element (FE) method. This difference is due mainly to the assumptions and uncertainties inherent to each method. To circumvent this difficulty, we developed a numerical method based on Monte Carlo simulations to evaluate the measurement uncertainties produced by the behaviour of a unidirectional welded gauge, its position uncertainty and its integration effect. This numerical method uses the displacement fields of the studied part evaluated by an FE analysis. The paper presents a case study using in situ data measured on a hydraulic turbine runner. The FE analysis of the turbine runner blade was computed, and our numerical method was used to evaluate the uncertainties on strains measured at five locations with welded strain gauges. Measured strains and their uncertainty ranges were then compared to the estimated strains. The uncertainty ranges obtained extended from 74 με to 165 με. Furthermore, the biases observed between the median of the uncertainty ranges and the FE strains varied from −36 to 36 με. Note that strain gauge measurement uncertainties depend mainly on the displacement fields and the gauge geometry.

  13. Extended component importance measures considering aleatory and epistemic uncertainties

    OpenAIRE

    Sallak, Mohamed; Schon, Walter; Aguirre, Felipe

    2013-01-01

    In this paper, extended component importance measures (Birnbaum importance, RAW, RRW and Criticality importance) considering aleatory and epistemic uncertainties are introduced. The Dempster-Shafer (D-S) theory, which is considered to be a less restricted extension of probability theory, is proposed as a framework for taking into account both aleatory and epistemic uncertainties. The epistemic uncertainty defined in this paper is the total lack of knowledge of the component state.

  14. Measurement uncertainty in pharmaceutical analysis and its application

    OpenAIRE

    Marcus Augusto Lyrio Traple; Alessandro Morais Saviano; Fabiane Lacerda Francisco; Felipe Rebello Lourenço

    2014-01-01

    The measurement uncertainty provides complete information about an analytical result. This is very important because several decisions of compliance or non-compliance are based on analytical results in pharmaceutical industries. The aim of this work was to evaluate and discuss the estimation of uncertainty in pharmaceutical analysis. The uncertainty is a useful tool in the assessment of compliance or non-compliance of in-process and final pharmaceutical products as well as in the assessment o...

  15. Calibration and Measurement Uncertainty Estimation of Radiometric Data: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Habte, A.; Sengupta, M.; Reda, I.; Andreas, A.; Konings, J.

    2014-11-01

    Evaluating the performance of photovoltaic cells, modules, and arrays that form large solar deployments relies on accurate measurements of the available solar resource. Therefore, determining the accuracy of these solar radiation measurements provides a better understanding of investment risks. This paper provides guidelines and recommended procedures for estimating the uncertainty in calibrations and measurements by radiometers using methods that follow the International Bureau of Weights and Measures Guide to the Expression of Uncertainty in Measurement (GUM). Standardized analysis based on these procedures ensures that the uncertainty quoted is well documented.

  16. Point-In-Time Measurement Uncertainty Recapture for RCS Flow

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Byung Ryul; Jang, Ho Cheol; Yune, Seok Jeong; Kim, Eun Kee [Korea Power Engineering and Construction Company, Inc., Daejeon (Korea, Republic of)

    2014-10-15

    In nuclear power plants, RCS flow measurement uncertainty plays an important role in the establishment of flow acceptance criteria. The narrow band of acceptance criteria based on the design limiting uncertainty of the measured RCS flow may lead to a point-in-time violation of the acceptance criteria in a situation where the measured flow is too close to the upper limit of the allowable RCS flow operating band. The measured RCS flow may also approach the lower limit of the acceptance criteria as the operating cycle proceeds. Several measurement uncertainty recapture methods for RCS flow can be applied in a point-in-time situation that fails to meet the acceptance criteria, and a combination of these recapture methods can be utilized to establish a design limiting measurement uncertainty. To recapture the RCS flow measurement uncertainty, possible and practical methods are proposed for use in a point-in-time situation that fails to meet the acceptance criteria. These methods can be used as a design basis methodology to establish the design limiting uncertainty. It is worth noting that the hot and cold leg temperatures have additional redundancy, such as the wide range instrument channel, so the measured operating condition for RCS flow has potential for further recapture; the more of these recapture methods are applied, the more the uncertainty recapture can be improved.

  17. Detailed Uncertainty Analysis of the ZEM-3 Measurement System

    Science.gov (United States)

    Mackey, Jon; Sehirlioglu, Alp; Dynys, Fred

    2014-01-01

    The measurements of the Seebeck coefficient and electrical resistivity are critical to the investigation of all thermoelectric systems. It follows that the measurement uncertainty must be well understood in order to report ZT values which are accurate and trustworthy. A detailed uncertainty analysis of the ZEM-3 measurement system has been performed. The uncertainty analysis calculates the error in the electrical resistivity measurement as a result of sample geometry tolerance, probe geometry tolerance, statistical error, and multimeter uncertainty. The uncertainty on the Seebeck coefficient includes probe wire correction factors, statistical error, multimeter uncertainty, and most importantly the cold-finger effect. The cold-finger effect plagues all potentiometric (four-probe) Seebeck measurement systems, as heat parasitically transfers through the thermocouple probes. The effect leads to an asymmetric over-estimation of the Seebeck coefficient. A thermal finite element analysis allows for quantification of the phenomenon and provides an estimate of the uncertainty of the Seebeck coefficient. The thermoelectric power factor was found to have an uncertainty of 9-14% at high temperature and 9% near room temperature.

  18. Measurement uncertainty in Total Reflection X-ray Fluorescence

    International Nuclear Information System (INIS)

    Total Reflection X-ray Fluorescence (TXRF) spectrometry is a multi-elemental technique using micro-volumes of sample. This work assessed the components contributing to the combined uncertainty budget associated with TXRF measurements using Cu and Fe concentrations in different spiked and natural water samples as an example. The results showed that an uncertainty estimation based solely on the count statistics of the analyte is not a realistic estimation of the overall uncertainty, since the depositional repeatability and the relative sensitivity between the analyte and the internal standard are important contributions to the uncertainty budget. The uncertainty on the instrumental repeatability and sensitivity factor could be estimated and as such, potentially relatively straightforward implemented in the TXRF instrument software. However, the depositional repeatability varied significantly from sample to sample and between elemental ratios and the controlling factors are not well understood. By a lack of theoretical prediction of the depositional repeatability, the uncertainty budget can be based on repeat measurements using different reflectors. A simple approach to estimate the uncertainty was presented. The measurement procedure implemented and the uncertainty estimation processes developed were validated from the agreement with results obtained by inductively coupled plasma — optical emission spectrometry (ICP-OES) and/or reference/calculated values. - Highlights: • The uncertainty of TXRF cannot be realistically described by the counting statistics. • The depositional repeatability is an important contribution to the uncertainty. • Total combined uncertainties for Fe and Cu in waste/mine water samples were 4–8%. • Obtained concentrations agree within uncertainty with reference values.

  19. Measurement uncertainty in Total Reflection X-ray Fluorescence

    Energy Technology Data Exchange (ETDEWEB)

    Floor, G.H., E-mail: geerke.floor@gfz-potsdam.de [GFZ German Research Centre for Geosciences Section 3.4. Earth Surface Geochemistry, Telegrafenberg, 14473 Postdam (Germany); Queralt, I. [Institute of Earth Sciences Jaume Almera ICTJA-CSIC, Solé Sabaris s/n, 08028 Barcelona (Spain); Hidalgo, M.; Marguí, E. [Department of Chemistry, University of Girona, Campus Montilivi s/n, 17071 Girona (Spain)

    2015-09-01

    Total Reflection X-ray Fluorescence (TXRF) spectrometry is a multi-elemental technique using micro-volumes of sample. This work assessed the components contributing to the combined uncertainty budget associated with TXRF measurements using Cu and Fe concentrations in different spiked and natural water samples as an example. The results showed that an uncertainty estimation based solely on the count statistics of the analyte is not a realistic estimation of the overall uncertainty, since the depositional repeatability and the relative sensitivity between the analyte and the internal standard are important contributions to the uncertainty budget. The uncertainty on the instrumental repeatability and sensitivity factor could be estimated and as such, potentially relatively straightforward implemented in the TXRF instrument software. However, the depositional repeatability varied significantly from sample to sample and between elemental ratios and the controlling factors are not well understood. By a lack of theoretical prediction of the depositional repeatability, the uncertainty budget can be based on repeat measurements using different reflectors. A simple approach to estimate the uncertainty was presented. The measurement procedure implemented and the uncertainty estimation processes developed were validated from the agreement with results obtained by inductively coupled plasma — optical emission spectrometry (ICP-OES) and/or reference/calculated values. - Highlights: • The uncertainty of TXRF cannot be realistically described by the counting statistics. • The depositional repeatability is an important contribution to the uncertainty. • Total combined uncertainties for Fe and Cu in waste/mine water samples were 4–8%. • Obtained concentrations agree within uncertainty with reference values.

  20. Attributes and templates from active measurements with 252Cf

    International Nuclear Information System (INIS)

    Active neutron interrogation is useful for the detection of shielded HEU and could also be used for Pu. In an active technique, fissile material is stimulated by an external neutron source to produce fission with the emanation of neutrons and gamma rays. The time distribution of particles leaving the fissile material is measured with respect to the source emission in a variety of ways. A variety of accelerator and radioactive sources can be used. Active interrogation of nuclear weapons/components can be used in two ways: template matching or attribute estimation. Template matching compares radiation signatures with known reference signatures and for treaty applications has the problem of authentication of the reference signatures along with storage and retrieval of templates. Attribute estimation determines, for example, the fissile mass from various features of the radiation signatures and does not require storage of radiation signatures but does require calibration, which can be repeated as necessary. A nuclear materials identification system (NMIS) has been in use at the Oak Ridge Y-12 Plant for verification of weapons components being received and in storage by template matching and has been used with calibrations for attribute (fissile mass) estimation for HEU metal. NMIS employs a 252Cf source of low intensity (6 n/sec) such that the dose at 1 m is approximately twice that on a commercial airline at altitude. The use of such a source presents no significant safety concerns either for personnel or nuclear explosive safety, and has been approved for use at the Pantex Plant on fully assembled weapons systems

  1. Uncertainty and sensitivity analysis and its applications in OCD measurements

    Science.gov (United States)

    Vagos, Pedro; Hu, Jiangtao; Liu, Zhuan; Rabello, Silvio

    2009-03-01

    This article describes an Uncertainty & Sensitivity Analysis package, a mathematical tool that can be an effective time-saver for optimizing OCD models. By including real system noises in the model, an accurate method for predicting measurement uncertainties is shown. The assessment, at an early stage, of the uncertainties, sensitivities and correlations of the parameters to be measured guides the user in the optimization of the OCD measurement strategy. Real examples are discussed, revealing common pitfalls like hidden correlations, and simulation results are compared with real measurements. Special emphasis is given to two different cases: (1) the optimization of the data set of multi-head metrology tools (NI-OCD, SE-OCD); (2) the optimization of the azimuth measurement angle in SE-OCD. With the uncertainty and sensitivity analysis results, the right data set and measurement mode (NI-OCD, SE-OCD or NI+SE OCD) can be easily selected to achieve the best OCD model performance.

  2. Assessment of dose measurement uncertainty using RisøScan

    DEFF Research Database (Denmark)

    Helt-Hansen, J.; Miller, A.

    2006-01-01

    The dose measurement uncertainty of the dosimeter system RisøScan, office scanner and Risø B3 dosimeters has been assessed by comparison with spectrophotometer measurements of the same dosimeters. The reproducibility and the combined uncertainty were found to be approximately 2% and 4%, respectively, at one standard deviation. The subroutine in RisøScan for electron energy measurement is shown to give results that are equivalent to the measurements with a scanning spectrophotometer.

  3. Measuring the Gas Constant "R": Propagation of Uncertainty and Statistics

    Science.gov (United States)

    Olsen, Robert J.; Sattar, Simeen

    2013-01-01

    Determining the gas constant "R" by measuring the properties of hydrogen gas collected in a gas buret is well suited for comparing two approaches to uncertainty analysis using a single data set. The brevity of the experiment permits multiple determinations, allowing for statistical evaluation of the standard uncertainty u[subscript…
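
    For reference, a minimal propagation-of-uncertainty calculation for R = PV/(nT) looks like the following (the measured values and uncertainties are illustrative, not the article's data); because the model is a pure product/quotient, the relative uncertainties add in quadrature.

```python
# Sketch: first-order propagation of uncertainty for R = PV/(nT).
import math

P, u_P = 101.3e3, 0.2e3    # pressure (Pa) and its standard uncertainty
V, u_V = 2.50e-4, 0.05e-4  # volume (m^3)
n, u_n = 1.02e-2, 0.01e-2  # amount of gas (mol)
T, u_T = 296.0, 0.5        # temperature (K)

R = P * V / (n * T)
u_R = R * math.sqrt((u_P/P)**2 + (u_V/V)**2 + (u_n/n)**2 + (u_T/T)**2)
print(f"R = {R:.2f} +/- {u_R:.2f} J/(mol K)")
```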

  4. Evaluation of the uncertainty of environmental measurements of radioactivity

    International Nuclear Information System (INIS)

    Results obtained by measurement of radioactivity have traditionally been associated with an expression of their uncertainty, based on the so-called counting statistics. This is calculated together with the actual result on the assumption that the number of counts observed has a Poisson distribution with equal mean and variance. Most of the nuclear scientific community has, therefore, assumed that it already complied with the latest ISO 17025 requirements. Counting statistics, however, express only the variability observed among repeated measurements of the same sample under the same counting conditions, which is equivalent to the term repeatability used in quantitative analysis. Many other sources of uncertainty need to be taken into account before a statement of the uncertainty of the actual result can be made. As the first link in the traceability chain calibration is always an important uncertainty component in any kind of measurement. For radioactivity measurements in particular counting geometry assumes the greatest importance, because it is often not possible to measure a standard and a control sample under exactly the same conditions. In the case of large samples there are additional uncertainty components associated with sample heterogeneity and its influence on self-absorption and counting efficiency. An uncertainty budget is prepared for existing data for 137Cs in Danish soil, which is shown to account adequately for all sources of uncertainty. (author)
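
    The abstract's central point, that counting statistics alone understate the uncertainty, can be made concrete with a small hypothetical budget (all component values below are made up):

```python
# Illustrative comparison: Poisson counting statistics alone versus a
# budget that also includes calibration and counting-geometry terms.
import math

counts = 10000
u_counting = math.sqrt(counts) / counts        # 1.0% relative (Poisson)
u_calibration = 0.03                           # relative, from the standard
u_geometry = 0.04                              # sample vs. standard geometry
u_total = math.sqrt(u_counting**2 + u_calibration**2 + u_geometry**2)
print(f"counting only: {u_counting:.1%}, full budget: {u_total:.1%}")
```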

  5. Application of Ultrasonic Flow Measurement for Power Uprates Based on Measurement Uncertainty Recapture

    International Nuclear Information System (INIS)

    The largest source of uncertainty in the calculation of reactor power can be attributed to the limited accuracy and potential for undetected degradation of conventional flow nozzles and venturis used for feedwater flow measurement. UFM installations have been carried out with regulatory approval in PWRs and BWRs in the USA, regulatory approval is being progressed for trial installations on commercial nuclear units in Japan, and installations are being considered for PHWRs in Canada. Installations use permanently mounted chordal measurement transducer arrays in laboratory calibrated pipe spools to achieve a measurement accuracy of ±0.28%. In addition to high accuracy, measurement systems have evolved to be highly reliable, with redundancy and self-checking features built in to eliminate failures and the potential for drift and inadvertent overpower conditions. Outputs can be used for thermal power measurement and for feedwater flow process control. Measurement frequency can be set to be compatible with existing systems for thermal power measurement and process control. Contributors to thermal power measurement uncertainty are examined, and the range of potential measurement uncertainty recapture (MUR) is identified. Using industry-accepted practices to carry out MUR calculations, the available thermal power uprate can be predicted. Based on the combined uncertainty of all of the process parameters used in on-line thermal power calculations and the uncertainty assumed in the original licensing basis, available thermal power uprates vary between 1.5 and 2.5% of full power (FP). As the year-to-year power demand in Canada increases, nuclear energy continues to play an essential role in providing secure, stable and affordable electricity. Nuclear energy remains cost-competitive compared to other energy resources while eliminating greenhouse gas emissions. In the last decade, great progress has been achieved in developing new technologies applicable to NPPs, especially

  6. An entropic uncertainty principle for positive operator valued measures

    CERN Document Server

    Rumin, Michel

    2011-01-01

    Extending a recent result by Frank and Lieb, we show an entropic uncertainty principle for mixed states in a Hilbert space relative to pairs of positive operator valued measures that are independent in some sense.

  7. Measurement Uncertainty in Decision Assessment for Telecommunication Systems

    OpenAIRE

    Moschitta, Antonio; Pianegiani, Fernando; Petri, Dario

    2004-01-01

    This paper deals with the effects of measurement uncertainty on decision making. In particular, conformance tests for communication systems equipment are considered, with respect to both consumer and producer risk and the effects of such parameters on the overall costs.

  8. Uncertainty issues on S-CO2 compressor performance measurement

    International Nuclear Information System (INIS)

    This is related to the property variation, pressure ratio and measurement method. Since the SCO2PE facility operates near the critical point with a low-pressure-ratio compressor, one solution for improving measurement uncertainty is to utilize a density meter. However, adding two density meters at the compressor inlet and outlet did not provide a remarkable improvement in the overall uncertainty. Thus, the authors think that a different approach to the performance measurement is required to secure measurement confidence. As further work, identifying an appropriate approximation for the efficiency equation and applying direct measurement of the compressor shaft power to the efficiency calculation will be considered.

  9. Point cloud uncertainty analysis for laser radar measurement system based on error ellipsoid model

    Science.gov (United States)

    Zhengchun, Du; Zhaoyong, Wu; Jianguo, Yang

    2016-04-01

    Three-dimensional laser scanning has become an increasingly popular measurement method in industrial fields as it provides a non-contact means of measuring large objects, whereas the conventional methods are contact-based. However, the data acquisition process is subject to many interference factors, which inevitably cause errors. Therefore, it is necessary to precisely evaluate the accuracy of the measurement results. In this study, an error-ellipsoid-based uncertainty model was applied to 3D laser radar measurement system (LRMS) data. First, a spatial point uncertainty distribution map was constructed according to the error ellipsoid attributes. The single-point uncertainty ellipsoid model was then extended to point-point, point-plane, and plane-plane situations, and the corresponding distance uncertainty models were derived. Finally, verification experiments were performed by using an LRMS to measure the height of a cubic object, and the measurement accuracies were evaluated. The results show that the plane-plane distance uncertainties determined based on the ellipsoid model are comparable to those obtained by actual distance measurements. Thus, this model offers solid theoretical support to enable further LRMS measurement accuracy improvement.
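
    A sketch of the single-point error ellipsoid construction (covariance values below are placeholders): the ellipsoid's semi-axes follow from the eigen-decomposition of the 3x3 covariance matrix of the measured coordinates.

```python
# Sketch: error ellipsoid semi-axes from a 3x3 measurement covariance.
import numpy as np

cov = np.array([[0.04, 0.01, 0.00],   # covariance of (x, y, z), mm^2
                [0.01, 0.09, 0.02],
                [0.00, 0.02, 0.16]])
eigvals, eigvecs = np.linalg.eigh(cov)
k = 2.796                              # sqrt(chi2.ppf(0.95, df=3)): 95% region
semi_axes = k * np.sqrt(eigvals)       # ellipsoid semi-axis lengths (mm)
print(semi_axes)                       # axis lengths
print(eigvecs)                         # axis directions (columns)
```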

  10. On the Uncertainty Principle for Continuous Quantum Measurement

    OpenAIRE

    Miao, Haixing

    2016-01-01

    We revisit the Heisenberg uncertainty principle for continuous quantum measurement with the detector describable by linear response. When the detector is at the quantum limit with minimum uncertainty, the fluctuation and response of a single-input single-output detector are shown to be related via two equalities. We illustrate the result by applying it to an optomechanical device, a typical continuous measurement setup.

  11. Uncertainty Estimation Improves Energy Measurement and Verification Procedures

    Energy Technology Data Exchange (ETDEWEB)

    Walter, Travis; Price, Phillip N.; Sohn, Michael D.

    2014-05-14

    Implementing energy conservation measures in buildings can reduce energy costs and environmental impacts, but such measures cost money to implement, so intelligent investment strategies require the ability to quantify the energy savings by comparing actual energy used to how much energy would have been used in absence of the conservation measures (known as the baseline energy use). Methods exist for predicting baseline energy use, but a limitation of most statistical methods reported in the literature is inadequate quantification of the uncertainty in baseline energy use predictions. However, estimation of uncertainty is essential for weighing the risks of investing in retrofits. Most commercial buildings have, or soon will have, electricity meters capable of providing data at short time intervals. These data provide new opportunities to quantify uncertainty in baseline predictions, and to do so after shorter measurement durations than are traditionally used. In this paper, we show that uncertainty estimation provides greater measurement and verification (M&V) information and helps to overcome some of the difficulties with deciding how much data is needed to develop baseline models and to confirm energy savings. We also show that cross-validation is an effective method for computing uncertainty. In so doing, we extend a simple regression-based method of predicting energy use using short-interval meter data. We demonstrate the methods by predicting energy use in 17 real commercial buildings. We discuss the benefits of uncertainty estimates which can provide actionable decision making information for investing in energy conservation measures.
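
    The cross-validation approach can be condensed to a few lines (synthetic data and a simple linear baseline, not the paper's models): the spread of out-of-fold prediction errors estimates the uncertainty of the baseline energy model.

```python
# Hedged sketch: cross-validated uncertainty for a baseline energy model.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import KFold

rng = np.random.default_rng(1)
X = rng.uniform(0, 35, size=(300, 1))            # outdoor temperature (C)
y = 50 + 2.5 * X[:, 0] + rng.normal(0, 5, 300)   # hourly energy use (kWh)

errors = []
for train, test in KFold(n_splits=5, shuffle=True, random_state=1).split(X):
    model = LinearRegression().fit(X[train], y[train])
    errors.extend(y[test] - model.predict(X[test]))
print("baseline prediction std (kWh):", np.std(errors))
```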

  12. Calculus of the uncertainty in acoustic field measurements: comparative study between the uncertainty propagation method and the distribution propagation method

    OpenAIRE

    Navacerrada Saturio, Maria Angeles; Díaz Sanchidrián, César; Pedrero González, Antonio; Iglesias Martínez, Luis

    2008-01-01

    The new Spanish Regulation on Building Acoustics establishes values and limits for the different acoustic magnitudes, whose fulfilment can be verified by means of field measurements. In this sense, an essential aspect of a field measurement is to give the measured magnitude and the uncertainty associated with that magnitude. In the calculation of the uncertainty it is very common to follow the uncertainty propagation method as described in the Guide to the Expression of Uncertainty in Measurement (GUM...

  13. A Unified Approach for Reporting ARM Measurement Uncertainties Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Campos, E [Argonne National Laboratory; Sisterson, DL [Argonne National Laboratory

    2015-10-01

    The Atmospheric Radiation Measurement (ARM) Climate Research Facility is observationally based, and quantifying the uncertainty of its measurements is critically important. With over 300 widely differing instruments providing over 2,500 datastreams, concise expression of measurement uncertainty is quite challenging. The ARM Facility currently provides data and supporting metadata (information about the data or data quality) to its users through a number of sources. Because the continued success of the ARM Facility depends on the known quality of its measurements, the Facility relies on instrument mentors and the ARM Data Quality Office (DQO) to ensure, assess, and report measurement quality. Therefore, an easily-accessible, well-articulated estimate of ARM measurement uncertainty is needed.

  14. Uncertainty Analysis of Sonic Boom Levels Measured in a Simulator at NASA Langley

    Science.gov (United States)

    Rathsam, Jonathan; Ely, Jeffry W.

    2012-01-01

    A sonic boom simulator has been constructed at NASA Langley Research Center for testing the human response to sonic booms heard indoors. Like all measured quantities, sonic boom levels in the simulator are subject to systematic and random errors. To quantify these errors, and their net influence on the measurement result, a formal uncertainty analysis is conducted. Knowledge of the measurement uncertainty, or range of values attributable to the quantity being measured, enables reliable comparisons among measurements at different locations in the simulator as well as comparisons with field data or laboratory data from other simulators. The analysis reported here accounts for acoustic excitation from two sets of loudspeakers: one loudspeaker set at the facility exterior that reproduces the exterior sonic boom waveform and a second set of interior loudspeakers for reproducing indoor rattle sounds. The analysis also addresses the effect of pressure fluctuations generated when exterior doors of the building housing the simulator are opened. An uncertainty budget is assembled to document each uncertainty component, its sensitivity coefficient, and the combined standard uncertainty. The latter quantity will be reported alongside measurement results in future research reports to indicate data reliability.

  15. Measurement uncertainty in pharmaceutical analysis and its application

    Institute of Scientific and Technical Information of China (English)

    Marcus Augusto Lyrio Traple; Alessandro Morais Saviano; Fabiane Lacerda Francisco; Felipe Rebello Lourençon

    2014-01-01

    The measurement uncertainty provides complete information about an analytical result. This is very important because several decisions of compliance or non-compliance are based on analytical results in pharmaceutical industries. The aim of this work was to evaluate and discuss the estimation of uncertainty in pharmaceutical analysis. The uncertainty is a useful tool in the assessment of compliance or non-compliance of in-process and final pharmaceutical products as well as in the assessment of pharmaceutical equivalence and stability study of drug products.

  16. Measurement uncertainties physical parameters and calibration of instruments

    CERN Document Server

    Gupta, S V

    2012-01-01

    This book fulfills the global need to evaluate measurement results along with their associated uncertainty. Together with the details of uncertainty calculations for many physical parameters, probability distributions and their properties are discussed. Definitions of the various terms are given and will help practicing metrologists grasp the subject. The book helps to establish international standards for evaluating the quality of raw data obtained from various laboratories and for interpreting the results of national metrology institutes in international inter-comparisons. For the routine calibration of instruments, a new idea for the use of pooled variance is introduced. The uncertainty calculations are explained for (i) independent linear inputs, (ii) non-linear inputs and (iii) correlated inputs. The merits and limitations of the Guide to the Expression of Uncertainty in Measurement (GUM) are discussed. Monte Carlo methods for the derivation of the output distribution from the...

  17. Evaluation of measuring results, statement of uncertainty in dosimeter calibrations

    International Nuclear Information System (INIS)

    The method described starts from the requirement that the quantitative statement of a measuring result in dosimetry should contain at least three figures: 1) the measured value or the best estimate of the quantity to be measured, 2) the uncertainty of this value, given by a figure which indicates a certain range around the measured value and which is strongly linked with 3) a figure for the confidence level of this range, i.e. the probability that the (unknown) correct value is embraced by the given uncertainty range. How figures 2) and 3) can be obtained and how they should be quoted in calibration certificates is the subject of these lectures. In addition, the means by which the method may be extended to determining the uncertainty of a measurement performed under conditions which deviate from the calibration conditions is briefly described. (orig.)

  18. Uncertainty evaluation in radon concentration measurement using charcoal canister

    International Nuclear Information System (INIS)

    Active charcoal detectors are used for testing the concentration of radon in dwellings. The method of measurement is based on radon adsorption on coal and measurement of gamma radiation of radon daughters. The contributions to the final measurement uncertainty are identified, based on the equation for radon activity concentration calculation. Different methods for setting the region of interest for gamma spectrometry of canisters were discussed and evaluated. The obtained radon activity concentration and uncertainties do not depend on peak area determination method. - Highlights: • Measurement uncertainty budget for radon activity concentration established. • Three different methods for ROI selection are used and compared. • Recommend to use one continuous ROI, less sensitive to gamma spectrometry system instabilities

  19. Gamma Attribute Measurements - Pu300, Pu600, Pu900

    International Nuclear Information System (INIS)

    Gamma rays are ideal probes for determining attributes of the special nuclear material (SNM) in the transparency regime. Gamma rays are good probes because they interact relatively weakly with the containers that surround the SNM under investigation. In addition, gamma rays carry a great deal of information about the material under investigation. We have leveraged these two characteristics to develop three technologies that have proven useful for the measurement of various attributes of plutonium. These technologies are Pu-300, Pu-600 and Pu-900, which obtain the age, the isotopics and the presence/absence of oxide of a plutonium sample, respectively. Pu-300 obtains the time since the last 241Am separation for a sample of plutonium. This is accomplished by examining the 241Am/241Pu ratio in the energy region from 330-350 keV, hence the name Pu-300. Pu-600 determines the isotopics of the plutonium sample under consideration; more specifically, it determines the 240Pu/239Pu ratio to establish whether the plutonium sample is of weapons quality or not. This analysis is carried out in the energy region from 630-670 keV. Pu-900 determines the absence of PuO2 by searching for a peak at 870.7 keV; if this peak is absent then there is no oxide in the sample. This peak arises from the de-excitation of the first excited state of 17O. The assumption being made is that this state is populated by means of the 17O(α,α') reaction. The first excited state of 17O could also be populated by means of the 14N(α,p) reaction, which might indicate that this is not a good signature for the absence of PuO2; however, in the samples we have measured, this peak is visible in oxide samples and is absent in other samples. In this paper we discuss the physics details of these technologies and also present results of various measurements.

  20. Attributions for sexual situations in men with and without erectile disorder: evidence from a sex-specific attributional style measure.

    Science.gov (United States)

    Scepkowski, Lisa A; Wiegel, Markus; Bach, Amy K; Weisberg, Risa B; Brown, Timothy A; Barlow, David H

    2004-12-01

    This study investigated the attributional styles of men with and without sexual dysfunction for both positive and negative sexual and general events using a sex-specific version of the Attributional Style Questionnaire (Sex-ASQ), and ascertained the preliminary psychometric properties of the measure. The Sex-ASQ was created by embedding 8 hypothetical sexual events (4 positive, 4 negative) among the original 12 events in the Attributional Style Questionnaire (ASQ; C. Peterson, A. Semmel, C. von Baeyer, L. Y. Abramson, G. I. Metalsky, & M. E. Seligman, 1982). The Sex-ASQ was completed by 21 men with a principal DSM-IV diagnosis of Male Erectile Disorder (MED) and 32 male control participants. The psychometrics of the Sex-ASQ were satisfactory, but with the positive sexual event scales found to be less stable and internally consistent than the negative sexual event scales. Reasons for modest reliability of the positive event scales are discussed in terms of the original ASQ. As expected, men with MED did not differ significantly from men without sexual dysfunction in their causal attributions for general events, indicating that both groups exhibited an optimistic attributional style in general. Also as predicted, men with MED made more internal and stable causal attributions for negative sexual events than men without sexual dysfunction, and also rated negative sexual events as more important. For positive sexual events, the 2 groups did not differ in attributional style, with both groups making more external/unstable/specific causal attributions than for positive general events. Differences between explanatory style for sexual versus nonsexual events found in both sexually functional and dysfunctional men lend support for explanatory style models that propose both cross-situational consistency and situational specificity. PMID:15483370

  1. Estimate of the uncertainty in measurement for the determination of mercury in seafood by TDA AAS.

    Science.gov (United States)

    Torres, Daiane Placido; Olivares, Igor R B; Queiroz, Helena Müller

    2015-01-01

    An approach is proposed for estimating the uncertainty in measurement that considers the individual sources related to the different steps of the method under evaluation, as well as the uncertainties estimated from validation data, for the determination of mercury in seafood by thermal decomposition/amalgamation atomic absorption spectrometry (TDA AAS). The method has been fully optimized and validated in an official laboratory of the Ministry of Agriculture, Livestock and Food Supply of Brazil, in order to comply with national and international food regulations and quality assurance, and has been accredited under the ISO/IEC 17025 norm since 2010. The uncertainty estimate was based on six sources of uncertainty for mercury determination in seafood by TDA AAS, following the validation process: linear least-squares regression, repeatability, intermediate precision, correction factor of the analytical curve, sample mass, and standard reference solution. Those that most influenced the measurement uncertainty were sample mass, repeatability, intermediate precision and the calibration curve. The estimated uncertainty in measurement reached a value of 13.39%, which complies with European Regulation EC 836/2011. This figure represents a realistic estimate of routine conditions, since it fairly encompasses the dispersion between the value attributed to the sample and the value measured by the laboratory analysts. From this outcome, it is possible to infer that the validation data (based on calibration curve, recovery and precision), together with the variation in sample mass, can offer a proper estimate of uncertainty in measurement. PMID:26065523
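
A minimal sketch of how several relative uncertainty components can be combined by root-sum-of-squares into a combined and an expanded uncertainty; the component values below are placeholders, not the paper's figures.

```python
import numpy as np

# Hypothetical relative standard uncertainties (%) for the six sources named
# in the abstract; the numbers are invented for illustration only.
components = {
    "linear least-squares regression": 4.0,
    "repeatability": 3.5,
    "intermediate precision": 3.0,
    "calibration-curve correction factor": 2.0,
    "sample mass": 1.0,
    "standard reference solution": 0.8,
}

u_rel = np.sqrt(sum(v ** 2 for v in components.values()))  # combined (RSS)
U_rel = 2 * u_rel                                          # expanded, k = 2
print(f"combined relative standard uncertainty: {u_rel:.2f}%")
print(f"expanded uncertainty (k=2): {U_rel:.2f}%")
```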

  2. Orientation uncertainty goes bananas: An algorithm to visualise the uncertainty sample space on stereonets for oriented objects measured in boreholes

    Science.gov (United States)

    Stigsson, Martin; Munier, Raymond

    2013-07-01

    Measurements of structure orientations are afflicted with uncertainties which arise from many sources. Commonly, such uncertainties involve instrument imprecision, external disturbances and human factors. The aggregated uncertainty depends on the uncertainty of each of the sources. The orientation of an object measured in a borehole (e.g. a fracture) is calculated using four parameters: the bearing and inclination of the borehole and two relative angles of the measured object to the borehole. Each parameter may be a result of one or several measurements. The aim of this paper is to develop a method to both calculate and visualize the aggregated uncertainty resulting from the uncertainty in each of the four geometrical constituents. Numerical methods were used to develop a VBA-application in Microsoft Excel to calculate the aggregated uncertainty. The code calculates two different representations of the aggregated uncertainty: a 1-parameter uncertainty, the ‘minimum dihedral angle’, denoted by Ω; and, a non-parametric visual representation of the uncertainty, denoted by χ. The simple 1-parameter uncertainty algorithm calculates the minimum dihedral angle accurately, but overestimates the probability space that plots as an ellipsoid on a lower hemisphere stereonet. The non-parametric representation plots the uncertainty probability space accurately, usually as a sector of an annulus for steeply inclined boreholes, but is difficult to express numerically. The 1-parameter uncertainty can be used for evaluating statistics of large datasets whilst the non-parametric representation is useful when scrutinizing single or a few objects.
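
A much-simplified numerical sketch of the idea: sample the uncertain angles, map each draw to a unit vector, and summarize the angular deviation from the nominal orientation. The real application composes four angles (borehole bearing and inclination plus two relative angles); the sketch below uses only trend and plunge, with invented uncertainties.

```python
import numpy as np

def unit_vector(trend_deg, plunge_deg):
    """Downward-pointing unit vector from trend/plunge (degrees)."""
    t, p = np.radians(trend_deg), np.radians(plunge_deg)
    return np.array([np.cos(p) * np.sin(t), np.cos(p) * np.cos(t), -np.sin(p)])

rng = np.random.default_rng(0)
N = 50_000

# Nominal orientation and (assumed) standard uncertainties of the angles.
trend0, plunge0 = 120.0, 65.0
u_trend, u_plunge = 2.0, 1.0          # degrees; illustrative values

v0 = unit_vector(trend0, plunge0)
trends = rng.normal(trend0, u_trend, N)
plunges = rng.normal(plunge0, u_plunge, N)
vs = unit_vector(trends, plunges)     # shape (3, N) via broadcasting

# Angular deviation of each sampled orientation from the nominal one.
dev = np.degrees(np.arccos(np.clip(v0 @ vs, -1.0, 1.0)))
print(f"95th percentile angular deviation: {np.percentile(dev, 95):.2f} deg")
```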

  3. Measurement uncertainty evaluation of conicity error inspected on CMM

    Science.gov (United States)

    Wang, Dongxia; Song, Aiguo; Wen, Xiulan; Xu, Youxiong; Qiao, Guifang

    2016-01-01

    The cone is widely used in mechanical design for rotation, centering and fixing, and whether the conicity error can be measured and evaluated accurately directly influences assembly accuracy and working performance. According to the new-generation geometrical product specification (GPS), the error and its measurement uncertainty should be evaluated together. The mathematical model of the minimum zone conicity error is established and an improved immune evolutionary algorithm (IIEA) is proposed to search for the conicity error. In the IIEA, initial antibodies are first generated using quasi-random sequences and two kinds of affinities are calculated. Then, antibody clones are generated and self-adaptively mutated so as to maintain diversity; similar antibodies are suppressed and new random antibodies are generated. Because the mathematical model of conicity error is strongly nonlinear and the input quantities are not independent, it is difficult to use the Guide to the expression of uncertainty in measurement (GUM) method to evaluate measurement uncertainty. An adaptive Monte Carlo method (AMCM) is proposed to estimate measurement uncertainty, in which the number of Monte Carlo trials is selected adaptively and the quality of the numerical results is directly controlled. Cone parts were machined on a CK6140 lathe and measured on a Miracle NC 454 coordinate measuring machine (CMM). The experimental results confirm that the proposed method not only can search for the approximate solution of the minimum zone conicity error (MZCE) rapidly and precisely, but also can evaluate measurement uncertainty and give control variables with an expected numerical tolerance. The conicity errors computed by the proposed method are 20%-40% smaller than those computed by the NC454 CMM software and the evaluation accuracy improves significantly.

  4. Evaluating the uncertainty of input quantities in measurement models

    Science.gov (United States)

    Possolo, Antonio; Elster, Clemens

    2014-06-01

    The Guide to the Expression of Uncertainty in Measurement (GUM) gives guidance about how values and uncertainties should be assigned to the input quantities that appear in measurement models. This contribution offers a concrete proposal for how that guidance may be updated in light of the advances in the evaluation and expression of measurement uncertainty that were made in the course of the twenty years that have elapsed since the publication of the GUM, and also considering situations that the GUM does not yet contemplate. Our motivation is the ongoing conversation about a new edition of the GUM. While generally we favour a Bayesian approach to uncertainty evaluation, we also recognize the value that other approaches may bring to the problems considered here, and focus on methods for uncertainty evaluation and propagation that are widely applicable, including to cases that the GUM has not yet addressed. In addition to Bayesian methods, we discuss maximum-likelihood estimation, robust statistical methods, and measurement models where values of nominal properties play the same role that input quantities play in traditional models. We illustrate these general-purpose techniques in concrete examples, employing data sets that are realistic but that also are of conveniently small sizes. The supplementary material available online lists the R computer code that we have used to produce these examples (stacks.iop.org/Met/51/3/339/mmedia). Although we strive to stay close to clause 4 of the GUM, which addresses the evaluation of uncertainty for input quantities, we depart from it as we review the classes of measurement models that we believe are generally useful in contemporary measurement science. We also considerably expand and update the treatment that the GUM gives to Type B evaluations of uncertainty: reviewing the state-of-the-art, disciplined approach to the elicitation of expert knowledge, and its encapsulation in probability distributions that are usable in

  5. Measurement uncertainty issues in freeze-drying Processes

    OpenAIRE

    Vallan, Alberto; Carullo, Alessio

    2012-01-01

    This paper deals with problems that have to be faced when performing mass and temperature measurements of substances subjected to freeze-drying processes. A brief description of a lyophilization process is initially presented and a deep investigation is performed in order to identify the main uncertainty contributions that affect mass and temperature measurements. A measurement system is then described that has been specifically conceived to work inside a freeze-dryer. Experimental results ar...

  6. A Method to Estimate Uncertainty in Radiometric Measurement Using the Guide to the Expression of Uncertainty in Measurement (GUM) Method; NREL (National Renewable Energy Laboratory)

    Energy Technology Data Exchange (ETDEWEB)

    Habte, A.; Sengupta, M.; Reda, I.

    2015-03-01

    Radiometric data with known and traceable uncertainty is essential for climate change studies to better understand cloud radiation interactions and the earth radiation budget. Further, adopting a known and traceable method of estimating uncertainty with respect to SI ensures that the uncertainty quoted for radiometric measurements can be compared based on documented methods of derivation. Therefore, statements about the overall measurement uncertainty can only be made on an individual basis, taking all relevant factors into account. This poster provides guidelines and recommended procedures for estimating the uncertainty in calibrations and measurements from radiometers. The approach follows the Guide to the Expression of Uncertainty in Measurement (GUM).

  7. Measurement Of Beer Taste Attributes Using An Electronic Tongue

    Science.gov (United States)

    Polshin, Evgeny; Rudnitskaya, Alisa; Kirsanov, Dmitry; Lammertyn, Jeroen; Nicolaï, Bart; Saison, Daan; Delvaux, Freddy R.; Delvaux, Filip; Legin, Andrey

    2009-05-01

    The present work deals with the results of applying an electronic tongue system as an analytical tool for rapid assessment of beer flavour. Fifty samples of Belgian and Dutch beers of different types, characterized with respect to sensory properties and bitterness, were analyzed using the electronic tongue (ET) based on potentiometric chemical sensors. The ET was capable of predicting 10 sensory attributes of beer with good precision, including sweetness, sourness, intensity and body, as well as the most important instrumental parameter, bitterness. These results show good promise for the further development of the ET as a new analytical technique for fast assessment of taste attributes, and bitterness in particular, in the food and brewery industries.

  8. USGS Polar Temperature Logging System, Description and Measurement Uncertainties

    Science.gov (United States)

    Clow, Gary D.

    2008-01-01

    This paper provides an updated technical description of the USGS Polar Temperature Logging System (PTLS) and a complete assessment of the measurement uncertainties. This measurement system is used to acquire subsurface temperature data for climate-change detection in the polar regions and for reconstructing past climate changes using the 'borehole paleothermometry' inverse method. Specifically designed for polar conditions, the PTLS can measure temperatures as low as -60 degrees Celsius with a sensitivity ranging from 0.02 to 0.19 millikelvin (mK). A modular design allows the PTLS to reach depths as great as 4.5 kilometers with a skid-mounted winch unit or 650 meters with a small helicopter-transportable unit. The standard uncertainty (uT) of the ITS-90 temperature measurements obtained with the current PTLS range from 3.0 mK at -60 degrees Celsius to 3.3 mK at 0 degrees Celsius. Relative temperature measurements used for borehole paleothermometry have a standard uncertainty (urT) whose upper limit ranges from 1.6 mK at -60 degrees Celsius to 2.0 mK at 0 degrees Celsius. The uncertainty of a temperature sensor's depth during a log depends on specific borehole conditions and the temperature near the winch and thus must be treated on a case-by-case basis. However, recent experience indicates that when logging conditions are favorable, the 4.5-kilometer system is capable of producing depths with a standard uncertainty (uZ) on the order of 200-250 parts per million.

  9. CO2 Flux Measurement Uncertainty Estimates for NACP

    Science.gov (United States)

    Barr, A.; Hollinger, D.; Richardson, A. D.

    2009-12-01

    We evaluated the uncertainties in eddy-covariance net ecosystem exchange NEE, total ecosystem respiration RE and gross primary production GPP associated with (a) random measurement error and (b) uncertainties in the u* (friction velocity) threshold u*Th for all site-years in the NACP site-level synthesis. The analyses required automated evaluation of the u*Th filter used to identify and reject bad NEE measurements during low-turbulence periods at night. The u*Th detection algorithm was adapted from Papale et al. (2006), modified to use a standard change-point detection algorithm. Uncertainty in the u*Th was estimated by bootstrapping, conducted annually with 1,000 draws per site-year, then pooling all years and calculating the lower and upper 95% confidence intervals from the median and 2.5 and 97.5 percentiles of the pooled u*Th values. Random uncertainties in NEE, RE and GPP were estimated following Richardson et al. (2007). The NEE random uncertainty characteristic curve, which characterizes random uncertainty in NEE as a function of NEE, was estimated for each site-year based on the differences between the measured data and the output of a simple and robust gap-filling model. The estimation procedure began with synthetic NEE data generated by the gap-filling model, introduced gaps (as in the measured data after u*Th filtering), added synthetic noise (defined by the NEE random uncertainty characteristic curve using a Monte-Carlo approach), then filled the gaps in the noisy, gappy synthetic data. The process was repeated 1,000 times for each site-year, and the random uncertainty was estimated from median and the 2.5 and 97.5 percentiles of the gap-filled data. The uncertainties in NEE, RE and GPP associated with uncertainties in the u*Th were evaluated by running the gap-filling routine at 1,000 u*Th values, drawn randomly from the pooled annual bootstrapping estimates. This produced 1,000 realizations of the gap-filled NEE, RE and GPP time series. The
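
The bootstrap step described above can be sketched compactly. The change-point detector below is a crude plateau finder standing in for the (modified) Papale et al. (2006) algorithm named in the abstract, and the flux data are synthetic; only the resampling logic mirrors the described procedure.

```python
import numpy as np

def ustar_threshold(ustar, nee):
    """Toy detector: the u* above which binned mean nighttime NEE reaches a
    plateau.  A stand-in for the published change-point algorithm."""
    order = np.argsort(ustar)
    u, f = ustar[order], nee[order]
    bins = np.array_split(np.arange(len(u)), 20)
    means = np.array([f[b].mean() for b in bins])
    plateau = means[-5:].mean()
    for b, m in zip(bins, means):
        if m >= 0.95 * plateau:
            return u[b].mean()
    return u[bins[-1]].mean()

rng = np.random.default_rng(0)
n = 2000
ustar = rng.uniform(0, 0.8, n)
nee = 5 * np.minimum(ustar / 0.3, 1.0) + rng.normal(0, 0.8, n)  # synthetic

# Bootstrap: resample records with replacement, re-estimate the threshold,
# and read confidence limits off percentiles of the bootstrap draws.
boot = np.empty(1000)
for i in range(1000):
    idx = rng.integers(0, n, n)
    boot[i] = ustar_threshold(ustar[idx], nee[idx])
lo, med, hi = np.percentile(boot, [2.5, 50, 97.5])
print(f"u*Th = {med:.3f} m/s (95% CI {lo:.3f}-{hi:.3f})")
```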

  10. Toward an uncertainty budget for measuring nanoparticles by AFM

    Science.gov (United States)

    Delvallée, A.; Feltin, N.; Ducourtieux, S.; Trabelsi, M.; Hochepied, J. F.

    2016-02-01

    This article reports on the evaluation of an uncertainty budget associated with the measurement of the mean diameter of a nanoparticle (NP) population by atomic force microscopy. The measurement principle consists in measuring the height of a spherical-like NP population to determine the mean diameter and the size distribution. This method assumes that the NPs are well dispersed on the substrate and isolated enough to avoid measurement errors due to agglomeration. Since the measurement is directly affected by the substrate roughness, the NPs were deposited on a mica sheet presenting very low roughness. A complete metrological characterization of the instrument has been carried out and the main error sources have been evaluated. The measuring method has been tested on a population of SiO2 NPs, with homemade software used to build the height distribution histogram taking into account only isolated NPs. Finally, the uncertainty budget, including the main components, has been established for the mean diameter measurement of this NP population. The most important components of this uncertainty budget are the calibration process along the Z-axis, the influence of scanning speed, and the vertical noise level.

  11. Research on a method of shielded HEU attribution measurement

    International Nuclear Information System (INIS)

    A special assay method is presented that detects the high-energy 2614.75 keV gamma ray from 208Tl, a daughter of 232U, from which attributes of the uranium can be inferred. Theoretical analysis and experimental results show that HEU from the U-Pu cycle contains a small amount of 232U, and that the 2614.75 keV gamma ray can be detected even through thick shielding. This demonstrates that the method is feasible for identifying shielded HEU.

  12. Uncertainty analysis of NDA waste measurements using computer simulations

    International Nuclear Information System (INIS)

    Uncertainty assessments for nondestructive radioassay (NDA) systems for nuclear waste are complicated by factors extraneous to the measurement systems themselves. Most notably, characteristics of the waste matrix (e.g., homogeneity) and radioactive source material (e.g., particle size distribution) can have great effects on measured mass values. Under these circumstances, characterizing the waste population is as important as understanding the measurement system in obtaining realistic uncertainty values. When extraneous waste characteristics affect measurement results, the uncertainty results are waste-type specific. The goal becomes to assess the expected bias and precision for the measurement of a randomly selected item from the waste population of interest. Standard propagation-of-errors methods for uncertainty analysis can be very difficult to implement in the presence of significant extraneous effects on the measurement system. An alternative approach that naturally includes the extraneous effects is as follows: (1) Draw a random sample of items from the population of interest; (2) Measure the items using the NDA system of interest; (3) Establish the true quantity being measured using a gold standard technique; and (4) Estimate bias by deriving a statistical regression model comparing the measurements on the system of interest to the gold standard values; similar regression techniques for modeling the standard deviation of the difference values gives the estimated precision. Actual implementation of this method is often impractical. For example, a true gold standard confirmation measurement may not exist. A more tractable implementation is obtained by developing numerical models for both the waste material and the measurement system. A random sample of simulated waste containers generated by the waste population model serves as input to the measurement system model. This approach has been developed and successfully applied to assessing the quantity of
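
A toy version of the simulation-based alternative in steps (1)-(4): generate a synthetic waste population, pass it through a crude measurement model with a known bias, then recover the bias by regression and the precision from the residual scatter. All distributions and the 5% bias below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Steps 1-2: simulate a random sample of waste items and their NDA assays.
n = 300
true_mass = rng.lognormal(mean=0.0, sigma=0.8, size=n)   # g Pu, synthetic
matrix_effect = rng.normal(1.0, 0.15, size=n)            # matrix variability
measured = (true_mass * matrix_effect * 1.05             # 5% injected bias
            + rng.normal(0, 0.05, size=n))               # counting noise

# Steps 3-4: regress measured on "gold standard" values (here, the known
# true masses) to estimate bias; the residual scatter estimates precision.
A = np.column_stack([np.ones(n), true_mass])
(b0, b1), *_ = np.linalg.lstsq(A, measured, rcond=None)
resid = measured - (b0 + b1 * true_mass)

print(f"estimated multiplicative bias: {b1:.3f} (true value 1.05)")
print(f"precision (residual std):      {resid.std(ddof=1):.3f} g")
```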

  13. Generalized uncertainty relations and efficient measurements in quantum systems

    OpenAIRE

    Belavkin, V. P.

    2004-01-01

    We consider two variants of a quantum-statistical generalization of the Cramer-Rao inequality that establishes an invariant lower bound on the mean square error of a generalized quantum measurement. The proposed complex variant of this inequality leads to a precise formulation of a generalized uncertainty principle for arbitrary states, in contrast to Helstrom's symmetric variant in which these relations are obtained only for pure states. A notion of canonical states is introduced and the low...

  14. Uncertainties and re-analysis of glacier mass balance measurements

    OpenAIRE

    Zemp, M.; E. Thibert; Huss, M.; Stumm, D.; Rolstad Denby, C.; Nuth, C.; S. U. Nussbaumer; G. Moholdt; A. Mercer; Mayer, C.; Joerg, P. C.; P. Jansson; B. Hynek; Fischer, A.; Escher-Vetter, H.

    2013-01-01

    Glacier-wide mass balance has been measured for more than sixty years and is widely used as an indicator of climate change and to assess the glacier contribution to runoff and sea level rise. Until present, comprehensive uncertainty assessments have rarely been carried out and mass balance data have often been applied using rough error estimation or without error considerations. In this study, we propose a framework for re-analyzing glacier mass balance series including conceptual and ...

  15. UNCERTAINTIES OF ANION AND TOC MEASUREMENTS AT THE DWPF LABORATORY

    Energy Technology Data Exchange (ETDEWEB)

    Edwards, T.

    2011-04-07

    The Savannah River Remediation (SRR) Defense Waste Processing Facility (DWPF) has identified a technical issue related to the amount of antifoam added to the Chemical Process Cell (CPC). Specifically, due to the long duration of the concentration and reflux cycles for the Sludge Receipt and Adjustment Tank (SRAT), additional antifoam has been required. The additional antifoam has been found to impact the melter flammability analysis as an additional source of carbon and hydrogen. To better understand and control the carbon and hydrogen contributors to the melter flammability analysis, SRR's Waste Solidification Engineering (WSE) has requested, via a Technical Task Request (TTR), that the Savannah River National Laboratory (SRNL) conduct an error evaluation of the measurements of key Slurry Mix Evaporator (SME) anions. SRNL issued a Task Technical and Quality Assurance Plan (TTQAP) [2] in response to that request, and the work reported here was conducted under the auspices of that TTQAP. The TTR instructs SRNL to conduct an error evaluation of anion measurements generated by the DWPF Laboratory using Ion Chromatography (IC) performed on SME samples. The anions of interest include nitrate, oxalate, and formate. Recent measurements of SME samples for these anions as well as measurements of total organic carbon (TOC) were provided to SRNL by DWPF Laboratory Operations (Lab OPS) personnel for this evaluation. This work was closely coordinated with the efforts of others within SRNL that are investigating the Chemical Process Cell (CPC) contributions to the melter flammability. The objective of that investigation was to develop a more comprehensive melter flammability control strategy that when implemented in DWPF will rely on process measurements. Accounting for the uncertainty of the measurements is necessary for successful implementation. The error evaluations conducted as part of this task will facilitate the integration of appropriate uncertainties for the

  16. Measurement uncertainty. A practical guide for Secondary Standards Dosimetry Laboratories

    International Nuclear Information System (INIS)

    The need for international traceability for radiation dose measurements has been understood since the early nineteen-sixties. The benefits of high dosimetric accuracy were recognized, particularly in radiotherapy, where the outcome of treatments is dependent on the radiation dose delivered to patients. When considering radiation protection dosimetry, the uncertainty may be greater than for therapy, but proper traceability of the measurements is no less important. To ensure harmonization and consistency in radiation measurements, the International Atomic Energy Agency (IAEA) and the World Health Organization (WHO) created a Network of Secondary Standards Dosimetry Laboratories (SSDLs) in 1976. An SSDL is a laboratory that has been designated by the competent national authorities to undertake the duty of providing the necessary link in the traceability chain of radiation dosimetry to the international measurement system (SI, for Systeme International) for radiation metrology users. The role of the SSDLs is crucial in providing traceable calibrations; they disseminate calibrations at specific radiation qualities appropriate for the use of radiation measuring instruments. Historically, although the first SSDLs were established mainly to provide radiotherapy level calibrations, the scope of their work has expanded over the years. Today, many SSDLs provide traceability for radiation protection measurements and diagnostic radiology in addition to radiotherapy. Some SSDLs, with the appropriate facilities and expertise, also conduct quality audits of the clinical use of the calibrated dosimeters - for example, by providing postal dosimeters for dose comparisons for medical institutions or on-site dosimetry audits with an ion chamber and other appropriate equipment. The requirements for traceable and reliable calibrations are becoming more important. For example, for international trade where radiation products are manufactured within strict quality control systems, it is

  17. Measurement of nuclear activity with Ge detectors and its uncertainty

    International Nuclear Information System (INIS)

    The objective of this work is to analyse the influence quantities that affect the activity measurement of isolated gamma-emitting radioactive sources prepared by the gravimetric method, and to determine the uncertainty of such a measurement when it is carried out with a gamma spectrometry system with a germanium detector. The work is developed in five chapters. The first, named Basic principles, briefly discusses the meaning of the word Measurement and its implications and introduces the concepts used in the rest of the work. The second chapter presents the gravimetric method used to manufacture the isolated gamma-emitting sources, addresses the problem of determining the main influence quantities that affect the measurement of their activity, and derives the respective correction factors and their uncertainties. The third chapter describes the gamma spectrometry system used in this work for the measurement of the activity of isolated sources, together with its performance and the experimental arrangement used. In the fourth chapter the three previous items are applied to determine the uncertainty that would be obtained when measuring an isolated radioactive source prepared by the gravimetric method under the least favourable experimental conditions predicted in chapter two. The conclusions are presented in the fifth chapter and are applied to establish the optimum conditions for measuring the activity of an isolated gamma-emitting radioactive source with a germanium-detector spectrometer. (Author)

  18. Using measurement uncertainty in decision-making and conformity assessment

    Science.gov (United States)

    Pendrill, L. R.

    2014-08-01

    Measurements often provide an objective basis for making decisions, perhaps when assessing whether a product conforms to requirements or whether one set of measurements differs significantly from another. There is increasing appreciation of the need to account for the role of measurement uncertainty when making decisions, so that a ‘fit-for-purpose’ level of measurement effort can be set prior to performing a given task. Better mutual understanding between the metrologist and those ordering such tasks about the significance and limitations of the measurements when making decisions of conformance will be especially useful. Decisions of conformity are, however, currently made in many important application areas, such as when addressing the grand challenges (energy, health, etc), without a clear and harmonized basis for sharing the risks that arise from measurement uncertainty between the consumer, supplier and third parties. In reviewing, in this paper, the state of the art of the use of uncertainty evaluation in conformity assessment and decision-making, two aspects in particular—the handling of qualitative observations and of impact—are considered key to bringing more order to the present diverse rules of thumb of more or less arbitrary limits on measurement uncertainty and percentage risk in the field. (i) Decisions of conformity can be made on a more or less quantitative basis—referred in statistical acceptance sampling as by ‘variable’ or by ‘attribute’ (i.e. go/no-go decisions)—depending on the resources available or indeed whether a full quantitative judgment is needed or not. There is, therefore, an intimate relation between decision-making, relating objects to each other in terms of comparative or merely qualitative concepts, and nominal and ordinal properties. (ii) Adding measures of impact, such as the costs of incorrect decisions, can give more objective and more readily appreciated bases for decisions for all parties concerned. Such
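
For the quantitative ("by variable") case, the core calculation is the probability of conformance given a measurement result and its standard uncertainty. The sketch below follows the general Gaussian approach of JCGM 106 on conformity assessment; the numbers are illustrative.

```python
from math import erf, sqrt

def conformance_probability(y, u, lower=None, upper=None):
    """Probability that the true value lies inside the tolerance interval,
    given measured value y and standard uncertainty u, assuming a Gaussian
    distribution for the measurand (JCGM 106-style calculation)."""
    phi = lambda z: 0.5 * (1.0 + erf(z / sqrt(2.0)))  # standard normal CDF
    p = 1.0
    if upper is not None:
        p -= 1.0 - phi((upper - y) / u)   # probability of exceeding upper
    if lower is not None:
        p -= phi((lower - y) / u)         # probability of undershooting lower
    return p

# A result just inside an upper specification limit: the consumer's risk of
# a false accept is the complement of the conformance probability.
p = conformance_probability(y=9.7, u=0.2, upper=10.0)
print(f"P(conforming) = {p:.3f}; risk of false accept = {1 - p:.3f}")
```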

  19. Measuring uncertainty in modeling toxic concentrations in the Niagara River

    Science.gov (United States)

    Franceschini, S.; Tsai, C.

    2004-12-01

    degree of accuracy in the estimation of the first few statistical moments of a model output distribution. Furthermore, the probabilistic analysis can be used as a more rigorous method to compare the modeled results with established water quality criteria. In this study, the toxic concentrations computed at the end of the Niagara River and their estimated variability will be compared with field data measurements. The purpose of this comparison is two-fold: (a) to evaluate the accuracy of the Modified Rosenblueth method in measuring the uncertainty of toxic concentration in the Niagara River and (b) to quantify the risk of exceeding established water quality standards when such uncertainty is accounted for.
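
Rosenblueth's classical two-point estimate method, on which the modified variant mentioned above builds, can be sketched briefly: evaluate the model at the 2^n corners given by mean ± one standard deviation of each input and weight the outputs equally. The decay model and parameter values below are invented, not the Niagara River model's.

```python
import numpy as np
from itertools import product

def rosenblueth(model, means, stds):
    """Rosenblueth two-point estimates (uncorrelated, symmetric inputs):
    evaluate the model at all 2^n sign combinations of mean +/- one standard
    deviation; equal weights give the output mean and standard deviation."""
    means, stds = np.asarray(means), np.asarray(stds)
    outs = np.array([model(means + np.array(signs) * stds)
                     for signs in product((-1.0, 1.0), repeat=len(means))])
    return outs.mean(), outs.std(ddof=0)

# Toy first-order decay of a toxic concentration over travel time.
model = lambda x: x[0] * np.exp(-x[1] * x[2])   # C0 * exp(-k * t)
mean_out, std_out = rosenblueth(model,
                                means=[100.0, 0.10, 5.0],   # C0, k, t
                                stds=[10.0, 0.02, 0.5])
print(f"concentration: {mean_out:.1f} +/- {std_out:.1f} ug/L")
```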

  20. Measurement Uncertainty Analysis of the Strain Gauge Based Stabilographic Platform

    Directory of Open Access Journals (Sweden)

    Walendziuk Wojciech

    2014-08-01

    Full Text Available The present article describes the construction of a stabilographic platform which records a standing patient's deflection from their point of balance. The device consists of a toughened glass slab supported by 4 force sensors. The force transducers are connected to a measurement system based on a 24-bit ADC, which acquires the slight body movements of a patient. The data are then transferred to the computer in real time and analyzed. The article explains the principle of operation as well as the algorithm for the measurement uncertainty of the COP (Centre of Pressure) coordinates (x, y).
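
The COP computation itself reduces to a force-weighted mean of the sensor positions. A minimal sketch, assuming four corner-mounted vertical-force sensors on a square plate (geometry and readings invented):

```python
import numpy as np

# Sensor coordinates at the four corners of the plate (metres); the geometry
# and force readings below are illustrative, not the article's values.
corners = np.array([[-0.2, -0.2], [0.2, -0.2], [0.2, 0.2], [-0.2, 0.2]])

def centre_of_pressure(forces):
    """COP as the force-weighted mean of the sensor positions.

    forces: the four vertical force readings (N), in corner order."""
    forces = np.asarray(forces, dtype=float)
    return forces @ corners / forces.sum()

print(centre_of_pressure([180.0, 200.0, 210.0, 190.0]))  # -> [x, y] in metres
```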

  1. Object-oriented software for evaluating measurement uncertainty

    International Nuclear Information System (INIS)

    An earlier publication (Hall 2006 Metrologia 43 L56–61) introduced the notion of an uncertain number that can be used in data processing to represent quantity estimates with associated uncertainty. The approach can be automated, allowing data processing algorithms to be decomposed into convenient steps, so that complicated measurement procedures can be handled. This paper illustrates the uncertain-number approach using several simple measurement scenarios and two different software tools. One is an extension library for Microsoft Excel®. The other is a special-purpose calculator using the Python programming language. (paper)
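
A toy rendering of the uncertain-number idea in Python: wrap a value and its standard uncertainty in an object whose arithmetic propagates uncertainty to first order. Real implementations, such as the tools described above, also track correlations between intermediate results; this sketch assumes independent operands.

```python
import math

class UncertainNumber:
    """Minimal 'uncertain number': a value with a standard uncertainty,
    propagated to first order through + and * for independent inputs.
    A toy version of the concept in Hall (2006), not the paper's software."""
    def __init__(self, x, u):
        self.x, self.u = x, u

    def __add__(self, other):
        return UncertainNumber(self.x + other.x,
                               math.hypot(self.u, other.u))

    def __mul__(self, other):
        x = self.x * other.x
        u = abs(x) * math.hypot(self.u / self.x, other.u / other.x)
        return UncertainNumber(x, u)

    def __repr__(self):
        return f"{self.x:.4g} +/- {self.u:.2g}"

# V = I * R with independent estimates of current and resistance.
I = UncertainNumber(1.000, 0.005)   # A
R = UncertainNumber(50.0, 0.1)      # ohm
print(I * R)                        # ~ 50 +/- 0.27
```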

  2. Measurement Uncertainty Estimation of a Robust Photometer Circuit

    Directory of Open Access Journals (Sweden)

    Jesús de Vicente

    2009-04-01

    Full Text Available In this paper the uncertainty of a robust photometer circuit (RPC) was estimated. Here, the RPC was considered as a measurement system, having input quantities that were inexactly known, and output quantities that consequently were also inexactly known. The input quantities represent information obtained from calibration certificates, manufacturers' specifications, and tabulated data; they comprised the electronic components of the RPC, the parameters of the photodiode model and its sensitivity at 670 nm. The output quantities, which describe the transfer function of the electrical part of the photodiode, were the coefficients of both the numerator and denominator of the closed-loop transfer function of the RPC. As an example, the gain and phase shift of the RPC versus frequency were evaluated from the transfer function, together with their uncertainties and correlation coefficient. The results confirm the robustness of the photodiode design.

  3. Measurement uncertainties in regression analysis with scarcity of data

    International Nuclear Information System (INIS)

    The evaluation of measurement uncertainty, in certain fields of science, faces the problem of scarcity of data. This is certainly the case in the testing of geological soils in civil engineering, where tests can take several days or weeks and where the same sample is not available for further testing, being destroyed during the experiment. In this particular study attention is paid to triaxial compression tests used to typify particular soils. The purpose of the testing is to determine two parameters that characterize the soil, namely cohesion and friction angle. These parameters are defined in terms of the intercept and slope of a straight line fitted to a small number of points (usually three) derived from experimental data. The use of ordinary least squares to obtain uncertainties associated with estimates of the two parameters would be unreliable if there were only three points (and no replicates) and hence only one degree of freedom.
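
The practical difficulty can be made concrete: with three points and a two-parameter line, the residual variance has a single degree of freedom and the 95% Student-t coverage factor is 12.71, so the parameter uncertainty intervals blow up. A worked sketch with invented triaxial-test values:

```python
import numpy as np
from math import sqrt

# Three (normal stress, shear stress at failure) points -- illustrative only.
x = np.array([100.0, 200.0, 300.0])   # kPa
y = np.array([ 95.0, 150.0, 210.0])   # kPa

n = len(x)
A = np.column_stack([np.ones(n), x])
(b0, b1), (ss_res,), *_ = np.linalg.lstsq(A, y, rcond=None)

# One degree of freedom (n - 2 = 1): the residual variance is very poorly
# determined, and t_{0.975,1} = 12.71 inflates the intervals enormously.
s2 = ss_res / (n - 2)
sxx = np.sum((x - x.mean()) ** 2)
u_slope = sqrt(s2 / sxx)
u_intercept = sqrt(s2 * (1 / n + x.mean() ** 2 / sxx))
t975 = 12.71
print(f"cohesion (intercept): {b0:.1f} +/- {t975 * u_intercept:.1f} kPa")
print(f"friction slope:       {b1:.3f} +/- {t975 * u_slope:.3f}")
```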

  4. Methodology for the assessment of measuring uncertainties of articulated arm coordinate measuring machines

    Science.gov (United States)

    Romdhani, Fekria; Hennebelle, François; Ge, Min; Juillion, Patrick; Coquet, Richard; François Fontaine, Jean

    2014-12-01

    Articulated Arm Coordinate Measuring Machines (AACMMs) have gradually evolved and are increasingly used in the mechanical industry. At present, measurement uncertainties relating to the use of these devices are not yet well quantified. The work carried out consists of determining the measurement uncertainties of a mechanical part measured by an AACMM. The studies aiming to develop a model of measurement uncertainty are based on the Monte Carlo method developed in Supplement 1 of the Guide to the Expression of Uncertainty in Measurement [1], but also on identifying and characterizing the main sources of uncertainty. A multi-level Monte Carlo approach has been developed that characterizes the possible evolution of the AACMM during the measurement and, at a second level, quantifies the uncertainty of the considered measurand. The first Monte Carlo level is the most complex and is thus divided into three sub-levels, namely characterization of the positioning error of a point, estimation of calibration errors and evaluation of fluctuations of the ‘localization point’. The global method is presented and the results of the first sub-level are developed in particular detail. The main sources of uncertainty, including AACMM deformations, are exposed.

  5. A comparison of protocols and observer precision for measuring physical stream attributes

    Science.gov (United States)

    Whitacre, H.W.; Roper, B.B.; Kershner, J.L.

    2007-01-01

    Stream monitoring programs commonly measure physical attributes to assess the effect of land management on stream habitat. Variability associated with the measurement of these attributes has been linked to a number of factors, but few studies have evaluated variability due to differences in protocols. We compared six protocols, five used by the U.S. Department of Agriculture Forest Service and one by the U.S. Environmental Protection Agency, on six streams in Oregon and Idaho to determine whether differences in protocol affect values for 10 physical stream attributes. Results from Oregon and Idaho were combined for groups participating in both states, with significant differences in attribute means for 9 out of the 10 stream attributes. Significant differences occurred in 5 of 10 in Idaho, and 10 of 10 in Oregon. Coefficients of variation, signal-to-noise ratio, and root mean square error were used to evaluate measurement precision. There were differences among protocols for all attributes when states were analyzed separately and as a combined dataset. Measurement differences were influenced by choice of instruments, measurement method, measurement location, attribute definitions, and training approach. Comparison of data gathered by observers using different protocols will be difficult unless a core set of protocols for commonly measured stream attributes can be standardized among monitoring programs.

  6. Estimation of measuring uncertainty for optical micro-coordinate measuring machine

    Institute of Scientific and Technical Information of China (English)

    Kang Song(宋康); Zhuangde Jiang(蒋庄德)

    2004-01-01

    Based on the evaluation principle of the measuring uncertainty of the traditional coordinate measuring machine (CMM), the analysis and evaluation of the measuring uncertainty for optical micro-CMM have been made. Optical micro-CMM is an integrated measuring system with optical, mechanical, and electronic components, which may influence the measuring uncertainty of the optical micro-CMM. If the influence of laser speckle is taken into account, its longitudinal measuring uncertainty is 2.0 μm, otherwise it is 0.88 μm. It is proved that the estimation of the synthetic uncertainty for optical micro-CMM is correct and reliable by measuring the standard reference materials and simulating the influence of the diameter of laser beam. With Heisenberg's uncertainty principle and quantum mechanics theory, a method for improving the measuring accuracy of optical micro-CMM through adding a diaphragm in the receiving terminal of the light path was proposed, and the measuring results are verified by experiments.

  7. TOTAL MEASUREMENT UNCERTAINTY IN HOLDUP MEASUREMENTS AT THE PLUTONIUM FINISHING PLANT (PFP)

    International Nuclear Information System (INIS)

    An approach to determine the total measurement uncertainty (TMU) associated with Generalized Geometry Holdup (GGH) [1,2,3] measurements was developed and implemented in 2004 and 2005 [4]. This paper describes a condensed version of the TMU calculational model, including recent developments. Recent modifications to the TMU calculation model include a change in the attenuation uncertainty, clarifying the definition of the forward background uncertainty, reducing conservatism in the random uncertainty by selecting either a propagation of counting statistics or the standard deviation of the mean, and considering uncertainty in the width and height as a part of the self attenuation uncertainty. In addition, a detection limit is calculated for point sources using equations derived from summary equations contained in Chapter 20 of MARLAP [5]. The Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 2007-1 to the Secretary of Energy identified a lack of requirements and a lack of standardization for performing measurements across the U.S. Department of Energy (DOE) complex. The DNFSB also recommended that guidance be developed for a consistent application of uncertainty values. As such, the recent modifications to the TMU calculational model described in this paper have not yet been implemented. The Plutonium Finishing Plant (PFP) is continuing to perform uncertainty calculations as per Reference 4. Publication at this time is so that these concepts can be considered in developing a consensus methodology across the complex

  8. Measurement of nuclear activity with Ge detectors and its uncertainty

    CERN Document Server

    Cortes, C A P

    1999-01-01

    The objective of this work is to analyse the influence quantities that affect the activity measurement of isolated gamma-emitting radioactive sources prepared by means of the gravimetric method, as well as to determine the uncertainty of such a measurement when it is carried out with a gamma spectrometry system with a germanium detector. The work is developed in five chapters: in the first, named Basic principles, a brief description is made of the meaning of the word Measurement and its implications, and the necessary concepts used in this work are presented. In the second chapter the gravimetric method used for the manufacture of the sources is presented and the problem of determining the main influence ... The conclusions are presented in the fifth chapter and are applied to establish the optimum conditions for measuring the activity of an isolated gamma-emitting radioactive source with a germanium-detector spectrometer. (Author)

  9. Applying the Implicit Association Test to Measure Intolerance of Uncertainty.

    Science.gov (United States)

    Mosca, Oriana; Dentale, Francesco; Lauriola, Marco; Leone, Luigi

    2016-08-01

    Intolerance of Uncertainty (IU) is a key trans-diagnostic personality construct strongly associated with anxiety symptoms. Traditionally, IU is measured through self-report measures that are prone to bias effects due to impression management concerns and introspective difficulties. Moreover, self-report scales are not able to intercept the automatic associations that are assumed to be main determinants of several spontaneous responses (e.g., emotional reactions). In order to overcome these limitations, the Implicit Association Test (IAT) was applied to measure IU, with a particular focus on reliability and criterion validity issues. The IU-IAT and the Intolerance of Uncertainty Inventory (IUI) were administered to an undergraduate student sample (54 females and 10 males) with a mean age of 23 years (SD = 1.7). Subsequently, participants were asked to provide an individually chosen uncertain event from their own lives that may occur in the future and were requested to identify a number of potential negative consequences of it. Participants' responses in terms of cognitive thoughts (i.e., cognitive appraisal) and worry reactions toward these events were assessed using the two subscales of the Worry and Intolerance of Uncertainty Beliefs Questionnaire. The IU-IAT showed an adequate level of internal consistency and a non-significant correlation with the IUI. A path analysis model, accounting for 35% of the variance in event-related worry, revealed that the IUI had a significant indirect effect on the dependent variable through event-related IU thoughts. By contrast, as expected, the IU-IAT predicted event-related worry independently from IU thoughts. In accordance with dual models of social cognition, these findings suggest that IU can influence event-related worry through two different processing pathways (automatic vs. deliberative), supporting the criterion and construct validity of the IU-IAT. The potential role of the IU-IAT for clinical applications was discussed. PMID:27451266

  10. Lidar Uncertainty Measurement Experiment (LUMEX) - Understanding Sampling Errors

    Science.gov (United States)

    Choukulkar, A.; Brewer, W. A.; Banta, R. M.; Hardesty, M.; Pichugina, Y.; Senff, Christoph; Sandberg, S.; Weickmann, A.; Carroll, B.; Delgado, R.; Muschinski, A.

    2016-06-01

    Coherent Doppler LIDAR (Light Detection and Ranging) has been widely used to provide measurements of several boundary layer parameters such as profiles of wind speed, wind direction, vertical velocity statistics, mixing layer heights and turbulent kinetic energy (TKE). An important aspect of providing this wide range of meteorological data is to properly characterize the uncertainty associated with these measurements. With the above intent in mind, the Lidar Uncertainty Measurement Experiment (LUMEX) was conducted at Erie, Colorado during the period June 23rd to July 13th, 2014. The major goals of this experiment were the following: (1) characterize sampling error for vertical velocity statistics; (2) analyze sensitivities of different Doppler lidar systems; (3) compare various single and dual Doppler retrieval techniques; (4) characterize the error of spatial representativeness for separation distances up to 3 km; and (5) validate turbulence analysis techniques and retrievals from Doppler lidars. This experiment brought together 5 Doppler lidars, both commercial and research grade, for a period of three weeks for a comprehensive intercomparison study. The Doppler lidars were deployed at the Boulder Atmospheric Observatory (BAO) site in Erie, site of a 300 m meteorological tower. This tower was instrumented with six sonic anemometers at levels from 50 m to 300 m with 50 m vertical spacing. A brief overview of the experiment outline and deployment will be presented, and results from the sampling error analysis and its implications on scanning strategy will be discussed.

  11. Measures of uncertainty, importance and sensitivity of the SEDA code

    International Nuclear Information System (INIS)

    The purpose of this work is to estimate the uncertainty in the results of the SEDA code (Sistema de Evaluacion de Dosis en Accidentes) as a function of its input data and parameters. The SEDA code has been developed by the Comision Nacional de Energia Atomica for the estimation of doses during emergencies in the vicinity of the Atucha and Embalse nuclear power plants. The user feeds the code with meteorological data, source terms and accident data (timing involved, release height, thermal content of the release, etc.). It is designed to be used during an emergency and to produce fast results that support decision making. The uncertainty in the results of the SEDA code is quantified in the present paper; this uncertainty is associated both with the data the user inputs to the code and with the uncertain parameters of the code's own models. The method consisted of statistically characterizing the parameters and variables, assigning them adequate probability distributions. These distributions were sampled with the Latin Hypercube Sampling method, a stratified multi-variable Monte Carlo technique. The code was run for each of the samples and a sample of results was obtained. These results were characterized statistically (mean, most probable value, distribution shape, etc.) for several distances from the source. Finally, the Partial Correlation Coefficient and Standard Regression Coefficient techniques were used to obtain the relative importance of each input variable and the sensitivity of the code to its variations. The measures of importance and sensitivity were obtained for several distances from the source and various cases of atmospheric stability, making comparisons possible. This work builds confidence in the results of the code and attaches uncertainties to them, establishing the limits within which the results can vary in a real
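
The stratified sampling step can be sketched in a few lines: Latin hypercube sampling cuts each variable's range into N equal-probability strata, draws one point per stratum, and randomly pairs the strata across variables. The input variables and distributions below are invented stand-ins for the SEDA code's actual inputs.

```python
import numpy as np

def latin_hypercube(n_samples, n_vars, rng):
    """Latin hypercube sample on [0, 1]^n_vars: one point per stratum in each
    dimension, with strata randomly paired across dimensions."""
    u = (rng.random((n_samples, n_vars))
         + np.arange(n_samples)[:, None]) / n_samples
    for j in range(n_vars):
        u[:, j] = u[rng.permutation(n_samples), j]  # decouple the columns
    return u

rng = np.random.default_rng(0)
# Three illustrative dispersion inputs (placeholder distributions):
u = latin_hypercube(200, 3, rng)
wind = 1.0 + 9.0 * u[:, 0]          # wind speed, uniform(1, 10) m/s
height = 10.0 + 90.0 * u[:, 1]      # release height, uniform(10, 100) m
source = 10 ** (10 + 2 * u[:, 2])   # source term, log-uniform(1e10, 1e12) Bq
print(wind.mean(), height.mean(), source.min(), source.max())
```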

  12. Range and number-of-levels effects in derived and stated measures of attribute importance

    NARCIS (Netherlands)

    Verlegh, PWJ; Schifferstein, HNJ; Wittink, DR

    2002-01-01

    We study how the range of variation and the number of attribute levels affect five measures of attribute importance: full-profile conjoint estimates, ranges in attribute-level attractiveness ratings, regression coefficients, graded paired comparisons, and self-reported ratings. We find that all impor

  13. The concordance of directly and indirectly measured built environment attributes and physical activity adoption

    Directory of Open Access Journals (Sweden)

    O'Connor Daniel P

    2011-07-01

    Full Text Available Background Physical activity (PA) adoption is essential for obesity prevention and control, yet ethnic minority women report lower levels of PA and are at higher risk for obesity and its comorbidities compared to Caucasians. Epidemiological studies and ecologic models of health behavior suggest that built environmental factors are associated with health behaviors like PA, but few studies have examined the association between built environment attribute concordance and PA, and no known studies have examined attribute concordance and PA adoption. Purpose The purpose of this study was to associate the degree of concordance between directly and indirectly measured built environment attributes with changes in PA over time among African American and Hispanic Latina women participating in a PA intervention. Method Women (N = 410) completed measures of PA at Time 1 (T1) and Time 2 (T2); environmental data collected at T1 were used to compute concordance between directly and indirectly measured built environment attributes. The association between changes in PA and the degree of concordance between each directly and indirectly measured environmental attribute was assessed using repeated measures analyses. Results There were no significant associations between built environment attribute concordance values and change in self-reported or objectively measured PA. Self-reported PA significantly increased over time (F(1,184) = 7.82, p = .006), but this increase did not vary by ethnicity or any built environment attribute concordance variable. Conclusions Built environment attribute concordance may not be associated with PA changes over time among minority women. In an effort to promote PA, investigators should clarify specific built environment attributes that are important for PA adoption and whether accurate perceptions of these attributes are necessary, particularly among the vulnerable population of minority women.

  14. Analysis of sensitivity and uncertainty on the leukemia risk attributable to the nuclear installations of North Cotentin

    International Nuclear Information System (INIS)

    The study carried out comprises several phases: delimiting the scope of the study, identifying the key parameters, determining the variation intervals of those parameters, performing the sensitivity analysis and, finally, the uncertainty analysis. (N.C.)

  15. SI2N overview paper: ozone profile measurements: techniques, uncertainties and availability

    Science.gov (United States)

    Hassler, B.; Petropavlovskikh, I.; Staehelin, J.; August, T.; Bhartia, P. K.; Clerbaux, C.; Degenstein, D.; De Mazière, M.; Dinelli, B. M.; Dudhia, A.; Dufour, G.; Frith, S. M.; Froidevaux, L.; Godin-Beekmann, S.; Granville, J.; Harris, N. R. P.; Hoppel, K.; Hubert, D.; Kasai, Y.; Kurylo, M. J.; Kyrölä, E.; Lambert, J.-C.; Levelt, P. F.; McElroy, C. T.; McPeters, R. D.; Munro, R.; Nakajima, H.; Parrish, A.; Raspollini, P.; Remsberg, E. E.; Rosenlof, K. H.; Rozanov, A.; Sano, T.; Sasano, Y.; Shiotani, M.; Smit, H. G. J.; Stiller, G.; Tamminen, J.; Tarasick, D. W.; Urban, J.; van der A, R. J.; Veefkind, J. P.; Vigouroux, C.; von Clarmann, T.; von Savigny, C.; Walker, K. A.; Weber, M.; Wild, J.; Zawodny, J.

    2013-11-01

    Peak stratospheric chlorofluorocarbon (CFC) and other ozone depleting substance (ODS) concentrations were reached in the mid to late 1990s. Detection and attribution of the expected recovery of the stratospheric ozone layer in an atmosphere with reduced ODSs as well as efforts to understand the evolution of stratospheric ozone in the presence of increasing greenhouse gases are key current research topics. These require a critical examination of the ozone changes with an accurate knowledge of the spatial (geographical and vertical) and temporal ozone response. For such an examination, it is vital that the quality of the measurements used be as high as possible and measurement uncertainties well quantified. In preparation for the 2014 United Nations Environment Programme (UNEP)/World Meteorological Organization (WMO) Scientific Assessment of Ozone Depletion, the SPARC/IO3C/IGACO-O3/NDACC (SI2N) initiative was designed to study and document changes in the global ozone profile distribution. This requires assessing long-term ozone profile data sets in regards to measurement stability and uncertainty characteristics. The ultimate goal is to establish suitability for estimating long-term ozone trends to contribute to ozone recovery studies. Some of the data sets have been improved as part of this initiative with updated versions now available. This summary presents an overview of stratospheric ozone profile measurement data sets (ground- and satellite-based) available for ozone recovery studies. Here we document measurement techniques, spatial and temporal coverage, vertical resolution, native units and measurement uncertainties. In addition, the latest data versions are briefly described (including data version updates as well as detailing multiple retrievals when available for a given satellite instrument). Archive location information for each data set is also given.

  16. Past changes in the vertical distribution of ozone – Part 1: Measurement techniques, uncertainties and availability

    Directory of Open Access Journals (Sweden)

    B. Hassler

    2014-05-01

    Full Text Available Peak stratospheric chlorofluorocarbon (CFC) and other ozone depleting substance (ODS) concentrations were reached in the mid- to late 1990s. Detection and attribution of the expected recovery of the stratospheric ozone layer in an atmosphere with reduced ODSs as well as efforts to understand the evolution of stratospheric ozone in the presence of increasing greenhouse gases are key current research topics. These require a critical examination of the ozone changes with an accurate knowledge of the spatial (geographical and vertical) and temporal ozone response. For such an examination, it is vital that the quality of the measurements used be as high as possible and measurement uncertainties well quantified. In preparation for the 2014 United Nations Environment Programme (UNEP)/World Meteorological Organization (WMO) Scientific Assessment of Ozone Depletion, the SPARC/IO3C/IGACO-O3/NDACC (SI2N) Initiative was designed to study and document changes in the global ozone profile distribution. This requires assessing long-term ozone profile data sets in regards to measurement stability and uncertainty characteristics. The ultimate goal is to establish suitability for estimating long-term ozone trends to contribute to ozone recovery studies. Some of the data sets have been improved as part of this initiative with updated versions now available. This summary presents an overview of stratospheric ozone profile measurement data sets (ground- and satellite-based) available for ozone recovery studies. Here we document measurement techniques, spatial and temporal coverage, vertical resolution, native units and measurement uncertainties. In addition, the latest data versions are briefly described (including data version updates as well as detailing multiple retrievals when available for a given satellite instrument). Archive location information for each data set is also given.

  17. Past changes in the vertical distribution of ozone - Part 1: Measurement techniques, uncertainties and availability

    Science.gov (United States)

    Hassler, B.; Petropavlovskikh, I.; Staehelin, J.; August, T.; Bhartia, P. K.; Clerbaux, C.; Degenstein, D.; De Mazière, M.; Dinelli, B. M.; Dudhia, A.; Dufour, G.; Frith, S. M.; Froidevaux, L.; Godin-Beekmann, S.; Granville, J.; Harris, N. R. P.; Hoppel, K.; Hubert, D.; Kasai, Y.; Kurylo, M. J.; Kyrölä, E.; Lambert, J.-C.; Levelt, P. F.; McElroy, C. T.; McPeters, R. D.; Munro, R.; Nakajima, H.; Parrish, A.; Raspollini, P.; Remsberg, E. E.; Rosenlof, K. H.; Rozanov, A.; Sano, T.; Sasano, Y.; Shiotani, M.; Smit, H. G. J.; Stiller, G.; Tamminen, J.; Tarasick, D. W.; Urban, J.; van der A, R. J.; Veefkind, J. P.; Vigouroux, C.; von Clarmann, T.; von Savigny, C.; Walker, K. A.; Weber, M.; Wild, J.; Zawodny, J. M.

    2014-05-01

    Peak stratospheric chlorofluorocarbon (CFC) and other ozone depleting substance (ODS) concentrations were reached in the mid- to late 1990s. Detection and attribution of the expected recovery of the stratospheric ozone layer in an atmosphere with reduced ODSs as well as efforts to understand the evolution of stratospheric ozone in the presence of increasing greenhouse gases are key current research topics. These require a critical examination of the ozone changes with an accurate knowledge of the spatial (geographical and vertical) and temporal ozone response. For such an examination, it is vital that the quality of the measurements used be as high as possible and measurement uncertainties well quantified. In preparation for the 2014 United Nations Environment Programme (UNEP)/World Meteorological Organization (WMO) Scientific Assessment of Ozone Depletion, the SPARC/IO3C/IGACO-O3/NDACC (SI2N) Initiative was designed to study and document changes in the global ozone profile distribution. This requires assessing long-term ozone profile data sets in regards to measurement stability and uncertainty characteristics. The ultimate goal is to establish suitability for estimating long-term ozone trends to contribute to ozone recovery studies. Some of the data sets have been improved as part of this initiative with updated versions now available. This summary presents an overview of stratospheric ozone profile measurement data sets (ground and satellite based) available for ozone recovery studies. Here we document measurement techniques, spatial and temporal coverage, vertical resolution, native units and measurement uncertainties. In addition, the latest data versions are briefly described (including data version updates as well as detailing multiple retrievals when available for a given satellite instrument). Archive location information for each data set is also given.

  18. Past Changes in the Vertical Distribution of Ozone Part 1: Measurement Techniques, Uncertainties and Availability

    Science.gov (United States)

    Hassler, B.; Petropavlovskikh, I.; Staehelin, J.; August, T.; Bhartia, P. K.; Clerbaux, C.; Degenstein, D.; Maziere, M. De; Dinelli, B. M.; Dudhia, A.; Dufour, G.; Frith, S. M.; Froidevaux, L.; Godin-Beekmann, S.; Granville, J.; Harris, N. R. P.; Hoppel, K.; Hubert, D.; Kasai, Y.; Kurylo, M. J.; Kyrola, E.; Lambert, J.-C.; Levelt, P. F.; McElroy, C. T.; McPeters, R. D.; Munro, R.; Nakajima, H.; Parrish, A.; Raspollini, P.; Remsberg, E. E.; Rosenlof, K. H.; Rozanov, A.; Sano, T.; Sasano, Y.; Shiotani, M.; Zawodny, J. M.

    2014-01-01

    Peak stratospheric chlorofluorocarbon (CFC) and other ozone depleting substance (ODS) concentrations were reached in the mid- to late 1990s. Detection and attribution of the expected recovery of the stratospheric ozone layer in an atmosphere with reduced ODSs as well as efforts to understand the evolution of stratospheric ozone in the presence of increasing greenhouse gases are key current research topics. These require a critical examination of the ozone changes with an accurate knowledge of the spatial (geographical and vertical) and temporal ozone response. For such an examination, it is vital that the quality of the measurements used be as high as possible and measurement uncertainties well quantified. In preparation for the 2014 United Nations Environment Programme (UNEP)/World Meteorological Organization (WMO) Scientific Assessment of Ozone Depletion, the SPARC/IO3C/IGACO-O3/NDACC (SI2N) Initiative was designed to study and document changes in the global ozone profile distribution. This requires assessing long-term ozone profile data sets in regards to measurement stability and uncertainty characteristics. The ultimate goal is to establish suitability for estimating long-term ozone trends to contribute to ozone recovery studies. Some of the data sets have been improved as part of this initiative with updated versions now available. This summary presents an overview of stratospheric ozone profile measurement data sets (ground and satellite based) available for ozone recovery studies. Here we document measurement techniques, spatial and temporal coverage, vertical resolution, native units and measurement uncertainties. In addition, the latest data versions are briefly described (including data version updates as well as detailing multiple retrievals when available for a given satellite instrument). Archive location information for each data set is also given.

  19. Preliminary Examination of a Cartoon-Based Hostile Attributional Bias Measure for Urban African American Boys

    Science.gov (United States)

    Leff, Stephen S.; Lefler, Elizabeth K.; Khera, Gagan S.; Paskewich, Brooke; Jawad, Abbas F.

    2014-01-01

    The current study illustrates how researchers developed and validated a cartoon-based adaptation of a written hostile attributional bias measure for a sample of urban, low-income, African American boys. A series of studies were conducted to develop cartoon illustrations to accompany a standard written hostile attributional bias vignette measure (Study 1), to determine initial psychometric properties (Study 2) and acceptability (Study 3), and to conduct a test-retest reliability trial of the adapted measure in a separate sample (Study 4). These studies utilize a participatory action research approach to measurement design and adaptation, and suggest that collaborations between researchers and key school stakeholders can lead to measures that are psychometrically strong, developmentally appropriate, and culturally sensitive. In addition, the cartoon-based hostile attributional bias measure appears to have promise as an assessment and/or outcome measure for aggression and bullying prevention programs conducted with urban African American boys. PMID:21800228

  20. Measurements of fusion neutron yields by neutron activation technique: Uncertainty due to the uncertainty on activation cross-sections

    International Nuclear Information System (INIS)

    The neutron activation technique is routinely used in fusion experiments to measure the neutron yields. This paper investigates the uncertainty in these measurements due to the uncertainties in dosimetry and activation reactions. For this purpose, activation cross-sections were taken from the International Reactor Dosimetry and Fusion File (IRDFF-v1.05) in 640-group ENDF-6 format for several reactions of interest for both 2.5 and 14 MeV neutrons. Activation coefficients (reaction rates) have been calculated using the neutron flux spectra at the JET vacuum vessel, for both DD and DT plasmas, calculated by MCNP in the required 640-energy-group format. The related uncertainties for the JET neutron spectra are evaluated as well, using the covariance data available in the library. These uncertainties are in general small, but not negligible when high accuracy is required in the determination of the fusion neutron yields.

  1. Reconsideration of the uncertainty relations and quantum measurements

    CERN Document Server

    Dumitru, Spiridon

    2012-01-01

    Discussions on uncertainty relations (UR) and quantum measurements (QMS) have persisted to the present day in publications about quantum mechanics (QM). They originate mainly from the conventional interpretation of UR (CIUR). Most of the QM literature underestimates the fact that, over the years, a lot of deficiencies regarding CIUR were signaled. As a rule the alluded deficiencies were remarked disparately and discussed as punctual and non-essential questions. Here we approach an investigation of the mentioned deficiencies collected in a conclusive ensemble. Subsequently we expose a reconsideration of the major problems referring to UR and QMS. We reveal that all the basic presumptions of CIUR are troubled by insurmountable deficiencies which require the indubitable failure of CIUR and its necessary abandonment. Therefore the UR must be deprived of their status of crucial pieces for physics. So, the aboriginal versions of UR appear as being in postures of either (i) thought-experimental fictions or (ii)...

  2. Minimization of Uncertainties in the Inverse- Kinetics Measurements Using the Oscillator Technique (June 2013)

    International Nuclear Information System (INIS)

    This paper presents continuous and discrete equations for the propagation of uncertainty applied to inverse kinetics and shows that the uncertainty of a measurement can be minimized by the proper choice of frequency from the perturbing reactivity waveform. (authors)

  3. Physics and Operational Research: measure of uncertainty via Nonlinear Programming

    Science.gov (United States)

    Davizon-Castillo, Yasser A.

    2008-03-01

    Physics and Operational Research present an interdisciplinary interaction in problems such as Quantum Mechanics, Classical Mechanics and Statistical Mechanics. The nonlinear nature of the physical phenomena in single-well and double-well quantum systems is resolved via Nonlinear Programming (NLP) techniques (Kuhn-Tucker conditions, Dynamic Programming), subject to the Heisenberg Uncertainty Principle and an extended equality uncertainty relation that exploits the NLP Lagrangian method. This review addresses problems in Kinematics and Thermal Physics, developing uncertainty relations for each case studied, in a novel way of quantifying uncertainty.

  4. Including uncertainty in hazard analysis through fuzzy measures

    International Nuclear Information System (INIS)

    This paper presents a method for capturing the uncertainty expressed by a Hazard Analysis (HA) expert team when estimating the frequencies and consequences of accident sequences, and provides a sound mathematical framework for propagating this uncertainty to the risk estimates for these accident sequences. The uncertainty is readily expressed as distributions that can visually aid the analyst in determining the extent and source of risk uncertainty in HA accident sequences. The results can also be expressed as single statistics of the distribution in a manner analogous to expressing a probabilistic distribution as a point-value statistic such as a mean or median. The study discussed here used data collected during the elicitation portion of an HA on a high-level waste transfer process to demonstrate the techniques for capturing uncertainty. These data came from observations of the uncertainty that HA team members expressed in assigning frequencies and consequences to accident sequences during an actual HA. This uncertainty was captured and manipulated using ideas from possibility theory. The result of this study is a practical method for displaying and assessing the uncertainty in the HA team estimates of the frequency and consequences for accident sequences. This uncertainty provides potentially valuable information about accident sequences that typically is lost in the HA process.

  5. Uncertainty of gamma-ray spectrometry measurement of environmental samples due to uncertainties in matrix composition, density and sample geometry

    International Nuclear Information System (INIS)

    This paper is intended to identify the uncertainties of activities in environmental samples measured with gamma-ray spectrometry that result from uncertainties in matrix composition, density and geometrical dimensions of the sample. For that purpose efficiencies were calculated for a wide range of environmental matrices such as fresh and ashed food samples, water samples and soil samples. Compositions were mainly taken from literature. Densities and geometry parameters were varied in a range occurring in practice. Considered energies cover a range from 46.5 keV to 2000 keV. Finally, a couple of recommendations in respect to gamma-ray spectrometric measurements of environmental samples are given. - Highlights: • Uncertainties of gamma-ray spectrometry measurements were assessed. • Efficiencies were calculated for a wide range of environmental matrices. • The effect of matrix compositions and density on efficiency was studied. • The effect of geometry parameters on efficiency was considered

  6. Monitoring measurement tools: new methods for driving continuous improvements in fleet measurement uncertainty

    Science.gov (United States)

    Solecky, Eric; Archie, Chas; Sendelbach, Matthew; Fiege, Ron; Zaitz, Mary; Shneyder, Dmitriy; Strocchia-rivera, Carlos; Munoz, Andres; Rangarajan, Srinivasan; Muth, William; Brendler, Andrew; Banke, Bill; Schulz, Bernd; Hartig, Carsten; Hoeft, Jon-Tobias; Vaid, Alok; Kelling, Mark; Bunday, Benjamin; Allgair, John

    2009-03-01

    Ever shrinking measurement uncertainty requirements are difficult to achieve for a typical metrology toolset, especially over the entire expected life of the fleet. Many times, acceptable performance can be demonstrated during brief evaluation periods on a tool or two in the fleet. Over time and across the rest of the fleet, the most demanding processes often have measurement uncertainty concerns that prevent optimal process control, thereby limiting premium part yield, especially on the most aggressive technology nodes. Current metrology statistical process control (SPC) monitoring techniques focus on maintaining the performance of the fleet where toolset control chart limits are derived from a stable time period. These tools are prevented from measuring product when a statistical deviation is detected. Lastly, these charts are primarily concerned with daily fluctuations and do not consider the overall measurement uncertainty. It is possible that the control charts implemented for a given toolset suggest a healthy fleet while many of these demanding processes continue to suffer measurement uncertainty issues. This is especially true when extendibility is expected in a given generation of toolset. With this said, there is a need to continually improve the measurement uncertainty of the fleet until it can no longer meet the needed requirements at which point new technology needs to be entertained. This paper explores new methods in analyzing existing SPC monitor data to assess the measurement performance of the fleet and look for opportunities to drive improvements. Long term monitor data from a fleet of overlay and scatterometry tools will be analyzed. The paper also discusses using other methods besides SPC monitors to ensure the fleet stays matched; a set of SPC monitors provides a good baseline of fleet stability but it cannot represent all measurement scenarios happening in product recipes. The analyses presented deal with measurement uncertainty on non-measurement

  7. Weight sensitivity measurement, analysis, and application in multi-attribute evaluation

    Science.gov (United States)

    Zhao, Yong; Huang, Chongyin; Chen, Yang

    2013-11-01

    Weights are used to measure the relative importance of multiple attributes or objectives, and they influence evaluation or decision results to a great degree. Thus, analyzing weight sensitivity is an important task in a multi-attribute evaluation or decision. A measuring method based on the inclined angle between two vectors is proposed in this paper to solve the weight sensitivity of a multi-attribute evaluation with the isotonicity characteristic. This method uses the cosine of the inclined angle to measure the weight sensitivity based on preferences or preference combinations. Concepts of sensitivity space, degree, and angle are given, and the relevant measurement method is discussed and proved. Finally, the method is applied to the selection of water environment protection projects in Heyuan City.
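
    A minimal sketch of the central quantity, assuming nothing beyond the abstract: the cosine of the inclined angle between a nominal and a perturbed weight vector. Both vectors below are made up for illustration.

        import numpy as np

        def cosine(w1, w2):
            # Cosine of the inclined angle between two weight vectors.
            return float(np.dot(w1, w2) / (np.linalg.norm(w1) * np.linalg.norm(w2)))

        w = np.array([0.5, 0.3, 0.2])           # nominal attribute weights
        w_alt = np.array([0.45, 0.35, 0.20])    # perturbed weights

        c = cosine(w, w_alt)
        print("cosine similarity:", c)
        print("inclined angle (deg):", np.degrees(np.arccos(c)))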

  8. Measurement uncertainty from In-Situ gamma spectroscopy of nonhomogeneous containers and from Laboratory Assay

    International Nuclear Information System (INIS)

    During a D and D or ER process, containers of radioactive waste are normally generated. The activity can commonly be determined by gamma spectroscopy, but frequently the measurement conditions are not conducive to precise sample-detector geometries, and usually the radioactive material is not in a homogeneous distribution. What is the best method to accurately assay these containers - sampling followed by laboratory analysis, or in-situ spectroscopy? What is the uncertainty of the final result? To help answer these questions, the Canberra tool ISOCS Uncertainty Estimator [IUE] was used to mathematically simulate and evaluate several different measurement scenarios and to estimate the uncertainty of the measurement and the sampling process. Several representative containers and source distributions were mathematically defined and evaluated to determine the in-situ measurement uncertainty due to the sample non-uniformity. In the first example, a typical field situation requiring the measurement of 200-liter drums was evaluated. A sensitivity analysis was done to show which parameters contributed the most to the uncertainty, and then an efficiency uncertainty calculation was performed. In the second example, a group of 200-liter drums with various types of non-homogeneous distributions was created, and then measurements were simulated with different detector arrangements to see how the uncertainty varied. In the third example, a truck filled with non-uniform soil was first measured with multiple in-situ detectors to determine the measurement uncertainty. Then composite samples were extracted and the sampling uncertainty computed for comparison to the field measurement uncertainty. (authors)

  9. Dynamic risk measuring under model uncertainty: taking advantage of the hidden probability measure

    CERN Document Server

    Bion-Nadal, Jocelyne

    2010-01-01

    We study dynamic risk measures in a very general framework enabling us to model uncertainty and processes with jumps. We previously showed the existence of a canonical equivalence class of probability measures hidden behind a given, possibly non-dominated, set of probability measures. Taking advantage of this result, we exhibit a dual representation that completely characterizes the dynamic risk measure. We prove continuity and characterize time consistency. Then, we prove regularity for all processes associated with time-consistent convex dynamic risk measures. We also study factorization through time for sublinear risk measures. Finally we consider examples (uncertain volatility and G-expectations).

  10. Guitar Chords Classification Using Uncertainty Measurements of Frequency Bins

    Directory of Open Access Journals (Sweden)

    Jesus Guerrero-Turrubiates

    2015-01-01

    Full Text Available This paper presents a method to perform chord classification from recorded audio. The signal harmonics are obtained by using the Fast Fourier Transform, and timbral information is suppressed by spectral whitening. A multiple fundamental frequency estimation of the whitened data is achieved by adding harmonics attenuated by a weighting function. This paper proposes a method that performs feature selection by thresholding the uncertainty of all frequency bins. Measurements under the threshold are removed from the signal in the frequency domain. This removes 95.53% of the signal characteristics; the remaining 4.47% of frequency bins are used as enhanced information for the classifier. An Artificial Neural Network was utilized to classify four types of chords: major, minor, major 7th, and minor 7th. Those, played in the twelve musical notes, give a total of 48 different chords. Two reference methods (based on Hidden Markov Models) were compared with the method proposed in this paper using the same database for the evaluation test. In most of the performed tests, the proposed method achieved a reasonably high performance, with an accuracy of 93%.
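
    A simplified sketch of the front end of such a pipeline, under assumptions of our own (Hann window, moving-average whitening envelope, and a quantile threshold standing in for the paper's uncertainty threshold); it is an illustration of the idea, not the authors' implementation.

        import numpy as np

        def chord_features(x, keep_fraction=0.0447):
            # Harmonic content via the Fast Fourier Transform (Hann-windowed).
            spec = np.abs(np.fft.rfft(x * np.hanning(len(x))))
            # Crude spectral whitening: divide by a smoothed magnitude envelope
            # to suppress timbral information.
            env = np.convolve(spec, np.ones(51) / 51, mode="same") + 1e-12
            white = spec / env
            # Threshold-style feature selection: keep only a small fraction of
            # the bins (~4.47% in the paper) and zero out the rest.
            thr = np.quantile(white, 1.0 - keep_fraction)
            return np.where(white >= thr, white, 0.0)

        fs = 8000
        t = np.arange(fs) / fs                  # one second of audio
        # Toy "chord": three sinusoids near a C major triad (C4, E4, G4).
        x = sum(np.sin(2 * np.pi * f * t) for f in (261.6, 329.6, 392.0))
        feats = chord_features(x)
        print("nonzero feature bins:", np.count_nonzero(feats), "of", feats.size)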

  11. Reconsideration of the Uncertainty Relations and Quantum Measurements

    Directory of Open Access Journals (Sweden)

    Dumitru S.

    2008-04-01

    Full Text Available Discussions on uncertainty relations (UR) and quantum measurements (QMS) have persisted to the present day in publications about quantum mechanics (QM). They originate mainly from the conventional interpretation of UR (CIUR). Most of the QM literature underestimates the fact that, over the years, a lot of deficiencies regarding CIUR were signaled. As a rule the alluded deficiencies were remarked disparately and discussed as punctual and non-essential questions. Here we approach an investigation of the mentioned deficiencies collected in a conclusive ensemble. Subsequently we expose a reconsideration of the major problems referring to UR and QMS. We reveal that all the basic presumptions of CIUR are troubled by insurmountable deficiencies which require the indubitable failure of CIUR and its necessary abandonment. Therefore the UR must be deprived of their status of crucial pieces for physics. So, the aboriginal versions of UR appear as being in postures of either (i) thought-experimental fictions or (ii) simple QM formulae; any other versions of them have no connection with the QMS. Then the QMS must be viewed as an additional subject comparatively with the usual questions of QM. For a theoretical description of QMS we propose an information-transmission model, in which the quantum observables are considered as random variables. Our approach leads to natural solutions and simplifications for many problems regarding UR and QMS.

  13. Calculation of uncertainties

    International Nuclear Information System (INIS)

    One of the most important aspects of quality assurance in any analytical activity is the estimation of measurement uncertainty. There is general agreement that 'the expression of the result of a measurement is not complete without specifying its associated uncertainty'. An analytical process is the mechanism for obtaining methodological information (the measurand) about a material system (the population). This implies the need to define the problem, to choose the methods for sampling and measurement, and to execute these activities properly in order to obtain information. The result of a measurement is only an approximation or estimate of the value of the measurand, and it is complete only when accompanied by an estimate of the uncertainty of the analytical process. According to the 'Vocabulary of Basic and General Terms in Metrology', measurement uncertainty is the parameter associated with the result of a measurement that characterizes the dispersion of the values that could reasonably be attributed to the measurand (or quantity). This parameter could be a standard deviation or a confidence interval. The uncertainty evaluation requires a detailed look at all possible sources, but not disproportionately so: a good estimate of the uncertainty can be made by concentrating effort on the largest contributions. The key steps of the process of determining the uncertainty in measurements are: the specification of the measurand; the identification of the sources of uncertainty; the quantification of the individual components of uncertainty; the calculation of the combined standard uncertainty; and the reporting of the uncertainty.
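
    As a worked illustration of the final two steps, the sketch below combines hypothetical uncertainty components in quadrature (the uncorrelated case of the GUM law of propagation) and reports an expanded uncertainty with coverage factor k = 2; all component values are invented for the example.

        import math

        # Individual standard-uncertainty components for a hypothetical
        # concentration measurement (units: mg/kg); values are illustrative.
        components = {
            "repeatability": 0.12,
            "calibration":   0.08,
            "sample_mass":   0.03,
            "recovery":      0.05,
        }

        # Combined standard uncertainty, uncorrelated inputs (root sum of squares).
        u_c = math.sqrt(sum(u**2 for u in components.values()))

        # Expanded uncertainty with coverage factor k = 2 (~95% confidence).
        k = 2
        print(f"u_c = {u_c:.3f} mg/kg, U = {k * u_c:.3f} mg/kg (k = {k})")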

  14. The 1993 ISO Guide to the expression of uncertainty in measurement applied to NAA

    International Nuclear Information System (INIS)

    Principles of the expression of uncertainty in measurements are briefly reviewed and special aspects of the uncertainty quantification in NAA are discussed in detail regarding the relative and k0-standardization in both modes of the technique, i.e., INAA and RNAA. A survey of uncertainty sources is presented and calculation of the combined uncertainty is demonstrated by an example of manganese determination in biological material by RNAA. (author)

  15. Measuring diversity in medical reports based on categorized attributes and international classification systems

    Directory of Open Access Journals (Sweden)

    Přečková Petra

    2012-04-01

    Full Text Available Abstract Background Narrative medical reports do not use standardized terminology and often provide insufficient information for statistical processing and medical decision making. The objectives of the paper are to propose a method for measuring diversity in medical reports written in any language, to compare diversities in narrative and structured medical reports, and to map attributes and terms to selected classification systems. Methods A new method based on a general concept of f-diversity is proposed for measuring the diversity of medical reports in any language. The method is based on categorized attributes recorded in narrative or structured medical reports and on international classification systems. Values of categories are expressed by terms. Using SNOMED CT and ICD 10 we map attributes and terms to predefined codes. We use f-diversities of the Gini-Simpson and Number of Categories types to compare diversities of narrative and structured medical reports. The comparison is based on attributes selected from the Minimal Data Model for Cardiology (MDMC). Results We compared diversities of 110 Czech narrative medical reports and 1119 Czech structured medical reports. Selected categorized attributes of MDMC mostly had different numbers of categories and used different terms in narrative and structured reports. We found more than 60% of MDMC attributes in SNOMED CT. We showed that attributes in narrative medical reports had greater diversity than the same attributes in structured medical reports. Further, we replaced each value of a category (term) used for attributes in narrative medical reports by the closest term and category used in MDMC for structured medical reports. We found that relative Gini-Simpson diversities in structured medical reports were significantly smaller than those in narrative medical reports, except for the "Allergy" attribute. Conclusions Terminology in narrative medical reports is not standardized. Therefore it is nearly
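
    For reference, a minimal sketch of the Gini-Simpson diversity used here, computed for made-up narrative and structured terms of a single attribute; the normalization by the maximum attainable value is one common convention for a "relative" diversity and is an assumption of this sketch.

        from collections import Counter

        def gini_simpson(values):
            # Gini-Simpson diversity: 1 - sum_i p_i^2 over category frequencies.
            counts = Counter(values)
            n = sum(counts.values())
            return 1.0 - sum((c / n) ** 2 for c in counts.values())

        def relative_gini_simpson(values):
            # Normalized by the maximum attainable with K observed categories.
            k = len(set(values))
            return gini_simpson(values) / (1.0 - 1.0 / k) if k > 1 else 0.0

        # Illustrative terms for the same attribute in two kinds of reports.
        narrative  = ["smoker", "smokes", "nicotine user", "smoker", "ex-smoker"]
        structured = ["smoker", "smoker", "smoker", "non-smoker", "non-smoker"]
        print(relative_gini_simpson(narrative), relative_gini_simpson(structured))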

  16. Measurement Issues for Energy Efficient Commercial Buildings: Productivity and Performance Uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Jones, D.W.

    2002-05-16

    In previous reports, we have identified two potentially important issues, solutions to which would increase the attractiveness of DOE-developed technologies in commercial buildings energy systems. One issue concerns the fact that in addition to saving energy, many new technologies offer non-energy benefits that contribute to building productivity (firm profitability). The second issue is that new technologies are typically unproven in the eyes of decision makers and must bear risk premiums that offset cost advantages resulting from laboratory calculations. Even though a compelling case can be made for the importance of these issues, for building decision makers to incorporate them in business decisions and for DOE to use them in R&D program planning there must be robust empirical evidence of their existence and size. This paper investigates how such measurements could be made and offers recommendations as to preferred options. There is currently little systematic information on either of these concepts in the literature. Of the two there is somewhat more information on non-energy benefits, but little as regards office buildings. Office building productivity impacts can be observed casually, but must be estimated statistically, because buildings have many interacting attributes and observations based on direct behavior can easily confuse the process of attribution. For example, absenteeism can be easily observed. However, absenteeism may be down because a more healthy space conditioning system was put into place, because the weather was milder, or because firm policy regarding sick days had changed. There is also a general dearth of appropriate information for purposes of estimation. To overcome these difficulties, we propose developing a new data base and applying the technique of hedonic price analysis. This technique has been used extensively in the analysis of residential dwellings. There is also a literature on its application to commercial and industrial

  17. Comparison of two different methods for the uncertainty estimation of circle diameter measurements using an optical coordinate measuring machine

    DEFF Research Database (Denmark)

    Morace, Renata Erica; Hansen, Hans Nørgaard; De Chiffre, Leonardo

    2005-01-01

    This paper deals with the uncertainty estimation of measurements performed on optical coordinate measuring machines (CMMs). Two different methods were used to assess the uncertainty of circle diameter measurements using an optical CMM: the sensitivity analysis developing an uncertainty budget, and the substitution method based on measuring calibrated workpieces. Three holes with nominal diameter values in the range from 2 mm to 6 mm were measured on an optical CMM equipped with a CCD sensor, and expanded measuring uncertainties were estimated to be in the range of 1-2 µm.

  18. Evaluation of Measurement Uncertainty in Neutron Activation Analysis using Research Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Y. S.; Moon, J. H.; Sun, G. M.; Kim, S. H.; Baek, S. Y.; Lim, J. M.; Lee, Y. N.; Kim, H. R

    2007-02-15

    This report summarizes the general and technical requirements, methods and results of the measurement uncertainty assessment that should be performed to maintain quality assurance and traceability in the NAA technique using the HANARO research reactor. It will serve as basic information to support effective accredited analytical services in the future. For the assessment of measurement uncertainty, environmental certified reference materials are used, and the analytical results obtained from real experiments are evaluated using the ISO-GUM and Monte Carlo Simulation (MCS) methods. First, the standard uncertainties of the predominant parameters in NAA are evaluated for the measured element values; the combined uncertainty is then calculated by applying the rule of uncertainty propagation. In addition, the contribution of each individual standard uncertainty to the combined uncertainty is estimated, and ways to minimize them are reviewed.
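
    A toy illustration of the two evaluation routes mentioned (ISO-GUM propagation versus MCS), for a hypothetical comparator-style NAA measurand in which the concentration is proportional to a ratio of count rates; all values and distributions are invented for the example.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 100_000

        # Toy measurand: element concentration proportional to the ratio of
        # sample to standard net count rates (values/uncertainties illustrative).
        R_sample = rng.normal(5000.0, 75.0, n)   # counts/s
        R_std    = rng.normal(4800.0, 50.0, n)   # counts/s
        C_std    = rng.normal(10.0, 0.10, n)     # mg/kg, standard concentration

        C = C_std * R_sample / R_std             # Monte Carlo propagation (MCS)

        mean, std = C.mean(), C.std(ddof=1)
        lo, hi = np.percentile(C, [2.5, 97.5])   # 95% coverage interval
        print(f"MCS: C = {mean:.3f} mg/kg, u = {std:.3f}, 95% = [{lo:.3f}, {hi:.3f}]")

        # GUM comparison: for a pure product/ratio model, relative standard
        # uncertainties add in quadrature.
        u_gum = mean * np.sqrt((75/5000)**2 + (50/4800)**2 + (0.10/10)**2)
        print(f"GUM: combined u = {u_gum:.3f} mg/kg")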

  19. The uncertainty in physical measurements an introduction to data analysis in the physics laboratory

    CERN Document Server

    Fornasini, Paolo

    2008-01-01

    All measurements of physical quantities are affected by uncertainty. Understanding the origin of uncertainty, evaluating its extent and suitably taking it into account in data analysis is essential for assessing the degree of accuracy of phenomenological relationships and physical laws in both scientific research and technological applications. The Uncertainty in Physical Measurements: An Introduction to Data Analysis in the Physics Laboratory presents an introduction to uncertainty and to some of the most common procedures of data analysis. This book will serve the reader well by filling the gap between tutorial textbooks and highly specialized monographs. The book is divided into three parts. The first part is a phenomenological introduction to measurement and uncertainty: properties of instruments, different causes and corresponding expressions of uncertainty, histograms and distributions, and unified expression of uncertainty. The second part contains an introduction to probability theory, random variable...

  20. Total Measurement Uncertainty for the Plutonium Finishing Plant (PFP) Segmented Gamma Scan Assay System

    CERN Document Server

    Fazzari, D M

    2001-01-01

    This report presents the results of an evaluation of the Total Measurement Uncertainty (TMU) for the Canberra manufactured Segmented Gamma Scanner Assay System (SGSAS) as employed at the Hanford Plutonium Finishing Plant (PFP). In this document, TMU embodies the combined uncertainties due to all of the individual random and systematic sources of measurement uncertainty. It includes uncertainties arising from corrections and factors applied to the analysis of transuranic waste to compensate for inhomogeneities and interferences from the waste matrix and radioactive components. These include uncertainty components for any assumptions contained in the calibration of the system or computation of the data. Uncertainties are propagated at 1 sigma. The final total measurement uncertainty value is reported at the 95% confidence level. The SGSAS is a gamma assay system that is used to assay plutonium and uranium waste. The SGSAS system can be used in a stand-alone mode to perform the NDA characterization of a containe...

  1. Uncertainty of the beam energy measurement in the e+e- collision using Compton backscattering

    Science.gov (United States)

    Mo, Xiao-Hu

    2014-10-01

    The beam energy is measured in the e+e- collision by using Compton backscattering. The uncertainty of this measurement process is studied by virtue of analytical formulas, and the special effects of variant energy spread and energy drift on the systematic uncertainty estimation are also studied with the Monte Carlo sampling technique. These quantitative conclusions are especially important for understanding the uncertainty of the beam energy measurement system.

  2. Measurement Uncertainty Evaluation of Digital Modulation Quality Parameters: Magnitude Error and Phase Error

    Directory of Open Access Journals (Sweden)

    Zhan Zhiqiang

    2016-01-01

    Full Text Available In the traceability of digital modulation quality parameters, the Error Vector Magnitude, Magnitude Error and Phase Error must be traced, and the measurement uncertainty of these parameters needs to be assessed. Although the calibration specification JJF1128-2004 Calibration Specification for Vector Signal Analyzers has been published domestically, its measurement uncertainty evaluation is unreasonable: the parameters selected are incorrect, and not all error terms are included in the evaluation. This article lists the formulas for magnitude error and phase error, then presents the measurement uncertainty evaluation processes for magnitude error and phase error.
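
    For orientation, a sketch using standard textbook definitions of these three parameters (rms EVM, magnitude error and phase error over a symbol set); this is not the JJF1128-2004 procedure itself, and the QPSK test data are synthetic.

        import numpy as np

        def modulation_errors(ref, meas):
            # Error Vector Magnitude: rms error vector, % of rms reference magnitude.
            evm = np.sqrt(np.mean(np.abs(meas - ref) ** 2)
                          / np.mean(np.abs(ref) ** 2)) * 100.0
            # Magnitude error: rms difference of symbol magnitudes (%).
            mag = np.sqrt(np.mean((np.abs(meas) - np.abs(ref)) ** 2)
                          / np.mean(np.abs(ref) ** 2)) * 100.0
            # Phase error: rms difference of symbol phases (degrees).
            ph = np.degrees(np.sqrt(np.mean(np.angle(meas * ref.conj()) ** 2)))
            return evm, mag, ph

        rng = np.random.default_rng(2)
        # Synthetic QPSK reference symbols plus small complex noise.
        ref = (rng.choice([-1, 1], 1000) + 1j * rng.choice([-1, 1], 1000)) / np.sqrt(2)
        meas = ref + rng.normal(0, 0.02, 1000) + 1j * rng.normal(0, 0.02, 1000)

        evm, mag, ph = modulation_errors(ref, meas)
        print(f"EVM = {evm:.2f}%, magnitude error = {mag:.2f}%, phase error = {ph:.2f} deg")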

  3. Radiation-induced statistical uncertainty in the threshold voltage measurement of MOSFET dosimeters

    International Nuclear Information System (INIS)

    The results of a recent study on the limiting uncertainties in the measurement of photon radiation dose with MOSFET dosimeters are reported. The statistical uncertainty in dose measurement from a single device has been measured before and after irradiation. The resulting increase in 1/f noise with radiation dose has been investigated via various analytical models. The limit of uncertainty in the ubiquitous linear trend of threshold voltage with dose has been measured and compared to two nonlinear models. Inter-device uncertainty has been investigated in a group of 40 devices, and preliminary evidence for kurtosis and skewness in the distributions for devices without external bias has been observed.

  4. Measurement Uncertainty Evaluation Method Considering Correlation and its Application to Precision Centrifuge

    Directory of Open Access Journals (Sweden)

    Ling Mingxiang

    2014-12-01

    Full Text Available Measurement uncertainty evaluation based on the Monte Carlo method (MCM) under the assumption that all uncertainty sources are independent is common. For some measurement problems, however, the correlation between input quantities is of great importance and even essential. The purpose of this paper is to provide an uncertainty evaluation method based on the MCM that can handle correlated cases, especially measurements in which the uncertainty sources are correlated and follow non-Gaussian distributions. In this method, a linear-nonlinear transformation technique was developed to generate correlated random variable sampling sequences with prescribed marginal probability distributions and correlation coefficients. Measurement of the arm stretch of a precision centrifuge of the 10^-6 order was implemented by a high-precision approach, and the associated uncertainty evaluation was carried out using the proposed method and the method given in the Guide to the Expression of Uncertainty in Measurement (GUM). Finally, the obtained results were compared and discussed.
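
    A minimal sketch of one common form of such a linear-nonlinear transformation (a Gaussian-copula construction: Cholesky-correlated normals mapped through target quantile functions); the marginals and correlation below are assumptions for illustration, not the paper's own transformation.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)
        n = 50_000

        # Target correlation between the two uncertainty sources.
        rho = 0.7
        L = np.linalg.cholesky(np.array([[1.0, rho], [rho, 1.0]]))

        # Linear step: correlated standard-normal samples.
        z = rng.standard_normal((n, 2)) @ L.T

        # Nonlinear step: map to prescribed non-Gaussian marginals via the
        # probability integral transform (uniform -> target quantile function).
        u = stats.norm.cdf(z)
        x1 = stats.gamma(a=2.0, scale=0.5).ppf(u[:, 0])     # skewed source
        x2 = stats.uniform(loc=-1.0, scale=2.0).ppf(u[:, 1])  # bounded source

        print("sample correlation:", np.corrcoef(x1, x2)[0, 1])

    Because the marginal mapping is nonlinear, the Pearson correlation of the transformed samples only approximates the prescribed value; rank correlation is what the copula preserves exactly.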

  5. Uncertainty and certainty relations for successive projective measurements of a qubit in terms of Tsallis' entropies

    OpenAIRE

    Rastegin, Alexey E.

    2015-01-01

    We study uncertainty and certainty relations for two successive measurements of two-dimensional observables. Uncertainties in successive measurement are considered within the following two scenarios. In the first scenario, the second measurement is performed on the quantum state generated after the first measurement with completely erased information. In the second scenario, the second measurement is performed on the post-first-measurement state conditioned on the actual measurement outcome. ...

  6. Uncertainties in eddy covariance flux measurements assessed from CH4 and N2O observations

    International Nuclear Information System (INIS)

    The uncertainty in eddy covariance (EC) flux measurements is assessed for CH4 and N2O using data measured at a dairy farm site in the Netherlands in 2006 and 2007. An overview is given of the contributing uncertainties and their magnitudes. The relative and absolute uncertainties of a 30 min EC flux are estimated for CH4 and N2O using N = 2185 EC fluxes. The average absolute uncertainty and its standard deviation are 500 ± 400 ng C m-2 s-1 for CH4 and 100 ± 100 ng N m-2 s-1 for N2O. The corresponding relative uncertainties have 95% confidence intervals ranging from 20% to 300% for CH4 and from 30% to 1800% for N2O; the large relative uncertainties correspond to relatively small EC fluxes. The uncertainties are mainly caused by the uncertainty due to one-point sampling, which contributes on average more than 90% of the total uncertainty. The other 10% includes the uncertainty in the correction algorithm for the systematic errors. The uncertainties in daily and monthly averaged EC fluxes are estimated for several flux magnitude ranges. The daily and monthly average uncertainties are smaller than 25% and 10% for CH4, and smaller than 50% and 10% for N2O, respectively, based on fluxes larger than 100 ng C m-2 s-1 and 15 ng N m-2 s-1.
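
    The quoted daily and monthly uncertainties follow from averaging: if the 30 min errors are independent and random, the uncertainty of an N-interval mean shrinks as sqrt(sum of squared uncertainties)/N, i.e. roughly as 1/sqrt(N) for equal weights. A toy sketch with invented flux values:

        import numpy as np

        rng = np.random.default_rng(4)

        # Illustrative 30 min CH4 fluxes and their per-interval uncertainties
        # (48 intervals = one day; units: ng C m-2 s-1, values made up).
        flux = rng.normal(1500.0, 300.0, 48)
        u = np.full(48, 500.0)

        daily_mean = flux.mean()
        # Independent random errors shrink as 1/N when averaging.
        u_daily = np.sqrt(np.sum(u**2)) / len(u)
        print(f"daily mean = {daily_mean:.0f} ng C m-2 s-1, "
              f"u = {u_daily:.0f} ({100 * u_daily / daily_mean:.0f}%)")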

  7. How Should Attributions Be Measured? A Reanalysis of Data from Elig and Frieze.

    Science.gov (United States)

    Maruyama, Geoffrey

    1982-01-01

    T.W. Elig and I.H. Frieze used a multitrait, multimethod approach to contrast three methods for measuring attributions: unstructured/open-ended, structured/unidimensional, and structured/ipsative. This paper reanalyzed their data using confirmatory factor analysis techniques. (Author/PN)

  8. The Attributive Theory of Quality: A Model for Quality Measurement in Higher Education.

    Science.gov (United States)

    Afshar, Arash

    A theoretical basis for defining and measuring the quality of institutions of higher education, namely for accreditation purposes, is developed. The theory, the Attributive Theory of Quality, is illustrated using a calculation model that is based on general systems theory. The theory postulates that quality only exists in relation to the…

  9. Measures of improvement MUVoT, a Blended Learning course on the topic of Measurement Uncertainty for advanced Vocational Training

    OpenAIRE

    Groschl, Andreas; Gotz, Jurgen; Loderer, Andreas; Bills, Paul J.; Hausotte, Tino

    2015-01-01

    In verifying the tolerance specification and identifying the zone of conformity of a particular component, an adequate determination of the task-related measurement uncertainty of the utilized measurement method is required, in accordance with part one of the standard “Geometrical Product Specifications” as well as with the “Guide to the Expression of Uncertainty in Measurement”. Although measurement uncertainty is a central subject in the field of metrology and is certainly consider...

  10. Uncertainty of power curve measurement with a two-beam nacelle-mounted lidar

    DEFF Research Database (Denmark)

    Wagner, Rozenn; Courtney, Michael Stephen; Friis Pedersen, Troels;

    2015-01-01

    already been demonstrated to be suitable for use in power performance measurements. To be considered as a professional tool, however, power curve measurements performed using these instruments require traceable calibrated measurements and the quantification of the wind speed measurement uncertainty. Here … uncertainty lies between 1 and 2% for the wind speed range between cut-in and rated wind speed. Finally, the lidar was mounted on the nacelle of a wind turbine in order to perform a power curve measurement. The wind speed was simultaneously measured with a mast-top mounted cup anemometer placed two rotor diameters upwind of the turbine. The wind speed uncertainty related to the lidar tilting was calculated based on the tilt angle uncertainty derived from the inclinometer calibration and the deviation of the measurement height from hub height. The resulting combined uncertainty in the power curve using the…

  11. Uncertainty on differential measurements and its reduction using the calibration by comparison method

    Science.gov (United States)

    Ospina, José; Canuto, Enrico

    2008-08-01

    The paper deals with the uncertainty of differential measurements, obtained from the subtraction of a pair of absolute measurements. It is shown that if the same sensor is used to perform both measurements, a model of the sensor will reveal a correlation component between the uncertainties of the two absolute measurements, reducing the uncertainty of their difference. The procedure followed is based on the Gauss-Markov estimation method, and shows that the differential measurement uncertainty vanishes when the gradient to be measured is zero. If the two absolute measurements are to be performed using different sensors, a calibration by comparison between them will result in a similar uncertainty reduction. Finally, a simulated example based on commercially available thermistor data is included.
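
    The covariance effect is easy to see numerically. A small sketch, with an assumed common per-reading uncertainty, showing how the standard uncertainty of the difference d = x1 - x2, u(d) = sqrt(u1^2 + u2^2 - 2*rho*u1*u2), shrinks as the correlation rho between the two absolute measurements grows:

        import math

        def diff_uncertainty(u1, u2, rho):
            # Standard uncertainty of d = x1 - x2 with correlated inputs.
            return math.sqrt(u1**2 + u2**2 - 2.0 * rho * u1 * u2)

        u = 0.05  # K, assumed standard uncertainty of each absolute reading
        for rho in (0.0, 0.5, 0.9, 0.99):
            print(f"rho = {rho:4.2f} -> u(diff) = {diff_uncertainty(u, u, rho):.4f} K")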

  12. AVNG SYSTEM SOFTWARE - ATTRIBUTE VERIFICATION SYSTEM WITH INFORMATION BARRIERS FOR MASS AND ISOTOPICS MEASUREMENTS

    International Nuclear Information System (INIS)

    This report describes the software development for the plutonium attribute verification system--AVNG. A brief synopsis of the technical solution for the measurement system is presented. The main tasks for the software development that is underway are formulated. The development tasks are shown in software structural flowcharts, measurement system state diagram and a description of the software. The current status of the AVNG software development is elucidated

  13. AVNG System Software-Attribute Verification System with Information Barriers for Mass Isotopic Measurements

    International Nuclear Information System (INIS)

    This report describes the software development for the plutonium attribute verification system - AVNG. A brief synopsis of the technical solution for the measurement system is presented. The main tasks for the software development that is underway are formulated. The development tasks are shown in software structural flowcharts, measurement system state diagram and a description of the software. The current status of the AVNG software development is elucidated.

  14. A systematic review of quality attributes and measures for software product lines

    OpenAIRE

    Montagud Gregori, Sonia; Abrahao Gonzales, Silvia Mara; Insfrán Pelozo, César Emilio

    2012-01-01

    It is widely accepted that software measures provide an appropriate mechanism for understanding, monitoring, controlling, and predicting the quality of software development projects. In software product lines (SPL), quality is even more important than in a single software product since, owing to systematic reuse, a fault or an inadequate design decision could be propagated to several products in the family. Over the last few years, a great number of quality attributes and measures for assessi...

  15. Estimation of measurement uncertainties in X-ray computed tomography metrology using the substitution method

    DEFF Research Database (Denmark)

    Müller, Pavel; Hiller, Jochen; Dai, Y.;

    2014-01-01

    This paper presents the application of the substitution method for the estimation of measurement uncertainties using calibrated workpieces in X-ray computed tomography (CT) metrology. We have shown that this well-accepted method for uncertainty estimation using tactile coordinate measuring machines can be applied to dimensional CT measurements. The method is based on repeated measurements carried out on a calibrated master piece. The master piece is a component of a dose engine from an insulin pen. Measurement uncertainties estimated from the repeated measurements of the master piece were…

  16. Developing scales measuring disorder-specific intolerance of uncertainty (DSIU) : a new perspective on transdiagnostic

    NARCIS (Netherlands)

    Thibodeau, Michel A; Carleton, R Nicholas; McEvoy, Peter M; Zvolensky, Michael J; Brandt, Charles P; Boelen, Paul A; Mahoney, Alison E J; Deacon, Brett J; Asmundson, Gordon J G

    2015-01-01

    Intolerance of uncertainty (IU) is a construct of growing prominence in the literature on anxiety disorders and major depressive disorder. Existing measures of IU do not define the uncertainty that respondents perceive as distressing. To address this limitation, we developed eight scales measuring disorder-specific intolerance of uncertainty…

  17. Uncertainty in Citizen Science observations: from measurement to user perception

    Science.gov (United States)

    Lahoz, William; Schneider, Philipp; Castell, Nuria

    2016-04-01

    Citizen Science activities concern general public engagement in scientific research activities when citizens actively contribute to science either with their intellectual effort or surrounding knowledge or with their tools and resources. The advent of technologies such as the Internet and smartphones, and the growth in their usage, has significantly increased the potential benefits from Citizen Science activities. Citizen Science observations from low-cost sensors, smartphones and Citizen Observatories, provide a novel and recent development in platforms for observing the Earth System, with the opportunity to extend the range of observational platforms available to society to spatio-temporal scales (10-100s m; 1 hr or less) highly relevant to citizen needs. The potential value of Citizen Science is high, with applications in science, education, social aspects, and policy aspects, but this potential, particularly for citizens and policymakers, remains largely untapped. Key areas where Citizen Science data start to have demonstrable benefits include GEOSS Societal Benefit Areas such as Health and Weather. Citizen Science observations have many challenges, including simulation of smaller spatial scales, noisy data, combination with traditional observational methods (satellite and in situ data), and assessment, representation and visualization of uncertainty. Within these challenges, that of the assessment and representation of uncertainty and its communication to users is fundamental, as it provides qualitative and/or quantitative information that influences the belief users will have in environmental information. This presentation will discuss the challenges in assessment and representation of uncertainty in Citizen Science observations, its communication to users, including the use of visualization, and the perception of this uncertainty information by users of Citizen Science observations.

  18. Measurement uncertainty and gauge capability of surface roughness measurements in the automotive industry: a case study

    International Nuclear Information System (INIS)

    The calculation methods of the capability of measurement processes in the automotive industry differ from each other. There are three main calculation methods: MSA, VDA 5 and the international standard, ISO 22514–7. During this research our aim was to compare the capability calculation methods in a case study. Two types of automotive parts (ten pieces of each) are chosen to examine the behaviour of the manufacturing process and to measure the required characteristics of the measurement process being evaluated. The measurement uncertainty of the measuring process is calculated according to the VDA 5 and ISO 22514–7, and MSA guidelines. In this study the conformance of a measurement process in an automotive manufacturing process is determined, and the similarities and the differences between the methods used are shown. (paper)

  19. VNIIEF-ORNL Joint Plutonium Measurements with NMIS and Results of Plutonium Attributes Preliminary Evaluations

    International Nuclear Information System (INIS)

    Within the framework of TO No. 007 between ORNL and VNIIEF on mastering the Nuclear Materials Identification System (NMIS) at VNIIEF, joint measurements were completed in July 2000, using NMIS equipment provided to VNIIEF by ORNL as well as unclassified fissile-material samples produced by VNIIEF. This report presents results of preliminary processing of the experimental data to obtain absolute values of some attributes used in measurements of plutonium shells: their mass and thickness. The possibility of obtaining absolute values of fissile-material parameters from measurement data substantially widens the applicability of NMIS to tasks relevant to inspections of these materials

  20. Optimized spectroscopic scheme for enhanced precision CO measurements with applications to urban source attribution

    Science.gov (United States)

    Nottrott, A.; Hoffnagle, J.; Farinas, A.; Rella, C.

    2014-12-01

    Carbon monoxide (CO) is an urban pollutant generated by internal combustion engines which contributes to the formation of ground level ozone (smog). CO is also an excellent tracer for emissions from mobile combustion sources. In this work we present an optimized spectroscopic sampling scheme that enables enhanced precision CO measurements. The scheme was implemented on the Picarro G2401 Cavity Ring-Down Spectroscopy (CRDS) analyzer which measures CO2, CO, CH4 and H2O at 0.2 Hz. The optimized scheme improved the raw precision of CO measurements by 40% from 5 ppb to 3 ppb. Correlations of measured CO2, CO, CH4 and H2O from an urban tower were partitioned by wind direction and combined with a concentration footprint model for source attribution. The application of a concentration footprint for source attribution has several advantages. The upwind extent of the concentration footprint for a given sensor is much larger than the flux footprint. Measurements of mean concentration at the sensor location can be used to estimate source strength from a concentration footprint, while measurements of the vertical concentration flux are necessary to determine source strength from the flux footprint. Direct measurement of vertical concentration flux requires high frequency temporal sampling and increases the cost and complexity of the measurement system.

  1. Progress of the AVNG System - Attribute Verification System with Information Barriers for Mass Isotopics Measurements

    International Nuclear Information System (INIS)

    An attribute verification system (AVNG) with information barriers for mass and isotopics measurements has been designed and its fabrication is nearly completed. The AVNG is being built by scientists at the Russian Federal Nuclear Center-VNIIEF, with support of Los Alamos National Laboratory (LANL) and Lawrence Livermore National Laboratory (LLNL). Such a system could be used to verify the presence of several unclassified attributes of classified material with no classified information release. The system is comprised of a neutron multiplicity counter and a gamma-spectrometry system based on a high-purity germanium gamma detector (nominal relative efficiency of 50% at 1332 keV) and the digital gamma-ray spectrometer DSPECPLUS. The neutron multiplicity counter is a three-ring counter with 164 3He tubes. The system was designed to measure prototype containers 491 mm in diameter and 503 mm high. This paper provides a brief history of the project and documents the progress of this effort with drawings and photographs.

  2. PROGRESS OF THE AVNG SYSTEM - ATTRIBUTE VERIFICATION SYSTEM WITH INFORMATION BARRIERS FOR MASS AND ISOTOPICS MEASUREMENTS

    International Nuclear Information System (INIS)

    An attribute verification system (AVNG) with information barriers for mass and isotopics measurements has been designed and its fabrication is nearly completed. The AVNG is being built by scientists at the Russian Federal Nuclear Center-VNIIEF, with support of Los Alamos National Laboratory (LANL) and Lawrence Livermore National Laboratory (LLNL). Such a system could be used to verify the presence of several unclassified attributes of classified material with no classified information release. The system is comprised of a neutron multiplicity counter and a gamma-spectrometry system based on a high-purity germanium gamma detector (nominal relative efficiency of 50% at 1332 keV) and the digital gamma-ray spectrometer DSPECPLUS. The neutron multiplicity counter is a three-ring counter with 164 3He tubes. The system was designed to measure prototype containers 491 mm in diameter and 503 mm high. This paper provides a brief history of the project and documents the progress of this effort with drawings and photographs

  3. Doubt-free uncertainty in measurement an introduction for engineers and students

    CERN Document Server

    Ratcliffe, Colin

    2015-01-01

    This volume presents measurement uncertainty and uncertainty budgets in a form accessible to practicing engineers and engineering students from across a wide range of disciplines. The book gives a detailed explanation of the methods presented by NIST in the “GUM” – the Guide to the Expression of Uncertainty in Measurement. Emphasis is placed on explaining the background and meaning of the topics, while keeping the level of mathematics at the minimum necessary. Dr. Colin Ratcliffe, USNA, and Bridget Ratcliffe, Johns Hopkins, develop uncertainty budgets and explain their use. In some examples, the budget may show a process is already adequate and where costs can be saved. In other examples, the budget may show the process is inadequate and needs improvement. The book demonstrates how uncertainty budgets help identify the most cost-effective place to make changes. In addition, an extensive fully-worked case study leads readers through all issues related to an uncertainty analysis, including a variety of different types of...
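
    The budget idea the book builds on can be shown in a few lines: standard uncertainties and sensitivity coefficients are combined in quadrature, and each term's share of the total points to the most cost-effective improvement. A minimal sketch with hypothetical entries (not taken from the book):

        # GUM-style uncertainty budget: combine (c_i * u_i) terms in
        # quadrature and rank each contribution's share of the variance.
        import math

        budget = {                      # name: (sensitivity c_i, std. uncertainty u_i)
            "calibration standard": (1.0, 0.020),
            "repeatability":        (1.0, 0.050),
            "temperature drift":    (0.5, 0.010),
            "resolution":           (1.0, 0.003),
        }

        contributions = {k: (c * u) ** 2 for k, (c, u) in budget.items()}
        u_c = math.sqrt(sum(contributions.values()))  # combined standard uncertainty
        for name, v in sorted(contributions.items(), key=lambda kv: -kv[1]):
            print(f"{name:22s} {100 * v / u_c**2:5.1f} % of combined variance")
        print(f"u_c = {u_c:.4f}, expanded U (k=2) = {2 * u_c:.4f}")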

  4. Real Graphs from Real Data: Experiencing the Concepts of Measurement and Uncertainty

    Science.gov (United States)

    Farmer, Stuart

    2012-01-01

    A simple activity using cheap and readily available materials is described that allows students to experience first hand many of the concepts of measurement, uncertainty and graph drawing without laborious measuring or calculation. (Contains 9 figures.)

  5. Using Fuzzy Modifier in Similarity Measure of Fuzzy Attribute Graph and Its Automatic Selection in Structural Pattern Recognition

    OpenAIRE

    Payman Moallem

    2007-01-01

    Fuzzy Attribute Graph (FAG) is a powerful tool for the representation and recognition of structural patterns. The conventional framework for similarity measurement of FAGs is based on equivalent fuzzy attributes, but in the fuzzy world some attributes are more important than others. In this paper, a modified recognition framework using a linguistic modifier for matching fuzzy attribute graphs is introduced. Then an algorithm for automatic selection of the fuzzy modifier based on the learning patterns is proposed. So...

  6. Measurement and Segmentation of College Students' Noncognitive Attributes: A Targeted Review

    OpenAIRE

    Ann E. Person; Scott E. Baumgartner; Kristin Hallgren; Betsy Santos

    2014-01-01

    This report presents findings from a targeted document review and expert interviews conducted as part of the Student Segmentation Initiative, which was funded by the Bill & Melinda Gates Foundation’s Postsecondary Success strategy. The review addresses three questions relevant to the initiative: (1) What instruments and measures are available to assess postsecondary students’ noncognitive attributes? (2) To what extent are these instruments used to classify or segment student populations?...

  7. Evaluating the capabilities and uncertainties of droplet measurements for the fog droplet spectrometer (FM-100)

    OpenAIRE

    J. K. Spiegel; Zieger, P.; Bukowiecki, N.; E. Hammer; Weingartner, E.; W. Eugster

    2012-01-01

    Droplet size spectra measurements are crucial to obtain a quantitative microphysical description of clouds and fog. However, cloud droplet size measurements are subject to various uncertainties. This work focuses on the evaluation of two key measurement uncertainties arising during cloud droplet size measurements with a conventional droplet size spectrometer (FM-100): first, we addressed the precision with which droplets can be sized with the FM-100 on the basis of Mie theory. We deduced erro...

  8. Evaluating the capabilities and uncertainties of droplet measurements for the fog droplet spectrometer (FM-100)

    OpenAIRE

    J. K. Spiegel; Zieger, P.; Bukowiecki, N.; E. Hammer; Weingartner, E.; W. Eugster

    2012-01-01

    Droplet size spectra measurements are crucial to obtain a quantitative microphysical description of clouds and fog. However, cloud droplet size measurements are subject to various uncertainties. This work focuses on the error analysis of two key measurement uncertainties arising during cloud droplet size measurements with a conventional droplet size spectrometer (FM-100): first, we addressed the precision with which droplets can be sized with the FM-100 on the basis of the Mie theory. We dedu...

  9. Uncertainty quantification in modeling and measuring components with resonant ultrasound spectroscopy

    Science.gov (United States)

    Biedermann, Eric; Jauriqui, Leanne; Aldrin, John C.; Mayes, Alexander; Williams, Tom; Mazdiyasni, Siamack

    2016-02-01

    Resonant Ultrasound Spectroscopy (RUS) is a nondestructive evaluation (NDE) method which can be used for material characterization, defect detection, process control and life monitoring for critical components in gas turbine engines, aircraft and other systems. Accurate forward and inverse modeling for RUS requires a proper accounting of the propagation of uncertainty due to the model and measurement sources. A process for quantifying the propagation of uncertainty to RUS frequency results for models and measurements was developed. Epistemic and aleatory sources of uncertainty were identified for forward model parameters, forward model material property and geometry inputs, inverse model parameters, and physical RUS measurements. RUS model parametric studies were then conducted for simple geometric samples to determine the sensitivity of RUS frequencies and model inversion results to the various sources of uncertainty. The results of these parametric studies were used to calculate uncertainty bounds associated with each source. Uncertainty bounds were then compared to assess the relative impact of the various sources of uncertainty, and mitigations were identified. The elastic material property inputs for forward models, such as Young's Modulus, were found to be the most significant source of uncertainty in these studies. The end result of this work was the development of an uncertainty quantification process that can be adapted to a broad range of components and materials.

  10. On the uncertainty estimation of electromagnetic field measurements using field sensors: A general approach

    International Nuclear Information System (INIS)

    One of the most common and popular practices for measuring non-ionising electric and/or magnetic field strength employs field meters and the appropriate electric and/or magnetic field strength sensors. These measurements have to meet several requirements proposed by specific guidelines or standards. On the other hand, performing non-ionising exposure assessment using real measurement data can be a very difficult task due to instrumentation limits and uncertainties. In addition, each measuring technique, practice and recommendation has its own drawbacks. In this paper, a methodology for estimating the overall uncertainty for such measurements, including uncertainty estimation of spatial average values of electric or magnetic field strengths, is proposed. Estimating and reporting measurement uncertainty are of great importance, especially when the measured values are very close to the established limits of human exposure to non-ionising electromagnetic fields. (authors)

  11. GUM approach to uncertainty estimations for online 220Rn concentration measurements using Lucas scintillation cell

    International Nuclear Information System (INIS)

    It is now widely recognized that, when all of the known or suspected components of errors have been evaluated and corrected, there still remains an uncertainty, that is, a doubt about how well the result of the measurement represents the value of the quantity being measured. Evaluation of measurement data - Guide to the expression of Uncertainty in Measurement (GUM) is a guidance document, the purpose of which is to promote full information on how uncertainty statements are arrived at and to provide a basis for the international comparison of measurement results. In this paper, uncertainty estimations following GUM guidelines have been made for the measured values of online thoron concentrations using Lucas scintillation cell to prove that the correction for disequilibrium between 220Rn and 216Po is significant in online 220Rn measurements

  12. Measuring the performance of sensors that report uncertainty

    CERN Document Server

    Martin, A D; Parry, M

    2014-01-01

    We provide methods to validate and compare sensor outputs, or inference algorithms applied to sensor data, by adapting statistical scoring rules. The reported output should either be in the form of a prediction interval or of a parameter estimate with corresponding uncertainty. Using knowledge of the 'true' parameter values, scoring rules provide a method of ranking different sensors or algorithms for accuracy and precision. As an example, we apply the scoring rules to the inferred masses of cattle from ground force data and draw conclusions on which rules are most meaningful and in which way.
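
    One widely used rule for interval-valued reports is the interval score of Gneiting and Raftery (2007), which rewards narrow prediction intervals that still cover the known 'true' value; a minimal sketch with hypothetical cattle-mass numbers (not the paper's data):

        # Interval score for a central (1 - alpha) prediction interval
        # [lower, upper]; lower scores are better.
        def interval_score(lower, upper, truth, alpha=0.05):
            score = upper - lower                        # width penalty
            if truth < lower:
                score += (2 / alpha) * (lower - truth)   # miss below
            elif truth > upper:
                score += (2 / alpha) * (truth - upper)   # miss above
            return score

        # Three sensors report 95% intervals for a true mass of 512 kg
        print(interval_score(490, 530, 512))   # narrow and covering: 40.0
        print(interval_score(430, 600, 512))   # wide but covering: 170.0
        print(interval_score(520, 540, 512))   # narrow but missing: 340.0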

  13. Comparison of model predictions with measurements using the improved spent fuel attribute tester

    International Nuclear Information System (INIS)

    Design improvements for the International Atomic Energy Agency's Spent Fuel Attribute Tester, recommended on the basis of an optimization study, were incorporated into a new instrument fabricated under the Finnish Support Programme. The new instrument was tested at a spent fuel storage pool on September 8 and 9, 1993. The results of two of the measurements have been compared with calculations. In both cases the calculated and measured pulse-height spectra are in good agreement, and the 137Cs gamma peak signature from the target spent fuel element is present

  14. Realistic uncertainties on Hapke model parameters from photometric measurement

    CERN Document Server

    Schmidt, Frederic

    2015-01-01

    Hapke proposed a convenient and widely used analytical model to describe the spectro-photometry of granular materials. Using a compilation of the published data, Hapke (2012, Icarus, 221, 1079-1083) recently studied the relationship of b and c for natural examples and proposed the hockey stick relation (excluding b>0.5 and c>0.5). For the moment, there is no theoretical explanation for this relationship. One goal of this article is to study a possible bias due to the retrieval method. We expand here an innovative Bayesian inversion method in order to study in detail the uncertainties of retrieved parameters. On Emission Phase Function (EPF) data, we demonstrate that the uncertainties of the retrieved parameters follow the same hockey stick relation, suggesting that this relation is due to the fact that b and c are coupled parameters in the Hapke model rather than a natural phenomenon. Nevertheless, the data used in the Hapke (2012) compilation generally are full Bidirectional Reflectance Distribution Function (B...

  15. Liquid Crystal Thermography Measurement Uncertainty Analysis and Its Application to Turbulent Heat Transfer Measurements

    Directory of Open Access Journals (Sweden)

    Yu Rao

    2012-01-01

    Liquid crystal thermography is an advanced nonintrusive measurement technique, which is capable of providing a high-accuracy continuous temperature field measurement, especially for a complex structured heat transfer surface. The first part of the paper presents a comprehensive introduction to the thermochromic liquid crystal material and the related liquid crystal thermography technique. Then, based on the authors' experiences in using the liquid crystal thermography for the heat transfer measurement, the parameters affecting the measurement uncertainty of the liquid crystal thermography have been discussed in detail through an experimental study. The final part of the paper describes the applications of the steady and transient liquid crystal thermography technique in the study of the turbulent flow heat transfer related to the aeroengine turbine blade cooling.

  16. The Effects of information barrier requirements on the trilateral initiative attribute measurement system (AVNG)

    International Nuclear Information System (INIS)

    Although the detection techniques used for measuring classified materials are very similar to those used in unclassified measurements, the surrounding packaging is generally very different. If a classified item is to be measured, an information barrier is required to protect any classified data acquired. This information barrier must protect the classified information while giving the inspector confidence that the unclassified outputs accurately reflect the classified inputs. Both information-barrier and authentication requirements must be taken into account during all phases of system design and fabrication. One example of such a measurement system is the attribute measurement system (termed the AVNG) designed for the Trilateral Initiative. We will discuss the integration of information barrier components into this system as well as the effects of information-barrier (including authentication) concerns on the implementation of the detector systems.

  17. The CSGU: a measure of controllability, stability, globality, and universality attributions.

    Science.gov (United States)

    Coffee, Pete; Rees, Tim

    2008-10-01

    This article reports initial evidence of construct validity for a four-factor measure of attributions assessing the dimensions of controllability, stability, globality, and universality (the CSGU). In Study 1, using confirmatory factor analysis, factors were confirmed across least successful and most successful conditions. In Study 2, following less successful performances, correlations supported hypothesized relationships between subscales of the CSGU and subscales of the CDSII (McAuley, Duncan, & Russell, 1992). In Study 3, following less successful performances, moderated hierarchical regression analyses demonstrated that individuals have higher subsequent self-efficacy when they perceive causes of performance as controllable, and/or specific, and/or universal. An interaction for controllability and stability demonstrated that if causes are perceived as likely to recur, it is important to perceive that causes are controllable. Researchers are encouraged to use the CSGU to examine main and interactive effects of controllability and generalizability attributions upon outcomes such as self-efficacy, emotions, and performance. PMID:18971514

  18. High Speed Railway Environment Safety Evaluation Based on Measurement Attribute Recognition Model

    Directory of Open Access Journals (Sweden)

    Qizhou Hu

    2014-01-01

    To rationally evaluate the operational safety level of high-speed railways, an environmental safety evaluation index system should be established by analyzing the impact mechanisms of severe weather such as rain, thunder, lightning, earthquakes, wind, and snow. Attribute recognition is then used to determine the similarity between samples and their corresponding attribute classes in the multidimensional space, on the basis of a Mahalanobis distance measurement function, which accounts for correlations among attributes and is not affected by dimensional units. On this basis, the environmental safety situation of China's high-speed railways is evaluated with the suggested methods. The results of the detailed analysis show that the evaluation basically matches the actual situation and could lay a scientific foundation for high-speed railway operational safety.
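
    A minimal sketch of the recognition step described above, assuming each attribute class is summarized by a mean vector and the classes share a covariance matrix estimated from reference samples; all index values here are hypothetical stand-ins for the paper's weather-impact data:

        # Attribute recognition by minimum Mahalanobis distance.
        import numpy as np

        reference = np.random.default_rng(0).normal(size=(50, 3))  # stand-in samples
        cov_inv = np.linalg.inv(np.cov(reference, rowvar=False))   # inverse covariance

        def mahalanobis(x, mean, cov_inv):
            d = x - mean
            return float(np.sqrt(d @ cov_inv @ d))

        class_means = {"safe":     np.array([0.2, 0.1, 0.0]),
                       "marginal": np.array([1.0, 1.2, 0.9]),
                       "unsafe":   np.array([2.5, 2.0, 2.2])}

        sample = np.array([0.9, 1.1, 1.0])   # observed severe-weather indices
        label = min(class_means,
                    key=lambda k: mahalanobis(sample, class_means[k], cov_inv))
        print(label)   # nearest attribute class in Mahalanobis distance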

  19. Uncertainty analysis of steady state incident heat flux measurements in hydrocarbon fuel fires.

    Energy Technology Data Exchange (ETDEWEB)

    Nakos, James Thomas

    2005-12-01

    The objective of this report is to develop uncertainty estimates for three heat flux measurement techniques used for the measurement of incident heat flux in a combined radiative and convective environment. This is related to the measurement of heat flux to objects placed inside hydrocarbon fuel (diesel, JP-8 jet fuel) fires, which is very difficult to make accurately (e.g., less than 10%). Three methods will be discussed: a Schmidt-Boelter heat flux gage; a calorimeter and inverse heat conduction method; and a thin plate and energy balance method. Steady state uncertainties were estimated for two types of fires (i.e., calm wind and high winds) at three times (early in the fire, late in the fire, and at an intermediate time). Results showed a large uncertainty for all three methods. Typical uncertainties for a Schmidt-Boelter gage ranged from ±23% for high wind fires to ±39% for low wind fires. For the calorimeter/inverse method the uncertainties were ±25% to ±40%. For the thin plate/energy balance method the uncertainties ranged from ±21% to ±42%. The 23-39% uncertainties for the Schmidt-Boelter gage are much larger than the quoted uncertainty for a radiative-only environment (i.e., ±3%). This large difference is due to the convective contribution and because the gage sensitivities to radiative and convective environments are not equal. All these values are larger than desired, which suggests the need for improvements in heat flux measurements in fires.

  20. Measuring Research Data Uncertainty in the 2010 NRC Assessment of Geography Graduate Education

    Science.gov (United States)

    Shortridge, Ashton; Goldsberry, Kirk; Weessies, Kathleen

    2011-01-01

    This article characterizes and measures errors in the 2010 National Research Council (NRC) assessment of research-doctorate programs in geography. This article provides a conceptual model for data-based sources of uncertainty and reports on a quantitative assessment of NRC research data uncertainty for a particular geography doctoral program.…

  1. A model for the time uncertainty measurements in the Auger surface detector array

    OpenAIRE

    Bonifazi, C.; Letessier-Selvon, A.; Santos, E. M.

    2007-01-01

    The precise determination of the arrival direction of cosmic rays is a fundamental prerequisite for the search for sources or the study of their anisotropies on the sky. One of the most important aspects to achieve an optimal measurement of these directions is to properly take into account the measurement uncertainties in the estimation procedure. In this article we present a model for the uncertainties associated with the time measurements in the Auger surface detector array. We show that th...

  2. Helium Mass Spectrometer Leak Detection: A Method to Quantify Total Measurement Uncertainty

    Science.gov (United States)

    Mather, Janice L.; Taylor, Shawn C.

    2015-01-01

    In applications where leak rates of components or systems are evaluated against a leak rate requirement, the uncertainty of the measured leak rate must be included in the reported result. However, in the helium mass spectrometer leak detection method, the sensitivity, or resolution, of the instrument is often the only component of the total measurement uncertainty noted when reporting results. To address this shortfall, a measurement uncertainty analysis method was developed that includes the leak detector unit's resolution, repeatability, hysteresis, and drift, along with the uncertainty associated with the calibration standard. In a step-wise process, the method identifies the bias and precision components of the calibration standard, the measurement correction factor (K-factor), and the leak detector unit. Together these individual contributions to error are combined and the total measurement uncertainty is determined using the root-sum-square method. It was found that the precision component contributes more to the total uncertainty than the bias component, but the bias component is not insignificant. For helium mass spectrometer leak rate tests where unit sensitivity alone is not enough, a thorough evaluation of the measurement uncertainty such as the one presented herein should be performed and reported along with the leak rate value.

  3. Realistic uncertainties on Hapke model parameters from photometric measurement

    Science.gov (United States)

    Schmidt, Frédéric; Fernando, Jennifer

    2015-11-01

    The single particle phase function describes the manner in which an average element of a granular material diffuses the light in the angular space, usually with two parameters: the asymmetry parameter b describing the width of the scattering lobe and the backscattering fraction c describing the main direction of the scattering lobe. Hapke proposed a convenient and widely used analytical model to describe the spectro-photometry of granular materials. Using a compilation of the published data, Hapke (Hapke, B. [2012]. Icarus 221, 1079-1083) recently studied the relationship of b and c for natural examples and proposed the hockey stick relation (excluding b > 0.5 and c > 0.5). For the moment, there is no theoretical explanation for this relationship. One goal of this article is to study a possible bias due to the retrieval method. We expand here an innovative Bayesian inversion method in order to study in detail the uncertainties of retrieved parameters. On Emission Phase Function (EPF) data, we demonstrate that the uncertainties of the retrieved parameters follow the same hockey stick relation, suggesting that this relation is due to the fact that b and c are coupled parameters in the Hapke model rather than a natural phenomenon. Nevertheless, the data used in the Hapke (Hapke, B. [2012]. Icarus 221, 1079-1083) compilation generally are full Bidirectional Reflectance Distribution Function (BRDF) data that are shown not to be subject to this artifact. Moreover, the Bayesian method is a good tool to test if the sampling geometry is sufficient to constrain the parameters (single scattering albedo, surface roughness, b, c, opposition effect). We performed sensitivity tests by mimicking various surface scattering properties and various single image-like/disk resolved image, EPF-like and BRDF-like geometric sampling conditions. The second goal of this article is to estimate the favorable geometric conditions for an accurate estimation of photometric parameters in order to provide

  4. Total Measurement Uncertainty (TMU) for Nondestructive Assay of Transuranic (TRU) Waste at the WRAP Facility

    International Nuclear Information System (INIS)

    The Waste Receiving and Processing (WRAP) facility, located on the Hanford Site in southeast Washington, is a key link in the certification of Hanford's transuranic (TRU) waste for shipment to the Waste Isolation Pilot Plant (WIPP). Waste characterization is one of the vital functions performed at WRAP, and nondestructive assay (NDA) measurements of TRU waste containers is one of two required methods used for waste characterization (Reference 1). Various programs exist to ensure the validity of waste characterization data; all of these cite the need for clearly defined knowledge of uncertainty, associated with any measurements taken. All measurements have an inherent uncertainty associated with them. The combined effect of all uncertainties associated with a measurement is referred to as the Total Measurement Uncertainty (TMU). The NDA measurement uncertainties can be numerous and complex. In addition to system-induced measurement uncertainty, other factors contribute to the TMU, each associated with a particular measurement. The NDA measurements at WRAP are based on processes (radioactive decay and induced fission) which are statistical in nature. As a result, the proper statistical summation of the various uncertainty components is essential. This report examines the contributing factors to NDA measurement uncertainty at WRAP. The significance of each factor on the TMU is analyzed, and a final method is given for determining the TMU for NDA measurements at WRAP. As more data becomes available, and WRAP gains in operational experience, this report will be reviewed semi-annually and updated as necessary. This report also includes the data flow paths for the analytical process in the radiometric determinations

  5. Total Measurement Uncertainty (TMU) for Nondestructive Assay of Transuranic (TRU) Waste at the WRAP Facility

    Energy Technology Data Exchange (ETDEWEB)

    WILLS, C.E.

    2000-02-24

    The Waste Receiving and Processing (WRAP) facility, located on the Hanford Site in southeast Washington, is a key link in the certification of Hanford's transuranic (TRU) waste for shipment to the Waste Isolation Pilot Plant (WIPP). Waste characterization is one of the vital functions performed at WRAP, and nondestructive assay (NDA) measurements of TRU waste containers is one of two required methods used for waste characterization (Reference 1). Various programs exist to ensure the validity of waste characterization data; all of these cite the need for clearly defined knowledge of uncertainty, associated with any measurements taken. All measurements have an inherent uncertainty associated with them. The combined effect of all uncertainties associated with a measurement is referred to as the Total Measurement Uncertainty (TMU). The NDA measurement uncertainties can be numerous and complex. In addition to system-induced measurement uncertainty, other factors contribute to the TMU, each associated with a particular measurement. The NDA measurements at WRAP are based on processes (radioactive decay and induced fission) which are statistical in nature. As a result, the proper statistical summation of the various uncertainty components is essential. This report examines the contributing factors to NDA measurement uncertainty at WRAP. The significance of each factor on the TMU is analyzed, and a final method is given for determining the TMU for NDA measurements at WRAP. As more data becomes available, and WRAP gains in operational experience, this report will be reviewed semi-annually and updated as necessary. This report also includes the data flow paths for the analytical process in the radiometric determinations.

  6. Alternative risk measure for decision-making under uncertainty in water management

    Institute of Scientific and Technical Information of China (English)

    Yueping Xu; YeouKoung Tung; Jia Li; Shaofeng Niu

    2009-01-01

    Taking into account uncertainties in water management remains a challenge due to social, economic and environmental changes. Often, uncertainty creates difficulty in ranking or comparing multiple water management options, possibly leading to a wrong decision. In this paper, an alternative risk measure is proposed to facilitate the ranking or comparison of water management options under uncertainty by using the concepts of conditional expected loss and partial mean. This measure has the advantages of being more intuitive and general, and it can be related to many other measures of risk in the literature. The application of the risk measure is demonstrated through a case study on the evaluation of flood mitigation projects. The results show that the new measure is applicable to a general decision-making process under uncertainty.
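
    The two concepts named in the abstract are easy to state from Monte Carlo loss samples; the definitions below follow the usual conventions, and the flood-loss distribution is a hypothetical stand-in:

        # Conditional expected loss E[L | L > t] and partial mean E[L * 1{L > t}].
        import numpy as np

        rng = np.random.default_rng(1)
        losses = rng.lognormal(mean=2.0, sigma=1.0, size=100_000)  # simulated losses
        t = 20.0                                                   # loss threshold

        exceed = losses > t
        partial_mean = np.mean(np.where(exceed, losses, 0.0))
        cond_expected_loss = losses[exceed].mean()

        print(f"P(L > t) = {exceed.mean():.4f}")
        print(f"partial mean = {partial_mean:.3f}")
        print(f"conditional expected loss = {cond_expected_loss:.3f}")
        # Ranking mitigation options by conditional expected loss penalizes
        # severe outcomes even when mean losses are similar.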

  7. Role and Significance of Uncertainty in HV Measurement of Porcelain Insulators - a Case Study

    Science.gov (United States)

    Choudhary, Rahul Raj; Bhardwaj, Pooja; Dayama, Ravindra

    The improvement of safety margins in complex systems has attained prime importance in the modern scientific environment. The analysis and implementation of complex systems demand well-quantified accuracy and capability of measurements. Careful measurement with properly identified and quantified uncertainties can lead to genuine discovery, which may in turn contribute to social development. Unfortunately, most scientists and students are passively taught to ignore the possibility of definition problems in the field of measurement, and these are often a source of great argument. Recognizing this issue, ISO has initiated the standardisation of methodologies, but its Guide to the Expression of Uncertainty in Measurement (GUM) has yet to be adopted seriously in tertiary education institutions for teaching the concept of uncertainty. This paper focuses on the concepts of measurement and uncertainty. Furthermore, a case study on the calculation and quantification of the uncertainty of measurement (UOM) for high-voltage electrical testing of ceramic insulators is presented.

  8. Utilization of Software Tools for Uncertainty Calculation in Measurement Science Education

    International Nuclear Information System (INIS)

    Despite its importance, uncertainty is often neglected by practitioners in the design of systems, even in safety-critical applications. Thus, problems arising from uncertainty may only be identified late in the design process and thus lead to additional costs. Although there exist numerous tools to support uncertainty calculation, reasons for their limited usage in early design phases may be low awareness of the existence of the tools and insufficient training in their practical application. We present a teaching philosophy that addresses uncertainty from the very beginning of teaching measurement science, in particular with respect to the utilization of software tools. The developed teaching material is based on the GUM method and makes use of uncertainty toolboxes in the simulation environment. Based on examples in measurement science education we discuss advantages and disadvantages of the proposed teaching philosophy and include feedback from students
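
    The record does not name a specific toolbox; as one example of the kind of tool meant, the open-source Python package `uncertainties` propagates GUM-style standard uncertainties through ordinary arithmetic (first-order approximation):

        # Linear propagation of standard uncertainties with the
        # `uncertainties` package.
        from uncertainties import ufloat
        from uncertainties.umath import sin

        # A resistance measurement: V = 5.00 +/- 0.02 V, I = 0.100 +/- 0.001 A
        V = ufloat(5.00, 0.02)
        I = ufloat(0.100, 0.001)
        print(V / I)             # R with its propagated standard uncertainty

        # Nonlinear functions are handled through the umath module
        theta = ufloat(0.50, 0.01)
        print(sin(theta))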

  9. Estimation of the measurement uncertainty in magnetic resonance velocimetry based on statistical models

    Science.gov (United States)

    Bruschewski, Martin; Freudenhammer, Daniel; Buchenberg, Waltraud B.; Schiffer, Heinz-Peter; Grundmann, Sven

    2016-05-01

    Velocity measurements with magnetic resonance velocimetry offer outstanding possibilities for experimental fluid mechanics. The purpose of this study was to provide practical guidelines for the estimation of the measurement uncertainty in such experiments. Based on various test cases, it is shown that the uncertainty estimate can vary substantially depending on how the uncertainty is obtained. The conventional approach to estimate the uncertainty from the noise in the artifact-free background can lead to wrong results. A deviation of up to -75 % is observed with the presented experiments. In addition, a similarly high deviation is demonstrated with the data from other studies. As a more accurate approach, the uncertainty is estimated directly from the image region with the flow sample. Two possible estimation methods are presented.

  10. Computer-assisted uncertainty assessment of k0-NAA measurement results

    International Nuclear Information System (INIS)

    In quantifying the measurement uncertainty of results obtained by k0-based neutron activation analysis (k0-NAA), a number of parameters should be considered and appropriately combined in deriving the final budget. To facilitate this process, a program ERON (ERror propagatiON) was developed, which computes uncertainty propagation factors from the relevant formulae and calculates the combined uncertainty. The program calculates the uncertainty of the final result (the mass fraction of an element in the measured sample), taking into account the relevant neutron flux parameters such as α and f, including their uncertainties. Nuclear parameters and their uncertainties are taken from the IUPAC database (V.P. Kolotov and F. De Corte, Compilation of k0 and related data for NAA). Furthermore, the program allows for uncertainty calculations of the measured parameters needed in k0-NAA: α (determined with either the Cd-ratio or the Cd-covered multi-monitor method), f (using the Cd-ratio or the bare method), Q0 (using the Cd-ratio or internal comparator method) and k0 (using the Cd-ratio, internal comparator or the Cd subtraction method). The results of calculations can be printed or exported to text or MS Excel format for further analysis. Special care was taken to make the calculation engine portable so that it can be incorporated into other applications (e.g., DLL and WWW server). The theoretical basis and the program are described in detail, and typical results obtained under real measurement conditions are presented

  11. Attribute measurement equipment for the verification of plutonium in classified forms for the Trilateral Initiative

    International Nuclear Information System (INIS)

    A team of technical experts from the Russian Federation, the International Atomic Energy Agency (IAEA) and the United States have been working for almost five years on the development of a tool kit of instruments that could be used to verify plutonium-bearing items that have classified characteristics in nuclear weapons states. This suite of instruments is similar in many ways to standard safeguards equipment and includes high-resolution gamma-ray spectrometers, neutron multiplicity counters, gross neutron counters and gross gamma-ray detectors. In safeguards applications, this equipment is known to be robust, and authentication methods are well understood. This equipment is very intrusive, however, and a traditional safeguards application of such equipment for verification of materials with classified characteristics would reveal classified information to the inspector. Several enabling technologies have been or are being developed to facilitate the use of these trusted but intrusive technologies. In this paper, these technologies will be described. One of the new technologies is called an 'Attribute Verification System with an Information Barrier Utilizing Neutron Multiplicity Counting and High-Resolution Gamma-Ray Spectrometry', or AVNG. The radiation measurement equipment, comprising a neutron multiplicity counter and high-resolution gamma-ray spectrometer, is standard safeguards-type equipment with information security features added. The information barrier is a combination of technical and procedural methods that protect classified information while allowing the inspector to have confidence that the measurement equipment is providing authentic results. The approach is to reduce the radiation data collected by the measurement equipment to a simple 'yes/no' result regarding attributes of the plutonium-bearing item. The 'yes/no' result is unclassified by design so that it can be shared with an inspector. The attributes that the Trilateral Initiative

  12. Universal Uncertainty Principle, Simultaneous Measurability, and Weak Values

    OpenAIRE

    Ozawa, Masanao

    2011-01-01

    In the conventional formulation, it is broadly accepted that simultaneous measurability and commutativity of observables are equivalent. However, several objections claim that there are cases in which even nowhere commuting observables can be measured simultaneously. Here, we outline a new theory of simultaneous measurements based on a state-dependent formulation, in which nowhere commuting observables are shown to have simultaneous measurements in some states, so that the known o...

  13. The grey relational approach for evaluating measurement uncertainty with poor information

    International Nuclear Information System (INIS)

    The Guide to the Expression of Uncertainty in Measurement (GUM) is the master document for measurement uncertainty evaluation. However, the GUM may encounter problems and does not work well when the measurement data have poor information. In most cases, poor information means a small data sample and an unknown probability distribution. In these cases, the evaluation of measurement uncertainty has become a bottleneck in practical measurement. To solve this problem, a novel method called the grey relational approach (GRA), different from the statistical theory, is proposed in this paper. The GRA does not require a large sample size or probability distribution information of the measurement data. Mathematically, the GRA can be divided into three parts. Firstly, according to grey relational analysis, the grey relational coefficients between the ideal and the practical measurement output series are obtained. Secondly, the weighted coefficients and the measurement expectation function will be acquired based on the grey relational coefficients. Finally, the measurement uncertainty is evaluated based on grey modeling. In order to validate the performance of this method, simulation experiments were performed and the evaluation results show that the GRA can keep the average error around 5%. Besides, the GRA was also compared with the grey method, the Bessel method, and the Monte Carlo method by a real stress measurement. Both the simulation experiments and real measurement show that the GRA is appropriate and effective to evaluate the measurement uncertainty with poor information. (paper)
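
    The first step described above reduces to a one-line formula; a minimal sketch with the customary distinguishing coefficient rho = 0.5 and hypothetical series (not the paper's stress data):

        # Grey relational coefficients between an ideal (reference) series
        # and a measured series.
        import numpy as np

        def grey_relational_coefficients(reference, series, rho=0.5):
            delta = np.abs(np.asarray(reference, float) - np.asarray(series, float))
            return (delta.min() + rho * delta.max()) / (delta + rho * delta.max())

        ideal    = [1.00, 1.00, 1.00, 1.00, 1.00]
        measured = [0.98, 1.03, 0.97, 1.01, 1.00]
        xi = grey_relational_coefficients(ideal, measured)
        print(xi)   # values in (0, 1]; their weighted mean gives the grey
                    # relational grade used in the later steps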

  14. Uncertainty analysis guide

    Energy Technology Data Exchange (ETDEWEB)

    Andres, T.H

    2002-05-01

    This guide applies to the estimation of uncertainty in quantities calculated by scientific, analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide to the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner by which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)
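
    Of the propagation options listed, the Monte Carlo route is the simplest to sketch: sample the uncertain inputs, re-run the model, and summarize the spread of the output. The model and distributions below are hypothetical stand-ins:

        # Monte Carlo propagation of input uncertainty through a model.
        import numpy as np

        def model(k, q):                  # stand-in for a code's calculation
            return q / (4.0 * np.pi * k)

        rng = np.random.default_rng(42)
        n = 100_000
        k = rng.normal(0.60, 0.05, n)     # input 1: mean, standard uncertainty
        q = rng.normal(100.0, 5.0, n)     # input 2: mean, standard uncertainty

        y = model(k, q)
        lo, hi = np.percentile(y, [2.5, 97.5])
        print(f"mean = {y.mean():.2f}, std = {y.std(ddof=1):.2f}")
        print(f"95% interval = [{lo:.2f}, {hi:.2f}]")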

  15. Uncertainty analysis guide

    International Nuclear Information System (INIS)

    This guide applies to the estimation of uncertainty in quantities calculated by scientific, analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide to the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner by which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)

  16. Uncertainty measurement in the homogenization and sample reduction in the physical classification of rice and beans

    Directory of Open Access Journals (Sweden)

    Dieisson Pivoto

    2016-04-01

    The study aimed to (i) quantify the measurement uncertainty in the physical tests of rice and beans for a hypothetical defect, (ii) verify whether homogenization and sample reduction in the physical classification tests of rice and beans are effective in reducing the measurement uncertainty of the process, and (iii) determine whether increasing the size of the bean sample significantly increases accuracy and reduces measurement uncertainty. Hypothetical defects in rice and beans with different damage levels were simulated according to the testing methodology determined by the Normative Ruling for each product. The homogenization and sample reduction in the physical classification of rice and beans are not effective, transferring a high measurement uncertainty to the final test result. The sample size indicated by the Normative Ruling did not allow an appropriate homogenization and should be increased.

  17. Particle image velocimetry correlation signal-to-noise ratio metrics and measurement uncertainty quantification

    CERN Document Server

    Xue, Zhenyu; Vlachos, Pavlos P

    2014-01-01

    In particle image velocimetry (PIV) the measurement signal is contained in the recorded intensity of the particle image pattern superimposed on a variety of noise sources. The signal-to-noise-ratio (SNR) strength governs the resulting PIV cross correlation and ultimately the accuracy and uncertainty of the resulting PIV measurement. Hence we posit that correlation SNR metrics calculated from the correlation plane can be used to quantify the quality of the correlation and the resulting uncertainty of an individual measurement. In this paper we present a framework for evaluating the correlation SNR using a set of different metrics, which in turn are used to develop models for uncertainty estimation. The SNR metrics and corresponding models presented herein are expanded to be applicable to both standard and filtered correlations. In addition, the notion of a valid measurement is redefined with respect to the correlation peak width in order to be consistent with uncertainty quantification principles and distinct ...
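
    A minimal sketch of one common correlation SNR metric, the primary peak ratio (the tallest correlation peak divided by the second-tallest peak away from it); the synthetic correlation plane is a hypothetical stand-in for real PIV data:

        # Primary peak ratio (PPR) from a correlation plane.
        import numpy as np

        def primary_peak_ratio(corr, exclusion=3):
            corr = corr.copy()
            i, j = np.unravel_index(np.argmax(corr), corr.shape)
            p1 = corr[i, j]
            corr[max(0, i - exclusion):i + exclusion + 1,
                 max(0, j - exclusion):j + exclusion + 1] = -np.inf  # mask peak 1
            return p1 / corr.max()             # ratio to the second-tallest peak

        y, x = np.mgrid[0:64, 0:64]
        corr = np.exp(-((x - 40) ** 2 + (y - 22) ** 2) / 8.0)     # displacement peak
        corr += 0.05 * np.random.default_rng(3).random((64, 64))  # noise floor
        print(f"PPR = {primary_peak_ratio(corr):.1f}")  # higher => more reliable vector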

  18. Evidential Reasoning-Based Approach for Multiple Attribute Decision Making Problems under Uncertainty

    Institute of Scientific and Technical Information of China (English)

    郭凯红; 李文立

    2012-01-01

    Previous studies show that the evidential reasoning algorithm is an effective and rational method for solving MADM (Multiple Attribute Decision Making) problems under uncertainty. However, the method is constrained in that attribute weights must be deterministic and the evaluation grades assessing basic attributes and general attributes must be consistent. These constraints do not match actual decision-making problems, especially for basic qualitative attributes. Existing subjective and objective methods for determining basic attribute weights have defects, and most methods assume that the grades used to evaluate basic and general attributes are the same. Therefore, these methods are not effective in assisting the decision-making process. In consideration of these weaknesses, this study proposes a method based on evidential reasoning for MADM under uncertainty, with the goal of extending the evidential reasoning algorithm to a more general decision environment. The first part determines basic attribute weights. We first briefly introduce the evidential reasoning algorithm, discussing two major issues related to its effective application for MADM under uncertainty: (1) how to determine basic attribute weights completely, and (2) how to fully implement the transformation of distributed assessments from basic attributes into general attributes. To solve the first problem, we calculate basic attribute weights using the information entropy of the decision matrix. In the second part, we implement the equivalent transformation of distributed assessments from basic attributes into general attributes by allowing the evaluation grades assessing basic attributes and general attributes to differ. We first fuzzify the distributed assessments of basic attributes according to the different data types of the basic attribute values, and then implement, based on fuzzy transformation theory, the unified form of general distributed
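
    The entropy-weighting step mentioned above has a standard closed form; a minimal sketch with a hypothetical decision matrix (alternatives in rows, attributes in columns):

        # Entropy-based attribute weights from a decision matrix.
        import numpy as np

        X = np.array([[0.7, 0.5, 0.9],
                      [0.6, 0.8, 0.4],
                      [0.9, 0.6, 0.7]])

        P = X / X.sum(axis=0)                           # column-wise proportions
        n = X.shape[0]
        entropy = -(P * np.log(P)).sum(axis=0) / np.log(n)
        weights = (1.0 - entropy) / (1.0 - entropy).sum()
        print(weights)   # attributes whose values differ more get larger weights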

  19. On the Uncertainties of Stellar Mass Estimates via Colour Measurements

    CERN Document Server

    Roediger, Joel C

    2015-01-01

    Mass-to-light versus colour relations (MLCRs), derived from stellar population synthesis models, are widely used to estimate galaxy stellar masses (M*), yet a detailed investigation of their inherent biases and limitations is still lacking. We quantify several potential sources of uncertainty, using optical and near-infrared (NIR) photometry for a representative sample of nearby galaxies from the Virgo cluster. Our method for combining multi-band photometry with MLCRs yields robust stellar masses, while errors in M* decrease as more bands are simultaneously considered. The prior assumptions in one's stellar population modelling dominate the error budget, creating a colour-dependent bias of up to 0.6 dex if NIR fluxes are used (0.3 dex otherwise). This matches the systematic errors associated with the method of spectral energy distribution (SED) fitting, indicating that MLCRs do not suffer from much additional bias. Moreover, MLCRs and SED fitting yield similar degrees of random error (~0.1-0.14 dex)...

  20. Validity of Willingness to Pay Measures under Preference Uncertainty

    Science.gov (United States)

    Braun, Carola; Rehdanz, Katrin; Schmidt, Ulrich

    2016-01-01

    Recent studies in the marketing literature developed a new method for eliciting willingness to pay (WTP) with an open-ended elicitation format: the Range-WTP method. In contrast to the traditional approach of eliciting WTP as a single value (Point-WTP), Range-WTP explicitly allows for preference uncertainty in responses. The aim of this paper is to apply Range-WTP to the domain of contingent valuation and to test for its theoretical validity and robustness in comparison to the Point-WTP. Using data from two novel large-scale surveys on the perception of solar radiation management (SRM), a little-known technique for counteracting climate change, we compare the performance of both methods in the field. In addition to the theoretical validity (i.e. the degree to which WTP values are consistent with theoretical expectations), we analyse the test-retest reliability and stability of our results over time. Our evidence suggests that the Range-WTP method clearly outperforms the Point-WTP method. PMID:27096163

  1. Benchmarking laboratory observation uncertainty for in-pipe storm sewer discharge measurements

    Science.gov (United States)

    Aguilar, Marcus F.; McDonald, Walter M.; Dymond, Randel L.

    2016-03-01

    The uncertainty associated with discharge measurement in storm sewer systems is of fundamental importance for hydrologic/hydraulic model calibration and pollutant load estimation, although it is difficult to determine as field benchmarks are generally impractical. This study benchmarks discharge uncertainty in several commonly used sensors by laboratory flume testing with and without a woody debris model. The sensors are then installed in a field location where laboratory benchmarked uncertainty is applied to field measurements. Combined depth and velocity uncertainty from the laboratory ranged from ±0.207-0.710 in., and ±0.176-0.631 fps respectively, and when propagated and applied to discharge estimation in the field, resulted in field discharge uncertainties of between 13% and 256% of the observation. Average daily volume calculation based on these observations had uncertainties of between 58% and 99% of the estimated value, and the uncertainty bounds of storm flow volume and peak flow for nine storm events constituted between 31-84%, and 13-48% of the estimated value respectively. Subsequently, the implications of these observational uncertainties for stormwater best-management practice evaluation, hydrologic modeling, and Total Maximum Daily Load development are considered.

  2. Using Fuzzy Modifier in Similarity Measure of Fuzzy Attribute Graph and Its Automatic Selection in Structural Pattern Recognition

    Directory of Open Access Journals (Sweden)

    Payman Moallem

    2007-09-01

    Fuzzy Attribute Graph (FAG) is a powerful tool for the representation and recognition of structural patterns. The conventional framework for similarity measurement of FAGs is based on equivalent fuzzy attributes, but in the fuzzy world some attributes are more important than others. In this paper, a modified recognition framework using a linguistic modifier for matching fuzzy attribute graphs is introduced. An algorithm for automatic selection of the fuzzy modifier based on the learning patterns is then proposed. Examples of the conventional and modified frameworks for FAG similarity measurement are studied, and the potential of the proposed framework for FAG matching is shown.

  3. Uncertainty issues on S-CO2 compressor performance measurement

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jekyoung; Cho, Seongkuk; Lee, Jeong Ik [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of)

    2014-10-15

    Measurement uncertainty here is related to property variation, the pressure ratio and the measurement method. Since the SCO2PE facility operates near the critical point with a low-pressure-ratio compressor, one solution to improve measurement uncertainty is utilizing a density meter. However, two additional density meters at the compressor inlet and outlet did not provide notable improvement in the overall uncertainty. The authors therefore conclude that a different approach to performance measurement is required to secure measurement confidence. As further work, identifying an appropriate approximation for the efficiency equation and applying direct measurement of compressor shaft power to the efficiency calculation will be considered.

  4. Uncertainty of nitrate and sulphate measured by ion chromatography in wastewater samples

    OpenAIRE

    Tepuš, Brigita; Simonič, Marjana

    2012-01-01

    This paper presents an evaluation of measurement uncertainty regarding the results of anion (nitrate and sulphate) concentrations in wastewater. Anions were determined by ion chromatography (EN ISO 10304-2, 1996). The major sources of uncertainty regarding the measurement results were identified as contributions to linear least-square or weighted regression lines, precision, trueness, storage conditions, and sampling. Determination of anions in wastewater is very important for the purificatio...

  5. UNCERTAINTY AND ITS IMPACT ON THE QUALITY OF MEASUREMENT

    OpenAIRE

    Adel Elahdi M. Yahya; Martin Halaj

    2012-01-01

    In current practice, laboratory measurement and calibration should be accredited by national or international bodies and should comply with the requirements of ISO 17025 for the qualification of efficient laboratories. Those requirements cover the testing process and the uncertainty limits stated in the measurement certificate, which allow the customer to judge the quality and efficiency of the measurement process. In th...

  6. Improvement of process monitoring uncertainty by the use of Diverse Measurement Methods

    International Nuclear Information System (INIS)

    Primary coolant flow monitoring margin for one of the plants Westinghouse services was approaching 0% due to steam generator tube plugging and design changes. The approved methodology requires calibrating the primary coolant flow to the calorimetrically measured flow. The calorimetric flow measurement has a high uncertainty and suffers errors due to process variations. The path identified to regain margin was to develop a new method that reduces the uncertainty of the reference flow measurement. The new method identified for the uncertainty reduction was to determine the reference flow from diverse, independent indications of flow. Utilizing the variance-weighted averaging technique (sketched below), the method produces a more accurate best-estimate reference flow. Our team used three alternate indications available in the plant, plus a simulation of the flow loop, as the diverse indications of flow. The benefit to the plant was a 60% reduction of the uncertainty. Introduction:
    - Standard monitoring method: based on reactor coolant pump pressure differentials, periodically calibrated to the calorimetrically measured flow.
    - RCP involvement: difficulty with RCP DP-based flow measurement: it needs to be calibrated to a reference flow, since the conditions of performance testing, if it is done at all, differ from the operating conditions.
    - Calorimetric involvement: difficulties with calorimetric flow measurement: high uncertainty due to process noise and process-variation-based biasing.
    - Plant problem: cycle-independent calibration, applied at various plants, would not be adequate at the plant without RSG due to high uncertainty and low monitored flow. Increased resistance could put plant operability at risk.
    - Identification of the problem: identified reference flow uncertainty as the dominant margin reduction.
    - Proposed solution: use diverse methods to reduce uncertainty and obtain a more accurate best-estimate flow; calibrate pump DP data to the best-estimate flow to allow plant to not have to
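
    A minimal sketch of the variance-weighted (inverse-variance) averaging named above, with hypothetical flow indications and 1-sigma uncertainties:

      # Variance-weighted average of diverse, independent flow indications
      # (a minimal sketch of the technique described; all numbers are hypothetical).
      import math

      def variance_weighted_mean(values, uncertainties):
          """Best estimate and uncertainty from independent measurements."""
          weights = [1.0 / u**2 for u in uncertainties]
          best = sum(w * v for w, v in zip(weights, values)) / sum(weights)
          u_best = math.sqrt(1.0 / sum(weights))
          return best, u_best

      # Three plant indications plus a loop simulation, each with its 1-sigma
      # uncertainty (in % of nominal flow):
      flows = [100.2, 99.5, 100.9, 100.4]
      sigmas = [2.0, 1.5, 2.5, 1.8]
      best, u = variance_weighted_mean(flows, sigmas)
      print(f"best estimate = {best:.2f}% flow, u = {u:.2f}%")

    With these assumed inputs the combined uncertainty falls well below that of the best single indication, which is the mechanism behind the margin recovery described above.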

  7. Feasibility study on using fast calorimetry technique to measure a mass attribute as part of a treaty verification regime

    International Nuclear Information System (INIS)

    The attribute measurement technique provides a method for determining whether or not an item containing special nuclear material (SNM) possesses attributes that fall within an agreed upon range of values. One potential attribute is whether the mass of an SNM item is larger than some threshold value that has been negotiated as part of a nonproliferation treaty. While the historical focus on measuring mass attributes has been on using neutron measurements, calorimetry measurements may be a viable alternative for measuring mass attributes for plutonium-bearing items. Traditionally, calorimetry measurements have provided a highly precise and accurate determination of the thermal power that is being generated by an item. In order to achieve this high level of precision and accuracy, the item must reach thermal equilibrium inside the calorimeter prior to determining the thermal power of the item. Because the approach to thermal equilibrium is exponential in nature, a large portion of the time spent approaching equilibrium is spent with the measurement being within ∼10% of its final equilibrium value inside the calorimeter. Since a mass attribute measurement only needs to positively determine if the mass of a given SNM item is greater than a threshold value, performing a short calorimetry measurement to determine how the system is approaching thermal equilibrium may provide sufficient information to determine if an item has a larger mass than the agreed upon threshold. In previous research into a fast calorimetry attribute technique, a two-dimensional heat flow model of a calorimeter was used to investigate the possibility of determining a mass attribute for plutonium-bearing items using this technique. While the results of this study looked favorable for developing a fast calorimetry attribute technique, additional work was needed to determine the accuracy of the model used to make the calculations. In this paper, the results from the current work investigating the
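
    The following sketch illustrates the short-measurement idea under the simple assumption of a single-exponential approach to equilibrium, P(t) = P_eq(1 - exp(-t/tau)); the one-exponential model, the data, and the threshold are hypothetical stand-ins, not the paper's two-dimensional heat-flow model.

      # Sketch: infer the equilibrium calorimeter power from a short transient,
      # assuming a single-exponential approach P(t) = P_eq * (1 - exp(-t/tau)).
      # The model, values, and threshold are illustrative assumptions.
      import numpy as np
      from scipy.optimize import curve_fit

      def approach(t, p_eq, tau):
          return p_eq * (1.0 - np.exp(-t / tau))

      rng = np.random.default_rng(0)
      t = np.linspace(0.0, 2.0, 40)                  # hours of early data
      p_true, tau_true = 5.0, 3.0                    # W, h (hypothetical item)
      p_meas = approach(t, p_true, tau_true) + rng.normal(0.0, 0.01, t.size)

      (p_eq, tau), cov = curve_fit(approach, t, p_meas, p0=(1.0, 1.0))
      u_p_eq = np.sqrt(cov[0, 0])
      threshold_w = 2.0          # thermal power implied by the mass threshold
      print(f"P_eq = {p_eq:.2f} +/- {u_p_eq:.2f} W -> "
            f"above threshold: {p_eq - 3 * u_p_eq > threshold_w}")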

  8. Feasibility study on using fast calorimetry technique to measure a mass attribute as part of a treaty verification regime

    Energy Technology Data Exchange (ETDEWEB)

    Hauck, Danielle K [Los Alamos National Laboratory; Bracken, David S [Los Alamos National Laboratory; Mac Arthur, Duncan W [Los Alamos National Laboratory; Santi, Peter A [Los Alamos National Laboratory; Thron, Jonathan [Los Alamos National Laboratory

    2010-01-01

    The attribute measurement technique provides a method for determining whether or not an item containing special nuclear material (SNM) possesses attributes that fall within an agreed upon range of values. One potential attribute is whether the mass of an SNM item is larger than some threshold value that has been negotiated as part of a nonproliferation treaty. While the historical focus on measuring mass attributes has been on using neutron measurements, calorimetry measurements may be a viable alternative for measuring mass attributes for plutonium-bearing items. Traditionally, calorimetry measurements have provided a highly precise and accurate determination of the thermal power that is being generated by an item. In order to achieve this high level of precision and accuracy, the item must reach thermal equilibrium inside the calorimeter prior to determining the thermal power of the item. Because the approach to thermal equilibrium is exponential in nature, a large portion of the time spent approaching equilibrium is spent with the measurement being within ~10% of its final equilibrium value inside the calorimeter. Since a mass attribute measurement only needs to positively determine if the mass of a given SNM item is greater than a threshold value, performing a short calorimetry measurement to determine how the system is approaching thermal equilibrium may provide sufficient information to determine if an item has a larger mass than the agreed upon threshold. In previous research into a fast calorimetry attribute technique, a two-dimensional heat flow model of a calorimeter was used to investigate the possibility of determining a mass attribute for plutonium-bearing items using this technique. While the results of this study looked favorable for developing a fast calorimetry attribute technique, additional work was needed to determine the accuracy of the model used to make the calculations. In this paper, the results from the current work investigating

  9. Improved sample size determination for attributes and variables sampling

    International Nuclear Information System (INIS)

    Earlier INMM papers have addressed the attributes/variables problem and, under conservative/limiting approximations, have reported analytical solutions for the attributes and variables sample sizes. Through computer simulation of this problem, the authors have calculated attributes and variables sample sizes as a function of falsification, measurement uncertainties, and required detection probability without using approximations. Using realistic assumptions for the uncertainty parameters of the measurements, the simulation results support two conclusions: (1) the previously used conservative approximations can be expensive because they lead to larger sample sizes than needed, and (2) the optimal verification strategy, as well as the falsification strategy, is highly dependent on the underlying uncertainty parameters of the measurement instruments
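
    For intuition, the sketch below computes an exact attributes sample size from the hypergeometric distribution: the smallest n that detects at least one of r falsified items among N with probability at least P_D. It ignores measurement uncertainty, which the simulations described above include, and the population values are hypothetical.

      # Sketch: smallest attributes sample size n that detects at least one of
      # r falsified items among N with probability >= P_D (exact hypergeometric,
      # no measurement error; population values are hypothetical).
      from scipy.stats import hypergeom

      def attributes_sample_size(N, r, P_D):
          for n in range(1, N + 1):
              # P(zero falsified items in a sample of n drawn without replacement)
              p_miss = hypergeom.pmf(0, N, r, n)
              if 1.0 - p_miss >= P_D:
                  return n
          return N

      print(attributes_sample_size(N=200, r=10, P_D=0.95))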

  10. Particle image velocimetry correlation signal-to-noise ratio metrics and measurement uncertainty quantification

    International Nuclear Information System (INIS)

    In particle image velocimetry (PIV) the measurement signal is contained in the recorded intensity of the particle image pattern superimposed on a variety of noise sources. The signal-to-noise ratio (SNR) strength governs the resulting PIV cross correlation and ultimately the accuracy and uncertainty of the resulting PIV measurement. Hence we posit that correlation SNR metrics calculated from the correlation plane can be used to quantify the quality of the correlation and the resulting uncertainty of an individual measurement. In this paper we extend the original work by Charonko and Vlachos and present a framework for evaluating the correlation SNR using a set of different metrics, which in turn are used to develop models for uncertainty estimation. Several corrections have been applied in this work. The SNR metrics and corresponding models presented herein are expanded to be applicable to both standard and filtered correlations by applying a subtraction of the minimum correlation value to remove the effect of the background image noise. In addition, the notion of a ‘valid’ measurement is redefined with respect to the correlation peak width in order to be consistent with uncertainty quantification principles and distinct from an ‘outlier’ measurement. Finally, the type and significance of the error distribution function are investigated. These advancements lead to more robust and reliable uncertainty estimation models compared with the original work by Charonko and Vlachos. The models are tested against both synthetic benchmark data and experimental measurements. In this work, U68.5 uncertainties are estimated at the 68.5% confidence level, while U95 uncertainties are estimated at the 95% confidence level. For all cases the resulting calculated coverage factors approximate the expected theoretical confidence intervals, thus demonstrating the applicability of these new models for estimation of uncertainty for individual PIV measurements. (paper)
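
    As an illustration of one commonly used correlation-plane SNR metric, the sketch below computes a primary peak ratio (PPR) from an FFT-based cross-correlation of two synthetic interrogation windows, including the minimum-subtraction step mentioned above; the window contents and the peak-masking size are illustrative assumptions.

      # Sketch: primary peak ratio (PPR) from an FFT-based cross-correlation of
      # two synthetic 32x32 interrogation windows. Data and mask size are
      # illustrative; a real PIV code also handles sub-pixel peak fitting.
      import numpy as np

      rng = np.random.default_rng(2)
      a = rng.random((32, 32))
      b = np.roll(a, (3, 5), axis=(0, 1)) + 0.2 * rng.random((32, 32))

      corr = np.fft.ifft2(np.fft.fft2(a) * np.conj(np.fft.fft2(b))).real
      corr -= corr.min()              # minimum subtraction, as described above

      i, j = np.unravel_index(corr.argmax(), corr.shape)
      p1 = corr[i, j]

      # Zero a 5x5 neighborhood around the primary peak, then take the next peak.
      masked = corr.copy()
      ii, jj = np.meshgrid(np.arange(32), np.arange(32), indexing="ij")
      masked[(np.abs(ii - i) <= 2) & (np.abs(jj - j) <= 2)] = 0.0
      p2 = masked.max()

      print(f"PPR = {p1 / p2:.2f}")   # higher PPR -> lower measurement uncertainty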

  11. A generalized measurement model to quantify health: the multi-attribute preference response model.

    Directory of Open Access Journals (Sweden)

    Paul F M Krabbe

    After 40 years of deriving metric values for health status or health-related quality of life, the effective quantification of subjective health outcomes is still a challenge. Here, two of the best measurement tools, the discrete choice model and the Rasch model, are combined to create a new model for deriving health values. First, existing techniques to value health states are briefly discussed, followed by a reflection on the recent revival of interest in patients' experiences with regard to their possible role in health measurement. Subsequently, three basic principles for valid health measurement are reviewed, namely unidimensionality, interval level, and invariance. In the main section, the basic operation of measurement is then discussed in the framework of probabilistic discrete choice analysis (random utility model) and the psychometric Rasch model. It is then shown how combining the main features of these two models yields an integrated measurement model, called the multi-attribute preference response (MAPR) model, which is introduced here. This new model transforms subjective individual rank data into a metric scale using responses from patients who have experienced certain health states. Its measurement mechanism largely prevents biases such as adaptation and coping. Several extensions of the MAPR model are presented. The MAPR model can be applied to a wide range of research problems. If extended with the self-selection of relevant health domains for the individual patient, this model will be more valid than existing valuation techniques.

  12. Multi-attribute integrated measurement of node importance in complex networks

    Science.gov (United States)

    Wang, Shibo; Zhao, Jinlou

    2015-11-01

    The measurement of node importance in complex networks is very important to research on network stability and robustness; it can also help ensure the security of the whole network. Most researchers have used a single indicator to measure node importance, so the obtained results reflect only certain aspects of the network, with a loss of information. Meanwhile, because network topologies differ, node importance should be described in a way that incorporates the character of the network topology. Most existing evaluation algorithms cannot completely reflect the circumstances of complex networks, so this paper takes into account degree centrality, relative closeness centrality, the clustering coefficient, and topology potential, and proposes an integrated method for measuring node importance. This method reflects nodes' internal and external attributes and eliminates the influence of network structure on node importance. Experiments on the karate network and the dolphin network show that the integrated topology measure has a smaller range of measured results than a single indicator and is more universal. Experiments also show that attacking the North American power grid and the Internet network using this method achieves faster convergence than other methods.
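
    A minimal sketch of an integrated importance score in the spirit of this record, combining several topological indicators with equal weights; the choice of indicators (betweenness standing in for topology potential), the sign convention for clustering, and the weights are illustrative assumptions, not the paper's exact formulation.

      # Sketch of an integrated node-importance score combining several
      # topological indicators (equal weights; betweenness stands in for the
      # paper's topology-potential term, and all choices are illustrative).
      import networkx as nx

      def integrated_importance(G, weights=(0.25, 0.25, 0.25, 0.25)):
          degree = nx.degree_centrality(G)
          closeness = nx.closeness_centrality(G)
          clustering = nx.clustering(G)
          betweenness = nx.betweenness_centrality(G)
          w1, w2, w3, w4 = weights
          # Low clustering is scored as bridging importance; sign is a choice.
          return {v: w1 * degree[v] + w2 * closeness[v]
                     + w3 * (1.0 - clustering[v]) + w4 * betweenness[v] for v in G}

      G = nx.karate_club_graph()       # one of the record's test networks
      scores = integrated_importance(G)
      print(sorted(scores, key=scores.get, reverse=True)[:5])  # top five nodes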

  13. Assessing the empirical validity of alternative multi-attribute utility measures in the maternity context

    Directory of Open Access Journals (Sweden)

    Morrell Jane

    2009-05-01

    Background: Multi-attribute utility measures are preference-based health-related quality of life measures that have been developed to inform economic evaluations of health care interventions. The objective of this study was to compare the empirical validity of two multi-attribute utility measures (EQ-5D and SF-6D) against hypothetical preferences in a large maternity population in England. Methods: Women who participated in a randomised controlled trial of additional postnatal support provided by trained community support workers represented the study population for this investigation. The women were asked to complete the EQ-5D descriptive system (which defines health-related quality of life in terms of five dimensions: mobility, self-care, usual activities, pain/discomfort and anxiety/depression) and the SF-36 (which defines health-related quality of life, using 36 items, across eight dimensions: physical functioning, role limitations (physical), social functioning, bodily pain, general health, mental health, vitality and role limitations (emotional)) at six months postpartum. Their responses were converted into utility scores using the York A1 tariff set and the SF-6D utility algorithm, respectively. One-way analysis of variance was used to test the hypothetically constructed preference rule that each set of utility scores differs significantly by self-reported health status (categorised as excellent, very good, good, fair or poor). The degree to which EQ-5D and SF-6D utility scores reflected alternative dichotomous configurations of self-reported health status and the Edinburgh Postnatal Depression Scale score was tested using the relative efficiency statistic and receiver operating characteristic (ROC) curves. Results: The mean utility score for the EQ-5D was 0.861 (95% CI: 0.844, 0.877), whilst the mean utility score for the SF-6D was 0.809 (95% CI: 0.796, 0.822), representing a mean difference in utility score of 0.052 (95% CI: 0.040, 0

  14. Uncertainties of DS86 and prospects for residual radioactivity measurement.

    Science.gov (United States)

    Shizuma, K; Hoshi, M; Hasai, H

    1999-12-01

    Residual radioactivity data for 152Eu, 60Co and 36Cl have been accumulated, and it has been revealed in the thermal neutron region that a systematic discrepancy exists between the measured data and activation calculations based on the DS86 neutrons in Hiroshima. Recently, 63Ni produced in copper samples by the fast neutron reaction 63Cu(n,p)63Ni has been of interest for the evaluation of fast neutrons. A re-evaluation of atomic-bomb neutrons and the prospects for residual-activity measurements are discussed. PMID:10805002

  15. Uncertainty in SMAP Soil Moisture Measurements Caused by Dew

    Science.gov (United States)

    Soil moisture is an important reservoir of the hydrologic cycle that regulates the exchange of moisture and energy between the land surface and the atmosphere. Two satellite missions will soon make the first global measurements of soil moisture at the optimal microwave wavelength within L-band: ESA's So...

  16. Toward a Characterization of Uncertainty Measure for the Dempster-Shafer Theory

    OpenAIRE

    Harmanec, David

    2013-01-01

    This is a working paper summarizing the results of an ongoing research project whose aim is to uniquely characterize the uncertainty measure for the Dempster-Shafer theory. A set of intuitive axiomatic requirements is presented, some of their implications are shown, and a proof is given of the minimality of the recently proposed measure AU among all measures satisfying the proposed requirements.

  17. Role of uncertainty in the measurement of crack length by compliance techniques

    International Nuclear Information System (INIS)

    An experimental program is underway to investigate the effect of thermal treatment and electrochemical potential on the cyclic crack growth behaviour of Inconel-600 and Inconel X-750 in deoxygenated high-purity water at 290 °C. As part of the program, an investigation has been conducted to determine an approximation for the degree of uncertainty in the elastic compliance technique used for determining crack length. Preliminary results indicate that for room-temperature air crack growth measurements an uncertainty of approximately 1.5% can be expected in the value of measured compliance. For the specimen geometry used, this translates to an uncertainty in the effective crack length of 0.25 mm. In an aqueous environment at 290 °C and 10.34 MPa, the estimated uncertainty in the compliance measurement can be as much as 6.5%, which translates to a crack length uncertainty of 1.83 mm. These uncertainty values have a significant impact on the measurement intervals required for the generation of statistically meaningful crack growth rate data

  18. Measuring Young’s modulus the easy way, and tracing the effects of measurement uncertainties

    Science.gov (United States)

    Nunn, John

    2015-09-01

    The speed of sound in a solid is determined by the density and elasticity of the material. Young’s modulus can therefore be calculated once the density and the speed of sound in the solid are measured. The density can be measured relatively easily, and the speed of sound through a rod can be measured very inexpensively by setting up a longitudinal standing wave and using a microphone to record its frequency. This is a simplified version of a technique called ‘impulse excitation’. It is a good educational technique for school pupils. This paper includes the description and free provision of custom software to calculate the frequency spectrum of a recorded sound so that the resonant peaks can be readily identified. A discussion of the effect of measurement uncertainties is included to help the more thorough experimental student improve the accuracy of their method. The technique is sensitive enough to detect changes in the elasticity modulus with a temperature change of just a few degrees.
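
    A worked sketch of the calculation described: for the fundamental longitudinal resonance of a rod, E = ρ(2Lf)², with relative uncertainties combined in quadrature; the numerical values are hypothetical.

      # Young's modulus from the fundamental longitudinal resonance of a rod,
      # E = rho * (2 * L * f)^2, with relative uncertainties in quadrature.
      # Example values are hypothetical (roughly an aluminium rod).
      import math

      rho, u_rho = 2700.0, 10.0        # density, kg/m^3
      L, u_L = 0.500, 0.001            # rod length, m
      f, u_f = 5060.0, 10.0            # fundamental resonance, Hz

      E = rho * (2 * L * f) ** 2
      # L and f enter squared, so their relative uncertainties are doubled.
      rel_u = math.sqrt((u_rho / rho) ** 2 + (2 * u_L / L) ** 2 + (2 * u_f / f) ** 2)
      print(f"E = {E / 1e9:.1f} GPa +/- {100 * rel_u:.2f}%")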

  19. Using Pre-Identified Attributes as the Critical Success Factors of College Leadership to Measure Candidates

    OpenAIRE

    Michael F. Frimpon

    2012-01-01

    The selection of a school leader is a multi-attribute problem that needs to be addressed taking into consideration the peculiar needs of an institution. This paper is intended to specify the critical success factors (CSFs) of college leaders as perceived by students. A survey comprising the 37 attributes of the Leader Attributes Inventory (LAI) of Moss was given to the students in a local university to determine their best 10. The 10 selected attributes were mapped onto the Leadership Effectiven...

  20. Measurement Uncertainty Evaluation in Dimensional X-ray Computed Tomography Using the Bootstrap Method

    DEFF Research Database (Denmark)

    Hiller, Jochen; Genta, Gianfranco; Barbato, Giulio;

    2014-01-01

    Industrial applications of computed tomography (CT) for dimensional metrology on various components are fast increasing, owing to a number of favorable properties such as the capability of non-destructive internal measurements. Uncertainty evaluation is however more complex than in conventional measurement processes, e.g., with tactile systems, also due to factors related to systematic errors, mainly caused by specific CT image characteristics. In this paper we propose a simulation-based framework for measurement uncertainty evaluation in dimensional CT using the bootstrap method. In a case study the problem concerning measurement uncertainties was addressed with bootstrap and successfully applied to ball-bar CT measurements. Results obtained enabled extension to more complex shapes such as actual industrial components, as we show by tests on a hollow cylinder workpiece.
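
    A minimal bootstrap sketch in the spirit of this record, resampling repeated length evaluations to obtain a standard uncertainty and a 95% interval; the data values are hypothetical, and a real dimensional-CT budget would also cover systematic effects.

      # Minimal bootstrap sketch for a dimensional measurand (the mean of
      # repeated ball-bar length evaluations; data values are hypothetical).
      import numpy as np

      rng = np.random.default_rng(1)
      lengths_mm = np.array([49.998, 50.003, 50.001, 49.997, 50.004,
                             50.000, 49.999, 50.002, 50.001, 49.996])

      B = 10_000
      boot_means = np.array([rng.choice(lengths_mm, size=lengths_mm.size,
                                        replace=True).mean() for _ in range(B)])
      u = boot_means.std(ddof=1)            # bootstrap standard uncertainty
      lo, hi = np.percentile(boot_means, [2.5, 97.5])
      print(f"u = {u * 1000:.2f} um, 95% interval = [{lo:.4f}, {hi:.4f}] mm")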

  1. Evaluation of uncertainty in grating pitch measurement by optical diffraction using Monte Carlo methods

    International Nuclear Information System (INIS)

    Measurement of grating pitch by optical diffraction is one of the few methods currently available for establishing traceability to the definition of the meter on the nanoscale; therefore, understanding all aspects of the measurement is imperative for accurate dissemination of the SI meter. A method for evaluating the component of measurement uncertainty associated with coherent scattering in the diffractometer instrument is presented. The model equation for grating pitch calibration by optical diffraction is an example where Monte Carlo (MC) methods can vastly simplify evaluation of measurement uncertainty. This paper includes discussion of the practical aspects of implementing MC methods for evaluation of measurement uncertainty in grating pitch calibration by diffraction. Downloadable open-source software is demonstrated. (technical design note)

  2. Evaluation of the uncertainty of electrical impedance measurements: the GUM and its Supplement 2

    International Nuclear Information System (INIS)

    Electrical impedance is not a scalar but a complex quantity. Thus, evaluating the uncertainty of its value involves a model whose output quantity is complex. This paper presents a comparison of the evaluation of the uncertainty of the measurement of the electrical impedance of a simple electric circuit using the GUM and using a Monte Carlo method according to Supplement 2 of the GUM
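
    A minimal sketch of the Supplement 2 approach for a complex output quantity, propagating assumed voltage and current distributions through Z = V/I by Monte Carlo; all values and uncertainties are hypothetical.

      # Sketch of GUM-S2-style Monte Carlo propagation for a complex quantity:
      # Z = V / I with independent, normally distributed real/imaginary parts.
      # All input values and uncertainties are hypothetical.
      import numpy as np

      rng = np.random.default_rng(0)
      M = 200_000
      V = (rng.normal(4.999, 0.002, M)
           + 1j * rng.normal(0.050, 0.002, M))            # volts
      I = (rng.normal(0.019996, 0.000012, M)
           + 1j * rng.normal(-0.00005, 0.00001, M))       # amps

      Z = V / I
      for name, part in (("Re Z", Z.real), ("Im Z", Z.imag)):
          print(f"{name}: {part.mean():.3f} ohm, u = {part.std(ddof=1):.3f} ohm")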

  3. Terminological aspects of the Guide to the Expression of Uncertainty in Measurement (GUM)

    Science.gov (United States)

    Ehrlich, Charles

    2014-08-01

    The Guide to the Expression of Uncertainty in Measurement (GUM) provided for the first time an international consensus on how to approach the widespread difficulties associated with conveying information about how reliable the value resulting from a measurement is thought to be. This paper examines the evolution in thinking and its impact on the terminology that accompanied the development of the GUM. Particular emphasis is put on the very clear distinction in the GUM between measurement uncertainty and measurement error, and on the reasons that even though ‘true value’ and ‘error’ are considered in the GUM to be ‘unknowable’ and, sometimes by implication, of little (or even no) use in measurement analysis, they remain as key concepts, especially when considering the objective of measurement. While probability theory in measurement analysis from a frequentist perspective was in widespread use prior to the publication of the GUM, a key underpinning principle of the GUM was to instead consider probability as a ‘degree of belief.’ The terminological changes necessary to make this transition are also covered. Even twenty years after the publication of the GUM, the scientific and metrology literatures sometimes contain uncertainty analyses, or discussions of measurement uncertainty, that are not terminologically consistent with the GUM, leading to the inability of readers to fully understand what has been done and what is intended in the associated measurements. This paper concludes with a discussion of the importance of using proper methodology and terminology for reporting measurement results.

  4. Coherent Uncertainty Analysis of Aerosol Measurements from Multiple Satellite Sensors

    Science.gov (United States)

    Petrenko, M.; Ichoku, C.

    2013-01-01

    Aerosol retrievals from multiple spaceborne sensors, including MODIS (on Terra and Aqua), MISR, OMI, POLDER, CALIOP, and SeaWiFS (altogether, a total of 11 different aerosol products), were comparatively analyzed using data collocated with ground-based aerosol observations from Aerosol Robotic Network (AERONET) stations within the Multi-sensor Aerosol Products Sampling System (MAPSS, http://giovanni.gsfc.nasa.gov/mapss/ and http://giovanni.gsfc.nasa.gov/aerostat/). The analysis was performed by comparing quality-screened satellite aerosol optical depth or thickness (AOD or AOT) retrievals during 2006-2010 to available collocated AERONET measurements globally, regionally, and seasonally, and deriving a number of statistical measures of accuracy. We used a robust statistical approach to detect and remove possible outliers in the collocated data that can bias the results of the analysis. Overall, the proportion of outliers in each of the quality-screened AOD products was within 12%. Squared correlation coefficient (R2) values of the satellite AOD retrievals relative to AERONET exceeded 0.6, with R2 for most of the products exceeding 0.7 over land and 0.8 over ocean. Root mean square error (RMSE) values for most of the AOD products were within 0.15 over land and 0.09 over ocean. We have been able to generate global maps showing regions where the different products present advantages over the others, as well as the relative performance of each product over different landcover types. It was observed that while MODIS, MISR, and SeaWiFS provide accurate retrievals over most of the landcover types, multi-angle capabilities make MISR the only sensor able to retrieve reliable AOD over barren and snow/ice surfaces. Likewise, active sensing enables CALIOP to retrieve aerosol properties over bright-surface shrublands more accurately than the other sensors, while POLDER, which is the only one of the sensors capable of measuring polarized aerosols, outperforms other sensors in

  5. Disaggregating measurement uncertainty from population variability and Bayesian treatment of uncensored results

    International Nuclear Information System (INIS)

    In making low-level radioactivity measurements of populations, it is commonly observed that a substantial portion of net results are negative. Furthermore, the observed variance of the measurement results arises from a combination of measurement uncertainty and population variability. This paper presents a method for disaggregating measurement uncertainty from population variability to produce a probability density function (PDF) of possibly true results. To do this, simple, justifiable, and reasonable assumptions are made about the relationship of the measurements to the measurands (the 'true values'). The measurements are assumed to be unbiased, that is, that their average value is the average of the measurands. Using traditional estimates of each measurement's uncertainty to disaggregate population variability from measurement uncertainty, a PDF of measurands for the population is produced. Then, using Bayes's theorem, the same assumptions, and all the data from the population of individuals, a prior PDF is computed for each individual's measurand. These PDFs are non-negative, and their average is equal to the average of the measurement results for the population. The uncertainty in these Bayesian posterior PDFs is all Berkson with no remaining classical component. The methods are applied to baseline bioassay data from the Hanford site. The data include 90Sr urinalysis measurements on 128 people, 137Cs in vivo measurements on 5,337 people, and 239Pu urinalysis measurements on 3,270 people. The method produces excellent results for the 90Sr and 137Cs measurements, since there are nonzero concentrations of these global fallout radionuclides in people who have not been occupationally exposed. The method does not work for the 239Pu measurements in non-occupationally exposed people because the population average is essentially zero.
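
    The disaggregation step can be illustrated with a simple moment estimate: the observed variance is the sum of the population variance and the average measurement variance, so s_pop² ≈ s_obs² - mean(u_i²). The sketch below demonstrates this on simulated data with hypothetical values; the full Bayesian treatment described above goes further.

      # Sketch of the disaggregation step: subtract the mean measurement
      # variance from the observed variance to estimate the population spread.
      # All values are hypothetical.
      import numpy as np

      rng = np.random.default_rng(3)
      true_values = rng.gamma(shape=2.0, scale=0.5, size=1000)   # population (>= 0)
      u_meas = 1.0                                               # per-measurement sigma
      results = true_values + rng.normal(0.0, u_meas, true_values.size)  # many negatives

      s_obs2 = results.var(ddof=1)
      s_pop2 = max(0.0, s_obs2 - u_meas**2)
      print(f"observed sd = {np.sqrt(s_obs2):.2f}, "
            f"inferred population sd = {np.sqrt(s_pop2):.2f}")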

  6. Comparison of different approaches to estimate uncertainty budget in k0-INAA measurement

    International Nuclear Information System (INIS)

    Three CRMs of different matrix composition were analysed, representing an environmental matrix sample (BCR-320R Channel Sediment), a botanical matrix sample (SRM 1547 Peach Leaves) and a zoological matrix sample (SRM 1566b Oyster Tissue). The element mass fractions were obtained using the KayWin program. The analytical measurement uncertainty was determined by two approaches: (1) the routine procedure, applying a combination of the overall uncertainty u(m) = 3.5% and the statistical uncertainty of the peak area determination, and (2) a procedure applying the dedicated ERON program for calculating uncertainty. The performance of altogether 31 certified values was tested by calculating En numbers. For the remaining 52 non-certified values, a comparison between the uncertainties obtained by the two approaches was made. When using the first approach, the En number showed satisfactory performance in 28 cases; when using the second approach, the En number showed satisfactory performance in 27 cases. None of the unsatisfactory performances (En > 1) appeared to be of a systematic nature. The uncertainties obtained by the two approaches revealed a large degree of consistency. As the present nuclear database lacks many of the data that serve as input to the ERON program, in particular uncertainties of Q0 factors, estimates need to be introduced for the missing values, emphasising the urgent need to upgrade the database with the missing data. (author)
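
    A minimal implementation of the En criterion used above, where |En| <= 1 indicates satisfactory performance; the example values are hypothetical, and the convention assumes expanded (k = 2) uncertainties for both the laboratory and reference values.

      # En-number check as used for CRM performance testing: |x_lab - x_ref|
      # divided by the root-sum-square of the expanded (k = 2) uncertainties.
      # Example values are hypothetical.
      import math

      def en_number(x_lab, U_lab, x_ref, U_ref):
          return (x_lab - x_ref) / math.sqrt(U_lab**2 + U_ref**2)

      # e.g. a mass fraction in mg/kg with expanded uncertainties:
      en = en_number(x_lab=12.9, U_lab=1.0, x_ref=12.3, U_ref=0.6)
      print(f"En = {en:.2f}, satisfactory: {abs(en) <= 1.0}")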

  7. A novel method for importance measure analysis in the presence of epistemic and aleatory uncertainties

    Directory of Open Access Journals (Sweden)

    Ren Bo

    2014-06-01

    For structural systems with both epistemic and aleatory uncertainties, research on quantifying the contribution of the epistemic and aleatory uncertainties to the failure probability of the systems is conducted. Based on the method of separating epistemic and aleatory uncertainties in a variable, the core idea of the research is firstly to establish a novel deterministic transition model for auxiliary variables, distribution parameters, random variables, and failure probability, and then to propose the improved importance sampling (IS) to solve the transition model. Furthermore, the distribution parameters and auxiliary variables are sampled simultaneously and independently; therefore, the inefficient sampling procedure with an 'inner-loop' for epistemic uncertainty and an 'outer-loop' for aleatory uncertainty in traditional methods is avoided. Since the proposed method combines the fast convergence of the proper estimates and searches failure samples in the interesting regions with high efficiency, the proposed method is more efficient than traditional methods for the variance-based failure probability sensitivity measures in the presence of epistemic and aleatory uncertainties. Two numerical examples and one engineering example are introduced to demonstrate the efficiency and precision of the proposed method for structural systems with both epistemic and aleatory uncertainties.

  8. A novel method for importance measure analysis in the presence of epistemic and aleatory uncertainties

    Institute of Scientific and Technical Information of China (English)

    Ren Bo; Lu Zhenzhou; Zhou Changcong

    2014-01-01

    For structural systems with both epistemic and aleatory uncertainties, research on quantifying the contribution of the epistemic and aleatory uncertainties to the failure probability of the systems is conducted. Based on the method of separating epistemic and aleatory uncertainties in a variable, the core idea of the research is firstly to establish a novel deterministic transition model for auxiliary variables, distribution parameters, random variables, failure probability, then to propose the improved importance sampling (IS) to solve the transition model. Furthermore, the distribution parameters and auxiliary variables are sampled simultaneously and independently; therefore, the inefficient sampling procedure with an 'inner-loop' for epistemic uncertainty and an 'outer-loop' for aleatory uncertainty in traditional methods is avoided. Since the proposed method combines the fast convergence of the proper estimates and searches failure samples in the interesting regions with high efficiency, the proposed method is more efficient than traditional methods for the variance-based failure probability sensitivity measures in the presence of epistemic and aleatory uncertainties. Two numerical examples and one engineering example are introduced for demonstrating the efficiency and precision of the proposed method for structural systems with both epistemic and aleatory uncertainties.

  9. Determination of uncertainties associated to the in vivo measurement of iodine-131 in the thyroid.

    Science.gov (United States)

    Dantas, B M; Lima, F F; Dantas, A L; Lucena, E A; Gontijo, R M G; Carvalho, C B; Hazin, C

    2016-07-01

    Intakes of radionuclides can be estimated through in vivo measurements, and the uncertainties associated with the measured activities should be clearly stated in monitoring program reports. This study aims to evaluate the uncertainties of in vivo monitoring of iodine-131 in the thyroid. The reference values for high-energy photons are based on the IDEAS Guide. Measurements were performed at the In Vivo Monitoring Laboratory of the Institute of Radiation Protection and Dosimetry (IRD) and at the Internal Dosimetry Laboratory of the Regional Center of Nuclear Sciences (CRCN-NE). In both institutions, the experiment was performed using a 3″×3″ NaI(Tl) scintillation detector and a neck-thyroid phantom. Scattering factors were calculated and compared for different counting geometries. The results show that the technique achieves reproducibility equivalent to the values suggested in the IDEAS Guide, and the measurement uncertainties are comparable to international quality standards for this type of in vivo monitoring. PMID:27108067

  10. Introducing a Simple Guide for the evaluation and expression of the uncertainty of NIST measurement results

    Science.gov (United States)

    Possolo, Antonio

    2016-02-01

    The current guidelines for the evaluation and expression of the uncertainty of NIST measurement results were originally published in 1993 as NIST Technical Note 1297, which was last revised in 1994. NIST is now updating its principles and procedures for uncertainty evaluation to address current and emerging needs in measurement science that Technical Note 1297 could not have anticipated or contemplated when it was first conceived. Although progressive and forward-looking, this update is also conservative because it does not require that current practices for uncertainty evaluation be abandoned or modified where they are fit for purpose and when there is no compelling reason to do otherwise. The updated guidelines are offered as a Simple Guide intended to be deployed under the NIST policy on Measurement Quality, and are accompanied by a rich collection of examples of application drawn from many different fields of measurement science.

  11. Propagation of systematic uncertainty due to data reduction in transmission measurement of iron

    International Nuclear Information System (INIS)

    A technique of determinantal inequalities for estimating the bounds of statistical and systematic uncertainties in neutron cross-section measurements has been developed. In the measurement of a neutron cross section, correlation is manifested through the process of measurement and through many systematic components such as the geometrical factor, half-life, backscattering, etc. However, the propagation of experimental uncertainties through the reduction of cross-section data is itself a complicated procedure and has been attracting attention in recent times. In this paper, determinantal inequalities are applied to a transmission measurement of the iron cross section, demonstrating how the systematic uncertainty dominates the statistical one in such data-reduction procedures and estimating their individual bounds. (author). 2 refs., 1 tab

  12. UNCERTAINTY OF MEASUREMENT- AN IMPORTANT INSTRUMENT TO EVALUATE THE QUALITY OF RESULTS IN FORMALDEHYDE TESTS

    Directory of Open Access Journals (Sweden)

    Emanuela BELDEAN

    2013-09-01

    The measurement uncertainty is a quantitative indicator of the quality of results, meaning how well the result represents the value of the quantity being measured. It is a relatively new concept, and several guides and regulations have been elaborated to help laboratories evaluate it. The uncertainty components are quantified based on data from repeated measurements, previous measurements, knowledge of the equipment, and experience with the measurement. Uncertainty estimation involves a rigorous evaluation of possible sources of uncertainty and good knowledge of the measurement procedure. The case study presented in this paper reveals the basic steps of the uncertainty calculation for formaldehyde emission from wood-based panels determined by the 1 m3 chamber method. Based on a very well defined Ishikawa diagram, an expanded uncertainty of 0.044 mg/m3 for k = 2, at the 95% confidence level, was established.

  13. Measurement uncertainties when determining heat rate, isentropic efficiency and swallowing capacity

    Energy Technology Data Exchange (ETDEWEB)

    Snygg, U.

    1996-05-01

    The objective of the project was to determine the uncertainties when calculating the heat rate, isentropic efficiencies and swallowing capacities of power plants. Normally, when a power plant is constructed, the supplier also guarantees certain performance values, e.g. heat rate. When the plant is built and running under normal conditions, an evaluation is performed and the guaranteed values are checked. Different measured parameters influence the calculated value differently, and therefore a sensitivity factor can be defined as the sensitivity of a calculated value when the measured value changes. The product of this factor and the uncertainty of the measured parameter gives the contribution to the error of the calculated value. For every measured parameter this factor has to be determined, and the root sum of squares then gives the overall uncertainty of the calculated parameter. To obtain acceptable data during the evaluation of the plant, a test code is to be followed. The test code also gives guidelines on how large the errors of the measurements are. In this study, ASME PTC6 and DIN 1943 were used. The results show that not only the test code is of vital importance, but also the distribution of the power output between the HP-IP turbines and the LP turbines. A higher inlet pressure of the LP turbine gives a smaller uncertainty of the isentropic efficiency; an increase from 6 to 13 bar will lower the uncertainty by a factor of 1.5. 10 refs, 24 figs, 23 tabs, 5 appendixes

  14. Electromagnetic coil (EM coil) measurement technique to verify presence of metal/absence of oxide attribute

    International Nuclear Information System (INIS)

    This paper summarizes how an electromagnetic coil (EM coil) measurement technique can be used to discriminate between plutonium metal, plutonium oxide, and mixtures of these two materials inside sealed storage containers. Measurement results are from a variety of metals and aluminium oxide in two different container types, the carbon steel AL-R8 and the stainless steel AT-400R. Within these container types two scenarios have been explored: 1.) the same configuration made from different metals, demonstrating material property effects; 2.) the same metal configured differently, demonstrating how mass distribution affects the EM signature. This non-radiation measurement method offers verification of the 'presence of metal/absence of oxide' attribute in less than a minute. In January 2001, researchers at Pacific Northwest Laboratory showed that this method could discriminate between aluminium and aluminium oxide placed inside an AT-400R (a total wall thickness of over 2.5 cm) storage container. Subsequent experimental and theoretical investigations into adapting the EM coil technique for arms control applications suggest a similar response for plutonium and plutonium oxide. This conclusion is consistent with the fact that all metals are electrically conductive while most oxides are electrical insulators (non-conductors). (author)

  15. Total Measurement Uncertainty (TMU) for Nondestructive Assay of Transuranic (TRU) Waste at the WRAP Facility

    Energy Technology Data Exchange (ETDEWEB)

    WILLS, C.E.

    1999-12-06

    This report examines the contributing factors to NDA measurement uncertainty at WRAP. The significance of each factor on the TMU is analyzed, and a final method is given for determining the TMU for NDA measurements at WRAP. As more data becomes available, and WRAP gains in operational experience, this report will be reviewed semi-annually and updated as necessary.

  16. Total Measurement Uncertainty (TMU) for Nondestructive Assay of Transuranic (TRU) Waste at the WRAP Facility

    International Nuclear Information System (INIS)

    This report examines the contributing factors to NDA measurement uncertainty at WRAP. The significance of each factor on the TMU is analyzed, and a final method is given for determining the TMU for NDA measurements at WRAP. As more data becomes available, and WRAP gains in operational experience, this report will be reviewed semi-annually and updated as necessary

  17. Total Measurement Uncertainty for Nondestructive Assay of Transuranic Waste at the WRAP Facility

    International Nuclear Information System (INIS)

    This report examines the contributing factors to NDA measurement uncertainty at WRAP. The significance of each factor on the TMU is analyzed, and a final method is given for determining the TMU for NDA measurements at WRAP. As more data becomes available, and WRAP gains in operational experience, this report will be reviewed semi-annually and updated as necessary

  18. Total Measurement Uncertainty (TMU) for Nondestructive Assay of Transuranic (TRU) Waste at the WRAP Facility

    International Nuclear Information System (INIS)

    This report examines the contributing factors to NDA measurement uncertainty at WRAP. The significance of each factor on the TMU is analyzed, and a final method is given for determining the TMU for NDA measurements at WRAP. As more data becomes available and WRAP gains in operational experience, this report will be reviewed semi-annually and updated as necessary

  19. Screening-level estimates of mass discharge uncertainty from point measurement methods

    Science.gov (United States)

    The uncertainty of mass discharge measurements associated with point-scale measurement techniques was investigated by deriving analytical solutions for the mass discharge coefficient of variation for two simplified, conceptual models. In the first case, a depth-averaged domain w...

  20. Technical notes: A detailed study for the provision of measurement uncertainty and traceability for goniospectrometers

    NARCIS (Netherlands)

    Peltoniemi, J.I.; Hakala, T.; Suomalainen, J.M.; Honkavaara, E.; Markelin, L.; Gritsevich, M.; Eskelinen, J.; Jaanson, P.; Ikonen, E.

    2014-01-01

    The measurement uncertainty and traceability of the Finnish Geodetic Institute's field gonio-spectro-polarimeter FIGIFIGO have been assessed. First, the reference standard (a Spectralon sample) was measured at the National Standard Laboratory of MIKES-Aalto. This standard was transferred to FGI's fie

  1. A super-resolution approach for uncertainty estimation of PIV measurements

    NARCIS (Netherlands)

    Sciacchitano, A.; Wieneke , B.; Scarano, F.

    2012-01-01

    A super-resolution approach is proposed for the a posteriori uncertainty estimation of PIV measurements. The measured velocity field is employed to determine the displacement of individual particle images. A disparity set is built from the residual distance between paired particle images of successi

  2. Low uncertainty measurements of bidirectional reflectance factor on the NPOESS/VIIRS solar diffuser

    Science.gov (United States)

    Lessel, Kristen; McClain, Stephen

    2007-09-01

    An illuminated solar diffuser is the calibration source for the VIS/NIR bands on the NPOESS/VIIRS sensor. We completed a set of BRF measurements to fully characterize the distribution of scattered light from the solar diffuser. NPOESS/VIIRS has an overall VIS/NIR radiometric calibration uncertainty requirement of 2% (1 sigma), of which 1.32% was allocated to the characterization of the BRF. In order to meet this requirement, we modified the existing goniometer and measurement procedure used on MODIS. Modifications include a sample yoke redesign, periodic measurements of the lamp polarization coupled with stability measurements, modifications to the source optics, and stray light reduction. We measured BRF in 6 spectral wavebands for 9 out-of-plane illumination angles and 2 view angles. We achieved NIST-traceable measurements with an uncertainty ranging from 1.09% to 1.32%. Our measurements of a smaller Spectralon sample match NIST measurements of the same sample to better than 0.5%. These requirements are nominally the same as achieved on MODIS. As a result of instrument upgrades, we currently meet this overall uncertainty while having included additional uncertainty terms.

  3. Traceable measurement and uncertainty analysis of the gross calorific value of methane determined by isoperibolic calorimetry

    Science.gov (United States)

    Haloua, F.; Foulon, E.; Allard, A.; Hay, B.; Filtz, J. R.

    2015-12-01

    As methane is the major component of natural gas and of non-conventional gases such as biogas or mine gas, its energy content has to be measured accurately, regardless of production site, for the fiscal trading of transported and distributed natural gas. The determination of the calorific value of fuel gases with the lowest uncertainty can only be performed by a direct method with a reference gas calorimeter. To address this point, LNE developed a few years ago an isoperibolic reference gas calorimeter based on Rossini's principle. The energy content Hs of methane of 99.9995% purity has been measured as 55 507.996 kJ kg-1 (890.485 kJ mol-1) with an expanded relative uncertainty of 0.091% (coverage factor k = 2.101, providing a level of confidence of approximately 95%). These results are based on ten repeated measurements and on an uncertainty assessment performed in accordance with the Guide to the Expression of Uncertainty in Measurement (GUM). The experimental setup and the results are reported here, and, for the first time, the fully detailed uncertainty calculation is presented.
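
    For illustration, a coverage factor slightly above 2, like the k = 2.101 quoted here, arises when a Type A term from a small number of repeats dominates and the effective degrees of freedom are computed by the Welch-Satterthwaite formula; the sketch below shows that route with hypothetical budget numbers, not LNE's actual budget.

      # Sketch: combine a Type A term (from repeated combustions) with a pooled
      # Type B term, take the Welch-Satterthwaite effective degrees of freedom,
      # and read the coverage factor from Student's t. Values are hypothetical.
      import math
      from scipy import stats

      u_A, nu_A = 0.040, 9          # relative %, from ten repeats (9 dof)
      u_B, nu_B = 0.020, 50         # pooled Type B, assigned dof

      u_c = math.hypot(u_A, u_B)
      nu_eff = u_c**4 / (u_A**4 / nu_A + u_B**4 / nu_B)
      k = stats.t.ppf(0.975, df=nu_eff)
      print(f"u_c = {u_c:.3f}%, nu_eff = {nu_eff:.1f}, k = {k:.2f}, "
            f"U = {k * u_c:.3f}%")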

  4. Experimental and Measurement Uncertainty Associated with Characterizing Slurry Mixing Performance of Pulsating Jets at Multiple Scales

    Energy Technology Data Exchange (ETDEWEB)

    Bamberger, Judith A.; Piepel, Gregory F.; Enderlin, Carl W.; Amidan, Brett G.; Heredia-Langner, Alejandro

    2015-09-10

    Understanding how uncertainty manifests itself in complex experiments is important for developing the testing protocol and interpreting the experimental results. This paper describes experimental and measurement uncertainties, and how they can depend on the order in which experimental tests are performed. Experiments with pulse-jet mixers in tanks at three scales were conducted to characterize the performance of transient-developing periodic flows in Newtonian slurries. Other test parameters included the simulant, solids concentration, and nozzle exit velocity. Critical suspension velocity and cloud height were the metrics used to characterize Newtonian slurry flow associated with mobilization and mixing. During testing, near-replicate and near-repeat tests were conducted. The experimental results were used to quantify the combined experimental and measurement uncertainties using standard deviations and percent relative standard deviations (%RSD). The uncertainties in critical suspension velocity and cloud height tend to increase with the values of these responses. Hence, the %RSD values are the more appropriate summary measure of near-replicate testing and measurement uncertainty.

  5. Estimating the Uncertainty of Tensile Strength Measurement for A Photocured Material Produced by Additive Manufacturing

    Directory of Open Access Journals (Sweden)

    Adamczak Stanisław

    2014-08-01

    The aim of this study was to estimate the measurement uncertainty for a material produced by additive manufacturing. The material investigated was FullCure 720 photocured resin, which was used to fabricate tensile specimens with a Connex 350 3D printer based on PolyJet technology. The tensile strength of the specimens, established through static tensile testing, was used to determine the measurement uncertainty. There is a need for extensive research into the performance of model materials obtained via 3D printing, as they have not been studied as thoroughly as metal alloys or plastics, the most common structural materials. In this analysis, the measurement uncertainty was estimated using a larger number of samples than usual, i.e., thirty instead of the typical ten. The results can be very useful to engineers who design models and finished products using this material. The investigation also shows how wide the scatter of results is.
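
    A minimal sketch of the Type A evaluation implied by testing thirty specimens: the standard uncertainty of the mean and an expanded uncertainty with a coverage factor from Student's t; the strength data below are randomly generated stand-ins, not the study's measurements.

      # Type A evaluation from n = 30 tensile specimens: standard uncertainty
      # of the mean and expanded uncertainty via Student's t. Stand-in data.
      import numpy as np
      from scipy import stats

      strength_mpa = np.random.default_rng(7).normal(60.0, 2.5, 30)

      n = strength_mpa.size
      mean = strength_mpa.mean()
      u = strength_mpa.std(ddof=1) / np.sqrt(n)   # Type A standard uncertainty
      k = stats.t.ppf(0.975, df=n - 1)            # ~2.05 for 29 dof
      print(f"Rm = {mean:.1f} MPa, U = {k * u:.2f} MPa (95%, k = {k:.2f})")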

  6. Uncertainties of size measurements in electron microscopy characterization of nanomaterials in foods

    DEFF Research Database (Denmark)

    Dudkiewicz, Agnieszka; Boxall, Alistair B. A.; Chaudhry, Qasim;

    2015-01-01

    Electron microscopy is a recognized standard tool for nanomaterial characterization, and is recommended by the European Food Safety Authority for the size measurement of nanomaterials in food. Despite this, little data have been published assessing the reliability of the method, especially for size measurement of nanomaterials characterized by a broad size distribution and/or added to food matrices. This study is a thorough investigation of the measurement uncertainty when applying electron microscopy for size measurement of engineered nanomaterials in foods. Our results show that the number of measured particles was only a minor source of measurement uncertainty for nanomaterials in food, compared to the combined influence of sampling, sample preparation prior to imaging and the image analysis. The main conclusion is that to improve the measurement reliability, care should be taken to consider...

  7. Comparison of ISO-GUM and Monte Carlo Method for Evaluation of Measurement Uncertainty

    International Nuclear Information System (INIS)

    To supplement the ISO-GUM method for the evaluation of measurement uncertainty, a simulation program using the Monte Carlo method (MCM) was developed, and the MCM and GUM methods were compared. The results are as follows: (1) even under a non-normal probability distribution of the measurement, MCM provides an accurate coverage interval; (2) even if a probability distribution that emerges from combining a few non-normal distributions looks normal, there are cases in which the actual distribution is not normal, and the non-normality can be determined from the probability distribution of the combined variance; and (3) if Type A standard uncertainties are involved in the evaluation of measurement uncertainty, the GUM generally gives an undervalued coverage interval. However, this problem can be solved by a Bayesian evaluation of the Type A standard uncertainty. In this case, the effective degrees of freedom for the combined variance are not required in the evaluation of the expanded uncertainty, and the appropriate coverage factor for the 95% level of confidence was determined to be 1.96

  8. Measure of Uncertainty in Process Models Using Stochastic Petri Nets and Shannon Entropy

    Directory of Open Access Journals (Sweden)

    Martin Ibl

    2016-01-01

    When modelling and analysing business processes, the main emphasis is usually put on model validity and accuracy, i.e., the model meets the formal specification and also models the relevant system. In recent years, a series of metrics has begun to be developed that allows the quantification of specific properties of process models; these characteristics include complexity, comprehensibility, cohesion, and uncertainty. This work is focused on defining a method that allows us to measure the uncertainty of a process model modelled using stochastic Petri nets (SPN). The principle of this method consists of mapping all reachable markings of the SPN into a continuous-time Markov chain and then calculating its stationary probabilities. The uncertainty is then measured as the entropy of the Markov chain (it is possible to calculate the uncertainty of a specific subset of places as well as of the whole net). Alternatively, an uncertainty index is quantified as the percentage of the calculated entropy against the maximum entropy (the resulting value is normalized to the interval <0,1>). The calculated entropy can also be used as a measure of the model complexity.
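
    A minimal sketch of the entropy calculation described: solve for the stationary distribution of a small continuous-time Markov chain (standing in for the reachable markings of an SPN), then compute its Shannon entropy and the normalized uncertainty index; the generator matrix is a hypothetical three-state example.

      # Stationary distribution of a small continuous-time Markov chain, its
      # Shannon entropy, and the normalized uncertainty index in [0, 1].
      # The generator matrix (rates) is hypothetical.
      import numpy as np

      Q = np.array([[-2.0, 1.0, 1.0],      # generator matrix: rows sum to zero
                    [ 0.5, -1.0, 0.5],
                    [ 1.0, 1.0, -2.0]])

      # Solve pi Q = 0 subject to sum(pi) = 1 (least squares on the stacked system).
      A = np.vstack([Q.T, np.ones(3)])
      b = np.array([0.0, 0.0, 0.0, 1.0])
      pi, *_ = np.linalg.lstsq(A, b, rcond=None)

      H = -np.sum(pi * np.log2(pi))        # entropy of the stationary distribution
      print(f"pi = {np.round(pi, 3)}, H = {H:.3f} bits, "
            f"index = {H / np.log2(pi.size):.3f}")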

  9. About uncertainties related to the indirect method of measuring radiation doses in paediatric radiography

    International Nuclear Information System (INIS)

    The indirect method of measuring radiation doses in diagnostic radiology has played an important role in large-scale dosimetric surveys of paediatric patients. Determining the uncertainties associated with this method is crucial for comparing the results surveyed in different radiology departments for optimisation purposes. Entrance surface doses (E.S.D.) received by paediatric patients in chest and skull radiographies were estimated by the indirect method in three public hospitals of the city of Belo Horizonte, Brazil: two general hospitals and a children's hospital. Uncertainties of the entrance doses were calculated from the uncertainties of the output measurements, backscatter factors, patient data and technique factors employed, within a 95% confidence limit. In one room of a general hospital, E.S.D. values for diagnostic images of the chest were (74 ± 12%) μGy for a one-year-old child, (92 ± 11%) μGy for a five-year-old child and (135 ± 12%) μGy for a ten-year-old child. E.S.D. values in the two radiographic procedures studied for a five-year-old child were generally lower than those published by the Commission of the European Communities in 1996 and higher than those published by the National Radiological Protection Board in 2000. The uncertainties of the output measurements and of the technique factors employed (a consequence of the non-standardisation of technique factors) were the main contributors to the high uncertainty values found in some rooms. (authors)

  10. Total Uncertainty in Measurements Record for Climate: Strategies from the CLARREO Mission

    Science.gov (United States)

    Dykema, J. A.; Anderson, J.

    2010-12-01

    Questions about uncertainty in observed trends in the climate system arise from multiple sources, including instrument performance, issues of temporal and spatial sampling, and the geophysical information content obtainable from measurement records. The Climate Absolute Radiance and Refractivity Observatory (CLARREO) mission is designed to provide objective, testable evidence to support uncertainty estimates associated with these various sources. In this paper, we examine the strategies planned for CLARREO to obtain the information required to achieve this objective. In the case of instrument performance, the CLARREO sensors will utilize methods for obtaining robust uncertainty estimates that have been refined throughout the natural sciences through the work of the international community of National Measurement Institutes (NMIs). The foundation of the methods developed by the NMI community is a set of measurement standards that can be reproduced over time, and across national borders and institutions, to assure an exact quantitative relationship between different measurements. These measurement standards are the International System of Units, or SI. The SI units achieve the required properties by utilizing fundamental properties of matter to define a measurement system that is independent of instruments or techniques specific to a particular place or time. This robust set of measurement standards then forms the basis for an experimental strategy to test the uncertainty of a climate observation system using objective techniques that can be repeated by any experimenter, anywhere in the world, at any time. This paper will look at specific examples of the physical logic underlying this framework for the CLARREO infrared instrument suite, paying special attention to the overlap between the CLARREO calibration strategies and measurement successes from other areas of natural science. The interplay of measurement uncertainty with sampling and information

  11. Calculation of the detection limit in radiation measurements with systematic uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Kirkpatrick, J.M., E-mail: john.kirkpatrick@canberra.com; Russ, W.; Venkataraman, R.; Young, B.M.

    2015-06-01

    The detection limit (L_D) or Minimum Detectable Activity (MDA) is an a priori evaluation of assay sensitivity intended to quantify the suitability of an instrument or measurement arrangement for the needs of a given application. Traditional approaches as pioneered by Currie rely on Gaussian approximations to yield simple, closed-form solutions, and neglect the effects of systematic uncertainties in the instrument calibration. These approximations are applicable over a wide range of applications, but are of limited use in low-count applications, when high confidence values are required, or when systematic uncertainties are significant. One proposed modification to the Currie formulation attempts to account for systematic uncertainties within a Gaussian framework. We have previously shown that this approach results in an approximation formula that works best only for small values of the relative systematic uncertainty, for which the modification of Currie's method is the least necessary, and that it significantly overestimates the detection limit or gives infinite or otherwise non-physical results for larger systematic uncertainties where such a correction would be the most useful. We have developed an alternative approach for calculating detection limits based on realistic statistical modeling of the counting distributions which accurately represents statistical and systematic uncertainties. Instead of a closed-form solution, numerical and iterative methods are used to evaluate the result. Accurate detection limits can be obtained by this method for the general case.

  12. Calculation of the detection limit in radiation measurements with systematic uncertainties

    International Nuclear Information System (INIS)

    The detection limit (LD) or Minimum Detectable Activity (MDA) is an a priori evaluation of assay sensitivity intended to quantify the suitability of an instrument or measurement arrangement for the needs of a given application. Traditional approaches as pioneered by Currie rely on Gaussian approximations to yield simple, closed-form solutions, and neglect the effects of systematic uncertainties in the instrument calibration. These approximations are applicable over a wide range of applications, but are of limited use in low-count applications, when high confidence values are required, or when systematic uncertainties are significant. One proposed modification to the Currie formulation attempts to account for systematic uncertainties within a Gaussian framework. We have previously shown that this approach results in an approximation formula that works best only for small values of the relative systematic uncertainty, for which the modification of Currie's method is the least necessary, and that it significantly overestimates the detection limit or gives infinite or otherwise non-physical results for larger systematic uncertainties where such a correction would be the most useful. We have developed an alternative approach for calculating detection limits based on realistic statistical modeling of the counting distributions which accurately represents statistical and systematic uncertainties. Instead of a closed-form solution, numerical and iterative methods are used to evaluate the result. Accurate detection limits can be obtained by this method for the general case.
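
    The contrast between Currie's closed form and a fully numerical evaluation can be sketched in a few lines. The Python fragment below is only an illustration of the idea, not the authors' algorithm: it evaluates the classic Gaussian-approximation detection limit for a well-known background and then a purely numerical one built from explicit Poisson statistics, with a multiplicative calibration factor (here assumed normally distributed with relative standard deviation sys_rel, a made-up systematic model) folded in by Monte Carlo.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(42)

      def currie_ld(b_counts):
          """Currie-style closed form for a well-known background (alpha = beta = 0.05);
          the familiar 2.71 + 4.65*sqrt(B) variant assumes a paired blank instead."""
          return 2.71 + 3.29 * np.sqrt(b_counts)

      def numerical_ld(b_counts, sys_rel=0.0, alpha=0.05, beta=0.05, n_mc=20_000):
          """Detection limit from explicit Poisson statistics.
          Critical level: smallest c with P(N > c | background) <= alpha.
          Then search for the smallest mean net signal s such that
          P(N > c | s + background) >= 1 - beta, averaging over the
          hypothetical calibration factor."""
          c = stats.poisson.ppf(1.0 - alpha, b_counts)
          eff = np.clip(rng.normal(1.0, sys_rel, n_mc), 1e-6, None)
          for s in np.linspace(0.1, 30 * np.sqrt(b_counts) + 30, 400):
              p_detect = np.mean(stats.poisson.sf(c, eff * s + b_counts))
              if p_detect >= 1.0 - beta:
                  return s
          return np.nan

      b = 100.0  # background counts
      print("Closed form        :", currie_ld(b))
      print("Numerical, no syst.:", numerical_ld(b))
      print("Numerical, 20% syst:", numerical_ld(b, sys_rel=0.20))

    For the moderate values tried here the two approaches agree closely when the systematic term vanishes, while the numerical limit grows smoothly as the systematic uncertainty increases instead of diverging.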

  13. A reference material for establishing uncertainties in full-field displacement measurements

    International Nuclear Information System (INIS)

    A simple reference material for establishing the minimum measurement uncertainty of optical systems for measuring 3D surface displacement fields in deforming objects is described and its use demonstrated by employing 3D digital image correlation as an exemplar technique. The reference material consists of a stepped bar, whose dimensions can be scaled to suit the application, and that can be clamped rigidly at its thick end to create an idealized cantilever. The cantilever was excited at resonance to generate out-of-plane displacements and, in a separate experiment, loaded statically in-plane to provide in-plane displacement fields. The displacements were measured using 3D digital image correlation and compared to the predicted displacement fields derived from tip deflections obtained using a calibrated transducer that provided traceability to the national standard for length. The minimum measurement uncertainties were evaluated by comparing the measured and predicted displacement fields, taking account of the uncertainties in the input parameters for the predictions. It was found that the minimum measurement uncertainties were less than 3% for the Cartesian components of displacement present during static in-plane bending and less than 3 µm for out-of-plane displacements during dynamic loading. It was concluded that this reference material was more straightforward to use, more versatile and yielded comparable results relative to an earlier design. (paper)

  14. Variance gradients and uncertainty budgets for nonlinear measurement functions with independent inputs

    International Nuclear Information System (INIS)

    A novel variance-based measure for global sensitivity analysis, termed a variance gradient (VG), is presented for constructing uncertainty budgets under the Guide to the Expression of Uncertainty in Measurement (GUM) framework for nonlinear measurement functions with independent inputs. The motivation behind VGs is the desire of metrologists to understand which inputs' variance reductions would most effectively reduce the variance of the measurand. VGs are particularly useful when the application of the first supplement to the GUM is indicated because of the inadequacy of measurement function linearization. However, VGs reduce to a commonly understood variance decomposition in the case of a linear(ized) measurement function with independent inputs for which the original GUM readily applies. The usefulness of VGs is illustrated by application to an example from the first supplement to the GUM, as well as to the benchmark Ishigami function. A comparison of VGs to other available sensitivity measures is made. (paper)
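
    As a rough numerical companion to the variance-gradient idea, the sketch below perturbs the variance of one input at a time of the benchmark Ishigami function and estimates dVar(Y)/dVar(X_i) by finite differences and Monte Carlo. This illustrates the concept under simplifying assumptions (uniform inputs, variance scaled about the sample mean, common random numbers); it is not the paper's analytical VG construction.

      import numpy as np

      rng = np.random.default_rng(0)

      def ishigami(x, a=7.0, b=0.1):
          # Benchmark function commonly used in sensitivity analysis
          return np.sin(x[:, 0]) + a * np.sin(x[:, 1])**2 + b * x[:, 2]**4 * np.sin(x[:, 0])

      n = 200_000
      x = rng.uniform(-np.pi, np.pi, size=(n, 3))   # independent inputs
      var_y = np.var(ishigami(x))

      # Finite-difference variance gradient: shrink the variance of one input
      # by a factor (1 - eps) and measure the change in Var(Y).
      eps = 0.05
      for i in range(3):
          x_scaled = x.copy()
          mu = x[:, i].mean()
          x_scaled[:, i] = mu + np.sqrt(1 - eps) * (x[:, i] - mu)
          var_scaled = np.var(ishigami(x_scaled))
          vg = (var_y - var_scaled) / (eps * np.var(x[:, i]))  # ~ dVar(Y)/dVar(X_i)
          print(f"X{i+1}: VG ~ {vg:.3f}   (Var(Y) = {var_y:.3f})")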

  15. Measuring the perceived uncertainty of scientific evidence and its relationship to engagement with science.

    Science.gov (United States)

    Retzbach, Joachim; Otto, Lukas; Maier, Michaela

    2016-08-01

    Many scholars have argued for the need to communicate openly not only scientific successes to the public but also limitations, such as the tentativeness of research findings, in order to enhance public trust and engagement. Yet, it has not been quantitatively assessed how the perception of scientific uncertainties relates to engagement with science on an individual level. In this article, we report the development and testing of a new questionnaire in English and German measuring the perceived uncertainty of scientific evidence. Results indicate that the scale is reliable and valid in both language versions and that its two subscales are differentially related to measures of engagement: Science-friendly attitudes were positively related only to 'subjectively' perceived uncertainty, whereas interest in science as well as behavioural engagement actions and intentions were largely uncorrelated. We conclude that perceiving scientific knowledge to be uncertain is only weakly, but positively related to engagement with science. PMID:25814513

  16. Recent Surface Reflectance Measurement Campaigns with Emphasis on Best Practices, SI Traceability and Uncertainty Estimation

    Science.gov (United States)

    Helder, Dennis; Thome, Kurtis John; Aaron, Dave; Leigh, Larry; Czapla-Myers, Jeff; Leisso, Nathan; Biggar, Stuart; Anderson, Nik

    2012-01-01

    A significant problem facing the optical satellite calibration community is limited knowledge of the uncertainties associated with fundamental measurements, such as surface reflectance, used to derive satellite radiometric calibration estimates. In addition, it is difficult to compare the capabilities of calibration teams around the globe, which leads to differences in the estimated calibration of optical satellite sensors. This paper reports on two recent field campaigns that were designed to isolate common uncertainties within and across calibration groups, particularly with respect to ground-based surface reflectance measurements. Initial results from these efforts suggest the uncertainties can be as low as 1.5% to 2.5%. In addition, methods for improving the cross-comparison of calibration teams are suggested that can potentially reduce the differences in the calibration estimates of optical satellite sensors.

  17. Calculation of uncertainties associated to environmental radioactivity measurements and their functions. Practical Procedure II

    International Nuclear Information System (INIS)

    Environmental radioactivity measurements are mainly affected by counting uncertainties. In this report the uncertainties associated to certain functions related to activity concentration calculations are determined. Practical exercises are presented to calculate the uncertainties associated to: a) the chemical recovery of a radiochemical separation when employing tracers (i.e. Pu and Am purification from a sediment sample); b) the indirect determination of a mother radionuclide through one of its daughters (i.e. 210Pb quantification following the build-up of the activity of its daughter 210Po); c) the time span from the last separation date of one of the components of a disintegration chain (i.e. the date of the last Am purification in nuclear weapons material, following 241Am and 241Pu measurements). Calculations concerning examples b) and c) are based on the Bateman equations, which govern radioactive equilibria. Although the exercises presented here are performed with certain radionuclides, they can be applied as generic procedures for other alpha-emitting radioelements
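
    Example b) reduces to the two-member Bateman solution for parent and daughter activities. The Python sketch below shows the 210Po ingrowth calculation with a simple Monte Carlo propagation of the parent's counting uncertainty; the activities are made-up numbers and the report itself works through spreadsheet formulas rather than this code.

      import numpy as np

      # Half-lives (literature values): Pb-210: 22.3 y, Po-210: 138.4 d
      LAM_PB = np.log(2) / (22.3 * 365.25)   # 1/day
      LAM_PO = np.log(2) / 138.4             # 1/day

      def po210_activity(t_days, a_pb0, a_po0=0.0):
          """Bateman solution for the daughter activity A_Po(t), given the
          parent activity A_Pb(0) and any initial unsupported Po-210."""
          ingrowth = a_pb0 * LAM_PO / (LAM_PO - LAM_PB) * (
              np.exp(-LAM_PB * t_days) - np.exp(-LAM_PO * t_days))
          return ingrowth + a_po0 * np.exp(-LAM_PO * t_days)

      # Propagate the counting uncertainty of A_Pb(0) by Monte Carlo
      rng = np.random.default_rng(1)
      a_pb0, u_pb0 = 50.0, 2.5   # mBq and standard uncertainty (illustrative)
      samples = po210_activity(200.0, rng.normal(a_pb0, u_pb0, 100_000))
      print(f"A_Po(200 d) = {samples.mean():.1f} +/- {samples.std():.1f} mBq")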

  18. Investment in flood protection measures under climate change uncertainty. An investment decision

    Energy Technology Data Exchange (ETDEWEB)

    Bruin, Karianne de

    2012-11-01

    Recent river flooding in Europe has triggered debates among scientists and policymakers on future projections of flood frequency and the need for adaptive investments, such as flood protection measures. Because there exists uncertainty about the impact of climate change on flood risk, such investments require a careful analysis of expected benefits and costs. The objective of this paper is to show how climate change uncertainty affects the decision to invest in flood protection measures. We develop a model that simulates optimal decision making in flood protection; it incorporates flexible timing of investment decisions and scientific uncertainty on the extent of climate change impacts. This model allows decision-makers to cope with the uncertain impacts of climate change on the frequency and damage of river flood events and minimises the risk of under- or over-investment. One of the innovative elements is that we explicitly distinguish between structural and non-structural flood protection measures. Our results show that the optimal investment decision today depends strongly on the cost structure of the adaptation measures and the discount rate, especially the ratio of fixed and weighted annual costs of the measures. A higher level of annual flood damage and later resolution of uncertainty in time increase the optimal investment. Furthermore, the optimal investment decision today is influenced by the decision-maker's ability to adjust the decision at a future moment in time. (auth)

  19. Uncertainty quantification in aerosol optical thickness retrieval from Ozone Monitoring Instrument (OMI) measurements

    Science.gov (United States)

    Määttä, A.; Laine, M.; Tamminen, J.; Veefkind, J. P.

    2013-10-01

    Space-borne measurements provide a global view of the atmospheric aerosol distribution. The Ozone Monitoring Instrument (OMI) on board NASA's Earth Observing System (EOS) Aura satellite is a Dutch-Finnish nadir-viewing solar backscatter spectrometer measuring in the ultraviolet and visible wavelengths. OMI measures several trace gases and aerosols that are important in many air quality and climate studies. The OMI aerosol measurements are used, for example, for detecting volcanic ash plumes, wild fires and the transport of desert dust. We present a methodology for improving the uncertainty quantification in the aerosol retrieval algorithm, using the OMI measurements in this feasibility study. Our focus is on the uncertainties originating from the pre-calculated aerosol models, which are never complete descriptions of reality. This aerosol model uncertainty is estimated using Gaussian processes with computational tools from spatial statistics. Our approach is based on smooth systematic differences between the observed and modelled reflectances. When acknowledging this model inadequacy in the estimation of aerosol optical thickness (AOT), the uncertainty estimates are more realistic. We present here a real-world example of applying the methodology.
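
    The core of the approach, treating the smooth systematic difference between observed and modelled reflectances as a Gaussian-process discrepancy, can be sketched with plain numpy. Everything below is synthetic (the grid, the discrepancy, the kernel hyperparameters); it only illustrates the GP posterior computation, not the operational OMI algorithm.

      import numpy as np

      def rbf_kernel(x1, x2, sigma=1.0, ell=0.2):
          # Squared-exponential covariance encoding "smooth systematic differences"
          return sigma**2 * np.exp(-0.5 * (x1[:, None] - x2[None, :])**2 / ell**2)

      rng = np.random.default_rng(3)
      wav = np.linspace(0.0, 1.0, 40)                       # normalized wavelength grid
      discrepancy = 0.02 * np.sin(2 * np.pi * wav)          # synthetic model inadequacy
      resid = discrepancy + rng.normal(0, 0.005, wav.size)  # observed minus modelled

      # GP posterior mean/variance for the discrepancy, noise std tau
      tau = 0.005
      K = rbf_kernel(wav, wav, sigma=0.03)
      K_noisy = K + tau**2 * np.eye(wav.size)
      post_mean = K @ np.linalg.solve(K_noisy, resid)
      post_var = np.diag(K - K @ np.linalg.solve(K_noisy, K))
      print(post_mean[:5], np.sqrt(post_var[:5]))

    Acknowledging the discrepancy in this way inflates the retrieval covariance where the aerosol models fit poorly, which is the mechanism behind the "more realistic" AOT uncertainties described above.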

  20. Account for uncertainties of control measurements in the assessment of design margin factors

    International Nuclear Information System (INIS)

    The paper discusses the feasibility of accounting for the uncertainties of control measurements in the estimation of design margin factors. Feasibility is also considered in terms of the extent to which the processed measured data are corrected by a priori calculated values of the measurable parameters. The possibility and practicality of such data correction are demonstrated by the authors with the help of Bayes' theorem, well known in mathematical statistics. (Authors)
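
    For Gaussian errors this correction has a textbook conjugate form, which may be what the authors allude to (the abstract does not spell it out): combining a measured value x_m with variance \sigma_m^2 and an a priori calculated value x_c with variance \sigma_c^2 via Bayes' theorem gives the posterior

      \hat{x} = \frac{x_m/\sigma_m^2 + x_c/\sigma_c^2}{1/\sigma_m^2 + 1/\sigma_c^2},
      \qquad
      \sigma_{\hat{x}}^2 = \left( \sigma_m^{-2} + \sigma_c^{-2} \right)^{-1},

    i.e. a precision-weighted average that corrects the processed measured data towards the a priori calculated value in proportion to their relative uncertainties.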

  1. Regional inversion of CO2 ecosystem fluxes from atmospheric measurements. Reliability of the uncertainty estimates

    Energy Technology Data Exchange (ETDEWEB)

    Broquet, G.; Chevallier, F.; Breon, F.M.; Yver, C.; Ciais, P.; Ramonet, M.; Schmidt, M. [Laboratoire des Sciences du Climat et de l' Environnement, CEA-CNRS-UVSQ, UMR8212, IPSL, Gif-sur-Yvette (France); Alemanno, M. [Servizio Meteorologico dell' Aeronautica Militare Italiana, Centro Aeronautica Militare di Montagna, Monte Cimone/Sestola (Italy); Apadula, F. [Research on Energy Systems, RSE, Environment and Sustainable Development Department, Milano (Italy); Hammer, S. [Universitaet Heidelberg, Institut fuer Umweltphysik, Heidelberg (Germany); Haszpra, L. [Hungarian Meteorological Service, Budapest (Hungary); Meinhardt, F. [Federal Environmental Agency, Kirchzarten (Germany); Necki, J. [AGH University of Science and Technology, Krakow (Poland); Piacentino, S. [ENEA, Laboratory for Earth Observations and Analyses, Palermo (Italy); Thompson, R.L. [Max Planck Institute for Biogeochemistry, Jena (Germany); Vermeulen, A.T. [Energy research Centre of the Netherlands ECN, EEE-EA, Petten (Netherlands)

    2013-07-01

    The Bayesian framework of CO2 flux inversions permits estimates of the retrieved flux uncertainties. Here, the reliability of these theoretical estimates is studied through a comparison against the misfits between the inverted fluxes and independent measurements of the CO2 Net Ecosystem Exchange (NEE) made by the eddy covariance technique at local (few hectares) scale. Regional inversions at 0.5° resolution are applied for the western European domain where ~50 eddy covariance sites are operated. These inversions are conducted for the period 2002-2007. They use a mesoscale atmospheric transport model, a prior estimate of the NEE from a terrestrial ecosystem model and rely on the variational assimilation of in situ continuous measurements of CO2 atmospheric mole fractions. Averaged over monthly periods and over the whole domain, the misfits are in good agreement with the theoretical uncertainties for prior and inverted NEE, and pass the chi-square test for the variance at the 30% and 5% significance levels respectively, despite the scale mismatch and the independence between the prior (respectively inverted) NEE and the flux measurements. The theoretical uncertainty reduction for the monthly NEE at the measurement sites is 53% while the inversion decreases the standard deviation of the misfits by 38%. These results build confidence in the NEE estimates at the European/monthly scales and in their theoretical uncertainty from the regional inverse modelling system. However, the uncertainties at the monthly (respectively annual) scale remain larger than the amplitude of the inter-annual variability of monthly (respectively annual) fluxes, so that this study does not engender confidence in the inter-annual variations. The uncertainties at the monthly scale are significantly smaller than the seasonal variations. The seasonal cycle of the inverted fluxes is thus reliable. In particular, the CO2 sink period over the European continent likely ends later than

  2. Identification, summary and comparison of tools used to measure organizational attributes associated with chronic disease management within primary care settings

    OpenAIRE

    Lukewich, Julia; Corbin, Renée; Elizabeth G VanDenKerkhof; Edge, Dana S.; Williamson, Tyler; Tranmer, Joan E.

    2014-01-01

    Rationale, aims and objectives Given the increasing emphasis being placed on managing patients with chronic diseases within primary care, there is a need to better understand which primary care organizational attributes affect the quality of care that patients with chronic diseases receive. This study aimed to identify, summarize and compare data collection tools that describe and measure organizational attributes used within the primary care setting worldwide. Methods Systematic search and r...

  3. A new approach on restoration of dynamic measurement uncertainties in optical precision coordinate metrology

    International Nuclear Information System (INIS)

    This paper presents a new approach to the restoration of dynamically influenced measurement uncertainties in optical precision coordinate metrology (OPCM), which uses image sensors to measure geometrical features. Dynamic measurements, within the context of this paper, are based upon relative motion between the imaging setup (CCD camera and optical system) and the measuring object or the measuring scene. Dynamic image acquisition causes motion blur, which degrades the uncertainty of the measurand. The approach presented is a new technique to restore motion-degraded images, using different methods to analyze important image features; it extends the well-known Richardson-Lucy image restoration technique with a new convergence criterion based on the variation of the detectable sub-pixel edge position at each iteration
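
    The baseline Richardson-Lucy iteration that the paper extends is compact enough to state directly. The sketch below implements it in Python with a generic stopping rule based on the relative change of the estimate; the authors' criterion, based on sub-pixel edge positions, is more specialized and is not reproduced here. The image, PSF and tolerance are all made up.

      import numpy as np
      from scipy.signal import fftconvolve

      def richardson_lucy(blurred, psf, max_iter=50, tol=1e-4):
          """Classic Richardson-Lucy deconvolution; stops when the estimate
          changes by less than tol in relative norm."""
          psf_mirror = psf[::-1, ::-1]
          estimate = np.full_like(blurred, blurred.mean())
          for _ in range(max_iter):
              reblurred = fftconvolve(estimate, psf, mode="same")
              ratio = blurred / np.maximum(reblurred, 1e-12)
              new = estimate * fftconvolve(ratio, psf_mirror, mode="same")
              if np.linalg.norm(new - estimate) / np.linalg.norm(estimate) < tol:
                  return new
              estimate = new
          return estimate

      # Synthetic horizontal motion blur over 9 pixels
      rng = np.random.default_rng(7)
      img = np.zeros((64, 64)); img[20:44, 20:44] = 1.0
      psf = np.full((1, 9), 1.0 / 9.0)
      blurred = fftconvolve(img, psf, mode="same") + rng.normal(0, 1e-3, img.shape)
      restored = richardson_lucy(np.clip(blurred, 1e-6, None), psf)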

  4. Measuring Cross-Section and Estimating Uncertainties with the fissionTPC

    Energy Technology Data Exchange (ETDEWEB)

    Bowden, N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Manning, B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Sangiorgio, S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Seilhan, B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-01-30

    The purpose of this document is to outline the prescription for measuring fission cross-sections with the NIFFTE fissionTPC and estimating the associated uncertainties. As such it will serve as a work planning guide for NIFFTE collaboration members and facilitate clear communication of the procedures used to the broader community.

  5. Calculation of uncertainties associated to environmental radioactivity measurements and their functions. Practical Procedure

    International Nuclear Information System (INIS)

    This report summarizes the procedure used to calculate the uncertainties associated to environmental radioactivity measurements, focusing on those obtained by radiochemical separation in which tracers have been added. Uncertainties linked to activity concentration calculations, isotopic ratios, inventories, sequential leaching data, chronology dating using the C.R.S. model and duplicate analysis are described in detail. The objective of this article is to serve as a guide for people not familiar with this kind of calculation, showing clear practical examples. The input of the formulas and all the data needed to carry out these calculations in Lotus 1-2-3 WIN is outlined as well. (Author) 13 refs

  6. Calculation of uncertainties associated to environmental radioactivity measurements and their functions. Practical Procedure

    International Nuclear Information System (INIS)

    This report summarizes the procedure used to calculate the uncertainties associated to environmental radioactivity measurements, focusing on those obtained by radiochemical separation in which tracers have been added. Uncertainties linked to activity concentration calculations, isotopic ratios, inventories, sequential leaching data, chronology dating using the C.R.S. model and duplicate analysis are described in detail. The objective of this article is to serve as a guide for people not familiar with this kind of calculation, showing clear practical examples. The input of the formulas and all the data needed to carry out these calculations in Lotus 1-2-3 WIN is outlined as well. (Author)

  7. Source attribution of climatically important aerosol properties measured at Paposo (Chile during VOCALS

    Directory of Open Access Journals (Sweden)

    D. Chand

    2010-07-01

    Full Text Available Measurements of submicron aerosol composition, light scattering, and size distribution were made from 17 October to 15 November 2008 at the elevated Paposo site (25° 0.4' S, 70° 27.01' W, 690 m a.s.l.) on the Chilean coast as part of the VOCALS1 Regional Experiment (REx). Based on the chemical composition measurements, a receptor modeling analysis using Positive Matrix Factorization (PMF) was carried out, yielding four broad source categories of the aerosol mass, light scattering coefficient, and a proxy for cloud condensation nucleus (CCN) concentration at 0.4% supersaturation derived from the size distribution measurements assuming an observed soluble mass fraction of 0.53. The sources resolved were biomass burning, marine, an urban-biofuels mix and a somewhat ambiguous mix of smelter emissions and mineral dust. The urban-biofuels mix is the dominant aerosol mass component (52%), followed by biomass burning (25%), smelter/soil dust (12%) and marine (9%) sources. The average (mean±std) submicron aerosol mass concentration, aerosol light scattering coefficient and proxy CCN concentration were 8.77±5.40 μg m−3, 21.9±11.0 Mm−1 and 548±210 cm−3, respectively. Sulfate is the dominant identified submicron species, constituting roughly 40% of the dry mass (3.64±2.30 μg m−3), although the identified soluble species constitute only 53% of the mass. Much of the unidentified mass is likely organic in nature. The relative importance of each aerosol source category is different depending upon whether mass, light scattering, or CCN concentration is being considered, indicating that the mean sizes of the aerosols associated with each source are different. Marine aerosols do not appear to contribute more than 10% to either mass, light scattering, or CCN concentration at this site. Back trajectory cluster analysis proved consistent with the PMF source attribution.


    1 VOCALS: VAMOS Ocean-Cloud-Atmosphere-Land Study

  8. Source attribution of climatically important aerosol properties measured at Paposo (Chile during VOCALS

    Directory of Open Access Journals (Sweden)

    D. Chand

    2010-11-01

    Full Text Available Measurements of submicron aerosol composition, light scattering, and size distribution were made from 17 October to 15 November 2008 at the elevated Paposo site (25° 0.4' S, 70° 27.01' W, 690 m a.s.l.) on the Chilean coast as part of the VOCALS* Regional Experiment (REx). Based on the chemical composition measurements, a receptor modeling analysis using Positive Matrix Factorization (PMF) was carried out, yielding four broad source categories of the aerosol mass, light scattering coefficient, and a proxy for cloud condensation nucleus (CCN) concentration at 0.4% supersaturation derived from the size distribution measurements assuming an observed soluble mass fraction of 0.53. The sources resolved were biomass burning, marine, an urban-biofuels mix and a somewhat ambiguous mix of smelter emissions and mineral dust. The urban-biofuels mix is the dominant aerosol mass component (52%), followed by biomass burning (25%), smelter/soil dust (12%) and marine (9%) sources. The average (mean±std) submicron aerosol mass concentration, aerosol light scattering coefficient and proxy CCN concentration were 8.77±5.40 μg m−3, 21.9±11.0 Mm−1 and 548±210 cm−3, respectively. Sulfate is the dominant identified submicron species, constituting roughly 40% of the dry mass (3.64±2.30 μg m−3), although the identified soluble species constitute only 53% of the mass. Much of the unidentified mass is likely organic in nature. The relative importance of each aerosol source category is different depending upon whether mass, light scattering, or CCN concentration is being considered, indicating that the mean sizes of the aerosols associated with each source are different. Marine aerosols do not appear to contribute more than 10% to either mass, light scattering, or CCN concentration at this site. Back trajectory cluster analysis proved consistent with the PMF source attribution.

    *VOCALS: VAMOS** Ocean-Cloud-Atmosphere-Land Study; **VAMOS: Variability of the American Monsoon Systems

  9. Evaluating uncertainty estimates in hydrologic models: borrowing measures from the forecast verification community

    Directory of Open Access Journals (Sweden)

    K. J. Franz

    2011-11-01

    Full Text Available The hydrologic community is generally moving towards the use of probabilistic estimates of streamflow, primarily through the implementation of Ensemble Streamflow Prediction (ESP) systems, ensemble data assimilation methods, or multi-modeling platforms. However, evaluation of probabilistic outputs has not necessarily kept pace with ensemble generation. Much of the modeling community is still performing model evaluation using standard deterministic measures, such as error, correlation, or bias, typically applied to the ensemble mean or median. Probabilistic forecast verification methods have been well developed, particularly in the atmospheric sciences, yet few have been adopted for evaluating uncertainty estimates in hydrologic model simulations. In the current paper, we overview existing probabilistic forecast verification methods and apply the methods to evaluate and compare model ensembles produced from two different parameter uncertainty estimation methods: Generalized Likelihood Uncertainty Estimation (GLUE) and the Shuffled Complex Evolution Metropolis (SCEM) algorithm. Model ensembles are generated for the National Weather Service SACramento Soil Moisture Accounting (SAC-SMA) model for 12 forecast basins located in the Southeastern United States. We evaluate the model ensembles using relevant metrics in the following categories: distribution, correlation, accuracy, conditional statistics, and categorical statistics. We show that the presented probabilistic metrics are easily adapted to model simulation ensembles and provide a robust analysis of model performance associated with parameter uncertainty. Application of these methods requires no information in addition to what is already available as part of traditional model validation methodology and considers the entire ensemble or uncertainty range in the approach.
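
    One widely used score in this family (standard in the forecast verification literature, though not singled out in the abstract) is the continuous ranked probability score (CRPS), which collapses to the absolute error for a deterministic forecast. A sample-based estimate, assuming only an ensemble of equally likely members, is:

      import numpy as np

      def crps_ensemble(members, obs):
          """Sample-based CRPS: E|X - y| - 0.5 * E|X - X'| for ensemble X
          and observation y (smaller is better)."""
          members = np.asarray(members, dtype=float)
          term1 = np.mean(np.abs(members - obs))
          term2 = 0.5 * np.mean(np.abs(members[:, None] - members[None, :]))
          return term1 - term2

      rng = np.random.default_rng(11)
      ensemble = rng.normal(100.0, 15.0, 50)   # e.g. 50 streamflow traces (m3/s)
      print(crps_ensemble(ensemble, obs=112.0))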

  10. Optimal entropic uncertainty relation for successive measurements in quantum information theory

    Indian Academy of Sciences (India)

    M D Srinivas

    2003-06-01

    We derive an optimal bound on the sum of entropic uncertainties of two or more observables when they are sequentially measured on the same ensemble of systems. This optimal bound is shown to be greater than or equal to the bounds derived in the literature on the sum of entropic uncertainties of two observables which are measured on distinct but identically prepared ensembles of systems. In the case of a two-dimensional Hilbert space, the optimal bound for successive measurements of two spin components is seen to be strictly greater than the optimal bound for the case when they are measured on distinct ensembles, except when the spin components are mutually parallel or perpendicular.
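
    For reference, the distinct-ensemble bound that the abstract compares against is the Maassen-Uffink relation (quoted here from the general literature, not from the paper itself),

      H(A) + H(B) \ge -2 \log_2 \max_{i,j} \lvert \langle a_i \vert b_j \rangle \rvert,

    where {|a_i>} and {|b_j>} are the eigenbases of the two observables. For two mutually perpendicular spin components of a qubit the maximum overlap is 1/\sqrt{2}, giving a bound of 1 bit, one of the cases in which the successive-measurement bound and the distinct-ensemble bound coincide.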

  11. Invited Review Article: Measurement uncertainty of linear phase-stepping algorithms

    International Nuclear Information System (INIS)

    Phase retrieval techniques are widely used in optics, imaging and electronics. Originating in signal theory, they were introduced to interferometry around 1970. Over the years, many robust phase-stepping techniques have been developed that minimize specific experimental influence quantities such as phase step errors or higher harmonic components of the signal. However, optimizing a technique for a specific influence quantity can compromise its performance with regard to others. We present a consistent quantitative analysis of phase measurement uncertainty for the generalized linear phase-stepping algorithm with nominally equal phase-stepping angles, thereby reviewing and generalizing several results that have been reported in the literature. All influence quantities are treated on an equal footing, and correlations between them are described in a consistent way. For the special case of classical N-bucket algorithms, we present analytical formulae that describe the combined variance as a function of the phase angle values. For the general Arctan algorithms, we derive expressions for the measurement uncertainty averaged over the full 2π-range of phase angles. We also give an upper bound for the measurement uncertainty which can be expressed as being proportional to an algorithm-specific factor. Tabular compilations help the reader to quickly assess the uncertainties that are involved with his or her technique.
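
    As a concrete instance of the class of algorithms analysed, the classic four-bucket algorithm with nominal 90-degree steps recovers the phase as phi = atan2(I4 - I2, I1 - I3). The Python sketch below propagates additive intensity noise through this estimator by Monte Carlo; the signal model and noise level are illustrative only, not taken from the paper.

      import numpy as np

      rng = np.random.default_rng(5)

      def four_bucket_phase(i1, i2, i3, i4):
          # N = 4 bucket algorithm with 90-degree phase steps
          return np.arctan2(i4 - i2, i1 - i3)

      # Interferogram model I_k = A + B*cos(phi + k*pi/2) plus additive noise
      A, B, phi_true, sigma = 1.0, 0.5, 0.7, 0.01
      n = 100_000
      frames = [A + B * np.cos(phi_true + k * np.pi / 2) + rng.normal(0, sigma, n)
                for k in range(4)]
      phi_hat = four_bucket_phase(*frames)
      print(f"bias = {phi_hat.mean() - phi_true:+.2e} rad, "
            f"std = {phi_hat.std():.2e} rad (theory ~ {sigma * np.sqrt(2) / (2 * B):.2e})")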

  12. Simultaneous minimum-uncertainty measurement of discrete-valued complementary observables

    CERN Document Server

    Trifonov, A S; Söderholm, J; Trifonov, Alexei; Bjork, Gunnar; Soderholm, Jonas

    2001-01-01

    We have made the first experimental demonstration of the simultaneous minimum uncertainty product between two complementary observables for a two-state system (a qubit). A partially entangled two-photon state was used to perform such measurements. Each of the photons carries (partial) information about the initial state, thus leaving room for measurements of two complementary observables on every member of an ensemble.

  13. Dead time effect on the Brewer measurements: correction and estimated uncertainties

    OpenAIRE

    Fountoulakis, Ilias; Redondas, Alberto; Bais, Alkiviadis F.; Rodriguez-Franco, Juan José; Fragkos, Konstantinos; Cede, Alexander

    2016-01-01

    Brewer spectrophotometers are widely used instruments which perform spectral measurements of the direct, the scattered and the global solar UV irradiance. By processing these measurements a variety of secondary products can be derived such as the total columns of ozone (TOC), sulfur dioxide and nitrogen dioxide and aerosol optical properties. Estimating and limiting the uncertainties of the final products is of critical importance. High-quality data have a lot of applications a...

  14. Survey of radiofrequency radiation levels around GSM base stations and evaluation of measurement uncertainty

    International Nuclear Information System (INIS)

    This paper summarizes broadband measurement values of radiofrequency radiation around GSM base stations in the vicinity of residential areas in Belgrade and 12 other cities in Serbia. It will be useful for determining non-ionizing radiation exposure levels of the general public in the future. The paper also aims to present basic information on the evaluation of measurement uncertainty. (author)

  15. Integrating measuring uncertainty of tactile and optical coordinate measuring machines in the process capability assessment of micro injection moulding

    DEFF Research Database (Denmark)

    Tosello, Guido; Hansen, Hans Nørgaard; Gasparin, Stefania

    Process capability of micro injection moulding was investigated in this paper by calculating the Cp and Cpk statistics. Uncertainty of both optical and tactile measuring systems employed in the quality control of micro injection moulded products was assessed and compared with the specified tolera...

  16. Accuracy and uncertainty in radiochemical measurements. Learning from errors in nuclear analytical chemistry

    International Nuclear Information System (INIS)

    A characteristic that sets radioactivity measurements apart from most spectrometries is that the precision of a single determination can be estimated from Poisson statistics. This easily calculated counting uncertainty permits the detection of other sources of uncertainty by comparing observed with a priori precision. A good way to test the many underlying assumptions in radiochemical measurements is to strive for high accuracy. For example, a measurement by instrumental neutron activation analysis (INAA) of gold film thickness in our laboratory revealed the need for pulse pileup correction even at modest dead times. Recently, the International Organization for Standardization (ISO) and other international bodies have formalized the quantitative determination and statement of uncertainty, so that the weaknesses of each measurement are exposed for improvement. In the INAA certification measurement of ion-implanted arsenic in silicon (Standard Reference Material 2134), we recently achieved an expanded (95% confidence) relative uncertainty of 0.38% for 90 ng of arsenic per sample. A complete quantitative error analysis was performed. This measurement meets the CCQM definition of a primary ratio method. (author)
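
    The comparison of observed with a priori (Poisson) precision described above amounts to a chi-square test on replicate counts. A minimal sketch with synthetic replicates (all numbers made up):

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(13)

      # Replicate counts of the same source; a priori sigma = sqrt(mean counts)
      counts = rng.poisson(10_000, size=20)
      mean, s_obs = counts.mean(), counts.std(ddof=1)
      s_poisson = np.sqrt(mean)

      # Chi-square test of observed vs a priori variance; excess scatter hints
      # at uncorrected effects such as pulse pileup or source instability
      chi2 = (len(counts) - 1) * s_obs**2 / mean
      p = stats.chi2.sf(chi2, df=len(counts) - 1)
      print(f"observed/a-priori sigma = {s_obs / s_poisson:.2f}, p = {p:.2f}")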

  17. Adaptive method for quantifying uncertainty in discharge measurements using velocity-area method.

    Science.gov (United States)

    Despax, Aurélien; Favre, Anne-Catherine; Belleville, Arnaud

    2015-04-01

    Streamflow information provided by hydrometric services such as EDF-DTG allows real-time monitoring of rivers, streamflow forecasting, major hydrological studies and engineering design. In open channels, the traditional approach to measuring flow uses a rating curve, which is an indirect method to estimate the discharge in rivers based on water level and point discharge measurements. A large proportion of these discharge measurements are performed using the velocity-area method; it consists in integrating flow velocities and depths through the cross-section [1]. The velocity field is estimated by choosing a number m of verticals, distributed across the river, where the vertical velocity profile is sampled by a current-meter at ni different depths. Uncertainties coming from several sources are related to the measurement process. To date, the framework for assessing uncertainty in velocity-area discharge measurements is the method presented in the ISO 748 standard [2], which follows the GUM [3] approach. The equation for the combined uncertainty in the measured discharge u(Q), at the 68% level of confidence, proposed by the ISO 748 standard can be written as

    u^2(Q) = u_s^2 + u_m^2 + \frac{\sum_i q_i^2 \left[ u^2(B_i) + u^2(D_i) + u_p^2(V_i) + \frac{1}{n_i}\left( u_c^2(V_i) + u_{exp}^2(V_i) \right) \right]}{\left( \sum_i q_i \right)^2}

    where q_i is the partial discharge in vertical i, B_i, D_i and V_i are the corresponding width, depth and mean velocity, u_m accounts for the limited number of verticals and u_s for systematic (calibration) uncertainties.
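
    Such a budget is straightforward to evaluate once the per-vertical components are known. The Python sketch below transcribes the reconstructed formula with placeholder values for u_s and u_m (ISO 748 tabulates these, e.g. as functions of the number of verticals); all numbers are illustrative.

      import numpy as np

      def discharge_uncertainty(q, u_B, u_D, u_p, u_c, u_exp, n, u_s=0.01, u_m=0.025):
          """Combined relative uncertainty of a velocity-area gauging, following
          the structure of the ISO 748 formula quoted above. All inputs are
          relative standard uncertainties; n is the number of depths per vertical."""
          q = np.asarray(q, dtype=float)
          w = (q / q.sum())**2
          per_vertical = u_B**2 + u_D**2 + u_p**2 + (u_c**2 + u_exp**2) / n
          return np.sqrt(u_s**2 + u_m**2 + np.sum(w * per_vertical))

      # Six verticals with made-up partial discharges and component values
      q = [2.0, 5.0, 8.0, 7.0, 4.0, 1.5]
      u = discharge_uncertainty(q, u_B=0.005, u_D=0.005, u_p=0.01,
                                u_c=0.01, u_exp=0.015, n=3)
      print(f"u(Q) ~ {100 * u:.1f} % (68 % level)")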

  18. Analysis of the Uncertainty in Wind Measurements from the Atmospheric Radiation Measurement Doppler Lidar during XPIA: Field Campaign Report

    Energy Technology Data Exchange (ETDEWEB)

    Newsom, Rob [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-03-01

    In March and April of 2015, the ARM Doppler lidar that was formerly operated at the Tropical Western Pacific site in Darwin, Australia (S/N 0710-08) was deployed to the Boulder Atmospheric Observatory (BAO) for the eXperimental Planetary boundary-layer Instrument Assessment (XPIA) field campaign. The goal of the XPIA field campaign was to investigate methods of using multiple Doppler lidars to obtain high-resolution three-dimensional measurements of winds and turbulence in the atmospheric boundary layer, and to characterize the uncertainties in these measurements. The ARM Doppler lidar was one of many Doppler lidar systems that participated in this study. During XPIA the 300-m tower at the BAO site was instrumented with well-calibrated sonic anemometers at six levels. These sonic anemometers provided highly accurate reference measurements against which the lidars could be compared. Thus, the deployment of the ARM Doppler lidar during XPIA offered a rare opportunity for the ARM program to characterize the uncertainties in their lidar wind measurements. Results of the lidar-tower comparison indicate that the lidar wind speed measurements are essentially unbiased (~1cm s-1), with a random error of approximately 50 cm s-1. Two methods of uncertainty estimation were tested. The first method was found to produce uncertainties that were too low. The second method produced estimates that were more accurate and better indicators of data quality. As of December 2015, the first method is being used by the ARM Doppler lidar wind value-added product (VAP). One outcome of this work will be to update this VAP to use the second method for uncertainty estimation.

  19. A new approach to handle additive and multiplicative uncertainties in the measurement for H∞ LPV filtering

    Science.gov (United States)

    Lacerda, Márcio J.; Tognetti, Eduardo S.; Oliveira, Ricardo C. L. F.; Peres, Pedro L. D.

    2016-04-01

    This paper presents a general framework to cope with full-order H∞ linear parameter-varying (LPV) filter design subject to inexactly measured parameters. The main novelty is the ability to handle additive and multiplicative uncertainties in the measurements, for both continuous- and discrete-time LPV systems, in a unified approach. By conveniently modelling the scheduling parameters and the uncertainties affecting the measurements, the H∞ filter design problem can be expressed in terms of robust matrix inequalities that become linear when two scalar parameters are fixed. Therefore, the proposed conditions can be efficiently solved through linear matrix inequality relaxations based on polynomial solutions. Numerical examples are presented to illustrate the improved efficiency of the proposed approach when compared to other methods and, more importantly, its capability to deal with scenarios where the available strategies in the literature cannot be used.

  20. Uncertainty Quantification for Nuclear Density Functional Theory and Information Content of New Measurements

    CERN Document Server

    McDonnell, J D; Higdon, D; Sarich, J; Wild, S M; Nazarewicz, W

    2015-01-01

    Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models; to estimate model errors and thereby improve predictive capability; to extrapolate beyond the regions reached by experiment; and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, w...

  1. Approach to establishing safety margin for uncertainty in measurement and nuclide spectrum in clearance level inspection

    International Nuclear Information System (INIS)

    In the monitoring for compliance with the clearance level, the concentrations of the objective nuclides, such as alpha or low-energy beta emitters, can be estimated without direct gamma measurement by assuming the existence of the objective nuclides at geometric mean concentrations or by using previously assessed information on nuclide spectra together with measurement results for a key gamma nuclide. To determine whether clearance can be carried out, the uncertainty in the mean concentrations and in the concentration ratios to the key gamma nuclide should be appropriately considered, in addition to the measurement uncertainty. In this work, the concept of the clearance level has been reconsidered and a new approach to establishing an appropriate safety factor for the monitoring for compliance with the clearance level has been proposed. This approach was adopted in the draft of the standard 'Monitoring for Compliance with Clearance Level' prepared by the Standards Committee (SC) of the Atomic Energy Society of Japan (AESJ). (author)

  2. A technique for improved stability of adaptive feedforward controllers without detailed uncertainty measurements

    International Nuclear Information System (INIS)

    Model errors in adaptive controllers for the reduction of broadband noise and vibrations may lead to unstable systems or increased error signals. Previous research on active structures with small damping has shown that the addition of a low-authority controller which increases damping in the system may lead to improved performance of an adaptive, high-authority controller. Other researchers have suggested the use of frequency dependent regularization based on measured uncertainties. In this paper an alternative method is presented that avoids the disadvantages of these methods, namely the additional complex hardware and the need to obtain detailed information on the uncertainties. An analysis is made of an adaptive feedforward controller in which a difference exists between the secondary path and the model as used in the controller. The real parts of the eigenvalues that determine the stability of the system are expressed in terms of the amount of uncertainty and the singular values of the secondary path. Modifications of the feedforward control scheme are suggested that aim to improve performance without requiring detailed uncertainty measurements. (paper)

  3. Attributes identification of nuclear material by non-destructive radiation measurement methods

    International Nuclear Information System (INIS)

    Full text: Nuclear materials should be controlled under the regulations of the National Safeguards System. Non-destructive analysis, which is simple and quick, provides an effective process for determining nuclear materials, nuclear scraps and wastes. The method plays a very important role in the fields of nuclear material control and physical protection against the illegal removal and smuggling of nuclear material. The application of non-destructive analysis to the attribute identification of nuclear material is briefly described in this paper. The attributes determined by radiation detection techniques are useful tools for identifying the characteristics of special nuclear material (isotopic composition, enrichment, etc.). (author)

  4. Uncertainty analysis of gross primary production partitioned from net ecosystem exchange measurements

    Directory of Open Access Journals (Sweden)

    R. Raj

    2015-08-01

    Full Text Available Gross primary production (GPP), separated from flux tower measurements of net ecosystem exchange (NEE) of CO2, is used increasingly to validate process-based simulators and remote sensing-derived estimates of simulated GPP at various time steps. Proper validation should include the uncertainty associated with this separation at different time steps. This can be achieved by using a Bayesian framework. In this study, we estimated the uncertainty in GPP at half-hourly time steps. We used a non-rectangular hyperbola (NRH) model to separate GPP from flux tower measurements of NEE at the Speulderbos forest site, The Netherlands. The NRH model included the variables that influence GPP, in particular radiation and temperature. In addition, the NRH model provided a robust empirical relationship between radiation and GPP by including the degree of curvature of the light response curve. Parameters of the NRH model were fitted to the measured NEE data for every 10-day period during the growing season (April to October) in 2009. Adopting a Bayesian approach, we defined the prior distribution of each NRH parameter. Markov chain Monte Carlo (MCMC) simulation was used to update the prior distribution of each NRH parameter. This allowed us to estimate the uncertainty in the separated GPP at half-hourly time steps. This yielded the posterior distribution of GPP at each half hour and allowed the quantification of uncertainty. The time series of posterior distributions thus obtained allowed us to estimate the uncertainty at daily time steps. We compared informative with non-informative prior distributions of the NRH parameters. The results showed that both choices of prior produced similar posterior distributions of GPP. This will provide relevant and important information for the validation of process-based simulators in the future. Furthermore, the obtained posterior distributions of NEE and the NRH parameters are of interest for a range of applications.
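
    A minimal sketch of the fitting step is given below, assuming a standard NRH parameterization GPP = (alpha*I + beta - sqrt((alpha*I + beta)^2 - 4*theta*alpha*beta*I)) / (2*theta), box-uniform priors, a known noise level and a random-walk Metropolis sampler. The paper's actual priors, noise model and MCMC setup are not reproduced here; all numbers are synthetic.

      import numpy as np

      rng = np.random.default_rng(17)

      def nrh_gpp(I, alpha, beta, theta):
          # Non-rectangular hyperbola light response; theta in (0, 1) is the
          # degree of curvature, alpha the initial slope, beta the asymptote
          s = alpha * I + beta
          return (s - np.sqrt(s**2 - 4.0 * theta * alpha * beta * I)) / (2.0 * theta)

      # Synthetic half-hourly data: NEE = respiration - GPP + noise
      I = rng.uniform(0, 1500, 300)   # radiation (PAR)
      true = dict(alpha=0.05, beta=20.0, theta=0.7, resp=3.0)
      nee = (true["resp"] - nrh_gpp(I, true["alpha"], true["beta"], true["theta"])
             + rng.normal(0, 1.0, I.size))

      def log_post(p):
          a, b, th, r = p
          if not (0 < a < 1 and 0 < b < 60 and 0 < th < 1 and 0 < r < 20):
              return -np.inf                       # flat priors on a box
          resid = nee - (r - nrh_gpp(I, a, b, th))
          return -0.5 * np.sum(resid**2)           # sigma = 1 assumed known

      # Random-walk Metropolis
      p = np.array([0.03, 15.0, 0.5, 2.0]); lp = log_post(p)
      step = np.array([0.005, 1.0, 0.05, 0.2]); chain = []
      for _ in range(20_000):
          q = p + step * rng.normal(size=4); lq = log_post(q)
          if np.log(rng.uniform()) < lq - lp:
              p, lp = q, lq
          chain.append(p)
      chain = np.array(chain)[5000:]               # discard burn-in

      # Posterior of GPP at a given radiation level, with uncertainty
      gpp = nrh_gpp(1000.0, chain[:, 0], chain[:, 1], chain[:, 2])
      print(f"GPP(I=1000): {gpp.mean():.1f} +/- {gpp.std():.1f}")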

  5. Uncertainty Factors for Stage-Specific and Cumulative Results of Indirect Measurements

    CERN Document Server

    Datta, B P

    2009-01-01

    Evaluation of a variable Yd from certain measured variable(s) Xi(s), by making use of their system-specific relationship (SSR), is generally referred to as indirect measurement. Naturally, the SSR may stand for a simple data-translation process in one case, but for a set of equations, or even a cascade of different such processes, in another. Further, though the measurements are a priori ensured to be accurate, there is no definite method for examining whether the result obtained at the end of an SSR, specifically a cascade of SSRs, is really as representative as the measured Xi-values. Of course, it was recently shown that the uncertainty (ed) in the estimate (yd) of a specified Yd is given by a specified linear combination of the corresponding measurement uncertainties (uis). Here, further insight into this principle is provided by its application to cases represented by cascade-SSRs. It is exemplified how the different stage-wise uncertainties (Ied, IIed, ... ed), that is to say the requirements for t...

  6. Uncertainty quantification for nuclear density functional theory and information content of new measurements

    Energy Technology Data Exchange (ETDEWEB)

    McDonnell, J. D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Schunck, N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Higdon, D. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Sarich, J. [Argonne National Lab. (ANL), Argonne, IL (United States); Wild, S. M. [Argonne National Lab. (ANL), Argonne, IL (United States); Nazarewicz, W. [Michigan State Univ., East Lansing, MI (United States); Oak Ridge National Lab., Oak Ridge, TN (United States); Univ. of Warsaw, Warsaw (Poland)

    2015-03-24

    Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. As a result, the example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.

  7. Uncertainty quantification for nuclear density functional theory and information content of new measurements.

    Science.gov (United States)

    McDonnell, J D; Schunck, N; Higdon, D; Sarich, J; Wild, S M; Nazarewicz, W

    2015-03-27

    Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. The example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method. PMID:25860736

  8. Uncertainties in acoustical transfer functions : modeling, measurement and derivation of parameters for airborne and structure-borne sound

    OpenAIRE

    Dietrich, Pascal

    2013-01-01

    Measured transfer functions of acoustic systems are often used to derive single-number parameters. The uncertainty analysis is commonly focused on the derived parameters but not on the transfer function as the primary quantity. This thesis presents an approach to assess the uncertainty contributions in these transfer functions by using analytic models. Uncertainties caused by the measurement method are analyzed with a focus on the underlying signal processing. In particular, the influence...

  9. Uncertainty estimation of measurement result about working platinum-rhodium10 and platinum thermocouple of level II

    International Nuclear Information System (INIS)

    This paper analyzes and evaluates the uncertainty of the measurement result of a working platinum-rhodium10/platinum thermocouple of level II by the double-pole method. It presents the expanded uncertainty of the thermo-voltage and the corresponding temperature span of the measurement result at a confidence level p = 0.95. The estimation shows that the temperature corresponding to the expanded uncertainty of the thermocouple is between 0.6 degree C and 1.9 degree C. (author)

  10. Evaluation of the combined measurement uncertainty in isotope dilution by MC-ICP-MS

    Energy Technology Data Exchange (ETDEWEB)

    Fortunato, G.; Wunderli, S. [Metrology in Chemistry, Swiss Federal Laboratories for Materials Testing and Research (EMPA), Lerchenfeldstrasse 5, 9014 St. Gallen (Switzerland)

    2003-09-01

    The combination of metrological weighing, the measurement of isotope amount ratios by a multicollector inductively coupled plasma mass spectrometer (MC-ICP-MS) and the use of high-purity reference materials are the cornerstones to achieve improved results for the amount content of lead in wine by the reversed isotope dilution technique. Isotope dilution mass spectrometry (IDMS) and reversed IDMS have the potential to be a so-called primary method, with which close comparability and well-stated combined measurement uncertainties can be obtained. This work describes the detailed uncertainty budget determination using the ISO-GUM approach. The traces of lead in wine were separated from the matrix by ion exchange chromatography after HNO3/H2O2 microwave digestion. The thallium isotope amount ratio (n(205Tl)/n(203Tl)) was used to correct for mass discrimination using an exponential model approach. The corrected lead isotope amount ratio n(206Pb)/n(208Pb) for the isotopic standard SRM 981 measured in our laboratory was compared with ratio values considered to be the least uncertain. The result has been compared in a so-called pilot study 'lead in wine' organised by the CCQM (Comite Consultatif pour la Quantite de Matiere, BIPM, Paris; the highest measurement authority for analytical chemical measurements). The result for the lead amount content k(Pb) and the corresponding expanded uncertainty U given by our laboratory was: k(Pb) = 1.329 x 10^-10 mol g^-1 (amount content of lead in wine); U[k(Pb)] = 1.0 x 10^-12 mol g^-1 (expanded uncertainty U = k x u_c, k = 2). The main influence parameter on the combined measurement uncertainty was determined to be the isotope amount ratio R_{206,B} of the blend between the enriched spike and the sample. (orig.)
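
    The exponential mass-bias model mentioned above is commonly written (standard form from the MC-ICP-MS literature; the paper's exact notation may differ) as

      f = \frac{\ln\left( R_{\mathrm{cert}}^{205/203} / R_{\mathrm{meas}}^{205/203} \right)}{\ln\left( M_{205}/M_{203} \right)},
      \qquad
      R_{\mathrm{corr}}^{206/208} = R_{\mathrm{meas}}^{206/208} \left( \frac{M_{206}}{M_{208}} \right)^{f},

    where the fractionation exponent f is determined from the certified and measured Tl isotope amount ratios and the nuclide masses M, and is then applied to the measured Pb isotope amount ratios.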

  11. Measuring Student Graduateness: Reliability and Construct Validity of the Graduate Skills and Attributes Scale

    Science.gov (United States)

    Coetzee, Melinde

    2014-01-01

    This study reports the development and validation of the Graduate Skills and Attributes Scale which was initially administered to a random sample of 272 third-year-level and postgraduate-level, distance-learning higher education students. The data were analysed using exploratory factor analysis. In a second study, the scale was administered to a…

  12. Attribution of nuclear material by non-destructive radiation measurement methods

    International Nuclear Information System (INIS)

    The paper briefly introduces the foundational principle of non-destructive analysis for the attribution of nuclear material. Its ease of processing and simplicity of analysis mean that this method can provide effective support in the prevention of trafficking and smuggling of nuclear materials. (author)

  13. Estimating the uncertainty in measurement of occupational exposure with personal dosemeters

    International Nuclear Information System (INIS)

    Full text: In the 1990 Recommendations of the ICRP it is stated that an uncertainty of a factor 1.5 in either direction 'will not be unusual' in a dose measured under workplace conditions. In many documents, such as the EU Technical Recommendations, the IAEA Safety Guides and papers in scientific journals, this statement is understood to be a starting point for developing type-test criteria and criteria for the approval of dosimetric systems. And, although meant to refer to effective dose, it is usually understood to refer to personal dose equivalent as well. When using a personal dosemeter, the quantity to be measured, personal dose equivalent, is derived from a number of input quantities, such as the light output of one or more TLDs or the densities of areas on a film, the sensitivity of the detection material, the properties of the evaluating equipment and the dose due to background radiation. Each of these input quantities is the result of a measurement that is inexact, thus resulting in an inexact value for the dose. In cases where the transformation of the raw measurements into the quantity to be estimated (the measurand) involves only linear transformations, and the distributions of the errors in the input quantities are Gaussian or at least symmetrical and Gauss-like, the uncertainty in the measurand can more or less reliably be derived from the uncertainties in the input quantities using the familiar 'general law of error propagation'. In cases where the evaluation involves non-linear transformations and/or the probability distributions of the input quantities are not well approximated by Gaussian distributions, it is far from obvious that the familiar techniques will result in a realistic estimate of the uncertainty in the dose. This paper presents a method using Monte Carlo techniques that does not depend on the validity of the general law of error propagation. The method is based on the 'Guide to the Expression of Uncertainty in Measurement' and the 'Supplement 1: Numerical
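
    A minimal sketch of the Monte Carlo propagation described here, in the spirit of GUM Supplement 1 and with entirely hypothetical dosemeter numbers (the rectangular fading factor provides the non-Gaussian input; the evaluation formula is an illustrative stand-in, not the paper's):

      import numpy as np

      rng = np.random.default_rng(19)
      n = 200_000

      # Hypothetical dosemeter evaluation: Hp = (M - M0) / (s * k_fade)
      M = rng.normal(1250.0, 25.0, n)        # gross reader signal
      M0 = rng.normal(50.0, 10.0, n)         # background signal
      s = rng.normal(1.00, 0.03, n)          # sensitivity (calibration)
      k_fade = rng.uniform(0.85, 1.00, n)    # fading correction, rectangular

      Hp = (M - M0) / (s * k_fade)           # personal dose equivalent, a.u.
      lo, hi = np.percentile(Hp, [2.5, 97.5])
      print(f"Hp = {np.median(Hp):.0f}, 95 % coverage interval [{lo:.0f}, {hi:.0f}]")

    Because the non-linear division and the rectangular input are sampled directly, the resulting coverage interval needs none of the Gaussian assumptions behind the general law of error propagation.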

  14. Estimation of the thermal diffusion coefficient in fusion plasmas taking frequency measurement uncertainties into account

    International Nuclear Information System (INIS)

    In this paper, the estimation of the thermal diffusivity from perturbative experiments in fusion plasmas is discussed. The measurements used to estimate the thermal diffusivity suffer from stochastic noise, and accurate estimation of the thermal diffusivity should take this into account. It will be shown that formulas found in the literature often result in a thermal diffusivity that has a bias (a difference between the estimated value and the actual value that remains even if more measurements are added) or an unnecessarily large uncertainty. This is shown by modeling a plasma using only diffusion as the heat transport mechanism, with measurement noise based on ASDEX Upgrade measurements. The Fourier coefficients of a temperature perturbation will exhibit noise from the circular complex normal distribution (CCND). Based on Fourier coefficients distributed according to a CCND, it is shown that the resulting probability density function of the thermal diffusivity is an inverse non-central chi-squared distribution. The thermal diffusivity found by sampling this distribution will always be biased, and averaging multiple estimated diffusivities will not necessarily improve the estimation. Confidence bounds are constructed to illustrate the uncertainty in the diffusivity using several formulas that are equivalent in the noiseless case. Finally, a different method of averaging, which reduces the uncertainty significantly, is suggested. The methodology is also extended to the case where damping is included, and it is explained how to include cylindrical geometry. (paper)
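
    A toy version of the bias effect described above, assuming a 1-D slab in which the heat-wave amplitude decays as exp(-x*sqrt(omega/(2*alpha))): adding circular complex normal noise to the Fourier coefficients and then averaging the resulting diffusivity estimates leaves a bias, while averaging the coefficients before forming the estimate behaves much better. The geometry and noise level are illustrative, not the paper's ASDEX Upgrade analysis.

    ```python
    # Toy demonstration of the estimation bias discussed in the abstract.
    import numpy as np

    rng = np.random.default_rng(0)
    alpha_true = 2.0          # thermal diffusivity (m^2/s)
    omega = 2 * np.pi * 25.0  # modulation frequency (rad/s)
    dx = 0.05                 # distance between the two positions (m)

    # 1-D heat wave: amplitude decays as exp(-dx * sqrt(omega / (2 alpha))).
    ratio_true = np.exp(-dx * np.sqrt(omega / (2 * alpha_true)))
    A1, A2 = 1.0 + 0j, ratio_true + 0j

    def estimate(a1, a2):
        # Invert the amplitude decay for the diffusivity.
        return omega * dx**2 / (2 * np.log(np.abs(a1 / a2)) ** 2)

    n, sigma = 5000, 0.02
    ccnd = lambda: rng.normal(0, sigma, n) + 1j * rng.normal(0, sigma, n)
    a1, a2 = A1 + ccnd(), A2 + ccnd()   # circular complex normal noise

    per_sample = estimate(a1, a2).mean()          # estimate, then average
    coeff_first = estimate(a1.mean(), a2.mean())  # average, then estimate
    print(f"true {alpha_true:.3f}, averaged estimates {per_sample:.3f}, "
          f"averaged coefficients {coeff_first:.3f}")
    ```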

  15. Uncertainty compensation methods for quantitative hardness measurement of materials using atomic force microscope nanoindentation technique

    International Nuclear Information System (INIS)

    We suggest uncertainty compensation methods for the quantification of nanoscale indentation using atomic force microscopy (AFM). The main error factors in the force–distance curves originate from the difference between the theoretical and real shape of the AFM tip during nanoscale indentation measurements. To compensate for the uncertainties of the tip shape and the misalignment of the loading axis, we applied an enhanced tip geometry function and Y-scanner movement to the AFM measurements. Three different materials, a Si wafer, glass, and Au film, were characterized with these compensation methods. With the compensation applied, the deviation from literature values decreased from 167% to 39% at indentation depths below 100 nm. Applying these compensation methods to thin films will allow more advanced quantitative analysis of hardness measurements with nanoscale indenting AFM. - Highlights: • We suggest uncertainty compensation methods for quantitative hardness measurement. • The main errors during indentation are tip geometry and non-uniform loading. • 3D tip characterization is obtained by an atomic force microscope scan. • The compensation methods perform well in thin films with thickness below 100 nm

  16. Environmental Uncertainty, Performance Measure Variety and Perceived Performance in Icelandic Companies

    DEFF Research Database (Denmark)

    Rikhardsson, Pall; Sigurjonsson, Throstur Olaf; Arnardottir, Audur Arna

    The use of performance measures and performance measurement frameworks has increased significantly in recent years. The type and variety of performance measures in use has been researched in various countries and linked to different variables such as the external environment, performance measurement frameworks, and management characteristics. This paper reports the results of a study carried out at year end 2013 of the use of performance measures by Icelandic companies and the links to perceived environmental uncertainty, management satisfaction with the performance measurement system and the ... hypotheses difficult. Possible explanations of the high number of performance measures in use in Icelandic companies could be the recent period of high environmental turbulence forcing them to adopt more measures focusing on a variety of performance areas. This should be researched further.

  17. Development and application of objective uncertainty measures for nuclear power plant transient analysis [Dissertation 3897]

    Energy Technology Data Exchange (ETDEWEB)

    Vinai, P

    2007-10-15

    For the development, design and licensing of a nuclear power plant (NPP), a sound safety analysis is necessary to study the diverse physical phenomena involved in the system behaviour under operational and transient conditions. Such studies are based on detailed computer simulations. With the progress achieved in computer technology and the greater availability of experimental and plant data, the use of best estimate codes for safety evaluations has gained increasing acceptance. The application of best estimate safety analysis has raised new problems that need to be addressed: it has become more crucial to assess how reliable code predictions are, especially when they need to be compared against safety limits that must not be crossed. It becomes necessary to identify and quantify the various possible sources of uncertainty that affect the reliability of the results. Currently, such uncertainty evaluations are generally based on expert opinion. In the present research, a novel methodology based on a non-parametric statistical approach has been developed for the objective quantification of best-estimate code uncertainties related to the physical models used in the code. The basis is an evaluation of the accuracy of a given physical model, achieved by comparing its predictions with experimental data from an appropriate set of separate-effect tests. The differences between measurements and predictions can be considered stochastically distributed, and thus a statistical approach can be employed. The first step was the development of a procedure for investigating the dependence of a given physical model's accuracy on the experimental conditions. Each separate-effect test effectively provides a random sample of discrepancies between measurements and predictions, corresponding to a location in the state space defined by a certain number of independent system variables. As a consequence, the samples of 'errors', achieved from analysis of the entire

  18. Validation of the Consumer Values versus Perceived Product Attributes Model Measuring the Purchase of Athletic Team Merchandise

    Science.gov (United States)

    Lee, Donghun; Byon, Kevin K.; Schoenstedt, Linda; Johns, Gary; Bussell, Leigh Ann; Choi, Hwansuk

    2012-01-01

    Various consumer values and perceived product attributes trigger consumptive behaviors of athletic team merchandise (Lee, Trail, Kwon, & Anderson, 2011). Likewise, using a principal component analysis technique on a student sample, a measurement scale was proposed that consisted of nine factors affecting the purchase of athletic team merchandise.…

  19. Focused Belief Measures for Uncertainty Quantification in High Performance Semantic Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Joslyn, Cliff A.; Weaver, Jesse R.

    2013-08-13

    In web-scale semantic data analytics there is a great need for methods which aggregate uncertainty claims, on the one hand respecting the information provided as accurately as possible, while on the other still being tractable. Traditional statistical methods are more robust, but represent only distributional, additive uncertainty. Generalized information theory methods, including fuzzy systems and Dempster-Shafer (DS) evidence theory, represent multiple forms of uncertainty, but are computationally and methodologically difficult. We require methods which strike an effective balance between representing the full complexity of uncertainty claims in their interaction and satisfying the needs of both computational complexity and human cognition. Here we build on Jøsang's subjective logic to posit methods in focused belief measures (FBMs), where a full DS structure is focused to a single event. The resulting ternary logical structure is posited to capture the minimal amount of generalized complexity needed at a maximum of computational efficiency. We demonstrate the efficacy of this approach in a web ingest experiment over the 2012 Billion Triple dataset from the Semantic Web Challenge.
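
    The building block referred to above is Jøsang's binomial opinion: belief b, disbelief d and uncertainty u summing to one, plus a base rate a used to project the opinion onto a single probability. The sketch below, including the standard cumulative fusion operator, is our illustration of that structure, not the paper's FBM implementation.

    ```python
    # Binomial opinion from subjective logic: b + d + u = 1; the base rate
    # a distributes u when projecting to a probability. Illustrative only.
    from dataclasses import dataclass

    @dataclass
    class Opinion:
        b: float        # belief
        d: float        # disbelief
        u: float        # uncertainty
        a: float = 0.5  # base rate

        def __post_init__(self):
            assert abs(self.b + self.d + self.u - 1.0) < 1e-9

        def expected(self) -> float:
            # Projected probability: belief plus the base-rate share of u.
            return self.b + self.a * self.u

        def fuse(self, other: "Opinion") -> "Opinion":
            # Cumulative fusion of two independent opinions.
            k = self.u + other.u - self.u * other.u
            return Opinion(b=(self.b * other.u + other.b * self.u) / k,
                           d=(self.d * other.u + other.d * self.u) / k,
                           u=(self.u * other.u) / k,
                           a=self.a)

    s1 = Opinion(b=0.6, d=0.1, u=0.3)
    s2 = Opinion(b=0.2, d=0.4, u=0.4)
    fused = s1.fuse(s2)
    print(fused, fused.expected())
    ```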

  20. Towards a more realistic modelling of the uncertainty on identified mode shapes due to measurement noise

    International Nuclear Information System (INIS)

    Many damage identification methods use the information from mode shapes. In order to test the robustness of these methods, it is common practice to introduce uncertainty on the mode shapes in the form of independent noise at each measured location. In doing so, the potential spatial correlation in the mode shape uncertainty is not taken into account. A better approach consists in adding uncorrelated noise to the time domain responses at each sensor before doing the identification. The spatial correlation resulting from the identification can then be evaluated using the covariance matrices of the identified mode shapes. In this study, we apply this approach to the numerical example of a simply supported beam. Modal identification is performed using stochastic subspace based algorithms developed in the toolbox MACEC. The covariance matrices of the mode shapes show that there is a strong spatial correlation in the mode shape uncertainty. This result shows that adding independent noise directly on the mode shapes is not a very realistic approach to assess the impact of noise on damage identification methods. The approach used to characterize noise uncertainty on mode shape identification is fully general and can be applied to any mode, structure or sensing technology.

  1. Quantification of model uncertainty in aerosol optical thickness retrieval from Ozone Monitoring Instrument (OMI) measurements

    Directory of Open Access Journals (Sweden)

    A. Määttä

    2013-09-01

    Full Text Available We study uncertainty quantification in remote sensing of aerosols in the atmosphere with top-of-the-atmosphere reflectance measurements from the nadir-viewing Ozone Monitoring Instrument (OMI). The focus is on the uncertainty in the selection among pre-calculated aerosol models and on the statistical modelling of the model inadequacies. The aim is to apply statistical methodologies that improve the uncertainty estimates of the aerosol optical thickness (AOT) retrieval by propagating model selection and model error related uncertainties more realistically. We utilise Bayesian model selection and model averaging methods for the model selection problem and use Gaussian processes to model the smooth systematic discrepancies between the modelled and observed reflectance. The systematic model error is learned from an ensemble of operational retrievals. The operational OMI multi-wavelength aerosol retrieval algorithm OMAERO is used for cloud-free, over-land pixels of the OMI instrument with the additional Bayesian model selection and model discrepancy techniques. The method is demonstrated with four examples with different aerosol properties: weakly absorbing aerosols, forest fires over Greece and Russia, and Sahara desert dust. The presented statistical methodology is general; it is not restricted to this particular satellite retrieval application.

  2. A revised uncertainty budget for measuring the Boltzmann constant using the Doppler Broadening Technique on ammonia

    CERN Document Server

    Lemarchand, Cyril; Sow, Papa Lat Tabara; Triki, Meriam; Tokunaga, Sean K; Briaudeau, Stephan; Chardonnet, Christian; Darquié, Benoît; Daussy, Christophe

    2013-01-01

    We report on our on-going effort to measure the Boltzmann constant, kB, using the Doppler Broadening Technique. The main systematic effects affecting the measurement are discussed. A revised error budget is presented in which the global uncertainty on systematic effects is reduced to 2.3 ppm. This corresponds to a reduction of more than one order of magnitude compared to our previous Boltzmann constant measurement. Means to reach a determination of kB at the part per million accuracy level are outlined.

  3. Uncertainty of angular displacement measurement with a MEMS gyroscope integrated in a smartphone

    Science.gov (United States)

    de Campos Porath, Maurício; Dolci, Ricardo

    2015-10-01

    Low-cost inertial sensors have recently gained popularity and are now widely used in electronic devices such as smartphones and tablets. In this paper we present the results of a set of experiments aiming to assess the angular displacement measurement errors of a gyroscope integrated in a recent-model smartphone. The goal is to verify whether these sensors could substitute dedicated electronic inclinometers for the measurement of angular displacement. We estimated a maximum error of 0.3° (sum of expanded uncertainty and maximum absolute bias) for the roll and pitch axes, for measurement times of up to 1 h without referencing.
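
    The 'without referencing' qualifier matters because angular displacement is obtained by integrating the rate signal, so even a small uncompensated bias produces an angle error that grows linearly with time. A minimal sketch with hypothetical noise figures (not the tested smartphone's specification):

    ```python
    # Integrating a biased rate signal: the angle error grows with time.
    # Bias and noise figures are hypothetical, for illustration only.
    import numpy as np

    rng = np.random.default_rng(2)
    dt, t_total = 0.01, 3600.0                # 100 Hz for 1 h
    n = int(t_total / dt)

    bias = 5e-5                               # deg/s, uncompensated bias
    noise_sd = 0.05                           # deg/s, white rate noise
    rate_error = bias + rng.normal(0.0, noise_sd, n)

    angle_error = np.cumsum(rate_error) * dt  # integrated angle error (deg)
    print(f"angle error after 1 h: {angle_error[-1]:.3f} deg")
    # The bias alone contributes bias * t = 0.18 deg after one hour, the
    # same order as the 0.3 deg maximum error quoted in the abstract.
    ```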

  4. The small sample uncertainty aspect in relation to bullwhip effect measurement

    DEFF Research Database (Denmark)

    Nielsen, Erland Hejn

    2009-01-01

    The bullwhip effect as a concept has been known for almost half a century, starting with the Forrester effect. The bullwhip effect is observed in many supply chains, and it is generally accepted as a potential malice. Despite this fact, the bullwhip effect still seems to be first and foremost ... chain under control. This paper will put special emphasis on the unavoidable small-sample uncertainty aspects relating to the measurement or estimation of the bullwhip effect.

  5. Fluid flow estimation with multiscale ensemble filters based on motion measurements under location uncertainty

    OpenAIRE

    Beyou, Sébastien; Corpetti, Thomas; Gorthi, Sai; Mémin, Etienne

    2013-01-01

    This paper proposes a novel multi-scale fluid flow data assimilation approach, which integrates and complements the advantages of a Bayesian sequential assimilation technique, the Weighted Ensemble Kalman filter (WEnKF). The data assimilation proposed in this work incorporates measurements brought by an efficient multiscale stochastic formulation of the well-known Lucas-Kanade (LK) estimator. This estimator has the great advantage of providing uncertainties associated t...

  6. "On Measuring Uncertainty of Small Area Estimators with Higher Order Accuracy"

    OpenAIRE

    Tatsuya Kubokawa

    2010-01-01

    The empirical best linear unbiased predictor (EBLUP) or the empirical Bayes estimator (EB) in the linear mixed model is recognized as useful for small area estimation, because it can increase the estimation precision by using information from related areas. Two measures of uncertainty of EBLUP are the estimate of the mean squared error (MSE) and the confidence interval, which have been studied under second-order accuracy in the literature. This paper provides the general ...

  7. Environmental uncertainty, corporate strategy, performance measurement and the creation of economic value

    OpenAIRE

    Groot, TLCM Tom

    2001-01-01

    Evidence from practice indicates that firms frequently alter their performance measurement systems in order to adapt effectively to dynamic circumstances and to changing corporate strategies. One common element in this development is an increasing uncertainty on the part of corporate managers about the usefulness of accounting information for performance evaluation. The idea is spreading that performance evaluation systems, in order to become effective, should include not only accountin...

  8. Validation and measurement uncertainty estimation in food microbiology: differences between quantitative and qualitative methods

    OpenAIRE

    Vesna Režić Dereani; Marijana Matek Sarić

    2010-01-01

    The aim of this research is to describe quality control procedures, procedures for validation and measurement uncertainty (MU) determination as important elements of quality assurance in a food microbiology laboratory, for both qualitative and quantitative types of analysis. Accreditation is conducted according to the standard ISO 17025:2007, General requirements for the competence of testing and calibration laboratories, which guarantees compliance with standard operating procedures and the tec...

  9. Determination of temperature measurements uncertainties of the heat transport primary system of Embalse nuclear power plant

    International Nuclear Information System (INIS)

    In this work, the systematic errors in temperature measurements in inlet and outlet headers of HTPS coolant channels of Embalse nuclear power plant are evaluated. These uncertainties are necessary for a later evaluation of the channel power maps transferred to the coolant. The power maps calculated in this way are used to compare power distributions using neutronic codes. Therefore, a methodology to correct systematic errors of temperature in outlet feeders and inlet headers is developed in this work. (author)

  10. Measurement and interpolation uncertainties in rainfall maps from cellular communication networks

    Science.gov (United States)

    Rios Gaona, M. F.; Overeem, A.; Leijnse, H.; Uijlenhoet, R.

    2015-08-01

    Accurate measurements of rainfall are important in many hydrological and meteorological applications, for instance, flash-flood early-warning systems, hydraulic structure design, irrigation, weather forecasting, and climate modelling. Whenever possible, link networks measure and store the received power of the electromagnetic signal at regular intervals. The decrease in power can be converted to rainfall intensity, and is largely due to the attenuation by raindrops along the link paths. Such an alternative technique supports the continuing effort to obtain rainfall measurements at higher resolutions in time and space, especially in places where traditional rain gauge networks are scarce or poorly maintained. Rainfall maps from microwave link networks have recently been introduced at country-wide scales. Despite their potential for rainfall estimation at high spatiotemporal resolutions, the uncertainties present in rainfall maps from link networks are not yet fully understood. The aim of this work is to identify and quantify the sources of uncertainty present in rainfall maps interpolated from link rainfall depths. In order to disentangle these sources of uncertainty, we classified them into two categories: (1) those associated with the individual microwave link measurements, i.e. the errors involved in link rainfall retrievals, such as wet antenna attenuation, sampling interval of measurements, wet/dry period classification, dry weather baseline attenuation, quantization of the received power, drop size distribution (DSD), and multi-path propagation; and (2) those associated with mapping, i.e. the combined effect of the interpolation methodology and the spatial density of link measurements. We computed ~ 3500 rainfall maps from real and simulated link rainfall depths for 12 days for the land surface of the Netherlands. Simulated link rainfall depths refer to path-averaged rainfall depths obtained from radar data. The ~ 3500 real and simulated rainfall maps were
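
    The link retrieval in category (1) typically rests on the power-law relation between specific attenuation k (dB/km) and rain rate R (mm/h), k = a*R^b. A minimal single-link sketch follows; the coefficients are frequency- and polarization-dependent, and the values used here (as well as the fixed wet-antenna correction) are illustrative placeholders.

    ```python
    # Path-averaged rain rate from one microwave link, inverting k = a R^b.
    # Coefficients a, b and the wet-antenna term are placeholders; real
    # values depend on frequency, polarization and antenna hardware.
    def rain_rate(p_tx_dbm, p_rx_dbm, baseline_db, length_km,
                  a=0.33, b=1.10, wet_antenna_db=1.5):
        loss_db = p_tx_dbm - p_rx_dbm                     # total path loss
        rain_db = loss_db - baseline_db - wet_antenna_db  # rain-induced part
        if rain_db <= 0.0:
            return 0.0                                    # classified as dry
        k = rain_db / length_km                           # dB/km
        return (k / a) ** (1.0 / b)                       # mm/h

    print(rain_rate(p_tx_dbm=20.0, p_rx_dbm=-35.0,
                    baseline_db=48.0, length_km=3.2))
    ```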

  11. Universally valid reformulation of the Heisenberg uncertainty principle on noise and disturbance in measurement

    CERN Document Server

    Ozawa, M

    2003-01-01

    The Heisenberg uncertainty principle states that the product of the noise in a position measurement and the momentum disturbance caused by that measurement should be no less than the limit set by Planck's constant, hbar/2, as demonstrated by Heisenberg's thought experiment using a gamma-ray microscope. Here I show that this common assumption is false: a universally valid trade-off relation between the noise and the disturbance has an additional correlation term, which is redundant when the intervention brought by the measurement is independent of the measured object, but which allows the noise-disturbance product much below Planck's constant when the intervention is dependent. A model of measuring interaction with dependent intervention shows that Heisenberg's lower bound for the noise-disturbance product is violated even by a nearly nondisturbing, precise position measuring instrument. An experimental implementation is also proposed to realize the above model in the context of optical quadrature measurement ...
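
    The 'additional correlation term' alluded to above appears in Ozawa's universally valid relation, which for a position measurement with noise ε(Q), momentum disturbance η(P) and pre-measurement standard deviations σ(Q), σ(P) reads (as we summarise it):

    ```latex
    % Ozawa's relation: the Heisenberg-type product is supplemented by two
    % correlation terms, so epsilon(Q) eta(P) alone may drop below hbar/2.
    \epsilon(Q)\,\eta(P) + \epsilon(Q)\,\sigma(P) + \sigma(Q)\,\eta(P) \ge \frac{\hbar}{2}
    ```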

  12. Relationship Between Psychosocial Burden of Skin Conditions and Symptoms: Measuring the Attributable Fraction.

    Science.gov (United States)

    Sampogna, Francesca; Tabolli, Stefano; Giannantoni, Patrizia; Paradisi, Andrea; Abeni, Damiano

    2016-01-01

    Skin conditions often have a severe impact on the physical and psychosocial domains of patients' quality of life, but the relationship between these domains has been studied little. This study estimated the fraction of psychosocial burden that may be attributable to symptoms, using the Skindex-17 quality of life questionnaire (symptoms and psychosocial scales) in 2,487 outpatients. The excess proportion of psychosocial burden for each skin condition was computed. Overall, 79.8% of the psychosocial burden of patients with severe symptoms may be attributable to the symptoms. For patients with mild symptoms this figure is 49.7%. A great heterogeneity was observed, from -0.9% for patients with scars, up to more than 90% for conditions such as lichen planus and psoriasis. While these results will have to be confirmed in longitudinal studies, they seem to indicate that, by targeting specific symptoms, a substantial portion of the psychosocial burden of skin diseases could be spared. PMID:25766753

  13. The concordance of directly and indirectly measured built environment attributes and physical activity adoption

    OpenAIRE

    O'Connor Daniel P; Medina Ashley; Mama Scherezade K; McAlexander Kristen M; Lee Rebecca E

    2011-01-01

    Background Physical activity (PA) adoption is essential for obesity prevention and control, yet ethnic minority women report lower levels of PA and are at higher risk for obesity and its comorbidities compared to Caucasians. Epidemiological studies and ecologic models of health behavior suggest that built environmental factors are associated with health behaviors like PA, but few studies have examined the association between built environment attribute concordance and PA, and no known studie...

  14. Assessment of adaptation measures to high-mountain risks in Switzerland under climate uncertainties

    Science.gov (United States)

    Muccione, Veruska; Lontzek, Thomas; Huggel, Christian; Ott, Philipp; Salzmann, Nadine

    2015-04-01

    The economic evaluation of different adaptation options is important to support policy-makers who need to set priorities in the decision-making process. However, the decision-making process faces considerable uncertainties regarding current and projected climate impacts. First, physical climate and related impact systems are highly complex and not fully understood. Second, the further we look into the future, the more important the emission pathways become, with effects on the frequency and severity of climate impacts. Decisions on adaptation measures taken today and in the future must be able to adequately consider the uncertainties originating from the different sources. Decisions are not taken in a vacuum but always in the context of specific social, economic, institutional and political conditions. Decision-finding processes strongly depend on the socio-political system and usually have evolved over some time. Finding and taking decisions in the respective socio-political and economic context multiplies the uncertainty challenge. Our presumption is that a sound assessment of the different adaptation options in Switzerland under uncertainty necessitates formulating and solving a dynamic, stochastic optimization problem. Economic optimization models in the field of climate change are not new. Typically, such models are applied to global-scale studies but barely to local-scale problems. In this analysis, we considered the case of the Guttannen-Grimsel Valley, situated in the Swiss Bernese Alps. The alpine community has been affected by high-magnitude, high-frequency debris flows that started in 2009 and were historically unprecedented. They were related to thaw of permafrost in the rock slopes of Ritzlihorn and repeated rock fall events that accumulated at the debris fan, forming a sediment source for debris flows that were transported downvalley. An important transit road, a trans-European gas pipeline and settlements were severely affected and partly

  15. Evaluating uncertainty estimates in hydrologic models: borrowing measures from the forecast verification community

    Directory of Open Access Journals (Sweden)

    K. J. Franz

    2011-03-01

    Full Text Available The hydrologic community is generally moving towards the use of probabilistic estimates of streamflow, primarily through the implementation of Ensemble Streamflow Prediction (ESP) systems, ensemble data assimilation methods, or multi-modeling platforms. However, evaluation of probabilistic outputs has not necessarily kept pace with ensemble generation. Much of the modeling community is still performing model evaluation using standard deterministic measures, such as error, correlation, or bias, typically applied to the ensemble mean or median. Probabilistic forecast verification methods have been well developed, particularly in the atmospheric sciences, yet few have been adopted for evaluating uncertainty estimates in hydrologic model simulations. In the current paper, we overview existing probabilistic forecast verification methods and apply the methods to evaluate and compare model ensembles produced from different parameter uncertainty estimation methods. The Generalized Likelihood Uncertainty Estimation (GLUE) method, a modified version of GLUE, and the Shuffle Complex Evolution Metropolis (SCEM) are used to generate model ensembles for the National Weather Service SACramento Soil Moisture Accounting (SAC-SMA) model for 12 forecast basins located in the Southeastern United States. We evaluate the model ensembles using relevant metrics in the following categories: distribution, correlation, accuracy, conditional statistics, and categorical statistics. We show that the probabilistic metrics are easily adapted to model simulation ensembles and provide a robust analysis of parameter uncertainty, one that is commensurate with the dimension of the ensembles themselves. Application of these methods requires no information in addition to what is already available as part of traditional model validation methodology and considers the entire ensemble or uncertainty range in the approach.
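
    One verification measure of the kind borrowed here from the atmospheric sciences is the continuous ranked probability score (CRPS), which scores a whole ensemble against an observation and has the empirical form CRPS = E|X - y| - 0.5 E|X - X'|. A minimal sketch with synthetic data (not the paper's SAC-SMA ensembles):

    ```python
    # Empirical CRPS of an ensemble against a single observation.
    import numpy as np

    def crps_ensemble(members: np.ndarray, obs: float) -> float:
        term1 = np.mean(np.abs(members - obs))
        term2 = 0.5 * np.mean(np.abs(members[:, None] - members[None, :]))
        return term1 - term2  # lower is better; 0 for a perfect point mass

    rng = np.random.default_rng(3)
    ensemble = rng.normal(10.0, 2.0, 50)  # synthetic streamflow (m^3/s)
    print(crps_ensemble(ensemble, obs=11.2))
    ```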

  16. Accounting for uncertainty in volumes of seabed change measured with repeat multibeam sonar surveys

    Science.gov (United States)

    Schimel, Alexandre C. G.; Ierodiaconou, Daniel; Hulands, Lachlan; Kennedy, David M.

    2015-12-01

    Seafloors of unconsolidated sediment are highly dynamic features, eroding or accumulating under the action of tides, waves and currents. Assessing which areas of the seafloor experienced change and measuring the corresponding volumes involved provide insights into these important active sedimentation processes. Computing the difference between Digital Elevation Models (DEMs) obtained from repeat Multibeam Echosounder (MBES) surveys has become a common technique to identify these areas, but the uncertainty in these datasets considerably affects the estimation of the volumes displaced. The two main techniques used to take uncertainty into account in volume estimations are the limitation of calculations to areas experiencing a change in depth beyond a chosen threshold, and the computation of volumetric confidence intervals. However, these techniques are still in their infancy and, as a result, are often crude, seldom used or poorly understood. In this article, we explored a number of possible methodological advances to address this issue, including: (1) using the uncertainty information provided by the MBES data processing algorithm CUBE, (2) adapting fluvial geomorphology techniques for volume calculations using spatially variable thresholds and (3) volumetric histograms. The nearshore seabed off Warrnambool harbour - located on the highly energetic southwest Victorian coast, Australia - was used as a test site. Four consecutive MBES surveys were carried out over a four-month period. The difference between consecutive DEMs revealed an area near the beach experiencing large sediment transfers - mostly erosion - and an area of reef experiencing increasing deposition from the advance of a nearby sediment sheet. The volumes of sediment displaced in these two areas were calculated using the techniques described above, both traditionally and using the suggested improvements. We compared the results and discussed the applicability of the new methodological improvements
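
    The two standard techniques named above can be sketched in a few lines: propagate the two surveys' vertical uncertainties into the DEM of difference, threshold at a minimum level of detection, and attach a confidence interval to the remaining volume under the assumption of independent per-cell errors. All grid values and uncertainties below are synthetic.

    ```python
    # Thresholded DEM-of-difference (DoD) volume with a 95% confidence
    # interval. Grid, patch and uncertainties are synthetic illustrations.
    import numpy as np

    rng = np.random.default_rng(4)
    cell_area = 1.0                        # m^2
    dod = rng.normal(0.0, 0.05, (200, 200))
    dod[60:90, 60:90] += 0.8               # a patch of real deposition (m)

    sigma1, sigma2 = 0.04, 0.05            # per-survey vertical sigma (m)
    sigma_dod = np.hypot(sigma1, sigma2)   # propagated DoD uncertainty
    lod = 1.96 * sigma_dod                 # 95% minimum level of detection

    mask = np.abs(dod) > lod
    volume = dod[mask].sum() * cell_area
    # Volumetric confidence interval, assuming independent cell errors.
    vol_ci = 1.96 * sigma_dod * cell_area * np.sqrt(mask.sum())
    print(f"net volume {volume:.1f} m^3 +/- {vol_ci:.1f} m^3 (95%)")
    ```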

  17. Understanding uncertainty

    CERN Document Server

    Lindley, Dennis V

    2013-01-01

    Praise for the First Edition: "...a reference for everyone who is interested in knowing and handling uncertainty." - Journal of Applied Statistics. The critically acclaimed First Edition of Understanding Uncertainty provided a study of uncertainty addressed to scholars in all fields, showing that uncertainty could be measured by probability, and that probability obeyed three basic rules that enabled uncertainty to be handled sensibly in everyday life. These ideas were extended to embrace the scientific method and to show how decisions, containing an uncertain element, could be rationally made.

  18. A Method for Dimensional and Surface Optical Measurements Uncertainty Assessment on Micro Structured Surfaces Manufactured by Jet-ECM

    DEFF Research Database (Denmark)

    Quagliotti, Danilo; Tosello, Guido; Islam, Aminul;

    2015-01-01

    Surface texture and step height measurements of electrochemically machined cavities have been compared among optical and tactile instruments. A procedure is introduced for correcting possible divergences among the instruments and, ultimately, for evaluating the measurement uncertainty according to...

  19. Improving the uncertainty on short-term radon measurements using PADC detector

    International Nuclear Information System (INIS)

    Radon measurements over a short-term period of a few days have proven a popular choice with the general public, despite the fact that the radon concentration can vary significantly over time and longer periods of integration are recommended. Short-term radon measurements using a Poly Allyl Diglycol Carbonate (PADC) detector see a larger contribution from the statistical error associated with the measurements than longer-term measurements do. This motivated an investigation into improving the uncertainty on short-term measurements by utilising a new formulation of high-sensitivity PADC, and also by examining the effect of increasing the scan area and extending the measurement time by just a few days. (authors)

  20. Experimental Test of Heisenberg's Measurement Uncertainty Relation Based on Statistical Distances

    Science.gov (United States)

    Ma, Wenchao; Ma, Zhihao; Wang, Hengyan; Chen, Zhihua; Liu, Ying; Kong, Fei; Li, Zhaokai; Peng, Xinhua; Shi, Mingjun; Shi, Fazhan; Fei, Shao-Ming; Du, Jiangfeng

    2016-04-01

    Incompatible observables can be approximated by compatible observables in joint measurement or measured sequentially, with constrained accuracy as implied by Heisenberg's original formulation of the uncertainty principle. Recently, Busch, Lahti, and Werner proposed inaccuracy trade-off relations based on statistical distances between probability distributions of measurement outcomes [P. Busch et al., Phys. Rev. Lett. 111, 160405 (2013); P. Busch et al., Phys. Rev. A 89, 012129 (2014)]. Here we reformulate their theoretical framework, derive an improved relation for qubit measurement, and perform an experimental test on a spin system. The relation reveals that the worst-case inaccuracy is tightly bounded from below by the incompatibility of target observables, and is verified by the experiment employing joint measurement in which two compatible observables designed to approximate two incompatible observables on one qubit are measured simultaneously.

  1. Comparability and uncertainty of shape measurements with white-light interferometers

    Science.gov (United States)

    Boedecker, S.; Bauer, W.; Krüger-Sehm, R.; Lehmann, P. H.; Rembe, C.

    2010-05-01

    We discuss how the results obtained from a white-light interferometer can be compared to tactile measurements. The core idea for achieving comparability is to determine a short cut-off wavelength up to which the spatial frequency components of the surface topography are measured with less than 3 dB attenuation. We demonstrate for different interferometers that the data have to be filtered to achieve a linear transfer characteristic, which allows the short cut-off wavelength to be defined. In addition, we present investigations of the error sources in shape measurements that we have identified. Results of our work are influencing a VDI/VDE calibration guideline for shape measurements which is currently under development. We show in this paper how the procedure developed for the guideline can be applied to real measurement devices. Uncertainty contributions to the error budget are also discussed, and measurements on shape standards are presented.

  2. Uncertainty analysis on the design of thermal conductivity measurement by a guarded cut-bar technique

    International Nuclear Information System (INIS)

    A technique adapted from the guarded-comparative-longitudinal heat flow method was selected for the measurement of the thermal conductivity of a nuclear fuel compact over a temperature range characteristic of its usage. This technique fulfills the requirement for non-destructive measurement of the composite compact. Although numerous measurement systems have been created based on the guarded-comparative method, comprehensive systematic (bias) and measurement (precision) uncertainty associated with this technique have not been fully analyzed. In addition to the geometric effect in the bias error, which has been analyzed previously, this paper studies the working condition which is another potential error source. Using finite element analysis, this study showed the effect of these two types of error sources in the thermal conductivity measurement process and the limitations in the design selection of various parameters by considering their effect on the precision error. The results and conclusions provide valuable reference for designing and operating an experimental measurement system using this technique

  3. Uncertainty Analysis on the Design of Thermal Conductivity Measurement by a Guarded Cut-Bar Technique

    International Nuclear Information System (INIS)

    A technique adapted from the guarded-comparative-longitudinal heat flow method was selected for the measurement of the thermal conductivity of a nuclear fuel compact over a temperature range characteristic of its usage. This technique fulfills the requirement for non-destructive measurement of the composite compact. Although numerous measurement systems have been created based on the guarded comparative method, comprehensive systematic (bias) and measurement (precision) uncertainty associated with this technique have not been fully analyzed. In addition to the geometric effect in the bias error, which has been analyzed previously, this paper studies the working condition which is another potential error source. Using finite element analysis, this study showed the effect of these two types of error sources in the thermal conductivity measurement process and the limitations in the design selection of various parameters by considering their effect on the precision error. The results and conclusions provide valuable reference for designing and operating an experimental measurement system using this technique.

  4. Impact of Measurement Uncertainties on Receptor Modeling of Speciated Atmospheric Mercury

    Science.gov (United States)

    Cheng, I.; Zhang, L.; Xu, X.

    2016-01-01

    Gaseous oxidized mercury (GOM) and particle-bound mercury (PBM) measurement uncertainties could potentially affect the analysis and modeling of atmospheric mercury. This study investigated the impact of GOM measurement uncertainties on Principal Components Analysis (PCA), Absolute Principal Component Scores (APCS), and Concentration-Weighted Trajectory (CWT) receptor modeling results. The atmospheric mercury data input into these receptor models were modified by combining GOM and PBM into a single reactive mercury (RM) parameter and by excluding low GOM measurements to improve the data quality. PCA and APCS results derived from RM or from excluding low GOM measurements were similar to those in previous studies, except for a non-unique component and an additional component extracted from the RM dataset. The percent variance explained by the major components from a previous study differed slightly compared to RM and to excluding low GOM measurements. CWT results were more sensitive to the input of RM than to GOM excluding low measurements. Larger discrepancies were found between RM and GOM source regions than between RM and PBM. Depending on the season, CWT source regions of RM differed by 40–61% compared to GOM from a previous study. No improvement in correlations between CWT results and anthropogenic mercury emissions was found. PMID:26857835

  5. Improving parton distribution uncertainties in a W mass measurement at the LHC

    CERN Document Server

    Sullivan, Zack

    2015-01-01

    We reexamine the dominant contribution of parton distribution function (PDF) uncertainties to the W mass measurement, and determine their contribution to be ±39 (30) MeV when running the Large Hadron Collider at 7 (13) TeV. We find that spurious correlations in older PDF sets led to over-optimistic assumptions regarding normalization to Z observables. In order to understand the origin of the large uncertainties we break down the contribution of the PDF errors into effects at the hard matrix element level, in showering, and in sensitivity to finite detector resolutions. Using CT10, CT10W, and charm-enhanced PDF sets in comparison to older PDF sets, we develop a robust analysis that examines correlations between transverse mass reconstructions of W and Z decays (scaled by cos θ_W) to leptons. We find that central leptons (|η_l| < 1.3) from W and Z bosons carry the most weight in reducing the PDF uncertainty, and estimate that a PDF error of +10/-12 MeV is achievable in a W mass measurement at the LHC. Fur...

  6. Evaluation of uncertainty in gravity wave potential energy calculations through GPS radio occultation measurements

    Science.gov (United States)

    Luna, D.; Alexander, P.; de la Torre, A.

    2013-09-01

    The application of the Global Positioning System (GPS) radio occultation (RO) method to the atmosphere enables the determination of height profiles of temperature, among other variables. From these measurements, gravity wave activity is usually quantified by calculating the potential energy through the integration of the ratio of perturbation and background temperatures between two given altitudes in each profile. The uncertainty in the estimation of wave activity depends on the systematic biases and random errors of the measured temperature, but also on additional factors like the selected vertical integration layer and the separation method between background and perturbation temperatures. In this study, the contributions of different parameters and variables to the uncertainty in the calculation of gravity wave potential energy in the lower stratosphere are investigated and quantified. In particular, a Monte Carlo method is used to evaluate the uncertainty that results from different GPS RO temperature error distributions. In addition, our analysis shows that RO data above 30 km height becomes dubious for gravity waves potential energy calculations.
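
    The quantity being propagated is the gravity wave potential energy per unit mass, commonly computed as Ep = 0.5 (g/N)^2 <(T'/Tbar)^2>, with N the buoyancy frequency. A sketch on a synthetic profile follows; a real study would use GPS RO temperatures and a careful background/perturbation separation.

    ```python
    # Gravity wave potential energy from a (synthetic) temperature profile.
    import numpy as np

    g, cp = 9.81, 1004.0                        # m/s^2, J/(kg K)
    z = np.linspace(20e3, 30e3, 201)            # integration layer (m)
    T_bg = 210.0 + 2e-3 * (z - 20e3)            # background temperature (K)
    T_pert = 1.5 * np.sin(2 * np.pi * z / 3e3)  # wave perturbation (K)

    # Brunt-Vaisala frequency squared from the background profile.
    N2 = (g / T_bg) * (np.gradient(T_bg, z) + g / cp)
    Ep = 0.5 * (g**2 / N2.mean()) * np.mean((T_pert / T_bg) ** 2)
    print(f"Ep = {Ep:.2f} J/kg")
    ```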

  7. Monte Carlo method for calculating oxygen abundances and their uncertainties from strong-line flux measurements

    Science.gov (United States)

    Bianco, F. B.; Modjaz, M.; Oh, S. M.; Fierroz, D.; Liu, Y. Q.; Kewley, L.; Graur, O.

    2016-07-01

    We present the open-source Python code pyMCZ that determines oxygen abundance and its distribution from strong emission lines in the standard metallicity calibrators, based on the original IDL code of Kewley and Dopita (2002) with updates from Kewley and Ellison (2008), and expanded to include more recently developed calibrators. The standard strong-line diagnostics have been used to estimate the oxygen abundance in the interstellar medium through various emission line ratios (referred to as indicators) in many areas of astrophysics, including galaxy evolution and supernova host galaxy studies. We introduce a Python implementation of these methods that, through Monte Carlo sampling, better characterizes the statistical oxygen abundance confidence region, including the effect due to the propagation of observational uncertainties. These uncertainties are likely to dominate the error budget in the case of distant galaxies, hosts of cosmic explosions. Given line flux measurements and their uncertainties, our code produces synthetic distributions for the oxygen abundance in up to 15 metallicity calibrators simultaneously, as well as for E(B-V), and estimates their median values and their 68% confidence regions. We provide the option of outputting the full Monte Carlo distributions and their kernel density estimates. We test our code on emission line measurements from a sample of nearby supernova host galaxies (z ...). The code is available at https://github.com/nyusngroup/pyMCZ.
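
    The core idea, sketched here with a single calibrator rather than pyMCZ itself: perturb the measured line fluxes within their uncertainties and accumulate the resulting abundance distribution. The sketch uses the N2 indicator of Pettini and Pagel (2004), 12 + log(O/H) = 8.90 + 0.57 log(N2); the fluxes are made-up example values.

    ```python
    # Monte Carlo oxygen abundance from noisy line fluxes (one calibrator).
    import numpy as np

    rng = np.random.default_rng(5)
    n_mc = 100_000

    # Measured fluxes and 1-sigma uncertainties (made-up, erg/s/cm^2).
    f_nii, df_nii = 1.2e-16, 0.1e-16   # [N II] 6584
    f_ha, df_ha = 4.0e-16, 0.2e-16     # H-alpha

    nii = rng.normal(f_nii, df_nii, n_mc)
    ha = rng.normal(f_ha, df_ha, n_mc)
    ok = (nii > 0) & (ha > 0)          # discard unphysical draws

    # N2 calibration of Pettini & Pagel (2004).
    oh = 8.90 + 0.57 * np.log10(nii[ok] / ha[ok])
    lo, med, hi = np.percentile(oh, [16, 50, 84])
    print(f"12+log(O/H) = {med:.3f} (+{hi - med:.3f} / -{med - lo:.3f})")
    ```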

  8. Modelling and Measurement Uncertainty Estimation for Integrated AFM-CMM Instrument

    DEFF Research Database (Denmark)

    Hansen, Hans Nørgaard; Bariani, Paolo; De Chiffre, Leonardo

    2005-01-01

    This paper describes modelling of an integrated AFM - CMM instrument, its calibration, and estimation of measurement uncertainty. Positioning errors were seen to limit the instrument performance. Software for off-line stitching of single AFM scans was developed and verified, which allows compensation of such errors. A geometrical model of the instrument was produced, describing the interaction between AFM and CMM systematic errors. The model parameters were quantified through calibration, and the model used for establishing an optimised measurement procedure for surface mapping. A maximum...

  9. Error Analysis and Measurement Uncertainty for a Fiber Grating Strain-Temperature Sensor

    OpenAIRE

    Jian-Neng Wang; Jaw-Luen Tang

    2010-01-01

    A fiber grating sensor capable of distinguishing between temperature and strain, using a reference and a dual-wavelength fiber Bragg grating, is presented. Error analysis and measurement uncertainty for this sensor are studied theoretically and experimentally. The root mean squared measurement errors for temperature T and strain ε were estimated to be 0.13 °C and 6 με, respectively. The maximum errors for temperature and strain were calculated as 0.00155 T + 2.90 × 10−6 ε and 3.59 × 10−5 ε + 0.0...

  10. Robust framework for PET image reconstruction incorporating system and measurement uncertainties.

    Directory of Open Access Journals (Sweden)

    Huafeng Liu

    Full Text Available In Positron Emission Tomography (PET), an optimal estimate of the radioactivity concentration is obtained from the measured emission data under certain criteria. So far, all the well-known statistical reconstruction algorithms require an exactly known system probability matrix a priori, and the quality of such a system model largely determines the quality of the reconstructed images. In this paper, we propose an algorithm for PET image reconstruction for the real-world case where the PET system model is subject to uncertainties. The method casts PET reconstruction as a regularization problem, and the image estimation is achieved by means of an uncertainty weighted least squares framework. The performance of our work is evaluated with Shepp-Logan simulated and real phantom data, which demonstrates significant improvements in image quality over least squares reconstruction efforts.
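
    A toy version of the uncertainty weighted least squares idea: solve min_x (y - Ax)^T W (y - Ax) + lam ||x||^2, where the weights W down-weight less trusted measurements. The tiny random system below merely stands in for a real PET system matrix; it is not the authors' algorithm.

    ```python
    # Uncertainty-weighted, regularized least squares on a toy system.
    import numpy as np

    rng = np.random.default_rng(6)
    n_pix, n_det = 16, 40
    A = rng.random((n_det, n_pix))           # surrogate system matrix
    x_true = rng.random(n_pix)
    sigma = 0.05 + 0.1 * rng.random(n_det)   # per-measurement uncertainty
    y = A @ x_true + rng.normal(0.0, sigma)

    W = np.diag(1.0 / sigma**2)              # inverse-variance weights
    lam = 1e-2                               # Tikhonov regularization
    x_hat = np.linalg.solve(A.T @ W @ A + lam * np.eye(n_pix), A.T @ W @ y)
    rel_err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
    print(f"relative reconstruction error: {rel_err:.3f}")
    ```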

  11. The GUM revision: the Bayesian view toward the expression of measurement uncertainty

    Science.gov (United States)

    Lira, I.

    2016-03-01

    The ‘Guide to the Expression of Uncertainty in Measurement’ (GUM) has been in use for more than 20 years, serving its purposes worldwide at all levels of metrology, from scientific to industrial and commercial applications. However, the GUM presents some inconsistencies, both internally and with respect to its two later Supplements. For this reason, the Joint Committee for Guides in Metrology, which is responsible for these documents, has decided that a major revision of the GUM is needed. This will be done by following the principles of Bayesian statistics, a concise summary of which is presented in this article. Those principles should be useful in physics and engineering laboratory courses that teach the fundamentals of data analysis and measurement uncertainty evaluation.
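
    The Bayesian machinery the revision builds on can be shown in its simplest conjugate form: a Gaussian prior for the measurand combined with Gaussian readings yields a Gaussian posterior with precision-weighted mean. A classroom illustration, not text from the revised GUM:

    ```python
    # Conjugate Gaussian update for a measurand (known reading uncertainty).
    import numpy as np

    mu0, s0 = 100.0, 0.50          # prior from a previous calibration
    obs = np.array([100.32, 100.29, 100.41])
    s_obs = 0.10                   # standard uncertainty per reading

    prec_post = 1 / s0**2 + len(obs) / s_obs**2
    mu_post = (mu0 / s0**2 + obs.sum() / s_obs**2) / prec_post
    s_post = prec_post ** -0.5     # posterior standard uncertainty
    print(f"posterior: {mu_post:.3f} +/- {s_post:.3f} (k = 1)")
    ```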

  12. Integration of rain gauge measurement errors with the overall rainfall uncertainty estimation using kriging methods

    Science.gov (United States)

    Cecinati, Francesca; Moreno Ródenas, Antonio Manuel; Rico-Ramirez, Miguel Angel; ten Veldhuis, Marie-claire; Han, Dawei

    2016-04-01

    In many research studies rain gauges are used as a reference point measurement for rainfall, because they can reach very good accuracy, especially compared to radar or microwave links, and their use is very widespread. In some applications rain gauge uncertainty is assumed to be small enough to be neglected. This can be done when rain gauges are accurate and their data is correctly managed. Unfortunately, in many operational networks the importance of accurate rainfall data and of data quality control can be underestimated; budget and best practice knowledge can be limiting factors in a correct rain gauge network management. In these cases, the accuracy of rain gauges can drastically drop and the uncertainty associated with the measurements cannot be neglected. This work proposes an approach based on three different kriging methods to integrate rain gauge measurement errors in the overall rainfall uncertainty estimation. In particular, rainfall products of different complexity are derived through 1) block kriging on a single rain gauge 2) ordinary kriging on a network of different rain gauges 3) kriging with external drift to integrate all the available rain gauges with radar rainfall information. The study area is the Eindhoven catchment, contributing to the river Dommel, in the southern part of the Netherlands. The area, 590 km2, is covered by high quality rain gauge measurements by the Royal Netherlands Meteorological Institute (KNMI), which has one rain gauge inside the study area and six around it, and by lower quality rain gauge measurements by the Dommel Water Board and by the Eindhoven Municipality (six rain gauges in total). The integration of the rain gauge measurement error is accomplished in all the cases increasing the nugget of the semivariogram proportionally to the estimated error. Using different semivariogram models for the different networks allows for the separate characterisation of higher and lower quality rain gauges. For the kriging with

  13. Optimal measurement uncertainties for materials accounting in a fast breeder reactor spent-fuel reprocessing plant

    International Nuclear Information System (INIS)

    Optimization techniques are used to calculate measurement uncertainties for materials accountability instruments in a fast breeder reactor spent-fuel reprocessing plant. Optimal measurement uncertainties are calculated so that performance goals for detecting materials loss are achieved while minimizing the total instrument development cost. Improved materials accounting in the chemical separations process (111 kg Pu/day) to meet 8-kg plutonium abrupt (1 day) and 40-kg plutonium protracted (6 months) loss-detection goals requires: process tank volume and concentration measurements having precisions less than or equal to 1%; accountability and plutonium sample tank volume measurements having precisions less than or equal to 0.3%, short-term correlated errors less than or equal to 0.04%, and long-term correlated errors less than or equal to 0.04%; and accountability and plutonium sample tank concentration measurements having precisions less than or equal to 0.4%, short-term correlated errors less than or equal to 0.1%, and long-term correlated errors less than or equal to 0.05%

  14. Final report on uncertainties in the detection, measurement, and analysis of selected features pertinent to deep geologic repositories

    Energy Technology Data Exchange (ETDEWEB)

    1978-07-10

    Uncertainties with regard to many facets of repository site characterization have not yet been quantified. This report summarizes the state of knowledge of uncertainties in the measurement of porosity, hydraulic conductivity, and hydraulic gradient; uncertainties associated with various geophysical field techniques; and uncertainties associated with the effects of exploration and exploitation activities in bedded salt basins. The potential for seepage through a depository in bedded salt or shale is reviewed and, based upon the available data, generic values for the hydraulic conductivity and porosity of bedded salt and shale are proposed.

  15. Final report on uncertainties in the detection, measurement, and analysis of selected features pertinent to deep geologic repositories

    International Nuclear Information System (INIS)

    Uncertainties with regard to many facets of repository site characterization have not yet been quantified. This report summarizes the state of knowledge of uncertainties in the measurement of porosity, hydraulic conductivity, and hydraulic gradient; uncertainties associated with various geophysical field techniques; and uncertainties associated with the effects of exploration and exploitation activities in bedded salt basins. The potential for seepage through a depository in bedded salt or shale is reviewed and, based upon the available data, generic values for the hydraulic conductivity and porosity of bedded salt and shale are proposed

  16. Retrievals and uncertainty analysis of aerosol single scattering albedo from MFRSR measurements

    International Nuclear Information System (INIS)

    Aerosol single scattering albedo (SSA) can be retrieved from the ratio of diffuse horizontal and direct normal fluxes measured by a multifilter rotating shadowband radiometer (MFRSR). In this study, the measurement channels at 415 nm and 870 nm are selected for aerosol optical depth (AOD) and Angstrom coefficient retrievals, and the measurements at 415 nm are used for aerosol SSA retrievals with the constraint of the retrieved Angstrom coefficient. We extensively assessed various issues affecting the accuracy of SSA retrieval, from measurements to input parameters and assumptions. For cloud-free days with mean aerosol loading of 0.13–0.60, our sensitivity study indicated that: (1) 1% calibration uncertainty can result in 0.8–3.7% changes in retrieved SSA; (2) neglecting the cosine response correction and/or the forward scattering correction will result in underestimation of 1.1–3.3% and/or 0.73% in retrieved SSA; (3) an overestimation of 0.1 in asymmetry factor can result in an underestimation of 2.54–3.4% in retrieved SSA; (4) for small aerosol loading (e.g., 0.13), the uncertainty associated with the choice of Rayleigh optical depth value can result in a non-negligible change in retrieved SSA (e.g., 0.015); (5) an uncertainty of 0.05 in surface albedo can result in changes of 1.49–5.4% in retrieved SSA. We applied the retrieval algorithm to the MFRSR measurements at the Atmospheric Radiation Measurement (ARM) Southern Great Plains (SGP) site. The retrieved results for AOD, Angstrom coefficient, and SSA are basically consistent with other independent measurements from co-located instruments at the site. - Highlights: • Aerosol SSA is derived from the MFRSR-measured diffuse to direct normal irradiance ratio. • We extensively assessed various issues impacting the accuracy of SSA retrieval. • The issues are mainly from measurements and model input parameters and assumptions. • We applied the retrieval algorithm to the MFRSR measurements at ARM SGP

  17. A computer simulation platform for the estimation of measurement uncertainties in dimensional X-ray computed tomography

    DEFF Research Database (Denmark)

    Hiller, Jochen; Reindl, Leonard M

    2012-01-01

    The knowledge of measurement uncertainty is of great importance in conformance testing in production. The tolerance limit for production must be reduced by the amount of measurement uncertainty to ensure that the parts are in fact within the tolerance. Over the last 5 years, industrial X-ray computed tomography (CT) has become an important technology for dimensional quality control. In this paper a computer simulation platform is presented which is able to investigate error sources in dimensional CT measurements. The typical workflow in industrial CT metrology is described and methods for estimating measurement uncertainties are briefly discussed. As we will show, the developed virtual CT (VCT) simulator can be adapted to various scanner systems, providing realistic CT data. Using the Monte Carlo method (MCM), measurement uncertainties for a given measuring task can be estimated, taking into...

  18. Uncertainty analysis of computational methods for deriving sensible heat flux values from scintillometer measurements

    Directory of Open Access Journals (Sweden)

    P. A. Solignac

    2009-11-01

    Full Text Available The use of scintillometers to determine sensible heat fluxes is now common in studies of land-atmosphere interactions. The main interest in these instruments is due to their ability to quantify energy distributions at the landscape scale, as they can calculate sensible heat flux values over long distances, in contrast to Eddy Covariance systems. However, scintillometer data do not provide a direct measure of sensible heat flux, but require additional data, such as the Bowen ratio (β), to provide flux values. The Bowen ratio can either be measured using Eddy Covariance systems or derived from the energy balance closure. In this work, specific requirements for estimating energy fluxes using a scintillometer were analyzed, as well as the accuracy of two flux calculation methods. We first focused on the classical method (used in standard software) and analysed the impact of the Bowen ratio on flux value and uncertainty. For instance, an averaged Bowen ratio (β) of less than 1 proved to be a significant source of measurement uncertainty. An alternative method, called the "β-closure method", for which the Bowen ratio measurement is not necessary, was also tested. In this case, it was observed that even for low β values, flux uncertainties were reduced and scintillometer data were well correlated with the Eddy Covariance results. Besides, both methods should tend to the same results, but the second one slightly underestimates H as β decreases (<5%).

  19. Sensitivity measures for optimal mitigation of risk and reduction of model uncertainty

    International Nuclear Information System (INIS)

    This paper presents a new set of reliability sensitivity measures. The purpose is to identify the optimal manner in which to mitigate risk to civil infrastructure, and reduce model uncertainty in order to improve risk estimates. Three measures are presented. One identifies the infrastructure components that should be prioritized for retrofit. Another measure identifies the infrastructure that should be prioritized for more refined modeling. The third measure identifies the models that should be prioritized in research to improve models, for example by gathering new data. The developments are presented in the context of a region with 622 buildings that are subjected to seismicity from several sources. A comprehensive seismic risk analysis of this region is conducted, with over 300 random variables, 30 model types, and 4000 model instances. All models are probabilistic and emphasis is placed on the explicit characterization of epistemic uncertainty. For the considered region, the buildings that should first be retrofitted are found to be pre-code unreinforced masonry buildings. Conversely, concrete shear wall buildings rank highest on the list of buildings that should be subjected to more detailed modeling. The ground shaking intensity model for shallow crustal earthquakes and the concrete shear wall structural response model rank highest on the list of models that should be prioritized by research to improve engineering analysis models. -- Highlights: • Three new sensitivity measures are presented to guide the allocation of resources. • The first measure prioritizes infrastructure for retrofit in order to mitigate risk. • The second measure prioritizes probabilistic models for more detailed modeling. • The third measure prioritizes model types for improvement by data gathering. • The measures are showcased by a regional seismic risk analysis of 622 buildings

  20. Assessing the uncertainties of climate policies and mitigation measures. Viewpoints on biofuel production, grid electricity consumption and differentiation of emission reduction commitments

    Energy Technology Data Exchange (ETDEWEB)

    Soimakallio, S.

    2012-08-15

    Ambitious climate change mitigation requires the implementation of effective and equitable climate policy and GHG emission reduction measures. The objective of this study was to explore the significance of the uncertainties related to GHG emission reduction measures and policies by providing viewpoints on biofuels production, grid electricity consumption and differentiation of emission reduction commitments between countries and country groups. Life cycle assessment (LCA) and macro-level scenario analysis through top-down and bottom-up modelling and cost-effectiveness analysis (CEA) were used as methods. The uncertainties were propagated in a statistical way through parameter variation, scenario analysis and stochastic modelling. This study showed that, in determining GHG emissions at product or process level, there are significant uncertainties due to parameters such as nitrous oxide emissions from soil, soil carbon changes and emissions from electricity production; and due to methodological choices related to the spatial and temporal system boundary setting and selection of allocation methods. Furthermore, the uncertainties due to modelling may be of central importance. For example, when accounting for biomass-based carbon emissions to and sequestration from the atmosphere, consideration of the temporal dimension is critical. The outcomes in differentiation of GHG emission reduction commitments between countries and country groups are critically influenced by the quality of data and criteria applied. In both LCA and effort sharing, the major issues are equitable attribution of emissions and emission allowances on the one hand and capturing consequences of measures and policies on the other. As LCA and system level top-down and bottom-up modelling results are increasingly used to justify various decisions by different stakeholders such as policy-makers and consumers, harmonization of practices, transparency and the handling of uncertainties related to

  1. Assessing the uncertainties of climate policies and mitigation measures. Viewpoints on biofuel production, grid electricity consumption and differentiation of emission reduction commitments

    Energy Technology Data Exchange (ETDEWEB)

    Soimakallio, S.

    2012-11-01

    Ambitious climate change mitigation requires the implementation of effective and equitable climate policy and GHG emission reduction measures. The objective of this study was to explore the significance of the uncertainties related to GHG emission reduction measures and policies by providing viewpoints on biofuels production, grid electricity consumption and differentiation of emission reduction commitments between countries and country groups. Life cycle assessment (LCA) and macro-level scenario analysis through top-down and bottom-up modelling and cost-effectiveness analysis (CEA) were used as methods. The uncertainties were propagated in a statistical way through parameter variation, scenario analysis and stochastic modelling. This study showed that, in determining GHG emissions at product or process level, there are significant uncertainties due to parameters such as nitrous oxide emissions from soil, soil carbon changes and emissions from electricity production; and due to methodological choices related to the spatial and temporal system boundary setting and selection of allocation methods. Furthermore, the uncertainties due to modelling may be of central importance. For example, when accounting for biomass-based carbon emissions to and sequestration from the atmosphere, consideration of the temporal dimension is critical. The outcomes in differentiation of GHG emission reduction commitments between countries and country groups are critically influenced by the quality of data and criteria applied. In both LCA and effort sharing, the major issues are equitable attribution of emissions and emission allowances on the one hand and capturing consequences of measures and policies on the other. As LCA and system level top-down and bottom-up modelling results are increasingly used to justify various decisions by different stakeholders such as policy-makers and consumers, harmonization of practices, transparency and the handling of uncertainties related to

  2. Welfare and Market Impacts of Food Safety Measures in China: Results from Urban Consumers’ Valuation of Product Attributes

    Institute of Scientific and Technical Information of China (English)

    David L.Ortega; H.Holly Wang; Nicole J.Olynk Widmar

    2014-01-01

    This study provides an economic assessment of various food safety measures in China. A choice experiment approach is used to elicit Chinese consumer preferences for various food safety attributes using data from a 2008 urban consumer survey. An alternative welfare calculation is used to model aggregate market impacts of select food safety measures. Our results show that the largest welfare gains are found in the current government-run certification program. The implementation of a third-party certification system, a traceability network and a product label would generate significant value and would help reduce current system inefficiencies in China. This study builds on previous research and provides an alternative approach for calculating consumer valuation of safety and quality attributes that can be used to estimate aggregate economic and welfare impacts.
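
    For context, willingness to pay (WTP) in a choice experiment is usually recovered from conditional logit coefficients as the negative ratio of an attribute coefficient to the price coefficient; the sketch below assumes that standard approach, and every coefficient value is a hypothetical placeholder, not an estimate from the paper.

        # WTP_attr = -beta_attr / beta_price (marginal rate of substitution).
        beta_price = -0.042   # hypothetical marginal utility of price (per yuan)
        betas = {"government certification": 0.85,
                 "third-party certification": 0.64,
                 "traceability network": 0.58,
                 "product label": 0.21}   # hypothetical attribute coefficients
        wtp = {attr: -b / beta_price for attr, b in betas.items()}
        for attr, value in sorted(wtp.items(), key=lambda kv: -kv[1]):
            print(f"{attr}: {value:.1f} yuan")

    Aggregate welfare impacts of the kind reported in the study then follow from scaling such per-consumer WTP values up to the affected market population.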

  3. Uncertainty analysis of computational methods for deriving sensible heat flux values from scintillometer measurements

    Directory of Open Access Journals (Sweden)

    P. A. Solignac

    2009-06-01

    Full Text Available The use of scintillometers to determine sensible heat fluxes is now common in studies of land-atmosphere interactions. The main interest in these instruments is due to their ability to quantify energy distributions at the landscape scale, as they can calculate sensible heat flux values over long distances, in contrast to Eddy Correlation systems. However, scintillometer data do not provide a direct measure of sensible heat flux, but require additional data, such as the Bowen ratio (β), to provide flux values. The Bowen ratio can either be measured using Eddy Correlation systems or derived from the energy balance closure. In this work, specific requirements for estimating energy fluxes using a scintillometer were analyzed, as well as the accuracy of two flux calculation methods. We first focused on the classical method (used in standard software). We analysed the impact of the Bowen ratio according to both time averaging and ratio values; for instance, an averaged Bowen ratio (β) of less than 1 proved to be a significant source of measurement uncertainty. An alternative method, called the "β-closure method", for which the Bowen ratio measurement is not necessary, was also tested. In this case, it was observed that even for low β values, flux uncertainties were reduced and scintillometer data were well correlated with the Eddy Correlation results.

  4. Research into Uncertainty in Measurement of Seawater Chemical Oxygen Demand by Potassium Iodide-Alkaline Potassium Permanganate Determination Method.

    OpenAIRE

    Zhang, Shiqiang; Guo, Changsong

    2007-01-01

    Glucose and L-glutamic acid were used in a 1:1 ratio to prepare the standard substance, and artificial seawater and the standard substance were used to prepare a series of standard solutions. The distribution pattern of the uncertainty in the measurement of seawater COD was obtained from the measured results of this series of standard solutions by the potassium iodide-alkaline potassium permanganate determination method. The distribution pattern is as follows: uncertainty in measurement is...

  5. Using Drell-Yan forward-backward asymmetry to reduce PDF uncertainties in the measurement of electroweak parameters

    International Nuclear Information System (INIS)

    The uncertainties in parton distribution functions (PDFs) are the dominant source of the systematic uncertainty in precision measurements of electroweak parameters at hadron colliders (e.g. sin2θeff(MZ), sin2θW = 1-MW2/MZ2 and the mass of the W boson). We show that measurements of the forward-backward charge asymmetry (AFB(M, y)) of Drell-Yan dilepton events produced at hadron colliders provide a new powerful tool to reduce the PDF uncertainties in these measurements. (orig.)
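
    As a reminder, the standard definition behind the asymmetry (not restated in the record) integrates the dilepton angular distribution over the two cos θ* hemispheres:

        \[
        A_{FB}(M,y) = \frac{\sigma_F - \sigma_B}{\sigma_F + \sigma_B},
        \qquad
        \sigma_{F} = \int_{0}^{1} \frac{d\sigma}{d\cos\theta^{*}}\,d\cos\theta^{*},
        \quad
        \sigma_{B} = \int_{-1}^{0} \frac{d\sigma}{d\cos\theta^{*}}\,d\cos\theta^{*},
        \]

    with the on-shell mixing angle entering through \( \sin^{2}\theta_{W} = 1 - M_{W}^{2}/M_{Z}^{2} \). Since AFB(M, y) depends on the quark and antiquark momentum fractions through the dilepton mass and rapidity, fitting it constrains the PDFs themselves.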

  6. International target values 2000 for measurement uncertainties in safeguarding nuclear materials

    International Nuclear Information System (INIS)

    The IAEA has prepared a revised and updated version of International Target Values (ITVs) for uncertainty components in measurements of nuclear material. The ITVs represent uncertainties to be considered in judging the reliability of analytical techniques applied to industrial nuclear and fissile material subject to safeguards verification. The tabulated values represent estimates of the 'state of the practice' which ought to be achievable under routine conditions by adequately equipped, experienced laboratories. The ITVs 2000 are intended to be used by plant operators and safeguards organizations as a reference of the quality of measurements achievable in nuclear material accountancy, and for planning purposes. The IAEA prepared a draft of a technical report presenting the proposed ITVs 2000, and in April 2000 the chairmen or officers of the panels or organizations listed below were invited to co-author the report and to submit the draft to a discussion by their panels and organizations: Euratom Safeguards Inspectorate, ESARDA Working Group on Destructive Analysis, ESARDA Working Group on Non Destructive Analysis, Institute of Nuclear Material Management, Japanese Expert Group on ITV-2000, ISO Working Group on Analyses in Spent Fuel Reprocessing, ISO Working Group on Analyses in Uranium Fuel Fabrication, ISO Working Group on Analyses in MOX Fuel Fabrication, Agencia Brasileno-Argentina de Contabilidad y Control de Materiales Nucleares (ABACC). Comments from the above groups were received and incorporated into the final version of the document, completed in April 2001. The ITVs 2000 represent target standard uncertainties, expressing the precision achievable under stipulated conditions. These conditions typically fall in one of the two following categories: 'repeatability conditions' normally encountered during the measurements done within one inspection period; or 'reproducibility conditions' involving additional sources of measurement variability such as

  7. MM98.52 - An industrial comparison of coordinate measuring machines in Scandinavia with focus on uncertainty statements

    DEFF Research Database (Denmark)

    Hansen, Hans Nørgaard; Chiffre, Leonardo De

    1999-01-01

    This paper describes an industrial comparison of coordinate measuring machines (CMMs) carried out in the Scandinavian countries from October 1994 to May 1996. Fifty-nine industrial companies with a total of 62 CMMs participated in the project and measured a comparison package with five items chosen to represent a variety of dimensions, angles, and other geometrical quantities. A tool holder, two gauge blocks, a straightedge, and a ring together with instructions on how to measure the items were produced and sent to each participant. Simple measurement tasks were observed to be carried out with... An important part of the intercomparison was to test the ability of the participants to determine measurement uncertainties. One of the uncertainties was based upon a "best guess" but nevertheless, many participants did not even report this uncertainty. Uncertainty budgeting was not used for...

  8. Changes in Handset Performance Measures due to Spherical Radiation Pattern Measurement Uncertainty

    DEFF Research Database (Denmark)

    Nielsen, Jesper Ødum; Pedersen, Gert Frølund

    ...), and mean effective gain (MEG) can be computed. Often this kind of measurement is made with a phantom head next to the handset in order to simulate the influence of a real user. The measured radiation patterns are only expected to be repeatable if the same setup is used, i.e., the same phantom and... measurement system that may introduce errors in standardized performance measurements. Radiation patterns of six handsets have been measured while they were mounted at various offsets from the reference position defined by the Cellular Telecommunications & Internet Association (CTIA) certification. The change...

  9. Weak Anderson localisation in reverberation rooms and its effect on the uncertainty of sound power measurements

    DEFF Research Database (Denmark)

    Jacobsen, Finn

    2011-01-01

    The effect known as ‘weak Anderson localisation’, ‘coherent backscattering’ or ‘enhanced backscattering’ is a physical phenomenon that occurs in random systems, e.g., disordered media and linear wave systems, including reverberation rooms: the mean square response is increased at the drive point... In a reverberation room this means that one can expect an increase of the reverberant sound field at the position of the source that generates the sound field. This affects the sound power output of the source and is therefore of practical concern. However, because of the stronger direct sound field... implications for the uncertainty of sound power measurements.

  10. Determination of Al in cake mix: Method validation and estimation of measurement uncertainty

    Science.gov (United States)

    Andrade, G.; Rocha, O.; Junqueira, R.

    2016-07-01

    An analytical method for the determination of Al in cake mix was developed. Acceptable values were obtained for the following parameters: linearity, detection limit - LOD (5.00 mg kg-1), quantification limit - LOQ (12.5 mg kg-1), recovery assay values (between 91 and 102%), the relative standard deviation under repeatability and within-reproducibility conditions (<20.0%), and measurement uncertainty (<10.0%). The results of the validation process showed that the proposed method is fit for purpose.
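
    A small sketch of how validation figures of this kind combine into an expanded uncertainty under the usual GUM treatment; the component names and values below are assumptions for illustration, not the study's actual budget.

        import math

        # Hypothetical relative standard-uncertainty components.
        components = {"repeatability": 0.035, "recovery": 0.025, "calibration": 0.015}
        u_c = math.sqrt(sum(u ** 2 for u in components.values()))  # combined (~4.6 %)
        U = 2.0 * u_c   # expanded uncertainty, coverage factor k = 2 (~95 % level)
        print(f"u_c = {u_c:.3f}, U = {U:.3f}")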

  11. Measurement uncertainties in quantifying aeolian mass flux: evidence from wind tunnel and field site data

    Directory of Open Access Journals (Sweden)

    Ate Poortinga

    2014-07-01

    Full Text Available Aeolian sediment traps are widely used to estimate the total volume of wind-driven sediment transport, but also to study the vertical mass distribution of a saltating sand cloud. The reliability of sediment flux estimates from such measurements depends upon the specific configuration of the measurement compartments and the analysis approach used. In this study, we analyse the uncertainty of these measurements by investigating the vertical cumulative distribution and relative sediment flux derived from both wind tunnel and field studies. Vertical flux data were examined using existing data in combination with a newly acquired dataset comprising meteorological data and sediment fluxes from six different events, using three customized catchers at Ameland beaches in the northern Netherlands. Fast-temporal data collected in a wind tunnel show that the median transport height has a scattered pattern between the impact and fluid thresholds, and that it increases linearly with shear velocity above the fluid threshold. For finer sediment, a larger proportion was transported closer to the surface compared to coarser sediment fractions. It was also shown that errors originating from the distribution of sampling compartments, specifically the location of the lowest sediment trap relative to the surface, can be identified using the relative sediment flux. In the field, surface conditions such as surface moisture, surface crusts or frozen surfaces have a more pronounced but localized effect than shear velocity. Uncertainty in aeolian mass flux estimates can be reduced by placing multiple compartments in closer proximity to the surface.
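
    A hedged sketch of the profile arithmetic behind a "median transport height": if the vertical mass flux is approximated as an exponential decay, q(z) = q0 exp(-z/z̄), the median height is z̄ ln 2. The exponential form is a common approximation in this literature rather than the paper's stated model, and the trap heights and catches below are invented.

        import numpy as np

        # Fit ln q = ln q0 - z / z_bar to compartment catches (illustrative data).
        z = np.array([0.05, 0.10, 0.20, 0.40])      # compartment heights, m
        q = np.array([42.0, 25.0, 9.1, 1.2])        # catch per compartment, g
        slope, intercept = np.polyfit(z, np.log(q), 1)
        z_bar = -1.0 / slope                        # e-folding height of the profile
        z_median = np.log(2.0) * z_bar              # half the flux travels below this
        print(f"z_bar = {z_bar:.3f} m, median transport height = {z_median:.3f} m")

    The same fit also makes clear why the height of the lowest trap matters: with an e-folding height of order 0.1 m, a large share of the total flux passes below a trap mounted only a few centimetres too high.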

  12. International Target Values 2010 for Measurement Uncertainties in Safeguarding Nuclear Materials

    International Nuclear Information System (INIS)

    This issue of the International Target Values (ITVs) represents the sixth revision, following the first release of such tables issued in 1979 by the ESARDA/WGDA. The ITVs are uncertainties to be considered in judging the reliability of analytical techniques applied to industrial nuclear and fissile material, which are subject to safeguards verification. The tabulated values represent estimates of the ‘state of the practice’ which should be achievable under routine measurement conditions. The most recent standard conventions in representing uncertainty have been considered, while maintaining a format that allows comparison with the previous releases of the ITVs. The present report explains why target values are needed, how the concept evolved and how they relate to the operator’s and inspector’s measurement systems. The ITVs-2010 are intended to be used by plant operators and safeguards organizations, as a reference of the quality of measurements achievable in nuclear material accountancy, and for planning purposes. The report suggests that the use of ITVs can be beneficial for statistical inferences regarding the significance of operator-inspector differences whenever valid performance values are not available.

  13. Operationally-motivated uncertainty relations for joint measurability and the error-disturbance tradeoff

    International Nuclear Information System (INIS)

    We derive uncertainty relations for both joint measurability and the error-disturbance tradeoff in terms of the probability of distinguishing the actual operation of a device from its hypothetical ideal. Our relations provide a clear operational interpretation of two main aspects of the uncertainty principle, as originally formulated by Heisenberg. The first restricts the joint measurability of observables, stating that noncommuting observables can only be simultaneously determined with a characteristic amount of indeterminacy. The second describes an error-disturbance tradeoff, noting that the more precise a measurement of one observable is made, the greater the disturbance to noncommuting observables. Our relations are explicitly state-independent and valid for arbitrary observables of discrete quantum systems, and are also applicable to the case of position and momentum observables. They may be directly applied in information processing settings, for example to infer that devices which can faithfully transmit information regarding one observable do not leak information about conjugate observables to the environment. Though intuitively apparent from Heisenberg's original arguments, only limited versions of this statement have previously been formalized.

  14. Adaptive Particle Filter for Nonparametric Estimation with Measurement Uncertainty in Wireless Sensor Networks

    Science.gov (United States)

    Li, Xiaofan; Zhao, Yubin; Zhang, Sha; Fan, Xiaopeng

    2016-01-01

    Particle filters (PFs) are widely used for nonlinear signal processing in wireless sensor networks (WSNs). However, measurement uncertainty makes the WSN observations unreliable with respect to the actual case and also degrades the estimation accuracy of the PFs. Apart from the algorithm design, few works focus on improving the likelihood calculation method, since it can be pre-assumed by a given distribution model. In this paper, we propose a novel PF method, which is based on a new likelihood fusion method for WSNs and can further improve the estimation performance. We firstly use a dynamic Gaussian model to describe the nonparametric features of the measurement uncertainty. Then, we propose a likelihood adaptation method that employs the prior information and a belief factor to reduce the measurement noise. The optimal belief factor is attained by deriving the minimum Kullback–Leibler divergence. The likelihood adaptation method can be integrated into any PF, and we use our method to develop three versions of adaptive PFs for a target tracking system using a wireless sensor network. The simulation and experimental results demonstrate that our likelihood adaptation method greatly improves the estimation performance of PFs in a high-noise environment. In addition, the adaptive PFs are highly adaptable to the environment without imposing computational complexity. PMID:27249002
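
    A minimal sketch of a belief-factor likelihood adaptation in a particle filter: the Gaussian likelihood is tempered by a factor γ before the weight update, so unreliable measurements pull the weights less strongly. The KL-divergence derivation of the optimal γ from the paper is not reproduced here; γ is treated as given, and all names are illustrative.

        import numpy as np

        def update_weights(particles, weights, z, h, sigma, gamma):
            """One PF weight update with a tempered Gaussian likelihood.
            gamma in (0, 1]: smaller values down-weight noisy observations."""
            resid = z - h(particles)                 # innovation per particle
            loglik = -0.5 * (resid / sigma) ** 2     # Gaussian log-likelihood
            w = weights * np.exp(gamma * loglik)
            return w / w.sum()

        # Toy usage: 1-D position particles, identity measurement model.
        rng = np.random.default_rng(0)
        parts = rng.normal(0.0, 1.0, 500)
        w = np.full(500, 1.0 / 500)
        w = update_weights(parts, w, z=0.3, h=lambda x: x, sigma=0.5, gamma=0.7)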

  15. Adaptive Particle Filter for Nonparametric Estimation with Measurement Uncertainty in Wireless Sensor Networks.

    Science.gov (United States)

    Li, Xiaofan; Zhao, Yubin; Zhang, Sha; Fan, Xiaopeng

    2016-01-01

    Particle filters (PFs) are widely used for nonlinear signal processing in wireless sensor networks (WSNs). However, measurement uncertainty makes the WSN observations unreliable with respect to the actual case and also degrades the estimation accuracy of the PFs. Apart from the algorithm design, few works focus on improving the likelihood calculation method, since it can be pre-assumed by a given distribution model. In this paper, we propose a novel PF method, which is based on a new likelihood fusion method for WSNs and can further improve the estimation performance. We firstly use a dynamic Gaussian model to describe the nonparametric features of the measurement uncertainty. Then, we propose a likelihood adaptation method that employs the prior information and a belief factor to reduce the measurement noise. The optimal belief factor is attained by deriving the minimum Kullback-Leibler divergence. The likelihood adaptation method can be integrated into any PF, and we use our method to develop three versions of adaptive PFs for a target tracking system using a wireless sensor network. The simulation and experimental results demonstrate that our likelihood adaptation method greatly improves the estimation performance of PFs in a high-noise environment. In addition, the adaptive PFs are highly adaptable to the environment without imposing computational complexity. PMID:27249002

  16. Development and optimization of measurements techniques by gamma spectrometries - control and reduction of associated uncertainties

    International Nuclear Information System (INIS)

    In the framework of the improvement and qualification of neutronics calculation schemes, this thesis consists of the conception, optimization and development of γ-ray spectrometry techniques for the measurement of integral parameters in the EOLE and MINERVE facilities. The work aims at correcting systematic errors and at controlling and reducing the uncertainties on the measurement results. The progress achieved allows a more precise comparison between calculations and experiments and a better knowledge of some nuclear data, especially integral capture and fission cross sections. Firstly, the former instrumentation - using an analog chain (2026 amplifier and PCA3 converter/analyser) - was fully tested to estimate the experimental errors on the determination of neutronics parameters. This work shows a systematic bias of up to 12% on the measurement of the power distribution scaling factor and of up to 3% on the measurement of the axial buckling, due to the lack of pulse pile-up correction. Following a comparative study, these electronics were replaced by a digital DSP2060 system with an adapted dead time correction. With the optimization of its parameters, a precision of better than 0.5% has been achieved up to a count rate of 1.5x10^5 s^-1. Consequently, this work allows new kinds of experiments which could not have been done with sufficient precision in the past. Secondly, the former methods applied to the treatment of raw measurement data and to variance propagation were analysed and discussed. Different systematic errors, which were not taken into account in the past, are now corrected. They concern the influence of the activity decay on the dead time correction (about 2%), the true coincidence effect (about 5%) and the solid angle effect on the efficiency transfer corrections (about 8%). Besides, the description of more precise methods for variance propagation leads to a 3 times better uncertainty on fundamental neutronics parameters

  17. Measurement of patient-derived utility values for periodontal health using a multi-attribute scale.

    Science.gov (United States)

    Bellamy, C A; Brickley, M R; McAndrew, R

    1996-09-01

    Periodontal health states are difficult to quantify and no formal scale quantifying patients' utilities for periodontal health states exists. Multi-attribute utility (MAU) techniques were used to develop such a scale. The MAU scale may be used to quantify patients' assessment of their current periodontal health and that of possible treatment outcomes. Such data, combined with probability values in formal decision analysis techniques, would result in improved rationality of treatment planning for periodontal disease. 20 patients attending for routine undergraduate care were interviewed. Data from these interviews were sorted into groups of common interest (domains). Intra-domain health statements were compiled from the interview content. 21 patients ranked the intra-domain statements on a scale of 0-100. This same group of patients also performed an inter-domain weighting. Mean results showed that patients were twice as concerned with how they felt and with the prognosis of possible outcomes as with how they looked and what facts they knew about their oral health. However, the real value of utilities research lies in the application of individual results to treatment planning, as there is a wide range of opinion regarding outcome health states. PMID:8891929

  18. Development of the multi-attribute Adolescent Health Utility Measure (AHUM)

    OpenAIRE

    Beusterien Kathleen M; Yeung Jean-Ezra; Pang Francis; Brazier John

    2012-01-01

    Abstract Objective Obtain utilities (preferences) for a generalizable set of health states experienced by older children and adolescents who receive therapy for chronic health conditions. Methods A health state classification system, the Adolescent Health Utility Measure (AHUM), was developed based on generic health status measures and input from children with Hunter syndrome and their caregivers. The AHUM contains six dimensions with 4–7 severity levels: self-care, pain, mobility, strenuous ...

  19. Attributes of an injury-free culture, part 3: measurement and metrics.

    Science.gov (United States)

    Groover, Donald R

    2007-10-01

    The answer to the CEO's challenge is five measures that all influence behavior, attitudes, and culture in the organization. For an organization moving toward injury-free, these types of measures become the most frequently discussed and used to assess the overall health of the safety situation. In part four, we will discuss employee engagement and how it is pivotal to achieving increasingly longer periods of injury-free performance. PMID:17972697

  20. Total measurement uncertainty for NDA for SNM in process materials and waste

    International Nuclear Information System (INIS)

    Non-Destructive Assay for safeguards and waste applications requires accurate determination of the Pu and U content of the samples. NDA systems must be designed to handle a variety of sample sizes, chemical forms, isotopics and matrices, all of which complicate the analysis of the measurement data. Canberra has evaluated regulatory requirements and nuclear material types worldwide to identify a standard set of NDA instruments that address these diversified applications. As these systems become more common, the need for verification and certification of their performance for various sample types increases. Performance data for most of Canberra's safeguards and waste systems have been published in recent years. Canberra has combined all of these data into a composite evaluation of the total measurement uncertainty (TMU). This paper includes a summary of the TMU and each of the components that produce this overall uncertainty for two standard NDA systems: Waste Drum Assay Systems and the Waste Assay Scanner. This approach for determining the TMU for an assay has been accepted by the US Department of Energy, and the approach has been implemented in Canberra's standard neutron and gamma NDA software. Also, the major contributors to the TMU have been identified, and new assay techniques have been developed to detect and minimize them. (author)

  1. An Evaluation of Uncertainty Associated to Analytical Measurements of Selected Polycyclic Aromatic Compounds in Ambient Air

    International Nuclear Information System (INIS)

    This paper presents an evaluation of the uncertainty associated with the analytical measurement of eighteen polycyclic aromatic compounds (PACs) in ambient air by liquid chromatography with fluorescence detection (HPLC/FD). The study was focused on analyses of the PM10, PM2.5 and gas phase fractions. The main analytical uncertainty was estimated for eleven polycyclic aromatic hydrocarbons (PAHs), four nitro polycyclic aromatic hydrocarbons (nitro-PAHs) and two hydroxy polycyclic aromatic hydrocarbons (OH-PAHs), based on the analytical determination, reference material analysis and extraction step. The main contributions reached 15-30% and came from the extraction process of real ambient samples, with those for nitro-PAHs being the highest (20-30%). The range and mean of the PAC mass concentrations measured in the gas phase and the PM10/PM2.5 particle fractions during a full year are also presented. Concentrations of OH-PAHs were about 2-4 orders of magnitude lower than those of their parent PAHs and comparable to those sparsely reported in the literature. (Author)

  2. Uncertainties in the measured quantities of water leaving waste Tank 241-C-106 via the ventilation system

    International Nuclear Information System (INIS)

    The purpose of this analysis is to estimate the uncertainty in the measured quantity of water which typically leaves Tank 241-C-106 via the ventilation system each month. Such measurements are essential for heat removal estimation and tank liquid level verification purposes. The uncertainty associated with the current, infrequent, manual method of measurement (involves various psychrometric and pressure measurements) is suspected to be unreasonably high. Thus, the possible reduction of this uncertainty using a continuous, automated method of measurement will also be estimated. There are three major conclusions as a result of this analysis: (1) the uncertainties associated with the current (infrequent, manual) method of measuring the water which typically leaves Tank 241-C-106 per month via the ventilation system are indeed quite high (80% to 120%); (2) given the current psychrometric and pressure measurement methods and any tank which loses considerable moisture through active ventilation, such as Tank 241-C-106, significant quantities of liquid can actually leak from the tank before a leak can be positively identified via liquid level measurement; (3) using improved (continuous, automated) methods of taking the psychrometric and pressure measurements, the uncertainty in the measured quantity of water leaving Tank 241-C-106 via the ventilation system can be reduced by approximately an order of magnitude
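
    A hedged sketch of the kind of psychrometric water-balance arithmetic such an estimate rests on: the monthly water loss is the air mass flow times the humidity-ratio difference between exhaust and inlet, and the relative uncertainties combine in quadrature. All numbers are illustrative assumptions, not tank data.

        import math

        q_air   = 1.2          # exhaust air mass flow, kg/s (assumed)
        w_out   = 0.0160       # exhaust humidity ratio, kg water / kg dry air
        w_in    = 0.0085       # inlet humidity ratio
        seconds = 30 * 24 * 3600
        water_kg = q_air * (w_out - w_in) * seconds   # water leaving per month

        # Propagate relative uncertainties in quadrature; the small difference
        # (w_out - w_in) amplifies the humidity measurement errors.
        u_q, u_w = 0.10, 0.05                         # 10 % flow, 5 % per humidity ratio
        u_dw = math.hypot(u_w * w_out, u_w * w_in) / (w_out - w_in)
        u_total = math.hypot(u_q, u_dw)
        print(f"~{water_kg:.0f} kg/month, relative uncertainty ~{u_total:.0%}")

    The amplification by the humidity-ratio difference is one reason infrequent manual psychrometric readings carry such large overall uncertainties, and why continuous automated measurement reduces them.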

  3. Quantifying uncertainty in the measurement of arsenic in suspended particulate matter by Atomic Absorption Spectrometry with hydride generator

    Directory of Open Access Journals (Sweden)

    Ahuja Tarushee

    2011-04-01

    Full Text Available Abstract Arsenic is a toxic element that creates several problems in human beings, especially when inhaled through air. Accurate and precise measurement of arsenic in suspended particulate matter (SPM) is therefore of prime importance, as it gives information about the level of toxicity in the environment, so that preventive measures can be taken in the affected areas. Quality assurance is equally important in the measurement of arsenic in SPM samples before making any decision. The quality and reliability of the data for such elements depend upon the measurement uncertainty of each step involved, from sampling to analysis. Analytical results quantifying uncertainty give a measure of the confidence level of the concerned laboratory. The main objective of this study was therefore to determine the arsenic content in SPM samples with an uncertainty budget and to find out the various potential sources of uncertainty that affect the results. With these facts in mind, we selected seven diverse sites of Delhi (the national capital of India) for quantification of the arsenic content in SPM samples with an uncertainty budget, following sampling by HVS and analysis by Atomic Absorption Spectrometer-Hydride Generator (AAS-HG). In the measurement of arsenic in SPM samples many steps are involved, from sampling to the final result, and we have considered the various potential sources of uncertainty. The calculation of uncertainty is based on the ISO/IEC 17025:2005 document and the EURACHEM guideline. It was found that the final results depend mostly on the uncertainty in measurement due to repeatability, the final volume prepared for analysis, the weighing balance and sampling by HVS. After analysis of the data from the seven diverse sites of Delhi, it was concluded that during the period from 31 Jan. 2008 to 7 Feb. 2008 the arsenic concentration varied from 1.44 ± 0.25 to 5.58 ± 0.55 ng/m3 with a 95% confidence level (k = 2).

  4. Reliable and valid NEWS for Chinese seniors: measuring perceived neighborhood attributes related to walking

    Directory of Open Access Journals (Sweden)

    Lee Lok-chun

    2010-11-01

    Full Text Available Abstract Background The effects of the built environment on walking in seniors have not been studied in an Asian context. To examine these effects, valid and reliable measures are needed. The aim of this study was to develop and validate a questionnaire of perceived neighborhood characteristics related to walking appropriate for Chinese seniors (Neighborhood Environment Walkability Scale for Chinese Seniors, NEWS-CS). It was based on the Neighborhood Environment Walkability Scale - Abbreviated (NEWS-A), a validated measure of the perceived built environment developed in the USA for adults. A secondary study aim was to establish the generalizability of the NEWS-A to an Asian high-density urban context and a different age group. Methods A multidisciplinary panel of experts adapted the original NEWS-A to reflect the built environment of Hong Kong and the needs of seniors. The translated instrument was pre-tested on a sample of 50 Chinese-speaking senior residents (65+ years). The final version of the NEWS-CS was interviewer-administered to 484 seniors residing in four selected Hong Kong districts varying in walkability and socio-economic status. Ninety-two participants completed the questionnaire on two separate occasions, 2-3 weeks apart. Test-retest reliability indices were estimated for each item and subscale of the NEWS-CS. Confirmatory factor analysis was used to develop the measurement model of the NEWS-CS and cross-validate that of the NEWS-A. Results The final version of the NEWS-CS consisted of 14 subscales and four single items (76 items). Test-retest reliability was moderate to good (ICC > 0.50 or % agreement > 60%) except for four items measuring distance to destinations. The originally-proposed measurement models of the NEWS-A and NEWS-CS required 2-3 theoretically-justifiable modifications to fit the data well. Conclusions The NEWS-CS possesses sufficient levels of reliability and factorial validity to be used for measuring perceived neighborhood

  5. CFCI3 (CFC-11): UV Absorption Spectrum Temperature Dependence Measurements and the Impact on Atmospheric Lifetime and Uncertainty

    Science.gov (United States)

    Mcgillen, Max R.; Fleming, Eric L.; Jackman, Charles H.; Burkholder, James B.

    2014-01-01

    CFCl3 (CFC-11) is both an atmospheric ozone-depleting and a potent greenhouse gas that is removed primarily via stratospheric UV photolysis. Uncertainty in the temperature dependence of its UV absorption spectrum is a significant contributing factor to the overall uncertainty in its global lifetime and, thus, in model calculations of stratospheric ozone recovery and climate change. In this work, the CFC-11 UV absorption spectrum was measured over a range of wavelengths (184.95-230 nm) and temperatures (216-296 K). We report a spectrum temperature dependence that is less than currently recommended for use in atmospheric models. The impact on its atmospheric lifetime was quantified using a 2-D model and the spectrum parameterization developed in this work. The obtained global annually averaged lifetime was 58.1 ± 0.7 years (2σ uncertainty due solely to the spectrum uncertainty). The lifetime is slightly reduced, and its uncertainty significantly reduced, from that obtained using current spectrum recommendations.
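
    UV cross-section temperature dependences of this kind are commonly parameterized as a polynomial in wavelength plus a temperature-dependent polynomial term; the sketch below assumes that generic form, with the fitted coefficient arrays left as inputs (the paper's coefficients are not reproduced here).

        import numpy as np

        def cross_section(wavelength_nm, T_K, A, B):
            """Evaluate log10(sigma) = sum_n A_n λ^n + (T - 273) sum_n B_n λ^n.
            A, B: polynomial coefficient arrays, lowest order first (assumed form)."""
            lam = np.asarray(wavelength_nm, dtype=float)
            poly_A = sum(a * lam ** n for n, a in enumerate(A))
            poly_B = sum(b * lam ** n for n, b in enumerate(B))
            return 10.0 ** (poly_A + (T_K - 273.0) * poly_B)

    A weaker fitted temperature term (smaller B coefficients) then translates directly into a smaller spread of photolysis rates across stratospheric temperatures, which is how the spectrum measurement tightens the lifetime uncertainty.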

  6. $K$-corrections: an Examination of their Contribution to the Uncertainty of Luminosity Measurements

    CERN Document Server

    Lake, Sean E

    2016-01-01

    In this paper we provide formulae that can be used to determine the uncertainty contributed to a measurement by a $K$-correction and, thus, valuable information about which flux measurement will provide the most accurate $K$-corrected luminosity. All of this is done at the level of a Gaussian approximation of the statistics involved, that is, where the galaxies in question can be characterized by a mean spectral energy distribution (SED) and a covariance function (spectral 2-point function). This paper also includes approximations of the SED mean and covariance for galaxies, and the three common subclasses thereof, based on applying the templates from Assef et al. (2010) to the objects in zCOSMOS bright 10k (Lilly et al. 2009) and photometry of the same field from Capak et al. (2007), Sanders et al. (2007), and the AllWISE source catalog.
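
    In the standard convention (not restated in the abstract), the K-correction links the observed apparent magnitude to the rest-frame absolute magnitude, so its variance adds to the luminosity error budget:

        \[
        M = m - \mathrm{DM}(z) - K(z),
        \qquad
        \sigma_{M}^{2} \simeq \sigma_{m}^{2} + \sigma_{K}^{2},
        \]

    where, in the Gaussian approximation described above, \( \sigma_{K} \) follows from the SED covariance function; the flux band that minimizes \( \sigma_{M} \) is then the one that yields the most accurate K-corrected luminosity.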

  7. Uncertainty studies of topographical measurements on steel surface corrosion by 3D scanning electron microscopy.

    Science.gov (United States)

    Kang, K W; Pereda, M D; Canafoglia, M E; Bilmes, P; Llorente, C; Bonetto, R

    2012-02-01

    Pitting corrosion is a quite serious and dangerous damage mechanism, both in carbon steel boiler tubes for power plants, which are vital to most industries, and in stainless steels for orthopedic human implants, demand for which has sharply increased due to increased life expectancy and the rate of traffic accidents. Reliable methods to characterize this kind of damage are becoming increasingly necessary when trying to evaluate the advance of damage and to establish the best procedures for component inspection, in order to determine remaining lives and mitigate failures. A study of the uncertainties in the topographies of corrosion pits from 3D SEM images, obtained at low magnifications (where errors are greater) and at different stage tilt angles, was carried out using in-house software developed previously. Additionally, measurements of pit depths on biomaterial surfaces of stainless steels subjected to two different surface treatments were carried out. The different depth distributions observed were in agreement with electrochemical measurements. PMID:22051087

  8. Decision making for urban drainage systems under uncertainty caused by weather radar rainfall measurement

    Science.gov (United States)

    Dai, Qiang; Zhuo, Lu; Han, Dawei

    2015-04-01

    With the rapid growth of urbanization and population, decision making for managing urban flood risk has become a significant issue for most large cities in China. High-quality measurement of rainfall at small temporal but large spatial scales is of great importance to urban flood risk management. Weather radar rainfall, with its advantages of short-term predictability and high spatial and temporal resolution, has been widely applied in urban drainage system modelling. It is recognized that weather radar is subject to many uncertainties, and many studies have been carried out to quantify these uncertainties in order to improve the quality of the rainfall estimates and of the corresponding outlet flow. However, considering that the final action in urban flood risk management is a decision, such as issuing a flood warning or deciding whether to build or how to operate a hydraulic structure, individual uncertainties of weather radar may have little or significant influence on the final results. For this reason, in this study we aim to investigate which characteristics of the radar rainfall are the significant ones for decision making in urban flood risk management. A radar probabilistic quantitative rainfall estimation scheme is integrated with an urban flood model (the Storm Water Management Model, SWMM) to make a decision on whether to warn or not according to the decision criteria. A number of scenarios with different storm types, synoptic regimes and spatial and temporal correlations are designed to analyze the relationship between these affecting factors and the final decision. On this basis, a parameterized radar probabilistic rainfall estimation model is established which reflects the most important elements in decision making for urban flood risk management.

  9. Influence of Spherical Radiation Pattern Measurement Uncertainty on Handset Performance Measures

    DEFF Research Database (Denmark)

    Nielsen, Jesper Ødum; Pedersen, Gert Frølund

    2005-01-01

    An important characteristic of a mobile handset is its ability to receive and transmit power. One way to characterize the performance of a handset in this respect is to use measurements of the spherical radiation pattern from which the total radiated power (TRP), total isotropic sensitivity (TIS)... in the performance measures are investigated for both the GSM-900 and the GSM-1800 band. Despite the deliberately large deviations from the reference position, the changes in TRP and TIS are generally within ±0.5 dB with a maximum of about 1.4 dB. For the MEG values the results depend on the orientation...

  10. Formulation of uncertainty relation of error and disturbance in quantum measurement by using quantum estimation theory

    International Nuclear Information System (INIS)

    Full text: When we try to obtain information about a quantum system, we need to perform a measurement on the system. The measurement process causes unavoidable state change. Heisenberg discussed a thought experiment of the position measurement of a particle by using a gamma-ray microscope, and found a trade-off relation between the error of the measured position and the disturbance in the momentum caused by the measurement process. The trade-off relation epitomizes the complementarity in quantum measurements: we cannot perform a measurement of an observable without causing disturbance in its canonically conjugate observable. However, at the time Heisenberg found the complementarity, quantum measurement theory was not established yet, and Kennard and Robertson's inequality was erroneously interpreted as a mathematical formulation of the complementarity. Kennard and Robertson's inequality actually implies the indeterminacy of the quantum state: non-commuting observables cannot have definite values simultaneously. However, Kennard and Robertson's inequality reflects the inherent nature of a quantum state alone, and does not concern any trade-off relation between the error and disturbance in the measurement process. In this talk, we report a resolution to the complementarity in quantum measurements. First, we find that it is necessary to involve the estimation process from the outcome of the measurement for quantifying the error and disturbance in the quantum measurement. We clarify the implicitly involved estimation process in Heisenberg's gamma-ray microscope and other measurement schemes, and formulate the error and disturbance for an arbitrary quantum measurement by using quantum estimation theory. The error and disturbance are defined in terms of the Fisher information, which gives the upper bound of the accuracy of the estimation. Second, we obtain uncertainty relations between the measurement errors of two observables [1], and between the error and disturbance in the

  11. Importance measures in nuclear PSA: how to control their uncertainty and develop new applications

    International Nuclear Information System (INIS)

    This PhD thesis deals with importance measures based on nuclear probabilistic safety analyses (PSA). With these indicators, the importance towards risk of the events considered in the PSA models can be measured. The first part of this thesis sets out the framework in which they are currently used. The information extracted from the evaluation of importance measures is used in industrial decision-making processes that may impact the safety of nuclear plants. In the second part of the thesis, we thus try to meet the requirements of reliability and simplicity with an approach minimising the uncertainties due to modelling. We also lay out a new truncation process for the set of minimal cut sets (MCS) corresponding to the baseline case, which allows a quick, automatic and precise calculation of the importance measures. As PSA are increasingly used in risk-informed decision-making approaches, we have examined the extension of importance measures to groups of basic events. The third part of the thesis therefore presents the definition of the importance of events such as the failure of a system or the loss of a function, as well as their potential applications. PSA being considered a useful tool for the design of new nuclear power plants, the fourth part of the thesis sketches out a design process based both on classical importance measures and on new ones. (author)
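
    For orientation, the classical importance measures that such PSA work builds on (Birnbaum, Fussell-Vesely, risk achievement worth) can be sketched as finite differences of the top-event risk. The helper below uses those standard definitions for illustration; it is not the thesis's new group-importance measures, and the toy risk function is invented.

        def classical_importances(risk, q, i):
            """risk: callable mapping the list of basic-event probabilities q
            to the top-event risk; i: index of the basic event being ranked."""
            base = risk(q)
            hi = risk([1.0 if j == i else p for j, p in enumerate(q)])  # event certain
            lo = risk([0.0 if j == i else p for j, p in enumerate(q)])  # event perfect
            return {"Birnbaum": hi - lo,                # marginal importance
                    "Fussell-Vesely": 1.0 - lo / base,  # fractional risk contribution
                    "RAW": hi / base}                   # risk achievement worth

        # Toy two-component series system: risk = 1 - (1 - q1)(1 - q2).
        risk = lambda q: 1.0 - (1.0 - q[0]) * (1.0 - q[1])
        print(classical_importances(risk, [0.01, 0.05], 0))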

  12. Evaluating the capabilities and uncertainties of droplet measurements for the fog droplet spectrometer (FM-100)

    Directory of Open Access Journals (Sweden)

    J. K. Spiegel

    2012-09-01

    Full Text Available Droplet size spectra measurements are crucial to obtain a quantitative microphysical description of clouds and fog. However, cloud droplet size measurements are subject to various uncertainties. This work focuses on the error analysis of two key measurement uncertainties arising during cloud droplet size measurements with a conventional droplet size spectrometer (FM-100): first, we addressed the precision with which droplets can be sized with the FM-100 on the basis of the Mie theory. We deduced error assumptions and proposed a new method on how to correct measured size distributions for these errors by redistributing the measured droplet size distribution using a stochastic approach. Second, based on a literature study, we summarized corrections for particle losses during sampling with the FM-100. We applied both corrections to cloud droplet size spectra measured at the high alpine site Jungfraujoch for a temperature range from 0 °C to 11 °C. We showed that Mie scattering led to spikes in the droplet size distributions using the default sizing procedure, while the new stochastic approach reproduced the ambient size distribution adequately. A detailed analysis of the FM-100 sampling efficiency revealed that particle losses were typically below 10% for droplet diameters up to 10 μm. For larger droplets, particle losses can increase up to 90% for the largest droplets of 50 μm at ambient wind speeds below 4.4 m s−1 and even to >90% for larger angles between the instrument orientation and the wind vector (sampling angle) at higher wind speeds. Comparisons of the FM-100 to other reference instruments revealed that the total liquid water content (LWC) measured by the FM-100 was more sensitive to particle losses than to re-sizing based on Mie scattering, while the total number concentration was only marginally influenced by particle losses. Consequently, for further LWC measurements with the FM-100 we strongly recommend considering (1) the

  13. Evaluating the capabilities and uncertainties of droplet measurements for the fog droplet spectrometer (FM-100)

    Directory of Open Access Journals (Sweden)

    J. K. Spiegel

    2012-05-01

    Full Text Available Droplet size spectra measurements are crucial to obtain a quantitative microphysical description of clouds and fog. However, cloud droplet size measurements are subject to various uncertainties. This work focuses on the evaluation of two key measurement uncertainties arising during cloud droplet size measurements with a conventional droplet size spectrometer (FM-100): first, we addressed the precision with which droplets can be sized with the FM-100 on the basis of Mie theory. We deduced error assumptions and proposed how to correct measured size distributions for these errors by redistributing the measured droplet size distribution using a stochastic approach. Second, based on a literature study, we derived corrections for particle losses during sampling with the FM-100. We applied both corrections to cloud droplet size spectra measured at the high alpine site Jungfraujoch for a temperature range from 0 °C to 11 °C. We show that Mie scattering led to spikes in the droplet size distributions using the default sizing procedure, while the stochastic approach reproduced the ambient size distribution adequately. A detailed analysis of the FM-100 sampling efficiency revealed that particle losses were typically below 10% for droplet diameters up to 10 μm. For larger droplets, particle losses can increase up to 90% for the largest droplets of 50 μm at ambient wind speeds below 4.4 m s−1 and even to >90% for larger angles between the instrument orientation and the wind vector (sampling angle) at higher wind speeds. Comparisons of the FM-100 to other reference instruments revealed that the total liquid water content (LWC) measured by the FM-100 was more sensitive to particle losses than to re-sizing based on Mie scattering, while the total number concentration was only marginally influenced by particle losses. As a consequence, for further LWC measurements with the FM-100 we strongly recommend considering (1) the error arising due to Mie

  14. Calculation and Analysis of Instrument Range-Setting Measurement Uncertainty

    Institute of Scientific and Technical Information of China (English)

    张志清; 张阳春

    2014-01-01

    Measurement uncertainty has a very wide range of applications. When the same object is measured with the same method and instrument, different range settings of the measuring instrument give different measurement errors and resolutions, and the evaluated uncertainty therefore differs as well. Taking a digital multimeter as an example, this paper presents the calculation and analysis of range-dependent measurement uncertainty.
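
    A minimal sketch of why the selected range changes the evaluated uncertainty: a typical DMM accuracy specification of ±(a % of reading + d counts) is treated as a rectangular distribution, and the count term scales with the range's resolution. The specification values here are hypothetical, not taken from the article.

        import math

        def dmm_std_uncertainty(reading, a_percent, digits, resolution):
            """Standard uncertainty from a spec of ±(a% of reading + d counts),
            assuming a rectangular (uniform) distribution over the limits."""
            limit = a_percent / 100.0 * reading + digits * resolution
            return limit / math.sqrt(3.0)

        # The same 1.000 V signal measured on two ranges of one meter:
        u_2V  = dmm_std_uncertainty(1.0, 0.5, 2, 0.0001)  # 2 V range, 0.1 mV steps
        u_20V = dmm_std_uncertainty(1.0, 0.5, 2, 0.001)   # 20 V range, 1 mV steps
        print(f"u(2 V range) = {u_2V:.4f} V, u(20 V range) = {u_20V:.4f} V")

    Even with an identical percentage term, the coarser resolution of the higher range yields a larger standard uncertainty, which is the effect the article analyses.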

  15. Site Water Budget: Influences of Measurement Uncertainties on Measurement Results and Model Results

    OpenAIRE

    Spank, Uwe

    2010-01-01

    The exact quantification of the site water budget is a necessary precondition for successful and sustainable management of forests, agriculture and water resources. In this study the water balance was investigated at the spatial scale of canopies and at different temporal scales, with a focus on the monthly time scale. The estimation of the individual water balance components was primarily based on micrometeorological measurement methods. Evapotranspiration was assessed by the eddy-covariance (EC) m...

  16. Suspended matter concentrations in coastal waters: Methodological improvements to quantify individual measurement uncertainty

    Science.gov (United States)

    Röttgers, Rüdiger; Heymann, Kerstin; Krasemann, Hajo

    2014-12-01

    Measurements of total suspended matter (TSM) concentration and the discrimination of the particulate inorganic (PIM) and organic matter fractions by the loss-on-ignition method are susceptible to significant and contradictory bias errors from: (a) retention of sea salt in the filter (despite washing with deionized water), and (b) loss of filter material during the washing and combustion procedures. Several methodological procedures have been described to avoid or correct errors associated with these biases, but no analysis of the final uncertainty of the overall mass concentration determination has yet been performed. Typically, the exact values of these errors are unknown and can only be estimated. Measurements were performed in coastal and estuarine waters of the German Bight that allowed the individual error for each sample to be determined with respect to a systematic mass offset. This was achieved by using different volumes of the sample and analyzing the mass-over-volume relationship by linear regression. The results showed that the variation in the mass offset is much larger than expected (mean mass offset: 0.85 ± 0.84 mg, range: -2.4 - 7.5 mg) and that it often leads to rather large relative errors even when TSM concentrations were high. Similarly large variations were found in the mass offset for PIM measurements. Correction with a mean offset determined using procedural control filters reduced the maximum error in the coastal and estuarine waters studied. It should be possible to use the approach in oceanic or fresh water environments as well. The possibility of individual quality control will allow mass-specific optical properties to be determined with better resolved uncertainties and, hence, lower statistical variability, greatly improving our capability to model the inherent optical properties of natural particles and their natural variability, e.g. dependence on particle size and the complex refractive index.
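
    A small sketch of the mass-over-volume regression described here: filtering several volumes of the same sample and regressing filter mass gain on volume separates the TSM concentration (slope) from the sample-specific mass offset (intercept). The data points are invented for illustration.

        import numpy as np

        volume_l = np.array([0.25, 0.5, 1.0, 2.0])    # filtered volumes, L
        mass_mg  = np.array([1.9, 3.1, 5.6, 10.4])    # filter mass gains, mg
        slope, offset = np.polyfit(volume_l, mass_mg, 1)
        print(f"TSM = {slope:.2f} mg/L, mass offset = {offset:.2f} mg")

    The intercept is the per-sample systematic offset (salt retention minus filter loss) that a single-volume measurement would silently fold into the concentration, while the slope gives a concentration estimate that is, to first order, free of that bias.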

  17. International target values 2000 for measurement uncertainties in safeguarding nuclear materials

    International Nuclear Information System (INIS)

    The IAEA has published a revised and updated version of International Target Values (ITVs) for uncertainty components in measurements of nuclear material. This represents the fifth revision of the original release of such tables issued in 1979 by the ESARDA/WGDA. The ITVs represent uncertainties to be considered in judging the reliability of analytical techniques applied to industrial nuclear and fissile material subject to safeguards verification. The tabulated values represent estimates of the 'state of the practice' which ought to be achievable under routine conditions by adequately equipped, experienced laboratories. The most recent standard conventions in representing uncertainty and reliability data have been taken into account, while maintaining a format which allows comparison to previous releases of ITVs. The ITVs 2000 are intended to be used by plant operators and safeguards organizations as a reference of the quality of measurements achievable in nuclear material accountancy, and for planning purposes. They may also be used for statistical inferences regarding the significance of operator-inspector differences whenever insufficient measurement data is available. The IAEA prepared a draft of a technical report presenting the proposed ITVs 2000, and in April 2000 the chairmen or officers of the panels or organizations listed below were invited to co-author the report and to submit the draft to a discussion by their panels and organizations. Comments from the following groups were received and incorporated into the final version of the document, completed in April 2001. The final report replaces the 1993 version of the Target Values, STR 294: Euratom Safeguards Inspectorate, ESARDA Working Group on Destructive Analysis, ESARDA Working Group on Non Destructive Analysis, Institute of Nuclear Material Management, Japanese Expert Group on ITV-2000, ISO Working Group on Analyses in Spent Fuel Reprocessing, ISO Working Group on Analyses in Uranium Fuel

  18. Development of the multi-attribute Adolescent Health Utility Measure (AHUM

    Directory of Open Access Journals (Sweden)

    Beusterien Kathleen M

    2012-08-01

    Full Text Available Abstract Objective Obtain utilities (preferences) for a generalizable set of health states experienced by older children and adolescents who receive therapy for chronic health conditions. Methods A health state classification system, the Adolescent Health Utility Measure (AHUM), was developed based on generic health status measures and input from children with Hunter syndrome and their caregivers. The AHUM contains six dimensions with 4–7 severity levels: self-care, pain, mobility, strenuous activities, self-image, and health perceptions. Using the time trade off (TTO) approach, a UK population sample provided utilities for 62 of the 16,800 AHUM states. A mixed effects model was used to estimate utilities for the AHUM states. The AHUM was applied to trial NCT00069641 of idursulfase for Hunter syndrome and its extension (NCT00630747). Results Observations (i.e., utilities) totaled 3,744 (12*312 participants), with between 43 and 60 for each health state except for the best and worst states, which had 312 observations. The mean utilities for the best and worst AHUM states were 0.99 and 0.41, respectively. The random effects model was statistically significant (p  Discussion The AHUM health state classification system may be used in future research to enable calculation of quality-adjusted life expectancy for applicable health conditions.

  19. Single hadron response measurement and calorimeter jet energy scale uncertainty with the ATLAS detector at the LHC

    Science.gov (United States)

    Aad, G.; Abbott, B.; Abdallah, J.; Abdelalim, A. A.; Abdesselam, A.; Abdinov, O.; Abi, B.; Abolins, M.; AbouZeid, O. S.; Abramowicz, H.; Abreu, H.; Acerbi, E.; Acharya, B. S.; Adamczyk, L.; Adams, D. L.; Addy, T. N.; Adelman, J.; Aderholz, M.; Adomeit, S.; Adragna, P.; Adye, T.; Aefsky, S.; Aguilar-Saavedra, J. A.; Aharrouche, M.; Ahlen, S. P.; Ahles, F.; Ahmad, A.; Ahsan, M.; Aielli, G.; Akdogan, T.; Åkesson, T. P. A.; Akimoto, G.; Akimov, A. V.; Akiyama, A.; Alam, M. S.; Alam, M. A.; Albert, J.; Albrand, S.; Aleksa, M.; Aleksandrov, I. N.; Alessandria, F.; Alexa, C.; Alexander, G.; Alexandre, G.; Alexopoulos, T.; Alhroob, M.; Aliev, M.; Alimonti, G.; Alison, J.; Aliyev, M.; Allbrooke, B. M. M.; Allport, P. P.; Allwood-Spiers, S. E.; Almond, J.; Aloisio, A.; Alon, R.; Alonso, A.; Alvarez Gonzalez, B.; Alviggi, M. G.; Amako, K.; Amaral, P.; Amelung, C.; Ammosov, V. V.; Amorim, A.; Amorós, G.; Amram, N.; Anastopoulos, C.; Ancu, L. S.; Andari, N.; Andeen, T.; Anders, C. F.; Anders, G.; Anderson, K. J.; Andreazza, A.; Andrei, V.; Andrieux, M.-L.; Anduaga, X. S.; Angerami, A.; Anghinolfi, F.; Anisenkov, A.; Anjos, N.; Annovi, A.; Antonaki, A.; Antonelli, M.; Antonov, A.; Antos, J.; Anulli, F.; Aoun, S.; Aperio Bella, L.; Apolle, R.; Arabidze, G.; Aracena, I.; Arai, Y.; Arce, A. T. H.; Arfaoui, S.; Arguin, J.-F.; Arik, E.; Arik, M.; Armbruster, A. J.; Arnaez, O.; Arnault, C.; Artamonov, A.; Artoni, G.; Arutinov, D.; Asai, S.; Asfandiyarov, R.; Ask, S.; Åsman, B.; Asquith, L.; Assamagan, K.; Astbury, A.; Astvatsatourov, A.; Aubert, B.; Auge, E.; Augsten, K.; Aurousseau, M.; Avolio, G.; Avramidou, R.; Axen, D.; Ay, C.; Azuelos, G.; Azuma, Y.; Baak, M. A.; Baccaglioni, G.; Bacci, C.; Bach, A. M.; Bachacou, H.; Bachas, K.; Backes, M.; Backhaus, M.; Badescu, E.; Bagnaia, P.; Bahinipati, S.; Bai, Y.; Bailey, D. C.; Bain, T.; Baines, J. T.; Baker, O. K.; Baker, M. D.; Baker, S.; Banas, E.; Banerjee, P.; Banerjee, Sw.; Banfi, D.; Bangert, A.; Bansal, V.; Bansil, H. S.; Barak, L.; Baranov, S. P.; Barashkou, A.; Barbaro Galtieri, A.; Barber, T.; Barberio, E. L.; Barberis, D.; Barbero, M.; Bardin, D. Y.; Barillari, T.; Barisonzi, M.; Barklow, T.; Barlow, N.; Barnett, B. M.; Barnett, R. M.; Baroncelli, A.; Barone, G.; Barr, A. J.; Barreiro, F.; Barreiro Guimarães da Costa, J.; Barrillon, P.; Bartoldus, R.; Barton, A. E.; Bartsch, V.; Bates, R. L.; Batkova, L.; Batley, J. R.; Battaglia, A.; Battistin, M.; Bauer, F.; Bawa, H. S.; Beale, S.; Beau, T.; Beauchemin, P. H.; Beccherle, R.; Bechtle, P.; Beck, H. P.; Becker, S.; Beckingham, M.; Becks, K. H.; Beddall, A. J.; Beddall, A.; Bedikian, S.; Bednyakov, V. A.; Bee, C. P.; Begel, M.; Behar Harpaz, S.; Behera, P. K.; Beimforde, M.; Belanger-Champagne, C.; Bell, P. J.; Bell, W. H.; Bella, G.; Bellagamba, L.; Bellina, F.; Bellomo, M.; Belloni, A.; Beloborodova, O.; Belotskiy, K.; Beltramello, O.; Ben Ami, S.; Benary, O.; Benchekroun, D.; Benchouk, C.; Bendel, M.; Benekos, N.; Benhammou, Y.; Benhar Noccioli, E.; Benitez Garcia, J. A.; Benjamin, D. P.; Benoit, M.; Bensinger, J. R.; Benslama, K.; Bentvelsen, S.; Berge, D.; Bergeaas Kuutmann, E.; Berger, N.; Berghaus, F.; Berglund, E.; Beringer, J.; Bernat, P.; Bernhard, R.; Bernius, C.; Berry, T.; Bertella, C.; Bertin, A.; Bertinelli, F.; Bertolucci, F.; Besana, M. I.; Besson, N.; Bethke, S.; Bhimji, W.; Bianchi, R. M.; Bianco, M.; Biebel, O.; Bieniek, S. P.; Bierwagen, K.; Biesiada, J.; Biglietti, M.; Bilokon, H.; Bindi, M.; Binet, S.; Bingul, A.; Bini, C.; Biscarat, C.; Bitenc, U.; Black, K. M.; Blair, R. 
E.; Blanchard, J.-B.; Blanchot, G.; Blazek, T.; Blocker, C.; Blocki, J.; Blondel, A.; Blum, W.; Blumenschein, U.; Bobbink, G. J.; Bobrovnikov, V. B.; Bocchetta, S. S.; Bocci, A.; Boddy, C. R.; Boehler, M.; Boek, J.; Boelaert, N.; Bogaerts, J. A.; Bogdanchikov, A.; Bogouch, A.; Bohm, C.; Boisvert, V.; Bold, T.; Boldea, V.; Bolnet, N. M.; Bona, M.; Bondarenko, V. G.; Bondioli, M.; Boonekamp, M.; Booth, C. N.; Bordoni, S.; Borer, C.; Borisov, A.; Borissov, G.; Borjanovic, I.; Borri, M.; Borroni, S.; Bortolotto, V.; Bos, K.; Boscherini, D.; Bosman, M.; Boterenbrood, H.; Botterill, D.; Bouchami, J.; Boudreau, J.; Bouhova-Thacker, E. V.; Boumediene, D.; Bourdarios, C.; Bousson, N.; Boveia, A.; Boyd, J.; Boyko, I. R.; Bozhko, N. I.; Bozovic-Jelisavcic, I.; Bracinik, J.; Braem, A.; Branchini, P.; Brandenburg, G. W.; Brandt, A.; Brandt, G.; Brandt, O.; Bratzler, U.; Brau, B.; Brau, J. E.; Braun, H. M.; Brelier, B.; Bremer, J.; Brenner, R.; Bressler, S.; Britton, D.; Brochu, F. M.; Brock, I.; Brock, R.; Brodbeck, T. J.; Brodet, E.; Broggi, F.; Bromberg, C.; Bronner, J.; Brooijmans, G.; Brooks, W. K.; Brown, G.; Brown, H.; Bruckman de Renstrom, P. A.; Bruncko, D.; Bruneliere, R.; Brunet, S.; Bruni, A.; Bruni, G.

    2013-03-01

    The uncertainty on the calorimeter energy response to jets of particles is derived for the ATLAS experiment at the Large Hadron Collider (LHC). First, the calorimeter response to single isolated charged hadrons is measured and compared to the Monte Carlo simulation using proton-proton collisions at centre-of-mass energies of $\sqrt{s}$ = 900 GeV and 7 TeV collected during 2009 and 2010. Then, using the decay of K_s and Λ particles, the calorimeter response to specific types of particles (positively and negatively charged pions, protons, and anti-protons) is measured and compared to the Monte Carlo predictions. Finally, the jet energy scale uncertainty is determined by propagating the response uncertainty for single charged and neutral particles to jets. The response uncertainty is 2-5% for central isolated hadrons and 1-3% for the final calorimeter jet energy scale.
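
    The propagation step described above can be illustrated with a toy Monte Carlo: assign each jet constituent a class-dependent response uncertainty, shift the responses coherently within each class, and read the jet-level spread off the trials. Everything below (constituent energies, classes, per-class uncertainties) is an illustrative assumption, not ATLAS numbers:

    # Toy Monte Carlo sketch of propagating single-particle response
    # uncertainties to a jet energy scale uncertainty. All numbers are
    # illustrative assumptions, not ATLAS values.
    import numpy as np

    rng = np.random.default_rng(0)

    # Assumed fractional response uncertainty per particle class.
    response_unc = {"charged_hadron": 0.03, "neutral_hadron": 0.04, "photon": 0.01}

    def jet_scale_shift(energies, classes):
        """One MC trial: shift each constituent's response coherently
        within its class and return the fractional jet energy shift."""
        shifts = {c: rng.normal(0.0, s) for c, s in response_unc.items()}
        shifted = sum(e * (1.0 + shifts[c]) for e, c in zip(energies, classes))
        return shifted / sum(energies) - 1.0

    # A mock jet: constituent energies (GeV) and their classes.
    energies = [20.0, 12.0, 8.0, 5.0, 3.0]
    classes = ["charged_hadron", "charged_hadron", "neutral_hadron",
               "photon", "charged_hadron"]

    trials = np.array([jet_scale_shift(energies, classes) for _ in range(10000)])
    print(f"toy jet energy scale uncertainty: {trials.std():.3%}")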

  20. Single hadron response measurement and calorimeter jet energy scale uncertainty with the ATLAS detector at the LHC

    CERN Document Server

    Aad, Georges; Abdallah, Jalal; Abdelalim, Ahmed Ali; Abdesselam, Abdelouahab; Abdinov, Ovsat; Abi, Babak; Abolins, Maris; AbouZeid, Ossama; Abramowicz, Halina; Abreu, Henso; Acerbi, Emilio; Acharya, Bobby Samir; Adamczyk, Leszek; Adams, David; Addy, Tetteh; Adelman, Jahred; Aderholz, Michael; Adomeit, Stefanie; Adragna, Paolo; Adye, Tim; Aefsky, Scott; Aguilar-Saavedra, Juan Antonio; Aharrouche, Mohamed; Ahlen, Steven; Ahles, Florian; Ahmad, Ashfaq; Ahsan, Mahsana; Aielli, Giulio; Akdogan, Taylan; Åkesson, Torsten Paul Ake; Akimoto, Ginga; Akimov, Andrei; Akiyama, Kunihiro; Alam, Mohammad; Alam, Muhammad Aftab; Albert, Justin; Albrand, Solveig; Aleksa, Martin; Aleksandrov, Igor; Alessandria, Franco; Alexa, Calin; Alexander, Gideon; Alexandre, Gauthier; Alexopoulos, Theodoros; Alhroob, Muhammad; Aliev, Malik; Alimonti, Gianluca; Alison, John; Aliyev, Magsud; Allbrooke, Benedict; Allport, Phillip; Allwood-Spiers, Sarah; Almond, John; Aloisio, Alberto; Alon, Raz; Alonso, Alejandro; Alvarez Gonzalez, Barbara; Alviggi, Mariagrazia; Amako, Katsuya; Amaral, Pedro; Amelung, Christoph; Ammosov, Vladimir; Amorim, Antonio; Amorós, Gabriel; Amram, Nir; Anastopoulos, Christos; Ancu, Lucian Stefan; Andari, Nansi; Andeen, Timothy; Anders, Christoph Falk; Anders, Gabriel; Anderson, Kelby; Andreazza, Attilio; Andrei, George Victor; Andrieux, Marie-Laure; Anduaga, Xabier; Angerami, Aaron; Anghinolfi, Francis; Anisenkov, Alexey; Anjos, Nuno; Annovi, Alberto; Antonaki, Ariadni; Antonelli, Mario; Antonov, Alexey; Antos, Jaroslav; Anulli, Fabio; Aoun, Sahar; Aperio Bella, Ludovica; Apolle, Rudi; Arabidze, Giorgi; Aracena, Ignacio; Arai, Yasuo; Arce, Ayana; Arfaoui, Samir; Arguin, Jean-Francois; Arik, Engin; Arik, Metin; Armbruster, Aaron James; Arnaez, Olivier; Arnault, Christian; Artamonov, Andrei; Artoni, Giacomo; Arutinov, David; Asai, Shoji; Asfandiyarov, Ruslan; Ask, Stefan; Å sman, Barbro; Asquith, Lily; Assamagan, Ketevi; Astbury, Alan; Astvatsatourov, Anatoli; Aubert, Bernard; Auge, Etienne; Augsten, Kamil; Aurousseau, Mathieu; Avolio, Giuseppe; Avramidou, Rachel Maria; Axen, David; Ay, Cano; Azuelos, Georges; Azuma, Yuya; Baak, Max; Baccaglioni, Giuseppe; Bacci, Cesare; Bach, Andre; Bachacou, Henri; Bachas, Konstantinos; Backes, Moritz; Backhaus, Malte; Badescu, Elisabeta; Bagnaia, Paolo; Bahinipati, Seema; Bai, Yu; Bailey, David; Bain, Travis; Baines, John; Baker, Oliver Keith; Baker, Mark; Baker, Sarah; Banas, Elzbieta; Banerjee, Piyali; Banerjee, Swagato; Banfi, Danilo; Bangert, Andrea Michelle; Bansal, Vikas; Bansil, Hardeep Singh; Barak, Liron; Baranov, Sergei; Barashkou, Andrei; Barbaro Galtieri, Angela; Barber, Tom; Barberio, Elisabetta Luigia; Barberis, Dario; Barbero, Marlon; Bardin, Dmitri; Barillari, Teresa; Barisonzi, Marcello; Barklow, Timothy; Barlow, Nick; Barnett, Bruce; Barnett, Michael; Baroncelli, Antonio; Barone, Gaetano; Barr, Alan; Barreiro, Fernando; Barreiro Guimarães da Costa, João; Barrillon, Pierre; Bartoldus, Rainer; Barton, Adam Edward; Bartsch, Valeria; Bates, Richard; Batkova, Lucia; Batley, Richard; Battaglia, Andreas; Battistin, Michele; Bauer, Florian; Bawa, Harinder Singh; Beale, Steven; Beau, Tristan; Beauchemin, Pierre-Hugues; Beccherle, Roberto; Bechtle, Philip; Beck, Hans Peter; Becker, Sebastian; Beckingham, Matthew; Becks, Karl-Heinz; Beddall, Andrew; Beddall, Ayda; Bedikian, Sourpouhi; Bednyakov, Vadim; Bee, Christopher; Begel, Michael; Behar Harpaz, Silvia; Behera, Prafulla; Beimforde, Michael; Belanger-Champagne, Camille; Bell, Paul; Bell, William; Bella, 
Gideon; Bellagamba, Lorenzo; Bellina, Francesco; Bellomo, Massimiliano; Belloni, Alberto; Beloborodova, Olga; Belotskiy, Konstantin; Beltramello, Olga; Ben Ami, Sagi; Benary, Odette; Benchekroun, Driss; Benchouk, Chafik; Bendel, Markus; Benekos, Nektarios; Benhammou, Yan; Benhar Noccioli, Eleonora; Benitez Garcia, Jorge-Armando; Benjamin, Douglas; Benoit, Mathieu; Bensinger, James; Benslama, Kamal; Bentvelsen, Stan; Berge, David; Bergeaas Kuutmann, Elin; Berger, Nicolas; Berghaus, Frank; Berglund, Elina; Beringer, Jürg; Bernat, Pauline; Bernhard, Ralf; Bernius, Catrin; Berry, Tracey; Bertella, Claudia; Bertin, Antonio; Bertinelli, Francesco; Bertolucci, Federico; Besana, Maria Ilaria; Besson, Nathalie; Bethke, Siegfried; Bhimji, Wahid; Bianchi, Riccardo-Maria; Bianco, Michele; Biebel, Otmar; Bieniek, Stephen Paul; Bierwagen, Katharina; Biesiada, Jed; Biglietti, Michela; Bilokon, Halina; Bindi, Marcello; Binet, Sebastien; Bingul, Ahmet; Bini, Cesare; Biscarat, Catherine; Bitenc, Urban; Black, Kevin; Blair, Robert

    2013-01-01

    The uncertainty on the calorimeter energy response to jets of particles is derived for the ATLAS experiment at the Large Hadron Collider (LHC). First, the calorimeter response to single isolated charged hadrons is measured and compared to the Monte Carlo simulation using proton-proton collisions at centre-of-mass energies of $\\sqrt{s}$ = 900 GeV and 7 TeV collected during 2009 and 2010. Then, using the decay of K_s and Lambda particles, the calorimeter response to specific types of particles (positively and negatively charged pions, protons, and anti-protons) is measured and compared to the Monte Carlo predictions. Finally, the jet energy scale uncertainty is determined by propagating the response uncertainty for single charged and neutral particles to jets. The response uncertainty is 2-5% for central isolated hadrons and 1-3% for the final calorimeter jet energy scale.

  1. Calibration uncertainty

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Anglov, Thomas

    2002-01-01

    Methods recommended by the International Standardization Organisation and Eurachem are not satisfactory for the correct estimation of calibration uncertainty. A novel approach is introduced and tested on actual calibration data for the determination of Pb by ICP-AES. The improved calibration...... uncertainty was verified from independent measurements of the same sample by demonstrating statistical control of analytical results and the absence of bias. The proposed method takes into account uncertainties of the measurement, as well as of the amount of calibrant. It is applicable to all types of...
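
    A minimal sketch of a calibration fit whose uncertainty budget includes both the measured response and the amount of calibrant, in the spirit of (but not identical to) the approach above; the data and uncertainty values are invented for illustration:

    # Straight-line calibration with the calibrant uncertainty folded
    # into an effective response variance. Illustrative data only;
    # this is not the authors' exact procedure.
    import numpy as np

    conc = np.array([0.0, 0.5, 1.0, 2.0, 4.0])       # calibrant, ug/L
    resp = np.array([0.02, 0.51, 1.03, 1.98, 4.05])  # instrument response
    u_resp = 0.02                                     # response std. dev.
    u_conc = 0.01                                     # calibrant std. dev.

    # Rough slope guess to translate calibrant uncertainty into response units.
    b_guess = np.polyfit(conc, resp, 1)[0]
    w = np.full(conc.shape, 1.0 / (u_resp**2 + (b_guess * u_conc)**2))

    # Weighted least squares for y = a + b*x with parameter covariance.
    X = np.column_stack([np.ones_like(conc), conc])
    Xw = X * w[:, None]
    cov = np.linalg.inv(X.T @ Xw)     # (X^T W X)^-1
    a, b = cov @ Xw.T @ resp
    print(f"intercept {a:.4f} +/- {np.sqrt(cov[0, 0]):.4f}")
    print(f"slope     {b:.4f} +/- {np.sqrt(cov[1, 1]):.4f}")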

  2. Uncertainties in turbidity-based measurements of suspended sediment load used to quantify the sediment budget on the catchment scale

    Science.gov (United States)

    de Hipt, Felix Op; Diekkrüger, Bernd; Steup, Gero; Rode, Michael

    2016-04-01

    Water-driven soil erosion, transport and deposition take place on different spatial and temporal scales. Therefore, related measurements are complex and require process understanding and a multi-method approach combining different measurement methods with soil erosion modeling. Turbidity as a surrogate measurement for suspended sediment concentration (SSC) in rivers is frequently used to overcome the disadvantages of conventional sediment measurement techniques regarding temporal resolution and continuity. The use of turbidity measurements requires a close correlation between turbidity and SSC. Depending on the number of samples collected, the measured range and the variations in the measurements, SSC-turbidity curves are subject to uncertainty. This uncertainty has to be determined in order to assess the reliability of measurements used to quantify catchment sediment yields and to calibrate soil erosion models. This study presents the calibration results from a sub-humid catchment in south-western Burkina Faso and investigates the related uncertainties. Daily in situ measurements of SSC manually collected at one turbidity station and the corresponding turbidity readings are used to obtain the site-specific calibration curve. The discharge is calculated based on an empirical water level-discharge relationship. The derived regression equations are used to define prediction intervals for SSC and discharge. The uncertainty of the suspended sediment load time series is influenced by the corresponding uncertainties of SSC and discharge. This study shows that the determination of uncertainty is relevant when turbidity-based measurements of suspended sediment loads are used to quantify catchment erosion and to calibrate erosion models.
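
    The core of such an analysis is a rating-curve regression with prediction intervals. A minimal sketch with synthetic turbidity-SSC pairs (the real calibration would use the paired field samples):

    # Turbidity-SSC rating curve with a 95% prediction interval.
    import numpy as np
    from scipy import stats

    turbidity = np.array([5, 12, 30, 55, 90, 140, 210, 320], float)  # NTU
    ssc = np.array([8, 18, 41, 80, 118, 190, 270, 430], float)       # mg/L

    n = len(turbidity)
    slope, intercept, r, p, se = stats.linregress(turbidity, ssc)
    resid = ssc - (intercept + slope * turbidity)
    s_err = np.sqrt(np.sum(resid**2) / (n - 2))

    def prediction_interval(x_new, alpha=0.05):
        """Prediction interval for SSC at a new turbidity reading."""
        t = stats.t.ppf(1 - alpha / 2, n - 2)
        x_bar = turbidity.mean()
        sxx = np.sum((turbidity - x_bar) ** 2)
        half = t * s_err * np.sqrt(1 + 1/n + (x_new - x_bar)**2 / sxx)
        center = intercept + slope * x_new
        return center - half, center + half

    print(f"r^2 = {r**2:.3f}")
    print("SSC at 100 NTU, 95% PI:", prediction_interval(100.0))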

  3. Analysis of the uncertainty in the measurement of electron densities in plasmas using the wave cutoff method

    International Nuclear Information System (INIS)

    We have analysed the uncertainty of a measured electron density using a wave cutoff probe and compared it with that obtained using a double Langmuir probe and plasma oscillation probe. The wave cutoff probe gives an electron density from a measured plasma frequency, using a network analyser and radiating and detecting antennae. It can also measure the spatial distribution of the electron density. The cutoff method is free of many difficulties often encountered with Langmuir probes, such as thin film deposition and plasma potential fluctuation, and the uncertainty of the cutoff probe is not affected by the complex plasma environment. Here, the measurement technique is theoretically analysed and experimentally demonstrated in density measurements of an inductively coupled radio frequency plasma, and a comparison with the double probe and a plasma oscillation method with uncertainty analysis is also made. (authors)
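
    The underlying conversion is simple: cutoff occurs at the plasma frequency f_p = (1/2π)·sqrt(n_e·e²/(ε₀·m_e)), so the electron density follows as n_e = 4π²·ε₀·m_e·f_p²/e², and a relative uncertainty in f_p propagates with a factor of two into n_e. A small worked sketch (frequency and uncertainty values are illustrative):

    # Electron density from the measured cutoff (plasma) frequency.
    import math

    eps0 = 8.8541878128e-12   # vacuum permittivity, F/m
    m_e = 9.1093837015e-31    # electron mass, kg
    e = 1.602176634e-19       # elementary charge, C

    def electron_density(f_cutoff_hz):
        # Invert f_p = (1/2pi) sqrt(n_e e^2 / (eps0 m_e)).
        return 4 * math.pi**2 * eps0 * m_e * f_cutoff_hz**2 / e**2

    def density_uncertainty(f_cutoff_hz, u_f_hz):
        # n_e scales as f^2, so the relative uncertainty doubles.
        n = electron_density(f_cutoff_hz)
        return n, n * 2 * (u_f_hz / f_cutoff_hz)

    n, u_n = density_uncertainty(2.0e9, 0.02e9)   # 2 GHz cutoff, 1% reading
    print(f"n_e = {n:.3e} m^-3 +/- {u_n:.1e}")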

  4. Analysis of the uncertainty in the measurement of electron densities in plasmas using the wave cutoff method

    Energy Technology Data Exchange (ETDEWEB)

    Jung-Hyung, Kim; Kwang-Hwa, Chung; Yong-Hyeon, Shin [Korea Research Inst. of Standards and Science, Center for Vacuum Technology, Daejeon (Korea, Republic of)

    2005-04-01

    We have analysed the uncertainty of a measured electron density using a wave cutoff probe and compared it with that obtained using a double Langmuir probe and plasma oscillation probe. The wave cutoff probe gives an electron density from a measured plasma frequency, using a network analyser and radiating and detecting antennae. It can also measure the spatial distribution of the electron density. The cutoff method is free of many difficulties often encountered with Langmuir probes, such as thin film deposition and plasma potential fluctuation, and the uncertainty of the cutoff probe is not affected by the complex plasma environment. Here, the measurement technique is theoretically analysed and experimentally demonstrated in density measurements of an inductively coupled radio frequency plasma, and a comparison with the double probe and a plasma oscillation method with uncertainty analysis is also made. (authors)

  5. Comment on ‘A low-uncertainty measurement of the Boltzmann constant’

    Science.gov (United States)

    Macnaughton, Donald B.

    2016-02-01

    The International Committee for Weights and Measures has projected a major revision of the International System of Units in which all the base units will be defined by fixing the values of certain fundamental constants of nature. To assist, de Podesta et al recently experimentally obtained a precise new estimate of the Boltzmann constant. This estimate is proposed as a basis for the redefinition of the unit of temperature, the kelvin. The present paper reports a reanalysis of de Podesta et al’s data that reveals systematic non-random patterns in the residuals of the key fitted model equation. These patterns violate the assumptions underlying the analysis and thus they raise questions about the validity of de Podesta et al’s estimate of the Boltzmann constant. An approach is discussed to address these issues, which should lead to an accurate estimate of the Boltzmann constant with a lower uncertainty.
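
    One standard way to flag the kind of systematic non-random residual pattern described above is a runs test on the signs of the residuals. A minimal sketch on synthetic residuals (not de Podesta et al's data):

    # Wald-Wolfowitz runs test on residual signs: too few runs
    # indicates a systematic (non-random) pattern.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    x = np.linspace(0, 1, 60)
    resid = 0.05 * np.sin(6 * np.pi * x) + rng.normal(0, 0.03, x.size)

    signs = resid > 0
    runs = 1 + np.count_nonzero(signs[1:] != signs[:-1])
    n_pos, n_neg = signs.sum(), (~signs).sum()

    # Expected number of runs and its variance under randomness.
    mu = 2 * n_pos * n_neg / (n_pos + n_neg) + 1
    var = (mu - 1) * (mu - 2) / (n_pos + n_neg - 1)
    z = (runs - mu) / np.sqrt(var)
    print(f"runs={runs}, expected={mu:.1f}, z={z:.2f}, "
          f"p={2 * stats.norm.sf(abs(z)):.4f}")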

  6. Monte Carlo Method for Calculating Oxygen Abundances and Their Uncertainties from Strong-Line Flux Measurements

    CERN Document Server

    Bianco, Federica B; Oh, Seung Man; Fierroz, David; Liu, Yuqian; Kewley, Lisa; Graur, Or

    2015-01-01

    We present the open-source Python code pyMCZ that determines oxygen abundance and its distribution from strong emission lines in the standard metallicity scales, based on the original IDL code of Kewley & Dopita (2002) with updates from Kewley & Ellison (2008), and expanded to include more recently developed scales. The standard strong-line diagnostics have been used to estimate the oxygen abundance in the interstellar medium through various emission line ratios in many areas of astrophysics, including galaxy evolution and supernova host galaxy studies. We introduce a Python implementation of these methods that, through Monte Carlo (MC) sampling, better characterizes the statistical reddening-corrected oxygen abundance confidence region. Given line flux measurements and their uncertainties, our code produces synthetic distributions for the oxygen abundance in up to 13 metallicity scales simultaneously, as well as for E(B-V), and estimates their median values and their 66% confidence regions. In additi...
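
    The Monte Carlo idea can be sketched in a few lines (this illustrates the approach, not the pyMCZ interface; the linear strong-line calibration coefficients below are only indicative):

    # Sample line fluxes within their uncertainties, push each draw
    # through a strong-line calibration, report median and 68% region.
    import numpy as np

    rng = np.random.default_rng(42)

    f_nii, u_nii = 1.2e-16, 0.1e-16     # [NII] 6584 flux and std. dev.
    f_ha, u_ha = 4.0e-16, 0.2e-16       # H-alpha flux and std. dev.

    n = 100_000
    nii = rng.normal(f_nii, u_nii, n)
    ha = rng.normal(f_ha, u_ha, n)
    ok = (nii > 0) & (ha > 0)           # discard unphysical draws

    n2 = np.log10(nii[ok] / ha[ok])
    oh = 8.90 + 0.57 * n2               # illustrative linear N2 calibration

    lo, med, hi = np.percentile(oh, [16, 50, 84])
    print(f"12+log(O/H) = {med:.3f} (+{hi - med:.3f}/-{med - lo:.3f})")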

  7. The ellipsoidal nested sampling and the expression of the model uncertainty in measurements

    Science.gov (United States)

    Gervino, Gianpiero; Mana, Giovanni; Palmisano, Carlo

    2016-07-01

    In this paper, we consider the problems of identifying the most appropriate model for a given physical system and of assessing the model contribution to the measurement uncertainty. The above problems are studied in terms of Bayesian model selection and model averaging. As the evaluation of the “evidence” Z, i.e., the integral of Likelihood × Prior over the space of the measurand and the parameters, becomes impracticable when this space has 20–30 dimensions, it is necessary to consider an appropriate numerical strategy. Among the many algorithms for calculating Z, we have investigated the ellipsoidal nested sampling, which is a technique based on three pillars: the study of the iso-likelihood contour lines of the integrand, a probabilistic estimate of the volume of the parameter space contained within the iso-likelihood contours and the random samplings from hyperellipsoids embedded in the integration variables. This paper lays out the essential ideas of this approach.
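
    The essentials of nested sampling fit in a short script: live points drawn from the prior, iterative replacement of the worst point under a rising likelihood constraint, and accumulation of Z from the shrinking prior volume. The sketch below is one-dimensional, so plain rejection sampling stands in for the ellipsoidal proposal, which only matters in many dimensions:

    # Compact 1-D nested sampling estimate of Z = integral of L * prior.
    # Gaussian likelihood, uniform prior on [-5, 5].
    import numpy as np

    rng = np.random.default_rng(7)

    def loglike(x):
        # Standard normal log-likelihood.
        return -0.5 * x**2 - 0.5 * np.log(2 * np.pi)

    N = 200                               # live points
    iters = 1200
    live = rng.uniform(-5, 5, N)          # draws from the uniform prior
    live_ll = loglike(live)
    logz = -np.inf

    for i in range(iters):
        worst = np.argmin(live_ll)
        ll_min = live_ll[worst]
        # Shell weight: prior volume shrinks by ~exp(-1/N) per iteration.
        logw = -i / N + np.log(1 - np.exp(-1.0 / N))
        logz = np.logaddexp(logz, logw + ll_min)
        # Replace the worst point by a prior draw with L > L_min
        # (plain rejection; ellipsoidal proposals accelerate this step).
        while True:
            x = rng.uniform(-5, 5)
            if loglike(x) > ll_min:
                live[worst], live_ll[worst] = x, loglike(x)
                break

    # Add the prior mass still carried by the live points.
    logz = np.logaddexp(logz, -iters / N + np.log(np.exp(live_ll).mean()))

    # Analytic check: the unit Gaussian integrates to ~1 over [-5, 5]
    # and the prior density is 1/10, so Z ~ 0.1, log Z ~ -2.303.
    print(f"estimated log Z = {logz:.3f} (analytic ~ {np.log(0.1):.3f})")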

  8. Study and Application of Safety Risk Evaluation Model for CO2 Geological Storage Based on Uncertainty Measure Theory

    OpenAIRE

    Hujun He; Yaning Zhao; Xingke Yang; Yaning Gao; Xu Wu

    2015-01-01

    Analysis showed that safety risk evaluation for CO2 geological storage has important significance. Addressing the characteristics of CO2 geological storage safety risk evaluation, and drawing on previous research results, rank and order models for the safety risk evaluation of CO2 geological storage were put forward based on information entropy and uncertainty measure theory. In this model, the uncertainty problems in the safety risk evaluation of CO2 geological storage were solved by qualitative anal...

  9. Boon or bane of advance tax rulings as a measure to mitigate tax uncertainty and foster investment

    OpenAIRE

    Diller, Markus; Kortebusch, Pia; Schneider, Georg; Sureth, Caren

    2015-01-01

    Tax uncertainty often negatively affects investment. Advance tax rulings (ATRs) are commonly used as a measure to provide tax certainty. Rulings are currently controversially discussed in the context of tax planning activities of multinational firms (Luxembourg Leaks). We analyze ATRs as tax uncertainty shields from both the taxpayers' and the tax authorities' perspectives. In general, tax authorities charge ATR fees and investors request ATRs provided the fee does not exceed a certain thresh...

  10. Assessing the impact of measurement frequency on accuracy and uncertainty of water quality data

    Science.gov (United States)

    Helm, Björn; Schiffner, Stefanie; Krebs, Peter

    2014-05-01

    Physico-chemical water quality is a major objective for the evaluation of the ecological state of a river water body. Physical and chemical water properties are measured to assess the river state, identify prevalent pressures and develop mitigating measures. Water quality is regularly assessed based on weekly to quarterly grab samples. The increasing availability of online-sensor data measured at high frequency allows for an enhanced understanding of emission and transport dynamics, as well as the identification of typical and critical states. In this study we present a systematic approach to assess the impact of measurement frequency on the accuracy and uncertainty of derived aggregate indicators of environmental quality. High-frequency measured (10 min⁻¹ and 15 min⁻¹) data on water temperature, pH, turbidity, electric conductivity and concentrations of dissolved oxygen, nitrate, ammonia and phosphate are assessed in resampling experiments. The data is collected at 14 sites in eastern and northern Germany representing catchments between 40 km2 and 140 000 km2 of varying properties. Resampling is performed to create series of hourly to quarterly frequency, including special restrictions like sampling at working hours or discharge compensation. Statistical properties and their confidence intervals are determined in a bootstrapping procedure and evaluated along a gradient of sampling frequency. For all variables the range of the aggregate indicators in the bootstrapping realizations increases greatly with decreasing sampling frequency. Mean values of electric conductivity, pH and water temperature obtained with monthly frequency differ on average by less than five percent from the original data. Mean dissolved oxygen, nitrate and phosphate had less than 15 % bias at most stations. Ammonia and turbidity are most sensitive to reduced sampling frequency, with up to 30 % average and 250 % maximum bias at monthly sampling frequency. A systematic bias is recognized
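
    The resampling experiment can be mimicked on synthetic sensor data: thin a high-frequency series to coarser intervals, bootstrap each thinned series, and watch the confidence interval of an aggregate indicator widen. All numbers below are invented:

    # Bootstrap the mean of a thinned high-frequency series at several
    # sampling intervals; the CI widens as the interval grows.
    import numpy as np

    rng = np.random.default_rng(3)
    t = np.arange(0, 96 * 365) * 0.25            # one year of 15-min steps (h)
    series = (8.0 + 2.0 * np.sin(2 * np.pi * t / 24)          # diurnal cycle
              + 1.5 * np.sin(2 * np.pi * t / (24 * 365))      # seasonal trend
              + rng.normal(0, 0.8, t.size))                   # sensor noise

    def bootstrap_ci(x, n_boot=2000, alpha=0.05):
        means = np.array([rng.choice(x, x.size).mean() for _ in range(n_boot)])
        return np.percentile(means, [100 * alpha / 2, 100 * (1 - alpha / 2)])

    for label, step in [("15 min", 1), ("hourly", 4),
                        ("daily", 96), ("monthly", 96 * 30)]:
        thinned = series[::step]
        lo, hi = bootstrap_ci(thinned)
        print(f"{label:>8}: n={thinned.size:6d}  mean={thinned.mean():.3f}  "
              f"95% CI=({lo:.3f}, {hi:.3f})")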

  11. New method and uncertainty estimation for plate dimensions and surface measurements

    International Nuclear Information System (INIS)

    Dimensional and surface quality control in tile plate manufacturing faces difficult engineering challenges, one of which is that plates in large-scale mass production have geometrically uneven surfaces. A traditional measurement method is used to assess tile plate dimensions and surface quality based on the standard specifications ISO 10545-2:1995, EOS 3168-2:2007 and TIS 2398-2:2008. A new measurement method for the dimensions and surface quality of oblong large-scale ceramic tile plates has been developed and compared with the traditional method. The proposed method is based on a CMM straightness measurement strategy instead of the centre-point strategy of the traditional method. Expanded uncertainty budgets for the measurements of each method have been estimated in detail. Accurate estimates of the centre of curvature (CC), centre of edge (CE), warpage (W) and edge crack defect parameters were achieved in accordance with the standards. Moreover, the results showed that the new method is not only more accurate but also significantly improves the quality of tile plate products

  12. Spatial resolution and measurement uncertainty of strains in bone and bone-cement interface using digital volume correlation.

    Science.gov (United States)

    Zhu, Ming-Liang; Zhang, Qing-Hang; Lupton, Colin; Tong, Jie

    2016-04-01

    The measurement uncertainty of strains has been assessed in a bone analogue (sawbone), bovine trabecular bone and bone-cement interface specimens under zero load using the Digital Volume Correlation (DVC) method. The effects of sub-volume size, sample constraint and preload on the measured strain uncertainty have been examined. There is generally a trade-off between the measurement uncertainty and the spatial resolution. Suitable sub-volume sizes have been selected based on a compromise between the measurement uncertainty and the spatial resolution of the cases considered. A ratio of sub-volume size to a microstructure characteristic (Tb.Sp) was introduced to reflect a suitable spatial resolution, and the associated measurement uncertainty was assessed. Specifically, ratios between 1.6 and 4 appear to give rise to standard deviations in the measured strains between 166 and 620 με in all the cases considered, which would seem to suffice for strain analysis in pre- as well as post-yield loading regimes. A microscale finite element (μFE) model was built from the CT images of the sawbone, and the results from the μFE model and a continuum FE model were compared with those from the DVC. The strain results were found to differ significantly between the two methods at tissue level, consistent in trend with the results found in human bones, indicating mainly a limitation of the current DVC method in mapping strains at this level. PMID:26741534

  13. Time-Resolved Particle Image Velocimetry Measurements with Wall Shear Stress and Uncertainty Quantification for the FDA Nozzle Model.

    Science.gov (United States)

    Raben, Jaime S; Hariharan, Prasanna; Robinson, Ronald; Malinauskas, Richard; Vlachos, Pavlos P

    2016-03-01

    We present advanced particle image velocimetry (PIV) processing, post-processing, and uncertainty estimation techniques to support the validation of computational fluid dynamics analyses of medical devices. This work is an extension of a previous FDA-sponsored multi-laboratory study, which used a medical device mimicking geometry referred to as the FDA benchmark nozzle model. Experimental measurements were performed using time-resolved PIV at five overlapping regions of the model for Reynolds numbers in the nozzle throat of 500, 2000, 5000, and 8000. Images included a twofold increase in spatial resolution in comparison to the previous study. Data was processed using ensemble correlation, dynamic range enhancement, and phase correlations to increase signal-to-noise ratios and measurement accuracy, and to resolve flow regions with large velocity ranges and gradients, which is typical of many blood-contacting medical devices. Parameters relevant to device safety, including shear stress at the wall and in bulk flow, were computed using radial basis functions. In addition, in-field spatially resolved pressure distributions, Reynolds stresses, and energy dissipation rates were computed from PIV measurements. Velocity measurement uncertainty was estimated directly from the PIV correlation plane, and uncertainty analysis for wall shear stress at each measurement location was performed using a Monte Carlo model. Local velocity uncertainty varied greatly and depended largely on local conditions such as particle seeding, velocity gradients, and particle displacements. Uncertainty in low velocity regions in the sudden expansion section of the nozzle was greatly reduced by over an order of magnitude when dynamic range enhancement was applied. Wall shear stress uncertainty was dominated by uncertainty contributions from velocity estimations, which were shown to account for 90-99% of the total uncertainty. This study provides advancements in the PIV processing methodologies over
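
    The Monte Carlo treatment of wall shear stress uncertainty can be sketched as follows: perturb the near-wall velocities by their estimated PIV uncertainties, recompute the wall gradient each time, and take the spread of τ_w = μ·du/dy over the trials. Viscosity, positions and uncertainties below are illustrative, not the study's values:

    # Monte Carlo propagation of PIV velocity uncertainty to wall
    # shear stress tau_w = mu * du/dy.
    import numpy as np

    rng = np.random.default_rng(5)

    mu = 3.5e-3                                  # viscosity, Pa s (blood analog)
    y = np.array([0.2e-3, 0.4e-3, 0.6e-3])       # wall distances, m
    u = np.array([0.08, 0.16, 0.24])             # mean velocities, m/s
    u_sigma = np.array([0.004, 0.005, 0.006])    # per-point PIV uncertainty

    def wall_gradient(u_sample):
        # Linear fit through the near-wall points; the slope
        # approximates du/dy at the wall.
        return np.polyfit(y, u_sample, 1)[0]

    taus = np.array([mu * wall_gradient(rng.normal(u, u_sigma))
                     for _ in range(20000)])
    print(f"tau_w = {taus.mean():.3f} +/- {taus.std():.3f} Pa")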

  14. Uncertainty Measurement and Visual Analysis on Terroristic Attacks Data

    Institute of Scientific and Technical Information of China (English)

    贺怀清; 王赫

    2012-01-01

    In recent years, terroristic activities have occurred more frequently and have seriously affected regional stability and world peace. With the development of information technology, researchers are able to obtain information on terroristic attacks from many sources. However, as the scale of the data sets grows, how to uncover the underlying information in a large volume of data and analyze the uncertainty it contains has become an important issue in the analysis of terroristic attacks. For the Global Terrorism Database, based on visual analysis and uncertainty measurement theory, we propose measurement and visual analysis methods for the uncertainty of data records and attributes. By integrating the uncertainty measurement results with parallel coordinates, histograms, area charts and interactive methods, the uncertainty in the data is clearly displayed without affecting the representation of the data source, and an information basis is provided for the next step, situation assessment based on uncertainty theory.

  15. A study on the relationship between measurement uncertainty and the size of the disk gauge used to calibrate a straightness measuring system

    International Nuclear Information System (INIS)

    An autonomous method for calibrating the zero difference for the three-point method of surface straightness measurement is presented and discussed with respect to the relationship between the measurement uncertainty and the size of the disk gauge used for calibration. In this method, the disk gauge is used in two steps. In the first step, the disk gauge rotates a few revolutions and moves parallel to three displacement sensors built into a holder. In the second step, the geometrical parameters between the sensors and the disk gauge are acquired, and the zero differences are computed by our recently proposed algorithm. Finally, the uncertainty of the zero differences is analyzed and simulated numerically, and the relationship between the disk gauge radius and the measurement uncertainty is calculated. The use of a disk gauge of larger radius results in smaller uncertainty of straightness measurement

  16. Sensitivity of Large-Aperture Scintillometer Measurements of Area-Average Heat Fluxes to Uncertainties in Topographic Heights

    CERN Document Server

    Gruber, Matthew A; Hartogensis, Oscar K

    2013-01-01

    Scintillometers measure $C_n^2$ over large areas of turbulence in the atmospheric surface layer. Turbulent fluxes of heat and momentum are inferred through coupled sets of equations derived from the Monin-Obukhov similarity hypothesis. One-dimensional sensitivity functions have been produced which relate the sensitivity of heat fluxes to uncertainties in single values of beam height over homogeneous and flat terrain. Real field sites include variable topography and heterogeneous surface properties such as roughness length. We develop here the first analysis of the sensitivity of scintillometer-derived sensible heat fluxes to uncertainties in spatially distributed topographic measurements. For large-aperture scintillometers and independent $u_\star$ measurements, sensitivity is shown to be concentrated in areas near the center of the beam and where the underlying topography is closest to the beam height. Uncertainty may be greatly reduced by focusing precise topographic measurements in these areas. The new two...

  17. International target values 2010 for achievable measurement uncertainties in nuclear material accountancy

    Energy Technology Data Exchange (ETDEWEB)

    Dias, Fabio C., E-mail: fabio@ird.gov.b [Comissao Nacional de Energia Nuclear (CNEN), Rio de Janeiro, RJ (Brazil); Almeida, Silvio G. de; Renha Junior, Geraldo, E-mail: silvio@abacc.org.b, E-mail: grenha@abacc.org.b [Agencia Brasileiro-Argentina de Contabilidade e Controle de Materiais Nucleares (ABACC), Rio de Janeiro, RJ (Brazil)

    2011-07-01

    The International Target Values (ITVs) are reasonable uncertainty estimates that can be used in judging the reliability of measurement techniques applied to industrial nuclear and fissile materials subject to accountancy and/or safeguards verification. In the absence of relevant experimental estimates, ITVs can also be used to select measurement techniques and calculate sample population during the planning phase of verification activities. It is important to note that ITVs represent estimates of the 'state-of-the-practice', which should be achievable under routine measurement conditions affecting both facility operators and safeguards inspectors, not only in the field, but also in laboratory. Tabulated values cover measurement methods used for the determination of volume or mass of the nuclear material, for its elemental and isotopic assays, and for its sampling. The 2010 edition represents the sixth revision of the International Target Values (ITVs), issued by the International Atomic Energy Agency (IAEA) as a Safeguards Technical Report (STR-368). The first version was released as 'Target Values' in 1979 by the Working Group on Techniques and Standards for Destructive Analysis (WGDA) of the European Safeguards Research and Development Association (ESARDA) and focused on destructive analytical methods. In the latest 2010 revision, international standards in estimating and expressing uncertainties have been considered while maintaining a format that allows comparison with the previous editions of the ITVs. Those standards have been usually applied in QC/QA programmes, as well as qualification of methods, techniques and instruments. Representatives of the Brazilian Nuclear Energy Commission (CNEN) and the Brazilian-Argentine Agency for Accounting and Control of Nuclear Materials (ABACC) participated in previous Consultants Group Meetings since the one convened to establish the first list of ITVs released in 1993 and in subsequent revisions

  18. Improving the control of systematic uncertainties in precision measurements of radionuclide half-life

    International Nuclear Information System (INIS)

    Many experiments designed to precisely determine the half-life of a radionuclide employ a long-lived reference source to help determine the impact on the data of any systematic variation in the detector and associated electronics. The half-life of the radionuclide of interest is determined from the ratio of its decay rate data to the decay rate data from the reference source. This correction procedure assumes that any underlying systematic effect affects the data and reference measurements in exactly the same way. In this paper we show that when some systematic effects affect the two differently, the ratio procedure can leave artifacts in the corrected data that can compromise an unbiased and precise assessment of the radionuclide half-life. We describe two methods that can help overcome this problem. We also describe several statistical tests that help determine which effects may underlie systematic variations in the data. We discuss an illustrative example based on previously published 32Si and 36Cl data recorded by an experiment at Brookhaven National Laboratory. We correct the data for systematic variation related to climate variation and estimate the 32Si half-life to be T1/2 = 171.8 ± 1.8 years. The reduction in uncertainty in the 32Si half-life, relative to the previous estimate based upon this data, is equivalent to that which would be achieved through increasing the size of the data set by almost 3.5 times. - Author-Highlights: • Isotope decay data and reference source data can have differing systematics. • Differing systematics can inflate uncertainty of isotope half-life estimate. • We describe two methods to overcome this problem. • We describe statistical tests to determine which variables cause systematics. • We analyze Brookhaven 32Si/36Cl decay data as an illustrative example
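
    The ratio technique itself is compact: divide the isotope count rates by the reference rates so that any drift common to both cancels, then fit the exponential decay on the ratio. The synthetic example below builds in a shared seasonal systematic, which is exactly the situation where the method works; the paper's point is that it can fail when the systematic is not shared:

    # Half-life from the ratio of isotope to reference count rates.
    import numpy as np

    rng = np.random.default_rng(11)

    t = np.linspace(0, 25, 120)                     # measurement times, years
    half_life = 172.0                                # "true" value, years
    lam = np.log(2) / half_life

    drift = 1 + 0.02 * np.sin(2 * np.pi * t)        # shared seasonal systematic
    si32 = 1000 * np.exp(-lam * t) * drift * rng.normal(1, 0.002, t.size)
    cl36 = 500 * drift * rng.normal(1, 0.002, t.size)   # ~stable reference

    ratio = si32 / cl36                              # the drift cancels here
    # log(ratio) is linear in t with slope -lambda.
    slope, intercept = np.polyfit(t, np.log(ratio), 1)
    print(f"fitted half-life: {np.log(2) / -slope:.1f} years")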

  19. Software simulation of a lock-in amplifier with application to the evaluation of uncertainties in real measuring systems

    International Nuclear Information System (INIS)

    Lock-in amplifiers are widely used in metrology to extract a sinusoidal component of a known frequency from a signal that is noisy, returning estimates of the amplitude and phase of the component. A number of questions arise regarding the use by metrologists of the results returned by a lock-in amplifier. For example, what uncertainties should be associated with the estimates returned? How do these uncertainties vary as the signal-to-noise ratio changes? How do instrument errors affect the estimates and the associated uncertainties? In this paper a software simulation tool is described that may be used to process simulated or real (measured) signals, with the user allowed to associate uncertainties with the values of parameters that define the simulated signal and with those that define the operation of the instrument. The results of applying the software simulation tool to both simulated signals and real signals from infrared radiation and nanoindentation experiments are described

  20. Minimizing measurement uncertainties of coniferous needle-leaf optical properties, part II: experimental set-up and error analysis

    NARCIS (Netherlands)

    Yanez Rausell, L.; Malenovsky, Z.; Clevers, J.G.P.W.; Schaepman, M.E.

    2014-01-01

    We present uncertainties associated with the measurement of coniferous needle-leaf optical properties (OPs) with an integrating sphere using an optimized gap-fraction (GF) correction method, where GF refers to the air gaps appearing between the needles of a measured sample. We used an optically stab

  1. Shannon Revisited: Considering a More Tractable Expression to Measure and Manage Intractability, Uncertainty, Risk, Ignorance, and Entropy

    OpenAIRE

    Samid, Gideon

    2010-01-01

    Building on Shannon's lead, let's consider a more malleable expression for tracking uncertainty, and states of "knowledge available" vs. "knowledge missing," to better practice innovation, improve risk management, and successfully measure progress of intractable undertakings. Shannon's formula, and its common replacements (Renyi, Tsallis) compute to increased knowledge whenever two competing choices, however marginal, exchange probability measures. Such and other distortions are corrected by ...

  2. Use of on-ground gamma-ray spectrometry to measure plant-available potassium and other topsoil attributes

    International Nuclear Information System (INIS)

    The incidence of potassium (K) deficiency is increasing in crops, pastures, and forestry in south-western Australia. Although soil K can be measured using soil sampling and analysis, γ-ray spectrometry offers a potentially cheaper and spatially more precise alternative. This could be particularly useful in precision agriculture, where inputs are applied according to need rather than by general prescription. In a study of topsoils near Jerramungup, Western Australia, strong relationships (r2 = 0.9) were found between on-ground counts of γ-rays derived from 40K (γ-K) and both total K and plant-available K. The success of γ-ray spectrometry in predicting available K relied on a strong relationship (r2 = 0.9) between total K and available K, which may not hold in all areas. Although the relationship between γ-K and available K held over the range of 36-1012 mg/kg, crop response to K fertilisers is only expected at low available K contents. Strong relationships (r2 = 0.9) were also found between γ-K and a range of other soil attributes, including clay, silt, and organic carbon content. These relationships depended on the locally strong relationship between total K and these soil attributes. Since such relationships do not hold everywhere, the utility of γ-ray spectrometry will likewise be limited. Site-specific calibrations are required if γ-ray spectrometry is to be used for soil property mapping. Copyright (1999) CSIRO Publishing

  3. Construction of measurement uncertainty profiles for quantitative analysis of genetically modified organisms based on interlaboratory validation data.

    Science.gov (United States)

    Macarthur, Roy; Feinberg, Max; Bertheau, Yves

    2010-01-01

    A method is presented for estimating the size of uncertainty associated with the measurement of products derived from genetically modified organisms (GMOs). The method is based on the uncertainty profile, which is an extension, for the estimation of uncertainty, of a recent graphical statistical tool called an accuracy profile that was developed for the validation of quantitative analytical methods. The application of uncertainty profiles as an aid to decision making and assessment of fitness for purpose is also presented. Results of the measurement of the quantity of GMOs in flour by PCR-based methods collected through a number of interlaboratory studies followed the log-normal distribution. Uncertainty profiles built using the results generally give an expected range for measurement results of 50-200% of reference concentrations for materials that contain at least 1% GMO. This range is consistent with European Network of GM Laboratories and the European Union (EU) Community Reference Laboratory validation criteria and can be used as a fitness for purpose criterion for measurement methods. The effect on the enforcement of EU labeling regulations is that, in general, an individual analytical result needs to be greater than 1.8% to demonstrate noncompliance with a labeling threshold of 0.9%. PMID:20629412
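
    The arithmetic behind the 1.8% enforcement figure is worth making explicit: a 50-200% expected range corresponds to a factor-two expanded uncertainty on a log-normal result, so noncompliance with a 0.9% threshold is only demonstrated above 2 × 0.9% = 1.8%. A short check, with the log-normal width chosen so that about 95% of results fall inside the 50-200% band:

    # Decision limit and coverage check for log-normal GMO results.
    import numpy as np

    threshold = 0.9          # labeling threshold, % GMO
    factor = 2.0             # expanded uncertainty factor (200%/100%)

    decision_limit = factor * threshold
    print(f"decision limit: {decision_limit:.1f}%")

    # Simulated lab results for a sample truly at the threshold.
    rng = np.random.default_rng(8)
    sigma = np.log(factor) / 2          # ~95% of results within a factor of 2
    results = threshold * rng.lognormal(0.0, sigma, 100000)
    coverage = np.mean((results > 0.5 * threshold) & (results < 2 * threshold))
    print(f"fraction of results inside 50-200%: {coverage:.3f}")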

  4. Conversion factor and uncertainty estimation for quantification of towed gamma-ray detector measurements in Tohoku coastal waters

    Science.gov (United States)

    Ohnishi, S.; Thornton, B.; Kamada, S.; Hirao, Y.; Ura, T.; Odano, N.

    2016-05-01

    Factors to convert the count rate of a NaI(Tl) scintillation detector to the concentration of radioactive cesium in marine sediments are estimated for a towed gamma-ray detector system. The response of the detector against a unit concentration of radioactive cesium is calculated by Monte Carlo radiation transport simulation considering the vertical profile of radioactive material measured in core samples. The conversion factors are acquired by integrating the contribution of each layer and are normalized by the concentration in the surface sediment layer. At the same time, the uncertainty of the conversion factors is formulated and estimated. The combined standard uncertainty of the radioactive cesium concentration by the towed gamma-ray detector is around 25 percent. The values of uncertainty, often referred to as relative root mean square errors in other works, between sediment core sampling measurements and towed detector measurements were 16 percent in the investigation made near the Abukuma River mouth and 5.2 percent in Sendai Bay, respectively. Most of the uncertainty is due to interpolation of the conversion factors between core samples and uncertainty of the detector's burial depth. The results of the towed measurements agree well with laboratory-analysed sediment samples. Also, the concentrations of radioactive cesium at the intersection of each survey line are consistent. The consistency with sampling results and between different lines' transects demonstrates the availability and reproducibility of the towed gamma-ray detector system.

  5. Improvement of ATHLET modelling capability for asymmetric natural circulation phenomenon using uncertainty and sensitivity measures

    International Nuclear Information System (INIS)

    Highlights: • Overprediction of mass flow during asymmetric natural circulation of the PKL facility is observed with the ATHLET code. • Uncertainty and sensitivity measures are used to evaluate the influence of various input and model parameters. • Bypass heating location, values, and the heat loss coefficient of the isolated steam generator are identified. • The mass flow reduction phenomenon in the isolated loop can be captured by the improved modelling. - Abstract: Natural circulation is one of the most important phenomena relating to nuclear reactor safety. In the last years of the 20th century, many test facilities were set up to conduct natural circulation experiments. At the same time, some system codes, such as RELAP5 and CATHARE, have been validated against the experimental data produced by these facilities. The calculation results show that these codes can capture most of the characteristics of the natural circulation transient, except for some special processes. One example is the PKL IIIB3.1 test, an asymmetric natural circulation transient in the primary loop with an isolated steam generator. The cool-down process, using the intact steam generators by reducing the secondary-side pressure, is the driving force of the natural circulation in the primary loop. In this paper, a post-test calculation is performed with the best-estimate thermal-hydraulic code ATHLET to simulate the PKL IIIB3.1 test. Due to the effect of the isolated steam generator on the whole primary cooling system, the results show that the mass flow in the isolated loop is significantly overpredicted compared with the experimental data. Many measures have been tried to improve the simulation results, but an effective method is still missing. Based on the post-test calculation results, an uncertainty and sensitivity analysis is carried out to find out the main reason for this higher circulation mass flow. Many factors are considered in this study, e.g. the boundary conditions, thermal-hydraulic correlations, and

  6. Uncertainties of retrospective radon concentration measurements by multilayer surface trap detector

    International Nuclear Information System (INIS)

    A detector for retrospective radon exposure measurements is developed. The detector consists of a multilayer package of LR-115 type solid-state nuclear track detectors. The nitrocellulose films work both as α-particle detectors and as absorbers decreasing the energy of α-particles. The uncertainties of implanted 210Pb measurements by two- and three-layer detectors are assessed as functions of the surface 210Po activity and the gross background activity of the glass. A generalized compartment model of the behavior of radon decay products in the room atmosphere was developed and verified. It is shown that the parameters most influencing the conversion coefficient from 210Po surface activity to average radon concentration are the aerosol particle concentration, the deposition velocity of unattached 218Po and the air exchange rate. It is demonstrated that, with the use of additional information on the room surface-to-volume ratio, air exchange rate and aerosol particle concentration, the systematic bias of the conversion coefficient between the surface activity of 210Po and the average radon concentration can be decreased to 30 %. (N.C.)

  7. An evaluation of measurement uncertainties in the on-line measurement of coal ash content by gamma-ray transmission

    International Nuclear Information System (INIS)

    In this paper, a significant effect producing systematic errors in the on-line measurement using gamma-ray transmission is revealed. Ash content fluctuations or thickness changes lead to a permanent negative systematic error in the results of the measurements. To study uncertainties in the measurements applicable to time-independent ash content indicators and to investigate the characteristics of the radiation attenuation process, the behavior of the quantity in question is modeled with a stationary Gaussian distribution. A systematic error-producing effect has been found, and a quantitative correction is given to compensate for it. For some other quantities in question that vary in time, a linear model is used to discuss the systematic errors in the case of an automated coal gangue separator. Results of experiments that demonstrate different systematic errors for different sampling intervals are presented. The reason for these errors is the nonlinearity of the relationship between the radiation intensity, on the one hand, and the sample thickness and mass attenuation, on the other
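
    The negative systematic error has a simple origin: transmitted intensity I = I₀·exp(-μx) is convex in the thickness x, so by Jensen's inequality the time-averaged intensity over a fluctuating layer exceeds the intensity of the average layer, and the inverted thickness (or mass) comes out low. A numerical illustration with invented numbers:

    # Jensen's inequality bias in gamma-ray transmission gauging.
    import numpy as np

    rng = np.random.default_rng(2)

    mu = 0.2                   # effective attenuation coefficient, 1/cm
    x_mean, x_std = 10.0, 2.0  # fluctuating coal layer thickness, cm
    x = rng.normal(x_mean, x_std, 1_000_000)

    transmission = np.exp(-mu * x).mean()        # what the detector averages
    x_apparent = -np.log(transmission) / mu      # thickness inferred from it

    print(f"true mean thickness:     {x.mean():.3f} cm")
    print(f"apparent mean thickness: {x_apparent:.3f} cm (biased low)")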

  8. An evaluation of measurement uncertainties in the on-line measurement of coal ash content by gamma-ray transmission.

    Science.gov (United States)

    Liu, Wenzhong; Kong, Li; Qu, Tan; Chen, Jingjing

    2002-09-01

    In this paper, a significant effect producing systematic errors in the on-line measurement using gamma-ray transmission is revealed. Ash content fluctuations or thickness changes lead to a permanent negative systematic error in the results of the measurements. To study uncertainties in the measurements applicable to time-independent ash content indicators and to investigate the characteristics of the radiation attenuation process, the behavior of the quantity in question is modeled with a stationary Gaussian distribution. A systematic error-producing effect has been found, and a quantitative correction is given to compensate for it. For some other quantities in question that vary in time, a linear model is used to discuss the systematic errors in the case of an automated coal gangue separator. Results of experiments that demonstrate different systematic errors for different sampling intervals are presented. The reason for these errors is the nonlinearity of the relationship between the radiation intensity, on the one hand, and the sample thickness and mass attenuation, on the other. PMID:12201142

  9. From digital earth to digital neighbourhood: A study of subjective measures of walkability attributes in objectively assessed digital neighbourhood

    International Nuclear Information System (INIS)

    According to an IEA report (2011), about 23% of the world's CO2 emissions result from transport, and this is one of the few areas where emissions are still rapidly increasing. The use of private vehicles is one of the principal contributors to greenhouse gas emissions from the transport sector. Therefore this paper focuses on the shift to more sustainable and low-carbon forms of transportation, such as walking. Neighbourhood built environment attributes may influence walkability. For this study, the author used a modified version of the ''Neighbourhood Environment Walkability Scale'' to compare respondents' perceptions regarding attributes of two neighbourhoods of Putrajaya. The 21st century really needs planners to use the Digital Earth concept, to go from global to regional to national to very local issues, using integrated, advanced technologies such as earth observation, GIS, virtual reality, etc. For this research, two (2) neighbourhoods of different densities (high and low density) were selected. A total sample of 381 (195 and 186) participants between 7 and 65 years old was selected. For the subjective measures we used a 54-question questionnaire survey, whereas for the objective measures we used the desktop 9.3 version of ArcGIS software. Our results show that respondents who reside in the high-walkable neighbourhood, precinct 9 in Putrajaya, rated factors such as residential density, land use mix, proximity to destinations and street connectivity consistently higher than did respondents of the low-walkable neighbourhood, precinct 8 in Putrajaya

  10. Bernoulli particle filter with observer altitude for maritime radiation source tracking in the presence of measurement uncertainty

    Institute of Scientific and Technical Information of China (English)

    Luo Xiaobo; Fan Hongqi; Song Zhiyong; Fu Qiang

    2013-01-01

    For maritime radiation source target tracking in a particular electronic countermeasures (ECM) environment, there exist two main problems which can deteriorate the tracking performance of traditional approaches. The first problem is the poor observability of the radiation source. The second is the measurement uncertainty, which includes the uncertainty of the target appearing/disappearing and the detection uncertainty (false and missed detections). A novel approach is proposed in this paper for tracking a maritime radiation source in the presence of measurement uncertainty. To address the poor observability of the maritime radiation source target, the observer altitude information is incorporated into the bearings-only tracking (BOT) method, using the radiation source motion restriction, to obtain a unique target localization. The two uncertainties in the ECM environment are then modeled by random finite set (RFS) theory, and the Bernoulli filtering method with observer altitude is adopted to solve the tracking problem of a maritime radiation source in this context. Simulation experiments verify the validity of the proposed approach for tracking a maritime radiation source, and also demonstrate the superiority of the method compared with the traditional integrated probabilistic data association (IPDA) method. The tracking performance under different conditions, particularly those involving different durations of radiation source switching on and off, indicates that the method is robust and effective.

  11. Quantification of LiDAR measurement uncertainty through propagation of errors due to sensor sub-systems and terrain morphology

    Science.gov (United States)

    Goulden, T.; Hopkinson, C.

    2013-12-01

    The quantification of LiDAR sensor measurement uncertainty is important for evaluating the quality of derived DEM products, compiling risk assessment of management decisions based from LiDAR information, and enhancing LiDAR mission planning capabilities. Current quality assurance estimates of LiDAR measurement uncertainty are limited to post-survey empirical assessments or vendor estimates from commercial literature. Empirical evidence can provide valuable information for the performance of the sensor in validated areas; however, it cannot characterize the spatial distribution of measurement uncertainty throughout the extensive coverage of typical LiDAR surveys. Vendor advertised error estimates are often restricted to strict and optimal survey conditions, resulting in idealized values. Numerical modeling of individual pulse uncertainty provides an alternative method for estimating LiDAR measurement uncertainty. LiDAR measurement uncertainty is theoretically assumed to fall into three distinct categories, 1) sensor sub-system errors, 2) terrain influences, and 3) vegetative influences. This research details the procedures for numerical modeling of measurement uncertainty from the sensor sub-system (GPS, IMU, laser scanner, laser ranger) and terrain influences. Results show that errors tend to increase as the laser scan angle, altitude or laser beam incidence angle increase. An experimental survey over a flat and paved runway site, performed with an Optech ALTM 3100 sensor, showed an increase in modeled vertical errors of 5 cm, at a nadir scan orientation, to 8 cm at scan edges; for an aircraft altitude of 1200 m and half scan angle of 15°. In a survey with the same sensor, at a highly sloped glacial basin site absent of vegetation, modeled vertical errors reached over 2 m. Validation of error models within the glacial environment, over three separate flight lines, respectively showed 100%, 85%, and 75% of elevation residuals fell below error predictions. Future

  12. Potential for improved radiation thermometry measurement uncertainty through implementing a primary scale in an industrial laboratory

    Science.gov (United States)

    Willmott, Jon R.; Lowe, David; Broughton, Mick; White, Ben S.; Machin, Graham

    2016-09-01

    A primary temperature scale requires realising a unit in terms of its definition. For high temperature radiation thermometry in terms of the International Temperature Scale of 1990 this means extrapolating from the signal measured at the freezing temperature of gold, silver or copper using Planck’s radiation law. The difficulty in doing this means that primary scales above 1000 °C require specialist equipment and careful characterisation in order to achieve the extrapolation with sufficient accuracy. As such, maintenance of the scale at high temperatures is usually only practicable for National Metrology Institutes, and calibration laboratories have to rely on a scale calibrated against transfer standards. At lower temperatures it is practicable for an industrial calibration laboratory to have its own primary temperature scale, which reduces the number of steps between the primary scale and end user. Proposed changes to the SI that will introduce internationally accepted high temperature reference standards might make it practicable to have a primary high temperature scale in a calibration laboratory. In this study such a scale was established by calibrating radiation thermometers directly to high temperature reference standards. The possible reduction in uncertainty to an end user as a result of the reduced calibration chain was evaluated.
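
    A minimal sketch of the extrapolation step (the Wien approximation to Planck's law at a single effective wavelength; the signal values and wavelength below are illustrative): the measured signal ratio relative to a fixed point gives the unknown temperature directly.

      import numpy as np

      C2 = 1.4388e-2   # second radiation constant, m*K

      def extrapolate_temperature(S, S_ref, T_ref, wavelength):
          """Wien-approximation extrapolation from a fixed-point signal:
          S/S_ref = exp(-C2/(wl*T)) / exp(-C2/(wl*T_ref))  =>  solve for T."""
          inv_T = 1.0 / T_ref - (wavelength / C2) * np.log(S / S_ref)
          return 1.0 / inv_T

      # Copper freezing point (1357.77 K), 650 nm thermometer, and a signal
      # 80x the fixed-point signal (values chosen only to exercise the formula)
      print(extrapolate_temperature(S=80.0, S_ref=1.0,
                                    T_ref=1357.77, wavelength=650e-9))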

  13. Uncertainties of polynuclear aromatic hydrocarbon and carbonyl measurements in heavy-duty diesel emission.

    Science.gov (United States)

    Mabilia, Rosanna; Cecinato, Angelo; Guerriero, Ettore; Possanzini, Massimiliano

    2006-02-01

    In this note we describe the speciated particle-phase PM2.5 polynuclear aromatic hydrocarbon (PAH) and gas-phase carbonyl emissions collected from a heavy-duty diesel bus outfitted with an oxidation catalyst for exhaust after-treatment. The vehicle was run on a chassis dynamometer during a transient cycle test reproducing a typical city bus route (Azienda Tramviaria Municipalizzata cycle). The diluted tailpipe emissions were sampled for PAH using a 2.5 microm cut-size cyclone glass fiber filter assembly, while carbonyls were absorbed onto dinitrophenyl hydrazine-coated silica cartridges. The former compounds were analysed by CGC-MS, the latter by HPLC-UV. Combining the two sets of speciation data resulting from 15 identical dynamometer tests provided a profile of both classes of unregulated organic emissions. PAH emission rates decreased with the number of fused benzene rings. Fluoranthene and pyrene amounted to 90% of the total PAHs quantified; six-ring PAHs accounted for only 0.5%. Similarly, formaldehyde and acetaldehyde accounted for approximately 80% of the total carbonyl emissions. The uncertainties of the method in the determination of individual emission factors were calculated. Statistical data processing revealed that all the measurements were largely unaffected by systematic errors and that repeatability percentages did not exceed 50% for the majority of components of both groups. PMID:16524107

  14. Hydrological model parameter dimensionality is a weak measure of prediction uncertainty

    Directory of Open Access Journals (Sweden)

    S. Pande

    2015-04-01

    This paper shows that the instability of hydrological system representation in response to different pieces of information, and the associated prediction uncertainty, is a function of model complexity. After demonstrating the connection between unstable model representation and model complexity, complexity is analyzed in a step-by-step manner. This is done by measuring differences between simulations of a model under different realizations of input forcings. Algorithms are then suggested to estimate model complexity. The model complexities of two model structures, SAC-SMA (Sacramento Soil Moisture Accounting) and its simplified version SIXPAR (Six Parameter Model), are computed on resampled input data sets from basins that span the continental US. The model complexities for SIXPAR are estimated for various parameter ranges. It is shown that the complexity of SIXPAR increases with lower storage capacity and/or higher recession coefficients. Thus it is argued that a conceptually simple model structure, such as SIXPAR, can be more complex than an intuitively more complex model structure, such as SAC-SMA, for certain parameter ranges. We therefore contend that the magnitudes of feasible model parameters influence the complexity of the model selection problem just as parameter dimensionality (number of parameters) does, and that parameter dimensionality is an incomplete indicator of the stability of hydrological model selection and prediction problems.

  15. Flow Rates Measurement and Uncertainty Analysis in Multiple-Zone Water-Injection Wells from Fluid Temperature Profiles

    Directory of Open Access Journals (Sweden)

    José E. O. Reges

    2016-07-01

    This work is a contribution to the development of flow sensors in the oil and gas industry. It presents a methodology to measure the flow rates into multiple-zone water-injection wells from fluid temperature profiles and to estimate the measurement uncertainty. First, a method to iteratively calculate the zonal flow rates using the Ramey (exponential) model is described. Next, this model is linearized to perform an uncertainty analysis. Then, a computer program to calculate the injected flow rates from experimental temperature profiles was developed. In the experimental part, a fluid temperature profile from a dual-zone water-injection well located in the Northeast Brazilian region was collected. Calculated and measured flow rates were then compared. The results proved that the linearization error is negligible for practical purposes and that the relative uncertainty increases as the flow rate decreases. The calculated values from both the Ramey and linear models were very close to the measured flow rates, presenting differences of only 4.58 m³/d and 2.38 m³/d, respectively. Finally, the measurement uncertainties from the Ramey and linear models were equal to 1.22% and 1.40% (for injection zone 1) and 10.47% and 9.88% (for injection zone 2). Therefore, the methodology was successfully validated and all objectives of this work were achieved.
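
    A minimal sketch of the inversion idea (a simplified, steady single-zone form of a Ramey-type model; the lumped heat-transfer coefficient, parameter values and synthetic profile are assumptions of this sketch, not the authors' program): the injected-fluid temperature relaxes toward the geothermal line with a relaxation distance A proportional to the mass flow rate, so fitting A to the measured profile yields the flow.

      import numpy as np
      from scipy.optimize import curve_fit

      def ramey_profile(z, A, T_in, T_surf=30.0, g_geo=0.03):
          """Steady Ramey-type injection profile: the fluid relaxes toward
          the geothermal line T_surf + g_geo*z with relaxation distance A,
          where A = w*cp / (2*pi*U) for mass flow w and a lumped
          heat-transfer coefficient U per unit length (assumed here)."""
          return (g_geo * z + T_surf - g_geo * A
                  + (T_in - T_surf + g_geo * A) * np.exp(-z / A))

      # Synthetic "measured" profile, 0 to 1000 m depth
      z = np.linspace(0.0, 1000.0, 50)
      T_meas = ramey_profile(z, A=400.0, T_in=25.0) \
               + np.random.normal(0.0, 0.05, z.size)

      (A_fit, Tin_fit), cov = curve_fit(ramey_profile, z, T_meas,
                                        p0=(300.0, 20.0))
      U, cp = 20.0, 4186.0              # assumed W/(m K) per metre, J/(kg K)
      w = 2.0 * np.pi * U * A_fit / cp  # inferred mass flow rate, kg/s
      print(A_fit, w, np.sqrt(np.diag(cov)))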

  17. Attribution and evolution of ozone from Asian wild fires using satellite and aircraft measurements during the ARCTAS campaign

    Directory of Open Access Journals (Sweden)

    R. Dupont

    2012-01-01

    We use ozone and carbon monoxide measurements from the Tropospheric Emission Spectrometer (TES), model estimates of ozone, CO, and ozone precursors from the Real-time Air Quality Modeling System (RAQMS), and data from the NASA DC8 aircraft to characterize the source and dynamical evolution of ozone and CO in Asian wildfire plumes during the spring 2008 ARCTAS campaign. On 19 April, the NASA DC8 O3 and aerosol Differential Absorption Lidar (DIAL) observed two biomass burning plumes originating from north-western Asia (Kazakhstan) and south-eastern Asia (Thailand) that advected eastward over the Pacific, reaching North America in 10 to 12 days. Using both TES observations and RAQMS chemical analyses, we track the wildfire plumes from their source to the ARCTAS DC8 platform. In addition to photochemical production due to ozone precursors, we find that exchange between the stratosphere and the troposphere is a major factor influencing O3 concentrations for both plumes. For example, the Kazakhstan and Siberian plumes travel near 55 degrees North, a region of significant springtime stratospheric/tropospheric exchange, and stratospheric air influences the Thailand plume after it is lofted to high altitudes via the Himalayas. Using comparisons of the model to the aircraft and satellite measurements, we estimate that the Kazakhstan plume is responsible for increases in O3 and CO mixing ratios of approximately 6.4 ppbv and 38 ppbv in the lower troposphere (heights of 2 to 6 km), and the Thailand plume for increases in O3 and CO mixing ratios of approximately 11 ppbv and 71 ppbv in the upper troposphere (heights of 8 to 12 km), respectively. However, there are significant sources of uncertainty in these estimates that point to the need for future improvements in both model and satellite observations. For example, it is challenging to characterize the fraction of air parcels from the stratosphere versus those from the

  18. Towards a quantitative, measurement-based estimate of the uncertainty in photon mass attenuation coefficients at radiation therapy energies

    International Nuclear Information System (INIS)

    In this study, a quantitative estimate is derived for the uncertainty in the XCOM photon mass attenuation coefficients in the energy range of interest to external beam radiation therapy—i.e. 100 keV (orthovoltage) to 25 MeV—using direct comparisons of experimental data against Monte Carlo models and theoretical XCOM data. Two independent datasets are used. The first dataset is from our recent transmission measurements and the corresponding EGSnrc calculations (Ali et al 2012 Med. Phys. 39 5990–6003) for 10–30 MV photon beams from the research linac at the National Research Council Canada. The attenuators are graphite and lead, with a total of 140 data points and an experimental uncertainty of ∼0.5% (k = 1). An optimum energy-independent cross section scaling factor that minimizes the discrepancies between measurements and calculations is used to deduce cross section uncertainty. The second dataset is from the aggregate of cross section measurements in the literature for graphite and lead (49 experiments, 288 data points). The dataset is compared to the sum of the XCOM data plus the IAEA photonuclear data. Again, an optimum energy-independent cross section scaling factor is used to deduce the cross section uncertainty. Using the average result from the two datasets, the energy-independent cross section uncertainty estimate is 0.5% (68% confidence) and 0.7% (95% confidence). The potential for energy-dependent errors is discussed. Photon cross section uncertainty is shown to be smaller than the current qualitative ‘envelope of uncertainty’ of the order of 1–2%, as given by Hubbell (1999 Phys. Med. Biol 44 R1–22). (paper)
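
    A minimal sketch of the scaling-factor step (a weighted least-squares fit of measured values to scaled calculations; the data below are invented): the optimum energy-independent factor s minimizing chi-squared for measured ~ s * calculated, and its standard uncertainty, have closed forms.

      import numpy as np

      def optimal_scale(measured, calculated, sigma):
          """Energy-independent factor s minimizing
          chi2 = sum(((measured - s*calculated) / sigma)**2)."""
          w = 1.0 / np.asarray(sigma)**2
          s = np.sum(w * measured * calculated) / np.sum(w * calculated**2)
          u_s = 1.0 / np.sqrt(np.sum(w * calculated**2))
          return s, u_s

      # Illustrative transmission-derived data with 0.5% (k=1) uncertainties
      calc = np.array([1.00, 0.98, 1.02, 0.99])
      meas = np.array([1.004, 0.985, 1.023, 0.995])
      print(optimal_scale(meas, calc, sigma=0.005 * calc))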

  19. Three-axis micro-force sensor with sub-micro-Newton measurement uncertainty and tunable force range

    International Nuclear Information System (INIS)

    The first three-axis micro-force sensor with adjustable force range from ±20 µN to ±200 µN and sub-micro-Newton measurement uncertainty is presented. The sensor design, the readout electronics, the sensor characterization and an uncertainty analysis for the force predictions are described. A novel microfabrication process based on a double silicon-on-insulator (SOI) substrate has been developed enabling a major reduction in the fabrication complexity of multi-axis sensors and actuators.

  20. Attributional Bias Instrument (ABI): Validation of a Measure to Assess Ability and Effort Explanations for Math Performance

    Science.gov (United States)

    Espinoza, Penelope P.; Quezada, Stephanie A.; Rincones, Rodolfo; Strobach, E. Natalia; Gutierrez, Maria Armida Estrada

    2012-01-01

    The present work investigates the validation of a newly developed instrument, the attributional bias instrument, based on achievement attribution theories that distinguish between effort and ability explanations of behavior. The instrument further incorporates the distinction between explanations for success versus failure in academic performance.…

  1. Propagation of void fraction uncertainty measures in the RETRAN-3D simulation of the Peach Bottom turbine trip

    Energy Technology Data Exchange (ETDEWEB)

    Vinai, Paolo [Paul Scherrer Institute, Villigen (Switzerland); Ecole Polytechnique Federale de Lausanne, Lausanne (Switzerland); Chalmers University of Technology, Goeteborg (Sweden)]; Macian-Juan, Rafael [Technische Universitaet Muenchen, Garching (Germany)]; Chawla, Rakesh [Paul Scherrer Institute, Villigen (Switzerland); Ecole Polytechnique Federale de Lausanne, Lausanne (Switzerland)]

    2008-07-01

    The paper describes the propagation of void fraction uncertainty, as quantified by employing a novel methodology developed at PSI, in the RETRAN-3D simulation of the Peach Bottom turbine trip test. Since the transient considered is characterized by strong coupling between thermal-hydraulics and neutronics, the accuracy of the void fraction model has a very important influence on the prediction of the power history and, in particular, of the maximum power reached. It has been shown that the objective measures used for the void fraction uncertainty, based on the direct comparison between experimental and predicted values extracted from a database of appropriate separate-effect tests, provide power uncertainty bands that are narrower and more realistic than those based, for example, on expert opinion. The applicability of such an approach to NPP transient best-estimate analysis has thus been demonstrated. (authors)

  2. Experimental uncertainty estimation on the effective capture cross sections measured in the PROFIL experiments in Phenix

    International Nuclear Information System (INIS)

    A desire for increased nuclear system safety and improved fuel depletion calculations translates directly into a need for better knowledge of nuclear data. The PROFIL and PROFIL-2 experiments give integral information on capture and (n,2n) cross sections and cumulative fission yields for several isotopes (95Mo, 97Mo, 101Pd, 105Pd, 133Cs, 143Nd, 144Nd, 145Nd, 147Sm, 149Sm, 151Eu, 233U, 234U, 235U, 238Pu, 239Pu, 240Pu, 241Pu, 242Pu, 244Cm ...). Interpretations have been performed many times in the past, but without experimental uncertainty estimation. The cross section library JEFF-3.1.1, the covariance database COMAC and the code system ERANOS-2.2 are used for this updated interpretation. This study focuses on the uncertainty estimation of experimental values sensitive to capture cross sections. Three steps are required: the fluence scaling, the uncertainty propagation on the fluence, and finally the uncertainty estimation on the ratio variations of interest. This work is done with CONRAD using a Bayesian adjustment and marginalization method. Mean C/E results and conclusions are identical to the previous interpretation. A fluence uncertainty of 1.4% is found for the two experimental pins of PROFIL-2 and 1.9% for PROFIL. Propagating this new information on the fluence to the ratio variations of interest gives experimental uncertainties between 1% and 2.5% for the isotopes present in the experimental pins. Among the main results are the 238Pu, 239Pu, 240Pu, 241Pu and 242Pu capture cross sections: the C/E values are respectively equal to 1.03, 0.98, 0.97, 1.08 and 1.14, with uncertainties lower than 2.5%. All the results will provide feedback on variance-covariance matrices for further work. (author)

  3. Dead time effect on the Brewer measurements: correction and estimated uncertainties

    Science.gov (United States)

    Fountoulakis, Ilias; Redondas, Alberto; Bais, Alkiviadis F.; José Rodriguez-Franco, Juan; Fragkos, Konstantinos; Cede, Alexander

    2016-04-01

    Brewer spectrophotometers are widely used instruments which perform spectral measurements of the direct, the scattered and the global solar UV irradiance. By processing these measurements a variety of secondary products can be derived, such as the total columns of ozone (TOC), sulfur dioxide and nitrogen dioxide, and aerosol optical properties. Estimating and limiting the uncertainties of the final products is of critical importance, as high-quality data have many applications and can provide accurate estimations of trends. The dead time is specific to each instrument, and improper correction of the raw data for its effect may lead to important errors in the final products. The dead time value may change with time and, with the currently used methodology, it cannot always be determined accurately. For specific cases, such as low ozone slant columns and high intensities of the direct solar irradiance, the error in the retrieved TOC due to a 10 ns change in the dead time from its value in use is found to be up to 5 %. The error in the calculation of UV irradiance can be as high as 12 % near the maximum operational limit of light intensities. While the existing documentation indicates that dead time effects are important when the error in the used value is greater than 2 ns, we found that for single-monochromator Brewers a 2 ns error in the dead time may lead to errors above the limit of 1 % in the calculation of TOC; thus the tolerance limit should be lowered. A new routine for the determination of the dead time from direct solar irradiance measurements has been created and tested, and a validation of the operational algorithm has been performed. Additionally, new methods for the estimation and the validation of the dead time have been developed and are analytically described. Therefore, the present study, in addition to highlighting the importance of the dead time for the processing of Brewer data sets, also provides useful information for their
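
    A minimal sketch of the dead-time correction itself (the standard paralyzable counting model; the count rate, dead-time values and iteration count are illustrative): the true rate is recovered by fixed-point iteration, and a 2 ns error in tau visibly shifts the corrected counts at high intensities.

      import numpy as np

      def correct_dead_time(observed_rate, tau, iterations=10):
          """Invert the paralyzable counting model
          observed = true * exp(-true * tau) by fixed-point iteration."""
          true_rate = observed_rate
          for _ in range(iterations):
              true_rate = observed_rate * np.exp(true_rate * tau)
          return true_rate

      # 1e6 counts/s observed; compare a 30 ns and a 32 ns dead time
      for tau in (30e-9, 32e-9):
          print(tau, correct_dead_time(1.0e6, tau))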

  4. Quantifying Urban Natural Gas Leaks from Street-level Methane Mapping: Measurements and Uncertainty

    Science.gov (United States)

    von Fischer, J. C.; Ham, J. M.; Griebenow, C.; Schumacher, R. S.; Salo, J.

    2013-12-01

    Leaks from the natural gas pipeline system are a significant source of anthropogenic methane in urban settings. Detecting and repairing these leaks will reduce the energy and carbon footprints of our cities. Gas leaks can be detected from spikes in street-level methane concentrations measured by analyzers deployed on vehicles. While a spike in methane concentration indicates a leak, an algorithm (e.g., inverse model) must be used to estimate the size of the leak (i.e., flux) from concentration data and supporting meteorological information. Unfortunately, this drive-by approach to leak quantification is confounded by the complexity of urban roughness, changing weather conditions, and other incidental factors (e.g., traffic, vehicle speed, etc.). Furthermore, the vehicle might only pass through the plume one to three times during routine mapping. The objective of this study was to conduct controlled release experiments to better quantify the relationship between mobile methane concentration measurements and the size and location of the emission source (e.g., pipeline leakage) in an urban environment. A portable system was developed that could release methane at known rates between 10 and 40 LPM while maintaining concentrations below the lower explosive limit. A mapping vehicle was configured with fast response methane analyzers, GPS, and meteorological instruments. Portable air-sampling tripods were fabricated that could be deployed at defined distances downwind from the release point and automatically-triggered to collect grab samples. The experimental protocol was as follows: (1) identify an appropriate release point within a city, (2) release methane at a known rate, (3) measure downwind street-level concentrations with the vehicle by making multiple passes through the plume, and (4) collect supporting concentration and meteorological data with the static tripod samplers deployed in the plume. Controlled release studies were performed at multiple locations and
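
    A minimal sketch of the leak-rate inversion step (a ground-level Gaussian plume with crude linear dispersion coefficients; all coefficients and conditions are invented for illustration, not the study's algorithm):

      import numpy as np

      def plume_leak_rate(conc_excess, wind_speed, distance,
                          a_y=0.22, a_z=0.20):
          """Invert a ground-level Gaussian plume for the source rate Q:
          C(x) = Q / (pi * sigma_y * sigma_z * u) on the centerline,
          with crude linear dispersion sigma = a * x (near-neutral)."""
          sigma_y = a_y * distance
          sigma_z = a_z * distance
          return conc_excess * np.pi * sigma_y * sigma_z * wind_speed

      # 2 ppmv CH4 excess (~0.67e-6 kg/m3 per ppmv at 20 C, 1 atm),
      # measured 30 m downwind in a 2 m/s wind
      excess = 2.0 * 0.67e-6                   # kg/m3
      q = plume_leak_rate(excess, wind_speed=2.0, distance=30.0)
      print(q * 1000.0, "g/s")                 # ~0.3 g/s, i.e. tens of LPM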

  5. Uncertainty Budget Analysis for Dimensional Inspection Processes (U)

    Energy Technology Data Exchange (ETDEWEB)

    Valdez, Lucas M. [Los Alamos National Laboratory]

    2012-07-26

    This paper is intended to provide guidance on how to prepare an uncertainty analysis of a dimensional inspection process through the use of an uncertainty budget analysis. The uncertainty analysis follows the methodology of the ISO GUM standard for calibration and testing. A specific distinction is drawn between how Type A and Type B uncertainty analyses are used in general and in specific processes. Theory and applications are presented both for a generalized approach to estimating measurement uncertainty and for reporting and presenting these estimates for dimensional measurements in a dimensional inspection process. The analysis of this uncertainty budget shows that a well-controlled dimensional inspection process produces a conservative process uncertainty, which can be attributed to the necessary assumptions put in place for best possible results.
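
    A minimal GUM-style budget sketch (the component values, distributions and coverage factor below are invented for illustration; a real budget would also record sensitivity coefficients and degrees of freedom):

      import numpy as np

      def combined_uncertainty(components):
          """Root-sum-square combination of standard uncertainties (GUM)."""
          return np.sqrt(sum(u**2 for u in components))

      # Illustrative budget for a length measurement, in micrometres
      u_repeat = 0.8                 # Type A: std deviation of the mean
      u_cal    = 1.2 / 2.0           # Type B: calibration certificate, k=2
      u_temp   = 2.0 / np.sqrt(3.0)  # Type B: rectangular limits of +/-2 um
      u_c = combined_uncertainty([u_repeat, u_cal, u_temp])
      print("u_c = %.2f um, U (k=2) = %.2f um" % (u_c, 2.0 * u_c))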

  6. Development and optimization of nuclear heating and gamma flux measurement techniques in experimental reactors: identification, mastery, treatment and reduction of uncertainties

    International Nuclear Information System (INIS)

    This thesis work addresses the needs for qualification of neutron and photon calculation schemes for the future Jules Horowitz technological Reactor (RJH) and for Pressurized Water Reactors (PWR). For qualification and/or validation it is necessary to establish reliable measurement results with well-defined associated uncertainties. The objective of this thesis is to develop and improve the nuclear heating measurement methods (especially for gamma photons) in the MINERVE and EOLE experimental reactors at CEA-Cadarache, using thermo-luminescent detectors (TLD), optically stimulated luminescence detectors (OSLD) and an ionization chamber. The aim is to identify, prioritize, treat and reduce the various sources of uncertainty and systematic bias associated with the measurement. In a previous study, where nuclear heating was estimated from an integrated radiation dose measured by TLD in the MINERVE and EOLE reactors, it was shown that the dose calculation underestimated the experiment by 25%, with a total uncertainty of 15% (2σ). This systematic bias has been largely attributed to shortcomings in the nuclear data used to perform the calculations. Therefore, in this work a new series of experiments was set up in the MINERVE reactor to reduce the measurement uncertainties and better understand the origins of the discrepancies with the modeling. These experiments were carried out in an aluminum or hafnium surrounding (in specifically designed boxes) using a new procedure and analysis methodology. In these experiments, the TLDs are calibrated individually, the repeatability of the measurement is evaluated experimentally, and the TLD heating laws are optimized. These improvements are subsequently used for the measurement of nuclear heating in the AMMON program (EOLE reactor), dedicated to the qualification of neutron and photon calculation schemes for the RJH reactor. Measurements of the gammas emitted with a delay (delayed gammas) after shutdown of the MINERVE reactor were also carried out

  7. The effect of random and systematic measurement uncertainties on temporal and spatial upscaling of N2O fluxes

    Science.gov (United States)

    Cowan, Nicholas; Levy, Peter; Skiba, Ute

    2016-04-01

    The addition of reactive nitrogen to agricultural soils in the form of artificial fertilisers or animal waste is the largest global source of anthropogenic N2O emissions. Emission factors are commonly used to evaluate N2O emissions released after the application of nitrogen fertilisers on a global scale, based on records of fertiliser use. Currently these emission factors are estimated primarily from experiments in which flux chamber methodology is used to estimate annual cumulative fluxes of N2O after nitrogen fertiliser applications on agricultural soils. The use of the eddy covariance method to measure N2O and estimate emission factors is also becoming more common in the flux community as modern rapid gas analyser instruments advance. The aim of the presentation is to highlight the weaknesses and potential systematic biases in current flux measurement methodology. This is important for GHG accounting and for accurate model calibration and verification. The growing interest in top-down/bottom-up comparisons of tall-tower and conventional N2O flux measurements is also an area of research in which the uncertainties in flux measurements need to be properly quantified. The large and unpredictable spatial and temporal variability of N2O fluxes from agricultural soils is a significant source of uncertainty in emission factor estimates. N2O flux measurements typically show poor relationships with explanatory covariates. The true uncertainties in flux measurements at the plot scale are often difficult to propagate to the field scale and the annual time scale. This results in very uncertain cumulative flux (emission factor) estimates. Cumulative fluxes estimated using flux chamber and eddy covariance methods can also differ significantly, which complicates the matter further. In this presentation, we examine some effects that spatial and temporal variability of N2O fluxes can have on the estimation of emission factors and describe how

  8. Determination of energy density in ducts by a three-microphone phaseless method and estimation of measurement uncertainties

    Science.gov (United States)

    Pascal, Jean-Claude; Li, Jing-Fang; Thomas, Jean-Hugh

    2011-09-01

    A new method to measure the total energy density of waves traveling in opposite directions in ducts is suggested, in order to completely eliminate the phase errors that lead to bias errors and are difficult to control in industrial tests. Only the auto-power spectral densities are measured by the three microphones. The inversion of a linear system based on a propagation model, in which the two opposite waves are partially coherent, makes it possible to obtain the energy density. The sensitivity of this method to errors in the speed of sound, errors of microphone calibration, and errors in the microphone positions in the duct is analyzed. To complete the study of the robustness of the method, an evaluation of the statistical errors is carried out. The total uncertainty is used to make recommendations on the choice of the experimental parameters. The selection of the frequency limits permits the measurement uncertainty to be maintained within a given confidence interval.

  9. Sensitivity of Displaced-Beam Scintillometer Measurements of Area-Average Heat Fluxes to Uncertainties in Topographic Heights

    CERN Document Server

    Gruber, Matthew; Hartogensis, Oscar

    2014-01-01

    Displaced-beam scintillometer measurements of the turbulence inner-scale length $l_o$ and refractive index structure function $C_n^2$ resolve area-average turbulent fluxes of heat and momentum through the Monin-Obukhov similarity equations. Sensitivity studies have been produced for the use of displaced-beam scintillometers over flat terrain, but many real field sites feature variable topography. We develop here an analysis of the sensitivity of displaced-beam scintillometer-derived sensible heat fluxes to uncertainties in spatially distributed topographic measurements. Sensitivity is shown to be concentrated in areas near the center of the beam and where the underlying topography is closest to the beam height. Uncertainty may be decreased by taking precise topographic measurements in these areas.

  10. Analysis of Uncertainties in Protection Heater Delay Time Measurements and Simulations in Nb$_{3}$Sn High-Field Accelerator Magnets

    CERN Document Server

    Salmi, Tiina; Marchevsky, Maxim; Bajas, Hugo; Felice, Helene; Stenvall, Antti

    2015-01-01

    The quench protection of superconducting high-field accelerator magnets is presently based on protection heaters, which are activated upon quench detection to accelerate the quench propagation within the winding. Estimations of the heater delay to initiate a normal zone in the coil are essential for the protection design. During the development of Nb3Sn magnets for the LHC luminosity upgrade, protection heater delays have been measured in several experiments, and a new computational tool CoHDA (Code for Heater Delay Analysis) has been developed for heater design. Several computational quench analyses suggest that the efficiency of the present heater technology is on the borderline of protecting the magnets. Quantifying the inevitable uncertainties related to the measured and simulated delays is therefore of pivotal importance. In this paper, we analyze the uncertainties in the heater delay measurements and simulations using data from five impregnated high-field Nb3Sn magnets with different heater geometries. ...

  11. Sensitivity of large-aperture scintillometer measurements of area-average heat fluxes to uncertainties in topographic heights

    Directory of Open Access Journals (Sweden)

    M. A. Gruber

    2014-01-01

    Scintillometer measurements allow for estimation of the refractive index structure parameter Cn² over large areas in the atmospheric surface layer. Turbulent fluxes of heat and momentum are inferred through coupled sets of equations derived from the Monin-Obukhov similarity hypothesis. One-dimensional sensitivity functions have been produced that relate the sensitivity of heat fluxes to uncertainties in single values of beam height over homogeneous and flat terrain. However, real field sites include variable topography and heterogeneous surfaces. We develop here the first analysis of the sensitivity of scintillometer-derived sensible heat fluxes to uncertainties in spatially distributed topographic measurements. For large-aperture scintillometers and independent friction velocity u* measurements, sensitivity is shown to be concentrated in areas near the center of the beam path and where the underlying topography is closest to the beam height. Uncertainty may be greatly reduced by focusing precise topographic measurements on these areas. A new two-dimensional variable-terrain sensitivity function is developed for quantitative error analysis. This function is compared with the previous one-dimensional sensitivity function for the same measurement strategy over flat and homogeneous terrain. Additionally, a new method of solution to the set of coupled equations is produced that eliminates computational error. The results are produced using a new methodology for error analysis involving distributed parameters that may be applied in other disciplines.

  12. Design of adaptive treatment margins for non-negligible measurement uncertainty: application to ultrasound-guided prostate radiation therapy

    International Nuclear Information System (INIS)

    Daily imaging during the course of a fractionated radiotherapy treatment offers the potential for frequent intervention and therefore effective adaptation of the treatment to the individual patient. The treatment information gained from such images can be analysed and updated daily to obtain a set of patient-individualized parameters. In many situations, however, the uncertainty with which these parameters are estimated cannot be neglected. In this work this methodology is applied to the adaptive estimation of setup errors, the derivation of a daily optimal pre-treatment correction strategy, and the daily update of the treatment margins after application of these corrections. For this purpose a dataset of 19 prostate cancer patients was analysed retrospectively. The position of the prostate was measured daily with an optically guided 3D ultrasound localization system, whose measurement uncertainty is approximately 2 mm. The algorithm finds the most likely position of the target by maximizing the a posteriori probability given the set of measurements. These estimates are used for the optimal corrections applied to the target volume. The results show that applying the optimal correction strategy allows the treatment margins to be reduced in a systematic way as the treatment progresses. This is not the case when corrections are based only on the measured values, without taking the measurement uncertainty into account.
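
    A minimal sketch of estimation under non-negligible measurement noise (a Gaussian prior/likelihood MAP estimator; the prior width, noise level and data below are invented): the estimate shrinks noisy daily measurements toward the prior, and its uncertainty narrows as fractions accumulate, which is what permits progressively tighter margins.

      import numpy as np

      def map_setup_error(measurements, sigma_meas, sigma_prior):
          """MAP estimate of a constant setup error from noisy daily data:
          Gaussian prior N(0, sigma_prior**2) on the systematic error and
          Gaussian measurement noise N(0, sigma_meas**2) per fraction."""
          y = np.asarray(measurements, dtype=float)
          n = y.size
          post_var = 1.0 / (1.0 / sigma_prior**2 + n / sigma_meas**2)
          post_mean = post_var * y.sum() / sigma_meas**2
          return post_mean, np.sqrt(post_var)

      # Five fractions of 2 mm-noise measurements, 3 mm systematic prior
      print(map_setup_error([3.1, 2.4, 4.0, 2.8, 3.5],
                            sigma_meas=2.0, sigma_prior=3.0))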

  13. CIRCLE 2 policy brief: Communicate uncertainties- design climate adaptation measures to be flexible and robust

    NARCIS (Netherlands)

    Pelt, van S.C.; Avelar, D.; Swart, R.J.

    2010-01-01

    This policy brief is directed towards funders and managers of climate change impacts and adaptation research programmes, as well as policy makers in this area. It notes various challenges in addressing uncertainties in climate change research and policy, and provides suggestions on how to address them.

  14. Damage assessment of composite plate structures with material and measurement uncertainty

    Science.gov (United States)

    Chandrashekhar, M.; Ganguli, Ranjan

    2016-06-01

    Composite materials are very useful in structural engineering, particularly in weight-sensitive applications. Two different test models of the same structure made from composite materials can display very different dynamic behavior due to the large uncertainties associated with composite material properties. Composite structures can also suffer from pre-existing imperfections such as delaminations, voids or cracks introduced during fabrication. In this paper, we show that modeling and material uncertainties in composite structures can cause considerable problems in damage assessment. A recently developed C0 shear-deformable, locking-free refined composite plate element is employed in the numerical simulations to alleviate modeling uncertainty. A qualitative estimate of the impact of modeling uncertainty on the damage detection problem is made. A robust Fuzzy Logic System (FLS) with a sliding-window defuzzifier is used for delamination damage detection in composite plate-type structures. The FLS is designed using variations in modal frequencies due to randomness in material properties. Probabilistic analysis is performed using Monte Carlo Simulation (MCS) on a composite plate finite element model. It is demonstrated that the FLS shows excellent robustness in delamination detection at very high levels of randomness in the input data.

  15. An Evaluation of Test and Physical Uncertainty of Measuring Vibration in Wooden Junctions

    DEFF Research Database (Denmark)

    Dickow, Kristoffer Ahrens; Kirkegaard, Poul Henning; Andersen, Lars Vabbersgaard

    2012-01-01

    In the present paper a study of test and material uncertainty in modal analysis of certain wooden junctions is presented. The main structure considered here is a T-junction made from a particleboard plate connected to a spruce beam of rectangular cross section. The size of the plate is 1.2 m by 0...... plate is significant....

  16. Measures of Model Uncertainty in the Assessment of Primary Stresses in Ship Structures

    DEFF Research Database (Denmark)

    Östergaard, Carsten; Dogliani, Mario; Guedes Soares, Carlos;

    1996-01-01

    The paper considers various models and methods commonly used for linear elastic stress analysis and assesses the uncertainty involved in their application to the analysis of the distribution of primary stresses in the hull of a containership example, through statistical evaluations of the results...... of calculations performed by different methods....

  17. The uncertainties calculation of acoustic method for measurement of dissipative properties of heterogeneous non-metallic materials

    Directory of Open Access Journals (Sweden)

    Maryna O. Golofeyeva

    2015-12-01

    The effective use of heterogeneous non-metallic materials and structures requires the measurement of reliable values of their dissipation characteristics, as well as of the common factors governing their change during loading. Aim: The aim of this study is to prepare the budget for the measurement uncertainty of the dissipative properties of composite materials. Materials and Methods: The method used to study the vibrational energy dissipation characteristics, based on the coupling between the vibration damping decrement and the acoustic velocity in a non-metallic heterogeneous material, is reviewed. The proposed method allows finding the dependence of damping on the vibration amplitude and on the frequency of the strain-stress state of the material. Results: The accuracy of the measurement method was investigated for the determination of the vibration damping decrement of synthegran. The international approach to the evaluation of measurement quality was used. It includes the internationally accepted rules for the expression and summation of uncertainties, which serve as an internationally acknowledged measure of confidence in measurement results, including testing. The uncertainty budget of the acoustic method for the measurement of the dissipative properties of materials was compiled. Conclusions: Two groups of causes were identified that result in errors during the measurement of the dissipative properties of materials. The first group of errors comprises variation of the calibrated impact parameters within tolerance limits, displacement of the sensor on repeated placement at the measurement point, variation of the contact-agent layer thickness due to irregular hold-down of the transducers to the control surface, inaccuracy in reading, etc. The second group of errors is linked with errors in the measurement of density and Poisson's ratio, of the distance between sensors, and of the time difference between the signals of the vibroacoustic sensors.

  18. Influence of measurement uncertainties on fractional solubility of iron in mineral aerosols over the oceans

    Science.gov (United States)

    Meskhidze, Nicholas; Johnson, Matthew S.; Hurley, David; Dawson, Kyle

    2016-09-01

    The atmospheric supply of mineral dust iron (Fe) plays a crucial role in the Earth's biogeochemical cycle and is of specific importance as a micronutrient in the marine environment. Observations show several orders of magnitude of variability in the fractional solubility of Fe in mineral dust aerosols, making it hard to assess the role of mineral dust in the global ocean biogeochemical Fe cycle. In this study we compare the operational solubility of mineral dust aerosol Fe associated with the flow-through leaching protocol to the results of the global 3-D chemical transport model GEOS-Chem. According to the protocol, aerosol Fe is defined as soluble by first leaching mineral dust with deionized water through a 0.45 μm pore size membrane, followed by acidification and storage of the leachate over a long period of time prior to analysis. To estimate the uncertainty in soluble Fe results introduced by the flow-through leaching protocol, we prescribe an average 50% (range of 30-70%) fractional solubility to sub-0.45 μm sized mineral dust particles that may inadvertently pass the filter and end up in the acidified (pH ∼ 1.7) leachate for a period of a couple of months. In the model, the fractional solubility of Fe is either explicitly calculated using complex mineral aerosol Fe dissolution equations, or prescribed to be 1% or 4%, values often used by global ocean biogeochemical Fe cycle models to reproduce the broad characteristics of the presently observed ocean dissolved iron distribution. Calculations show that the fractional solubility of Fe derived through flow-through leaching is higher than the model results. The largest differences (∼40%) are predicted to occur farther away from the dust source regions, over areas where sub-0.45 μm sized mineral dust particles contribute a larger fraction of the total mineral dust mass. This study suggests that different methods used in soluble Fe measurements and inconsistencies in the operational definition of

  19. Trends of solar ultraviolet irradiance at Barrow, Alaska, and the effect of measurement uncertainties on trend detection

    Directory of Open Access Journals (Sweden)

    G. Bernhard

    2011-12-01

    Spectral ultraviolet (UV) irradiance has been observed near Barrow, Alaska (71° N, 157° W) between 1991 and 2011 with an SUV-100 spectroradiometer. The instrument was historically part of the US National Science Foundation's UV Monitoring Network and is now a component of NSF's Arctic Observing Network. From these measurements, trends in monthly average irradiance and their uncertainties were calculated. The analysis focuses on two quantities, the UV Index (which is affected by atmospheric ozone concentrations) and irradiance at 345 nm (which is virtually insensitive to ozone). Uncertainties of trend estimates depend on variations in the data due to (1) natural variability, (2) systematic and random errors of the measurements, and (3) uncertainties caused by gaps in the time series. Using radiative transfer model calculations, systematic errors of the measurements were detected and corrected. Different correction schemes were tested to quantify the sensitivity of the trend estimates to the treatment of systematic errors. Depending on the correction method, estimates of decadal trends changed between 1.5% and 2.9%. Uncertainties in the trend estimates caused by error sources (2) and (3) were set in relation to the overall uncertainty of the trend determinations. Results show that these error sources are only relevant for February, March, and April, when natural variability is low due to high surface albedo. This method of addressing measurement uncertainties in time series analysis is also applicable to other geophysical parameters. Trend estimates varied between −14% and +5% per decade and were significant (95.45% confidence level) only for the month of October. Depending on the correction method, October trends varied between −11.4% and −13.7% for irradiance at 345 nm and between −11.7% and −14.1% for the UV Index. These large trends are consistent with trends in short-wave (0.3–3.0 μm) solar irradiance measured with pyranometers at NOAA

  20. Expanded uncertainties of preconcentration neutron activation measurements of extractable organo-chlorine, bromine and iodine compounds in bovine milk lipids

    International Nuclear Information System (INIS)

    Milk is known to contain organohalogen compounds. A mixture of hexane and isopropanol was used to extract lipids from bovine milk and neutron activation analysis (NAA) was employed to measure extractable organohalogens in the lipids. The samples were irradiated in a neutron flux of 2.5 × 10¹¹ cm⁻² s⁻¹ for 10 min, allowed to decay for 2 min, and counted for 10 min. Uncertainties associated with the preconcentration NAA measurements were investigated in detail. The mass fractions of halogens in mg kg⁻¹ and their relative expanded uncertainties in percent in bovine milk lipids were: 32 (8.4 %), 2.65 (9.8 %) and 0.211 (6.6 %) for Cl, Br and I, respectively. (author)

  1. Determination of boron in uranium-aluminum-silicon alloy by spectrophotometry and estimation of expanded uncertainty in measurement

    Energy Technology Data Exchange (ETDEWEB)

    Ramanjaneyulu, P.S. [Radioanalytical Chemistry Division, Bhabha Atomic Research Centre, Mumbai 400085 (India); Sayi, Y.S. [Radioanalytical Chemistry Division, Bhabha Atomic Research Centre, Mumbai 400085 (India)], E-mail: yssayi@barc.gov.in; Ramakumar, K.L. [Radioanalytical Chemistry Division, Bhabha Atomic Research Centre, Mumbai 400085 (India)], E-mail: klram@barc.gov.in

    2008-08-31

    Quantification of boron in diverse materials of relevance in nuclear technology is essential in view of its high thermal neutron absorption cross section. A simple and sensitive method has been developed for the determination of boron in uranium-aluminum-silicon alloy, based on leaching of boron with 6 M HCl and H₂O₂, its selective separation by solvent extraction with 2-ethyl hexane 1,3-diol and quantification by spectrophotometry using curcumin. The method has been evaluated by standard addition method and validated by inductively coupled plasma-atomic emission spectroscopy. Relative standard deviation and absolute detection limit of the method are 3.0% (at 1σ level) and 12 ng, respectively. All possible sources of uncertainties in the methodology have been individually assessed, following the International Organization for Standardization guidelines. The combined uncertainty is calculated employing uncertainty propagation formulae. The expanded uncertainty in the measurement at 95% confidence level (coverage factor 2) is 8.840%.

  2. Measuring the knowledge-based economy of China in terms of synergy among technological, organizational, and geographic attributes of firms

    NARCIS (Netherlands)

    L. Leydesdorff; P. Zhou

    2014-01-01

    Using the possible synergy among geographic, size, and technological distributions of firms in the Orbis database, we find the greatest reduction of uncertainty at the level of the 31 provinces of China, and an additional 18.0 % at the national level. Some of the coastal provinces stand out as expec

  3. Simple and direct estimation chromium in different grades of steels using UV-visible spectrophotometer and associated measurement uncertainties

    International Nuclear Information System (INIS)

    Chromium is one of the important elements that provide the desired strength to the different grades of steels chosen as structural materials for upcoming fast breeder reactors. Its estimation is therefore an important part of qualifying steels for the desired applications. Several methods have been cited in the literature for the estimation of chromium in steels, employing sophisticated instruments such as XRFS and spark-based OES, the UV-Visible spectrophotometer, and also classical volumetric titration. Being surface-based techniques, both XRFS and spark OES have the limitation of requiring matrix-matched standards, apart from the use of high-cost instrumentation. Similarly, the volumetric method is time consuming, and the spectrophotometric method cited in the literature involves cumbersome chemical treatment to convert the entire chromium into the measurable form of Cr(VI), with subsequent measurement by UV-Visible spectrophotometer at 350 nm or 373 nm. As this method involves a time-consuming sample preparation step, it is also not a preferred method for an industrial laboratory where high analytical loads normally exist and quick analytical feedback is required. In view of the limitations of the methods cited above, an attempt has been made to develop a simple and direct method for the estimation of chromium in different grades of steels containing chromium in the range of 4.75%-26%. Further, the present paper also evaluates the measurement uncertainty (MU) in the measurement of chromium in different grades of steels. The developed method involves dissolution of the steel in aqua regia followed by perchloric acid fuming to convert the total chromium to Cr(VI), and subsequent measurement at 447 nm after adding phosphoric acid to a suitable aliquot taken from the stock solution. Phosphoric acid is added to mask the iron present in solution. To quantify the measurement uncertainty, the methodology given in the EURACHEM/CITAC guide CG-4 has been followed. The expanded uncertainty at 95% confidence limit is

  4. Uncertainty of the calibration factor

    International Nuclear Information System (INIS)

    According to present definitions, an error is the difference between a measured value and the "true" value. Thus an error has both a numerical value and a sign. In contrast, the uncertainty associated with a measurement is a parameter that characterizes the dispersion of the values "that could reasonably be attributed to the measurand". This parameter is normally an estimated standard deviation. An uncertainty, therefore, has no known sign and is usually assumed to be symmetrical. It is a measure of our lack of exact knowledge, after all recognized "systematic" effects have been eliminated by applying appropriate corrections. If errors were known exactly, the true value could be determined and there would be no problem left. In reality, errors are estimated in the best possible way and corrections made for them. Therefore, after application of all known corrections, errors need no further consideration (their expectation value being zero) and the only quantities of interest are uncertainties. 3 refs, 2 figs

  5. Uncertainties achievable for uranium isotope-amount ratios. Estimates based on the precision and accuracy of recent characterization measurements

    International Nuclear Information System (INIS)

    Certified reference materials (CRMs) recently characterized by NBL for isotope-amount ratios are: (i) CRM 112-A, Uranium (normal) Metal Assay and Isotopic Standard, (ii) CRM 115, Uranium (depleted) Metal Assay and Isotopic Standard, and (iii) CRM 116-A, Uranium (enriched) Metal Assay and Isotopic Standard. NBL also completed re-characterization of the isotope-amount ratios in CRM 125-A, Uranium (UO2) Pellet Assay, Isotopic, and Radio-chronometric Standard. Three different TIMS analytical techniques were employed for the characterization analyses. The total evaporation technique was used for the major isotope-amount ratio measurement, the modified total evaporation technique was used for both the major and minor isotope-amount ratios, and the minor isotope-amount ratios were also measured using a conventional technique. Uncertainties for the characterization studies were calculated from the combined TIMS data sets following the ISO Guide to the expression of uncertainty in measurement. The uncertainty components for the isotope-amount ratio values are discussed. (author)

  6. Uncertainty evaluation of fluid dynamic models and validation by gamma ray transmission measurements of the catalyst flow in a FCC cold pilot unity

    Energy Technology Data Exchange (ETDEWEB)

    Teles, Francisco A.S.; Santos, Ebenezer F.; Dantas, Carlos C., E-mail: francisco.teles@ufpe.br [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil). Centro de Tecnologia e Geociencias. Departamento de Energia Nuclear]; Melo, Silvio B., E-mail: sbm@cin.ufpe.br [Universidade Federal de Pernambuco (CIN/UFPE), Recife, PE (Brazil). Centro de Informatica]; Santos, Valdemir A. dos, E-mail: vas@unicap.br [Universidade Catolica de Pernambuco (UNICAP), Recife, PE (Brazil). Dept. de Quimica]; Lima, Emerson A.O., E-mail: emathematics@gmail.com [Universidade de Pernambuco (POLI/UPE), Recife, PE (Brazil). Escola Politecnica]

    2013-07-01

    In this paper, the fluid dynamics of the Fluid Catalytic Cracking (FCC) process is investigated by means of a Cold Flow Pilot Unit (CFPU) constructed in Plexiglas to visualize operational conditions. Axial and radial catalyst profiles were measured by gamma ray transmission in the riser of the CFPU. The standard uncertainty was evaluated for volumetric solid fraction measurements at several concentrations at a given point of the axial profile. Monitoring of the pressure drop in the riser shows good agreement with the measured standard uncertainty data. A further evaluation of the combined uncertainty was applied to the volumetric solid fraction equation using gamma transmission data. A limit condition of catalyst concentration in the riser was defined, and a simulation with random numbers generated in MATLAB was used to test the uncertainty evaluation. The Guide to the expression of Uncertainty in Measurement (GUM) is based on the law of propagation of uncertainty and on the characterization of the measured quantities by means of either a Gaussian distribution or a t-distribution, which allows measurement uncertainty to be delimited by means of a confidence interval. A variety of supplements to the GUM are being developed, which will progressively enter into effect. The first of these supplements [3] describes an alternative procedure for the calculation of uncertainties: the Monte Carlo Method (MCM). MCM is an alternative to the GUM, since it performs a characterization of the measured quantities based on random sampling of the probability distribution functions. This paper also explains the basic implementation of the MCM in MATLAB. (author)
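
    A minimal sketch of MCM propagation in the spirit of GUM Supplement 1 (written in Python rather than MATLAB; the transmission model and all input distributions below are invented for illustration): sample the inputs, evaluate the measurand for every draw, and read off the estimate and a 95% coverage interval from the resulting distribution.

      import numpy as np

      rng = np.random.default_rng(1)
      M = 200_000                          # Monte Carlo trials (GUM Suppl. 1)

      # Illustrative model: solid fraction from gamma transmission,
      # eps = ln(I0/I) / (mu * rho * D), with uncertain inputs
      I0  = rng.normal(1.00e5, 0.5e3, M)   # incident count rate
      I   = rng.normal(8.00e4, 0.4e3, M)   # transmitted count rate
      mu  = rng.normal(0.086, 0.002, M)    # mass attenuation coeff., cm2/g
      rho = rng.normal(1.50, 0.02, M)      # catalyst density, g/cm3
      D   = rng.uniform(17.9, 18.1, M)     # riser inner diameter, cm

      eps = np.log(I0 / I) / (mu * rho * D)
      lo, hi = np.percentile(eps, [2.5, 97.5])
      print(eps.mean(), eps.std(), (lo, hi))  # estimate, u, 95% interval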

  7. Low flow measurement for infusion pumps: implementation and uncertainty determination of the normalized method

    International Nuclear Information System (INIS)

    Intravenous drug delivery is a standard practice in hospitalized patients. As the blood concentration reached depends directly on the infusion rate, it is important to use safe devices that guarantee output accuracy. In pediatric intensive care units, low infusion rates (i.e. lower than 10.0 ml/h) are frequently used. Thus, control programs are needed to search for deviations in this flow range. We describe the implementation of a gravimetric method to test infusion pumps at low flow delivery. The procedure recommended by the ISO/IEC 60601-2-24 standard was used, being a reasonable option among the methods frequently used in hospitals, such as infusion pump analyzers and volumetric cylinders. The main uncertainty sources affecting this method are reviewed, and a numerical and graphical uncertainty analysis is presented in order to show their dependence on flow. Additionally, the obtained uncertainties are compared to those presented by an automatic flow analyzer. Finally, the results of a series of tests performed on a syringe infusion pump operating at low rates are shown.
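
    A minimal sketch of the gravimetric evaluation (flow from collected mass over time, with first-order propagation of the main relative uncertainties; the component magnitudes are assumptions of this sketch and ignore evaporation and buoyancy corrections):

      import numpy as np

      def gravimetric_flow(mass_g, time_h, rho_g_per_ml=0.998):
          """Mean delivered flow in ml/h from a weighed collection."""
          return mass_g / rho_g_per_ml / time_h

      def flow_uncertainty(flow, u_mass_rel, u_rho_rel, u_time_rel):
          """First-order relative propagation for Q = m / (rho * t)."""
          return flow * np.sqrt(u_mass_rel**2 + u_rho_rel**2 + u_time_rel**2)

      q = gravimetric_flow(mass_g=1.001, time_h=1.0)   # ~1 ml/h test point
      u = flow_uncertainty(q, u_mass_rel=2e-4,         # balance (assumed)
                              u_rho_rel=2e-4,          # water density (assumed)
                              u_time_rel=1e-5)         # timing (assumed)
      print(q, u)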

  8. Development and optimization of neutron measurement methods by fission chamber on experimental reactors - management, treatment and reduction of uncertainties

    International Nuclear Information System (INIS)

    The main objectives of this research thesis are the management and reduction of the uncertainties associated with measurements performed by means of a fission-chamber type sensor. The author first recalls the role of experimental reactors in nuclear research and presents the various sensors used in nuclear detection (photographic film, scintillation sensors, gas ionization sensors, semiconductor sensors, other types of radiation sensors), addressing neutron detection in particular (activation sensors, gas-filled sensors). In a second part, the author gives an overview of the state of the art of neutron measurement by fission chamber in a mock-up reactor (signal formation, processing and post-processing, associated measurements and uncertainties, feedback from measurements by fission chamber on the Masurca and Minerve research reactors). In a third part, he reports the optimization of two intrinsic parameters of this sensor: the thickness of the fissile material deposit, and the pressure and nature of the filling gas. The fourth part addresses the improvement of the measurement electronics and of the post-processing methods used for result analysis. The fifth part deals with the optimization of spectrum index measurements by means of a fission chamber. The impact of each parameter is quantified. The results explain some inconsistencies noticed in measurements performed on the Minerve reactor in 2004, and allow the reduction of biases with respect to computed values

  9. Electromagnetic-coil (EM-coil) measurement technique to verify presence of metal/absence of oxide attribute

    International Nuclear Information System (INIS)

    This paper summarizes how an electromagnetic-coil (EM-coil) measurement technique can be used to discriminate between plutonium metal and plutonium oxide inside sealed storage containers. As evidence, measurements on a variety of metals and their oxides are presented. This non-radiation measurement method provides assurance of the 'presence of metal/absence of oxide' attribute in less than a minute. During initial development, researchers at Pacific Northwest Laboratory demonstrated the ability of this method to discriminate between aluminum and aluminum oxide placed inside an AT-400R storage container (total stainless steel wall thickness of over 2.5 cm). Similar results are expected for plutonium, since Pu metal is electrically conductive and Pu oxide behaves as an electrical insulator; at this writing, work is underway to perform the same demonstration using plutonium and plutonium oxide. Similar success has been demonstrated with ALR-8 storage containers (basically carbon steel drums). Within these container types, two scenarios have been explored: (1) the same configuration made from different metals, to demonstrate material-property effects; a clear distinction was seen between the slight alloy differences among various forms of aluminum and brass in the same configuration; and (2) the same metal configured differently, to demonstrate how mass distribution affects the EM signature; hundreds of BBs (each about 2 mm in diameter) were placed in different containers to show how a slight change in distribution affects the EM signature, and with a five percent change in BB container diameter the resulting signature changes are clear. This measurement method offers an extremely wide dynamic range, resulting from its sensitivity to the wide range of electrical conductivity and magnetic permeability found in most metals and alloys. In fact, electrical conductivity spans the widest spectrum of all the known physical properties. Most insulators such as the oxides cover the

  10. Improved profile fitting and quantification of uncertainty in experimental measurements of impurity transport coefficients using Gaussian process regression

    International Nuclear Information System (INIS)

    The need to fit smooth temperature and density profiles to discrete observations is ubiquitous in plasma physics, but the prevailing techniques for this have many shortcomings that cast doubt on the statistical validity of the results. This issue is amplified in the context of validation of gyrokinetic transport models (Holland et al 2009 Phys. Plasmas 16 052301), where the strong sensitivity of the code outputs to input gradients means that inadequacies in the profile fitting technique can easily lead to an incorrect assessment of the degree of agreement with experimental measurements. In order to rectify the shortcomings of standard approaches to profile fitting, we have applied Gaussian process regression (GPR), a powerful non-parametric regression technique, to analyse an Alcator C-Mod L-mode discharge used for past gyrokinetic validation work (Howard et al 2012 Nucl. Fusion 52 063002). We show that the GPR techniques can reproduce the previous results while delivering more statistically rigorous fits and uncertainty estimates for both the value and the gradient of plasma profiles with an improved level of automation. We also discuss how the use of GPR can allow for dramatic increases in the rate of convergence of uncertainty propagation for any code that takes experimental profiles as inputs. The new GPR techniques for profile fitting and uncertainty propagation are quite useful and general, and we describe the steps to implementation in detail in this paper. These techniques have the potential to substantially improve the quality of uncertainty estimates on profile fits and the rate of convergence of uncertainty propagation, making them of great interest for wider use in fusion experiments and modelling efforts. (paper)
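
    As a self-contained illustration of the GPR machinery the paper builds on, the following Python sketch fits noisy profile data with a squared-exponential kernel and returns the posterior mean and standard deviation. It is a minimal stand-in, not the authors' implementation: hyperparameters are fixed by assumption rather than learned from the marginal likelihood, and the gradient posterior (obtained by differentiating the kernel) is omitted.

        import numpy as np

        def sq_exp_kernel(xa, xb, amp=1.0, ell=0.2):
            """Squared-exponential covariance k(x, x') = amp^2 exp(-(x - x')^2 / (2 ell^2))."""
            d = xa[:, None] - xb[None, :]
            return amp**2 * np.exp(-0.5 * (d / ell)**2)

        def gpr_fit(x, y, sigma_n, xs, amp=1.0, ell=0.2):
            """Posterior mean and std of a GP at locations xs, given data (x, y)
            with independent Gaussian noise levels sigma_n."""
            K = sq_exp_kernel(x, x, amp, ell) + np.diag(sigma_n**2)
            Ks = sq_exp_kernel(xs, x, amp, ell)
            Kss = sq_exp_kernel(xs, xs, amp, ell)
            L = np.linalg.cholesky(K)
            alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
            v = np.linalg.solve(L, Ks.T)
            cov = Kss - v.T @ v
            return Ks @ alpha, np.sqrt(np.clip(np.diag(cov), 0.0, None))

        # Synthetic stand-in for discrete, noisy profile observations.
        rng = np.random.default_rng(0)
        x = np.linspace(0.0, 1.0, 15)
        sigma_n = 0.05 * np.ones_like(x)
        y = np.exp(-3 * x) + sigma_n * rng.standard_normal(x.size)

        xs = np.linspace(0.0, 1.0, 101)
        mean, std = gpr_fit(x, y, sigma_n, xs)
        print(mean[:3], std[:3])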

  11. Uncertainty in hydrological signatures for gauged and ungauged catchments

    Science.gov (United States)

    Westerberg, Ida K.; Wagener, Thorsten; Coxon, Gemma; McMillan, Hilary K.; Castellarin, Attilio; Montanari, Alberto; Freer, Jim

    2016-03-01

    Reliable information about hydrological behavior is needed for water-resource management and scientific investigations. Hydrological signatures quantify catchment behavior as index values, and can be predicted for ungauged catchments using a regionalization procedure. The prediction reliability is affected by data uncertainties for the gauged catchments used in prediction and by uncertainties in the regionalization procedure. We quantified signature uncertainty stemming from discharge data uncertainty for 43 UK catchments and propagated these uncertainties in signature regionalization, while accounting for regionalization uncertainty with a weighted-pooling-group approach. Discharge uncertainty was estimated using Monte Carlo sampling of multiple feasible rating curves. For each sampled rating curve, a discharge time series was calculated and used in deriving the gauged signature uncertainty distribution. We found that the gauged uncertainty varied with signature type, local measurement conditions and catchment behavior, with the highest uncertainties (median relative uncertainty ±30-40% across all catchments) for signatures measuring high- and low-flow magnitude and dynamics. Our regionalization method allowed us to assess the role and relative magnitudes of the gauged and regionalized uncertainty sources in shaping the signature uncertainty distributions predicted for catchments treated as ungauged. We found that (1) if the gauged uncertainties were neglected there was a clear risk of overconditioning the regionalization inference, e.g., by attributing catchment differences resulting from gauged uncertainty to differences in catchment behavior, and (2) uncertainty in the regionalization results was lower for signatures measuring flow distribution (e.g., mean flow) than flow dynamics (e.g., autocorrelation), and lower for average flows (followed by high flows) than for low flows.
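
    The Monte Carlo step described above can be sketched in a few lines of Python: sample feasible rating-curve parameters, convert the stage record to discharge, and accumulate the signature distribution. The power-law curve form, the parameter ranges and the mean-flow signature below are illustrative assumptions, not values from the study.

        import numpy as np

        rng = np.random.default_rng(42)

        # Illustrative stage (water level) time series in metres.
        stage = 0.5 + 0.3 * np.abs(np.sin(np.linspace(0, 12, 365)))

        def discharge(h, a, b, h0):
            """Power-law rating curve Q = a * (h - h0)^b, a commonly assumed form."""
            return a * np.clip(h - h0, 0.0, None) ** b

        # Monte Carlo over feasible rating-curve parameters (assumed distributions).
        n_mc = 2000
        signature = np.empty(n_mc)
        for i in range(n_mc):
            a = rng.normal(10.0, 1.0)    # scale
            b = rng.normal(1.8, 0.1)     # exponent
            h0 = rng.normal(0.1, 0.02)   # cease-to-flow stage
            signature[i] = discharge(stage, a, b, h0).mean()  # signature: mean flow

        lo, med, hi = np.percentile(signature, [2.5, 50, 97.5])
        print(f"mean flow: {med:.2f} m3/s, 95% interval [{lo:.2f}, {hi:.2f}]")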

  12. Past changes in the vertical distribution of ozone - Part 1: Measurement techniques, uncertainties and availability

    OpenAIRE

    Hassler, B.; Petropavlovskikh, I.; Staehelin, J.; August, T.; Bhartia, P. K.; Clerbaux, C.; Degenstein, D.; De Mazière, M.; Dinelli, B. M.; Dudhia, A.; Dufour, G.; Frith, S. M.; Froidevaux, L.; Godin-Beekmann, S.; Granville, J.

    2014-01-01

    Peak stratospheric chlorofluorocarbon (CFC) and other ozone depleting substance (ODS) concentrations were reached in the mid to late 1990s. Detection and attribution of the expected recovery of the stratospheric ozone layer in an atmosphere with reduced ODSs as well as efforts to understand the evolution of stratospheric ozone in the presence of increasing greenhouse gases are key current research topics. These require a critical examination of the ozone changes with an accurate knowledge of ...

  14. Method validation and measurement uncertainty for the simultaneous determination of synthetic phenolic antioxidants in edible oils commonly consumed in Korea.

    Science.gov (United States)

    Kim, Jae-Min; Choi, Seung-Hyun; Shin, Gi-Hae; Lee, Jin-Ha; Kang, Seong-Ran; Lee, Kyun-Young; Lim, Ho-Soo; Kang, Tae Seok; Lee, Ok-Hwan

    2016-12-15

    This study validated a method, and determined the measurement uncertainty, for the simultaneous determination of synthetic phenolic antioxidants (SPAs) such as propyl gallate (PG), octyl gallate (OG), dodecyl gallate (DG), 2,4,5-trihydroxy butyrophenone (THBP), tert-butylhydroquinone (TBHQ), butylated hydroxyanisole (BHA), and butylated hydroxytoluene (BHT) in edible oils commonly consumed in Korea. The validated method was able to extract SPA residues under the optimized HPLC-UV and LC-MS/MS conditions. Furthermore, the measurement uncertainty was evaluated based on the precision study. For HPLC-UV analysis, the recoveries of SPAs ranged from 91.4% to 115.9%, with relative standard deviations between 0.3% and 11.4%. In addition, the expanded uncertainties of the SPAs ranged from 0.15 to 5.91. These results indicate that the validated method is appropriate for the extraction and determination of SPAs and can be used to verify the safety of edible oil products containing SPA residues. PMID:27451150
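
    For readers unfamiliar with the mechanics, a generic GUM-style sketch of how an expanded uncertainty follows from a precision study is given below; the replicate recoveries and the calibration component are invented for illustration and do not reproduce the paper's uncertainty budget.

        import numpy as np

        # Replicate recoveries (%) for one analyte from a precision study (illustrative).
        replicates = np.array([95.2, 97.1, 93.8, 96.4, 94.9, 95.7])

        u_prec = replicates.std(ddof=1) / np.sqrt(replicates.size)  # uncertainty of the mean
        u_cal = 0.5                   # assumed calibration standard uncertainty, same units
        u_c = np.sqrt(u_prec**2 + u_cal**2)   # combined standard uncertainty
        U = 2.0 * u_c                 # expanded uncertainty, coverage factor k = 2 (~95 %)
        print(f"u_c = {u_c:.2f}, U = {U:.2f}")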

  15. Estimation of experimental uncertainty for physical measurements based on the start-up data of the latest VVER-1000 units

    International Nuclear Information System (INIS)

    During the last decades, several new VVER-1000 units of different designs have been commissioned, and at some of them similar initial fuel loadings were implemented. To determine the neutron-physics characteristics of these units experimentally, almost identical physics start-up test programs were carried out with similar measurement equipment. The paper presents an analysis of some of the neutron-physics results obtained during the physics start-up of VVER-1000 units with similar initial fuel loadings. The subject of the analysis is an estimation of the experimental uncertainties. An effort is made to treat each individual test result as a single random realization of a universal set; it is assumed that analysing the spread of the individual results helps to estimate the general inherent uncertainty of the experiment.
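
    The idea of treating each unit's result as one random realization of a common population can be sketched as follows; the values are invented and stand for the same start-up test repeated at several units with similar loadings.

        import numpy as np

        # The same physics start-up test result at five units (illustrative values).
        results = np.array([1.02, 0.98, 1.01, 0.97, 1.00])

        s = results.std(ddof=1)              # spread = inherent experimental uncertainty
        u_mean = s / np.sqrt(results.size)   # standard uncertainty of the pooled mean
        print(f"mean = {results.mean():.3f}, spread s = {s:.3f}, u(mean) = {u_mean:.3f}")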

  16. Uncertainties in linear energy transfer spectra measured with track-etched detectors in space

    Czech Academy of Sciences Publication Activity Database

    Pachnerová Brabcová, Kateřina; Ambrožová, Iva; Kolísková, Zlata; Malušek, Alexandr

    2013-01-01

    Vol. 713, Jun 11 (2013), pp. 5-10. ISSN 0168-9002. R&D Projects: GA ČR GA205/09/0171; GA AV ČR IAA100480902; GA AV ČR KJB100480901; GA ČR GD202/09/H086. Institutional research plan: CEZ:AV0Z10480505. Institutional support: RVO:61389005. Keywords: CR-39 * linear energy transfer * uncertainty model * space dosimetry. Subject RIV: BG - Nuclear, Atomic and Molecular Physics, Colliders. Impact factor: 1.316, year: 2013

  17. Uncertainty assessment in measurement of myotube thickness in cells culture treated with and without therapeutic ultrasound

    International Nuclear Information System (INIS)

    The effectiveness of an ultrasound treatment must be assessed by experiments. A reliable cell-culture protocol is available, but spatial discrepancies can arise. To determine whether the spatial differences are relevant, and how they should be dealt with, an uncertainty model for the treatment result is a metrologically reliable solution. The present work reports a metrological approach to assess myotube thickness and to validate a primary muscle-cell culture after a therapeutic ultrasound treatment, comparing it with a control group. The results reinforce the importance of such an approach and show the efficacy of the treatment on myotube differentiation.

  18. On the evaluation of a fuel assembly design by means of uncertainty and sensitivity measures

    International Nuclear Information System (INIS)

    This paper provides the results of an uncertainty and sensitivity study used to calculate parameters of safety-related importance, such as the fuel centerline temperature, the cladding temperature and the fuel-assembly pressure drop of a lead-alloy-cooled fast system. Applying best-practice guidelines, a list of uncertain parameters has been identified. The parameter variations considered are based on the experience gained during fabrication and operation of former and existing liquid-metal-cooled fast systems, as well as on experimental results and engineering judgment. (orig.)

  19. A Dynamic, Multivariate Sustainability Measure for Robust Analysis of Water Management under Climate and Demand Uncertainty in an Arid Environment

    Directory of Open Access Journals (Sweden)

    Christian Hunter

    2015-10-01

    Considering water resource scarcity and uncertainty in climate and demand futures, decision-makers require techniques for sustainability analysis in resource management. Through unclear definitions of “sustainability”, however, traditional indices for resource evaluation propose options of limited flexibility, adopting static climate and demand scenarios and limiting analysis variables to a particular water-use group and time. This work proposes a robust, multivariate, dynamic sustainability evaluation technique and a corresponding performance indicator called the Measure of Sustainability (MoS) for resource management that is better adapted to withstand future parameter variation. The range of potential future climate and demand scenarios is simulated through a calibrated hydrological model of Copiapó, Chile, a case-study example of an arid watershed under extreme natural and anthropogenic water stresses. Comparing MoS and cost rankings of proposed water management schemes, this paper determines that the traditional evaluation method not only underestimates future water deficits but also espouses solutions without considering uncertainties in supply and demand. Given the uncertainty of the future and the dependence of resources upon climate and market trajectories, the MoS methodology proposes solutions that, while perhaps not the most optimal, are robust to variations in future parameter values and are thus the best water management options in a stochastic natural world.

  20. An Indicator of Research Front Activity: Measuring Intellectual Organization as Uncertainty Reduction in Document Sets

    CERN Document Server

    Lucio-Arias, Diana

    2009-01-01

    When using scientific literature to model scholarly discourse, a research specialty can be operationalized as an evolving set of related documents. Each publication can be expected to contribute to the further development of the specialty at the research front. The specific combinations of title words and cited references in a paper can then be considered as a signature of the knowledge claim in the paper: new words and combinations of words can be expected to represent variation, while each paper is at the same time selectively positioned into the intellectual organization of a field using context-relevant references. Can the mutual information among these three dimensions--title words, cited references, and sequence numbers--be used as an indicator of the extent to which intellectual organization structures the uncertainty prevailing at a research front? The effect of the discovery of nanotubes (1991) on the previously existing field of fullerenes is used as a test case. Thereafter, this method is applied t...
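
    The indicator reduces to entropy arithmetic over the three dimensions. Under one common sign convention, the interaction information is I(X;Y;Z) = H(X) + H(Y) + H(Z) - H(X,Y) - H(X,Z) - H(Y,Z) + H(X,Y,Z), and a negative value signals that the third dimension adds, rather than reduces, uncertainty. A toy Python sketch with invented paper "signatures" (not the fullerene/nanotube corpus):

        import numpy as np
        from collections import Counter

        def entropy(counts):
            """Shannon entropy (bits) of a count distribution."""
            p = np.array(list(counts.values()), dtype=float)
            p /= p.sum()
            return -np.sum(p * np.log2(p))

        # Toy "papers": (title word, cited reference, publication-order bin) triples.
        papers = [("nanotube", "ref_A", 1), ("fullerene", "ref_A", 1),
                  ("nanotube", "ref_B", 2), ("nanotube", "ref_B", 2),
                  ("fullerene", "ref_C", 2), ("nanotube", "ref_C", 3)]

        w = Counter(p[0] for p in papers)
        r = Counter(p[1] for p in papers)
        t = Counter(p[2] for p in papers)
        wr = Counter((p[0], p[1]) for p in papers)
        wt = Counter((p[0], p[2]) for p in papers)
        rt = Counter((p[1], p[2]) for p in papers)
        wrt = Counter(papers)

        mi3 = (entropy(w) + entropy(r) + entropy(t)
               - entropy(wr) - entropy(wt) - entropy(rt) + entropy(wrt))
        print(f"I(words; refs; time) = {mi3:.3f} bits")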

  1. Uncertainty analysis of heat flux measurements estimated using a one-dimensional, inverse heat-conduction program.

    Energy Technology Data Exchange (ETDEWEB)

    Nakos, James Thomas; Figueroa, Victor G.; Murphy, Jill E. (Worcester Polytechnic Institute, Worcester, MA)

    2005-02-01

    The measurement of heat flux in hydrocarbon fuel fires (e.g., diesel or JP-8) is difficult due to high temperatures and the sooty environment. Uncooled commercially available heat flux gages do not survive in long-duration fires, and cooled gages often become covered with soot, thus changing the gage calibration. An alternative method that is rugged and relatively inexpensive is based on inverse heat conduction methods. Inverse heat-conduction methods estimate the absorbed heat flux at specific material interfaces using temperature/time histories, boundary conditions, material properties, and usually an assumption of one-dimensional (1-D) heat flow. This method is commonly used at Sandia's fire test facilities. In this report, an uncertainty analysis was performed for a specific example to quantify the effect of input parameter variations on the estimated heat flux when using the inverse heat conduction method. The approach used was to compare results from a number of cases with modified inputs to a base case. The response of a 304 stainless-steel cylinder [about 30.5 cm (12 in.) in diameter and 0.32 cm (1/8 in.) thick] filled with 2.5-cm-thick (1-in.) ceramic fiber insulation was examined. The input parameters varied were the steel-wall thickness, thermal conductivity and volumetric heat capacity; the insulation thickness, thermal conductivity and volumetric heat capacity; the temperature uncertainty; the boundary conditions; the temperature sampling period; and numerical inputs. One-dimensional heat transfer was assumed in all cases. Results of the analysis show that, at the maximum heat flux, the most important parameters were the temperature uncertainty, the steel thickness and the steel volumetric heat capacity. The use of constant thermal properties rather than temperature-dependent values also made a significant difference in the resultant heat flux; therefore, temperature-dependent values should be used. As an example, several parameters were varied to ...
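
    The compare-to-base-case procedure can be illustrated with a deliberately simplified forward model: a lumped-capacitance wall in place of the report's full 1-D inverse code, with rough 304 stainless-steel figures and a synthetic temperature history.

        import numpy as np

        def inverse_flux(T, dt, thickness, vol_heat_cap):
            """Absorbed heat flux (W/m^2) from a thin wall's temperature history via a
            lumped-capacitance energy balance, q = (rho*c) * L * dT/dt; a stand-in
            for a full 1-D inverse heat-conduction calculation."""
            return vol_heat_cap * thickness * np.gradient(T, dt)

        # Base case: 1/8-in. (0.0032 m) steel wall heating in a fire (synthetic data).
        dt = 1.0                                               # sampling period, s
        T = 300.0 + 600.0 * (1 - np.exp(-np.arange(0, 300, dt) / 60.0))
        base = {"thickness": 0.0032, "vol_heat_cap": 3.9e6}    # m, J/(m^3 K)

        q_base = inverse_flux(T, dt, **base).max()

        # One-at-a-time perturbations, each compared against the base case.
        for name, factor in [("thickness", 1.05), ("vol_heat_cap", 1.05)]:
            p = dict(base)
            p[name] *= factor
            q = inverse_flux(T, dt, **p).max()
            print(f"{name} +5% -> peak flux changes by {100 * (q / q_base - 1):+.1f}%")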

  2. SU-D-303-03: Impact of Uncertainty in T1 Measurements On Quantification of Dynamic Contrast Enhanced MRI

    Energy Technology Data Exchange (ETDEWEB)

    Aryal, M; Cao, Y [The University of Michigan, Ann Arbor, MI (United States)

    2015-06-15

    Purpose: Quantification of dynamic contrast enhanced (DCE) MRI requires a native longitudinal relaxation time (T1) measurement. This study aimed to assess the uncertainty in T1 measurements using two different methods. Methods and Materials: Brain MRI scans were performed on a 3T scanner in 9 patients who had low-grade/benign tumors and partial brain radiotherapy without chemotherapy at pre-RT, week 3 during RT (wk-3), end-RT, and 1, 6 and 18 months after RT. T1-weighted images were acquired using gradient echo sequences with (1) 2 different flip angles (5° and 15°), and (2) 5 variable TRs (100-2000 ms). After creating quantitative T1 maps, the average T1 was calculated in regions of interest (ROIs) that were distant from tumors and received a total accumulated radiation dose < 5 Gy at wk-3. ROIs included the left and right normal putamen and thalamus (gray matter: GM), and frontal and parietal white matter (WM). Since there were no significant T1 changes, or even a trend, from pre-RT to wk-3 in these ROIs, a relative repeatability coefficient (RC) of T1 was estimated as a measure of uncertainty in each ROI using the data at pre-RT and wk-3. The individual T1 changes at later time points were evaluated against the estimated RCs. Results: The 2-flip-angle method produced small RCs in GM (9.7-11.7%) but large RCs in WM (12.2-13.6%) compared to the saturation-recovery (SR) method (11.0-17.7% for GM and 7.5-11.2% for WM). More than 81% of the individual T1 changes were within the T1 uncertainty ranges defined by the RCs. Conclusion: Our study suggests that the impact of T1 uncertainty on physiological parameters derived from DCE MRI is not negligible. A short scan with 2 flip angles is able to achieve repeatability of T1 estimates similar to a long scan with 5 different TRs, and is desirable for integration in the DCE protocol. The present study was supported by the National Institutes of Health (NIH) under grant numbers UO1 CA183848 and RO1 NS064973.
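
    For reference, a standard way to obtain T1 from two gradient-echo flip angles uses the SPGR signal equation and its linearization S/sin(a) = E1 * S/tan(a) + M0 (1 - E1), with E1 = exp(-TR/T1); the sketch below uses illustrative numbers, not patient data.

        import numpy as np

        def t1_two_flip_angles(s1, s2, a1, a2, tr):
            """T1 from two SPGR measurements via the linearized signal equation;
            the slope through the two (S/tan, S/sin) points equals E1."""
            x = np.array([s1 / np.tan(a1), s2 / np.tan(a2)])
            y = np.array([s1 / np.sin(a1), s2 / np.sin(a2)])
            e1 = (y[1] - y[0]) / (x[1] - x[0])
            return -tr / np.log(e1)

        # Illustrative: TR = 20 ms, true T1 = 1200 ms, flip angles 5 and 15 degrees.
        tr, t1_true = 20.0, 1200.0
        e1 = np.exp(-tr / t1_true)
        spgr = lambda a: np.sin(a) * (1 - e1) / (1 - e1 * np.cos(a))  # M0 = 1
        a1, a2 = np.deg2rad(5.0), np.deg2rad(15.0)
        print(f"recovered T1 = {t1_two_flip_angles(spgr(a1), spgr(a2), a1, a2, tr):.0f} ms")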

  4. Lessons learnt on biases and uncertainties in personal exposure measurement surveys of radiofrequency electromagnetic fields with exposimeters.

    Science.gov (United States)

    Bolte, John F B

    2016-09-01

    Personal exposure measurements of radio frequency electromagnetic fields are important for epidemiological studies and for developing prediction models. Minimizing biases and uncertainties and handling spatial and temporal variability are important aspects of these measurements. This paper reviews the lessons learnt from testing the different types of exposimeters and from personal exposure measurement surveys performed between 2005 and 2015. Applying them will improve the comparability and ranking of exposure levels for different microenvironments, activities or (groups of) people, such that epidemiological studies are better capable of finding potential weak correlations with health effects. Over 20 papers have been published on how to prevent biases and minimize uncertainties due to: mechanical errors; the design of hardware and software filters; anisotropy; and the influence of the body. A number of biases can be corrected for by determining multiplicative correction factors. In addition, a good protocol on how to wear the exposimeter, a sufficiently small sampling interval and a sufficiently long measurement duration will minimize biases. Corrections are possible for: non-detects (through the detection limit), erroneous manufacturer calibration and temporal drift. Corrections not deemed necessary, because no significant biases have been observed, are: linearity in response and resolution. Corrections difficult to perform after measurements are for: modulation/duty-cycle sensitivity; out-of-band response (cross-talk); temperature and humidity sensitivity. Corrections not possible to perform after measurements are for: detection of multiple signals in one band; flatness of response within a frequency band; anisotropy to waves of different elevation angles. An analysis of 20 microenvironmental surveys showed that early studies using exposimeters with logarithmic detectors overestimated exposure to signals with bursts, such as uplink signals from mobile phones and Wi-Fi.

  5. Materials accounting in a fast-breeder-reactor fuels-reprocessing facility: optimal allocation of measurement uncertainties

    International Nuclear Information System (INIS)

    This report describes the conceptual design of a materials accounting system for the feed preparation and chemical separations processes of a fast breeder reactor spent-fuel reprocessing facility. For the proposed accounting system, optimization techniques are used to calculate instrument measurement uncertainties that meet four different accounting performance goals while minimizing the total development cost of the instrument systems. We identify the instruments that require development to meet performance goals, as well as the measurement uncertainty components that dominate the materials balance variance. Materials accounting in the feed preparation process is complicated by large in-process inventories and spent-fuel assembly inputs that are difficult to measure. To meet 8 kg of plutonium abrupt and 40 kg of plutonium protracted loss-detection goals, materials accounting in the chemical separations process requires: process tank volume and concentration measurements having a precision less than or equal to 1%; accountability and plutonium sample tank volume measurements having a precision less than or equal to 0.3%, a short-term correlated error less than or equal to 0.04%, and a long-term correlated error less than or equal to 0.04%; and accountability and plutonium sample tank concentration measurements having a precision less than or equal to 0.4%, a short-term correlated error less than or equal to 0.1%, and a long-term correlated error less than or equal to 0.05%. The effects of process design on materials accounting are identified. Major areas of concern include the voloxidizer, the continuous dissolver, and the accountability tank.
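
    The quantity being controlled by the optimization, the materials-balance variance, combines the error components named above. A toy propagation sketch follows (invented numbers; random terms add in quadrature per measurement, correlated terms add linearly across measurements before squaring):

        import numpy as np

        # Measurements entering one balance (kg Pu): value and relative error components
        # (precision, short-term correlated, long-term correlated) -- illustrative.
        measurements = [
            (50.0, 0.004, 0.001, 0.0005),   # input transfer
            (48.5, 0.004, 0.001, 0.0005),   # product output
            (1.2,  0.010, 0.000, 0.0000),   # in-process inventory change
        ]

        var_random = sum((v * p)**2 for v, p, s, l in measurements)
        sd_short = sum(v * s for v, p, s, l in measurements)
        sd_long = sum(v * l for v, p, s, l in measurements)
        sigma_mb = np.sqrt(var_random + sd_short**2 + sd_long**2)
        print(f"materials balance sigma = {sigma_mb:.3f} kg")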

  6. Considering sampling strategy and cross-section complexity for estimating the uncertainty of discharge measurements using the velocity-area method

    Science.gov (United States)

    Despax, Aurélien; Perret, Christian; Garçon, Rémy; Hauet, Alexandre; Belleville, Arnaud; Le Coz, Jérôme; Favre, Anne-Catherine

    2016-02-01

    Streamflow time series provide baseline data for many hydrological investigations. Errors in the data mainly occur through uncertainty in gauging (measurement uncertainty) and uncertainty in the determination of the stage-discharge relationship based on gaugings (rating curve uncertainty). As the velocity-area method is the measurement technique typically used for gaugings, it is fundamental to estimate its level of uncertainty. Different methods are available in the literature (ISO 748, Q+, IVE), all with their own limitations and drawbacks. Among the terms forming the combined relative uncertainty in measured discharge, the uncertainty component relating to the limited number of verticals often accounts for a large part of the relative uncertainty. It should therefore be estimated carefully. In the ISO 748 standard, the proposed values of this uncertainty component depend only on the number of verticals, without considering their distribution with respect to the depth and velocity cross-sectional profiles. The Q+ method is sensitive to a user-defined parameter, while it is questionable whether the IVE method is applicable to stream-gaugings performed with a limited number of verticals. To address the limitations of existing methods, this paper presents a new methodology, called FLow Analog UnceRtainty Estimation (FLAURE), to estimate the uncertainty component relating to the limited number of verticals. High-resolution reference gaugings (with 31 and more verticals) are used to assess the uncertainty component through a statistical analysis. Instead of subsampling the verticals of these reference stream-gaugings purely at random, a subsampling method is developed that mimics the behavior of a hydrometric technician. A sampling quality index (SQI) is suggested and appears to be a more explanatory variable than the number of verticals. This index takes into account the spacing between verticals and the variation of unit flow between two verticals. To compute the ...
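
    A minimal sketch of the underlying velocity-area computation and of the subsampling experiment: a synthetic 31-vertical reference gauging is reduced to 7 verticals and the discharge error is examined. A uniform subsample stands in for FLAURE's technician-mimicking strategy, and the SQI is not reproduced.

        import numpy as np

        def midsection_discharge(positions, depths, velocities):
            """Mid-section velocity-area method: each vertical represents the panel
            extending halfway to its neighbours, so Q = sum(v_i * d_i * w_i)."""
            x = np.asarray(positions, dtype=float)
            edges = np.concatenate(([x[0]], 0.5 * (x[1:] + x[:-1]), [x[-1]]))
            widths = np.diff(edges)
            return float(np.sum(np.asarray(velocities) * np.asarray(depths) * widths))

        # Synthetic high-resolution reference gauging (31 verticals over 30 m).
        x = np.linspace(0.0, 30.0, 31)
        d = 2.0 * np.sin(np.pi * x / 30.0)           # depth profile, m
        v = 1.2 * np.sin(np.pi * x / 30.0) ** 0.5    # velocity profile, m/s
        q_ref = midsection_discharge(x, d, v)

        idx = np.round(np.linspace(0, 30, 7)).astype(int)   # subsample to 7 verticals
        q_sub = midsection_discharge(x[idx], d[idx], v[idx])
        print(f"Q_ref = {q_ref:.2f} m3/s, Q_sub = {q_sub:.2f} m3/s, "
              f"error = {100 * (q_sub / q_ref - 1):+.1f}%")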

  7. FIRM: Sampling-based feedback motion-planning under motion uncertainty and imperfect measurements

    KAUST Repository

    Agha-mohammadi, A.-a.

    2013-11-15

    In this paper we present the feedback-based information roadmap (FIRM), a multi-query approach for planning under uncertainty which is a belief-space variant of probabilistic roadmap methods. The crucial feature of FIRM is that the costs associated with the edges are independent of each other, and in this sense it is the first method that generates a graph in belief space that preserves the optimal substructure property. From a practical point of view, FIRM is a robust and reliable planning framework. It is robust since the solution is a feedback policy and there is no need for expensive replanning. It is reliable because accurate collision probabilities can be computed along the edges. In addition, FIRM is a scalable framework, where the complexity of planning with FIRM is a constant multiplier of the complexity of planning with PRM. In this paper, FIRM is introduced as an abstract framework. As a concrete instantiation of FIRM, we adopt stationary linear quadratic Gaussian (SLQG) controllers as belief stabilizers and introduce the so-called SLQG-FIRM. In SLQG-FIRM we focus on kinematic systems and then extend to dynamical systems by sampling in the equilibrium space. We investigate the performance of SLQG-FIRM in different scenarios. © The Author(s) 2013.

  9. Total Measurement Uncertainty (TMU) for Nondestructive Assay of Transuranic (TRU) Waste at the WRAP Facility

    Energy Technology Data Exchange (ETDEWEB)

    CANTALOUB, M.G.

    2000-10-20

    At the WRAP facility, there are two identical imaging passive/active neutron (IPAN) assay systems and two identical gamma energy assay (GEA) systems. Currently, only the GEA systems are used to characterize waste, therefore, only the GEA systems are addressed in this document. This document contains the limiting factors relating to the waste drum analysis for shipments destined for WIPP. The TMU document provides the uncertainty basis in the NDA analysis of waste containers at the WRAP facility. The defined limitations for the current analysis scheme are as follows: (1) The WRAP waste stream debris is from the Hanford Plutonium Finishing Plant's process lines, primarily combustible materials. (2) Plutonium analysis range is from the minimum detectable concentration (MDC), Reference 6, to 200 grams (g). (3) The GEA system calibration density ranges from 0.013 g/cc to 1.6 g/cc. (4) PDP Plutonium drum densities were evaluated from 0.065 g/cc to 0.305 g/cc. (5) PDP Plutonium source weights ranged from 0.030 g to 318 g, in both empty and combustibles matrix drums. (6) The GEA system design density correction mass absorption coefficient table (MAC) is Lucite, a material representative of combustible waste. (7) Drums with material not fitting the debris waste criteria are targeted for additional calculations, reviews, and potential re-analysis using a calibration suited for the waste type.

  10. Total Measurement Uncertainty (TMU) for Nondestructive Assay of Transuranic (TRU) Waste at the WRAP Facility

    Energy Technology Data Exchange (ETDEWEB)

    CANTALOUB, M.G.

    2000-05-22

    At the WRAP facility, there are two identical imaging passive/active neutron (IPAN) assay systems and two identical gamma energy assay (GEA) systems. Currently, only the GEA systems are used to characterize waste; therefore, only the GEA systems are addressed in this document. This document contains the limiting factors relating to the waste drum analysis for shipments destined for WIPP. The TMU document provides the uncertainty basis in the NDA analysis of waste containers at the WRAP facility. The defined limitations for the current analysis scheme are as follows: The WRAP waste stream debris is from the Hanford Plutonium Finishing Plant's process lines, primarily combustible materials. The plutonium analysis range is from the minimum detectable concentration (MDC), Reference 6, to 160 grams (g). The GEA system calibration density ranges from 0.013 g/cc to 1.6 g/cc. PDP plutonium drum densities were evaluated from 0.065 g/cc to 0.305 g/cc. PDP plutonium source weights ranged from 0.030 g to 318 g, in both empty and combustibles matrix drums. The GEA system design density correction mass absorption coefficient table (MAC) is Lucite, a material representative of combustible waste. Drums with material not fitting the debris waste criteria are targeted for additional calculations, reviews, and potential re-analysis using a calibration suited for the waste type.

  11. Effect of temperature on uncertainty of measurements in the PAS LT

    International Nuclear Information System (INIS)

    In this presentation the author deals with the influence of temperature on the measurement and analysis of PAS LT spectra (positron annihilation spectroscopy, lifetime technique) using the Lifetime 9 program. He also describes various improvements in the measurement accuracy and in the stability of the measuring apparatus, achieved through an innovative implementation of the apparatus in the 'fast-fast' configuration. Finally, the impact of each variable, e.g. temperature, on the spectra fitted with the Lifetime 9 program is assessed.

  12. Existence of biological uncertainty principle implies that we can never find 'THE' measure for biological complexity

    OpenAIRE

    Banerji, Anirban

    2009-01-01

    There are innumerable 'biological complexity measures'. While some patterns emerge from these attempts to represent biological complexity, a single measure to encompass the seemingly countless features of biological systems still eludes the students of Biology. It is the pursuit of this paper to discuss the feasibility of finding one complete and objective measure for biological complexity. A theoretical construct (the 'Thread-Mesh model') is proposed here to describe biological reality. It ...

  13. Standardization of the Definitions of Vertical Resolution and Uncertainty in the NDACC-archived Ozone and Temperature Lidar Measurements

    Science.gov (United States)

    Leblanc, T.; Godin-Beekmann, S.; Payen, G.; Gabarrot, F.; van Gijsel, A.; Bandoro, J.; Sica, R.; Trickl, T.

    2012-01-01

    The international Network for the Detection of Atmospheric Composition Change (NDACC) is a global network of high-quality, remote-sensing research stations for observing and understanding the physical and chemical state of the Earth's atmosphere. As part of NDACC, over 20 ground-based lidar instruments are dedicated to the long-term monitoring of atmospheric composition and to the validation of space-borne measurements of the atmosphere from environmental satellites such as Aura and ENVISAT. One caveat of large networks such as NDACC is the difficulty of archiving measurement and analysis information consistently from one research group (or instrument) to another [1][2][3]. Yet the need for consistent definitions has strengthened as datasets of various origins (e.g., satellite and ground-based) are increasingly used for intercomparisons and validation, and are ingested together in global assimilation systems. In the framework of the 2010 Call for Proposals by the International Space Science Institute (ISSI) located in Bern, Switzerland, a team of lidar experts was created to address existing issues in three critical aspects of the NDACC lidar ozone and temperature data retrievals: signal filtering and the vertical filtering of the retrieved profiles, the quantification and propagation of the uncertainties, and the consistent definition and reporting of filtering and uncertainties in the NDACC-archived products. Additional experts from the satellite and global data standards communities complement the team to help address issues specific to the latter aspect.

  14. A Novel Relevance Feedback Approach Based on Similarity Measure Modification in an X-Ray Image Retrieval System Based on Fuzzy Representation Using Fuzzy Attributed Relational Graph

    OpenAIRE

    Hossien Pourghassem; Hassan Ghasemian

    2011-01-01

    Relevance feedback approaches are used to improve the performance of content-based image retrieval systems. In this paper, a novel relevance feedback approach based on similarity measure modification in an X-ray image retrieval system based on fuzzy representation using a fuzzy attributed relational graph (FARG) is presented. In this approach, the optimum weight of each feature in the feature vector is calculated using the similarity rate between the query image and relevant and irrelevant images in user feedba...
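
    The abstract is cut off before the update rule is specified, so the sketch below substitutes a Rocchio-style heuristic for the paper's similarity-rate rule: feature weights are nudged up where the query agrees with relevant images and down where it agrees with irrelevant ones, then used in a weighted similarity.

        import numpy as np

        def reweight(weights, query, relevant, irrelevant, eta=0.1):
            """Heuristic stand-in for a feedback-driven weight update (not the paper's rule)."""
            agree_rel = 1.0 / (1.0 + np.abs(query - relevant.mean(axis=0)))
            agree_irr = 1.0 / (1.0 + np.abs(query - irrelevant.mean(axis=0)))
            w = weights * (1.0 + eta * (agree_rel - agree_irr))
            return w / w.sum()

        def weighted_similarity(w, a, b):
            """Weighted negative-distance similarity between two feature vectors."""
            return -np.sum(w * np.abs(a - b))

        rng = np.random.default_rng(1)
        q = rng.random(8)                                # query-image features
        rel = q + 0.05 * rng.standard_normal((5, 8))     # user-marked relevant images
        irr = rng.random((5, 8))                         # user-marked irrelevant images
        w = reweight(np.full(8, 1 / 8), q, rel, irr)
        print(np.round(w, 3), weighted_similarity(w, q, rel[0]))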

  15. Measurement uncertainty in pulmonary vascular input impedance and characteristic impedance estimated from pulsed-wave Doppler ultrasound and pressure: clinical studies on 57 pediatric patients

    International Nuclear Information System (INIS)

    Pulmonary vascular input impedance better characterizes right ventricular (RV) afterload and disease outcomes in pulmonary hypertension compared to the standard clinical diagnostic, pulmonary vascular resistance (PVR). Early efforts to measure impedance were not routine, involving open-chest measurement. Recently, the use of pulsed-wave (PW) Doppler-measured velocity to non-invasively estimate instantaneous flow has made impedance measurement more practical. One critical concern remains with clinical use: the measurement uncertainty, especially since previous studies only incorporated random error. This study utilized data from a large pediatric patient population to comprehensively examine the systematic and random error contributions to the total impedance uncertainty and determined the least error prone methodology to compute impedance from among four different methods. We found that the systematic error contributes greatly to the total uncertainty and that one of the four methods had significantly smaller propagated uncertainty; however, even when this best method is used, the uncertainty can be large for input impedance at high harmonics and for the characteristic impedance modulus. Finally, we found that uncertainty in impedance between normotensive and hypertensive patient groups displays no significant difference. It is concluded that clinical impedance measurement would be most improved by advancements in instrumentation, and the best computation method is proposed for future clinical use of the input impedance
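
    Input impedance is obtained harmonic by harmonic as the ratio of the pressure and flow Fourier coefficients, Z_k = P_k / Q_k; a minimal sketch with synthetic single-beat waveforms follows (illustrative amplitudes and units; the characteristic impedance is taken here as the mean of the higher-harmonic moduli, one common convention).

        import numpy as np

        def input_impedance(pressure, flow, n_harmonics=4):
            """Impedance moduli |P_k / Q_k| at the 0th..nth heart-rate harmonics, from
            simultaneous pressure and flow sampled over a whole number of beats."""
            P = np.fft.rfft(pressure)
            Q = np.fft.rfft(flow)
            return np.abs(P[:n_harmonics + 1] / Q[:n_harmonics + 1])

        # One synthetic beat at 1 kHz; several harmonics are included so the
        # higher-harmonic moduli are defined (not clinical data).
        fs, hr = 1000.0, 2.0
        t = np.arange(0.0, 1.0 / hr, 1.0 / fs)
        w = 2 * np.pi * hr * t
        flow = 60 + 40*np.sin(w) + 15*np.sin(2*w + 0.4) + 6*np.sin(3*w + 0.8) + 3*np.sin(4*w)
        pressure = 25 + 5*np.sin(w - 0.3) + 2*np.sin(2*w) + np.sin(3*w + 0.2) + 0.5*np.sin(4*w)

        z = input_impedance(pressure, flow)
        z_c = z[3:].mean()   # characteristic impedance from the higher harmonics
        print(f"Z0 = {z[0]:.3f}, Zc = {z_c:.3f} (mmHg*s/mL, illustrative)")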

  16. Systematic uncertainties in RF-based measurement of superconducting cavity quality factors

    Science.gov (United States)

    Holzbauer, J. P.; Pischalnikov, Yu.; Sergatskov, D. A.; Schappert, W.; Smith, S.

    2016-09-01

    Q0 determinations based on RF power measurements are subject to at least three potentially large systematic effects that have not been previously appreciated. Instrumental factors that can systematically bias RF based measurements of Q0 are quantified and steps that can be taken to improve the determination of Q0 are discussed.
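
    The determination in question rests on Q0 = omega * U / P_diss, with the dissipated power inferred from an RF power balance; the toy sketch below shows how a small calibration bias in one power meter propagates directly into Q0 (all numbers are illustrative).

        import numpy as np

        def q0_from_rf(f0, u_stored, p_fwd, p_refl, p_trans):
            """Q0 = omega * U / P_diss with P_diss = P_f - P_r - P_t; directional-coupler
            and power-meter calibrations enter through the three power readings."""
            return 2 * np.pi * f0 * u_stored / (p_fwd - p_refl - p_trans)

        f0 = 1.3e9   # Hz, e.g. a 1.3 GHz elliptical cavity (illustrative)
        u = 10.0     # J, stored energy from field-probe calibration

        print(f"Q0           = {q0_from_rf(f0, u, 40.0, 5.0, 0.1):.3e}")
        # A +3% bias on the forward-power meter shifts Q0 systematically:
        print(f"Q0 (P_f +3%) = {q0_from_rf(f0, u, 41.2, 5.0, 0.1):.3e}")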

  17. Measuring the Higgs boson mass using event-by-event uncertainties

    NARCIS (Netherlands)

    A. Castelli

    2015-01-01

    The thesis presents a measurement of the properties of the Higgs particle, performed using the data collected by the ATLAS experiment in 2011 and 2012. The measurement is performed using a three-dimensional model based on analytic functions to describe the signal produced by the Higgs boson de...

  18. The Uncertainty of Mass Discharge Measurements Using Pumping Methods Under Simplified Conditions

    Science.gov (United States)

    Mass discharge measurements at contaminated sites have been used to assist with site management decisions, and can be divided into two broad categories: point-scale measurement techniques and pumping methods. Pumping methods can be sub-divided based on the pumping procedures use...

  19. Total uncertainty of low velocity thermal anemometers for measurement of indoor air movements

    DEFF Research Database (Denmark)

    Jørgensen, F.; Popiolek, Z.; Melikov, Arsen Krikor;

    2004-01-01

    ... developed mathematical model of the anemometer in combination with a large database of representative room flows measured with a 3-D laser Doppler anemometer (LDA). A direct comparison between measurements with a thermal anemometer and a 3-D LDA in flows of varying velocity and turbulence intensity shows ...

  20. Uncertainties in measuring populations potentially impacted by sea level rise and coastal flooding.

    Science.gov (United States)

    Mondal, Pinki; Tatem, Andrew J

    2012-01-01

    A better understanding of the impact of global climate change requires information on the locations and characteristics of the populations affected. For instance, with global sea level predicted to rise and coastal flooding set to become more frequent and intense, high-resolution spatial population datasets are increasingly being used to estimate the size of vulnerable coastal populations. Many previous studies have undertaken this by quantifying the size of populations residing in low elevation coastal zones using one of the two global spatial population datasets available, LandScan and the Global Rural Urban Mapping Project (GRUMP). This has been done without consideration of the effects of this choice, which are a function of the quality of the input datasets and of differences in the methods used to construct each spatial population dataset. Here we calculate estimated low elevation coastal zone resident population sizes from LandScan and GRUMP using previously adopted approaches, and quantify the absolute and relative differences that arise from switching datasets. Our findings suggest that the choice of one particular dataset over another can translate into a difference of more than 7.5 million vulnerable people for countries with extensive coastal populations, such as Indonesia and Japan. Our findings also show that estimates of the proportion of national population at risk can differ by anywhere from <0.1% to 45% when switching between datasets, with large differences predominantly for countries where coarse and outdated input data were used in the construction of the spatial population datasets. The results highlight the need for spatial population datasets built on accurate, contemporary and detailed census data for use in climate change impact studies, and the importance of acknowledging the uncertainties inherent in existing spatial population datasets when estimating the demographic impacts of climate change. PMID:23110208