WorldWideScience

Sample records for rigorous measurement uncertainty

  1. A rigorous treatment of uncertainty quantification for Silicon damage metrics

    International Nuclear Information System (INIS)

    Griffin, P.

    2016-01-01

    This report summarizes the contributions made by Sandia National Laboratories in support of the International Atomic Energy Agency (IAEA) Nuclear Data Section (NDS) Technical Meeting (TM) on Nuclear Reaction Data and Uncertainties for Radiation Damage. This work focused on a rigorous treatment of the uncertainties affecting the characterization of the displacement damage seen in silicon semiconductors. (author)

  2. Uncertainty quantification in nanomechanical measurements using the atomic force microscope

    Science.gov (United States)

    Ryan Wagner; Robert Moon; Jon Pratt; Gordon Shaw; Arvind Raman

    2011-01-01

    Quantifying uncertainty in measured properties of nanomaterials is a prerequisite for the manufacture of reliable nanoengineered materials and products. Yet, rigorous uncertainty quantification (UQ) is rarely applied for material property measurements with the atomic force microscope (AFM), a widely used instrument that can measure properties at nanometer scale...

  3. Measurement uncertainty analysis techniques applied to PV performance measurements

    International Nuclear Information System (INIS)

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; demonstrates that quality assurance and quality control measures have been accomplished; and defines valid data as data having known and documented paths of origin (including theory), measurements, traceability to measurement standards, computations, and uncertainty analysis of results
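
    As a worked illustration of the interval described above (not part of the original abstract), a GUM-style result is usually reported as an expanded uncertainty around the measured value; the symbols below follow common GUM notation and are an assumption, not a quotation from the presentation.

    ```latex
    % y: measured value, u_c(y): combined standard uncertainty,
    % k: coverage factor (k \approx 2 gives roughly 95 % coverage)
    U = k\,u_c(y), \qquad y - U \;\le\; Y \;\le\; y + U
    ```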

  4. Measurement uncertainty analysis techniques applied to PV performance measurements

    Energy Technology Data Exchange (ETDEWEB)

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; demonstrates that quality assurance and quality control measures have been accomplished; and defines valid data as data having known and documented paths of origin (including theory), measurements, traceability to measurement standards, computations, and uncertainty analysis of results.

  5. Measurement uncertainty analysis techniques applied to PV performance measurements

    Energy Technology Data Exchange (ETDEWEB)

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; demonstrates that quality assurance and quality control measures have been accomplished; and defines valid data as data having known and documented paths of origin (including theory), measurements, traceability to measurement standards, computations, and uncertainty analysis of results.

  6. Uncertainty quantification in nanomechanical measurements using the atomic force microscope

    International Nuclear Information System (INIS)

    Wagner, Ryan; Raman, Arvind; Moon, Robert; Pratt, Jon; Shaw, Gordon

    2011-01-01

    Quantifying uncertainty in measured properties of nanomaterials is a prerequisite for the manufacture of reliable nanoengineered materials and products. Yet, rigorous uncertainty quantification (UQ) is rarely applied for material property measurements with the atomic force microscope (AFM), a widely used instrument that can measure properties at nanometer scale resolution of both inorganic and biological surfaces and nanomaterials. We present a framework to ascribe uncertainty to local nanomechanical properties of any nanoparticle or surface measured with the AFM by taking into account the main uncertainty sources inherent in such measurements. We demonstrate the framework by quantifying uncertainty in AFM-based measurements of the transverse elastic modulus of cellulose nanocrystals (CNCs), an abundant, plant-derived nanomaterial whose mechanical properties are comparable to Kevlar fibers. For a single, isolated CNC the transverse elastic modulus was found to have a mean of 8.1 GPa and a 95% confidence interval of 2.7–20 GPa. A key result is that multiple replicates of force–distance curves do not sample the important sources of uncertainty, which are systematic in nature. The dominant source of uncertainty is the nondimensional photodiode sensitivity calibration rather than the cantilever stiffness or Z-piezo calibrations. The results underscore the great need for, and open a path towards, quantifying and minimizing uncertainty in AFM-based material property measurements of nanoparticles, nanostructured surfaces, thin films, polymers and biomaterials.
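
    The following is a minimal sketch, not taken from the paper, of how independent uncertainty contributions of the kind named in the abstract (photodiode sensitivity, cantilever stiffness, Z-piezo calibration, repeatability) could be combined in quadrature into a relative combined uncertainty. The numerical values are illustrative assumptions, and the paper's asymmetric 2.7–20 GPa interval indicates that the actual analysis is not a simple Gaussian quadrature.

    ```python
    import math

    # Hypothetical relative standard uncertainties (illustrative values only,
    # not those reported by Wagner et al.)
    sources = {
        "photodiode_sensitivity": 0.20,   # dominant, largely systematic term
        "cantilever_stiffness": 0.05,
        "z_piezo_calibration": 0.03,
        "force_curve_repeatability": 0.02,
    }

    # Root-sum-of-squares combination, valid only if the sources are uncorrelated
    u_rel = math.sqrt(sum(u ** 2 for u in sources.values()))

    modulus = 8.1  # GPa, mean transverse modulus quoted in the abstract
    print(f"combined relative standard uncertainty: {u_rel:.2f}")
    print(f"E_T = {modulus:.1f} GPa +/- {2 * u_rel * modulus:.1f} GPa (k = 2)")
    ```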

  7. The uncertainties in estimating measurement uncertainties

    International Nuclear Information System (INIS)

    Clark, J.P.; Shull, A.H.

    1994-01-01

    All measurements include some error. Whether measurements are used for accountability, environmental programs or process support, they are of little value unless accompanied by an estimate of the measurement's uncertainty. This fact is often overlooked by the individuals who need measurements to make decisions. This paper discusses the concepts of measurement, measurement errors (accuracy or bias, and precision or random error), physical and error models, measurement control programs, examples of measurement uncertainty, and uncertainty as related to measurement quality. Measurements are comparisons of unknowns to knowns; they are estimates of some true value plus an uncertainty, and are no better than the standards to which they are compared. Direct comparisons of unknowns that match the composition of known standards will normally have small uncertainties. In the real world, measurements usually involve indirect comparisons of significantly different materials (e.g., measuring a physical property of a chemical element in a sample having a matrix that is significantly different from the calibration standards' matrix). Consequently, there are many sources of error involved in measurement processes that can affect the quality of a measurement and its associated uncertainty. How the uncertainty estimates are determined and what they mean is as important as the measurement itself. The process of calculating the uncertainty of a measurement itself has uncertainties that must be handled correctly. Examples of chemistry laboratory measurements are reviewed in this report and recommendations are made for improving measurement uncertainties

  8. Measurement Uncertainty

    Science.gov (United States)

    Koch, Michael

    Measurement uncertainty is one of the key issues in quality assurance. It has become increasingly important for analytical chemistry laboratories with accreditation to ISO/IEC 17025. The uncertainty of a measurement is the most important criterion for deciding whether a measurement result is fit for purpose. It also supports the decision of whether a specification limit is exceeded or not. Estimation of measurement uncertainty is often not trivial. Several strategies have been developed for this purpose and are briefly described in this chapter. In addition, the different possibilities for taking uncertainty into account in compliance assessment are explained.

  9. Uncertainty evaluation of thickness and warp of a silicon wafer measured by a spectrally resolved interferometer

    Science.gov (United States)

    Praba Drijarkara, Agustinus; Gergiso Gebrie, Tadesse; Lee, Jae Yong; Kang, Chu-Shik

    2018-06-01

    Evaluation of uncertainty of thickness and gravity-compensated warp of a silicon wafer measured by a spectrally resolved interferometer is presented. The evaluation is performed in a rigorous manner, by analysing the propagation of uncertainty from the input quantities through all the steps of measurement functions, in accordance with the ISO Guide to the Expression of Uncertainty in Measurement. In the evaluation, correlation between input quantities as well as uncertainty attributed to thermal effect, which were not included in earlier publications, are taken into account. The temperature dependence of the group refractive index of silicon was found to be nonlinear and varies widely within a wafer and also between different wafers. The uncertainty evaluation described here can be applied to other spectral interferometry applications based on similar principles.
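
    Since the abstract stresses propagation through the measurement functions and correlation between input quantities, here is a minimal, hedged sketch of the GUM law of propagation with a covariance matrix; the sensitivities and numbers are illustrative assumptions, not the wafer-thickness model of the paper.

    ```python
    import numpy as np

    def combined_standard_uncertainty(sensitivities, covariance):
        """GUM law of propagation: u_c^2 = c^T V c, where c holds the
        sensitivity coefficients (partial derivatives of the measurement
        function) and V is the covariance matrix of the input quantities
        (off-diagonal terms carry the correlations)."""
        c = np.asarray(sensitivities, dtype=float)
        V = np.asarray(covariance, dtype=float)
        return float(np.sqrt(c @ V @ c))

    # Illustrative two-input example: sensitivities c1, c2, standard
    # uncertainties u1, u2 and correlation coefficient r.
    c1, c2 = 1.0, -0.5
    u1, u2, r = 0.02, 0.01, 0.6
    V = [[u1 ** 2, r * u1 * u2],
         [r * u1 * u2, u2 ** 2]]
    print(combined_standard_uncertainty([c1, c2], V))
    ```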

  10. Uncertainty and measurement

    International Nuclear Information System (INIS)

    Landsberg, P.T.

    1990-01-01

    This paper explores how the quantum mechanics uncertainty relation can be considered to result from measurements. A distinction is drawn between the uncertainties obtained by scrutinising experiments and the standard deviation type of uncertainty definition used in quantum formalism. (UK)

  11. Uncertainty in the use of MAMA software to measure particle morphological parameters from SEM images

    Energy Technology Data Exchange (ETDEWEB)

    Schwartz, Daniel S. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Tandon, Lav [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-06-05

    The MAMA software package developed at LANL is designed to make morphological measurements on a wide variety of digital images of objects. At LANL, we have focused on using MAMA to measure scanning electron microscope (SEM) images of particles, as this is a critical part of our forensic analysis of interdicted radiologic materials. In order to successfully use MAMA to make such measurements, we must understand the level of uncertainty involved in the process, so that we can rigorously support our quantitative conclusions.

  12. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.
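
    As a hedged illustration of the Monte Carlo principle the book describes (the model and distributions below are assumptions, not an example from the book), a measurement can be simulated by drawing the inputs from their assumed distributions and reading the 95 percent interval directly off the simulated output.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Illustrative measurement model y = a / b with assumed input distributions
    a = rng.normal(10.0, 0.1, 200_000)     # Gaussian (Type A-like) input
    b = rng.uniform(1.98, 2.02, 200_000)   # rectangular (Type B-like) input
    y = a / b

    lo, hi = np.percentile(y, [2.5, 97.5])
    print(f"95 percent interval of measurement uncertainty: [{lo:.3f}, {hi:.3f}]")
    ```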

  13. Comparison of rigorous modelling of different structure profiles on photomasks for quantitative linewidth measurements by means of UV- or DUV-optical microscopy

    Science.gov (United States)

    Ehret, Gerd; Bodermann, Bernd; Woehler, Martin

    2007-06-01

    Optical microscopy is an important instrument for the dimensional characterisation or calibration of micro- and nanostructures, e.g. chrome structures on photomasks. In comparison to scanning electron microscopy (possible contamination of the sample) and atomic force microscopy (slow, risk of damage), optical microscopy is a fast and non-destructive metrology method. The precise quantitative determination of the linewidth from the microscope image is, however, only possible with knowledge of the geometry of the structures and its consideration in the optical modelling. We compared two different rigorous model approaches, the Rigorous Coupled Wave Analysis (RCWA) and the Finite Element Method (FEM), for modelling structures with different edge angles, linewidths, line-to-space ratios and polarisations. The RCWA method can represent inclined edge profiles only by a staircase approximation, leading to increased modelling errors of the RCWA method. Even today's sophisticated rigorous methods still show problems with TM polarisation; therefore both rigorous methods are compared in terms of their convergence for TE and TM polarisation. Beyond that, the influence of typical illumination wavelengths (365 nm, 248 nm and 193 nm) on the microscope images and their contribution to the measurement uncertainty budget is also discussed.

  14. Rigorous approximation of stationary measures and convergence to equilibrium for iterated function systems

    International Nuclear Information System (INIS)

    Galatolo, Stefano; Monge, Maurizio; Nisoli, Isaia

    2016-01-01

    We study the problem of the rigorous computation of the stationary measure and of the rate of convergence to equilibrium of an iterated function system described by a stochastic mixture of two or more dynamical systems that are either all uniformly expanding on the interval or all contracting. In the expanding case, the associated transfer operators satisfy a Lasota–Yorke inequality, and we show how to compute a rigorous approximation of the stationary measure in the L1 norm and an estimate for the rate of convergence. The rigorous computation requires a computer-aided proof of the contraction of the transfer operators for the maps, and we show that this property propagates to the transfer operators of the IFS. In the contracting case we perform a rigorous approximation of the stationary measure in the Wasserstein–Kantorovich distance, together with the rate of convergence, using the same functional analytic approach. We show that a finite computation can produce a realistic computation of all contraction rates for the whole parameter space. We conclude with a description of the implementation and numerical experiments. (paper)

  15. Impact of AMS-02 Measurements on Reducing GCR Model Uncertainties

    Science.gov (United States)

    Slaba, T. C.; O'Neill, P. M.; Golge, S.; Norbury, J. W.

    2015-01-01

    For vehicle design, shield optimization, mission planning, and astronaut risk assessment, the exposure from galactic cosmic rays (GCR) poses a significant and complex problem both in low Earth orbit and in deep space. To address this problem, various computational tools have been developed to quantify the exposure and risk in a wide range of scenarios. Generally, the tool used to describe the ambient GCR environment provides the input into subsequent computational tools and is therefore a critical component of end-to-end procedures. Over the past few years, several researchers have independently and very carefully compared some of the widely used GCR models to more rigorously characterize model differences and quantify uncertainties. All of the GCR models studied rely heavily on calibrating to available near-Earth measurements of GCR particle energy spectra, typically over restricted energy regions and short time periods. In this work, we first review recent sensitivity studies quantifying the ions and energies in the ambient GCR environment of greatest importance to exposure quantities behind shielding. Currently available measurements used to calibrate and validate GCR models are also summarized within this context. It is shown that the AMS-II measurements will fill a critically important gap in the measurement database. The emergence of AMS-II measurements also provides a unique opportunity to validate existing models against measurements that were not used to calibrate free parameters in the empirical descriptions. Discussion is given regarding rigorous approaches to implement the independent validation efforts, followed by recalibration of empirical parameters.

  16. Uncertainty quantification for nuclear density functional theory and information content of new measurements.

    Science.gov (United States)

    McDonnell, J D; Schunck, N; Higdon, D; Sarich, J; Wild, S M; Nazarewicz, W

    2015-03-27

    Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. The example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.
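
    The workflow in the abstract (sample parameters from a Bayesian posterior, evaluate an emulator in place of the expensive DFT solver, and summarize the spread of a predicted observable) can be caricatured as below. This is only a schematic assumption, with a placeholder posterior and a toy stand-in for the Gaussian process emulator, not the actual functional or Penning-trap data.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Placeholder posterior over two Skyrme-like parameters (illustrative only)
    posterior_samples = rng.multivariate_normal(
        mean=[0.16, -15.8],                      # e.g. saturation density, E/A
        cov=[[1e-6, 0.0], [0.0, 4e-2]],
        size=5_000,
    )

    def emulator(theta):
        """Toy stand-in for a Gaussian-process emulator of a computed
        observable (e.g. a binding energy); a real emulator is trained
        on a design of DFT runs."""
        rho, e_per_a = theta
        return -8.0 + 50.0 * (rho - 0.16) + 0.1 * (e_per_a + 15.8)

    predictions = np.array([emulator(t) for t in posterior_samples])
    print(f"predicted observable: {predictions.mean():.3f} "
          f"+/- {predictions.std(ddof=1):.3f} (theoretical statistical uncertainty)")
    ```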

  17. Measurement uncertainty: Friend or foe?

    Science.gov (United States)

    Infusino, Ilenia; Panteghini, Mauro

    2018-02-02

    The definition and enforcement of a reference measurement system, based on the implementation of metrological traceability of patients' results to higher order reference methods and materials, together with a clinically acceptable level of measurement uncertainty, are fundamental requirements to produce accurate and equivalent laboratory results. The uncertainty associated with each step of the traceability chain should be governed to obtain a final combined uncertainty on clinical samples fulfilling the requested performance specifications. It is important that end-users (i.e., clinical laboratories) can know and verify how in vitro diagnostics (IVD) manufacturers have implemented the traceability of their calibrators and estimated the corresponding uncertainty. However, full information about traceability and combined uncertainty of calibrators is currently very difficult to obtain. Laboratory professionals should investigate the need to reduce the uncertainty of the higher order metrological references and/or to increase the precision of commercial measuring systems. Accordingly, the measurement uncertainty should not be considered a parameter to be calculated by clinical laboratories just to fulfil the accreditation standards, but it must become a key quality indicator to describe both the performance of an IVD measuring system and the laboratory itself. Copyright © 2018 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  18. Uncertainty estimation of ultrasonic thickness measurement

    International Nuclear Information System (INIS)

    Yassir Yassen, Abdul Razak Daud; Mohammad Pauzi Ismail; Abdul Aziz Jemain

    2009-01-01

    The most important factor that should be taken into consideration when selecting an ultrasonic thickness measurement technique is its reliability. Only when the uncertainty of a measurement result is known can it be judged whether the result is adequate for its intended purpose. The objectives of this study are to model the ultrasonic thickness measurement function, to identify the most contributing input uncertainty components, and to estimate the uncertainty of the ultrasonic thickness measurement results. We assumed that five error sources contribute significantly to the final error: calibration velocity, transit time, zero offset, measurement repeatability and resolution. By applying the law of propagation of uncertainty to the model function, a combined uncertainty of the ultrasonic thickness measurement was obtained. In this study the model function of ultrasonic thickness measurement was derived. By using this model, the estimation of the uncertainty of the final output result was found to be reliable. It was also found that the most contributing input uncertainty components are calibration velocity, transit time linearity and zero offset. (author)
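
    The abstract does not reproduce the model function itself, so the sketch below assumes a common pulse-echo form, thickness d = v (t - t0) / 2 with calibration velocity v, transit time t and zero offset t0, and propagates illustrative standard uncertainties through it; the numbers are assumptions, not the paper's data.

    ```python
    import math

    def thickness(v, t, t0):
        """Assumed pulse-echo model: d = v * (t - t0) / 2."""
        return v * (t - t0) / 2.0

    # Illustrative values for steel (not taken from the paper)
    v, t, t0 = 5920.0, 3.4e-6, 0.05e-6    # m/s, s, s
    u_v, u_t, u_t0 = 15.0, 5e-9, 5e-9     # standard uncertainties

    # Sensitivity coefficients = partial derivatives of d
    c_v = (t - t0) / 2.0
    c_t = v / 2.0
    c_t0 = -v / 2.0

    # Law of propagation of uncertainty, assuming uncorrelated inputs
    u_d = math.sqrt((c_v * u_v) ** 2 + (c_t * u_t) ** 2 + (c_t0 * u_t0) ** 2)
    print(f"d = {thickness(v, t, t0) * 1e3:.3f} mm, u(d) = {u_d * 1e6:.1f} um")
    ```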

  19. Uncertainty quantification for nuclear density functional theory and information content of new measurements

    Energy Technology Data Exchange (ETDEWEB)

    McDonnell, J. D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Schunck, N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Higdon, D. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Sarich, J. [Argonne National Lab. (ANL), Argonne, IL (United States); Wild, S. M. [Argonne National Lab. (ANL), Argonne, IL (United States); Nazarewicz, W. [Michigan State Univ., East Lansing, MI (United States); Oak Ridge National Lab., Oak Ridge, TN (United States); Univ. of Warsaw, Warsaw (Poland)

    2015-03-24

    Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. As a result, the example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.

  20. Experimental evaluation of rigor mortis. VIII. Estimation of time since death by repeated measurements of the intensity of rigor mortis on rats.

    Science.gov (United States)

    Krompecher, T

    1994-10-21

    The development of the intensity of rigor mortis was monitored in nine groups of rats. The measurements were initiated after 2, 4, 5, 6, 8, 12, 15, 24, and 48 h post mortem (p.m.) and lasted 5-9 h, which ideally should correspond to the usual procedure after the discovery of a corpse. The experiments were carried out at an ambient temperature of 24 degrees C. Measurements initiated early after death resulted in curves with a rising portion, a plateau, and a descending slope. Delaying the initial measurement translated into shorter rising portions, and curves initiated 8 h p.m. or later consist of a plateau and/or a downward slope only. Three different phases were observed, suggesting simple rules that can help estimate the time since death: (1) if an increase in intensity was found, the initial measurements were conducted not later than 5 h p.m.; (2) if only a decrease in intensity was observed, the initial measurements were conducted not earlier than 7 h p.m.; and (3) at 24 h p.m., the resolution is complete, and no further changes in intensity should occur. Our results clearly demonstrate that repeated measurements of the intensity of rigor mortis allow a more accurate estimation of the time since death of the experimental animals than the single measurement method used earlier. A critical review of the literature on the estimation of time since death on the basis of objective measurements of the intensity of rigor mortis is also presented.

  1. Rigorous Statistical Bounds in Uncertainty Quantification for One-Layer Turbulent Geophysical Flows

    Science.gov (United States)

    Qi, Di; Majda, Andrew J.

    2018-04-01

    Statistical bounds controlling the total fluctuations in mean and variance about a basic steady-state solution are developed for the truncated barotropic flow over topography. Statistical ensemble prediction is an important topic in weather and climate research. Here, the evolution of an ensemble of trajectories is considered using statistical instability analysis and is compared and contrasted with the classical deterministic instability for the growth of perturbations in one pointwise trajectory. The maximum growth of the total statistics in fluctuations is derived relying on the statistical conservation principle of the pseudo-energy. The saturation bound of the statistical mean fluctuation and variance in the unstable regimes with non-positive-definite pseudo-energy is achieved by linking with a class of stable reference states and minimizing the stable statistical energy. Two cases with dependence on initial statistical uncertainty and on external forcing and dissipation are compared and unified under a consistent statistical stability framework. The flow structures and statistical stability bounds are illustrated and verified by numerical simulations among a wide range of dynamical regimes, where subtle transient statistical instability exists in general with positive short-time exponential growth in the covariance even when the pseudo-energy is positive-definite. Among the various scenarios in this paper, there exist strong forward and backward energy exchanges between different scales which are estimated by the rigorous statistical bounds.

  2. A new uncertainty importance measure

    International Nuclear Information System (INIS)

    Borgonovo, E.

    2007-01-01

    Uncertainty in parameters is present in many risk assessment problems and leads to uncertainty in model predictions. In this work, we introduce a global sensitivity indicator which looks at the influence of input uncertainty on the entire output distribution without reference to a specific moment of the output (moment independence) and which can be defined also in the presence of correlations among the parameters. We discuss its mathematical properties and highlight the differences between the present indicator, variance-based uncertainty importance measures and a moment independent sensitivity indicator previously introduced in the literature. Numerical results are discussed with application to the probabilistic risk assessment model on which Iman [A matrix-based approach to uncertainty and sensitivity analysis for fault trees. Risk Anal 1987;7(1):22-33] first introduced uncertainty importance measures
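
    For orientation, the moment-independent indicator referred to in the abstract is commonly written in the literature as follows (our notation, to be checked against the paper itself):

    ```latex
    % delta_i: expected shift between the unconditional output density f_Y and
    % the density of Y conditional on fixing the input X_i
    \delta_i = \tfrac{1}{2}\,\mathbb{E}_{X_i}\!\left[\int \bigl|f_Y(y) - f_{Y\mid X_i}(y)\bigr|\,\mathrm{d}y\right]
    ```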

  3. Uncertainty Analysis for Oil-Film Interferometry Skin-Friction Measurement Techniques

    Science.gov (United States)

    Naughton, Jonathan W.; Brown, James L.; Merriam, Marshal (Technical Monitor)

    1996-01-01

    Over the past 20 years, the use of oil-film interferometry to measure the skin friction coefficient (C(sub f) = tau/q where tau is the surface shear stress and q is the dynamic pressure) has increased. Different forms of this oil-film technique with various levels of accuracy and ease of use have been successfully applied in a wide range of flows. The method's popularity is growing due to its relative ease of implementation and minimal intrusiveness as well as an increased demand for C(sub f) measurements. Nonetheless, the accuracy of these methods has not been rigorously addressed to date. Most researchers have simply shown that the skin-friction measurements made using these techniques compare favorably with other measurements and theory, most of which are only accurate to within 5-20%. The use of skin-friction data in the design of commercial aircraft, whose drag at cruise is 50% skin-friction drag, and in the validation of computational fluid dynamics programs warrants better uncertainty estimates. Additional information is contained in the original extended abstract.

  4. Propagation of dynamic measurement uncertainty

    International Nuclear Information System (INIS)

    Hessling, J P

    2011-01-01

    The time-dependent measurement uncertainty has been evaluated in a number of recent publications, starting from a known uncertain dynamic model. This could be defined as the 'downward' propagation of uncertainty from the model to the targeted measurement. The propagation of uncertainty 'upward' from the calibration experiment to a dynamic model traditionally belongs to system identification. The use of different representations (time, frequency, etc) is ubiquitous in dynamic measurement analyses. An expression of uncertainty in dynamic measurements is formulated for the first time in this paper independent of representation, joining upward as well as downward propagation. For applications in metrology, the high quality of the characterization may be prohibitive for any reasonably large and robust model to pass the whiteness test. This test is therefore relaxed by not directly requiring small systematic model errors in comparison to the randomness of the characterization. Instead, the systematic error of the dynamic model is propagated to the uncertainty of the measurand, analogously but differently to how stochastic contributions are propagated. The pass criterion of the model is thereby transferred from the identification to acceptance of the total accumulated uncertainty of the measurand. This increases the relevance of the test of the model as it relates to its final use rather than the quality of the calibration. The propagation of uncertainty hence includes the propagation of systematic model errors. For illustration, the 'upward' propagation of uncertainty is applied to determine if an appliance box is damaged in an earthquake experiment. In this case, relaxation of the whiteness test was required to reach a conclusive result

  5. Ruminations on NDA Measurement Uncertainty Compared to DA Uncertainty

    International Nuclear Information System (INIS)

    Salaymeh, S.; Ashley, W.; Jeffcoat, R.

    2010-01-01

    It is difficult to overestimate the importance that physical measurements performed with nondestructive assay instruments play throughout the nuclear fuel cycle. They underpin decision making in many areas and support criticality safety, radiation protection, process control, safeguards, facility compliance, and waste measurements. No physical measurement is complete, or indeed meaningful, without a defensible and appropriate accompanying statement of uncertainties and how they combine to define the confidence in the results. The uncertainty budget should also be broken down in sufficient detail suitable for the subsequent uses to which the nondestructive assay (NDA) results will be applied. Creating an uncertainty budget and estimating the total measurement uncertainty can often be an involved process, especially for non-routine situations. This is because data interpretation often involves complex algorithms and logic combined in a highly intertwined way. The methods often call on a multitude of input data subject to human oversight. These characteristics can be confusing and pose a barrier to developing an understanding between experts and data consumers. ASTM subcommittee C26-10 recognized this problem in the context of how to summarize and express precision and bias performance across the range of standards and guides it maintains. In order to create a unified approach consistent with modern practice and embracing the continuous improvement philosophy, a consensus arose to prepare a procedure covering the estimation and reporting of uncertainties in nondestructive assay of nuclear materials. This paper outlines the needs analysis, objectives and ongoing development efforts. In addition to emphasizing some of the unique challenges and opportunities facing the NDA community, we hope this article will encourage dialog and sharing of best practice and furthermore motivate developers to revisit the treatment of measurement uncertainty.

  6. RUMINATIONS ON NDA MEASUREMENT UNCERTAINTY COMPARED TO DA UNCERTAINTY

    Energy Technology Data Exchange (ETDEWEB)

    Salaymeh, S.; Ashley, W.; Jeffcoat, R.

    2010-06-17

    It is difficult to overestimate the importance that physical measurements performed with nondestructive assay instruments play throughout the nuclear fuel cycle. They underpin decision making in many areas and support criticality safety, radiation protection, process control, safeguards, facility compliance, and waste measurements. No physical measurement is complete, or indeed meaningful, without a defensible and appropriate accompanying statement of uncertainties and how they combine to define the confidence in the results. The uncertainty budget should also be broken down in sufficient detail suitable for the subsequent uses to which the nondestructive assay (NDA) results will be applied. Creating an uncertainty budget and estimating the total measurement uncertainty can often be an involved process, especially for non-routine situations. This is because data interpretation often involves complex algorithms and logic combined in a highly intertwined way. The methods often call on a multitude of input data subject to human oversight. These characteristics can be confusing and pose a barrier to developing an understanding between experts and data consumers. ASTM subcommittee C26-10 recognized this problem in the context of how to summarize and express precision and bias performance across the range of standards and guides it maintains. In order to create a unified approach consistent with modern practice and embracing the continuous improvement philosophy, a consensus arose to prepare a procedure covering the estimation and reporting of uncertainties in nondestructive assay of nuclear materials. This paper outlines the needs analysis, objectives and ongoing development efforts. In addition to emphasizing some of the unique challenges and opportunities facing the NDA community, we hope this article will encourage dialog and sharing of best practice and furthermore motivate developers to revisit the treatment of measurement uncertainty.

  7. The Uncertainty of Measurement Results

    Energy Technology Data Exchange (ETDEWEB)

    Ambrus, A. [Hungarian Food Safety Office, Budapest (Hungary)

    2009-07-15

    Factors affecting the uncertainty of measurement are explained, basic statistical formulae given, and the theoretical concept explained in the context of pesticide formulation analysis. Practical guidance is provided on how to determine individual uncertainty components within an analytical procedure. An extended and comprehensive table containing the relevant mathematical/statistical expressions elucidates the relevant underlying principles. Appendix I provides a practical elaborated example on measurement uncertainty estimation, above all utilizing experimental repeatability and reproducibility laboratory data. (author)
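
    As a hedged reminder of the kind of basic formula the abstract alludes to (notation ours), the Type A standard uncertainty obtained from n repeated observations is:

    ```latex
    % sample standard deviation s and standard uncertainty of the mean
    s = \sqrt{\frac{1}{n-1}\sum_{i=1}^{n}\bigl(x_i - \bar{x}\bigr)^2},
    \qquad u(\bar{x}) = \frac{s}{\sqrt{n}}
    ```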

  8. Managing Measurement Uncertainty in Building Acoustics

    Directory of Open Access Journals (Sweden)

    Chiara Scrosati

    2015-12-01

    In general, uncertainties should preferably be determined following the principles laid down in ISO/IEC Guide 98-3, the Guide to the expression of uncertainty in measurement (GUM:1995). According to current knowledge, it seems impossible to formulate these models for the different quantities in building acoustics. Therefore, the concepts of repeatability and reproducibility are necessary to determine the uncertainty of building acoustics measurements. This study shows the uncertainty of field measurements of a lightweight wall, a heavyweight floor, a façade with a single glazing window and a façade with a double glazing window that were analyzed by a Round Robin Test (RRT), conducted in a full-scale experimental building at ITC-CNR (Construction Technologies Institute of the National Research Council of Italy). The single number quantities and their uncertainties were evaluated in both the narrow and the enlarged range, and it was shown that including or excluding the low frequencies leads to very significant differences, except in the case of the sound insulation of façades with a single glazing window. The results obtained in these RRTs were compared with other results from the literature, which confirm the increase of the uncertainty of single number quantities due to the low-frequency extension. Having stated the measurement uncertainty for a single measurement in building acoustics, it is also very important to deal with sampling for the purposes of classification of buildings or building units. Therefore, this study also shows an application of the sampling included in the Italian Standard on the acoustic classification of building units to a serial-type building consisting of 47 building units. It was found that the greatest variability is observed in the façade, and it depends both on the great variability of window typologies and on workmanship. Finally, it is suggested how to manage the uncertainty in building acoustics, both for one single

  9. Measurement uncertainty in broadband radiofrequency radiation level measurements

    Directory of Open Access Journals (Sweden)

    Vulević Branislav D.

    2014-01-01

    For the evaluation of measurement uncertainty in the measurement of broadband radiofrequency radiation, this paper proposes a new approach based on the authors' experience with measurements of radiofrequency electric field levels conducted in residential areas of Belgrade and in over 35 municipalities in Serbia. The main objective of the paper is to present practical solutions for the evaluation of broadband measurement uncertainty for in-situ RF radiation levels. [Project of the Ministry of Science of the Republic of Serbia, No. III43009]

  10. Measurement Errors and Uncertainties Theory and Practice

    CERN Document Server

    Rabinovich, Semyon G

    2006-01-01

    Measurement Errors and Uncertainties addresses the most important problems that physicists and engineers encounter when estimating errors and uncertainty. Building from the fundamentals of measurement theory, the author develops the theory of accuracy of measurements and offers a wealth of practical recommendations and examples of applications. This new edition covers a wide range of subjects, including: - Basic concepts of metrology - Measuring instruments characterization, standardization and calibration - Estimation of errors and uncertainty of single and multiple measurements - Modern probability-based methods of estimating measurement uncertainty. With this new edition, the author completes the development of the new theory of indirect measurements. This theory provides more accurate and efficient methods for processing indirect measurement data. It eliminates the need to calculate the correlation coefficient - a stumbling block in measurement data processing - and offers for the first time a way to obtain...

  11. Measuring uncertainty within the theory of evidence

    CERN Document Server

    Salicone, Simona

    2018-01-01

    This monograph considers the evaluation and expression of measurement uncertainty within the mathematical framework of the Theory of Evidence. With a new perspective on the metrology science, the text paves the way for innovative applications in a wide range of areas. Building on Simona Salicone’s Measurement Uncertainty: An Approach via the Mathematical Theory of Evidence, the material covers further developments of the Random Fuzzy Variable (RFV) approach to uncertainty and provides a more robust mathematical and metrological background to the combination of measurement results that leads to a more effective RFV combination method. While the first part of the book introduces measurement uncertainty, the Theory of Evidence, and fuzzy sets, the following parts bring together these concepts and derive an effective methodology for the evaluation and expression of measurement uncertainty. A supplementary downloadable program allows the readers to interact with the proposed approach by generating and combining ...

  12. The state of the art of the impact of sampling uncertainty on measurement uncertainty

    Science.gov (United States)

    Leite, V. J.; Oliveira, E. C.

    2018-03-01

    Measurement uncertainty is a parameter that characterizes the reliability of a result and can be divided into two large groups: sampling and analytical variations. Analytical uncertainty arises from a controlled process performed in the laboratory. The same does not apply to sampling uncertainty, which, because it faces several obstacles and there is no clarity on how to perform the procedures, has been neglected, although it is admittedly indispensable to the measurement process. This paper aims to describe the state of the art of sampling uncertainty and to assess its relevance to measurement uncertainty.
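
    The split described above is usually expressed by adding the two contributions in quadrature (our notation, assumed rather than quoted from the paper):

    ```latex
    % total measurement uncertainty from sampling and analytical components
    u_{\text{meas}}^{2} = u_{\text{sampling}}^{2} + u_{\text{analytical}}^{2}
    ```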

  13. Determination of Formula for Vickers Hardness Measurements Uncertainty

    International Nuclear Information System (INIS)

    Purba, Asli

    2007-01-01

    The purpose of this work is to obtain a formula for the uncertainty of Vickers hardness measurements. The approach to determining the formula comprises: identification of the influencing parameters, creation of a cause-and-effect diagram, determination of sensitivities, determination of standard uncertainties, and determination of the formula for the Vickers hardness measurement uncertainty. The result is a formula for determining the uncertainty of Vickers hardness measurements. (author)
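
    The abstract does not state the resulting formula, so the sketch below assumes the standard Vickers definition HV = 0.1891 F / d^2 (force F in newtons, mean indentation diagonal d in millimetres) and propagates illustrative force and diagonal uncertainties; treat it as a generic example, not the author's result.

    ```python
    import math

    def vickers_hv(force_n, diag_mm):
        """Standard Vickers definition, HV = 0.1891 * F / d^2 (assumed here)."""
        return 0.1891 * force_n / diag_mm ** 2

    # Illustrative HV30-type measurement and standard uncertainties
    F, d = 294.2, 0.430        # N, mm
    u_F, u_d = 0.3, 0.002      # N, mm

    hv = vickers_hv(F, d)
    # Relative combination: (u_HV/HV)^2 = (u_F/F)^2 + (2 u_d/d)^2
    u_hv = hv * math.sqrt((u_F / F) ** 2 + (2 * u_d / d) ** 2)
    print(f"HV = {hv:.0f} +/- {2 * u_hv:.0f} (k = 2)")
    ```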

  14. Experimental evaluation of rigor mortis. VI. Effect of various causes of death on the evolution of rigor mortis.

    Science.gov (United States)

    Krompecher, T; Bergerioux, C; Brandt-Casadevall, C; Gujer, H R

    1983-07-01

    The evolution of rigor mortis was studied in cases of nitrogen asphyxia, drowning and strangulation, as well as in fatal intoxications due to strychnine, carbon monoxide and curariform drugs, using a modified method of measurement. Our experiments demonstrated that: (1) Strychnine intoxication hastens the onset and passing of rigor mortis. (2) CO intoxication delays the resolution of rigor mortis. (3) The intensity of rigor may vary depending upon the cause of death. (4) If the stage of rigidity is to be used to estimate the time of death, it is necessary: (a) to perform a succession of objective measurements of rigor mortis intensity; and (b) to verify the eventual presence of factors that could play a role in the modification of its development.

  15. [A new formula for the measurement of rigor mortis: the determination of the FRR-index (author's transl)].

    Science.gov (United States)

    Forster, B; Ropohl, D; Raule, P

    1977-07-05

    The manual examination of rigor mortis as currently used, with its often subjective evaluation, frequently produces highly incorrect deductions. It is therefore desirable that such inaccuracies be replaced by the objective measurement of rigor mortis at the extremities. To that purpose, a method is described which can also be applied in on-the-spot investigations, and a new formula for the determination of the rigor mortis index (FRR) is introduced.

  16. THE UNCERTAINTIES OF ENVIRONMENT'S PARAMETERS MEASUREMENTS AS TOOLS OF THE MEASUREMENTS QUALITY IMPROVEMENT

    Directory of Open Access Journals (Sweden)

    Miroslav Badida

    2008-06-01

    Identification of the uncertainties of noise measurements accompanying declared measured values is unconditionally necessary and required by legislation. The uncertainty of a measurement expresses all errors that accrue during the measuring. By indicating uncertainties, the measurer documents that the objective value is, with a certain probability, found in the interval that is bounded by the measurement uncertainty. The paper deals with the methodology of uncertainty calculation for noise measurements in living and working environments in the metal processing industry and the building materials industry.

  17. A statistical approach to quantify uncertainty in carbon monoxide measurements at the Izaña global GAW station: 2008-2011

    Science.gov (United States)

    Gomez-Pelaez, A. J.; Ramos, R.; Gomez-Trueba, V.; Novelli, P. C.; Campo-Hernandez, R.

    2013-03-01

    Atmospheric CO in situ measurements are carried out at the Izaña (Tenerife) global GAW (Global Atmosphere Watch Programme of the World Meteorological Organization - WMO) mountain station using a Reduction Gas Analyser (RGA). In situ measurements at Izaña are representative of the subtropical Northeast Atlantic free troposphere, especially during nighttime. We present the measurement system configuration, the response function, the calibration scheme, the data processing, the Izaña 2008-2011 CO nocturnal time series, and the mean diurnal cycle by months. We have developed a rigorous uncertainty analysis for carbon monoxide measurements carried out at the Izaña station, which could be applied to other GAW stations. We determine the combined standard measurement uncertainty taking into consideration four contributing components: uncertainty of the WMO standard gases interpolated over the range of measurement, the uncertainty that takes into account the agreement between the standard gases and the response function used, the uncertainty due to the repeatability of the injections, and the propagated uncertainty related to the temporal consistency of the response function parameters (which also takes into account the covariance between the parameters). The mean value of the combined standard uncertainty decreased significantly after March 2009, from 2.37 nmol mol⁻¹ to 1.66 nmol mol⁻¹, due to improvements in the measurement system. A fifth type of uncertainty we call representation uncertainty is considered when some of the data necessary to compute the temporal mean are absent. Any computed mean has also a propagated uncertainty arising from the uncertainties of the data used to compute the mean. The law of propagation depends on the type of uncertainty component (random or systematic). In situ hourly means are compared with simultaneous and collocated NOAA flask samples. The uncertainty of the differences is computed and used to determine whether the differences are
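
    A minimal sketch of how the four contributions named above combine into the reported standard uncertainty is given below; the component magnitudes are invented for illustration (only the structure follows the abstract), and the covariance handling inside the fourth term is not reproduced.

    ```python
    import math

    # Illustrative component magnitudes in nmol/mol (not the paper's values)
    u_scale = 0.9    # WMO standards interpolated over the measurement range
    u_fit = 0.7      # agreement between standards and the response function
    u_repeat = 0.8   # repeatability of the injections
    u_drift = 0.6    # temporal consistency of response-function parameters

    # Quadrature sum, assuming the four components are independent
    u_combined = math.sqrt(u_scale ** 2 + u_fit ** 2 + u_repeat ** 2 + u_drift ** 2)
    print(f"combined standard uncertainty ~ {u_combined:.2f} nmol/mol")
    ```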

  18. Evaluation of physical dimension changes as nondestructive measurements for monitoring rigor mortis development in broiler muscles.

    Science.gov (United States)

    Cavitt, L C; Sams, A R

    2003-07-01

    Studies were conducted to develop a non-destructive method for monitoring the rate of rigor mortis development in poultry and to evaluate the effectiveness of electrical stimulation (ES). In the first study, 36 male broilers in each of two trials were processed at 7 wk of age. After being bled, half of the birds received electrical stimulation (400 to 450 V, 400 to 450 mA, for seven pulses of 2 s on and 1 s off), and the other half were designated as controls. At 0.25 and 1.5 h postmortem (PM), carcasses were evaluated for the angles of the shoulder, elbow, and wing tip and the distance between the elbows. Breast fillets were harvested at 1.5 h PM (after chilling) from all carcasses. Fillet samples were excised and frozen for later measurement of pH and R-value, and the remainder of each fillet was held on ice until 24 h postmortem. Shear value and pH means were significantly lower, but R-value means were higher (P < 0.05), indicating acceleration of rigor mortis by ES. The physical dimensions of the shoulder and elbow changed (P < 0.05) with rigor mortis development and with ES. These results indicate that physical measurements of the wings may be useful as a nondestructive indicator of rigor development and for monitoring the effectiveness of ES. In the second study, 60 male broilers in each of two trials were processed at 7 wk of age. At 0.25, 1.5, 3.0, and 6.0 h PM, carcasses were evaluated for the distance between the elbows. At each time point, breast fillets were harvested from each carcass. Fillet samples were excised and frozen for later measurement of pH and sarcomere length, whereas the remainder of each fillet was held on ice until 24 h PM. Shear value and pH means decreased (P < 0.05) with rigor mortis development. Elbow distance decreased (P < 0.05) with rigor development and was correlated (P < 0.05) with rigor mortis development in broiler carcasses.

  19. Experimental evaluation of rigor mortis. V. Effect of various temperatures on the evolution of rigor mortis.

    Science.gov (United States)

    Krompecher, T

    1981-01-01

    Objective measurements were carried out to study the evolution of rigor mortis on rats at various temperatures. Our experiments showed that: (1) at 6 degrees C rigor mortis reaches full development between 48 and 60 hours post mortem, and is resolved at 168 hours post mortem; (2) at 24 degrees C rigor mortis reaches full development at 5 hours post mortem, and is resolved at 16 hours post mortem; (3) at 37 degrees C rigor mortis reaches full development at 3 hours post mortem, and is resolved at 6 hours post mortem; (4) the intensity of rigor mortis grows with increase in temperature (difference between values obtained at 24 degrees C and 37 degrees C); and (5) at 6 degrees C a "cold rigidity" was found, in addition to and independent of rigor mortis.

  20. Uncertainty In Measuring Noise Parameters Of a Communication Receiver

    International Nuclear Information System (INIS)

    Korcz, Karol; Palczynska, Beata; Spiralski, Ludwik

    2005-01-01

    The paper presents a method for assessing uncertainty in measuring the usable sensitivity Es of a communication receiver. The influence of the partial uncertainties of measuring the noise factor F and the energy pass band of the receiver Δf on the combined standard uncertainty level is analyzed. A method to assess the uncertainty in measuring the noise factor on the basis of the systematic component of uncertainty, assuming that the main source of measurement uncertainty is the hardware of the measuring system, is proposed. The assessment of uncertainty in measuring the pass band of the receiver is determined with the assumption that the input quantities of the measurement equation are not correlated. They are successive, discrete values of the spectral power density of the noise at the output of the receiver. The results of the analyses of the particular uncertainty components of measuring the sensitivity, which were carried out for a typical communication receiver, are presented
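
    The abstract does not give the measurement equation for Es, so as a hedged point of reference only, a common textbook relation linking receiver sensitivity to the noise factor F and noise bandwidth Δf is the minimum detectable signal power:

    ```latex
    % k_B: Boltzmann constant, T_0: reference temperature (290 K),
    % SNR_min: required output signal-to-noise ratio
    % (assumed relation, not quoted from the paper)
    P_{\min} = k_B T_0\,\Delta f\;F\;\mathrm{SNR}_{\min}
    ```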

  1. Evaluation of Uncertainties in the Determination of Phosphorus by RNAA

    International Nuclear Information System (INIS)

    Rick L. Paul

    2000-01-01

    A radiochemical neutron activation analysis (RNAA) procedure for the determination of phosphorus in metals and other materials has been developed and critically evaluated. Uncertainties evaluated as type A include those arising from measurement replication, yield determination, neutron self-shielding, irradiation geometry, measurement of the quantity for concentration normalization (sample mass, area, etc.), and analysis of standards. Uncertainties evaluated as type B include those arising from beta contamination corrections, beta decay curve fitting, and beta self-absorption corrections. The evaluation of uncertainties in the determination of phosphorus is illustrated for three different materials in Table I. The metal standard reference materials (SRMs) 2175 and 861 were analyzed for value assignment of phosphorus; implanted silicon was analyzed to evaluate the technique for certification of phosphorus. The most significant difference in the error evaluation of the three materials lies in the type B uncertainties. The relatively uncomplicated matrix of the high-purity silicon allows virtually complete purification of phosphorus from other beta emitters; hence, minimal contamination correction is needed. Furthermore, because the chemistry is less rigorous, the carrier yield is more reproducible, and self-absorption corrections are less significant. Improvements in the chemical purification procedures for phosphorus in complex matrices will decrease the type B uncertainties for all samples. Uncertainties in the determination of carrier yield, the most significant type A error in the analysis of the silicon, also need to be evaluated more rigorously and minimized in the future

  2. Uncertainty in Measurement: Procedures for Determining Uncertainty With Application to Clinical Laboratory Calculations.

    Science.gov (United States)

    Frenkel, Robert B; Farrance, Ian

    2018-01-01

    The "Guide to the Expression of Uncertainty in Measurement" (GUM) is the foundational document of metrology. Its recommendations apply to all areas of metrology including metrology associated with the biomedical sciences. When the output of a measurement process depends on the measurement of several inputs through a measurement equation or functional relationship, the propagation of uncertainties in the inputs to the uncertainty in the output demands a level of understanding of the differential calculus. This review is intended as an elementary guide to the differential calculus and its application to uncertainty in measurement. The review is in two parts. In Part I, Section 3, we consider the case of a single input and introduce the concepts of error and uncertainty. Next we discuss, in the following sections in Part I, such notions as derivatives and differentials, and the sensitivity of an output to errors in the input. The derivatives of functions are obtained using very elementary mathematics. The overall purpose of this review, here in Part I and subsequently in Part II, is to present the differential calculus for those in the medical sciences who wish to gain a quick but accurate understanding of the propagation of uncertainties. © 2018 Elsevier Inc. All rights reserved.

  3. Evaluating measurement uncertainty in fluid phase equilibrium calculations

    Science.gov (United States)

    van der Veen, Adriaan M. H.

    2018-04-01

    The evaluation of measurement uncertainty in accordance with the ‘Guide to the expression of uncertainty in measurement’ (GUM) has not yet become widespread in physical chemistry. With only the law of the propagation of uncertainty from the GUM, many of these uncertainty evaluations would be cumbersome, as models are often non-linear and require iterative calculations. The methods from GUM supplements 1 and 2 enable the propagation of uncertainties under most circumstances. Experimental data in physical chemistry are used, for example, to derive reference property data and support trade—all applications where measurement uncertainty plays an important role. This paper aims to outline how the methods for evaluating and propagating uncertainty can be applied to some specific cases with a wide impact: deriving reference data from vapour pressure data, a flash calculation, and the use of an equation-of-state to predict the properties of both phases in a vapour-liquid equilibrium. The three uncertainty evaluations demonstrate that the methods of GUM and its supplements are a versatile toolbox that enable us to evaluate the measurement uncertainty of physical chemical measurements, including the derivation of reference data, such as the equilibrium thermodynamical properties of fluids.
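    A minimal sketch of the GUM Supplement 1 (Monte Carlo) propagation referred to above, applied to a simple vapour-pressure model rather than the paper's actual flash calculation; the Antoine constants are the common values for water (mmHg, °C) and the standard uncertainties are hypothetical.

```python
import numpy as np

# GUM Supplement 1 style propagation: sample the inputs, push each draw through
# the nonlinear model, and summarize the output distribution.
rng = np.random.default_rng(1)
N = 200_000

A = rng.normal(8.07131, 0.01, N)     # Antoine coefficients for water (mmHg, deg C)
B = rng.normal(1730.63, 2.0, N)      # with hypothetical standard uncertainties
C = rng.normal(233.426, 0.5, N)
T = rng.normal(100.0, 0.05, N)       # measured temperature and its uncertainty

p = 10.0 ** (A - B / (C + T))        # vapour pressure in mmHg for each draw

print(f"p = {p.mean():.1f} mmHg, u(p) = {p.std(ddof=1):.1f} mmHg")
print("95 % coverage interval:", np.percentile(p, [2.5, 97.5]))
```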

  4. Radon measurements: the sources of uncertainties

    International Nuclear Information System (INIS)

    Zhukovsky, Michael; Onischenko, Alexandra; Bastrikov, Vladislav

    2008-01-01

    Full text: Radon measurement is a rather complicated process and the correct estimation of uncertainties is very important. The sources of uncertainty for grab sampling, short-term measurements (charcoal canisters), long-term measurements (track detectors) and retrospective measurements (surface traps) are analyzed. The main sources of uncertainty for grab sampling measurements are: systematic bias of the reference equipment; random Poisson and non-Poisson errors during calibration; and random Poisson and non-Poisson errors during measurement. These sources are also common to short-term measurements (charcoal canisters) and long-term measurements (track detectors). Usually high radon concentrations (1-5 kBq/m³) are used during calibration, and the Poisson random error rarely exceeds a few percent. Nevertheless, the dispersion of measured values even during calibration usually exceeds the Poisson dispersion expected on the basis of counting statistics. The origins of such non-Poisson random errors during calibration differ between the different kinds of instrumental measurements. At present not all sources of non-Poisson random errors have been reliably identified. The initial calibration accuracy of working devices rarely exceeds 20%. Real radon concentrations are usually in the range from some tens to some hundreds of becquerel per cubic metre, and for low radon levels the Poisson random error can reach up to 20%. The random non-Poisson errors and residual systematic biases depend on the kind of measurement technique and the environmental conditions during the radon measurements. For charcoal canisters there are additional sources of measurement error due to the influence of air humidity and variations of the radon concentration during the canister exposure. The accuracy of long-term measurements by track detectors will depend on the quality of chemical etching after exposure and the influence of seasonal radon variations. The main sources of
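    The Poisson counting contribution discussed above is easy to make concrete; a small sketch with illustrative count rates (not the paper's data):

```python
import math

# For N registered counts the Poisson relative standard uncertainty is 1/sqrt(N).
def poisson_rel_uncertainty(count_rate_cps, live_time_s):
    counts = count_rate_cps * live_time_s
    return 1.0 / math.sqrt(counts)

# Calibration at high radon concentration: the Poisson term is a fraction of a percent.
print(poisson_rel_uncertainty(50.0, 3600))    # ~0.002, i.e. 0.2 %
# Low environmental level: the Poisson term alone approaches 20 %.
print(poisson_rel_uncertainty(0.007, 3600))   # ~0.20
```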

  5. A Unified Approach for Reporting ARM Measurement Uncertainties Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Campos, E [Argonne National Lab. (ANL), Argonne, IL (United States); Sisterson, Douglas [Argonne National Lab. (ANL), Argonne, IL (United States)

    2016-12-01

    The U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Climate Research Facility is observationally based, and quantifying the uncertainty of its measurements is critically important. With over 300 widely differing instruments providing over 2,500 datastreams, concise expression of measurement uncertainty is quite challenging. The ARM Facility currently provides data and supporting metadata (information about the data or data quality) to its users through a number of sources. Because the continued success of the ARM Facility depends on the known quality of its measurements, the Facility relies on instrument mentors and the ARM Data Quality Office (DQO) to ensure, assess, and report measurement quality. Therefore, an easily accessible, well-articulated estimate of ARM measurement uncertainty is needed. Note that some of the instrument observations require mathematical algorithms (retrievals) to convert a measured engineering variable into a useful geophysical measurement. While those types of retrieval measurements are identified, this study does not address particular methods for retrieval uncertainty. As well, the ARM Facility also provides engineered data products, or value-added products (VAPs), based on multiple instrument measurements. This study does not include uncertainty estimates for those data products. We propose here that a total measurement uncertainty should be calculated as a function of the instrument uncertainty (calibration factors), the field uncertainty (environmental factors), and the retrieval uncertainty (algorithm factors). The study will not expand on methods for computing these uncertainties. Instead, it will focus on the practical identification, characterization, and inventory of the measurement uncertainties already available in the ARM community through the ARM instrument mentors and their ARM instrument handbooks. As a result, this study will address the first steps towards reporting ARM measurement uncertainty
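    The report does not prescribe how the three components are to be combined; the sketch below assumes uncorrelated components added in quadrature, with illustrative values rather than ARM numbers.

```python
import math

# Assumption: total uncertainty = quadrature sum of instrument (calibration),
# field (environmental) and retrieval (algorithm) components, treated as uncorrelated.
def total_uncertainty(u_instrument, u_field, u_retrieval=0.0):
    return math.sqrt(u_instrument**2 + u_field**2 + u_retrieval**2)

# Hypothetical radiometer example, all terms in W m^-2.
print(total_uncertainty(u_instrument=4.0, u_field=2.5, u_retrieval=1.0))  # ~4.8
```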

  6. Quantifying uncertainty in nuclear analytical measurements

    International Nuclear Information System (INIS)

    2004-07-01

    The lack of international consensus on the expression of uncertainty in measurements was recognised by the late 1970s and led, after the issuance of a series of rather generic recommendations, to the publication of a general publication, known as GUM, the Guide to the Expression of Uncertainty in Measurement. This publication, issued in 1993, was based on co-operation over several years by the Bureau International des Poids et Mesures, the International Electrotechnical Commission, the International Federation of Clinical Chemistry, the International Organization for Standardization (ISO), the International Union of Pure and Applied Chemistry, the International Union of Pure and Applied Physics and the Organisation internationale de metrologie legale. The purpose was to promote full information on how uncertainty statements are arrived at and to provide a basis for harmonized reporting and the international comparison of measurement results. The need to provide more specific guidance to different measurement disciplines was soon recognized and the field of analytical chemistry was addressed by EURACHEM in 1995 in the first edition of a guidance report on Quantifying Uncertainty in Analytical Measurements, produced by a group of experts from the field. That publication translated the general concepts of the GUM into specific applications for analytical laboratories and illustrated the principles with a series of selected examples as a didactic tool. Based on feedback from the actual practice, the EURACHEM publication was extensively reviewed in 1997-1999 under the auspices of the Co-operation on International Traceability in Analytical Chemistry (CITAC), and a second edition was published in 2000. Still, except for a single example on the measurement of radioactivity in GUM, the field of nuclear and radiochemical measurements was not covered. The explicit requirement of ISO standard 17025:1999, General Requirements for the Competence of Testing and Calibration

  7. Strain gauge measurement uncertainties on hydraulic turbine runner blade

    International Nuclear Information System (INIS)

    Arpin-Pont, J; Gagnon, M; Tahan, S A; Coutu, A; Thibault, D

    2012-01-01

    Strains experimentally measured with strain gauges can differ from those evaluated using the Finite Element (FE) method. This difference is due mainly to the assumptions and uncertainties inherent to each method. To circumvent this difficulty, we developed a numerical method based on Monte Carlo simulations to evaluate measurement uncertainties produced by the behaviour of a unidirectional welded gauge, its position uncertainty and its integration effect. This numerical method uses the displacement fields of the studied part evaluated by an FE analysis. The paper presents a study case using in situ data measured on a hydraulic turbine runner. The FE analysis of the turbine runner blade was computed, and our numerical method used to evaluate uncertainties on strains measured at five locations with welded strain gauges. Then, measured strains and their uncertainty ranges are compared to the estimated strains. The uncertainty ranges obtained extended from 74 με to 165 με. Furthermore, the biases observed between the median of the uncertainty ranges and the FE strains varied from −36 to 36 με. Note that strain gauge measurement uncertainties depend mainly on displacement fields and gauge geometry.
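    A one-dimensional toy version of the Monte Carlo idea described above, using a synthetic strain field with hypothetical gauge length and position uncertainty (not the runner data): the gauge averages the strain over its grid, and its position is drawn from a normal distribution.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic strain field along one coordinate, in microstrain (purely illustrative).
def strain_field(x):
    return 500.0 + 300.0 * np.sin(2.0 * np.pi * x / 0.20)

gauge_length = 0.005        # 5 mm grid length (assumed)
sigma_position = 0.002      # 2 mm standard uncertainty on gauge placement (assumed)
x_nominal = 0.050           # nominal gauge centre [m]

samples = []
for _ in range(20_000):
    x0 = rng.normal(x_nominal, sigma_position)                  # position uncertainty
    grid = np.linspace(x0 - gauge_length / 2, x0 + gauge_length / 2, 50)
    samples.append(strain_field(grid).mean())                   # integration effect of the grid

samples = np.asarray(samples)
print(f"mean = {samples.mean():.1f} microstrain, u = {samples.std(ddof=1):.1f} microstrain")
```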

  8. Conclusions on measurement uncertainty in microbiology.

    Science.gov (United States)

    Forster, Lynne I

    2009-01-01

    Since its first issue in 1999, testing laboratories wishing to comply with all the requirements of ISO/IEC 17025 have been collecting data for estimating uncertainty of measurement for quantitative determinations. In the microbiological field of testing, some debate has arisen as to whether uncertainty needs to be estimated for each method performed in the laboratory for each type of sample matrix tested. Queries also arise concerning the estimation of uncertainty when plate/membrane filter colony counts are below recommended method counting range limits. A selection of water samples (with low to high contamination) was tested in replicate with the associated uncertainty of measurement being estimated from the analytical results obtained. The analyses performed on the water samples included total coliforms, fecal coliforms, fecal streptococci by membrane filtration, and heterotrophic plate counts by the pour plate technique. For those samples where plate/membrane filter colony counts were ≥20, uncertainty estimates at a 95% confidence level were very similar for the methods, being estimated as 0.13, 0.14, 0.14, and 0.12, respectively. For those samples where plate/membrane filter colony counts were <20, estimated uncertainty values for each sample showed close agreement with published confidence limits established using a Poisson distribution approach.
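    The Poisson-based limits mentioned for low counts can be reproduced with an exact confidence interval for a single plate count; the sketch below is illustrative, not the paper's calculation.

```python
from scipy.stats import chi2

# Exact two-sided confidence interval for a Poisson mean given an observed count k,
# built from chi-square quantiles.
def poisson_ci(k, conf=0.95):
    alpha = 1.0 - conf
    lower = 0.0 if k == 0 else chi2.ppf(alpha / 2, 2 * k) / 2
    upper = chi2.ppf(1 - alpha / 2, 2 * (k + 1)) / 2
    return lower, upper

print(poisson_ci(12))   # roughly (6.2, 21.0) for a plate count of 12 colonies
```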

  9. Uncertainty in techno-economic estimates of cellulosic ethanol production due to experimental measurement uncertainty

    Directory of Open Access Journals (Sweden)

    Vicari Kristin J

    2012-04-01

    Background: Cost-effective production of lignocellulosic biofuels remains a major financial and technical challenge at the industrial scale. A critical tool in biofuels process development is the techno-economic (TE) model, which calculates biofuel production costs using a process model and an economic model. The process model solves mass and energy balances for each unit, and the economic model estimates capital and operating costs from the process model based on economic assumptions. The process model inputs include experimental data on the feedstock composition and intermediate product yields for each unit. These experimental yield data are calculated from primary measurements. Uncertainty in these primary measurements is propagated to the calculated yields, to the process model, and ultimately to the economic model. Thus, outputs of the TE model have a minimum uncertainty associated with the uncertainty in the primary measurements. Results: We calculate the uncertainty in the Minimum Ethanol Selling Price (MESP) estimate for lignocellulosic ethanol production via a biochemical conversion process: dilute sulfuric acid pretreatment of corn stover followed by enzymatic hydrolysis and co-fermentation of the resulting sugars to ethanol. We perform a sensitivity analysis on the TE model and identify the feedstock composition and the conversion yields from three unit operations (xylose from pretreatment, glucose from enzymatic hydrolysis, and ethanol from fermentation) as the most important variables. The uncertainty in the pretreatment xylose yield arises from multiple measurements, whereas the glucose and ethanol yields from enzymatic hydrolysis and fermentation, respectively, are dominated by a single measurement: the fraction of insoluble solids (fIS) in the biomass slurries. Conclusions: We calculate a $0.15/gal uncertainty in MESP from the TE model due to uncertainties in primary measurements. This result sets a lower bound on the error bars of

  10. Photovoltaic Calibrations at the National Renewable Energy Laboratory and Uncertainty Analysis Following the ISO 17025 Guidelines

    Energy Technology Data Exchange (ETDEWEB)

    Emery, Keith [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2016-09-01

    The measurement of photovoltaic (PV) performance with respect to reference conditions requires measuring current versus voltage for a given tabular reference spectrum, junction temperature, and total irradiance. This report presents the procedures implemented by the PV Cell and Module Performance Characterization Group at the National Renewable Energy Laboratory (NREL) to achieve the lowest practical uncertainty. A rigorous uncertainty analysis of these procedures is presented, which follows the International Organization for Standardization (ISO) Guide to the Expression of Uncertainty in Measurement. This uncertainty analysis is required for the team’s laboratory accreditation under ISO standard 17025, “General Requirements for the Competence of Testing and Calibration Laboratories.” The report also discusses additional areas where the uncertainty can be reduced.

  11. CEC/USDOE workshop on uncertainty analysis

    International Nuclear Information System (INIS)

    Elderkin, C.E.; Kelly, G.N.

    1990-07-01

    Any measured or assessed quantity contains uncertainty. The quantitative estimation of such uncertainty is becoming increasingly important, especially in assuring that safety requirements are met in design, regulation, and operation of nuclear installations. The CEC/USDOE Workshop on Uncertainty Analysis, held in Santa Fe, New Mexico, on November 13 through 16, 1989, was organized jointly by the Commission of European Communities (CEC's) Radiation Protection Research program, dealing with uncertainties throughout the field of consequence assessment, and DOE's Atmospheric Studies in Complex Terrain (ASCOT) program, concerned with the particular uncertainties in time and space variant transport and dispersion. The workshop brought together US and European scientists who have been developing or applying uncertainty analysis methodologies, conducted in a variety of contexts, often with incomplete knowledge of the work of others in this area. Thus, it was timely to exchange views and experience, identify limitations of approaches to uncertainty and possible improvements, and enhance the interface between developers and users of uncertainty analysis methods. Furthermore, the workshop considered the extent to which consistent, rigorous methods could be used in various applications within consequence assessment. 3 refs

  12. Metrology and process control: dealing with measurement uncertainty

    Science.gov (United States)

    Potzick, James

    2010-03-01

    Metrology is often used in designing and controlling manufacturing processes. A product sample is processed, some relevant property is measured, and the process is adjusted to bring the next processed sample closer to its specification. This feedback loop can be remarkably effective for the complex processes used in semiconductor manufacturing, but there is some risk involved because measurements have uncertainty and product specifications have tolerances. There is a finite risk that good product will fail testing or that faulty product will pass. Standard methods for quantifying measurement uncertainty have been presented, but the question arises: how much measurement uncertainty is tolerable in a specific case? Or: how does measurement uncertainty relate to manufacturing risk? This paper looks at some of the components inside this process control feedback loop and describes methods to answer these questions.
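    One way to make the false-accept/false-reject question concrete (an illustration, not the paper's method) is to compute the probability that a single item passes a tolerance check, given its true value and the measurement standard uncertainty:

```python
from scipy.stats import norm

# Probability that the measured value falls inside the specification limits,
# assuming a normally distributed measurement error with standard uncertainty u_meas.
def prob_pass(true_value, u_meas, lower_spec, upper_spec):
    return (norm.cdf(upper_spec, loc=true_value, scale=u_meas)
            - norm.cdf(lower_spec, loc=true_value, scale=u_meas))

# A feature whose true value sits exactly on the upper limit is accepted ~50 % of
# the time; shrinking u_meas (or guard-banding the limits) changes that risk.
print(prob_pass(true_value=10.00, u_meas=0.02, lower_spec=9.90, upper_spec=10.00))
```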

  13. Methodology for the assessment of measuring uncertainties of articulated arm coordinate measuring machines

    International Nuclear Information System (INIS)

    Romdhani, Fekria; Hennebelle, François; Ge, Min; Juillion, Patrick; Fontaine, Jean François; Coquet, Richard

    2014-01-01

    Articulated Arm Coordinate Measuring Machines (AACMMs) have gradually evolved and are increasingly used in the mechanical industry. At present, the measurement uncertainties relating to the use of these devices are not yet well quantified. The work carried out consists of determining the measurement uncertainties of a mechanical part measured by an AACMM. The studies aiming to develop a model of measurement uncertainty are based on the Monte Carlo method developed in Supplement 1 of the Guide to the Expression of Uncertainty in Measurement [1], as well as on identifying and characterizing the main sources of uncertainty. A multi-level Monte Carlo approach has been developed which allows the possible evolution of the AACMM during the measurement to be characterized and, at a second level, the uncertainty of the considered measurand to be quantified. The first Monte Carlo level is the most complex and is thus divided into three sub-levels, namely characterization of the positioning error of a point, estimation of calibration errors, and evaluation of fluctuations of the ‘localization point’. The global method is presented and the results of the first sub-level are developed in particular. The main sources of uncertainty, including AACMM deformations, are discussed. (paper)

  14. Not Normal: the uncertainties of scientific measurements

    Science.gov (United States)

    Bailey, David C.

    2017-01-01

    Judging the significance and reproducibility of quantitative research requires a good understanding of relevant uncertainties, but it is often unclear how well these have been evaluated and what they imply. Reported scientific uncertainties were studied by analysing 41 000 measurements of 3200 quantities from medicine, nuclear and particle physics, and interlaboratory comparisons ranging from chemistry to toxicology. Outliers are common, with 5σ disagreements up to five orders of magnitude more frequent than naively expected. Uncertainty-normalized differences between multiple measurements of the same quantity are consistent with heavy-tailed Student's t-distributions that are often almost Cauchy, far from a Gaussian Normal bell curve. Medical research uncertainties are generally as well evaluated as those in physics, but physics uncertainty improves more rapidly, making feasible simple significance criteria such as the 5σ discovery convention in particle physics. Contributions to measurement uncertainty from mistakes and unknown problems are not completely unpredictable. Such errors appear to have power-law distributions consistent with how designed complex systems fail, and how unknown systematic errors are constrained by researchers. This better understanding may help improve analysis and meta-analysis of data, and help scientists and the public have more realistic expectations of what scientific results imply.
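    The scale of the effect reported above can be illustrated by comparing tail probabilities of a Gaussian with heavy-tailed alternatives; the distributions in this sketch are generic, not fitted to the study's data.

```python
from scipy.stats import norm, t, cauchy

# Probability of a disagreement larger than 5 standard uncertainties under
# a Normal, a Student's t with 3 degrees of freedom, and a Cauchy distribution.
for name, dist in [("Normal", norm), ("Student t, df=3", t(3)), ("Cauchy", cauchy)]:
    print(f"{name:>16}: P(|z| > 5) = {2 * dist.sf(5):.2e}")
# The heavy-tailed cases exceed the Gaussian tail probability by several orders of magnitude.
```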

  15. GRAPH THEORY APPROACH TO QUANTIFY UNCERTAINTY OF PERFORMANCE MEASURES

    Directory of Open Access Journals (Sweden)

    Sérgio D. Sousa

    2015-03-01

    In this work, the performance measurement process is studied to quantify the uncertainty induced in the resulting performance measure (PM). To that end, the causes of uncertainty are identified by analysing the activities undertaken in the three stages of the performance measurement process: design and implementation, data collection and recording, and determination and analysis. A quantitative methodology based on graph theory and on the sources of uncertainty of the performance measurement process is used to calculate an uncertainty index with which to evaluate the level of uncertainty of a given PM (or key performance indicator). An application example is presented. The quantification of PM uncertainty could contribute to better representing the risk associated with a given decision and also to improving the PM to increase its precision and reliability.

  16. A statistical approach to quantify uncertainty in carbon monoxide measurements at the Izaña global GAW station: 2008–2011

    Directory of Open Access Journals (Sweden)

    A. J. Gomez-Pelaez

    2013-03-01

    Atmospheric CO in situ measurements are carried out at the Izaña (Tenerife) global GAW (Global Atmosphere Watch Programme of the World Meteorological Organization, WMO) mountain station using a Reduction Gas Analyser (RGA). In situ measurements at Izaña are representative of the subtropical Northeast Atlantic free troposphere, especially during nighttime. We present the measurement system configuration, the response function, the calibration scheme, the data processing, the Izaña 2008–2011 CO nocturnal time series, and the mean diurnal cycle by month. We have developed a rigorous uncertainty analysis for the carbon monoxide measurements carried out at the Izaña station, which could be applied to other GAW stations. We determine the combined standard measurement uncertainty taking into consideration four contributing components: the uncertainty of the WMO standard gases interpolated over the range of measurement, the uncertainty that takes into account the agreement between the standard gases and the response function used, the uncertainty due to the repeatability of the injections, and the propagated uncertainty related to the temporal consistency of the response function parameters (which also takes into account the covariance between the parameters). The mean value of the combined standard uncertainty decreased significantly after March 2009, from 2.37 nmol mol−1 to 1.66 nmol mol−1, due to improvements in the measurement system. A fifth type of uncertainty, which we call representation uncertainty, is considered when some of the data necessary to compute the temporal mean are absent. Any computed mean also has a propagated uncertainty arising from the uncertainties of the data used to compute the mean. The law of propagation depends on the type of uncertainty component (random or systematic). In situ hourly means are compared with simultaneous and collocated NOAA flask samples. The uncertainty of the differences is computed and used to determine
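    For reference, the combination of components with parameter covariances mentioned above follows the general GUM law of propagation of uncertainty (the standard formula, not specific to this paper):

```latex
u_c^{2}(y) \;=\; \sum_{i=1}^{N}\left(\frac{\partial f}{\partial x_i}\right)^{2} u^{2}(x_i)
\;+\; 2\sum_{i=1}^{N-1}\sum_{j=i+1}^{N}
\frac{\partial f}{\partial x_i}\,\frac{\partial f}{\partial x_j}\,u(x_i,x_j)
```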

  17. Differential rigor development in red and white muscle revealed by simultaneous measurement of tension and stiffness.

    Science.gov (United States)

    Kobayashi, Masahiko; Takemori, Shigeru; Yamaguchi, Maki

    2004-02-10

    Based on the molecular mechanism of rigor mortis, we have proposed that stiffness (elastic modulus evaluated with tension response against minute length perturbations) can be a suitable index of post-mortem rigidity in skeletal muscle. To trace the developmental process of rigor mortis, we measured stiffness and tension in both red and white rat skeletal muscle kept in liquid paraffin at 37 and 25 degrees C. White muscle (in which type IIB fibres predominate) developed stiffness and tension significantly more slowly than red muscle, except for soleus red muscle at 25 degrees C, which showed disproportionately slow rigor development. In each of the examined muscles, stiffness and tension developed more slowly at 25 degrees C than at 37 degrees C. In each specimen, tension always reached its maximum level earlier than stiffness, and then decreased more rapidly and markedly than stiffness. These phenomena may account for the sequential progress of rigor mortis in human cadavers.

  18. Treatment and reporting of uncertainties for environmental radiation measurements

    International Nuclear Information System (INIS)

    Colle, R.

    1980-01-01

    Recommendations for a practical and uniform method for treating and reporting uncertainties in environmental radiation measurements data are presented. The method requires that each reported measurement result include the value, a total propagated random uncertainty expressed as the standard deviation, and a combined overall uncertainty. The uncertainty assessment should be based on as nearly a complete assessment as possible and should include every conceivable or likely source of inaccuracy in the result. Guidelines are given for estimating random and systematic uncertainty components, and for propagating and combining them to form an overall uncertainty

  19. Uncertainty relation and simultaneous measurements in quantum theory

    International Nuclear Information System (INIS)

    Busch, P.

    1982-01-01

    In this thesis the question of the interpretation of the uncertainty relation is taken up, and a program for the justification of its individualistic interpretation is formulated. By means of quantum mechanical models of position and momentum measurement, a justification of the interpretation is attempted by reconstructing the origin of the uncertainties from the conditions of the measuring devices and by determining the relation of the measured results to the object. By means of a model of the joint measurement it could be shown how the uncertainty relation results from the unavoidable mutual disturbance of the devices and from the uncertainty relation for the measuring system. Ultimately, then, the commutation relation is decisive. As an illustration the slit experiment is discussed, first according to Heisenberg with a fixed slit, then for the quantum mechanical, movable slit (Bohr-Einstein). (orig./HSI) [de]
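    For reference (a textbook statement, not a result of the thesis), the relation under discussion can be written in its Robertson form, with the position-momentum case following from the canonical commutator:

```latex
\sigma_A\,\sigma_B \;\ge\; \tfrac{1}{2}\left|\langle[\hat A,\hat B]\rangle\right|,
\qquad
[\hat x,\hat p]=i\hbar \;\;\Longrightarrow\;\; \sigma_x\,\sigma_p \ge \frac{\hbar}{2}.
```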

  20. [Experimental study of restiffening of the rigor mortis].

    Science.gov (United States)

    Wang, X; Li, M; Liao, Z G; Yi, X F; Peng, X M

    2001-11-01

    To observe changes in sarcomere length in rats during restiffening of rigor mortis, we measured the sarcomere length of the quadriceps in 40 rats under different conditions by scanning electron microscopy. The sarcomere length in undisturbed rigor mortis is clearly shorter than that after restiffening. The sarcomere length is negatively correlated with the intensity of rigor mortis. Measuring sarcomere length can determine the intensity of rigor mortis and provide evidence for estimating the time since death.

  1. Vector network analyzer (VNA) measurements and uncertainty assessment

    CERN Document Server

    Shoaib, Nosherwan

    2017-01-01

    This book describes vector network analyzer measurements and uncertainty assessments, particularly in waveguide test-set environments, in order to establish their compatibility to the International System of Units (SI) for accurate and reliable characterization of communication networks. It proposes a fully analytical approach to measurement uncertainty evaluation, while also highlighting the interaction and the linear propagation of different uncertainty sources to compute the final uncertainties associated with the measurements. The book subsequently discusses the dimensional characterization of waveguide standards and the quality of the vector network analyzer (VNA) calibration techniques. The book concludes with an in-depth description of the novel verification artefacts used to assess the performance of the VNAs. It offers a comprehensive reference guide for beginners to experts, in both academia and industry, whose work involves the field of network analysis, instrumentation and measurements.

  2. Measurement uncertainty budget of an interferometric flow velocity sensor

    Science.gov (United States)

    Bermuske, Mike; Büttner, Lars; Czarske, Jürgen

    2017-06-01

    Flow rate measurements are a common topic for process monitoring in chemical engineering and the food industry. To achieve the requested low uncertainties of 0.1% for flow rate measurements, a precise measurement of the shear layers of such flows is necessary. The Laser Doppler Velocimeter (LDV) is an established method for measuring local flow velocities. For exact estimation of the flow rate, the flow profile in the shear layer is of importance. For standard LDV the axial resolution, and therefore the number of measurement points in the shear layer, is defined by the length of the measurement volume. A decrease of this length is accompanied by a larger fringe distance variation along the measurement axis, which results in a rise of the measurement uncertainty for the flow velocity (uncertainty relation between spatial resolution and velocity uncertainty). As a unique advantage, the laser Doppler profile sensor (LDV-PS) overcomes this problem by using two fan-like fringe systems to obtain the position of the measured particles along the measurement axis, and therefore achieves a high spatial resolution while still offering a low velocity uncertainty. With this technique, the flow rate can be estimated with one order of magnitude lower uncertainty, down to 0.05% statistical uncertainty [1], and flow profiles, especially in film flows, can be measured more accurately. The problem for this technique is that, in contrast to laboratory setups where the system is quite stable, for industrial applications the sensor needs reliable and robust traceability to the SI units, meter and second. Small deviations in the calibration can, because of the highly position-dependent calibration function, cause large systematic errors in the measurement result. Therefore, a simple, stable and accurate tool is needed that can easily be used in industrial surroundings to check or recalibrate the sensor. In this work, different calibration methods are presented and their influences on the

  3. Measure of uncertainty in regional grade variability

    NARCIS (Netherlands)

    Tutmez, B.; Kaymak, U.; Melin, P.; Castillo, O.; Gomez Ramirez, E.; Kacprzyk, J.; Pedrycz, W.

    2007-01-01

    Because geological events are neither homogeneous nor isotropic, geological investigations are characterized by particularly high uncertainties. This paper presents a hybrid methodology for measuring uncertainty in regional grade variability. In order to evaluate the fuzziness in grade

  4. An approach based on Hierarchical Bayesian Graphical Models for measurement interpretation under uncertainty

    Science.gov (United States)

    Skataric, Maja; Bose, Sandip; Zeroug, Smaine; Tilke, Peter

    2017-02-01

    It is not uncommon in the field of non-destructive evaluation that multiple measurements encompassing a variety of modalities are available for analysis and interpretation for determining the underlying states of nature of the materials or parts being tested. Despite and sometimes due to the richness of data, significant challenges arise in the interpretation manifested as ambiguities and inconsistencies due to various uncertain factors in the physical properties (inputs), environment, measurement device properties, human errors, and the measurement data (outputs). Most of these uncertainties cannot be described by any rigorous mathematical means, and modeling of all possibilities is usually infeasible for many real time applications. In this work, we will discuss an approach based on Hierarchical Bayesian Graphical Models (HBGM) for the improved interpretation of complex (multi-dimensional) problems with parametric uncertainties that lack usable physical models. In this setting, the input space of the physical properties is specified through prior distributions based on domain knowledge and expertise, which are represented as Gaussian mixtures to model the various possible scenarios of interest for non-destructive testing applications. Forward models are then used offline to generate the expected distribution of the proposed measurements which are used to train a hierarchical Bayesian network. In Bayesian analysis, all model parameters are treated as random variables, and inference of the parameters is made on the basis of posterior distribution given the observed data. Learned parameters of the posterior distribution obtained after the training can therefore be used to build an efficient classifier for differentiating new observed data in real time on the basis of pre-trained models. We will illustrate the implementation of the HBGM approach to ultrasonic measurements used for cement evaluation of cased wells in the oil industry.

  5. Increased scientific rigor will improve reliability of research and effectiveness of management

    Science.gov (United States)

    Sells, Sarah N.; Bassing, Sarah B.; Barker, Kristin J.; Forshee, Shannon C.; Keever, Allison; Goerz, James W.; Mitchell, Michael S.

    2018-01-01

    Rigorous science that produces reliable knowledge is critical to wildlife management because it increases accurate understanding of the natural world and informs management decisions effectively. Application of a rigorous scientific method based on hypothesis testing minimizes unreliable knowledge produced by research. To evaluate the prevalence of scientific rigor in wildlife research, we examined 24 issues of the Journal of Wildlife Management from August 2013 through July 2016. We found 43.9% of studies did not state or imply a priori hypotheses, which are necessary to produce reliable knowledge. We posit that this is due, at least in part, to a lack of common understanding of what rigorous science entails, how it produces more reliable knowledge than other forms of interpreting observations, and how research should be designed to maximize inferential strength and usefulness of application. Current primary literature does not provide succinct explanations of the logic behind a rigorous scientific method or readily applicable guidance for employing it, particularly in wildlife biology; we therefore synthesized an overview of the history, philosophy, and logic that define scientific rigor for biological studies. A rigorous scientific method includes 1) generating a research question from theory and prior observations, 2) developing hypotheses (i.e., plausible biological answers to the question), 3) formulating predictions (i.e., facts that must be true if the hypothesis is true), 4) designing and implementing research to collect data potentially consistent with predictions, 5) evaluating whether predictions are consistent with collected data, and 6) drawing inferences based on the evaluation. Explicitly testing a priori hypotheses reduces overall uncertainty by reducing the number of plausible biological explanations to only those that are logically well supported. Such research also draws inferences that are robust to idiosyncratic observations and

  6. Evaluation of the uncertainty of environmental measurements of radioactivity

    International Nuclear Information System (INIS)

    Heydorn, K.

    2003-01-01

    Full text: The almost universal acceptance of the concept of uncertainty has led to its introduction into the ISO 17025 standard for general requirements to testing and calibration laboratories. This means that not only scientists, but also legislators, politicians, the general population - and perhaps even the press - expect to see all future results associated with an expression of their uncertainty. Results obtained by measurement of radioactivity have routinely been associated with an expression of their uncertainty, based on the so-called counting statistics. This is calculated together with the actual result on the assumption that the number of counts observed has a Poisson distribution with equal mean and variance. Most of the nuclear scientific community has therefore assumed that it already complied with the latest ISO 17025 requirements. Counting statistics, however, express only the variability observed among repeated measurements of the same sample under the same counting conditions, which is equivalent to the term repeatability used in quantitative analysis. Many other sources of uncertainty need to be taken into account before a statement of the uncertainty of the actual result can be made. As the first link in the traceability chain calibration is always an important uncertainty component in any kind of measurement. For radioactivity measurements in particular we find that counting geometry assumes the greatest importance, because it is often not possible to measure a standard and a control sample under exactly the same conditions. In the case of large samples we have additional uncertainty components associated with sample heterogeneity and its influence on self-absorption and counting efficiency. In low-level environmental measurements we have an additional risk of sample contamination, but the most important contribution to uncertainty is usually the representativity of the sample being analysed. For uniform materials this can be expressed by the

  7. Estimating the measurement uncertainty in forensic blood alcohol analysis.

    Science.gov (United States)

    Gullberg, Rod G

    2012-04-01

    For many reasons, forensic toxicologists are being asked to determine and report their measurement uncertainty in blood alcohol analysis. While understood conceptually, the elements and computations involved in determining measurement uncertainty are generally foreign to most forensic toxicologists. Several established and well-documented methods are available to determine and report the uncertainty in blood alcohol measurement. A straightforward bottom-up approach is presented that includes: (1) specifying the measurand, (2) identifying the major components of uncertainty, (3) quantifying the components, (4) statistically combining the components and (5) reporting the results. A hypothetical example is presented that employs reasonable estimates for forensic blood alcohol analysis assuming headspace gas chromatography. These computations are easily employed in spreadsheet programs as well. Determining and reporting measurement uncertainty is an important element in establishing fitness-for-purpose. Indeed, the demand for such computations and information from the forensic toxicologist will continue to increase.
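    A minimal sketch of the bottom-up combination outlined above; the component values and the coverage factor k = 2 are illustrative, not the paper's figures. The same arithmetic is what a spreadsheet implementation would carry out.

```python
import math

# Relative standard uncertainties of the identified components (hypothetical values).
components = {
    "calibrator":  0.006,
    "method_bias": 0.008,
    "replication": 0.010,
    "sampling":    0.005,
}

u_rel = math.sqrt(sum(u**2 for u in components.values()))   # combined relative uncertainty
result = 0.085                                              # measured BAC, g/100 mL (example)
U = 2 * u_rel * result                                      # expanded uncertainty, k = 2

print(f"BAC = {result:.3f} g/100 mL, U(k=2) = {U:.4f} g/100 mL")
```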

  8. Uncertainty, joint uncertainty, and the quantum uncertainty principle

    International Nuclear Information System (INIS)

    Narasimhachar, Varun; Poostindouz, Alireza; Gour, Gilad

    2016-01-01

    Historically, the element of uncertainty in quantum mechanics has been expressed through mathematical identities called uncertainty relations, a great many of which continue to be discovered. These relations use diverse measures to quantify uncertainty (and joint uncertainty). In this paper we use operational information-theoretic principles to identify the common essence of all such measures, thereby defining measure-independent notions of uncertainty and joint uncertainty. We find that most existing entropic uncertainty relations use measures of joint uncertainty that yield themselves to a small class of operational interpretations. Our notion relaxes this restriction, revealing previously unexplored joint uncertainty measures. To illustrate the utility of our formalism, we derive an uncertainty relation based on one such new measure. We also use our formalism to gain insight into the conditions under which measure-independent uncertainty relations can be found. (paper)

  9. Information measures and uncertainty of particular symbols

    Czech Academy of Sciences Publication Activity Database

    Mareš, Milan

    2011-01-01

    Roč. 47, č. 1 (2011), s. 144-163 ISSN 0023-5954 R&D Projects: GA MŠk(CZ) 1M0572; GA ČR GA402/08/0618 Institutional research plan: CEZ:AV0Z10750506 Keywords : Information source * Information measure * Uncertainty modelling Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.454, year: 2011 http://library.utia.cas.cz/separaty/2011/E/mares-information measures and uncertainty of particular symbols.pdf

  10. Assessment of dose measurement uncertainty using RisoScan

    International Nuclear Information System (INIS)

    Helt-Hansen, Jakob; Miller, Arne

    2006-01-01

    The dose measurement uncertainty of the dosimeter system RisoScan, office scanner and Riso B3 dosimeters has been assessed by comparison with spectrophotometer measurements of the same dosimeters. The reproducibility and the combined uncertainty were found to be approximately 2% and 4%, respectively, at one standard deviation. The subroutine in RisoScan for electron energy measurement is shown to give results that are equivalent to the measurements with a scanning spectrophotometer

  11. Calibration and Measurement Uncertainty Estimation of Radiometric Data: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Habte, A.; Sengupta, M.; Reda, I.; Andreas, A.; Konings, J.

    2014-11-01

    Evaluating the performance of photovoltaic cells, modules, and arrays that form large solar deployments relies on accurate measurements of the available solar resource. Therefore, determining the accuracy of these solar radiation measurements provides a better understanding of investment risks. This paper provides guidelines and recommended procedures for estimating the uncertainty in calibrations and measurements by radiometers using methods that follow the International Bureau of Weights and Measures Guide to the Expression of Uncertainty (GUM). Standardized analysis based on these procedures ensures that the uncertainty quoted is well documented.

  12. Evaluation of uncertainty and detection limits in radioactivity measurements

    Energy Technology Data Exchange (ETDEWEB)

    Herranz, M. [Universidad del Pais Vasco/Euskal Herriko Unibertsitatea, Escuela Tecnica Superior de Ingenieria de Bilbao, Alda. Urquijo, s/n, 48013 Bilbao (Spain); Idoeta, R. [Universidad del Pais Vasco/Euskal Herriko Unibertsitatea, Escuela Tecnica Superior de Ingenieria de Bilbao, Alda. Urquijo, s/n, 48013 Bilbao (Spain)], E-mail: raquel.idoeta@ehu.es; Legarda, F. [Universidad del Pais Vasco/Euskal Herriko Unibertsitatea, Escuela Tecnica Superior de Ingenieria de Bilbao, Alda. Urquijo, s/n, 48013 Bilbao (Spain)

    2008-10-01

    The uncertainty associated with the assessment of the radioactive content of any sample depends on the net counting rate registered during the measuring process and on the different weighting factors needed to transform this counting rate into activity, activity per unit mass or activity concentration. This work analyses the standard uncertainties in these weighting factors as well as their contribution to the uncertainty in the activity reported for three typical determinations for environmental radioactivity measurements in the laboratory. It also studies the corresponding characteristic limits and their dependence on the standard uncertainty related to those weighting factors, offering an analysis of the effectiveness of the simplified characteristic limits as evaluated by various measuring software and laboratories.

  13. Evaluation of uncertainty and detection limits in radioactivity measurements

    International Nuclear Information System (INIS)

    Herranz, M.; Idoeta, R.; Legarda, F.

    2008-01-01

    The uncertainty associated with the assessment of the radioactive content of any sample depends on the net counting rate registered during the measuring process and on the different weighting factors needed to transform this counting rate into activity, activity per unit mass or activity concentration. This work analyses the standard uncertainties in these weighting factors as well as their contribution to the uncertainty in the activity reported for three typical determinations for environmental radioactivity measurements in the laboratory. It also studies the corresponding characteristic limits and their dependence on the standard uncertainty related to those weighting factors, offering an analysis of the effectiveness of the simplified characteristic limits as evaluated by various measuring software and laboratories

  14. Using a Meniscus to Teach Uncertainty in Measurement

    Science.gov (United States)

    Backman, Philip

    2008-01-01

    I have found that students easily understand that a measurement cannot be exact, but they often seem to lack an understanding of why it is important to know "something" about the magnitude of the uncertainty. This tends to promote an attitude that almost any uncertainty value will do. Such indifference may exist because once an uncertainty is…

  15. Monitoring muscle optical scattering properties during rigor mortis

    Science.gov (United States)

    Xia, J.; Ranasinghesagara, J.; Ku, C. W.; Yao, G.

    2007-09-01

    Sarcomere is the fundamental functional unit in skeletal muscle for force generation. In addition, sarcomere structure is also an important factor that affects the eating quality of muscle food, the meat. The sarcomere structure is altered significantly during rigor mortis, which is the critical stage involved in transforming muscle to meat. In this paper, we investigated optical scattering changes during the rigor process in Sternomandibularis muscles. The measured optical scattering parameters were analyzed along with the simultaneously measured passive tension, pH value, and histology analysis. We found that the temporal changes of optical scattering, passive tension, pH value and fiber microstructures were closely correlated during the rigor process. These results suggested that sarcomere structure changes during rigor mortis can be monitored and characterized by optical scattering, which may find practical applications in predicting meat quality.

  16. Dimensional measurements with submicrometer uncertainty in production environment

    DEFF Research Database (Denmark)

    De Chiffre, L.; Gudnason, M. M.; Madruga, D.

    2015-01-01

    The work concerns a laboratory investigation of a method to achieve dimensional measurements with submicrometer uncertainty under conditions that are typical of a production environment. The method involves the concurrent determination of dimensions and material properties from measurements carried...... gauge blocks along with their uncertainties were estimated directly from the measurements. The length of the two workpieces at the reference temperature of 20 °C was extrapolated from the measurements and compared to certificate values. The investigations have documented that the developed approach...

  17. Continuous quantum measurements and the action uncertainty principle

    Science.gov (United States)

    Mensky, Michael B.

    1992-09-01

    The path-integral approach to the quantum theory of continuous measurements has been developed in preceding works of the author. According to this approach the measurement amplitude determining the probabilities of different outputs of the measurement can be evaluated in the form of a restricted path integral (a path integral “in finite limits”). With the help of the measurement amplitude, the maximum deviation of measurement outputs from the classical one can be easily determined. The aim of the present paper is to express this variance in the simpler and more transparent form of a specific uncertainty principle (called the action uncertainty principle, AUP). The simplest (but weak) form of the AUP is δS ≳ ℏ, where S is the action functional. It can be applied for a simple derivation of the Bohr-Rosenfeld inequality for the measurability of a gravitational field. A stronger (and more widely applicable) form of the AUP (for ideal measurements performed in the quantum regime) is |∫_{t′}^{t″} (δS[q]/δq(t)) Δq(t) dt| ≃ ℏ, where the paths [q] and [Δq] stand respectively for the measurement output and the measurement error. It can also be presented in the symbolic form Δ(Equation) Δ(Path) ≃ ℏ. This means that the deviation of the observed (measured) motion from that obeying the classical equation of motion is inversely proportional to the uncertainty in the path (the latter uncertainty resulting from the measurement error). A consequence of the AUP is that improving the measurement precision beyond the threshold of the quantum regime leads to decreasing information resulting from the measurement.

  18. Projected uranium measurement uncertainties for the Gas Centrifuge Enrichment Plant

    International Nuclear Information System (INIS)

    Younkin, J.M.

    1979-02-01

    An analysis was made of the uncertainties associated with the measurements of the declared uranium streams in the Portsmouth Gas Centrifuge Enrichment Plant (GCEP). The total uncertainty for the GCEP is projected to be from 54 to 108 kg ²³⁵U/year out of a measured total of 200,000 kg ²³⁵U/year. The systematic component of uncertainty of the UF₆ streams is the largest and the dominant contributor to the total uncertainty. A possible scheme for reducing the total uncertainty is given

  19. Measuring the uncertainty of tapping torque

    DEFF Research Database (Denmark)

    Belluco, Walter; De Chiffre, Leonardo

    An uncertainty budget is carried out for torque measurements performed at the Institut for Procesteknik for the evaluation of cutting fluids. Thirty test blanks were machined with one tool and one fluid, torque diagrams were recorded and the repeatability of single torque measurements was estimat...

  20. Development of rigor mortis is not affected by muscle volume.

    Science.gov (United States)

    Kobayashi, M; Ikegaya, H; Takase, I; Hatanaka, K; Sakurada, K; Iwase, H

    2001-04-01

    There is a hypothesis suggesting that rigor mortis progresses more rapidly in small muscles than in large muscles. We measured rigor mortis as tension determined isometrically in rat musculus erector spinae that had been cut into muscle bundles of various volumes. The muscle volume did not influence either the progress or the resolution of rigor mortis, which contradicts the hypothesis. Differences in pre-rigor load on the muscles influenced the onset and resolution of rigor mortis in a few pairs of samples, but did not influence the time taken for rigor mortis to reach its full extent after death. Moreover, the progress of rigor mortis in this muscle was biphasic; this may reflect the early rigor of red muscle fibres and the late rigor of white muscle fibres.

  1. A decision-oriented measure of uncertainty importance for use in PSA

    International Nuclear Information System (INIS)

    Poern, Kurt

    1997-01-01

    For the interpretation of the results of probabilistic risk assessments it is important to have measures which identify the basic events that contribute most to the frequency of the top event, but also to identify the basic events that are the main contributors to the uncertainty in this frequency. Both types of measures, often called the Importance Measure and the Measure of Uncertainty Importance, respectively, have been the subject of interest for many researchers in the reliability field. The most frequent mode of uncertainty analysis in connection with probabilistic risk assessment has been to propagate the uncertainty of all model parameters up to an uncertainty distribution for the top event frequency. Various uncertainty importance measures have been proposed in order to point out the parameters that in some sense are the main contributors to the top event distribution. The new measure of uncertainty importance suggested here goes a step further in that it has been developed within a decision theory framework, thereby providing an indication of the basic event for which it would be most valuable, from the decision-making point of view, to procure more information

  2. Uncertainty budget for optical coordinate measurements of circle diameter

    DEFF Research Database (Denmark)

    Morace, Renate Erica; Hansen, Hans Nørgaard; De Chiffre, Leonardo

    2004-01-01

    An uncertainty analysis for circle diameter measurements using a coordinate measuring machine (CMM) equipped with an optical probe is presented in this paper. A mathematical model for data evaluation and uncertainty assessment was formulated in accordance with Guide to the Expression of Uncertain...

  3. Quality of nuchal translucency measurements correlates with broader aspects of program rigor and culture of excellence.

    Science.gov (United States)

    Evans, Mark I; Krantz, David A; Hallahan, Terrence; Sherwin, John; Britt, David W

    2013-01-01

    To determine if nuchal translucency (NT) quality correlates with the extent to which clinics vary in rigor and quality control, we correlated NT performance quality (bias and precision) for 246,000 patients with two alternative measures of clinic culture: the % of cases for whom nasal bone (NB) measurements were performed and the % of requisitions correctly filled for race-ethnicity and weight. When requisition errors occurred in […] 5% (33%), the curve lowered to 0.93 MoM (p […] 90%, MoM was 0.99 compared to those […]. Quality exists independent of individual variation in NT quality, and two divergent indices of program rigor are associated with NT quality. Quality control must be program wide, and to effect continued improvement in the quality of NT results across time, the cultures of clinics must become a target for intervention. Copyright © 2013 S. Karger AG, Basel.

  4. Uncertainties for pressure-time efficiency measurements

    OpenAIRE

    Ramdal, Jørgen; Jonsson, Pontus; Dahlhaug, Ole Gunnar; Nielsen, Torbjørn; Cervantes, Michel

    2010-01-01

    In connection with the pressure-time project at the Norwegian University of Science and Technology and Luleå University of Technology, a number of tests with the pressure-time method have been performed at the Waterpower Laboratory in Trondheim, Norway. The aim is to lower the uncertainty and improve the usability of the method. A field test at the Anundsjoe power plant in Sweden has also been performed. The pressure-time measurement is affected by random uncertainty. To minimize the effect of t...

  5. Diffraction-based overlay measurement on dedicated mark using rigorous modeling method

    Science.gov (United States)

    Lu, Hailiang; Wang, Fan; Zhang, Qingyun; Chen, Yonghui; Zhou, Chang

    2012-03-01

    Diffraction Based Overlay (DBO) has been widely evaluated by numerous authors; results show that DBO can provide better performance than Imaging Based Overlay (IBO). However, DBO has its own problems. As is well known, modeling-based DBO (mDBO) faces challenges of low measurement sensitivity and crosstalk between various structure parameters, which may result in poor accuracy and precision. Meanwhile, the main obstacle encountered by empirical DBO (eDBO) is that a few pads must be employed to gain sufficient information on overlay-induced diffraction signature variations, which consumes more wafer space and measuring time. Also, eDBO may suffer from mark profile asymmetry caused by processes. In this paper, we propose an alternative DBO technology that employs a dedicated overlay mark and takes a rigorous modeling approach. This technology needs only two or three pads for each direction, which is economical and time saving. While the overlay measurement error induced by mark profile asymmetry is reduced, this technology is expected to be as accurate and precise as scatterometry technologies.

  6. Immersive Data Comprehension: Visualizing Uncertainty in Measurable Models

    Directory of Open Access Journals (Sweden)

    Pere eBrunet

    2015-09-01

    Recent advances in 3D scanning technologies have opened new possibilities in a broad range of applications including cultural heritage, medicine, civil engineering and urban planning. Virtual Reality systems can provide new tools to professionals that want to understand acquired 3D models. In this paper, we review the concept of data comprehension with an emphasis on visualization and inspection tools on immersive setups. We claim that in most application fields, data comprehension requires model measurements which in turn should be based on the explicit visualization of uncertainty. As 3D digital representations are not faithful, information on their fidelity at local level should be included in the model itself as uncertainty bounds. We propose the concept of Measurable 3D Models as digital models that explicitly encode local uncertainty bounds related to their quality. We claim that professionals and experts can strongly benefit from immersive interaction through new specific, fidelity-aware measurement tools which can facilitate 3D data comprehension. Since noise and processing errors are ubiquitous in acquired datasets, we discuss the estimation, representation and visualization of data uncertainty. We show that, based on typical user requirements in Cultural Heritage and other domains, application-oriented measuring tools in 3D models must consider uncertainty and local error bounds. We also discuss the requirements of immersive interaction tools for the comprehension of huge 3D and nD datasets acquired from real objects.

  7. A Unified Approach for Reporting ARM Measurement Uncertainties Technical Report: Updated in 2016

    Energy Technology Data Exchange (ETDEWEB)

    Sisterson, Douglas [Argonne National Lab. (ANL), Argonne, IL (United States)

    2017-01-15

    The U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Climate Research Facility is observationally based, and quantifying the uncertainty of its measurements is critically important. With over 300 widely differing instruments providing over 2,500 datastreams, concise expression of measurement uncertainty is quite challenging. ARM currently provides data and supporting metadata (information about the data or data quality) to its users through several sources. Because the continued success of the ARM Facility depends on the known quality of its measurements, ARM relies on Instrument Mentors and the ARM Data Quality Office to ensure, assess, and report measurement quality. Therefore, an easily accessible, well-articulated estimate of ARM measurement uncertainty is needed. This report is a continuation of the work presented by Campos and Sisterson (2015) and provides additional uncertainty information from instruments not available in their report. As before, a total measurement uncertainty has been calculated as a function of the instrument uncertainty (calibration factors), the field uncertainty (environmental factors), and the retrieval uncertainty (algorithm factors). This study will not expand on methods for computing these uncertainties. As before, it will focus on the practical identification, characterization, and inventory of the measurement uncertainties already available to the ARM community through the ARM Instrument Mentors and their ARM instrument handbooks. This study continues the first steps towards reporting ARM measurement uncertainty as: (1) identifying how the uncertainty of individual ARM measurements is currently expressed, (2) identifying a consistent approach to measurement uncertainty, and then (3) reclassifying ARM instrument measurement uncertainties in a common framework.

  8. Expanded and combined uncertainty in measurements by GM counters

    International Nuclear Information System (INIS)

    Stankovic, K.; Arandjic, D.; Lazarevic, Dj.; Osmokrovic, P.

    2007-01-01

    This paper deals with possible ways of obtaining the expanded and combined uncertainty in measurements for four types of GM counters with the same counter tube, in cases when the contributors to these uncertainties are cosmic background radiation and induced-overvoltage phenomena. Nowadays, as a consequence of electromagnetic radiation, the latter phenomenon is especially marked in urban environments. Based on the experimental results obtained, it has been established that the uncertainties of the influential random variables 'number of pulses from background radiation' and 'number of pulses induced by overvoltage' depend on the technological solution of the counter's reading system and contribute in different ways to the expanded and combined uncertainty in measurements of the applied types of GM counters. (author)
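
    A minimal sketch (not from the paper) of how such an expanded and combined uncertainty can be put together when the background and overvoltage contributions are treated as independent Poisson-counting terms; all count values and the coverage factor k = 2 are illustrative assumptions.

    ```python
    import math

    # Illustrative counts over one counting interval (assumed values, not from the paper)
    n_total = 5230        # total registered pulses
    n_background = 410    # pulses attributed to cosmic background radiation
    n_overvoltage = 95    # pulses attributed to induced-overvoltage events

    # Net counts from the source
    n_net = n_total - n_background - n_overvoltage

    # Standard uncertainties, assuming Poisson counting statistics for each contribution
    u_total = math.sqrt(n_total)
    u_background = math.sqrt(n_background)
    u_overvoltage = math.sqrt(n_overvoltage)

    # Combined standard uncertainty of the net counts (independent contributions, root sum of squares)
    u_combined = math.sqrt(u_total**2 + u_background**2 + u_overvoltage**2)

    # Expanded uncertainty with coverage factor k = 2 (approximately 95 % coverage)
    k = 2
    U_expanded = k * u_combined

    print(f"net counts = {n_net}, u_c = {u_combined:.1f}, U (k=2) = {U_expanded:.1f}")
    ```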

  9. Approximate Bayesian evaluations of measurement uncertainty

    Science.gov (United States)

    Possolo, Antonio; Bodnar, Olha

    2018-04-01

    The Guide to the Expression of Uncertainty in Measurement (GUM) includes formulas that produce an estimate of a scalar output quantity that is a function of several input quantities, and an approximate evaluation of the associated standard uncertainty. This contribution presents approximate, Bayesian counterparts of those formulas for the case where the output quantity is a parameter of the joint probability distribution of the input quantities, also taking into account any information about the value of the output quantity available prior to measurement expressed in the form of a probability distribution on the set of possible values for the measurand. The approximate Bayesian estimates and uncertainty evaluations that we present have a long history and illustrious pedigree, and provide sufficiently accurate approximations in many applications, yet are very easy to implement in practice. Differently from exact Bayesian estimates, which involve either (analytical or numerical) integrations, or Markov Chain Monte Carlo sampling, the approximations that we describe involve only numerical optimization and simple algebra. Therefore, they make Bayesian methods widely accessible to metrologists. We illustrate the application of the proposed techniques in several instances of measurement: isotopic ratio of silver in a commercial silver nitrate; odds of cryptosporidiosis in AIDS patients; height of a manometer column; mass fraction of chromium in a reference material; and potential-difference in a Zener voltage standard.
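
    The kind of calculation the authors describe - numerical optimization plus simple algebra in place of full Bayesian integration - can be illustrated with a generic Laplace (maximum a posteriori) approximation. The sketch below assumes a Gaussian prior and Gaussian observations and is not the authors' formulas: the posterior mode comes from an optimizer and the standard uncertainty from the curvature at the mode.

    ```python
    import numpy as np
    from scipy.optimize import minimize_scalar

    # Assumed repeated indications and prior knowledge (illustrative values only)
    observations = np.array([10.12, 10.08, 10.15, 10.11])   # repeated indications
    sigma_obs = 0.05                                         # known indication standard deviation
    prior_mean, prior_sd = 10.0, 0.2                         # prior knowledge about the measurand

    def negative_log_posterior(mu):
        # Gaussian likelihood for the observations plus a Gaussian prior on the measurand
        nll = 0.5 * np.sum((observations - mu) ** 2) / sigma_obs ** 2
        nlp = 0.5 * (mu - prior_mean) ** 2 / prior_sd ** 2
        return nll + nlp

    # Numerical optimization locates the posterior mode (the point estimate)
    mu_hat = minimize_scalar(negative_log_posterior).x

    # Simple algebra: for this model the curvature of the negative log posterior is known
    # in closed form, and its inverse square root is the standard uncertainty
    curvature = observations.size / sigma_obs ** 2 + 1.0 / prior_sd ** 2
    u_mu = 1.0 / np.sqrt(curvature)

    print(f"estimate = {mu_hat:.4f}, standard uncertainty u = {u_mu:.4f}")
    ```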

  10. Evaluating the uncertainty of input quantities in measurement models

    Science.gov (United States)

    Possolo, Antonio; Elster, Clemens

    2014-06-01

    The Guide to the Expression of Uncertainty in Measurement (GUM) gives guidance about how values and uncertainties should be assigned to the input quantities that appear in measurement models. This contribution offers a concrete proposal for how that guidance may be updated in light of the advances in the evaluation and expression of measurement uncertainty that were made in the course of the twenty years that have elapsed since the publication of the GUM, and also considering situations that the GUM does not yet contemplate. Our motivation is the ongoing conversation about a new edition of the GUM. While generally we favour a Bayesian approach to uncertainty evaluation, we also recognize the value that other approaches may bring to the problems considered here, and focus on methods for uncertainty evaluation and propagation that are widely applicable, including to cases that the GUM has not yet addressed. In addition to Bayesian methods, we discuss maximum-likelihood estimation, robust statistical methods, and measurement models where values of nominal properties play the same role that input quantities play in traditional models. We illustrate these general-purpose techniques in concrete examples, employing data sets that are realistic but that also are of conveniently small sizes. The supplementary material available online lists the R computer code that we have used to produce these examples (stacks.iop.org/Met/51/3/339/mmedia). Although we strive to stay close to clause 4 of the GUM, which addresses the evaluation of uncertainty for input quantities, we depart from it as we review the classes of measurement models that we believe are generally useful in contemporary measurement science. We also considerably expand and update the treatment that the GUM gives to Type B evaluations of uncertainty: reviewing the state-of-the-art, disciplined approach to the elicitation of expert knowledge, and its encapsulation in probability distributions that are usable in

  11. Physiological studies of muscle rigor mortis in the fowl

    International Nuclear Information System (INIS)

    Nakahira, S.; Kaneko, K.; Tanaka, K.

    1990-01-01

    A simple system was developed for continuous measurement of muscle contraction during rigor mortis. Longitudinal muscle strips dissected from the Peroneus Longus were suspended in a plastic tube containing liquid paraffin. Mechanical activity was transmitted to a strain-gauge transducer which is connected to a potentiometric pen-recorder. At the onset of measurement 1.2 g was loaded on the muscle strip. This model was used to study the muscle response to various treatments during rigor mortis. All measurements were carried out under the anaerobic condition at 17°C, except where otherwise stated. 1. The present system was found to be quite useful for continuous measurement of the course of rigor mortis. 2. Muscle contraction under the anaerobic condition at 17°C reached a peak about 2 hours after the onset of measurement and thereafter it relaxed at a slow rate. In contrast, the aerobic condition under a high humidity resulted in a strong rigor, about three times stronger than that in the anaerobic condition. 3. Ultrasonic treatment (37,000-47,000 Hz) at 25°C for 10 minutes resulted in a moderate muscle rigor. 4. Treatment of the muscle strip with 2 mM EGTA at 30°C for 30 minutes led to a relaxation of the muscle. 5. The muscle from birds killed during anesthesia with pentobarbital sodium showed a slow rate of rigor, whereas birds killed one day after hypophysectomy showed a quick muscle rigor as seen in intact controls. 6. A slight muscle rigor was observed when the muscle strip was placed in a refrigerator at 0°C for 18.5 hours and thereafter the temperature was kept at 17°C. (author)

  12. Optimal entropic uncertainty relation for successive measurements ...

    Indian Academy of Sciences (India)

    measurements in quantum information theory. M D SRINIVAS ... derived by Robertson in 1929 [2] from the first principles of quantum theory, does not ... systems and may hence be referred to as 'uncertainty relations for distinct measurements'.

  13. Assessing measurement uncertainty in meteorology in urban environments

    International Nuclear Information System (INIS)

    Curci, S; Lavecchia, C; Frustaci, G; Pilati, S; Paganelli, C; Paolini, R

    2017-01-01

    Measurement uncertainty in meteorology has been addressed in a number of recent projects. In urban environments, uncertainty is also affected by local effects which are more difficult to deal with than for synoptic stations. In Italy, beginning in 2010, an urban meteorological network (Climate Network ® ) was designed, set up and managed at national level according to high metrological standards and homogeneity criteria to support energy applications. The availability of such a high-quality operative automatic weather station network represents an opportunity to investigate the effects of station siting and sensor exposure and to estimate the related measurement uncertainty. An extended metadata set was established for the stations in Milan, including siting and exposure details. Statistical analysis on an almost 3-year-long operational period assessed network homogeneity, quality and reliability. Deviations from reference mean values were then evaluated in selected low-gradient local weather situations in order to investigate siting and exposure effects. In this paper the methodology is depicted and preliminary results of its application to air temperature discussed; this allowed the setting of an upper limit of 1 °C for the added measurement uncertainty at the top of the urban canopy layer. (paper)

  14. Assessing measurement uncertainty in meteorology in urban environments

    Science.gov (United States)

    Curci, S.; Lavecchia, C.; Frustaci, G.; Paolini, R.; Pilati, S.; Paganelli, C.

    2017-10-01

    Measurement uncertainty in meteorology has been addressed in a number of recent projects. In urban environments, uncertainty is also affected by local effects which are more difficult to deal with than for synoptic stations. In Italy, beginning in 2010, an urban meteorological network (Climate Network®) was designed, set up and managed at national level according to high metrological standards and homogeneity criteria to support energy applications. The availability of such a high-quality operative automatic weather station network represents an opportunity to investigate the effects of station siting and sensor exposure and to estimate the related measurement uncertainty. An extended metadata set was established for the stations in Milan, including siting and exposure details. Statistical analysis on an almost 3-year-long operational period assessed network homogeneity, quality and reliability. Deviations from reference mean values were then evaluated in selected low-gradient local weather situations in order to investigate siting and exposure effects. In this paper the methodology is depicted and preliminary results of its application to air temperature discussed; this allowed the setting of an upper limit of 1 °C for the added measurement uncertainty at the top of the urban canopy layer.

  15. Uncertainty of spatial straightness in 3D measurement

    International Nuclear Information System (INIS)

    Wang Jinxing; Jiang Xiangqian; Ma Limin; Xu Zhengao; Li Zhu

    2005-01-01

    The least-squares method is commonly employed to verify spatial straightness in actual three-dimensional measurement processes, but the uncertainty of the verification result is usually not given by coordinate measuring machines. According to the basic principle of spatial straightness least-squares verification and the uncertainty propagation formula given by ISO/TS 14253-2, a calculation method for the uncertainty of spatial straightness least-squares verification is proposed in this paper. In this method, the coefficients of the line equation are regarded as a statistical vector, so that the line equation, the result of the spatial straightness verification and the uncertainty of the result can be obtained after the expected value and covariance matrix of the vector are determined. The method not only assures the integrity of the verification result, but also accords with the requirements of the new generation of GPS standards, which can improve the reliability of verification.
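
    The paper propagates the covariance of the fitted line coefficients analytically following ISO/TS 14253-2; the sketch below illustrates the same end result by brute force, fitting a least-squares 3D line and propagating an assumed per-coordinate uncertainty by Monte Carlo instead. The point cloud, noise level and straightness definition (diameter of the smallest cylinder coaxial with the least-squares line) are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Assumed measured points (mm) along a nominally straight feature, with a slight bow
    t = np.linspace(0.0, 100.0, 25)
    points = np.column_stack([t, 0.002 * np.sin(t / 15.0), np.zeros_like(t)])
    points += rng.normal(scale=0.0005, size=points.shape)        # measurement noise

    u_xyz = 0.0005   # assumed standard uncertainty of each coordinate (mm)

    def straightness(pts):
        """Least-squares 3D line (centroid + principal direction via SVD); the straightness
        value is taken as the diameter of the smallest cylinder, coaxial with that line,
        containing all points."""
        centroid = pts.mean(axis=0)
        rel = pts - centroid
        direction = np.linalg.svd(rel)[2][0]
        radial = rel - np.outer(rel @ direction, direction)
        return 2.0 * np.linalg.norm(radial, axis=1).max()

    value = straightness(points)

    # Monte Carlo propagation of the coordinate uncertainty to the verification result
    samples = [straightness(points + rng.normal(scale=u_xyz, size=points.shape))
               for _ in range(2000)]
    u_value = np.std(samples, ddof=1)

    print(f"straightness = {1e3 * value:.2f} um, u = {1e3 * u_value:.2f} um (k = 1)")
    ```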

  16. Treatment of measurement uncertainties at the power burst facility

    International Nuclear Information System (INIS)

    Meyer, L.C.

    1980-01-01

    The treatment of measurement uncertainty at the Power Burst Facility provides a means of improving data integrity as well as meeting standard practice reporting requirements. This is accomplished by performing the uncertainty analysis in two parts: a test-independent uncertainty analysis and a test-dependent uncertainty analysis. The test-independent uncertainty analysis is performed on instrumentation used repeatedly from one test to the next, and does not have to be repeated for each test except for improved or new types of instruments. A test-dependent uncertainty analysis is performed on each test based on the test-independent uncertainties, modified as required by test specifications, experiment fixture design, and the historical performance of instruments on similar tests. The methodology for performing uncertainty analysis based on the National Bureau of Standards method is reviewed with examples applied to nuclear instrumentation.

  17. A Unified Approach for Reporting ARM Measurement Uncertainties Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Campos, E [Argonne National Laboratory; Sisterson, DL [Argonne National Laboratory

    2015-10-01

    The Atmospheric Radiation Measurement (ARM) Climate Research Facility is observationally based, and quantifying the uncertainty of its measurements is critically important. With over 300 widely differing instruments providing over 2,500 datastreams, concise expression of measurement uncertainty is quite challenging. The ARM Facility currently provides data and supporting metadata (information about the data or data quality) to its users through a number of sources. Because the continued success of the ARM Facility depends on the known quality of its measurements, the Facility relies on instrument mentors and the ARM Data Quality Office (DQO) to ensure, assess, and report measurement quality. Therefore, an easily-accessible, well-articulated estimate of ARM measurement uncertainty is needed.

  18. Assessing student understanding of measurement and uncertainty

    Science.gov (United States)

    Jirungnimitsakul, S.; Wattanakasiwich, P.

    2017-09-01

    The objectives of this study were to develop and assess student understanding of measurement and uncertainty. A test was adapted and translated from the Laboratory Data Analysis Instrument (LDAI) test; it consists of 25 questions focused on three topics: measures of central tendency, experimental errors and uncertainties, and fitting regression lines. The content validity of the test was evaluated by three physics experts in physics laboratory teaching. In the pilot study, the Thai LDAI was administered to 93 freshmen enrolled in a fundamental physics laboratory course. The final draft of the test was administered to three groups—45 freshmen taking fundamental physics laboratory, 16 sophomores taking intermediate physics laboratory and 21 juniors taking advanced physics laboratory at Chiang Mai University. As a result, we found that the freshmen had difficulties with experimental errors and uncertainties. Most students had problems with fitting regression lines. These results will be used to improve the teaching and learning of physics laboratory courses for physics students in the department.

  19. Evaluation of Sources of Uncertainties in Solar Resource Measurement

    Energy Technology Data Exchange (ETDEWEB)

    Habte, Aron M [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Sengupta, Manajit [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-09-25

    This poster presents a high-level overview of sources of uncertainties in solar resource measurement, demonstrating the impact of various sources of uncertainties -- such as cosine response, thermal offset, spectral response, and others -- on the accuracy of data from several radiometers. The study provides insight on how to reduce the impact of some of the sources of uncertainties.

  20. LOFT experimental measurements uncertainty analyses. Volume XX. Fluid-velocity measurement using pulsed-neutron activation

    International Nuclear Information System (INIS)

    Lassahn, G.D.; Taylor, D.J.N.

    1982-08-01

    Analyses of uncertainty components inherent in pulsed-neutron-activation (PNA) measurements in general and in the Loss-of-Fluid-Test (LOFT) system in particular are given. Due to the LOFT system's unique conditions, previously used techniques were modified to make the velocity measurement. These methods render a useful, cost-effective measurement with an estimated uncertainty of 11% of reading.

  1. Application of a virtual coordinate measuring machine for measurement uncertainty estimation of aspherical lens parameters

    International Nuclear Information System (INIS)

    Küng, Alain; Meli, Felix; Nicolet, Anaïs; Thalmann, Rudolf

    2014-01-01

    Tactile ultra-precise coordinate measuring machines (CMMs) are very attractive for accurately measuring optical components with high slopes, such as aspheres. The METAS µ-CMM, which exhibits a single point measurement repeatability of a few nanometres, is routinely used for measurement services of microparts, including optical lenses. However, estimating the measurement uncertainty is very demanding. Because of the many combined influencing factors, an analytic determination of the uncertainty of parameters that are obtained by numerical fitting of the measured surface points is almost impossible. The application of numerical simulation (Monte Carlo methods) using a parametric fitting algorithm coupled with a virtual CMM based on a realistic model of the machine errors offers an ideal solution to this complex problem: to each measurement data point, a simulated measurement variation calculated from the numerical model of the METAS µ-CMM is added. Repeated several hundred times, these virtual measurements deliver the statistical data for calculating the probability density function, and thus the measurement uncertainty for each parameter. Additionally, the eventual cross-correlation between parameters can be analyzed. This method can be applied for the calibration and uncertainty estimation of any parameter of the equation representing a geometric element. In this article, we present the numerical simulation model of the METAS µ-CMM and the application of a Monte Carlo method for the uncertainty estimation of measured asphere parameters. (paper)
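
    A stripped-down illustration of the virtual-CMM principle described above: a simulated measurement variation is added to each measured point, the geometry is re-fitted, and the repetitions yield the parameter distribution and hence its standard uncertainty. The sketch uses a simple least-squares circle fit and an uncorrelated point-error model as stand-ins for the METAS µ-CMM machine-error model, which is far more detailed.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Assumed measured points on a nominally circular section (mm)
    true_radius, n_pts = 5.000, 36
    angles = np.linspace(0.0, 2.0 * np.pi, n_pts, endpoint=False)
    measured = np.column_stack([true_radius * np.cos(angles), true_radius * np.sin(angles)])
    measured += rng.normal(scale=0.0002, size=measured.shape)    # single-point scatter

    def fit_circle(xy):
        """Algebraic (Kasa) least-squares circle fit; returns centre (a, b) and radius r."""
        x, y = xy[:, 0], xy[:, 1]
        design = np.column_stack([2.0 * x, 2.0 * y, np.ones_like(x)])
        rhs = x ** 2 + y ** 2
        a, b, c = np.linalg.lstsq(design, rhs, rcond=None)[0]
        return a, b, np.sqrt(c + a ** 2 + b ** 2)

    radius = fit_circle(measured)[2]

    # "Virtual CMM" step: perturb every point with a simulated measurement error and
    # re-fit many times; the spread of the fitted radii gives the standard uncertainty.
    u_point = 0.0002   # assumed per-point standard uncertainty (mm); METAS uses a full machine-error model
    radii = [fit_circle(measured + rng.normal(scale=u_point, size=measured.shape))[2]
             for _ in range(1000)]

    print(f"radius = {radius:.5f} mm, u(radius) = {np.std(radii, ddof=1):.5f} mm")
    ```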

  2. Waste receiving and processing drum weight measurement uncertainty review findings

    International Nuclear Information System (INIS)

    LANE, M.P.

    1999-01-01

    The purpose of reviewing the weight scale operation at the WRAP facility was to determine the uncertainty associated with weight measurements. Weight measurement uncertainty is needed to support WRAP Nondestructive Examination (NDE) and Nondestructive Assay (NDA) analysis.

  3. Assessment of dose measurement uncertainty using RisøScan

    DEFF Research Database (Denmark)

    Helt-Hansen, J.; Miller, A.

    2006-01-01

    The dose measurement uncertainty of the dosimeter system RisøScan, office scanner and Risø B3 dosimeters has been assessed by comparison with spectrophotometer measurements of the same dosimeters. The reproducibility and the combined uncertainty were found to be approximately 2% and 4%, respectively, at one standard deviation. The subroutine in RisøScan for electron energy measurement is shown to give results that are equivalent to the measurements with a scanning spectrophotometer. (c) 2006 Elsevier Ltd. All rights reserved.

  4. Putrefactive rigor: apparent rigor mortis due to gas distension.

    Science.gov (United States)

    Gill, James R; Landi, Kristen

    2011-09-01

    Artifacts due to decomposition may cause confusion for the initial death investigator, leading to an incorrect suspicion of foul play. Putrefaction is a microorganism-driven process that results in foul odor, skin discoloration, purge, and bloating. Various decompositional gases including methane, hydrogen sulfide, carbon dioxide, and hydrogen will cause the body to bloat. We describe 3 instances of putrefactive gas distension (bloating) that produced the appearance of inappropriate rigor, so-called putrefactive rigor. These gases may distend the body to an extent that the extremities extend and lose contact with their underlying support surface. The medicolegal investigator must recognize that this is not true rigor mortis and the body was not necessarily moved after death for this gravity-defying position to occur.

  5. The action uncertainty principle for continuous measurements

    Science.gov (United States)

    Mensky, Michael B.

    1996-02-01

    The action uncertainty principle (AUP) for the specification of the most probable readouts of continuous quantum measurements is proved, formulated in different forms and analyzed (for nonlinear as well as linear systems). Continuous monitoring of an observable A(p,q,t) with resolution Δa(t) is considered. The influence of the measurement process on the evolution of the measured system (quantum measurement noise) is presented by an additional term δF(t) A(p,q,t) in the Hamiltonian, where the function δF (generalized fictitious force) is restricted by the AUP ∫|δF(t)| Δa(t) dt ≲ ℎ and arbitrary otherwise. Quantum-nondemolition (QND) measurements are analyzed with the help of the AUP. A simple uncertainty relation for continuous quantum measurements is derived. It states that the area of a certain band in the phase space should be of the order of ℎ. The width of the band depends on the measurement resolution while its length is determined by the deviation of the system, due to the measurement, from classical behavior.

  6. The action uncertainty principle for continuous measurements

    International Nuclear Information System (INIS)

    Mensky, M.B.

    1996-01-01

    The action uncertainty principle (AUP) for the specification of the most probable readouts of continuous quantum measurements is proved, formulated in different forms and analyzed (for nonlinear as well as linear systems). Continuous monitoring of an observable A(p,q,t) with resolution Δa(t) is considered. The influence of the measurement process on the evolution of the measured system (quantum measurement noise) is presented by an additional term δF(t) A(p,q,t) in the Hamiltonian, where the function δF (generalized fictitious force) is restricted by the AUP ∫|δF(t)| Δa(t) dt ≲ ℎ and arbitrary otherwise. Quantum-nondemolition (QND) measurements are analyzed with the help of the AUP. A simple uncertainty relation for continuous quantum measurements is derived. It states that the area of a certain band in the phase space should be of the order of ℎ. The width of the band depends on the measurement resolution while its length is determined by the deviation of the system, due to the measurement, from classical behavior. (orig.)

  7. The estimation of uncertainty of radioactivity measurement on gamma counters in radiopharmacy

    International Nuclear Information System (INIS)

    Jovanovic, M.S.; Orlic, M.; Vranjes, S.; Stamenkovic, Lj. (E-mail address of corresponding author: nikijov@vin.bg.ac.yu)

    2005-01-01

    In this paper the estimation of the uncertainty of radioactivity measurements on gamma counters in the Laboratory for radioisotopes is presented. The uncertainty components that are important for these measurements are identified and taken into account while estimating the uncertainty of measurement. (author)

  8. Principles and applications of measurement and uncertainty analysis in research and calibration

    Energy Technology Data Exchange (ETDEWEB)

    Wells, C.V.

    1992-11-01

    Interest in Measurement Uncertainty Analysis has grown in the past several years as it has spread to new fields of application, and research and development of uncertainty methodologies have continued. This paper discusses the subject from the perspectives of both research and calibration environments. It presents a history of the development and an overview of the principles of uncertainty analysis embodied in the United States National Standard, ANSI/ASME PTC 19.1-1985, Measurement Uncertainty. Examples are presented in which uncertainty analysis was utilized or is needed to gain further knowledge of a particular measurement process and to characterize final results. Measurement uncertainty analysis provides a quantitative estimate of the interval about a measured value or an experiment result within which the true value of that quantity is expected to lie. Years ago, Harry Ku of the United States National Bureau of Standards stated that ``The informational content of the statement of uncertainty determines, to a large extent, the worth of the calibrated value.`` Today, that statement is just as true about calibration or research results as it was in 1968. Why is that true? What kind of information should we include in a statement of uncertainty accompanying a calibrated value? How and where do we get the information to include in an uncertainty statement? How should we interpret and use measurement uncertainty information? This discussion will provide answers to these and other questions about uncertainty in research and in calibration. The methodology to be described has been developed by national and international groups over the past nearly thirty years, and individuals were publishing information even earlier. Yet the work is largely unknown in many science and engineering arenas. I will illustrate various aspects of uncertainty analysis with some examples drawn from the radiometry measurement and calibration discipline from research activities.

  10. BOOK REVIEW: Evaluating the Measurement Uncertainty: Fundamentals and practical guidance

    Science.gov (United States)

    Lira, Ignacio

    2003-08-01

    Evaluating the Measurement Uncertainty is a book written for anyone who makes and reports measurements. It attempts to fill the gaps in the ISO Guide to the Expression of Uncertainty in Measurement, or the GUM, and does a pretty thorough job. The GUM was written with the intent of being applicable by all metrologists, from the shop floor to the National Metrology Institute laboratory; however, the GUM has often been criticized for its lack of user-friendliness because it is primarily filled with statements, but with little explanation. Evaluating the Measurement Uncertainty gives lots of explanations. It is well written and makes use of many good figures and numerical examples. Also important, this book is written by a metrologist from a National Metrology Institute, and therefore up-to-date ISO rules, style conventions and definitions are correctly used and supported throughout. The author sticks very closely to the GUM in topical theme and with frequent reference, so readers who have not read GUM cover-to-cover may feel as if they are missing something. The first chapter consists of a reprinted lecture by T J Quinn, Director of the Bureau International des Poids et Mesures (BIPM), on the role of metrology in today's world. It is an interesting and informative essay that clearly outlines the importance of metrology in our modern society, and why accurate measurement capability, and by definition uncertainty evaluation, should be so important. Particularly interesting is the section on the need for accuracy rather than simply reproducibility. Evaluating the Measurement Uncertainty then begins at the beginning, with basic concepts and definitions. The third chapter carefully introduces the concept of standard uncertainty and includes many derivations and discussion of probability density functions. The author also touches on Monte Carlo methods, calibration correction quantities, acceptance intervals or guardbanding, and many other interesting cases. The book goes

  11. Experimental evaluation of rigor mortis. III. Comparative study of the evolution of rigor mortis in different sized muscle groups in rats.

    Science.gov (United States)

    Krompecher, T; Fryc, O

    1978-01-01

    The use of new methods and an appropriate apparatus has allowed us to make successive measurements of rigor mortis and a study of its evolution in the rat. By a comparative examination on the front and hind limbs, we have determined the following: (1) The muscular mass of the hind limbs is 2.89 times greater than that of the front limbs. (2) In the initial phase rigor mortis is more pronounced in the front limbs. (3) The front and hind limbs reach maximum rigor mortis at the same time and this state is maintained for 2 hours. (4) Resolution of rigor mortis is accelerated in the front limbs during the initial phase, but both front and hind limbs reach complete resolution at the same time.

  12. Uncertainty information in climate data records from Earth observation

    Science.gov (United States)

    Merchant, Christopher J.; Paul, Frank; Popp, Thomas; Ablain, Michael; Bontemps, Sophie; Defourny, Pierre; Hollmann, Rainer; Lavergne, Thomas; Laeng, Alexandra; de Leeuw, Gerrit; Mittaz, Jonathan; Poulsen, Caroline; Povey, Adam C.; Reuter, Max; Sathyendranath, Shubha; Sandven, Stein; Sofieva, Viktoria F.; Wagner, Wolfgang

    2017-07-01

    The question of how to derive and present uncertainty information in climate data records (CDRs) has received sustained attention within the European Space Agency Climate Change Initiative (CCI), a programme to generate CDRs addressing a range of essential climate variables (ECVs) from satellite data. Here, we review the nature, mathematics, practicalities, and communication of uncertainty information in CDRs from Earth observations. This review paper argues that CDRs derived from satellite-based Earth observation (EO) should include rigorous uncertainty information to support the application of the data in contexts such as policy, climate modelling, and numerical weather prediction reanalysis. Uncertainty, error, and quality are distinct concepts, and the case is made that CDR products should follow international metrological norms for presenting quantified uncertainty. As a baseline for good practice, total standard uncertainty should be quantified per datum in a CDR, meaning that uncertainty estimates should clearly discriminate more and less certain data. In this case, flags for data quality should not duplicate uncertainty information, but instead describe complementary information (such as the confidence in the uncertainty estimate provided or indicators of conditions violating the retrieval assumptions). The paper discusses the many sources of error in CDRs, noting that different errors may be correlated across a wide range of timescales and space scales. Error effects that contribute negligibly to the total uncertainty in a single-satellite measurement can be the dominant sources of uncertainty in a CDR on the large space scales and long timescales that are highly relevant for some climate applications. For this reason, identifying and characterizing the relevant sources of uncertainty for CDRs is particularly challenging. The characterization of uncertainty caused by a given error effect involves assessing the magnitude of the effect, the shape of the

  13. Improved profile fitting and quantification of uncertainty in experimental measurements of impurity transport coefficients using Gaussian process regression

    International Nuclear Information System (INIS)

    Chilenski, M.A.; Greenwald, M.; Howard, N.T.; White, A.E.; Rice, J.E.; Walk, J.R.; Marzouk, Y.

    2015-01-01

    The need to fit smooth temperature and density profiles to discrete observations is ubiquitous in plasma physics, but the prevailing techniques for this have many shortcomings that cast doubt on the statistical validity of the results. This issue is amplified in the context of validation of gyrokinetic transport models (Holland et al 2009 Phys. Plasmas 16 052301), where the strong sensitivity of the code outputs to input gradients means that inadequacies in the profile fitting technique can easily lead to an incorrect assessment of the degree of agreement with experimental measurements. In order to rectify the shortcomings of standard approaches to profile fitting, we have applied Gaussian process regression (GPR), a powerful non-parametric regression technique, to analyse an Alcator C-Mod L-mode discharge used for past gyrokinetic validation work (Howard et al 2012 Nucl. Fusion 52 063002). We show that the GPR techniques can reproduce the previous results while delivering more statistically rigorous fits and uncertainty estimates for both the value and the gradient of plasma profiles with an improved level of automation. We also discuss how the use of GPR can allow for dramatic increases in the rate of convergence of uncertainty propagation for any code that takes experimental profiles as inputs. The new GPR techniques for profile fitting and uncertainty propagation are quite useful and general, and we describe the steps to implementation in detail in this paper. These techniques have the potential to substantially improve the quality of uncertainty estimates on profile fits and the rate of convergence of uncertainty propagation, making them of great interest for wider use in fusion experiments and modelling efforts. (paper)
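
    The authors use purpose-built GPR tooling for tokamak profiles; as a generic illustration of the underlying technique only, the sketch below fits noisy profile-like data with scikit-learn's Gaussian process regression and returns the fitted value with a pointwise standard deviation. The profile shape, noise level and kernel choice are assumptions, and the gradient shown is only a point estimate, not the rigorous gradient uncertainty discussed in the paper.

    ```python
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    rng = np.random.default_rng(0)

    # Assumed "measured" profile: a smooth shape plus random scatter
    rho = np.linspace(0.0, 1.0, 30)
    true_profile = 1.0 - 0.8 * rho ** 2
    observed = true_profile + rng.normal(scale=0.05, size=rho.size)

    # Kernel: a smooth squared-exponential term plus a white-noise term for the scatter
    kernel = 1.0 * RBF(length_scale=0.3) + WhiteKernel(noise_level=0.05 ** 2)
    gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
    gpr.fit(rho.reshape(-1, 1), observed)

    # Fitted profile and its pointwise standard deviation on a fine grid
    rho_fine = np.linspace(0.0, 1.0, 200).reshape(-1, 1)
    mean, std = gpr.predict(rho_fine, return_std=True)

    # Crude finite-difference gradient of the fitted mean (point estimate only)
    gradient = np.gradient(mean, rho_fine.ravel())

    print(f"fit at rho = 0.5: {mean[100]:.3f} +/- {std[100]:.3f}")
    ```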

  14. Uncertainty estimation with a small number of measurements, part II: a redefinition of uncertainty and an estimator method

    Science.gov (United States)

    Huang, Hening

    2018-01-01

    This paper is the second (Part II) in a series of two papers (Part I and Part II). Part I has quantitatively discussed the fundamental limitations of the t-interval method for uncertainty estimation with a small number of measurements. This paper (Part II) reveals that the t-interval is an ‘exact’ answer to a wrong question; it is actually misused in uncertainty estimation. This paper proposes a redefinition of uncertainty, based on the classical theory of errors and the theory of point estimation, and a modification of the conventional approach to estimating measurement uncertainty. It also presents an asymptotic procedure for estimating the z-interval. The proposed modification is to replace the t-based uncertainty with an uncertainty estimator (mean- or median-unbiased). The uncertainty estimator method is an approximate answer to the right question to uncertainty estimation. The modified approach provides realistic estimates of uncertainty, regardless of whether the population standard deviation is known or unknown, or if the sample size is small or large. As an application example of the modified approach, this paper presents a resolution to the Du-Yang paradox (i.e. Paradox 2), one of the three paradoxes caused by the misuse of the t-interval in uncertainty estimation.
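
    For context, the two conventional intervals being contrasted can be computed directly; the sketch below evaluates the t-based and z-based 95 % intervals for an assumed small sample. It shows only the standard intervals discussed in the paper, not the authors' proposed mean- or median-unbiased uncertainty estimator.

    ```python
    import numpy as np
    from scipy import stats

    # Assumed small sample of repeated measurements
    x = np.array([9.98, 10.02, 10.05, 9.99])
    n = x.size
    mean = x.mean()
    s = x.std(ddof=1)                 # sample standard deviation
    sem = s / np.sqrt(n)              # standard error of the mean

    # Conventional t-interval (95 %), the object whose use the paper questions
    t_half_width = stats.t.ppf(0.975, df=n - 1) * sem

    # z-interval (95 %), strictly valid when the population standard deviation is known;
    # the sample s is used here only for illustration
    z_half_width = stats.norm.ppf(0.975) * sem

    print(f"mean = {mean:.4f}")
    print(f"t-interval half-width = {t_half_width:.4f}")
    print(f"z-interval half-width = {z_half_width:.4f}")
    ```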

  15. Quantifying the measurement uncertainty of results from environmental analytical methods.

    Science.gov (United States)

    Moser, J; Wegscheider, W; Sperka-Gottlieb, C

    2001-07-01

    The Eurachem-CITAC Guide Quantifying Uncertainty in Analytical Measurement was put into practice in a public laboratory devoted to environmental analytical measurements. In doing so due regard was given to the provisions of ISO 17025 and an attempt was made to base the entire estimation of measurement uncertainty on available data from the literature or from previously performed validation studies. Most environmental analytical procedures laid down in national or international standards are the result of cooperative efforts and put into effect as part of a compromise between all parties involved, public and private, that also encompasses environmental standards and statutory limits. Central to many procedures is the focus on the measurement of environmental effects rather than on individual chemical species. In this situation it is particularly important to understand the measurement process well enough to produce a realistic uncertainty statement. Environmental analytical methods will be examined as far as necessary, but reference will also be made to analytical methods in general and to physical measurement methods where appropriate. This paper describes ways and means of quantifying uncertainty for frequently practised methods of environmental analysis. It will be shown that operationally defined measurands are no obstacle to the estimation process as described in the Eurachem/CITAC Guide if it is accepted that the dominating component of uncertainty comes from the actual practice of the method as a reproducibility standard deviation.

  16. The use of measurement uncertainty in nuclear materials accuracy and verification

    International Nuclear Information System (INIS)

    Alique, O.; Vaccaro, S.; Svedkauskaite, J.

    2015-01-01

    EURATOM nuclear safeguards are based on the nuclear operators’ accounting for and declaring of the amounts of nuclear materials in their possession, as well as on the European Commission verifying the correctness and completeness of such declarations by means of conformity assessment practices. Both the accountancy and the verification processes comprise the measurements of amounts and characteristics of nuclear materials. The uncertainties associated with these measurements play an important role in the reliability of the results of nuclear material accountancy and verification. The document “JCGM 100:2008 Evaluation of measurement data – Guide to the expression of uncertainty in measurement” - issued jointly by the International Bureau of Weights and Measures (BIPM) and international organisations for metrology, standardisation and accreditation in chemistry, physics and electrotechnology - describes a universal, internally consistent, transparent and applicable method for the evaluation and expression of uncertainty in measurements. This paper discusses different processes of nuclear materials accountancy and verification where measurement uncertainty plays a significant role. It also suggests the way measurement uncertainty could be used to enhance the reliability of the results of the nuclear materials accountancy and verification processes.

  17. Traceability and Measurement Uncertainty

    DEFF Research Database (Denmark)

    Tosello, Guido; De Chiffre, Leonardo

    2004-01-01

    This report is made as a part of the project ‘Metro-E-Learn: European e-Learning in Manufacturing Metrology’, an EU project under the program SOCRATES MINERVA (ODL and ICT in Education), Contract No: 101434-CP-1-2002-1-DE-MINERVA, coordinated by Friedrich-Alexander-University Erlangen... The project partnership (composed of 7 partners in 5 countries, thus covering a real European spread in high-tech production technology) aims to develop and implement an advanced e-learning system that integrates contributions from quite different disciplines into a user-centred approach that strictly... 8. Machine tool testing 9. The role of manufacturing metrology for QM 10. Inspection planning 11. Quality management of measurements incl. documentation 12. Advanced manufacturing measurement technology. The present report represents section 2 - Traceability and Measurement Uncertainty - of the e-learning...

  18. Inspection of freeform surfaces considering uncertainties in measurement, localization and surface reconstruction

    International Nuclear Information System (INIS)

    Mehrad, Vahid; Xue, Deyi; Gu, Peihua

    2013-01-01

    Inspection of a manufactured freeform surface can be conducted by building its surface model and comparing this manufactured surface model with the ideal design surface model and its tolerance requirement. The manufactured freeform surface model is usually achieved by obtaining measurement points on the manufactured surface, transforming these measurement points from the measurement coordinate system to the design coordinate system through localization, and reconstructing the surface model using the localized measurement points. In this research, a method was developed to estimate the locations and their variances of any selected points on the reconstructed freeform surface considering different sources of uncertainties in measurement, localization and surface reconstruction processes. In this method, first locations and variances of the localized measurement points are calculated considering uncertainties of the measurement points and uncertainties introduced in the localization processes. Then locations and variances of points on the reconstructed freeform surface are obtained considering uncertainties of the localized measurement points and uncertainties introduced in the freeform surface reconstruction process. Two case studies were developed to demonstrate how these three different uncertainty sources influence the quality of the reconstructed freeform curve and freeform surface in inspection. (paper)

  19. Uncertainty associated with the gravimetric measurement of particulate matter concentration in ambient air.

    Science.gov (United States)

    Lacey, Ronald E; Faulkner, William Brock

    2015-07-01

    This work applied a propagation of uncertainty method to typical total suspended particulate (TSP) sampling apparatus in order to estimate the overall measurement uncertainty. The objectives of this study were to estimate the uncertainty for three TSP samplers, develop an uncertainty budget, and determine the sensitivity of the total uncertainty to environmental parameters. The samplers evaluated were the TAMU High Volume TSP Sampler at a nominal volumetric flow rate of 1.42 m^3 min^-1 (50 CFM), the TAMU Low Volume TSP Sampler at a nominal volumetric flow rate of 17 L min^-1 (0.6 CFM) and the EPA TSP Sampler at the nominal volumetric flow rates of 1.1 and 1.7 m^3 min^-1 (39 and 60 CFM). Under nominal operating conditions the overall measurement uncertainty was found to vary from 6.1x10^-6 g m^-3 to 18.0x10^-6 g m^-3, which represented an uncertainty of 1.7% to 5.2% of the measurement. Analysis of the uncertainty budget determined that three of the instrument parameters contributed significantly to the overall uncertainty: the uncertainty in the pressure drop measurement across the orifice meter during both calibration and testing and the uncertainty of the airflow standard used during calibration of the orifice meter. Five environmental parameters occurring during field measurements were considered for their effect on overall uncertainty: ambient TSP concentration, volumetric airflow rate, ambient temperature, ambient pressure, and ambient relative humidity. Of these, only ambient TSP concentration and volumetric airflow rate were found to have a strong effect on the overall uncertainty. The technique described in this paper can be applied to other measurement systems and is especially useful where there are no methods available to generate these values empirically. This work addresses measurement uncertainty of TSP samplers used in ambient conditions. Estimation of uncertainty in gravimetric measurements is of particular interest, since as ambient particulate
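
    The propagation-of-uncertainty calculation behind such a budget can be sketched for the basic gravimetric relation C = Δm/(Q·t): with independent inputs, the relative standard uncertainties combine in quadrature. All numerical values below are assumed for illustration and are not entries from the paper's budget.

    ```python
    import math

    # Assumed inputs for a gravimetric TSP concentration measurement (illustrative values)
    delta_m = 0.10        # collected particulate mass, g
    u_delta_m = 1.0e-3    # standard uncertainty of the mass difference, g
    Q = 1.42              # volumetric flow rate, m^3/min
    u_Q = 0.03            # standard uncertainty of the flow rate, m^3/min
    t = 1440.0            # sampling time, min
    u_t = 1.0             # standard uncertainty of the sampling time, min

    # Measurement model: C = delta_m / (Q * t)
    C = delta_m / (Q * t)

    # For a product/quotient model the relative standard uncertainties add in quadrature
    rel_u = math.sqrt((u_delta_m / delta_m) ** 2 + (u_Q / Q) ** 2 + (u_t / t) ** 2)
    u_C = C * rel_u

    print(f"C = {C:.3e} g/m^3, u(C) = {u_C:.3e} g/m^3 ({100 * rel_u:.1f} %)")
    ```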

  20. Helium Mass Spectrometer Leak Detection: A Method to Quantify Total Measurement Uncertainty

    Science.gov (United States)

    Mather, Janice L.; Taylor, Shawn C.

    2015-01-01

    In applications where leak rates of components or systems are evaluated against a leak rate requirement, the uncertainty of the measured leak rate must be included in the reported result. However, in the helium mass spectrometer leak detection method, the sensitivity, or resolution, of the instrument is often the only component of the total measurement uncertainty noted when reporting results. To address this shortfall, a measurement uncertainty analysis method was developed that includes the leak detector unit's resolution, repeatability, hysteresis, and drift, along with the uncertainty associated with the calibration standard. In a step-wise process, the method identifies the bias and precision components of the calibration standard, the measurement correction factor (K-factor), and the leak detector unit. Together these individual contributions to error are combined and the total measurement uncertainty is determined using the root-sum-square method. It was found that the precision component contributes more to the total uncertainty than the bias component, but the bias component is not insignificant. For helium mass spectrometer leak rate tests where unit sensitivity alone is not enough, a thorough evaluation of the measurement uncertainty such as the one presented herein should be performed and reported along with the leak rate value.
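
    A compact sketch of the root-sum-square combination described above, in which the detector's resolution, repeatability, hysteresis and drift are combined with the calibration-standard and K-factor contributions. The contribution values are placeholders, not the paper's numbers.

    ```python
    import math

    # Assumed standard-uncertainty contributions, as relative values (fractions of reading)
    contributions = {
        "detector resolution": 0.02,
        "detector repeatability": 0.03,
        "detector hysteresis": 0.01,
        "detector drift": 0.015,
        "calibration standard (bias + precision)": 0.025,
        "K-factor determination": 0.02,
    }

    # Root-sum-square combination of independent contributions
    u_combined = math.sqrt(sum(u ** 2 for u in contributions.values()))

    # Expanded uncertainty with coverage factor k = 2
    U = 2.0 * u_combined

    measured_leak_rate = 1.0e-7   # assumed reading, std cm^3/s of helium
    print(f"combined relative uncertainty = {u_combined:.3f}")
    print(f"reported leak rate = {measured_leak_rate:.1e} +/- {U * measured_leak_rate:.1e} std cm^3/s (k=2)")
    ```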

  1. Uncertainty Propagation in OMFIT

    Science.gov (United States)

    Smith, Sterling; Meneghini, Orso; Sung, Choongki

    2017-10-01

    A rigorous comparison of power balance fluxes and turbulent model fluxes requires the propagation of uncertainties in the kinetic profiles and their derivatives. Making extensive use of the python uncertainties package, the OMFIT framework has been used to propagate covariant uncertainties to provide an uncertainty in the power balance calculation from the ONETWO code, as well as through the turbulent fluxes calculated by the TGLF code. The covariant uncertainties arise from fitting 1D (constant on flux surface) density and temperature profiles and associated random errors with parameterized functions such as a modified tanh. The power balance and model fluxes can then be compared with quantification of the uncertainties. No effort is made at propagating systematic errors. A case study will be shown for the effects of resonant magnetic perturbations on the kinetic profiles and fluxes at the top of the pedestal. A separate attempt at modeling the random errors with Monte Carlo sampling will be compared to the method of propagating the fitting function parameter covariant uncertainties. Work supported by US DOE under DE-FC02-04ER54698, DE-FG2-95ER-54309, DE-SC 0012656.
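
    The propagation step rests on the python uncertainties package; the sketch below shows the mechanism in miniature. correlated_values builds correlated uncertain parameters from a fit covariance matrix, and quantities derived from them carry the propagated standard deviation automatically. The modified-tanh parameter values and covariance are placeholders, not OMFIT or ONETWO output.

    ```python
    import numpy as np
    from uncertainties import correlated_values, umath

    # Assumed fit result for a modified-tanh profile parameterisation: nominal parameter
    # values and their covariance matrix (placeholders, not an actual fit)
    nominal = [1.0, 0.95, 0.04]                 # height, symmetry point, width
    cov = np.array([[1.0e-4, 1.5e-5, 0.0],
                    [1.5e-5, 4.0e-6, 0.0],
                    [0.0,    0.0,    1.0e-6]])
    height, center, width = correlated_values(nominal, cov)

    # Evaluate the profile and its analytic gradient at one radius; the covariant
    # parameter uncertainties propagate automatically through the arithmetic
    rho = 0.97
    z = (rho - center) / width
    profile = 0.5 * height * (1.0 - umath.tanh(z))
    gradient = -0.5 * (height / width) * (1.0 - umath.tanh(z) ** 2)

    print(f"T(rho=0.97) = {profile}")
    print(f"dT/drho     = {gradient}")
    ```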

  2. Uncertainty Analysis of OC5-DeepCwind Floating Semisubmersible Offshore Wind Test Campaign: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Robertson, Amy N [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-07-26

    This paper examines how to assess the uncertainty levels for test measurements of the Offshore Code Comparison, Continued, with Correlation (OC5)-DeepCwind floating offshore wind system, examined within the OC5 project. The goal of the OC5 project was to validate the accuracy of ultimate and fatigue load estimates from a numerical model of the floating semisubmersible using data measured during scaled tank testing of the system under wind and wave loading. The examination of uncertainty was done after the test, and it was found that the limited amount of data available did not allow for an acceptable uncertainty assessment. Therefore, this paper instead qualitatively examines the sources of uncertainty associated with this test to start a discussion of how to assess uncertainty for these types of experiments and to summarize what should be done during future testing to acquire the information needed for a proper uncertainty assessment. Foremost, future validation campaigns should initiate numerical modeling before testing to guide the test campaign, which should include a rigorous assessment of uncertainty, and perform validation during testing to ensure that the tests address all of the validation needs.

  3. The grey relational approach for evaluating measurement uncertainty with poor information

    International Nuclear Information System (INIS)

    Luo, Zai; Wang, Yanqing; Zhou, Weihu; Wang, Zhongyu

    2015-01-01

    The Guide to the Expression of Uncertainty in Measurement (GUM) is the master document for measurement uncertainty evaluation. However, the GUM may encounter problems and does not work well when the measurement data have poor information. In most cases, poor information means a small data sample and an unknown probability distribution. In these cases, the evaluation of measurement uncertainty has become a bottleneck in practical measurement. To solve this problem, a novel method called the grey relational approach (GRA), different from the statistical theory, is proposed in this paper. The GRA does not require a large sample size or probability distribution information of the measurement data. Mathematically, the GRA can be divided into three parts. Firstly, according to grey relational analysis, the grey relational coefficients between the ideal and the practical measurement output series are obtained. Secondly, the weighted coefficients and the measurement expectation function will be acquired based on the grey relational coefficients. Finally, the measurement uncertainty is evaluated based on grey modeling. In order to validate the performance of this method, simulation experiments were performed and the evaluation results show that the GRA can keep the average error around 5%. Besides, the GRA was also compared with the grey method, the Bessel method, and the Monte Carlo method by a real stress measurement. Both the simulation experiments and real measurement show that the GRA is appropriate and effective to evaluate the measurement uncertainty with poor information. (paper)
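
    As an illustration of the first of the three parts only, the sketch below computes Deng-style grey relational coefficients between an ideal (reference) series and a practical measurement series and averages them into a grade. Whether the paper uses exactly this coefficient formulation, and how it derives the weighted coefficients and measurement expectation function, is not reproduced here; the distinguishing coefficient rho = 0.5 and the data are assumptions.

    ```python
    import numpy as np

    # Assumed ideal (reference) output series and practical measurement series
    ideal = np.array([10.0, 10.0, 10.0, 10.0, 10.0])
    measured = np.array([10.03, 9.98, 10.05, 9.97, 10.01])

    rho = 0.5                                  # distinguishing coefficient (common default)

    # Absolute differences between the reference and comparison series
    delta = np.abs(ideal - measured)
    delta_min, delta_max = delta.min(), delta.max()

    # Deng's grey relational coefficient for each point of the series
    coefficients = (delta_min + rho * delta_max) / (delta + rho * delta_max)

    # Grey relational grade: a plain (equal-weight) average; the paper instead derives
    # weighted coefficients and a measurement expectation function from these values
    grade = coefficients.mean()

    print("grey relational coefficients:", np.round(coefficients, 3))
    print(f"grey relational grade = {grade:.3f}")
    ```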

  4. Application of a new importance measure for parametric uncertainty in PSA

    International Nuclear Information System (INIS)

    Poern, K.

    1997-04-01

    The traditional approach to uncertainty analysis in PSA, with propagation of basic event uncertainties through the PSA model, generates as an end product the uncertainty distribution of the top event frequency. This distribution, however, is not of much value for the decision maker. Most decisions are made under uncertainty. What the decision maker needs, to enhance the decision-making quality, is an adequate uncertainty importance measure that indicates for which basic parameters it would be most valuable - as to the quality of the decision making in the specific situation - to procure more information. This paper will describe an application of a new measure of uncertainty importance that has been developed in the ongoing joint Nordic project NKS/RAK-1:3. The measure is called ''decision oriented'' because it is defined within a decision theoretic framework. It is defined as the expected value of a certain additional information about each basic parameter, and utilizes both the system structure and the complete uncertainty distributions of the basic parameters. The measure provides the analyst and the decision maker with diagnostic information pointing to parameters on which more information would be most valuable to procure in order to enhance the decision-making quality. This uncertainty importance measure must not be confused with the more well-known, traditional importance measures of various kinds that are used to depict the contributions of each basic event or parameter (represented by point values) to the top event frequency. In this study the new measure is practically demonstrated through a real application on the top event: Water overflow through steam generator safety valves caused by steam generator tube rupture. This application object is one of the event sequences that the aforementioned Nordic project has analysed with an integrated approach. The project has been funded by the Swedish Nuclear Power

  5. Uncertainty in eddy covariance measurements and its application to physiological models

    Science.gov (United States)

    D.Y. Hollinger; A.D. Richardson; A.D. Richardson

    2005-01-01

    Flux data are noisy, and this uncertainty is largely due to random measurement error. Knowledge of uncertainty is essential for the statistical evaluation of modeled and measured fluxes, for comparison of parameters derived by fitting models to measured fluxes and in formal data-assimilation efforts. We used the difference between simultaneous measurements from two...

  6. Measurability of quantum fields and the energy-time uncertainty relation

    International Nuclear Information System (INIS)

    Mensky, Mikhail B

    2011-01-01

    Quantum restrictions on the measurability of an electromagnetic field strength and their relevance to the energy-time uncertainty relation are considered. The minimum errors in measuring electromagnetic field strengths, as they were estimated by the author (1988) in the framework of the phenomenological method of restricted path integral (RPI), are compared with the analogous estimates found by Landau and Peierls (1931) and by Bohr and Rosenfeld (1933) with the help of certain measurement setups. RPI-based restrictions, including those of Landau and Peierls as a special case, hold for any measuring schemes meeting the strict definition of measurement. Their fundamental nature is confirmed by the fact that their associated field detectability condition has the form of the energy-time uncertainty relation. The weaker restrictions suggested by Bohr and Rosenfeld rely on an extended definition of measurement. The energy-time uncertainty relation, which is the condition for the electromagnetic field to be detectable, is applied to the analysis of how the near-field scanning microscope works. (methodological notes)

  7. Comparison of two different methods for the uncertainty estimation of circle diameter measurements using an optical coordinate measuring machine

    DEFF Research Database (Denmark)

    Morace, Renata Erica; Hansen, Hans Nørgaard; De Chiffre, Leonardo

    2005-01-01

    This paper deals with the uncertainty estimation of measurements performed on optical coordinate measuring machines (CMMs). Two different methods were used to assess the uncertainty of circle diameter measurements using an optical CMM: the sensitivity analysis developing an uncertainty budget...

  8. Uncertainty assessment in gamma spectrometric measurements of plutonium isotope ratios and age

    Energy Technology Data Exchange (ETDEWEB)

    Ramebaeck, H., E-mail: henrik.ramebeck@foi.se [Swedish Defence Research Agency, FOI, Division of CBRN Defence and Security, SE-901 82 Umea (Sweden); Chalmers University of Technology, Department of Chemical and Biological Engineering, Nuclear Chemistry, SE-412 96 Goeteborg (Sweden); Nygren, U.; Tovedal, A. [Swedish Defence Research Agency, FOI, Division of CBRN Defence and Security, SE-901 82 Umea (Sweden); Ekberg, C.; Skarnemark, G. [Chalmers University of Technology, Department of Chemical and Biological Engineering, Nuclear Chemistry, SE-412 96 Goeteborg (Sweden)

    2012-09-15

    A method for the assessment of the combined uncertainty in gamma spectrometric measurements of plutonium composition and age was evaluated. Two materials were measured. Isotope dilution inductively coupled plasma sector field mass spectrometry (ID-ICP-SFMS) was used as a reference method for comparing the results obtained with the gamma spectrometric method for one of the materials. For this material (weapons grade plutonium) the measurement results of the two methods were in agreement for all measurands. Moreover, the combined uncertainties in all isotope ratios considered in this material (R{sub Pu238/Pu239}, R{sub Pu240/Pu239}, R{sub Pu241/Pu239}, and R{sub Am241/Pu241} for age determination) were limited by counting statistics. However, the combined uncertainty for the other material (fuel grade plutonium) was limited by the response fit, which shows that the uncertainty in the response function is important to include in the combined measurement uncertainty of gamma spectrometric measurements of plutonium.

  9. Evaluation of measuring results, statement of uncertainty in dosimeter calibrations

    International Nuclear Information System (INIS)

    Reich, H.

    1978-05-01

    The method described starts from the requirement that the quantitative statement of a measuring result in dosimetry should contain at least three figures: 1) the measured value or the best estimate of the quantity to be measured, 2) the uncertainty of this value, given as a figure that indicates a certain range around the measured value, and which is strongly linked with 3) a figure for the confidence level of this range, i.e. the probability that the (unknown) correct value is embraced by the given uncertainty range. How the figures 2) and 3) can be obtained and how they should be quoted in calibration certificates is the subject of these lectures. In addition, the means by which the method may be extended to determining the uncertainty of a measurement performed under conditions which deviate from the calibration conditions is briefly described. (orig.) [de]

  10. Systematic Evaluation of Uncertainty in Material Flow Analysis

    DEFF Research Database (Denmark)

    Laner, David; Rechberger, Helmut; Astrup, Thomas Fruergaard

    2014-01-01

    Material flow analysis (MFA) is a tool to investigate material flows and stocks in defined systems as a basis for resource management or environmental pollution control. Because of the diverse nature of sources and the varying quality and availability of data, MFA results are inherently uncertain. … Uncertainty analyses have received increasing attention in recent MFA studies, but systematic approaches for selection of appropriate uncertainty tools are missing. This article reviews existing literature related to handling of uncertainty in MFA studies and evaluates current practice of uncertainty analysis …) and exploratory MFA (identification of critical parameters and system behavior). Whereas mathematically simpler concepts focusing on data uncertainty characterization are appropriate for descriptive MFAs, statistical approaches enabling more-rigorous evaluation of uncertainty and model sensitivity are needed…

  11. Uncertainty Quantification for Monitoring of Civil Structures from Vibration Measurements

    Science.gov (United States)

    Döhler, Michael; Mevel, Laurent

    2014-05-01

    Health Monitoring of civil structures can be performed by detecting changes in the modal parameters of a structure, or more directly in the measured vibration signals. For a continuous monitoring the excitation of a structure is usually ambient, thus unknown and assumed to be noise. Hence, all estimates from the vibration measurements are realizations of random variables with inherent uncertainty due to (unknown) process and measurement noise and finite data length. In this talk, a strategy for quantifying the uncertainties of modal parameter estimates from a subspace-based system identification approach is presented and the importance of uncertainty quantification in monitoring approaches is shown. Furthermore, a damage detection method is presented, which is based on the direct comparison of the measured vibration signals without estimating modal parameters, while taking the statistical uncertainty in the signals correctly into account. The usefulness of both strategies is illustrated on data from a progressive damage action on a prestressed concrete bridge. References E. Carden and P. Fanning. Vibration based condition monitoring: a review. Structural Health Monitoring, 3(4):355-377, 2004. M. Döhler and L. Mevel. Efficient multi-order uncertainty computation for stochastic subspace identification. Mechanical Systems and Signal Processing, 38(2):346-366, 2013. M. Döhler, L. Mevel, and F. Hille. Subspace-based damage detection under changes in the ambient excitation statistics. Mechanical Systems and Signal Processing, 45(1):207-224, 2014.

  12. Triangular and Trapezoidal Fuzzy State Estimation with Uncertainty on Measurements

    Directory of Open Access Journals (Sweden)

    Mohammad Sadeghi Sarcheshmah

    2012-01-01

    Full Text Available In this paper, a new method for uncertainty analysis in fuzzy state estimation is proposed. The uncertainty is expressed in the measurements. Uncertainties in measurements are modelled with different fuzzy membership functions (triangular and trapezoidal). To find the fuzzy distribution of any state variable, the problem is formulated as a constrained linear programming (LP) optimization. The viability of the proposed method is verified by comparing its results with those obtained from the weighted least squares (WLS) and the fuzzy state estimation (FSE) approaches on the 6-bus system and on the IEEE 14- and 30-bus systems.

  13. Instrumentation-related uncertainty of reflectance and transmittance measurements with a two-channel spectrophotometer.

    Science.gov (United States)

    Peest, Christian; Schinke, Carsten; Brendel, Rolf; Schmidt, Jan; Bothe, Karsten

    2017-01-01

    Spectrophotometers are operated in numerous fields of science and industry for a variety of applications. In order to provide confidence for the measured data, analyzing the associated uncertainty is valuable. However, the uncertainty of the measurement results is often unknown or reduced to sample-related contributions. In this paper, we describe our approach for the systematic determination of the measurement uncertainty of the commercially available two-channel spectrophotometer Agilent Cary 5000 in accordance with the Guide to the expression of uncertainty in measurements. We focus on the instrumentation-related uncertainty contributions rather than the specific application and thus outline a general procedure which can be adapted for other instruments. Moreover, we discover a systematic signal deviation due to the inertia of the measurement amplifier and develop and apply a correction procedure. Thereby we increase the usable dynamic range of the instrument by more than one order of magnitude. We present methods for the quantification of the uncertainty contributions and combine them into an uncertainty budget for the device.

  14. Propagation of nuclear data uncertainties for fusion power measurements

    Directory of Open Access Journals (Sweden)

    Sjöstrand Henrik

    2017-01-01

    Full Text Available Neutron measurements using neutron activation systems are an essential part of the diagnostic system at large fusion machines such as JET and ITER. Nuclear data is used to infer the neutron yield. Consequently, high-quality nuclear data is essential for the proper determination of the neutron yield and fusion power. However, uncertainties due to nuclear data are not fully taken into account in uncertainty analysis for neutron yield calibrations using activation foils. This paper investigates the neutron yield uncertainty due to nuclear data using the so-called Total Monte Carlo Method. The work is performed using a detailed MCNP model of the JET fusion machine; the uncertainties due to the cross-sections and angular distributions in JET structural materials, as well as the activation cross-sections in the activation foils, are analysed. It is found that a significant contribution to the neutron yield uncertainty can come from uncertainties in the nuclear data.
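
    A minimal sketch of the Total Monte Carlo idea described above, under the assumption that one transport calculation is run per randomly perturbed nuclear data file; the per-file yields below are synthetic stand-ins, not results from the JET model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for running the transport model (e.g. one MCNP job)
# with the i-th random nuclear data file; here we simply draw plausible numbers.
def neutron_yield_with_random_file(i):
    return 1.0e16 * (1.0 + rng.normal(0.0, 0.02))

yields = np.array([neutron_yield_with_random_file(i) for i in range(300)])

mean_yield = yields.mean()
nd_uncertainty = yields.std(ddof=1)   # spread attributed to nuclear data
print(f"yield = {mean_yield:.3e} +/- {nd_uncertainty:.1e} (nuclear data contribution only)")
```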

  15. Uncertainty analysis of thermal quantities measurement in a centrifugal compressor

    Science.gov (United States)

    Hurda, Lukáš; Matas, Richard

    2017-09-01

    Compressor performance characteristics evaluation process based on the measurement of pressure, temperature and other quantities is examined to find uncertainties for directly measured and derived quantities. CFD is used as a tool to quantify the influences of different sources of uncertainty of measurements for single- and multi-thermocouple total temperature probes. The heat conduction through the body of the thermocouple probe and the heat-up of the air in the intake piping are the main phenomena of interest.

  16. SU-G-BRB-14: Uncertainty of Radiochromic Film Based Relative Dose Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Devic, S; Tomic, N; DeBlois, F; Seuntjens, J [McGill University, Montreal, QC (Canada); Lewis, D [RCF Consulting, LLC, Monroe, CT (United States); Aldelaijan, S [King Faisal Specialist Hospital & Research Center, Riyadh (Saudi Arabia)

    2016-06-15

    Purpose: Due to its inherently non-linear dose response, measurement of a relative dose distribution with radiochromic film requires measurement of absolute dose using a calibration curve following a previously established reference dosimetry protocol. On the other hand, a functional form that converts the inherently non-linear dose response curve of the radiochromic film dosimetry system into a linear one has been proposed recently [Devic et al, Med. Phys. 39 4850–4857 (2012)]. However, the question remains of what the uncertainty of a relative dose measured in this way would be. Methods: If the relative dose distribution is determined through the reference dosimetry system (conversion of the response into absolute dose using the calibration curve), the total uncertainty of the relative dose is calculated by summing in quadrature the total uncertainties of the doses measured at a given point and at the reference point. On the other hand, if the relative dose is determined using the linearization method, the new response variable is calculated as ζ = a(netOD)^n/ln(netOD). In this case, the total uncertainty in the relative dose is calculated by summing in quadrature the uncertainties of the new response function (σζ) at a given point and at the reference point. Results: Except at very low doses, where the measurement uncertainty dominates, the total relative dose uncertainty is less than 1% for the linear response method, compared to an almost 2% uncertainty level for the reference dosimetry method. The result is not surprising, bearing in mind that the total uncertainty of the reference dose method is dominated by the fitting uncertainty, which is mitigated in the case of the linearization method. Conclusion: Linearization of the radiochromic film dose response provides a convenient and more precise method for relative dose measurements, as it does not require reference dosimetry and the creation of a calibration curve. However, the linearity of the newly introduced function must be verified.
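
    A short sketch of the quadrature comparison described in the Methods section, using generic GUM propagation and hypothetical dose values (not data from the abstract).

```python
import math

# Relative dose r = D / D_ref; for uncorrelated inputs the relative
# uncertainties add in quadrature (first-order propagation, assumed values).
D, u_D = 1.50, 0.03          # dose at the point of interest (Gy) and its uncertainty
D_ref, u_Dref = 2.00, 0.04   # dose at the reference point (Gy) and its uncertainty

r = D / D_ref
u_r = r * math.sqrt((u_D / D) ** 2 + (u_Dref / D_ref) ** 2)
print(f"relative dose = {r:.3f} +/- {u_r:.3f}")

# The linearization route performs the same quadrature sum, but with the
# uncertainties of the new response zeta = a*(netOD)**n / ln(netOD)
# evaluated at the two points instead of the absolute doses.
```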

  17. Measurement uncertainty of liquid chromatographic analyses visualized by Ishikawa diagrams.

    Science.gov (United States)

    Meyer, Veronika R

    2003-09-01

    Ishikawa, or cause-and-effect, diagrams help to visualize the parameters that influence a chromatographic analysis. Therefore, they facilitate setting up the uncertainty budget of the analysis, which can then be expressed in mathematical form. If the uncertainty is calculated as the Gaussian sum of all uncertainty parameters, it is necessary to quantify them all, a task that is usually not practical. The other possible approach is to use the intermediate precision as a basis for the uncertainty calculation. In this case, it is at least necessary to consider the uncertainty of the purity of the reference material in addition to the precision data. The Ishikawa diagram is then very simple, and so is the uncertainty calculation. This advantage comes at the price of losing information about the parameters that influence the measurement uncertainty.

  18. Impact of Pitot tube calibration on the uncertainty of water flow rate measurement

    Science.gov (United States)

    de Oliveira Buscarini, Icaro; Costa Barsaglini, Andre; Saiz Jabardo, Paulo Jose; Massami Taira, Nilson; Nader, Gilder

    2015-10-01

    Water utility companies often use Cole type Pitot tubes to map velocity profiles and thus measure flow rate. Frequent monitoring and measurement of flow rate is an important step in identifying leaks and other types of losses. In Brazil losses as high as 42% are common, and in some places even higher values are found. When using Cole type Pitot tubes to measure the flow rate, the uncertainty of the calibration coefficient (Cd) is a major component of the overall flow rate measurement uncertainty. A common practice is to employ the usual value Cd = 0.869, in use since Cole proposed his Pitot tube in 1896. Analysis of 414 calibrations of Cole type Pitot tubes shows that Cd varies considerably and that expanded uncertainty values as high as 0.020 are common. Combined with other uncertainty sources, the overall velocity measurement uncertainty is 0.02, increasing the flow rate measurement uncertainty by 1.5%, which for the Sao Paulo metropolitan area (Brazil) corresponds to 3.5 × 10^7 m^3/year.
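
    As an illustration of how the Cd uncertainty enters the velocity estimate, a sketch using the standard Pitot relation v = Cd·sqrt(2·Δp/ρ); the numerical values are assumptions, not data from the calibrations discussed above.

```python
import math

# Standard Pitot relation: v = Cd * sqrt(2 * dp / rho)
Cd, u_Cd = 0.869, 0.010      # calibration coefficient and assumed standard uncertainty
dp, u_dp = 500.0, 5.0        # differential pressure (Pa) and its uncertainty
rho, u_rho = 998.0, 1.0      # water density (kg/m^3) and its uncertainty

v = Cd * math.sqrt(2.0 * dp / rho)

# First-order (GUM) propagation in relative terms, uncorrelated inputs assumed
rel_u_v = math.sqrt((u_Cd / Cd) ** 2
                    + (0.5 * u_dp / dp) ** 2
                    + (0.5 * u_rho / rho) ** 2)
print(f"v = {v:.3f} m/s, relative standard uncertainty = {100 * rel_u_v:.2f}%")
```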

  19. Universal uncertainty principle in the measurement operator formalism

    International Nuclear Information System (INIS)

    Ozawa, Masanao

    2005-01-01

    Heisenberg's uncertainty principle has been understood to set a limitation on measurements; however, the long-standing mathematical formulation established by Heisenberg, Kennard, and Robertson does not allow such an interpretation. Recently, a new relation was found to give a universally valid relation between noise and disturbance in general quantum measurements, and it has become clear that the new relation plays a role of the first principle to derive various quantum limits on measurement and information processing in a unified treatment. This paper examines the above development on the noise-disturbance uncertainty principle in the model-independent approach based on the measurement operator formalism, which is widely accepted to describe a class of generalized measurements in the field of quantum information. We obtain explicit formulae for the noise and disturbance of measurements given by measurement operators, and show that projective measurements do not satisfy the Heisenberg-type noise-disturbance relation that is typical in the gamma-ray microscope thought experiments. We also show that the disturbance on a Pauli operator of a projective measurement of another Pauli operator constantly equals √2, and examine how this measurement violates the Heisenberg-type relation but satisfies the new noise-disturbance relation

  20. Examples of measurement uncertainty evaluations in accordance with the revised GUM

    Science.gov (United States)

    Runje, B.; Horvatic, A.; Alar, V.; Medic, S.; Bosnjakovic, A.

    2016-11-01

    The paper presents examples of the evaluation of uncertainty components in accordance with the current and revised Guide to the expression of uncertainty in measurement (GUM). In accordance with the proposed revision of the GUM, a Bayesian approach was conducted for both type A and type B evaluations. The law of propagation of uncertainty (LPU) and the law of propagation of distributions, applied through the Monte Carlo method (MCM), were used to evaluate the associated standard uncertainties, expanded uncertainties and coverage intervals. Furthermore, the influence of a non-Gaussian dominant input quantity and an asymmetric distribution of the output quantity y on the evaluation of measurement uncertainty was analyzed. In the case when the coverage interval is not probabilistically symmetric, the coverage interval for the probability P is estimated from the experimental probability density function using the Monte Carlo method. Key highlights of the proposed revision of the GUM are analyzed through a set of examples.
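
    A minimal sketch of the propagation-of-distributions (MCM) route mentioned above, for a hypothetical measurement model Y = X1/X2 with assumed input distributions; the coverage interval is then read off the empirical distribution of the output.

```python
import numpy as np

rng = np.random.default_rng(1)
M = 200_000

# Hypothetical input quantities: X1 Gaussian (type A), X2 rectangular (type B)
x1 = rng.normal(10.0, 0.05, M)
x2 = rng.uniform(1.98, 2.02, M)

y = x1 / x2                                   # measurement model Y = X1 / X2

u_y = y.std(ddof=1)                           # standard uncertainty from the MC sample
lo, hi = np.percentile(y, [2.5, 97.5])        # probabilistically symmetric 95 % interval
print(f"y = {y.mean():.4f}, u(y) = {u_y:.4f}, 95 % interval = [{lo:.4f}, {hi:.4f}]")
```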

  1. Uncertainty Measures of Regional Flood Frequency Estimators

    DEFF Research Database (Denmark)

    Rosbjerg, Dan; Madsen, Henrik

    1995-01-01

    Regional flood frequency models have different assumptions regarding homogeneity and inter-site independence. Thus, uncertainty measures of T-year event estimators are not directly comparable. However, having chosen a particular method, the reliability of the estimate should always be stated, e...

  2. Establishing and maintaining a measurement uncertainty programme at the RPII dosimetry and calibration service

    International Nuclear Information System (INIS)

    Spain, D.; Currivan, L.; Fitzgerald, H.; Pollard, D.

    2005-01-01

    Full text: At the Dosimetry and Calibration Service of the Radiological Protection Institute of Ireland (RPII) approximately 70,000 thermoluminescent dosemeters (TLDs) are issued each year to monitor occupationally exposed workers in Ireland. In addition, the service offers a calibration service for radiation survey meters, contamination monitors and electronic personal dosemeters. In order to meet the requirements of ISO/IEC 17025, it is necessary to quantify the uncertainty of measurement using well defined concepts and to maintain an up to date estimate. In this work it is shown how the measurement uncertainty in the Dosimetry and Calibration Service has been estimated. When estimating the uncertainty of measurement, all uncertainty components which are of importance in the given situation are taken into account. The combined uncertainty of the system is determined by considering a number of systematic and random errors. The analysis includes the assumptions made, which have been documented and justified. Components of uncertainty were determined in accordance with documents such as IEC 61066, the Guide to the Expression of Uncertainty in Measurement, and the National Physical Laboratory Measurement Good Practice Guide No. 11, as appropriate. Results of intercomparisons are also presented, which adds confidence to the uncertainty estimate. Although a great deal of work is involved in estimating uncertainty in both laboratories, it is felt that a reasonable estimate of measurement uncertainty has been achieved given the available information. Furthermore, in keeping with the laboratory's commitment to continuous improvement, it is necessary to periodically evaluate the measurement uncertainties associated with the relevant procedures, and a programme for the future is outlined. (author)

  3. Comparison between bottom-up and top-down approaches in the estimation of measurement uncertainty.

    Science.gov (United States)

    Lee, Jun Hyung; Choi, Jee-Hye; Youn, Jae Saeng; Cha, Young Joo; Song, Woonheung; Park, Ae Ja

    2015-06-01

    Measurement uncertainty is a metrological concept to quantify the variability of measurement results. There are two approaches to estimate measurement uncertainty. In this study, we sought to provide practical and detailed examples of the two approaches and compare the bottom-up and top-down approaches to estimating measurement uncertainty. We estimated measurement uncertainty of the concentration of glucose according to CLSI EP29-A guideline. Two different approaches were used. First, we performed a bottom-up approach. We identified the sources of uncertainty and made an uncertainty budget and assessed the measurement functions. We determined the uncertainties of each element and combined them. Second, we performed a top-down approach using internal quality control (IQC) data for 6 months. Then, we estimated and corrected systematic bias using certified reference material of glucose (NIST SRM 965b). The expanded uncertainties at the low glucose concentration (5.57 mmol/L) by the bottom-up approach and top-down approaches were ±0.18 mmol/L and ±0.17 mmol/L, respectively (all k=2). Those at the high glucose concentration (12.77 mmol/L) by the bottom-up and top-down approaches were ±0.34 mmol/L and ±0.36 mmol/L, respectively (all k=2). We presented practical and detailed examples for estimating measurement uncertainty by the two approaches. The uncertainties by the bottom-up approach were quite similar to those by the top-down approach. Thus, we demonstrated that the two approaches were approximately equivalent and interchangeable and concluded that clinical laboratories could determine measurement uncertainty by the simpler top-down approach.
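
    A sketch of the top-down arithmetic described above, with hypothetical IQC and reference material figures: the within-laboratory imprecision from IQC is combined in quadrature with the uncertainty of the bias correction and expanded with k = 2.

```python
import math

# Hypothetical figures for a glucose measurement procedure (not the study's data)
u_Rw   = 0.07   # mmol/L, within-laboratory imprecision from 6 months of IQC
u_bias = 0.04   # mmol/L, standard uncertainty of the bias correction
                # (uncertainty of the certified value combined with the
                #  uncertainty of the mean of the replicate measurements)

u_c = math.sqrt(u_Rw ** 2 + u_bias ** 2)   # combined standard uncertainty
U = 2 * u_c                                # expanded uncertainty, k = 2 (~95 %)
print(f"u_c = {u_c:.3f} mmol/L, U (k=2) = {U:.2f} mmol/L")
```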

  4. Utilization of Software Tools for Uncertainty Calculation in Measurement Science Education

    International Nuclear Information System (INIS)

    Zangl, Hubert; Zine-Zine, Mariam; Hoermaier, Klaus

    2015-01-01

    Despite its importance, uncertainty is often neglected by practitioners in the design of systems, even in safety-critical applications. Thus, problems arising from uncertainty may only be identified late in the design process and thus lead to additional costs. Although numerous tools exist to support uncertainty calculation, reasons for their limited usage in early design phases may be low awareness of the existence of the tools and insufficient training in their practical application. We present a teaching philosophy that addresses uncertainty from the very beginning of teaching measurement science, in particular with respect to the utilization of software tools. The developed teaching material is based on the GUM method and makes use of uncertainty toolboxes in the simulation environment. Based on examples in measurement science education we discuss advantages and disadvantages of the proposed teaching philosophy and include feedback from students
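
    As a classroom-style illustration of the kind of tool-supported GUM calculation discussed above (not the authors' toolbox), the law of propagation of uncertainty for a simple resistance measurement R = V/I with assumed values:

```python
import math

# R = V / I with uncorrelated inputs (first-order GUM propagation)
V, u_V = 5.000, 0.002      # volts; hypothetical values
I, u_I = 0.1000, 0.0005    # amperes; hypothetical values

R = V / I
# Sensitivity coefficients: dR/dV = 1/I, dR/dI = -V/I**2
u_R = math.sqrt((u_V / I) ** 2 + (V * u_I / I ** 2) ** 2)
print(f"R = {R:.2f} ohm, u(R) = {u_R:.2f} ohm")
```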

  5. The impact of uncertainty on optimal emission policies

    Science.gov (United States)

    Botta, Nicola; Jansson, Patrik; Ionescu, Cezar

    2018-05-01

    We apply a computational framework for specifying and solving sequential decision problems to study the impact of three kinds of uncertainties on optimal emission policies in a stylized sequential emission problem. We find that uncertainties about the implementability of decisions on emission reductions (or increases) have a greater impact on optimal policies than uncertainties about the availability of effective emission reduction technologies and uncertainties about the implications of trespassing critical cumulated emission thresholds. The results show that uncertainties about the implementability of decisions on emission reductions (or increases) call for more precautionary policies. In other words, delaying emission reductions to the point in time when effective technologies will become available is suboptimal when these uncertainties are accounted for rigorously. By contrast, uncertainties about the implications of exceeding critical cumulated emission thresholds tend to make early emission reductions less rewarding.

  6. Two-point method uncertainty during control and measurement of cylindrical element diameters

    Science.gov (United States)

    Glukhov, V. I.; Shalay, V. V.; Radev, H.

    2018-04-01

    The article is devoted to the urgent problem of the reliability of measurements of the geometric specifications of technical products. The purpose of the article is to improve the quality of control of the linear sizes of parts by the two-point measurement method. Its task is to investigate the methodical extended uncertainties in measuring the linear sizes of cylindrical elements. The investigation method is geometric modeling of the shape and location deviations of the element surfaces in a rectangular coordinate system. The studies were carried out for elements of various service use, taking into account their informativeness, corresponding to the classes of kinematic pairs in theoretical mechanics and to the number of constrained degrees of freedom in the datum element function. Cylindrical elements with informativeness of 4, 2, 1 and 0 (zero) were investigated. The uncertainties of two-point measurements were estimated by comparing the results of linear size measurements with the maximum and minimum functional diameters of the element material. Methodical uncertainty arises when cylindrical elements with maximum informativeness have shape deviations of the cut and curvature types. Methodical uncertainty also arises when measuring the average size of an element, for all types of shape deviations. The two-point measurement method cannot take into account the location deviations of a dimensional element, so its use for elements with informativeness less than the maximum creates unacceptable methodical uncertainties in measurements of the maximum, minimum and average linear sizes. Similar methodical uncertainties also exist in the arbitration control of the linear sizes of cylindrical elements by limiting two-point gauges.

  7. NIS method for uncertainty estimation of airborne sound insulation measurement in field

    Directory of Open Access Journals (Sweden)

    El-Basheer Tarek M.

    2017-01-01

    Full Text Available In structures, airborne sound insulation is used to characterize the acoustic behaviour of the barriers between rooms. However, the assessment of the sound insulation index is sometimes difficult or even questionable, both in field and laboratory measurements, despite the unified measurement procedures specified in the ISO 140 series of standards. There are issues with the reproducibility and repeatability of the measurement results. Some difficulties may be caused by non-diffuse acoustic fields, non-uniform reverberation time, or errors in the reverberation time measurements. Minor issues are also posed by flanking transmission. This paper investigates the uncertainties of the above-mentioned measurement components and their impact on the combined uncertainty in 1/3-octave frequency bands. The total measurement uncertainty model comprises several partial uncertainties, which are evaluated by the type A or type B method. In addition, the determination of the sound reduction index as specified in ISO 140-4 has been performed.

  8. Using measurement uncertainty in decision-making and conformity assessment

    Science.gov (United States)

    Pendrill, L. R.

    2014-08-01

    Measurements often provide an objective basis for making decisions, perhaps when assessing whether a product conforms to requirements or whether one set of measurements differs significantly from another. There is increasing appreciation of the need to account for the role of measurement uncertainty when making decisions, so that a ‘fit-for-purpose’ level of measurement effort can be set prior to performing a given task. Better mutual understanding between the metrologist and those ordering such tasks about the significance and limitations of the measurements when making decisions of conformance will be especially useful. Decisions of conformity are, however, currently made in many important application areas, such as when addressing the grand challenges (energy, health, etc), without a clear and harmonized basis for sharing the risks that arise from measurement uncertainty between the consumer, supplier and third parties. In reviewing, in this paper, the state of the art of the use of uncertainty evaluation in conformity assessment and decision-making, two aspects in particular—the handling of qualitative observations and of impact—are considered key to bringing more order to the present diverse rules of thumb of more or less arbitrary limits on measurement uncertainty and percentage risk in the field. (i) Decisions of conformity can be made on a more or less quantitative basis—referred in statistical acceptance sampling as by ‘variable’ or by ‘attribute’ (i.e. go/no-go decisions)—depending on the resources available or indeed whether a full quantitative judgment is needed or not. There is, therefore, an intimate relation between decision-making, relating objects to each other in terms of comparative or merely qualitative concepts, and nominal and ordinal properties. (ii) Adding measures of impact, such as the costs of incorrect decisions, can give more objective and more readily appreciated bases for decisions for all parties concerned. Such
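
    A minimal sketch, under assumed numbers, of how measurement uncertainty enters a conformity decision: the probability that the measurand lies inside a tolerance, given a measured value and a Gaussian standard uncertainty, is computed and compared with a guard-banded acceptance rule.

```python
from statistics import NormalDist

# Hypothetical tolerance and measurement result
TL, TU = 9.9, 10.1        # lower and upper tolerance limits
y, u = 10.07, 0.02        # measured value and standard uncertainty

# Conformance probability assuming a Gaussian distribution for the measurand
dist = NormalDist(mu=y, sigma=u)
p_conf = dist.cdf(TU) - dist.cdf(TL)

# Guarded acceptance: shrink the acceptance zone by k*u to cap the consumer's risk
k = 1.64                  # one-sided 95 % guard band factor (an assumption)
accept = (TL + k * u) <= y <= (TU - k * u)
print(f"conformance probability = {p_conf:.3f}, guarded acceptance: {accept}")
```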

  9. Uncertainty evaluation in normalization of isotope delta measurement results against international reference materials.

    Science.gov (United States)

    Meija, Juris; Chartrand, Michelle M G

    2018-01-01

    Isotope delta measurements are normalized against international reference standards. Although multi-point normalization is becoming standard practice, the existing uncertainty evaluation practices are either undocumented or incomplete. For multi-point normalization, we present errors-in-variables regression models for explicit accounting of the measurement uncertainty of the international standards along with the uncertainty that is attributed to their assigned values. This manuscript presents a framework to account for the uncertainty that arises due to a small number of replicate measurements and discusses multi-laboratory data reduction while accounting for the inevitable correlations between laboratories due to the use of identical reference materials for calibration. Both frequentist and Bayesian methods of uncertainty analysis are discussed.
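
    For orientation only, a much-simplified two-standard normalization by ordinary least squares; the errors-in-variables treatment and the correlation bookkeeping described in the abstract go beyond this sketch, and all numbers are hypothetical.

```python
import numpy as np

# Assigned (certified) and measured delta values of two reference standards (per mil)
delta_true = np.array([0.0, -55.5])       # hypothetical assigned values
delta_meas = np.array([0.4, -54.1])       # hypothetical measured values

# Ordinary least-squares line: delta_true = a * delta_meas + b
a, b = np.polyfit(delta_meas, delta_true, 1)

sample_meas = -20.3                        # hypothetical sample measurement
sample_norm = a * sample_meas + b
print(f"normalized delta = {sample_norm:.2f} per mil")
# An errors-in-variables fit would additionally weight both axes by their
# uncertainties and propagate the covariance of (a, b) into the result.
```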

  10. Measurements of fusion neutron yields by neutron activation technique: Uncertainty due to the uncertainty on activation cross-sections

    Energy Technology Data Exchange (ETDEWEB)

    Stankunas, Gediminas, E-mail: gediminas.stankunas@lei.lt [Lithuanian Energy Institute, Laboratory of Nuclear Installation Safety, Breslaujos str. 3, LT-44403 Kaunas (Lithuania); EUROfusion Consortium, JET, Culham Science Centre, Abingdon OX14 3DB (United Kingdom); Batistoni, Paola [ENEA, Via E. Fermi, 45, 00044 Frascati, Rome (Italy); EUROfusion Consortium, JET, Culham Science Centre, Abingdon OX14 3DB (United Kingdom); Sjöstrand, Henrik; Conroy, Sean [Department of Physics and Astronomy, Uppsala University, PO Box 516, SE-75120 Uppsala (Sweden); EUROfusion Consortium, JET, Culham Science Centre, Abingdon OX14 3DB (United Kingdom)

    2015-07-11

    The neutron activation technique is routinely used in fusion experiments to measure the neutron yields. This paper investigates the uncertainty in these measurements arising from the uncertainties in dosimetry and activation reactions. For this purpose, activation cross-sections were taken from the International Reactor Dosimetry and Fusion File (IRDFF-v1.05) in 640-group ENDF-6 format for several reactions of interest for both 2.5 and 14 MeV neutrons. Activation coefficients (reaction rates) have been calculated using the neutron flux spectra at the JET vacuum vessel, for both DD and DT plasmas, calculated by MCNP in the required 640-energy-group format. The related uncertainties for the JET neutron spectra are evaluated as well using the covariance data available in the library. These uncertainties are in general small, but not negligible when high accuracy is required in the determination of the fusion neutron yields.

  11. A new computational method of a moment-independent uncertainty importance measure

    International Nuclear Information System (INIS)

    Liu Qiao; Homma, Toshimitsu

    2009-01-01

    For a risk assessment model, the uncertainty in input parameters is propagated through the model and leads to the uncertainty in the model output. The study of how the uncertainty in the output of a model can be apportioned to the uncertainty in the model inputs is the job of sensitivity analysis. Saltelli [Sensitivity analysis for importance assessment. Risk Analysis 2002;22(3):579-90] pointed out that a good sensitivity indicator should be global, quantitative and model free. Borgonovo [A new uncertainty importance measure. Reliability Engineering and System Safety 2007;92(6):771-84] further extended these three requirements by adding a fourth feature, moment-independence, and proposed a new sensitivity measure, δi. It evaluates the influence of the input uncertainty on the entire output distribution without reference to any specific moment of the model output. In this paper, a new computational method of δi is proposed. It is conceptually simple and easier to implement. The feasibility of this new method is proved by applying it to two examples.
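
    For reference (standard definition, not reproduced from the paper itself), Borgonovo's moment-independent measure for an input Xi is usually written as below, where f_Y is the unconditional density of the model output and f_{Y|Xi} the density conditional on fixing Xi:

```latex
\delta_i \;=\; \frac{1}{2}\,\mathbb{E}_{X_i}\!\left[\,\int \bigl|\, f_Y(y) - f_{Y\mid X_i}(y) \,\bigr|\,\mathrm{d}y \right]
```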

  12. Guide to the expression of uncertainty in measurements

    Energy Technology Data Exchange (ETDEWEB)

    Mathew, Kattathu Joseph [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-10-19

    The enabling objectives of this presentation are to: Provide a working knowledge of the ISO GUM method to estimation of uncertainties in safeguards measurements; Introduce GUM terminology; Provide brief historical background of the GUM methodology; Introduce GUM Workbench software; Isotope ratio measurements by MS will be discussed in the next session.

  13. Impact of measurement uncertainty from experimental load distribution factors on bridge load rating

    Science.gov (United States)

    Gangone, Michael V.; Whelan, Matthew J.

    2018-03-01

    Load rating and testing of highway bridges is important in determining the capacity of the structure. Experimental load rating utilizes strain transducers placed at critical locations of the superstructure to measure normal strains. These strains are then used in computing diagnostic performance measures (neutral axis of bending, load distribution factor) and ultimately a load rating. However, it has been shown that experimentally obtained strain measurements contain uncertainties associated with the accuracy and precision of the sensor and sensing system. These uncertainties propagate through to the diagnostic indicators and in turn into the load rating calculation. This paper analyzes the effect that measurement uncertainties have on the experimental load rating results of a 3-span multi-girder/stringer steel and concrete bridge. The focus of this paper is limited to the uncertainty associated with the experimental distribution factor estimate. For the testing discussed, strain readings were gathered at the midspan of each span on both exterior girders and the center girder. Test vehicles of known weight were positioned at specified locations on each span to generate the maximum strain response for each of the five girders. The strain uncertainties were used in conjunction with a propagation formula developed by the authors to determine the standard uncertainty in the distribution factor estimates. This distribution factor uncertainty is then introduced into the load rating computation to determine the possible range of the load rating. The results show the importance of understanding measurement uncertainty in experimental load testing.
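
    A sketch of a strain-based distribution factor estimate and a generic first-order propagation of the strain uncertainty into it; the authors' own propagation formula is not reproduced here, so the partial-derivative form and the strain readings below are stand-ins.

```python
import numpy as np

# Hypothetical midspan strain readings (microstrain) for five girders
eps = np.array([110.0, 180.0, 210.0, 175.0, 105.0])
u_eps = np.full_like(eps, 2.0)            # assumed standard uncertainty per sensor

S = eps.sum()
DF = eps / S                              # experimental distribution factors

# First-order propagation for girder i: dDF_i/deps_i = (S - eps_i)/S^2,
# dDF_i/deps_j = -eps_i/S^2 for j != i (uncorrelated sensors assumed)
u_DF = np.empty_like(DF)
for i in range(len(eps)):
    grad = -eps[i] / S**2 * np.ones_like(eps)
    grad[i] = (S - eps[i]) / S**2
    u_DF[i] = np.sqrt(np.sum((grad * u_eps) ** 2))

print(np.round(DF, 4))    # distribution factors
print(np.round(u_DF, 5))  # their standard uncertainties
```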

  14. Disaggregating measurement uncertainty from population variability and Bayesian treatment of uncensored results

    International Nuclear Information System (INIS)

    Strom, Daniel J.; Joyce, Kevin E.; Maclellan, Jay A.; Watson, David J.; Lynch, Timothy P.; Antonio, Cheryl L.; Birchall, Alan; Anderson, Kevin K.; Zharov, Peter

    2012-01-01

    In making low-level radioactivity measurements of populations, it is commonly observed that a substantial portion of net results are negative. Furthermore, the observed variance of the measurement results arises from a combination of measurement uncertainty and population variability. This paper presents a method for disaggregating measurement uncertainty from population variability to produce a probability density function (PDF) of possibly true results. To do this, simple, justifiable, and reasonable assumptions are made about the relationship of the measurements to the measurands (the 'true values'). The measurements are assumed to be unbiased, that is, that their average value is the average of the measurands. Using traditional estimates of each measurement's uncertainty to disaggregate population variability from measurement uncertainty, a PDF of measurands for the population is produced. Then, using Bayes's theorem, the same assumptions, and all the data from the population of individuals, a prior PDF is computed for each individual's measurand. These PDFs are non-negative, and their average is equal to the average of the measurement results for the population. The uncertainty in these Bayesian posterior PDFs is all Berkson with no remaining classical component. The methods are applied to baseline bioassay data from the Hanford site. The data include 90Sr urinalysis measurements on 128 people, 137Cs in vivo measurements on 5,337 people, and 239Pu urinalysis measurements on 3,270 people. The method produces excellent results for the 90Sr and 137Cs measurements, since there are nonzero concentrations of these global fallout radionuclides in people who have not been occupationally exposed. The method does not work for the 239Pu measurements in non-occupationally exposed people because the population average is essentially zero.

  15. Measurements of the degree of development of rigor mortis as an indicator of stress in slaughtered pigs.

    Science.gov (United States)

    Warriss, P D; Brown, S N; Knowles, T G

    2003-12-13

    The degree of development of rigor mortis in the carcases of slaughter pigs was assessed subjectively on a three-point scale 35 minutes after they were exsanguinated, and related to the levels of cortisol, lactate and creatine kinase in blood collected at exsanguination. Earlier rigor development was associated with higher concentrations of these stress indicators in the blood. This relationship suggests that the mean rigor score, and the frequency distribution of carcases that had or had not entered rigor, could be used as an index of the degree of stress to which the pigs had been subjected.

  16. Estimation of the breaking of rigor mortis by myotonometry.

    Science.gov (United States)

    Vain, A; Kauppila, R; Vuori, E

    1996-05-31

    Myotonometry was used to detect the breaking of rigor mortis. The myotonometer is a new instrument which measures the decaying oscillations of a muscle after a brief mechanical impact. The method gives two numerical parameters for rigor mortis, namely the period and the decrement of the oscillations, both of which depend on the time elapsed after death. When rigor mortis was broken by lengthening the muscle, both the oscillation period and the decrement decreased, whereas shortening the muscle caused the opposite changes. Fourteen hours after breaking, the stiffness characteristics, i.e. the oscillation periods, of the right and left m. biceps brachii had become similar. However, the values for the decrement of the muscle, reflecting the dissipation of mechanical energy, maintained their differences.

  17. A systematic approach to the modelling of measurements for uncertainty evaluation

    International Nuclear Information System (INIS)

    Sommer, K D; Weckenmann, A; Siebert, B R L

    2005-01-01

    The evaluation of measurement uncertainty is based on knowledge about both the measuring process and the quantities which influence the measurement result. The knowledge about the measuring process is represented by the model equation, which expresses the interrelation between the measurand and the input quantities. Therefore, the modelling of the measurement is a key element of modern uncertainty evaluation. A modelling concept has been developed that is based on the idea of the measuring chain and manages with only a few generic model structures. From this concept, a practical stepwise procedure has been derived

  18. Rigorous results on measuring the quark charge below color threshold

    International Nuclear Information System (INIS)

    Lipkin, H.J.

    1979-01-01

    Rigorous theorems are presented showing that contributions from a color nonsinglet component of the current to matrix elements of a second order electromagnetic transition are suppressed by factors inversely proportional to the energy of the color threshold. Parton models which obtain matrix elements proportional to the color average of the square of the quark charge are shown to neglect terms of the same order of magnitude as terms kept. (author)

  19. Uncertainty analysis of NDA waste measurements using computer simulations

    International Nuclear Information System (INIS)

    Blackwood, L.G.; Harker, Y.D.; Yoon, W.Y.; Meachum, T.R.

    2000-01-01

    Uncertainty assessments for nondestructive radioassay (NDA) systems for nuclear waste are complicated by factors extraneous to the measurement systems themselves. Most notably, characteristics of the waste matrix (e.g., homogeneity) and radioactive source material (e.g., particle size distribution) can have great effects on measured mass values. Under these circumstances, characterizing the waste population is as important as understanding the measurement system in obtaining realistic uncertainty values. When extraneous waste characteristics affect measurement results, the uncertainty results are waste-type specific. The goal becomes to assess the expected bias and precision for the measurement of a randomly selected item from the waste population of interest. Standard propagation-of-errors methods for uncertainty analysis can be very difficult to implement in the presence of significant extraneous effects on the measurement system. An alternative approach that naturally includes the extraneous effects is as follows: (1) Draw a random sample of items from the population of interest; (2) Measure the items using the NDA system of interest; (3) Establish the true quantity being measured using a gold standard technique; and (4) Estimate bias by deriving a statistical regression model comparing the measurements on the system of interest to the gold standard values; similar regression techniques for modeling the standard deviation of the difference values gives the estimated precision. Actual implementation of this method is often impractical. For example, a true gold standard confirmation measurement may not exist. A more tractable implementation is obtained by developing numerical models for both the waste material and the measurement system. A random sample of simulated waste containers generated by the waste population model serves as input to the measurement system model. This approach has been developed and successfully applied to assessing the quantity of

  20. Uncertainty in relative energy resolution measurements

    International Nuclear Information System (INIS)

    Volkovitsky, P.; Yen, J.; Cumberland, L.

    2007-01-01

    We suggest a new method for the determination of the detector relative energy resolution and its uncertainty based on spline approximation of experimental spectra and a statistical bootstrapping procedure. The proposed method is applied to the spectra obtained with NaI(Tl) scintillating detectors and 137Cs sources. The spectrum histogram with background subtracted channel-by-channel is modeled by cubic spline approximation. The relative energy resolution (which is also known as pulse height resolution and energy resolution), defined as the full-width at half-maximum (FWHM) divided by the value of the peak centroid, is calculated using the intercepts of the spline curve with the line of the half peak height. The value of the peak height is determined as the point where the value of the derivative goes to zero. The residuals, which are normalized over the square root of counts in a given bin (y-coordinate), obey the standard Gaussian distribution. The values of these residuals are randomly re-assigned to a different set of y-coordinates, where a new 'pseudo-experimental' data set is obtained after 'de-normalization' of the old values. For this new data set a new spline approximation is found and the whole procedure is repeated several hundred times, until the standard deviation of the relative energy resolution becomes stabilized. The standard deviation of the relative energy resolutions calculated for each 'pseudo-experimental' data set (bootstrap uncertainty) is considered to be an estimate for the relative energy resolution uncertainty. It is also shown that the relative bootstrap uncertainty is proportional to, and generally only two to three times bigger than, 1/√(N_tot), which is the relative statistical count uncertainty (N_tot is the total number of counts under the peak). The newly suggested method is also applicable to other radiation and particle detectors, not only for relative energy resolution, but also for any of the other parameters in a measured spectrum, like
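
    A compressed sketch of the bootstrap procedure described above, on synthetic data: fit a smooth curve, form count-normalized residuals, reshuffle them onto the fit, and record the relative resolution of each pseudo-experiment. The smoothing-spline settings, peak shape and bin count are assumptions, not the authors' choices.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(2)

# Synthetic, background-subtracted peak (counts per channel)
x = np.arange(200.0)
true = 5000.0 * np.exp(-0.5 * ((x - 100.0) / 12.0) ** 2)
counts = rng.poisson(true).astype(float)

def relative_resolution(y):
    """FWHM divided by the peak centroid of a smoothing-spline fit."""
    spl = UnivariateSpline(x, y, s=len(x) * y.mean())
    xf = np.linspace(x[0], x[-1], 5000)
    yf = spl(xf)
    peak, centroid = yf.max(), xf[yf.argmax()]
    above = xf[yf >= 0.5 * peak]              # half-maximum crossings
    return (above[-1] - above[0]) / centroid

r0 = relative_resolution(counts)
fit = UnivariateSpline(x, counts, s=len(x) * counts.mean())(x)
norm_res = (counts - fit) / np.sqrt(np.clip(fit, 1.0, None))  # count-normalized residuals

boot = []
for _ in range(200):
    resampled = fit + rng.permutation(norm_res) * np.sqrt(np.clip(fit, 1.0, None))
    boot.append(relative_resolution(resampled))

print(f"R = {r0:.4f} +/- {np.std(boot, ddof=1):.4f} (bootstrap)")
```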

  1. Particle image velocimetry correlation signal-to-noise ratio metrics and measurement uncertainty quantification

    International Nuclear Information System (INIS)

    Xue, Zhenyu; Charonko, John J; Vlachos, Pavlos P

    2014-01-01

    In particle image velocimetry (PIV) the measurement signal is contained in the recorded intensity of the particle image pattern superimposed on a variety of noise sources. The signal-to-noise-ratio (SNR) strength governs the resulting PIV cross correlation and ultimately the accuracy and uncertainty of the resulting PIV measurement. Hence we posit that correlation SNR metrics calculated from the correlation plane can be used to quantify the quality of the correlation and the resulting uncertainty of an individual measurement. In this paper we extend the original work by Charonko and Vlachos and present a framework for evaluating the correlation SNR using a set of different metrics, which in turn are used to develop models for uncertainty estimation. Several corrections have been applied in this work. The SNR metrics and corresponding models presented herein are expanded to be applicable to both standard and filtered correlations by applying a subtraction of the minimum correlation value to remove the effect of the background image noise. In addition, the notion of a ‘valid’ measurement is redefined with respect to the correlation peak width in order to be consistent with uncertainty quantification principles and distinct from an ‘outlier’ measurement. Finally the type and significance of the error distribution function is investigated. These advancements lead to more robust and reliable uncertainty estimation models compared with the original work by Charonko and Vlachos. The models are tested against both synthetic benchmark data as well as experimental measurements. In this work, U 68.5 uncertainties are estimated at the 68.5% confidence level while U 95 uncertainties are estimated at 95% confidence level. For all cases the resulting calculated coverage factors approximate the expected theoretical confidence intervals, thus demonstrating the applicability of these new models for estimation of uncertainty for individual PIV measurements. (paper)

  3. Detailed modeling of the statistical uncertainty of Thomson scattering measurements

    International Nuclear Information System (INIS)

    Morton, L A; Parke, E; Hartog, D J Den

    2013-01-01

    The uncertainty of electron density and temperature fluctuation measurements is determined by statistical uncertainty introduced by multiple noise sources. In order to quantify these uncertainties precisely, a simple but comprehensive model was made of the noise sources in the MST Thomson scattering system and of the resulting variance in the integrated scattered signals. The model agrees well with experimental and simulated results. The signal uncertainties are then used by our existing Bayesian analysis routine to find the most likely electron temperature and density, with confidence intervals. In the model, photonic noise from scattered light and plasma background light is multiplied by the noise enhancement factor (F) of the avalanche photodiode (APD). Electronic noise from the amplifier and digitizer is added. The amplifier response function shapes the signal and induces correlation in the noise. The data analysis routine fits a characteristic pulse to the digitized signals from the amplifier, giving the integrated scattered signals. A finite digitization rate loses information and can cause numerical integration error. We find a formula for the variance of the scattered signals in terms of the background and pulse amplitudes, and three calibration constants. The constants are measured easily under operating conditions, resulting in accurate estimation of the scattered signals' uncertainty. We measure F ≈ 3 for our APDs, in agreement with other measurements for similar APDs. This value is wavelength-independent, simplifying analysis. The correlated noise we observe is reproduced well using a Gaussian response function. Numerical integration error can be made negligible by using an interpolated characteristic pulse, allowing digitization rates as low as the detector bandwidth. The effect of background noise is also determined

  4. High frequency electric field levels: An example of determination of measurement uncertainty for broadband measurements

    Directory of Open Access Journals (Sweden)

    Vulević Branislav

    2016-01-01

    Full Text Available Determining high frequency electromagnetic field levels in urban areas is a very complex task, given the exponential growth over the past twenty years in the number of sources, embodied in public cellular telephony systems. The main goal of this paper is to present a practical solution for the evaluation of measurement uncertainty for in-situ measurements in the case of spatial averaging. An example of the estimation of the uncertainty of electric field strength broadband measurements in the frequency range from 3 MHz to 18 GHz is presented.

  5. Status of uncertainty assessment in k0-NAA measurement. Anything still missing?

    International Nuclear Information System (INIS)

    Borut Smodis; Tinkara Bucar

    2014-01-01

    Several approaches to quantifying measurement uncertainty in k0-based neutron activation analysis (k0-NAA) are reviewed, comprising the original approach, the spreadsheet approach, a dedicated computer program involving analytical calculations and the two k0-NAA programs available on the market. Two imperfections in the dedicated programs are identified, their impact is assessed and possible improvements are presented for a concrete experimental situation. The status of uncertainty assessment in k0-NAA is discussed and steps for improvement are recommended. It is concluded that the present magnitude of measurement uncertainty should be further improved by making additional efforts to reduce the uncertainties of the relevant nuclear constants used. (author)

  6. The uncertainty in physical measurements an introduction to data analysis in the physics laboratory

    CERN Document Server

    Fornasini, Paolo

    2008-01-01

    All measurements of physical quantities are affected by uncertainty. Understanding the origin of uncertainty, evaluating its extent and suitably taking it into account in data analysis is essential for assessing the degree of accuracy of phenomenological relationships and physical laws in both scientific research and technological applications. The Uncertainty in Physical Measurements: An Introduction to Data Analysis in the Physics Laboratory presents an introduction to uncertainty and to some of the most common procedures of data analysis. This book will serve the reader well by filling the gap between tutorial textbooks and highly specialized monographs. The book is divided into three parts. The first part is a phenomenological introduction to measurement and uncertainty: properties of instruments, different causes and corresponding expressions of uncertainty, histograms and distributions, and unified expression of uncertainty. The second part contains an introduction to probability theory, random variable...

  7. Principles of Uncertainty

    CERN Document Server

    Kadane, Joseph B

    2011-01-01

    An intuitive and mathematical introduction to subjective probability and Bayesian statistics. An accessible, comprehensive guide to the theory of Bayesian statistics, Principles of Uncertainty presents the subjective Bayesian approach, which has played a pivotal role in game theory, economics, and the recent boom in Markov Chain Monte Carlo methods. Both rigorous and friendly, the book contains: Introductory chapters examining each new concept or assumption Just-in-time mathematics -- the presentation of ideas just before they are applied Summary and exercises at the end of each chapter Discus

  8. Uncertainty estimation of shape and roughness measurement

    NARCIS (Netherlands)

    Morel, M.A.A.

    2006-01-01

    One of the most common techniques to measure a surface or form is mechanical probing. Although used since the early 30s of the 20th century, a method to calculate a task-specific uncertainty budget was not yet devised. Guidelines and statistical estimates are common in certain cases but an

  9. Development of Uncertainty Quantification Method for MIR-PIV Measurement using BOS Technique

    International Nuclear Information System (INIS)

    Seong, Jee Hyun; Song, Min Seop; Kim, Eung Soo

    2014-01-01

    Matching Index of Refraction (MIR) is frequently used for obtaining high quality PIV measurement data. Even small distortions from an unmatched refraction index in the test section can result in uncertainty problems. In this context, it is desirable to construct a new concept for checking MIR errors and the resulting uncertainty of the PIV measurement. This paper proposes an experimental concept and related results. This study developed an MIR uncertainty quantification method for PIV measurement using the SBOS technique. From the reference data of the BOS, a reliable SBOS experiment procedure was constructed. Then, by combining the SBOS technique with the MIR-PIV technique, the velocity vector field and the refraction displacement vector field were measured simultaneously. MIR errors are calculated through a mathematical equation into which the PIV and SBOS data are inserted. These errors are also verified by another BOS experiment. Finally, by applying the calculated MIR-PIV uncertainty, a correct velocity vector field can be obtained regardless of MIR errors.

  10. Invited Review Article: Measurement uncertainty of linear phase-stepping algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Hack, Erwin [EMPA, Laboratory Electronics/Metrology/Reliability, Ueberlandstrasse 129, CH-8600 Duebendorf (Switzerland); Burke, Jan [Australian Centre for Precision Optics, CSIRO (Commonwealth Scientific and Industrial Research Organisation) Materials Science and Engineering, P.O. Box 218, Lindfield, NSW 2070 (Australia)

    2011-06-15

    Phase retrieval techniques are widely used in optics, imaging and electronics. Originating in signal theory, they were introduced to interferometry around 1970. Over the years, many robust phase-stepping techniques have been developed that minimize specific experimental influence quantities such as phase step errors or higher harmonic components of the signal. However, optimizing a technique for a specific influence quantity can compromise its performance with regard to others. We present a consistent quantitative analysis of phase measurement uncertainty for the generalized linear phase stepping algorithm with nominally equal phase stepping angles, thereby reviewing and generalizing several results that have been reported in the literature. All influence quantities are treated on equal footing, and correlations between them are described in a consistent way. For the special case of classical N-bucket algorithms, we present analytical formulae that describe the combined variance as a function of the phase angle values. For the general Arctan algorithms, we derive expressions for the measurement uncertainty averaged over the full 2π-range of phase angles. We also give an upper bound for the measurement uncertainty which can be expressed as being proportional to an algorithm-specific factor. Tabular compilations help the reader to quickly assess the uncertainties that are involved with his or her technique.
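
    To make the idea of propagating signal noise through a classical N-bucket algorithm concrete, here is a hedged Python sketch: it forms the standard arctangent phase estimate from N equally spaced samples and obtains the phase uncertainty by Monte Carlo rather than by the article's analytical formulae. The fringe parameters and noise level are assumptions for illustration only.

        import numpy as np

        def n_bucket_phase(intensities):
            """Classical N-bucket estimate: phase from N equally spaced samples."""
            n = len(intensities)
            k = np.arange(n)
            num = -np.sum(intensities * np.sin(2 * np.pi * k / n))
            den = np.sum(intensities * np.cos(2 * np.pi * k / n))
            return np.arctan2(num, den)

        rng = np.random.default_rng(0)
        n_steps, true_phase = 4, 0.7          # hypothetical 4-bucket example
        a, b, sigma_i = 100.0, 50.0, 1.0      # bias, modulation, intensity noise (a.u.)
        k = np.arange(n_steps)
        ideal = a + b * np.cos(true_phase + 2 * np.pi * k / n_steps)

        # Monte Carlo propagation of additive intensity noise to the phase estimate
        phases = [n_bucket_phase(ideal + rng.normal(0, sigma_i, n_steps))
                  for _ in range(20000)]
        print(f"phase = {np.mean(phases):.4f} rad, u(phase) = {np.std(phases):.4f} rad")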

  11. Invited Article: Concepts and tools for the evaluation of measurement uncertainty

    Science.gov (United States)

    Possolo, Antonio; Iyer, Hari K.

    2017-01-01

    Measurements involve comparisons of measured values with reference values traceable to measurement standards and are made to support decision-making. While the conventional definition of measurement focuses on quantitative properties (including ordinal properties), we adopt a broader view and entertain the possibility of regarding qualitative properties also as legitimate targets for measurement. A measurement result comprises the following: (i) a value that has been assigned to a property based on information derived from an experiment or computation, possibly also including information derived from other sources, and (ii) a characterization of the margin of doubt that remains about the true value of the property after taking that information into account. Measurement uncertainty is this margin of doubt, and it can be characterized by a probability distribution on the set of possible values of the property of interest. Mathematical or statistical models enable the quantification of measurement uncertainty and underlie the varied collection of methods available for uncertainty evaluation. Some of these methods have been in use for over a century (for example, as introduced by Gauss for the combination of mutually inconsistent observations or for the propagation of "errors"), while others are of fairly recent vintage (for example, Monte Carlo methods including those that involve Markov Chain Monte Carlo sampling). This contribution reviews the concepts, models, methods, and computations that are commonly used for the evaluation of measurement uncertainty, and illustrates their application in realistic examples drawn from multiple areas of science and technology, aiming to serve as a general, widely accessible reference.
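
    One of the century-old methods mentioned above, the combination of mutually inconsistent observations, can be written down in a few lines. The Python sketch below, with invented numbers, computes the inverse-variance weighted mean, its standard uncertainty and a chi-square consistency check; it is a generic textbook illustration rather than an example taken from the article.

        import numpy as np

        # Combination of mutually inconsistent observations of the same quantity,
        # in the spirit of the classical weighted-mean treatment. Values invented.
        x = np.array([10.12, 10.25, 10.08, 10.31])   # observed values
        u = np.array([0.05, 0.08, 0.04, 0.10])       # standard uncertainties

        w = 1.0 / u**2                               # inverse-variance weights
        x_bar = np.sum(w * x) / np.sum(w)            # weighted mean
        u_bar = 1.0 / np.sqrt(np.sum(w))             # its standard uncertainty
        chi2 = np.sum(w * (x - x_bar)**2)            # consistency check

        print(f"weighted mean = {x_bar:.3f} +/- {u_bar:.3f}, "
              f"chi2 = {chi2:.2f} for {len(x) - 1} degrees of freedom")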

  12. Correlation and uncertainties evaluation in backscattering of entrance surface air kerma measurements

    Energy Technology Data Exchange (ETDEWEB)

    Teixeira, G.J.; Sousa, C.H.S.; Peixoto, J.G.P., E-mail: gt@ird.gov.br [Instituto de Radioproteção e Dosimetria (IRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil)

    2017-07-01

    Air kerma measurement is important to verify the doses applied in radiodiagnostics. The literature describes several methods to measure the entrance surface air kerma or entrance surface dose, but some of these methods may inflate the measurement through backscattering. Measurement setups were arranged to establish correlations between them. The expanded uncertainty exceeded 5% for measurements with backscattering, reaching 8.36%, while in situations where the backscattering was avoided, the uncertainty was 3.43%. (author)

  13. The Rigor Mortis of Education: Rigor Is Required in a Dying Educational System

    Science.gov (United States)

    Mixon, Jason; Stuart, Jerry

    2009-01-01

    In an effort to answer the "Educational Call to Arms", our national public schools have turned to Advanced Placement (AP) courses as the predominant vehicle used to address the lack of academic rigor in our public high schools. Advanced Placement is believed by many to provide students with the rigor and work ethic necessary to…

  14. Towards minimizing measurement uncertainty in total petroleum hydrocarbon determination by GC-FID

    Energy Technology Data Exchange (ETDEWEB)

    Saari, E.

    2009-07-01

    Despite tightened environmental legislation, spillages of petroleum products remain a serious problem worldwide. The environmental impacts of these spillages are always severe and reliable methods for the identification and quantitative determination of petroleum hydrocarbons in environmental samples are therefore needed. Great improvements in the definition and analysis of total petroleum hydrocarbons (TPH) were finally introduced by international organizations for standardization in 2004. This brought some coherence to the determination and, nowadays, most laboratories seem to employ ISO/DIS 16703:2004, ISO 9377-2:2000 and CEN prEN 14039:2004:E draft international standards for analysing TPH in soil. The implementation of these methods, however, usually fails because the reliability of petroleum hydrocarbon determination has proved to be poor. This thesis describes the assessment of measurement uncertainty for TPH determination in soil. Chemometric methods were used to both estimate the main uncertainty sources and identify the most significant factors affecting these uncertainty sources. The method used for the determinations was based on gas chromatography utilizing flame ionization detection (GC-FID). Chemometric methodology applied in estimating measurement uncertainty for TPH determination showed that the measurement uncertainty is in actual fact dominated by the analytical uncertainty. Within the specific concentration range studied, the analytical uncertainty accounted for as much as 68-80% of the measurement uncertainty. The robustness of the analytical method used for petroleum hydrocarbon determination was then studied in more detail. A two-level Plackett-Burman design and a D-optimal design were utilized to assess the main analytical uncertainty sources of the sample treatment and GC determination procedures. It was also found that matrix-induced systematic error may significantly reduce the reliability of petroleum hydrocarbon determination.

  15. Rigor, vigor, and the study of health disparities.

    Science.gov (United States)

    Adler, Nancy; Bush, Nicole R; Pantell, Matthew S

    2012-10-16

    Health disparities research spans multiple fields and methods and documents strong links between social disadvantage and poor health. Associations between socioeconomic status (SES) and health are often taken as evidence for the causal impact of SES on health, but alternative explanations, including the impact of health on SES, are plausible. Studies showing the influence of parents' SES on their children's health provide evidence for a causal pathway from SES to health, but have limitations. Health disparities researchers face tradeoffs between "rigor" and "vigor" in designing studies that demonstrate how social disadvantage becomes biologically embedded and results in poorer health. Rigorous designs aim to maximize precision in the measurement of SES and health outcomes through methods that provide the greatest control over temporal ordering and causal direction. To achieve precision, many studies use a single SES predictor and single disease. However, doing so oversimplifies the multifaceted, entwined nature of social disadvantage and may overestimate the impact of that one variable and underestimate the true impact of social disadvantage on health. In addition, SES effects on overall health and functioning are likely to be greater than effects on any one disease. Vigorous designs aim to capture this complexity and maximize ecological validity through more complete assessment of social disadvantage and health status, but may provide less-compelling evidence of causality. Newer approaches to both measurement and analysis may enable enhanced vigor as well as rigor. Incorporating both rigor and vigor into studies will provide a fuller understanding of the causes of health disparities.

  16. A new measure of uncertainty importance based on distributional sensitivity analysis for PSA

    International Nuclear Information System (INIS)

    Han, Seok Jung; Tak, Nam Il; Chun, Moon Hyun

    1996-01-01

    The main objective of the present study is to propose a new measure of uncertainty importance based on distributional sensitivity analysis. The new measure is developed to utilize a metric distance obtained from cumulative distribution functions (cdfs). The measure is evaluated for two cases: one is a cdf given by a known analytical distribution and the other given by an empirical distribution generated by a crude Monte Carlo simulation. To study its applicability, the present measure has been applied to two different cases. The results are compared with those of three existing methods. The present approach is a useful measure of uncertainty importance which is based on cdfs. The method is simple, and uncertainty importance can be calculated without any complex procedure. On the basis of the results obtained in the present work, the present method is recommended to be used as a tool for the analysis of uncertainty importance.
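
    A minimal sketch of the underlying idea, measuring the importance of an input through the distance between output cdfs, is given below in Python. The distance used here is simply the area between the empirical cdf of a crude Monte Carlo output and the cdf obtained with one input fixed; the exact metric and test cases of the paper may differ, and the model y = x1 + x2 is purely illustrative.

        import numpy as np

        def cdf_distance(base_samples, cond_samples, grid_size=512):
            """Area between two empirical cdfs, used here as the metric distance."""
            lo = min(base_samples.min(), cond_samples.min())
            hi = max(base_samples.max(), cond_samples.max())
            grid = np.linspace(lo, hi, grid_size)
            cdf_base = np.searchsorted(np.sort(base_samples), grid, side="right") / base_samples.size
            cdf_cond = np.searchsorted(np.sort(cond_samples), grid, side="right") / cond_samples.size
            return np.sum(np.abs(cdf_base - cdf_cond)) * (grid[1] - grid[0])

        rng = np.random.default_rng(1)
        n = 50000
        x1, x2 = rng.lognormal(0.0, 0.5, n), rng.normal(5.0, 1.0, n)
        y_base = x1 + x2                      # crude Monte Carlo output

        # importance of an input: distance between the base cdf and the cdf
        # obtained when that input is fixed at a nominal value
        y_fix_x1 = np.median(x1) + x2
        y_fix_x2 = x1 + np.median(x2)
        print("importance of x1:", cdf_distance(y_base, y_fix_x1))
        print("importance of x2:", cdf_distance(y_base, y_fix_x2))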

  17. Uncertainty analysis technique for OMEGA Dante measurements

    International Nuclear Information System (INIS)

    May, M. J.; Widmann, K.; Sorce, C.; Park, H.-S.; Schneider, M.

    2010-01-01

    The Dante is an 18 channel x-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g., hohlraums, etc.) at x-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the x-ray diodes, filters and mirrors, and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.
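
    The Monte Carlo parameter variation idea can be sketched in a few lines of Python. The unfold step below is only a placeholder (a toy 3-channel response matrix), and the voltages and calibration errors are invented; the point is the structure: perturb the inputs within their one-sigma errors, re-run the unfold many times, and read the error bars off the spread of the results.

        import numpy as np

        rng = np.random.default_rng(2)

        def unfold(voltages):
            """Placeholder for the Dante unfold algorithm: here just a linear
            inversion of a toy 3-channel response matrix."""
            response = np.array([[1.0, 0.2, 0.0],
                                 [0.1, 1.0, 0.3],
                                 [0.0, 0.2, 1.0]])
            return np.linalg.solve(response, voltages)

        measured = np.array([4.1, 6.3, 2.9])   # hypothetical channel voltages
        u_cal = 0.05 * measured                # hypothetical 1-sigma channel errors

        # Monte Carlo parameter variation: perturb each channel within its error,
        # re-run the unfold, and take the spread of the results as the uncertainty.
        trials = np.array([unfold(rng.normal(measured, u_cal)) for _ in range(1000)])
        flux_mean, flux_std = trials.mean(axis=0), trials.std(axis=0)
        print("unfolded flux:", flux_mean, "+/-", flux_std)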

  18. Uncertainty Analysis Technique for OMEGA Dante Measurements

    International Nuclear Information System (INIS)

    May, M.J.; Widmann, K.; Sorce, C.; Park, H.; Schneider, M.

    2010-01-01

    The Dante is an 18 channel X-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g. hohlraums, etc.) at X-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the X-ray diodes, filters and mirrors and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte-Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.

  19. Realizing rigor in the mathematics classroom

    CERN Document Server

    Hull, Ted H (Henry); Balka, Don S

    2014-01-01

    Rigor put within reach! Rigor: The Common Core has made it policy, and this first-of-its-kind guide takes math teachers and leaders through the process of making it reality. Using the Proficiency Matrix as a framework, the authors offer proven strategies and practical tools for successful implementation of the CCSS mathematical practices, with rigor as a central objective. You'll learn how to define rigor in the context of each mathematical practice, and how to identify and overcome potential issues, including differentiating instruction and using data

  20. Uncertainty analysis of the magnetic field measurement by the translating coil method in axisymmetric magnets

    International Nuclear Information System (INIS)

    Arpaia, Pasquale; De Vito, Luca; Kazazi, Mario

    2016-01-01

    In the uncertainty assessment of magnetic flux measurements in axially symmetric magnets by the translating coil method, the Guide to the Expression of Uncertainty in Measurement and its supplement cannot be applied: the voltage variation at the coil terminals, which is the actual measured quantity, affects the flux estimate and its uncertainty. In this paper, a particle filter, implementing a sequential Monte-Carlo method based on Bayesian inference, is applied. To this aim, the main uncertainty sources are analyzed and a model of the measurement process is defined. The results of the experimental validation point out the transport system and the acquisition system as the main contributions to the uncertainty budget. (authors)

  1. Total Measurement Uncertainty for the Plutonium Finishing Plant (PFP) Segmented Gamma Scan Assay System

    CERN Document Server

    Fazzari, D M

    2001-01-01

    This report presents the results of an evaluation of the Total Measurement Uncertainty (TMU) for the Canberra manufactured Segmented Gamma Scanner Assay System (SGSAS) as employed at the Hanford Plutonium Finishing Plant (PFP). In this document, TMU embodies the combined uncertainties due to all of the individual random and systematic sources of measurement uncertainty. It includes uncertainties arising from corrections and factors applied to the analysis of transuranic waste to compensate for inhomogeneities and interferences from the waste matrix and radioactive components. These include uncertainty components for any assumptions contained in the calibration of the system or computation of the data. Uncertainties are propagated at 1 sigma. The final total measurement uncertainty value is reported at the 95% confidence level. The SGSAS is a gamma assay system that is used to assay plutonium and uranium waste. The SGSAS system can be used in a stand-alone mode to perform the NDA characterization of a containe...

  2. Quantifying measurement uncertainty and spatial variability in the context of model evaluation

    Science.gov (United States)

    Choukulkar, A.; Brewer, A.; Pichugina, Y. L.; Bonin, T.; Banta, R. M.; Sandberg, S.; Weickmann, A. M.; Djalalova, I.; McCaffrey, K.; Bianco, L.; Wilczak, J. M.; Newman, J. F.; Draxl, C.; Lundquist, J. K.; Wharton, S.; Olson, J.; Kenyon, J.; Marquis, M.

    2017-12-01

    In an effort to improve wind forecasts for the wind energy sector, the Department of Energy and NOAA funded the second Wind Forecast Improvement Project (WFIP2). As part of the WFIP2 field campaign, a large suite of in-situ and remote sensing instrumentation was deployed to the Columbia River Gorge in Oregon and Washington from October 2015 to March 2017. The array of instrumentation deployed included 915-MHz wind profiling radars, sodars, wind-profiling lidars, and scanning lidars. The role of these instruments was to provide wind measurements at high spatial and temporal resolution for model evaluation and improvement of model physics. To properly determine model errors, the uncertainties in instrument-model comparisons need to be quantified accurately. These uncertainties arise from several factors such as measurement uncertainty, spatial variability, and interpolation of model output to instrument locations, to name a few. In this presentation, we will introduce a formalism to quantify measurement uncertainty and spatial variability. The accuracy of this formalism will be tested using existing datasets such as the eXperimental Planetary boundary layer Instrumentation Assessment (XPIA) campaign. Finally, the uncertainties in wind measurement and the spatial variability estimates from the WFIP2 field campaign will be discussed to understand the challenges involved in model evaluation.

  3. Measurement, simulation and uncertainty assessment of implant heating during MRI

    International Nuclear Information System (INIS)

    Neufeld, E; Kuehn, S; Kuster, N; Szekely, G

    2009-01-01

    The heating of tissues around implants during MRI can pose severe health risks, and careful evaluation is required for leads to be labeled as MR conditionally safe. A recent interlaboratory comparison study has shown that different groups can produce widely varying results (sometimes with more than a factor of 5 difference) when performing measurements according to current guidelines. To determine the related difficulties and to derive optimized procedures, two different generic lead structures have been investigated in this study by using state-of-the-art temperature and dosimetric probes, as well as simulations for which detailed uncertainty budgets have been determined. The agreement between simulations and measurements is well within the combined uncertainty. The study revealed that the uncertainty can be kept below 17% if appropriate instrumentation and procedures are applied. Optimized experimental assessment techniques can be derived from the findings presented herein.

  4. Measurement, simulation and uncertainty assessment of implant heating during MRI

    Energy Technology Data Exchange (ETDEWEB)

    Neufeld, E; Kuehn, S; Kuster, N [Foundation for Research on Information Technologies in Society (IT'IS), Zeughausstr. 43, 8004 Zurich (Switzerland); Szekely, G [Computer Vision Laboratory, Swiss Federal Institute of Technology (ETHZ), Sternwartstr 7, ETH Zentrum, 8092 Zurich (Switzerland)], E-mail: neufeld@itis.ethz.ch

    2009-07-07

    The heating of tissues around implants during MRI can pose severe health risks, and careful evaluation is required for leads to be labeled as MR conditionally safe. A recent interlaboratory comparison study has shown that different groups can produce widely varying results (sometimes with more than a factor of 5 difference) when performing measurements according to current guidelines. To determine the related difficulties and to derive optimized procedures, two different generic lead structures have been investigated in this study by using state-of-the-art temperature and dosimetric probes, as well as simulations for which detailed uncertainty budgets have been determined. The agreement between simulations and measurements is well within the combined uncertainty. The study revealed that the uncertainty can be kept below 17% if appropriate instrumentation and procedures are applied. Optimized experimental assessment techniques can be derived from the findings presented herein.

  5. Uncertainty of measurement for large product verification: evaluation of large aero gas turbine engine datums

    International Nuclear Information System (INIS)

    Muelaner, J E; Wang, Z; Keogh, P S; Brownell, J; Fisher, D

    2016-01-01

    Understanding the uncertainty of dimensional measurements for large products such as aircraft, spacecraft and wind turbines is fundamental to improving efficiency in these products. Much work has been done to ascertain the uncertainty associated with the main types of instruments used, based on laser tracking and photogrammetry, and the propagation of this uncertainty through networked measurements. Unfortunately this is not sufficient to understand the combined uncertainty of industrial measurements, which include secondary tooling and datum structures used to locate the coordinate frame. This paper presents for the first time a complete evaluation of the uncertainty of large scale industrial measurement processes. Generic analysis and design rules are proven through uncertainty evaluation and optimization for the measurement of a large aero gas turbine engine. This shows how the instrument uncertainty can be considered to be negligible. Before optimization the dominant source of uncertainty was the tooling design, after optimization the dominant source was thermal expansion of the engine; meaning that no further improvement can be made without measurement in a temperature controlled environment. These results will have a significant impact on the ability of aircraft and wind turbines to improve efficiency and therefore reduce carbon emissions, as well as the improved reliability of these products. (paper)

  6. Validation of Fuel Performance Uncertainty for RIA Safety Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Park, Nam-Gyu; Yoo, Jong-Sung; Jung, Yil-Sup [KEPCO Nuclear Fuel Co., Daejeon (Korea, Republic of)

    2016-10-15

    To achieve this, the computer code performance has to be validated against experimental results, and for the uncertainty quantification, important uncertainty parameters need to be selected and the combined uncertainty evaluated with an acceptable statistical treatment. Uncertainty parameters important to rod performance, such as fuel enthalpy, fission gas release and cladding hoop strain, were chosen through rigorous sensitivity studies, and their validity was assessed using the results of rods tested in CABRI and NSRR. The analysis revealed that several tested rods were not bounded within the combined fuel performance uncertainty. Fuel performance was then reassessed with an extended fuel power uncertainty for the rods tested in NSRR and CABRI, and several tested rods were still not bounded within the calculated fuel performance uncertainty. This implies that the currently considered uncertainty range of the parameters is not sufficient to cover the fuel performance.

  7. Covariance methodology applied to uncertainties in I-126 disintegration rate measurements

    International Nuclear Information System (INIS)

    Fonseca, K.A.; Koskinas, M.F.; Dias, M.S.

    1996-01-01

    The covariance methodology applied to uncertainties in 126I disintegration rate measurements is described. Two different coincidence systems were used due to the complex decay scheme of this radionuclide. The parameters involved in the determination of the disintegration rate in each experimental system present correlated components. In this case, the conventional statistical methods to determine the uncertainties (law of propagation) result in wrong values for the final uncertainty. Therefore, use of the methodology of the covariance matrix is necessary. The data from both systems were combined taking into account all possible correlations between the partial uncertainties. (orig.)

  8. Uncertainty in Measurement: A Review of Monte Carlo Simulation Using Microsoft Excel for the Calculation of Uncertainties Through Functional Relationships, Including Uncertainties in Empirically Derived Constants

    Science.gov (United States)

    Farrance, Ian; Frenkel, Robert

    2014-01-01

    The Guide to the Expression of Uncertainty in Measurement (usually referred to as the GUM) provides the basic framework for evaluating uncertainty in measurement. The GUM however does not always provide clearly identifiable procedures suitable for medical laboratory applications, particularly when internal quality control (IQC) is used to derive most of the uncertainty estimates. The GUM modelling approach requires advanced mathematical skills for many of its procedures, but Monte Carlo simulation (MCS) can be used as an alternative for many medical laboratory applications. In particular, calculations for determining how uncertainties in the input quantities to a functional relationship propagate through to the output can be accomplished using a readily available spreadsheet such as Microsoft Excel. The MCS procedure uses algorithmically generated pseudo-random numbers which are then forced to follow a prescribed probability distribution. When IQC data provide the uncertainty estimates the normal (Gaussian) distribution is generally considered appropriate, but MCS is by no means restricted to this particular case. With input variations simulated by random numbers, the functional relationship then provides the corresponding variations in the output in a manner which also provides its probability distribution. The MCS procedure thus provides output uncertainty estimates without the need for the differential equations associated with GUM modelling. The aim of this article is to demonstrate the ease with which Microsoft Excel (or a similar spreadsheet) can be used to provide an uncertainty estimate for measurands derived through a functional relationship. In addition, we also consider the relatively common situation where an empirically derived formula includes one or more ‘constants’, each of which has an empirically derived numerical value. Such empirically derived ‘constants’ must also have associated uncertainties which propagate through the functional
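
    The same spreadsheet-style Monte Carlo simulation can be reproduced outside Excel; the Python sketch below propagates normally distributed inputs, including an empirically derived 'constant' with its own uncertainty, through a hypothetical functional relationship y = k*a/b and reports the standard uncertainty and a 95% interval. All numbers are invented for illustration and the functional relationship is not one from the article.

        import numpy as np

        rng = np.random.default_rng(3)
        n = 100000

        # Hypothetical functional relationship y = k * a / b, where k is an
        # empirically derived "constant" with its own uncertainty and a, b are
        # measured inputs whose uncertainties might come from IQC data.
        k = rng.normal(0.85, 0.02, n)     # empirically derived constant
        a = rng.normal(12.4, 0.3, n)      # measured input 1
        b = rng.normal(3.1, 0.1, n)       # measured input 2

        y = k * a / b
        lo, hi = np.percentile(y, [2.5, 97.5])
        print(f"y = {y.mean():.3f}, u(y) = {y.std():.3f}, "
              f"95% interval [{lo:.3f}, {hi:.3f}]")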

  9. Uncertainty in measurement: a review of monte carlo simulation using microsoft excel for the calculation of uncertainties through functional relationships, including uncertainties in empirically derived constants.

    Science.gov (United States)

    Farrance, Ian; Frenkel, Robert

    2014-02-01

    The Guide to the Expression of Uncertainty in Measurement (usually referred to as the GUM) provides the basic framework for evaluating uncertainty in measurement. The GUM however does not always provide clearly identifiable procedures suitable for medical laboratory applications, particularly when internal quality control (IQC) is used to derive most of the uncertainty estimates. The GUM modelling approach requires advanced mathematical skills for many of its procedures, but Monte Carlo simulation (MCS) can be used as an alternative for many medical laboratory applications. In particular, calculations for determining how uncertainties in the input quantities to a functional relationship propagate through to the output can be accomplished using a readily available spreadsheet such as Microsoft Excel. The MCS procedure uses algorithmically generated pseudo-random numbers which are then forced to follow a prescribed probability distribution. When IQC data provide the uncertainty estimates the normal (Gaussian) distribution is generally considered appropriate, but MCS is by no means restricted to this particular case. With input variations simulated by random numbers, the functional relationship then provides the corresponding variations in the output in a manner which also provides its probability distribution. The MCS procedure thus provides output uncertainty estimates without the need for the differential equations associated with GUM modelling. The aim of this article is to demonstrate the ease with which Microsoft Excel (or a similar spreadsheet) can be used to provide an uncertainty estimate for measurands derived through a functional relationship. In addition, we also consider the relatively common situation where an empirically derived formula includes one or more 'constants', each of which has an empirically derived numerical value. Such empirically derived 'constants' must also have associated uncertainties which propagate through the functional relationship

  10. Top down arsenic uncertainty measurement in water and sediments from Guarapiranga dam (Brazil)

    Science.gov (United States)

    Faustino, M. G.; Lange, C. N.; Monteiro, L. R.; Furusawa, H. A.; Marques, J. R.; Stellato, T. B.; Soares, S. M. V.; da Silva, T. B. S. C.; da Silva, D. B.; Cotrim, M. E. B.; Pires, M. A. F.

    2018-03-01

    Assessing total arsenic measurements against a legal threshold demands more than an average-and-standard-deviation approach. Accordingly, the analytical measurement uncertainty was evaluated in order to comply with legal requirements and to allow the arsenic balance between the water and sediment compartments to be established. A top-down approach for measurement uncertainties was applied to evaluate arsenic concentrations in water and sediments from Guarapiranga dam (São Paulo, Brazil). Laboratory quality control data and arsenic interlaboratory test data were used in this approach to estimate the uncertainties associated with the methodology.
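
    A common top-down recipe of this kind combines within-laboratory reproducibility (from control-chart data) with the uncertainty of the laboratory bias (from interlaboratory or proficiency-test results). Whether the paper follows exactly this recipe is not stated in the abstract, so the Python sketch below, with invented numbers, should be read as a generic illustration only.

        import numpy as np

        # Within-laboratory reproducibility from quality-control results (ug/L)
        qc = np.array([9.8, 10.3, 9.9, 10.1, 10.4, 9.7, 10.2, 10.0])
        u_rw = np.std(qc, ddof=1) / np.mean(qc)        # relative standard deviation

        # Bias component from interlaboratory comparisons (relative values)
        bias = np.array([0.03, -0.02, 0.04])   # lab result vs. assigned value
        u_cref = 0.02                           # uncertainty of assigned values
        u_bias = np.sqrt(np.mean(bias**2) + u_cref**2)

        # Combined and expanded (k=2) relative uncertainty
        u_combined = np.sqrt(u_rw**2 + u_bias**2)
        print(f"relative expanded uncertainty (k=2): {2 * u_combined:.1%}")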

  11. Estimation of Uncertainty in Aerosol Concentration Measured by Aerosol Sampling System

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jong Chan; Song, Yong Jae; Jung, Woo Young; Lee, Hyun Chul; Kim, Gyu Tae; Lee, Doo Yong [FNC Technology Co., Yongin (Korea, Republic of)

    2016-10-15

    FNC Technology Co., Ltd has developed test facilities for aerosol generation, mixing, sampling and measurement under high-pressure and high-temperature conditions. The aerosol generation system is connected to the aerosol mixing system, which injects a SiO2/ethanol mixture. In the sampling system, a glass fiber membrane filter has been used to measure the average mass concentration. Based on the experimental results using a steam-air mixture as the main carrier gas, the uncertainty estimation of the sampled aerosol concentration was performed by applying the Gaussian error propagation law. FNC Technology Co., Ltd. has developed the experimental facilities for aerosol measurement under high pressure and high temperature. The purpose of the tests is to develop a commercial test module for aerosol generation, mixing and sampling applicable to the environmental industry and to safety-related systems in nuclear power plants. For the uncertainty calculation, the sampled aerosol concentration is not measured directly, but must be calculated from other quantities. The uncertainty of the sampled aerosol concentration is a function of the flow rates of air and steam, the sampled mass, the sampling time, the condensed steam mass and their absolute errors. The uncertainties in these variables propagate through the functional relationship. Using the operating parameters and their individual errors from the aerosol test cases performed at FNC, the uncertainty of the aerosol concentration evaluated by the Gaussian error propagation law is less than 1%. The results of the uncertainty estimation in the aerosol sampling system will be utilized as system performance data.
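
    A generic illustration of the Gaussian (first-order) error propagation step is sketched below in Python. The concentration model (here simply sampled mass over sampled gas volume, with no condensed-steam correction) and all numbers are assumptions standing in for the undisclosed FNC model; the propagation itself uses numerical sensitivity coefficients.

        import numpy as np

        def concentration(m_sampled, q_air, q_steam, t):
            """Illustrative model: sampled mass divided by total sampled gas
            volume; the actual FNC model is not given in the abstract."""
            return m_sampled / ((q_air + q_steam) * t)

        # nominal values and standard uncertainties (all hypothetical)
        x = np.array([2.4e-3, 1.2e-3, 0.8e-3, 600.0])   # kg, m3/s, m3/s, s
        u = np.array([0.05e-3, 0.02e-3, 0.03e-3, 1.0])

        # Gaussian (first-order) error propagation with numerical derivatives
        c0 = concentration(*x)
        var = 0.0
        for i in range(x.size):
            dx = np.zeros_like(x)
            dx[i] = 1e-6 * x[i]
            dcdx = (concentration(*(x + dx)) - concentration(*(x - dx))) / (2 * dx[i])
            var += (dcdx * u[i]) ** 2
        print(f"C = {c0:.4g} kg/m3, u(C) = {np.sqrt(var):.2g} "
              f"({np.sqrt(var) / c0:.2%} relative)")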

  12. A Quantitative Measure For Evaluating Project Uncertainty Under Variation And Risk Effects

    Directory of Open Access Journals (Sweden)

    A. Chenarani

    2017-10-01

    Full Text Available The effects of uncertainty on a project and the risk event as the consequence of uncertainty are analyzed. The uncertainty index is proposed as a quantitative measure for evaluating the uncertainty of a project. This is done by employing entropy as the indicator of system disorder and lack of information. By employing this index, the uncertainty of each activity and its increase due to risk effects as well as project uncertainty changes as a function of time can be assessed. The results are implemented and analyzed for a small turbojet engine development project as the case study. The results of this study can be useful for project managers and other stakeholders for selecting the most effective risk management and uncertainty controlling method.
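
    Since the proposed index is built on entropy as an indicator of disorder and lack of information, a small Python sketch may help: it computes the Shannon entropy of an activity's duration distribution before and after a risk event broadens it. The distributions are invented, and the paper's exact index definition may differ from plain Shannon entropy.

        import numpy as np

        def shannon_entropy(probabilities):
            """Shannon entropy in bits; zero-probability outcomes are ignored."""
            p = np.asarray(probabilities, dtype=float)
            p = p[p > 0] / p.sum()
            return -np.sum(p * np.log2(p))

        # Hypothetical activity: probability distribution over possible durations
        # before and after a risk event widens the distribution.
        before = [0.70, 0.20, 0.10]          # e.g. 4, 5, 6 weeks
        after = [0.40, 0.30, 0.20, 0.10]     # e.g. 4, 5, 6, 8 weeks

        h_before, h_after = shannon_entropy(before), shannon_entropy(after)
        print(f"uncertainty index before risk: {h_before:.3f} bits")
        print(f"uncertainty index after risk:  {h_after:.3f} bits "
              f"(increase {h_after - h_before:.3f})")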

  13. Estimation of the measurement uncertainty in magnetic resonance velocimetry based on statistical models

    Energy Technology Data Exchange (ETDEWEB)

    Bruschewski, Martin; Schiffer, Heinz-Peter [Technische Universitaet Darmstadt, Institute of Gas Turbines and Aerospace Propulsion, Darmstadt (Germany); Freudenhammer, Daniel [Technische Universitaet Darmstadt, Institute of Fluid Mechanics and Aerodynamics, Center of Smart Interfaces, Darmstadt (Germany); Buchenberg, Waltraud B. [University Medical Center Freiburg, Medical Physics, Department of Radiology, Freiburg (Germany); Grundmann, Sven [University of Rostock, Institute of Fluid Mechanics, Rostock (Germany)

    2016-05-15

    Velocity measurements with magnetic resonance velocimetry offer outstanding possibilities for experimental fluid mechanics. The purpose of this study was to provide practical guidelines for the estimation of the measurement uncertainty in such experiments. Based on various test cases, it is shown that the uncertainty estimate can vary substantially depending on how the uncertainty is obtained. The conventional approach to estimate the uncertainty from the noise in the artifact-free background can lead to wrong results. A deviation of up to -75% is observed with the presented experiments. In addition, a similarly high deviation is demonstrated with the data from other studies. As a more accurate approach, the uncertainty is estimated directly from the image region with the flow sample. Two possible estimation methods are presented. (orig.)

  14. Estimation of the measurement uncertainty in magnetic resonance velocimetry based on statistical models

    Science.gov (United States)

    Bruschewski, Martin; Freudenhammer, Daniel; Buchenberg, Waltraud B.; Schiffer, Heinz-Peter; Grundmann, Sven

    2016-05-01

    Velocity measurements with magnetic resonance velocimetry offer outstanding possibilities for experimental fluid mechanics. The purpose of this study was to provide practical guidelines for the estimation of the measurement uncertainty in such experiments. Based on various test cases, it is shown that the uncertainty estimate can vary substantially depending on how the uncertainty is obtained. The conventional approach to estimate the uncertainty from the noise in the artifact-free background can lead to wrong results. A deviation of up to -75 % is observed with the presented experiments. In addition, a similarly high deviation is demonstrated with the data from other studies. As a more accurate approach, the uncertainty is estimated directly from the image region with the flow sample. Two possible estimation methods are presented.

  15. Rigorous approach to the comparison between experiment and theory in Casimir force measurements

    International Nuclear Information System (INIS)

    Klimchitskaya, G L; Chen, F; Decca, R S; Fischbach, E; Krause, D E; Lopez, D; Mohideen, U; Mostepanenko, V M

    2006-01-01

    In most experiments on the Casimir force the comparison between measurement data and theory was done using the concept of the root-mean-square deviation, a procedure that has been criticized in the literature. Here we propose a special statistical analysis which should be performed separately for the experimental data and for the results of the theoretical computations. In so doing, the random, systematic and total experimental errors are found as functions of separation, taking into account the distribution laws for each error at 95% confidence. Independently, all theoretical errors are combined to obtain the total theoretical error at the same confidence. Finally, the confidence interval for the differences between theoretical and experimental values is obtained as a function of separation. This rigorous approach is applied to two recent experiments on the Casimir effect

  16. Relating Tropical Cyclone Track Forecast Error Distributions with Measurements of Forecast Uncertainty

    Science.gov (United States)

    2016-03-01

    Master's thesis, Nicholas M. Chisler, March 2016. Only report-documentation-page text (title, author and report type) is available for this record; no abstract text was recoverable.

  17. Total uncertainty of low velocity thermal anemometers for measurement of indoor air movements

    DEFF Research Database (Denmark)

    Jørgensen, F.; Popiolek, Z.; Melikov, Arsen Krikor

    2004-01-01

    For a specific thermal anemometer with omnidirectional velocity sensor the expanded total uncertainty in measured mean velocity Û(Vmean) and the expanded total uncertainty in measured turbulence intensity Û(Tu) due to different error sources are estimated. The values are based on a previously developed mathematical model of the anemometer in combination with a large database of representative room flows measured with a 3-D Laser Doppler anemometer (LDA). A direct comparison between measurements with a thermal anemometer and a 3-D LDA in flows of varying velocity and turbulence intensity shows good agreement not only between the two instruments but also between the thermal anemometer and its mathematical model. The differences in the measurements performed with the two instruments are all well within the measurement uncertainty of both anemometers.

  18. Object-oriented software for evaluating measurement uncertainty

    Science.gov (United States)

    Hall, B. D.

    2013-05-01

    An earlier publication (Hall 2006 Metrologia 43 L56-61) introduced the notion of an uncertain number that can be used in data processing to represent quantity estimates with associated uncertainty. The approach can be automated, allowing data processing algorithms to be decomposed into convenient steps, so that complicated measurement procedures can be handled. This paper illustrates the uncertain-number approach using several simple measurement scenarios and two different software tools. One is an extension library for Microsoft Excel®. The other is a special-purpose calculator using the Python programming language.
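
    The uncertain-number idea can be illustrated with a minimal class that stores a value together with uncertainty components attributed to independent sources and propagates them to first order, so that correlations induced by re-using a quantity are handled automatically. This Python sketch is an illustration of the concept only; it is not the API of the software tools described in the article.

        import math

        class UncertainNumber:
            """Minimal sketch of an uncertain number: a value plus uncertainty
            components attributed to independent sources, propagated to first
            order. Not the API of the tools described in the article."""

            def __init__(self, value, u=0.0, label=None):
                self.value = value
                self.components = {label or id(self): u} if u else {}

            @property
            def u(self):
                return math.sqrt(sum(c * c for c in self.components.values()))

            def _combine(self, other, value, da, db):
                result = UncertainNumber(value)
                for src, c in self.components.items():
                    result.components[src] = result.components.get(src, 0.0) + da * c
                if isinstance(other, UncertainNumber):
                    for src, c in other.components.items():
                        result.components[src] = result.components.get(src, 0.0) + db * c
                return result

            def __add__(self, other):
                ov = other.value if isinstance(other, UncertainNumber) else other
                return self._combine(other, self.value + ov, 1.0, 1.0)

            def __mul__(self, other):
                ov = other.value if isinstance(other, UncertainNumber) else other
                return self._combine(other, self.value * ov, ov, self.value)

        # usage: power from voltage and current, each with its own uncertainty
        v = UncertainNumber(12.5, 0.2, "voltage")
        i = UncertainNumber(0.50, 0.01, "current")
        p = v * i
        print(f"P = {p.value:.3f} W, u(P) = {p.u:.3f} W")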

  19. Object-oriented software for evaluating measurement uncertainty

    International Nuclear Information System (INIS)

    Hall, B D

    2013-01-01

    An earlier publication (Hall 2006 Metrologia 43 L56–61) introduced the notion of an uncertain number that can be used in data processing to represent quantity estimates with associated uncertainty. The approach can be automated, allowing data processing algorithms to be decomposed into convenient steps, so that complicated measurement procedures can be handled. This paper illustrates the uncertain-number approach using several simple measurement scenarios and two different software tools. One is an extension library for Microsoft Excel®. The other is a special-purpose calculator using the Python programming language. (paper)

  20. Uncertainty Quantification and Comparison of Weld Residual Stress Measurements and Predictions.

    Energy Technology Data Exchange (ETDEWEB)

    Lewis, John R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Brooks, Dusty Marie [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-10-01

    In pressurized water reactors, the prevention, detection, and repair of cracks within dissimilar metal welds is essential to ensure proper plant functionality and safety. Weld residual stresses, which are difficult to model and cannot be directly measured, contribute to the formation and growth of cracks due to primary water stress corrosion cracking. Additionally, the uncertainty in weld residual stress measurements and modeling predictions is not well understood, further complicating the prediction of crack evolution. The purpose of this document is to develop methodology to quantify the uncertainty associated with weld residual stress that can be applied to modeling predictions and experimental measurements. Ultimately, the results can be used to assess the current state of uncertainty and to build confidence in both modeling and experimental procedures. The methodology consists of statistically modeling the variation in the weld residual stress profiles using functional data analysis techniques. Uncertainty is quantified using statistical bounds (e.g. confidence and tolerance bounds) constructed with a semi-parametric bootstrap procedure. Such bounds describe the range in which quantities of interest, such as means, are expected to lie as evidenced by the data. The methodology is extended to provide direct comparisons between experimental measurements and modeling predictions by constructing statistical confidence bounds for the average difference between the two quantities. The statistical bounds on the average difference can be used to assess the level of agreement between measurements and predictions. The methodology is applied to experimental measurements of residual stress obtained using two strain relief measurement methods and predictions from seven finite element models developed by different organizations during a round robin study.
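
    A stripped-down version of the bootstrap step is sketched below in Python: whole stress profiles are resampled, and pointwise 95% bounds are placed on the average measured-minus-predicted difference. The report's actual procedure uses functional data analysis and a semi-parametric bootstrap, so this nonparametric sketch with synthetic profiles is only a rough analogue.

        import numpy as np

        rng = np.random.default_rng(4)

        # Synthetic residual-stress profiles (MPa) sampled at the same depths:
        # rows = repeated measurements or model realizations, columns = depths.
        measured = rng.normal(200, 25, size=(8, 20)) + np.linspace(0, -150, 20)
        predicted = rng.normal(195, 30, size=(7, 20)) + np.linspace(0, -140, 20)

        def boot_mean_diff(a, b, n_boot=5000, level=0.95):
            """Bootstrap bounds for the average difference between two sets of
            profiles, resampling whole profiles at a time."""
            diffs = np.empty((n_boot, a.shape[1]))
            for k in range(n_boot):
                a_star = a[rng.integers(0, a.shape[0], a.shape[0])]
                b_star = b[rng.integers(0, b.shape[0], b.shape[0])]
                diffs[k] = a_star.mean(axis=0) - b_star.mean(axis=0)
            alpha = (1 - level) / 2
            return np.percentile(diffs, [100 * alpha, 100 * (1 - alpha)], axis=0)

        lower, upper = boot_mean_diff(measured, predicted)
        print("pointwise 95% bounds on the mean (measured - predicted) difference:")
        print(np.round(lower[:5], 1), "...", np.round(upper[:5], 1), "...")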

  1. Assessment of Measurement Uncertainty Values of the Scandium Determination in Marine Sediment

    International Nuclear Information System (INIS)

    Rina-Mulyaningsih, Th.

    2005-01-01

    A test result is meaningless if it is not accompanied by an uncertainty value; the same holds for the result of the Sc analysis in a marine sediment sample. The measurement uncertainty of the Sc analysis in marine sediment was therefore assessed. The experiment was done in the AAN laboratory at Serpong. The uncertainty calculation for the Sc analysis showed that the uncertainty components come from: preparation of the sample and the standard/comparator, purity of the standard, counting statistics (sample and standard), repeatability, nuclear data and decay correction. The uncertainty assessment must also be done for the analysis of other elements, because each element has different nuclear and physical properties. (author)

  2. Measurement Uncertainty of Dew-Point Temperature in a Two-Pressure Humidity Generator

    Science.gov (United States)

    Martins, L. Lages; Ribeiro, A. Silva; Alves e Sousa, J.; Forbes, Alistair B.

    2012-09-01

    This article describes the measurement uncertainty evaluation of the dew-point temperature when using a two-pressure humidity generator as a reference standard. The estimation of the dew-point temperature involves the solution of a non-linear equation for which iterative solution techniques, such as the Newton-Raphson method, are required. Previous studies have already been carried out using the GUM method and the Monte Carlo method but have not discussed the impact of the approximate numerical method used to provide the temperature estimation. One of the aims of this article is to take this approximation into account. Following the guidelines presented in the GUM Supplement 1, two alternative approaches can be developed: the forward measurement uncertainty propagation by the Monte Carlo method when using the Newton-Raphson numerical procedure; and the inverse measurement uncertainty propagation by Bayesian inference, based on prior available information regarding the usual dispersion of values obtained by the calibration process. The measurement uncertainties obtained using these two methods can be compared with previous results. Other relevant issues concerning this research are the broad application to measurements that require hygrometric conditions obtained from two-pressure humidity generators and, also, the ability to provide a solution that can be applied to similar iterative models. The research also studied the factors influencing both the use of the Monte Carlo method (such as the seed value and the convergence parameter) and the inverse uncertainty propagation using Bayesian inference (such as the pre-assigned tolerance, prior estimate, and standard deviation) in terms of their accuracy and adequacy.
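
    The forward (Monte Carlo) route can be sketched by wrapping a Newton-Raphson solver in a sampling loop, as below in Python. The dew-point equation used here is a simplified Magnus-type placeholder with the same structure as the real problem (a non-linear equation solved iteratively), not the actual two-pressure generator model, and all pressures, temperatures and uncertainties are invented.

        import numpy as np

        rng = np.random.default_rng(5)

        def solve_dew_point(p1, p2, t_sat, tol=1e-10):
            """Newton-Raphson solution of a stand-in dew-point equation:
            e_s(t_dp) = (p2/p1) * e_s(t_sat), with a Magnus-type e_s."""
            def f(t):
                return np.exp(17.62 * t / (243.12 + t)) - (p2 / p1) * np.exp(
                    17.62 * t_sat / (243.12 + t_sat))
            def dfdt(t):
                return np.exp(17.62 * t / (243.12 + t)) * 17.62 * 243.12 / (243.12 + t) ** 2
            t = t_sat  # initial guess
            for _ in range(50):
                step = f(t) / dfdt(t)
                t -= step
                if abs(step) < tol:
                    break
            return t

        # Forward Monte Carlo propagation through the iterative solver
        n = 5000
        p1 = rng.normal(300.0, 0.3, n)     # saturator pressure, kPa (hypothetical)
        p2 = rng.normal(100.0, 0.1, n)     # chamber pressure, kPa (hypothetical)
        t_sat = rng.normal(20.0, 0.02, n)  # saturator temperature, deg C

        t_dp = np.array([solve_dew_point(a, b, c) for a, b, c in zip(p1, p2, t_sat)])
        print(f"dew point = {t_dp.mean():.3f} C, u = {t_dp.std():.3f} C")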

  3. Automated cleaning and uncertainty attribution of archival bathymetry based on a priori knowledge

    Science.gov (United States)

    Ladner, Rodney Wade; Elmore, Paul; Perkins, A. Louise; Bourgeois, Brian; Avera, Will

    2017-09-01

    Hydrographic offices hold large valuable historic bathymetric data sets, many of which were collected using older generation survey systems that contain little or no metadata and/or uncertainty estimates. These bathymetric data sets generally contain large outlier (errant) data points to clean, yet standard practice does not include rigorous automated procedures for systematic cleaning of these historical data sets and their subsequent conversion into reusable data formats. In this paper, we propose an automated method for this task. We utilize statistically diverse threshold tests, including a robust least trimmed squares method, to clean the data. We use LOESS weighted regression residuals together with a Student-t distribution to attribute uncertainty for each retained sounding; the resulting uncertainty values compare favorably with native estimates of uncertainty from co-located data sets, which we use to estimate a point-wise goodness-of-fit measure. Storing a cleansed validated data set augmented with uncertainty in a re-usable format provides the details of this analysis for subsequent users. Our test results indicate that the method significantly improves the quality of the data set while concurrently providing confidence interval estimates and point-wise goodness-of-fit estimates as referenced to current hydrographic practices.

  4. Learning about Measurement Uncertainties in Secondary Education: A Model of the Subject Matter

    Science.gov (United States)

    Priemer, Burkhard; Hellwig, Julia

    2018-01-01

    Estimating measurement uncertainties is important for experimental scientific work. However, this is very often neglected in school curricula and teaching practice, even though experimental work is seen as a fundamental part of teaching science. In order to call attention to the relevance of measurement uncertainties, we developed a comprehensive…

  5. GUM approach to uncertainty estimations for online 220Rn concentration measurements using Lucas scintillation cell

    International Nuclear Information System (INIS)

    Sathyabama, N.

    2014-01-01

    It is now widely recognized that, when all of the known or suspected components of errors have been evaluated and corrected, there still remains an uncertainty, that is, a doubt about how well the result of the measurement represents the value of the quantity being measured. Evaluation of measurement data - Guide to the expression of Uncertainty in Measurement (GUM) is a guidance document, the purpose of which is to promote full information on how uncertainty statements are arrived at and to provide a basis for the international comparison of measurement results. In this paper, uncertainty estimations following GUM guidelines have been made for the measured values of online thoron concentrations using a Lucas scintillation cell to prove that the correction for disequilibrium between 220Rn and 216Po is significant in online 220Rn measurements.

  6. PDF uncertainties in precision electroweak measurements, including the W mass, in ATLAS

    CERN Document Server

    Cooper-Sarkar, Amanda; The ATLAS collaboration

    2015-01-01

    Now that the Higgs mass is known, all the parameters of the SM are known, but with what accuracy? Precision EW measurements test the self-consistency of the SM and thus can give hints of BSM physics. Precision measurements of $\sin^2\theta_W$ and the W mass are limited by PDF uncertainties. This contribution discusses these uncertainties and what can be done to improve them.

  7. Comparison of ISO-GUM and Monte Carlo Method for Evaluation of Measurement Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Ha, Young-Cheol; Her, Jae-Young; Lee, Seung-Jun; Lee, Kang-Jin [Korea Gas Corporation, Daegu (Korea, Republic of)

    2014-07-15

    To supplement the ISO-GUM method for the evaluation of measurement uncertainty, a simulation program using the Monte Carlo method (MCM) was developed, and the MCM and GUM methods were compared. The results are as follows: (1) Even under a non-normal probability distribution of the measurement, MCM provides an accurate coverage interval; (2) Even if a probability distribution that emerged from combining a few non-normal distributions looks as normal, there are cases in which the actual distribution is not normal and the non-normality can be determined by the probability distribution of the combined variance; and (3) If type-A standard uncertainties are involved in the evaluation of measurement uncertainty, GUM generally offers an under-valued coverage interval. However, this problem can be solved by the Bayesian evaluation of type-A standard uncertainty. In this case, the effective degree of freedom for the combined variance is not required in the evaluation of expanded uncertainty, and the appropriate coverage factor for 95% level of confidence was determined to be 1.96.
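
    The flavour of the comparison can be reproduced with a small Python sketch: for a model y = x1/x2 with one normal and one rectangular input, the first-order GUM interval (coverage factor 1.96) is set against the 95% coverage interval obtained by Monte Carlo. The model and all numbers are illustrative and not taken from the paper.

        import numpy as np

        rng = np.random.default_rng(6)

        # Model with a mildly non-linear output and a non-normal input:
        # y = x1 / x2, x1 normal, x2 rectangular (uniform).
        x1_mean, x1_u = 10.0, 0.5
        x2_mean, x2_half = 2.0, 0.3                  # rectangular half-width

        # GUM (first-order law of propagation of uncertainty)
        u_x2 = x2_half / np.sqrt(3)
        y_gum = x1_mean / x2_mean
        u_gum = y_gum * np.sqrt((x1_u / x1_mean) ** 2 + (u_x2 / x2_mean) ** 2)
        print(f"GUM: y = {y_gum:.3f} +/- {1.96 * u_gum:.3f} (approx. 95%)")

        # MCM (GUM Supplement 1 style): propagate the distributions and report
        # a probabilistically symmetric 95% coverage interval.
        n = 10**6
        y_mc = rng.normal(x1_mean, x1_u, n) / rng.uniform(x2_mean - x2_half,
                                                          x2_mean + x2_half, n)
        lo, hi = np.percentile(y_mc, [2.5, 97.5])
        print(f"MCM: y = {np.mean(y_mc):.3f}, 95% coverage interval [{lo:.3f}, {hi:.3f}]")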

  8. Measurement uncertainty of dissolution test of acetaminophen immediate release tablets using Monte Carlo simulations

    Directory of Open Access Journals (Sweden)

    Daniel Cancelli Romero

    2017-10-01

    Full Text Available Analytical results are widely used to assess batch-by-batch conformity and pharmaceutical equivalence, as well as in the development of drug products. Despite this, few papers describing the measurement uncertainty estimation associated with these results were found in the literature. Here, we describe a simple procedure used for estimating the measurement uncertainty associated with the dissolution test of acetaminophen tablets. A fractional factorial design was used to define a mathematical model that explains the amount of acetaminophen dissolved (%) as a function of time of dissolution (from 20 to 40 minutes), volume of dissolution media (from 800 to 1000 mL), pH of dissolution media (from 2.0 to 6.8), and rotation speed (from 40 to 60 rpm). Using Monte Carlo simulations, we estimated the measurement uncertainty for the dissolution test of acetaminophen tablets (95.2 ± 1.0%), with a 95% confidence level. Rotation speed was the most important source of uncertainty, contributing about 96.2% of the overall uncertainty. Finally, it is important to note that the uncertainty calculated in this paper reflects the expected uncertainty of the dissolution test, and does not consider variations in the content of acetaminophen.

  9. Rigorous Photogrammetric Processing of CHANG'E-1 and CHANG'E-2 Stereo Imagery for Lunar Topographic Mapping

    Science.gov (United States)

    Di, K.; Liu, Y.; Liu, B.; Peng, M.

    2012-07-01

    Chang'E-1 (CE-1) and Chang'E-2 (CE-2) are the two lunar orbiters of China's lunar exploration program. Topographic mapping using CE-1 and CE-2 images is of great importance for scientific research as well as for preparation of landing and surface operation of Chang'E-3 lunar rover. In this research, we developed rigorous sensor models of CE-1 and CE-2 CCD cameras based on push-broom imaging principle with interior and exterior orientation parameters. Based on the rigorous sensor model, the 3D coordinate of a ground point in lunar body-fixed (LBF) coordinate system can be calculated by space intersection from the image coordinates of conjugate points in stereo images, and the image coordinates can be calculated from 3D coordinates by back-projection. Due to uncertainties of the orbit and the camera, the back-projected image points are different from the measured points. In order to reduce these inconsistencies and improve precision, we proposed two methods to refine the rigorous sensor model: 1) refining EOPs by correcting the attitude angle bias, 2) refining the interior orientation model by calibration of the relative position of the two linear CCD arrays. Experimental results show that the mean back-projection residuals of CE-1 images are reduced to better than 1/100 pixel by method 1 and the mean back-projection residuals of CE-2 images are reduced from over 20 pixels to 0.02 pixel by method 2. Consequently, high precision DEM (Digital Elevation Model) and DOM (Digital Ortho Map) are automatically generated.

  10. Quantifying measurement uncertainties in ADCP measurements in non-steady, inhomogeneous flow

    Science.gov (United States)

    Schäfer, Stefan

    2017-04-01

    The author presents a laboratory study of fixed-platform four-beam ADCP and three-beam ADV measurements in the tailrace of a micro hydro power setup with a 35 kW Kaplan turbine and 2.5 m head. The datasets discussed quantify measurement uncertainties of the ADCP measurement technique coming from non-steady, inhomogeneous flow. For a constant discharge of 1.5 m3/s, two different flow scenarios were investigated: one being the regular tailrace flow downstream of the draft tube and the second being a straightened, less inhomogeneous flow, which was generated by the use of a flow straightening device: a rack of 40 mm diameter pipe sections mounted right behind the draft tube. ADCP measurements (sampling rate 1.35 Hz) were conducted at three distances behind the draft tube and compared bin-wise to measurements of three simultaneously measuring ADV probes (sampling rate 64 Hz). The ADV probes were aligned horizontally and the ADV bins were placed in the centers of two facing ADCP bins and in the vertical under the ADCP probe of the corresponding depth. Rotating the ADV probes by 90° allowed for measurements of the other two facing ADCP bins. For reasons of mutual probe interaction, ADCP and ADV measurements were not conducted at the same time. The datasets were evaluated by using mean and fluctuation velocities. Turbulence parameters were calculated and compared as far as applicable. Uncertainties coming from non-steady flow were estimated with the normalized mean square error and evaluated by comparing long-term measurements of 60 minutes to shorter measurement intervals. Uncertainties coming from inhomogeneous flow were evaluated by comparison of ADCP with ADV data along the ADCP beams where ADCP data were effectively measured and in the vertical under the ADCP probe where velocities of the ADCP measurements were displayed. Errors coming from non-steady flow could be compensated through sufficiently long measurement intervals with high enough sampling rates depending on the

  11. Interval-parameter chance-constraint programming model for end-of-life vehicles management under rigorous environmental regulations.

    Science.gov (United States)

    Simic, Vladimir

    2016-06-01

    As the number of end-of-life vehicles (ELVs) is estimated to increase to 79.3 million units per year by 2020 (e.g., 40 million units were generated in 2010), there is strong motivation to effectively manage this fast-growing waste flow. Intensive work on the management of ELVs is necessary in order to more successfully tackle this important environmental challenge. This paper proposes an interval-parameter chance-constraint programming model for end-of-life vehicles management under rigorous environmental regulations. The proposed model can incorporate various uncertainty information in the modeling process. The complex relationships between different ELV management sub-systems are successfully addressed. In particular, the formulated model can help identify optimal patterns of procurement from multiple sources of ELV supply, production and inventory planning in multiple vehicle recycling factories, and allocation of sorted material flows to multiple final destinations under rigorous environmental regulations. A case study is conducted in order to demonstrate the potential and applicability of the proposed model. Various constraint-violation probability levels are examined in detail. Influences of parameter uncertainty on model solutions are thoroughly investigated. Useful solutions for the management of ELVs are obtained under different probabilities of violating system constraints. The formulated model is able to tackle a hard ELV management problem in which uncertainty is pervasive. The presented model has advantages in providing a basis for determining long-term ELV management plans with desired compromises between the economic efficiency of the vehicle recycling system and system-reliability considerations. The results are helpful for supporting the generation and improvement of ELV management plans. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. Chemical kinetic model uncertainty minimization through laminar flame speed measurements

    Science.gov (United States)

    Park, Okjoo; Veloo, Peter S.; Sheen, David A.; Tao, Yujie; Egolfopoulos, Fokion N.; Wang, Hai

    2016-01-01

    Laminar flame speed measurements were carried out for mixtures of air with eight C3-C4 hydrocarbons (propene, propane, 1,3-butadiene, 1-butene, 2-butene, iso-butene, n-butane, and iso-butane) at room temperature and ambient pressure. Along with C1-C2 hydrocarbon data reported in a recent study, the entire dataset was used to demonstrate how laminar flame speed data can be utilized to explore and minimize the uncertainties in a reaction model for foundation fuels. The USC Mech II kinetic model was chosen as a case study. The method of uncertainty minimization using polynomial chaos expansions (MUM-PCE) (D.A. Sheen and H. Wang, Combust. Flame 2011, 158, 2358–2374) was employed to constrain the model uncertainty for laminar flame speed predictions. Results demonstrate that a reaction model constrained only by the laminar flame speed values of methane/air flames notably reduces the uncertainty in the predictions of the laminar flame speeds of C3 and C4 alkanes, because the key chemical pathways of all of these flames are similar to each other. The uncertainty in model predictions for flames of unsaturated C3-C4 hydrocarbons remains significant without considering fuel-specific laminar flame speeds in the constraining target data set, because the secondary rate-controlling reaction steps are different from those in the saturated alkanes. It is shown that the constraints provided by the laminar flame speeds of the foundation fuels could notably reduce the uncertainties in the predictions of laminar flame speeds of C4 alcohol/air mixtures. Furthermore, it is demonstrated that an accurate prediction of the laminar flame speed of a particular C4 alcohol/air mixture is better achieved through measurements for key molecular intermediates formed during the pyrolysis and oxidation of the parent fuel. PMID:27890938

  13. Onset of rigor mortis is earlier in red muscle than in white muscle.

    Science.gov (United States)

    Kobayashi, M; Takatori, T; Nakajima, M; Sakurada, K; Hatanaka, K; Ikegaya, H; Matsuda, Y; Iwase, H

    2000-01-01

    Rigor mortis is thought to be related to falling ATP levels in muscles postmortem. We measured rigor mortis as tension determined isometrically in three rat leg muscles in liquid paraffin kept at 37 degrees C or 25 degrees C--two red muscles, red gastrocnemius (RG) and soleus (SO) and one white muscle, white gastrocnemius (WG). Onset, half and full rigor mortis occurred earlier in RG and SO than in WG both at 37 degrees C and at 25 degrees C even though RG and WG were portions of the same muscle. This suggests that rigor mortis directly reflects the postmortem intramuscular ATP level, which decreases more rapidly in red muscle than in white muscle after death. Rigor mortis was more retarded at 25 degrees C than at 37 degrees C in each type of muscle.

  14. Optimized Clustering Estimators for BAO Measurements Accounting for Significant Redshift Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Ross, Ashley J. [Portsmouth U., ICG; Banik, Nilanjan [Fermilab; Avila, Santiago [Madrid, IFT; Percival, Will J. [Portsmouth U., ICG; Dodelson, Scott [Fermilab; Garcia-Bellido, Juan [Madrid, IFT; Crocce, Martin [ICE, Bellaterra; Elvin-Poole, Jack [Jodrell Bank; Giannantonio, Tommaso [Cambridge U., KICC; Manera, Marc [Cambridge U., DAMTP; Sevilla-Noarbe, Ignacio [Madrid, CIEMAT

    2017-05-15

    We determine an optimized clustering statistic to be used for galaxy samples with significant redshift uncertainty, such as those that rely on photometric redshifts. To do so, we study the BAO information content as a function of the orientation of galaxy clustering modes with respect to their angle to the line-of-sight (LOS). The clustering along the LOS, as observed in a redshift-space with significant redshift uncertainty, has contributions from clustering modes with a range of orientations with respect to the true LOS. For redshift uncertainty $\sigma_z \geq 0.02(1+z)$ we find that while the BAO information is confined to transverse clustering modes in the true space, it is spread nearly evenly in the observed space. Thus, measuring clustering in terms of the projected separation (regardless of the LOS) is an efficient and nearly lossless compression of the signal for $\sigma_z \geq 0.02(1+z)$. For reduced redshift uncertainty, a more careful consideration is required. We then use more than 1700 realizations of galaxy simulations mimicking the Dark Energy Survey Year 1 sample to validate our analytic results and optimized analysis procedure. We find that using the correlation function binned in projected separation, we can achieve uncertainties that are within 10 per cent of those predicted by Fisher matrix forecasts. We predict that DES Y1 should achieve a 5 per cent distance measurement using our optimized methods. We expect the results presented here to be important for any future BAO measurements made using photometric redshift data.

  15. An uncertainty importance measure using a distance metric for the change in a cumulative distribution function

    International Nuclear Information System (INIS)

    Chun, Moon-Hyun; Han, Seok-Jung; Tak, Nam-IL

    2000-01-01

    A simple measure of uncertainty importance using the entire change of cumulative distribution functions (CDFs) has been developed for use in probabilistic safety assessments (PSAs). The entire change of CDFs is quantified in terms of the metric distance between two CDFs. The metric distance measure developed in this study reflects the relative impact of distributional changes of inputs on the change of an output distribution, while most of the existing uncertainty importance measures reflect the magnitude of relative contribution of input uncertainties to the output uncertainty. The present measure has been evaluated analytically for various analytical distributions to examine its characteristics. To illustrate the applicability and strength of the present measure, two examples are provided. The first example is an application of the present measure to a typical problem of a system fault tree analysis and the second one is for a hypothetical non-linear model. Comparisons of the present result with those obtained by existing uncertainty importance measures show that the metric distance measure is a useful tool to express the measure of uncertainty importance in terms of the relative impact of distributional changes of inputs on the change of an output distribution.
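
    As a rough illustration of an uncertainty importance measure based on the distance between CDFs, the Python sketch below perturbs the distribution of one input of a hypothetical non-linear model and measures how far the empirical output CDF moves, here with a simple area-between-CDFs metric; the specific metric and the models used in the paper are not reproduced.

      import numpy as np

      def cdf_distance(sample_a, sample_b, grid_size=2000):
          """Area between two empirical CDFs (one possible metric distance)."""
          grid = np.linspace(min(sample_a.min(), sample_b.min()),
                             max(sample_a.max(), sample_b.max()), grid_size)
          cdf_a = np.searchsorted(np.sort(sample_a), grid, side="right") / sample_a.size
          cdf_b = np.searchsorted(np.sort(sample_b), grid, side="right") / sample_b.size
          return np.sum(np.abs(cdf_a - cdf_b)) * (grid[1] - grid[0])

      def model(x1, x2):
          # hypothetical non-linear output, standing in for a fault-tree top event
          return x1 ** 2 + np.exp(x2)

      rng = np.random.default_rng(1)
      n = 100_000
      base = model(rng.normal(1.0, 0.1, n), rng.normal(0.5, 0.05, n))
      # importance of x1: widen its distribution and see how much the output CDF moves
      perturbed = model(rng.normal(1.0, 0.2, n), rng.normal(0.5, 0.05, n))
      print("metric-distance importance of x1:", cdf_distance(base, perturbed))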

  16. Calibration Uncertainties in the Droplet Measurement Technologies Cloud Condensation Nuclei Counter

    Science.gov (United States)

    Hibert, Kurt James

    Cloud condensation nuclei (CCN) serve as the nucleation sites for the condensation of water vapor in Earth's atmosphere and are important for their effect on climate and weather. The influence of CCN on cloud radiative properties (aerosol indirect effect) is the most uncertain of quantified radiative forcing changes that have occurred since pre-industrial times. CCN influence the weather because intrinsic and extrinsic aerosol properties affect cloud formation and precipitation development. To quantify these effects, it is necessary to accurately measure CCN, which requires accurate calibrations using a consistent methodology. Furthermore, the calibration uncertainties are required to compare measurements from different field projects. CCN uncertainties also aid the integration of CCN measurements with atmospheric models. The commercially available Droplet Measurement Technologies (DMT) CCN Counter is used by many research groups, so it is important to quantify its calibration uncertainty. Uncertainties in the calibration of the DMT CCN counter exist in the flow rate and supersaturation values. The concentration depends on the accuracy of the flow rate calibration, which does not have a large (4.3 %) uncertainty. The supersaturation depends on chamber pressure, temperature, and flow rate. The supersaturation calibration is a complex process since the chamber's supersaturation must be inferred from a temperature difference measurement. Additionally, calibration errors can result from the Kohler theory assumptions, fitting methods utilized, the influence of multiply-charged particles, and calibration points used. In order to determine the calibration uncertainties and the pressure dependence of the supersaturation calibration, three calibrations are done at each pressure level: 700, 840, and 980 hPa. Typically 700 hPa is the pressure used for aircraft measurements in the boundary layer, 840 hPa is the calibration pressure at DMT in Boulder, CO, and 980 hPa is the

  17. A study on the propagation of measurement uncertainties into the result on a turbine performance test

    International Nuclear Information System (INIS)

    Cho, Soo Yong; Park, Chan Woo

    2004-01-01

    Uncertainties generated from the individual measured variables have an influence on the uncertainty of the experimental result through a data reduction equation. In this study, a performance test of a single-stage axial-type turbine is conducted, and total-to-total efficiencies are measured at various off-design points in the low-pressure, cold state. Based on the experimental apparatus, a data reduction equation for turbine efficiency is formulated and six measured variables are selected. Codes are written to calculate the efficiency, the uncertainty of the efficiency, and the sensitivity of the efficiency uncertainty to each of the measured quantities. The influence of each measured variable on the experimental result is determined. Results show that the largest Uncertainty Magnification Factor (UMF) value is obtained for the inlet total pressure among the six measured variables, and its value is always greater than one. The UMF values of the inlet total temperature, the torque, and the RPM are always one. The Uncertainty Percentage Contribution (UPC) of the RPM shows the lowest influence on the uncertainty of the turbine efficiency, but the UPC of the torque has the largest influence on the result among the measured variables. These results are applied to find the correct direction for meeting an uncertainty requirement of the experimental result in the planning or development phase of an experiment, and also to offer ideas for preparing a measurement system in the planning phase.
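
    The sketch below illustrates, in Python, how an Uncertainty Magnification Factor (UMF) and an Uncertainty Percentage Contribution (UPC) can be computed from a data reduction equation by numerical sensitivity analysis. The efficiency equation, the nominal values and the standard uncertainties are illustrative assumptions, not the ones used in the paper.

      import numpy as np

      def efficiency(torque, rpm, mdot, T_in, p_in, p_out, cp=1004.0, gamma=1.4):
          """Simplified turbine efficiency data reduction equation (illustrative only)."""
          power = torque * rpm * 2 * np.pi / 60.0
          ideal = mdot * cp * T_in * (1.0 - (p_out / p_in) ** ((gamma - 1.0) / gamma))
          return power / ideal

      # nominal values and standard uncertainties (all hypothetical)
      names = ["torque", "rpm", "mdot", "T_in", "p_in", "p_out"]
      x = np.array([50.0, 3000.0, 1.2, 300.0, 150e3, 101e3])
      u = np.array([0.2, 1.0, 0.01, 0.5, 300.0, 200.0])

      r0 = efficiency(*x)
      umf, contrib = [], []
      for i in range(len(x)):
          dx = 1e-6 * x[i]
          xp = x.copy(); xp[i] += dx
          dr_dxi = (efficiency(*xp) - r0) / dx        # numerical sensitivity coefficient
          umf.append((x[i] / r0) * dr_dxi)            # uncertainty magnification factor
          contrib.append((dr_dxi * u[i]) ** 2)        # absolute contribution to u_r^2

      upc = 100.0 * np.array(contrib) / sum(contrib)  # uncertainty percentage contribution
      for n_, m, p in zip(names, umf, upc):
          print(f"{n_:6s}  UMF = {m:+.2f}   UPC = {p:5.1f} %")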

  18. A super-resolution approach for uncertainty estimation of PIV measurements

    NARCIS (Netherlands)

    Sciacchitano, A.; Wieneke, B.; Scarano, F.

    2012-01-01

    A super-resolution approach is proposed for the a posteriori uncertainty estimation of PIV measurements. The measured velocity field is employed to determine the displacement of individual particle images. A disparity set is built from the residual distance between paired particle images of

  19. Experimental Test of Entropic Noise-Disturbance Uncertainty Relations for Spin-1/2 Measurements.

    Science.gov (United States)

    Sulyok, Georg; Sponar, Stephan; Demirel, Bülent; Buscemi, Francesco; Hall, Michael J W; Ozawa, Masanao; Hasegawa, Yuji

    2015-07-17

    Information-theoretic definitions for noise and disturbance in quantum measurements were given in [Phys. Rev. Lett. 112, 050401 (2014)] and a state-independent noise-disturbance uncertainty relation was obtained. Here, we derive a tight noise-disturbance uncertainty relation for complementary qubit observables and carry out an experimental test. Successive projective measurements on the neutron's spin-1/2 system, together with a correction procedure which reduces the disturbance, are performed. Our experimental results saturate the tight noise-disturbance uncertainty relation for qubits when an optimal correction procedure is applied.

  20. OR14-V-Uncertainty-PD2La Uncertainty Quantification for Nuclear Safeguards and Nondestructive Assay Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Nicholson, Andrew D. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Croft, Stephen [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); McElroy, Robert Dennis [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-08-01

    The various methods of nondestructive assay (NDA) of special nuclear material (SNM) have applications in nuclear nonproliferation, including detection and identification of illicit SNM at border crossings and quantifying SNM at nuclear facilities for safeguards. No assay method is complete without “error bars,” which provide one way of expressing confidence in the assay result. Consequently, NDA specialists typically provide error bars and also partition total uncertainty into “random” and “systematic” components so that, for example, an error bar can be developed for the total mass estimate in multiple items. Uncertainty Quantification (UQ) for NDA has always been important, but it is recognized that greater rigor is needed and achievable using modern statistical methods.

  1. Uncertainties in pipeline water percentage measurement

    Energy Technology Data Exchange (ETDEWEB)

    Scott, Bentley N.

    2005-07-01

    Measurement of the quantity, density, average temperature and water percentage in petroleum pipelines has been an issue of prime importance. The methods of measurement have been investigated and have seen continued improvement over the years. Questions are being asked as to the reliability of the measurement of water in the oil through sampling systems originally designed and tested for a narrow range of densities. Today most facilities' sampling systems handle vastly increased ranges of density and types of crude oils. Issues of pipeline integrity, product loss and production balances are placing further demands on the issues of accurate measurement. Water percentage is one area that has not received the attention necessary to understand the many factors involved in making a reliable measurement. A previous paper [1] discussed the issues of uncertainty of the measurement from a statistical perspective. This paper will outline many of the issues of where the errors lie in the manual and automatic methods in use today. A routine to use the data collected by the analyzers in the on-line system for validation of the measurements will be described. (author) (tk)

  2. Evaluation of the combined measurement uncertainty in isotope dilution by MC-ICP-MS

    International Nuclear Information System (INIS)

    Fortunato, G.; Wunderli, S.

    2003-01-01

    The combination of metrological weighing, the measurement of isotope amount ratios by a multicollector inductively coupled plasma mass spectrometer (MC-ICP-MS) and the use of high-purity reference materials are the cornerstones to achieve improved results for the amount content of lead in wine by the reversed isotope dilution technique. Isotope dilution mass spectrometry (IDMS) and reversed IDMS have the potential to be a so-called primary method, with which close comparability and well-stated combined measurement uncertainties can be obtained. This work describes the detailed uncertainty budget determination using the ISO-GUM approach. The traces of lead in wine were separated from the matrix by ion exchange chromatography after HNO3/H2O2 microwave digestion. The thallium isotope amount ratio (n(205Tl)/n(203Tl)) was used to correct for mass discrimination using an exponential model approach. The corrected lead isotope amount ratio n(206Pb)/n(208Pb) for the isotopic standard SRM 981 measured in our laboratory was compared with ratio values considered to be the least uncertain. The result has been compared in a so-called pilot study ''lead in wine'' organised by the CCQM (Comité Consultatif pour la Quantité de Matière, BIPM, Paris; the highest measurement authority for analytical chemical measurements). The result for the lead amount content k(Pb) and the corresponding expanded uncertainty U given by our laboratory was: k(Pb) = 1.329 x 10^-10 mol g^-1 (amount content of lead in wine), U[k(Pb)] = 1.0 x 10^-12 mol g^-1 (expanded uncertainty U = k x uc, k = 2). The uncertainty of the main influence parameter of the combined measurement uncertainty was determined to be the isotope amount ratio R(206,B) of the blend between the enriched spike and the sample. (orig.)

  3. A real-time assessment of measurement uncertainty in the experimental characterization of sprays

    International Nuclear Information System (INIS)

    Panão, M R O; Moreira, A L N

    2008-01-01

    This work addresses the estimation of the measurement uncertainty of discrete probability distributions used in the characterization of sprays. A real-time assessment of this measurement uncertainty is further investigated, particularly concerning the informative quality of the measured distribution and the influence of acquiring additional information on the knowledge retrieved from statistical analysis. The informative quality is associated with the entropy concept as understood in information theory (Shannon entropy), normalized by the entropy of the most informative experiment. A new empirical correlation is derived between the error accuracy of a discrete cumulative probability distribution and the normalized Shannon entropy. The results include case studies using: (i) spray impingement measurements to study the applicability of the real-time assessment of measurement uncertainty, and (ii) the simulation of discrete probability distributions of unknown shape or function to test the applicability of the new correlation
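
    The normalized Shannon entropy referred to above can be computed directly from a measured discrete distribution; the short Python sketch below shows one straightforward implementation. The empirical correlation derived in the paper is not reproduced, and the histogram values are invented for illustration.

      import numpy as np

      def normalized_shannon_entropy(counts):
          """Shannon entropy of a discrete distribution normalized by the entropy
          of the most informative (uniform) distribution over the same bins."""
          counts = np.asarray(counts, dtype=float)
          p = counts / counts.sum()
          p = p[p > 0]                                  # 0*log(0) treated as 0
          h = -np.sum(p * np.log(p))
          h_max = np.log(counts.size)                   # entropy of the uniform case
          return h / h_max

      # e.g. a droplet-size histogram accumulated during a spray measurement
      histogram = [5, 18, 42, 61, 40, 19, 8, 2]
      print(f"normalized entropy = {normalized_shannon_entropy(histogram):.3f}")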

  4. Uncertainty as Information: Narrowing the Science-policy Gap

    Directory of Open Access Journals (Sweden)

    G. A. Bradshaw

    2000-07-01

    Full Text Available Conflict and indecision are hallmarks of environmental policy formulation. Some argue that the requisite information and certainty fall short of scientific standards for decision making; others argue that science is not the issue and that indecisiveness reflects a lack of political willpower. One of the most difficult aspects of translating science into policy is scientific uncertainty. Whereas scientists are familiar with uncertainty and complexity, the public and policy makers often seek certainty and deterministic solutions. We assert that environmental policy is most effective if scientific uncertainty is incorporated into a rigorous decision-theoretic framework as knowledge, not ignorance. The policies that best utilize scientific findings are defined here as those that accommodate the full scope of scientifically based predictions.

  5. Evaluation of the measurement uncertainty when measuring the resistance of solid isolating materials to tracking

    Science.gov (United States)

    Stare, E.; Beges, G.; Drnovsek, J.

    2006-07-01

    This paper presents the results of research into the measurement of the resistance of solid isolating materials to tracking. Two types of tracking were investigated: the proof tracking index (PTI) and the comparative tracking index (CTI). Evaluation of the measurement uncertainty in a case study was performed using a test method in accordance with the IEC 60112 standard. In the scope of the tests performed here, this particular test method was used to ensure the safety of electrical appliances. According to the EN ISO/IEC 17025 standard (EN ISO/IEC 17025), in the process of conformity assessment, the evaluation of the measurement uncertainty of the test method should be carried out. In the present article, possible influential parameters that are in accordance with the third and fourth editions of the standard IEC 60112 are discussed. The differences, ambiguities or lack of guidance referring to both editions of the standard are described in the article 'Ambiguities in technical standards—case study IEC 60112—measuring the resistance of solid isolating materials to tracking' (submitted for publication). Several hundred measurements were taken in the present experiments in order to form the basis for the results and conclusions presented. A specific problem of the test (according to the IEC 60112 standard) is the great variety of influential physical parameters (mechanical, electrical, chemical, etc) that can affect the results. At the end of the present article therefore, there is a histogram containing information on the contributions to the measurement uncertainty.

  6. Effect of the sample matrix on measurement uncertainty in X-ray fluorescence analysis

    International Nuclear Information System (INIS)

    Morgenstern, P.; Brueggemann, L.; Wennrich, R.

    2005-01-01

    The estimation of measurement uncertainty, with reference to univariate calibration functions, is discussed in detail in the Eurachem Guide 'Quantifying Uncertainty in Analytical Measurement'. The adoption of these recommendations to quantitative X-ray fluorescence analysis (XRF) involves basic problems which are above all due to the strong influence of the sample matrix on the analytical response. In XRF-analysis, the proposed recommendations are consequently applicable only to the matrix corrected response. The application is also restricted with regard to both the matrices and analyte concentrations. In this context the present studies are aimed at the problems to predict measurement uncertainty also with reference to more variable sample compositions. The corresponding investigations are focused on the use of the intensity of the Compton scattered tube line as an internal standard to assess the effect of the individual sample matrix on the analytical response relatively to a reference matrix. Based on this concept the estimation of the measurement uncertainty of an analyte presented in an unknown specimen can be predicted in consideration of the data obtained under defined matrix conditions

  7. Tolerance analysis in manufacturing using process capability ratio with measurement uncertainty

    DEFF Research Database (Denmark)

    Mahshid, Rasoul; Mansourvar, Zahra; Hansen, Hans Nørgaard

    2017-01-01

    Tolerance analysis provides valuable information regarding the performance of a manufacturing process. It allows determining the maximum possible variation of a quality feature in production. Previous researches have focused on the application of tolerance analysis to the design of mechanical assemblies. In this paper, a new statistical analysis was applied to manufactured products to assess achieved tolerances when the process is known, using the capability ratio and expanded uncertainty. The analysis has benefits for process planning, determining actual precision limits, process optimization, and troubleshooting of a malfunctioning existing part. The capability measure is based on a number of measurements performed on the part's quality variable. Since the ratio relies on measurements, any possible error that is not eliminated has a notable negative impact on the results. Therefore, measurement uncertainty was used in combination with process...
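
    One common way to combine a capability ratio with an expanded measurement uncertainty U is to guard-band the specification limits by U before computing the index. The Python sketch below illustrates that idea with invented data; it is not necessarily the exact formulation used by the authors.

      import numpy as np

      def capability_with_uncertainty(measurements, lsl, usl, U):
          """Process capability indices with the tolerance band guard-banded by
          the expanded measurement uncertainty U (one common approach)."""
          x = np.asarray(measurements, dtype=float)
          mu, sigma = x.mean(), x.std(ddof=1)
          cp_raw = (usl - lsl) / (6 * sigma)
          # shrink the usable tolerance zone by U on each side
          cp_u = (usl - lsl - 2 * U) / (6 * sigma)
          cpk_u = min(usl - U - mu, mu - (lsl + U)) / (3 * sigma)
          return cp_raw, cp_u, cpk_u

      rng = np.random.default_rng(2)
      parts = rng.normal(10.000, 0.004, 50)            # measured feature, mm (synthetic)
      print(capability_with_uncertainty(parts, lsl=9.980, usl=10.020, U=0.002))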

  8. Strategy under uncertainty.

    Science.gov (United States)

    Courtney, H; Kirkland, J; Viguerie, P

    1997-01-01

    At the heart of the traditional approach to strategy lies the assumption that by applying a set of powerful analytic tools, executives can predict the future of any business accurately enough to allow them to choose a clear strategic direction. But what happens when the environment is so uncertain that no amount of analysis will allow us to predict the future? What makes for a good strategy in highly uncertain business environments? The authors, consultants at McKinsey & Company, argue that uncertainty requires a new way of thinking about strategy. All too often, they say, executives take a binary view: either they underestimate uncertainty to come up with the forecasts required by their companies' planning or capital-budgeting processes, or they overestimate it, abandon all analysis, and go with their gut instinct. The authors outline a new approach that begins by making a crucial distinction among four discrete levels of uncertainty that any company might face. They then explain how a set of generic strategies--shaping the market, adapting to it, or reserving the right to play at a later time--can be used in each of the four levels. And they illustrate how these strategies can be implemented through a combination of three basic types of actions: big bets, options, and no-regrets moves. The framework can help managers determine which analytic tools can inform decision making under uncertainty--and which cannot. At a broader level, it offers executives a discipline for thinking rigorously and systematically about uncertainty and its implications for strategy.

  9. Upper bounds on quantum uncertainty products and complexity measures

    Energy Technology Data Exchange (ETDEWEB)

    Guerrero, Angel; Sanchez-Moreno, Pablo; Dehesa, Jesus S. [Department of Atomic, Molecular and Nuclear Physics, University of Granada, Granada (Spain); Department of Applied Mathematics, University of Granada, Granada (Spain) and Institute Carlos I for Computational and Theoretical Physics, University of Granada, Granada (Spain); Department of Atomic, Molecular and Nuclear Physics, University of Granada, Granada (Spain); Institute Carlos I for Computational and Theoretical Physics, University of Granada, Granada (Spain)

    2011-10-15

    The position-momentum Shannon and Renyi uncertainty products of general quantum systems are shown to be bounded not only from below (through the known uncertainty relations), but also from above in terms of the Heisenberg-Kennard product. Moreover, the Cramer-Rao, Fisher-Shannon, and Lopez-Ruiz, Mancini, and Calbet shape measures of complexity (whose lower bounds have been recently found) are also bounded from above. The improvement of these bounds for systems subject to spherically symmetric potentials is also explicitly given. Finally, applications to hydrogenic and oscillator-like systems are done.

  10. Development of electrical efficiency measurement techniques for 10 kW-class SOFC system: Part II. Uncertainty estimation

    International Nuclear Information System (INIS)

    Tanaka, Yohei; Momma, Akihiko; Kato, Ken; Negishi, Akira; Takano, Kiyonami; Nozaki, Ken; Kato, Tohru

    2009-01-01

    Uncertainty of electrical efficiency measurement was investigated for a 10 kW-class SOFC system using town gas. Uncertainty of the heating value measured by the gas chromatography method on a mole base was estimated as ±0.12% at the 95% level of confidence. Micro-gas chromatography with/without CH4 quantification may be able to reduce the uncertainty of measurement. Calibration and uncertainty estimation methods are proposed for flow-rate measurement of town gas with thermal mass-flow meters or controllers. With adequate calibrations of the flowmeters, the flow rate of town gas or natural gas at 35 standard litres per minute can be measured within a relative uncertainty of ±1.0% at the 95% level of confidence. Uncertainty of power measurement can be as low as ±0.14% when a precise wattmeter is used and calibrated properly. It is clarified that the electrical efficiency of non-pressurized 10 kW-class SOFC systems can be measured within ±1.0% relative uncertainty at the 95% level of confidence with the developed techniques when the SOFC systems are operated relatively stably.
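
    For a multiplicative efficiency model (electrical efficiency = AC power divided by fuel flow times heating value), the relative standard uncertainties of the inputs combine in quadrature if they are uncorrelated. The Python sketch below applies this textbook GUM combination to the expanded-uncertainty figures quoted in the abstract; the assumption of uncorrelated inputs and the simple product/quotient model are illustrative, not a restatement of the authors' full budget.

      import math

      # relative standard uncertainties (expanded values from the abstract divided by k = 2)
      u_rel = {
          "heating value (GC)": 0.0012 / 2,   # ±0.12 % (k = 2)
          "fuel flow rate":     0.010  / 2,   # ±1.0 %  (k = 2)
          "AC power":           0.0014 / 2,   # ±0.14 % (k = 2)
      }

      # efficiency = P / (flow * heating_value): for a product/quotient model the
      # relative standard uncertainties add in quadrature (uncorrelated inputs assumed)
      u_eff = math.sqrt(sum(u ** 2 for u in u_rel.values()))
      print(f"relative expanded uncertainty of efficiency (k=2): ±{2 * u_eff * 100:.2f} %")

    The combined value, roughly ±1.0% (k = 2), is dominated by the fuel flow-rate term, which is consistent with the overall figure quoted above.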

  11. Measurement of nuclear activity with Ge detectors and its uncertainty

    International Nuclear Information System (INIS)

    Cortes P, C.A.

    1999-01-01

    The objective of this work is to analyse the influence quantities which affect the activity measurement of isolated gamma-emitting radioactive sources prepared by means of the gravimetric method, as well as to determine the uncertainty of such a measurement when it is carried out with a gamma spectrometry system with a germanium detector. The work is developed in five chapters. In the first, named Basic Principles, a brief description is given of the meaning of the word Measurement and its implications, and the concepts used in this work are presented. The second chapter presents the gravimetric method used for the preparation of the isolated gamma-emitting radioactive sources, addresses the problem of determining the main influence quantities which affect the measurement of their activity, and deduces the respective correction factors and their uncertainties. The third chapter describes the gamma spectrometry system used in this work for the measurement of the activity of isolated sources, as well as its performance and the experimental arrangement used. In the fourth chapter, the three previous items are applied in order to determine the uncertainty which would be obtained in the measurement of an isolated radioactive source prepared with the gravimetric method under the least favourable experimental conditions predicted from the results of chapter two. The conclusions are presented in the fifth chapter and are applied to establish the optimum conditions for the measurement of the activity of an isolated gamma-emitting radioactive source with a spectrometer with a germanium detector. (Author)

  12. The explicit treatment of model uncertainties in the presence of aleatory and epistemic parameter uncertainties in risk and reliability analysis

    International Nuclear Information System (INIS)

    Ahn, Kwang Il; Yang, Joon Eon

    2003-01-01

    In the risk and reliability analysis of complex technological systems, the primary concern of formal uncertainty analysis is to understand why uncertainties arise, and to evaluate how they impact the results of the analysis. In recent times, many of the uncertainty analyses have focused on parameters of the risk and reliability analysis models, whose values are uncertain in an aleatory or an epistemic way. As the field of parametric uncertainty analysis matures, however, more attention is being paid to the explicit treatment of uncertainties that are addressed in the predictive model itself as well as the accuracy of the predictive model. The essential steps for evaluating impacts of these model uncertainties in the presence of parameter uncertainties are to determine rigorously various sources of uncertainties to be addressed in an underlying model itself and in turn model parameters, based on our state-of-knowledge and relevant evidence. Answering clearly the question of how to characterize and treat explicitly the forgoing different sources of uncertainty is particularly important for practical aspects such as risk and reliability optimization of systems as well as more transparent risk information and decision-making under various uncertainties. The main purpose of this paper is to provide practical guidance for quantitatively treating various model uncertainties that would often be encountered in the risk and reliability modeling process of complex technological systems

  13. Developing scales measuring disorder-specific intolerance of uncertainty (DSIU): a new perspective on transdiagnostic

    NARCIS (Netherlands)

    Thibodeau, Michel A; Carleton, R Nicholas; McEvoy, Peter M; Zvolensky, Michael J; Brandt, Charles P; Boelen, Paul A; Mahoney, Alison E J; Deacon, Brett J; Asmundson, Gordon J G

    Intolerance of uncertainty (IU) is a construct of growing prominence in literature on anxiety disorders and major depressive disorder. Existing measures of IU do not define the uncertainty that respondents perceive as distressing. To address this limitation, we developed eight scales measuring

  14. Accelerating Biomedical Discoveries through Rigor and Transparency.

    Science.gov (United States)

    Hewitt, Judith A; Brown, Liliana L; Murphy, Stephanie J; Grieder, Franziska; Silberberg, Shai D

    2017-07-01

    Difficulties in reproducing published research findings have garnered a lot of press in recent years. As a funder of biomedical research, the National Institutes of Health (NIH) has taken measures to address underlying causes of low reproducibility. Extensive deliberations resulted in a policy, released in 2015, to enhance reproducibility through rigor and transparency. We briefly explain what led to the policy, describe its elements, provide examples and resources for the biomedical research community, and discuss the potential impact of the policy on translatability with a focus on research using animal models. Importantly, while increased attention to rigor and transparency may lead to an increase in the number of laboratory animals used in the near term, it will lead to more efficient and productive use of such resources in the long run. The translational value of animal studies will be improved through more rigorous assessment of experimental variables and data, leading to better assessments of the translational potential of animal models, for the benefit of the research community and society. Published by Oxford University Press on behalf of the Institute for Laboratory Animal Research 2017. This work is written by (a) US Government employee(s) and is in the public domain in the US.

  15. Role of turbulence fluctuations on uncertainties of acoustic Doppler current profiler discharge measurements

    Science.gov (United States)

    Tarrab, Leticia; Garcia, Carlos M.; Cantero, Mariano I.; Oberg, Kevin

    2012-01-01

    This work presents a systematic analysis quantifying the role of the presence of turbulence fluctuations on uncertainties (random errors) of acoustic Doppler current profiler (ADCP) discharge measurements from moving platforms. Data sets of three-dimensional flow velocities with high temporal and spatial resolution were generated from direct numerical simulation (DNS) of turbulent open channel flow. Dimensionless functions relating parameters quantifying the uncertainty in discharge measurements due to flow turbulence (relative variance and relative maximum random error) to sampling configuration were developed from the DNS simulations and then validated with field-scale discharge measurements. The validated functions were used to evaluate the role of the presence of flow turbulence fluctuations on uncertainties in ADCP discharge measurements. The results of this work indicate that random errors due to the flow turbulence are significant when: (a) a low number of transects is used for a discharge measurement, and (b) measurements are made in shallow rivers using high boat velocity (short time for the boat to cross a flow turbulence structure).

  16. Measurement uncertainty of ester number, acid number and patchouli alcohol of patchouli oil produced in Yogyakarta

    Science.gov (United States)

    Istiningrum, Reni Banowati; Saepuloh, Azis; Jannah, Wirdatul; Aji, Didit Waskito

    2017-03-01

    Yogyakarta is one of the patchouli oil distillation centers in Indonesia. The quality of patchouli oil greatly affects its market price. Therefore, testing the quality parameters of patchouli oil is an important concern, in part through determination of the measurement uncertainty. This study determines the measurement uncertainty of the ester number, the acid number and the patchouli alcohol content through a bottom-up approach. The contributors to the measurement uncertainty of the ester number are the sample mass, the blank and sample titration volumes, the molar mass of KOH, the HCl normality, and replication, while the contributors to the measurement uncertainty of the acid number are the sample mass, the sample titration volume, the relative molecular mass and normality of KOH, and repetition. The determination of patchouli alcohol by gas chromatography considers only repeatability as a source of measurement uncertainty because reference materials are not available.
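
    As an illustration of the bottom-up approach mentioned above, the Python sketch below propagates the listed contributors through the usual acid-number relation AN = 56.1·V·N/m by quadrature of relative uncertainties; all numerical values are invented for illustration and do not come from the study.

      import math

      def acid_number(V_ml, N_koh, m_g):
          """Acid number (mg KOH per g oil) from titration volume, KOH normality
          and sample mass; 56.1 g/mol is the molar mass of KOH."""
          return 56.1 * V_ml * N_koh / m_g

      # nominal values and standard uncertainties (illustrative)
      V, u_V = 1.20, 0.02        # mL titrant
      N, u_N = 0.100, 0.0005     # mol/L KOH
      m, u_m = 5.000, 0.001      # g sample
      u_rep_rel = 0.004          # relative repeatability from replicate titrations

      AN = acid_number(V, N, m)
      # product/quotient model: relative standard uncertainties add in quadrature
      u_rel = math.sqrt((u_V / V) ** 2 + (u_N / N) ** 2 + (u_m / m) ** 2 + u_rep_rel ** 2)
      print(f"acid number = {AN:.3f} ± {2 * u_rel * AN:.3f} mg KOH/g (k = 2)")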

  17. Rigorous Science: a How-To Guide

    Directory of Open Access Journals (Sweden)

    Arturo Casadevall

    2016-11-01

    Full Text Available Proposals to improve the reproducibility of biomedical research have emphasized scientific rigor. Although the word “rigor” is widely used, there has been little specific discussion as to what it means and how it can be achieved. We suggest that scientific rigor combines elements of mathematics, logic, philosophy, and ethics. We propose a framework for rigor that includes redundant experimental design, sound statistical analysis, recognition of error, avoidance of logical fallacies, and intellectual honesty. These elements lead to five actionable recommendations for research education.

  18. Research on uncertainty evaluation measure and method of voltage sag severity

    Science.gov (United States)

    Liu, X. N.; Wei, J.; Ye, S. Y.; Chen, B.; Long, C.

    2018-01-01

    Voltage sag is an inevitable and serious power quality problem in power systems. This paper provides a general summary and review of the concepts, indices and evaluation methods for voltage sag severity. Considering the complexity and uncertainty of the influencing factors and damage degree, and the characteristics and requirements of voltage sag severity on the power source, network and load sides, the measure concepts and their conditions of existence, as well as the evaluation indices and methods of voltage sag severity, are analyzed. Current evaluation techniques, such as stochastic theory and fuzzy logic, as well as their fusion, are reviewed in detail. An index system for voltage sag severity is provided for comprehensive study. The main aim of this paper is to propose ideas and methods for severity research based on advanced uncertainty theory and uncertainty measures. This study may be considered a valuable guide for researchers who are interested in the domain of voltage sag severity.

  19. Measurement Uncertainty Evaluation in Dimensional X-ray Computed Tomography Using the Bootstrap Method

    DEFF Research Database (Denmark)

    Hiller, Jochen; Genta, Gianfranco; Barbato, Giulio

    2014-01-01

    measurement processes, e.g., with tactile systems, also due to factors related to systematic errors, mainly caused by specific CT image characteristics. In this paper we propose a simulation-based framework for measurement uncertainty evaluation in dimensional CT using the bootstrap method. In a case study the problem concerning measurement uncertainties was addressed with bootstrap and successfully applied to ball-bar CT measurements. Results obtained enabled extension to more complex shapes such as actual industrial components as we show by tests on a hollow cylinder workpiece.
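
    A minimal sketch of bootstrap-based uncertainty evaluation for a derived dimensional quantity is shown below in Python: surface points (here simulated with noise) are resampled with replacement and a circle radius is refitted on each resample, the spread of the refitted radii serving as the uncertainty estimate. The fitting method, noise level and geometry are assumptions for illustration, not the framework of the paper.

      import numpy as np

      def fit_radius(xy):
          """Least-squares circle fit (Kasa method); returns the fitted radius."""
          x, y = xy[:, 0], xy[:, 1]
          A = np.column_stack([2 * x, 2 * y, np.ones(len(x))])
          b = x ** 2 + y ** 2
          cx, cy, c = np.linalg.lstsq(A, b, rcond=None)[0]
          return np.sqrt(c + cx ** 2 + cy ** 2)

      rng = np.random.default_rng(3)
      theta = rng.uniform(0, 2 * np.pi, 200)
      points = np.column_stack([5.0 * np.cos(theta), 5.0 * np.sin(theta)])
      points += rng.normal(0, 0.01, points.shape)          # simulated surface-point noise

      # bootstrap: resample the measured points with replacement and refit
      radii = [fit_radius(points[rng.integers(0, len(points), len(points))])
               for _ in range(1000)]
      print(f"radius = {fit_radius(points):.4f}, u(bootstrap) = {np.std(radii, ddof=1):.4f}")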

  20. Investment in flood protection measures under climate change uncertainty. An investment decision

    Energy Technology Data Exchange (ETDEWEB)

    Bruin, Karianne de

    2012-11-01

    Recent river flooding in Europe has triggered debates among scientists and policymakers on future projections of flood frequency and the need for adaptive investments, such as flood protection measures. Because there exists uncertainty about the impact of climate change on flood risk, such investments require a careful analysis of expected benefits and costs. The objective of this paper is to show how climate change uncertainty affects the decision to invest in flood protection measures. We develop a model that simulates optimal decision making in flood protection; it incorporates flexible timing of investment decisions and scientific uncertainty on the extent of climate change impacts. This model allows decision-makers to cope with the uncertain impacts of climate change on the frequency and damage of river flood events and minimises the risk of under- or over-investment. One of the innovative elements is that we explicitly distinguish between structural and non-structural flood protection measures. Our results show that the optimal investment decision today depends strongly on the cost structure of the adaptation measures and the discount rate, especially the ratio of fixed and weighted annual costs of the measures. A higher level of annual flood damage and later resolution of uncertainty in time increase the optimal investment. Furthermore, the optimal investment decision today is influenced by the possibility for the decision-maker to adjust his decision at a future moment in time.(auth)

  1. Uncertainty quantification for radiation measurements: Bottom-up error variance estimation using calibration information

    International Nuclear Information System (INIS)

    Burr, T.; Croft, S.; Krieger, T.; Martin, K.; Norman, C.; Walsh, S.

    2016-01-01

    One example of top-down uncertainty quantification (UQ) involves comparing two or more measurements on each of multiple items. One example of bottom-up UQ expresses a measurement result as a function of one or more input variables that have associated errors, such as a measured count rate, which individually (or collectively) can be evaluated for impact on the uncertainty in the resulting measured value. In practice, it is often found that top-down UQ exhibits larger error variances than bottom-up UQ, because some error sources are present in the fielded assay methods used in top-down UQ that are not present (or not recognized) in the assay studies used in bottom-up UQ. One would like better consistency between the two approaches in order to claim understanding of the measurement process. The purpose of this paper is to refine bottom-up uncertainty estimation by using calibration information so that if there are no unknown error sources, the refined bottom-up uncertainty estimate will agree with the top-down uncertainty estimate to within a specified tolerance. Then, in practice, if the top-down uncertainty estimate is larger than the refined bottom-up uncertainty estimate by more than the specified tolerance, there must be omitted sources of error beyond those predicted from calibration uncertainty. The paper develops a refined bottom-up uncertainty approach for four cases of simple linear calibration: (1) inverse regression with negligible error in predictors, (2) inverse regression with non-negligible error in predictors, (3) classical regression followed by inversion with negligible error in predictors, and (4) classical regression followed by inversion with non-negligible errors in predictors. Our illustrations are of general interest, but are drawn from our experience with nuclear material assay by non-destructive assay. The main example we use is gamma spectroscopy that applies the enrichment meter principle. Previous papers that ignore error in predictors
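
    The sketch below illustrates, in Python, the flavour of case (3) above: a straight-line calibration is fitted classically and then inverted to estimate the quantity of interest from a new response, with the calibration scatter and the new-measurement uncertainty propagated to first order. The calibration data and the simple variance formula are illustrative; the refined bottom-up treatment developed in the paper is not reproduced.

      import numpy as np

      def calibrate_and_invert(x_cal, y_cal, y0, u_y0):
          """Classical straight-line calibration followed by inversion: estimate x0
          from a new response y0 and propagate uncertainty to first order."""
          x_cal, y_cal = np.asarray(x_cal, float), np.asarray(y_cal, float)
          n = x_cal.size
          b, a = np.polyfit(x_cal, y_cal, 1)                 # slope, intercept
          resid = y_cal - (a + b * x_cal)
          s = np.sqrt(np.sum(resid ** 2) / (n - 2))          # residual std of calibration
          x0 = (y0 - a) / b
          sxx = np.sum((x_cal - x_cal.mean()) ** 2)
          # first-order variance of x0: new-measurement term + calibration terms
          u_x0 = (1.0 / abs(b)) * np.sqrt(u_y0 ** 2 + s ** 2 / n
                                          + s ** 2 * (x0 - x_cal.mean()) ** 2 / sxx)
          return x0, u_x0

      # example: response (y) vs enrichment (x) standards, then infer an unknown item
      x_cal = [0.7, 1.5, 3.0, 4.5]                 # % U-235 of calibration standards (invented)
      y_cal = [102., 218., 441., 660.]             # measured net count rates (invented)
      print(calibrate_and_invert(x_cal, y_cal, y0=350.0, u_y0=5.0))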

  2. SU-F-T-301: Planar Dose Pass Rate Inflation Due to the MapCHECK Measurement Uncertainty Function

    International Nuclear Information System (INIS)

    Bailey, D; Spaans, J; Kumaraswamy, L; Podgorsak, M

    2016-01-01

    Purpose: To quantify the effect of the Measurement Uncertainty function on planar dosimetry pass rates, as analyzed with Sun Nuclear Corporation analytic software (“MapCHECK” or “SNC Patient”). This optional function is toggled on by default upon software installation, and automatically increases the user-defined dose percent difference (%Diff) tolerance for each planar dose comparison. Methods: Dose planes from 109 IMRT fields and 40 VMAT arcs were measured with the MapCHECK 2 diode array, and compared to calculated planes from a commercial treatment planning system. Pass rates were calculated within the SNC analytic software using varying calculation parameters, including Measurement Uncertainty on and off. By varying the %Diff criterion for each dose comparison performed with Measurement Uncertainty turned off, an effective %Diff criterion was defined for each field/arc corresponding to the pass rate achieved with MapCHECK Uncertainty turned on. Results: For 3%/3mm analysis, the Measurement Uncertainty function increases the user-defined %Diff by 0.8–1.1% average, depending on plan type and calculation technique, for an average pass rate increase of 1.0–3.5% (maximum +8.7%). For 2%, 2 mm analysis, the Measurement Uncertainty function increases the user-defined %Diff by 0.7–1.2% average, for an average pass rate increase of 3.5–8.1% (maximum +14.2%). The largest increases in pass rate are generally seen with poorly-matched planar dose comparisons; the MapCHECK Uncertainty effect is markedly smaller as pass rates approach 100%. Conclusion: The Measurement Uncertainty function may substantially inflate planar dose comparison pass rates for typical IMRT and VMAT planes. The types of uncertainties incorporated into the function (and their associated quantitative estimates) as described in the software user’s manual may not accurately estimate realistic measurement uncertainty for the user’s measurement conditions. Pass rates listed in published

  3. SU-F-T-301: Planar Dose Pass Rate Inflation Due to the MapCHECK Measurement Uncertainty Function

    Energy Technology Data Exchange (ETDEWEB)

    Bailey, D [Northside Hospital Cancer Institute, Atlanta, GA (United States); Spaans, J; Kumaraswamy, L; Podgorsak, M [Roswell Park Cancer Institute, Buffalo, NY (United States)

    2016-06-15

    Purpose: To quantify the effect of the Measurement Uncertainty function on planar dosimetry pass rates, as analyzed with Sun Nuclear Corporation analytic software (“MapCHECK” or “SNC Patient”). This optional function is toggled on by default upon software installation, and automatically increases the user-defined dose percent difference (%Diff) tolerance for each planar dose comparison. Methods: Dose planes from 109 IMRT fields and 40 VMAT arcs were measured with the MapCHECK 2 diode array, and compared to calculated planes from a commercial treatment planning system. Pass rates were calculated within the SNC analytic software using varying calculation parameters, including Measurement Uncertainty on and off. By varying the %Diff criterion for each dose comparison performed with Measurement Uncertainty turned off, an effective %Diff criterion was defined for each field/arc corresponding to the pass rate achieved with MapCHECK Uncertainty turned on. Results: For 3%/3mm analysis, the Measurement Uncertainty function increases the user-defined %Diff by 0.8–1.1% average, depending on plan type and calculation technique, for an average pass rate increase of 1.0–3.5% (maximum +8.7%). For 2%, 2 mm analysis, the Measurement Uncertainty function increases the user-defined %Diff by 0.7–1.2% average, for an average pass rate increase of 3.5–8.1% (maximum +14.2%). The largest increases in pass rate are generally seen with poorly-matched planar dose comparisons; the MapCHECK Uncertainty effect is markedly smaller as pass rates approach 100%. Conclusion: The Measurement Uncertainty function may substantially inflate planar dose comparison pass rates for typical IMRT and VMAT planes. The types of uncertainties incorporated into the function (and their associated quantitative estimates) as described in the software user’s manual may not accurately estimate realistic measurement uncertainty for the user’s measurement conditions. Pass rates listed in published

  4. Application of probabilistic modelling for the uncertainty evaluation of alignment measurements of large accelerator magnets assemblies

    Science.gov (United States)

    Doytchinov, I.; Tonnellier, X.; Shore, P.; Nicquevert, B.; Modena, M.; Mainaud Durand, H.

    2018-05-01

    Micrometric assembly and alignment requirements for future particle accelerators, and especially large assemblies, create the need for accurate uncertainty budgeting of alignment measurements. Measurements and uncertainties have to be accurately stated and traceable, to international standards, for metre-long sized assemblies, in the range of tens of µm. Indeed, these hundreds of assemblies will be produced and measured by several suppliers around the world, and will have to be integrated into a single machine. As part of the PACMAN project at CERN, we proposed and studied a practical application of probabilistic modelling of task-specific alignment uncertainty by applying a simulation by constraints calibration method. Using this method, we calibrated our measurement model using available data from ISO standardised tests (10360 series) for the metrology equipment. We combined this model with reference measurements and analysis of the measured data to quantify the actual specific uncertainty of each alignment measurement procedure. Our methodology was successfully validated against a calibrated and traceable 3D artefact as part of an international inter-laboratory study. The validated models were used to study the expected alignment uncertainty and important sensitivity factors in measuring the shortest and longest of the compact linear collider study assemblies, 0.54 m and 2.1 m respectively. In both cases, the laboratory alignment uncertainty was within the targeted uncertainty budget of 12 µm (68% confidence level). It was found that the remaining uncertainty budget for any additional alignment error compensations, such as the thermal drift error due to variation in machine operation heat load conditions, must be within 8.9 µm and 9.8 µm (68% confidence level) respectively.

  5. Characterization of the energy-dependent uncertainty and correlation in silicon neutron displacement damage metrics

    Directory of Open Access Journals (Sweden)

    Griffin Patrick

    2017-01-01

    Full Text Available A rigorous treatment of the uncertainty in the underlying nuclear data on silicon displacement damage metrics is presented. The uncertainty in the cross sections and recoil atom spectra are propagated into the energy-dependent uncertainty contribution in the silicon displacement kerma and damage energy using a Total Monte Carlo treatment. An energy-dependent covariance matrix is used to characterize the resulting uncertainty. A strong correlation between different reaction channels is observed in the high energy neutron contributions to the displacement damage metrics which supports the necessity of using a Monte Carlo based method to address the nonlinear nature of the uncertainty propagation.
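
    In a Total Monte Carlo treatment, each realization of the damage metric is computed with a randomly perturbed nuclear-data file, and the energy-dependent covariance and correlation matrices are then estimated from the ensemble of realizations. The Python sketch below shows that final statistical step on purely synthetic samples (the perturbations are invented to mimic a correlated high-energy component); it does not reproduce the nuclear-data sampling itself.

      import numpy as np

      # each row = one Total Monte Carlo realization of an energy-dependent damage
      # metric (e.g. displacement kerma) evaluated on a common group structure
      rng = np.random.default_rng(4)
      n_samples, n_groups = 500, 40
      base = np.linspace(1.0, 5.0, n_groups)                 # nominal group values (arbitrary)
      # synthetic perturbations with a shared high-energy component to mimic correlation
      common = rng.normal(0, 0.03, (n_samples, 1)) * np.linspace(0, 1, n_groups)
      samples = base * (1 + common + rng.normal(0, 0.01, (n_samples, n_groups)))

      mean = samples.mean(axis=0)
      cov = np.cov(samples, rowvar=False)                    # energy-dependent covariance
      rel_std = np.sqrt(np.diag(cov)) / mean                 # group-wise relative uncertainty
      corr = cov / np.outer(np.sqrt(np.diag(cov)), np.sqrt(np.diag(cov)))
      print("max relative uncertainty: %.3f" % rel_std.max())
      print("correlation(first, last group): %.2f" % corr[0, -1])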

  6. RIGOROUS PHOTOGRAMMETRIC PROCESSING OF CHANG'E-1 AND CHANG'E-2 STEREO IMAGERY FOR LUNAR TOPOGRAPHIC MAPPING

    Directory of Open Access Journals (Sweden)

    K. Di

    2012-07-01

    Full Text Available Chang'E-1 (CE-1) and Chang'E-2 (CE-2) are the two lunar orbiters of China's lunar exploration program. Topographic mapping using CE-1 and CE-2 images is of great importance for scientific research as well as for preparation of landing and surface operation of Chang'E-3 lunar rover. In this research, we developed rigorous sensor models of CE-1 and CE-2 CCD cameras based on push-broom imaging principle with interior and exterior orientation parameters. Based on the rigorous sensor model, the 3D coordinate of a ground point in lunar body-fixed (LBF) coordinate system can be calculated by space intersection from the image coordinates of conjugate points in stereo images, and the image coordinates can be calculated from 3D coordinates by back-projection. Due to uncertainties of the orbit and the camera, the back-projected image points are different from the measured points. In order to reduce these inconsistencies and improve precision, we proposed two methods to refine the rigorous sensor model: 1) refining EOPs by correcting the attitude angle bias, 2) refining the interior orientation model by calibration of the relative position of the two linear CCD arrays. Experimental results show that the mean back-projection residuals of CE-1 images are reduced to better than 1/100 pixel by method 1 and the mean back-projection residuals of CE-2 images are reduced from over 20 pixels to 0.02 pixel by method 2. Consequently, high precision DEM (Digital Elevation Model) and DOM (Digital Ortho Map) are automatically generated.
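
    The space-intersection step mentioned above can be written as a small linear least-squares problem once the rigorous sensor model has converted each image measurement into a ray (camera centre plus look direction) in the lunar body-fixed frame. The Python sketch below solves for the ground point closest to two such rays; the ray values are purely illustrative and the sensor model itself is not reproduced.

      import numpy as np

      def intersect_rays(centers, directions):
          """Least-squares intersection of N rays: the point minimizing the summed
          squared distances to all rays (forward space intersection step)."""
          A = np.zeros((3, 3))
          b = np.zeros(3)
          for c, d in zip(centers, directions):
              d = d / np.linalg.norm(d)
              P = np.eye(3) - np.outer(d, d)        # projector onto plane normal to the ray
              A += P
              b += P @ c
          return np.linalg.solve(A, b)

      # two stereo look rays in a lunar body-fixed frame (numbers purely illustrative)
      centers = [np.array([1838e3, 0.0, 100e3]), np.array([1838e3, 50e3, 100e3])]
      directions = [np.array([-0.05, 0.0, -1.0]), np.array([-0.05, -0.5, -1.0])]
      print(intersect_rays(centers, directions))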

  7. Measurement uncertainty in single, double and triple isotope dilution mass spectrometry.

    Science.gov (United States)

    Vogl, Jochen

    2012-02-15

    Triple IDMS has been applied for the first time to the quantification of element concentrations. It has been compared with single and double IDMS obtained on the same sample set in order to evaluate the advantages and disadvantages of triple IDMS over single and double IDMS as an analytical reference procedure. The measurement results of single, double and triple IDMS are indistinguishable, considering rounding due to the individual measurement uncertainties. As expected, the relative expanded uncertainties (k = 2) achieved with double IDMS (0.08%) are dramatically smaller than those obtained with single IDMS (1.4%). Triple IDMS yields the smallest relative expanded uncertainties (k = 2, 0.077%) unfortunately at the expense of a much higher workload. Nevertheless triple IDMS has the huge advantage that the isotope ratio of the spike does not need to be determined. Elements with high memory effects, highly enriched spikes or highest metrological requirements may be typical applications for triple IDMS. Copyright © 2011 John Wiley & Sons, Ltd.

  8. Measurement uncertainty. A practical guide for Secondary Standards Dosimetry Laboratories

    International Nuclear Information System (INIS)

    2008-05-01

    The need for international traceability for radiation dose measurements has been understood since the early nineteen-sixties. The benefits of high dosimetric accuracy were recognized, particularly in radiotherapy, where the outcome of treatments is dependent on the radiation dose delivered to patients. When considering radiation protection dosimetry, the uncertainty may be greater than for therapy, but proper traceability of the measurements is no less important. To ensure harmonization and consistency in radiation measurements, the International Atomic Energy Agency (IAEA) and the World Health Organization (WHO) created a Network of Secondary Standards Dosimetry Laboratories (SSDLs) in 1976. An SSDL is a laboratory that has been designated by the competent national authorities to undertake the duty of providing the necessary link in the traceability chain of radiation dosimetry to the international measurement system (SI, for Systeme International) for radiation metrology users. The role of the SSDLs is crucial in providing traceable calibrations; they disseminate calibrations at specific radiation qualities appropriate for the use of radiation measuring instruments. Historically, although the first SSDLs were established mainly to provide radiotherapy level calibrations, the scope of their work has expanded over the years. Today, many SSDLs provide traceability for radiation protection measurements and diagnostic radiology in addition to radiotherapy. Some SSDLs, with the appropriate facilities and expertise, also conduct quality audits of the clinical use of the calibrated dosimeters - for example, by providing postal dosimeters for dose comparisons for medical institutions or on-site dosimetry audits with an ion chamber and other appropriate equipment. The requirements for traceable and reliable calibrations are becoming more important. For example, for international trade where radiation products are manufactured within strict quality control systems, it is

  9. Estimation of uncertainty of measurements of 3D mechanisms after kinematic calibration

    International Nuclear Information System (INIS)

    Takamasu, K; Sato, O; Shimojima, K; Takahashi, S; Furutani, R

    2005-01-01

    Calibration methods for 3D mechanisms are necessary if the mechanisms are to be used as coordinate measuring machines. A calibration method for coordinate measuring machines using artifacts, the artifact calibration method, is proposed to ensure traceability of the mechanism. The geometric parameters describing the forward kinematics of the mechanism comprise kinematic parameters and form-deviation parameters. In this article, methods for estimating the uncertainty of measurements made with the coordinate measuring machine after calibration are formulated. First, the calculation that extracts the values of the kinematic parameters by the least-squares method is formulated. Second, the uncertainty of the measuring machine is estimated using the error propagation method
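
    The two steps above (least-squares extraction of kinematic parameters followed by error propagation) can be illustrated with a deliberately simplified, hypothetical one-axis example; the two-parameter model, the artifact data and all variable names below are assumptions made for illustration only, not the authors' actual formulation.

```python
# Estimate a hypothetical offset and scale error of a single axis from calibrated
# artifact positions by least squares, then propagate the parameter covariance to the
# correction applied at a new position (GUM law of propagation of uncertainty).
import numpy as np

x_ref = np.array([0.0, 100.0, 200.0, 300.0, 400.0])             # artifact positions (mm)
x_meas = np.array([0.012, 100.018, 200.031, 300.046, 400.052])  # machine readings (mm)

# Model: x_meas - x_ref = offset + scale * x_ref, linear in theta = [offset, scale]
J = np.column_stack([np.ones_like(x_ref), x_ref])    # design (Jacobian) matrix
y = x_meas - x_ref
theta, *_ = np.linalg.lstsq(J, y, rcond=None)

# Parameter covariance from the residual variance and (J^T J)^-1
dof = len(y) - len(theta)
s2 = np.sum((y - J @ theta) ** 2) / dof
cov_theta = s2 * np.linalg.inv(J.T @ J)

# Propagate to the correction applied at an arbitrary position x0
x0 = 250.0
g = np.array([1.0, x0])                              # sensitivity vector
u_corr = np.sqrt(g @ cov_theta @ g)                  # standard uncertainty (mm)
print(f"offset = {theta[0]*1e3:.1f} um, scale = {theta[1]*1e6:.1f} ppm, "
      f"u(correction at 250 mm) = {u_corr*1e3:.2f} um")
```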

  10. Passive active neutron radioassay measurement uncertainty for combustible and glass waste matrices

    International Nuclear Information System (INIS)

    Blackwood, L.G.; Harker, Y.D.; Meachum, T.R.; Yoon, Woo Y.

    1997-01-01

    Using a modified statistical sampling and verification approach, the total uncertainty of INEL's Passive Active Neutron (PAN) radioassay system was evaluated for combustible and glass content codes. The waste structure and content of 100 randomly selected drums in each of the waste categories were computer modeled based on review of real-time radiography video tapes. Specific quantities of Pu were added to the drum models according to an experimental design. These drum models were then processed with the Monte Carlo Neutron Photon code and subsequent calculations to produce simulated PAN system measurements. The reported Pu masses from the simulation runs were compared with the corresponding input masses. Analysis of the measurement errors produced uncertainty estimates. This paper presents the results of the uncertainty calculations and compares them to previously reported results obtained for graphite waste

  11. Approach to determine measurement uncertainty in complex nanosystems with multiparametric dependencies and multivariate output quantities

    Science.gov (United States)

    Hampel, B.; Liu, B.; Nording, F.; Ostermann, J.; Struszewski, P.; Langfahl-Klabes, J.; Bieler, M.; Bosse, H.; Güttler, B.; Lemmens, P.; Schilling, M.; Tutsch, R.

    2018-03-01

    In many cases, the determination of the measurement uncertainty of complex nanosystems provides unexpected challenges. This is in particular true for complex systems with many degrees of freedom, i.e. nanosystems with multiparametric dependencies and multivariate output quantities. The aim of this paper is to address specific questions arising during the uncertainty calculation of such systems. This includes the division of the measurement system into subsystems and the distinction between systematic and statistical influences. We demonstrate that, even if the physical systems under investigation are very different, the corresponding uncertainty calculation can always be realized in a similar manner. This is shown in detail for two example experiments, namely magnetic nanosensors and ultrafast electro-optical sampling of complex time-domain signals. For these examples the approach for uncertainty calculation following the guide to the expression of uncertainty in measurement (GUM) is explained, in which correlations between multivariate output quantities are captured. To illustrate the versatility of the proposed approach, its application to other experiments, namely nanometrological instruments for terahertz microscopy, dimensional scanning probe microscopy, and measurement of the concentration of molecules using surface-enhanced Raman scattering, is briefly discussed in the appendix. We believe that the proposed approach provides a simple but comprehensive orientation for uncertainty calculation in the discussed measurement scenarios and can also be applied to similar or related situations.
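
    As a schematic illustration of GUM propagation with correlated multivariate output quantities, the following sketch uses an invented two-output measurement model; the function, the input estimates and their uncertainties are assumptions, not the nanosensor or electro-optical systems discussed in the paper.

```python
# Propagate an input covariance matrix through a bivariate measurement model y = f(x).
# Because both outputs depend on the shared input x1, the propagated output covariance
# matrix U_y contains off-diagonal (correlation) terms.
import numpy as np

def f(x):
    # Hypothetical measurement model: y1 = x1 * x2, y2 = x1 + x3
    return np.array([x[0] * x[1], x[0] + x[2]])

x = np.array([2.0, 0.5, 1.0])                 # input estimates
u_std = np.array([0.01, 0.002, 0.005])        # standard uncertainties of the inputs
U_x = np.diag(u_std**2)                       # input covariance (independent inputs)

# Jacobian of f evaluated at x (analytical here; finite differences also work)
J = np.array([[x[1], x[0], 0.0],
              [1.0,  0.0,  1.0]])

U_y = J @ U_x @ J.T                           # output covariance matrix (GUM law)
u_y = np.sqrt(np.diag(U_y))                   # standard uncertainties of the outputs
corr = U_y[0, 1] / (u_y[0] * u_y[1])          # correlation induced by the shared input x1
print("u(y) =", u_y, " corr(y1, y2) = %.3f" % corr)
```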

  12. Different top-down approaches to estimate measurement uncertainty of whole blood tacrolimus mass concentration values.

    Science.gov (United States)

    Rigo-Bonnin, Raül; Blanco-Font, Aurora; Canalias, Francesca

    2018-05-08

    Values of the mass concentration of tacrolimus in whole blood are commonly used by clinicians for monitoring the status of a transplant patient and for checking whether the administered dose of tacrolimus is effective, so clinical laboratories must provide results that are as accurate as possible. Measurement uncertainty helps to ensure the reliability of these results. The aim of this study was to estimate the measurement uncertainty of whole-blood tacrolimus mass concentration values obtained by UHPLC-MS/MS using two top-down approaches: the single-laboratory validation approach and the proficiency testing approach. For the single-laboratory validation approach, we estimated the uncertainties associated with intermediate imprecision (using long-term internal quality control data) and with bias (using a certified reference material). We then combined these with the uncertainties related to the calibrator-assigned values to obtain a combined uncertainty and, finally, the expanded uncertainty. For the proficiency testing approach, the uncertainty was estimated in a similar way, but data from internal and external quality control schemes were used to estimate the uncertainty related to bias. The estimated expanded uncertainties for single-laboratory validation and for proficiency testing using internal and external quality control schemes were 11.8%, 13.2%, and 13.0%, respectively. The results of the two top-down approaches were quite similar, which suggests that either approach could be used to estimate the measurement uncertainty of whole-blood tacrolimus mass concentration values in clinical laboratories. Copyright © 2018 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
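
    The single-laboratory validation budget described above can be sketched as follows; the numbers are placeholders rather than the study's data, and the bias-combination formula is one common top-down convention (Nordtest-style), which may differ in detail from the authors' calculation.

```python
# Top-down combination of intermediate imprecision, bias and calibrator uncertainty
# into a combined and expanded relative uncertainty.
import math

u_rw = 0.045    # relative standard uncertainty from long-term internal QC (assumed)
bias = 0.020    # observed relative bias against a certified reference material (assumed)
u_ref = 0.015   # relative standard uncertainty of the certified value (assumed)
n = 10          # replicates used to estimate the bias
u_bias = math.sqrt(bias**2 + (u_rw / math.sqrt(n))**2 + u_ref**2)
u_cal = 0.010   # relative uncertainty of calibrator-assigned values (assumed)

u_c = math.sqrt(u_rw**2 + u_bias**2 + u_cal**2)   # combined relative standard uncertainty
U = 2 * u_c                                       # expanded uncertainty, coverage factor k = 2
print(f"combined u = {u_c:.3%}, expanded U (k=2) = {U:.3%}")
```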

  13. Definition of free form object for low uncertainty measurements on coordinate measuring machines

    DEFF Research Database (Denmark)

    Savio, Enrico; De Chiffre, Leonardo

    This report is made as a part of the project Easytrac, an EU project under the programme: Competitive and Sustainable Growth: Contract No: G6RD-CT-2000-00188, coordinated by UNIMETRIK S.A. (Spain). The project is concerned with low uncertainty calibrations on coordinate measuring machines. The Ce...

  14. Experimental evaluation of rigor mortis. VII. Effect of ante- and post-mortem electrocution on the evolution of rigor mortis.

    Science.gov (United States)

    Krompecher, T; Bergerioux, C

    1988-01-01

    The influence of electrocution on the evolution of rigor mortis was studied in rats. Our experiments showed that: (1) Electrocution hastens the onset of rigor mortis. After an electrocution of 90 s, complete rigor develops as early as 1 h post-mortem (p.m.), compared with 5 h p.m. in the controls. (2) Electrocution hastens the passing of rigor mortis. After an electrocution of 90 s, the first significant decrease occurs at 3 h p.m. (8 h p.m. in the controls). (3) These modifications in the evolution of rigor mortis are less pronounced in the limbs not directly touched by the electric current. (4) In the case of post-mortem electrocution, the changes are slightly less pronounced, the resistance is higher and the absorbed energy is lower as compared with the ante-mortem electrocution cases. The results are complemented by two practical observations on human electrocution cases.

  15. International target values 2000 for measurement uncertainties in safeguarding nuclear materials

    International Nuclear Information System (INIS)

    Aigner, H.; Binner, R.; Kuhn, E.

    2001-01-01

    The IAEA has prepared a revised and updated version of International Target Values (ITVs) for uncertainty components in measurements of nuclear material. The ITVs represent uncertainties to be considered in judging the reliability of analytical techniques applied to industrial nuclear and fissile material subject to safeguards verification. The tabulated values represent estimates of the 'state of the practice' which ought to be achievable under routine conditions by adequately equipped, experienced laboratories. The ITVs 2000 are intended to be used by plant operators and safeguards organizations as a reference for the quality of measurements achievable in nuclear material accountancy, and for planning purposes. The IAEA prepared a draft of a technical report presenting the proposed ITVs 2000, and in April 2000 the chairmen or officers of the panels or organizations listed below were invited to co-author the report and to submit the draft for discussion by their panels and organizations: Euratom Safeguards Inspectorate, ESARDA Working Group on Destructive Analysis, ESARDA Working Group on Non Destructive Analysis, Institute of Nuclear Material Management, Japanese Expert Group on ITV-2000, ISO Working Group on Analyses in Spent Fuel Reprocessing, ISO Working Group on Analyses in Uranium Fuel Fabrication, ISO Working Group on Analyses in MOX Fuel Fabrication, and Agencia Brasileno-Argentina de Contabilidad y Control de Materiales Nucleares (ABACC). Comments from the above groups were received and incorporated into the final version of the document, completed in April 2001. The ITVs 2000 represent target standard uncertainties, expressing the precision achievable under stipulated conditions. These conditions typically fall in one of the two following categories: 'repeatability conditions', normally encountered during the measurements done within one inspection period; or 'reproducibility conditions', involving additional sources of measurement variability such as

  16. Uncertainty and global climate change research

    Energy Technology Data Exchange (ETDEWEB)

    Tonn, B.E. [Oak Ridge National Lab., TN (United States); Weiher, R. [National Oceanic and Atmospheric Administration, Boulder, CO (United States)

    1994-06-01

    The Workshop on Uncertainty and Global Climate Change Research was held March 22-23, 1994, in Knoxville, Tennessee. This report summarizes the results and recommendations of the workshop. The purpose of the workshop was to examine in depth the concept of uncertainty. From an analytical point of view, uncertainty is a central feature of global climate science, economics and decision making. The magnitude and complexity of the uncertainty surrounding global climate change have made it quite difficult to answer even the most simple and important of questions: whether potentially costly action is required now to ameliorate adverse consequences of global climate change, or whether delay is warranted to gain better information to reduce uncertainties. A major conclusion of the workshop is that multidisciplinary integrated assessment using decision analytic techniques as a foundation is key to addressing global change policy concerns. First, uncertainty must be dealt with explicitly and rigorously, since it is and will continue to be a key feature of analysis and recommendations on policy questions for years to come. Second, key policy questions and variables need to be explicitly identified and prioritized, and their uncertainty characterized, to guide the entire scientific, modeling, and policy analysis process. Multidisciplinary integrated assessment techniques and value-of-information methodologies are best suited for this task. In terms of timeliness and relevance of developing and applying decision analytic techniques, the global change research and policy communities are moving rapidly toward integrated approaches to research design and policy analysis.

  17. Applying rigorous decision analysis methodology to optimization of a tertiary recovery project

    International Nuclear Information System (INIS)

    Wackowski, R.K.; Stevens, C.E.; Masoner, L.O.; Attanucci, V.; Larson, J.L.; Aslesen, K.S.

    1992-01-01

    This paper reports a study whose intent was to rigorously examine all of the possible expansion, investment, operational, and CO2 purchase/recompression scenarios (over 2500) to yield a strategy that would maximize the net present value of the CO2 project at the Rangely Weber Sand Unit. Traditional methods of project management, which involve analyzing large numbers of single-case economic evaluations, were found to be too cumbersome and inaccurate for an analysis of this scope. The decision analysis methodology utilized a statistical approach which resulted in a range of economic outcomes. Advantages of the decision analysis methodology included: a more organized approach to the classification of decisions and uncertainties; a clear sensitivity method to identify the key uncertainties; an application of probabilistic analysis through the decision tree; and a comprehensive display of the range of possible outcomes for communication to decision makers. This range made it possible to consider the upside and downside potential of the options and to weigh these against the Unit's strategies. Savings in the time and manpower required to complete the study were also realized
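
    The probability-weighted evaluation behind such a decision tree can be sketched as follows; the two options, branch probabilities and NPV outcomes are invented placeholders, not Rangely Weber Sand Unit figures.

```python
# Evaluate each decision branch as a probability-weighted range of NPV outcomes,
# rather than as a single-case economic estimate.
options = {
    "expand_CO2_purchase": [(0.3, 120.0), (0.5, 60.0), (0.2, -15.0)],   # (probability, NPV in $MM)
    "maintain_current_ops": [(0.4, 70.0), (0.4, 40.0), (0.2, 10.0)],
}

for name, branches in options.items():
    expected_npv = sum(p * npv for p, npv in branches)
    downside = min(npv for _, npv in branches)
    upside = max(npv for _, npv in branches)
    print(f"{name}: E[NPV] = {expected_npv:.1f} $MM, range = [{downside:.1f}, {upside:.1f}] $MM")
```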

  18. The Harm that Underestimation of Uncertainty Does to Our Community: A Case Study Using Sunspot Area Measurements

    Science.gov (United States)

    Munoz-Jaramillo, Andres

    2017-08-01

    Data products in heliospheric physics are very often provided without clear estimates of uncertainty. From helioseismology in the solar interior, all the way to in situ solar wind measurements beyond 1 AU, uncertainty estimates are typically hard for users to find (buried inside long documents that are separate from the data products), or simply non-existent. There are two main reasons why uncertainty estimates are hard to find: (1) understanding instrumental systematic errors is given a much higher priority inside instrument teams; (2) the desire to perfectly understand all sources of uncertainty postpones indefinitely the actual quantification of uncertainty in our measurements. Using the cross-calibration of 200 years of sunspot area measurements as a case study, in this presentation we will discuss the negative impact that inadequate estimates of uncertainty have on users, through the appearance of toxic and unnecessary controversies, and on data providers, through the creation of unrealistic expectations regarding the information that can be extracted from their data. We will discuss how empirical estimates of uncertainty represent a very good alternative to not providing any estimates at all, and finish by discussing the bare essentials that should become our standard practice for future instruments and surveys.

  19. Inconclusive quantum measurements and decisions under uncertainty

    Science.gov (United States)

    Yukalov, Vyacheslav; Sornette, Didier

    2016-04-01

    We give a mathematical definition for the notion of inconclusive quantum measurements. In physics, such measurements occur at intermediate stages of a complex measurement procedure, with the final measurement result being operationally testable. Since the mathematical structure of Quantum Decision Theory has been developed in analogy with the theory of quantum measurements, the inconclusive quantum measurements correspond, in Quantum Decision Theory, to intermediate stages of decision making in the process of taking decisions under uncertainty. The general form of the quantum probability for a composite event is the sum of a utility factor, describing a rational evaluation of the considered prospect, and of an attraction factor, characterizing irrational, subconscious attitudes of the decision maker. Despite the involved irrationality, the probability of prospects can be evaluated. This is equivalent to the possibility of calculating quantum probabilities without specifying hidden variables. We formulate a general way of evaluation, based on the use of non-informative priors. As an example, we suggest the explanation of the decoy effect. Our quantitative predictions are in very good agreement with experimental data.

  20. Inconclusive quantum measurements and decisions under uncertainty

    Directory of Open Access Journals (Sweden)

    Vyacheslav I. Yukalov

    2016-04-01

    Full Text Available We give a mathematical definition for the notion of inconclusive quantum measurements. In physics, such measurements occur at intermediate stages of a complex measurement procedure, with the final measurement result being operationally testable. Since the mathematical structure of Quantum Decision Theory has been developed in analogy with the theory of quantum measurements, the inconclusive quantum measurements correspond, in Quantum Decision Theory, to intermediate stages of decision making in the process of taking decisions under uncertainty. The general form of the quantum probability for a composite event is the sum of a utility factor, describing a rational evaluation of the considered prospect, and of an attraction factor, characterizing irrational, subconscious attitudes of the decision maker. Despite the involved irrationality, the probability of prospects can be evaluated. This is equivalent to the possibility of calculating quantum probabilities without specifying hidden variables. We formulate a general way of evaluation, based on the use of non-informative priors. As an example, we suggest the explanation of the decoy effect. Our quantitative predictions are in very good agreement with experimental data.

  1. A Monte-Carlo investigation of the uncertainty of acoustic decay measurements

    DEFF Research Database (Denmark)

    Cabo, David Pérez; Seoane, Manuel A. Sobreira; Jacobsen, Finn

    2012-01-01

    Measurement of acoustic decays can be problematic at low frequencies: short decays cannot be evaluated accurately. Several effects influencing the evaluation will be reviewed in this paper. As a new contribution, the measurement uncertainty due to one-third octave band-pass filters will be analysed...

  2. SWEPP PAN assay system uncertainty analysis: Passive mode measurements of graphite waste

    International Nuclear Information System (INIS)

    Blackwood, L.G.; Harker, Y.D.; Meachum, T.R.; Yoon, Woo Y.

    1997-07-01

    The Idaho National Engineering and Environmental Laboratory is being used as a temporary storage facility for transuranic waste generated by the U.S. Nuclear Weapons program at the Rocky Flats Plant (RFP) in Golden, Colorado. Currently, there is a large effort in progress to prepare to ship this waste to the Waste Isolation Pilot Plant (WIPP) in Carlsbad, New Mexico. In order to meet the TRU Waste Characterization Quality Assurance Program Plan nondestructive assay compliance requirements and quality assurance objectives, it is necessary to determine the total uncertainty of the radioassay results produced by the Stored Waste Examination Pilot Plant (SWEPP) Passive Active Neutron (PAN) radioassay system. To this end, a modified statistical sampling and verification approach has been developed to determine the total uncertainty of a PAN measurement. In this approach the total performance of the PAN nondestructive assay system is simulated using computer models of the assay system, and the resultant output is compared with the known input to assess the total uncertainty. This paper is one of a series of reports quantifying the results of the uncertainty analysis of the PAN system measurements for specific waste types and measurement modes. In particular, this report covers passive-mode measurements of weapons-grade plutonium-contaminated graphite molds contained in 208 liter drums (waste code 300). The validity of the simulation approach is verified by comparing simulated output against results from measurements using known plutonium sources and a surrogate graphite waste form drum. For actual graphite waste form conditions, a set of 50 cases covering a statistical sampling of the conditions exhibited in graphite wastes was compiled using a Latin hypercube statistical sampling approach
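
    The following sketch shows how a 50-case Latin hypercube sample over a few drum parameters might be drawn; the parameter names and ranges are assumptions chosen for illustration, not the study's actual modeling inputs.

```python
# Draw a 50-case Latin hypercube sample over three hypothetical drum parameters and
# scale it from the unit hypercube to assumed physical ranges.
import numpy as np
from scipy.stats import qmc

sampler = qmc.LatinHypercube(d=3, seed=12345)   # 3 uncertain drum parameters
unit_sample = sampler.random(n=50)              # 50 cases in the unit hypercube

# Assumed ranges: matrix density (g/cm^3), Pu mass (g), fill height fraction (-)
lower = [0.2, 0.1, 0.3]
upper = [1.0, 50.0, 1.0]
cases = qmc.scale(unit_sample, lower, upper)

print(cases.shape)   # (50, 3): one row per simulated drum configuration
print(cases[:3])
```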

  3. Measurement Uncertainty Relations for Discrete Observables: Relative Entropy Formulation

    Science.gov (United States)

    Barchielli, Alberto; Gregoratti, Matteo; Toigo, Alessandro

    2018-02-01

    We introduce a new information-theoretic formulation of quantum measurement uncertainty relations, based on the notion of relative entropy between measurement probabilities. In the case of a finite-dimensional system and for any approximate joint measurement of two target discrete observables, we define the entropic divergence as the maximal total loss of information occurring in the approximation at hand. For fixed target observables, we study the joint measurements minimizing the entropic divergence, and we prove the general properties of its minimum value. Such a minimum is our uncertainty lower bound: the total information lost by replacing the target observables with their optimal approximations, evaluated at the worst possible state. The bound turns out to be also an entropic incompatibility degree, that is, a good information-theoretic measure of incompatibility: indeed, it vanishes if and only if the target observables are compatible, it is state-independent, and it enjoys all the invariance properties which are desirable for such a measure. In this context, we point out the difference between general approximate joint measurements and sequential approximate joint measurements; to do this, we introduce a separate index for the tradeoff between the error of the first measurement and the disturbance of the second one. By exploiting the symmetry properties of the target observables, exact values, lower bounds and optimal approximations are evaluated in two different concrete examples: (1) a couple of spin-1/2 components (not necessarily orthogonal); (2) two Fourier conjugate mutually unbiased bases in prime power dimension. Finally, the entropic incompatibility degree straightforwardly generalizes to the case of many observables, still maintaining all its relevant properties; we explicitly compute it for three orthogonal spin-1/2 components.
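
    For reference, the basic information-theoretic quantity underlying the entropic divergence is the standard relative entropy between two outcome distributions; the paper's worst-case construction over approximating measurements and states is not reproduced here.

```latex
% Relative entropy (Kullback-Leibler divergence) between measurement probability
% distributions p and q over the outcomes x of a discrete observable.
\[
  S(p \,\|\, q) \;=\; \sum_{x} p(x)\,\log\frac{p(x)}{q(x)}, \qquad
  S(p \,\|\, q) \ge 0, \quad S(p \,\|\, q) = 0 \iff p = q .
\]
```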

  4. Uncertainty modelling of real-time observation of a moving object: photogrammetric measurements

    Science.gov (United States)

    Ulrich, Thomas

    2015-04-01

    Photogrammetric systems are widely used in the field of industrial metrology to measure kinematic tasks such as tracking robot movements. In order to assess spatiotemporal deviations of a kinematic movement, it is crucial to have a reliable uncertainty for the kinematic measurements. Common methods to evaluate the uncertainty in kinematic measurements include approximations specified by the manufacturers, various analytical adjustment methods and Kalman filters. Here a hybrid system estimator in conjunction with a kinematic measurement model is applied. This method can be applied to processes which include various types of kinematic behaviour: constant velocity, variable acceleration or variable turn rates. Additionally, it has been shown that the approach is in accordance with the GUM (Guide to the Expression of Uncertainty in Measurement). The approach is compared to the Kalman filter using simulated data to achieve an overall error calculation. Furthermore, the new approach is used for the analysis of a rotating system, as this system has both a constant and a variable turn rate. As the new approach reduces overshoots, it is more appropriate for analysing kinematic processes than the Kalman filter. In comparison with the manufacturer's approximations, the new approach takes account of kinematic behaviour, with an improved description of the real measurement process. Therefore, this approach is well suited to the analysis of kinematic processes with unknown changes in kinematic behaviour.
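
    For comparison, the Kalman-filter baseline mentioned above can be sketched for a one-dimensional constant-velocity model; the sampling interval and noise levels are illustrative assumptions, not the parameters of the photogrammetric system.

```python
# One-dimensional constant-velocity Kalman filter tracking a moving target from noisy
# position observations; sqrt(diag(P)) gives the state standard uncertainties.
import numpy as np

dt = 0.01                                  # sampling interval (s)
F = np.array([[1.0, dt], [0.0, 1.0]])      # state transition (position, velocity)
H = np.array([[1.0, 0.0]])                 # position is observed
Q = 1e-4 * np.array([[dt**3 / 3, dt**2 / 2],
                     [dt**2 / 2, dt]])     # process noise covariance
R = np.array([[0.5e-3**2]])                # measurement noise variance (0.5 mm std, assumed)

x = np.zeros(2)                            # initial state estimate
P = np.eye(2)                              # initial state covariance

def kalman_step(x, P, z):
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with measurement z
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x = x + (K @ (z - H @ x)).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Example update with one simulated position observation (m)
x, P = kalman_step(x, P, np.array([0.0125]))
print(x, np.sqrt(np.diag(P)))
```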

  5. Sonoelasticity to monitor mechanical changes during rigor and ageing.

    Science.gov (United States)

    Ayadi, A; Culioli, J; Abouelkaram, S

    2007-06-01

    We propose the use of sonoelasticity as a non-destructive method to monitor changes in the resistance of muscle fibres, unaffected by connective tissue. Vibrations were applied at low frequency to induce oscillations in soft tissues, and an ultrasound transducer was used to detect the motions. The experiments were carried out on the M. biceps femoris muscles of three beef cattle. In addition to the sonoelasticity measurements, the changes in meat during rigor and ageing were followed by measurements of both the mechanical resistance of myofibres and pH. The variations of mechanical resistance and pH were compared to those of the sonoelastic variables (velocity and attenuation) at two frequencies. The relationships between pH and velocity or attenuation, and between the velocity or attenuation and the stress at 20% deformation, were highly correlated. We concluded that sonoelasticity is a non-destructive method that can be used to monitor mechanical changes in muscle fibres during rigor mortis and ageing.

  6. A study on the relationship between measurement uncertainty and the size of the disk gauge used to calibrate a straightness measuring system

    International Nuclear Information System (INIS)

    Fujimoto, Ikumatsu; Nishimura, Kunitoshi; Takatuji, Toshiyuki; Pyun, Young-Sik

    2011-01-01

    An autonomous method for calibrating the zero difference for the three-point method of surface straightness measurement is presented and discussed with respect to the relationship between the measurement uncertainty and the size of the disk gauge used for calibration. In this method, the disk gauge is used in two steps. In the first step, the disk gauge rotates a few revolutions and moves parallel to three displacement sensors built into a holder. In the second step, the geometrical parameters between the sensors and the disk gauge are acquired, and the zero differences are computed by our recently proposed algorithm. Finally, the uncertainty of the zero differences is analyzed and simulated numerically, and the relationship between the disk gauge radius and the measurement uncertainty is calculated. The use of a disk gauge of larger radius results in smaller uncertainty of straightness measurement

  7. Alignment measurements uncertainties for large assemblies using probabilistic analysis techniques

    CERN Document Server

    AUTHOR|(CDS)2090816; Almond, Heather

    Big science and ambitious industrial projects continually push forward with technical requirements beyond the grasp of conventional engineering techniques. Examples of these are ultra-high-precision requirements in the fields of celestial telescopes, particle accelerators and the aerospace industry. Such extreme requirements are limited largely by the capability of the metrology used, namely its uncertainty in relation to the alignment tolerance required. The current work was initiated as part of a Marie Curie European research project held at CERN, Geneva, aiming to answer those challenges as related to future accelerators requiring alignment of 2 m large assemblies to tolerances in the 10 µm range. The thesis has found several gaps in current knowledge limiting such capability. Among those was the lack of application of state-of-the-art uncertainty propagation methods in alignment measurement metrology. Another major limiting factor found was the lack of uncertainty statements in the thermal errors compensatio...

  8. Uncertainty of slip measurements in a cutting system of converting machinery for diapers production

    Directory of Open Access Journals (Sweden)

    D’Aponte F.

    2015-01-01

    Full Text Available In this paper, slip measurements between the peripheral surfaces of the knife cylinder and a non-driven anvil cylinder in a high-velocity, high-quality cutting unit of a diaper production line are described. Laboratory tests have been carried out on a test bench with real-scale components with a view to possible on-line application of the method. With reference to both start-up and steady-state conditions, correlations with the process parameters have been found, achieving a very satisfactory reduction of the slip between the knife cylinder and the non-driven anvil cylinder. Accuracy evaluation of the measurements allowed us to validate the obtained information and to evaluate the detection threshold of the measurement method in the present configuration. The analysis of specific contributions to the overall uncertainty could also be used to further reduce the uncertainty of the measurement method.

  9. Online updating and uncertainty quantification using nonstationary output-only measurement

    Science.gov (United States)

    Yuen, Ka-Veng; Kuok, Sin-Chi

    2016-01-01

    Extended Kalman filter (EKF) is widely adopted for state estimation and parametric identification of dynamical systems. In this algorithm, the covariance matrices of the process noise and measurement noise must be specified based on prior knowledge. However, improper assignment of these noise covariance matrices leads to unreliable estimation and misleading uncertainty quantification of the system state and model parameters. Furthermore, it may induce divergent estimation. To resolve these problems, we propose a Bayesian probabilistic algorithm for online estimation of the noise parameters which are used to characterize the noise covariance matrices. There are three major appealing features of the proposed approach. First, it resolves the divergence problem in the conventional usage of the EKF due to improper choice of the noise covariance matrices. Second, the proposed approach ensures the reliability of the uncertainty quantification. Finally, since the noise parameters are allowed to be time-varying, nonstationary process noise and/or measurement noise are explicitly taken into account. Examples using stationary/nonstationary response of linear/nonlinear time-varying dynamical systems are presented to demonstrate the efficacy of the proposed approach. Furthermore, a comparison with the conventional usage of the EKF is provided to reveal the necessity of the proposed approach for reliable model updating and uncertainty quantification.
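
    The idea of estimating noise parameters online, rather than fixing them a priori, can be sketched with a scalar filter that re-estimates its measurement-noise variance from recent innovations; this generic innovation-based scheme is only an illustration and is not the Bayesian algorithm proposed in the paper.

```python
# Scalar Kalman filter with a random-walk state model whose measurement-noise variance
# R is re-estimated from a sliding window of innovations (E[nu^2] = P_pred + R).
import numpy as np
from collections import deque

def adaptive_kf(zs, q=1e-4, r_init=1.0, window=30):
    x, p, r = 0.0, 1.0, r_init
    innovations = deque(maxlen=window)
    estimates = []
    for z in zs:
        p = p + q                      # predict (random-walk state)
        nu = z - x                     # innovation
        s = p + r                      # predicted innovation variance
        innovations.append(nu)
        if len(innovations) == window:
            # Re-estimate R from the sample variance of recent innovations
            r = max(np.var(innovations) - p, 1e-8)
        k = p / s                      # Kalman gain
        x = x + k * nu
        p = (1.0 - k) * p
        estimates.append(x)
    return np.array(estimates), r

# Example: noisy measurements of a constant level with unknown noise variance
rng = np.random.default_rng(0)
z = 5.0 + rng.normal(0.0, 0.3, size=500)
x_hat, r_hat = adaptive_kf(z)
print(f"final state estimate {x_hat[-1]:.3f}, estimated R {r_hat:.4f} (true 0.09)")
```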

  10. Including uncertainty in hazard analysis through fuzzy measures

    International Nuclear Information System (INIS)

    Bott, T.F.; Eisenhawer, S.W.

    1997-12-01

    This paper presents a method for capturing the uncertainty expressed by a Hazard Analysis (HA) expert team when estimating the frequencies and consequences of accident sequences, and provides a sound mathematical framework for propagating this uncertainty to the risk estimates for these accident sequences. The uncertainty is readily expressed as distributions that can visually aid the analyst in determining the extent and source of risk uncertainty in HA accident sequences. The results can also be expressed as single statistics of the distribution, in a manner analogous to expressing a probabilistic distribution as a point-value statistic such as a mean or median. The study discussed here used data collected during the elicitation portion of an HA on a high-level waste transfer process to demonstrate the techniques for capturing uncertainty. These data came from observations of the uncertainty that HA team members expressed in assigning frequencies and consequences to accident sequences during an actual HA. This uncertainty was captured and manipulated using ideas from possibility theory. The result of this study is a practical method for displaying and assessing the uncertainty in the HA team estimates of the frequency and consequences of accident sequences. This uncertainty provides potentially valuable information about accident sequences that typically is lost in the HA process

  11. Influence of measurement uncertainty on classification of thermal environment in buildings according to European Standard EN 15251

    DEFF Research Database (Denmark)

    Kolarik, Jakub; Olesen, Bjarne W.

    2015-01-01

    European Standard EN 15251 in its current version does not provide any guidance on how to handle the uncertainty of long-term measurements of indoor environmental parameters used for classification of buildings. The objective of the study was to analyse the uncertainty for field measurements ... measurements of operative temperature at two measuring points (south/south-west and north/northeast orientation). The results of the present study suggest that measurement uncertainty needs to be considered during assessment of the thermal environment in existing buildings. When the expanded standard uncertainty was taken ... into account in the categorization of the thermal environment according to EN 15251, the differences in the prevalence of exceeded category limits were up to 17.3%, 8.3% and 2% of occupied hours for categories I, II and III, respectively.

  12. Calculation of the detection limit in radiation measurements with systematic uncertainties

    International Nuclear Information System (INIS)

    Kirkpatrick, J.M.; Russ, W.; Venkataraman, R.; Young, B.M.

    2015-01-01

    The detection limit (L_D) or Minimum Detectable Activity (MDA) is an a priori evaluation of assay sensitivity intended to quantify the suitability of an instrument or measurement arrangement for the needs of a given application. Traditional approaches as pioneered by Currie rely on Gaussian approximations to yield simple, closed-form solutions, and neglect the effects of systematic uncertainties in the instrument calibration. These approximations are applicable over a wide range of applications, but are of limited use in low-count applications, when high confidence values are required, or when systematic uncertainties are significant. One proposed modification to the Currie formulation attempts to account for systematic uncertainties within a Gaussian framework. We have previously shown that this approach results in an approximation formula that works best only for small values of the relative systematic uncertainty, for which the modification of Currie's method is the least necessary, and that it significantly overestimates the detection limit or gives infinite or otherwise non-physical results for larger systematic uncertainties, where such a correction would be the most useful. We have developed an alternative approach for calculating detection limits based on realistic statistical modeling of the counting distributions which accurately represents statistical and systematic uncertainties. Instead of a closed-form solution, numerical and iterative methods are used to evaluate the result. Accurate detection limits can be obtained by this method for the general case
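
    For context, the traditional closed-form (Currie-type) detection limit that serves as the starting point reads as follows for a paired blank/sample measurement with well-known background and 5% false-positive and false-negative risks; systematic calibration uncertainty is deliberately absent, which is precisely the limitation addressed by the paper.

```python
# Classical Currie critical level L_C and detection limit L_D in net counts, using the
# Gaussian approximation for a paired measurement with background B counts.
import math

def currie_limits(background_counts, k=1.645):
    l_c = k * math.sqrt(2.0 * background_counts)   # critical level
    l_d = k**2 + 2.0 * l_c                         # ≈ 2.71 + 4.65 * sqrt(B) for k = 1.645
    return l_c, l_d

l_c, l_d = currie_limits(background_counts=400.0)
print(f"L_C = {l_c:.1f} counts, L_D = {l_d:.1f} counts")
```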

  13. Establishment of a procedure to calculate the measurement uncertainties in radiation survey meters calibration

    International Nuclear Information System (INIS)

    Manzoli, J.E.; Potiens, M.P.A.

    2000-01-01

    The Calibration Laboratory of Sao Paulo calibrates more than one thousand gamma-ray survey meters a year, besides other kinds of radiotherapy, radiodiagnostic and radiation protection instruments. It has a standard (600 cm³) cylindrical ionization chamber (Nuclear Enterprises Ltd. model 2511/3) traceable to the Brazilian Secondary Standard Dosimetry Laboratory (SSDL), whose instruments are traceable to the BIPM. Annually the beam dosimetry is performed using this chamber and the results are used as the true values for calibration purposes. The uncertainties present in every direct or indirect measurement during the calibration procedure must be evaluated for purposes of laboratory quality control. All calculation steps in the propagation of errors are presented in this work, starting from the ionization chamber charge measured with the standard instrument. Such propagation was made in space and time, considering even the uncertainties of the environmental quantities. The propagation was necessary in space because the ionization chamber measurements were performed at only one position in space. The time propagation was essential because activity is a peculiar physical quantity which changes with time according to precise relations for a specific radionuclide. The clear indication of every measurement uncertainty is always important to quantify the quality of the measurement. Nowadays the achievement of calibration laboratory quality systems requires the expression of all uncertainties and of the procedure used to evaluate them. An example of this procedure for the calibration of a typical portable radiation survey meter is presented. The direct exposure rate measurement of the instrument was compared with the true value given by the standard instrument, properly propagated, and all quantities used have their uncertainties shown. (author)

  14. Uncertainty assessing of measure result of tungsten in U3O8 by ICP-AES

    International Nuclear Information System (INIS)

    Du Guirong; Nie Jie; Tang Lilei

    2011-01-01

    Following the measurement method and the assessment criterion, the uncertainty of the measurement result for tungsten in U3O8 by ICP-AES is assessed. Detailed assessment of each component shows that the uncertainty contributions rank as u_rel(sc) > u_rel(c) > u_rel(F) > u_rel(m). The remaining uncertainty components are random and are evaluated by repetition. u_rel(sc) is the main contributor to the uncertainty, so the overall uncertainty can be reduced by strict operation that reduces u_rel(sc). (authors)

  15. Estimate of the uncertainty in measurement for the determination of mercury in seafood by TDA AAS.

    Science.gov (United States)

    Torres, Daiane Placido; Olivares, Igor R B; Queiroz, Helena Müller

    2015-01-01

    An approach is proposed for estimating the uncertainty in measurement for the determination of mercury in seafood by thermal decomposition/amalgamation atomic absorption spectrometry (TDA AAS), considering the individual sources related to the different steps of the method under evaluation as well as the uncertainties estimated from the validation data. The considered method has been fully optimized and validated in an official laboratory of the Ministry of Agriculture, Livestock and Food Supply of Brazil, in order to comply with national and international food regulations and quality assurance. The referred method has been accredited under the ISO/IEC 17025 norm since 2010. The estimate of the uncertainty in measurement was based on six sources of uncertainty for mercury determination in seafood by TDA AAS, following the validation process: linear least-squares regression, repeatability, intermediate precision, correction factor of the analytical curve, sample mass, and standard reference solution. Those that most influenced the uncertainty in measurement were sample mass, repeatability, intermediate precision and the calibration curve. The estimate of the uncertainty in measurement obtained in the present work reached a value of 13.39%, which complies with European Regulation EC 836/2011. This figure represents a very realistic estimate of the routine conditions, since it fairly encompasses the dispersion between the value attributed to the sample and the value measured by the laboratory analysts. From this outcome, it is possible to infer that the validation data (based on the calibration curve, recovery and precision), together with the variation in sample mass, can offer a proper estimate of the uncertainty in measurement.
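
    Combining such components by root-sum-of-squares and expanding with k = 2 can be sketched as below; the numerical values are placeholders for illustration and are not the laboratory's validation data.

```python
# Combine six relative standard uncertainty components into a combined and expanded
# relative uncertainty.
import math

components = {                      # relative standard uncertainties (assumed values)
    "linear_regression": 0.020,
    "repeatability": 0.035,
    "intermediate_precision": 0.040,
    "curve_correction_factor": 0.015,
    "sample_mass": 0.025,
    "standard_reference_solution": 0.010,
}

u_combined = math.sqrt(sum(u**2 for u in components.values()))
U_expanded = 2.0 * u_combined       # coverage factor k = 2 (~95% coverage)
print(f"combined u = {u_combined:.3%}, expanded U = {U_expanded:.3%}")
```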

  16. Measurement uncertainties for vacuum standards at Korea Research Institute of Standards and Science

    International Nuclear Information System (INIS)

    Hong, S. S.; Shin, Y. H.; Chung, K. H.

    2006-01-01

    The Korea Research Institute of Standards and Science has three major vacuum systems: an ultrasonic interferometer manometer (UIM) (Sec. II, Figs. 1 and 2) for low vacuum, a static expansion system (SES) (Sec. III, Figs. 3 and 4) for medium vacuum, and an orifice-type dynamic expansion system (DES) (Sec. IV, Figs. 5 and 6) for high and ultrahigh vacuum. For each system, explicit measurement model equations with multiple variables are given. According to ISO standards, all these system variable errors were used to calculate the expanded uncertainty (U). For each system the expanded uncertainties (k=1, confidence level=95%) and relative expanded uncertainty (expanded uncertainty/generated pressure) are summarized in Table IV and are estimated to be as follows. For the UIM, at 2.5-300 Pa generated pressure, the expanded uncertainty is of the order of 10⁻² Pa and the relative expanded uncertainty is of the order of 10⁻²; at 1-100 kPa generated pressure, the expanded uncertainty is of the order of 10⁻⁵. For the SES, at 3-100 Pa generated pressure, the expanded uncertainty is of the order of 10⁻¹ Pa and the relative expanded uncertainty is of the order of 10⁻³. For the DES, at 4.6×10⁻³ to 1.3×10⁻² Pa generated pressure, the expanded uncertainty is of the order of 10⁻⁴ Pa and the relative expanded uncertainty is of the order of 10⁻³; at 3.0×10⁻⁶ to 9.0×10⁻⁴ Pa generated pressure, the expanded uncertainty is of the order of 10⁻⁶ Pa and the relative expanded uncertainty is of the order of 10⁻². Within uncertainty limits, our bilateral and key comparisons [CCM.P-K4 (10 Pa-1 kPa)] are extensive and in good agreement with those of other nations (Fig. 8 and Table V)

  17. Long persistence of rigor mortis at constant low temperature.

    Science.gov (United States)

    Varetto, Lorenzo; Curto, Ombretta

    2005-01-06

    We studied the persistence of rigor mortis by using physical manipulation. We tested the mobility of the knee on 146 corpses kept under refrigeration at Torino's city mortuary at a constant temperature of +4 degrees C. We found a persistence of complete rigor lasting for 10 days in all the cadavers we kept under observation, and in one case rigor lasted for 16 days. Between the 11th and the 17th days, a progressively increasing number of corpses showed a change from complete to partial rigor (characterized by partial bending of the articulation). After the 17th day, all the remaining corpses showed partial rigor, and in the two cadavers that were kept under observation "à outrance" we found that the absolute resolution of rigor mortis occurred on the 28th day. Our results prove that it is possible to find a persistence of rigor mortis that is much longer than expected when environmental conditions resemble average outdoor winter temperatures in temperate zones. Therefore, this datum must be considered when a corpse is found in those environmental conditions so that, when estimating the time of death, we are not misled by the long persistence of rigor mortis.

  18. Uncertainty measurement in the homogenization and sample reduction in the physical classification of rice and beans

    Directory of Open Access Journals (Sweden)

    Dieisson Pivoto

    2016-04-01

    Full Text Available ABSTRACT: The study aimed to (i) quantify the measurement uncertainty in the physical tests of rice and beans for a hypothetical defect, (ii) verify whether homogenization and sample reduction in the physical classification tests of rice and beans are effective in reducing the measurement uncertainty of the process, and (iii) determine whether increasing the size of the bean sample increases accuracy and significantly reduces measurement uncertainty. Hypothetical defects in rice and beans with different damage levels were simulated according to the testing methodology determined by the Normative Ruling for each product. The homogenization and sample reduction in the physical classification of rice and beans are not effective, transferring a high measurement uncertainty to the final test result. The sample size indicated by the Normative Ruling did not allow appropriate homogenization and should be increased.

  19. Comparison of the GUM and Monte Carlo methods on the flatness uncertainty estimation in coordinate measuring machine

    Directory of Open Access Journals (Sweden)

    Jalid Abdelilah

    2016-01-01

    Full Text Available In the engineering industry, control of manufactured parts is usually done on a coordinate measuring machine (CMM): a sensor mounted at the end of the machine probes a set of points on the surface to be inspected. Data processing is performed subsequently using software, and the result of this measurement process either confirms the conformity of the part or not. Measurement uncertainty is a crucial parameter for making the right decisions, and not taking this parameter into account can therefore sometimes lead to aberrant decisions. The determination of measurement uncertainty on a CMM is a complex task owing to the variety of influencing factors. Through this study, we aim to check whether the uncertainty propagation model developed according to the Guide to the Expression of Uncertainty in Measurement (GUM) approach is valid; we present here a comparison of the GUM and Monte Carlo methods. This comparison is made to estimate the flatness deviation of a surface belonging to an industrial part and the uncertainty associated with the measurement result.
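
    The GUM-versus-Monte-Carlo comparison can be illustrated on a deliberately simple measurement model; the function and input values below are assumptions, and the actual flatness evaluation in the paper is considerably more involved.

```python
# Compare first-order GUM propagation with a Monte Carlo propagation of distributions
# for a simple two-input measurement model y = f(x1, x2).
import numpy as np

def f(x1, x2):
    return x1 * np.cos(x2)            # hypothetical measurement model

x1, u1 = 10.0, 0.05                   # input estimate and standard uncertainty
x2, u2 = 0.10, 0.02                   # (radians)

# GUM approach: linearized propagation with independent inputs
c1 = np.cos(x2)                       # sensitivity coefficient dy/dx1
c2 = -x1 * np.sin(x2)                 # sensitivity coefficient dy/dx2
u_gum = np.sqrt((c1 * u1)**2 + (c2 * u2)**2)

# Monte Carlo approach (GUM Supplement 1 style): propagate full distributions
rng = np.random.default_rng(1)
n = 200_000
y = f(rng.normal(x1, u1, n), rng.normal(x2, u2, n))
u_mc = np.std(y, ddof=1)

print(f"u(y) GUM = {u_gum:.4f}, u(y) Monte Carlo = {u_mc:.4f}")
```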

  20. Uncertainties in the measured quantities of water leaving waste Tank 241-C-106 via the ventilation system

    Energy Technology Data Exchange (ETDEWEB)

    Minteer, D.J.

    1995-01-23

    The purpose of this analysis is to estimate the uncertainty in the measured quantity of water which typically leaves Tank 241-C-106 via the ventilation system each month. Such measurements are essential for heat removal estimation and tank liquid level verification purposes. The uncertainty associated with the current, infrequent, manual method of measurement (involves various psychrometric and pressure measurements) is suspected to be unreasonably high. Thus, the possible reduction of this uncertainty using a continuous, automated method of measurement will also be estimated. There are three major conclusions as a result of this analysis: (1) the uncertainties associated with the current (infrequent, manual) method of measuring the water which typically leaves Tank 241-C-106 per month via the ventilation system are indeed quite high (80% to 120%); (2) given the current psychrometric and pressure measurement methods and any tank which loses considerable moisture through active ventilation, such as Tank 241-C-106, significant quantities of liquid can actually leak from the tank before a leak can be positively identified via liquid level measurement; (3) using improved (continuous, automated) methods of taking the psychrometric and pressure measurements, the uncertainty in the measured quantity of water leaving Tank 241-C-106 via the ventilation system can be reduced by approximately an order of magnitude.

  1. Uncertainties in the measured quantities of water leaving waste Tank 241-C-106 via the ventilation system

    International Nuclear Information System (INIS)

    Minteer, D.J.

    1995-01-01

    The purpose of this analysis is to estimate the uncertainty in the measured quantity of water which typically leaves Tank 241-C-106 via the ventilation system each month. Such measurements are essential for heat removal estimation and tank liquid level verification purposes. The uncertainty associated with the current, infrequent, manual method of measurement (involves various psychrometric and pressure measurements) is suspected to be unreasonably high. Thus, the possible reduction of this uncertainty using a continuous, automated method of measurement will also be estimated. There are three major conclusions as a result of this analysis: (1) the uncertainties associated with the current (infrequent, manual) method of measuring the water which typically leaves Tank 241-C-106 per month via the ventilation system are indeed quite high (80% to 120%); (2) given the current psychrometric and pressure measurement methods and any tank which loses considerable moisture through active ventilation, such as Tank 241-C-106, significant quantities of liquid can actually leak from the tank before a leak can be positively identified via liquid level measurement; (3) using improved (continuous, automated) methods of taking the psychrometric and pressure measurements, the uncertainty in the measured quantity of water leaving Tank 241-C-106 via the ventilation system can be reduced by approximately an order of magnitude

  2. Variance gradients and uncertainty budgets for nonlinear measurement functions with independent inputs

    International Nuclear Information System (INIS)

    Campanelli, Mark; Kacker, Raghu; Kessel, Rüdiger

    2013-01-01

    A novel variance-based measure for global sensitivity analysis, termed a variance gradient (VG), is presented for constructing uncertainty budgets under the Guide to the Expression of Uncertainty in Measurement (GUM) framework for nonlinear measurement functions with independent inputs. The motivation behind VGs is the desire of metrologists to understand which inputs' variance reductions would most effectively reduce the variance of the measurand. VGs are particularly useful when the application of the first supplement to the GUM is indicated because of the inadequacy of measurement function linearization. However, VGs reduce to a commonly understood variance decomposition in the case of a linear(ized) measurement function with independent inputs for which the original GUM readily applies. The usefulness of VGs is illustrated by application to an example from the first supplement to the GUM, as well as to the benchmark Ishigami function. A comparison of VGs to other available sensitivity measures is made. (paper)
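
    As a rough numerical illustration of ranking inputs by their contribution to the output variance, the sketch below estimates first-order (main-effect) variance shares for the Ishigami benchmark by plain Monte Carlo binning; this is a generic variance decomposition, not the paper's variance-gradient measure.

```python
# Estimate first-order variance shares Var(E[y|x_i]) / Var(y) for the Ishigami function
# with independent uniform inputs, using conditional means over quantile bins.
import numpy as np

def ishigami(x, a=7.0, b=0.1):
    return np.sin(x[:, 0]) + a * np.sin(x[:, 1])**2 + b * x[:, 2]**4 * np.sin(x[:, 0])

rng = np.random.default_rng(42)
n = 100_000
X = rng.uniform(-np.pi, np.pi, size=(n, 3))     # independent uniform inputs
y = ishigami(X)
var_y = np.var(y)

for i in range(3):
    bins = np.quantile(X[:, i], np.linspace(0, 1, 51))
    idx = np.clip(np.digitize(X[:, i], bins) - 1, 0, 49)
    cond_means = np.array([y[idx == k].mean() for k in range(50)])
    s_i = np.var(cond_means) / var_y            # approximate first-order index
    print(f"x{i+1}: first-order variance share ≈ {s_i:.2f}")
```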

  3. MM98.52 - An industrial comparison of coordinate measuring machines in Scandinavia with focus on uncertainty statements

    DEFF Research Database (Denmark)

    Hansen, Hans Nørgaard; Chiffre, Leonardo De

    1999-01-01

    This paper describes an industrial comparison of coordinate measuring machines (CMMs) carried out in the Scandinavian countries from October 1994 to May 1996. Fifty-nine industrial companies with a total of 62 CMMs participated in the project and measured a comparison package with five items chosen ... An important part of the intercomparison was to test the ability of the participants to determine measurement uncertainties. One of the uncertainties was based upon a "best guess", but nevertheless many participants did not even report this uncertainty. Uncertainty budgeting was not used for measurements other than simple length. For each company, a comparison of their measurement ability with the reference laboratory and other Scandinavian companies was made possible. A network regarding CMMs was created in these Scandinavian countries. (C) 1999 Elsevier Science Inc. All rights reserved.

  4. Determination of uncertainty of automated emission measuring systems under field conditions using a second method as a reference

    Energy Technology Data Exchange (ETDEWEB)

    Puustinen, H.; Aunela-Tapola, L.; Tolvanen, M.; Vahlman, T. [VTT Chemical Technology, Espoo (Finland). Environmental Technology; Kovanen, K. [VTT Building Technology, Espoo (Finland). Building Physics, Building Services and Fire Technology

    1999-09-01

    This report presents a procedure to determine the uncertainty of an automated emission measuring system (AMS) by comparing its results with those of a second method (REF). The procedure determines the uncertainty of the AMS by comparing the final concentration and emission results of the AMS and REF. In this way, the data processing of the plant is included in the result evaluation. This procedure assumes that the uncertainty of REF is known and determined in due form. The uncertainty determination has been divided into two cases: varying and nearly constant concentration. The suggested procedure calculates the uncertainty of the AMS at the 95% confidence level using a tabulated t-value. A minimum of three data pairs is required. However, a larger number of data pairs is desirable, since a small number of data pairs results in a higher uncertainty of the AMS. The uncertainty of the AMS is valid only within the range of concentrations at which the tests were carried out. Statistical data processing shows that the uncertainty of the reference method has a significant effect on the uncertainty of the AMS, which always becomes larger than the uncertainty of REF. This should be taken into account when testing whether the AMS fulfils the given uncertainty limits. Practical details concerning parallel measurements at the plant, and the costs of the measurement campaign, have been taken into account when suggesting alternative ways of implementing the comparative measurements. (orig.) 6 refs.
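
    A sketch of the paired AMS/REF evaluation at nearly constant concentration is given below; the data are invented, and the way the known REF uncertainty is combined with the observed scatter is an assumption for illustration rather than the report's exact procedure.

```python
# Evaluate the AMS/REF differences with a tabulated t-value at 95% confidence and
# combine the scatter with the known REF uncertainty.
import numpy as np
from scipy import stats

ams = np.array([102.1, 99.8, 101.5, 100.9, 103.2, 98.7])   # mg/m3 (assumed)
ref = np.array([100.4, 100.1, 100.8, 100.2, 101.9, 99.5])  # mg/m3 (assumed)
U_ref = 1.5                                                 # known expanded REF uncertainty, 95% (assumed)

d = ams - ref
n = len(d)
t95 = stats.t.ppf(0.975, df=n - 1)                          # two-sided tabulated t-value

bias = d.mean()
ci_half_width = t95 * d.std(ddof=1) / np.sqrt(n)            # 95% confidence interval on the bias
U_ams = np.sqrt((t95 * d.std(ddof=1))**2 + U_ref**2)        # AMS uncertainty >= REF uncertainty

print(f"bias = {bias:.2f} ± {ci_half_width:.2f} mg/m3, "
      f"estimated AMS uncertainty ≈ {U_ams:.2f} mg/m3 (95%)")
```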

  5. Maximum respiratory pressure measuring system : calibration and evaluation of uncertainty

    NARCIS (Netherlands)

    Ferreira, J.L.; Pereira, N.C.; Oliveira Júnior, M.; Vasconcelos, F.H.; Parreira, V.F.; Tierra-Criollo, C.J.

    2010-01-01

    The objective of this paper is to present a methodology for the evaluation of uncertainties in the measurement results obtained during the calibration of a digital manovacuometer prototype (DM) with an incorporated load-cell pressure sensor. Calibration curves were obtained for both pressure

  6. A technique for improved stability of adaptive feedforward controllers without detailed uncertainty measurements

    International Nuclear Information System (INIS)

    Berkhoff, A P

    2012-01-01

    Model errors in adaptive controllers for the reduction of broadband noise and vibrations may lead to unstable systems or increased error signals. Previous research on active structures with small damping has shown that the addition of a low-authority controller which increases damping in the system may lead to improved performance of an adaptive, high-authority controller. Other researchers have suggested the use of frequency dependent regularization based on measured uncertainties. In this paper an alternative method is presented that avoids the disadvantages of these methods, namely the additional complex hardware and the need to obtain detailed information on the uncertainties. An analysis is made of an adaptive feedforward controller in which a difference exists between the secondary path and the model as used in the controller. The real parts of the eigenvalues that determine the stability of the system are expressed in terms of the amount of uncertainty and the singular values of the secondary path. Modifications of the feedforward control scheme are suggested that aim to improve performance without requiring detailed uncertainty measurements. (paper)

  7. Framework for the assessment of PEMS (Portable Emissions Measurement Systems) uncertainty.

    Science.gov (United States)

    Giechaskiel, Barouch; Clairotte, Michael; Valverde-Morales, Victor; Bonnel, Pierre; Kregar, Zlatko; Franco, Vicente; Dilara, Panagiota

    2018-06-13

    European regulation 2016/427 (the first package of the so-called Real-Driving Emissions (RDE) regulation) introduced on-road testing with Portable Emissions Measurement Systems (PEMS) to complement the chassis dynamometer laboratory (Type I) test for the type approval of light-duty vehicles in the European Union from September 2017. The Not-To-Exceed (NTE) limit for a pollutant is the Type I test limit multiplied by a conformity factor that includes a margin for the additional measurement uncertainty of PEMS relative to standard laboratory equipment. The variability of measured results related to RDE trip design, vehicle operating conditions, and data evaluation remains outside of the uncertainty margin. The margins have to be reviewed annually (recital 10 of regulation 2016/646). This paper lays out the framework used for the first review of the NOx margin, which is also applicable to future margin reviews. Based on experimental data received from the stakeholders of the RDE technical working group in 2017, two NOx margin scenarios of 0.24-0.43 were calculated, accounting for different assumptions about the possible zero-drift behaviour of the PEMS during the tests. The reduced uncertainty margin compared to the one foreseen for 2020 (0.5) reflects the technical improvement of PEMS over the past few years. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.

  8. Verification of the Indicating Measuring Instruments Taking into Account their Instrumental Measurement Uncertainty

    Directory of Open Access Journals (Sweden)

    Zakharov Igor

    2017-12-01

    Full Text Available The specific features of the verification of measuring instruments based on the results of their calibration are considered. It is noted that, in contrast to the verification procedure used in legal metrology, the verification procedure for calibrated measuring instruments has to take the uncertainty of measurements into account. As a result, a large number of measuring instruments that are considered to be in compliance after verification in legal metrology turn out to be not in compliance after calibration. In this case, it is necessary to evaluate the probability of compliance of indicating measuring instruments. A procedure for determining the compliance probability on the basis of the Monte Carlo method is considered. An example of the calibration of a Vernier caliper is given.
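
    A minimal Monte Carlo sketch of such a compliance-probability evaluation is given below; the caliper error, uncertainty and maximum permissible error are assumed values, not those of the paper's example.

```python
# Estimate the probability that the instrument's true error of indication lies within
# the maximum permissible error, given the calibration result and its uncertainty.
import numpy as np

measured_error = 0.018      # mm, error of indication found at calibration (assumed)
u_error = 0.012             # mm, standard uncertainty of that error (assumed)
mpe = 0.030                 # mm, maximum permissible error for the caliper (assumed)

rng = np.random.default_rng(7)
n = 1_000_000
true_error = rng.normal(measured_error, u_error, n)   # draws for the plausible true error
p_compliance = np.mean(np.abs(true_error) <= mpe)

print(f"probability of compliance ≈ {p_compliance:.3f}")
```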

  9. SWEPP PAN assay system uncertainty analysis: Active mode measurements of solidified aqueous sludge waste

    International Nuclear Information System (INIS)

    Blackwood, L.G.; Harker, Y.D.; Meachum, T.R.

    1997-12-01

    The Idaho National Engineering and Environmental Laboratory is being used as a temporary storage facility for transuranic waste generated by the US Nuclear Weapons program at the Rocky Flats Plant (RFP) in Golden, Colorado. Currently, there is a large effort in progress to prepare to ship this waste to the Waste Isolation Pilot Plant (WIPP) in Carlsbad, New Mexico. In order to meet the TRU Waste Characterization Quality Assurance Program Plan nondestructive assay compliance requirements and quality assurance objectives, it is necessary to determine the total uncertainty of the radioassay results produced by the Stored Waste Examination Pilot Plant (SWEPP) Passive Active Neutron (PAN) radioassay system. This paper is one of a series of reports quantifying the results of the uncertainty analysis of the PAN system measurements for specific waste types and measurement modes. In particular this report covers active mode measurements of weapons grade plutonium-contaminated aqueous sludge waste contained in 208 liter drums (item description codes 1, 2, 7, 800, 803, and 807). Results of the uncertainty analysis for PAN active mode measurements of aqueous sludge indicate that a bias correction multiplier of 1.55 should be applied to the PAN aqueous sludge measurements. With the bias correction, the uncertainty bounds on the expected bias are 0 ± 27%. These bounds meet the Quality Assurance Program Plan requirements for radioassay systems
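
    The reported bias correction and bounds translate into a simple adjustment of a raw assay value; the raw value below is hypothetical, while the 1.55 multiplier and the ±27% bounds are the figures quoted in the abstract.

```python
# Applying the reported bias-correction multiplier and uncertainty bounds to a
# raw PAN active-mode assay result (raw value is hypothetical).
raw_assay_g_pu = 10.0     # g Pu, raw PAN result (hypothetical)
bias_correction = 1.55    # multiplier from the uncertainty analysis
relative_bound = 0.27     # 0 +/- 27% bounds on the expected bias

corrected = raw_assay_g_pu * bias_correction
lower, upper = corrected * (1 - relative_bound), corrected * (1 + relative_bound)
print(f"Corrected result: {corrected:.2f} g Pu (bounds: {lower:.2f} to {upper:.2f} g Pu)")
```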

  10. Numerical Continuation Methods for Intrusive Uncertainty Quantification Studies

    Energy Technology Data Exchange (ETDEWEB)

    Safta, Cosmin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Najm, Habib N. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Phipps, Eric Todd [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-09-01

    Rigorous modeling of engineering systems relies on efficient propagation of uncertainty from input parameters to model outputs. In recent years, there has been substantial development of probabilistic polynomial chaos (PC) Uncertainty Quantification (UQ) methods, enabling studies in expensive computational models. One approach, termed "intrusive", involving reformulation of the governing equations, has been found to have superior computational performance compared to non-intrusive sampling-based methods in relevant large-scale problems, particularly in the context of emerging architectures. However, the utility of intrusive methods has been severely limited due to detrimental numerical instabilities associated with strong nonlinear physics. Previous methods for stabilizing these constructions tend to add unacceptably high computational costs, particularly in problems with many uncertain parameters. In order to address these challenges, we propose to adapt and improve numerical continuation methods for the robust time integration of intrusive PC system dynamics. We propose adaptive methods, starting with a small uncertainty for which the model has stable behavior and gradually moving to larger uncertainty where the instabilities are rampant, in a manner that provides a suitable solution.
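
    The continuation idea (ramping the uncertainty magnitude and warm-starting each solve from the previous one) can be illustrated on a toy algebraic system; this is a generic parameter-continuation sketch, not the authors' intrusive PC solver, and the model problem is invented.

```python
# Generic parameter continuation: the strength of a destabilizing nonlinear
# term is increased gradually, and each solve is warm-started from the
# previous solution.
import numpy as np
from scipy.optimize import fsolve

def residual(x, eps):
    # Toy nonlinear system standing in for an intrusive PC system; the cubic
    # term mimics strong nonlinear physics whose magnitude scales with eps.
    return np.array([
        x[0] + eps * x[0]**3 - 1.0,
        x[1] + eps * x[0] * x[1] - 0.5,
    ])

x = np.array([1.0, 0.5])                  # solution of the (nearly) linear problem
for eps in np.linspace(0.0, 5.0, 11):     # gradually increase the "uncertainty" level
    x = fsolve(residual, x, args=(eps,))  # warm start from the previous step
    print(f"eps = {eps:4.1f}  ->  x = {x}")
```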

  11. Moving Beyond 2% Uncertainty: A New Framework for Quantifying Lidar Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Newman, Jennifer F.; Clifton, Andrew

    2017-03-08

    Remote sensing of wind using lidar is revolutionizing wind energy. However, current generations of wind lidar are ascribed a climatic value of uncertainty, which is based on a poor description of lidar sensitivity to external conditions. In this presentation, we show how it is important to consider the complete lidar measurement process to define the measurement uncertainty, which in turn offers the ability to define a much more granular and dynamic measurement uncertainty. This approach is a progression from the 'white box' lidar uncertainty method.

  12. A Two-Step Approach to Uncertainty Quantification of Core Simulators

    Directory of Open Access Journals (Sweden)

    Artem Yankov

    2012-01-01

    Full Text Available For the multiple sources of error introduced into the standard computational regime for simulating reactor cores, rigorous uncertainty analysis methods are available primarily to quantify the effects of cross section uncertainties. Two methods for propagating cross section uncertainties through core simulators are the XSUSA statistical approach and the “two-step” method. The XSUSA approach, which is based on the SUSA code package, is fundamentally a stochastic sampling method. Alternatively, the two-step method utilizes generalized perturbation theory in the first step and stochastic sampling in the second step. The consistency of these two methods in quantifying uncertainties in the multiplication factor and in the core power distribution was examined in the framework of phase I-3 of the OECD Uncertainty Analysis in Modeling benchmark. With the Three Mile Island Unit 1 core as a base model for analysis, the XSUSA and two-step methods were applied with certain limitations, and the results were compared to those produced by other stochastic sampling-based codes. Based on the uncertainty analysis results, conclusions were drawn as to the method that is currently more viable for computing uncertainties in burnup and transient calculations.
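
    A schematic of the stochastic-sampling (XSUSA-style) branch of the comparison: cross sections are perturbed according to an assumed covariance matrix and a core simulator is run for each sample. The "simulator" below is a stand-in toy function and the covariance data are invented for illustration.

```python
# Stochastic sampling propagation of cross-section uncertainties to k-effective.
import numpy as np

rng = np.random.default_rng(0)

xs_nominal = np.array([1.20, 0.05, 0.10])       # hypothetical group constants
cov = np.diag([0.02, 0.002, 0.004]) ** 2        # hypothetical covariance matrix

def core_simulator(xs):
    # Stand-in for a full core simulation returning k-effective.
    nu_sigma_f, sigma_a, leakage = xs
    return nu_sigma_f / (sigma_a + leakage + 1.0)

samples = rng.multivariate_normal(xs_nominal, cov, size=2000)
keff = np.array([core_simulator(s) for s in samples])

print(f"k-eff mean = {keff.mean():.5f}")
print(f"k-eff relative uncertainty = {100 * keff.std(ddof=1) / keff.mean():.2f} %")
```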

  13. Accounting for sensor calibration, data validation, measurement and sampling uncertainties in monitoring urban drainage systems.

    Science.gov (United States)

    Bertrand-Krajewski, J L; Bardin, J P; Mourad, M; Béranger, Y

    2003-01-01

    Assessing the functioning and the performance of urban drainage systems on both rainfall event and yearly time scales is usually based on online measurements of flow rates and on samples of influent and effluent for some rainfall events per year. In order to draw pertinent scientific and operational conclusions from the measurement results, it is absolutely necessary to use appropriate methods and techniques in order to i) calibrate sensors and analytical methods, ii) validate raw data, iii) evaluate measurement uncertainties, and iv) evaluate the number of rainfall events to sample per year in order to determine performance indicators with a given uncertainty. Based on previous work, the paper gives a synthetic review of the required methods and techniques, and illustrates their application to storage and settling tanks. Experiments show that, under controlled and careful experimental conditions, relative uncertainties are about 20% for flow rates in sewer pipes, 6-10% for volumes, 25-35% for TSS concentrations and loads, and 18-276% for TSS removal rates. In order to evaluate the annual pollutant interception efficiency of storage and settling tanks with a given uncertainty, efforts should first be devoted to decreasing the sampling uncertainty by increasing the number of sampled events.
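
    The very large removal-rate uncertainties quoted above follow directly from first-order propagation of the load uncertainties; a minimal sketch with hypothetical inlet and outlet loads and 30% relative standard uncertainties (the order of magnitude reported for TSS loads) is given below.

```python
# First-order propagation of inlet/outlet TSS load uncertainties to the
# removal rate eta = 1 - L_out / L_in (loads are hypothetical).
import math

load_in, u_rel_in = 1000.0, 0.30     # kg/event and relative standard uncertainty
load_out, u_rel_out = 650.0, 0.30

ratio = load_out / load_in
eta = 1.0 - ratio

# For a ratio, relative uncertainties add in quadrature; subtracting the ratio
# from a constant leaves the absolute uncertainty unchanged, so u(eta) = u(ratio).
u_eta = ratio * math.sqrt(u_rel_in**2 + u_rel_out**2)

print(f"Removal rate: {eta:.2f} +/- {u_eta:.2f} ({100 * u_eta / eta:.0f} % relative)")
```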

  14. Traceability and measurement uncertainty in sample preparation (W5)

    International Nuclear Information System (INIS)

    Wegscheider, W.; Walner, U.; Moser, J.

    2002-01-01

    Full text: Very few chemical measurements are being made directly on the object of interest and sample preparation is thus the rule rather than the exception in daily practice. Unfortunately the operations undertaken in the course of sample preparation are prone to rendering a sample useless for the purpose of interpreting a measurement performed on it, as it might not represent the original and relevant status any longer. Sample preparation along with sampling itself constitutes therefore a procedure that leads to a loss of representation of the original specimen or population. On the other hand it is also not sufficient to confine aspects of traceability and measurement uncertainty to the ultimate measurement, as the key purpose of measuring is to supply adequate data for some kind of decision, be it in production, in health, in the environment, or indeed in any other circumstance. These considerations have led to severe confusion in the community as to what traceability really means in chemistry. CITAC and EURACHEM have only recently issued a preliminary document that clarifies these issues and gives a firm handle on the future development of quality assurance in analytical chemistry. In this talk it will be attempted to outline the general ideas and procedures that lead to traceability of analytical chemical results accompanied by valid statements of their uncertainty. It will be argued that the central element in achieving these goals is a well-designed validation study that frequently goes beyond those requirements currently laid out in official documents. (author)

  15. Instrument uncertainty predictions

    International Nuclear Information System (INIS)

    Coutts, D.A.

    1991-07-01

    The accuracy of measurements and correlations should normally be provided for most experimental activities. The uncertainty is a measure of the accuracy of a stated value or equation. The uncertainty term reflects a combination of instrument errors, modeling limitations, and phenomena understanding deficiencies. This report provides several methodologies to estimate an instrument's uncertainty when used in experimental work. Methods are shown to predict both the pretest and post-test uncertainty
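
    A generic pretest-style estimate of the kind such methodologies produce: elemental error sources are combined by root-sum-square into a single instrument uncertainty. The error sources and their magnitudes below are hypothetical.

```python
# Root-sum-square combination of elemental instrument error sources.
import math

# Relative standard errors of a hypothetical flow instrument, in percent.
error_sources = {
    "calibration": 0.5,
    "data acquisition": 0.3,
    "installation effects": 0.8,
    "curve-fit / model": 0.4,
}

combined = math.sqrt(sum(e**2 for e in error_sources.values()))
print(f"Combined instrument uncertainty: {combined:.2f} % (1-sigma)")
print(f"Expanded (k = 2): {2 * combined:.2f} %")
```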

  16. Uncertainty in T1 mapping using the variable flip angle method with two flip angles

    International Nuclear Information System (INIS)

    Schabel, Matthias C; Morrell, Glen R

    2009-01-01

    Propagation of errors, in conjunction with the theoretical signal equation for spoiled gradient echo pulse sequences, is used to derive a theoretical expression for uncertainty in quantitative variable flip angle T1 mapping using two flip angles. This expression is then minimized to derive a rigorous expression for optimal flip angles that elucidates a commonly used empirical result. The theoretical expressions for uncertainty and optimal flip angles are combined to derive a lower bound on the achievable uncertainty for a given set of pulse sequence parameters and signal-to-noise ratio (SNR). These results provide a means of quantitatively determining the effect of changing acquisition parameters on T1 uncertainty. (note)
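
    For context, a numerical counterpart to the analytic treatment above: two-point T1 estimation from the spoiled gradient echo signal equation with a Monte Carlo estimate of the T1 uncertainty at an assumed SNR. The sequence parameters and noise level are illustrative, and the Monte Carlo route is used here in place of the paper's closed-form propagation.

```python
# Two-flip-angle T1 estimation (linearized SPGR signal) with Monte Carlo
# uncertainty propagation.
import numpy as np

rng = np.random.default_rng(7)

TR, T1_true, M0 = 5.0, 1000.0, 1.0          # ms, ms, a.u. (illustrative)
alphas = np.deg2rad([3.0, 15.0])            # the two flip angles, rad

def spgr(alpha, T1):
    E1 = np.exp(-TR / T1)
    return M0 * np.sin(alpha) * (1 - E1) / (1 - E1 * np.cos(alpha))

def t1_from_two_points(s, alphas):
    # Linearized form: S/sin(a) = E1 * S/tan(a) + M0*(1 - E1); the slope is E1.
    y = s / np.sin(alphas)
    x = s / np.tan(alphas)
    E1 = (y[1] - y[0]) / (x[1] - x[0])
    return -TR / np.log(E1)

signal = spgr(alphas, T1_true)
sigma = signal.max() / 100.0                 # noise level for SNR ~ 100 (assumed)

trials = rng.normal(signal, sigma, size=(20_000, 2))
t1_estimates = np.array([t1_from_two_points(s, alphas) for s in trials])

print(f"T1 estimate: {t1_estimates.mean():.0f} ms")
print(f"T1 uncertainty (1-sigma): {t1_estimates.std(ddof=1):.0f} ms")
```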

  17. Uncertainty Quantification of Fork Detector Measurements from Spent Fuel Loading Campaigns

    International Nuclear Information System (INIS)

    Vaccaro, S.; De Baere, P.; Schwalbach, P.; Gauld, I.; Hu, J.

    2015-01-01

    With increasing activities at the end of the fuel cycle, the requirements for the verification of spent nuclear fuel for safeguards purposes are continuously growing. In the European Union we are experiencing a dramatic increase in the number of cask loadings for interim dry storage. This is caused by the progressive shut-down of reactors, related to facility ageing but also to the politically motivated phase-out of nuclear power. On the other hand there are advanced plans for the construction of encapsulation plants and geological repositories. The cask loading or the encapsulation process will provide the last occasion to verify the spent fuel assemblies. In this context, Euratom and the US DOE have carried out a critical review of the widely used Fork measurement method for irradiated assemblies. The Nuclear Safeguards directorates of the European Commission's Directorate General for Energy and Oak Ridge National Laboratory have collaborated to improve the Fork data evaluation process and simplify its use for inspection applications. Within the Commission's standard data evaluation package CRISP, we included a SCALE/ORIGEN-based irradiation and depletion simulation of the measured assembly and modelled the Fork transfer function to calculate expected count rates based on the operator's declarations. The complete acquisition and evaluation process has been automated to compare expected (calculated) with measured count rates. This approach allows a physics-based improvement of the data review and evaluation process. At the same time the new method provides the means for better measurement uncertainty quantification. The present paper will address the implications of the combined approach involving measured and simulated data for the quantification of measurement uncertainty and the consequences of these uncertainties for the possible use of the Fork detector as a partial defect detection method. (author)

  18. A case of instantaneous rigor?

    Science.gov (United States)

    Pirch, J; Schulz, Y; Klintschar, M

    2013-09-01

    The question of whether instantaneous rigor mortis (IR), the hypothetic sudden occurrence of stiffening of the muscles upon death, actually exists has been controversially debated over the last 150 years. While modern German forensic literature rejects this concept, the contemporary British literature is more willing to embrace it. We present the case of a young woman who suffered from diabetes and who was found dead in an upright standing position with back and shoulders leaned against a punchbag and a cupboard. Rigor mortis was fully established, livor mortis was strong and according to the position the body was found in. After autopsy and toxicological analysis, it was stated that death most probably occurred due to a ketoacidotic coma with markedly increased values of glucose and lactate in the cerebrospinal fluid as well as acetone in blood and urine. Whereas the position of the body is most unusual, a detailed analysis revealed that it is a stable position even without rigor mortis. Therefore, this case does not further support the controversial concept of IR.

  19. High-voltage measurements on the 5 ppm relative uncertainty level with collinear laser spectroscopy

    Science.gov (United States)

    Krämer, J.; König, K.; Geppert, Ch; Imgram, P.; Maaß, B.; Meisner, J.; Otten, E. W.; Passon, S.; Ratajczyk, T.; Ullmann, J.; Nörtershäuser, W.

    2018-04-01

    We present the results of high-voltage collinear laser spectroscopy measurements on the 5 ppm relative uncertainty level using a pump and probe scheme at the 4s ²S₁/₂ → 4p ²P₃/₂ transition of ⁴⁰Ca⁺ involving the 3d ²D₅/₂ metastable state. With two-stage laser interaction and a reference measurement we can eliminate systematic effects such as differences in the contact potentials due to different electrode materials and thermoelectric voltages, and the unknown starting potential of the ions in the ion source. Voltage measurements were performed between -5 kV and -19 kV, and parallel measurements with stable high-voltage dividers calibrated to 5 ppm relative uncertainty were used as a reference. Our measurements are compatible with the uncertainty limits of the high-voltage dividers and demonstrate an unprecedented (factor of 20) increase in the precision of direct laser-based high-voltage measurements.

  20. ESTIMATION OF MEASUREMENT UNCERTAINTY IN THE DETERMINATION OF Fe CONTENT IN POWDERED TONIC FOOD DRINK USING GRAPHITE FURNACE ATOMIC ABSORPTION SPECTROMETRY

    Directory of Open Access Journals (Sweden)

    Harry Budiman

    2010-06-01

    Full Text Available The evaluation of measurement uncertainty in the determination of Fe content in powdered tonic food drink using graphite furnace atomic absorption spectrometry was carried out. The specification of the measurand, the sources of uncertainty, the standard uncertainties, the combined uncertainty and the expanded uncertainty of this measurement were evaluated and reported. The measurement result showed that the Fe content in the powdered tonic food drink sample was 569.32 µg/5 g, with an expanded measurement uncertainty of ± 178.20 µg/5 g (coverage factor k = 2, at a confidence level of 95%). The calibration curve gave the major contribution to the uncertainty of the final result.   Keywords: uncertainty, powdered tonic food drink, iron (Fe), graphite furnace AAS
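
    A minimal sketch of how such a result is assembled: relative standard uncertainty contributions are combined in quadrature and expanded with a coverage factor k = 2. The measured value and the dominance of the calibration-curve term come from the abstract; the individual component magnitudes are illustrative assumptions.

```python
# Combined and expanded (k = 2) uncertainty for the Fe determination.
import math

fe_content = 569.32          # µg per 5 g sample (value reported in the abstract)

# Relative standard uncertainties of the main contributions (illustrative;
# the calibration curve is assumed dominant, as stated in the abstract).
u_rel = {
    "calibration curve": 0.147,
    "sample preparation / dilution": 0.04,
    "repeatability": 0.03,
    "reference standard": 0.02,
}

u_c_rel = math.sqrt(sum(v**2 for v in u_rel.values()))
U = 2 * u_c_rel * fe_content                 # expanded uncertainty, k = 2 (~95 %)

print(f"Combined relative standard uncertainty: {100 * u_c_rel:.1f} %")
print(f"Result: {fe_content:.2f} ± {U:.2f} µg/5 g (k = 2)")
```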

  1. Practical aspects of the uncertainty and traceability of spectrochemical measurement results by electrothermal atomic absorption spectrometry

    International Nuclear Information System (INIS)

    Duta, S.; Robouch, P.; Barbu, L.; Taylor, P.

    2007-01-01

    The determination of trace element concentrations in water by electrothermal atomic absorption spectrometry (ETAAS) is a common and well established technique in many chemical testing laboratories. However, the evaluation of the uncertainty of measurement results is not systematically implemented. The paper presents an easy step-by-step example leading to the evaluation of the combined standard uncertainty of copper determination in water using ETAAS. The major contributors to the overall measurement uncertainty are identified: the amount of copper in the water sample (which depends mainly on the absorbance measurements), the certified reference material, and the auto-sampler volume measurements. The practical aspects of how the traceability of the copper concentration in water can be established and demonstrated are also pointed out.

  2. Practical aspects of the uncertainty and traceability of spectrochemical measurement results by electrothermal atomic absorption spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Duta, S. [Institute for Reference Materials and Measurements, Joint Research Centre, European Commission, Retieseweg 111, B-2440 Geel (Belgium); National Institute of Metrology, 042122 Vitan Barzesti 11, sector 4 Bucharest (Romania)], E-mail: steluta.duta@inm.ro; Robouch, P. [Institute for Reference Materials and Measurements, Joint Research Centre, European Commission, Retieseweg 111, B-2440 Geel (Belgium)], E-mail: Piotr.Robouch@ec.europa.eu; Barbu, L. [Coca-Cola Entreprise, Analytical Department, Bucharest (Romania); Taylor, P. [Institute for Reference Materials and Measurements, Joint Research Centre, European Commission, Retieseweg 111, B-2440 Geel (Belgium)], E-mail: Philip.Taylor@ec.europa.eu

    2007-04-15

    The determination of trace element concentrations in water by electrothermal atomic absorption spectrometry (ETAAS) is a common and well established technique in many chemical testing laboratories. However, the evaluation of the uncertainty of measurement results is not systematically implemented. The paper presents an easy step-by-step example leading to the evaluation of the combined standard uncertainty of copper determination in water using ETAAS. The major contributors to the overall measurement uncertainty are identified: the amount of copper in the water sample (which depends mainly on the absorbance measurements), the certified reference material, and the auto-sampler volume measurements. The practical aspects of how the traceability of the copper concentration in water can be established and demonstrated are also pointed out.

  3. Toward a more rigorous application of margins and uncertainties within the nuclear weapons life cycle : a Sandia perspective

    International Nuclear Information System (INIS)

    Klenke, Scott Edward; Novotny, George Charles; Paulsen Robert A., Jr.; Diegert, Kathleen V.; Trucano, Timothy Guy; Pilch, Martin M.

    2007-01-01

    This paper presents the conceptual framework that is being used to define quantification of margins and uncertainties (QMU) for application in the nuclear weapons (NW) work conducted at Sandia National Laboratories. The conceptual framework addresses the margins and uncertainties throughout the NW life cycle and includes the definition of terms related to QMU and to figures of merit. Potential applications of QMU consist of analyses based on physical data and on modeling and simulation. Appendix A provides general guidelines for addressing cases in which significant and relevant physical data are available for QMU analysis. Appendix B gives the specific guidance that was used to conduct QMU analyses in cycle 12 of the annual assessment process. Appendix C offers general guidelines for addressing cases in which appropriate models are available for use in QMU analysis. Appendix D contains an example that highlights the consequences of different treatments of uncertainty in model-based QMU analyses

  4. LOWERING UNCERTAINTY IN CRUDE OIL MEASUREMENT BY SELECTING OPTIMIZED ENVELOPE COLOR OF A PIPELINE

    Directory of Open Access Journals (Sweden)

    Morteza Saadat

    2011-01-01

    Full Text Available Lowering the uncertainty in crude oil volume measurement has been widely considered one of the main objectives at an oil export terminal. The crude oil temperature at the metering station has a significant effect on the measured volume and may introduce considerable uncertainty at the metering point. As crude oil flows through an aboveground pipeline, it picks up solar radiation and heats up. This causes the oil temperature at the metering point to rise and a higher uncertainty to be created. The amount of temperature rise depends on the exterior surface paint color. On Kharg Island, there is about 3 km distance between the oil storage tanks and the metering point. The oil flows through the pipeline due to gravity, as the storage tanks are located 60 m higher than the metering point. In this study, an analytical model has been developed for predicting the oil temperature at the pipeline exit (the metering point) based on the climate and geographical conditions of Kharg Island. The temperature at the metering point has been calculated and the effects of envelope color have been investigated. Further, the uncertainty in the measurement system due to the temperature rise has been studied.
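
    A simple sensitivity sketch of the effect described above, using a linear thermal-expansion model with an assumed expansion coefficient and hypothetical volumes; it only illustrates how a temperature rise at the metering point biases the standard volume, not the authors' analytical pipeline model.

```python
# How a temperature rise along the pipeline shifts the metered crude oil volume.
beta = 9.0e-4          # 1/°C, volumetric thermal expansion of crude oil (assumed)
v_metered = 10_000.0   # m3, volume registered at the metering point (hypothetical)
t_ref = 15.0           # °C, reference temperature for standard volume

for t_meter in (20.0, 25.0, 30.0):             # possible oil temperatures at the meter
    v_standard = v_metered * (1.0 - beta * (t_meter - t_ref))
    bias_pct = 100.0 * (v_metered - v_standard) / v_standard
    print(f"T = {t_meter:4.1f} °C  ->  standard volume = {v_standard:8.1f} m3"
          f"  (uncorrected bias = {bias_pct:+.2f} %)")
```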

  5. Mathematical Rigor in Introductory Physics

    Science.gov (United States)

    Vandyke, Michael; Bassichis, William

    2011-10-01

    Calculus-based introductory physics courses intended for future engineers and physicists are often designed and taught in the same fashion as those intended for students of other disciplines. A more mathematically rigorous curriculum should be more appropriate and, ultimately, more beneficial for the student in his or her future coursework. This work investigates the effects of mathematical rigor on student understanding of introductory mechanics. Using a series of diagnostic tools in conjunction with individual student course performance, a statistical analysis will be performed to examine student learning of introductory mechanics and its relation to student understanding of the underlying calculus.

  6. Uncertainty in CH4 and N2O emission estimates from a managed fen meadow using EC measurements

    International Nuclear Information System (INIS)

    Kroon, P.S.; Hensen, A.; Van 't Veen, W.H.; Vermeulen, A.T.; Jonker, H.

    2009-02-01

    The overall uncertainty in annual flux estimates derived from chamber measurements may be as high as 50% due to the temporal and spatial variability in the fluxes. As even a large number of chamber plots still cover typically less than 1% of the total field area, the field-scale integrated emission necessarily remains a matter of speculation. High frequency micrometeorological methods are a good option for obtaining integrated estimates on a hectare scale with a continuous coverage in time. Instrumentation is now becoming available that meets the requirements for CH4 and N2O eddy covariance (EC) measurements. A system consisting of a quantum cascade laser (QCL) spectrometer and a sonic anemometer has recently been proven to be suitable for performing EC measurements. This study analyses the EC flux measurements of CH4 and N2O and their corrections, such as calibration, the Webb correction, and corrections for high and low frequency losses, and assesses the magnitude of the uncertainties associated with the precision of the measurement instruments, the measurement set-up and the methodology. The uncertainty of a single EC flux measurement and of a daily, monthly and 3-monthly average EC flux is estimated. In addition, the cumulative emission of C-CH4 and N-N2O and their uncertainties are determined over several fertilizing events at a dairy farm site in the Netherlands. These fertilizing events are selected from the continuous EC flux measurements from August 2006 to September 2008. The EC flux uncertainties are compared with the overall uncertainty in annual flux estimates derived from chamber measurements. It will be shown that EC flux measurements can decrease the overall uncertainty in annual flux estimates

  7. Uncertainty in CH4 and N2O emission estimates from a managed fen meadow using EC measurements

    Energy Technology Data Exchange (ETDEWEB)

    Kroon, P.S.; Hensen, A.; Van ' t Veen, W.H.; Vermeulen, A.T. [ECN Biomass, Coal and Environment, Petten (Netherlands); Jonker, H. [Delft University of Technology, Delft (Netherlands)

    2009-02-15

    The overall uncertainty in annual flux estimates derived from chamber measurements may be as high as 50% due to the temporal and spatial variability in the fluxes. As even a large number of chamber plots still cover typically less than 1% of the total field area, the field-scale integrated emission necessarily remains a matter of speculation. High frequency micrometeorological methods are a good option for obtaining integrated estimates on a hectare scale with a continuous coverage in time. Instrumentation is now becoming available that meets the requirements for CH4 and N2O eddy covariance (EC) measurements. A system consisting of a quantum cascade laser (QCL) spectrometer and a sonic anemometer has recently been proven to be suitable for performing EC measurements. This study analyses the EC flux measurements of CH4 and N2O and their corrections, such as calibration, the Webb correction, and corrections for high and low frequency losses, and assesses the magnitude of the uncertainties associated with the precision of the measurement instruments, the measurement set-up and the methodology. The uncertainty of a single EC flux measurement and of a daily, monthly and 3-monthly average EC flux is estimated. In addition, the cumulative emission of C-CH4 and N-N2O and their uncertainties are determined over several fertilizing events at a dairy farm site in the Netherlands. These fertilizing events are selected from the continuous EC flux measurements from August 2006 to September 2008. The EC flux uncertainties are compared with the overall uncertainty in annual flux estimates derived from chamber measurements. It will be shown that EC flux measurements can decrease the overall uncertainty in annual flux estimates.

  8. "Rigor mortis" in a live patient.

    Science.gov (United States)

    Chakravarthy, Murali

    2010-03-01

    Rigor mortis is conventionally a postmortem change. Its occurrence suggests that death has occurred at least a few hours ago. The authors report a case of "rigor mortis" in a live patient after cardiac surgery. The likely factors that may have predisposed such premortem muscle stiffening in the reported patient are an intense low cardiac output state, the use of unusually high doses of inotropic and vasopressor agents, and likely sepsis. Such an event may be of importance while determining the time of death in individuals such as the one described in the report. It may also suggest the requirement of careful examination of patients with muscle stiffening prior to the declaration of death. This report is being published to point out the likely controversies that might arise out of muscle stiffening, which should not always be termed rigor mortis and/or considered postmortem.

  9. Classroom Talk for Rigorous Reading Comprehension Instruction

    Science.gov (United States)

    Wolf, Mikyung Kim; Crosson, Amy C.; Resnick, Lauren B.

    2004-01-01

    This study examined the quality of classroom talk and its relation to academic rigor in reading-comprehension lessons. Additionally, the study aimed to characterize effective questions to support rigorous reading comprehension lessons. The data for this study included 21 reading-comprehension lessons in several elementary and middle schools from…

  10. Measuring Cross-Section and Estimating Uncertainties with the fissionTPC

    Energy Technology Data Exchange (ETDEWEB)

    Bowden, N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Manning, B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Sangiorgio, S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Seilhan, B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-01-30

    The purpose of this document is to outline the prescription for measuring fission cross-sections with the NIFFTE fissionTPC and estimating the associated uncertainties. As such it will serve as a work planning guide for NIFFTE collaboration members and facilitate clear communication of the procedures used to the broader community.

  11. Routine internal- and external-quality control data in clinical laboratories for estimating measurement and diagnostic uncertainty using GUM principles.

    Science.gov (United States)

    Magnusson, Bertil; Ossowicki, Haakan; Rienitz, Olaf; Theodorsson, Elvar

    2012-05-01

    Healthcare laboratories are increasingly joining into larger laboratory organizations encompassing several physical laboratories. This caters for important new opportunities for re-defining the concept of a 'laboratory' to encompass all laboratories and measurement methods measuring the same measurand for a population of patients. In order to make measurement results comparable, bias should be minimized or eliminated and measurement uncertainty properly evaluated for all methods used for a particular patient population. The measurement as well as the diagnostic uncertainty can be evaluated from internal and external quality control results using GUM principles. In this paper the uncertainty evaluations are described in detail using only two main components, within-laboratory reproducibility and the uncertainty of the bias component, according to a Nordtest guideline. The evaluation is exemplified for the determination of creatinine in serum for a conglomerate of laboratories, expressed both in absolute units (μmol/L) and in relative terms (%). An expanded measurement uncertainty of 12 μmol/L was estimated for concentrations of creatinine below 120 μmol/L, and of 10% for concentrations above 120 μmol/L. The diagnostic uncertainty encompasses both measurement uncertainty and biological variation, and can be estimated for a single value and for a difference. This diagnostic uncertainty for the difference between two samples from the same patient was determined to be 14 μmol/L for concentrations of creatinine below 100 μmol/L and 14% for concentrations above 100 μmol/L.
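
    A minimal sketch of the two-component evaluation described above, following the Nordtest approach of combining within-laboratory reproducibility with a bias component estimated from external quality control results; all input values are illustrative, not taken from the paper.

```python
# Nordtest-style combined and expanded measurement uncertainty from routine
# internal and external quality control data.
import math

u_Rw_rel = 0.030            # within-lab reproducibility from internal QC (relative)

# Bias information from external QC rounds: observed relative biases and the
# relative standard uncertainty of the assigned reference values.
biases = [0.020, -0.015, 0.030, 0.010]
u_cref_rel = 0.015

rms_bias = math.sqrt(sum(b**2 for b in biases) / len(biases))
u_bias_rel = math.sqrt(rms_bias**2 + u_cref_rel**2)

u_c_rel = math.sqrt(u_Rw_rel**2 + u_bias_rel**2)
U_rel = 2 * u_c_rel                                  # expanded, k = 2 (~95 %)

print(f"u(Rw) = {100*u_Rw_rel:.1f} %, u(bias) = {100*u_bias_rel:.1f} %")
print(f"Expanded measurement uncertainty U = {100*U_rel:.1f} % (k = 2)")
```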

  12. An analysis of combined standard uncertainty for radiochemical measurements of environmental samples

    International Nuclear Information System (INIS)

    Berne, A.

    1996-01-01

    It is anticipated that future data acquisitions intended for use in radiological risk assessments will require the incorporation of uncertainty analysis. Often, only one aliquot of the sample is taken and a single determination is made. Under these circumstances, the total uncertainty is calculated using the "propagation of errors" approach. However, there is no agreement in the radioanalytical community as to the exact equations to use. The Quality Assurance/Metrology Division of the Environmental Measurements Laboratory has developed a systematic process to compute uncertainties in constituent components of the analytical procedure, as well as the combined standard uncertainty (CSU). The equations for computation are presented here, with examples of their use. They have also been incorporated into a code for use in the spreadsheet application QuattroPro™. Using the spreadsheet with appropriate inputs permits an analysis of the variations in the CSU as a function of several different variables. The relative importance of the "counting uncertainty" can also be ascertained.

  13. Conditional uncertainty principle

    Science.gov (United States)

    Gour, Gilad; Grudka, Andrzej; Horodecki, Michał; Kłobus, Waldemar; Łodyga, Justyna; Narasimhachar, Varun

    2018-04-01

    We develop a general operational framework that formalizes the concept of conditional uncertainty in a measure-independent fashion. Our formalism is built upon a mathematical relation which we call conditional majorization. We define conditional majorization and, for the case of classical memory, we provide its thorough characterization in terms of monotones, i.e., functions that preserve the partial order under conditional majorization. We demonstrate the application of this framework by deriving two types of memory-assisted uncertainty relations, (1) a monotone-based conditional uncertainty relation and (2) a universal measure-independent conditional uncertainty relation, both of which set a lower bound on the minimal uncertainty that Bob has about Alice's pair of incompatible measurements, conditioned on arbitrary measurement that Bob makes on his own system. We next compare the obtained relations with their existing entropic counterparts and find that they are at least independent.

  14. Assessment of the uncertainties in the Radiological Protection Institute of Ireland (RPII) radon measurements service

    Energy Technology Data Exchange (ETDEWEB)

    Hanley, O. [Radiological Protection Institute of Ireland, 3 Clonskeagh Square, Clonskeagh Road, Dublin 14 (Ireland)], E-mail: ohanley@rpii.ie; Gutierrez-Villanueva, J.L. [Laboratorio LIBRA, Edificio I-D, Paseo Belen 3, 47011 Valladolid (Spain); Departamento de Fisica Teorica, Atomica y Optica, Facultad de Ciencias, Paseo Prado de la Magdalena, s/n. 47005 Valladolid (Spain)], E-mail: joselg@libra.uva.es; Currivan, L. [Radiological Protection Institute of Ireland, 3 Clonskeagh Square, Clonskeagh Road, Dublin 14 (Ireland)], E-mail: lcurrivan@rpii.ie; Pollard, D. [Radiological Protection Institute of Ireland, 3 Clonskeagh Square, Clonskeagh Road, Dublin 14 (Ireland)], E-mail: dpollard@rpii.ie

    2008-10-15

    The RPII radon (Rn) laboratory holds accreditation for the International Standard ISO/IEC 17025. A requirement of this standard is an estimate of the uncertainty of measurement. This work shows two approaches to estimate the uncertainty. The bottom-up approach involved identifying the components that were found to contribute to the uncertainty. Estimates were made for each of these components, which were combined to give a combined uncertainty of 13.5% at a Rn concentration of approximately 2500 Bq m⁻³ at the 68% confidence level. By applying a coverage factor of k = 2, the expanded uncertainty is ±27% at the 95% confidence level. The top-down approach used information previously gathered from intercomparison exercises to estimate the uncertainty. This investigation found an expanded uncertainty of ±22% at approximately 95% confidence level. This is good agreement for such independent estimates.

  15. Assessment of the uncertainties in the Radiological Protection Institute of Ireland (RPII) radon measurements service.

    Science.gov (United States)

    Hanley, O; Gutiérrez-Villanueva, J L; Currivan, L; Pollard, D

    2008-10-01

    The RPII radon (Rn) laboratory holds accreditation for the International Standard ISO/IEC 17025. A requirement of this standard is an estimate of the uncertainty of measurement. This work shows two approaches to estimate the uncertainty. The bottom-up approach involved identifying the components that were found to contribute to the uncertainty. Estimates were made for each of these components, which were combined to give a combined uncertainty of 13.5% at a Rn concentration of approximately 2500 Bq m(-3) at the 68% confidence level. By applying a coverage factor of k=2, the expanded uncertainty is +/-27% at the 95% confidence level. The top-down approach used information previously gathered from intercomparison exercises to estimate the uncertainty. This investigation found an expanded uncertainty of +/-22% at approximately 95% confidence level. This is good agreement for such independent estimates.

  16. Assessing climate change and socio-economic uncertainties in long term management of water resources

    Science.gov (United States)

    Jahanshahi, Golnaz; Dawson, Richard; Walsh, Claire; Birkinshaw, Stephen; Glenis, Vassilis

    2015-04-01

    Long term management of water resources is challenging for decision makers given the range of uncertainties that exist. Such uncertainties are a function of long term drivers of change, such as climate, environmental loadings, demography, land use and other socio economic drivers. Impacts of climate change on frequency of extreme events such as drought make it a serious threat to water resources and water security. The release of probabilistic climate information, such as the UKCP09 scenarios, provides improved understanding of some uncertainties in climate models. This has motivated a more rigorous approach to dealing with other uncertainties in order to understand the sensitivity of investment decisions to future uncertainty and identify adaptation options that are as far as possible robust. We have developed and coupled a system of models that includes a weather generator, simulations of catchment hydrology, demand for water and the water resource system. This integrated model has been applied in the Thames catchment which supplies the city of London, UK. This region is one of the driest in the UK and hence sensitive to water availability. In addition, it is one of the fastest growing parts of the UK and plays an important economic role. Key uncertainties in long term water resources in the Thames catchment, many of which result from earth system processes, are identified and quantified. The implications of these uncertainties are explored using a combination of uncertainty analysis and sensitivity testing. The analysis shows considerable uncertainty in future rainfall, river flow and consequently water resource. For example, results indicate that by the 2050s, low flow (Q95) in the Thames catchment will range from -44 to +9% compared with the control scenario (1970s). Consequently, by the 2050s the average number of drought days are expected to increase 4-6 times relative to the 1970s. Uncertainties associated with urban growth increase these risks further

  17. A rigorous methodology for development and uncertainty analysis of group contribution based property models

    DEFF Research Database (Denmark)

    Frutiger, Jerome; Abildskov, Jens; Sin, Gürkan

    … 2) Weighted-least-squares regression. 3) Initialization of the estimation using linear algebra to provide a first guess. 4) Sequential parameter estimation and simultaneous GC parameter estimation using 4 different minimization algorithms. 5) Thorough uncertainty analysis: a) based on an asymptotic approximation of the parameter covariance matrix, b) based on the bootstrap method, providing 95%-confidence intervals for the parameters and the predicted property. 6) Performance statistics analysis and model application. The application of the methodology is shown for a new GC model built to predict the lower flammability limit (LFL) of refrigerants … their credibility and robustness in wider industrial and scientific applications.
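
    A sketch of two elements of such a methodology, weighted-least-squares parameter estimation and bootstrap 95%-confidence intervals, on a synthetic stand-in for a group-contribution model; the data and model are invented, not the authors' LFL model.

```python
# Weighted least-squares estimation with non-parametric bootstrap confidence
# intervals for the fitted parameters.
import numpy as np

rng = np.random.default_rng(3)

n_data, n_groups = 60, 3
X = rng.integers(0, 4, size=(n_data, n_groups)).astype(float)   # group counts
theta_true = np.array([2.0, -0.5, 1.2])
sigma = 0.2 * np.ones(n_data)                                   # data uncertainties
y = X @ theta_true + rng.normal(0.0, sigma)

def wls(X, y, sigma):
    # Weighted least squares: minimize sum(((y - X theta) / sigma)^2).
    w = 1.0 / sigma
    return np.linalg.lstsq(X * w[:, None], y * w, rcond=None)[0]

theta_hat = wls(X, y, sigma)

# Bootstrap over data points to obtain 95% confidence intervals.
boot = np.array([wls(X[idx], y[idx], sigma[idx])
                 for idx in (rng.integers(0, n_data, n_data) for _ in range(2000))])
ci_low, ci_high = np.percentile(boot, [2.5, 97.5], axis=0)

for k, (est, lo, hi) in enumerate(zip(theta_hat, ci_low, ci_high)):
    print(f"theta[{k}] = {est:+.3f}  (95 % CI: {lo:+.3f} to {hi:+.3f})")
```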

  18. Joint measurements of spin, operational locality and uncertainty

    International Nuclear Information System (INIS)

    Andersson, E.; Barnett, S.M.; Aspect, A.

    2005-01-01

    Full text: Joint measurements of non-commuting observables are possible within quantum mechanics, if one accepts an increase in the variances of the jointly measured observables. In this contribution, we discuss joint measurements of spin 1/2 along any two directions. Starting from an operational locality principle, we show how to obtain the known bound on how sharp the joint measurement can be. Operational locality here means, that no operation performed at a quantum system at one location can instantaneously affect a system at another location. The measurement bound is general and is here obtained without reference to any quantum measurement formalism. We find that the bound is formally identical to a Bell inequality of the CHSH type, and we also give a direct interpretation of the measurement bound in terms of an uncertainty relation. A simple way to realise the joint measurement for the case of photon polarization is presented. Further to their fundamental interest, quantum joint measurements of non-commuting observables can be related to state estimation. They are also of interest in quantum information, e.g. as strategies for eavesdropping in quantum cryptography. (author)

  19. Evaluation of uncertainty in the measurement of environmental electromagnetic fields

    International Nuclear Information System (INIS)

    Vulevic, B.; Osmokrovic, P.

    2010-01-01

    With regard to Non-ionising radiation protection, the relationship between human exposure to electromagnetic fields and health is controversial. Electromagnetic fields have become omnipresent in the daily environment. This paper assesses the problem of how to compare a measurement result with a limit fixed by the standard for human exposure to electric, magnetic and electromagnetic fields (0 Hz-300 GHz). The purpose of the paper is an appropriate representation of the basic information about evaluation of measurement uncertainty. (authors)

  20. The role of the uncertainty of measurement of serum creatinine concentrations in the diagnosis of acute kidney injury.

    Science.gov (United States)

    Kin Tekce, Buket; Tekce, Hikmet; Aktas, Gulali; Uyeturk, Ugur

    2016-01-01

    Uncertainty of measurement is the numeric expression of the errors associated with all measurements taken in clinical laboratories. Serum creatinine concentration is the most common diagnostic marker for acute kidney injury. The goal of this study was to determine the effect of the uncertainty of measurement of serum creatinine concentrations on the diagnosis of acute kidney injury. We calculated the uncertainty of measurement of serum creatinine according to the Nordtest Guide. Retrospectively, we identified 289 patients who were evaluated for acute kidney injury. Of the total patient pool, 233 were diagnosed with acute kidney injury using the AKIN classification scheme and then were compared using statistical analysis. We determined nine probabilities of the uncertainty of measurement of serum creatinine concentrations. There was a statistically significant difference in the number of patients diagnosed with acute kidney injury when uncertainty of measurement was taken into consideration (first probability compared to the fifth p = 0.023 and first probability compared to the ninth p = 0.012). We found that the uncertainty of measurement for serum creatinine concentrations was an important factor for correctly diagnosing acute kidney injury. In addition, based on the AKIN classification scheme, minimizing the total allowable error levels for serum creatinine concentrations is necessary for the accurate diagnosis of acute kidney injury by clinicians.

  1. Software metrics a rigorous and practical approach

    CERN Document Server

    Fenton, Norman

    2014-01-01

    A Framework for Managing, Measuring, and Predicting Attributes of Software Development Products and Processes. Reflecting the immense progress in the development and use of software metrics in the past decades, Software Metrics: A Rigorous and Practical Approach, Third Edition provides an up-to-date, accessible, and comprehensive introduction to software metrics. Like its popular predecessors, this third edition discusses important issues, explains essential concepts, and offers new approaches for tackling long-standing problems. New to the Third Edition: This edition contains new material relevant

  2. Understanding uncertainty

    CERN Document Server

    Lindley, Dennis V

    2013-01-01

    Praise for the First Edition: "...a reference for everyone who is interested in knowing and handling uncertainty." -Journal of Applied Statistics. The critically acclaimed First Edition of Understanding Uncertainty provided a study of uncertainty addressed to scholars in all fields, showing that uncertainty could be measured by probability, and that probability obeyed three basic rules that enabled uncertainty to be handled sensibly in everyday life. These ideas were extended to embrace the scientific method and to show how decisions, containing an uncertain element, could be rationally made.

  3. [Rigor mortis -- a definite sign of death?].

    Science.gov (United States)

    Heller, A R; Müller, M P; Frank, M D; Dressler, J

    2005-04-01

    In past years an ongoing controversial debate has existed in Germany regarding the quality of the coroner's inquest and the declaration of death by physicians. We report the case of a 90-year-old female who was found after an unknown time following a suicide attempt with benzodiazepine. The examination of the patient showed livores (mortis?) on the left forearm and left lower leg. Moreover, rigor (mortis?) of the left arm was apparent, which prevented arm flexion and extension. The hypothermic patient with insufficient respiration was intubated and mechanically ventilated. Chest compressions were not performed, because central pulses were (hardly) palpable and a sinus bradycardia of 45/min (second-degree AV block and isolated premature ventricular complexes) was present. After placement of an intravenous line (17 G, external jugular vein) the hemodynamic situation was stabilized with intermittent boli of epinephrine and with sodium bicarbonate. With improved circulation, livores and rigor disappeared. In the present case a minimal central circulation was noted, which could be stabilized despite the presence of apparently certain signs of death (livores and rigor mortis). Considering the finding of an abrogated peripheral perfusion (livores), we postulate a centripetal collapse of the glycogen and ATP supply in the patient's left arm (rigor), which was restored after resuscitation and reperfusion. Thus, it appears that livores and rigor are not sensitive enough to exclude a vita minima, in particular in hypothermic patients with intoxications. Consequently, a careful ABC check should be performed even in the presence of apparently certain signs of death, to avoid underdiagnosing a vita minima. Additional ECG monitoring is required to reduce the rate of false positive declarations of death. To what extent basic life support by paramedics should be continued in the presence of rigor and livores until a physician issues a DNR order deserves further discussion.

  4. Fundamental uncertainty limit of optical flow velocimetry according to Heisenberg's uncertainty principle.

    Science.gov (United States)

    Fischer, Andreas

    2016-11-01

    Optical flow velocity measurements are important for understanding the complex behavior of flows. Although a huge variety of methods exist, they are either based on a Doppler or a time-of-flight measurement principle. Doppler velocimetry evaluates the velocity-dependent frequency shift of light scattered at a moving particle, whereas time-of-flight velocimetry evaluates the traveled distance of a scattering particle per time interval. Regarding the aim of achieving a minimal measurement uncertainty, it is unclear whether one principle allows lower uncertainties to be achieved or whether both principles can achieve equal uncertainties. For this reason, the natural, fundamental uncertainty limit according to Heisenberg's uncertainty principle is derived for the Doppler and time-of-flight measurement principles, respectively. The obtained limits of the velocity uncertainty are qualitatively identical, showing, e.g., a direct proportionality to the absolute value of the velocity to the power of 3/2 and an indirect proportionality to the square root of the scattered light power. Hence, both measurement principles have identical potentials regarding the fundamental uncertainty limit due to the quantum mechanical behavior of photons. This fundamental limit can be attained (at least asymptotically) in reality either with Doppler or time-of-flight methods, because the respective Cramér-Rao bounds for dominating photon shot noise, which is modeled as white Poissonian noise, are identical with the conclusions from Heisenberg's uncertainty principle.
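
    The scalings quoted in the abstract can be written compactly as follows; the proportionality constant is setup dependent and is not specified here.

```latex
% Fundamental (Heisenberg-limited) velocity uncertainty scaling as stated in
% the abstract: grows with |v|^{3/2}, decreases with the square root of the
% scattered light power P_scat.
\[
  \sigma_{v,\min} \;\propto\; \frac{|v|^{3/2}}{\sqrt{P_{\mathrm{scat}}}}
\]
```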

  5. Retrievals and uncertainty analysis of aerosol single scattering albedo from MFRSR measurements

    International Nuclear Information System (INIS)

    Yin, Bangsheng; Min, Qilong; Joseph, Everette

    2015-01-01

    Aerosol single scattering albedo (SSA) can be retrieved from the ratio of diffuse horizontal and direct normal fluxes measured from multifilter rotating shadowband radiometer (MFRSR). In this study, the measurement channels at 415 nm and 870 nm are selected for aerosol optical depth (AOD) and Angstrom coefficient retrievals, and the measurements at 415 nm are used for aerosol SSA retrievals with the constraint of retrieved Angstrom coefficient. We extensively assessed various issues impacting on the accuracy of SSA retrieval from measurements to input parameters and assumptions. For cloud-free days with mean aerosol loading of 0.13–0.60, our sensitivity study indicated that: (1) 1% calibration uncertainty can result in 0.8–3.7% changes in retrieved SSA; (2) without considering the cosine respond correction and/or forward scattering correction will result in underestimation of 1.1–3.3% and/or 0.73% in retrieved SSA; (3) an overestimation of 0.1 in asymmetry factor can result in an underestimation of 2.54–3.4% in retrieved SSA; (4) for small aerosol loading (e.g., 0.13), the uncertainty associated with the choice of Rayleigh optical depth value can result in non-negligible change in retrieved SSA (e.g., 0.015); (5) an uncertainty of 0.05 for surface albedo can result in changes of 1.49–5.4% in retrieved SSA. We applied the retrieval algorithm to the MFRSR measurements at the Atmospheric Radiation Measurements (ARM) Southern Great Plains (SGP) site. The retrieved results of AOD, Angstrom coefficient, and SSA are basically consistent with other independent measurements from co-located instruments at the site. - Highlights: • Aerosol SSA is derived from MFRSR measured diffuse to direct normal irradiance ratio. • We extensively assessed various issues impacting on the accuracy of SSA retrieval. • The issues are mainly from measurements and model input parameters and assumptions. • We applied the retrieval algorithm to the MFRSR measurements at ARM SGP

  6. Time-Resolved Particle Image Velocimetry Measurements with Wall Shear Stress and Uncertainty Quantification for the FDA Nozzle Model.

    Science.gov (United States)

    Raben, Jaime S; Hariharan, Prasanna; Robinson, Ronald; Malinauskas, Richard; Vlachos, Pavlos P

    2016-03-01

    We present advanced particle image velocimetry (PIV) processing, post-processing, and uncertainty estimation techniques to support the validation of computational fluid dynamics analyses of medical devices. This work is an extension of a previous FDA-sponsored multi-laboratory study, which used a medical device mimicking geometry referred to as the FDA benchmark nozzle model. Experimental measurements were performed using time-resolved PIV at five overlapping regions of the model for Reynolds numbers in the nozzle throat of 500, 2000, 5000, and 8000. Images included a twofold increase in spatial resolution in comparison to the previous study. Data was processed using ensemble correlation, dynamic range enhancement, and phase correlations to increase signal-to-noise ratios and measurement accuracy, and to resolve flow regions with large velocity ranges and gradients, which is typical of many blood-contacting medical devices. Parameters relevant to device safety, including shear stress at the wall and in bulk flow, were computed using radial basis functions. In addition, in-field spatially resolved pressure distributions, Reynolds stresses, and energy dissipation rates were computed from PIV measurements. Velocity measurement uncertainty was estimated directly from the PIV correlation plane, and uncertainty analysis for wall shear stress at each measurement location was performed using a Monte Carlo model. Local velocity uncertainty varied greatly and depended largely on local conditions such as particle seeding, velocity gradients, and particle displacements. Uncertainty in low velocity regions in the sudden expansion section of the nozzle was greatly reduced by over an order of magnitude when dynamic range enhancement was applied. Wall shear stress uncertainty was dominated by uncertainty contributions from velocity estimations, which were shown to account for 90-99% of the total uncertainty. This study provides advancements in the PIV processing methodologies over

  7. International Target Values 2010 for Measurement Uncertainties in Safeguarding Nuclear Materials

    Energy Technology Data Exchange (ETDEWEB)

    Zhao, M.; Penkin, M.; Norman, C.; Balsley, S. [IAEA, Vienna (Austria)]; and others

    2012-12-15

    This issue of the International Target Values (ITVs) represents the sixth revision, following the first release of such tables issued in 1979 by the ESARDA/WGDA. The ITVs are uncertainties to be considered in judging the reliability of analytical techniques applied to industrial nuclear and fissile material, which are subject to safeguards verification. The tabulated values represent estimates of the 'state of the practice' which should be achievable under routine measurement conditions. The most recent standard conventions in representing uncertainty have been considered, while maintaining a format that allows comparison with the previous releases of the ITVs. The present report explains why target values are needed, how the concept evolved and how they relate to the operator's and inspector's measurement systems. The ITVs-2010 are intended to be used by plant operators and safeguards organizations, as a reference of the quality of measurements achievable in nuclear material accountancy, and for planning purposes. The report suggests that the use of ITVs can be beneficial for statistical inferences regarding the significance of operator-inspector differences whenever valid performance values are not available.

  8. Road safety performance measures and AADT uncertainty from short-term counts.

    Science.gov (United States)

    Milligan, Craig; Montufar, Jeannette; Regehr, Jonathan; Ghanney, Bartholomew

    2016-12-01

    The objective of this paper is to enable better risk analysis of road safety performance measures by creating the first knowledge base on uncertainty surrounding annual average daily traffic (AADT) estimates when the estimates are derived by expanding short-term counts with the individual permanent counter method. Many road safety performance measures and performance models use AADT as an input. While there is an awareness that the input suffers from uncertainty, the uncertainty is not well known or accounted for. The paper samples data from a set of 69 permanent automatic traffic recorders in Manitoba, Canada, to simulate almost 2 million short-term counts over a five year period. These short-term counts are expanded to AADT estimates by transferring temporal information from a directly linked nearby permanent count control station, and the resulting AADT values are compared to a known reference AADT to compute errors. The impacts of five factors on AADT error are considered: length of short-term count, number of short-term counts, use of weekday versus weekend counts, distance from a count to its expansion control station, and the AADT at the count site. The mean absolute transfer error for expanded AADT estimates is 6.7%, and this value varied by traffic pattern group from 5% to 10.5%. Reference percentiles of the error distribution show that almost all errors are between -20% and +30%. Error decreases substantially by using a 48-h count instead of a 24-h count, and only slightly by using two counts instead of one. Weekday counts are superior to weekend counts, especially if the count is only 24 h. Mean absolute transfer error increases with distance to control station (elasticity 0.121, p=0.001) and increases with AADT (elasticity 0.857, p<0.001). These results support risk analysis of road safety performance measures that use AADT as inputs. Analytical frameworks for such analysis exist but are infrequently used in road safety because the evidence base on AADT uncertainty is not well developed.

  9. Some comments on rigorous quantum field path integrals in the analytical regularization scheme

    Energy Technology Data Exchange (ETDEWEB)

    Botelho, Luiz C.L. [Universidade Federal Fluminense (UFF), Niteroi, RJ (Brazil). Dept. de Matematica Aplicada]. E-mail: botelho.luiz@superig.com.br

    2008-07-01

    Through the systematic use of the Minlos theorem on the support of cylindrical measures on R^∞, we produce several mathematically rigorous path integrals in interacting euclidean quantum fields with Gaussian free measures defined by generalized powers of the Laplacian operator. (author)

  10. Some comments on rigorous quantum field path integrals in the analytical regularization scheme

    International Nuclear Information System (INIS)

    Botelho, Luiz C.L.

    2008-01-01

    Through the systematic use of the Minlos theorem on the support of cylindrical measures on R^∞, we produce several mathematically rigorous path integrals in interacting euclidean quantum fields with Gaussian free measures defined by generalized powers of the Laplacian operator. (author)

  11. Eye tracking measures of uncertainty during perceptual decision making.

    Science.gov (United States)

    Brunyé, Tad T; Gardony, Aaron L

    2017-10-01

    Perceptual decision making involves gathering and interpreting sensory information to effectively categorize the world and inform behavior. For instance, a radiologist distinguishing the presence versus absence of a tumor, or a luggage screener categorizing objects as threatening or non-threatening. In many cases, sensory information is not sufficient to reliably disambiguate the nature of a stimulus, and resulting decisions are done under conditions of uncertainty. The present study asked whether several oculomotor metrics might prove sensitive to transient states of uncertainty during perceptual decision making. Participants viewed images with varying visual clarity and were asked to categorize them as faces or houses, and rate the certainty of their decisions, while we used eye tracking to monitor fixations, saccades, blinks, and pupil diameter. Results demonstrated that decision certainty influenced several oculomotor variables, including fixation frequency and duration, the frequency, peak velocity, and amplitude of saccades, and phasic pupil diameter. Whereas most measures tended to change linearly along with decision certainty, pupil diameter revealed more nuanced and dynamic information about the time course of perceptual decision making. Together, results demonstrate robust alterations in eye movement behavior as a function of decision certainty and attention demands, and suggest that monitoring oculomotor variables during applied task performance may prove valuable for identifying and remediating transient states of uncertainty. Published by Elsevier B.V.

  12. Measurement uncertainties in regression analysis with scarcity of data

    International Nuclear Information System (INIS)

    Sousa, J A; Ribeiro, A S; Cox, M G; Harris, P M; Sousa, J F V

    2010-01-01

    The evaluation of measurement uncertainty, in certain fields of science, faces the problem of scarcity of data. This is certainly the case in the testing of geological soils in civil engineering, where tests can take several days or weeks and where the same sample is not available for further testing, being destroyed during the experiment. In this particular study attention will be paid to triaxial compression tests used to typify particular soils. The purpose of the testing is to determine two parameters that characterize the soil, namely, cohesion and friction angle. These parameters are defined in terms of the intercept and slope of a straight line fitted to a small number of points (usually three) derived from experimental data. The use of ordinary least squares to obtain uncertainties associated with estimates of the two parameters would be unreliable if there were only three points (and no replicates) and hence only one degree of freedom.
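    As a sketch of why three points leave so little room for uncertainty estimation: an ordinary least-squares fit of a line to three (normal stress, shear stress) pairs leaves only one residual degree of freedom, so the standard uncertainties of the slope (friction angle) and intercept (cohesion) rest on a single residual. The data below are invented for illustration.

```python
import numpy as np

# Invented triaxial-style data: normal stress vs. shear stress at failure (kPa)
sigma_n = np.array([100.0, 200.0, 300.0])
tau = np.array([95.0, 152.0, 215.0])

n = len(sigma_n)
# Ordinary least-squares line: tau = c + sigma_n * tan(phi)
A = np.vstack([np.ones(n), sigma_n]).T
coef, residuals, *_ = np.linalg.lstsq(A, tau, rcond=None)
c, slope = coef

dof = n - 2                      # only one degree of freedom with three points
s2 = residuals[0] / dof          # residual variance
cov = s2 * np.linalg.inv(A.T @ A)
u_c, u_slope = np.sqrt(np.diag(cov))

phi = np.degrees(np.arctan(slope))
u_phi = np.degrees(u_slope / (1.0 + slope**2))   # propagate through arctan

print(f"cohesion  c = {c:.1f} ± {u_c:.1f} kPa")
print(f"friction phi = {phi:.1f} ± {u_phi:.1f} deg   (dof = {dof})")
```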

  13. Measurement uncertainty associated with chromatic confocal profilometry for 3D surface texture characterization of natural human enamel.

    Science.gov (United States)

    Mullan, F; Bartlett, D; Austin, R S

    2017-06-01

    To investigate the measurement performance of a chromatic confocal profilometer for quantification of the surface texture of natural human enamel in vitro. Contributions to the measurement uncertainty from all potential sources of measurement error using a chromatic confocal profilometer and surface metrology software were quantified using a series of surface metrology calibration artifacts and pre-worn enamel samples. The 3D surface texture analysis protocol was optimized across 0.04 mm² of natural, unpolished enamel undergoing dietary acid erosion (pH 3.2, titratable acidity 41.3 mmol OH/L). Flatness deviations due to the x, y stage mechanical movement were the major contribution to the measurement uncertainty, with maximum Sz flatness errors of 0.49 μm, whereas measurement noise, non-linearities in x, y, z and enamel sample dimensional instability contributed minimal errors. The measurement errors were propagated into an uncertainty budget following a Type B uncertainty evaluation in order to calculate the combined standard uncertainty (u_c), which was ±0.28 μm. Statistically significant increases in the median (IQR) roughness (Sa) of the polished samples occurred after 15 (+0.17 (0.13) μm), 30 (+0.12 (0.09) μm) and 45 (+0.18 (0.15) μm) min of erosion. The greatest contribution to the uncertainty of chromatic confocal profilometry was from flatness deviations; however, by optimizing measurement protocols the profilometer successfully characterized surface texture changes in enamel from erosive wear in vitro. Copyright © 2017 The Academy of Dental Materials. All rights reserved.
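    A minimal sketch of the Type B combination described above, with placeholder component values (only the ±0.28 μm combined figure comes from the abstract; the individual magnitudes and divisors are assumptions for illustration): each error bound is converted to a standard uncertainty with an assumed rectangular distribution and the components are combined in quadrature.

```python
import math

# Hypothetical error bounds (um), each treated as a rectangular distribution;
# the breakdown is illustrative, not the paper's actual budget.
error_bounds_um = {
    "stage flatness (Sz)": 0.49,
    "measurement noise": 0.05,
    "x,y,z non-linearity": 0.08,
    "sample dimensional instability": 0.04,
}

# Type B: standard uncertainty of a rectangular distribution = half-width / sqrt(3)
standard_uncertainties = {k: v / math.sqrt(3) for k, v in error_bounds_um.items()}

# Combined standard uncertainty (uncorrelated components, added in quadrature)
u_c = math.sqrt(sum(u**2 for u in standard_uncertainties.values()))
print(f"u_c = {u_c:.2f} um")
```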

  14. Uncertainty in measurements performed in the ionization chamber practical; Incerteza nas medidas realizadas pela pratica da camara de ionizacao

    Energy Technology Data Exchange (ETDEWEB)

    Sales, Emer; Pinto, Fernando Sandi; Sousa Junior, Samuel Facanha; Freitas, Dayslon Luiz Gaudaret; Andrade, Lucio das Chagas de, E-mail: fernandopintofis@gmail.com [Universidade do Estado do Rio de Janeiro (UERJ), RJ (Brazil)

    2016-07-01

    The calculation of uncertainty is a mathematical tool widely used in the analysis of experimental data, ensuring that the values obtained with measuring equipment are as accurate and as close to the true value as possible. This paper presents a theoretical review of uncertainty and applies it to the determination of the repeatability and reproducibility uncertainties of the measurement process used to determine the dose of a radioactive source in the ionization chamber practical held at the Professional Master's program in Medical Physics of the State University of Rio de Janeiro. (author)

  15. Evaluation of uncertainty in the measurement of sense of natural language constructions

    Directory of Open Access Journals (Sweden)

    Bisikalo Oleg V.

    2017-01-01

    The task of evaluating uncertainty in the measurement of sense in natural language constructions (NLCs) was researched through formalization of the notions of the language image, of artificial cognitive systems (ACSs) and of units of meaning. The method for measuring the sense of natural language constructions incorporates fuzzy relations of meaning, which ensures that information about the links between lemmas of the text is taken into account and permits the evaluation of two types of measurement uncertainty of sense characteristics. Using the developed application programs, experiments were conducted to investigate the proposed method for identifying the informative characteristics of a text. The experiments yielded parameter dependencies that allow the Pareto distribution law to be used to define relations between lemmas; their analysis identifies the exponent of the average number of connections in the language image as the most informative characteristic of a text.

  16. Measurement uncertainty recapture (MUR) power uprates operation at Kuosheng Nuclear Power Station

    International Nuclear Information System (INIS)

    Chang Chinjang; Wang Tunglu; Lin Chihpao

    2009-01-01

    Measurement Uncertainty Recapture Power Uprates (MUR PU) are achieved through the use of state-of-the-art feedwater flow measurement devices, i.e., ultrasonic flow meters (UFMs), that reduce the degree of uncertainty associated with feedwater flow measurement and in turn provide a more accurate calculation of thermal power. The Institute of Nuclear Energy Research (INER) teamed with Sargent and Lundy, LLC (S and L), Pacific Engineers and Constructors, Ltd (PECL), and AREVA to develop a program and plan for the Kuosheng Nuclear Power Station (KNPS) MUR PU Engineering Service Project and to assist with the Kuosheng MUR PU operation. After the regulator's approval of the licensing requests, KNPS conducted the power ascension test and switchover to the new rated thermal power for Unit 2 and Unit 1 on 7/7/2007 and 11/30/2007, respectively. From then on, KNPS became the first nuclear power plant implementing MUR PU operation in Taiwan and in Asia. (author)

  17. Quantifying uncertainty in the measurement of arsenic in suspended particulate matter by Atomic Absorption Spectrometry with hydride generator

    Directory of Open Access Journals (Sweden)

    Ahuja Tarushee

    2011-04-01

    Arsenic is a toxic element that causes several problems in humans, especially when inhaled through air. Accurate and precise measurement of arsenic in suspended particulate matter (SPM) is therefore of prime importance, as it gives information about the level of toxicity in the environment so that preventive measures can be taken in the affected areas. Quality assurance is equally important in the measurement of arsenic in SPM samples before making any decision. The quality and reliability of data on such volatile elements depend upon the measurement uncertainty of each step involved, from sampling to analysis. Analytical results that quantify uncertainty give a measure of the confidence level of the laboratory concerned. The main objective of this study was therefore to determine the arsenic content in SPM samples with an uncertainty budget and to identify the potential sources of uncertainty that affect the results. With these facts in mind, we selected seven diverse sites in Delhi (the national capital of India) for quantification of the arsenic content of SPM samples, with an uncertainty budget covering sampling by high-volume sampler (HVS) through analysis by atomic absorption spectrometry with hydride generation (AAS-HG). Many steps are involved from sampling to the final result, and we have considered the various potential sources of uncertainty. The calculation of uncertainty is based on the ISO/IEC 17025:2005 document and the EURACHEM guideline. It was found that the final results depend mainly on the uncertainties due to repeatability, the final volume prepared for analysis, the weighing balance and sampling by HVS. After analysis of the data from the seven sites, it was concluded that during the period from 31 Jan. 2008 to 7 Feb. 2008 the arsenic concentration varied from 1.44 ± 0.25 to 5.58 ± 0.55 ng/m³ at the 95% confidence level (k = 2).
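    A hedged sketch of the kind of EURACHEM-style budget described: relative standard uncertainties from the named sources (repeatability, final volume, balance, HVS sampling) are combined in quadrature and expanded with k = 2. The numbers are placeholders, not the study's values.

```python
import math

def expanded_uncertainty(value, rel_components, k=2.0):
    """Combine relative standard uncertainties in quadrature and expand with
    coverage factor k (~95% confidence for a normal distribution)."""
    u_rel = math.sqrt(sum(r**2 for r in rel_components.values()))
    return value * u_rel * k

# Hypothetical budget for an arsenic concentration of 3.1 ng/m^3
components = {
    "repeatability": 0.055,
    "final volume": 0.020,
    "weighing balance": 0.015,
    "HVS sampling": 0.060,
}
U = expanded_uncertainty(3.1, components, k=2.0)
print(f"As = 3.1 ± {U:.2f} ng/m^3 (k = 2)")
```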

  18. Uncertainties of predictions from parton distributions 1, experimental errors

    CERN Document Server

    Martin, A D; Stirling, William James; Thorne, R S; CERN. Geneva

    2003-01-01

    We determine the uncertainties on observables arising from the errors on the experimental data that are fitted in the global MRST2001 parton analysis. By diagonalizing the error matrix we produce sets of partons suitable for use within the framework of linear propagation of errors, which is the most convenient method for calculating the uncertainties. Despite the potential limitations of this approach we find that it can be made to work well in practice. This is confirmed by our alternative approach of using the more rigorous Lagrange multiplier method to determine the errors on physical quantities directly. As particular examples we determine the uncertainties on the predictions of the charged-current deep-inelastic structure functions, on the cross-sections for W production and for Higgs boson production via gluon--gluon fusion at the Tevatron and the LHC, on the ratio of W-minus to W-plus production at the LHC and on the moments of the non-singlet quark distributions. We discuss the corresponding uncertain...

  19. Measurement Uncertainty for Finite Quantum Observables

    Directory of Open Access Journals (Sweden)

    René Schwonnek

    2016-06-01

    Measurement uncertainty relations are lower bounds on the errors of any approximate joint measurement of two or more quantum observables. The aim of this paper is to provide methods to compute optimal bounds of this type. The basic method is semidefinite programming, which we apply to arbitrary finite collections of projective observables on a finite dimensional Hilbert space. The quantification of errors is based on an arbitrary cost function, which assigns a penalty to getting result x rather than y, for any pair (x, y). This induces a notion of optimal transport cost for a pair of probability distributions, and we include an Appendix with a short summary of optimal transport theory as needed in our context. There are then different ways to form an overall figure of merit from the comparison of distributions. We consider three, which are related to different physical testing scenarios. The most thorough test compares the transport distances between the marginals of a joint measurement and the reference observables for every input state. Less demanding is a test just on the states for which a “true value” is known in the sense that the reference observable yields a definite outcome. Finally, we can measure a deviation as a single expectation value by comparing the two observables on the two parts of a maximally-entangled state. All three error quantities have the property that they vanish if and only if the tested observable is equal to the reference. The theory is illustrated with some characteristic examples.

  20. Calculation of uncertainties associated to environmental radioactivity measurements and their functions. Practical Procedure II

    International Nuclear Information System (INIS)

    Gascon, C.; Anton, M.P.

    1997-01-01

    Environmental radioactivity measurements are mainly affected by counting uncertainties. In this report the uncertainties associated with certain functions related to activity concentration calculations are determined. Some practical exercises are presented to calculate the uncertainties associated with: a) the chemical recovery of a radiochemical separation when employing tracers (i.e. Pu and Am purification from a sediment sample); b) the indirect determination of a mother radionuclide through one of its daughters (i.e. ²¹⁰Pb quantification following the build-up of the activity of its daughter ²¹⁰Po); c) the time span since the last separation date of one of the components of a disintegration chain (i.e. the date of the last Am purification in a nuclear weapon, following ²⁴¹Am and ²⁴¹Pu measurements). Calculations concerning examples b) and c) are based on the Bateman equations, which govern radioactive equilibria. Although the exercises presented here are performed with certain radionuclides, they can be applied as generic procedures for other alpha-emitting radioelements.
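    Example (b) rests on the Bateman relations; for ²¹⁰Po growing in from ²¹⁰Pb, and treating the ²¹⁰Pb activity as effectively constant over the ingrowth period (a reasonable simplification given its 22.3-year half-life), the ²¹⁰Pb activity can be back-calculated from a ²¹⁰Po measurement as sketched below. The numbers are illustrative.

```python
import math

T_HALF_PO210_D = 138.4            # 210Po half-life in days
LAMBDA_PO = math.log(2) / T_HALF_PO210_D

def pb210_from_po210_ingrowth(a_po, u_a_po, t_days):
    """Back-calculate the 210Pb activity from the 210Po activity grown in
    over t_days since a complete Po/Pb separation, assuming the 210Pb
    activity is constant over that period and no 210Po remained at t = 0."""
    ingrowth = 1.0 - math.exp(-LAMBDA_PO * t_days)
    a_pb = a_po / ingrowth
    u_a_pb = u_a_po / ingrowth     # counting uncertainty scales the same way
    return a_pb, u_a_pb

# Illustrative: 12.5 ± 0.6 mBq of 210Po measured 200 days after separation
print(pb210_from_po210_ingrowth(12.5, 0.6, 200.0))
```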

  1. Storage flux uncertainty impact on eddy covariance net ecosystem exchange measurements

    Science.gov (United States)

    Nicolini, Giacomo; Aubinet, Marc; Feigenwinter, Christian; Heinesch, Bernard; Lindroth, Anders; Mamadou, Ossénatou; Moderow, Uta; Mölder, Meelis; Montagnani, Leonardo; Rebmann, Corinna; Papale, Dario

    2017-04-01

    Complying with several assumptions and simplifications, most carbon budget studies based on eddy covariance (EC) measurements quantify the net ecosystem exchange (NEE) by summing the flux obtained by EC (Fc) and the storage flux (Sc). Sc is the rate of change of CO2 within the so-called control volume below the EC measurement level, given by the difference between the instantaneous concentration profiles at the beginning and end of the EC averaging period, divided by the averaging period. While cumulating over time largely nullifies Sc, it can be significant over short time periods. The approaches used to estimate Sc vary widely, from measurements based on a single sampling point (usually located at the EC measurement height) to measurements based on several sampling profiles distributed within the control volume. Furthermore, the number of sampling points within each profile varies according to their height and the ecosystem typology. It follows that measurement accuracy increases with the sampling intensity within the control volume. In this work we use the experimental dataset collected during the ADVEX campaign, in which the Sc flux was measured at three similar forest sites using 5 sampling profiles (towers). Our main objective is to quantify the impact of Sc measurement uncertainty on NEE estimates. Results show that different methods may produce substantially different Sc flux estimates, with problematic consequences when high-frequency (half-hourly) data are needed for the analysis. However, the uncertainty on long-term estimates may be tolerable.
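    The definition of Sc given above translates directly into code: the change in the column-integrated CO2 molar density between the start and end of the averaging period, divided by that period. The profile handling below (simple layer-weighted integration of point measurements) is one possible implementation, not the ADVEX processing chain.

```python
import numpy as np

def storage_flux(z_m, c_start_umol_m3, c_end_umol_m3, dt_s=1800.0):
    """Sc (umol m-2 s-1): rate of change of CO2 stored below the EC level.

    z_m: sample heights (m) of the profile, from near the ground up to the
         EC measurement level; concentrations are in umol CO2 per m3 of air.
    Each point is taken to represent the layer around it (midpoint layering).
    """
    z = np.asarray(z_m, dtype=float)
    edges = np.concatenate(([0.0], (z[:-1] + z[1:]) / 2.0, [z[-1]]))
    layer_thickness = np.diff(edges)                       # m
    column_start = np.sum(np.asarray(c_start_umol_m3) * layer_thickness)
    column_end = np.sum(np.asarray(c_end_umol_m3) * layer_thickness)
    return (column_end - column_start) / dt_s              # umol m-2 s-1

# Illustrative 4-level profile (1, 5, 15, 30 m) over one half-hour
print(storage_flux([1, 5, 15, 30],
                   [17.2, 16.9, 16.6, 16.4],
                   [17.6, 17.1, 16.7, 16.45]))
```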

  2. Evaluating Rigor in Qualitative Methodology and Research Dissemination

    Science.gov (United States)

    Trainor, Audrey A.; Graue, Elizabeth

    2014-01-01

    Despite previous and successful attempts to outline general criteria for rigor, researchers in special education have debated the application of rigor criteria, the significance or importance of small n research, the purpose of interpretivist approaches, and the generalizability of qualitative empirical results. Adding to these complications, the…

  3. Entropic uncertainty for spin-1/2 XXX chains in the presence of inhomogeneous magnetic fields and its steering via weak measurement reversals

    Science.gov (United States)

    Wang, Dong; Ming, Fei; Huang, Ai-Jun; Sun, Wen-Yang; Ye, Liu

    2017-09-01

    The uncertainty principle sets a lower bound on the measurement precision attainable for a pair of non-commuting observables, and hence is highly nontrivial for quantum precision measurement in the field of quantum information theory. In this letter, we consider the entropic uncertainty relation (EUR) in the context of quantum memory in a two-qubit isotropic Heisenberg spin chain. Specifically, we explore the dynamics of the EUR in a practical scenario, where two associated nodes of a one-dimensional XXX spin chain, under an inhomogeneous magnetic field, are connected by thermal entanglement. We show that the temperature and magnetic field effects can lead to inflation of the measurement uncertainty, stemming from the reduction of the system's quantum correlation. Notably, we reveal that, firstly, the uncertainty is not fully dependent on the observed quantum correlation of the system and, secondly, the dynamical behaviors of the measurement uncertainty are relatively distinct for ferromagnetic and antiferromagnetic chains. Meanwhile, we deduce that the measurement uncertainty is strongly correlated with the mixedness of the system, implying that smaller mixedness tends to reduce the uncertainty. Furthermore, we propose an effective strategy to control the uncertainty of interest by means of quantum weak measurement reversal. Therefore, our work may shed light on the dynamics of the measurement uncertainty in the Heisenberg spin chain, and thus be important to quantum precision measurement in various solid-state systems.

  4. Generalized uncertainty principle and quantum gravity phenomenology

    Science.gov (United States)

    Bosso, Pasquale

    The fundamental physical description of Nature is based on two mutually incompatible theories: Quantum Mechanics and General Relativity. Their unification in a theory of Quantum Gravity (QG) remains one of the main challenges of theoretical physics. Quantum Gravity Phenomenology (QGP) studies QG effects in low-energy systems. The basis of one such phenomenological model is the Generalized Uncertainty Principle (GUP), which is a modified Heisenberg uncertainty relation and predicts a deformed canonical commutator. In this thesis, we compute Planck-scale corrections to angular momentum eigenvalues, the hydrogen atom spectrum, the Stern-Gerlach experiment, and the Clebsch-Gordan coefficients. We then rigorously analyze the GUP-perturbed harmonic oscillator and study new coherent and squeezed states. Furthermore, we introduce a scheme for increasing the sensitivity of optomechanical experiments for testing QG effects. Finally, we suggest future projects that may potentially test QG effects in the laboratory.

  5. The uncertainty evaluation of measurement for uranium in UF_6 hydrolysate by potentiometric titration

    International Nuclear Information System (INIS)

    Jiang Haiying; Cheng Ruoyu; Meng Xiujun

    2014-01-01

    Based on the construction of a mathematical model, this paper analyses the sources of the uncertainty components of the measurement result for uranium in uranium hexafluoride hydrolysate by potentiometric titration; each uncertainty component was calculated and the expanded uncertainty was given. The evaluated result for the uranium concentration is (158.88 ± 1.22) mg U/mL, k = 2, P = 95%. (authors)

  6. Fragmentation uncertainties in hadronic observables for top-quark mass measurements

    Directory of Open Access Journals (Sweden)

    Gennaro Corcella

    2018-04-01

    We study the Monte Carlo uncertainties due to modeling of hadronization and showering in the extraction of the top-quark mass from observables that use exclusive hadronic final states in top decays, such as t→anything+J/ψ or t→anything+(B→charged tracks), where B is a B-hadron. To this end, we investigate the sensitivity of the top-quark mass, determined by means of a few observables already proposed in the literature as well as some new proposals, to the relevant parameters of event generators, such as HERWIG 6 and PYTHIA 8. We find that constraining those parameters at O(1%–10%) is required to avoid a Monte Carlo uncertainty on mt greater than 500 MeV. For the sake of achieving the needed accuracy on such parameters, we examine the sensitivity of the top-quark mass measured from spectral features, such as peaks, endpoints and distributions of EB, mBℓ, and some mT2-like variables. We find that restricting oneself to regions sufficiently close to the endpoints enables one to substantially decrease the dependence on the Monte Carlo parameters, but at the price of inflating significantly the statistical uncertainties. To ameliorate this situation we study how well the data on top-quark production and decay at the LHC can be utilized to constrain the showering and hadronization variables. We find that a global exploration of several calibration observables, sensitive to the Monte Carlo parameters but very mildly to mt, can offer useful constraints on the parameters, as long as such quantities are measured with a 1% precision.

  7. Fragmentation uncertainties in hadronic observables for top-quark mass measurements

    Science.gov (United States)

    Corcella, Gennaro; Franceschini, Roberto; Kim, Doojin

    2018-04-01

    We study the Monte Carlo uncertainties due to modeling of hadronization and showering in the extraction of the top-quark mass from observables that use exclusive hadronic final states in top decays, such as t →anything + J / ψ or t →anything + (B →charged tracks), where B is a B-hadron. To this end, we investigate the sensitivity of the top-quark mass, determined by means of a few observables already proposed in the literature as well as some new proposals, to the relevant parameters of event generators, such as HERWIG 6 and PYTHIA 8. We find that constraining those parameters at O (1%- 10%) is required to avoid a Monte Carlo uncertainty on mt greater than 500 MeV. For the sake of achieving the needed accuracy on such parameters, we examine the sensitivity of the top-quark mass measured from spectral features, such as peaks, endpoints and distributions of EB, mBℓ, and some mT2-like variables. We find that restricting oneself to regions sufficiently close to the endpoints enables one to substantially decrease the dependence on the Monte Carlo parameters, but at the price of inflating significantly the statistical uncertainties. To ameliorate this situation we study how well the data on top-quark production and decay at the LHC can be utilized to constrain the showering and hadronization variables. We find that a global exploration of several calibration observables, sensitive to the Monte Carlo parameters but very mildly to mt, can offer useful constraints on the parameters, as long as such quantities are measured with a 1% precision.

  8. Uncertainties of size measurements in electron microscopy characterization of nanomaterials in foods

    DEFF Research Database (Denmark)

    Dudkiewicz, Agnieszka; Boxall, Alistair B. A.; Chaudhry, Qasim

    2015-01-01

    Electron microscopy is a recognized standard tool for nanomaterial characterization, and recommended by the European Food Safety Authority for the size measurement of nanomaterials in food. Despite this, little data have been published assessing the reliability of the method, especially for size measurement of nanomaterials characterized by a broad size distribution and/or added to food matrices. This study is a thorough investigation of the measurement uncertainty when applying electron microscopy for size measurement of engineered nanomaterials in foods. Our results show that the number of measured...

  9. Estimating the Uncertainty of Tensile Strength Measurement for A Photocured Material Produced by Additive Manufacturing

    Directory of Open Access Journals (Sweden)

    Adamczak Stanisław

    2014-08-01

    The aim of this study was to estimate the measurement uncertainty for a material produced by additive manufacturing. The material investigated was FullCure 720 photocured resin, which was used to fabricate tensile specimens with a Connex 350 3D printer based on PolyJet technology. The tensile strength of the specimens, established through static tensile testing, was used to determine the measurement uncertainty. There is a need for extensive research into the performance of model materials obtained via 3D printing, as they have not been studied as thoroughly as metal alloys or plastics, the most common structural materials. In this analysis, the measurement uncertainty was estimated using a larger number of samples than usual, i.e., thirty instead of the typical ten. The results can be very useful to engineers who design models and finished products using this material. The investigation also shows how wide the scatter of results is.
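    As a sketch of the Type A evaluation implied by testing thirty specimens: the standard uncertainty of the mean tensile strength is the sample standard deviation divided by √n, expanded with a coverage factor from Student's t-distribution for 29 degrees of freedom. The strength values below are invented.

```python
import statistics

# Invented tensile strengths (MPa) for 30 FullCure 720 specimens
strengths_mpa = [
    58.1, 59.4, 57.8, 60.2, 58.9, 59.7, 58.3, 57.5, 60.0, 59.1,
    58.6, 59.9, 58.0, 59.3, 58.8, 60.4, 57.9, 59.0, 58.5, 59.6,
    58.2, 59.8, 58.7, 59.2, 58.4, 60.1, 57.7, 59.5, 58.95, 59.05,
]

n = len(strengths_mpa)
mean = statistics.mean(strengths_mpa)
s = statistics.stdev(strengths_mpa)            # sample standard deviation
u_mean = s / n**0.5                            # Type A standard uncertainty of the mean

t_95_dof29 = 2.045                             # Student's t, 95%, 29 degrees of freedom
U = t_95_dof29 * u_mean                        # expanded uncertainty

print(f"Rm = {mean:.1f} ± {U:.2f} MPa (95%, n = {n})")
```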

  10. Uncertainty characterization of particle depth measurement using digital in-line holography and the hybrid method.

    Science.gov (United States)

    Gao, Jian; Guildenbecher, Daniel R; Reu, Phillip L; Chen, Jun

    2013-11-04

    In the detection of particles using digital in-line holography, measurement accuracy is substantially influenced by the hologram processing method. In particular, a number of methods have been proposed to determine the out-of-plane particle depth (z location). However, due to the lack of consistent uncertainty characterization, it has been unclear which method is best suited to a given measurement problem. In this work, depth determination accuracies of seven particle detection methods, including a recently proposed hybrid method, are systematically investigated in terms of relative depth measurement errors and uncertainties. Both synthetic and experimental holograms of particle fields are considered at conditions relevant to particle sizing and tracking. While all methods display a range of particle conditions where they are most accurate, in general the hybrid method is shown to be the most robust with depth uncertainty less than twice the particle diameter over a wide range of particle field conditions.

  11. Assessment of uncertainty associated with measuring exposure to radon and decay products in the French uranium miners cohort

    International Nuclear Information System (INIS)

    Allodji, Rodrigue S; Leuraud, Klervi; Laurier, Dominique; Bernhard, Sylvain; Henry, Stéphane; Bénichou, Jacques

    2012-01-01

    The reliability of exposure data directly affects the reliability of the risk estimates derived from epidemiological studies. Measurement uncertainty must be known and understood before it can be corrected. The literature on occupational exposure to radon (²²²Rn) and its decay products reveals only a few epidemiological studies in which uncertainty has been accounted for explicitly. This work examined the sources, nature, distribution and magnitude of uncertainty of the exposure of French uranium miners to radon (²²²Rn) and its decay products. We estimated the total size of uncertainty for this exposure with the root sum square (RSS) method, which may be an alternative when repeated measures are not available. As a result, we identified six main sources of uncertainty. The total size of the uncertainty decreased from about 47% in the period 1956–1974 to 10% after 1982, illustrating the improvement in the radiological monitoring system over time.

  12. NanoCMM : a 3D coordinate measuring machine with low moving mass for measuring small products in array with nanometer uncertainty

    NARCIS (Netherlands)

    Seggelen, van J.K.

    2007-01-01

    To measure dimensions and shape of complex three dimensional products (e.g. engines, mouldings, etc) with low uncertainty, Coordinate Measuring Machines (CMMs) are adequate instruments due to their universal applicability, easy measurement set-up and measuring flexibility. Motion software is

  13. Regional inversion of CO2 ecosystem fluxes from atmospheric measurements. Reliability of the uncertainty estimates

    Energy Technology Data Exchange (ETDEWEB)

    Broquet, G.; Chevallier, F.; Breon, F.M.; Yver, C.; Ciais, P.; Ramonet, M.; Schmidt, M. [Laboratoire des Sciences du Climat et de l' Environnement, CEA-CNRS-UVSQ, UMR8212, IPSL, Gif-sur-Yvette (France); Alemanno, M. [Servizio Meteorologico dell' Aeronautica Militare Italiana, Centro Aeronautica Militare di Montagna, Monte Cimone/Sestola (Italy); Apadula, F. [Research on Energy Systems, RSE, Environment and Sustainable Development Department, Milano (Italy); Hammer, S. [Universitaet Heidelberg, Institut fuer Umweltphysik, Heidelberg (Germany); Haszpra, L. [Hungarian Meteorological Service, Budapest (Hungary); Meinhardt, F. [Federal Environmental Agency, Kirchzarten (Germany); Necki, J. [AGH University of Science and Technology, Krakow (Poland); Piacentino, S. [ENEA, Laboratory for Earth Observations and Analyses, Palermo (Italy); Thompson, R.L. [Max Planck Institute for Biogeochemistry, Jena (Germany); Vermeulen, A.T. [Energy research Centre of the Netherlands ECN, EEE-EA, Petten (Netherlands)

    2013-07-01

    The Bayesian framework of CO2 flux inversions permits estimates of the retrieved flux uncertainties. Here, the reliability of these theoretical estimates is studied through a comparison against the misfits between the inverted fluxes and independent measurements of the CO2 Net Ecosystem Exchange (NEE) made by the eddy covariance technique at local (few hectares) scale. Regional inversions at 0.5° resolution are applied for the western European domain where ~50 eddy covariance sites are operated. These inversions are conducted for the period 2002-2007. They use a mesoscale atmospheric transport model, a prior estimate of the NEE from a terrestrial ecosystem model and rely on the variational assimilation of in situ continuous measurements of CO2 atmospheric mole fractions. Averaged over monthly periods and over the whole domain, the misfits are in good agreement with the theoretical uncertainties for prior and inverted NEE, and pass the chi-square test for the variance at the 30% and 5% significance levels respectively, despite the scale mismatch and the independence between the prior (respectively inverted) NEE and the flux measurements. The theoretical uncertainty reduction for the monthly NEE at the measurement sites is 53% while the inversion decreases the standard deviation of the misfits by 38%. These results build confidence in the NEE estimates at the European/monthly scales and in their theoretical uncertainty from the regional inverse modelling system. However, the uncertainties at the monthly (respectively annual) scale remain larger than the amplitude of the inter-annual variability of monthly (respectively annual) fluxes, so that this study does not engender confidence in the inter-annual variations. The uncertainties at the monthly scale are significantly smaller than the seasonal variations. The seasonal cycle of the inverted fluxes is thus reliable. In particular, the CO2 sink period over the European continent likely ends later than

  14. Sources and performance criteria of uncertainty of reference measurement procedures.

    Science.gov (United States)

    Mosca, Andrea; Paleari, Renata

    2018-05-29

    This article focuses on the Reference Measurement Procedures (RMPs) available today for the determination of various analytes in laboratory medicine and on the possible tools to evaluate their performance in the laboratories that are currently using them. A brief review of the RMPs has been performed by investigating the Joint Committee for Traceability in Laboratory Medicine (JCTLM) database. In order to evaluate their performance, we have checked the organization of three international ring trials, i.e. those regularly performed by the IFCC External Quality Assessment scheme for Reference Laboratories in Laboratory Medicine (RELA), by the Centers for Disease Control and Prevention (CDC) cholesterol network and by the IFCC Network for HbA1c. Several RMPs are available through the JCTLM database, but the best way to collect information about the RMPs and their uncertainties is to look at the reference measurement service providers (RMS). This part of the database, and the background on how to be listed in the database, is very helpful for the assessment of expanded measurement uncertainty (MU) and of the performance of RMPs in general. Worldwide, 17 RMS are listed in the database, and for most of the measurands more than one RMS is able to run the relative RMPs, with similar expanded uncertainties. As an example, for α-amylase, 4 providers offer their services with MU between 1.6 and 3.3%. In other cases (such as total cholesterol), the MU may span a broader range, i.e. from 0.02 to 3.6%. With regard to performance evaluation, the approach is often heterogeneous, and it is difficult to compare the performance of laboratories running the same RMP for the same measurand if they are involved in more than one EQAS. The reference measurement services have been created to help laboratory professionals and manufacturers to implement correct metrological traceability, and the JCTLM database is the only correct way to retrieve all the necessary important information to this end. Copyright © 2018

  15. An ultramicroscopic study on rigor mortis.

    Science.gov (United States)

    Suzuki, T

    1976-01-01

    Gastrocnemius muscles taken from decapitated mice at various intervals after death, and from mice killed by 2,4-dinitrophenol or mono-iodoacetic acid injection to induce rigor mortis soon after death, were observed by electron microscopy. The prominent appearance of many fine cross striations in the myofibrils (occurring about every 400 Å) was considered to be characteristic of rigor mortis. These striations were caused by minute granules studded along the surfaces of both thick and thin filaments and appeared to be the bridges connecting the two kinds of filaments, accounting for the hardness and rigidity of the muscle.

  16. Measurement-based climatology of aerosol direct radiative effect, its sensitivities, and uncertainties from a background southeast US site

    Science.gov (United States)

    Sherman, James P.; McComiskey, Allison

    2018-03-01

    Aerosol optical properties measured at Appalachian State University's co-located NASA AERONET and NOAA ESRL aerosol network monitoring sites over a nearly four-year period (June 2012-Feb 2016) are used, along with satellite-based surface reflectance measurements, to study the seasonal variability of diurnally averaged clear sky aerosol direct radiative effect (DRE) and radiative efficiency (RE) at the top-of-atmosphere (TOA) and at the surface. Aerosol chemistry and loading at the Appalachian State site are likely representative of the background southeast US (SE US), home to high summertime aerosol loading and one of only a few regions not to have warmed during the 20th century. This study is the first multi-year ground truth DRE study in the SE US, using aerosol network data products that are often used to validate satellite-based aerosol retrievals. The study is also the first in the SE US to quantify DRE uncertainties and sensitivities to aerosol optical properties and surface reflectance, including their seasonal dependence.Median DRE for the study period is -2.9 W m-2 at the TOA and -6.1 W m-2 at the surface. Monthly median and monthly mean DRE at the TOA (surface) are -1 to -2 W m-2 (-2 to -3 W m-2) during winter months and -5 to -6 W m-2 (-10 W m-2) during summer months. The DRE cycles follow the annual cycle of aerosol optical depth (AOD), which is 9 to 10 times larger in summer than in winter. Aerosol RE is anti-correlated with DRE, with winter values 1.5 to 2 times more negative than summer values. Due to the large seasonal dependence of aerosol DRE and RE, we quantify the sensitivity of DRE to aerosol optical properties and surface reflectance, using a calendar day representative of each season (21 December for winter; 21 March for spring, 21 June for summer, and 21 September for fall). We use these sensitivities along with measurement uncertainties of aerosol optical properties and surface reflectance to calculate DRE uncertainties. We also estimate

  17. Measurement uncertainties in science and technology

    CERN Document Server

    Grabe, Michael

    2014-01-01

    This book recasts the classical Gaussian error calculus from scratch, the inducements concerning both random and unknown systematic errors. The idea of this book is to create a formalism fit to localize the true values of physical quantities considered – true with respect to the set of predefined physical units. Remarkably enough, the prevailingly practiced forms of error calculus do not feature this property, which, however, proves in every respect to be physically indispensable. The amended formalism, termed Generalized Gaussian Error Calculus by the author, treats unknown systematic errors as biases and brings random errors to bear via enhanced confidence intervals as laid down by Student. The significantly extended second edition thoroughly restructures and systematizes the text as a whole and illustrates the formalism by numerous numerical examples. They demonstrate the basic principles of how to understand uncertainties to localize the true values of measured values - a perspective decisive in vi...

  18. Estimation of uncertainty bounds for individual particle image velocimetry measurements from cross-correlation peak ratio

    International Nuclear Information System (INIS)

    Charonko, John J; Vlachos, Pavlos P

    2013-01-01

    Numerous studies have established firmly that particle image velocimetry (PIV) is a robust method for non-invasive, quantitative measurements of fluid velocity, and that when carefully conducted, typical measurements can accurately detect displacements in digital images with a resolution well below a single pixel (in some cases well below a hundredth of a pixel). However, to date, these estimates have only been able to provide guidance on the expected error for an average measurement under specific image quality and flow conditions. This paper demonstrates a new method for estimating the uncertainty bounds to within a given confidence interval for a specific, individual measurement. Here, cross-correlation peak ratio, the ratio of primary to secondary peak height, is shown to correlate strongly with the range of observed error values for a given measurement, regardless of flow condition or image quality. This relationship is significantly stronger for phase-only generalized cross-correlation PIV processing, while the standard correlation approach showed weaker performance. Using an analytical model of the relationship derived from synthetic data sets, the uncertainty bounds at a 95% confidence interval are then computed for several artificial and experimental flow fields, and the resulting errors are shown to match closely to the predicted uncertainties. While this method stops short of being able to predict the true error for a given measurement, knowledge of the uncertainty level for a PIV experiment should provide great benefits when applying the results of PIV analysis to engineering design studies and computational fluid dynamics validation efforts. Moreover, this approach is exceptionally simple to implement and requires negligible additional computational cost. (paper)
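    The key idea above is an empirical map from the primary-to-secondary correlation peak ratio (PPR) of an individual PIV vector to an uncertainty bound for that vector. The exponential form and coefficients below are purely hypothetical placeholders; the paper fits its own analytical model to synthetic data sets.

```python
import math

def uncertainty_from_peak_ratio(ppr, a=0.45, b=1.1, floor_px=0.02):
    """Hypothetical calibration: map the cross-correlation peak ratio of a
    single PIV vector to a 95% displacement uncertainty bound (pixels).

    a, b and floor_px are illustrative fit coefficients, not values from the
    paper; higher peak ratios give tighter bounds, down to a noise floor.
    """
    if ppr <= 1.0:
        raise ValueError("peak ratio must exceed 1 (primary above secondary peak)")
    return floor_px + a * math.exp(-b * (ppr - 1.0))

for ppr in (1.1, 1.5, 2.0, 4.0):
    print(f"PPR = {ppr:.1f} -> ±{uncertainty_from_peak_ratio(ppr):.3f} px (95%)")
```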

  19. Experimental evaluation of rigor mortis IX. The influence of the breaking (mechanical solution) on the development of rigor mortis.

    Science.gov (United States)

    Krompecher, Thomas; Gilles, André; Brandt-Casadevall, Conception; Mangin, Patrice

    2008-04-07

    Objective measurements were carried out to study the possible re-establishment of rigor mortis in rats after "breaking" (mechanical solution). Our experiments showed that: (1) cadaveric rigidity can re-establish after breaking; (2) a significant rigidity can reappear if the breaking occurs before the process is complete; (3) the rigidity will be considerably weaker after the breaking; (4) the time course of the intensity does not change in comparison to the controls: the re-establishment begins immediately after the breaking, maximal values are reached at the same time as in the controls, and the course of the resolution is the same as in the controls.

  20. Uncertainty budgets for liquid waveguide CDOM absorption measurements.

    Science.gov (United States)

    Lefering, Ina; Röttgers, Rüdiger; Utschig, Christian; McKee, David

    2017-08-01

    Long path length liquid waveguide capillary cell (LWCC) systems using simple spectrometers to determine the spectral absorption by colored dissolved organic matter (CDOM) have previously been shown to have better measurement sensitivity compared to high-end spectrophotometers using 10 cm cuvettes. Information on the magnitude of measurement uncertainties for LWCC systems, however, has remained scarce. Cross-comparison of three different LWCC systems with three different path lengths (50, 100, and 250 cm) and two different cladding materials enabled quantification of measurement precision and accuracy, revealing strong wavelength dependency in both parameters. Stable pumping of the sample through the capillary cell was found to improve measurement precision over measurements made with the sample kept stationary. Results from the 50 and 100 cm LWCC systems, with higher refractive index cladding, showed systematic artifacts including small but unphysical negative offsets and high-frequency spectral perturbations due to limited performance of the salinity correction. In comparison, the newer 250 cm LWCC with lower refractive index cladding returned small positive offsets that may be physically correct. After null correction of measurements at 700 nm, overall agreement of CDOM absorption data at 440 nm was found to be within 5% root mean square percentage error.
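    For reference, converting an LWCC absorbance spectrum into a CDOM absorption coefficient is a short calculation: decadic absorbance divided by the waveguide path length, times ln(10), with the null correction at 700 nm mentioned above. The spectrum below is fabricated for illustration.

```python
import numpy as np

def cdom_absorption(wavelength_nm, absorbance, path_length_m, null_nm=700.0):
    """CDOM absorption coefficient a(lambda) in 1/m from decadic absorbance
    measured in a liquid waveguide of given path length, with a null-point
    correction (absorption assumed negligible at null_nm)."""
    a = 2.303 * np.asarray(absorbance) / path_length_m        # ln(10) * A / L
    null_value = np.interp(null_nm, wavelength_nm, a)
    return a - null_value                                     # subtract offset

wl = np.array([400.0, 440.0, 500.0, 600.0, 700.0])
A = np.array([0.210, 0.150, 0.080, 0.025, 0.004])             # fabricated absorbances
print(cdom_absorption(wl, A, path_length_m=2.5))              # 250 cm LWCC
```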

  1. Tenderness of pre- and post rigor lamb longissimus muscle.

    Science.gov (United States)

    Geesink, Geert; Sujang, Sadi; Koohmaraie, Mohammad

    2011-08-01

    Lamb longissimus muscle (n=6) sections were cooked at different times post mortem (prerigor, at rigor, 1 day p.m., and 7 days p.m.) using two cooking methods. Using a boiling water bath, samples were either cooked to a core temperature of 70 °C or boiled for 3 h. The latter method was meant to reflect the traditional cooking method employed in countries where preparation of prerigor meat is practiced. The time postmortem at which the meat was prepared had a large effect on the tenderness (shear force) of the meat. Cooking prerigor and at-rigor meat to 70 °C resulted in higher shear force values than their post rigor counterparts at 1 and 7 days p.m. (9.4 and 9.6 vs. 7.2 and 3.7 kg, respectively). The differences in tenderness between the treatment groups could be largely explained by a difference in the contraction status of the meat after cooking and the effect of ageing on tenderness. Cooking pre- and at-rigor meat resulted in severe muscle contraction, as evidenced by the differences in sarcomere length of the cooked samples. Mean sarcomere lengths in the pre- and at-rigor samples ranged from 1.05 to 1.20 μm. The mean sarcomere length in the post rigor samples was 1.44 μm. Cooking for 3 h at 100 °C did improve the tenderness of pre- and at-rigor prepared meat as compared to cooking to 70 °C, but not to the extent that ageing did. It is concluded that additional intervention methods are needed to improve the tenderness of prerigor cooked meat. Copyright © 2011 Elsevier B.V. All rights reserved.

  2. Uncertainty Evaluation of the New Setup for Measurement of Water-Vapor Permeation Rate by a Dew-Point Sensor

    Science.gov (United States)

    Hudoklin, D.; Šetina, J.; Drnovšek, J.

    2012-09-01

    The measurement of the water-vapor permeation rate (WVPR) through materials is very important in many industrial applications such as the development of new fabrics and construction materials, in the semiconductor industry, packaging, vacuum techniques, etc. The demand for this kind of measurement is growing considerably, and thus many different methods for measuring the WVPR have been developed and standardized within numerous national and international standards. However, comparison of existing methods shows a low level of mutual agreement. The objective of this paper is to demonstrate the necessary uncertainty evaluation for WVPR measurements, so as to provide a basis for the development of a corresponding reference measurement standard. This paper presents a specially developed measurement setup, which employs a precision dew-point sensor for WVPR measurements on specimens of different shapes. The paper also presents a physical model, which tries to account for both dynamic and quasi-static methods, the common types of WVPR measurements referred to in standards and scientific publications. An uncertainty evaluation carried out according to the ISO/IEC guide to the expression of uncertainty in measurement (GUM) shows the relative expanded (k = 2) uncertainty to be 3.0% for a WVPR of 6.71 mg·h⁻¹ (corresponding to a permeance of 30.4 mg·m⁻²·day⁻¹·hPa⁻¹).

  3. Proficiency testing as a basis for estimating uncertainty of measurement: application to forensic alcohol and toxicology quantitations.

    Science.gov (United States)

    Wallace, Jack

    2010-05-01

    While forensic laboratories will soon be required to estimate uncertainties of measurement for those quantitations reported to the end users of the information, the procedures for estimating this have been little discussed in the forensic literature. This article illustrates how proficiency test results provide the basis for estimating uncertainties in three instances: (i) For breath alcohol analyzers the interlaboratory precision is taken as a direct measure of uncertainty. This approach applies when the number of proficiency tests is small. (ii) For blood alcohol, the uncertainty is calculated from the differences between the laboratory's proficiency testing results and the mean quantitations determined by the participants; this approach applies when the laboratory has participated in a large number of tests. (iii) For toxicology, either of these approaches is useful for estimating comparability between laboratories, but not for estimating absolute accuracy. It is seen that data from proficiency tests enable estimates of uncertainty that are empirical, simple, thorough, and applicable to a wide range of concentrations.
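    Approach (ii) above, for blood alcohol, amounts to characterizing the laboratory's deviations from the participant consensus means over many proficiency tests. A minimal sketch of one such calculation, with fabricated results and a simple bias-plus-spread convention (not necessarily the article's exact procedure):

```python
import statistics

def uncertainty_from_proficiency_tests(lab_results, consensus_means, k=2.0):
    """Estimate a relative expanded uncertainty from a laboratory's history of
    proficiency test results versus the participant consensus means."""
    rel_diffs = [(lab - ref) / ref for lab, ref in zip(lab_results, consensus_means)]
    bias = statistics.mean(rel_diffs)
    spread = statistics.stdev(rel_diffs)
    # One simple convention: combine bias and spread in quadrature, expand with k
    u_rel = (bias**2 + spread**2) ** 0.5
    return k * u_rel

# Fabricated blood-alcohol proficiency test history (g/100 mL)
lab = [0.081, 0.152, 0.118, 0.201, 0.099, 0.162]
ref = [0.080, 0.150, 0.120, 0.198, 0.100, 0.160]
print(f"relative expanded uncertainty ≈ {uncertainty_from_proficiency_tests(lab, ref):.1%}")
```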

  4. Particle Swarm Imaging (PSIM). A swarming algorithm for the reporting of robust, optimal measurement uncertainties

    International Nuclear Information System (INIS)

    Parvin, Dan; Clarke, Sean

    2015-01-01

    Particle Swarm Imaging (PSIM) overcomes some of the challenges associated with the accurate declaration of measurement uncertainties of radionuclide inventories within waste items when the distribution of activity is unknown. Implementation requires minimal equipment, making use of gamma‑ray measurements taken from different locations around the waste item, using only a single electrically cooled HRGS gamma‑ray detector for objects up to a UK ISO freight container in size. The PSIM technique is a computational method that iteratively ‘homes‑in’ on the true location of activity concentrations in waste items. PSIM differs from conventional assay techniques by allowing only viable solutions - that is those that could actually give rise to the measured data - to be considered. Thus PSIM avoids the drawback of conventional analyses, namely, the adoption of unrealistic assumptions about the activity distribution that inevitably leads to the declaration of pessimistic (and in some cases optimistic) activity estimates and uncertainties. PSIM applies an optimisation technique based upon ‘particle swarming’ methods to determine a set of candidate solutions within a ‘search space’ defined by the interior volume of a waste item. The positions and activities of the swarm are used in conjunction with a mathematical model to simulate the measurement response for the current swarm location. The swarm is iteratively updated (with modified positions and activities) until a match with sufficient quality is obtained between the simulated and actual measurement data. This process is repeated to build up a distribution of candidate solutions, which is subsequently analysed to calculate a measurement result and uncertainty along with a visual image of the activity distribution. The application of ‘swarming’ computational methods to non‑destructive assay (NDA) measurements is considered novel and this paper is intended to introduce the PSIM concept and provide
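    The full PSIM algorithm is not reproduced here, but the underlying particle-swarm idea, candidate source positions and activities iteratively pulled towards configurations that reproduce the detector responses, can be sketched generically. The 2D toy below minimises the mismatch between simulated and "measured" count rates for a single point source seen by three detectors; the geometry, response model and swarm settings are all illustrative assumptions, not the PSIM implementation.

```python
import random
import math

# Toy setup: one point source in a 2 m x 2 m plane, three detectors on the boundary.
detectors = [(0.0, 0.0), (2.0, 0.0), (1.0, 2.0)]
true_source = (1.3, 0.7, 5.0e6)            # x (m), y (m), activity (arbitrary units)

def simulate(source):
    """Inverse-square response of each detector to a point source (no attenuation)."""
    x, y, activity = source
    return [activity / (4 * math.pi * ((x - dx)**2 + (y - dy)**2 + 1e-6))
            for dx, dy in detectors]

measured = simulate(true_source)

def misfit(candidate):
    sim = simulate(candidate)
    return sum((s - m)**2 / m**2 for s, m in zip(sim, measured))

# Minimal particle swarm over (x, y, activity)
bounds = [(0.0, 2.0), (0.0, 2.0), (1.0e5, 1.0e7)]
n_particles, n_iter, w, c1, c2 = 30, 200, 0.7, 1.5, 1.5

pos = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
vel = [[0.0] * 3 for _ in range(n_particles)]
pbest = [p[:] for p in pos]
pbest_val = [misfit(p) for p in pos]
gbest = min(zip(pbest_val, pbest))[1][:]

for _ in range(n_iter):
    for i in range(n_particles):
        for d in range(3):
            r1, r2 = random.random(), random.random()
            vel[i][d] = (w * vel[i][d]
                         + c1 * r1 * (pbest[i][d] - pos[i][d])
                         + c2 * r2 * (gbest[d] - pos[i][d]))
            lo, hi = bounds[d]
            pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
        val = misfit(pos[i])
        if val < pbest_val[i]:
            pbest[i], pbest_val[i] = pos[i][:], val
            if val < misfit(gbest):
                gbest = pos[i][:]

print("best candidate (x, y, activity):", gbest)
```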

  5. Measuring process solutions in a reprocessing plant to 0.1%

    International Nuclear Information System (INIS)

    Crawford, J.M.; Ehinger, M.H.; Ellis, J.H.

    1980-03-01

    Measurement of SNM in reprocessing plant solutions involves two major problems: measurement of bulk solution quantities and analysis of highly radioactive samples. It has been shown at the BNFP that bulk measurements can be made routinely under operating conditions to less than 0.1% total uncertainty. Two specific advances in measurement technology have been largely responsible for this improved performance. The quartz bourdon tube electromanometer replaces the fluid manometer for differential pressure measurements. The vibrating tube densimeter provides accurate measurement of density in lab samples. These instruments, coupled with rigorous measurement and quality control procedures, are the means to achieve better than 0.1% performance.

  6. Uncertainty estimation and multi sensor fusion for kinematic laser tracker measurements

    Science.gov (United States)

    Ulrich, Thomas

    2013-08-01

    Laser trackers are widely used to measure kinematic tasks such as tracking robot movements. Common methods to evaluate the uncertainty in the kinematic measurement include approximations specified by the manufacturers, various analytical adjustment methods and the Kalman filter. In this paper a new, real-time technique is proposed, which estimates the 4D-path (3D-position + time) uncertainty of an arbitrary path in space. Here a hybrid system estimator is applied in conjunction with the kinematic measurement model. This method can be applied to processes, which include various types of kinematic behaviour, constant velocity, variable acceleration or variable turn rates. The new approach is compared with the Kalman filter and a manufacturer's approximations. The comparison was made using data obtained by tracking an industrial robot's tool centre point with a Leica laser tracker AT901 and a Leica laser tracker LTD500. It shows that the new approach is more appropriate to analysing kinematic processes than the Kalman filter, as it reduces overshoots and decreases the estimated variance. In comparison with the manufacturer's approximations, the new approach takes account of kinematic behaviour with an improved description of the real measurement process and a reduction in estimated variance. This approach is therefore well suited to the analysis of kinematic processes with unknown changes in kinematic behaviour as well as the fusion among laser trackers.

  7. Simplified propagation of standard uncertainties

    International Nuclear Information System (INIS)

    Shull, A.H.

    1997-01-01

    An essential part of any measurement control program is adequate knowledge of the uncertainties of the measurement system standards. Only with an estimate of the standards' uncertainties can one determine if the standard is adequate for its intended use or calculate the total uncertainty of the measurement process. Purchased standards usually have estimates of uncertainty on their certificates. However, when standards are prepared and characterized by a laboratory, variance propagation is required to estimate the uncertainty of the standard. Traditional variance propagation typically involves tedious use of partial derivatives, unfriendly software and the availability of statistical expertise. As a result, the uncertainty of prepared standards is often not determined or determined incorrectly. For situations meeting stated assumptions, easier shortcut methods of estimation are now available which eliminate the need for partial derivatives and require only a spreadsheet or calculator. A system of simplifying the calculations by dividing them into subgroups of absolute and relative uncertainties is utilized. These methods also incorporate the International Standards Organization (ISO) concepts for combining systematic and random uncertainties as published in their Guide to the Expression of Measurement Uncertainty. Details of the simplified methods and examples of their use are included in the paper.
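    The subgrouping idea mentioned above, absolute uncertainties combined in quadrature for sums and differences and relative uncertainties combined in quadrature for products and quotients, removes the need for partial derivatives in the common case. A hedged sketch (a generic illustration of the shortcut, not the paper's worked procedure):

```python
import math

def combine_absolute(*u_abs):
    """Standard uncertainty of a sum/difference: quadrature of absolute u's."""
    return math.sqrt(sum(u**2 for u in u_abs))

def combine_relative(*u_rel):
    """Relative standard uncertainty of a product/quotient: quadrature of relative u's."""
    return math.sqrt(sum(r**2 for r in u_rel))

# Example: concentration of a prepared standard, c = m_solute / V_solution,
# with the solute mass itself a difference of two weighings.
m_gross, u_gross = 25.4321, 0.0002      # g
m_tare,  u_tare  = 15.2117, 0.0002      # g
V, u_V           = 0.100, 0.00008       # L

m = m_gross - m_tare
u_m = combine_absolute(u_gross, u_tare)             # absolute subgroup

c = m / V
u_c_rel = combine_relative(u_m / m, u_V / V)        # relative subgroup
print(f"c = {c:.3f} ± {c * u_c_rel:.3f} g/L")
```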

  8. Uncertainty Reduction Via Parameter Design of A Fast Digital Integrator for Magnetic Field Measurement

    CERN Document Server

    Arpaia, P; Lucariello, G; Spiezia, G

    2007-01-01

    At the European Centre of Nuclear Research (CERN), within the new Large Hadron Collider (LHC) project, measurements of magnetic flux with an uncertainty of 10 ppm at a few decades of Hz for several minutes are required. With this aim, a new Fast Digital Integrator (FDI) has been developed in cooperation with the University of Sannio, Italy [1]. This paper deals with the final design tuning for achieving target uncertainty by means of experimental statistical parameter design.

  9. A measure of uncertainty regarding the interval constraint of normal mean elicited by two stages of a prior hierarchy.

    Science.gov (United States)

    Kim, Hea-Jung

    2014-01-01

    This paper considers a hierarchical screened Gaussian model (HSGM) for Bayesian inference of normal models when an interval constraint in the mean parameter space needs to be incorporated in the modeling but when such a restriction is uncertain. An objective measure of the uncertainty regarding the interval constraint, accounted for by using the HSGM, is proposed for the Bayesian inference. For this purpose, we derive a maximum entropy prior of the normal mean, eliciting the uncertainty regarding the interval constraint, and then obtain the uncertainty measure by considering the relationship between the maximum entropy prior and the marginal prior of the normal mean in the HSGM. A Bayesian estimation procedure for the HSGM is developed and two numerical illustrations pertaining to the properties of the uncertainty measure are provided.

  10. TRACEABILITY OF PRECISION MEASUREMENTS ON COORDINATE MEASURING MACHINES – UNCERTAINTY ASSESSMENT BY USING CALIBRATED WORKPIECES ON CMMs

    DEFF Research Database (Denmark)

    Tosello, Guido; De Chiffre, Leonardo

    This document is used in connection with an exercise of 30 minutes duration, given as part of the course VISION ONLINE – One week course on Precision & Nanometrology. The exercise concerns the establishment of traceability of precision measurements on coordinate measuring machines. The document contains a short description of each step in the exercise, the uncertainty budget as described in ISO/TS 15530 part 3, and tables from the Excel spreadsheets.
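
    ISO/TS 15530-3 evaluates task-specific CMM uncertainty from repeated measurements of a calibrated workpiece performed like the actual measurements. A commonly quoted reading of its budget combines the workpiece calibration uncertainty, the measurement-procedure scatter and a workpiece-variation term in quadrature and adds the observed bias; the sketch below assumes that form with illustrative numbers and is not a substitute for the technical specification or the course spreadsheet.

        import math

        def expanded_uncertainty_iso15530_3(u_cal, u_p, u_w, bias, k=2.0):
            """Task-specific CMM uncertainty budget in the spirit of ISO/TS 15530-3.

            u_cal : standard uncertainty of the calibrated workpiece [um]
            u_p   : standard uncertainty of the measurement procedure, i.e. the
                    standard deviation of the repeated CMM measurements [um]
            u_w   : standard uncertainty due to workpiece variations
                    (form, roughness, thermal effects) [um]
            bias  : systematic difference between the CMM results and the
                    calibrated value [um]
            k     : coverage factor (k = 2 for roughly 95 % coverage)

            Note: this quadrature-plus-bias form is one common reading of the
            specification; editions differ in how the bias term is handled.
            """
            return k * math.sqrt(u_cal**2 + u_p**2 + u_w**2) + abs(bias)

        # Illustrative numbers only
        U = expanded_uncertainty_iso15530_3(u_cal=0.5, u_p=0.8, u_w=0.4, bias=0.3)
        print(f"expanded uncertainty U = {U:.2f} um (k = 2)")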

  11. Single hadron response measurement and calorimeter jet energy scale uncertainty with the ATLAS detector at the LHC

    CERN Document Server

    Aad, Georges; Abdallah, Jalal; Abdelalim, Ahmed Ali; Abdesselam, Abdelouahab; Abdinov, Ovsat; Abi, Babak; Abolins, Maris; AbouZeid, Ossama; Abramowicz, Halina; Abreu, Henso; Acerbi, Emilio; Acharya, Bobby Samir; Adamczyk, Leszek; Adams, David; Addy, Tetteh; Adelman, Jahred; Aderholz, Michael; Adomeit, Stefanie; Adragna, Paolo; Adye, Tim; Aefsky, Scott; Aguilar-Saavedra, Juan Antonio; Aharrouche, Mohamed; Ahlen, Steven; Ahles, Florian; Ahmad, Ashfaq; Ahsan, Mahsana; Aielli, Giulio; Akdogan, Taylan; Åkesson, Torsten Paul Ake; Akimoto, Ginga; Akimov, Andrei; Akiyama, Kunihiro; Alam, Mohammad; Alam, Muhammad Aftab; Albert, Justin; Albrand, Solveig; Aleksa, Martin; Aleksandrov, Igor; Alessandria, Franco; Alexa, Calin; Alexander, Gideon; Alexandre, Gauthier; Alexopoulos, Theodoros; Alhroob, Muhammad; Aliev, Malik; Alimonti, Gianluca; Alison, John; Aliyev, Magsud; Allbrooke, Benedict; Allport, Phillip; Allwood-Spiers, Sarah; Almond, John; Aloisio, Alberto; Alon, Raz; Alonso, Alejandro; Alvarez Gonzalez, Barbara; Alviggi, Mariagrazia; Amako, Katsuya; Amaral, Pedro; Amelung, Christoph; Ammosov, Vladimir; Amorim, Antonio; Amorós, Gabriel; Amram, Nir; Anastopoulos, Christos; Ancu, Lucian Stefan; Andari, Nansi; Andeen, Timothy; Anders, Christoph Falk; Anders, Gabriel; Anderson, Kelby; Andreazza, Attilio; Andrei, George Victor; Andrieux, Marie-Laure; Anduaga, Xabier; Angerami, Aaron; Anghinolfi, Francis; Anisenkov, Alexey; Anjos, Nuno; Annovi, Alberto; Antonaki, Ariadni; Antonelli, Mario; Antonov, Alexey; Antos, Jaroslav; Anulli, Fabio; Aoun, Sahar; Aperio Bella, Ludovica; Apolle, Rudi; Arabidze, Giorgi; Aracena, Ignacio; Arai, Yasuo; Arce, Ayana; Arfaoui, Samir; Arguin, Jean-Francois; Arik, Engin; Arik, Metin; Armbruster, Aaron James; Arnaez, Olivier; Arnault, Christian; Artamonov, Andrei; Artoni, Giacomo; Arutinov, David; Asai, Shoji; Asfandiyarov, Ruslan; Ask, Stefan; Åsman, Barbro; Asquith, Lily; Assamagan, Ketevi; Astbury, Alan; Astvatsatourov, Anatoli; Aubert, Bernard; Auge, Etienne; Augsten, Kamil; Aurousseau, Mathieu; Avolio, Giuseppe; Avramidou, Rachel Maria; Axen, David; Ay, Cano; Azuelos, Georges; Azuma, Yuya; Baak, Max; Baccaglioni, Giuseppe; Bacci, Cesare; Bach, Andre; Bachacou, Henri; Bachas, Konstantinos; Backes, Moritz; Backhaus, Malte; Badescu, Elisabeta; Bagnaia, Paolo; Bahinipati, Seema; Bai, Yu; Bailey, David; Bain, Travis; Baines, John; Baker, Oliver Keith; Baker, Mark; Baker, Sarah; Banas, Elzbieta; Banerjee, Piyali; Banerjee, Swagato; Banfi, Danilo; Bangert, Andrea Michelle; Bansal, Vikas; Bansil, Hardeep Singh; Barak, Liron; Baranov, Sergei; Barashkou, Andrei; Barbaro Galtieri, Angela; Barber, Tom; Barberio, Elisabetta Luigia; Barberis, Dario; Barbero, Marlon; Bardin, Dmitri; Barillari, Teresa; Barisonzi, Marcello; Barklow, Timothy; Barlow, Nick; Barnett, Bruce; Barnett, Michael; Baroncelli, Antonio; Barone, Gaetano; Barr, Alan; Barreiro, Fernando; Barreiro Guimarães da Costa, João; Barrillon, Pierre; Bartoldus, Rainer; Barton, Adam Edward; Bartsch, Valeria; Bates, Richard; Batkova, Lucia; Batley, Richard; Battaglia, Andreas; Battistin, Michele; Bauer, Florian; Bawa, Harinder Singh; Beale, Steven; Beau, Tristan; Beauchemin, Pierre-Hugues; Beccherle, Roberto; Bechtle, Philip; Beck, Hans Peter; Becker, Sebastian; Beckingham, Matthew; Becks, Karl-Heinz; Beddall, Andrew; Beddall, Ayda; Bedikian, Sourpouhi; Bednyakov, Vadim; Bee, Christopher; Begel, Michael; Behar Harpaz, Silvia; Behera, Prafulla; Beimforde, Michael; Belanger-Champagne, Camille; Bell, Paul; Bell, William; Bella, 
Gideon; Bellagamba, Lorenzo; Bellina, Francesco; Bellomo, Massimiliano; Belloni, Alberto; Beloborodova, Olga; Belotskiy, Konstantin; Beltramello, Olga; Ben Ami, Sagi; Benary, Odette; Benchekroun, Driss; Benchouk, Chafik; Bendel, Markus; Benekos, Nektarios; Benhammou, Yan; Benhar Noccioli, Eleonora; Benitez Garcia, Jorge-Armando; Benjamin, Douglas; Benoit, Mathieu; Bensinger, James; Benslama, Kamal; Bentvelsen, Stan; Berge, David; Bergeaas Kuutmann, Elin; Berger, Nicolas; Berghaus, Frank; Berglund, Elina; Beringer, Jürg; Bernat, Pauline; Bernhard, Ralf; Bernius, Catrin; Berry, Tracey; Bertella, Claudia; Bertin, Antonio; Bertinelli, Francesco; Bertolucci, Federico; Besana, Maria Ilaria; Besson, Nathalie; Bethke, Siegfried; Bhimji, Wahid; Bianchi, Riccardo-Maria; Bianco, Michele; Biebel, Otmar; Bieniek, Stephen Paul; Bierwagen, Katharina; Biesiada, Jed; Biglietti, Michela; Bilokon, Halina; Bindi, Marcello; Binet, Sebastien; Bingul, Ahmet; Bini, Cesare; Biscarat, Catherine; Bitenc, Urban; Black, Kevin; Blair, Robert; Blanchard, Jean-Baptiste; Blanchot, Georges; Blazek, Tomas; Blocker, Craig; Blocki, Jacek; Blondel, Alain; Blum, Walter; Blumenschein, Ulrike; Bobbink, Gerjan; Bobrovnikov, Victor; Bocchetta, Simona Serena; Bocci, Andrea; Boddy, Christopher Richard; Boehler, Michael; Boek, Jennifer; Boelaert, Nele; Bogaerts, Joannes Andreas; Bogdanchikov, Alexander; Bogouch, Andrei; Bohm, Christian; Boisvert, Veronique; Bold, Tomasz; Boldea, Venera; Bolnet, Nayanka Myriam; Bona, Marcella; Bondarenko, Valery; Bondioli, Mario; Boonekamp, Maarten; Booth, Chris; Bordoni, Stefania; Borer, Claudia; Borisov, Anatoly; Borissov, Guennadi; Borjanovic, Iris; Borri, Marcello; Borroni, Sara; Bortolotto, Valerio; Bos, Kors; Boscherini, Davide; Bosman, Martine; Boterenbrood, Hendrik; Botterill, David; Bouchami, Jihene; Boudreau, Joseph; Bouhova-Thacker, Evelina Vassileva; Boumediene, Djamel Eddine; Bourdarios, Claire; Bousson, Nicolas; Boveia, Antonio; Boyd, James; Boyko, Igor; Bozhko, Nikolay; Bozovic-Jelisavcic, Ivanka; Bracinik, Juraj; Braem, André; Branchini, Paolo; Brandenburg, George; Brandt, Andrew; Brandt, Gerhard; Brandt, Oleg; Bratzler, Uwe; Brau, Benjamin; Brau, James; Braun, Helmut; Brelier, Bertrand; Bremer, Johan; Brenner, Richard; Bressler, Shikma; Britton, Dave; Brochu, Frederic; Brock, Ian; Brock, Raymond; Brodbeck, Timothy; Brodet, Eyal; Broggi, Francesco; Bromberg, Carl; Bronner, Johanna; Brooijmans, Gustaaf; Brooks, William; Brown, Gareth; Brown, Heather; Bruckman de Renstrom, Pawel; Bruncko, Dusan; Bruneliere, Renaud; Brunet, Sylvie; Bruni, Alessia; Bruni, Graziano; Bruschi, Marco; Buanes, Trygve; Buat, Quentin; Bucci, Francesca; Buchanan, James; Buchanan, Norman; Buchholz, Peter; Buckingham, Ryan; Buckley, Andrew; Buda, Stelian Ioan; Budagov, Ioulian; Budick, Burton; Büscher, Volker; Bugge, Lars; Bulekov, Oleg; Bunse, Moritz; Buran, Torleiv; Burckhart, Helfried; Burdin, Sergey; Burgard, Carsten Daniel; Burgess, Thomas; Burke, Stephen; Busato, Emmanuel; Bussey, Peter; Buszello, Claus-Peter; Butin, François; Butler, Bart; Butler, John; Buttar, Craig; Butterworth, Jonathan; Buttinger, William; Cabrera Urbán, Susana; Caforio, Davide; Cakir, Orhan; Calafiura, Paolo; Calderini, Giovanni; Calfayan, Philippe; Calkins, Robert; Caloba, Luiz; Caloi, Rita; Calvet, David; Calvet, Samuel; Camacho Toro, Reina; Camarri, Paolo; Cambiaghi, Mario; Cameron, David; Caminada, Lea Michaela; Campana, Simone; Campanelli, Mario; Canale, Vincenzo; Canelli, Florencia; Canepa, Anadi; Cantero, Josu; Capasso, Luciano; 
Capeans Garrido, Maria Del Mar; Caprini, Irinel; Caprini, Mihai; Capriotti, Daniele; Capua, Marcella; Caputo, Regina; Caramarcu, Costin; Cardarelli, Roberto; Carli, Tancredi; Carlino, Gianpaolo; Carminati, Leonardo; Caron, Bryan; Caron, Sascha; Carrillo Montoya, German D; Carter, Antony; Carter, Janet; Carvalho, João; Casadei, Diego; Casado, Maria Pilar; Cascella, Michele; Caso, Carlo; Castaneda Hernandez, Alfredo Martin; Castaneda-Miranda, Elizabeth; Castillo Gimenez, Victoria; Castro, Nuno Filipe; Cataldi, Gabriella; Cataneo, Fernando; Catinaccio, Andrea; Catmore, James; Cattai, Ariella; Cattani, Giordano; Caughron, Seth; Cauz, Diego; Cavalleri, Pietro; Cavalli, Donatella; Cavalli-Sforza, Matteo; Cavasinni, Vincenzo; Ceradini, Filippo; Santiago Cerqueira, Augusto; Cerri, Alessandro; Cerrito, Lucio; Cerutti, Fabio; Cetin, Serkant Ali; Cevenini, Francesco; Chafaq, Aziz; Chakraborty, Dhiman; Chan, Kevin; Chapleau, Bertrand; Chapman, John Derek; Chapman, John Wehrley; Chareyre, Eve; Charlton, Dave; Chavda, Vikash; Chavez Barajas, Carlos Alberto; Cheatham, Susan; Chekanov, Sergei; Chekulaev, Sergey; Chelkov, Gueorgui; Chelstowska, Magda Anna; Chen, Chunhui; Chen, Hucheng; Chen, Shenjian; Chen, Tingyang; Chen, Xin; Cheng, Shaochen; Cheplakov, Alexander; Chepurnov, Vladimir; Cherkaoui El Moursli, Rajaa; Chernyatin, Valeriy; Cheu, Elliott; Cheung, Sing-Leung; Chevalier, Laurent; Chiefari, Giovanni; Chikovani, Leila; Childers, John Taylor; Chilingarov, Alexandre; Chiodini, Gabriele; Chisholm, Andrew; Chizhov, Mihail; Choudalakis, Georgios; Chouridou, Sofia; Christidi, Illectra-Athanasia; Christov, Asen; Chromek-Burckhart, Doris; Chu, Ming-Lee; Chudoba, Jiri; Ciapetti, Guido; Ciba, Krzysztof; Ciftci, Abbas Kenan; Ciftci, Rena; Cinca, Diane; Cindro, Vladimir; Ciobotaru, Matei Dan; Ciocca, Claudia; Ciocio, Alessandra; Cirilli, Manuela; Citterio, Mauro; Ciubancan, Mihai; Clark, Allan G; Clark, Philip James; Cleland, Bill; Clemens, Jean-Claude; Clement, Benoit; Clement, Christophe; Clifft, Roger; Coadou, Yann; Cobal, Marina; Coccaro, Andrea; Cochran, James H; Coe, Paul; Cogan, Joshua Godfrey; Coggeshall, James; Cogneras, Eric; Colas, Jacques; Colijn, Auke-Pieter; Collard, Caroline; Collins, Neil; Collins-Tooth, Christopher; Collot, Johann; Colon, German; Conde Muiño, Patricia; Coniavitis, Elias; Conidi, Maria Chiara; Consonni, Michele; Consorti, Valerio; Constantinescu, Serban; Conta, Claudio; Conventi, Francesco; Cook, James; Cooke, Mark; Cooper, Ben; Cooper-Sarkar, Amanda; Copic, Katherine; Cornelissen, Thijs; Corradi, Massimo; Corriveau, Francois; Cortes-Gonzalez, Arely; Cortiana, Giorgio; Costa, Giuseppe; Costa, María José; Costanzo, Davide; Costin, Tudor; Côté, David; Coura Torres, Rodrigo; Courneyea, Lorraine; Cowan, Glen; Cowden, Christopher; Cox, Brian; Cranmer, Kyle; Crescioli, Francesco; Cristinziani, Markus; Crosetti, Giovanni; Crupi, Roberto; Crépé-Renaudin, Sabine; Cuciuc, Constantin-Mihai; Cuenca Almenar, Cristóbal; Cuhadar Donszelmann, Tulay; Curatolo, Maria; Curtis, Chris; Cuthbert, Cameron; Cwetanski, Peter; Czirr, Hendrik; Czodrowski, Patrick; Czyczula, Zofia; D'Auria, Saverio; D'Onofrio, Monica; D'Orazio, Alessia; Da Silva, Paulo Vitor; Da Via, Cinzia; Dabrowski, Wladyslaw; Dai, Tiesheng; Dallapiccola, Carlo; Dam, Mogens; Dameri, Mauro; Damiani, Daniel; Danielsson, Hans Olof; Dannheim, Dominik; Dao, Valerio; Darbo, Giovanni; Darlea, Georgiana Lavinia; Davey, Will; Davidek, Tomas; Davidson, Nadia; Davidson, Ruth; Davies, Eleanor; Davies, Merlin; Davison, Adam; Davygora, Yuriy; Dawe, 
Edmund; Dawson, Ian; Dawson, John; Daya, Rozmin; De, Kaushik; de Asmundis, Riccardo; De Castro, Stefano; De Castro Faria Salgado, Pedro; De Cecco, Sandro; de Graat, Julien; De Groot, Nicolo; de Jong, Paul; De La Taille, Christophe; De la Torre, Hector; De Lotto, Barbara; de Mora, Lee; De Nooij, Lucie; De Pedis, Daniele; De Salvo, Alessandro; De Sanctis, Umberto; De Santo, Antonella; De Vivie De Regie, Jean-Baptiste; Dean, Simon; Dearnaley, William James; Debbe, Ramiro; Debenedetti, Chiara; Dedovich, Dmitri; Degenhardt, James; Dehchar, Mohamed; Del Papa, Carlo; Del Peso, Jose; Del Prete, Tarcisio; Delemontex, Thomas; Deliyergiyev, Maksym; Dell'Acqua, Andrea; Dell'Asta, Lidia; Della Pietra, Massimo; della Volpe, Domenico; Delmastro, Marco; Delruelle, Nicolas; Delsart, Pierre-Antoine; Deluca, Carolina; Demers, Sarah; Demichev, Mikhail; Demirkoz, Bilge; Deng, Jianrong; Denisov, Sergey; Derendarz, Dominik; Derkaoui, Jamal Eddine; Derue, Frederic; Dervan, Paul; Desch, Klaus Kurt; Devetak, Erik; Deviveiros, Pier-Olivier; Dewhurst, Alastair; DeWilde, Burton; Dhaliwal, Saminder; Dhullipudi, Ramasudhakar; Di Ciaccio, Anna; Di Ciaccio, Lucia; Di Girolamo, Alessandro; Di Girolamo, Beniamino; Di Luise, Silvestro; Di Mattia, Alessandro; Di Micco, Biagio; Di Nardo, Roberto; Di Simone, Andrea; Di Sipio, Riccardo; Diaz, Marco Aurelio; Diblen, Faruk; Diehl, Edward; Dietrich, Janet; Dietzsch, Thorsten; Diglio, Sara; Dindar Yagci, Kamile; Dingfelder, Jochen; Dionisi, Carlo; Dita, Petre; Dita, Sanda; Dittus, Fridolin; Djama, Fares; Djobava, Tamar; Barros do Vale, Maria Aline; Do Valle Wemans, André; Doan, Thi Kieu Oanh; Dobbs, Matt; Dobinson, Robert; Dobos, Daniel; Dobson, Ellie; Dobson, Marc; Dodd, Jeremy; Doglioni, Caterina; Doherty, Tom; Doi, Yoshikuni; Dolejsi, Jiri; Dolenc, Irena; Dolezal, Zdenek; Dolgoshein, Boris; Dohmae, Takeshi; Donadelli, Marisilvia; Donega, Mauro; Donini, Julien; Dopke, Jens; Doria, Alessandra; Dos Anjos, Andre; Dosil, Mireia; Dotti, Andrea; Dova, Maria-Teresa; Dowell, John; Doxiadis, Alexander; Doyle, Tony; Drasal, Zbynek; Drees, Jürgen; Dressnandt, Nandor; Drevermann, Hans; Driouichi, Chafik; Dris, Manolis; Dubbert, Jörg; Dube, Sourabh; Duchovni, Ehud; Duckeck, Guenter; Dudarev, Alexey; Dudziak, Fanny; Dührssen, Michael; Duerdoth, Ian; Duflot, Laurent; Dufour, Marc-Andre; Dunford, Monica; Duran Yildiz, Hatice; Duxfield, Robert; Dwuznik, Michal; Dydak, Friedrich; Düren, Michael; Ebenstein, William; Ebke, Johannes; Eckweiler, Sebastian; Edmonds, Keith; Edwards, Clive; Edwards, Nicholas Charles; Ehrenfeld, Wolfgang; Ehrich, Thies; Eifert, Till; Eigen, Gerald; Einsweiler, Kevin; Eisenhandler, Eric; Ekelof, Tord; El Kacimi, Mohamed; Ellert, Mattias; Elles, Sabine; Ellinghaus, Frank; Ellis, Katherine; Ellis, Nicolas; Elmsheuser, Johannes; Elsing, Markus; Emeliyanov, Dmitry; Engelmann, Roderich; Engl, Albert; Epp, Brigitte; Eppig, Andrew; Erdmann, Johannes; Ereditato, Antonio; Eriksson, Daniel; Ernst, Jesse; Ernst, Michael; Ernwein, Jean; Errede, Deborah; Errede, Steven; Ertel, Eugen; Escalier, Marc; Escobar, Carlos; Espinal Curull, Xavier; Esposito, Bellisario; Etienne, Francois; Etienvre, Anne-Isabelle; Etzion, Erez; Evangelakou, Despoina; Evans, Hal; Fabbri, Laura; Fabre, Caroline; Fakhrutdinov, Rinat; Falciano, Speranza; Fang, Yaquan; Fanti, Marcello; Farbin, Amir; Farilla, Addolorata; Farley, Jason; Farooque, Trisha; Farrington, Sinead; Farthouat, Philippe; Fassnacht, Patrick; Fassouliotis, Dimitrios; Fatholahzadeh, Baharak; Favareto, Andrea; Fayard, Louis; Fazio, Salvatore; 
Febbraro, Renato; Federic, Pavol; Fedin, Oleg; Fedorko, Woiciech; Fehling-Kaschek, Mirjam; Feligioni, Lorenzo; Fellmann, Denis; Feng, Cunfeng; Feng, Eric; Fenyuk, Alexander; Ferencei, Jozef; Ferland, Jonathan; Fernando, Waruna; Ferrag, Samir; Ferrando, James; Ferrara, Valentina; Ferrari, Arnaud; Ferrari, Pamela; Ferrari, Roberto; Ferreira de Lima, Danilo Enoque; Ferrer, Antonio; Ferrer, Maria Lorenza; Ferrere, Didier; Ferretti, Claudio; Ferretto Parodi, Andrea; Fiascaris, Maria; Fiedler, Frank; Filipčič, Andrej; Filippas, Anastasios; Filthaut, Frank; Fincke-Keeler, Margret; Fiolhais, Miguel; Fiorini, Luca; Firan, Ana; Fischer, Gordon; Fischer, Peter; Fisher, Matthew; Flechl, Martin; Fleck, Ivor; Fleckner, Johanna; Fleischmann, Philipp; Fleischmann, Sebastian; Flick, Tobias; Floderus, Anders; Flores Castillo, Luis; Flowerdew, Michael; Fokitis, Manolis; Fonseca Martin, Teresa; Forbush, David Alan; Formica, Andrea; Forti, Alessandra; Fortin, Dominique; Foster, Joe; Fournier, Daniel; Foussat, Arnaud; Fowler, Andrew; Fowler, Ken; Fox, Harald; Francavilla, Paolo; Franchino, Silvia; Francis, David; Frank, Tal; Franklin, Melissa; Franz, Sebastien; Fraternali, Marco; Fratina, Sasa; French, Sky; Friedrich, Felix; Froeschl, Robert; Froidevaux, Daniel; Frost, James; Fukunaga, Chikara; Fullana Torregrosa, Esteban; Fuster, Juan; Gabaldon, Carolina; Gabizon, Ofir; Gadfort, Thomas; Gadomski, Szymon; Gagliardi, Guido; Gagnon, Pauline; Galea, Cristina; Gallas, Elizabeth; Gallo, Valentina Santina; Gallop, Bruce; Gallus, Petr; Gan, KK; Gao, Yongsheng; Gapienko, Vladimir; Gaponenko, Andrei; Garberson, Ford; Garcia-Sciveres, Maurice; García, Carmen; García Navarro, José Enrique; Gardner, Robert; Garelli, Nicoletta; Garitaonandia, Hegoi; Garonne, Vincent; Garvey, John; Gatti, Claudio; Gaudio, Gabriella; Gaur, Bakul; Gauthier, Lea; Gavrilenko, Igor; Gay, Colin; Gaycken, Goetz; Gayde, Jean-Christophe; Gazis, Evangelos; Ge, Peng; Gee, Norman; Geerts, Daniël Alphonsus Adrianus; Geich-Gimbel, Christoph; Gellerstedt, Karl; Gemme, Claudia; Gemmell, Alistair; Genest, Marie-Hélène; Gentile, Simonetta; George, Matthias; George, Simon; Gerlach, Peter; Gershon, Avi; Geweniger, Christoph; Ghazlane, Hamid; Ghodbane, Nabil; Giacobbe, Benedetto; Giagu, Stefano; Giakoumopoulou, Victoria; Giangiobbe, Vincent; Gianotti, Fabiola; Gibbard, Bruce; Gibson, Adam; Gibson, Stephen; Gilbert, Laura; Gilewsky, Valentin; Gillberg, Dag; Gillman, Tony; Gingrich, Douglas; Ginzburg, Jonatan; Giokaris, Nikos; Giordani, MarioPaolo; Giordano, Raffaele; Giorgi, Francesco Michelangelo; Giovannini, Paola; Giraud, Pierre-Francois; Giugni, Danilo; Giunta, Michele; Giusti, Paolo; Gjelsten, Børge Kile; Gladilin, Leonid; Glasman, Claudia; Glatzer, Julian; Glazov, Alexandre; Glitza, Karl-Walter; Glonti, George; Goddard, Jack Robert; Godfrey, Jennifer; Godlewski, Jan; Goebel, Martin; Göpfert, Thomas; Goeringer, Christian; Gössling, Claus; Göttfert, Tobias; Goldfarb, Steven; Golling, Tobias; Gomes, Agostinho; Gomez Fajardo, Luz Stella; Gonçalo, Ricardo; Goncalves Pinto Firmino Da Costa, Joao; Gonella, Laura; Gonidec, Allain; Gonzalez, Saul; González de la Hoz, Santiago; Gonzalez Parra, Garoe; Gonzalez Silva, Laura; Gonzalez-Sevilla, Sergio; Goodson, Jeremiah Jet; Goossens, Luc; Gorbounov, Petr Andreevich; Gordon, Howard; Gorelov, Igor; Gorfine, Grant; Gorini, Benedetto; Gorini, Edoardo; Gorišek, Andrej; Gornicki, Edward; Gorokhov, Serguei; Goryachev, Vladimir; Gosdzik, Bjoern; Gosselink, Martijn; Gostkin, Mikhail Ivanovitch; Gough Eschrich, Ivo; Gouighri, 
Mohamed; Goujdami, Driss; Goulette, Marc Phillippe; Goussiou, Anna; Goy, Corinne; Gozpinar, Serdar; Grabowska-Bold, Iwona; Grafström, Per; Grahn, Karl-Johan; Grancagnolo, Francesco; Grancagnolo, Sergio; Grassi, Valerio; Gratchev, Vadim; Grau, Nathan; Gray, Heather; Gray, Julia Ann; Graziani, Enrico; Grebenyuk, Oleg; Greenshaw, Timothy; Greenwood, Zeno Dixon; Gregersen, Kristian; Gregor, Ingrid-Maria; Grenier, Philippe; Griffiths, Justin; Grigalashvili, Nugzar; Grillo, Alexander; Grinstein, Sebastian; Grishkevich, Yaroslav; Grivaz, Jean-Francois; Groh, Manfred; Gross, Eilam; Grosse-Knetter, Joern; Groth-Jensen, Jacob; Grybel, Kai; Guarino, Victor; Guest, Daniel; Guicheney, Christophe; Guida, Angelo; Guindon, Stefan; Guler, Hulya; Gunther, Jaroslav; Guo, Bin; Guo, Jun; Gupta, Ambreesh; Gusakov, Yury; Gushchin, Vladimir; Gutierrez, Phillip; Guttman, Nir; Gutzwiller, Olivier; Guyot, Claude; Gwenlan, Claire; Gwilliam, Carl; Haas, Andy; Haas, Stefan; Haber, Carl; Hackenburg, Robert; Hadavand, Haleh Khani; Hadley, David; Haefner, Petra; Hahn, Ferdinand; Haider, Stefan; Hajduk, Zbigniew; Hakobyan, Hrachya; Hall, David; Haller, Johannes; Hamacher, Klaus; Hamal, Petr; Hamer, Matthias; Hamilton, Andrew; Hamilton, Samuel; Han, Hongguang; Han, Liang; Hanagaki, Kazunori; Hanawa, Keita; Hance, Michael; Handel, Carsten; Hanke, Paul; Hansen, John Renner; Hansen, Jørgen Beck; Hansen, Jorn Dines; Hansen, Peter Henrik; Hansson, Per; Hara, Kazuhiko; Hare, Gabriel; Harenberg, Torsten; Harkusha, Siarhei; Harper, Devin; Harrington, Robert; Harris, Orin; Harrison, Karl; Hartert, Jochen; Hartjes, Fred; Haruyama, Tomiyoshi; Harvey, Alex; Hasegawa, Satoshi; Hasegawa, Yoji; Hassani, Samira; Hatch, Mark; Hauff, Dieter; Haug, Sigve; Hauschild, Michael; Hauser, Reiner; Havranek, Miroslav; Hawes, Brian; Hawkes, Christopher; Hawkings, Richard John; Hawkins, Anthony David; Hawkins, Donovan; Hayakawa, Takashi; Hayashi, Takayasu; Hayden, Daniel; Hayward, Helen; Haywood, Stephen; Hazen, Eric; He, Mao; Head, Simon; Hedberg, Vincent; Heelan, Louise; Heim, Sarah; Heinemann, Beate; Heisterkamp, Simon; Helary, Louis; Heller, Claudio; Heller, Matthieu; Hellman, Sten; Hellmich, Dennis; Helsens, Clement; Henderson, Robert; Henke, Michael; Henrichs, Anna; Henriques Correia, Ana Maria; Henrot-Versille, Sophie; Henry-Couannier, Frédéric; Hensel, Carsten; Henß, Tobias; Medina Hernandez, Carlos; Hernández Jiménez, Yesenia; Herrberg, Ruth; Hershenhorn, Alon David; Herten, Gregor; Hertenberger, Ralf; Hervas, Luis; Hesketh, Gavin Grant; Hessey, Nigel; Higón-Rodriguez, Emilio; Hill, Daniel; Hill, John; Hill, Norman; Hiller, Karl Heinz; Hillert, Sonja; Hillier, Stephen; Hinchliffe, Ian; Hines, Elizabeth; Hirose, Minoru; Hirsch, Florian; Hirschbuehl, Dominic; Hobbs, John; Hod, Noam; Hodgkinson, Mark; Hodgson, Paul; Hoecker, Andreas; Hoeferkamp, Martin; Hoffman, Julia; Hoffmann, Dirk; Hohlfeld, Marc; Holder, Martin; Holmgren, Sven-Olof; Holy, Tomas; Holzbauer, Jenny; Homma, Yasuhiro; Hong, Tae Min; Hooft van Huysduynen, Loek; Horazdovsky, Tomas; Horn, Claus; Horner, Stephan; Hostachy, Jean-Yves; Hou, Suen; Houlden, Michael; Hoummada, Abdeslam; Howarth, James; Howell, David; Hristova, Ivana; Hrivnac, Julius; Hruska, Ivan; Hryn'ova, Tetiana; Hsu, Pai-hsien Jennifer; Hsu, Shih-Chieh; Huang, Guang Shun; Hubacek, Zdenek; Hubaut, Fabrice; Huegging, Fabian; Huettmann, Antje; Huffman, Todd Brian; Hughes, Emlyn; Hughes, Gareth; Hughes-Jones, Richard; Huhtinen, Mika; Hurst, Peter; Hurwitz, Martina; Husemann, Ulrich; Huseynov, Nazim; Huston, Joey; Huth, 
John; Iacobucci, Giuseppe; Iakovidis, Georgios; Ibbotson, Michael; Ibragimov, Iskander; Ichimiya, Ryo; Iconomidou-Fayard, Lydia; Idarraga, John; Iengo, Paolo; Igonkina, Olga; Ikegami, Yoichi; Ikeno, Masahiro; Ilchenko, Yuri; Iliadis, Dimitrios; Ilic, Nikolina; Imori, Masatoshi; Ince, Tayfun; Inigo-Golfin, Joaquin; Ioannou, Pavlos; Iodice, Mauro; Ippolito, Valerio; Irles Quiles, Adrian; Isaksson, Charlie; Ishikawa, Akimasa; Ishino, Masaya; Ishmukhametov, Renat; Issever, Cigdem; Istin, Serhat; Ivashin, Anton; Iwanski, Wieslaw; Iwasaki, Hiroyuki; Izen, Joseph; Izzo, Vincenzo; Jackson, Brett; Jackson, John; Jackson, Paul; Jaekel, Martin; Jain, Vivek; Jakobs, Karl; Jakobsen, Sune; Jakubek, Jan; Jana, Dilip; Jankowski, Ernest; Jansen, Eric; Jansen, Hendrik; Jantsch, Andreas; Janus, Michel; Jarlskog, Göran; Jeanty, Laura; Jelen, Kazimierz; Jen-La Plante, Imai; Jenni, Peter; Jeremie, Andrea; Jež, Pavel; Jézéquel, Stéphane; Jha, Manoj Kumar; Ji, Haoshuang; Ji, Weina; Jia, Jiangyong; Jiang, Yi; Jimenez Belenguer, Marcos; Jin, Ge; Jin, Shan; Jinnouchi, Osamu; Joergensen, Morten Dam; Joffe, David; Johansen, Lars; Johansen, Marianne; Johansson, Erik; Johansson, Per; Johnert, Sebastian; Johns, Kenneth; Jon-And, Kerstin; Jones, Graham; Jones, Roger; Jones, Tegid; Jones, Tim; Jonsson, Ove; Joram, Christian; Jorge, Pedro; Joseph, John; Jovicevic, Jelena; Jovin, Tatjana; Ju, Xiangyang; Jung, Christian; Jungst, Ralph Markus; Juranek, Vojtech; Jussel, Patrick; Juste Rozas, Aurelio; Kabachenko, Vasily; Kabana, Sonja; Kaci, Mohammed; Kaczmarska, Anna; Kadlecik, Peter; Kado, Marumi; Kagan, Harris; Kagan, Michael; Kaiser, Steffen; Kajomovitz, Enrique; Kalinin, Sergey; Kalinovskaya, Lidia; Kama, Sami; Kanaya, Naoko; Kaneda, Michiru; Kaneti, Steven; Kanno, Takayuki; Kantserov, Vadim; Kanzaki, Junichi; Kaplan, Benjamin; Kapliy, Anton; Kaplon, Jan; Kar, Deepak; Karagoz, Muge; Karnevskiy, Mikhail; Karr, Kristo; Kartvelishvili, Vakhtang; Karyukhin, Andrey; Kashif, Lashkar; Kasieczka, Gregor; Kasmi, Azzedine; Kass, Richard; Kastanas, Alex; Kataoka, Mayuko; Kataoka, Yousuke; Katsoufis, Elias; Katzy, Judith; Kaushik, Venkatesh; Kawagoe, Kiyotomo; Kawamoto, Tatsuo; Kawamura, Gen; Kayl, Manuel; Kazanin, Vassili; Kazarinov, Makhail; Keeler, Richard; Kehoe, Robert; Keil, Markus; Kekelidze, George; Kennedy, John; Kenney, Christopher John; Kenyon, Mike; Kepka, Oldrich; Kerschen, Nicolas; Kerševan, Borut Paul; Kersten, Susanne; Kessoku, Kohei; Keung, Justin; Khakzad, Mohsen; Khalil-zada, Farkhad; Khandanyan, Hovhannes; Khanov, Alexander; Kharchenko, Dmitri; Khodinov, Alexander; Kholodenko, Anatoli; Khomich, Andrei; Khoo, Teng Jian; Khoriauli, Gia; Khoroshilov, Andrey; Khovanskiy, Nikolai; Khovanskiy, Valery; Khramov, Evgeniy; Khubua, Jemal; Kim, Hyeon Jin; Kim, Min Suk; Kim, Shinhong; Kimura, Naoki; Kind, Oliver; King, Barry; King, Matthew; King, Robert Steven Beaufoy; Kirk, Julie; Kirsch, Lawrence; Kiryunin, Andrey; Kishimoto, Tomoe; Kisielewska, Danuta; Kittelmann, Thomas; Kiver, Andrey; Kladiva, Eduard; Klaiber-Lodewigs, Jonas; Klein, Max; Klein, Uta; Kleinknecht, Konrad; Klemetti, Miika; Klier, Amit; Klimek, Pawel; Klimentov, Alexei; Klingenberg, Reiner; Klinger, Joel Alexander; Klinkby, Esben; Klioutchnikova, Tatiana; Klok, Peter; Klous, Sander; Kluge, Eike-Erik; Kluge, Thomas; Kluit, Peter; Kluth, Stefan; Knecht, Neil; Kneringer, Emmerich; Knobloch, Juergen; Knoops, Edith; Knue, Andrea; Ko, Byeong Rok; Kobayashi, Tomio; Kobel, Michael; Kocian, Martin; Kodys, Peter; Köneke, Karsten; König, Adriaan; Koenig, Sebastian; Köpke, 
Lutz; Koetsveld, Folkert; Koevesarki, Peter; Koffas, Thomas; Koffeman, Els; Kogan, Lucy Anne; Kohn, Fabian; Kohout, Zdenek; Kohriki, Takashi; Koi, Tatsumi; Kokott, Thomas; Kolachev, Guennady; Kolanoski, Hermann; Kolesnikov, Vladimir; Koletsou, Iro; Koll, James; Kollefrath, Michael; Kolya, Scott; Komar, Aston; Komori, Yuto; Kondo, Takahiko; Kono, Takanori; Kononov, Anatoly; Konoplich, Rostislav; Konstantinidis, Nikolaos; Kootz, Andreas; Koperny, Stefan; Korcyl, Krzysztof; Kordas, Kostantinos; Koreshev, Victor; Korn, Andreas; Korol, Aleksandr; Korolkov, Ilya; Korolkova, Elena; Korotkov, Vladislav; Kortner, Oliver; Kortner, Sandra; Kostyukhin, Vadim; Kotamäki, Miikka Juhani; Kotov, Sergey; Kotov, Vladislav; Kotwal, Ashutosh; Kourkoumelis, Christine; Kouskoura, Vasiliki; Koutsman, Alex; Kowalewski, Robert Victor; Kowalski, Tadeusz; Kozanecki, Witold; Kozhin, Anatoly; Kral, Vlastimil; Kramarenko, Viktor; Kramberger, Gregor; Krasny, Mieczyslaw Witold; Krasznahorkay, Attila; Kraus, James; Kraus, Jana; Kreisel, Arik; Krejci, Frantisek; Kretzschmar, Jan; Krieger, Nina; Krieger, Peter; Kroeninger, Kevin; Kroha, Hubert; Kroll, Joe; Kroseberg, Juergen; Krstic, Jelena; Kruchonak, Uladzimir; Krüger, Hans; Kruker, Tobias; Krumnack, Nils; Krumshteyn, Zinovii; Kruth, Andre; Kubota, Takashi; Kuday, Sinan; Kuehn, Susanne; Kugel, Andreas; Kuhl, Thorsten; Kuhn, Dietmar; Kukhtin, Victor; Kulchitsky, Yuri; Kuleshov, Sergey; Kummer, Christian; Kuna, Marine; Kundu, Nikhil; Kunkle, Joshua; Kupco, Alexander; Kurashige, Hisaya; Kurata, Masakazu; Kurochkin, Yurii; Kus, Vlastimil; Kuwertz, Emma Sian; Kuze, Masahiro; Kvita, Jiri; Kwee, Regina; La Rosa, Alessandro; La Rotonda, Laura; Labarga, Luis; Labbe, Julien; Lablak, Said; Lacasta, Carlos; Lacava, Francesco; Lacker, Heiko; Lacour, Didier; Lacuesta, Vicente Ramón; Ladygin, Evgueni; Lafaye, Remi; Laforge, Bertrand; Lagouri, Theodota; Lai, Stanley; Laisne, Emmanuel; Lamanna, Massimo; Lampen, Caleb; Lampl, Walter; Lancon, Eric; Landgraf, Ulrich; Landon, Murrough; Lane, Jenna; Lange, Clemens; Lankford, Andrew; Lanni, Francesco; Lantzsch, Kerstin; Laplace, Sandrine; Lapoire, Cecile; Laporte, Jean-Francois; Lari, Tommaso; Larionov, Anatoly; Larner, Aimee; Lasseur, Christian; Lassnig, Mario; Laurelli, Paolo; Lavorini, Vincenzo; Lavrijsen, Wim; Laycock, Paul; Lazarev, Alexandre; Le Dortz, Olivier; Le Guirriec, Emmanuel; Le Maner, Christophe; Le Menedeu, Eve; Lebel, Céline; LeCompte, Thomas; Ledroit-Guillon, Fabienne Agnes Marie; Lee, Hurng-Chun; Lee, Jason; Lee, Shih-Chang; Lee, Lawrence; Lefebvre, Michel; Legendre, Marie; Leger, Annie; LeGeyt, Benjamin; Legger, Federica; Leggett, Charles; Lehmacher, Marc; Lehmann Miotto, Giovanna; Lei, Xiaowen; Leite, Marco Aurelio Lisboa; Leitner, Rupert; Lellouch, Daniel; Leltchouk, Mikhail; Lemmer, Boris; Lendermann, Victor; Leney, Katharine; Lenz, Tatiana; Lenzen, Georg; Lenzi, Bruno; Leonhardt, Kathrin; Leontsinis, Stefanos; Leroy, Claude; Lessard, Jean-Raphael; Lesser, Jonas; Lester, Christopher; Leung Fook Cheong, Annabelle; Levêque, Jessica; Levin, Daniel; Levinson, Lorne; Levitski, Mikhail; Lewis, Adrian; Lewis, George; Leyko, Agnieszka; Leyton, Michael; Li, Bo; Li, Haifeng; Li, Shu; Li, Xuefei; Liang, Zhijun; Liao, Hongbo; Liberti, Barbara; Lichard, Peter; Lichtnecker, Markus; Lie, Ki; Liebig, Wolfgang; Lifshitz, Ronen; Lilley, Joseph; Limbach, Christian; Limosani, Antonio; Limper, Maaike; Lin, Simon; Linde, Frank; Linnemann, James; Lipeles, Elliot; Lipinsky, Lukas; Lipniacka, Anna; Liss, Tony; Lissauer, David; Lister, Alison; 
Litke, Alan; Liu, Chuanlei; Liu, Dong; Liu, Hao; Liu, Jianbei; Liu, Minghui; Liu, Yanwen; Livan, Michele; Livermore, Sarah; Lleres, Annick; Llorente Merino, Javier; Lloyd, Stephen; Lobodzinska, Ewelina; Loch, Peter; Lockman, William; Loddenkoetter, Thomas; Loebinger, Fred; Loginov, Andrey; Loh, Chang Wei; Lohse, Thomas; Lohwasser, Kristin; Lokajicek, Milos; Loken, James; Lombardo, Vincenzo Paolo; Long, Robin Eamonn; Lopes, Lourenco; Lopez Mateos, David; Lorenz, Jeanette; Lorenzo Martinez, Narei; Losada, Marta; Loscutoff, Peter; Lo Sterzo, Francesco; Losty, Michael; Lou, Xinchou; Lounis, Abdenour; Loureiro, Karina; Love, Jeremy; Love, Peter; Lowe, Andrew; Lu, Feng; Lubatti, Henry; Luci, Claudio; Lucotte, Arnaud; Ludwig, Andreas; Ludwig, Dörthe; Ludwig, Inga; Ludwig, Jens; Luehring, Frederick; Luijckx, Guy; Lumb, Debra; Luminari, Lamberto; Lund, Esben; Lund-Jensen, Bengt; Lundberg, Björn; Lundberg, Johan; Lundquist, Johan; Lungwitz, Matthias; Lutz, Gerhard; Lynn, David; Lys, Jeremy; Lytken, Else; Ma, Hong; Ma, Lian Liang; Macana Goia, Jorge Andres; Maccarrone, Giovanni; Macchiolo, Anna; Maček, Boštjan; Machado Miguens, Joana; Mackeprang, Rasmus; Madaras, Ronald; Mader, Wolfgang; Maenner, Reinhard; Maeno, Tadashi; Mättig, Peter; Mättig, Stefan; Magnoni, Luca; Magradze, Erekle; Mahalalel, Yair; Mahboubi, Kambiz; Mahout, Gilles; Maiani, Camilla; Maidantchik, Carmen; Maio, Amélia; Majewski, Stephanie; Makida, Yasuhiro; Makovec, Nikola; Mal, Prolay; Malaescu, Bogdan; Malecki, Pawel; Malecki, Piotr; Maleev, Victor; Malek, Fairouz; Mallik, Usha; Malon, David; Malone, Caitlin; Maltezos, Stavros; Malyshev, Vladimir; Malyukov, Sergei; Mameghani, Raphael; Mamuzic, Judita; Manabe, Atsushi; Mandelli, Luciano; Mandić, Igor; Mandrysch, Rocco; Maneira, José; Mangeard, Pierre-Simon; Manhaes de Andrade Filho, Luciano; Manjavidze, Ioseb; Mann, Alexander; Manning, Peter; Manousakis-Katsikakis, Arkadios; Mansoulie, Bruno; Manz, Andreas; Mapelli, Alessandro; Mapelli, Livio; March, Luis; Marchand, Jean-Francois; Marchese, Fabrizio; Marchiori, Giovanni; Marcisovsky, Michal; Marin, Alexandru; Marino, Christopher; Marroquim, Fernando; Marshall, Robin; Marshall, Zach; Martens, Kalen; Marti-Garcia, Salvador; Martin, Andrew; Martin, Brian; Martin, Brian Thomas; Martin, Franck Francois; Martin, Jean-Pierre; Martin, Philippe; Martin, Tim; Martin, Victoria Jane; Martin dit Latour, Bertrand; Martin-Haugh, Stewart; Martinez, Mario; Martinez Outschoorn, Verena; Martyniuk, Alex; Marx, Marilyn; Marzano, Francesco; Marzin, Antoine; Masetti, Lucia; Mashimo, Tetsuro; Mashinistov, Ruslan; Masik, Jiri; Maslennikov, Alexey; Massa, Ignazio; Massaro, Graziano; Massol, Nicolas; Mastrandrea, Paolo; Mastroberardino, Anna; Masubuchi, Tatsuya; Mathes, Markus; Matricon, Pierre; Matsumoto, Hiroshi; Matsunaga, Hiroyuki; Matsushita, Takashi; Mattravers, Carly; Maugain, Jean-Marie; Maurer, Julien; Maxfield, Stephen; Maximov, Dmitriy; May, Edward; Mayne, Anna; Mazini, Rachid; Mazur, Michael; Mazzanti, Marcello; Mazzoni, Enrico; Mc Kee, Shawn Patrick; McCarn, Allison; McCarthy, Robert; McCarthy, Tom; McCubbin, Norman; McFarlane, Kenneth; Mcfayden, Josh; McGlone, Helen; Mchedlidze, Gvantsa; McLaren, Robert Andrew; Mclaughlan, Tom; McMahon, Steve; McPherson, Robert; Meade, Andrew; Mechnich, Joerg; Mechtel, Markus; Medinnis, Mike; Meera-Lebbai, Razzak; Meguro, Tatsuma; Mehdiyev, Rashid; Mehlhase, Sascha; Mehta, Andrew; Meier, Karlheinz; Meirose, Bernhard; Melachrinos, Constantinos; Mellado Garcia, Bruce Rafael; Mendoza Navas, Luis; Meng, Zhaoxia; 
Mengarelli, Alberto; Menke, Sven; Menot, Claude; Meoni, Evelin; Mercurio, Kevin Michael; Mermod, Philippe; Merola, Leonardo; Meroni, Chiara; Merritt, Frank; Merritt, Hayes; Messina, Andrea; Metcalfe, Jessica; Mete, Alaettin Serhan; Meyer, Carsten; Meyer, Christopher; Meyer, Jean-Pierre; Meyer, Jochen; Meyer, Joerg; Meyer, Thomas Christian; Meyer, W Thomas; Miao, Jiayuan; Michal, Sebastien; Micu, Liliana; Middleton, Robin; Migas, Sylwia; Mijović, Liza; Mikenberg, Giora; Mikestikova, Marcela; Mikuž, Marko; Miller, David; Miller, Robert; Mills, Bill; Mills, Corrinne; Milov, Alexander; Milstead, David; Milstein, Dmitry; Minaenko, Andrey; Miñano Moya, Mercedes; Minashvili, Irakli; Mincer, Allen; Mindur, Bartosz; Mineev, Mikhail; Ming, Yao; Mir, Lluisa-Maria; Mirabelli, Giovanni; Miralles Verge, Lluis; Misiejuk, Andrzej; Mitrevski, Jovan; Mitrofanov, Gennady; Mitsou, Vasiliki A; Mitsui, Shingo; Miyagawa, Paul; Miyazaki, Kazuki; Mjörnmark, Jan-Ulf; Moa, Torbjoern; Mockett, Paul; Moed, Shulamit; Moeller, Victoria; Mönig, Klaus; Möser, Nicolas; Mohapatra, Soumya; Mohr, Wolfgang; Mohrdieck-Möck, Susanne; Moisseev, Artemy; Moles-Valls, Regina; Molina-Perez, Jorge; Monk, James; Monnier, Emmanuel; Montesano, Simone; Monticelli, Fernando; Monzani, Simone; Moore, Roger; Moorhead, Gareth; Mora Herrera, Clemencia; Moraes, Arthur; Morange, Nicolas; Morel, Julien; Morello, Gianfranco; Moreno, Deywis; Moreno Llácer, María; Morettini, Paolo; Morgenstern, Marcus; Morii, Masahiro; Morin, Jerome; Morley, Anthony Keith; Mornacchi, Giuseppe; Morozov, Sergey; Morris, John; Morvaj, Ljiljana; Moser, Hans-Guenther; Mosidze, Maia; Moss, Josh; Mount, Richard; Mountricha, Eleni; Mouraviev, Sergei; Moyse, Edward; Mudrinic, Mihajlo; Mueller, Felix; Mueller, James; Mueller, Klemens; Müller, Thomas; Mueller, Timo; Muenstermann, Daniel; Muir, Alex; Munwes, Yonathan; Murray, Bill; Mussche, Ido; Musto, Elisa; Myagkov, Alexey; Nadal, Jordi; Nagai, Koichi; Nagano, Kunihiro; Nagarkar, Advait; Nagasaka, Yasushi; Nagel, Martin; Nairz, Armin Michael; Nakahama, Yu; Nakamura, Koji; Nakamura, Tomoaki; Nakano, Itsuo; Nanava, Gizo; Napier, Austin; Narayan, Rohin; Nash, Michael; Nation, Nigel; Nattermann, Till; Naumann, Thomas; Navarro, Gabriela; Neal, Homer; Nebot, Eduardo; Nechaeva, Polina; Neep, Thomas James; Negri, Andrea; Negri, Guido; Nektarijevic, Snezana; Nelson, Andrew; Nelson, Silke; Nelson, Timothy Knight; Nemecek, Stanislav; Nemethy, Peter; Nepomuceno, Andre Asevedo; Nessi, Marzio; Neubauer, Mark; Neusiedl, Andrea; Neves, Ricardo; Nevski, Pavel; Newman, Paul; Nguyen Thi Hong, Van; Nickerson, Richard; Nicolaidou, Rosy; Nicolas, Ludovic; Nicquevert, Bertrand; Niedercorn, Francois; Nielsen, Jason; Niinikoski, Tapio; Nikiforou, Nikiforos; Nikiforov, Andriy; Nikolaenko, Vladimir; Nikolaev, Kirill; Nikolic-Audit, Irena; Nikolics, Katalin; Nikolopoulos, Konstantinos; Nilsen, Henrik; Nilsson, Paul; Ninomiya, Yoichi; Nisati, Aleandro; Nishiyama, Tomonori; Nisius, Richard; Nodulman, Lawrence; Nomachi, Masaharu; Nomidis, Ioannis; Nordberg, Markus; Nordkvist, Bjoern; Norton, Peter; Novakova, Jana; Nozaki, Mitsuaki; Nozka, Libor; Nugent, Ian Michael; Nuncio-Quiroz, Adriana-Elizabeth; Nunes Hanninger, Guilherme; Nunnemann, Thomas; Nurse, Emily; O'Brien, Brendan Joseph; O'Neale, Steve; O'Neil, Dugan; O'Shea, Val; Oakes, Louise Beth; Oakham, Gerald; Oberlack, Horst; Ocariz, Jose; Ochi, Atsuhiko; Oda, Susumu; Odaka, Shigeru; Odier, Jerome; Ogren, Harold; Oh, Alexander; Oh, Seog; Ohm, Christian; Ohshima, Takayoshi; Ohshita, Hidetoshi; Ohsugi, 
Takashi; Okada, Shogo; Okawa, Hideki; Okumura, Yasuyuki; Okuyama, Toyonobu; Olariu, Albert; Olcese, Marco; Olchevski, Alexander; Olivares Pino, Sebastian Andres; Oliveira, Miguel Alfonso; Oliveira Damazio, Denis; Oliver Garcia, Elena; Olivito, Dominick; Olszewski, Andrzej; Olszowska, Jolanta; Omachi, Chihiro; Onofre, António; Onyisi, Peter; Oram, Christopher; Oreglia, Mark; Oren, Yona; Orestano, Domizia; Orlov, Iliya; Oropeza Barrera, Cristina; Orr, Robert; Osculati, Bianca; Ospanov, Rustem; Osuna, Carlos; Otero y Garzon, Gustavo; Ottersbach, John; Ouchrif, Mohamed; Ouellette, Eric; Ould-Saada, Farid; Ouraou, Ahmimed; Ouyang, Qun; Ovcharova, Ana; Owen, Mark; Owen, Simon; Ozcan, Veysi Erkcan; Ozturk, Nurcan; Pacheco Pages, Andres; Padilla Aranda, Cristobal; Pagan Griso, Simone; Paganis, Efstathios; Paige, Frank; Pais, Preema; Pajchel, Katarina; Palacino, Gabriel; Paleari, Chiara; Palestini, Sandro; Pallin, Dominique; Palma, Alberto; Palmer, Jody; Pan, Yibin; Panagiotopoulou, Evgenia; Panes, Boris; Panikashvili, Natalia; Panitkin, Sergey; Pantea, Dan; Panuskova, Monika; Paolone, Vittorio; Papadelis, Aras; Papadopoulou, Theodora; Paramonov, Alexander; Park, Woochun; Parker, Andy; Parodi, Fabrizio; Parsons, John; Parzefall, Ulrich; Pasqualucci, Enrico; Passaggio, Stefano; Passeri, Antonio; Pastore, Fernanda; Pastore, Francesca; Pásztor, Gabriella; Pataraia, Sophio; Patel, Nikhul; Pater, Joleen; Patricelli, Sergio; Pauly, Thilo; Pecsy, Martin; Pedraza Morales, Maria Isabel; Peleganchuk, Sergey; Peng, Haiping; Pengo, Ruggero; Penning, Bjoern; Penson, Alexander; Penwell, John; Perantoni, Marcelo; Perez, Kerstin; Perez Cavalcanti, Tiago; Perez Codina, Estel; Pérez García-Estañ, María Teresa; Perez Reale, Valeria; Perini, Laura; Pernegger, Heinz; Perrino, Roberto; Perrodo, Pascal; Persembe, Seda; Perus, Antoine; Peshekhonov, Vladimir; Peters, Krisztian; Petersen, Brian; Petersen, Jorgen; Petersen, Troels; Petit, Elisabeth; Petridis, Andreas; Petridou, Chariclia; Petrolo, Emilio; Petrucci, Fabrizio; Petschull, Dennis; Petteni, Michele; Pezoa, Raquel; Phan, Anna; Phillips, Peter William; Piacquadio, Giacinto; Piccaro, Elisa; Piccinini, Maurizio; Piec, Sebastian Marcin; Piegaia, Ricardo; Pignotti, David; Pilcher, James; Pilkington, Andrew; Pina, João Antonio; Pinamonti, Michele; Pinder, Alex; Pinfold, James; Ping, Jialun; Pinto, Belmiro; Pirotte, Olivier; Pizio, Caterina; Placakyte, Ringaile; Plamondon, Mathieu; Pleier, Marc-Andre; Pleskach, Anatoly; Poblaguev, Andrei; Poddar, Sahill; Podlyski, Fabrice; Poggioli, Luc; Poghosyan, Tatevik; Pohl, Martin; Polci, Francesco; Polesello, Giacomo; Policicchio, Antonio; Polini, Alessandro; Poll, James; Polychronakos, Venetios; Pomarede, Daniel Marc; Pomeroy, Daniel; Pommès, Kathy; Pontecorvo, Ludovico; Pope, Bernard; Popeneciu, Gabriel Alexandru; Popovic, Dragan; Poppleton, Alan; Portell Bueso, Xavier; Posch, Christoph; Pospelov, Guennady; Pospisil, Stanislav; Potrap, Igor; Potter, Christina; Potter, Christopher; Poulard, Gilbert; Poveda, Joaquin; Pozdnyakov, Valery; Prabhu, Robindra; Pralavorio, Pascal; Pranko, Aliaksandr; Prasad, Srivas; Pravahan, Rishiraj; Prell, Soeren; Pretzl, Klaus Peter; Pribyl, Lukas; Price, Darren; Price, Joe; Price, Lawrence; Price, Michael John; Prieur, Damien; Primavera, Margherita; Prokofiev, Kirill; Prokoshin, Fedor; Protopopescu, Serban; Proudfoot, James; Prudent, Xavier; Przybycien, Mariusz; Przysiezniak, Helenka; Psoroulas, Serena; Ptacek, Elizabeth; Pueschel, Elisa; Purdham, John; Purohit, Milind; Puzo, Patrick; Pylypchenko, 
Yuriy; Qian, Jianming; Qian, Zuxuan; Qin, Zhonghua; Quadt, Arnulf; Quarrie, David; Quayle, William; Quinonez, Fernando; Raas, Marcel; Radescu, Voica; Radics, Balint; Radloff, Peter; Rador, Tonguc; Ragusa, Francesco; Rahal, Ghita; Rahimi, Amir; Rahm, David; Rajagopalan, Srinivasan; Rammensee, Michael; Rammes, Marcus; Randle-Conde, Aidan Sean; Randrianarivony, Koloina; Ratoff, Peter; Rauscher, Felix; Rave, Tobias Christian; Raymond, Michel; Read, Alexander Lincoln; Rebuzzi, Daniela; Redelbach, Andreas; Redlinger, George; Reece, Ryan; Reeves, Kendall; Reichold, Armin; Reinherz-Aronis, Erez; Reinsch, Andreas; Reisinger, Ingo; Rembser, Christoph; Ren, Zhongliang; Renaud, Adrien; Renkel, Peter; Rescigno, Marco; Resconi, Silvia; Resende, Bernardo; Reznicek, Pavel; Rezvani, Reyhaneh; Richards, Alexander; Richter, Robert; Richter-Was, Elzbieta; Ridel, Melissa; Rijpstra, Manouk; Rijssenbeek, Michael; Rimoldi, Adele; Rinaldi, Lorenzo; Rios, Ryan Randy; Riu, Imma; Rivoltella, Giancesare; Rizatdinova, Flera; Rizvi, Eram; Robertson, Steven; Robichaud-Veronneau, Andree; Robinson, Dave; Robinson, James; Robinson, Mary; Robson, Aidan; Rocha de Lima, Jose Guilherme; Roda, Chiara; Roda Dos Santos, Denis; Rodriguez, Diego; Roe, Adam; Roe, Shaun; Røhne, Ole; Rojo, Victoria; Rolli, Simona; Romaniouk, Anatoli; Romano, Marino; Romanov, Victor; Romeo, Gaston; Romero Adam, Elena; Roos, Lydia; Ros, Eduardo; Rosati, Stefano; Rosbach, Kilian; Rose, Anthony; Rose, Matthew; Rosenbaum, Gabriel; Rosenberg, Eli; Rosendahl, Peter Lundgaard; Rosenthal, Oliver; Rosselet, Laurent; Rossetti, Valerio; Rossi, Elvira; Rossi, Leonardo Paolo; Rotaru, Marina; Roth, Itamar; Rothberg, Joseph; Rousseau, David; Royon, Christophe; Rozanov, Alexander; Rozen, Yoram; Ruan, Xifeng; Rubinskiy, Igor; Ruckert, Benjamin; Ruckstuhl, Nicole; Rud, Viacheslav; Rudolph, Christian; Rudolph, Gerald; Rühr, Frederik; Ruggieri, Federico; Ruiz-Martinez, Aranzazu; Rumiantsev, Viktor; Rumyantsev, Leonid; Runge, Kay; Rurikova, Zuzana; Rusakovich, Nikolai; Rust, Dave; Rutherfoord, John; Ruwiedel, Christoph; Ruzicka, Pavel; Ryabov, Yury; Ryadovikov, Vasily; Ryan, Patrick; Rybar, Martin; Rybkin, Grigori; Ryder, Nick; Rzaeva, Sevda; Saavedra, Aldo; Sadeh, Iftach; Sadrozinski, Hartmut; Sadykov, Renat; Safai Tehrani, Francesco; Sakamoto, Hiroshi; Salamanna, Giuseppe; Salamon, Andrea; Saleem, Muhammad; Salihagic, Denis; Salnikov, Andrei; Salt, José; Salvachua Ferrando, Belén; Salvatore, Daniela; Salvatore, Pasquale Fabrizio; Salvucci, Antonio; Salzburger, Andreas; Sampsonidis, Dimitrios; Samset, Björn Hallvard; Sanchez, Arturo; Sanchez Martinez, Victoria; Sandaker, Heidi; Sander, Heinz Georg; Sanders, Michiel; Sandhoff, Marisa; Sandoval, Tanya; Sandoval, Carlos; Sandstroem, Rikard; Sandvoss, Stephan; Sankey, Dave; Sansoni, Andrea; Santamarina Rios, Cibran; Santoni, Claudio; Santonico, Rinaldo; Santos, Helena; Saraiva, João; Sarangi, Tapas; Sarkisyan-Grinbaum, Edward; Sarri, Francesca; Sartisohn, Georg; Sasaki, Osamu; Sasaki, Takashi; Sasao, Noboru; Satsounkevitch, Igor; Sauvage, Gilles; Sauvan, Emmanuel; Sauvan, Jean-Baptiste; Savard, Pierre; Savinov, Vladimir; Savu, Dan Octavian; Sawyer, Lee; Saxon, David; Says, Louis-Pierre; Sbarra, Carla; Sbrizzi, Antonio; Scallon, Olivia; Scannicchio, Diana; Scarcella, Mark; Schaarschmidt, Jana; Schacht, Peter; Schäfer, Uli; Schaepe, Steffen; Schaetzel, Sebastian; Schaffer, Arthur; Schaile, Dorothee; Schamberger, R. 
Dean; Schamov, Andrey; Scharf, Veit; Schegelsky, Valery; Scheirich, Daniel; Schernau, Michael; Scherzer, Max; Schiavi, Carlo; Schieck, Jochen; Schioppa, Marco; Schlenker, Stefan; Schlereth, James; Schmidt, Evelyn; Schmieden, Kristof; Schmitt, Christian; Schmitt, Sebastian; Schmitz, Martin; Schöning, André; Schott, Matthias; Schouten, Doug; Schovancova, Jaroslava; Schram, Malachi; Schroeder, Christian; Schroer, Nicolai; Schuh, Silvia; Schuler, Georges; Schultens, Martin Johannes; Schultes, Joachim; Schultz-Coulon, Hans-Christian; Schulz, Holger; Schumacher, Jan; Schumacher, Markus; Schumm, Bruce; Schune, Philippe; Schwanenberger, Christian; Schwartzman, Ariel; Schwemling, Philippe; Schwienhorst, Reinhard; Schwierz, Rainer; Schwindling, Jerome; Schwindt, Thomas; Schwoerer, Maud; Scott, Bill; Searcy, Jacob; Sedov, George; Sedykh, Evgeny; Segura, Ester; Seidel, Sally; Seiden, Abraham; Seifert, Frank; Seixas, José; Sekhniaidze, Givi; Selbach, Karoline Elfriede; Seliverstov, Dmitry; Sellden, Bjoern; Sellers, Graham; Seman, Michal; Semprini-Cesari, Nicola; Serfon, Cedric; Serin, Laurent; Serkin, Leonid; Seuster, Rolf; Severini, Horst; Sevior, Martin; Sfyrla, Anna; Shabalina, Elizaveta; Shamim, Mansoora; Shan, Lianyou; Shank, James; Shao, Qi Tao; Shapiro, Marjorie; Shatalov, Pavel; Shaver, Leif; Shaw, Kate; Sherman, Daniel; Sherwood, Peter; Shibata, Akira; Shichi, Hideharu; Shimizu, Shima; Shimojima, Makoto; Shin, Taeksu; Shiyakova, Maria; Shmeleva, Alevtina; Shochet, Mel; Short, Daniel; Shrestha, Suyog; Shulga, Evgeny; Shupe, Michael; Sicho, Petr; Sidoti, Antonio; Siegert, Frank; Sijacki, Djordje; Silbert, Ohad; Silva, José; Silver, Yiftah; Silverstein, Daniel; Silverstein, Samuel; Simak, Vladislav; Simard, Olivier; Simic, Ljiljana; Simion, Stefan; Simmons, Brinick; Simonyan, Margar; Sinervo, Pekka; Sinev, Nikolai; Sipica, Valentin; Siragusa, Giovanni; Sircar, Anirvan; Sisakyan, Alexei; Sivoklokov, Serguei; Sjölin, Jörgen; Sjursen, Therese; Skinnari, Louise Anastasia; Skottowe, Hugh Philip; Skovpen, Kirill; Skubic, Patrick; Skvorodnev, Nikolai; Slater, Mark; Slavicek, Tomas; Sliwa, Krzysztof; Sloper, John erik; Smakhtin, Vladimir; Smart, Ben; Smirnov, Sergei; Smirnov, Yury; Smirnova, Lidia; Smirnova, Oxana; Smith, Ben Campbell; Smith, Douglas; Smith, Kenway; Smizanska, Maria; Smolek, Karel; Snesarev, Andrei; Snow, Steve; Snow, Joel; Snuverink, Jochem; Snyder, Scott; Soares, Mara; Sobie, Randall; Sodomka, Jaromir; Soffer, Abner; Solans, Carlos; Solar, Michael; Solc, Jaroslav; Soldatov, Evgeny; Soldevila, Urmila; Solfaroli Camillocci, Elena; Solodkov, Alexander; Solovyanov, Oleg; Soni, Nitesh; Sopko, Vit; Sopko, Bruno; Sosebee, Mark; Soualah, Rachik; Soukharev, Andrey; Spagnolo, Stefania; Spanò, Francesco; Spighi, Roberto; Spigo, Giancarlo; Spila, Federico; Spiwoks, Ralf; Spousta, Martin; Spreitzer, Teresa; Spurlock, Barry; St Denis, Richard Dante; Stahlman, Jonathan; Stamen, Rainer; Stanecka, Ewa; Stanek, Robert; Stanescu, Cristian; Stapnes, Steinar; Starchenko, Evgeny; Stark, Jan; Staroba, Pavel; Starovoitov, Pavel; Staude, Arnold; Stavina, Pavel; Stavropoulos, Georgios; Steele, Genevieve; Steinbach, Peter; Steinberg, Peter; Stekl, Ivan; Stelzer, Bernd; Stelzer, Harald Joerg; Stelzer-Chilton, Oliver; Stenzel, Hasko; Stern, Sebastian; Stevenson, Kyle; Stewart, Graeme; Stillings, Jan Andre; Stockton, Mark; Stoerig, Kathrin; Stoicea, Gabriel; Stonjek, Stefan; Strachota, Pavel; Stradling, Alden; Straessner, Arno; Strandberg, Jonas; Strandberg, Sara; Strandlie, Are; Strang, Michael; Strauss, Emanuel; 
Strauss, Michael; Strizenec, Pavol; Ströhmer, Raimund; Strom, David; Strong, John; Stroynowski, Ryszard; Strube, Jan; Stugu, Bjarne; Stumer, Iuliu; Stupak, John; Sturm, Philipp; Styles, Nicholas Adam; Soh, Dart-yin; Su, Dong; Subramania, Halasya Siva; Succurro, Antonella; Sugaya, Yorihito; Sugimoto, Takuya; Suhr, Chad; Suita, Koichi; Suk, Michal; Sulin, Vladimir; Sultansoy, Saleh; Sumida, Toshi; Sun, Xiaohu; Sundermann, Jan Erik; Suruliz, Kerim; Sushkov, Serge; Susinno, Giancarlo; Sutton, Mark; Suzuki, Yu; Suzuki, Yuta; Svatos, Michal; Sviridov, Yuri; Swedish, Stephen; Sykora, Ivan; Sykora, Tomas; Szeless, Balazs; Sánchez, Javier; Ta, Duc; Tackmann, Kerstin; Taffard, Anyes; Tafirout, Reda; Taiblum, Nimrod; Takahashi, Yuta; Takai, Helio; Takashima, Ryuichi; Takeda, Hiroshi; Takeshita, Tohru; Takubo, Yosuke; Talby, Mossadek; Talyshev, Alexey; Tamsett, Matthew; Tanaka, Junichi; Tanaka, Reisaburo; Tanaka, Satoshi; Tanaka, Shuji; Tanaka, Yoshito; Tanasijczuk, Andres Jorge; Tani, Kazutoshi; Tannoury, Nancy; Tappern, Geoffrey; Tapprogge, Stefan; Tardif, Dominique; Tarem, Shlomit; Tarrade, Fabien; Tartarelli, Giuseppe Francesco; Tas, Petr; Tasevsky, Marek; Tassi, Enrico; Tatarkhanov, Mous; Tayalati, Yahya; Taylor, Christopher; Taylor, Frank; Taylor, Geoffrey; Taylor, Wendy; Teinturier, Marthe; Teixeira Dias Castanheira, Matilde; Teixeira-Dias, Pedro; Temming, Kim Katrin; Ten Kate, Herman; Teng, Ping-Kun; Terada, Susumu; Terashi, Koji; Terron, Juan; Testa, Marianna; Teuscher, Richard; Thadome, Jocelyn; Therhaag, Jan; Theveneaux-Pelzer, Timothée; Thioye, Moustapha; Thoma, Sascha; Thomas, Juergen; Thompson, Emily; Thompson, Paul; Thompson, Peter; Thompson, Stan; Thomsen, Lotte Ansgaard; Thomson, Evelyn; Thomson, Mark; Thun, Rudolf; Tian, Feng; Tibbetts, Mark James; Tic, Tomáš; Tikhomirov, Vladimir; Tikhonov, Yury; Timoshenko, Sergey; Tipton, Paul; Tique Aires Viegas, Florbela De Jes; Tisserant, Sylvain; Tobias, Jürgen; Toczek, Barbara; Todorov, Theodore; Todorova-Nova, Sharka; Toggerson, Brokk; Tojo, Junji; Tokár, Stanislav; Tokunaga, Kaoru; Tokushuku, Katsuo; Tollefson, Kirsten; Tomoto, Makoto; Tompkins, Lauren; Toms, Konstantin; Tong, Guoliang; Tonoyan, Arshak; Topfel, Cyril; Topilin, Nikolai; Torchiani, Ingo; Torrence, Eric; Torres, Heberth; Torró Pastor, Emma; Toth, Jozsef; Touchard, Francois; Tovey, Daniel; Trefzger, Thomas; Tremblet, Louis; Tricoli, Alesandro; Trigger, Isabel Marian; Trincaz-Duvoid, Sophie; Trinh, Thi Nguyet; Tripiana, Martin; Trischuk, William; Trivedi, Arjun; Trocmé, Benjamin; Troncon, Clara; Trottier-McDonald, Michel; Trzebinski, Maciej; Trzupek, Adam; Tsarouchas, Charilaos; Tseng, Jeffrey; Tsiakiris, Menelaos; Tsiareshka, Pavel; Tsionou, Dimitra; Tsipolitis, Georgios; Tsiskaridze, Vakhtang; Tskhadadze, Edisher; Tsukerman, Ilya; Tsulaia, Vakhtang; Tsung, Jieh-Wen; Tsuno, Soshi; Tsybychev, Dmitri; Tua, Alan; Tudorache, Alexandra; Tudorache, Valentina; Tuggle, Joseph; Turala, Michal; Turecek, Daniel; Turk Cakir, Ilkay; Turlay, Emmanuel; Turra, Ruggero; Tuts, Michael; Tykhonov, Andrii; Tylmad, Maja; Tyndel, Mike; Tzanakos, George; Uchida, Kirika; Ueda, Ikuo; Ueno, Ryuichi; Ugland, Maren; Uhlenbrock, Mathias; Uhrmacher, Michael; Ukegawa, Fumihiko; Unal, Guillaume; Underwood, David; Undrus, Alexander; Unel, Gokhan; Unno, Yoshinobu; Urbaniec, Dustin; Usai, Giulio; Uslenghi, Massimiliano; Vacavant, Laurent; Vacek, Vaclav; Vachon, Brigitte; Vahsen, Sven; Valenta, Jan; Valente, Paolo; Valentinetti, Sara; Valkar, Stefan; Valladolid Gallego, Eva; Vallecorsa, Sofia; Valls Ferrer, Juan 
Antonio; van der Graaf, Harry; van der Kraaij, Erik; Van Der Leeuw, Robin; van der Poel, Egge; van der Ster, Daniel; van Eldik, Niels; van Gemmeren, Peter; van Kesteren, Zdenko; van Vulpen, Ivo; Vanadia, Marco; Vandelli, Wainer; Vandoni, Giovanna; Vaniachine, Alexandre; Vankov, Peter; Vannucci, Francois; Varela Rodriguez, Fernando; Vari, Riccardo; Varnes, Erich; Varouchas, Dimitris; Vartapetian, Armen; Varvell, Kevin; Vassilakopoulos, Vassilios; Vazeille, Francois; Vegni, Guido; Veillet, Jean-Jacques; Vellidis, Constantine; Veloso, Filipe; Veness, Raymond; Veneziano, Stefano; Ventura, Andrea; Ventura, Daniel; Venturi, Manuela; Venturi, Nicola; Vercesi, Valerio; Verducci, Monica; Verkerke, Wouter; Vermeulen, Jos; Vest, Anja; Vetterli, Michel; Vichou, Irene; Vickey, Trevor; Vickey Boeriu, Oana Elena; Viehhauser, Georg; Viel, Simon; Villa, Mauro; Villaplana Perez, Miguel; Vilucchi, Elisabetta; Vincter, Manuella; Vinek, Elisabeth; Vinogradov, Vladimir; Virchaux, Marc; Virzi, Joseph; Vitells, Ofer; Viti, Michele; Vivarelli, Iacopo; Vives Vaque, Francesc; Vlachos, Sotirios; Vladoiu, Dan; Vlasak, Michal; Vlasov, Nikolai; Vogel, Adrian; Vokac, Petr; Volpi, Guido; Volpi, Matteo; Volpini, Giovanni; von der Schmitt, Hans; von Loeben, Joerg; von Radziewski, Holger; von Toerne, Eckhard; Vorobel, Vit; Vorobiev, Alexander; Vorwerk, Volker; Vos, Marcel; Voss, Rudiger; Voss, Thorsten Tobias; Vossebeld, Joost; Vranjes, Nenad; Vranjes Milosavljevic, Marija; Vrba, Vaclav; Vreeswijk, Marcel; Vu Anh, Tuan; Vuillermet, Raphael; Vukotic, Ilija; Wagner, Wolfgang; Wagner, Peter; Wahlen, Helmut; Wakabayashi, Jun; Walbersloh, Jorg; Walch, Shannon; Walder, James; Walker, Rodney; Walkowiak, Wolfgang; Wall, Richard; Waller, Peter; Wang, Chiho; Wang, Haichen; Wang, Hulin; Wang, Jike; Wang, Jin; Wang, Joshua C; Wang, Rui; Wang, Song-Ming; Warburton, Andreas; Ward, Patricia; Warsinsky, Markus; Watkins, Peter; Watson, Alan; Watson, Ian; Watson, Miriam; Watts, Gordon; Watts, Stephen; Waugh, Anthony; Waugh, Ben; Weber, Marc; Weber, Michele; Weber, Pavel; Weidberg, Anthony; Weigell, Philipp; Weingarten, Jens; Weiser, Christian; Wellenstein, Hermann; Wells, Phillippa; Wen, Mei; Wenaus, Torre; Wendland, Dennis; Wendler, Shanti; Weng, Zhili; Wengler, Thorsten; Wenig, Siegfried; Wermes, Norbert; Werner, Matthias; Werner, Per; Werth, Michael; Wessels, Martin; Weydert, Carole; Whalen, Kathleen; Wheeler-Ellis, Sarah Jane; Whitaker, Scott; White, Andrew; White, Martin; Whitehead, Samuel Robert; Whiteson, Daniel; Whittington, Denver; Wicek, Francois; Wicke, Daniel; Wickens, Fred; Wiedenmann, Werner; Wielers, Monika; Wienemann, Peter; Wiglesworth, Craig; Wiik, Liv Antje Mari; Wijeratne, Peter Alexander; Wildauer, Andreas; Wildt, Martin Andre; Wilhelm, Ivan; Wilkens, Henric George; Will, Jonas Zacharias; Williams, Eric; Williams, Hugh; Willis, William; Willocq, Stephane; Wilson, John; Wilson, Michael Galante; Wilson, Alan; Wingerter-Seez, Isabelle; Winkelmann, Stefan; Winklmeier, Frank; Wittgen, Matthias; Wolter, Marcin Wladyslaw; Wolters, Helmut; Wong, Wei-Cheng; Wooden, Gemma; Wosiek, Barbara; Wotschack, Jorg; Woudstra, Martin; Wozniak, Krzysztof; Wraight, Kenneth; Wright, Catherine; Wright, Michael; Wrona, Bozydar; Wu, Sau Lan; Wu, Xin; Wu, Yusheng; Wulf, Evan; Wunstorf, Renate; Wynne, Benjamin; Xella, Stefania; Xiao, Meng; Xie, Song; Xie, Yigang; Xu, Chao; Xu, Da; Xu, Guofa; Yabsley, Bruce; Yacoob, Sahal; Yamada, Miho; Yamaguchi, Hiroshi; Yamamoto, Akira; Yamamoto, Kyoko; Yamamoto, Shimpei; Yamamura, Taiki; Yamanaka, Takashi; 
Yamaoka, Jared; Yamazaki, Takayuki; Yamazaki, Yuji; Yan, Zhen; Yang, Haijun; Yang, Un-Ki; Yang, Yi; Yang, Yi; Yang, Zhaoyu; Yanush, Serguei; Yao, Yushu; Yasu, Yoshiji; Ybeles Smit, Gabriel Valentijn; Ye, Jingbo; Ye, Shuwei; Yilmaz, Metin; Yoosoofmiya, Reza; Yorita, Kohei; Yoshida, Riktura; Young, Charles; Youssef, Saul; Yu, Dantong; Yu, Jaehoon; Yu, Jie; Yuan, Li; Yurkewicz, Adam; Zabinski, Bartlomiej; Zaets, Vassilli; Zaidan, Remi; Zaitsev, Alexander; Zajacova, Zuzana; Zanello, Lucia; Zarzhitsky, Pavel; Zaytsev, Alexander; Zeitnitz, Christian; Zeller, Michael; Zeman, Martin; Zemla, Andrzej; Zendler, Carolin; Zenin, Oleg; Ženiš, Tibor; Zenonos, Zenonas; Zenz, Seth; Zerwas, Dirk; Zevi della Porta, Giovanni; Zhan, Zhichao; Zhang, Dongliang; Zhang, Huaqiao; Zhang, Jinlong; Zhang, Xueyao; Zhang, Zhiqing; Zhao, Long; Zhao, Tianchi; Zhao, Zhengguo; Zhemchugov, Alexey; Zheng, Shuchen; Zhong, Jiahang; Zhou, Bing; Zhou, Ning; Zhou, Yue; Zhu, Cheng Guang; Zhu, Hongbo; Zhu, Junjie; Zhu, Yingchun; Zhuang, Xuai; Zhuravlov, Vadym; Zieminska, Daria; Zimmermann, Robert; Zimmermann, Simone; Zimmermann, Stephanie; Ziolkowski, Michael; Zitoun, Robert; Živković, Lidija; Zmouchko, Viatcheslav; Zobernig, Georg; Zoccoli, Antonio; Zolnierowski, Yves; Zsenei, Andras; zur Nedden, Martin; Zutshi, Vishnu; Zwalinski, Lukasz

    2013-03-02

    The uncertainty on the calorimeter energy response to jets of particles is derived for the ATLAS experiment at the Large Hadron Collider (LHC). First, the calorimeter response to single isolated charged hadrons is measured and compared to the Monte Carlo simulation using proton-proton collisions at centre-of-mass energies of $\sqrt{s}$ = 900 GeV and 7 TeV collected during 2009 and 2010. Then, using the decay of K_s and Lambda particles, the calorimeter response to specific types of particles (positively and negatively charged pions, protons, and anti-protons) is measured and compared to the Monte Carlo predictions. Finally, the jet energy scale uncertainty is determined by propagating the response uncertainty for single charged and neutral particles to jets. The response uncertainty is 2-5% for central isolated hadrons and 1-3% for the final calorimeter jet energy scale.
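
    The propagation step summarized above is not spelled out in the abstract. As a hedged illustration of the general idea, the sketch below shifts the calorimeter response of each jet constituent within an assumed per-particle-type uncertainty and takes the spread of the resulting jet energies as the scale uncertainty; the uncertainty values and the correlation assumption are placeholders, not ATLAS results.

        import numpy as np

        rng = np.random.default_rng(0)

        # Placeholder fractional response uncertainties per particle type
        # (illustrative values; the measured ones depend on energy and eta).
        RESPONSE_UNCERTAINTY = {"pi": 0.03, "p": 0.04, "n": 0.04, "gamma": 0.01}

        def jet_energy_scale_uncertainty(constituents, n_toys=2000):
            """Toy propagation of single-particle response uncertainties to a jet.

            constituents : list of (particle_type, energy_GeV) tuples
            Assumption   : the response shift is fully correlated among particles
                           of the same type in the jet and independent between types.
            Returns the fractional jet energy scale uncertainty from the toys.
            """
            types = sorted({p for p, _ in constituents})
            energy_by_type = {t: sum(e for p, e in constituents if p == t) for t in types}
            nominal = sum(energy_by_type.values())

            toy_jets = np.empty(n_toys)
            for i in range(n_toys):
                shift = {t: rng.normal(0.0, RESPONSE_UNCERTAINTY[t]) for t in types}
                toy_jets[i] = sum((1.0 + shift[t]) * energy_by_type[t] for t in types)
            return toy_jets.std() / nominal

        jet = [("pi", 20.0), ("pi", 15.0), ("gamma", 10.0), ("n", 5.0)]
        print(f"fractional jet energy scale uncertainty ~ {jet_energy_scale_uncertainty(jet):.3f}")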

  12. Robust framework for PET image reconstruction incorporating system and measurement uncertainties.

    Directory of Open Access Journals (Sweden)

    Huafeng Liu

    In Positron Emission Tomography (PET), an optimal estimate of the radioactivity concentration is obtained from the measured emission data under certain criteria. So far, all the well-known statistical reconstruction algorithms require an exactly known system probability matrix a priori, and the quality of this system model largely determines the quality of the reconstructed images. In this paper, we propose an algorithm for PET image reconstruction for the real-world case where the PET system model is subject to uncertainties. The method casts PET reconstruction as a regularization problem, and the image estimate is obtained within an uncertainty-weighted least squares framework. The performance of our approach is evaluated with simulated Shepp-Logan and real phantom data, which demonstrates significant improvements in image quality over standard least squares reconstruction.
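
    The abstract frames reconstruction as regularized, uncertainty-weighted least squares. The sketch below shows a minimal version of that formulation with a diagonal weighting built from assumed per-measurement variances and a simple Tikhonov regularizer; the system matrix, weights and regularization strength are illustrative and do not reproduce the paper's framework.

        import numpy as np

        def weighted_ls_reconstruction(A, y, meas_var, lam=0.1):
            """Uncertainty-weighted, Tikhonov-regularized least squares estimate.

            Solves   min_x  || W^(1/2) (y - A x) ||^2 + lam * ||x||^2
            with W = diag(1 / meas_var), so uncertain measurements are down-weighted.

            A        : (m, n) system matrix (detection probabilities)
            y        : (m,) measured sinogram counts
            meas_var : (m,) assumed variance of each measurement
            lam      : regularization strength (illustrative choice)
            """
            W = np.diag(1.0 / np.asarray(meas_var, dtype=float))
            lhs = A.T @ W @ A + lam * np.eye(A.shape[1])
            rhs = A.T @ W @ y
            return np.linalg.solve(lhs, rhs)

        # Tiny illustrative problem (not a realistic PET geometry)
        rng = np.random.default_rng(1)
        x_true = np.array([1.0, 0.5, 2.0, 0.0])
        A = rng.uniform(0.0, 1.0, size=(12, 4))
        y_mean = A @ x_true
        y = rng.poisson(y_mean * 50) / 50.0          # noisy counts, rescaled
        x_hat = weighted_ls_reconstruction(A, y, meas_var=np.maximum(y_mean, 0.1))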

  13. Scientific rigor through videogames.

    Science.gov (United States)

    Treuille, Adrien; Das, Rhiju

    2014-11-01

    Hypothesis-driven experimentation - the scientific method - can be subverted by fraud, irreproducibility, and lack of rigorous predictive tests. A robust solution to these problems may be the 'massive open laboratory' model, recently embodied in the internet-scale videogame EteRNA. Deploying similar platforms throughout biology could enforce the scientific method more broadly. Copyright © 2014 Elsevier Ltd. All rights reserved.

  14. Conversion factor and uncertainty estimation for quantification of towed gamma-ray detector measurements in Tohoku coastal waters

    International Nuclear Information System (INIS)

    Ohnishi, S.; Thornton, B.; Kamada, S.; Hirao, Y.; Ura, T.; Odano, N.

    2016-01-01

    Factors to convert the count rate of a NaI(Tl) scintillation detector to the concentration of radioactive cesium in marine sediments are estimated for a towed gamma-ray detector system. The response of the detector to a unit concentration of radioactive cesium is calculated by Monte Carlo radiation transport simulation considering the vertical profile of radioactive material measured in core samples. The conversion factors are acquired by integrating the contribution of each layer and are normalized by the concentration in the surface sediment layer. At the same time, the uncertainty of the conversion factors is formulated and estimated. The combined standard uncertainty of the radioactive cesium concentration from the towed gamma-ray detector is around 25 percent. The values of uncertainty, often referred to as relative root mean square errors in other works, between sediment core sampling measurements and towed detector measurements were 16 percent in the investigation made near the Abukuma River mouth and 5.2 percent in Sendai Bay, respectively. Most of the uncertainty is due to interpolation of the conversion factors between core samples and uncertainty of the detector's burial depth. The results of the towed measurements agree well with laboratory-analysed sediment samples. Also, the concentrations of radioactive cesium at the intersections of the survey lines are consistent. This consistency with the sampling results and between different lines' transects demonstrates the applicability and reproducibility of the towed gamma-ray detector system.
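
    A minimal sketch of how a depth-integrated conversion factor of this kind might be assembled, assuming a simple exponential depth profile from a core sample and an illustrative per-layer detector response from a transport calculation; all numbers and component uncertainties below are invented for illustration and are not taken from the paper.

        import numpy as np

        # Hypothetical inputs: layer boundaries (cm), Cs concentration per layer from a
        # core sample (Bq/kg), and detector response per unit concentration per layer
        # (counts/s per Bq/kg), e.g. from a Monte Carlo transport calculation.
        z_edges = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0])
        z_mid = 0.5 * (z_edges[:-1] + z_edges[1:])
        conc = 100.0 * np.exp(-0.3 * z_mid)                        # layer means
        resp = np.array([5.0e-3, 2.0e-3, 8.0e-4, 3.0e-4, 1.0e-4])  # attenuation with depth

        # Count rate predicted for this profile, and the conversion factor that maps
        # count rate to the surface-layer concentration (the normalisation used above).
        count_rate = np.sum(resp * conc)
        conv_factor = conc[0] / count_rate     # (Bq/kg in surface layer) per (counts/s)

        # Combined relative standard uncertainty from assumed components (illustrative):
        u_rel = np.sqrt(0.15**2 +   # interpolation of conversion factors between cores
                        0.15**2 +   # burial depth of the detector
                        0.10**2)    # counting statistics and calibration
        print(f"conversion factor = {conv_factor:.3f} (Bq/kg)/(counts/s), "
              f"relative combined uncertainty about {u_rel:.0%}")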

  15. Conversion factor and uncertainty estimation for quantification of towed gamma-ray detector measurements in Tohoku coastal waters

    Energy Technology Data Exchange (ETDEWEB)

    Ohnishi, S., E-mail: ohnishi@nmri.go.jp [National Maritime Research Institute, 6-38-1, Shinkawa, Mitaka, Tokyo 181-0004 (Japan); Thornton, B. [Institute of Industrial Science, The University of Tokyo, 4-6-1, Komaba, Meguro-ku, Tokyo 153-8505 (Japan); Kamada, S.; Hirao, Y.; Ura, T.; Odano, N. [National Maritime Research Institute, 6-38-1, Shinkawa, Mitaka, Tokyo 181-0004 (Japan)

    2016-05-21

    Factors to convert the count rate of a NaI(Tl) scintillation detector to the concentration of radioactive cesium in marine sediments are estimated for a towed gamma-ray detector system. The response of the detector to a unit concentration of radioactive cesium is calculated by Monte Carlo radiation transport simulation considering the vertical profile of radioactive material measured in core samples. The conversion factors are acquired by integrating the contribution of each layer and are normalized by the concentration in the surface sediment layer. At the same time, the uncertainty of the conversion factors is formulated and estimated. The combined standard uncertainty of the radioactive cesium concentration from the towed gamma-ray detector is around 25 percent. The values of uncertainty, often referred to as relative root mean square errors in other works, between sediment core sampling measurements and towed detector measurements were 16 percent in the investigation made near the Abukuma River mouth and 5.2 percent in Sendai Bay, respectively. Most of the uncertainty is due to interpolation of the conversion factors between core samples and uncertainty of the detector's burial depth. The results of the towed measurements agree well with laboratory-analysed sediment samples. Also, the concentrations of radioactive cesium at the intersections of the survey lines are consistent. This consistency with the sampling results and between different lines' transects demonstrates the applicability and reproducibility of the towed gamma-ray detector system.

  16. Assessment of measurement result uncertainty in determination of 210Pb with the focus on matrix composition effect in gamma-ray spectrometry

    International Nuclear Information System (INIS)

    Iurian, A.R.; Pitois, A.; Kis-Benedek, G.; Migliori, A.; Padilla-Alvarez, R.; Ceccatelli, A.

    2016-01-01

    Reference materials were used to assess measurement result uncertainty in determination of 210Pb by gamma-ray spectrometry, liquid scintillation counting, or indirectly by alpha-particle spectrometry, using its daughter 210Po in radioactive equilibrium. Combined standard uncertainties of 210Pb massic activities obtained by liquid scintillation counting are in the range 2–12%, depending on matrices and massic activity values. They are in the range 1–3% for the measurement of its daughter 210Po using alpha-particle spectrometry. Three approaches (direct computation of counting efficiency and efficiency transfer approaches based on the computation and, respectively, experimental determination of the efficiency transfer factors) were applied for the evaluation of 210Pb using gamma-ray spectrometry. Combined standard uncertainties of gamma-ray spectrometry results were found in the range 2–17%. The effect of matrix composition on self-attenuation was investigated and a detailed assessment of uncertainty components was performed. - Highlights: • Confirmed 210Pb certified values by LSC and alpha-particle spectrometry (210Po). • Assessed 210Po measurement result uncertainty by alpha-particle spectrometry. • Matrix composition effect on gamma-ray spectrometry measurement result uncertainty. • Assessment of 210Pb measurement result uncertainty by gamma-ray spectrometry. • Comparison of techniques and approaches: ‘fit-for-purpose’ considerations.

  17. Environmental Uncertainty, Performance Measure Variety and Perceived Performance in Icelandic Companies

    DEFF Research Database (Denmark)

    Rikhardsson, Pall; Sigurjonsson, Throstur Olaf; Arnardottir, Audur Arna

    The use of performance measures and performance measurement frameworks has increased significantly in recent years. The type and variety of performance measures in use has been researched in various countries and linked to different variables such as the external environment, performance measurement frameworks, and management characteristics. This paper reports the results of a study carried out at year end 2013 of the use of performance measures by Icelandic companies and the links to perceived environmental uncertainty, management satisfaction with the performance measurement system, and the perceived performance of the company. The sample was the 300 largest companies in Iceland and the response rate was 27%. Compared to other studies, the majority of the respondents use a surprisingly high number of different measures – both financial and non-financial. This made testing of the three...

  18. OECD/NEA BENCHMARK FOR UNCERTAINTY ANALYSIS IN MODELING (UAM) FOR LWRS – SUMMARY AND DISCUSSION OF NEUTRONICS CASES (PHASE I)

    Directory of Open Access Journals (Sweden)

    RYAN N. BRATTON

    2014-06-01

    Full Text Available A Nuclear Energy Agency (NEA), Organization for Economic Co-operation and Development (OECD) benchmark for Uncertainty Analysis in Modeling (UAM) is defined in order to facilitate the development and validation of available uncertainty analysis and sensitivity analysis methods for best-estimate Light Water Reactor (LWR) design and safety calculations. The benchmark has been named the OECD/NEA UAM-LWR benchmark, and has been divided into three phases, each of which focuses on a different portion of the uncertainty propagation in LWR multi-physics and multi-scale analysis. Several different reactor cases are modeled at various phases of a reactor calculation. This paper discusses Phase I, known as the “Neutronics Phase”, which is devoted mostly to the propagation of nuclear data (cross-section) uncertainty throughout steady-state stand-alone neutronics core calculations. Three reactor systems (for which design, operation and measured data are available) are rigorously studied in this benchmark: Peach Bottom Unit 2 BWR, Three Mile Island Unit 1 PWR, and VVER-1000 Kozloduy-6/Kalinin-3. Additional measured data are analyzed, such as the KRITZ LEU criticality experiments and the SNEAK-7A and 7B experiments of the Karlsruhe Fast Critical Facility. Analyzed results include the top five neutron-nuclide reactions which contribute the most to the prediction uncertainty in keff, as well as the uncertainty in key parameters of neutronics analysis such as microscopic and macroscopic cross-sections, six-group decay constants, assembly discontinuity factors, and axial and radial core power distributions. Conclusions are drawn regarding where further studies should be done to reduce uncertainties in key nuclide reaction uncertainties (i.e., 238U radiative capture and inelastic scattering (n, n′)), as well as in the average number of neutrons released per fission event of 239Pu.
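
    The propagation of cross-section uncertainty to keff in this kind of benchmark is commonly summarised, at first order, by the "sandwich" rule var(keff) = S^T C S, where S holds relative sensitivities and C is the relative covariance matrix of the nuclear data. The snippet below is a generic illustration with invented sensitivities and covariances, not data from the benchmark.

        import numpy as np

        # Illustrative relative sensitivities of k-eff to three nuclear-data parameters,
        # e.g. 238U capture, 238U inelastic scattering, 239Pu nu-bar (values invented).
        S = np.array([-0.12, -0.05, 0.95])        # (dk/k) / (dx/x)

        # Illustrative relative covariance matrix of those parameters (fractional units).
        C = np.array([[0.03**2, 0.0,     0.0     ],
                      [0.0,     0.10**2, 0.0     ],
                      [0.0,     0.0,     0.005**2]])

        var_k = S @ C @ S                          # first-order "sandwich" rule
        print(f"relative k-eff uncertainty = {np.sqrt(var_k):.4%}")

        # Per-parameter contributions help rank which reactions drive the uncertainty.
        contrib = (S**2) * np.diag(C)
        for name, c in zip(["238U (n,g)", "238U (n,n')", "239Pu nu-bar"], contrib):
            print(f"{name:14s} contributes {np.sqrt(c):.4%}")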

  19. A Measure of Uncertainty regarding the Interval Constraint of Normal Mean Elicited by Two Stages of a Prior Hierarchy

    Directory of Open Access Journals (Sweden)

    Hea-Jung Kim

    2014-01-01

    Full Text Available This paper considers a hierarchical screened Gaussian model (HSGM) for Bayesian inference of normal models when an interval constraint in the mean parameter space needs to be incorporated in the modeling but when such a restriction is uncertain. An objective measure of the uncertainty regarding the interval constraint, accounted for by using the HSGM, is proposed for the Bayesian inference. For this purpose, we derive a maximum entropy prior of the normal mean, eliciting the uncertainty regarding the interval constraint, and then obtain the uncertainty measure by considering the relationship between the maximum entropy prior and the marginal prior of the normal mean in the HSGM. A Bayesian estimation procedure for the HSGM is developed and two numerical illustrations pertaining to the properties of the uncertainty measure are provided.

  20. Emergency cricothyrotomy for trismus caused by instantaneous rigor in cardiac arrest patients.

    Science.gov (United States)

    Lee, Jae Hee; Jung, Koo Young

    2012-07-01

    Instantaneous rigor as muscle stiffening occurring in the moment of death (or cardiac arrest) can be confused with rigor mortis. If trismus is caused by instantaneous rigor, orotracheal intubation is impossible and a surgical airway should be secured. Here, we report 2 patients who had emergency cricothyrotomy for trismus caused by instantaneous rigor. This case report aims to help physicians understand instantaneous rigor and to emphasize the importance of securing a surgical airway quickly on the occurrence of trismus. Copyright © 2012 Elsevier Inc. All rights reserved.

  1. Account for uncertainties of control measurements in the assessment of design margin factors

    International Nuclear Information System (INIS)

    Dementiev, V. G.; Sidorenko, V. D.; Shishkov, L. K.

    2011-01-01

    The paper discusses the feasibility of accounting for the uncertainties of control measurements when estimating design margin factors. Feasibility is also considered in light of the extent to which the processed measured data are corrected by a priori calculated values of the measured parameters. The possibility and feasibility of such data correction are demonstrated by the authors with the help of the well-known Bayes theorem of mathematical statistics. (Authors)
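
    For the Gaussian case, correcting a measured parameter with an a priori calculated value via Bayes' theorem reduces to an inverse-variance weighted average. The sketch below is a generic illustration of that idea with invented numbers, not the authors' specific procedure.

        import numpy as np

        def bayes_gaussian_update(mu_prior, sigma_prior, x_meas, sigma_meas):
            """Posterior mean/std for a Gaussian prior (a priori calculation)
            combined with a Gaussian measurement (control measurement)."""
            w_prior = 1.0 / sigma_prior**2
            w_meas = 1.0 / sigma_meas**2
            mu_post = (w_prior * mu_prior + w_meas * x_meas) / (w_prior + w_meas)
            sigma_post = np.sqrt(1.0 / (w_prior + w_meas))
            return mu_post, sigma_post

        # Illustrative numbers: calculated relative power of an assembly vs. its measurement.
        mu_post, sigma_post = bayes_gaussian_update(mu_prior=1.02, sigma_prior=0.03,
                                                    x_meas=1.06, sigma_meas=0.02)
        print(f"corrected value = {mu_post:.3f} +/- {sigma_post:.3f}")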

  2. Measures of uncertainty, importance and sensitivity of the SEDA code

    International Nuclear Information System (INIS)

    Baron, J.; Caruso, A.; Vinate, H.

    1996-01-01

    The purpose of this work is to estimate the uncertainty in the results of the SEDA code (Sistema de Evaluacion de Dosis en Accidentes) arising from its input data and parameters. The SEDA code has been developed by the Comision Nacional de Energia Atomica for the estimation of doses during emergencies in the vicinity of the Atucha and Embalse nuclear power plants. The user feeds the code with meteorological data, source terms and accident data (timing involved, release height, thermal content of the release, etc.). It is designed to be used during the emergency and to provide fast results that support decision making. The uncertainty in the results of the SEDA code is quantified in the present paper. This uncertainty is associated both with the data the user inputs to the code and with the uncertain parameters of the code's own models. The method used consisted in the statistical characterization of the parameters and variables, assigning them adequate probability distributions. These distributions have been sampled with the Latin Hypercube Sampling method, which is a stratified multi-variable Monte Carlo technique. The code has been run for each of the samples and, finally, a sample of results has been obtained. These results have been characterized from the statistical point of view (obtaining their mean, most probable value, distribution shape, etc.) for several distances from the source. Finally, the Partial Correlation Coefficient and Standard Regression Coefficient techniques have been used to obtain the relative importance of each input variable and the sensitivity of the code to its variations. The measures of importance and sensitivity have been obtained for several distances from the source and various cases of atmospheric stability, making comparisons possible. This work supports confidence in the results of the code and the association of their uncertainty with them, as a way to know the limits within which the results can vary in a real
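
    The sampling-plus-regression part of such a study can be sketched as follows, using SciPy's Latin Hypercube sampler and standardized regression coefficients on a stand-in dose function; the function, parameter names and ranges below are invented for illustration and are not the SEDA model.

        import numpy as np
        from scipy.stats import qmc

        # Stand-in for the dose code: dose as a simple function of wind speed,
        # release height and source term (purely illustrative, not the SEDA model).
        def dose_model(wind, height, source):
            return source * np.exp(-0.05 * height) / np.maximum(wind, 0.5)

        # Latin Hypercube sample of the uncertain inputs, scaled to assumed ranges.
        n = 500
        sample = qmc.LatinHypercube(d=3, seed=1).random(n)
        lower, upper = np.array([1.0, 10.0, 1e10]), np.array([10.0, 100.0, 1e12])
        X = qmc.scale(sample, lower, upper)
        y = dose_model(X[:, 0], X[:, 1], X[:, 2])

        # Standardized regression coefficients as a simple importance measure.
        Xs = (X - X.mean(axis=0)) / X.std(axis=0)
        ys = (y - y.mean()) / y.std()
        src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
        for name, coef in zip(["wind", "height", "source"], src):
            print(f"SRC({name}) = {coef:+.2f}")
        print(f"dose mean = {y.mean():.3g}, 95th percentile = {np.percentile(y, 95):.3g}")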

  3. How to: understanding SWAT model uncertainty relative to measured results

    Science.gov (United States)

    Watershed models are being relied upon to contribute to most policy-making decisions of watershed management, and the demand for an accurate accounting of complete model uncertainty is rising. Generalized likelihood uncertainty estimation (GLUE) is a widely used method for quantifying uncertainty i...
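
    GLUE itself follows a simple recipe: sample parameter sets, score each against observations with a likelihood measure, discard "non-behavioural" sets below a threshold, and form likelihood-weighted prediction bounds. The sketch below is a generic illustration with a toy rainfall-runoff relation standing in for SWAT; the model, threshold and data are invented.

        import numpy as np

        rng = np.random.default_rng(42)

        # Toy "watershed" model standing in for SWAT: flow = a * rain**b
        rain = np.linspace(1.0, 10.0, 30)
        obs = 2.0 * rain**0.7 + rng.normal(0.0, 0.5, rain.size)   # synthetic observations

        # 1. Sample parameters from broad prior ranges.
        n = 5000
        a = rng.uniform(0.5, 5.0, n)
        b = rng.uniform(0.3, 1.2, n)

        # 2. Likelihood measure: Nash-Sutcliffe efficiency for each parameter set.
        sim = a[:, None] * rain[None, :] ** b[:, None]
        nse = 1.0 - np.sum((sim - obs) ** 2, axis=1) / np.sum((obs - obs.mean()) ** 2)

        # 3. Keep behavioural sets above a threshold and weight by likelihood.
        behavioural = nse > 0.7
        w = nse[behavioural] - 0.7
        w /= w.sum()

        # 4. Likelihood-weighted 5-95% prediction bounds at each rain value.
        def weighted_quantile(values, weights, q):
            order = np.argsort(values)
            cum = np.cumsum(weights[order])
            return np.interp(q, cum, values[order])

        sim_b = sim[behavioural]
        bounds = [(weighted_quantile(sim_b[:, j], w, 0.05),
                   weighted_quantile(sim_b[:, j], w, 0.95)) for j in range(rain.size)]
        print(f"{behavioural.sum()} behavioural sets; "
              f"bounds at rain=5: {bounds[int(np.argmin(abs(rain - 5)))]}")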

  4. Assessment of SFR Wire Wrap Simulation Uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Delchini, Marc-Olivier G. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Reactor and Nuclear Systems Division; Popov, Emilian L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Reactor and Nuclear Systems Division; Pointer, William David [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Reactor and Nuclear Systems Division; Swiler, Laura P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-09-30

    Predictive modeling and simulation of nuclear reactor performance and fuel are challenging due to the large number of coupled physical phenomena that must be addressed. Models that will be used for design or operational decisions must be analyzed for uncertainty to ascertain impacts to safety or performance. Rigorous, structured uncertainty analyses are performed by characterizing the model’s input uncertainties and then propagating the uncertainties through the model to estimate output uncertainty. This project is part of the ongoing effort to assess modeling uncertainty in Nek5000 simulations of flow configurations relevant to the advanced reactor applications of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program. Three geometries are under investigation in these preliminary assessments: a 3-D pipe, a 3-D 7-pin bundle, and a single pin from the Thermal-Hydraulic Out-of-Reactor Safety (THORS) facility. Initial efforts have focused on gaining an understanding of Nek5000 modeling options and integrating Nek5000 with Dakota. These tasks are being accomplished by demonstrating the use of Dakota to assess parametric uncertainties in a simple pipe flow problem. This problem is used to optimize performance of the uncertainty quantification strategy and to estimate computational requirements for assessments of complex geometries. A sensitivity analysis with respect to three turbulence models was conducted for turbulent flow in a single wire-wrapped pin (THORS) geometry. Section 2 briefly describes the software tools used in this study and provides appropriate references. Section 3 presents the coupling interface between Dakota and a computational fluid dynamics (CFD) code (Nek5000 or STARCCM+), with details on the workflow, the scripts used for setting up the run, and the scripts used for post-processing the output files. In Section 4, the meshing methods used to generate the THORS and 7-pin bundle meshes are explained. Sections 5, 6 and 7 present numerical results

  5. Rigor or mortis: best practices for preclinical research in neuroscience.

    Science.gov (United States)

    Steward, Oswald; Balice-Gordon, Rita

    2014-11-05

    Numerous recent reports document a lack of reproducibility of preclinical studies, raising concerns about potential lack of rigor. Examples of lack of rigor have been extensively documented and proposals for practices to improve rigor are appearing. Here, we discuss some of the details and implications of previously proposed best practices and consider some new ones, focusing on preclinical studies relevant to human neurological and psychiatric disorders. Copyright © 2014 Elsevier Inc. All rights reserved.

  6. Investigation of systematic uncertainties on the measurement of the top-quark mass using lepton transverse momenta

    CERN Document Server

    The ATLAS collaboration

    2018-01-01

    This study investigates the impact of systematic uncertainties on a top-quark mass ($m_\\text{top}$) measurement in the lepton+jets channel with the ATLAS experiment at the LHC. For the study, simulated $t\\bar{t}$ events with lepton+jets final states at a centre of mass energy of 8 TeV are used. In contrast to other analyses, this study is designed to exploit the dependence of the lepton kinematics on the top-quark mass, by parameterising the lepton's transverse momentum distribution with MC simulations. Due to its different systematic uncertainty, this method can potentially contribute to a more accurate measurement of $m_\\text{top}$. The overall uncertainty in this study is 2.3 GeV, dominated by the current uncertainty on initial and final state radiation. Since the result depends on the modelling of the top-quark transverse momentum, it is sensitive to higher order QCD corrections. The influence of such corrections is estimated by reweighting the next-to-leading-order MC prediction by next-to-next-to-leadin...

  7. Incorporating Plutonium Particle Size Effects in the Assessment of Active Mode Measurement Uncertainty in Passive-Active Neutron Radioassay Systems

    International Nuclear Information System (INIS)

    Blackwood, Larry G.; Harker, Yale D.

    2002-01-01

    Assessment of active mode measurement uncertainty in passive-active neutron radioassay systems used to measure Pu content in nuclear waste is severely hampered by lack of knowledge of the waste Pu particle size distribution, which is a major factor in determining bias in active mode measurements. The sensitivity of active mode measurements to particle size precludes using simulations or surrogate waste forms to estimate uncertainty in active mode measurements when the particle size distribution is not precisely known or inadequately reproduced. An alternative approach is based on a statistical comparison of active and passive mode results in the mass range for which both active and passive mode analyses produce useable measurements. Because passive mode measurements are not particularly sensitive to particle size effects, their uncertainty can be more easily assessed. Once bias corrected, passive mode measurements can serve as confirmatory measurements for the estimation of active mode bias. Further statistical analysis of the errors in measurements leads to precision estimates for the active mode
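
    The statistical comparison described here, in which bias-corrected passive-mode results serve as confirmatory measurements for estimating active-mode bias and precision, can be illustrated with a small paired-measurement sketch on synthetic data; the bias factor, scatter and drum masses below are invented.

        import numpy as np

        rng = np.random.default_rng(7)

        # Synthetic drums in the mass range where both modes give usable results.
        true_pu = rng.uniform(0.5, 5.0, 40)                        # g Pu (illustrative)
        passive = true_pu * rng.normal(1.00, 0.05, true_pu.size)   # ~unbiased, 5% scatter
        active = true_pu * rng.normal(0.85, 0.15, true_pu.size)    # particle-size bias

        # Estimate active-mode bias from the active/passive ratio and its precision.
        ratio = active / passive
        bias = ratio.mean()
        u_bias = ratio.std(ddof=1) / np.sqrt(ratio.size)   # standard uncertainty of the mean
        precision = ratio.std(ddof=1)                      # single-measurement scatter

        print(f"estimated active/passive bias factor = {bias:.3f} +/- {u_bias:.3f}")
        print(f"relative precision of a single active measurement = {precision:.1%}")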

  8. College Readiness in California: A Look at Rigorous High School Course-Taking

    Science.gov (United States)

    Gao, Niu

    2016-01-01

    Recognizing the educational and economic benefits of a college degree, education policymakers at the federal, state, and local levels have made college preparation a priority. There are many ways to measure college readiness, but one key component is rigorous high school coursework. California has not yet adopted a statewide college readiness…

  9. Considering sampling strategy and cross-section complexity for estimating the uncertainty of discharge measurements using the velocity-area method

    Science.gov (United States)

    Despax, Aurélien; Perret, Christian; Garçon, Rémy; Hauet, Alexandre; Belleville, Arnaud; Le Coz, Jérôme; Favre, Anne-Catherine

    2016-02-01

    Streamflow time series provide baseline data for many hydrological investigations. Errors in the data mainly occur through uncertainty in gauging (measurement uncertainty) and uncertainty in the determination of the stage-discharge relationship based on gaugings (rating curve uncertainty). As the velocity-area method is the measurement technique typically used for gaugings, it is fundamental to estimate its level of uncertainty. Different methods are available in the literature (ISO 748, Q+, IVE), all with their own limitations and drawbacks. Among the terms forming the combined relative uncertainty in measured discharge, the uncertainty component relating to the limited number of verticals often makes up a large part of the relative uncertainty. It should therefore be estimated carefully. In the ISO 748 standard, proposed values of this uncertainty component depend only on the number of verticals, without considering their distribution with respect to the depth and velocity cross-sectional profiles. The Q+ method is sensitive to a user-defined parameter, while it is questionable whether the IVE method is applicable to stream-gaugings performed with a limited number of verticals. To address the limitations of existing methods, this paper presents a new methodology, called FLow Analog UnceRtainty Estimation (FLAURE), to estimate the uncertainty component relating to the limited number of verticals. High-resolution reference gaugings (with 31 or more verticals) are used to assess the uncertainty component through a statistical analysis. Instead of subsampling the verticals of these reference stream-gaugings purely randomly, a subsampling method is developed in a way that mimics the behavior of a hydrometric technician. A sampling quality index (SQI) is suggested and appears to be a more explanatory variable than the number of verticals. This index takes into account the spacing between verticals and the variation of unit flow between two verticals. To compute the
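
    The velocity-area gauging that these uncertainty methods apply to is itself straightforward to compute. The sketch below implements the mid-section discharge calculation and a crude random subsampling of verticals to show how the number of verticals affects the spread of the estimate; it is only an illustration on a synthetic cross-section, not the FLAURE subsampling strategy.

        import numpy as np

        rng = np.random.default_rng(3)

        # Synthetic high-resolution gauging: positions (m), depths (m), mean velocities (m/s).
        x = np.linspace(0.0, 30.0, 31)
        depth = 2.0 * np.sin(np.pi * x / 30.0)
        vel = 1.2 * np.sin(np.pi * x / 30.0) ** 0.5

        def midsection_discharge(x, depth, vel):
            """Mid-section method: each vertical represents half the gap to its neighbours."""
            width = np.empty_like(x)
            width[1:-1] = (x[2:] - x[:-2]) / 2.0
            width[0] = (x[1] - x[0]) / 2.0
            width[-1] = (x[-1] - x[-2]) / 2.0
            return np.sum(width * depth * vel)

        q_ref = midsection_discharge(x, depth, vel)

        # Subsample the reference gauging to fewer verticals and look at the spread.
        m = 7
        q_sub = []
        for _ in range(1000):
            idx = np.sort(rng.choice(np.arange(1, 30), size=m, replace=False))
            idx = np.concatenate(([0], idx, [30]))       # keep the banks
            q_sub.append(midsection_discharge(x[idx], depth[idx], vel[idx]))
        q_sub = np.asarray(q_sub)
        print(f"reference Q = {q_ref:.2f} m3/s; with {m} verticals: "
              f"mean {q_sub.mean():.2f}, spread (std) {q_sub.std():.2f} m3/s")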

  10. An Empirical Analysis of Stakeholders' Influence on Policy Development: the Role of Uncertainty Handling

    Directory of Open Access Journals (Sweden)

    Rianne M. Bijlsma

    2011-03-01

    Full Text Available Stakeholder participation is advocated widely, but there is little structured, empirical research into its influence on policy development. We aim to further the insight into the characteristics of participatory policy development by comparing it to expert-based policy development for the same case. We describe the process of problem framing and analysis, as well as the knowledge base used. We apply an uncertainty perspective to reveal differences between the approaches and speculate about possible explanations. We view policy development as a continuous handling of substantive uncertainty and process uncertainty, and investigate how the methods of handling uncertainty of actors influence the policy development. Our findings suggest that the wider frame that was adopted in the participatory approach was the result of a more active handling of process uncertainty. The stakeholders handled institutional uncertainty by broadening the problem frame, and they handled strategic uncertainty by negotiating commitment and by including all important stakeholder criteria in the frame. In the expert-based approach, we observed a more passive handling of uncertainty, apparently to avoid complexity. The experts handled institutional uncertainty by reducing the scope and by anticipating windows of opportunity in other policy arenas. Strategic uncertainty was handled by assuming stakeholders' acceptance of noncontroversial measures that balanced benefits and sacrifices. Three other observations are of interest to the scientific debate on participatory policy processes. Firstly, the participatory policy was less adaptive than the expert-based policy. The observed low tolerance for process uncertainty of participants made them opt for a rigorous "once and for all" settling of the conflict. Secondly, in the participatory approach, actors preferred procedures of traceable knowledge acquisition over controversial topics to handle substantive uncertainty. This

  11. Uncertainty Estimation of Shear-wave Velocity Structure from Bayesian Inversion of Microtremor Array Dispersion Data

    Science.gov (United States)

    Dosso, S. E.; Molnar, S.; Cassidy, J.

    2010-12-01

    Bayesian inversion of microtremor array dispersion data is applied, with evaluation of data errors and model parameterization, to produce the most-probable shear-wave velocity (VS) profile together with quantitative uncertainty estimates. Generally, the most important property characterizing earthquake site response is the subsurface VS structure. The microtremor array method determines phase velocity dispersion of Rayleigh surface waves from multi-instrument recordings of urban noise. Inversion of dispersion curves for VS structure is a non-unique and nonlinear problem such that meaningful evaluation of confidence intervals is required. Quantitative uncertainty estimation requires not only a nonlinear inversion approach that samples models proportional to their probability, but also rigorous estimation of the data error statistics and an appropriate model parameterization. A Bayesian formulation represents the solution of the inverse problem in terms of the posterior probability density (PPD) of the geophysical model parameters. Markov-chain Monte Carlo methods are used with an efficient implementation of Metropolis-Hastings sampling to provide an unbiased sample from the PPD to compute parameter uncertainties and inter-relationships. Nonparametric estimation of a data error covariance matrix from residual analysis is applied with rigorous a posteriori statistical tests to validate the covariance estimate and the assumption of a Gaussian error distribution. The most appropriate model parameterization is determined using the Bayesian information criterion (BIC), which provides the simplest model consistent with the resolving power of the data. Parameter uncertainties are found to be under-estimated when data error correlations are neglected and when compressional-wave velocity and/or density (nuisance) parameters are fixed in the inversion. Bayesian inversion of microtremor array data is applied at two sites in British Columbia, the area of highest seismic risk in
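
    The core of such a Bayesian inversion is a Metropolis-Hastings sampler of the posterior. Below is a deliberately tiny, generic sketch that samples a two-parameter posterior for an invented dispersion-like forward model, just to show the mechanics; it is not the authors' parameterization, error model or data.

        import numpy as np

        rng = np.random.default_rng(0)

        # Toy forward model: phase velocity vs frequency for two parameters (invented).
        freq = np.linspace(1.0, 10.0, 20)
        def forward(vs, h):
            return vs / (1.0 + (freq * h / vs) ** 2) ** 0.25

        # Synthetic "observed" dispersion curve with Gaussian noise.
        theta_true = np.array([400.0, 20.0])                 # Vs (m/s), thickness (m)
        sigma = 5.0
        d_obs = forward(*theta_true) + rng.normal(0.0, sigma, freq.size)

        def log_posterior(theta):
            vs, h = theta
            if not (100.0 < vs < 1000.0 and 1.0 < h < 100.0):   # uniform prior bounds
                return -np.inf
            resid = d_obs - forward(vs, h)
            return -0.5 * np.sum((resid / sigma) ** 2)

        # Metropolis-Hastings with a Gaussian random-walk proposal.
        n_iter, step = 20000, np.array([10.0, 1.0])
        theta = np.array([300.0, 10.0])
        logp = log_posterior(theta)
        samples = []
        for _ in range(n_iter):
            prop = theta + step * rng.normal(size=2)
            logp_prop = log_posterior(prop)
            if np.log(rng.random()) < logp_prop - logp:
                theta, logp = prop, logp_prop
            samples.append(theta)
        samples = np.asarray(samples[n_iter // 2:])          # discard burn-in

        print("posterior mean:", samples.mean(axis=0))
        print("posterior std :", samples.std(axis=0))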

  12. Measurement uncertainty and gauge capability of surface roughness measurements in the automotive industry: a case study

    International Nuclear Information System (INIS)

    Drégelyi-Kiss, Ágota; Czifra, Árpád

    2014-01-01

    The calculation methods of the capability of measurement processes in the automotive industry differ from each other. There are three main calculation methods: MSA, VDA 5 and the international standard, ISO 22514–7. During this research our aim was to compare the capability calculation methods in a case study. Two types of automotive parts (ten pieces of each) are chosen to examine the behaviour of the manufacturing process and to measure the required characteristics of the measurement process being evaluated. The measurement uncertainty of the measuring process is calculated according to the VDA 5 and ISO 22514–7, and MSA guidelines. In this study the conformance of a measurement process in an automotive manufacturing process is determined, and the similarities and the differences between the methods used are shown. (paper)

  13. The effect of temperature on the mechanical aspects of rigor mortis in a liquid paraffin model.

    Science.gov (United States)

    Ozawa, Masayoshi; Iwadate, Kimiharu; Matsumoto, Sari; Asakura, Kumiko; Ochiai, Eriko; Maebashi, Kyoko

    2013-11-01

    Rigor mortis is an important phenomenon to estimate the postmortem interval in forensic medicine. Rigor mortis is affected by temperature. We measured stiffness of rat muscles using a liquid paraffin model to monitor the mechanical aspects of rigor mortis at five temperatures (37, 25, 10, 5 and 0°C). At 37, 25 and 10°C, the progression of stiffness was slower in cooler conditions. At 5 and 0°C, the muscle stiffness increased immediately after the muscles were soaked in cooled liquid paraffin and then muscles gradually became rigid without going through a relaxed state. This phenomenon suggests that it is important to be careful when estimating the postmortem interval in cold seasons. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  14. The uncertainty of reference standards--a guide to understanding factors impacting uncertainty, uncertainty calculations, and vendor certifications.

    Science.gov (United States)

    Gates, Kevin; Chang, Ning; Dilek, Isil; Jian, Huahua; Pogue, Sherri; Sreenivasan, Uma

    2009-10-01

    Certified solution standards are widely used in forensic toxicological, clinical/diagnostic, and environmental testing. Typically, these standards are purchased as ampouled solutions with a certified concentration. Vendors present concentration and uncertainty differently on their Certificates of Analysis. Understanding the factors that impact uncertainty, and which factors have been considered in the vendor's assignment of uncertainty, is critical to understanding the accuracy of the standard and the impact on testing results. Understanding these variables is also important for laboratories seeking to comply with ISO/IEC 17025 requirements and for those preparing reference solutions from neat materials at the bench. The impact of uncertainty associated with the neat material purity (including residual water, residual solvent, and inorganic content), mass measurement (weighing techniques), and solvent addition (solution density) on the overall uncertainty of the certified concentration is described along with uncertainty calculations.
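
    For a solution standard prepared gravimetrically, the concentration model is roughly c = purity x mass / volume, and the GUM combination of the relative standard uncertainties of those factors gives the combined uncertainty. The numbers below are invented; this is a minimal sketch of the calculation, not a vendor's procedure.

        import math

        # Illustrative inputs for a certified solution standard (values invented).
        purity, u_purity = 0.985, 0.004        # mass fraction and its standard uncertainty
        mass_mg, u_mass = 10.02, 0.02          # weighed neat material (mg)
        volume_ml, u_volume = 10.00, 0.01      # solvent volume from density/weighing (mL)

        conc = purity * mass_mg / volume_ml    # mg/mL

        # For a product/quotient model, relative uncertainties add in quadrature (GUM).
        u_rel = math.sqrt((u_purity / purity) ** 2 +
                          (u_mass / mass_mg) ** 2 +
                          (u_volume / volume_ml) ** 2)
        U = 2.0 * u_rel * conc                 # expanded uncertainty, coverage factor k = 2

        print(f"concentration = {conc:.4f} mg/mL, U(k=2) = {U:.4f} mg/mL "
              f"({2 * u_rel:.2%} relative)")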

  15. The uncertainty of measurements. Research on air pollution; Meten is ook onzeker. Lucht in onderzoek

    Energy Technology Data Exchange (ETDEWEB)

    Van den Elshout, S. [DCMR Milieudienst Rijnmond, Rotterdam (Netherlands); Woudenberg, F. [Cluster Leefomgeving, Afdeling Milieu en Gezondheid, GGD Amsterdam, Amsterdam (Netherlands)

    2011-08-15

    Measurements are sometimes suggested as an alternative to uncertain forecasts in legal decision making. However, measurements also entail uncertainties. This article offers several considerations on how to deal with uncertainties in the legal determination of air quality. Beyond the theoretical considerations, in practice one rule always applies: less air pollution is better.

  16. Statistical mechanics rigorous results

    CERN Document Server

    Ruelle, David

    1999-01-01

    This classic book marks the beginning of an era of vigorous mathematical progress in equilibrium statistical mechanics. Its treatment of the infinite system limit has not been superseded, and the discussion of thermodynamic functions and states remains basic for more recent work. The conceptual foundation provided by the Rigorous Results remains invaluable for the study of the spectacular developments of statistical mechanics in the second half of the 20th century.

  17. Quantification of Back-End Nuclear Fuel Cycle Metrics Uncertainties Due to Cross Sections

    International Nuclear Information System (INIS)

    Tracy E. Stover Jr.

    2007-01-01

    models are followed by more rigorous reactor physics and depletion models showing a PWR uranium oxide fuel and various metal fast reactor fuels composed of transuranics. The more rigorous models include multi-group cross sections, multiple burnup steps, neutron transport calculations to update cross sections, and multi-scale multi-physics code sequences to simulate a complete fuel lifetime. Finally, the fast reactor and PWR fuels are combined in a closed fast reactor recycle fuel cycle, and the uncertainties in the resulting equilibrium cycle are examined

  18. High and low rigor temperature effects on sheep meat tenderness and ageing.

    Science.gov (United States)

    Devine, Carrick E; Payne, Steven R; Peachey, Bridget M; Lowe, Timothy E; Ingram, John R; Cook, Christian J

    2002-02-01

    Immediately after electrical stimulation, the paired m. longissimus thoracis et lumborum (LT) of 40 sheep were boned out and wrapped tightly with a polyethylene cling film. One of the paired LTs was chilled in 15°C air to reach a rigor mortis (rigor) temperature of 18°C and the other side was placed in a water bath at 35°C and achieved rigor at this temperature. Wrapping reduced rigor shortening and mimicked meat left on the carcass. After rigor, the meat was aged at 15°C for 0, 8, 26 and 72 h and then frozen. The frozen meat was cooked to 75°C in an 85°C water bath and shear force values were obtained from a 1×1 cm cross-section. The shear force values of meat for 18 and 35°C rigor were similar at zero ageing, but as ageing progressed, the 18°C rigor meat aged faster and became more tender than meat that went into rigor at 35°C. The shear force values for 18 and 35°C rigor at each ageing time were significantly different, and the values for 35°C rigor remained significantly greater. Thus the toughness of 35°C meat was not a consequence of muscle shortening and appears to be due to both a faster rate of tenderisation and the meat tenderising to a greater extent at the lower temperature. The cook loss at 35°C rigor (30.5%) was greater than that at 18°C rigor (28.4%) (P<0.01) and the colour Hunter L values were higher at 35°C (P<0.01) compared with 18°C, but there were no significant differences in a or b values.

  19. Accuracy, reproducibility, and uncertainty analysis of thyroid-probe-based activity measurements for determination of dose calibrator settings.

    Science.gov (United States)

    Esquinas, Pedro L; Tanguay, Jesse; Gonzalez, Marjorie; Vuckovic, Milan; Rodríguez-Rodríguez, Cristina; Häfeli, Urs O; Celler, Anna

    2016-12-01

    In the nuclear medicine department, the activity of radiopharmaceuticals is measured using dose calibrators (DCs) prior to patient injection. The DC consists of an ionization chamber that measures the current generated by ionizing radiation (emitted from the radiotracer). In order to obtain an activity reading, the current is converted into units of activity by applying an appropriate calibration factor (also referred to as the DC dial setting). Accurate determination of DC dial settings is crucial to ensure that patients receive the appropriate dose in diagnostic scans or radionuclide therapies. The goals of this study were (1) to describe a practical method to experimentally determine dose calibrator settings using a thyroid-probe (TP) and (2) to investigate the accuracy, reproducibility, and uncertainties of the method. As an illustration, the TP method was applied to determine 188Re dial settings for two dose calibrator models: Atomlab 100plus and Capintec CRC-55tR. Using the TP to determine dose calibrator settings involved three measurements. First, the energy-dependent efficiency of the TP was determined from energy spectra measurements of two calibration sources (152Eu and 22Na). Second, the gamma emissions from the investigated isotope (188Re) were measured using the TP and its activity was determined using γ-ray spectroscopy methods. Ambient background, scatter, and source-geometry corrections were applied during the efficiency and activity determination steps. Third, the TP-based 188Re activity was used to determine the dose calibrator settings following the calibration curve method [B. E. Zimmerman et al., J. Nucl. Med. 40, 1508-1516 (1999)]. The interobserver reproducibility of TP measurements was determined by the coefficient of variation (COV), and the uncertainties associated with each step of the measuring process were estimated. The accuracy of activity measurements using the proposed method was evaluated by comparing the TP activity estimates of 99mTc

  20. An Uncertainty Quantification Framework for Remote Sensing Retrievals

    Science.gov (United States)

    Braverman, A. J.; Hobbs, J.

    2017-12-01

    Remote sensing data sets produced by NASA and other space agencies are the result of complex algorithms that infer geophysical state from observed radiances using retrieval algorithms. The processing must keep up with the downlinked data flow, and this necessitates computational compromises that affect the accuracies of retrieved estimates. The algorithms are also limited by imperfect knowledge of physics and of ancillary inputs that are required. All of this contributes to uncertainties that are generally not rigorously quantified by stepping outside the assumptions that underlie the retrieval methodology. In this talk we discuss a practical framework for uncertainty quantification that can be applied to a variety of remote sensing retrieval algorithms. Ours is a statistical approach that uses Monte Carlo simulation to approximate the sampling distribution of the retrieved estimates. We will discuss the strengths and weaknesses of this approach, and provide a case-study example from the Orbiting Carbon Observatory 2 mission.
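
    The Monte Carlo approach described here amounts to simulating many (state, radiance) pairs from assumed distributions, running the retrieval on each simulated radiance, and summarising the distribution of retrieval errors. The toy version below uses a linear "instrument" and a least-squares "retrieval" standing in for a real forward model and retrieval algorithm; all matrices and noise levels are invented.

        import numpy as np

        rng = np.random.default_rng(5)

        # Toy forward model: radiances y = K x + noise for a 2-element geophysical state x.
        K = np.array([[1.0, 0.4],
                      [0.2, 1.5],
                      [0.7, 0.7]])
        noise_sd = 0.05
        prior_mean = np.array([1.0, 2.0])
        prior_sd = np.array([0.2, 0.3])

        def retrieval(y):
            """Stand-in retrieval: ordinary least squares (a real retrieval would
            typically be a regularised optimal-estimation fit)."""
            return np.linalg.lstsq(K, y, rcond=None)[0]

        # Monte Carlo: draw states, simulate noisy radiances, retrieve, collect errors.
        n = 5000
        x_true = prior_mean + prior_sd * rng.normal(size=(n, 2))
        y_sim = x_true @ K.T + noise_sd * rng.normal(size=(n, K.shape[0]))
        x_hat = np.array([retrieval(y) for y in y_sim])
        err = x_hat - x_true

        print("bias per state element      :", err.mean(axis=0))
        print("std dev per state element   :", err.std(axis=0))
        print("95% error interval, element 0:", np.percentile(err[:, 0], [2.5, 97.5]))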

  1. Stereo-particle image velocimetry uncertainty quantification

    International Nuclear Information System (INIS)

    Bhattacharya, Sayantan; Vlachos, Pavlos P; Charonko, John J

    2017-01-01

    Particle image velocimetry (PIV) measurements are subject to multiple elemental error sources and thus estimating overall measurement uncertainty is challenging. Recent advances have led to a posteriori uncertainty estimation methods for planar two-component PIV. However, no complete methodology exists for uncertainty quantification in stereo PIV. In the current work, a comprehensive framework is presented to quantify the uncertainty stemming from stereo registration error and combine it with the underlying planar velocity uncertainties. The disparity in particle locations of the dewarped images is used to estimate the positional uncertainty of the world coordinate system, which is then propagated to the uncertainty in the calibration mapping function coefficients. Next, the calibration uncertainty is combined with the planar uncertainty fields of the individual cameras through an uncertainty propagation equation and uncertainty estimates are obtained for all three velocity components. The methodology was tested with synthetic stereo PIV data for different light sheet thicknesses, with and without registration error, and also validated with an experimental vortex ring case from the 2014 PIV challenge. A thorough sensitivity analysis was performed to assess the relative impact of the various parameters on the overall uncertainty. The results suggest that in the absence of any disparity, the stereo PIV uncertainty prediction method is more sensitive to the planar uncertainty estimates than to the angle uncertainty, although the latter is not negligible for non-zero disparity. Overall the presented uncertainty quantification framework showed excellent agreement between the error and uncertainty RMS values for both the synthetic and the experimental data and demonstrated reliable uncertainty prediction coverage. This stereo PIV uncertainty quantification framework provides the first comprehensive treatment on the subject and potentially lays foundations applicable to volumetric
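
    The "uncertainty propagation equation" step, combining planar velocity uncertainties from the two cameras with the calibration/angle uncertainty, is at first order a Jacobian sandwich, cov_out = J Sigma J^T. The sketch below applies that to a deliberately simplified two-camera reconstruction of the in-plane and out-of-plane components from two projected displacements; the geometry, symbols and numbers are illustrative only and are not the paper's full mapping-function treatment.

        import numpy as np

        # Simplified stereo reconstruction: cameras at +/- alpha view projected
        # displacements d1, d2, and the velocities follow from
        #   d1 = u*cos(alpha) + w*sin(alpha),  d2 = u*cos(alpha) - w*sin(alpha).
        alpha = np.deg2rad(35.0)
        d1, d2 = 2.10, 1.85                     # px, illustrative displacements
        u = (d1 + d2) / (2.0 * np.cos(alpha))
        w = (d1 - d2) / (2.0 * np.sin(alpha))

        # Standard uncertainties of the inputs: planar displacement uncertainty from
        # each camera, and the calibration/registration uncertainty of the angle.
        u_d1, u_d2, u_alpha = 0.06, 0.06, np.deg2rad(0.5)

        # Jacobian of (u, w) with respect to (d1, d2, alpha).
        J = np.array([
            [1/(2*np.cos(alpha)),  1/(2*np.cos(alpha)),  (d1 + d2)*np.sin(alpha)/(2*np.cos(alpha)**2)],
            [1/(2*np.sin(alpha)), -1/(2*np.sin(alpha)), -(d1 - d2)*np.cos(alpha)/(2*np.sin(alpha)**2)],
        ])
        Sigma = np.diag([u_d1**2, u_d2**2, u_alpha**2])   # inputs assumed uncorrelated
        cov_uw = J @ Sigma @ J.T

        print(f"u = {u:.3f} +/- {np.sqrt(cov_uw[0, 0]):.3f} px")
        print(f"w = {w:.3f} +/- {np.sqrt(cov_uw[1, 1]):.3f} px")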

  2. Optimal measurement uncertainties for materials accounting in a fast breeder reactor spent-fuel reprocessing plant

    International Nuclear Information System (INIS)

    Dayem, H.A.; Kern, E.A.; Markin, J.T.

    1982-01-01

    Optimization techniques are used to calculate measurement uncertainties for materials accountability instruments in a fast breeder reactor spent-fuel reprocessing plant. Optimal measurement uncertainties are calculated so that performance goals for detecting materials loss are achieved while minimizing the total instrument development cost. Improved materials accounting in the chemical separations process (111 kg Pu/day) to meet 8-kg plutonium abrupt (1 day) and 40-kg plutonium protracted (6 months) loss-detection goals requires: process tank volume and concentration measurements having precisions less than or equal to 1%; accountability and plutonium sample tank volume measurements having precisions less than or equal to 0.3%, short-term correlated errors less than or equal to 0.04%, and long-term correlated errors less than or equal to 0.04%; and accountability and plutonium sample tank concentration measurements having precisions less than or equal to 0.4%, short-term correlated errors less than or equal to 0.1%, and long-term correlated errors less than or equal to 0.05%

  3. Evaluation the sources of uncertainty associated to the measurement results of in vivo monitoring of iodine 131 in the thyroid

    International Nuclear Information System (INIS)

    Gontijo, Rodrigo Modesto Gadelha

    2011-01-01

    In vivo monitoring techniques consist of the identification and quantification of radionuclides present in the whole body and in specific organs and tissues. In vivo monitoring requires the use of detectors which are sensitive to the radiation emitted by the radionuclides present in the monitored individual. The results obtained in measurements may present small uncertainties which are within pre-set limits in monitoring programs for occupationally exposed individuals. However, any device used to determine physical quantities presents uncertainties in the measured values. The total uncertainty of a measurement result is estimated from the propagation of the uncertainties associated with each parameter of the calculation. This study aims to evaluate the sources of uncertainty associated with the measurement results of in vivo monitoring of iodine-131 in the thyroid, in comparison with those suggested in the General Guide for Estimating Effective Doses from Monitoring Data (Project IDEAS/European Community). The reference values used were the ones for high-energy photons (>100 keV). The measurement uncertainties were divided into two categories: type A and type B. The type A component represents the statistical fluctuation in the counting of the standard source. Regarding type B, the following variations were considered: detector positioning over the phantom; variation of background radiation; thickness of the tissue overlying the monitored organ; and distribution of the activity in the organ. Besides the parameters suggested by the IDEAS Guide, the fluctuation of the counting due to phantom repositioning, which represents the reproducibility of the measurement geometry, has also been evaluated. Measurements were performed at the Whole Body Counter Unit of IRD using a 3″×3″ NaI(Tl) scintillation detector and a neck-thyroid phantom developed at LABMIVIRD. Scattering factors were calculated and compared in different counting geometries. The results of this study show that the

  4. Biogenic carbon in combustible waste: Waste composition, variability and measurement uncertainty

    DEFF Research Database (Denmark)

    Larsen, Anna Warberg; Fuglsang, Karsten; Pedersen, Niels H.

    2013-01-01

    described in the literature. This study addressed the variability of biogenic and fossil carbon in combustible waste received at a municipal solid waste incinerator. Two approaches were compared: (1) radiocarbon dating (14C analysis) of carbon dioxide sampled from the flue gas, and (2) mass and energy balances. In addition, the measurement uncertainties related to the two approaches were determined. Two flue gas sampling campaigns at a full-scale waste incinerator were included: one during normal operation and one with controlled waste input. Estimation of carbon contents in the main waste types received was included. Both the 14C method and the balance method represented promising methods able to provide good quality data for the ratio between biogenic and fossil carbon in waste. The relative uncertainty in the individual experiments was 7–10% (95% confidence interval) for the 14C method and slightly lower for the balance method...

  5. Final report on uncertainties in the detection, measurement, and analysis of selected features pertinent to deep geologic repositories

    International Nuclear Information System (INIS)

    1978-01-01

    Uncertainties with regard to many facets of repository site characterization have not yet been quantified. This report summarizes the state of knowledge of uncertainties in the measurement of porosity, hydraulic conductivity, and hydraulic gradient; uncertainties associated with various geophysical field techniques; and uncertainties associated with the effects of exploration and exploitation activities in bedded salt basins. The potential for seepage through a depository in bedded salt or shale is reviewed and, based upon the available data, generic values for the hydraulic conductivity and porosity of bedded salt and shale are proposed

  6. Uncertainty evaluation of fluid dynamic models and validation by gamma ray transmission measurements of the catalyst flow in a FCC cold pilot unity

    International Nuclear Information System (INIS)

    Teles, Francisco A.S.; Santos, Ebenezer F.; Dantas, Carlos C.; Melo, Silvio B.; Santos, Valdemir A. dos; Lima, Emerson A.O.

    2013-01-01

    In this paper, the fluid dynamics of the Fluid Catalytic Cracking (FCC) process is investigated by means of a Cold Flow Pilot Unit (CFPU) constructed in Plexiglas to visualize operational conditions. Axial and radial catalyst profiles were measured by gamma ray transmission in the riser of the CFPU. Standard uncertainty was evaluated in volumetric solid fraction measurements for several concentrations at a given point of the axial profile. Monitoring of the pressure drop in the riser shows a good agreement with the measured standard uncertainty data. A further evaluation of the combined uncertainty was applied to the volumetric solid fraction equation using gamma transmission data. The limit condition of catalyst concentration in the riser was defined, and a simulation with random numbers provided by MATLAB software was used to test the uncertainty evaluation. The Guide to the expression of Uncertainty in Measurement (GUM) is based on the law of propagation of uncertainty and on the characterization of the quantities measured by means of either a Gaussian distribution or a t-distribution, which allows measurement uncertainty to be delimited by means of a confidence interval. A variety of supplements to GUM are being developed, which will progressively enter into effect. The first of these supplements [3] describes an alternative procedure for the calculation of uncertainties: the Monte Carlo Method (MCM). MCM is an alternative to GUM, since it performs a characterization of the quantities measured based on the random sampling of the probability distribution functions. This paper also explains the basic implementation of the MCM method in MATLAB. (author)
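
    The Monte Carlo Method of GUM Supplement 1 propagates distributions rather than just standard uncertainties: each input quantity is sampled from its assigned PDF, the measurand model is evaluated for every draw, and the resulting sample gives the estimate, the standard uncertainty and a coverage interval. The paper implements this in MATLAB; the sketch below shows the same idea in Python for a generic gamma-transmission-style attenuation model, with invented input values, and is not the authors' model of the riser.

        import numpy as np

        rng = np.random.default_rng(11)
        M = 200_000                                  # number of Monte Carlo trials

        # Input quantities with assigned PDFs (GUM Supplement 1 style); values invented.
        I0 = rng.normal(12000.0, 60.0, M)            # incident count rate, Gaussian
        I = rng.normal(9000.0, 50.0, M)              # transmitted count rate, Gaussian
        mu = rng.normal(0.0770, 0.0008, M)           # mass attenuation coeff. (cm2/g)
        x = rng.uniform(9.95, 10.05, M)              # path length (cm), rectangular PDF

        # Measurand model: density from the attenuation law I = I0*exp(-mu*rho*x).
        rho = np.log(I0 / I) / (mu * x)

        estimate = rho.mean()
        u_std = rho.std(ddof=1)
        lo, hi = np.percentile(rho, [2.5, 97.5])     # 95 % coverage interval

        print(f"rho = {estimate:.4f} g/cm3, standard uncertainty = {u_std:.4f}")
        print(f"95 % coverage interval: [{lo:.4f}, {hi:.4f}]")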

  7. Uncertainty evaluation of fluid dynamic models and validation by gamma ray transmission measurements of the catalyst flow in a FCC cold pilot unity

    Energy Technology Data Exchange (ETDEWEB)

    Teles, Francisco A.S.; Santos, Ebenezer F.; Dantas, Carlos C., E-mail: francisco.teles@ufpe.br [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil). Centro de Tecnologia e Geociencias. Departamento de Energia Nuclear; Melo, Silvio B., E-mail: sbm@cin.ufpe.br [Universidade Federal de Pernambuco (CIN/UFPE), Recife, PE (Brazil). Centro de Informatica; Santos, Valdemir A. dos, E-mail: vas@unicap.br [Universidade Catolica de Pernambuco (UNICAP), Recife, PE (Brazil). Dept. de Quimica; Lima, Emerson A.O., E-mail: emathematics@gmail.com [Universidade de Pernambuco (POLI/UPE), Recife, PE (Brazil). Escola Politecnica

    2013-07-01

    In this paper, the fluid dynamics of the Fluid Catalytic Cracking (FCC) process is investigated by means of a Cold Flow Pilot Unit (CFPU) constructed in Plexiglas to visualize operational conditions. Axial and radial catalyst profiles were measured by gamma ray transmission in the riser of the CFPU. Standard uncertainty was evaluated in volumetric solid fraction measurements for several concentrations at a given point of the axial profile. Monitoring of the pressure drop in the riser shows a good agreement with the measured standard uncertainty data. A further evaluation of the combined uncertainty was applied to the volumetric solid fraction equation using gamma transmission data. The limit condition of catalyst concentration in the riser was defined, and a simulation with random numbers provided by MATLAB software was used to test the uncertainty evaluation. The Guide to the expression of Uncertainty in Measurement (GUM) is based on the law of propagation of uncertainty and on the characterization of the quantities measured by means of either a Gaussian distribution or a t-distribution, which allows measurement uncertainty to be delimited by means of a confidence interval. A variety of supplements to GUM are being developed, which will progressively enter into effect. The first of these supplements [3] describes an alternative procedure for the calculation of uncertainties: the Monte Carlo Method (MCM). MCM is an alternative to GUM, since it performs a characterization of the quantities measured based on the random sampling of the probability distribution functions. This paper also explains the basic implementation of the MCM method in MATLAB. (author)

  8. Calculation of uncertainties associated to environmental radioactivity measurements and their functions. Practical Procedure

    International Nuclear Information System (INIS)

    Gasco Leonarte, C.; Anton Mateos, M.P.

    1995-12-01

    This report summarizes the procedure used to calculate the uncertainties associated with environmental radioactivity measurements, focusing on those obtained by radiochemical separation in which tracers have been added. Uncertainties linked to activity concentration calculations, isotopic ratios, inventories, sequential leaching data, chronology dating using the C.R.S. model and duplicate analysis are described in detail. The objective of this article is to serve as a guide for people not familiarized with this kind of calculation, showing clear practical examples. The input of the formulas and all the data needed for these calculations into Lotus 1-2-3 WIN is outlined as well. (Author)

  9. Calculation of uncertainties associated to environmental radioactivity measurements and their functions. Practical Procedure

    International Nuclear Information System (INIS)

    Gasco Leonarte, C; Anton Mateos, M. P.

    1995-01-01

    This report summarizes the procedure used to calculate the uncertainties associated with environmental radioactivity measurements, focusing on those obtained by radiochemical separation in which tracers have been added. Uncertainties linked to activity concentration calculations, isotopic ratios, inventories, sequential leaching data, chronology dating using the C.R.S. model and duplicate analysis are described in detail. The objective of this article is to serve as a guide for people not familiarized with this kind of calculation, showing clear practical examples. The input of the formulas and all the data needed for these calculations into Lotus 1-2-3 WIN is outlined as well. (Author) 13 refs

  10. Uncertainty estimates of purity measurements based on current information: toward a "live validation" of purity methods.

    Science.gov (United States)

    Apostol, Izydor; Kelner, Drew; Jiang, Xinzhao Grace; Huang, Gang; Wypych, Jette; Zhang, Xin; Gastwirt, Jessica; Chen, Kenneth; Fodor, Szilan; Hapuarachchi, Suminda; Meriage, Dave; Ye, Frank; Poppe, Leszek; Szpankowski, Wojciech

    2012-12-01

    To predict precision and other performance characteristics of chromatographic purity methods, which represent the most widely used form of analysis in the biopharmaceutical industry, we have conducted a comprehensive survey of purity methods and show that all performance characteristics fall within narrow measurement ranges. This observation was used to develop a model called Uncertainty Based on Current Information (UBCI), which expresses these performance characteristics as a function of the signal and noise levels, hardware specifications, and software settings. We applied the UBCI model to assess the uncertainty of purity measurements, and compared the results to those from conventional qualification. We demonstrated that the UBCI model is suitable to dynamically assess method performance characteristics, based on information extracted from individual chromatograms. The model provides an opportunity for streamlining qualification and validation studies by implementing a "live validation" of test results utilizing UBCI as a concurrent assessment of measurement uncertainty. Therefore, UBCI can potentially mitigate the challenges associated with laborious conventional method validation and facilitates the introduction of more advanced analytical technologies during the method lifecycle.

  11. Rigorous simulation: a tool to enhance decision making

    Energy Technology Data Exchange (ETDEWEB)

    Neiva, Raquel; Larson, Mel; Baks, Arjan [KBC Advanced Technologies plc, Surrey (United Kingdom)

    2012-07-01

    The world refining industries continue to be challenged by population growth (increased demand), regional market changes and the pressure of regulatory requirements to operate a 'green' refinery. Environmental regulations are reducing the value and use of heavy fuel oils, and are leading refiners to convert more of the heavier products, or even heavier crude, into lighter products while meeting increasingly stringent transportation fuel specifications. As a result, actions are required to establish a sustainable advantage for future success. Rigorous simulation provides a key advantage, improving the timing and efficient use of capital investment and maximizing profitability. Sustainably maximizing profit through rigorous modeling is achieved through enhanced performance monitoring and improved Linear Programme (LP) model accuracy. This paper contains examples of these two items; the combination of both increases overall rates of return. As refiners consider optimizing existing assets and expanding projects, the process agreed upon to achieve these goals is key to a successful profit improvement. Rigorous kinetic simulation with detailed fractionation allows for optimizing existing asset utilization while focusing the capital investment on the new unit(s), and therefore optimizing the overall strategic plan and return on investment. Monitoring of individual process units works as a mechanism for validating and optimizing plant performance. Unit monitoring is important to rectify poor performance and increase profitability. The key to a good LP is the accuracy of the data used to generate the LP sub-model data. The value of rigorous unit monitoring is that the results are heat- and mass-balanced consistently, and are unique to a refiner's unit/refinery. With the improved match of the refinery operation, the rigorous simulation models will capture more accurately the non-linearity of those process units and therefore provide correct

  12. Rigorous solution to Bargmann-Wigner equation for integer spin

    CERN Document Server

    Huang Shi Zhong; Wu Ning; Zheng Zhi Peng

    2002-01-01

    A rigorous method is developed to solve the Bargmann-Wigner equation for arbitrary integer spin in coordinate representation in a step-by-step way. The Bargmann-Wigner equation is first transformed to a form easier to solve; the new equations are then solved rigorously in coordinate representation, and the wave functions in a closed form are thus derived.

  13. Characterisation of soft magnetic materials by measurement: Evaluation of uncertainties up to 1.8 T and 9 kHz

    Science.gov (United States)

    Elfgen, S.; Franck, D.; Hameyer, K.

    2018-04-01

    Magnetic measurements are indispensable for the characterization of soft magnetic materials used e.g. in electrical machines. Characteristic values are used for quality control during production and for the parametrization of material models. Uncertainties and errors in the measurements are reflected directly in the parameters of the material models. This can result in over-dimensioning and inaccuracies in simulations for the design of electrical machines. Therefore, existing influencing factors in the characterization of soft magnetic materials are named and their resulting uncertainty contributions are studied. The analysis of the resulting uncertainty contributions can serve the operator as an additional selection criterion for different measuring sensors. The investigation is performed for measurements within and outside the currently prescribed standard using a single sheet tester, and its impact on the identification of iron loss parameters is studied.

  14. A computer simulation platform for the estimation of measurement uncertainties in dimensional X-ray computed tomography

    DEFF Research Database (Denmark)

    Hiller, Jochen; Reindl, Leonard M

    2012-01-01

    The knowledge of measurement uncertainty is of great importance in conformance testing in production. The tolerance limit for production must be reduced by the amounts of measurement uncertainty to ensure that the parts are in fact within the tolerance. Over the last 5 years, industrial X-ray computed tomography (CT) has become an important technology for dimensional quality control. In this paper a computer simulation platform is presented which is able to investigate error sources in dimensional CT measurements. The typical workflow in industrial CT metrology is described and methods … taking into account the main error sources for the measurement. This method has the potential to deal with all kinds of systematic and random errors that influence a dimensional CT measurement. A case study demonstrates the practical application of the VCT simulator using numerically generated CT data and statistical …

  15. RIGOR MORTIS AND THE INFLUENCE OF CALCIUM AND MAGNESIUM SALTS UPON ITS DEVELOPMENT.

    Science.gov (United States)

    Meltzer, S J; Auer, J

    1908-01-01

    Calcium salts hasten and magnesium salts retard the development of rigor mortis, that is, when these salts are administered subcutaneously or intravenously. When injected intra-arterially, concentrated solutions of both kinds of salts cause nearly an immediate onset of a strong stiffness of the muscles which is apparently a contraction, brought on by a stimulation caused by these salts and due to osmosis. This contraction, if strong, passes over without a relaxation into a real rigor. This form of rigor may be classed as work-rigor (Arbeitsstarre). In animals, at least in frogs, with intact cords, the early contraction and the following rigor are stronger than in animals with destroyed cord. If M/8 solutions-nearly equimolecular to "physiological" solutions of sodium chloride-are used, even when injected intra-arterially, calcium salts hasten and magnesium salts retard the onset of rigor. The hastening and retardation in this case as well as in the cases of subcutaneous and intravenous injections, are ion effects and essentially due to the cations, calcium and magnesium. In the rigor hastened by calcium the effects of the extensor muscles mostly prevail; in the rigor following magnesium injection, on the other hand, either the flexor muscles prevail or the muscles become stiff in the original position of the animal at death. There seems to be no difference in the degree of stiffness in the final rigor, only the onset and development of the rigor is hastened in the case of the one salt and retarded in the other. Calcium hastens also the development of heat rigor. No positive facts were obtained with regard to the effect of magnesium upon heat rigor. Calcium also hastens and magnesium retards the onset of rigor in the left ventricle of the heart. No definite data were gathered with regard to the effects of these salts upon the right ventricle.

  16. Estimation of uncertainty in tracer gas measurement of air change rates.

    Science.gov (United States)

    Iizuka, Atsushi; Okuizumi, Yumiko; Yanagisawa, Yukio

    2010-12-01

    Simple and economical measurement of air change rates can be achieved with a passive-type tracer gas doser and sampler. However, this is made more complex by the fact that many buildings are not a single fully mixed zone. This means many measurements are required to obtain information on ventilation conditions. In this study, we evaluated the uncertainty of tracer gas measurement of air change rates in n completely mixed zones. A single measurement with one tracer gas could be used to simply estimate the air change rate when n = 2. Accurate air change rates could not be obtained for n ≥ 2 due to a lack of information. However, the proposed method can be used to estimate an air change rate with an accuracy of … air change rate can be avoided. The proposed estimation method will be useful in practical ventilation measurements.
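
    As a minimal illustration of the single-zone case that underlies such tracer gas methods, the sketch below estimates an air change rate from a constant tracer emission rate and a measured steady-state concentration, and propagates the input uncertainties in the usual first-order (GUM) way. The function names and numbers are hypothetical and do not reproduce the paper's n-zone analysis.

```python
import math

def air_change_rate(q_m3_per_h, c_ppm, volume_m3):
    """Steady-state single-zone estimate: N = q / (C * V).
    q: tracer emission rate, C: measured tracer concentration (volume fraction),
    V: zone volume. Returns air changes per hour."""
    c_frac = c_ppm * 1e-6
    return q_m3_per_h / (c_frac * volume_m3)

def air_change_rate_uncertainty(n, rel_u_q, rel_u_c, rel_u_v):
    """First-order propagation for the pure product/quotient model."""
    return n * math.sqrt(rel_u_q**2 + rel_u_c**2 + rel_u_v**2)

N = air_change_rate(q_m3_per_h=0.001, c_ppm=2.0, volume_m3=250.0)
u_N = air_change_rate_uncertainty(N, rel_u_q=0.05, rel_u_c=0.10, rel_u_v=0.03)
print(f"N = {N:.2f} 1/h +/- {u_N:.2f} 1/h")
```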

  17. Uncertainty budget for k0-NAA

    International Nuclear Information System (INIS)

    Robouch, P.; Arana, G.; Eguskiza, M.; Etxebarria, N.

    2000-01-01

    The concepts of the Guide to the Expression of Uncertainty in Measurement (GUM) as applied to chemical measurements and the recommendations of the Eurachem document 'Quantifying Uncertainty in Analytical Methods' are applied to set up the uncertainty budget for k0-NAA. The 'universally applicable spreadsheet technique' described by KRAGTEN is applied to the k0-NAA basic equations for the computation of uncertainties. The variance components - individual standard uncertainties - highlight the contribution and the importance of the different parameters to be taken into account. (author)
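
    A minimal sketch of the Kragten spreadsheet technique mentioned above: each input is perturbed by its standard uncertainty, the resulting change in the output is recorded as that input's contribution, and the contributions are combined in quadrature. The toy model and numbers are placeholders, not the actual k0-NAA equations.

```python
import numpy as np

def kragten_budget(f, x, u):
    """Kragten-style numerical uncertainty propagation.
    f : callable taking a 1-D array of input quantities
    x : best estimates of the inputs
    u : standard uncertainties of the inputs (assumed uncorrelated)
    Returns the combined standard uncertainty and per-input contributions."""
    x = np.asarray(x, dtype=float)
    u = np.asarray(u, dtype=float)
    y0 = f(x)
    contrib = np.empty_like(u)
    for i in range(len(x)):
        xi = x.copy()
        xi[i] += u[i]               # perturb one input by its standard uncertainty
        contrib[i] = f(xi) - y0     # approximately c_i * u_i for that input
    return np.sqrt(np.sum(contrib**2)), contrib

# toy ratio-type model standing in for the k0-NAA basic equation
model = lambda p: (p[0] / p[1]) * (1.0 / p[2]) * (p[3] / p[4])
u_c, parts = kragten_budget(model,
                            x=[1200.0, 950.0, 0.85, 1.0, 0.02],
                            u=[15.0, 10.0, 0.01, 0.005, 0.0004])
print(u_c, parts)
```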

  18. Estimation of Uncertainty in Tracer Gas Measurement of Air Change Rates

    Directory of Open Access Journals (Sweden)

    Atsushi Iizuka

    2010-12-01

    Full Text Available Simple and economical measurement of air change rates can be achieved with a passive-type tracer gas doser and sampler. However, this is made more complex by the fact that many buildings are not a single fully mixed zone. This means many measurements are required to obtain information on ventilation conditions. In this study, we evaluated the uncertainty of tracer gas measurement of air change rates in n completely mixed zones. A single measurement with one tracer gas could be used to simply estimate the air change rate when n = 2. Accurate air change rates could not be obtained for n ≥ 2 due to a lack of information. However, the proposed method can be used to estimate an air change rate with an accuracy of …

  19. Uncertainty analysis guide

    International Nuclear Information System (INIS)

    Andres, T.H.

    2002-05-01

    This guide applies to the estimation of uncertainty in quantities calculated by scientific, analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner by which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)
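
    As a small illustration of one of the propagation options listed in the guide (Monte Carlo sampling of input uncertainties through a model, with a systematic component combined afterwards), consider the sketch below. The model and numbers are hypothetical and are not taken from the AECL guide.

```python
import numpy as np

rng = np.random.default_rng(42)

def propagate_mc(model, means, sds, n=100_000):
    """Monte Carlo propagation of independent, normally distributed input
    uncertainties through a computer model; returns output mean and
    random-uncertainty estimate (sample standard deviation)."""
    samples = rng.normal(loc=means, scale=sds, size=(n, len(means)))
    out = np.apply_along_axis(model, 1, samples)
    return out.mean(), out.std(ddof=1)

# toy model: output = flow * (T_out - T_in) with uncertain inputs
model = lambda p: p[0] * (p[2] - p[1])
mean, u_random = propagate_mc(model, means=[1.5, 290.0, 320.0], sds=[0.03, 0.5, 0.5])

# a systematic (model-form) component, estimated separately, is then combined
u_systematic = 0.8
u_combined = np.hypot(u_random, u_systematic)
print(mean, u_random, u_combined)
```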

  20. Uncertainty analysis guide

    Energy Technology Data Exchange (ETDEWEB)

    Andres, T.H

    2002-05-01

    This guide applies to the estimation of uncertainty in quantities calculated by scientific, analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner by which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)

  1. A Peep into the Uncertainty-Complexity-Relevance Modeling Trilemma through Global Sensitivity and Uncertainty Analysis

    Science.gov (United States)

    Munoz-Carpena, R.; Muller, S. J.; Chu, M.; Kiker, G. A.; Perz, S. G.

    2014-12-01

    Model complexity resulting from the need to integrate environmental system components cannot be overstated. In particular, additional emphasis is urgently needed on rational approaches to guide decision making through the uncertainties surrounding the integrated system across decision-relevant scales. However, in spite of the difficulties that the consideration of modeling uncertainty represents for the decision process, it should not be avoided, or the value and science behind the models will be undermined. These two issues, i.e., the need for coupled models that can answer the pertinent questions and the need for models that do so with sufficient certainty, are the key indicators of a model's relevance. Model relevance is inextricably linked with model complexity. Although model complexity has advanced greatly in recent years, there has been little work to rigorously characterize the threshold of relevance in integrated and complex models. Formally assessing the relevance of the model in the face of increasing complexity would be valuable because there is growing unease among developers and users of complex models about the cumulative effects of various sources of uncertainty on model outputs. In particular, this issue has prompted doubt over whether the considerable effort going into further elaborating complex models will in fact yield the expected payback. New approaches have been proposed recently to evaluate the uncertainty-complexity-relevance modeling trilemma (Muller, Muñoz-Carpena and Kiker, 2011) by incorporating state-of-the-art global sensitivity and uncertainty analysis (GSA/UA) in every step of the model development so as to quantify not only the uncertainty introduced by the addition of new environmental components, but also the effect that these new components have on existing components (interactions, non-linear responses). Outputs from the analysis can also be used to quantify system resilience (stability, alternative states, thresholds or tipping

  2. Analysis of uncertainties and detection limits for the double measurement method of ⁹⁰Sr and ⁸⁹Sr

    Energy Technology Data Exchange (ETDEWEB)

    Herranz, M., E-mail: m.herranz@ehu.es [Department of Nuclear Engineering and Fluid Mechanics, University of the Basque Country (UPV/EHU), Alameda de Urquijo s/n 48013 Bilbao (Spain); Idoeta, R.; Legarda, F. [Department of Nuclear Engineering and Fluid Mechanics, University of the Basque Country (UPV/EHU), Alameda de Urquijo s/n 48013 Bilbao (Spain)

    2011-08-15

    The determination of the ⁹⁰Sr and ⁸⁹Sr contents in a sample, although it involves their radiochemical isolation, always results in a complex measurement process due to the interferences among their respective beta emissions and also among those of the daughter of ⁹⁰Sr, ⁹⁰Y, itself a beta emitter. In this paper, the process consisting of a double measurement after the Sr radiochemical isolation is analyzed, and the formulae to obtain activity concentrations, uncertainties and detection limits are developed. A study of the trend of uncertainties and detection limits as a function of the time at which the first measurement after isolation is performed, the delay between the two measurements, and the activity concentration of each strontium isotope in the sample is carried out as well. Results show that with a very precise determination of the times involved in the whole process (isolation, measurement and duration of measurements) this method permits a reliable assessment of both strontium radioisotopes. The sooner the first measurement is made after isolation and the longer the delay chosen between the two measurements, the lower the detection limits and the uncertainties of the activities obtained. - Highlights: > The double measurement method for ⁹⁰Sr and ⁸⁹Sr determination is analysed. > Uncertainties and detection limits are determined and their dependences studied. > Proposals for the optimization of the method are given.
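
    A hedged sketch of the algebra behind such a double measurement: if each of the two count rates is a known linear combination of the two activities at separation time, the activities follow from a 2x2 solve and their uncertainties from standard covariance propagation. The sensitivity coefficients below are hypothetical placeholders; the paper derives them from the decay and ingrowth factors and the counting times.

```python
import numpy as np

def solve_double_measurement(M, u_M, coeff):
    """M: two measured net count rates; u_M: their standard uncertainties;
    coeff: 2x2 matrix [[a1, b1], [a2, b2]] so that M_k = a_k*A90 + b_k*A89.
    Returns the activities (A90, A89) and their standard uncertainties."""
    C = np.asarray(coeff, dtype=float)
    A = np.linalg.solve(C, np.asarray(M, dtype=float))
    Cinv = np.linalg.inv(C)
    cov = Cinv @ np.diag(np.square(u_M)) @ Cinv.T   # propagate measurement variances
    return A, np.sqrt(np.diag(cov))

# hypothetical coefficients bundling efficiencies, 90Y ingrowth and 89Sr decay
A, uA = solve_double_measurement(M=[125.0, 98.0], u_M=[2.5, 2.2],
                                 coeff=[[0.40, 0.85], [0.62, 0.70]])
print(A, uA)
```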

  3. Analysis of Uncertainties in Infrared Camera Measurements of a Turbofan Engine in an Altitude Test Cell

    National Research Council Canada - National Science Library

    Morris, Thomas

    2004-01-01

    ... from the facility structure, hot exhaust gases, and the measurement equipment itself. The atmosphere and a protective ZnSe window that shields the camera from the hot engine exhaust also introduce measurement uncertainty due to attenuation...

  4. A Statistical Modeling Framework for Characterising Uncertainty in Large Datasets: Application to Ocean Colour

    Directory of Open Access Journals (Sweden)

    Peter E. Land

    2018-05-01

    Full Text Available Uncertainty estimation is crucial to establishing confidence in any data analysis, and this is especially true for Essential Climate Variables, including ocean colour. Methods for deriving uncertainty vary greatly across data types, so a generic statistics-based approach applicable to multiple data types is an advantage to simplify the use and understanding of uncertainty data. Progress towards rigorous uncertainty analysis of ocean colour has been slow, in part because of the complexity of ocean colour processing. Here, we present a general approach to uncertainty characterisation, using a database of satellite-in situ matchups to generate a statistical model of satellite uncertainty as a function of its contributing variables. With an example NASA MODIS-Aqua chlorophyll-a matchups database mostly covering the North Atlantic, we demonstrate a model that explains 67% of the squared error in log(chlorophyll-a) as a potentially correctable bias, with the remaining uncertainty being characterised as standard deviation and standard error at each pixel. The method is quite general, depending only on the existence of a suitable database of matchups or reference values, and can be applied to other sensors and data types such as other satellite observed Essential Climate Variables, empirical algorithms derived from in situ data, or even model data.

  5. Evaluating uncertainty estimates in hydrologic models: borrowing measures from the forecast verification community

    Directory of Open Access Journals (Sweden)

    K. J. Franz

    2011-11-01

    Full Text Available The hydrologic community is generally moving towards the use of probabilistic estimates of streamflow, primarily through the implementation of Ensemble Streamflow Prediction (ESP) systems, ensemble data assimilation methods, or multi-modeling platforms. However, evaluation of probabilistic outputs has not necessarily kept pace with ensemble generation. Much of the modeling community is still performing model evaluation using standard deterministic measures, such as error, correlation, or bias, typically applied to the ensemble mean or median. Probabilistic forecast verification methods have been well developed, particularly in the atmospheric sciences, yet few have been adopted for evaluating uncertainty estimates in hydrologic model simulations. In the current paper, we overview existing probabilistic forecast verification methods and apply the methods to evaluate and compare model ensembles produced from two different parameter uncertainty estimation methods: the Generalized Likelihood Uncertainty Estimation (GLUE) and the Shuffled Complex Evolution Metropolis (SCEM). Model ensembles are generated for the National Weather Service SACramento Soil Moisture Accounting (SAC-SMA) model for 12 forecast basins located in the Southeastern United States. We evaluate the model ensembles using relevant metrics in the following categories: distribution, correlation, accuracy, conditional statistics, and categorical statistics. We show that the presented probabilistic metrics are easily adapted to model simulation ensembles and provide a robust analysis of model performance associated with parameter uncertainty. Application of these methods requires no information in addition to what is already available as part of traditional model validation methodology and considers the entire ensemble or uncertainty range in the approach.
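
    As one concrete example of the kind of ensemble verification measure referred to above, the sketch below computes a sample-based continuous ranked probability score (CRPS) for a single forecast. Whether CRPS is among the exact metrics used in the paper is an assumption; it is shown here only as a representative probabilistic score.

```python
import numpy as np

def crps_ensemble(ensemble, obs):
    """Sample-based continuous ranked probability score for one forecast:
    CRPS = E|X - y| - 0.5 * E|X - X'|  (lower is better)."""
    x = np.asarray(ensemble, dtype=float)
    term1 = np.mean(np.abs(x - obs))
    term2 = 0.5 * np.mean(np.abs(x[:, None] - x[None, :]))
    return term1 - term2

# e.g. a hypothetical 50-member streamflow ensemble versus the observed flow
rng = np.random.default_rng(0)
ens = rng.lognormal(mean=3.0, sigma=0.3, size=50)
print(crps_ensemble(ens, obs=22.0))
```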

  6. SYSTEMATIC UNCERTAINTIES IN THE SPECTROSCOPIC MEASUREMENTS OF NEUTRON STAR MASSES AND RADII FROM THERMONUCLEAR X-RAY BURSTS. III. ABSOLUTE FLUX CALIBRATION

    Energy Technology Data Exchange (ETDEWEB)

    Güver, Tolga [Istanbul University, Science Faculty, Department of Astronomy and Space Sciences, Beyazıt, 34119, Istanbul (Turkey); Özel, Feryal; Psaltis, Dimitrios [Department of Astronomy, University of Arizona, 933 N. Cherry Avenue, Tucson, AZ 85721 (United States); Marshall, Herman [Center for Space Research, Massachusetts Institute of Technology, Cambridge, MA 02139 (United States); Guainazzi, Matteo [European Space Astronomy Centre of ESA, P.O. Box 78, Villanueva de la Cañada, E-28691 Madrid (Spain); Díaz-Trigo, Maria [ESO, Karl-Schwarzschild-Strasse 2, D-85748 Garching bei München (Germany)

    2016-09-20

    Many techniques for measuring neutron star radii rely on absolute flux measurements in the X-rays. As a result, one of the fundamental uncertainties in these spectroscopic measurements arises from the absolute flux calibrations of the detectors being used. Using the stable X-ray burster, GS 1826–238, and its simultaneous observations by Chandra HETG/ACIS-S and RXTE/PCA as well as by XMM-Newton EPIC-pn and RXTE/PCA, we quantify the degree of uncertainty in the flux calibration by assessing the differences between the measured fluxes during bursts. We find that the RXTE/PCA and the Chandra gratings measurements agree with each other within their formal uncertainties, increasing our confidence in these flux measurements. In contrast, XMM-Newton EPIC-pn measures 14.0 ± 0.3% less flux than the RXTE/PCA. This is consistent with the previously reported discrepancy with the flux measurements of EPIC-pn, compared with EPIC MOS1, MOS2, and ACIS-S detectors. We also show that any intrinsic time-dependent systematic uncertainty that may exist in the calibration of the satellites has already been implicitly taken into account in the neutron star radius measurements.

  7. Force Measurement Services at KEBS: An Overview of Equipment, Procedures and Uncertainty

    Science.gov (United States)

    Bangi, J. O.; Maranga, S. M.; Nganga, S. P.; Mutuli, S. M.

    This paper describes the facilities, instrumentation and procedures currently used in the force laboratory at the Kenya Bureau of Standards (KEBS) for force measurement services. The laboratory uses the Force Calibration Machine (FCM) to calibrate force-measuring instruments. The FCM derives its traceability via comparisons using reference transfer force transducers calibrated by the Force Standard Machines (FSM) of a National Metrology Institute (NMI). The force laboratory is accredited to ISO/IEC 17025 by the German accreditation body (DAkkS). The accredited measurement scope of the laboratory extends to 1 MN for the calibration of force transducers in both compression and tension modes. ISO 376 procedures are used when calibrating force transducers. The KEBS reference transfer standards have capacities of 10, 50, 300 and 1000 kN to cover the full range of the FCM. The uncertainties in the forces measured by the FCM were reviewed and determined in accordance with the new EURAMET calibration guide. The relative expanded uncertainty of the force realized by the FCM was evaluated over the range from 10 kN to 1 MN and was found to be 5.0 × 10⁻⁴ with the coverage factor k equal to 2. The overall normalized error (En) of the comparison results was also found to be less than 1. The accredited Calibration and Measurement Capability (CMC) of the KEBS force laboratory was based on the results of those intercomparisons. The FCM enables KEBS to provide traceability for the calibration of class '1' force instruments as per ISO 376.
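
    A minimal sketch of the two quantities quoted above: the expanded uncertainty U = k·u and the normalized error En used to judge comparison results (|En| < 1 passes). The example values are hypothetical.

```python
def expanded_uncertainty(u_combined, k=2.0):
    """Expanded uncertainty U = k * u_c, roughly 95% coverage for k = 2."""
    return k * u_combined

def normalized_error(x_lab, U_lab, x_ref, U_ref):
    """En number for an intercomparison point: |x_lab - x_ref| divided by the
    root-sum-square of the expanded (k=2) uncertainties of both values."""
    return abs(x_lab - x_ref) / (U_lab**2 + U_ref**2) ** 0.5

# illustrative 100 kN comparison point (all values hypothetical)
U_lab = expanded_uncertainty(u_combined=0.025)
print(normalized_error(x_lab=100.012, U_lab=U_lab, x_ref=100.000, U_ref=0.03))
```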

  8. The uncertainties calculation of acoustic method for measurement of dissipative properties of heterogeneous non-metallic materials

    Directory of Open Access Journals (Sweden)

    Мaryna O. Golofeyeva

    2015-12-01

    Full Text Available The effective use of heterogeneous non-metallic materials and structures requires the measurement of reliable values of their dissipation characteristics, as well as of the common factors governing their change during loading. Aim: The aim of this study is to prepare the uncertainty budget for the measurement of the dissipative properties of composite materials. Materials and Methods: The method used to study the vibrational energy dissipation characteristics, based on the coupling between the vibration damping decrement and the acoustic velocity in a non-metallic heterogeneous material, is reviewed. The proposed method allows finding the dependence of damping on the vibration amplitude and frequency of the strain-stress state of the material. Results: The accuracy of the measurement method was investigated for the determination of the vibration damping decrement in synthegran. The international approach for the evaluation of measurement quality is used, which includes the internationally accepted rules for the expression and summation of uncertainties; these rules serve as an internationally acknowledged measure of confidence in measurement results, including testing. The uncertainty budget of the acoustic method for the measurement of the dissipative properties of materials was compiled. Conclusions: Two groups of causes were identified that result in errors during the measurement of the dissipative properties of materials. The first group comprises changes of the parameters of the calibrated impact within tolerance limits, displacement of the sensor on repeated placement at the measurement point, variation of the thickness of the contact-agent layer because of irregular hold-down of the transducers to the control surface, reading inaccuracy, etc. The second group of errors is linked with errors in measuring the density and Poisson's ratio, the distance between sensors, and the time difference between the signals of the vibroacoustic sensors.

  9. What information on measurement uncertainty should be communicated to clinicians, and how?

    Science.gov (United States)

    Plebani, Mario; Sciacovelli, Laura; Bernardi, Daniela; Aita, Ada; Antonelli, Giorgia; Padoan, Andrea

    2018-02-02

    The communication of laboratory results to physicians and the quality of reports represent fundamental requirements of the post-analytical phase in order to assure the correct interpretation and utilization of laboratory information. Accordingly, the International Standard for clinical laboratory accreditation (ISO 15189) requires that "laboratory reports shall include the information necessary for the interpretation of the examination results". Measurement uncertainty (MU) is an inherent property of any quantitative measurement result which expresses the lack of knowledge of the true value and quantifies the uncertainty of a result, incorporating the factors known to influence it. Even if MU is not included in the report attributes of ISO 15189 and cannot be considered a post-analytical requirement, it is suggested as information which should facilitate an appropriate interpretation of quantitative results (quantity values). Therefore, MU has two intended uses: for laboratory professionals, it gives information about the quality of measurements, providing evidence of compliance with analytical performance characteristics; for physicians (and patients), it may help in the interpretation of measurement results, especially when values are compared with reference intervals or clinical decision limits, providing objective information. Here we describe the way in which MU should be added to laboratory reports in order to facilitate the interpretation of laboratory results, connecting the efforts performed within the laboratory to provide more accurate and reliable results with a more objective tool for their interpretation by physicians. Copyright © 2018 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  10. LOFT uncertainty-analysis methodology

    International Nuclear Information System (INIS)

    Lassahn, G.D.

    1983-01-01

    The methodology used for uncertainty analyses of measurements in the Loss-of-Fluid Test (LOFT) nuclear-reactor-safety research program is described and compared with other methodologies established for performing uncertainty analyses

  11. LOFT uncertainty-analysis methodology

    International Nuclear Information System (INIS)

    Lassahn, G.D.

    1983-01-01

    The methodology used for uncertainty analyses of measurements in the Loss-of-Fluid Test (LOFT) nuclear reactor safety research program is described and compared with other methodologies established for performing uncertainty analyses

  12. Measurement uncertainties of long-term 222Rn averages at environmental levels using alpha track detectors

    International Nuclear Information System (INIS)

    Nelson, R.A.

    1987-01-01

    More than 250 replicate measurements of outdoor Rn concentration integrated over quarterly periods were made to estimate the random component of the measurement uncertainty of Track Etch detectors (type F) under outdoor conditions. The measurements were performed around three U mill tailings piles to provide a range of environmental concentrations. The measurement uncertainty was typically greater than could be accounted for by Poisson counting statistics. Average coefficients of variation of the order of 20% for all measured concentrations were found. It is concluded that alpha track detectors can be successfully used to determine annual average outdoor Rn concentrations through the use of careful quality control procedures. These include rapid deployment and collection of detectors to minimize unintended Rn exposure, careful packaging and shipping to and from the manufacturer, use of direct sunlight shields for all detectors and careful and secure mounting of all detectors in as similar a manner as possible. The use of multiple (at least duplicate) detectors at each monitoring location and an exposure period of no less than one quarter are suggested
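
    A small sketch of the statistic behind the quoted result: the coefficient of variation of co-located replicate detectors, which can then be compared against what Poisson counting statistics alone would predict. The example values are hypothetical.

```python
import numpy as np

def coefficient_of_variation(replicates):
    """Relative random uncertainty of replicate detector results for one
    location and quarter: CV = s / mean."""
    r = np.asarray(replicates, dtype=float)
    return r.std(ddof=1) / r.mean()

# hypothetical quarterly-average Rn results (Bq/m3) from co-located detectors
print(coefficient_of_variation([26.0, 31.5, 24.8, 29.9]))

# for comparison, Poisson counting alone would give roughly 1/sqrt(track count)
print(1.0 / np.sqrt(400))   # about 5% for a hypothetical 400 counted tracks
```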

  13. Integrating measuring uncertainty of tactile and optical coordinate measuring machines in the process capability assessment of micro injection moulding

    DEFF Research Database (Denmark)

    Tosello, Guido; Hansen, Hans Nørgaard; Gasparin, Stefania

    2010-01-01

    Process capability of micro injection moulding was investigated in this paper by calculating the Cp and Cpk statistics. The uncertainty of both the optical and the tactile measuring systems employed in the quality control of micro injection moulded products was assessed and compared with the specified tolerances. Limits in terms of manufacturing process capability, as well as of suitability of such measuring systems when employed for micro production inspection, were quantitatively determined.
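
    One common way to fold measurement uncertainty into a capability assessment is to tighten the tolerance band by the expanded uncertainty on each side before computing Cp and Cpk; the sketch below does exactly that. This guard-banding scheme and the numbers are assumptions for illustration, not necessarily the treatment used in the paper.

```python
def process_capability(mean, sd, lsl, usl, u_meas=0.0, k=2.0):
    """Cp and Cpk with the tolerance band tightened on each side by the
    expanded measurement uncertainty U = k * u_meas (guard-banded zone)."""
    lsl_eff, usl_eff = lsl + k * u_meas, usl - k * u_meas
    cp = (usl_eff - lsl_eff) / (6.0 * sd)
    cpk = min(usl_eff - mean, mean - lsl_eff) / (3.0 * sd)
    return cp, cpk

# hypothetical micro-moulded feature: 500 um +/- 10 um, u_meas = 1.5 um
print(process_capability(mean=502.0, sd=2.0, lsl=490.0, usl=510.0, u_meas=1.5))
```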

  14. Rigorous bounds on the free energy of electron-phonon models

    NARCIS (Netherlands)

    Raedt, Hans De; Michielsen, Kristel

    1997-01-01

    We present a collection of rigorous upper and lower bounds to the free energy of electron-phonon models with linear electron-phonon interaction. These bounds are used to compare different variational approaches. It is shown rigorously that the ground states corresponding to the sharpest bounds do

  15. Estimation of the thermal diffusion coefficient in fusion plasmas taking frequency measurement uncertainties into account

    International Nuclear Information System (INIS)

    Van Berkel, M; Hogeweij, G M D; Van den Brand, H; De Baar, M R; Zwart, H J; Vandersteen, G

    2014-01-01

    In this paper, the estimation of the thermal diffusivity from perturbative experiments in fusion plasmas is discussed. The measurements used to estimate the thermal diffusivity suffer from stochastic noise. Accurate estimation of the thermal diffusivity should take this into account. It will be shown that formulas found in the literature often result in a thermal diffusivity that has a bias (a difference between the estimated value and the actual value that remains even if more measurements are added) or have an unnecessarily large uncertainty. This will be shown by modeling a plasma using only diffusion as heat transport mechanism and measurement noise based on ASDEX Upgrade measurements. The Fourier coefficients of a temperature perturbation will exhibit noise from the circular complex normal distribution (CCND). Based on Fourier coefficients distributed according to a CCND, it is shown that the resulting probability density function of the thermal diffusivity is an inverse non-central chi-squared distribution. The thermal diffusivity that is found by sampling this distribution will always be biased, and averaging of multiple estimated diffusivities will not necessarily improve the estimation. Confidence bounds are constructed to illustrate the uncertainty in the diffusivity using several formulas that are equivalent in the noiseless case. Finally, a different method of averaging, that reduces the uncertainty significantly, is suggested. The methodology is also extended to the case where damping is included, and it is explained how to include the cylindrical geometry. (paper)

  16. Development and application of objective uncertainty measures for nuclear power plant transient analysis

    International Nuclear Information System (INIS)

    Vinai, P.

    2007-10-01

    For the development, design and licensing of a nuclear power plant (NPP), a sound safety analysis is necessary to study the diverse physical phenomena involved in the system behaviour under operational and transient conditions. Such studies are based on detailed computer simulations. With the progress achieved in computer technology and the greater availability of experimental and plant data, the use of best estimate codes for safety evaluations has gained increasing acceptance. The application of best estimate safety analysis has raised new problems that need to be addressed: it has become more crucial to assess how reliable code predictions are, especially when they need to be compared against safety limits that must not be crossed. It becomes necessary to identify and quantify the various possible sources of uncertainty that affect the reliability of the results. Currently, such uncertainty evaluations are generally based on experts' opinion. In the present research, a novel methodology based on a non-parametric statistical approach has been developed for the objective quantification of best-estimate code uncertainties related to the physical models used in the code. The basis is an evaluation of the accuracy of a given physical model, achieved by comparing its predictions with experimental data from an appropriate set of separate-effect tests. The differences between measurements and predictions can be considered stochastically distributed, and thus a statistical approach can be employed. The first step was the development of a procedure for investigating the dependence of a given physical model's accuracy on the experimental conditions. Each separate-effect test effectively provides a random sample of discrepancies between measurements and predictions, corresponding to a location in the state space defined by a certain number of independent system variables. As a consequence, the samples of 'errors', achieved from analysis of the entire database, are

  17. Quality properties of pre- and post-rigor beef muscle after interventions with high frequency ultrasound.

    Science.gov (United States)

    Sikes, Anita L; Mawson, Raymond; Stark, Janet; Warner, Robyn

    2014-11-01

    The delivery of a consistent quality product to the consumer is vitally important for the food industry. The aim of this study was to investigate the potential for using high frequency ultrasound applied to pre- and post-rigor beef muscle on the metabolism and subsequent quality. High frequency ultrasound (600 kHz at 48 kPa and 65 kPa acoustic pressure) applied to post-rigor beef striploin steaks resulted in no significant effect on the texture (peak force value) of cooked steaks as measured by a Tenderometer. There was no added benefit of ultrasound treatment above that of the normal ageing process after ageing of the steaks for 7 days at 4 °C. Ultrasound treatment of post-rigor beef steaks resulted in a darkening of fresh steaks but after ageing for 7 days at 4 °C, the ultrasound-treated steaks were similar in colour to that of the aged, untreated steaks. High frequency ultrasound (2 MHz at 48 kPa acoustic pressure) applied to pre-rigor beef neck muscle had no effect on the pH, but the calculated exhaustion factor suggested that there was some effect on metabolism and actin-myosin interaction. However, the resultant texture of cooked, ultrasound-treated muscle was lower in tenderness compared to the control sample. After ageing for 3 weeks at 0 °C, the ultrasound-treated samples had the same peak force value as the control. High frequency ultrasound had no significant effect on the colour parameters of pre-rigor beef neck muscle. This proof-of-concept study showed no effect of ultrasound on quality but did indicate that the application of high frequency ultrasound to pre-rigor beef muscle shows potential for modifying ATP turnover and further investigation is warranted. Crown Copyright © 2014. Published by Elsevier B.V. All rights reserved.

  18. On the relationship between micro and macro correlations in nuclear measurement uncertainties

    International Nuclear Information System (INIS)

    Smith, D.L.

    1987-01-01

    Consideration is given to the propagation of micro correlations between the component experimental errors (corresponding to diverse attributes of the measurement process) through to the macro correlations between the total errors in the final derived experimental values. Whenever certain micro correlations cannot be precisely specified, the macro correlations must also be uncertain. However, on the basis of fundamental principles from mathematical statistics, it is shown that these uncertainties in the macro correlations can be substantially smaller than the individual uncertainties for specific micro correlations, provided that the number of distinct attributes contributing to the total experimental error is reasonably large. Furthermore, the resulting macro correlations are shown to be approximately normally distributed regardless of the distributions assumed for the micro correlations. Examples are provided to demonstrate these concepts and to illustrate their relevance to experimental nuclear research. (orig.)
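
    A small numerical sketch of the relationship discussed above, assuming the total error of each value is the sum of independent attribute components: the macro correlation is then a weighted combination of the micro correlations, and sampling uncertain micro correlations shows how little the macro value spreads when many attributes contribute. The component sizes below are hypothetical.

```python
import numpy as np

def macro_correlation(s1, s2, rho_micro):
    """Correlation between the total errors of two values whose errors are sums
    of independent attribute components:
    rho_macro = sum_k rho_k * s1_k * s2_k / (S1 * S2)."""
    s1, s2, rho = map(np.asarray, (s1, s2, rho_micro))
    S1, S2 = np.sqrt(np.sum(s1**2)), np.sqrt(np.sum(s2**2))
    return np.sum(rho * s1 * s2) / (S1 * S2)

# five error attributes; micro correlations only known to lie in [0, 1]
rng = np.random.default_rng(1)
s1 = np.array([1.0, 0.5, 0.8, 0.3, 1.2])
s2 = np.array([0.9, 0.6, 0.7, 0.4, 1.0])
samples = [macro_correlation(s1, s2, rng.uniform(0.0, 1.0, 5)) for _ in range(2000)]
print(np.mean(samples), np.std(samples))   # the spread is much smaller than 1
```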

  19. Uncertainty quantification for proton–proton fusion in chiral effective field theory

    Energy Technology Data Exchange (ETDEWEB)

    Acharya, B. [Department of Physics and Astronomy, University of Tennessee, Knoxville, TN 37996 (United States); Carlsson, B.D. [Department of Physics, Chalmers University of Technology, SE-412 96 Göteborg (Sweden); Ekström, A. [Department of Physics and Astronomy, University of Tennessee, Knoxville, TN 37996 (United States); Physics Division, Oak Ridge National Laboratory, Oak Ridge, TN 37831 (United States); Department of Physics, Chalmers University of Technology, SE-412 96 Göteborg (Sweden); Forssén, C. [Department of Physics, Chalmers University of Technology, SE-412 96 Göteborg (Sweden); Platter, L., E-mail: lplatter@utk.edu [Department of Physics and Astronomy, University of Tennessee, Knoxville, TN 37996 (United States); Physics Division, Oak Ridge National Laboratory, Oak Ridge, TN 37831 (United States)

    2016-09-10

    We compute the S-factor of the proton–proton (pp) fusion reaction using chiral effective field theory (χEFT) up to next-to-next-to-leading order (NNLO) and perform a rigorous uncertainty analysis of the results. We quantify the uncertainties due to (i) the computational method used to compute the pp cross section in momentum space, (ii) the statistical uncertainties in the low-energy coupling constants of χEFT, (iii) the systematic uncertainty due to the χEFT cutoff, and (iv) systematic variations in the database used to calibrate the nucleon–nucleon interaction. We also examine the robustness of the polynomial extrapolation procedure, which is commonly used to extract the threshold S-factor and its energy-derivatives. By performing a statistical analysis of the polynomial fit of the energy-dependent S-factor at several different energy intervals, we eliminate a systematic uncertainty that can arise from the choice of the fit interval in our calculations. In addition, we explore the statistical correlations between the S-factor and few-nucleon observables such as the binding energies and point-proton radii of ²,³H and ³He as well as the D-state probability and quadrupole moment of ²H, and the β-decay of ³H. We find that, with the state-of-the-art optimization of the nuclear Hamiltonian, the statistical uncertainty in the threshold S-factor cannot be reduced beyond 0.7%.

  20. Towards a quantitative, measurement-based estimate of the uncertainty in photon mass attenuation coefficients at radiation therapy energies

    Science.gov (United States)

    Ali, E. S. M.; Spencer, B.; McEwen, M. R.; Rogers, D. W. O.

    2015-02-01

    In this study, a quantitative estimate is derived for the uncertainty in the XCOM photon mass attenuation coefficients in the energy range of interest to external beam radiation therapy—i.e. 100 keV (orthovoltage) to 25 MeV—using direct comparisons of experimental data against Monte Carlo models and theoretical XCOM data. Two independent datasets are used. The first dataset is from our recent transmission measurements and the corresponding EGSnrc calculations (Ali et al 2012 Med. Phys. 39 5990-6003) for 10-30 MV photon beams from the research linac at the National Research Council Canada. The attenuators are graphite and lead, with a total of 140 data points and an experimental uncertainty of ˜0.5% (k = 1). An optimum energy-independent cross section scaling factor that minimizes the discrepancies between measurements and calculations is used to deduce cross section uncertainty. The second dataset is from the aggregate of cross section measurements in the literature for graphite and lead (49 experiments, 288 data points). The dataset is compared to the sum of the XCOM data plus the IAEA photonuclear data. Again, an optimum energy-independent cross section scaling factor is used to deduce the cross section uncertainty. Using the average result from the two datasets, the energy-independent cross section uncertainty estimate is 0.5% (68% confidence) and 0.7% (95% confidence). The potential for energy-dependent errors is discussed. Photon cross section uncertainty is shown to be smaller than the current qualitative ‘envelope of uncertainty’ of the order of 1-2%, as given by Hubbell (1999 Phys. Med. Biol 44 R1-22).
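
    A minimal sketch of the energy-independent scaling step described above: a single factor f is chosen to minimize the weighted squared discrepancies between measured and calculated values, and the resulting factor and its spread feed the cross-section uncertainty estimate. The data points below are hypothetical, and the simple linear weighting is an assumption rather than the exact procedure of the paper.

```python
import numpy as np

def optimal_scale_factor(measured, calculated, u_measured):
    """Energy-independent scale factor f minimizing
    chi2(f) = sum ((m_i - f * c_i) / u_i)^2, with its standard uncertainty."""
    m, c, u = map(np.asarray, (measured, calculated, u_measured))
    w = 1.0 / u**2
    f = np.sum(w * m * c) / np.sum(w * c**2)
    u_f = 1.0 / np.sqrt(np.sum(w * c**2))
    return f, u_f

# hypothetical transmission-like data points with ~0.5% experimental uncertainty
m = np.array([0.981, 0.952, 0.923])
c = np.array([0.978, 0.955, 0.927])
print(optimal_scale_factor(m, c, 0.005 * m))
```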

  1. Reducing, Maintaining, or Escalating Uncertainty? The Development and Validation of Four Uncertainty Preference Scales Related to Cancer Information Seeking and Avoidance.

    Science.gov (United States)

    Carcioppolo, Nick; Yang, Fan; Yang, Qinghua

    2016-09-01

    Uncertainty is a central characteristic of many aspects of cancer prevention, screening, diagnosis, and treatment. Brashers's (2001) uncertainty management theory details the multifaceted nature of uncertainty and describes situations in which uncertainty can both positively and negatively affect health outcomes. The current study extends theory on uncertainty management by developing four scale measures of uncertainty preferences in the context of cancer. Two national surveys were conducted to validate the scales and assess convergent and concurrent validity. Results support the factor structure of each measure and provide general support across multiple validity assessments. These scales can advance research on uncertainty and cancer communication by providing researchers with measures that address multiple aspects of uncertainty management.

  2. Deriving proper measurement uncertainty from Internal Quality Control data: An impossible mission?

    Science.gov (United States)

    Ceriotti, Ferruccio

    2018-03-30

    Measurement uncertainty (MU) is a "non-negative parameter characterizing the dispersion of the quantity values being attributed to a measurand, based on the information used". In the clinical laboratory the most convenient way to calculate MU is the "top down" approach based on the use of Internal Quality Control data. As indicated in the definition, MU depends on the information used for its calculation, and so different estimates of MU can be obtained. The most problematic aspect is how to deal with bias. In fact, bias is difficult to detect and quantify, and it should be corrected, including only the uncertainty derived from this correction. Several approaches to calculate MU starting from Internal Quality Control data are presented. The minimum requirement is to use only the intermediate precision data, provided that they include 6 months of results obtained with a commutable quality control material at a concentration close to the clinical decision limit. This approach is the minimal requirement and is convenient for all those measurands that are used mainly for monitoring, or for which a reference measurement system does not exist and so a reference for calculating the bias is lacking. Other formulas are presented and commented on, including the uncertainty of the value of the calibrator, the bias from a commutable certified reference material or from a material specifically prepared for trueness verification, or the bias derived from External Quality Assessment schemes or from the historical mean of the laboratory. MU is an important parameter, but a single, agreed-upon way to calculate it in a clinical laboratory is not yet available. Copyright © 2018 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
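
    A minimal sketch of the 'minimum requirement' approach described above, assuming the bias is either negligible or already corrected so that only the uncertainty of that correction (and optionally of the calibrator value) enters the combination; the names and numbers are placeholders.

```python
import numpy as np

def mu_top_down(qc_results, u_cal=0.0, u_bias=0.0):
    """Top-down MU estimate from Internal Quality Control data:
    relative intermediate precision over >= 6 months of a commutable control,
    combined in quadrature with the relative calibrator and bias-correction
    uncertainties. Returns the relative combined and expanded (k=2) MU."""
    x = np.asarray(qc_results, dtype=float)
    u_rw = x.std(ddof=1) / x.mean()            # relative intermediate precision
    u_c = np.sqrt(u_rw**2 + u_cal**2 + u_bias**2)
    return u_c, 2.0 * u_c

# hypothetical 6-month daily QC series near a clinical decision limit
rng = np.random.default_rng(7)
qc = rng.normal(loc=5.2, scale=0.12, size=180)
print(mu_top_down(qc, u_cal=0.01))
```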

  3. Controlled source electromagnetic data analysis with seismic constraints and rigorous uncertainty estimation in the Black Sea

    Science.gov (United States)

    Gehrmann, R. A. S.; Schwalenberg, K.; Hölz, S.; Zander, T.; Dettmer, J.; Bialas, J.

    2016-12-01

    In 2014 an interdisciplinary survey was conducted as part of the German SUGAR project in the Western Black Sea targeting gas hydrate occurrences in the Danube Delta. Marine controlled source electromagnetic (CSEM) data were acquired with an inline seafloor-towed array (BGR), and a two-polarization horizontal ocean-bottom source and receiver configuration (GEOMAR). The CSEM data are co-located with high-resolution 2-D and 3-D seismic reflection data (GEOMAR). We present results from 2-D regularized inversion (MARE2DEM by Kerry Key), which provides a smooth model of the electrical resistivity distribution beneath the source and multiple receivers. The 2-D approach includes seafloor topography and structural constraints from seismic data. We estimate uncertainties from the regularized inversion and compare them to 1-D Bayesian inversion results. The probabilistic inversion for a layered subsurface treats the parameter values and the number of layers as unknown by applying reversible-jump Markov-chain Monte Carlo sampling. A non-diagonal data covariance matrix obtained from residual error analysis accounts for correlated errors. The resulting resistivity models show generally high resistivity values between 3 and 10 Ωm on average which can be partly attributed to depleted pore water salinities due to sea-level low stands in the past, and locally up to 30 Ωm which is likely caused by gas hydrates. At the base of the gas hydrate stability zone resistivities rise up to more than 100 Ωm which could be due to gas hydrate as well as a layer of free gas underneath. However, the deeper parts also show the largest model parameter uncertainties. Archie's Law is used to derive estimates of the gas hydrate saturation, which vary between 30 and 80% within the anomalous layers considering salinity and porosity profiles from a distant DSDP bore hole.
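
    As a hedged illustration of the saturation step mentioned above, the sketch below applies Archie's law with assumed cementation and saturation exponents; the specific values of a, m, n, porosity and pore-water resistivity are placeholders, not those used in the study.

```python
def hydrate_saturation(rt, rw, porosity, a=1.0, m=2.0, n=2.0):
    """Gas hydrate saturation from Archie's law:
    S_w = ((a * R_w) / (phi**m * R_t))**(1/n),  S_h = 1 - S_w.
    rt: measured formation resistivity, rw: pore-water resistivity."""
    s_w = ((a * rw) / (porosity**m * rt)) ** (1.0 / n)
    return 1.0 - min(s_w, 1.0)

# hypothetical values: 30 ohm-m anomaly, 0.5 ohm-m pore water, 45% porosity
print(hydrate_saturation(rt=30.0, rw=0.5, porosity=0.45))   # roughly 0.7
```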

  4. Recent Development in Rigorous Computational Methods in Dynamical Systems

    OpenAIRE

    Arai, Zin; Kokubu, Hiroshi; Pilarczyk, Paweł

    2009-01-01

    We highlight selected results of recent development in the area of rigorous computations which use interval arithmetic to analyse dynamical systems. We describe general ideas and selected details of different ways of approach and we provide specific sample applications to illustrate the effectiveness of these methods. The emphasis is put on a topological approach, which combined with rigorous calculations provides a broad range of new methods that yield mathematically rel...

  5. Rigor mortis development in turkey breast muscle and the effect of electrical stunning.

    Science.gov (United States)

    Alvarado, C Z; Sams, A R

    2000-11-01

    Rigor mortis development in turkey breast muscle and the effect of electrical stunning on this process are not well characterized. Some electrical stunning procedures have been known to inhibit postmortem (PM) biochemical reactions, thereby delaying the onset of rigor mortis in broilers. Therefore, this study was designed to characterize rigor mortis development in stunned and unstunned turkeys. A total of 154 turkey toms in two trials were conventionally processed at 20 to 22 wk of age. Turkeys were either stunned with a pulsed direct current (500 Hz, 50% duty cycle) at 35 mA (40 V) in a saline bath for 12 seconds or left unstunned as controls. At 15 min and 1, 2, 4, 8, 12, and 24 h PM, pectoralis samples were collected to determine pH, R-value, L* value, sarcomere length, and shear value. In Trial 1, the samples obtained for pH, R-value, and sarcomere length were divided into surface and interior samples. There were no significant differences between the surface and interior samples among any parameters measured. Muscle pH significantly decreased over time in stunned and unstunned birds through 2 h PM. The R-values increased to 8 h PM in unstunned birds and 24 h PM in stunned birds. The L* values increased over time, with no significant differences after 1 h PM for the controls and 2 h PM for the stunned birds. Sarcomere length increased through 2 h PM in the controls and 12 h PM in the stunned fillets. Cooked meat shear values decreased through the 1 h PM deboning time in the control fillets and 2 h PM in the stunned fillets. These results suggest that stunning delayed the development of rigor mortis through 2 h PM, but had no significant effect on the measured parameters at later time points, and that deboning turkey breasts at 2 h PM or later will not significantly impair meat tenderness.

  6. Measurement Uncertainty of Liquid Chromatographic Analyses Visualized by Ishikawa Diagrams

    OpenAIRE

    Meyer, Veronika R.

    2017-01-01

    Ishikawa, or cause-and-effect diagrams, help to visualize the parameters that influence a chromatographic analysis. Therefore, they facilitate the set up of the uncertainty budget of the analysis, which can then be expressed in mathematical form. If the uncertainty is calculated as the Gaussian sum of all uncertainty parameters, it is necessary to quantitate them all, a task that is usually not practical. The other possible approach is to use the intermediate precision as a base for the uncer...

  7. Development of Property Models with Uncertainty Estimate for Process Design under Uncertainty

    DEFF Research Database (Denmark)

    Hukkerikar, Amol; Sarup, Bent; Abildskov, Jens

    … more reliable predictions with a new and improved set of model parameters for GC (group contribution) based and CI (atom connectivity index) based models and to quantify the uncertainties in the estimated property values from a process design point-of-view. This includes: (i) parameter estimation using … The comparison of model prediction uncertainties with reported range of measurement uncertainties is presented for the properties with related available data. The application of the developed methodology to quantify the effect of these uncertainties on the design of different unit operations (distillation column, …) … the developed methodology can be used to quantify the sensitivity of process design to uncertainties in property estimates; obtain rationally the risk/safety factors in process design; and identify additional experimentation needs in order to reduce most critical uncertainties.

  8. Quantification of tomographic PIV uncertainty using controlled experimental measurements.

    Science.gov (United States)

    Liu, Ning; Wu, Yue; Ma, Lin

    2018-01-20

    The goal of this work was to experimentally quantify the uncertainty of three-dimensional (3D) and three-component (3C) velocity measurements using tomographic particle image velocimetry (tomo-PIV). Controlled measurements were designed using tracer particles embedded in a solid sample, and tomo-PIV measurements were performed on the sample while it was moved both translationally and rotationally to simulate various known displacement fields, so the 3D3C displacements measured by tomo-PIV can be directly compared to the known displacements created by the sample. The results illustrated that (1) the tomo-PIV technique was able to reconstruct the 3D3C velocity with an averaged error of 0.8-1.4 voxels in terms of magnitude and 1.7°-1.9° in terms of orientation for the velocity fields tested; (2) view registration (VR) plays a significant role in tomo-PIV, and by reducing VR error from 0.6° to 0.1°, the 3D3C measurement accuracy can be improved by at least 2.5 times in terms of both magnitude and orientation; and (3) the use of additional cameras in tomo-PIV can extend the 3D3C velocity measurement to a larger volume, while maintaining acceptable accuracy. These results obtained from controlled tests are expected to aid the error analysis and the design of tomo-PIV measurements.
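
    A small sketch of how such controlled comparisons can be scored, using one plausible definition of the magnitude and orientation errors between measured vectors and the known displacement of the solid sample; the exact error definitions used in the paper may differ, and the vectors below are hypothetical.

```python
import numpy as np

def displacement_errors(measured, reference):
    """Mean magnitude error (same units as the vectors, e.g. voxels) and mean
    orientation error (degrees) between measured 3D3C displacement vectors
    and the known reference displacements."""
    m = np.asarray(measured, dtype=float)
    r = np.asarray(reference, dtype=float)
    mag_err = np.linalg.norm(m - r, axis=1)
    cosang = np.sum(m * r, axis=1) / (np.linalg.norm(m, axis=1) * np.linalg.norm(r, axis=1))
    ang_err = np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))
    return mag_err.mean(), ang_err.mean()

# hypothetical vectors (voxels) for a pure translation of (5, 0, 0)
meas = np.array([[5.3, 0.4, -0.2], [4.8, -0.3, 0.5]])
ref = np.tile([5.0, 0.0, 0.0], (2, 1))
print(displacement_errors(meas, ref))
```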

  9. Probabilistic measures of climate change vulnerability, adaptation action benefits, and related uncertainty from maximum temperature metric selection

    Science.gov (United States)

    DeWeber, Jefferson T.; Wagner, Tyler

    2018-01-01

    Predictions of the projected changes in species distributions and potential adaptation action benefits can help guide conservation actions. There is substantial uncertainty in projecting species distributions into an unknown future, however, which can undermine confidence in predictions or misdirect conservation actions if not properly considered. Recent studies have shown that the selection of alternative climate metrics describing very different climatic aspects (e.g., mean air temperature vs. mean precipitation) can be a substantial source of projection uncertainty. It is unclear, however, how much projection uncertainty might stem from selecting among highly correlated, ecologically similar climate metrics (e.g., maximum temperature in July, maximum 30‐day temperature) describing the same climatic aspect (e.g., maximum temperatures) known to limit a species’ distribution. It is also unclear how projection uncertainty might propagate into predictions of the potential benefits of adaptation actions that might lessen climate change effects. We provide probabilistic measures of climate change vulnerability, adaptation action benefits, and related uncertainty stemming from the selection of four maximum temperature metrics for brook trout (Salvelinus fontinalis), a cold‐water salmonid of conservation concern in the eastern United States. Projected losses in suitable stream length varied by as much as 20% among alternative maximum temperature metrics for mid‐century climate projections, which was similar to variation among three climate models. Similarly, the regional average predicted increase in brook trout occurrence probability under an adaptation action scenario of full riparian forest restoration varied by as much as .2 among metrics. Our use of Bayesian inference provides probabilistic measures of vulnerability and adaptation action benefits for individual stream reaches that properly address statistical uncertainty and can help guide conservation

  10. Probabilistic measures of climate change vulnerability, adaptation action benefits, and related uncertainty from maximum temperature metric selection.

    Science.gov (United States)

    DeWeber, Jefferson T; Wagner, Tyler

    2018-06-01

    Predictions of the projected changes in species distributions and potential adaptation action benefits can help guide conservation actions. There is substantial uncertainty in projecting species distributions into an unknown future, however, which can undermine confidence in predictions or misdirect conservation actions if not properly considered. Recent studies have shown that the selection of alternative climate metrics describing very different climatic aspects (e.g., mean air temperature vs. mean precipitation) can be a substantial source of projection uncertainty. It is unclear, however, how much projection uncertainty might stem from selecting among highly correlated, ecologically similar climate metrics (e.g., maximum temperature in July, maximum 30-day temperature) describing the same climatic aspect (e.g., maximum temperatures) known to limit a species' distribution. It is also unclear how projection uncertainty might propagate into predictions of the potential benefits of adaptation actions that might lessen climate change effects. We provide probabilistic measures of climate change vulnerability, adaptation action benefits, and related uncertainty stemming from the selection of four maximum temperature metrics for brook trout (Salvelinus fontinalis), a cold-water salmonid of conservation concern in the eastern United States. Projected losses in suitable stream length varied by as much as 20% among alternative maximum temperature metrics for mid-century climate projections, which was similar to variation among three climate models. Similarly, the regional average predicted increase in brook trout occurrence probability under an adaptation action scenario of full riparian forest restoration varied by as much as .2 among metrics. Our use of Bayesian inference provides probabilistic measures of vulnerability and adaptation action benefits for individual stream reaches that properly address statistical uncertainty and can help guide conservation actions. Our

  11. Development of a Dynamic Lidar Uncertainty Framework

    Energy Technology Data Exchange (ETDEWEB)

    Newman, Jennifer [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Clifton, Andrew [WindForS; Bonin, Timothy [CIRES/NOAA ESRL; Choukulkar, Aditya [CIRES/NOAA ESRL; Brewer, W. Alan [NOAA ESRL; Delgado, Ruben [University of Maryland Baltimore County

    2017-08-07

    As wind turbine sizes increase and wind energy expands to more complex and remote sites, remote-sensing devices such as lidars are expected to play a key role in wind resource assessment and power performance testing. The switch to remote-sensing devices represents a paradigm shift in the way the wind industry typically obtains and interprets measurement data for wind energy. For example, the measurement techniques and sources of uncertainty for a remote-sensing device are vastly different from those associated with a cup anemometer on a meteorological tower. Current IEC standards for quantifying remote sensing device uncertainty for power performance testing consider uncertainty due to mounting, calibration, and classification of the remote sensing device, among other parameters. Values of the uncertainty are typically given as a function of the mean wind speed measured by a reference device and are generally fixed, leading to climatic uncertainty values that apply to the entire measurement campaign. However, real-world experience and a consideration of the fundamentals of the measurement process have shown that lidar performance is highly dependent on atmospheric conditions, such as wind shear, turbulence, and aerosol content. At present, these conditions are not directly incorporated into the estimated uncertainty of a lidar device. In this presentation, we describe the development of a new dynamic lidar uncertainty framework that adapts to current flow conditions and more accurately represents the actual uncertainty inherent in lidar measurements under different conditions. In this new framework, sources of uncertainty are identified for estimation of the line-of-sight wind speed and reconstruction of the three-dimensional wind field. These sources are then related to physical processes caused by the atmosphere and lidar operating conditions. The framework is applied to lidar data from a field measurement site to assess the ability of the framework to predict
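
    A dynamic framework of this kind can be pictured as re-evaluating the uncertainty budget for every averaging interval instead of assigning one fixed value per wind-speed bin. The sketch below is a generic illustration of that idea, not the framework described in the presentation; the condition-dependent terms and their coefficients are invented placeholders.

```python
import math

def lidar_speed_uncertainty(u_base, shear_exponent, turbulence_intensity, cnr_db):
    """Combine illustrative uncertainty terms (m/s) for one 10-minute interval.

    u_base: baseline wind-speed uncertainty from calibration/classification.
    The remaining terms are placeholder models of condition-dependent effects.
    """
    u_shear = 0.3 * abs(shear_exponent - 0.14)        # probe-volume error grows with shear
    u_turb = 2.0 * turbulence_intensity * 0.1         # reconstruction error grows with TI
    u_signal = 0.05 if cnr_db > -20 else 0.25         # low carrier-to-noise ratio penalized
    return math.sqrt(u_base**2 + u_shear**2 + u_turb**2 + u_signal**2)

# Two intervals with different atmospheric conditions yield different uncertainties
print(lidar_speed_uncertainty(0.10, shear_exponent=0.10, turbulence_intensity=0.05, cnr_db=-10))
print(lidar_speed_uncertainty(0.10, shear_exponent=0.35, turbulence_intensity=0.15, cnr_db=-25))
```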

  12. Trends: Rigor Mortis in the Arts.

    Science.gov (United States)

    Blodget, Alden S.

    1991-01-01

    Outlines how past art education provided a refuge for students from the rigors of other academic subjects. Observes that in recent years art education has become "discipline based." Argues that art educators need to reaffirm their commitment to a humanistic way of knowing. (KM)

  13. Using grounded theory as a method for rigorously reviewing literature

    NARCIS (Netherlands)

    Wolfswinkel, J.; Furtmueller-Ettinger, Elfriede; Wilderom, Celeste P.M.

    2013-01-01

    This paper offers guidance on conducting a rigorous literature review. We present this in the form of a five-stage process in which we use Grounded Theory as a method. We first probe the guidelines explicated by Webster and Watson, and then we show the added value of Grounded Theory for rigorously reviewing literature.

  14. Trends of solar ultraviolet irradiance at Barrow, Alaska, and the effect of measurement uncertainties on trend detection

    Directory of Open Access Journals (Sweden)

    G. Bernhard

    2011-12-01

    Spectral ultraviolet (UV) irradiance has been observed near Barrow, Alaska (71° N, 157° W) between 1991 and 2011 with an SUV-100 spectroradiometer. The instrument was historically part of the US National Science Foundation's UV Monitoring Network and is now a component of NSF's Arctic Observing Network. From these measurements, trends in monthly average irradiance and their uncertainties were calculated. The analysis focuses on two quantities, the UV Index (which is affected by atmospheric ozone concentrations) and irradiance at 345 nm (which is virtually insensitive to ozone). Uncertainties of trend estimates depend on variations in the data due to (1) natural variability, (2) systematic and random errors of the measurements, and (3) uncertainties caused by gaps in the time series. Using radiative transfer model calculations, systematic errors of the measurements were detected and corrected. Different correction schemes were tested to quantify the sensitivity of the trend estimates to the treatment of systematic errors. Depending on the correction method, estimates of decadal trends changed between 1.5% and 2.9%. Uncertainties in the trend estimates caused by error sources (2) and (3) were set into relation with the overall uncertainty of the trend determinations. Results show that these error sources are only relevant for February, March, and April when natural variability is low due to high surface albedo. This method of addressing measurement uncertainties in time series analysis is also applicable to other geophysical parameters. Trend estimates varied between −14% and +5% per decade and were significant (95.45% confidence level) only for the month of October. Depending on the correction method, October trends varied between −11.4% and −13.7% for irradiance at 345 nm and between −11.7% and −14.1% for the UV Index. These large trends are consistent with trends in short-wave (0.3–3.0 μm) solar irradiance measured with pyranometers at NOAA
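
    For a single month, a decadal trend and its purely statistical uncertainty can be estimated by ordinary least squares on the yearly values; the sketch below uses synthetic data and does not reproduce the correction schemes or gap handling described above.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
years = np.arange(1991, 2012)
# Synthetic October irradiance values (arbitrary units), illustrative only
irradiance = 100.0 - 1.2 * (years - years[0]) + rng.normal(0.0, 3.0, years.size)

res = stats.linregress(years, irradiance)
trend_pct_per_decade = 10.0 * res.slope / irradiance.mean() * 100.0
ci95 = 10.0 * 1.96 * res.stderr / irradiance.mean() * 100.0
print(f"trend: {trend_pct_per_decade:.1f}% per decade (95% CI +/- {ci95:.1f})")
```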

  15. Capital and time: uncertainty and qualitative measures of inequality.

    Science.gov (United States)

    Bear, Laura

    2014-12-01

    This review compares Piketty's and Marx's approaches to capital and time in order to argue for the importance of qualitative measures of inequality. These latter measures emphasize varying experiences, across classes and through history, of uncertainty and insecurity. They explore how the social rhythms of capital profoundly affect the ability to plan a life-course. Quantitative measures such as those used by Piketty, which focus on the amount of capital that accrues through time, cannot capture such important phenomena. This is especially because their calculations rest on absolute amounts of capital recorded in formal state statistics. Their limits are particularly revealed if we consider issues of informal labour, social reproduction, and changing institutional forms of public debt. If we are to build the inter-disciplinary rapprochement between social science and economics that Piketty calls for, it must be through asserting the value of qualitative measures of insecurity and its effects on decision making. These are important to track both at the macro-level of institutions and at the micro-level scale of human lives. It is, therefore, through emphasizing the existing strengths of both anthropology and history that we can meet Piketty's important challenge to make our scholarship relevant to current political and social debates. © London School of Economics and Political Science 2014.

  16. Quantifying the Contribution of Post-Processing in Computed Tomography Measurement Uncertainty

    DEFF Research Database (Denmark)

    Stolfi, Alessandro; Thompson, Mary Kathryn; Carli, Lorenzo

    2016-01-01

    The contribution of post-processing to computed tomography (CT) measurement uncertainty was quantified by calculating the standard deviation of 10 repeated measurement evaluations on the same data set. The evaluations were performed on an industrial assembly. Each evaluation includes several dimensional and geometrical measurands that were expected to have different responses to the various post-processing settings. It was found that the definition of the datum system had the largest impact on the uncertainty with a standard deviation of a few microns. The surface determination and data fitting had smaller contributions with sub-micron repeatability.
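
    The post-processing contribution described here is simply the standard deviation over repeated evaluations of the same data set; a minimal sketch with made-up values follows.

```python
import numpy as np

# Ten repeated evaluations of one measurand (mm) on the same CT data set,
# each with a different post-processing setting -- values are illustrative.
evaluations = np.array([10.0012, 10.0018, 10.0009, 10.0021, 10.0015,
                        10.0011, 10.0017, 10.0013, 10.0020, 10.0010])

u_postprocessing = evaluations.std(ddof=1)   # sample standard deviation
print(f"post-processing contribution: {u_postprocessing * 1000:.2f} micrometres")
```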

  17. The Researchers' View of Scientific Rigor-Survey on the Conduct and Reporting of In Vivo Research.

    Science.gov (United States)

    Reichlin, Thomas S; Vogt, Lucile; Würbel, Hanno

    2016-01-01

    Reproducibility in animal research is alarmingly low, and a lack of scientific rigor has been proposed as a major cause. Systematic reviews found low reporting rates of measures against risks of bias (e.g., randomization, blinding), and a correlation between low reporting rates and overstated treatment effects. Reporting rates of measures against bias are thus used as a proxy measure for scientific rigor, and reporting guidelines (e.g., ARRIVE) have become a major weapon in the fight against risks of bias in animal research. Surprisingly, animal scientists have never been asked about their use of measures against risks of bias and how they report these in publications. Whether poor reporting reflects poor use of such measures, and whether reporting guidelines may effectively reduce risks of bias has therefore remained elusive. To address these questions, we asked in vivo researchers about their use and reporting of measures against risks of bias and examined how self-reports relate to reporting rates obtained through systematic reviews. An online survey was sent out to all registered in vivo researchers in Switzerland (N = 1891) and was complemented by personal interviews with five representative in vivo researchers to facilitate interpretation of the survey results. Return rate was 28% (N = 530), of which 302 participants (16%) returned fully completed questionnaires that were used for further analysis. According to the researchers' self-report, they use measures against risks of bias to a much greater extent than suggested by reporting rates obtained through systematic reviews. However, the researchers' self-reports are likely biased to some extent. Thus, although they claimed to be reporting measures against risks of bias at much lower rates than they claimed to be using these measures, the self-reported reporting rates were considerably higher than reporting rates found by systematic reviews. Furthermore, participants performed rather poorly when asked to

  18. Robust control of nonlinear MAGLEV suspension system with mismatched uncertainties via DOBC approach.

    Science.gov (United States)

    Yang, Jun; Zolotas, Argyrios; Chen, Wen-Hua; Michail, Konstantinos; Li, Shihua

    2011-07-01

    Robust control of a class of uncertain systems that have disturbances and uncertainties not satisfying the "matching" condition is investigated in this paper via a disturbance observer based control (DOBC) approach. In the context of this paper, "matched" disturbances/uncertainties stand for the disturbances/uncertainties entering the system through the same channels as control inputs. By properly designing a disturbance compensation gain, a novel composite controller is proposed to counteract the "mismatched" lumped disturbances from the output channels. The proposed method significantly extends the applicability of the DOBC methods. Rigorous stability analysis of the closed-loop system with the proposed method is established under mild assumptions. The proposed method is applied to a nonlinear MAGnetic LEViation (MAGLEV) suspension system. Simulation shows that compared to the widely used integral control method, the proposed method provides significantly improved disturbance rejection and robustness against load variation. Copyright © 2011 ISA. Published by Elsevier Ltd. All rights reserved.

  19. Model uncertainty and probability

    International Nuclear Information System (INIS)

    Parry, G.W.

    1994-01-01

    This paper discusses the issue of model uncertainty. The use of probability as a measure of an analyst's uncertainty as well as a means of describing random processes has caused some confusion, even though the two uses are representing different types of uncertainty with respect to modeling a system. The importance of maintaining the distinction between the two types is illustrated with a simple example

  20. Uncertainties achievable for uranium isotope-amount ratios. Estimates based on the precision and accuracy of recent characterization measurements

    International Nuclear Information System (INIS)

    Mathew, K.J.; Essex, R.M.; Gradle, C.; Narayanan, U.

    2015-01-01

    Certified reference materials (CRMs) recently characterized by the NBL for isotope-amount ratios are: (i) CRM 112-A, Uranium (normal) Metal Assay and Isotopic Standard, (ii) CRM 115, Uranium (depleted) Metal Assay and Isotopic Standard, and (iii) CRM 116-A, Uranium (enriched) Metal Assay and Isotopic Standard. NBL also completed re-characterization of the isotope-amount ratios in CRM 125-A, Uranium (UO2) Pellet Assay, Isotopic, and Radio-chronometric Standard. Three different TIMS analytical techniques were employed for the characterization analyses. The total evaporation technique was used for the major isotope-amount ratio measurement, the modified total evaporation technique was used for both the major and minor isotope-amount ratios, and minor isotope-amount ratios were also measured using a conventional technique. Uncertainties for the characterization studies were calculated from the combined TIMS data sets following the ISO Guide to the Expression of Uncertainty in Measurement. The uncertainty components for the isotope-amount ratio values are discussed. (author)
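
    For uncorrelated components, combining uncertainties following the ISO GUM reduces to a root sum of squares with an expanded uncertainty obtained by a coverage factor; the component names and values in this sketch are invented, not NBL's actual budget.

```python
import math

# Illustrative relative standard uncertainty components for an isotope-amount ratio (k = 1)
components = {
    "within-run precision":  2.0e-5,
    "between-run variation": 3.5e-5,
    "mass-bias correction":  5.0e-5,
    "reference material":    4.0e-5,
}

u_combined = math.sqrt(sum(u**2 for u in components.values()))
U_expanded = 2.0 * u_combined       # coverage factor k = 2
print(f"combined standard uncertainty: {u_combined:.2e}")
print(f"expanded uncertainty (k = 2):  {U_expanded:.2e}")
```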

  1. [Influence of Uncertainty and Uncertainty Appraisal on Self-management in Hemodialysis Patients].

    Science.gov (United States)

    Jang, Hyung Suk; Lee, Chang Suk; Yang, Young Hee

    2015-04-01

    This study was done to examine the relation of uncertainty, uncertainty appraisal, and self-management in patients undergoing hemodialysis, and to identify factors influencing self-management. A convenience sample of 92 patients receiving hemodialysis was selected. Data were collected using a structured questionnaire and medical records. The collected data were analyzed using descriptive statistics, t-test, ANOVA, Pearson correlations and multiple regression analysis with the SPSS/WIN 20.0 program. The participants showed a moderate level of uncertainty with the highest score being for ambiguity among the four uncertainty subdomains. Scores for uncertainty danger or opportunity appraisals were under the mid points. The participants were found to perform a high level of self-management such as diet control, management of arteriovenous fistula, exercise, medication, physical management, measurements of body weight and blood pressure, and social activity. The self-management of participants undergoing hemodialysis showed a significant relationship with uncertainty and uncertainty appraisal. The significant factors influencing self-management were uncertainty, uncertainty opportunity appraisal, hemodialysis duration, and having a spouse. These variables explained 32.8% of the variance in self-management. The results suggest that intervention programs to reduce the level of uncertainty and to increase the level of uncertainty opportunity appraisal among patients would improve the self-management of hemodialysis patients.

  2. A statistical methodology for quantification of uncertainty in best estimate code physical models

    International Nuclear Information System (INIS)

    Vinai, Paolo; Macian-Juan, Rafael; Chawla, Rakesh

    2007-01-01

    A novel uncertainty assessment methodology, based on a statistical non-parametric approach, is presented in this paper. It achieves quantification of code physical model uncertainty by making use of model performance information obtained from studies of appropriate separate-effect tests. Uncertainties are quantified in the form of estimated probability density functions (pdfs), calculated with a newly developed non-parametric estimator. The new estimator objectively predicts the probability distribution of the model's 'error' (its uncertainty) from databases reflecting the model's accuracy on the basis of available experiments. The methodology is completed by applying a novel multi-dimensional clustering technique based on the comparison of model error samples with the Kruskal-Wallis test. This takes into account the fact that a model's uncertainty depends on system conditions, since a best estimate code can give predictions for which the accuracy is affected by the regions of the physical space in which the experiments occur. The final result is an objective, rigorous and accurate manner of assigning uncertainty to coded models, i.e. the input information needed by code uncertainty propagation methodologies used for assessing the accuracy of best estimate codes in nuclear systems analysis. The new methodology has been applied to the quantification of the uncertainty in the RETRAN-3D void model and then used in the analysis of an independent separate-effect experiment. This has clearly demonstrated the basic feasibility of the approach, as well as its advantages in yielding narrower uncertainty bands in quantifying the code's accuracy for void fraction predictions
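
    Two of the building blocks named here, a non-parametric density estimate of the model error and a Kruskal-Wallis comparison of error samples from different system conditions, can be sketched as follows; the error samples are synthetic placeholders, and the study's own estimator is not reproduced.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic void-fraction errors (predicted minus measured) from two groups of
# separate-effect tests at different system conditions -- illustrative only.
errors_low_pressure = rng.normal(0.02, 0.05, 80)
errors_high_pressure = rng.normal(-0.01, 0.03, 60)

# Non-parametric (kernel) estimate of the error pdf for one group
kde = stats.gaussian_kde(errors_low_pressure)
grid = np.linspace(-0.2, 0.2, 5)
print("pdf on grid:", np.round(kde(grid), 2))

# Kruskal-Wallis test: do the two condition groups share the same error distribution?
h_stat, p_value = stats.kruskal(errors_low_pressure, errors_high_pressure)
print(f"Kruskal-Wallis H = {h_stat:.2f}, p = {p_value:.3f}")
# A small p-value suggests quantifying the uncertainty separately per condition cluster.
```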

  3. Big data uncertainties.

    Science.gov (United States)

    Maugis, Pierre-André G

    2018-07-01

    Big data, the idea that an ever-larger volume of information is being constantly recorded, suggests that new problems can now be subjected to scientific scrutiny. However, can classical statistical methods be used directly on big data? We analyze the problem by looking at two known pitfalls of big datasets. First, that they are biased, in the sense that they do not offer a complete view of the populations under consideration. Second, that they present a weak but pervasive level of dependence between all their components. In both cases we observe that the uncertainty of the conclusions obtained by statistical methods is increased when they are used on big data, either because of a systematic error (bias) or because of a larger degree of randomness (increased variance). We argue that the key challenge raised by big data is not only how to use big data to tackle new problems, but also how to develop tools and methods able to rigorously articulate the new risks therein. Copyright © 2016. Published by Elsevier Ltd.

  4. Uncertainty in hydrological signatures

    Science.gov (United States)

    McMillan, Hilary; Westerberg, Ida

    2015-04-01

    Information that summarises the hydrological behaviour or flow regime of a catchment is essential for comparing responses of different catchments to understand catchment organisation and similarity, and for many other modelling and water-management applications. Such information types derived as an index value from observed data are known as hydrological signatures, and can include descriptors of high flows (e.g. mean annual flood), low flows (e.g. mean annual low flow, recession shape), the flow variability, flow duration curve, and runoff ratio. Because the hydrological signatures are calculated from observed data such as rainfall and flow records, they are affected by uncertainty in those data. Subjective choices in the method used to calculate the signatures create a further source of uncertainty. Uncertainties in the signatures may affect our ability to compare different locations, to detect changes, or to compare future water resource management scenarios. The aim of this study was to contribute to the hydrological community's awareness and knowledge of data uncertainty in hydrological signatures, including typical sources, magnitude and methods for its assessment. We proposed a generally applicable method to calculate these uncertainties based on Monte Carlo sampling and demonstrated it for a variety of commonly used signatures. The study was made for two data rich catchments, the 50 km2 Mahurangi catchment in New Zealand and the 135 km2 Brue catchment in the UK. For rainfall data the uncertainty sources included point measurement uncertainty, the number of gauges used in calculation of the catchment spatial average, and uncertainties relating to lack of quality control. For flow data the uncertainty sources included uncertainties in stage/discharge measurement and in the approximation of the true stage-discharge relation by a rating curve. The resulting uncertainties were compared across the different signatures and catchments, to quantify uncertainty
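
    The Monte Carlo idea proposed here is to perturb the underlying data within their estimated errors and recompute each signature many times; a simplified sketch for one signature (the runoff ratio), with synthetic data and assumed multiplicative error models, is shown below.

```python
import numpy as np

rng = np.random.default_rng(7)
rain = rng.gamma(shape=0.7, scale=6.0, size=365)          # synthetic daily rainfall (mm)
flow = (0.4 * rain + rng.normal(0.0, 0.3, 365)).clip(0)   # synthetic daily runoff (mm)

def runoff_ratio(q, p):
    return q.sum() / p.sum()

samples = []
for _ in range(2000):
    # Assumed ~10% multiplicative rating-curve error and ~5% rainfall error (illustrative)
    q_pert = flow * rng.normal(1.00, 0.10, flow.size)
    p_pert = rain * rng.normal(1.00, 0.05, rain.size)
    samples.append(runoff_ratio(q_pert, p_pert))

lo, hi = np.percentile(samples, [2.5, 97.5])
print(f"runoff ratio: {runoff_ratio(flow, rain):.2f} (95% interval {lo:.2f}-{hi:.2f})")
```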

  5. Uncertainty in spatial planning proceedings

    Directory of Open Access Journals (Sweden)

    Aleš Mlakar

    2009-01-01

    Uncertainty is characteristic of spatial planning: it arises from the necessity to co-ordinate the various interests within an area, from the urgency of adopting spatial planning decisions, from the complexity of the environment, physical space and society, from the uncertainty of the future, and from the uncertainty of actually making the right decision. The response to uncertainty is a series of measures that mitigate the effects of uncertainty itself. These measures are based on two fundamental principles – standardization and optimization. The measures relate to knowledge enhancement and spatial planning comprehension, to the legal regulation of changes, to the existence of spatial planning as a means of co-ordinating different interests, to active planning and the constructive resolution of current spatial problems, to the integration of spatial planning and the environmental protection process, to the implementation of analysis as the foundation of spatial planners' activities, to methods of thinking outside set parameters, to forming clear spatial concepts, and to creating a transparent spatial management system and enforcing participatory processes.

  6. Total Measurement Uncertainty (TMU) for Nondestructive Assay of Transuranic (TRU) Waste at the WRAP Facility

    International Nuclear Information System (INIS)

    WILLS, C.E.

    1999-01-01

    This report examines the contributing factors to NDA measurement uncertainty at WRAP. The significance of each factor on the TMU is analyzed, and a final method is given for determining the TMU for NDA measurements at WRAP. As more data becomes available, and WRAP gains in operational experience, this report will be reviewed semi-annually and updated as necessary

  7. A method and tool for combining differential or inclusive measurements obtained with simultaneously constrained uncertainties

    Science.gov (United States)

    Kieseler, Jan

    2017-11-01

    A method is discussed that allows combining sets of differential or inclusive measurements. It is assumed that at least one measurement was obtained by simultaneously fitting a set of nuisance parameters representing sources of systematic uncertainties. As a result of beneficial constraints from the data, all such fitted parameters are correlated with each other. The best approach for a combination of these measurements would be the maximization of a combined likelihood, for which the full fit model of each measurement and the original data are required. However, only in rare cases is this information publicly available. In the absence of this information, most commonly used combination methods are unable to account for these correlations between uncertainties, which can lead to severe biases, as shown in this article. The method discussed here provides a solution to this problem. It relies only on the public result and its covariance or Hessian, and is validated against the combined-likelihood approach. A dedicated software package implementing this method is also presented. It provides a text-based user interface alongside a C++ interface. The latter also interfaces to ROOT classes for simple combination of binned measurements such as differential cross sections.
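
    The role of the correlations can be illustrated with a generic best linear unbiased combination of two measurements of the same quantity whose full covariance is known; this is only a reminder of how correlations enter the weights, not the published method or its software package.

```python
import numpy as np

# Two measurements of the same quantity with correlated uncertainties (illustrative)
y = np.array([10.2, 9.6])
cov = np.array([[0.25, 0.15],
                [0.15, 0.36]])     # the off-diagonal term encodes the correlation

w = np.linalg.solve(cov, np.ones(2))
w /= w.sum()                       # BLUE weights: w = C^-1 1 / (1^T C^-1 1)
combined = w @ y
u_combined = np.sqrt(w @ cov @ w)
print(f"weights {w.round(3)}, combined = {combined:.2f} +/- {u_combined:.2f}")

# Setting the off-diagonal term to zero (ignoring the correlation) changes both the
# central value and its uncertainty -- the kind of bias the article warns about.
```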

  8. A method and tool for combining differential or inclusive measurements obtained with simultaneously constrained uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Kieseler, Jan [CERN, Geneva (Switzerland)

    2017-11-15

    A method is discussed that allows combining sets of differential or inclusive measurements. It is assumed that at least one measurement was obtained by simultaneously fitting a set of nuisance parameters representing sources of systematic uncertainties. As a result of beneficial constraints from the data, all such fitted parameters are correlated with each other. The best approach for a combination of these measurements would be the maximization of a combined likelihood, for which the full fit model of each measurement and the original data are required. However, only in rare cases is this information publicly available. In the absence of this information, most commonly used combination methods are unable to account for these correlations between uncertainties, which can lead to severe biases, as shown in this article. The method discussed here provides a solution to this problem. It relies only on the public result and its covariance or Hessian, and is validated against the combined-likelihood approach. A dedicated software package implementing this method is also presented. It provides a text-based user interface alongside a C++ interface. The latter also interfaces to ROOT classes for simple combination of binned measurements such as differential cross sections. (orig.)

  9. Survey of radiofrequency radiation levels around GSM base stations and evaluation of measurement uncertainty

    Directory of Open Access Journals (Sweden)

    Vulević Branislav D.

    2011-01-01

    This paper summarizes broadband measurements of radiofrequency radiation around GSM base stations in the vicinity of residential areas in Belgrade and 12 other cities in Serbia. The results will be useful for determining non-ionizing radiation exposure levels of the general public in the future. The paper also aims to present, in an appropriate form, basic information on the evaluation of measurement uncertainty.

  10. Uncertainties in workplace external dosimetry - An analytical approach

    International Nuclear Information System (INIS)

    Ambrosi, P.

    2006-01-01

    The uncertainties associated with external dosimetry measurements at workplaces depend on the type of dosemeter used, together with its performance characteristics and the information available on the measurement conditions. Performance characteristics were determined in the course of a type test, and information about the measurement conditions can be either general, e.g. 'research' and 'medicine', or specific, e.g. 'X-ray testing equipment for aluminium wheel rims'. This paper explains an analytical approach to determining the measurement uncertainty. It is based on the Draft IEC Technical Report IEC 62461, Radiation Protection Instrumentation - Determination of Uncertainty in Measurement. Neither this paper nor the report can eliminate the fact that determining the uncertainty requires a larger effort than performing the measurement itself. As a counterbalance, the process of determining the uncertainty not only results in a numerical value of the uncertainty but also produces the best estimate of the quantity to be measured, which may differ from the indication of the instrument. Thus it also improves the result of the measurement. (authors)

  11. Uncertainty analysis of scintillometers methods in measuring sensible heat fluxes of forest ecosystem

    Science.gov (United States)

    Zheng, N.

    2017-12-01

    Sensible heat flux (H) is one of the driving factors of surface turbulent motion and energy exchange. Therefore, it is particularly important to measure sensible heat flux accurately at the regional scale. However, due to the heterogeneity of the underlying surface, the hydrothermal regime, and varying weather conditions, it is difficult to estimate a representative flux at the kilometer scale. Since the 1980s, the scintillometer has been developed into an effective and widely used instrument for deriving heat flux at the regional scale, based on the turbulence-induced fluctuation of light in the atmosphere. The parameter directly obtained by the scintillometer is the structure parameter of the refractive index of air, derived from fluctuations in light intensity. Combined with parameters such as the temperature structure parameter, zero-plane displacement, surface roughness, wind velocity, air temperature and other meteorological data, heat fluxes can be derived. These additional parameters increase the uncertainty of the flux because of differences between the actual features of turbulent motion and the conditions under which turbulence theory applies. Most previous studies focused on constant-flux layers above the roughness sub-layer over homogeneous, flat underlying surfaces under suitable weather conditions. Therefore, the criteria and modified forms of key parameters have been treated as invariable. In this study, we conduct investigations over the hilly area of northern China with different vegetation, such as cork oak, cedar and black locust. On the basis of key research on the saturation threshold and its modified forms under different turbulence intensities, modified forms of the Bowen ratio under different drying and wetting conditions, and the universal function for the temperature structure parameter under different atmospheric stabilities, the dominant sources of uncertainty will be determined. The above study is significant for revealing the influence mechanism of uncertainty and explore influence

  12. A computationally inexpensive model for estimating dimensional measurement uncertainty due to x-ray computed tomography instrument misalignments

    Science.gov (United States)

    Ametova, Evelina; Ferrucci, Massimiliano; Chilingaryan, Suren; Dewulf, Wim

    2018-06-01

    The recent emergence of advanced manufacturing techniques such as additive manufacturing and an increased demand on the integrity of components have motivated research on the application of x-ray computed tomography (CT) for dimensional quality control. While CT has shown significant empirical potential for this purpose, there is a need for metrological research to accelerate the acceptance of CT as a measuring instrument. The accuracy in CT-based measurements is vulnerable to the instrument geometrical configuration during data acquisition, namely the relative position and orientation of x-ray source, rotation stage, and detector. Consistency between the actual instrument geometry and the corresponding parameters used in the reconstruction algorithm is critical. Currently available procedures provide users with only estimates of geometrical parameters. Quantification and propagation of uncertainty in the measured geometrical parameters must be considered to provide a complete uncertainty analysis and to establish confidence intervals for CT dimensional measurements. In this paper, we propose a computationally inexpensive model to approximate the influence of errors in CT geometrical parameters on dimensional measurement results. We use surface points extracted from a computer-aided design (CAD) model to model discrepancies in the radiographic image coordinates assigned to the projected edges between an aligned system and a system with misalignments. The efficacy of the proposed method was confirmed on simulated and experimental data in the presence of various geometrical uncertainty contributors.

  13. An analysis of the uncertainty and bias in DCE-MRI measurements using the spoiled gradient-recalled echo pulse sequence

    International Nuclear Information System (INIS)

    Subashi, Ergys; Choudhury, Kingshuk R.; Johnson, G. Allan

    2014-01-01

    Purpose: The pharmacokinetic parameters derived from dynamic contrast-enhanced (DCE) MRI have been used in more than 100 phase I trials and investigator-led studies. A comparison of the absolute values of these quantities requires an estimation of their respective probability distribution function (PDF). The statistical variation of the DCE-MRI measurement is analyzed by considering the fundamental sources of error in the MR signal intensity acquired with the spoiled gradient-echo (SPGR) pulse sequence. Methods: The variance in the SPGR signal intensity arises from quadrature detection and excitation flip angle inconsistency. The noise power was measured in 11 phantoms of contrast agent concentration in the range [0–1] mM (in steps of 0.1 mM) and in one in vivo acquisition of a tumor-bearing mouse. The distribution of the flip angle was determined in a uniform 10 mM CuSO4 phantom using the spin echo double angle method. The PDF of a wide range of T1 values measured with the varying flip angle (VFA) technique was estimated through numerical simulations of the SPGR equation. The resultant uncertainty in contrast agent concentration was incorporated in the most common model of tracer exchange kinetics and the PDF of the derived pharmacokinetic parameters was studied numerically. Results: The VFA method is an unbiased technique for measuring T1 only in the absence of bias in the excitation flip angle. The time-dependent concentration of the contrast agent measured in vivo is within the theoretically predicted uncertainty. The uncertainty in measuring Ktrans with SPGR pulse sequences is of the same order as, but always higher than, the uncertainty in measuring the pre-injection longitudinal relaxation time (T10). The lowest achievable bias/uncertainty in estimating this parameter is approximately 20%–70% higher than the bias/uncertainty in the measurement of the pre-injection T1 map. The fractional volume parameters derived from the extended Tofts model were found to be
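
    For reference, the idealized SPGR signal model underlying the variable flip angle T1 estimate is commonly written as follows (generic textbook notation, not copied from the paper):

```latex
% Spoiled gradient-recalled echo (SPGR) signal as a function of flip angle \alpha
S(\alpha) = M_0 \sin\alpha \,
            \frac{1 - e^{-T_R/T_1}}{1 - \cos\alpha \, e^{-T_R/T_1}} \, e^{-T_E/T_2^{*}}
```

    Because the flip angle enters this expression nonlinearly, a bias in the delivered flip angle propagates into a bias in the VFA T1 estimate, which is why the measured flip-angle distribution enters the uncertainty budget.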

  14. Reducing Uncertainty: Implementation of Heisenberg Principle to Measure Company Performance

    Directory of Open Access Journals (Sweden)

    Anna Svirina

    2015-08-01

    The paper addresses the problem of reducing uncertainty in estimates of future company performance, which arises from the wide range of probable efficiencies of an enterprise's intangible assets. To address this problem, the paper suggests using quantum economy principles, i.e., applying the Heisenberg principle to measure the efficiency and potential of the company's intangible assets. It is proposed that for intangibles it is not possible to estimate both potential and efficiency at a certain time point. To support this thesis, data on resource potential and efficiency from mid-Russian companies were evaluated within a deterministic approach, which did not allow the probability of achieving a certain resource efficiency to be evaluated, and a quantum approach, which allowed estimation of the central point around which the probable efficiency of resources is concentrated. Visualization of these approaches was performed by means of LabView software. It was shown that a deterministic approach should be used for estimating the performance of tangible assets, while for intangible assets the quantum approach yields better predictions of future performance. On the basis of these findings, we propose a holistic approach to estimating company resource efficiency in order to reduce uncertainty in modeling company performance.

  15. Development of Evaluation Code for MUF Uncertainty

    International Nuclear Information System (INIS)

    Won, Byung Hee; Han, Bo Young; Shin, Hee Sung; Ahn, Seong-Kyu; Park, Geun-Il; Park, Se Hwan

    2015-01-01

    Material Unaccounted For (MUF) is the material balance evaluated from measured nuclear material in a Material Balance Area (MBA). Assuming perfect measurements and no diversion from a facility, one can expect a zero MUF. However, a non-zero MUF always occurs because of measurement uncertainty, even when the facility is under normal operating conditions. Furthermore, there are many measurements using different equipment at various Key Measurement Points (KMPs), and the MUF uncertainty is affected by the errors of those measurements. Evaluating MUF uncertainty is essential for developing a safeguards system, including the nuclear measurement system, for pyroprocessing, which is being developed to reduce radioactive waste from spent fuel at the Korea Atomic Energy Research Institute (KAERI). An evaluation code for analyzing MUF uncertainty has been developed and verified using a sample problem from the IAEA reference. MUF uncertainty can be calculated simply and quickly using this evaluation code, which is built around a graphical user interface for ease of use. It is also expected that the code will make sensitivity analysis of MUF uncertainty for various safeguards systems easier and more systematic. It is suitable for users who want to evaluate conventional safeguards systems as well as to develop new systems for facilities under development

  16. Development of Evaluation Code for MUF Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Won, Byung Hee; Han, Bo Young; Shin, Hee Sung; Ahn, Seong-Kyu; Park, Geun-Il; Park, Se Hwan [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-05-15

    Material Unaccounted For (MUF) is the material balance evaluated from measured nuclear material in a Material Balance Area (MBA). Assuming perfect measurements and no diversion from a facility, one can expect a zero MUF. However, a non-zero MUF always occurs because of measurement uncertainty, even when the facility is under normal operating conditions. Furthermore, there are many measurements using different equipment at various Key Measurement Points (KMPs), and the MUF uncertainty is affected by the errors of those measurements. Evaluating MUF uncertainty is essential for developing a safeguards system, including the nuclear measurement system, for pyroprocessing, which is being developed to reduce radioactive waste from spent fuel at the Korea Atomic Energy Research Institute (KAERI). An evaluation code for analyzing MUF uncertainty has been developed and verified using a sample problem from the IAEA reference. MUF uncertainty can be calculated simply and quickly using this evaluation code, which is built around a graphical user interface for ease of use. It is also expected that the code will make sensitivity analysis of MUF uncertainty for various safeguards systems easier and more systematic. It is suitable for users who want to evaluate conventional safeguards systems as well as to develop new systems for facilities under development.
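
    MUF itself is a book-versus-physical material balance, and its uncertainty follows from propagating the measurement variances of the balance terms; a generic sketch is given below, with invented values and correlations between terms ignored.

```python
import math

# Material balance terms (kg of nuclear material) and their standard uncertainties (illustrative)
beginning_inventory, u_bi = 120.0, 0.4
receipts,            u_r  = 300.0, 0.9
shipments,           u_s  = 290.0, 0.9
ending_inventory,    u_ei = 129.5, 0.4

muf = (beginning_inventory + receipts) - (shipments + ending_inventory)
sigma_muf = math.sqrt(u_bi**2 + u_r**2 + u_s**2 + u_ei**2)   # uncorrelated propagation

print(f"MUF = {muf:.2f} kg, sigma_MUF = {sigma_muf:.2f} kg")
# A non-zero MUF well within about 2 * sigma_MUF is consistent with measurement
# uncertainty alone; a larger value warrants investigation.
```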

  17. Improvement of uncertainty relations for mixed states

    International Nuclear Information System (INIS)

    Park, Yong Moon

    2005-01-01

    We study a possible improvement of uncertainty relations. The Heisenberg uncertainty relation employs the commutator of a pair of conjugate observables to set the limit of quantum measurement of the observables. The Schroedinger uncertainty relation improves the Heisenberg uncertainty relation by adding a correlation term involving the anti-commutator. However, both relations are insensitive to whether the state used is pure or mixed. We improve the uncertainty relations by introducing additional terms which measure the mixedness of the state. For the momentum and position operators as conjugate observables and for the thermal state of the quantum harmonic oscillator, it turns out that the equalities in the improved uncertainty relations hold
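
    The two relations referred to here are, in standard notation,

```latex
% Heisenberg--Robertson relation
\sigma_A^2\,\sigma_B^2 \;\ge\; \left|\tfrac{1}{2i}\langle [A,B]\rangle\right|^2

% Schroedinger relation, adding the anti-commutator (covariance) term
\sigma_A^2\,\sigma_B^2 \;\ge\; \left|\tfrac{1}{2i}\langle [A,B]\rangle\right|^2
  + \left(\tfrac{1}{2}\langle\{A,B\}\rangle - \langle A\rangle\langle B\rangle\right)^2
```

    The improvement discussed in the paper adds further state-dependent terms that vanish for pure states; those terms are specific to the paper and are not reproduced here.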

  18. Influence of tube volume on measurement uncertainty of GM counters

    Directory of Open Access Journals (Sweden)

    Stanković Koviljka Đ.

    2010-01-01

    GM counters are often used in radiation detection since they generate a strong signal which can be easily detected. The working principle of a GM counter is based on the interaction of ionizing radiation with the atoms and molecules of the gas present in the counter's tube. Free electrons created as a result of this interaction become initial electrons, i.e., they start an avalanche process which is detected as a pulse of current. This current pulse is independent of the energy imparted to the gas, which is the main difference between a GM counter and the majority of other radiation detectors. In the literature, the dependence on incident radiation energy, the tube's orientation, and the characteristics of the readout system are cited as the main sources of measurement uncertainty of GM counters. The aim of this paper is to determine the dependence of the measurement uncertainty of a GM counter on the volume of its counter tube. The dependence of the pulse current on the size of the counter's tube has, therefore, been considered here, both in radial and parallel geometry. The initiation and expansion of the current pulse have been examined by means of elementary processes of electrical discharge such as Markov processes, while the changes in the counter's tube volume were tested by the space-time enlargement law. The random variable known as the 'current pulse in the counter's tube' (i.e., electrical breakdown of the electrode configuration) has also been taken into account and an appropriate theoretical distribution statistically determined. The theoretical results thus obtained were then compared to corresponding experimental results established under controlled laboratory conditions.

  19. Justification for recommended uncertainties

    International Nuclear Information System (INIS)

    Pronyaev, V.G.; Badikov, S.A.; Carlson, A.D.

    2007-01-01

    The uncertainties obtained in an earlier standards evaluation were considered to be unrealistically low by experts of the US Cross Section Evaluation Working Group (CSEWG). Therefore, the CSEWG Standards Subcommittee replaced the covariance matrices of evaluated uncertainties by expanded percentage errors that were assigned to the data over wide energy groups. There are a number of reasons that might lead to low uncertainties of the evaluated data: underestimation of the correlations existing between the results of different measurements; the presence of unrecognized systematic uncertainties in the experimental data, which can lead to biases in the evaluated data as well as to underestimations of the resulting uncertainties; and the fact that uncertainties for correlated data cannot be characterized by percentage uncertainties or variances alone. Covariances between the evaluated value at 0.2 MeV and other points obtained in model (RAC R matrix and PADE2 analytical expansion) and non-model (GMA) fits of the 6Li(n,t) TEST1 data and the correlation coefficients are presented, and covariances between the evaluated value at 0.045 MeV and other points (along the line or column of the matrix) as obtained in EDA and RAC R matrix fits of the data available for reactions that pass through the formation of the 7Li system are discussed. The GMA fit with the GMA database is shown for comparison. The following diagrams are discussed: percentage uncertainties of the evaluated cross section for the 6Li(n,t) reaction and for the 235U(n,f) reaction; the estimation given by CSEWG experts; the GMA result with the full GMA database, including experimental data for the 6Li(n,t), 6Li(n,n) and 6Li(n,total) reactions; uncertainties in the GMA combined fit for the standards; and the EDA and RAC R matrix results, respectively. Uncertainties of absolute and 252Cf fission spectrum averaged cross section measurements, and deviations between measured and evaluated values for 235U(n,f) cross-sections in the neutron energy range 1

  20. How Individual Scholars Can Reduce the Rigor-Relevance Gap in Management Research

    OpenAIRE

    Wolf, Joachim; Rosenberg, Timo

    2012-01-01

    This paper discusses a number of avenues management scholars could follow to reduce the existing gap between scientific rigor and practical relevance without relativizing the importance of the first goal dimension. Such changes are necessary because many management studies do not fully exploit the possibilities to increase their practical relevance while maintaining scientific rigor. We argue that this rigor-relevance gap is not only the consequence of the currently prevailing institutional c...

  1. Uncertainty

    International Nuclear Information System (INIS)

    Silva, T.A. da

    1988-01-01

    The uncertainty method recommended by the International Atomic Energy Agency (IAEA) is compared with that of the International Committee for Weights and Measures (CIPM) for the calibration of clinical dosimeters in the Secondary Standard Dosimetry Laboratory (SSDL). (C.G.C.) [pt

  2. The small sample uncertainty aspect in relation to bullwhip effect measurement

    DEFF Research Database (Denmark)

    Nielsen, Erland Hejn

    2009-01-01

    The bullwhip effect as a concept has been known for almost half a century, starting with the Forrester effect. The bullwhip effect is observed in many supply chains, and it is generally accepted as a potential malice. Despite this fact, the bullwhip effect still seems to be first and foremost a conceptual phenomenon. This paper intends primarily to investigate why this might be so and thereby investigate the various aspects, possibilities and obstacles that must be taken into account when considering the potential practical use and measurement of the bullwhip effect in order to actually get the supply chain under control. This paper will put special emphasis on the unavoidable small-sample uncertainty aspects relating to the measurement or estimation of the bullwhip effect.

  3. Evaluation of uncertainties in femtoampere current measurement for the number concentration standard of aerosol nanoparticles

    International Nuclear Information System (INIS)

    Sakurai, Hiromu; Ehara, Kensei

    2011-01-01

    We evaluated uncertainties in current measurement by the electrometer at the current level on the order of femtoamperes. The electrometer was the one used in the Faraday-cup aerosol electrometer of the Japanese national standard for number concentration of aerosol nanoparticles in which the accuracy of the absolute current is not required, but the net current which is obtained as the difference in currents under two different conditions must be measured accurately. The evaluation was done experimentally at the current level of 20 fA, which was much smaller than the intervals between the electrometer's calibration points at +1, +0.5, −0.5 and −1 pA. The slope of the response curve for the relationship between the 'true' and measured current, which is crucial in the above measurement, was evaluated locally at many different points within the ±1 pA range for deviation from the slope determined by a linear regression of the calibration data. The sum of the current induced by a flow of charged particles and a bias current from a current-source instrument was measured by the electrometer while the particle current was toggled on and off. The net particle current was obtained as the difference in the measured currents between the toggling, while at the same time the current was estimated from the particle concentration read by a condensation particle counter. The local slope was calculated as the ratio of the measured to estimated currents at each bias current setting. The standard deviation of the local slope values observed at varied bias currents was about 0.003, which was calculated by analysis of variance (ANOVA) for the treatment of the bias current. The combined standard uncertainty of the slope, which was calculated from the uncertainty of the slope by linear regression and the variability of the slope, was calculated to be about 0.004

  4. Uncertainty: lotteries and risk

    OpenAIRE

    Ávalos, Eloy

    2011-01-01

    In this paper we develop the theory of uncertainty in a context where the risks assumed by the individual are measurable and manageable. We primarily use the definition of a lottery to formulate the axioms of the individual's preferences and their representation through the von Neumann-Morgenstern utility function. We study the expected utility theorem and its properties, the paradoxes of choice under uncertainty and, finally, measures of risk aversion with monetary lotteries.

  5. Measurement and uncertainties of energy loss in silicon over a wide Z1 range using time of flight detector telescopes

    CERN Document Server

    Whitlow, H J; Elliman, R G; Weijers, T D M; Zhang Yan Wen; O'connor, D J

    2002-01-01

    The energy loss of projectiles with Z1 in the range 3-26 has been experimentally measured in the 0.1-0.7 MeV per nucleon energy range in the same Si stopping foil of 105.5 μg/cm2 thickness using a time of flight-energy (ToF-E) elastic recoil detection analysis (ERDA) setup. A detailed study of the experimental uncertainties for the ToF-E and ToF-ToF-E configurations has been made. For ERDA configurations where the energy calibration is taken against the edge positions, small uncertainties in the angle at which recoils are detected can introduce significant absolute uncertainty. The relative uncertainty contribution is dominated by the energy calibration of the Si E detector for the ToF-E configuration and the position of the second ToF detector in ToF-ToF-E measurements. The much smaller calibration uncertainty for the ToF-ToF-E configuration implies this technique is superior to ToF-E measurements with Si E detectors. At low energies the effect of charge changing in the time detector foils can become...

  6. DS02 uncertainty analysis

    International Nuclear Information System (INIS)

    Kaul, Dean C.; Egbert, Stephen D.; Woolson, William A.

    2005-01-01

    In order to avoid the pitfalls that so discredited DS86 and its uncertainty estimates, and to provide DS02 uncertainties that are both defensible and credible, this report not only presents the ensemble uncertainties assembled from uncertainties in individual computational elements and radiation dose components but also describes how these relate to comparisons between observed and computed quantities at critical intervals in the computational process. These comparisons include those between observed and calculated radiation free-field components, where observations include thermal- and fast-neutron activation and gamma-ray thermoluminescence, which are relevant to the estimated systematic uncertainty for DS02. The comparisons also include those between calculated and observed survivor shielding, where the observations consist of biodosimetric measurements for individual survivors, which are relevant to the estimated random uncertainty for DS02. (J.P.N.)

  7. Photoconductivity of amorphous silicon-rigorous modelling

    International Nuclear Information System (INIS)

    Brada, P.; Schauer, F.

    1991-01-01

    It is our great pleasure to express our gratitude to Prof. Grigorovici, the pioneer of the exciting field of the amorphous state, through our modest contribution to this area. In this paper we present an outline of a rigorous modelling program for the steady-state photoconductivity in amorphous silicon and related materials. (Author)

  8. Entropic uncertainty relations-a survey

    International Nuclear Information System (INIS)

    Wehner, Stephanie; Winter, Andreas

    2010-01-01

    Uncertainty relations play a central role in quantum mechanics. Entropic uncertainty relations in particular have gained significant importance within quantum information, providing the foundation for the security of many quantum cryptographic protocols. Yet, little is known about entropic uncertainty relations with more than two measurement settings. In the present survey, we review known results and open questions.
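
    One widely cited example of the relations surveyed, the Maassen-Uffink bound for the Shannon entropies of two measurements with eigenbases {|a_i>} and {|b_j>}, reads

```latex
H(A) + H(B) \;\ge\; -2\log_2 c, \qquad c = \max_{i,j}\,\bigl|\langle a_i\,|\,b_j\rangle\bigr|
```

    Extending such bounds to more than two measurement settings is exactly the open direction highlighted in the survey.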

  9. Uncertainty for Part Density Determination: An Update

    Energy Technology Data Exchange (ETDEWEB)

    Valdez, Mario Orlando [Los Alamos National Laboratory

    2016-12-14

    Accurate and precise density measurements by hydrostatic weighing require the use of an analytical balance, configured with a suspension system, to measure the weight of a part both in water and in air. Additionally, the densities of these fluid media (water and air) must be precisely known for the part density determination. To validate the accuracy and precision of these measurements, uncertainty statements are required. The work in this report is a revision of an original report written more than a decade ago, specifically applying principles and guidelines suggested by the Guide to the Expression of Uncertainty in Measurement (GUM) for determining the part density uncertainty through sensitivity analysis. In this work, updated derivations are provided; an original example is revised with the updated derivations; and an appendix is provided, devoted solely to uncertainty evaluations using Monte Carlo techniques, specifically the NIST Uncertainty Machine, as a viable alternative method.
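
    The hydrostatic-weighing measurement equation and a brute-force Monte Carlo evaluation of its uncertainty (in the spirit of the NIST Uncertainty Machine mentioned above) can be sketched as follows; the input values and their uncertainties are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

# Illustrative inputs: balance readings in air and in water (g) and fluid densities (g/cm^3)
m_air   = rng.normal(52.3000, 0.0005, n)
m_water = rng.normal(46.1000, 0.0008, n)
rho_w   = rng.normal(0.99705, 0.00002, n)   # water density at the bath temperature
rho_a   = rng.normal(0.00118, 0.00001, n)   # air density

# Standard hydrostatic-weighing equation with air buoyancy correction
rho_part = m_air / (m_air - m_water) * (rho_w - rho_a) + rho_a

print(f"density = {rho_part.mean():.5f} g/cm^3, u = {rho_part.std(ddof=1):.5f} g/cm^3")
```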

  10. Carbon dioxide and methane measurements from the Los Angeles Megacity Carbon Project - Part 1: calibration, urban enhancements, and uncertainty estimates

    Science.gov (United States)

    Verhulst, Kristal R.; Karion, Anna; Kim, Jooil; Salameh, Peter K.; Keeling, Ralph F.; Newman, Sally; Miller, John; Sloop, Christopher; Pongetti, Thomas; Rao, Preeti; Wong, Clare; Hopkins, Francesca M.; Yadav, Vineet; Weiss, Ray F.; Duren, Riley M.; Miller, Charles E.

    2017-07-01

    We report continuous surface observations of carbon dioxide (CO2) and methane (CH4) from the Los Angeles (LA) Megacity Carbon Project during 2015. We devised a calibration strategy, methods for selection of background air masses, calculation of urban enhancements, and a detailed algorithm for estimating uncertainties in urban-scale CO2 and CH4 measurements. These methods are essential for understanding carbon fluxes from the LA megacity and other complex urban environments globally. We estimate background mole fractions entering LA using observations from four extra-urban sites including two marine sites located south of LA in La Jolla (LJO) and offshore on San Clemente Island (SCI), one continental site located in Victorville (VIC), in the high desert northeast of LA, and one continental/mid-troposphere site located on Mount Wilson (MWO) in the San Gabriel Mountains. We find that a local marine background can be established to within ˜ 1 ppm CO2 and ˜ 10 ppb CH4 using these local measurement sites. Overall, atmospheric carbon dioxide and methane levels are highly variable across Los Angeles. Urban and suburban sites show moderate to large CO2 and CH4 enhancements relative to a marine background estimate. The USC (University of Southern California) site near downtown LA exhibits median hourly enhancements of ˜ 20 ppm CO2 and ˜ 150 ppb CH4 during 2015 as well as ˜ 15 ppm CO2 and ˜ 80 ppb CH4 during mid-afternoon hours (12:00-16:00 LT, local time), which is the typical period of focus for flux inversions. The estimated measurement uncertainty is typically better than 0.1 ppm CO2 and 1 ppb CH4 based on the repeated standard gas measurements from the LA sites during the last 2 years, similar to Andrews et al. (2014). The largest component of the measurement uncertainty is due to the single-point calibration method; however, the uncertainty in the background mole fraction is much larger than the measurement uncertainty. The background uncertainty for the marine

  11. Uncertainty of angular displacement measurement with a MEMS gyroscope integrated in a smartphone

    International Nuclear Information System (INIS)

    De Campos Porath, Maurício; Dolci, Ricardo

    2015-01-01

    Low-cost inertial sensors have recently gained popularity and are now widely used in electronic devices such as smartphones and tablets. In this paper we present the results of a set of experiments aiming to assess the angular displacement measurement errors of a gyroscope integrated in a smartphone of a recent model. The goal is to verify whether these sensors could substitute dedicated electronic inclinometers for the measurement of angular displacement. We estimated a maximum error of 0.3° (sum of expanded uncertainty and maximum absolute bias) for the roll and pitch axes, for a measurement time without referencing up to 1 h. (paper)
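
    Angular displacement from a rate gyroscope is obtained by integrating the angular rate, so any residual rate bias grows linearly into an angle error over the measurement time; the sketch below uses synthetic data with invented noise and bias levels.

```python
import numpy as np

fs = 100.0                               # sample rate (Hz)
t = np.arange(0.0, 3600.0, 1.0 / fs)     # one hour without re-referencing
rng = np.random.default_rng(5)

true_rate = np.zeros_like(t)             # device held still (deg/s)
bias = 5e-5                              # illustrative residual rate bias (deg/s)
noise = rng.normal(0.0, 0.02, t.size)    # illustrative rate noise (deg/s)

measured_rate = true_rate + bias + noise
angle = np.cumsum(measured_rate) / fs    # rectangular integration to angle (deg)

print(f"angle error after 1 h: {angle[-1]:.3f} deg")  # bias alone contributes bias * 3600 s
```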

  12. Measurement system for diffraction efficiency of convex gratings

    Science.gov (United States)

    Liu, Peng; Chen, Xin-hua; Zhou, Jian-kang; Zhao, Zhi-cheng; Liu, Quan; Luo, Chao; Wang, Xiao-feng; Tang, Min-xue; Shen, Wei-min

    2017-08-01

    A measurement system for the diffraction efficiency of convex gratings is designed. The measurement system mainly includes four components: a light source, a front system, a dispersing system that contains a convex grating, and a detector. Based on the definition and measuring principle of diffraction efficiency, the optical scheme of the measurement system is analyzed and the design result is given. Then, in order to validate the feasibility of the designed system, the measurement system is set up and the diffraction efficiency of a convex grating with an aperture of 35 mm, a curvature radius of 72 mm, a blaze angle of 6.4°, a grating period of 2.5 μm and a working waveband of 400 nm-900 nm is tested. Based on GUM (Guide to the Expression of Uncertainty in Measurement), the uncertainties in the measurement results are evaluated. The measured diffraction efficiency data are compared to the theoretical ones, which are calculated based on the grating groove parameters obtained by an atomic force microscope and rigorous coupled wave analysis, and the reliability of the measurement system is illustrated. Finally, the measurement performance of the system is analyzed and tested. The results show that the testing accuracy, the testing stability and the testing repeatability are 2.5%, 0.085% and 3.5%, respectively.

  13. Reframing Rigor: A Modern Look at Challenge and Support in Higher Education

    Science.gov (United States)

    Campbell, Corbin M.; Dortch, Deniece; Burt, Brian A.

    2018-01-01

    This chapter describes the limitations of the traditional notions of academic rigor in higher education, and brings forth a new form of rigor that has the potential to support student success and equity.

  14. Uncertainty in prediction and in inference

    NARCIS (Netherlands)

    Hilgevoord, J.; Uffink, J.

    1991-01-01

    The concepts of uncertainty in prediction and inference are introduced and illustrated using the diffraction of light as an example. The close relationship between the concepts of uncertainty in inference and resolving power is noted. A general quantitative measure of uncertainty in

  15. CFCl3 (CFC-11): UV Absorption Spectrum Temperature Dependence Measurements and the Impact on Atmospheric Lifetime and Uncertainty

    Science.gov (United States)

    Mcgillen, Max R.; Fleming, Eric L.; Jackman, Charles H.; Burkholder, James B.

    2014-01-01

    CFCl3 (CFC-11) is both an atmospheric ozone-depleting substance and a potent greenhouse gas that is removed primarily via stratospheric UV photolysis. Uncertainty in the temperature dependence of its UV absorption spectrum is a significant contributing factor to the overall uncertainty in its global lifetime and, thus, in model calculations of stratospheric ozone recovery and climate change. In this work, the CFC-11 UV absorption spectrum was measured over the 184.95-230 nm wavelength range and the 216-296 K temperature range. We report a spectrum temperature dependence that is weaker than currently recommended for use in atmospheric models. The impact on its atmospheric lifetime was quantified using a 2-D model and the spectrum parameterization developed in this work. The obtained global annually averaged lifetime was 58.1 ± 0.7 years (2σ uncertainty due solely to the spectrum uncertainty). The lifetime is slightly reduced, and its uncertainty significantly reduced, relative to those obtained using the current spectrum recommendations.
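
    The route from the absorption spectrum to the lifetime is the photolysis rate, J = ∫ σ(λ, T) φ(λ) F(λ) dλ, where σ is the absorption cross section, φ the quantum yield, and F the actinic flux; a fractional change in σ therefore maps almost directly onto a fractional change in the photolysis loss rate. A minimal numerical sketch of this integral (trapezoidal quadrature over hypothetical tabulated values, not the 2-D model used in the paper):

```python
import numpy as np

def photolysis_rate(wavelength_nm, cross_section_cm2, actinic_flux, quantum_yield=1.0):
    """J (s^-1): wavelength integral of cross section (cm^2) times quantum
    yield times actinic flux (photons cm^-2 s^-1 nm^-1)."""
    integrand = cross_section_cm2 * quantum_yield * actinic_flux
    return np.trapz(integrand, wavelength_nm)
```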

  16. Accurate measurements of solar spectral irradiance between 4000-10000 cm-1

    Science.gov (United States)

    Elsey, J.; Coleman, M. D.; Gardiner, T.; Shine, K. P.

    2017-12-01

    The near-infrared solar spectral irradiance (SSI) is an important input to simulations of weather and climate; the distribution of energy throughout this region of the spectrum influences atmospheric heating rates and the global hydrological cycle through absorption and scattering by water vapour. Current measurements by a mixture of ground-based and space-based instruments show differences of around 10% in the 4000-7000 cm-1 region, with no resolution to this controversy in sight. This work presents observations of SSI taken with a ground-based Fourier transform spectrometer between 4000 and 10000 cm-1 at a field site in Camborne, UK, with particular focus on a rigorously defined uncertainty budget. While there is good agreement between this work and the commonly used ATLAS3 spectrum between 7000 and 10000 cm-1, the SSI measured here is systematically lower than ATLAS3 by around 10% between 4000 and 7000 cm-1, with no overlap within the k = 2 measurement uncertainties.
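
    The statement that the two spectra do not overlap within the k = 2 uncertainties is a standard consistency test: the difference between the two values exceeds the expanded uncertainty of that difference. A minimal sketch of the test (hypothetical inputs, assuming uncorrelated standard uncertainties):

```python
import math

def consistent(x1, u1, x2, u2, k=2.0):
    """True if two measured values agree within the expanded (k * u)
    uncertainty of their difference, for uncorrelated standard uncertainties."""
    return abs(x1 - x2) <= k * math.sqrt(u1 ** 2 + u2 ** 2)
```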

  17. Meeting the measurement uncertainty and traceability requirements of ISO/IEC standard 17025 in chemical analysis.

    Science.gov (United States)

    King, B

    2001-11-01

    The new laboratory accreditation standard, ISO/IEC 17025, reflects current thinking on good measurement practice by requiring more explicit and more demanding attention to a number of activities. These include client interactions, method validation, traceability, and measurement uncertainty. Since the publication of the standard in 1999 there has been extensive debate about its interpretation. It is the author's view that, if good quality practices are already in place and if the new requirements are introduced in a manner that is fit for purpose, the additional work required to comply can be expected to be modest. The paper argues that the rigour required in addressing these issues should be driven by customer requirements, and the factors that need to be considered in this regard are discussed. The issues addressed include the benefits, interim arrangements, specifying the analytical requirement, establishing traceability, evaluating the uncertainty, and reporting the information.

  18. The rigorous bound on the transmission probability for massless scalar field of non-negative-angular-momentum mode emitted from a Myers-Perry black hole

    Energy Technology Data Exchange (ETDEWEB)

    Ngampitipan, Tritos, E-mail: tritos.ngampitipan@gmail.com [Faculty of Science, Chandrakasem Rajabhat University, Ratchadaphisek Road, Chatuchak, Bangkok 10900 (Thailand); Particle Physics Research Laboratory, Department of Physics, Faculty of Science, Chulalongkorn University, Phayathai Road, Patumwan, Bangkok 10330 (Thailand); Boonserm, Petarpa, E-mail: petarpa.boonserm@gmail.com [Department of Mathematics and Computer Science, Faculty of Science, Chulalongkorn University, Phayathai Road, Patumwan, Bangkok 10330 (Thailand); Chatrabhuti, Auttakit, E-mail: dma3ac2@gmail.com [Particle Physics Research Laboratory, Department of Physics, Faculty of Science, Chulalongkorn University, Phayathai Road, Patumwan, Bangkok 10330 (Thailand); Visser, Matt, E-mail: matt.visser@msor.vuw.ac.nz [School of Mathematics, Statistics, and Operations Research, Victoria University of Wellington, PO Box 600, Wellington 6140 (New Zealand)

    2016-06-02

    Hawking radiation is evidence for the existence of black holes. What an observer can measure through Hawking radiation is the transmission probability. Miniature black holes may be generated in the laboratory, and such black holes would most commonly be Myers-Perry black holes. In this paper, we derive rigorous bounds on the transmission probabilities for massless scalar fields of non-negative-angular-momentum modes emitted from a Myers-Perry black hole in six, seven, and eight dimensions. The results show that at low energy the rigorous bounds increase with the energy of the emitted particles, whereas at high energy they decrease with increasing energy. The bounds also decrease as the black hole spins faster and as the number of extra dimensions increases. Furthermore, comparison with the approximate transmission probability shows the rigorous bound to be useful.
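
    For context, the rigorous bound referred to here is usually quoted in the general Boonserm-Visser form below (a sketch of the standard one-dimensional expression, not the dimension- and mode-specific bounds derived in the paper), where V is the scattering potential, ω the mode frequency, and h(x) > 0 an arbitrary auxiliary function:

```latex
T \geq \operatorname{sech}^{2}\!\left(\int_{-\infty}^{\infty} \vartheta \,\mathrm{d}x\right),
\qquad
\vartheta = \frac{\sqrt{(h')^{2} + \left(\omega^{2} - V(x) - h^{2}\right)^{2}}}{2h}.
```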

  19. The rigorous bound on the transmission probability for massless scalar field of non-negative-angular-momentum mode emitted from a Myers-Perry black hole

    International Nuclear Information System (INIS)

    Ngampitipan, Tritos; Boonserm, Petarpa; Chatrabhuti, Auttakit; Visser, Matt

    2016-01-01

    Hawking radiation is evidence for the existence of black holes. What an observer can measure through Hawking radiation is the transmission probability. Miniature black holes may be generated in the laboratory, and such black holes would most commonly be Myers-Perry black holes. In this paper, we derive rigorous bounds on the transmission probabilities for massless scalar fields of non-negative-angular-momentum modes emitted from a Myers-Perry black hole in six, seven, and eight dimensions. The results show that at low energy the rigorous bounds increase with the energy of the emitted particles, whereas at high energy they decrease with increasing energy. The bounds also decrease as the black hole spins faster and as the number of extra dimensions increases. Furthermore, comparison with the approximate transmission probability shows the rigorous bound to be useful.

  20. Minimizing measurement uncertainties of coniferous needle-leaf optical properties, part II: experimental set-up and error analysis

    NARCIS (Netherlands)

    Yanez Rausell, L.; Malenovsky, Z.; Clevers, J.G.P.W.; Schaepman, M.E.

    2014-01-01

    We present uncertainties associated with the measurement of coniferous needle-leaf optical properties (OPs) with an integrating sphere using an optimized gap-fraction (GF) correction method, where GF refers to the air gaps appearing between the needles of a measured sample. We used an optically