WorldWideScience

Sample records for attribution measuring uncertainty

  1. Greenhouse Gas Source Attribution: Measurements Modeling and Uncertainty Quantification

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Zhen [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Safta, Cosmin [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Sargsyan, Khachik [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Najm, Habib N. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); van Bloemen Waanders, Bart Gustaaf [Sandia National Lab. (SNL-CA), Livermore, CA (United States); LaFranchi, Brian W. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Ivey, Mark D. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Schrader, Paul E. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Michelsen, Hope A. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Bambha, Ray P. [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    2014-09-01

    In this project we have developed atmospheric measurement capabilities and a suite of atmospheric modeling and analysis tools that are well suited for verifying emissions of greenhouse gases (GHGs) on an urban-through-regional scale. We have for the first time applied the Community Multiscale Air Quality (CMAQ) model to simulate atmospheric CO2. This will allow for the examination of regional-scale transport and distribution of CO2 along with air pollutants traditionally studied using CMAQ at relatively high spatial and temporal resolution, with the goal of leveraging emissions verification efforts for both air quality and climate. We have developed a bias-enhanced Bayesian inference approach that can remedy the well-known problem of transport model errors in atmospheric CO2 inversions. We have tested the approach using data and model outputs from the TransCom3 global CO2 inversion comparison project. We have also performed two prototyping studies on inversion approaches in the generalized convection-diffusion context. One of these studies employed Polynomial Chaos Expansion to accelerate the evaluation of a regional transport model and enable efficient Markov Chain Monte Carlo sampling of the posterior for Bayesian inference. The other approach uses deterministic inversion of a convection-diffusion-reaction system in the presence of uncertainty. These approaches should, in principle, be applicable to realistic atmospheric problems with moderate adaptation. We outline a regional greenhouse gas source inference system that integrates (1) two approaches of atmospheric dispersion simulation and (2) a class of Bayesian inference and uncertainty quantification algorithms. We use two different and complementary approaches to simulate atmospheric dispersion. Specifically, we use an Eulerian chemical transport model, CMAQ, and a Lagrangian particle dispersion model, FLEXPART-WRF. These two models share the same WRF
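
    The project's actual inversion system is not reproduced in this record, but the general pattern it describes, inferring source strengths from concentration observations with an explicit transport-bias term in the likelihood and sampling the posterior by Markov chain Monte Carlo, can be sketched on a toy linear problem. Everything below (dimensions, priors, the source-receptor matrix H) is an invented assumption for illustration, not the CMAQ/FLEXPART-WRF configuration.

```python
# Illustrative sketch only: Bayesian source inversion with a transport-model
# bias term, sampled by Metropolis-Hastings. Dimensions, priors, and the
# linear forward model are assumptions for demonstration.
import numpy as np

rng = np.random.default_rng(0)

n_obs, n_src = 50, 3
H = rng.uniform(0.1, 1.0, size=(n_obs, n_src))   # toy source-receptor matrix
true_s = np.array([2.0, 0.5, 1.2])                # true source strengths
true_bias = 0.3                                   # transport-model bias
sigma = 0.2                                       # observation noise std
y = H @ true_s + true_bias + rng.normal(0, sigma, n_obs)

def log_post(s, b):
    """Gaussian likelihood with additive bias, weak Gaussian priors."""
    if np.any(s < 0):
        return -np.inf
    resid = y - (H @ s + b)
    log_lik = -0.5 * np.sum(resid**2) / sigma**2
    log_prior = -0.5 * np.sum(s**2) / 10.0**2 - 0.5 * b**2 / 1.0**2
    return log_lik + log_prior

# Random-walk Metropolis-Hastings over (source strengths, bias)
s, b = np.ones(n_src), 0.0
lp = log_post(s, b)
samples = []
for it in range(20000):
    s_prop = s + rng.normal(0, 0.05, n_src)
    b_prop = b + rng.normal(0, 0.05)
    lp_prop = log_post(s_prop, b_prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        s, b, lp = s_prop, b_prop, lp_prop
    if it > 5000:                                 # discard burn-in
        samples.append(np.append(s, b))

samples = np.array(samples)
print("posterior means (s1, s2, s3, bias):", samples.mean(axis=0).round(2))
print("posterior stds:", samples.std(axis=0).round(2))
```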

  2. Measurement Uncertainty

    Science.gov (United States)

    Koch, Michael

    Measurement uncertainty is one of the key issues in quality assurance. It became increasingly important for analytical chemistry laboratories with the accreditation to ISO/IEC 17025. The uncertainty of a measurement is the most important criterion for the decision whether a measurement result is fit for purpose. It also helps in deciding whether a specification limit is exceeded or not. Estimation of measurement uncertainty is often not trivial. Several strategies have been developed for this purpose and are described briefly in this chapter. In addition, the different possibilities for taking the uncertainty into account in compliance assessment are explained.
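
    As a concrete example of the kind of estimation discussed here, a bottom-up (GUM-style) budget combines the standard uncertainties of independent input contributions in quadrature and applies a coverage factor; the contributions and numbers below are invented for illustration.

```python
# Minimal bottom-up uncertainty budget (GUM-style), with invented numbers.
# Independent contributions are combined in quadrature and expanded with a
# coverage factor k = 2 (~95 % coverage for a normal distribution).
import math

contributions = {
    "repeatability":        0.12,   # standard uncertainty, e.g. in mg/L
    "calibration standard": 0.08,
    "volumetric step":      0.05,
    "recovery correction":  0.10,
}

u_combined = math.sqrt(sum(u**2 for u in contributions.values()))
U_expanded = 2.0 * u_combined        # coverage factor k = 2

print(f"combined standard uncertainty u_c = {u_combined:.3f} mg/L")
print(f"expanded uncertainty U (k=2)     = {U_expanded:.3f} mg/L")
```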

  3. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.
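
    The Monte Carlo principle mentioned above can be sketched as repeatedly evaluating a measurement model with inputs drawn from their assumed distributions and reading a coverage interval off the resulting output sample; the measurement model Y = (V/t)·c below is an invented example, not one taken from the book.

```python
# Monte Carlo evaluation of measurement uncertainty for an invented model
# Y = (V / t) * c, propagating assumed input distributions and reporting a
# 95 % probabilistically symmetric coverage interval of the output sample.
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

V = rng.normal(10.00, 0.02, n)        # volume, normal
t = rng.normal(60.0, 0.5, n)          # time, normal
c = rng.uniform(0.98, 1.02, n)        # correction factor, rectangular

Y = (V / t) * c
lo, hi = np.percentile(Y, [2.5, 97.5])
print(f"estimate = {Y.mean():.4f}, 95 % interval = [{lo:.4f}, {hi:.4f}]")
```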

  4. Attribution Theory and Judgment under Uncertainty

    Science.gov (United States)

    1975-06-13

    1972) has presented some evidence indicating that people are prone to primacy effects (i.e., relying on the first sufficient explanation that comes ... ME. Probability learning and a negative recency effect in the serial anticipation of alternative symbols. Journal of Experimental Psychology, 1951 ... however, is the difference in the picture of men and women which emerges from them. Attribution researchers find people to be effective processors of

  5. Evaluation and attribution of OCO-2 XCO2 uncertainties

    Science.gov (United States)

    Worden, John R.; Doran, Gary; Kulawik, Susan; Eldering, Annmarie; Crisp, David; Frankenberg, Christian; O'Dell, Chris; Bowman, Kevin

    2017-07-01

    Evaluating and attributing uncertainties in total column atmospheric CO2 measurements (XCO2) from the OCO-2 instrument is critical for testing hypotheses related to the underlying processes controlling XCO2 and for developing quality flags needed to choose those measurements that are usable for carbon cycle science. Here we test the reported uncertainties of version 7 OCO-2 XCO2 measurements by examining variations of the XCO2 measurements and their calculated uncertainties within small regions (~ 100 km × 10.5 km) in which natural CO2 variability is expected to be small relative to variations imparted by noise or interferences. Over 39 000 of these small neighborhoods, each comprising approximately 190 observations, are used for this analysis. We find that a typical ocean measurement has a precision and accuracy of 0.35 and 0.24 ppm, respectively, for calculated precisions larger than ~ 0.25 ppm. These values are approximately consistent with the calculated errors of 0.33 and 0.14 ppm for the noise and interference error, assuming that the accuracy is bounded by the calculated interference error. The actual precision for ocean data becomes worse as the signal-to-noise ratio increases or the calculated precision decreases below 0.25 ppm, for reasons that are not well understood. A typical land measurement, both nadir and glint, is found to have a precision and accuracy of approximately 0.75 and 0.65 ppm, respectively, as compared to the calculated precision and accuracy of approximately 0.36 and 0.2 ppm. The difference in accuracy between ocean and land suggests that the accuracy of XCO2 data is likely related to interferences such as aerosols or surface albedo, as these vary less over ocean than over land. The accuracy as derived here is also likely a lower bound, as it does not account for possible systematic biases between the regions used in this analysis.
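
    The bookkeeping behind the small-neighborhood test can be mimicked on synthetic data: assume the true XCO2 is constant within each neighborhood and compare the empirical scatter of the soundings to the calculated retrieval precision. The sketch below is purely illustrative; the neighborhood count, sounding count, and noise level are assumptions loosely based on the numbers quoted above, not the OCO-2 processing itself.

```python
# Synthetic illustration of the small-neighborhood test: compare empirical
# within-neighborhood scatter of XCO2 to an assumed calculated precision.
import numpy as np

rng = np.random.default_rng(2)
n_neigh, n_per = 1000, 190            # neighborhoods and soundings per neighborhood
calc_precision = 0.35                 # assumed calculated precision, ppm
true_xco2 = rng.normal(400.0, 1.0, n_neigh)   # one true value per small region

empirical_sd = []
for mu in true_xco2:
    soundings = mu + rng.normal(0.0, calc_precision, n_per)
    empirical_sd.append(soundings.std(ddof=1))

print("median empirical scatter:", round(float(np.median(empirical_sd)), 3), "ppm")
print("assumed calculated precision:", calc_precision, "ppm")
```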

  6. Traceability and Measurement Uncertainty

    DEFF Research Database (Denmark)

    Tosello, Guido; De Chiffre, Leonardo

    2004-01-01

    This report is made as a part of the project ‘Metro-E-Learn: European e-Learning in Manufacturing Metrology’, an EU project under the program SOCRATES MINERVA (ODL and ICT in Education), Contract No: 101434-CP-1-2002-1-DE-MINERVA, coordinated by Friedrich-Alexander-University Erlangen-Nürnberg, Chair for Quality Management and Manufacturing-Oriented Metrology (Germany). The ‘Metro-E-Learn’ project proposes to develop and implement a coherent learning and competence chain that leads from introductory and foundation e-courses in initial manufacturing engineering studies towards higher ... The e-learning material covers, among other topics, machine tool testing, the role of manufacturing metrology for QM, inspection planning, quality management of measurements incl. documentation, and advanced manufacturing measurement technology. The present report represents section 2 – Traceability and Measurement Uncertainty – of this e-learning material.

  7. Uncertainties in scientific measurements

    Energy Technology Data Exchange (ETDEWEB)

    Holden, N.E.

    1986-11-16

    Some examples of nuclear data in which the uncertainty has been underestimated, or at least appears to be underestimated, are reviewed. The subjective aspect of the problem of systematic uncertainties is discussed. Historical aspects of the data uncertainty problem are noted. 64 refs., 6 tabs.

  8. Attempting Measurement of Psychological Attributes

    Directory of Open Access Journals (Sweden)

    Thomas Salzberger

    2013-02-01

    Measures of psychological attributes abound in the social sciences as much as measures of physical properties do in the physical sciences. However, there are crucial differences in the scientific underpinning of measurement between the two domains. While measurement in the physical sciences is supported by empirical evidence that demonstrates the quantitative nature of the property assessed, measurement in the social sciences is, in large part, made possible only by a vague, discretionary definition of measurement that places hardly any restrictions on empirical data. Traditional psychometric analyses fail to address the requirements of measurement as defined more rigorously in the physical sciences. The construct definitions do not allow for testable predictions, and content validity becomes a matter of judgment. In order to improve measurement of psychological attributes, it is suggested, first, to readopt the definition of measurement used in the physical sciences; second, to devise an elaborate theory of the construct to be measured that includes the hypothesis of a quantitative attribute; and third, to test the data for the structure implied by the hypothesis of quantity as well as for predictions derived from the theory of the construct.

  9. Uncertainty in measurements by counting

    Science.gov (United States)

    Bich, Walter; Pennecchi, Francesca

    2012-02-01

    Counting is at the base of many high-level measurements, such as, for example, frequency measurements. In some instances the measurand itself is a number of events, such as spontaneous decays in activity measurements, or objects, such as colonies of bacteria in microbiology. Countings also play a fundamental role in everyday life. In any case, a counting is a measurement. A measurement result, according to its present definition, as given in the 'International Vocabulary of Metrology—Basic and general concepts and associated terms (VIM)', must include a specification concerning the estimated uncertainty. As concerns measurements by counting, this specification is not easy to encompass in the well-known framework of the 'Guide to the Expression of Uncertainty in Measurement', known as GUM, in which there is no guidance on the topic. Furthermore, the issue of uncertainty in countings has received little or no attention in the literature, so that it is commonly accepted that this category of measurements constitutes an exception in which the concept of uncertainty is not applicable, or, alternatively, that results of measurements by counting have essentially no uncertainty. In this paper we propose a general model for measurements by counting which allows an uncertainty evaluation compliant with the general framework of the GUM.
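
    The paper's general model is not reproduced here, but the most familiar special case, a count of independent random events treated as a Poisson variable whose standard uncertainty is the square root of the count, makes the point that counting results do carry an evaluable uncertainty:

```python
# Poisson special case: for N counted events the standard uncertainty of the
# count is sqrt(N), so the relative uncertainty falls as 1/sqrt(N).
import math

for N in (10, 100, 10_000):
    u = math.sqrt(N)
    print(f"N = {N:>6d}: u = {u:8.1f}  relative = {u / N:.3%}")
```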

  10. Optimal entropic uncertainty relation for successive measurements ...

    Indian Academy of Sciences (India)

    Keywords: uncertainty relations; information-theoretic entropy; optimum bounds; successive measurements; influence of measurements on uncertainties.

  11. Uncertainty Calculation for Spectral-Responsivity Measurements

    National Research Council Canada - National Science Library

    Lehman, John H; Wang, C M; Dowell, Marla L; Hadler, Joshua A

    2009-01-01

    .... Relative expanded uncertainties based on the methods from the Guide to the Expression of Uncertainty in Measurement and from Supplement 1 to the "Guide to the Expression of Uncertainty in Measurement...

  12. Black Hole Spin Measurement Uncertainty

    Science.gov (United States)

    Salvesen, Greg; Begelman, Mitchell C.

    2018-01-01

    Angular momentum, or spin, is one of only two fundamental properties of astrophysical black holes, and measuring its value has numerous applications. For instance, obtaining reliable spin measurements could constrain the growth history of supermassive black holes and reveal whether relativistic jets are powered by tapping into the black hole spin reservoir. The two well-established techniques for measuring black hole spin can both be applied to X-ray binaries, but are in disagreement for cases of non-maximal spin. This discrepancy must be resolved if either technique is to be deemed robust. We show that the technique based on disc continuum fitting is sensitive to uncertainties regarding the disc atmosphere, which are observationally unconstrained. By incorporating reasonable uncertainties into black hole spin probability density functions, we demonstrate that the spin measured by disc continuum fitting can become highly uncertain. Future work toward understanding how the observed disc continuum is altered by atmospheric physics, particularly magnetic fields, will further strengthen black hole spin measurement techniques.

  13. Measuring the uncertainty of coupling

    Science.gov (United States)

    Zhao, Xiaojun; Shang, Pengjian

    2015-06-01

    A new information-theoretic measure, called coupling entropy, is proposed here to detect the causal links in complex systems by taking into account the inner composition alignment of temporal structure. It is a permutation-based asymmetric association measure to infer the uncertainty of coupling between two time series. The coupling entropy is found to be effective in the analysis of Hénon maps, where different noises are added to test its accuracy and sensitivity. The coupling entropy is also applied to analyze the relationship between unemployment rate and CPI change in the U.S., where the CPI change turns out to be the driving variable while the unemployment rate is the responding one.

  14. Experimental Joint Quantum Measurements with Minimum Uncertainty

    Science.gov (United States)

    Ringbauer, Martin; Biggerstaff, Devon N.; Broome, Matthew A.; Fedrizzi, Alessandro; Branciard, Cyril; White, Andrew G.

    2014-01-01

    Quantum physics constrains the accuracy of joint measurements of incompatible observables. Here we test tight measurement-uncertainty relations using single photons. We implement two independent, idealized uncertainty-estimation methods, the three-state method and the weak-measurement method, and adapt them to realistic experimental conditions. Exceptional quantum state fidelities of up to 0.999 98(6) allow us to verge upon the fundamental limits of measurement uncertainty.

  15. Where does the uncertainty come from? Attributing Uncertainty in Conceptual Hydrologic Modelling

    Science.gov (United States)

    Abu Shoaib, S.; Marshall, L. A.; Sharma, A.

    2015-12-01

    Defining an appropriate forecasting model is a key phase in water resources planning and design. Quantification of uncertainty is an important step in the development and application of hydrologic models. In this study, we examine the dependency of hydrologic model uncertainty on the observed model inputs, the defined model structure, parameter identifiability in optimization, and the selected likelihood function. We present here a new uncertainty metric, the Quantile Flow Deviation or QFD, to evaluate the relative uncertainty due to each of these sources under a range of catchment conditions. Through the metric, we may identify the potential spectrum of uncertainty and variability in model simulations. The QFD assesses uncertainty by estimating the deviation in flows at a given quantile across a range of scenarios. By using a quantile-based metric, the change in uncertainty across individual percentiles can be assessed, thereby allowing uncertainty to be expressed as a function of time. The QFD method can be disaggregated to examine any part of the modelling process, including the selection of certain model subroutines or forcing data. Case study results (including catchments in Australia and the USA) suggest that model structure selection is vital irrespective of the flow percentile of interest or the catchment being studied. Examining the QFD across various quantiles additionally demonstrates that lower-yielding catchments may have greater variation due to selected model structures. By incorporating multiple model structures, it is possible to assess (i) the relative importance of various sources of uncertainty; (ii) how these vary with catchment location or hydrologic regime; and (iii) the impact of the length of available observations on uncertainty quantification.
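
    The exact definition of the QFD is given in the underlying paper; as an assumed, simplified interpretation of the quantile-based idea, one can look at the spread of simulated flows at a given quantile across an ensemble built from different model structures or forcings. The sketch below uses synthetic flow series and an invented aggregation, and is only meant to illustrate the style of such a metric.

```python
# Assumed, simplified interpretation of a quantile-based deviation metric:
# for each flow quantile, measure the spread across ensemble members built
# from different model structures / forcings. All data here are synthetic.
import numpy as np

rng = np.random.default_rng(3)
n_days, n_members = 3650, 8
# Synthetic ensemble of simulated daily flows (lognormal, member-specific bias)
ensemble = np.array([
    rng.lognormal(mean=1.0 + 0.1 * m, sigma=0.8, size=n_days)
    for m in range(n_members)
])

quantiles = [0.1, 0.5, 0.9, 0.99]
flows_at_q = np.quantile(ensemble, quantiles, axis=1)   # shape (len(q), members)

for q, row in zip(quantiles, flows_at_q):
    spread = row.max() - row.min()
    print(f"quantile {q:4.2f}: ensemble range of flow = {spread:7.2f}")
```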

  16. Exploring the uncertainty in attributing sediment contributions in fingerprinting studies due to uncertainty in determining element concentrations in source areas.

    Science.gov (United States)

    Gomez, Jose Alfonso; Owens, Phillip N.; Koiter, Alex J.; Lobb, David

    2016-04-01

    One of the major sources of uncertainty in attributing sediment sources in fingerprinting studies is the uncertainty in determining the concentrations of the elements used in the mixing model, due to the variability of the concentrations of these elements in the source materials (e.g., Kraushaar et al., 2015). The uncertainty in determining the "true" concentration of a given element in each one of the source areas depends on several factors, among them the spatial variability of that element, the sampling procedure and the sampling density. Researchers have limited control over these factors, and sampling usually tends to be sparse, limited by time and the resources available. Monte Carlo analysis has been used regularly in fingerprinting studies to explore the probable solutions within the measured variability of the elements in the source areas, providing an appraisal of the probability of the different solutions (e.g., Collins et al., 2012). This problem can be considered analogous to the propagation of uncertainty in hydrologic models due to uncertainty in the determination of the values of the model parameters, and there are many examples of Monte Carlo analysis of this uncertainty (e.g., Freeze, 1980; Gómez et al., 2001). Some of these model analyses rely on the simulation of "virtual" situations that were calibrated from parameter values found in the literature, with the purpose of providing insight into the response of the model to different configurations of input parameters. This approach - evaluating the answer for a "virtual" problem whose solution could be known in advance - might be useful in evaluating the propagation of uncertainty in mixing models in sediment fingerprinting studies. In this communication, we present the preliminary results of an on-going study evaluating the effect of variability of element concentrations in source materials, sampling density, and the number of elements included in the mixing models. For this study a virtual
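
    A minimal version of the Monte Carlo exercise described above, with two virtual source areas, element concentrations drawn from an assumed within-source variability, and the mixing proportion re-estimated by least squares on each draw, might look as follows; all concentrations and the variability level are invented.

```python
# Virtual two-source fingerprinting experiment: draw source concentrations
# from assumed within-source variability and recover the mixing proportion
# by least squares in each Monte Carlo iteration. Numbers are invented.
import numpy as np

rng = np.random.default_rng(4)

# "True" mean concentrations (sources A and B; three tracer elements)
mean_A = np.array([12.0, 3.0, 45.0])
mean_B = np.array([ 6.0, 8.0, 20.0])
cv = 0.15                                  # assumed within-source variability
true_p = 0.7                               # true proportion of source A
sediment = true_p * mean_A + (1 - true_p) * mean_B

estimates = []
for _ in range(5000):
    a = rng.normal(mean_A, cv * mean_A)    # sampled source-A composition
    b = rng.normal(mean_B, cv * mean_B)    # sampled source-B composition
    # Least-squares proportion p minimizing ||sediment - (p*a + (1-p)*b)||
    num = np.dot(sediment - b, a - b)
    den = np.dot(a - b, a - b)
    estimates.append(np.clip(num / den, 0.0, 1.0))

estimates = np.array(estimates)
print(f"true p = {true_p}, estimated p = {estimates.mean():.2f} "
      f"± {estimates.std():.2f} (Monte Carlo spread)")
```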

  17. Not Normal: the uncertainties of scientific measurements

    Science.gov (United States)

    Bailey, David C.

    2017-01-01

    Judging the significance and reproducibility of quantitative research requires a good understanding of relevant uncertainties, but it is often unclear how well these have been evaluated and what they imply. Reported scientific uncertainties were studied by analysing 41 000 measurements of 3200 quantities from medicine, nuclear and particle physics, and interlaboratory comparisons ranging from chemistry to toxicology. Outliers are common, with 5σ disagreements up to five orders of magnitude more frequent than naively expected. Uncertainty-normalized differences between multiple measurements of the same quantity are consistent with heavy-tailed Student's t-distributions that are often almost Cauchy, far from a Gaussian Normal bell curve. Medical research uncertainties are generally as well evaluated as those in physics, but physics uncertainty improves more rapidly, making feasible simple significance criteria such as the 5σ discovery convention in particle physics. Contributions to measurement uncertainty from mistakes and unknown problems are not completely unpredictable. Such errors appear to have power-law distributions consistent with how designed complex systems fail, and how unknown systematic errors are constrained by researchers. This better understanding may help improve analysis and meta-analysis of data, and help scientists and the public have more realistic expectations of what scientific results imply.
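
    The practical consequence of heavy, nearly Cauchy tails can be seen by comparing two-sided 5σ tail probabilities of Student's t-distributions with few degrees of freedom against the Gaussian value; the degrees of freedom below are chosen only for illustration and are not fits reported by the paper.

```python
# Two-sided probability of a deviation of at least 5 sigma-equivalents under
# a Gaussian versus Student's t with few degrees of freedom (illustrative dof).
from scipy import stats

z = 5.0
p_normal = 2 * stats.norm.sf(z)
print(f"Normal    : P(|x| >= 5) = {p_normal:.2e}")
for dof in (2, 5, 30):
    p_t = 2 * stats.t.sf(z, df=dof)
    print(f"t (dof={dof:2d}): P(|x| >= 5) = {p_t:.2e}  "
          f"(~{p_t / p_normal:.0f}x more frequent)")
```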

  19. Minimum requirements for the estimation of measurement uncertainty: Recommendations of the joint Working group for uncertainty of measurement of the CSMBLM and CCMB.

    Science.gov (United States)

    Ćelap, Ivana; Vukasović, Ines; Juričić, Gordana; Šimundić, Ana-Maria

    2017-10-15

    The International Vocabulary of Metrology - Basic and General Concepts and Associated Terms (VIM3, 2.26 measurement uncertainty, JCGM 200:2012) defines uncertainty of measurement as a non-negative parameter characterizing the dispersion of the quantity values being attributed to a measurand, based on the information obtained from performing the measurement. The Clinical and Laboratory Standards Institute (CLSI) has published a very detailed guideline with a description of the sources contributing to measurement uncertainty as well as different approaches to the calculation (Expression of measurement uncertainty in laboratory medicine; Approved Guideline, CLSI C51-A 2012). Many other national and international recommendations and original scientific papers about measurement uncertainty estimation have been published. In Croatia, the estimation of measurement uncertainty is obligatory for accredited medical laboratories. However, since national recommendations are currently not available, each of these laboratories uses a different approach to measurement uncertainty estimation. The main purpose of this document is to describe the minimal requirements for measurement uncertainty estimation. In this way, it will contribute to the harmonization of measurement uncertainty estimation, evaluation and reporting across laboratories in Croatia. This recommendation is issued by the joint Working Group for Uncertainty of Measurement of the Croatian Society for Medical Biochemistry and Laboratory Medicine and the Croatian Chamber of Medical Biochemists. The document is based mainly on the recommendations of the Australasian Association of Clinical Biochemists (AACB) Uncertainty of Measurement Working Group and is intended for all medical biochemistry laboratories in Croatia.
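
    The recommendation itself defines the minimal requirements; purely as an illustration of the general style of such estimates, one widely used top-down route (a Nordtest-type combination of within-laboratory reproducibility with a bias component verified against a reference material) is sketched below with invented numbers. This is not a restatement of the CSMBLM/CCMB rules.

```python
# Top-down measurement uncertainty from intermediate precision and bias
# (Nordtest-style combination); all numbers are invented for illustration.
import math

u_Rw   = 1.8      # within-lab reproducibility (intermediate precision), %
bias   = 1.2      # observed bias against a certified reference material, %
u_cref = 0.6      # standard uncertainty of the reference value, %
n_bias = 10       # replicates used to estimate the bias
s_bias = 1.5      # standard deviation of those replicates, %

u_bias = math.sqrt(bias**2 + (s_bias / math.sqrt(n_bias))**2 + u_cref**2)
u_c = math.sqrt(u_Rw**2 + u_bias**2)
U = 2 * u_c       # expanded uncertainty, k = 2

print(f"u(bias) = {u_bias:.2f} %, u_c = {u_c:.2f} %, U (k=2) = {U:.2f} %")
```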

  20. Measuring the uncertainty of tapping torque

    DEFF Research Database (Denmark)

    Belluco, Walter; De Chiffre, Leonardo

    An uncertainty budget is carried out for torque measurements performed at the Institut for Procesteknik for the evaluation of cutting fluids. Thirty test blanks were machined with one tool and one fluid, torque diagrams were recorded and the repeatability of single torque measurements was estimated...

  1. Managing Measurement Uncertainty in Building Acoustics

    Directory of Open Access Journals (Sweden)

    Chiara Scrosati

    2015-12-01

    In general, uncertainties should preferably be determined following the principles laid down in ISO/IEC Guide 98-3, the Guide to the expression of uncertainty in measurement (GUM:1995). According to current knowledge, it seems impossible to formulate these models for the different quantities in building acoustics. Therefore, the concepts of repeatability and reproducibility are necessary to determine the uncertainty of building acoustics measurements. This study shows the uncertainty of field measurements of a lightweight wall, a heavyweight floor, a façade with a single-glazed window and a façade with a double-glazed window, analyzed by a Round Robin Test (RRT) conducted in a full-scale experimental building at ITC-CNR (Construction Technologies Institute of the National Research Council of Italy). The single-number quantities and their uncertainties were evaluated in both the narrow and the enlarged frequency range, and it was shown that including or excluding the low frequencies leads to very significant differences, except in the case of the sound insulation of the façade with a single-glazed window. The results obtained in these RRTs were compared with other results from the literature, which confirm the increase of the uncertainty of single-number quantities due to the extension to low frequencies. Having stated the measurement uncertainty for a single measurement, in building acoustics it is also very important to deal with sampling for the purposes of classification of buildings or building units. Therefore, this study also shows an application of the sampling included in the Italian Standard on the acoustic classification of building units to a serial-type building consisting of 47 building units. It was found that the greatest variability is observed in the façade, and it depends both on the great variability of window typologies and on workmanship. Finally, it is suggested how to manage the uncertainty in building acoustics, both for one single

  2. Assessing student understanding of measurement and uncertainty

    Science.gov (United States)

    Jirungnimitsakul, S.; Wattanakasiwich, P.

    2017-09-01

    The objectives of this study were to develop and assess student understanding of measurement and uncertainty. A test was adapted and translated from the Laboratory Data Analysis Instrument (LDAI) test; it consists of 25 questions focused on three topics: measures of central tendency, experimental errors and uncertainties, and fitting regression lines. The content validity of the test was evaluated by three physics experts in teaching physics laboratories. In the pilot study, the Thai LDAI was administered to 93 freshmen enrolled in a fundamental physics laboratory course. The final draft of the test was administered to three groups: 45 freshmen taking the fundamental physics laboratory, 16 sophomores taking the intermediate physics laboratory, and 21 juniors taking the advanced physics laboratory at Chiang Mai University. We found that the freshmen had difficulties with experimental errors and uncertainties. Most students had problems with fitting regression lines. These results will be used to improve the teaching and learning of physics laboratories for physics students in the department.

  3. Uncertainty Measures of Regional Flood Frequency Estimators

    DEFF Research Database (Denmark)

    Rosbjerg, Dan; Madsen, Henrik

    1995-01-01

    Regional flood frequency models have different assumptions regarding homogeneity and inter-site independence. Thus, uncertainty measures of T-year event estimators are not directly comparable. However, having chosen a particular method, the reliability of the estimate should always be stated, e...

  4. Uncertainty of dose measurement in radiation processing

    DEFF Research Database (Denmark)

    Miller, A.

    1996-01-01

    The major standard organizations of the world have addressed the issue of reporting uncertainties in measurement reports and certificates. There is, however, still some ambiguity in the minds of many people who try to implement the recommendations in real life. This paper is a contribution...

  5. Measurement uncertainty analysis techniques applied to PV performance measurements

    Energy Technology Data Exchange (ETDEWEB)

    Wells, C

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis toward its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; demonstrates that quality assurance and quality control measures have been accomplished; and defines valid data as data having known and documented paths of origin (including theory), measurements, traceability to measurement standards, computations, and uncertainty analysis of results.

  7. Uncertainty in Measurement and Total Error: Tools for Coping with Diagnostic Uncertainty.

    Science.gov (United States)

    Theodorsson, Elvar

    2017-03-01

    Laboratory medicine decreases diagnostic uncertainty, but is itself influenced by factors causing uncertainties. Error and uncertainty methods are commonly seen as incompatible in laboratory medicine. New versions of the Guide to the Expression of Uncertainty in Measurement and the International Vocabulary of Metrology will incorporate both uncertainty and error methods, which will assist collaboration between metrology and laboratories. The law of propagation of uncertainty and Bayesian statistics are theoretically preferable to frequentist statistical methods in diagnostic medicine. However, frequentist statistics are better known and more widely practiced. Error and uncertainty methods should both be recognized as legitimate for calculating diagnostic uncertainty. Copyright © 2016 The Author. Published by Elsevier Inc. All rights reserved.

  8. Measuring the uncertainties of discharge measurements: interlaboratory experiments in hydrometry

    Science.gov (United States)

    Le Coz, Jérôme; Blanquart, Bertrand; Pobanz, Karine; Dramais, Guillaume; Pierrefeu, Gilles; Hauet, Alexandre; Despax, Aurélien

    2015-04-01

    Quantifying the uncertainty of streamflow data is key for the hydrological sciences. The conventional uncertainty analysis based on error propagation techniques is restricted by the absence of traceable discharge standards and by the weight of difficult-to-predict errors related to the operator, procedure and measurement environment. Field interlaboratory experiments have recently emerged as an efficient, standardized method to 'measure' the uncertainties of a given streamgauging technique in given measurement conditions. Both uncertainty approaches are compatible and should be developed jointly in the field of hydrometry. In recent years, several interlaboratory experiments have been reported by different hydrological services. They involved different streamgauging techniques, including acoustic profilers (ADCP), current-meters and handheld radars (SVR). Uncertainty analysis was not always their primary goal: most often, testing the proficiency and homogeneity of instruments, makes and models, procedures and operators was the original motivation. When interlaboratory experiments are processed for uncertainty analysis, once outliers have been discarded, all participants are assumed to be equally skilled and to apply the same streamgauging technique in equivalent conditions. A universal requirement is that all participants simultaneously measure the same discharge, which shall be kept constant within negligible variations. To the best of our knowledge, we were the first to apply the interlaboratory method for computing the uncertainties of streamgauging techniques according to the authoritative international documents (ISO standards). Several specific issues arise due to the measurement conditions in outdoor canals and rivers. The main limitation is that the best available river discharge references are usually too uncertain to quantify the bias of the streamgauging technique, i.e. the systematic errors that are common to all participants in the experiment. A reference or a
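
    When such an experiment is processed for uncertainty, the core computation is simple: with all participants measuring the same constant discharge, the spread of their results estimates the random component of the technique's uncertainty, while any bias common to all participants remains invisible without a better reference. A minimal sketch with invented discharges:

```python
# Minimal interlaboratory-style computation: the standard deviation across
# participants measuring the same constant discharge estimates the random
# component of the technique's uncertainty. Values below are invented.
import numpy as np

discharges = np.array([51.2, 49.8, 50.5, 52.0, 50.1, 49.5, 51.0, 50.7])  # m3/s

mean_q = discharges.mean()
s_between = discharges.std(ddof=1)                 # spread between participants
u_rel = s_between / mean_q                         # relative standard uncertainty
U_rel = 2 * u_rel                                  # expanded, k = 2

print(f"mean discharge = {mean_q:.1f} m3/s")
print(f"relative expanded uncertainty (k=2) = {U_rel:.1%} "
      "(excludes any bias common to all participants)")
```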

  9. Measurement Uncertainties in Science and Technology

    CERN Document Server

    Grabe, Michael

    2005-01-01

    At the turn of the 19th century, Carl Friedrich Gauß founded error calculus by predicting the then unknown position of the planet Ceres. Ever since, error calculus has occupied a place at the heart of science. In this book, Grabe illustrates the breakdown of traditional error calculus in the face of modern measurement techniques. Revising Gauß’ error calculus ab initio, he treats random and unknown systematic errors on an equal footing from the outset. Furthermore, Grabe also proposes what may be called well defined measuring conditions, a prerequisite for defining confidence intervals that are consistent with basic statistical concepts. The resulting measurement uncertainties are as robust and reliable as required by modern-day science, engineering and technology.

  10. Inconclusive quantum measurements and decisions under uncertainty

    Directory of Open Access Journals (Sweden)

    Vyacheslav I. Yukalov

    2016-04-01

    We give a mathematical definition for the notion of inconclusive quantum measurements. In physics, such measurements occur at intermediate stages of a complex measurement procedure, with the final measurement result being operationally testable. Since the mathematical structure of Quantum Decision Theory has been developed in analogy with the theory of quantum measurements, the inconclusive quantum measurements correspond, in Quantum Decision Theory, to intermediate stages of decision making in the process of taking decisions under uncertainty. The general form of the quantum probability for a composite event is the sum of a utility factor, describing a rational evaluation of the considered prospect, and of an attraction factor, characterizing irrational, subconscious attitudes of the decision maker. Despite the involved irrationality, the probability of prospects can be evaluated. This is equivalent to the possibility of calculating quantum probabilities without specifying hidden variables. We formulate a general way of evaluation, based on the use of non-informative priors. As an example, we suggest the explanation of the decoy effect. Our quantitative predictions are in very good agreement with experimental data.

  11. Quantile-based bias correction and uncertainty quantification of extreme event attribution statements

    Directory of Open Access Journals (Sweden)

    Soyoung Jeon

    2016-06-01

    Extreme event attribution characterizes how anthropogenic climate change may have influenced the probability and magnitude of selected individual extreme weather and climate events. Attribution statements often involve quantification of the fraction of attributable risk (FAR) or the risk ratio (RR) and associated confidence intervals. Many such analyses use climate model output to characterize extreme event behavior with and without anthropogenic influence. However, such climate models may have biases in their representation of extreme events. To account for discrepancies in the probabilities of extreme events between observational datasets and model datasets, we demonstrate an appropriate rescaling of the model output based on the quantiles of the datasets to estimate an adjusted risk ratio. Our methodology accounts for various components of uncertainty in estimation of the risk ratio. In particular, we present an approach to construct a one-sided confidence interval on the lower bound of the risk ratio when the estimated risk ratio is infinity. We demonstrate the methodology using the summer 2011 central US heatwave and output from the Community Earth System Model. In this example, we find that the lower bound of the risk ratio is relatively insensitive to the magnitude and probability of the actual event.
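
    A highly simplified sketch of the quantile-based adjustment can be written in a few lines: map the observed event threshold into model space by matching non-exceedance quantiles, then compare exceedance probabilities in the factual and counterfactual ensembles. All data below are synthetic, and the mapping is a bare-bones stand-in for the paper's methodology, which also treats the uncertainty of the resulting risk ratio.

```python
# Illustrative sketch of a quantile-based adjustment before computing a risk
# ratio: map the event threshold from observation space into model space via
# matching quantiles, then compare exceedance probabilities in the factual
# and counterfactual model ensembles. All data here are synthetic.
import numpy as np

rng = np.random.default_rng(5)

obs           = rng.normal(30.0, 2.0, 60)     # observed summer temperatures
model_factual = rng.normal(28.0, 2.5, 500)    # model world with anthropogenic forcing
model_counter = rng.normal(26.5, 2.5, 500)    # counterfactual (natural-only) world

threshold_obs = 33.0                          # observed extreme-event threshold
# Non-exceedance quantile of the threshold in observations ...
q = (obs < threshold_obs).mean()
# ... mapped to the equivalent threshold in the factual model climatology
threshold_model = np.quantile(model_factual, q)

p1 = (model_factual >= threshold_model).mean()   # probability with forcing
p0 = (model_counter >= threshold_model).mean()   # probability without forcing
rr = np.inf if p0 == 0 else p1 / p0
print(f"adjusted threshold = {threshold_model:.1f}, "
      f"p1 = {p1:.3f}, p0 = {p0:.3f}, risk ratio = {rr:.1f}")
```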

  12. A Unified Approach for Reporting ARM Measurement Uncertainties Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Campos, E [Argonne National Lab. (ANL), Argonne, IL (United States); Sisterson, Douglas [Argonne National Lab. (ANL), Argonne, IL (United States)

    2016-12-01

    The U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Climate Research Facility is observationally based, and quantifying the uncertainty of its measurements is critically important. With over 300 widely differing instruments providing over 2,500 datastreams, concise expression of measurement uncertainty is quite challenging. The ARM Facility currently provides data and supporting metadata (information about the data or data quality) to its users through a number of sources. Because the continued success of the ARM Facility depends on the known quality of its measurements, the Facility relies on instrument mentors and the ARM Data Quality Office (DQO) to ensure, assess, and report measurement quality. Therefore, an easily accessible, well-articulated estimate of ARM measurement uncertainty is needed. Note that some of the instrument observations require mathematical algorithms (retrievals) to convert a measured engineering variable into a useful geophysical measurement. While those types of retrieval measurements are identified, this study does not address particular methods for retrieval uncertainty. In addition, the ARM Facility provides engineered data products, or value-added products (VAPs), based on multiple instrument measurements. This study does not include uncertainty estimates for those data products. We propose here that a total measurement uncertainty should be calculated as a function of the instrument uncertainty (calibration factors), the field uncertainty (environmental factors), and the retrieval uncertainty (algorithm factors). The study will not expand on methods for computing these uncertainties. Instead, it will focus on the practical identification, characterization, and inventory of the measurement uncertainties already available in the ARM community through the ARM instrument mentors and their ARM instrument handbooks. As a result, this study will address the first steps towards reporting ARM measurement uncertainty.

  13. Spatial Uncertainty Analysis for LVIS and UAV-SAR Attribute Fusion

    Science.gov (United States)

    Chakravarty, S.; Franks, S.

    2011-12-01

    Due to the medium-to-low resolution of the above sensors (LVIS and UAV-SAR), fusion analysis on the extracted attributes is mostly plagued by uncertainties. In this study the information extracted from the two modalities is treated using spatial uncertainty analysis. A statistical set-theoretic analysis as well as a simulation-based approach using the law of error propagation are tried. The results of the uncertainty analysis can be used as a performance metric or as feedback for the respective attribute extraction algorithms. (1) http://lvis.gsfc.nasa.gov/index.php (2) http://uavsar.jpl.nasa.gov/

  14. Quantification of the uncertainties of high-speed camera measurements

    Directory of Open Access Journals (Sweden)

    Robbe C.

    2014-01-01

    This article proposes a combined theoretical and experimental approach to assess and quantify the global uncertainty of a high-speed camera velocity measurement. The study is divided into five sections: firstly, different sources of measurement uncertainty introduced by a high-speed camera are identified and quantified. They consist of geometrical uncertainties, pixel discretisation uncertainties and optical uncertainties. Secondly, a global uncertainty factor, taking into account the previously identified sources of uncertainty, is computed. Thirdly, a sensitivity study of the camera set-up parameters is performed, allowing the experimenter to optimize these parameters in order to minimize the final uncertainties. Fourthly, the theoretically computed uncertainty is compared with experimental measurements. Good concordance has been found. Finally, the velocity measurement uncertainty study is extended to continuous displacement measurements as a function of time. The purpose of this article is to propose all the mathematical tools necessary to quantify the individual and global uncertainties, to highlight the important aspects of the experimental set-up, and to give recommendations on how to improve a specific set-up in order to minimize the global uncertainty. Taking all this into account, it has been shown that highly dynamic phenomena such as a ballistic phenomenon can be measured using a high-speed camera with a global uncertainty of less than 2%.
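
    A stripped-down example of the kind of propagation involved, velocity as displacement over time with pixel-discretisation and scale-calibration contributions combined in quadrature, is given below; the numbers are invented and the model deliberately ignores the geometrical and optical terms treated in the article.

```python
# Simple propagation for a high-speed-camera velocity v = d / t, where the
# displacement d carries pixel-discretisation and scale-calibration
# uncertainties; numbers are invented and the model is deliberately minimal.
import math

scale    = 0.20e-3     # m per pixel (calibrated scale)
u_scale  = 0.002e-3    # uncertainty of the scale, m per pixel
d_pixels = 250.0       # measured displacement in pixels
u_pixels = 0.5         # discretisation / centroiding uncertainty, pixels
t        = 1.0e-3      # elapsed time between the frames used, s (assumed exact)

d   = d_pixels * scale
u_d = math.hypot(d_pixels * u_scale, u_pixels * scale)   # combine in quadrature
v   = d / t
u_v_rel = u_d / d                                        # t assumed exact here

print(f"v = {v:.2f} m/s, relative uncertainty = {u_v_rel:.2%}")
```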

  15. Uncertainty budget for optical coordinate measurements of circle diameter

    DEFF Research Database (Denmark)

    Morace, Renate Erica; Hansen, Hans Nørgaard; De Chiffre, Leonardo

    2004-01-01

    An uncertainty analysis for circle diameter measurements using a coordinate measuring machine (CMM) equipped with an optical probe is presented in this paper. A mathematical model for data evaluation and uncertainty assessment was formulated in accordance with the Guide to the Expression of Uncertainty in Measurement (GUM).

  16. Uncertainties in the attribution of greenhouse gas warming and implications for climate prediction

    CERN Document Server

    Jones, Gareth S; Mitchell, John F B

    2016-01-01

    Using optimal detection techniques with climate model simulations, most of the observed increase of near-surface temperatures over the second half of the twentieth century is attributed to anthropogenic influences. However, the partitioning of the anthropogenic influence among individual factors, such as greenhouse gases and aerosols, is much less robust. Differences in how forcing factors are applied, in their radiative influence and in models' climate sensitivities substantially influence the response patterns. We find that standard optimal detection methodologies cannot fully reconcile this response diversity. By selecting a set of experiments that enables the diagnosis of greenhouse gases and of the combined influence of other anthropogenic and natural factors, we find robust detections of well-mixed greenhouse gases across a large ensemble of models. Of the observed warming over the 20th century of 0.65 K/century we find, using a multi-model mean not incorporating pattern uncertainty, a well-mixed greenhouse gas warm...

  17. Multi-attribute mate choice decisions and uncertainty in the decision process: a generalized sequential search strategy.

    Science.gov (United States)

    Wiegmann, Daniel D; Weinersmith, Kelly L; Seubert, Steven M

    2010-04-01

    The behavior of females in search of a mate determines the likelihood that high quality males are encountered and adaptive search strategies rely on the effective use of available information on the quality of prospective mates. The sequential search strategy was formulated, like most models of search behavior, on the assumption that females obtain perfect information on the quality of encountered males. In this paper, we modify the strategy to allow for uncertainty of male quality and we determine how the magnitude of this uncertainty and the ability of females to inspect multiple male attributes to reduce uncertainty influence mate choice decisions. In general, searchers are sensitive to search costs and higher costs lower acceptance criteria under all versions of the model. The choosiness of searchers increases with the variability of the quality of prospective mates under conditions of the original model, but under conditions of uncertainty the choosiness of searchers may increase or decrease with the variability of inspected male attributes. The behavioral response depends on the functional relationship between observed male attributes and the fitness return to searchers and on costs associated with the search process. Higher uncertainty often induces searchers to pay more for information and under conditions of uncertainty the fitness return to searchers is never higher than under conditions of the original model. Further studies of the performance of alternative search strategies under conditions of uncertainty may consequently be necessary to identify search strategies likely to be used under natural conditions.
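
    A toy simulation of threshold-based sequential search under noisy quality assessment illustrates the trade-off the model formalizes: raising the acceptance threshold improves the quality of the accepted mate but increases the number of costly inspections, and observation noise shifts where the balance lies. All parameters below are invented, and the simulation is not the authors' model.

```python
# Toy sequential-search simulation under noisy quality assessment: a searcher
# inspects prospective mates one by one and accepts the first whose *observed*
# quality exceeds a threshold; payoff is true quality minus per-inspection
# cost. Parameters are invented for illustration.
import numpy as np

rng = np.random.default_rng(6)

def mean_payoff(threshold, noise_sd, cost=0.05, n_trials=2000):
    total = 0.0
    for _ in range(n_trials):
        inspected = 0
        while True:
            inspected += 1
            true_q = rng.normal(1.0, 0.3)              # true male quality
            observed = true_q + rng.normal(0.0, noise_sd)
            if observed >= threshold:
                total += true_q - cost * inspected
                break
    return total / n_trials

for noise_sd in (0.0, 0.3):
    thresholds = np.arange(0.8, 1.6, 0.1)
    best = max(thresholds, key=lambda th: mean_payoff(th, noise_sd))
    print(f"observation noise sd = {noise_sd}: best acceptance threshold ~ {best:.1f}")
```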

  18. Schoolteacher Trainees' Difficulties about the Concepts of Attribute and Measurement

    Science.gov (United States)

    Passelaigue, Dominique; Munier, Valérie

    2015-01-01

    "Attribute" and "measurement" are two fundamental concepts in mathematics and physics. Teaching these concepts is essential even in elementary school, but numerous studies have pointed out pupils' difficulties with them. These studies emphasized that pupils must learn about attributes before being taught how to measure these…

  19. Evaluation of Sources of Uncertainties in Solar Resource Measurement

    Energy Technology Data Exchange (ETDEWEB)

    Habte, Aron M [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Sengupta, Manajit [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-09-25

    This poster presents a high-level overview of sources of uncertainties in solar resource measurement, demonstrating the impact of various sources of uncertainties -- such as cosine response, thermal offset, spectral response, and others -- on the accuracy of data from several radiometers. The study provides insight on how to reduce the impact of some of the sources of uncertainties.

  20. Using a Meniscus to Teach Uncertainty in Measurement

    Science.gov (United States)

    Backman, Philip

    2008-01-01

    I have found that students easily understand that a measurement cannot be exact, but they often seem to lack an understanding of why it is important to know "something" about the magnitude of the uncertainty. This tends to promote an attitude that almost any uncertainty value will do. Such indifference may exist because once an uncertainty is…

  1. Application of Uncertainty in Measurement (GUM) to Isotope Mass Spectrometry: Introduction, Implementation, and Examples

    Science.gov (United States)

    Buerger, S.; Essex, R. M.; Mathew, K. J.; Thomas, R. B.

    2008-12-01

    As the measured value and its unit are integral parts of a measurement, so is a statement of the associated measurement uncertainty. The importance of providing an uncertainty that can reasonably be attributed to the measured value is often underrated. An assessment of uncertainty provides confidence in the value of the measurement, judgement on significance of differences between measurement results, information regarding the capability of the measurement procedure, and quality assurance. The limitations of the classical error analysis were seen as a hindrance to communication of scientific and technical measurement results, initiating the development of the Guide to the Expression of Uncertainty in Measurement (GUM) in the late 1970s. Just as the use of the International System of Units brings coherence to measurements, the International Organization for Standardization Guide to the Expression of Uncertainty in Measurement recommends a standardized way of expressing uncertainty in all kinds of measurements. Consequently, GUM has been adopted by most of the national metrology institutes in the world. A short introduction to GUM and the logical steps leading to its development will be presented, as well as a comparison between classical error analysis and GUM. Examples related to mass spectrometry for isotopic and elemental analysis will be discussed. The merits of GUM - transparency of the uncertainty evaluation, the treatment of uncertainties in a consistent logical way, and the presentation of an uncertainty budget resulting in a feedback to the analyst (i.e. identifies the dominant components of uncertainty and allows better understanding and improvement of the measurement process) - will be emphasised.

  2. Online Game Addiction among Chinese College Students: Measurement and Attribution.

    Science.gov (United States)

    Zhou, Yuqiong; Li, Zhitian

    2009-01-01

    This study made an initial attempt to measure and attribute online game addiction among Chinese college students. We generated three factors of online game addiction (Control Disorder, Conflict, and Injury) and proposed a comprehensive model that attributes online game addiction to three groups of driving forces: environmental influences (the most significant), characteristics of online games, and personal reasons.

  3. Improving Attribute-Importance Measurement : a Reference-Point Approach

    NARCIS (Netherlands)

    Ittersum, van K.; Pennings, J.M.E.; Wansink, B.; Trijp, van J.C.M.

    2004-01-01

    Despite the importance of identifying the hierarchy of product attributes that drive judgment and choice, the many available methods remain limited regarding their convergent validity and test-retest reliability. To increase the validity and reliability of attribute-importance measurement, we focus

  4. Uncertainty quantification in nanomechanical measurements using the atomic force microscope

    Science.gov (United States)

    Ryan Wagner; Robert Moon; Jon Pratt; Gordon Shaw; Arvind Raman

    2011-01-01

    Quantifying uncertainty in measured properties of nanomaterials is a prerequisite for the manufacture of reliable nanoengineered materials and products. Yet, rigorous uncertainty quantification (UQ) is rarely applied for material property measurements with the atomic force microscope (AFM), a widely used instrument that can measure properties at nanometer scale...

  5. Real-time hostile attribution measurement and aggression in children.

    Science.gov (United States)

    Yaros, Anna; Lochman, John E; Rosenbaum, Jill; Jimenez-Camargo, Luis Alberto

    2014-01-01

    Hostile attributions are an important predictor of aggression in children, but few studies have measured hostile attributions as they occur in real-time. The current study uses an interactive video racing game to measure hostile attributions while children played against a presumed peer. A sample of 75 children, ages 10-13, used nonverbal and verbal procedures to respond to ambiguous provocation by their opponent. Hostile attributions were significantly positively related to parent-rated reactive aggression, when controlling for proactive aggression. Hostile attributions using a nonverbal response procedure were negatively related to proactive aggression, when controlling for reactive aggression. Results suggest hostile attributions in real-time occur quickly and simultaneously with social interaction, which differs from the deliberative, controlled appraisals measured with vignette-based instruments. The relation between real-time hostile attributions and reactive aggression could be accounted for by the impulsive response style that is characteristic of reactive aggression, whereas children exhibiting proactive aggression may be more deliberate and intentional in their responding, resulting in a negative relation with real-time hostile attributions. These findings can be used both to identify children at risk for aggression and to enhance preventive interventions. © 2014 Wiley Periodicals, Inc.

  6. Determination of uncertainty in parameters extracted from single spectroscopic measurements.

    Science.gov (United States)

    Sćepanović, Obrad R; Bechtel, Kate L; Haka, Abigail S; Shih, Wei-Chuan; Koo, Tae-Woong; Berger, Andrew J; Feld, Michael S

    2007-01-01

    The ability to quantify uncertainty in information extracted from spectroscopic measurements is important in numerous fields. The traditional approach of repetitive measurements may be impractical or impossible in some measurement scenarios, while chi-squared analysis does not provide insight into the sources of uncertainty. As such, a need exists for analytical expressions for estimating uncertainty and, by extension, minimum detectable concentrations or diagnostic parameters, that can be applied to a single noisy measurement. This work builds on established concepts from estimation theory, such as the Cramer-Rao lower bound on estimator covariance, to present an analytical formula for estimating uncertainty expressed as a simple function of measurement noise, signal strength, and spectral overlap. This formalism can be used to evaluate and improve instrument performance, which is particularly important for rapid-acquisition biomedical spectroscopy systems. We demonstrate the experimental utility of this expression in assessing concentration uncertainties from spectral measurements of aqueous solutions and diagnostic parameter uncertainties extracted from spectral measurements of human artery tissue. The measured uncertainty, calculated from many independent measurements, is found to be in good agreement with the analytical formula applied to a single spectrum. These results are intended to encourage the widespread use of uncertainty analysis in the biomedical optics community.
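
    The paper's closed-form expression is not reproduced here, but the estimation-theoretic idea it builds on can be illustrated numerically: for a linear spectral model with white Gaussian noise, the covariance of least-squares component estimates is σ²(PᵀP)⁻¹, so the concentration uncertainty grows with measurement noise and with spectral overlap. The spectra below are synthetic Gaussian bands chosen only for demonstration.

```python
# Synthetic illustration of the estimation-theoretic idea: for a linear model
# y = P @ c + noise with white Gaussian noise of std sigma, the covariance of
# the least-squares concentration estimates is sigma^2 * inv(P.T @ P), so the
# uncertainty of each component grows with noise and with spectral overlap.
import numpy as np

rng = np.random.default_rng(7)
wavenumbers = np.linspace(0, 1, 400)

def peak(center, width):
    return np.exp(-0.5 * ((wavenumbers - center) / width) ** 2)

def concentration_uncertainty(separation, sigma=0.01):
    P = np.column_stack([peak(0.5, 0.05), peak(0.5 + separation, 0.05)])
    cov = sigma**2 * np.linalg.inv(P.T @ P)
    return np.sqrt(np.diag(cov))          # standard uncertainty of each component

for sep in (0.20, 0.05, 0.02):            # decreasing separation = more overlap
    u = concentration_uncertainty(sep)
    print(f"peak separation {sep:4.2f}: u(c1) = {u[0]:.4f}, u(c2) = {u[1]:.4f}")
```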

  7. Evaluation of uncertainty of measurement for cellulosic fiber and ...

    African Journals Online (AJOL)

    DR OKE

    To estimate the uncertainty of measurement, this study employs sisal fiber and an isotactic polypropylene matrix to prepare the composite used for evaluation. Uncertainty of measurement was evaluated based on tensile test results for a composite material prepared from sisal fiber having undergone chemical modification ...

  8. Electroweak corrections uncertainty on the W mass measurement at LEP

    CERN Document Server

    Cossutti, F

    2005-01-01

    The systematic uncertainty on the W mass and width measurement resulting from the imperfect knowledge of electroweak radiative corrections is discussed. The intrinsic uncertainty in the 4-f generator used by the DELPHI Collaboration is studied following the guidelines of the authors of YFSWW, on which its radiative corrections part is based. The full DELPHI simulation, reconstruction and analysis chain is used for the uncertainty assessment. A comparison with the other available 4-f calculation implementing DPA O(alpha) corrections, RacoonWW, is also presented. The uncertainty on the W mass is found to be below 10 MeV for all the WW decay channels used in the measurement.

  9. Relating confidence to measured information uncertainty in qualitative reasoning

    Energy Technology Data Exchange (ETDEWEB)

    Chavez, Gregory M [Los Alamos National Laboratory; Zerkle, David K [Los Alamos National Laboratory; Key, Brian P [Los Alamos National Laboratory; Shevitz, Daniel W [Los Alamos National Laboratory

    2010-10-07

    Qualitative reasoning makes use of qualitative assessments provided by subject matter experts to model factors such as security risk. Confidence in a result is important and useful when comparing competing results. Quantifying the confidence in an evidential reasoning result must be consistent and based on the available information. A novel method is proposed to relate confidence to the available information uncertainty in the result using fuzzy sets. Information uncertainty can be quantified through measures of non-specificity and conflict. Fuzzy values for confidence are established from information uncertainty values that lie between the measured minimum and maximum information uncertainty values.

  10. Measurement Uncertainty of Microscopic Laser Triangulation on Technical Surfaces.

    Science.gov (United States)

    Mueller, Thomas; Poesch, Andreas; Reithmeier, Eduard

    2015-12-01

    Laser triangulation is widely used to measure three-dimensional structure of surfaces. The technique is suitable for macroscopic and microscopic surface measurements. In this paper, the measurement uncertainty of laser triangulation is investigated on technical surfaces for microscopic measurement applications. Properties of technical surfaces are, for example, reflectivity, surface roughness, and the presence of scratches and pores. These properties are more influential in the microscopic laser triangulation than in the macroscopic one. In the Introduction section of this paper, the measurement uncertainty of laser triangulation is experimentally investigated for 13 different specimens. The measurements were carried out with and without a laser speckle reducer. In the Materials and Methods section of this paper, the surfaces of the 13 specimens are characterized in order to be able to find correlations between the surface properties and the measurement uncertainty. The last section of this paper describes simulations of the measurement uncertainty, which allow for the calculation of the measurement uncertainty with only one source of uncertainty present. The considerations in this paper allow for the assessment of the measurement uncertainty of laser triangulation on any technical surface when some surface properties, such as roughness, are known.

  11. Calibration and Measurement Uncertainty Estimation of Radiometric Data: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Habte, A.; Sengupta, M.; Reda, I.; Andreas, A.; Konings, J.

    2014-11-01

    Evaluating the performance of photovoltaic cells, modules, and arrays that form large solar deployments relies on accurate measurements of the available solar resource. Therefore, determining the accuracy of these solar radiation measurements provides a better understanding of investment risks. This paper provides guidelines and recommended procedures for estimating the uncertainty in calibrations and measurements by radiometers using methods that follow the International Bureau of Weights and Measures Guide to the Expression of Uncertainty in Measurement (GUM). Standardized analysis based on these procedures ensures that the uncertainty quoted is well documented.
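
    As a hedged illustration of the GUM procedure referred to above (not NREL's actual uncertainty budget), the snippet below combines two assumed uncertainty components of a thermopile-style irradiance measurement E = V / R through first-order sensitivity coefficients and expands the result with a coverage factor k = 2. All numerical values are placeholders.

```python
# Minimal GUM-style combination for E = V / R (voltage reading divided by
# calibrated radiometer responsivity), using illustrative values only.
import math

V = 8.5e-3        # measured thermopile voltage, V (assumed)
R = 8.7e-6        # calibrated responsivity, V/(W m^-2) (assumed)
u_V = 5.0e-6      # standard uncertainty of the voltage reading, V
u_R = 4.0e-8      # standard uncertainty of the responsivity, V/(W m^-2)

E = V / R                      # irradiance, W m^-2
c_V = 1.0 / R                  # sensitivity coefficient dE/dV
c_R = -V / R ** 2              # sensitivity coefficient dE/dR

u_c = math.sqrt((c_V * u_V) ** 2 + (c_R * u_R) ** 2)  # combined standard uncertainty
U = 2.0 * u_c                                          # expanded uncertainty, k = 2

print(f"E = {E:.1f} W/m^2, u_c = {u_c:.1f} W/m^2, U(k=2) = {U:.1f} W/m^2")
```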

  12. Review of Prior U.S. Attribute Measurement Systems

    Energy Technology Data Exchange (ETDEWEB)

    White, G K

    2012-07-06

    Attribute Measurement Systems have been developed and demonstrated several times in the United States over the last decade or so: under the Trilateral Initiative (1996-2002), the FMTTD (Fissile Material Transparency Technology Demonstration, 2000), and NG-AMS (Next Generation Attribute Measurement System, 2005-2008). Each Attribute Measurement System has contributed to the growing body of knowledge regarding the use of such systems in warhead dismantlement and other Arms Control scenarios. The Trilateral Initiative, besides developing prototype hardware/software, introduced the topic to the international community. The 'trilateral' parties included the United States, the Russian Federation, and the International Atomic Energy Agency (IAEA). With the participation of a Russian delegation, the FMTTD demonstrated that measurements behind an information barrier are feasible while meeting host party security requirements. The NG-AMS system explored the consequences of maximizing the use of Commercial off the Shelf (COTS) equipment, which made construction easier, but authentication harder. The 3rd Generation Attribute Measurement System (3G-AMS) will extend the scope of previous systems by including additional attributes and more rigor in authentication.

  13. Propagation of nuclear data uncertainties for fusion power measurements

    Directory of Open Access Journals (Sweden)

    Sjöstrand Henrik

    2017-01-01

    Full Text Available Neutron measurements using neutron activation systems are an essential part of the diagnostic system at large fusion machines such as JET and ITER. Nuclear data is used to infer the neutron yield. Consequently, high-quality nuclear data is essential for the proper determination of the neutron yield and fusion power. However, uncertainties due to nuclear data are not fully taken into account in uncertainty analysis for neutron yield calibrations using activation foils. This paper investigates the neutron yield uncertainty due to nuclear data using the so-called Total Monte Carlo Method. The work is performed using a detailed MCNP model of the JET fusion machine; the uncertainties due to the cross-sections and angular distributions in JET structural materials, as well as the activation cross-sections in the activation foils, are analysed. It is found that a significant contribution to the neutron yield uncertainty can come from uncertainties in the nuclear data.

  14. Propagation of nuclear data uncertainties for fusion power measurements

    Science.gov (United States)

    Sjöstrand, Henrik; Conroy, Sean; Helgesson, Petter; Hernandez, Solis Augusto; Koning, Arjan; Pomp, Stephan; Rochman, Dimitri

    2017-09-01

    Neutron measurements using neutron activation systems are an essential part of the diagnostic system at large fusion machines such as JET and ITER. Nuclear data is used to infer the neutron yield. Consequently, high-quality nuclear data is essential for the proper determination of the neutron yield and fusion power. However, uncertainties due to nuclear data are not fully taken into account in uncertainty analysis for neutron yield calibrations using activation foils. This paper investigates the neutron yield uncertainty due to nuclear data using the so-called Total Monte Carlo Method. The work is performed using a detailed MCNP model of the JET fusion machine; the uncertainties due to the cross-sections and angular distributions in JET structural materials, as well as the activation cross-sections in the activation foils, are analysed. It is found that a significant contribution to the neutron yield uncertainty can come from uncertainties in the nuclear data.

  15. Measuring the Gas Constant "R": Propagation of Uncertainty and Statistics

    Science.gov (United States)

    Olsen, Robert J.; Sattar, Simeen

    2013-01-01

    Determining the gas constant "R" by measuring the properties of hydrogen gas collected in a gas buret is well suited for comparing two approaches to uncertainty analysis using a single data set. The brevity of the experiment permits multiple determinations, allowing for statistical evaluation of the standard uncertainty u[subscript…

  16. Assessment of dose measurement uncertainty using RisøScan

    DEFF Research Database (Denmark)

    Helt-Hansen, J.; Miller, A.

    2006-01-01

    The dose measurement uncertainty of the dosimeter system RisoScan, office scanner and Riso B3 dosimeters has been assessed by comparison with spectrophotometer measurements of the same dosimeters. The reproducibility and the combined uncertainty were found to be approximately 2% and 4%, respectively, at one standard deviation. The subroutine in RisoScan for electron energy measurement is shown to give results that are equivalent to the measurements with a scanning spectrophotometer. (c) 2006 Elsevier Ltd. All rights reserved.

  17. On the position uncertainty measure on the circle

    Energy Technology Data Exchange (ETDEWEB)

    Trifonov, D A [Institute for Nuclear Research, 72 Tzarigradsko Chaussee, 1784 Sofia (Bulgaria)

    2003-11-28

    New position uncertainty (delocalization) measures for a particle on the circle are proposed and illustrated in several examples, where the previous measures (based on 2π-periodic position operators) appear to be unsatisfactory. The new measures are suitably constructed using the standard multiplication angle operator variances. They are shown to depend solely on the state of the particle and to obey uncertainty relations of the Schroedinger-Robertson type.

  18. Alternative measures of uncertainty in quantum metrology: Contradictions and limits

    Science.gov (United States)

    Luis, Alfredo; Rodil, Alfonso

    2013-03-01

    We examine a family of intrinsic performance measures in terms of probability distributions that generalize Hellinger distance and Fisher information. They are applied to quantum metrology to assess the uncertainty in the detection of minute changes of physical quantities. We show that different measures lead to contradictory conclusions, including the possibility of arbitrarily small uncertainty for fixed resources. These intrinsic performances are compared with the averaged error in the corresponding estimation problem after single-shot measurements.

  19. Estimation of measurement uncertainty arising from manual sampling of fuels.

    Science.gov (United States)

    Theodorou, Dimitrios; Liapis, Nikolaos; Zannikos, Fanourios

    2013-02-15

    Sampling is an important part of any measurement process and is therefore recognized as an important contributor to the measurement uncertainty. A reliable estimation of the uncertainty arising from sampling of fuels leads to a better control of risks associated with decisions concerning whether product specifications are met or not. The present work describes and compares the results of three empirical statistical methodologies (classical ANOVA, robust ANOVA and range statistics) using data from a balanced experimental design, which includes duplicate samples analyzed in duplicate from 104 sampling targets (petroleum retail stations). These methodologies are used for the estimation of the uncertainty arising from the manual sampling of fuel (automotive diesel) and the subsequent sulfur mass content determination. The results of the three methodologies statistically differ, with the expanded uncertainty of sampling being in the range of 0.34-0.40 mg kg⁻¹ and the relative expanded uncertainty in the range of 4.8-5.1%, depending on the methodology used. The estimation of robust ANOVA (sampling expanded uncertainty of 0.34 mg kg⁻¹ or 4.8% in relative terms) is considered more reliable, because of the presence of outliers within the 104 datasets used for the calculations. Robust ANOVA, in contrast to classical ANOVA and range statistics, accommodates outlying values, lessening their effects on the produced estimates. The results of this work also show that, in the case of manual sampling of fuels, the main contributor to the whole measurement uncertainty is the analytical measurement uncertainty, with the sampling uncertainty accounting for only 29% of the total measurement uncertainty. Copyright © 2012 Elsevier B.V. All rights reserved.
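
    A minimal sketch of the classical-ANOVA "duplicate method" design described above, using synthetic data in place of the 104 sampling targets (all numbers are invented); robust ANOVA would differ only in how the variances are estimated.

```python
# Each sampling target is sampled twice and each sample analysed twice.
# The analytical variance is pooled from the duplicate analyses; the sampling
# variance is the between-sample variance minus the analytical contribution.
import numpy as np

rng = np.random.default_rng(1)
n_targets = 104
true_sulfur = rng.normal(7.0, 1.0, n_targets)            # mg/kg, between-target variation
s_sampling_true, s_analysis_true = 0.17, 0.25             # mg/kg (assumed)

# data[t, s, a]: target t, duplicate sample s, duplicate analysis a
samples = true_sulfur[:, None] + rng.normal(0, s_sampling_true, (n_targets, 2))
data = samples[:, :, None] + rng.normal(0, s_analysis_true, (n_targets, 2, 2))

# Pooled analytical variance from duplicate analyses within each sample.
s2_analysis = np.mean(np.var(data, axis=2, ddof=1))

# Between-sample variance within targets carries half the analytical variance.
sample_means = data.mean(axis=2)
s2_between_samples = np.mean(np.var(sample_means, axis=1, ddof=1))
s2_sampling = max(s2_between_samples - s2_analysis / 2.0, 0.0)

U_sampling = 2.0 * np.sqrt(s2_sampling)   # expanded sampling uncertainty, k = 2
print(f"s_analysis = {np.sqrt(s2_analysis):.2f} mg/kg, "
      f"s_sampling = {np.sqrt(s2_sampling):.2f} mg/kg, U = {U_sampling:.2f} mg/kg")
```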

  20. Metrology and process control: dealing with measurement uncertainty

    Science.gov (United States)

    Potzick, James

    2010-03-01

    Metrology is often used in designing and controlling manufacturing processes. A product sample is processed, some relevant property is measured, and the process adjusted to bring the next processed sample closer to its specification. This feedback loop can be remarkably effective for the complex processes used in semiconductor manufacturing, but there is some risk involved because measurements have uncertainty and product specifications have tolerances. There is a finite risk that good product will fail testing or that faulty product will pass. Standard methods for quantifying measurement uncertainty have been presented, but the question arises: how much measurement uncertainty is tolerable in a specific case? Or, how does measurement uncertainty relate to manufacturing risk? This paper looks at some of the components inside this process control feedback loop and describes methods to answer these questions.

  1. Assessing Precision in Conventional Field Measurements of Individual Tree Attributes

    Directory of Open Access Journals (Sweden)

    Ville Luoma

    2017-02-01

    Full Text Available Forest resource information has a hierarchical structure: individual tree attributes are summed at the plot level and then in turn, plot-level estimates are used to derive stand or large-area estimates of forest resources. Due to this hierarchy, it is imperative that individual tree attributes are measured with accuracy and precision. With the widespread use of different measurement tools, it is also important to understand the expected degree of precision associated with these measurements. The most prevalent tree attributes measured in the field are tree species, stem diameter-at-breast-height (dbh), and tree height. For dbh and height, the most commonly used measuring devices are calipers and clinometers, respectively. The aim of our study was to characterize the precision of individual tree dbh and height measurements in boreal forest conditions when using calipers and clinometers. The data consisted of 319 sample trees at a study area in Evo, southern Finland. The sample trees were measured independently by four trained mensurationists. The standard deviation in tree dbh and height measurements was 0.3 cm (1.5%) and 0.5 m (2.9%), respectively. Precision was also assessed by tree species and tree size classes; however, there were no statistically significant differences between the mensurationists for dbh or height measurements. Our study offers insights into the expected precision of tree dbh and height as measured with the most commonly used devices. These results are important when using sample plot data in forest inventory applications, especially now, at a time when new tree attribute measurement techniques based on remote sensing are being developed and compared to the conventional caliper and clinometer measurements.

  2. Issues with Describing the Uncertainties in Atmospheric Remote Sensing Measurements

    Science.gov (United States)

    Haffner, D. P.; Bhartia, P. K.; Kramarova, N. A.

    2014-12-01

    Uncertainty in atmospheric measurements from satellites and other remote sensing platforms comes from several sources. Users are familiar with concepts of accuracy and precision for physical measurements made using instrumentation, but retrieval algorithms also frequently require statistical information since measurements alone may not completely determine the parameter of interest. This statistical information has uncertainty associated with it as well, and it often contributes a sizeable fraction to the total uncertainty. The precise combination of physical and statistical information in remotely sensed data can vary with season, latitude, altitude, and conditions of measurement. While this picture is complex, it is important to clearly define the overall uncertainty for users without oversimplifying so they can interpret the data correctly. Assessment of trends, quantification of radiative forcing and chemical budgets, and comparisons of models with satellite observations all benefit from having adequate uncertainty information. But even today, terminology and interpretation of these uncertainties are a hot topic of discussion among experts. Based on our experience producing a 44-year-long dataset of total ozone and ozone profiles, we discuss our ideas for describing uncertainty in atmospheric datasets for global change research. Assumptions about the atmosphere used in retrievals can also be provided with exact information detailing how the final product depends on these assumptions. As a practical example, we discuss our modifications to the Total Ozone Mapping Spectrometer (TOMS) algorithm in Version 9 to provide robust uncertainties for each measurement and supply as much useful information to users as possible. Finally, we describe how uncertainties in individual measurements combine when the data are aggregated in time and space.

  3. ESTIMATION OF MEASUREMENT UNCERTAINTY WITH THE USE OF UNCERTAINTY DATABASE CALCULATED FOR OPTICAL COORDINATE MEASUREMENTS OF BASIC GEOMETRY ELEMENTS

    Directory of Open Access Journals (Sweden)

    Danuta Owczarek

    2015-08-01

    Full Text Available The paper presents a method for estimating the uncertainty of optical coordinate measurement based on the use of information about the geometry and the size of the measured object as well as information about the measurement system, i.e. the maximum permissible error (MPE) of the machine, the selection of a sensor, and also the required measurement accuracy, the number of operators, the measurement strategy and the external conditions contained in the developed uncertainty database. Estimation of uncertainty is done with the use of uncertainties of measurements of basic geometry elements determined by methods available in the Laboratory of Coordinate Metrology at Cracow University of Technology (LCM CUT) (multi-position, comparative, and a method developed at the LCM CUT dedicated to non-contact measurements), and then with the use of them to determine the uncertainty of a given measured object. Research presented in this paper is aimed at developing a complete database containing all information needed to estimate the measurement uncertainty of various objects, even of a very complex geometry, based on previously performed measurements.

  4. Vector network analyzer (VNA) measurements and uncertainty assessment

    CERN Document Server

    Shoaib, Nosherwan

    2017-01-01

    This book describes vector network analyzer measurements and uncertainty assessments, particularly in waveguide test-set environments, in order to establish their compatibility to the International System of Units (SI) for accurate and reliable characterization of communication networks. It proposes a fully analytical approach to measurement uncertainty evaluation, while also highlighting the interaction and the linear propagation of different uncertainty sources to compute the final uncertainties associated with the measurements. The book subsequently discusses the dimensional characterization of waveguide standards and the quality of the vector network analyzer (VNA) calibration techniques. The book concludes with an in-depth description of the novel verification artefacts used to assess the performance of the VNAs. It offers a comprehensive reference guide for beginners to experts, in both academia and industry, whose work involves the field of network analysis, instrumentation and measurements.

  5. Uncertainty in techno-economic estimates of cellulosic ethanol production due to experimental measurement uncertainty

    Directory of Open Access Journals (Sweden)

    Vicari Kristin J

    2012-04-01

    Full Text Available Background: Cost-effective production of lignocellulosic biofuels remains a major financial and technical challenge at the industrial scale. A critical tool in biofuels process development is the techno-economic (TE) model, which calculates biofuel production costs using a process model and an economic model. The process model solves mass and energy balances for each unit, and the economic model estimates capital and operating costs from the process model based on economic assumptions. The process model inputs include experimental data on the feedstock composition and intermediate product yields for each unit. These experimental yield data are calculated from primary measurements. Uncertainty in these primary measurements is propagated to the calculated yields, to the process model, and ultimately to the economic model. Thus, outputs of the TE model have a minimum uncertainty associated with the uncertainty in the primary measurements. Results: We calculate the uncertainty in the Minimum Ethanol Selling Price (MESP) estimate for lignocellulosic ethanol production via a biochemical conversion process: dilute sulfuric acid pretreatment of corn stover followed by enzymatic hydrolysis and co-fermentation of the resulting sugars to ethanol. We perform a sensitivity analysis on the TE model and identify the feedstock composition and conversion yields from three unit operations (xylose from pretreatment, glucose from enzymatic hydrolysis, and ethanol from fermentation) as the most important variables. The uncertainty in the pretreatment xylose yield arises from multiple measurements, whereas the glucose and ethanol yields from enzymatic hydrolysis and fermentation, respectively, are dominated by a single measurement: the fraction of insoluble solids (fIS) in the biomass slurries. Conclusions: We calculate a $0.15/gal uncertainty in MESP from the TE model due to uncertainties in primary measurements. This result sets a lower bound on the error bars of
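
    One simple way to see how such measurement uncertainty propagates to a price-like output (not necessarily the propagation used by the authors) is a Monte Carlo sweep of the key yields through a deliberately simplified placeholder cost expression; every number and the cost formula below are invented for illustration only.

```python
# Draw the key measured yields from distributions reflecting their measurement
# uncertainty and push them through a toy stand-in for a techno-economic model.
import numpy as np

rng = np.random.default_rng(2)
n = 100_000

xylose_yield  = rng.normal(0.75, 0.02, n)    # pretreatment xylose yield (assumed)
glucose_yield = rng.normal(0.85, 0.02, n)    # enzymatic-hydrolysis glucose yield (assumed)
ethanol_yield = rng.normal(0.90, 0.015, n)   # fermentation ethanol yield (assumed)

# Toy cost model: the price falls as overall sugar-to-ethanol conversion rises.
overall = (0.6 * glucose_yield + 0.4 * xylose_yield) * ethanol_yield
mesp = 1.10 + 1.00 / overall                 # $/gal, illustrative only

lo, hi = np.percentile(mesp, [2.5, 97.5])
print(f"MESP spread (95 %): {lo:.2f} - {hi:.2f} $/gal "
      f"(half-width ~{(hi - lo) / 2:.2f} $/gal)")
```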

  6. Triangular and Trapezoidal Fuzzy State Estimation with Uncertainty on Measurements

    Directory of Open Access Journals (Sweden)

    Mohammad Sadeghi Sarcheshmah

    2012-01-01

    Full Text Available In this paper, a new method for uncertainty analysis in fuzzy state estimation is proposed. The uncertainty is expressed in measurements. Uncertainties in measurements are modelled with different fuzzy membership functions (triangular and trapezoidal). To find the fuzzy distribution of any state variable, the problem is formulated as a constrained linear programming (LP) optimization. The viability of the proposed method is verified against the results obtained from the weighted least squares (WLS) and the fuzzy state estimation (FSE) in the 6-bus system and in the IEEE 14- and 30-bus systems.
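
    A hedged sketch of the constrained-LP idea in the abstract: at one membership level (alpha-cut), each fuzzy measurement becomes an interval, and two linear programs bound a chosen state variable consistent with a linear measurement model z = H x. The 2-state model and the interval values below are invented.

```python
# Bound one state variable subject to interval measurement constraints z_lo <= H x <= z_hi.
import numpy as np
from scipy.optimize import linprog

H = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])                  # toy linear measurement model (assumed)
z_lo = np.array([0.95, 0.40, 1.30])         # lower interval bounds at this alpha-cut
z_hi = np.array([1.05, 0.60, 1.50])         # upper interval bounds at this alpha-cut

# Rewrite z_lo <= H x <= z_hi as [H; -H] x <= [z_hi; -z_lo] for linprog.
A_ub = np.vstack([H, -H])
b_ub = np.concatenate([z_hi, -z_lo])
bounds = [(None, None), (None, None)]       # states are otherwise unrestricted

state = 0                                   # which state variable to bound
c = np.zeros(2)
c[state] = 1.0

lo = linprog(c,  A_ub=A_ub, b_ub=b_ub, bounds=bounds).x[state]   # minimise x[state]
hi = linprog(-c, A_ub=A_ub, b_ub=b_ub, bounds=bounds).x[state]   # maximise x[state]
print(f"state {state}: [{lo:.3f}, {hi:.3f}] at this membership level")
```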

  7. THE UNCERTAINTIES OF ENVIRONMENT'S PARAMETERS MEASUREMENTS AS TOOLS OF THE MEASUREMENTS QUALITY IMPROVEMENT

    Directory of Open Access Journals (Sweden)

    Miroslav Badida

    2008-06-01

    Full Text Available Identification of the uncertainties of noise measurements accompanying declared measured values is strictly necessary and required by legislation. The uncertainty of a measurement expresses all errors that accrue during measuring. By stating uncertainties, the measurer documents that the true value lies, with a certain probability, within the interval bounded by the measurement uncertainty. The paper deals with the methodology of uncertainty calculation for noise measurements in living and working environments, in the metal processing industry and in the building materials industry.

  8. Adaptive framework for uncertainty analysis in electromagnetic field measurements.

    Science.gov (United States)

    Prieto, Javier; Alonso, Alonso A; de la Rosa, Ramón; Carrera, Albano

    2015-04-01

    Misinterpretation of uncertainty in the measurement of the electromagnetic field (EMF) strength may lead to an underestimation of exposure risk or an overestimation of required measurements. The Guide to the Expression of Uncertainty in Measurement (GUM) has internationally been adopted as a de facto standard for uncertainty assessment. However, analyses under such an approach commonly assume unrealistic static models or neglect relevant prior information, resulting in non-robust uncertainties. This study proposes a principled and systematic framework for uncertainty analysis that fuses information from current measurements and prior knowledge. Such a framework dynamically adapts to data by exploiting a likelihood function based on kernel mixtures and incorporates flexible choices of prior information by applying importance sampling. The validity of the proposed techniques is assessed from measurements performed with a broadband radiation meter and an isotropic field probe. The developed framework significantly outperforms the GUM approach, achieving a reduction of 28% in measurement uncertainty. © The Author 2014. Published by Oxford University Press. All rights reserved.
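
    The sketch below illustrates (with invented readings and priors, and not the authors' algorithm) how a kernel-mixture likelihood and importance sampling can be combined: prior samples of the true field strength are re-weighted by a Gaussian-kernel mixture centred on the probe readings, and a weighted 95 % interval summarises the posterior uncertainty.

```python
# Importance sampling with a kernel-mixture likelihood over EMF readings.
import numpy as np

rng = np.random.default_rng(3)
readings = np.array([1.9, 2.1, 2.4, 2.0, 2.2])   # V/m, hypothetical probe readings
h = 0.15                                          # kernel bandwidth (assumed)

def likelihood(field):
    """Gaussian kernel-mixture likelihood of candidate true field strengths."""
    z = (readings[None, :] - field[:, None]) / h
    return np.exp(-0.5 * z ** 2).mean(axis=1) / (h * np.sqrt(2 * np.pi))

# Prior knowledge, e.g. from an earlier survey of the same site (assumed).
prior_samples = rng.normal(2.5, 0.5, 200_000)

w = likelihood(prior_samples)
w /= w.sum()

post_mean = np.sum(w * prior_samples)
order = np.argsort(prior_samples)
cdf = np.cumsum(w[order])
lo = prior_samples[order][np.searchsorted(cdf, 0.025)]
hi = prior_samples[order][np.searchsorted(cdf, 0.975)]
print(f"posterior mean {post_mean:.2f} V/m, 95 % interval [{lo:.2f}, {hi:.2f}] V/m")
```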

  9. Uncertainty quantification in nanomechanical measurements using the atomic force microscope

    Science.gov (United States)

    Wagner, Ryan; Moon, Robert; Pratt, Jon; Shaw, Gordon; Raman, Arvind

    2011-11-01

    Quantifying uncertainty in measured properties of nanomaterials is a prerequisite for the manufacture of reliable nanoengineered materials and products. Yet, rigorous uncertainty quantification (UQ) is rarely applied for material property measurements with the atomic force microscope (AFM), a widely used instrument that can measure properties at nanometer scale resolution of both inorganic and biological surfaces and nanomaterials. We present a framework to ascribe uncertainty to local nanomechanical properties of any nanoparticle or surface measured with the AFM by taking into account the main uncertainty sources inherent in such measurements. We demonstrate the framework by quantifying uncertainty in AFM-based measurements of the transverse elastic modulus of cellulose nanocrystals (CNCs), an abundant, plant-derived nanomaterial whose mechanical properties are comparable to Kevlar fibers. For a single, isolated CNC the transverse elastic modulus was found to have a mean of 8.1 GPa and a 95% confidence interval of 2.7-20 GPa. A key result is that multiple replicates of force-distance curves do not sample the important sources of uncertainty, which are systematic in nature. The dominant source of uncertainty is the nondimensional photodiode sensitivity calibration rather than the cantilever stiffness or Z-piezo calibrations. The results underscore the great need for, and open a path towards, quantifying and minimizing uncertainty in AFM-based material property measurements of nanoparticles, nanostructured surfaces, thin films, polymers and biomaterials. This work is a partial contribution of the USDA Forest Service and NIST, agencies of the US government, and is not subject to copyright.

  10. Evaluation of an attributive measurement system in the automotive industry

    Science.gov (United States)

    Simion, C.

    2016-08-01

    Measurement System Analysis (MSA) is a critical component for any quality improvement process. MSA is defined as an experimental and mathematical method of determining how much the variation within the measurement process contributes to overall process variability, and it falls into two categories: attribute and variable. Most problematic measurement system issues come from measuring attribute data, which are usually the result of human judgment (visual inspection). Because attributive measurement systems are often used in some manufacturing processes, their assessment is important to obtain confidence in the inspection process, to see where the problems are in order to eliminate them, and to guide the process improvement. It was the aim of this paper to address such an issue by presenting a case study carried out at a local company in the Sibiu region that supplies products for the automotive industry, specifically the bag (a technical textile component, i.e. the fabric) for the airbag module. Because defects are inherent in every manufacturing process, and in the field of airbag systems a minor defect can influence performance and lives depend on the safety feature, a stringent visual inspection of defects in the bag material is required. The purpose of this attribute MSA was: to determine if all inspectors use the same criteria to distinguish “pass” from “fail” product (i.e. the fabric); to assess company inspection standards against the customer's requirements; to determine how well inspectors are conforming to themselves; to identify how inspectors are conforming to a “known master,” which includes: how often operators ship defective product and how often operators dispose of acceptable product; and to discover areas where training is required, procedures must be developed and standards are not available. The results were analyzed using MINITAB software with its module called Attribute Agreement Analysis. The conclusion was that the inspection process must
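
    For readers without Minitab, the core numbers of an attribute agreement analysis can be reproduced in a few lines; the ratings below are simulated purely to show the calculation of within-appraiser agreement, agreement with the known master, and the miss / false-alarm rates mentioned above.

```python
# Simulated attribute agreement study: 3 inspectors, 2 trials, 50 parts with a known reference.
import numpy as np

rng = np.random.default_rng(4)
n_parts, n_trials, n_inspectors = 50, 2, 3
master = rng.integers(0, 2, n_parts)               # 0 = pass, 1 = fail (reference verdict)

# Inspectors mostly agree with the master, with occasional random slips.
ratings = np.where(rng.random((n_inspectors, n_trials, n_parts)) < 0.92,
                   master, 1 - master)

for i in range(n_inspectors):
    within = np.mean(ratings[i, 0] == ratings[i, 1])             # self-consistency over trials
    vs_master = np.mean(np.all(ratings[i] == master, axis=0))    # both trials match the master
    print(f"inspector {i + 1}: within-appraiser {within:.0%}, vs. standard {vs_master:.0%}")

# Risk figures: shipping defective product / scrapping acceptable product.
miss_rate = np.mean(ratings[:, :, master == 1] == 0)     # defective judged "pass"
false_alarm = np.mean(ratings[:, :, master == 0] == 1)   # good judged "fail"
print(f"miss rate {miss_rate:.1%}, false-alarm rate {false_alarm:.1%}")
```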

  11. A Unified Approach for Reporting ARM Measurement Uncertainties Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Campos, E [Argonne National Laboratory; Sisterson, DL [Argonne National Laboratory

    2015-10-01

    The Atmospheric Radiation Measurement (ARM) Climate Research Facility is observationally based, and quantifying the uncertainty of its measurements is critically important. With over 300 widely differing instruments providing over 2,500 datastreams, concise expression of measurement uncertainty is quite challenging. The ARM Facility currently provides data and supporting metadata (information about the data or data quality) to its users through a number of sources. Because the continued success of the ARM Facility depends on the known quality of its measurements, the Facility relies on instrument mentors and the ARM Data Quality Office (DQO) to ensure, assess, and report measurement quality. Therefore, an easily-accessible, well-articulated estimate of ARM measurement uncertainty is needed.

  12. Measurement uncertainties physical parameters and calibration of instruments

    CERN Document Server

    Gupta, S V

    2012-01-01

    This book fulfills the global need to evaluate measurement results along with the associated uncertainty. In the book, together with the details of uncertainty calculations for many physical parameters, probability distributions and their properties are discussed. Definitions of various terms are given and will help practicing metrologists to grasp the subject. The book helps to establish international standards for the evaluation of the quality of raw data obtained from various laboratories for interpreting the results of various national metrology institutes in international inter-comparisons. For the routine calibration of instruments, a new idea for the use of pooled variance is introduced. The uncertainty calculations are explained for (i) independent linear inputs, (ii) non-linear inputs and (iii) correlated inputs. The merits and limitations of the Guide to the Expression of Uncertainty in Measurement (GUM) are discussed. Monte Carlo methods for the derivation of the output distribution from the...

  13. Teaching Scientific Measurement and Uncertainty in Elementary School

    Science.gov (United States)

    Munier, Valérie; Merle, Hélène; Brehelin, Danie

    2013-01-01

    The concept of measurement is fundamental in science. In order to be meaningful, the value of a measurement must be given with a certain level of uncertainty. In this paper we try to identify and develop the reasoning of young French pupils about measurement variability. In France, official instructions for elementary school thus argue for having…

  14. Measurement uncertainty budget of an interferometric flow velocity sensor

    Science.gov (United States)

    Bermuske, Mike; Büttner, Lars; Czarske, Jürgen

    2017-06-01

    Flow rate measurements are a common topic for process monitoring in chemical engineering and the food industry. To achieve the requested low uncertainties of 0.1% for flow rate measurements, a precise measurement of the shear layers of such flows is necessary. The Laser Doppler Velocimeter (LDV) is an established method for measuring local flow velocities. For exact estimation of the flow rate, the flow profile in the shear layer is of importance. For standard LDV the axial resolution and therefore the number of measurement points in the shear layer is defined by the length of the measurement volume. A decrease of this length is accompanied by a larger fringe distance variation along the measurement axis, which results in a rise of the measurement uncertainty for the flow velocity (uncertainty relation between spatial resolution and velocity uncertainty). As a unique advantage, the laser Doppler profile sensor (LDV-PS) overcomes this problem by using two fan-like fringe systems to obtain the position of the measured particles along the measurement axis and therefore achieves a high spatial resolution while it still offers a low velocity uncertainty. With this technique, the flow rate can be estimated with one order of magnitude lower uncertainty, down to 0.05% statistical uncertainty [1], and flow profiles, especially in film flows, can be measured more accurately. The problem for this technique is, in contrast to laboratory setups where the system is quite stable, that for industrial applications the sensor needs a reliable and robust traceability to the SI units, meter and second. Small deviations in the calibration can, because of the highly position-dependent calibration function, cause large systematic errors in the measurement result. Therefore, a simple, stable and accurate tool is needed that can easily be used in industrial surroundings to check or recalibrate the sensor. In this work, different calibration methods are presented and their influences to the

  15. Natural Uncertainty Measure for Forecasting Floods in Ungauged Basins

    Science.gov (United States)

    Mantilla, Ricardo; Krajewski, Witold F.; Gupta, Vijay K.; Ayalew, Tibebu B.

    2015-04-01

    Recent data analyses have shown that peak flows for individual Rainfall-Runoff (RF-RO) events exhibit power law scaling with respect to drainage area, but the scaling slopes and intercepts change from one event to the next. We test this feature in the 32,400 km² Iowa River basin, and give supporting evidence for our hypothesis that scaling slope and intercept incorporate all the pertinent physical processes that produce floods. These developments serve as the foundations for the key question that is addressed here: How to define uncertainty bounds for flood prediction for each event? We theoretically introduce the concept of Natural Uncertainty Measure for peak discharge (NUMPD) and test it using data from the Iowa River basin. We conjecture that NUMPD puts a limit to predictive uncertainty using measurements and modeling. In other words, the best any amount of data collection combined with any model can do is to come close to predicting NUMPD, but it cannot match or reduce it any further. For the applications of flood predictions, the concepts of Type-I and Type-II uncertainties in flood prediction are explained. We demonstrate Type-I uncertainty using the concept of NUMPD. Our results offer a context for Type-II uncertainty. Our results make a unique contribution to the International Association of Hydrologic Sciences (IAHS) decade-long initiative on Predictions in Ungauged Basins (PUB) (2003-2012).
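
    The per-event scaling analysis mentioned above amounts to fitting Q = a * A**theta in log-log space; the short sketch below does this for one synthetic event (all discharge values are invented).

```python
# Fit the per-event power law Q = a * A**theta by linear regression in log-log space.
import numpy as np

rng = np.random.default_rng(7)
area_km2 = np.array([50, 120, 400, 1500, 5200, 12000, 32400], dtype=float)
theta_true, a_true = 0.45, 1.8                       # assumed "true" event parameters
q_peak = a_true * area_km2 ** theta_true * rng.lognormal(0.0, 0.15, area_km2.size)

slope, intercept = np.polyfit(np.log(area_km2), np.log(q_peak), 1)
print(f"event scaling exponent theta = {slope:.2f}, intercept a = {np.exp(intercept):.2f}")
```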

  16. Guide to the expression of uncertainty in measurements

    Energy Technology Data Exchange (ETDEWEB)

    Mathew, Kattathu Joseph [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-10-19

    The enabling objectives of this presentation are to: Provide a working knowledge of the ISO GUM method to estimation of uncertainties in safeguards measurements; Introduce GUM terminology; Provide brief historical background of the GUM methodology; Introduce GUM Workbench software; Isotope ratio measurements by MS will be discussed in the next session.

  17. Uncertainty Quantification for Monitoring of Civil Structures from Vibration Measurements

    Science.gov (United States)

    Döhler, Michael; Mevel, Laurent

    2014-05-01

    Health Monitoring of civil structures can be performed by detecting changes in the modal parameters of a structure, or more directly in the measured vibration signals. For a continuous monitoring the excitation of a structure is usually ambient, thus unknown and assumed to be noise. Hence, all estimates from the vibration measurements are realizations of random variables with inherent uncertainty due to (unknown) process and measurement noise and finite data length. In this talk, a strategy for quantifying the uncertainties of modal parameter estimates from a subspace-based system identification approach is presented and the importance of uncertainty quantification in monitoring approaches is shown. Furthermore, a damage detection method is presented, which is based on the direct comparison of the measured vibration signals without estimating modal parameters, while taking the statistical uncertainty in the signals correctly into account. The usefulness of both strategies is illustrated on data from a progressive damage action on a prestressed concrete bridge. References E. Carden and P. Fanning. Vibration based condition monitoring: a review. Structural Health Monitoring, 3(4):355-377, 2004. M. Döhler and L. Mevel. Efficient multi-order uncertainty computation for stochastic subspace identification. Mechanical Systems and Signal Processing, 38(2):346-366, 2013. M. Döhler, L. Mevel, and F. Hille. Subspace-based damage detection under changes in the ambient excitation statistics. Mechanical Systems and Signal Processing, 45(1):207-224, 2014.

  18. Estimate of the uncertainty in measurement for the determination of mercury in seafood by TDA AAS.

    Science.gov (United States)

    Torres, Daiane Placido; Olivares, Igor R B; Queiroz, Helena Müller

    2015-01-01

    An approach for the estimation of the uncertainty in measurement considering the individual sources related to the different steps of the method under evaluation as well as the uncertainties estimated from the validation data for the determination of mercury in seafood by using thermal decomposition/amalgamation atomic absorption spectrometry (TDA AAS) is proposed. The considered method has been fully optimized and validated in an official laboratory of the Ministry of Agriculture, Livestock and Food Supply of Brazil, in order to comply with national and international food regulations and quality assurance. The referred method has been accredited under the ISO/IEC 17025 norm since 2010. The approach of the present work for estimating the uncertainty in measurement was based on six sources of uncertainty for mercury determination in seafood by TDA AAS, following the validation process, which were: linear least-squares regression, repeatability, intermediate precision, correction factor of the analytical curve, sample mass, and standard reference solution. Those that most influenced the uncertainty in measurement were sample weight, repeatability, intermediate precision and calibration curve. The obtained result for the estimate of uncertainty in measurement in the present work reached a value of 13.39%, which complies with the European Regulation EC 836/2011. This figure represents a very realistic estimate of the routine conditions, since it fairly encompasses the dispersion obtained from the value attributed to the sample and the value measured by the laboratory analysts. From this outcome, it is possible to infer that the validation data (based on calibration curve, recovery and precision), together with the variation in sample mass, can offer a proper estimate of uncertainty in measurement.
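
    The combination step behind such an estimate is typically a quadrature sum of relative standard uncertainties followed by expansion with a coverage factor k = 2; the sketch below shows the arithmetic with placeholder component values (not the laboratory's actual budget).

```python
# Combine relative standard uncertainties in quadrature and expand with k = 2.
import math

sources = {                        # relative standard uncertainties (fractions, assumed)
    "calibration curve":       0.030,
    "repeatability":           0.035,
    "intermediate precision":  0.030,
    "curve correction factor": 0.010,
    "sample mass":             0.025,
    "reference solution":      0.010,
}

u_rel = math.sqrt(sum(u ** 2 for u in sources.values()))
U_rel = 2.0 * u_rel                # expanded relative uncertainty, k = 2
print(f"combined relative uncertainty {u_rel:.1%}, expanded (k=2) {U_rel:.1%}")
```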

  19. OPEN PUBLIC SPACE ATTRIBUTES AND CATEGORIES – COMPLEXITY AND MEASURABILITY

    Directory of Open Access Journals (Sweden)

    Ljiljana Čavić

    2014-12-01

    Full Text Available Within the field of architectural and urban research, this work addresses the complexity of contemporary public space, both in a conceptual and a concrete sense. It aims at systematizing spatial attributes and their categories and discussing spatial complexity and measurability, all this in order to reach a more comprehensive understanding, description and analysis of public space. Our aim is to improve everyday usage of open public space, and we acknowledge users as its crucial factor. There are numerous investigations on the complex urban and architectural reality of public space that recognise the importance of users. However, we did not find any that would holistically account for what users find essential in public space. Based on the incompleteness of existing approaches to open public space and the importance of users for their success, this paper proposes a user-orientated approach. Through an initial survey directed at users, we collected the most important aspects of public spaces in the way that contemporary humans see them. The gathered data are analysed and coded into spatial attributes, from which their role in the complexity and measurability of open public space is discussed. The work results in an inventory of attributes that users find salient in public spaces. It does not discuss their qualitative values or contribution in generating spatial realities. It aims to define them clearly so that any further logical argumentation on open space concerning users may be solidly constructed. Finally, through categorisation of attributes it proposes the disciplinary levels necessary for the analysis of complex urban-architectural reality.

  20. Evaluating the uncertainty of input quantities in measurement models

    Science.gov (United States)

    Possolo, Antonio; Elster, Clemens

    2014-06-01

    The Guide to the Expression of Uncertainty in Measurement (GUM) gives guidance about how values and uncertainties should be assigned to the input quantities that appear in measurement models. This contribution offers a concrete proposal for how that guidance may be updated in light of the advances in the evaluation and expression of measurement uncertainty that were made in the course of the twenty years that have elapsed since the publication of the GUM, and also considering situations that the GUM does not yet contemplate. Our motivation is the ongoing conversation about a new edition of the GUM. While generally we favour a Bayesian approach to uncertainty evaluation, we also recognize the value that other approaches may bring to the problems considered here, and focus on methods for uncertainty evaluation and propagation that are widely applicable, including to cases that the GUM has not yet addressed. In addition to Bayesian methods, we discuss maximum-likelihood estimation, robust statistical methods, and measurement models where values of nominal properties play the same role that input quantities play in traditional models. We illustrate these general-purpose techniques in concrete examples, employing data sets that are realistic but that also are of conveniently small sizes. The supplementary material available online lists the R computer code that we have used to produce these examples (stacks.iop.org/Met/51/3/339/mmedia). Although we strive to stay close to clause 4 of the GUM, which addresses the evaluation of uncertainty for input quantities, we depart from it as we review the classes of measurement models that we believe are generally useful in contemporary measurement science. We also considerably expand and update the treatment that the GUM gives to Type B evaluations of uncertainty: reviewing the state-of-the-art, disciplined approach to the elicitation of expert knowledge, and its encapsulation in probability distributions that are usable in

  1. A Method to Estimate Uncertainty in Radiometric Measurement Using the Guide to the Expression of Uncertainty in Measurement (GUM) Method; NREL (National Renewable Energy Laboratory)

    Energy Technology Data Exchange (ETDEWEB)

    Habte, A.; Sengupta, M.; Reda, I.

    2015-03-01

    Radiometric data with known and traceable uncertainty is essential for climate change studies to better understand cloud radiation interactions and the earth radiation budget. Further, adopting a known and traceable method of estimating uncertainty with respect to SI ensures that the uncertainty quoted for radiometric measurements can be compared based on documented methods of derivation. Therefore, statements about the overall measurement uncertainty can only be made on an individual basis, taking all relevant factors into account. This poster provides guidelines and recommended procedures for estimating the uncertainty in calibrations and measurements from radiometers. The approach follows the Guide to the Expression of Uncertainty in Measurement (GUM).

  2. Entropic uncertainty measures for large dimensional hydrogenic systems

    NARCIS (Netherlands)

    D. Puertas-Centeno; N.M. Temme (Nico); I.V. Toranzo; J.S. Dehesa

    2017-01-01

    The entropic moments of the probability density of a quantum system in position and momentum spaces describe not only some fundamental and/or experimentally accessible quantities of the system but also the entropic uncertainty measures of Rényi type, which allow one to find the most

  3. Uncertainty of Areal Rainfall Estimation Using Point Measurements

    Science.gov (United States)

    McCarthy, D.; Dotto, C. B. S.; Sun, S.; Bertrand-Krajewski, J. L.; Deletic, A.

    2014-12-01

    The spatial variability of precipitation has a great influence on the quantity and quality of runoff water generated from hydrological processes. In practice, point rainfall measurements (e.g., rain gauges) are often used to represent areal rainfall in catchments. The spatial rainfall variability is difficult to capture precisely even with many rain gauges. Thus the rainfall uncertainty due to spatial variability should be taken into account in order to provide reliable rainfall-driven process modelling results. This study investigates the uncertainty of areal rainfall estimation due to rainfall spatial variability if point measurements are applied. The areal rainfall is usually estimated as a weighted sum of data from available point measurements. The expected error of areal rainfall estimates is 0 if the estimation is an unbiased one. The variance of the error between the real and estimated areal rainfall is evaluated to indicate the uncertainty of areal rainfall estimates. This error variance can be expressed as a function of variograms, which were originally applied in geostatistics to characterize a spatial variable. The variogram can be evaluated using measurements from a dense rain gauge network. The areal rainfall errors are evaluated in two areas with distinct climate regimes and rainfall patterns: the Greater Lyon area in France and the Melbourne area in Australia. The variograms of the two areas are derived based on 6-minute rainfall time series data from 2010 to 2013 and are then used to estimate uncertainties of areal rainfall represented by different numbers of point measurements in synthetic catchments of various sizes. The error variance of areal rainfall using one point measurement in the centre of a 1-km² catchment is 0.22 (mm/h)² in Lyon. When the point measurement is placed at one corner of the same-size catchment, the error variance becomes 0.82 (mm/h)² also in Lyon. Results for Melbourne were similar but presented larger uncertainty. Results
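
    A compact sketch of the variogram-based error variance described above: for a single gauge at x0 in a square catchment, var = 2 * mean_x gamma(x - x0) - mean_{x,x'} gamma(x - x'), approximated here by Monte Carlo integration with an exponential variogram whose parameters are invented (so the numbers will not reproduce the Lyon values).

```python
# Error variance of an areal-mean rainfall estimate from one gauge, via a variogram.
import numpy as np

rng = np.random.default_rng(5)

def gamma(h_km, sill=0.9, range_km=5.0):
    """Exponential variogram of short-interval rainfall intensity, (mm/h)^2 (assumed)."""
    return sill * (1.0 - np.exp(-h_km / range_km))

side = 1.0                                    # 1 km x 1 km catchment
pts = rng.random((20_000, 2)) * side          # Monte Carlo points for spatial averaging
pts2 = rng.random((20_000, 2)) * side

def error_variance(x0):
    term1 = 2.0 * gamma(np.linalg.norm(pts - x0, axis=1)).mean()   # gauge-to-area term
    term2 = gamma(np.linalg.norm(pts - pts2, axis=1)).mean()       # area-to-area term
    return term1 - term2

print(f"gauge at centre: {error_variance(np.array([0.5, 0.5])):.2f} (mm/h)^2")
print(f"gauge at corner: {error_variance(np.array([0.0, 0.0])):.2f} (mm/h)^2")
```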

  4. Upper bounds on quantum uncertainty products and complexity measures

    Energy Technology Data Exchange (ETDEWEB)

    Guerrero, Angel; Sanchez-Moreno, Pablo; Dehesa, Jesus S. [Department of Atomic, Molecular and Nuclear Physics, University of Granada, Granada (Spain); Department of Applied Mathematics, University of Granada, Granada (Spain) and Institute Carlos I for Computational and Theoretical Physics, University of Granada, Granada (Spain); Department of Atomic, Molecular and Nuclear Physics, University of Granada, Granada (Spain); Institute Carlos I for Computational and Theoretical Physics, University of Granada, Granada (Spain)

    2011-10-15

    The position-momentum Shannon and Renyi uncertainty products of general quantum systems are shown to be bounded not only from below (through the known uncertainty relations), but also from above in terms of the Heisenberg-Kennard product. Moreover, the Cramer-Rao, Fisher-Shannon, and Lopez-Ruiz, Mancini, and Calbet shape measures of complexity (whose lower bounds have been recently found) are also bounded from above. The improvement of these bounds for systems subject to spherically symmetric potentials is also explicitly given. Finally, applications to hydrogenic and oscillator-like systems are done.

  5. BOOK REVIEW: Evaluating the Measurement Uncertainty: Fundamentals and practical guidance

    Science.gov (United States)

    Lira, Ignacio

    2003-08-01

    Evaluating the Measurement Uncertainty is a book written for anyone who makes and reports measurements. It attempts to fill the gaps in the ISO Guide to the Expression of Uncertainty in Measurement, or the GUM, and does a pretty thorough job. The GUM was written with the intent of being applicable by all metrologists, from the shop floor to the National Metrology Institute laboratory; however, the GUM has often been criticized for its lack of user-friendliness because it is primarily filled with statements, but with little explanation. Evaluating the Measurement Uncertainty gives lots of explanations. It is well written and makes use of many good figures and numerical examples. Also important, this book is written by a metrologist from a National Metrology Institute, and therefore up-to-date ISO rules, style conventions and definitions are correctly used and supported throughout. The author sticks very closely to the GUM in topical theme and with frequent reference, so readers who have not read GUM cover-to-cover may feel as if they are missing something. The first chapter consists of a reprinted lecture by T J Quinn, Director of the Bureau International des Poids et Mesures (BIPM), on the role of metrology in today's world. It is an interesting and informative essay that clearly outlines the importance of metrology in our modern society, and why accurate measurement capability, and by definition uncertainty evaluation, should be so important. Particularly interesting is the section on the need for accuracy rather than simply reproducibility. Evaluating the Measurement Uncertainty then begins at the beginning, with basic concepts and definitions. The third chapter carefully introduces the concept of standard uncertainty and includes many derivations and discussion of probability density functions. The author also touches on Monte Carlo methods, calibration correction quantities, acceptance intervals or guardbanding, and many other interesting cases. The book goes

  6. Uncertainties.

    Science.gov (United States)

    Dalla Chiara, Maria Luisa

    2010-09-01

    In contemporary science uncertainty is often represented as an intrinsic feature of natural and of human phenomena. As an example we need only think of two important conceptual revolutions that occurred in physics and logic during the first half of the twentieth century: (1) the discovery of Heisenberg's uncertainty principle in quantum mechanics; (2) the emergence of many-valued logical reasoning, which gave rise to so-called 'fuzzy thinking'. I discuss the possibility of applying the notions of uncertainty, developed in the framework of quantum mechanics, quantum information and fuzzy logics, to some problems of political and social sciences.

  7. Alignment measurements uncertainties for large assemblies using probabilistic analysis techniques

    CERN Document Server

    AUTHOR|(CDS)2090816; Almond, Heather

    Big science and ambitious industrial projects continually push forward with technical requirements beyond the grasp of conventional engineering techniques. Examples of those are ultra-high precision requirements in the fields of celestial telescopes, particle accelerators and the aerospace industry. Such extreme requirements are limited largely by the capability of the metrology used, namely, its uncertainty in relation to the alignment tolerance required. The current work was initiated as part of a Marie Curie European research project held at CERN, Geneva, aiming to answer those challenges as related to future accelerators requiring alignment of 2 m large assemblies to tolerances in the 10 µm range. The thesis has found several gaps in current knowledge limiting such capability. Among those was the lack of application of state-of-the-art uncertainty propagation methods in alignment measurements metrology. Another major limiting factor found was the lack of uncertainty statements in the thermal errors compensatio...

  8. Assessing measurement uncertainty in meteorology in urban environments

    Science.gov (United States)

    Curci, S.; Lavecchia, C.; Frustaci, G.; Paolini, R.; Pilati, S.; Paganelli, C.

    2017-10-01

    Measurement uncertainty in meteorology has been addressed in a number of recent projects. In urban environments, uncertainty is also affected by local effects which are more difficult to deal with than for synoptic stations. In Italy, beginning in 2010, an urban meteorological network (Climate Network®) was designed, set up and managed at national level according to high metrological standards and homogeneity criteria to support energy applications. The availability of such a high-quality operative automatic weather station network represents an opportunity to investigate the effects of station siting and sensor exposure and to estimate the related measurement uncertainty. An extended metadata set was established for the stations in Milan, including siting and exposure details. Statistical analysis on an almost 3-year-long operational period assessed network homogeneity, quality and reliability. Deviations from reference mean values were then evaluated in selected low-gradient local weather situations in order to investigate siting and exposure effects. In this paper the methodology is depicted and preliminary results of its application to air temperature discussed; this allowed the setting of an upper limit of 1 °C for the added measurement uncertainty at the top of the urban canopy layer.

  9. Measurement Uncertainty Investigation in the Multi-probe OTA Setups

    DEFF Research Database (Denmark)

    Fan, Wei; Szini, Istvan Janos; Foegelle, M. D.

    2014-01-01

    Extensive efforts are underway to standardize over the air (OTA) testing of the multiple input multiple output (MIMO) capable terminals in COST IC1004, 3GPP RAN4 and CTIA. Due to the ability to reproduce realistic radio propagation environments inside the anechoic chamber and evaluate end user metrics in real world scenarios, the multi-probe based method has attracted huge interest from both industry and academia. This contribution attempts to identify some of the measurement uncertainties of the practical multi-probe setups and provide some guidance to establish the multi-probe anechoic chamber setup. This contribution presents the results of uncertainty measurements carried out in three practical multi-probe setups. Some sources of measurement errors, i.e. cable effect, cable termination, etc. are identified based on the measurement results.

  10. Dimensional measurements with submicrometer uncertainty in production environment

    DEFF Research Database (Denmark)

    De Chiffre, L.; Gudnason, M. M.; Madruga, D.

    2015-01-01

    The work concerns a laboratory investigation of a method to achieve dimensional measurements with submicrometer uncertainty under conditions that are typical of a production environment. The method involves the concurrent determination of dimensions and material properties from measurements carried...... out over time. A laboratory set-up was developed comprising a pair of electronic probes mounted on a Zerodur block featuring near zero thermal expansion. Three temperature sensors, data acquisition system, and a temperature regulated plate for heating the workpiece were implemented. Investigations...... gauge blocks along with their uncertainties were estimated directly from the measurements. The length of the two workpieces at the reference temperature of 20 °C was extrapolated from the measurements and compared to certificate values. The investigations have documented that the developed approach...

  11. Estimating uncertainty of alcohol-attributable fractions for infectious and chronic diseases

    Directory of Open Access Journals (Sweden)

    Frick Hannah

    2011-04-01

    Full Text Available Abstract Background Alcohol is a major risk factor for burden of disease and injuries globally. This paper presents a systematic method to compute the 95% confidence intervals of alcohol-attributable fractions (AAFs with exposure and risk relations stemming from different sources. Methods The computation was based on previous work done on modelling drinking prevalence using the gamma distribution and the inherent properties of this distribution. The Monte Carlo approach was applied to derive the variance for each AAF by generating random sets of all the parameters. A large number of random samples were thus created for each AAF to estimate variances. The derivation of the distributions of the different parameters is presented as well as sensitivity analyses which give an estimation of the number of samples required to determine the variance with predetermined precision, and to determine which parameter had the most impact on the variance of the AAFs. Results The analysis of the five Asian regions showed that 150 000 samples gave a sufficiently accurate estimation of the 95% confidence intervals for each disease. The relative risk functions accounted for most of the variance in the majority of cases. Conclusions Within reasonable computation time, the method yielded very accurate values for variances of AAFs.
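    The Monte Carlo step can be sketched for the simpler categorical-exposure case using Levin's formula; the prevalences, relative risks and their assumed sampling distributions below are placeholders for illustration only, not the gamma-distribution exposure model used in the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical point estimates for three exposure categories (illustrative only).
p_hat = np.array([0.15, 0.08, 0.03])        # prevalence of each drinking category
rr_hat = np.array([1.3, 1.8, 3.1])          # relative risks vs. abstainers
se_log_rr = np.array([0.05, 0.08, 0.12])    # assumed standard errors on the log-RR scale

n_draws = 150_000
# Sample RRs on the log scale and perturb prevalences (clipped at zero).
rr = np.exp(rng.normal(np.log(rr_hat), se_log_rr, size=(n_draws, 3)))
p = np.clip(rng.normal(p_hat, 0.1 * p_hat, size=(n_draws, 3)), 0, None)

excess = (p * (rr - 1)).sum(axis=1)
aaf = excess / (1 + excess)                  # Levin formula for categorical exposure

lo, hi = np.percentile(aaf, [2.5, 97.5])
print(f"AAF = {aaf.mean():.3f}, 95% CI [{lo:.3f}, {hi:.3f}], var = {aaf.var():.2e}")
```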

  12. Uncertainty evaluation method for axi-symmetric measurement machines

    Directory of Open Access Journals (Sweden)

    Muelaner Jody Emlyn

    2016-01-01

    Full Text Available This paper describes a method of uncertainty evaluation for axi-symmetric measurement machines. Specialized measuring machines for the inspection of axisymmetric components enable the measurement of properties such as roundness (radial runout), axial runout and coning. These machines typically consist of a rotary table and a number of contact measurement probes located on slideways. Sources of uncertainty include the probe calibration process, probe repeatability, probe alignment, geometric errors in the rotary table, the dimensional stability of the structure holding the probes and form errors in the reference hemisphere which is used to calibrate the system. The generic method is described and an evaluation of an industrial machine is presented as a worked example. Expanded uncertainties, at 95% confidence, were then calculated for the measurement of: radial runout (1.2 μm with a plunger probe or 1.7 μm with a lever probe); axial runout (1.2 μm with a plunger probe or 1.5 μm with a lever probe); and coning/swash (0.44 arcseconds with a plunger probe or 0.60 arcseconds with a lever probe).
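    Expanded uncertainties quoted at 95% confidence, as above, follow the usual GUM convention of combining the standard uncertainty contributions in quadrature and applying a coverage factor; for reference, and assuming uncorrelated contributions:

$$
u_c = \sqrt{\textstyle\sum_i u_i^{2}}, \qquad U = k\,u_c, \quad k \approx 2 \ \text{for approximately 95\% coverage.}
$$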

  13. Public Response to a Near-Miss Nuclear Accident Scenario Varying in Causal Attributions and Outcome Uncertainty.

    Science.gov (United States)

    Cui, Jinshu; Rosoff, Heather; John, Richard S

    2017-11-24

    Many studies have investigated public reactions to nuclear accidents. However, few studies focused on more common events when a serious accident could have happened but did not. This study evaluated public response (emotional, cognitive, and behavioral) over three phases of a near-miss nuclear accident. Simulating a loss-of-coolant accident (LOCA) scenario, we manipulated (1) attribution for the initial cause of the incident (software failure vs. cyber terrorist attack vs. earthquake), (2) attribution for halting the incident (fail-safe system design vs. an intervention by an individual expert vs. a chance coincidence), and (3) level of uncertainty (certain vs. uncertain) about risk of a future radiation leak after the LOCA is halted. A total of 773 respondents were sampled using a 3 × 3 × 2 between-subjects design. Results from both MANCOVA and structural equation modeling (SEM) indicate that respondents experienced more negative affect, perceived more risk, and expressed more avoidance behavioral intention when the near-miss event was initiated by an externally attributed source (e.g., earthquake) compared to an internally attributed source (e.g., software failure). Similarly, respondents also indicated greater negative affect, perceived risk, and avoidance behavioral intentions when the future impact of the near-miss incident on people and the environment remained uncertain. Results from SEM analyses also suggested that negative affect predicted risk perception, and both predicted avoidance behavior. Affect, risk perception, and avoidance behavior demonstrated high stability (i.e., reliability) from one phase to the next. © 2017 Society for Risk Analysis.

  14. A multi-attribute decision-making model for the evaluation of uncertainties in traffic pollution control planning.

    Science.gov (United States)

    Wei, Ming; Sun, Bo; Wang, Han; Xu, Zhihuo

    2017-11-04

    The evaluation of traffic emissions control efficiency at various levels is a key issue when selecting an optimal plan for the sustainable development of urban transportation. Conventional multi-criteria evaluation methods cannot deal with the determination and uncertainty of each indicator, and ignore the influence of the decision-maker's risk attitude on the evaluation results. This study proposed the use of a multi-attribute decision-making model to evaluate traffic pollution control operational efficiency by integrating 11 hybrid-type indicators related to plan implementation, traffic flow, and emissions. It also revealed the relationship between the preference of each decision-maker on these evaluation indicators and the threshold changes in the emissions control efficiency ranking. Case studies performed on four plans showed that the evaluation value of emissions control efficiency for each plan was related to the decision-maker's risk attitude, and the efficiency ranking was decided by their threshold contact degrees.

  15. Permissible limits for uncertainty of measurement in laboratory medicine.

    Science.gov (United States)

    Haeckel, Rainer; Wosniok, Werner; Gurr, Eberhard; Peil, Burkhard

    2015-07-01

    The international standard ISO 15189 requires that medical laboratories estimate the uncertainty of their quantitative test results obtained from patients' specimens. The standard does not provide details on how, and within which limits, the measurement uncertainty should be determined. The most common concept for establishing permissible uncertainty limits is to relate them to biological variation, defining the rate of false-positive results, or to base the limits on the state of the art. The state of the art is usually derived from data provided by a group of selected medical laboratories. The approach based on biological variation should be preferred because of its transparency and scientific basis. Hitherto, all recommendations were based on a linear relationship between biological and analytical variation, leading to limits which are sometimes too stringent or too permissive for routine testing in laboratory medicine. In contrast, the present proposal is based on a non-linear relationship between biological and analytical variation, leading to more realistic limits. The proposed algorithms can be applied to all measurands and consider any quantity to be assured. The suggested approach tries to provide the above-mentioned details and is a compromise between the biological variation concept, the GUM uncertainty model and the technical state of the art.

  16. Measurement Uncertainty Estimation of a Robust Photometer Circuit

    Directory of Open Access Journals (Sweden)

    Jesús de Vicente

    2009-04-01

    Full Text Available In this paper the uncertainty of a robust photometer circuit (RPC) was estimated. Here, the RPC was considered as a measurement system, having input quantities that were inexactly known, and output quantities that consequently were also inexactly known. Input quantities represent information obtained from calibration certificates, specifications of manufacturers, and tabulated data. Output quantities describe the transfer function of the electrical part of the photodiode. Input quantities were the electronic components of the RPC, the parameters of the model of the photodiode and its sensitivity at 670 nm. The output quantities were the coefficients of both numerator and denominator of the closed-loop transfer function of the RPC. As an example, the gain and phase shift of the RPC versus frequency were evaluated from the transfer function, together with their uncertainties and correlation coefficient. Results confirm the robustness of the photodiode design.

  17. Impact of measurement uncertainties on universal scaling of MHD turbulence

    Science.gov (United States)

    Gogoberidze, G.; Chapman, S. C.; Hnat, B.; Dunlop, M. W.

    2012-10-01

    Quantifying the scaling of fluctuations in the solar wind is central to testing predictions of turbulence theories. We study spectral features of Alfvénic turbulence in fast solar wind. We propose a general, instrument-independent method to estimate the uncertainty in velocity fluctuations obtained by in situ satellite observations in the solar wind. We show that when the measurement uncertainties of the velocity fluctuations are taken into account the less energetic Elsasser spectrum obeys a unique power law scaling throughout the inertial range as prevailing theories of magnetohydrodynamic (MHD) turbulence predict. Moreover, in the solar wind interval analysed, the two Elsasser spectra are observed to have the same scaling exponent γ = -1.54 throughout the inertial range.
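    A scaling exponent such as γ = -1.54 is typically estimated as the slope of a least-squares fit of log power spectral density against log frequency over an assumed inertial range; a minimal sketch of that step, on a synthetic spectrum rather than solar wind data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic power spectral density with a known slope plus multiplicative noise.
f = np.logspace(-3, -1, 200)                     # spacecraft-frame frequencies [Hz], illustrative
psd = 1e-2 * f ** (-1.54) * np.exp(rng.normal(0, 0.1, f.size))

# Restrict to an assumed inertial range and fit a straight line in log-log space.
mask = (f > 2e-3) & (f < 5e-2)
slope, intercept = np.polyfit(np.log10(f[mask]), np.log10(psd[mask]), 1)
print(f"estimated scaling exponent gamma = {slope:.2f}")
```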

  18. Measurement Uncertainty Analysis of the Strain Gauge Based Stabilographic Platform

    Directory of Open Access Journals (Sweden)

    Walendziuk Wojciech

    2014-08-01

    Full Text Available The present article describes the construction of a stabilographic platform which records a standing patient's deflection from their point of balance. The constructed device is composed of a toughened glass slab supported by 4 force sensors. The force transducers are connected to a measurement system based on a 24-bit ADC which acquires the slight body movements of a patient. The data are then transferred to the computer in real time and data analysis is conducted. The article explains the principle of operation as well as the algorithm for the measurement uncertainty of the COP (Centre of Pressure) surface coordinates (x, y).

  19. Attributes for Measuring Equity and Excellence in District Operation.

    Science.gov (United States)

    DeMoulin, Donald F.; Guyton, John W.

    In the quest for excellence, school districts have a variety of indicators or attributes available by which to gauge their progress. This model, used by the Equity and Excellence Research school districts in Mississippi, monitors achievement in relation to educational excellence. Team members established a list of attributes and various means of…

  20. Measurement quality and uncertainty evaluation in civil engineering research

    Directory of Open Access Journals (Sweden)

    Silva Ribeiro A.

    2013-01-01

    Full Text Available Civil engineering is a branch of science that covers a broad range of areas where experimental procedures often play an important role. Research in this field is usually supported by experimental structures able to test physical and mathematical models and to provide measurement results with acceptable accuracy. To assure measurement quality, a probabilistic metrology approach can provide valuable mathematical and computational tools especially suited to the study, evaluation and improvement of measurement processes in their different components (modeling, instrumentation performance, data processing, data validation and traceability), emphasizing measurement uncertainty evaluation as a tool for the analysis of results and for promoting the quality and capacity associated with decision-making. This paper presents some of the research carried out by the metrology division of the Portuguese civil engineering research institutes, focused on the contribution of measurement uncertainty studies to a variety of frameworks, such as testing for metrological characterization and physical and mathematical modeling. Experimental data will be used to illustrate practical cases.

  1. Methodology for the assessment of measuring uncertainties of articulated arm coordinate measuring machines

    Science.gov (United States)

    Romdhani, Fekria; Hennebelle, François; Ge, Min; Juillion, Patrick; Coquet, Richard; François Fontaine, Jean

    2014-12-01

    Articulated Arm Coordinate Measuring Machines (AACMMs) have gradually evolved and are increasingly used in the mechanical industry. At present, measurement uncertainties relating to the use of these devices are not yet well quantified. The work carried out consists of determining the measurement uncertainties of a mechanical part measured by an AACMM. The studies aiming to develop a model of measurement uncertainty are based on the Monte Carlo method developed in Supplement 1 of the Guide to the Expression of Uncertainty in Measurement [1], but also on identifying and characterizing the main sources of uncertainty. A multi-level Monte Carlo approach has been developed which allows for characterizing the possible evolution of the AACMM during the measurement and, at a second level, quantifying the uncertainty of the considered measurand. The first Monte Carlo level is the most complex and is thus divided into three sub-levels, namely characterization of the positioning error of a point, estimation of calibration errors and evaluation of fluctuations of the ‘localization point’. The global method is thus presented and the results of the first sub-level are developed in particular. The main sources of uncertainty, including AACMM deformations, are exposed.

  2. Reliability and Validity of a Measure of Preschool Teachers' Attributions for Disruptive Behavior

    Science.gov (United States)

    Carter, Lauren M.; Williford, Amanda P.; LoCasale-Crouch, Jennifer

    2014-01-01

    Research Findings: This study examined the quality of teacher attributions for child disruptive behavior using a new measure, the Preschool Teaching Attributions measure. A sample of 153 early childhood teachers and 432 children participated. All teachers completed the behavior attributions measure, as well as measures regarding demographics,…

  3. TOTAL MEASUREMENT UNCERTAINTY IN HOLDUP MEASUREMENTS AT THE PLUTONIUM FINISHING PLANT (PFP)

    Energy Technology Data Exchange (ETDEWEB)

    KEELE, B.D.

    2007-07-05

    An approach to determine the total measurement uncertainty (TMU) associated with Generalized Geometry Holdup (GGH) [1,2,3] measurements was developed and implemented in 2004 and 2005 [4]. This paper describes a condensed version of the TMU calculational model, including recent developments. Recent modifications to the TMU calculation model include a change in the attenuation uncertainty, clarifying the definition of the forward background uncertainty, reducing conservatism in the random uncertainty by selecting either a propagation of counting statistics or the standard deviation of the mean, and considering uncertainty in the width and height as a part of the self attenuation uncertainty. In addition, a detection limit is calculated for point sources using equations derived from summary equations contained in Chapter 20 of MARLAP [5]. The Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 2007-1 to the Secretary of Energy identified a lack of requirements and a lack of standardization for performing measurements across the U.S. Department of Energy (DOE) complex. The DNFSB also recommended that guidance be developed for a consistent application of uncertainty values. As such, the recent modifications to the TMU calculational model described in this paper have not yet been implemented. The Plutonium Finishing Plant (PFP) is continuing to perform uncertainty calculations as per Reference 4. Publication at this time is so that these concepts can be considered in developing a consensus methodology across the complex.

  4. Quantifying Uncertainty in Brain Network Measures using Bayesian Connectomics

    Directory of Open Access Journals (Sweden)

    Ronald Johannes Janssen

    2014-10-01

    Full Text Available The wiring diagram of the human brain can be described in terms of graph measures that characterize structural regularities. These measures require an estimate of whole-brain structural connectivity for which one may resort to deterministic or thresholded probabilistic streamlining procedures. While these procedures have provided important insights about the characteristics of human brain networks, they ultimately rely on unwarranted assumptions such as those of noise-free data or the use of an arbitrary threshold. Therefore, resulting structural connectivity estimates as well as derived graph measures fail to fully take into account the inherent uncertainty in the structural estimate. In this paper, we illustrate an easy way of obtaining posterior distributions over graph metrics using Bayesian inference. It is shown that this posterior distribution can be used to quantify uncertainty about graph-theoretical measures at the single subject level, thereby providing a more nuanced view of the graph-theoretical properties of human brain connectivity. We refer to this model-based approach to connectivity analysis as Bayesian connectomics.
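    A minimal sketch of the general idea: given posterior samples of the adjacency matrix, compute a graph measure per sample to obtain its posterior distribution. Here the samples are generated from placeholder per-edge probabilities rather than an actual Bayesian connectomics model, and network density stands in for a graph measure.

```python
import numpy as np

rng = np.random.default_rng(1)
n_nodes, n_draws = 30, 500

# Placeholder "posterior": per-edge connection probabilities standing in for the
# output of a Bayesian structural connectivity model (illustrative only).
edge_prob = np.triu(rng.uniform(0.05, 0.4, size=(n_nodes, n_nodes)), 1)
n_pairs = n_nodes * (n_nodes - 1) / 2

densities = []
for _ in range(n_draws):
    edges = rng.random((n_nodes, n_nodes)) < edge_prob   # sample one adjacency matrix
    densities.append(np.triu(edges, 1).sum() / n_pairs)  # graph measure for this draw

densities = np.array(densities)
lo, hi = np.percentile(densities, [2.5, 97.5])
print(f"posterior network density: mean={densities.mean():.3f}, 95% interval [{lo:.3f}, {hi:.3f}]")
```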

  5. Application of a virtual coordinate measuring machine for measurement uncertainty estimation of aspherical lens parameters

    Science.gov (United States)

    Küng, Alain; Meli, Felix; Nicolet, Anaïs; Thalmann, Rudolf

    2014-09-01

    Tactile ultra-precise coordinate measuring machines (CMMs) are very attractive for accurately measuring optical components with high slopes, such as aspheres. The METAS µ-CMM, which exhibits a single point measurement repeatability of a few nanometres, is routinely used for measurement services of microparts, including optical lenses. However, estimating the measurement uncertainty is very demanding. Because of the many combined influencing factors, an analytic determination of the uncertainty of parameters that are obtained by numerical fitting of the measured surface points is almost impossible. The application of numerical simulation (Monte Carlo methods) using a parametric fitting algorithm coupled with a virtual CMM based on a realistic model of the machine errors offers an ideal solution to this complex problem: to each measurement data point, a simulated measurement variation calculated from the numerical model of the METAS µ-CMM is added. Repeated several hundred times, these virtual measurements deliver the statistical data for calculating the probability density function, and thus the measurement uncertainty for each parameter. Additionally, the eventual cross-correlation between parameters can be analyzed. This method can be applied for the calibration and uncertainty estimation of any parameter of the equation representing a geometric element. In this article, we present the numerical simulation model of the METAS µ-CMM and the application of a Monte Carlo method for the uncertainty estimation of measured asphere parameters.
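    A minimal sketch of the simulation principle only, using a far simpler stand-in problem (a circle-radius fit with isotropic Gaussian point noise) in place of the METAS µ-CMM error model and asphere parametrization: perturb the measured points, refit, and take the spread of the fitted parameter as its uncertainty.

```python
import numpy as np

rng = np.random.default_rng(7)

# "Measured" points on a circle of radius 5 mm (stand-in for an asphere profile).
theta = np.linspace(0, 2 * np.pi, 60, endpoint=False)
pts = np.column_stack([5.0 * np.cos(theta), 5.0 * np.sin(theta)])

def fit_radius(points):
    """Algebraic least-squares circle fit; returns the fitted radius."""
    x, y = points[:, 0], points[:, 1]
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    b = x**2 + y**2
    cx, cy, c = np.linalg.lstsq(A, b, rcond=None)[0]
    return np.sqrt(c + cx**2 + cy**2)

sigma_machine = 50e-6   # assumed 50 nm single-point measurement noise, in mm
radii = []
for _ in range(1000):
    noisy = pts + rng.normal(0, sigma_machine, pts.shape)   # simulated machine error
    radii.append(fit_radius(noisy))

radii = np.array(radii)
print(f"radius = {radii.mean():.6f} mm, standard uncertainty = {radii.std(ddof=1) * 1e3:.3f} µm")
```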

  6. Evaluation of the measurement uncertainty: Some common mistakes with a focus on the uncertainty from linear calibration.

    Science.gov (United States)

    Kadis, Rouvim

    2017-05-26

    The rational strategy in the evaluation of analytical measurement uncertainty is to combine the "whole method" performance data, such as precision and recovery, with the uncertainty contributions from sources not adequately covered by those data. This paper highlights some common mistakes in evaluating the uncertainty when pursuing that strategy, as revealed in current chromatographic literature. The list of the uncertainty components usually taken into account is discussed first and fallacies with the LOD- and recovery uncertainties are noted. Close attention is paid to the uncertainty arising from a linear calibration normally used. It is demonstrated that following a well-known formula for the standard deviation of an analytical result obtained from a straight line calibration leads to double counting the precision contribution to the uncertainty budget. Furthermore, the precision component itself is often estimated improperly, based on the number of replicates taken from the precision assessment experiment. As a result, the relative uncertainty from linear calibration is overestimated in the budget and may become the largest contribution to the combined uncertainty, which is clearly shown with an example calculation based on the literature data. Copyright © 2017 Elsevier B.V. All rights reserved.
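    For reference, the straight-line calibration formula alluded to is commonly written as below (a standard textbook form, not quoted from the article), where s_{y/x} is the residual standard deviation of the calibration line, b its slope, n the number of calibration points, m the number of replicate readings of the test solution and y_0 their mean response; the 1/m term is where the replicate precision of the test sample enters, which is presumably the contribution at risk of being counted twice.

$$
s_{x_0} = \frac{s_{y/x}}{b}\sqrt{\frac{1}{m} + \frac{1}{n} + \frac{\left(y_0 - \bar{y}\right)^{2}}{b^{2}\sum_i \left(x_i - \bar{x}\right)^{2}}}
$$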

  7. The combined method for uncertainty evaluation in electromagnetic radiation measurement

    Directory of Open Access Journals (Sweden)

    Kovačević Aleksandar M.

    2014-01-01

    Full Text Available Electromagnetic radiation of all frequencies represents one of the most common and fastest growing environmental influences. All populations are now exposed to varying degrees of electromagnetic radiation and the levels will continue to increase as technology advances. An electronic or electrical product should not generate electromagnetic radiation which may impact the environment. In addition, electromagnetic radiation measurement results need to be accompanied by quantitative statements about their accuracy. This is particularly important when decisions about product specifications are taken. This paper presents an uncertainty budget for disturbance power measurements of equipment as a part of its electromagnetic radiation. We propose a model which uses a mixed distribution for uncertainty evaluation. The evaluation of the probability density function for the measurand has been done using the Monte Carlo method and a modified least-squares method (the combined method). For illustration, this paper presents mixed distributions composed of two normal distributions and of a normal and a rectangular distribution, respectively. [Project of the Ministry of Science of the Republic of Serbia, no. III 43009 and no. 171007]
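    A minimal sketch of the Monte Carlo part, assuming a purely additive measurement model with one normally and one rectangularly distributed input; the model and the values are placeholders, not the paper's disturbance-power budget.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200_000

# Input 1: normally distributed correction (e.g. calibration), mean 0 dB, u = 0.4 dB.
x1 = rng.normal(0.0, 0.4, n)
# Input 2: rectangularly distributed correction (e.g. mismatch), half-width 0.7 dB.
x2 = rng.uniform(-0.7, 0.7, n)

reading = 34.2                      # placeholder disturbance-power reading in dB(pW)
result = reading + x1 + x2          # simple additive measurement model (assumed)

lo, hi = np.percentile(result, [2.5, 97.5])
print(f"result = {result.mean():.2f} dB(pW), u = {result.std(ddof=1):.2f} dB, "
      f"95% coverage interval [{lo:.2f}, {hi:.2f}]")
```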

  8. Assessing uncertainty in radar measurements on simplified meteorological scenarios

    Directory of Open Access Journals (Sweden)

    L. Molini

    2006-01-01

    Full Text Available A three-dimensional radar simulator model (RSM) developed by Haase (1998) is coupled with the nonhydrostatic mesoscale weather forecast model Lokal-Modell (LM). The radar simulator is able to model reflectivity measurements by using the following meteorological fields, generated by Lokal-Modell, as inputs: temperature, pressure, water vapour content, cloud water content, cloud ice content, rain sedimentation flux and snow sedimentation flux. This work focuses on the assessment of some uncertainty sources associated with radar measurements: absorption by the atmospheric gases, e.g., molecular oxygen, water vapour, and nitrogen; attenuation due to the presence of a highly reflecting structure between the radar and a "target structure". RSM results for a simplified meteorological scenario, consisting of a humid updraft on a flat surface and four cells placed around it, are presented.

  9. Automated densimetric system: Measurements and uncertainties for compressed fluids

    Energy Technology Data Exchange (ETDEWEB)

    Segovia, Jose J. [Laboratorio de Propiedades Termofisicas, Dpto. de Fisica Aplicada, Facultade de Fisica, Universidade de Santiago de Compostela, E-15782 Santiago de Compostela (Spain); Research Group TERMOCAL-Thermodynamics and Calibration University of Valladolid, Paseo del Cauce s/n, E-47011 Valladolid (Spain)], E-mail: josseg@eis.uva.es; Fandino, Olivia; Lopez, Enriqueta R.; Lugo, Luis [Laboratorio de Propiedades Termofisicas, Dpto. de Fisica Aplicada, Facultade de Fisica, Universidade de Santiago de Compostela, E-15782 Santiago de Compostela (Spain); Carmen Martin, Ma. [Research Group TERMOCAL-Thermodynamics and Calibration University of Valladolid, Paseo del Cauce s/n, E-47011 Valladolid (Spain); Fernandez, Josefa [Laboratorio de Propiedades Termofisicas, Dpto. de Fisica Aplicada, Facultade de Fisica, Universidade de Santiago de Compostela, E-15782 Santiago de Compostela (Spain)

    2009-05-15

    In this work, we present the automation of a densimetric system for pVT measurements of compressed fluids, operating over the temperature range from 283.15 K to 398.15 K and at pressures up to 70 MPa. The heart of the automatic assembly is a commercial Anton Paar DMA HPM vibrating-tube densimeter. The densimeter was calibrated using the method developed by Lagourette et al. and modified recently by Comunas et al. The fluids used in this calibration were vacuum, water, and n-decane. Data reliability has been verified by comparing our experimental results for toluene with literature data. The uncertainty of the experimental density was evaluated following the rigorous procedure of the EA-4/02 guide. Moreover, new density measurements of pentaerythritol tetrapentanoate ester are presented. For this fluid, which has significant viscosity values, a correction factor was applied to take into account the damping effects in the densimeter cell.
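    The basic vibrating-tube relation behind such calibrations is commonly written as rho = A·tau² + B; with vacuum and water as two calibration fluids this reduces to the ratio form sketched below. The pressure- and temperature-dependent refinements of the Lagourette-type calibration are omitted, and the oscillation periods are purely illustrative.

```python
# Minimal sketch of the two-reference vibrating-tube relation rho = A*tau^2 + B,
# calibrated with vacuum (rho = 0) and water; not the full calibration of the paper.
def density_from_period(tau_sample_us, tau_vac_us, tau_water_us, rho_water):
    """Return sample density using vacuum and water as calibration fluids."""
    return rho_water * (tau_sample_us**2 - tau_vac_us**2) / (tau_water_us**2 - tau_vac_us**2)

rho_water = 997.05                     # kg/m^3 near 25 degC, 0.1 MPa (reference value)
tau_vac, tau_water = 2510.0, 2655.0    # hypothetical oscillation periods in microseconds
tau_sample = 2648.0                    # hypothetical period for the test fluid

print(f"sample density = {density_from_period(tau_sample, tau_vac, tau_water, rho_water):.1f} kg/m^3")
```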

  10. An approach to establish the uncertainty budget of catalytic activity concentration measurements in a reference laboratory.

    Science.gov (United States)

    Rami, Laura; Canalias, Francesca

    2015-04-01

    Reference laboratories providing reference services recognized by the Joint Committee for Traceability in Laboratory Medicine (JCTLM) must be accredited as calibration laboratories according to ISO 17025 and ISO 15195. These standards require laboratories to establish an uncertainty budget, in which the uncertainty contributions of the relevant uncertainty components are specified. We present a model to estimate the measurement uncertainty of creatine kinase catalytic activity concentration results obtained by IFCC primary reference measurement procedure. The measurement uncertainty has been estimated by following the next steps: 1) specification of the measurand; 2) identification of the most relevant uncertainty sources; 3) estimation of standard uncertainties by either type A or type B evaluation; 4) estimation of combined uncertainty while taking into account sensitivity coefficients, as well as existence of correlated uncertainty sources; and 5) estimation of expanded uncertainty with a defined coverage probability. The estimated expanded uncertainty was 2.2% (k=2). Uncertainty sources with a significant contribution to the measurement uncertainty were the following: pH adjustment (0.68%), absorbance accuracy (0.48%), wavelength adjustment (0.20%), reaction temperature (0.19%), volume fraction of sample (0.15%) and absorbance linearity (0.06%). The present model is an approach to establish the uncertainty budget of primary reference procedures for the measurement of the catalytic activity concentration of enzymes, and aims at being an example to be followed by other reference laboratories, as well as by laboratories that carry out primary reference measurement procedures.

  11. Reducing Uncertainty: Implementation of Heisenberg Principle to Measure Company Performance

    Directory of Open Access Journals (Sweden)

    Anna Svirina

    2015-08-01

    Full Text Available The paper addresses the problem of uncertainty reduction in the estimation of future company performance, which results from the wide range of probable efficiencies of an enterprise's intangible assets. To reduce this problem, the paper suggests using quantum economy principles, i.e. implementation of the Heisenberg principle to measure the efficiency and potential of the intangible assets of the company. It is proposed that for intangibles it is not possible to estimate both potential and efficiency at a certain time point. To provide proof of this thesis, data on resource potential and efficiency from mid-Russian companies were evaluated within a deterministic approach, which did not allow the probability of achieving a certain resource efficiency to be evaluated, and a quantum approach, which allowed estimation of the central point around which the probable efficiency of resources is concentrated. Visualization of these approaches was performed by means of LabView software. It was proven that for tangible assets performance estimation a deterministic approach should be used, while for intangible assets the quantum approach allows better quality of future performance prediction. On the basis of these findings we propose a holistic approach towards the estimation of company resource efficiency in order to reduce uncertainty in modeling company performance.

  12. Measurement Of Beer Taste Attributes Using An Electronic Tongue

    Science.gov (United States)

    Polshin, Evgeny; Rudnitskaya, Alisa; Kirsanov, Dmitry; Lammertyn, Jeroen; Nicolaï, Bart; Saison, Daan; Delvaux, Freddy R.; Delvaux, Filip; Legin, Andrey

    2009-05-01

    The present work deals with the results of the application of an electronic tongue system as an analytical tool for rapid assessment of beer flavour. Fifty samples of Belgian and Dutch beers of different types, characterized with respect to sensory properties and bitterness, were analyzed using an electronic tongue (ET) based on potentiometric chemical sensors. The ET was capable of predicting 10 sensory attributes of beer with good precision, including sweetness, sourness, intensity, body, etc., as well as the most important instrumental parameter, bitterness. These results show good promise for the further development of the ET as a new analytical technique for the fast assessment of taste attributes and bitterness, in particular in the food and brewery industries.

  13. The contribution of sampling uncertainty to total measurement uncertainty in the enumeration of microorganisms in foods.

    Science.gov (United States)

    Jarvis, Basil; Hedges, Alan J; Corry, Janet E L

    2012-06-01

    Random samples of each of several food products were obtained from defined lots during processing or from retail outlets. The foods included raw milk (sampled on farm and from a bulk-milk tanker), sprouted seeds, raw minced meat, frozen de-shelled raw prawns, neck-flaps from raw chicken carcasses and ready-to-eat sandwiches. Duplicate sub-samples, generally of 100 g, were examined for aerobic colony counts; some were examined also for counts of presumptive Enterobacteriaceae and campylobacters. After log(10)-transformation, all sets of colony count data were evaluated for conformity with the normal distribution (ND) and analysed by standard ANOVA and a robust ANOVA to determine the relative contributions of the variance between and within samples to the overall variance. Sampling variance accounted for >50% of the reproducibility variance for the majority of foods examined; in many cases it exceeded 85%. We also used an iterative procedure of re-sampling without replacement to determine the effects of sample size (i.e. the number of samples) on the precision of the estimate of variance for one of the larger data sets. The variance of the repeatability and reproducibility variances depended on the number of replicate samples tested (n) in a manner that was characteristic of the underlying distribution. The results are discussed in relation to the use of measurement uncertainty in assessing compliance of results with microbiological criteria for foods. Copyright © 2012 Elsevier Ltd. All rights reserved.
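    A minimal sketch of the variance decomposition, assuming duplicate sub-samples per sample and log10-transformed counts; the data below are synthetic and the classical one-way ANOVA estimators are used rather than the robust variant mentioned in the abstract.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic log10 colony counts: 12 samples from one lot, duplicate sub-samples each.
n_samples, n_reps = 12, 2
true_sample_means = rng.normal(5.0, 0.35, n_samples)      # between-sample spread
data = true_sample_means[:, None] + rng.normal(0, 0.15, (n_samples, n_reps))

sample_means = data.mean(axis=1)
ms_within = ((data - sample_means[:, None]) ** 2).sum() / (n_samples * (n_reps - 1))
ms_between = n_reps * ((sample_means - data.mean()) ** 2).sum() / (n_samples - 1)

s2_within = ms_within                                   # analytical (repeatability) variance
s2_between = max((ms_between - ms_within) / n_reps, 0)  # sampling variance component
share = s2_between / (s2_between + s2_within)
print(f"sampling variance accounts for {share:.0%} of the reproducibility variance")
```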

  14. Uncertainty in measurement: a review of monte carlo simulation using microsoft excel for the calculation of uncertainties through functional relationships, including uncertainties in empirically derived constants.

    Science.gov (United States)

    Farrance, Ian; Frenkel, Robert

    2014-02-01

    The Guide to the Expression of Uncertainty in Measurement (usually referred to as the GUM) provides the basic framework for evaluating uncertainty in measurement. The GUM however does not always provide clearly identifiable procedures suitable for medical laboratory applications, particularly when internal quality control (IQC) is used to derive most of the uncertainty estimates. The GUM modelling approach requires advanced mathematical skills for many of its procedures, but Monte Carlo simulation (MCS) can be used as an alternative for many medical laboratory applications. In particular, calculations for determining how uncertainties in the input quantities to a functional relationship propagate through to the output can be accomplished using a readily available spreadsheet such as Microsoft Excel. The MCS procedure uses algorithmically generated pseudo-random numbers which are then forced to follow a prescribed probability distribution. When IQC data provide the uncertainty estimates the normal (Gaussian) distribution is generally considered appropriate, but MCS is by no means restricted to this particular case. With input variations simulated by random numbers, the functional relationship then provides the corresponding variations in the output in a manner which also provides its probability distribution. The MCS procedure thus provides output uncertainty estimates without the need for the differential equations associated with GUM modelling. The aim of this article is to demonstrate the ease with which Microsoft Excel (or a similar spreadsheet) can be used to provide an uncertainty estimate for measurands derived through a functional relationship. In addition, we also consider the relatively common situation where an empirically derived formula includes one or more 'constants', each of which has an empirically derived numerical value. Such empirically derived 'constants' must also have associated uncertainties which propagate through the functional relationship

  16. High frequency electric field levels: An example of determination of measurement uncertainty for broadband measurements

    Directory of Open Access Journals (Sweden)

    Vulević Branislav

    2016-01-01

    Full Text Available Determining high frequency electromagnetic field levels in urban areas represents a very complex task, given the exponential growth in the number of sources associated with public cellular telephony systems over the past twenty years. The main goal of this paper is to present a practical solution for the evaluation of measurement uncertainty for in-situ measurements in the case of spatial averaging. An example of the estimation of the uncertainty for electric field strength broadband measurements in the frequency range from 3 MHz to 18 GHz is presented.

  17. Past changes in the vertical distribution of ozone – Part 1: Measurement techniques, uncertainties and availability

    Directory of Open Access Journals (Sweden)

    B. Hassler

    2014-05-01

    Full Text Available Peak stratospheric chlorofluorocarbon (CFC) and other ozone depleting substance (ODS) concentrations were reached in the mid- to late 1990s. Detection and attribution of the expected recovery of the stratospheric ozone layer in an atmosphere with reduced ODSs, as well as efforts to understand the evolution of stratospheric ozone in the presence of increasing greenhouse gases, are key current research topics. These require a critical examination of the ozone changes with an accurate knowledge of the spatial (geographical and vertical) and temporal ozone response. For such an examination, it is vital that the quality of the measurements used be as high as possible and measurement uncertainties well quantified. In preparation for the 2014 United Nations Environment Programme (UNEP)/World Meteorological Organization (WMO) Scientific Assessment of Ozone Depletion, the SPARC/IO3C/IGACO-O3/NDACC (SI2N) Initiative was designed to study and document changes in the global ozone profile distribution. This requires assessing long-term ozone profile data sets with regard to measurement stability and uncertainty characteristics. The ultimate goal is to establish suitability for estimating long-term ozone trends to contribute to ozone recovery studies. Some of the data sets have been improved as part of this initiative, with updated versions now available. This summary presents an overview of stratospheric ozone profile measurement data sets (ground and satellite based) available for ozone recovery studies. Here we document measurement techniques, spatial and temporal coverage, vertical resolution, native units and measurement uncertainties. In addition, the latest data versions are briefly described (including data version updates), as well as detailing multiple retrievals when available for a given satellite instrument. Archive location information for each data set is also given.

  18. Impact of measurement uncertainties on determination of chlorophyll-specific absorption coefficient for marine phytoplankton

    Science.gov (United States)

    McKee, David; Röttgers, Rüdiger; Neukermans, Griet; Calzado, Violeta Sanjuan; Trees, Charles; Ampolo-Rella, Marina; Neil, Claire; Cunningham, Alex

    2014-12-01

    Understanding variability in the chlorophyll-specific absorption of marine phytoplankton, aph*Chl (λ), is essential for primary production modelling, calculation of underwater light field characteristics, and development of algorithms for remote sensing of chlorophyll concentrations. Previous field and laboratory studies have demonstrated significant apparent variability in aph*Chl (λ) for natural samples and algal cultures. However, the potential impact of measurement uncertainties on derived values of aph*Chl (λ) has received insufficient study. This study presents an analysis of measurement uncertainties for a data set collected in the Ligurian Sea in Spring and assesses the impact on estimates of aph*Chl (λ). It is found that a large proportion of apparent variability in this set of aph*Chl (λ) can be attributed to measurement errors. Application of the same analysis to the global NOMAD data set suggests that a significant fraction of variability in aph*Chl (λ) may also be due to measurement errors. The copyright line for this article was changed on 16 JAN 2015 after original online publication.
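    For orientation only (this is generic first-order propagation, not the error analysis of the paper): measurement errors in the two underlying quantities enter the derived ratio aph*Chl(λ) = aph(λ)/Chl, assuming uncorrelated errors, as

$$
\frac{u\big(a^{*}_{ph}\big)}{a^{*}_{ph}} = \sqrt{\left(\frac{u(a_{ph})}{a_{ph}}\right)^{2} + \left(\frac{u(\mathrm{Chl})}{\mathrm{Chl}}\right)^{2}}.
$$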

  19. Uncertainty of measurement and clinical value of semen analysis: has standardisation through professional guidelines helped or hindered progress?

    Science.gov (United States)

    Tomlinson, M J

    2016-09-01

    This article suggests that diagnostic semen analysis has no more clinical value today than it had 25-30 years ago, and that both the confusion surrounding its evidence base (in terms of its relationship with conception) and the low level of confidence in the clinical setting are attributable to an associated high level of 'uncertainty'. Consideration of the concept of measurement uncertainty is mandatory for medical laboratories applying for the ISO 15189 standard. It is evident that the entire semen analysis process is prone to error at every step from specimen collection to the reporting of results, and this serves to compound the uncertainty associated with diagnosis or prognosis. Perceived adherence to published guidelines for the assessment of sperm concentration, motility and morphology does not guarantee a reliable and reproducible test result. Moreover, the high level of uncertainty associated with manual sperm motility and morphology assessment can be attributed to subjectivity and the lack of a traceable standard. This article describes where and why uncertainty exists and suggests that semen analysis will continue to be of limited value until it is more adequately considered and addressed. Although professional guidelines for good practice have provided the foundations for testing procedures for many years, the risk in following rather prescriptive guidance to the letter is that, unless they are based on an overwhelmingly firm evidence base, the quality of semen analysis will remain poor and progress towards the development of more innovative methods for investigating male infertility will be slow. © 2016 American Society of Andrology and European Academy of Andrology.

  20. Evaluation of uncertainty of measurement for cellulosic fiber and ...

    African Journals Online (AJOL)

    DR OKE

    correspondingly. The resulting expanded uncertainty exhibited levels in the margins of 20 %, which cautions for critical ... coefficients and determination of the expanded uncertainty. Consultation ... et al, 2010) who studied the chemical modification effects on mechanical properties of high impact polystyrene.

  1. A Unified Approach for Reporting ARM Measurement Uncertainties Technical Report: Updated in 2016

    Energy Technology Data Exchange (ETDEWEB)

    Sisterson, Douglas [Argonne National Lab. (ANL), Argonne, IL (United States)

    2017-01-15

    The U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Climate Research Facility is observationally based, and quantifying the uncertainty of its measurements is critically important. With over 300 widely differing instruments providing over 2,500 datastreams, concise expression of measurement uncertainty is quite challenging. ARM currently provides data and supporting metadata (information about the data or data quality) to its users through several sources. Because the continued success of the ARM Facility depends on the known quality of its measurements, ARM relies on Instrument Mentors and the ARM Data Quality Office to ensure, assess, and report measurement quality. Therefore, an easily accessible, well-articulated estimate of ARM measurement uncertainty is needed. This report is a continuation of the work presented by Campos and Sisterson (2015) and provides additional uncertainty information from instruments not available in their report. As before, a total measurement uncertainty has been calculated as a function of the instrument uncertainty (calibration factors), the field uncertainty (environmental factors), and the retrieval uncertainty (algorithm factors). This study will not expand on methods for computing these uncertainties. As before, it will focus on the practical identification, characterization, and inventory of the measurement uncertainties already available to the ARM community through the ARM Instrument Mentors and their ARM instrument handbooks. This study continues the first steps towards reporting ARM measurement uncertainty as: (1) identifying how the uncertainty of individual ARM measurements is currently expressed, (2) identifying a consistent approach to measurement uncertainty, and then (3) reclassifying ARM instrument measurement uncertainties in a common framework.

  2. Calibration of Heat Stress Monitor and its Measurement Uncertainty

    Science.gov (United States)

    Ekici, Can

    2017-07-01

    Wet-bulb globe temperature (WBGT) is a heat stress index that gives information on the thermal load experienced by workers in industrial areas. The WBGT equation is described in ISO Standard 7243 (ISO 7243, Hot environments—estimation of the heat stress on working man, based on the WBGT index, ISO, Geneva, 1982). WBGT is the result of the combined quantitative effects of the natural wet-bulb temperature, the globe temperature and the air (dry-bulb) temperature. WBGT is a calculated parameter: it is derived from input quantities that the heat stress monitor measures. In this study, the calibration method for a heat stress monitor is described, and the model function for measurement uncertainty is given. Sensitivity coefficients were derived according to the GUM. A two-pressure humidity generator was used to generate a controlled environment, and the heat stress monitor was calibrated inside the generator. The two-pressure humidity generator, which is located at the Turkish Standards Institution, was used as the reference device. This device is traceable to national standards. The two-pressure humidity generator includes reference Pt-100 temperature sensors. The reference sensor was sheltered with a wet wick for the calibration of the natural wet-bulb thermometer. For the calibration of the black globe thermometer, the reference sensor was centred in a black globe with a diameter of 150 mm.
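    The WBGT index itself combines the component temperatures with fixed weights; a minimal sketch following the standard ISO 7243 weightings (the readings are illustrative, not calibration data):

```python
def wbgt_indoor(t_nw, t_g):
    """WBGT without solar load (ISO 7243): natural wet-bulb and globe temperature, degC."""
    return 0.7 * t_nw + 0.3 * t_g

def wbgt_outdoor(t_nw, t_g, t_a):
    """WBGT with solar load (ISO 7243): adds the dry-bulb (air) temperature, degC."""
    return 0.7 * t_nw + 0.2 * t_g + 0.1 * t_a

# Illustrative readings from a heat stress monitor.
print(f"indoor WBGT  = {wbgt_indoor(24.0, 35.0):.1f} degC")
print(f"outdoor WBGT = {wbgt_outdoor(24.0, 40.0, 31.0):.1f} degC")
```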

  3. Practical Use of the Braking Attributes Measurements Results

    Directory of Open Access Journals (Sweden)

    Ondruš Ján

    2017-01-01

    Full Text Available This contribution deals with issues of braking a passenger car. The braking deceleration of a Kia Cee'd 1.6 16V was measured using a Correvit optical measurement system. The measurements were carried out at the airport of the village of Rosina, located close to Zilina. Ten drivers of different ages, driving experience and kilometres driven participated in the measurement. The measured manoeuvre was full braking of the vehicle with the service brake from an initial speed of approximately 50 km/h. Each of the drivers had 10 attempts. At the close of this contribution, the results of the performed measurements, their evaluation and comparison are presented. The practical result of the contribution is mainly the set of braking deceleration measurements of the respective vehicle during intensive braking.
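    For orientation, the mean deceleration of such a run is often approximated from the initial speed and either the stopping distance or the stopping time; a minimal sketch with illustrative numbers close to the stated test conditions (50 km/h initial speed), not measured data from the contribution:

```python
def mean_deceleration_from_distance(v0_kmh: float, stopping_distance_m: float) -> float:
    """Mean deceleration [m/s^2] assuming constant deceleration to standstill."""
    v0 = v0_kmh / 3.6                      # convert km/h to m/s
    return v0 ** 2 / (2.0 * stopping_distance_m)

def mean_deceleration_from_time(v0_kmh: float, stopping_time_s: float) -> float:
    """Mean deceleration [m/s^2] from initial speed and time to standstill."""
    return (v0_kmh / 3.6) / stopping_time_s

# Illustrative values only.
print(f"{mean_deceleration_from_distance(50.0, 12.1):.2f} m/s^2")
print(f"{mean_deceleration_from_time(50.0, 1.75):.2f} m/s^2")
```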

  4. Reconsideration of the Uncertainty Relations and Quantum Measurements

    Directory of Open Access Journals (Sweden)

    Dumitru S.

    2008-04-01

    Full Text Available Discussions on uncertainty relations (UR) and quantum measurements (QMS) have persisted until nowadays in publications about quantum mechanics (QM). They originate mainly from the conventional interpretation of UR (CIUR). In most of the QM literature, the fact that, over the years, a lot of deficiencies regarding CIUR have been signaled is underestimated. As a rule the alluded deficiencies were remarked disparately and discussed as punctual and non-essential questions. Here we approach an investigation of the mentioned deficiencies collected into a conclusive ensemble. Subsequently we expose a reconsideration of the major problems referring to UR and QMS. We reveal that all the basic presumptions of CIUR are troubled by insurmountable deficiencies which require the indubitable failure of CIUR and its necessary abandonment. Therefore the UR must be deprived of their status as crucial pieces of physics. So, the aboriginal versions of UR appear as being in the posture of either (i) thought-experimental fictions or (ii) simple QM formulae, and any other versions of them have no connection with QMS. Then QMS must be viewed as an additional subject compared with the usual questions of QM. For a theoretical description of QMS we propose an information-transmission model, in which the quantum observables are considered as random variables. Our approach leads to natural solutions and simplifications for many problems regarding UR and QMS.

  6. Definition of free form object for low uncertainty measurements on coordinate measuring machines

    DEFF Research Database (Denmark)

    Savio, Enrico; De Chiffre, Leonardo

    . The Centre for Geometrical Metrology (CGM) at the Technical University of Denmark takes care of free form measurements, in collaboration with DIMEG, University of Padova, Italy. The present report describes the free form objects selected for the investigations on the uncertainty assessment procedures....

  7. Bayesian Estimation of Uncertainties for Redshift Independent Distance Measurements in the Ned-d Catalog

    Science.gov (United States)

    Chaparro Molano, G.; Restrepo Gaitán, O. A.; Cuervo Marulanda, J. C.; Torres Arzayus, S. A.

    2018-01-01

    Obtaining individual estimates for uncertainties in redshift-independent galaxy distance measurements can be challenging, as for each galaxy there can be many distance estimates with non-Gaussian distributions, some of which may not even have a reported uncertainty. We seek to model uncertainties using a bootstrap sampling of measurements per galaxy per distance estimation method. We then create a predictive Bayesian model for estimating galaxy distance uncertainties that is better than simply using a weighted standard deviation. This can be a first step toward predicting distance uncertainties for future catalog-wide analysis.
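    A minimal sketch of the bootstrap step for one galaxy and one estimation method, using illustrative distance values rather than NED-D catalog entries:

```python
import numpy as np

rng = np.random.default_rng(11)

# Illustrative distance estimates (Mpc) for one galaxy from one estimation method.
distances = np.array([16.1, 17.3, 15.8, 16.9, 18.0, 16.4, 15.5])

n_boot = 10_000
boot_means = np.array([
    rng.choice(distances, size=distances.size, replace=True).mean()
    for _ in range(n_boot)
])

lo, hi = np.percentile(boot_means, [16, 84])
print(f"distance = {boot_means.mean():.2f} Mpc, "
      f"bootstrap 68% interval [{lo:.2f}, {hi:.2f}] Mpc, "
      f"sigma = {boot_means.std(ddof=1):.2f} Mpc")
```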

  8. Total Measurement Uncertainty for the Plutonium Finishing Plant (PFP) Segmented Gamma Scan Assay System

    CERN Document Server

    Fazzari, D M

    2001-01-01

    This report presents the results of an evaluation of the Total Measurement Uncertainty (TMU) for the Canberra manufactured Segmented Gamma Scanner Assay System (SGSAS) as employed at the Hanford Plutonium Finishing Plant (PFP). In this document, TMU embodies the combined uncertainties due to all of the individual random and systematic sources of measurement uncertainty. It includes uncertainties arising from corrections and factors applied to the analysis of transuranic waste to compensate for inhomogeneities and interferences from the waste matrix and radioactive components. These include uncertainty components for any assumptions contained in the calibration of the system or computation of the data. Uncertainties are propagated at 1 sigma. The final total measurement uncertainty value is reported at the 95% confidence level. The SGSAS is a gamma assay system that is used to assay plutonium and uranium waste. The SGSAS system can be used in a stand-alone mode to perform the NDA characterization of a containe...

  9. The uncertainty in physical measurements an introduction to data analysis in the physics laboratory

    CERN Document Server

    Fornasini, Paolo

    2008-01-01

    All measurements of physical quantities are affected by uncertainty. Understanding the origin of uncertainty, evaluating its extent and suitably taking it into account in data analysis is essential for assessing the degree of accuracy of phenomenological relationships and physical laws in both scientific research and technological applications. The Uncertainty in Physical Measurements: An Introduction to Data Analysis in the Physics Laboratory presents an introduction to uncertainty and to some of the most common procedures of data analysis. This book will serve the reader well by filling the gap between tutorial textbooks and highly specialized monographs. The book is divided into three parts. The first part is a phenomenological introduction to measurement and uncertainty: properties of instruments, different causes and corresponding expressions of uncertainty, histograms and distributions, and unified expression of uncertainty. The second part contains an introduction to probability theory, random variable...

  10. Method to Calculate Uncertainty Estimate of Measuring Shortwave Solar Irradiance using Thermopile and Semiconductor Solar Radiometers

    Energy Technology Data Exchange (ETDEWEB)

    Reda, I.

    2011-07-01

    The uncertainty of measuring solar irradiance is fundamentally important for solar energy and atmospheric science applications. Without an uncertainty statement, the quality of a result, model, or testing method cannot be quantified, the chain of traceability is broken, and confidence cannot be maintained in the measurement. Measurement results are incomplete and meaningless without a statement of the estimated uncertainty with traceability to the International System of Units (SI) or to another internationally recognized standard. This report explains how to use International Guidelines of Uncertainty in Measurement (GUM) to calculate such uncertainty. The report also shows that without appropriate corrections to solar measuring instruments (solar radiometers), the uncertainty of measuring shortwave solar irradiance can exceed 4% using present state-of-the-art pyranometers and 2.7% using present state-of-the-art pyrheliometers. Finally, the report demonstrates that by applying the appropriate corrections, uncertainties may be reduced by at least 50%. The uncertainties, with or without the appropriate corrections might not be compatible with the needs of solar energy and atmospheric science applications; yet, this report may shed some light on the sources of uncertainties and the means to reduce overall uncertainty in measuring solar irradiance.

  11. Entropic Uncertainty Principle and Information Exclusion Principle for multiple measurements in the presence of quantum memory

    OpenAIRE

    Zhang, Jun; Zhang, Yang; Yu, Chang-shui

    2015-01-01

    The Heisenberg uncertainty principle shows that no one can specify the values of the non-commuting canonically conjugated variables simultaneously. However, the uncertainty relation is usually applied to two incompatible measurements. We present tighter bounds on both entropic uncertainty relation and information exclusion principle for multiple measurements in the presence of quantum memory. As applications, three incompatible measurements on Werner state and Horodecki's bound entangled stat...

  12. Entropic Uncertainty Relation and Information Exclusion Relation for multiple measurements in the presence of quantum memory

    OpenAIRE

    Zhang, Jun; Zhang, Yang; Yu, Chang-shui

    2015-01-01

    The Heisenberg uncertainty principle shows that no one can specify the values of the non-commuting canonically conjugated variables simultaneously. However, the uncertainty relation is usually applied to two incompatible measurements. We present tighter bounds on both entropic uncertainty relation and information exclusion relation for multiple measurements in the presence of quantum memory. As applications, three incompatible measurements on Werner state and Horodecki's bound entangled state...

  13. ANALYSIS OF MEASUREMENT UNCERTAINTIES IN THE NULLING TEST FOR AIR LEAKAGE FROM RESIDENTIAL DUCTS.

    Energy Technology Data Exchange (ETDEWEB)

    ANDREWS, J.W.

    2001-04-01

    An analysis of measurement uncertainties in a recently proposed method of measuring air leakage in residential duct systems has been carried out. The uncertainties in supply and return leakage rates are expressed in terms of the value of the envelope leakage flow coefficient and the uncertainties in measured pressures and air flow rates. Results of the analysis are compared with data published by two research groups.

  14. Measurement Issues for Energy Efficient Commercial Buildings: Productivity and Performance Uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Jones, D.W.

    2002-05-16

    In previous reports, we have identified two potentially important issues, solutions to which would increase the attractiveness of DOE-developed technologies in commercial buildings energy systems. One issue concerns the fact that in addition to saving energy, many new technologies offer non-energy benefits that contribute to building productivity (firm profitability). The second issue is that new technologies are typically unproven in the eyes of decision makers and must bear risk premiums that offset cost advantages resulting from laboratory calculations. Even though a compelling case can be made for the importance of these issues, for building decision makers to incorporate them in business decisions and for DOE to use them in R&D program planning there must be robust empirical evidence of their existence and size. This paper investigates how such measurements could be made and offers recommendations as to preferred options. There is currently little systematic information on either of these concepts in the literature. Of the two there is somewhat more information on non-energy benefits, but little as regards office buildings. Office building productivity impacts can be observed casually, but must be estimated statistically, because buildings have many interacting attributes and observations based on direct behavior can easily confuse the process of attribution. For example, absenteeism can be easily observed. However, absenteeism may be down because a more healthy space conditioning system was put into place, because the weather was milder, or because firm policy regarding sick days had changed. There is also a general dearth of appropriate information for purposes of estimation. To overcome these difficulties, we propose developing a new data base and applying the technique of hedonic price analysis. This technique has been used extensively in the analysis of residential dwellings. There is also a literature on its application to commercial and industrial

  15. [Evaluation of measurement uncertainty of welding fume in welding workplace of a shipyard].

    Science.gov (United States)

    Ren, Jie; Wang, Yanrang

    2015-12-01

    To evaluate the measurement uncertainty of welding fume in the air of the welding workplace of a shipyard, and to provide quality assurance for measurement. According to GBZ/T 192.1-2007 "Determination of dust in the air of workplace-Part 1: Total dust concentration" and JJF 1059-1999 "Evaluation and expression of measurement uncertainty", the uncertainty for determination of welding fume was evaluated and the measurement results were completely described. The concentration of welding fume was 3.3 mg/m(3), and the expanded uncertainty was 0.24 mg/m(3). The repeatability for determination of dust concentration introduced an uncertainty of 1.9%, the measurement using electronic balance introduced a standard uncertainty of 0.3%, and the measurement of sample quality introduced a standard uncertainty of 3.2%. During the determination of welding fume, the standard uncertainty introduced by the measurement of sample quality is the dominant uncertainty. In the process of sampling and measurement, quality control should be focused on the collection efficiency of dust, air humidity, sample volume, and measuring instruments.
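
    The figures quoted above can be reproduced approximately by combining the three relative standard uncertainties in quadrature and applying a coverage factor of k = 2; the sketch below shows only that arithmetic and is not the JJF 1059-1999 procedure itself.

      import math

      c = 3.3                      # measured welding fume concentration, mg/m^3
      rel_components = [0.019,     # repeatability of the concentration determination
                        0.003,     # electronic balance
                        0.032]     # sample quality (the dominant term)

      # Combined relative standard uncertainty and expanded uncertainty (k = 2).
      u_rel = math.sqrt(sum(r**2 for r in rel_components))
      U = 2 * u_rel * c

      print(f"combined relative standard uncertainty: {u_rel:.2%}")
      print(f"expanded uncertainty: {U:.2f} mg/m^3")   # ~0.25, close to the reported 0.24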

  16. Uncertainty Analysis of Certified Photovoltaic Measurements at the National Renewable Energy Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Emery, K.

    2009-08-01

    Discusses NREL Photovoltaic Cell and Module Performance Characterization Group's procedures to achieve lowest practical uncertainty in measuring PV performance with respect to reference conditions.

  17. Test-Cost-Sensitive Attribute Reduction of Data with Normal Distribution Measurement Errors

    OpenAIRE

    Hong Zhao; Fan Min; William Zhu

    2013-01-01

    The measurement error with normal distribution is universal in applications. Generally, smaller measurement error requires better instrument and higher test cost. In decision making based on attribute values of objects, we shall select an attribute subset with appropriate measurement error to minimize the total test cost. Recently, error-range-based covering rough set with uniform distribution error was proposed to investigate this issue. However, the measurement errors satisfy normal distrib...

  18. Uncertainty in SMAP Soil Moisture Measurements Caused by Dew

    Science.gov (United States)

    Hornbuckle, B. K.; Kruger, A.; Rowlandson, T. L.; Logsdon, S. D.; Kaleita, A.; Yueh, S. H.

    2009-12-01

    the effect of dew on the L-band backscatter. On the first day, cloud cover during the previous night prevented the formation of significant dew. On the final day, a large amount of dew was observed. This dew evaporated as PALS repeatedly collected data over the same flight lines until the vegetation was essentially dry. We will compare the remote sensing data from the first day (no dew) with the data on the third day (heavy dew) and analyze the time-series of data as the dew dried off in order to deduce the effect of dew on the L-band backscatter. We will use three different methods to characterize dew: manual measurements of dew amount; leaf wetness sensor measurements of dew duration; and estimates of dew amount and duration from a land surface process model. The result of our presentation will be an estimate of the uncertainty in SMAP soil moisture measurements that could be caused by dew.

  19. Measurement uncertainty for the determination of uranium in urine by ICP-MS

    Energy Technology Data Exchange (ETDEWEB)

    Ryu, Jichang; Lee, Seungjae; Seol, Jeunggun; Cho, Namchan [KEPCO Nuclear Fuel, Daejeon (Korea, Republic of)]

    2016-10-15

    There is growing concern about the measurement of radioactive materials because of radiation accidents such as the Fukushima NPP accident. Radioactive materials generally cause both external and internal radiation exposure. KEPCO NF is interested in internal dosimetry and focuses on establishing urine analysis, an indirect method for estimating internal dose. Urine samples are analyzed by inductively coupled plasma mass spectrometry (ICP-MS). In this study, we evaluated the uncertainty for the determination of uranium in urine by ICP-MS, considering three main uncertainty components: the initial volume uncertainty, the final volume uncertainty and the instrument analysis uncertainty. The relative expanded uncertainty of the uranium concentration in a worker's urine was 9%. From an uncertainty contribution point of view, the uncertainties caused by the calibration curve and by ICP-MS repeatability contribute the most to the expanded uncertainty. Therefore, it is essential to keep the ICP-MS clean and to use a certified standard solution with low uncertainty when constructing the calibration curve.

  20. Principles and applications of measurement and uncertainty analysis in research and calibration

    Energy Technology Data Exchange (ETDEWEB)

    Wells, C.V.

    1992-11-01

    Interest in Measurement Uncertainty Analysis has grown in the past several years as it has spread to new fields of application, and research and development of uncertainty methodologies have continued. This paper discusses the subject from the perspectives of both research and calibration environments. It presents a history of the development and an overview of the principles of uncertainty analysis embodied in the United States National Standard, ANSI/ASME PTC 19.1-1985, Measurement Uncertainty. Examples are presented in which uncertainty analysis was utilized or is needed to gain further knowledge of a particular measurement process and to characterize final results. Measurement uncertainty analysis provides a quantitative estimate of the interval about a measured value or an experiment result within which the true value of that quantity is expected to lie. Years ago, Harry Ku of the United States National Bureau of Standards stated that "The informational content of the statement of uncertainty determines, to a large extent, the worth of the calibrated value." Today, that statement is just as true about calibration or research results as it was in 1968. Why is that true? What kind of information should we include in a statement of uncertainty accompanying a calibrated value? How and where do we get the information to include in an uncertainty statement? How should we interpret and use measurement uncertainty information? This discussion will provide answers to these and other questions about uncertainty in research and in calibration. The methodology to be described has been developed by national and international groups over the past nearly thirty years, and individuals were publishing information even earlier. Yet the work is largely unknown in many science and engineering arenas. I will illustrate various aspects of uncertainty analysis with some examples drawn from the radiometry measurement and calibration discipline from research activities.

  2. Uncertainty associated with the gravimetric measurement of particulate matter concentration in ambient air.

    Science.gov (United States)

    Lacey, Ronald E; Faulkner, William Brock

    2015-07-01

    This work applied a propagation of uncertainty method to typical total suspended particulate (TSP) sampling apparatus in order to estimate the overall measurement uncertainty. The objectives of this study were to estimate the uncertainty for three TSP samplers, develop an uncertainty budget, and determine the sensitivity of the total uncertainty to environmental parameters. The samplers evaluated were the TAMU High Volume TSP Sampler at a nominal volumetric flow rate of 1.42 m3 min(-1) (50 CFM), the TAMU Low Volume TSP Sampler at a nominal volumetric flow rate of 17 L min(-1) (0.6 CFM) and the EPA TSP Sampler at the nominal volumetric flow rates of 1.1 and 1.7 m3 min(-1) (39 and 60 CFM). Under nominal operating conditions the overall measurement uncertainty was found to vary from 6.1x10(-6) g m(-3) to 18.0x10(-6) g m(-3), which represented an uncertainty of 1.7% to 5.2% of the measurement. Analysis of the uncertainty budget determined that three of the instrument parameters contributed significantly to the overall uncertainty: the uncertainty in the pressure drop measurement across the orifice meter during both calibration and testing and the uncertainty of the airflow standard used during calibration of the orifice meter. Five environmental parameters occurring during field measurements were considered for their effect on overall uncertainty: ambient TSP concentration, volumetric airflow rate, ambient temperature, ambient pressure, and ambient relative humidity. Of these, only ambient TSP concentration and volumetric airflow rate were found to have a strong effect on the overall uncertainty. The technique described in this paper can be applied to other measurement systems and is especially useful where there are no methods available to generate these values empirically. This work addresses measurement uncertainty of TSP samplers used in ambient conditions. Estimation of uncertainty in gravimetric measurements is of particular interest, since as ambient particulate
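
    For a gravimetric sampler the measurand is essentially C = Δm/(Q·t), so the relative uncertainties of the filter mass gain, flow rate and sampling time add in quadrature; the sketch below illustrates this propagation with invented input values rather than the paper's data.

      import math

      # Hypothetical inputs for a gravimetric TSP measurement: filter mass gain,
      # volumetric flow rate and sampling duration, each with a standard uncertainty.
      dm, u_dm = 0.20, 0.002      # mass gain on the filter, g
      Q,  u_Q  = 1.42, 0.04       # volumetric flow rate, m^3/min
      t,  u_t  = 24 * 60, 1.0     # sampling time, min

      C = dm / (Q * t)            # TSP concentration, g/m^3

      # For a pure product/quotient model the relative uncertainties add in quadrature.
      rel_u = math.sqrt((u_dm / dm)**2 + (u_Q / Q)**2 + (u_t / t)**2)
      u_C = rel_u * C

      print(f"C = {C:.2e} g/m^3, u(C) = {u_C:.1e} g/m^3 ({rel_u:.1%})")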

  3. The concordance of directly and indirectly measured built environment attributes and physical activity adoption

    Directory of Open Access Journals (Sweden)

    O'Connor Daniel P

    2011-07-01

    Full Text Available Background Physical activity (PA) adoption is essential for obesity prevention and control, yet ethnic minority women report lower levels of PA and are at higher risk for obesity and its comorbidities compared to Caucasians. Epidemiological studies and ecologic models of health behavior suggest that built environmental factors are associated with health behaviors like PA, but few studies have examined the association between built environment attribute concordance and PA, and no known studies have examined attribute concordance and PA adoption. Purpose The purpose of this study was to associate the degree of concordance between directly and indirectly measured built environment attributes with changes in PA over time among African American and Hispanic Latina women participating in a PA intervention. Method Women (N = 410) completed measures of PA at Time 1 (T1) and Time 2 (T2); environmental data collected at T1 were used to compute concordance between directly and indirectly measured built environment attributes. The association between changes in PA and the degree of concordance between each directly and indirectly measured environmental attribute was assessed using repeated measures analyses. Results There were no significant associations between built environment attribute concordance values and change in self-reported or objectively measured PA. Self-reported PA significantly increased over time (F(1,184) = 7.82, p = .006), but this increase did not vary by ethnicity or any built environment attribute concordance variable. Conclusions Built environment attribute concordance may not be associated with PA changes over time among minority women. In an effort to promote PA, investigators should clarify specific built environment attributes that are important for PA adoption and whether accurate perceptions of these attributes are necessary, particularly among the vulnerable population of minority women.

  4. A Quantitative Measure For Evaluating Project Uncertainty Under Variation And Risk Effects

    Directory of Open Access Journals (Sweden)

    A. Chenarani

    2017-10-01

    Full Text Available The effects of uncertainty on a project and the risk event as the consequence of uncertainty are analyzed. The uncertainty index is proposed as a quantitative measure for evaluating the uncertainty of a project. This is done by employing entropy as the indicator of system disorder and lack of information. By employing this index, the uncertainty of each activity and its increase due to risk effects as well as project uncertainty changes as a function of time can be assessed. The results are implemented and analyzed for a small turbojet engine development project as the case study. The results of this study can be useful for project managers and other stakeholders for selecting the most effective risk management and uncertainty controlling method.
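
    A toy sketch of the entropy idea described above is given below: each activity is assigned a discrete probability distribution over duration outcomes, Shannon entropy measures its disorder, and the activity entropies are summed into a project-level index. The discretization, the unweighted sum and all numbers are assumptions made for illustration, not the authors' formulation.

      import math

      def shannon_entropy(probs):
          """Shannon entropy (in bits) of a discrete probability distribution."""
          return -sum(p * math.log2(p) for p in probs if p > 0)

      # Hypothetical probability distributions over discretized duration outcomes
      # for three project activities, before and after a risk event is considered.
      activities_before = {
          "design":      [0.7, 0.2, 0.1],
          "manufacture": [0.5, 0.3, 0.2],
          "test":        [0.9, 0.1],
      }
      activities_after_risk = {
          "design":      [0.4, 0.3, 0.3],
          "manufacture": [0.3, 0.4, 0.3],
          "test":        [0.6, 0.3, 0.1],
      }

      index_before = sum(shannon_entropy(p) for p in activities_before.values())
      index_after = sum(shannon_entropy(p) for p in activities_after_risk.values())

      print(f"project uncertainty index before risk effects: {index_before:.2f} bits")
      print(f"project uncertainty index with risk effects:   {index_after:.2f} bits")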

  5. Estimation of measurement uncertainties in X-ray computed tomography metrology using the substitution method

    DEFF Research Database (Denmark)

    Müller, Pavel; Hiller, Jochen; Dai, Y.

    2014-01-01

    This paper presents the application of the substitution method for the estimation of measurement uncertainties using calibrated workpieces in X-ray computed tomography (CT) metrology. We have shown that this well-accepted method for uncertainty estimation using tactile coordinate measuring machines can be applied to dimensional CT measurements. The method is based on repeated measurements carried out on a calibrated master piece. The master piece is a component of a dose engine from an insulin pen. Measurement uncertainties estimated from the repeated measurements of the master piece were...
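
    Substitution-type schemes (for example ISO 15530-3 for tactile CMMs) typically build the uncertainty from the calibration uncertainty of the master piece, the spread of the repeated measurements and the observed bias; the sketch below follows that general pattern with invented values and a simple additive model, and is not the authors' exact budget.

      import math
      import statistics

      # Repeated CT measurements of one calibrated feature on the master piece (mm);
      # values are invented for illustration.
      repeats = [5.012, 5.015, 5.010, 5.013, 5.016, 5.011, 5.014, 5.012]
      cal_value = 5.010     # calibrated reference value of the feature, mm
      u_cal = 0.002         # standard uncertainty of the calibration, mm

      u_p = statistics.stdev(repeats)                # spread of the CT measurement process
      bias = statistics.mean(repeats) - cal_value    # systematic deviation from the reference

      # Simple additive combination, with the bias treated as an uncertainty contribution.
      U = 2 * math.sqrt(u_cal**2 + u_p**2 + bias**2)
      print(f"bias = {bias * 1000:.1f} um, U (k=2) = {U * 1000:.1f} um")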

  6. CALiPER Exploratory Study: Accounting for Uncertainty in Lumen Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Bergman, Rolf; Paget, Maria L.; Richman, Eric E.

    2011-03-31

    With a well-defined and shared understanding of uncertainty in lumen measurements, testing laboratories can better evaluate their processes, contributing to greater consistency and credibility of lighting testing, a key component of the U.S. Department of Energy (DOE) Commercially Available LED Product Evaluation and Reporting (CALiPER) program. Reliable lighting testing is a crucial underlying factor contributing toward the success of many energy-efficient lighting efforts, such as the DOE GATEWAY demonstrations, Lighting Facts Label, ENERGY STAR® energy efficient lighting programs, and many others. Uncertainty in measurements is inherent to all testing methodologies, including photometric and other lighting-related testing. Uncertainty exists for all equipment, processes, and systems of measurement in individual as well as combined ways. A major issue with testing and the resulting accuracy of the tests is the uncertainty of the complete process. Individual equipment uncertainties are typically identified, but their relative value in practice and their combined value with other equipment and processes in the same test are elusive concepts, particularly for complex types of testing such as photometry. The total combined uncertainty of a measurement result is important for repeatable and comparative measurements for light emitting diode (LED) products in comparison with other technologies as well as competing products. This study provides a detailed and step-by-step method for determining uncertainty in lumen measurements, working closely with related standards efforts and key industry experts. This report uses the structure proposed in the Guide to the Expression of Uncertainty in Measurement (GUM) for evaluating and expressing uncertainty in measurements. The steps of the procedure are described and a spreadsheet format adapted for integrating sphere and goniophotometric uncertainty measurements is provided for entering parameters, ordering the information, calculating intermediate

  7. Measuring the optimal macroeconomic uncertainty index for Turkey

    Directory of Open Access Journals (Sweden)

    Erdem Havvanur Feyza

    2016-01-01

    Full Text Available The aim of this study is to calculate the optimal macroeconomic uncertainty index for the Turkish economy. The data used in the study are quarterly and cover the period 2002-2014. In this study the index is formed based on the small structural macroeconomic model. The study uses three important econometric processes. First, the model is estimated separately using generalized method of moments (GMM), seemingly unrelated regressions (SUR), and ordinary least squares (OLS). Secondly, the Broyden-Fletcher-Goldfarb-Shanno (BFGS) algorithm is applied as an optimization algorithm. The BFGS algorithm calibrates the model using GMM, SUR, and OLS parameter estimations of the benchmark parameters. Next, the index variables are weighted under the estimated optimal coefficients and, finally, are aggregated to produce the optimal macroeconomic uncertainty index.
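
    As a loose illustration of the calibration step only (not the authors' structural model), the sketch below uses SciPy's BFGS implementation to pull a small parameter vector toward invented benchmark estimates while penalizing a stand-in model residual; the objective function and all numbers are assumptions for illustration.

      import numpy as np
      from scipy.optimize import minimize

      # Pooled benchmark parameter estimates (e.g. from GMM/SUR/OLS); values invented.
      benchmark = np.array([0.8, -0.3, 1.2])

      def objective(theta):
          # Distance from the benchmark estimates plus a stand-in "model fit" penalty.
          model_residual = (theta[0] * theta[2] - 1.0) ** 2
          return np.sum((theta - benchmark) ** 2) + model_residual

      result = minimize(objective, x0=np.zeros(3), method="BFGS")
      print("calibrated parameters:", np.round(result.x, 3))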

  8. Comparison between bottom-up and top-down approaches in the estimation of measurement uncertainty.

    Science.gov (United States)

    Lee, Jun Hyung; Choi, Jee-Hye; Youn, Jae Saeng; Cha, Young Joo; Song, Woonheung; Park, Ae Ja

    2015-06-01

    Measurement uncertainty is a metrological concept to quantify the variability of measurement results. There are two approaches to estimate measurement uncertainty. In this study, we sought to provide practical and detailed examples of the two approaches and compare the bottom-up and top-down approaches to estimating measurement uncertainty. We estimated measurement uncertainty of the concentration of glucose according to CLSI EP29-A guideline. Two different approaches were used. First, we performed a bottom-up approach. We identified the sources of uncertainty and made an uncertainty budget and assessed the measurement functions. We determined the uncertainties of each element and combined them. Second, we performed a top-down approach using internal quality control (IQC) data for 6 months. Then, we estimated and corrected systematic bias using certified reference material of glucose (NIST SRM 965b). The expanded uncertainties at the low glucose concentration (5.57 mmol/L) by the bottom-up approach and top-down approaches were ±0.18 mmol/L and ±0.17 mmol/L, respectively (all k=2). Those at the high glucose concentration (12.77 mmol/L) by the bottom-up and top-down approaches were ±0.34 mmol/L and ±0.36 mmol/L, respectively (all k=2). We presented practical and detailed examples for estimating measurement uncertainty by the two approaches. The uncertainties by the bottom-up approach were quite similar to those by the top-down approach. Thus, we demonstrated that the two approaches were approximately equivalent and interchangeable and concluded that clinical laboratories could determine measurement uncertainty by the simpler top-down approach.
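
    A minimal sketch of a top-down calculation of this kind is shown below: long-term IQC imprecision and the uncertainty of the CRM-based bias correction are combined in quadrature and expanded with k = 2. The data are invented and the treatment of bias follows one common convention, not necessarily the authors' exact CLSI EP29-A implementation.

      import math
      import statistics

      # Several months of IQC results for a glucose control (mmol/L); values invented.
      iqc = [5.52, 5.61, 5.58, 5.49, 5.63, 5.55, 5.60, 5.57, 5.54, 5.59]
      crm_value, u_crm = 5.57, 0.02   # certified value and its standard uncertainty

      u_imprecision = statistics.stdev(iqc)      # long-term within-lab imprecision
      bias = statistics.mean(iqc) - crm_value    # systematic bias versus the CRM
      # After correcting results for the bias, the uncertainty of that correction remains:
      u_bias = math.sqrt(u_crm**2 + (u_imprecision / math.sqrt(len(iqc)))**2)

      u_combined = math.sqrt(u_imprecision**2 + u_bias**2)
      print(f"expanded uncertainty (k=2): +/-{2 * u_combined:.2f} mmol/L")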

  9. Uncertainty in Citizen Science observations: from measurement to user perception

    Science.gov (United States)

    Lahoz, William; Schneider, Philipp; Castell, Nuria

    2016-04-01

    Citizen Science activities concern general public engagement in scientific research activities when citizens actively contribute to science either with their intellectual effort or surrounding knowledge or with their tools and resources. The advent of technologies such as the Internet and smartphones, and the growth in their usage, has significantly increased the potential benefits from Citizen Science activities. Citizen Science observations from low-cost sensors, smartphones and Citizen Observatories, provide a novel and recent development in platforms for observing the Earth System, with the opportunity to extend the range of observational platforms available to society to spatio-temporal scales (10-100s m; 1 hr or less) highly relevant to citizen needs. The potential value of Citizen Science is high, with applications in science, education, social aspects, and policy aspects, but this potential, particularly for citizens and policymakers, remains largely untapped. Key areas where Citizen Science data start to have demonstrable benefits include GEOSS Societal Benefit Areas such as Health and Weather. Citizen Science observations have many challenges, including simulation of smaller spatial scales, noisy data, combination with traditional observational methods (satellite and in situ data), and assessment, representation and visualization of uncertainty. Within these challenges, that of the assessment and representation of uncertainty and its communication to users is fundamental, as it provides qualitative and/or quantitative information that influences the belief users will have in environmental information. This presentation will discuss the challenges in assessment and representation of uncertainty in Citizen Science observations, its communication to users, including the use of visualization, and the perception of this uncertainty information by users of Citizen Science observations.

  10. Measurement uncertainty of ester number, acid number and patchouli alcohol of patchouli oil produced in Yogyakarta

    Science.gov (United States)

    Istiningrum, Reni Banowati; Saepuloh, Azis; Jannah, Wirdatul; Aji, Didit Waskito

    2017-03-01

    Yogyakarta is one of the patchouli oil distillation centers in Indonesia. The quality of patchouli oil greatly affects its market price. Therefore, testing the quality parameters of patchouli oil is an important concern, in part through determination of the measurement uncertainty. This study determines the measurement uncertainty of the ester number, the acid number and the patchouli alcohol content through a bottom-up approach. The contributors to the measurement uncertainty of the ester number are the sample mass, the blank and sample titration volumes, the molar mass of KOH, the HCl normality, and replication. The contributors for the acid number are the sample mass, the sample titration volume, the relative mass and normality of KOH, and repetition. The determination of patchouli alcohol by gas chromatography considers only repeatability as a source of measurement uncertainty because reference materials are not available.

  11. Comparison of different methods to estimate the uncertainty in composition measurement by chromatography.

    Science.gov (United States)

    Ariza, Adriana Alexandra Aparicio; Ayala Blanco, Elizabeth; García Sánchez, Luis Eduardo; García Sánchez, Carlos Eduardo

    2015-06-01

    Natural gas is a mixture that contains hydrocarbons and other compounds, such as CO2 and N2. Natural gas composition is commonly measured by gas chromatography, and this measurement is important for the calculation of some thermodynamic properties that determine its commercial value. The estimation of uncertainty in chromatographic measurement is essential for an adequate presentation of the results and a necessary tool for supporting decision making. Various approaches have been proposed for the uncertainty estimation in chromatographic measurement. The present work is an evaluation of three approaches of uncertainty estimation, where two of them (guide to the expression of uncertainty in measurement method and prediction method) were compared with the Monte Carlo method, which has a wider scope of application. The aforementioned methods for uncertainty estimation were applied to gas chromatography assays of three different samples of natural gas. The results indicated that the prediction method and the guide to the expression of uncertainty in measurement method (in the simple version used) are not adequate to calculate the uncertainty in chromatography measurement, because uncertainty estimations obtained by those approaches are in general lower than those given by the Monte Carlo method. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Examples of measurement uncertainty evaluations in accordance with the revised GUM

    Science.gov (United States)

    Runje, B.; Horvatic, A.; Alar, V.; Medic, S.; Bosnjakovic, A.

    2016-11-01

    The paper presents examples of the evaluation of uncertainty components in accordance with the current and revised Guide to the expression of uncertainty in measurement (GUM). In accordance with the proposed revision of the GUM, a Bayesian approach was applied for both type A and type B evaluations. The law of propagation of uncertainty (LPU) and the law of propagation of distributions, applied through the Monte Carlo method (MCM), were used to evaluate associated standard uncertainties, expanded uncertainties and coverage intervals. Furthermore, the influence of a non-Gaussian dominant input quantity and an asymmetric distribution of the output quantity y on the evaluation of measurement uncertainty was analyzed. In the case when the coverage interval is not probabilistically symmetric, the coverage interval for the probability P is estimated from the experimental probability density function using the Monte Carlo method. Key highlights of the proposed revision of the GUM were analyzed through a set of examples.
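
    A minimal sketch of the propagation-of-distributions approach mentioned above: input samples are drawn, pushed through the measurement model, and a probabilistically symmetric 95% coverage interval is read off the empirical distribution. The model Y = X1/X2 with one rectangular (non-Gaussian) input is invented purely to show the mechanics.

      import numpy as np

      rng = np.random.default_rng(1)
      N = 200_000

      # Inputs: a Gaussian quantity and a dominant, rectangular (non-Gaussian) quantity.
      x1 = rng.normal(10.0, 0.1, N)
      x2 = rng.uniform(1.8, 2.2, N)     # rectangular PDF assigned from limits

      y = x1 / x2                       # measurement model (illustrative)

      y_est = np.mean(y)
      u_y = np.std(y, ddof=1)
      # Probabilistically symmetric 95% coverage interval from the empirical distribution.
      lo, hi = np.percentile(y, [2.5, 97.5])

      print(f"y = {y_est:.3f}, u(y) = {u_y:.3f}, 95% interval = [{lo:.3f}, {hi:.3f}]")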

  13. Uncertainty of Five-Hole Probe Measurements. [of total flow pressure, static pressure, and flow]

    Science.gov (United States)

    Reichert, Bruce A.; Wendt, Bruce J.

    1994-01-01

    A new algorithm for five-hole probe calibration and data reduction using a non-nulling technique was developed, verified, and reported earlier (Wendt and Reichert, 1993). The new algorithm's simplicity permits an analytical treatment of the propagation of uncertainty in five-hole probe measurement. The objectives of the uncertainty analysis are to quantify the uncertainty of five-hole probe results (e.g., total pressure, static pressure, and flow direction) and to determine the dependence of the result uncertainty on the uncertainty of all underlying experimental and calibration measurands. This study outlines a general procedure that other researchers may use to determine five-hole probe result uncertainty and provides guidance for improving the measurement technique.

  14. Doubt-free uncertainty in measurement an introduction for engineers and students

    CERN Document Server

    Ratcliffe, Colin

    2015-01-01

    This volume presents measurement uncertainty and uncertainty budgets in a form accessible to practicing engineers and engineering students from across a wide range of disciplines. The book gives a detailed explanation of the methods presented by NIST in the “GUM” – Guide to Uncertainty of Measurement. Emphasis is placed on explaining the background and meaning of the topics, while keeping the level of mathematics at the minimum level necessary. Dr. Colin Ratcliffe, USNA, and Bridget Ratcliffe, Johns Hopkins, develop uncertainty budgets and explain their use. In some examples, the budget may show a process is already adequate and where costs can be saved. In other examples, the budget may show the process is inadequate and needs improvement. The book demonstrates how uncertainty budgets help identify the most cost effective place to make changes. In addition, an extensive fully-worked case study leads readers through all issues related to an uncertainty analysis, including a variety of different types of...

  15. Uncertainty of measurement for large product verification: evaluation of large aero gas turbine engine datums

    Science.gov (United States)

    Muelaner, J. E.; Wang, Z.; Keogh, P. S.; Brownell, J.; Fisher, D.

    2016-11-01

    Understanding the uncertainty of dimensional measurements for large products such as aircraft, spacecraft and wind turbines is fundamental to improving efficiency in these products. Much work has been done to ascertain the uncertainty associated with the main types of instruments used, based on laser tracking and photogrammetry, and the propagation of this uncertainty through networked measurements. Unfortunately this is not sufficient to understand the combined uncertainty of industrial measurements, which include secondary tooling and datum structures used to locate the coordinate frame. This paper presents for the first time a complete evaluation of the uncertainty of large scale industrial measurement processes. Generic analysis and design rules are proven through uncertainty evaluation and optimization for the measurement of a large aero gas turbine engine. This shows how the instrument uncertainty can be considered to be negligible. Before optimization the dominant source of uncertainty was the tooling design, after optimization the dominant source was thermal expansion of the engine; meaning that no further improvement can be made without measurement in a temperature controlled environment. These results will have a significant impact on the ability of aircraft and wind turbines to improve efficiency and therefore reduce carbon emissions, as well as the improved reliability of these products.

  16. Real Graphs from Real Data: Experiencing the Concepts of Measurement and Uncertainty

    Science.gov (United States)

    Farmer, Stuart

    2012-01-01

    A simple activity using cheap and readily available materials is described that allows students to experience first hand many of the concepts of measurement, uncertainty and graph drawing without laborious measuring or calculation. (Contains 9 figures.)

  17. Verification of the Indicating Measuring Instruments Taking into Account their Instrumental Measurement Uncertainty

    Directory of Open Access Journals (Sweden)

    Zakharov Igor

    2017-12-01

    Full Text Available The specific features of the measuring instruments verification based on the results of their calibration are considered. It is noted that, in contrast to the verification procedure used in the legal metrology, the verification procedure for calibrated measuring instruments has to take the uncertainty of measurements into account. In this regard, a large number of measuring instruments, considered to be in compliance after verification in the legal metrology, turn out to be not in compliance after calibration. In this case, it is necessary to evaluate the probability of compliance of indicating measuring instruments. The procedure of compliance probability determination on the basis of the Monte Carlo method is considered. An example of calibration of a Vernier caliper is given.

  18. Verification of the Indicating Measuring Instruments Taking into Account their Instrumental Measurement Uncertainty

    Science.gov (United States)

    Zakharov, Igor; Neyezhmakov, Pavel; Botsiura, Olesia

    2017-12-01

    The specific features of the measuring instruments verification based on the results of their calibration are considered. It is noted that, in contrast to the verification procedure used in the legal metrology, the verification procedure for calibrated measuring instruments has to take the uncertainty of measurements into account. In this regard, a large number of measuring instruments, considered to be in compliance after verification in the legal metrology, turn out to be not in compliance after calibration. In this case, it is necessary to evaluate the probability of compliance of indicating measuring instruments. The procedure of compliance probability determination on the basis of the Monte Carlo method is considered. An example of calibration of a Vernier caliper is given.
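
    A sketch of the compliance-probability idea under simple assumptions: the error of indication of a caliper is modelled as a Gaussian centred on the calibration result with the calibration standard uncertainty as its standard deviation, and the probability that the error lies within the maximum permissible error is estimated by Monte Carlo. The MPE, the calibration figures and the Gaussian model are all assumptions for illustration.

      import numpy as np

      rng = np.random.default_rng(0)

      error_of_indication = 0.025   # mm, from the calibration certificate (illustrative)
      u_cal = 0.012                 # mm, standard uncertainty of the calibration (illustrative)
      mpe = 0.03                    # mm, maximum permissible error for this caliper class

      # Monte Carlo model of the "true" error of indication.
      samples = rng.normal(error_of_indication, u_cal, 1_000_000)
      p_conform = np.mean(np.abs(samples) <= mpe)

      # A probability well below 1 shows an instrument that passed legal verification
      # but cannot be declared compliant once the calibration uncertainty is considered.
      print(f"probability of compliance: {p_conform:.3f}")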

  19. Preliminary Examination of a Cartoon-Based Hostile Attributional Bias Measure for Urban African American Boys

    Science.gov (United States)

    Leff, Stephen S.; Lefler, Elizabeth K.; Khera, Gagan S.; Paskewich, Brooke; Jawad, Abbas F.

    2014-01-01

    The current study illustrates how researchers developed and validated a cartoon-based adaptation of a written hostile attributional bias measure for a sample of urban, low-income, African American boys. A series of studies were conducted to develop cartoon illustrations to accompany a standard written hostile attributional bias vignette measure (Study 1), to determine initial psychometric properties (Study 2) and acceptability (Study 3), and to conduct a test-retest reliability trial of the adapted measure in a separate sample (Study 4). These studies utilize a participatory action research approach to measurement design and adaptation, and suggest that collaborations between researchers and key school stakeholders can lead to measures that are psychometrically strong, developmentally appropriate, and culturally sensitive. In addition, the cartoon-based hostile attributional bias measure appears to have promise as an assessment and/or outcome measure for aggression and bullying prevention programs conducted with urban African American boys. PMID:21800228

  20. Uncertainty Modeling and Evaluation of CMM Task Oriented Measurement Based on SVCMM

    Science.gov (United States)

    Li, Hongli; Chen, Xiaohuai; Cheng, Yinbao; Liu, Houde; Wang, Hanbin; Cheng, Zhenying; Wang, Hongtao

    2017-10-01

    Due to the variety of measurement tasks and the complexity of the errors of coordinate measuring machine (CMM), it is very difficult to reasonably evaluate the uncertainty of the measurement results of CMM. It has limited the application of CMM. Task oriented uncertainty evaluation has become a difficult problem to be solved. Taking dimension measurement as an example, this paper puts forward a practical method of uncertainty modeling and evaluation of CMM task oriented measurement (called SVCMM method). This method makes full use of the CMM acceptance or reinspection report and the Monte Carlo computer simulation method (MCM). The evaluation example is presented, and the results are evaluated by the traditional method given in GUM and the proposed method, respectively. The SVCMM method is verified to be feasible and practical. It can help CMM users to conveniently complete the measurement uncertainty evaluation through a single measurement cycle.

  1. Uncertainty of scattered light roughness measurements based on speckle correlation methods

    Science.gov (United States)

    Patzelt, Stefan; Stöbener, Dirk; Ströbel, Gerald; Fischer, Andreas

    2017-06-01

    Surface micro topography measurement (e.g., form, waviness, roughness) is a precondition to assess the surface quality of technical components with regard to their applications. Well defined, standardized measuring devices measure and specify geometrical surface textures only under laboratory conditions. Laser speckle-based roughness measurement is a parametric optical scattered light measuring technique that overcomes this confinement. Field of view dimensions of some square millimeters and measuring frequencies in the kHz domain enable in-process roughness characterization of even moving part surfaces. However, camera exposure times of microseconds or less and a high detector pixel density mean less light energy per pixel due to the limited laser power. This affects the achievable measurement uncertainty according to the Heisenberg uncertainty principle. The influence of fundamental, inevitable noise sources such as the laser shot noise and the detector noise is not quantified yet. Therefore, the uncertainty for speckle roughness measurements is analytically estimated. The result confirms the expected inverse proportionality of the measurement uncertainty to the square root of the illuminating light power and the direct proportionality to the detector readout noise, quantization noise and dark current noise, respectively. For the first time it is possible to quantify the achievable measurement uncertainty u(Sa) < 1 nm for the scattered light measuring system. The low uncertainty offers ideal preconditions for in-process roughness measurements in an industrial environment with an aspired resolution of 1 nm.

  2. Measuring Research Data Uncertainty in the 2010 NRC Assessment of Geography Graduate Education

    Science.gov (United States)

    Shortridge, Ashton; Goldsberry, Kirk; Weessies, Kathleen

    2011-01-01

    This article characterizes and measures errors in the 2010 National Research Council (NRC) assessment of research-doctorate programs in geography. This article provides a conceptual model for data-based sources of uncertainty and reports on a quantitative assessment of NRC research data uncertainty for a particular geography doctoral program.…

  3. Uncertainty of measurement or of mean value for the reliable classification of contaminated land.

    Science.gov (United States)

    Boon, Katy A; Ramsey, Michael H

    2010-12-15

    Classification of contaminated land is important for risk assessment and so it is vital to understand and quantify all of the uncertainties that are involved in the assessment of contaminated land. This paper uses a case study to compare two methods for assessing the uncertainty in site investigations (uncertainty of individual measurements, including that from sampling, and uncertainty of the mean value of all measurements within an area) and how the different methods affect the decisions made about a site. Using the 'uncertainty of the mean value' there is shown to be no significant possibility of 'significant harm' under UK guidance at one particular test site, but if you consider the 'uncertainty of the measurements' a significant proportion (50%) of the site is shown to be possibly contaminated. This raises doubts as to whether the current method using 'uncertainty of the mean' is sufficiently robust, and suggests that 'uncertainty of measurement' information may be preferable, or at least beneficial when used in conjunction. Copyright © 2010 Elsevier B.V. All rights reserved.
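
    The distinction drawn above can be made concrete with a small sketch: the same data set yields a narrow interval for the area mean (standard error of the mean) but much wider intervals for individual measurements once measurement-plus-sampling uncertainty is applied, so the two approaches can classify a site differently. The concentrations, the guideline threshold and the 40% expanded measurement uncertainty are invented.

      import math
      import statistics

      threshold = 500.0                                  # mg/kg, illustrative guideline value
      conc = [310, 280, 520, 260, 450, 300, 610, 270,    # measured concentrations, mg/kg
              240, 330, 590, 310]                        # (invented data)
      u_meas_rel = 0.40   # expanded relative uncertainty of one measurement incl. sampling

      # Approach 1: uncertainty of the mean value for the whole area.
      mean = statistics.mean(conc)
      sem = statistics.stdev(conc) / math.sqrt(len(conc))
      mean_upper = mean + 2 * sem
      print(f"mean = {mean:.0f}, upper estimate of mean = {mean_upper:.0f} "
            f"-> exceeds threshold: {mean_upper > threshold}")

      # Approach 2: uncertainty applied to each individual measurement.
      possibly_over = [c for c in conc if c * (1 + u_meas_rel) > threshold]
      print(f"{len(possibly_over)}/{len(conc)} locations possibly exceed the threshold")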

  4. Uncertainty analysis of steady state incident heat flux measurements in hydrocarbon fuel fires.

    Energy Technology Data Exchange (ETDEWEB)

    Nakos, James Thomas

    2005-12-01

    The objective of this report is to develop uncertainty estimates for three heat flux measurement techniques used for the measurement of incident heat flux in a combined radiative and convective environment. This is related to the measurement of heat flux to objects placed inside hydrocarbon fuel (diesel, JP-8 jet fuel) fires, which is very difficult to make accurately (e.g., less than 10%). Three methods will be discussed: a Schmidt-Boelter heat flux gage; a calorimeter and inverse heat conduction method; and a thin plate and energy balance method. Steady state uncertainties were estimated for two types of fires (i.e., calm wind and high winds) at three times (early in the fire, late in the fire, and at an intermediate time). Results showed a large uncertainty for all three methods. Typical uncertainties for a Schmidt-Boelter gage ranged from ±23% for high wind fires to ±39% for low wind fires. For the calorimeter/inverse method the uncertainties were ±25% to ±40%. For the thin plate/energy balance method the uncertainties ranged from ±21% to ±42%. The 23-39% uncertainties for the Schmidt-Boelter gage are much larger than the quoted uncertainty for a radiative only environment (i.e., ±3%). This large difference is due to the convective contribution and because the gage sensitivities to radiative and convective environments are not equal. All these values are larger than desired, which suggests the need for improvements in heat flux measurements in fires.

  5. Helium Mass Spectrometer Leak Detection: A Method to Quantify Total Measurement Uncertainty

    Science.gov (United States)

    Mather, Janice L.; Taylor, Shawn C.

    2015-01-01

    In applications where leak rates of components or systems are evaluated against a leak rate requirement, the uncertainty of the measured leak rate must be included in the reported result. However, in the helium mass spectrometer leak detection method, the sensitivity, or resolution, of the instrument is often the only component of the total measurement uncertainty noted when reporting results. To address this shortfall, a measurement uncertainty analysis method was developed that includes the leak detector unit's resolution, repeatability, hysteresis, and drift, along with the uncertainty associated with the calibration standard. In a step-wise process, the method identifies the bias and precision components of the calibration standard, the measurement correction factor (K-factor), and the leak detector unit. Together these individual contributions to error are combined and the total measurement uncertainty is determined using the root-sum-square method. It was found that the precision component contributes more to the total uncertainty than the bias component, but the bias component is not insignificant. For helium mass spectrometer leak rate tests where unit sensitivity alone is not enough, a thorough evaluation of the measurement uncertainty such as the one presented herein should be performed and reported along with the leak rate value.

  6. Evaluation of measurement uncertainty and its numerical calculation by a Monte Carlo method

    Science.gov (United States)

    Wübbeler, Gerd; Krystek, Michael; Elster, Clemens

    2008-08-01

    The Guide to the Expression of Uncertainty in Measurement (GUM) is the de facto standard for the evaluation of measurement uncertainty in metrology. Recently, evaluation of measurement uncertainty has been proposed on the basis of probability density functions (PDFs) using a Monte Carlo method. The relation between this PDF approach and the standard method described in the GUM is outlined. The Monte Carlo method required for the numerical calculation of the PDF approach is described and illustrated by its application to two examples. The results obtained by the Monte Carlo method for the two examples are compared to the corresponding results when applying the GUM.

  7. Total Measurement Uncertainty (TMU) for Nondestructive Assay of Transuranic (TRU) Waste at the WRAP Facility

    Energy Technology Data Exchange (ETDEWEB)

    WILLS, C.E.

    2000-02-24

    The Waste Receiving and Processing (WRAP) facility, located on the Hanford Site in southeast Washington, is a key link in the certification of Hanford's transuranic (TRU) waste for shipment to the Waste Isolation Pilot Plant (WIPP). Waste characterization is one of the vital functions performed at WRAP, and nondestructive assay (NDA) measurements of TRU waste containers is one of two required methods used for waste characterization (Reference 1). Various programs exist to ensure the validity of waste characterization data; all of these cite the need for clearly defined knowledge of uncertainty, associated with any measurements taken. All measurements have an inherent uncertainty associated with them. The combined effect of all uncertainties associated with a measurement is referred to as the Total Measurement Uncertainty (TMU). The NDA measurement uncertainties can be numerous and complex. In addition to system-induced measurement uncertainty, other factors contribute to the TMU, each associated with a particular measurement. The NDA measurements at WRAP are based on processes (radioactive decay and induced fission) which are statistical in nature. As a result, the proper statistical summation of the various uncertainty components is essential. This report examines the contributing factors to NDA measurement uncertainty at WRAP. The significance of each factor on the TMU is analyzed, and a final method is given for determining the TMU for NDA measurements at WRAP. As more data becomes available, and WRAP gains in operational experience, this report will be reviewed semi-annually and updated as necessary. This report also includes the data flow paths for the analytical process in the radiometric determinations.

  8. Role and Significance of Uncertainty in HV Measurement of Porcelain Insulators - a Case Study

    Science.gov (United States)

    Choudhary, Rahul Raj; Bhardwaj, Pooja; Dayama, Ravindra

    Improved safety margins in complex systems have attained prime importance in the modern scientific environment. The analysis and implementation of complex systems demand well-quantified accuracy and capability of measurements. Careful measurement with properly identified and quantified uncertainties can lead to genuine discovery and may further contribute to social development. Unfortunately, most scientists and students are passively taught to ignore the possibility of definition problems in the field of measurement, which are often a source of great arguments. Recognising this issue, ISO has initiated the standardisation of methodologies, but its Guide to the Expression of Uncertainty in Measurement (GUM) has yet to be adapted seriously in tertiary education institutions for teaching the concept of uncertainty. This paper focuses on the concepts of measurement and uncertainty. Further, a case study on the calculation and quantification of the uncertainty of measurement (UOM) for high-voltage electrical testing of ceramic insulators is presented.

  9. Instrumentation-related uncertainty of reflectance and transmittance measurements with a two-channel spectrophotometer.

    Science.gov (United States)

    Peest, Christian; Schinke, Carsten; Brendel, Rolf; Schmidt, Jan; Bothe, Karsten

    2017-01-01

    Spectrophotometers are operated in numerous fields of science and industry for a variety of applications. In order to provide confidence for the measured data, analyzing the associated uncertainty is valuable. However, the uncertainty of the measurement results is often unknown or reduced to sample-related contributions. In this paper, we describe our approach for the systematic determination of the measurement uncertainty of the commercially available two-channel spectrophotometer Agilent Cary 5000 in accordance with the Guide to the expression of uncertainty in measurements. We focus on the instrumentation-related uncertainty contributions rather than the specific application and thus outline a general procedure which can be adapted for other instruments. Moreover, we discover a systematic signal deviation due to the inertia of the measurement amplifier and develop and apply a correction procedure. Thereby we increase the usable dynamic range of the instrument by more than one order of magnitude. We present methods for the quantification of the uncertainty contributions and combine them into an uncertainty budget for the device.

  10. Empirical versus modelling approaches to the estimation of measurement uncertainty caused by primary sampling.

    Science.gov (United States)

    Lyn, Jennifer A; Ramsey, Michael H; Damant, Andrew P; Wood, Roger

    2007-12-01

    Measurement uncertainty is a vital issue within analytical science. There are strong arguments that primary sampling should be considered the first and perhaps the most influential step in the measurement process. Increasingly, analytical laboratories are required to report measurement results to clients together with estimates of the uncertainty. Furthermore, these estimates can be used when pursuing regulation enforcement to decide whether a measured analyte concentration is above a threshold value. With its recognised importance in analytical measurement, the question arises of 'what is the most appropriate method to estimate the measurement uncertainty?'. Two broad methods for uncertainty estimation are identified, the modelling method and the empirical method. In modelling, the estimation of uncertainty involves the identification, quantification and summation (as variances) of each potential source of uncertainty. This approach has been applied to purely analytical systems, but becomes increasingly problematic in identifying all of such sources when it is applied to primary sampling. Applications of this methodology to sampling often utilise long-established theoretical models of sampling and adopt the assumption that a 'correct' sampling protocol will ensure a representative sample. The empirical approach to uncertainty estimation involves replicated measurements from either inter-organisational trials and/or internal method validation and quality control. A more simple method involves duplicating sampling and analysis, by one organisation, for a small proportion of the total number of samples. This has proven to be a suitable alternative to these often expensive and time-consuming trials, in routine surveillance and one-off surveys, especially where heterogeneity is the main source of uncertainty. A case study of aflatoxins in pistachio nuts is used to broadly demonstrate the strengths and weakness of the two methods of uncertainty estimation. The estimate
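
    A minimal sketch of the duplicate method mentioned above, assuming classical (non-robust) statistics: a proportion of sampling targets is sampled and analysed in duplicate, and the variance of measurement (sampling plus analysis) is estimated from the paired differences as s^2 = sum(d^2) / (2n). The data are invented.

      import math

      # Paired results from duplicate sampling + analysis at a subset of targets
      # (e.g. aflatoxin concentration, ug/kg); invented data.
      duplicates = [(4.1, 4.9), (2.3, 2.0), (7.8, 6.5), (3.3, 3.8),
                    (5.0, 4.1), (6.2, 7.4), (2.8, 2.5), (4.6, 5.3)]

      # For duplicate pairs, var(measurement) = mean(d^2) / 2, with d the pair difference.
      u_meas = math.sqrt(sum((a - b) ** 2 for a, b in duplicates) / (2 * len(duplicates)))
      mean_conc = sum(a + b for a, b in duplicates) / (2 * len(duplicates))

      print(f"standard measurement uncertainty (sampling + analysis): {u_meas:.2f} ug/kg")
      print(f"relative expanded uncertainty (k=2): {2 * u_meas / mean_conc:.1%}")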

  11. Working with Error and Uncertainty to Increase Measurement Validity

    Science.gov (United States)

    Amrein-Beardsley, Audrey; Barnett, Joshua H.

    2012-01-01

    Over the previous two decades, the era of accountability has amplified efforts to measure educational effectiveness more than Edward Thorndike, the father of educational measurement, likely would have imagined. Expressly, the measurement structure for evaluating educational effectiveness continues to rely increasingly on one sole…

  12. Radiation Thermometry—Sources of Uncertainty During Contactless Temperature Measurement

    Science.gov (United States)

    Reichel, Denise; Schumann, T.; Skorupa, W.; Lerch, W.; Gelpey, J.

    Short Time Annealing on a microsecond to nanosecond scale presents new challenges to temperature measurement. Pyrometers are widely used owing to their commercial availability, short response time, easy handling and contactless operation. However, they harbor sources of considerable measurement error: false readings are easily obtained, producing large errors during temperature measurement.

  13. The grey relational approach for evaluating measurement uncertainty with poor information

    Science.gov (United States)

    Luo, Zai; Wang, Yanqing; Zhou, Weihu; Wang, Zhongyu

    2015-12-01

    The Guide to the Expression of Uncertainty in Measurement (GUM) is the master document for measurement uncertainty evaluation. However, the GUM may encounter problems and does not work well when the measurement data have poor information. In most cases, poor information means a small data sample and an unknown probability distribution. In these cases, the evaluation of measurement uncertainty has become a bottleneck in practical measurement. To solve this problem, a novel method called the grey relational approach (GRA), different from the statistical theory, is proposed in this paper. The GRA does not require a large sample size or probability distribution information of the measurement data. Mathematically, the GRA can be divided into three parts. Firstly, according to grey relational analysis, the grey relational coefficients between the ideal and the practical measurement output series are obtained. Secondly, the weighted coefficients and the measurement expectation function will be acquired based on the grey relational coefficients. Finally, the measurement uncertainty is evaluated based on grey modeling. In order to validate the performance of this method, simulation experiments were performed and the evaluation results show that the GRA can keep the average error around 5%. Besides, the GRA was also compared with the grey method, the Bessel method, and the Monte Carlo method by a real stress measurement. Both the simulation experiments and real measurement show that the GRA is appropriate and effective to evaluate the measurement uncertainty with poor information.
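
    The first step described above, computing grey relational coefficients between an ideal (reference) output series and the practical measurement series, can be sketched with Deng's standard coefficient and a distinguishing coefficient of 0.5, as below; the subsequent weighting and grey-modelling steps of the paper are not reproduced, and the data are invented.

      # Deng's grey relational coefficients between a reference series and a measured series.
      reference = [10.0, 10.0, 10.0, 10.0, 10.0]   # ideal output series (illustrative)
      measured = [10.2, 9.7, 10.1, 10.4, 9.9]      # practical measurement output series

      rho = 0.5                                    # distinguishing coefficient
      deltas = [abs(r - m) for r, m in zip(reference, measured)]
      d_min, d_max = min(deltas), max(deltas)

      coefficients = [(d_min + rho * d_max) / (d + rho * d_max) for d in deltas]
      print("grey relational coefficients:", [round(c, 3) for c in coefficients])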

  14. S-Parameter Uncertainties in Network Analyzer Measurements with Application to Antenna Patterns

    Directory of Open Access Journals (Sweden)

    N. Yannopoulou

    2008-04-01

    Full Text Available An analytical method was developed to estimate uncertainties in full two-port Vector Network Analyzer measurements, using total differentials of S-parameters. System error uncertainties were also estimated from total differentials involving two triples of standards, in the Direct Through connection case. Standard load uncertainties and measurement inaccuracies were represented by independent differentials. Complex uncertainty in any quantity, differentiably dependent on S-parameters, is estimated by the corresponding Differential Error Region. Real uncertainties, rectangular and polar, are estimated by the orthogonal parallelogram and annular sector circumscribed about the Differential Error Region, respectively. From the user's point of view, manufacturers' data may be used to set the independent differentials and apply the method. Demonstration results include: (1) system error differentials for Short, matching Load and Open pairs of opposite-sex standards; (2) system error uncertainties for a VNA extended by two lengthy transmission lines with opposite-sex end-connectors; (3) high uncertainties in Z-parameters against frequency of an appropriately designed, DC resistive, T-Network; (4) moderate uncertainties in amplitude and phase patterns of a designed UHF radial discone antenna (azimuthally rotated by a built positioner, under developed software control of a built hardware controller), polarization coupled with a constructed gain standard antenna (stationary), in an anechoic chamber.

  15. Quantifying measurement uncertainties in ADCP measurements in non-steady, inhomogeneous flow

    Science.gov (United States)

    Schäfer, Stefan

    2017-04-01

    The author presents a laboratory study of fixed-platform four-beam ADCP and three-beam ADV measurements in the tailrace of a micro hydro power setup with a 35 kW Kaplan turbine and 2.5 m head. The datasets discussed quantify measurement uncertainties of the ADCP measurement technique arising from non-steady, inhomogeneous flow. For a constant discharge of 1.5 m3/s, two different flow scenarios were investigated: one being the regular tailrace flow downstream of the draft tube and the second being a straightened, less inhomogeneous flow, which was generated by the use of a flow straightening device: a rack of 40 mm diameter pipe sections was mounted right behind the draft tube. ADCP measurements (sampling rate 1.35 Hz) were conducted at three distances behind the draft tube and compared bin-wise to measurements of three simultaneously measuring ADV probes (sampling rate 64 Hz). The ADV probes were aligned horizontally and the ADV bins were placed in the centers of two facing ADCP bins and in the vertical under the ADCP probe of the corresponding depth. Rotating the ADV probes by 90° allowed for measurements of the other two facing ADCP bins. For reasons of mutual probe interaction, ADCP and ADV measurements were not conducted at the same time. The datasets were evaluated by using mean and fluctuation velocities. Turbulence parameters were calculated and compared as far as applicable. Uncertainties coming from non-steady flow were estimated with the normalized mean square error and evaluated by comparing long-term measurements of 60 minutes to shorter measurement intervals. Uncertainties coming from inhomogeneous flow were evaluated by comparison of ADCP with ADV data along the ADCP beams where ADCP data were effectively measured and in the vertical under the ADCP probe where velocities of the ADCP measurements were displayed. Errors coming from non-steady flow could be compensated through sufficiently long measurement intervals with high enough sampling rates depending on the

  16. Uncertainty analysis of standardized measurements of random-incidence absorption and scattering coefficients.

    Science.gov (United States)

    Müller-Trapet, Markus; Vorländer, Michael

    2015-01-01

    This work presents an analysis of the effect of some uncertainties encountered when measuring absorption or scattering coefficients in the reverberation chamber according to International Organization for Standardization/American Society for Testing and Materials standards. This especially relates to the uncertainty due to spatial fluctuations of the sound field. By analyzing the mathematical definition of the respective coefficient, a relationship between the properties of the chamber and the test specimen and the uncertainty in the measured quantity is determined and analyzed. The validation of the established equations is presented through comparisons with measurement data. This study analytically explains the main sources of error and provides a method to obtain the product of the necessary minimum number of measurement positions and the band center frequency to achieve a given maximum uncertainty in the desired quantity. It is shown that this number depends on the ratio of room volume to sample surface area and the reverberation time of the empty chamber.

  17. Uncertainty measurement in the homogenization and sample reduction in the physical classification of rice and beans

    Directory of Open Access Journals (Sweden)

    Dieisson Pivoto

    2016-04-01

    Full Text Available ABSTRACT: The study aimed to (i) quantify the measurement uncertainty in the physical tests of rice and beans for a hypothetical defect, (ii) verify whether homogenization and sample reduction in the physical classification tests of rice and beans are effective in reducing the measurement uncertainty of the process and (iii) determine whether an increase in the size of the bean sample increases accuracy and reduces measurement uncertainty significantly. Hypothetical defects in rice and beans with different damage levels were simulated according to the testing methodology determined by the Normative Ruling of each product. The homogenization and sample reduction in the physical classification of rice and beans are not effective, transferring a high measurement uncertainty to the final test result. The sample size indicated by the Normative Ruling did not allow an appropriate homogenization and should be increased.

  18. Orientation Uncertainty of Structures Measured in Cored Boreholes: Methodology and Case Study of Swedish Crystalline Rock

    Science.gov (United States)

    Stigsson, Martin

    2016-11-01

    Many engineering applications in fractured crystalline rocks use measured orientations of structures, such as rock contacts and fractures, and of lineated objects, such as foliation and rock stress, mapped in boreholes as their foundation. Although these measurements are afflicted with uncertainties, very few attempts to quantify their magnitudes and effects on the inferred orientations have been reported. Relying only on the specified tool imprecision may considerably underestimate the actual uncertainty space. The present work identifies nine sources of uncertainty, develops inference models of their magnitudes, and points out possible implications for the inference of orientation models and thereby effects on downstream models. The uncertainty analysis in this work builds on a unique data set from site investigations performed by the Swedish Nuclear Fuel and Waste Management Co. (SKB). During these investigations, more than 70 boreholes with a maximum depth of 1 km were drilled in crystalline rock, with a cumulative length of more than 34 km including almost 200,000 single fracture intercepts. The work presented, hence, relies on orientations of fractures. However, the techniques to infer the magnitude of orientation uncertainty may be applied to all types of structures and lineated objects in boreholes. The uncertainties are not solely detrimental, but can be valuable, provided that the reason for their presence is properly understood and the magnitudes correctly inferred. The main findings of this work are as follows: (1) knowledge of the orientation uncertainty is crucial in order to be able to infer a correct orientation model and parameters coupled to the fracture sets; (2) it is important to perform multiple measurements to be able to infer the actual uncertainty instead of relying on the theoretical uncertainty provided by the manufacturers; (3) it is important to use the most appropriate tool for the prevailing circumstances; and (4) the single most

  19. Classical Information-Theoretical View of Physical Measurements and Generalized Uncertainty Relations

    OpenAIRE

    Kurihara, Yoshimasa

    2012-01-01

    General characterizations of physical measurements are discussed within the framework of the classical information theory. The uncertainty relation for simultaneous measurements of two physical observables is defined in this framework for generalized dynamic systems governed by a Sturm–Liouville type of equation of motion. In the first step, the reduction of Kennard–Robertson type uncertainties due to boundary conditions with a mean-square error is discussed quantitatively with reference to...

  20. PDF uncertainties in precision electroweak measurements, including the W mass, in ATLAS

    CERN Document Server

    Cooper-Sarkar, Amanda; The ATLAS collaboration

    2015-01-01

    Now that the Higgs mass is known, all the parameters of the SM are known, but with what accuracy? Precision EW measurements test the self-consistency of the SM and thus can give hints of BSM physics. Precision measurements of $\sin^2\theta_W$ and the W mass are limited by PDF uncertainties. This contribution discusses these uncertainties and what can be done to improve them.

  1. Uncertainty analysis guide

    Energy Technology Data Exchange (ETDEWEB)

    Andres, T.H

    2002-05-01

    This guide applies to the estimation of uncertainty in quantities calculated by scientific, analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner by which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)
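
    As an illustration of the propagation options named above (series approximation versus Monte Carlo), the sketch below propagates two input uncertainties through a toy nonlinear model both ways. The model, values and uncertainties are invented for illustration and are not taken from the AECL guide.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy model output y = f(a, b) with uncertain inputs a and b (a minimal sketch;
# the guide's actual models and inputs are not reproduced here).
f = lambda a, b: a * np.exp(0.1 * b)

a0, ua = 5.0, 0.2          # best estimates and standard uncertainties
b0, ub = 2.0, 0.5

# First-order (series-approximation) propagation: u_y^2 = (df/da*ua)^2 + (df/db*ub)^2
dfda = np.exp(0.1 * b0)
dfdb = 0.1 * a0 * np.exp(0.1 * b0)
u_series = np.hypot(dfda * ua, dfdb * ub)

# Monte Carlo propagation of the same input uncertainties
a = rng.normal(a0, ua, 200_000)
b = rng.normal(b0, ub, 200_000)
y = f(a, b)
print(f"series: {u_series:.4f}   Monte Carlo: {y.std(ddof=1):.4f}")
```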

  2. An uncertainty relation in terms of generalized metric adjusted skew information and correlation measure

    Science.gov (United States)

    Fan, Ya-Jing; Cao, Huai-Xin; Meng, Hui-Xian; Chen, Liang

    2016-12-01

    The uncertainty principle in quantum mechanics is a fundamental relation with different forms, including Heisenberg's uncertainty relation and Schrödinger's uncertainty relation. In this paper, we prove a Schrödinger-type uncertainty relation in terms of generalized metric adjusted skew information and correlation measure by using operator monotone functions, which reads $U_\rho^{(g,f)}(A)\,U_\rho^{(g,f)}(B) \ge \frac{f(0)^{2} l}{k}\left|\mathrm{Corr}_\rho^{s(g,f)}(A,B)\right|^{2}$ for some operator monotone functions f and g, all n-dimensional observables A, B and a non-singular density matrix ρ. As applications, we derive some new uncertainty relations for Wigner-Yanase skew information and Wigner-Yanase-Dyson skew information.
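
    For reference, the skew-information quantities named in the applications above are usually defined as follows (standard definitions, not quoted from the paper itself):

```latex
% Wigner-Yanase skew information (standard definition)
I_{\rho}(A) = -\tfrac{1}{2}\,\operatorname{Tr}\!\big(\big[\rho^{1/2},\,A\big]^{2}\big),
\qquad
% Wigner-Yanase-Dyson skew information, 0 < \alpha < 1
I_{\rho}^{\alpha}(A) = -\tfrac{1}{2}\,\operatorname{Tr}\!\big(\big[\rho^{\alpha},\,A\big]\big[\rho^{1-\alpha},\,A\big]\big).
```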

  3. Measurement uncertainty of dissolution test of acetaminophen immediate release tablets using Monte Carlo simulations

    Directory of Open Access Journals (Sweden)

    Daniel Cancelli Romero

    2017-10-01

    Full Text Available ABSTRACT Analytical results are widely used to assess batch-by-batch conformity and pharmaceutical equivalence, as well as in the development of drug products. Despite this, few papers describing the measurement uncertainty estimation associated with these results were found in the literature. Here, we describe a simple procedure used for estimating the measurement uncertainty associated with the dissolution test of acetaminophen tablets. A fractional factorial design was used to define a mathematical model that explains the amount of acetaminophen dissolved (%) as a function of time of dissolution (from 20 to 40 minutes), volume of dissolution media (from 800 to 1000 mL), pH of dissolution media (from 2.0 to 6.8), and rotation speed (from 40 to 60 rpm). Using Monte Carlo simulations, we estimated the measurement uncertainty for the dissolution test of acetaminophen tablets (95.2 ± 1.0%), with a 95% confidence level. Rotation speed was the most important source of uncertainty, contributing about 96.2% of the overall uncertainty. Finally, it is important to note that the uncertainty calculated in this paper reflects the expected uncertainty of the dissolution test, and does not consider variations in the content of acetaminophen.
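
    The Monte Carlo step described above can be sketched as follows: draw the operating conditions from assumed distributions, push them through the fitted factorial model, and read off the mean and a 95% interval. The model coefficients and input uncertainties below are hypothetical placeholders, not the paper's fitted values.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical linear model from a factorial design (coefficients are made up
# for illustration; the paper's fitted model is not reproduced here).
def dissolved_pct(time_min, volume_ml, ph, rpm):
    return 80.0 + 0.40 * (time_min - 30) + 0.01 * (volume_ml - 900) \
           - 0.50 * (ph - 4.4) + 0.15 * (rpm - 50)

n = 100_000
# Assumed uncertainties of the operating conditions around their nominal values
time   = rng.normal(30, 0.5, n)     # minutes
volume = rng.normal(900, 5.0, n)    # mL
ph     = rng.normal(4.4, 0.05, n)
rpm    = rng.normal(50, 1.0, n)

y = dissolved_pct(time, volume, ph, rpm)
lo, hi = np.percentile(y, [2.5, 97.5])
print(f"dissolved = {y.mean():.1f} %, 95 % interval [{lo:.1f}, {hi:.1f}] %")
```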

  4. Study and Application on Stability Classification of Tunnel Surrounding Rock Based on Uncertainty Measure Theory

    Directory of Open Access Journals (Sweden)

    Hujun He

    2014-01-01

    Full Text Available Based on uncertainty measure theory, a stability classification and order-arranging model of surrounding rock was established. Considering the practical engineering geologic conditions, 5 factors that influence surrounding rock stability were taken into account and an uncertainty measure function was obtained from the in situ data. In this model, uncertainty influence factors were analyzed quantitatively and qualitatively based on the real situation; the weights of the indices were assigned based on information entropy theory; the surrounding rock stability level was judged based on the credible degree recognition criterion; and the surrounding rock was ordered based on the order-arranging criterion. Furthermore, this model was employed to evaluate the surrounding rock of 5 sections in the Dongshan tunnel of Huainan. The results show that the uncertainty measure method is reasonable and can provide a useful reference for surrounding rock stability evaluation in the future.

  5. Measuring diversity in medical reports based on categorized attributes and international classification systems.

    Science.gov (United States)

    Přečková, Petra; Zvárová, Jana; Zvára, Karel

    2012-04-12

    Narrative medical reports do not use standardized terminology and often provide insufficient information for statistical processing and medical decision making. The objectives of the paper are to propose a method for measuring diversity in medical reports written in any language, to compare diversities in narrative and structured medical reports and to map attributes and terms to selected classification systems. A new method based on a general concept of f-diversity is proposed for measuring the diversity of medical reports in any language. The method is based on categorized attributes recorded in narrative or structured medical reports and on international classification systems. Values of categories are expressed by terms. Using SNOMED CT and ICD 10 we map attributes and terms to predefined codes. We use f-diversities of the Gini-Simpson and Number of Categories types to compare diversities of narrative and structured medical reports. The comparison is based on attributes selected from the Minimal Data Model for Cardiology (MDMC). We compared diversities of 110 Czech narrative medical reports and 1119 Czech structured medical reports. Selected categorized attributes of MDMC had mostly different numbers of categories and used different terms in narrative and structured reports. We found more than 60% of MDMC attributes in SNOMED CT. We showed that attributes in narrative medical reports had greater diversity than the same attributes in structured medical reports. Further, we replaced each value of category (term) used for attributes in narrative medical reports by the closest term and the category used in MDMC for structured medical reports. We found that relative Gini-Simpson diversities in structured medical reports were significantly smaller than those in narrative medical reports except the "Allergy" attribute. Terminology in narrative medical reports is not standardized. Therefore it is nearly impossible to map values of attributes (terms) to codes of known

  6. Measuring diversity in medical reports based on categorized attributes and international classification systems

    Directory of Open Access Journals (Sweden)

    Přečková Petra

    2012-04-01

    Full Text Available Abstract Background Narrative medical reports do not use standardized terminology and often provide insufficient information for statistical processing and medical decision making. The objectives of the paper are to propose a method for measuring diversity in medical reports written in any language, to compare diversities in narrative and structured medical reports and to map attributes and terms to selected classification systems. Methods A new method based on a general concept of f-diversity is proposed for measuring the diversity of medical reports in any language. The method is based on categorized attributes recorded in narrative or structured medical reports and on international classification systems. Values of categories are expressed by terms. Using SNOMED CT and ICD 10 we map attributes and terms to predefined codes. We use f-diversities of the Gini-Simpson and Number of Categories types to compare diversities of narrative and structured medical reports. The comparison is based on attributes selected from the Minimal Data Model for Cardiology (MDMC). Results We compared diversities of 110 Czech narrative medical reports and 1119 Czech structured medical reports. Selected categorized attributes of MDMC had mostly different numbers of categories and used different terms in narrative and structured reports. We found more than 60% of MDMC attributes in SNOMED CT. We showed that attributes in narrative medical reports had greater diversity than the same attributes in structured medical reports. Further, we replaced each value of category (term) used for attributes in narrative medical reports by the closest term and the category used in MDMC for structured medical reports. We found that relative Gini-Simpson diversities in structured medical reports were significantly smaller than those in narrative medical reports except the "Allergy" attribute. Conclusions Terminology in narrative medical reports is not standardized. Therefore it is nearly
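
    A minimal sketch of the Gini-Simpson part of the f-diversity comparison used in the two records above: the diversity is 1 minus the sum of squared category proportions, and the Number of Categories is simply the count of distinct terms. The attribute and term lists are invented examples, not MDMC data.

```python
from collections import Counter

def gini_simpson(terms):
    """Gini-Simpson diversity 1 - sum(p_i^2) over the category frequencies."""
    counts = Counter(terms)
    n = sum(counts.values())
    return 1.0 - sum((c / n) ** 2 for c in counts.values())

# Hypothetical terms recorded for one attribute (e.g. "smoking status")
narrative  = ["smoker", "smokes", "nicotine user", "smoker", "ex-smoker", "smokes"]
structured = ["smoker", "smoker", "smoker", "ex-smoker", "smoker", "ex-smoker"]

print(gini_simpson(narrative), len(set(narrative)))    # higher diversity, more categories
print(gini_simpson(structured), len(set(structured)))  # lower diversity after standardisation
```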

  7. Nacelle power curve measurement with spinner anemometer and uncertainty evaluation

    DEFF Research Database (Denmark)

    Demurtas, Giorgio; Friis Pedersen, Troels; Wagner, Rozenn

    2016-01-01

    The objective of this investigation was to verify the feasibility of using the spinner anemometer calibration and nacelle transfer function determined on one reference turbine to assess the power performance of a second identical turbine. An experiment was set up with a met-mast in a position suitable to measure the power curve of the two wind turbines, both equipped with a spinner anemometer. An IEC 61400-12-1 compliant power curve was then measured for both turbines using the met-mast. The NTF (Nacelle Transfer Function) was measured on the reference turbine and then applied to both turbines to calculate the free wind speed. For each of the two wind turbines, the power curve (PC) was measured with the met-mast and the nacelle power curve (NPC) with the spinner anemometer. Four power curves (two PC and two NPC) were compared in terms of AEP (Annual Energy Production) for a Rayleigh wind speed...
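
    The AEP comparison mentioned above weights a binned power curve with a Rayleigh wind-speed distribution (the IEC 61400-12-1 convention). The sketch below shows that weighting for a hypothetical power curve; the bin values and annual mean wind speed are assumptions, not the measured PC/NPC data.

```python
import numpy as np

def aep_rayleigh(v_bins, power_kw, v_ave=7.0, hours=8760.0):
    """AEP (kWh) from a binned power curve, weighting bins with a Rayleigh
    wind-speed distribution of annual mean v_ave (IEC 61400-12-1 style)."""
    F = 1.0 - np.exp(-np.pi / 4.0 * (np.asarray(v_bins) / v_ave) ** 2)  # Rayleigh CDF
    p = np.asarray(power_kw, dtype=float)
    return hours * np.sum(np.diff(F) * (p[1:] + p[:-1]) / 2.0)

# Hypothetical binned power curve (bin-centre wind speed m/s, mean power kW)
v = np.arange(1, 16)
p = np.array([0, 0, 5, 25, 60, 110, 180, 260, 350, 430, 480, 500, 500, 500, 500])

print(f"AEP = {aep_rayleigh(v, p) / 1e6:.2f} GWh")
```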

  8. Experimental estimation of mismatch uncertainty in radio – frequency power and attenuation measurements

    Directory of Open Access Journals (Sweden)

    Patel Kamlesh

    2015-01-01

    Full Text Available In this paper, the effects of representing the input quantities in linear and complex forms are analyzed to estimate the mismatch uncertainty separately for one-port and two-port components. The mismatch uncertainties in power and attenuation measurements are evaluated for direct, ratio and substitution techniques with the use of a vector network analyzer system in the range of 1 to 18 GHz. The estimated mismatch uncertainties were compared for the same device under test, and the values verified that their evaluation depends on the representation of the input quantities. In power measurements, the mismatch uncertainty is reduced when it is evaluated from the voltage standing wave ratio or reflection coefficient magnitudes rather than from the complex reflection coefficients. The mismatch uncertainties in the attenuation measurements are found to be higher, and to increase linearly, when estimated from linear magnitude values rather than from the S-parameters of the attenuator. Thus, in practice, the mismatch uncertainty is estimated more accurately using input quantities measured in the same representation as the measured quantity.
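
    A small sketch of why the representation of the input quantities matters for mismatch uncertainty: with only VSWR or |Γ| magnitudes, the phase is unknown and the mismatch enters as an uncertainty, whereas with complex reflection coefficients the mismatch factor becomes a known correction. The numbers and the U-shaped-distribution treatment below are generic textbook assumptions, not the paper's data.

```python
import numpy as np

def gamma_from_vswr(vswr):
    return (vswr - 1.0) / (vswr + 1.0)

# Magnitude-only estimate (VSWR / |Gamma| known, phases unknown):
# mismatch limits ~ +/- 2|Gg||Gl|, U-shaped distribution -> divide by sqrt(2)
g_gen, g_load = gamma_from_vswr(1.30), gamma_from_vswr(1.20)
limits = 2.0 * g_gen * g_load
u_mag_only = limits / np.sqrt(2.0)

# Complex estimate when the full reflection coefficients are available
Gg = 0.10 * np.exp(1j * np.deg2rad(35))
Gl = 0.06 * np.exp(1j * np.deg2rad(-120))
mismatch_factor = abs(1.0 - Gg * Gl) ** 2   # becomes a known correction, not an uncertainty

print(f"magnitude-only standard uncertainty ~ {100 * u_mag_only:.2f} % of power")
print(f"complex-data mismatch factor        = {mismatch_factor:.4f}")
```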

  9. Significant Figures in Measurements with Uncertainty: A Working Criterion

    Science.gov (United States)

    Vilchis, Abraham

    2017-01-01

    Generally speaking, students have difficulty reporting measurements and estimates of quantities used in the laboratory, and handling the significant figures associated with them. When required to make calculations involving quantities with different numbers of significant figures, they have difficulty in assigning the corresponding digits…

  10. Evaluating the Sources of Uncertainties in the Measurements from Multiple Pyranometers and Pyrheliometers

    Energy Technology Data Exchange (ETDEWEB)

    Habte, Aron; Sengupta, Manajit; Andreas, Afshin; Dooraghi, Mike; Reda, Ibrahim; Kutchenreiter, Mark

    2017-03-13

    Traceable radiometric data sets are essential for validating climate models, validating satellite-based models for estimating solar resources, and validating solar radiation forecasts. The current state-of-the-art radiometers have uncertainties in the range of 2% to 5%, and sometimes more [1]. The National Renewable Energy Laboratory (NREL) and other organizations are identifying uncertainties, improving radiometric measurement performance, and developing a consensus methodology for acquiring radiometric data. This study analyzes the impact of differing specifications, such as cosine response, thermal offset, spectral response, and others, on the accuracy of radiometric data for various radiometers. The study will also provide insight on how to perform a measurement uncertainty analysis and how to reduce the impact of some of the sources of uncertainty.

  11. Graphical Representations of Data Improve Student Understanding of Measurement and Uncertainty: An Eye-Tracking Study

    Science.gov (United States)

    Susac, Ana; Bubic, Andreja; Martinjak, Petra; Planinic, Maja; Palmovic, Marijan

    2017-01-01

    Developing a better understanding of the measurement process and measurement uncertainty is one of the main goals of university physics laboratory courses. This study investigated the influence of graphical representation of data on student understanding and interpreting of measurement results. A sample of 101 undergraduate students (48 first year…

  12. Interval Predictor Models for Data with Measurement Uncertainty

    Science.gov (United States)

    Lacerda, Marcio J.; Crespo, Luis G.

    2017-01-01

    An interval predictor model (IPM) is a computational model that predicts the range of an output variable given input-output data. This paper proposes strategies for constructing IPMs based on semidefinite programming and sum of squares (SOS). The models are optimal in the sense that they yield an interval valued function of minimal spread containing all the observations. Two different scenarios are considered. The first one is applicable to situations where the data is measured precisely whereas the second one is applicable to data subject to known biases and measurement error. In the latter case, the IPMs are designed to fully contain regions in the input-output space where the data is expected to fall. Moreover, we propose a strategy for reducing the computational cost associated with generating IPMs as well as means to simulate them. Numerical examples illustrate the usage and performance of the proposed formulations.

  13. Uncertainty budget and interlaboratory field tests in SO2 and NOx emission measurements

    OpenAIRE

    Poulleau, Jean; Raventos, Cécile; Blank, Frans; Emmenegger, Lukas; Gould, Richard; Kassman, Hakan; Pilage, Emile; Reynaud, Serge; Rokkjaer, Joern; Waeber, Michael

    2004-01-01

    International audience; This paper compares two techniques to assess uncertainty on emission measurements. The first one, described in ISO 14956, gives an appropriate procedure to establish uncertainty budgets from systematic assessment of factors influencing the result. The second approach consists in the quantification of the fidelity of the method during inter and intra-laboratory field experiments set out according to ISO 5725-2. The comparison has been carried out for two reference metho...

  14. Uncertainty Reduction Via Parameter Design of A Fast Digital Integrator for Magnetic Field Measurement

    CERN Document Server

    Arpaia, P; Lucariello, G; Spiezia, G

    2007-01-01

    At the European Organization for Nuclear Research (CERN), within the new Large Hadron Collider (LHC) project, measurements of magnetic flux with an uncertainty of 10 ppm at a few decades of Hz for several minutes are required. With this aim, a new Fast Digital Integrator (FDI) has been developed in cooperation with the University of Sannio, Italy [1]. This paper deals with the final design tuning for achieving the target uncertainty by means of experimental statistical parameter design.

  15. Uncertainty principle for measurable sets and signal recovery in quaternion domains

    Science.gov (United States)

    Kou, Kit Ian; Yang, Yan; Zou, Cuiming

    2017-07-01

    The classical uncertainty principle of harmonic analysis states that a nontrivial function and its Fourier transform cannot both be sharply localized. It plays an important role in signal processing and physics. This paper generalizes the uncertainty principle for measurable sets from complex domain to hypercomplex domain using quaternion algebras, associated with the Quaternion Fourier transform. The performance is then evaluated in signal recovery problems where there is an interplay of missing and time-limiting data.

  16. Measuring the Flexural Strength of Ceramics at Elevated Temperatures – An Uncertainty Analysis

    Directory of Open Access Journals (Sweden)

    Štubňa I.

    2014-02-01

    Full Text Available The flexural mechanical strength was measured at room and elevated temperatures on green ceramic samples made from a quartz electroporcelain mixture. The apparatus exploits a three-point bending arrangement and a magazine for 10 samples, which is advantageous for measurements at temperatures from 20 °C to 1000 °C. A description of the apparatus from the point of view of possible sources of uncertainty is also given. The uncertainty analysis, taking into account the thermal expansion of the sample and of the span between the supports, is performed for 600 °C. Friction between the sample and the supports, as well as friction between mechanical parts of the apparatus, is also considered. The value of the mechanical strength at the temperature of 600 °C is 13.23 ± 0.50 MPa, where the second term is an expanded standard uncertainty. This uncertainty is mostly caused by inhomogeneities in the measured samples. The biggest part of the uncertainty arises from the repeatability of the loading force, which reflects the scatter of the sample properties. The influence of the temperature on the uncertainty value is very small.

  17. A novel method for importance measure analysis in the presence of epistemic and aleatory uncertainties

    Directory of Open Access Journals (Sweden)

    Ren Bo

    2014-06-01

    Full Text Available For structural systems with both epistemic and aleatory uncertainties, research on quantifying the contribution of the epistemic and aleatory uncertainties to the failure probability of the systems is conducted. Based on the method of separating epistemic and aleatory uncertainties in a variable, the core idea of the research is firstly to establish a novel deterministic transition model for auxiliary variables, distribution parameters, random variables, and failure probability, and then to propose the improved importance sampling (IS) to solve the transition model. Furthermore, the distribution parameters and auxiliary variables are sampled simultaneously and independently; therefore, the inefficient sampling procedure with an "inner-loop" for epistemic uncertainty and an "outer-loop" for aleatory uncertainty in traditional methods is avoided. Since the proposed method combines the fast convergence of the proper estimates and searches failure samples in the interesting regions with high efficiency, the proposed method is more efficient than traditional methods for the variance-based failure probability sensitivity measures in the presence of epistemic and aleatory uncertainties. Two numerical examples and one engineering example are introduced for demonstrating the efficiency and precision of the proposed method for structural systems with both epistemic and aleatory uncertainties.

  18. Coherent uncertainty analysis of aerosol measurements from multiple satellite sensors

    Directory of Open Access Journals (Sweden)

    M. Petrenko

    2013-07-01

    Full Text Available Aerosol retrievals from multiple spaceborne sensors, including MODIS (on Terra and Aqua), MISR, OMI, POLDER, CALIOP, and SeaWiFS – altogether, a total of 11 different aerosol products – were comparatively analyzed using data collocated with ground-based aerosol observations from the Aerosol Robotic Network (AERONET) stations within the Multi-sensor Aerosol Products Sampling System (MAPSS, http://giovanni.gsfc.nasa.gov/mapss/ and http://giovanni.gsfc.nasa.gov/aerostat/). The analysis was performed by comparing quality-screened satellite aerosol optical depth or thickness (AOD or AOT) retrievals during 2006–2010 to available collocated AERONET measurements globally, regionally, and seasonally, and deriving a number of statistical measures of accuracy. We used a robust statistical approach to detect and remove possible outliers in the collocated data that can bias the results of the analysis. Overall, the proportion of outliers in each of the quality-screened AOD products was within 7%. Squared correlation coefficient (R2) values of the satellite AOD retrievals relative to AERONET exceeded 0.8 for many of the analyzed products, while root mean square error (RMSE) values for most of the AOD products were within 0.15 over land and 0.07 over ocean. We have been able to generate global maps showing regions where the different products present advantages over the others, as well as the relative performance of each product over different land cover types. It was observed that while MODIS, MISR, and SeaWiFS provide accurate retrievals over most of the land cover types, multi-angle capabilities make MISR the only sensor to retrieve reliable AOD over barren and snow/ice surfaces. Likewise, active sensing enables CALIOP to retrieve aerosol properties over bright-surface closed shrublands more accurately than the other sensors, while POLDER, which is the only one of the sensors capable of measuring polarized aerosols, outperforms other sensors in certain

  19. AVNG SYSTEM SOFTWARE - ATTRIBUTE VERIFICATION SYSTEM WITH INFORMATION BARRIERS FOR MASS AND ISOTOPICS MEASUREMENTS

    Energy Technology Data Exchange (ETDEWEB)

    Modenov, A; Bulatov, M; Livke, A; Morkin, A; Razinkov, S; Safronov, S; Elmont, T; Langner, D; MacArthur, D; Mayo, D; Smith, M; Luke, S J

    2005-06-10

    This report describes the software development for the plutonium attribute verification system, AVNG. A brief synopsis of the technical solution for the measurement system is presented. The main tasks for the software development that is underway are formulated. The development tasks are shown in software structural flowcharts, a measurement system state diagram, and a description of the software. The current status of the AVNG software development is elucidated.

  20. Noncontact Measurement and Detection of Instantaneous Seismic Attributes Based on Complementary Ensemble Empirical Mode Decomposition

    Directory of Open Access Journals (Sweden)

    Yaping Huang

    2017-10-01

    Full Text Available Hilbert–Huang transform (HHT) is a popular method to analyze nonlinear and non-stationary data. It has been widely used in geophysical prospecting. This paper analyzes the mode mixing problems of empirical mode decomposition (EMD) and introduces the noncontact measurement and detection of instantaneous seismic attributes using complementary ensemble empirical mode decomposition (CEEMD). Numerical simulation testing indicates that the CEEMD can effectively solve the mode mixing problems of EMD and can provide stronger anti-noise ability. The decomposed results of the synthetic seismic record show that CEEMD has a better ability to decompose seismic signals. Then, CEEMD is applied to extract instantaneous seismic attributes of 3D seismic data in a real-world coal mine in Inner Mongolia, China. The detection results demonstrate that instantaneous seismic attributes extracted by CEEMD are helpful to effectively identify the undulations of the top interfaces of limestone.
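
    Once a trace has been decomposed, instantaneous attributes follow from the analytic signal of each intrinsic mode function. The sketch below assumes the CEEMD step has already produced an IMF (a synthetic one is used here) and derives instantaneous amplitude and frequency with a Hilbert transform; it is a generic HHT-style illustration, not the paper's processing chain.

```python
import numpy as np
from scipy.signal import hilbert

# One hypothetical IMF (the CEEMD decomposition itself is assumed to be done
# by an external routine and is not reproduced here).
fs = 500.0                                           # sampling rate, Hz
t = np.arange(0, 1.0, 1.0 / fs)
imf = np.sin(2 * np.pi * 30 * t) * np.exp(-2 * t)    # decaying 30 Hz wavelet

analytic = hilbert(imf)
inst_amplitude = np.abs(analytic)                         # envelope attribute
inst_phase = np.unwrap(np.angle(analytic))
inst_frequency = np.diff(inst_phase) / (2 * np.pi) * fs   # instantaneous frequency, Hz

print(inst_amplitude[:5], inst_frequency[:5])
```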

  1. Quantification of the overall measurement uncertainty associated with the passive moss biomonitoring technique: Sample collection and processing.

    Science.gov (United States)

    Aboal, J R; Boquete, M T; Carballeira, A; Casanova, A; Debén, S; Fernández, J A

    2017-05-01

    In this study we examined 6080 data points gathered by our research group during more than 20 years of research on the moss biomonitoring technique, in order to quantify the variability generated by different aspects of the protocol and to calculate the overall measurement uncertainty associated with the technique. The median variance of the concentrations of different pollutants measured in moss tissues attributed to the different methodological aspects was high, reaching values of 2851 (ng·g-1)2 for Cd (sample treatment), 35.1 (μg·g-1)2 for Cu (sample treatment), and 861.7 (ng·g-1)2 for Hg (material selection). These variances correspond to standard deviations that amount to 67, 126 and 59% of the regional background levels of these elements in the study region. The overall measurement uncertainty associated with the worst experimental protocol (5 subsamples, refrigerated, washed, 5 × 5 m size of the sampling area and once a year sampling) was between 2 and 6 times higher than that associated with the optimal protocol (30 subsamples, dried, unwashed, 20 × 20 m size of the sampling area and once a week sampling), and between 1.5 and 7 times higher than that associated with the standardized protocol (30 subsamples and once a year sampling). The overall measurement uncertainty associated with the standardized protocol could generate variations of between 14 and 47% in the regional background levels of Cd, Cu, Hg, Pb and Zn in the study area and much higher levels of variation at polluted sampling sites. We demonstrated that although the overall measurement uncertainty of the technique is still high, it can be reduced by using already well defined aspects of the protocol. Further standardization of the protocol, together with application of the information on the overall measurement uncertainty, would improve the reliability and comparability of the results of different biomonitoring studies, thus extending use of the technique beyond the context of scientific

  2. Measurement uncertainties when determining heat rate, isentropic efficiency and swallowing capacity

    Energy Technology Data Exchange (ETDEWEB)

    Snygg, U.

    1996-05-01

    The objective of the project was to determine the uncertainties when calculating heat rate, isentropic efficiencies and swallowing capacities of power plants. Normally, when a power plant is constructed, the supplier also guarantees certain performance values, e.g. heat rate. When the plant is built and running under normal conditions, an evaluation is done and the guaranteed values are checked. Different measured parameters influence the calculated value differently, and therefore a sensitivity factor can be defined as the sensitivity of a calculated value to a change in the measured value. The product of this factor and the uncertainty of the measured parameter gives the error contribution to the calculated value. For every measured parameter, this factor has to be determined, and the root square sum then gives the overall uncertainty of the calculated parameter. To obtain acceptable data during the evaluation of the plant, a test code is to be followed. The test code also gives guidelines on how large the measurement errors are. In this study, ASME PTC6 and DIN 1943 were used. The results show that not only the test code was of vital importance, but also the distribution of the power output between the HP-IP turbines and the LP turbines. A higher inlet pressure of the LP turbine gives a smaller uncertainty of the isentropic efficiency. An increase from 6 to 13 bar will lower the uncertainty by a factor of 1.5. 10 refs, 24 figs, 23 tabs, 5 appendixes
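
    The sensitivity-factor procedure described above can be sketched numerically: perturb each measured quantity by one unit to get its sensitivity factor, multiply by its uncertainty, and combine the contributions as a root sum of squares. The heat-rate model, measured values and uncertainty limits below are invented placeholders, not values from ASME PTC6 or DIN 1943.

```python
import numpy as np

def heat_rate(fuel_kw, power_kw):             # hypothetical calculated quantity
    return fuel_kw / power_kw

measured = {"fuel_kw": 25_000.0, "power_kw": 10_000.0}
uncert   = {"fuel_kw":    150.0, "power_kw":     20.0}   # assumed measurement uncertainties

contributions = {}
base = heat_rate(**measured)
for name, u in uncert.items():
    bumped = dict(measured)
    bumped[name] += 1.0                        # numerical sensitivity factor (per unit change)
    sensitivity = heat_rate(**bumped) - base
    contributions[name] = sensitivity * u      # error contribution of this parameter

u_total = np.sqrt(sum(c ** 2 for c in contributions.values()))   # root square sum
print(f"heat rate = {base:.4f} +/- {u_total:.4f}")
```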

  3. Invited Article: Concepts and tools for the evaluation of measurement uncertainty

    Science.gov (United States)

    Possolo, Antonio; Iyer, Hari K.

    2017-01-01

    Measurements involve comparisons of measured values with reference values traceable to measurement standards and are made to support decision-making. While the conventional definition of measurement focuses on quantitative properties (including ordinal properties), we adopt a broader view and entertain the possibility of regarding qualitative properties also as legitimate targets for measurement. A measurement result comprises the following: (i) a value that has been assigned to a property based on information derived from an experiment or computation, possibly also including information derived from other sources, and (ii) a characterization of the margin of doubt that remains about the true value of the property after taking that information into account. Measurement uncertainty is this margin of doubt, and it can be characterized by a probability distribution on the set of possible values of the property of interest. Mathematical or statistical models enable the quantification of measurement uncertainty and underlie the varied collection of methods available for uncertainty evaluation. Some of these methods have been in use for over a century (for example, as introduced by Gauss for the combination of mutually inconsistent observations or for the propagation of "errors"), while others are of fairly recent vintage (for example, Monte Carlo methods including those that involve Markov Chain Monte Carlo sampling). This contribution reviews the concepts, models, methods, and computations that are commonly used for the evaluation of measurement uncertainty, and illustrates their application in realistic examples drawn from multiple areas of science and technology, aiming to serve as a general, widely accessible reference.

  4. On the impact of systematical uncertainties for the CP violation measurement in superbeam experiments

    CERN Document Server

    Huber, Patrick; Schwetz, Thomas

    2008-01-01

    Superbeam experiments can, in principle, achieve impressive sensitivities for CP violation in neutrino oscillations for large $\\theta_{13}$. We study how those sensitivities depend on assumptions about systematical uncertainties. We focus on the second phase of T2K, the so-called T2HK experiment, and we explicitly include a near detector in the analysis. Our main result is that even an idealised near detector cannot remove the dependence on systematical uncertainties completely. Thus additional information is required. We identify certain combinations of uncertainties, which are the key to improve the sensitivity to CP violation, for example the ratio of electron to muon neutrino cross sections and efficiencies. For uncertainties on this ratio larger than 2%, T2HK is systematics dominated. We briefly discuss how our results apply to a possible two far detector configuration, called T2KK. We do not find a significant advantage with respect to the reduction of systematical errors for the measurement of CP viola...

  5. Total Measurement Uncertainty (TMU) for Nondestructive Assay of Transuranic (TRU) Waste at the WRAP Facility

    Energy Technology Data Exchange (ETDEWEB)

    WILLS, C.E.

    1999-12-06

    This report examines the contributing factors to NDA measurement uncertainty at WRAP. The significance of each factor on the TMU is analyzed, and a final method is given for determining the TMU for NDA measurements at WRAP. As more data becomes available, and WRAP gains in operational experience, this report will be reviewed semi-annually and updated as necessary.

  6. Measurement Uncertainty Evaluation in Dimensional X-ray Computed Tomography Using the Bootstrap Method

    DEFF Research Database (Denmark)

    Hiller, Jochen; Genta, Gianfranco; Barbato, Giulio

    2014-01-01

    measurement processes, e.g., with tactile systems, also due to factors related to systematic errors, mainly caused by specific CT image characteristics. In this paper we propose a simulation-based framework for measurement uncertainty evaluation in dimensional CT using the bootstrap method. In a case study...

  7. Technical notes: A detailed study for the provision of measurement uncertainty and traceability for goniospectrometers

    NARCIS (Netherlands)

    Peltoniemi, J.I.; Hakala, T.; Suomalainen, J.M.; Honkavaara, E.; Markelin, L.; Gritsevich, M.; Eskelinen, J.; Jaanson, P.; Ikonen, E.

    2014-01-01

    The measurement uncertainty and traceability of the Finnish Geodetic Institute's field gonio-spectro-polarimeter FIGIFIGO have been assessed. First, the reference standard (Spectralon sample) was measured at the National Standard Laboratory of MIKES-Aalto. This standard was transferred to FGI's

  8. Total Measurement Uncertainty (TMU) for Nondestructive Assay of Transuranic (TRU) Waste at the WRAP Facility

    Energy Technology Data Exchange (ETDEWEB)

    WILLS, C.E.

    2000-01-06

    This report examines the contributing factors to NDA measurement uncertainty at WRAP. The significance of each factor on the TMU is analyzed, and a final method is given for determining the TMU for NDA measurements at WRAP. As more data becomes available and WRAP gains in operational experience, this report will be reviewed semi-annually and updated as necessary.

  9. Measuring parent attributes and supervision behaviors relevant to child injury risk: examining the usefulness of questionnaire measures.

    Science.gov (United States)

    Morrongiello, B A; House, K

    2004-04-01

    This study aimed to identify self report questionnaire measures of parent attributes and behaviors that have relevance for understanding injury risk among children 2-5 years of age, and test a new Parent Supervision Attributes Profile Questionnaire (PSAPQ) that was developed to measure aspects of protectiveness and parent supervision. Naturalistic observations were conducted of parents' supervision of children on playgrounds, with questionnaires subsequently completed by the parent to measure parent education, family income, parent personality attributes, attributes relevant to parent supervision, and beliefs about parents' control over the child's health status. These measures were then related to children's risk taking and injury history. Visual supervision, auditory supervision, and physical proximity were highly intercorrelated, indicating that parents employed all types of behaviors in service of supervision, rather than relying predominantly on one type of supervisory behavior. Physical proximity was the only aspect of supervision behavior that served a protective function and related to children's risk taking behaviors: parents who remained close to their children had children who engaged in less risk taking. On questionnaires, parents who reported more conscientiousness, protectiveness, worry about safety, vigilance in supervision, confidence in their ability to keep their child safe, and belief in control over their child's health had children who showed less risk taking and/or experienced fewer injuries. The new PSAPQ measure was associated with specific aspects of supervision as well as children's risk taking and injury history. This study reveals several parent attributes and behaviors with relevance for child injury risk that can be measured via self report questionnaires, including the new PSAPQ.

  10. Estimating the Uncertainty of Tensile Strength Measurement for A Photocured Material Produced by Additive Manufacturing

    Directory of Open Access Journals (Sweden)

    Adamczak Stanisław

    2014-08-01

    Full Text Available The aim of this study was to estimate the measurement uncertainty for a material produced by additive manufacturing. The material investigated was FullCure 720 photocured resin, which was applied to fabricate tensile specimens with a Connex 350 3D printer based on PolyJet technology. The tensile strength of the specimens, established through static tensile testing, was used to determine the measurement uncertainty. There is a need for extensive research into the performance of model materials obtained via 3D printing, as they have not been studied as thoroughly as metal alloys or plastics, the most common structural materials. In this analysis, the measurement uncertainty was estimated using a larger number of samples than usual, i.e., thirty instead of the typical ten. The results can be very useful to engineers who design models and finished products using this material. The investigations also show how wide the scatter of results is.

  11. Measuring organizational attributes of primary care practices: development of a new instrument.

    Science.gov (United States)

    Ohman-Strickland, Pamela A; John Orzano, A; Nutting, Paul A; Perry Dickinson, W; Scott-Cawiezell, Jill; Hahn, Karissa; Gibel, Michelle; Crabtree, Benjamin F

    2007-06-01

    To develop an instrument to measure organizational attributes relevant for family practices using the perspectives of clinicians, nurses, and staff. Clinicians, nurses, and office staff (n=640) from 51 community family medicine practices. A survey for use in family medicine practices, designed to measure a practice's internal resources for change, was created by a multidisciplinary panel of experts in primary care research and health care organizational performance. This survey was administered in a cross-sectional study to a sample of diverse practices participating in an intervention trial. A factor analysis identified groups of questions relating to latent constructs of practices' internal resources for capacity to change. ANOVA methods were used to confirm that the factors differentiated practices. The survey was administered to all staff from 51 practices. The factor analysis resulted in four stable and internally consistent factors. Three of these factors, "communication," "decision-making," and "stress/chaos," describe resources for change in primary care practices. One factor, labeled "history of change," may be useful in assessing the success of interventions. A 21-item questionnaire can reliably measure four important organizational attributes relevant to family practices. These attributes can be used both as outcome measures as well as important features for targeting system interventions.

  12. Assessing Differences Between Results Determined According to the Guide to the Expression of Uncertainty in Measurement.

    Science.gov (United States)

    Kacker, Raghu N; Kessel, Rüdiger; Sommer, Klaus-Dieter

    2010-01-01

    In some metrology applications, multiple results of measurement for a common measurand are obtained and it is necessary to determine whether the results agree with each other. A result of measurement based on the Guide to the Expression of Uncertainty in Measurement (GUM) consists of a measured value together with its associated standard uncertainty. In the GUM, the measured value is regarded as the expected value and the standard uncertainty is regarded as the standard deviation, both known values, of a state-of-knowledge probability distribution. A state-of-knowledge distribution represented by a result need not be completely known. Then how can one assess the differences between the results based on the GUM? Metrologists have for many years used the Birge chi-square test as 'a rule of thumb' to assess the differences between two or more measured values for the same measurand by pretending that the standard uncertainties were the standard deviations of the presumed sampling probability distributions from random variation of the measured values. We point out that this is a misuse of the standard uncertainties; the Birge test and the concept of statistical consistency motivated by it do not apply to the results of measurement based on the GUM. In 2008, the International Vocabulary of Metrology, third edition (VIM3) introduced the concept of metrological compatibility. We propose that the concept of metrological compatibility be used to assess the differences between results based on the GUM for the same measurand. A test of the metrological compatibility of two results of measurement does not conflict with a pairwise Birge test of the statistical consistency of the corresponding measured values.
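
    To make the contrast concrete, the sketch below computes both a Birge-style chi-square statistic (which the record above argues is a misuse of GUM standard uncertainties) and a simple VIM3-style metrological compatibility check for two hypothetical results of the same measurand (uncorrelated case; a coverage factor of 2 is assumed).

```python
import numpy as np
from scipy.stats import chi2

x = np.array([10.03, 10.10])     # hypothetical measured values of one measurand
u = np.array([0.02, 0.03])       # their standard uncertainties

# Birge chi-square "rule of thumb" (shown only for contrast)
w = 1.0 / u**2
xw = np.sum(w * x) / np.sum(w)                 # weighted mean
chi2_obs = np.sum(w * (x - xw) ** 2)
p_value = chi2.sf(chi2_obs, df=len(x) - 1)

# VIM3 metrological compatibility of two results (uncorrelated case):
# |x1 - x2| <= kappa * sqrt(u1^2 + u2^2), with e.g. kappa = 2
kappa = 2.0
compatible = abs(x[0] - x[1]) <= kappa * np.hypot(u[0], u[1])

print(f"Birge chi2 = {chi2_obs:.2f} (p = {p_value:.3f}); compatible (kappa=2): {compatible}")
```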

  13. About uncertainties related to the indirect method of measuring radiation doses in paediatric radiography

    Energy Technology Data Exchange (ETDEWEB)

    Lacerda, Mas; Da Silva, Ta [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN), Belo Horizonte (Brazil); Khoury, H.J. [Universidade Federal de Pernambuco (DEN/UFPE), Recife (Brazil); Azevedo, A.C.P. [Fundacao Oswaldo Cruz (ENSP -CESTEH), Rio de Janeiro (Brazil)

    2006-07-01

    The indirect method of measuring radiation doses in diagnostic radiology has played an important role in large-scale dosimetric surveys of paediatric patients. Determining the uncertainties associated with this method is crucial for comparing the results surveyed in different radiology departments for optimisation purposes. Entrance surface doses (E.S.D.) received by paediatric patients in chest and skull radiographies were estimated by the indirect method in three public hospitals of the city of Belo Horizonte in Brazil: two general hospitals and a children's specialist one. Uncertainties of the entrance doses were calculated from the uncertainties of the output measurements, backscatter factors, patient data and technique factors employed, within a 95% confidence limit. In a room of one general hospital, E.S.D. values for diagnostic images of the chest were (74 ± 12%) μGy for a one-year-old child, (92 ± 11%) μGy for a five-year-old child and (135 ± 12%) μGy for a ten-year-old child. E.S.D. values in the two radiographic procedures studied for a five-year-old child were generally lower than those published by the Commission of the European Communities in 1996 and higher than those published by the National Radiological Protection Board in 2000. The uncertainties of the output measurements and of the technique factors employed (a consequence of the non-standardisation of technique factors) were determinants of the high values of uncertainty found in some rooms. (authors)

  14. Uncertainty analysis of signal deconvolution using a measured instrument response function

    Science.gov (United States)

    Hartouni, E. P.; Beeman, B.; Caggiano, J. A.; Cerjan, C.; Eckart, M. J.; Grim, G. P.; Hatarik, R.; Moore, A. S.; Munro, D. H.; Phillips, T.; Sayre, D. B.

    2016-11-01

    A common analysis procedure minimizes the ln-likelihood that a set of experimental observables matches a parameterized model of the observation. The model includes a description of the underlying physical process as well as the instrument response function (IRF). In the case investigated here, the National Ignition Facility (NIF) neutron time-of-flight (nTOF) spectrometers, the IRF is constructed from measurements and models. IRF measurements have a finite precision that can make significant contributions to determine the uncertainty estimate of the physical model's parameters. We apply a Bayesian analysis to properly account for IRF uncertainties in calculating the ln-likelihood function used to find the optimum physical parameters.

  15. NIS method for uncertainty estimation of airborne sound insulation measurement in field

    Directory of Open Access Journals (Sweden)

    El-Basheer Tarek M.

    2017-01-01

    Full Text Available In structures, airborne sound insulation is used to characterize the acoustic quality of barriers between rooms. However, the assessment of the sound insulation index is sometimes difficult or even questionable, both in field and laboratory measurements, even though unified measurement procedures are specified in the ISO 140 series standards. There are issues with the reproducibility and repeatability of the measurement results. Some difficulties may be caused by non-diffuse acoustic fields, non-uniform reverberation times, or errors in the reverberation time measurements. Minor issues are also posed by flanking transmission. In this paper, the uncertainties of the above-mentioned measurement components and their impact on the combined uncertainty are investigated in 1/3-octave frequency bands. The total measurement uncertainty model combines several partial uncertainties, which are evaluated by Type A or Type B methods. In addition, the determination of the sound reduction index according to ISO 140-4 has been performed.
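
    For orientation, the field quantity evaluated in ISO 140-4 combines the level difference between rooms with the receiving-room absorption. The sketch below evaluates the apparent sound reduction index for a few hypothetical 1/3-octave bands; all values are assumptions, not NIS data.

```python
import numpy as np

# Apparent sound reduction index R' = L1 - L2 + 10*log10(S/A),
# with A = 0.16*V/T (Sabine equivalent absorption area of the receiving room).
L1 = np.array([78.0, 80.0, 82.0])   # source-room levels, dB
L2 = np.array([45.0, 43.0, 41.0])   # receiving-room levels, dB
T  = np.array([0.60, 0.55, 0.50])   # receiving-room reverberation times, s
S, V = 10.0, 50.0                   # partition area m^2, receiving-room volume m^3

A = 0.16 * V / T                    # equivalent absorption area, m^2
R_apparent = L1 - L2 + 10.0 * np.log10(S / A)
print(R_apparent)                   # dB per band
```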

  16. Two procedures for the estimation of the uncertainty of spectral irradiance measurement for UV source calibration

    Science.gov (United States)

    Obaton, A.-F.; Lebenberg, J.; Fischer, N.; Guimier, S.; Dubard, J.

    2007-04-01

    The measurement uncertainty of the spectral irradiance of an UV lamp is computed by using the law of propagation of uncertainty (LPU) as described in the 'Guide to the Expression of Uncertainty in Measurement' (GUM), considering only a first-order Taylor series approximation. Since the spectral irradiance model displays a non-linear feature and since an asymmetric probability density function (PDF) is assigned to some input quantities, the usage of another process was required to validate the LPU method. The propagation of distributions using Monte Carlo (MC) simulations, as depicted in the supplement of the GUM (GUM-S1), was found to be a relevant alternative solution. The validation of the LPU method by the MC method is discussed with regard to PDF choices, and the benefit of the MC method over the LPU method is illustrated.
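
    A minimal sketch of the validation idea: evaluate a nonlinear model once with first-order LPU and once by propagating distributions with Monte Carlo, including one asymmetric input PDF. The model and numbers below are placeholders, not the UV-lamp spectral irradiance model.

```python
import numpy as np

rng = np.random.default_rng(3)

def model(d, s):            # toy nonlinear model, e.g. inverse-square dependence on distance d
    return s / d**2

d0, ud = 0.50, 0.005        # metres, normal input
s0 = 1.0                    # arbitrary source term with asymmetric knowledge
s_samples = rng.triangular(0.95, 1.00, 1.10, 500_000)   # asymmetric input PDF
us = s_samples.std(ddof=1)

# LPU (first-order): u_y^2 = (dy/dd * ud)^2 + (dy/ds * us)^2
u_lpu = np.hypot(-2 * s0 / d0**3 * ud, 1.0 / d0**2 * us)

# Propagation of distributions (GUM-S1 style Monte Carlo)
y = model(rng.normal(d0, ud, s_samples.size), s_samples)
lo, hi = np.percentile(y, [2.5, 97.5])
print(f"LPU: u = {u_lpu:.3f};  MC: mean = {y.mean():.3f}, 95 % interval [{lo:.3f}, {hi:.3f}]")
```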

  17. Modelling and Measurement Uncertainty Estimation for Integrated AFM-CMM Instrument

    DEFF Research Database (Denmark)

    Hansen, Hans Nørgaard; Bariani, Paolo; De Chiffre, Leonardo

    2005-01-01

    This paper describes modelling of an integrated AFM - CMM instrument, its calibration, and estimation of measurement uncertainty. Positioning errors were seen to limit the instrument performance. Software for off-line stitching of single AFM scans was developed and verified, which allows compensation... An uncertainty of 0.8% was achieved for the case of surface mapping of 1.2 × 1.2 mm2 consisting of 49 single AFM scanned areas.

  18. Research on uncertainty evaluation measure and method of voltage sag severity

    Science.gov (United States)

    Liu, X. N.; Wei, J.; Ye, S. Y.; Chen, B.; Long, C.

    2018-01-01

    Voltage sag is an inevitable and serious power quality problem in power systems. This paper provides a general summary and review of the concepts, indices, and evaluation methods related to voltage sag severity. Considering the complexity and uncertainty of the influencing factors and the damage degree, as well as the characteristics and requirements of voltage sag severity on the source, network, and load sides, the measurement concepts and their conditions of existence, together with the evaluation indices and methods of voltage sag severity, are analyzed. Current evaluation techniques, such as stochastic theory, fuzzy logic, and their fusion, are reviewed in detail. An index system for voltage sag severity is provided for comprehensive study. The main aim of this paper is to propose ideas and methods for severity research based on advanced uncertainty theory and uncertainty measures. This study may serve as a valuable guide for researchers interested in the domain of voltage sag severity.

  19. Calculation of measurement uncertainty for plastic (ABS) material in flexural testing

    Directory of Open Access Journals (Sweden)

    Gunay A.

    2013-01-01

    Full Text Available In order to determine the mechanical properties of materials, various kinds of tests can be applied, characterizing properties such as tensile strength, lower yield stress, proof stress, impact strength, Brinell, Rockwell and surface hardness, and elongation after fracture. Among these tests, the three-point flexural testing method has advantages such as easy sample preparation (production) and the absence of gripping problems compared to the tension test. Flexural test results should be obtained accurately to provide the expected testing performance. The measurement uncertainty of flexural tests should be calculated by considering all significant uncertainty parameters in the test procedure. In this study, the measurement uncertainty of the flexural test of ABS (Acrylonitrile Butadiene Styrene), which is widely used as an industrial plastic material in many applications, was investigated.

  20. Recent Surface Reflectance Measurement Campaigns with Emphasis on Best Practices, SI Traceability and Uncertainty Estimation

    Science.gov (United States)

    Helder, Dennis; Thome, Kurtis John; Aaron, Dave; Leigh, Larry; Czapla-Myers, Jeff; Leisso, Nathan; Biggar, Stuart; Anderson, Nik

    2012-01-01

    A significant problem facing the optical satellite calibration community is limited knowledge of the uncertainties associated with fundamental measurements, such as surface reflectance, used to derive satellite radiometric calibration estimates. In addition, it is difficult to compare the capabilities of calibration teams around the globe, which leads to differences in the estimated calibration of optical satellite sensors. This paper reports on two recent field campaigns that were designed to isolate common uncertainties within and across calibration groups, particularly with respect to ground-based surface reflectance measurements. Initial results from these efforts suggest the uncertainties can be as low as 1.5% to 2.5%. In addition, methods for improving the cross-comparison of calibration teams are suggested that can potentially reduce the differences in the calibration estimates of optical satellite sensors.

  1. Regional inversion of CO2 ecosystem fluxes from atmospheric measurements. Reliability of the uncertainty estimates

    Energy Technology Data Exchange (ETDEWEB)

    Broquet, G.; Chevallier, F.; Breon, F.M.; Yver, C.; Ciais, P.; Ramonet, M.; Schmidt, M. [Laboratoire des Sciences du Climat et de l' Environnement, CEA-CNRS-UVSQ, UMR8212, IPSL, Gif-sur-Yvette (France); Alemanno, M. [Servizio Meteorologico dell' Aeronautica Militare Italiana, Centro Aeronautica Militare di Montagna, Monte Cimone/Sestola (Italy); Apadula, F. [Research on Energy Systems, RSE, Environment and Sustainable Development Department, Milano (Italy); Hammer, S. [Universitaet Heidelberg, Institut fuer Umweltphysik, Heidelberg (Germany); Haszpra, L. [Hungarian Meteorological Service, Budapest (Hungary); Meinhardt, F. [Federal Environmental Agency, Kirchzarten (Germany); Necki, J. [AGH University of Science and Technology, Krakow (Poland); Piacentino, S. [ENEA, Laboratory for Earth Observations and Analyses, Palermo (Italy); Thompson, R.L. [Max Planck Institute for Biogeochemistry, Jena (Germany); Vermeulen, A.T. [Energy research Centre of the Netherlands ECN, EEE-EA, Petten (Netherlands)

    2013-07-01

    The Bayesian framework of CO2 flux inversions permits estimates of the retrieved flux uncertainties. Here, the reliability of these theoretical estimates is studied through a comparison against the misfits between the inverted fluxes and independent measurements of the CO2 Net Ecosystem Exchange (NEE) made by the eddy covariance technique at local (few hectares) scale. Regional inversions at 0.5° resolution are applied for the western European domain where ~50 eddy covariance sites are operated. These inversions are conducted for the period 2002-2007. They use a mesoscale atmospheric transport model, a prior estimate of the NEE from a terrestrial ecosystem model and rely on the variational assimilation of in situ continuous measurements of CO2 atmospheric mole fractions. Averaged over monthly periods and over the whole domain, the misfits are in good agreement with the theoretical uncertainties for prior and inverted NEE, and pass the chi-square test for the variance at the 30% and 5% significance levels respectively, despite the scale mismatch and the independence between the prior (respectively inverted) NEE and the flux measurements. The theoretical uncertainty reduction for the monthly NEE at the measurement sites is 53% while the inversion decreases the standard deviation of the misfits by 38%. These results build confidence in the NEE estimates at the European/monthly scales and in their theoretical uncertainty from the regional inverse modelling system. However, the uncertainties at the monthly (respectively annual) scale remain larger than the amplitude of the inter-annual variability of monthly (respectively annual) fluxes, so that this study does not engender confidence in the inter-annual variations. The uncertainties at the monthly scale are significantly smaller than the seasonal variations. The seasonal cycle of the inverted fluxes is thus reliable. In particular, the CO2 sink period over the European continent likely ends later than

  2. Investment in flood protection measures under climate change uncertainty. An investment decision

    Energy Technology Data Exchange (ETDEWEB)

    Bruin, Karianne de

    2012-11-01

    Recent river flooding in Europe has triggered debates among scientists and policymakers on future projections of flood frequency and the need for adaptive investments, such as flood protection measures. Because there is uncertainty about the impact of climate change on flood risk, such investments require a careful analysis of expected benefits and costs. The objective of this paper is to show how climate change uncertainty affects the decision to invest in flood protection measures. We develop a model that simulates optimal decision making in flood protection; it incorporates flexible timing of investment decisions and scientific uncertainty on the extent of climate change impacts. This model allows decision-makers to cope with the uncertain impacts of climate change on the frequency and damage of river flood events and minimises the risk of under- or over-investment. One of the innovative elements is that we explicitly distinguish between structural and non-structural flood protection measures. Our results show that the optimal investment decision today depends strongly on the cost structure of the adaptation measures and the discount rate, especially the ratio of fixed and weighted annual costs of the measures. A higher level of annual flood damage and later resolution of uncertainty in time increase the optimal investment. Furthermore, the optimal investment decision today is influenced by the possibility of the decision-maker to adjust his decision at a future moment in time. (auth)

  3. Measuring illness uncertainty in men undergoing active surveillance for prostate cancer.

    Science.gov (United States)

    Bailey, Donald E; Wallace, Meredith; Latini, David M; Hegarty, Josephine; Carroll, Peter R; Klein, Eric A; Albertsen, Peter C

    2011-11-01

    Uncertainty is an aversive experience and plays an important role in the lives of men undergoing active surveillance (AS; earlier referred to as watchful waiting) for early-stage prostate cancer. Yet reliable and valid measures of uncertainty have not been fully tested in this population. This secondary analysis therefore tested the reliability of the Mishel Uncertainty in Illness Scale Community Form (MUIS-C; M.H. Mishel, 1997b) for use with men undergoing AS for prostate cancer. Item-Total correlations were conducted on the 23 items of the MUIS-C with four samples of men undergoing AS. Cronbach's alpha for the full MUIS-C was .908; 22 of 23 items showed significant positive correlations with the total score. Removing the item without a significant correlation from the reliability analysis increased Cronbach's alpha to .913. The Mishel Uncertainty in Illness Scale-Community Form for Active Surveillance is a reliable and valid tool for measuring uncertainty with men undergoing AS for prostate cancer. Copyright © 2011 Elsevier Inc. All rights reserved.
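
    A brief sketch of the reliability statistics used above (Cronbach's alpha and corrected item-total correlations) on simulated item responses; the response data and scale size are hypothetical and only illustrate the computations.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical responses: 50 respondents x 23 items on a 1-5 scale,
# driven by a common latent trait plus noise.
n_respondents, n_items = 50, 23
latent = rng.normal(size=(n_respondents, 1))
noise = rng.normal(0.0, 0.8, size=(n_respondents, n_items))
items = np.clip(np.rint(3 + latent + noise), 1, 5)

def cronbach_alpha(x):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = x.shape[1]
    item_var = x.var(axis=0, ddof=1).sum()
    total_var = x.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_var / total_var)

def corrected_item_total(x):
    """Correlation of each item with the total of the remaining items."""
    total = x.sum(axis=1)
    return np.array([np.corrcoef(x[:, j], total - x[:, j])[0, 1]
                     for j in range(x.shape[1])])

print("alpha:", round(cronbach_alpha(items), 3))
print("lowest corrected item-total r:", round(corrected_item_total(items).min(), 3))
```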

  4. Measuring Illness Uncertainty in Men Undergoing Active Surveillance (AS) for Prostate Cancer

    Science.gov (United States)

    Bailey, Donald E.; Wallace, Meredith; Latini, David M.; Hegarty, Josephine; Carroll, Peter R.; Klein, Eric A.; Albertsen, Peter C.

    2010-01-01

    Background and Purpose Uncertainty is an aversive experience and plays an important role in the lives of men undergoing active surveillance (AS) (earlier referred to as watchful waiting) for early-stage prostate cancer. Yet reliable and valid measures of uncertainty have not been fully tested in this population. This secondary analysis therefore tested the reliability of the Mishel Uncertainty in Illness Scale – Community Form (MUIS-C; Mishel, 1997b) for use with a population of men undergoing AS for prostate cancer. Methods Item-to-total correlations were conducted on the 23 items of the MUIS-C with four samples of men undergoing AS. Results Cronbach’s alpha for the full MUIS-C was .908; 22 of 23 items showed significant positive correlations with the total score. Removing the item without a significant correlation from the reliability analysis increased Cronbach’s alpha to .913. Conclusions The Mishel Uncertainty in Illness Scale – Community Form for Active Surveillance (MUIS-C-AS) is a reliable and valid tool for measuring uncertainty with men undergoing AS for prostate cancer. PMID:20974073

  5. Estimation of the measurement uncertainty of methamphetamine and amphetamine in hair analysis.

    Science.gov (United States)

    Lee, Sooyeun; Park, Yonghoon; Yang, Wonkyung; Han, Eunyoung; Choe, Sanggil; Lim, Miae; Chung, Heesun

    2009-03-10

    The measurement uncertainties (MUs) were estimated for the determination of methamphetamine (MA) and its main metabolite, amphetamine (AP), at low concentrations (around the cut-off value of MA) in human hair according to the recommendations of the EURACHEM/CITAC Guide and the "Guide to the expression of uncertainty in measurement (GUM)". MA and AP were extracted by agitating hair with 1% HCl in methanol, followed by derivatization and quantification using GC-MS. The major components contributing to their uncertainties were the amount of MA or AP in the test sample, the weight of the test sample and the method precision, based on the equation used to calculate the measurand from intermediate values. Consequently, the concentrations of MA and AP in the hair sample with their expanded uncertainties were 0.66 ± 0.05 and 1.01 ± 0.06 ng/mg, respectively, which were acceptable to support the successful application of the analytical method. The method precision and the weight of the hair sample gave the largest contributions to the overall combined uncertainties of MA and AP.
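
    A compact sketch of a GUM-style uncertainty budget of this kind, combining relative uncertainties from the amount, the sample weight and the method precision; the numerical values are hypothetical and do not reproduce the paper's budget.

```python
import math

# Hypothetical budget for a hair methamphetamine result near the cut-off.
concentration = 0.66        # ng/mg, measured value
rel_u_amount = 0.025        # relative standard uncertainty of the amount (calibration)
rel_u_weight = 0.020        # relative standard uncertainty of the sample weight
rel_u_precision = 0.022     # relative standard uncertainty from method precision

# Combine relative components in quadrature, then expand with k = 2.
rel_u_combined = math.sqrt(rel_u_amount**2 + rel_u_weight**2 + rel_u_precision**2)
u_combined = concentration * rel_u_combined
U_expanded = 2.0 * u_combined

print(f"result: {concentration:.2f} +/- {U_expanded:.2f} ng/mg (k = 2)")
```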

  6. Uncertainty in the use of MAMA software to measure particle morphological parameters from SEM images

    Energy Technology Data Exchange (ETDEWEB)

    Schwartz, Daniel S. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Tandon, Lav [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-06-05

    The MAMA software package developed at LANL is designed to make morphological measurements on a wide variety of digital images of objects. At LANL, we have focused on using MAMA to measure scanning electron microscope (SEM) images of particles, as this is a critical part of our forensic analysis of interdicted radiologic materials. In order to successfully use MAMA to make such measurements, we must understand the level of uncertainty involved in the process, so that we can rigorously support our quantitative conclusions.

  7. Factors and measurement of mental illness stigma: a psychometric examination of the Attribution Questionnaire.

    Science.gov (United States)

    Brown, Seth A

    2008-01-01

    A number of scales are employed to measure mental illness stigma, but many fail to have documented or adequate psychometric properties. The purpose of this study was to further evaluate the psychometric properties of one such measure, the Attribution Questionnaire (AQ). Based on responses from 774 college students, exploratory factor analyses were conducted followed by an examination of the reliability and validity of the newly formed factor scales. A six-factor structure emerged and four of these factor scales (Fear/Dangerousness, Help/Interact, Forcing Treatment, and Negative Emotions) had acceptable internal consistency, test-retest reliability, and convergent validity with other stigma measures. Twenty items from the AQ provide reliable and valid measurement of four important aspects of stigmatizing attitudes/beliefs towards the mentally ill. Accurate measurement of these attitudes/beliefs will be critical to more fully understanding the stigma process and developing effective strategies to address stigma.

  8. Uncertainty analysis of the Measured Performance Rating (MPR) method. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1993-11-01

    A report was commissioned by the New York State Energy Research and Development Authority and the Electric Power Research Institute to evaluate the uncertainties in the energy monitoring method known as measured performance rating (MPR). The work is intended to help further development of the MPR system by quantitatively analyzing the uncertainties in estimates of the heat loss coefficients and heating system efficiencies. The analysis indicates that the MPR should be able to detect as little as a 7 percent change in the heat loss coefficient at the 95 percent confidence level. MPR appears sufficiently robust for characterizing common weatherization treatments; e.g., increasing attic insulation from R-7 to R-19 in a typical single-story, 1,100 sq. ft. house, resulting in a 19 percent reduction in heat loss coefficient. Furnace efficiency uncertainties ranged up to three times those of the heat loss coefficients. Measurement uncertainties (at the 95 percent confidence level) were estimated to be from 1 to 5 percent for heat loss coefficients and 1.5 percent for a typical furnace efficiency. The analysis also shows a limitation in applying MPR to houses with heating ducts in slabs on grade and to those with very large thermal mass. Most of the uncertainties encountered in the study were due more to the methods of estimating the "true" heat loss coefficients, furnace efficiency, and furnace fuel consumption (by collecting fuel bills and simulating two actual houses) than to the MPR approach. These uncertainties in the true parameter values are arguments in favor of the need for empirical measures of the heat loss coefficient and furnace efficiency, such as the MPR method, rather than arguments against it.

  9. Reduction of slope stability uncertainty based on hydraulic measurement via inverse analysis

    NARCIS (Netherlands)

    Vardon, P.J.; Liu, K.; Hicks, M.A.

    2016-01-01

    The determination of slope stability for existing slopes is challenging, partly due to the spatial variability of soils. Reliability-based design can incorporate uncertainties and yield probabilities of slope failure. Field measurements can be utilised to constrain probabilistic analyses, thereby

  10. Measurement uncertainty on subsurface defects detection using active infrared thermographic technique

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Yoon Jae; Kim [Kongju National University, Cheonan (Korea, Republic of); Choi, Won Jae [Center for Safety Measurements, Korea Research Institute of Standards and Science, Daejeon (Korea, Republic of)

    2015-10-15

    Active infrared thermography methods have been known to possess better defect detection capabilities than conventional passive thermal infrared imaging techniques. However, the reliability of the technique has been under scrutiny. This paper proposes the lock-in thermography technique for the detection of artificial subsurface defects and the estimation of their size and depth, together with the associated measurement uncertainty.

  11. Aid instability as a measure of uncertainty and the positive impact of aid on growth

    NARCIS (Netherlands)

    Lensink, R; Morrissey, O

    This article contributes to the literature on aid and economic growth. We posit that uncertainty, measured as the instability of aid receipts, will influence the relationship between aid and investment, how recipient governments respond to aid, and will capture the fact that some countries are

  12. Measuring Cross-Section and Estimating Uncertainties with the fissionTPC

    Energy Technology Data Exchange (ETDEWEB)

    Bowden, N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Manning, B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Sangiorgio, S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Seilhan, B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-01-30

    The purpose of this document is to outline the prescription for measuring fission cross-sections with the NIFFTE fissionTPC and estimating the associated uncertainties. As such it will serve as a work planning guide for NIFFTE collaboration members and facilitate clear communication of the procedures used to the broader community.

  13. High Speed Railway Environment Safety Evaluation Based on Measurement Attribute Recognition Model

    Directory of Open Access Journals (Sweden)

    Qizhou Hu

    2014-01-01

    Full Text Available In order to rationally evaluate the operational safety level of high speed railways, an environmental safety evaluation index system should be established by analyzing the impact mechanisms of severe weather such as rain, thunder, lightning, earthquakes, wind, and snow. In addition, attribute recognition is used to determine the similarity between samples and their corresponding attribute classes in the multidimensional space, on the basis of the Mahalanobis distance measurement function, which has the advantages of accounting for correlations between indices and being insensitive to their dimensions. On this basis, the environmental safety situation of China's high speed railway is evaluated with the suggested methods. The results of the detailed analysis show that the evaluation basically matches the actual situation and could lay a scientific foundation for high speed railway operational safety.
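
    A small sketch of attribute recognition by Mahalanobis distance, assuming three hypothetical attribute classes described by historical samples of safety indices; the class names, indices and values are illustrative only and not the paper's index system.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical attribute classes, each described by historical samples of
# three environmental safety indices (rows = samples, columns = indices).
classes = {
    "safe":    rng.normal([0.2, 0.1, 0.3], 0.1, size=(40, 3)),
    "warning": rng.normal([0.5, 0.5, 0.5], 0.1, size=(40, 3)),
    "danger":  rng.normal([0.8, 0.9, 0.7], 0.1, size=(40, 3)),
}

def mahalanobis(x, samples):
    """Mahalanobis distance of observation x to the class described by samples."""
    mu = samples.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(samples, rowvar=False))
    d = x - mu
    return float(np.sqrt(d @ cov_inv @ d))

observation = np.array([0.55, 0.45, 0.52])
distances = {name: mahalanobis(observation, s) for name, s in classes.items()}
print(distances)
print("recognized class:", min(distances, key=distances.get))
```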

  14. ‘Hands-on statistics’—empirical introduction to measurement uncertainty

    Science.gov (United States)

    Wibig, Tadeusz; Dam-o, Punsiri

    2013-03-01

    We would like to share with you our ongoing experiences with 'hands-on statistics' lessons we have recently carried out. We have developed a new experimental path for teaching young students the fundamental concepts of statistics: the uncertainty of a measurement, the uncertainty of the mean, the mean itself, etc. The methods themselves require no special skills in mathematics; only a takoyaki setup is needed for the experiments. This equipment, we have found, makes the lesson far more interesting for the students and has allowed us to work successfully for many years, even with children from elementary schools, starting from the age of 10.
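
    The core computation behind such a lesson, the mean, the spread of single measurements and the uncertainty of the mean, can be written in a few lines; the measurement values below are invented for illustration.

```python
import numpy as np

# Hypothetical repeated measurements of the same quantity (e.g. a length in mm).
x = np.array([12.1, 11.8, 12.4, 12.0, 11.9, 12.2, 12.3, 11.7, 12.1, 12.0])

mean = x.mean()
std = x.std(ddof=1)             # uncertainty of a single measurement
sem = std / np.sqrt(x.size)     # uncertainty of the mean

print(f"mean = {mean:.2f}, single-measurement uncertainty = {std:.2f}, "
      f"uncertainty of the mean = {sem:.2f}")
```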

  15. How measurement uncertainties impact upon the observed scaling properties of MHD turbulence in the solar wind

    Science.gov (United States)

    Hnat, B.; Gogoberidze, G.; Chapman, S. C.; Dunlop, M.

    2012-12-01

    Quantifying the scaling exponents of fluctuations in the solar wind is central to testing predictions of turbulence theories. We study spectral features of Alfvenic turbulence in fast solar wind. We propose a general, instrument independent method (Gogoberidze et al., MNRAS, 2012) to estimate the uncertainty in velocity fluctuations obtained by in-situ satellite observations in the solar wind. We show that when the measurement uncertainties of the velocity fluctuations are taken into account the less energetic Elsasser spectrum obeys a unique power law scaling throughout the inertial range as prevailing theories of magnetohydrodynamic turbulence predict. Moreover, in the solar wind interval analyzed, the two Elsasser spectra are observed to have the same scaling exponent ~1.54 throughout the inertial range. This highlights the importance of understanding uncertainty estimates and how they affect observed scaling in the PSD when using the solar wind as a laboratory to test predictions of theories of turbulence.

  16. Evaluating the uncertainty in measurement of occupational exposure with personal dosemeters.

    Science.gov (United States)

    van Dijk, J W E

    2007-01-01

    In the 1990 Recommendations of the ICRP it is stated that an uncertainty in a dose measured with a personal dosemeter under workplace conditions of a factor of 1.5 in either direction 'will not be unusual'. In many documents, such as the EU Technical Recommendations, the IAEA Safety Guides and papers in scientific journals, this statement is understood to be a basis for developing type-test criteria and criteria for the approval of dosimetric systems. The methods for evaluating the standard uncertainty as proposed in the above mentioned documents and in national and international standards use an approach that is based on the Law of Propagation of Uncertainties (LPU). This approach needs a number of assumptions, the validity of which cannot easily be verified for personal dosemeters. The current paper presents a numerical method based on Monte Carlo simulation for the calculation phase of the evaluation of uncertainties. The results of applying the method to the type-test data of the NRG TL-dosemeter indicate that the combined standard uncertainty estimated using the LPU approach might well not be realistic. The numerical method is simple and can be precisely formulated, making it suitable for being part of approval or accreditation procedures.

  17. Psychometric Evaluation of a New Instrument to Measure Uncertainty in Children with Cancer

    Science.gov (United States)

    Stewart, Janet L.; Lynn, Mary R.; Mishel, Merle H.

    2010-01-01

    Background Although uncertainty has been characterized as a major stressor for children with cancer, it has not been studied systematically. Objectives To describe the development and initial psychometric evaluation of a measure of uncertainty in school-aged children and adolescents with cancer. Methods Interview data from the first author’s qualitative study of uncertainty in children undergoing cancer treatment were used to generate 22 items for the Uncertainty Scale for Kids (USK), which were evaluated for content validity by expert panels of children with cancer and experienced clinicians (Stewart, Lynn, & Mishel, 2005). Reliability and validity were evaluated in a sample of 72 children aged 8 to 17 years undergoing cancer treatment. Results The USK items underwent minor revision following input from content validity experts and all 22 were retained for testing. The USK demonstrated strong reliability (Cronbach’s alpha = .94, test-retest r = .64, p = .005) and preliminary evidence for validity was supported by significant associations between USK scores and cancer knowledge, complexity of treatment, and anxiety and depression. Exploratory factor analysis yielded 2 factors, not knowing how serious the illness is and not knowing what will happen when, which explained 50.4% of the variance. Discussion The USK, developed from the perspective of children, performed well in the initial application, demonstrating strong reliability and preliminary evidence for construct and discriminant validity. It holds considerable promise for moving the research forward on uncertainty in childhood cancer. PMID:20216014

  18. Evaluating uncertainty estimates in hydrologic models: borrowing measures from the forecast verification community

    Directory of Open Access Journals (Sweden)

    K. J. Franz

    2011-11-01

    Full Text Available The hydrologic community is generally moving towards the use of probabilistic estimates of streamflow, primarily through the implementation of Ensemble Streamflow Prediction (ESP) systems, ensemble data assimilation methods, or multi-modeling platforms. However, evaluation of probabilistic outputs has not necessarily kept pace with ensemble generation. Much of the modeling community is still performing model evaluation using standard deterministic measures, such as error, correlation, or bias, typically applied to the ensemble mean or median. Probabilistic forecast verification methods have been well developed, particularly in the atmospheric sciences, yet few have been adopted for evaluating uncertainty estimates in hydrologic model simulations. In the current paper, we overview existing probabilistic forecast verification methods and apply the methods to evaluate and compare model ensembles produced from two different parameter uncertainty estimation methods: Generalized Likelihood Uncertainty Estimation (GLUE), and the Shuffle Complex Evolution Metropolis (SCEM). Model ensembles are generated for the National Weather Service SACramento Soil Moisture Accounting (SAC-SMA) model for 12 forecast basins located in the Southeastern United States. We evaluate the model ensembles using relevant metrics in the following categories: distribution, correlation, accuracy, conditional statistics, and categorical statistics. We show that the presented probabilistic metrics are easily adapted to model simulation ensembles and provide a robust analysis of model performance associated with parameter uncertainty. Application of these methods requires no information in addition to what is already available as part of traditional model validation methodology and considers the entire ensemble or uncertainty range in the approach.

  19. Evaluating uncertainty estimates in hydrologic models: borrowing measures from the forecast verification community

    Science.gov (United States)

    Franz, K. J.; Hogue, T. S.

    2011-11-01

    The hydrologic community is generally moving towards the use of probabilistic estimates of streamflow, primarily through the implementation of Ensemble Streamflow Prediction (ESP) systems, ensemble data assimilation methods, or multi-modeling platforms. However, evaluation of probabilistic outputs has not necessarily kept pace with ensemble generation. Much of the modeling community is still performing model evaluation using standard deterministic measures, such as error, correlation, or bias, typically applied to the ensemble mean or median. Probabilistic forecast verification methods have been well developed, particularly in the atmospheric sciences, yet few have been adopted for evaluating uncertainty estimates in hydrologic model simulations. In the current paper, we overview existing probabilistic forecast verification methods and apply the methods to evaluate and compare model ensembles produced from two different parameter uncertainty estimation methods: Generalized Likelihood Uncertainty Estimation (GLUE), and the Shuffle Complex Evolution Metropolis (SCEM). Model ensembles are generated for the National Weather Service SACramento Soil Moisture Accounting (SAC-SMA) model for 12 forecast basins located in the Southeastern United States. We evaluate the model ensembles using relevant metrics in the following categories: distribution, correlation, accuracy, conditional statistics, and categorical statistics. We show that the presented probabilistic metrics are easily adapted to model simulation ensembles and provide a robust analysis of model performance associated with parameter uncertainty. Application of these methods requires no information in addition to what is already available as part of traditional model validation methodology and considers the entire ensemble or uncertainty range in the approach.
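
    Two of the probabilistic verification measures referred to above, a rank histogram and a sample-based continuous ranked probability score (CRPS), can be sketched as follows; the ensemble and observation data are synthetic and the implementation is illustrative rather than the verification suite used in the paper.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic ensemble streamflow simulations: 100 time steps x 50 members,
# plus matching "observations".
n_time, n_members = 100, 50
ensembles = rng.gamma(shape=2.0, scale=10.0, size=(n_time, n_members))
observations = rng.gamma(shape=2.0, scale=10.0, size=n_time)

def rank_histogram(obs, ens):
    """Rank of each observation within its ensemble; a flat histogram indicates reliable spread."""
    ranks = (ens < obs[:, None]).sum(axis=1)
    return np.bincount(ranks, minlength=ens.shape[1] + 1)

def crps_ensemble(obs, ens):
    """Sample-based CRPS = E|X - y| - 0.5 E|X - X'|, averaged over time."""
    term1 = np.abs(ens - obs[:, None]).mean(axis=1)
    term2 = 0.5 * np.abs(ens[:, :, None] - ens[:, None, :]).mean(axis=(1, 2))
    return (term1 - term2).mean()

print("rank histogram:", rank_histogram(observations, ensembles))
print("mean CRPS:", round(crps_ensemble(observations, ensembles), 2))
```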

  20. Survey of radiofrequency radiation levels around GSM base stations and evaluation of measurement uncertainty

    Directory of Open Access Journals (Sweden)

    Vulević Branislav D.

    2011-01-01

    Full Text Available This paper is a summary of broadband measurement values of radiofrequency radiation around GSM base stations in the vicinity of residential areas in Belgrade and 12 other cities in Serbia. It will be useful for determining non-ionizing radiation exposure levels of the general public in the future. The purpose of this paper is also an appropriate representation of basic information on the evaluation of measurement uncertainty.

  1. Application of the Nordtest method for "real-time" uncertainty estimation of on-line field measurement.

    Science.gov (United States)

    Näykki, Teemu; Virtanen, Atte; Kaukonen, Lari; Magnusson, Bertil; Väisänen, Tero; Leito, Ivo

    2015-10-01

    Field sensor measurements are becoming more common for environmental monitoring. Solutions for enhancing reliability, i.e. knowledge of the measurement uncertainty of field measurements, are urgently needed. Real-time estimations of measurement uncertainty for field measurement have not previously been published, and in this paper, a novel approach to the automated turbidity measuring system with an application for "real-time" uncertainty estimation is outlined based on the Nordtest handbook's measurement uncertainty estimation principles. The term real-time is written in quotation marks, since the calculation of the uncertainty is carried out using a set of past measurement results. There are two main requirements for the estimation of real-time measurement uncertainty of online field measurement described in this paper: (1) setting up an automated measuring system that can be (preferably remotely) controlled which measures the samples (water to be investigated as well as synthetic control samples) the way the user has programmed it and stores the results in a database, (2) setting up automated data processing (software) where the measurement uncertainty is calculated from the data produced by the automated measuring system. When control samples with a known value or concentration are measured regularly, any instrumental drift can be detected. An additional benefit is that small drift can be taken into account (in real-time) as a bias value in the measurement uncertainty calculation, and if the drift is high, the measurement results of the control samples can be used for real-time recalibration of the measuring device. The procedure described in this paper is not restricted to turbidity measurements, but it will enable measurement uncertainty estimation for any kind of automated measuring system that performs sequential measurements of routine samples and control samples/reference materials in a similar way as described in this paper.
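
    A minimal sketch of the Nordtest-style calculation on a rolling set of control-sample results, combining within-laboratory reproducibility and bias components; the sensor values, reference value and reference uncertainty below are assumed for illustration only.

```python
import numpy as np

# Hypothetical history of control-sample results from an automated turbidity sensor.
reference_value = 10.0                                   # assigned value of the control sample (NTU)
control_results = np.array([10.2, 9.9, 10.1, 10.3, 9.8,
                            10.4, 10.0, 10.2, 9.7, 10.1])

# Within-laboratory reproducibility (relative) from the spread of control results.
u_rw = control_results.std(ddof=1) / reference_value

# Bias component: observed bias combined with the uncertainty of the reference value.
bias = control_results.mean() - reference_value
u_ref = 0.01                                             # relative uncertainty of the reference (assumed)
u_bias = np.sqrt((bias / reference_value) ** 2 + u_ref ** 2)

# Nordtest combination, recomputed whenever new control results arrive.
u_combined = np.sqrt(u_rw ** 2 + u_bias ** 2)
print(f"relative expanded uncertainty (k=2): {2 * 100 * u_combined:.1f} %")
```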

  2. Role of turbulence fluctuations on uncertainties of acoustic Doppler current profiler discharge measurements

    Science.gov (United States)

    Tarrab, Leticia; Garcia, Carlos M.; Cantero, Mariano I.; Oberg, Kevin

    2012-01-01

    This work presents a systematic analysis quantifying the role of the presence of turbulence fluctuations on uncertainties (random errors) of acoustic Doppler current profiler (ADCP) discharge measurements from moving platforms. Data sets of three-dimensional flow velocities with high temporal and spatial resolution were generated from direct numerical simulation (DNS) of turbulent open channel flow. Dimensionless functions relating parameters quantifying the uncertainty in discharge measurements due to flow turbulence (relative variance and relative maximum random error) to sampling configuration were developed from the DNS simulations and then validated with field-scale discharge measurements. The validated functions were used to evaluate the role of the presence of flow turbulence fluctuations on uncertainties in ADCP discharge measurements. The results of this work indicate that random errors due to the flow turbulence are significant when: (a) a low number of transects is used for a discharge measurement, and (b) measurements are made in shallow rivers using high boat velocity (short time for the boat to cross a flow turbulence structure).

  3. Uncertainty Quantification and Comparison of Weld Residual Stress Measurements and Predictions.

    Energy Technology Data Exchange (ETDEWEB)

    Lewis, John R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Brooks, Dusty Marie [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-10-01

    In pressurized water reactors, the prevention, detection, and repair of cracks within dissimilar metal welds is essential to ensure proper plant functionality and safety. Weld residual stresses, which are difficult to model and cannot be directly measured, contribute to the formation and growth of cracks due to primary water stress corrosion cracking. Additionally, the uncertainty in weld residual stress measurements and modeling predictions is not well understood, further complicating the prediction of crack evolution. The purpose of this document is to develop methodology to quantify the uncertainty associated with weld residual stress that can be applied to modeling predictions and experimental measurements. Ultimately, the results can be used to assess the current state of uncertainty and to build confidence in both modeling and experimental procedures. The methodology consists of statistically modeling the variation in the weld residual stress profiles using functional data analysis techniques. Uncertainty is quantified using statistical bounds (e.g. confidence and tolerance bounds) constructed with a semi-parametric bootstrap procedure. Such bounds describe the range in which quantities of interest, such as means, are expected to lie as evidenced by the data. The methodology is extended to provide direct comparisons between experimental measurements and modeling predictions by constructing statistical confidence bounds for the average difference between the two quantities. The statistical bounds on the average difference can be used to assess the level of agreement between measurements and predictions. The methodology is applied to experimental measurements of residual stress obtained using two strain relief measurement methods and predictions from seven finite element models developed by different organizations during a round robin study.
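
    A simplified illustration of bootstrap-based statistical bounds on residual stress profiles; it uses a plain nonparametric bootstrap of a mean profile rather than the semi-parametric functional procedure of the study, and all profile data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic weld residual stress profiles: rows = repeated measurements,
# columns = normalized through-wall positions.
depth = np.linspace(0.0, 1.0, 25)
true_profile = 300 * np.cos(2 * np.pi * depth)              # MPa, illustrative shape only
profiles = true_profile + rng.normal(0, 40, size=(8, depth.size))

# Bootstrap the mean profile by resampling whole measured profiles.
n_boot = 2000
boot_means = np.empty((n_boot, depth.size))
for b in range(n_boot):
    idx = rng.integers(0, profiles.shape[0], profiles.shape[0])
    boot_means[b] = profiles[idx].mean(axis=0)

lower, upper = np.percentile(boot_means, [2.5, 97.5], axis=0)
print("95% confidence band width at mid-wall (MPa):",
      round(upper[depth.size // 2] - lower[depth.size // 2], 1))
```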

  4. Analysis of the Uncertainty in Wind Measurements from the Atmospheric Radiation Measurement Doppler Lidar during XPIA: Field Campaign Report

    Energy Technology Data Exchange (ETDEWEB)

    Newsom, Rob [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-03-01

    In March and April of 2015, the ARM Doppler lidar that was formerly operated at the Tropical Western Pacific site in Darwin, Australia (S/N 0710-08) was deployed to the Boulder Atmospheric Observatory (BAO) for the eXperimental Planetary boundary-layer Instrument Assessment (XPIA) field campaign. The goal of the XPIA field campaign was to investigate methods of using multiple Doppler lidars to obtain high-resolution three-dimensional measurements of winds and turbulence in the atmospheric boundary layer, and to characterize the uncertainties in these measurements. The ARM Doppler lidar was one of many Doppler lidar systems that participated in this study. During XPIA the 300-m tower at the BAO site was instrumented with well-calibrated sonic anemometers at six levels. These sonic anemometers provided highly accurate reference measurements against which the lidars could be compared. Thus, the deployment of the ARM Doppler lidar during XPIA offered a rare opportunity for the ARM program to characterize the uncertainties in their lidar wind measurements. Results of the lidar-tower comparison indicate that the lidar wind speed measurements are essentially unbiased (~1cm s-1), with a random error of approximately 50 cm s-1. Two methods of uncertainty estimation were tested. The first method was found to produce uncertainties that were too low. The second method produced estimates that were more accurate and better indicators of data quality. As of December 2015, the first method is being used by the ARM Doppler lidar wind value-added product (VAP). One outcome of this work will be to update this VAP to use the second method for uncertainty estimation.

  5. Evaluation of uncertainty in the measurement of sense of natural language constructions

    Directory of Open Access Journals (Sweden)

    Bisikalo Oleg V.

    2017-01-01

    Full Text Available The task of evaluating uncertainty in the measurement of the sense of natural language constructions (NLCs) was researched through formalization of the notions of the language image, of artificial cognitive systems (ACSs), and of units of meaning. The method for measuring the sense of natural language constructions incorporates fuzzy relations of meaning, which ensures that information about the links between lemmas of the text is taken into account and permits the evaluation of two types of measurement uncertainty of the sense characteristics. Using the developed application programs, experiments were conducted to investigate the proposed method for identifying the informative characteristics of text. The experiments yielded parameter dependencies that allow the Pareto distribution law to be used to define relations between lemmas; their analysis permits identifying the exponents of the average number of connections of the language image as the most informative characteristics of the text.

  6. Optimized clustering estimators for BAO measurements accounting for significant redshift uncertainty

    Science.gov (United States)

    Ross, Ashley J.; Banik, Nilanjan; Avila, Santiago; Percival, Will J.; Dodelson, Scott; Garcia-Bellido, Juan; Crocce, Martin; Elvin-Poole, Jack; Giannantonio, Tommaso; Manera, Marc; Sevilla-Noarbe, Ignacio

    2017-12-01

    We determine an optimized clustering statistic to be used for galaxy samples with significant redshift uncertainty, such as those that rely on photometric redshifts. To do so, we study the baryon acoustic oscillation (BAO) information content as a function of the orientation of galaxy clustering modes with respect to their angle to the line of sight (LOS). The clustering along the LOS, as observed in a redshift-space with significant redshift uncertainty, has contributions from clustering modes with a range of orientations with respect to the true LOS. For redshift uncertainty σz ≥ 0.02(1 + z), we find that while the BAO information is confined to transverse clustering modes in the true space, it is spread nearly evenly in the observed space. Thus, measuring clustering in terms of the projected separation (regardless of the LOS) is an efficient and nearly lossless compression of the signal for σz ≥ 0.02(1 + z). For reduced redshift uncertainty, a more careful consideration is required. We then use more than 1700 realizations (combining two separate sets) of galaxy simulations mimicking the Dark Energy Survey Year 1 (DES Y1) sample to validate our analytic results and optimized analysis procedure. We find that using the correlation function binned in projected separation, we can achieve uncertainties that are within 10 per cent of those predicted by Fisher matrix forecasts. We predict that DES Y1 should achieve a 5 per cent distance measurement using our optimized methods. We expect the results presented here to be important for any future BAO measurements made using photometric redshift data.

  7. Optimized Clustering Estimators for BAO Measurements Accounting for Significant Redshift Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Ross, Ashley J. [Portsmouth U., ICG; Banik, Nilanjan [Fermilab; Avila, Santiago [Madrid, IFT; Percival, Will J. [Portsmouth U., ICG; Dodelson, Scott [Fermilab; Garcia-Bellido, Juan [Madrid, IFT; Crocce, Martin [ICE, Bellaterra; Elvin-Poole, Jack [Jodrell Bank; Giannantonio, Tommaso [Cambridge U., KICC; Manera, Marc [Cambridge U., DAMTP; Sevilla-Noarbe, Ignacio [Madrid, CIEMAT

    2017-05-15

    We determine an optimized clustering statistic to be used for galaxy samples with significant redshift uncertainty, such as those that rely on photometric redshifts. To do so, we study the BAO information content as a function of the orientation of galaxy clustering modes with respect to their angle to the line-of-sight (LOS). The clustering along the LOS, as observed in a redshift-space with significant redshift uncertainty, has contributions from clustering modes with a range of orientations with respect to the true LOS. For redshift uncertainty $\sigma_z \geq 0.02(1+z)$ we find that while the BAO information is confined to transverse clustering modes in the true space, it is spread nearly evenly in the observed space. Thus, measuring clustering in terms of the projected separation (regardless of the LOS) is an efficient and nearly lossless compression of the signal for $\sigma_z \geq 0.02(1+z)$. For reduced redshift uncertainty, a more careful consideration is required. We then use more than 1700 realizations of galaxy simulations mimicking the Dark Energy Survey Year 1 sample to validate our analytic results and optimized analysis procedure. We find that using the correlation function binned in projected separation, we can achieve uncertainties that are within 10 per cent of those predicted by Fisher matrix forecasts. We predict that DES Y1 should achieve a 5 per cent distance measurement using our optimized methods. We expect the results presented here to be important for any future BAO measurements made using photometric redshift data.

  8. Uncertainty quantification for nuclear density functional theory and information content of new measurements

    Energy Technology Data Exchange (ETDEWEB)

    McDonnell, J. D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Schunck, N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Higdon, D. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Sarich, J. [Argonne National Lab. (ANL), Argonne, IL (United States); Wild, S. M. [Argonne National Lab. (ANL), Argonne, IL (United States); Nazarewicz, W. [Michigan State Univ., East Lansing, MI (United States); Oak Ridge National Lab., Oak Ridge, TN (United States); Univ. of Warsaw, Warsaw (Poland)

    2015-03-24

    Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. As a result, the example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.

  9. Integrating measuring uncertainty of tactile and optical coordinate measuring machines in the process capability assessment of micro injection moulding

    DEFF Research Database (Denmark)

    Tosello, Guido; Hansen, Hans Nørgaard; Gasparin, Stefania

    2010-01-01

    Process capability of micro injection moulding was investigated in this paper by calculating the Cp and Cpk statistics. Uncertainty of both optical and tactile measuring systems employed in the quality control of micro injection moulded products was assessed and compared with the specified tolerances. Limits in terms of manufacturing process capability, as well as of the suitability of such measuring systems when employed for micro production inspection, were quantitatively determined.
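
    A short sketch of a Cp/Cpk computation in which the measurement uncertainty of the inspection system is separated from the observed spread; the part dimensions, tolerances and uncertainty value are hypothetical and not taken from the paper.

```python
import numpy as np

# Hypothetical dimensional results (mm) from moulded parts, with specification
# limits and the standard measurement uncertainty of the inspection instrument.
measurements = np.array([5.012, 5.008, 5.015, 5.010, 5.006, 5.013, 5.009, 5.011])
LSL, USL = 4.990, 5.030
u_meas = 0.002                   # standard measurement uncertainty (CMM or optical), assumed

mu, sigma_obs = measurements.mean(), measurements.std(ddof=1)
# The observed spread contains both process and measurement variation;
# subtract the measurement variance to approximate the true process spread.
sigma_proc = np.sqrt(max(sigma_obs ** 2 - u_meas ** 2, 1e-12))

cp = (USL - LSL) / (6 * sigma_proc)
cpk = min(USL - mu, mu - LSL) / (3 * sigma_proc)
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")
```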

  10. Uncertainty of power curve measurement with a two-beam nacelle-mounted lidar

    DEFF Research Database (Denmark)

    Wagner, Rozenn; Courtney, Michael Stephen; Friis Pedersen, Troels

    2015-01-01

    Nacelle lidars are attractive for offshore measurements since they can provide measurements of the free wind speed in front of the turbine rotor without erecting a met mast, which significantly reduces the cost of the measurements. Nacelle-mounted pulsed lidars with two lines of sight (LOS) have already been demonstrated to be suitable for use in power performance measurements. To be considered as a professional tool, however, power curve measurements performed using these instruments require traceable calibrated measurements and the quantification of the wind speed measurement uncertainty. Here [...] lies between 1 and 2% for the wind speed range between cut-in and rated wind speed. Finally, the lidar was mounted on the nacelle of a wind turbine in order to perform a power curve measurement. The wind speed was simultaneously measured with a mast-top mounted cup anemometer placed two rotor diameters

  11. Development and application of objective uncertainty measures for nuclear power plant transient analysis [Dissertation 3897]

    Energy Technology Data Exchange (ETDEWEB)

    Vinai, P

    2007-10-15

    For the development, design and licensing of a nuclear power plant (NPP), a sound safety analysis is necessary to study the diverse physical phenomena involved in the system behaviour under operational and transient conditions. Such studies are based on detailed computer simulations. With the progress achieved in computer technology and the greater availability of experimental and plant data, the use of best estimate codes for safety evaluations has gained increasing acceptance. The application of best estimate safety analysis has raised new problems that need to be addressed: it has become more crucial to assess how reliable code predictions are, especially when they need to be compared against safety limits that must not be crossed. It becomes necessary to identify and quantify the various possible sources of uncertainty that affect the reliability of the results. Currently, such uncertainty evaluations are generally based on experts' opinion. In the present research, a novel methodology based on a non-parametric statistical approach has been developed for objective quantification of best-estimate code uncertainties related to the physical models used in the code. The basis is an evaluation of the accuracy of a given physical model achieved by comparing its predictions with experimental data from an appropriate set of separate-effect tests. The differences between measurements and predictions can be considered stochastically distributed, and thus a statistical approach can be employed. The first step was the development of a procedure for investigating the dependence of a given physical model's accuracy on the experimental conditions. Each separate-effect test effectively provides a random sample of discrepancies between measurements and predictions, corresponding to a location in the state space defined by a certain number of independent system variables. As a consequence, the samples of 'errors', achieved from analysis of the entire

  12. PROGRESS OF THE AVNG SYSTEM - ATTRIBUTE VERIFICATION SYSTEM WITH INFORMATION BARRIERS FOR MASS AND ISOTOPICS MEASUREMENTS

    Energy Technology Data Exchange (ETDEWEB)

    Budnikov, D; Bulatov, M; Jarikhine, I; Lebedev, B; Livke, A; Modenov, A; Morkin, A; Razinkov, S; Safronov, S; Tsaregorodtsev, D; Vlokh, A; Yakovleva, S; Elmont, T; Langner, D; MacArthur, D; Mayo, D; Smith, M; Luke, S J

    2005-05-27

    An attribute verification system (AVNG) with information barriers for mass and isotopics measurements has been designed and its fabrication is nearly completed. The AVNG is being built by scientists at the Russian Federal Nuclear Center-VNIIEF, with support of Los Alamos National Laboratory (LANL) and Lawrence Livermore National Laboratory (LLNL). Such a system could be used to verify the presence of several unclassified attributes of classified material with no classified information release. The system is comprised of a neutron multiplicity counter and gamma-spectrometry system based on a high purity germanium gamma detector (nominal relative efficiency at 1332 keV of 50%) and the digital gamma-ray spectrometer DSPEC Plus. The neutron multiplicity counter is a three ring counter with 164 ³He tubes. The system was designed to measure prototype containers 491 mm in diameter and 503 mm high. This paper provides a brief history of the project and documents the progress of this effort with drawings and photographs.

  13. Quantification of model uncertainty in aerosol optical thickness retrieval from Ozone Monitoring Instrument (OMI) measurements

    Science.gov (United States)

    Määttä, A.; Laine, M.; Tamminen, J.; Veefkind, J. P.

    2013-09-01

    We study uncertainty quantification in remote sensing of aerosols in the atmosphere with top-of-the-atmosphere reflectance measurements from the nadir-viewing Ozone Monitoring Instrument (OMI). The focus is on the uncertainty in aerosol model selection from pre-calculated aerosol models and on the statistical modelling of the model inadequacies. The aim is to apply statistical methodologies that improve the uncertainty estimates of the aerosol optical thickness (AOT) retrieval by propagating model selection and model error related uncertainties more realistically. We utilise Bayesian model selection and model averaging methods for the model selection problem and use Gaussian processes to model the smooth systematic discrepancies between the modelled and observed reflectance. The systematic model error is learned from an ensemble of operational retrievals. The operational OMI multi-wavelength aerosol retrieval algorithm OMAERO is used for cloud-free, over-land pixels of the OMI instrument with the additional Bayesian model selection and model discrepancy techniques. The method is demonstrated with four examples with different aerosol properties: weakly absorbing aerosols, forest fires over Greece and Russia, and Sahara desert dust. The presented statistical methodology is general; it is not restricted to this particular satellite retrieval application.

  14. Focused Belief Measures for Uncertainty Quantification in High Performance Semantic Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Joslyn, Cliff A.; Weaver, Jesse R.

    2013-08-13

    In web-scale semantic data analytics there is a great need for methods which aggregate uncertainty claims, on the one hand respecting the information provided as accurately as possible, while on the other still being tractable. Traditional statistical methods are more robust, but only represent distributional, additive uncertainty. Generalized information theory methods, including fuzzy systems and Dempster-Shafer (DS) evidence theory, represent multiple forms of uncertainty, but are computationally and methodologically difficult. We require methods which provide an effective balance between the complete representation of the full complexity of uncertainty claims in their interaction, while satisfying the needs of both computational complexity and human cognition. Here we build on Jøsang's subjective logic to posit methods in focused belief measures (FBMs), where a full DS structure is focused to a single event. The resulting ternary logical structure is posited to be able to capture the minimal amount of generalized complexity needed at a maximum of computational efficiency. We demonstrate the efficacy of this approach in a web ingest experiment over the 2012 Billion Triple dataset from the Semantic Web Challenge.

  15. Quantifying the Contribution of Post-Processing in Computed Tomography Measurement Uncertainty

    DEFF Research Database (Denmark)

    Stolfi, Alessandro; Thompson, Mary Kathryn; Carli, Lorenzo

    2016-01-01

    This paper evaluates and quantifies the repeatability of post-processing settings, such as surface determination, data fitting, and the definition of the datum system, on the uncertainties of Computed Tomography (CT) measurements. The influence of post-processing contributions was determined by calculating the standard deviation of 10 repeated measurement evaluations on the same data set. The evaluations were performed on an industrial assembly. Each evaluation includes several dimensional and geometrical measurands that were expected to have different responses to the various post-processing settings. It was found that the definition of the datum system had the largest impact on the uncertainty, with a standard deviation of a few microns. The surface determination and data fitting had smaller contributions, with sub-micron repeatability.

  16. Meeting the measurement uncertainty and traceability requirements of ISO/IEC standard 17025 in chemical analysis.

    Science.gov (United States)

    King, B

    2001-11-01

    The new laboratory accreditation standard, ISO/IEC 17025, reflects current thinking on good measurement practice by requiring more explicit and more demanding attention to a number of activities. These include client interactions, method validation, traceability, and measurement uncertainty. Since the publication of the standard in 1999 there has been extensive debate about its interpretation. It is the author's view that if good quality practices are already in place and if the new requirements are introduced in a manner that is fit for purpose, the additional work required to comply with the new requirements can be expected to be modest. The paper argues that the rigour required in addressing the issues should be driven by customer requirements and the factors that need to be considered in this regard are discussed. The issues addressed include the benefits, interim arrangements, specifying the analytical requirement, establishing traceability, evaluating the uncertainty and reporting the information.

  17. Multi-attribute integrated measurement of node importance in complex networks

    Science.gov (United States)

    Wang, Shibo; Zhao, Jinlou

    2015-11-01

    The measure of node importance in complex networks is very important to research on network stability and robustness; it can also help ensure the security of the whole network. Most researchers have used a single indicator to measure node importance, so the obtained results only reflect certain aspects of the network and lose information. Meanwhile, because network topologies differ, node importance should be described in a way that incorporates the character of the network topology. Most of the existing evaluation algorithms cannot completely reflect the circumstances of complex networks, so this paper takes into account degree centrality, relative closeness centrality, the clustering coefficient, and topology potential, and proposes an integrated method for measuring node importance. This method reflects both the internal and external attributes of nodes and eliminates the influence of network structure on node importance. Experiments on the karate club network and the dolphin network show that the integrated topology-based measure has a smaller range of metrical results than a single indicator and is more universal. Experiments also show that attacking the North American power grid and the Internet network with this method leads to faster convergence than other methods.
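
    A minimal sketch of the multi-indicator idea, assuming equal weights and three of the indicators named above (the paper's weighting, and its topology-potential term, are not reproduced here):

```python
import networkx as nx

def integrated_importance(graph, weights=(1/3, 1/3, 1/3)):
    """Illustrative multi-attribute node importance: a weighted sum of
    min-max normalised degree centrality, closeness centrality and
    clustering coefficient (equal weights chosen for this sketch only)."""
    indicators = [nx.degree_centrality(graph),
                  nx.closeness_centrality(graph),
                  nx.clustering(graph)]
    scores = {}
    for node in graph.nodes:
        vals = []
        for ind in indicators:
            lo, hi = min(ind.values()), max(ind.values())
            vals.append((ind[node] - lo) / (hi - lo) if hi > lo else 0.0)
        scores[node] = sum(w * v for w, v in zip(weights, vals))
    return scores

# Zachary's karate club network, one of the test networks named above
ranking = integrated_importance(nx.karate_club_graph())
print(sorted(ranking, key=ranking.get, reverse=True)[:5])
```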

  18. A generalized measurement model to quantify health: the multi-attribute preference response model.

    Directory of Open Access Journals (Sweden)

    Paul F M Krabbe

    Full Text Available After 40 years of deriving metric values for health status or health-related quality of life, the effective quantification of subjective health outcomes is still a challenge. Here, two of the best measurement tools, the discrete choice and the Rasch model, are combined to create a new model for deriving health values. First, existing techniques to value health states are briefly discussed followed by a reflection on the recent revival of interest in patients' experience with regard to their possible role in health measurement. Subsequently, three basic principles for valid health measurement are reviewed, namely unidimensionality, interval level, and invariance. In the main section, the basic operation of measurement is then discussed in the framework of probabilistic discrete choice analysis (random utility model and the psychometric Rasch model. It is then shown how combining the main features of these two models yields an integrated measurement model, called the multi-attribute preference response (MAPR model, which is introduced here. This new model transforms subjective individual rank data into a metric scale using responses from patients who have experienced certain health states. Its measurement mechanism largely prevents biases such as adaptation and coping. Several extensions of the MAPR model are presented. The MAPR model can be applied to a wide range of research problems. If extended with the self-selection of relevant health domains for the individual patient, this model will be more valid than existing valuation techniques.

  19. Investigations into the Uncertainties of Interferometric Measurements of Linear and Circular Vibrations

    Directory of Open Access Journals (Sweden)

    Hans-Jürgen von Martens

    1997-01-01

    Full Text Available A uniform description is given of a method of measurement using a Michelson interferometer for measuring the linear motion quantities acceleration, velocity and displacement, and a diffraction grating interferometer for measuring the circular motion quantities angular acceleration, angular velocity and rotation angle. The paper focusses on an analysis of the dynamic behaviour of an interferometric measurement system based on the counting technique with regard to the measurement errors due to deterministic and stochastic disturbing quantities. The error analysis and description presented are aimed at giving some rules, mathematical expressions and graphical presentations that have proved to be helpful in recognizing the errors in interferometric measurements of motion quantities, optimizing the measurement conditions (e.g., filter settings), obtaining corrections and estimating the uncertainty of measurement.

  20. Uncertainties in corrosion rate measurements of fasteners exposed to treated wood at 100% relative humidity

    Science.gov (United States)

    Samuel L. Zelinka

    2007-01-01

    This paper evaluates the effect that uncertainties in measurements of time, weight, and surface area have on the determination of the corrosion rate of metal fasteners in contact with wood. Three different types of nails were driven into alkaline copper quaternary (ACQ)-treated wood and exposed to 26.7°C (80°F) in a 100% relative humidity environment for up to 1 year....

  1. Measurement and interpolation uncertainties in rainfall maps from cellular communication networks

    Science.gov (United States)

    Rios Gaona, M. F.; Overeem, A.; Leijnse, H.; Uijlenhoet, R.

    2015-08-01

    Accurate measurements of rainfall are important in many hydrological and meteorological applications, for instance, flash-flood early-warning systems, hydraulic structures design, irrigation, weather forecasting, and climate modelling. Whenever possible, link networks measure and store the received power of the electromagnetic signal at regular intervals. The decrease in power can be converted to rainfall intensity, and is largely due to the attenuation by raindrops along the link paths. Such an alternative technique fulfils the continuous effort to obtain measurements of rainfall in time and space at higher resolutions, especially in places where traditional rain gauge networks are scarce or poorly maintained. Rainfall maps from microwave link networks have recently been introduced at country-wide scales. Despite their potential in rainfall estimation at high spatiotemporal resolutions, the uncertainties present in rainfall maps from link networks are not yet fully comprehended. The aim of this work is to identify and quantify the sources of uncertainty present in interpolated rainfall maps from link rainfall depths. In order to disentangle these sources of uncertainty, we classified them into two categories: (1) those associated with the individual microwave link measurements, i.e. the errors involved in link rainfall retrievals, such as wet antenna attenuation, sampling interval of measurements, wet/dry period classification, dry weather baseline attenuation, quantization of the received power, drop size distribution (DSD), and multi-path propagation; and (2) those associated with mapping, i.e. the combined effect of the interpolation methodology and the spatial density of link measurements. We computed ~ 3500 rainfall maps from real and simulated link rainfall depths for 12 days for the land surface of the Netherlands. Simulated link rainfall depths refer to path-averaged rainfall depths obtained from radar data. The ~ 3500 real and simulated rainfall maps were
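
    The retrieval step described above (converting the drop in received power on a link into a path-averaged rain rate) typically uses a power law between specific attenuation and rain rate. The sketch below assumes that form; the coefficients, wet-antenna correction and example numbers are placeholders, not the values used in this study.

```python
def rain_rate_from_attenuation(p_ref_dbm, p_wet_dbm, length_km,
                               a=0.33, b=1.0, wet_antenna_db=1.5):
    """Convert a received-power drop on a microwave link into a
    path-averaged rain rate via the power law A = a * R**b, where A is the
    specific attenuation in dB/km. Coefficients depend on frequency and
    polarisation; the values here are illustrative only."""
    attenuation_db = max(p_ref_dbm - p_wet_dbm - wet_antenna_db, 0.0)
    specific_attenuation = attenuation_db / length_km      # dB per km
    return (specific_attenuation / a) ** (1.0 / b)          # mm per hour

# Example: 3 km link with a 6 dB drop relative to the dry-weather baseline
print(rain_rate_from_attenuation(-40.0, -46.0, length_km=3.0))
```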

  2. Measuring organizational attributes in primary care: a validation study in Germany.

    Science.gov (United States)

    Ose, Dominik; Freund, Tobias; Kunz, Cornelia U; Szecsenyi, Joachim; Natanzon, Iris; Trieschmann, Johanna; Wensing, Michel; Miksch, Antje

    2010-12-01

    Models for the structured delivery of care rely on organizational attributes of practice teams. The Survey of Organizational Attributes for Primary Care (SOAPC) is known to be a valid instrument to measure this aspect in the primary care setting. The aim of this study was to determine the validity of a translated and culturally adapted German version of the SOAPC. The SOAPC was translated and culturally adapted according to established standards. The external validity of the German SOAPC was assessed using the German version of the Warr-Cook-Wall scale. A total of 200 practices randomly selected from a conference database were asked to participate in the validation study. Practice, clinician and staff characteristics were determined via short-form questionnaires. We used standardized statistical procedures to reveal the psychometric properties of the SOAPC. A total of 54 practice teams participated by returning 297 completed questionnaires (297/425, response rate 69.8%). All four domains of the SOAPC (communication, decision making, stress/chaos, history of change) were confirmed by factor analysis. Internal consistency is underlined by a Cronbach's alpha of 0.70 or higher in all categories. We show strong correlation with the Warr-Cook-Wall scale in all corresponding categories, indicating high external validity. The German SOAPC is a reliable and valid instrument for the assessment of organizational attributes of practice teams as the providers of quality of care. Moreover, the tool makes it possible to map the state of implementation of quality management and practice organization. The availability of the German SOAPC encourages further research on this topic in German-speaking countries. © 2010 Blackwell Publishing Ltd.

  3. Uncertainty estimates of purity measurements based on current information: toward a "live validation" of purity methods.

    Science.gov (United States)

    Apostol, Izydor; Kelner, Drew; Jiang, Xinzhao Grace; Huang, Gang; Wypych, Jette; Zhang, Xin; Gastwirt, Jessica; Chen, Kenneth; Fodor, Szilan; Hapuarachchi, Suminda; Meriage, Dave; Ye, Frank; Poppe, Leszek; Szpankowski, Wojciech

    2012-12-01

    Our aim was to predict precision and other performance characteristics of chromatographic purity methods, which represent the most widely used form of analysis in the biopharmaceutical industry. We have conducted a comprehensive survey of purity methods, and show that all performance characteristics fall within narrow measurement ranges. This observation was used to develop a model called Uncertainty Based on Current Information (UBCI), which expresses these performance characteristics as a function of the signal and noise levels, hardware specifications, and software settings. We applied the UBCI model to assess the uncertainty of purity measurements, and compared the results to those from conventional qualification. We demonstrated that the UBCI model is suitable to dynamically assess method performance characteristics, based on information extracted from individual chromatograms. The model provides an opportunity for streamlining qualification and validation studies by implementing a "live validation" of test results utilizing UBCI as a concurrent assessment of measurement uncertainty. Therefore, UBCI can potentially mitigate the challenges associated with laborious conventional method validation and facilitates the introduction of more advanced analytical technologies during the method lifecycle.

  4. Evaluation of the measurement uncertainty in screening immunoassays in blood establishments: computation of diagnostic accuracy models.

    Science.gov (United States)

    Pereira, Paulo; Westgard, James O; Encarnação, Pedro; Seghatchian, Jerard

    2015-02-01

    The European Union regulation for blood establishments does not require the evaluation of measurement uncertainty in virology screening tests, which is required by the ISO 15189 guideline following GUM principles. GUM modular approaches have been discussed by medical laboratory researchers but no consensus has been achieved regarding practical application. Meanwhile, the application of empirical approaches fulfilling GUM principles has gained support. Blood establishments' screening tests accredited by ISO 15189 need to select an appropriate model even though GUM models are intended solely for quantitative examination procedures. Alternative (to GUM) models focused on probability have been proposed for medical laboratories' diagnostic tests. This article reviews, discusses and proposes models for diagnostic accuracy in blood establishments' screening tests. The output of these models is an alternative to VIM's measurement uncertainty concept. Example applications are provided for an anti-HCV test where calculations were performed using a commercial spreadsheet. The results show that these models satisfy ISO 15189 principles and that the estimation of clinical sensitivity, clinical specificity, binary results agreement and area under the ROC curve are alternatives to the measurement uncertainty concept. Copyright © 2014. Published by Elsevier Ltd.
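
    The diagnostic-accuracy quantities named in the abstract (clinical sensitivity, clinical specificity, results agreement and ROC area) can be computed from a panel of screening results as sketched below; the panel data and threshold are hypothetical, not the anti-HCV example from the article.

```python
import numpy as np

def diagnostic_accuracy(truth, result):
    """Clinical sensitivity, specificity and overall agreement for a binary
    screening test (1 = reactive/positive, 0 = non-reactive/negative)."""
    truth, result = np.asarray(truth), np.asarray(result)
    tp = np.sum((truth == 1) & (result == 1))
    tn = np.sum((truth == 0) & (result == 0))
    fp = np.sum((truth == 0) & (result == 1))
    fn = np.sum((truth == 1) & (result == 0))
    return tp / (tp + fn), tn / (tn + fp), (tp + tn) / truth.size

def roc_auc(truth, score):
    """Area under the ROC curve via the rank (Mann-Whitney) formulation."""
    truth, score = np.asarray(truth), np.asarray(score)
    pos, neg = score[truth == 1], score[truth == 0]
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

# Hypothetical screening panel: 1 = true infected, scores = assay S/CO ratio
truth = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
score = [5.2, 3.8, 0.7, 4.1, 0.3, 0.9, 1.2, 0.2, 0.6, 0.4]
print(diagnostic_accuracy(truth, [int(s >= 1.0) for s in score]))
print(roc_auc(truth, score))
```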

  5. Routine internal- and external-quality control data in clinical laboratories for estimating measurement and diagnostic uncertainty using GUM principles.

    Science.gov (United States)

    Magnusson, Bertil; Ossowicki, Haakan; Rienitz, Olaf; Theodorsson, Elvar

    2012-05-01

    Healthcare laboratories are increasingly joining into larger laboratory organizations encompassing several physical laboratories. This creates important new opportunities for re-defining the concept of a 'laboratory' to encompass all laboratories and measurement methods measuring the same measurand for a population of patients. In order to make measurement results comparable, bias should be minimized or eliminated and measurement uncertainty properly evaluated for all methods used for a particular patient population. The measurement as well as the diagnostic uncertainty can be evaluated from internal and external quality control results using GUM principles. In this paper the uncertainty evaluations are described in detail using only two main components, within-laboratory reproducibility and uncertainty of the bias component, according to a Nordtest guideline. The evaluation is exemplified for the determination of creatinine in serum for a conglomerate of laboratories, expressed both in absolute units (μmol/L) and in relative terms (%). An expanded measurement uncertainty of 12 μmol/L associated with concentrations of creatinine below 120 μmol/L and of 10% associated with concentrations above 120 μmol/L was estimated. The diagnostic uncertainty encompasses both measurement uncertainty and biological variation, and can be estimated for a single value and for a difference. This diagnostic uncertainty for the difference between two samples from the same patient was determined to be 14 μmol/L associated with concentrations of creatinine below 100 μmol/L and 14% associated with concentrations above 100 μmol/L.
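
    A minimal sketch of the two-component combination named above (within-laboratory reproducibility plus a bias component estimated from external QC results, Nordtest-style); the numerical inputs, including the reference-value uncertainty, are illustrative only.

```python
import math

def nordtest_expanded_uncertainty(u_rw, bias_values, u_cref, k=2.0):
    """Combine within-laboratory reproducibility with a bias component
    estimated from external QC / proficiency-testing results. All inputs
    must share the same units (all relative, e.g. percent, or all absolute)."""
    rms_bias = math.sqrt(sum(b * b for b in bias_values) / len(bias_values))
    u_bias = math.sqrt(rms_bias ** 2 + u_cref ** 2)   # bias + reference uncertainty
    u_c = math.sqrt(u_rw ** 2 + u_bias ** 2)          # combined standard uncertainty
    return k * u_c                                    # expanded uncertainty

# Illustrative numbers (percent): within-lab reproducibility 3.0 %,
# biases from three EQA rounds, reference-value uncertainty 1.5 %
print(nordtest_expanded_uncertainty(3.0, [-2.0, 1.0, 2.5], 1.5))
```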

  6. Assessment of adaptation measures to high-mountain risks in Switzerland under climate uncertainties

    Science.gov (United States)

    Muccione, Veruska; Lontzek, Thomas; Huggel, Christian; Ott, Philipp; Salzmann, Nadine

    2015-04-01

    The economic evaluation of different adaptation options is important to support policy-makers who need to set priorities in the decision-making process. However, the decision-making process faces considerable uncertainties regarding current and projected climate impacts. First, physical climate and related impact systems are highly complex and not fully understood. Second, the further we look into the future, the more important the emission pathways become, with effects on the frequency and severity of climate impacts. Decisions on adaptation measures taken today and in the future must be able to adequately consider the uncertainties originating from the different sources. Decisions are not taken in a vacuum but always in the context of specific social, economic, institutional and political conditions. Decision-finding processes strongly depend on the socio-political system and usually have evolved over some time. Finding and taking decisions in the respective socio-political and economic context multiplies the uncertainty challenge. Our presumption is that a sound assessment of the different adaptation options in Switzerland under uncertainty necessitates formulating and solving a dynamic, stochastic optimization problem. Economic optimization models in the field of climate change are not new. Typically, such models are applied to global-scale studies but barely to local-scale problems. In this analysis, we considered the case of the Guttannen-Grimsel Valley, situated in the Swiss Bernese Alps. The alpine community has been affected by high-magnitude, high-frequency debris flows that started in 2009 and were historically unprecedented. They were related to permafrost thaw in the rock slopes of Ritzlihorn and repeated rockfall events that accumulated on the debris fan, forming a sediment source for debris flows that were transported down the valley. An important transit road, a trans-European gas pipeline and settlements were severely affected and partly

  7. Environmental Uncertainty, Performance Measure Variety and Perceived Performance in Icelandic Companies

    DEFF Research Database (Denmark)

    Rikhardsson, Pall; Sigurjonsson, Throstur Olaf; Arnardottir, Audur Arna

    The use of performance measures and performance measurement frameworks has increased significantly in recent years. The type and variety of performance measures in use has been researched in various countries and linked to different variables such as the external environment, performance measurement frameworks, and management characteristics. This paper reports the results of a study carried out at year end 2013 of the use of performance measures by Icelandic companies and the links to perceived environmental uncertainty, management satisfaction with the performance measurement system and the perceived performance of the company. The sample was the 300 largest companies in Iceland and the response rate was 27%. Compared to other studies, the majority of the respondents use a surprisingly high number of different measures – both financial and non-financial. This made testing of the three...

  8. Graphical representations of data improve student understanding of measurement and uncertainty: An eye-tracking study

    Science.gov (United States)

    Susac, Ana; Bubic, Andreja; Martinjak, Petra; Planinic, Maja; Palmovic, Marijan

    2017-12-01

    Developing a better understanding of the measurement process and measurement uncertainty is one of the main goals of university physics laboratory courses. This study investigated the influence of graphical representation of data on student understanding and interpreting of measurement results. A sample of 101 undergraduate students (48 first year students and 53 third and fifth year students) from the Department of Physics, University of Zagreb were tested with a paper-and-pencil test consisting of eight multiple-choice test items about measurement uncertainties. One version of the test items included graphical representations of the measurement data. About half of the students solved that version of the test while the remaining students solved the same test without graphical representations. The results have shown that the students who had the graphical representation of data scored higher than their colleagues without graphical representation. In the second part of the study, measurements of eye movements were carried out on a sample of thirty undergraduate students from the Department of Physics, University of Zagreb while students were solving the same test on a computer screen. The results revealed that students who had the graphical representation of data spent considerably less time viewing the numerical data than the other group of students. These results indicate that graphical representation may be beneficial for data processing and data comparison. Graphical representation helps with visualization of data and therefore reduces the cognitive load on students while performing measurement data analysis, so students should be encouraged to use it.

  9. Understanding uncertainty

    CERN Document Server

    Lindley, Dennis V

    2013-01-01

    Praise for the First Edition: "...a reference for everyone who is interested in knowing and handling uncertainty." - Journal of Applied Statistics. The critically acclaimed First Edition of Understanding Uncertainty provided a study of uncertainty addressed to scholars in all fields, showing that uncertainty could be measured by probability, and that probability obeyed three basic rules that enabled uncertainty to be handled sensibly in everyday life. These ideas were extended to embrace the scientific method and to show how decisions, containing an uncertain element, could be rationally made.

  10. Quantifying acoustic doppler current profiler discharge uncertainty: A Monte Carlo based tool for moving-boat measurements

    Science.gov (United States)

    Mueller, David S.

    2017-01-01

    This paper presents a method using Monte Carlo simulations for assessing uncertainty of moving-boat acoustic Doppler current profiler (ADCP) discharge measurements using a software tool known as QUant, which was developed for this purpose. Analysis was performed on 10 data sets from four Water Survey of Canada gauging stations in order to evaluate the relative contribution of a range of error sources to the total estimated uncertainty. The factors that differed among data sets included the fraction of unmeasured discharge relative to the total discharge, flow nonuniformity, and operator decisions about instrument programming and measurement cross section. As anticipated, it was found that the estimated uncertainty is dominated by uncertainty of the discharge in the unmeasured areas, highlighting the importance of appropriate selection of the site, the instrument, and the user inputs required to estimate the unmeasured discharge. The main contributor to uncertainty was invalid data, but spatial inhomogeneity in water velocity and bottom-track velocity also contributed, as did variation in the edge velocity, uncertainty in the edge distances, edge coefficients, and the top and bottom extrapolation methods. To a lesser extent, spatial inhomogeneity in the bottom depth also contributed to the total uncertainty, as did uncertainty in the ADCP draft at shallow sites. The estimated uncertainties from QUant can be used to assess the adequacy of standard operating procedures. They also provide quantitative feedback to the ADCP operators about the quality of their measurements, indicating which parameters are contributing most to uncertainty, and perhaps even highlighting ways in which uncertainty can be reduced. Additionally, QUant can be used to account for self-dependent error sources such as heading errors, which are a function of heading. The results demonstrate the importance of a Monte Carlo method tool such as QUant for quantifying random and bias errors when
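
    The sketch below is not the QUant tool itself, only a toy Monte Carlo propagation in the same spirit: the discharge is split into a directly measured portion and extrapolated (top, bottom, edge) portions, each perturbed according to an assumed relative standard deviation, and the total is summarised by its spread. All numbers are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

def discharge_uncertainty_mc(q_measured, q_top, q_bottom, q_edges,
                             rel_sd_measured=0.01, rel_sd_unmeasured=0.10,
                             n=20_000):
    """Toy Monte Carlo propagation for a moving-boat ADCP discharge made of
    a measured portion plus extrapolated (unmeasured) portions."""
    q_m = q_measured * (1 + rel_sd_measured * rng.standard_normal(n))
    q_t = q_top * (1 + rel_sd_unmeasured * rng.standard_normal(n))
    q_b = q_bottom * (1 + rel_sd_unmeasured * rng.standard_normal(n))
    q_e = q_edges * (1 + rel_sd_unmeasured * rng.standard_normal(n))
    total = q_m + q_t + q_b + q_e
    u_rel = total.std(ddof=1) / total.mean()      # relative standard uncertainty
    return total.mean(), u_rel, np.percentile(total, [2.5, 97.5])

print(discharge_uncertainty_mc(q_measured=85.0, q_top=8.0,
                               q_bottom=5.0, q_edges=2.0))   # m^3/s
```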

  11. Measurement uncertainty in single, double and triple isotope dilution mass spectrometry.

    Science.gov (United States)

    Vogl, Jochen

    2012-02-15

    Triple IDMS has been applied for the first time to the quantification of element concentrations. It has been compared with single and double IDMS obtained on the same sample set in order to evaluate the advantages and disadvantages of triple IDMS over single and double IDMS as an analytical reference procedure. The measurement results of single, double and triple IDMS are indistinguishable, considering rounding due to the individual measurement uncertainties. As expected, the relative expanded uncertainties (k = 2) achieved with double IDMS (0.08%) are dramatically smaller than those obtained with single IDMS (1.4%). Triple IDMS yields the smallest relative expanded uncertainties (k = 2, 0.077%) unfortunately at the expense of a much higher workload. Nevertheless triple IDMS has the huge advantage that the isotope ratio of the spike does not need to be determined. Elements with high memory effects, highly enriched spikes or highest metrological requirements may be typical applications for triple IDMS. Copyright © 2011 John Wiley & Sons, Ltd.

  12. Biogenic carbon in combustible waste: Waste composition, variability and measurement uncertainty

    DEFF Research Database (Denmark)

    Larsen, Anna Warberg; Fuglsang, Karsten; Pedersen, Niels H.

    2013-01-01

    …described in the literature. This study addressed the variability of biogenic and fossil carbon in combustible waste received at a municipal solid waste incinerator. Two approaches were compared: (1) radiocarbon dating (14C analysis) of carbon dioxide sampled from the flue gas, and (2) a mass and energy balance method. In addition, the measurement uncertainties related to the two approaches were determined. Two flue gas sampling campaigns at a full-scale waste incinerator were included: one during normal operation and one with controlled waste input. Estimation of the carbon contents of the main waste types received was included. Both the 14C method and the balance method represented promising methods able to provide good-quality data for the ratio between biogenic and fossil carbon in waste. The relative uncertainty in the individual experiments was 7–10% (95% confidence interval) for the 14C method and slightly lower for the balance method.

  13. Robust framework for PET image reconstruction incorporating system and measurement uncertainties.

    Directory of Open Access Journals (Sweden)

    Huafeng Liu

    Full Text Available In Positron Emission Tomography (PET), an optimal estimate of the radioactivity concentration is obtained from the measured emission data under certain criteria. So far, all the well-known statistical reconstruction algorithms require an exactly known system probability matrix a priori, and the quality of such a system model largely determines the quality of the reconstructed images. In this paper, we propose an algorithm for PET image reconstruction for the real-world case where the PET system model is subject to uncertainties. The method casts PET reconstruction as a regularization problem, and the image estimate is obtained by means of an uncertainty-weighted least squares framework. The performance of our work is evaluated with the Shepp-Logan simulated phantom and real phantom data, which demonstrates significant improvements in image quality over least squares reconstruction.
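
    The weighted least squares idea can be illustrated on a tiny linear toy problem, shown below: weights are the inverse of the per-measurement variance and a Tikhonov term provides the regularization. The system matrix, noise levels and regularization strength are placeholders, not the paper's PET model.

```python
import numpy as np

def weighted_regularized_ls(A, y, measurement_var, alpha=0.1):
    """Minimise ||W^(1/2)(y - A x)||^2 + alpha*||x||^2 with weights equal to
    the inverse of each measurement's variance (normal-equations solution).
    A is a toy stand-in; real PET system matrices are huge and uncertain."""
    W = np.diag(1.0 / measurement_var)
    lhs = A.T @ W @ A + alpha * np.eye(A.shape[1])
    rhs = A.T @ W @ y
    return np.linalg.solve(lhs, rhs)

rng = np.random.default_rng(0)
x_true = np.array([4.0, 1.0, 2.5])            # tiny "image"
A = rng.random((12, 3))                       # toy system matrix
var = 0.05 + 0.05 * rng.random(12)            # per-detector noise variance
y = A @ x_true + rng.normal(0.0, np.sqrt(var))
print(weighted_regularized_ls(A, y, var))
```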

  14. Temperature Control of Continuous Chemical Reactors Under Noisy Measurements and Model Uncertainties

    Directory of Open Access Journals (Sweden)

    Ricardo Aguilar López

    2012-06-01

    Full Text Available The aim of this paper is to present the synthesis of a robust control law for the control of a class of nonlinear systems named Liouvillian. The control design is based on a sliding-mode uncertainty estimator developed within the framework of algebraic-differential concepts. The estimation convergence is shown by a Lyapunov-type analysis, and the closed-loop system stability is shown by means of the regulation error dynamics. Robustness of the proposed control scheme is tested in the face of noisy output measurements and model uncertainties. The performance of the proposed control law is illustrated with numerical simulations in which a class of oscillatory chemical system is used as an application example.

  15. Uncertainty evaluation by gamma transmission measurements and CFD model comparison in a FCC cold pilot unit

    Directory of Open Access Journals (Sweden)

    Dantas C.C.

    2013-01-01

    Full Text Available The solid flow in an air-catalyst circulating fluidized bed was simulated with a CFD model to obtain the axial and radial distributions. Project parameters were thereby confirmed and the steady-state operating condition was improved. Simulated axial and radial solid hold-up profiles are in good agreement with gamma transmission measurements. The transmission signal from a 241Am radioactive source was evaluated with a NaI(Tl) detector coupled to a multichannel analyzer. This non-intrusive measuring setup is installed at the riser of a cold pilot unit to determine parameters of the FCC catalyst flow at several concentrations. The mass flow rate, calculated by combining solid hold-up and solid-phase velocity measurements, was compared with the catalyst input measured at the down-comer. Evaluation of each measured parameter shows that a relative combined uncertainty of 6% at a 95% interval was estimated. The uncertainty analysis took into account a significant correlation in the riser scan transmission measurements. An Eulerian CFD approach incorporating the kinetic theory of granular flow was adopted to describe the gas–solid two-phase flows in a multizone circulating reactor. Instantaneous and local gas-particle velocities, void fraction and turbulence parameters were obtained, and the results are shown in 2D and 3D graphics.
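
    A sketch of the transmission-based hold-up estimate and a simple uncertainty combination is given below, assuming Beer-Lambert attenuation and uncorrelated relative components (the abstract notes that the actual analysis accounted for correlation); all numerical values are illustrative, not the pilot-unit data.

```python
import math

def solids_holdup(counts_empty, counts_flow, mu_solid, diameter_m):
    """Average solids volume fraction across the riser from gamma
    transmission, assuming I = I0 * exp(-mu_solid * eps_s * D), where
    mu_solid is the linear attenuation coefficient of the packed solid
    (1/m) and eps_s is the solids fraction."""
    return math.log(counts_empty / counts_flow) / (mu_solid * diameter_m)

def relative_combined_uncertainty(*relative_components):
    """Root-sum-of-squares combination of uncorrelated relative uncertainties."""
    return math.sqrt(sum(r * r for r in relative_components))

holdup = solids_holdup(counts_empty=1.0e5, counts_flow=7.8e4,
                       mu_solid=25.0, diameter_m=0.1)
u_rel = relative_combined_uncertainty(0.02, 0.015, 0.01)  # counting, geometry, mu
print(holdup, u_rel)
```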

  16. Cirrus Susceptibility to Changes in Ice Nuclei: Physical Processes, Model Uncertainties, and Measurement Needs

    Science.gov (United States)

    Jensen, Eric

    2017-01-01

    In this talk, I will begin by discussing the physical processes that govern the competition between heterogeneous and homogeneous ice nucleation in upper tropospheric cirrus clouds. Next, I will review the current knowledge of low-temperature ice nucleation from laboratory experiments and field measurements. I will then discuss the uncertainties and deficiencies in representations of cirrus processes in global models used to estimate the climate impacts of changes in cirrus clouds. Lastly, I will review the critical field measurements needed to advance our understanding of cirrus and their susceptibility to changes in aerosol properties.

  17. Comparison of ISO-GUM and Monte Carlo methods for the evaluation of measurement uncertainty: application to direct cadmium measurement in water by GFAAS.

    Science.gov (United States)

    Theodorou, Dimitrios; Meligotsidou, Loukia; Karavoltsos, Sotirios; Burnetas, Apostolos; Dassenakis, Manos; Scoullos, Michael

    2011-02-15

    The propagation stage of uncertainty evaluation, known as the propagation of distributions, is in most cases approached by the GUM (Guide to the Expression of Uncertainty in Measurement) uncertainty framework which is based on the law of propagation of uncertainty assigned to various input quantities and the characterization of the measurand (output quantity) by a Gaussian or a t-distribution. Recently, a Supplement to the ISO-GUM was prepared by the JCGM (Joint Committee for Guides in Metrology). This Guide gives guidance on propagating probability distributions assigned to various input quantities through a numerical simulation (Monte Carlo Method) and determining a probability distribution for the measurand. In the present work the two approaches were used to estimate the uncertainty of the direct determination of cadmium in water by graphite furnace atomic absorption spectrometry (GFAAS). The expanded uncertainty results (at 95% confidence levels) obtained with the GUM Uncertainty Framework and the Monte Carlo Method at the concentration level of 3.01 μg/L were ±0.20 μg/L and ±0.18 μg/L, respectively. Thus, the GUM Uncertainty Framework slightly overestimates the overall uncertainty by 10%. Even after taking into account additional sources of uncertainty that the GUM Uncertainty Framework considers as negligible, the Monte Carlo gives again the same uncertainty result (±0.18 μg/L). The main source of this difference is the approximation used by the GUM Uncertainty Framework in estimating the standard uncertainty of the calibration curve produced by least squares regression. Although the GUM Uncertainty Framework proves to be adequate in this particular case, generally the Monte Carlo Method has features that avoid the assumptions and the limitations of the GUM Uncertainty Framework. Copyright © 2010 Elsevier B.V. All rights reserved.
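
    A minimal sketch of the comparison described above, for a toy measurand c = (A - b0)/b1 from a calibration line: first-order GUM propagation versus a GUM Supplement 1 style Monte Carlo. The parameter values and uncertainties are chosen for illustration (and the correlation between the regression intercept and slope, present in the real case, is ignored here).

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy calibration-line parameters and a sample absorbance (illustrative)
b0, u_b0 = 0.010, 0.004          # intercept
b1, u_b1 = 0.085, 0.002          # slope (absorbance per ug/L)
A_s, u_As = 0.266, 0.003         # sample absorbance

# GUM law of propagation of uncertainty (first order, inputs uncorrelated)
c = (A_s - b0) / b1
dc_dA, dc_db0, dc_db1 = 1 / b1, -1 / b1, -(A_s - b0) / b1**2
u_gum = np.sqrt((dc_dA * u_As)**2 + (dc_db0 * u_b0)**2 + (dc_db1 * u_b1)**2)

# GUM Supplement 1 style Monte Carlo: propagate the full distributions
n = 200_000
c_mc = (rng.normal(A_s, u_As, n) - rng.normal(b0, u_b0, n)) / rng.normal(b1, u_b1, n)

print(f"GUM: c = {c:.3f} +/- {2 * u_gum:.3f} ug/L (k = 2)")
print(f"MC:  c = {np.mean(c_mc):.3f}, 95% interval "
      f"[{np.percentile(c_mc, 2.5):.3f}, {np.percentile(c_mc, 97.5):.3f}] ug/L")
```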

  18. The creation and validation of the Measure of Effective Attributes of Trainers (MEAT).

    Science.gov (United States)

    Boyd, Meredith R; Lewis, Cara C; Scott, Kelli; Krendl, Anne; Lyon, Aaron R

    2017-06-02

    Training is a core component in the implementation of empirically supported treatments, especially in the case of psychosocial interventions targeting mental illness. However, common forms of training are relatively ineffective in producing behavioral changes in providers. Trainers are in a strategic position to influence the success of training, but no research, to our knowledge, has explored whether personal characteristics of trainers (e.g., enthusiasm, charisma) increase effectiveness of training empirically supported treatments in the field of mental health. To address this gap, the current study created a measure of trainer characteristics (the Measure of Effective Attributes of Trainers (MEAT)) and assessed preliminary evidence for its reliability and validity by following gold standard measure development procedures. Measure development consisted of three steps: (1) An initial pool of items was generated based on extant literature, input from the target population, and expert input; (2) target users of the measure interacted with the initial item pool to ensure face validity as well as clarity of measure instructions, response options, and items; and (3) a convenience sample viewed training videos and completed the measure resulting from step 2 to establish preliminary evidence of reliability and validity. An exploratory factor analysis was performed on the measure to determine whether latent factors (i.e., subscales of characteristics) underlie the data. The final solution consisted of two factors that demonstrated preliminary evidence for structural validity of the measure. The first factor, labeled "Charisma," contained items related to characteristics that facilitate a positive personal relationship with the trainee (e.g., friendly, warm), and the second factor, labeled "Credibility," contained items related to characteristics that emphasize the qualification of the trainer (e.g., professional, experienced). There was also evidence for face validity

  19. Integration of rain gauge measurement errors with the overall rainfall uncertainty estimation using kriging methods

    Science.gov (United States)

    Cecinati, Francesca; Moreno Ródenas, Antonio Manuel; Rico-Ramirez, Miguel Angel; ten Veldhuis, Marie-claire; Han, Dawei

    2016-04-01

    In many research studies rain gauges are used as a reference point measurement for rainfall, because they can reach very good accuracy, especially compared to radar or microwave links, and their use is very widespread. In some applications rain gauge uncertainty is assumed to be small enough to be neglected. This can be done when rain gauges are accurate and their data is correctly managed. Unfortunately, in many operational networks the importance of accurate rainfall data and of data quality control can be underestimated; budget and best practice knowledge can be limiting factors in a correct rain gauge network management. In these cases, the accuracy of rain gauges can drastically drop and the uncertainty associated with the measurements cannot be neglected. This work proposes an approach based on three different kriging methods to integrate rain gauge measurement errors in the overall rainfall uncertainty estimation. In particular, rainfall products of different complexity are derived through (1) block kriging on a single rain gauge, (2) ordinary kriging on a network of different rain gauges, and (3) kriging with external drift to integrate all the available rain gauges with radar rainfall information. The study area is the Eindhoven catchment, contributing to the river Dommel, in the southern part of the Netherlands. The area, 590 km², is covered by high quality rain gauge measurements by the Royal Netherlands Meteorological Institute (KNMI), which has one rain gauge inside the study area and six around it, and by lower quality rain gauge measurements by the Dommel Water Board and by the Eindhoven Municipality (six rain gauges in total). The integration of the rain gauge measurement error is accomplished in all cases by increasing the nugget of the semivariogram proportionally to the estimated error. Using different semivariogram models for the different networks allows for the separate characterisation of higher and lower quality rain gauges. For the kriging with
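
    A minimal sketch of the nugget-inflation idea, written in covariance form (each gauge's error variance is added on the diagonal of the kriging system, the covariance-form equivalent of a per-gauge nugget). The covariance model, coordinates and error variances below are invented for illustration, not the Eindhoven data.

```python
import numpy as np

def exp_covariance(h, sill=4.0, range_km=10.0):
    """Exponential covariance model (illustrative parameters)."""
    return sill * np.exp(-h / range_km)

def kriging_with_gauge_errors(coords, values, error_var, target):
    """Ordinary kriging in which each rain gauge's measurement-error
    variance inflates its diagonal entry of the covariance matrix."""
    n = len(values)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
    K = np.ones((n + 1, n + 1))                        # bordered system (Lagrange row/col)
    K[:n, :n] = exp_covariance(d) + np.diag(error_var)  # error enters here
    K[n, n] = 0.0
    rhs = np.ones(n + 1)
    rhs[:n] = exp_covariance(np.linalg.norm(coords - target, axis=1))
    w = np.linalg.solve(K, rhs)[:n]
    return float(w @ values)

coords = np.array([[0.0, 0.0], [5.0, 1.0], [2.0, 6.0], [8.0, 7.0]])  # km
rain = np.array([3.1, 2.4, 4.0, 1.7])                                # mm
err_var = np.array([0.05, 0.05, 0.6, 0.6])   # higher- vs lower-quality gauges
print(kriging_with_gauge_errors(coords, rain, err_var, np.array([4.0, 4.0])))
```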

  20. Uncertainty propagation for the coulometric measurement of the plutonium concentration in MOX-PU4.

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    2017-11-07

    This GUM Workbench™ propagation of uncertainty is for the coulometric measurement of the plutonium concentration in a Pu standard material (C126) supplied as individual aliquots that were prepared by mass. The C126 solution had been prepared and aliquoted as a standard material. Samples are aliquoted into glass vials and heated to dryness for distribution as dried nitrate. The individual plutonium aliquots were not separated chemically or otherwise purified prior to measurement by coulometry in the F/H Laboratory. Hydrogen peroxide was used for valence adjustment. The Pu assay measurement results were corrected for the interference from trace iron in the solution measured for assay. Aliquot mass measurements were corrected for air buoyancy. The relative atomic mass (atomic weight) of the plutonium from the C126 certificate was used. The isotopic composition was determined by thermal ionization mass spectrometry (TIMS) for comparison but not used in the calculations.
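
    The air-buoyancy correction mentioned above is conventionally applied as sketched below; the densities used here (nominal air, an assumed solution density, steel calibration weights) are illustrative, not the laboratory's actual values.

```python
def buoyancy_corrected_mass(balance_reading_g, rho_sample=1.25,
                            rho_air=0.0012, rho_weights=8.0):
    """Conventional air-buoyancy correction for a mass determined against
    steel calibration weights; densities in g/cm^3 (illustrative values)."""
    correction = 1.0 + rho_air * (1.0 / rho_sample - 1.0 / rho_weights)
    return balance_reading_g * correction

print(buoyancy_corrected_mass(10.12345))
```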

  1. Development of electrical efficiency measurement techniques for 10 kW-class SOFC system: Part II. Uncertainty estimation

    Energy Technology Data Exchange (ETDEWEB)

    Tanaka, Yohei; Momma, Akihiko; Kato, Ken; Negishi, Akira; Takano, Kiyonami; Nozaki, Ken; Kato, Tohru [Fuel Cell System Group, Energy Technology Research Institute, National Institute of Advanced Industrial Science and Technology (AIST), AIST Tsukuba Central 2, 1-1-1 Umezono, Tsukuba, Ibaraki 305-8568 (Japan)

    2009-03-15

    Uncertainty of electrical efficiency measurement was investigated for a 10 kW-class SOFC system using town gas. The uncertainty of the heating value measured by the gas chromatography method on a mole basis was estimated as ±0.12% at the 95% level of confidence. Micro-gas chromatography with/without CH4 quantification may be able to reduce the uncertainty of measurement. Calibration and uncertainty estimation methods are proposed for flow-rate measurement of town gas with thermal mass-flow meters or controllers. With adequate calibration of the flowmeters, the flow rate of town gas or natural gas at 35 standard liters per minute can be measured within a relative uncertainty of ±1.0% at the 95% level of confidence. The uncertainty of power measurement can be as low as ±0.14% when a precise wattmeter is used and calibrated properly. It is clarified that the electrical efficiency of non-pressurized 10 kW-class SOFC systems can be measured within ±1.0% relative uncertainty at the 95% level of confidence with the developed techniques when the SOFC systems are operated relatively stably. (author)
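
    Since the electrical efficiency is power divided by the product of fuel flow rate and heating value, the quoted relative uncertainties combine in quadrature for uncorrelated inputs, as sketched below (all three inputs share the same coverage factor k = 2, so the expanded values can be combined directly).

```python
import math

def efficiency_relative_uncertainty(u_rel_power, u_rel_flow, u_rel_heating_value):
    """Relative uncertainty of efficiency = P / (flow * heating value),
    combining uncorrelated relative uncertainties in quadrature."""
    return math.sqrt(u_rel_power**2 + u_rel_flow**2 + u_rel_heating_value**2)

# Relative expanded uncertainties (k = 2) quoted above: power 0.14 %,
# town-gas flow 1.0 %, heating value 0.12 %
print(efficiency_relative_uncertainty(0.14, 1.0, 0.12))   # about 1.0 %
```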

  3. The measurement of uncertainty in caregivers of patients with heart failure.

    Science.gov (United States)

    Harkness, Karen; Arthur, Heather; McKelvie, Robert

    2013-01-01

    Family caregivers of heart failure (HF) patients describe feelings of uncertainty; however, studies measuring uncertainty in caregivers of HF patients are extremely sparse. This study examined the validity and reliability of the Mishel Uncertainty in Illness Scale-Family Member form (PPUS-FM) in caregivers of HF patients. Caregivers (n = 50) of community-dwelling HF patients completed the PPUS-FM, Caregiver Reaction Assessment (CRA) and Hospital Anxiety and Depression Scale (HADS) in this cross-sectional study. Significant correlations emerged between the PPUS-FM and (a) CRA-schedule burden (r = .499); (b) CRA-financial burden (r = .292, p < .05); (c) CRA-family burden (r = .385, p < .01); (d) CRA-health burden (r = .421, p < .01); and (e) HADS-depression scores (r = -.298, p < .05). Cronbach's alpha for the PPUS-FM was .89. In this sample, the PPUS-FM had some evidence of construct validity and good internal consistency. However, the respondent burden and unidimensional nature of the PPUS-FM suggest that this tool needs further revision and testing for use with caregivers of HF patients.

  4. Influence of measurement uncertainty on classification of thermal environment in buildings according to European Standard EN 15251

    DEFF Research Database (Denmark)

    Kolarik, Jakub; Olesen, Bjarne W.

    2015-01-01

    European Standard EN 15251 in its current version does not provide any guidance on how to handle the uncertainty of long-term measurements of indoor environmental parameters used for classification of buildings. The objective of the study was to analyse the uncertainty of field measurements of operative temperature and evaluate its effect on the categorization of the thermal environment according to EN 15251. A data set of field measurements of operative temperature in four office buildings situated in Denmark, Italy and Spain was used. Data for each building included approximately one year of continuous measurements of operative temperature at two measuring points (south/south-west and north/north-east orientation). Results of the present study suggest that measurement uncertainty needs to be considered during assessment of the thermal environment in existing buildings. When the expanded standard uncertainty was taken...

  5. Efficient implementation of a Monte Carlo method for uncertainty evaluation in dynamic measurements

    Science.gov (United States)

    Eichstädt, S.; Link, A.; Harris, P.; Elster, C.

    2012-06-01

    Measurement of quantities having time-dependent values such as force, acceleration or pressure is a topic of growing importance in metrology. The application of the Guide to the Expression of Uncertainty in Measurement (GUM) and its Supplements to the evaluation of uncertainty for such quantities is challenging. We address the efficient implementation of the Monte Carlo method described in GUM Supplements 1 and 2 for this task. The starting point is a time-domain observation equation. The steps of deriving a corresponding measurement model, the assignment of probability distributions to the input quantities in the model, and the propagation of the distributions through the model are all considered. A direct implementation of a Monte Carlo method can be intractable on many computers since the storage requirement of the method can be large compared with the available computer memory. Two memory-efficient alternatives to the direct implementation are proposed. One approach is based on applying updating formulae for calculating means, variances and point-wise histograms. The second approach is based on evaluating the measurement model sequentially in time. A simulated example is used to compare the performance of the direct and alternative procedures.
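
    One of the two memory-saving strategies named above, updating means and variances sample-by-sample so that the Monte Carlo trials never need to be stored simultaneously, can be sketched with Welford-style updating formulae; the dynamic model and distributions below are toy assumptions, not those of the paper's example.

```python
import numpy as np

class RunningMoments:
    """Welford-style running mean and variance for each time sample."""
    def __init__(self, n_time_samples):
        self.count = 0
        self.mean = np.zeros(n_time_samples)
        self.m2 = np.zeros(n_time_samples)

    def update(self, trial):
        self.count += 1
        delta = trial - self.mean
        self.mean += delta / self.count
        self.m2 += delta * (trial - self.mean)

    def variance(self):
        return self.m2 / (self.count - 1)

# Toy dynamic measurement: an uncertain first-order step response, 5000 trials
rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 200)
moments = RunningMoments(t.size)
for _ in range(5000):
    amplitude = rng.normal(1.0, 0.05)          # uncertain input quantity
    tau = rng.normal(0.1, 0.01)                # uncertain system time constant
    moments.update(amplitude * (1.0 - np.exp(-t / tau)))
print(moments.mean[-1], np.sqrt(moments.variance()[-1]))
```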

  6. Solar Irradiances Measured using SPN1 Radiometers: Uncertainties and Clues for Development

    Energy Technology Data Exchange (ETDEWEB)

    Badosa, Jordi; Wood, John; Blanc, Philippe; Long, Charles N.; Vuilleumier, Laurent; Demengel, Dominique; Haeffelin, Martial

    2014-12-08

    The fast development of solar radiation and energy applications, such as photovoltaic and solar thermodynamic systems, has increased the need for solar radiation measurement and monitoring, not only for the global component but also the diffuse and direct components. End users look for the best compromise between getting close to state-of-the-art measurements and keeping capital, maintenance and operating costs to a minimum. Among the existing commercial options, the SPN1 is a relatively low-cost solar radiometer that estimates global and diffuse solar irradiances from seven thermopile sensors under a shading mask and without moving parts. This work presents a comprehensive study of SPN1 accuracy and sources of uncertainty, resulting from laboratory experiments, numerical modeling and comparison studies between measurements from this sensor and state-of-the-art instruments at six diverse sites. Several clues are provided for improving the SPN1 accuracy and agreement with state-of-the-art measurements.

  7. Study of uncertainties of height measurements of monoatomic steps on Si 5 × 5 using DFT

    Science.gov (United States)

    Charvátová Campbell, Anna; Jelínek, Pavel; Klapetek, Petr

    2017-03-01

    The development of nanotechnology gives rise to new demands on standards for dimensional measurements. Monoatomic steps on, e.g., silicon are a suitable length standard with a very low nominal value. The quantum-mechanical nature of objects consisting of only a few atomic layers in one or more dimensions can no longer be neglected, and it is necessary to make a transition from the classical picture to a quantum approach in the field of uncertainty analysis. In this contribution, sources of uncertainty for height measurements using atomic force microscopy (AFM) in contact mode are discussed. Results of density functional theory (DFT) modeling of AFM scans on a monoatomic step on silicon 5 × 5 are presented. Van der Waals forces for the interaction of a spherical tip and an infinite step are calculated classically. Height measurements in constant-force mode at different forces are simulated. In our approach, we model the tip apex and the monoatomic step as systems of individual atoms. As interatomic forces act on the sample and the tip of the microscope, the atoms of both relax in order to reach equilibrium positions. This leads to changes in those quantities that are finally interpreted as the resultant height of the step. The presence of van der Waals forces induces differences between the forces acting on atoms at different distances from the step. The behavior of different tips is studied along with their impact on the resulting AFM scans. Because the shape of the tip apex is usually unknown in real experiments, this variance in the height result due to different tips is interpreted as a source of uncertainty.

  8. Modeling uncertainty of estimated illnesses attributed to non-O157:H7 Shiga toxin-producing Escherichia coli and its impact on illness cost.

    Science.gov (United States)

    Marks, Harry M; Tohamy, Soumaya M; Tsui, Flora

    2013-06-01

    Because of numerous reported foodborne illness cases due to non-O157:H7 Shiga toxin-producing Escherichia coli (STEC) bacteria in the United States and elsewhere, interest in requiring better control of these pathogens in the food supply has increased. Successfully putting forth regulations depends upon cost-benefit analyses. Policy decisions often depend upon an evaluation of the uncertainty of the estimates used in such an analysis. This article presents an approach for estimating the uncertainties of estimated expected cost per illness and total annual costs of non-O157 STEC-related illnesses due to uncertainties associated with (i) recent FoodNet data and (ii) methodology proposed by Scallan et al. in 2011. The FoodNet data categorize illnesses regarding hospitalization and death. We obtained the illness-category costs from the foodborne illness cost calculator of the U.S. Department of Agriculture, Economic Research Service. Our approach for estimating attendant uncertainties differs from that of Scallan et al. because we used a classical bootstrap procedure for estimating uncertainty of an estimated parameter value (e.g., mean value), reflecting the design of the FoodNet database, whereas the other approach results in an uncertainty distribution that includes an extraneous contribution due to the underlying variability of the distribution of illnesses among different sites. For data covering 2005 through 2010, we estimate that the average cost per illness was about $450, with a 98% credible interval of $230 to $1,000. This estimate and range are based on estimations of about one death and 100 hospitalizations per 34,000 illnesses. Our estimate of the total annual cost is about $51 million, with a 98% credible interval of $19 million to $122 million. The uncertainty distribution for total annual cost is approximated well by a lognormal distribution, with mean and standard deviations for the log-transformed costs of 10.765 and 0.390, respectively.
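
    The classical bootstrap of a mean with a percentile interval, the kind of resampling described above, can be sketched as below; the per-case costs are hypothetical and skewed only to mimic a mix of mild and hospitalised cases, not the FoodNet/ERS data.

```python
import numpy as np

rng = np.random.default_rng(2013)

def bootstrap_mean_interval(costs, n_boot=10_000, level=0.98):
    """Classical bootstrap of the mean cost per illness with a percentile
    credible/confidence interval at the requested level."""
    costs = np.asarray(costs, dtype=float)
    boot_means = np.array([rng.choice(costs, size=costs.size, replace=True).mean()
                           for _ in range(n_boot)])
    lo, hi = np.percentile(boot_means, [(1 - level) / 2 * 100,
                                        (1 + level) / 2 * 100])
    return costs.mean(), (lo, hi)

# Hypothetical skewed costs: many mild cases plus a few hospitalisations
costs = np.concatenate([rng.exponential(250, 330),     # non-hospitalised
                        rng.exponential(8000, 10)])    # hospitalised
print(bootstrap_mean_interval(costs))
```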

  9. Weak Anderson localisation in reverberation rooms and its effect on the uncertainty of sound power measurements

    DEFF Research Database (Denmark)

    Jacobsen, Finn

    2011-01-01

    The effect known as ‘weak Anderson localisation’, ‘coherent backscattering’ or ‘enhanced backscattering’ is a physical phenomenon that occurs in random systems, e.g., disordered media and linear wave systems, including reverberation rooms: the mean square response is increased at the drive point....... In a reverberation room this means that one can expect an increase of the reverberant sound field at the position of the source that generates the sound field. This affects the sound power output of the source and is therefore of practical concern. However, because of the stronger direct sound field at the source...... for the uncertainty of sound power measurements....

  10. The Harm that Underestimation of Uncertainty Does to Our Community: A Case Study Using Sunspot Area Measurements

    Science.gov (United States)

    Munoz-Jaramillo, Andres

    2017-08-01

    Data products in heliospheric physics are very often provided without clear estimates of uncertainty. From helioseismology in the solar interior, all the way to in situ solar wind measurements beyond 1AU, uncertainty estimates are typically hard for users to find (buried inside long documents that are separate from the data products), or simply non-existent.There are two main reasons why uncertainty measurements are hard to find:1. Understanding instrumental systematic errors is given a much higher priority inside instrumental teams.2. The desire to perfectly understand all sources of uncertainty postpones indefinitely the actual quantification of uncertainty in our measurements.Using the cross calibration of 200 years of sunspot area measurements as a case study, in this presentation we will discuss the negative impact that inadequate measurements of uncertainty have on users, through the appearance of toxic and unnecessary controversies, and data providers, through the creation of unrealistic expectations regarding the information that can be extracted from their data. We will discuss how empirical estimates of uncertainty represent a very good alternative to not providing any estimates at all, and finalize by discussing the bare essentials that should become our standard practice for future instruments and surveys.

  11. Measurement of sub-picoampere direct currents with uncertainties below ten attoamperes

    Science.gov (United States)

    Krause, C.; Drung, D.; Scherer, H.

    2017-02-01

    A new type of the ultrastable low-noise current amplifier (ULCA) is presented. It involves thick-film resistors to achieve a high feedback resistance of 185 GΩ at the input amplifier. An improved noise level of 0.4 fA/√Hz with a 1/f corner of about 30 μHz and an effective input bias current well below 100 aA are demonstrated. For small direct currents, measurement uncertainties below 10 aA are achievable even without current reversal or on/off switching. Above about 1 pA, the stability of the ULCA's resistor network limits the relative measurement uncertainty to about 10 parts per million. The new setup is used to characterize and optimize the noise in the wiring installed on a dilution refrigerator for current measurements on single-electron transport pumps. In a test configuration connected to the wiring in a pulse tube refrigerator, a total noise floor of 0.44 fA/√Hz was achieved including the contributions of amplifier and cryogenic wiring.

  12. A Weighted Belief Entropy-Based Uncertainty Measure for Multi-Sensor Data Fusion

    Science.gov (United States)

    Tang, Yongchuan; Zhou, Deyun; Xu, Shuai; He, Zichang

    2017-01-01

    In real applications, how to measure the uncertainty of sensor reports before applying sensor data fusion is a big challenge. In this paper, within the frame of Dempster–Shafer evidence theory, a weighted belief entropy based on Deng entropy is proposed to quantify the uncertainty of uncertain information. The weight of the proposed belief entropy is based on the relative scale of a proposition with regard to the frame of discernment (FOD). Compared with some other uncertainty measures in the Dempster–Shafer framework, the new measure focuses on the uncertain information represented by not only the mass function but also the scale of the FOD, which means less information loss in information processing. After that, a new multi-sensor data fusion approach based on the weighted belief entropy is proposed. The rationality and superiority of the new multi-sensor data fusion method are verified with an experiment on artificial data and an application to fault diagnosis of a motor rotor. PMID:28441736
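
    A sketch of Deng entropy, and of a scale-weighted variant in the spirit described above, is given below. The exact weighting used in the paper may differ from this illustration; the mass function is a toy example.

```python
import math

def deng_entropy(mass):
    """Deng entropy of a Dempster-Shafer mass function:
    E_d(m) = -sum_A m(A) * log2( m(A) / (2**|A| - 1) )."""
    return -sum(m * math.log2(m / (2 ** len(A) - 1))
                for A, m in mass.items() if m > 0)

def weighted_belief_entropy(mass, frame):
    """Scale-weighted sketch: each focal element's term is weighted by its
    relative size |A| / |X| (an assumed weighting, for illustration only)."""
    return -sum((len(A) / len(frame)) * m * math.log2(m / (2 ** len(A) - 1))
                for A, m in mass.items() if m > 0)

frame = frozenset({"a", "b", "c"})
mass = {frozenset({"a"}): 0.6, frozenset({"a", "b"}): 0.3, frame: 0.1}
print(deng_entropy(mass), weighted_belief_entropy(mass, frame))
```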

  13. Rigorous evaluation of chemical measurement uncertainty: liquid chromatographic analysis methods using detector response factor calibration

    Science.gov (United States)

    Toman, Blaza; Nelson, Michael A.; Bedner, Mary

    2017-06-01

    Chemical measurement methods are designed to promote accurate knowledge of a measurand or system. As such, these methods often allow elicitation of latent sources of variability and correlation in experimental data. They typically implement measurement equations that support quantification of effects associated with calibration standards and other known or observed parametric variables. Additionally, multiple samples and calibrants are usually analyzed to assess accuracy of the measurement procedure and repeatability by the analyst. Thus, a realistic assessment of uncertainty for most chemical measurement methods is not purely bottom-up (based on the measurement equation) or top-down (based on the experimental design), but inherently contains elements of both. Confidence in results must be rigorously evaluated for the sources of variability in all of the bottom-up and top-down elements. This type of analysis presents unique challenges due to various statistical correlations among the outputs of measurement equations. One approach is to use a Bayesian hierarchical (BH) model which is intrinsically rigorous, thus making it a straightforward method for use with complex experimental designs, particularly when correlations among data are numerous and difficult to elucidate or explicitly quantify. In simpler cases, careful analysis using GUM Supplement 1 (MC) methods augmented with random effects meta analysis yields similar results to a full BH model analysis. In this article we describe both approaches to rigorous uncertainty evaluation using as examples measurements of 25-hydroxyvitamin D3 in solution reference materials via liquid chromatography with UV absorbance detection (LC-UV) and liquid chromatography mass spectrometric detection using isotope dilution (LC-IDMS).

  15. Influence of Spherical Radiation Pattern Measurement Uncertainty on Handset Performance Measures

    DEFF Research Database (Denmark)

    Nielsen, Jesper Ødum; Pedersen, Gert Frølund

    2005-01-01

    ), and mean effective gain (MEG) can be computed. Often this kind of measurements are made with a phantom head next to the handsets in order to simulate the influence of a real user. The measured radiation patterns are only expected to be repeatable if the same setup is used, i.e., the same phantom...... system that may introduce errors in standardized performance measurements. Radiation patterns of six handsets have been measured while they were mounted at various offsets from the reference position defined by the Cellular Telecommunications & Internet Association (CTIA) certification. The change...

  16. Changes in Handset Performance Measures due to Spherical Radiation Pattern Measurement Uncertainty

    DEFF Research Database (Denmark)

    Nielsen, Jesper Ødum; Pedersen, Gert Frølund

    ), and mean effective gain (MEG) can be computed. Often this kind of measurements are made with a phantom head next to the handsets in order to simulate the influence of a real user. The measured radiation patterns are only expected to be repeatable if the same setup is used, i.e., the same phantom...... system that may introduce errors in standardized performance measurements. Radiation patterns of six handsets have been measured while they were mounted at various offsets from the reference position defined by the Cellular Telecommunications & Internet Association (CTIA) certification. The change...

  17. How do different humanness measures relate? Confronting the attribution of secondary emotions, human uniqueness, and human nature traits.

    Science.gov (United States)

    Martínez, Rocío; Rodriguez-Bailon, Rosa; Moya, Miguel; Vaes, Jeroen

    2017-01-01

    The present research examines the relationship between the infrahumanization approach and the two-dimensional model of humanness: an issue that has received very little empirical attention. In Study 1, we created three unknown groups (Humanized, Animalized, and Mechanized) granting/denying them Human Nature (HN) and Human Uniqueness (HU) traits. The attribution of primary/secondary emotions was measured. As expected, participants attributed more secondary emotions to the humanized compared to dehumanized groups. Importantly, both animalized and mechanized groups were attributed similar amounts of secondary emotions. In Study 2, the groups were described in terms of their capacity to express secondary emotions. We measured the attribution of HN/HU traits. Results showed that the infrahumanized group was denied both HU/HN traits. The results highlight the importance of considering the common aspects of both approaches in understanding processes of dehumanization.

  18. Validation and measurement uncertainty estimation in food microbiology: differences between quantitative and qualitative methods

    Directory of Open Access Journals (Sweden)

    Vesna Režić Dereani

    2010-09-01

    The aim of this research is to describe quality control procedures, procedures for validation and measurement uncertainty (MU) determination as important elements of quality assurance in food microbiology laboratories for qualitative and quantitative types of analysis. Accreditation is conducted according to ISO 17025:2007, General requirements for the competence of testing and calibration laboratories, which guarantees compliance with standard operating procedures and the technical competence of the staff involved in the tests, and has recently been widely introduced in food microbiology laboratories in Croatia. Beyond the introduction of a quality manual and many general documents, some of the most demanding procedures in routine microbiology laboratories are the establishment of measurement uncertainty (MU) procedures and of validation experiment designs. These procedures are not yet standardized even at the international level, and they require practical microbiological knowledge together with statistical competence. Differences between validation experiment designs for quantitative and qualitative food microbiology analyses are discussed in this research, and practical solutions are briefly described. MU for quantitative determinations is a more demanding issue than qualitative MU calculation. MU calculations are based on external proficiency testing data and internal validation data. In this paper, practical schematic descriptions are given for both procedures.

  19. Degradation and performance evaluation of PV module in desert climate conditions with estimate uncertainty in measuring

    Directory of Open Access Journals (Sweden)

    Fezzani Amor

    2017-01-01

    The performance of a photovoltaic (PV) module is affected by outdoor conditions. Outdoor testing consists of installing a module and collecting electrical performance data and climatic data over a certain period of time. It can also include the study of long-term performance under real working conditions. Tests were carried out at URAER, located in the desert region of Ghardaïa (Algeria), which is characterized by high irradiation and temperature levels. The degradation of a PV module with temperature and time of exposure to sunlight contributes significantly to the final output from the module, as the output is reduced each year. This paper presents a comparative study of different methods to evaluate the degradation of a PV module after a long-term exposure of more than 12 years in a desert region, and estimates the measurement uncertainties. Firstly, the evaluation uses three methods: visual inspection, data given by the Solmetric PVA-600 Analyzer translated to Standard Test Conditions (STC), and results based on the translation equations of IEC 60891. Secondly, the degradation rates are calculated for all methods. Finally, a comparison is made between the degradation rates given by the Solmetric PVA-600 analyzer, those calculated by a simulation model, and those calculated by two methods (IEC 60891 procedures 1 and 2). A detailed uncertainty study was carried out in order to improve the procedure and the measurement instrument.
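
    As a simple illustration of what a degradation rate of this kind amounts to, the sketch below computes an average annual loss of STC maximum power; the power values and exposure time are made up for the example, not taken from the study.

    ```python
    def annual_degradation_rate(p_initial_stc, p_final_stc, years):
        """Average relative loss of STC maximum power per year of exposure (%/year)."""
        return (1.0 - p_final_stc / p_initial_stc) / years * 100.0

    # Illustrative values: a 55 W module re-measured after 12 years of exposure
    print(f"{annual_degradation_rate(55.0, 47.8, 12.0):.2f} %/year")
    ```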

  20. Measurement uncertainty of lesion and reference mediastinum standardized uptake value in lung cancer.

    Science.gov (United States)

    Laffon, Eric; Milpied, Noel; Marthan, Roger

    2017-06-01

    To assess standardized uptake value (SUV) measurement uncertainty (MU) of lung cancer lesions with uptake greater than the mediastinum but less than or equal to the liver and that of the mediastinum blood pool, and to compare lesion SUV with mediastinum SUV by assessing the MU of their ratio. Dynamic PET data involving 10 frames were retrospectively analyzed in 10 patients, yielding maximal SUV of 25 lesions (Lesion-SUVmax), 10 mediastinum SUVs, either maximal or mean (Med-SUVmax, Med-SUVmean), 25 Rmax ratios (= Lesion-SUVmax/Med-SUVmax), and 25 Rmean ratios (= Lesion-SUVmax/Med-SUVmean). A mean coefficient of variation was calculated for each parameter, leading to the corresponding relative measurement uncertainty (MUr). The MU of Rmax was found to involve both the Lesion-SUVmax and Med-SUVmax MU: MUr = 33.3 %, 23.3 % and 21.9 %, respectively (95 % confidence level). No significant difference in MUr was found between Med-SUVmax and Med-SUVmean or between Rmax and Rmean. Comparison between target lesion SUV and reference mediastinum SUV must take into account the SUV MU of both. Therefore, no MU reduction can be expected from using the lesion/mediastinum SUVmax ratio instead of Lesion-SUVmax. Moreover, no MU reduction can be expected from using the mean mediastinum SUV instead of the maximal one.
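
    For readers wanting the arithmetic behind the ratio figure: if the lesion and mediastinum SUV uncertainties are treated as independent, the standard quadrature rule for a ratio (a generic propagation sketch, not the authors' derivation) already comes close to the reported 33.3 %:

    ```latex
    \mathrm{MU_r}(R_\mathrm{max}) \approx
    \sqrt{\mathrm{MU_r}(\text{Lesion-SUV}_\mathrm{max})^2 + \mathrm{MU_r}(\text{Med-SUV}_\mathrm{max})^2}
    = \sqrt{23.3^2 + 21.9^2}\,\% \approx 32\,\%.
    ```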

  1. A study on evaluation strategies in dimensional X-ray computed tomography by estimation of measurement uncertainties

    DEFF Research Database (Denmark)

    Müller, Pavel; Hiller, Jochen; Cantatore, Angela

    2012-01-01

    connector and a plastic toggle, a hearing aid component. These are measured using a commercial CT scanner. Traceability is transferred using tactile and optical coordinate measuring machines, which are used to produce reference measurements. Results show that measurements of diameter for both parts resulted...... measurement results using different measuring strategies applied in different inspection software packages for volume and surface data analysis. The strategy influence is determined by calculating the measurement uncertainty. This investigation includes measurements of two industrial items, an aluminium pipe...

  2. Force Measurement Services at Kebs: AN Overview of Equipment, Procedures and Uncertainty

    Science.gov (United States)

    Bangi, J. O.; Maranga, S. M.; Nganga, S. P.; Mutuli, S. M.

    This paper describes the facilities, instrumentation and procedures currently used in the force laboratory at the Kenya Bureau of Standards (KEBS) for force measurement services. The laboratory uses the Force Calibration Machine (FCM) to calibrate force-measuring instruments. The FCM derives its traceability via comparisons using reference transfer force transducers calibrated by the Force Standard Machines (FSM) of a National Metrology Institute (NMI). The force laboratory is accredited to ISO/IEC 17025 by the German Accreditation Body (DAkkS). The accredited measurement scope of the laboratory extends to 1 MN for the calibration of force transducers in both compression and tension modes. ISO 376 procedures are used when calibrating force transducers. The KEBS reference transfer standards have capacities of 10, 50, 300 and 1000 kN to cover the full range of the FCM. The uncertainty in the forces measured by the FCM was reviewed and determined in accordance with the new EURAMET calibration guide. The relative expanded uncertainty W of the forces realized by the FCM was evaluated over the range from 10 kN to 1 MN and was found to be 5.0 × 10⁻⁴ with a coverage factor k equal to 2. The overall normalized error (En) of the comparison results was also found to be less than 1. The accredited Calibration and Measurement Capability (CMC) of the KEBS force laboratory was based on the results of those intercomparisons. The FCM enables KEBS to provide traceability for the calibration of class ‘1’ force instruments as per ISO 376.
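
    For reference, the normalized error used to judge such intercomparison results is conventionally defined as follows (the standard definition, stated here for clarity rather than quoted from the paper), with |En| ≤ 1 indicating agreement within the expanded (k = 2) uncertainties:

    ```latex
    E_n = \frac{x_\mathrm{lab} - x_\mathrm{ref}}{\sqrt{U_\mathrm{lab}^2 + U_\mathrm{ref}^2}}
    ```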

  3. An approach based on Hierarchical Bayesian Graphical Models for measurement interpretation under uncertainty

    Science.gov (United States)

    Skataric, Maja; Bose, Sandip; Zeroug, Smaine; Tilke, Peter

    2017-02-01

    It is not uncommon in the field of non-destructive evaluation that multiple measurements encompassing a variety of modalities are available for analysis and interpretation for determining the underlying states of nature of the materials or parts being tested. Despite and sometimes due to the richness of data, significant challenges arise in the interpretation manifested as ambiguities and inconsistencies due to various uncertain factors in the physical properties (inputs), environment, measurement device properties, human errors, and the measurement data (outputs). Most of these uncertainties cannot be described by any rigorous mathematical means, and modeling of all possibilities is usually infeasible for many real time applications. In this work, we will discuss an approach based on Hierarchical Bayesian Graphical Models (HBGM) for the improved interpretation of complex (multi-dimensional) problems with parametric uncertainties that lack usable physical models. In this setting, the input space of the physical properties is specified through prior distributions based on domain knowledge and expertise, which are represented as Gaussian mixtures to model the various possible scenarios of interest for non-destructive testing applications. Forward models are then used offline to generate the expected distribution of the proposed measurements which are used to train a hierarchical Bayesian network. In Bayesian analysis, all model parameters are treated as random variables, and inference of the parameters is made on the basis of posterior distribution given the observed data. Learned parameters of the posterior distribution obtained after the training can therefore be used to build an efficient classifier for differentiating new observed data in real time on the basis of pre-trained models. We will illustrate the implementation of the HBGM approach to ultrasonic measurements used for cement evaluation of cased wells in the oil industry.

  4. SU-F-T-301: Planar Dose Pass Rate Inflation Due to the MapCHECK Measurement Uncertainty Function

    Energy Technology Data Exchange (ETDEWEB)

    Bailey, D [Northside Hospital Cancer Institute, Atlanta, GA (United States); Spaans, J; Kumaraswamy, L; Podgorsak, M [Roswell Park Cancer Institute, Buffalo, NY (United States)

    2016-06-15

    Purpose: To quantify the effect of the Measurement Uncertainty function on planar dosimetry pass rates, as analyzed with Sun Nuclear Corporation analytic software (“MapCHECK” or “SNC Patient”). This optional function is toggled on by default upon software installation, and automatically increases the user-defined dose percent difference (%Diff) tolerance for each planar dose comparison. Methods: Dose planes from 109 IMRT fields and 40 VMAT arcs were measured with the MapCHECK 2 diode array, and compared to calculated planes from a commercial treatment planning system. Pass rates were calculated within the SNC analytic software using varying calculation parameters, including Measurement Uncertainty on and off. By varying the %Diff criterion for each dose comparison performed with Measurement Uncertainty turned off, an effective %Diff criterion was defined for each field/arc corresponding to the pass rate achieved with MapCHECK Uncertainty turned on. Results: For 3%/3mm analysis, the Measurement Uncertainty function increases the user-defined %Diff by 0.8–1.1% average, depending on plan type and calculation technique, for an average pass rate increase of 1.0–3.5% (maximum +8.7%). For 2%, 2 mm analysis, the Measurement Uncertainty function increases the user-defined %Diff by 0.7–1.2% average, for an average pass rate increase of 3.5–8.1% (maximum +14.2%). The largest increases in pass rate are generally seen with poorly-matched planar dose comparisons; the MapCHECK Uncertainty effect is markedly smaller as pass rates approach 100%. Conclusion: The Measurement Uncertainty function may substantially inflate planar dose comparison pass rates for typical IMRT and VMAT planes. The types of uncertainties incorporated into the function (and their associated quantitative estimates) as described in the software user’s manual may not accurately estimate realistic measurement uncertainty for the user’s measurement conditions. Pass rates listed in published

  5. Source attribution of Arctic black carbon constrained by aircraft and surface measurements

    Science.gov (United States)

    Xu, Jun-Wei; Martin, Randall V.; Morrow, Andrew; Sharma, Sangeeta; Huang, Lin; Leaitch, W. Richard; Burkart, Julia; Schulz, Hannes; Zanatta, Marco; Willis, Megan D.; Henze, Daven K.; Lee, Colin J.; Herber, Andreas B.; Abbatt, Jonathan P. D.

    2017-10-01

    Black carbon (BC) contributes to Arctic warming, yet sources of Arctic BC and their geographic contributions remain uncertain. We interpret a series of recent airborne (NETCARE 2015; PAMARCMiP 2009 and 2011 campaigns) and ground-based measurements (at Alert, Barrow and Ny-Ålesund) from multiple methods (thermal, laser incandescence and light absorption) with the GEOS-Chem global chemical transport model and its adjoint to attribute the sources of Arctic BC. This is the first comparison with a chemical transport model of refractory BC (rBC) measurements at Alert. The springtime airborne measurements performed by the NETCARE campaign in 2015 and the PAMARCMiP campaigns in 2009 and 2011 offer BC vertical profiles extending to above 6 km across the Arctic and include profiles above Arctic ground monitoring stations. Our simulations with the addition of seasonally varying domestic heating and of gas flaring emissions are consistent with ground-based measurements of BC concentrations at Alert and Barrow in winter and spring (rRMSE < 13 %) and with airborne measurements of the BC vertical profile across the Arctic (rRMSE = 17 %), except for an underestimation in the middle troposphere (500-700 hPa). Sensitivity simulations suggest that anthropogenic emissions in eastern and southern Asia have the largest effect on the Arctic BC column burden both in spring (56 %) and annually (37 %), with the largest contribution in the middle troposphere (400-700 hPa). Anthropogenic emissions from northern Asia contribute considerable BC (27 % in spring and 43 % annually) to the lower troposphere (below 900 hPa). Biomass burning contributes 20 % to the Arctic BC column annually. At the Arctic surface, anthropogenic emissions from northern Asia (40-45 %) and eastern and southern Asia (20-40 %) are the largest BC contributors in winter and spring, followed by Europe (16-36 %). Biomass burning from North America is the most important contributor to all stations in summer, especially at Barrow. Our adjoint simulations indicate pronounced spatial heterogeneity in the contribution of emissions to the Arctic BC column concentrations, with noteworthy contributions from emissions in eastern China (15 %) and western Siberia (6.5 %). Although uncertain, gas flaring

  6. CFCl3 (CFC-11): UV Absorption Spectrum Temperature Dependence Measurements and the Impact on Atmospheric Lifetime and Uncertainty

    Science.gov (United States)

    Mcgillen, Max R.; Fleming, Eric L.; Jackman, Charles H.; Burkholder, James B.

    2014-01-01

    CFCl3 (CFC-11) is both an ozone-depleting substance and a potent greenhouse gas that is removed from the atmosphere primarily via stratospheric UV photolysis. Uncertainty in the temperature dependence of its UV absorption spectrum is a significant contributing factor to the overall uncertainty in its global lifetime and, thus, model calculations of stratospheric ozone recovery and climate change. In this work, the CFC-11 UV absorption spectrum was measured over a range of wavelengths (184.95-230 nm) and temperatures (216-296 K). We report a spectrum temperature dependence that is weaker than currently recommended for use in atmospheric models. The impact on its atmospheric lifetime was quantified using a 2-D model and the spectrum parameterization developed in this work. The obtained global annually averaged lifetime was 58.1 ± 0.7 years (2σ uncertainty due solely to the spectrum uncertainty). The lifetime is slightly reduced and the uncertainty significantly reduced relative to those obtained using current spectrum recommendations.

  7. Quantifying uncertainty in the measurement of arsenic in suspended particulate matter by Atomic Absorption Spectrometry with hydride generator

    Directory of Open Access Journals (Sweden)

    Ahuja Tarushee

    2011-04-01

    Arsenic is a toxic element that creates several problems in human beings, especially when inhaled through air. Accurate and precise measurement of arsenic in suspended particulate matter (SPM) is therefore of prime importance, as it gives information about the level of toxicity in the environment, so that preventive measures can be taken in the affected areas. Quality assurance is equally important in the measurement of arsenic in SPM samples before making any decision. The quality and reliability of the data for such volatile elements depend upon the measurement uncertainty of each step involved, from sampling to analysis. Analytical results with a quantified uncertainty give a measure of the confidence level of the laboratory concerned. The main objective of this study was therefore to determine the arsenic content in SPM samples with an uncertainty budget and to find out the various potential sources of uncertainty that affect the results. With this in mind, we selected seven diverse sites in Delhi (the national capital of India) for quantification of the arsenic content in SPM samples with an uncertainty budget, from sampling by HVS to analysis by Atomic Absorption Spectrometer-Hydride Generator (AAS-HG). Many steps are involved from sampling to the final result, and we have considered the various potential sources of uncertainty. The calculation of uncertainty is based on the ISO/IEC 17025:2005 document and the EURACHEM guideline. It was found that the final results depend mostly on the uncertainty in measurement due to repeatability, the final volume prepared for analysis, the weighing balance and sampling by HVS. After analysis of the data from the seven diverse sites in Delhi, it was concluded that during the period from 31 January 2008 to 7 February 2008 the arsenic concentration varied from 1.44 ± 0.25 to 5.58 ± 0.55 ng/m3 with a 95% confidence level (k = 2).
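
    A minimal sketch of how such an uncertainty budget is typically combined and expanded, using placeholder relative standard uncertainties for the contributors named above (the actual budget values are in the paper):

    ```python
    import math

    # Illustrative relative standard uncertainties (as fractions) for the main
    # contributors named in the abstract; values are placeholders only.
    components = {
        "repeatability":       0.045,
        "final_volume":        0.020,
        "weighing_balance":    0.010,
        "hvs_sampling":        0.060,
    }

    u_c = math.sqrt(sum(u ** 2 for u in components.values()))  # combined standard uncertainty
    U = 2.0 * u_c                                              # expanded uncertainty, k = 2 (~95 %)

    concentration = 5.58  # ng/m3, illustrative
    print(f"{concentration:.2f} +/- {U * concentration:.2f} ng/m3 (k = 2)")
    ```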

  8. Decision making for urban drainage systems under uncertainty caused by weather radar rainfall measurement

    Science.gov (United States)

    Dai, Qiang; Zhuo, Lu; Han, Dawei

    2015-04-01

    With the rapid growth of urbanization and population, decision making for managing urban flood risk has become a significant issue for most large cities in China. A high-quality measurement of rainfall at small temporal but large spatial scales is of great importance to urban flood risk management. Weather radar rainfall, with its advantages of short-term predictability and high spatial and temporal resolution, has been widely applied in urban drainage system modeling. It is recognized that weather radar is subject to many uncertainties, and many studies have been carried out to quantify these uncertainties in order to improve the quality of the rainfall estimates and the corresponding outlet flow. However, since the final action in urban flood risk management is a decision, such as issuing a flood warning or whether to build, or how to operate, a hydraulic structure, some uncertainties of weather radar may have little influence on the final results while others may have a significant one. For this reason, in this study we aim to investigate which characteristics of the radar rainfall are the significant ones for decision making in urban flood risk management. A radar probabilistic quantitative rainfall estimation scheme is integrated with an urban flood model (Storm Water Management Model, SWMM) to make a decision on whether to warn or not according to the decision criteria. A number of scenarios with different storm types, synoptic regimes and spatial and temporal correlations are designed to analyze the relationship between these factors and the final decision. Based on this, a parameterized radar probabilistic rainfall estimation model is established that reflects the most important elements in decision making for urban flood risk management.

  9. A PC program for estimating measurement uncertainty for aeronautics test instrumentation

    Science.gov (United States)

    Blumenthal, Philip Z.

    1995-01-01

    A personal computer program was developed which provides aeronautics and operations engineers at Lewis Research Center with a uniform method to quickly provide values for the uncertainty in test measurements and research results. The software package used for performing the calculations is Mathcad 4.0, a Windows version of a program which provides an interactive user interface for entering values directly into equations with immediate display of results. The error contribution from each component of the system is identified individually in terms of the parameter measured. The final result is given in common units, SI units, and percent of full scale range. The program also lists the specifications for all instrumentation and calibration equipment used for the analysis. It provides a presentation-quality printed output which can be used directly for reports and documents.

  10. Uncertainty assessment for measurements performed in the determination of thermal conductivity by scanning thermal microscopy

    Science.gov (United States)

    Ramiandrisoa, Liana; Allard, Alexandre; Hay, Bruno; Gomés, Séverine

    2017-11-01

    Although its use has been restricted to relative studies, scanning thermal microscopy (SThM) is presented today as a candidate technique for performing quantitative measurement of thermal properties at the nanoscale, thanks to the development of relevant calibration protocols. Based on the principle behind near-field microscopes, SThM uses a miniaturized probe to quantify heat transfers versus samples of various thermal conductivities: since the thermal conductivity of a sample cannot be directly estimated, a direct measurand related to the heat transfer must be defined and measured for each sample. That is the reason why the SThM technique applied to thermal conductivity determination belongs to the family of inverse methods. In this work we aim to qualify the technique from a metrological point of view. For the first time, assessment of uncertainty associated with the direct measurand Δ R is performed, yielding a result of less than 2%.

  11. A Measure of Uncertainty regarding the Interval Constraint of Normal Mean Elicited by Two Stages of a Prior Hierarchy

    Directory of Open Access Journals (Sweden)

    Hea-Jung Kim

    2014-01-01

    This paper considers a hierarchical screened Gaussian model (HSGM) for Bayesian inference of normal models when an interval constraint on the mean parameter space needs to be incorporated in the modeling but such a restriction is uncertain. An objective measure of the uncertainty regarding the interval constraint accounted for by the HSGM is proposed for the Bayesian inference. For this purpose, we derive a maximum entropy prior of the normal mean, eliciting the uncertainty regarding the interval constraint, and then obtain the uncertainty measure by considering the relationship between the maximum entropy prior and the marginal prior of the normal mean in the HSGM. A Bayesian estimation procedure for the HSGM is developed, and two numerical illustrations pertaining to the properties of the uncertainty measure are provided.

  12. Evaluating the capabilities and uncertainties of droplet measurements for the fog droplet spectrometer (FM-100)

    Directory of Open Access Journals (Sweden)

    J. K. Spiegel

    2012-09-01

    Droplet size spectra measurements are crucial to obtain a quantitative microphysical description of clouds and fog. However, cloud droplet size measurements are subject to various uncertainties. This work focuses on the error analysis of two key measurement uncertainties arising during cloud droplet size measurements with a conventional droplet size spectrometer (FM-100): first, we addressed the precision with which droplets can be sized with the FM-100 on the basis of the Mie theory. We deduced error assumptions and proposed a new method on how to correct measured size distributions for these errors by redistributing the measured droplet size distribution using a stochastic approach. Second, based on a literature study, we summarized corrections for particle losses during sampling with the FM-100. We applied both corrections to cloud droplet size spectra measured at the high alpine site Jungfraujoch for a temperature range from 0 °C to 11 °C. We showed that Mie scattering led to spikes in the droplet size distributions using the default sizing procedure, while the new stochastic approach reproduced the ambient size distribution adequately. A detailed analysis of the FM-100 sampling efficiency revealed that particle losses were typically below 10% for droplet diameters up to 10 μm. For larger droplets, particle losses can increase up to 90% for the largest droplets of 50 μm at ambient wind speeds below 4.4 m s⁻¹ and even to >90% for larger angles between the instrument orientation and the wind vector (sampling angle) at higher wind speeds. Comparisons of the FM-100 to other reference instruments revealed that the total liquid water content (LWC) measured by the FM-100 was more sensitive to particle losses than to re-sizing based on Mie scattering, while the total number concentration was only marginally influenced by particle losses. Consequently, for further LWC measurements with the FM-100 we strongly recommend to consider (1) the

  13. Analysis of Uncertainty in a Middle-Cost Device for 3D Measurements in BIM Perspective.

    Science.gov (United States)

    Sánchez, Alonso; Naranjo, José-Manuel; Jiménez, Antonio; González, Alfonso

    2016-09-22

    Medium-cost devices equipped with sensors are being developed to obtain 3D measurements. Some allow for generating geometric models and point clouds. Nevertheless, the accuracy of these measurements should be evaluated, taking into account the requirements of the Building Information Model (BIM). This paper analyzes the uncertainty in outdoor/indoor three-dimensional coordinate measures and point clouds (using Spherical Accuracy Standard (SAS) methods) for Eyes Map, a medium-cost tablet manufactured by e-Capture Research & Development Company, Mérida, Spain. To achieve it, in outdoor tests, by means of this device, the coordinates of targets were measured from 1 to 6 m and point clouds were obtained. Subsequently, these were compared to the coordinates of the same targets measured by a Total Station. The Euclidean average distance error was 0.005-0.027 m for measurements by Photogrammetry and 0.013-0.021 m for the point clouds. All of them satisfy the tolerance for point cloud acquisition (0.051 m) according to the BIM Guide for 3D Imaging (General Services Administration); similar results are obtained in the indoor tests, with values of 0.022 m. In this paper, we establish the optimal distances for the observations in both Photogrammetry and 3D Photomodeling modes (outdoor), and point out some working conditions to avoid in indoor environments. Finally, the authors discuss some recommendations for improving the performance and working methods of the device.
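
    A minimal sketch of the accuracy metric being reported, i.e. the mean 3D Euclidean distance between device-derived coordinates and the Total Station reference coordinates for the same targets; the coordinates below are invented for illustration.

    ```python
    import numpy as np

    def mean_euclidean_error(measured_xyz, reference_xyz):
        """Mean 3D Euclidean distance between device coordinates and
        Total Station reference coordinates for the same targets."""
        d = np.linalg.norm(np.asarray(measured_xyz) - np.asarray(reference_xyz), axis=1)
        return d.mean()

    # Illustrative target coordinates (metres)
    device    = [[1.002, 0.498, 1.703], [2.013, 0.505, 1.695]]
    reference = [[1.000, 0.500, 1.700], [2.000, 0.500, 1.700]]
    print(f"{mean_euclidean_error(device, reference):.3f} m")
    ```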

  14. Analysis of Uncertainty in a Middle-Cost Device for 3D Measurements in BIM Perspective

    Directory of Open Access Journals (Sweden)

    Alonso Sánchez

    2016-09-01

    Medium-cost devices equipped with sensors are being developed to obtain 3D measurements. Some allow for generating geometric models and point clouds. Nevertheless, the accuracy of these measurements should be evaluated, taking into account the requirements of the Building Information Model (BIM). This paper analyzes the uncertainty in outdoor/indoor three-dimensional coordinate measures and point clouds (using Spherical Accuracy Standard (SAS) methods) for Eyes Map, a medium-cost tablet manufactured by e-Capture Research & Development Company, Mérida, Spain. To achieve it, in outdoor tests, by means of this device, the coordinates of targets were measured from 1 to 6 m and point clouds were obtained. Subsequently, these were compared to the coordinates of the same targets measured by a Total Station. The Euclidean average distance error was 0.005–0.027 m for measurements by Photogrammetry and 0.013–0.021 m for the point clouds. All of them satisfy the tolerance for point cloud acquisition (0.051 m) according to the BIM Guide for 3D Imaging (General Services Administration); similar results are obtained in the indoor tests, with values of 0.022 m. In this paper, we establish the optimal distances for the observations in both Photogrammetry and 3D Photomodeling modes (outdoor), and point out some working conditions to avoid in indoor environments. Finally, the authors discuss some recommendations for improving the performance and working methods of the device.

  15. Fragmentation Uncertainties in Hadronic Observables for Top-quark Mass Measurements

    CERN Document Server

    Corcella, Gennaro; Kim, Doojin

    We study the Monte Carlo uncertainties due to modeling of hadronization and showering in the extraction of the top-quark mass from observables that use exclusive hadronic final states in top decays, such as $t \rightarrow \text{anything} + J/\psi$ or $t \rightarrow \text{anything} + (B \rightarrow \text{charged tracks})$, where $B$ is a $B$-hadron. To this end, we investigate the sensitivity of the top-quark mass, determined by means of a few observables already proposed in the literature as well as some new proposals, to the relevant parameters of event generators, such as HERWIG 6 and PYTHIA 8. We find that constraining those parameters at $\mathcal{O}(1\%-10\%)$ is required to avoid a Monte Carlo uncertainty on $m_t$ greater than 500 MeV. For the sake of achieving the needed accuracy on such parameters, we examine the sensitivity of the top-quark mass measured from spectral features, such as peaks, endpoints and distributions of $E_{B}$, $m_{B\ell}$, and some $m_{T2}$-like variables. We find that restricting one...

  16. Uncertainty Analysis on Risk Assessment of Water Inrush in Karst Tunnels

    Directory of Open Access Journals (Sweden)

    Yiqing Hao

    2016-01-01

    An improved attribute recognition method is reviewed and discussed to evaluate the risk of water inrush in karst tunnels. Due to the complex geology and hydrogeology, the methodology addresses the uncertainties related to the evaluation indices and attribute measures. These uncertainties can be described by probability distributions. The values of the evaluation indices and attribute measures were obtained through random numbers generated by Monte Carlo simulations, and an attribute measure belt was chosen instead of the linear attribute measure function. Considering the uncertainties of the evaluation indices and attribute measures, the probability distributions of the four risk grades are calculated using random numbers generated by Monte Carlo simulation. According to the probability distribution, the risk level can be analyzed under different confidence coefficients. The improved method is more accurate and feasible compared with the results derived from the original attribute recognition model. Finally, the improved attribute recognition method was applied and verified in the Longmenshan tunnel in China.
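
    A minimal sketch of the Monte Carlo layer described above, assuming each evaluation index is sampled from its uncertainty distribution and the resulting risk grade is tallied over many draws; the index distributions, weights and grade thresholds are placeholders, not the paper's values.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    N = 10_000

    # Placeholder: three evaluation indices sampled from their uncertainty ranges
    indices = np.column_stack([
        rng.triangular(0.2, 0.4, 0.7, N),   # e.g. a groundwater-related index
        rng.uniform(0.3, 0.6, N),           # e.g. a strata-related index
        rng.normal(0.5, 0.05, N),           # e.g. a surrounding-rock index
    ])
    weights = np.array([0.4, 0.35, 0.25])
    score = indices @ weights

    # Placeholder thresholds separating risk grades I-IV
    grades = np.digitize(score, [0.30, 0.45, 0.60])
    for g, label in enumerate(["I", "II", "III", "IV"]):
        print(f"grade {label}: {np.mean(grades == g):.2%}")
    ```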

  17. Uncertainties in assessing tillage erosion - How appropriate are our measuring techniques?

    Science.gov (United States)

    Fiener, P.; Wilken, F.; Aldana-Jague, E.; Deumlich, D.; Gómez, J. A.; Guzmán, G.; Hardy, R. A.; Quinton, J. N.; Sommer, M.; Van Oost, K.; Wexler, R.

    2018-03-01

    Tillage erosion on arable land is a very important process leading to a net downslope movement of soil and soil constituents. Tillage erosion rates are commonly in the same order of magnitude as water erosion rates and can be even higher, especially under highly mechanized agricultural soil management. Despite its prevalence and magnitude, tillage erosion is still understudied compared to water erosion. The goal of this study was to bring together experts using different techniques to determine tillage erosion and use the different results to discuss and quantify uncertainties associated with tillage erosion measurements. The study was performed in northeastern Germany on a 10 m by 50 m plot with a mean slope of 8%. Tillage erosion was determined after two sequences of seven tillage operations. Two different micro-tracers (magnetic iron oxide mixed with soil and fluorescent sand) and one macro-tracer (passive radio-frequency identification transponders (RFIDs), size: 4 × 22 mm) were used to directly determine soil fluxes. Moreover, tillage-induced changes in topography were measured for the entire plot with two different terrestrial laser scanners and an unmanned aerial system for structure-from-motion topography analysis. Based on these elevation differences, corresponding soil fluxes were calculated. The mean translocation distance of all techniques was 0.57 m per tillage pass, with a relatively wide range of mean soil translocation distances ranging from 0.39 to 0.72 m per pass. A benchmark technique could not be identified, as all of the techniques used have individual error sources that could not be quantified. However, the translocation distances of the macro-tracers used were consistently smaller than the translocation distances of the micro-tracers (mean difference = −26 ± 12 %), which questions the widely used assumption of non-selective soil transport via tillage operations. This study points out that tillage erosion measurements, carried out under almost

  18. On the Uncertainty in Single Molecule Fluorescent Lifetime and Energy Emission Measurements

    Science.gov (United States)

    Brown, Emery N.; Zhang, Zhenhua; McCollom, Alex D.

    1996-01-01

    Time-correlated single photon counting has recently been combined with mode-locked picosecond pulsed excitation to measure the fluorescent lifetimes and energy emissions of single molecules in a flow stream. Maximum likelihood (ML) and least squares methods agree and are optimal when the number of detected photons is large; however, in single-molecule fluorescence experiments the number of detected photons can be less than 20, 67 percent of those can be noise, and the detection time is restricted to 10 nanoseconds. Under the assumption that the photon signal and background noise are two independent inhomogeneous Poisson processes, we derive the exact joint arrival time probability density of the photons collected in a single counting experiment performed in the presence of background noise. The model obviates the need to bin experimental data for analysis, and makes it possible to analyze formally the effect of background noise on the photon detection experiment using both ML and Bayesian methods. For both methods we derive the joint and marginal probability densities of the fluorescent lifetime and fluorescent emission. The ML and Bayesian methods are compared in an analysis of simulated single-molecule fluorescence experiments of Rhodamine 110 using different combinations of expected background noise and expected fluorescence emission. While both the ML and Bayesian procedures perform well for analyzing fluorescence emissions, the Bayesian methods provide more realistic measures of uncertainty in the fluorescent lifetimes. The Bayesian methods would be especially useful for measuring uncertainty in fluorescent lifetime estimates in current single-molecule flow stream experiments where the expected fluorescence emission is low. Both the ML and Bayesian algorithms can be automated for applications in molecular biology.
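
    A hedged sketch of the kind of likelihood involved: photon arrival times within the detection window modeled as a mixture of an exponential decay (signal) and a uniform background, with the lifetime obtained by maximum likelihood. The window length, signal fraction and photon counts are illustrative, and this is a simplified homogeneous-background stand-in, not the exact density derived in the paper.

    ```python
    import numpy as np
    from scipy.optimize import minimize_scalar

    T = 10.0  # detection window (ns)

    def neg_log_likelihood(tau, times, signal_fraction):
        """Arrival times modeled as a mixture of a truncated exponential decay
        (signal) and a uniform background over the window [0, T]."""
        f_signal = np.exp(-times / tau) / (tau * (1.0 - np.exp(-T / tau)))
        f_noise = 1.0 / T
        return -np.sum(np.log(signal_fraction * f_signal + (1 - signal_fraction) * f_noise))

    # Simulated experiment: roughly a third of 15 detected photons are background
    rng = np.random.default_rng(2)
    true_tau = 3.5
    photons = np.concatenate([rng.exponential(true_tau, 10), rng.uniform(0, T, 5)])
    photons = photons[photons < T]

    fit = minimize_scalar(neg_log_likelihood, bounds=(0.1, T),
                          args=(photons, 0.67), method="bounded")
    print(f"ML lifetime estimate: {fit.x:.2f} ns")
    ```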

  19. Field measurement of dermal soil loading attributable to various activities: implications for exposure assessment.

    Science.gov (United States)

    Kissel, J C; Richter, K Y; Fenske, R A

    1996-02-01

    Estimates of soil adherence to skin are required for assessment of dermal exposures to contaminants in soils. Previously available estimates depend heavily on indirect measurements and/or artificial activities and reflect sampling of hands only. Results are presented here from direct measurement of soil loading on skin surfaces of volunteers before and after normal occupational and recreational activities that might reasonably be expected to lead to soil contact. Skin surfaces assayed included hands, forearms, lower legs, faces and/or feet. Observed hand loadings vary over five orders of magnitude (roughly from 10⁻³ to 10² mg/cm²) and are dependent upon type of activity. Hand loadings within the current default range of 0.2 to 1.0 mg/cm² were produced by activities providing opportunity for relatively vigorous soil contact (rugby, farming). Loadings less than 0.2 mg/cm² were found on hands following activities presenting less opportunity for direct soil contact (soccer, professional grounds maintenance) and on other body parts under many conditions. The default range does not, however, represent a worst case. Children playing in mud on the shore of a lake generated geometric mean loadings well in excess of 1 mg/cm² on hands, arms, legs, and feet. Post-activity average loadings on hands were typically higher than average loadings on other body parts resulting from the same activity. Hand data from limited activities cannot, however, be used to conservatively predict loadings that might occur on other body surfaces without regard to activity since non-hand loadings attributable to higher contact activities exceeded hand loadings resulting from lower contact activities. Differences between pre- and post-activity loadings also demonstrate that dermal contact with soil is episodic. Typical background (pre-activity) geometric mean loadings appear to be on the order of 10⁻² mg/cm² or less. Because exposures are activity dependent, quantification of dermal exposure

  20. Research on the attribution evaluating methods of dynamic effects of various parameter uncertainties on the in-structure floor response spectra of nuclear power plant

    Science.gov (United States)

    Li, Jianbo; Lin, Gao; Liu, Jun; Li, Zhiyuan

    2017-01-01

    Consideration of the dynamic effects of site and structural parameter uncertainty is required by the standards for nuclear power plants (NPPs) in most countries. The anti-seismic standards provide two basic methods to analyze parameter uncertainty. The first is to deal directly and manually with the floor response spectra (FRS) values calculated by deterministic approaches. The second is to perform probability statistical analysis of the FRS results on the basis of the Monte Carlo method. The two methods can only reflect the overall effects of the uncertain parameters, and the results cannot be screened for a certain parameter's influence and contribution. In this study, based on dynamic analyses of the floor response spectra of NPPs, a comprehensive index of the assessed impact of various uncertain parameters is presented and recommended, including the correlation coefficient, the regression slope coefficient and the Tornado swing. To compensate for the lack of guidance in the NPP seismic standards, the proposed method can effectively be used to evaluate the contributions of various parameters from the aspects of sensitivity, acuity and statistical swing correlations. Finally, examples are provided to verify the set of indicators from systematic and intuitive perspectives, such as the uncertainty of the impact of the structural parameters and their contribution to the FRS of NPPs. The index is sensitive to different types of parameters, which provides a new technique for evaluating the anti-seismic parameters required for NPPs.
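
    As a rough illustration of the three indicators named above, the sketch below computes a correlation coefficient, a regression slope and a simple swing-style range for each uncertain parameter from Monte Carlo samples; the parameters, the response model and the swing definition are placeholders, not the paper's formulation.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    N = 500

    # Placeholder Monte Carlo samples: two uncertain parameters and a
    # synthetic peak floor-response-spectrum value (all illustrative)
    soil_shear_modulus = rng.normal(1.0, 0.15, N)
    structural_damping = rng.normal(0.05, 0.01, N)
    frs_peak = (2.0 + 1.5 * soil_shear_modulus - 8.0 * structural_damping
                + rng.normal(0, 0.1, N))

    for name, p in [("soil shear modulus", soil_shear_modulus),
                    ("structural damping", structural_damping)]:
        r = np.corrcoef(p, frs_peak)[0, 1]          # correlation coefficient
        slope = np.polyfit(p, frs_peak, 1)[0]       # regression slope coefficient
        # Swing-style range: response change as the parameter moves over its central 90 %
        lo, hi = np.percentile(p, [5, 95])
        swing = abs(slope) * (hi - lo)
        print(f"{name}: r = {r:.2f}, slope = {slope:.2f}, swing = {swing:.2f}")
    ```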

  1. Source attribution of Arctic black carbon constrained by aircraft and surface measurements

    Directory of Open Access Journals (Sweden)

    J.-W. Xu

    2017-10-01

    Black carbon (BC) contributes to Arctic warming, yet sources of Arctic BC and their geographic contributions remain uncertain. We interpret a series of recent airborne (NETCARE 2015; PAMARCMiP 2009 and 2011 campaigns) and ground-based measurements (at Alert, Barrow and Ny-Ålesund) from multiple methods (thermal, laser incandescence and light absorption) with the GEOS-Chem global chemical transport model and its adjoint to attribute the sources of Arctic BC. This is the first comparison with a chemical transport model of refractory BC (rBC) measurements at Alert. The springtime airborne measurements performed by the NETCARE campaign in 2015 and the PAMARCMiP campaigns in 2009 and 2011 offer BC vertical profiles extending to above 6 km across the Arctic and include profiles above Arctic ground monitoring stations. Our simulations with the addition of seasonally varying domestic heating and of gas flaring emissions are consistent with ground-based measurements of BC concentrations at Alert and Barrow in winter and spring (rRMSE < 13 %) and with airborne measurements of the BC vertical profile across the Arctic (rRMSE = 17 %), except for an underestimation in the middle troposphere (500–700 hPa). Sensitivity simulations suggest that anthropogenic emissions in eastern and southern Asia have the largest effect on the Arctic BC column burden both in spring (56 %) and annually (37 %), with the largest contribution in the middle troposphere (400–700 hPa). Anthropogenic emissions from northern Asia contribute considerable BC (27 % in spring and 43 % annually) to the lower troposphere (below 900 hPa). Biomass burning contributes 20 % to the Arctic BC column annually. At the Arctic surface, anthropogenic emissions from northern Asia (40–45 %) and eastern and southern Asia (20–40 %) are the largest BC contributors in winter and spring, followed by Europe (16–36 %). Biomass burning from North America is the most important

  2. Quantum Measurements, Stochastic Networks, the Uncertainty Principle, and the Not So Strange “Weak Values”

    Directory of Open Access Journals (Sweden)

    Dmitri Sokolovski

    2016-09-01

    Suppose we make a series of measurements on a chosen quantum system. The outcomes of the measurements form a sequence of random events, which occur in a particular order. The system, together with a meter or meters, can be seen as following the paths of a stochastic network connecting all possible outcomes. The paths are shaped from the virtual paths of the system, and the corresponding probabilities are determined by the measuring devices employed. If the measurements are highly accurate, the virtual paths become "real", and the mean values of a quantity (a functional) are directly related to the frequencies with which the paths are traveled. If the measurements are highly inaccurate, the mean (weak) values are expressed in terms of the relative probability amplitudes. For pre- and post-selected systems they are bound to take arbitrary values, depending on the chosen transition. This is a direct consequence of the uncertainty principle, which forbids one from distinguishing between interfering alternatives, while leaving the interference between them intact.

  3. Gaussian membership functions are most adequate in representing uncertainty in measurements

    Science.gov (United States)

    Kreinovich, V.; Quintana, C.; Reznik, L.

    1992-01-01

    In rare situations, like fundamental physics, we perform experiments without knowing what their results will be. In the majority of real-life measurement situations, we more or less know beforehand what kind of results we will get. Of course, this is not precise knowledge of the type 'the result will be between alpha - beta and alpha + beta,' because in that case we would not need any measurements at all. It is usually knowledge that is best represented in uncertain terms, like 'perhaps (or 'most likely', etc.) the measured value x is between alpha - beta and alpha + beta.' Traditional statistical methods neglect this additional knowledge and process only the measurement results. So it is desirable to be able to process this uncertain knowledge as well. A natural way to process it is by using fuzzy logic. But there is a problem: we can use different membership functions to represent the same uncertain statements, and different functions lead to different results. What membership function do we choose? In the present paper, we show that under some reasonable assumptions, Gaussian functions mu(x) = exp(-beta*x^2) are the most adequate choice of membership functions for representing uncertainty in measurements. This representation was efficiently used in testing jet engines for airplanes and spaceships.
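
    A tiny sketch of the membership function named above, shifted so that it is centered on the anticipated value alpha (a hypothetical reading of the statement "x is most likely near alpha"); the parameter values are arbitrary.

    ```python
    import numpy as np

    def gaussian_membership(x, alpha, beta):
        """Membership of 'the measured value is about alpha': mu(x) = exp(-beta*(x - alpha)^2)."""
        return np.exp(-beta * (x - alpha) ** 2)

    x = np.linspace(9.0, 11.0, 5)
    print(gaussian_membership(x, alpha=10.0, beta=4.0))
    ```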

  4. Single hadron response measurement and calorimeter jet energy scale uncertainty with the ATLAS detector at the LHC

    CERN Document Server

    Aad, Georges; Abdallah, Jalal; Abdelalim, Ahmed Ali; Abdesselam, Abdelouahab; Abdinov, Ovsat; Abi, Babak; Abolins, Maris; AbouZeid, Ossama; Abramowicz, Halina; Abreu, Henso; Acerbi, Emilio; Acharya, Bobby Samir; Adamczyk, Leszek; Adams, David; Addy, Tetteh; Adelman, Jahred; Aderholz, Michael; Adomeit, Stefanie; Adragna, Paolo; Adye, Tim; Aefsky, Scott; Aguilar-Saavedra, Juan Antonio; Aharrouche, Mohamed; Ahlen, Steven; Ahles, Florian; Ahmad, Ashfaq; Ahsan, Mahsana; Aielli, Giulio; Akdogan, Taylan; Åkesson, Torsten Paul Ake; Akimoto, Ginga; Akimov, Andrei; Akiyama, Kunihiro; Alam, Mohammad; Alam, Muhammad Aftab; Albert, Justin; Albrand, Solveig; Aleksa, Martin; Aleksandrov, Igor; Alessandria, Franco; Alexa, Calin; Alexander, Gideon; Alexandre, Gauthier; Alexopoulos, Theodoros; Alhroob, Muhammad; Aliev, Malik; Alimonti, Gianluca; Alison, John; Aliyev, Magsud; Allbrooke, Benedict; Allport, Phillip; Allwood-Spiers, Sarah; Almond, John; Aloisio, Alberto; Alon, Raz; Alonso, Alejandro; Alvarez Gonzalez, Barbara; Alviggi, Mariagrazia; Amako, Katsuya; Amaral, Pedro; Amelung, Christoph; Ammosov, Vladimir; Amorim, Antonio; Amorós, Gabriel; Amram, Nir; Anastopoulos, Christos; Ancu, Lucian Stefan; Andari, Nansi; Andeen, Timothy; Anders, Christoph Falk; Anders, Gabriel; Anderson, Kelby; Andreazza, Attilio; Andrei, George Victor; Andrieux, Marie-Laure; Anduaga, Xabier; Angerami, Aaron; Anghinolfi, Francis; Anisenkov, Alexey; Anjos, Nuno; Annovi, Alberto; Antonaki, Ariadni; Antonelli, Mario; Antonov, Alexey; Antos, Jaroslav; Anulli, Fabio; Aoun, Sahar; Aperio Bella, Ludovica; Apolle, Rudi; Arabidze, Giorgi; Aracena, Ignacio; Arai, Yasuo; Arce, Ayana; Arfaoui, Samir; Arguin, Jean-Francois; Arik, Engin; Arik, Metin; Armbruster, Aaron James; Arnaez, Olivier; Arnault, Christian; Artamonov, Andrei; Artoni, Giacomo; Arutinov, David; Asai, Shoji; Asfandiyarov, Ruslan; Ask, Stefan; Åsman, Barbro; Asquith, Lily; Assamagan, Ketevi; Astbury, Alan; Astvatsatourov, Anatoli; Aubert, Bernard; Auge, Etienne; Augsten, Kamil; Aurousseau, Mathieu; Avolio, Giuseppe; Avramidou, Rachel Maria; Axen, David; Ay, Cano; Azuelos, Georges; Azuma, Yuya; Baak, Max; Baccaglioni, Giuseppe; Bacci, Cesare; Bach, Andre; Bachacou, Henri; Bachas, Konstantinos; Backes, Moritz; Backhaus, Malte; Badescu, Elisabeta; Bagnaia, Paolo; Bahinipati, Seema; Bai, Yu; Bailey, David; Bain, Travis; Baines, John; Baker, Oliver Keith; Baker, Mark; Baker, Sarah; Banas, Elzbieta; Banerjee, Piyali; Banerjee, Swagato; Banfi, Danilo; Bangert, Andrea Michelle; Bansal, Vikas; Bansil, Hardeep Singh; Barak, Liron; Baranov, Sergei; Barashkou, Andrei; Barbaro Galtieri, Angela; Barber, Tom; Barberio, Elisabetta Luigia; Barberis, Dario; Barbero, Marlon; Bardin, Dmitri; Barillari, Teresa; Barisonzi, Marcello; Barklow, Timothy; Barlow, Nick; Barnett, Bruce; Barnett, Michael; Baroncelli, Antonio; Barone, Gaetano; Barr, Alan; Barreiro, Fernando; Barreiro Guimarães da Costa, João; Barrillon, Pierre; Bartoldus, Rainer; Barton, Adam Edward; Bartsch, Valeria; Bates, Richard; Batkova, Lucia; Batley, Richard; Battaglia, Andreas; Battistin, Michele; Bauer, Florian; Bawa, Harinder Singh; Beale, Steven; Beau, Tristan; Beauchemin, Pierre-Hugues; Beccherle, Roberto; Bechtle, Philip; Beck, Hans Peter; Becker, Sebastian; Beckingham, Matthew; Becks, Karl-Heinz; Beddall, Andrew; Beddall, Ayda; Bedikian, Sourpouhi; Bednyakov, Vadim; Bee, Christopher; Begel, Michael; Behar Harpaz, Silvia; Behera, Prafulla; Beimforde, Michael; Belanger-Champagne, Camille; Bell, Paul; Bell, William; Bella, 
Gideon; Bellagamba, Lorenzo; Bellina, Francesco; Bellomo, Massimiliano; Belloni, Alberto; Beloborodova, Olga; Belotskiy, Konstantin; Beltramello, Olga; Ben Ami, Sagi; Benary, Odette; Benchekroun, Driss; Benchouk, Chafik; Bendel, Markus; Benekos, Nektarios; Benhammou, Yan; Benhar Noccioli, Eleonora; Benitez Garcia, Jorge-Armando; Benjamin, Douglas; Benoit, Mathieu; Bensinger, James; Benslama, Kamal; Bentvelsen, Stan; Berge, David; Bergeaas Kuutmann, Elin; Berger, Nicolas; Berghaus, Frank; Berglund, Elina; Beringer, Jürg; Bernat, Pauline; Bernhard, Ralf; Bernius, Catrin; Berry, Tracey; Bertella, Claudia; Bertin, Antonio; Bertinelli, Francesco; Bertolucci, Federico; Besana, Maria Ilaria; Besson, Nathalie; Bethke, Siegfried; Bhimji, Wahid; Bianchi, Riccardo-Maria; Bianco, Michele; Biebel, Otmar; Bieniek, Stephen Paul; Bierwagen, Katharina; Biesiada, Jed; Biglietti, Michela; Bilokon, Halina; Bindi, Marcello; Binet, Sebastien; Bingul, Ahmet; Bini, Cesare; Biscarat, Catherine; Bitenc, Urban; Black, Kevin; Blair, Robert; Blanchard, Jean-Baptiste; Blanchot, Georges; Blazek, Tomas; Blocker, Craig; Blocki, Jacek; Blondel, Alain; Blum, Walter; Blumenschein, Ulrike; Bobbink, Gerjan; Bobrovnikov, Victor; Bocchetta, Simona Serena; Bocci, Andrea; Boddy, Christopher Richard; Boehler, Michael; Boek, Jennifer; Boelaert, Nele; Bogaerts, Joannes Andreas; Bogdanchikov, Alexander; Bogouch, Andrei; Bohm, Christian; Boisvert, Veronique; Bold, Tomasz; Boldea, Venera; Bolnet, Nayanka Myriam; Bona, Marcella; Bondarenko, Valery; Bondioli, Mario; Boonekamp, Maarten; Booth, Chris; Bordoni, Stefania; Borer, Claudia; Borisov, Anatoly; Borissov, Guennadi; Borjanovic, Iris; Borri, Marcello; Borroni, Sara; Bortolotto, Valerio; Bos, Kors; Boscherini, Davide; Bosman, Martine; Boterenbrood, Hendrik; Botterill, David; Bouchami, Jihene; Boudreau, Joseph; Bouhova-Thacker, Evelina Vassileva; Boumediene, Djamel Eddine; Bourdarios, Claire; Bousson, Nicolas; Boveia, Antonio; Boyd, James; Boyko, Igor; Bozhko, Nikolay; Bozovic-Jelisavcic, Ivanka; Bracinik, Juraj; Braem, André; Branchini, Paolo; Brandenburg, George; Brandt, Andrew; Brandt, Gerhard; Brandt, Oleg; Bratzler, Uwe; Brau, Benjamin; Brau, James; Braun, Helmut; Brelier, Bertrand; Bremer, Johan; Brenner, Richard; Bressler, Shikma; Britton, Dave; Brochu, Frederic; Brock, Ian; Brock, Raymond; Brodbeck, Timothy; Brodet, Eyal; Broggi, Francesco; Bromberg, Carl; Bronner, Johanna; Brooijmans, Gustaaf; Brooks, William; Brown, Gareth; Brown, Heather; Bruckman de Renstrom, Pawel; Bruncko, Dusan; Bruneliere, Renaud; Brunet, Sylvie; Bruni, Alessia; Bruni, Graziano; Bruschi, Marco; Buanes, Trygve; Buat, Quentin; Bucci, Francesca; Buchanan, James; Buchanan, Norman; Buchholz, Peter; Buckingham, Ryan; Buckley, Andrew; Buda, Stelian Ioan; Budagov, Ioulian; Budick, Burton; Büscher, Volker; Bugge, Lars; Bulekov, Oleg; Bunse, Moritz; Buran, Torleiv; Burckhart, Helfried; Burdin, Sergey; Burgard, Carsten Daniel; Burgess, Thomas; Burke, Stephen; Busato, Emmanuel; Bussey, Peter; Buszello, Claus-Peter; Butin, François; Butler, Bart; Butler, John; Buttar, Craig; Butterworth, Jonathan; Buttinger, William; Cabrera Urbán, Susana; Caforio, Davide; Cakir, Orhan; Calafiura, Paolo; Calderini, Giovanni; Calfayan, Philippe; Calkins, Robert; Caloba, Luiz; Caloi, Rita; Calvet, David; Calvet, Samuel; Camacho Toro, Reina; Camarri, Paolo; Cambiaghi, Mario; Cameron, David; Caminada, Lea Michaela; Campana, Simone; Campanelli, Mario; Canale, Vincenzo; Canelli, Florencia; Canepa, Anadi; Cantero, Josu; Capasso, Luciano; 
Capeans Garrido, Maria Del Mar; Caprini, Irinel; Caprini, Mihai; Capriotti, Daniele; Capua, Marcella; Caputo, Regina; Caramarcu, Costin; Cardarelli, Roberto; Carli, Tancredi; Carlino, Gianpaolo; Carminati, Leonardo; Caron, Bryan; Caron, Sascha; Carrillo Montoya, German D; Carter, Antony; Carter, Janet; Carvalho, João; Casadei, Diego; Casado, Maria Pilar; Cascella, Michele; Caso, Carlo; Castaneda Hernandez, Alfredo Martin; Castaneda-Miranda, Elizabeth; Castillo Gimenez, Victoria; Castro, Nuno Filipe; Cataldi, Gabriella; Cataneo, Fernando; Catinaccio, Andrea; Catmore, James; Cattai, Ariella; Cattani, Giordano; Caughron, Seth; Cauz, Diego; Cavalleri, Pietro; Cavalli, Donatella; Cavalli-Sforza, Matteo; Cavasinni, Vincenzo; Ceradini, Filippo; Santiago Cerqueira, Augusto; Cerri, Alessandro; Cerrito, Lucio; Cerutti, Fabio; Cetin, Serkant Ali; Cevenini, Francesco; Chafaq, Aziz; Chakraborty, Dhiman; Chan, Kevin; Chapleau, Bertrand; Chapman, John Derek; Chapman, John Wehrley; Chareyre, Eve; Charlton, Dave; Chavda, Vikash; Chavez Barajas, Carlos Alberto; Cheatham, Susan; Chekanov, Sergei; Chekulaev, Sergey; Chelkov, Gueorgui; Chelstowska, Magda Anna; Chen, Chunhui; Chen, Hucheng; Chen, Shenjian; Chen, Tingyang; Chen, Xin; Cheng, Shaochen; Cheplakov, Alexander; Chepurnov, Vladimir; Cherkaoui El Moursli, Rajaa; Chernyatin, Valeriy; Cheu, Elliott; Cheung, Sing-Leung; Chevalier, Laurent; Chiefari, Giovanni; Chikovani, Leila; Childers, John Taylor; Chilingarov, Alexandre; Chiodini, Gabriele; Chisholm, Andrew; Chizhov, Mihail; Choudalakis, Georgios; Chouridou, Sofia; Christidi, Illectra-Athanasia; Christov, Asen; Chromek-Burckhart, Doris; Chu, Ming-Lee; Chudoba, Jiri; Ciapetti, Guido; Ciba, Krzysztof; Ciftci, Abbas Kenan; Ciftci, Rena; Cinca, Diane; Cindro, Vladimir; Ciobotaru, Matei Dan; Ciocca, Claudia; Ciocio, Alessandra; Cirilli, Manuela; Citterio, Mauro; Ciubancan, Mihai; Clark, Allan G; Clark, Philip James; Cleland, Bill; Clemens, Jean-Claude; Clement, Benoit; Clement, Christophe; Clifft, Roger; Coadou, Yann; Cobal, Marina; Coccaro, Andrea; Cochran, James H; Coe, Paul; Cogan, Joshua Godfrey; Coggeshall, James; Cogneras, Eric; Colas, Jacques; Colijn, Auke-Pieter; Collard, Caroline; Collins, Neil; Collins-Tooth, Christopher; Collot, Johann; Colon, German; Conde Muiño, Patricia; Coniavitis, Elias; Conidi, Maria Chiara; Consonni, Michele; Consorti, Valerio; Constantinescu, Serban; Conta, Claudio; Conventi, Francesco; Cook, James; Cooke, Mark; Cooper, Ben; Cooper-Sarkar, Amanda; Copic, Katherine; Cornelissen, Thijs; Corradi, Massimo; Corriveau, Francois; Cortes-Gonzalez, Arely; Cortiana, Giorgio; Costa, Giuseppe; Costa, María José; Costanzo, Davide; Costin, Tudor; Côté, David; Coura Torres, Rodrigo; Courneyea, Lorraine; Cowan, Glen; Cowden, Christopher; Cox, Brian; Cranmer, Kyle; Crescioli, Francesco; Cristinziani, Markus; Crosetti, Giovanni; Crupi, Roberto; Crépé-Renaudin, Sabine; Cuciuc, Constantin-Mihai; Cuenca Almenar, Cristóbal; Cuhadar Donszelmann, Tulay; Curatolo, Maria; Curtis, Chris; Cuthbert, Cameron; Cwetanski, Peter; Czirr, Hendrik; Czodrowski, Patrick; Czyczula, Zofia; D'Auria, Saverio; D'Onofrio, Monica; D'Orazio, Alessia; Da Silva, Paulo Vitor; Da Via, Cinzia; Dabrowski, Wladyslaw; Dai, Tiesheng; Dallapiccola, Carlo; Dam, Mogens; Dameri, Mauro; Damiani, Daniel; Danielsson, Hans Olof; Dannheim, Dominik; Dao, Valerio; Darbo, Giovanni; Darlea, Georgiana Lavinia; Davey, Will; Davidek, Tomas; Davidson, Nadia; Davidson, Ruth; Davies, Eleanor; Davies, Merlin; Davison, Adam; Davygora, Yuriy; Dawe, 
Edmund; Dawson, Ian; Dawson, John; Daya, Rozmin; De, Kaushik; de Asmundis, Riccardo; De Castro, Stefano; De Castro Faria Salgado, Pedro; De Cecco, Sandro; de Graat, Julien; De Groot, Nicolo; de Jong, Paul; De La Taille, Christophe; De la Torre, Hector; De Lotto, Barbara; de Mora, Lee; De Nooij, Lucie; De Pedis, Daniele; De Salvo, Alessandro; De Sanctis, Umberto; De Santo, Antonella; De Vivie De Regie, Jean-Baptiste; Dean, Simon; Dearnaley, William James; Debbe, Ramiro; Debenedetti, Chiara; Dedovich, Dmitri; Degenhardt, James; Dehchar, Mohamed; Del Papa, Carlo; Del Peso, Jose; Del Prete, Tarcisio; Delemontex, Thomas; Deliyergiyev, Maksym; Dell'Acqua, Andrea; Dell'Asta, Lidia; Della Pietra, Massimo; della Volpe, Domenico; Delmastro, Marco; Delruelle, Nicolas; Delsart, Pierre-Antoine; Deluca, Carolina; Demers, Sarah; Demichev, Mikhail; Demirkoz, Bilge; Deng, Jianrong; Denisov, Sergey; Derendarz, Dominik; Derkaoui, Jamal Eddine; Derue, Frederic; Dervan, Paul; Desch, Klaus Kurt; Devetak, Erik; Deviveiros, Pier-Olivier; Dewhurst, Alastair; DeWilde, Burton; Dhaliwal, Saminder; Dhullipudi, Ramasudhakar; Di Ciaccio, Anna; Di Ciaccio, Lucia; Di Girolamo, Alessandro; Di Girolamo, Beniamino; Di Luise, Silvestro; Di Mattia, Alessandro; Di Micco, Biagio; Di Nardo, Roberto; Di Simone, Andrea; Di Sipio, Riccardo; Diaz, Marco Aurelio; Diblen, Faruk; Diehl, Edward; Dietrich, Janet; Dietzsch, Thorsten; Diglio, Sara; Dindar Yagci, Kamile; Dingfelder, Jochen; Dionisi, Carlo; Dita, Petre; Dita, Sanda; Dittus, Fridolin; Djama, Fares; Djobava, Tamar; Barros do Vale, Maria Aline; Do Valle Wemans, André; Doan, Thi Kieu Oanh; Dobbs, Matt; Dobinson, Robert; Dobos, Daniel; Dobson, Ellie; Dobson, Marc; Dodd, Jeremy; Doglioni, Caterina; Doherty, Tom; Doi, Yoshikuni; Dolejsi, Jiri; Dolenc, Irena; Dolezal, Zdenek; Dolgoshein, Boris; Dohmae, Takeshi; Donadelli, Marisilvia; Donega, Mauro; Donini, Julien; Dopke, Jens; Doria, Alessandra; Dos Anjos, Andre; Dosil, Mireia; Dotti, Andrea; Dova, Maria-Teresa; Dowell, John; Doxiadis, Alexander; Doyle, Tony; Drasal, Zbynek; Drees, Jürgen; Dressnandt, Nandor; Drevermann, Hans; Driouichi, Chafik; Dris, Manolis; Dubbert, Jörg; Dube, Sourabh; Duchovni, Ehud; Duckeck, Guenter; Dudarev, Alexey; Dudziak, Fanny; Dührssen, Michael; Duerdoth, Ian; Duflot, Laurent; Dufour, Marc-Andre; Dunford, Monica; Duran Yildiz, Hatice; Duxfield, Robert; Dwuznik, Michal; Dydak, Friedrich; Düren, Michael; Ebenstein, William; Ebke, Johannes; Eckweiler, Sebastian; Edmonds, Keith; Edwards, Clive; Edwards, Nicholas Charles; Ehrenfeld, Wolfgang; Ehrich, Thies; Eifert, Till; Eigen, Gerald; Einsweiler, Kevin; Eisenhandler, Eric; Ekelof, Tord; El Kacimi, Mohamed; Ellert, Mattias; Elles, Sabine; Ellinghaus, Frank; Ellis, Katherine; Ellis, Nicolas; Elmsheuser, Johannes; Elsing, Markus; Emeliyanov, Dmitry; Engelmann, Roderich; Engl, Albert; Epp, Brigitte; Eppig, Andrew; Erdmann, Johannes; Ereditato, Antonio; Eriksson, Daniel; Ernst, Jesse; Ernst, Michael; Ernwein, Jean; Errede, Deborah; Errede, Steven; Ertel, Eugen; Escalier, Marc; Escobar, Carlos; Espinal Curull, Xavier; Esposito, Bellisario; Etienne, Francois; Etienvre, Anne-Isabelle; Etzion, Erez; Evangelakou, Despoina; Evans, Hal; Fabbri, Laura; Fabre, Caroline; Fakhrutdinov, Rinat; Falciano, Speranza; Fang, Yaquan; Fanti, Marcello; Farbin, Amir; Farilla, Addolorata; Farley, Jason; Farooque, Trisha; Farrington, Sinead; Farthouat, Philippe; Fassnacht, Patrick; Fassouliotis, Dimitrios; Fatholahzadeh, Baharak; Favareto, Andrea; Fayard, Louis; Fazio, Salvatore; 
Febbraro, Renato; Federic, Pavol; Fedin, Oleg; Fedorko, Woiciech; Fehling-Kaschek, Mirjam; Feligioni, Lorenzo; Fellmann, Denis; Feng, Cunfeng; Feng, Eric; Fenyuk, Alexander; Ferencei, Jozef; Ferland, Jonathan; Fernando, Waruna; Ferrag, Samir; Ferrando, James; Ferrara, Valentina; Ferrari, Arnaud; Ferrari, Pamela; Ferrari, Roberto; Ferreira de Lima, Danilo Enoque; Ferrer, Antonio; Ferrer, Maria Lorenza; Ferrere, Didier; Ferretti, Claudio; Ferretto Parodi, Andrea; Fiascaris, Maria; Fiedler, Frank; Filipčič, Andrej; Filippas, Anastasios; Filthaut, Frank; Fincke-Keeler, Margret; Fiolhais, Miguel; Fiorini, Luca; Firan, Ana; Fischer, Gordon; Fischer, Peter; Fisher, Matthew; Flechl, Martin; Fleck, Ivor; Fleckner, Johanna; Fleischmann, Philipp; Fleischmann, Sebastian; Flick, Tobias; Floderus, Anders; Flores Castillo, Luis; Flowerdew, Michael; Fokitis, Manolis; Fonseca Martin, Teresa; Forbush, David Alan; Formica, Andrea; Forti, Alessandra; Fortin, Dominique; Foster, Joe; Fournier, Daniel; Foussat, Arnaud; Fowler, Andrew; Fowler, Ken; Fox, Harald; Francavilla, Paolo; Franchino, Silvia; Francis, David; Frank, Tal; Franklin, Melissa; Franz, Sebastien; Fraternali, Marco; Fratina, Sasa; French, Sky; Friedrich, Felix; Froeschl, Robert; Froidevaux, Daniel; Frost, James; Fukunaga, Chikara; Fullana Torregrosa, Esteban; Fuster, Juan; Gabaldon, Carolina; Gabizon, Ofir; Gadfort, Thomas; Gadomski, Szymon; Gagliardi, Guido; Gagnon, Pauline; Galea, Cristina; Gallas, Elizabeth; Gallo, Valentina Santina; Gallop, Bruce; Gallus, Petr; Gan, KK; Gao, Yongsheng; Gapienko, Vladimir; Gaponenko, Andrei; Garberson, Ford; Garcia-Sciveres, Maurice; García, Carmen; García Navarro, José Enrique; Gardner, Robert; Garelli, Nicoletta; Garitaonandia, Hegoi; Garonne, Vincent; Garvey, John; Gatti, Claudio; Gaudio, Gabriella; Gaur, Bakul; Gauthier, Lea; Gavrilenko, Igor; Gay, Colin; Gaycken, Goetz; Gayde, Jean-Christophe; Gazis, Evangelos; Ge, Peng; Gee, Norman; Geerts, Daniël Alphonsus Adrianus; Geich-Gimbel, Christoph; Gellerstedt, Karl; Gemme, Claudia; Gemmell, Alistair; Genest, Marie-Hélène; Gentile, Simonetta; George, Matthias; George, Simon; Gerlach, Peter; Gershon, Avi; Geweniger, Christoph; Ghazlane, Hamid; Ghodbane, Nabil; Giacobbe, Benedetto; Giagu, Stefano; Giakoumopoulou, Victoria; Giangiobbe, Vincent; Gianotti, Fabiola; Gibbard, Bruce; Gibson, Adam; Gibson, Stephen; Gilbert, Laura; Gilewsky, Valentin; Gillberg, Dag; Gillman, Tony; Gingrich, Douglas; Ginzburg, Jonatan; Giokaris, Nikos; Giordani, MarioPaolo; Giordano, Raffaele; Giorgi, Francesco Michelangelo; Giovannini, Paola; Giraud, Pierre-Francois; Giugni, Danilo; Giunta, Michele; Giusti, Paolo; Gjelsten, Børge Kile; Gladilin, Leonid; Glasman, Claudia; Glatzer, Julian; Glazov, Alexandre; Glitza, Karl-Walter; Glonti, George; Goddard, Jack Robert; Godfrey, Jennifer; Godlewski, Jan; Goebel, Martin; Göpfert, Thomas; Goeringer, Christian; Gössling, Claus; Göttfert, Tobias; Goldfarb, Steven; Golling, Tobias; Gomes, Agostinho; Gomez Fajardo, Luz Stella; Gonçalo, Ricardo; Goncalves Pinto Firmino Da Costa, Joao; Gonella, Laura; Gonidec, Allain; Gonzalez, Saul; González de la Hoz, Santiago; Gonzalez Parra, Garoe; Gonzalez Silva, Laura; Gonzalez-Sevilla, Sergio; Goodson, Jeremiah Jet; Goossens, Luc; Gorbounov, Petr Andreevich; Gordon, Howard; Gorelov, Igor; Gorfine, Grant; Gorini, Benedetto; Gorini, Edoardo; Gorišek, Andrej; Gornicki, Edward; Gorokhov, Serguei; Goryachev, Vladimir; Gosdzik, Bjoern; Gosselink, Martijn; Gostkin, Mikhail Ivanovitch; Gough Eschrich, Ivo; Gouighri, 
Mohamed; Goujdami, Driss; Goulette, Marc Phillippe; Goussiou, Anna; Goy, Corinne; Gozpinar, Serdar; Grabowska-Bold, Iwona; Grafström, Per; Grahn, Karl-Johan; Grancagnolo, Francesco; Grancagnolo, Sergio; Grassi, Valerio; Gratchev, Vadim; Grau, Nathan; Gray, Heather; Gray, Julia Ann; Graziani, Enrico; Grebenyuk, Oleg; Greenshaw, Timothy; Greenwood, Zeno Dixon; Gregersen, Kristian; Gregor, Ingrid-Maria; Grenier, Philippe; Griffiths, Justin; Grigalashvili, Nugzar; Grillo, Alexander; Grinstein, Sebastian; Grishkevich, Yaroslav; Grivaz, Jean-Francois; Groh, Manfred; Gross, Eilam; Grosse-Knetter, Joern; Groth-Jensen, Jacob; Grybel, Kai; Guarino, Victor; Guest, Daniel; Guicheney, Christophe; Guida, Angelo; Guindon, Stefan; Guler, Hulya; Gunther, Jaroslav; Guo, Bin; Guo, Jun; Gupta, Ambreesh; Gusakov, Yury; Gushchin, Vladimir; Gutierrez, Phillip; Guttman, Nir; Gutzwiller, Olivier; Guyot, Claude; Gwenlan, Claire; Gwilliam, Carl; Haas, Andy; Haas, Stefan; Haber, Carl; Hackenburg, Robert; Hadavand, Haleh Khani; Hadley, David; Haefner, Petra; Hahn, Ferdinand; Haider, Stefan; Hajduk, Zbigniew; Hakobyan, Hrachya; Hall, David; Haller, Johannes; Hamacher, Klaus; Hamal, Petr; Hamer, Matthias; Hamilton, Andrew; Hamilton, Samuel; Han, Hongguang; Han, Liang; Hanagaki, Kazunori; Hanawa, Keita; Hance, Michael; Handel, Carsten; Hanke, Paul; Hansen, John Renner; Hansen, Jørgen Beck; Hansen, Jorn Dines; Hansen, Peter Henrik; Hansson, Per; Hara, Kazuhiko; Hare, Gabriel; Harenberg, Torsten; Harkusha, Siarhei; Harper, Devin; Harrington, Robert; Harris, Orin; Harrison, Karl; Hartert, Jochen; Hartjes, Fred; Haruyama, Tomiyoshi; Harvey, Alex; Hasegawa, Satoshi; Hasegawa, Yoji; Hassani, Samira; Hatch, Mark; Hauff, Dieter; Haug, Sigve; Hauschild, Michael; Hauser, Reiner; Havranek, Miroslav; Hawes, Brian; Hawkes, Christopher; Hawkings, Richard John; Hawkins, Anthony David; Hawkins, Donovan; Hayakawa, Takashi; Hayashi, Takayasu; Hayden, Daniel; Hayward, Helen; Haywood, Stephen; Hazen, Eric; He, Mao; Head, Simon; Hedberg, Vincent; Heelan, Louise; Heim, Sarah; Heinemann, Beate; Heisterkamp, Simon; Helary, Louis; Heller, Claudio; Heller, Matthieu; Hellman, Sten; Hellmich, Dennis; Helsens, Clement; Henderson, Robert; Henke, Michael; Henrichs, Anna; Henriques Correia, Ana Maria; Henrot-Versille, Sophie; Henry-Couannier, Frédéric; Hensel, Carsten; Henß, Tobias; Medina Hernandez, Carlos; Hernández Jiménez, Yesenia; Herrberg, Ruth; Hershenhorn, Alon David; Herten, Gregor; Hertenberger, Ralf; Hervas, Luis; Hesketh, Gavin Grant; Hessey, Nigel; Higón-Rodriguez, Emilio; Hill, Daniel; Hill, John; Hill, Norman; Hiller, Karl Heinz; Hillert, Sonja; Hillier, Stephen; Hinchliffe, Ian; Hines, Elizabeth; Hirose, Minoru; Hirsch, Florian; Hirschbuehl, Dominic; Hobbs, John; Hod, Noam; Hodgkinson, Mark; Hodgson, Paul; Hoecker, Andreas; Hoeferkamp, Martin; Hoffman, Julia; Hoffmann, Dirk; Hohlfeld, Marc; Holder, Martin; Holmgren, Sven-Olof; Holy, Tomas; Holzbauer, Jenny; Homma, Yasuhiro; Hong, Tae Min; Hooft van Huysduynen, Loek; Horazdovsky, Tomas; Horn, Claus; Horner, Stephan; Hostachy, Jean-Yves; Hou, Suen; Houlden, Michael; Hoummada, Abdeslam; Howarth, James; Howell, David; Hristova, Ivana; Hrivnac, Julius; Hruska, Ivan; Hryn'ova, Tetiana; Hsu, Pai-hsien Jennifer; Hsu, Shih-Chieh; Huang, Guang Shun; Hubacek, Zdenek; Hubaut, Fabrice; Huegging, Fabian; Huettmann, Antje; Huffman, Todd Brian; Hughes, Emlyn; Hughes, Gareth; Hughes-Jones, Richard; Huhtinen, Mika; Hurst, Peter; Hurwitz, Martina; Husemann, Ulrich; Huseynov, Nazim; Huston, Joey; Huth, 
John; Iacobucci, Giuseppe; Iakovidis, Georgios; Ibbotson, Michael; Ibragimov, Iskander; Ichimiya, Ryo; Iconomidou-Fayard, Lydia; Idarraga, John; Iengo, Paolo; Igonkina, Olga; Ikegami, Yoichi; Ikeno, Masahiro; Ilchenko, Yuri; Iliadis, Dimitrios; Ilic, Nikolina; Imori, Masatoshi; Ince, Tayfun; Inigo-Golfin, Joaquin; Ioannou, Pavlos; Iodice, Mauro; Ippolito, Valerio; Irles Quiles, Adrian; Isaksson, Charlie; Ishikawa, Akimasa; Ishino, Masaya; Ishmukhametov, Renat; Issever, Cigdem; Istin, Serhat; Ivashin, Anton; Iwanski, Wieslaw; Iwasaki, Hiroyuki; Izen, Joseph; Izzo, Vincenzo; Jackson, Brett; Jackson, John; Jackson, Paul; Jaekel, Martin; Jain, Vivek; Jakobs, Karl; Jakobsen, Sune; Jakubek, Jan; Jana, Dilip; Jankowski, Ernest; Jansen, Eric; Jansen, Hendrik; Jantsch, Andreas; Janus, Michel; Jarlskog, Göran; Jeanty, Laura; Jelen, Kazimierz; Jen-La Plante, Imai; Jenni, Peter; Jeremie, Andrea; Jež, Pavel; Jézéquel, Stéphane; Jha, Manoj Kumar; Ji, Haoshuang; Ji, Weina; Jia, Jiangyong; Jiang, Yi; Jimenez Belenguer, Marcos; Jin, Ge; Jin, Shan; Jinnouchi, Osamu; Joergensen, Morten Dam; Joffe, David; Johansen, Lars; Johansen, Marianne; Johansson, Erik; Johansson, Per; Johnert, Sebastian; Johns, Kenneth; Jon-And, Kerstin; Jones, Graham; Jones, Roger; Jones, Tegid; Jones, Tim; Jonsson, Ove; Joram, Christian; Jorge, Pedro; Joseph, John; Jovicevic, Jelena; Jovin, Tatjana; Ju, Xiangyang; Jung, Christian; Jungst, Ralph Markus; Juranek, Vojtech; Jussel, Patrick; Juste Rozas, Aurelio; Kabachenko, Vasily; Kabana, Sonja; Kaci, Mohammed; Kaczmarska, Anna; Kadlecik, Peter; Kado, Marumi; Kagan, Harris; Kagan, Michael; Kaiser, Steffen; Kajomovitz, Enrique; Kalinin, Sergey; Kalinovskaya, Lidia; Kama, Sami; Kanaya, Naoko; Kaneda, Michiru; Kaneti, Steven; Kanno, Takayuki; Kantserov, Vadim; Kanzaki, Junichi; Kaplan, Benjamin; Kapliy, Anton; Kaplon, Jan; Kar, Deepak; Karagoz, Muge; Karnevskiy, Mikhail; Karr, Kristo; Kartvelishvili, Vakhtang; Karyukhin, Andrey; Kashif, Lashkar; Kasieczka, Gregor; Kasmi, Azzedine; Kass, Richard; Kastanas, Alex; Kataoka, Mayuko; Kataoka, Yousuke; Katsoufis, Elias; Katzy, Judith; Kaushik, Venkatesh; Kawagoe, Kiyotomo; Kawamoto, Tatsuo; Kawamura, Gen; Kayl, Manuel; Kazanin, Vassili; Kazarinov, Makhail; Keeler, Richard; Kehoe, Robert; Keil, Markus; Kekelidze, George; Kennedy, John; Kenney, Christopher John; Kenyon, Mike; Kepka, Oldrich; Kerschen, Nicolas; Kerševan, Borut Paul; Kersten, Susanne; Kessoku, Kohei; Keung, Justin; Khakzad, Mohsen; Khalil-zada, Farkhad; Khandanyan, Hovhannes; Khanov, Alexander; Kharchenko, Dmitri; Khodinov, Alexander; Kholodenko, Anatoli; Khomich, Andrei; Khoo, Teng Jian; Khoriauli, Gia; Khoroshilov, Andrey; Khovanskiy, Nikolai; Khovanskiy, Valery; Khramov, Evgeniy; Khubua, Jemal; Kim, Hyeon Jin; Kim, Min Suk; Kim, Shinhong; Kimura, Naoki; Kind, Oliver; King, Barry; King, Matthew; King, Robert Steven Beaufoy; Kirk, Julie; Kirsch, Lawrence; Kiryunin, Andrey; Kishimoto, Tomoe; Kisielewska, Danuta; Kittelmann, Thomas; Kiver, Andrey; Kladiva, Eduard; Klaiber-Lodewigs, Jonas; Klein, Max; Klein, Uta; Kleinknecht, Konrad; Klemetti, Miika; Klier, Amit; Klimek, Pawel; Klimentov, Alexei; Klingenberg, Reiner; Klinger, Joel Alexander; Klinkby, Esben; Klioutchnikova, Tatiana; Klok, Peter; Klous, Sander; Kluge, Eike-Erik; Kluge, Thomas; Kluit, Peter; Kluth, Stefan; Knecht, Neil; Kneringer, Emmerich; Knobloch, Juergen; Knoops, Edith; Knue, Andrea; Ko, Byeong Rok; Kobayashi, Tomio; Kobel, Michael; Kocian, Martin; Kodys, Peter; Köneke, Karsten; König, Adriaan; Koenig, Sebastian; Köpke, 
Lutz; Koetsveld, Folkert; Koevesarki, Peter; Koffas, Thomas; Koffeman, Els; Kogan, Lucy Anne; Kohn, Fabian; Kohout, Zdenek; Kohriki, Takashi; Koi, Tatsumi; Kokott, Thomas; Kolachev, Guennady; Kolanoski, Hermann; Kolesnikov, Vladimir; Koletsou, Iro; Koll, James; Kollefrath, Michael; Kolya, Scott; Komar, Aston; Komori, Yuto; Kondo, Takahiko; Kono, Takanori; Kononov, Anatoly; Konoplich, Rostislav; Konstantinidis, Nikolaos; Kootz, Andreas; Koperny, Stefan; Korcyl, Krzysztof; Kordas, Kostantinos; Koreshev, Victor; Korn, Andreas; Korol, Aleksandr; Korolkov, Ilya; Korolkova, Elena; Korotkov, Vladislav; Kortner, Oliver; Kortner, Sandra; Kostyukhin, Vadim; Kotamäki, Miikka Juhani; Kotov, Sergey; Kotov, Vladislav; Kotwal, Ashutosh; Kourkoumelis, Christine; Kouskoura, Vasiliki; Koutsman, Alex; Kowalewski, Robert Victor; Kowalski, Tadeusz; Kozanecki, Witold; Kozhin, Anatoly; Kral, Vlastimil; Kramarenko, Viktor; Kramberger, Gregor; Krasny, Mieczyslaw Witold; Krasznahorkay, Attila; Kraus, James; Kraus, Jana; Kreisel, Arik; Krejci, Frantisek; Kretzschmar, Jan; Krieger, Nina; Krieger, Peter; Kroeninger, Kevin; Kroha, Hubert; Kroll, Joe; Kroseberg, Juergen; Krstic, Jelena; Kruchonak, Uladzimir; Krüger, Hans; Kruker, Tobias; Krumnack, Nils; Krumshteyn, Zinovii; Kruth, Andre; Kubota, Takashi; Kuday, Sinan; Kuehn, Susanne; Kugel, Andreas; Kuhl, Thorsten; Kuhn, Dietmar; Kukhtin, Victor; Kulchitsky, Yuri; Kuleshov, Sergey; Kummer, Christian; Kuna, Marine; Kundu, Nikhil; Kunkle, Joshua; Kupco, Alexander; Kurashige, Hisaya; Kurata, Masakazu; Kurochkin, Yurii; Kus, Vlastimil; Kuwertz, Emma Sian; Kuze, Masahiro; Kvita, Jiri; Kwee, Regina; La Rosa, Alessandro; La Rotonda, Laura; Labarga, Luis; Labbe, Julien; Lablak, Said; Lacasta, Carlos; Lacava, Francesco; Lacker, Heiko; Lacour, Didier; Lacuesta, Vicente Ramón; Ladygin, Evgueni; Lafaye, Remi; Laforge, Bertrand; Lagouri, Theodota; Lai, Stanley; Laisne, Emmanuel; Lamanna, Massimo; Lampen, Caleb; Lampl, Walter; Lancon, Eric; Landgraf, Ulrich; Landon, Murrough; Lane, Jenna; Lange, Clemens; Lankford, Andrew; Lanni, Francesco; Lantzsch, Kerstin; Laplace, Sandrine; Lapoire, Cecile; Laporte, Jean-Francois; Lari, Tommaso; Larionov, Anatoly; Larner, Aimee; Lasseur, Christian; Lassnig, Mario; Laurelli, Paolo; Lavorini, Vincenzo; Lavrijsen, Wim; Laycock, Paul; Lazarev, Alexandre; Le Dortz, Olivier; Le Guirriec, Emmanuel; Le Maner, Christophe; Le Menedeu, Eve; Lebel, Céline; LeCompte, Thomas; Ledroit-Guillon, Fabienne Agnes Marie; Lee, Hurng-Chun; Lee, Jason; Lee, Shih-Chang; Lee, Lawrence; Lefebvre, Michel; Legendre, Marie; Leger, Annie; LeGeyt, Benjamin; Legger, Federica; Leggett, Charles; Lehmacher, Marc; Lehmann Miotto, Giovanna; Lei, Xiaowen; Leite, Marco Aurelio Lisboa; Leitner, Rupert; Lellouch, Daniel; Leltchouk, Mikhail; Lemmer, Boris; Lendermann, Victor; Leney, Katharine; Lenz, Tatiana; Lenzen, Georg; Lenzi, Bruno; Leonhardt, Kathrin; Leontsinis, Stefanos; Leroy, Claude; Lessard, Jean-Raphael; Lesser, Jonas; Lester, Christopher; Leung Fook Cheong, Annabelle; Levêque, Jessica; Levin, Daniel; Levinson, Lorne; Levitski, Mikhail; Lewis, Adrian; Lewis, George; Leyko, Agnieszka; Leyton, Michael; Li, Bo; Li, Haifeng; Li, Shu; Li, Xuefei; Liang, Zhijun; Liao, Hongbo; Liberti, Barbara; Lichard, Peter; Lichtnecker, Markus; Lie, Ki; Liebig, Wolfgang; Lifshitz, Ronen; Lilley, Joseph; Limbach, Christian; Limosani, Antonio; Limper, Maaike; Lin, Simon; Linde, Frank; Linnemann, James; Lipeles, Elliot; Lipinsky, Lukas; Lipniacka, Anna; Liss, Tony; Lissauer, David; Lister, Alison; 
Litke, Alan; Liu, Chuanlei; Liu, Dong; Liu, Hao; Liu, Jianbei; Liu, Minghui; Liu, Yanwen; Livan, Michele; Livermore, Sarah; Lleres, Annick; Llorente Merino, Javier; Lloyd, Stephen; Lobodzinska, Ewelina; Loch, Peter; Lockman, William; Loddenkoetter, Thomas; Loebinger, Fred; Loginov, Andrey; Loh, Chang Wei; Lohse, Thomas; Lohwasser, Kristin; Lokajicek, Milos; Loken, James; Lombardo, Vincenzo Paolo; Long, Robin Eamonn; Lopes, Lourenco; Lopez Mateos, David; Lorenz, Jeanette; Lorenzo Martinez, Narei; Losada, Marta; Loscutoff, Peter; Lo Sterzo, Francesco; Losty, Michael; Lou, Xinchou; Lounis, Abdenour; Loureiro, Karina; Love, Jeremy; Love, Peter; Lowe, Andrew; Lu, Feng; Lubatti, Henry; Luci, Claudio; Lucotte, Arnaud; Ludwig, Andreas; Ludwig, Dörthe; Ludwig, Inga; Ludwig, Jens; Luehring, Frederick; Luijckx, Guy; Lumb, Debra; Luminari, Lamberto; Lund, Esben; Lund-Jensen, Bengt; Lundberg, Björn; Lundberg, Johan; Lundquist, Johan; Lungwitz, Matthias; Lutz, Gerhard; Lynn, David; Lys, Jeremy; Lytken, Else; Ma, Hong; Ma, Lian Liang; Macana Goia, Jorge Andres; Maccarrone, Giovanni; Macchiolo, Anna; Maček, Boštjan; Machado Miguens, Joana; Mackeprang, Rasmus; Madaras, Ronald; Mader, Wolfgang; Maenner, Reinhard; Maeno, Tadashi; Mättig, Peter; Mättig, Stefan; Magnoni, Luca; Magradze, Erekle; Mahalalel, Yair; Mahboubi, Kambiz; Mahout, Gilles; Maiani, Camilla; Maidantchik, Carmen; Maio, Amélia; Majewski, Stephanie; Makida, Yasuhiro; Makovec, Nikola; Mal, Prolay; Malaescu, Bogdan; Malecki, Pawel; Malecki, Piotr; Maleev, Victor; Malek, Fairouz; Mallik, Usha; Malon, David; Malone, Caitlin; Maltezos, Stavros; Malyshev, Vladimir; Malyukov, Sergei; Mameghani, Raphael; Mamuzic, Judita; Manabe, Atsushi; Mandelli, Luciano; Mandić, Igor; Mandrysch, Rocco; Maneira, José; Mangeard, Pierre-Simon; Manhaes de Andrade Filho, Luciano; Manjavidze, Ioseb; Mann, Alexander; Manning, Peter; Manousakis-Katsikakis, Arkadios; Mansoulie, Bruno; Manz, Andreas; Mapelli, Alessandro; Mapelli, Livio; March, Luis; Marchand, Jean-Francois; Marchese, Fabrizio; Marchiori, Giovanni; Marcisovsky, Michal; Marin, Alexandru; Marino, Christopher; Marroquim, Fernando; Marshall, Robin; Marshall, Zach; Martens, Kalen; Marti-Garcia, Salvador; Martin, Andrew; Martin, Brian; Martin, Brian Thomas; Martin, Franck Francois; Martin, Jean-Pierre; Martin, Philippe; Martin, Tim; Martin, Victoria Jane; Martin dit Latour, Bertrand; Martin-Haugh, Stewart; Martinez, Mario; Martinez Outschoorn, Verena; Martyniuk, Alex; Marx, Marilyn; Marzano, Francesco; Marzin, Antoine; Masetti, Lucia; Mashimo, Tetsuro; Mashinistov, Ruslan; Masik, Jiri; Maslennikov, Alexey; Massa, Ignazio; Massaro, Graziano; Massol, Nicolas; Mastrandrea, Paolo; Mastroberardino, Anna; Masubuchi, Tatsuya; Mathes, Markus; Matricon, Pierre; Matsumoto, Hiroshi; Matsunaga, Hiroyuki; Matsushita, Takashi; Mattravers, Carly; Maugain, Jean-Marie; Maurer, Julien; Maxfield, Stephen; Maximov, Dmitriy; May, Edward; Mayne, Anna; Mazini, Rachid; Mazur, Michael; Mazzanti, Marcello; Mazzoni, Enrico; Mc Kee, Shawn Patrick; McCarn, Allison; McCarthy, Robert; McCarthy, Tom; McCubbin, Norman; McFarlane, Kenneth; Mcfayden, Josh; McGlone, Helen; Mchedlidze, Gvantsa; McLaren, Robert Andrew; Mclaughlan, Tom; McMahon, Steve; McPherson, Robert; Meade, Andrew; Mechnich, Joerg; Mechtel, Markus; Medinnis, Mike; Meera-Lebbai, Razzak; Meguro, Tatsuma; Mehdiyev, Rashid; Mehlhase, Sascha; Mehta, Andrew; Meier, Karlheinz; Meirose, Bernhard; Melachrinos, Constantinos; Mellado Garcia, Bruce Rafael; Mendoza Navas, Luis; Meng, Zhaoxia; 
Mengarelli, Alberto; Menke, Sven; Menot, Claude; Meoni, Evelin; Mercurio, Kevin Michael; Mermod, Philippe; Merola, Leonardo; Meroni, Chiara; Merritt, Frank; Merritt, Hayes; Messina, Andrea; Metcalfe, Jessica; Mete, Alaettin Serhan; Meyer, Carsten; Meyer, Christopher; Meyer, Jean-Pierre; Meyer, Jochen; Meyer, Joerg; Meyer, Thomas Christian; Meyer, W Thomas; Miao, Jiayuan; Michal, Sebastien; Micu, Liliana; Middleton, Robin; Migas, Sylwia; Mijović, Liza; Mikenberg, Giora; Mikestikova, Marcela; Mikuž, Marko; Miller, David; Miller, Robert; Mills, Bill; Mills, Corrinne; Milov, Alexander; Milstead, David; Milstein, Dmitry; Minaenko, Andrey; Miñano Moya, Mercedes; Minashvili, Irakli; Mincer, Allen; Mindur, Bartosz; Mineev, Mikhail; Ming, Yao; Mir, Lluisa-Maria; Mirabelli, Giovanni; Miralles Verge, Lluis; Misiejuk, Andrzej; Mitrevski, Jovan; Mitrofanov, Gennady; Mitsou, Vasiliki A; Mitsui, Shingo; Miyagawa, Paul; Miyazaki, Kazuki; Mjörnmark, Jan-Ulf; Moa, Torbjoern; Mockett, Paul; Moed, Shulamit; Moeller, Victoria; Mönig, Klaus; Möser, Nicolas; Mohapatra, Soumya; Mohr, Wolfgang; Mohrdieck-Möck, Susanne; Moisseev, Artemy; Moles-Valls, Regina; Molina-Perez, Jorge; Monk, James; Monnier, Emmanuel; Montesano, Simone; Monticelli, Fernando; Monzani, Simone; Moore, Roger; Moorhead, Gareth; Mora Herrera, Clemencia; Moraes, Arthur; Morange, Nicolas; Morel, Julien; Morello, Gianfranco; Moreno, Deywis; Moreno Llácer, María; Morettini, Paolo; Morgenstern, Marcus; Morii, Masahiro; Morin, Jerome; Morley, Anthony Keith; Mornacchi, Giuseppe; Morozov, Sergey; Morris, John; Morvaj, Ljiljana; Moser, Hans-Guenther; Mosidze, Maia; Moss, Josh; Mount, Richard; Mountricha, Eleni; Mouraviev, Sergei; Moyse, Edward; Mudrinic, Mihajlo; Mueller, Felix; Mueller, James; Mueller, Klemens; Müller, Thomas; Mueller, Timo; Muenstermann, Daniel; Muir, Alex; Munwes, Yonathan; Murray, Bill; Mussche, Ido; Musto, Elisa; Myagkov, Alexey; Nadal, Jordi; Nagai, Koichi; Nagano, Kunihiro; Nagarkar, Advait; Nagasaka, Yasushi; Nagel, Martin; Nairz, Armin Michael; Nakahama, Yu; Nakamura, Koji; Nakamura, Tomoaki; Nakano, Itsuo; Nanava, Gizo; Napier, Austin; Narayan, Rohin; Nash, Michael; Nation, Nigel; Nattermann, Till; Naumann, Thomas; Navarro, Gabriela; Neal, Homer; Nebot, Eduardo; Nechaeva, Polina; Neep, Thomas James; Negri, Andrea; Negri, Guido; Nektarijevic, Snezana; Nelson, Andrew; Nelson, Silke; Nelson, Timothy Knight; Nemecek, Stanislav; Nemethy, Peter; Nepomuceno, Andre Asevedo; Nessi, Marzio; Neubauer, Mark; Neusiedl, Andrea; Neves, Ricardo; Nevski, Pavel; Newman, Paul; Nguyen Thi Hong, Van; Nickerson, Richard; Nicolaidou, Rosy; Nicolas, Ludovic; Nicquevert, Bertrand; Niedercorn, Francois; Nielsen, Jason; Niinikoski, Tapio; Nikiforou, Nikiforos; Nikiforov, Andriy; Nikolaenko, Vladimir; Nikolaev, Kirill; Nikolic-Audit, Irena; Nikolics, Katalin; Nikolopoulos, Konstantinos; Nilsen, Henrik; Nilsson, Paul; Ninomiya, Yoichi; Nisati, Aleandro; Nishiyama, Tomonori; Nisius, Richard; Nodulman, Lawrence; Nomachi, Masaharu; Nomidis, Ioannis; Nordberg, Markus; Nordkvist, Bjoern; Norton, Peter; Novakova, Jana; Nozaki, Mitsuaki; Nozka, Libor; Nugent, Ian Michael; Nuncio-Quiroz, Adriana-Elizabeth; Nunes Hanninger, Guilherme; Nunnemann, Thomas; Nurse, Emily; O'Brien, Brendan Joseph; O'Neale, Steve; O'Neil, Dugan; O'Shea, Val; Oakes, Louise Beth; Oakham, Gerald; Oberlack, Horst; Ocariz, Jose; Ochi, Atsuhiko; Oda, Susumu; Odaka, Shigeru; Odier, Jerome; Ogren, Harold; Oh, Alexander; Oh, Seog; Ohm, Christian; Ohshima, Takayoshi; Ohshita, Hidetoshi; Ohsugi, 
Takashi; Okada, Shogo; Okawa, Hideki; Okumura, Yasuyuki; Okuyama, Toyonobu; Olariu, Albert; Olcese, Marco; Olchevski, Alexander; Olivares Pino, Sebastian Andres; Oliveira, Miguel Alfonso; Oliveira Damazio, Denis; Oliver Garcia, Elena; Olivito, Dominick; Olszewski, Andrzej; Olszowska, Jolanta; Omachi, Chihiro; Onofre, António; Onyisi, Peter; Oram, Christopher; Oreglia, Mark; Oren, Yona; Orestano, Domizia; Orlov, Iliya; Oropeza Barrera, Cristina; Orr, Robert; Osculati, Bianca; Ospanov, Rustem; Osuna, Carlos; Otero y Garzon, Gustavo; Ottersbach, John; Ouchrif, Mohamed; Ouellette, Eric; Ould-Saada, Farid; Ouraou, Ahmimed; Ouyang, Qun; Ovcharova, Ana; Owen, Mark; Owen, Simon; Ozcan, Veysi Erkcan; Ozturk, Nurcan; Pacheco Pages, Andres; Padilla Aranda, Cristobal; Pagan Griso, Simone; Paganis, Efstathios; Paige, Frank; Pais, Preema; Pajchel, Katarina; Palacino, Gabriel; Paleari, Chiara; Palestini, Sandro; Pallin, Dominique; Palma, Alberto; Palmer, Jody; Pan, Yibin; Panagiotopoulou, Evgenia; Panes, Boris; Panikashvili, Natalia; Panitkin, Sergey; Pantea, Dan; Panuskova, Monika; Paolone, Vittorio; Papadelis, Aras; Papadopoulou, Theodora; Paramonov, Alexander; Park, Woochun; Parker, Andy; Parodi, Fabrizio; Parsons, John; Parzefall, Ulrich; Pasqualucci, Enrico; Passaggio, Stefano; Passeri, Antonio; Pastore, Fernanda; Pastore, Francesca; Pásztor, Gabriella; Pataraia, Sophio; Patel, Nikhul; Pater, Joleen; Patricelli, Sergio; Pauly, Thilo; Pecsy, Martin; Pedraza Morales, Maria Isabel; Peleganchuk, Sergey; Peng, Haiping; Pengo, Ruggero; Penning, Bjoern; Penson, Alexander; Penwell, John; Perantoni, Marcelo; Perez, Kerstin; Perez Cavalcanti, Tiago; Perez Codina, Estel; Pérez García-Estañ, María Teresa; Perez Reale, Valeria; Perini, Laura; Pernegger, Heinz; Perrino, Roberto; Perrodo, Pascal; Persembe, Seda; Perus, Antoine; Peshekhonov, Vladimir; Peters, Krisztian; Petersen, Brian; Petersen, Jorgen; Petersen, Troels; Petit, Elisabeth; Petridis, Andreas; Petridou, Chariclia; Petrolo, Emilio; Petrucci, Fabrizio; Petschull, Dennis; Petteni, Michele; Pezoa, Raquel; Phan, Anna; Phillips, Peter William; Piacquadio, Giacinto; Piccaro, Elisa; Piccinini, Maurizio; Piec, Sebastian Marcin; Piegaia, Ricardo; Pignotti, David; Pilcher, James; Pilkington, Andrew; Pina, João Antonio; Pinamonti, Michele; Pinder, Alex; Pinfold, James; Ping, Jialun; Pinto, Belmiro; Pirotte, Olivier; Pizio, Caterina; Placakyte, Ringaile; Plamondon, Mathieu; Pleier, Marc-Andre; Pleskach, Anatoly; Poblaguev, Andrei; Poddar, Sahill; Podlyski, Fabrice; Poggioli, Luc; Poghosyan, Tatevik; Pohl, Martin; Polci, Francesco; Polesello, Giacomo; Policicchio, Antonio; Polini, Alessandro; Poll, James; Polychronakos, Venetios; Pomarede, Daniel Marc; Pomeroy, Daniel; Pommès, Kathy; Pontecorvo, Ludovico; Pope, Bernard; Popeneciu, Gabriel Alexandru; Popovic, Dragan; Poppleton, Alan; Portell Bueso, Xavier; Posch, Christoph; Pospelov, Guennady; Pospisil, Stanislav; Potrap, Igor; Potter, Christina; Potter, Christopher; Poulard, Gilbert; Poveda, Joaquin; Pozdnyakov, Valery; Prabhu, Robindra; Pralavorio, Pascal; Pranko, Aliaksandr; Prasad, Srivas; Pravahan, Rishiraj; Prell, Soeren; Pretzl, Klaus Peter; Pribyl, Lukas; Price, Darren; Price, Joe; Price, Lawrence; Price, Michael John; Prieur, Damien; Primavera, Margherita; Prokofiev, Kirill; Prokoshin, Fedor; Protopopescu, Serban; Proudfoot, James; Prudent, Xavier; Przybycien, Mariusz; Przysiezniak, Helenka; Psoroulas, Serena; Ptacek, Elizabeth; Pueschel, Elisa; Purdham, John; Purohit, Milind; Puzo, Patrick; Pylypchenko, 
Yuriy; Qian, Jianming; Qian, Zuxuan; Qin, Zhonghua; Quadt, Arnulf; Quarrie, David; Quayle, William; Quinonez, Fernando; Raas, Marcel; Radescu, Voica; Radics, Balint; Radloff, Peter; Rador, Tonguc; Ragusa, Francesco; Rahal, Ghita; Rahimi, Amir; Rahm, David; Rajagopalan, Srinivasan; Rammensee, Michael; Rammes, Marcus; Randle-Conde, Aidan Sean; Randrianarivony, Koloina; Ratoff, Peter; Rauscher, Felix; Rave, Tobias Christian; Raymond, Michel; Read, Alexander Lincoln; Rebuzzi, Daniela; Redelbach, Andreas; Redlinger, George; Reece, Ryan; Reeves, Kendall; Reichold, Armin; Reinherz-Aronis, Erez; Reinsch, Andreas; Reisinger, Ingo; Rembser, Christoph; Ren, Zhongliang; Renaud, Adrien; Renkel, Peter; Rescigno, Marco; Resconi, Silvia; Resende, Bernardo; Reznicek, Pavel; Rezvani, Reyhaneh; Richards, Alexander; Richter, Robert; Richter-Was, Elzbieta; Ridel, Melissa; Rijpstra, Manouk; Rijssenbeek, Michael; Rimoldi, Adele; Rinaldi, Lorenzo; Rios, Ryan Randy; Riu, Imma; Rivoltella, Giancesare; Rizatdinova, Flera; Rizvi, Eram; Robertson, Steven; Robichaud-Veronneau, Andree; Robinson, Dave; Robinson, James; Robinson, Mary; Robson, Aidan; Rocha de Lima, Jose Guilherme; Roda, Chiara; Roda Dos Santos, Denis; Rodriguez, Diego; Roe, Adam; Roe, Shaun; Røhne, Ole; Rojo, Victoria; Rolli, Simona; Romaniouk, Anatoli; Romano, Marino; Romanov, Victor; Romeo, Gaston; Romero Adam, Elena; Roos, Lydia; Ros, Eduardo; Rosati, Stefano; Rosbach, Kilian; Rose, Anthony; Rose, Matthew; Rosenbaum, Gabriel; Rosenberg, Eli; Rosendahl, Peter Lundgaard; Rosenthal, Oliver; Rosselet, Laurent; Rossetti, Valerio; Rossi, Elvira; Rossi, Leonardo Paolo; Rotaru, Marina; Roth, Itamar; Rothberg, Joseph; Rousseau, David; Royon, Christophe; Rozanov, Alexander; Rozen, Yoram; Ruan, Xifeng; Rubinskiy, Igor; Ruckert, Benjamin; Ruckstuhl, Nicole; Rud, Viacheslav; Rudolph, Christian; Rudolph, Gerald; Rühr, Frederik; Ruggieri, Federico; Ruiz-Martinez, Aranzazu; Rumiantsev, Viktor; Rumyantsev, Leonid; Runge, Kay; Rurikova, Zuzana; Rusakovich, Nikolai; Rust, Dave; Rutherfoord, John; Ruwiedel, Christoph; Ruzicka, Pavel; Ryabov, Yury; Ryadovikov, Vasily; Ryan, Patrick; Rybar, Martin; Rybkin, Grigori; Ryder, Nick; Rzaeva, Sevda; Saavedra, Aldo; Sadeh, Iftach; Sadrozinski, Hartmut; Sadykov, Renat; Safai Tehrani, Francesco; Sakamoto, Hiroshi; Salamanna, Giuseppe; Salamon, Andrea; Saleem, Muhammad; Salihagic, Denis; Salnikov, Andrei; Salt, José; Salvachua Ferrando, Belén; Salvatore, Daniela; Salvatore, Pasquale Fabrizio; Salvucci, Antonio; Salzburger, Andreas; Sampsonidis, Dimitrios; Samset, Björn Hallvard; Sanchez, Arturo; Sanchez Martinez, Victoria; Sandaker, Heidi; Sander, Heinz Georg; Sanders, Michiel; Sandhoff, Marisa; Sandoval, Tanya; Sandoval, Carlos; Sandstroem, Rikard; Sandvoss, Stephan; Sankey, Dave; Sansoni, Andrea; Santamarina Rios, Cibran; Santoni, Claudio; Santonico, Rinaldo; Santos, Helena; Saraiva, João; Sarangi, Tapas; Sarkisyan-Grinbaum, Edward; Sarri, Francesca; Sartisohn, Georg; Sasaki, Osamu; Sasaki, Takashi; Sasao, Noboru; Satsounkevitch, Igor; Sauvage, Gilles; Sauvan, Emmanuel; Sauvan, Jean-Baptiste; Savard, Pierre; Savinov, Vladimir; Savu, Dan Octavian; Sawyer, Lee; Saxon, David; Says, Louis-Pierre; Sbarra, Carla; Sbrizzi, Antonio; Scallon, Olivia; Scannicchio, Diana; Scarcella, Mark; Schaarschmidt, Jana; Schacht, Peter; Schäfer, Uli; Schaepe, Steffen; Schaetzel, Sebastian; Schaffer, Arthur; Schaile, Dorothee; Schamberger, R. 
Dean; Schamov, Andrey; Scharf, Veit; Schegelsky, Valery; Scheirich, Daniel; Schernau, Michael; Scherzer, Max; Schiavi, Carlo; Schieck, Jochen; Schioppa, Marco; Schlenker, Stefan; Schlereth, James; Schmidt, Evelyn; Schmieden, Kristof; Schmitt, Christian; Schmitt, Sebastian; Schmitz, Martin; Schöning, André; Schott, Matthias; Schouten, Doug; Schovancova, Jaroslava; Schram, Malachi; Schroeder, Christian; Schroer, Nicolai; Schuh, Silvia; Schuler, Georges; Schultens, Martin Johannes; Schultes, Joachim; Schultz-Coulon, Hans-Christian; Schulz, Holger; Schumacher, Jan; Schumacher, Markus; Schumm, Bruce; Schune, Philippe; Schwanenberger, Christian; Schwartzman, Ariel; Schwemling, Philippe; Schwienhorst, Reinhard; Schwierz, Rainer; Schwindling, Jerome; Schwindt, Thomas; Schwoerer, Maud; Scott, Bill; Searcy, Jacob; Sedov, George; Sedykh, Evgeny; Segura, Ester; Seidel, Sally; Seiden, Abraham; Seifert, Frank; Seixas, José; Sekhniaidze, Givi; Selbach, Karoline Elfriede; Seliverstov, Dmitry; Sellden, Bjoern; Sellers, Graham; Seman, Michal; Semprini-Cesari, Nicola; Serfon, Cedric; Serin, Laurent; Serkin, Leonid; Seuster, Rolf; Severini, Horst; Sevior, Martin; Sfyrla, Anna; Shabalina, Elizaveta; Shamim, Mansoora; Shan, Lianyou; Shank, James; Shao, Qi Tao; Shapiro, Marjorie; Shatalov, Pavel; Shaver, Leif; Shaw, Kate; Sherman, Daniel; Sherwood, Peter; Shibata, Akira; Shichi, Hideharu; Shimizu, Shima; Shimojima, Makoto; Shin, Taeksu; Shiyakova, Maria; Shmeleva, Alevtina; Shochet, Mel; Short, Daniel; Shrestha, Suyog; Shulga, Evgeny; Shupe, Michael; Sicho, Petr; Sidoti, Antonio; Siegert, Frank; Sijacki, Djordje; Silbert, Ohad; Silva, José; Silver, Yiftah; Silverstein, Daniel; Silverstein, Samuel; Simak, Vladislav; Simard, Olivier; Simic, Ljiljana; Simion, Stefan; Simmons, Brinick; Simonyan, Margar; Sinervo, Pekka; Sinev, Nikolai; Sipica, Valentin; Siragusa, Giovanni; Sircar, Anirvan; Sisakyan, Alexei; Sivoklokov, Serguei; Sjölin, Jörgen; Sjursen, Therese; Skinnari, Louise Anastasia; Skottowe, Hugh Philip; Skovpen, Kirill; Skubic, Patrick; Skvorodnev, Nikolai; Slater, Mark; Slavicek, Tomas; Sliwa, Krzysztof; Sloper, John erik; Smakhtin, Vladimir; Smart, Ben; Smirnov, Sergei; Smirnov, Yury; Smirnova, Lidia; Smirnova, Oxana; Smith, Ben Campbell; Smith, Douglas; Smith, Kenway; Smizanska, Maria; Smolek, Karel; Snesarev, Andrei; Snow, Steve; Snow, Joel; Snuverink, Jochem; Snyder, Scott; Soares, Mara; Sobie, Randall; Sodomka, Jaromir; Soffer, Abner; Solans, Carlos; Solar, Michael; Solc, Jaroslav; Soldatov, Evgeny; Soldevila, Urmila; Solfaroli Camillocci, Elena; Solodkov, Alexander; Solovyanov, Oleg; Soni, Nitesh; Sopko, Vit; Sopko, Bruno; Sosebee, Mark; Soualah, Rachik; Soukharev, Andrey; Spagnolo, Stefania; Spanò, Francesco; Spighi, Roberto; Spigo, Giancarlo; Spila, Federico; Spiwoks, Ralf; Spousta, Martin; Spreitzer, Teresa; Spurlock, Barry; St Denis, Richard Dante; Stahlman, Jonathan; Stamen, Rainer; Stanecka, Ewa; Stanek, Robert; Stanescu, Cristian; Stapnes, Steinar; Starchenko, Evgeny; Stark, Jan; Staroba, Pavel; Starovoitov, Pavel; Staude, Arnold; Stavina, Pavel; Stavropoulos, Georgios; Steele, Genevieve; Steinbach, Peter; Steinberg, Peter; Stekl, Ivan; Stelzer, Bernd; Stelzer, Harald Joerg; Stelzer-Chilton, Oliver; Stenzel, Hasko; Stern, Sebastian; Stevenson, Kyle; Stewart, Graeme; Stillings, Jan Andre; Stockton, Mark; Stoerig, Kathrin; Stoicea, Gabriel; Stonjek, Stefan; Strachota, Pavel; Stradling, Alden; Straessner, Arno; Strandberg, Jonas; Strandberg, Sara; Strandlie, Are; Strang, Michael; Strauss, Emanuel; 
Strauss, Michael; Strizenec, Pavol; Ströhmer, Raimund; Strom, David; Strong, John; Stroynowski, Ryszard; Strube, Jan; Stugu, Bjarne; Stumer, Iuliu; Stupak, John; Sturm, Philipp; Styles, Nicholas Adam; Soh, Dart-yin; Su, Dong; Subramania, Halasya Siva; Succurro, Antonella; Sugaya, Yorihito; Sugimoto, Takuya; Suhr, Chad; Suita, Koichi; Suk, Michal; Sulin, Vladimir; Sultansoy, Saleh; Sumida, Toshi; Sun, Xiaohu; Sundermann, Jan Erik; Suruliz, Kerim; Sushkov, Serge; Susinno, Giancarlo; Sutton, Mark; Suzuki, Yu; Suzuki, Yuta; Svatos, Michal; Sviridov, Yuri; Swedish, Stephen; Sykora, Ivan; Sykora, Tomas; Szeless, Balazs; Sánchez, Javier; Ta, Duc; Tackmann, Kerstin; Taffard, Anyes; Tafirout, Reda; Taiblum, Nimrod; Takahashi, Yuta; Takai, Helio; Takashima, Ryuichi; Takeda, Hiroshi; Takeshita, Tohru; Takubo, Yosuke; Talby, Mossadek; Talyshev, Alexey; Tamsett, Matthew; Tanaka, Junichi; Tanaka, Reisaburo; Tanaka, Satoshi; Tanaka, Shuji; Tanaka, Yoshito; Tanasijczuk, Andres Jorge; Tani, Kazutoshi; Tannoury, Nancy; Tappern, Geoffrey; Tapprogge, Stefan; Tardif, Dominique; Tarem, Shlomit; Tarrade, Fabien; Tartarelli, Giuseppe Francesco; Tas, Petr; Tasevsky, Marek; Tassi, Enrico; Tatarkhanov, Mous; Tayalati, Yahya; Taylor, Christopher; Taylor, Frank; Taylor, Geoffrey; Taylor, Wendy; Teinturier, Marthe; Teixeira Dias Castanheira, Matilde; Teixeira-Dias, Pedro; Temming, Kim Katrin; Ten Kate, Herman; Teng, Ping-Kun; Terada, Susumu; Terashi, Koji; Terron, Juan; Testa, Marianna; Teuscher, Richard; Thadome, Jocelyn; Therhaag, Jan; Theveneaux-Pelzer, Timothée; Thioye, Moustapha; Thoma, Sascha; Thomas, Juergen; Thompson, Emily; Thompson, Paul; Thompson, Peter; Thompson, Stan; Thomsen, Lotte Ansgaard; Thomson, Evelyn; Thomson, Mark; Thun, Rudolf; Tian, Feng; Tibbetts, Mark James; Tic, Tomáš; Tikhomirov, Vladimir; Tikhonov, Yury; Timoshenko, Sergey; Tipton, Paul; Tique Aires Viegas, Florbela De Jes; Tisserant, Sylvain; Tobias, Jürgen; Toczek, Barbara; Todorov, Theodore; Todorova-Nova, Sharka; Toggerson, Brokk; Tojo, Junji; Tokár, Stanislav; Tokunaga, Kaoru; Tokushuku, Katsuo; Tollefson, Kirsten; Tomoto, Makoto; Tompkins, Lauren; Toms, Konstantin; Tong, Guoliang; Tonoyan, Arshak; Topfel, Cyril; Topilin, Nikolai; Torchiani, Ingo; Torrence, Eric; Torres, Heberth; Torró Pastor, Emma; Toth, Jozsef; Touchard, Francois; Tovey, Daniel; Trefzger, Thomas; Tremblet, Louis; Tricoli, Alesandro; Trigger, Isabel Marian; Trincaz-Duvoid, Sophie; Trinh, Thi Nguyet; Tripiana, Martin; Trischuk, William; Trivedi, Arjun; Trocmé, Benjamin; Troncon, Clara; Trottier-McDonald, Michel; Trzebinski, Maciej; Trzupek, Adam; Tsarouchas, Charilaos; Tseng, Jeffrey; Tsiakiris, Menelaos; Tsiareshka, Pavel; Tsionou, Dimitra; Tsipolitis, Georgios; Tsiskaridze, Vakhtang; Tskhadadze, Edisher; Tsukerman, Ilya; Tsulaia, Vakhtang; Tsung, Jieh-Wen; Tsuno, Soshi; Tsybychev, Dmitri; Tua, Alan; Tudorache, Alexandra; Tudorache, Valentina; Tuggle, Joseph; Turala, Michal; Turecek, Daniel; Turk Cakir, Ilkay; Turlay, Emmanuel; Turra, Ruggero; Tuts, Michael; Tykhonov, Andrii; Tylmad, Maja; Tyndel, Mike; Tzanakos, George; Uchida, Kirika; Ueda, Ikuo; Ueno, Ryuichi; Ugland, Maren; Uhlenbrock, Mathias; Uhrmacher, Michael; Ukegawa, Fumihiko; Unal, Guillaume; Underwood, David; Undrus, Alexander; Unel, Gokhan; Unno, Yoshinobu; Urbaniec, Dustin; Usai, Giulio; Uslenghi, Massimiliano; Vacavant, Laurent; Vacek, Vaclav; Vachon, Brigitte; Vahsen, Sven; Valenta, Jan; Valente, Paolo; Valentinetti, Sara; Valkar, Stefan; Valladolid Gallego, Eva; Vallecorsa, Sofia; Valls Ferrer, Juan 
Antonio; van der Graaf, Harry; van der Kraaij, Erik; Van Der Leeuw, Robin; van der Poel, Egge; van der Ster, Daniel; van Eldik, Niels; van Gemmeren, Peter; van Kesteren, Zdenko; van Vulpen, Ivo; Vanadia, Marco; Vandelli, Wainer; Vandoni, Giovanna; Vaniachine, Alexandre; Vankov, Peter; Vannucci, Francois; Varela Rodriguez, Fernando; Vari, Riccardo; Varnes, Erich; Varouchas, Dimitris; Vartapetian, Armen; Varvell, Kevin; Vassilakopoulos, Vassilios; Vazeille, Francois; Vegni, Guido; Veillet, Jean-Jacques; Vellidis, Constantine; Veloso, Filipe; Veness, Raymond; Veneziano, Stefano; Ventura, Andrea; Ventura, Daniel; Venturi, Manuela; Venturi, Nicola; Vercesi, Valerio; Verducci, Monica; Verkerke, Wouter; Vermeulen, Jos; Vest, Anja; Vetterli, Michel; Vichou, Irene; Vickey, Trevor; Vickey Boeriu, Oana Elena; Viehhauser, Georg; Viel, Simon; Villa, Mauro; Villaplana Perez, Miguel; Vilucchi, Elisabetta; Vincter, Manuella; Vinek, Elisabeth; Vinogradov, Vladimir; Virchaux, Marc; Virzi, Joseph; Vitells, Ofer; Viti, Michele; Vivarelli, Iacopo; Vives Vaque, Francesc; Vlachos, Sotirios; Vladoiu, Dan; Vlasak, Michal; Vlasov, Nikolai; Vogel, Adrian; Vokac, Petr; Volpi, Guido; Volpi, Matteo; Volpini, Giovanni; von der Schmitt, Hans; von Loeben, Joerg; von Radziewski, Holger; von Toerne, Eckhard; Vorobel, Vit; Vorobiev, Alexander; Vorwerk, Volker; Vos, Marcel; Voss, Rudiger; Voss, Thorsten Tobias; Vossebeld, Joost; Vranjes, Nenad; Vranjes Milosavljevic, Marija; Vrba, Vaclav; Vreeswijk, Marcel; Vu Anh, Tuan; Vuillermet, Raphael; Vukotic, Ilija; Wagner, Wolfgang; Wagner, Peter; Wahlen, Helmut; Wakabayashi, Jun; Walbersloh, Jorg; Walch, Shannon; Walder, James; Walker, Rodney; Walkowiak, Wolfgang; Wall, Richard; Waller, Peter; Wang, Chiho; Wang, Haichen; Wang, Hulin; Wang, Jike; Wang, Jin; Wang, Joshua C; Wang, Rui; Wang, Song-Ming; Warburton, Andreas; Ward, Patricia; Warsinsky, Markus; Watkins, Peter; Watson, Alan; Watson, Ian; Watson, Miriam; Watts, Gordon; Watts, Stephen; Waugh, Anthony; Waugh, Ben; Weber, Marc; Weber, Michele; Weber, Pavel; Weidberg, Anthony; Weigell, Philipp; Weingarten, Jens; Weiser, Christian; Wellenstein, Hermann; Wells, Phillippa; Wen, Mei; Wenaus, Torre; Wendland, Dennis; Wendler, Shanti; Weng, Zhili; Wengler, Thorsten; Wenig, Siegfried; Wermes, Norbert; Werner, Matthias; Werner, Per; Werth, Michael; Wessels, Martin; Weydert, Carole; Whalen, Kathleen; Wheeler-Ellis, Sarah Jane; Whitaker, Scott; White, Andrew; White, Martin; Whitehead, Samuel Robert; Whiteson, Daniel; Whittington, Denver; Wicek, Francois; Wicke, Daniel; Wickens, Fred; Wiedenmann, Werner; Wielers, Monika; Wienemann, Peter; Wiglesworth, Craig; Wiik, Liv Antje Mari; Wijeratne, Peter Alexander; Wildauer, Andreas; Wildt, Martin Andre; Wilhelm, Ivan; Wilkens, Henric George; Will, Jonas Zacharias; Williams, Eric; Williams, Hugh; Willis, William; Willocq, Stephane; Wilson, John; Wilson, Michael Galante; Wilson, Alan; Wingerter-Seez, Isabelle; Winkelmann, Stefan; Winklmeier, Frank; Wittgen, Matthias; Wolter, Marcin Wladyslaw; Wolters, Helmut; Wong, Wei-Cheng; Wooden, Gemma; Wosiek, Barbara; Wotschack, Jorg; Woudstra, Martin; Wozniak, Krzysztof; Wraight, Kenneth; Wright, Catherine; Wright, Michael; Wrona, Bozydar; Wu, Sau Lan; Wu, Xin; Wu, Yusheng; Wulf, Evan; Wunstorf, Renate; Wynne, Benjamin; Xella, Stefania; Xiao, Meng; Xie, Song; Xie, Yigang; Xu, Chao; Xu, Da; Xu, Guofa; Yabsley, Bruce; Yacoob, Sahal; Yamada, Miho; Yamaguchi, Hiroshi; Yamamoto, Akira; Yamamoto, Kyoko; Yamamoto, Shimpei; Yamamura, Taiki; Yamanaka, Takashi; 
Yamaoka, Jared; Yamazaki, Takayuki; Yamazaki, Yuji; Yan, Zhen; Yang, Haijun; Yang, Un-Ki; Yang, Yi; Yang, Yi; Yang, Zhaoyu; Yanush, Serguei; Yao, Yushu; Yasu, Yoshiji; Ybeles Smit, Gabriel Valentijn; Ye, Jingbo; Ye, Shuwei; Yilmaz, Metin; Yoosoofmiya, Reza; Yorita, Kohei; Yoshida, Riktura; Young, Charles; Youssef, Saul; Yu, Dantong; Yu, Jaehoon; Yu, Jie; Yuan, Li; Yurkewicz, Adam; Zabinski, Bartlomiej; Zaets, Vassilli; Zaidan, Remi; Zaitsev, Alexander; Zajacova, Zuzana; Zanello, Lucia; Zarzhitsky, Pavel; Zaytsev, Alexander; Zeitnitz, Christian; Zeller, Michael; Zeman, Martin; Zemla, Andrzej; Zendler, Carolin; Zenin, Oleg; Ženiš, Tibor; Zenonos, Zenonas; Zenz, Seth; Zerwas, Dirk; Zevi della Porta, Giovanni; Zhan, Zhichao; Zhang, Dongliang; Zhang, Huaqiao; Zhang, Jinlong; Zhang, Xueyao; Zhang, Zhiqing; Zhao, Long; Zhao, Tianchi; Zhao, Zhengguo; Zhemchugov, Alexey; Zheng, Shuchen; Zhong, Jiahang; Zhou, Bing; Zhou, Ning; Zhou, Yue; Zhu, Cheng Guang; Zhu, Hongbo; Zhu, Junjie; Zhu, Yingchun; Zhuang, Xuai; Zhuravlov, Vadym; Zieminska, Daria; Zimmermann, Robert; Zimmermann, Simone; Zimmermann, Stephanie; Ziolkowski, Michael; Zitoun, Robert; Živković, Lidija; Zmouchko, Viatcheslav; Zobernig, Georg; Zoccoli, Antonio; Zolnierowski, Yves; Zsenei, Andras; zur Nedden, Martin; Zutshi, Vishnu; Zwalinski, Lukasz

    2013-03-02

    The uncertainty on the calorimeter energy response to jets of particles is derived for the ATLAS experiment at the Large Hadron Collider (LHC). First, the calorimeter response to single isolated charged hadrons is measured and compared to the Monte Carlo simulation using proton-proton collisions at centre-of-mass energies of $\sqrt{s}$ = 900 GeV and 7 TeV collected during 2009 and 2010. Then, using the decay of $K_s$ and $\Lambda$ particles, the calorimeter response to specific types of particles (positively and negatively charged pions, protons, and anti-protons) is measured and compared to the Monte Carlo predictions. Finally, the jet energy scale uncertainty is determined by propagating the response uncertainty for single charged and neutral particles to jets. The response uncertainty is 2-5% for central isolated hadrons and 1-3% for the final calorimeter jet energy scale.
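
    The propagation step can be pictured with a small numerical sketch: each jet constituent carries a single-particle response uncertainty, and the jet-level uncertainty is the energy-weighted combination of these. The particle content and uncertainty values below are illustrative assumptions, not ATLAS numbers, and the fully correlated combination is a simplification of the actual procedure.

      # Hedged sketch of propagating single-particle response uncertainties to a jet
      # (constituents and uncertainty values are assumed for illustration).
      jet_constituents = [
          # (energy in GeV, fractional calorimeter-response uncertainty)
          (25.0, 0.03),   # charged hadron, covered by the isolated-hadron measurement
          (14.0, 0.04),   # neutral hadron, larger assumed uncertainty
          (10.0, 0.01),   # photon from a pi0, well-known electromagnetic scale
      ]

      e_jet = sum(e for e, _ in jet_constituents)
      # treat the response shifts as fully correlated, so weight linearly by energy fraction
      jes_uncertainty = sum(e / e_jet * u for e, u in jet_constituents)
      print(f"jet energy = {e_jet:.1f} GeV, jet energy scale uncertainty ~ {100 * jes_uncertainty:.1f} %")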

  5. Application of adaptive optics for flexible laser induced ultrasound field generation and uncertainty reduction in measurements

    Science.gov (United States)

    Büttner, Lars; Schmieder, Felix; Teich, Martin; Koukourakis, Nektarios; Czarske, Jürgen

    2017-06-01

    The availability of spatial light modulators as standard turnkey components and their ongoing development make them attractive for a huge variety of optical measurement systems in industry and research. Here, we outline two examples of how optical measurements can benefit from spatial light modulators. Ultrasound testing has become an indispensable tool for industrial inspection. Contact-free measurements can be achieved by laser-induced ultrasound. One disadvantage is that, due to the highly divergent sound field of the shear waves generated by a point-wise thermoelastic excitation, only poor spatial selectivity can be achieved. This problem can be solved by creating an ultrasound focus by means of a ring-like laser intensity distribution, but the standard fixed-form optical components used to generate such distributions are always optimised for a fixed set of parameters. Here, we demonstrate how a predefined intensity pattern, e.g. a ring, can be created from an arbitrary input laser beam using a phase-retrieval algorithm to shape an ultrasound focus in the sample. By displaying different patterns on the spatial light modulator, the focus can be traversed in all three directions through the object, allowing fast scanning of the sample with high spatial resolution. Optical measurements often take place under difficult conditions. They are affected by variations of the refractive index, caused e.g. by phase boundaries between two media of different optical density. This will result in an increased measurement uncertainty or, in the worst case, will cause the measurement to fail. To overcome these limitations, we propose the application of adaptive optics. Optical flow velocity measurements based on image correlation in water that are performed through optical distortions are discussed. We demonstrate how the measurement error induced by refractive index variations can be reduced if a spatial light modulator is used in the measurement setup to compensate for the wavefront
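
    The phase-retrieval step mentioned above can be sketched with a Gerchberg-Saxton-style iteration between the SLM plane and the focal plane; this is one common choice and an assumption here, since the record does not name the specific algorithm, and the grid size, beam width and ring radius are illustrative.

      # Minimal Gerchberg-Saxton-style sketch (assumed algorithm) for computing a
      # phase-only SLM pattern that maps a Gaussian beam onto a ring-shaped focus.
      import numpy as np

      N = 256
      x = np.linspace(-1.0, 1.0, N)
      X, Y = np.meshgrid(x, x)
      R = np.hypot(X, Y)

      input_amp = np.exp(-(R / 0.5) ** 2)                   # incoming Gaussian beam amplitude
      target_amp = np.exp(-((R - 0.3) / 0.05) ** 2)         # desired ring in the focal plane

      phase = np.random.uniform(0.0, 2.0 * np.pi, (N, N))   # random initial phase guess
      for _ in range(100):
          # propagate SLM plane -> focal plane (the lens acts as a Fourier transform)
          focal = np.fft.fftshift(np.fft.fft2(input_amp * np.exp(1j * phase)))
          # keep the computed phase, impose the target ring amplitude
          focal = target_amp * np.exp(1j * np.angle(focal))
          # propagate back and keep only the phase for the phase-only SLM
          phase = np.angle(np.fft.ifft2(np.fft.ifftshift(focal)))

      slm_pattern = np.mod(phase, 2.0 * np.pi)              # phase mask to display on the SLM

    Adding a linear phase ramp or a lens term to the computed mask shifts the resulting focus laterally or axially, which is how the focus can be traversed through the object without moving parts.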

  6. Is Presentation Everything? Using Visual Presentation of Attributes in Discrete Choice Experiments to Measure the Relative Importance of Intrinsic and Extrinsic Beef Attributes

    OpenAIRE

    Umberger, Wendy J.; Mueller, Simone C.

    2010-01-01

    A unique discrete choice experiment (DCE) is used to estimate the relative importance of quality attributes to Australian beef consumers. In the DCE, consumers choose their preferred beef steaks from options varying in a large number of intrinsic (marbling and fat trim) and extrinsic/credence (brand, health, forage, meat standards/quality, and production and process claims) attributes. This study is the only known DCE to present these attributes to consumers visually – in a manner that more r...

  7. Calibration uncertainty

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Anglov, Thomas

    2002-01-01

    Methods recommended by the International Standardization Organisation and Eurachem are not satisfactory for the correct estimation of calibration uncertainty. A novel approach is introduced and tested on actual calibration data for the determination of Pb by ICP-AES. The improved calibration...... uncertainty was verified from independent measurements of the same sample by demonstrating statistical control of analytical results and the absence of bias. The proposed method takes into account uncertainties of the measurement, as well as of the amount of calibrant. It is applicable to all types...
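
    The point that both the instrument signal and the amount of calibrant contribute to the calibration uncertainty can be illustrated with a small Monte Carlo sketch; all concentrations, signals and uncertainty values below are invented, and the sketch does not reproduce the method proposed in the paper.

      # Hedged Monte Carlo sketch: propagate both signal and calibrant-amount
      # uncertainties through a linear calibration (illustrative numbers only).
      import numpy as np

      rng = np.random.default_rng(1)
      conc = np.array([0.0, 1.0, 2.0, 5.0, 10.0])            # nominal calibrant concentrations
      u_conc = 0.01 * conc                                    # assumed 1 % uncertainty on calibrant amount
      signal = np.array([0.02, 1.05, 2.01, 4.98, 10.10])     # measured calibration signals
      u_signal = 0.03                                         # assumed absolute signal uncertainty

      y_sample = 4.20                                         # signal of the unknown sample
      est = []
      for _ in range(20000):
          c = conc + rng.normal(0.0, u_conc)                  # perturb calibrant amounts
          s = signal + rng.normal(0.0, u_signal, conc.size)   # perturb calibration signals
          slope, intercept = np.polyfit(c, s, 1)
          est.append((y_sample + rng.normal(0.0, u_signal) - intercept) / slope)

      est = np.array(est)
      print(f"concentration = {est.mean():.3f} +/- {est.std(ddof=1):.3f} (k = 1)")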

  8. Accuracy and uncertainty in random speckle modulation transfer function measurement of infrared focal plane arrays

    Science.gov (United States)

    Barnard, Kenneth J.; Jacobs, Eddie L.; Plummer, Philip J.

    2016-12-01

    This paper expands upon a previously reported random speckle technique for measuring the modulation transfer function of midwave infrared focal plane arrays by considering a number of factors that impact the accuracy of the estimated modulation transfer function. These factors arise from assumptions in the theoretical derivation and bias in the estimation procedure. Each factor is examined and guidelines are determined to maintain accuracy within 2% of the true value. The uncertainty of the measurement is found by applying a one-factor analysis of variance (ANOVA), and confidence intervals are established for the results. The small magnitude of the confidence intervals indicates a very robust technique capable of distinguishing differences in modulation transfer function among focal plane arrays on the order of a few percent. This analysis directly indicates the high quality of the random speckle modulation transfer function measurement technique. The methodology is applied to a focal plane array and results are presented that emphasize the need for generating independent random speckle realizations to accurately assess measured values.
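
    As a rough illustration of the statistical step described above, the sketch below treats independent speckle realizations as the single factor in a one-way ANOVA of repeated MTF estimates at one spatial frequency and forms a confidence interval for the mean; the numbers are illustrative, not measured values from the paper.

      # Hedged sketch: one-way ANOVA over speckle realizations and a confidence
      # interval for the mean MTF (illustrative data).
      import numpy as np
      from scipy import stats

      # rows = independent speckle realizations, columns = repeated MTF estimates
      mtf = np.array([
          [0.512, 0.508, 0.515],
          [0.498, 0.503, 0.496],
          [0.505, 0.509, 0.502],
          [0.520, 0.517, 0.523],
      ])

      k, n = mtf.shape
      grand_mean = mtf.mean()
      ms_between = n * mtf.mean(axis=1).var(ddof=1)      # between-realization mean square
      ms_within = mtf.var(axis=1, ddof=1).mean()         # within-realization mean square
      f_stat = ms_between / ms_within                    # one-way ANOVA F statistic

      # confidence interval for the mean MTF using the between-realization scatter
      sem = np.sqrt(ms_between / (k * n))
      lo, hi = stats.t.interval(0.95, k - 1, loc=grand_mean, scale=sem)
      print(f"F = {f_stat:.2f}, mean MTF = {grand_mean:.3f}, 95% CI = ({lo:.3f}, {hi:.3f})")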

  9. New Measurement Method and Uncertainty Estimation for Plate Dimensions and Surface Quality

    Directory of Open Access Journals (Sweden)

    Salah H. R. Ali

    2013-01-01

    Full Text Available Dimensional and surface quality control for plate production faces difficult engineering challenges. One of these challenges is that plates in large-scale mass production have geometrically uneven surfaces. A traditional measurement method is used to assess tile plate dimensions and surface quality based on the standard specifications ISO-10545-2: 1995, EOS-3168-2: 2007, and TIS 2398-2: 2008. A new measurement method for the dimensions and surface quality of oblong large-scale ceramic tile plates has been developed and compared to the traditional method. The new method is based on a CMM straightness measurement strategy instead of the centre-point strategy of the traditional method. Expanded uncertainty budgets for the measurements of each method have been estimated in detail. Accurate estimates of the centre of curvature (CC), centre of edge (CE), warpage (W), and edge crack defect parameters have been obtained in accordance with the standards. Moreover, the results not only show that the new method is more accurate but also that it significantly improves the quality of plate products.
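
    The expanded-uncertainty budgets mentioned above follow the usual GUM pattern of combining standard uncertainties in quadrature and applying a coverage factor; the sketch below shows that pattern with invented contribution values rather than the budget from the paper.

      # Hedged sketch of a GUM-style expanded-uncertainty budget for a CMM
      # straightness measurement (contribution values are illustrative).
      import math

      # standard uncertainties in micrometres
      contributions = {
          "CMM indication (MPE-based)": 1.2,
          "probing repeatability": 0.8,
          "temperature / thermal expansion": 0.5,
          "workpiece form and fixturing": 1.0,
      }

      u_c = math.sqrt(sum(u * u for u in contributions.values()))   # combined standard uncertainty
      U = 2.0 * u_c                                                  # expanded uncertainty, k = 2
      print(f"u_c = {u_c:.2f} um, U (k = 2) = {U:.2f} um")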

  10. Improved evaluation of measurement uncertainty from sampling by inclusion of between-sampler bias using sampling proficiency testing.

    Science.gov (United States)

    Ramsey, Michael H; Geelhoed, Bastiaan; Wood, Roger; Damant, Andrew P

    2011-04-07

    A realistic estimate of the uncertainty of a measurement result is essential for its reliable interpretation. Recent methods for such estimation include the contribution to uncertainty from the sampling process, but they only include the random and not the systematic effects. Sampling Proficiency Tests (SPTs) have been used previously to assess the performance of samplers, but the results can also be used to evaluate measurement uncertainty, including the systematic effects. A new SPT conducted on the determination of moisture in fresh butter is used to exemplify how SPT results can be used not only to score samplers but also to estimate uncertainty. The comparison between uncertainty evaluated within- and between-samplers is used to demonstrate that sampling bias is causing the estimates of expanded relative uncertainty to rise by over a factor of two (from 0.39% to 0.87%) in this case. General criteria are given for the experimental design and the sampling target that are required to apply this approach to measurements on any material. © The Royal Society of Chemistry 2011
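
    The within- versus between-sampler comparison can be sketched as a one-way variance-component analysis in which the between-sampler component, which carries the sampling bias, is added to the within-sampler repeatability; the duplicate data below are illustrative and are not the butter-moisture results of the study.

      # Hedged sketch: estimate the between-sampler variance component from
      # proficiency-test data and fold it into the uncertainty (illustrative data).
      import numpy as np

      # rows = samplers, columns = duplicate determinations by each sampler (%)
      data = np.array([
          [16.10, 16.14],
          [16.22, 16.18],
          [16.05, 16.09],
          [16.30, 16.27],
      ])

      n = data.shape[1]
      within_var = data.var(axis=1, ddof=1).mean()            # within-sampler (repeatability) variance
      between_ms = n * data.mean(axis=1).var(ddof=1)          # between-sampler mean square
      between_var = max((between_ms - within_var) / n, 0.0)   # variance component incl. sampler bias

      u_within = np.sqrt(within_var)
      u_total = np.sqrt(within_var + between_var)
      print(f"within-sampler u = {u_within:.3f} %, with between-sampler bias u = {u_total:.3f} %")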

  11. Comparison of the GUM and Monte Carlo methods on the flatness uncertainty estimation in coordinate measuring machine

    Directory of Open Access Journals (Sweden)

    Jalid Abdelilah

    2016-01-01

    Full Text Available In the engineering industry, control of manufactured parts is usually done on a coordinate measuring machine (CMM): a sensor mounted at the end of the machine probes a set of points on the surface to be inspected. Data processing is performed subsequently using software, and the result of this measurement process either confirms or rejects the conformity of the part. Measurement uncertainty is a crucial parameter for making the right decisions, and not taking this parameter into account can therefore sometimes lead to aberrant decisions. The determination of measurement uncertainty on a CMM is a complex task because of the variety of influencing factors. Through this study, we aim to check whether the uncertainty propagation model developed according to the Guide to the Expression of Uncertainty in Measurement (GUM) is valid; we present here a comparison of the GUM and Monte Carlo methods. This comparison is made to estimate the flatness deviation of a surface belonging to an industrial part and the uncertainty associated with the measurement result.
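
    A toy version of the comparison is sketched below: the flatness value is reduced to the max-minus-min of a handful of probed heights, and its uncertainty is evaluated once with a GUM-style quadrature and once by Monte Carlo simulation. The point values and per-point uncertainty are assumptions, not data from the industrial part studied in the paper.

      # Hedged sketch of the GUM-versus-Monte-Carlo comparison for a flatness
      # evaluation (illustrative probed heights and uncertainties).
      import numpy as np

      rng = np.random.default_rng(0)
      z = np.array([0.002, -0.001, 0.004, 0.000, -0.003, 0.001])   # probed heights, mm
      u_pt = 0.0015                                                # standard uncertainty per point, mm

      # flatness taken here simply as max - min of the probed heights
      flatness = z.max() - z.min()

      # GUM-style propagation: the result depends on two points, combined in quadrature
      u_gum = np.sqrt(2.0) * u_pt

      # Monte Carlo: perturb every point and look at the spread of the flatness values
      sims = []
      for _ in range(50000):
          zi = z + rng.normal(0.0, u_pt, z.size)
          sims.append(zi.max() - zi.min())
      u_mc = np.std(sims, ddof=1)

      print(f"flatness = {flatness:.4f} mm, u_GUM = {u_gum:.4f} mm, u_MC = {u_mc:.4f} mm")

    The two estimates differ here because the max-minus-min operation is non-linear, which is exactly the kind of case where a Monte Carlo evaluation is a useful cross-check of the GUM propagation.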

  12. Measuring Student Graduateness: Reliability and Construct Validity of the Graduate Skills and Attributes Scale

    Science.gov (United States)

    Coetzee, Melinde

    2014-01-01

    This study reports the development and validation of the Graduate Skills and Attributes Scale which was initially administered to a random sample of 272 third-year-level and postgraduate-level, distance-learning higher education students. The data were analysed using exploratory factor analysis. In a second study, the scale was administered to a…

  13. Estimation of the measurement uncertainty by the bottom-up approach for the determination of methamphetamine and amphetamine in urine.

    Science.gov (United States)

    Lee, Sooyeun; Choi, Hyeyoung; Kim, Eunmi; Choi, Hwakyung; Chung, Heesun; Chung, Kyu Hyuck

    2010-05-01

    The measurement uncertainty (MU) of methamphetamine (MA) and amphetamine (AP) was estimated in an authentic urine sample with relatively low concentrations of MA and AP using the bottom-up approach. A cause-and-effect diagram was constructed; the amount of MA or AP in the sample, the volume of the sample, method precision, and sample effect were considered as uncertainty sources. The concentrations of MA and AP in the urine sample with their expanded uncertainties were 340.5 +/- 33.2 ng/mL and 113.4 +/- 15.4 ng/mL, respectively, corresponding to relative expanded uncertainties of 9.7% and 13.6%. The largest uncertainty originated from the sample effect for MA and from method precision for AP, while the uncertainty of the sample volume was minimal in both. The MU needs to be determined during the method validation process to assess test reliability. Moreover, the identification of the largest and/or smallest uncertainty source can help improve experimental protocols.
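
    The bottom-up combination behind a result such as 340.5 +/- 33.2 ng/mL can be sketched as relative standard uncertainties summed in quadrature and expanded with a coverage factor of k = 2; the individual source values below are illustrative placeholders, not the ones derived in the study.

      # Hedged sketch of a bottom-up uncertainty combination (illustrative values).
      import math

      concentration = 340.5          # ng/mL, measured MA concentration
      rel_u = {
          "amount in sample (calibration)": 0.030,
          "sample volume": 0.004,
          "method precision": 0.020,
          "sample (matrix) effect": 0.032,
      }

      u_rel = math.sqrt(sum(v * v for v in rel_u.values()))   # combined relative standard uncertainty
      U = 2.0 * u_rel * concentration                          # expanded uncertainty, k = 2
      print(f"{concentration:.1f} +/- {U:.1f} ng/mL  ({100 * 2 * u_rel:.1f} % relative)")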

  14. Validation of the Consumer Values versus Perceived Product Attributes Model Measuring the Purchase of Athletic Team Merchandise

    Science.gov (United States)

    Lee, Donghun; Byon, Kevin K.; Schoenstedt, Linda; Johns, Gary; Bussell, Leigh Ann; Choi, Hwansuk

    2012-01-01

    Various consumer values and perceived product attributes trigger consumptive behaviors of athletic team merchandise (Lee, Trail, Kwon, & Anderson, 2011). Likewise, using a principal component analysis technique on a student sample, a measurement scale was proposed that consisted of nine factors affecting the purchase of athletic team…

  15. ESTIMATION OF MEASUREMENT UNCERTAINTY IN THE DETERMINATION OF Fe CONTENT IN POWDERED TONIC FOOD DRINK USING GRAPHITE FURNACE ATOMIC ABSORPTION SPECTROMETRY

    Directory of Open Access Journals (Sweden)

    Harry Budiman

    2010-06-01

    Full Text Available The evaluation of measurement uncertainty in the determination of Fe content in a powdered tonic food drink using graphite furnace atomic absorption spectrometry was carried out. The specification of the measurand, the sources of uncertainty, and the standard, combined, and expanded uncertainties of this measurement were evaluated and accounted for. The measurement result showed that the Fe content in the powdered tonic food drink sample was 569.32 µg/5g, with an expanded uncertainty of ± 178.20 µg/5g (coverage factor k = 2, at a confidence level of 95%). The calibration curve gave the major contribution to the uncertainty of the final results. Keywords: uncertainty, powdered tonic food drink, iron (Fe), graphite furnace AAS
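
    The statement that the calibration curve dominates the budget can be made concrete with the standard regression formula for the uncertainty of a concentration read off a linear calibration line; the standards, absorbances and sample reading below are invented for illustration and do not reproduce the paper's data.

      # Hedged sketch: standard uncertainty of a concentration predicted from a
      # linear calibration line (classic regression formula, illustrative data).
      import numpy as np

      conc = np.array([0.0, 10.0, 20.0, 40.0, 60.0])            # Fe standards, ug/L
      absorb = np.array([0.002, 0.101, 0.198, 0.405, 0.588])     # measured absorbances
      slope, intercept = np.polyfit(conc, absorb, 1)

      resid = absorb - (slope * conc + intercept)
      s_y = np.sqrt(np.sum(resid ** 2) / (conc.size - 2))        # residual standard deviation

      y0, p = 0.310, 3                                           # mean of p replicate sample readings
      c0 = (y0 - intercept) / slope
      u_c0 = (s_y / slope) * np.sqrt(
          1.0 / p
          + 1.0 / conc.size
          + (y0 - absorb.mean()) ** 2 / (slope ** 2 * np.sum((conc - conc.mean()) ** 2))
      )
      print(f"c0 = {c0:.1f} ug/L, u(calibration) = {u_c0:.1f} ug/L")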

  16. A computer simulation platform for the estimation of measurement uncertainties in dimensional X-ray computed tomography

    DEFF Research Database (Denmark)

    Hiller, Jochen; Reindl, Leonard M

    2012-01-01

    The knowledge of measurement uncertainty is of great importance in conformance testing in production. The tolerance limit for production must be reduced by the amounts of measurement uncertainty to ensure that the parts are in fact within the tolerance. Over the last 5 years, industrial X-ray computed tomography (CT) has become an important technology for dimensional quality control. In this paper a computer simulation platform is presented which is able to investigate error sources in dimensional CT measurements. The typical workflow in industrial CT metrology is described and methods...

  17. Measuring the Uncertainty of Probabilistic Maps Representing Human Motion for Indoor Navigation

    Directory of Open Access Journals (Sweden)

    Susanna Kaiser

    2016-01-01

    Full Text Available Indoor navigation and mapping have recently become an important field of interest for researchers because global navigation satellite systems (GNSS) are very often unavailable inside buildings. FootSLAM, a SLAM (Simultaneous Localization and Mapping) algorithm for pedestrians based on step measurements, addresses the indoor mapping and positioning problem and can provide accurate positioning in many structured indoor environments. In this paper, we investigate how to compare FootSLAM maps via two entropy metrics. Since collaborative FootSLAM requires the alignment and combination of several individual FootSLAM maps, we also investigate measures that help to align maps that partially overlap. We distinguish between the map entropy conditioned on the sequence of the pedestrian’s poses, which is a measure of the uncertainty of the estimated map, and the entropy rate of the pedestrian’s steps conditioned on the history of poses and conditioned on the estimated map. Because FootSLAM maps are built on a hexagon grid, the entropy and relative entropy metrics are derived for the special case of hexagonal transition maps. The entropy gives us new insight into the performance of FootSLAM’s map estimation process.
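    A toy illustration of a conditional map entropy of this kind is sketched below: for each hexagon the entropy of the outgoing transition distribution is weighted by how often the hexagon is visited. The counts are invented and the sketch ignores the Bayesian priors and the relative-entropy map alignment used by FootSLAM.

```python
# Toy conditional entropy of a hexagonal transition map; counts are invented.
import numpy as np

# transition counts per hexagon: rows = hexagons, columns = 6 outgoing edges
counts = np.array([[12, 3, 0, 0, 1, 0],
                   [ 2, 2, 2, 2, 2, 2],
                   [ 0, 0, 9, 0, 0, 0]], dtype=float)

visit_p = counts.sum(axis=1) / counts.sum()            # P(hexagon)
trans_p = counts / counts.sum(axis=1, keepdims=True)   # P(edge | hexagon)

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

map_entropy = np.sum(visit_p * np.array([entropy(row) for row in trans_p]))
print(f"conditional map entropy = {map_entropy:.3f} bits per transition")
```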

  18. Entropic Measure of Epistemic Uncertainties in Multibody System Models by Axiomatic Design

    Directory of Open Access Journals (Sweden)

    Francesco Villecco

    2017-06-01

    Full Text Available In this paper, the use of the MaxInf Principle in real optimization problems is investigated for engineering applications, where the current design solution is actually an engineering approximation. In industrial manufacturing, multibody system simulations can be used to develop new machines and mechanisms by using virtual prototyping, where an axiomatic design can be employed to analyze the independence of elements and the complexity of connections forming a general mechanical system. In the classic theories of Fisher and Wiener-Shannon, the idea of information is a measure of only probabilistic and repetitive events. However, the idea of information is broader than the field of probability alone. Thus, the Wiener-Shannon axioms can be extended to non-probabilistic events, and a theory of information for non-repetitive events can be introduced as a measure of the reliability of data for complex mechanical systems. To this end, one can devise engineering solutions consistent with the values of the design constraints by analyzing the complexity of the relation matrix and using the idea of information in the metric space. The final solution gives the entropic measure of epistemic uncertainties which can be used in multibody system models, analyzed with an axiomatic design.

  19. Uncertainties in assessing tillage erosion - how appropriate are our measuring techniques?

    Science.gov (United States)

    Fiener, Peter; Deumlich, Detlef; Gómez, José A.; Guzmán, Gema; Hardy, Robert; Jague, Emilien A.; Quinton, John; Sommer, Michael; van Oost, Kristof; Wexler, Robert; Wilken, Florian

    2017-04-01

    discrepancies between measurements based on different techniques. The latter introduces substantial uncertainties in any existing tillage erosion modelling approach.

  20. Simulation of images of CDMAM phantom and the estimation of measurement uncertainties of threshold gold thickness.

    Science.gov (United States)

    Mackenzie, Alistair; Eales, Timothy D; Dunn, Hannah L; Yip Braidley, Mary; Dance, David R; Young, Kenneth C

    2017-07-01

    To demonstrate a method of simulating mammography images of the CDMAM phantom and to investigate the coefficient of variation (CoV) in the threshold gold thickness (t_T) measurements associated with use of the phantom. The noise and sharpness of Hologic Dimensions and GE Essential mammography systems were characterized to provide data for the simulation. The simulation method was validated by comparing the t_T results of real and simulated images of the CDMAM phantom for three different doses and the two systems. The detection matrices produced from each of 64 images using CDCOM software were randomly resampled to create 512 sets of 8, 16 and 32 images to estimate the CoV of t_T. Sets of simulated images for a range of doses were used to estimate the CoVs for a range of diameters and threshold thicknesses. No significant differences were found for t_T or the CoV between real and simulated CDMAM images. It was shown that resampling from 256 images was required for estimating the CoV. The CoV was around 4% using 16 images for most of the phantom but over double that for details near the edge of the phantom. We have demonstrated a method to simulate images of the CDMAM phantom for different systems at a range of doses. We provide data for calculating uncertainties in t_T. Any future review of the European guidelines should take into consideration the calculated uncertainties for the 0.1 mm detail. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
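    The resampling idea can be imitated with synthetic numbers: draw many sets of 8, 16 or 32 per-image estimates from a pool and compute the coefficient of variation of the set-averaged threshold. The real analysis resamples CDCOM detection matrices rather than per-image thresholds, so the sketch below is only an analogy.

```python
# Bootstrap-style estimate of the CoV of a set-averaged threshold; values are synthetic.
import numpy as np

rng = np.random.default_rng(1)
per_image_tT = rng.normal(1.00, 0.15, size=64)   # synthetic per-image threshold estimates

for n_images in (8, 16, 32):
    sets = rng.choice(per_image_tT, size=(512, n_images), replace=True)
    tT_per_set = sets.mean(axis=1)
    cov = tT_per_set.std(ddof=1) / tT_per_set.mean()
    print(f"{n_images:2d} images per set -> CoV of t_T ~ {cov:.1%}")
```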

  1. Reducing uncertainties in nucleation rates: A comparison of measurements and simulations

    Energy Technology Data Exchange (ETDEWEB)

    Nadykto, A. B., E-mail: anadykto@gmail.com [Atmospheric Science Research Center, State University of New York at Albany, 251 Fuller Road, Albany, NY 12203 (United States); Department of Applied Mathematics, Moscow State University of Technology “STANKIN”, Vadkovsky per. 1, Moscow 127055 (Russian Federation); Nazarenko, K. M.; Markov, P. N. [Department of Applied Mathematics, Moscow State University of Technology “STANKIN”, Vadkovsky per. 1, Moscow 127055 (Russian Federation); Yu, F. [Atmospheric Science Research Center, State University of New York at Albany, 251 Fuller Road, Albany, NY 12203 (United States)

    2016-06-08

    Recently, large uncertainties in amine nucleation thermodynamics associated with the description of interactions of H2SO4, the key atmospheric nucleation precursor, with pre-nucleation clusters have been revealed. Here we investigate the formation of (H2SO4)2(H2O)n (n = 0-5) clusters via H2O-induced dimerization using the conventional RI-MP2 and PW91PW91 methods and the recently developed multistep BRIMP2 and B3RICC2 methods widely used in nucleation studies, and compare the obtained results with measurements of the equilibrium constants of H2O-induced dimerization. Variations in Kp and Gibbs free energies predicted by different methods were found to be unexpectedly large, several times larger than those in the hydration free energies commonly used to benchmark computational quantum methods. This means that the common hydration benchmarks are not fully representative of nucleation and that the validation of quantum methods to be recommended for use in nucleation studies is impossible without a thorough assessment of H2SO4-H2SO4 interactions. We show clearly that only the conventional RI-MP2 and PW91PW91 methods are consistent with experiments and that a thorough validation of theoretical predictions against experimental data on H2SO4 clustering is needed prior to recommending a quantum-chemical method for use in nucleation research. We also show that conclusions about the role of Amine-Enhanced Ternary Homogeneous Nucleation (ATHN) in atmospheric nucleation may be affected by the large uncertainties in nucleation thermodynamics associated with the application of the B3RICC2 and BRIMP2 methods and may need a thorough revision.

  2. Minimizing measurement uncertainties of coniferous needle-leaf optical properties, part II: experimental set-up and error analysis

    NARCIS (Netherlands)

    Yanez Rausell, L.; Malenovsky, Z.; Clevers, J.G.P.W.; Schaepman, M.E.

    2014-01-01

    We present uncertainties associated with the measurement of coniferous needle-leaf optical properties (OPs) with an integrating sphere using an optimized gap-fraction (GF) correction method, where GF refers to the air gaps appearing between the needles of a measured sample. We used an optically

  3. A Method for Dimensional and Surface Optical Measurements Uncertainty Assessment on Micro Structured Surfaces Manufactured by Jet-ECM

    DEFF Research Database (Denmark)

    Quagliotti, Danilo; Tosello, Guido; Islam, Aminul

    2015-01-01

    Surface texture and step height measurements of electrochemically machined cavities have been compared among optical and tactile instruments. A procedure is introduced for correcting possible divergences among the instruments and, ultimately, for evaluating the measurement uncertainty according to the ISO GUM.

  4. Measurement uncertainty from validation and duplicate analysis results in HPLC analysis of multivitamin preparations and nutrients with different galenic forms.

    Science.gov (United States)

    De Beer, J O; Baten, P; Nsengyumva, C; Smeyers-Verbeke, J

    2003-08-08

    An approach to calculate the measurement uncertainty in the HPLC analysis of several hydro- and liposoluble vitamins in multivitamin preparations with different galenic composition and properties is described. In the first instance it is examined whether duplicate analysis results, obtained with a fully validated analysis method on different lots of an effervescent tablet preparation spread over several points in time, can contribute to the calculation of the measurement uncertainty of the HPLC method used, and whether the established uncertainty is acceptable for assessing compliance with the legal content limits. Analysis of variance (ANOVA) and precision calculations based on the ISO 5725-2 norm are applied to the analysis results to estimate the precision components necessary to derive the measurement uncertainty. In the second instance it is demonstrated to what extent the fully validated method of analysis for effervescent tablets is applicable to other galenic forms, e.g. capsules with oily emulsions, tablets, coated tablets, oral solutions, etc., and which specific modifications of the analysis steps are involved. By means of duplicate analysis results, acquired from a large series of real samples over a considerable period of time and classified according to their similarity in content, galenic forms and matrices, estimates of the measurement uncertainty are presented.
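    For readers unfamiliar with the ISO 5725-2 decomposition, the sketch below shows how duplicate results obtained on different lots or occasions can be split into repeatability and between-lot components and recombined into an intermediate-precision estimate; the duplicate values are invented.

```python
# One-way ANOVA on duplicate results (ISO 5725-2 style); the data are invented.
import numpy as np

duplicates = np.array([   # one row per lot/occasion, two results each (mg/tablet)
    [49.8, 50.2], [50.5, 50.1], [49.6, 49.9], [50.8, 50.4], [50.0, 50.3]])

k, n = duplicates.shape                          # lots, replicates per lot
ms_within = duplicates.var(axis=1, ddof=1).mean()
ms_between = n * duplicates.mean(axis=1).var(ddof=1)

s_r = np.sqrt(ms_within)                                  # repeatability component
s_L = np.sqrt(max((ms_between - ms_within) / n, 0.0))     # between-lot component
s_I = np.sqrt(s_r**2 + s_L**2)                            # intermediate precision
print(f"s_r = {s_r:.3f}, s_L = {s_L:.3f}, intermediate precision = {s_I:.3f} mg/tablet")
```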

  5. Conversion factor and uncertainty estimation for quantification of towed gamma-ray detector measurements in Tohoku coastal waters

    Energy Technology Data Exchange (ETDEWEB)

    Ohnishi, S., E-mail: ohnishi@nmri.go.jp [National Maritime Research Institute, 6-38-1, Shinkawa, Mitaka, Tokyo 181-0004 (Japan); Thornton, B. [Institute of Industrial Science, The University of Tokyo, 4-6-1, Komaba, Meguro-ku, Tokyo 153-8505 (Japan); Kamada, S.; Hirao, Y.; Ura, T.; Odano, N. [National Maritime Research Institute, 6-38-1, Shinkawa, Mitaka, Tokyo 181-0004 (Japan)

    2016-05-21

    Factors to convert the count rate of a NaI(Tl) scintillation detector to the concentration of radioactive cesium in marine sediments are estimated for a towed gamma-ray detector system. The response of the detector to a unit concentration of radioactive cesium is calculated by Monte Carlo radiation transport simulation, considering the vertical profile of radioactive material measured in core samples. The conversion factors are obtained by integrating the contribution of each layer and are normalized by the concentration in the surface sediment layer. At the same time, the uncertainty of the conversion factors is formulated and estimated. The combined standard uncertainty of the radioactive cesium concentration obtained by the towed gamma-ray detector is around 25 percent. The values of uncertainty, often referred to as relative root mean square errors in other works, between sediment core sampling measurements and towed detector measurements were 16 percent in the investigation made near the Abukuma River mouth and 5.2 percent in Sendai Bay, respectively. Most of the uncertainty is due to the interpolation of the conversion factors between core samples and to the uncertainty of the detector's burial depth. The results of the towed measurements agree well with laboratory-analysed sediment samples. Also, the concentrations of radioactive cesium at the intersections of the survey lines are consistent. The consistency with the sampling results and between different lines' transects demonstrates the availability and reproducibility of the towed gamma-ray detector system.

  6. Measurement uncertainty associated with chromatic confocal profilometry for 3D surface texture characterization of natural human enamel.

    Science.gov (United States)

    Mullan, F; Bartlett, D; Austin, R S

    2017-06-01

    To investigate the measurement performance of a chromatic confocal profilometer for quantification of the surface texture of natural human enamel in vitro. Contributions to the measurement uncertainty from all potential sources of measurement error using a chromatic confocal profilometer and surface metrology software were quantified using a series of surface metrology calibration artifacts and pre-worn enamel samples. The 3D surface texture analysis protocol was optimized across 0.04 mm² of natural, unpolished enamel undergoing dietary acid erosion (pH 3.2, titratable acidity 41.3 mmol OH/L). Flatness deviations due to the x, y stage mechanical movement were the major contribution to the measurement uncertainty, with maximum Sz flatness errors of 0.49 μm, whereas measurement noise, non-linearities in x, y, z and enamel sample dimensional instability contributed minimal errors. The measurement errors were propagated into an uncertainty budget following a Type B uncertainty evaluation in order to calculate the combined standard uncertainty (uc), which was ±0.28 μm. Statistically significant increases in the median (IQR) roughness (Sa) of the polished samples occurred after 15 (+0.17 (0.13) μm), 30 (+0.12 (0.09) μm) and 45 (+0.18 (0.15) μm) min of erosion. The largest contribution to measurement uncertainty using chromatic confocal profilometry came from flatness deviations; however, by optimizing measurement protocols the profilometer successfully characterized surface texture changes in enamel from erosive wear in vitro. Copyright © 2017 The Academy of Dental Materials. All rights reserved.

  7. Reporting unit size and measurement uncertainty: current Australian practice in clinical chemistry and haematology.

    Science.gov (United States)

    Hawkins, Robert C; Badrick, Tony

    2015-08-01

    In this study we aimed to compare the reporting unit size used by Australian laboratories for routine chemistry and haematology tests to the unit size used by learned authorities and in standard laboratory textbooks and to the justified unit size based on measurement uncertainty (MU) estimates from quality assurance program data. MU was determined from Royal College of Pathologists of Australasia (RCPA) - Australasian Association of Clinical Biochemists (AACB) and RCPA Haematology Quality Assurance Program survey reports. The reporting unit size implicitly suggested in authoritative textbooks, the RCPA Manual, and the General Serum Chemistry program itself was noted. We also used published data on Australian laboratory practices. The best performing laboratories could justify their chemistry unit size for 55% of analytes while comparable figures for the 50% and 90% laboratories were 14% and 8%, respectively. Reporting unit size was justifiable for all laboratories for red cell count, >50% for haemoglobin but only the top 10% for haematocrit. Few, if any, could justify their mean cell volume (MCV) and mean cell haemoglobin concentration (MCHC) reporting unit sizes. The reporting unit size used by many laboratories is not justified by present analytical performance. Using MU estimates to determine the reporting interval for quantitative laboratory results ensures reporting practices match local analytical performance and recognises the inherent error of the measurement process.

  8. Interferometric measurement of a diffusion coefficient: comparison of two methods and uncertainty analysis

    Energy Technology Data Exchange (ETDEWEB)

    Riquelme, Rodrigo [Department of Mechanical and Metallurgical Engineering, Pontificia Universidad Catolica de Chile, Vicuna Mackenna 4860, Santiago (Chile); Lira, Ignacio [Department of Mechanical and Metallurgical Engineering, Pontificia Universidad Catolica de Chile, Vicuna Mackenna 4860, Santiago (Chile); Perez-Lopez, Carlos [Centro de Investigaciones en Optica (CIO), Leon, Gto 37150 (Mexico); Rayas, Juan A [Centro de Investigaciones en Optica (CIO), Leon, Gto 37150 (Mexico); RodrIguez-Vera, Ramon [Centro de Investigaciones en Optica (CIO), Leon, Gto 37150 (Mexico)

    2007-05-07

    Two methods to measure the diffusion coefficient of a species in a liquid by optical interferometry were compared. The methods were tested on a 1.75 M NaCl aqueous solution diffusing into water at 26 °C. Results were D = 1.587 × 10⁻⁹ m² s⁻¹ with the first method and D = 1.602 × 10⁻⁹ m² s⁻¹ with the second method. Monte Carlo simulation was used to assess the possible dispersion of these results. The standard uncertainties were found to be of the order of 0.05 × 10⁻⁹ m² s⁻¹ with both methods. We found that the value of the diffusion coefficient obtained by either method is very sensitive to the magnification of the optical system, and that if diffusion is slow the measurement of time does not need to be very accurate.

  9. Sparse Representation Based Frequency Detection and Uncertainty Reduction in Blade Tip Timing Measurement for Multi-Mode Blade Vibration Monitoring

    Science.gov (United States)

    Pan, Minghao; Yang, Yongmin; Guan, Fengjiao; Hu, Haifeng; Xu, Hailong

    2017-01-01

    The accurate monitoring of blade vibration under operating conditions is essential in turbo-machinery testing. Blade tip timing (BTT) is a promising non-contact technique for the measurement of blade vibrations. However, the BTT data are inherently under-sampled and contaminated with several measurement uncertainties. How to recover the frequency spectra of blade vibrations by processing these under-sampled, biased signals is a bottleneck problem. A novel method of BTT signal processing for alleviating measurement uncertainties in the recovery of multi-mode blade vibration frequency spectra is proposed in this paper. The method can be divided into four phases. First, a single measurement vector model is built by exploiting the fact that the blade vibration signals are sparse in the frequency domain. Secondly, the uniqueness of the nonnegative sparse solution is studied to achieve the vibration frequency spectrum. Thirdly, typical sources of BTT measurement uncertainties are quantitatively analyzed. Finally, an improved vibration frequency spectra recovery method is proposed to obtain a guaranteed level of sparse solution when measurement results are biased. Simulations and experiments are performed to prove the feasibility of the proposed method. The most outstanding advantage is that this method can prevent the recovered multi-mode vibration spectra from being affected by BTT measurement uncertainties without increasing the probe number. PMID:28758952
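    The core sparse-recovery step can be illustrated with a toy example: an irregularly under-sampled signal that is sparse in frequency is recovered with a nonnegative least-squares solver over a frequency dictionary. Real BTT processing must additionally model probe geometry and the measurement uncertainties analyzed in the paper; none of that is reproduced here.

```python
# Toy nonnegative sparse recovery of a frequency spectrum from under-sampled data.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(2)
t = np.sort(rng.uniform(0.0, 1.0, 40))           # irregular, under-sampled sample times
freqs = np.arange(1, 101)                        # candidate frequencies, Hz
D = np.cos(2 * np.pi * np.outer(t, freqs))       # dictionary of cosine atoms

true_amp = np.zeros(freqs.size)
true_amp[[17, 42]] = [1.0, 0.6]                  # two active vibration modes (assumed)
y = D @ true_amp + 0.02 * rng.standard_normal(t.size)

amp, _ = nnls(D, y)                              # nonnegative sparse recovery
top = np.argsort(amp)[-3:][::-1]
print("strongest recovered frequencies (Hz):", freqs[top], "amplitudes:", amp[top].round(2))
```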

  10. Towards a quantitative, measurement-based estimate of the uncertainty in photon mass attenuation coefficients at radiation therapy energies.

    Science.gov (United States)

    Ali, E S M; Spencer, B; McEwen, M R; Rogers, D W O

    2015-02-21

    In this study, a quantitative estimate is derived for the uncertainty in the XCOM photon mass attenuation coefficients in the energy range of interest to external beam radiation therapy, i.e. 100 keV (orthovoltage) to 25 MeV, using direct comparisons of experimental data against Monte Carlo models and theoretical XCOM data. Two independent datasets are used. The first dataset is from our recent transmission measurements and the corresponding EGSnrc calculations (Ali et al 2012 Med. Phys. 39 5990-6003) for 10-30 MV photon beams from the research linac at the National Research Council Canada. The attenuators are graphite and lead, with a total of 140 data points and an experimental uncertainty of ∼0.5% (k = 1). An optimum energy-independent cross section scaling factor that minimizes the discrepancies between measurements and calculations is used to deduce the cross section uncertainty. The second dataset is from the aggregate of cross section measurements in the literature for graphite and lead (49 experiments, 288 data points). The dataset is compared to the sum of the XCOM data plus the IAEA photonuclear data. Again, an optimum energy-independent cross section scaling factor is used to deduce the cross section uncertainty. Using the average result from the two datasets, the energy-independent cross section uncertainty estimate is 0.5% (68% confidence) and 0.7% (95% confidence). The potential for energy-dependent errors is discussed. The photon cross section uncertainty is shown to be smaller than the current qualitative 'envelope of uncertainty' of the order of 1-2%, as given by Hubbell (1999 Phys. Med. Biol. 44 R1-22).

  11. Measurement uncertainty assessment of magnesium trisilicate column for determination of Sudan colorants in food by HPLC using C8 column.

    Science.gov (United States)

    Chen, Ying; He, Chao; Cheng, Jing-Jun; Huang, Wen-Yao; Shao, Sheng-Wen; Jiang, Ya-Ping; Dai, Ling-Feng; Liu, Jia-Fa; Song, Yi

    2016-10-01

    This study aimed to conduct a measurement uncertainty assessment of a new method for the determination of Sudan colorants (Sudan I, II, III and IV) in food by high performance liquid chromatography (HPLC). Samples were extracted with organic solvents (hexane, 20% acetone) and first purified on magnesium trisilicate (2MgO·3SiO2). The Sudan colorants (Sudan I-IV) were then separated on a C8 column by gradient elution, using acetonitrile and 0.1% (v/v) aqueous formic acid as the mobile phases, and detected with a diode-array detector (DAD). The uncertainty model for Sudan I, II, III and IV is based on the EURACHEM guidelines, and the sources and components of uncertainty were calculated. The method gave a good linear relationship over the concentration range from 0.4 to 4.0 μg/mL, and spiked recoveries were from 74.0% to 97.5%. The limits of determination (LOD) were 48, 61, 36 and 58 μg/kg for the four analytes, respectively. The total uncertainties of the Sudan colorants (Sudan I, II, III and IV) were 810±30.8, 790±28.4, 750±27.0 and 730±50.0 μg/kg, respectively. The recovery uncertainty was the most significant contributor to the total uncertainty. The developed method is simple, rapid and highly sensitive, and can be used for the determination of trace Sudan dyes in food samples. The sources of uncertainty have been identified and the uncertainty components have been simplified and considered.

  12. Quantification of LiDAR measurement uncertainty through propagation of errors due to sensor sub-systems and terrain morphology

    Science.gov (United States)

    Goulden, T.; Hopkinson, C.

    2013-12-01

    The quantification of LiDAR sensor measurement uncertainty is important for evaluating the quality of derived DEM products, compiling risk assessments of management decisions based on LiDAR information, and enhancing LiDAR mission planning capabilities. Current quality assurance estimates of LiDAR measurement uncertainty are limited to post-survey empirical assessments or vendor estimates from commercial literature. Empirical evidence can provide valuable information on the performance of the sensor in validated areas; however, it cannot characterize the spatial distribution of measurement uncertainty throughout the extensive coverage of typical LiDAR surveys. Vendor-advertised error estimates are often restricted to strict and optimal survey conditions, resulting in idealized values. Numerical modeling of individual pulse uncertainty provides an alternative method for estimating LiDAR measurement uncertainty. LiDAR measurement uncertainty is theoretically assumed to fall into three distinct categories: 1) sensor sub-system errors, 2) terrain influences, and 3) vegetative influences. This research details the procedures for numerical modeling of measurement uncertainty from the sensor sub-system (GPS, IMU, laser scanner, laser ranger) and terrain influences. Results show that errors tend to increase as the laser scan angle, altitude or laser beam incidence angle increases. An experimental survey over a flat, paved runway site, performed with an Optech ALTM 3100 sensor, showed an increase in modeled vertical errors from 5 cm at a nadir scan orientation to 8 cm at the scan edges, for an aircraft altitude of 1200 m and a half scan angle of 15°. In a survey with the same sensor at a highly sloped glacial basin site devoid of vegetation, modeled vertical errors reached over 2 m. Validation of the error models within the glacial environment, over three separate flight lines, showed that 100%, 85%, and 75% of elevation residuals, respectively, fell below the error predictions. Future
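    A first-order flavour of such error modeling is sketched below for a deliberately simplified geometry, z = H - r·cos(theta), with generic (assumed) error magnitudes rather than the Optech ALTM 3100 specifications.

```python
# First-order propagation of range, scan-angle and GPS errors into vertical error
# for a simplified airborne laser geometry; all error magnitudes are assumed.
import numpy as np

H = 1200.0                              # flying height above ground, m
theta = np.radians([0.0, 7.5, 15.0])    # scan angle from nadir
r = H / np.cos(theta)                   # slant range over flat terrain

sigma_r     = 0.03                      # ranging uncertainty, m (assumed)
sigma_theta = np.radians(0.005)         # beam/attitude angular uncertainty (assumed)
sigma_gps_z = 0.05                      # GPS vertical uncertainty, m (assumed)

sigma_z = np.sqrt((np.cos(theta) * sigma_r)**2
                  + (r * np.sin(theta) * sigma_theta)**2
                  + sigma_gps_z**2)
for angle, s in zip(np.degrees(theta), sigma_z):
    print(f"scan angle {angle:4.1f} deg -> modeled vertical error {s * 100:.1f} cm")
```

    Even this crude model reproduces the qualitative behaviour reported above: the modeled vertical error grows from nadir towards the scan edges.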

  13. Cirrus Susceptibility to Changes in Ice Nuclei: Physical Processes, Model Uncertainties, and Measurement Needs

    Science.gov (United States)

    Jensen, Eric

    2018-01-01

    One of the proposed concepts for mitigating the warming effect of increasing greenhouse gases is seeding cirrus clouds with ice nuclei (IN) in order to reduce the lifetime and coverage of cold cirrus that have a net warming impact on the earth's surface. Global model simulations of the net impact of changing upper tropospheric IN have given widely disparate results, partly as a result of poor understanding of ice nucleation processes in the current atmosphere, and partly as a result of poor representation of these processes in global models. Here, we present detailed process-model simulations of tropical tropopause layer (TTL) transport and cirrus formation with ice nuclei properties based on recent laboratory nucleation experiments and field measurements of aerosol composition. The model is used to assess the sensitivity of TTL cirrus occurrence frequency and microphysical properties to the abundance and efficacy of ice nuclei. The simulated cloud properties are compared with recent high-altitude aircraft measurements of TTL cirrus and ice supersaturation. We find that abundant effective IN (either from glassy organic aerosols or crystalline ammonium sulfate with concentrations greater than about 100/L) prevent the occurrence of large ice concentrations and large ice supersaturations, both of which are clearly indicated by the in situ observations. We find that concentrations of effective ice nuclei larger than about 50/L can drive significant changes in cirrus microphysical properties and occurrence frequency. However, the cloud occurrence frequency can either increase or decrease, depending on the efficacy and abundance of IN added to the TTL. We suggest that our lack of information about ice nuclei properties in the current atmosphere, as well as uncertainties in ice nucleation processes and their representations in global models, preclude meaningful estimates of climate impacts associated with the addition of ice nuclei in the upper troposphere. We will briefly discuss

  14. Probabilistic measures of climate change vulnerability, adaptation action benefits, and related uncertainty from maximum temperature metric selection.

    Science.gov (United States)

    DeWeber, J Tyrell; Wagner, Tyler

    2018-02-22

    Predictions of the projected changes in species distribution models and potential adaptation action benefits can help guide conservation actions. There is substantial uncertainty in projecting species distributions into an unknown future, however, which can undermine confidence in predictions or misdirect conservation actions if not properly considered. Recent studies have shown that the selection of alternative climate metrics describing very different climatic aspects (e.g., mean air temperature vs. mean precipitation) can be a substantial source of projection uncertainty. It is unclear, however, how much projection uncertainty might stem from selecting among highly correlated, ecologically similar climate metrics (e.g., maximum temperature in July, maximum 30-day temperature) describing the same climatic aspect (e.g., maximum temperatures) that is known to limit a species' distribution. It is also unclear how projection uncertainty might propagate into predictions of the potential benefits of adaptation actions that might lessen climate change effects. We provide probabilistic measures of climate change vulnerability, adaptation action benefits, and related uncertainty stemming from the selection of four maximum temperature metrics for brook trout (Salvelinus fontinalis), a cold-water salmonid of conservation concern in the eastern U.S. Projected losses in suitable stream length varied by as much as 20% among alternative maximum temperature metrics for mid-century climate projections, which was similar to variation among three climate models. Similarly, the regional average predicted increase in brook trout occurrence probability under an adaptation action scenario of full riparian forest restoration varied by as much as 0.2 among metrics. Our use of Bayesian inference provides probabilistic measures of vulnerability and adaptation action benefits for individual stream reaches that properly address statistical uncertainty and can help guide conservation actions

  15. Carbon dioxide and methane measurements from the Los Angeles Megacity Carbon Project - Part 1: calibration, urban enhancements, and uncertainty estimates

    Science.gov (United States)

    Verhulst, Kristal R.; Karion, Anna; Kim, Jooil; Salameh, Peter K.; Keeling, Ralph F.; Newman, Sally; Miller, John; Sloop, Christopher; Pongetti, Thomas; Rao, Preeti; Wong, Clare; Hopkins, Francesca M.; Yadav, Vineet; Weiss, Ray F.; Duren, Riley M.; Miller, Charles E.

    2017-07-01

    We report continuous surface observations of carbon dioxide (CO2) and methane (CH4) from the Los Angeles (LA) Megacity Carbon Project during 2015. We devised a calibration strategy, methods for selection of background air masses, calculation of urban enhancements, and a detailed algorithm for estimating uncertainties in urban-scale CO2 and CH4 measurements. These methods are essential for understanding carbon fluxes from the LA megacity and other complex urban environments globally. We estimate background mole fractions entering LA using observations from four extra-urban sites including two marine sites located south of LA in La Jolla (LJO) and offshore on San Clemente Island (SCI), one continental site located in Victorville (VIC), in the high desert northeast of LA, and one continental/mid-troposphere site located on Mount Wilson (MWO) in the San Gabriel Mountains. We find that a local marine background can be established to within ˜ 1 ppm CO2 and ˜ 10 ppb CH4 using these local measurement sites. Overall, atmospheric carbon dioxide and methane levels are highly variable across Los Angeles. Urban and suburban sites show moderate to large CO2 and CH4 enhancements relative to a marine background estimate. The USC (University of Southern California) site near downtown LA exhibits median hourly enhancements of ˜ 20 ppm CO2 and ˜ 150 ppb CH4 during 2015 as well as ˜ 15 ppm CO2 and ˜ 80 ppb CH4 during mid-afternoon hours (12:00-16:00 LT, local time), which is the typical period of focus for flux inversions. The estimated measurement uncertainty is typically better than 0.1 ppm CO2 and 1 ppb CH4 based on the repeated standard gas measurements from the LA sites during the last 2 years, similar to Andrews et al. (2014). The largest component of the measurement uncertainty is due to the single-point calibration method; however, the uncertainty in the background mole fraction is much larger than the measurement uncertainty. The background uncertainty for the marine

  16. The uncertainty in the radon hazard classification of areas as a function of the number of measurements.

    Science.gov (United States)

    Friedmann, H; Baumgartner, A; Gruber, V; Kaineder, H; Maringer, F J; Ringer, W; Seidel, C

    2017-07-01

    The administrations in many countries demand a classification of areas according to their radon risk, taking into account the requirements of the EU Basic Safety Standards. The wide variation of indoor radon concentrations within an area, caused by differences in house construction, living style and geological situation, introduces large uncertainties into any classification scheme. Therefore, it is of importance to estimate the size of the experimental coefficient of variation (relative standard deviation) of the parameter which is used to classify an area. Besides the measurement period, it is the number of measurements which strongly influences this uncertainty, and it is important to find a compromise between the economic possibilities and the required confidence level. Some countries do not use pure measurement results for the classification of areas but use derived quantities, usually called the radon potential, which should reduce the influence of house construction, living style, etc. and should rather represent the geological situation of an area. Here, indoor radon measurements in nearly all homes in three municipalities and their conversion into a radon potential were used to determine the uncertainty of the mean radon potential of an area as a function of the number of investigated homes. It could be shown that the coefficient of variation scales like 1/√n, with n the number of measured dwellings. The question of how to deal with uncertainties when using a classification scheme for the radon risk is discussed and a general procedure is proposed. Copyright © 2016 Elsevier Ltd. All rights reserved.
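    The 1/√n behaviour is easy to verify numerically; the sketch below draws samples of dwellings from a synthetic (lognormal) radon-potential distribution, chosen purely to illustrate the scaling, and shows that the coefficient of variation of the area mean shrinks as 1/√n.

```python
# Numerical check that the CV of an area mean scales as 1/sqrt(n); data are synthetic.
import numpy as np

rng = np.random.default_rng(3)
population = rng.lognormal(mean=4.0, sigma=0.8, size=100_000)  # synthetic dwelling values

for n in (10, 40, 160, 640):
    means = rng.choice(population, size=(2000, n)).mean(axis=1)
    cv = means.std(ddof=1) / means.mean()
    print(f"n = {n:4d}: CV of area mean = {cv:.3f}   (sqrt(n) * CV = {cv * np.sqrt(n):.2f})")
```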

  17. Flow Rates Measurement and Uncertainty Analysis in Multiple-Zone Water-Injection Wells from Fluid Temperature Profiles.

    Science.gov (United States)

    Reges, José E O; Salazar, A O; Maitelli, Carla W S P; Carvalho, Lucas G; Britto, Ursula J B

    2016-07-13

    This work is a contribution to the development of flow sensors in the oil and gas industry. It presents a methodology to measure the flow rates into multiple-zone water-injection wells from fluid temperature profiles and estimate the measurement uncertainty. First, a method to iteratively calculate the zonal flow rates using the Ramey (exponential) model was described. Next, this model was linearized to perform an uncertainty analysis. Then, a computer program to calculate the injected flow rates from experimental temperature profiles was developed. In the experimental part, a fluid temperature profile from a dual-zone water-injection well located in the Northeast Brazilian region was collected. Thus, calculated and measured flow rates were compared. The results proved that linearization error is negligible for practical purposes and the relative uncertainty increases as the flow rate decreases. The calculated values from both the Ramey and linear models were very close to the measured flow rates, presenting a difference of only 4.58 m³/d and 2.38 m³/d, respectively. Finally, the measurement uncertainties from the Ramey and linear models were equal to 1.22% and 1.40% (for injection zone 1); 10.47% and 9.88% (for injection zone 2). Therefore, the methodology was successfully validated and all objectives of this work were achieved.

  18. Flow Rates Measurement and Uncertainty Analysis in Multiple-Zone Water-Injection Wells from Fluid Temperature Profiles

    Directory of Open Access Journals (Sweden)

    José E. O. Reges

    2016-07-01

    Full Text Available This work is a contribution to the development of flow sensors in the oil and gas industry. It presents a methodology to measure the flow rates into multiple-zone water-injection wells from fluid temperature profiles and estimate the measurement uncertainty. First, a method to iteratively calculate the zonal flow rates using the Ramey (exponential) model was described. Next, this model was linearized to perform an uncertainty analysis. Then, a computer program to calculate the injected flow rates from experimental temperature profiles was developed. In the experimental part, a fluid temperature profile from a dual-zone water-injection well located in the Northeast Brazilian region was collected. Thus, calculated and measured flow rates were compared. The results proved that linearization error is negligible for practical purposes and the relative uncertainty increases as the flow rate decreases. The calculated values from both the Ramey and linear models were very close to the measured flow rates, presenting a difference of only 4.58 m³/d and 2.38 m³/d, respectively. Finally, the measurement uncertainties from the Ramey and linear models were equal to 1.22% and 1.40% (for injection zone 1); 10.47% and 9.88% (for injection zone 2). Therefore, the methodology was successfully validated and all objectives of this work were achieved.
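    A heavily simplified sketch of the exponential (Ramey-type) temperature model is given below: the injected-fluid temperature relaxes toward the geothermal gradient with a relaxation distance A that grows with the injection rate. Here A is merely fitted from a synthetic profile; converting A into a zonal flow rate requires the well-completion and formation thermal parameters used in the paper, which are omitted.

```python
# Fit of a simplified Ramey-type temperature profile; all values are synthetic.
import numpy as np
from scipy.optimize import curve_fit

a, b, T_in = 0.03, 25.0, 32.0        # geothermal gradient (degC/m), surface temp, injection temp

def ramey_T(z, A):
    """Injected-fluid temperature vs depth for relaxation distance A (Ramey-type form)."""
    return a * z + b - a * A + (T_in - b + a * A) * np.exp(-z / A)

z = np.linspace(0.0, 800.0, 41)                            # depth, m
rng = np.random.default_rng(4)
T_obs = ramey_T(z, 350.0) + rng.normal(0.0, 0.05, z.size)  # synthetic measured profile

(A_fit,), cov = curve_fit(ramey_T, z, T_obs, p0=[200.0])
print(f"fitted relaxation distance A = {A_fit:.1f} m "
      f"(standard uncertainty {np.sqrt(cov[0, 0]):.1f} m)")
```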

  19. Effects of Heterogeneities, Sampling Frequencies, Tools and Methods on Uncertainties in Subsurface Contaminant Concentration Measurements

    Science.gov (United States)

    Ezzedine, S. M.; McNab, W. W.

    2007-12-01

    uncertainties in the concentration measurements. Finally, the models and results were abstracted using a simple mixed-tank approach to further simplify the models and make them more accessible to field hydrogeologists. During the abstraction process a novel method was developed for mapping streamlines in the fractures as well within the monitoring well to illustrate mixing and mixing zones. Applications will be demonstrated for both sampling in porous and fractured media. This work was performed under the auspices of the U.S. Department of Energy by University of California Lawrence Livermore National Laboratory under contract No. W-7405-Eng-48.

  20. Neutrosophic Tangent Similarity Measure and Its Application to Multiple Attribute Decision Making

    Directory of Open Access Journals (Sweden)

    Kalyan Mondal

    2015-09-01

    Full Text Available In this paper, the tangent similarity measure of neutrosophic sets is proposed and its properties are studied. This tangent similarity measure of single-valued neutrosophic sets is a tool parallel to the improved cosine similarity measure of single-valued neutrosophic sets. Finally, using this tangent similarity measure of single-valued neutrosophic sets, two applications, namely the selection of an educational stream and medical diagnosis, are presented.
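    A small numerical sketch is given below; the functional form, with the tangent taken of the component-wise differences of the truth, indeterminacy and falsity memberships scaled by π/12, is assumed here as one published variant of a tangent similarity measure, and the example sets are invented.

```python
# Sketch of a tangent similarity between single-valued neutrosophic sets.
# The (pi/12)-scaled form is an assumption of this illustration.
import math

def tangent_similarity(A, B):
    """A, B: lists of (truth, indeterminacy, falsity) triples over the same universe."""
    total = 0.0
    for (ta, ia, fa), (tb, ib, fb) in zip(A, B):
        total += math.tan(math.pi / 12 * (abs(ta - tb) + abs(ia - ib) + abs(fa - fb)))
    return 1.0 - total / len(A)

A = [(0.8, 0.1, 0.1), (0.6, 0.3, 0.2)]   # invented neutrosophic evaluations
B = [(0.7, 0.2, 0.1), (0.5, 0.3, 0.3)]
print(f"tangent similarity = {tangent_similarity(A, B):.3f}")   # 1.0 means identical sets
```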

  1. Fugitive Emissions Attribution via Simultaneous Measurement of Ethane and Methane Isotopic Signature in Vehicle-based Surveys

    Science.gov (United States)

    Marshall, A. D.; Williams, J. P.; Baillie, J.; MacKay, K.; Risk, D. A.; Fleck, D.

    2016-12-01

    Detecting and attributing sub-regulatory fugitive emissions in the energy sector remains a priority for industry and environmental groups alike. Vehicle-based geochemical emission detection and attribution is seeing increasingly widespread use. In order to distinguish between biogenic and thermogenic emission sources, these techniques rely on tracer species like δ13C of methane (δ13CH4). In this study, we assessed the performance of the new Picarro G2210-i, a cavity ring-down spectroscopy (CRDS) analyzer that measures δ13CH4 and ethane (C2H6) simultaneously to provide increased thermogenic tracer power. In the lab, we assessed drift and other performance characteristics relative to a G2201-i (existing isotopic CH4 and carbon dioxide analyzer). We performed model experiments to synthetically assess the new analyzer's utility for oil and gas developments with differing levels of ethane. Lastly, we also conducted survey drives in a high-ethane oilfield using both the G2210-i and G2201-i. Results were very positive. The G2210-i showed minimal drift, as expected. Allan deviation experiments showed that the G2210-i has a precision of 0.482 ppb for CH4 and 3.15 ppb for C2H6 for 1 Hz measurements. Computational experiments confirmed that the resolution of C2H6 is sufficient for detecting and attributing thermogenic CH4 at distance in oil and gas settings, which was further validated in the field where we measured simultaneous departures in δ13CH4 and C2H6 within plumes from venting infrastructure. C2:C1 ratios also proved very useful for attribution. As we move to reduce emissions from the energy industry, this analyzer presents new analytical possibilities that will be of high value to industry stakeholders.
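    The Allan-deviation check mentioned above can be reproduced generically: for a 1 Hz concentration time series, the non-overlapping Allan deviation is computed for several averaging times. The synthetic series below (white noise plus a slow drift) stands in for real analyzer data.

```python
# Non-overlapping Allan deviation of a 1 Hz concentration time series; data are synthetic.
import numpy as np

def allan_deviation(y, taus, rate_hz=1.0):
    out = []
    for tau in taus:
        m = int(tau * rate_hz)                   # samples per averaging bin
        nbins = len(y) // m
        bins = y[:nbins * m].reshape(nbins, m).mean(axis=1)
        out.append(np.sqrt(0.5 * np.mean(np.diff(bins) ** 2)))
    return np.array(out)

rng = np.random.default_rng(5)
t = np.arange(36_000)                            # 10 h at 1 Hz
series = 1.90 + 0.003 * rng.standard_normal(t.size) + 2e-6 * t   # ppm CH4, synthetic

taus = [1, 10, 100, 1000]
for tau, ad in zip(taus, allan_deviation(series, taus)):
    print(f"tau = {tau:5d} s -> Allan deviation = {ad * 1000:.3f} ppb")
```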

  2. Identification, summary and comparison of tools used to measure organizational attributes associated with chronic disease management within primary care settings.

    Science.gov (United States)

    Lukewich, Julia; Corbin, Renée; VanDenKerkhof, Elizabeth G; Edge, Dana S; Williamson, Tyler; Tranmer, Joan E

    2014-12-01

    Given the increasing emphasis being placed on managing patients with chronic diseases within primary care, there is a need to better understand which primary care organizational attributes affect the quality of care that patients with chronic diseases receive. This study aimed to identify, summarize and compare data collection tools that describe and measure organizational attributes used within the primary care setting worldwide. Systematic search and review methodology consisting of a comprehensive and exhaustive search that is based on a broad question to identify the best available evidence was employed. A total of 30 organizational attribute data collection tools that have been used within the primary care setting were identified. The tools varied with respect to overall focus and level of organizational detail captured, theoretical foundations, administration and completion methods, types of questions asked, and the extent to which psychometric property testing had been performed. The tools utilized within the Quality and Costs of Primary Care in Europe study and the Canadian Primary Health Care Practice-Based Surveys were the most recently developed tools. Furthermore, of the 30 tools reviewed, the Canadian Primary Health Care Practice-Based Surveys collected the most information on organizational attributes. There is a need to collect primary care organizational attribute information at a national level to better understand factors affecting the quality of chronic disease prevention and management across a given country. The data collection tools identified in this review can be used to establish data collection strategies to collect this important information. © 2014 The Authors. Journal of Evaluation in Clinical Practice published by John Wiley & Sons, Ltd.

  3. Identification, summary and comparison of tools used to measure organizational attributes associated with chronic disease management within primary care settings

    Science.gov (United States)

    Lukewich, Julia; Corbin, Renée; VanDenKerkhof, Elizabeth G; Edge, Dana S; Williamson, Tyler; Tranmer, Joan E

    2014-01-01

    Rationale, aims and objectives: Given the increasing emphasis being placed on managing patients with chronic diseases within primary care, there is a need to better understand which primary care organizational attributes affect the quality of care that patients with chronic diseases receive. This study aimed to identify, summarize and compare data collection tools that describe and measure organizational attributes used within the primary care setting worldwide. Methods: Systematic search and review methodology consisting of a comprehensive and exhaustive search that is based on a broad question to identify the best available evidence was employed. Results: A total of 30 organizational attribute data collection tools that have been used within the primary care setting were identified. The tools varied with respect to overall focus and level of organizational detail captured, theoretical foundations, administration and completion methods, types of questions asked, and the extent to which psychometric property testing had been performed. The tools utilized within the Quality and Costs of Primary Care in Europe study and the Canadian Primary Health Care Practice-Based Surveys were the most recently developed tools. Furthermore, of the 30 tools reviewed, the Canadian Primary Health Care Practice-Based Surveys collected the most information on organizational attributes. Conclusions: There is a need to collect primary care organizational attribute information at a national level to better understand factors affecting the quality of chronic disease prevention and management across a given country. The data collection tools identified in this review can be used to establish data collection strategies to collect this important information. PMID:24840066

  4. A systematic approach to assessing measurement uncertainty for CO2 emissions from coal-fired power plants

    DEFF Research Database (Denmark)

    Wagner, Claas; Esbensen, Kim

    2011-01-01

    An augmented measurement uncertainty approach for CO2 emissions from coal-fired power plants, with a focus on the often forgotten contributions from sampling errors occurring over the entire fuel-to-emission pathway, is presented. Current methods for CO2 emission determination are evaluated in detail, from which a general matrix scheme is developed that includes all factors and stages needed for total CO2 determination; this scheme is applied to the monitoring plan of a representative medium-sized coal-fired power plant. In particular, sampling involved significant potential errors, as identified... Based on extensive empirical sampling experiments, a fully comprehensive uncertainty estimate procedure has been devised. Even though uncertainties increased (indeed one particular factor, the so-called “emission factor”, is substantially higher), the revised CO2 emission budget for the case plant complies...

  5. Towards a standardized processing of Net Ecosystem Exchange measured with eddy covariance technique: algorithms and uncertainty estimation

    Directory of Open Access Journals (Sweden)

    D. Papale

    2006-01-01

    Full Text Available The eddy covariance technique for measuring CO2, water and energy fluxes between the biosphere and the atmosphere is widespread and used in various regional networks. Currently more than 250 eddy covariance sites are active around the world, measuring carbon exchange at high temporal resolution for different biomes and climatic conditions. In this paper a new standardized set of corrections is introduced and the uncertainties associated with these corrections are assessed for eight different forest sites in Europe with a total of 12 yearly datasets. The uncertainties introduced in the two components GPP (Gross Primary Production) and TER (Terrestrial Ecosystem Respiration) are also discussed and a quantitative analysis presented. Through a factorial analysis we find that, generally, uncertainties from different corrections are additive without interactions and that the heuristic u*-correction introduces the largest uncertainty. The results show that a standardized data processing is needed for an effective comparison across biomes and for underpinning inter-annual variability. The methodology presented in this paper has also been integrated in the European database of the eddy covariance measurements.

  6. Quantifying Urban Natural Gas Leaks from Street-level Methane Mapping: Measurements and Uncertainty

    Science.gov (United States)

    von Fischer, J. C.; Ham, J. M.; Griebenow, C.; Schumacher, R. S.; Salo, J.

    2013-12-01

    Leaks from the natural gas pipeline system are a significant source of anthropogenic methane in urban settings. Detecting and repairing these leaks will reduce the energy and carbon footprints of our cities. Gas leaks can be detected from spikes in street-level methane concentrations measured by analyzers deployed on vehicles. While a spike in methane concentration indicates a leak, an algorithm (e.g., inverse model) must be used to estimate the size of the leak (i.e., flux) from concentration data and supporting meteorological information. Unfortunately, this drive-by approach to leak quantification is confounded by the complexity of urban roughness, changing weather conditions, and other incidental factors (e.g., traffic, vehicle speed, etc.). Furthermore, the vehicle might only pass through the plume one to three times during routine mapping. The objective of this study was to conduct controlled release experiments to better quantify the relationship between mobile methane concentration measurements and the size and location of the emission source (e.g., pipeline leakage) in an urban environment. A portable system was developed that could release methane at known rates between 10 and 40 LPM while maintaining concentrations below the lower explosive limit. A mapping vehicle was configured with fast response methane analyzers, GPS, and meteorological instruments. Portable air-sampling tripods were fabricated that could be deployed at defined distances downwind from the release point and automatically-triggered to collect grab samples. The experimental protocol was as follows: (1) identify an appropriate release point within a city, (2) release methane at a known rate, (3) measure downwind street-level concentrations with the vehicle by making multiple passes through the plume, and (4) collect supporting concentration and meteorological data with the static tripod samplers deployed in the plume. Controlled release studies were performed at multiple locations and

  7. Dead time effect on the Brewer measurements: correction and estimated uncertainties

    Science.gov (United States)

    Fountoulakis, Ilias; Redondas, Alberto; Bais, Alkiviadis F.; José Rodriguez-Franco, Juan; Fragkos, Konstantinos; Cede, Alexander

    2016-04-01

    Brewer spectrophotometers are widely used instruments which perform spectral measurements of the direct, the scattered and the global solar UV irradiance. By processing these measurements a variety of secondary products can be derived, such as the total columns of ozone (TOC), sulfur dioxide and nitrogen dioxide, and aerosol optical properties. Estimating and limiting the uncertainties of the final products is of critical importance. High-quality data have a lot of applications and can provide accurate estimations of trends. The dead time is specific for each instrument and improper correction of the raw data for its effect may lead to important errors in the final products. The dead time value may change with time and, with the currently used methodology, it cannot always be determined accurately. For specific cases, such as for low ozone slant columns and high intensities of the direct solar irradiance, the error in the retrieved TOC, due to a 10 ns change in the dead time from its value in use, is found to be up to 5 %. The error in the calculation of UV irradiance can be as high as 12 % near the maximum operational limit of light intensities. While in the existing documentation it is indicated that the dead time effects are important when the error in the used value is greater than 2 ns, we found that for single-monochromator Brewers a 2 ns error in the dead time may lead to errors above the limit of 1 % in the calculation of TOC; thus the tolerance limit should be lowered. A new routine for the determination of the dead time from direct solar irradiance measurements has been created and tested and a validation of the operational algorithm has been performed. Additionally, new methods for the estimation and the validation of the dead time have been developed and are analytically described. Therefore, the present study, in addition to highlighting the importance of the dead time for the processing of Brewer data sets, also provides useful information for their
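    For orientation, the sketch below applies a paralyzable (extending) dead-time model of the kind commonly used for photon-counting rates, F_obs = F_true·exp(-F_true·τ), inverted by fixed-point iteration; τ and the rates are example values, not the calibration constants of any particular Brewer.

```python
# Paralyzable dead-time correction by fixed-point iteration; values are examples only.
import math

def correct_dead_time(f_obs, tau, n_iter=10):
    """Return the true count rate for an observed rate f_obs (counts/s)."""
    f_true = f_obs
    for _ in range(n_iter):
        f_true = f_obs * math.exp(f_true * tau)
    return f_true

tau = 32e-9                         # dead time, s (example value)
for f_obs in (1e5, 1e6, 5e6):
    f_true = correct_dead_time(f_obs, tau)
    print(f"observed {f_obs:9.0f} c/s -> corrected {f_true:11.0f} c/s "
          f"({(f_true - f_obs) / f_obs:.1%} correction)")
```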

  8. arXiv A method and tool for combining differential or inclusive measurements obtained with simultaneously constrained uncertainties

    CERN Document Server

    Kieseler, Jan

    A method is discussed that allows combining sets of differential or inclusive measurements. It is assumed that at least one measurement was obtained with a simultaneous fit of a set of nuisance parameters, representing sources of systematic uncertainties. As a result of beneficial constraints from the data, all such fitted parameters are correlated with each other. The best approach for a combination of these measurements would be the maximisation of a combined likelihood, for which the full fit model of each measurement and the original data are required. However, only in rare cases is this information publicly available. In the absence of this information, most commonly used combination methods are not able to account for these correlations between uncertainties, which can lead to severe biases as shown in this article. The method discussed here provides a solution for this problem. It relies on the public result and its covariance or Hessian only, and is validated against the combined-likelihood approach. A d...

  9. arXiv A method and tool for combining differential or inclusive measurements obtained with simultaneously constrained uncertainties

    CERN Document Server

    Kieseler, Jan

    2017-11-22

    A method is discussed that allows combining sets of differential or inclusive measurements. It is assumed that at least one measurement was obtained with a simultaneous fit of a set of nuisance parameters, representing sources of systematic uncertainties. As a result of beneficial constraints from the data, all such fitted parameters are correlated with each other. The best approach for a combination of these measurements would be the maximization of a combined likelihood, for which the full fit model of each measurement and the original data are required. However, only in rare cases is this information publicly available. In the absence of this information, most commonly used combination methods are not able to account for these correlations between uncertainties, which can lead to severe biases as shown in this article. The method discussed here provides a solution for this problem. It relies on the public result and its covariance or Hessian only, and is validated against the combined-likelihood approach. A d...
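    As a point of reference, the textbook best-linear-unbiased-estimate (BLUE) combination of two measurements with a known covariance is sketched below with invented numbers; the article's method goes beyond this by reconstructing the effect of the simultaneously fitted nuisance parameters, which the sketch does not attempt.

```python
# Textbook BLUE combination of two correlated measurements; numbers are invented.
import numpy as np

x = np.array([172.5, 173.1])                       # two measured values of one quantity
cov = np.array([[0.70**2, 0.5 * 0.70 * 0.60],      # covariance with correlation 0.5
                [0.5 * 0.70 * 0.60, 0.60**2]])

ones = np.ones_like(x)
w = np.linalg.solve(cov, ones)
w /= ones @ w                                      # BLUE weights (sum to 1)
combined = w @ x
u_combined = np.sqrt(w @ cov @ w)
print(f"combined value = {combined:.2f} +/- {u_combined:.2f}")
```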

  10. Evaluation of new flux attribution methods for mapping N2O emissions at the landscape scale from EC measurements

    Science.gov (United States)

    Grossel, Agnes; Bureau, Jordan; Loubet, Benjamin; Laville, Patricia; Massad, Raia; Haas, Edwin; Butterbach-Bahl, Klaus; Guimbaud, Christophe; Hénault, Catherine

    2017-04-01

    The objective of this study was to develop and evaluate an attribution method based on a combination of Eddy Covariance (EC) and chamber measurements to map N2O emissions over a 3-km2 area of croplands and forests in France. During two months of spring 2015, N2O fluxes were measured (i) by EC at 15 m height and (ii) punctually with a mobile chamber at 16 places within 1 km of the EC mast. The attribution method was based on coupling the EC measurements, information on footprints (Loubet et al., 2010) and emission ratios for the different crops and fertilizations, calculated from the chamber measurements. The results were evaluated against an independent flux dataset measured by automatic chambers in a wheat field within the area. At the landscape scale, the method estimated a total emission of 114-271 kg N-N2O during the campaign. This new approach allows continuous estimation of N2O emissions and better accounting of their spatial variability at the landscape scale.

  11. Entropic uncertainty for spin-1/2 XXX chains in the presence of inhomogeneous magnetic fields and its steering via weak measurement reversals

    Science.gov (United States)

    Wang, Dong; Ming, Fei; Huang, Ai-Jun; Sun, Wen-Yang; Ye, Liu

    2017-09-01

    The uncertainty principle sets a lower bound on the measurement precision for a pair of non-commuting observables, and hence is of considerable importance for quantum precision measurement in the field of quantum information theory. In this letter, we consider the entropic uncertainty relation (EUR) in the context of quantum memory in a two-qubit isotropic Heisenberg spin chain. Specifically, we explore the dynamics of the EUR in a practical scenario, where two associated nodes of a one-dimensional XXX spin chain, under an inhomogeneous magnetic field, are linked by thermal entanglement. We show that temperature and magnetic-field effects can inflate the measurement uncertainty, owing to the reduction of the system's quantum correlation. Notably, we reveal that, firstly, the uncertainty is not fully determined by the observed quantum correlation of the system; secondly, the dynamical behaviours of the measurement uncertainty are distinct for ferromagnetic and antiferromagnetic chains. Meanwhile, we deduce that the measurement uncertainty is strongly correlated with the mixedness of the system, implying that smaller mixedness tends to reduce the uncertainty. Furthermore, we propose an effective strategy to control the uncertainty of interest by means of quantum weak measurement reversal. Therefore, our work may shed light on the dynamics of the measurement uncertainty in the Heisenberg spin chain, and thus be important to quantum precision measurement in various solid-state systems.
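    For reference, the memory-assisted entropic uncertainty relation that analyses of this kind typically start from (Berta et al. 2010) reads, in standard notation,

        S(Q|B) + S(R|B) \geq \log_2 \frac{1}{c} + S(A|B),
        \qquad
        c = \max_{i,j} \left| \langle \psi_i | \phi_j \rangle \right|^2,

    where Q and R are the incompatible observables measured on system A, B is the quantum memory, and S(A|B) can become negative for entangled states, thereby lowering the bound.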

  12. Estimation of uncertainty in measurement of alkalinity using the GTC 51 guide

    OpenAIRE

    Alzate Rodríguez, Edwin Jhovany

    2008-01-01

    This document gives guidance for the estimation of uncertainty in the analysis of alkalinity in water, based on the approach taken in the ISO “Guide to the Expression of Uncertainty in Measurement” (GTC 51).
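    The core GTC 51/GUM recipe applied in such a guide is the law of propagation of uncertainty for a result y = f(x_1, ..., x_n) with uncorrelated inputs, followed by expansion with a coverage factor:

        u_c(y) = \sqrt{ \sum_{i=1}^{n} \left( \frac{\partial f}{\partial x_i} \right)^2 u^2(x_i) },
        \qquad
        U = k \, u_c(y) \quad (k = 2 \text{ for approximately } 95\% \text{ coverage}).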

  13. Investigation of systematic uncertainties on the measurement of the top-quark mass using lepton transverse momenta

    CERN Document Server

    The ATLAS collaboration

    2018-01-01

    This study investigates the impact of systematic uncertainties on a top-quark mass ($m_\\text{top}$) measurement in the lepton+jets channel with the ATLAS experiment at the LHC. For the study, simulated $t\\bar{t}$ events with lepton+jets final states at a centre of mass energy of 8 TeV are used. In contrast to other analyses, this study is designed to exploit the dependence of the lepton kinematics on the top-quark mass, by parameterising the lepton's transverse momentum distribution with MC simulations. Due to its different systematic uncertainty, this method can potentially contribute to a more accurate measurement of $m_\\text{top}$. The overall uncertainty in this study is 2.3 GeV, dominated by the current uncertainty on initial and final state radiation. Since the result depends on the modelling of the top-quark transverse momentum, it is sensitive to higher order QCD corrections. The influence of such corrections is estimated by reweighting the next-to-leading-order MC prediction by next-to-next-to-leadin...

  14. Reliable and valid NEWS for Chinese seniors: measuring perceived neighborhood attributes related to walking

    Directory of Open Access Journals (Sweden)

    Lee Lok-chun

    2010-11-01

    Background The effects of the built environment on walking in seniors have not been studied in an Asian context. To examine these effects, valid and reliable measures are needed. The aim of this study was to develop and validate a questionnaire of perceived neighborhood characteristics related to walking appropriate for Chinese seniors (Neighborhood Environment Walkability Scale for Chinese Seniors, NEWS-CS). It was based on the Neighborhood Environment Walkability Scale - Abbreviated (NEWS-A), a validated measure of perceived built environment developed in the USA for adults. A secondary study aim was to establish the generalizability of the NEWS-A to an Asian high-density urban context and a different age group. Methods A multidisciplinary panel of experts adapted the original NEWS-A to reflect the built environment of Hong Kong and the needs of seniors. The translated instrument was pre-tested on a sample of 50 Chinese-speaking senior residents (65+ years). The final version of the NEWS-CS was interviewer-administered to 484 seniors residing in four selected Hong Kong districts varying in walkability and socio-economic status. Ninety-two participants completed the questionnaire on two separate occasions, 2-3 weeks apart. Test-retest reliability indices were estimated for each item and subscale of the NEWS-CS. Confirmatory factor analysis was used to develop the measurement model of the NEWS-CS and cross-validate that of the NEWS-A. Results The final version of the NEWS-CS consisted of 14 subscales and four single items (76 items). Test-retest reliability was moderate to good (ICC > 0.50 or % agreement > 60%) except for four items measuring distance to destinations. The originally-proposed measurement models of the NEWS-A and NEWS-CS required 2-3 theoretically-justifiable modifications to fit the data well. Conclusions The NEWS-CS possesses sufficient levels of reliability and factorial validity to be used for measuring perceived neighborhood

  15. Analysis of Uncertainties in Protection Heater Delay Time Measurements and Simulations in Nb$_{3}$Sn High-Field Accelerator Magnets

    CERN Document Server

    Salmi, Tiina; Marchevsky, Maxim; Bajas, Hugo; Felice, Helene; Stenvall, Antti

    2015-01-01

    The quench protection of superconducting high-field accelerator magnets is presently based on protection heaters, which are activated upon quench detection to accelerate the quench propagation within the winding. Estimations of the heater delay to initiate a normal zone in the coil are essential for the protection design. During the development of Nb3Sn magnets for the LHC luminosity upgrade, protection heater delays have been measured in several experiments, and a new computational tool CoHDA (Code for Heater Delay Analysis) has been developed for heater design. Several computational quench analyses suggest that the efficiency of the present heater technology is on the borderline of protecting the magnets. Quantifying the inevitable uncertainties related to the measured and simulated delays is therefore of pivotal importance. In this paper, we analyze the uncertainties in the heater delay measurements and simulations using data from five impregnated high-field Nb3Sn magnets with different heater geometries. ...

  16. Determination of illicit drugs in seized materials: role of sampling and analysis in estimation of measurement uncertainty.

    Science.gov (United States)

    Zamengo, Luca; Frison, Giampietro; Gregio, Maria; Orrù, Giorgio; Sciarrone, Rocco

    2011-05-20

    The determination of illicit active ingredients in seized materials, in order to assess penal or administrative offences, is routinely carried out in many forensic toxicology laboratories. This paper presents the main features of the protocol adopted in the Authors' laboratory for the above investigations. In particular, sampling and analysis are treated as parts of the same measurement process, and their combined contribution to the overall measurement uncertainty is quantified. Aspects concerning representative sampling in the case of single and multiple items are discussed. The effects of material heterogeneity are considered by analyzing separately distinct primary samples taken from different parts of the sampling target. Possible errors due to particle dimensions that could arise when sub-sampling are also considered. Analytical precision, bias and other matrix effects are studied in order to quantify the component of the overall measurement uncertainty associated with the analysis of prepared test samples. Typical scenarios arising when measurement results are used to assess compliance with specification limits are also discussed, revealing the crucial role of measurement uncertainty. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  17. Uncertainty Budget Analysis for Dimensional Inspection Processes (U)

    Energy Technology Data Exchange (ETDEWEB)

    Valdez, Lucas M. [Los Alamos National Laboratory

    2012-07-26

    This paper is intended to provide guidance on preparing an uncertainty analysis of a dimensional inspection process through the use of an uncertainty budget. The uncertainty analysis follows the same methodology as the ISO GUM standard for calibration and testing. A specific distinction is made between how Type A and Type B uncertainty analyses are used in general and in this specific process. Theory and applications are presented both as a generalized approach to estimating measurement uncertainty and as guidance on how to report and present these estimates for dimensional measurements in a dimensional inspection process. The analysis of this uncertainty budget shows that a well-controlled dimensional inspection process produces a conservative process uncertainty, which can be attributed to the necessary assumptions in place for best possible results.
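    A minimal sketch of how such a budget combines Type A and Type B components in quadrature, with purely illustrative values rather than those of the actual inspection process described in the report:

        import math

        # Type A: standard uncertainty of the mean from repeated measurements (mm)
        readings = [10.012, 10.015, 10.011, 10.014, 10.013]
        n = len(readings)
        mean = sum(readings) / n
        s = math.sqrt(sum((r - mean) ** 2 for r in readings) / (n - 1))
        u_type_a = s / math.sqrt(n)

        # Type B: e.g. a CMM calibration certificate (normal, quoted at k = 2)
        # and thermal-expansion limits (rectangular); values are hypothetical.
        u_cal = 0.002 / 2
        u_temp = 0.001 / math.sqrt(3)

        u_combined = math.sqrt(u_type_a**2 + u_cal**2 + u_temp**2)
        U_expanded = 2 * u_combined  # coverage factor k = 2, roughly 95 % coverage
        print(f"u_c = {u_combined:.4f} mm, U = {U_expanded:.4f} mm")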

  18. CIRCLE 2 policy brief: Communicate uncertainties- design climate adaptation measures to be flexible and robust

    NARCIS (Netherlands)

    Pelt, van S.C.; Avelar, D.; Swart, R.J.

    2010-01-01

    This policy brief is directed towards funders and managers of climate change impacts and adaptation research programmes as well as policy makers in this area. It notes various challenges in addressing uncertainties in climate change research and policy and provides suggestions on how to address

  19. Measures of Model Uncertainty in the Assessment of Primary Stresses in Ship Structures

    DEFF Research Database (Denmark)

    Östergaard, Carsten; Dogliani, Mario; Guedes Soares, Carlos

    1996-01-01

    The paper considers various models and methods commonly used for linear elastic stress analysis and assesses the uncertainty involved in their application to the analysis of the distribution of primary stresses in the hull of a containership example, through statistical evaluations of the results...

  20. Model-based Type B uncertainty evaluations of measurement towards more objective evaluation strategies

    NARCIS (Netherlands)

    Boumans, M.

    2013-01-01

    This article proposes a more objective Type B evaluation. This can be achieved when Type B uncertainty evaluations are model-based. This implies, however, grey-box modelling and validation instead of white-box modelling and validation which are appropriate for Type A evaluation.

  1. An Evaluation of Test and Physical Uncertainty of Measuring Vibration in Wooden Junctions

    DEFF Research Database (Denmark)

    Dickow, Kristoffer Ahrens; Kirkegaard, Poul Henning; Andersen, Lars Vabbersgaard

    2012-01-01

    In the present paper a study of test and material uncertainty in modal analysis of certain wooden junctions is presented. The main structure considered here is a T-junction made from a particleboard plate connected to a spruce beam of rectangular cross section. The size of the plate is 1.2 m by 0...

  2. Fission Meter Information Barrier Attribute Measurement System - NA-243 FNI/UKC FY2017 Task 1-2 Report

    Energy Technology Data Exchange (ETDEWEB)

    Kerr, P. L. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Decman, D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Prasad, M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Castro, P. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2018-01-03

    An SNM attribute Information Barrier (IB) system was developed for a 2011 US/UK Exercise. The system was modified and extensively tested in a 2013-2014 US-UK Measurement Campaign. This work demonstrated rapid deployment of an IB system for potential treaty use. The system utilizes an Ortec Fission Meter neutron multiplicity counter and custom computer code. The system demonstrates a proof-of-principle automated Pu-240 mass determination with an information barrier. After a software start command is issued, the system automatically acquires and downloads data, performs an analysis, and displays the results. This system conveys the result of a Pu mass threshold measurement in a way that does not reveal sensitive information. In full IB mode, only the pass/fail result is displayed as “Mass <= Threshold Amount” or “Mass >= Threshold Amount”, as shown in Figure 4. This can easily be adapted to a red/green “lights” display similar to the Detective IB system for Pu isotopics, as shown in Figure 6. In test mode, more detailed information is displayed. The code can also read in, analyze, and display results from previously acquired or simulated data. Because the equipment is commercial-off-the-shelf (COTS), the system demonstrates a low-cost short-lead-time technology for treaty SNM attribute measurements. A deployed system will likely require integration of additional authentication and tamper-indicating technologies. This will be discussed for the project in this and future progress reports.
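    In spirit, the information-barrier display reduces to comparing the internally computed mass against a threshold and exposing only the comparison; a toy sketch under stated assumptions (the threshold value and names are hypothetical, not the LLNL code):

        THRESHOLD_KG = 0.5  # hypothetical Pu-240 threshold; the real value is treaty-specific

        def ib_display(pu240_mass_kg, test_mode=False):
            # In full IB mode the measured mass never leaves the barrier;
            # only the threshold comparison is shown.
            above = pu240_mass_kg >= THRESHOLD_KG
            if test_mode:
                return "mass = %.3f kg (%s threshold)" % (pu240_mass_kg, "above" if above else "below")
            return "Mass >= Threshold Amount" if above else "Mass <= Threshold Amount"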

  3. Fission Meter Information Barrier Attribute Measurement System: Task 1 Report: Document existing Fission Meter neutron IB system

    Energy Technology Data Exchange (ETDEWEB)

    Kerr, P. L. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-02-28

    An SNM attribute Information Barrier (IB) system was developed for a 2011 US/UK Exercise. The system was modified and extensively tested in a 2013-2014 US-UK Measurement Campaign. This work demonstrated rapid deployment of an IB system for potential treaty use. The system utilizes an Ortec Fission Meter neutron multiplicity counter and custom computer code. The system demonstrates a proof-of-principle automated Pu-240 mass determination with an information barrier. After a software start command is issued, the system automatically acquires and downloads data, performs an analysis, and displays the results. This system conveys the result of a Pu mass threshold measurement in a way that does not reveal sensitive information. In full IB mode, only red/green ‘lights’ are displayed in the software. In test mode, more detailed information is displayed. The code can also read in, analyze, and display results from previously acquired or simulated data. Because the equipment is commercial-off-the-shelf (COTS), the system demonstrates a low-cost short-lead-time technology for treaty SNM attribute measurements. A deployed system will likely require integration of additional authentication and tamper-indicating technologies. This will be discussed for the project in this and future progress reports.

  4. Trends of solar ultraviolet irradiance at Barrow, Alaska, and the effect of measurement uncertainties on trend detection

    Directory of Open Access Journals (Sweden)

    G. Bernhard

    2011-12-01

    Spectral ultraviolet (UV) irradiance has been observed near Barrow, Alaska (71° N, 157° W) between 1991 and 2011 with an SUV-100 spectroradiometer. The instrument was historically part of the US National Science Foundation's UV Monitoring Network and is now a component of NSF's Arctic Observing Network. From these measurements, trends in monthly average irradiance and their uncertainties were calculated. The analysis focuses on two quantities, the UV Index (which is affected by atmospheric ozone concentrations) and irradiance at 345 nm (which is virtually insensitive to ozone). Uncertainties of trend estimates depend on variations in the data due to (1) natural variability, (2) systematic and random errors of the measurements, and (3) uncertainties caused by gaps in the time series. Using radiative transfer model calculations, systematic errors of the measurements were detected and corrected. Different correction schemes were tested to quantify the sensitivity of the trend estimates to the treatment of systematic errors. Depending on the correction method, estimates of decadal trends changed between 1.5% and 2.9%. Uncertainties in the trend estimates caused by error sources (2) and (3) were set in relation to the overall uncertainty of the trend determinations. Results show that these error sources are only relevant for February, March, and April, when natural variability is low due to high surface albedo. This method of addressing measurement uncertainties in time series analysis is also applicable to other geophysical parameters. Trend estimates varied between −14% and +5% per decade and were significant (95.45% confidence level) only for the month of October. Depending on the correction method, October trends varied between −11.4% and −13.7% for irradiance at 345 nm and between −11.7% and −14.1% for the UV Index. These large trends are consistent with trends in short-wave (0.3–3.0 μm) solar irradiance measured with pyranometers at NOAA

  5. The uncertainties calculation of acoustic method for measurement of dissipative properties of heterogeneous non-metallic materials

    Directory of Open Access Journals (Sweden)

    Мaryna O. Golofeyeva

    2015-12-01

    The effective use of heterogeneous non-metallic materials and structures requires reliable values of their dissipation characteristics, as well as knowledge of how these change during loading. Aim: The aim of this study is to prepare an uncertainty budget for the measurement of the dissipative properties of composite materials. Materials and Methods: The method used to study the vibrational energy dissipation characteristics, based on the coupling between the vibration damping decrement and the acoustic velocity in a non-metallic heterogeneous material, is reviewed. The proposed method allows finding the dependence of damping on the vibration amplitude and on the frequency of the strain-stress state of the material. Results: The accuracy of the measurement method was investigated for the determination of the vibration damping decrement in synthegran. The international approach to the evaluation of measurement quality is used, comprising the internationally accepted rules for expressing and combining uncertainties; these rules serve as an internationally acknowledged measure of confidence in measurement results, including testing. An uncertainty budget for the acoustic method of measuring the dissipative properties of materials was compiled. Conclusions: Two groups of error sources were identified in the measurement of the dissipative properties of materials. The first group comprises variation of the calibrated impact parameters within tolerance limits, displacement of the sensor on repeated placement at the measurement point, variation of the contact-agent layer thickness due to uneven pressing of the transducers against the control surface, reading inaccuracy, etc. The second group is linked to errors in measuring density and Poisson's ratio, the distance between sensors, and the time difference between the signals of the vibroacoustic sensors.

  6. The measurement properties of the menorrhagia multi-attribute quality-of-life scale: a psychometric analysis.

    Science.gov (United States)

    Pattison, H; Daniels, J P; Kai, J; Gupta, J K

    2011-11-01

    Menorrhagia, or heavy menstrual bleeding (HMB), is a common gynaecological condition. As the aim of treatment is to improve women's wellbeing and quality of life (QoL), it is necessary to have effective ways to measure this. This study investigated the reliability and validity of the menorrhagia multi-attribute scale (MMAS), a menorrhagia-specific QoL instrument. Participants (n = 431) completed the MMAS and a battery of other tests as part of the baseline assessment of the ECLIPSE (Effectiveness and Cost-effectiveness of Levonorgestrel-containing Intrauterine system in Primary care against Standard trEatment for menorrhagia) trial. Analyses of their responses suggest that the MMAS has good measurement properties and is therefore an appropriate condition-specific instrument to measure the outcome of treatment for HMB. © 2011 The Authors BJOG An International Journal of Obstetrics and Gynaecology © 2011 RCOG.

  7. Size measurement uncertainties of near-monodisperse, near-spherical nanoparticles using transmission electron microscopy and particle-tracking analysis

    Science.gov (United States)

    De Temmerman, Pieter-Jan; Verleysen, Eveline; Lammertyn, Jeroen; Mast, Jan

    2014-10-01

    Particle-tracking analysis (PTA) in combination with systematic imaging, automatic image analysis, and automatic data processing is validated for size measurements. Transmission electron microscopy (TEM) in combination with a systematic selection procedure for unbiased random image collection, semiautomatic image analysis, and data processing is validated for size, shape, and surface topology measurements. PTA is investigated as an alternative to TEM for the determination of the particle size in the framework of the EC definition of nanomaterial. The intra-laboratory validation study assessing the precision and accuracy of the TEM and PTA methods consists of a series of measurements on three gold reference materials with mean area-equivalent circular diameters of 8.9 nm (RM-8011), 27.6 nm (RM-8012), and 56.0 nm (RM-8013), and two polystyrene materials with modal hydrodynamic diameters of 102 nm (P1) and 202 nm (H1). Owing to its high level of automation, PTA proves to give precise and unbiased results for the modal hydrodynamic diameter in the size range between 30 and 200 nm, and TEM proves to give precise and unbiased results for the mean area-equivalent circular diameter in the size range between 8 and 200 nm of the investigated near-monomodal near-spherical materials. The expanded uncertainties of PTA are about 9 % and are determined mainly by the repeatability uncertainty. This uncertainty is two times higher than the expanded uncertainty of 4 % obtained by TEM for analyses on identical materials. For the investigated near-monomodal and near-spherical materials, PTA can be used as an alternative to TEM for measuring the particle size, with the exception of the 8.9 nm gold, because this material has a size below the detection limit of PTA.

  8. Ground and aircraft-based methane measurements in Siberia: source attribution using tracers and models

    Science.gov (United States)

    Arzoumanian, E.; Paris, J. D.; Pruvost, A.; Peng, S.; Turquety, S.; Berchet, A.; Pison, I.; Helle, J.; Arshinov, M.; Belan, B. D.

    2015-12-01

    Methane (CH4) is the second most important anthropogenic greenhouse gas. It is also naturally emitted by a number of processes, including microbial activity in wetlands, permafrost degradation and wildfires. Our current understanding of the extent and amplitude of its natural sources, as well as the large-scale driving factors, remains highly uncertain (Kirschke et al., Nature Geosci., 2013). Furthermore, high-latitude regions are large natural sources of CH4 in the atmosphere. Observing boreal/Arctic CH4 variability and understanding its main driving processes using atmospheric measurements and a transport model is the task of this work. YAK-AEROSIB atmospheric airborne campaigns (flights in the tropospheric layer up to 9 km connecting the two cities of Novosibirsk and Yakutsk) and continuous measurements at Fonovaya Observatory (60 km west of Tomsk - 56° 25'07"N, 84° 04'27"E) have been performed in order to provide observational data on the composition of Siberian air. The study is focused on 2012, during which a strong heat wave impacted Siberia, leading to the highest mean daily temperature values on record since the beginning of the 20th century. This abnormal drought has led to numerous large forest fires. A chemistry-transport model (CHIMERE), combined with datasets for anthropogenic (EDGAR) emissions and models for wetlands (ORCHIDEE) and wildfires (APIFLAME), is used to determine contributions of CH4 sources in the region. Recent results concerning CH4 fluxes and their atmospheric variability in the Siberian territory derived from a model-based analysis will be shown and discussed. This work was funded by CNRS (France), the French Ministry of Foreign Affairs, CEA (France), Presidium of RAS (Program No. 4), Branch of Geology, Geophysics and Mining Sciences of RAS (Program No. 5), Interdisciplinary integration projects of Siberian Branch of RAS (No. 35, No. 70, No. 131), Russian Foundation for Basic Research (grants No 14-05-00526, 14-05-00590). Kirschke, S

  9. Measuring and explaining eco-efficiencies of wastewater treatment plants in China: An uncertainty analysis perspective.

    Science.gov (United States)

    Dong, Xin; Zhang, Xinyi; Zeng, Siyu

    2017-04-01

    In the context of sustainable development, there has been an increasing requirement for an eco-efficiency assessment of wastewater treatment plants (WWTPs). Data envelopment analysis (DEA), a technique that is widely applied for relative efficiency assessment, is used in combination with the tolerances approach to handle WWTPs' multiple inputs and outputs as well as their uncertainty. The economic cost, energy consumption, contaminant removal, and global warming effect during the treatment processes are integrated to interpret the eco-efficiency of WWTPs. A total of 736 sample plants from across China are assessed, and large sensitivities to variations in inputs and outputs are observed for most samples, with only three WWTPs identified as being stably efficient. Size of plant, overcapacity, climate type, and influent characteristics are proven to have a significant influence on both the mean efficiency and performance sensitivity of WWTPs, while no clear relationships were found between eco-efficiency and technology under the framework of uncertainty analysis. The incorporation of uncertainty quantification and environmental impact consideration has improved the reliability and applicability of the assessment. Copyright © 2017 Elsevier Ltd. All rights reserved.

  10. Exploiting Measurement Uncertainty Estimation in Evaluation of GOES-R ABI Image Navigation Accuracy Using Image Registration Techniques

    Science.gov (United States)

    Haas, Evan; DeLuccia, Frank

    2016-01-01

    In evaluating GOES-R Advanced Baseline Imager (ABI) image navigation quality, upsampled sub-images of ABI images are translated against downsampled Landsat 8 images of localized, high contrast earth scenes to determine the translations in the East-West and North-South directions that provide maximum correlation. The native Landsat resolution is much finer than that of ABI, and Landsat navigation accuracy is much better than ABI required navigation accuracy and expected performance. Therefore, Landsat images are considered to provide ground truth for comparison with ABI images, and the translations of ABI sub-images that produce maximum correlation with Landsat localized images are interpreted as ABI navigation errors. The measured local navigation errors from registration of numerous sub-images with the Landsat images are averaged to provide a statistically reliable measurement of the overall navigation error of the ABI image. The dispersion of the local navigation errors is also of great interest, since ABI navigation requirements are specified as bounds on the 99.73rd percentile of the magnitudes of per pixel navigation errors. However, the measurement uncertainty inherent in the use of image registration techniques tends to broaden the dispersion in measured local navigation errors, masking the true navigation performance of the ABI system. We have devised a novel and simple method for estimating the magnitude of the measurement uncertainty in registration error for any pair of images of the same earth scene. We use these measurement uncertainty estimates to filter out the higher quality measurements of local navigation error for inclusion in statistics. In so doing, we substantially reduce the dispersion in measured local navigation errors, thereby better approximating the true navigation performance of the ABI system.
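    A bare-bones sketch of the registration step, assuming the ABI sub-image and the Landsat reference patch have already been resampled to a common grid (integer-pixel shift only; the actual evaluation upsamples for sub-pixel accuracy and additionally estimates the registration measurement uncertainty used for filtering):

        import numpy as np

        def registration_shift(abi_patch, landsat_patch):
            # The peak of the FFT-based cross-correlation gives the (row, col)
            # offset of the ABI patch relative to the Landsat reference patch.
            a = abi_patch - abi_patch.mean()
            b = landsat_patch - landsat_patch.mean()
            corr = np.fft.ifft2(np.fft.fft2(a) * np.conj(np.fft.fft2(b))).real
            peak = np.unravel_index(np.argmax(corr), corr.shape)
            return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))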

  11. Uncertainty evaluation of fluid dynamic models and validation by gamma ray transmission measurements of the catalyst flow in a FCC cold pilot unity

    Energy Technology Data Exchange (ETDEWEB)

    Teles, Francisco A.S.; Santos, Ebenezer F.; Dantas, Carlos C., E-mail: francisco.teles@ufpe.br [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil). Centro de Tecnologia e Geociencias. Departamento de Energia Nuclear; Melo, Silvio B., E-mail: sbm@cin.ufpe.br [Universidade Federal de Pernambuco (CIN/UFPE), Recife, PE (Brazil). Centro de Informatica; Santos, Valdemir A. dos, E-mail: vas@unicap.br [Universidade Catolica de Pernambuco (UNICAP), Recife, PE (Brazil). Dept. de Quimica; Lima, Emerson A.O., E-mail: emathematics@gmail.com [Universidade de Pernambuco (POLI/UPE), Recife, PE (Brazil). Escola Politecnica

    2013-07-01

    In this paper, the fluid dynamics of the Fluid Catalytic Cracking (FCC) process is investigated by means of a Cold Flow Pilot Unit (CFPU) constructed in Plexiglas to visualize operational conditions. Axial and radial catalyst profiles were measured by gamma ray transmission in the riser of the CFPU. Standard uncertainty was evaluated in volumetric solid fraction measurements for several concentrations at a given point of the axial profile. Monitoring of the pressure drop in the riser shows good agreement with the measured standard uncertainty data. A further evaluation of the combined uncertainty was applied to the volumetric solid fraction equation using gamma transmission data. A limit condition for the catalyst concentration in the riser was defined, and simulations with random numbers generated in MATLAB were used to test the uncertainty evaluation. The Guide to the Expression of Uncertainty in Measurement (GUM) is based on the law of propagation of uncertainty and on the characterization of the quantities measured by means of either a Gaussian distribution or a t-distribution, which allows measurement uncertainty to be delimited by means of a confidence interval. A variety of supplements to GUM are being developed, which will progressively enter into effect. The first of these supplements [3] describes an alternative procedure for the calculation of uncertainties: the Monte Carlo Method (MCM). MCM is an alternative to GUM, since it performs a characterization of the quantities measured based on the random sampling of the probability distribution functions. This paper also explains the basic implementation of the MCM method in MATLAB. (author)
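    A minimal Python sketch of the GUM Supplement 1 (Monte Carlo) idea applied to a Beer-Lambert-type transmission relation; the relation, distributions and numbers below are illustrative assumptions for the sketch, not the CFPU calibration (the paper itself implements MCM in MATLAB):

        import numpy as np

        rng = np.random.default_rng(1)
        N = 100_000  # number of Monte Carlo trials

        # alpha = ln(I0/I) / (mu * rho * D), an assumed gamma-transmission relation
        I0  = rng.normal(12000.0, 60.0, N)   # incident count rate
        I   = rng.normal(9500.0, 55.0, N)    # transmitted count rate
        mu  = rng.normal(0.0775, 0.0008, N)  # mass attenuation coefficient (cm^2/g)
        rho = rng.normal(1.4, 0.01, N)       # catalyst particle density (g/cm^3)
        D   = rng.normal(9.0, 0.05, N)       # riser inner diameter (cm)

        alpha = np.log(I0 / I) / (mu * rho * D)
        print(f"alpha = {alpha.mean():.4f} +/- {alpha.std(ddof=1):.4f} (1 sigma, MCM)")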

  12. Automatic measurement of compression wood cell attributes in fluorescence microscopy images.

    Science.gov (United States)

    Selig, B; Luengo Hendriks, C L; Bardage, S; Daniel, G; Borgefors, G

    2012-06-01

    This paper presents a new automated method for analyzing compression wood fibers in fluorescence microscopy. Abnormal wood known as compression wood is present in almost every softwood tree harvested. Compression wood fibers show a different cell wall morphology and chemistry compared to normal wood fibers, and their mechanical and physical characteristics are considered detrimental for both construction wood and pulp and paper purposes. Currently there is a need for improved methodologies for characterization of lignin distribution in wood cell walls, such as from compression wood fibers, that will allow for a better understanding of fiber mechanical properties. Traditionally, analysis of fluorescence microscopy images of fiber cross-sections has been done manually, which is time consuming and subjective. Here, we present an automatic method, using digital image analysis, that detects and delineates softwood fibers in fluorescence microscopy images, dividing them into cell lumen, normal and highly lignified areas. It also quantifies the different areas, as well as measures cell wall thickness. The method is evaluated by comparing the automatic with a manual delineation. While the boundaries between the various fiber wall regions are detected using the automatic method with precision similar to inter- and intra-expert variability, the position of the boundary between lumen and the cell wall has a systematic shift that can be corrected. Our method allows for transverse structural characterization of compression wood fibers, which may allow for improved understanding of the micro-mechanical modeling of wood and pulp fibers. © 2012 The Authors Journal of Microscopy © 2012 Wadsworth Center, New York State Department of Health.

  13. ObsPack: a framework for the preparation, delivery, and attribution of atmospheric greenhouse gas measurements

    Science.gov (United States)

    Masarie, K. A.; Peters, W.; Jacobson, A. R.; Tans, P. P.

    2014-12-01

    Observation Package (ObsPack) is a framework designed to bring together atmospheric greenhouse gas observations from a variety of sampling platforms, prepare them with specific applications in mind, and package and distribute them in a self-consistent and well-documented product. Data products created using the ObsPack framework (called "ObsPack products") are intended to support carbon cycle modeling studies and represent a next generation of value-added greenhouse gas observation products modeled after the cooperative GLOBALVIEW products introduced in 1996. Depending on intended use, ObsPack products may include data in their original form reformatted using the ObsPack framework or may contain derived data consisting of averages, subsets, or smoothed representations of original data. All products include extensive ancillary information (metadata) intended to help ensure the data are used appropriately, their calibration and quality assurance history are clearly described, and that individuals responsible for the measurements (data providers or principal investigators (PIs)) are properly acknowledged for their work. ObsPack products are made freely available using a distribution strategy designed to improve communication between data providers and product users. The strategy includes a data usage policy that requires users to directly communicate with data providers and an automated e-mail notification system triggered when a product is accessed. ObsPack products will be assigned a unique digital object identifier (DOI) to ensure each product can be unambiguously identified in scientific literature. Here we describe the ObsPack framework and its potential role in supporting the evolving needs of both data providers and product users.

  14. Psychometric evaluation of the revised attribution questionnaire (r-AQ) to measure mental illness stigma in adolescents.

    Science.gov (United States)

    Pinto, Melissa D; Hickman, Ronald; Logsdon, M Cynthia; Burant, Christopher

    2012-01-01

    The revised attribution questionnaire (r-AQ) measures mental illness stigma. This study's purpose is to evaluate the factor structure of the r-AQ and examine the validity of the factor structure in adolescents. A convenience sample (n = 210) of adolescents completed the r-AQ, and these data were used in exploratory (EFA) and confirmatory factor analyses (CFA). The EFA established a five-item single-factor structure, which we called the modified r-AQ and which captures the negative emotional reactions to people with mental illness, a domain of mental illness stigma. The CFA established the validity of the factor structure (χ2 = 2.4, df = 4, p = .659, TLI = 1.042, CFI = 1.00, RMSEA = .000). Internal consistency reliability for the scale was acceptable (α = .70). The modified r-AQ is a reliable and valid measure of the emotional reaction to people with mental illness.

  15. Fugitive methane emission pinpointing and source attribution using ethane measurements in a portable cavity ring-down analyzer

    Science.gov (United States)

    Fleck, Derek; Hoffnagle, John; Yiu, John; Chong, Johnston; Tan, Sze

    2017-04-01

    Methane source pinpointing and attribution is ever more important because the vast natural gas distribution network has led to a very large number of emission sources. Ethane can be used as a tracer to distinguish between biogenic and natural gas sources. If the measurement is sensitive enough, it can even distinguish between gas distributors, or indicate gas maturity through gas wetness. Here we present data obtained using a portable cavity ring-down spectrometer weighing less than 11 kg and consuming less than 35 W that simultaneously measures methane and ethane with raw 1-σ precisions of 50 ppb and 4.5 ppb, respectively, at 2 Hz. These precisions allow for a C2:C1 ratio 1-σ measurement of methane only mode used for surveying and pinpointing. This mode measures at a rate faster than 4 Hz with a 1-σ precision of methane seepages are highly variable due to air turbulence and mixing right above the ground, correlations in the variations in C2H6 and CH4 are used to derive a source C2:C1. Additional hardware is needed for steady-state concentration measurements to reliably measure the C2:C1 ratio instantaneously. Source discrimination data of local leaks and methane sources using this analysis method are presented. Additionally, two-dimensional plume snapshots are constructed using an integrated onboard GPS to visualize horizontal-plane gas propagation.
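    A simple sketch of how a source C2:C1 ratio can be derived from the correlated variations, here as an ordinary-least-squares slope of ethane against methane enhancements over a plume transect; the background subtraction via the median is an assumption for illustration, not the instrument's onboard algorithm:

        import numpy as np

        def source_c2c1(ch4_ppb, c2h6_ppb):
            # Slope of C2H6 vs CH4 enhancements above a local background.
            ch4 = np.asarray(ch4_ppb, float)
            c2h6 = np.asarray(c2h6_ppb, float)
            dch4 = ch4 - np.median(ch4)
            dc2h6 = c2h6 - np.median(c2h6)
            return float(np.sum(dch4 * dc2h6) / np.sum(dch4 ** 2))

        # Slopes of a few percent typically point to thermogenic natural gas,
        # while near-zero slopes suggest a biogenic (microbial) source.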

  16. Uncertainty analysis and flow measurements in an experimental mock-up of a molten salt reactor concept

    Energy Technology Data Exchange (ETDEWEB)

    Yamaji, Bogdan; Aszodi, Attila [Budapest University of Technology and Economics (Hungary). Inst. of Nuclear Techniques

    2016-09-15

    In the paper, measurement results from the experimental modelling of a molten salt reactor concept are presented along with a detailed uncertainty analysis of the experimental system. Non-intrusive flow measurements are carried out on the scaled and segmented mock-up of a homogeneous, single-region molten salt fast reactor concept. The uncertainty assessment of the particle image velocimetry (PIV) measurement system applied with the scaled and segmented model is presented in detail. The analysis covers the error sources of the measurement system (laser, recording camera, etc.) and the specific conditions (de-warping of measurement planes) originating in the geometry of the investigated domain. The effect of sample size in the ensemble-averaged PIV measurements is discussed as well. An additional two-loop operation mode is also presented, and the analysis of the measurement results confirms that without enhancement, nominal and other operating conditions will lead to strong unfavourable separation in the core flow. This implies that the use of internal flow distribution structures will be necessary for the optimisation of the core coolant flow. Preliminary CFD calculations are presented to help the design of a perforated plate located above the inlet region. The purpose of the perforated plate is to reduce recirculation near the cylindrical wall and enhance the uniformity of the core flow distribution.

  17. On the evaluation of a fuel assembly design by means of uncertainty and sensitivity measures

    Energy Technology Data Exchange (ETDEWEB)

    Jaeger, Wadim; Sanchez Espinoza, Victor Hugo [Karlsruhe Institute of Technology (KIT), Eggenstein-Leopoldshafen (Germany). Inst. for Neutron Physics and Reactor Technology

    2012-11-15

    This paper will provide results of an uncertainty and sensitivity study in order to calculate parameters of safety related importance like the fuel centerline temperature, the cladding temperature and the fuel assembly pressure drop of a lead-alloy cooled fast system. Applying best practice guidelines, a list of uncertain parameters has been identified. The considered parameter variations are based on the experience gained during fabrication and operation of former and existing liquid metal cooled fast systems as well as on experimental results and on engineering judgment. (orig.)

  18. Uncertainty propagation for the coulometric measurement of the plutonium concentration in CRM126 solution provided by JAEA

    Energy Technology Data Exchange (ETDEWEB)

    Morales-Arteaga, Maria [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-11-07

    This GUM Workbench™ propagation of uncertainty is for the coulometric measurement of the plutonium concentration in a Pu standard material (C126) supplied as individual aliquots that were prepared by mass. The C126 solution had been prepared and aliquoted as standard material. Samples are aliquoted into glass vials and heated to dryness for distribution as dried nitrate. The individual plutonium aliquots were not separated chemically or otherwise purified prior to measurement by coulometry in the F/H Laboratory. Hydrogen peroxide was used for valence adjustment.

  19. Uncertainties in hot-wire measurements of compressible turbulent flows implied by comparisons with laser-induced fluorescence

    Science.gov (United States)

    Mckenzie, R. L.; Logan, P.

    1986-01-01

    A hot-wire anemometer and a new nonintrusive laser-induced fluorescence (LIF) technique are used to survey a Mach 2 turbulent boundary layer. The hot-wire anemometer's ability to accurately measure mass flux, temperature, and density fluctuations in a compressible flow is examined by comparing its results with those obtained using LIF. Several methods of hot-wire calibration are used, and the uncertainties in their measurements of various fluctuating flow parameters are determined. The results show that although a hot-wire operated at high overheat can measure mass flux fluctuations, temperature and density fluctuations are not determined accurately from such measurements. However, a hot-wire operated at multiple overheats can be used to measure static and total temperature fluctuations. The presence of pressure fluctuations and their correlation with density can prevent the use of hot-wire data to determine density fluctuations.

  20. Effect of uncertainty in composition and weight measures in control of cheese yield and fat loss in large cheese factories.

    Science.gov (United States)

    Margolies, Brenda; Adams, Michael C; Pranata, Joice; Gondoutomo, Kathleen; Barbano, David M

    2017-08-01

    Our objective was to develop a computer-based cheese yield, fat recovery, and composition control performance measurement system to provide quantitative performance records for a Cheddar and mozzarella cheese factory. The system can be used to track trends in performance of starter cultures and vats, as well as systematically calculate theoretical yield. Yield equations were built into the spreadsheet to evaluate cheese yield performance and fat losses in a cheese factory. Based on observations in commercial cheese factories, sensitivity analysis was done to demonstrate the sensitivity of cheese factory performance to analytical uncertainty of data used in the evaluation. Analytical uncertainty in the accuracy of milk weight and milk and cheese composition were identified as important factors that influence the ability to manage consistency of cheese quality and profitability. It was demonstrated that an uncertainty of ±0.1% milk fat or milk protein in the vat causes a range of theoretical Cheddar cheese yield from 10.05 to 10.37% and an uncertainty of yield efficiency of ±1.5%. This equates to ±1,451 kg (3,199 lb) of cheese per day in a factory processing 907,185 kg (2 million pounds) of milk per day. The same is true for uncertainty in cheese composition, where the effect of being 0.5% low on moisture or fat is about 484 kg (1,067 lb) of missed revenue opportunity from cheese for the day. Missing the moisture target causes other targets such as fat on a dry basis and salt in moisture to be missed. Similar impacts were demonstrated for mozzarella cheese. In analytical performance evaluations of commercial cheese quality assurance laboratories, we found that analytical uncertainty was typically a bias that was as large as 0.5% on fat and moisture. The effect of having a high bias of 0.5% moisture or fat will produce a missed opportunity of 484 kg of cheese per day for each component. More accurate rapid methods for determination of moisture, fat, and salt
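    The quoted figure of about 1,451 kg of cheese per day follows directly from the stated yield range:

        \Delta m = \frac{10.37\% - 10.05\%}{2} \times 907{,}185\ \text{kg of milk}
                 \approx 0.0016 \times 907{,}185\ \text{kg}
                 \approx 1{,}451\ \text{kg of cheese per day}.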

  1. Combining Nordtest method and bootstrap resampling for measurement uncertainty estimation of hematology analytes in a medical laboratory.

    Science.gov (United States)

    Cui, Ming; Xu, Lili; Wang, Huimin; Ju, Shaoqing; Xu, Shuizhu; Jing, Rongrong

    2017-12-01

    Measurement uncertainty (MU) is a metrological concept, which can be used for objectively estimating the quality of test results in medical laboratories. The Nordtest guide recommends an approach that uses both internal quality control (IQC) and external quality assessment (EQA) data to evaluate the MU. Bootstrap resampling is employed to simulate the unknown distribution based on the mathematical statistics method using an existing small sample of data, where the aim is to transform the small sample into a large sample. However, there have been no reports of the utilization of this method in medical laboratories. Thus, this study applied the Nordtest guide approach based on bootstrap resampling for estimating the MU. We estimated the MU for the white blood cell (WBC) count, red blood cell (RBC) count, hemoglobin (Hb), and platelets (Plt). First, we used 6 months of IQC data and 12 months of EQA data to calculate the MU according to the Nordtest method. Second, we combined the Nordtest method and bootstrap resampling with the quality control data and calculated the MU using MATLAB software. We then compared the MU results obtained using the two approaches. The expanded uncertainty results determined for WBC, RBC, Hb, and Plt using the bootstrap resampling method were 4.39%, 2.43%, 3.04%, and 5.92%, respectively, and 4.38%, 2.42%, 3.02%, and 6.00% with the existing quality control data (U [k=2]). For WBC, RBC, Hb, and Plt, the differences between the results obtained using the two methods were lower than 1.33%. The expanded uncertainty values were all less than the target uncertainties. The bootstrap resampling method allows the statistical analysis of the MU. Combining the Nordtest method and bootstrap resampling is considered a suitable alternative method for estimating the MU. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
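    A compact sketch of the Nordtest-style combination with a bootstrap estimate of the within-laboratory reproducibility; the IQC and EQA numbers are hypothetical, not the study's data:

        import numpy as np

        rng = np.random.default_rng(7)

        def nordtest_mu(iqc_values, eqa_bias_pct, eqa_u_ref_pct, n_boot=5000):
            # Expanded measurement uncertainty (%, k = 2): within-lab reproducibility
            # u_Rw bootstrapped from IQC data, combined in quadrature with the bias
            # component derived from EQA results.
            iqc = np.asarray(iqc_values, float)
            boot_cv = [np.std(rng.choice(iqc, size=iqc.size, replace=True), ddof=1)
                       / np.mean(iqc) * 100.0 for _ in range(n_boot)]
            u_rw = np.mean(boot_cv)
            rms_bias = np.sqrt(np.mean(np.square(eqa_bias_pct)))
            u_bias = np.sqrt(rms_bias**2 + np.mean(eqa_u_ref_pct)**2)
            return 2.0 * np.sqrt(u_rw**2 + u_bias**2)

        # hypothetical WBC IQC results (10^9/L) and EQA biases / reference uncertainties (%)
        print(nordtest_mu([6.1, 6.0, 6.2, 5.9, 6.1, 6.0], [1.2, -0.8, 0.5], [0.6, 0.6, 0.6]))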

  2. Shielding effectiveness measurements and uncertainty estimation for textiles by a VNA-based free space transmission method

    Directory of Open Access Journals (Sweden)

    Patel S.M.

    2013-01-01

    A free-space transmission method has been used for reliable shielding effectiveness measurement of easily available textile materials. Textiles with three different yarn densities were studied for their shielding effectiveness with the help of a vector network analyzer and two laboratory-calibrated X-band horn antennas. Expressions for the uncertainty estimation of the calculated SE values have been derived in accordance with the present free-space measurement setup. The measurements have shown that electromagnetic energy can be shielded by up to 16.24 dB, with a measurement uncertainty of less than 0.21 dB in the 8.2 to 12.4 GHz range, by a 160.85 μm textile. Thus, a thin textile with a high density can have higher shielding, and this property mainly depends on its intrinsic structure, frequency range and thickness. This study promises the potential application of such materials as very cost-effective shielding materials at microwave frequencies with some modifications.

  3. Improved water δ2H and δ18O calibration and calculation of measurement uncertainty using a simple software tool.

    Science.gov (United States)

    Gröning, Manfred

    2011-10-15

    The calibration of all δ(2)H and δ(18)O measurements on the VSMOW/SLAP scale should be performed consistently, based on similar principles, independent of the instrumentation used. The basic principles of a comprehensive calibration strategy are discussed taking water as example. The most common raw data corrections for memory and drift effects are described. Those corrections result in a considerable improvement in data consistency, especially in laboratories analyzing samples of quite variable isotopic composition (e.g. doubly labelled water). The need for a reliable uncertainty assessment for all measurements is discussed and an easy implementation method proposed. A versatile evaluation method based on Excel macros and spreadsheets is presented. It corrects measured raw data for memory and drift effects, performs the calibration and calculates the combined standard uncertainty for each measurement. It allows the easy implementation of the discussed principles in any user laboratory. Following these principles will improve the comparability of data among laboratories. Copyright © 2011 John Wiley & Sons, Ltd.
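    A schematic of the two building blocks described above, memory correction followed by two-point VSMOW/SLAP normalisation, here for delta-18O; the constant memory fraction and the example structure are illustrative assumptions, not the published macro's exact algorithm:

        def memory_correct(injections, memory_frac=0.02):
            # Remove carry-over from the preceding injection, assuming a constant
            # memory fraction (the spreadsheet tool derives this from the data).
            corrected, previous = [], injections[0]
            for d in injections:
                corrected.append((d - memory_frac * previous) / (1.0 - memory_frac))
                previous = d
            return corrected

        def vsmow_slap_normalise(delta_measured, measured_vsmow, measured_slap,
                                 true_vsmow=0.0, true_slap=-55.5):
            # Two-point normalisation of delta-18O (per mil) onto the VSMOW/SLAP scale.
            slope = (true_slap - true_vsmow) / (measured_slap - measured_vsmow)
            return true_vsmow + slope * (delta_measured - measured_vsmow)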

  4. Characterization of Rockwell hardness indenter Tip using image processing and optical profiler and evaluation of measurement uncertainty

    Directory of Open Access Journals (Sweden)

    Moona G.

    2014-01-01

    Hardness is a measure of the resistance of a material to being penetrated and eroded by sharp projections of other materials such as diamond. The process of creating sharp projections on any test surface is known as indentation. The hardness measurement of any material is the result of a complex process of deformation during indentation. The indenter tip geometry, which includes the radius of curvature at the tip and the tip angle, affects the hardness measurement by influencing the nature of the penetration process on the test surface, because every indenter deforms the specimen surface with a different geometry. A controlled indenter geometry can improve the consistency of hardness measurement. In this paper we report the estimation of two important geometrical parameters, the radius of curvature and the tip angle of a Rockwell indenter, by using a simple method of image processing, and compare the results with those obtained with a traceable 3D optical profiler. Evaluation of uncertainty in measurements is carried out as per ISO guidelines (ISO-GUM), and a detailed uncertainty budget is presented. The estimated tip angle is 119.95 degrees. The radius of curvature is estimated to be 199.96 ± 0.80 μm by image analysis, which agrees well with the value estimated using the optical profiler, i.e. 199.12 μm.

  5. Reducing uncertainty in within-host parameter estimates of influenza infection by measuring both infectious and total viral load.

    Directory of Open Access Journals (Sweden)

    Stephen M Petrie

    For in vivo studies of influenza dynamics where within-host measurements are fit with a mathematical model, infectivity assays (e.g. 50% tissue culture infectious dose; TCID50) are often used to estimate the infectious virion concentration over time. Less frequently, measurements of the total (infectious and non-infectious) viral particle concentration (obtained using real-time reverse transcription-polymerase chain reaction; rRT-PCR) have been used as an alternative to infectivity assays. We investigated the degree to which measuring both infectious (via TCID50) and total (via rRT-PCR) viral load allows within-host model parameters to be estimated with greater consistency and reduced uncertainty, compared with fitting to TCID50 data alone. We applied our models to viral load data from an experimental ferret infection study. Best-fit parameter estimates for the "dual-measurement" model are similar to those from the TCID50-only model, with greater consistency in best-fit estimates across different experiments, as well as reduced uncertainty in some parameter estimates. Our results also highlight how variation in TCID50 assay sensitivity and calibration may hinder model interpretation, as some parameter estimates systematically vary with known uncontrolled variations in the assay. Our techniques may aid in drawing stronger quantitative inferences from in vivo studies of influenza virus dynamics.

  6. Measurements and their uncertainties a practical guide to modern error analysis

    CERN Document Server

    Hughes, Ifan G

    2010-01-01

    This hands-on guide is primarily intended to be used in undergraduate laboratories in the physical sciences and engineering. It assumes no prior knowledge of statistics. It introduces the necessary concepts where needed, with key points illustrated with worked examples and graphic illustrations. In contrast to traditional mathematical treatments it uses a combination of spreadsheet and calculus-based approaches, suitable as a quick and easy on-the-spot reference. The emphasis throughout is on practical strategies to be adopted in the laboratory. Error analysis is introduced at a level accessible to school leavers, and carried through to research level. Error calculation and propagation is presented through a series of rules-of-thumb, look-up tables and approaches amenable to computer analysis. The general approach uses the chi-square statistic extensively. Particular attention is given to hypothesis testing and extraction of parameters and their uncertainties by fitting mathematical models to experimental data....

  7. Attribution and evolution of ozone from Asian wild fires using satellite and aircraft measurements during the ARCTAS campaign

    Directory of Open Access Journals (Sweden)

    R. Dupont

    2012-01-01

    We use ozone and carbon monoxide measurements from the Tropospheric Emission Spectrometer (TES), model estimates of ozone, CO, and ozone precursors from the Real-time Air Quality Modeling System (RAQMS), and data from the NASA DC8 aircraft to characterize the source and dynamical evolution of ozone and CO in Asian wildfire plumes during the spring ARCTAS campaign 2008. On 19 April, the NASA DC8 O3 and aerosol Differential Absorption Lidar (DIAL) observed two biomass burning plumes originating from North-Western Asia (Kazakhstan) and South-Eastern Asia (Thailand) that advected eastward over the Pacific, reaching North America in 10 to 12 days. Using both TES observations and RAQMS chemical analyses, we track the wildfire plumes from their source to the ARCTAS DC8 platform. In addition to photochemical production due to ozone precursors, we find that exchange between the stratosphere and the troposphere is a major factor influencing O3 concentrations for both plumes. For example, the Kazakhstan and Siberian plumes at 55 degrees North pass through a region of significant springtime stratospheric/tropospheric exchange. Stratospheric air influences the Thailand plume after it is lofted to high altitudes via the Himalayas. Using comparisons of the model to the aircraft and satellite measurements, we estimate that the Kazakhstan plume is responsible for increases of O3 and CO mixing ratios by approximately 6.4 ppbv and 38 ppbv in the lower troposphere (height of 2 to 6 km), and the Thailand plume is responsible for increases of O3 and CO mixing ratios of approximately 11 ppbv and 71 ppbv in the upper troposphere (height of 8 to 12 km), respectively. However, there are significant sources of uncertainty in these estimates that point to the need for future improvements in both model and satellite observations. For example, it is challenging to characterize the fraction of air parcels from the stratosphere versus those from the

  8. Standardisation of a European measurement method for the determination of anions and cations in PM2.5: results of field trial campaign and determination of measurement uncertainty.

    Science.gov (United States)

    Beccaceci, Sonya; Brown, Richard J C; Butterfield, David M; Harris, Peter M; Otjes, René P; van Hoek, Caroline; Makkonen, Ulla; Catrambone, Maria; Patier, Rosalía Fernández; Houtzager, Marc M G; Putaud, Jean-Philippe

    2016-12-08

    European Committee for Standardisation (CEN) Technical Committee 264 'Air Quality' has recently produced a standard method for the measurements of anions and cations in PM2.5 within its Working Group 34 in response to the requirements of European Directive 2008/50/EC. It is expected that this method will be used in future by all Member States making measurements of the ionic content of PM2.5. This paper details the results of a field measurement campaign and the statistical analysis performed to validate this method, assess its uncertainty and define its working range to provide clarity and confidence in the underpinning science for future users of the method. The statistical analysis showed that, except for the lowest range of concentrations, the expanded combined uncertainty is expected to be below 30% at the 95% confidence interval for all ions except Cl(-). However, if the analysis is carried out on the lower concentrations found at rural sites the uncertainty can be in excess of 50% for Cl(-), Na(+), K(+), Mg(2+) and Ca(2+). An estimation of the detection limit for all ions was also calculated and found to be 0.03 μg m(-3) or below.

  9. Quantification of Uncertainty in Mathematical Models: The Statistical Relationship between Field and Laboratory pH Measurements

    Directory of Open Access Journals (Sweden)

    Kurt K. Benke

    2017-01-01

    Full Text Available The measurement of soil pH using a field portable test kit represents a fast and inexpensive method to assess pH. Field-based pH methods have been used extensively for agricultural advisory services and soil survey, and now for citizen soil science projects. In the absence of laboratory measurements, there is a practical need to model the laboratory pH as a function of the field pH to increase the density of data for soil research studies and Digital Soil Mapping. The accuracy and uncertainty in field pH measurements were investigated for soil samples from regional Victoria in Australia using both linear and sigmoidal models. For samples in water and CaCl2 at 1:5 dilutions, sigmoidal models provided improved accuracy over the full range of field pH values in comparison to linear models. The uncertainty in the field results was quantified by the 95% confidence interval (CI) and 95% prediction interval (PI) for the models, with 95% CI < 0.25 pH units and 95% PI = ±1.3 pH units, respectively. It was found that the Pearson criterion for robust regression analysis can be considered as an alternative to the orthodox least-squares modelling approach because it is more effective in addressing outliers in legacy data.
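
    A minimal sketch of the kind of sigmoidal calibration described above, using synthetic field/laboratory pH pairs rather than the Victorian legacy data, might fit a four-parameter logistic curve and report an approximate residual-based 95% prediction interval (the parameter values are illustrative and the interval construction ignores parameter uncertainty):

        import numpy as np
        from scipy.optimize import curve_fit

        def sigmoid(x, a, b, c, d):
            # 4-parameter logistic: lower asymptote d, upper asymptote d + a
            return d + a / (1.0 + np.exp(-b * (x - c)))

        # Synthetic field/laboratory pH pairs standing in for legacy survey data
        rng = np.random.default_rng(1)
        field_ph = rng.uniform(4.0, 9.5, 200)
        lab_ph = sigmoid(field_ph, 6.0, 1.1, 6.5, 3.5) + rng.normal(0, 0.4, field_ph.size)

        popt, pcov = curve_fit(sigmoid, field_ph, lab_ph, p0=[6.0, 1.0, 6.5, 3.5])
        residual_sd = np.std(lab_ph - sigmoid(field_ph, *popt), ddof=4)

        x_new = 6.8
        pred = sigmoid(x_new, *popt)
        print(f"predicted lab pH at field pH {x_new}: {pred:.2f} "
              f"± {1.96 * residual_sd:.2f} (approximate 95% prediction interval)")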

  10. Measurement and interpolation uncertainties in rainfall maps from cellular communication networks

    NARCIS (Netherlands)

    Rios Gaona, M.F.; Overeem, A.; Leijnse, H.; Uijlenhoet, R.

    2015-01-01

    Accurate measurements of rainfall are important in many hydrological and meteorological applications, for instance, flash-flood early-warning systems, hydraulic structures design, irrigation, weather forecasting, and climate modelling. Whenever possible, link networks measure and store the received

  11. Considering sampling strategy and cross-section complexity for estimating the uncertainty of discharge measurements using the velocity-area method

    Science.gov (United States)

    Despax, Aurélien; Perret, Christian; Garçon, Rémy; Hauet, Alexandre; Belleville, Arnaud; Le Coz, Jérôme; Favre, Anne-Catherine

    2016-02-01

    Streamflow time series provide baseline data for many hydrological investigations. Errors in the data mainly occur through uncertainty in gauging (measurement uncertainty) and uncertainty in the determination of the stage-discharge relationship based on gaugings (rating curve uncertainty). As the velocity-area method is the measurement technique typically used for gaugings, it is fundamental to estimate its level of uncertainty. Different methods are available in the literature (ISO 748, Q+, IVE), all with their own limitations and drawbacks. Among the terms forming the combined relative uncertainty in measured discharge, the uncertainty component relating to the limited number of verticals often accounts for a large part of the relative uncertainty. It should therefore be estimated carefully. In the ISO 748 standard, proposed values of this uncertainty component depend only on the number of verticals, without considering their distribution with respect to the depth and velocity cross-sectional profiles. The Q+ method is sensitive to a user-defined parameter, while it is questionable whether the IVE method is applicable to stream gaugings performed with a limited number of verticals. To address the limitations of existing methods, this paper presents a new methodology, called FLow Analog UnceRtainty Estimation (FLAURE), to estimate the uncertainty component relating to the limited number of verticals. High-resolution reference gaugings (with 31 or more verticals) are used to assess the uncertainty component through a statistical analysis. Instead of subsampling the verticals of these reference stream gaugings purely randomly, a subsampling method is developed in a way that mimics the behavior of a hydrometric technician. A sampling quality index (SQI) is suggested and appears to be a more explanatory variable than the number of verticals. This index takes into account the spacing between verticals and the variation of unit flow between two verticals. To compute the
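
    The velocity-area computation underlying such gaugings, and the effect of thinning the verticals, can be sketched as below; this is a toy mid-section calculation on a synthetic cross-section, not the FLAURE methodology or its sampling quality index:

        import numpy as np

        def midsection_discharge(positions, depths, velocities):
            """Mid-section velocity-area method: each vertical represents a panel
            extending halfway to its neighbours."""
            q = 0.0
            n = len(positions)
            for i in range(n):
                left = positions[0] if i == 0 else 0.5 * (positions[i - 1] + positions[i])
                right = positions[-1] if i == n - 1 else 0.5 * (positions[i] + positions[i + 1])
                q += depths[i] * velocities[i] * (right - left)
            return q

        # Hypothetical high-resolution reference gauging (31 verticals)
        x = np.linspace(0.0, 30.0, 31)              # distance across section (m)
        d = 2.0 * np.sin(np.pi * x / 30.0)          # depth profile (m)
        v = 1.2 * np.sin(np.pi * x / 30.0) ** 0.5   # velocity profile (m/s)
        q_ref = midsection_discharge(x, d, v)

        # Subsample to a typical operational gauging with fewer verticals
        idx = np.linspace(0, 30, 7).astype(int)
        q_sub = midsection_discharge(x[idx], d[idx], v[idx])
        print(f"reference Q = {q_ref:.2f} m3/s, 7-vertical Q = {q_sub:.2f} m3/s, "
              f"relative deviation = {100 * (q_sub - q_ref) / q_ref:.1f} %")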

  12. FIRM: Sampling-based feedback motion-planning under motion uncertainty and imperfect measurements

    KAUST Repository

    Agha-mohammadi, A.-a.

    2013-11-15

    In this paper we present feedback-based information roadmap (FIRM), a multi-query approach for planning under uncertainty which is a belief-space variant of probabilistic roadmap methods. The crucial feature of FIRM is that the costs associated with the edges are independent of each other, and in this sense it is the first method that generates a graph in belief space that preserves the optimal substructure property. From a practical point of view, FIRM is a robust and reliable planning framework. It is robust since the solution is a feedback and there is no need for expensive replanning. It is reliable because accurate collision probabilities can be computed along the edges. In addition, FIRM is a scalable framework, where the complexity of planning with FIRM is a constant multiplier of the complexity of planning with PRM. In this paper, FIRM is introduced as an abstract framework. As a concrete instantiation of FIRM, we adopt stationary linear quadratic Gaussian (SLQG) controllers as belief stabilizers and introduce the so-called SLQG-FIRM. In SLQG-FIRM we focus on kinematic systems and then extend to dynamical systems by sampling in the equilibrium space. We investigate the performance of SLQG-FIRM in different scenarios. © The Author(s) 2013.

  13. Total Measurement Uncertainty (TMU) for Nondestructive Assay of Transuranic (TRU) Waste at the WRAP Facility

    Energy Technology Data Exchange (ETDEWEB)

    CANTALOUB, M.G.

    2000-10-20

    At the WRAP facility, there are two identical imaging passive/active neutron (IPAN) assay systems and two identical gamma energy assay (GEA) systems. Currently, only the GEA systems are used to characterize waste, therefore, only the GEA systems are addressed in this document. This document contains the limiting factors relating to the waste drum analysis for shipments destined for WIPP. The TMU document provides the uncertainty basis in the NDA analysis of waste containers at the WRAP facility. The defined limitations for the current analysis scheme are as follows: (1) The WRAP waste stream debris is from the Hanford Plutonium Finishing Plant's process lines, primarily combustible materials. (2) Plutonium analysis range is from the minimum detectable concentration (MDC), Reference 6, to 200 grams (g). (3) The GEA system calibration density ranges from 0.013 g/cc to 1.6 g/cc. (4) PDP Plutonium drum densities were evaluated from 0.065 g/cc to 0.305 g/cc. (5) PDP Plutonium source weights ranged from 0.030 g to 318 g, in both empty and combustibles matrix drums. (6) The GEA system design density correction mass absorption coefficient table (MAC) is Lucite, a material representative of combustible waste. (7) Drums with material not fitting the debris waste criteria are targeted for additional calculations, reviews, and potential re-analysis using a calibration suited for the waste type.

  14. Total Measurement Uncertainty (TMU) for Nondestructive Assay of Transuranic (TRU) Waste at the WRAP Facility

    Energy Technology Data Exchange (ETDEWEB)

    CANTALOUB, M.G.

    2000-05-22

    At the WRAP facility, there are two identical imaging passive/active neutron (IPAN) assay systems and two identical gamma energy assay (GEA) systems. Currently, only the GEA systems are used to characterize waste, therefore, only the GEA systems are addressed in this document. This document contains the limiting factors relating to the waste drum analysis for shipments destined for WIPP. The TMU document provides the uncertainty basis in the NDA analysis of waste containers at the WRAP facility. The defined limitations for the current analysis scheme are as follows: The WRAP waste stream debris is from the Hanford Plutonium Finishing Plant's process lines, primarily combustible materials. Plutonium analysis range is from the minimum detectable concentration (MDC), Reference 6, to 160 grams (g). The GEA system calibration density ranges from 0.013 g/cc to 1.6 g/cc. PDP Plutonium drum densities were evaluated from 0.065 g/cc to 0.305 g/cc. PDP Plutonium source weights ranged from 0.030 g to 318 g, in both empty and combustibles matrix drums. The GEA system design density correction macroscopic absorption cross section table (MAC) is Lucite, a material representative of combustible waste. Drums with material not fitting the debris waste criteria are targeted for additional calculations, reviews, and potential re-analysis using a calibration suited for the waste type.

  15. Standardization of the Definitions of Vertical Resolution and Uncertainty in the NDACC-archived Ozone and Temperature Lidar Measurements

    Science.gov (United States)

    Leblanc, T.; Godin-Beekmann, S.; Payen, Godin-Beekmann; Gabarrot, Franck; vanGijsel, Anne; Bandoro, J.; Sica, R.; Trickl, T.

    2012-01-01

    The international Network for the Detection of Atmospheric Composition Change (NDACC) is a global network of high-quality, remote-sensing research stations for observing and understanding the physical and chemical state of the Earth's atmosphere. As part of NDACC, over 20 ground-based lidar instruments are dedicated to the long-term monitoring of atmospheric composition and to the validation of space-borne measurements of the atmosphere from environmental satellites such as Aura and ENVISAT. One caveat of large networks such as NDACC is the difficulty of archiving measurement and analysis information consistently from one research group (or instrument) to another [1][2][3]. Yet the need for consistent definitions has strengthened as datasets of various origins (e.g., satellite and ground-based) are increasingly used for intercomparisons and validation, and ingested together in global assimilation systems. In the framework of the 2010 Call for Proposals by the International Space Science Institute (ISSI) located in Bern, Switzerland, a team of lidar experts was created to address existing issues in three critical aspects of the NDACC lidar ozone and temperature data retrievals: signal filtering and the vertical filtering of the retrieved profiles, the quantification and propagation of the uncertainties, and the consistent definition and reporting of filtering and uncertainties in the NDACC-archived products. Additional experts from the satellite and global data standards communities complement the team to help address issues specific to the latter aspect.

  16. Greenhouse Gas (GHG) Source Detection and Attribution in the San Francisco Bay Area of California Using a Mobile Measurement Platform

    Science.gov (United States)

    Guha, A.; Bower, J.; Martien, P. T.; Perkins, I.; Randall, S.; Stevenson, E.; Young, A.; Hilken, H.

    2016-12-01

    The Bay Area Air Quality Management District is the greater San Francisco Bay metropolitan area's chief air quality regulatory agency. Aligning itself with the Governor's Executive Order S-3-05, the Air District has set a goal to reduce the region's GHG emissions by 80% below 1990 levels by the year 2050. The Air District's 2016 Clean Air Plan will lay out the agency's vision and actions to put the region on a path towards achieving the 2050 goal while also reducing air pollution and related health impacts. The 2016 Plan has three overarching objectives: (1) develop a multi-pollutant emissions control strategy, (2) reduce population exposure to harmful air pollutants, especially in vulnerable communities, and (3) protect the climate through a comprehensive Regional Climate Protection Strategy. To accomplish one of the 2016 Plan's control measures (SL3 - Greenhouse Gas Monitoring and Measurement Network), the Air District has fabricated a mobile measurement platform, i.e. a GHG research van, to perform targeted CH4 emissions hotspot detection and source attribution. The van is equipped with analyzers capable of measuring CH4, CO2 and N2O in ambient plumes at fast sampling rates. The coincident measurement of source tracers like isotopic methane (13C-CH4), CO and ethane (C2H6) provides the capability to distinguish between biogenic, combustion-based and fossil-based fugitive methane sources, respectively. The GHG research van is a comprehensive mobile tool to perform tracer-based GHG source identification and apportionment. We report observation-based, source-specific tracer-to-tracer emission ratios from a region-wide survey of well-known area sources like landfills, wastewater treatment facilities and dairies, and compare those with similar ratios in the Air District's GHG inventory. We also investigate plumes from potentially under-inventoried sources like anaerobic digesters, composting operations, active and plugged oil and gas wells, and a natural gas storage
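
    A minimal sketch of the tracer-ratio idea (regressing ethane enhancements on methane enhancements from a drive-by plume to separate fossil from biogenic sources) is given below; the baseline removal, the synthetic plume and the screening threshold are all hypothetical, not the Air District's procedure:

        import numpy as np

        def enhancement_ratio(ch4_ppb, c2h6_ppb):
            """Slope of ethane vs methane enhancements above their ambient baselines,
            estimated with an ordinary least-squares fit through the plume points."""
            d_ch4 = ch4_ppb - np.percentile(ch4_ppb, 5)    # crude baseline removal
            d_c2h6 = c2h6_ppb - np.percentile(c2h6_ppb, 5)
            slope, intercept = np.polyfit(d_ch4, d_c2h6, 1)
            return slope

        # Synthetic drive-by plume: landfill-like source (little co-emitted ethane)
        rng = np.random.default_rng(0)
        ch4 = 1900 + 400 * np.exp(-0.5 * ((np.arange(200) - 100) / 20) ** 2) + rng.normal(0, 5, 200)
        c2h6 = 1.5 + 0.002 * (ch4 - 1900) + rng.normal(0, 0.1, 200)

        r = enhancement_ratio(ch4, c2h6)
        # Hypothetical screening threshold: thermogenic (fossil) plumes typically show
        # a few percent ethane relative to methane, biogenic plumes close to zero.
        print(f"C2H6/CH4 enhancement ratio = {r:.3f}",
              "-> likely biogenic" if r < 0.01 else "-> likely fossil/thermogenic")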

  17. Measurement uncertainty in pulmonary vascular input impedance and characteristic impedance estimated from pulsed-wave Doppler ultrasound and pressure: clinical studies on 57 pediatric patients

    Science.gov (United States)

    Tian, Lian; Hunter, Kendall S; Kirby, K Scott; Ivy, D Dunbar; Shandas, Robin

    2010-01-01

    Pulmonary vascular input impedance better characterizes right ventricular (RV) afterload and disease outcomes in pulmonary hypertension compared to the standard clinical diagnostic, pulmonary vascular resistance (PVR). Early efforts to measure impedance were not routine, involving open-chest measurement. Recently, the use of pulsed-wave (PW) Doppler-measured velocity to non-invasively estimate instantaneous flow has made impedance measurement more practical. One critical concern remains with clinical use: the measurement uncertainty, especially since previous studies only incorporated random error. This study utilized data from a large pediatric patient population to comprehensively examine the systematic and random error contributions to the total impedance uncertainty and determined the least error prone methodology to compute impedance from among four different methods. We found that the systematic error contributes greatly to the total uncertainty and that one of the four methods had significantly smaller propagated uncertainty; however, even when this best method is used, the uncertainty can be large for input impedance at high harmonics and for the characteristic impedance modulus. Finally, we found that uncertainty in impedance between normotensive and hypertensive patient groups displays no significant difference. It is concluded that clinical impedance measurement would be most improved by advancements in instrumentation, and the best computation method is proposed for future clinical use of the input impedance. PMID:20410558
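
    For orientation, input impedance is commonly obtained as the ratio of pressure to flow harmonics from one averaged cardiac cycle; the sketch below uses synthetic waveforms and is not one of the four computation methods compared in the study:

        import numpy as np

        def input_impedance(pressure, flow, fs, n_harmonics=5):
            """Input impedance Z_n = P_n / Q_n at the first few heart-rate harmonics,
            computed from one averaged cardiac cycle sampled at fs (Hz)."""
            n = len(pressure)
            p_hat = np.fft.rfft(pressure) / n
            q_hat = np.fft.rfft(flow) / n
            z = p_hat[: n_harmonics + 1] / q_hat[: n_harmonics + 1]
            freqs = np.fft.rfftfreq(n, d=1.0 / fs)[: n_harmonics + 1]
            return freqs, z

        # Synthetic single-beat pressure (mmHg) and flow (L/min) waveforms, 1 Hz heart rate
        fs = 200.0
        t = np.arange(0, 1.0, 1.0 / fs)
        flow = 5.0 + sum((3.0 / k) * np.sin(2 * np.pi * k * t) for k in range(1, 6))
        pressure = 25.0 + sum((8.0 / k) * np.sin(2 * np.pi * k * t + 0.3) for k in range(1, 6))

        freqs, z = input_impedance(pressure, flow, fs)
        z0 = np.abs(z[0])            # zero-frequency modulus, analogous to resistance (PVR)
        zc = np.mean(np.abs(z[2:]))  # characteristic impedance from higher harmonics
        print(f"|Z| at harmonics {freqs} Hz: {np.round(np.abs(z), 2)}")
        print(f"Z(0) = {z0:.2f}, Zc = {zc:.2f} mmHg per (L/min)")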

  18. Line-averaging measurement methods to estimate the gap in the CO2 balance closure - possibilities, challenges, and uncertainties

    Science.gov (United States)

    Ziemann, Astrid; Starke, Manuela; Schütze, Claudia

    2017-11-01

    An imbalance of surface energy fluxes using the eddy covariance (EC) method is observed in global measurement networks although all necessary corrections and conversions are applied to the raw data. Mainly during nighttime, advection can occur, resulting in a closing gap that consequently should also affect the CO2 balances. There is the crucial need for representative concentration and wind data to measure advective fluxes. Ground-based remote sensing techniques are an ideal tool as they provide the spatially representative CO2 concentration together with wind components within the same voxel structure. For this purpose, the presented SQuAd (Spatially resolved Quantification of the Advection influence on the balance closure of greenhouse gases) approach applies an integrated method combination of acoustic and optical remote sensing. The innovative combination of acoustic travel-time tomography (A-TOM) and open-path Fourier-transform infrared spectroscopy (OP-FTIR) will enable an upscaling and enhancement of EC measurements. OP-FTIR instrumentation offers the significant advantage of real-time simultaneous measurements of line-averaged concentrations for CO2 and other greenhouse gases (GHGs). A-TOM is a scalable method to remotely resolve 3-D wind and temperature fields. The paper will give an overview about the proposed SQuAd approach and first results of experimental tests at the FLUXNET site Grillenburg in Germany. Preliminary results of the comprehensive experiments reveal a mean nighttime horizontal advection of CO2 of about 10 µmol m-2 s-1 estimated by the spatially integrating and representative SQuAd method. Additionally, uncertainties in determining CO2 concentrations using passive OP-FTIR and wind speed applying A-TOM are systematically quantified. The maximum uncertainty for CO2 concentration was estimated due to environmental parameters, instrumental characteristics, and retrieval procedure with a total amount of approximately 30 % for a single

  19. Line-averaging measurement methods to estimate the gap in the CO2 balance closure – possibilities, challenges, and uncertainties

    Directory of Open Access Journals (Sweden)

    A. Ziemann

    2017-11-01

    Full Text Available An imbalance of surface energy fluxes using the eddy covariance (EC) method is observed in global measurement networks although all necessary corrections and conversions are applied to the raw data. Mainly during nighttime, advection can occur, resulting in a closing gap that consequently should also affect the CO2 balances. There is the crucial need for representative concentration and wind data to measure advective fluxes. Ground-based remote sensing techniques are an ideal tool as they provide the spatially representative CO2 concentration together with wind components within the same voxel structure. For this purpose, the presented SQuAd (Spatially resolved Quantification of the Advection influence on the balance closure of greenhouse gases) approach applies an integrated method combination of acoustic and optical remote sensing. The innovative combination of acoustic travel-time tomography (A-TOM) and open-path Fourier-transform infrared spectroscopy (OP-FTIR) will enable an upscaling and enhancement of EC measurements. OP-FTIR instrumentation offers the significant advantage of real-time simultaneous measurements of line-averaged concentrations for CO2 and other greenhouse gases (GHGs). A-TOM is a scalable method to remotely resolve 3-D wind and temperature fields. The paper will give an overview about the proposed SQuAd approach and first results of experimental tests at the FLUXNET site Grillenburg in Germany. Preliminary results of the comprehensive experiments reveal a mean nighttime horizontal advection of CO2 of about 10 µmol m−2 s−1 estimated by the spatially integrating and representative SQuAd method. Additionally, uncertainties in determining CO2 concentrations using passive OP-FTIR and wind speed applying A-TOM are systematically quantified. The maximum uncertainty for CO2 concentration was estimated due to environmental parameters, instrumental characteristics, and retrieval procedure with a total amount of approximately

  20. Measuring the Higgs boson mass using event-by-event uncertainties

    NARCIS (Netherlands)

    Castelli, A.

    2015-01-01

    The thesis presents a measurement of the properties of the Higgs particle, performed by using the data collected by the ATLAS experiment in 2011 and 2012. The measurement is performed by using a three-dimensional model based on analytic functions to describe the signal produced by the Higgs boson

  1. Advances in uncertainty assessment using uncalibrated objects with freeform geometry on coordinate measuring machines

    DEFF Research Database (Denmark)

    Savio, Enrico; De Chiffre, Leonardo

    2002-01-01

    The paper describes some advances regarding establishment of traceability of freeform measurements on coordinate measuring machines using the “Uncalibrated Object” approach, which is currently being considered for development as a new ISO standard. The method deals with calibration of artefacts by...

  2. Total uncertainty of low velocity thermal anemometers for measurement of indoor air movements

    DEFF Research Database (Denmark)

    Jørgensen, F.; Popiolek, Z.; Melikov, Arsen Krikor

    2004-01-01

    developed mathematical model of the anemometer in combination with a large database of representative room flows measured with a 3-D Laser Doppler anemometer (LDA). A direct comparison between measurements with a thermal anemometer and a 3-D LDA in flows of varying velocity and turbulence intensity shows...

  3. Resolution, measurement errors and uncertainties on deflectometric acquisition of large optical surfaces "DaOS"

    Science.gov (United States)

    Hofbauer, E.; Rascher, R.; Friedke, F.; Kometer, R.

    2017-06-01

    The basic physical measurement principle in DaOS is the vignetting of a quasi-parallel light beam emitted by an expanded light source in an autocollimation arrangement. The beam is reflected by the surface under test, using invariant deflection by a moving and scanning pentaprism. Thereby nearly any curvature of the specimen is measurable. Resolution, systematic errors and random errors will be shown and explicitly discussed for the profile determination error. Measurements of a "plano-double-sombrero" device will be analyzed and reconstructed to find out the limit of resolution and the errors of the reconstruction model and algorithms. These measurements are compared critically to reference results recorded by interferometry and by the Deflectometric Flatness Reference (DFR) method using a scanning penta device.

  4. Dynamic Length Metrology (DLM) for measurements with sub-micrometre uncertainty in a production environment

    DEFF Research Database (Denmark)

    De Chiffre, Leonardo; Hansen, Hans Nørgaard; Hattel, Jesper Henri

    2016-01-01

    Conventional length metrology for traceable accurate measurements requires costly temperature controlled facilities, long waiting time for part acclimatisation, and separate part material characterisation. This work describes a method called Dynamic Length Metrology (DLM) developed to achieve sub...

  5. Environmental Effects on Measurement Uncertainties of Time-of-Flight Cameras

    DEFF Research Database (Denmark)

    Gudmundsson, Sigurjon Arni; Aanæs, Henrik; Larsen, Rasmus

    2007-01-01

    In this paper the effect the environment has on the SwissRanger SR3000 Time-Of-Flight camera is investigated. The accuracy of this camera is highly affected by the scene it is pointed at: Such as the reflective properties, color and gloss. Also the complexity of the scene has considerable effects...... description of how a surface color intensity influences the depth measurement, and illustrate how multiple reflections influence the resulting depth measurement....

  6. Uncertainty of Deardorff’s soil moisture model based on continuous TDR measurements for sandy loam soil

    Directory of Open Access Journals (Sweden)

    Brandyk Andrzej

    2016-03-01

    Full Text Available Knowledge of soil moisture is indispensable for a range of hydrological models, since it exerts a considerable influence on runoff conditions. Proper tools are nowadays applied in order to gain insight into soil moisture status, especially of the uppermost soil layers, which are prone to weather changes and land use practices. In order to establish relationships between meteorological conditions and topsoil moisture, a simple model is required, characterized by low computational effort, a simple structure and a low number of identified and calibrated parameters. We demonstrated that an existing model for shallow soils, considering mass exchange between two layers (the upper and the lower) as well as with the atmosphere and subsoil, worked well for a sandy loam with a deep groundwater table in the Warsaw conurbation. GLUE (Generalized Likelihood Uncertainty Estimation) linked with GSA (Global Sensitivity Analysis) provided the final determination of parameter values and model confidence ranges. Including the uncertainty in the model structure caused the median GLUE soil moisture solution to be shifted from the deterministically optimal one. From the point of view of practical model application, the main shortcoming was the underestimated water exchange rate between the lower soil layer (ranging from a depth of 0.1 to 0.2 m below ground level) and the subsoil. General model quality was found to be satisfactory and promising for establishing measures to regain retention under urbanized conditions.
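
    A minimal GLUE sketch for a toy one-parameter dry-down model (not the two-layer model used in the study; the informal likelihood and the behavioural threshold are illustrative choices) could look like this:

        import numpy as np

        # Toy "model": exponential dry-down of topsoil moisture, theta(t) = theta0 * exp(-k t)
        def model(k, theta0, t):
            return theta0 * np.exp(-k * t)

        # Synthetic TDR-like observations with measurement noise
        t_obs = np.arange(0.0, 10.0, 0.5)
        theta_obs = model(0.15, 0.30, t_obs) + np.random.default_rng(2).normal(0, 0.01, t_obs.size)

        # GLUE: Monte Carlo parameter sampling, informal likelihood, behavioural threshold
        rng = np.random.default_rng(3)
        k_samples = rng.uniform(0.01, 0.5, 20_000)
        sims = model(k_samples[:, None], 0.30, t_obs[None, :])
        sse = np.sum((sims - theta_obs) ** 2, axis=1)
        likelihood = np.exp(1.0 - sse / sse.min())   # informal likelihood, maximum = 1
        behavioural = likelihood > 0.5               # subjective acceptance threshold

        k_beh = k_samples[behavioural]
        w = likelihood[behavioural] / likelihood[behavioural].sum()
        order = np.argsort(k_beh)
        cdf = np.cumsum(w[order])
        lo, hi = np.interp([0.05, 0.95], cdf, k_beh[order])
        print(f"behavioural parameter sets: {behavioural.sum()}")
        print(f"90% GLUE uncertainty range for k: [{lo:.3f}, {hi:.3f}]")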

  7. Optical depth measurements by shadow-band radiometers and their uncertainties.

    Science.gov (United States)

    Alexandrov, Mikhail D; Kiedron, Peter; Michalsky, Joseph J; Hodges, Gary; Flynn, Connor J; Lacis, Andrew A

    2007-11-20

    Shadow-band radiometers in general, and especially the Multi-Filter Rotating Shadow-band Radiometer (MFRSR), are widely used for atmospheric optical depth measurements. The major programs running MFRSR networks in the United States include the Department of Energy Atmospheric Radiation Measurement (ARM) Program, U.S. Department of Agriculture UV-B Monitoring and Research Program, National Oceanic and Atmospheric Administration Surface Radiation (SURFRAD) Network, and NASA Solar Irradiance Research Network (SIRN). We discuss a number of technical issues specific to shadow-band radiometers and their impact on the optical depth measurements. These problems include instrument tilt and misalignment, as well as some data processing artifacts. Techniques for data evaluation and automatic detection of some of these problems are described.
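
    The underlying optical-depth retrieval for such radiometers is the Beer-Lambert relation between the direct-normal signal and a Langley-calibrated extraterrestrial value; the numbers below are hypothetical and the simple plane-parallel airmass ignores refraction and Earth curvature:

        import numpy as np

        def total_optical_depth(v_direct, v_extraterrestrial, solar_zenith_deg):
            """Beer-Lambert: V = V0 * exp(-m * tau), with a plane-parallel airmass."""
            m = 1.0 / np.cos(np.radians(solar_zenith_deg))
            return -np.log(v_direct / v_extraterrestrial) / m

        # Hypothetical 500 nm channel: Langley-derived V0 and a midday measurement
        v0, v, sza = 1.85, 1.12, 35.0
        tau = total_optical_depth(v, v0, sza)
        tau_rayleigh = 0.145   # approximate molecular (Rayleigh) contribution at 500 nm
        print(f"total tau = {tau:.3f}, aerosol tau ≈ {tau - tau_rayleigh:.3f}")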

  8. Intolerance of Uncertainty Scale: Measurement invariance among adolescent boys and girls and relationships with anxiety and risk taking.

    Science.gov (United States)

    Dekkers, Laura M S; Jansen, Brenda R J; Salemink, Elske; Huizenga, Hilde M

    2017-06-01

    Adolescence-related increases in both anxiety and risk taking may originate in variability in Intolerance of Uncertainty (IU), rendering the study of IU of importance. We therefore studied the psychometric properties of the Intolerance of Uncertainty Scale-Short version (IUS-12), including its associations with trait anxiety and risk taking, among adolescents. A sample of 879 Dutch adolescents, from diverse educational levels and with an equal distribution of boys and girls, was classically tested. To obtain indices of IU, self-reported trait anxiety and need for risk taking, questionnaires were administered; to obtain an index of risk taking behavior, adolescents performed a risk taking task. Multi-group Confirmatory Factor Analyses revealed that the IUS-12 consists of a Prospective and an Inhibitory IU subscale, which are partially measurement invariant across sex. Cronbach's alphas and item-total correlations revealed that the IUS-12 and its subscales have reasonable-to-good internal consistency. Correlational analyses support convergent validity, as higher IUS-12 scores were related to, respectively, higher and lower levels of self-reported trait anxiety and need for risk taking. However, we found no relationship between IUS-12 scores and risk taking behavior, operationalized by performance on the risk taking task. A community, instead of clinical, sample was included. Also, IU was measured by a paper-and-pencil version of the IUS-12, instead of a computerized version. The IUS-12 has good psychometric properties and may be a central measure to assess IU, which may help to explain the adolescence-related increase in both anxiety and risk taking. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. Experimental assessment of optical uncertainty components in the measurement of an optomechanical hole plate

    DEFF Research Database (Denmark)

    Morace, Renata Erica; Hansen, Hans Nørgaard; De Chiffre, Leonardo

    2004-01-01

    and two objectives with different magnification. An optical CMM equipped with a CCD camera was used for the investigation. The measurement results were compared with the mechanical calibration values of the 25 holes. It was observed that the results are not significantly affected by the magnification...

  10. Minimizing measurement uncertainties of coniferous needle-leaf optical properties, part I: methodological review

    NARCIS (Netherlands)

    Yanez Rausell, L.; Schaepman, M.E.; Clevers, J.G.P.W.; Malenovsky, Z.

    2014-01-01

    Optical properties (OPs) of non-flat narrow plant leaves, i.e., coniferous needles, are extensively used by the remote sensing community, in particular for calibration and validation of radiative transfer models at leaf and canopy level. Optical measurements of such small living elements are,

  11. Estimation of the thermal diffusion coefficient in fusion plasmas taking frequency measurement uncertainties into account

    NARCIS (Netherlands)

    van Berkel, M.; Zwart, Heiko J.; Hogeweij, G.M.D.; van der Steen, G.; van den Brand, H.; de Baar, M.R.

    2014-01-01

    In this paper, the estimation of the thermal diffusivity from perturbative experiments in fusion plasmas is discussed. The measurements used to estimate the thermal diffusivity suffer from stochastic noise. Accurate estimation of the thermal diffusivity should take this into account. It will be

  12. Quantifying uncertainty of measuring gully morphological evolution with close-range digital photogrammetry

    Science.gov (United States)

    Measurement of geomorphic change may be of interest to researchers and practitioners in a variety of fields including geology, geomorphology, hydrology, engineering, and soil science. Landscapes are often represented by digital elevation models. Surface models generated of the same landscape over a ...

  13. Quantification and uncertainty analysis of a structural monitoring device: detection of chloride in concrete using DC electrical resistivity measurement

    Science.gov (United States)

    Lecieux, Yann; Schoefs, Franck; Bonnet, Stéphanie; Lecieux, Trystan; Palma Lopes, Sérgio

    2015-07-01

    In this work, we seek to assess and optimise the performance of an integrated chloride detection sensor based on the DC electrical resistivity measurement. We specifically seek to evaluate the detection threshold of chlorides. The main problem of the resistivity measurements in concrete is the dispersion of results mainly linked to the material heterogeneity and electrical contact limitations. To take into account the uncertainty of the measurement, we used a measurement device and sensors based on Geoelectrical Imaging methods. It gives richer information than the one delivered by conventional resistivity measurement sensors. Thus, it allows the use of statistical analysis and quality assessment methods. Before performing tests in concrete, we worked in a medium of known and homogenous resistivity in order to assess and optimise the performances of the acquisition system. Then we performed tests on concrete specimens containing different contents of chloride ions. Performance assessment of the resistivity probe is based on an analysis of receiver operating characteristic curves and the detection threshold of chlorides is calculated using the αδ method.

  14. Evaluating the Uncertainties in the Electron Temperature and Radial Speed Measurements Using White Light Corona Eclipse Observations

    Science.gov (United States)

    Reginald, Nelson L.; Davilla, Joseph M.; St. Cyr, O. C.; Rastaetter, Lutz

    2014-01-01

    We examine the uncertainties in two plasma parameters from their true values in a simulated asymmetric corona. We use the Corona Heliosphere (CORHEL) and Magnetohydrodynamics Around the Sphere (MAS) models in the Community Coordinated Modeling Center (CCMC) to investigate the differences between an assumed symmetric corona and a more realistic, asymmetric one. We were able to predict the electron temperatures and electron bulk flow speeds to within ±0.5 MK and ±100 km s-1, respectively, over coronal heights up to 5.0 R from Sun center. We believe that this technique could be incorporated in next-generation white-light coronagraphs to determine these electron plasma parameters in the low solar corona. We have conducted experiments in the past during total solar eclipses to measure the thermal electron temperature and the electron bulk flow speed in the radial direction in the low solar corona. These measurements were made at different altitudes and latitudes in the low solar corona by measuring the shape of the K-coronal spectra between 350 nm and 450 nm and two brightness ratios through filters centered at 385.0 nm/410.0 nm and 398.7 nm/423.3 nm with a bandwidth of approximately 4 nm. Based on symmetric coronal models used for these measurements, the two measured plasma parameters were expected to represent those values at the points where the lines of sight intersected the plane of the solar limb.

  15. On the Correction of Spatial and Statistical Uncertainties in Systematic Measurements of 222Rn for Earthquake Prediction

    Science.gov (United States)

    Külahcı, Fatih; Şen, Zekâi

    2013-12-01

    In earthquake prediction studies, the regional behaviour of accurate 222Rn measurements at a set of sites plays a significant role. Here, measurements are obtained using active and passive radon detector systems in an earthquake-active region of Turkey. Two new methods are proposed to explain the spatial behaviours and the statistical uncertainties in the 222Rn emission measurements along fault lines in relation to earthquake occurrence. The absolute point cumulative semivariogram (APCSV) and perturbation method (PM) help to depict the spatial distribution patterns of 222Rn in addition to the joint effects of Kdr, the radon distribution coefficient, and the perturbation radon distribution coefficient (PRDC). The Kdr coefficient assists in identifying the spatial distributional behaviour in 222Rn concentrations and their migration along the Earth's surface layers. The PRDC considers not only the arithmetic averages but also the variances (or standard deviations) and the correlation coefficients, in addition to the size of the error among the 222Rn measurements. The applications of these methodologies are performed for 13,000 222Rn measurements that are deemed to be sufficient for the characterization of tectonics in the Keban Reservoir along the East Anatolian Fault System (EAFS) in Turkey. The results are evaluated for the İçme earthquake (ML 5.4, 5.7 km, 23 June 2011), which occurred in the vicinity of the EAFS.

  16. Design of a machine for the universal non-contact measurement of large free-form optics with 30 nm uncertainty

    NARCIS (Netherlands)

    Henselmans, R.; Rosielle, P.C.J.N.; Steinbuch, M.; Saunders, I.; Bergmans, R.

    2005-01-01

    A new universal non-contact measurement machine design for measuring free-form optics with 30 nm expanded uncertainty is presented. In the cylindrical machine concept, an optical probe with 5 mm range is positioned over the surface by a motion system. Due to a 2nd order error effect when measuring

  17. An indirect accuracy calibration and uncertainty evaluation method for large scale inner dimensional measurement system

    Science.gov (United States)

    Liu, Bai-Ling; Qu, Xing-Hua

    2013-10-01

    In view of the low accuracy, limited range and low level of automation of existing large-scale diameter inspection instruments, a precise measuring system (robot) based on a laser displacement sensor was designed for large-scale inner diameters. Since the traditional calibration tool for such a robot is expensive and hard to manufacture, an indirect calibration method is proposed. In this study, the system eccentric error is calibrated with a laboratory ring gauge. An experiment, in which the installation order of the locating rods is changed to introduce the rods' eccentric error, is designed to test whether the spindle eccentric error remains unchanged. The experiment shows that the variation of the spindle's eccentricity after changing the rods is within 0.02 mm. Because the spindle is an unchanged part of the robot, once the Φ584 series robot is calibrated with the ring gauge, the calibration of other series can be deduced by combining it with the length of the extended arm.

  18. Application of Allan Deviation to Assessing Uncertainties of Continuous-measurement Instruments, and Optimizing Calibration Schemes

    Science.gov (United States)

    Jacobson, Gloria; Rella, Chris; Farinas, Alejandro

    2014-05-01

    Technological advancement of instrumentation in atmospheric and other geoscience disciplines over the past decade has led to a shift from discrete sample analysis to continuous, in-situ monitoring. Standard error analysis used for discrete measurements is not sufficient to assess and compare the error contribution of noise and drift from continuous-measurement instruments, and a different statistical analysis approach should be applied. The Allan standard deviation analysis technique developed for atomic clock stability assessment by David W. Allan [1] can be effectively and gainfully applied to continuous measurement instruments. As an example, P. Werle et al. have applied these techniques to examine signal averaging for atmospheric monitoring by Tunable Diode-Laser Absorption Spectroscopy (TDLAS) [2]. This presentation will build on and translate prior foundational publications to provide contextual definitions and guidelines for the practical application of this analysis technique to continuous scientific measurements. The specific example of a Picarro G2401 Cavity Ringdown Spectroscopy (CRDS) analyzer used for continuous atmospheric monitoring of CO2, CH4 and CO will be used to define the basic features of the Allan deviation, assess factors affecting the analysis, and explore the time-series to Allan deviation plot translation for different types of instrument noise (white noise, linear drift, and interpolated data). In addition, the application of the Allan deviation to optimizing and predicting the performance of different calibration schemes will be presented. Even though this presentation will use the specific example of the Picarro G2401 CRDS analyzer for atmospheric monitoring, the objective is to present the information such that it can be successfully applied to other instrument sets and disciplines. [1] D.W. Allan, "Statistics of Atomic Frequency Standards," Proc. IEEE, vol. 54, pp 221-230, Feb 1966 [2] P. Werle, R. Mücke, F. Slemr, "The Limits
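
    A minimal non-overlapping Allan deviation computation for a regularly sampled record (synthetic CO2-like data with white noise plus a slow drift; not Picarro's or Werle's implementation) can be sketched as:

        import numpy as np

        def allan_deviation(y, dt, taus):
            """Non-overlapping Allan deviation of a regularly sampled series y (spacing dt)
            at the requested averaging times taus (seconds)."""
            adevs = []
            for tau in taus:
                m = int(round(tau / dt))          # samples per averaging bin
                n_bins = len(y) // m
                if n_bins < 2:
                    adevs.append(np.nan)
                    continue
                bins = y[: n_bins * m].reshape(n_bins, m).mean(axis=1)
                avar = 0.5 * np.mean(np.diff(bins) ** 2)
                adevs.append(np.sqrt(avar))
            return np.array(adevs)

        # Synthetic CO2 record: white noise plus slow linear drift (ppm), 1 Hz sampling
        rng = np.random.default_rng(4)
        n, dt = 36_000, 1.0
        y = 400 + rng.normal(0, 0.05, n) + 1e-5 * np.arange(n)

        taus = np.array([1, 10, 100, 1000, 3600])
        for tau, ad in zip(taus, allan_deviation(y, dt, taus)):
            print(f"tau = {tau:6.0f} s  Allan deviation = {ad:.4f} ppm")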

  19. A Monte-Carlo investigation of the uncertainty of acoustic decay measurements

    DEFF Research Database (Denmark)

    Cabo, David Pérez; Seoane, Manuel A. Sobreira; Jacobsen, Finn

    2012-01-01

    , taking into account the influence of the magnitude response and the phase distortion. It will be shown how the error not only depends on the filter but also on the modal density and the position of the resonances of the system under test within the frequency band. A Monte-Carlo computer simulation has...... of acoustic decay measurements can be estimated. Different filters will be analysed: linear phase FIR and IIR filters both in their direct and time-reversed versions. © European Acoustics Association....

  20. Variants of Uncertainty

    Science.gov (United States)

    1981-05-15

    of the states of uncertainty which such statements may express, following the scheme shown in Figure 2. The two levels of the figure, attributions...

  1. Investigating risk and robustness measures for supply chain network design under demand uncertainty

    DEFF Research Database (Denmark)

    Govindan, Kannan; Fattahi, Mohammad

    2017-01-01

    This paper addresses a multi-stage and multi-period supply chain network design problem in which multiple commodities should be produced through different subsequent levels of manufacturing processes. The problem is formulated as a two-stage stochastic program under stochastic and highly time......-variable demands. To deal with the stochastic demands, a Latin Hypercube Sampling method is applied to generate a fan of scenarios and then, a backward scenario reduction technique reduces the number of scenarios. Weighted mean-risk objectives by using different risk measures and minimax objective are examined...... to obtain risk-averse and robust solutions, respectively. Computational results are presented on a real-life case study to illustrate the applicability of the proposed approaches. To compare these different decision-making situations, a simulation approach is used. Furthermore, by several test problems...

  2. Verification of uncertainty budgets

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Madsen, B.S.

    2005-01-01

    The quality of analytical results is expressed by their uncertainty, as it is estimated on the basis of an uncertainty budget; little effort is, however, often spent on ascertaining the quality of the uncertainty budget. The uncertainty budget is based on circumstantial or historical data......, and therefore it is essential that the applicability of the overall uncertainty budget to actual measurement results be verified on the basis of current experimental data. This should be carried out by replicate analysis of samples taken in accordance with the definition of the measurand, but representing...... the full range of matrices and concentrations for which the budget is assumed to be valid. In this way the assumptions made in the uncertainty budget can be experimentally verified, both as regards sources of variability that are assumed negligible, and dominant uncertainty components. Agreement between...

  3. Prediction uncertainty of plume characteristics derived from a small number of measuring points

    Science.gov (United States)

    French, H. K.; van der Zee, S. E. A. T. M.; Leijnse, A.

    A small number of measuring points may inflict a bias on the characterisation of flow and transport based on field experiments in the unsaturated zone. Simulation of pure advective transport of a Gaussian plume through a setup of 30 regularly placed measuring points revealed regular temporal fluctuations about the real spatial moments. An irregular setup predicted both irregular fluctuations and larger discrepancies from the real value. From these considerations, a regular setup is recommended. Spatial moments were sensitive to the plume size relative to the distance between individual measuring points. To reduce prediction errors of the variance, the distance between the measuring points should be less than twice the standard deviation of the examined plume. The total size of the setup should cover several standard deviations of the plume to avoid mass being lost from the monitored area. Numerical simulations of a dispersing plume (comparing calculations based on 9000 nodes with 30 measuring points) revealed that vertical and horizontal centres of mass were predicted well at all degrees of heterogeneity, and the same was the case for horizontal variances. Vertical variances were more susceptible to prediction errors, but estimates were of the same order of magnitude as the real values. Résumé: When seeking to characterise flow and transport from field experiments in the unsaturated zone, a small number of measuring points may introduce a bias. Simulation of purely advective transport of a Gaussian plume through a set of 30 regularly spaced measuring points revealed regular temporal fluctuations around the real spatial moments. An irregular setup predicted both irregular fluctuations and larger discrepancies from the real value. From these considerations, a regular setup is recommended. The spatial moments appeared
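
    The spatial moments referred to above (plume centre of mass and variance estimated from a finite set of measuring points) can be computed as in the sketch below, here for a synthetic Gaussian plume sampled on a regular grid of 30 points:

        import numpy as np

        def spatial_moments(x, z, c):
            """Centre of mass and central second moments of a plume sampled
            at point locations (x, z) with concentrations c."""
            m0 = c.sum()
            x_c, z_c = (c * x).sum() / m0, (c * z).sum() / m0
            var_x = (c * (x - x_c) ** 2).sum() / m0
            var_z = (c * (z - z_c) ** 2).sum() / m0
            return (x_c, z_c), (var_x, var_z)

        # Regular 6 x 5 grid of 30 measuring points sampling a Gaussian plume
        xg, zg = np.meshgrid(np.linspace(-1.5, 1.5, 6), np.linspace(0.2, 2.2, 5))
        x, z = xg.ravel(), zg.ravel()
        true_centre, true_sigma = (0.3, 1.0), (0.5, 0.4)
        c = np.exp(-0.5 * (((x - true_centre[0]) / true_sigma[0]) ** 2
                           + ((z - true_centre[1]) / true_sigma[1]) ** 2))

        centre, variance = spatial_moments(x, z, c)
        print(f"estimated centre: ({centre[0]:.2f}, {centre[1]:.2f}), "
              f"estimated variances: ({variance[0]:.2f}, {variance[1]:.2f})")
        print(f"true variances: ({true_sigma[0]**2:.2f}, {true_sigma[1]**2:.2f})")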

  4. Measurement Uncertainty Analysis of an Accelerometer Calibration Using a POC Electromagnetic Launcher

    Energy Technology Data Exchange (ETDEWEB)

    Timpson, Erik J.; Engel, T. G.

    2012-06-12

    A pulse forming network (PFN), helical electromagnetic launcher (HEML), command module (CM), and calibration table (CT) were built and evaluated for their combined ability to calibrate an accelerometer. The PFN has a maximum stored energy of 19.25 kJ and is fired by a silicon controlled rectifier (SCR), with appropriate safety precautions. The HEML is constructed of G-10 fiberglass reinforced epoxy and is designed to accelerate a mass of 600 grams to a velocity of 10 meters per second. The CM is microcontroller-based, running Arduino software. The CM has a keypad input and seven-segment displays of the PFN voltage and the desired charging voltage. After a desired PFN voltage is entered, the CM controls the charging of the PFN. When the two voltages are equal, it sends a pulse to the SCR to fire the PFN and, in turn, the HEML. The HEML projectile's tip hits a target that is held by the CT. The CT consists of a table to hold the PFN and HEML, a vacuum chuck, an air bearing, a velocimeter and a catch pot. The target is held with the vacuum chuck awaiting impact. After impact, the air bearing allows the target to fall freely so that the velocimeter can read accurately. A known acceleration is determined from the known change in velocity of the target. Thus, if an accelerometer were attached to the target, the measured value could be compared to the known value.

  5. Strategy for addressing composition uncertainties in a Hanford high-level waste vitrification plant

    Energy Technology Data Exchange (ETDEWEB)

    Bryan, M.F.; Piepel, G.F.

    1996-03-01

    Various requirements will be imposed on the feed material and glass produced by the high-level waste (HLW) vitrification plant at the Hanford Site. A statistical process/product control system will be used to control the melter feed composition and to check and document product quality. Two general types of uncertainty are important in HLW vitrification process/product control: model uncertainty and composition uncertainty. Model uncertainty is discussed by Hrma, Piepel, et al. (1994). Composition uncertainty includes the uncertainties inherent in estimates of feed composition and other process measurements. Because feed composition is a multivariate quantity, multivariate estimates of composition uncertainty (i.e., covariance matrices) are required. Three components of composition uncertainty will play a role in estimating and checking batch and glass attributes: batch-to-batch variability, within-batch uncertainty, and analytical uncertainty. This document reviews the techniques to be used in estimating and updating composition uncertainties and in combining these composition uncertainties with model uncertainty to yield estimates of (univariate) uncertainties associated with estimates of batch and glass properties.
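
    As a hedged illustration of the kind of propagation described above (not the actual DWPF property models or covariances), a multivariate composition covariance, built by summing batch-to-batch, within-batch and analytical components, can be pushed through a linear property model to first order and combined with a model-uncertainty term:

        import numpy as np

        # Hypothetical 3-component feed composition (mass fractions) and a linear
        # glass-property model p = g . x (coefficients are illustrative only)
        x_mean = np.array([0.45, 0.35, 0.20])
        g = np.array([2.0, -1.5, 3.0])

        # Three composition-uncertainty components as covariance matrices
        cov_batch = np.diag([1e-4, 8e-5, 5e-5])       # batch-to-batch variability
        cov_within = np.diag([4e-5, 4e-5, 2e-5])      # within-batch uncertainty
        cov_analytical = np.diag([2e-5, 2e-5, 1e-5])  # analytical uncertainty
        cov_total = cov_batch + cov_within + cov_analytical

        p_est = g @ x_mean
        var_model = 0.02 ** 2                          # property-model uncertainty (illustrative)
        var_p = g @ cov_total @ g + var_model          # first-order propagation plus model term
        print(f"property estimate {p_est:.3f} ± {2 * np.sqrt(var_p):.3f} (k = 2)")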

  6. The Suicidal Ideation Attributes Scale (SIDAS): Community-Based Validation Study of a New Scale for the Measurement of Suicidal Ideation

    NARCIS (Netherlands)

    van Spijker, B.A.J.; Batterham, P.J.; Calear, A.L.; Farrer, L.; Christensen, H.; Reynolds, J.; Kerkhof, A.

    2014-01-01

    While suicide prevention efforts are increasingly being delivered using technology, no scales have been developed specifically for web-based use. The Suicidal Ideation Attributes Scale (SIDAS) was developed and validated as a brief, web-based measure for severity of suicidal ideation, using an

  7. Intolerance of Uncertainty Scale: Measurement invariance among adolescent boys and girls and relationships with anxiety and risk taking

    NARCIS (Netherlands)

    Dekkers, L.M.S.; Jansen, B.R.J.; Salemink, E.; Huizenga, H.M.

    Background and Objectives Adolescence-related increases in both anxiety and risk taking may originate in variability in Intolerance of Uncertainty (IU), rendering the study of IU of importance. We therefore studied the psychometric properties of the Intolerance of Uncertainty Scale–Short version

  8. Incertezza di misura e valori limite di legge: un altro passo avanti - Measurement uncertainty and limit values: another step ahead

    Directory of Open Access Journals (Sweden)

    Massimo Garai

    2015-09-01

    Full Text Available The work presents the content of UNI/TS 11326-2 on the comparison of measured values with limit values in applied acoustics. After a sketch of the general framework from the author's point of view, the work highlights and comments step by step on the logical process leading from the problem to a technically sound solution. In conclusion, some open problems of a non-technical nature are reviewed. It is believed that the explicit presentation of the background philosophy and of the potential of UNI/TS 11326-2 may contribute to a wider and better use of measurement uncertainty in applied acoustics.

  9. Capabilities and uncertainties of aircraft measurements for the validation of satellite precipitation products – a virtual case study

    Directory of Open Access Journals (Sweden)

    Andrea Lammert

    2015-08-01

    Full Text Available Remote sensing sensors on board research aircraft provide detailed measurements of clouds and precipitation which can be used as reference data to validate satellite products. Satellite-derived precipitation data from passive microwave radiometers, with a typical resolution of 50 × 50 km2, stand against airborne measurements with high spatial and temporal resolution, but available only along a chosen flight line. This paper focuses on the uncertainty arising from the different spatial resolution and coverage. We therefore use a perfect-model approach, with a highly resolved forecast model yielding perfect virtual aircraft and satellite observations. The mean precipitation and standard deviation per satellite box were estimated with a Gaussian approach. The comparison of the mean values shows a high correlation of 0.92, but a very wide spread. As a criterion to define good agreement between the satellite mean and the reference, we chose a deviation of one standard deviation of the virtual aircraft as the threshold. Considering flight tracks in the range of 50 km (one overflight), perfect agreement of satellite and aircraft observations is detected in only 65 % of the cases. To increase this low reliability, the precipitation distributions of the virtual aircraft were fitted by a gamma density function. Using the same quality criterion, the use of the gamma density fit improves the aircraft reliability to up to 80 %.
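
    A toy version of the virtual-sampling experiment (synthetic gamma-distributed rain rates in one satellite box, a 50-pixel "aircraft track", and a gamma-density fit to the track sample; all values are illustrative) might look like:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(5)

        # Synthetic "truth" inside one 50 x 50 km2 satellite box: gamma-distributed rain rates
        true_rates = stats.gamma.rvs(a=0.8, scale=2.5, size=2500, random_state=rng)  # mm/h
        satellite_mean = true_rates.mean()

        # One aircraft overflight samples only a ~50 km line (here: 50 of the 2500 pixels)
        track = rng.choice(true_rates, size=50, replace=False)
        track_mean, track_std = track.mean(), track.std(ddof=1)

        # Gamma-density fit to the track sample (location fixed at zero)
        a_fit, _, scale_fit = stats.gamma.fit(track, floc=0)
        fitted_mean = a_fit * scale_fit

        agrees = abs(satellite_mean - track_mean) < track_std
        print(f"satellite-box mean {satellite_mean:.2f} mm/h, "
              f"track mean {track_mean:.2f} ± {track_std:.2f} mm/h")
        print(f"gamma-fit mean {fitted_mean:.2f} mm/h, agreement within one std: {agrees}")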

  10. A Bayesian hierarchical model for climate change detection and attribution

    Science.gov (United States)

    Katzfuss, Matthias; Hammerling, Dorit; Smith, Richard L.

    2017-06-01

    Regression-based detection and attribution methods continue to take a central role in the study of climate change and its causes. Here we propose a novel Bayesian hierarchical approach to this problem, which allows us to address several open methodological questions. Specifically, we take into account the uncertainties in the true temperature change due to imperfect measurements, the uncertainty in the true climate signal under different forcing scenarios due to the availability of only a small number of climate model simulations, and the uncertainty associated with estimating the climate variability covariance matrix, including the truncation of the number of empirical orthogonal functions (EOFs) in this covariance matrix. We apply Bayesian model averaging to assign optimal probabilistic weights to different possible truncations and incorporate all uncertainties into the inference on the regression coefficients. We provide an efficient implementation of our method in a software package and illustrate its use with a realistic application.
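
    A highly simplified sketch of Bayesian model averaging over signal truncations (here using BIC-based weights and a synthetic regression problem, which is much cruder than the hierarchical model proposed in the paper) is shown below:

        import numpy as np

        rng = np.random.default_rng(6)

        # Synthetic detection/attribution problem: observations y = beta * signal + noise,
        # where the signal is represented with an increasing number of leading "EOFs"
        n, k_max, beta_true = 60, 10, 0.8
        eofs = rng.normal(size=(n, k_max))
        weights_true = np.linspace(1.0, 0.1, k_max)
        y = beta_true * (eofs @ weights_true) + rng.normal(0, 1.0, n)

        log_evidence, betas = [], []
        for k in range(1, k_max + 1):
            s = eofs[:, :k] @ weights_true[:k]       # signal truncated to k EOFs
            beta_hat = (s @ y) / (s @ s)             # least-squares scaling factor
            rss = np.sum((y - beta_hat * s) ** 2)
            bic = n * np.log(rss / n) + np.log(n)    # one regression parameter
            log_evidence.append(-0.5 * bic)
            betas.append(beta_hat)

        log_ev = np.array(log_evidence)
        w = np.exp(log_ev - log_ev.max())
        w /= w.sum()
        beta_bma = float(np.dot(w, betas))
        print("truncation weights:", np.round(w, 3))
        print(f"model-averaged scaling factor beta = {beta_bma:.2f} (true value {beta_true})")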

  11. Measurement uncertainty and probability

    National Research Council Canada - National Science Library

    Willink, Robin

    2013-01-01

    ... and probability models; 3.4 Inference and confidence; 3.5 Two central limit theorems; 3.6 The Monte Carlo method and process simulation; 4 The randomization of systematic errors; 4.1 The Working Group of 1980; 4.2 From classical repetition to practica...

  12. [Application of robustness test for assessment of the measurement uncertainty at the end of development phase of a chromatographic method for quantification of water-soluble vitamins].

    Science.gov (United States)

    Ihssane, B; Bouchafra, H; El Karbane, M; Azougagh, M; Saffaj, T

    2016-05-01

    We propose in this work an efficient way to evaluate the measurement uncertainty at the end of the development step of an analytical method, since this assessment provides an indication of the performance of the optimization process. The estimation of the uncertainty is done through a robustness test applying a Plackett-Burman design, investigating six parameters influencing the simultaneous chromatographic assay of five water-soluble vitamins. The estimated effects of the variation of each parameter are translated into a standard uncertainty value at each concentration level. The values obtained for the relative uncertainty do not exceed the acceptance limit of 5%, showing that the method development was performed well. In addition, a statistical comparison conducted to compare the standard uncertainties after the development stage with those of the validation step indicates that the estimated uncertainties are equivalent. The results obtained clearly show the performance and capacity of the chromatographic method to simultaneously assay the five vitamins and its suitability for routine application. Copyright © 2015 Académie Nationale de Pharmacie. Published by Elsevier Masson SAS. All rights reserved.
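
    A sketch of how a Plackett-Burman robustness test can be turned into an uncertainty contribution is given below; the eight-run design generator is standard, but the six factor effects, the noise level and the rectangular-distribution convention for converting effects to standard uncertainties are illustrative assumptions:

        import numpy as np

        # 8-run Plackett-Burman design (cyclic generator for N = 8), columns = 7 factors;
        # only the first 6 columns are used for the six method parameters in this example.
        generator = np.array([+1, +1, +1, -1, +1, -1, -1])
        design = np.array([np.roll(generator, i) for i in range(7)])
        design = np.vstack([design, -np.ones(7, dtype=int)])   # final all-minus run

        rng = np.random.default_rng(7)
        true_effects = np.array([0.8, -0.3, 0.1, 0.0, 0.5, -0.2])  # % recovery shifts (hypothetical)
        responses = 100 + design[:, :6] @ (true_effects / 2) + rng.normal(0, 0.2, 8)

        # Effect of each factor = mean response at (+1) minus mean response at (-1)
        effects = np.array([responses[design[:, j] == 1].mean()
                            - responses[design[:, j] == -1].mean() for j in range(6)])

        # One common convention: effects treated as rectangular-distributed biases
        u_robustness = np.sqrt(np.sum((effects / (2 * np.sqrt(3))) ** 2))
        print("estimated effects:", np.round(effects, 2))
        print(f"robustness contribution to relative uncertainty: {u_robustness / 100:.2%}")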

  13. Quantifying uncertainty in measurement of mercury in suspended particulate matter by cold vapor technique using atomic absorption spectrometry with hydride generator.

    Science.gov (United States)

    Singh, Nahar; Ahuja, Tarushee; Ojha, Vijay Narain; Soni, Daya; Tripathy, S Swarupa; Leito, Ivo

    2013-01-01

    As a result of rapid industrialization several chemical forms of organic and inorganic mercury are constantly introduced to the environment and affect humans and animals directly. All forms of mercury have toxic effects; therefore accurate measurement of mercury is of prime importance, especially in suspended particulate matter (SPM) collected through a high volume sampler (HVS). In the quantification of mercury in SPM samples several steps are involved, from sampling to the final result. The quality, reliability and confidence level of the analyzed data depend upon the measurement uncertainty of the whole process. Evaluation of the measurement uncertainty of results is one of the requirements of the standard ISO/IEC 17025:2005 (European Standard EN IS/ISO/IEC 17025:2005, issue 1:1-28, 2006). In the presented study the uncertainty estimation in mercury determination in suspended particulate matter (SPM) has been carried out using the cold vapor Atomic Absorption Spectrometry-Hydride Generator (AAS-HG) technique followed by a wet chemical digestion process. For the calculation of uncertainty, we have considered many general potential sources of uncertainty. After the analysis of data from seven diverse sites of Delhi, it has been concluded that the mercury concentration varies from 1.59 ± 0.37 to 14.5 ± 2.9 ng/m3 with a 95% confidence level (k = 2).

  14. Uncertainties in forces extracted from non-contact atomic force microscopy measurements by fitting of long-range background forces

    Directory of Open Access Journals (Sweden)

    Adam Sweetman

    2014-04-01

    In principle, non-contact atomic force microscopy (NC-AFM) now readily allows for the measurement of forces with sub-nanonewton precision on the atomic scale. In practice, however, the extraction of the desired ‘short-range’ force from the experimental observable (frequency shift) is often far from trivial. In most cases there is a significant contribution to the total tip–sample force from non-site-specific van der Waals and electrostatic forces. Typically, the contribution from these forces must be removed before the results of the experiment can be successfully interpreted, often by comparison to density functional theory calculations. In this paper we compare the ‘on-minus-off’ method for extracting site-specific forces to a commonly used extrapolation method that models the long-range forces with a simple power law. By examining the behaviour of the fitting method for two radically different interaction potentials, we show that significant uncertainties in the final extracted forces may result from use of the extrapolation method.
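
    The sketch below illustrates the extrapolation approach the abstract examines: fit a simple power law to the long-range tail of a force-distance curve and subtract the extrapolated background to estimate the short-range force. The synthetic curve, the choice of fit window, and all parameter values are assumptions for illustration only.

```python
import numpy as np
from scipy.optimize import curve_fit

def power_law(z, a, n):
    """Attractive long-range background modelled as -a / z**n."""
    return -a / z ** n

# Synthetic total force curve (nN vs nm): a van der Waals-like background
# plus a rapidly decaying short-range contribution.
z = np.linspace(0.25, 2.0, 200)
background_true = -0.8 / z ** 2
short_range_true = -1.2 * np.exp(-(z - 0.3) / 0.08)
total = background_true + short_range_true

# Fit the power law only to the long-range tail, where the short-range force
# is assumed to be negligible.
far = z > 1.0
popt, _ = curve_fit(power_law, z[far], total[far], p0=(1.0, 2.0))

# Extrapolate the fitted background over the whole range and subtract it to
# estimate the short-range force.
short_range_extracted = total - power_law(z, *popt)
print(f"Fitted background: a = {popt[0]:.2f}, exponent n = {popt[1]:.2f}")
```

    Because the extracted short-range force depends on where the fit window is placed and on the form of the assumed background, shifting the tail region or fixing the exponent changes the result, which is the kind of uncertainty the paper quantifies.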

  15. Integration of the Uncertainties of Anion and TOC Measurements into the Flammability Control Strategy for Sludge Batch 8 at the DWPF

    Energy Technology Data Exchange (ETDEWEB)

    Edwards, T. B.

    2013-03-14

    The Savannah River National Laboratory (SRNL) has been working with the Savannah River Remediation (SRR) Defense Waste Processing Facility (DWPF) on the development and implementation of a flammability control strategy for DWPF’s melter operation during the processing of Sludge Batch 8 (SB8). SRNL’s support has been in response to technical task requests made by SRR’s Waste Solidification Engineering (WSE) organization. The flammability control strategy relies on measurements performed on Slurry Mix Evaporator (SME) samples by the DWPF Laboratory. Measurements of nitrate, oxalate, formate, and total organic carbon (TOC) standards generated by the DWPF Laboratory are presented in this report, together with an evaluation of the uncertainties of these measurements. The impact of these uncertainties on DWPF’s strategy for controlling melter flammability is also evaluated. The strategy includes monitoring each SME batch for its nitrate content and for its TOC content relative to the nitrate content and to the antifoam additions made during preparation of the SME batch. A linearized approach for monitoring the relationship between TOC and nitrate is developed, equations are provided that integrate the measurement uncertainties into the flammability control strategy, and sample calculations illustrate the impact of the uncertainties on that strategy.
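
    As a toy illustration of a limit check of this kind, the sketch below expresses a TOC limit as a linear function of the nitrate content and makes the comparison conservative by shifting each measured value by its expanded uncertainty. The slope, intercept, uncertainty values, and the specific form of the check are hypothetical and are not the DWPF equations.

```python
def toc_within_limit(toc_meas, u_toc, nitrate_meas, u_nitrate,
                     slope=0.45, intercept=0.0, k=2.0):
    """Return True if the TOC content is acceptably low relative to nitrate.

    The check is made conservative by taking the measured TOC at the upper
    end of its expanded uncertainty (k * u_toc) and the nitrate at the lower
    end of its expanded uncertainty before applying the linearized limit.
    """
    toc_upper = toc_meas + k * u_toc
    nitrate_lower = nitrate_meas - k * u_nitrate
    toc_limit = slope * nitrate_lower + intercept
    return toc_upper <= toc_limit

# Hypothetical SME-batch measurements with their standard uncertainties.
print(toc_within_limit(toc_meas=0.70, u_toc=0.04,
                       nitrate_meas=2.10, u_nitrate=0.10))
```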

  16. Nuclear magnetic resonance using electronic referencing: method validation and evaluation of the measurement uncertainties for the quantification of benzoic acid in orange juice.

    Science.gov (United States)

    Garrido, Bruno C; de Carvalho, Lucas J

    2015-02-01

    Quantitative nuclear magnetic resonance measurements have become more popular over the last decade. The introduction of new methods and experimental parameters has been of fundamental importance in the development of new applications, among them electronic referencing for quantification. The use of electronic referencing eliminates errors associated with the weighing of internal standards, as well as problems arising from the solubility of the standards in the analyte solution and from chemical interactions between the analyte and the internal standard. In this work, we have studied the quantification of an important analyte in a food matrix, benzoic acid in orange juice, as a model for the validation and measurement uncertainty estimation of electronic referencing using (1)H NMR in food analyses. The referencing method applied was the pulse length-based concentration measurement. The method was validated and showed good results for the precision and accuracy parameters evaluated. A certified reference material and a candidate reference material were analyzed, and extremely good results were obtained. The reported relative expanded uncertainties are in the 1.07-1.39% range, which can be considered very good performance for the analysis of a complex food matrix. Measurement uncertainty was evaluated by two different approaches, and the pulse calibrations for the samples and for the reference were shown to account for approximately 80% of the total measurement uncertainty. Copyright © 2014 John Wiley & Sons, Ltd.
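
    To make the pulse length-based (PULCON-type) idea concrete, the sketch below computes a concentration from the commonly cited reciprocity relation (signal per unit concentration scales with the number of contributing protons and inversely with the 90-degree pulse length) and propagates relative standard uncertainties in quadrature. The input values and the uncertainty budget are hypothetical; they are chosen only so that the pulse-calibration terms dominate, in line with the approximately 80% figure quoted above.

```python
import math

def pulcon_concentration(c_ref, integral_unknown, integral_ref,
                         protons_unknown, protons_ref,
                         p90_unknown, p90_ref):
    """Analyte concentration from a PULCON-type reciprocity relation:
    intensity per unit concentration is proportional to the number of
    contributing protons and inversely proportional to the 90-degree
    pulse length of each measurement."""
    return (c_ref
            * (integral_unknown / integral_ref)
            * (protons_ref / protons_unknown)
            * (p90_unknown / p90_ref))

# Hypothetical inputs: benzoic acid signal vs an external reference standard.
c = pulcon_concentration(c_ref=2.00,                     # mmol/L reference
                         integral_unknown=0.95, integral_ref=1.00,
                         protons_unknown=2, protons_ref=2,
                         p90_unknown=10.2, p90_ref=10.0)  # microseconds

# Combine relative standard uncertainties of the inputs in quadrature; the
# two pulse-calibration terms contribute roughly 80% of the variance here.
rel_u = {"c_ref": 0.003, "integrals": 0.0025,
         "p90_unknown": 0.0055, "p90_ref": 0.0055}
u_rel = math.sqrt(sum(v ** 2 for v in rel_u.values()))
print(f"c = {c:.3f} mmol/L, relative standard uncertainty = {100 * u_rel:.2f} %")
```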

  17. Application of In-line Focused Beam Reflectance Measurement to Brivanib Alaninate Wet Granulation Process to Enable Scale-up and Attribute-based Monitoring and Control Strategies.

    Science.gov (United States)

    Narang, Ajit S; Stevens, Timothy; Macias, Kevin; Paruchuri, Srinivasa; Gao, Zhihui; Badawy, Sherif

    2017-01-01

    Application of in-line real-time process monitoring using a process analytical technology for granule size distributio